Rated: 18+ · Book · Personal · #1196512
Not for the faint of art.
Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
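To give you an idea of just how simple, here's a rough Python sketch (mine, purely illustrative) of the most famous example, the Mandelbrot set: repeat z → z² + c and see whether the result stays bounded. Points that never blow up belong to the fractal.

def escape_time(c, max_iter=50):
    """Count iterations of z -> z*z + c before |z| exceeds 2 (capped at max_iter)."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return max_iter

# Crude ASCII picture of the complex plane: real axis horizontal,
# imaginary axis vertical, '#' marks points whose orbit never escapes.
for im in range(12, -13, -1):
    row = ""
    for re in range(-40, 21):
        c = complex(re / 20, im / 10)
        row += "#" if escape_time(c) == 50 else " "
    print(row)

Run it and you get a blocky silhouette of the familiar cardioid-and-bulbs shape; zoom in anywhere along its boundary and the intricacy never stops.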





April 21, 2023 at 11:26am
#1048554
Oh good. If this prompt for "Journalistic Intentions" [18+] hadn't come up at random, I might have had to write an extra entry.

boning


Ever boned a fish?

I haven't, because I don't fish, but I've seen people do it. And not on one of those websites, either. Seems like a lot of work, though as with anything, practice improves skill. Some fish just don't like to be boned, though, which can lead to oral discomfort or even penetration.

The word "deboning" in this context means the same thing, which is just one of those weird things about English, like flammable/inflammable or contronyms like "cleave."

But for this blogging activity, which is all about clothing, we're not talking about fish, or anything we do with our clothes (mostly) off. And for once, I don't have to do extensive research to know what it means.

Not fish, then. But whales, which live with fish but aren't them. Back before the inspired invention of the brassiere, for a while, corsets were all the rage. And back before plastics (see previous entry on nylon), corsets were stiffened by whale bones.

Which is oddly appropriate. See, "corset" comes from an old French word meaning "body." The modern version is "corps" (pronounced more like "core" just like with the Marine Corps, and not to be confused with "cœur," which translates to "heart" and is pronounced more like "curr," as in "no1curr") and from one of these we get the English "corpse." So you had people (it's associated with women's wear but for a long time all genders wore a kind of corset) wearing parts of whale corpses to support part of their own living bodies. An endoskeleton turned exoskeleton.

We (mostly) don't kill whales for fashion, or lighting, or meat, anymore. Instead, we use older corpses. Not of dinosaurs, per popular misconception and one oil company's brontosaurus logo, but ancient marine life. They're pumped out of wells, refined, and some of them get molded into plastic boning (yes, people still wear corsets; see any RenFaire).

Eventually, and not too far in the future, we'll run out of the corpses that took hundreds of millions of years to accumulate.

And then, we're well and truly boned.
April 20, 2023 at 9:24am
#1048496
More proof that we're utterly obsessed with size:

    What is a black hole’s actual size?  
What do we mean by a black hole's size? A photon sphere? The minimal stable orbit? The event horizon? The singularity? Which one is right?


Out there in the Universe, size definitely matters.

Always lead with a subtle dick joke.

Objects that are stable, both microscopically and macroscopically, are described by measurable properties such as mass, volume, electric charge, and spin/angular momentum.

And, lately, political affiliation.

But “size” is a bit of a tricky one, particularly if your object is extremely small.

Snort.

After all, if all the mass and energy that goes into making a black hole inevitably collapses to a central singularity, then what does the concept of “size” even mean?

Obvious jokes aside, I've long wondered how you can measure the size of something that warps spacetime, which makes the concept of "size" very slippery (as if covered with K-Y).

The first thing you have to know about a black hole is this: in terms of its gravitational effects, especially at large distances away from it, a black hole is no different from any other mass.

I've mentioned something like this before, I know, but can't be arsed to find it right now. Black holes have some legitimately scary properties, but if they were going to eat entire galaxies, they'd have done it by now.

After all, we’re taught that black holes have an irresistible gravitational pull, and that they suck any matter that comes too close to their vicinity irrevocably into them. But the truth is that black holes don’t “suck” matter in any more than any other mass.

I'm not well-versed in the math involved, but there's a region around a black hole within which orbits decay in a predictable way. Outside that region, black holes act like any other mass.

This is complicated by the prevalence of gas, dust, and debris in the vicinity, all of which would cause any orbit to eventually decay.

When a black hole rotates, it no longer has just one meaningful surface that’s a boundary between what can escape and what can’t; instead, there are a number of important boundaries that arise, and many of them can make a claim to being the size of a black hole, depending on what you’re trying to do. From the outside in, let’s go through them.

The rest of the article does just that; it's very interesting, and only a little bit math-oriented, but there's no need for me to comment on it. After all, I just wanted to make the point (pun intended) that even science writers are not above making the occasional dick joke.
April 19, 2023 at 10:57am
#1048454
Today, another entry for "Journalistic Intentions" [18+]:

Nylon


"There's a great future in plastics. Think about it. Will you think about it?"
         -Mr. McGuire, The Graduate

If Dustin Hoffman's character had actually heeded those words (frankly, I don't remember enough about the movie to know whether he did or not), he might have found himself in trouble. Shortly after the movie was released, "just one word: plastics" started to fall out of favor.

But there was a time when nylon, and other plastics, were heralded as positive agents of change.

As with most innovations of the 20th century, this was largely driven by marketing, with a hint of racism thrown in (nylon was heralded as an alternative to silk, which came from those countries).

I don't need to rehash the Wikipedia page for nylon, so for facts (or near-facts) and history, you can go there. Warning: a lot of it is chemistry. You can skip that part. I did, and I have some knowledge of chemistry.

But I do want to focus on one nearly throwaway sentence from that article:

In 1946, the demand for nylon stockings could not be satisfied, which led to the Nylon riots.

The... Nylon... RIOTS?

This, I gotta learn more about.

The nylon riots were a series of disturbances at American stores created by a nylon stocking shortage.  

Anyone else spot the inherent pun there? Anyone? I'll ruin it by explaining it: they had trouble stocking stockings.

During World War II, nylon was used extensively for parachutes and other war materials, such as airplane cords and ropes and the supply of nylon consumer goods was curtailed.

I'm just going to leave this here and hopefully remember to, at some point, research the effects of the rise of synthetic fabrics on the demonization of hemp. I can't help but think there was a connection, but it's irrelevant to this discussion.

The riots occurred between August 1945 and March 1946, when the War Production Board announced that the creation of Du Pont's nylon would shift its manufacturing from wartime material to nylon stockings, at the same time launching a promotional campaign.

On the other hand, I think I have a pretty good grasp on the link between overpromising stuff in promotional campaigns, and consumer riots.

It is imperative that I add: it sounds like there was a run on stockings.

In one of the worst disturbances, in Pittsburgh, 40,000 women queued up for 13,000 pairs of stockings, leading to fights breaking out.

If only we had camera phones then. Imagine the beautiful videos that would have resulted.

It took several months before Du Pont was able to ramp up production to meet demand, but until they did many women went without nylon stockings for months.

Imagine the sheer horror (pun intended).

During World War II, Japan stopped using supplies made out of silk, and so the United States had difficulty importing silk from Japan.

Pretty sure there were other reasons why importing silk from Japan wasn't possible during WWII.

Nylon stockings became increasingly popular on the black market, and sold for up to $20 per pair.

WTF, Wikipedia. Dollar amounts are useless unless we can compare them to today's dollars. Here, I'll help: $20 in 1945 is like $335 today.
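(That's nothing fancy, just a Consumer Price Index ratio, worked out roughly: $20 × 304 ÷ 18 ≈ $338, give or take which month's index you use, hence "like $335.")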

There's more at the link. Maybe you'd heard of them; I had not, and despite the very real plight of the American stocking-wearers (or, technically, stocking-not-wearers) in the mid-1940s, I had a good laugh reading the section.

I have no idea what the current status of demand for nylon stockings is. I don't see them much, but I don't go to department stores or frequent the kinds of events where people would wear stockings or hose or pantyhose or whatever. Most of the ladies I see pictures of on the internet don't seem to wear them, but that may be selection bias, as most of the ladies I see pictures of on the internet don't wear much of anything.

No, I think people these days prefer "natural" fabrics, but not fur, though fake fur can be made of nylon or nylon blends. I mostly see it in more bulk form, not fabric: fishing lines, zippers, even car parts. The first Wiki link above lists a bunch of uses. As a plastic, it's very versatile. But as a fabric? I can't remember the last time I could identify a pure nylon fabric. I think I had a raincoat made of it as a kid, but I'm not sure even of that.

I guess you could say that if you're looking for nylon stockings today, you might be hosed.
April 18, 2023 at 10:09am
#1048384
Ah, a rare example of the wild Cracked article that I have Issues with.



I can only expect so much from a dick joke site, though.

Science, at least the way it’s taught to us at an elementary and high-school level, feels like it’s all about proof.

That much, I can agree with. The proof thing is wrong, but the statement is fairly accurate.

In reality, there’s a huge amount of science that, even after tests and studies, remains at best a heavily supported theory.

*record scratch*

Couldn't even finish the first paragraph before I found the first Issue.

I know I've written this before, but in science, a "theory" doesn't mean the same thing as when you or I use the word. I can't be arsed to go into detail right now, but "a heavily supported theory" is what used to be known as a "law" in science, as in "Newton's Second Law." It's based on observation and/or experiment (it's hard to do experiments on, say, a planetary nebula) and makes predictions. When lots of those predictions are confirmed, as is the case with the theories of relativity, gravity, quantum mechanics, and evolution—to name a few—it's basically settled science.

Which doesn't mean it can't be refined, or even overturned. And some have a higher confidence level than others. But science is all about theory (in the scientific sense).

And hey, keep in mind that this is likely going to be a fairly surface level exploration from a verified Bachelor of Fine Arts holder. Scientists, I personally invite you to absolutely go off in the comments.

I'm not a scientist, and I'm not commenting there. Instead, I'm ranting here.

4. Planet Nine/Planet X

Planet Nine is the name given to a theoretical, massive planet deep out in the reaches of our solar system, and a name that truly salts Pluto’s fresh wounds of being kicked out of the club.

Again with the Pluto thing. I've ranted about that before, too. But this is an example of something that's neither right nor wrong. It's not "theory" (in either sense) that demoted Pluto to dwarf planet status. It's us humans needing to characterize and organize things.

The reason scientists think it exists comes from that classic moment that sparks so many discoveries: a scientist going, “Well, that’s weird.”

Or if they're being less flippant about it, "This observation does not conform with previous observations."

Long ago, scientists looked at the orbit of Mercury and said, "Well, that's weird." So they proposed several solutions, one of which was an as-yet-undiscovered inner planet, which they named Vulcan. I'm going to pause here so you can think of Spock jokes.

Ready?

Okay. Turns out the weirdness was because their theory of gravity was incomplete. Once Einstein figured out the whole "relativity" thing, the oddnesses of Mercury's orbit became perfectly (or nearly so) explained, and Vulcan disappeared. Not that it was ever there. The idea disappeared.

All of which is to say that there's some evidence that's leading some scientists to believe the theory of gravity needs another tweak. Would this account for Planet Nine? Hell if I know.

3. Dark Matter

Decent SF show from Netflix. Oh, wait, they mean actual dark matter.

For a pretty complicated scientific theory, it’s surprising how commonly known of a term “dark matter” is. This can most likely be attributed to it sounding absolutely bad-ass, more like something the Avengers have to steal from a space prison than something based on high-level mathematics.

That's a pretty common issue with the unknown: it allows writers to speculate, after which readers/viewers might think the speculation is actual fact. I've seen shows where "dark matter" is used to produce all kinds of plot motivations, such as superpowers.

Again, some peculiarities in gravity are what tipped off science to the idea of this unobserved type of material.

While this isn't the whole story of dark matter, it's not a bad brief summary.

This, incidentally, is why I said above that some scientists believe the theory of gravity needs another tweak. Maybe dark matter isn't matter at all, but some other effect.

This is analogous, I think, to the old idea of the luminiferous ether. Basically, back when the ascendant idea was that light was a wave, they were stuck on the idea that it needed material to wave through, like a sound wave needs matter to propagate, or an ocean wave needs water. The ether, then, had to have certain properties, not the least of which was being entirely, or almost entirely, intangible to matter, or else the planets' orbits would decay rather quickly.

Again, though, Einstein saved the day, and destroyed ether theory with the observation, in this case, that light had properties of both wave and particle, and didn't need matter to propagate through.

Now, I might have fallen into the same trap as this Cracked article, because I'm not an expert. But even if I got the details wrong, my point stands: dark matter might not be matter at all; the observations could fit another theory entirely. One which hasn't been discovered.

Which is a good thing. I'd hate to reach the limits of discovery.

2. Unknown Elements

Another staple of science fiction.

You'll have to read the article on this one, I'm afraid, but it always bugs me when SF shows throw out "new element" as if you can slot something between an n-proton atom and an n+1-proton atom.

Which doesn't mean that there can't be exotic matter, elements made of something other than protons, neutrons, and electrons. But that's mostly speculation, too.

1. Pathogenic Fungus Capable of Infecting Humans

This one was quite popular recently. There was a whole show about it, the most impressive thing about which was that it was apparently a good show based on a video game, which is rarer than unobtainium (see what I did there?).

If you were hoping this was nothing more than a spooky story, I have bad news: It’s very possible.

In the broadest sense of "possible."

Of course, our body temperatures would have to lower first, or fungi would have to evolve to live at higher temperatures. Well, why not both?

Sure, there's nothing in principle that would prevent either of these things.

Climate change has been forcing fungi to adapt to naturally higher temperatures, and human body temperatures have been dropping, with humans today having a body temperature of approximately 1 degree Fahrenheit lower than in the early 1800s.

And this is my last Issue (promise). The drop in human body temperature can be explained by improvements in medical care over the last 200+ years. A "human body temperature" is necessarily an average of a sample, and back in the early 19th century, that sample would be more likely to include individuals with low-grade infections that cause fever.
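To put made-up but plausible numbers on it: if four people run a healthy 98.0°F and a fifth has a low-grade fever of 100.5°F, the sample average works out to (4 × 98.0 + 100.5) ÷ 5 = 98.5°F. Treat the infections, and the "average human body temperature" drops by half a degree without any healthy individual's temperature changing at all.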

That's the flip side of there being things we don't understand: we can use them to scare the living shit out of each other.
April 17, 2023 at 9:16am
#1048319
Time for another clothing-related prompt from "Journalistic Intentions" [18+]. Unlike earlier prompts, this time, the RNG gave me something I've actually heard of.

Waistcoat


Now, here's a refreshing change: a name for an article of clothing that actually makes linguistic sense. Sort of. Except that "waistcoat" and "vest" are synonyms. And also, some sources claim the name is a pun: the vest was made from the waste material from tailoring the jacket and pants, hence, waste-coat. As it hugs the waist (not to be confused with a cummerbund, which is worn around the waist), the pun was really inevitable.

I've long been operating under the assumption that it's a "vest" when it's the outer layer of an outfit, and "waistcoat" when it's worn as, say, part of a three-piece suit. Three-piece suit is, itself, a misleading phrase, as one almost always also wears socks, shoes, a shirt, and a tie, making it, at the very least, a seven-piece suit. Not to mention underwear (bottom and top) (optional), cufflinks, a belt or suspenders, a hat (some of us still wear them), and my favorite, the pocket square. Though I'll grant that you normally buy those accessories separately, with the trousers-waistcoat-jacket part sold as a unit.

That assumption turns out not to be the case.

Of course, it's possible to have trousers, waistcoat, and jacket be of different materials, colors, and patterns. But then it's not a three-piece suit, is it?

Look, I know more about traditionally men's fashion than women's. This should not be a surprise to anyone.

As an aside, I mentioned up there that "vest" is effectively a synonym for "waistcoat." This isn't strictly true. In the UK, a "vest" is what we Yanks unfortunately have taken to calling a "wife-beater," from the stereotype of the man wearing only a sleeveless undershirt who engages in domestic abuse. This is, of course, not the only fashion-related linguistic trap between US and UK English. "I'm wearing pants" has a whole different meaning, too.

Adding to linguistic confusion is that "to vest" is a verb that has several meanings, including "to put on clothes." Whether those clothes involve waistcoats or not.

All of which is to say that even in this relatively simple case, nothing in fashion is as straightforward as one might think.

For example. I've been wearing suits (occasionally) (yes, I own suits) since I was a kid, and it took me until like 2019 to discover that you're not supposed to close the bottom button of a waistcoat. I knew about that rule with jackets, but no one told me about vests.

As an engineer, this really annoys me beyond all reason. While I understand, intellectually, the idea of nonfunctional junk on clothing (the rivets on a pair of Levi's served a purpose once, but nowadays they're just tradition, for example), why make it look like you're supposed to button something when you're not?

The only answer I can come up with is that they do it as a visual kind of shibboleth: if you don't conform, then those in the know have yet another reason to look down upon you.

Which is why I love pocket squares. They make excellent gags.
April 16, 2023 at 9:53am
#1048283
The random number generator for my trips to the past led me, today, to that rarest of all beasts: a short blog entry.

This one's from January of 2009: "Calvin tribute"

You'd expect, given the proliferation of short-attention-span media such as Twatter, that brevity would be more popular. But no, the view count on this one is actually somewhat lower than for many of my weightier tomes from the same era.

Anyway, since it's short, there's not much to talk about, but I'm going to try anyway.

First, the link: it's dead. Not too surprising, given its age, but the website it was from still exists. Just not the article. Nor did I do my usual (even for the time) selective quoting, so I can't recall exactly what it was about. It was not, as the title may suggest, referring to the stick-in-the-mud Protestant proponent of predestination, but to the play-in-the-mud comic strip kid.

Doesn't matter much. People (including me) are still singing the praises of this now-classic comic strip. One of my most prized possessions is the three-volume hardcover Complete Calvin & Hobbes, and I still occasionally come across retrospectives and "What's that crazy Bill Watterson up to these days?" articles.

Not to mention that at this old entry, I provide a link to a story I wrote imagining a middle-aged Calvin. That one's still in my port here.

What I've hardly ever mentioned is that C&H was instrumental in cementing my decision to never have kids. He was fun to read about, but, as I recall commenting back when the strip was in its heyday of the 1980s, "Calvin is the kid I wish I'd been, and also the kid I desperately don't want to ever have."

Not that this was the only reason, mind you. There's also laziness, appreciation for uninterrupted sleep, and a strong desire to have things I own remain intact and be wherever I left them. Just to name a few.

But I digress. After all these years, those strips still make me laugh uncontrollably.
April 15, 2023 at 10:36am
#1048245
I've been wondering what to say about this one.



While the article emphasizes the US, it's been clear to me, thanks to the international reach of the internet, that this is a human problem, not just an American one. But as I live in the US, I'll run with what it says.

It may sound like an insensitive statement, but the cold hard truth is that there are a lot of stupid people in the world, and their stupidity presents a constant danger to others.

That depends on how you define "stupid," I suppose. I think they mean it in the sense of "stubbornly and willfully ignorant," because I fail to see how someone with a legitimate learning disability (Down's Syndrome, e.g.) presents a constant danger to others.

It would not be a stretch to say that at this point in time, stupidity presents an existential threat to America because, in some circles, it is being celebrated.

Which fits "stubbornly and willfully ignorant." Also any politician (of any party) who runs on a "common sense" platform.

Although the term "stupidity" may seem derogatory or insulting, it is actually a scientific concept that refers to a specific type of cognitive failure. It is important to realize that stupidity is not simply a lack of intelligence or knowledge, but rather a failure to use one's cognitive abilities effectively.

Okay, so a better definition than I had.

To demonstrate that stupidity does not mean having a low IQ, consider the case of Richard Branson, the billionaire CEO of Virgin Airlines, who is one of the world’s most successful businessmen.

Clearly, this article was written before Virgin Galactic fell into a black hole.

Still, business failure can't always be attributed to stupidity (as defined here).

Branson has said that he was seen as the dumbest person in school, and has admitted to having dyslexia, a learning disability that affects one’s ability to read and correctly interpret written language.

I've known several people with dyslexia, and this may be some sort of selection bias on my part, but as a whole, they seem to have an above-average intelligence. Mostly, they just have trouble with words and spelling. While I pride myself on my spelling, it's not proof of my intelligence.

Branson’s smarts come from his ability to recognize his own limitations, and to know when to defer to others on topics or tasks where he lacks sufficient knowledge or skill.

I hope his rocket science competition here in the US is taking notes.

Stupidity is a consequence of a failure to be aware of one’s own limitations, and this type of cognitive failure has a scientific name: the Dunning-Kruger effect.

Ironically, people throw around "Dunning-Kruger effect" like they know what they're talking about, even if they don't. Shit, I'm guilty of that, myself.

It is easy to think of examples in which failing to recognize one’s own ignorance can become dangerous. Take for example when people with no medical training try to provide medical advice. It doesn’t take much Internet searching to find some nutritionist from the “alternative medicine” world who is claiming that some herbal ingredient has the power to cure cancer. Some of these people are scam artists, but many of them truly believe that they have a superior understanding of health and physiology.

I'd reverse that. Many of these people are scam artists, but some of them truly believe they have a superior understanding.

There are many people who trust these self-proclaimed experts, and there is no doubt that some have paid their lives for it.

My Platonic ideal of the successful businessperson is not Richard Branson, but Steve Jobs. No, I'm not an Apple fanboi, but consider: he started a company with a couple of other geeks in his garage, and lived just long enough to see it become the most valuable company in the world (by market capitalization), exceeding far older corporations such as ExxonMobil or IBM. Sure, he was, by all accounts, a massive dick, but that's irrelevant to being a successful businessman.

And yet, he might have lived longer if he hadn't fallen for the alternative medicine scams.

What’s particularly disturbing about the Dunning-Kruger effect is that people are attracted to confident leaders, so politicians are incentivized to be overconfident in their beliefs and opinions, and to overstate their expertise.

"The best lack all conviction, while the worst / Are full of passionate intensity." -W.B. Yeats

It is only when we try to become an expert on some complex topic that we truly realize how complicated it is, and how much more there is to learn about it.

As a dedicated dilettante, interested in a broad range of topics yet not willing to delve too deeply into any of them, this is something I have to guard against, myself.

What we are dealing with here is an epidemic of stupidity that will only get worse as divisions continue to increase. This should motivate all of us to do what we can to ease the political division.

Sure, it should. But it won't.

We are all victims of the Dunning-Kruger effect to some degree. An inability to accurately assess our own competency and wisdom is something we see in both liberals and conservatives. While being more educated typically decreases our Dunning-Kruger tendencies, it does not eliminate them entirely. That takes constant cognitive effort in the form of self-awareness, continual curiosity, and a healthy amount of skepticism.

Clearly, I'm better at this than anyone else.

That's a joke. I'm deliberately overstating my competence to demonstrate the D-K effect.

The article is optimistic about being able to fix this. But there's only one thing more dangerous than willful stupidity, and that's optimism.
April 14, 2023 at 11:31am
#1048188
Another fashion entry, courtesy of "Journalistic Intentions" [18+]...

Swing


Wait, wait, hold up.

"Swing" has like 48 definitions, of which exactly 0 of them that I'm aware of fit the fashion theme. (But I know at least four that refer to sex.)

Time, then, to become aware.

For the first time in history, Wikipedia fails me here. There's the usual disambiguation page, which parses out many of the 48 (I pulled that number out of my ass, but it doesn't seem to be far off from reality) possible meanings of the word "swing." Not one of these refers to clothing.

So let's try that old standby, Merriam-Webster.  

Nope.

Now, look, I'll save us both some time here: best I can figure out, the only clothing-related use of the word "swing" that I could find anywhere refers to a dress style from the 1950s.

I hadn't been born yet; I'm generally bewildered by nuances of fashion; I don't wear dresses; I've never been in a serious relationship with someone who regularly wore dresses. So my lack of knowledge here shouldn't be surprising (though it's at least partly fixed now). Still, it makes me wonder why both Wikipedia and one of the leading American dictionaries failed me.

Apparently, from what little I can piece together (most of the search results I found were vendors, which are even less reliable than Wikipedia, and hopefully this morning's frantic Googling will confuse the living shit out of someone at the NSA), the swing dress was for dancing to... wait for it... swing jazz.

Which I have heard of, because unlike fashion, music is an actual interest of mine. Though I'm not generally fond of jazz.

But there is one advantage of ignorance: I can use it to make shit up.

So, obviously, the first thing I thought of was that it might be the clothes that swingers wear. As that might be ambiguous, I'll note that by "swingers," I mean the partner-swapping scene. But that wouldn't make sense; as far as I know, swingers don't use clothing to differentiate themselves from monogamists.

Maybe... clothes to wear on a swing? When I was a kid and could still fit on swings, we just wore whatever we were wearing. Which led to quite a few skinned knees.

Or, ooh, I know! It's clothing that'll change your political affiliation. "Here, wear this vest. It'll make you vote for the Libertarian and like it."

Compared to even my lousy imagination, though, the reality is—as per usual—quite disappointing, at least to me. That is, of course, assuming that I'm somehow right, which is a massive assumption. It's disappointing to me because, to me, there are only three kinds of dresses: 1) Wedding dress; 2) Bridesmaid's dress (look, I used to be a wedding photographer); 3) Dress.

I don't believe in only learning shit about shit that affects me. But even I can't hold unlimited information in my noggin, which is why things like search engines and dictionaries and encyclopedias are my friends (and auxiliary memory). Even an established nerd like me needs to let some things pass unnoticed, or I'd be overwhelmed with trivia. It's annoying and frustrating to me when such references fail.

But hey, at least I learned something new today. Again.
April 13, 2023 at 9:44am
#1048137
I have a system (not a perfect one) in place to avoid duplicating links here. I also try not to repeat entry titles, but with 2300 entries, I know I've got some duplicates and maybe even triplicates.

What's worse, though, is when I thought I did an entry about something and now I can't find it. Such is the case with famed proto-astronomer Tycho Brahe. I mentioned him briefly in an entry a while back in response to someone's comment, but that seems to be the extent of things.

Until now.



Unlike most Cracked articles, this one's not a numbered list.

To non-astronomers, there aren’t a huge amount of A-list astronomers. Most people could probably name Copernicus, Galileo and Hubble, but realistically, they learned those names from the dog in 1955 in Back to the Future, the lyrics to Bohemian Rhapsody and a big-ass telescope that occasionally features in “yo mamma’s so fat…” jokes.

Hubble's successor, the JWST, is not named after an astronomer but an administrator. Which I think is a step down, and that's not even getting into the controversy around Webb's policies.

Amazing telescope, though.

One guy whose name might ring only a faint bell is Tycho Brahe, a 16th-century Danish astronomer who made some remarkable innovations that helped to bring about the Scientific Revolution.

Speaking of telescopes, he did all his work without one because they hadn't been invented yet.

For exactly how influential this guy was, there's the article, and of course also the Wiki page about him (which may or may not be less error-prone). I'm skipping it because the clickbait headline is about his personal life (and death), which was almost as interesting. I will note, however, that his contributions were significant enough for later astronomers (ones with telescopes) to name one of the largest and most recognizable lunar craters after him.

Speaking of cutting edges, Brahe had his nose sliced off at the age of 20. He got in a duel with his third cousin, who accidentally sliced a big chunk of his honker off...

And this is what he's probably most well-known for. No, I have no idea how he smelled.

The big-brained, metal-nosed polymath undone by humble urine.

No, he didn't discover Uranus. That would have to wait for telescopes.

The story goes that Brahe was at a banquet and needed a pee, but etiquette forbade him from leaving, so he held it in. Unfortunately, when he got home and could safely drain the lizard, nothing was forthcoming.

Sounds to me like he wasn't done in by piss, but by stringently following etiquette. The cautionary tale here is: fuck etiquette.

And then, in the words of Victor Thoren and John Christianson’s 1990 book The Lord of Uraniborg: A Biography of Tycho Brahe, “through five days and nights of sleepless agony, he pondered the agony of paying so great a price for, as he thought, having committed such a trivial offense.”

Uraniborg is unrelated to urine. It was, if you read the text I'm leaving out here, one of his pre-telescope observatories. Nor is it related to Uranus (other than maybe etymologically), or the Borg ("Borg," by my understanding, is or was a Danish word for "castle.")

He had several more days of pain, fever and delirium before dying. He was 54, and had spent 38 years documenting the stars.

Which is approximately 39 years longer than I can ever concentrate on one subject.

Just in case readers might be worried that this could happen to them, the article reassures us:

“The good news is that, while uncomfortable, occasionally holding back a pee shouldn’t harm us,” explains Ajay Deshpande, senior clinical lead at London Medical Laboratory.

Also, medical science has advanced somewhat in more than half a millennium. Hell, they can even reconstruct noses now. To your specifications. Which makes a plastic surgeon's office the only place where it's socially acceptable to pick your nose.

In other words, it isn’t out of the question that the father of astronomy sat on his keys, filled up with pee and died. All that time looking up, not enough looking down.

Perhaps this inspired Oscar Wilde's famous quotation: "We are all in the gutter, but some of us are looking at the stars."
April 12, 2023 at 9:22am
#1048059
Another entry for "Journalistic Intentions" [18+] today:

Charmeuse


Unlike some people, I'm not proud of my ignorance. Nor do I pretend I don't have any. It's always a great moment for me when I learn something new, even if it's about a topic I don't have a lot of interest in... such as fashion, which is the theme of this month's JI.

So until these prompts were posted, I don't think I'd ever even seen or heard the word "charmeuse." And I also restrained myself from looking it up, just in case it became one of the eight out of sixteen prompts that I'd end up picking at random and I could proclaim my ignorance in public.

And behold, it was.

So let's see. Before I look it up, I have some guesses.

The word "Charmeuse" is in the section of Fabrics prompts, along with nylon, silk, and wool—all of which I am familiar with, so why couldn't I have gotten one of those... oh well, I'd have less material (pun absolutely intended) to work with. So I will start out by assuming (and shut up about "ass-u-me" already; we all make assumptions all the time just to get by) that charmeuse is a fabric.

Easy enough. What else? Well, the word seems French, so my next assumption is that the initial sound is more like sh as in shit than ch as in chit. Look, even after three and a half years of French lessons, I don't consider myself fluent, but I see patterns (though I have no idea, as yet, if charmeuse is a patterned fabric or not). And if I had to take a stab at "charmeuse," I'd guess it's the feminine word for "charmer" or "one who charms" (the masculine version being most likely "charmeur" if it indeed exists in the language). This is like how "singer" (the vocal thing, not the fabric sewing machine company) can be translated as chanteur or chanteuse.

I should emphasize that I'm using masculine and feminine in their linguistic meanings, not the sociopolitical meanings, which have become a minefield.

Alternatively, it refers to someone who prefers Charmin toilet paper, much as "bounty hunter" refers to someone who was searching for paper towels during the pandemic supply shortages.

It being French tracks, because when you think of fashion, you think of Italy, France, and maybe Mobile, Alabama... fine, New York City, the latter of which uses Italian and French names for fashion to make themselves seem superior.

I can also provisionally rule out nylon, silk, and wool as being synonyms, so it's probably a completely different material... though my confidence level on that isn't very high, as you also have, say, felt (the fabric, not the past tense verb), which is a different form of wool in much the same way as graphite is a different form of carbon.

And that's the limits of my guesswork. Let's see how I did, courtesy of Wikipedia:  

Charmeuse (French: [ʃaʁmøz]), from the French word for female charmer...

Il est d'accord. Point pour moi.

...is a lightweight fabric woven with a satin weave, in which the warp threads cross over four or more of the backing (weft) threads.

Another thing I learned fairly recently: satin isn't a base material, like wool or silk, but a particular kind of weave. But satin is, as I understand it, most closely associated with silk. So deduct a point for me guessing it didn't have crossover with the other prompts.

I have a vague idea what warp and weft are in relation to fabric, but only vague, and there's only so far I'm willing to go down a Wiki rabbit hole this morning.

Charmeuse differs from plain satin in that charmeuse has a different ratio of float (face) threads, and is of a lighter weight.

"Plain satin?"

Charmeuse may be made of silk, polyester, or rayon.

Cue the "one of these things is not like the others" earworm.

It is used in women's clothing such as lingerie, evening gowns, and blouses, especially garments with a bias cut.

See? I told you bias was everywhere.

It is occasionally used in menswear.

And yet it's still called charmeuse in that context.

There's more detail at the Wikipedia link, though I shouldn't have to remind anyone that I wouldn't use it as a definitive source (and I can't be arsed to follow all those links at the bottom).

Any day when I learn something new is, in my opinion, a good day. Especially when it's source material (yes, I used that pun before) for comedy.
April 11, 2023 at 10:16am
#1048020
Speaking of words...



We’ve all experienced how certain sounds can grate on our nerves, such as the noise made by dragging your fingernails across a blackboard or the cry of a baby...

I suspect most people don't know what their own fingernails sound like dragged across a blackboard. Partly because who uses blackboards anymore, since like the 90s, and also partly because most people aren't evil enough to do it themselves. I, however, know exactly what my fingernails sound like dragged across a blackboard.

Certainly I was a baby, but I don't remember how annoying my cries were. Knowing me, "very."

...but it turns out that the sounds of some words (like “virus”) can also affect how we feel and even give us a clue to what they mean (something to avoid).

I'd expect it to be the other way around: the name for something you want to avoid takes on a bad connotation. But that's why I'm reading the article, isn't it?

This phenomenon, where the sound of a word triggers an emotion or a meaning, is referred to as “sound symbolism”. Yet the idea that there might be a link between the sound of words and their meaning flies against accepted linguistic thinking going back more than a century.

And? Sometimes paradigms get reversed.

In our book...

Of course it's another book ad. What else is free on the internet anymore? Apart from this blog, I mean.

...we outline a radically new perspective on how we, as humans, got language in the first place, how children can learn and use it so effortlessly, and how sound symbolism figures into this.

Okay. Well, that first part sounds like speculation, but okay.

For example, if you pick a language at random that has the concept of “red”, the corresponding word is more likely than not to have an “r” sound in it — such as “rød” in Danish, “rouge” in French, and “krasnyy” (красный) in Russian.

Um.

The way an R is pronounced can be way different in different languages, such as the trilled R of Spanish or the nearly-German-CH sound of the letter in French.

Made-up words can be sound symbolic too.

All words are made up. Some just longer ago than others. And I'd be more surprised if newly minted words didn't have sound symbolism. Like when people go "whoosh" when a joke flies over someone's head.

In a classic study from 1929, the German psychologist Wolfgang Köhler observed that when Spanish speakers were shown a rounded shape and a spiky one and asked which one they thought were called “baluba” and which “takete”, most associated baluba with roundedness and takete with spikiness.

So much for flying against "accepted linguistic thinking going back more than a century."

Computer modelling of how children learn language has revealed that, as a child’s vocabulary grows, it becomes harder and harder to have unique sounds to signal different aspects of meaning (such as that all words relating to water should start with a “w”). Indeed, in a study of English sound-meaning mappings, we found that words that tend to be acquired earlier in development were more sound symbolic than words that are acquired later.

Computer modeling is a powerful tool, but it's only part of science. Also, I'm quite surprised these authors don't go into the "m" sound widely present in words associated with motherhood. Maybe they do, in the book.

I don't have much else to say, really (except "moist"). The article is less substantive than most book ads (which, again, I don't have a problem with here on a writing site), but the little bit of speculation I saw in it doesn't make me want to delve deeper by buying the book... and I'm predisposed to appreciating books on linguistics.

Still, I feel like it's something to keep in mind while writing, especially (but not limited to) writing that's going to be spoken, like a speech or screenplay: that the sound of words matters as well as their literal meaning.
April 10, 2023 at 10:29am
#1047949
Today's entry for "Journalistic Intentions" [18+] is about something that I never notice until it fails.

Hem


This word has a few definitions, but since the overall theme of this month's JI is clothing, I'm going to assume it's not referring to a makeshift fence or one of the sounds of indecision (along with "haw," which also pairs well with "hee.")

A while back, some bored scientists decided to program a computer to trace the evolution of language (specifically English, because these were UK scientists) backwards and forwards, both predicting where it might go from here, and delving deep into times from which few written records survive.

This, of course, was reported in various outlets as "Scientists discover oldest words in English!!!" Which is sensationalist and misleading as all fuck. There's a press release on it here, but it only gets more sensationalized the further you get from the source.

If you can't be arsed to click, their candidates for Oldest Word in English are I, we, who, one, two, and three.

As these words supposedly predate English, and long predate Modern English, it's a matter of definition to call them the oldest words in the language. And a while back, I did a blog entry on the word "lox," which they think hasn't changed in 8000 years. Here's the blog entry; unfortunately, the link is already broken: "Lox Pie". I did, however, find an updated link to the original article, or at least some version thereof.

So, what's all this hemming and hawing about? Well, in the course of researching this entry, I discovered that the word "hem," as used in fashion, dates at least from the 12th century C.E., and possibly from even earlier. Which makes it about 1/10th as old as "lox," but also puts it firmly into a time when some form of English was being used.

This wasn't, presumably, figured out by a computer that may or may not have been programmed correctly, but from what sparse written records survive. I can't be sure about that, but I would believe that "hem" is quite old, because, in general, shorter words live longer, and the idea of hemming a cloth garment is not something that's changed over time (though obviously, the technology to do it has).

One hems a garment to adjust its size, but the primary practical use is to keep a cut edge of cloth from fraying. Like a splice on the end of a rope, only in two dimensions.

So when it comes to discovering the origins of words, there's no need to be a-frayed.
April 9, 2023 at 12:02pm
#1047885
It's Sunday (some say Easter Sunday), so it's time once again to forget the worries of the present and focus on the disaster of the past.

Apparently, back in 2020 (a pretty bad year from most perspectives), I'd had a few blog entries, most of which I vaguely remember, talking about panpsychism, the persistent belief (for it is a belief, not scientific knowledge) that consciousness arises in complex life because every thing in the universe, including subatomic particles, has a rudimentary form of consciousness. I may not be getting it exactly right, but that's the gist of it.

Naturally (pun intended), I push back—not from any deep-seated need to be special, but because it's not, as far as I've been able to tell, a testable, falsifiable hypothesis. Apparently, I pushed back a few times, based on the title of the entry ("Once More With Feeling"), and the first line:

This is probably the last bit about panpsychism I'll be linking. For now. Maybe.

And I can't be arsed to scour every day since August 11, 2020 to see if I talked about it again until now.

But let's turn this into a Merit Badge Mini-Contest.

Yeah, I know, I haven't done one of these in a while. Interest seemed to peter out. But maybe I'll start them up again, perhaps once a month. Not today, though.

Anyway, the link (available at the original entry above) is still active, if you're interested in what this publication, and one philosopher, has to say about it.

This entry is about what I have to say about it.

In the link above, I copied the "elevator pitch" for panpsychism:

In our standard view of things, consciousness exists only in the brains of highly evolved organisms, and hence consciousness exists only in a tiny part of the universe and only in very recent history. According to panpsychism, in contrast, consciousness pervades the universe and is a fundamental feature of it.

And today, rereading this, I realized that I didn't address the misconception about evolution: that there are "highly evolved organisms" as opposed, apparently, to less-evolved organisms.

That point of view is dangerously self-centered.

Every living being on Earth shares a common ancestor. Every living thing on Earth has therefore been subject to evolutionary pressure for the same amount of time (3-4 billion years; the exact time is irrelevant to this discussion so I won't bother splitting hairs on it). We have certain adaptations that have made us very good at becoming an invasive species, even to the extent of being able to live in places like Antarctica and, for at least a little while, on the moon.

But that doesn't make us "highly evolved," any more than a bacterium is "highly evolved" because it acquired antibiotic resistance from its progenitors.

None of this means that panpsychism is right. I'm just pointing out that the "standard view" noted in the quote is kind of a straw man. I can accept the idea that any living thing has some form of consciousness, but every nonliving thing? I'mma need evidence. "Consciousness pervades the universe" is more in the realm of theology, which requires no evidence.

Anyway, the original entry goes deeper into other arguments, which I won't rehash here. Bottom line is, sure, it's a legitimate philosophy; it's also older than recorded language in humans (in the form of animism). It makes for good creative writing, and excellent (if later forgotten) stoned dorm room conversations.

What it's not, is science.

It may be, someday, when we know more. Which is why we do science.
April 8, 2023 at 10:40am
#1047803
I do like the occasional "well, actually..." piece, so here's one from Mental Floss:



By "Civil War" they mean the American one in the 19th century, not the one we're in now, not another country's war, and not the Captain America sequel.

I should note that I didn't fact-check this, but I did go to some of the links and it looks like they did their due diligence. Just don't blame me for any continued inaccuracies on their part.

The American Civil War is a pivotal and ugly moment in American history, but it's more misunderstood than you would think.

I wouldn't think. Given the near-deific significance it's accorded by Americans, northern and southern alike, I'd expect people to get its history right. But as is often the case, I overestimate people—or, at least, ACW fanatics.

1. Misconception: Lincoln’s policies enjoyed widespread support in the North.

I find that, in hindsight, an event in history is usually more clear-cut than it was at the time. I wouldn't expect that everyone in the North supported Lincoln, or that everyone in the South wanted to secede. It's kind of like "the colonies wanted to break away from oppressive England" without considering how many colonists wanted to keep saying "pip-pip-toodleoo."

Take a look at the article for some well-sourced examples.

2. Misconception: Robert E. Lee and Jefferson Davis were staunch secessionists.

I've known this was wrong for a long time (again, details in the link), but pointing it out now seems to be a failure to read the room.

3. Misconception: The Emancipation Proclamation ended slavery

When President Abraham Lincoln delivered the Emancipation Proclamation on January 1, 1863, it declared: “[All] persons held as slaves within any State or designated part of a State, the people whereof shall then be in rebellion against the United States, shall be then, thenceforward, and forever free.”

Well, clearly not, as slavery still exists. Even after the 13th Amendment (which the article covers).

4. Misconception: All amputations were done without anesthesia.

Sure, but it makes for a more compelling movie to ignore that.

5. Misconception: Only men fought during the Civil War

Pretty sure that if you look at wars throughout history, there were almost always women finding a way to join in the fun.

6. Misconception: Abraham Lincoln was the keynote speaker on the day of the Gettysburg Address.

Honestly, this one's pretty interesting. Not surprising to me, but has details I wasn't aware of.

One thing that you might not know about the address is that Lincoln wasn’t pegged to be the main speaker on that day. That honor belonged to Edward Everett, a distinguished scholar and orator who took the stage before the president.

Everett’s speech would go on for around two hours, totaling upwards of 13,000 words...

After Everett finished his speech, the president shook his hand and told him, “I am more than gratified, I am grateful to you.” Then the Thunder-Stealer-in-Chief rang out with “Four score and seven years ago ...” and made Everett’s magnum opus a historical footnote in under 180 seconds.


The original Rap Battle.

7. Misconception: The war was fought entirely in the U.S.

Well, duh, because technically, the CSA wasn't part of the US; hence the war.

But no, this part's mostly about naval battles.

I'd also add misconception #8: Everyone knows the war is over.
April 7, 2023 at 1:27pm
#1047729
Some of my entries this month are for "Journalistic Intentions [18+]. This one, for example.

Sweater


I've always disliked sweaters.

There's this vague recollection from when I was a kid and my mom called an article of clothing that her mother gave me a "sweater." I immediately hated it.

Not because the sweater was ugly—at that age, I couldn't have assessed its attractiveness, or lack thereof. Not because it was scratchy—which it was, but lots of the clothing I got foisted upon me back then was scratchy. No, it was because of the name.

Sweater.

Sweating, Kid Me reasoned to the extent of his limited abilities, is Bad. Why would I want to do something that's bad?

Now, half a century or so of not wearing sweaters later, because of this prompt, I got off my ass and actually looked into the etymology.

It didn't help.

As with many articles on Wikipedia, I walked away more, not less, confused.

From the extensive discussion of "sweater" on that site, we get:

The OED gives "sweater" as appearing in 1882 and gives its definition as "A woolen vest or jersey worn in rowing or other athletic exercises, originally... to reduce one's weight; now commonly put on also before or after exercise to prevent taking cold. Hence a similar garment for general informal wear; a jumper or pullover"

You know what's an even worse name for an article of clothing than "sweater?"

Jumper.

Also jersey, but okay, I can see an English word for an article of clothing named after a place in the UK.

I'm not disrespecting the British here. Just saying that with "sweater," at least the word has some connection to what a person wearing it does. But "jumper?" Come on. Picture in your head a proper British person jumping. Can't do it, can you? Whether or not they're wearing a knitted or crocheted garment.

"Pullover" at least makes some sense. If you ignore the buttoned kind (usually called a "cardigan," and don't get me started on wtf that word might mean), one dons a sweater by pulling it over one's head.

But there are numerous other articles of clothing, notably t-shirts and sweatshirts (which are etymologically, but not fashionably, related to sweaters), that are also put on by pulling them over one's head.

French is no help, probably because there's a bit of cross-pollination between French and British English. You know what they call a sweater in France? It's not pronounced "sweat-ay" like you might expect. No, the French equivalent of a sweater is called a pull.

Just. Pull.

Meanwhile, the French word for the verb form of pull (at least the one relevant to this rant) is tirer. Which is pronounced with an "ay" sound at the end.

Fashion words sometimes make no sense to me, and I've got seven more to do before the end of the month. Wish me luck.
April 6, 2023 at 7:51am
#1047614
Today, we're looking within.



I'm sure this comes as no surprise to at least 1% of your readers.

I saved this one because, fairly recently, it came to my attention that while I see spoken words as text in my head, not everyone does that. It may be a reason I'm pretty good at spelling: lots of practice.

When you hear someone talk, do you see the words in your mind’s eye? Or do you see what they’re saying as a movie? It’s easy to assume that the way you perceive the world is the same for everyone.

I know I've said this before, but when I was a kid, I had a deep, philosophical conversation with another kid, one which amounted to "how do I know that the color I see as 'red' is the same color as what you see as 'red?'" Much later, on the internet, such a question popped up as a profound revelation. Me, I'd spent the intervening years occasionally wondering how one would go about investigating such a thing.

The reason I bring this up is to note that I've been open to the idea that we each perceive things differently for a long time, and yet, it still sometimes surprises me.

We range from those who are “mind blind” and cannot visualise things mentally to those who have brilliant images in their mind. Some people see shapes in their mind when they hear music, or imagine colours when they see a number (a phenomenon called synaesthesia).

Yes, the article uses British spellings. Those don't usually appear in my mind when I hear words, but I can see them when someone speaks with a British accent.

There’s even a type of synaesthesia in which people’s minds run a written text on a mental ticker tape. Even though ticker tape (or subtitle) synaesthesia (TTS) was first studied in 1883 by Charles Darwin’s cousin Francis Galton, little was known about it until lately.

Yeah, I don't think it's quite like that for me. I do remember some instances of synesthesia (American spelling), like when I was a kid, each day name (Monday, Tuesday, etc.) had a different color associated with it. What those colors were, I can't describe. But ticker tape was already obsolete by the time I started reading. I saw it in action, once, when I was very young, and never since. I'd expect that a person's inner experience would be based on stuff they're familiar with; today, one might see words in their mind as text on a small screen.

Any reason to mention Darwin here other than name-dropping? Your science should stand on its own, regardless of the accomplishments of your more famous cousin. (Darwin, incidentally, had a lot of cousins. Hell. He married one.)

A study published recently, one of the first to explore this condition in more depth, found that of the 26 participants with TTS, most had additional types of synaesthesia, most commonly space-time or number-space, where they experience time or numbers as a location.

26 is hardly a compelling sample size, but considering the apparent rarity of TTS, it's understandable. I just wouldn't make any far-reaching leaps based on the study.

While many adults can imagine written words when listening to speech if asked to do so, people with TTS are different because of the ease with which it happens. In fact, some cannot stop, even when it makes it difficult to follow conversations when lots of people are talking at once. But the ability to process information from different senses at the same time is often helpful. There is an evolutionary explanation for this.

Of course there is, but it's speculative and possibly wrong.

FFS, I have to explain this again: Yes, evolution is fact. But asserting that such-and-such is the case in our minds "because our ancestors needed it on the savanna" or whatever is, at best, an interesting thought experiment, and, at worst, utter nonsense. Unless they can back it up with actual evidence.

In this case, a moment's thought might suffice to convince you that our savanna-dwelling ancestors didn't see written words when someone spoke, because writing hadn't been invented yet.

The more general informational-processing argument, okay, sure. But that would also apply to all the other animals that share, to one degree or another, our sensory array: sight, hearing, etc. And thus be a holdover from way before apes.

Okay, enough of my perennial ragging on evo-psych. Skipping that part of the article.

Except for this gem:

So if we also saw, or even smelled an animal in the scrub behind us, we could more easily determine if it was a dangerous predator we needed to escape from, or a fluffy little rabbit. As they evolved, our brains became experts in tying information from different senses together.

Which of course made my mind conjure an image of the Killer Rabbit from Holy Grail.

Researchers recently suggested fewer people are born towards the end of the low-visual-imagery spectrum. Extreme forms of aphantasia, people who do not have visual imagery at all, are rare. Less than 1% of people have this form.

While I've known about synesthesia for years, this is probably the first time I've encountered aphantasia, and I love the word, even though it describes a condition I hope I never experience.

Some research suggests we are not born with the ability to imagine. Instead, visual images emerge and develop during early childhood. This is followed by a decline in visual imagination in adulthood.

Okay, this strikes me as being like saying "we are not born with the ability to speak." Obviously, in reality, no child has ever popped out and immediately said, "Hello, Mother. Nice tits." No, we learn that later, but our capacity for speech (or imagination) is inborn. For most people.

Despite my issues with some of the claims here, I don't think there's anything inherently wrong with the basic idea: that we each have a different interior world. And, as the article notes at the end, there's a lot more to discover. We just have to begin by imagining...
April 5, 2023 at 8:56am
#1047527
The final prompt from this round of "JAFBG [XGC]...

What current topic/craze are you sick of hearing about?


Well, I'm old enough to say "All of them."

But I suppose that at the top of the list sits TikTok.

"Oh, is it because the government is trying to ban it?"

No. I don't care.

"Because China uses it to spy on us?"

Nope.

"Because it's what kids these days like?"

Whatever. Kids have always liked, and will always like, stupid things. I did, when I was a kid.

No, the reason I wish TikTok would go away (though I don't agree with a government ban) can be summed up in one alliterative, two-word phrase:

Vertical video.

Vertical video is, in 99.9 percent of cases, an abomination against everything that is good and right.

Let me provide a bit of background.

One of the earliest computers I played on was back in the 70s. It barely even qualified as a computer, being dedicated to word processing—a typewriter with a floppy disk drive. Not one of those solid-case tiny plastic disks, either, oh no. Nor one of the larger, but still below-average, 5-1/4". No, this word processor took massive, throbbing 8" disks.

But the other "feature" of this glorified typewriter was its screen. We're used to looking at paper—typewriter paper, e.g.—in vertical orientation. So the makers of this early word processor (I think it was Wang) oriented their green CRT monitor vertically to emulate a typewritten page. And no, it didn't have a preview function; all your codes (think early versions of {i} and {b} and {indent} here on WDC) were visible on the screen but not in the printout, so you had to print a test copy.

Anyway, the point is, this seemed weird to me even then. Paper is paper, but screens, monitors, and TVs were, by that point, standardized in a horizontal format. Mostly something like 3V:4H.

Through time, I've had several different computers, and their screen evolution went something like this: CRTs with 4:3 ratios, monitors with even more horizontal exaggeration, actual widescreen monitors. Movies released in widescreen format often had to be edited or letterboxed to fit a 4:3 monitor. Anyway, point is, I think we finally settled on an ideal ratio, whatever it is, but it's horizontal. This conforms with human vision, which has a much greater range side-to-side than it does up-to-down.

Which is one reason that the inability of some mobile phone video makers to turn the fucking phone sideways enrages me to the point of apoplexy.
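For the numerically inclined, here's a minimal sketch of the letterboxing math from a couple of paragraphs up. The 640x480 screen and the 2.39:1 film frame are my own illustrative numbers, not anything from this entry, and the little function is hypothetical.

def letterbox(screen_w, screen_h, source_w, source_h):
    # Scale the wide source down until its width fits the narrower screen,
    # then split the leftover height into black bars top and bottom.
    scale = screen_w / source_w
    scaled_h = round(source_h * scale)
    bar = (screen_h - scaled_h) // 2
    return screen_w, scaled_h, bar

# A 2.39:1 film frame (239x100) shown on a classic 4:3 monitor (640x480):
print(letterbox(640, 480, 239, 100))  # (640, 268, 106): roughly 44% of the screen is black bars

That's why the "edited" alternative mentioned above existed: rather than surrender nearly half the screen to bars, studios often cropped the picture to fit instead.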

One of the earliest examples I saw was some dude trying to capture the aftermath of, I don't know, a car and truck accident or something. Holding his goddamn phone vertically, he had to frantically pan side-to-side in order to catch all the carnage (not literal carnage; I don't recall there being bodies or anything). If he'd only turned the assmunching phone sideways, he could have captured the entire scene.

Since then, I have refused to watch any video in vertical format, on my phone or on the laptop. Well, with two exceptions:

1. Explosions. I don't really care about format if I'm watching shit blow up.

2. Rocket launches (with or without explosions). These are the only events that demand vertical format, because, well, everything important happens vertically.

As an aside, I don't have the same issues with still photographs or paintings. When I did photography, I'd orient the camera as appropriate: vertical for portraits (hence "portrait" mode) and horizontal for groups or landscapes (hence "landscape" mode). But videos are different.

As I blame TikTok for the appalling rise (pun intended) in vertical video, I choose to rant about TikTok in general.

So that's it. Tired of hearing about it, so writing about it. Paradox!
April 4, 2023 at 11:46am
#1047477
I saved this one a long time ago (relatively speaking), and I don't remember why. Let's find out together, shall we?

One Word That Fights Off Both Viruses and Loneliness
There's a way to limit your exposure to the dangers of illness and isolation.


"One word?" But "chicken soup" is two words. Also, what about limiting your exposure to the danger of being around people?

Have you ever gone to bed at night feeling sad or lonely?

Well, that's not a good question. I think you'd be hard-pressed to find someone over the age of, I dunno, zero, who has never gone to bed feeling sad or lonely.

Then, you woke up not feeling refreshed, but exhausted and on edge. You are not alone.

Yes. Yes, you are. Otherwise you wouldn't have been feeling lonely, would you?

We all know how Covid has produced higher levels of isolation. Wearing masks, as important as they are, is the equivalent of placing an emotional wall between yourself and others because you can't recognize facial expressions.

Oh, boo hoo. Most of my human interaction takes place on the internet. Facial expressions are entirely optional, with practice. What surprised me, with my partial face-blindness, was how I'd manage to recognize people even with masks on. I quickly came to realize that I don't rely on mouths at all, noses somewhat less, mostly hair and eyes. Speaking of which, eyes are very expressive, and they're generally not covered by N-95 masks.

Furthermore, the lack of daily contact caused by sitting in front of a computer screen (I'm certainly guilty of that) is no substitute for real, person-to-person contact and human touch.

Extrovert-like typing detected.

See? I can tell nuance just from phrasing.

But, okay. Most people are extroverts, so while very little of this applies to me, it's a window into that other world, the world of (shudder) people.

A study from the Proceedings of the National Academy of Sciences found that "Prior-day feelings of loneliness, sadness, threat, and lack of control were associated with a higher cortisol awakening response the next day." When cortisol floods the body and the brain, it has been shown to put the brakes on our immune system by reducing T-cells; cortisol has been shown to kill NK cells—important immune cells that help fight off viruses and even some kinds of tumors.

If I cared, I'd check that link and double-check some of the claims here. For our purposes, though, I'll take them, provisionally, at face value.

Imagine you had feelings of loneliness that lasted for days or weeks or longer.

I'm not a psychologist, but isn't that called "clinical depression?"

This could produce a chronic state of stress resulting in chronic health conditions. In fact, it's been reported that eight of 10 commonly prescribed medications are for symptoms of stress.

Or, and bear with me here, maybe people are stressed because they find themselves having to work long hours at a shitty job for subpar wages.

That's where mindfulness comes in.

Oh. That's why I saved this article.

To help reduce suffering, mindfulness teaches us how to experience the world through immersive inter-being with our surroundings. This means seeing ourselves mirrored in our human community, the air, the water, the planet, the plants, and all creatures large and small.

Too much work. And I've heard rumors that you can get such an experience through the use of certain pharmaceuticals. Oh, hey, maybe that's a reason why micro-doses of LSD are reported to cancel depression?

In our culture, it's commonplace to put a high value on our independence. This illusion of independence was lifted for me when I was in the monastery and had my first meal with the monks.

This came out of left field, but think "eastern," not "western."

If you haven't guessed by now, the "one word that fights off viruses and loneliness" is relationships.

There. Saved you a click.

The next evening that you feel lonely, remember that your body is listening in on your social and emotional experiences of the day. This is why that matters:

If you feel alone and lacking support, then your body will boost your cortisol response to prepare you for what it thinks will be a stressful day.


I'm not going to deny the link between mental and physical health, though it's probably way more complicated than a mere cortisol response. But I have a mantra for just such (rare) occasions: "Damn, it's good to be alone."

Alone doesn't mean lonely. Alone means you're not obligated to mitigate your actions for others' comfort or convenience. You don't have to watch your words or apologize later when you failed to. You don't have to concern yourself with whether or not someone else is hungry or tired. And best of all, you don't have to explain yourself.

Which is not to say I'm antisocial. I enjoy being around people, usually. Like on Saturday, when I went on a wine-tasting adventure with four other people (plus the very well-paid driver). Had a great time, found some delicious wines, saw some great views (it's a real privilege to live in Virginia).

Then everyone else went home, I passed out, and later, I got to nurse a hangover without worrying about how loud other people were being.

Back to the fluffy article:

Create mutually positive and satisfying relationships. This can take time, but the best mountain climber in the world can only take one step at a time. Start tonight by taking a simple, first step that connects you with another.

A "simple, first step" like "Hey, girl, I wonder if your software is compatible with my hardware?" My days of using pickup lines are long gone. My days of successfully using pickup lines are but a distant memory.

The article has a "conclusion" section, but I'm substituting my own:

Conclusion:

Thank you for your visit to our copium den. Buy our books, subscribe to our website! Because you aren't already financially overextended. And remember, kids, if modern society has you beaten down into a corner, it's your fault, not society's.
April 3, 2023 at 11:03am
#1047419
Just two prompts left in "JAFBG [XGC]. Might as well go the distance. I also intend to participate in "Journalistic Intentions [18+] this month; if you like open-ended blog entry prompts, that should be fun.

Well, that's easy to say. Tell us about something that sounds easy but you find incredibly difficult.


Oh, there are a lot of these. Like "Just talk to her," or "Call the doctor about that."

Sometimes the hardest thing for me is just getting started, even if the task itself is easy. Like, say I've decided that today is pencil-sharpening day, the day I set aside to make sure all of my standard pencils are nice and sharp. Easy, right? Maybe a little bit boring, but ultimately satisfying to see those little curls of wood peel off the points. So it's not like I don't want to sharpen my pencils. It's just that I don't want to stop wasting time on other things.

"Okay, Me, stop looking at YouTube videos and go sharpen your pencils."

"No."

"That video ended. Let's go sharpen pencils."

"But there's another one in the recommendation queue. See?"

This was especially dangerous when I was working. The number of times I cursed myself out going, "Just. Do. The. Thing." only to have my inner voice go "Don't wanna" is embarrassing.

I don't actually have a pencil-sharpening day. As an engineer, I favor mechanical pencils. Better yet, computer solutions. The point isn't the task; the point is how easy something can be to do while being almost impossible, psychologically, to start.

I even talked to a shrink about it once, in a session. He just shrugged. "So, you don't want to." "But I do want to." "Clearly, you don't." "Okay, clearly, I don't want to. But I want to want to." It was, I think, at that point that I acquainted myself with the concept of infinite recursion in my own brain. I want to. I want to want to. I want to want to want to. And so on to infinity, while, in the meantime, the task goes undone because I'm contemplating the vastness of the universe.

But. All of those mental blocks pale in comparison with the one phrase that is guaranteed to freeze my thoughts and actions, sending the chill of intergalactic space through my whole being, rendering me utterly incapable of action. The one phrase that strikes terror in my scarred and stony heart, because I have no idea how to implement it.

The phrase? "Be creative."
April 2, 2023 at 10:25am
#1047344
In the course of picking, at random, an older blog entry to revisit, I ignore anything less than a year old. I figure it needs time to become vintage, and one year is generally enough for that on the internet.

Today, I got my first result from 2022—but it was from early January, so it's more than a year old. As it turns out, I do have something to say about it. The entry was mostly the answer to a former prompt from "JAFBG [XGC]; if you're following along, you know I've been working through the current iteration of prompts there.

As for the entry itself, here's the link: "Judge, Mental

It starts off with a personal update:

Yesterday morning, we got a few inches of ugly, dense, watery sn*w which brought down branches, leaves, trees, and a nearby transformer (the electric kind, not the Michael Bay kind, though I did get to hear it explode). And my home generator picked yesterday to go on strike for higher wages and better working conditions. To be fair, any wage would be higher than what I'm paying it now, but come on, you only have to work like twice a year; get over yourself.

It took me several weeks, as I recall, to find someone to look at the generator. Fortunately, I didn't need it during that time. They did whatever they needed to do, and then, the next time the power went out... the generator failed again. Fortunately, that time wasn't during an intense cold snap.

I think it's finally back in shape, after I called out someone competent. Fortunately, or unfortunately (depending on your perception), it hasn't been fully tested under adverse field conditions, as this winter was mild.

If the power hadn't popped back on in the middle of the night, they'd have had to chip my corpsicle out of the solid block of ice that had once been a house.

I also have a vague recollection that, once I got the central heat working again that early morning, the thermometer on the thermostat said it had gotten all the way down to... 58F. Nowhere near literally freezing; just above, in fact, the ideal temperature for some darker beers and red wines. But enough for me to freeze my nads off.

In my last Comedy newsletter ("The Weather), I described with only a little bit of hyperbole how cold my dad kept the house, which you might think prepared me for a life comfortable with persistent hypothermia (my dad certainly thought, mistakenly, that it would toughen me up), but, in fact, the opposite became true.

Look, I live in Virginia. I expect this kind of shit at least once per winter. It doesn't mean I have to like it.

And this year, I found out what I like even less: not having to put up with "this kind of shit" all winter. I mean, sure, it was nice to have only a couple of sub-freezing days, and no sn*w to speak of, but it ain't natural.

Hell, there was an extended period over the winter where it got up to the 70s (F again, of course) during the days. That's warmer than it's expected to be today. Not entirely unusual for that to happen occasionally in a Virginia winter, but the lack of sn*w beyond a few non-sticking flurries was out of the ordinary. It's April, now, obviously, and it's been known to sn*w in April here; we'll see if we get any.

The rest of the entry doesn't really need embellishment; it's all about a thing I'm judgmental about: beer. About the only thing I can say is: I've been working on it, and I think I'm fairly successful in, if not being less judgmental about it, at least being less of a prick.


© Copyright 2024 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted Writing.Com, its affiliates and its syndicates non-exclusive rights to display this work.
