Printed from https://www.writing.com/main/profile/blog/cathartes02/month/1-1-2026
Rated: 18+ · Book · Opinion · #2336646

Items to fit into your overhead compartment


Carrion Luggage

Blog header image

Native to the Americas, the turkey vulture (Cathartes aura) travels widely in search of sustenance. While usually foraging alone, it relies on other individuals of its species for companionship and mutual protection. Sometimes misunderstood, sometimes feared, sometimes shunned, it nevertheless performs an important role in the ecosystem.

This scavenger bird is a marvel of efficiency. Rather than expend energy flapping its wings, it instead locates uplifting columns of air, and spirals within them in order to glide to greater heights. This behavior has been mistaken for opportunism, interpreted as if it is circling doomed terrestrial animals destined to be its next meal. In truth, the vulture takes advantage of these thermals to gain the altitude needed to glide longer distances, flying not out of necessity, but for the joy of it.

It also avoids the exertion necessary to capture live prey, preferring instead to feast upon that which is already dead. In this behavior, it resembles many humans.

It is not what most of us would consider to be a pretty bird. While its habits are often off-putting, or even disgusting, to members of more fastidious species, the turkey vulture helps to keep the environment from being clogged with detritus. Hence its Latin binomial, which translates to English as "golden purifier."

I rarely know where the winds will take me next, or what I might find there. The journey is the destination.
January 22, 2026 at 8:09am
#1106518
Breaking one of my until-now unspoken rules here, I'm going to link to HuffPo today. As a reminder, I browse using ad and script blockers, so hopefully you'll be able to see the content through whatever popups they push at you.
What 'Only Children' Bring Up The Most In Therapy
From feeling misunderstood to putting unnecessary pressure on themselves, here's what the only child may need help with.

I'm shattering my rule because, as a former "only child," I was interested in what they had to say. Of course, it's been over 40 years since I could be considered a "child," but it's not like I suddenly grew siblings as I got older.

Okay, that's not entirely true. There are people who I consider brothers and sisters, though I can't be sure if it's the same kind of relationship because all I have to go by are other people's stories. And judging by some of those stories, I didn't miss out on anything good. Some, but definitely not all.

If you grew up as an only child, you’ve likely heard some of these stereotypical phrases at some point in your life: “That’s sad you grew up all alone.” “Your parents must’ve spoiled you.” “Do you have a hard time making friends?”

And yet, they never ask people who had siblings things like "Was it hard, not being the center of attention?" Or, "How did it feel to feud with your siblings over the inheritance?"

To address the quoted questions from my point of view: Not sad at all, it prepared me for a life of something close to self-sufficiency; yeah, they kind of did, but so what; and no, what I have a hard time with is meeting people.

Yet recent research shows that many of these portrayals of only children are inaccurate.

Color me shocked.

Even though growing up without siblings is becoming more common, there’s still a long-lasting stigma around only children.

And? At least when I was a kid, there was an even bigger stigma around childless people (the concept of "childfree" wasn't a thing yet). My parents could adopt exactly one brat, and that was, to my great good fortune, Me.

We talked to therapists about the most common issues they hear only children bring up.

What's not immediately clear is that this is mostly about what they bring up when they're older. You're always someone's child, but you're not always a "child." English is weird.

In therapy, adult only children sometimes share that they feel lonely because they come from a smaller family and don’t have any sibling relationships.

Not meaning to minimize others' experiences, but one thing I don't remember ever feeling was "lonely," either as a child or as a (technical) adult. There was always someone around to interact with. But I am moved to ask, perhaps rhetorically: "What about people with siblings who feel lonely?"

In my view, it's better to never have had siblings at all than it is to be in a shitty relationship with the ones you have.

“Holidays can be especially lonely for some only children because they often don’t have the big family gatherings that you see in movies and on TV.”

Yeah... those movie and TV gatherings are generally idealized (or, possibly, whatever the opposite of "idealized" is when it shows a dysfunctional family). I've been to big family gatherings (both my ex-wives came from more traditional families, though one was also adopted), and while it was never exactly an unpleasant experience, I personally prefer to stay home by myself and relax rather than putting my best face on.

As adults, many only children will seek out close friendships that feel like family members to fill that void, Clark said.

See, this kind of wording is something I bristle at. It implicitly makes having siblings the "norm" while keeping one-child families as the "weird." Yes, as the article notes, they're a minority. But so are lots of other minorities (gay people, for example), and very few professionals these days would say something like, "many gay people will seek out close friendships with another gender to fill that void." At least not without getting pushback from both gay and straight folks.

It's natural to seek out close friendships. Even I do it. It's not an exclusively "only child" thing.

“Many adult only children feel overwhelmed and stressed being the only person in their family to handle all the elder care responsibilities for their elderly parents,” Greene said.

Well, I had parents (and a childfree aunt) to deal with. And, to be honest, I couldn't. My parents both developed dementia, and that was way beyond what I was able to handle, so yeah, I hired professionals for that. It wasn't like I could quit my job to care for them full-time.

Can I just point out, though, how weird it is that we put all the burden of elder care on the kids? That sort of thing may have made sense in a pre-industrial society, but now, it's just weird.

Though having lots of attention from parents can lead to closer relationships with them, some only children may also feel like their every move is being watched.

Well, that just prepares them for the reality of a surveillance state. Oh, and again, that's not limited to onlies.

“Growing up as an only child can create a large sense of independence, which can be both a strength and a weakness,” said Priya Tahim...

While I admit that it can be a weakness, I see it as a strength, at least in myself. I've never been reluctant to ask for help when I truly needed it. Speaking of which, anyone want to come over on Monday and shovel sn*w, so I don't get another heart attack? I'll pay.

They may feel misunderstood or judged for being an only child.

Okay, sure. But partly, that's because of articles like this one.

Further perpetuating these stereotypes, only children are often portrayed negatively in movies and TV shows, such as being spoiled, selfish and having poor social skills, Greene added.

I remember doing an article on a similar subject, recently, focused on adopted children. As I was both, my representation in media is fucked. At least until I remember Clark Kent.

If you’d like to connect with and seek support from other adults who grew up without siblings, Greene recommends joining support groups on Facebook for only children.

No.

Therapy can also be an effective place to explore how your childhood is shaping who you are — no matter what your birth order is.

I'm not going to rag on therapy in general. I've done it, with mixed results. But here's the problem: one of the things I'd like to talk about in therapy is my lack of motivation to do just about anything. To do that, I'd have to find a therapist. To find a therapist, I'd have to do work. And I don't have the motivation to do work, so I don't go looking for shrinks. Is that a vicious cycle, or a catch-22? I tried reading that book once and got bored very quickly.

“Whether you are an only child, [oldest child], middle child or [youngest] child, there are pros and cons to each,” Tahim said. “It’s how we choose to grow, learn and adapt … that truly matters.”

While I could quibble about "choose," I'm reminded of one of my favorite quotes of all time, from cartoonist R.K. Milholland:
In the end, we decide if we're remembered for what happened to us or for what we did with it.
January 21, 2026 at 8:42am
#1106458
Unlike yesterday's entry, I know exactly why I saved this one from Mental Floss: because words are fun.

15 Words Derived From Mythological Creatures—From “Money” to “Cereal”
Characters of ancient Greek and Roman mythologies have worked their way into modern vocabularies.

I'm also going to brag that I knew almost every one of these. You can believe me or not; doesn't change anything.

As the first month of the year, January somewhat appropriately takes its name from the Roman god Janus, who was associated with entrances, doorways, gates, and beginnings.

Knew that one, too, though January wasn't always the first month of the year.

15 more words we owe to the Greeks and Romans are explored here.

And I'm not covering all 15.

Aurora was the Roman goddess of the dawn... As the early-morning bringer of daily light, Aurora’s name later came to be attached to the famous dawn-like phenomenon of swirling colored arches of light that appear in the night sky at high and low latitudes.

Okay, well, auroras aren't very dawn-like, from what little I've seen. And I've been trying to see; there have been reports of auroras being seen all through the continental US due to a recent solar storm but, as usual, I saw nothing.

Hyacinth is said to have been a beautiful young man who was struck on the head and killed while the god Apollo taught him how to throw a discus.

At least he didn't teach him how to throw a disco.

This, by the way, was the one I hadn't been aware of.

Both money and the coin-producing mint where it is made take their names from Juno Moneta, an epithet for the Roman goddess Juno specifically associated with an ancient temple erected in her honor on Rome’s Capitoline Hill.

What the linked article doesn't say, and is only mentioned in passing at the link given there, is that the actual translation of "Moneta" is "warner," as in "she who warns." I find this, in connection to money, amusing.

Derived ultimately from a Greek word meaning a distribution or doling out of something, Nemesis was the name of a Greek (and later Roman) goddess of retribution and divine vengeance, who was tasked with either punishing or rewarding people for their evil or benevolent actions.

And yet the "rewarding" part gets neglected.

There are, of course, more at the link, for those who enjoy etymology. Just, as usual with MF, don't take anything too seriously unless you've double-checked the facts.
January 20, 2026 at 9:41am
#1106391
This is one of those times when I don't remember the original reason I saved something. But whatever; I'll find something to yap about. From NPR:

I guess I might have kept it because it's a word origin thing, and I do like knowing origins. But I've known this word's origin for decades, so I don't know.

Since the word was coined in the 18th century, "serendipity" has been used to describe all kinds of scientific and technological breakthroughs, including penicillin, the microwave oven and Velcro.

I'll take their word for it. For now.

And let's not forget that it was the name of the charming 2001 romantic comedy...

I'd already forgotten, thanks.

"Serendipity" — as the Merriam-Webster dictionary defines it — is "the ability to find valuable or agreeable things not sought for" or "luck that takes the form of such finding."

A dictionary, being descriptive and not prescriptive, is the beginning of understanding, not the end.

While the word has often been associated with good fortune or happy accidents, its origin suggests that serendipity goes beyond just happenstance. Some researchers argue that serendipity can be acquired through skill and that opportunities for serendipitous moments occur more frequently than we realize.

Okay, but wouldn't that give it a different definition?

The term was introduced by English politician and writer Horace Walpole in a letter dated Jan. 28, 1754. Walpole is widely credited with writing the first gothic novel, The Castle of Otranto, but he was also the inventor of dozens of words in the English language, including "souvenir" and "nuance,"...

Well, I thought "souvenir" was French, but I suppose someone had to port it to English. "Nuance" is definitely from French.

Walpole said he drew inspiration from a Persian fairy tale, "The Three Princes of Serendip." (Serendip is a historical name for Sri Lanka.)

No idea why I remembered that word origin over lo these many years, when I've forgotten so much else.

Over the years, the definition of "serendipity" has broadened slightly.

"I think often now people will use it in a bit more of a generic sense to mean a positive thing that happened by chance," Gorrie said. " It's the same basic meaning, but it's less to do with finding and more just to do with happening."


Yeah, words have a tendency to do that.

Personally, I don't know if I've ever used the word in other writing (besides today). I don't particularly like it. It's too close to "serenity," for one thing; and, for another, I suppose I was never quite sure of its nuance (see what I did there?) For a third thing, I can't say or even think the word without thinking "Dippity Do."

However, to Sanda Erdelez, a professor at the School of Library and Information Science at Simmons University, serendipity involves more than just being at the right place at the right time.

" What matters is not just chance, but how people recognize this opportunity and then how they act on that opportunity," she said. "There is actually an element of human agency in it."


I could argue that the ability to recognize and act on an opportunity is itself a form of luck: either you start out with that character trait, or you find an article like this one, by chance, and decide to work on that aspect of yourself. (Whether such efforts can be successful, I leave up to the reader.)

In her research, Erdelez focused on how people come across information important to them either unexpectedly or when they are not actively looking for it. She called them "super-encounterers."

"These are people who have a high level of curiosity," Erdelez said. "[They] have either a number of hobbies or interest areas so they can see connections between various things."


Oh. Yeah. That's why I saved this article: I consider myself a curious person with many areas of interest, and for as long as I can remember, I've tried to see connections between disparate things. It is, I think, a good trait for a writer to have.

So, for those on the hunt for serendipitous moments, Erdelez suggests carving out time from a busy schedule to give chance a good chance to happen.

Yeah, that borders on mysticism, but I'm not going to quibble about that; serendipity or not, I can't help but feel it's important to do that anyway.
January 19, 2026 at 9:30am
#1106334
While LiveScience isn't where I'd go for trustworthy scientific information, this article had enough of interest for me to share.
Is the sun really a dwarf star?
Our sun is huge, at least compared to Earth and the other planets. So is it really a dwarf?

Well, I don't know. Is the Dead Sea really a sea? Are the Blue Ridge Mountains really mountains? Is the East River a river? And I won't get us started on Pluto again.

The sun is the biggest object in the solar system; at about 865,000 miles (1.4 million kilometers) across, it's more than 100 times wider than Earth.

Using linear measurements to compare celestial bodies can be misleading. Sure, you can try to picture 100 Earths edge-to-edge across the sun's apparent disc, or find one of the many illustrations of such that exist. Or you can look up a volumetric comparison to find that its volume is about 1,300,000 times that of Earth.

This doesn't mean that 1.3M Earths would fit inside the thing. Think of a crate of oranges, and how there's always space between the spheres.
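The arithmetic behind those two paragraphs is easy to check. A quick sketch (the sun's diameter is from the quoted article; Earth's mean diameter of ~12,742 km and the ~74% close-packing density of identical spheres are my own additions):

```python
# Sun vs. Earth, by width and by volume.
# Sun's diameter is the article's figure; Earth's diameter and the
# close-packing fraction are added here for the calculation.
SUN_DIAMETER_KM = 1_400_000
EARTH_DIAMETER_KM = 12_742
CLOSE_PACKING = 0.74  # max fraction of a volume identical spheres can fill

linear_ratio = SUN_DIAMETER_KM / EARTH_DIAMETER_KM  # ~110x wider
volume_ratio = linear_ratio ** 3                    # ~1.3 million
packable = volume_ratio * CLOSE_PACKING             # the "crate of oranges"

print(f"~{linear_ratio:.0f} Earths across the sun's disc")
print(f"~{volume_ratio:,.0f} times Earth's volume")
print(f"only ~{packable:,.0f} Earth-spheres would actually pack inside")
```

The cube on the linear ratio is the whole point: "100 times wider" understates the volume comparison by about four orders of magnitude, and the packing factor is why even 1.3 million overstates how many Earths would literally fit.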

Despite being enormous, our star is often called a "dwarf." So is the sun really a dwarf star?

We could call it a "tank" if we wanted to. So is the sun really a tank star?

My point here is that, at first glance, this isn't a science question; it's one of categorization or nomenclature. It's like asking "is Homo sapiens really sapiens?"

Dwarf stars got their name when Danish astronomer Ejnar Hertzsprung noticed that the reddest stars he observed were either much brighter or much fainter than the sun. He called the brighter ones "giants" and the dimmer ones "dwarfs..."

I do like knowing the history of science, and of words. Here, just as a wild guess, Hertzsprung was probably drawing on Norse mythology, which is absolutely crawling with giants and dwarfs.

Incidentally, there's some debate over the difference between "dwarfs" and "dwarves." Best I can tell, "dwarves" is generally used for the fantasy race popularized by Tolkien and blatantly stolen by D&D (Tolkien himself stole it from Norse mythology). From what I understand, humans of smaller stature prefer "dwarfs," and it's also the nomenclature for astronomical objects.

The sun is currently more similar in size and brightness to smaller, dimmer stars called red dwarfs than to giant stars, so the sun and its brethren also became classified as dwarf stars.

Like I said, it's a categorization thing. Also, "currently" is misleading. Yes, based on our best available information, the sun won't stay the same forever; it'll eventually blow up and turn red, or vice-versa. But "eventually" means billions of years from now.

Calling the sun yellow is a bit of a misnomer, however, as the sun's visible output is greatest in the green wavelengths, Guliano explained. But the sun emits all visible colors, so "the actual color of sunlight is white," Wong said.

One reason some non-scientists can't get into science is the nomenclature, though. For instance, the sun is also described, by astrophysicists at least, as a black body. As in black-body radiation. This confuses the fuck out of people, and they start muttering about "common sense," as if that were something that existed.

"The sun is yellow, but less-massive main sequence stars are orange or red, and more massive main sequence stars are blue," Carles Badenes, a professor of physics and astronomy at the University of Pittsburgh, told Live Science.

One of the science things that confused me as a kid was that, with stars, red is cooler and blue is hotter. Our bathroom faucet was labeled with blue for cold water and red for hot. Thus began my journey of understanding.

Color is probably less confusing than "dwarf" vs. "giant," though one can take those descriptions as being "smaller than average" or "larger than average" without getting too far off track. And yet, as we've seen, color descriptions can be misleading, as well.

Whatever box you put the sun in, and no matter how much I mutter about "the accursed daystar," it's still the sun, and while I avoid its direct rays like a vampire, it would suck if it weren't there.
January 18, 2026 at 8:18am
#1106259
Today's entry is a brief introspection that you can blame on "26 Paychecks" [E]

Write a 300 to 500 word piece about a writing project that you have been working on, but aren't pushing through to completion.

Explain the project: genre, plot synopsis, expected length (short story, saga, epic, novel, series).

Tell us how long ago you started writing it.

Tell us why you stopped working on it, or why the work is not advancing.

Tell us what people in this group or on Writing.Com could do to help you see your project through to the end.


By “have been working on,” I suppose “only in my head” counts.

There was a NaNo project I did lo these many years ago. It’s meant to be a science fiction novel set in the next century, where human travel outside the Earth-Moon system is still not done for various technical and political reasons. Without giving away too much, the story is mostly about one pilot who breaks that barrier in a newly designed ship, built in secret and in contravention of international laws, in order to retrieve an ice asteroid that will make her orbital community more self-reliant and less dependent on Earth or Luna (such self-sufficiency is, of course, what those laws were written to prevent).

How long ago? I don’t know. It’s gotta be going on 20 years now. This is how I know I’m just not cut out to be a real writer: not because of lack of writing ability or ideas, but an utter inability to see things through.

Why did I stop working on it? Well, for starters, every time I looked at it as an editing project, I found something less like work to do. For finishers, the political milieu of the story is: a conservative, fascist, racist, protectionist hybrid corpo-theocracy has taken over most of the US, and, after Civil War II, the US is no longer the US but fractured into, basically, Good States (California, New York, etc.) and Bad States (Texas, Florida, etc.). That’s not actually what they’re called, but that’s the idea. Other countries are aligned with one or the other, but the biggest global power in the novel is a different, rival theocracy to the one in the former US.

Since I started writing the story, the US started heading for Civil War II, thanks to a conservative, fascist, racist, protectionist hybrid corpo-theocracy, so the milieu I envisioned has gone from “yeah, right” science fiction to “it took no genius to predict that” science fiction. So that’s why I’m not working on it now, apart from sheer laziness: my ability to do so without cackling at just how spot on my political, if not technological, projections were, would get in the way. It pays to be a pessimist; you can always find something to cackle about.

Tell us what people in this group or on Writing.Com could do to help you see your project through to the end.

If anyone could “help” me, I’d have completed it already. No, at some point, I simply gave up all hope of ever finishing that, or the three other novels I have in draft form, all promising, none actually finished.

(And that's still less than 500 words except for the italicized bits, which were just the assignments.)
January 17, 2026 at 10:12am
#1106189
Here's a short one from The Conversation which could perhaps have been a lot shorter.

Shorter how? Shorter by saying "Don't go on social media."

But, I suppose people are going to go there anyway, so I suppose the article is relevant.

When graphic videos go viral, like the recent fatal shooting of Charlie Kirk, it can feel impossible to protect yourself from seeing things you did not consent to see.

People have different tolerances for "disturbing" content, so yes, it should be up to the user. Kind of like with the Content Rating System here.

The major platforms have also reduced their content moderation efforts over the past year or so. That means upsetting content can reach you even when you never chose to watch it.

One could argue that you chose to watch it by being on the platform in the first place.

Research shows that repeated exposure to violent or disturbing media can increase stress, heighten anxiety and contribute to feelings of helplessness.

And I went to the link to the "study" and it's not enough to convince me that this is the correct conclusion. Still, whatever the reason, if you don't want to see certain types of media, that's your choice.

Practical steps you can take

I'm obviously not going to copy all of them here.

Set boundaries. Reserve phone-free time during meals or before bed. Research shows that intentional breaks reduce stress and improve well-being.

This is just going to make you see less content overall, not just less disturbing content.

Social media is not neutral. Its algorithms are engineered to hold your attention, even when that means amplifying harmful or sensational material.

I have no reason to disbelieve this, but it seems to me to be one of those things where, even if it's not true, it helps to assume that it is.

I’m the executive director of the Post-Internet Project, a nonprofit dedicated to helping people navigate the psychological and social challenges of life online.

Oh, now I see the author's bias.

You can try the PRISM process for yourself with an online class...

You know what's worse than the worst, most disturbing content on social media?

Stealth ads.
January 16, 2026 at 9:16am
#1106109
I don't remember why I saved this one. I have absolutely no recollection of anything I might have thought about it at the time. But it's funny, so here:

I know, I know. What with everything going on in the world, why talk about someone's home invasion by Robert T. Catt? Well, remember what I said yesterday about absurdism? Yeah.

"I looked and there's a cat. What I thought was a cat," Aprea said. "It took me a second. I'm like, 'Why is there a cat? And how did it get in?' And then, I stared at it for a second, and, 'Oh my goodness, that's not a cat!'"

I mean, technically, it's a cat? It's just not a cuddly one. Well, technically, it's cuddly... once.

Aprea said her dog had been barking at the bobcat, which was sitting in the corner of the room.

That is the actual, literal definition of "not news." Right up there with "dog bites man." It would be newsworthy if the bobcat barked at the dog.

Her husband and son managed to coax the bobcat back outside, but not before it scaled the door, leaving scratches on the wood from its claws.

To be serious for a moment, I'm glad this encounter with wildlife ended well for the wildlife. Too many times, the humans' answer is "SHOOT IT."

"I'm going to leave those scratches on my wall forever so I can say, yeah, yeah, that was a bobcat," she said.

Oh, I'd absolutely do that.

And the only other thing I can think of right now is this classic xkcd strip.
January 15, 2026 at 11:09am
#1106049
Despite the seasonal nature of the headline here, from Big Think, it's still relevant.

What would be a good gift to buy a philosopher?

I'm disappointed one of the answers wasn't "Nothing; it's the thought that counts." But as I've said, philosophers have no sense of humor. We have a different name for philosophers with a sense of humor: comedians.

But there was one answer that really got me thinking. I am sure it was meant as a joke, but you have to be careful joking with the philosophically minded. Because quite a few people said “purpose” or “meaning.”

Case in point. And yes, you do "have to be careful joking with the philosophically minded." Like little kids, they can be annoyingly literal. Source: me, who can be annoyingly literal.

In his book, Mortal Questions, [Thomas] Nagel has an entire essay devoted to the “absurd.” Absurdity — traditionally represented by Albert Camus — is the philosophical position that humans are caught in this dreadful existential disappointment: We are a meaning-seeking, meaning-needing species, and yet the Universe is meaningless. We’re wired to want a thing that the Universe cannot provide.

Which is not to say that this is right or wrong. Personally, I disagree with some of the premises there, but it's not about whether we agree or not.

Nagel, though, thinks that all this talk of “meaning” is a misguided fool’s errand. In his essay, Nagel argues that we can identify three different types of meaning-grasping angst in the philosophical literature, and all of them are logically flawed.

Kind of ironic, isn't it? To spend so much time arguing against something that, by your own definition, isn't meaningful? Sometimes I think the only true philosophers are the ones who don't find their meaning in publishing philosophical essays.

First, the Argument of Time

When we think “everything I’ve done will end in death” and “nothing will matter in a thousand years,” then it is enough to push you into an existential crisis.


Or, as I like to put it, "There's no such thing as a happy ending, only stories that end too early."

So, imagine that on Christmas Day, you open a box containing a magic amulet that gives you immortality... Is this any more meaningful a life than the one you have now?

I think it's already been pretty well established that death is part of what gives life what meaning it has.

Second, the Argument of Size

But I'm assured that size doesn't matter.

Now, imagine you open a present that contains an elixir that makes you the size of the Universe... Would you now have any more purpose to your existence?

Purpose? No. Something else to do? Sure.

Third, the Argument of Use

What is the point of anything at all? We waste our lives trudging to jobs we hate, to talk with people we don’t like, to live in a town we want to leave, and aspire to a future that was never what we wanted.

Need a blankie?

Nagel’s point to all of this is that when we talk about “meaning,” we often talk about it as a question without an answer.

And? Isn't that what Zen koans are, except the questions are at least more poetic in nature?

But while Nagel argues that Camus’s scorn and defiance are a little bit dramatic, he does agree that the best approach to “questions of meaning” is to live ironically.

Okay, I can get behind that.

We need to commit to life seriously while knowing that it has no “meaning” beyond what it is.

But I can't get behind that. At least, not for the standard definition of "seriously." I can fulfill my obligations as a human without being all serious about them. In fact, if it's not obvious, my personal philosophy is that everything is, or can be, funny, so a sense of humor is far more important to me than a sense of meaning.

As I like to say, if you want meaning, grab a dictionary.
January 14, 2026 at 9:44am
#1105974
Another argument for why science fiction should be required reading, from Cracked:
15 Scientific Breakthroughs That Just Might Not Be Great For Humanity
Have none of these people seen sci-fi thrillers?!

This is the SF equivalent of that carved bone thing from a couple of days ago.

After reading some of these, we might honestly head for the hills and start a new life amongst the trees.

And if you also read fantasy, you'd know why that's a bad idea.

15 Hybrid human-AI co-embodied intelligence

You know, this, or something like it, is quite literally the oldest theme in science fiction.

Researchers have begun deploying robots guided by AI to run chemistry experiments, handle materials, data-analysis, and lab workflows.

I don't expect a whole lot from Cracked these days, but that sentence doesn't say "co-embodied intelligence" to me.

14 Lethal autonomous weapons

This is one of those things that was going to happen with or without SF.

12 We’re very close to mind-reading

Doubt.

A recent study reports that a new neurotechnology can now predict preconscious thoughts — i.e. indicating what a person is about to think before they consciously realize it.

Yeah, not exactly, and even the article notes that it would need to hold up under further scientific scrutiny.

10 Brain-computer interfaces

This is a more modern staple of SF, and it can, like most technology, be either good or evil. My own thought is that they'll immediately figure out how to project ads directly into our brains with it.

6 Social credit & behavioral scoring

Yeah, considering how well the actual credit scoring system works, you know that will not end well, even if you haven't seen Black Mirror.

3 Predictive policing algorithms

Isn't that, like, the heart of every techno-dystopia?

1 “Chemputation”

Speaking of dystopias, I was pretty sure that cutesy portmanteau was of "chemical" and "amputation." Nope, turns out it's about computation.

Much less dystopian.

Going to leave it at that for today; for some reason, I'm having to wrestle extra-hard with the text editor. There are, of course, more items at the link.

January 13, 2026 at 12:51pm
January 13, 2026 at 12:51pm
#1105907
Because my schedule got disrupted with jury duty today, I'll talk about that instead of... you know, whatever the random number voices tell me I should talk about.

Given the responses to my notebook post this morning:

"Note: A quick update for my adoring fans: I'm ..."

I figured I might as well address them.

(In reference to the gif I used, which includes "Justice will be served") Always Humble Poet PNG- 📓 Author Icon:
I'll have mine with neeps & tatties, please.

And the next comment, Jeromée Author Icon:
nipps and tats for mine please *Wink*

To which I feel obliged to say: I'd rather have nips and tits. Preferably someone else's.

Cray Cray ☮ Author Icon:
I want to know more! We don't have jury duty here in Malaysia.

Okay, well, the short version is that the US Constitution entitles people accused of a crime to certain rights, among them being "presumed innocent until found guilty" and "trial by jury." The jury is made up of citizens selected basically at random from public records, such as voter registration lists. The theory behind it, as I understand it, is to have a suspect's fate determined by their peers: actual human beings and not lawyers. Partly this is because, most times, the law (and what a person does) is open to some interpretation; and sometimes, the law is utter bullshit.

The jury selection process, at least from my point of view this time, was: Go in, get assigned a number. Wait. Get called into the courtroom and listen to what the judge says. Get sworn in with questions like "Are you a US citizen and a resident of (whatever)?"

Then, out of the 40 people who showed up, they selected 20 at random. I mean, like, they put numbers in a bag and pulled them out one at a time while joking about getting a bingo machine (I don't expect judges to have a sense of humor, but this one did). Those 20 (one of whom was not me) go through a striking process, where attorneys for prosecution and defense can disqualify a juror, usually based on their answers to other questions. For example, an attorney in an illegal drug case might ask "do you have a family member or friend with a drug problem?" And if they say yes, that indicates they might have a bias.

This wasn't one of those cases, but that's the idea. Obviously it's not possible to eliminate all bias, but they do try to ensure fairness. In the US, the jury determines, based on the evidence provided by the prosecution (and possibly the defense), whether the defendant is guilty of the particular crime or not. I'm not sure about this part, but it's my understanding that a judge can overrule a jury's guilty verdict, but not their not-guilty verdict.

Anyway, striking reduces that 20 down to 12. Or maybe even fewer, in which case they'd pull from the remaining 20, so we had to sit around and wait. Yeah, lots of sitting around and waiting happened.

Bottom line is I got through it without them calling me, so I didn't get to see the courtroom thing play out. On the plus side, I got sent home a lot earlier than the actual selected jurors. On the minus side, there's a superflu going around and I was one of like three people who bothered with a mask. So if I die in the next couple of weeks, that's why.

Brandiwyn🎶 v.2026 Author Icon:
You'd better be blogging about your wardrobe.

B here is referring to a post I made in her rant forum:
"Jury Rigged"  Open in new Window., in which I sought fashion advice.

You'll be disappointed to know that all I did was put on a nice warm shirt, the kind with a collar and buttons up the front, in a solid color, over my T-shirt, and wore actual shoes with socks. Oh, and I did remember to wear jeans as well. The only people there who were dressed more formally were one young potential juror who wasn't old enough to stop giving shits, and the attorneys and defendant. And the judge, presumably, under the traditional black robe, though for all I know he had his dick hanging out under there.

🌝 HuntersMoon Author Icon:
At least you're on the right side of the jury box... *Laugh*

...for now.

John Author Icon:
I've never been chosen for Jury Duty.

My wife says it is because I always suggest submerging the accused in a frigid stream. The innocent will float to the top.

Ot is it the guilty . . I get confused with that part.


And that is an example of why a juror might get struck off the list.

One final note: A lot of people complain about jury duty, and go to great lengths (including committing perjury) to get out of it. That ain't me. Maybe if I had to do it more than about once every decade, it would become burdensome, but as it is, it's an opportunity to learn something and contribute to one of the few remaining principles that we have as a country.
January 12, 2026 at 10:25am
January 12, 2026 at 10:25am
#1105831
I saw this at BBC and I just had to make a comment.

My comment is this:

Don't these people watch horror movies? Come ON!

An "extraordinary" artefact believed to date back to the late Roman period has been unearthed in a Worcestershire town.

I mean, that's cool and all, but will it improve the sauce?

The bone box was recovered from the grave of a young woman with archaeologists believing the find could...

...fulfill the Prophecy by awakening the Great Old Ones to purge the land.

Okay, seriously, the article is light on details, but has a picture, and the find really is pretty interesting.

Short entry today because I don't really have a lot else to say about it, but I feel like what I did have to say was important.
January 11, 2026 at 10:10am
January 11, 2026 at 10:10am
#1105742
Continuing a theme, by chance, there's this recent anti-doom article from The Conversation.

What's a US philosopher? One who thinks before pulling the trigger?

Can one individual truly change the world?

I've had my doubts about this for a while.

US philosophers Michael Brownstein, Alex Madva and Daniel Kelly believe individuals can make a difference.

And all you have to do is buy our book!

No, seriously, they want you to buy their book. The article says so.

The authors aim to show readers how certain personal choices can alter the “structures” and “systems” that govern the myriad decisions we make, usually quite passively.

Um, that sounds like just another inspirational "self-help" book.

Written for a general audience, the gist of their argument is captured by the words of US environmental activist Bill McKibben. “The most important thing an individual can do,” he once said, “is become somewhat less of an individual.”

Pithy and all, but I think it undermines the idea that one person can change the world, at least not by themselves. You might have a good idea or a brilliant invention, but it doesn't do any good if other people don't adopt it.

As for changing the world for the worse, well, I think it's trivial to say that yes, for the worse, one individual can make a difference.

This book is timely. In the Anglosphere, and further afield, many people are unhappy.

Ah, yes, the essence of advertising: You're missing a piece. You're unhappy. I can help, for a price.

One response has been a widespread loss of belief that joining an established political party or even voting in an election can achieve change.

As they say, if voting could actually change things, it would be illegal. (Yes, I do it anyway.)

The authors are strong critics of the “self-responsibilisation” that big fossil fuel, tobacco, betting and other companies have foisted on people to head off real systemic change.

Now, on that, I can tentatively agree. I've been saying for years that it shouldn't all be on us.

They advocate an approach in which individuals focus on those activities most likely to trigger other people into changing entrenched structures.

That, at least, has potential.

In many respects, the book is inspiring. The examples show how some ordinary people can become change agents without, metaphorically, having to climb Mount Everest.

And that's cool and all, but, first of all, that sounds like those business success stories you always hear: Ordinary people, hard work, grit, determination, getting up at 4am, blah, blah... sure, but what about the vastly larger number of people who did all that and still failed?

Second of all, one of my favorite memes is the one that goes "Every corpse on Mount Everest was once a highly motivated person."

Anyway, there's a lot more at the article which, despite being a blatant book ad, is an interesting read.
January 10, 2026 at 11:18am
January 10, 2026 at 11:18am
#1105673
From Nautilus, one of those questions whose answers we're never going to agree on.
What Is Intelligence?  Open in new Window.
At a church in Italy, we sought to shed an old definition for one that could save us

"Save us?" Okay, clickbait. Tell me how it's going to "save us" to define a word.

We were in the Tuscan countryside on an impossibly green hilltop, nothing but sheep bleating in the distance, and the creak of iron gates, flanked by carved stone lions, at the end of a gravel drive lined with Italian cypress trees.

Okay, now you're just bragging.

Gleiser fixed up the 500-year-old chapel with a dream of turning it into a think tank and named it the Island of Knowledge.

There is something immensely satisfying about turning a church into a place where knowledge is sought, not repressed.

We were here to come up with a new definition of intelligence. The old one, according to Gleiser, won’t do. “We have an ideology of infinite growth on a finite planet,” he said.

And? I've been saying this for years, and so have others, and yet no one with the power to do anything about it has ever done anything about it. I guess except maybe once, when China had a one-child policy, but they abandoned that because people are gonna people no matter what.

“That’s obviously not sustainable. What kind of intelligence are we using to create this scenario? That keeps me up at night.”

Maybe if you were intelligent, you'd know it was out of your hands and get some better sleep.

To expand the definition of intelligence, Gleiser brought together cognitive neuroscientist Peter Tse; astrophysicist Adam Frank; evolutionary ecologist Monica Gagliano; philosopher Evan Thompson; technology critic and essayist Meghan O’Gieblyn; and Indigenous scholar Yuria Celidwen.

Kind of like one of those carefully diverse superhero teams, I guess.

Celidwen handed us each a dried leaf, which she produced from a small pouch, then told us to taste it. “Let it explore your palate,” she said. I pretended to comply but palmed mine, wondering what it would be like to be the kind of person who puts a strange thing in their mouth just because someone tells them to.

I think I'm beginning to better understand "intelligence."

And, you know, so much for turning a church into a place to explore knowledge.

This was not going to be a typical scientific conference. Which I suppose made sense when you’re trying to overhaul typical scientific ideas. Poems would be recited. Tears would be shed. We weren’t allowed to wear shoes.

There's a big part of my psyche that is forever salty that I was born too late to experience the sixties in all their glory. But then I see something like this and go, "Nah."

Intelligence is usually understood as the ability to use reason to solve problems, skillfully wielding knowledge to achieve particular ends.

Crucial point here: Intelligence is not the same thing as knowledge.

In 1949, at Manchester University, a computer scientist, a chemist, a philosopher, a zoologist, a neurophysiologist, and a mathematician got together to debate whether intelligence could ever be instantiated in machines.

In a bar? Please tell me it was in a bar. Or, wait. Manchester: Pub. Whatever.

One of the participants, Alan Turing, inspired by the discussion, went home and wrote up his “imitation game,” now known as the Turing test, where a machine is dubbed intelligent if, through text conversation alone, it can fool us into thinking it’s human.

It is funny how no one talks about the Turing test anymore. Thing is, I have met scadoodles of humans who could not pass the Turing test. I figure it's likely that the concept inspired PKD to come up with the fictional Voight-Kampff test to tell replicants from humans in the story that became Blade Runner.

Thing is, from what little we know about the VK test, Dick seems to have been more focused on emotion than on intelligence, which, again, I suspect many humans (e.g. sociopaths and some of the neurodiverse) wouldn't pass, either.

Seventy-five years later, we’ve got chatbots acing the Turing test, and science conceiving of brains as Turing machines. Is it possible we’re missing something?

Of course we're missing something. I'm just not sure that "something" is hippie crap.

Inside the church, I could feel Gleiser’s urgency as he launched the discussion. Could the world agree on a new definition of intelligence before our collective stupidity destroys us?

It's not our stupidity that will destroy us. Lots of animals are stupid, by at least some definition, and most of them don't show any signs of wanting to destroy the world. And intelligence can be used for positive or negative things, and anything in between. No, if we destroy ourselves, it won't be a matter of intelligence, by any plausible definition, but shortsightedness. And maybe a little game theory: the first person or group who deliberately puts themselves at a disadvantage will be overrun by the groups that don't.

When nothing matters, nothing is a problem. Nothing means anything. “People call large language models ‘stochastic parrots,’ ” Thompson said. “But I think it’s insulting to parrots.”

Congratulations; you have reinvented nihilism.

There's quite a bit more at the link, though I wonder at the intelligence of trying to redefine an old word instead of coming up with and defining a new concept that might do a better job convincing the general public as well as those who might actually be able to do something about the problems. Despite my snark above, and a lingering doubt about their methods, I gotta give them props for trying to do something. And, if nothing else, at least they got to go to a retreat in fucking Tuscany.
January 9, 2026 at 10:33am
January 9, 2026 at 10:33am
#1105509
Words about words, from Mental Floss:
20 Everyday Words and Phrases Turning 50 in 2026  Open in new Window.
These familiar words and sayings aren’t as old as you might think.

It’s easy to forget that new words, just like everything else that comes in and out of fashion, are being coined all the time.

One of the things I'm most salty about in life, and no I will not get over it, is that no one recognizes that I coined the word "rad."

Well, not the word, but the meaning, as in "that shit's totally rad, bro."

Now, it's entirely possible that someone else came up with it independently. Still, it stings to never get the credit I deserved.

...as best as their research can tell us, all the words and phrases in this list were coined precisely 50 years ago, in 1976.

I have my doubts. Still, it's probably a decent look at what words/phrases hit the mainstream in '76, which, for context, is the year Jimmy Carter was elected President.

As usual, I'm only going to cover a few of these.

Athleisure

The likes of tracksuits, spandex, and sneakers began to step out of the gym and into everyday fashion in the 1970s, leading to an athleisure trend that has continued to grow ever since.


I haven't heard or seen that word, outside of this article, in some time, so I don't know if it's still relevant. What is relevant is that it started, or perhaps elevated, a trend of stupid fucking portmanteaux that all need to die.

Butterfly Effect

The popular metaphor of the tiny flap of a butterfly’s wings sparking an eventually large-scale chain reaction has been discussed since the early 1970s at least. But both Merriam-Webster and the Oxford English Dictionary have traced the very earliest written record of the butterfly effect to an article published in the scientific journal Nature in 1976.


Okay, maybe. That one's related to chaos theory: that a butterfly flapping in the Amazon can, due to the way chaos works, lead to a typhoon in the Pacific. Not that such a thing can be predicted or controlled; that's why it's called chaos. But I'm pretty sure chaos theory was introduced in the early 60s, and even before that, there was Bradbury's "A Sound of Thunder," which speculated on large-scale changes in time thanks to some idiot stepping on a butterfly after time-traveling to the distant past.

That's not an obscure SF short story, either; it's one of the acknowledged all-time greats.

Couch Potato

TV played such a big part in 1970s homelife that this was the era when the couch potato was born. Defined by Merriam-Webster as “a lazy and inactive person, especially one who spends a great deal of time watching television,” etymologically, the term might just allude to the dormancy of potatoes below ground.


In the late 70s / early 80s, I proposed alternatives to this one: Bench Fry, and Sofa Spud. Neither of those caught on.

French Press

The very first kettle—or pitcher-like devices for brewing loose coffee, which can then be pushed to the bottom of the vessel using a metal plunger—were supposedly developed in France in the mid-1800s.


You know, with the exception of "fries," I can't think offhand of any phrases that start with "French" that aren't inherently sexual. Okay, maybe fries, too.

Meme

Richard Dawkins coined the word meme as “a unit of cultural transmission” in his groundbreaking book exploring gene-centered evolution, The Selfish Gene, in 1976.


Lest we forget what that word is supposed to mean. That its popular definition has changed since then is one of the finest examples of irony that I know.

Radical

Derived from the Latin word for a plant’s root, radical first emerged in medieval English in its original and literal sense, to refer to anything growing or deriving from a root, and therefore vital or essential to life or survival.


And I'm the one who shortened it, dammit!

Wuss

No one is entirely sure where the word wuss comes from...


Oh come on. As the article suggests, it's a combination of "wimp" and "pussy." My guess is someone started to say "wimp" but his (it was almost certainly "his") tiny brain tried to change it to "pussy" halfway, and behold, a new insult was born.

I don't know why "pussy" got associated with wimpiness in the first place, anyway. Those things evolved to be tough. Their male counterparts are far more fragile.
January 8, 2026 at 8:20am
January 8, 2026 at 8:20am
#1105435
Today, we're back to food, but with an article of limited use to those who don't live in pawpaw country. From Atlas Obscura:
Kitchen Dispatch: A Quest to Create the Perfect Pawpaw Ice Cream  Open in new Window.
The wild-growing fruit is best found through foraging, and custard turned out to be the key ingredient.

And I do live in pawpaw country. Or I did. They grew wild in the woods where I spent my childhood.

As the weather got colder last week, I decided it was the perfect time to make pawpaw ice cream.

I was wondering why there's a December article about ice cream in the northern hemisphere, but really, ice cream is a forever treat.

I tested several recipes I thought would work well with the fruit’s flavor—a mix of banana, mango, and durian.

None of which, I must emphasize, are native to Virginia. Not by a long shot. Hell, durian grows on damn near the opposite side of the planet. (I'll refrain from making durian jokes this time after a faux pas in the newsfeed yesterday.)

Flavor is flavor, though. My dad always called them "Virginia bananas."

I chose a simple ice cream recipe, a mixture of pawpaw puree, sugar, cream, and milk.

Unfortunately, Dad never figured out how to prepare pawpaw, and my mother refused to. Just as well, considering her other attempts at cooking. She tried, she really did, but just never got the hang of it.

Since pawpaws are notoriously difficult to cultivate, foraging is the best way to obtain a large amount.

But that's work.

On the other hand, it's probably cheap, or even, in my case, free.

The author apparently lives in New York, and honestly, I didn't know pawpaws ranged all the way up there. Nice to learn new things.

Making ice cream is, of course, also work, so I won't be doing it. Still, it's nice to know that this relatively obscure wild fruit, connected to my personal history in some small way, is getting the respect it deserves.
January 7, 2026 at 9:00am
January 7, 2026 at 9:00am
#1105357
A bit of what I'd consider irony from The Conversation:

Ancient scientists can be easy to dismiss.

The headline says the article is about misinformation. And then it goes on to call people from ancient Greece and Rome "scientists," when that concept wasn't really a thing until something like the 1500s, and the term itself wasn't coined until the 1800s (because some chick was doing science, and up until that point, people who did that were called "men of science").

It's a little bit like calling an abacus a calculator: kind of true, but misleading.

Oh, and who the hell dismisses ancient, um, natural philosophers? People quote Aristotle to this day, and Archimedes is practically a god to engineers.

Greek philosopher Thales of Miletus, often described as the West’s first scientist, believed the whole Earth was suspended on water.

If you go by "belief," that's not science.

Roman encyclopaedist Pliny the Elder recommended entrails, chicken brains, and mice cut in two as topical remedies for snakebite.

And? Newton believed in astrology and alchemy. That doesn't make his actual scientific (and mathematical) insights any less profound. The whole point of science is to cut through the bullshit. I feel like this article is verging on anti-science propaganda.

The lone ancient Greek thinker who believed Earth orbits the Sun – Aristarchus of Samos – was universally dismissed by his contemporaries.

I want to try to be crystal-clear here, so bear with me:

The most commonly quoted example of "people who were mocked and/or dismissed by their peers but were later shown to be right" is probably Galileo. But there's a real danger here of falling into a trap: there's a common trope both in fiction and in real life to invoke the name of Galileo when someone has an idea that's being mocked and/or dismissed, the implication being that "Well, Galileo turned out to be right, and one day, my theory about birds flying due to phlogiston radiation will turn out to be right, and then they'll see. Everyone will see!"

In other words, yes, sometimes you'll turn out to be right. Sometimes that's by accident. Sometimes it's because you did actual observation, testing, etc. If you're wrong, that's quickly forgotten. It's like when a fortune-teller tells you "you will find love soon," and then you find love, and you're like "Wow! She was right!" ignoring the fact that love is pretty thick on the ground and you were going to find it anyway if you bothered to look.

However, thinkers 2,500 years ago already faced many problems that are today amplified by social media and artificial intelligence (AI), such as how to tell truth from fiction.

That's probably because social media and AI are both products of humanity, merely scaled up in speed and volume.

But hey, I could be wrong.

Here are five lessons from ancient Greek and Roman science that ring surprisingly true in the face of misinformation in the modern world.

While I still cringe at "science" and "scientist" being applied to that time period, perhaps the overall thrust of the article is useful. Let's see:

1. Start with observations

Almost every ancient scientific text offers advice about observing or collecting data before making a decision.


Okay. I'm with you so far. I just said observation is important. It's not enough, of course, but it's a first step. It seems almost tautological, though: how can you draw any conclusions, right or wrong, if you don't make some sort of observation first?

2. Think critically

Ancient scientists insisted their readers think critically, encouraging us to analyse the claims made by other people.


I can't dismiss the importance of thinking critically. Two problems, though: 1) It's often easier said than done, and, like any skill, requires training and practice; and 2) It is absolutely possible to build an entire skyscraper of thought that rests on a swampy foundation.

Ancient scientists encourage us to think critically about information we read or hear, because even well-meaning sources are not always accurate.

For instance, calling natural philosophers "scientists."

Though, to be fair, I'm willing to accept that this is a categorization issue, not a misinformation one. Like calling the Blue Ridge Mountains "mountains," which people from the Rockies always have a good laugh about.

3. Acknowledge what you don’t know

Another skill ancient scientists encourage is acknowledging our limits. Even Greek and Roman scientists who claimed to be experts in their field frequently admitted they didn’t have all the answers.


Fair enough. I've said for a long time that this is important. There exist, however, numerous people who believe, against all evidence, that it's better to be confident and wrong than skeptical and right.

4. Science is part of culture

Ancient thinkers understood that science was part of culture rather than separate from it, and that an individual’s beliefs and values will have a significant impact on the information they promote as “factual” or “truthful”.


I'd be really, really careful with this one. It can lead to beliefs like "science claims to have falsified this aspect of traditional medicine, but that's because it's from a different culture," and the ultimate dismissal of science as "western imperialism" or whatever.

5. Science is for everyone

Ancient scientists understood the importance of deferring to specialists and listening to expert advice. However, they were also keen for their readers to understand where scientists acquire knowledge and how scientific facts can be verified.


I agree with the "science is for everyone" bit. Ever diagnosed a problem with your computer, or lawnmower, or whatever? Then you've basically done science. Observe, hypothesize, test, etc.

And it's important to "understand where scientists acquire knowledge and how scientific facts can be verified." The problem is, there's way more science out there than any one person can possibly verify, and it's truly impossible to verify everything yourself. You can verify some things, and share that knowledge, but the people you share it with should think critically, too, thus compounding the problem.

In summary, I remain skeptical.
January 6, 2026 at 11:52am
January 6, 2026 at 11:52am
#1105286
Today, from LiveScience, we have an article that might verge on the political.
Climate change is real. It's happening. And it's time to make it personal.  Open in new Window.
We found the psychological impetus people need to take action on climate change — realizing it will affect them and their way of life personally.

Oh, you don't say? You mean people don't care about things until it affects them personally? Why, I never noticed that! Except I did.

"That disease is affecting those people, so, so what?" / "That disease killed my child! I'm going to start a crusade!"

"That country is being invaded? Sucks to be them." / "My country is being invaded! Send help!"

It's human nature, and a well-documented aspect of human nature. Thing is, there's lots of things that are human nature that we try to rise above, like punching someone who looks at you funny, and this should be one of them.

That said, please keep in mind that the tag on this article is "Opinion."

Recognizing that climate change is immediate, close, and affecting people's way of life is one of the key messages we need to communicate to spur them to act.

I believe that there are people who will sit on their low-lying island until it's inundated by rising sea levels, convinced that it couldn't possibly be due to anthropogenic climate change, because their favorite echo chamber said it's not.

But in order to meaningfully limit warming, we need to enact policies that will alter the lives of billions of people.

That always goes over so well. Besides, it would take a dictator. A benevolent dictator, not the one we've got.

And this needs to begin with individual action — getting people to care enough to alter their behavior around climate change.

And with that, you've lost me.

There are times for individual action, sure. But this is one of those times when the levers are moved by much bigger entities.

Also... I remember when Wonder Woman 1984 came out. Big letdown after the first WW movie, but that's not my point here. It released at the depths of the COVID crisis, when smart people were like "Masks and social distancing" and the toddlers were like *stomp* "NO!"

So the central plot point of that movie (which I am now spoiling, but you weren't going to watch it anyway) was that everything could be fixed if everyone would just, you know, listen to Diana, and think a certain thought at a certain time, like in Horton Hears a Who.

It's a nice thought. But by the time the movie was released, it was plain as the screen in front of you right now that it would never, ever, happen, for any reason, not even if the chick suggesting it was incredibly hot. And in this case, if you can't get everyone on board, and you also can't get the people in power to move their levers, then there's no fucking point in me switching to paper straws or otherwise inconveniencing myself in the slightest.

We recruited more than 3,000 participants across six countries to see what would make them more or less motivated to help climate causes. Pro-environmental actions are often costly — incurring financial, time and physical effort.

Then there's my inherent laziness.

The findings confirm some things that we know to be true about human behavior. It's the same reason why people have a greater connection to news that is local to their area, or to their interests. When it's personal, when it's close, when it affects our usual way of life, it lands.

Like I said.

When rising water levels increase the risk that our property is going to be flooded — because events that were previously likely to happen once in 100 years are increasingly common — to protect our way of life requires us to take action, rather than do nothing.

I'm not going to get into the math here, but that's not what a 100-year flood is. I get that the name is confusing, leading people to think it is. A 100-year event is one that, based on prior observations, has a 1 in 100 chance of occurring in any given year. To understand the difference requires some knowledge of probability and statistics, which I'd expect any scientist to have. But it also means you have to have learned some basic hydrology lingo, which, in fairness, neuroscientists rarely do.
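Since the distinction trips people up, here's a minimal sketch of the arithmetic (my own illustration, assuming each year is independent; the function name and numbers are mine, not from the article):

```python
# Chance of at least one "100-year" event over a span of years,
# assuming years are independent (a simplifying assumption).
def prob_at_least_one(annual_prob: float, years: int) -> float:
    # Complement rule: 1 minus the chance that NO year has the event.
    return 1 - (1 - annual_prob) ** years

# A 1-in-100-per-year event over a 30-year mortgage:
print(round(prob_at_least_one(0.01, 30), 2))   # 0.26
# And over a full century -- not a sure thing, despite the name:
print(round(prob_at_least_one(0.01, 100), 3))  # 0.634
```

So a "100-year flood" has roughly a one-in-four chance of hitting you during a 30-year mortgage, and it isn't even guaranteed to show up once per century.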

Which is not to say that extreme storm events aren't increasing in frequency and intensity, as generally predicted by climate models, but it's rare that I get to be pedantic about something I've actually gotten a degree in.
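For the curious, the distinction is easy to see with a little arithmetic. Here's a quick back-of-the-envelope sketch (mine, not the article's; the 30-year horizon is just an illustrative mortgage term):

```python
# A "100-year event" has a 1-in-100 chance each year, independently.
# That is NOT the same as "happens once per century": over any 30-year
# stretch, the chance of seeing at least one is surprisingly high.

def prob_at_least_one(annual_prob: float, years: int) -> float:
    """Chance of at least one occurrence across independent years."""
    return 1 - (1 - annual_prob) ** years

print(f"{prob_at_least_one(0.01, 30):.1%}")   # roughly 26% over a 30-year mortgage
print(f"{prob_at_least_one(0.01, 100):.1%}")  # roughly 63%, even over a full century
```

Which is why "we had our 100-year flood last year, so we're safe for 99 more" is exactly the wrong conclusion to draw.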

We know addressing climate change will require systemic change from governments and business. But we need to start somewhere, and getting people to see the changes happening around them may just be a small step that leads to major shifts.

So in general, I'm in agreement with the article on that. What we seem to disagree on (and I admit I'm probably the one in the wrong here) is that it's psychologically possible to get enough people on the sinking boat to start bailing.

Honestly, I hope I'm wrong.
January 5, 2026 at 8:52am
#1105191
Yes, it's another Mental Floss listicle today. At least it's about writing, sort of.

15 Phrases True Grammar Nerds Would Never Confuse  Open in new Window.
Is it bald faced lie, or bold faced lie? Coming down the pike? Or pipe? Coleslaw, or cold slaw?

Some people, I've found, mispronounce things. My ex, for example, pronounced "picturesque" as "picture-skew," and I thought it was so cute that I never said anything. To this day, over a quarter-century later, I have no idea if she was doing it on purpose or not, the way I pronounce "homeowner" as "ho-meow-ner."

I never rag on anyone for this, though. It means they read, but they just interpreted the word as being pronounced differently. Same thing happened to me when I first read about quinoa. I pronounced it like "Genoa," and got laughed at. Bastards. It's not like we're born knowing these things.

What I do admit to sometimes looking down my nose upon, though, is the reverse: people who hear things and then proceed to write them down wrong. A common example is when someone writes something like, "John was unphased." It's supposed to be "unfazed." Read more books.

An eggcorn is a mistaken word or phrase that makes almost as much sense as the correct version. The term eggcorn was coined by linguist Geoff Pullum in 2003 as a nod to people’s habit of mistaking the word acorn for eggcorn.

And yet, no one uses words that I coined, except me. In fairness, I'm not a linguistics professor.

Feasible arguments are a big element of eggcorns: There’s no overlord deciding which language errors are logical enough to be official eggcorns and which ones are just plain mistakes.

I know a few people who, given the chance, would absolutely sign up to be that overlord, the Chief of the Language Police. Hell, I'm one of them.

Which is not to say I never make mistakes, of course.

I'm not going to note all of them here.

For All Intents and Purposes vs. For All Intensive Purposes

The former phrase is the correct one, as the article explains. But that's a phrase I like to have fun with, too, pronouncing or spelling it as "For all intensive porpoises."

Coleslaw vs. Cold Slaw

Absolutely did this one when I was a kid. But when kids do it, it's kind of cute. Well, except me. I was never "cute."

On Tenterhooks vs. On Tenderhooks

"On tenterhooks" was one of my dad's favorite phrases. As in, "You're finally home. Your mom was on tenterhooks." (Actually, she was cool with it. My dad was the one who was worried about what Teen Me was getting up to at 2am, most likely for liability reasons.)

Happy as a Clam vs. Happy as a Clown

Honestly, I can absolutely see this one. As the article notes, the correct version (the first one) requires some knowledge of context, because, apart from that phrase itself, no one has ever considered a clam to be happy; contrariwise, clowns often have the smiles painted right on their faces.

Deep-Seated vs. Deep-Seeded

Also, this one. Having done more than my share of planting as a kid (before I started coming home at 2am), either can work, which I guess is the point of an eggcorn.

Hair’s Breadth vs. Hare’s Breath

Not so much this one. As thin as a hair might be, it still has some physicality to it, unlike "breath," which, like "wind," describes something usually unseen.

Make Ends Meet vs. Make Ends Meat

And the "meet/meat" homophone strikes again. Personally, I find the wrong one ("meat") to be hilarious.

Again, more at the link. Just remember: these phrases are pretty much all clichés, so we shouldn't be using them much anyway. They still have a place in written dialogue, though, so it pays to get them right, or at least get them wrong on porpoise.
January 4, 2026 at 9:22am
#1105085
I may not be a fan of the outdoors, but I'm a huge fan of sleep. From Outside:
New research links this good habit to slower aging, stronger brain health, and a longer lifespan. Doctors share what makes the biggest difference.

Okay, but, bear with me here: what's the point of living longer if you're spending all that extra time asleep?

The old saying “I’ll sleep when I’m dead” is deeply flawed.

I'm not sure why that sentence elicits what is, for me, a strong anger reaction. Maybe if I were in therapy, I could figure it out, but I'm too depressed to expend the effort required to look for a shrink, so that's not going to happen.

I suspect it's the Puritanical thrust of the saying: that sleep is some sort of optional thing, something that gets in the way of Holy Productivity. The smug "I'm strong and you are weak" implication.

In any case, it pisses me off when I hear it. Or see it.

It’s not just your diet and exercise habits that are linked with your lifespan. You should also be prioritizing getting enough high-quality sleep on a nightly basis.

Of those three things, guess which one I don't fail at. Go ahead. Take a wild guess.

Getting too little sleep is linked with chronic and sometimes life-threatening health concerns, including high blood pressure, stroke, heart disease, and kidney disease, according to the National Heart, Lung, and Blood Institute.

I'd like to hear from a source that's not heavily biased toward hearts, lungs, and blood.

And it’s not just about quantity: Disturbances in your sleep quality, such as waking up a lot during the night, are tied to a number of signs of genetic aging, according to a 2022 study...

Okay, okay, being serious for a moment, once more for the delinquents in the back row: correlation isn't causation. I didn't look at the study, but it seems to me that aging is a cause of sleep disturbances, not the other way around.

All that said, it’s easier said than done to actually find the time to get more sleep.

Not for me! I'm retired. And what's another definition of "retire"? To go to sleep. (In case it's not obvious, I'm done being serious for now.)

1. Put Down the Phone

Or tablet or laptop. And turn off the TV. The light from these screens suppresses your body’s natural production of the hormone melatonin, which would otherwise help you fall asleep, says Dr. Scott Rosenberg...


Great Scott!

Presumably, the paperwhite Kindles are just fine.

In a perfect world, you’d power down electronics at least an hour and preferably two before bed, says Dr. Aatif Mairaj Husain...

Yeah, we don't live in a perfect world, do we?

2. Create a Bedtime Routine

a) brush teeth
b) take pants off
c) turn off light
d) move cats to edge of bed

That's a routine, right?

3. Prep Your Bedroom

Cool, that's covered under (d) above, right?

Your bedroom should be conducive to your best night’s sleep. Sleep experts often recommend your room be cool, dark, and quiet.

Gosh. I never would have thought of those things. Being in the dark and silence helps you sleep? Huh. You learn something new every day.

“The ideal temperature for sleep should be in the mid-sixties,” Dr. Rosenberg says.

Aw, hell to the power of no.

Anecdote: One time, the power went out and my generator chose that moment to get borked, so I was without power for several hours in the dead of winter. My furnace is gas, but the fans and starters run on electricity, so no heat. I put on extra clothes, extra blankets, and tried to sleep, but it was just too frackin' cold. I lay there, shivering in the darkness, trying to will my body to produce enough heat to be trapped by said clothing and blankets. The night stretched on, one of those absolutely frigid January nights with a weakened polar vortex. I lay awake, certain that the inside temperature was creeping inexorably toward the outside temperature, because I still remember some thermodynamics from college.

About, oh, 2am or some such (which is usually before my bedtime, but not when I can't use computers or have enough light to read by), the light in my room came on. I crawled out of my freezing covers to check the thermostat, which helpfully also reports the interior and exterior temperatures. Exterior: I don't remember exactly, but it was like 20F, which isn't too out of the ordinary for my area at night in January.

Inside temperature? 62F.

Yeah, don't talk to me about sleeping in the mid-sixties Freedom units. I have my thermostat set to keep the house at 74F, and I still sometimes need extra blankets.

In summary, not everyone has the same temperature requirements (newsflash!).

4. Watch What You Eat and Drink

Oh, I do. I watch it go straight down my gullet.

Avoid big meals within a few hours of bed.

Get out of here with that. Big meals give me better dreams, and better dreams means a more creative Waltz.

Be mindful of water intake.

Coke Zero count?

Cap caffeine in the early afternoon.

Guess not.

Limit alcohol in the evenings.

Oh, I already do that. Well, most of the time. Okay, sometimes. My serious drinking is done before sunset.

(I don't believe in the term "day drinking." There is only "drinking.")

5. Move More During the Day

Ah... I see this publication has succumbed to the propaganda of Big Exercise.

Wait, it's called Outside. That's probably part of their mission statement.

6. Talk to a Doctor

Ha. Hahaha. HAHAHA! Now you're the one being not-serious. I live in the US, dummy. You should have known that when you started quoting temperatures in F. Or is Liberia your target audience? Point is... "Talk to a doctor." And here I thought *I* was the comedian.

Think about it this way: If you’re waking up to an alarm rather than waking up on your own, you’re probably depriving yourself of at least some sleep, and “ultimately, it’s going to catch up to you,” he says.

The irony of Outside magazine giving tips and pointers for staying Inside is not lost on me.

Okay, fine, another moment of semi-seriousness: I missed out on a lot of sleep back when I was working. Not out of that Puritanical work ethic, hell no, but just because I was trying to fit in work, leisure, and sleep, and of those three, sleep is the one that lost out.

Ended up having a heart attack. That was 12 years ago, and I haven't had a heart attack since. Nor have I worked. See? I, too, can ignore the correlation vs. causation thing.
January 3, 2026 at 8:50am
#1104988
I'm still wrestling with formatting issues related to the new editor. It's still in beta, so that's not unexpected. Just a reminder that things might continue to be a bit inconsistent here until some wrinkles get ironed out.

I think I stole today's link from Elisa, Snowman Stik Author Icon, because it's not the sort of thing I usually cover here. Well, at least not concerning YouTube, which is one of the few "social media" sites I actually visit sometimes. From ZDNET:
Don't be fooled by this massive YouTube scam network - how to protect yourself  Open in new Window.
Researchers say it is 'one of the largest malware operations seen on YouTube.'

There are, I shouldn't have to mention but I will, many scams on YouTube. Most of them seem to be just your basic bullshit videos, spreading mis- and dis-information, promoting pseudoscience, or trolling, like trying some modern twist on the old "stick your iPhone in the microwave to charge it" thing.

A malicious network of videos hosted on YouTube has been discovered by researchers who branded it "one of the largest malware operations seen on YouTube."

Malware is a different beast, though. Instead of hacking your brain, it hacks your device. I'm not sure that this isn't worse.

On Thursday, Check Point researchers published a report that revealed the scam, dubbed the YouTube Ghost Network, which they tracked for over a year.

By "Thursday," based on the article's datestamp, they mean way back in October.

The YouTube Ghost Network has likely been active since 2021, with videos posted consistently over the years -- until 2025, when the number of videos tripled.

Spooky.

(Ghost? Spooky? October? Come on, I work hard at these.)

Over 3,000 YouTube videos, described as part of a "sophisticated malware distribution network," contained tutorial-style content that enticed viewers with promises of free or cracked software, game hacks, and game cheats.

Okay, well, my sympathy for the victims just tanked. I've known since about 1993 to never trust cracked software, game hacks, or game cheats on the internet. I ain't saying they deserved it, just that it's the second-oldest scam on the internet, right after porn malware.

Once downloaded, users are told to temporarily disable Windows Defender before extracting and installing the file contained in the archive.

Are you fucking kidding me? That's, like, an ocean of red flags right there. The Pacific Ocean's worth. Like an entire spinning discarded plastic gyre of crimson banners.

I can understand relying solely on Windows Defender. I don't expect everyone to be as careful as I am, which approaches paranoia because I'm really not all that tech-savvy compared to some others (I'm somewhere in the middle of the spectrum between "totally clueless" and "hiding behind seven proxies and fourteen firewalls").

Still, if you're going to rely on Windows Defender, don't freaking disable it.

If you're trying to use cracked software, you'd probably want to disable security protections, so the need to stop Windows Defender from catching a pirated file makes sense -- even though it's dangerous.

And look, confession: I've used cracked software, and I've pirated movies. I'm not proud of it, and I haven't done it recently, but I'm not going to hide it, either. Still, I have limits, and that limit is metaphorically pulling down my pants so someone can punch me in the balls.

The operators of the scam are using fake and compromised YouTube accounts not only to upload videos, but also to post links and archive file passwords, and to interact with watchers -- posting positive feedback that makes the cracks and tutorials appear genuine and safe.

So it's a sophisticated scam, which only makes me double down on not clicking on naked links.

How to stay protected

I'm not going to paste the tips here. They're at the link. Most of it is stuff I was already aware of, but, like I said, I don't expect others who maybe haven't been around the cyberblock a few times to be as paranoid, so go look at it if you're concerned.

But from what I'm gathering from the article, there's not much they can do to frag you if you don't click on the links (and take off your condoms) in the first place.

Sometimes I think these scams are run by professionals who are trying, via a kind of psychological warfare, to stop piracy and file-hacking. But if so, so what? The end result for the user is about the same either way.


© Copyright 2026 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted Writing.Com, its affiliates and its syndicates non-exclusive rights to display this work.
