\"Writing.Com
*Magnify*
Printed from https://www.writing.com/main/profile/blog/cathartes02
Rated: 18+ · Book · Opinion · #2336646

Items to fit into your overhead compartment


Carrion Luggage


Native to the Americas, the turkey vulture (Cathartes aura) travels widely in search of sustenance. While usually foraging alone, it relies on other individuals of its species for companionship and mutual protection. Sometimes misunderstood, sometimes feared, sometimes shunned, it nevertheless performs an important role in the ecosystem.

This scavenger bird is a marvel of efficiency. Rather than expend energy flapping its wings, it instead locates uplifting columns of air, and spirals within them in order to glide to greater heights. This behavior has been mistaken for opportunism, interpreted as if it is circling doomed terrestrial animals destined to be its next meal. In truth, the vulture takes advantage of these thermals to gain the altitude needed to glide longer distances, flying not out of necessity, but for the joy of it.

It also avoids the exertion necessary to capture live prey, preferring instead to feast upon that which is already dead. In this behavior, it resembles many humans.

It is not what most of us would consider to be a pretty bird. While its habits are often off-putting, or even disgusting, to members of more fastidious species, the turkey vulture helps to keep the environment from being clogged with detritus. Hence its Latin binomial, which translates to English as "golden purifier."

I rarely know where the winds will take me next, or what I might find there. The journey is the destination.
July 30, 2025 at 9:37am
#1094373
I'm leaving for a couple of days' mini-vacation, so a) I'll only be around sporadically, and the next couple of blog entries could be short, delayed, or even *gasp* missed; and b) of course my random numbers would give me the really heavy one from my pile: this philosophy question from The Conversation.



Well, first, define "real."

You can doubt just about anything. But there’s one thing you can know for sure: you are having thoughts right now.

*Offer not available for a certain subset of humanity

This idea came to characterise the philosophical thinking of 17th century philosopher René Descartes. For Descartes, that we have thoughts may be the only thing we can be certain about.

I can agree with that. What I take issue with is the direction he took his logic: mind/body duality.

But what exactly are thoughts?

Epistemic phenomena resulting from permutations of classical and quantum physics, subject to entropy, qualified by neurological conditions, and dependent upon energy input.

Okay, no, I just made that up.

There are two main answers to the philosophical question of what thoughts are.

As both of them require "thought," we might be in a bit of a pickle here.

The first is that thoughts might be material things. Thoughts are just like atoms, particles, cats, clouds and raindrops: part and parcel of the physical universe. This position is known as physicalism or materialism.

The problem with that approach is that we can verify the existence of all of those things, either through direct observation or scientific experiment; this isn't the case with "thought." If it were, there'd be no debate.

The second view is that thoughts might stand apart from the physical world. They are not like atoms, but are an entirely distinct type of thing. This view is called dualism, because it takes the world to have a dual nature: mental and physical.

The problem with that approach is that if something stands apart from the physical world, how can it interact with the physical world? Hell, we're pretty sure there's something called "dark matter"; we don't know its exact nature, but we do know that it interacts with the physical world in some way (primarily gravity) and, hence, it's material or physical.

If thoughts are physical, what physical things are they? One plausible answer is they are brain states.

I don't disagree, but it seems incomplete to me. Like, does a "hole" exist? Only in relation to what's around it.

On the "dualism" side:

We don’t have an explanation of how brain states – or any physical states for that matter – give rise to conscious thought.

I've already given away which side I lean toward, but I'm trying to be fair, here.

The thing we are most certain about – that we have thoughts – is still completely unexplained in physical terms.

This seems a bit backwards to me. What need is there to explain that which we're actually certain about?

But it gets worse: we may never be able to explain how thoughts arise from neural states.

It's hard to accept, but some things really are impossible. The difficult part is knowing what's impossible. It's possible, sometimes, to know what's impossible (such as counting to infinity). Other things, you don't know they're impossible, so you keep trying, and either you give up or you just... keep trying.

Settling the question of what thoughts are won’t completely settle the question of whether machines can think, but it would help.

And we're back to me making cynical jokes about how we haven't settled the question of whether humans can think. Or do we only think we think? Never mind answering that question for other animals, plants, or fungi.

I'm not here to answer these questions. All I'll say is what I've said before: philosophy guides science, and science informs philosophy. Best I can come up with is my working hypothesis, which is that consciousness, or thought, is an epiphenomenon that arose via evolution from sensory processing.

But I'm willing to change my mind if new data arise. See? I think. Therefore, I am.
July 29, 2025 at 9:52am
#1094315
An important breakthrough in science and pity from Popular Science:

    The world’s smallest violin is thinner than a human hair
Don’t expect to hear any sonatas from it.


The phrase “the world’s smallest violin” is dripping with sarcasm and reserved for disdain, but for some researchers it’s a mark of pride. Thanks to the latest nanotechnology tools, a team at the United Kingdom’s Loughborough University recently crafted what is literally the world’s smallest violin.

Okay. I get how people think they can use "literally" for emphasis, or to really mean "not literally." I don't like it, but I get it.

What I won't abide is deliberately confusing an image of a thing for the thing itself. Cela n'est pas un violon.

Don’t expect to hear any scaled down sonatas, however. In this case, engineers designed a nanoscale image of a violin instead of a playable instrument.

I didn't do the math, but I suspect that if it were real and could be played (one wonders where they'd get the hair for the bow, not to mention what fingers would dance upon the neck), the sound would be way too high-frequency for humans to hear.
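Out of curiosity, here's a back-of-the-envelope version of that math. Everything in it is my own assumption: that a string's pitch scales inversely with its length, a 325 mm full-size string, and a hair-width instrument of about 35 micrometers (illustrative, not the researchers' figure).

```python
# Rough sketch: would a hair-width violin be audible?
# Assumes a string's fundamental frequency scales inversely with its length
# (wavelength ~ 2L, wave speed held constant) -- a crude idealization.
# All numbers are illustrative, not taken from the Loughborough team.

FULL_SIZE_STRING_M = 0.325   # vibrating string length on a full-size violin
NANO_SIZE_M = 35e-6          # assumed hair-width instrument, ~35 micrometers
OPEN_E_HZ = 659.3            # fundamental of a violin's open E string

scale = FULL_SIZE_STRING_M / NANO_SIZE_M   # roughly 9,000-fold reduction
nano_e_hz = OPEN_E_HZ * scale              # roughly 6 MHz

print(f"scale factor: {scale:,.0f}x")
print(f"scaled open E: {nano_e_hz / 1e6:.1f} MHz (hearing tops out near 0.02 MHz)")
```

Megahertz territory. Not even the dogs would hear it.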

The rest of the article goes into more detail about the image of the violin, why they'd do such a thing, and its implications for computer science or whatever. Don't get me wrong; it's pretty cool, but a bit outside my range.

But we all know why they chose a violin instead of, say, a guitar, or a cat: so they could say (however wrongly) that there exists a world's smallest violin, and it's playing just for you.
July 28, 2025 at 9:22am
#1094252
Every once in a while, I'll find an article that's directly relevant to writers. Like this one, from Esquire, which is nearly two years old:

    Has It Ever Been Harder to Make a Living As An Author?
Those who are trying (including Tom Perrotta, Ayad Akhtar, and more) tell us what it's like.


Hadn't heard those names before, which may be one reason why it's so hard to make a living as an author.

In early August, after Andrew Lipstein published The Vegan, his sophomore novel, a handful of loved ones asked if he planned to quit his day job in product design at a large financial technology company.

"Loved ones?" That was a test. "Are you going to frag off from your high-paying job to do something with questionable earning potential?"

The myth of The Writer looms large in our cultural consciousness.

Yes. You know why? Because writers control our mythology, or at least part of it.

When most readers picture an author, they imagine an astigmatic, scholarly type who wakes at the crack of dawn in a monastic, book-filled, shockingly affordable house surrounded by nature.

What? No. Okay, maybe Stephen King fits some of that description, but other big-name authors? I couldn't name even one who matches that archetype (or is it a stereotype?).

Also, any career that requires me to wake "at the crack of dawn" isn't one I'd want to pursue.

Most novelists have day jobs, and the majority of those who don’t are either independently wealthy or juggling a handful of projects at once, often in different mediums like film, journalism, and audio.

It's been my long-standing impression that authors don't make the big bucks until their properties are licensed for cinema or broadcast.

These days, it seems the only way for a full-time novelist to ensure financial stability and a comfortable life is to write a Big Book—a reality that’s almost entirely outside their control.

In other words, roughly equivalent to winning a lottery. Not the big pick-7 jackpot, but a moderate scratch-off grand prize.

Far be it from me to encourage lottery-playing, but it's a hell of a lot less work than novel-writing.

So how is it that so many writers seem to be living comfortably?

They’re moonlighting as screenwriters, bylined or not. For decades, Hollywood has been the place where fiction writers could not only make a living, but receive healthcare and benefits through the WGA. Even screenwriters who’ve never had scripts produced have Writers Guild of America health insurance and own homes in Silver Lake or the Hollywood Hills. “Unless you have a big, huge hit play or a lucky best-selling novel, you’re probably paying your bills with a TV or movie gig,” said Lowell Peterson, the executive director of the Writers Guild of America East.


We can rail all we want about the social structure that rewards film/TV more highly than the printed word, but that's the reality we live in. Hell, I watch more than I read. On the other hand, writing for TV, movies, theater and video games is still writing.

One thing to remember is this: the majority of people are extroverts, and while movies, plays and shows (and even video games) lend themselves well to being a group activity, books generally do not.

And then there's simple economics: there are more writers out there than can be supported by readers alone.

The article goes into more detail, of course. I'm not here to quash anyone's dreams, and, as I said, I don't have anything published; nor am I trying to, so I don't really have a horse in the race. Mostly, I just thought the article might be of interest to writers.
July 27, 2025 at 11:12am
#1094199
From The Conversation, possibly a tough pill to swallow, and one that might have expired (it's dated last year).



I've been saying for years that "natural" doesn't mean shit. Well... maybe. Shit is, after all, natural. So are poison ivy, tobacco, hemlock, and cockroaches—to name just a few.

According to a 2023 survey, 74% of U.S. adults take vitamins, prebiotics and the like.

Proud resident of the other 26% here.

The business of supplements is booming, and with all the hype around them, it’s easy to forget what they actually are: substances that can powerfully affect the body and your health, yet aren’t regulated like drugs are.

For some, regulations are an unnecessary burden. That's because no one alive today remembers when food companies loaded their products with sawdust and arsenic in an effort to be more profitable. And that's not just because they ate the arsenic; it's mostly because it was over 100 years ago.

They’re regulated more like food.

Well, at least there's some regulation, right? No one can just put out a product and make bullshit claims, right?

Stop laughing.

It’s important to consider why so many people believe supplements can help them lead a healthier life. While there are many reasons, how supplements are marketed is undeniably an important one.

If you exaggerate in your marketing for your new book, worst that can happen is someone's out a few bucks and maybe some wasted time. If you lie about your food or pill product, people can get sick.

In my years following the industry, I’ve found that three mistaken assumptions appear over and over in supplement marketing.

I'll touch on each one, but obviously, there's more at the link.

1. The appeal to nature fallacy

The appeal to nature fallacy occurs when you assume that because something is “natural” it must be good.


I touched on this one above. I've questioned the distinction between "natural" and "artificial" before (we are part of nature, so therefore anything we create is also natural), and the line between them isn't bright and shiny.

To be clear, “natural” does not equate to “better,” but that’s what the marketing wants you to think.

And, for the most part, they've been wildly successful at that mind control.

2. The belief that more of a good thing is always better

This one's easy enough to debunk: one shot of tequila is a very good thing. A hundred, and you're dead.

If you don’t have a deficiency, consuming more of a particular vitamin or mineral through a supplement won’t necessarily lead to health benefits. That’s why supplement skeptics sometimes say, “You’re just paying for expensive pee” – since your body will excrete the excess.

The article mentions vitamins C and D, but completely ignores the well-documented dangers of overconsuming Vitamin A. I'm not an expert, though, and the author is, so don't take my word for it.

3. The action bias

Finally, the supplement industry likes to capitalize on the idea that doing something is better than doing nothing.


This may or may not be a cultural thing; I don't know. I do know that, in many cases, doing nothing is a far superior choice. Like with work, for example.

When it comes to supplements, taking them isn’t necessarily better than not.

So, how do we know what's a proper dose? Well, like I said, I avoid them entirely, but that may not be the best course of action, either. I'd say "ask your doctor," but there's a shocking number of people who distrust experts because they're experts. And, to be fair, they are human, so they can be wrong—just not as often as those who are not experts. It's kind of like being scared shitless of flying while thinking nothing of driving, even though, mile for mile, the latter is orders of magnitude more likely to kill or injure you.

And you can't just go on the internet and "do research," because the internet is full of misinformation, especially YouTube.

There are good sources, but again, people tend to believe what they want to believe (or what marketing makes them believe) over dry scientific data.

I do what I can to avoid being part of the problem. At least, I don't have a profit motive, which seems to drive a lot of misinformation: "You can't trust doctors or science, but you can trust what I say because I need money!"

People used to think that education would fix this and similar problems, but I've lost any hope for that since the pandemic started. I'm tempted to just leave it alone, but it still bugs me on some level that people still fall for this shit.
July 26, 2025 at 7:55am
#1094118
The article I'm sharing today, from Knowable, is almost two years old. That matters in this case, because I've followed this technology somewhat, and more advances appear to have been made. Nevertheless, I'm tackling the topic.

    Pursuing fusion power
Scientists have been chasing the dream of harnessing the reactions that power the Sun since the dawn of the atomic era. Interest, and investment, in the carbon-free energy source is heating up.


"Heating up." Snort. Look, if a headline doesn't contain a pun, I'm usually disappointed.

For the better part of a century now, astronomers and physicists have known that a process called thermonuclear fusion has kept the Sun and the stars shining for millions or even billions of years.

Let's re-evaluate that sentence. The Sun is about 4.6 billion years old. The Earth, a bit younger. Life? Well, that's debatable, but it seems to have started not long after Earth's formation. Humans have been around, depending on definition, for maybe a million years. For most of our planet's history, no one cared what powered the Sun. Once we started pondering, we figured it was gods (or that the Sun was itself a god). Until, once science had advanced far enough, we figured out, to some degree anyway, what was going on in its core, which settled an earlier question about how it could be that old.

Science didn't put all the pieces together until the depths of the Great Depression, in the 1930s. Once we did, people were like, "Okay, so how can we do that ourselves?"

It’s a dream that’s only become more compelling today, in the age of escalating climate change. Harnessing thermonuclear fusion and feeding it into the world’s electric grids could help make all our carbon dioxide-spewing coal- and gas-fired plants a distant memory.

Here's where I get cynical.

For starters, the first human-made use of thermonuclear fusion was the result of people going "How can we make a bigger kaboom?" I mean, this did result in the invention of the bikini, so it wasn't all bad. It is indeed an ill wind that blows nobody up. Or something.

Second, while I have no doubt that we'll figure this shit out eventually (assuming we don't get nuked by the kaboom version), there's a running joke that electricity from fusion power is 20 years away, and has been since about 1960. So one might forgive me for being just a little skeptical about the hype.

And for finishers, lots of inventions started with "this will be cleaner." One reason the automobile got promoted, for example, was people (especially in cities) were fed up with all the literal horseshit, and wanted something that polluted less. And, no matter what the development, someone will figure out a downside. Solar panels (a means of harnessing the power of that giant thermonuclear fusion reactor in the sky) create no pollution, but they throw shade on desert ecosystems. Windmills (also harnessing solar power, albeit less directly) make life difficult for birds (though the "causes cancer" horseshit is oil industry propaganda). Want to switch from internal combustion to electric vehicles? The battery life cycle poisons the environment, too. Point is, while no one's pointed out a downside to fusion power yet, someone will come up with something. My guess? Something to do with wasting water making the hydrogen isotopes necessary to fuel the things.

In fact, fusion is the exact opposite of fission: Instead of splitting heavy elements such as uranium into lighter atoms, fusion generates energy by merging various isotopes of light elements such as hydrogen into heavier atoms.

It's not the "exact" opposite, but close enough for a popular science article.

Doing it on Earth means putting those light isotopes into a reactor and finding a way to heat them to hundreds of millions of degrees centigrade...

You know, when you use rough but enormous orders of magnitude like "hundreds of millions of degrees," it doesn't matter whether you're talking centigrade, fahrenheit, kelvin, or rankine.
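Here's a quick sanity check on that claim, with 100 million as a stand-in value (my number, not the article's):

```python
# At fusion temperatures, the choice of scale barely matters.
# Convert an illustrative 100 million degrees Celsius to other scales.

t_c = 100e6                  # stand-in plasma temperature, degrees Celsius
t_k = t_c + 273.15           # kelvin: the additive offset is negligible here
t_f = t_c * 9 / 5 + 32       # Fahrenheit: less than a factor of 2 away
t_r = t_f + 459.67           # Rankine: Fahrenheit-sized degrees from absolute zero

for name, val in [("Celsius", t_c), ("kelvin", t_k),
                  ("Fahrenheit", t_f), ("Rankine", t_r)]:
    print(f"{name:>10}: {val:.3e} degrees")   # all the same order of magnitude
```

The 273.15 offset vanishes into the rounding, and even the factor of 1.8 between Celsius-sized and Fahrenheit-sized degrees keeps you within "hundreds of millions."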

It was only in late 2022, in fact, that a multibillion-dollar fusion experiment in California finally got a tiny isotope sample to put out more thermonuclear energy than went in to ignite it. And that event, which lasted only about one-tenth of a nanosecond, had to be triggered by the combined output of 192 of the world’s most powerful lasers.

Doesn't sound like it, but that milestone was a really big deal. It made us believe that practical fusion was only 20 years away.

Again.
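For scale, the widely reported numbers from that December 2022 shot (approximate, and counting only the laser energy delivered to the target):

$$Q_{\text{target}} = \frac{E_{\text{fusion}}}{E_{\text{laser}}} \approx \frac{3.15\ \text{MJ}}{2.05\ \text{MJ}} \approx 1.5$$

The catch: the facility reportedly drew on the order of 300 MJ from the grid to fire those lasers, so wall-plug break-even is still a long way off.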

Indeed, more than 40 commercial fusion firms have been launched since TAE became the first in 1998 — most of them in the past five years, and many with a power-reactor design that they hope to have operating in the next decade or so.

Cynical Me: "Yeah, right, most of them are just there for the funding." And also, "You mean two decades."

None of this has gone unnoticed by private investment firms, which have backed the fusion startups with some $6 billion and counting.

Cynical Me hates being right. Oh, who am I kidding; he revels in it.

Granted, there’s ample reason for caution — starting with the fact that none of these firms has so far shown that it can generate net fusion energy even briefly, much less ramp up to a commercial-scale machine within a decade.

"Caution?" Nay. Cynicism.

In the meantime, to give a sense of the possibilities, here is an overview of the challenges that every fusion reactor has to overcome, and a look at some of the best-funded and best-developed designs for meeting those challenges.

Okay, I've spent enough time on this. The article is there if you're interested in the promised details. I really just have one more thing to comment on:

The basics of these three challenges — igniting the plasma, sustaining the reaction, and harvesting the energy — were clear from the earliest days of fusion energy research. And by the 1950s, innovators in the field had begun to come up with any number of schemes for solving them — most of which fell by the wayside after 1968, when Soviet physicists went public with a design they called the tokamak.

Like several of the earlier reactor concepts, tokamaks featured a plasma chamber something like a hollow donut...


And I'm severely disappointed that they won't call the toroid a bagel.
July 25, 2025 at 10:31am
#1094073
From Quartz, an article that probably inspired my last Comedy newsletter editorial on how to be funny, but I'd forgotten I had it saved:

    The secret to being witty, revealed
Have you ever thought of the perfect quip or comeback after it didn’t matter—a minute, hour, or day after your conversation has ended?


See, that's the advantage of interacting mostly online. If someone tells me "You're a poopyhead," I can think about it for an hour, a day, even a week or more before responding, "Your mom."

By Ephrat Livni and Ephrat Livni

I don't really know what's going on with that byline, but that dynamic duo has other credits on Quartz, so it's not just this article.

That aside was apropos of nothing, though, so let's get to the article.

Well, there’s a name for that phenomenon. It’s called l’esprit de l’escalier, or the spirit of the staircase, and refers to the perfect retort that arises at the wrong time.

Everything sounds better in French.

You can practice being wittier, improving your reaction times and ability to land a jab or joke at just the right moment. In his new book, Wit’s End: What Wit Is, How It Works, and Why We Need It, released on Nov. 13, author, editor, and journalist James Geary of Harvard University’s Nieman Foundation argues that wit isn’t just for a few gifted linguists.

Oh, surprise, an ad for a book. I'm not sure when that "November 13" was. The dateline on the article just says "updated," and I can't be arsed to look up the original publication date.

What's worse is the missed opportunity to use "cunning" instead of "gifted."

By practicing and mastering wit, learning to turn words and phrases around in the mind and presenting new juxtapositions, we can change the way we and other people see.

Another missed opportunity: "we can change the way we and other people witness." Because wit, get it? Huh? Huh?

The wittiest among us are simply people who make unusual connections between words and ideas. There’s a refreshing element of surprise to these observations that prompts a smile or a wince from the listener who didn’t see the link until it was presented.

"Simply?" I'll have you know it's taken me years of study and practice to become this witty.

In cognitive terms, the brain of the wit is less inhibited than that of a linguistic dullard. “Uncensored access to associations, conscious and unconscious, is essential to wit,” Geary writes. He notes that some people who experience brain damage or have neuropsychiatric diseases lose their ability to make these associations altogether, while others suffer from witzelsucht. This German term means “wit sickness” or “wit addiction” and results in a compulsion to make jokes that are often socially inappropriate.

Everything sounds worse in German.

Like other forms of creativity it is borne of knowledge. Having a rich vocabulary is a starting point. Curiosity is another important element. Appreciating language in all the places and ways it’s used—from pop music to literary fiction, scientific writing to slang—makes it easier to generate unusual combinations.

The rumor that I worked to learn French just so I could pun in two languages is almost entirely true.

Anyway, no, the article doesn't provide that gleaming, just-out-of-reach jewel of "how to be funny." It's not like "how to take care of cats" or "how to make a billion dollars," both of which are far easier than being funny when you're not funny. But it might point some people in the right direction. Presumably, that "right direction" is buying the book they mentioned.
July 24, 2025 at 9:43am
#1094016
I know I've written about this word before, but not about this particular article, which is pretty new as I write this. From NPR (apparently, a transcript of a broadcast piece):



Sorry, yinz. Fuhgeddaboudit, you guys: In the past 20 years or so, "y'all" has gone from being a Southernism to become America's favorite way to use the second person plural, according to linguists.

Huh. So the South did, indeed, rise again, only this time without shots being fired or people being enslaved.

"Y'all has won," says Paul E. Reed, a linguist at the University of Alabama who studies Southern American English and Appalachian English.

I'm not sure a Southerner declaring victory is any sort of proof of anything.

I'd also like to note the construction of his sentence. By saying "Y'all has won," he makes it clear that he means the word "y'all" has become victorious, and the singular verb does not violate subject/verb number agreement.

Admirers appreciate y'all's tidiness and utility. In particular, Reed says, young people across the U.S. seem to love y'all.

Young people create and/or latch on to new words all the time, in part to rebel against their stodgy elders. Most of such words are dumb and eventually fall out of favor. Occasionally, one will worm its way into the general lexicon. "Cool," used to describe something or someone that doesn't suck, is one such word. I fully support that same trajectory for "y'all." (I'm also quite fond of "yeet.")

Long-term migration patterns have also helped y'all spread, from Black Americans who brought it with them out of the South during the Great Migration, to Northerners and others who have more recently adopted the term after moving to the South.

Whatever its origin (still somewhat disputed), the lack of a defined second-person plural pronoun in standard modern English is a gap just begging to be filled by something. As the article notes:

The word has thrived because it's utilitarian, filling a gap in standard English. We use y'all — and relatives like yinz (for those in Pittsburgh) and youse — because the language has long lacked a satisfying plural pronoun for "you."

"Basically, all of the non-mainstream varieties are better than the mainstream variety, because 'you' being for plural is confusing," Reed says.

I would argue that "y'all" is mainstream, but then, I've always lived in the South.

The article goes into some of the origin debate, which I don't have much to say about.

Another theory notes that written instances of y'all date to 17th century England, as far back as a 1631 poem. But Reed and other linguists say it's not clear whether those examples exactly mirror the meaning and usage of the modern y'all.

Why that matters is beyond me. It's well-established that, in the absence of language fascism, word meanings and usage change over time. It's an example I've used before, but the word "nice" has changed meaning more than once since 1631, and yet no one seems to care about its changes as opposed to the ones associated with "y'all."

Y'all is on a popularity streak. It's been springing up as far away as Australia, and executives are being trained to adopt "y'all" to be more inclusive.

I have yet to see it in a formal context, though (except, of course, formal reports about the use of "y'all.") It's hard to imagine, but lots of stuff that used to be hard to imagine has happened, like the first time I heard a Star Trek character say "fuck." (I cheered.)

Wright says y'all is benefiting from a process called diffusion, as it grows beyond its former geographic boundaries — a process that's very difficult to predict.

"So being able to watch this happen in real time, it's like a celestial event or something for an astrophysicist, it's like this is a once-in-a-lifetime thing."


Okay, I'm not trying to downplay the impact for a linguist of watching language change in real time, but let's not go overboard, here. Watching a nearby (but not too nearby) supernova go off would be way more awesome. Until that happens, though, I'm happy for y'all.
July 23, 2025 at 8:36am
#1093951
This article, from aeon, is nearly ten years old. But as far as I know, very little progress has been made on this age-old question.

    The real problem
It looks like scientists and philosophers might have made consciousness far more mysterious than it needs to be


"You're doing it wrong," profound thoughts edition.

What is the best way to understand consciousness?

I don't know, but I have a strong suspicion that the first step is "be conscious."

Psychedelics are also an option.

In philosophy, centuries-old debates continue to rage over whether the Universe is divided, following René Descartes, into ‘mind stuff’ and ‘matter stuff’.

Regular readers will note that I appreciate Descartes. But I reject that dualism. Provisionally.

But the rise of modern neuroscience has seen a more pragmatic approach gain ground: an approach that is guided by philosophy but doesn’t rely on philosophical research to provide the answers.

Which is good, because "philosophical research" is nothing more than "reading dense tomes that philosophers crafted." And there are more of those than can be read.

I've said before that philosophy and science are partners, in a sense: science informs philosophy, while philosophy guides science.

Its key is to recognise that explaining why consciousness exists at all is not necessary in order to make progress in revealing its material basis – to start building explanatory bridges from the subjective and phenomenal to the objective and measurable.

A noble goal, in my way of thinking, but from my purely amateur perspective, the question of "why consciousness exists at all" has, at base, a very simple answer: because evolution selected for it. That's not a complete answer, of course, but one thing I haven't seen (though I haven't seen everything) is an attempt to explain consciousness in terms of evolutionary adaptation. All species have evolved, and not all have evolved what we think of as consciousness, but those that didn't have other adaptations that promote survival and reproduction. In humans, it's adapted to the point that we can consciously choose (whether choice is an illusion or not) not to reproduce, and that's interesting.

In my own research, a new picture is taking shape in which conscious experience is seen as deeply grounded in how brains and bodies work together to maintain physiological integrity – to stay alive. In this story, we are conscious ‘beast-machines’, and I hope to show you why.

And, questionable labels aside, that's pretty close to my own thoughts—albeit with more resources, education, and focus.

Let’s begin with David Chalmers’s influential distinction, inherited from Descartes, between the ‘easy problem’ and the ‘hard problem’. The ‘easy problem’ is to understand how the brain (and body) gives rise to perception, cognition, learning and behaviour. The ‘hard’ problem is to understand why and how any of this should be associated with consciousness at all: why aren’t we just robots, or philosophical zombies, without any inner universe?

Again, though, I suspect that "hard problem" is asking the wrong question.

But there is an alternative, which I like to call the real problem: how to account for the various properties of consciousness in terms of biological mechanisms; without pretending it doesn’t exist (easy problem) and without worrying too much about explaining its existence in the first place (hard problem).

Ask me, calling it "the real problem" is a problem, because it exudes an air of self-superiority. But don't ask me what I'd call it, because I don't know.

I'm not going to get into much more of the article here. Just one more quote:

Rather, consciousness seems to depend on how different parts of the brain speak to each other, in specific ways.

Science likes to divide and conquer. That is, if you have a complex system, it can be more productive to study its subsystems to gain an understanding of the whole. Like, if you're trying to figure out a car, you can break it down into things like engine, cooling system, steering mechanism, brakes, etc. Each one of those can be understood on its own. But to make a recognizable "car," you also need to know how all of these systems interact with each other.

That's relatively simple with a car; not so much with a body.
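To make the point concrete, here's a toy sketch; every class and number in it is invented for illustration. "Overheating" isn't a property of the engine or the cooling system studied alone; it only shows up when you couple them.

```python
# Toy model of divide-and-conquer: subsystems are simple in isolation,
# but the interesting behavior (overheating) emerges from their interaction.
# All names and values are made up for illustration.

class Engine:
    def run(self) -> float:
        return 40.0                        # heat produced per tick

class CoolingSystem:
    def absorb(self, heat: float) -> float:
        return max(0.0, heat - 35.0)       # heat it fails to shed per tick

class Car:
    """The failure mode lives in the coupling, not in either part."""
    def __init__(self) -> None:
        self.engine = Engine()
        self.cooling = CoolingSystem()
        self.excess = 0.0

    def tick(self) -> bool:
        self.excess += self.cooling.absorb(self.engine.run())
        return self.excess < 50.0          # still drivable?

car = Car()
ticks = 0
while car.tick():
    ticks += 1
print(f"ran for {ticks} ticks before overheating")   # 9 ticks, then it's done
```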

I'm not weighing in on the results, here. I just think (perhaps because I'm programmed to) that it's helpful to try a different approach sometimes, and this seems like just such an attempt. It could well be that, just as we can't get an outsider's perspective on the Universe, perhaps we can never truly understand consciousness.

Or maybe we're just asking the wrong questions.
July 22, 2025 at 9:48am
#1093895
I'm no stranger to weighing in on issues I know little about. This article from Smithsonian is a prime example.

    A Search for the World’s Best Durian, the Divisive Fruit That’s Prized—and Reviled
Devotees of the crop journey to a Malaysian island to find the most fragrant and tasty specimens


I've heard of durian before. I even incorporated them into a short story at one point. But I've never been to the places where they're common; I don't think I've ever seen one outside of a picture; and I've certainly never experienced the smell or taste.

Once you actually taste and smell a ripe durian, the Southeast Asian fruit best known for its penetrating odor, you will understand what all the fuss is about—and why it’s banned from some public spaces throughout the region.

I have a desire for new experiences; I'm just not sure that particular experience is one I want to have.

“It has something we call hong,” a sudden smell—in this case, the aroma of “bad breath and butane gas,” said Wong Peng Ho, a Malaysian Chinese doctor and durian fanatic, as we shared a particularly pungent fruit on the Malaysian island of Penang, just off the country’s west coast. “You know how when you smell butane and you know it’s not good, but you want to continue sniffing anyway? That’s hong.”

On the other hand, I do have intimate familiarity with the smell of butane, and I don't find it repulsive. Not that I want to go huffing it or anything; it's just not something I care about, unless it's somehow filling the entire house, in which case the only thing I'd care about is getting out before a spark ignites.

True durian aficionados don’t just accept extreme flavors; they celebrate, savor, even exult in them. The late celebrity chef Anthony Bourdain once said of its aroma, “Try leaving cheese or a dead body out in the sun and you’re in the same neighborhood as durian.”

Now, I do have to wonder if there's a cultural component to the varied reactions. In the West, we don't have a lot of super-pungent foods. Limburger cheese is one classic example, but people enjoy that (and I don't mind it). Another is Hákarl, which I'm really not interested in attempting.

Almost all desirable durian, at least in Malaysia, has fermented to the point that it has a slight or strong taste of alcohol.

Hm. Maybe I could be persuaded, after all.

After a few nights of aimless wandering and random tasting, I realized something important: Unlike local people, who have had durian wisdom passed down to them by their friends and relatives, I desperately needed a guide, someone who could help me decipher this array of very strange fruit. In search of this knowledge, I came across a distinct breed of sojourners—fruit travelers—who crisscross the earth in search of just-off-the-tree, perfectly ripe produce.

I can understand that. It's like going to France and being completely on your own with wine-tasting. You can do it; you might even enjoy it; but there's nothing like local guides telling you maybe-bullshit stories about the history and culture.

For Gasik, now 36, it all started 16 years ago, while she was working at what she describes as a hippie festival in the northwestern United States. She smelled something strange in the campsites and outdoor classrooms of the event. She asked around. She sniffed. She asked around some more. Finally, an older man told her that she was smelling durian, a magical fruit from Southeast Asia. A superfood. The healthiest thing in the world. “If you eat it, you will get addicted to it,” he told her. “It elevates your spirit, opens your chakras. Durian will change your life.” He was right about the addiction.

Everything else there is, like I said, obvious bullshit.

There's a great deal more at the article. My favorite bit is the photo with a "No Durians Allowed" sign, featuring the usual circle/slash over an all-black spiky ball with a short stem.

Just to be clear here, I'm not ragging on anyone's culture or traditions. I find the whole thing fascinating. And yeah, if I'm ever in the right area, I might even give it a shot.
July 21, 2025 at 11:37am
#1093827
Today (or yesterday, depending on your time zone) is the anniversary of the first bootprint on the Moon, 56 years ago. We know it happened (if you don't believe it did, wow are you in the wrong place right now) in part because they sent photographic images. There was a first one of those, too, but it was a bit further back in history. From Open Culture:



In histories of early photography, Louis Daguerre faithfully appears as one of the fathers of the medium.

Near as I can tell, Daguerre means something like "of war," which explains the next 200 years of war photography.

But had things gone differently, we might know better the harder-to-pronounce name of his onetime partner Joseph Nicéphore Niépce, who produced the first known photograph ever, taken in 1826.

Nah, lucky name, is all. "daguerrotype" sounds better than "niépcotype."

Eventually, after much trial and error, Niépce developed his own photographic process, which he called “heliography.”

Yeah, that wasn't going to catch on, either.

In 1827, Niépce traveled to England to visit his brother. While there, with the assistance of English botanist Francis Bauer, he presented a paper on his new invention to the Royal Society. His findings were rejected, however, because he opted not to fully reveal the details, hoping to make economic gains with a proprietary method.

Oh, come on, it's because he was French.

Sadly for Niépce, his heliograph would not produce the financial or technological success he envisioned, and he died just four years later in 1833. Daguerre, of course, went on to develop his famous process in 1839 and passed into history, but we should remember Niépce’s efforts, and marvel at what he was able to achieve on his own with limited materials and no training or precedent.

Remember how, yesterday, we talked about accidental inventions? Well, this one's something of the opposite of that.

Niépce’s pewter plate image was re-discovered in 1952 by Helmut and Alison Gernsheim, who published an article on the find in The Photographic Journal.

Photography, and its derivative the motion picture (also a French invention, of course), is a key tool of communication now, though the process has changed. I find it remarkable that the image survived nearly 200 years. I doubt most of today's selfies will.
July 20, 2025 at 11:18am
#1093752
I haven't linked to Cracked in a while. I'm not a fan of the direction they went in. And yet, on rare occasions, I still see an article there that's worth a look.

    5 Hollywood-Style Twists That Let Us Figure How the Universe Works
We weren’t trying to solve the world. We were just eavesdropping


Science requires long, repetitive, dedicated work. If we make a movie about a scientific discovery, we’d probably skip all of that and give the hero one crazy eureka moment, because that’s more exciting than how it works in real life.

Well, yeah. It's like the "training montage" that compresses a year or more of dull practice, or how cop shows almost never show them filling out forms or writing reports, except as something to be interrupted.

But occasionally, real discoveries truly do play out thanks to scientists stumbling into something they were never searching for.

What's the bit about blind squirrels finding nuts?

It does happen fairly often, but the article only goes into five examples.

5 The Ugly Smiles

Now there's a band name for you.

Have you ever wondered, though, how we began adding fluoride to water in the first place? It wasn’t because we started out interested in making teeth stronger and tested various substances on teeth till we found the one that worked.

No, of course not. It was because we had to impose mind control on an unruly populace.

Joking, of course.

4 The Noisy Radio

The Big Bang theory is arguably the oldest theory of the universe we have.

Well, at least they're using the word 'theory' in the scientific sense. There were plenty of "theories" (guesses) before the BBT, featuring all-powerful entities and/or turtles. From a scientific perspective, even Einstein did most of his work under the assumption that the universe was eternal and, on a large scale, unchanging.

But they didn’t have proof back when Georges Lemaître proposed the theory in 1931. Proof didn’t come until 1965, when scientists detected background radiation lingering from the birth of the universe.

And then they backslid with this "proof" stuff. I'd have phrased it as "support."

3 What the Spies Heard

Space contains many other kinds of radiation as well. For example, there are gamma-ray bursts, the biggest explosions in the universe, from billions of light-years away.

We'd best hope they stay that far away.

We first observed a gamma-ray burst in the 1960s. We weren’t looking for one of them. We were looking for gamma rays from nuclear tests being secretly conducted by the Soviets, and our satellites were expecting to find such rays coming from Earth.

And here we have another example of wartime tech advancing scientific knowledge.

2 The King’s Challenge

The three-body problem isn’t just the name of a book series and a Netflix show. It’s a real problem in physics, which can be stated like this:


There follows a series of simultaneous equations. Yes, Cracked is unafraid to go where even science sites fear to tread: actual math equations. You don't need to understand them to follow the article, though.
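I won't reproduce the article's equations, but the standard Newtonian statement is compact enough to sketch here (a generic form; not necessarily the article's notation):

$$\ddot{\mathbf{r}}_i = G \sum_{j \neq i} m_j\, \frac{\mathbf{r}_j - \mathbf{r}_i}{\lvert \mathbf{r}_j - \mathbf{r}_i \rvert^{3}}, \qquad i = 1, 2, 3$$

Three bodies, three coordinates each, nine coupled differential equations, and no general closed-form solution. Hence the prize money.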

In 1889, the King of Sweden offered a prize to whoever could solve a question he presented about stability of the objects in a three-body problem.

Money is an excellent motivator, even for scientists. What is it with Sweden and fun cash prizes, anyway?

1 Metal in the Snow

Another excellent band name. Or album name.

The Earth is 4.5 billion years old. We know this not from checking the dates on contemporary news reports about the Earth’s creation but because of a calculation in the 1950s from geochemist Clair Patterson.

If we did check contemporary news reports, I'm pretty sure the consensus would have been that this was a Bad Idea.
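For flavor, here's the simplest version of the arithmetic behind radiometric ages. This is a generic parent-daughter sketch; Patterson's actual method was a fancier lead-lead isochron on meteorite samples, so treat this as illustration, not his calculation.

```python
import math

# Generic radiometric age from a daughter/parent ratio, assuming a closed
# system and no initial daughter product -- simpler than Patterson's method.

HALF_LIFE_U238_YR = 4.468e9                  # uranium-238 half-life in years
LAMBDA = math.log(2) / HALF_LIFE_U238_YR     # decay constant

def age_years(daughter_to_parent: float) -> float:
    """t = ln(1 + D/P) / lambda, with D the daughter atoms (e.g., Pb-206)
    and P the surviving parent atoms (e.g., U-238)."""
    return math.log(1.0 + daughter_to_parent) / LAMBDA

# Equal parts daughter and surviving parent means one half-life has passed:
print(f"{age_years(1.0) / 1e9:.2f} billion years")   # ~4.47
```

That the U-238 half-life happens to sit right around 4.5 billion years is a coincidence, but a convenient one for the example.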

He found lead in rocks, as expected, and he used this for his calculations, but lead also seemed to coat just about everything. He looked in Californian snowdrifts and saw they had loads of lead. He looked in the atmosphere and saw it had loads of lead. You’d think someone else would have noticed this by now, but it seemed that all scientists who previously researched the subject were employed by the National Society for Selling More Lead.

Like I said, money is an excellent motivator. So is the threat to remove money.

Patterson was now looking at something completely unrelated to what he’d planned, but he pressed further and proved that this lead contamination came from vehicle emissions. This finally ended with new clean air laws and the phasing out of leaded gasoline.

And the world became a shining utopia.
July 19, 2025 at 8:48am
#1093690
Last time, it was USA Today; now, it's Psychology Today. Never will be tomorrow, will it?

    Why We're So Judgmental
Judgments serve as protective barriers for perfectionists.


The other appropriate thing is that this comes up after I was pretty damn judgmental yesterday. No, I'm not sorry; when it comes to people actively making things worse for those who are already disadvantaged, that's when we should be judgmental, in my view.

Before I get into the actual subject, a usage note: Both "judgment" and "judgement" are acceptable spellings. The one with two Es is more common in British English, while the other is more of a US thing, kind of like how we dropped the superfluous U in "colour." The article, being written by a US author and presumably published for a US audience, uses "judgment," but I can't promise to be consistent in my commentary. Like I said, neither is wrong, any more than "gray" vs. "grey." (Gray is a color. Grey is a colour.)

Judgments are knee-jerk, negative opinions of others, which are based on limited information.

We can make positive judgments, too; but that's not the sense of "judgmental."

On the one hand, it helps you feel superior. And, on the other, it contributes to chronically feeling unsafe because you believe people are awful.

Except that they're not. A few are, and, like with yesterday's example, they tend to ruin it for everyone else.

Unfortunately, perfectionism and cynicism tend to go hand in hand.

Yeah, they're going to have to explain that one to me, because I don't see the connection.

The perfectionist know-it-all believes they're the rational one. They don't perceive their defenses as defenses; to them, their perceptions are reality.

Seems to me that everyone believes they're the rational one. No one actually is.

So, self-oriented perfectionists, those who hold themselves up to extremely high standards, are usually also other-oriented perfectionists, treating others the same way. Anyone falling short of those standards is then criticized as being stupid, lazy, inconsiderate, and/or incompetent.

Ah, now I think I understand. I consider myself stupid, lazy, inconsiderate, and/or incompetent; I don't expect others to be superior to me. (Or inferior, just to be clear.)

This way of thinking forms the foundation of isolation and subsequent chronic loneliness.

Not all who isolate are lonely. As usual, I feel like the article is written with the extroverted majority in mind.

And empathy allows us to see ourselves in others, eliciting memories of similar choices in similar moments. However, that sense of weakness can scare us; perfectionists demand perfection for a sense of security, again from themselves and others.

I begin to see the real problem: thinking empathy is a weakness. It is not.

To challenge our judgmental mindset, we have to acknowledge our fear of being perceived as weak, which stems, in part, from hierarchal thinking.

Well. That ties in nicely with my non-hierarchal (look, it may be "hierarchical," but I find that unwieldy) viewpoint. But that doesn't mean it's right. My own defenses include a) humor and b) skepticism about the soft sciences like psychology.

When we're always competing, we need constant reminders of why we're better. In addition to supposedly protecting us from the world, judgments help us see the parts of ourselves that we don't like in others instead. They let us know that we're progressing well. And they act as mood regulators, picking us back up when we're down. Yet, we pay a significant price for them. The person reading this is challenged to find another way to feel proud and good about themselves.

Or, and hear me out here, maybe grow beyond needing to compete or find reasons to "feel proud and good about themselves?" I feel like there's no point competing when your competition includes up to 8 billion other people and counting.

Carl Jung noted, "Thinking is difficult; that's why most people judge." Moreover, thinking can be scary. Yet, judgments, when they become one's main coping mechanism, make our worlds even scarier.

Well, at least he quoted Jung and not Fraud.

Yes, that's a judgment. Not only have I never claimed to be consistent, I've noted repeatedly that I'm full of contradictions. "I am large; I contain multitudes."

Again, though, I feel like there's a line between judging people for cause and taking shortcuts due to perfectionism or cynicism or whatever psychojargon you prefer. But we're all flawed, and maybe a little less condemnation for things like wearing socks with sandals, which ultimately doesn't harm anyone, would be appropriate.

Then again, we all need exercise, and what's an easier exercise than jumping to conclusions?

Just to be clear once again, I'm not trying to contradict the article. I'm also not saying it's right. Just something to think about. And at least he didn't sidetrack into purely speculative evolutionary psychology, which I think most other authors would have found a way to shoehorn in there.
July 18, 2025 at 10:50am
#1093643
This story from USA Today (actually from April, not today) is something I've been complaining about for a while, but people seem to think that complaining about it makes me a bad person.



Service dogs can be four-legged lifesavers, alerting to dangerous allergens, assisting with travel and making people with a wide range of disabilities safer.

Cats could do all that too, but they're too intelligent to want to.

But fake service dogs, advocates say, are taking a bite out of real service dogs’ credibility, exacerbating the challenges that people with disabilities who rely on service animals already face. Fake service dogs are poorly trained or untrained animals falsely passed off by individuals trying to access restricted places or benefits.

And that is what I've complained about, only to be told that I'm a terrible person for even suggesting it.

Thousands of grocers and shop owners now prohibit any animals, including legitimate service dogs, from entering their stores.

Huh. I didn't even know that was legal in the US. I thought ADA required service animal accommodation. Legitimate service animal accommodation, that is.

Businesses are required to allow service animals onto their premises under Title III of the Americans with Disabilities Act (ADA), but it's not always obvious whether a service dog is legitimate.

See? I was right.

"There are individuals and organizations that sell service animal certification or registration documents online," federal officials warn. "These documents do not convey any rights under the (Americans with Disabilities Act) and the Department of Justice does not recognize them as proof that the dog is a service animal."

This is the problem. Entitled people who want to bring their stupid dog everyfuckingwhere are the problem, and so are the people who enable it. If I were dictator, I'd ship every last one of them to a gulag. Without their dogs.

Mollica advises individuals with service dogs who encounter skeptical business owners to take "a nonaggressive, non-defensive stance" and let them know the animal is legitimately needed.

Nonaggressive AND nondefensive. Um... okay.

Yes, people with a legitimate need for service animals should be accommodated. I'm not arguing that. But it's not the business owners who are to blame, here; it's the fake-service-animal people. It's kind of like the shops that only allow in one teen at a time: it's not because every teen is a problem, but because enough are that the business feels the need to have some control over potential situations.

According to Canine Companions, loopholes in the ADA have enabled scammers to exploit the system. In 2024, the group said it hopes to persuade lawmakers to add definitive language to the act that addresses service dog representation, making it "crystal-clear that misrepresentation of a disability for personal gain – including the use of a service dog – is against the law."

We might have more pressing legislative matters to deal with, but that seems like a step in the right direction. It's not always about personal gain, though; often, it's about Main Character Syndrome. A false sense of entitlement. Maybe even a dose of denial: "Oh, Froofloo won't hurt anyone." *Froofloo bites some kid* "That kid must have deserved it!"

Now, I'm aware that things aren't always so clear-cut. Like, maybe you've grown attached to a dog but the only place you can find to live doesn't allow dogs, so you fib a little to say it's a service animal. I'd still send you to the gulag, but at least I can sympathize with that situation.

But for the most part, anyone who fakes having a service animal is part of the problem and should be mocked, shunned and avoided.
July 17, 2025 at 12:21pm
#1093594
Well, the random numbers have hit on another article about Europe, but this time, a different country and a different ethnic group. From the very American Smithsonian:

    Jewish Food Is Making a Comeback in Poland
Bagels, knishes, bialys and more are popping up in bakeries as the country reckons with historical trauma


I have, of course, written about bagels before, but that was in the context of New York City.

Jewish food, and especially Ashkenazic Jewish food, is slowly but steadily returning to the country, where many of the dishes actually originated. The comeback is driven by a growing interest from Polish people in finally facing their country’s past.

I don't know how widespread this knowledge is, so I'll just mention that there were two main branches of Judaism in Europe: Ashkenazic in the east, and Sephardic in the west. The Ashkenazi were the ones who spoke Yiddish and popularized bagels, bringing them to the US through immigration to NYC.

This is certainly the case with the bagel, with bakeries all over Poland serving them. But other foods are reappearing as well, such as the knish, or knysz in Polish—a bun filled with kasha, potatoes or cheese.

Dammit, now I'm getting hungry.

Jewish communities in Poland originated foods like the bagel, knish and bialy. When they fled from pogroms during the late 19th century, they brought their recipes with them, says Maria Zalewska, executive director of the Auschwitz-Birkenau Memorial Foundation and co-editor of the book Honey Cake & Latkes, a compilation of recipes written down by survivors of the Auschwitz concentration camp.

Okay, that's a little dark.

Between 1881 and 1914, more than two million Jews immigrated from Eastern Europe to the United States. A large majority, about 1.6 million, came from the Russian Empire (which included parts of Poland at the time). Their exodus was driven by social, economic and technological change combined with antisemitic persecution in their countries of origins.

Pretty sure my great-grandparents were part of this migration, but I'm still not entirely clear on what year they immigrated, and probably never will be. I do remember my grandmother spoke Polish and Yiddish, though her English was excellent.

“You can follow the history of the Jewish community through food and understand why these foods have disappeared [in Poland] but survived in New York,” says Magdalena Maślak, culinary program curator at Warsaw’s POLIN Museum of the History of Polish Jews.

Another place a bunch of Ashkenazim landed in the Americas was Montreal, which the article doesn't mention. I bring it up because Montreal is also known for having excellent bagels.

Dill pickles have long been considered a Jewish food in America, because Jewish immigrants brought pickles to this country and popularized them at Jewish delis. “But they’re not necessarily Jewish, and that tells you something. That’s the story of food,” says Liz Alpern, a chef and co-founder of the Gefilteria, a New York-based venture that offers Ashkenazic food with a modern twist.

And I'm going to clarify something here, because the article doesn't and I feel it's important: one of the most popular varieties of dill pickles is called the kosher pickle, or kosher dill. There's nothing inherently unkosher about pickles, made up as they are of cucumbers in brine with various herbs and spices, including dill and garlic. The reason kosher became a pickle style is that the brine traditionally uses coarse-grain kosher salt. Kosher salt itself is misnamed; it's really koshering salt, the kind used to make meat certifiably kosher.

I just typed all that from memory, and my memory might be faulty, so, as usual, don't just take my word for it.

With time, foods such as the pastrami sandwich or the bagel became staples of an evolving Jewish American food culture, different from those of their parents and grandparents, that gave rise to new traditions.

We have a very good (though not up to some NYC standards) bagel place here where I live, and one of their offerings is a pastrami bagel. It is very good. It is even better with Swiss cheese on top, which makes it very much not kosher.

Pastrami itself is interesting and, if I had more time today, I'd delve into its history.

Likewise, it was Hersz Lender, a Jewish baker from Lublin, Poland, who was credited with bringing the bagel to New York—and turning it into the morning staple known today.

I'm sure you recognize the name; a very common frozen bagel brand still bears it. But the current version bears little resemblance to the Platonic ideal of a bagel; they're just cheap, mass-producible, and toroidal. Last I heard, the brand is produced by the same company that makes Thomas' English Muffins.

Jewish American foods mixed with other cuisines and influences, and the bagel is no exception. “The lox itself is Scandinavian. The cream cheese is from New York. The capers on it are Italian. But it’s putting it all together that made it Jewish,” says Jeffrey Yoskowitz, a New York City-based food writer and co-founder of the Gefilteria.

And yes, the cream cheese is a New York thing. Never mind that the most widespread (pun absolutely intended) brand in the US says "Philadelphia." It's got nothing to do with Philly, any more than Land O' Lakes butter is from some idyllic lake-strewn countryside that used to have Native Americans on it.

So anyway, the rest of the article is kind of depressing, as anything involving Eastern Europe and Jews tends to be. But the important thing is: bagels have come full circle. And that pun was also absolutely intended.
July 16, 2025 at 10:31am
#1093541
I'm mostly sharing this PopSci article because it's interesting archaeology, but there's also a computer-age twist.

    Construction workers find Viking graves linked to King ‘Bluetooth’
The site includes relics illustrating a ‘vast and dynamic world.’


Construction workers digging about four miles north of Aarhus, Denmark have accidentally discovered a “spectacular” Viking gravesite.

That's the interesting bit, to me at least: that it wasn't a deliberate dig in a known burial mound or settlement site, but just some semi-random place.

Dating back to the second half of the 10th century, the archeological trove may even tie directly to one of Denmark’s most famous rulers: King Harald “Bluetooth” Gormsson. And yes—his legacy is tied to the handy wireless feature in your smartphone.

Hence the computer-age twist, however tenuous and speculative it might be. Hopefully the archaeologists involved work more reliably than the ancient king's namesake.

Human remains such as bones and teeth were also found at the site along with smaller, less ornate graves that possibly held an elite family’s enslaved workers.

Sometimes we forget that slavery, in some form or another, was the norm rather than the aberration throughout most of human history. It wasn't always tied to race the way we tend to think of it now.

I'm not defending the practice, mind you.

Archeologists speculate the burial site is probably related to a nobleman’s farm located less than 0.65 miles away.

Okay, look. I get that PopSci has a mostly US audience to target, so they convert to units this back-asswards country is familiar with. But would it have killed them to phrase it as "about 1km away?"

The son of King Gorm the Old, Harald ruled over Denmark and Norway from around 958–986 CE...

At which point, presumably, the Nordic countries could be described as Gormless.

...and allegedly earned his nickname from a conspicuously colored tooth.

As they didn't find said tooth, I'm withholding judgement on whether Bluetooth was actually involved in the burial site.

His influence is so prominent that during the 1990s, Swedish telecom giant Ericsson picked “Bluetooth” as the working name for a technology intended to “unite” the computer and cellular service industries.

I suppose that's a better name than, say, Leif, though I'd wager more people knew of Leif Erikson than Harald "Bluetooth."

Its recognizable icon still used today? The Nordic rune for “B,” also featured prominently on King Bluetooth’s Jelling Stone.

Only part of the logo is the runic "B," or berkana, or bjarkan. The other half isn't a "K" at all; it's hagall, the Younger Futhark "H," which makes the whole logo a bind rune of Harald Bluetooth's initials. Hagall's older cousin hagalaz is associated with hail, disruption, and transformation, and that bit of symbolism is the truly appropriate one.

When, that is, it actually works.

Yes, I know something about Nordic runes. No, I don't ascribe them mystical properties. But symbols are what we make them, and I've always been amused at this connection between 21st century technology and 10th century kingdoms.
July 15, 2025 at 11:38am
#1093479
For anyone who still reads, from Big Think:

    5 stories that teach you philosophy (better than some philosophy books)
Want to study philosophy but skip some of its heavier tomes? These five novels are a great place to start. (Existential despair guaranteed.)


Sure, because why bother chewing when you can have your meal pre-processed?

Okay, sure, that's an unfair comparison. Sometimes, fiction is what it takes to really get your ideas across.

Philosophy is a rewarding discipline to study. Actually reading philosophy? That can sometimes be a slog through scholastic drudgery.

I suspect that's true for any discipline.

If you want to dive into some philosophy but aren’t in the mood for its heavier tomes, you can find many excellent fiction stories that explore philosophical ideas in accessible and enjoyable ways.

I would argue that most fiction involves some philosophy. Yes, even mass-market pulp fiction. Maybe except for romance.

“The Ones Who Walk Away from Omelas” by Ursula K. Le Guin (1973)

“The Ones Who Walk Away from Omelas” is a short story focusing entirely on a philosophical issue. Specifically, it presents a full-throated argument against utilitarianism.

The idea's been stolen, too. Notably, an episode of Star Trek: Strange New Worlds directly cribbed the premise. And the idea predates Le Guin in turn; she credited a passage from the philosopher William James, and Dostoevsky floated a similar bargain in The Brothers Karamazov. It's not about whose idea it was, though; it's about the idea.

Dream of the Red Chamber by Cao Xueqin (1791)

Dream of the Red Chamber, also known as The Story of the Stone, is an 18th-century novel. It is considered one of the great classic Chinese novels, alongside Journey to the West and Romance of the Three Kingdoms.

As an ugly American, I hadn't even heard of this one. So I can't weigh in on the content, except to say that any overview of world philosophy does need to include, well, the world.

Solaris by Stanisław Lem (1961)

The 1961 novel follows a group of astronauts trying to communicate with the planet Solaris.

It should surprise absolutely no one that science fiction ties in with philosophical ideas. And yet, some still scoff at the entire genre.

Candide: Or, Optimism by Voltaire (1759)

His vast bibliography includes numerous letters, pamphlets, plays, and novels. One of the funniest is certainly Candide: or, the Optimist. Imagine if the Monty Python trope were one French guy writing in the 18th century, and you’ll have a sense of Voltaire’s humor.

Yes, sometimes philosophers write fiction. And sometimes, it doesn't suck.

I haven't actually read this one, though. But I've said this for a long time: The optimist insists that we're living in the best of all possible worlds. The pessimist fears that this is true.

The Trial by Franz Kafka (1925)

The Trial follows Joseph K, a man who is arrested one morning for reasons never made clear to him. His attempts to follow the byzantine rules of the legal system alternatively benefit or harm his case with little rhyme or reason. He is told to attend court sessions without being told when or where and blamed for being late.

I really should read Kafka, even if he didn't write science fiction.

Anyway, mostly, I just wanted to share, though I'm only personally familiar with two of the five works. Thing is, though, pick almost any book that's not for sale at an airport, and there's sure to be some philosophy in it. Sugar-coated, maybe, but that's how the medicine goes down, I'm assured.
July 14, 2025 at 8:52am
#1093397
From Nautilus, a reminder that "bug bomb" used to mean something very different.

    5 Devious Ways People Made Bugs Into Bombs
Insects are among the oldest of weaponry


Maybe they should have called it a bugapult instead of a catapult.

Alongside sticks, stones, and bone, humans also once harnessed a surprising ally in their early weaponry: insects.

To bee, or not to bee?

Researchers hypothesize that humans started using them on the battlefield as far back as 100,000 years ago, long before the beginning of recorded history.

Yeah, okay, but I'm going to need something more than guesswork.

Venomous stingers, refined through millions of years of evolution, can tear the skin and unleash poison. Bacteria that cause deadly diseases in humans and other animals can hitch a ride as insects scatter and swarm across a human landscape.

Clearly, they didn't know about the whole bacteria thing, just that bugs somehow caused illness and death sometimes.

Bee Cannons

Beehive bombs may have been some of the first projectile weapons, according to scholars. As early as Neolithic times, evidence suggests that warriors would attack enemies hiding in caves by throwing hornet nests through the openings.


I'll give them a pass on confusing bees, hornets, wasps, etc. Bee-cause the idea of a bee cannon is darkly humorous.

Bee Grenades

As early as 2600 B.C., the ancient Mayans conscripted bees for warfare. Mayans, traditionally skilled potters, are understood to have created specialized bee grenades from clay.


This is even funnier. If, of course, you're not the one getting stung by the bees (or whatever).

Scorpion Bombs

When the Roman emperor Septimus Severus waged the Second Parthian War to expand his control to Mesopotamia in 198-99 A.D., little did he know that his soldiers would also be up against venomous stinging creatures.


I can also forgive the stinging insect confusion above because a) they know the difference between "venomous" and "poisonous," and b) they didn't call scorpions "insects."

Yeah, I know, calling everything a "bug" is just as wrong, from a taxonomic point of view, but everyone knows what you mean and it's easier than saying "arthropod."

Porcelain Flea Bombs

In 1920s Japan, a mosquito-borne encephalitis virus killed 3,500 people on the island of Shikoku. General Shiro Ishii, a microbiologist and an army officer, was sent to Shikoku to study the epidemic, and he quickly began to plot using the great destructive power of insects for war.


Hm... Porcelain Flea Bombs would make an excellent band name.

Maggot Bombs

In the interest of expanding their repertoire, Japanese Unit 731 began experimenting with house flies—a pest known to flourish among human habitations. Borrowing from the design of the Uji bombs, they developed the maggot bomb, officially known as the Yagi bomb.


As the article notes, no maggots were actually involved, but adult flies were.

Because of the horrors such weapons visited upon their human victims, the Geneva Protocol of 1925 prohibited the use of biological weapons in war, and by 1972, international authorities had outlawed even their creation or possession.

Oh, yeah, that always works.

However long ago bug warfare actually started, I'm once again impressed at the creativity of humans when it comes to destroying or inconveniencing other humans.
July 13, 2025 at 9:42am
#1093342
An "everything you thought you knew was wrong" article from The Conversation:



Shakespeare’s language is widely considered to represent the pinnacle of English.

Oh, right, sure, English only went downhill from there, and no other authors or poets ever created, or could create, anything worthwhile. I don't even know why we try.

But that status is underpinned by multiple myths – ideas about language that have departed from reality (or what is even plausible).

The sarcasm about pinnacles (can you put those on pizza?) aside, I'm all for correcting misconceptions.

The Encyclopedia of Shakespeare’s Language project at Lancaster University, deploying large-scale computer analyses, has been transforming what we know about Shakespeare’s language.

Speaking of which, has anyone told one of those LLMs to write a Shakespeare play yet? I bet they have, and I just haven't heard of it.

1. Shakespeare coined a vast number of words

Well, he did, but not as many as people think – even reputable sources assume more than 1,000.


Whatever the actual number is, anything we (or our computers) come up with is still an estimate.

The word “hobnail” first appears in a text attributed to Shakespeare, but it’s difficult to imagine it arose from a creative poetic act. More likely, it was around in the spoken language of the time and Shakespeare’s use is the earliest recording of it.

"Difficult to imagine" isn't evidence of anything. However, I can absolutely believe that unrecorded spoken language preceded things being written down. Speaking was the social media of the time.

2. Shakespeare IS the English language

The myth that Shakespeare coined loads of words has partly fuelled the myth that Shakespeare’s language constitutes one-quarter, a half or even all of the words of today’s English language.


Yeah, even if I had heard something like this before (and I don't think I had, until this article), my bullshit meter would have beeped.

3. Shakespeare had a huge vocabulary

Ludicrously, popular claims about Shakespeare’s huge vocabulary seem to be driven by the fact that his writings as a whole contain a large number of different words (as noted above, around 21,000). But the more you write, the more opportunities you have to use more words that are different. This means Shakespeare is likely to come out on top of any speculations about vocabulary size simply because he has an exceptionally large surviving body of work.


Seems like this is a version of survivorship bias: the more of your work survives, the bigger your apparent vocabulary.
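
Not from the article, but here's a toy illustration of that size effect, as a Python sketch. The 50,000-word vocabulary and the Zipf-style frequencies are my own made-up assumptions, purely for demonstration; the point is just that distinct-word counts climb with corpus size, so a writer with roughly 900,000 surviving words (about the size of Shakespeare's corpus) racks up a big vocabulary by volume alone.

    # Toy demo: the more you write, the more distinct words show up,
    # even with a fixed underlying vocabulary. All numbers are made up.
    import random

    random.seed(42)
    vocab = list(range(50_000))                   # hypothetical 50k-word language
    weights = [1 / (rank + 1) for rank in vocab]  # Zipf-ish: frequency ~ 1/rank

    for n_words in (10_000, 100_000, 900_000):    # ~900k is roughly Shakespeare-sized
        sample = random.choices(vocab, weights=weights, k=n_words)
        print(f"{n_words:>7,} words written -> {len(set(sample)):>6,} distinct words")

Run it, and the distinct-word count keeps growing with every jump in corpus size, no genius required.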

4. Shakespeare has universal meaning

Sure, some themes or aspects of the human condition are universal, but let’s not get carried away and say that his language is universal.


Canonically, he stole Hamlet from the Klingons.

The mantra of the historical linguist is that all language changes – and Shakespeare isn’t exempt.

If the meanings didn't change, we wouldn't have whole bodies of work translating Elizabethan English to some modern version.

5. Shakespeare didn’t know much Latin

Within some theatrical circles, the idea that Shakespeare didn’t know much Latin emerged. Indeed, the contemporary playwright Ben Jonson famously wrote that Shakespeare had “small Latin, and less Greek”. Shakespeare lacked a university education. University-educated, jealous, snooty playwrights might have been keen to take him down a peg.


I'm not going to address this directly, but one thing I keep seeing is, I think, related: the conceit that Shakespeare didn't write what we attribute to Shakespeare.

The reason this is related, in my view anyway, is that both of these claims seem to be rooted in academic snootery. "How could a commoner have written such scenes and verses? It absolutely had to have been someone more educated."

For all I know, they may be right. I'm no expert, so I'm not weighing in. But claiming that on the grounds of literary snobbery just rubs me the wrong way. Whoever wrote those plays didn't write them for ivory-tower academics; they were the 16th-century equivalent of our pulp paperbacks, written for the amusement of the general public.

And besides, it just doesn't matter. Four hundred and some years later, we're still analyzing and reinventing these words, coined or not, and whoever originally penned them is long gone. As much as I like to dispel myths and correct misunderstandings, we know for sure that the words exist, as well as some historical context for them. And that's what really matters.
July 12, 2025 at 10:02am
#1093288
Oh, wonderful. More about tipping in the USA, from USA Today.

    I'm not tipping a slack-jawed teen for no work. Let's fix our tip culture.
The social contract has been shredded, and we're all left fumbling with our wallets while the person behind us in line judges our generosity for a transaction that once went untipped.


Okay, well, worrying about the person behind you is a "you" thing. They might be just as fed up with tip creep as you are.

Food "tipping" has become an absolute circus, and I’ve had enough.

"I'm mad as hell, and I'm not going to take it anymore!"

The practice should be a straightforward way to reward exceptional service. Now, it’s a guilt-ridden tap dance where a rogue iPad demands a 25% premium for a slack-jawed teen handing you a muffin.

Don't blame the teen, dammit. They just work there.

Tipping has become a source of national anxiety, a phenomenon known as "tipflation," and frankly, it's exhausting.

Ugh. Anxiety isn't what "tipflation" (goddamn stupid silly portmanteau) means. It's the spread of tip prompts into places they don't belong, plus ever-climbing percentage expectations.

If we don’t draw some clear lines in the sand, we’ll soon be tipping the self-checkout machine at the grocery.

Some people already are.

In the spirit of restoring some sanity, allow me to propose 10 reality-adjusted food tipping rules for 2025.

Yeah, that'll work out great. (I'll comment on a few here.)

1. The full-service sit-down meal ‒ 18-22%

There's little argument about this one. Until we find a way to do away with tipping culture, this stands. One might quibble with the amounts, though.

2. The counter offensive ‒ 0%

Most dining experiences these days stand in stark contrast to the classic waited table. If you order at a counter, pick up your food from someone hollering a number, fill your own drink and bus your own table – congratulations, you’ve just provided your own service.


I use the McDonald's Rule: if the level of service is that of McDonald's, where you don't tip, then you don't tip.

5. Coffee, cocktails and courtesy ‒ $1 minimum per drink, double it for effort

Coffee shops usually fall into the "counter service" category. Unless they're bar-like. Anything bar-like, you tip.

9. No SALT

Don’t tip on state and local taxes (SALT). The government is literally charging you to eat. You should not pay someone else a percentage of that amount.


Look, the problem with that is: it takes extra work. Say you go and eat out and it costs $25. Locally, taxes amount to about 10%, so the check comes to $27.50 or so. That's going to be the bottom line on the receipt. If you tip 20%, that's either $5 (on the pre-tax amount) or $5.50 (on the total). Is it really worth your time to dial in precisely on the pre-tax amount? It just seems really picky to me, when you can just see the final total, double it, and move the decimal one to the left.
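
For the pedants, here's that arithmetic as a quick Python sketch. The meal price and the roughly-10% tax rate are my local assumptions, not gospel; plug in your own.

    # The tipping math above, spelled out. All inputs are assumptions.
    meal = 25.00       # pre-tax food cost
    tax_rate = 0.10    # roughly my local rate
    tip_rate = 0.20

    total = meal * (1 + tax_rate)    # $27.50: the bottom line on the receipt
    tip_no_salt = meal * tip_rate    # $5.00: tipping on the pre-tax amount
    tip_lazy = total * tip_rate      # $5.50: double the total, shift the decimal

    print(f"Receipt total: ${total:.2f}")
    print(f"Pre-tax tip:   ${tip_no_salt:.2f}")
    print(f"Post-tax tip:  ${tip_lazy:.2f} (a whole ${tip_lazy - tip_no_salt:.2f} more)")

Fifty cents of difference, in exchange for doing arithmetic at the table. You can guess which method I use.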

Let’s reclaim some common sense in 2025.

No. No, let's not. Common sense is neither common, nor sense.

Conspicuously missing from that list: what to tip a rideshare. And you absolutely should tip a rideshare.
July 11, 2025 at 9:22am
#1093226
Leave it to the French to come up with new ways to confound scientists. From Smithsonian:

    Doctors Detected a Mysterious Antibody in a French Woman’s Body. It Turned Out to Be a Brand New Blood Type
Called “Gwada negative,” it marks the discovery of the 48th known blood group system in humans


Of course it's a new blood type. It's only an antibody if it's from the Antibody region of France.

In 2011, a French woman was undergoing routine medical testing before surgery when doctors discovered a mysterious antibody in her blood.

Personally, I think we should tell kids that antibodies are little ants crawling around under their skin.

Yes, it's wrong, but so is teaching them that only the ABO blood classification system matters. And my suggestion would be funnier.

Now, scientists say the woman is the only known carrier of a new blood type called “Gwada negative.” It’s the only blood type within a new blood group system that scientists have dubbed “PigZ,” which is now the 48th known blood group system in humans, as the French Blood Establishment (EFS) announced last week.

I was going to make a comment about the inappropriateness of "PigZ," but then I saw the part about there being a French Blood Establishment, and that's way more amusing. Sounds more like a secret society of French vampires.

Humans have four major blood groups—the same ones identified at the beginning of the 20th century: A, B, O and AB. Since then, scientists have also determined that blood cells are influenced by a protein called the Rhesus factor.

Apparently, that has nothing to do with Reese's Cups, but a lot to do with rhesus monkeys, which are properly named rhesus macaques, which is yet another source of amusement.

But the full range of human blood is more complex. Scientists now know blood types result from the presence or absence of at least 366 antigens, according to the International Society of Blood Transfusion. Slight variations in which of these antigens are present can lead to rare blood types. The ABO blood group system is only one of many—with the new French research bringing that total to 48.

To be serious for a moment, I didn't know any of that, so hey, I learned something. But more importantly, I had a chuckle.


© Copyright 2025 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted Writing.Com, its affiliates and its syndicates non-exclusive rights to display this work.
