Rated: 18+ · Book · Personal · #1196512
Not for the faint of art.
Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
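As a toy illustration of that last claim (mine, not anything rigorous), the Python sketch below iterates the single transformation z → z² + c and prints a crude text picture of the Mandelbrot set; points whose iterates stay bounded are marked with '#'.

# Toy sketch: iterate z -> z*z + c for points c in the complex plane.
# Points where the iteration stays bounded (here, for 50 steps) belong
# to the Mandelbrot set -- one of those fractals of enormous intricacy.
def escapes(c, max_iter=50):
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:          # once |z| exceeds 2, it diverges
            return True
    return False

for row in range(21):                     # imaginary axis, 1.2 down to -1.2
    b = 1.2 - row * 0.12
    line = ""
    for col in range(64):                 # real axis, -2.0 to about 1.15
        a = -2.0 + col * 0.05
        line += "." if escapes(complex(a, b)) else "#"
    print(line)

That is the entire recipe: square, add, repeat.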





March 12, 2021 at 12:02am
#1006221
Pretty sure I've linked to this guy's blog in here before. Once again, I'm commenting on a blog that comments on a blog. Such is the circle of life. Hopefully the original bloggers won't feel the need to riff on this entry.



A brief word about The Matrix: I, alone in a world of nearly eight billion people (most of whom haven't even seen the movie), hated it.

My issues with that movie come down to three basic things. Well, two and a half.

1) Granted it's been a while since I had the misfortune of watching it, but as I recall, a good portion of the film was wasted with the "question" of whether the main character, Keanu Reeves' Neo, was "The One" or not. As the character's name is an anagram of "one," and because the plot would have gone nowhere otherwise, there was no way in hell that the writers would have made him Not The One (his name wasn't Neothonee), so a good chunk of the plot was utterly pointless. (Incidentally, I do like Reeves as an actor.)

2) I'd been reading up on Gnosticism, and the basic idea of the movie, its conception of "reality," is warmed-over, slurried, technologically updated Gnosticism. Techgnosticism, if you will. You probably won't.

2.5) This has nothing to do with the movie itself, but ever since it came out, you have streetcorner philosophers asking innocent passersby, "What if we're living in a simulation?" We're not living in a simulation; go do something useful, like masturbate.

Anyway, that's all me. Back to the blog post I'm quoting.

Sorry, I invented a label. It’s to describe a nonsensical fad that I keep running into.

He means "Matrixpunk." I invent labels all the time. I'll allow it.

It’s like steampunk: romanticizing the Industrial Revolution by putting gears on your top hat, imagining a world run on the power of steam with gleaming brass fittings, rather than coal miners coughing their lungs out or child labor keeping the textile mills running for 16 hours a day, limbs getting mangled in the machinery.

You mean, like, Dickens? I'd read the hell out of that.

Or cyberpunk, a dark gritty world where cyborgs rule and everyone is plugged into their machines, and the corporations own everything, including those neat eyes you bought.

At least cyberpunk was prophetic. Well, not really prophetic. Corporations use it as a guidebook and fulfill the prophecies on purpose.

Try to live on the bleeding edge, discover that the razor moves on fast leaving you lurking on a crusty blood clot.

Now, that's how you turn a phrase.

So…matrixpunk. One movie comes out in 1999, and everyone is wearing trenchcoats, ooohing at deja vu, and talking about how deep it is that we’re just a simulation (and never mind the losers who are gaga over the red pill/blue pill idea — boy, that one sure drew in a lot of pathetic people).

Oh yeah, I forgot about those shitheads. Call that, oh, reason #2.55 why I hate the movie. I do like my trenchcoat, though (it's not Neo-style).

However, one of the core ideas that seems to have suckered in some physicists and philosophers is the simulation crap.

Okay, so I'm not alone in that bit.

As a thought experiment, sure, speculate away…it’s when people get carried away and think it might really, really be true that my hackles rise.

Look, the problems with the simulation idea are legion, but to highlight:

a) This is an untestable hypothesis.

b) I want to know why someone is stuck on the idea that the world is some sort of holodeck simulation. What's your angle, here?

c) I play video games. In video games, sometimes I play as a despicable character. In Skyrim, for example, sometimes I depopulate entire towns (to the extent game mechanics will let me) just to see how tough my character is, or sometimes just because I feel like it. I know I can always reload a previous save and go about like a perfect paragon of virtue if I want, with the NPCs having no memory of being mercilessly slaughtered. I'd never do such a thing in real life -- because it's real life. Basically, believing we're in a simulation can be license to act like a psychopath.

d) The idea is basically techno-solipsism.

Back to the article, which now quotes another person's blog:

The controversial bit about the simulation hypothesis is that it assumes there is another level of reality where someone or some thing controls what we believe are the laws of nature, or even interferes with those laws.

Ah, I begin to see. The simulation-believers are largely tech people, and they want to believe that God is an IT wizard.

If there are a) many civilizations, and these civilizations b) build computers that run simulations of conscious beings, then c) there are many more simulated conscious beings than real ones, so you are likely to live in a simulation.

As I do not accept (a), the whole thing falls apart. Also, argument from probability is lame. If you win the lottery, do you argue that you didn't actually win the lottery because the probability of doing so is extremely low?

Myers chimes in again:

I’ve got a dazzlingly good hammer, or steam engine, or computer, and therefore the world must be made of nails, driven the piston of a very big steam engine, all under the control of a master computer. Or, more familiarly among the crackpots I have to deal with, watches are designed and manufactured, therefore the rabbits on that heath must also have been designed and manufactured.

I hear that last argument mostly from Muslims. Obviously most Muslims aren't crackpots, because most people aren't crackpots. But it's an argument that sane, rational Muslims have in their playbook to attempt to win over potential converts. My usual response is that we have a very good idea how rabbits came to exist, and they don't require a designer. Watches, on the other hand, are known to have been designed and built by human hands.

Myers: I would add that just because you can calculate the trajectory of an object with a computer doesn’t mean its movement is controlled by a computer. Calculable does not equal calculated. The laws of thermodynamics seem to specify the behavior of atoms, for instance, but that does not imply that there is a computer somewhere chugging away to figure out what that carbon atom ought to do next, and creating virtual instantiations of every particle in the universe.

To be fair to the simulation-pluggers, the simulation doesn't have to create every particle in the universe. It just has to create whatever it is you're looking at, at any given moment. For instance, the Eiffel Tower won't exist except in pictures until I go to Paris, which also doesn't exist until I go there. Because as long as we're saying that what I see is a simulation, as far as I know, I'm the only part of it with any "reality."

In the end, though, even if we were living in a simulation, so what? How would that change what you feel, what you do, what you think? If it does change these things, then we have a problem (see the "going around like a psychopath" thing above). If it doesn't change what you feel, do, or think, then... like I said... so what?
March 11, 2021 at 12:04am
#1006171
So, okay, I'm still not completely coherent, but I'm not going to break my streak now.

Why a Traffic Flow Suddenly Turns Into a Traffic Jam  
Those aggravating slowdowns aren’t one driver’s fault. They’re everybody’s fault.


I've always wondered about this, myself, and I've taken actual traffic engineering classes.

Few experiences on the road are more perplexing than phantom traffic jams.

I can think of a couple, like when cops pull you over because you have out of state tags and therefore they figure you're not going to come back to contest the ticket.

Because traffic quickly resumes its original speed, phantom traffic jams usually don’t cause major delays. But neither are they just minor nuisances. They are hot spots for accidents because they force unexpected braking. And the unsteady driving they cause is not good for your car, causing wear and tear and poor gas mileage.

Okay, that's a stretch.

In contrast, macroscopic models describe traffic as a fluid, in which cars are interpreted as fluid particles.

When I was in engineering school, I learned both fluid mechanics and traffic engineering. I noted then that the equations used to model traffic flow were the discrete versions of the continuous equations used to model fluid flow. Such an epiphany made me feel a whole lot smarter than I actually was.

This observation tells us that phantom jams are not the fault of individual drivers, but result instead from the collective behavior of all drivers on the road.

You know, there's a metaphor in there somewhere. Like... pick a societal problem. Any problem. Poverty, racism, global warming, whatever. It's no individual's fault, but the result of all of our collective behavior.

It's not a great metaphor, but there it is.

However, in reality, the flow is constantly exposed to small perturbations: imperfections on the asphalt, tiny hiccups of the engines, half-seconds of driver inattention, and so on. To predict the evolution of this traffic flow, the big question is to decide whether these small perturbations decay, or are amplified.

And this bit reminds me of the difference between laminar and turbulent flow. I can't be arsed to explain that right now. Google it.
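But if you'd rather poke at the decay-versus-amplification question yourself than Google anything, here's a toy car-following simulation in Python -- an "optimal velocity"-style model on a ring road, with parameters I made up for illustration, not anything from the article. One driver briefly taps the brakes; whether that tiny disturbance dies out or grows into a stop-and-go wave depends on the driver-sensitivity parameter a.

import math

# Toy model: no lane changes, no collision handling -- illustration only.
N, L = 30, 300.0                 # 30 cars on a 300 m circular road
a, dt = 0.5, 0.1                 # driver sensitivity (1/s), time step (s)

def desired_speed(gap):
    # drivers aim for a higher speed when the gap to the car ahead is larger
    return 15.0 * (math.tanh(0.1 * gap - 2.0) + math.tanh(2.0)) / 2.0

x = [i * L / N for i in range(N)]        # start evenly spaced...
v = [desired_speed(L / N)] * N           # ...at the equilibrium speed
v[0] *= 0.9                              # one driver briefly slows down

for step in range(1, 3001):
    gaps = [(x[(i + 1) % N] - x[i]) % L for i in range(N)]
    accel = [a * (desired_speed(g) - v[i]) for i, g in enumerate(gaps)]
    v = [max(0.0, v[i] + accel[i] * dt) for i in range(N)]   # no reversing
    x = [(x[i] + v[i] * dt) % L for i in range(N)]
    if step % 750 == 0:
        print(f"t = {step * dt:5.0f} s   speed spread = {max(v) - min(v):.3f} m/s")

With these made-up numbers the disturbance should amplify rather than fade; raise a (more attentive drivers) and it should decay instead. That's the discrete cousin of the laminar/turbulent distinction.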

Besides being an important mathematical case study, the phantom traffic jam is, perhaps, also an interesting and instructive social system.

Like I said. Metaphor. I mean, really, you have to read the article to get the full effect; I've had a few beers tonight and thus I'm probably not making nearly as much sense as I think I am. But this one's been hanging out in my queue for a while, and I'm pretty sure that I'm saying what I wanted to say about it.

Hopefully tomorrow I'll be more clear, but right now there's a turbulent traffic jam in my neurons.
March 10, 2021 at 12:46am
#1006112
I don't have much to say today. Yesterday was the first time in months that the thermometer hit 70F and I was just so happy to feel it (anything below 70F is "cold" and anything below 55F is "freezing") that I exhausted myself playing video games outside.

The next few days should hit 70+ too, but then it's back to freezing for a while. Oh well. Such is March in Virginia. At least there haven't been gale-force winds as per usual. Yet.

Speaking of predictions...

To Make Sense of the Present, Brains May Predict the Future  
A controversial theory suggests that perception, motor control, memory and other brain functions all depend on comparisons between ongoing actual experiences and the brain’s modeled expectations.


Gosh, it's almost like "living in the present" is impossible, isn't it?

Okay, fine, it's not settled science, but it's a remarkable approach to an old problem -- and might even give us some insight into consciousness. And it might even explain compelling writing, as you'll see below.

I'm not going to quote a lot of it because, like I said, exhausted. But this bit is the idea in brief:

According to this “predictive coding” theory, at each level of a cognitive process, the brain generates models, or beliefs, about what information it should be receiving from the level below it. These beliefs get translated into predictions about what should be experienced in a given situation, providing the best explanation of what’s out there so that the experience will make sense. The predictions then get sent down as feedback to lower-level sensory regions of the brain. The brain compares its predictions with the actual sensory input it receives, “explaining away” whatever differences, or prediction errors, it can by using its internal models to determine likely causes for the discrepancies. (For instance, we might have an internal model of a table as a flat surface supported by four legs, but we can still identify an object as a table even if something else blocks half of it from view.)

This may sound all esoteric and shit, but I felt like some of this is relevant to me as a writer. See, I've long suspected that the key to interesting writing is to be unpredictable, to use words that don't quite dishrag. This is most obvious with comedy, but giving people unexpected combinations of words makes them stop and think.

The researchers observed a greater brain response when the study’s subjects came across the unexpected word “dog,” characterized by a specific pattern of electrical activity, known as the “N400 effect,” that peaked approximately 400 milliseconds after the word was revealed. But how to interpret it remained unclear. Was the brain reacting because the word’s meaning was nonsensical in the context of the sentence? Or might it have been reacting because the word was simply unanticipated, violating whatever predictions the brain had made about what to expect?

This may, in fact, explain the otherwise inexplicable attraction people have to James Joyce. Or even jazz, which delights in playing the exact wrong note next.
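Just to make "prediction error" concrete -- this is my own toy illustration, with probabilities I invented, not anything from the study -- surprise is often measured as -log(probability), so an unanticipated word simply scores much higher:

import math

# Made-up next-word probabilities for "I take my coffee with cream and ..."
predicted = {"sugar": 0.70, "milk": 0.25, "dog": 0.001}

for word, prob in predicted.items():
    surprise = -math.log(prob)           # information-theoretic "surprisal"
    print(f"{word:>6}: surprise = {surprise:.2f}")

The bigger the mismatch between prediction and input, the bigger the response -- whatever the brain's actual bookkeeping turns out to be.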

The rest of the article is long and somewhat technical, but between the connection to writing and the theory's apparent refutation of my hated "living in the present" philosophy, I found at least part of it relevant.

Perhaps I'll have more to say tomorrow, or perhaps cat.
March 9, 2021 at 12:01am
#1006051
Warning: Math entry.

Well, actually a philosophy entry.

Philosophy of math.

Some Math Problems Seem Impossible. That Can Be a Good Thing.  
Struggling with math problems that can’t be solved helps us better understand the ones we can.


Considering that simple arithmetic seems impossible to a lot of people, that's saying a lot.

Construct a convex octagon with four right angles.

It probably says a lot about me as a teacher that I assign problems like this. I watch as students try to arrange the right angles consecutively. When that doesn’t work, some try alternating the right angles. Failing again, they insert them randomly into the polygon. They scribble, erase and argue. The sound of productive struggle is music to a teacher’s ears.


Oh, that kind of teacher.

Finally someone asks the question they’ve been tiptoeing around, the question I’ve been waiting for: “Wait, is this even possible?”

This question has the power to shift mindsets in math. Those thinking narrowly about specific conditions must now think broadly about how those conditions fit together. Those working inside the system must now take a step back and examine the system itself.


I've known people who pride themselves on "outside-the-box" thinking. Without exception, every one of those people was utterly lousy at inside-the-box thinking. Otherwise known as "thinking."

It is true that breakthroughs happen when people think beyond the stated parameters of a problem. However, it is at least as important to acknowledge the constraints of the parameters.

Consider, for example, the nine-dots puzzle, which was my -- and probably a lot of people's -- first encounter with lateral thinking. I have only a very limited ability to post images here, but a full description of it, with illustrations (including solution), is provided at this Wikipedia page.

From that entry: "The puzzle proposed an intellectual challenge—to connect the dots by drawing four straight, continuous lines that pass through each of the nine dots, and never lifting the pencil from the paper. The conundrum is easily resolved, but only by drawing the lines outside the confines of the square area defined by the nine dots themselves."

The solution, as noted, relies on not seeing the outer boundary of the nine-dot square as a constraint: you have to draw the required lines beyond that imagined border. However, the constraints posed in the instructions are inviolable; that is, "connect," "four," "straight," "continuous," etc. One might very well wonder why such constraints exist, and what practical use such constraints might have. Violating these necessary parameters is what most people do when they smugly "think outside the box." It's pernicious.

Anyway, the article goes on to explain why the octagon problem posed in its first sentence is, in fact, impossible (assuming a flat Euclidean plane). There's a bit of math involved, but merely geometry. You can probably skim those parts and still get a good idea what the article's about.
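For the skimmers, the gist of the impossibility argument fits in three lines (this is my own paraphrase of the geometry, not a quote from the article):

The interior angles of any octagon sum to (8 - 2) x 180° = 1080°.
Four right angles account for 4 x 90° = 360°, leaving 1080° - 360° = 720° for the remaining four angles.
That's an average of 720° / 4 = 180° apiece, but "convex" requires every interior angle to be strictly less than 180° -- so at least one of those four angles would have to break convexity. Hence, impossible.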

I'm talking about this mainly because, at least at the time I plopped the article into my blog fodder stash, I'd been seeing an uptick in people going, "Nothing is impossible! Everything is possible if you try."

And that's utter codswallop.

As noted in the article:

To consider impossibility, we need to understand that just asserting that a thing exists doesn’t make it so.

A well-known example is: Consider an omnipotent being. Can such a being create a boulder so heavy that even the being itself cannot lift it?

If the answer is "Yes," because the being is, after all, omnipotent, then from the question itself, the being cannot lift the stone -- but that's impossible because the being can do anything (which is the plain definition of omnipotence).

On the other hand, if the answer is "No," then the being is not omnipotent after all.

Conclusion: Omnipotence is impossible.

And yet, we can construct the sentence: "An omnipotent being can create an object so heavy that even the being itself cannot lift it."

Many more things are possible in language than in practice. Is "everything" possible in language? I can't answer that, because I'd have to use language itself to do it.

Math is a kind of language, too -- one that is far more precise than plain English. You can say almost anything and make it look like math (for example, "2+2=143"), but saying it doesn't make it true. Mathematics is based on rigorous proofs, including proofs of impossibility. Language isn't.

I do think language has its limitations; I have yet to encounter a satisfactory time-travel story, for example, and whenever we start talking about infinity, people can't grasp it. But that doesn't mean that either is impossible.

And yet, I will generalize from the article's assertions about the impossible: it's in thinking about the impossible, in any subject, that we can find solutions to problems that are merely difficult. I personally think that this is one of the main purposes of fiction writing.

For a long time, mathematicians thought that there were no numbers that, when squared, yield a negative number. But such numbers turn out to be useful, so they simply invented new ones: the so-called "imaginary" numbers that I ramble on about in this blog's intro that no one ever reads. So yes, we might find that some things thought to be impossible are actually trivial, in math or in philosophy or in other areas.

But other things will always remain imaginary, thus keeping fiction writers in business for the foreseeable future.
March 8, 2021 at 12:01am
#1005993
Today's science lesson won't mean anything to anyone these days, but when I first found it about a year ago, it was relevant.



Though it might one day be relevant again.

The world is obsessed with fad diets and weight loss, yet few of us know how a kilogram of fat actually vanishes off the scales.

And normally, I'd Stop Reading There. "The world" includes a whole lot of people who only wish they had an overabundance problem; not all diets are fad diets; not all weight loss is desired; not all weight loss is fat loss; and, perhaps most of all, it's not the goddamn scales that are losing weight.

But if you can make it past the stupid-ass lede, there's some decent information in there. I'd always wondered, myself, where "lost weight" actually went. I mean, there are only five means I know of to physically shed mass: pee, poo, sweat, dead skin cells, and breath. I suppose we could count "tears" as a sixth, but I've only known one person who cried enough to actually lose weight doing it, and she was absolutely no fun to be around. Or maybe it all goes into hair and disappears when we shed or cut it? Or, hey, I know, it goes into boogers that eventually get picked when hopefully no one's looking? Okay, fine, more than five. Still, which one is it? Or is it a bit of all of them?

Even the 150 doctors, dietitians and personal trainers we surveyed shared this surprising gap in their health literacy. The most common misconception by far, was that fat is converted to energy. The problem with this theory is that it violates the law of conservation of matter, which all chemical reactions obey.

That's not quite right. It's conservation of mass-energy, because mass and energy are the same entity. But only a tiny, tiny fraction of mass gets converted into energy in the body; most of it gets converted into some other substance. Also, from what I understand, "conservation" laws aren't really physical laws like gravity or action/reaction; they're observations with no known exceptions.

Some respondents thought fat turns into muscle, which is impossible, and others assumed it escapes via the colon.

It is possible to lose fat and gain muscle simultaneously, but those are entirely different processes. And I learned early on to only weigh myself after taking a great big shit.

The correct answer is that fat is converted to carbon dioxide and water. You exhale the carbon dioxide and the water mixes into your circulation until it’s lost as urine or sweat.

And we're done here. Didn't even have to slog through the entire article to find the answer. Of course, the rest elaborates on it.

If you lose 10kg of fat, precisely 8.4kg comes out through your lungs and the remaining 1.6kg turns into water. In other words, nearly all the weight we lose is exhaled.

So, basically, by losing weight we're actually contributing to greenhouse gases and climate change. Therefore, we should be celebrating fat people, not shaming them!
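If you want to check the bookkeeping yourself: the numbers come from oxidizing a "typical" triglyceride, with C55H104O6 as the usual stand-in formula, and the quick Python arithmetic below (mine, with rounded atomic masses, so treat it as a sketch) reproduces the totals.

# C55H104O6 + 78 O2  ->  55 CO2 + 52 H2O   (complete oxidation of fat)
C, H, O = 12.011, 1.008, 15.999            # rounded atomic masses (g/mol)
fat = 55 * C + 104 * H + 6 * O             # ~861 g/mol
o2, co2, h2o = 2 * O, C + 2 * O, 2 * H + O

mol = 10_000 / fat                         # moles of triglyceride in 10 kg
print("O2 breathed in  :", round(mol * 78 * o2 / 1000, 1), "kg")
print("CO2 breathed out:", round(mol * 55 * co2 / 1000, 1), "kg")
print("water produced  :", round(mol * 52 * h2o / 1000, 1), "kg")
# The 8.4 kg / 1.6 kg split quoted above counts only the fat's own atoms;
# the CO2 and water totals here also include the oxygen you inhaled.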

Every carbohydrate you digest and nearly all the fats are converted to carbon dioxide and water. The same goes for alcohol.

For the latter, lots and lots and lots of water.

The only thing in food that makes it to your colon undigested and intact is dietary fibre (think corn).

Ew! No! Gah, I can't unthink that!

So there it is. Nope, it's not diet advice. But if you've ever wondered exactly where that lost weight goes -- well, Now You Know.
March 7, 2021 at 12:02am
#1005950
Today, in Adventures in Nutrition Science, another about-face:

Great News, America: Cheese Isn’t Bad for You  
Don’t feel sheepish reaching for that manchego. Cheese doesn’t deserve its unhealthy reputation.


Well, at least the headline isn't in the form of a question.

Cheese is among the ultimate guilty pleasures. It’s gooey. It’s fatty. It’s delicious. It just has to be bad for you, right?

I mean, sure, cheese is good, but guilty pleasure? I hate that phrase anyway. Either enjoy something and wallow in it, or don't. No need to be coy about it.

Wrong. A large body of research suggests that cheese’s reputation as a fattening, heart-imperiling food is undeserved.

And this is why people have a distrust of science in general. Personally, I gave up on that field long ago, when they couldn't make up their cursed minds about eggs.

Just to be clear, nutrition science is wishy-washy like this because there are lots of hidden variables, variations among populations, suspect methodology, publication without peer review, incentives to find one result or another (that "chocolate is good for you" study was funded by chocolate companies, e.g.) and, most of all, a lot of really bad, breathless reporting about it all.

In the end, people extend their distrust of nutrition science to science in general, because your average person wants The One True Answer, which science can't give, but in other disciplines it's at least easier to approach. This is how you get shit like vaccine scares or climate change deniers.

“There’s almost no evidence that cheese causes weight gain—and in fact, there’s evidence that it’s neutral at worst,” says Dariush Mozaffarian, the lead author of the 2011 paper and dean of the Tufts Friedman School of Nutrition Science and Policy. “There’s no evidence that cheese is linked to cardiovascular disease, and in some studies, it’s even a little bit associated with lower risk. And then, for diabetes, again, it’s at worst neutral, and maybe protective.”

And so people end up back to believing whatever the hell they want to believe, and cherry-picking whatever study results back up their pre-existing beliefs. This particular quote I just pasted backs up my pre-existing belief, so I choose to believe it. Besides, even if someone posted "cheese will kill you dead," it wouldn't stop me from eating that delicious goodness. Because what's the point of living if you can't do the things that make life worth living, including eating cheese?

So just to recap so far: The evidence tends to show that cheese does not make you gain weight. Why hasn’t this amazing news spread more widely?

Probably because no one has bothered to put it into commercials or fund PR campaigns, the way they do with studies about booze or chocolate.

In the case of cheese, there could be several reasons for the surprising lack of impact on weight, though more research is needed. Cheese is fermented, meaning it has live bacterial cultures. That could have a positive effect on the gut microbiome, which appears to play a role in weight regulation. The fermentation process also creates vitamin K2, or menaquinone, which experimental studies have linked to improved insulin function. Insulin regulates blood sugar levels, hunger, calorie expenditure, and fat storage. (One upshot is that hard, aged cheeses, which are more fermented, probably provide more benefit than soft, less fermented ones.) There’s also some promising research about the benefits of a compound called the milk-fat globule membrane, which is intact in cheese but not in milk or butter.

All of these things are speculations requiring further research.

Other high-fat foods, like avocados, have lately enjoyed a reputational rehabilitation. Cheese, not so much. One reason may be the fact that cheese contains not just a lot of fat but a lot of saturated fat, a major dietary scofflaw linked to higher blood cholesterol and rates of cardiovascular disease. But here, too, the science says cheese doesn’t deserve its stigma. While cheese does have high saturated fat, that doesn’t appear to correlate with higher risk of heart disease.

I should probably make the point here (I didn't see it in the article) that they're talking about cheese. You know... real cheese? Not Cheez-Whiz or Chee-Tos and definitely not the plastic abomination labeled "Pasteurized Process Cheese Food," like a certain well-known brand of "singles."

Anyway, it's worth reading the linked article, if only to see for yourself some of the lines of research that support the conclusion. I still cringe at the writing style, but hey, not everyone can be me.

And while I'm on the subject of cheese, I thought I'd share my favorite dessert. No, it's not cheesecake (which is delicious, but it would take a lot to convince me that it's in any way "good for you"). This particular combination of flavors synergizes well, and, most of all, it's dead easy to prepare.

All it takes is:
Pears
Walnuts
Stilton cheese
Port wine, preferably tawny port.

I indulged in this deliciousness yesterday, in fact. And now with the article I linked above, I can't even claim it's just me enjoying something I shouldn't, as all of those things are now... Good For You.
March 6, 2021 at 12:03am
#1005891
Formula for annoying articles:

1) Pick something that everyone thinks is a good thing
2) Explain why that something is not a good thing

Like, say: 1) You may think you're helping the environment by recycling, 2) But in reality, the amount of energy used negates any benefit.

I don't know if that's true or not; I just pulled it out of my ass. But this one, I pulled off the internet from two years ago:



And it only gets worse from there.

Cleaning out the closet for most millennials goes like this: You slough through the items you haven’t worn in the past couple of months or longer. Pack them into a tote you don’t need but keep around perhaps for moments like this. Sort the pieces you think you can get some money from at Buffalo Exchange or Beacon’s Closet if you live in New York City. Then, you bring the rest to your local Goodwill or Salvation Army and donate it.

Honestly, they lost me at "cleaning out the closet;" never mind that everything else in that paragraph is completely irrelevant to me. I have clothes from 25 years ago that I still wear. (Not underwear; shut up.) This idea that you have to get rid of shit you haven't used in two months is a pernicious lie meant to force you to spend more money to Buy More Stuff.

What actually happens to your donated clothes is a very involved process with a lot of complicated layers, each worth taking the time to understand. Let’s start here: Contrary to popular (naive) belief, less than 20 percent of clothing donations sent to charities are actually resold at those charities.

...and?

Almost half of the donations will be exported and sold in developing countries, while the other half will be recycled into rags and household insulation.

...AND??

The U.S. sends away over a billion pounds of used clothing per year, and a lot of those excess textiles are sent to East African countries like Kenya, Rwanda, and Uganda, each of which has received so much that some have proposed banning imported used clothing.

I'm still wrapping my head around the idea of not wearing an article of clothing until it wears out before getting rid of it as a rag.

To be fair, the article makes some mention of that later, but honestly, after this point, I just started skimming it. It doesn't help that their color choices absolutely suck, to say nothing of the implicit assumptions and writing style.

What you don’t want to do (ever, ever, ever) is throw away your clothes. You’ve probably already done that though, haven’t you? The average U.S. citizen throws away around 80 pounds of clothing and textiles annually.

Damn right I've done that. And I will continue to do that. Nowhere near 80 pounds a year, though. Not even close. No one is going to want to repurpose or recycle my used briefs.

Plus, at the end of the day, we’re the problem here. We’re the ones whose actions need to be questioned and challenged.

Because it's All Your Fault and You Should Be Ashamed of Yourself and It's In No Way The System's Fault.

No, "we" are not the problem, and "they" should stop trying to make us feel guilty about every goddamn little life choice. I swear it's all designed to keep us off-balance and neurotic so we'll be more susceptible to advertising.
March 5, 2021 at 12:02am
#1005837
Yes, I'm aware that it's March, and if I'm going to talk about holidays, I should be talking about St. Patrick's Day, Pesach, Easter, or April Fools' Day. But this Cracked article has been hanging around in my queue since November and I'm finally getting to it.



Well, they're not "unspoken" NOW, are they? Okay, I suppose technically they are because it's an article and not a tedious video. So read it out loud to yourself.

Just pretend you're back in last December. It's cold, interminable earworms are everywhere, people are still arguing about the US Presidential election, and the pandemic is still going on.

Oh, wait, all of that is still the case.

Well, it's the holidays, and I don't feel anything. And it's not because of the pandemic or because I've finally been banned from the mall for trying to fight Santa so that I can become Santa: this has been building for years.

Unlike the author, I do feel something around the holidays: the urge to drink. Oh, wait, that's still the case, too.

For my entire adult life I've experienced this weird disconnect about Thanksgiving and Christmas. If you're a millennial, maybe you've felt or feel the same way I do -- I've never seen any kind of media discuss this disconnect, probably because so much money is spent by Big Holiday's self-mythologizing, and I'm only half-joking here.

I'm not a millennial, but yeah. I present to you reason #3452 why the concept of generations is seriously flawed.

What I mean is this: an astounding amount of cultural capital is spent, roughly from mid-August to January 1, extolling the virtues of the "holiday season" as a time of happiness, cheer, magic, and good will, when the holiday season has, in fact, been a time of hardship and insane working conditions.

On the other hand, I'm fully aware that my extravagant, luxurious, idle lifestyle depends on a lot of (mostly) younger people working long hours for shit pay, under onerous conditions, but that is a sacrifice I am willing to make.

And now, in trademark Cracked style, the numbered list.

5 So Many Americans Work On Christmas

There is, statistically, a good chance you already know what I'm talking about. But before I rant further, a quick note: I'm talking about Thanksgiving and Christmas, but I imagine you've experienced this no matter what winter-adjacent holiday you might celebrate simply by dint of what a cultural colossus Christmas is. Be it Kwanzaa or Hanukkah or Skweltegog: The Feast of the Undying Doom Serpent, you've probably felt the effects I'm going to describe.


I do believe I will have to begin celebrating Skweltegog.

And you know what? Despite my general cynicism, to quote my high school drama teacher at the casting of our winter play, I'm no Scrooge.

Bah.

Retail has a higher concentration of millennials than even craft beer microbreweries and the Nihilism Supply Store.

I'm just quoting this here because I howled.

4 Thanksgiving Is Actually A Weeks-Long Affair

And if you're lucky enough to have the sort of retail job where you get time off for the holidays, let me lay it out for you: Thanksgiving and Christmas are basically one long nightmare. Starting about two weeks before Thanksgiving is prep time for Black Friday, the American blood sacrifice to a mysterious entity known as "The Economy."


I think that's not the only blood sacrifice for that entity.

And while you're working on Thanksgiving, every single year, more reliable than the sunrise, a customer will cluck their tongue and say "They make you work on Thanksgiving? That's awful!", because no snowflake in an avalanche feels responsible.

Look -- I may be a Grumpy Old Man, but I avoid shopping on Thanksgiving or Black Friday on general principles, and the only reason I go anywhere on Christmas is to get drunk and see a movie -- at which point I tip well and never mention how much it sucks for my servants to be working on Christmas.

3 Black Friday

You might be thinking, "I'm sure Black Friday is busy, but it's just one day. How bad could it be?" To which I reply: "When you're in your private jet, can you force the pilot to let you sit in the cabin and pretend you're flying an X-Wing?"


I don't know, but I would like to find out.

Something happens to people on Black Friday where a brain parasite makes them scream "YOU'RE RUINING MY LIFE" at you when you try to explain to them that we don't carry "hot pink velour suit jackets" because we're not a 1987 cocaine orgy.

And this is why I never go anywhere on Black Friday. Except maybe bars.

People behave like werewolves except worse because I don't think a werewolf has ever told me that if I didn't like the way they were treating me I should try being smarter and getting a real job. Once, on Black Friday, I was ringing up a woman in a GOOD VIBES ONLY t-shirt and her young kid and the kid asked, "Mommy, why is he behind a counter?" and she said, "Because he didn't work hard in school."

I sorely wish there were actual consequences for people like this. Also, I have a lot to say about the concept of "hard work," but I'll save that for a future blog entry because I'm lazy.

2 And It Just Keeps Going At A Grueling Pace For Weeks

After Black Friday comes Cyber Monday, and since many chain retail stores match their Cyber Monday deals in-store so as not to lose business at their brick-and-mortar locations, it's basically a second Black Friday. This is the beginning of the Christmas Rush.


The beginning of the Christmas Rush is somewhere around Labor Day.

You might think that as Christmas gets closer people would have most of their shopping done. You would be incredibly, hilariously wrong.

Why would I think that? In addition to being lazy, I'm a professional crastinator.

And there's a solid two-ish weeks of people returning the things well-meaning but clueless family members purchased for them and using gift cards that were given to them as gifts, an interaction which always ends in me having to explain that yes, if you use a $50 gift card on a $300 suit jacket, you do still have to pay the difference. Oh, the gift card our store sent you in the mail doesn't actually have any money on it, the idea is we send it to you to put money on and give as a gift. Yes, I'll give you my manager's phone number so you can get me fired for this decision that I personally made about a nationwide advertising campaign.

You know, I'm aware that most people aren't actually stupid or inconsiderate. But in accordance with Lone Asshole Theory, all it takes is one to ruin someone's day.

1 Changing Any Of This Will Take An Impossibly Seismic Cultural Shift

My experience isn't unique. That's my whole point. Besides the insane demands of work, many youngish people moved from their hometowns to seek opportunity, so they aren't near their families anyway. It seems like we've created a shadow caste system where the holidays are just another thing that exclusively older people enjoy, like home ownership or going to the doctor or Steely Dan.


HEY NOW. Don't rag on the Dan.

There's a bit of irony that the possible death of Christmas was perpetrated by mass capitalism when right-wing ghouls like Ben Shapiro (a wooden puppet brought to life by the wish of a lonely racist) have been claiming that the left has been waging a War on Christmas.

I've been trying to get myself drafted into the War on Christmas for years, but it's like it doesn't really exist.

I don't know, maybe it's not a great sign when we're nostalgic for the state of worker's rights in Victorian London, a time when gainful employment was considered "guy who cleans the orphan fingers out of the Pollution Machine and is paid in bread crusts deemed insufficiently rich in mercury to sell."

Hey, thanks for the story idea. I'm thinking steampunk novella with a touch of the supernatural and a doomed romantic subplot.

The Grinch knew what was up: just your average mountain hermit trying to warn his neighbors that the winter celebration they so delighted in was doomed to cannibalize itself.

So we can add "The Grinch Who Stole Christmas" to the list of newly-censored Dr. Seuss stories.

Anyway, I hope y'all haven't minded this one-time intrusion of December into March. I'm calling it a flashback, because I refuse to be responsible for anyone thinking that this is the first volley of this year's Christmas season. Even in my wildest attempts at hyperbole, I never implied that Christmas creep could work its way into the weeks leading up to the Spring Equinox.

But I don't know. Maybe I should.
March 4, 2021 at 12:01am
#1005764
Entry #8 of 8 for "Journalistic Intentions" [18+]

*Quill* Battle of Angels


One of the most amusing typos I see is when someone is trying to say "angel" and they end up saying "angle."

I mean, that cracks me right up. Every. Time.

"You're a real angle!"

"She's my guardian angle."

"My baby is such a little angle!"

I know I shouldn't make fun. But I do.

It's not like I'm trying to be obtuse, but I just think it's really acute thing to do. Though my reaction is just not right.

It doesn't help that there was historically a people called the Angles. They migrated from Jutland or some shit like that to an island off the north coast of Europe, where they eventually merged with the Saxons but still gave their name to part of the island: England, a shortening of Angle-land. Hell, in French it's still called Angleterre.

From what I hear, though, they were no angels, often at war, engaging in pointless (and sometimes worthwhile) battles, and eventually conquering more than half the land area of the globe, before finally backing off from a lot of that. But, you know, angels were no angels either; if you really look at the mythology surrounding them, those are some scary-ass motherfuckers. Not the fluffy haloed humanoids with wings that somehow made it into popular culture.

The Angles' language lives on, though. Nearly half the world still speaks it, to one degree or another. (Yes, to be clear, that's totally another "angle" pun.) But it's nothing like Enochian, the purported language of the angels. No, it's way easier to make puns in Angle-ish, and all too easy to mistake similarly-spelled words for each other.

But that's part of the joy of English, you know. I don't think I'd have nearly as much fun punning in another language. After all, when it comes to humor, that's my angle.


 
March 3, 2021 at 12:05am
#1005709
Sometimes it seems that the entire purpose of mystics and philosophers is to twist logic around until what's real becomes illusion, and what's make-believe becomes real.

The problem of now  
The injunction to immerse yourself in the present might be psychologically potent, but is it metaphysically meaningful?


But sometimes, one comes along and puts things back into perspective.

What I will dub the ‘connection thesis’ is a central claim of various spiritual practitioners, authors, lecturers and workshop leaders; it’s the contention that we should focus our full attention on the present moment precisely because of its singularity.

After which the author meticulously dismantles the entire idea, and it is glorious.

The singularity of the now might appear to be a deep and profound insight. It’s the springboard for various more practical strategies for achieving enlightenment and self-enhancement. But the claim that it is always now is so trivial that it can’t support any interesting inference, and there are other ways of justifying these same strategies and practices.

Essentially, it's the temporal equivalent of saying, "Wherever you go... there you are."

The article builds logical structures that I can't do justice to in this blog post; best to actually read it if you're interested in this sort of thing.

I'll just add a few of my own observations about the pervasive "now" idea:

1) By the time you perceive something, it's already in the past (even if only microseconds); there is no "now." Yeah, I know I've said this before but it's relevant.

2) Memory isn't a linear recording like a cassette tape (if you remember those). It's more holographic. When you access a memory, your perception focuses on the memory. It may or may not be a faithful re-creation of the past; most times, it isn't. But the important point is that the memory comes to your attention in something very close to the present moment. In other words, it would be part of the "now" if there were such a thing. And since it's not filtered through our senses, it's even closer to the elusive "now" than anything external that we perceive.

2a) A similar thing applies to thoughts about the future.

3) Failing to plan for the future or consider the consequences of our actions or inactions is objectively a Bad Idea. Living all the time "in the present" is self-defeating.

4) Being able to learn from the past and project those learnings into the future is a big part of what makes us human. To ignore everything except the tiny temporal slice of perceptions that most recently happened... well, that denies a huge part of humanity.

As I've said before (I think), I understand that one can obsess too much over what was and what might be, so it makes sense that sometimes you want to focus on what's here before you. Like if you're with a friend or lover, presumably you don't want to be thinking about a different friend or lover when interacting with them. And you probably want to do some things "now," or in the near future, rather than waiting for some nebulous far future time when you think conditions might be better.

Most importantly, though, I think it's a mistake to swap reality and fiction. There are levels of reality -- ideas are real in a sense, thoughts are real, even fictional characters have a particular kind of reality. But I'm pretty sure that the reality is that anyone who tries to tell you "everything you perceive is an illusion" is blowing smoke up your ass and probably wants some of your money, which has an entirely different kind of reality.

The singularity of the present is emphasised by influential spiritual teachers and it appears, at first, to be a profound truth. It is, however, superficial and not substantive enough to support the inference to immersion in the now. From the perspective of philosophy of language, we can see the impotence of now. From that of philosophy of time, we see that presentism doesn’t, in itself, have any implication for psychology, and eternalism is a non-starter.

And sometimes bullshit seems profound, but upon stepping in it, you find that it is, indeed, bullshit -- which nevertheless has the advantage of making an excellent fertilizer.
March 2, 2021 at 12:21am
#1005616
Today's article is a perfect storm for me. Astronomy, crackpottery, and the rambling postmodern writing style of The New Yorker.

Have We Already Been Visited by Aliens?  
An eminent astrophysicist argues that signs of intelligent extraterrestrial life have appeared in our skies. What’s the evidence for his extraordinary claim?


The default answer to any headline that asks a question is always "no."

The article begins, in typical TNY style, with the history of the discovery of ‘Oumuamua -- you know, that weird interstellar object that blew through our solar system a few years ago. The article itself, though, is only a couple of months old.

As astronomers pored over the data, they excluded one theory after another. ‘Oumuamua’s weird motion couldn’t be accounted for by a collision with another object, or by interactions with the solar wind, or by a phenomenon that’s known, after a nineteenth-century Polish engineer, as the Yarkovsky effect. One group of researchers decided that the best explanation was that 1I/2017 U1 was a “miniature comet” whose tail had gone undetected because of its “unusual chemical composition.” Another group argued that ‘Oumuamua was composed mostly of frozen hydrogen. This hypothesis—a variation on the mini-comet idea—had the advantage of explaining the object’s peculiar shape. By the time it reached our solar system, it had mostly melted away, like an ice cube on the sidewalk.

Actually, the correct term would probably be "sublimated," like dry ice in the sun.

Now, to me, there's only one thing more wrong than jumping to the conclusion that a strange interstellar object is the product of alien technology, and that is completely discounting that as a possibility. It's important to keep an open mind. That said, the "alien technology" hypothesis would require a much higher level of support, as it is, indeed, an extraordinary claim.

By far the most spectacular account of 1I/2017 U1 came from Avi Loeb, a Harvard astrophysicist. ‘Oumuamua didn’t behave as an interstellar object would be expected to, Loeb argued, because it wasn’t one. It was the handiwork of an alien civilization.

While Loeb certainly has credentials, that doesn't mean he's necessarily right.

In an equation-dense paper that appeared in The Astrophysical Journal Letters a year after Weryk’s discovery, Loeb and a Harvard postdoc named Shmuel Bialy proposed that ‘Oumuamua’s “non-gravitational acceleration” was most economically explained by assuming that the object was manufactured.

That's not how Occam's Razor actually works.

“No, ‘Oumuamua is not an alien spaceship, and the authors of the paper insult honest scientific inquiry to even suggest it,” Paul M. Sutter, an astrophysicist at Ohio State University, wrote.

That, too, is a specious argument -- again, while unlikely, I don't think it's wise to rule it out entirely.

“Can we talk about how annoying it is that Avi Loeb promotes speculative theories about alien origins of ‘Oumuamua, forcing [the] rest of us to do the scientific gruntwork of walking back these rumors?” Benjamin Weiner, an astronomer at the University of Arizona, tweeted.

This, though I despise Twitter with the fire of a thousand suns, makes sense.

Loeb has now dispensed with the scientific notation and written “Extraterrestrial: The First Sign of Intelligent Life Beyond Earth” (Houghton Mifflin Harcourt). In it, he recounts the oft-told story of how Galileo was charged with heresy for asserting that Earth circled the sun.

I will reiterate here that I'm not going to rag on anyone for promoting their book. However, as far as I'm concerned, anyone who invokes Galileo immediately gets thrown into the "crackpot" bin. Galileo had hard proof, and was facing execution from a theocratic establishment. Literally no one is saying that Loeb should be tortured and executed for his beliefs, and few scientists are going to come out and say, unequivocally, "technology-using aliens categorically do not exist." All they're saying is that we need more proof than just "none of our other hypotheses fit."

In “Extraterrestrial,” Loeb lays out his reasoning as follows. The only way to make sense of ‘Oumuamua’s strange acceleration, without resorting to some sort of undetectable outgassing, is to assume that the object was propelled by solar radiation—essentially, photons bouncing off its surface. And the only way the object could be propelled by solar radiation is if it were extremely thin—no thicker than a millimetre—with a very low density and a comparatively large surface area. Such an object would function as a sail—one powered by light, rather than by wind. The natural world doesn’t produce sails; people do. Thus, Loeb writes, “ ‘Oumuamua must have been designed, built, and launched by an extraterrestrial intelligence.”

Sometimes I forget that other people haven't been breathing, drinking and eating science fiction their entire life the way I have. Solar sails are still mostly experimental -- a few small demonstration craft have actually flown -- but the physics behind them is sound.
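A quick back-of-the-envelope (my numbers, purely illustrative) shows why "extremely thin" is the operative requirement -- sunlight just doesn't push very hard:

solar_flux = 1361.0                 # W/m^2 at Earth's distance from the Sun
c = 2.998e8                         # speed of light, m/s
pressure = 2 * solar_flux / c       # perfect reflector: ~9e-6 N per m^2

areal_density = 0.001               # kg/m^2 -- an assumed, gossamer-thin sail
accel = pressure / areal_density
print("radiation pressure:", round(pressure * 1e6, 1), "micronewtons/m^2")
print("sail acceleration :", round(accel * 1000, 1), "mm/s^2")

Nine-ish micronewtons per square metre amounts to nothing unless the object carries almost no mass per unit area, which is why the argument leans so hard on the millimetre-thin, low-density geometry.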

Again in typical TNY fashion, the article goes on a side quest to talk about extrasolar planets.

The first planet to be found circling a sunlike star was spotted in 1995 by a pair of Swiss astronomers, Michel Mayor and Didier Queloz. Its host star, 51 Pegasi, was in the constellation Pegasus, and so the planet was formally dubbed 51 Pegasi b. By a different naming convention, it became known as Dimidium.

Can I just take a moment to note just how awesome the name "Dimidium" is? It's really a shame that the moniker got slapped on a planet that it's unlikely we'll ever be able to visit.

No one knows what fraction of potentially habitable planets are, in fact, inhabited, but, even if the proportion is trivial, we’re still talking about millions—perhaps tens of millions—of planets in the galaxy that might be teeming with living things. At a public event a few years ago, Ellen Stofan, who at the time was NASA’s chief scientist and is now the director of the National Air and Space Museum, said that she believed “definitive evidence” of “life beyond earth” would be found sometime in the next two decades.

The key word here is "believed." And I'll issue my usual proactive caveat: "Life" is not the same thing as "technologically capable sentients." Life on Earth evolved quite nicely for the better part of four billion years before one species started launching spaceships, and if that species disappeared, life would continue to evolve quite nicely. Evolution doesn't have an endgame, and those qualities that we call "intelligence" aren't the only survival traits; if you don't believe me, go look at a cockroach.

This article doesn't disagree with me here:

Assuming that there is, in fact, alien life out there, most of it seems likely to be microscopic. “We are not talking about little green men” is how Stofan put it when she said we were soon going to find it. “We are talking about little microbes.”

We are what we are because of a singular event in evolutionary history: the combination of two very different kinds of microbial life, a combination that vastly increased the energy-generating power of a cell. The development of eukaryotic life was what enabled complex organisms, such as cockroaches, birds, fish, and us, to develop.

On Earth, many animals possess what we would broadly refer to as “intelligence.” Kershenbaum argues that, given the advantages that this quality confers, natural selection all across the galaxy will favor its emergence, in which case there should be loads of life-forms out there that are as smart as we are, and some that are a whole lot smarter.

While it's tough to argue against this, and I don't want to even try, I will point out that intelligence doesn't automatically translate to technological sophistication. Few would argue, today, that dolphins aren't intelligent, and yet they show no proclivity -- nor do they have the anatomy -- to build complex structures and send some of those structures into space.

Sigh. I suppose I have to take a moment to make my usual plea: comments to the effect of "why are we looking for intelligent life out there when there's clearly none down here?" will be met with scorn and ridicule. The fact that we can make such comments immediately negates them. That joke was funny once, when Monty Python did it, and is now about as funny as seventh-planet puns.

This, in his view, opens up quite a can of interstellar worms. Are we going to accord aliens “human rights”? Will they accord us whatever rights, if any, they grant their little green (or silver or blue) brethren? Such questions, Kershenbaum acknowledges, are difficult to answer in advance, “without any evidence of what kind of legal system or system of ethics the aliens themselves might have.”

I have to wonder how a scientist can get so far into science without at least encountering science fiction, which has addressed these questions in myriad ways.

As disconcerting as encountering intelligent aliens would be, the fact that we haven’t yet heard from any is, arguably, even more so. Why this is the case is a question that’s become known as the Fermi paradox.

Look, I've discussed the Fermi paradox and the related Drake equation in here before, and I really don't feel like rehashing all of that. There's a search function for blogs here, and if you're interested in what I've said about them in the past, use "Fermi" or "Drake" to find earlier entries.

Or, hell, I'll save you the trouble:

"Occam's Drake

"No, It's Not 36.

"Alone

Hell, this one even mentions 'Oumuamua and Avi Loeb, but from a different source and several years ago: "Inalienable

I guess I've been harping on this shit for way too long, and I'm sure my regular readers are tired of it already. I have a crap memory; I remembered writing about it, but not exactly what I said, so it's not surprising that I've repeated myself.

So I won't belabor this much further, just quoting one more paragraph from the TNY article:

It’s often said that “extraordinary claims require extraordinary evidence.” The phrase was popularized by the astronomer Carl Sagan, who probably did as much as any scientist has done to promote the search for extraterrestrial life. By what’s sometimes referred to as the “Sagan standard,” Loeb’s claim clearly falls short; the best evidence he marshals for his theory that ‘Oumuamua is an alien craft is that the alternative theories are unconvincing. Loeb, though, explicitly rejects the Sagan standard—“It is not obvious to me why extraordinary claims require extraordinary evidence,” he observes—and flips its logic on its head: “Extraordinary conservatism keeps us extraordinarily ignorant.” So long as there’s a chance that 1I/2017 U1 is an alien probe, we’d be fools not to pursue the idea. “If we acknowledge that ‘Oumuamua is plausibly of extraterrestrial-technology origin,” he writes, “whole new vistas of exploration for evidence and discovery open before us.”

And I'm not saying we shouldn't speculate, or pursue the possibility of finding alien life or technology. It would be cool as shit if we did find some (even if they do end up wiping us out because we're a bunch of assholes). Just remember, for now, it's all in the realm of speculation.

*StarB* *StarB* *StarB*

Mini-Contest Results!


As always, I appreciated all of the comments from yesterday.

Kåre Enga in Montana, that's interesting about the number 7. I was aware of the Chinese superstition surrounding the number four, and of course our own Western equivalent concerning the number 13. I've stayed in Chinese-owned hotels without a fourth floor, and I've been in American-owned buildings without a 13th.

ForeverDreamer, I had an uncle who was in WW2 and was diagnosed with what was then called shell-shock. He was never quite right in the head, and he unfortunately died in 1992, having never really gotten the treatment he needed.

Grin 'n Bear It!, welcome!

Elisa the Bunny Stik, I haven't seen the Bean either, except in pictures, but the artist who created it really, really hates it when people call it the Bean (its official name is Cloud Gate, and it was created by professional asshole Anish Kapoor, and anything that pisses off Kapoor is okay in my book).

And out of those four, the die rolled a 1, so the MB goes to Kåre Enga in Montana this time!

Like I said, I'll do this again soon. Thanks again for the comments, everyone!
March 1, 2021 at 12:03am
#1005530
Finally, the despicable month of February is over. To celebrate, I thought I'd do another Merit Badge giveaway; see below. But first a word about a word.

Never Say Wolf  
How taboo language turned the wolf into a monster.


The article is fairly long, but intriguing to me because I've been fascinated by taboo words for a very long time.

The howl of the wolf, and the fear that accompanies it, sounds across millennia. Comparing the mythologies of cultures that descended from ancient European tribes, wolves loomed large in their minds. There are myths of heroes brought up by wolves, of great wolves who will devour the sun, of wolves guarding the underworld, and of warriors taken over by the spirit of wolves.

If you've ever seen a wolf -- or, on a cold winter night, heard one in the distance -- you can understand this.

Speakers had to find ways of referring to wolves without naming them. The word for wolf becomes taboo: It shouldn’t be said. Instead, the magic of summoning through a name can be tricked. By changing the sound of the word, by using another word, perhaps borrowed from another language, or by using a descriptive phrase rather than the word itself, speakers could talk of wolves, but avoid the dangerous word itself.

This article focuses on wolves, but there were similar taboos for other feared animals, notably the bear. If you can stomach another article about linguistic taboo, here's the story of the words used to describe ursines.

What fascinates me, though, is that it's not just the feared that become linguistic taboo. The sacred and the profane both get the same treatment. Notably, the original Hebrew pronunciation of the word for "God" has been lost to time. At the same time, we're always coming up with new ways to say curse words to get around people's sensibilities (such as "Holy forking shirtballs," made famous by the hilarious series "The Good Place").

Another example from modern times is the word "retarded." It's taboo in many circles now, but I have no problem using it. I understand that the word was coined as a value-neutral way of describing someone who's mentally challenged, or "slow" -- hence "retarded," from a Latin root meaning "to slow down." Before we had "retarded," there were three words with precise definitions: idiot, imbecile, and moron. I forget which is which now, but each one of those words was assigned to people who tested within a certain range of the below-average IQ scale. Then, being assholes, people started tossing those words around with great abandon, using them simply to describe someone they didn't like. "That guy's an idiot." "She's a moron." That sort of thing. "Retarded" was meant to replace those words, but of course, being assholes, we started using it to describe people and situations we didn't like.

Once that fell out of favor for that reason, they started using the word "special" as in "special education," the idea being, I suppose, that you take a word with positive connotations and maybe people will stop being assholes to the mentally challenged. And so, of course, now you can't call someone "special" without implying, deliberately or otherwise, that they're a moron.

My prediction is that there is no word or phrase that we can come up with to describe that condition which will not eventually become stigmatized.

And, of course, there's the word that cannot be uttered, the one that was used to dehumanize an entire group of people.

I came up with certain taboos myself. For instance, I almost never name the seventh planet -- you know, the one orbiting between Saturn and Neptune. This is because assholes just can't help making stupid puns about it, regardless of how it's pronounced (you have a choice between "Yer Anus" and "Urine Us," unless you want to get all Ancient Greek and pronounce it Our-Ahn-oos, at which point no one will know what the hell you're talking about).

Anyway. Point is, "wolf" is a case study for taboo word modification. You have a word, everyone knows it, but you don't want to bring down evil spirits (or public outcry, in modern times) so you use a different word. That one becomes the default word, but you don't want to bring down evil spirits, so you change the word again. And so on until some hypothetical future time when people stop being assholes. Or superstitious.

*StarB* *StarB* *StarB*

Merit Badge Mini-Contest!


I thought about asking for other examples of taboo words, but then I remembered that this blog is rated 18+, so most of the fun ones are off limits -- another variant of taboo against certain language. You can still do that in the comments if you want, if you can think of one that won't break the rating. Or you can talk about wolves or bears or planets or whatever. But since a) it's no longer February and b) at some point over the last few days this blog ticked past 100,000 views (a pleasant milestone, and I appreciate the support), all you have to do is comment below, today, for a chance. The comment should have *some* relevance (that is, just posting something like "I hope I win" won't count), and I'll pick a relevant comment at random and give the person a MB tomorrow.

As always, the deadline is midnight WDC time tonight, at the end of Monday March 1. And I'll do this again, so you'll always have another shot at it.


© Copyright 2024 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted Writing.Com, its affiliates and its syndicates non-exclusive rights to display this work.
