Rated: 18+ · Book · Personal · #1196512
Not for the faint of art.
Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
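For the curious, here's a minimal illustrative sketch in Python of one such transformation (the rendering parameters are arbitrary choices of mine): iterating z -> z*z + c over points c of the complex plane yields the most famous of these fractals, the Mandelbrot set.

```python
# Iterate z -> z*z + c for points c in the complex plane; the points
# whose orbits stay bounded form the Mandelbrot set.
# Pure Python, no dependencies.

def escape_count(c: complex, max_iter: int = 50) -> int:
    """Return how many iterations it takes |z| to exceed 2, or
    max_iter if the orbit never escapes (c is then likely in the set)."""
    z = 0 + 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return max_iter

# Coarse ASCII rendering: '#' marks points that never escaped.
for im_step in range(12, -13, -2):      # imaginary axis, top to bottom
    row = ""
    for re_step in range(-40, 21):      # real axis, -2.0 to 1.0
        c = complex(re_step / 20, im_step / 10)
        row += "#" if escape_count(c) == 50 else "."
    print(row)
```

Shrink those ranges onto any boundary region and ever-finer structure keeps appearing; that's the intricacy in question.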

May 28, 2023 at 9:35am
It's Sunday, so it's time to unearth an ancient fossil of a blog entry. This one's from early 2020 and references a Guardian article from half a year prior: "Poor You".

The article I referenced is still there, so context is easy to find.

The article, and my commentary, are pretty much just as relevant now as they were three years ago (though one might be tempted to wonder how the somewhat sheltered author navigated the pandemic): wages aren't growing much, prices are, and people are still people.

So, a few of my comments that might warrant additional explanation:

1) I'm only giving her a pass on the "gluten-free cookbook author" gig because she is apparently diagnosed with celiac.

What I meant by this is that, for a while, "gluten-free" was a fad, and I detest fads. Nevertheless, there are people for whom it's not a choice, but a necessity, and if the fad gave them more choices for a reasonably healthy lifestyle, great.

2) $15 an hour? Stocking groceries? I'm not saying that's a lot of money, but it's above average for unskilled workers.

I have no idea if that's still true in 2023, but for the time, I believe that to be correct.

And yeah, being broke, or close to it, damages one's mind. The crappy cheap food you have to buy if you're poor doesn't help with that.

This probably wasn't the best way I could have phrased that. I wasn't trying to imply that poor people are brain-damaged. I meant it as an indictment of the system that forces some people to be poor, not to rag on the victims of that system.

Tying health insurance to employment is good for employers. Not great for people who'd rather spend their time running a small business.

By "good for employers," I meant that the incentive of keeping one's health insurance is another trick employers can use to trap you in a dead-end and/or low-paying job. The article's author, for example, apparently took the job mostly because it offered health insurance, rare for a part-time gig. People will put up with a lot of bullshit if the alternative is worse. This might have changed a bit in the last three years, judging by all the shitty employers whining "no one wants to work anymore." (In reality, no one wants to put up with your bullshit anymore.)

Now, workers have a bit more power in negotiation. I hope that continues, but it probably won't.

Veganism is a product of privilege.

I stand by this assertion. But I'm not presuming to tell people what to eat. You do you. My objections only start when it becomes like a religion, with self-righteous adherents and attempts at conversion.

I'm aware of the contradiction inherent in my attempt to convert proselytizers to the practice of minding their own business, but I can live with that.

In closing, sometimes "poor" is a matter of one's own perspective. One doesn't have to have a lot of money or a high salary to be privileged. Reading between the lines, this author had, at minimum, a place to sleep, an internet connection, and something to write and post with (laptop, smartphone, something). And some sort of support system. Not to mention the wherewithal to pursue her preferred diet, which requires a good bit of thought, energy, and time. This may have been a step down from her previous experience, but it's not rock-bottom, even if it seemed so to her. Truly poor people are rarely able to be picky about what they eat (which is why I call veganism a product of privilege).

Life has its ups and downs, and I sincerely hope she's doing better. And understands that she could be doing worse.
May 27, 2023 at 11:37am
Writing—the concept, not this website—has been around for at least six thousand years. It's evolved somewhat in that time.



It's widely known, thanks to movies and TV shows, that some of Marvel's characters are its take on Norse gods. This article delves a bit deeper into the past than the Asgardian characters.

Ancient Mesopotamia, the region roughly encompassing modern-day Iraq, Kuwait and parts of Syria, Iran and Turkey, gave us what we could consider some of the earliest known literary “superheroes”.

It also gave us beer. Coincidence? I don't think so.

One was the hero Lugalbanda, whose kindness to animals resulted in the gift of super speed, perhaps making him the literary great-grandparent of the comic hero The Flash.

I want to reiterate here that I interpret the words "literary" and "literature" in their broadest sense; hence, comic books are literature, while movies and TV are not. The difference, to me, is whether one reads the words, or listens to actors or storytellers speaking them (even if you tend to watch shows with subtitles, as I do). This isn't an attempt at a prescriptive definition, but the point of view I take when discussing these things.

Many ancient stories probably started out in an oral tradition, and only became literature when they were written down.

But unlike the classical heroes (Theseus, Herakles, and Egyptian deities such as Horus), which have continued to be important cultural symbols in modern pop culture, Mesopotamian deities have largely fallen into obscurity.

Except, if you're me, the ones involved with beer. And D&D.

An exception to this is the representation of Mesopotamian culture in science fiction, fantasy, and especially comics. Marvel and DC comics have added Mesopotamian deities, such as Inanna, goddess of love, Netherworld deities Nergal and Ereshkigal, and Gilgamesh, the heroic king of the city of Uruk.

Many writers delve into the past for inspiration; comics writers are no exception. Like when someone retells a Shakespeare play, as with West Side Story.

The Marvel comic book hero of Gilgamesh was created by Jack Kirby, although the character has been employed by numerous authors, notably Roy Thomas. Gilgamesh the superhero is a member of the Avengers, Marvel comics’ fictional team of superheroes now the subject of a major movie franchise, including Captain America, Thor, and the Hulk. His character has a close connection with Captain America, who assists Gilgamesh in numerous battles.

And no, he hasn't shown up in the MCU yet. Unless I've missed an Easter egg somewhere, which is always possible.

Gilgamesh’s first appearance as an Avenger was in 1989 in the comic series Avengers 1, issue #300, Inferno Squared. In the comic, Gilgamesh is known, rather aptly, as the “Forgotten One”. The “forgetting” of Gilgamesh the hero is also referenced in his first appearance in Marvel comics in 1976, where the character Sprite remarks that the hero “lives like an ancient myth, no longer remembered”.

That's recent, as comic book characters go. The genre is generally considered to begin with Superman in 1938.

Unlike with many ancient tales and myths, I knew about Gilgamesh before the comics version. One should never assume that a comics version is a reliable adaptation of ancient stories.

Story-telling has been recognised since ancient times as a powerful tool for imparting wisdom. Myths teach empathy and the ability to consider problems from different perspectives.

I would argue that this is a primary purpose of storytelling, ancient and modern. Not limited to myths.

A recent study has shown that packaging stories in comics makes them more memorable, a finding with particular significance for preserving Mesopotamia’s cultural heritage.

While a single unreferenced study is no grounds for drawing conclusions, this tracks for me.

The myth literacy of science fiction and fantasy audiences allows for the representation in these works of more obscure ancient figures. Marvel comics see virtually the entire pantheons of Greece, Rome, and Asgard represented. But beyond these more familiar ancient worlds, Marvel has also featured deities of the Mayan, Hawaiian, Celtic religions, and Australian Aboriginal divinities, and many others.

That's because those of us who read science fiction and fantasy are, in general, more knowledgeable and intelligent (not to mention better looking) than those who don't.

Again, though, be wary of taking modern interpretations of these figures, whether in comics or RPGs, as definitive.

In the comic multiverse, an appreciation of storytelling bridges a cultural gap of 4,000 years, making old stories new again, and hopefully preserving them for the future.

Which leads me to my main purpose in featuring this article: the idea of storytelling itself.

Before writing, stories were passed down orally (though possibly with props, like an early form of theatre), and they would have changed with changing technologies and societies. Once you write something down, though, it's preserved, like a mosquito in amber. It becomes harder to interpret after, for example, war wipes out a neighbor that was mentioned in the story, or someone invents the axle or whatever, making the story less relevant. Just as someone today might view a movie from the 70s and note that a particular predicament might have been easily solved had mobile phones been a thing then.

So, in my view, these adaptations are necessary for preservation, breathing life into old tales.

Thus, we exist in a time when we can have the best of both worlds: the original, preserved; and the reinterpretation.

It could be—and has been—argued that today's comic book superheroes are the modern take on ancient mythology in general, what with their focus on exceptional abilities and superpowers. You get that a bit in other media, but nowhere is it more obvious than in the realm of graphic storytelling.
May 26, 2023 at 11:34am
Feeling much better today, thanks.



This article, from NPR, is dated April 30 of this year, the actual anniversary in question.

"Imagine being able to communicate at-will with 10 million people all over the world," NPR's Neal Conan said...

9,999,990 of whom are idiots, trolls, bots, or some combination thereof.

..."Imagine having direct access to catalogs of hundreds of libraries as well as the most up-to-date news, business and weather reports...

Mostly fake, and/or stuck behind paywalls.

...Imagine being able to get medical advice or gardening advice immediately from any number of experts...

And having it turn out to be (sometimes dangerously) wrong.

"This is not a dream," he continued. "It's internet."

And it was fun for a couple of years.

On April 30, 1993, something called the World Wide Web launched into the public domain.

The web made it simple for anyone to navigate the internet. All users had to do was launch a new program called a "browser," type in a URL and hit return.

And then troubleshoot the URL typed in, because most people can't spell and were hopelessly confused by the // thing.

This began the internet's transformation into the vibrant online canvas we use today.

Jackson Pollock's paintings can also be described as vibrant canvases. And any attempts to make sense of those are doomed to failure, or at least false success.

Anyone could build their own "web site" with pictures, video and sound.

And malware.

They could even send visitors to other sites using hyperlinked words or phrases underlined in blue.

Such as this one.  

Okay. I'm being deliberately cynical. There's a lot to like about the internet, not the least of which is this website. But the web certainly has its downsides... just like everything else. Later, the article goes into some of these downsides.

CERN owned Berners-Lee's invention, and the lab had the option to license out the World Wide Web for profit. But Berners-Lee believed that keeping the web as open as possible would help it grow.

Well... he wasn't wrong. Fortunately for all of us, CERN was—and is—an international government science research organization, not a private company like ExxonMobil, AT&T, or Apple.

Today, nearly two-thirds of the world's population uses the web to visit hundreds of millions of active websites. Some of those pages belong to companies that are among the most valuable in history like Facebook, Amazon and Google.

None of which would exist without the internet as we know it.

With all of its problems, it's still one of the most influential inventions in history, if not the most. It's difficult to put an exact date on "this is when the internet was born," because there were all sorts of remote networks available before 1993, with many different individual inventors (none of whom were Al Gore), and of course the internet continues to evolve.

But the beginning of the World Wide Web is as good a milestone as any to commemorate.
May 25, 2023 at 8:45am
Well, I had some dental work done yesterday. This resulted in far more pain than I'd expected (I mean, sure, I expected some; it's a dental procedure). OTC pain relievers do nothing. So I have, basically, two options:

1) Be in pain, leaving me unable to sleep or concentrate; or

2) Take the good drugs the dentist prescribed, leaving me unable to sleep or concentrate (but at least not in pain), and running the risk of addiction.

Yeah, that's right: opioids keep me from getting decent sleep. That alone is probably enough to keep me from developing a habit; I'd rather sleep.

Point is, I won't be doing my usual blogging today, or, really, much of anything except staring at streaming video. Hell, I won't even be able to drink (adverse drug interactions).

Of course, this will pass, and it's really not a big deal. Perhaps I'll feel better tomorrow.
May 24, 2023 at 7:52am
In which The New Yorker discovers science fiction.

What a Sixty-Five-Year-Old Book Teaches Us About A.I.  
Rereading an oddly resonant—and prescient—consideration of how computation affects learning.


Pick a random selection of science fiction from the past—I don't mean space opera, which is fine in its own right, but actual attempts to write about the intersection of society and advancing technology—and you'll find that some of them are, in hindsight, "oddly... prescient," while the vast majority are like "well, that never happened."

Neural networks have become shockingly good at generating natural-sounding text, on almost any subject. If I were a student, I’d be thrilled—let a chatbot write that five-page paper on Hamlet’s indecision!—but if I were a teacher I’d have mixed feelings.

If I were a teacher, I'd be like, "hey [chatbot], write a syllabus for a sophomore English class aimed at mid-range students."

On the one hand, the quality of student essays is about to go through the roof.

Yeah, except, well, see Jayne's rant about that sort of thing, here: "AI Detectors are Horseshit".

Luckily for us, thoughtful people long ago anticipated the rise of artificial intelligence and wrestled with some of the thornier issues.

Substitute pretty much any technological advancement for "artificial intelligence," and "thoughtful people long ago" anticipated it and thought about some of the possible consequences. That's basically the definition of science fiction.

But I don't expect anyone at TNY to understand that.

Their book—the third in what was eventually a fifteen-part series—is “Danny Dunn and the Homework Machine.”

Just including this bit so we all can see what they're talking about.

There follows several paragraphs of meanderings about the setup, plot, and characterization, before getting to the stuff that is really relevant to the topic. Typical TNY. Still, it's worth glancing over so you know what he's talking about later.

I hesitate to give away too much of the plot, but (spoiler alert!) two mean boys in their class, one of whom is jealous of Irene’s interest in Danny, watch them through a window and tattle to Miss Arnold.

Oh, no, wouldn't want to spoil a plotline from 65 years ago. (Spoiler alert: the author pretty much gives away too much of the plot.)

Incidentally, Rosebud was the sled.

He points out that Danny, in order to program Minny to do his homework, had to do the equivalent of even more homework, much of it quite advanced. (“Gosh, it—it somehow doesn’t seem fair,” Danny says.)

I must have read these books as a kid. I have a vague memory of them, anyway. But it may explain why, whenever I have to do anything more than once, I search for a way to automate it, and often spend more time crafting an Excel spreadsheet or some code than I would have spent on the projects.

“Danny Dunn and the Homework Machine” is ostensibly about computers, but it also makes an argument about homework.

And yet, there is still homework. I'm of the considered opinion that grade-school level homework has the primary purpose of making the kids leave their parents alone for a few precious minutes in the evenings.

The article spends an inordinate (to me) amount of time arguing about homework in general and not on the ethical implications of AI, but the main point remains: people have been discussing this sort of thing since long before it was technologically feasible.

Just like with the rest of science fiction.
May 23, 2023 at 10:06am
I've ragged on this sort of thing before, but it's been a while.

    Why Authenticity Doesn’t Exist When It Comes To Food  
Plus Ronnie Woo Shares a Recipe for Caramelized Hong-Kong Inspired Egg Tart


In my opinion, forcing "authenticity" on food leads to stagnation. It's a lot more interesting to mix things up a bit.

The debate over authenticity in food really comes down to how you define the word “authentic.” The word is often used to describe something that’s either fake or genuine, such as a brand name handbag or a pair of shoes, but in the case of food it doesn’t really apply (unless it’s plastic).

Most things come down to a matter of definition. As I said recently, I consider a hot dog to be a kind of taco. That depends on how you define (and serve) the hot dog, or the taco.

If every time we saw the words “authentic food” and replaced it with the word “traditional,” the sentence itself would probably be much less controversial. But even thinking of “traditional food” doesn’t maintain the intended meaning. I can guarantee that every time a recipe has been passed down to the next generation, changes were made.

Some of that is a search for novelty, but sometimes the changes are because of shifting availability of ingredients or cooking/prep methods.

Authenticity is simply a buzzword that some people have adopted as a way to declare that they are the real food-lovers and are somehow better than you based on what they perceive to be “real.”

And that's my main problem with it, I think: it's another form of gatekeeping.

Now, I know that, in the past, I've declared New York style pizza to be the One True Pizza, and American adjunct light beer to be not-beer. That's inconsistent with what I just said. It happens. I'm almost as loaded with contradictions as I am with pizza and beer. I am large; I contain multitudes.

But here's the important part, a rare instance of me completely agreeing with an article author:

I could care less if something is authentic, or even traditional for that matter – I just care that it’s delicious.

Well. Except that it should have been "I couldn't care less." Do English right! (In fairness, I don't know if English is Woo's first language or not, and here I am gatekeeping again.)

The beauty of having the privilege of eating food for pleasure is that we all are, and should be, allowed to mix and match whatever we want with reckless abandon. To me, the kitchen has no rules.

Yeah, no. The kitchen has plenty of rules. "Don't touch a hot stove." "Keep your knives sharp." "Always preheat the oven."

But now I'm being needlessly pedantic. I know what the author actually means: if you want to create a lo mein burrito, go for it.

Funny enough, there are numerous times where I actually like my interpretation of a traditional recipe more simply because I got to make it my own. Take my spicy almond pesto udon recipe, where I make a spicy version of a traditional Italian sauce and pair it with thick chewy noodles that are typically found in Japanese cuisine. Is this dish traditional? Absolutely not, but it sure is authentic to me (and, I should mention, absolutely delectable).

Many people associate pasta with "authentic" Italian cuisine. And tomatoes. But pasta is the Italian take on an Eastern innovation (noodles), and they wouldn't have tomatoes at all if not for the whole "invasion of the Americas" thing. Just because they added those ingredients long before we were born doesn't mean they're authentic or inauthentic.

While the line that differentiates appropriation and inspiration is not always crystal clear, it is important to not to erase the history or people from which a dish originated from.

This is an important point, though. And it's one that I'm not getting into, except to relate (possibly not for the first time) a personal anecdote:

I was sitting in my local bagel restaurant ("authentic" New York bagels in Virginia) many years ago, enjoying my carbohydrate toroid, when I overheard someone at the next table complain about cultural appropriation. I glanced over and noticed that she was eating a bacon, egg, and cheese bagel.

Why is this ironic? Because bagels are indisputably Jewish food (it's a cultural thing, not religious). But bacon is indisputably not. Nor is the concept of eating any kind of meat alongside any kind of dairy product.

Don't get me wrong; I enjoy bacon, egg and cheese bagels, myself. But I also don't complain about food as cultural appropriation.

The meaty (pun intended) part of the article ends with:

At best, authenticity in food is subjective because no single individual or society can define what it is. If everyone stopped viewing cuisine and culture from a stagnant perspective, paid more attention to the deeply rich experiences of cooks (and people) of color, and appreciated all culinary interpretations simply for what they are, the experience of eating could just be fun and delicious. And that’s exactly what I think it should be.

And I like that viewpoint.

The rest of the article is the promised Hong Kong-inspired egg tart, but I'll leave that for people who aren't as lazy as I am to peruse. I'll just add one more short anecdote about authenticity:

Driving through the wildscape of the Olympic Peninsula in the state of Washington, one day many years ago, I started to get hungry. So I pulled off into a strip mall. Said strip mall had not one, but two, restaurants billed as Mexican.

One of them had bright neon in the window, and ads for Corona and Bud Light and, if I recall correctly, even a dancing neon margarita glass. I may not remember the details very accurately, but hopefully I'm at least conveying the feel of the place.

The other one was a simple storefront with big glass windows and some signs in Spanish.

I entered the latter, where the TV was set to a Spanish language station, on mute but subtitled in Spanish, and had Mexican music playing. The food was excellent.

But I like to think that maybe the same people owned both storefronts (there was a Latin food shop in between them), and just ran the Bud Light one as a gringo trap.
May 22, 2023 at 8:15am
Just an easy article today, and writing-related:



Clearly, they didn't Americanize the headline.

Born in Cambridge in 1952, Douglas Adams was best known for creating The Hitchhiker’s Guide to the Galaxy, a wildly successful project that began in 1978 as a science-fiction comedy radio series and eventually evolved to become something much larger, in many formats and in many languages, adored by many millions of people around the world.

I'll just point out that he did a lot more than that, including writing for Doctor Who, but it's Hitchhiker's that has embedded itself into public consciousness. Well, maybe not "public," but at least in the circles I frequent, one sign of certain intelligence is a good Hitchhiker's quote. Or a bad one.

This typically entertaining letter, which was actually a fax, was sent in 1992 to US editor Byron Preiss, whose company at the time was producing a comic book adaptation of Adams’ ever-expanding opus. Having noticed some unnecessary changes, Adams was keen to give some feedback.

I wish they wouldn't do that. Sure, Hitchhiker's has been translated into dozens of languages, but I find it offensive when they translate from British to American, as if we're too dumb over here to recogni[z/s]e that Brits spell a few words differently and have different words for several things than we do. I grew up reading both American and British literature, and while the differences did confuse me for a while, in the end, I feel like I became a better reader and writer for it.

Probably the worst offense was when they translated Philosopher's Stone to Sorcerer's Stone in the Potter books.

Anyway, the rest of the article is the letter itself, which is written, not unexpectedly, with hints of his signature dry humo(u)r. I'll just quote a few passages here.

A thing I have had said to me over and over again whenever I’ve done public appearances and readings and so on in the States is this: Please don’t let anyone Americanise it! We like it the way it is!

So it's not just me. Making it American would be like trying to do an American version of Black Adder, with American actors who have American accents. I know they did that sort of thing with a series (The Office, maybe?) but I never saw either version.

Though Hugh Laurie (who was in Black Adder), at least, does a great American accent.

The ‘Horse and Groom’ pub that Arthur and Ford go to is an English pub, the ‘pounds’ they pay with are English (but make it twenty pounds rather than five – inflation). So why suddenly ‘Newark’ instead of ‘Rickmansworth’? And ‘Bloomingdales’ instead of ‘Marks & Spencer’? The fact that Rickmansworth is not within the continental United States doesn’t mean that it doesn’t exist! American audiences do not need to feel disturbed by the notion that places do exist outside the US or that people might suddenly refer to them in works of fiction.

It is simply not possible to get the same effect if you substitute a British pub with an American bar. While there are a few drinking establishments in the US that approximate the British pub experience, the culture is still completely different.

Of course, this is only a problem for the very beginning of the story, when Earth still exists. (Spoiler)

Or we could even take the appalling risk of just recklessly mentioning things that people won’t have heard of and see if they survive the experience. They probably will – when people are born they haven’t heard of anything or anywhere, but seem to get through the first years of their lives without ill-effects.

Those sentences are pure Adams.

(Incidentally, I noticed a few years ago, when we still had £1 notes, that the Queen looked very severe on £1 notes, less severe on five pound notes, and so on, all the way up to £50 notes. If you had a £50 the queen smiled at you very broadly).

Quoting this because it legitimately made me laugh. Damn, I miss Douglas Adams. Neil Gaiman is the closest we have to him now, and while one of my favorite writers, he just can't do comedy.

One other thing. I’d rather have characters say ‘What do you mean?’ than ‘Whadd’ya mean?’ which I would never, ever write myself, even if you held me down on a table and threatened me with hot skewers.

I suspect the latter construction is a feeble attempt to render a British accent in dialogue. No. Make the thing British enough, and our minds will provide the accents.

I rarely close one of these entries with a direct quote, but it seems appropriate in this instance:

Otherwise it looks pretty good.
May 21, 2023 at 9:03am
Spinning the dials on the ol' time machine, today we land on "Naturally", from just over two years ago.

The entry was a 30DBC response, and the prompt was: Write about your favorite outdoor activities to do in the summer. Are there any activities you haven’t done that you want to try?

I have a distinct memory of shuddering when I first saw that prompt. I have a bio somewhere—not on a dating site, because I don't do dating sites—that describes me as a "dedicated indoorsman."

So, I opted for humor; specifically, hyperbole:

Well, my only favorite outdoor activity in the summer is: rushing from an air-conditioned car into an air-conditioned bar; if the accursed daystar is burning, add "while shading myself as much as possible."

And that's not entirely true. I rarely drive to bars, on the theory that I might have to leave my car there to Uber home, thus necessitating another trip, the next morning, while hung over. So the "car" in question is a rideshare.

I'm just as likely to walk to the bar. This is true no matter the outdoor temperature. That doesn't mean I'm enjoying the outdoors; it just means I'm a cheapskate.

Still, given a choice between being too hot and too cold, I will pick too hot every damn time, so if I'm going to do anything outside, it'll be in the summer. Of course, one is never given such a choice; either it's winter and too cold, summer and too hot, or a week or so in between when it's actually pleasant to be outside for five minutes when it's not raining.

This is only a slight exaggeration. We usually get more than a week of mid-range temperatures. Sometimes as many as two!

But this does give me the opportunity to talk about a phrase that triggers my grumpiness, to a possibly irrational degree: when someone says something like "I'd rather it be cold because I can always put on more clothes, but there's only so many I can take off."

It's not that those people are different from me. As with being a morning or night person, people prefer different things and it's hard to change that, even if I wanted to, which I don't. If this weren't the case, we wouldn't have people living in Alaska or, conversely, Arizona. So, okay, you prefer the cold; that's good to know. We can be friends, but we'll never take a Caribbean cruise together.

I think it's partly the implied naughtiness: tee hee, now you're picturing me naked. Thanks, I'm already doing that (if you're female) or repulsed by the idea (otherwise).

Worse, though, is that it's dead wrong.

Ever seen people adapted to extreme heat? I don't mean the scantily-clad jungle-dwellers of various tropical societies; they've usually got some shade to cool off in. I mean, like, the desert-dwellers, the real-life inspiration for the Tusken raiders of Star Wars fame. Do they run around mostly naked? No, they're completely covered in loose-fitting clothing. This serves as portable shade and takes advantage of the slightest breeze to provide evaporative air conditioning.

And for me, when it's cold, it doesn't matter how many layers of clothing I wear; I'm going to freeze anyway. Once my hands or feet are cold, that's it. I'm done. If I wear enough clothing to delay this freeze, I'm uncomfortable and can barely move.

So I prefer to do as nature intended: stay indoors and run my A/C in the summer and heating in the winter. Yeah, compared to the vast bulk of humanity, both now and in the past, I'm living in privileged luxury. So? Might as well take advantage of it.

No, I'm of the firm opinion that we evolved to build shelters for a reason, and that reason is so we could use them. Don't get me wrong; I love nature. I'm a big fan of watching webcams of natural areas.

And that's an excuse to re-link the webcam I talked about in that entry, since we're coming up on the summer solstice once again: The Brooks Falls Brown Bears from Alaska.
May 20, 2023 at 11:37am
Colorado is known for, well, several things, but mainly: mountains, weed, beer, and being a rectangle.

    Colorado Is Not a Rectangle—It Has 697 Sides  
The Centennial State is technically a hexahectaenneacontakaiheptagon.


Oh, well, at least they've still got mountains, weed, and beer.

America loves its straight-line borders. The only U.S. state without one is Hawaii—for obvious reasons.

There are good reasons for that, mostly involving making sure surveyors don't just give up in the middle of marking a zigzag boundary.

West of the Mississippi, states are bigger, emptier, and boxier than back east. From a distance, all seem to be made up of straight lines.

Can't be arsed to look it up right now, but there was a shift in the way surveying was done between the time of East Coast European settlement and the massive migrations west. That's one reason you get a lot of near-rectangular counties out West, and almost none in the East.

Only when you zoom in do you see their squiggly bits: the northeast corner of Kansas, for instance. Or Montana’s western border with Idaho that looks like a human face.

Never noticed that before, and now I will never not notice it.

New Mexico comes tantalizingly close to having only straight-line borders. There’s that short stretch north of El Paso that would have been just 15 miles (24 kilometers) long if it were straight instead of wavy.

Just guessing here, but it looks like it's wavy because it followed the Rio Grande. I use the past tense because, looking at a map, it looks like the river shifted but the boundary didn't (whether the river shift was intentional or not, I have no idea). There are great benefits in using rivers and other geological features as boundaries... at least until you remember that geological features change, and rivers in particular can rechannel themselves on human time scales.

No, there are only three states whose borders are entirely made up of straight lines: Utah, which would have been a rectangle if Wyoming hadn’t bitten a chunk out of its northeastern corner; Wyoming itself; and Colorado.

I've long been curious over why Wyoming bit Utah rather than the other way around, or maybe settled on a diagonal compromise. But not curious enough to delve deeper.

I'm also going to quibble a bit. With a few exceptions, most boundaries are described by line segments. The exceptions include Delaware's northern boundary, but even the ones following the thalweg—that's Surveyese for the midline or channel of a river—are generally approximated by arbitrarily short line segments (putting survey monuments in a river channel tends to be cost- and labor-intensive).

Whether we perceive something as a whole lot of short line segments, or a squiggle, or an arc, depends on the zoom factor.

Except that they aren’t, for two distinct reasons: because the earth is round, and because those 19th-century surveyors laying out state borders made mistakes.

And that's the other thing. On a Mercator and certain other map projections, latitudes appear as straight, horizontal lines. They are not. They are all circles of varying radius, centered on the poles. So when you're surveying a latitude line, you're actually describing an enormous arc (unless of course you're on the Equator). Said arc is generally approximated with line segments, as the variation from that big an arc to a line tends to be tiny.

Many states and countries have latitude lines as boundaries. Perhaps the most famous is the western segment of the border between the US and Canada. And even Eastern states (theoretically) have latitude boundaries, such as most of the line between VA and NC.

One begins to see why some of the US "Founding Fathers" were also surveyors.

Congress defined the borders of Colorado as a geospherical rectangle, stretching from 37°N to 41°N latitude, and from 25°W to 32°W longitude. While lines of latitude run in parallel circles that don’t meet, lines of longitude converge at the poles.

In contrast to latitude, longitude lines are, actually, straight (as mapped onto the ground). In theory.

Those longitude numbers seem like errors. I feel like they're measured from DC and not Greenwich, because there was a time when the US tried to measure everything from a longitude line passing through Washington, DC, and things like that tend to get entrenched into surveys. I couldn't find confirmation of this, but later in the article it acknowledges that the western boundary is more like 109°02’48″W, which supports my hypothesis.

This means that Colorado’s longitudinal borders are slightly farther apart in the south. So if you’d look closely enough, the state resembles an isosceles trapezoid rather than a rectangle. Consequently, the state’s northern borderline is about 22 miles (35 kilometers) shorter than its southern one.

I'd love to see the flat-Earther explanation for that one.
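That quoted 22-mile difference is also easy to sanity-check with a spherical-Earth approximation (a simplification of mine; real boundaries are defined on a geodetic ellipsoid). A circle of latitude shrinks by cos(latitude), and Colorado spans seven degrees of longitude:

```python
import math

R_EARTH_KM = 6371.0            # mean Earth radius; spherical approximation
DELTA_LON = math.radians(7.0)  # Colorado spans 7 degrees of longitude

def parallel_length_km(lat_deg: float) -> float:
    """Arc length of the border along a given parallel; circles of
    latitude shrink by cos(latitude) toward the poles."""
    return R_EARTH_KM * math.cos(math.radians(lat_deg)) * DELTA_LON

south = parallel_length_km(37.0)  # southern border, 37 degrees N
north = parallel_length_km(41.0)  # northern border, 41 degrees N
print(f"south ~{south:.0f} km, north ~{north:.0f} km")
print(f"difference ~{south - north:.0f} km (~{(south - north) / 1.609:.0f} mi)")
```

That works out to roughly 622 km on the south line and 587 km on the north, a difference of about 34 km, or 21 miles: close enough to the article's figure for government work.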

That’s not where the story ends. There’s boundary delimitation: the theoretical description of a border, as described above. But what’s more relevant is boundary demarcation: surveying and marking out the border on the ground.

A friend once asked me whether the VA/NC boundary would shift with continental movements, to keep it roughly aligned with whatever latitude it's supposed to follow. No, it wouldn't; the boundary markers take precedence over the delimitation. Kind of like the NM/TX border near, but no longer on, the Rio Grande.

Unfortunately, 19th-century surveyors lacked satellites and other high-precision measurement tools.

I say they did remarkably well with what they had. Humans are clever when they want to be.

Let’s not be too harsh: considering the size of the task and the limitation of their tools—magnetic compasses and metal chains—they did an incredible job. They had to stake straight lines irrespective of terrain, often through inhospitable land.

I guess that's the polite way of saying they had to deal with mountains, Indians, and mountain Indians.

Whether they should have been messing around on land that didn't really belong to them is another issue for another time. The fact remains that they did mess around.

Located in a dusty, desolate corner of the desert, the Four Corners monument seems very far from the middle of anything. Yet this is the meeting point of four states: Utah, Colorado, New Mexico and Arizona. It is the only quadripoint in the United States. The monument’s exact location is at 36°59’56″N, 109°02’43″W.

It's not all that desolate. And yes, I've been there. Twice. Sure, it's a tourist trap, but I'm a tourist, and spent my career working with surveyors.

However, it’s not where Congress had decreed the four states to meet. That point is about 560 feet (170 meters) northwest of the quadripoint’s current location, at 37°N, 109°02’48″W. Did you drive all the way through the desert to miss the actual point by a few hundred feet?

Look, it's a nice drive from any direction.

The rest of the article goes into some of the more obvious deviations from straight-line surveying (though still doesn't much acknowledge that latitude lines aren't "straight.") It's worth a read if you find this sort of thing interesting.
May 19, 2023 at 10:37am
You know why you should never trust atoms? Because they make up everything.

    Why atoms are the Universe’s greatest miracle  
With a massive, charged nucleus orbited by tiny electrons, atoms are such simple objects. Miraculously, they make up everything we know.


I could quibble about some of the words here, like "miracle," "simple," and "everything," but fine, they're not writing for scientists but to get ordinary people to know something about science. I can't be mad about that.

Similarly, I don't have much to say about the article itself. It's not technical, and it's got lots of cool illustrations, some of them animated. Highly recommended for anyone with curiosity.

One of the most remarkable facts about our existence was first postulated over 2000 years ago: that at some level, every part of our material reality could be reduced to a series of tiny components that still retained their important, individual characteristics that allowed them to assemble to make up all we see, know, encounter, and experience.

I mean, technically, light (which obviously enables us to see) isn't made of atoms. But it's generated from them, so okay.

Incidentally, I'm pretty sure that Democritus (the Greek who came up with the above idea) would be almost entirely forgotten had he not turned out to have been onto something. Lots of stuff the Greeks came up with didn't pan out (pun intended). And the Greek atomist theory wasn't exactly correct, either. It was more philosophy than science.

Everything that’s made up of normal matter within our Universe — whether solid, liquid, or gas — is made of atoms.

I could also quibble that this statement is a tautology, but why bother?

If all of human knowledge were someday wiped out in some grand apocalypse, but there were still intelligent survivors who remained, simply passing on the knowledge of atoms to them would go an incredibly long way toward helping them not only make sense of the world around them, but to begin down the path of reconstructing the laws of physics and the full suite of the behavior of matter.

That's assuming they'd have the time to do so, between running from predators, finding prey, and hiding from aliens and/or zombies.

This is, however, just their framing device to communicate the idea of building atomic theory from the ground up. Best to not take these things too literally.

Like I said, I don't have much else to add. Mostly I just wanted to hold it up as an example of how one might communicate complex ideas to a wider audience.
May 18, 2023 at 6:11pm
While one should never get their legal advice from an online comedy site, you might like this Cracked article about laws.



Yeah, I don't know if these are unique. And it's not like other countries don't have stupid laws, too. Hell, some of them still punish you for blasphemy.

The United States is a pretty weird country.

Which one isn't? Oh, yeah, Canada. Never mind.

Even though what’s supposed to be the famous, usually screamed tenet of America is freedom, the actual freedoms we do and don’t have are cherry-picked and puzzling.

Yeah, right. "Freedom."

So it’s unsurprising that there’s a whole lot of regulations and laws in the U.S. that haven’t fallen far from the apple tree — at best confusing, at worst fully oxymoronic.

Or just moronic.

Here are five American laws that are likely, in the eyes of other modern governments, incredibly dumb.

I always liked those lists of weird laws still on the books, like needing a license to wear penny loafers, or whatever.

These aren't those, though.

5. Female Lawmakers’ Backwards Dress Codes

Another point on the high school side of the scale is the fact that, despite being our chief legislative body, Congress still enforces a fucking dress code. And like most dress codes, it’s a whole lot more draconian when it comes to the female members.


That's idiotic, sure, but then there are still countries where "female lawmaker" is semantically and legally impossible.

Sure, Britain isn’t much looser, but they also think “fanny” is a cuss, so is that such a win?

That's what I've been saying.

4. Kinder Surprise Eggs Banned

Another common feel in American law is the conflict between a country that’s supposed to be advocating for freedom above all, while seemingly convinced that every American has the death drive of a baby lustily staring at the forbidden liquids beneath the sink. One place this pops out is in the absence of the Kinder Surprise Egg in American stores.


Meanwhile, far more hazardous products remain legal. You know what I mean.

3. Weird Real Egg Laws

Right, because no one else has weird laws about food.

2. Pharmaceutical Advertising

Everywhere else foolishly believes that if you need medication, your doctor probably isn’t relying on you to provide suggestions. It doesn’t help that the advertising is just as predatory as usual, mostly suggesting that if you don’t fix your allergies, your child will spit on you and leave you to cry in a musty robe while they go to the park to play with their other parent, who they now like more.


I despise almost all advertising, and it is kinda strange to push prescription medicine on the TV, but there are worse things to advertise. Homeopathy, e.g.

1. Sex With A Porcupine

And I'm done with the internet for today.
May 17, 2023 at 8:14am
It's been a while since I've ragged on the "time is an illusion" nonsense, so here we go again.



Despite my issues with the wording, it's an interesting article with some things I hadn't read before. Obviously, I can't do the whole thing justice here; that's what the link is for.

America's official time is kept at a government laboratory in Boulder, Colo., and according to the clock at the entrance, I was seven minutes behind schedule.

Not if time is an illusion, you weren't.

NIST broadcasts the time to points across the country. It's fed through computer networks and cellphone towers to our personal gadgets, which tick in perfect synchrony.

For various definitions of "perfect."

"A lot of us grow up being fed this idea of time as absolute," says Chanda Prescod-Weinstein, a theoretical physicist at the University of New Hampshire. But Prescod-Weinstein says the time we're experiencing is a social construct.

That's misleading at best. Sure, the way we slice time up into hours, minutes, and seconds is purely arbitrary (though it does have some basis in observation), but time's going to do its thing regardless of whether there's a clock to measure it. The orbits of the planets, for instance; they make an effective clock if you know how to read it.

How can I claim to know better than a theoretical physicist? It's the "theoretical" part. The smallest "things" we know of, quarks and electrons, don't experience... well, anything. But where time is concerned, subatomic particles follow laws that don't take much notice of time, if any. From that perspective, okay, time isn't fundamental, and that's the framework a theoretical physicist works in. But for everyday stuff? Time is real. The sun rises in the east (though the directions are also a social construct) and sets in the west. There is darkness, and then there is light.

The consensus I've seen is that it's a bulk property of matter, related to entropy. You know what else is a bulk property? Temperature. But there aren't pseudo-mystics floating around airily proclaiming that temperature is an illusion. Any that do need to be shipped to Antarctica in a pair of shorts to see if they can wish temperature away.

Real time is actually something quite different. In some of the odder corners of the Universe, space and time can stretch and slow — and sometimes even break down completely.

You're going to claim time is a human construct, and then, in the exact same paragraph, use the phrase "real time?"

Yes, that last quoted bit is correct, to the best of my knowledge. The thing is, though, we know exactly how time stretches, slows, and breaks down under acceleration (including acceleration due to gravity). There are equations for it. To me, an "illusion" wouldn't have that quality.

Space is also different from one point to another, but only the most bearded philosophers claim space is an illusion.

For many people, this unruly version of time is "radical," she says.

It is, by definition and equations, ruly. Not unruly. Yes, it seems odd to us because it's outside our normal experience. But there's plenty of observational confirmation of the way time changes at different locations.

By averaging a subset of the 21 clocks together, NIST has created a system that can count the time to within one quadrillionth of a second. That means the government's clock can keep time to within a second over the course of about 30 million years.

At which point it'll be moot, because the Earth's rotation will have changed, and the second is based on the minute, which is based on the hour, which is based on the time it takes for the Earth to make a complete rotation during the current epoch.

I expect the second will remain the same, provided we last long enough to keep measuring time. But the length of the day will gradually increase, unless something catastrophic happens.
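Incidentally, those two quoted figures are consistent with each other, if you read "one quadrillionth of a second" as a fractional error of one part in 10^15 per elapsed second (my reading, not the article's exact wording). The arithmetic, as a quick Python check:

```python
# At a fractional stability of one part in 10^15, how long until the
# accumulated timing error reaches one full second?
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.156e7 seconds

fractional_error = 1e-15
years_to_drift_one_second = 1 / (fractional_error * SECONDS_PER_YEAR)
print(f"~{years_to_drift_one_second:.1e} years")  # ~3.2e7: about 30 million years
```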

The time from this lab is used to run our lives. It says when planes take off and land, when markets open and close, when schoolchildren arrive at class. It controls computer networks, navigation tools and much, much more.

And? I'm as lazy as anyone, but I still want to keep track of time if I'm doing something or meeting with someone.

Governments around the world aren't just providing the time as an altruistic service to citizens, Prescod-Weinstein argues. It's about keeping society organized and efficient. It's about increasing economic productivity.

Bit of a stretch, in my opinion.

"Capitalism sucks, and I think a lot of people's relationship to why time is not cool, is structured by the resource pressures that we feel," she says.

So she has an agenda.

I'm not going to get into the "capitalism sucks" debate, except to say that, well, we've tried some other systems, and as Churchill said, it's the worst economic system, except for all the others. I do hold out some hope that we'll find a replacement, à la Star Trek, or improve it so it's not so dehumanizing as we pursue peak efficiency in the name of Holy Productivity, and pretend that infinite growth is possible. So I can relate to that agenda. But really, none of that says anything about the concept or reality of time itself.

Wibbly wobbly timey wimey

I can't hate an article that has a Blink reference.

True time is actually much more flexible than most people realize, Prescod-Weinstein says. According to Einstein's general theory of relativity, space and time are tied together, and space-time can bend and curve.

Sure, but that has no practical value for us as we slog through our daily lives.

In places where gravity is very strong, time as we understand it can break down completely. At the edge of black holes, for example, the powerful gravitational pull slows time dramatically, says Prescod-Weinstein. And upon crossing the black hole's point of no return, known as its event-horizon, she says space and time flip.

I've seen that finding before, and it's got lots of theory supporting it. Obviously, there's no way to experimentally verify it. Again, no practical use for us. Not yet, anyway.

The Universe is expanding, and because of entropy, energy and matter are becoming more and more evenly spread out across the ever-growing void. In its final state, the Universe may end up as an inert cloud of energy and matter, where everything is evenly distributed.

Yeah, I've referred to that before. They call it the heat death of the Universe, because there will be no more heat transfer, because everything is already at maximum entropy. As I noted, time is probably the result of the one-way direction of entropy. No entropy change means no time. That's if our current cosmological models are correct, which is always an active question.

What this article fails to mention is that this is not an imminent existential threat. We're talking about something like 10^100 years from now, which is a number so large that you don't understand it. Hell, I barely understand it, myself. As a comparison, there are fewer than 10^100 atoms in the entire observable universe.

That exact number is called a googol, incidentally. Not to be confused with Google, who either deliberately or accidentally misspelled it.
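To put a googol in perspective (my own back-of-the-envelope, not the article's): estimates put the atom count of the observable universe at around 10^80, and 10^100 / 10^80 = 10^20. So you'd need a hundred quintillion copies of every atom in the universe to count your way up to a googol.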

Anyway.

So time, as we understand it, has some really big problems, but it also has some really tiny ones, too. In fact, some scientists who study the microscopic interactions of fundamental particles are questioning the idea of time itself.

Yes, we know, fundamental particle interactions are time-reversible. As I said up there, time is a bulk property, to the best of our knowledge.

Well, I've banged on long enough. There is, as I noted, a lot more stuff in the actual article.

If you have time to read it.
May 16, 2023 at 9:47am
I never sausage a thing.

    Which Hot Dog Brand Is Best? A Blind Taste Test of Oscar Mayer, Hebrew National, and More  
Because your summer BBQs deserve the best. (Or at the very least, not the worst.)


Dammit! It was sitting right there. They could have said "not the wurst," but no, they had to play it straight.

Do you know the difference between sausages, wieners, frankfurters, and hot dogs? If, like me, you hadn’t ever really thought about it and assumed they were all pretty much the same, I’m thrilled to tell you that you’re wrong.

Because of course we are.

It's been many years since I've actually eaten a hot dog, frankfurter, or wiener: anything requiring a hot dog bun. At home, I got really tired of the mismatch between the number of franks and the number of buns in their respective packages, and while out, there are other foods that appeal to me more. Not to mention I know what they're made of, but that doesn't stop me from eating breakfast sausages.

So, unlike those two other staples of American haute cuisine, hamburgers and pizza, I don't have a dog in this fight. Pun intended, as usual. I just relished the article and found it amusing.

Regardless of which type of sausage is your favorite, there’s one that screams summer louder than the others: hot dogs. Our staff tasters were sure childhood standbys like Oscar Mayer and Hebrew National would sweep the competition but, as always, the results of our taste tests are full of surprises.

Naturally, I enjoyed hot dogs when I was a kid. Our cylindrical delicacy of choice was, unsurprisingly, the Hebrew National brand. When those were unavailable for whatever reason, the replacement still had to be made of cow, because my mom tried to keep a kosher house as best she could out in the boonies.

To cut down on variables, we boiled and tasted only all-beef hot dogs.

So this is why I picked this article to go into my queue.

And though we offered up buns, ketchup, and mustard, most testers boldly chose to taste their dogs plain.

This makes sense from a pure taste-testing perspective, but out in the wild, you're looking for a whole experience, including bun and condiments. I believe that the right choice of bun influences that experience. Would you taste-test pizza without the crust?

As for condiments, in a taste-test, you at least want them to be consistent across all the samples.

And finally, I know they didn't do this in Chicago, because in Chicago, they track down anyone putting ketchup on a hot dog and run them out of town on a rail.

In the end we blind tasted seven of the most popular brands and judged them on flavor, casing snap, and the satisfying firmness of the meat in each bite.

Phrasing!

Of course, for full effect, you'll need to go to the article for details. I'm just highlighting things here.

The Biggest Loser: Oscar Mayer

Quelle surprise. Their dogs are terrible.

Unflinchingly Flaccid: Ball Park

I'm starting to think this author has issues.

Not because they don't like Ball Park. That's normal. It's just, again, phrasing.

Happily Herby: Hebrew National

Here's the thing: it's hard to be objective about food (or drinks) during a taste test. Taste is, well, a matter of taste. Beer, for example, is highly personal; some love *shudder* IPAs, while I prefer darker, less hoppy brews. For colas, Coke will, for me, always be far superior to Pepsi. And Hebrew National is always going to be my Platonic ideal of hot dogs, even if I don't eat them anymore, because it was our go-to brand when I was a kid.

The two that beat it on the list, Sabrett and Nathan's, were only available to Kid Me on trips to New York City. Either they hadn't yet expanded distribution to the rest of the country, or we just didn't get them in our off-the-beaten-path area. In either case, I always wished it were HN.

So, in the end, you'll have to run your own taste test if you care to determine which is best. Or be like most people and just eat whatever's cheapest; this is why many Americans have no sense of taste.

But you like what you like and it's not my decision.

One final note: there is perennial debate over whether a hot dog, nestled as nature intended in its bun, is a sandwich. I've heard that even a Supreme Court justice once weighed in on the matter (RBG, if it matters), though in an unofficial capacity.

This is, ultimately, a categorization problem, like whether the Blue Ridge are actually mountains, or Pluto's planetary status. So there's no official answer. Categories are, in the end, a social construct. However, when you consider that the hot dog is generally served with the split side of the bun facing upwards so that the toppings don't fall out—something you never do, and can never do, with a sandwich—and that the bun itself is always solid at the bottom, barring accidents, and also given its origins as handheld street food, there is only one True Conclusion at which one can arrive:

A hot dog isn't a sandwich.

It's a taco.
May 15, 2023 at 9:21am
Nothing is forever.



I didn't fact-check these, so beware. If true, there are some on this list I'd never heard of.

As per normal, I'm not copying all of them here.

1. Ansault pear

Unlike other items on this list, the Ansault pear appeared relatively recently. First cultivated in Angers, France, in 1863, the fruit was prized for its delectable flesh.


Angers, France? That's not Nice.

Irregular trees and the rise of commercial farming contributed to the fruit's demise.

Seems to me that, if we really wanted to, we could recreate this one. Fruit varieties are generally made by some sort of cloning or hybridization.

3. Auroch

You may have heard aurochs mentioned in Game of Thrones, but this creature doesn’t belong in the same category as dragons. The real cattle species was domesticated 10,000 years ago in the early days of agriculture. They were big (“little below the elephant in size,” according to Julius Caesar) and leaner than modern cows.


Apparently these lasted longer than I thought, all the way to the 17th century.

Here, the article leaves out an interesting bit about the aurochs: it was so important, so integral to developing civilizations, that the pictogram for it became a letter. Phoenicians called it aleph. The Hebrew script still does. In Greek, alpha. We know it as the letter A.

5. Dodo

Dutch sailors first visited the island chain of Mauritius in 1598, and less than two centuries later the archipelago's native dodo went extinct. Sailors relied on the birds as sustenance during long voyages at sea, but that isn't the primary reason they died out; habitat and the introduction of invasive species like rats and pigs ultimately wiped out the animal.


Pretty sure they mean "habitat loss," not "habitat."

It's my understanding that it was fairly common, at the time, for people of the European variety to believe that God put all the other animals (and plants, etc.) on Earth for our benefit, and would never allow one to become extinct.

That turns out not to be the case.

6. Steller’s sea cow

German naturalist Georg Wilhelm Steller identified the Steller's sea cow around the Commander Islands in the Bering Sea in 1741. Growing up to 30 feet long, it was significantly larger than the sea cows alive today.


Cue Hindenburg disaster narrator: "Oh, the huge manatee!"

7. Mammoth

Wooly mammoth meat was an important component of the diets of our earliest human ancestors. We ate so much of them that hunting may have contributed to their extinction around 2000 BCE (though climate change was likely a bigger factor).


So, apparently, there were mammoths wandering around at the same time as there were pyramids in Egypt.

Not in the same place, though.

8. Taliaferro apple

Thomas Jefferson cultivated Taliaferro apples at Monticello. In an 1814 letter to his granddaughter, Jefferson said the small fruit produced "unquestionably the finest cyder we have ever known, and more like wine than any liquor I have ever tasted which was not wine."


Including this one in my commentary for literal local flavor. But also because many people might not be aware that the Virginia pronunciation of Taliaferro is, inexplicably, Tolliver.

9. Great auk

Modern humans primarily killed great auks for their down, leading to the species’s extinction in the mid-19th century, but prior to that they were hunted for dinner.


I knew about this one because I had an English teacher in high school who loved to point out awkward sentences in his students' compositions by writing a big red AWK and circling the offending phrase. He called it the "Great Awk."

It should surprise no one that I had a truly stupendous Great Awk collection.

Anyway, there's obviously more at the link, and they're all interesting, even if, as is the case with the passenger pigeon entry, some of them are already well known.
May 14, 2023 at 9:27am
Today, we're going all the way back to June of 2020 for an article about quantum mechanics: "Something about Nothing." Now, don't freak out; it's not a technical treatise in any way.

The article referenced there (really a book promotion masquerading as an interview masquerading as an article) is a year older than that, but it's still up and can be found here, with the misleading title of "Making Sense of Quantum Mechanics."

As I've noted, the main reasons I do these retrospectives are to see if there have been any updates to the material covered, and to elucidate how my own views on the subject may have evolved. I don't follow science news all that rigorously; that is, I'll read an article or a book, or watch a video, here and there, but it's not like I delve into much depth. But the one thing that comes to mind is that, recently, there was a lot of buzz about a Nobel Prize given to some scientists for their work on quantum entanglement.

Same as with everything else related to quantum theory, people got that stuff wrong, too. I don't mean the prize-winning scientists, who presumably had really tight results; rather, the way it was reported made it look like the old "Einstein was Wrong" crowing, with an emphasis on how quantum entanglement means something travels faster than light.

It does not. The light speed barrier should more properly be termed the information speed barrier, and quantum entanglement does not, at least with our current understanding, imply the transmission of information faster than light. We can't use it to send instantaneous messages to Pluto, for instance. Mostly, from what I can tell, the usefulness is limited to the arcane workings of quantum computers. Perhaps there are other uses, or will be in the future, but mostly the prize was about experimental confirmation of a theory.

None of which really negates anything in the article I featured, as far as I can tell.

In that entry, I said:

I've always been of the opinion that anyone who claims to have figured out quantum mechanics is lying, at the very least to themself.

Nothing about that has changed.

But I do want to go back to that original article to note something I apparently missed the first time around:

Horgan: Good choice. What is the point, in our scientific age, of philosophy?

Albert: I'm not sure I see the connection. It's like asking, “What is the point, in our scientific age, of ice cream?" People like it. People - or some people - want to understand how things, in the broadest possible sense of the term, hang together. And it happens (moreover) that the business of trying to figure that out has had obvious and enormous and innumerable consequences for the history of the world. And if the thought is that what goes on in university science departments has lately somehow taken that function over, then that's just clearly and wildly wrong - and the fact that it's wrong, as I explained in my answer to your previous question, was part of the reason why I moved from a science department to a philosophy department.


Here, I think both people missed the mark.

Ice cream has nothing to do with science (except in the sense that everything does and that, reportedly, Einstein was a big fan). But—and I might have mentioned this before, but I don't remember—philosophy guides science, and science informs philosophy.

"Science" isn't a body of knowledge; it's a method. The scientific method is, at base, philosophy. It's philosophy that works, in that it's been shown to get useful results, unlike a lot of the mental self-pleasuring some philosophers do. But philosophy also has at least one other function in science, and that's to limit the lengths to which we'll go to investigate some hypothesis.

To note a basic example, in biology, animal testing is a thing. What limits animal testing isn't science itself, but ethics, which is a branch of philosophy. You can argue that the restrictions go too far, or, conversely, that they don't go far enough and maybe we shouldn't be doing animal testing at all. But by doing so, you're not doing science, you're doing philosophy.

As for "science informs philosophy," well, the thing about philosophy is that you can build entire logical edifices on the foundation of a false premise. One need look no further than the convolutions of a flat-earther to see what I'm talking about here, but, in general, if you're going to draw conclusions, it's best to start with a solid and well-tested premise, such as "the earth is basically round" or "gravity is an attractive force between masses."

Sometimes, when you do that, you might find a contradiction, or a paradox. That might lead to a revised premise, and that's okay.

My point is that the universe doesn't support our penchant for putting everything into neat little boxes. There's no sharp dividing line between, say, biology and chemistry. The boundary between our selves and our environment can get murky, too, and does so every time we take a breath, or eat, or shit.

So it is with science and philosophy. Though we were doing philosophy way before the beginnings of science as a discipline (physics was originally termed "natural philosophy"), it often led to some really wrong conclusions. Still does, of course.

Okay, enough of that. I guess I just had to defend why I bang on about both those things in here, when I'm not making dick jokes. And sometimes when I am.
May 13, 2023 at 8:22am
From the "don't believe everything you hear" department (courtesy of The Guardian):

    Chocolate doesn’t cause acne – but carrots do help you see in the dark: the best and worst health myths and wisdom  
True or false: cheese gives you bad dreams and oysters are aphrodisiacs? We investigate good, bad and mad health advice


Folk "wisdom" usually isn't wisdom, but mythology. People have always had a problem confusing correlation with causation.

Sometimes, though, like a sightless person throwing darts randomly and hitting a bullseye, it turns out to be right—at least provisionally.

How do you tell the difference? Science, of course.

I won't copy all of them here; there are quite a few. Just hitting some highlights that I wanted to comment on.

Chicken soup helps cure colds and flu

Works best if prepared by a Jewish mother.

Okay, no, that's a joke. But I'm pretty sure the canned kind is going to be inferior to the homemade variety. I'm wary of the word "cure" in the title; however, this falls into the "can't hurt and might help" category. Unless you're vegan, in which case, good luck.

Anyway, I've banged on about chicken soup in here before. The short version is, if it makes you feel better, and you like it, great.

Chocolate causes acne

This one's labeled "false." As with much of "nutrition science," the jury's still out.  

An apple a day keeps the doctor away

Also labeled "false," but only on a technicality: you're going to get sick eventually, no matter what you do or don't do. But, as the article notes, it's not going to hurt you to eat a damn apple. And it's admittedly a catchy rhyme.

Going out with wet hair gives you a cold

"False." Duh. Colds are caused by viruses. Viruses that you pick up from *shudder* people.

Carrots help you to see in the dark

The article points to "true," but I have to say, not to the extent that mythology would indicate. This nonsense started, if I remember correctly, in England during WWII: not wanting "zee Germans" to know about the Allies' sophisticated (for the time) radar, the British attributed early warning of aerial attacks to people eating carrots and therefore seeing the impending threat better in the dark.

But again, as with apples, it's not like eating a few carrots is going to hurt.

Cracking your knuckles will give you arthritis

This one's false, and I've known that for some time, but goddamn, it's annoying. So if you use it to scare kids into not cracking their knuckles, I can understand that.

I crack my knuckles all the time.

It takes up to seven years to digest swallowed chewing gum

Another false one intended to scare children straight.

Garlic under your pillow aids sleep

Labeled "false," but I'd call it true, if you're frightened of vampires; clearly, you're going to sleep easier if you know you're protected. It also keeps other people away because of the smell, so you get a better night's sleep.

Still, when it comes to garlic, I'm too busy eating it to put it under my pillow.

Urine relieves jellyfish stings

I'm including this one in my commentary because I'm still hearing this nonsense sometimes. I suspect it got started by someone with a pee fetish, which is way more common than I ever realized.

Oh, yeah, and it's false.

Cheese gives you bad dreams

I remember the first time I heard of this one. It was in Dickens' A Christmas Carol, as Scrooge attributed his nocturnal visitations to possibly having eaten cheese.

As the article notes, dairy products might actually help with sleep. Again, good luck, vegans.

Probiotics support your gut health

This is the last one on the list, and it doesn't surprise me in the yeast—er, I mean, least—that it's not entirely true. The magical benefits of probiotics are mostly marketing gimmicks.

Big surprise.
May 12, 2023 at 8:06am
A little more depth than usual, today, but I'll try to make it worth your time.

    Descartes and the Discovery of the Mind-Body Problem  
The French philosopher René Descartes is often credited with discovering the mind-body problem, a mystery that haunts philosophers to this day. The reality is more complicated than that.


Descartes was, of course, more than a philosopher. Probably most famous for "I think, therefore I am," he was also a scientist and mathematician (he's the guy who decided it would be cool to represent points on a two-dimensional grid with x and y axes, a scheme ever after known as the Cartesian coordinate system, from the Latinized form of his name). And he had the best hair of any philosopher. Newton's was arguably better, but his focus was mostly science, math, and alchemy; plus, I suspect his was a wig.

Consider the human body, with everything in it, including internal and external organs and parts — the stomach, nerves and brain, arms, legs, eyes, and all the rest.

Yeah, I know what you were thinking about with "all the rest."

Even with all this equipment, especially the sensory organs, it is surprising that we can consciously perceive things in the world that are far away from us.

Eh, not really. Most animals can do that. Kind of necessary for avoiding predators and finding prey.

For example, I can open my eyes in the morning and see a cup of coffee waiting for me on the bedside table. There it is, a foot away, and I am not touching it, yet somehow it is making itself manifest to me. How does it happen that I see it? How does the visual system convey to my awareness or mind the image of the cup of coffee?

"Even more importantly, I live alone!"

I am conscious of the cup, we might even say, though it is not clear what this means and how it differs from saying that I see the cup.

Everyone treats consciousness like it's gotta be this massive, complex thing, but if it turns out to be really simple, I'll laugh my ass off (from beyond the grave, if the dualists turn out to be right).

How did my neurons contact me or my mind or consciousness, and stamp there the image of the cup of coffee for me?

It’s a mystery. That mystery is the mind-body problem.


By this point in the article, the author could have, you know, drunk the coffee instead. Such are the perils of philosophy: your coffee gets cold while you wax philosophical about it.

Our mind-body problem is not just a difficulty about how the mind and body are related and how they affect one another. It is also a difficulty about how they can be related and how they can affect one another.

Plot twist: they're the same thing.

According to Descartes, matter is essentially spatial, and it has the characteristic properties of linear dimensionality. Things in space have a position, at least, and a height, a depth, and a length, or one or more of these.

Hence, Cartesian coordinates (extended into a third dimension).

Mental entities, on the other hand, do not have these characteristics. We cannot say that a mind is a two-by-two-by-two-inch cube or a sphere with a two-inch radius, for example, located in a position in space inside the skull. This is not because it has some other shape in space, but because it is not characterized by space at all.

*bong hit* duuuuuude.

Okay, but in seriousness, I believe this next bit is the actual crux of the matter under discussion:

What is characteristic of a mind, Descartes claims, is that it is conscious, not that it has shape or consists of physical matter. Unlike the brain, which has physical characteristics and occupies space, it does not seem to make sense to attach spatial descriptions to it. In short, our bodies are certainly in space, and our minds are not, in the very straightforward sense that the assignation of linear dimensions and locations to them or to their contents and activities is unintelligible. That this straightforward test of physicality has survived all the philosophical changes of opinion since Descartes, almost unscathed, is remarkable.

I don't know about that last sentence. There are philosophers who will tell you, with straight faces (if you can see said faces behind their beards), that what we know as matter is an illusion, and mind is the only thing that's real.

Oh, wait, that's from Descartes, too.

Descartes is untroubled by the fact that, as he has described them, mind and matter are very different: One is spatial and the other not, and therefore one cannot act upon the other.

And yet, one does act upon the other, as we prove every moment of every day, so if it didn't trouble him, it's not going to trouble me, either.

This is analogous to Zeno's paradoxes. The way they're formulated, nothing can move: two people can never get close enough to kiss, and no one would ever be able to enter a room. All of Zeno's paradoxes were later resolved by calculus and limit theory (which came along a generation after Descartes and built on his work). But my point, in the philosophical sense, is this: when your philosophy doesn't mesh with reality, it's not reality that's wrong; it's your philosophy.
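For the record, the standard resolution runs like this: the dichotomy has you cover half the distance, then half the remainder, and so on, and that infinite series adds up to a finite total: 1/2 + 1/4 + 1/8 + ... = 1. Infinitely many steps, finite sum. The runner arrives, the kiss happens, the room gets entered.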

Descartes is surely right about this. The “nature” of a baked Alaska pudding, for instance, is very different from that of a human being, since one is a pudding and the other is a human being — but the two can “act on each other” without difficulty, for example when the human being consumes the baked Alaska pudding and the baked Alaska in return gives the human being a stomachache.

Okay, that was legitimately funny.

The difficulty, however, is not merely that mind and body are different. It is that they are different in such a way that their interaction is impossible because it involves a contradiction.

I would say: an apparent contradiction. Again, it just means that our understanding of one or the other is faulty.

My money's on "mind."

Mind is consciousness, which has no extension or spatial dimension, and matter is not conscious, since it is completely defined by its spatial dimensions and location.

Unless you subscribe to panpsychism, the philosophy holding that all matter has some rudimentary consciousness. (I do not, and I have banged on about it at length in previous entries.)

Was there really no mind-body problem before Descartes and his debate with his critics in 1641? Of course, long before Descartes, philosophers and religious thinkers had spoken about the body and the mind or soul, and their relationship. Plato, for example, wrote a fascinating dialogue, the Phaedo, which contains arguments for the survival of the soul after death, and for its immortality.

I begin to see the problem. It starts with a false premise: that the mind, or soul, exists independently of the body and can survive the body's demise.

This is like believing that a candle's light can survive the snuffing of the candle. No. Except in the metaphorical sense, as those who saw the candle's flame can remember it.

What happens, if anything, for example, when we decide to do even such a simple thing as to lift up a cup and take a sip of coffee?

Well, after spending all your time philosophizing about it, that's when you discover it's cold. And I'm still not clear on how it appeared on your bedside table to begin with.

Anyway, I get around all this by understanding that the mind is a construct of the physical body. But what do I know? I'm not a philosopher.
May 11, 2023 at 10:04am
I've been told that the source for today's amusing link (The Daily Mash, basically The Onion but British) is blocked in non-UK areas. Obviously, it's not blocked for me; there are ways around location-blocking. But if that's too much work, I'll understand.



Obviously, the link contains swear words, and would probably be 18+ on WDC.

BRITAIN leads the world in swearing, until British people attempt to say things that Americans would call ‘cuss words’.

I was under the impression that "cuss word" is pretty much a Southern thing ("cuss" being a mangling of "curse" as "y'all" is a mangling of "you all"), but I wouldn't expect Brits to understand that nuance any more than I'd expect a Yank to know the difference between a Yorkie and a Mancunian accent.

Asshole

Given that we in the UK already have the vastly superior phrase ‘arsehole’, there’s really no need to be pronouncing the word ‘ass’ at all.


On this point, I can agree. "Arse" is indeed vastly superior in most instances.

Motherf**ker

Just call him a wanker, it’ll sound better.


Here, I have to disagree. Not that it wouldn't sound better; it definitely would, no matter what your accent. But "m-f'er" and "wanker" are two different insults with two entirely different purposes, coming from two completely different takes on sexual practices. Also, the former is far more versatile, being applicable to a much wider variety of situations than merely accusing someone of incest.

Douchebag

...It never sounds right coming out of a British mouth...


Phrasing!

Also, I suspect it doesn't work in England because it's too obviously French. And "douche" is a perfectly legitimate French word that translates to "shower," as in the kind you take in the morning to get yourself clean for the day.

Darn

Even our most pathetic swearwords are infinitely superior to theirs.


Except, and I must emphasiz(s)e this to keep my US passport, for m-f'er.

Pussy

Whether used to convey the fact that you think someone is weak or a pathetic way to refer to a woman’s genitalia, British people really can’t pull this off.


Right, like "fanny" is any better. And the article specifically claims British ownership of "twat," which is obviously wrong.

There's more explanation at the link, if you can view it. If not, just remember it's because of those darn arsehole douche wankers on the other side of the pond wanting to keep their comedy to themselves.
May 10, 2023 at 10:35am
When I was a kid, whenever my mom would get exasperated with me (often), she cursed me with "Just wait until you have kids!"



That was very effective in ensuring that I wouldn't have kids. I wouldn't want to deal with anyone remotely like I was.

As the owner of Childfree Millennial TikTok, Instagram and YouTube accounts, Munoz is one of a growing number of influencers producing content designed to validate why they never want to have kids.

Much as I despise TicTac, it's about time someone provided a counterbalance to the insipid mommy bloggers.

“The number-one thing that I always say when people ask me why I'm child-free – it's because I don't have a desire to have children,” says Munoz, a small-business owner from Kansas, US.

I mean, as reasons go, that's a compelling one. I never do (or don't do) anything for just one reason, so that bit at the top was just the beginning for me. There was also the 1980 Presidential election, which I think is when I realized my country was doomed.

In one of her other recent posts, she jokes, “if you have baby fever take a nap, if you enjoyed that nap don’t have kids”.

Credit where credit is due, that's a great one. I always called it "baby rabies" though. Rhymes are more memorable.

While deciding against having children is nothing new, a trend for owning the ‘child-free’ label and discussing that choice more openly is picking up pace.

Oh, come on now, we were discussing this sort of thing in online forums even before social media was a thing.

The term ‘child-free’ has existed since the early 1900s, although it wasn’t until the 1970s that feminists began using it more widely, as a way of denoting women who were voluntarily childless as a distinct group.

Oh, bite my ass. It's not just women. Sure, the social consequences are different for men, but some of us don't want to be saddled with everything that being a dad entails. To be clear: I'm not talking about the assholes who have kids and then abandon them. That's not being childfree; that's being irresponsible.

However, most academic research has typically “lumped all people who don’t have children into the same group,” explains Elizabeth Hintz, an assistant professor in communication at the University of Connecticut, US, who’s studied perceptions of child-free identities. This doesn’t reflect the very different experiences and feelings of child-free and childless people, she says, and means there’s a lack of long-term comparative data looking specifically at either group.

Which is another important point: people who do want kids, and can't have them, have my sympathy. Hell, my parents were in that group (be careful what you wish for). People who want kids and have them, well, that's great; they've achieved their desire. Then there's the people who don't want kids and have them anyway, through "accident" or social pressure, and that can be a problem.

In any case, there's a world of difference between childfree (I drop the hyphen on purpose, but I suppose the BBC has a style guide to adhere to) and childless.

In the US, a 2021 Pew Research Center study showed some 44% of non-parents aged 18 to 49 don’t think they will have children, up from 37% in 2018. More than half listed their main reason as “don’t want to have children” rather than more circumstantial factors such as medical issues or not wanting to raise a child without having a partner. In England and Wales, a 2020 YouGov study suggested that more than half of British 35-to-44-year-olds who haven’t had kids never plan on doing so.

Those are pretty high numbers, but I suppose they seem inflated because the base group is people who already weren't parents.

However, in my opinion, "don't want to have children" just kicks the can down the road. Not that anyone is obligated to give a reason. If someone asks me "why don't you eat eggplant?" I just say "Because I don't like it," and the conversation usually ends there. But the way society is, one is tempted to ask "why don't you want to have children?" And the answers can be all over the place. Personally, I often cite the rise of fascism, environmental degradation (a lower population would ameliorate some of that), a moral obligation not to bring new life into a decaying world, being unable to afford it, and my love of being able to sleep whenever I want.

Some call it "selfish," but I can't think of anything more selfish than wanting your precious genes to somehow continue, or expecting someone to automatically care for you in your old age.

A lot of those reasons are echoed there in the article, but, again, all of them from women.

Another burgeoning online community is We Are Childfree, run by British-born Zoë Noble and her partner James Glazebrook, who are both in their early 40s, and live in Berlin.

Finally, a dude. At least I'm assuming based on the traditionally masculine name. All I mean by that is, well, representation is a good thing.

Munoz’s content has frequently attracted harsh online comments from those who’ve disparaged her choices as being “anti-child” or “selfish”, or from followers who simply don’t believe she could find her lifestyle fulfilling. “A lot of parents just don't understand that it was a choice. And so, they see it as an attack on their choice of having children,” she says. “They immediately go on the defence mode and tell you, ‘oh, but you're going to regret it’ and ‘you're going to die alone’, and ‘who's going to take care of you when you're older?’ and ‘you'll never know true love’.”

I have, of course, heard all those things, myself. I can only imagine it must be worse for women, who, historically, have faced even more pressure to reproduce. All I can say is, different people have different priorities. Just like in my last entry, on the topic of sleep schedules, I'm not criticizing others' decisions. I'm not one of those voluntary human extinction people, just, to borrow a phrase, pro-choice.

Hintz points out that much of the criticism hurled at child-free advocates tends to be steeply gendered. “Reproductive decision making has always been a burden placed on women more so than their partners,” she says. “And motherhood and femininity are so closely intertwined as well, so that is also, I think, a part of it.” As a result, this means there’s still often more pressure on women than men to follow a traditional “life script” and start a family, says Hintz, even in Western countries that have made great strides towards equality.

So the article acknowledges what I've been saying. Still, it generally takes two to make a child (even if one of them is just a donor), so stop with the double standards, already.

Practical and financial issues are also covered, including how to plan for retirement as a non-parent. “There's a lot of fear of getting older and ‘who's going to take care of me’ and ‘what is my future going to look like’?

Those are legitimate concerns, but I offer this: last I checked, which was a while ago, it costs something like half a million dollars to raise one kid to adulthood, and that's not including any costs of higher education. It may be lower for subsequent offspring, but my point is that if you're earning income and have some discipline, you can invest that money, instead, and use it for long-term care.
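Just to put rough numbers on that (mine, not the article's; the half-million total, the 5% return, and the monthly figure are all assumptions for illustration), here's a quick back-of-the-envelope in Python:

    # Hypothetical: redirect the ~$500k cost of raising one kid
    # (spread as ~$2,315/month over 18 years) into investments instead.
    monthly = 500_000 / (18 * 12)   # roughly what a kid costs per month
    r = 0.05 / 12                   # assumed 5% annual return, compounded monthly
    balance = 0.0
    for _ in range(18 * 12):
        balance = balance * (1 + r) + monthly  # grow, then deposit this month's savings
    print(f"${balance:,.0f}")       # about $808,000 after 18 years

Compound interest: one of the few forces of nature that works in favor of the childfree.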

O’Connor says it’s important to point out that most child-free advocates “are very pro-choice for everyone”, and don’t have the goal of “convincing people to be child-free” or “trying to recruit for the community”.

Just re-emphasizing this so people don't get defensive.

O’Connor strongly agrees that the media has an important role to play moving forward. “There's a lack of positive representation of what being child-free or childless looks like in society,” she says. “We don’t have in the wider media, in TV shows and in film, older people living just happy content, child-free lives.”

There's a reason for that beyond enforcing social norms: in entertainment, drama and conflict are key (as we all know as writers). Hell, one of the best superhero shows of the past few years is Superman and Lois, and how do the writers create drama and conflict? Clark and Lois have teenage sons. There are also potentially world-shattering villains, as befits a show about Superman, but those aren't nearly as relatable. Anyone can relate to dealing with kids, even if they don't have any, because they were, once, kids.

Happiness and contentment are great, but they can make for shitty plots.

“More and more articles are coming out about people not having kids … and seeing more accounts pop up, more channels being created on YouTube, it’s so refreshing,” she says. “I’m not discriminatory to people who have kids. I have a lot of friends in my life who are parents. But I just love that people are now thinking a little bit deeper about parenthood, rather than just assuming it's the thing to do.”

I don't usually quote the last paragraphs in the articles I feature, but this one sums everything up perfectly (thanks, BBC). That's all I want: not for everyone to make the same life choices I did (good gods no), but to acknowledge that there are other choices in life besides following the script.

I should probably end on a personal note: as I mentioned, my parents were unable to have genetic offspring, so they adopted. That was their choice, and it worked out great for me. My mom's sister, on the other hand, was a feminist before it was cool to be a feminist, so she never married, focusing instead on her career and taking care of their mentally ill brother. Sometimes, she traveled the world. One of the most impressive people I've ever known.

So this kind of choice is nothing new. It just might seem that way.
May 9, 2023 at 11:24am
Sometimes, the random number generator points me to the same topic as the day before. Such is the clumpy nature of randomness.

Pretty sure this is the last article along these lines in my queue, so after today, we're probably done for a while.



"Could" is absolutely the key word in the headline.

I will note that this article is from Fortune magazine, so I'm pretty sure what they really mean is "Waking up at 5 a.m. every day would improve your boss's bottom line."

Seize the day, we’re told.

We're told a lot of things, many of them contradictory. "Seize the day," or carpe diem in its maybe-original formulation, is meant to be an exhortation to live in the present moment with little to no thought of consequences or the future. Basically the opposite of "force yourself to get up at 5 am so you can worship at the altar of Holy Productivity."

The early morning wakeup has even become a TikTok trend...

I really should have stopped reading there. Once they mention that short-attention-span data-mining cancer of an app, I can be sure that nothing else in the article applies to me.

Not that I didn't already know that from the "wake up at 5" bullshit.

...coined the “five-to-nine before the nine-to-five,” where video montages illustrate a slow morning aesthetic of self-affirmations, workouts, and maybe even a head start into planning for the work day.

This is like the training montages in movies, which make it look like you can go from wimp to wolf with five minutes' worth of sweat.

“The pressure to be a morning person is pretty intense,” says Samantha Snowden, a mindfulness teacher at Headspace, the popular meditation app.

Everything about that sentence is pressure. Blood pressure. Mine. Higher.

For starters, getting up earlier can improve confidence, Snowden says, because it can feel like an accomplishment.

You know what else can feel like an accomplishment? Actually accomplishing something.

And if you can use those extra morning hours to make time for yourself in a way that calms you down, it can bolster productivity and make you feel less depleted by the end of the day.

Ahhh... I knew they'd get to the P word eventually.

You know what calms me down? Sleep.

Choosing to move up that alarm should not come at the expense of sleep.

I know I've said this before, but you're going to have the same number of waking hours every day, on average, no matter what time you get up. Unless you sacrifice sleep, as I did for many years, and I'm paying for it now—way more than I'm paying for smoking or drinking.

Prioritizing sleep means having good sleep hygiene, including waking up around the same time each day, limiting screens before bed, not consuming alcohol or caffeine in the evenings, and having a wind-down routine.

This is the first thing in that article that doesn't make me want to yeet something at the wall.

Not that I wake up at the same time each day, or get away from screens before bed, or refrain from alcohol or caffeine. I'm just saying it's decent advice, not that I follow it.

Snowden says you can spend 10 extra minutes slowing down (even walking a bit slower to the shower in the morning), not checking emails right away, and practicing a kindness message.

And there goes my blood pressure back up. Seriously. They say you can't feel it, but I definitely do.

Especially for the night owls, choosing to get up earlier won’t feel comfortable immediately.

It'll never feel entirely comfortable if you're a true night owl. Never. You can get used to it; we're pretty adaptable. But being an owl and being forced to be a lark is very similar to jet lag, only it never really goes away.

Source: me.

Now, look. I want to emphasize that I'm not knocking people who get up early, for whatever reason. You do you, as the kids say. Sometimes, we have to; I did for a long time. Some people are just naturally early risers, and that's okay, too.

What I object to is the incessant pounding in various media sources (usually ones with a pro-business agenda) that this is the One True Answer. People are different, and have different innate schedules; I'm not "lazy" because I prefer to sleep from 2am to 10am (or whatever my schedule is on any given day) instead of 9pm to 5am. I'm lazy for a lot of other reasons, but not that one. Still, I can acknowledge that more people prefer the latter, though I'd wager not so many among writers or other creative types.

I just want to stop the sleep-shaming.

