Rated: 18+ · Book · Personal · #1196512
Not for the faint of art.
Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
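
To make that last point concrete, here's a minimal Python sketch (my own illustration, not part of the original description) of the most famous such transformation: repeatedly applying z → z*z + c to points c of the complex plane and seeing which ones stay bounded, which is what generates the Mandelbrot set.

    def escape_time(c, max_iter=100):
        # Iterate z -> z*z + c starting from z = 0; return how many steps it
        # takes |z| to exceed 2. Points that never escape belong to the set.
        z = 0j
        for n in range(max_iter):
            z = z * z + c
            if abs(z) > 2:
                return n
        return max_iter

    # Sample a coarse grid of the complex plane: real part across,
    # imaginary part down, exactly the "complex plane" described above.
    for row in range(24):
        im = 1.2 - 2.4 * row / 23
        line = ""
        for col in range(72):
            re = -2.0 + 2.8 * col / 71
            line += "#" if escape_time(complex(re, im)) >= 100 else "."
        print(line)

Even this crude ASCII rendering shows the familiar cardioid-and-bulbs outline; the intricacy only deepens as you shrink the sampled rectangle.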




June 10, 2021 at 12:03am
#1011579
Today's article is basically a book promotion, but whatever -- it still has what I consider to be good information.

5 Tips for Resisting the "Laziness Lie"  
No one chooses to fail or disappoint. We need to unlearn our bias of laziness.


Being a lazy person, I thought this might be about how to be not-lazy. Boy, was I wrong. It's tips for how to be more lazy, as if I didn't already have great role models for that: my cats.

The book explores how our culture’s fear of laziness is rooted in unjust historical systems such as enslavement and the belief in the Protestant Work Ethic, and how such beliefs lead to overwork, exploitation, and alienation.

Huh. So that's an elevator pitch, and a damn good one: one sentence, succinct, a summary of the main points. And it especially appeals to me because I've been part of the RAPWE (Resistance Against the Protestant Work Ethic, an acronym I just now made up) for years.

For anyone who doesn’t have the energy—or time—to read a full book about how busy and overworked we all are right now, here are five key insights from my book you can read in a single sitting:

And that's how you do a book self-promotion: don't just tease us, but give us some solid points from the text. Sure, some people (like me) will still be too lazy to buy the book, but you're still shilling it to a larger audience than you'd otherwise have.

1. "Laziness" isn't what you think.

The “laziness lie” is my term for the set of unspoken, deeply held cultural beliefs that each of us absorbs throughout our lifetimes about the value of work and the danger of “laziness.”


There seems to be a rule in self-help books -- maybe even psych books in general -- that the author has to come up with a catchphrase to center the text around. Preferably one that rhymes, alliterates, has a pun, or some combination thereof.

The laziness lie tells us that our worth as human beings is linked to our productivity, that our needs and limitations cannot be trusted and must instead be ignored, and that, no matter how busy we are, there is more that we should be doing.

Yeah, again, I've resisted that shit since practically Day One.

2. When you feel “lazy,” you’re doing too much.

For my book, I interviewed some of the busiest, most stressed, most burned-out people... What I found across the board was that each extremely busy person felt like a failure and was somehow convinced they were lazy.


Newsflash: Most of us are crap at self-diagnosis.

The answer is to stop buying into the laziness lie and start reframing how we set priorities in our lives. When you feel like you’re not doing enough, the answer is to find something to cut back on or let drop.

Hm... I don't know... what should I cut back on? Playing video games or watching TV shows?

3. You aren’t “wasting” time. All your time is accounted for.

Decades of productivity research shows that, at most, the average worker can only focus on job tasks for about three to four hours per day.


I guess I brought that average down single-handedly. My limit for focusing on job tasks was approximately three nanoseconds.

4. Embracing consent means rejecting "lazy."

In a culture that hates laziness, it is difficult for a person to assert their boundaries and confidently say “no” to anything.


Huh. That's odd. I get told "no" all the time.

If we want to resist the laziness lie, we need to learn to embrace our feelings, including emotions like apathy and annoyance, which people often don’t like facing in themselves.

It is true that once I initially ran out of fucks to give, my life became ten times easier. Apathy is power.

5. Action isn’t morally superior to inaction.

In a world shaped by the laziness lie, work is equated with goodness, and doing “something” is almost always seen as superior to doing nothing. This leads to a lot of problems, like activist fatigue or activist frenzy, where people are constantly posting misinformation and poorly researched calls to action online because they are so desperate to want to do good.


Hm... I've been assuming that most of those calls to action are cackling Russian trolls doing social engineering.

We don’t have to be heroes, and we don’t constantly need to go go go. We can take a step back and cultivate relationships instead.

The only relationship I want to cultivate is with my bed. Sometimes I cheat on it with the couch.

Anyway, clearly none of this advice is useful to me (hence why I'm not rushing over to Amazon to buy the book), but I hope today's double whammy of "how to write a book pitch" and "how to get on your ass and be lazy" will be of benefit to someone reading this. That is, of course, if you made it this far and didn't give up out of laziness.
June 9, 2021 at 12:07am
#1011528
Entry #3 of 8 for "Journalistic Intentions" [18+] -- as usual, I pulled this one at random, but honestly, if it hadn't come up, I'd probably cheat by making it my eighth entry anyway.



*Video* Education, education, education


The differences between US and UK terminology and society have been an interest of mine for a long time. When I was a kid, lo these many years ago (but still after the Revolution), I read both American and British children's books. This led to a good bit of confusion, some of which persists to this day.

I've also been known to use British idioms from time to time, like when I say I can't be arsed to do whatever. I do generally use American spellings of words, though some spellings probably slip through. For instance, I'm pretty sure I've used both "canceled" and "cancelled," the latter of which is generally British. Still, I don't have a colourful sense of humour.

Now, I've joked in here before that they invented this language, but we perfected it. Just to be clear, that is, indeed, a joke. But it does help to have a bit of guidance on the differences. Most famously, probably, Rowling changed one of the books in her famous series from "Philosopher's Stone" to "Sorcerer's Stone," apparently specifically because American kids would think of old boring guys with beards if the former were used. (To be fair, most philosophers were old boring guys with beards, except for Nietzsche, who had an epic mustache, and Camus, who wasn't old and kept his face shaved).

Personally, I think the world would have been a better place if they didn't have a separate Freedom Edition of Potter, but instead took the opportunity to introduce the little brats to UK English.

And yes, I know Rowling is cancelled now. Or was she canceled? I can't keep it straight.

Like I said yesterday, I'm currently going through the source material before going to see Cruella on Thursday. Disney has long had a... difficult... relationship with British stories, and the Dalmatians movies are no exception. Still not as cringeworthy as Mary Poppins, though. Point is, I hadn't seen the 1961 cartoon since like 1970, and I'd never bothered with the live-action 1996 remake, which, apart from Glenn Close doing her best "The Devil Wears Pongo" impression, Hugh Laurie being Hugh Laurie, and a bunch of disgustingly cute puppies, turns out to be fairly unremarkable. In any case, the movies reminded me that I'd read the actual 101 Dalmatians book (with the numbers spelled out, which I can't be arsed to do) way back in my misspent youth, and it was one of the works of fiction that introduced me to Britishisms.

Incidentally, I'm not bothering with any of the sequels, and I don't hold out much hope for Cruella. I mean, I get her as a villain? But this trend toward glorifying the Bad Guy (or Gal) is rather off-putting. She's a puppykiller, for fuck's sake -- perfect antagonist material, as long as she doesn't actually get her way. What's next, are we going to get a movie with a pedophile as the protagonist? (Yes, yes, I know, Lolita, whatever -- no one in that movie was a sympathetic character, not even the girl or Jeremy Irons.)

Where was I? Oh, yeah, the video above. Look, I know it can be a pain in the ass (or arse) to click on all these embedded videos, but that one's worth a watch. The guy in the video at first seems like some updated version of Alistair Cooke from Masterpiece Theater, but it's all a cover for a dry British wit (which is almost as good as dry British gin), while at the same time being, well... educational.

And if you're reading this blog in the first place, I can only assume you like a side of humo(u)r with your facts and my opinions.

The ultimate point here is: yes, the US and UK educational systems are different. But both seem to work, so why bother comparing them? Well, partly because almost all knowledge is worth having, but also partly to foster better cross-pond communication. After all, as the narrator points out (paraphrasing Shaw, apparently), we're separated by a common language.
June 8, 2021 at 12:01am
#1011473
Sometimes you just have to issue a correction. Well, not me, obviously. But other people.

Neandertals don't deserve their bad, dim-witted reputation  
Our hominin ancestor had bigger brains and probably went extinct with climate change. Who are we to judge?


Just kidding, of course. When I'm wrong about something, I'll admit it when I find out otherwise. But in this case, it's a matter of science doing what science does: being self-correcting.

Unfortunately, lots of people stick with what they first hear and then ignore any evidence to the contrary. Well, here's some contrary evidence.

On March 3, 2021, the governors of Texas and Mississippi announced that they were lifting their respective mask mandates, prompting criticism from President Biden, who called the move "Neanderthal thinking." Biden was implying that lifting the mandates was a primitive act — but this understanding of Neandertals is an outdated stereotype, unsupported by modern research.

That, by the way, is not an invitation to argue about politics or politicians.

Neandertals (Homo neanderthalensis) are an extinct species of hominin — the taxonomic name for a tribe that includes living and extinct humans and their ancestors. They lived throughout Eurasia up until about 40,000 years ago. One of the earliest Neandertal fossils was discovered in the Neander Valley in Germany, which is where the species gets its name. Interestingly, the contemporary word for “valley” in German is “tal,” so anthropologists tend to call the species “Neandertal,” not Neanderthal (dropping the 'h').

Completely irrelevant aside: Apparently, there were once very productive silver mines in another German-named valley, Joachimsthal (using the older "th" construction). Coins minted from there were called "Joachimsthalers" or "thalers." This got Anglicized as "dollars." Or, well, I may have some of the details wrong, but that's the gist of how "dollars" got their name.

This is one of the things I'll correct later if I'm wrong, but I can't be arsed to look it up again now. I'm planning on going to see Cruella on Thursday, and I'm busy delving into the source material (Dalmatians movies), so my time is somewhat limited. As far as I can tell, though, the "dal" in "dalmatian" has no linguistic connection to tal, thal, or dollars. Pounds, maybe, if you're Cruella.

If, as research suggests, Neandertals behaved similarly to humans, we either need to rethink our definition of human, or update our definition of Neandertal.

Given that they're all gone, except for traces of genetic material in most of us "sapiens," this may be a moot point.

Where did the brutish stereotype that plagues Neandertals originate? This association can be traced back to 1911, when French paleontologist Marcellin Boule described a newly discovered Neandertal fossil from the La Chapelle-aux-Saints site in France. Boule wrote that the fossil had primitive characteristics, including a hunched posture, big, divergent toes, and large brow ridges which, he argued, signaled a lack of intelligence.

I'm betting that was a conclusion born of phrenology. Science has had a few dead ends, and phrenology was one of them. Unfortunately, as with homeopathy, its effects linger.

Boule's original interpretation of Neandertals was based on an incorrect reconstruction of their skeletons.

But hey, you gotta start somewhere, right?

The wider build and shorter limbs common in Neandertal skeletons has also been seen as primitive compared to early humans, though researchers now argue these were adaptations to a colder climate.

"Primitive" is an unfortunate word. They were well-adapted to their environment, until they weren't. Calling them primitive is like calling a lion primitive compared to a tiger.

It's clear that incorrect early reports shaped assumptions about the behavioral and anatomical nature of Neandertals.

The brutish, sloped-forehead caveman stereotype, usually wearing skins and carrying a club, persists in cartoons to this day. Because it's a convenient shortcut for pre-agricultural humanity (or whatever), it's going to be hard to shake that image.

Dinosaurs once faced a similar issue: throughout the first half of the twentieth century, the general public believed dinosaurs were sluggish and cold-blooded. It wasn't until the 1960s that a small scientific revolution, sparked by new discoveries and led by paleontologist John Ostrom, suggested that dinosaurs were probably active, warm-blooded animals.

Like I said, science is self-correcting. Eventually.

Perhaps the Neandertals need their own revolution — a Neandertal rebirth, if you will — to help the general and scientific community understand that they were a sophisticated, cultured species who made art, had some similar anatomical features to humans, and who went extinct due to a changing climate, not inferiority.

Yeah, look, I get that the author is trying to make a point about climate change. Thing is, there weren't all that many Neandertals (comparatively speaking) in the first place, and obviously there was some interbreeding going on with Sapiens, who at this point number in the billions. Whatever your personal perception is of climate change, it's unlikely that we'll go extinct because of it. Which is not to say that things won't take a major turn for the worse.

In any case, I'm mostly presenting this in my long-standing crusade to counter outdated popular interpretations of science, not to argue about climate change or politics.
June 7, 2021 at 12:02am
#1011420
Oh, look. Another "time is an illusion" article.



Time feels real to people. But it doesn’t even exist, according to quantum physics. “There is no time variable in the fundamental equations that describe the world,” theoretical physicist Carlo Rovelli tells Quartz.

Okay, well, I readily admit that this guy has studied the subject more than I have. Even so, we immediately run into definition problems. One of these is that while, yes, time disappears at the quantum level, well, so does solid matter; everything at that scale is an energy vibration in a field. It's only in aggregate that we get sand and people and planets and whatnot. Similarly, aggregate a bunch of quantum vibrations together and you get things that are affected by time.

Nevertheless, I think it's a good thing to look at this point of view.

Rovelli’s book, The Order of Time, published in April 2018, is about our experience of time’s passage as humans, and the fact of its absence at minuscule and vast scales.

I'm going to have to put a [citation needed] on the "vast scales" thing. We can see interactions of galaxies far, far away, occurring in some time frame, and even the cosmic microwave background is what it is because of the time elapsed since the Big Bang.

I can't be arsed to order that book, though, so I'll just leave that quote.

Time, Rovelli contends, is merely a perspective, rather than a universal truth. It’s a point of view that humans share as a result of our biology and evolution, our place on Earth, and the planet’s place in the universe.

Right, because obviously, there was no time before we were around to measure it. Then humans came along and, poof, suddenly time starts to pass.

In fact, Rovelli explains, there are actually no things at all. Instead, the universe is made up of countless events. Even what might seem like a thing—a stone, say—is really an event taking place at a rate we can’t register. The stone is in a continual state of transformation, and on a long enough timeline, even it is fleeting, destined to take on some other form.

"On a long enough timeline." We can't even talk about whether time exists without invoking time language. To me, this is all the proof I need that time -- and rocks -- exist.

This seems to be another version of the argument that goes something like "if it's ephemeral, it doesn't really exist." I've rejected that argument out of hand, and while I'm open to counterarguments, my own perspective is that it's only that which is fleeting that can be called "real."

By "fleeting" I mean anything from picoseconds for certain phenomena, to billions of years for things like stars. If you're going to claim that stars don't exist, well, I can't even find common ground with you.

Rovelli argues that time only seems to pass in an ordered fashion because we happen to be on Earth, which has a certain, unique entropic relationship to the rest of the universe. Essentially, the way our planet moves creates a sensation of order for us that’s not necessarily the case everywhere in the universe.

I am not aware of any evidence that time doesn't exist elsewhere in the universe. Sure, as we saw here a few days ago, there may be inaccessible areas of the universe where the rules are different, but in the observable universe, both space and time are things.

But that paragraph does hint at my understanding of the nature of time; it's tied to entropy. The "past" is a time of lower entropy and the "future" is a time of greater entropy. To be simplistic about it. How the universe began in a state of minimum entropy is, admittedly, an open question (as far as I've seen).
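
Stated as a formula (my gloss, not the article's): for an isolated system the second law says entropy never decreases, and that one-way direction is what lets us label one end "past" and the other "future":

    \frac{dS}{dt} \ge 0 \quad \text{(isolated system)}
    \qquad \Longrightarrow \qquad
    t_1 < t_2 \;\Rightarrow\; S(t_1) \le S(t_2)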

If all this sounds terribly abstract, that’s because it is. But there’s some relatively simple proof to support the notion time is a fluid, human concept—an experience, rather than inherent to the universe.

Right, because other animals don't experience time. Which is why my cat starts begging for food right as the accursed daystar begins to approach the western horizon. That, by the way, is another artifact of perception. I know that the planet's rotation moves my location into the side facing away from the sun. And yet, from my limited point of view, it looks exactly like the sun is setting. There's no difference in reality; just in how I describe it. This is analogous to looking at time from a different point of view.

My limited perspective also makes it look like the earth is flat. It is not, and simple tests can reveal that. If you believe it is, boy are you in the wrong place right now.

Also, the arguments that follow aren't "proof."

Imagine, for example, that you are on Earth, viewing a far-off planet, called Proxima b, through a telescope. Rovelli explains that “now” doesn’t describe the same present on Earth and that planet.

Fair enough, which is exactly my argument for why there is no such thing as "the present." There's only the past and, potentially, the future.

This might sound strange, until you consider something as mundane as making an international call. You’re in New York, talking to friends in London. When their words reach your ears, milliseconds have passed, and “now” is no longer the same “now” as it was when the person on the line replied, “I can hear you now.”

Same argument. There's no new information here. All this does is reiterate that information cannot travel faster than the speed of light. And when you talk about "speed," you're talking about distance traveled in some period of time. Again, using time in an argument for the nonexistence of time is inherently contradictory.
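
For what it's worth, the "milliseconds" figure checks out. A quick back-of-the-envelope calculation (my numbers; the ~5,600 km New York-to-London distance is my assumption, not from the article):

    c_km_per_s = 299_792.458   # speed of light, km/s
    distance_km = 5_600        # rough NYC-to-London great-circle distance (assumed)
    delay_ms = distance_km / c_km_per_s * 1000
    print(f"minimum one-way delay: {delay_ms:.1f} ms")   # ~18.7 ms, before any network overhead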

The best I can make of that is that time doesn't work the way our intuition says it does, which is a trivial argument. At quantum scales, nothing works the way our intuition says it does. We've known this for a century.

Consider, too, that we don’t share the same time in different places. Someone in London is always experiencing a different point in their day than someone in New York. Your New York morning is their afternoon. Your evening is their midnight. You only share the same time with people in a limited place, and even that is a relatively new invention.

Oh for fuck's sake, now they're conflating the concept of "time" with the ways we measure it, and also noting that we live on a rotating sphere which is only half-illuminated. Rotation implies time. Yes, clocks are arbitrary and standardized, and time zones exist for our convenience. So what? That says nothing about the physical concept of time.

Time even passes at different rates from place to place, Rovelli notes. On a mountaintop, time passes faster than at sea level. Similarly, the hands of a clock on the floor will move slightly slower than the hands of a clock on a tabletop.

A restatement of part of Einstein's theory of general relativity. Also, by "slightly," they're talking about minuscule fractions of a second. Yes, your head is older than your feet because you spend most of the day with your head above your feet. This is an effect that it takes incredibly precise instruments to measure.
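
How minuscule? A rough sketch using the weak-field approximation (fractional rate difference ≈ gh/c²), with my own ballpark numbers:

    g = 9.81                         # m/s^2, Earth surface gravity
    c = 299_792_458.0                # m/s, speed of light
    seconds_per_year = 365.25 * 24 * 3600

    def clock_rate_difference(height_m):
        # Approximate fractional difference in tick rate between two clocks
        # separated vertically by height_m near Earth's surface.
        return g * height_m / c**2

    for height_m, label in [(1.7, "head vs. feet"), (3000.0, "mountaintop vs. sea level")]:
        frac = clock_rate_difference(height_m)
        print(f"{label}: {frac:.1e} fractional, ~{frac * seconds_per_year:.1e} s per year")

Nanoseconds to microseconds per year, which is exactly why it takes atomic clocks to notice at all.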

But you know what? We have those incredibly precise instruments. What do they measure? They measure time. If you can measure something, it is not nonexistent.

Likewise, time will seem to pass slower or faster depending on what you’re doing. The minutes in a quantum physics class might crawl by, seeming interminable, while the hours of a party fly.

Oh, good gods, that old canard. Yes, our perception of the passage of time varies depending on circumstance. That's a biological effect. Perhaps it has to do with differing rates of the chemical reactions going on in our brains, or maybe there's another explanation, but it doesn't disprove "time." Like, I took a nap today. I felt like five or ten minutes went by, mostly in dreams. When I woke up, though, the clock said it was two hours later. Contradiction! Obviously, therefore, time must not exist!

Meanwhile, cesium clocks continue to tick at the same rate regardless of our perception (assuming you don't move one up or down; see above). The decay of radioactive isotopes is itself a measure of time.

“Time is a multilayered, complex concept with multiple, distinct properties deriving from various different approximations,” Rovelli writes. “The temporal structure of the world is different from the naïve image that we have of it.” The simple sense of time that we share works, more or less, in our lives. But it just isn’t accurate when describing the universe “in its minute folds or its vastness.”

And yet, after all that snark, I can't really find fault with this conclusion. There's still a lot we have to learn.

Rovelli argues that what we experience as time’s passage is a mental process happening in the space between memory and anticipation. “Time is the form in which we beings whose brains are made up essentially of memory and foresight interact with our world: it is the source of our identity,” he writes.

Okay, I promised myself I wasn't going to rag on panpsychism for a while in here, but this argument touches on that concept. Time passes for a rock whether a human observes it or not. Grass grows whether we watch it or not. It did so long before humans arrived on the scene. Well, okay, maybe not that long, because grass is a relatively recent evolutionary development of plants. But the point is, time passes for the grass. It's a seed, it sprouts, it grows, it gets eaten by a bull, it turns into manure, the manure promotes the growth of new grass. It's not like this happens all at once. Does that mean the grass is conscious, too? The manure? No. It means time has elapsed, even if we're not standing there with a stopwatch observing it.

Without a record—or memory—and expectations of continuation, we would not experience time’s passage or even know who we are, Rovelli contends. Time, then, is an emotional and psychological experience.

And without senses, we wouldn't experience input from the outside world; therefore, the universe must not actually exist! (This is an argument that mystics and acid-trippers like to latch onto. It's best to liken that argument to the bovine manure in my last paragraph.)
June 6, 2021 at 12:12am
#1011375
Today's article may have more complicated implications than it would seem at first glance. So, first, the article:



The publication date is more than four years ago, but I don't think that changes much.

Grand Theft Auto, that most lavish and notorious of all modern videogames, offers countless ways for players to behave. Much of this conduct, if acted out in our reality, would be considered somewhere between impolite and morally reprehensible. Want to pull a driver from her car, take the wheel, and motor along a sidewalk? Go for it. Eager to steal a bicycle from a 10-year-old boy? Get pedaling. Want to stave off boredom by standing on a clifftop to take pot shots at the screaming gulls? You’re doing the local tourism board a favor. For a tabloid journalist in search of a hysteric headline, the game offers a trove of misdemeanors certain to outrage any non-player.

I've never played GTA, but from what I understand of it, it's a single-player game; that is, you control an avatar, and every other object in the game (including the ones that resemble people) is programmed. In that way, it's much like any other single-player video game. GTA, though, explicitly focuses on playing as a criminal, or "bad guy."

For the British artificial intelligence researcher and computer game designer Richard Bartle, the kaleidoscopic variety of human personality and interest is reflected in the video game arena. In his 1996 article “Hearts, Clubs, Diamonds, Spades: Players Who Suit MUDs,” he identified four primary types of video game player (the Killers, Achievers, Explorers, and Socializers).

And then suddenly the article switches to MUDs - originally Multi-User Dungeons, in which you might have NPCs, but other avatars are controlled by other human players. These were essentially online role-playing games, in the vein of D&D. They're mostly niche now, as people prefer multiplayer games with all the graphics and whatnot, like World of Warcraft or whatever.

Personally, I prefer single-player games, because gamers are assholes and generally treat other player characters as if they were NPCs.

Bartle’s research showed that, in general, people were consistent in these preferred ways of being in online video game worlds.

Well, as usual, I violate the general case.

While players sometimes experiment by, for example, playing an evil character just to see what it’s like, Bartle found that such experiments usually lead to affirmation rather than transformation. “Basically,” he said, “if you’re a jerk in real life, you’re going to be a jerk in any kind of social setting, and if you’re not, you’re not.”

Yeah, that hasn't been my experience. People who are kind in real life can blow off steam by being a jerk in an online game. People who are assholes in real life, though, are also assholes in games. I guess this says something about me, or humanity in general. Maybe. Or it says something about how we approach fantasy.

I like to think that on the occasions where I've played in multiplayer games, I wasn't a jerk to other players. But maybe they didn't see it that way.

They found that video games that allowed players to play out their “ideal selves” (embodying roles that allow them to be, for example, braver, fairer, more generous, or more glorious) were not only the most intrinsically rewarding, but also had the greatest influence on our emotions. “Humans are drawn to video and computer games because such games provide players with access to ideal aspects of themselves,” the authors concluded. Video games are at their most alluring, in other words, when they allow a person to close the distance between how they are, and how they wish to be.

This makes a lot of sense; however, I've taken great joy in playing (in single-player games) as a despicable assassin, and that's hardly my ideal self. I've also played as a paragon of virtue, and find it too limiting. Like, when playing Skyrim, a "bad guy" has a whole lot of options for quests and fun things to do, whereas a white-knight type gets bored quickly.

I solve this, in Skyrim, by being a good guy in public while sneaking around doing evil deeds where the guards won't catch me.

But, again... it's a game. It's fiction, not real life. Does me playing as an assassin in a game mean I'd want to kill people in real life? Of course not, any more than writing a story about murder would. And people love stories about murders, and bad guys doing bad guy stuff.

“It’s the very reason that people play online RPGs,” Bartle said. “In this world we are subject to all kinds of pressures to behave in a certain way and think a certain way and interact a certain way. In video games, those pressures aren’t there.” In video games, we are free to be who we really are—or at least find out who we really are if we don’t already know.

Yeah, I'm not buying it. I play games to be something other than who I really am, be it virtuous hero or despicable villain -- neither of which describes my actual personality -- or anything in between.

That's why they call it "role-playing." Did I play a druid in D&D because I love nature and the outdoors? Shudder, hell no. Did I play a Drunken Master monk because I like to drink? Well, okay, that one may have been typecasting. The point is, regardless of what this guy claims, I don't play aspirationally; I play characters that may or may not reflect my own real-life preferences, exploring different personality types that may or may not reflect who I am in real life.

I guess that's an occupational hazard for a writer.

Not every game, however, allows us to act in the way that we might want to. The designer, that omniscient being who sets the rules and boundaries of a game reality, and the ways in which we players can interact with it, plays their own role in the dance. Through the designer’s choices, interactions that we might wish to make if we were to fully and bodily enter the fiction are entirely closed off.

Yes, and that's annoying, but for single-player games, sometimes mods can fix that. Let me give you an example.

Fallout 4 was produced by the same people who did Skyrim, and it's another game I keep going back to because there are so many character-based ways to approach it. It's harder to go the "white knight" route in it, because the game setting doesn't really fit that archetype. Which is fine, because it mimics the real world in that sometimes you have to make tough "moral" choices. I put moral in quotes because, again, it's a single-player game and every other object in it is programmed into the software, including NPCs with their actions and reactions. So nothing you do in the game has any bearing whatsoever on the "real world." Want to blow a guard's head off? You can do that. Set off a nuke in a marketplace? You can do that. Take on a companion and use them as a pack mule? You can do that. There may be repercussions in the game, but those are programmed, too.

What you can't do, in the base game, is kill "children." By which I mean, there are certain NPCs who resemble young teenagers, and, unlike the NPCs who look like grown-ass adults, the program simply will not let you do anything to permanently remove them from the game world.

And there seems to be a general consensus that, while it's perfectly okay to sneak up behind an "adult" innocent NPC and "kill" them, somehow it's different when it's a "kid." But the only difference between the two is the programming of their size and maybe voice.

So I run a mod that lets you kill any NPC that isn't essential to the storyline. Including the annoying little brats.

This, apparently, makes me a horrible person. But it's okay, because I also run a mod that makes it impossible to kill the game's cats.

Priorities, people.

Anyway, the full article is worth a read, even if I quibble with the general conclusions. I guess being able to slip into different character types is mostly a writer thing, and most gamers aren't writers; if they were, they'd be writing instead of playing video games. There, too, I'm an exception.
June 5, 2021 at 12:01am
#1011329
Entry #2 of 8 for "Journalistic Intentions" [18+]. Today we're going to talk about... uh... what were we going to talk about? Oh yeah. Memory.



*Video* Memory fails us these days


My memory was questionable even in the Before Times. Like, 20 years ago or more, I used to have a mind like a steel safe when it came to jokes. I could hear a joke once and reliably repeat it.

Music was a good one, too. Lyrics, writers, performers, year of production, hell, even producers -- a whole slew of trivial metadata for each of a bunch of different songs.

What changed? Not age. I know I bang on about being old, but I'm not that old, not yet anyway. No, it was the internet. With an entire world's worth of information at my fingertips, I knew I could forget a lot of things and still consult the Great Oracle for anything I'd forgotten, and besides, even Wikipedia is way more accurate than my memory.

But it made me realize that for me at least, the key to remembering things is emotion.

Give me a bunch of dry facts and figures, and I'll forget them almost instantly. Oh, sure, if I drill them enough I can remember them, like even now I can remember the basic strategy for blackjack even though I haven't gone gambling in something like two years. But if something is a joke, it triggers my emotional "humor" response, so I can remember it. With music, well, I long ago offloaded my other emotions onto music anyway, so it makes sense that I'd remember anything associated with that art, humorous or not.

But if I can make a joke out of something that's not inherently funny, well, boom, there it is, stored in my long-term memory. It's how I could remember things like organic chemistry formulae; I'd make puns about it in my head and dredge them up during a test. I've since forgotten most of what I learned, but it worked at the time.

I wonder, now, why I don't use that more often -- turning everything into a joke. Probably because, all my life, people have sneered at me, "Everything's a joke to you, isn't it?" Well, yeah, it is; that's how I remember it. I have long had this image of me going to trial (for, obviously, something I didn't do) and making an offhand joke about it. I imagine the judge going, "What do you think you are, some sort of comedian?"

"Uh... actually, yes?"

It's like when I did my movie review yesterday. Okay, so, when it came out, I saw the first A Quiet Place. Thing is, I don't remember much about it, because while I recall it was a well-shot and beautifully put-together movie, I couldn't recall a detail of it if you'd put a gun to my head. I think it's because I didn't have a real emotional reaction to it. I remember appreciating it at the time, but... something about having to be quiet because the monsters were blind but had excellent hearing? That's about it.

Hence my joke review yesterday. Give it a year, though, and I'll probably have forgotten the whole movie. That's not a commentary on the film; it was pretty good. It's just that the only emotion it induced in me was the occasional shock of a jump scare. But what I will remember (probably) is how clever I thought I was at turning "silence" into the review.

I don't have many actual phobias in life. I mean, the things that most people are afraid of, like snakes or spiders, well, they have the capacity to startle me, but I'm not frightened of them exactly (the venomous kinds get my respect). Heights invigorate me, and while I'm not a big fan of crowds, I can deal with them. Wide-open spaces, tiny closed dark spaces? Meh, whatever. Needles? Big deal. There's a deep-seated fear in me of anything touching my eyeballs, though, and that's probably why the Universe is fucking chortling right now about me probably having to get cataract surgery, but that's about it for phobias... except for losing my memory.

I'm not sure that particular fear is entirely irrational; after all, I spent 20 years dealing with my parents' profound memory loss. Sequentially. As in, right after my mom died with dementia, my dad was diagnosed with Alzheimer's. Now, I'm well aware I don't have their genes, but that doesn't mean I won't lose everything that makes me me.

At least with cataract surgery, I can be pretty sure that if I can face that particular fear, once it's done, it's done, and my vision will be (partially) restored. There's still no treatment for dementia. Honestly, I'd rather die a quick, earlier, painful death by heart attack than waste away for a decade in some hole somewhere, not knowing who or where I am.

All of which is to say that I don't really sweat the small memory glitches. They're annoying, but not worrisome, not yet anyway. As they say, forgetting where you put your keys is normal. Forgetting what keys are is a problem.

Meanwhile, I try to get enough sleep, avoid stress (and drama of all sorts), and mostly I just have to repeat language lessons more than the average person in order to get the material to stick in my gray matter. For everything else, I'll just keep making jokes for as long as I can.
June 4, 2021 at 12:13am
#1011269
Today's article delves into a topic I don't generally like to explore in here, as it can lead to interpersonal arguments. Hell, it's been a cause of more than one war. But it's just too philosophically interesting for me to pass up. So without further disclaimer, an article from the BBC (which is actually reprinting a Conversation article)...



The simple answer here is, as with most headline questions, "no." Physics doesn't "prove" the existence of anything, but rather provides means to describe and predict the behavior of physical phenomena. When Newton figured out how the Earth and Moon orbited each other, and together orbited the sun, he wasn't out to prove the existence of Earth, Moon, or Sun, but to describe their interactions mathematically. We can see these bodies, so their existence is taken as given. When Einstein studied the photoelectric effect, he wasn't out to prove the existence of light. He was attempting to describe some of its observed properties. Whether any of these things exist or not is a matter for philosophers. I'm inclined to take the existence of the planet as a given, since I'm part of it, and to heap scorn on those philosophers who insist it's all an illusion.

But perhaps I'm reading too much into a headline. Even the BBC has to condense an article into a catchy line or question. Let's see what they're actually saying.

The author's introduction, a question posed by a dude from L.A. who shares a name with a late famous British TV host:

I still believed in God (I am now an atheist) when I heard the following question at a seminar, first posed by Einstein, and was stunned by its elegance and depth: "If there is a God who created the entire universe and ALL of its laws of physics, does God follow God's own laws? Or can God supersede his own laws, such as travelling faster than the speed of light and thus being able to be in two different places at the same time?" Could the answer help us prove whether or not God exists or is this where scientific empiricism and religious faith intersect, with NO true answer?

So yeah, just to be crystal clear, there are a lot of levels of quotes there, but again, it's a (probably paraphrased) Einstein quote wrapped in a question from a reader posted in The Conversation and reprinted by the BBC, and now once again quoted here in my blog.

And I'm not going to try to weasel out of this: my personal philosophy, again, is that it's not science's job to "prove" anything, but to describe behavior mathematically. But once more, the article:

I was in lockdown when I received this question and was instantly intrigued. It's no wonder about the timing – tragic events, such as pandemics, often cause us to question the existence of God: if there is a merciful God, why is a catastrophe like this happening?

This is simply a variation of theodicy: if God is good, why is there evil? Theologians and philosophers have been kicking this one around for at least a couple of thousand years, without what you'd call a consensus.

The idea that God might be "bound" by the laws of physics – which also govern chemistry and biology and thus the limits of medical science – was an interesting one to explore.

In my view, you'd first have to explain what you mean by "God." Without that definition, you can come to almost any conclusion.

If God wasn't able to break the laws of physics, she arguably wouldn't be as powerful as you'd expect a supreme being to be. But if she could, why haven't we seen any evidence of the laws of physics ever being broken in the Universe?

This doesn't quite sit right with me, either. It seems a variant of "Could God create an object so heavy that even God couldn't lift it?" I mean, there are neutron stars and black holes, both of which are pretty heavy and still participate in orbital dynamics, so... well, I'm getting off track. Also, if we see "the laws of physics ever being broken," we'd revise "the laws of physics," which aren't actually laws but predictive descriptions. It's happened in the past and hopefully will continue to happen. That's how science gets done.

We learn at school that nothing can travel faster than the speed of light – not even the USS Enterprise in Star Trek when its dilithium crystals are set to max.

Like many of the things we learn at school, this is not precisely the case. No object can accelerate to, or past, the speed of light; no information can exceed that speed; and space itself can, and does, cause distant objects to recede faster than light, which is why we only see part of the Universe. As for the Enterprise, it is of course fiction.
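
To put a rough number on that recession point (my illustration, assuming H0 ≈ 70 km/s/Mpc and the naive Hubble law v = H0·d; the actual edge of the observable Universe is a different, larger horizon, so treat this strictly as a ballpark):

    H0 = 70.0                  # Hubble constant, km/s per megaparsec (assumed value)
    c_km_s = 299_792.458       # speed of light, km/s
    KM_PER_MPC = 3.0857e19     # kilometres per megaparsec
    KM_PER_LY = 9.4607e12      # kilometres per light-year

    hubble_distance_mpc = c_km_s / H0    # distance at which v = H0 * d reaches c
    hubble_distance_gly = hubble_distance_mpc * KM_PER_MPC / KM_PER_LY / 1e9
    print(f"recession speed reaches c at roughly {hubble_distance_mpc:,.0f} Mpc "
          f"(~{hubble_distance_gly:.0f} billion light-years)")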

Fortunately, the author goes into this too; I'm not going to quote everything here.

Some argue that we therefore cannot be sure whether the laws of physics could be broken in other cosmic regions – perhaps they are just local, accidental laws. And that leads us on to something even bigger than the Universe.

Again, a quibble about "laws of physics." They wouldn't be "broken" in other regions, but perhaps "different." And again... not laws.

One headache for cosmologists has been the fact that our Universe seems fine-tuned for life to exist. The fundamental particles created in the Big Bang had the correct properties to enable the formation of hydrogen and deuterium – substances which produced the first stars.

This bit is related philosophically to the thing I posted a couple of days ago.

Some argue it's just a lucky coincidence. Others say we shouldn't be surprised to see biofriendly physical laws – they after all produced us, so what else would we see? Some theists, however, argue it points to the existence of a God creating favourable conditions.

But God isn't a valid scientific explanation. The theory of the multiverse, instead, solves the mystery because it allows different universes to have different physical laws. So, it's not surprising that we should happen to see ourselves in one of the few universes that could support life. Of course, you can't disprove the idea that a God may have created the multiverse.

That's the crux of the issue (pun intended) for me -- science isn't there to prove, or disprove, God.

The theory enables something called quantum entanglement: spookily connected particles. If two particles are entangled, you automatically manipulate its partner when you manipulate it, even if they are very far apart and without the two interacting. There are better descriptions of entanglement than the one I give here – but this is simple enough that I can follow it.

It's also simple enough to be misleading. From what I've read, entanglement doesn't allow for the communication of information faster than the speed of light.

So, there is something faster than the speed of light after all: quantum information. This doesn't prove or disprove God, but it can help us think of God in physical terms – maybe as a shower of entangled particles, transferring quantum information back and forth, and so occupying many places at the same time? Even many universes at the same time?

Eh... probably not. See above.

Has this essay come close to answering the questions posed? I suspect not: if you believe in God (as I do), then the idea of God being bound by the laws of physics is nonsense, because God can do everything, even travel faster than light. If you don't believe in God, then the question is equally nonsensical, because there isn't a God and nothing can travel faster than light. Perhaps the question is really one for agnostics, who don't know whether there is a God.

The question may or may not be nonsensical, but I'm pretty sure it's the wrong question in the first place, because it's not one that can really be answered in the framework of science.

This is indeed where science and religion differ. Science requires proof, religious belief requires faith. Scientists don't try to prove or disprove God's existence because they know there isn't an experiment that can ever detect God. And if you believe in God, it doesn't matter what scientists discover about the Universe – any cosmos can be thought of as being consistent with God.

And what I believe is that these descriptions of science and religion are themselves misleading.

The article ends with a quote from Terry Pratchett, which I find amusing for such a philosophical exploration. That alone would have been enough for me to want to share it.

So, to conclude my own take on the topic, I'm not posting this to step on any toes, but simply to present the question and some thoughts about it. I suspect that most religious people, at least here in the West, would balk at the idea of "proof" of God's existence on the grounds that faith doesn't require proof, and most scientifically-oriented people would reject the idea because that's not what science is for.

EDIT:

Oh hell, I nearly forgot with all the big concepts flooding my mind here.

*Film* *Film* *Film*


One-Sentence Movie Review: A Quiet Place Part II

.

Rating: 3.5/5
June 3, 2021 at 12:02am
#1011209
Normally, I cheerfully ignore articles that claim "If you're doing x to help the environment instead of doing y, you're actually making things worse." I tend to assume they're coming from people who are trying to sell/promote y.



Not so sure about this one, though.

And look, I get it. We got conned into low-flow toilets that use 1/3 less water, but you have to flush 3 times as often. And water-reducing shower heads that cut water use by 50% but you end up doubling the length of your shower.

As far as I can tell, though, LED bulbs are actually a good thing. But enough about that. Let's get back to shopping bags.

If you’re trying to contribute as little as possible to the two global calamities of climate change and the swirling gyres of forever-materials slowly filling our oceans, there’s a useful formula to keep in mind: Use fewer things, many times, and don’t buy new ones.

Also don't have kids, but for some reason no one ever mentions that.

But are plastic bags better or worse than paper? And what about a cotton tote? Let’s rip this bandaid off right away: There’s no easy answer.

Oh, good, the possibility of nuance. None of this "Wind farms kill birds and contribute to noise pollution so let's go back to coal, everyone!" "Signed, The Coal Industry."

The article goes on to lay out the pros and cons of our shopping bag choices.

The table below, using data from the Denmark study, compares the environmental performance of LDPE bags to other bags, assuming that the LDPE bags are reused once as a trash bin liner before being incinerated (incineration is the best possible disposal for these bags, according to the report).

Of course, you're going to have to go to the link to see the actual table. I have no idea how solid the information is, but let's take it at face value for now.

On a personal note, I get groceries delivered. Sometimes they show up in plastic bags. Some of said bags end up as trash bags. Most get dumped into the purposed recycling bin at the grocery store on the rare occasion that I can be arsed to go to the grocery store. What happens to them after that is Not My Problem.

Other times, the groceries come in paper bags, which end up in general paper recycling, and as far as I know, they actually get recycled.

The important numbers are at the bottom of that table I mentioned. To "have same cumulative environmental impact (water use, energy use, etc.) as a classic plastic bag," an organic cotton bag would need to be reused 20,000 times.

As a point of reference for that, 20,000 days is over half a century.
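
(Checking my own arithmetic here, under the generous assumption of one grocery trip every single day:)

    reuses_needed = 20_000        # figure quoted from the Denmark study
    trips_per_year = 365.25       # assumes literally one shopping trip per day
    print(f"{reuses_needed / trips_per_year:.1f} years of daily use")   # ~54.8 years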

Cotton fabric can last a long time, but rarely that long if it sees frequent use. And that's assuming you use the tote every day, which is not a good assumption.

The report also assumed the cotton could not be recycled, since very little infrastructure exists for textile recycling.

Maybe it can be used as a cleaning rag when it becomes holey?

Plus, knowing how many resources it takes to make a piece of cotton, treat fabric items in your home like infinitely reusable resources worth their carbon-mitigating weight in gold. Find new uses for old clothes, use textiles until they wear out, and when you want something new, buy vintage.

Ahem.

Bite me.

Also, no. Don't put all this shit on us. All that does is shift responsibility. Individually, we're a bunch of dumb-asses, me included. Collectively, well, we're still a bunch of dumb-asses and, as the past year or so has taught us, there are significant numbers of people who will simply never go along with a program for the common good. That just puts an extra burden on the people who do. But when small groups of people get together with a financial incentive to do so, we can be quite clever. So take your it's-all-your-fault mentality and shove it; find a way to fix the problem at the source, not at the destination.
June 2, 2021 at 12:04am
#1011143
This one's been hanging out in my queue for a long time, not being deleted. It's a miracle!

Why Earth’s History Appears So Miraculous  
The strange, cosmic reason our evolutionary path will look ever luckier the longer we survive.


Because if the planet had been destroyed, we wouldn't be here to note it?

This is known as an observer selection effect, and the same sort of bias might apply not only to perforated planes, but to whole worlds as well.

"Might?" It's also a type of survivorship bias, and kin to the Anthropic Principle.

When I was a kid, lo these many aeons ago, it was common for us to ride around in the backs of cars or pickups without car seats, or even seat belts. I did it, and I'm still alive, so obviously from my point of view, that was a perfectly acceptable practice.

If, of course, you don't look at childhood death/injury statistics from that era.

It could be that we’ve been shielded from these existential threats by our very existence.

Rarely have I seen a more perfect reversal of cause and effect. It's remarkable, really.

As Sandberg and his co-authors Nick Bostrom and Milan Ćirković write, “The risks associated with catastrophes such as asteroidal/cometary impacts, supervolcanic episodes, and explosions of supernovas/gamma-ray bursts are based on their observed frequencies. As a result, the frequencies of catastrophes that destroy or are otherwise incompatible with the existence of observers are systematically underestimated.”

In other words -- as I interpret this, anyway -- we've been lucky so far, but might not be so lucky in the future. Based on the next few paragraphs in the article, I think I got it right.
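
If it helps to see the selection effect in action, here's a toy simulation (entirely my own illustration, nothing from the paper): planets get hit at some true per-era catastrophe rate, but only the never-hit planets have observers left to tally catastrophes in their past.

import random

# Toy model of observer selection bias (my illustration, not from the paper).
# Each planet lives through 100 "eras"; each era carries a 2% chance of an
# observer-ending catastrophe. Observers only exist on planets never hit, so
# the rate they can read off their own history is exactly zero.
random.seed(42)
true_rate = 0.02
eras = 100
planets = 100_000

survivors = sum(
    1 for _ in range(planets)
    if not any(random.random() < true_rate for _ in range(eras))
)

print(f"true per-era catastrophe rate: {true_rate}")
print(f"fraction of planets that still have observers: {survivors / planets:.3f}")
print("rate those observers infer from their own spotless past: 0.0")

The surviving observers look back on a clean record and conclude the universe is safer than it actually is, which is exactly the bias the quote describes.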

If this is true, it might explain why our radio telescopes have reported only a stark silence from our cosmic neighborhood.

Or maybe it's because -- and yes, I know I've said this before -- what we're looking for is artificial signals, but evolution doesn't require the eventual emergence of technology-capable life. All it requires is fitness for survival, which takes many different forms; those qualities needed to build radios and rockets are irrelevant to species survival. Sure, one could argue that it allows for species survival through dispersal to other planets, but that's an effect, not a purpose.

One of the leading interpretations of this quantum weirdness is that all of the possible realities for the particle that were winnowed away in this act of observation actually are realized somewhere in branching-off parallel universes, by observers in parallel universes—parallel universes just as real as the one in which we happen to live. Though the universe may be infinite in distance it may also be infinitely divergent in this sort of ontological zoo.

Quibble: not actually infinite. But might as well be from our limited perspective, since the numbers involved are so freaking large.

This condition of eternal torment, where one might survive arbitrarily long by subjectively navigating the narrowing tributaries of the many-worlds time lines, staying alive through increasingly—and eventually astronomically—unlikely life paths, is known as quantum immortality, or quantum hell.

I comment on these things as I read them, not after (usually). I was just going to bring up the concept of quantum immortality. But I like "quantum hell" better.

“And eternal inflation implies an infinite universe. If there’s an infinite universe then yeah, if we wipe ourselves out there will still be plenty of life in the universe somewhere. And plenty of humans ... and plenty of Peters and plenty of Anthonys sitting in offices.”

This is a common misconception about infinity. The concept of infinity doesn't mean that there's definitely another "you" somewhere. Consider the set of positive integers, which is an infinite set. Never, not once in that entire set, does a number repeat. Sure, individual digits appear in different numbers, but there's only one 1, only one 42, only one 10^100. Infinity doesn't imply duplication.

Of course, the quote I quoted above was from someone far smarter than I am, so take my comment as you will.

To summarize, the article is worth contemplating, but I have to stress that it's all speculative. Philosophical, even. When I consider how many times I've come close to dying, but didn't, sometimes I wonder if the many-worlds hypothesis holds true, and the universe split into one containing alive-me and one containing dead-me.

But no, this is literally what Occam's Razor was invented for: "Never multiply entities unnecessarily." The many-worlds hypothesis may be compelling to physicists, but it's currently both unverifiable and unfalsifiable, and it literally, actually multiplies entities unnecessarily, when a more elegant answer is that we're only around to talk about this sort of thing because nothing has happened to prevent it.
June 1, 2021 at 12:02am
#1011075
Welcome to June. My inner optimist says it'll be different than May; my inner pessimist fears that he's right.

Let's start this month with an entry for "Journalistic Intentions" [18+]. There's still time to join in the fun there if you want.

*Questionr* CHOP Kids First


My state, Virginia, is often cited as having the largest percentage of vanity license plates in the US.

Honestly, I don't care one way or the other. Some of them can be amusing. Others have indecipherable meanings. But real creativity can come in when one combines vanity plates with special commemorative plates. And Virginia has a lot of them, too.

For example, my optometrist, who went to UVA, got UVA tags. These consist of a giant orange V logo on the left side, which gave her the opportunity to add ISION as her personalized tag, so when read in full, it says VISION. Cute. On brand. Glad to have someone that creative taking care of my eyeballs. On the other end, I hope whoever got AGINA   on one of those is a gynecologist.

Other commemorative plates are more controversial. Every time I see the one based on the Gadsden flag   (99% of the time, these are on oversized pickup trucks), I assume the driver is a raging asshole and give him (99% of the time it's a "him") a wide berth and a short "thank you" for advertising his assholery. That's what I like about freedom of speech; it lets me know who it would be a waste of time and energy to listen to. "Dont Tread On Me" is translated, by me at least, as "Tread on THEM," where "them" is any sort of minority.

Another problematic one is the "Choose life" logo. Okay, it wouldn't be problematic if our state government also issued "pro-choice" plates, but they don't. Come on, Virginia, you're a blue state now; get with the program.

The above should not be construed as an invitation to get into an argument about abortion or politics.

But the one VA specialty plate series I have hated above all others is the one that reads, in some sort of multicolored crayon font, "Kids First."

My reasons for raging about such a thing are complex; basically, I don't agree with "kids first." We have too much emphasis on kids as it is. We're not supposed to be adults living in a kids' world; they're supposed to be kids living in an adult world, and preparing for it. And too many people use "for the children" as a shield to mask their own Puritanical dislikes.

So, when I saw this plate, I about tore a muscle laughing.

I honestly can't remember if I saw it first on the road, or on the internet. I suspect the latter; there are over 7 million registered vehicles in Virginia, and besides, if I'd seen it on the road, I would have wrecked from laughing so hard.

Like most states, we're pretty picky about questionable tags. You can't get "FUCK," for example, or racially-charged terms. There are procedures in place to stop those before they ever get stamped. Somehow, though, this one got through.

At least, it did until some Puritanical wet-nose decided that "Eat The Kids First" was a sexual reference, and got the tag retroactively canceled.

You have to have a certain sort of mind to believe that it's a sexual reference. I don't have that sort of mind. I envisioned a zombie apocalypse, not a massive orgy of statutory rape. The thought never ever crossed my mind, and I have an exceptionally dirty one. In retrospect, I'm not sure that a joke about cannibalism was any better. Still, it was, obviously, a joke.

But it crossed someone's filthy, disgusting mind, and so there came to be just a little less humor on the road. Not in the world, though; as you can see from the link above, the plate is immortalized in pictures on the internet. And from what I've seen, similar ones have slipped through the cracks in other states, too.

It might be tempting to point out that I expressed a dislike of other kinds of specialty plates, so isn't it hypocritical of me to rail against someone else's objection to one? Well... consider that I have never tried to get any kind of license plate canceled, and maybe you can see that there's a difference. We don't have the right to not be offended. I'm perfectly capable of hating something without getting all activist about it. In fact, it's a core theme of my personality.

Which is not to say that I willingly support businesses run by people with what I consider the wrong politics. I just don't expect legislation to back me up on my boycotts. But that's another topic.

And it looks like the business that provided the above prompt probably got similar backlash. I risked getting put on a List by typing "CHOP kids first" into Google, and one of the results starts out: "At the CHOP Primary Care practice (formerly Kids First) in West Chester, PA, we believe that every child should have a medical home..."

Fortunately, Kid Sex Change   is apparently still in business, for those of us who have to giggle at such things. Word spacing, people. It exists for a reason. Use it! Or don't, and risk the juvenile attention of the internet.


© Copyright 2024 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted Writing.Com, its affiliates and its syndicates non-exclusive rights to display this work.
