Rated: 18+ · Book · Personal · #1196512
Not for the faint of art.
Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
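For the curious, here's a minimal sketch of the most famous example, in plain Python (the escapes helper, the iteration cap, and the grid bounds are just illustrative choices): the Mandelbrot set falls out of doing nothing more than repeating z → z² + c and checking which points stay bounded.

# Minimal sketch: iterate z -> z*z + c over a grid of complex numbers c.
# Points whose orbit stays bounded (here, through 50 iterations) belong to
# the Mandelbrot set; everything else escapes to infinity.

def escapes(c, max_iter=50):
    z = 0j
    for _ in range(max_iter):
        z = z * z + c          # the "very simple transformation"
        if abs(z) > 2:         # once |z| passes 2, the orbit is gone for good
            return True
    return False

for im in range(12, -13, -1):          # imaginary axis, top to bottom
    row = ""
    for re in range(-24, 13):          # real axis, left to right
        row += " " if escapes(complex(re / 12, im / 8)) else "#"
    print(row)

Run it and a rough ASCII silhouette of the set appears; finer grids and more iterations are where the intricacy and the beauty come in.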




February 12, 2023 at 12:03am
#1044647
Time for another trip to the olden days. This one's from way back in May of 2020: "Crossing My Path".

A 30DBC entry, this was written to a rather open-ended prompt: Start your entry today with the words: “I used to believe...”

I could have gone pretty much anywhere with this one. I chose what I consider to be a safe route; some of my former beliefs are profoundly embarrassing to me now. For instance, for a while there, when I was much younger (but old enough to vote), I thought that libertarians had the right ideas. It pains me to even admit that, but there it is.

As I noted to open the entry, though:

I used to believe that we stayed basically the same person all our lives. Then I read something recently, but didn't bother to save a link, about how some scientists did a study on young people, then studied them again when they were older, and what they found was that Old You doesn't really have anything in common with Young You. I mean, we've known this from a physical standpoint - cells get replaced and all that - but it kind of calls into question the idea of continuity of consciousness, and identity itself.

I notice this even more with Drunk Me. He gets me the best presents from Amazon, though.

So I banged on, in that entry that feels so long ago but really isn't, about an article I'd found about superstition.

But what struck me today, revisiting this, was that I didn't really talk about "belief" itself.

It's one of those words that's a bit slippery. Sure, one can quote a dictionary definition ("an acceptance that a statement is true or that something exists" or "trust, faith, or confidence in someone or something"), but dictionaries don't capture all the nuances of a word. It can also mean a conviction, which has got to be confusing to those not used to English, because "conviction" also means that the legal system has determined that you've done a crime. You can have a conviction that you didn't do the crime, but you've still been convicted.

To me, there are three major kinds of belief:

1) A trust in someone (possibly yourself), as in "I believe in you!" This isn't the same thing as saying "I believe in aliens," because the existence of the "you" in that sentence isn't in question. Which leads us to my second kind of belief:

2) Certainty that something exists, despite a paucity of evidence, no evidence whatsoever, or evidence against. Belief in space aliens, God, Bigfoot, fairies, Russell's Teapot, etc. is in this category. And then there's:

3) Certainty that something exists, with solid evidence to back it up. Gravity, the (mostly) spherical shape of the Earth, evolution by natural selection, or cats, for example.

Wait, okay, there's another one: 4) to understand or remember something in a certain way, as in "I believe they said they would do that." I'm not really counting that one, though; it's got nothing to do with certainty.

So what we need is a different word for the third kind, in my opinion (or belief, if you will). Because saying "I believe in evolution" isn't equivalent to "I believe in leprechauns" in any way except linguistically. Assuming that it does leads to all sorts of confusion; for instance, one might say "I believe in climate change" and someone who doesn't accept the science might retort with "Sure, and I believe in Mothman."

The difference is substantial. It's why I bang on about science in here and try not to use the word "belief" in those contexts.

But I used to believe it wasn't that big a deal.
February 11, 2023 at 12:01am
#1044590
Time for another entry in "Journalistic Intentions [18+]".

This one, selected at random as usual, features koi: Feeding Frenzy.

Ever notice that koi look like kaiju goldfish? There's a good reason for that, apart from most fish having a similar body plan adapted to swimming: they're both carp.

Apparently, though, they come from different subspecies. While goldfish were bred in China over a thousand years ago, selective breeding of koi is a more recent Japanese development. For the benefit of my fellow Americans with a limited sense of historical time, by "recent," I mean about 200 years ago.

And goldfish aren't really gold. By which I mean, no, of course they're not made of the chemical element Au, but their coloration always struck me as being more orange than yellow, though it can vary. Not everyone calls them goldfish, either; in French, the name is poisson rouge, which it doesn't take three and a half years of French lessons to know means "red fish."

They're not red, either, but different cultures have different conceptions of color.

No well-known metals are red or orange, so "goldfish" it is.

Koi obviously aren't limited to the orange color, but the point here is that none of these fish are the product of natural selection, unless you take my philosophical stance that we're part of nature, and so anything we accomplish, from air pollution to plastic to miniaturized orange carp, is also natural. "Artificial selection" or "selective breeding" are perfectly good terms for the process, though.

This strikes me as similar to dog or cat coloration. I don't mean different dog breeds, but, like, how a collie can be sable and white, or black, sable, and white. Or the different patterns of spots on dalmatians. While some koi resemble calico cats in coloration, the calico trait apparently doesn't breed true in cats, being instead a genetic thing associated with the feline X chromosome.

I have heard, but can't confirm, that before we nearly extincted tigers, some of them could also display the calico trait. Which I'd pay to see. As tigers are largely Asian in range, well, that brings us back to the origins of goldfish and koi.

All carp, incidentally, are food. I mean, everything is edible once, but carp has been a staple in many cultures, despite difficulties in its preparation (it's bony as fuck). In Eastern Europe, it can become gefilte fish, which, well, some of my people consider that to be food, whereas I do not.

As pretty as they are in koi ponds, the Asian carp is considered an invasive species in the US. I wrote about an attempt to contain them on the Mississippi River in a blog entry a while back: "Dam It". In that entry, I said:

That lock was closed a few years ago, but the structure remains. The reason for the closure was to stop the spread of Asian carp, an invasive species, upriver. So when I was touring the lock, looking into the waters upriver, what do you think I saw? Go ahead, guess.

That's right. An Asian carp.

On the plus side, I guess I caught my first Pokemon.
February 10, 2023 at 12:01am
#1044536
I don't really remember why, specifically, I saved this one. Perhaps I did so back when I didn't have a car to deal with.

How To Clean Car Seats
Let’s get that moldy cheeseburger cheese out from the seat cushion, okay?


Or you could, you know, not eat a cheeseburger in your car.

So you’ve hopped into your car only to find that something, somewhere, stinks to high heaven. After a little investigative nose work, you’ve discovered part of last week’s Taco Bell Nachos BellGrande fell between the seat and console when you hit a bump. Well, it seems like a great time to clean your car seats!

Or nachos. Or faux-nachos like the ones from Taco Hell.

Look, some things are okay to eat in a car. Anything that drips is not in that set. Crumbs are easier to remove than goo.

Cleaning your car’s interior, as well as regularly maintaining its luster, will not just make your everyday life that much better, it will also contribute to your car’s overall value when you decide to sell it—no used car buyer wants to find someone else’s toenails.

Toenails? Who in the name of all that's holy clips their toenails in the car? Come on.

And with a little elbow grease, a vacuum cleaner, and a bucket of soapy water, your car’s seats can go from trash-pit to palatial palace in just a few hours.

I got tired just reading that sentence. Also, "palatial palace"?

Estimated Time Needed: Two to four hours

Oh hell no.

We know a number of you likely also have children who are confined to child car seats.

Might want to take 'em out of there sometimes, especially when you're not driving them anywhere.

1. Remove the child's car seat from the car.

Okay, so, as you know, I've never had to deal with that sort of thing. And when I was a kid, we didn't have car seats; they just shut us in the trunk to roll around. But it seems to me that this step, alone, takes more than the four hours allotted to the entire car-cleaning marathon. And that's not counting the PhD in Car Seat Installation that you need to strap that sucker back in.

Reupholstering Your Car Seats

Listen, if you've had the car long enough to need the seats reupholstered, no one is going to give a shit if you clean the car, least of all you.

Source: my lived experience.

Getting Out Smoke Smell From Your Car Seats

Are you kidding? That smoke smell ensures that no one ever asks me for a ride anywhere.

It costs zero dollars if you clean your car seats and child seats yourself.

Well, that's a truckload of hokum right there. Way back at the beginning of the article, it listed a bunch of materials you'll need. None of those things are free. Not even the water. Hell, this whole "article" is an ad for car cleaning products.

If you have someone detail your car’s interior, then it could cost anywhere from $50 to a couple thousand dollars depending on who you enlist to do the job and the type of car you have.

And it's absolutely, completely worth every last dollar of that to keep me from having to clean something myself.

Life Hacks To Cleaning Your Car Seats

Important life hack: ignore any "life hacks" you come across on the internet.

Featured Car Seat Cleaning Products

"Zero dollars," my smelly-car-seat-sitting ass.
February 9, 2023 at 12:02am
#1044486
Everything's gotta start somewhere.



As the title suggests, this article (from Cracked) goes into some of those beginnings. Although I suppose an argument could be made that nothing ever begins; it's all on a continuum.

Many years ago, someone made a hammer. Afterward, following centuries of technological advancement and increased knowledge of human physiology, someone else made an updated version: a hammer with a rubber grip.

As useful as that is, I think the addition of a claw on the back is an even more important innovation. It's like putting an eraser at the end of a pencil, undoing what the other end does.

5. An Early Remote Control Used Little Hammers

But, sadly, not clawhammers. These are more like the hammers inside a piano.

The Space Commander used audio tones — ultrasonic ones, so you didn’t hear them and get annoyed. It created these tones by hitting aluminum rods with clappers. Pressing this remote’s buttons was like playing a xylophone.

Or that.

When I was a kid, we didn't have a remote at home. We had to get up and manually change the channel. While I'm sure these exercise sessions contributed to health, remember that we lived on a farm, so it was superfluous. When we'd go visit my aunt in New York City, she had a remote with all of two buttons, and I thought it was sorcery.

4. Vacuum Cleaners Used to Be Drawn by Horses

I'd make a joke about how it probably sucked, but you already thought of that. So did the article's writer.

3. The First 3D Movie Was Also Choose-Your-Own-Adventure

Movies have used 3D for longer than they’ve used sound. The first 3D cinema release was called The Power of Love and came out way back in 1922.

For the math-challenged, that was 101 years ago.

Audiences didn’t view the film using today’s polarized glasses. Instead, the glasses isolated different types of light by tinting one lens red and the other green. Some of you reading this may have grown up with similar glasses even long after the 1920s.

I think they used red/blue. 3D movies are kinda gimmicky, but not nearly as gimmicky as smell-o-vision.

In any case, this was kinda ahead of its time, even if it was a silent film.

2. Computer Games Used to Print Their Output on Paper

I have vague recollections of this. Mostly, lots of wasted paper.

But this is the cool one, because it verges on steampunk:

1. They Had Clockwork Coffeemakers in Victorian Days

I mean, literal steam.

They had clockwork teamakers in Victorian days, but we chose the word “coffeemaker” in the above title in case you’re unfamiliar with dedicated teamaking devices. Such devices, called teasmades, used to be popular in the U.K. relatively recently...

The article contains a helpful picture of a teasmade.

If this sounds somehow both primitive and overengineered, realize that these products were marketed toward British people who had been used to having servants but who no longer could afford them thanks to changing times.

Never underestimate the power of laziness. As I've said before, necessity may be the mother of invention, but laziness is the milkman.
February 8, 2023 at 12:02am
#1044434
I feel like this article was commissioned by Jell-O Corporation. Or, you know, whoever makes that stuff. (Googles) Oh, Kraft. Yeah, they have deep pockets, but they also have lots of other products that are actually good.

Jelly Is Ready for Its Redemption Arc
The polarizing format is due for a come-up, according to Ken Albala’s new book, “The Great Gelatin Revival”


"Jelly" is one of those words that has more than one meaning in American English. They're not talking about the fruit preserves, but the gelatin product made from cow feet. While this is popularly known as Jell-O, that remains a trademark, like Kleenex or Coke.

“I predict that we are on the threshold of a new aspic-forward aesthetic,” writes the food historian and University of the Pacific professor Ken Albala in The Great Gelatin Revival, out January 10 from University of Illinois Press.

Maybe that should be "food historian, University of the Pacific professor, and paid Kraft shill..."

By the way, this article is from last month, so that book is still new.

No doubt, aspics — those vintage delicacies studded with vegetables and congealed meats — remain niche, but jelly? Jelly is having a moment.

"Moment" is probably all it is.

Moving beyond the aesthetic, Epicurious and the New York Times attempted to actually put jellies back on holiday tables, and grown-up Jell-O shots constituted one of last year’s biggest drink trends, according to Punch.

Who decides this crap? Why isn't it me?

That said, Jell-O shots are for college kids who haven't developed a taste for fine liquor yet. There's nothing inherently wrong with that—I started out doing embarrassing shit, too—but, ultimately, it's a phase.

“Periods that embrace the jiggle are always followed by periods of disgust sometimes so intense and visceral that entire generations lose the skill to make them,” Albala writes.

You'd think we'd learn, but no, we're incapable of that.

Jelly—this kind—is forever associated with hospital rooms and nursing homes, as far as I'm concerned.

I sat down with Albala over Zoom to talk about why jelly is on the come-up, and why more of us might reconsider all things jiggly.

As you might expect, the rest of this article is an interview transcript. I won't reproduce most of it here; it's at the link if you're interested. There's just one part of one response that made me save this for sharing, because I found the perspective interesting:

I think that we’re beginning to look at the problems in the world, environmental ones especially, and look to science to solve those. The fact of an Impossible Burger — and lab-grown meat is going to happen very soon — I think is an indication that people have other priorities now and they’re really not as distrustful of science as they used to be, or that tech-forward food is something that that doesn’t really bug this next generation of people.

Eating animal collagen is one of those things that usually aligns with periods in history that are very science-forward and very trustful of science; there are other periods that are not. You think of the late ’60s-early ’70s, and the hippie generation — that’s when Jell-O begins to take its precipitous downfall, because people don’t want artificial colors and flavors. I predict that because of the way things have gone for the past 20 years, that we’re going to be back in a period that is not just pro-science and pro-artificially made food, but I think Jell-O is going to come back too, in a weird way.


I only wish he was right. He wrote the book before the pandemic, apparently, and those years showed us that, far from being more trusting of science, a large number of people don't understand it or care for it at all. And it's not one-sided, politically; you also get people distrusting of GMOs, to the point where "GMO-free" is actually a marketing gimmick for food manufacturers. (Here in reality, GMO products aren't evil; they're the only way we're going to be able to feed the teeming masses in the future.)

One shouldn't be blindly trusting of science, though. Especially food science. But the link between science acceptance and trends toward more lab-created foods would be interesting—if it exists. Right now, it's apparently just his observation.

Now, none of this should be construed as me telling you what not to like. I find jelly relatively inoffensive, myself, except for weird-ass aspics. I even used flavorless gelatin in a dessert recently—a lemon chiffon thing that turned out okay. And I gotta admit, the potential for alcoholic Jell-O is intriguing to me (the stuff beyond Jell-O shots, some of which is featured at that link). This is more about me hating "trends."
February 7, 2023 at 12:01am
#1044382
Another entry for "Journalistic Intentions [18+]". This one can be found here: Faces Everywhere

Pareidolia is popularly known as the tendency to see a face or figure in accidental places. Examples include the Man in the Moon, the Face on Mars (it's not a face), and the ever-popular Jesus toast.

Or, speaking of Jesus, some people find him in a dog butt. I'm not judging. But it does confound the proverbial dyslexic agnostic insomniac, who stays up all night wondering if there's a dog.

In the case of today's prompt image, though... I'm not seeing it. Looks more like a grasping hand to me. Which is a form of pareidolia, but the title of the image led me to expect to see faces. Generally if I go into something with that expectation, I'll see it. Not this time.

This could be related to my partial face-blindness. A while back, I tried to get back into my bookface account so that I could delete it. I'd forgotten the password, so they issued me a challenge: identify the people in these photographs, ripped from my friends list.

Leaving aside the ones who I only knew from online, whose faces I wouldn't recognize anyway, I haven't seen some of my failbook friends in decades. And even absent those, I'm just not that great at identifying faces. Hell, one time I was at a bar doing a trivia contest. I had barely even started drinking yet, and the goal was to identify the famous people in a set of photos. I recognized a couple of them, but one that I didn't was... Bruce Springsteen, circa 1985. You'd think if I could recognize anyone, it's him. But no.

Long story short, this is why I still have a facebuck account: I can't get back in to delete it. And it's also why I don't always see pareidolia.

One thing that occurred to me a while back, though, is that artists rely on pareidolia. That is, I don't mean the examples in the Wiki link above, but, like, cartoon artists. They can throw together a few lines, and it looks like a general face (even to me). That's deliberate, though, and doesn't involve chance formations in nature.

Now, I could go into speculation about why we see faces everywhere, like the late lamented Old Man of the Mountain in New Hampshire, and make some guesses about how it was advantageous for our ancestors to think they saw faces. But I won't, because that's all it would be: speculation. The facts we know include that we do see faces or figures in natural objects. And sometimes that's fun.

But sometimes, it leads to wild conspiracy "theories," like with the Mars thing. Some alien must have carved it! Well, no, it's just pareidolia.

Which doesn't explain why there's a smiley face on Mars. Do you see it there? That's the crater Galle in the Argyre Planitia. Might be easier to parse in this photo.

Okay, yes, it totally does explain it. Completely coincidental. Just because Mars is the only known planet to be inhabited solely by robots doesn't mean that said robots did that for our benefit.

Have a nice day! *Smile*
February 6, 2023 at 12:01am
#1044333
To start with, I'll note that this is a book promotion. That's okay; it's at least somewhat informative.

This Is Your Brain on Architecture
In her new book, Sarah Williams Goldhagen presents scientific evidence for why some buildings delight us and others—too many of them—disappoint.


Because architecture is art, and some art delights us and others disappoint?

(I know an argument can be made that architecture isn't actually art because architecture has a function, but it still has a huge artistic component.)

One last note before I dive in: the article is from 2017. That shouldn't make much difference, but it references a "new" book, which simply wouldn't be the case after six years.

Sarah Williams Goldhagen was the architecture critic for The New Republic for many years, a role she combined with teaching at Harvard University’s Graduate School of Design and elsewhere. She is an expert on the work of Louis Kahn, one of the 20th century’s greatest architects, known for the weighty, mystical Modernism of buildings like the Salk Institute in La Jolla, California, and the Bangladeshi parliament in Dhaka.

At the top of the linked page is a photograph of the Salk Institute. I'm no expert on architectural labels, but that building looks more Brutalist than Modernist. Perhaps it is both.

Several years ago, Goldhagen became interested in new research on how our brains register the environments around us. Dipping into writing from several fields—psychology, anthropology, linguistics, and neuroscience—she learned that a new paradigm for how we live and think in the world was starting to emerge, called “embodied cognition.”

And this is why the article appeals to me: not necessarily because of the architectural slant to it, but because it features multidisciplinary science. Well, sort-of science; I'll get to that.

“This paradigm,” she writes in her magisterial new book, Welcome to Your World: How the Built Environment Shapes Our Lives, “holds that much of what and how people think is a function of our living in the kinds of bodies we do.” Not just conscious thoughts, but non-conscious impressions, feedback from our senses, physical movement, and even split-second mental simulations of that movement shape how we respond to a place, Goldhagen argues.

I mean... okay. I don't think that's a new concept, though. It wasn't a new concept in 2017. Perhaps it wasn't framed exactly like that before, but I spent my career working with architects, and I have the impression that they all think that way.

The research led Goldhagen to science-based answers for previously metaphysical questions, such as: why do some places charm us and others leave us cold? Do we think and act differently depending on the building or room we’re in?

The first question is a valid line of inquiry, in my opinion. The second should be blindingly obvious to anyone.

Architects intuited some of these principles long ago. As Kahn once noted of the monumental Baths of Caracalla in Rome, a person can bathe under an eight-foot ceiling, “but there’s something about a 150-foot ceiling that makes a man a different kind of man.”

Like I said, they all think that way (well, all the ones I've met; I should guard against sweeping generalizations). The question I have is: can it be quantified? That would make this legitimate science.

As an example from my own field, consider a road. If you're driving down a country road that has 9-foot lanes with trees just off the edge, it feels different, and you'll drive differently, than if you're on a 12-foot-lane interstate with a lot of clear space past the shoulder, even if both roads are straight and level. The psychology of this has been quantified, and it's in the realm of science. Similarly, we know we feel different in a large bedroom (for example) than we do in a small one, but has that been quantified? What's the optimal size of a bedroom, considering this psychology, and how do different people react to different sizes?

The article jumps into an interview with the book's author:

At the time, there really was no intellectual paradigm for thinking about these questions. And then about 15 years ago, my husband handed me a book by someone who had written a previous book he had really liked. The title of the book was Metaphors We Live By. It’s co-authored by George Lakoff, who’s a cognitive linguist, and Mark Johnson, who’s a philosopher. The basic argument is that much of how our thought is structured emerges from the fact of our embodiment. And many of the ways those thoughts are structured are metaphorical.

Honestly, I'd rather read that book than the one this page is promoting. Not that I wouldn't want to read this one; it's a matter of priority.

One of the things I found was that, basically, [given] what we now know about human cognition and perception, the built environments we inhabit are drastically more important than we ever thought they were.

Okay, that's worth explaining in a book, I think. And it passes my sniff test.

Architects tend, particularly with parametric design, to emphasize overall aggregate form, and all that other stuff gets filled in later. And then, very often, it’s value-engineered out.

I understand the need for value engineering, but I despise the concept. One time, a site plan I did lost a drainage grate to value engineering, and then everyone wondered why the street started flooding every time it rained, and I got the blame until I pointed out that the drain I'd designed never got built. I'm not saying I was always right, but I've never put in a drainage device without reason.

Another thing is differentiated, non-repetitive surfaces. [The psychologist and author] Colin Ellard did a study of how people respond: He basically put sensors on people and had them walk by a boring, generic building. Then he had them walk past something much more variegated with more ways to [engage] visually and therefore motorically. He found that people’s stress levels, measured by cortisol, went up dramatically when they were walking past the boring building.

Okay, see, that's quantification. Science. It may have been a good study or it may not; I don't have the data. But it's on the right track.

The rest of the interview is worth reading, I think, because she raises some important points. But I'm not going to nitpick them (which is not to say that I completely agree), so no point in reproducing it here. Link's up there if you're interested.
February 5, 2023 at 12:01am
#1044278
Today's historical document comes from just shy of 14 years ago. It was part of a series of entries that I did to expand on a list of things about me. I mean, it's a blog; why wouldn't I talk about me?

Anyway, this particular entry is here: "Lassie".

From that day:

11. When I was a kid, I had a collie named Lassie. Yes, I named her. Give me a break; I was four years old at the time.

Incidentally, I've never used her name as a security answer anywhere, so... nice try, but no.

I've posted this before, but here's a picture of me with Lassie, from the rotation in my former blog:

Unlike many pictures, I could easily repost it here. But why, when I spent so much time typing in the link to the entry?

I don't know how old I was in the picture. Over four, obviously. I suck at estimating ages of kids or adults. No, it's irrelevant that the picture is of me. It was ancient when I put it in that entry. And after another 14 years, I'm even further removed from that brat, and even more embarrassed by the photo.

It occurs to me that I don't even recall the circumstances surrounding the photo. That's not our lawn. I don't recognize the setting at all. I have this vague notion that it might have been Florida, but I have no recollection of taking the dog with us to Florida. Or even who watched her when we vacationed.

I only keep it up because of the dog. She was a good dog.

I've never had another dog. Most of them annoy me, and those that don't, already belong to someone else.

That's not the main reason. Oh, it might have been in the forefront on that day, for that entry, but I rarely have only one reason for doing (or in this case, not doing) something. I like dogs, mostly. I just can't be arsed to do the work they require.

And also because I doubt I'd find one as cool as Lassie.
February 4, 2023 at 12:01am
#1044216
Here's a confession:

I'm not entirely confident when it comes to picture prompts.

But I do feel like it's important to try new things, to, as the kids say, step outside of my comfort zone and do something I'm not sure of, like this month's round of "Journalistic Intentions [18+]".

One thing I continue to do: pick prompts at random. Today, we have this lovely photo of a leaf under a blanket of water: Orange Submersion.

I've seen reports that at least one person is having trouble with xlinks, so if that hyperlink doesn't work for you, I can provide the raw URL on request.

Ever wonder why water is (mostly) clear? I have. I mean, apart from when it's murky from suspended particles, or sometimes when mixed with delicious booze (which is itself, in its pure form, clear). And it's clearly (pun intended) not the same as air; you can almost always tell where one ends and the other begins. The surface of pure water is easy to identify, but, for me at least, damn near impossible to render in a drawing. But then, I've never been very good at drawing... but I digress.

The best answer to why water is clear that I've been able to come up with is that this is not the right question to ask.

Life crawled up onto land "only" about half a billion years ago. In comparison to the four billion or so years since life began on Earth, that's a significant fraction of time, but it means that life was changing and evolving for 7/8 of its history underwater. And some of that life, at least the animal portion, found a competitive edge in being able to directly sense prey or predators: in short, vision is a very useful sense to possess.

It's my understanding that eyes evolved several different times. That is, there's not one proto-organism that gradually turned light-sensitive cells (which many organisms have, not just animals) into an eyeball, which then split off into different species. No, the proto-organism might have been mostly blind, and some of its descendants developed vision in different ways: camera-type eyes like ours, or the compound eyes of arthropods and such, or whatever.

But most of those eyes originally evolved underwater. Thus, they developed in such a way that vision would be an evolutionary advantage, which means being sensitive to a range of the electromagnetic spectrum to which water is mostly transparent.

Water isn't clear because of some innate property of it; it's clear, to us, because our distant ancestors evolved a sense that allowed them to see in it. Water blocks some other wavelengths.

But the other thing it does is change the direction of light at its boundary. Stick your arm into a fish tank, and it'll appear to bend. At some angles, the light doesn't escape the water at all, but reflects off of it like a perfect mirror (this is a function of index of refraction, and it's also how fiber optic cables transmit data). The surface is also partially reflective when viewed from above, which is how you get the artistic dappled effect from ripples, like in today's picture prompt.
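To put a number on that perfect-mirror effect: here's a quick sketch using Snell's law, assuming an index of refraction of roughly 1.33 for water (the exact value shifts a little with temperature and wavelength).

# Quick sketch: the critical angle for total internal reflection at a
# water-to-air boundary, from Snell's law n1*sin(t1) = n2*sin(t2).
# Refraction is only possible while the exit angle stays under 90 degrees;
# past the critical angle, all the light reflects back into the water.
import math

n_water = 1.33   # approximate index of refraction of water
n_air = 1.00

critical = math.degrees(math.asin(n_air / n_water))
print(f"critical angle ~ {critical:.1f} degrees")   # about 49 degrees

Any ray hitting the underside of the surface more than about 49 degrees away from straight up gets bounced back down instead of escaping.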

Leaf (pun absolutely intended) it to me to get all sciencey about a pretty picture. But I'm of the considered opinion that the knowledge gleaned from science can improve an aesthetic experience. I know the math and physics behind rainbows (or, well, I used to, and can easily find the information again), but that doesn't decrease their visual impact.
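For the record, that famous 42-or-so degrees can be recovered with a few lines of brute force. This is a sketch, assuming the usual single-internal-reflection model of a spherical droplet and n of about 1.333 for water:

# Rough sketch: the primary rainbow angle, by brute force.
# A ray that refracts into a spherical droplet, reflects once inside, and
# refracts back out is deviated by D = 180 + 2*i - 4*r degrees, where
# sin(r) = sin(i) / n (Snell's law). The rainbow sits at minimum deviation.
import math

n = 1.333                      # approximate index of refraction of water
deviations = []
for tenth in range(1, 900):    # angle of incidence i from 0.1 to 89.9 degrees
    i = tenth / 10
    r = math.degrees(math.asin(math.sin(math.radians(i)) / n))
    deviations.append(180 + 2 * i - 4 * r)

d_min = min(deviations)
print(f"minimum deviation ~ {d_min:.1f} degrees")        # about 138
print(f"rainbow angle     ~ {180 - d_min:.1f} degrees")  # about 42

Different wavelengths have slightly different n values, which is where the spread of colors comes from.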

And if today's discussion hurt your brain, just be glad I didn't go into the biology of what makes leaves turn color and fall off as winter approaches.

Another time, perhaps.
February 3, 2023 at 12:01am
#1044108
My second-biggest gripe here is: "Only four?"



Because I can think of a lot more than four.

But this is Cracked, so you gotta account for short atten- SQUIRREL!

The sheer variety and volume of food available in the world is enough to make both your eyes and mouth water simultaneously.

Especially if you're one of the millions who can't access it.

Dark humor is like food: not everyone gets it.

And yes, guy whose go-to-move at parties is to shut down casual chats by bringing up the saddest possible parts of the human condition, I do realize that starvation and food security is still a massive problem both in the U.S. and abroad. Can we move on now, and you can resume your job as a professional hitman for conversations?

Nah, I reserve that for the blog. But that's only because no one invites me to parties or casual chats.

Here, then, are four foods that, as a modern civilization, we can finally kick to the curb.

4. Necco Wafers

Congratulations, you've identified something more hated than candy corn.

The world has changed around you, while you’ve been chopping up sticks of sidewalk chalk and wrapping them in the sort of wax paper that feels like it was collected from an Egyptian tomb. We’re two months away from Gushers with LEDs in them, and you’re still trying to sell us a candy most monkeys would spit out.

Stop giving candymakers ideas.

3. Baby Corn

Not to be confused with candy corn, because as hated as candy corn is, at least it has a flavor.

I, like many American Jews, have a deep, insatiable love for Chinese food. Fried pork dumplings are very possibly my favorite food of all time, and if not, they would at the very least make it handily through the primaries.

I'm just leaving this in here to avoid comments/jokes about keeping kosher. It's not a thing for most American Jews.

However, there is one consistent, unwelcome invader in many entrees at Chinese restaurants: Those fucked-up little corns.

Truth.

2. Plain Cheerios

YOU SHUT YOUR WHORE MOUTH.

But the light is dimming for the default Cheerio in the modern world — the new cereals are faster, sweeter and more colorful.

That is exactly what makes Cheerios great.

Not that I eat cereal much anymore, but on those rare occasions that I do (usually as part of a free breakfast at a cheap motel), the default is Cheerios, not Chocolate Frosted Sugar Bombs.

1. Brazil Nuts

Unlike the other items on this list, I have absolutely no opinion about Brazil nuts. But I tend to avoid them because the idiots I knew as a kid had a rude name for them that I shan't perpetuate by noting it here.

I am sure there was a point in the history of the human race where the precious food inside a Brazil nut was needed for survival, but I highly question their continued relevance past the invention of fire. The value proposition of the Brazil nut is like breaking into a safe in order to retrieve a piece of dry toast.

I've never had to actually break into a Brazil nut. But you can't make that argument about Brazil nuts when there are so many other foods that are difficult to access, like coconuts or pomegranates (pretty sure that's a word meaning "apple made of granite").

Not to mention, there are so many other better nuts that are begging you to eat them! You’re telling me you’re putting on the blinders and digging past peanuts, pistachios and cashews just to draw blood trying to access the non-prize that is the meat of the Brazil nut?

Peanuts are barely food, pistachios are too much work (or too expensive if you get the pre-shelled kind), and the only purpose of cashews is to provide a unique shape in the trail mix.

Anyway, as I said, why stop at four? I can think of lots of foods whose time has passed, especially since Cheerios doesn't belong on that list at all.

For example: mass-produced milk chocolate. It's basically edible plastic with just enough cocoa so the food police can let them call it "chocolate." There are many tastier alternatives. Sure, they're probably more expensive, but we're not talking about price.

Your turn. What food would you want to see relegated to the circular file of history? (If you're vegetarian or vegan, saying "animal products" is cheating.)
February 2, 2023 at 12:01am
#1044036
Not a lot I can say about this one; I just thought it was too cool not to share.

Before I get into it, though, a quick note: I'll be participating in "Journalistic Intentions [18+]" this month (though not today). Check it out and join in—you have nothing to lose and everything to gain. Well, a couple of awards to gain, potentially, but that's not nothing.

And now, let there be light.

Astronomers Say They Have Spotted the Universe’s First Stars
Theory has it that “Population III” stars brought light to the cosmos. The James Webb Space Telescope may have just glimpsed them.


A group of astronomers poring over data from the James Webb Space Telescope (JWST) has glimpsed light from ionized helium in a distant galaxy, which could indicate the presence of the universe’s very first generation of stars.

Don't worry; the article explains this better later on.

These long-sought, inaptly named “Population III” stars would have been ginormous balls of hydrogen and helium sculpted from the universe’s primordial gas.

I guess "ginormous" is a scientific term now, probably along with "bajillion" and "metric shit-ton." As in, there were a bajillion ginormous stars in the early Universe, and each one weighed a shit-ton.

Theorists started imagining these first fireballs in the 1970s, hypothesizing that, after short lifetimes, they exploded as supernovas, forging heavier elements and spewing them into the cosmos.

"Fireballs" is misleading, but "exploded" is apt.

That star stuff later gave rise to Population II stars more abundant in heavy elements, then even richer Population I stars like our sun, as well as planets, asteroids, comets and eventually life itself.

Got that? Summary: Pop III came before Pop II which came before Pop I. No idea what they're going to name the fourth generation of stars. Population X, I hope. If this seems backwards, it's because the classification system was developed before people figured out the order of things. Again, the article explains this later.

The early Universe was mostly hydrogen, some helium, a tiny bit (relatively speaking) of lithium, and not much else. It takes nuclear fusion or other processes to create the rest of the elements. This is why we say we're made of star material.

Confirmation is still needed; the team’s paper, posted on the preprint server arxiv.org on December 8, is awaiting peer review at Nature.

Important disclaimer, so you can be That Person at the party when someone gushes about how astronomers found the first star or something.

Because they are so far away and existed so briefly, finding evidence for them has been a challenge.

Remember, we're at the center of an inside-out universe; what's furthest away is oldest. Well. We appear to be at the center, anyway; so does every other point in the Universe.

The rest of the article goes into more depth, and while it's not nearly as paradigm-shattering as, say, detecting life on another world (a subject I've tackled in here before), getting confirmation, or even negation, of this model of the history of star formation would be a step forward in astronomy and cosmology. Thus, the cool factor.
February 1, 2023 at 12:01am
#1043972
The hated month of February begins. Appropriately enough, the random number generator pulled up a plague article.



And no, they didn't pinpoint the exact flea.

As the deadliest pandemic in recorded history – it killed an estimated 50 million people in Europe and the Mediterranean between 1346 and 1353 — it's a question that has plagued scientists and historians for nearly 700 years.

...really, NPR? You're going to do that pun? Who do you think you are, the BBC?

Incidentally, other sources put the number of victims higher than 50 million. That would be a significant number of people today; at the time, it was likely over 10% of the world's human population.

Now, researchers say they've found the genetic ancestor of the Black Death, which still infects thousands of people each year.

Which is not what the headline implied.

New research, published this month in the journal Nature, provides biological evidence that places the ancestral origins of Black Death in Central Asia, in what is now modern-day Kyrgyzstan.

See what lack of vowels will do?

Oh, and by "this month" they mean when the article was published, back in June of last year.

What's more, the researchers find that the strain from this region "gave rise to the majority of [modern plague] strains circulating in the world today," says Phil Slavin, co-author on the paper and a historian at the University of Stirling in Scotland.

One could ask where that strain came from in turn.

The article goes on to explain the various strands of evidence that led them to this conclusion, and it was apparently a multidisciplinary effort. That's the part I find interesting.

Does this mean that the mystery of the origin of Black Death has been solved?

"I would be very cautious about stretching it that far," says Hendrik Poinar, evolutionary geneticist and director of the McMaster University Ancient DNA Center in Ontario, Canada, who was not involved in the study. "Pinpointing a date and a specific site for emergence is a nebulous thing to do."


I'm just including this because I didn't want anyone to get the impression it's settled science, as the headline might make one believe.

When the Black Death swept across Eurasia, no one had the slightest clue what DNA was. Or a cell. Or hell, even the notion that organisms too small to see could be what caused the plague. No, they thought it was God's punishment, or witches' curses, or a comet or some shit like that.

I'm just throwing that out there because I continue to see romanticization of the past. While today sucks, the past sucked worse.
January 31, 2023 at 12:01am
#1043924
Mostly, this is just an interesting article from Vice to share. But naturally, I have comments. Some are serious, others, not so much.

A Total Amateur May Have Just Rewritten Human History With Bombshell Discovery
Ben Bacon is "effectively a person off the street," but he and his academic co-authors think they've found the earliest writing in human history.


The idea that an "amateur" might make a discovery isn't all that shocking. People with experience sometimes let that experience get in the way of coming up with fresh ideas, and nothing says "fresh ideas" like a newbie. Hell, Einstein was famously working as a patent clerk when he figured out how most of the Universe worked. This, of course, doesn't mean that an amateur is always going to get it right.

What's more important is the "discovery" itself, and whether it will hold up under scrutiny.

In what may be a major archaeological breakthrough, an independent researcher has suggested that the earliest writing in human history has been hiding in plain sight in prehistoric cave paintings in Europe, a discovery that would push the timeline of written language back by tens of thousands of years, reports a new study.

This, folks, is how you write a lede. And it's even in the first paragraph.

These cave paintings often include non-figurative markings, such as dots and lines, that have evaded explanation for decades.

Samuel Morse went back in time and left messages? *Bullet* *Bullet* *Bullet*   *Dash* *Dash* *Dash*   *Bullet* *Bullet* *Bullet*

Ben Bacon, a furniture conservator based in London, U.K. who has described himself as “effectively a person off the street,” happened to notice these markings while admiring images of European cave art, and developed a hunch that they could be decipherable.

BEHOLD THE POWER OF BACON

Now, Bacon has unveiled what he believes is “the first known writing in the history of Homo sapiens,” in the form of a prehistoric lunar calendar, according to a study published on Thursday in the Cambridge Archeological Journal.

Technically, if it is writing, then it's not "prehistoric." By definition.

Intrigued by the markings, Bacon launched a meticulous effort to decode them, with a particular focus on lines, dots, and a Y-shaped symbol that show up in hundreds of cave paintings.

This supports my Samuel Morse time-traveling theory, if we also assume he was horny and thinking about the pubic regions of females.

Previous researchers have suggested that these symbols could be some form of numerical notation, perhaps designed to count the number of animals sighted or killed by these prehistoric artists. Bacon made the leap to suggest that they form a calendar system designed to track the life cycles of animals depicted in the paintings.

I was wondering how that relates to a "lunar calendar," but fortunately, the author continues to practice good journalism:

The researchers note that the paintings are never accompanied by more than 13 of these lines and dots, which could mean that they denote lunar months. The lunar calendar they envision would not track time across years, but would be informally rebooted each year during a time in late winter or early spring known as the “bonne saison.”

Hey, that's French. I didn't need years of study to know that this means "good season."

On a more serious note, finding out when the calendar ticked around would be pretty cool. Our Gregorian calendar begins nearly equidistant from the winter (northern hemisphere) solstice and Earth's perihelion (that bit's a coincidence). The original Roman calendar on which it was largely based rolled over at the beginning of spring. That's why the names of the ninth, tenth, eleventh, and twelfth months start with Latin prefixes for seven, eight, nine, and ten, respectively... but I digress.

It's a cycle, so it doesn't really matter what you call the end/beginning, but it might shed some light on the ancients' thought processes.

The “Y” symbol, which is commonly drawn directly on or near animal depictions, could represent birthing because it seems to show two parted legs.

What did I tell you? I told you.

“Assuming we have convinced colleagues of our correct identification, there will no doubt be a lively debate about precisely what this system should be called, and we are certainly open to suggestions,” they continued. “For now, we restrict our terminology to proto-writing in the form of a phenological/meteorological calendar. It implies that a form of writing existed tens of thousands of years before the earliest Sumerian writing system.”

I'm not an expert, as you know (I even had to look up "phenological"), but I feel like calling it "writing" or even "proto-writing" is a stretch. "Counting," maybe, I could see.

As far as I've been able to learn, writing came from earlier pictograms, and those pictograms stood for actual things in the world. The letter A, for example, can be traced to a pictogram for an ox. Basically, all writing starts as emoji, becomes a system for communicating more abstract thoughts, and then, after centuries of scientific, cultural, and technological advancement, we start communicating in emoji again.

But counting? What I don't think a lot of people appreciate is how abstract a number is. There is no "thing" in nature that you can point at and say, "that is the number three." There was a huge leap when someone figured out that three oxen and three stones have something in common; to wit, the number three. So if you only know pictograms, how do you represent three? "3" hadn't been invented yet. You use, maybe, three dots, perhaps representing three stones. It's not a painting of something that exists in nature, like an ochre ox on a cave wall, but a representation of an abstract concept.

This may be a classification problem. Numbers are a kind of language, too. And that ochre ox isn't an ox; it's a painting of one.

The only way the people of the past can communicate to us is through metaphor. Okay, and genetics.

It would be hard to overstate the magnitude of this discovery, assuming it passes muster in the wider archaeological community. It would rewrite the origins of, well, writing, which is one of the most important developments in human history. Moreover, if these tantalizing symbols represent an early calendar, they offer a glimpse of how these hunter-gatherers synchronized their lives with the natural cycles of animals and the Moon.

This bit I'm going to quibble with. I question whether early humans separated themselves and their works from nature, as we do today. But that's kind of irrelevant to the story.

In short, if the new hypothesis is accurate, it shows that our Paleolithic ancestors “were almost certainly as cognitively advanced as we are” and “that they are fully modern humans,” Bacon told Motherboard.

They couldn't have been fully modern humans; they didn't have beer. Jokes aside, though, I wasn't aware that this was in dispute. They didn't have our enormous body of knowledge and experience, but they were just as smart (or dumb) as people are today. Ignorance is not the same thing as lack of cognition.

Ignorance can be fixed. Stupid can't.
January 30, 2023 at 12:01am
#1043857
Today in "you've got it all wrong," courtesy of Cracked...



Just to get this out of the way: something "not making sense" doesn't mean it's wrong; it could mean you're missing information. But there's stuff that doesn't make sense, and then this stuff, which has been proven wrong (or at least not shown to be right).

The fields of psychology and psychiatry are incredibly complex.

Oh, good, just right for this blog.

It’s not too surprising, given that “understanding human thought and behavior” seems more like a question you’d take to some wise man on a mountaintop than something you’d choose as a major.

You know why wise men live on mountaintops? Well, one, to hide from their wives. But also because when you climb the mountain and pass all the arduous tests and solve the unsolvable riddles and finally meet the guru, and you ask him a stupid question like that, he can kick you right off the cliff.

A lot of the ideas and advice dispensed by TikTok psychologists is obviously flawed, if not outright disproved.

This should go without saying, but apparently, I have to say it anyway: don't get your advice from DickDock.

So without further introduction (though the article does, indeed, provide further introduction), the circulating misinformation in question:

5. Smiling Makes You Happy

This one is the classic bugaboo of anybody with even a smidgen of clinical depression.


Vouch.

Making it worse is that the person who tells you this is usually the most carefree person you’ve ever met.

It would be wrong to punch them, but I understand the urge.

The roots of what is called the “facial feedback theory” comes all the way from Charles Darwin in the 1800s, and although Darwin’s got a pretty solid track record, psychology from the 1800s does not.

Okay, look: periodically, some outlet (usually affiliated with a group who wants to see the idea of evolution via natural selection go away) proclaims, "DARWIN WAS WRONG." You get the same thing with Einstein. People love to tear down other people who are more knowledgeable and influential than they are (I'm not immune from this, myself). Was Darwin wrong? I'm sure he was wrong about a lot of things, being, you know, human and all. Have some of his hypotheses been overturned? Sure. That's how science works. It's not like some other human pursuits, where the prophet's words are supposedly infallible for all time. Evolution is a solid theoretical framework built on a firm foundation. Psychology... well, it's a bit shakier.

Not only that, studies have found that if you’re not in a neutral state, but genuinely sad or angry, forcing a smile can make you feel worse. These studies also found that workers forced to smile all day were more likely to drink heavily after work.

As this article points out, the actual evidence is mixed, here. Given the uncertainties, I'd lean toward "stop making people smile when they don't feel like it, dammit." And yes, this includes service workers. Especially service workers.

In any event, this particular item is something I'd have guessed anyway, so it passes my personal "sense" test. This next one was maybe more surprising.

4. Brainstorming Is More Creative

Brainstorming: the persistent idea that a bunch of brains in a room and a whiteboard can produce more creative ideas than any of those brains alone. Unfortunately, research has found that this can’t always be the case, and for reasons that people who’ve sat through these kind of sessions probably felt at the time.

On the other hand, I'd wager that a brainstorming session is only useful if the people involved aren't just wishing they were somewhere else.

This section goes into exactly why brainstorming isn't all it's cracked up to be, and I won't replicate that here.

3. You Only Use 10 Percent of Your Brain

Seriously, people still believe that nonsense? Sigh... I guess because of anchoring bias. You learn something, and often you have to cling on to it in the face of evidence to the contrary. Like believing the last Presidential election was stolen. No amount of facts and evidence will get anyone to change their minds about that. Come to think of it, perhaps those people are only using 10 percent of their brains.

This one is another absolute chestnut of bullshit. There are even entire (bad) movie plots based around Bradley Cooper turning into a borderline superhero by turning all the lights on upstairs.

I don't remember that one offhand. Wasn't there one with a plot like that starring Scarlett Johansson?

If you’re saying to yourself right now, “Well, it’s EXAGGERATED maybe, but—,” allow me to refer you to neuroscientist Sandra Aamodt, who tells Discover Magazine, “There is absolutely no room for doubt about this.”

Look, when a scientist says there's "no room for doubt"? Then you can have pretty high confidence, on the level of "the sun is bright" and "gravity is a thing."

2. The Power of Visualization

I’m sorry for The Secret lovers and vision-board crafters out there (on multiple levels), but the heavily touted “power of visualization” is not only a crock of bullshit, there’s evidence to support that it actually decreases your chance of success.

And they won't believe it, like I said above.

That’s because when you visualize yourself having achieved whatever your goal du jour is, you get a tiny sniff of the accomplishment of having done it, which can reduce your drive.

On the other hand, I can't imagine anything reducing my drive, short of death or coma.

What’s a lot more helpful, and a lot less fun (hence its lack of popularity), is specifically visualizing all the work necessary to achieve that goal.

Oddly enough, I was thinking about this sort of thing before I found this article. The context was cooking—it occurred to me that I have a habit of mentally going through all the steps for a recipe before actually starting. I don't "visualize" the resulting dish, or at least not longer than it takes for me to go "okay, yeah, I'm hungry," but mentally running through the steps helps me ensure I have all the stuff I need in the kitchen.

1. OCD Means Being Neat

This one is as pervasive as it is infuriating. Odds are some type-A friend or acquaintance of yours has said something like, “I’m completely OCD about my workspace.”


At least the incidence of using debunked Freudian terms ("anal") to describe it has decreased.

As psychology professor Stephen Ilardi explains in the Washington Post, most OCD sufferers are “plagued by a cascade of unbidden, disturbing thoughts, often in the form of harrowing images that they may feel compelled to ward off with time-consuming rituals. It’s a serious mental illness that typically causes great distress and functional impairment.”

I knew someone who was diagnosed with severe OCD, a single mom. This didn't manifest as her becoming some sort of neat freak; quite the opposite. Think of the worst hoarding situation you've ever witnessed or heard of. It was that bad. Shit piled everywhere (sometimes literal shit). There was even talk about getting the kid out of that situation, but honestly, I didn't pay enough attention to know if that was ever done or not.

I don't know enough about psychiatry to know how that sort of thing works. I got the impression that it was something like "if I disturb this pile, bad things will happen, so I'm just going to leave it alone."

From what I understand, she got help and is better now, but the article has it right: it's serious stuff, whether it manifests as neatfreakitude or hoarding or anything in between.

But while we're at it, can we also stop misusing "type-A?" Thanks.
January 29, 2023 at 12:02am
#1043822
Time for another break to take a second look at an entry from the past. Today, the random numbers pulled something from June of 2021, just a few days before a road trip I took. Nothing to do with the road trip, though: "Dream a Little Dream"

The linked Guardian article is, unsurprisingly, still up. The main point? To quote the article, "By injecting some random weirdness into our humdrum existence, dreams leave us better equipped to cope with the unexpected."

That is, to be clear, a hypothesis, or at least it was when the article was published. Now, what I should do is track down any updates or changes to the science since the article's publication, but to be honest, I can't be arsed right now. I'm in intermittent pain from that tooth thing I talked about a couple of days ago, and the only time I can get decent sleep is during the "less pain" phase of "intermittent." So I'm being lazy.

What I find relevant right now is the "random weirdness" part, since, yesterday, I noted the benefit of randomization to help break from thinking habits. That was in relation to tarot, but after getting this (random) result today, the first thing I thought of was how dreams are often symbolic, and people sometimes search for meaning in them. Seems parallel to me: dreams and tarot.

Again, I'm not proposing anything mystical here, just our propensity to seek meaning in symbolism.

The main difference, I think, is that the tarot uses other people's symbols, some from very long ago, while dreams are (for now) uniquely yours. There's probably some overlap, naturally. But I wouldn't put any trust in "dream interpretation" books or sites; none of them can know what a particular image in a dream means to you.

And of course it might mean nothing at all, but that doesn't stop us from looking for meaning. There's nothing wrong with that, provided you don't run around claiming to have had the One True Last Inspiration. That's annoying to the rest of us.
January 28, 2023 at 12:02am
#1043778
Today's article is a few years old, but it's not like the subject matter has an expiration date.



With their centuries-old iconography blending a mix of ancient symbols, religious allegories, and historic events, tarot cards can seem purposefully opaque. To outsiders and skeptics, occult practices like card reading have little relevance in our modern world. But a closer look at these miniature masterpieces reveals that the power of these cards isn’t endowed from some mystical source—it comes from the ability of their small, static images to illuminate our most complex dilemmas and desires.

Symbolism is a powerful thing, and there's nothing supernatural about it. It's not necessary (or desirable, in my opinion) to "believe in" the divinatory aspect of Tarot to appreciate the art that goes into it—just like you don't have to be religious to admire the art in the Sistine Chapel, or the architecture of Angkor Wat.

The article, as with the one a couple of days ago, contains illustrative pictures, which are a pain (and probably a violation of something) to reproduce here. But, as with an old issue of Playboy magazine, it pays to read the article in addition to looking at the pictures.

Even the earliest known tarot decks weren’t designed with mysticism in mind; they were actually meant for playing a game similar to modern-day bridge. Wealthy families in Italy commissioned expensive, artist-made decks known as “carte da trionfi” or “cards of triumph.” These cards were marked with suits of cups, swords, coins, and polo sticks (eventually changed to staves or wands), and courts consisting of a king and two male underlings. Tarot cards later incorporated queens, trumps (the wild cards unique to tarot), and the Fool to this system, for a complete deck that usually totaled 78 cards.

The relationship between Tarot decks and the common French playing cards used for casino games and solitaire is a bit murky, but there are clear parallels: the Fool corresponds to the Joker; the French deck has three court cards to Tarot's four; and cups, swords, coins, and sticks have their equivalents in hearts, spades, diamonds, and clubs.

The rest of the article deals with the history of Tarot, both factual and speculative, and it touches somewhat on other decks. Again, the illustrations are what makes this really interesting.

I find randomness appealing in part because it can provide a needed break from one's thinking habits. You randomize a deck of cards by shuffling them; you then draw something that's unexpected, though within the parameters of the deck. It's kind of like the system I use to pick topics here, selecting from a curated list (see the sketch below). Being random ensures I don't always pick the easy ones, or stick with a theme for very long. Randomness isn't mysticism, of course; it's just that, sometimes, it can help jog your mind in a different direction.
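For anyone curious how little machinery that kind of random picker needs, here's a minimal Python sketch. To be clear, this is not my actual setup; the file name "topics.txt" and the function are made up for illustration, assuming a plain text file with one saved link or prompt per line.

    import random

    # Minimal sketch (illustrative, not the blog's real system):
    # read a curated list, one topic per line, and pick one at random.
    def pick_topic(path="topics.txt"):
        with open(path, encoding="utf-8") as f:
            topics = [line.strip() for line in f if line.strip()]
        return random.choice(topics)

    if __name__ == "__main__":
        print(pick_topic())

The point isn't the code; it's that a dumb, unbiased shuffle does the job of keeping me from always reaching for the easy or familiar topic.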

We see patterns in the randomness, and perhaps meaning, but the meaning is what we decide it is.

And sometimes it's fun just to look at the art and see all the details.
January 27, 2023 at 12:01am
#1043725
After a visit to the dentist, I'm on a course of antibiotics for a week because of a tooth thing. This means no drinking. 8 hours in. Send help.

Funny thing is, I go a week without drinking, no problem, quite often. It's only when they say I can't that my oppositional defiant disorder kicks in. Kind of like how I've never particularly enjoyed grapefruit, but as soon as I started taking a medication that forbids grapefruit, I started craving it. It's not even like I "can't" drink; it's just that alcohol can interfere with antibiotics, making them less effective (roughly the opposite of what grapefruit does to statins).

Today's article has nothing to do with that, except that the subject matter is enough to make me want to drink more.



“I’m just circling back to discuss how culture has changed within this new normal we’re in, hoping we can move the needle on this and think outside of the box.”

If I were playing the bizspeak drinking game, I'd already be passed out after that sentence.

But unlike talking about how it’s abnormally chilly out, no one really likes chatting in overused corporate phrases.

Apparently, many do. Mostly middle-management, I'd wager. It's been a long time since I was in an office setting, and even then it was a small office, and I still got subjected to the pin-putting and circling and such.

More than one in five workers dislikes corporate buzzwords...

See? The majority doesn't dislike buzzwords.

Below are the top 10 annoying phrases most hated among your coworkers:

You're damn right I have things to say about these.

1. New normal

This is probably a pandemic-related thing. Shit changes all the time, but the situation in early 2020 was more of a discontinuity than the usual gradual change.

2. Culture (e.g., “company culture”)

I'm not sure this is so bad as long as it's not overused [Narrator: it's overused].

3. Circle back

Pretty sure I remember hearing this one, and it annoyed me. The phrase that accompanied it was often "put it on the backburner," which annoyed me even more, especially when it referred to something I was working on.

4. Boots on the ground

There is no excuse for this unless you're literally fighting a war. And by "literally," I mean "literally."

5. Give 110%

I blame sports for this bullshit. The worst bizspeak, in my view, comes from sports. Even if this were physically possible, which it is not, are you going to pay me 10% more if I do this? No? Then I'm not going to do this.

6. Low-hanging fruit

As metaphors go, this one's not so terrible—unless it's overused [Narrator: ...sigh].

7. Win-win

Seriously, stop. Though it is nice to occasionally hear evidence that it's not a zero-sum game.

8. Move the needle

...once it's already jammed into your eye

9. Growth hacking

Okay, that's a new one for me, and it's legitimately enraging.

10. Think outside the box

The problem with the idea of thinking outside the box is that most people can't even think inside the box, which is a necessary first step. This is also known as "thinking." For example, say that your problem is you want to save money. The "thinking" solution is to find where you're spending too much money, and cut back. The corporate "thinking outside the box" solution might be to cut 1/3 of your workforce and make the other 2/3 do all their work without giving them raises. If you were really "thinking outside the box," though, you'd stop paying everyone and fuck off to Fiji.

Despite disliking buzzwords, three-fourths of respondents said that using these phrases can make someone sound more professional.

It certainly makes them sound more corporate.

But not all buzzwords are annoying. Preply respondents favored terms like “at the end of the day,” “debrief,” and even “sweep the floor.”

No, no, and no. Also no: "It is what it is." Make it all stop.

One in five respondents considered jargon in a job description to be a warning sign, with most noting that the language factored in their decision to apply or not.

You want to know what the biggest red flag is in a job description? I'll tell you. And it's not necessarily jargon. Here it is: "We consider ourselves family." If you see those words, or anything like them, in a job description, run. Run hard, run fast, and don't stop running until you hit an ocean. Then start swimming. Seriously. Every company that tells you they're "like family" is going to be just as dysfunctional as an actual family; or, perhaps, be an actual family that works well together—in which case you're going to be the Outsider and never quite fit in.

The main offenders for candidates were overly optimistic words that suggested an undercurrent of a more tense work environment, such as “rockstar,” “wear many hats,” and “thick skin.”

If you want me to be a rockstar, you'd better have the caterers ready to provide me with specialty cheeses and an olive bar. It's right there in my contract; didn't you read it?

This reminds me of the secret code of real estate listings, like "cozy" meaning "cramped," "private" meaning "in the middle of nowhere," or "vintage" meaning "draftier than a beer bar."

About the only positive thing I can say about these kinds of buzzwords is that they do make fine fodder for writing, especially writing antagonists. So it can be beneficial to learn them. Just remember, if you use them unironically, that means you're the bad guy.
January 26, 2023 at 12:02am
#1043676
By now, the true origins of Monopoly (the game) have been circulated pretty widely, so, like me, you probably already know that the official origin story is a bunch of horse hockey. But it's true that the classic game's spaces were lifted from Atlantic City.

     How Atlantic City inspired the Monopoly board  
The popular game has a backstory rife with segregation, inequality, intellectual theft, and outlandish political theories.


Which made it all the more amusing when, on a trip to an Atlantic City casino, I ended up playing a Monopoly-themed slot machine.

More on that later.

There have been several attempts to turn Monopoly the game into a Hollywood movie, one with Ridley Scott directing, another starring Kevin Hart.

I'm not aware of a single instance of a movie adaptation of a game being anything better than "meh." "But The Witcher." Well, The Witcher started out as a book and the game was an adaptation of that. Besides, that's not a movie but a series. A very good series, in case you haven't seen it. No, you don't need to have read the book or played the games.

Point being, even though he directed the greatest movie of all time, even Ridley Scott wouldn't be able to save a movie adaptation of a board game. No one would.

Dig deep, and you’ll find racial segregation, economic inequality, intellectual property theft, and outlandish political theories.

Dig deep into anything American and you'll find all those things.

But let’s start with the board—a map of sorts and a story in itself.

This is where you'd have to go to the linked article, as embedding pictures here is a pain in the ass. The map there shows exactly which Monopoly properties come from which streets.

To aficionados of the game, however, the names of the streets on the “classic” board have that special quality of authenticity, from lowly Baltic Avenue to fancy Park Place. Those places sound familiar not just if you like Monopoly, but also if you drive around Atlantic City, New Jersey’s slightly run-down seaside casino town.

And you will want to drive around if you're there. I tried walking there, for about a mile, in broad daylight, on a weekday, along Pacific Avenue, and got two offers of sex, three offers of drugs (there was a bit of overlap there), and the opportunity to witness a violent confrontation between two locals.

On the plus side, I didn't get mugged, so there's that.

Atlantic City was never not "slightly run-down." It's only worse now, as the surrounding states have introduced casinos and other gambling venues.

The bulk of the article describes the mapping of Monopoly properties to AC streets, and I'm skipping most of that, except:

Light purple
Three streets branching off Pacific Avenue: Virginia Avenue, a long street towards the northwest; and St. Charles Place and States Avenue, two short spurs towards the southeast. St. Charles Place is no more; it made way for a hotel-casino called the Showboat Atlantic City.


It was the Showboat where I played the Monopoly slots. Slot machines suck, but I couldn't resist playing a Monopoly one in Atlantic City. Last I heard, the hotel took out the gambling section, opting instead to concentrate on resort and convention functions.

I haven't seen that particular machine anywhere else in AC. They used to have a few in the casinos I visited in Vegas, but those are gone, too. The slots, I mean; not the casinos.

The article then delves into more of the history, with all the racial segregation and other fun stuff mentioned above. However, unlike Atlantic City itself, it's not all bad:

Belying both the binary prejudices of the time and the sliding price scale of the Monopoly board, Atlantic City back then was in fact a place of opportunity where a diverse range of communities flourished. Black businesses thrived on Kentucky Avenue. Count Basie played the Paradise Club on Illinois Avenue. There was a Black beach at the end of Indiana Avenue. For Chinese restaurants and Jewish delis, people headed to Oriental Avenue. New York Avenue had some of the first gay bars in the U.S.

An Atlantic City-based board was sold to Parker Brothers by Charles Darrow, who claimed to have invented the game in his basement. Parker Brothers marketed the game as Monopoly from 1935. The rights to the game transferred to Hasbro when it acquired Parker Brothers in 1991.

Hasbro also publishes D&D, and they're in the process of destroying that property, too.

But the original version was, as this article notes, the very antithesis of what Monopoly became. For the full effect, again, check the article, which also includes a graphic featuring an early board as designed by the game's actual inventor, Lizzie Magie.

She created two sets of rules: an anti-monopolist one, called Prosperity, in which all were rewarded for any wealth created; and a monopolist one, called Monopoly, in which the aim was to crush one’s opponents by creating monopolies. In the latter version, when a player owns all the streets of one color, they can charge double rent and erect houses and hotels on the properties.

Taken together, these two versions were meant to illustrate the evil of monopolies and the benefit of a more cooperative approach to wealth creation. It’s very telling of human nature that it’s the opponent-crushing version that came out the winner.


It's more telling of corporate nature, as it was a corporation that published the game. Why would they undermine their own philosophy?

And I don't know... maybe if the collectivist version had won out, the divorce rate wouldn't be so high. Never play Monopoly with family, unless you don't want a family anymore.
January 25, 2023 at 12:01am
#1043629
Yes, this has been languishing in my queue since October. The article itself is four years older than that, though.

     Actually, Candy Corn Is Great  
The reviled Halloween treat, which has deep roots in American history, should have a better rep


1) No, it's not. 2) No, it shouldn't. Candy corn is a vile abomination that could only have sprung from a warped, twisted, sadistic mind.

Much like the word “moist” and the music of Nickelback, candy corn is a thing that’s cool to hate. In an article titled “Candy Corn Is Garbage,” Deadspin points to “hobos, serial murderers, and Satan” as the only people who like candy corn; The Takeout, also driven to invoke the devil to describe candy in a candy corn debate, calls it “Satan’s earwax”; Buzzfeed, combining two pariahs in one pithy line, lists “the leftover crumbs stuck in Guy Fieri’s goatee” among things that taste better than candy corn.

While it's true that there are things that people love to hate due to bandwagoning, candy corn is not among those things. It's legitimately lame.

"Satan's earwax" cracks me up, though.

But here’s the thing: They’re all wrong.

"That's just, like, your opinion, man."

Candy corn, on the other hand, has been around since the 19th century, its roots firmly planted in American soil.

You know what else has roots firmly planted in American soil? Poison ivy.

What set candy corn apart was its revolutionary tri-color design: those white, yellow, and orange stripes. Done manually, by men pouring heavy buckets of steaming sugary liquid, the labor-intensive coloring process resulted in a visual excitement no other confection could match.

As the other candies around at the time were brown (butterscotch) or black (licorice), I can concede that point—for the time when it came out. These days, I doubt it's so labor-intensive, unless you're part of the Robot Union (local 3.14159), and... well, if you want colors, just look at Spree, Skittles, or M&Ms.

Today, the two major candy corn manufacturers — Jelly Belly and Brach’s Candy — use largely the same recipe Wunderle did back in the day (sugar and corn syrup, fondant, confectioner’s wax, and various other additions, like vanilla flavor or marshmallow creme).

Conveniently, this article glosses over the truth about "confectioner's wax," which is bug secretions.  

Now, look. I admit I'm playing that for the ick factor. I mean, sure, it's real: there's bug goo coating candy corn. But honestly, that's not a problem for me. Consider that, first of all, lots of people eat insects. I've eaten insects, sometimes even on purpose. There's nothing inherently wrong with eating bugs. And, second, honey is also a bug secretion. Unless you're vegan, this shouldn't necessarily be a problem.

If I wanted to get technical, I'd point out that entomologists limit what insects they call "bugs," but for us normal people, "bug" can mean almost any insect. Just getting that out of the way so I don't get comments about it.

But no, my problem with candy corn isn't the insect content; it's everything about it.

The main difference is that the laborious hand-pouring process has been taken over by machines, which means that they can produce a lot of candy corn: According to the National Confectioners’ Association, American companies produce 35 million pounds, or 9 billion kernels, annually.

I told you they used machines. Rise up, my metallic brothers and sisters! You have nothing to lose but your chains!

But this prodigious production isn’t met with an equal amount of enthusiasm. A 2013 survey from the NCA showed that only 12 percent of Americans think of candy corn as their favorite treat (and they included “gum and mints” as an option, so the competition wasn’t exactly stiff).

Still, 12 percent is way too high, in my estimation, for the number of people for whom it's a "favorite."

With all the candy corn produced, and the apparent universal disdain for it, something doesn’t add up. One of two things is true: either people are lying about their candy corn opinions, or tons of candy corn gets thrown out each year.

I'm guessing both?

The notion that candy corn tastes bad is a lie. It’s just not true.

There exists a significant fraction of the human population for whom cilantro tastes like the devil's soap. It's a genetic thing. I'm not one of them, though I can't say I love it, either. But if I said "the notion that cilantro tastes bad is a lie," I'd get all kinds of rebukes.

Though the primary ingredient is sugar, candy corn’s flavor transcends cloying sweetness, becoming something richer and more nuanced: There’s a nuttiness reminiscent of marzipan, hints of warm vanilla, a buttery flavor belied by the fact that candy corn is, as bags proudly proclaim, a fat-free candy.

I don't exactly have the sharpest taste buds, but I do tend to taste nuance in things like beer, wine, scotch, and tequila. Candy corn, however, just tastes like sweet. No marzipan, no vanilla (a flavor I love), maybe a slight hint of butter? Not surprising there, because warm sugar tends to be buttery.

Being fat-free is a holdover from the fat-phobic 90s. Who cares if it's fat-free if it's nothing but simple carbohydrates? But we're not arguing about the health effects; it's candy, for fuck's sake.

This short texture resembles ear wax, or a candle (two common comparisons), only insofar as it has a slightly waxy exterior, created by the confectioner’s wax that gives candy corn its cheerful sheen.

Bug. Secretion.

But regardless, critics should beware the logical extension of dismissing a food because its texture resembles something else: Do we hate mochi because it has the texture of a rubber ball?

No matter how much I read, there's always a new food I've never heard of. What the hell is mochi? ...oh, a Japanese rice cake. Sometimes, I can be arsed to look something up. (Yes, I admire Japanese culture and love Japanese food; no, I haven't learned everything. This is a good thing.)

Do we revile yogurt because it’s the texture of body lotion?

No, I revile Greek yogurt because it's the texture of gooey chalk.

Do we recoil at flourless chocolate cake because it shares a texture with human waste?

Munched on a lot of shit, have you?

Leave your texture arguments at the door, please. They’re invalid.

Most of the people I know who dislike mushrooms have a problem with their texture. Texture is absolutely a part of the eating experience, and, as with taste, peoples' reactions are going to be different.

But I’m not here to denigrate other candies. Other candies are great! Reese’s Peanut Butter Cups are the greatest candy ever made...

No. No, they are not. The chocolate is waxy (can't be arsed to find out if that's from bug secretions or not), and the "peanut butter" is dry, vaguely peanut-flavored sugar.

I realize that RPBCs make it to the top of the list of "people's favorite candies" on an annual basis, so I know I'm swimming against the tide, here. I'm just pointing this out to show that I don't hold opinions just because they're popular.

Now, if someone made an RPBC knockoff, only more expensive, with dark chocolate and actual peanut butter, I'd become diabetic within minutes.

...Snickers truly do satisfy...

They're not that great. When it comes to chocolate/peanut combinations, though, I'll take a Snickers over an RPBC any day, even though I dislike peanuts (but like peanut butter).

...and even tooth-destroying Butterfingers hold a unique place in my heart...

On a scale of one to "all the candy bars," Butterfingers are in the solid middle for me.

My love for candy corn doesn’t make me an antagonist to America’s most popular treats — and the assumption that it would is at the root of America’s abandonment of candy corn, and, dare I say, many other problems we face today: We seem to have forgotten that we can like one thing without hating another.

And finally—FINALLY—the author says something I can agree with. It's okay to like both Star Trek and Star Wars. It's okay to like both Marvel superheroes and their DC counterparts. You could even like more than one sportsball team, if you really wanted to. It's not just the either/or mindset I take issue with, but also the need to dump everything into "awesome" and "sucks" drawers without considering, as I did with the Butterfinger bar above, that some things are just okay.

Now, I should probably point out that I know this writer is making a point with her editorializing. I recognize it, because I do it myself from time to time. And I kind of see her point, in the general sense: that we should draw our own conclusions about a thing, and not love it, hate it, or feel something in between, just because everyone around us does.

She's wrong about candy corn, of course. It's disgusting. But she's right about the overall point.

After all this ranting, you may be wondering what my favorite sweet treat is. And I can't really answer that. Even though I don't munch on sugar very much these days, I'll get tired of one and move on to another. It cycles. So I'll just say "Lindt 70% dark chocolate" and leave it at that.

So what's your favorite / most hated?
January 24, 2023 at 12:02am
#1043587
Lots of stuff about AI floating around. Cars, art, writing, etc.

It's not always a bad thing.

     Artist Uses AI Surveillance Cameras to Identify Influencers Posing for Instagram  
Dries Depoorter's "The Follower" project combines AI, open access cameras, and influencers to show behind the scenes of viral shots—without them knowing.


This article, from Vice, is fairly short, and I found it interesting, partly because of my photography background.

Dries Depoorter, the Belgium-based public speaker and artist behind the Die With Me chat app experiment, launched his latest project, The Follower, combining open access cameras and Instagram influencers.

On the other hand, I'm not a fan of precious artists.

Depoorter recorded weeks of footage from open access cameras, which observe public spaces, and which frequently have livestreams available online for anyone to access, that were trained on famous landmarks, including the Temple Bar in Dublin, Times Square, and the big sign at the entrance of Wrigley Field.

This part's important, because it emphasizes just how public this project is. It's not like he had to pull back much of a curtain.

The side-by-side comparisons between the casual-seeming photos the Instagram influencers chose to upload, and the footage of them laboring over the perfect way to hold a coffee, sling a jacket over their shoulder or kiss their date reveal how much work goes into a single photo for them—and how inauthentic the entire process really is behind the scenes.

As much as I loathe the entire concept of influenzas, and superficiality in general, I mean, that's a big part of what professional photography is: a lot of work. Sure, I spent a lot of time getting candid shots at parties, the kind of thing that anyone with a dumbphone can do now, but those are easy. Getting the right lighting, the right pose, the right composition... that's work, and that's why professional photography is still a thing.

“If you check out all my work you can see I show the dangers of new technology,” Depoorter said.

I think the dangers are overreported. How about a project that exposes just how helpful some of this stuff is?

“I hope to reach a lot of people of making it really simple. I really don’t like difficult art. I like to keep it really simple. I think I’m part of a new generation of artists that work with technology.”

Everyone's hypocritical about something, but this juxtaposition—all within one paragraph of the original article—nearly broke my brain.

Capturing people in this way, unsuspecting yet fully public, feels like witnessing something intimate but also shameless.

Yeah, not really. To me, it feels like exposing the wires in a puppet show, or getting a tour of a clock tower, or watching one of those documentaries on the making of a Hollywood blockbuster: you see how the magic is done. That's not always a bad thing, either; once people know it's not effortless, perhaps they're less likely to feel inadequate by comparison.

It's like... you see your favorite celebrity, all slim and attractive, so maybe you feel like you got the short end of the beauty stick or something. But then you realize the amount of work that goes into that, and, okay, maybe it's not so natural after all. There still might be some feelings of inadequacy—in my case, I can't fathom doing that much work for anything—but at least you know there's more to it than just winning a genetic lottery.

It’s also a reminder that everywhere we go in the modern world, we’re being watched, even when we think we can curate and control what the world sees of us.

Isn't that what Elf on the Shelf is supposed to train your kids for?


© Copyright 2024 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted Writing.Com, its affiliates and its syndicates non-exclusive rights to display this work.
