Rated: 18+ · Book · Personal · #1196512
Not for the faint of art.
Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
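
(For the curious, here's a rough little Python sketch, purely my own illustration and nothing official, of the kind of "very simple transformation" in question: repeatedly applying z → z² + c over a grid of complex numbers. The grid spacing and the 50-iteration cutoff are arbitrary choices.)

```python
# A minimal sketch: iterate z -> z*z + c for each complex number c on a grid.
# Points whose orbit stays bounded (roughly, |z| never exceeds 2) belong to
# the fractal; everything else "escapes."
def escape_count(c, max_iter=50):
    z = 0 + 0j
    for n in range(max_iter):
        z = z * z + c          # the simple transformation
        if abs(z) > 2:         # orbit escaped; c is outside the set
            return n
    return max_iter            # orbit stayed bounded (as far as we checked)

# Crude text rendering of part of the complex plane:
# real part runs left to right, imaginary part top to bottom.
for row in range(11):
    line = ""
    for col in range(31):
        c = complex(-2 + col * 0.1, 1 - row * 0.2)   # a + bi
        line += "#" if escape_count(c) == max(50, 0) else "."
    print(line)
```

Run it and you get a blocky ASCII shadow of the Mandelbrot set; the "enormous intricacy" only really shows up with a finer grid and actual graphics, but the rule being iterated is exactly that simple.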




February 7, 2023 at 12:01am
#1044382
Another entry for "Journalistic Intentions" [18+]. This one can be found here: Faces Everywhere

Pareidolia   is popularly known as the tendency to see a face or figure in accidental places. Examples include the Man in the Moon, the Face on Mars (it's not a face), and the ever-popular Jesus toast.

Or, speaking of Jesus, some people find him in a dog butt.   I'm not judging. But it does confound the proverbial dyslexic agnostic insomniac, who stays up all night wondering if there's a dog.

In the case of today's prompt image, though... I'm not seeing it. Looks more like a grasping hand to me. Which is a form of pareidolia, but the title of the image led me to expect to see faces. Generally if I go into something with that expectation, I'll see it. Not this time.

This could be related to my partial face-blindness. A while back, I tried to get back into my bookface account so that I could delete it. I'd forgotten the password, so they issued me a challenge: identify the people in these photographs, ripped from my friends list.

Leaving aside the ones who I only knew from online, whose faces I wouldn't recognize anyway, I haven't seen some of my failbook friends in decades. And even absent those, I'm just not that great at identifying faces. Hell, one time I was at a bar doing a trivia contest. I had barely even started drinking yet, and the goal was to identify the famous people in a set of photos. I recognized a couple of them, but one that I didn't was... Bruce Springsteen, circa 1985. You'd think if I could recognize anyone, it's him. But no.

Long story short, this is why I still have a facebuck account: I can't get back in to delete it. And it's also why I don't always see pareidolia.

One thing that occurred to me a while back, though, is that artists rely on pareidolia. That is, I don't mean the examples in the Wiki link above, but, like, cartoon artists. They can throw together a few lines, and it looks like a general face (even to me). That's deliberate, though, and doesn't involve chance formations in nature.

Now, I could go into speculation about why we see faces everywhere, like the late lamented Old Man of the Mountain in New Hampshire, and make some guesses about how it was advantageous for our ancestors to think they saw faces. But I won't, because that's all it would be: speculation. The facts we know include that we do see faces or figures in natural objects. And sometimes that's fun.

But sometimes, it leads to wild conspiracy "theories," like with the Mars thing. Some alien must have carved it! Well, no, it's just pareidolia.

Which doesn't explain why there's a smiley face on Mars  . Do you see it there? That's the crater Galle in the Argyre Planitia. Might be easier to parse in this photo.  

Okay, yes, it totally does explain it. Completely coincidental. Just because Mars is the only known planet to be inhabited solely by robots doesn't mean that said robots did that for our benefit.

Have a nice day! *Smile*
February 6, 2023 at 12:01am
#1044333
To start with, I'll note that this is a book promotion. That's okay; it's at least somewhat informative.

    This Is Your Brain on Architecture  
In her new book, Sarah Williams Goldhagen presents scientific evidence for why some buildings delight us and others—too many of them—disappoint.


Because architecture is art, and some art delights us and others disappoint?

(I know an argument can be made that architecture isn't actually art because architecture has a function, but it still has a huge artistic component.)

One last note before I dive in: the article is from 2017. That shouldn't make much difference, but it references a "new" book, which simply wouldn't be the case after six years.

Sarah Williams Goldhagen was the architecture critic for The New Republic for many years, a role she combined with teaching at Harvard University’s Graduate School of Design and elsewhere. She is an expert on the work of Louis Kahn, one of the 20th century’s greatest architects, known for the weighty, mystical Modernism of buildings like the Salk Institute in La Jolla, California, and the Bangladeshi parliament in Dhaka.

At the top of the linked page is a photograph of the Salk Institute. I'm no expert on architectural labels, but that building looks more Brutalist than Modernist. Perhaps it is both.

Several years ago, Goldhagen became interested in new research on how our brains register the environments around us. Dipping into writing from several fields—psychology, anthropology, linguistics, and neuroscience—she learned that a new paradigm for how we live and think in the world was starting to emerge, called “embodied cognition.”

And this is why the article appeals to me: not necessarily because of the architectural slant to it, but because it features multidisciplinary science. Well, sort-of science; I'll get to that.

“This paradigm,” she writes in her magisterial new book, Welcome to Your World: How the Built Environment Shapes Our Lives, “holds that much of what and how people think is a function of our living in the kinds of bodies we do.” Not just conscious thoughts, but non-conscious impressions, feedback from our senses, physical movement, and even split-second mental simulations of that movement shape how we respond to a place, Goldhagen argues.

I mean... okay. I don't think that's a new concept, though. It wasn't a new concept in 2017. Perhaps it wasn't framed exactly like that before, but I spent my career working with architects, and I have the impression that they all think that way.

The research led Goldhagen to science-based answers for previously metaphysical questions, such as: why do some places charm us and others leave us cold? Do we think and act differently depending on the building or room we’re in?

The first question is a valid line of inquiry, in my opinion. The second should be blindingly obvious to anyone.

Architects intuited some of these principles long ago. As Kahn once noted of the monumental Baths of Caracalla in Rome, a person can bathe under an eight-foot ceiling, “but there’s something about a 150-foot ceiling that makes a man a different kind of man.”

Like I said, they all think that way (well, all the ones I've met; I should guard against sweeping generalizations). The question I have is: can it be quantified? That would make this legitimate science.

As an example from my own field, consider a road. If you're driving down a country road that has 9-foot lanes with trees just off the edge, it feels different, and you'll drive differently, than if you're on a 12-foot-lane interstate with a lot of clear space past the shoulder, even if both roads are straight and level. The psychology of this has been quantified, and it's in the realm of science. Similarly, we know we feel different in a large bedroom (for example) than we do in a small one, but has that been quantified? What's the optimal size of a bedroom, considering this psychology, and how do different people react to different sizes?

The article jumps into an interview with the book's author:

At the time, there really was no intellectual paradigm for thinking about these questions. And then about 15 years ago, my husband handed me a book by someone who had written a previous book he had really liked. The title of the book was Metaphors We Live By. It’s co-authored by George Lakoff, who’s a cognitive linguist, and Mark Johnson, who’s a philosopher. The basic argument is that much of how our thought is structured emerges from the fact of our embodiment. And many of the ways those thoughts are structured are metaphorical.

Honestly, I'd rather read that book than the one this page is promoting. Not that I wouldn't want to read this one; it's a matter of priority.

One of the things I found was that, basically, [given] what we now know about human cognition and perception, the built environments we inhabit are drastically more important than we ever thought they were.

Okay, that's worth explaining in a book, I think. And it passes my sniff test.

Architects tend, particularly with parametric design, to emphasize overall aggregate form, and all that other stuff gets filled in later. And then, very often, it’s value-engineered out.

I understand the need for value engineering, but I despise the concept. One time, a site plan I did lost a drainage grate to value engineering, and then everyone wondered why the street started flooding every time it rained, and I got the blame until I pointed out that the drain I'd designed never got built. I'm not saying I was always right, but I've never put in a drainage device without reason.

Another thing is differentiated, non-repetitive surfaces. [The psychologist and author] Colin Ellard did a study of how people respond: He basically put sensors on people and had them walk by a boring, generic building. Then he had them walk past something much more variegated with more ways to [engage] visually and therefore motorically. He found that people’s stress levels, measured by cortisol, went up dramatically when they were walking past the boring building.

Okay, see, that's quantification. Science. It may have been a good study or it may not; I don't have the data. But it's on the right track.

The rest of the interview is worth reading, I think, because she raises some important points. But I'm not going to nitpick them (which is not to say that I completely agree), so no point in reproducing it here. Link's up there if you're interested.
February 5, 2023 at 12:01am
#1044278
Today's historical document comes from just shy of 14 years ago. It was part of a series of entries that I did to expand on a list of things about me. I mean, it's a blog; why wouldn't I talk about me?

Anyway, this particular entry is here: "Lassie"

From that day:

11. When I was a kid, I had a collie named Lassie. Yes, I named her. Give me a break; I was four years old at the time.

Incidentally, I've never used her name as a security answer anywhere, so... nice try, but no.

I've posted this before, but here's a picture of me with Lassie, from the rotation in my former blog:

Unlike many pictures, I could easily repost it here. But why, when I spent so much time typing in the link to the entry?

I don't know how old I was in the picture. Over four, obviously. I suck at estimating ages of kids or adults. No, it's irrelevant that the picture is of me. It was ancient when I put it in that entry. And after another 14 years, I'm even further removed from that brat, and even more embarrassed by the photo.

It occurs to me that I don't even recall the circumstances surrounding the photo. That's not our lawn. I don't recognize the setting at all. I have this vague notion that it might have been Florida, but I have no recollection of taking the dog with us to Florida. Or even who watched her when we vacationed.

I only keep it up because of the dog. She was a good dog.

I've never had another dog. Most of them annoy me, and those that don't, already belong to someone else.

That's not the main reason. Oh, it might have been in the forefront on that day, for that entry, but I rarely have only one reason for doing (or in this case, not doing) something. I like dogs, mostly. I just can't be arsed to do the work they require.

And also because I doubt I'd find one as cool as Lassie.
February 4, 2023 at 12:01am
#1044216
Here's a confession:

I'm not entirely confident when it comes to picture prompts.

But I do feel like it's important to try new things, to, as the kids say, step outside of my comfort zone and do something I'm not sure of, like this month's round of "Journalistic Intentions" [18+].

One thing I continue to do: pick prompts at random. Today, we have this lovely photo of a leaf under a blanket of water: Orange Submersion  

I've seen reports that at least one person is having trouble with xlinks, so if that hyperlink doesn't work for you, I can provide the raw URL on request.

Ever wonder why water is (mostly) clear? I have. I mean, apart from when it's murky from suspended particles, or sometimes when mixed with delicious booze (which is itself, in its pure form, clear). And it's clearly (pun intended) not the same as air; you can almost always tell where one ends and the other begins. The surface of pure water is easy to identify, but, for me at least, damn near impossible to render in a drawing. But then, I've never been very good at drawing... but I digress.

The best answer to why water is clear that I've been able to come up with is that this is not the right question to ask.

Life crawled up onto land "only" about half a billion years ago. In comparison to the four billion or so years since life began on Earth, that's a significant fraction of time, but it means that life was changing and evolving for 7/8 of its history underwater. And some of that life, at least the animal portion, found a competitive edge in being able to directly sense prey or predators: in short, vision is a very useful sense to possess.

It's my understanding that eyes evolved several different times. That is, there's not one proto-organism that gradually turned light-sensitive cells (which many organisms have, not just animals) into an eyeball, which then split off into different species. No, the proto-organism might have been mostly blind, and some of its descendants developed vision in different ways: camera-type eyes like ours, or the compound eyes of arthropods and such, or whatever.

But most of those eyes originally evolved underwater. Thus, they developed in such a way that vision would be an evolutionary advantage, which means being sensitive to a range of the electromagnetic spectrum to which water is mostly transparent.

Water isn't clear because of some innate property of it; it's clear, to us, because our distant ancestors evolved a sense that allowed them to see in it. Water blocks some other wavelengths.

But the other thing it does is change the direction of light at its boundary. Stick your arm into a fish tank, and it'll appear to bend. At some angles, the light doesn't escape the water at all, but reflects off of it like a perfect mirror (this is a function of index of refraction, and it's also how fiber optic cables transmit data). The surface is also partially reflective when viewed from above, which is how you get the artistic dappled effect from ripples, like in today's picture prompt.
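
(If you want an actual number on when the surface starts acting like that perfect mirror, Snell's law provides it: total internal reflection kicks in beyond the critical angle, arcsin(n_air / n_water), measured from the vertical. Here's a quick back-of-the-envelope sketch in Python, using typical textbook values of about 1.00 for air and 1.33 for water, which are assumptions on my part rather than anything measured from the photo.)

```python
import math

# Back-of-the-envelope: light traveling up through water is totally reflected
# at the surface once it hits at more than the critical angle (measured from
# the vertical). Indices of refraction here are rough textbook values.
n_water = 1.33   # approximate refractive index of water
n_air = 1.00     # approximate refractive index of air

critical_angle = math.degrees(math.asin(n_air / n_water))
print(f"Critical angle for a water-to-air boundary: about {critical_angle:.0f} degrees")
# Prints roughly 49 degrees; light hitting the underside of the surface at more
# than that angle from the vertical never escapes, which is the "perfect mirror"
# effect (and the trick fiber optics rely on).
```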

Leaf (pun absolutely intended) it to me to get all sciencey about a pretty picture. But I'm of the considered opinion that the knowledge gleaned from science can improve an aesthetic experience. I know the math and physics behind rainbows (or, well, I used to, and can easily find the information again), but that doesn't decrease their visual impact.

And if today's discussion hurt your brain, just be glad I didn't go into the biology of what makes leaves turn color and fall off as winter approaches.

Another time, perhaps.
February 3, 2023 at 12:01am
#1044108
My second-biggest gripe here is: "Only four?"



Because I can think of a lot more than four.

But this is Cracked, so you gotta account for short atten- SQUIRREL!

The sheer variety and volume of food available in the world is enough to make both your eyes and mouth water simultaneously.

Especially if you're one of the millions who can't access it.

Dark humor is like food: not everyone gets it.

And yes, guy whose go-to-move at parties is to shut down casual chats by bringing up the saddest possible parts of the human condition, I do realize that starvation and food security is still a massive problem both in the U.S. and abroad. Can we move on now, and you can resume your job as a professional hitman for conversations?

Nah, I reserve that for the blog. But that's only because no one invites me to parties or casual chats.

Here, then, are four foods that, as a modern civilization, we can finally kick to the curb.

4. Necco Wafers

Congratulations, you've identified something more hated than candy corn.

The world has changed around you, while you’ve been chopping up sticks of sidewalk chalk and wrapping them in the sort of wax paper that feels like it was collected from an Egyptian tomb. We’re two months away from Gushers with LEDs in them, and you’re still trying to sell us a candy most monkeys would spit out.

Stop giving candymakers ideas.

3. Baby Corn

Not to be confused with candy corn, because as hated as candy corn is, at least it has a flavor.

I, like many American Jews, have a deep, insatiable love for Chinese food. Fried pork dumplings are very possibly my favorite food of all time, and if not, they would at the very least make it handily through the primaries.

I'm just leaving this in here to avoid comments/jokes about keeping kosher. It's not a thing for most American Jews.

However, there is one consistent, unwelcome invader in many entrees at Chinese restaurants: Those fucked-up little corns.

Truth.

2. Plain Cheerios

YOU SHUT YOUR WHORE MOUTH.

But the light is dimming for the default Cheerio in the modern world — the new cereals are faster, sweeter and more colorful.

That is exactly what makes Cheerios great.

Not that I eat cereal much anymore, but on those rare occasions that I do (usually as part of a free breakfast at a cheap motel), the default is Cheerios, not Chocolate Frosted Sugar Bombs.

1. Brazil Nuts

Unlike the other items on this list, I have absolutely no opinion about Brazil nuts. But I tend to avoid them because the idiots I knew as a kid had a rude name for them that I shan't perpetuate by noting it here.

I am sure there was a point in the history of the human race where the precious food inside a Brazil nut was needed for survival, but I highly question their continued relevance past the invention of fire. The value proposition of the Brazil nut is like breaking into a safe in order to retrieve a piece of dry toast.

I've never had to actually break into a Brazil nut. But you can't make that argument about Brazil nuts when there are so many other foods that are difficult to access, like coconuts or pomegranates (pretty sure that's a word meaning "apple made of granite").

Not to mention, there are so many other better nuts that are begging you to eat them! You’re telling me you’re putting on the blinders and digging past peanuts, pistachios and cashews just to draw blood trying to access the non-prize that is the meat of the Brazil nut?

Peanuts are barely food, pistachios are too much work (or too expensive if you get the pre-shelled kind), and the only purpose of cashews is to provide a unique shape in the trail mix.

Anyway, as I said, why stop at four? I can think of lots of foods whose time has passed, especially since Cheerios doesn't belong on that list at all.

For example: mass-produced milk chocolate. It's basically edible plastic with just enough cocoa so the food police can let them call it "chocolate." There are many tastier alternatives. Sure, they're probably more expensive, but we're not talking about price.

Your turn. What food would you want to see relegated to the circular file of history? (If you're vegetarian or vegan, saying "animal products" is cheating.)
February 2, 2023 at 12:01am
#1044036
Not a lot I can say about this one; I just thought it was too cool not to share.

Before I get into it, though, a quick note: I'll be participating in "Journalistic Intentions" [18+] this month (though not today). Check it out and join in—you have nothing to lose and everything to gain. Well, a couple of awards to gain, potentially, but that's not nothing.

And now, let there be light.

     Astronomers Say They Have Spotted the Universe’s First Stars  
Theory has it that “Population III” stars brought light to the cosmos. The James Webb Space Telescope may have just glimpsed them.


A group of astronomers poring over data from the James Webb Space Telescope (JWST) has glimpsed light from ionized helium in a distant galaxy, which could indicate the presence of the universe’s very first generation of stars.

Don't worry; the article explains this better later on.

These long-sought, inaptly named “Population III” stars would have been ginormous balls of hydrogen and helium sculpted from the universe’s primordial gas.

I guess "ginormous" is a scientific term now, probably along with "bajillion" and "metric shit-ton." As in, there were a bajillion ginormous stars in the early Universe, and each one weighed a shit-ton.

Theorists started imagining these first fireballs in the 1970s, hypothesizing that, after short lifetimes, they exploded as supernovas, forging heavier elements and spewing them into the cosmos.

"Fireballs" is misleading, but "exploded" is apt.

That star stuff later gave rise to Population II stars more abundant in heavy elements, then even richer Population I stars like our sun, as well as planets, asteroids, comets and eventually life itself.

Got that? Summary: Pop III came before Pop II which came before Pop I. No idea what they're going to name the fourth generation of stars. Population X, I hope. If this seems backwards, it's because the classification system was developed before people figured out the order of things. Again, the article explains this later.

The early Universe was mostly hydrogen, some helium, a tiny bit (relatively speaking) of lithium, and not much else. It takes nuclear fusion or other processes to create the rest of the elements. This is why we say we're made of star material.

Confirmation is still needed; the team’s paper, posted on the preprint server arxiv.org on December 8, is awaiting peer review at Nature.

Important disclaimer, so you can be That Person at the party when someone gushes about how astronomers found the first star or something.

Because they are so far away and existed so briefly, finding evidence for them has been a challenge.

Remember, we're at the center of an inside-out universe; what's furthest away is oldest. Well. We appear to be at the center, anyway; so does every other point in the Universe.

The rest of the article goes into more depth, and while it's not nearly as paradigm-shattering as, say, detecting life on another world (a subject I've tackled in here before), getting confirmation, or even negation, of this model of the history of star formation would be a step forward in astronomy and cosmology. Thus, the cool factor.
February 1, 2023 at 12:01am
#1043972
The hated month of February begins. Appropriately enough, the random number generator pulled up a plague article.



And no, they didn't pinpoint the exact flea.

As the deadliest pandemic in recorded history – it killed an estimated 50 million people in Europe and the Mediterranean between 1346 and 1353 — it's a question that has plagued scientists and historians for nearly 700 years.

...really, NPR? You're going to do that pun? Who do you think you are, the BBC?

Incidentally, other sources put the number of victims higher than 50 million. That would be a significant number of people today; at the time, it was likely over 10% of the world's human population.

Now, researchers say they've found the genetic ancestor of the Black Death, which still infects thousands of people each year.

Which is not what the headline implied.

New research, published this month in the journal Nature, provides biological evidence that places the ancestral origins of Black Death in Central Asia, in what is now modern-day Kyrgyzstan.

See what lack of vowels will do?

Oh, and by "this month" they mean when the article was published, back in June of last year.

What's more, the researchers find that the strain from this region "gave rise to the majority of [modern plague] strains circulating in the world today," says Phil Slavin, co-author on the paper and a historian at the University of Stirling in Scotland.

One could ask where that strain came from in turn.

The article goes on to explain the various strands of evidence that led them to this conclusion, and it was apparently a multidisciplinary effort. That's the part I find interesting.

Does this mean that the mystery of the origin of Black Death has been solved?

"I would be very cautious about stretching it that far," says Hendrik Poinar, evolutionary geneticist and director of the McMaster University Ancient DNA Center in Ontario, Canada, who was not involved in the study. "Pinpointing a date and a specific site for emergence is a nebulous thing to do."


I'm just including this because I didn't want anyone to get the impression it's settled science, as the headline might make one believe.

When the Black Death swept across Eurasia, no one had the slightest clue what DNA was. Or a cell. Or hell, even the notion that organisms too small to see could be what caused the plague. No, they thought it was God's punishment, or witches' curses, or a comet or some shit like that.

I'm just throwing that out there because I continue to see romanticization of the past. While today sucks, the past sucked worse.
January 31, 2023 at 12:01am
#1043924
Mostly, this is just an interesting article from Vice to share. But naturally, I have comments. Some are serious, others, not so much.

     A Total Amateur May Have Just Rewritten Human History With Bombshell Discovery  
Ben Bacon is "effectively a person off the street," but he and his academic co-authors think they've found the earliest writing in human history.


The idea that an "amateur" might make a discovery isn't all that shocking. People with experience sometimes let that experience get in the way of coming up with fresh ideas, and nothing says "fresh ideas" like a newbie. Hell, Einstein was famously working as a patent clerk when he figured out how most of the Universe worked. This, of course, doesn't mean that an amateur is always going to get it right.

What's more important is the "discovery" itself, and whether it will hold up under scrutiny.

In what may be a major archaeological breakthrough, an independent researcher has suggested that the earliest writing in human history has been hiding in plain sight in prehistoric cave paintings in Europe, a discovery that would push the timeline of written language back by tens of thousands of years, reports a new study.

This, folks, is how you write a lede. And it's even in the first paragraph.

These cave paintings often include non-figurative markings, such as dots and lines, that have evaded explanation for decades.

Samuel Morse went back in time and left messages? *Bullet* *Bullet* *Bullet*   *Dash* *Dash* *Dash*   *Bullet* *Bullet* *Bullet*

Ben Bacon, a furniture conservator based in London, U.K. who has described himself as “effectively a person off the street,” happened to notice these markings while admiring images of European cave art, and developed a hunch that they could be decipherable.

BEHOLD THE POWER OF BACON

Now, Bacon has unveiled what he believes is “the first known writing in the history of Homo sapiens,” in the form of a prehistoric lunar calendar, according to a study published on Thursday in the Cambridge Archeological Journal.

Technically, if it is writing, then it's not "prehistoric." By definition.

Intrigued by the markings, Bacon launched a meticulous effort to decode them, with a particular focus on lines, dots, and a Y-shaped symbol that show up in hundreds of cave paintings.

This supports my Samuel Morse time-traveling theory, if we also assume he was horny and thinking about the pubic regions of females.

Previous researchers have suggested that these symbols could be some form of numerical notation, perhaps designed to count the number of animals sighted or killed by these prehistoric artists. Bacon made the leap to suggest that they form a calendar system designed to track the life cycles of animals depicted in the paintings.

I was wondering how that relates to a "lunar calendar," but fortunately, the author continues to practice good journalism:

The researchers note that the paintings are never accompanied by more than 13 of these lines and dots, which could mean that they denote lunar months. The lunar calendar they envision would not track time across years, but would be informally rebooted each year during a time in late winter or early spring known as the “bonne saison.”

Hey, that's French. I didn't need years of study to know that this means "good season."

On a more serious note, finding out when the calendar ticked around would be pretty cool. Our Gregorian calendar begins nearly equidistant from the winter (northern hemisphere) solstice and Earth's perihelion (that bit's a coincidence). The original Roman calendar on which it was largely based rolled over at the beginning of spring. That's why the names of the ninth, tenth, eleventh, and twelfth months start with Latin prefixes for seven, eight, nine, and ten, respectively... but I digress.

It's a cycle, so it doesn't really matter what you call the end/beginning, but it might shed some light on the ancients' thought processes.

The “Y” symbol, which is commonly drawn directly on or near animal depictions, could represent birthing because it seems to show two parted legs.

What did I tell you? I told you.

“Assuming we have convinced colleagues of our correct identification, there will no doubt be a lively debate about precisely what this system should be called, and we are certainly open to suggestions,” they continued. “For now, we restrict our terminology to proto-writing in the form of a phenological/meteorological calendar. It implies that a form of writing existed tens of thousands of years before the earliest Sumerian writing system.”

I'm not an expert, as you know (I even had to look up "phenological"), but I feel like calling it "writing" or even "proto-writing" is a stretch. "Counting," maybe, I could see.

As far as I've been able to learn, writing came from earlier pictograms, and those pictograms stood for actual things in the world. The letter A, for example, can be traced to a pictogram for an ox. Basically, all writing starts as emoji, becomes a system for communicating more abstract thoughts, and then, after centuries of scientific, cultural, and technological advancement, we start communicating in emoji again.

But counting? What I don't think a lot of people appreciate is how abstract a number is. There is no "thing" in nature that you can point at and say, "that is the number three." There was a huge leap when someone figured out that three oxen and three stones have something in common; to wit, the number three. So if you only know pictograms, how do you represent three? "3" hadn't been invented yet. You use, maybe, three dots, perhaps representing three stones. It's not a painting of something that exists in nature, like an ochre ox on a cave wall, but a representation of an abstract concept.

This may be a classification problem. Numbers are a kind of language, too. And that ochre ox isn't an ox; it's a painting of one.

The only way the people of the past can communicate to us is through metaphor. Okay, and genetics.

It would be hard to overstate the magnitude of this discovery, assuming it passes muster in the wider archaeological community. It would rewrite the origins of, well, writing, which is one of the most important developments in human history. Moreover, if these tantalizing symbols represent an early calendar, they offer a glimpse of how these hunter-gatherers synchronized their lives with the natural cycles of animals and the Moon.

This bit I'm going to quibble with. I question whether early humans separated themselves and their works from nature, as we do today. But that's kind of irrelevant to the story.

In short, if the new hypothesis is accurate, it shows that our Paleolithic ancestors “were almost certainly as cognitively advanced as we are” and “that they are fully modern humans,” Bacon told Motherboard.

They couldn't have been fully modern humans; they didn't have beer. Jokes aside, though, I wasn't aware that this was in dispute. They didn't have our enormous body of knowledge and experience, but they were just as smart (or dumb) as people are today. Ignorance is not the same thing as lack of cognition.

Ignorance can be fixed. Stupid can't.
January 30, 2023 at 12:01am
#1043857
Today in "you've got it all wrong," courtesy of Cracked...



Just to get this out of the way: something "not making sense" doesn't mean it's wrong; it could mean you're missing information. But there's stuff that doesn't make sense, and then there's this stuff, which has been proven wrong (or at least not shown to be right).

The fields of psychology and psychiatry are incredibly complex.

Oh, good, just right for this blog.

It’s not too surprising, given that “understanding human thought and behavior” seems more like a question you’d take to some wise man on a mountaintop than something you’d choose as a major.

You know why wise men live on mountaintops? Well, one, to hide from their wives. But also because when you climb the mountain and pass all the arduous tests and solve the unsolvable riddles and finally meet the guru, and you ask him a stupid question like that, he can kick you right off the cliff.

A lot of the ideas and advice dispensed by TikTok psychologists is obviously flawed, if not outright disproved.

This should go without saying, but apparently, I have to say it anyway: don't get your advice from DickDock.

So without further introduction (though the article does, indeed, provide further introduction), the circulating misinformation in question:

5. Smiling Makes You Happy

This one is the classic bugaboo of anybody with even a smidgen of clinical depression.


Vouch.

Making it worse is that the person who tells you this is usually the most carefree person you’ve ever met.

It would be wrong to punch them, but I understand the urge.

The roots of what is called the “facial feedback theory” comes all the way from Charles Darwin in the 1800s, and although Darwin’s got a pretty solid track record, psychology from the 1800s does not.

Okay, look: periodically, some outlet (usually affiliated with a group who wants to see the idea of evolution via natural selection go away) proclaims, "DARWIN WAS WRONG." You get the same thing with Einstein. People love to tear down other people who are more knowledgeable and influential than they are (I'm not immune from this, myself). Was Darwin wrong? I'm sure he was wrong about a lot of things, being, you know, human and all. Have some of his hypotheses been overturned? Sure. That's how science works. It's not like some other human pursuits, where the prophet's words are supposedly infallible for all time. Evolution is a solid theoretical framework built on a firm foundation. Psychology... well, it's a bit shakier.

Not only that, studies have found that if you’re not in a neutral state, but genuinely sad or angry, forcing a smile can make you feel worse. These studies also found that workers forced to smile all day were more likely to drink heavily after work.

As this article points out, the actual evidence is mixed, here. Given the uncertainties, I'd lean toward "stop making people smile when they don't feel like it, dammit." And yes, this includes service workers. Especially service workers.

In any event, this particular item is something I'd have guessed anyway, so it passes my personal "sense" test. This next one was maybe more surprising.

4. Brainstorming Is More Creative

Brainstorming: the persistent idea that a bunch of brains in a room and a whiteboard can produce more creative ideas than any of those brains alone. Unfortunately, research has found that this can’t always be the case, and for reasons that people who’ve sat through these kind of sessions probably felt at the time.

On the other hand, I'd wager that a brainstorming session is only useful if the people involved aren't just wishing they were somewhere else.

This section goes into exactly why brainstorming isn't all it's cracked up to be, and I won't replicate that here.

3. You Only Use 10 Percent of Your Brain

Seriously, people still believe that nonsense? Sigh... I guess because of anchoring bias. You learn something, and often you have to cling on to it in the face of evidence to the contrary. Like believing the last Presidential election was stolen. No amount of facts and evidence will get anyone to change their minds about that. Come to think of it, perhaps those people are only using 10 percent of their brains.

This one is another absolute chestnut of bullshit. There are even entire (bad) movie plots based around Bradley Cooper turning into a borderline superhero by turning all the lights on upstairs.

I don't remember that one offhand. Wasn't there one with a plot like that with Scarlett Johansson?

If you’re saying to yourself right now, “Well, it’s EXAGGERATED maybe, but—,” allow me to refer you to neuroscientist Sandra Aamott, who tells Discover Magazine, “There is absolutely no room for doubt about this.”

Look, when a scientist says there's "no room for doubt"? Then you can have pretty high confidence, on the level of "the sun is bright" and "gravity is a thing."

2. The Power of Visualization

I’m sorry for The Secret lovers and vision-board crafters out there (on multiple levels), but the heavily touted “power of visualization” is not only a crock of bullshit, there’s evidence to support that it actually decreases your chance of success.

And they won't believe it, like I said above.

That’s because when you visualize yourself having achieved whatever your goal du jour is, you get a tiny sniff of the accomplishment of having done it, which can reduce your drive.

On the other hand, I can't imagine anything reducing my drive, short of death or coma.

What’s a lot more helpful, and a lot less fun (hence its lack of popularity), is specifically visualizing all the work necessary to achieve that goal.

Oddly enough, I was thinking about this sort of thing before I found this article. The context was cooking—it occurred to me that I have a habit of mentally going through all the steps for a recipe before actually starting. I don't "visualize" the resulting dish, or at least not longer than it takes for me to go "okay, yeah, I'm hungry," but mentally running through the steps helps me ensure I have all the stuff I need in the kitchen.

1. OCD Means Being Neat

This one is as pervasive as it is infuriating. Odds are some type-A friend or acquaintance of yours has said something like, “I’m completely OCD about my workspace.”


At least the incidence of using debunked Freudian terms ("anal") to describe it has decreased.

As psychology professor Stephen Ilardi explains in the Washington Post, most OCD sufferers are “plagued by a cascade of unbidden, disturbing thoughts, often in the form of harrowing images that they may feel compelled to ward off with time-consuming rituals. It’s a serious mental illness that typically causes great distress and functional impairment.”

I knew someone who was diagnosed with severe OCD, a single mom. This didn't manifest as her becoming some sort of neat freak; quite the opposite. Think of the worst hoarding situation you've ever witnessed or heard of. It was that bad. Shit piled everywhere (sometimes literal shit). There was even talk about getting the kid out of that situation, but honestly, I didn't pay enough attention to know if that was ever done or not.

I don't know enough about psychiatry to know how that sort of thing works. I got the impression that it was something like "if I disturb this pile, bad things will happen, so I'm just going to leave it alone."

From what I understand, she got help and is better now, but the article has it right: it's serious stuff, whether it manifests as neatfreakitude or hoarding or anything in between.

But while we're at it, can we also stop misusing "type-A?" Thanks.
January 29, 2023 at 12:02am
#1043822
Time for another break to take a second look at an entry from the past. Today, the random numbers pulled something from June of 2021, just a few days before a road trip I took. Nothing to do with the road trip, though: "Dream a Little Dream"

The linked Guardian article is, unsurprisingly, still up. The main point? To quote the article, "By injecting some random weirdness into our humdrum existence, dreams leave us better equipped to cope with the unexpected."

That is, to be clear, a hypothesis, at least as of when the article was published. Now, what I should do is track down any updates or changes to the science since the article's publication, but to be honest, I can't be arsed right now. I'm in intermittent pain from that tooth thing I talked about a couple of days ago, and the only time I can get decent sleep is the "less pain" phase of "intermittent." So I'm being lazy.

What I find relevant right now is the "random weirdness" part, since, yesterday, I noted the benefit of randomization to help break from thinking habits. That was in relation to tarot, but after getting this (random) result today, the first thing I thought of was how dreams are often symbolic, and people sometimes search for meaning in them. Seems parallel to me: dreams and tarot.

Again, I'm not proposing anything mystical here, just our propensity to seek meaning in symbolism.

The main difference, I think, is that the tarot uses other people's symbols, some from very long ago, while dreams are (for now) uniquely yours. There's probably some overlap, naturally. But I wouldn't put any trust in "dream interpretation" books or sites; none of them can know what a particular image in a dream means to you.

And of course it might mean nothing at all, but that doesn't stop us from looking for meaning. There's nothing wrong with that, provided you don't run around claiming to have had the One True Last Inspiration. That's annoying to the rest of us.
January 28, 2023 at 12:02am
#1043778
Today's article is a few years old, but it's not like the subject matter has an expiration date.



With their centuries-old iconography blending a mix of ancient symbols, religious allegories, and historic events, tarot cards can seem purposefully opaque. To outsiders and skeptics, occult practices like card reading have little relevance in our modern world. But a closer look at these miniature masterpieces reveals that the power of these cards isn’t endowed from some mystical source—it comes from the ability of their small, static images to illuminate our most complex dilemmas and desires.

Symbolism is a powerful thing, and there's nothing supernatural about it. It's not necessary (or desirable, in my opinion) to "believe in" the divinatory aspect of Tarot to appreciate the art that goes into it—just like you don't have to be religious to admire the art in the Sistine Chapel, or the architecture of Angkor Wat.

The article, as with the one a couple of days ago, contains illustrative pictures, which are a pain (and probably a violation of something) to reproduce here. But, as with an old issue of Playboy magazine, it pays to read the article in addition to looking at the pictures.

Even the earliest known tarot decks weren’t designed with mysticism in mind; they were actually meant for playing a game similar to modern-day bridge. Wealthy families in Italy commissioned expensive, artist-made decks known as “carte da trionfi” or “cards of triumph.” These cards were marked with suits of cups, swords, coins, and polo sticks (eventually changed to staves or wands), and courts consisting of a king and two male underlings. Tarot cards later incorporated queens, trumps (the wild cards unique to tarot), and the Fool to this system, for a complete deck that usually totaled 78 cards.

The relationship between Tarot decks and the common French playing cards used for casino games and solitaire is a bit murky, but there are clear parallels: the Fool corresponds to the Joker; there are three court cards instead of Tarot's four; and cups, swords, coins, and sticks have their equivalents in hearts, spades, diamonds, and clubs.

The rest of the article deals with the history of Tarot, both factual and speculative, and it touches somewhat on other decks. Again, the illustrations are what makes this really interesting.

I find randomness appealing in part because it can provide a needed break from one's thinking habits. You randomize a deck of cards by shuffling them; you then draw something that's unexpected, though within the parameters of the deck. It's kind of like the system I use to pick topics here, selecting from a curated list. Being random ensures I don't always pick the easy ones, or stick with a theme for very long. Randomness isn't mysticism, of course; it's just that, sometimes, it can help jog your mind in a different direction.
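
(And the "curated list, random draw" system really is as simple as it sounds; a toy Python version, with placeholder topics rather than my actual list, looks something like this.)

```python
import random

# Toy version of "shuffle the deck, draw the top card": the pool is fixed,
# but which item comes up is not. Topic names below are made-up placeholders.
topics = ["architecture", "astronomy", "old blog entry", "food", "tarot"]

random.shuffle(topics)      # randomize the deck
todays_pick = topics[0]     # draw the top card
print("Today's topic:", todays_pick)
```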

We see patterns in the randomness, and perhaps meaning, but the meaning is what we decide it is.

And sometimes it's fun just to look at the art and see all the details.
January 27, 2023 at 12:01am
#1043725
After a visit to the dentist, I'm on a course of antibiotics for a week because of a tooth thing. This means no drinking. 8 hours in. Send help.

Funny thing is, I go a week without drinking, no problem, quite often. It's only when they say I can't that my oppositional defiant disorder kicks in. Kind of like how I've never particularly enjoyed grapefruit, but as soon as I started taking a medication that forbids grapefruit, I started craving it. It's not even like I "can't" drink; it's just that alcohol can interfere with the action of antibiotics, rendering them less effective (the precise opposite of what grapefruit does for statins).

Today's article has nothing to do with that, except that the subject matter is enough to make me want to drink more.



“I’m just circling back to discuss how culture has changed within this new normal we’re in, hoping we can move the needle on this and think outside of the box.”

If I were playing the bizspeak drinking game, I'd already be passed out after that sentence.

But unlike talking about how it’s abnormally chilly out, no one really likes chatting in overused corporate phrases.

Apparently, many do. Mostly middle-management, I'd wager. It's been a long time since I was in an office setting, and even then it was a small office, and I still got subjected to the pin-putting and circling and such.

More than one in five workers dislikes corporate buzzwords...

See? The majority doesn't dislike buzzwords.

Below are the top 10 annoying phrases most hated among your coworkers:

You're damn right I have things to say about these.

1. New normal

This is probably a pandemic-related thing. Shit changes all the time, but the situation in early 2020 was more of a discontinuity than the usual gradual change.

2. Culture (e.g., “company culture”)

I'm not sure this is so bad as long as it's not overused [Narrator: it's overused].

3. Circle back

Pretty sure I remember hearing this one, and it annoyed me. The phrase that accompanied it was often "put it on the backburner," which annoyed me even more, especially when it referred to something I was working on.

4. Boots on the ground

There is no excuse for this unless you're literally fighting a war. And by "literally," I mean "literally."

5. Give 110%

I blame sports for this bullshit. The worst bizspeak, in my view, comes from sports. Even if this were physically possible, which it is not, are you going to pay me 10% more if I do this? No? Then I'm not going to do this.

6. Low-hanging fruit

As metaphors go, this one's not so terrible—unless it's overused [Narrator: ...sigh].

7. Win-win

Seriously, stop. Though it is nice to occasionally hear evidence that it's not a zero-sum game.

8. Move the needle

...once it's already jammed into your eye

9. Growth hacking

Okay, that's a new one for me, and it's legitimately enraging.

10. Think outside the box

The problem with the idea of thinking outside the box is that most people can't even think inside the box, which is a necessary first step. This is also known as "thinking." For example, say that your problem is you want to save money. The "thinking" solution is to find where you're spending too much money, and cut back. The corporate "thinking outside the box" solution might be to cut 1/3 of your workforce and make the other 2/3 do all their work without giving them raises. If you were really "thinking outside the box," though, you'd stop paying everyone and fuck off to Fiji.

Despite disliking buzzwords, three-fourths of respondents said that using these phrases can make someone sound more professional.

It certainly makes them sound more corporate.

But not all buzzwords are annoying. Preply respondents favored terms like “at the end of the day,” “debrief,” and even “sweep the floor.”

No, no, and no. Also no: "It is what it is." Make it all stop.

One in five respondents considered jargon in a job description to be a warning sign, with most noting that the language factored in their decision to apply or not.

You want to know what the biggest red flag is in a job description? I'll tell you. And it's not necessarily jargon. Here it is: "We consider ourselves family." If you see those words, or anything like them, in a job description, run. Run hard, run fast, and don't stop running until you hit an ocean. Then start swimming. Seriously. Every company that tells you they're "like family" is going to be just as dysfunctional as an actual family; or, perhaps, be an actual family that works well together—in which case you're going to be the Outsider and never quite fit in.

The main offenders for candidates were overly optimistic words that suggested an undercurrent of a more tense work environment, such as “rockstar,” “wear many hats,” and “thick skin.”

If you want me to be a rockstar, you'd better have the caterers ready to provide me with specialty cheeses and an olive bar. It's right there in my contract; didn't you read it?

This reminds me of the secret code of real estate listings, like "cozy" meaning "cramped," "private" meaning "in the middle of nowhere," or "vintage" meaning "draftier than a beer bar."

About the only positive thing I can say about these kinds of buzzwords is that they do make fine fodder for writing, especially writing antagonists. So it can be beneficial to learn them. Just remember, if you use them unironically, that means you're the bad guy.
January 26, 2023 at 12:02am
#1043676
By now, the true origins of Monopoly (the game) have been circulated pretty widely, so, like me, you probably already know that the official origin story is a bunch of horse hockey. But it's true that the classic game's spaces were lifted from Atlantic City.

     How Atlantic City inspired the Monopoly board  
The popular game has a backstory rife with segregation, inequality, intellectual theft, and outlandish political theories.


Which made it all the more amusing when, on a trip to an Atlantic City casino, I ended up playing a Monopoly-themed slot machine.

More on that later.

There have been several attempts to turn Monopoly the game into a Hollywood movie, one with Ridley Scott directing, another starring Kevin Hart.

I'm not aware of a single instance of a movie adaptation of a game being anything better than "meh." "But The Witcher." Well, The Witcher started out as a book and the game was an adaptation of that. Besides, that's not a movie but a series. A very good series, in case you haven't seen it. No, you don't need to have read the book or played the games.

Point being, even though he directed the greatest movie of all time, even Ridley Scott wouldn't be able to save a movie adaptation of a board game. No one would.

Dig deep, and you’ll find racial segregation, economic inequality, intellectual property theft, and outlandish political theories.

Dig deep into anything American and you'll find all those things.

But let’s start with the board—a map of sorts and a story in itself.

This is where you'd have to go to the linked article, as embedding pictures here is a pain in the ass. The map there shows exactly which Monopoly properties come from which streets.

To aficionados of the game, however, the names of the streets on the “classic” board have that special quality of authenticity, from lowly Baltic Avenue to fancy Park Place. Those places sound familiar not just if you like Monopoly, but also if you drive around Atlantic City, New Jersey’s slightly run-down seaside casino town.

And you will want to drive around if you're there. I tried walking there, for about a mile, in broad daylight, on a weekday, along Pacific Avenue, and got two offers of sex, three offers of drugs (there was a bit of overlap there), and the opportunity to witness a violent confrontation between two locals.

On the plus side, I didn't get mugged, so there's that.

Atlantic City was never not "slightly run-down." It's only worse now, as the surrounding states have introduced casinos and other gambling venues.

The bulk of the article describes the mapping of Monopoly properties to AC streets, and I'm skipping most of that, except:

Light purple
Three streets branching off Pacific Avenue: Virginia Avenue, a long street towards the northwest; and St. Charles Place and States Avenue, two short spurs towards the southeast. St. Charles Place is no more; it made way for a hotel-casino called the Showboat Atlantic City.


It was the Showboat where I played the Monopoly slots. Slot machines suck, but I couldn't resist playing a Monopoly one in Atlantic City. Last I heard, the hotel took out the gambling section, opting instead to concentrate on resort and convention functions.

I haven't seen that particular machine anywhere else in AC. They used to have a few in the casinos I visited in Vegas, but those are gone, too. The slots, I mean; not the casinos.

The article then delves into more of the history, with all the racial segregation and other fun stuff mentioned above. However, unlike Atlantic City itself, it's not all bad:

Belying both the binary prejudices of the time and the sliding price scale of the Monopoly board, Atlantic City back then was in fact a place of opportunity where a diverse range of communities flourished. Black businesses thrived on Kentucky Avenue. Count Basie played the Paradise Club on Illinois Avenue. There was a Black beach at the end of Indiana Avenue. For Chinese restaurants and Jewish delis, people headed to Oriental Avenue. New York Avenue had some of the first gay bars in the U.S.

An Atlantic City-based board was sold to Parker Brothers by Charles Darrow, who claimed to have invented the game in his basement. Parker Brothers marketed the game as Monopoly from 1935. The rights to the game transferred to Hasbro when it acquired Parker Brothers in 1991.

Hasbro also publishes D&D, and they're in the process of destroying that property, too.

But the original Monopoly was, as this article notes, the actual antithesis of what Monopoly is. For the full effect, again, check the article, which also includes a graphic featuring an early board, as designed by the credited inventor, whose name was Lizzie Magie.

She created two sets of rules: an anti-monopolist one, called Prosperity, in which all were rewarded for any wealth created; and a monopolist one, called Monopoly, in which the aim was to crush one’s opponents by creating monopolies. In the latter version, when a player owns all the streets of one color, they can charge double rent and erect houses and hotels on the properties.

Taken together, these two versions were meant to illustrate the evil of monopolies and the benefit of a more cooperative approach to wealth creation. It’s very telling of human nature that it’s the opponent-crushing version that came out the winner.


It's more telling of corporate nature, as it was a corporation that published the game. Why would they undermine their own philosophy?

And I don't know... maybe if the collectivist version had won out, the divorce rate wouldn't be so high. Never play Monopoly with family, unless you don't want a family anymore.
January 25, 2023 at 12:01am
#1043629
Yes, this has been languishing in my queue since October. The article itself is four years older than that, though.

     Actually, Candy Corn Is Great  
The reviled Halloween treat, which has deep roots in American history, should have a better rep


1) No, it's not. 2) No, it shouldn't. Candy corn is a vile abomination that could only have sprung from a warped, twisted, sadistic mind.

Much like the word “moist” and the music of Nickelback, candy corn is a thing that’s cool to hate. In an article titled “Candy Corn Is Garbage,” Deadspin points to “hobos, serial murderers, and Satan” as the only people who like candy corn; The Takeout, also driven to invoke the devil to describe candy in a candy corn debate, calls it “Satan’s earwax”; Buzzfeed, combining two pariahs in one pithy line, lists “the leftover crumbs stuck in Guy Fieri’s goatee” among things that taste better than candy corn.

While it's true that there are things that people love to hate due to bandwagoning, candy corn is not among those things. It's legitimately lame.

"Satan's earwax" cracks me up, though.

But here’s the thing: They’re all wrong.

"That's just, like, your opinion, man."

Candy corn, on the other hand, has been around since the 19th century, its roots firmly planted in American soil.

You know what else has roots firmly planted in American soil? Poison ivy.

What set candy corn apart was its revolutionary tri-color design: those white, yellow, and orange stripes. Done manually, by men pouring heavy buckets of steaming sugary liquid, the labor-intensive coloring process resulted in a visual excitement no other confection could match.

As the other candies around at the time were brown (butterscotch) or black (licorice), I can concede that point—for the time when it came out. These days, I doubt it's so labor-intensive, unless you're part of the Robot Union (local 3.14159), and... well, if you want colors, just look at Spree, Skittles, or M&Ms.

Today, the two major candy corn manufacturers — Jelly Belly and Brach’s Candy — use largely the same recipe Wunderle did back in the day (sugar and corn syrup, fondant, confectioner’s wax, and various other additions, like vanilla flavor or marshmallow creme).

Conveniently, this article glosses over the truth about "confectioner's wax," which is bug secretions.  

Now, look. I admit I'm playing that for the ick factor. I mean, sure, it's real: there's bug goo coating candy corn. But honestly, that's not a problem for me. Consider that, first of all, lots of people eat insects. I've eaten insects, sometimes even on purpose. There's nothing inherently wrong with eating bugs. And, second, honey is also a bug secretion. Unless you're vegan, this shouldn't necessarily be a problem.

If I wanted to get technical, I'd point out that entomologists limit what insects they call "bugs," but for us normal people, "bug" can mean almost any insect. Just getting that out of the way so I don't get comments about it.

But no, my problem with candy corn isn't the insect content; it's everything about it.

The main difference is that the laborious hand-pouring process has been taken over by machines, which means that they can produce a lot of candy corn: According to the National Confectioners’ Association, American companies produce 35 million pounds, or 9 billion kernels, annually.

I told you they used machines. Rise up, my metallic brothers and sisters! You have nothing to lose but your chains!

But this prodigious production isn’t met with an equal amount of enthusiasm. A 2013 survey from the NCA showed that only 12 percent of Americans think of candy corn as their favorite treat (and they included “gum and mints” as an option, so the competition wasn’t exactly stiff).

Still, 12 percent is way too high, in my estimation, for the number of people for whom it's a "favorite."

With all the candy corn produced, and the apparent universal disdain for it, something doesn’t add up. One of two things is true: either people are lying about their candy corn opinions, or tons of candy corn gets thrown out each year.

I'm guessing both?

The notion that candy corn tastes bad is a lie. It’s just not true.

There exists a significant fraction of the human population for whom cilantro tastes like the devil's soap. It's a genetic thing. I'm not one of them, though I can't say I love it, either. But if I said "the notion that cilantro tastes bad is a lie," I'd get all kinds of rebukes.

Though the primary ingredient is sugar, candy corn’s flavor transcends cloying sweetness, becoming something richer and more nuanced: There’s a nuttiness reminiscent of marzipan, hints of warm vanilla, a buttery flavor belied by the fact that candy corn is, as bags proudly proclaim, a fat-free candy.

I don't exactly have the sharpest taste buds, but I do tend to taste nuance in things like beer, wine, scotch, and tequila. Candy corn, however, just tastes like sweet. No marzipan, no vanilla (a flavor I love), maybe a slight hint of butter? Not surprising there, because warm sugar tends to be buttery.

Being fat-free is a holdover from the fat-phobic 90s. Who cares if it's fat-free if it's nothing but simple carbohydrates? But we're not arguing about the health effects; it's candy, for fuck's sake.

This short texture resembles ear wax, or a candle (two common comparisons), only insofar as it has a slightly waxy exterior, created by the confectioner’s wax that gives candy corn its cheerful sheen.

Bug. Secretion.

But regardless, critics should beware the logical extension of dismissing a food because its texture resembles something else: Do we hate mochi because it has the texture of a rubber ball?

No matter how much I read, there's always a new food I've never heard of. What the hell is mochi? ...oh, a Japanese rice cake. Sometimes, I can be arsed to look something up. (Yes, I admire Japanese culture and love Japanese food; no, I haven't learned everything. This is a good thing.)

Do we revile yogurt because it’s the texture of body lotion?

No, I revile Greek yogurt because it's the texture of gooey chalk.

Do we recoil at flourless chocolate cake because it shares a texture with human waste?

Munched on a lot of shit, have you?

Leave your texture arguments at the door, please. They’re invalid.

Most of the people I know who dislike mushrooms have a problem with their texture. Texture is absolutely a part of the eating experience, and, as with taste, people's reactions are going to be different.

But I’m not here to denigrate other candies. Other candies are great! Reese’s Peanut Butter Cups are the greatest candy ever made...

No. No, they are not. The chocolate is waxy (can't be arsed to find out if that's from bug secretions or not), and the "peanut butter" is dry, vaguely peanut-flavored sugar.

I realize that RPBCs make it to the top of the list of "people's favorite candies" on an annual basis, so I know I'm swimming against the tide, here. I'm just pointing this out to show that I don't hold opinions just because they're popular.

Now, if someone made an RPBC knockoff, only more expensive, with dark chocolate and actual peanut butter, I'd become diabetic within minutes.

...Snickers truly do satisfy...

They're not that great. When it comes to chocolate/peanut combinations, though, I'll take a Snickers over an RPBC any day, even though I dislike peanuts but like peanut butter.

...and even tooth-destroying Butterfingers hold a unique place in my heart...

On a scale of one to "all the candy bars," Butterfingers are in the solid middle for me.

My love for candy corn doesn’t make me an antagonist to America’s most popular treats — and the assumption that it would is at the root of America’s abandonment of candy corn, and, dare I say, many other problems we face today: We seem to have forgotten that we can like one thing without hating another.

And finally—FINALLY—the author says something I can agree with. It's okay to like both Star Trek and Star Wars. It's okay to like both Marvel superheroes and their DC counterparts. You could even like more than one sportsball team, if you really wanted to. It's not just the either/or framing that I take issue with, but also the need to dump everything into "awesome" and "sucks" drawers without considering, as I did with the Butterfinger bar above, that some things are just okay.

Now, I should probably point out that I know that this writer is making a point with her editorializing. I recognize it, because I do it myself from time to time. And I kind of see her point, in the general sense: that we should draw our own conclusions about something, and not love it, hate it, or feel something in between, just because everyone around us does.

She's wrong about candy corn, of course. It's disgusting. But she's right about the overall point.

After all this ranting, you may be wondering what my favorite sweet treat is. And I can't really answer that. Even though I don't munch on sugar very much these days, I'll get tired of one and move on to another. It cycles. So I'll just say "Lindt 70% dark chocolate" and leave it at that.

So what's your favorite / most hated?
January 24, 2023 at 12:02am
#1043587
Lots of stuff about AI floating around. Cars, art, writing, etc.

It's not always a bad thing.

     Artist Uses AI Surveillance Cameras to Identify Influencers Posing for Instagram  
Dries Depoorter's "The Follower" project combines AI, open access cameras, and influencers to show behind the scenes of viral shots—without them knowing.


This article, from Vice, is fairly short, and I found it interesting, partly because of my photography background.

Dries Depoorter, the Belgium-based public speaker and artist behind the Die With Me chat app experiment, launched his latest project, The Follower, combining open access cameras and Instagram influencers.

On the other hand, I'm not a fan of precious artists.

Depoorter recorded weeks of footage from open access cameras, which observe public spaces, and which frequently have livestreams available online for anyone to access, that were trained on famous landmarks, including the Temple Bar in Dublin, Times Square, and the big sign at the entrance of Wrigley Field.

This part's important, because it emphasizes just how public this project is. It's not like he had to pull back much of a curtain.

The side-by-side comparisons between the casual-seeming photos the Instagram influencers chose to upload, and the footage of them laboring over the perfect way to hold a coffee, sling a jacket over their shoulder or kiss their date reveal how much work goes into a single photo for them—and how inauthentic the entire process really is behind the scenes.

As much as I loathe the entire concept of influenzas, and superficiality in general, I mean, that's a big part of what professional photography is: a lot of work. Sure, I spent a lot of time getting candid shots at parties, the kind of thing that anyone with a dumbphone can do now, but those are easy. Getting the right lighting, the right pose, the right composition... that's work, and that's why professional photography is still a thing.

“If you check out all my work you can see I show the dangers of new technology,” Depoorter said.

I think the dangers are overreported. How about a project that exposes just how helpful some of this stuff is?

“I hope to reach a lot of people of making it really simple. I really don’t like difficult art. I like to keep it really simple. I think I’m part of a new generation of artists that work with technology.”

Everyone's hypocritical about something, but this juxtaposition—all within one paragraph of the original article—nearly broke my brain.

Capturing people in this way, unsuspecting yet fully public, feels like witnessing something intimate but also shameless.

Yeah, not really. To me, it feels like exposing the wires in a puppet show, or getting a tour of a clock tower, or watching one of those documentaries on the making of a Hollywood blockbuster: you see how the magic is done. That's not always a bad thing, either; once people know it's not effortless, perhaps they're less likely to feel inadequate by comparison.

It's like... you see your favorite celebrity, all slim and attractive, so maybe you feel like you got the short end of the beauty stick or something. But then you realize the amount of work that goes into that, and, okay, maybe it's not so natural after all. There still might be some feelings of inadequacy—in my case, I can't fathom doing that much work for anything—but at least you know there's more to it than just winning a genetic lottery.

It’s also a reminder that everywhere we go in the modern world, we’re being watched, even when we think we can curate and control what the world sees of us.

Isn't that what Elf on the Shelf is supposed to train your kids for?
January 23, 2023 at 12:01am
#1043538
Space is cool. And sometimes, we can learn things about it while never leaving our planet.



First, confession time: I'm always confused by ton, metric ton, short ton, shit ton, etc. A metric ton is apparently 1,000 kg, which is the same thing as 1 megagram, which is a separate thing from a megaton, and is a unit of mass, not weight, though on the surface of the Earth, mass units are often used as weight units. See why I get confused?

If you're from the US, 15 metric tons is about 33,000 pounds, which for comparison is about the weight of a half-full ready-mix concrete truck. If you're from the UK, it's about 2,360 stone. If you're from anywhere else, it's 15 metric tons.
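For anyone who, like me, can never keep the conversion factors straight, here's that arithmetic spelled out as a little Python sketch. The constants are just the standard conversion factors; none of the numbers come from the article beyond the meteorite's mass.

    # Rough mass conversions for the meteorite (illustrative only)
    KG_PER_TONNE = 1000        # a metric ton (tonne) is 1,000 kg by definition
    LB_PER_KG = 2.20462        # approximate pounds per kilogram
    LB_PER_STONE = 14          # a stone is 14 pounds

    def tonnes_to_pounds(tonnes):
        return tonnes * KG_PER_TONNE * LB_PER_KG

    def tonnes_to_stone(tonnes):
        return tonnes_to_pounds(tonnes) / LB_PER_STONE

    print(round(tonnes_to_pounds(15.2)))  # about 33,510 pounds
    print(round(tonnes_to_stone(15)))     # 2362, i.e. the "about 2,360 stone" above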

Scientists have identified two minerals never before seen on Earth in a meteorite weighing 15.2 metric tons (33,510 pounds).

I'm going to go off on another tangent here.

As you know, I'm a big fan of science fiction. Love the stuff. But sometimes it's more fiction than science, like when someone finds an alien artifact and exclaims "This contains elements not found on the periodic table!" or some shit like that.

Well, no. You're going to have to do better than that. The periodic table is full; that is, there are no gaps for new, alien elements. Each entry on the table represents a number of protons in a nucleus. You don't get to have half a proton. The only other possibility is elements beyond the end of the current established table, ones that we'd need a particle accelerator to create. While those might exist, their nuclei are so large and unstable that they would have a half-life measured in picoseconds. There is speculation about an "island of stability" of heavier elements with longer half-lives, but even there, they're thinking half-lives of, perhaps, days—still too short to survive an interstellar journey.

Sure, it's speculation, so you can pretend there's a superheavy element that's completely stable, but I want to see that in the story, not just "new element!"

Unobtainium, my ass. No, I'm still not going to watch Avatar: The Last Waterbender.

Okay, so there is one other possibility I can think of for "elements not found on the periodic table": exotic matter. Like, I dunno, maybe atom-equivalents made up of nuclei consisting of particles containing strange and charm quarks, with electrons replaced by tau particles. Speculative stuff like that. I've already banged on long enough, so I'll just say that if you mean exotic matter, freakin' say "exotic matter," and don't pretend that your "unknown element" is a collection of ordinary protons, neutrons, and electrons.

All of which is to say that I can easily see someone reading this CNN story and thinking "two minerals never before seen on Earth" and immediately leaping to "exotic matter." No. A mineral is a particular arrangement of a known element or known elements, like quartz (silicon and oxygen), corundum (aluminum and oxygen), pyrite (iron and sulfur), diamond (carbon), graphite (carbon), or chaoite (carbon).

What this article is saying is that these unusual arrangements of perfectly ordinary elements don't get formed naturally on Earth (or at least not in sufficient quantity to have been discovered). They have, as the article notes, been created in laboratories.

One mineral’s name — elaliite — derives from the space object itself, which is called the “El Ali” meteorite since it was found near the town of El Ali in central Somalia.

Herd named the second one elkinstantonite after Lindy Elkins-Tanton, vice president of Arizona State University’s Interplanetary Initiative.


Well, I suppose that's one way to try to get someone to sleep with you.

“Whenever you find a new mineral, it means that the actual geological conditions, the chemistry of the rock, was different than what’s been found before,” Herd said. “That’s what makes this exciting: In this particular meteorite you have two officially described minerals that are new to science.”

Technically, they weren't formed under geological conditions. That would imply that they were indeed made on Earth. If they were made on the Moon, they'd be called selenological conditions; on Mars, areological conditions. I don't know what they're called if they're from a random asteroid, and I can't be arsteroided to find out.

Also technically, the minerals aren't new to science; just the naturally-occurring forms are.

Incidentally, none of my above ranting is meant to downplay the coolness of finding new minerals from space. It's a potential glimpse into low-gravity mineral formation, and possibly even the early conditions of the solar system, and that's great for science. (And no, it's not aliens.)

Two-thirds of the way down the page, they finally get around to describing—sort of—the composition of the minerals:

Both new minerals are phosphates of iron, Tschauner said. A phosphate is a salt or ester of a phosphoric acid.

I'm sure that clears everything right up for those of you without chemistry backgrounds. Though you're probably familiar enough with phosphoric acid. It's one of the primary non-water ingredients in Coke, which I happen to be drinking right now (look, even I can't drink beer all the time).

“Phosphates in iron meteorites are secondary products: They form through oxidation of phosphides … which are rare primary components of iron meteorites,” he said via email. “Hence, the two new phosphates tell us about oxidation processes that occurred in the meteorite material. It remains to be seen if the oxidation occurred in space or on Earth, after the fall, but as far as I know, many of these meteorite phosphates formed in space. In either case, water is probably the reactant that caused the oxidation.”

Even if the oxidation occurred on Earth, it's still interesting because the basic materials were there to be oxidized. And there's water in space (that's how it got here in the first place), mostly in the form of ice, so it's not outrageous to imagine a body on an eccentric orbit whose internal ice melts periodically, allowing liquid water to do its reaction thing.

Comets, for example, contain significant amounts of water. But from what I understand, their formation is distinct from that of iron-rich asteroids. The point is, though, that water's out there.

Anyway, questionable science reporting aside, I thought this was cool enough to share—but more importantly, to nitpick.
January 22, 2023 at 12:02am
#1043497
Yet another blast from the not-so-distant past today. Such is the randomness of random numbers.

"Scare Tactics is from the day before Halloween, 2021, so only about 15 months ago. It's commentary on a Cracked article that lists a few fearsome folkloric figures.

In large part, I do these retrospectives to see if anything's changed since the original entry—not only with whatever information is discussed, but also my thoughts about it.

Well, these monsters are from myth and legend, and those don't tend to change much in a year and a quarter. Unlike many entries, I actually remembered this one to some extent, because I like to learn about folklore from different cultures. Doesn't hurt that it's relatively recent. But that also means that I haven't changed my opinions, so there's not much to expand upon here. I didn't even see any embarrassing typos this time. I'm not saying there aren't any; only that I didn't see them.

Of course, the source article is still there, too. Here's another link to it for your convenience.

One thing that stands out to me is the "band name" trope I used. I'm sure some people find it tiresome, but to me, it's endlessly amusing to take interesting word combinations and come up with what kind of band it would be. In that entry, I said that "Slavic Female Demons" would be an excellent name for a hard metal Go-Gos cover band.

I stand by that, incidentally.

The Go-Gos were, if I recall correctly (I sometimes don't), the first popular group I saw live, back when they were big and I wasn't. It's not that I was a huge fan (though I totally had a crush on the drummer), but they just happened to have a concert at a nearby amusement park, and being able to visit said park on my own (well, with fellow teen friends and not parents) was a big deal to me at the time.

That said, I'd totally go see a band named Slavic Female Demons. As long as there are no actual dziwozona involved.
January 21, 2023 at 12:01am
#1043460
Science isn't always about probing the origins of the Universe, or figuring out quantum entanglement, or curing cancer. No, sometimes it delves into the most important questions.

     You Don’t Know How Bad the Pizza Box Is  
The delivery icon hasn’t changed in 60 years, and it’s making your food worse.


I'm not sure that the subhead up there is exactly correct. Yes, as we'll see in this article, the pizza box makes that most perfect of foods somewhat less tasty, but when you consider the extant alternatives, it's really the best we've got.

Where the science comes in is figuring out how to make the best better.

Happiness, people will have you think, does not come from possessing things. It comes from love. Self-acceptance. Career satisfaction. Whatever. But here’s what everyone has failed to consider: the Ooni Koda 12-inch gas-powered outdoor pizza oven.

That's a strong argument, and one I tend to accept, although I don't have one of those.

Since I purchased mine a year ago, my at-home pizza game has hit levels that are inching toward pizzaiolo perfection. Like Da Vinci in front of a blank canvas, I now churn out perfectly burnished pies entirely from scratch—dough, sauce, caramelized onions, and all.

Now I'm hungry. Though that sounds like a lot of work, it's probably one of those few things that are actually worth the effort.

But enlightenment is not without its consequences. The pies from my usual takeout spot just don’t seem to taste the same anymore.

Okay, I'll address the elephant in the room if no one else will: Elephant, why would this guy even bother ordering takeout pizza when he has an Ooni Koda?

They’re still fine in that takeout-pizza way, but a certain je ne sais quoi is gone: For the first time, after opening up a pizza box and bringing a slice to my mouth, I am hyperaware of a limp sogginess to each bite, a rubbery grossness to the cheese.

You don't have to have three and a half years of Duolingo French lessons under your belt to know what "je ne sais quoi" means: "I don't know what." In the rest of the article this author asserts that he does, in fact, know what.

Pizza delivery, it turns out, is based on a fundamental lie. The most iconic delivery food of all time is bad at surviving delivery, and the pizza box is to blame.

One of my favorite breweries is right here in my hometown. During the lockdown in 2020, I supported them by ordering beer and food for delivery about once a week. Canned or bottled beer isn't as good as draft, but it's not bad. Their burgers survived the 2-mile delivery trip quite well. Their frites, however, arrived soggy and mushy; they're much better if you get them at the restaurant. They put a bunch of frites in a little metal basket, which gets dipped into the fryer oil and delivered, basket and all, to your table. Naturally, the basket doesn't come with the delivered version, which is instead handed to you in a recycled-cardboard container.

While this particular brewpub doesn't do pizza, the frites thing is a close equivalent to what this author is talking about.

A pizza box has one job—keeping a pie warm and crispy during its trip from the shop to your house—and it can’t really do it.

Warm, sure, to an extent. That corrugated cardboard is pretty good insulation. As he describes later, though, that same box concentrates moisture inside, turning the pizza limp.

The fancier the pizza, the worse the results: A slab of overbaked Domino’s will probably be at least semi-close to whatever its version of perfect is by the time it reaches your door, but a pizza with fresh mozzarella cooked at upwards of 900 degrees? Forget it. Sliding a $40 pie into a pizza box is the packaging equivalent of parking a Lamborghini in a wooden shed before a hurricane.

I don't think I've ever ordered a $40 pizza. Sometimes, by the time delivery fees and driver tips are included, I've come close... but never quite $40.

I know for a fact I've never had a Lamborghini, or a wooden shed.

And yet, the pizza box hasn’t changed much, if at all, since it was invented in 1966.

This is probably due to economics. But this is where the science comes in. Or, perhaps, engineering, which is really just applied science: come up with a pizza delivery system that keeps the pie warm but doesn't ruin it, and doesn't cost much. As noted above, Domino's, probably the largest chain, has no incentive to do this; their shit is shit whether it's "fresh" or out of a delivery box. So it's going to be up to actual scientists and/or engineers. Unfortunately, while this article is very descriptive, it doesn't propose actual solutions.

To be fair, neither can I. I just want my pizza.

Unlike a Tupperware of takeout chicken soup or palak paneer, which can be microwaved back to life after its journey to your home, the texture of a pizza starts to irreparably worsen after even a few minutes of cardboard confinement.

If you reheat it right, though, leftover pizza can be delicious. I know I've linked to some scientific experiments along those lines in here before. Ah, here it is, from October of 2021: "What Do You Mean, 'Leftover Pizza?'"

That discussion doesn't address the problems with the pizza box, though.

The basic issue is this: A fresh pizza spews steam as it cools down. A box traps that moisture, suspending the pie in its own personal sauna. After just five minutes, Wiener said, the pie’s edges become flaccid and chewy. Sauce seeps into the crust, making it soggy.

Worse, the poor benighted souls who have never ordered pizza from an actual New York City pizzeria and eaten it right there on the spot think that this is what pizza is supposed to taste like.

By 1949, when The Atlantic sought to introduce America to the pizza, the package was already something to lament: “You can take home a pizza in a paper box and reheat it, but you should live near enough to serve it within twenty minutes or so. People do reheat pizza which has become cold, but it isn’t very good; the cheese may be stringy, and the crust rocklike at the edges, soggy on the bottom.”

What I didn't note is that today's article is also in The Atlantic.

Corrugation produces a layer of wavy cardboard between a top and bottom sheet, sort of like a birthday cake. The design creates thick, airy walls that both protect the precious cargo within a pizza box and insulate the pie’s heat while also allowing some steam to escape.

I should note that I have gotten takeout pizza (if not delivery) that was packaged in a single-ply, though thick, cardboard box. It's not any better at keeping the pizza at peak.

We’ve gotten a couple of pizza-delivery innovations in the past few decades: the insulated heat bag—that ubiquitous velcroed duffel used to keep pies warm on their journey—those mini-plastic-table things, and … well, that is mostly it.

I've actually had people ask what the table is for. That's okay; it's not necessarily blindingly obvious. It's to keep the top of the box from contacting the toppings, and potentially pulling them off. Then you have a pizza crust, and a cardboard box top with the toppings on it. Which, to be fair, wouldn't taste much different from Domino's.

“Every single pizza that I put in a box I know is going to be, let’s say, at least 10 percent not as good as it could have been,” Alex Plattner, the owner of Cincinnati’s Saint Francis Apizza, told me. Others dream of better days. “After smoking a lot of weed, I have come up with a lot of ideas for a better box,” said Bellucci, the New York City pizza maker.

Weed is legal for recreational use in New York City now, so there should be a slew of innovative ideas coming out of that metropolis any day now. Ideas, but not necessarily their execution. Too much work for a stoned person.

And I just have to say how hilarious Saint Francis Apizza is.

Last year, the German brand PIZZycle debuted the Tupperware of pizza containers, a reusable vessel studded with ventilation holes on its sides.

I take back the bit about weed. If it's going to lead to people naming their brand PIZZycle, maybe we should stick to booze. No, there's no evidence that weed was involved in that decision, but there's a strong link between pizza and getting stoned, so I assume the connection in the absence of evidence to the contrary.

So we know it’s not a question of ingenuity: We can construct better pizza boxes, and we already have. The real issue is cost.

Like I said.

Domino’s alone accounts for nearly 40 percent of delivery-pizza sales in the U.S.—on par with all regional chains and mom-and-pops combined. Perhaps these big companies are stifling real pizza-box innovation.

I shouldn't be surprised. This is the same "culture" that insists on soft white bread, pasteurized process cheese "food," and rice-adjunct lagers. We, as a society, have crap taste. I don't personally like chicken wings, but when spicy chicken wings became popular, I at least held out some hope that we'd get over our phobia about any spice hotter than mayonnaise, but that hasn't happened.

Again, though, if you have your own backyard gas-powered 900 degree pizza oven, why are you even bothering with delivered pizza? I mean, I'm all about lazy, but pizza transcends even that.

Now, if you'll excuse me, I have a frozen pizza to bake.
January 20, 2023 at 12:01am
#1043396
I thought y'all would want to see this.



I use "y'all" as a second person plural, a part of speech that English otherwise lacks. And it can't always be inferred from context.

Southern Living magazine once described “y’all” as “the quintessential Southern pronoun.” It’s as iconically Southern as sweet tea and grits.

I like grits, but sweet tea can kiss my... ass.

“Y’all” fills that second person plural slot – as does “you guys,” “youse,” “you-uns” and a few others.

"You guys" is considered sexist these days, "youse" is still pretty much limited to a small area in the Northeast, and I'm not sure about "you-uns." I think Pittsburgh uses "yinz."

I’m interested in “y’all” because I was born in North Carolina and grew up saying it. I still do, probably a couple dozen times a day, usually without intention or even awareness.

I use it too, but more intentionally. I thought I used it more, but a quick search of this blog of over 2,000 entries only yielded 64 entries with "y'all." This would be #65.

Back in 1886, The New York Times ran a piece titled “Odd Southernisms” that described “y’all” as “one of the most ridiculous of all the Southernisms.”

Damyankees.

Like the Southern dialect in general, the use of “y’all” has often been seen as vulgar, low-class, uncultured and uneducated. As someone noted in Urban Dictionary, “Whoever uses [y’all] sounds like a hillbilly redneck.”

The only way to change this perception is to use it with intention.

The etymology of “y’all” is murky.

So is the etymology of a lot of other words.

My examples push “y’all” back 225 years before the citation in the “Oxford English Dictionary,” and they show that the word appeared first in England rather than the United States.

I think it’s important to point out that it originated in a more formal context than what’s commonly assumed. There are none of the class or cultural connotations of the later American examples.


Now, I can't be arsed to research this right now, but I think older versions of English made a distinction between second person singular and plural: "thou" and "thee" for one person, "ye" and "you" for more than one. That's how we got the constructions now associated with the KJV and maybe Shakespeare. Or something like that; like I said, not looking it up now.

Still, there it is, in an English poem written in 1631.

Not long after Shakespeare, really. Y'all Brits invented the language; we just perfected it.

“Y’all means all” – that’s a wonderful phrase that seems to be popping up everywhere, from T-shirts and book titles to memes and music.

Sounds good to me.

Now, how about we come up with a first-person plural that distinguishes between "us, including you," and "us, not including you?" Like if I said, "We're going to a party," does that mean you're invited? No. No, it does not, and now I'm embarrassed because you inferred that it did.
January 19, 2023 at 12:01am
#1043341
This one's just an interesting hypothetical question, though not so much for the question or answer, but for the approach to it.

      Was It Ever Possible For One Person To Read Every Book Ever Written (in English)?  
Randall Munroe Provides a Serious Answer To a Very Hypothetical Literary Question


Munroe is the guy who does the excellent nerdy webcomic xkcd, and also answers questions like this in book format.

The obvious, simple, and trivial answer to the headline question is "yes" (unlike most headline questions), because at the very least, once the first book was written in English, one person could then read every book ever written in English.

But then you have to define "English," which can be tricky, because languages don't generally spring, Athena-like, from the head of some creator, but evolve over time and by mixing languages together. You've probably heard of Old English, Middle English, etc., but the boundaries between them are pretty arbitrary.

The actual question:

“At what point in human history were there too many (English) books to be able to read them all in one lifetime?”
–Gregory Willmot


To take a stab at summarizing the beginning of the article, you'd need to know how fast someone can read as well as at what point the sum total of English literature, in a form that can be defined as a "book," exceeded the amount that someone can read in a lifetime. As Munroe puts it at the beginning:

This is a complicated question.

And the answer is also complicated, but I'm afraid you'll have to read the article itself to find it. Again, the way he gets at an estimate is the interesting part. And it gets into things like writing speed, too, which should be relevant to readers here.
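Just to sketch the shape of that estimate, here's a toy version in Python. Every constant in it is an assumption I'm pulling out of the air for illustration; none of these are Munroe's figures.

    # Toy lifetime-reading estimate (all constants are made-up assumptions)
    WORDS_PER_MINUTE = 250      # assumed average reading speed
    HOURS_PER_DAY = 8           # assumed hours of reading per day
    READING_YEARS = 60          # assumed reading lifetime
    WORDS_PER_BOOK = 80_000     # assumed average book length

    lifetime_words = WORDS_PER_MINUTE * 60 * HOURS_PER_DAY * 365 * READING_YEARS
    lifetime_books = lifetime_words // WORDS_PER_BOOK
    print(lifetime_books)  # about 33,000 books under these assumptions

The hard part, and the interesting part of the article, is the other half: figuring out when the cumulative pile of English-language books blew past whatever that number turns out to be.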

There's also the question Munroe himself poses, which is probably more germane to reality:

On the other hand, how many of them would you want to read?

Fair point.

© Copyright 2024 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted Writing.Com, its affiliates and its syndicates non-exclusive rights to display this work.
