by Robert Waltz
Not for the faint of art.
A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is a square root of -1). For example, 3 + 2i is a complex number.
The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.
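As it happens, Python has complex numbers built right in (it writes i as "j", engineering-style), so the real-part/imaginary-part idea above can be poked at directly. A minimal sketch:

```python
# Python's built-in complex type: 3 + 2i is written 3 + 2j
z = 3 + 2j

print(z.real)     # real part a: 3.0
print(z.imag)     # imaginary part b: 2.0
print((1j) ** 2)  # i squared: (-1+0j), i.e. -1
```

Note that `.imag` gives you b itself (2.0), not the full bi term.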
Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.
Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
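The most famous example of such a "very simple transformation" is the Mandelbrot iteration z → z² + c: for each point c in the complex plane, you repeatedly square and add, and ask whether the result stays bounded. Here's a rough sketch (the function name `escapes` and the iteration cap are my own choices, not anything canonical):

```python
def escapes(c: complex, max_iter: int = 50) -> bool:
    """Return True if the orbit of 0 under z -> z*z + c flies off to infinity."""
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:  # once |z| exceeds 2, divergence is guaranteed
            return True
    return False

# Crude ASCII rendering of the plane: '#' marks points whose orbit stays bounded
for im in range(10, -11, -2):
    row = ""
    for re in range(-20, 11):
        row += "." if escapes(complex(re / 10, im / 10)) else "#"
    print(row)
```

Even this tiny grid shows the familiar bulb-and-cardioid silhouette; zoom in anywhere on its boundary and the intricacy never bottoms out.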
|Ever wonder about grapefruit?
Grapefruit Is One of the Weirdest Fruits on the Planet
From its name, to its hazy origins, to its drug interactions, there’s a lot going on beneath that thick rind.
You know, there are quite a few weird fruits on the planet. Durian comes to mind. And wtf is up with breadfruit? But yeah, okay, maybe grapefruit is a bit weird, especially since grapes are already fruits and they have nothing to do with citrus so what's with the name?
Right from the moment of its discovery, the grapefruit has been a true oddball. Its journey started in a place where it didn’t belong, and ended up in a lab in a place where it doesn’t grow. Hell, even the name doesn’t make any sense.
The current theory is that somewhere around five or six million years ago, one parent of all citrus varieties splintered into separate species, probably due to some change in climate. Three citrus fruits spread widely: the citron, the pomelo, and the mandarin.
You don't see pomelo much around here. I have a vague memory of eating one, long ago, in a foreign land. At the time, I thought it was a cross between an orange and a grapefruit. I guess that was backwards.
With the exception of those weirdos like the finger lime, all other citrus fruits are derived from natural and, before long, artificial crossbreeding, and then crossbreeding the crossbreeds, and so on, of those three fruits. Mix certain pomelos and certain mandarins and you get a sour orange. Cross that sour orange with a citron and you get a lemon. It’s a little bit like blending and reblending primary colors. Grapefruit is a mix between the pomelo—a base fruit—and a sweet orange, which itself is a hybrid of pomelo and mandarin.
Yeah, I know. I got lost too.
Speaking of all these names, let’s discuss the word “grapefruit.” It’s commonly stated that the word comes from the fact that grapefruits grow in bunches, like grapes. There’s a pretty decent chance that this isn’t true. In 1664, a Dutch physician named Wouter Schouden visited Barbados and described the citrus he sampled there as “tasting like unripe grapes.” In 1814, John Lunan, a British plantation and slave owner from Jamaica, reported that this fruit was named “on account of its resemblance in flavour to the grape.”
This is largely guesswork, almost all of it, because citrus is a delightfully chaotic category of fruit. It hybridizes so easily that there are undoubtedly thousands, maybe more, separate varieties of citrus in the wild and in cultivation.
Seriously, though, the vast variety of citrus and its ease of modification is pretty fascinating.
The article goes on to describe how grapefruit, and other citrus, led to Florida becoming Florida, so there's another reason for me to hate grapefruit.
It also talks about the very interesting discovery that grapefruit completely fucks with some medications.
Now, I've said this before but I'll say it again: I've never liked grapefruit. I mean, I never really hated it; if it's there I'll eat it but I never sought it out, or deliberately obtained grapefruit juice to drink. It was just something that was there. That is, until I started taking a statin, at which point I got really intense cravings for grapefruit.
The surest way to get me to want something is to tell me I can't have it. I mean, it's possible that if someone told me "you can't eat eggplant or your blood pressure will go through the roof," I'd want to go out and buy bushels of eggplant.
Possible, but I doubt it. At least I always acknowledged that grapefruit was edible.
Anyway, I'm not going to quote the circuitous part of the article that goes into the discovery of grapefruit interactions, but basically, it blocks an enzyme that breaks down certain medicines, so more of the drug ends up in our systems than expected.
I know a guy who takes advantage of this. He's poor and has shit insurance, so he stretches out his statin supply by taking 1/2 the recommended dose and munching on grapefruit.
Pretty sure that's not recommended.
“There are a fair number of drugs that have the potential to produce very serious side effects,” says Bailey. “Kidney failure, cardiac arrhythmia that’s life-threatening, gastrointestinal bleeding, respiratory depression.” A cardiac arrhythmia messes with how the heart pumps, and if it stops pumping, the mortality rate is about 20 percent. It’s hard to tell from the statistics, but it seems all but certain that people have died from eating grapefruit.
And see, I'd rather die from eating something that I actually like.
|You'd think that, as a long-time beer snob, I'd like Guinness.
You'd be wrong.
I mean, it's not bad, and it's certainly an important beer for many reasons. I'm sure I'd like it better if I were to go to Ireland and drink it from the tap there. And given a choice between Guinness and that watered-down rice water that passes for "beer" that's mass-produced in the US, I'll take Guinness every time.
But I made the mistake once of drinking a pint at a concert. The mistake was, before the concert, I'd gone to a taphouse and had a really good stout. By comparison, the Guinness just didn't measure up.
Like I said, though, it's a culturally important beer for many different reasons, one of them being just how long it's been brewed.
Beer drinkers are used to seeing eye-catching beer labels — some good, some not so good. But stroll past a beer fridge and you might see a label that’s simple, nostalgic, and a throwback to the 1930s: the Guinness toucan.
Beer art is flourishing right now. Graphic artists are probably having a field day with all the logos and labels and tap handle designs, to say nothing of posters and taphouse decorations.
“The Guinness family did not want an advertising campaign that equated with beer,” the UK History House writes. “They thought it would be vulgar. They also wanted to stress the brew’s strength and goodness. Somehow it led to animals.”
At least they didn't pick an elephant.
There was the pelican that stole everyone’s beers with the copy, “My goodness — my Guinness!” Also the sea lion that had a habit of stealing Guinness. And the turtle that steals Guinness on its back. A lot of thirsty animals, basically. None earned as much fame as the toucan, however.
I don't think I was ever aware of these other mascots.
The toucan’s brightly colored beak contrasts nicely with two dark glasses of Guinness (almost always two glasses, playing off of the similar sounds of “too can” and “toucan”).
As I'm only aware of the ones with draft pints, I never would have made the "two can" connection. No cans, no pun.
The toucan and its gang of animal friends graced Guinness ads for decades. Then, in 1982, Guinness stopped working with S.H. Benson and dropped the animals. In recent years the toucan has made some appearances, including a limited-edition can released in 2016. But it primarily lives on in the memories of Guinness lovers and collectors.
They're more than happy to sell toucan merchandise in dedicated Guinness stores (yes, these exist), along with the famed bar towels.
So yeah, even with the proliferation of beer choice nowadays, there will always be a place for Guinness. But that's as much a testament to the power of marketing as to the beer itself.
|I'm guessing... no one, because there's still a pandemic going on.
Yeah. I know. No one cares anymore. And I get that -- I quit caring about a lot of things, myself, like the environment or adhering to society's expectations of me. But not about the pandemic. Too many people have suffered long-term effects, and I'm just done with everyone who's pretending it doesn't exist. They've made it so that, instead of me sometimes going out and doing stuff, I have to never go out and do stuff. And I miss doing stuff.
But you know what I don't miss? I don't miss having to do the work involved with having guests over. Or, more usually, me doing the work and then people cancelling at the last minute.
Just can't trust anyone.
Fortunately, I've entered Grumpy Old Man territory and I no longer have to put up with shit that I don't want to put up with, such as people who are annoying. Which is almost everybody.
Don't get me wrong - if I'm invited to dinner, once I can move around again, I'll be as polite and respectful as I know how to be (though you'll still have to put up with my jokes). But no more hosting for me. Too much work for not enough reward.
Yeah, I'm in a bad mood right now. Like I said: Grumpy Old Man.
|Well, it looks like I have a new mission in life. In addition to the beer odyssey.
The mission is to visit every state and verify -- or disprove -- this list.
"All of humankind has one thing in common—the sandwich," renowned late-aughts philosopher Liz Lemon once theorized, on NBC's 30 Rock. "I believe that all anyone really wants in this life is to sit in peace and eat a sandwich."
Late-noughties. There's no excuse to call the 2000-2009 decade anything else. (Spare me the pedantry of pointing out it's "really" 2001-2010.)
Roughly as old as the country and invented by the Earl of Sandwich, an Englishman who never seemed to have time for a proper sit-down meal, Americans have spent the entirety of our nation's existence seeking to perfect the humble art form.
You know, that "Earl of Sandwich" origin story never quite sat well with me, like the aftereffects of... well, of a sandwich that's maybe a bit too greasy.
The sandwich is named after its supposed inventor, John Montagu, 4th Earl of Sandwich. The Wall Street Journal has described it as Britain's "biggest contribution to gastronomy."
If you follow the links from that Wikipedia page, you'll find that Montagu, Earl of Sandwich, didn't invent the sandwich any more than Amerigo Vespucci "discovered" America, and yet they became their respective namesakes. So okay, fine, the concept existed before they named it that.
Anyway, back to the original article.
There were strict parameters; no burgers, no hot dogs, no burritos, no tacos, and in nearly all cases, no barbecue. Sandwiches or not sandwiches, they can go ahead and get their own lists.
Want to start an argument? Ask if a hot dog is a sandwich.
I mean, sure, it's meat and stuff eaten with bread, but technically it's not two pieces of bread, but a single hinged bun. Consequently, it's a taco. I have spoken.
Now, obviously I'm not going to comment on all of the examples here. For starters, I haven't eaten any of the examples. But I do have some things to say -- obviously.
If anyone asks you what the 1970s were like in Los Angeles, drag them down—immediately, if not sooner—to Langer's Deli, the best Jewish deli in America, for the pastrami.
That's a bold statement right there. Now, granted, I haven't been to Langer's, but for some reason pastrami sandwiches are quite popular throughout the West, and I've had a few -- and not a single one of them can compare to the ones you get in New York City (or even one I've had right here in my Virginia town). However, I'm a reasonable person, and will withhold judgment until I get to L.A.
At least they're better than most of the South, which somehow came to the mistaken conclusion that it's okay to serve cold pastrami. It is not.
We like to think of the hot brown as the beginnings of a fine turkey club: roast bird, strips of bacon, and slices of tomato on toast. Plot twist—the whole thing is then flooded with rich Mornay sauce before hitting the broiler, emerging a delicious mess that requires a knife and fork to consume.
I have no doubt that it is delicious, but this is not a sandwich. Yes, I know that "open-face sandwiches" are supposedly a thing, but if it's a piece of bread with toppings on it, it's not a sandwich; it's a pizza. Regardless of toppings.
Surely there are lobster rolls in a coastal state bookended by Maine and Connecticut, but we're too busy filling up on clam rolls, which are the first meal we think of when we think Massachusetts, or at least the very large amount of the state located by the ocean.
I have nothing to say about this creation that I haven't tried yet, but I'm including this to point out that I went to a BBQ place in central Massachusetts once that billed itself as having "The Best BBQ in the US." Being from the South, I had to go in and dispute that claim. Well, I did -- but I have to admit, it didn't suck.
...that other delightful, if less widely-renowned, Nebraska invention, the deep-fried grilled cheese sandwich.
Go. Go look at the picture of it at the link. I'm getting a heart attack just looking at it.
A serious jostling in the mosh pit that is South Philly's perennially cramped John's Roast Pork rides high atop our list of post-pandemic musts, partly just to feel something after our terrible year of No Touching, but also for a sandwich, a roast pork sandwich please, the one any right-minded Philadelphian can tell you is the one you go looking for once you're ready to dine sober, in the sunlight, like a whole adult.
I'm including this one so my friends from Pennsylvania can go find the article's authors and shove some Philly cheesesteak down their gullets.
In a perfect world, the I-81 struggle through America's Secretly Biggest State would have been long ago ameliorated by modern conveniences including a third lane in each direction; in the meantime, better to think of this essential leg of the Northeast to Deep South fast route in terms of the many small detours one can make in order to feel human again, after yet another hour of staring at the rear end of the same tractor trailer.
Virginia is not a state to travel through. It is a state to travel to. Or preferably to be from.
I haven't tried the sandwiches mentioned here but you'll note that unlike most of the other states' offerings, there are two choices. Now I have to try them, but I seriously doubt they'll be better than the pastrami and swiss with spicy mustard and onion on a sub roll that's served at a lunch counter here in my town.
So... road trip, anyone? As long as I also get to try the nearby breweries.
|Mostly I'm just linking this one because I ran across it one day and thought it was interesting. Not enlightening. Interesting.
A Buddhism Critic Goes on a Silent Buddhist Retreat
Something weird happens to a skeptical science writer during a week of meditation, chanting and skygazing
This just in: Weird things happen to your mind if you take a break from the usual routine and try something new.
Now, look, I'm not being skeptical of the skeptic here. I have no reason to believe he's relating anything but the truth about his lived experience. I'd just, personally, rather read about it than do it.
So I recently put my skepticism to the test by going on a weeklong silent Buddhist retreat, which my pro-Buddhism friends Lisa and Bob argued was my moral obligation.
So, the rule is: if someone critiques a philosophical viewpoint, they have a moral obligation to spend a week immersed in that philosophy?
The retreat cost $1800, and we were encouraged to give the Lama a donation at the end.
So, at the very least, your bank account gets "enlightened."
Look, even monks gotta make a living, I get it. And $1800 plus tip to do essentially nothing but navel-gaze for a week isn't so bad, if food, lodging and activities (such as they are) are provided.
Each day’s schedule, which lasted from 6 A.M. to 9:15 P.M...
That's a hard pass for me right there. I'm sure your mind is more open -- or, on the flip side, more susceptible to suggestion -- if you're tired. And I know for some people that's something like a normal schedule. But not me.
The retreat convinced me that contemplation can reproduce the effects of psychedelics, a claim I have long doubted. On the retreat, as during a trip, I saw life’s inexplicability and improbability, which I like to call “the weirdness.” On psychedelics, the weirdness screams at you. On the retreat, the weirdness murmured.
Well? The only way you can find out if it's true or not is to try both. Perhaps at different times. Perhaps at the same time. Maybe all of the above. You know. For science.
Now that I’m back in the real world (which, given the digital distractions, is more virtual than real)...
I really dislike this dichotomy. Lots of things you experience are real, whether in person or mediated through a communication device. The people you interact with online are real. The stuff you get from the internet is real. Or, well, to take the Buddhist angle, as real as anything else, anyway. Or are you going to assert that talking to someone on the phone, back when we did such things, wasn't a real interaction? The internet is just an extension of communication.
And that's always been my issue with many philosophies: they turn shit around so up is down and left is right. They claim that what's real is an illusion and the shit your mind comes up with is reality. All this does is muddle the definitions of "real" and "illusion" until those are just meaningless sounds.
There's an argument to be made that we can only perceive things as they're filtered through our senses and processed by brains. But that doesn't make the chair I'm sitting in any less real, and it doesn't make the random thoughts I have any more real.
But hey, like I said, I'm not ragging on anyone else's experience, here. The article is worth a read in its entirety, I think (or I wouldn't have read it in its entirety).
And you're not going to convince me that the article isn't real, even if it is nothing but electrons flitting around in cyberspace.
|This one's a little on the deep side, and I can't claim to understand all of it. But I'm putting it out here anyway in the hope that someone will get something out of it.
Questioning Truth, Reality and the Role of Science
In an era when untestable ideas such as the multiverse hold sway, Michela Massimi defends science from those who think it hopelessly unmoored from physical reality.
Incidentally, my browser's spellchecker doesn't recognize the legitimacy of "untestable." There's probably a metaphor in there somewhere, but I'm entirely too sober to tease it out right now.
It’s an interesting time to be making a case for philosophy in science. On the one hand, some scientists working on ideas such as string theory or the multiverse — ideas that reach far beyond our current means to test them — are forced to make a philosophical defense of research that can’t rely on traditional hypothesis testing. On the other hand, some physicists, such as Richard Feynman and Stephen Hawking, were notoriously dismissive of the value of the philosophy of science.
I expected better from Feynman, and Hawking always struck me as having something akin to a philosophical streak.
That value is asserted with gentle but firm assurance by Michela Massimi, the recent recipient of the Wilkins-Bernal-Medawar Medal, an award given annually by the U.K.’s Royal Society. Massimi’s prize speech, delivered earlier this week, defended both science and the philosophy of science from accusations of irrelevance. She argues that neither enterprise should be judged in purely utilitarian terms, and asserts that they should be allies in making the case for the social and intellectual value of the open-ended exploration of the physical world.
Personally, I think judging anything on "purely utilitarian terms" misses an important part of the human experience. Which is not to say I don't have a utilitarian perspective on a lot of things; I just don't think that's the only viewpoint worthy of consideration.
The article goes on to interview Massimi, and while I won't quote most of it here, it's worth seeing both the questions and her answers. But this is one question that struck me as interesting -- not for the answer, appropriately enough, but for the question itself:
One criticism made is that science moves on, but philosophy stays with the same old questions. Has science motivated new philosophical questions?
Her answer is important from one perspective, but I'll offer another point of view: Yes, science motivates new philosophical questions. It does this all the time. These discussions even seep into popular culture. The very first true science fiction book, Shelley's Frankenstein, delves into the philosophical repercussions of applied science (albeit from a purely fictional perspective). The question isn't answered, of course -- we're still asking it, in stories, right up to the present day. And undoubtedly beyond.
Without science offering the possibility of creating something new in the world, there would be little reason to ask those questions on a serious level.
Later, a part of her answer to another question goes like:
Obviously it is not the job of philosophers to do science, or to give verdicts on one theory over another, or to tell scientists how they should go about their business. I suspect that some of the bad press against philosophers originates from the perception that they try to do these things. But I believe it is our job to contribute to public discourse on the value of science and to make sure that discussions about the role of evidence, the accuracy and reliability of scientific theories, and the effectiveness of methodological approaches are properly investigated.
Ethics is in the purview of philosophy. Perhaps not so much in physics, but other branches of science, particularly biology and related fields, absolutely have to consider ethics. Animal testing? Hell, human testing? The potential for cloning? The quote from the original Jurassic Park movie comes to mind: "Your scientists were so preoccupied with whether they could, they didn't stop to think if they should."
But this is, for me, the most important point she makes:
In this respect, I see philosophy of science as delivering on an important social function: making the general public more aware of the importance of science. I see philosophers of science as public intellectuals who speak up for science, and rectify common misconceptions or uninformed judgments that may feed into political lobbies, agendas and ultimately policy-making.
This article is from 2018, and since then I think at least one major thing has happened that demonstrates the value of science in the public sphere. While there is certainly research that is, at best, questionable -- nutrition science comes to mind -- I think one of the greatest issues facing us right now is general distrust of science and the dismissal of experts, in science or other fields, as no more useful than your uncle's deranged Failbook posts.
So yes, by all means, keep up the philosophy. I won't understand all of it. But I don't have to.
|Okay, it's called "Food Grammar," but "Grub Grammar" is alliterative so I'm exercising my literary license. Hey, that also alliterates.
Introducing ‘Food Grammar,’ the Unspoken Rules of Every Cuisine
Technically, spaghetti and meatballs is bad grammar.
One of the common experiences of humankind -- well, actually, of pretty much all animals -- is that we eat. Humans make it social, though, so you'd think we'd have more customs in common. That turns out not to be the case.
Serve spaghetti and meatballs to an Italian, and they may question why pasta and meat are being served together.
"Because that's how we do it in the US."
Order a samosa as an appetizer, and an Indian friend might point out, as writer Sejal Sukhadwala has, that this is similar to a British restaurant offering sandwiches as a first course.
Depends on the sandwiches.
Offer an American a hamburger patty coated in thick demi-glace, and they’ll likely raise an eyebrow at this common Japanese staple dubbed hambagoo.
Now I want to try that.
Each of these meals or dishes feels somehow odd or out of place, at least to one party, as though an unspoken rule has been broken. Except these rules have indeed been discussed, written about extensively, and given a name: food grammar.
I'm calling it grub grammar anyway. Because I'm contrary like that.
Yes, much like language, cuisine obeys grammatical rules that vary from country to country, and academics have documented and studied them.
Sounds to me more like an excuse to try a bunch of different foods.
A culinary grammar can also provide insight into how an assortment of ingredients becomes a meal, much like how a jumble of words becomes a sentence.
So, in this analogy, cooks are all writers, and really good chefs are famous novelists or poets.
The article provides many examples; if you're the slightest bit interested in international cuisine or other cultural dining practices, give it a look.
In Italy, pasta and rice dishes are served before meat rather than alongside it; in Italian-American restaurants, however, fish is often perched on risotto, and meatballs take their starring place atop spaghetti in the eponymous classic.
You know, something about spaghetti and meatballs always bugged me. It's not the taste; it's usually delicious. It's more about the work you have to do in order to eat it. Like... okay, say you have a dish of spaghetti in some other kind of sauce, no meatballs. Whether it's Italian or not is irrelevant here, the point is, it's pasta and sauce. You stick a fork in, twirl it around, and stuff it into your face. But with meatballs, unless they're really really tiny meatballs (which can also work) you have the extra step of breaking up the meatball with the fork or with the aid of a knife. Then you have to get just the right proportion of meatball fragment to pasta to sauce. I mean, you can't just eat the meatball by itself, and you can't just eat a forkful of spaghetti by itself; otherwise, why bother with one or the other?
But this has nothing to do with the national origin of the dish. As far as I'm concerned, S&M (which I enjoy calling spaghetti and meatballs because I'm hilarious) is just as proper a food as anything you get in Italy, or anywhere else for that matter.
I mean, it's not like Italy invented pasta, and it's not like the tomato that forms the basis for so many pasta sauces was native to Italy. They had to put those things together once they got noodles from the east and tomatoes from the west (see the discussion of tomatoes a few entries back). So the Italian cooks, at some point, took these foreign concepts and made their own dishes out of them. S&M is just more of the same.
None of which says a single word about the taste of the dish. And that's what I care about in a dining experience. I don't give two shits if the dish is "authentic."
“Japanese people will take anything and make it theirs,” says Albala, citing shokupan, a Japanese white bread that’s even sweeter and softer than American Wonder Bread.
This... this is nothing to be proud of. If you want to cite one of the greatest achievements of Japan, it's how they were able to steal whiskey from Scotland and come up with distilled spirits even better than most Scotch.
Oh, yeah, I said that out loud. Okay, fine, it's not technically "food," but we were talking about how Japan will take anything and improve upon it.
I'll finish this with my own first known experience with grating grub grammar. And it didn't happen in some exotic foreign land, but in New York City, and it wasn't some other culture's cuisine (there are many such in NYC, which is one thing I love about the place), but in my very own aunt's house.
You'd think that she and her sister, my mom, would have similar ideas about how food is served. And they did, in some ways, though my mom was also influenced by my father's New Orleans background. But as a kid, I was always presented with meal courses in the following order: salad, main course (that is, what we call an entree, which is not the same thing as the French entrée, which is an appetizer), dessert. I mean, that's what you do, generally, right? With of course variations depending on what's being served.
But then I go to my aunt's house and suddenly I'm expected to eat the main course first and then the salad.
I don't know why this freaked me right out. I mean, I can understand having them both at the same time and switching back and forth; that is, treating the salad as a side dish. But my aunt treated it like First Dessert. So I guess that was grub grammar, even though I didn't know what to call it then, and really, didn't know until I read the article I linked above last month.
And now? Now I'm all hungry.
|Well, it looks like I didn't get as astronomical as I expected, so here's my usual just-after-midnight post. Last night, after completing my blog entry, I ended up getting completely danchu, sampling from many different bottles. Predictably, I felt like shit all day (worth it). Hence, tonight, I ended up just taking my traditional couple of shots of birthday tequila, and I also had some port with dessert. Consequently, I'm functional right now.
I'm sure you've all heard about the successful Perseverance rover landing on Mars yesterday. Technology works, and we can do some pretty awesome shit when we put our minds to it. By the time we finally land human boots on the fourth planet, there will be a thriving robot society there. They'll probably pass resolutions to send us back home, or maybe put us in concentration camps to discourage further immigration.
So with the success of the landing, which is designed to search for signs of past life on the Red Planet, it's appropriate that this subject came up at random from my stash.
I'm somewhat sure that there is no alien life on the internet, politicians' tweets notwithstanding. Oh, wait, that prepositional phrase is meant to modify "discussion," not "alien life." English is fun.
Anyway, this blogger, PZ Myers, is one of the few that I follow regularly, because he usually has something interesting to say, even if he does have an obsession with arachnids. Fortunately, spiders are (we think) from Earth so there aren't any in this particular post.
I like speculations about alien life, just as I appreciate the diversity of life on Earth, the different forms of life in the past, and the prospect of evolution in the future, but every time I read about this stuff in astronomy-related journals, I feel like they’re making an effort to reduce my intelligence.
I know the feeling.
The problem is that they have no imagination and no biology, but they’re trying to imagine the nature of alien biology, and all they end up doing is running around in circles trying to figure out why little grey humanoids aren’t landing their flying saucers en masse to march out and shake hands with the president.
I know I've ranted about this sort of thing in here before, so here's another guy's take on the subject.
So, in this post, Myers does what I usually do, which is comment on someone else's article. So here I am commenting on an article that comments on an article. If someone quotes this entry and comments on it, that would be really meta.
This is my problem with the general tenor of these speculations. They all assume that we, that is human-like intelligences, are desirable, inevitable, and the only proper kind of life; they’ve read far too many science-fiction novels prophesying a colonialist destiny led by strong-jawed Anglo-Saxons with glinting eyes and a finger on the trigger of their blaster.
I mean, hey, I love those stories as much as anyone, though I prefer phasers.
They’re always going on and on about the likelihood of finding an intelligent civilization like ours. Why not speculate about finding a planet that has produced kangaroos? Or stomatopods? Or baobab trees? These are all unlikely outcomes of a contingent, complex process that produces immense diversity, and they’re all wondering what the “barrier” is that prevents our kind from winning the cosmic lottery every time. Get over it, we’re not a favored outcome, there’s no direction to evolution, and that’s why there aren’t smarty-pants bipeds tootling about the galaxy stopping by for tea.
I kinda hate that Myers puts this more eloquently than I have in the past. But I agree.
Anyway, I'm not quoting the quotes here, and I'm leaving out a lot of the commentary, so of course I recommend reading the source. Whether you read the article he's ragging on is up to you; I certainly didn't.
Point is, though, that this is a biologist's perspective on the question, but it's similar to one I came up with previously: that if we find life "out there" -- be it living life, or, as is hoped with the Mars robot, evidence of life that existed in the past -- it will, in all likelihood, be something other than a technological civilization. And yet, we've been so conditioned to equate "life" with "intelligent life" (as usual, please refrain from making silly dismissive comments about "intelligence," as the fact that you can make such a comment immediately contradicts it) that it's almost impossible to find someone who doesn't immediately think of bulbous-headed big-eyed gray guys, or Klingons, or other generally humanoid, tool-using, spaceship-flying aliens.
That said, I really hope this Mars rover finds unequivocal evidence of some sort of living, or once-living, organism on Mars. Because that by itself would be an incredibly important discovery, never mind what sorts of things we might learn from it.
Thanks for all the mixed drink recipes! I definitely saw a few I'd like to try. Rolling the Virtual Dice for the nine commenters (sorry, one comment came in just after midnight), the result is 3, so the Merit Badge will go to... Cubby for the mojito recipe (and I do love mojitos).
Still, I appreciated all the comments and we'll do this again soon!
I'm going to start this entry by saying that I've never read this book; I've never heard of this book; and I have not a single clue what the book is about. I mean, really, it could be almost anything:
- A person with more than one personality
- A haunted house
- A church or other religious building
Seriously, just consider all of the various possible meanings and/or connotations of "house" and "spirits" and you're good to go.
Incidentally, because of that last bullet point, this is cheating on the part of whatever author wrote it (like I said, I haven't the faintest clue about the book in question and I'm deliberately not looking it up before doing this entry). It's cheating, because when you have words with that many definitions and connotations, you can employ a great deal of parallelism, and people respond to symbolic parallelism.
And so I'm going to talk about my favorite thing that could be considered The House of the Spirits:
I miss bars. I miss them a great deal. Oh, sure, I've gotten more into mixing my own drinks -- everyone needs a pandemic hobby -- but there was something about sitting at a bar and having a lot of options to choose from, and not having to mix them myself.
I will say this, though: one drink I made called for mint leaves. I tried to get mint leaves from the supermarket, but they were out. Instead, I opted for a mint plant, an actual living potted thing with stems and leaves.
Plants never last long around me. They usually take one look at me, say "Oh hell no," and commit suicide. This one has somehow stuck around for... I don't know... two weeks? Something like that. Time has little meaning anymore. More than a week. Less than a month. I mean, it was kind of iffy there for a while, and my housemate helped, but somehow this plant still lives; I've been pruning it occasionally for leaves, and there's even new growth starting.
But see, if I could go to a bar, I could just order a drink and the mint leaves are already cut and the bartender does all the actual work. And I wouldn't be limited to just the 25 bottles I have on hand.
The drink in question, by the way, is called a Ginger Rogers and I mostly just started making them because I had a metric ton of ginger to use up and I like gin a lot. Yes, the name is a pun because of course it is.
But since I had to also buy the mint plant, now I'm almost out of ginger and have plenty of mint, so soon I'll have to buy more ginger (and gin) in order to use the mint because I can't just let the plant grow out of control. This happens a lot. Say you want ham and cheese sandwiches. So you buy ham, cheese, bread, lettuce, tomatoes, whatever. You run out of ham, so you buy more. But then you run out of bread and you can't let the ham and cheese and veggies go to waste so you buy more bread. Then the cheese runs out and you can't have a ham and cheese sandwich without cheese, so you buy more cheese. Meanwhile, the remaining veggies have gone bad because it's been two days, so you have to buy fresh ones of those and while, sure, you can probably just make a salad you still have ham, cheese and bread so you make sandwiches but then the ham runs out again and you buy more ham... the cycle never, ever ends.
And that's another reason I prefer to go to bars and restaurants.
It's been my tradition for a while now, on my birthday, to drink tequila. Lately this has happened at one or another local tequila bar. Not this year. But I'm prepared. I have a bottle of tequila, a bottle of triple sec, some limes, salt. But I'll probably run out of one or another of those things and have to replace it, and then run out of another and have to replace that. Until my spirit gives up the ghost. The ham and cheese sandwich effect, you know.
So since it's my birthday, it's time for me to give someone a present...
Merit Badge Mini-Contest!
Give me a mixed drink recipe. Link it or type it in, I don't care. And I don't want to leave out my non-drinking readers, so I'm not saying there has to be alcohol involved, just make it an interesting mixed drink (e.g. no "virgin screwdriver," which as far as I know is just orange juice). Out of all qualifying comments, I'll pick one commenter at random to give them a Merit Badge tomorrow. Deadline, as usual, is midnight tonight, Thursday, WDC time.
And if I like a recipe I might even try it out sometime.
One thing, though: I usually post just after midnight because it suits my schedule. This will probably not happen tomorrow because, well, see above about "tequila on my birthday." I will post and choose a MB recipient sometime later on Friday, though, if the tequila doesn't kill me.
|You say toe-MAY-toe, I say toe-MAH-toe...
Clearly, it was because no one knew how to pronounce it. Hey, let's mess with everyone's heads and start pronouncing it toe-MATT-oh. Who's with me? While we're at it, we can restart the old argument over whether it's a fruit or a vegetable (truth: it's either, depending on whether you ask a botanist or a chef).
In the late 1700s, a large percentage of Europeans feared the tomato.
To be sure, there was a lot to fear in late 1700s Europe, and not just if you were a French noble.
A nickname for the fruit was the “poison apple” because it was thought that aristocrats got sick and died after eating them...
Which is how the tiny violin got invented.
...but the truth of the matter was that wealthy Europeans used pewter plates, which were high in lead content.
And they probably ate mercury off of them, too. No, really: apparently mercury was considered a cure for syphilis. This is one reason why I keep saying the past sucked.
Because tomatoes are so high in acidity, when placed on this particular tableware, the fruit would leach lead from the plate, resulting in many deaths from lead poisoning. No one made this connection between plate and poison at the time; the tomato was picked as the culprit.
This is why science is important. You need a double-blind test. Give 1/4 of the aristocrats pewter plates with tomatoes on them. Give another 1/4 of them pewter plates with placebos on them. The third 1/4 gets placebo plates with tomatoes, and the remainder get placebo plates and placebo tomatoes. Record how many of each group dies. Then guillotine the survivors just in case.
Around 1880, with the invention of the pizza in Naples, the tomato grew widespread in popularity in Europe.
Cultural appropriation! The tomato isn't native to Italy, so it's inauthentic cuisine! Oh, well, we got 'em back when New York perfected the pizza.
Before the fruit made its way to the table in North America, it was classified as a deadly nightshade, a poisonous family of Solanaceae plants that contain toxins called tropane alkaloids.
This, in spite of European invaders literally seeing Americans eating the things.
Like similar fruits and vegetables in the Solanaceae family—the eggplant, for example—the tomato garnered a shady reputation for being both poisonous and a source of temptation.
To be fair, eggplant isn't actually food. Oh, sure, I understand that some people eat it, but I refuse to classify it as edible. Consequently, it is not a source of temptation for me. Neither is the tomato, which I can take or leave, but at least I acknowledge it as food.
Gerard’s opinion of the tomato, though based on a fallacy, prevailed in Britain and in the British North American colonies for over 200 years.
Hence the importance of fact-checking. Fake news didn't originate with the internet. Or even with the printing press.
By 1822, hundreds of tomato recipes appeared in local periodicals and newspapers, but fears and rumors of the plant’s potential poison lingered.
Again, a testament to the problem of anchoring bias, where the first piece of information you learn about something sticks with you in spite of later corrections. Still, if that first piece of information is "that shit'll kill ya," this can be somewhat forgivable.
Today, tomatoes are consumed around the world in countless varieties: heirlooms, romas, cherry tomatoes—to name a few. More than one and a half billion tons of tomatoes are produced commercially every year. In 2009, the United States alone produced 3.32 billion pounds of fresh-market tomatoes. But some of the plant’s night-shady past seems to have followed the tomato in pop culture. In the 1978 musical drama/comedy “Attack of the Killer Tomatoes,” giant red blobs of the fruit terrorize the country. “The nation is in chaos. Can nothing stop this tomato onslaught?”
Awww. I was hoping that the article wouldn't mention this seminal work of cinema, so I could write this entry about it. Oh, well, I guess I'll just keep it in the title.
There's more to the article than the bits I riffed off of here; it's worth reading for the description of the dreaded Green Tomato Worm (which is not, in fact, a worm that infests green tomatoes, but a green worm that infests all sorts of tomatoes -- one of the many ambiguities possible with the English language, which is what makes it so much fun).
But mostly I'm glad this article came up in my random roll today, because it's science, history, food, cinema and the potential for comedy all rolled into one, like a burrito. A burrito with chopped tomatoes. Okay. Now I'm hungry. Tomorrow's entry will almost certainly be darker, for reasons that should become apparent.
You're all expecting me to rag on Hubbard here, aren't you? Aren't you? But no, the organization he founded has armies of lawyers and this shit is public.
Instead, I'm going to take this opportunity to talk about science fiction.
I talk about it sometimes in my Fantasy newsletter editorials, because there's a lot that fantasy has in common with science fiction. Incidentally, I really hate calling it "sci-fi," but I have been known to abbreviate it as SF. Rarely, though, because that could also mean "speculative fiction," which is probably a broader concept but I can't be arsed to get into the technical nuances of the differences between genres. I'm certain that for some people, it's their hill to die on.
But, as much as SF has in common with fantasy, there are important differences. I mean, yes, I'm intimately familiar with Clarke's Third Law ("Any sufficiently advanced technology is indistinguishable from magic.") I'm also familiar with the people who think they're oh-so-clever and turn it around to "Any sufficiently advanced magic is indistinguishable from technology." That's not nearly as clever as you think it is.
And I'm not saying that there aren't shades between the two genres. Hell, some of my favorite stories are somewhere on the spectrum between the two extremes. And by "extremes" here, I mean that on the one pole, you have magic and maybe the supernatural. The Platonic ideal of this is, of course, Tolkien. On the other pole you have pure science and technology, no supernatural elements, everything is explainable in terms of laws of the universe. I can't be arsed to come up with a Platonic ideal for that.
I do know it's not Battlefield Earth.
And no, it's not Star Trek, either. I love Trek, but it takes too many liberties with the laws of physics to be pure science fiction. Just to be clear, I have no problem with that; it's just a mental categorization thing.
I can also tell you that the difference between fantasy and SF isn't about time period. There's fantasy set in the future, SF set in the past, and all kinds of variations thereof. There's also fantasy set so far in the future that the basis for the magic is actually science, in accordance with Clarke's Third Law.
Before I stopped reading Orson Scott Card's books, I attended a book signing he did. As a published writer of both fantasy and science fiction, he's probably more qualified to discuss the differences than I am, a mere reader and unpublished writer. And he said: "Look at the book covers. Fantasy has trees. Science fiction has rivets."
This was before steampunk, though. Lots of rivets. No grounding in science.
Also, to be perfectly clear, Star Wars is fantasy. Sure, yes, I know, spaceships, warp drive, robots, whatever. All the tropes and trappings of science fiction. But it's not science fiction; it's fantasy that uses SF props. That is a hill that I will die on.
Again, it doesn't matter. I don't feel the need to choose, any more than I need to choose between Wars and Trek, between Marvel and DC. But genre has one important function: marketing. Some people prefer one over the other, but almost everyone wants to know, at least vaguely, what they're getting into when they start a book or movie or whatever. If it bills itself as horror, it probably shouldn't focus on romance. If it's supposed to be a detective novel, maybe don't turn it into a gothic vampire story (or if you do, at least warn us).
Other people might disagree. And that's where the battle comes in.
|It is not likely that every problem caused by technology has a technological solution. But many do. Here's an example.
Plastic trash can now be recycled into ultra-strong graphene
Plastic decomposition is sped up by the flash Joule heating method
Packaging from the grocery store, lint from our clothing, plastic shopping bags – plastics and microplastics are everywhere, and they’re not going anywhere.
Mr. McGuire: I want to say one word to you. Just one word.
Benjamin: Yes, sir.
Mr. McGuire: Are you listening?
Benjamin: Yes, I am.
Mr. McGuire: Plastics.
Benjamin: Exactly how do you mean?
Mr. McGuire: There's a great future in plastics. Think about it. Will you think about it?
I'm sure this is not what McGuire meant in The Graduate, but yes, indeed there was a great future in plastics. A very, very long one indeed.
In order to speed up this decomposition process, scientists from Rice University are transforming these discarded plastics into non-toxic, naturally occurring materials. They're doing this by using a newly developed technique called “flash Joule heating” to rapidly heat plastic materials to very high temperatures.
"But where's the energy coming from to-" "Shut up."
Currently, there are a few plastic recycling techniques that are widely used, with differing results.
Now, I've heard -- without a lot of confirmation -- that many plastics don't get recycled at all, even if they're labeled with recycle symbols, sorted carefully into categories, left in a recycling container instead of the trash can (rubbish bin for my Brit friends), and picked up by a green truck with a great big RECYCLE logo on it. This is, I've heard, because plastics are generally difficult to recycle. Aluminum and other metals? Easy; melt it down and you get... aluminum or whatever. Glass? Also easy. Melt it down and you get glass. Melt plastic down and those handy long-chain hydrocarbons break apart and you get methane, carbon dioxide, and all sorts of fun-to-pronounce chemicals.
Even more shockingly, each plastic bag is used for, on average, less than one hour.
Depends on what you mean by "use," and what you mean by "bag." Why, I have plastic bags in my freezer that have been in use since 1996. I suppose one of these days I should remove the 25-year-old meat, but I can't be arsed.
Okay, that's hyperbole. Still, it might be time to go through the freezer.
In contrast, the “flash Joule heating” method turns plastic into graphene, which is highly recyclable and very stable. Graphene itself is incredibly strong and stretchy – 200 times stronger than steel. Graphene is a single layer form of graphite, a naturally occurring carbon-based mineral that is commonly found as pencil lead.
What the hell, MassiveSci? I expect better than this from you. Graphite (as well as graphene) is carbon-based in the same sense that the clear liquid coming out of your faucet is "water-based." That is, it's pure carbon. Like diamond, only generally not mined by 8-year-old slaves.
By directly generating graphene from plastic waste, it is possible to reduce its production cost.
Again, graphene is carbon. Plastics are based on long-chain hydrocarbons. I don't see very much in the article about what happens to all the other little atoms. "Some hydrogen and hydrocarbons." Okay, then what?
Generating graphene directly from plastic could disrupt the graphite supply chain, thereby decreasing mining activity and reducing pollution caused by mining.
In the meantime, you can help fight plastic waste by making one small but significant change in your life: bring your own reusable fabric bag to the store with you.
You know, I'm old enough to remember that groceries used to be packed in brown paper bags exclusively. Paper, of course, is both fairly easy to recycle or, if you can't be arsed to recycle it, biodegradable. Then the grocery stores started stocking those flimsy plastic bags that (con) couldn't carry much more than three tomatoes without breaking but (pro) had handles. Plastic bags were better for the stores because they were cheaper and lighter, and you could store like 1000 of them where you could only store like 100 paper bags (note: those numbers are wild-ass guesses, but the idea is valid). There was a transition period you might recall when shoppers were offered a choice. "Paper or plastic?" became even more ubiquitous for a few years than "Credit or debit?" and "You want fries with that?"
Finally they trained consumers to not want paper bags anymore, and they quit asking, instead just dumping the groceries into plastic bags. This end of the "Paper or plastic?" era somehow coincided with the first scaremongering about OMG MICROPLASTICS WTF NONBIODEGRADABLE IT'S EVERYWHERE. Hell, some areas flat-out banned plastic bags, and didn't go back to paper. And bringing my own bags to the grocery store is Not Going To Happen. (I don't go to the store anymore anyway, instead opting for delivery -- which usually comes in plastic bags. I don't give a shit.)
I'm not saying there isn't a problem with microplastics. I'm saying, like, why the fearmongering? Unless it's a massive campaign financed by pissed-off paper bag manufacturers, or the makers of reusable cloth grocery bags. Yeah... that's my working theory. It's like how I've always thought that PETA is actually run by the soybean industry in an effort to sell more tofu.
Anyway. Point is, this is pretty clever technology and I hope it actually becomes a thing.
|As I refuse to acknowledge what today is, I'll just post a random link.
Trying to Stay Optimistic Is Doing More Harm Than Good
No more FONO. How to recognize and break the habit of “toxic positivity.”
Fortunately, "trying to stay optimistic" isn't a trap I fall into. So, what's FONO? I guess I have to give them clicks to find out!
When her patient started talking about sick notes, neuropsychologist Judy Ho decided to intervene. Her client, a wildly successful entrepreneur, was rich, happily married, and well-regarded by his peers.
So, not someone anybody else is going to feel sorry for no matter what happens. Only in Bloomberg.
The problem was the days when he felt depressed and run-down but unable to admit it. The only way to address it, he felt, was to regress, like a schoolboy, and look for permission from a doctor to regroup. “He knew he wasn’t sick, but he’d go in and make something up,” she says, “just so he could take a day off and be OK with himself.”
Free business idea for you: sell miniature violins.
She recognized he was suffering from a surging contemporary malaise. “He always had to demonstrate his worth to people,” she continues. “He was thinking, ‘I must exude this image of success and a happy life that everybody has come to know about me, and I don’t want to ever change that image.’ That’s toxic positivity.”
Oh. Damn. I was hoping "toxic positivity" referred to the trend of people to exhort other people to always look on the bright side of life. Those people make me grumpy. I mean, good for them, but bah humbug.
This is bad enough, though, I suppose. Kind of like the social media hounds who are always portraying themselves as shiny and happy all the time.
Call it FONO, or fear of a negative outlook. Also known as “dismissive positivity,” it’s expressed as an overbearing cheerfulness no matter how bad things are, a pep that denies emotional oxygen to anything but a rictus grin.
Ah, there's the definition. Nah... too close to FOMO for my taste. Used to be referred to as Pollyanna syndrome or something, after a character who was always, always cheerful no matter what happened.
You see it on Instagram, where the affective filter is always upbeat, usually followed by the hashtag #blessed.
No, I don't, because I avoid Instagram even more strenuously than I avoid Bookface and Twatter. And hashtags are a plague upon the land.
You hear it from the SoulCycle instructor exhorting every rider to swaggeringly sweat through the pain.
No, I don't, because... well, you know.
It’s available from the newly anointed chief creative officer for Vital Proteins, actress Jennifer Aniston, who claims that renewal isn’t only a result of its powders: Instead, “it’s within us.”
So what do we need of your patent nostrums?
You might even recognize it in the boss who insists that colleagues start every Zoom meeting by sharing a piece of good news to help keep moods buoyant amid the gloom.
Now, look, maybe this is going a bit too far. I don't think there's anything wrong with acknowledging that there's some good news during times of crisis. It's not just that every silver lining has a cloud, it's that there may be benefit, when you're bombarded by bad news all the time, to try to find one positive thing in your life. The problem comes when you focus only on the positive things, just as it would be a problem to focus only on the negative things.
In my own opinion, anyway.
For example: I hate February. I mean, I utterly despise this fucking month with a passion fit to burn stars. It's cold, it contains stupid observances like Groundhog Day, President's Day, and... well... that which shall not be mentioned but falls on today. It hosts a stupid sportsball game with stupid commercials. My birthday occurs in this hated month, which is a constant reminder that I have fewer of those in front of me than I have behind me. It's dreary and confining and depressing. On the bright side, most years it's only 28 days. As of today, it's half over. See? I can find a light in the darkness when I try.
Think of this mindset as one that responds to all human anxiety, or sadness, with uncompromising optimism. It can be found in sentences that start with those negating words “At least,” which are followed by a suggestion that however bad you’re feeling, at least you’ve got plenty else that should offset and outweigh it.
If I trust you enough to bitch about something, don't ever start a sentence with "At least." I will punch you, and then I'll feel bad about it, but at least I'll have had the satisfaction of punching you.
Ordinary Americans, casting around for inspiration and reassurance, became prime targets for these peddlers of perkiness.
"Peddlers of perkiness" would be an awesome name for a retro swing punk band.
Such magical thinking has paralleled the rise of professionals hired to be a personal cheerleader. Membership of the International Coaching Federation, a credentialing and training program accrediting body, has soared from almost 4,700 worldwide in 2001 to more than 41,000 today.
Wait. This exists? This is a thing? Forget what I said earlier. There is no positive side to this.
Successful people are the most likely to fall prey to this way of thinking, says Naomi Torres-Mackie.
I think I might be successful investing in tiny-violin futures.
For the current generation, the origins of this emotional cure-all lie in the 1990s, when then-president of the American Psychological Association, Martin Seligman, posited that pessimism is a learned behavior. Therefore it both could and should be avoided.
Okay, but I'm pretty sure faux positivity started long, long before this asshole. Let's see... oh, yeah, here. "The modern positive thinking movement started in the late 1800s with a watchmaker named Phineas Quimby. Quimby became fascinated with the practice of mesmerism (a.k.a. hypnotism). He became the apprentice of a famous French mesmerist and traveled New England learning the trade. Once he could hypnotize on his own, he opened a practice and started having some success alleviating the symptoms of psychosomatic disorders. This led him to believe that the body was a reflection of the mind and that all illness was caused by false beliefs."
Ah, yes, one of the oldest fallacies in the book. "An herb cured my illness, so ALL illness must be curable by herbs!"
My apologies to anyone who actually clicks on that second link. I started feeling nauseated just glancing at it. Consequently, I would say their assertion about Quimby is, at the very least, suspect. But my point is that people have been pushing bullshit positivity since LONG before the 1990s.
That observation snowballed into bestsellers such as The Secret, first published in 2006 by Australian TV executive-turned-author Rhonda Byrne. It was popularized after Oprah Winfrey championed its ethos. That breakout bunkum bible was essentially built on claims that the power of positive thinking would provide whatever you want, be it a baby or a Mercedes-Benz.
"Breakout bunkum bible?" I'm thinking bardcore / metal / steampunk fusion band.
But yes, "The Secret" is, with the possible exception of Twilight, the stupidest piece of trash published in this century. And yet, here I am, unpublished, so maybe it wasn't such a stupid move on the author's part, was it?
So, How to Cope?
This is the central question behind about 90% of the shit I see online these days.
Ho, the neuropsychologist, has an unexpected suggestion to help calibrate a Pollyanna perspective: a session watching Disney-Pixar’s Inside Out, which animates and dramatizes human emotions.
This therapy session has been brought to you by Disney. Disney: We bring good things wherever we go! Be sure to subscribe to our streaming service, and remember, our theme parks are still open and waiting for your smiling (but masked and socially distant) faces! Remember Disney for all of your entertainment needs. Now featuring Star Wars, Marvel Comics, National Geographic, and the best lawyers in the business. Disney!
It’s no surprise that Byrne would also return now. Her sequel, The Greatest Secret, came out in November. Read it, the blurbs tout, and you can remove all negativity—as if doing so should be a central goal in life. (More than 80% of Amazon.com Inc.’s user reviews gave it five stars. It would be too negative to be negative, it seems.)
I so very badly want to give it a one-star review on general principles, but I have too much of a sense of honor to review something I haven't read, way too much self-respect to actually read it, and no intention of actually paying money for that dreck.
So there's my rant for the day, which I'll end by reiterating: I hate February.
|Sometimes, I just like to have fun with these things.
Our Brains Tell Stories So We Can Live
Without inner narratives we would be lost in a chaotic world.
And sometimes those inner narratives lead to nutjob whackaloon conspiracy "theories."
We are all storytellers; we make sense out of the world by telling stories.
Occasionally, that story is "Aliens did it."
And science is a great source of stories. Not so, you might argue. Science is an objective collection and interpretation of data. I completely agree. At the level of the study of purely physical phenomena, science is the only reliable method for establishing the facts of the world.
Facts which can then be twisted into science fiction. Hey, I'm not knocking science or science fiction, but it's important to keep the two separate in one's head.
There are no naked facts that completely explain why animals sacrifice themselves for the good of their kin, why we fall in love, the meaning and purpose of existence, or why we kill each other.
Not "completely," but we have pretty good ideas about some of these. That last one, for example. It's because you're playing what you think is "music" too loudly.
For all of the sophisticated methodologies in science, we have not moved beyond the story as the primary way that we make sense of our lives.
To get serious for a moment, I happen to agree with this. But then, I fancy myself a writer.
Let’s begin with an utterly simple example of a story, offered by E. M. Forster in his classic book on writing, Aspects of the Novel: “The king died and then the queen died.” It is nearly impossible to read this juxtaposition of events without wondering why the queen died.
It's because she said, "Let them eat cake."
Okay, look, Marie Antoinette probably never uttered those words (not even the French equivalent). Supporting this, there was a story written by Jean-Jacques Rousseau long before the French Revolution where he had a princess utter those words (except it was "brioche," not "cake," but whatever). Here. (That wiki page also has a very helpful photograph of a brioche, and now I'm hungry.)
That's a story we tell ourselves. Some stories are based on fact. That one appears to be the other kind.
Once a relationship has been suggested, we feel obliged to come up with an explanation. This makes us turn to what we know, to our storehouse of facts. It is general knowledge that a spouse can die of grief. Did the queen then die of heartbreak? This possibility draws on the science of human behavior, which competes with other, more traditional narratives. A high school student who has been studying Hamlet, for instance, might read the story as a microsynopsis of the play.
Nah, I'm sticking with the French Revolution. Vive la guillotine!
The pleasurable feeling that our explanation is the right one—ranging from a modest sense of familiarity to the powerful and sublime “a-ha!”—is meted out by the same reward system in the brain integral to drug, alcohol, and gambling addictions.
Which goes a long way toward explaining why so many writers have drug, alcohol, and/or gambling addictions.
The article goes on to discuss the science behind this. I couldn't think of any jokes about it, but it's a good read.
An efficient pattern recognition of a lion makes perfect evolutionary sense. If you see a large feline shape moving in some nearby brush, it is unwise to wait until you see the yellows of the lion’s eyes before starting to run up the nearest tree.
I'm too old to run up trees, and besides, who runs up a tree? I'd want to pet the kitty. It may be the last thing I ever do, but it'd be worth it.
Science is in the business of making up stories called hypotheses and testing them, then trying its best to make up better ones.
That's... actually a really good synopsis of science.
But there is also a problem. We can get our dopamine reward, and walk away with a story in hand, before science has finished testing it. This problem is exacerbated by the fact that the brain, hungry for its pattern-matching dopamine reward, overlooks contradictory or conflicting information whenever possible.
Hence why I'm always rambling on about confirmation bias in here. There are other biases and fallacies, of course, but that's the one I know I'm inclined toward.
Because we are compelled to make stories, we are often compelled to take incomplete stories and run with them. With a half-story from science in our minds, we earn a dopamine “reward” every time it helps us understand something in our world—even if that explanation is incomplete or wrong.
And now they've just explained 90% of the internet.
Good science is a combination of meticulously obtained and analyzed data, a restriction of the conclusions to those interpretations that are explicitly reflected in the data, and an honest and humble recognition of the limits of what this data can say about the world.
Nowhere is this illustrated better than in the insistence of certain authors to ascribe mystical properties to quantum phenomena.
The article has a lot more than I'm quoting here, and I think it's important to read. It shows, at least in part, why I'm sometimes in here ragging on science reporting, while still praising (most of) the science on which it reports.
And it helps me to remember that, believe it or not, I, too, am not always right.
Neat how this one comes up the same week I do a Fantasy newsletter editorial about bread. But it was, indeed, randomly chosen.
I haven't read the book in question. I should, though; the bullshit surrounding nutrition science is, as you probably know, an interest of mine.
Thing is, from what I've seen of it, it would be largely confirming my own beliefs about things. "An incendiary work of science journalism debunking the myths that dominate the American diet and showing readers how to stop feeling guilty and start loving their food again—sure to ignite controversy over our obsession with what it means to eat right."
But, see, part of the problem with all the different diet and exercise and other health publications, programs, whatever out there is that they follow the standard marketing playbook: 1) Convince people that they have a lack and will not be happy until that lack is fulfilled; 2) sell the thing that will fill the hole you've just created in their psyche.
I'm no expert, but as far as I've been able to tell, that is the essence of advertising. The process itself is independent of whether the product you're selling is genuinely a good thing, or utter bullshit. And the blurb I quoted above (found via Google when I searched for the book) follows the script, also throwing in the "controversy" angle, which makes people want to see what the fuss is supposedly all about.
Still, I want to believe the book is worthwhile, because I long ago grew weary of the endless succession of diet and exercise fads, each marketed to a neurotic public, each in turn fading into obscurity as the new fads roll out. Different foods are by turns demonized or extolled, until it's impossible to tell what one "should" really be doing to maybe eke out another year or two of existence.
This is by design, as it keeps the publishing industry in business.
As for gluten itself, the whole "gluten-free" fad (which finally shows signs of winding down) misled people in a big way. It didn't help that "gluten" isn't a very pretty word, so it was easy to manipulate people into believing that it's Bad For You, in much the same way as it was easy to manipulate people into thinking that something labeled "All-Natural" is Good For You. (Incidentally, gluten, tobacco, and poison ivy are all-natural. Cognitive dissonance also sells product.) I'd even heard of people demanding gluten-free foods simply because they thought it was some kind of additive, and additives are, of course, Always Bad For You. (Anything can be an additive.)
One good thing has come from this bullshit. Since less than 1% of the population has legitimate problems with eating foods containing gluten, there was never much incentive for companies to create products that people with celiac disease could eat, which limited their diet options to food that is legitimately no fun at all. But then a whole lot of people convinced themselves they had gluten sensitivities, companies marketed to them, and suddenly actual gluten-intolerant people had more options. So that worked out well for them.
Anyway. Like I said, I haven't read this book so I have no idea if what I've just said supports or contradicts anything in it. It just gave me a chance to once again rant about nutrition science, and the reporting and marketing thereof.
|Well, the random number gods have frowned upon me and revealed unto me yet another Atlantic article. I suppose I could have ignored the result and picked something else, but I've got a system. Besides, at least it's not on the same topic.
It is possible that you might hit a paywall with this one. There are ways around it if you care.
One Legacy of the Pandemic May Be Less Judgment of the Child-Free
The coronavirus could change lingering cultural assumptions about what makes for a full and happy life.
While the article is six months old, I don't think it's outdated yet.
My friends were getting honest about how hard it is to raise children right now.
People have, in my experience, always been honest about how hard it is to raise children. But they almost always end the discussion with some variation of "But it's worth it." Reading between the lines here, I'm wondering if the author is responding to the absence of the "But it's worth it." But that may be my own bias talking.
I also read it as an indirect plea to not take my child-free privileges for granted.
Life without children can be easy sometimes. But it's worth it.
I’ve always been ambivalent about whether I would have children, but as I entered my early 40s, I started exploring the possibility of having a child on my own. And then the pandemic happened.
I will point out here that any discussion of whether or not to become a parent is different depending on the gender of the writer. One of society's many double standards is that, in general, men who choose to avoid becoming a father aren't treated with the level of disrespect often shown to women who choose to avoid becoming a mother.
This article is written from a female point of view, and obviously my commentary isn't.
The COVID-19 crisis has revealed the brokenness of America’s institutions: police violence and the inhumane criminal-justice system, a medical system that lacks infrastructure and essential equipment, precarious employment for an in-debt population getting by month to month, the toxic effects of globalization and climate change. Add to that list middle-class parenting, long an aspirational experience, whose social protections are now showing themselves to be a bit of a charade.
Most of which I've cited in the past as reasons why I didn't want to be responsible for bringing a child into society. Not the only reasons, but reasons. It didn't take a pandemic for me to see the writing on the wall.
While the parents in my life have been openly acknowledging the challenges of parenting during the pandemic, my child-free friends have for the first time been sharing that they are relieved they don’t have children.
Again, a very different experience. I hear some variant of "I'm so glad I never had kids" fairly often. Incidentally, I appreciate the author's use of the adjective "child-free"; many would use "childless," which implies a lack or an emptiness.
Hm... I feel like I should note that I'm not trying to rag on anyone's choices here. I understand that being a parent is very fulfilling for some, and I respect their choices. I would only ask that they do the same for the child-free.
An essay series in The Guardian, called “Childfree,” explores that decision, with reasoning that runs the gamut: not enough money, focusing on your own life, the climate crisis, being fine with being alone.
Well. The Guardian is one of my usual sources, but I've missed those essays until now. I made it a hyperlink here mainly so I can be reminded to go look at it later. I can't be arsed right now.
In response to a harmless tweet from a parent about how “non-parents have no idea how hard it’s been” to parent during the pandemic, thousands of people chimed in with some version of: Yes, we do—that’s why we don’t have kids.
I know what a terror I was growing up, and I wouldn't subject anyone to that -- least of all ME.
This is hardly the first moment that the idea of marriage and a baby as the primary path for women has come under scrutiny.
Again, the female point of view. Which I'm by no means trying to downplay, but until we work out issues surrounding cloning, it takes two, directly or indirectly, to procreate. I feel the need to add that my own point of view comes from someone who has never had children (please keep "...that you know of" jokes to yourself; they're tiresome, false, and sexist), not like some guys I know who are childfree by way of abrogating their responsibilities. I don't think much of those guys.
For heterosexual parents, the bulk of the child care falls on the mother. The global health crisis has worsened this sexist division of labor, and the long-term effects could damage women’s careers and, despite the best intentions, become a new norm.
Which is another societal double standard that really should be addressed.
For people who were planning to have a child, those plans might now be on hold; the process of seeking fertility treatments, for example, has gotten more complicated as access to medical procedures for elective reasons has been limited.
In a perfect world, as I see it, everyone who wants to have children would, while everyone who doesn't want to wouldn't. As this is not a perfect world, you have people desperately wanting to reproduce who can't, and people who desperately don't getting stuck. Also if I were inclined to pass judgment on anyone, it wouldn't be people who choose to have kids or people who choose not to, but the ones who blow all their resources on fertility treatments when they could adopt.
Taking away a lot of the stigma around adoption might serve to alleviate this disparity. Maybe.
Childless people have long been chastised for being selfish or for not fulfilling a role their body seemingly bound them to.
Here the author reverts to the other word. And selfish? What's more selfish than insisting on bringing children into a deteriorating world? Stipulating that everyone does everything for selfish reasons, if you want to have kids and be less "selfish," then adopt. Also, biology isn't destiny. My body also evolved for chasing prey across the savanna, but I can't be arsed to do that, either.
As a child-free woman in my 40s, I’ve been tasked with taking care of my parents.
Because of course one of the main reasons to have kids is to give you free convalescent care later in life.
One legacy of the pandemic may be less judgment of the child-free.
I won't be holding my breath.
Anyway, it's a point of view, and I thought it was interesting... but again, I admit to some confirmation bias here.
|This article is a lot of words for a simple answer: telemarketers.
Why No One Answers Their Phone Anymore
Telephone culture is disappearing.
But still not as many words as if it were a New Yorker article.
See, if someone had been walking their dog in Central Park and a guy jumped out of the bushes and bit the dog, it would be reported in different ways by three different New York publications.
The New York Times:
Man Bites Dog
A man leapt from the bushes and bit the flank of a golden retriever belonging to Jamie Sands of West 72nd Street yesterday, sources say.
The New York Post:
Man Bites Dog
It finally happened!
This reporter has been waiting for this moment since his first year at J-school!
The New Yorker:
Incident in Central Park
A cool breeze brushes the tops of the trees as Jamie Sands, a 42-year-old mother of three human children and one golden retriever named Spike, strolls along one of the park's many carefully maintained paved paths. "I walk Spike here nearly every day," she says, sighing and staring off into the distance, eyes seemingly focused on one of Manhattan's new pencil towers, its slender spire seeming to scrape the sky, to coin a phrase. "I never thought something like this could possibly happen," says Sands, her gaze shifting to her peripatetic canine companion. The dog pulls her along a winding trail through manicured lawns, stately trees, and trimmed shrubberies.
It was from one of those shrubberies, with its dense concealing foliage, that the unthinkable occurred.
Sands moved to the Upper West Side in 2006, upgrading her apartment from the 300-square-foot flat in the hipster enclave of Greenwich Village. "It was the third child that did it," she says, leashing Spike back from his investigation of a particularly brave squirrel. Squirrels have claimed Central Park since the moment it was founded, finding homes in treetops and terrorizing passers-by...
(there follows 360 column-inches of meandering, descriptive, wistful, breathless, pointless, post-modernist writing, with the description of the actual dog-biting incident buried approximately 4/5 of the way down.)
But The Atlantic? The Atlantic wouldn't touch a "Man Bites Dog" story. No, too pedestrian.
They apparently will, however, stretch out a "Why we don't answer the phone anymore" article to the point of absurdity.
The telephone swept into Americans’ lives in the first decades of the 20th century. At first, no one knew exactly how to telephone.
Verbing weirds language. Also, of course no one knew how to do it at first. This is true about any new invention. "At first, no one knew how to fly an airplane." "At first, no one knew how to attach wheels to a cart." "At first, no one knew exactly how to cultivate crops." I mean, come ON.
People built a culture around the phone that worked.
Um... I'm not sure it actually worked.
In the moment when a phone rang, there was an imperative. One had to pick up the phone. This thinking permeated the culture from adults to children. In a Hello Kitty segment designed to teach kids how the phone worked, Hello Kitty is playing when the phone starts to ring. “It’s the phone. Yay!” she says. “Mama! Mama! The telephone is ringing. Hurry! They are gonna hang up.”
That right there would have been enough to put me off phones for life.
Not picking up the phone would be like someone knocking at your door and you standing behind it not answering. It was, at the very least, rude, and quite possibly sneaky or creepy or something.
Well, too bad. I'm busy. That phone is there for my benefit, not yours.
I attach no special value to it. There’s no need to return to the pure state of 1980s telephonic culture.
I'll give the author this much: there's a simple discussion about the way things used to be, without the sense of nostalgia that so often accompanies such articles. "I remember sitting at my drawing-table, endlessly tracing cursive letters until my hand cramped. Kids these days just don't understand how beautiful handwriting can be. Those days were simpler, life more elegant..."
There are many reasons for the slow erosion of this commons. The most important aspect is structural: There are simply more communication options.
No, the most important aspect is every time I'd pick up, someone would be trying to sell me insurance, or claim they're from the IRS and I'm boned, or warn me that my car's warranty was about to expire.
You’ve got your Twitter, your Facebook, your work Slack, your email, FaceTimes incoming from family members. So many little dings have begun to make the rings obsolete.
Oh, how I long for a future without Twitter or Facebook.
Finally the author gets to the main point. This is called "burying the lede," and it's despicable.
There are unsolicited telemarketing calls. There are straight-up robocalls that merely deliver recorded messages. There are the cyborg telemarketers, who sit in call centers playing prerecorded bits of audio to simulate a conversation. There are the spam phone calls, whose sole purpose seems to be verifying that your phone number is real and working.
Incidentally, this article is nearly three years old, so the data in it is from 2018. I doubt anything's significantly changed since then. There was a period in late 2019 when I remember getting, no shit, no exaggeration, two to three dozen spam calls a day. Blocking the numbers doesn't help, because they're spoofing. Answering only makes them call more. It got to the point where I simply turned my ringer off. I missed a couple of calls from friends doing that, but it was worth it.
And then when I'd check my voicemail, I'd find messages in Mandarin. MANDARIN. Again, I'm not joking. I'm just surprised none of them were in Russian.
The spam has diminished to two or three a day now, so I keep the ringer on. Still, I want it to go to zero.
Anyway, that's my rant for the day.
|I always find it amusing, in a dark-humor sort of way, how often groups of people flee religious persecution so that they can install their own.
Few examples are more obvious than the Puritan settlers of Massachusetts. And while the land they settled eventually came around to the ideals of freedom of religion and expression -- well, at least up to a certain point -- their insistence on a narrow view of morality echoes to this day like the reverberating sound of a turd hitting the slurry at the bottom of an outhouse.
Apparently, Thomas Morton didn’t get the memo. The English businessman arrived in Massachusetts in 1624 with the Puritans, but he wasn’t exactly on board with the strict, insular, and pious society they had hoped to build for themselves.
Why would such a person brave an uncertain ocean journey with such horrible people? Why, money, of course.
The Puritans’ move across the pond was motivated by both religion and commerce, but Morton was there only for the latter reason, as one of the owners of the Wollaston Company.
His business partner—slave-owning Richard Wollaston—moved south to Virginia to expand the company’s business...
I'd suggest cancelling this guy, but it seems history already has.
...but Morton was already deeply attached to the land, in a way his more religious neighbors likely couldn’t understand. “He was extremely responsive to the natural world and had very friendly relations with the Indians,” says Heath, while “the Puritans took the opposite stance: that the natural world was a howling wilderness, and the Indians were wild men that needed to be suppressed.”
It's funny because the researcher's name is Heath, which is a word for an area in a state of nature, which gave us the word "heathen."
After Wollaston left, Morton enlisted the help of some brave recruits—both English and Native—to establish the breakoff settlement of Ma-Re Mount, also known as Merrymount, preserved today in the Quincy neighborhood and park of the same name.
Incidentally, if you're not aware, Quincy is pronounced like "quin'-zee." Just in case you find yourself in the Boston area one day, don't get tripped up by this shibboleth.
The Puritan authorities didn’t see Merrymount as a free-wheeling annoyance; they saw an existential threat.
Of course they did. The thing most frightening to a Puritan is the horrible idea that someone, somewhere, is having a good time.
With Plymouth’s monopoly dissolved and its perceived enemies armed, Morton had perhaps done more than anyone else to undermine the Puritan project in Massachusetts.
I should absolutely build a shrine to this guy.
Worse yet, in the words of Plymouth’s governor William Bradford, Morton condoned “dancing and frisking together” with the Native Americans—activities that were banned even without Native American participation.
Snort. "Frisking." Snort. Which reminds me of a joke. Why aren't Puritan kids allowed to make out? Because it might lead to dancing.
There could be no greater symbol of such misrule than Morton’s maypole. Reaching 80 feet into the air, the structure conjured all the vile, virile vices of Merry England that the Puritans had hoped to leave behind.
I'm absolutely, totally stealing "vile, virile vices." Oh, wait, I already did, for this entry's title.
The article goes on to talk about the book in the title of the piece, and I feel like I really should own a facsimile of it (I'm not quite impressed, or rich, enough to try to track down a first edition).
After publishing the book, Morton braved a venture back to his beloved Massachusetts, only to be turned right back around upon arrival. He tried to cross the Atlantic once again in 1643, and was this time exiled to Maine, where he died.
There are probably worse fates, but I can't think of any offhand.
So, a short read, and worth it -- and the source, Atlas Obscura, is a wonderful fount of information. The only problem is they keep posting places to visit in Belgium, and I don't know if I'll be able to go to them all.
Back in college, I had occasion to visit the Drama Department every once in a while, mostly in my capacity as newspaper photographer. This department was in what was then a relatively new building, with more contemporary fixtures than in many of the other University buildings. One such fixture was the toilet paper dispensers in the restroom stalls.
Now, there wasn't a lot of graffiti at the school in general. Oh, there was some, to be sure; the scribblings in the Philosophy Department, much older than Drama, were particularly incisive, and I had great laughs at the occasional math pun in the Engineering department (which is where I took most of my shits). But in this particular case, someone had seen that the toilet paper dispensers had a lever with words on it that read: "PRESS DOWN FOR NEW ROLL." Predictably, but still amusingly because this was, in fact, the Drama School, this person had altered the last L to become an E.
Anyone who's been following along should know that I like to learn about language, and play with it. "Roll" and "role," as you might imagine, are... absolutely of the same origin.
Surprised? I was, when I discovered this.
Turns out that, at least according to Dictionary.com, the word "role" split off from "roll" somewhere in Old French, where an actor's part was referred to, in what I suppose is a case of metonymy, from the roll of paper upon which the actor's lines were written.
I absolutely love this sort of thing; it gives me insight into the way peoples' minds work through the lens of language development.
I should also note that it appears that the French word "roue," which translates as "wheel," comes from the same source as well. Not entirely sure of this one, though.
The "role" origin also seems to tie into one of the other definitions of "roll," as in "roll call." Oddly enough, though, the word "scroll" doesn't seem to be related, at least not as far back as they can trace it -- though it should be, describing as it does a roll of paper, parchment, vellum, whatever. No, "scroll" comes from, of all things, "escrow," which itself was an alteration of an older word "scrow," which apparently meant... roll.
Essentially, "roll" is traced from Latin and "scroll" is Germanic in origin -- though both language families, naturally, stemmed from the same source, even further back: Proto-Indo-European, or PIE. This is, of course, what we call it now; no one seems to know what this original language was called or even, with any level of certainty, what its words or structure were.
A while back, I did a blog entry about the invention of the wheel. After much searching, I finally found it. Here: "As the Turn Worlds (or whatever)". And in that entry, I refer to another entry from a couple weeks prior, here: "Lox Pie". Now, based on what I found out in writing those entries, it seems that people smarter than I am have figured out where the PIE speakers probably originated, and that the reason that particular language spread so far and wide was because those bastards invented the wheel, slapped those suckers on a cart, and hooked the resulting contraption to horses. This made the people in that culture incredibly mobile for the time. I mean, anyone with a horse would have been more mobile (Genghis Khan comes to mind), but if you want to take your stuff with you, you need a cart, too. Preferably one behind the horse, rather than in front of it.
So, essentially, the PIEs rolled all over Europe and parts of Asia, bringing their language with them -- a language which then fractured, merged with other languages in different areas, adapted to its speakers' varying needs and environment, and then -- thousands of years later -- maybe started coming together again, in a vastly different form, as English borrows heavily from so many different other languages.
All of which is to say that there are more than half a dozen songs called "Let It Roll," and I haven't even heard some of them, and those that I have, I don't particularly like, so this entry's about roll and not rock.
|Kids these days with their... um... kid stuff.
The one constant across all of human history is the older generations freaking out over something that the younger generation is doing.
They [Young People] have exalted notions, because they have not been humbled by life or learned its necessary limitations; moreover, their hopeful disposition makes them think themselves equal to great things -- and that means having exalted notions. They would always rather do noble deeds than useful ones: Their lives are regulated more by moral feeling than by reasoning -- all their mistakes are in the direction of doing things excessively and vehemently. They overdo everything -- they love too much, hate too much, and the same with everything else.
More complaints about "kids these days" from millennia ago can be found here.
Anyway, the Cracked article linked first above.
Once upon a time, there was a world before Fortnite, COD, and even Angry Birds when most people needed to visit arcades and other public places to get their video game fix.
I was one of those kids in 1982.
Yet instead of enjoying their time outside of the house, socializing at arcades as they gamed with their friends, basically everyone and their mom thought that video games were actively destroying their brains, sparking mass hysteria among parents.
And before that, it was hippie stuff, and before that it was jazz, and before that it was... I don't know... writing, maybe. Or revolutions against colonial oppressors.
"GRONK! FLASH! ZAP! Video Games are Blitzing the World!" read a cover of Time Magazine in 1982.
On the other hand, maybe it did destroy my brain; I have a distinct memory of Pac-Man being Time Magazine's "Man of the Year" (before they finally stopped being so fucking sexist about these things), but in researching this blog entry, it seems that the Pac-Man "Man of the Year" cover was actually a spoof done by Mad Magazine. Mad, of course, was Cracked's main rival at the time, but more importantly, it was published by the same people who published Time.
Everything is connected somehow.
Time did, however, once select "the personal computer" or something similar as "Man of the Year," notably before they changed it to "Person of the Year," thus illustrating that to people back then (People was a popular magazine then too), the computer was more important than the wimmins.
In an attempt to curb this "electronic blight," with 4,000 to 5,000 consoles popping up in arcades, pizza parlors, grocery stores, and drugstores, city officials passed regulatory laws, allowing video games only in commercial or industrial areas. Because nothing says good, wholesome fun like a bunch of unsupervised children heading down to their local factory district to play some Pac-Man, right?
Also, I don't remember any of this. I got my video game fix in arcades and at the local 7-Eleven.
"Officials say they are responding to complaints from parents that children have skipped school or stolen money to play the games and made a nuisance of themselves," the anchor said over footage of kids seemingly having a great time playing games.
Said 7-Eleven was located right across the road from my high school. I'd leave extra early in the morning to stick stolen quarters (okay, they weren't really stolen, but it's not like I had a job at the time) into Ms. Pac-Man and/or Galaga prior to trudging over to prisonschool.
I got really, really good at Galaga, by the way. When the first Avengers movie gave it a nod, I might have cheered right there in the movie theater.
I don't recall that I ever skipped school just to play video games. But I can't say I never played video games when I skipped school. It's just that the owners of that particular convenience store were narcs, and if a kid was there during school hours, we'd get told upon.
Incidentally, I had occasion to pass by that high school fairly recently, because it was on the way to a microbrewery I wanted to try -- I think this was in November of 2019, because it was definitely in the Before Time, but still recent -- and behold, there is still a 7-Eleven across the (now four-lane) road from the high school. It does not, however, still house video game consoles, but the cashiers still looked like narcs. This shouldn't be surprising, since that convenience store is also next to the FBI Academy. Yes, that FBI Academy; it's right across the line from Quantico.
Point is, there have always been things that kids do that freak adults right out. This, I think, is an important part of childhood, and I hope it never changes. Because it's not in spite of these moral panics that civilization keeps right on chugging along, at least for now.
It's because of them.