Items to fit into your overhead compartment
---

I thought about summarizing this Futurism article with AI:

> **Once You Notice ChatGPT's Weird Way of Talking, You Start to See It Everywhere**
>
> It's not useful, it's slop.

I didn't do that summary, of course. I've played around a bit with the LLMs people insist on calling AI, and of course one cannot avoid it while doing a Google search these days, but I've never used their results in my writing. Graphics, now, sure, such as the blog picture. Difference is, I have absolutely no artistic talent, but I like to think I have some small ability to write.

> It's not written by humans, it's written by AI. It's not useful, it's slop. It's not hard to find, it's everywhere you look.

People love to call it "slop," but I've seen human writing just as sloppy, or even worse.

> Once you notice it, you start to see it everywhere. One teacher on Reddit even noticed that certain AI phrase structures are making the jump into spoken language.

As much as I try to avoid LLM output, like I said, it's ubiquitous these days. I even mentioned to a friend that a certain sentence structure they used reminded me of ChatGPT output, even though I was sure the sentence wasn't thus generated.

> It's a fascinating observation that makes a striking amount of AI-generated text easily identifiable. It also raises some interesting questions about how AI chatbot tech is informing the way we speak — and how certain stylistic choices, like the em-dash in this very sentence, are becoming looked down upon for resembling the output of a large language model.

There are two punctuation choices that I make, ones which you can pry from my cold, dead fingers: one is the semicolon; the other, the em-dash.

> Beyond a prolific use of em-dashes, which have quickly become a telltale sign of AI-generated text, others pointed out the abundant use of emojis, including green checkboxes and a red X.

On the other talon, I use emojis only sparingly.

> Tech companies have struggled to come up with trustworthy and effective AI detection tools, more often than not leaving educators to their own devices.

This article is from June, and I haven't heard anything about those detection tools recently. Last I heard, they weren't very trustworthy or effective, often generating false positives.

> It's gotten to the point where teachers have become incredibly wary of submitted work that sounds too polished.

So, my takeaway here is: don't be too polished. Throw in some deliberate typos, miss an obvious commma that sort of thing. As a side benefit, the teacher gets to use their red pen. They love using those red pens. Sure, you might get points taken off. But is that really worse than being accused of AIing when you didn't AI?
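Tangent: those tells are mechanical enough that you could count them yourself. Here's a toy sketch in Python; the patterns, names, and sample text are my own inventions for illustration, not anything from the article:

```python
import re

# Crude counts of a few stylistic "tells" the article associates with
# LLM output. These patterns are rough guesses for illustration only;
# a real detector would need far more than regexes, and even the real
# ones generate false positives.
TELLS = {
    "em-dash": re.compile("—"),
    "not-x-its-y": re.compile(
        r"\bnot (?:just |only )?\w[^.!?]*?, (?:but|it's|it is)\b",
        re.IGNORECASE,
    ),
    "checkbox-emoji": re.compile("[✅❌]"),
}

def tally_tells(text: str) -> dict[str, int]:
    """Count how many times each 'tell' pattern shows up in the text."""
    return {name: len(pattern.findall(text)) for name, pattern in TELLS.items()}

sample = "It's not a snack, it's a lifestyle — and that's the point. ✅ Tasty. ❌ Humble."
print(tally_tells(sample))
# {'em-dash': 1, 'not-x-its-y': 1, 'checkbox-emoji': 2}
```

Run it on this very entry and I'd probably flag myself, which is rather the point: the tells are just style, and style proves nothing.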
---

One of the worst insults I could receive when I was a kid was that my jokes were old and not funny. Actually, still is. From BBC:

Compared to the jokes, the article is fairly new: just three and a half years old.

> The phrase "the old ones are the best ones" might not always be true. But some of the oldest jokes in history are still in use today.

Even if they're not funny, they're still windows into the past: into what people found funny back then, and into cultural contexts.

> After months spent poring over medieval texts for her PhD, Martha Bayless made a surprising discovery. She was looking at some of the earliest jokes written in Latin by Catholic scholars (some in excess of 1,000 years old). Few had ever been translated into English before, yet many were still funny – and some even made her laugh out loud.

Semper ubi sub ubi?

> Shortly after, while waiting for her train, Bayless was reading a copy of Truly Tasteless Jokes 3 – a popular joke anthology from 1983. She was surprised to find, almost word for word, a joke that she had been transcribing just a day earlier.

Oh, I remember that series. While I couldn't quote a single joke from the TTJ books now, I know for certain that they helped shape me into the clown I am today.

> It struck Bayless that the joke had continued to be shared through a spoken culture of joke-telling, starting with the Latin text and culminating with her modern joke book, without needing to be written down for centuries in between.

But that was a more common means of joke (meme in the original sense) transfer, pre-internet: word of mouth, mostly kid-to-kid. Even the ones that were written down or, later, recorded with video and/or audio, were subject to censorship. Not so the underground joke economy: anything was fair game, be it sex, body functions, racism, or even worse topics. Now, we even have documentaries on what might be the foulest joke of all time.

This is good, in a way. But it does have a downside, which is: kids need to feel rebellious, and they'll find something to secretly transgress against. If it can't be sick jokes, it'll be something else.

> Bayless, now a director of folklore and public culture at the University of Oregon, has written a number of books on early comedy. She says, "the earliest jokes were dirty jokes. People couldn’t resist them."

Well, she's the one with the Piled Higher and Deeper degree, and I've no doubt that many early jokes were what we'd call "dirty jokes," but you're dealing with survivorship bias here. Like I said, jokes tend to be an oral tradition, with all the generational changes that implies. If you limit yourself to the ones that were written down (or, in the case of Sumeria, etched into clay tablets), you're not getting the full picture.

> Flatulence, for example, is funny because it shows our "uncontrollable physicality", says Anu Korhonen, a professor of cultural studies from the University of Helsinki in Finland.

I disagree. Fart jokes aren't funny. They are, in fact, the lowest form of humor. What is funny is people's obsession with fart jokes.

> Some researchers suggest that because humour brings us together it might have an evolutionary purpose.

Here we go again with evo-psych speculation. At least they wiggle out of it a bit by using "might."

> But not all rude jokes translate well across cultures. Peter McGraw, a professor of marketing and psychology at the University of Colorado Boulder, explains that cultural norms vary so widely, finding a universally funny joke is challenging.
I don't think there's a universally funny joke. It's all relative to your own culture. This goes especially for the highest form of humor: the pun. They generally only work in one language. I can't deny that part of my motivation for learning French was to be able to pun in more than one language. You might call it committing merde-er.

> Who knows what audiences thousands of years in the future would think if they unearthed videos of contemporary comedians.

Probably the same thing I do: 95% of it isn't funny, but the other 5% makes everything worth it.
---

An important message from Self from two years ago:

> **What to Do When You’re Super Cranky and Hate Everyone**
>
> When the group chat ping instantly makes you irrationally irritated, it’s time to take a beat.

Wait, I thought that was the default state for everyone. No? Just me? Fine. Go away.

> Once in a while I wake up inexplicably cranky. There’s nothing specifically wrong, per se. It’s just that, for whatever reason, everyone around me gets on my nerves.

They used to just call that "waking up on the wrong side of the bed."

> My husband will come into our home office and distract me at the exact moment I start writing effortlessly after struggling with writer’s block.

What? That's his JOB.

> My mom will call with some gossip about a person from high school I haven’t thought about (by choice) in 18 years.

What is this supernatural ability to choose not to think about someone or something?

> A friend will send me 10 photos of their baby that I just don’t feel like looking at (I’m terrible).

Nope, you're normal. Babies are ugly to everyone except their parents, and sometimes, I think even they are lying.

> My dog, it seems, is the only creature I can tolerate being around, and that’s because he’s perfect.

And yet, if there were a human who acted exactly like the dog: finding some way to wag their tail, for instance, barking at nothing, begging for walks, whining, licking your face, etc., you'd be enraged at them, too.

> Adjoa Smalls-Mantey, MD, a psychiatrist based in New York City, tells SELF that there are lots of reasons why you might suddenly feel so irritated with the people around you—sleep deprivation, for example, can put you on edge, as can feeling stressed out about work or school.

As can living and/or working in New York City.

> All of these things can influence the amount of cortisol—the primary stress hormone—in your body, Dr. Smalls-Mantey says, and turn you into a real-life Scrooge.

Oh, yeah. Cortisol. The latest buzzword in pop biopsych.

> When I’m peeved, the last thing I want to do is reflect on and accept how fundamentally frazzled I am, but this can actually help you perk up a bit, according to Tom McDonagh, PsyD, a clinical psychologist at Good Therapy SF in San Francisco.

If the quoted person didn't go on to compare this favorably to mindfulness, I might be inclined to agree.

> When I’m having one of these days, I’m miffed before anyone actually does anything to annoy me: I’ll see a text pop up on my phone and be like, Ugh, this is going to suck! without even seeing what the message is about. Rather than assuming your interactions with people are going to be dreadful, try to flip your POV and consider that they might be tolerable (who knows, they could even be positive!), Dr. Smalls-Mantey suggests.

Or, and hear me out here, just accept that it's going to suck. That way, either it doesn't, and you're pleasantly surprised, or it does, and you're pleasantly smug because you were right.

> Another way to get through this testy time: Come up with a game plan that’ll make your hangs less irksome, Dr. Smalls-Mantey recommends.

Apparently, they're calling casual social interactions "hangs" now. This irks me.

> Dr. McDonagh says that irritability, in general, is a result of shifting into fight-or-flight mode—the stress response that occurs when your body perceives some sort of danger or threat. As a result, he says, certain hormones, like cortisol and adrenaline, flood your system, and that can temporarily make you tense. To cope, he suggests taking some deep breaths.
Bullshit evo-psych "fight-or-flight" speculation. But hey, if the controlled breathing works, then it works. Unless you're in an area currently blanketed by pollutants and/or wildfire smoke, it probably can't hurt.

> Sleepiness is one of the top reasons people get cranky with others, studies show. “If you’re tired or exhausted, you have to stop and rest,” Dr. Smalls-Mantey says.

This would be at the very top of my personal list. But then, I'm not in a situation where I have to interact all that much with other people, and can sleep more or less when I choose.

Now, as the disclaimer at the bottom of the linked article states, this isn't medical advice. Personally, I accept that I'll be in a bad mood from time to time, and I call it another opportunity to convince people to leave me alone.

There's a doormat I need to obtain somehow. I saw it in the Marvel series Ironheart. It reads: Live. Laugh. Leave.
---

An important piece of music history here, from Smithsonian.

> **How Bruce Springsteen Created the ‘Greatest Rock Album Ever’**
>
> Fifty years ago, the Boss was at a pivotal moment in his career. A new book details what it took to launch ‘Born to Run’

Okay, yes, it's clearly a book ad. But it's an informative book ad.

> In the late summer of 1975, Bruce Springsteen’s third studio album, Born to Run, launched to critical acclaim and rapidly climbed the Billboard charts, holding at number three.

Some might argue, well, "How can something that peaked at #3 be considered the greatest of anything?" I say: because quality isn't always recognized as such. Consider Vincent van Gogh, severely underrated in his own time, only later to become one of the world's most recognizable names in art. So it is with Springsteen.

Yes. Yes, I did just compare him to van Gogh.

> “I can listen to it now 50 years later and think that every note and word are in exactly the right place,” says Peter Ames Carlin, Springsteen’s biographer and author of the just-released Tonight in Jungleland: The Making of Born to Run.

Hence the book ad. Does it make me buy the book? No. I was going to buy it anyway.

> The album continues to draw audiences with an estimated total of seven million copies sold in the U.S. alone over the past five decades and is listed for its cultural importance in the National Recording Registry of the Library of Congress.

The actual 50th anniversary of the album's release isn't for another few days. I should probably put it on my calendar to remember to listen to the thing on that day. I was pretty young when it came out, and it wasn't on my radar at the time. It wasn't until a few years later that I discovered the awesomeness that is Bruce.

I won't bore you with the rest of the article, which is long. It's linked there for anyone interested; I know not everyone is into it. Regular readers know I don't indulge in celebrity gossip. But I don't think of Springsteen as a celebrity; I think of him as a guy who makes great music. And music matters.

So you're scared and you're thinking
That maybe we ain't that young anymore...
---

Some science as reported by MSN:

I don't know why it took them so long. Anyone who went to university knows what forces the brain to sleep: philosophy lectures.

> We spend nearly a third of our lives sleeping...

Ideally, more. I've considered the idea that the purpose of life is to sleep, and anything we do during waking hours is in support of that.

> ...yet the biological trigger behind sleep has remained elusive.

My above attempt at humor aside, this is pretty cool.

> Researchers have found that the pressure to sleep may come from deep inside our brain cells, from the tiny power plants known as mitochondria.

Certain phrases have become cliché. One such phrase is calling the mitochondria "power plants." Despite the similar-sounding name, these are not the same things as the midi-chlorians that enable one to harness and use The Force. Pretty sure they were going for a similar-sounding name, because mitochondria are the power plants of the cell.

Here in consensus reality (that is, not the Star Wars universe), almost all known macroscopic life possesses cells that have mitochondria. This is, to the extent that I understand it, the result of an ancient and perverted union of an archaeon and a bacterium, both unicellular, with the archaeon failing to digest the bacterium and instead incorporating its guts into its own cytoplasm. If it weren't for that lucky chance, which vastly increased the energy available in cells, we wouldn't be here.

> The team, led by Professor Gero Miesenböck and Dr. Raffaele Sarnataro, discovered that a build-up of electrical stress inside mitochondria in specific brain cells acts as a signal to trigger sleep.

I suspect that, as with most scientific breakthroughs, this will need to be replicated before one can confidently use words like "discovered."

> The research, carried out in fruit flies, showed that when mitochondria become overcharged, they leak electrons. “When they do, they generate reactive molecules that damage cells,” said Dr. Sarnataro.

If this holds up, it's a good enough reason to not skimp on sleep. We know, empirically, that sleep deprivation causes all kinds of unpleasantness. Apparently, we didn't (and still probably don't) know all the reasons why.

> “We set out to understand what sleep is for, and why we feel the need to sleep at all,” said Professor Miesenböck.

I still say that they're probably looking at it from the wrong perspective. The question shouldn't be why we need to sleep. The question should be why we need to spend so much time awake. My cats, for instance, are keenly aware of the value of sleep.

> Small animals that consume more oxygen per gram of body weight tend to sleep more and live shorter lives.

The phrase "tend to" is doing a lot of the lifting there. The correlation is apparent when talking about housecats, who are smaller than humans, sleep more, and live shorter lives. But, from what I've heard, big cats also require more sleep, as do, famously, bears. I expect if you graphed body weight vs. sleep requirements, though, you'd end up with one of those scatter charts with a bunch of outliers (there's a quick sketch of that at the end of this entry).

> “This research answers one of biology’s big mysteries,” said Dr. Sarnataro. “Why do we need sleep? The answer appears to be written into the very way our cells convert oxygen into energy.”

"Answers" is also probably overstated. It's a step in the right direction, but there are still mysteries. That's a good thing. Hopefully, there will always be mysteries. Some might even be solved after a good night's sleep.
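About that graph: if you wanted to eyeball it, a few lines of Python would do the job. The numbers below are ballpark averages from memory, commonly cited but not from this article, so treat the result as a doodle rather than data:

```python
import matplotlib.pyplot as plt

# Rough, commonly cited averages (from memory; illustration only).
# name: (adult body mass in kg, typical sleep in hours/day)
animals = {
    "little brown bat": (0.01, 19),
    "house cat": (4, 12.5),
    "human": (70, 8),
    "giraffe": (800, 4.5),
    "elephant": (4000, 3.5),
}

fig, ax = plt.subplots()
for name, (mass, sleep) in animals.items():
    ax.scatter(mass, sleep)          # one labeled point per animal
    ax.annotate(name, (mass, sleep))
ax.set_xscale("log")  # body masses span about six orders of magnitude
ax.set_xlabel("body mass (kg, log scale)")
ax.set_ylabel("sleep (hours/day)")
ax.set_title("Sleep vs. size: a trend, with room for outliers")
plt.show()
```

Even these five points show the trend the article describes; add big cats and bears and you'd start collecting the outliers.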
---

Going Outside again, with another reason to avoid going outside:

> **The Case for Killing the Campfire**
>
> Outdoor tradition or dangerous, polluting, wasteful relic of the past?

The article is from almost nine years ago, and since then there have been several more wildfires and one pandemic that spread like wildfire. But I don't know what the current campfire policies are anywhere.

> Will you be able to enjoy a fire on your next camping trip? For residents of California, Oregon, and Washington, the likely answer is already no. For much of this summer, most wilderness areas in those three states were under a total campfire ban. Outside of the metal fire rings in organized campgrounds, you could not have a fire on public land.

And I understand the reasons, but if I were otherwise inclined to go camping, and someone told me "no campfires," I'd be like "Hard pass." If they didn't tell me until we were up on the mountain, I'd hike right on out of there. In the dark.

> But the risk and cost of wildfires is only one nail in the campfire’s coffin. And that means they could also be at risk in areas less prone to conflagration. Let’s look at the problems campfires cause.

If this were a newer article, I'd wonder if an LLM wrote it.

> **Pollution**
>
> Wood smoke contains fine particles of unburnt wood. That may not sound like pollution, but reduced in size to 2.5 microns or less, these microscopic particles become toxic.

Yes. We have cars, airplanes, trucks, ships, trains, coal-rolling rednecks, fossil fuel power plants, industrial manufacturing, and at least one perpetual underground coal fire. But you can do your part by... not having a campfire.

> **Health Problems**
>
> As nice as we all think wood smoke smells, inhaling all of the above isn’t good for you.

See above.

> **Litter**
>
> Campfires leave behind charred wood, piles of ash, and blackened rocks. People often use them to burn trash, which may only be partially destroyed, frequently leaving behind remnants of cans, bottles, plastic, and foil.

The author's assumption here seems to be that, in the absence of a campfire/incinerator, assholes wouldn't leave their thoroughly unburnt trash lying around in the wilderness. I reject that assumption.

> **Tree Damage**
>
> We all know we’re supposed to harvest only dead, fallen wood for our campfires. But in large volumes, removing even that stuff can cause problems.

One of the greatest epiphanies of my existence was when I realized that creation and destruction weren't opposites, or two sides of the same coin, but the exact same thing, indistinguishable except by the value we impose on the change. You're not destroying a tree; you're creating a fire.

> **Invasive Species**
>
> Firewood that you harvest or buy can be home to invasive species like the Asian long-horned beetle.

The irony of complaining about invasive species in an article promoting the roaming of an invasive species (H. sapiens) isn't lost on me. Still. Spiders.

> **Injury**
>
> There’s no national tally of campfire-related injuries, but a study in Oklahoma found that 57 people were injured and one person was killed due to campfires in a ten-year period.

There's probably no national tally of campfire-related injuries because ~~DOGE cut that department~~ it happens so rarely that there's no need for one. Also, one must balance any CRIs (I'm not typing that out again) with injuries resulting from flailing around a campsite in the pitch blackness, tripping over snakes and, as per the article from last week, getting eaten by bears.

> I’m an odd person to be writing this article.
> To me, having a fire has always been a fundamental component of enjoying a night outdoors.

And also, your arguments are weak.

You might think I'm an odd person to be commenting on this article. It's true that this has no personal relevance to me. Still, in my younger days, I "enjoyed" the occasional wilderness trip, including a (sometimes-illicit) campfire. What, you didn't think I was ragging on a thing I'd never experienced, did you? But mostly, while the risk of wildfires is real, most of those other items don't stand up to much scrutiny.

> And that got me thinking: perhaps the real lesson here is that the campfire has had its day.

So we have rules about them, and one can no longer escape to the wilderness to avoid human-made rules.
---

The theme for this Atlas Obscura article is clearly not timely here in August, but there are good reasons to consider this essential information that's valuable year-round.

> **6 Historical Burgers to Make for July 4th**
>
> Boozy fillings, peanut butter toppings, and other interesting recipes of yore.

Those reasons are: 1) there is no one time of year to eat burgers (it's not like it's eggnog, which is only acceptable in December); 2) lots of people who see this on the internet are from places where July 4 isn't a special holiday; and 3) the holiday has become irrelevant in the US, as most of the country's founding principles have been shredded.

Now, the article itself is short, with links to the specific burger recipes. Consequently, I'll be brief, as well.

> I am firmly anti-gimmick burger. A well-grilled patty on a soft bun is already a fine dish that needs little embellishment. So every time I see a new version with foie-gras filling or doughnut buns, I cringe.

Okay, I'm not entirely contrarian to what this author is saying, but, for starters, "doughnut buns" aren't something new and inspired; they're a long-standing tradition in my town, one that's even older than I am. Burgers on donuts were originally called, as I understand it, "grillswiths." As this is a college town, it's not surprising that the idea spread to other places, but it's not some sort of "new version."

For finishers, there's another long-standing tradition, this one pretty much global, that you take street food and/or subsistence food and, later, embellish it with variations that can be labeled "fancy" or "high-class," like the aforementioned foie gras, or caviar, or one place I vaguely remember that put gold flakes on their burgers just for the novelty of it (and probably for the 1000% markup opportunity).

> But I will always hold a place in my heart for the slugburger. A Depression-era hack meant to stretch meager meat supplies, the recipe combines ground beef or pork with potato flour.

This will probably never work in the next Great Depression, as ground beef/pork has stopped being a cheap food.

> From a Prohibition-era speakeasy that still slings bitters-filled patties to a roadside stand that’s carrying on a century-old tradition of steaming burgers, American history is filled with unusual burgers born of unusual times.

Like I said, burgers have slipped past our border controls and can be found in lots of different places. The article has links to six specifically American burgers, but who knows what burger variations you might find on the streets of some foreign and exotic land?
---

Sometimes you hear about people out wandering around doing idiotic things. The key here is that they're out wandering around. From Outside:

> **Selfies Don’t Kill People**
>
> And no place has ever been ruined by an Instagram post, either. It's time to stop blaming social media for the world's troubles.

The article (an opinion piece) is six years old, published back in the Before Times. Since then, more people have had the opportunity to do something stupid in the not-so-great outdoors, like petting the fluffy cows at Yellowstone or slipping off the rim of the Grand Canyon.

> No one has ever been killed by a selfie. A lot of people have been killed by stupid behavior.

I mean... technically? Sometimes taking a selfie is stupid behavior, like when you're at the edge of the Grand Canyon.

> I don’t know if it was the poppies in California, or the tourists who died in the Grand Canyon, or the guy who fell off a cliff in Yosemite National Park, but it seems as if the social-media outrage cycle has come full circle. Now, rather than being mad at a dentist who shot a lion or a zoo that killed a gorilla, everyone is outraged at social media itself.

Okay, but look, hear me out here: there have been several Grand Canyon incidents.

> What I tried to explain is that so-called selfie deaths aren’t anything new. There’s not been any sort of increase in the frequency of accidental deaths since the advent of Instagram or Snapchat; people have always managed to find stupid ways to die.

Fair enough, but with almost everyone carrying a camera around with them at all times, more people dying in stupid ways have been recorded for posterity.

> Smartphones could stop working tomorrow, and a teenage boy will still find a way to put his life at risk in order to impress a girl, even if he can’t snap a photo in the process. The biggest change would just be that the rest of us wouldn’t see a photo of the shenanigans and would never get the chance to get outraged about it.

Outraged? Hardly. In the immortal words of Elvis Costello, "I used to be disgusted; now I try to be amused."

> When people get the opportunity to visit a really cool national park, or a field full of beautiful wildflowers, or see a neat animal, it is only right and normal that they want to document the experience and share it with their friends.

I just saw one yesterday about a man in India who tried to take a selfie with an elephant, and the elephant chased him down, knocked him over, pulled down his pants, and stepped upon him. Guy survived, so it's not a "selfie death." But it was a pretty stupid thing to do, and I totally get where the elephant was coming from there. I like to think the beast purposely left him alive to teach him a lesson.

> Again, this is not a new phenomenon. Are Ansel Adams’s photos of Yosemite Valley really that different from every Instagram photo every tourist snaps in the same spot?

Ahem... yes.

> And just like Adams’s work, all those Instagram posts from Yosemite make people want to go visit.

This isn't the flex you think it is.

> Finding a cool camping spot is no longer something that requires navigation skills; you just click on the geotag to open Google Maps, then tell that app to lead the way there.

Neither is this.

> Social media represents change. New people from more diverse backgrounds can now easily reach massive audiences. Change can be scary but it can also be powerful.

There is indeed something new under the sun. The problem is that it's under the sun.
I suppose it's possible that people have also died or been injured from taking selfies in the comfort of their own home, because one constant in the universe is the human desire to show off, but it seems to me that the problem is these people went outside. With or without phone cameras.

Well, despite the age of the article, it's probably relevant again, because I see a stupid stunt death in the news at least once a month, maybe more. And again, my reaction isn't outrage. Sometimes, it's schadenfreude.
---

A bit about human nature from BBC:

Everyone's selfish. The only question is how much your selfish desires overlap with helping others. Like, if you give money to disaster relief, you've helped someone (mostly the charity's organizer, though some might actually get to the victims). But it also feels good. Doing something that feels good is selfish.

> Whenever I fly, one line jumps out from the pre-flight safety briefing. Somewhere between "welcome aboard" and "use this whistle for attracting attention", we're reminded to "put on your own oxygen mask before helping others". This is, essentially, an official instruction to be "selfish".

What? No, it's not. It's pure, undiluted practicality.

Most of the pre-flight videos I've seen illustrate this by showing an utterly calm and in no way panicking woman sitting next to a little kid, presumably her offspring. She methodically fixes the oxygen mask over her face while the kid's sitting there like he's waiting for the dentist, also, in the words of Tyler Durden, "calm as Hindu cows." Then she reaches over and masks up the extraordinarily well-behaved kid. In reality, the kid would be screaming, freaking out, and squirming all over the place (if, that is, it hasn't been sucked right out of the plane by whatever depressurized the cabin enough for the masks to drop). And there's a reason it's always a woman in these videos: it's generally women who are socialized to put everyone else before themselves.

So, it's a clear reminder that a) you should try to stay calm while the plane you're on drops 10,000 feet in 2 seconds; b) adults are responsible for kids; and c) the kid's not going to be in any position to help you if you put their mask on first, so make sure you're relatively stabilized before assisting the little brat.

> But on the other hand, in a world that often seems to reward narcissism, there could be a risk that that same line speaks to a somewhat troubling life philosophy. The idea that you should always put yourself first – and that selfishness trumps altruism.

Again, in reality, there's a balance to be struck between pure self-interest and pure altruism, both of which are probably, like, infinity and negative infinity: useful concepts, but there's a whole infinity of range between the two extremes. Or, to put it another way if math(s) freaks you out, life isn't about one or the other; it's about balancing your own needs and desires against the needs and desires of others.

> Elements of psychology, economics and biology – not least the ideas of selfish genes and neo-Darwinism – have normalised the assumption that competition means humans are intrinsically cruel, ruthless or selfish, says Steve Taylor, a senior lecturer in psychology at Leeds Beckett University.

Even in competition, there's an element of cooperation. If you're playing a football match (either kind), you're obviously cooperating with your teammates while trying to score more goals than the other team. But you're also, in a way, cooperating with the other team: you've agreed on the rules of the game (or had them imposed upon you), and there are penalties for breaking the rules.

> Take the "bystander effect", which first emerged in the 1960s. This is the widely cited idea that people typically avoid intervening in a crisis when others are nearby. The theory followed outrage over the 1964 New York murder of Kitty Genovese, a 28-year-old bartender who was reportedly raped and killed in front of nearly 40 witnesses, none of whom helped.
> But the final detail of the story behind the "bystander effect" appears to be an apocryphal one. While, tragically, Genovese really was sexually assaulted and murdered, investigations suggest that reports of there being 38 passive bystanders were inaccurate.

I know I've pointed this out before, but I still keep seeing people referring to this incident as if it proceeded in accordance with the early tabloid sensationalism, so I'm quoting the above to emphasize that the bystander effect isn't nearly as pronounced as people think it is.

> Research suggests that people are actually more than willing to prioritise others' safety over their own in many situations.

I expect it depends on the person, but, okay, here's something I see a lot of: Some asshole leaves a kitten by the side of the road. Someone who's probably not as big an asshole hears the poor thing's plaintive cries, and catches the feline, takes it to the vet, and generally ends up keeping it, or at least ensuring it's got a proper home with caring people.

What's the big takeaway here? Usually, there are a lot of "people suck!" comments involved, emphasizing the cruelty of the kitten-abandoner. But I, as cynical as I am at times, draw a different conclusion: there's someone who cares enough to rescue the kitten; there's someone else whose entire job is to make kittens (and puppies) healthier; and there are a whole lot of anonymous internet commenters who would do the same thing in the same situation, and yet, they'd rather condemn the one person who did a bad thing.

And for fuck's sake, that's not even our species. The vast majority of us are altruistic enough to take the time to help an entirely different expression of life. Even the act of condemning the asshole who dumped a kitten off to fend for itself speaks volumes about one's priorities. Never, not once, have I seen anyone comment on such a story with "Well, it's just a cat" or some version of "I'm sympathetic to the kitty-dumper." Are there bad people? Absolutely. Are they the majority? Do they represent humanity as a whole? Hell, no.

> There are evolutionary reasons for human altruism, Taylor says. For most of our history, we have lived in tribes as hunter-gatherers – highly cooperative groups.

Oh, for... stop with the evolutionary guesswork. Most of our evolution happened before our ancestors could even be considered "human." I've ranted about evo-psych before, though, and I won't go into that again right now.

> "There's no reason why early human beings should be competitive or individualistic," says Taylor. "That would not have helped our survival at all. It would have actually endangered our survival."

Within a tribe, sure. Between tribes, well, we see the result of that every day. And yet, there are still things we agree on. Mostly.

> Science suggests that most of us have the hardware to be selfless, often extraordinarily so. But that doesn't mean we can – or should – be selfless all the time. Whether we prioritise ourselves or others depends partly on circumstances, our prior experience and our culture.

The rest of the article, which is moderately long, continues in the same vein. But, again, my takeaway here is that no one is completely selfish or completely altruistic; there are only gradations in between. That hypothetical pet-dumper I mentioned? On a good day, I might guess, without evidence, that the reason they did what they did was because they had limited resources, and prioritized their own family over the life of a cat.
I've been in dire situations, myself, and I know that you don't always think things through when your main concern is feeding you and yours. And they might even think, "the cat has claws and teeth and can catch mice" (however wrong that is when you're talking about a domesticated animal). Point being, it's entirely possible they thought they were doing the right thing.

Or, possibly, they're just a terrible person. Plenty of those around. But they're outnumbered.
---

I'm trying to get to the movie theater more often again, so finding this article was timely. From Delish:

> **What Nutritionists Wish You Knew About Popcorn**
>
> Think twice before reaching for the movie theater butter.

They call it "movie theater butter" because it has nothing in common with actual butter; that is, no cow titties were ever involved.

> Popcorn is one of the most versatile snacks in the world.

Sure. You can eat it, or you can string it up for decoration.

> But there are a lot of contradicting opinions about the nutritional merits of popcorn.

Of course there are.

> Is popcorn actually a good-for-you snack, or is it something you should avoid if you're trying to eat better?

Uh huh. That depends. Are you a Calvinist (John-not-cartoon-kid) who believes that everything that tastes or feels good must be, by definition, evil?

> Since it’s whole grain by nature, it is filled to the brim with fiber.

They couldn't come up with a more appropriate metaphor?

> According to experts, another major benefit of popcorn is the high concentration of polyphenols.

"But that's a CHEMICAL!!!"

> From a macronutrient perspective, popcorn on its own is an extremely low calorie food.

Right, well, I suspect here is where people start to confuse "popcorn" with "popcorn how people actually eat it." Look, we can all agree that, say, brussels sprouts are good for you, right? Green vegetable, basically cabbage, and until fairly recently they tasted so bitter people started calling them the Devil's Hemorrhoids. Okay, maybe they didn't, but they could have. The taste was proof (to Calvinists, anyway) that they had to be good for you. To make them more palatable, people slathered them in butter and cheese and all those things that, at the time, they thought would clog your arteries right up, and even if they didn't, it still increased the calorie count by several orders of magnitude.

The point is, while I'm perfectly content to munch on lightly salted, but otherwise plain, popcorn, not everyone can or will do that, instead adding things that would make even brussels sprouts (or kale, which is really the same plant) unhealthy.

This happens to every "healthy" thing, by the way. Take coffee. No, seriously, take it; I don't drink the stuff. But I don't have to drink it to know about the studies that seem to support its health benefits. But any benefits are from black coffee, which very few people drink. Instead, they'll go to Starsucks and order a venti whipped caramel mocha chocolate frappe latte, or whatever, to the point where you're like "You want any coffee in your flavorings?" Or yogurt, which is supposedly good for you, but not if you're going to turn it into candy.

Now, look. I'm not saying "don't eat or drink these things." That would be hypocritical. Eat and drink what you want; enjoy life. All I'm urging is that you don't load up your healthy thing with sugar, salt, and fat, and then pretend it's still healthy.

Okay, fine, back to the article.

> Popcorn kernels on their own can be considered a healthy food, but its most popular preparations are anything but. “Not all popcorn is created equal,” says Elisa Kosonen, RHN, CHC, NNCP. “The biggest concerns come down to how it’s made and what’s added to it.”

I suppose all of the capital letters and commas after her name are there to lend gravitas to her statement, which is, after all, practically a tautology. And they might do that if I had the slightest clue what all those initials meant. Sure, I could look it up, but that would be cheating.
"A lot of the pre-packaged or movie-theater versions are loaded with butter, salt, and artificial flavorings, which can turn a light snack into something super high in calories, sodium, and unhealthy fats," Ortiz says. Which is what I've been saying, but the initials after my name only make my statements about civil engineering meaningful. That's one reason I hardly ever talk about civil engineering: I don't need the liability if I'm wrong. The other is that it's generally boring as shit. (Yes, that's an attempt at a sanitary sewer pun.) Another thing to worry about? The packaging. "Microwave popcorn is often packaged in bags that may contain chemicals, such as perfluorooctanoic acid," Routhenstein adds. Oooh, another scary-sounding chemical, followed by the equally scary "acid!" In this case, though, the scariness may be somewhat justified. So like I said. I believe in eating what you want. But I also believe in being educated about things. As such, I might actually look up that quoted person's credentials. Later. |
---

I can't decide whether this article is an argument for, or against, the use of psychedelics. I'm leaning toward "for," but I like my world to contain a little bit of absurdity; it makes life more interesting. From Futurism:

> **Time Is Three-Dimensional and Space Is Just a Side Effect, Scientist Says**
>
> "These three time dimensions are the primary fabric of everything, like the canvas of a painting."

It's not a theory. At best, it's a hypothesis. At worst, it's what happens when you get a bad batch of shrooms.

> A fringe new theory suggests that time is the fundamental structure of the physical universe, and space is merely a byproduct.

"Fringe" is right.

> According to Gunther Kletetschka, a geologist — not a physicist, you'll note, but more on that later — from the University of Alaska Fairbanks, time is three-dimensional and the dimensions of space are an emergent property of it, a press release from the university explains.

I could make up shit, too, and put it in a press release. I don't; I put it in a blog instead, or a story or poem. Actual science would go into a paper published in a reputable journal and get peer-reviewed. It could still be wrong, but at least it was taken seriously.

The idea of reality being "an emergent property" of something deeper isn't that far out there. Temperature, for instance, is the average kinetic energy of the atoms that make up a substance; it's not proper to talk about the temperature of a single atom, or of atomless vacuum (pedants can find the fine print at the end of this entry). Atoms themselves can be thought of as emergent from particular configurations of other kinds of energy. This sort of thing, though, is what leads people to airily proclaim that everything we think is real is actually an illusion, which I say stretches the definition of "illusion" beyond usefulness. It also opens the door for the person saying it to insert whatever they think is "really real" into the discussion. That, however, is a philosophical and semantic argument, not a scientific one, even if it's based on scientific discoveries. It also helps keep cannabis growers in business.

> Three-dimensional time is a theory that has been proposed before, though generally in pretty inaccessible terms. Similarly to the explanation for three dimensions of space — length, width, and depth — 3D time theory claims that time can move forward in the linear progression we know, sideways between parallel possible timelines, and along each one of those as it unfolds.

Notably, Robert A. Heinlein used some version of that in his 1980 novel The Number of the Beast. Like I said, one can make up whatever and use it in a story. It's still not science, even if it's science fiction.

Again, though: not a theory, except in the colloquial sense. It's not provably wrong (yet), unlike the idea that the Earth is flat or that Nazis have had a base on the Moon since the 1940s. Extraordinary claims all call for extraordinary evidence.

> And the claims here are already stirring controversy: as an editor's note added to the end of the press release cautions, the scientist's theory was published in the journal Reports in Advances of Physical Sciences, a "legitimate step," but one that isn't remotely sufficient to take it out of the realm of the fringe.

I dispute that this is a "legitimate step." That journal has about the same level of relevance as this blog.

> "The theory is still in the early stages of scrutiny," the note concluded, "and has not been published in leading physics journals or independently verified through experiments or peer-reviewed replication."
Translation: "this guy just made this stuff up."

> Still, it's a fascinating concept to consider — especially because we still don't know exactly how time works, anyway.

And to be clear, it's important to come up with these things. The mistake people make isn't using their imagination; it's falling into the trap of believing that their imagination matches up with reality.

I should emphasize here that none of this means the hypothesis is wrong (or right). Or that certain drugs shouldn't be legalized.
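Oh, and here's the fine print I promised on my temperature example, supplied by me and not by the article: for a monatomic ideal gas, the average kinetic energy per atom is

$$
\langle E_{\text{kin}} \rangle = \frac{3}{2} k_B T
$$

where $k_B$ is Boltzmann's constant. The point is that $T$ is a property of the whole collection of atoms; a single atom has a kinetic energy, but not a temperature.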
---

Another reason not to go outdoors, from OutdoorLife:

> **Colorado Campground Bans Tents After Black Bear Swats One with Kids Inside**
>
> "Hey, mom, a bear was here"

So, bears are calling in false reports to cause SWAT to show up at the tents? Oh, wait, the original swatting. Never mind.

> A popular campground in Colorado’s White River National Forest has taken the proactive and unusual step of banning tents and all other soft-sided shelters to keep campers safe from bears.

That's okay. We'll just sleep out here in the open and-- wait.

> The U.S. Forest Service enacted the tent ban at Difficult Campground...

Nominative determinism strikes again.

When I was visiting Virginia Beach last week, my friends and I ended up at a winery. During the wine tasting part of the evening, I was watching the screens, which showed promotional documentaries on how they make the wine. The head sommelier (I suppose technically sommelière if you're pretending to be French) was named Emily Wines. Can't make this stuff up. No one would believe you. If I wrote a story where someone with that name worked anywhere near spoiled grape juice, some editor would make me change it.

Point is, of course they made things Difficult.

> ...after a black bear scratched at a tent there with two young children inside it.

Or, in bear terms, a burrito.

> The campground hosts told the Times, however, that the tent ban would remain in place for the next two months. They said they experienced similar issues last year with food-conditioned black bears breaking into tents and coolers.

To be serious for a moment, this is the problem. Bears start associating humans with food. Or, to be a bit less serious, as I put it, "If you feed the bears, you will feed the bears."

> They added that “the bears were here first” and “this is their home,” and that campers are the real guests.

Humans, most places in the world, are an invasive species.

> The Forest Service clarified in Wednesday’s public safety release that hard-sided campers and trailers are still permitted at Difficult Campground.

That's okay, then. We wouldn't want things to get too in tents.
---

We are experiencing unusually high call volumes. Your call is important to us. Please stay on the line and someone will be with you shortly. From Mental Floss:

> **Why Hold Music Is So Annoying**
>
> We asked an expert to explain why listening to hold music is such a frustrating experience.

An expert? I don't need an expert to know that the reason it's annoying is that it's playing while you're on hold. Hell, they could play Springsteen while I'm on hold and I'd still be annoyed.

> We’ve all reluctantly dialed up business knowing that before we reach an actual person, we may be forced into the dreaded hold zone, the hum of annoying hold music flooding our ears as we ponder all the things we would rather be doing.

That's why I put my phone on speaker and leave it nearby while I play a video game on the laptop.

> “Hold music is an audible representation of time that is being spent not being assisted,” Dr. Leigh VanHandel, Associate Professor of Music Theory and Music Cognition at the University of British Columbia, in Vancouver, Canada, tells Mental Floss via email.

See? Like I said.

Incidentally, if your company is "experiencing unusually high call volumes" all the fucking time, they're not "unusually high call volumes;" you just don't want to pay for sufficient staff.

> As much as it frustrates us, hold music may be a necessary evil. Imagine waiting on a call for any amount of time and having no music at all, just a sound void.

Could be worse. Could be constant commercials.

> Later, companies realized that they could fill some of the customer hold time with branded announcements or advertisements...

Goddammitallsomuch.

> “I don’t think there is a single genre that would make everyone’s wait time more pleasant,” VanHandel says. “Whatever genre [the businesses] pick, some people are going to love it and some people are going to hate it.” That might be the reason why companies choose hold music in generally inoffensive genres like classical, smooth jazz, contemporary, easy-listening music, and usually include instrumental pieces.

Ah, yes, mayonnaise music. Mayonnaise music offends me.

But one thing I haven't experienced yet, the one kind of hold music that would make me absolutely and immediately stop doing business with a company altogether, is winter holiday tunes. Unless the company in question had the balls to use the parody songs, in which case, I'd pay to be on hold. But none of them have the gonads.

> “Companies should spend less time overthinking their hold music and more time hiring and training customer service representatives.”

Oh, look, another "expert" quote that I figured out first.

> VanHandel also points out a new trend of companies taking a caller’s phone number and calling them back instead of having people wait on the line...

This is true, but I'd need something to ensure that the number they're calling back from doesn't set off my scam alarm.

Look, I know holds are going to happen. I try not to let them ruin my day. It's not often that I have to actually call in to somewhere, because I think I've figured out this newfangled internet thing, so it's not a daily annoyance. Still, the music could definitely be improved.
---

A little outside my usual box today; this year-old article is from Australia and, despite having spent my childhood on a farm and around farmers, I know little about the topic involved. So the reason I'm posting this has more to do with my growing suspicion that Australia is not, in fact, trying to kill you; that's just the story they tell to try to keep Americans away.

> Stuart Armitage is getting bitten by spiders more and more as the years go on, but he doesn't mind — it's a small price to pay for progress at his Queensland cotton farm.

Case in point. To hear the Australian Department of Keeping Americans Away tell it, even the tiniest Australian spider bite should be enough to send someone into immediate and painful cardiac arrest.

> Prior to 1996 Mr Armitage would have had a hard time finding a spider in his paddocks or on many other Australian cotton farms.

So the story is really about reducing pesticide/herbicide use on Australian farms, but in the process, they've let their secret slip out.

> The pesticide killed other life forms too, but with the invention of insect-resistant, genetically modified cotton – Bt cotton – the plant was able to produce a protein to kill the worm and spraying was significantly reduced.

It's also pro-GMO, which I appreciate (to be serious for a moment).

> The industry's progress in this area and others, including water use, has been revealed in the latest independent review of the sector's environmental performance.

Do we do that in the US? I don't think we do.

> Mr Kay expects herbicide volumes to drop as more growers adopt robots and cameras to spot spray weeds instead of applying blanket sprays.

Until the inevitable robot uprising.

That's it, really. I saved this entire article mostly to make an "Australian wildlife will kill you" joke. Still, the methods they're using are rather interesting; it looks like they were also able to reduce water use, which would be good anywhere, but especially good in a place that's mostly desert.
---

I have to admit, Popular Science caught me with this headline. I don't usually fall for clickbait, but you can't just throw this headline out there and not expect me to fly into a rage.

> **Inventing lager was a huge mistake**
>
> The history of the beloved beer is full of yeast, witch trials, and royal spats.

Obviously, I thought the article would be about how it was a mistake to create lager-style beer. But no, English just has to be ambiguous; it's saying that lager came about by mistake. Well, it wouldn't be the first or last fortunate mistake in the history of fine fermented beverages.

> A study published April 27 in the journal FEMS Yeast Research reveals a possible origin story for lager beer, a light type of beer produced by bottom-fermenting yeast.

By "April 27" they probably mean "of 2023," when this article is from.

> The research team used historical records, in tandem with evolution and genomics research, and believe that lager likely originated at the court brewery–or Hofbräuhaus–of Maximilian I, the elector of Bavaria.

That's a lot of work to come up with something they're not really sure of. I just hope they got to do a lot of hands-on research into beer.

The rest of the article gives a few more details about beer origins, but I don't have much else to say about it. Mostly I just had to read it to calm down and confirm that they weren't somehow asserting that it was a mistake to make beer.

Just one more thing: obviously, there was at least one actual mistake in beer history, and that was to pass off mass-produced rice-adjunct processed lager to the American public as "beer."
---

I saved this article from Time not so long ago (it's from mid-July), but right now I can't remember what I wanted to say about it. Perhaps that's because I'm getting old. In my life philosophy, there's no point to living longer if, in order to do so, you have to give up the things that keep life worth living. But that's not what the article's about. It seems to be about better health in old age, but I have this nagging suspicion that it's really about being "productive" longer.

> As of July 2025, the estimated global average life expectancy is approximately 73.5 years.

There's a couple of important caveats there, and probably some I haven't thought of. First, from the link provided, "Life expectancy at birth indicates the number of years a newborn infant would live if prevailing patterns of mortality at the time of its birth were to stay the same throughout its life." That doesn't mean life expectancy for someone who's 30, 40, 50, or 60 in 2025. Second, averages like that are useful for statistical calculations and things like insurance, but nearly useless for individual planning. (There's a toy example of why at the end of this entry.)

> High-income countries with advanced healthcare systems, good sanitation, and healthy lifestyles have an even longer life expectancy average, reaching up to 84 years.

It's also important to note, because I live there, that the US doesn't meet those criteria.

> Research suggests that we’re entering the largest intergenerational wealth transfer in history, with trillions expected to pass from older to younger generations. But much of that wealth may never arrive.

Oh, wealth is being transferred. It's just not being passed on to heirs, but to insurance companies and healthcare corporations.

> We’re not just living longer—we’re also living longer with dementia, Alzheimer’s, and other forms of cognitive impairment. And without meaningful action, this trajectory could accelerate.

While I fear that way more than I fear death, this sentence makes it seem like major cognitive impairment is certain. From what I've read, it is not. The problem is that it's basically a die roll. Also, there's a difference between normal age-related cognitive decline and the profound loss of faculties characterized by Alzheimer's and the like.

> If we want longer lives to truly become better lives, we must shift our focus from simply extending lifespan to improving healthspan.

I kind of thought that's what lots of research does.

> First, we must prioritize prevention and delay—not just in old age, but across all stages of life.

That might not result in the paradise people think it would. Suppose they ban pizza and cheeseburgers. That might not affect some people, but for me, death would be preferable.

> Second, technology is a powerful enabler. AI assistants and care robots can assist with mobility, medication reminders, and safety monitoring to help older adults remain independent longer.

This author clearly doesn't read science fiction. That, too, has its horror elements.

Anyway. I'm not going to belabor this further. I think the article makes some good points, but I flinch every time something like this mentions "lifestyle changes" or "policy intervention." It's not that I don't want to live a long time; it's that I want to live, not exist.
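Here's the toy example I promised about "at birth" averages. The death rates below are completely made up, and the arithmetic is mine, not the article's; only the shape of the math matters:

```python
# Toy period life table with invented numbers, one row per decade.
# q[age] = probability of dying during the decade starting at that age.
q = {0: 0.01, 10: 0.01, 20: 0.02, 30: 0.03, 40: 0.05,
     50: 0.10, 60: 0.20, 70: 0.40, 80: 0.70, 90: 1.00}

def remaining_years(from_age: int) -> float:
    """Expected further years lived if today's rates never change,
    crudely assuming deaths happen mid-decade."""
    alive, years = 1.0, 0.0
    for age, prob in q.items():
        if age < from_age:
            continue
        years += alive * ((1 - prob) * 10 + prob * 5)
        alive *= 1 - prob
    return years

print(round(remaining_years(0), 1))        # 71.8 -> "life expectancy at birth"
print(round(60 + remaining_years(60), 1))  # 79.2 -> what a 60-year-old can expect
```

Same mortality table, but the 60-year-old comes out seven-plus years ahead of the at-birth figure, because the at-birth average is dragged down by everyone who never reaches 60. Real tables work per year of age rather than per decade, but the shape of the argument is identical.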
---

It's everywhere. In my phone. On my computer. Hell, I'm looking at one right now, just above this word. A three-year-old article from The Hustle:

> **The $500m smiley face business**
>
> Nearly 50 years ago, one man ‘invented’ the modern smiley face. Then, another man halfway across the world made it into a multimillion-dollar cash cow.

Ah, yes, capitalism: where you take someone's emotions and sell them back to them at a profit.

> This simple icon — a yellow circle, two dots, a smile — retained relevancy through 50 years of cultural movements, from free love to raves to the digital revolution.

My favorite iteration was in Alan Moore and Dave Gibbons's Watchmen.

> And in the process, it became a family-owned global licensing empire worth more than $500m per year. But how did it get there?

$500m isn't a lot for a global business, but it's a lot from every other perspective.

> In 1963, a Worcester, Massachusetts-based freelance artist named Harvey Ball received a life-changing call from a local client.

So, given the article is from 2022, it was closer to 60 years than 50.

> State Mutual Life Assurance Company had just merged with an out-of-town competitor, and employee morale was waning. They needed some kind of quirky and fun design to lift spirits around the office.

These days, they'd lay off half the staff and hold a pizza party for the survivors.

> It soon became clear that Ball’s simple illustration was worth millions of dollars. But he’d made a critical mistake: He never filed a trademark.

Whoops. Yeah, that's almost as bad as how Siegel and Shuster got stiffed for creating a certain Kryptonian. On the flip side, artists sketch stuff every day, and the vast, vast majority fade into obscurity. It takes real luck to think "I'd better go through the process of securing my rights to this thing," when you'd presumably rather go on to the next drawing.

> Halfway across the world, in Paris, France, a young journalist named Franklin Loufrani had his own stroke of invention.

Hang on a minute while I process the cognitive dissonance of a smiley face in Paris.

> His creation, a smiling yellow face, bore a striking resemblance to Ball’s. But unlike Ball, he foresaw the symbol’s marketing potential and immediately secured a French trademark.

The accompanying illustration shows the front page of the newspaper in which it appeared. For some reason, one of the headlines is censored in it. The words under the smiley translate to "Take the time to smile."

> “You could say there was a political or social meaning behind what he did, but it was really a commercial act,” Loufrani’s son, Nicolas, told The Hustle. “He wanted to make money on it.”

Dammit, France, you're supposed to be the socialist one and we're supposed to be the capitalist one.

> In the early ‘70s, France was emerging from a counter-cultural movement similar to America’s hippie uprising: Students were rejecting moral strictures, embracing free love, and leaning into a sexual revolution.

How this is different from just "being France," I have no idea.

> In the 1980s and early ‘90s, the smiley was adopted by a new generation of ecstasy-fueled ravers.

Somehow, I missed out on that particular cultural trend. It was mostly people younger than me. Hippies were mostly people older than me. You know what I got? Fucking disco.

> By 1996, the smiley business wasn’t looking so hot. Licensing deals were down, and the logo was starting to lose its edge.

Oh no, a trend was ending.
Nicolas formed The Smiley Company and secured trademarks for the ‘smiley’ brand name in 100 countries around the world. In countries where it was taken, the Loufranis bought it, or battled the owner in court (including a famous, 10-year legal battle with Walmart in the US). The link to the "legal battle" seems to be broken, but I did a quick search, ignoring AI as usual. Seems Walmart effectively won that battle—and then proceeded to drop the smiley face from its promotions anyway. Then, against the wishes of his father, Nicolas made major updates to the old-school smiley face, morphing it from a static image to a 3D orb. He dubbed it the “new smiley.” “It was totally against all marketing theory: If you have a logo, you don’t create new ones,” he says. “My father was furious. He said, ‘It’s stupid what you’re doing! You have a trademark! Keep the logo, but don’t change it!’” Okay, I admit I'm far from an expert in marketing, but it seems to me that companies change their logos all the freakin' time. With a few notable exceptions, like Sherwin-Williams. “When emojis started to pick up, we were seen as the originator, and it gave us a renewed credibility,” says Nicolas. “The smiley was cool again.” Poop emoji grins in agreement. Lots more at the link, of course. The story, however, failed to make me happy. Maybe if I went back to Paris... |
I'm mostly going to let this Big Think article speak for itself. There's a lot more there than I'm commenting on here. You can’t do your own research without doing your homework first Here in 2025, many of us claim to come to our own conclusions by doing our own research. Here’s why we’re mostly deluding ourselves. But deluding ourselves is our superpower. Is the tap water safe to drink, or to shower in, today? We assume that it is without bothering to check. For certain values of "we." Is your seat belt fastened when you get into the car? If it isn’t, a sensor will alert you, reminding you incessantly to buckle it. My first car—well, truck—well, my dad's farm truck—didn't have seat belts. Or A/C. Or heat beyond a "defrost" switch. Or a radio. Or power steering. I miss that old truck. Is the air safe to breathe? No. We assume that these aren’t real worries and that everything will turn out fine. (And this is okay; in most cases, they will.) And yet, the only reason that we can make those assumptions is because somewhere, over long periods of time, responsible adults have done the work necessary to ensure that these mundane, everyday activities aren’t going to pose threats to you. I miss those days, too. The truth is not that consensus undermines science, but rather that our modern rejection of expertise, and our newfound embracing of the “do your own research” ideology, is a recipe for societal disaster. Here, I have to agree. The idea of “doing your own research” can only exist side-by-side with a concept that most of us normally take for granted so thoroughly that we rarely, if ever, even mention it: the concept that we inhabit a reality that obeys rules and that those rules can be, at least in principle, understood. It is likely that quantum effects, while they obey rules, cannot be understood by the vast majority of people. But that doesn't have much bearing on everyday experience. For example, there are many people who believe that chemicals are bad for you, and that if you can only avoid ingesting or coming into contact with said chemicals, you’d rarely-to-never get ill. This is a small portion of what’s known as the naturalistic (or appeal-to-nature) fallacy, but there’s a tremendous lack of homework-doing that one must do to even think up such a thought. Regular readers may recognize some permutation of this, as I've been railing against it for decades. Water—pure water, distilled, no additives—is a chemical. And poison ivy is all-natural. Similarly, vaccines — rated the #1 public health intervention of the 20th century by the CDC — are safe, effective, and reduce or eliminate both the virulence and transmissibility of preventable diseases. Tens of thousands of deaths and millions of infections have been eliminated, annually, by the near-universal adoption of routine childhood and adult vaccinations as recommended by the CDC. Yet many people, out of a combination of fear and ignorance and a woefully misguided view of what vaccines actually do inside one’s body, seek to opt out of these routine vaccinations, putting children, newborns, the immunocompromised, and the elderly at elevated, unnecessary risks of severe illness, long-term injury, and even death. And we can shout that as much as we want (and I want to do so), but some people will never accept it. This is why we need real science education. Not the big stuff like "what happened in the first second after the Big Bang" or "how do we instantiate artificial general intelligence," but education in the scientific method.
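Stripped to its bones, the method is just a loop: observe, form a hypothesis, run the cheapest experiment that could falsify it, and revise. Here's a toy sketch in Python using a truck that won't start; every observation, check, and fix in it is a hypothetical stand-in, not actual mechanic's advice.

```python
# A toy sketch of the scientific method as a debugging loop.
# The observations, hypotheses, and fixes are all made up for
# illustration; don't diagnose an actual truck from this.

def gauge_reads_empty() -> bool:
    return False  # observation: the tank isn't empty

def headlights_are_dim() -> bool:
    return True   # observation: weak electrical system

# Hypotheses ordered from cheapest to test to most expensive,
# each paired with an experiment (a check that could falsify it)
# and a proposed fix.
hypotheses = [
    ("out of gas",   gauge_reads_empty,  "add gas"),
    ("dead battery", headlights_are_dim, "charge the battery"),
    ("bad starter",  lambda: True,       "replace the starter"),
]

for name, experiment, fix in hypotheses:
    print(f"Hypothesis: {name}")
    if experiment():  # run the experiment; let reality vote
        print(f"  Supported. Try the fix: {fix}")
        break
    print("  Falsified. On to the next hypothesis.")
else:
    print("No hypothesis survived; go collect more observations.")
```

That's the whole trick: every guess has to be testable, and the test, not the guesser, gets the final say.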
And people use it all the time without realizing it. Remember that truck I mentioned up there? I learned the basics of the scientific method by observing, diagnosing, and fixing its many issues when it wasn't running properly. Can't really do that with modern cars, but that's not my point. We have seen this at play repeatedly: about fluoridated drinking water, about vaccines, about the origins of COVID-19, and about global warming. Many other examples abound, as you can commonly find Americans denying evolution, the Big Bang, a round Earth, or even the rights to equality for our fellow humans here in 2025. There has always been a political/ideological aspect to resistance to science and technology, going all the way back to at least Galileo. If the transition from horse-drawn carriages to automobiles happened today, the entrenched Horse Lobby would put out all kinds of disinformation in order to convince the general public that automobiles are dangerous (which, to be fair, they can be), but their real motivation would be to keep breeding and selling horses. You know, kind of like how the fossil fuel industry puts out bullshit about windmills causing cancer or solar power being detrimental to the environment, and lobbies to restrict electric vehicles. But there is a path back from all of this: a path back to reality. First, we have to recognize the value of not “doing our own research,” but of doing our own homework... The path back to reality is instead to value actual expertise, and those who have devoted their lives to the betterment of humanity through discovering scientific truths about reality. The author is way more optimistic than I am on this subject. But I'm not going to stop trying, anyway. |
Well, if I'd known the random numbers would serve up a softball article like this one from Atlas Obscura yesterday, I'd have posted it then. But that would defeat the whole purpose of the surprise of random picking. Hibiya Godzilla Square Japan’s tallest Godzilla statue contains a piece of the original 1954 film within its base. Of course, you need to go to that link to see pictures, which you definitely want to do, because it's Godzilla. Or, you know, Gojira, depending on how accurately you want to render the Japanese name in English. Hibiya Godzilla Square is home to the largest Godzilla statue in the country, a monumental tribute to Japan’s most famous kaiju (“strange beast” or “monster”). Looking at the pictures, though, might lead to some disappointment. It may be the largest Godzilla statue in the country, but it's not even close to Godzilla's actual size. Well, I mean, what his actual size would be if he could exist without violating most of the known laws of physics and probably a few as-yet-unknown ones as well. Two statues have occupied the square over the years, the first of which was installed to commemorate Godzilla’s demise in Godzilla vs. Destoroyah. The present statue replaced it in 2018. To my shame, I have not seen that particular Godzilla movie. I did, however, see a remastered original Godzilla from 1954 at the local drafthouse theater a few years ago. An unmistakable representation of Godzilla as seen in the 2016 film Shin Godzilla, the centerpiece of Hibiya Godzilla Square rises above a crashing wave at an impressive height of 9.8 feet... Fuck's sake, it's in Japan. Use meters. (It's almost exactly 3m.) And yes, that's taller than a puny human, but actual Godzilla started out at about 50m and only got bigger from there. Look, I like Godzilla, okay? It's not a guilty pleasure, because I don't believe in guilty pleasures, but I realize it's completely unrealistic. That's fine. So is Superman, and I never pass up a chance to see a Superman movie. Including this latest one, which I absolutely enjoyed. (I used to do one-sentence movie reviews, but that was more of a "previous blog" thing.) Clearly, though, Superman would beat Godzilla in a fight. Unless the kaiju somehow swallowed some kryptonite, and then it'd depend on whether it's an American movie or a Japanese one. |
Made it back home, but I'm entirely too worn out to deal with one of my usual links right now. I'll just say this: last night after sunset, some lightning storms blew through the beach. I had a room with a balcony, so I slid out there and sat down to watch the show. Nothing like having an unobstructed view all the way to the edge of the planet while Zeus gets busy smiting all the fish. Was it smart for me to sit on an open balcony on an upper floor of a hotel during a lightning storm? No. Was I frightened? No. Would I do it again? Hell, yes. Some things are worth the risk, and that's one of them. So that's my short blog entry for today. It'll probably be back to normal tomorrow. |