Items to fit into your overhead compartment |
| I've noted many times that the answer to a headline question is almost always "No." I'm willing to believe this El Pais article is an exception. Can cheese protect brain health? This is what the science says A controversial study suggests that consuming these dairy products may have a protective effect, but experts aren’t so sure Yes. It's another article about cheese. Eating more high-fat cheese and cream may be associated with a lower risk of developing dementia, according to a study published on December 17 in the academic journal Neurology. Of course, I should issue my usual disclaimers about one study, peer review, replication, and so on. And to be very careful about who funded the study; even if the scientists involved were trying to be objective, bias can creep in. Like when Willy Wonka funded those studies that insisted that chocolate is good for you. Still. Just as with chocolate, I don't much care whether it's good for you, only whether it's good. And it is. The analysis — based on data collected from nearly 30,000 people — challenges the previous scientific belief that a low-fat diet could have a protective effect against dementia. Well, I guess I at least can accept the sample size, for once. Although its conclusions are quite dramatic, it’s an observational study that doesn’t prove causation. Science doesn't "prove;" it supports or falsifies. But yes, objectively, we also have to be concerned about correlation vs. causation. Except in this case, when, let's be honest, I'm going to eat cheese anyway. Researchers analyzed data from 27,670 people in Sweden, with an average age of 58 at the start of the study. Well, there goes my lack of concern about the sample size. Sweden isn't exactly famous for ethnic diversity. At that time, participants recorded their food intake for one week and answered questions about how frequently they had consumed certain foods in recent years. Not a great methodology, in my view. Self-reporting is notoriously hit-or-miss. After adjusting for age, gender, education and overall diet quality, the researchers found that people who reported consuming more high-fat cheese had a 13% lower risk of developing dementia than those who consumed less. I mean, I'll take what I can get, but I think 13% doesn't mean much on an individual level, only in aggregate. Naveed Sattar, a professor of cardiometabolic medicine and an honorary consultant physician at the University of Glasgow, is highly critical of the study. And that's okay. This is how science works. While all experts point to the importance of lifestyle and healthy choices for maintaining optimal brain health, most of what determines whether a person develops dementia is beyond their control. Which is why I don't worry too much about it. I'm of the considered opinion that, for myself at least, the stress of always having to do the Right Thing, and deprive myself of simple pleasures such as consuming delicious cheese, has a more negative effect than just doing what feels good. Which, I know, is the basic definition of hedonism. I'm okay with that. |
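A quick postscript on that "13% lower risk," because relative-risk numbers trip people up (including mine): the study reports a relative figure, and the article doesn't give absolute baseline rates, so here's a back-of-the-envelope sketch in Python with a completely invented baseline, just to show why I said it only means much in aggregate.

```python
# Relative vs. absolute risk, with an INVENTED baseline (not from the study).
# Suppose the "less cheese" group had a 10% risk of developing dementia.
baseline_risk = 0.10

# A "13% lower risk" scales the baseline; it doesn't subtract 13 points.
relative_reduction = 0.13
cheese_risk = baseline_risk * (1 - relative_reduction)

print(f"baseline risk:        {baseline_risk:.1%}")                # 10.0%
print(f"high-fat cheese risk: {cheese_risk:.1%}")                  # 8.7%
print(f"absolute difference:  {baseline_risk - cheese_risk:.1%}")  # 1.3%
```

In other words, even taking the correlation at face value, the absolute difference for any one person is on the order of a percentage point. Pass the brie.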
| I'd saved this Quanta article just because I thought it was interesting, especially as someone who is learning a new language later in life. Is language core to thought, or a separate process? For 15 years, the neuroscientist Ev Fedorenko has gathered evidence of a language network in the human brain — and has found some similarities to LLMs. See, I'd never wondered whether language was core to thought or not; for me, it absolutely is. I think in words. Sometimes also pictures, but also words (numbers are words, too, like seventeen or one-eighty-five). Even in a world where large language models (LLMs) and AI chatbots are commonplace, it can be hard to fully accept that fluent writing can come from an unthinking machine. I thought AI chatbots were LLMs, but whatever. That’s because, to many of us, finding the right words is a crucial part of thought — not the outcome of some separate process. I expect this is especially true for writers. But what if our neurobiological reality includes a system that behaves something like an LLM? It's funny. As technology advanced, we kept coming up with new terms to compare to how the brain works. Near the beginning of the industrial revolution, it was "gears turning" (that one persisted). Later, some compared neuronal signaling to telegraph lines. A while back, people started saying our brains are "hardwired" to do this or that. Now it's "the brain works like an LLM." The joke is that a) no, the brain doesn't work like any of those things; it's just a useful metaphor and b) if anything, LLMs are like the brain, not the other way around. (In math, A=B is the same as B=A, but not necessarily in language.) Long before the rise of ChatGPT, the cognitive neuroscientist Ev Fedorenko began studying how language works in the adult human brain. The brain is, however, notoriously hard to study, because it's complicated, but also because we're using a brain to study it with. Her research suggests that, in some ways, we do carry around a biological version of an LLM — that is, a mindless language processor — inside our own brains. I'd want to be more careful using the word "mindless." I'm pretty sure I know what the author means, but one of the great mysteries left to solve is what, exactly, is a mind. “You can think of the language network as a set of pointers,” Fedorenko said. “It’s like a map, and it tells you where in the brain you can find different kinds of meaning. It’s basically a glorified parser that helps us put the pieces together — and then all the thinking and interesting stuff happens outside of [its] boundaries.” I'm no expert at coding, but I know some computer languages have variables called "pointers" whose data is solely where to find other data. Don't ask me; I never did get the hang of them. But again, we have a technological metaphor for the brain. These are like the Bohr model of the atom: useful for some things, but not reflective of reality. So when I read the above quote, that's where my brain went. Unlike a large language model, the human language network doesn’t string words into plausible-sounding patterns with nobody home; instead, it acts as a translator between external perceptions (such as speech, writing and sign language) and representations of meaning encoded in other parts of the brain (including episodic memory and social cognition, which LLMs don’t possess). Yet. 
A lot of the article is an interview with Fedorenko, so I don't really have much more to say about it; it's just a bit of insight into how thinkers think about thinking, from a physical point of view. |
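One thing I will add, since I brought up pointers: here's roughly what I meant, sketched in Python (which technically has references rather than raw pointers, but the "map that tells you where to find the meaning" idea comes through; the words and "meanings" below are obviously my own toy data).

```python
# A loose analogy for "the language network as a set of pointers":
# the map itself stores no meanings, only where to find them.

# The "meanings" live elsewhere (think episodic memory, social cognition, etc.).
concepts = {
    "cat":    {"kind": "animal", "legs": 4},
    "cheese": {"kind": "food", "delicious": True},
}

# The "language network" is just a lookup table from words to those meanings.
language_network = {
    "cat": concepts["cat"],
    "cheese": concepts["cheese"],
}

# The word entry doesn't contain the meaning; it points at the same object.
print(language_network["cheese"] is concepts["cheese"])  # True
```

Which is, of course, just one more technological metaphor for the brain, right next to the gears and the telegraph lines.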
| Pardon the mess while I experiment with new formats thanks to Our Glorious Leader's fun new editing interface. Meanwhile, I think this is the first time I've highlighted a link from Snopes: How to spot suspicious health supplements and avoid getting scammed Snopes readers regularly ask whether supplement brands like Neurocept and Burn Peak are legit. Here's how you can tell yourself. Well, it's simple, see: if it's a supplement, it's a scam. Oh, sure, not always. But like getting phone calls from an unknown number, it's best to assume the worst rather than take chances. Many supplement brands readers ask about use unethical business practices to sell products that simply do not work. Or, worse, will actively make you sick. While deepfakes may be difficult for many internet users to spot, many of the health supplement products that seek to trick people into parting with an excessive amount of money have common red flags in their online presence that take no research or special knowledge to be able to spot. For starters, they advertise on the internet. In this article, Snopes will guide you step-by-step through how to easily spot a potential health supplement scam. Of course, these scams predated the internet by decades, the most famous one being the promotion and sale of fake snake oil. Turns out actual snake oil may have some beneficial properties, but that wasn't the problem (except from the point of view of snakes). The problem was they weren't selling actual snake oil, but whatever ingredients they could obtain cheaply. I also did an entry last month on a fun supplement containing radium: "It Got Glowing Reviews" You should talk to your doctor if you think there is a supplement that might be beneficial to you. These things always say "you should talk to your doctor." Bitch, I'm in the US. You know how hard it is to even get a chance to wait in the lobby? Still, yes, you should talk to your doctor. Just remember that they're people, too, and they have enough knowledge to absorb without trying to keep track of every mostly-unregulated placebo (or worse) hawked by unscrupulous vendors. It's worth noting that we were unable to reach out to the companies mentioned in this story to inquire about their business practices and the efficacy of their supplements because they all either did not list contact information or had nonfunctional contact links on their websites, a common practice for the sellers of unproven supplements. Well, I'd consider that the third red flag, right after "it's a supplement" and "it's hawked in a popup ad": if they won't let you contact them, then they're almost certainly a scam. The rest of the article is mostly about what Snopes considers to be the red flags. Personally, I prefer to keep things simple and avoid these products entirely. I think I can trust aspirin made by well-known manufacturers, but after that, my inner skeptic raises his ugly head. And the link's there if you want it. I've spent all my energy this morning playing with the new text editor, which is very cool but there's a bit of a learning curve for those of us who have spent 20+ years learning the ins and outs of WritingML. Now I have low energy. Maybe I should go to a gas station and buy one of those untested, unregulated five-hour energy shots. |
| Here's an interesting article from the BBC which, despite having a few reservations, I thought I'd share. The very first humans millions of years ago may have been inventors, according to a discovery in northwest Kenya. Like I said, reservations. Let's start with the definition of "human." The Latin binomial Homo sapiens translates to "wise person" or "knowledgeable person," but that's just a matter of labeling, and besides, there's a big difference between "wise" and "knowledgeable." To the extent of my understanding, evolution is generally a gradual process. Each generation broadly resembles the one before, but distant generations do not necessarily resemble each other. It's like... if you look in the mirror today, you will see the same face as you did yesterday, when you saw the same face as the day before. (Barring injury or hangover or the like, of course). Take any two consecutive days from your life, and neither you nor facial recognition software would be able to distinguish them. And yet, you look markedly different from 10-year-old you, who looked markedly different from 20-year-old you, etc. Point is, I think it's really, really hard to point to one particular generation, especially millions of years in the past where the fossil record is sparse, and say, "These were the first humans." But if pressed, I'd say there's one philosophical hump our ancestors had to get over. It wasn't tool use; lots of animals, as we now know, use tools. Some other animals even crudely make tools. But what probably distinguishes "sapiens" from other animals is: using tools to make other tools. That's a huge leap, in my opinion. My point is: "the very first humans were inventors" is, philosophically at least, a tautology. Second quibble: humans and other animals display a wide range of what we call "intelligence" within a species. In other words, there are geniuses, average specimens, and dummies. But what the vast majority of us have in common is that we're very, very good at mimicry. There's a reason "to ape" is a verb: the actual distinguishing mental characteristic of an ape is its ability to copy what others do (though other animals have this ability as well; parrots, e.g.). So, no, I don't believe that "the very first humans were inventors." Just as we had luminaries like Nikola Tesla and whatever genius figured out how to make beer, all it takes is one human to make a mental leap, invent something truly new and useful, and next thing you know, the other humans have followed that lead. Some certainly improve on the invention, like how once someone created an incandescent bulb (I don't believe for a moment that it was Edison himself, but it could have been), almost anyone with the right equipment could create it, and someone else standardized the sockets, and another added frosted glass, etc. And my point there is that, millions of years ago, the very first inventor invented inventing, and the rest of our ancestors just kept the momentum going: slowly at first, but eventually building on previous work until we could send robots to Mars and argue on the internet. And, finally: like I said, the fossil/archaeological record is sparse. I don't see how they can definitively claim that they found the first invention.
Plus, early humans seem to have been scattered in tribal or clan groups, just as we are today, but didn't have the internet—so it's entirely possible that inventing was invented in more than one place, separately, much as Newton and Leibniz both invented (or discovered, depending on your point of view) calculus. Researchers have found that the primitive humans who lived 2.75 million years ago at an archaeological site called Namorotukunan used stone tools continuously for 300,000 years. Quibbles aside, though, I'm not saying the article isn't worth reading. For instance: 2.75 million years? I can't remember hearing about any evidence of tool use that's that old. It also predates the generally accepted advent of what they call "anatomically modern" humans by, like, a shitload. I've droned on enough; the article is there if you're interested. I'll just quote one more passage, from the end: "We have probably vastly underestimated these early humans and human ancestors. We can actually trace the roots of our ability to adapt to change by using technology much earlier than we thought, all the way to 2.75 million years ago, and probably much earlier." "And probably much earlier." That just supports my points above. |
| I'll have to wash my hands and bleach my eyes after this one, because it's from (ugh) Martha Stewart. But since it's about cheese, I'll allow it... this time. How to Eat Blue Cheese the Right Way, According to a Cheesemonger Our guide to enjoying one of the cheese world’s most misunderstood stars. "How to eat blue cheese?" For fuck's sake, just slide that shit onto some bread and stick it in your mouth. Blue cheese has a reputation: bold, tangy, and sometimes intimidating—not to be confused with a bully, but easily misunderstood. What's to misunderstand? It's cheese. It has blue stuff. It's delicious. We spoke with an American Cheese Society Certified Cheese Professional and blue cheese lover to learn more about this sometimes maligned type of cheese and find out how to eat blue cheese. You... chew. And swallow. Come ON. Blue cheeses are defined by the blue or green veining, a specific type of mold, that streaks throughout the cheese. “This mold presence is very intentional—not just any cheese can grow mold and become a blue,” says Lauren Toth, ACS CCP, cheesemonger and director of curriculum and talent development at Murray's Cheese. I'm going to go ahead and assume that at least two of those Cs stand for "cheese." The mold is formed when the cheesemaker introduces a specific strain of mold cultures, typically Penicillium roqueforti, into the milk. I know, I know, some people freak out about microorganisms. If only they could internalize that they, themselves, are more microorganism than primate. If your previous blue cheese experience came from a sad wedge at a salad bar or the bottled dressing of childhood, there’s good news: gentle, approachable blues exist, and they’re genuinely delightful. Wait... are we still talking about cheese, or have we switched to sex? Her top pick for a blue cheese newbie is Cambozola Black Label, a buttery, triple-crème hybrid of camembert and gorgonzola. Ohhhhh... it's an AD. Blues vary dramatically in texture, determined by milk type, curd handling, aging time, and piercing—encouraging more veining to develop. Toth says, “blues come in all different styles and textures—creamy, grainy, fudgy, crumbly, even fairly firm,” and those differences hint at how best to use them. Still don't know whether this is supposed to make me hungry or "thirsty." Blue cheeses tend to be bold, making their best partners sweet, rich, or fruity. To be serious for a moment, my favorite New Year's Eve pairing with the traditional sparkling wine is a blue cheese and sliced pears. Also, walnuts. Perfect flavor combination. When choosing what to drink with blue cheese, avoid overly tannic red wines when tasting new blues. Opt instead for sweet and sparkling wines to tame sharpness and highlight creaminess. While it's traditional to drink wine with cheese, I find beer makes an excellent lubricant as well. Not just any beer, though; as much as I love Belgians, they're probably too strong for the cheese. I'd go with a more hoppy variety, just not an IPA. I don't like IPAs in general, though, so you'd have to experiment, were you so inclined. For many people—Toth's own mother included—blue cheese is synonymous with bottled dressing, and that unfortunate association keeps people from discovering truly exceptional cheeses. I'm also a fan of blue cheese dressing, just not the mass-produced kind. So ends another cheesy entry. I have a few more in the pile, but I have no idea when another will grace us with its presence. |
| As today is the only day around this time of year that has any meaning for me at all, I thought it would be an appropriate time to take a break from our usual programming for a personal update. A couple of weeks ago, without noting it (or expecting anyone else to), I posted an entry here that marked six straight years of daily blog posts. The streak started in the previous blog, but when that one filled up, I just continued here. I feel accomplished and all, but at the same time, it means I haven't done anything in the past six years significant enough to make me take a break from blogging. Even the trip to Europe, which had been put off since certain events restricted travel, featured a daily blog post, if only a brief one. But, at the same time, it also means I didn't get sick enough to miss a day, and hell, that's a good thing, especially considering the "certain events" I just mentioned made a lot of people sick or dead. Still, this time of year always fucks me up, no matter how good I've objectively got it, and not to brag, but I've objectively got it damn good. That could, of course, change at a moment's notice, and almost certainly will now that I've mentioned it out loud. I'm taking the chance because it's relevant to the rest of my rant here. This December has been worse than others, though. I've withdrawn even more than usual, avoiding as much human contact as possible (and accepting as much feline contact as possible). Again, I emphasize that this is not due to anything bad happening to me in life, or any of the myriad stupid things I've done; it's just the way the season works for me. The season, layered on with the existential dread of facing one of those horrid "multiples of 10" birthdays early next calendar year. Hell, it's gotten so bad that I haven't had a single delicious fermented and/or distilled beverage since the 5th (an appropriate day, because 5 December is the anniversary of the 21st Amendment, which repealed prohibition). I'm used to going days between drinks, public perception (which I promote) to the contrary, but over two weeks? I don't think I've abstained for that long since grade school. It's not that I've forced abstention on myself; I just plain haven't felt like drinking, and I don't usually make myself drink when I don't feel like it. Next calendar month will probably be an exception—the concept of Dry January is personally offensive to me, so I'll try to do what I did last year, barring illness or other extenuating circumstances: have at least one alcoholic drink every day. Point is, I usually drink when I'm feeling really good or really bad, and this month has mostly just been a mental shade of beige. Don't congratulate me on that, by the way. It's not some sort of big accomplishment. I'm not recovering from anything, and, planned observation of Ginuary or not, I'm sure I'll feel like having a beer soon. Perhaps even today, to mark the solstice in my usual fashion. And the solstice does represent change to me. Which is odd, because the whole point is that the path of the sun in the sky remains very close to the same for the days preceding and following it. Still (pun intended because "stice" means "still"), there's an objective, measurable moment (10:03am this year, based on my time zone, which should be shortly after this gets posted) when the Sun seems to hang perfectly above the Tropic of Capricorn, pausing there as if to rest before resuming its half-year journey northward. This is, as I've harped on for many years, the Reason for the Season. 
Anyway. Astronomical shifts or not, tomorrow, I'll probably get back to the usual format in here. I just wanted to take the time to inflict some of my not-feelings on everyone. Thing is, I don't want anything to change, right now. Any change will almost inevitably be for the worse. But if it does, I hope I can continue to face it with a laugh in my heart, the wind in my hair, and a beer in my hand. |
| A bit more serious rant today, thanks to this article from The Conversation: Well, obviously, They're pushing conspiracy theories on us to detract from the real problems going on. Study it out, sheeple! (I said "a bit more" serious, not "completely" serious.) Everyone has looked up at the clouds and seen faces, animals, objects. One time, I saw a giant Middle Finger. I felt that was appropriate. But some people – perhaps a surprising number – look to the sky and see government plots and wicked deeds written there. Not to mention aliens. Conspiracy theorists say that contrails – long streaks of condensation left by aircraft – are actually chemtrails, clouds of chemical or biological agents dumped on the unsuspecting public for nefarious purposes. Different motives are ascribed, from weather control to mass poisoning. Here in reality, meanwhile, weather patterns are shifting due to climate change, and mass poisoning is absolutely occurring due to pollution. There are people who refuse to accept that those things are happening, so there must be a grand, evil design behind it all. I’m a communications researcher who studies conspiracy theories. The thoroughly debunked chemtrails theory provides a textbook example of how conspiracy theories work. Translation (for conspiracy theorists): "I'm part of the cover-up." More seriously, while this article focuses on the chemtrail nonsense, it provides insight into conspiracy "theories" (I really hate calling them that) in general. But even without a deep dive into the science, the chemtrail theory has glaring logical problems. Two of them are falsifiability and parsimony. This can, of course, be said of most conspiracy accusations. The Apollo one, for example. What does it really take for someone to believe that thousands, maybe millions, of people from all over the world, including our biggest rivals at the time, faked the moon landing and managed to keep a lid on it? That the USSR wouldn't have been the first to claim it was a hoax? That the US government, which these same people insist is utterly incompetent at anything, could manage to orchestrate such a grand conspiracy without a single actual piece of evidence? According to psychologist Rob Brotherton, conspiracy theories have a classic “heads I win, tails you lose” structure. Conspiracy theorists say that chemtrails are part of a nefarious government plot, but its existence has been covered up by the same villains. Any new data, to them, is either part of the cover-up, or supports their belief. Therefore, no amount of information could even hypothetically disprove it for true believers. This denial makes the theory nonfalsifiable, meaning it’s impossible to disprove. By contrast, good theories are not false, but they must also be constructed in such a way that if they were false, evidence could show that. Bit of a quibble here: it's absolutely possible to have a good hypothesis that is later falsified. In science, this happens all the time, and it's just part of the process. Right now, there's evidence calling into question some of cosmologists' most cherished previous conclusions about the age and structure of the universe, and you know what? That's a good thing. Nonfalsifiable theories are inherently suspect because they exist in a closed loop of self-confirmation. In practice, theories are not usually declared “false” based on a single test but are taken more or less seriously based on the preponderance of good evidence and scientific consensus. 
Again, I'm not thrilled with the casual use of "theories" here, but I know the author means it in the colloquial, not the scientific sense. In that spirit, take one of my own pet theories: that sentient, technology-using life is vanishingly rare in the universe. I could change my mind about that in a heartbeat, if, for example, a flying saucer containing bug-eyed aliens landed in my front yard when I knew I hadn't been tripping. Like most conspiracy theories, the chemtrail story tends not to meet the criteria of parsimony, also known as Occam’s razor, which suggests that the more suppositions a theory requires to be true, the less likely it actually is. While not perfect, this concept can be an important way to think about probability when it comes to conspiracy theories. In fairness, Occam's Razor isn't a law carved in stone; it's a guide for choosing between hypotheses. Sometimes, things really are complicated. And sometimes, cover-ups happen. I was just reading about a possible link Of course, calling something a “conspiracy theory” does not automatically invalidate it. After all, real conspiracies do exist. But it’s important to remember scientist and science communicator Carl Sagan’s adage that “extraordinary claims require extraordinary evidence.” This is why I get all ragey when I see yet another unsubstantiated claim about Comet 3I/Atlas, which I vaguely remember mentioning in here recently. If the evidence against it is so powerful and the logic is so weak, why do people believe the chemtrail conspiracy theory? As I have argued in my new book... Oh hey, a stealth ad for a book! It's a conspiracy! ...conspiracy theorists create bonds with each other through shared practices of interpreting the world, seeing every detail and scrap of evidence as unshakable signs of a larger, hidden meaning. Really, that sounds a lot like religion. And, in a way, it is: in that worldview, we're all at the whim of higher powers. Conspiracies are dramatic and exciting, with clear lines of good and evil, whereas real life is boring and sometimes scary. The chemtrail theory is ultimately prideful. It’s a way for theorists to feel powerful and smart when they face things beyond their comprehension and control. That's one reason why I can't completely dismiss all conspiracy "theorists" as absolute nutters. They're victims, too—victims of a world grown beyond any one person's comprehension. Also, it makes me realize that we all have the potential to fall for misinformation. Articles like this (I'm not going to buy the book, sorry) help me remember how to combat that tendency in myself. |
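One more footnote on the parsimony bit, because it's easier to feel in numbers than in words: if a "theory" only holds when a pile of independent suppositions are all true, its probability can't beat the product of theirs. A toy sketch, with probabilities I made up entirely:

```python
# Occam's razor as arithmetic: a conjunction can't be more probable
# than the product of its (independent) parts. Probabilities are invented.
suppositions = {
    "thousands of pilots and ground crews are in on it":   0.10,
    "not one insider has ever produced physical evidence": 0.20,
    "decades of spraying have gone chemically undetected": 0.10,
}

joint = 1.0
for p in suppositions.values():
    joint *= p

print(f"Upper bound on the whole theory being true: {joint:.3f}")  # 0.002
```

Meanwhile, "planes leave condensation trails" requires exactly one supposition: physics.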
| Yeah, this one's been in my pile for a few months, and it's dated early July. I doubt many people are swimming there today. Paris reopens Seine River for public swimming Parisians have begun bathing in the Seine for the first time in over 100 years after a ban was lifted. The French capital has created three swimming zones along the river as part of its Olympic legacy. If I were a lesser person, I'd make a joke about "Parisians have begun bathing..." But I'm not going to stoop so low as to make that implication. Why, I wouldn't even approach it tangentially. France's capital Paris reopened the Seine River to swimmers on Saturday for the first time in over a century. Journalism at its finest, folks. I'm sure no one had any idea that Paris is the capital of France. Paris authorities have created three outdoor pool zones, complete with changing rooms and showers and supervised by lifeguards. If the Seine is so clean, why would they need showers? The swimming zones also have beach-style furniture, offering space for 150 to 300 people to sunbathe. No word on nudity? Bathing in the Seine was officially banned in 1923, primarily due to health risks from pollution. "We can either fix pollution, or ban swimming. Let's ban swimming." Around €1.4 billion ($1.6 billion) was spent on improving water quality, which officials promised would benefit not just the Olympic athletes but residents and tourists for years to come. Jokes aside for the moment, this is a civil engineering effort, so of course I appreciate it. I'll probably never know the details of all they did, but it's one of those things where the work will go largely unnoticed by the general public even as it serves them. |
| I always like knowing the origins of words, at least as far back as we can trace them. Here's one I knew from being raised around agriculture, but NPR explains it better than I could have: Broadcasters keep popping up in the news. Really? Because it seems to me actual "broadcasting" is dying. Commercial TV networks have made headlines: CBS announced the cancellation of The Late Show with Stephen Colbert this summer. ABC drew ire in September when it yanked Jimmy Kimmel off the air... Those sources are, of course, two of the three "classical" commercial TV networks, the other being NBC. Broadcasting — distributing radio and television content for public audiences — has been around for a century, but is facing a uniquely challenging landscape today. And I can kinda see the word being used for cable TV, but I'm not sure about streaming. Would you call Netflix a "broadcaster?" I wouldn't. That sort of thing is a "streamer," a word which has an almost opposite implication: broadcasting is wide; streaming is narrow. It's kind of like "podcast," itself a portmanteau of iPod and broadcast. Though they stopped making iPods, so that word is an anachronym. [Broadcast] originally described a method of planting seeds, particularly for small grains like wheat, oats and barley. Oh, and here I thought it referred to hiring the female lead in an old movie. (Broad? Cast? I'll be here all week, folks; try the veal.) Various dictionaries have traced the verb's first written use — to sow seed over a broad area — to 1733 and 1744. Modern farming techniques broadcast seeds mechanically, but the basic technique is the same: just scatter the little suckers. The use of the term "broadcasting" to describe radio first hit the mainstream in the early 1920s. Radio signals (formerly called "wireless telegraphy") and amateur broadcasts existed before that, Socolow says. What I think the article glides past is the "wireless telegraphy" part. A telegraph, like the later telephone, had a sender and a recipient. Broadcasting has a sender and a large number of potential recipients. It was a piece of legislation that officially cemented broadcasting's new definition: The Communications Act of 1934 defined it as "the dissemination of radio communications intended to be received by the public, directly or by the intermediary of relay stations." Technically, broadcast TV uses radio waves, just in a different segment of the EM spectrum. These days, people tend to use the word to describe any sort of dissemination of information — even if it comes from cable news networks, social media platforms and streaming services, which are not technically broadcasters under the government's definition, Socolow says. Well, okay, then. Obviously, words change over time. I just hadn't heard of Netflix or Prime Video ever referred to as "broadcasters." The article tries to explain why it matters, but to me, it's just another quirk of language. And a source of really bad, sexist puns. After all, dames don't like to be called broads. |
| I have decided to turn this blog into a cheese-themed one. Okay, no, just kidding. But cheese is my favorite condiment (bread is the only food; everything else is a condiment), so you get cheesy links from time to time. This one's from Salon: No one can resist a good cheese ball Let alone these two: a sweet version — swirled with fruit jam — and a savory, covered in bacon and Parmesan crisps As with any recipe, they can't just give you the recipe. Oh, no, can't have that. Gotta write the Ph.D. thesis first, then get to the recipe. I know it's for search engine fuckery. I don't have to like it. In this case, it gives me something to comment on. For years, the cheese ball has been my quiet party superpower. Me too! I'm always a cheese ball at parties. A well-made cheese ball has gravitational pull. If only I could attract people by being a cheese ball. No, wait, I don't really want that. Then there'd be people around. Visits to my grandmother’s house always began in the same place: the refrigerator. How's that dissertation coming along? [Her cheese ball] was a marvel of its genre: cream cheese, sharp cheddar, a splash of Worcestershire, a spoonful of sugar, crushed pineapple, pecans. Snark aside, that does sound damn delicious. That cheese ball has stayed with me all these years... Around me, it wouldn't last three hours. And cheese has, of course, become the modern shortcut. The board. The wedge. The baked brie doing its annual molten collapse. Confession time: I'm not a big fan of baked brie. Don't get me wrong: I'll eat it before I eat anything that's not cheese or bread. It's just too messy. Like a sloppy joe when I could have a hamburger. I'm just not into messy food, is all. “Cheese balls are celebratory and fun, but sometimes the flavors feel a little outdated,” Erika Kubick, author of “Cheese Magic”, told me in a recent email. Some flavors never go out of style. Like the myriad flavors of cheese. (Her most popular recipe from her first cookbook, “Cheese, Sex, Death,” was the Everything Bagel Goat Cheese Ball, if you were wondering.) And if you were wondering, yes, that parenthetical sentence is the real reason I'm sharing this. There's more dissertation there before it gets into the recipes. Yes, recipes, plural. Now, I haven't made these. I'm probably never going to make these. There's no point to doing a cheese ball for oneself; not when one can simply add different toppings to one's sad, lonely cracker. But maybe someone out there will like them. |
| I'm not saying I agree with this Nautilus article. Or disagree. I just find the temptation too great. See, the Forbidden Fruit in the Garden of Eden is commonly translated to English as "apple." But in the original Hebrew, from what I recall anyway, the word used is a more generic word for "fruit." And since fig leaves were canonically right there, covering up all the fun parts, now I'm wondering if it should have been "fig." When you bite into an apple, a pear, or a peach, you bite into the result of thousands of years of interactions between these fruits and primates. Otherwise known as selective breeding, or proto-genetic-engineering. When you let a fig squish in your mouth, you are savoring an even more ancient story. I was way older than I should have been when I realized "ficus" was "fig." That's apropos of nothing, really, it just came to mind. Before the fruit, in a beginning, all of the seeds that dangled from trees fell from those trees. These were gravity’s seeds. This is some poetic shit right here, harking back to both the supposed "beginning" with the forbidden fruit and all and tying that to another mythical apple that mythically fell on a certain philosopher's head. No idea if it's backed up by evolutionary research, but I'll acknowledge the poetic license. Then, some plants evolved fruits. Fruits were a radical evolutionary innovation; they surrounded seeds and attracted animals in order that those animals might consume them and ingest their seeds. They called out, “Eat me.” And so we get to the real reason I saved this article to share: "Eat me!" They evolved in the tropics, around 60 million years ago, in the shadow of the extinction of the dinosaurs. Those first primates have been hypothesized to have consumed the fruits of trees as well as flowers, and then, also, insects attracted to fruits and flowers. "Hypothesized" doesn't mean much of anything without evidence. Over the succeeding tens of millions of years, some of the descendants of those first primates became more dependent on fruits. Meanwhile, many trees grew increasingly dependent on those fruit‑eating primates for dispersal of their seeds; this was a relationship of mutual benefit and dependency. I'm pretty sure, however, that it wasn't just primates who were attracted to fruit and therefore helped with the fruit tree's reproductive strategy. I remember reading a bit about how avocados co-evolved with some sort of megafauna, which later became extinct and almost took the avocado with it. Metaphorically, a forest can walk across a landscape inside the gut of a primate, traveling one defecation at a time. You know, I appreciate biology and poetry, but some metaphors, I could do without. There's a bunch more at the link, but I got a bit exhausted with all the metaphors and speculation. As I mentioned, I'm not saying it's wrong or right. And it is, after all, an ad for a book. I just thought the comparison with the Eden story, and the Newton myth (one wonders if that was actually a fig, too, leading to the popular Fig Newton), was too good to pass up. Also, "Eat me." |
| This PopSci piece is nearly a month old, which matters when you're talking about transient space phenomena. Still, I'm sure most people remember the subject. New NASA images confirm comet 3I/ATLAS is not aliens The fast-moving comet likely comes from a solar system that is older than our own. As I've been saying: It's not aliens. Provisionally. It's obviously "alien" in the sense that it comes from a whole 'nother part of space. Few would doubt that, and those few would be in the same category as young-earth creationists and flat-earthers: complete deniers of piles and piles of evidence. The only "controversy" - mostly manufactured - was whether it was the product of alien sentience. The problem with any "sentient alien" hypothesis, though, is the same as the problem with Bigfoot: we can't prove Bigfoot doesn't exist; we can only continue to not find evidence of her existence. During a press conference on November 19, NASA confirmed the icy rock poses no danger to Earth, and contrary to certain conspiracy theories, is not an alien spacecraft. The "alien" people weren't necessarily spouting conspiracy theories, though. Just wishful thinking and projection. Any true conspiracy theorist would take one look at NASA's denial, and consider it proof that they're hiding something. “It expanded people’s brains to think about how magical the universe could be,” said Dr. Tom Statler, lead scientist for solar system small bodies, during the livestream announcement. I remember when I was a kid, fascinated by astronomy, there was talk about comets or other visitors from other star systems. Much like with the detection of extrasolar planets, though, it was only recently that we actually confirmed their existence. The universe is strange enough, but people are strange enough to have to try to make it even stranger. I actually kind of love that about people. It's only when they take it too far and replace reality with one of their own that I get disgusted. There's more at the link, including actual images from actual telescopes, but fair warning: the images aren't exactly breathtaking, not like the famous pictures of nebulas and such from Webb or Hubble. Mostly I just wanted to reiterate that it's not aliens. It's still pretty cool, though. |
| Here's a rare occasion when I talk about sex, thanks to Nautilus. How Monogamous Are Humans Actually? How we rank among species on fidelity to a single partner may have shaped our evolution And already I have issues. The headline may seem neutral enough, but then you get to the subhead, and it uses "fidelity" as a synonym, which conveys an implicit bias due to the positive connotations of "fidelity." And then you get to the evolution part, and wonder about direction of causality: did sexual practices shape our evolution (other than in the obvious sense of enabling evolution to continue), or were our sexual practices shaped by evolution? Or some synergy between them? Well, substitute "I" for "you" in that paragraph. You know what I mean. And let's not undersell the worst implicit assumption there: the primacy of heterosexuality. Across cultures and millennia, humans have embraced a diversity of sexual and marital arrangements—for instance, around 85 percent of human societies in the anthropological record have allowed men to have more than one wife. "Allowed?" And yes, it's almost never the other way around. Still, remember what Oscar Wilde said: "Bigamy is having one wife too many. Monogamy is the same." Anyway. If that 85% figure is correct, and I have no facts to contradict it, then we should be considering polygamy—not monogamy, not polyandry, not any other mutually agreed-upon relationship—to be the default for humans. The problem with polygamy as a cultural norm, though, apart from Wilde's quip, is math. The proportions just don't work out, unless you send a lot of your young men off to die in war. Which, of course, a lot of cultures did. Or unless you also accept polyandry, which, given the patriarchal, hierarchical nature of most societies, ain't gonna happen. I'd be remiss if I didn't note that while the "-gamy" words technically refer to marriages rather than wives, "polygamy" in practice almost always means "multiple wives" (the precise term for that is "polygyny"), thus reinforcing the male-primacy point of view. But words can change meaning over time, and for the sake of convenience, just assume that whenever I use a word like that, I'm referring to sexual partners of any gender. But in the broader evolutionary picture, some researchers have argued that monogamy played a dominant role in Homo sapiens’ evolution, enabling greater social cooperation. "Some researchers." Right. It couldn't be the ones with an agenda to push, could it? This theory aligns with research on mammals, birds, and insects, which hints that cooperative breeding systems—where offspring receive care not just from parents, but from other group members—are more prevalent among monogamous species. I'm not sure we should be labeling it a "theory" just yet. To decipher how monogamous humans actually have been over our evolutionary history, and compare our reproductive habits to other species, University of Cambridge evolutionary anthropologist Mark Dyble collected genetic and ethnographic data from a total of 103 human societies around the world going back 7,000 years. He then compared this against genetic data from 34 non-human mammal species. With this information, Dyble traced the proportion of full versus half siblings throughout history and across all 35 species—after all, higher levels of monogamy are linked with more full siblings, while the opposite is true in more polygamous or promiscuous contexts. I have many questions about this methodology, not the least of which is this: humans don't have sex for procreation. We have it for recreation.
Procreation is a byproduct, not a purpose, most (not all) of the time. A study like that, concentrating on births, completely ignores the purely social aspect of multiple partners. As support for my assertion there, I present the bonobos, our closest primate relatives, who engage in recreational sex all the time. Most species don't seem to use sex for recreation, though I'm hardly an expert in that regard. This makes humans (and some other apes) different from the birds and the bees, and using other animals as models for the ideal human behavior is a good example of the naturalistic fallacy. Point is, I submit that using live births as the sole indicator of degree of polygamy is just plain wrong, and will lead to incorrect conclusions. “There is a premier league of monogamy, in which humans sit comfortably, while the vast majority of other mammals take a far more promiscuous approach to mating,” Dyble said in a statement, comparing the rankings to those of a professional soccer league in England. And with that, he betrays his bias. This doesn't mean that the study doesn't have merit, mind you. It might be useful for drawing other conclusions. I just don't think it means what he, and the article, claim it means. Meanwhile, our primate relatives mostly sit near the bottom of the list, including several species of macaque monkeys and the common chimpanzee. Heh heh she said macaque. Let's not forget another important thing: a species will almost always have the reproductive strategy that works for that species. It could be pair-bonding. It could be complete promiscuity. It could be something in between. Whatever works for that niche. Personally, I'd hypothesize that humans fall somewhere in the middle for the simple reason that we live for drama, and what's a better source of drama than who's doinking who? Hell, it's the basis for at least half of our mythology, and all of our soap operas. There's more at the article, obviously. I just want to close with this: Unlike other humans, I don't care who's doinking who. The only thing I care about is that everyone involved be consenting or, better yet, eager. You want to be monogamous? Find another monogamist. Poly? Find other polys. Single? Get some good hand lotion. I. Don't. Care. But if you agree to a particular lifestyle, whatever that may be, and then you go behind your partner's or partners' backs? That's what I consider wrong. Not, like, on the same level as murder or theft or wearing socks with sandals, but still, it's something to be ashamed of. |
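Back to the methodology for one second, because the full-versus-half-sibling proxy itself is easy to see with a toy simulation. This is my own sketch of the logic, not Dyble's actual method, and every parameter in it is made up:

```python
# Toy model of the full- vs. half-sibling proxy for monogamy.
# My sketch of the idea, not the study's method; all parameters invented.
import random
from itertools import combinations

def full_sibling_fraction(n_mothers=1000, kids_per_mother=3, n_fathers=500,
                          monogamy=1.0, seed=42):
    """Fraction of sibling pairs that share a father.

    `monogamy` is the chance a mother has all her kids with one partner.
    """
    rng = random.Random(seed)
    full = total = 0
    for _ in range(n_mothers):
        if rng.random() < monogamy:
            fathers = [rng.randrange(n_fathers)] * kids_per_mother  # one partner
        else:
            fathers = [rng.randrange(n_fathers) for _ in range(kids_per_mother)]
        for a, b in combinations(fathers, 2):
            total += 1
            full += (a == b)
    return full / total

print(full_sibling_fraction(monogamy=0.9))  # mostly full siblings (~0.9)
print(full_sibling_fraction(monogamy=0.1))  # mostly half siblings (~0.1)
```

The catch, as I said, is that counting births only captures the reproductive side of the story, which is exactly my gripe with using it as the sole yardstick for how "monogamous" a species is.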
| More support for my "science fiction is the most important genre of literature" hypothesis, from a not-so-happy Medium: We were worried about the wrong dystopia We were all so busy worrying about Big Brother we forgot to worry about Soma Because those are references to Nineteen Eighty-Four and Brave New World, respectively, and, despite being Very Serious Literature studied by Very Serious Literature Students, both are indisputably science fiction. Everyone’s worried about 1984, right? Well, no. 1984 was, as the article footnotes, a damn good year. They mention Ghostbusters from that year, but it's also the year Born in the USA came out, and we were still near the beginning of our long downward slide. Michael Jackson amused us all by lighting himself on fire, This is Spinal Tap turned everyone up to eleven, and the first American chick to do a spacewalk did a spacewalk. The greatness of the year was only marred by the re-election of Reagan, which was the direct proximate cause of most of our current problems. Horrible authoritarian government that monitors its citizens every second of the day and controls them with a tight fist. You know the drill. Yes, and I wish more people would actually read it instead of pretending to have read it. And sure, the authoritarian world of 1984 is something we want to avoid. I think we can all agree on that. For various definitions of "we," sure. I can think of several people right off the top of my head that would love it. Dictatorships are a wonderful form of government. For the dictator. But there’s another style of dystopia that we’re heading towards instead. One that we should be more worried about. The Fallout universe? No, that would at least provide some level of amusement for the survivors. It’s a track the internet started us down, and generative AI is accelerating us down. Oh no. No no no no. Don't blame the internet. For those who haven’t read “Brave New World” by Aldous Huxley, it paints the picture of society where every country is united in a giant World State. Okay, full disclosure, I last read both of those novels sometime before the year 1984. But I did read them. Society has strictly defined and enforced social classes, where the upper classes live luxurious lives, while the lower classes do all the hard work for less reward. As I'm sure you know, we're already in that part of the book. The internet didn't cause that. Reagan did. (Don't give me shit about that. That's the actual predictable outcome of Reaganomics.) What makes the World State interesting is how they keep the lower classes in check. In 1984, the people are controlled through fear. I mean, it's effective. In “Brave New World”, the government encourages everyone to take a drug called “Soma”, a drug that is “euphoric, narcotic, pleasantly hallucinant.” Taking Soma makes people happy and content with no ill side effects. If you're happy and you don't know any better, that can feel like true freedom. It's one reason I've ragged on the idea that happiness is a worthwhile goal, by itself, to pursue. Soma, the fictional drug, creates a sense of comfortable complacency. It doesn’t make people happy, but it distracts them from being bored or sad, and keeps them just satisfied enough to keep doing their jobs. If that book were written today, the drug would also have to boost productivity. It’s hard to see Soma now without comparing it to the modern internet.
The explicit goal of TikTok, YouTube, Instagram, Reddit, and basically every other part of the internet is to give you enough of a dopamine spike to keep you addicted to them. Ehhhhh... I'm skeptical. While I avoid all of those (except the occasional hit of YouTube), other articles have ranted about how those platforms, and others, exist to keep people out- and en-raged. But definitely also engaged. (There's no such thing as outgaged, is there?) Engagement doesn't require pleasure spikes. And rage is, like, the opposite of a dopamine effect. Video games, now, I could see the dopamine thing. But nothing's going to get me to say a bad thing about video games, unless they're the free kind that come with commercials, or the other kind where you have to keep paying to play or get cool in-game stuff. The article goes on to do a fairly run-of-the-mill rant against generative AI and its companies, and you can read it if you want, but I'm fairly certain that my intelligent, perspicacious, well-informed and remarkably good-looking readers have already heard something similar. Here’s the thing about dystopias. They’re only bleak for the people on the bottom. There’s always a group on top who wins out. I just said that, up there. But then people ask me “Okay, but what do I do about it? How do we prevent this dystopian future?” One of the most insidious aspects of mass media is that it almost always gives us the hope that there are things that we, personally, can do to prevent [whatever catastrophe]. Sometimes, they might be right. But not always. And it just leads to people stressing out more. What can we do about that? I don't know, and if I did, I wouldn't want to add to your stress by stating it. It's like "The missiles are coming. But there's still hope! Get to a bomb shelter in the next 10 minutes!" Or, "An asteroid is on a collision course with Earth. What can you do to save yourself? You can try to leave Earth now if you can." To summarize, their "prevention" is: don't use AI. And I can understand that. But the problem is, it's not going to happen, just like the solution to the problems of social media is "don't use social media" and people do it anyway. Or how the solution to terrible working conditions in the preparation for the World Cup was "don't watch the World Cup," but people did anyway, and advertisers got the message: doesn't matter how authoritarian the regime is, doesn't matter how many enslaved or conscripted workers they had, people will put eyes on our ads because it's a sportsball championship. At this point, not using AI is like "you can help fight global warming by not having a car." It works for some people (hell, it worked for me for over a year), but you're fighting an uphill battle suggesting it. Your individual contribution is like a bucket of piss in the ocean. Nothing more. And you can't do anything at all about the billions of other buckets of piss, or the several thousand Colorado-sized swimming pools full of piss which, in this disgusting metaphor, represent the big companies. Don't get me wrong; I'm not a fan of most AI (I have to admit I'm amused by the image-creating ones) or the tech companies that are shoving it down our throats. But I do think the article is dead wrong about one thing: it assumes that it's either 1984 OR BNW, when the reality is... we're heading for both at the same time. |
| Ah, yes. December. An appropriate time to think about Cheeses, our Savour. The Easiest Way to Grate Parmesan Cheese This TikTok hack just changed the game for getting freshly-grated Parmesan. After three rants in a row, it's nice to get back to a safe topic like delicious ch- Wait. "TikTok hack?" If you are adding grated Parm to a pasta sauce or really need it to melt into your dish freshly grated is the way to go. That's true, but, and hear me out here, the easiest way to grate Parmesan is to buy it pre-grated. I admit it's not as tasty as the fresh stuff, but all I'm doing is adding a bit of extra flavor to my sad, lonely, bachelor microwave pasta dish. This hack leaves all the work up to one staple kitchen appliance: the blender. Okay, so, let me get this straight. You think it's easier to grate the cheese in a blender than to use a handheld cheese grater? You're not thinking this through. You have to factor in cleaning. Cleaning a handheld grater is dead easy if you have a dishwasher: throw it in there. Cleaning a blender, on the other hand, is a chore. Start with a wedge of your favorite Parmesan cheese. Cut the cheese into several cubes. No special knife skills needed! Ah, yes. And then there's this additional step. And you do need special knife skills; Parmesan is a hard cheese, and unless you know what you're doing, you could add fingertip to your pasta topping. Why is Freshly Grated Cheese Better than Pre-Packaged Products? Because the Bulk Cheese people paid us and the Pre-Packaged Cheese people didn't. ...okay, no, I agree that freshly grated is best. But sometimes laziness is better than not-laziness. The article does go into some semi-technical reasons why it's best to Make Yourself Grate Again, and you can go there if you want to know. Anyway, that's all for today. No deep philosophical insights, no questionable scientific studies, no articles chock-full of confirmation bias. Just some nice, cheesy jokes. |
| What am I doing on a site called VegOut? Glad you asked. Just questioning the validity of this article. Psychology says if you had a 60s or 70s childhood, these 8 experiences shaped your brain differently than today’s kids The 60s and 70s gave you a kind of mental sturdiness — a quiet confidence, a deep practicality, a simple strength — that continues to set you apart. *taps cane* I'm telling you. Kids these days. ...shall we start with the obvious? If this really applied to everyone, or even to the majority, in my cohort, then that's the polar opposite of "set you apart." So even with the headline, my inner cynic is poking out of my pants, looking around, and going, "Yeah, right." Growing up in the 1960s or 70s wasn’t just a different era — it was a different psychological environment entirely. No smartphones. No constant supervision. No algorithm telling you what to think. You were shaped by freedoms, challenges, and cultural norms that simply don’t exist for today’s kids. And spankings. Lots and lots of spankings. (I'm not saying that was right, mind you. It's just the way most parents did it back then, because they, too, were shaped by their pasts.) And psychologists agree: the way your brain developed during those formative years left you with cognitive patterns, emotional habits, and mental strengths that set you apart — for better and for worse. I can't disagree that experience during one's formative years is a big factor in one's personality. Without getting into the age-old debate of nature vs. nurture, both have some role to play. What makes my skepticism rise is the idea that we all, or most of us anyway, had experiences similar enough to make sweeping generalizations. My own upbringing was unique. Others were even more rural. Many were suburban, or urban. I feel like those environments mold people differently. And I'm not even going to get into how the article is US-centric; let's just acknowledge that it is, and run with it. About the only thing I can think of was what we all, indeed, had in common: TV was three commercial channels, plus PBS, broadcast over airwaves and received by antennas. Radio was our only source of music, and, for me at least, they had both kinds: country AND western. Worst of all, no video games, at least not until the late 70s, which also featured disco, so kind of a mixed blessing there. I'm not reminiscing about this out of some nostalgia, mind you. There was a lot to dislike about the situation. Let's not forget how the military action in Vietnam affected all of us at the time. Point is, I feel like a lot of these things do, in fact, apply to me. But that doesn't make my skepticism go limp. As someone who writes about psychology, mindfulness... Normally, I'd Stop Reading Right There. ...and the changing dynamics between generations, I’ve noticed something fascinating: people who grew up in the 60s or 70s often share certain traits that younger generations don’t naturally develop. Okay, but a statement that general, couldn't it apply to every "generation," however you want to define that? The article gets into the specifics of the "certain traits," but swap those out, and you're back to making sweeping generalizations about any generation. 1. You developed resilience through boredom, not stimulation Kids today are never bored — they have a screen in their pocket that delivers instant entertainment on demand. But in the 60s and 70s, boredom was your constant companion. I remember being bored on occasion.
I always found something to do, though (living on a farm'll do that). If nothing else, I loved reading books. And there weren't a lot of other kids around; the ones that were, were delinquents (one of them is in prison for murder one right now). I'm not saying I was an angel, but I was an only child in a rural area and I had to get creative. But I'm certain boredom wasn't a "constant companion." This is why so many older adults are naturally more grounded and less dependent on external stimulation. Your childhood literally trained your brain to be comfortable sitting with your own thoughts. It couldn't be just because older adults of any generation generally become more grounded and less thrill-seeking. Could it? 2. You learned independence because no one was tracking your every move On this point, I provisionally agree. I'm not sure how that's better, though. There's something to be said for training your kids to grow up in a surveillance environment. Makes 'em more paranoid and less likely to do some of the shit I got away with. This is what Elf on the Shelf is for, by the way: letting kids internalize that someone, somewhere, is always watching. 3. You became socially adaptable long before the internet existed Growing up in a pre-digital world meant you had to socialize the old-fashioned way: Face to face. At school. On the street. On the phone attached to the kitchen wall with a cord that could barely reach the hallway. I'm not convinced this is better, despite having lived it. Different to today? Sure. Better? To use the catchphrase of my generation: Meh. Whatever. 4. You experienced consequences directly, not digitally When today’s kids make a mistake, they might get a notification, a screen warning, or a parent stepping in immediately. Much as I, like anyone else, am tempted to think how we grew up was "the way things ought to be" and "better," again... not convinced. For instance, there is something to be said for being able to chat in real-time with people all over the globe. The potential is there to acquire a wider perspective, not just be limited to your own neighborhood, region, or country. 5. You grew up with scarcity, so your brain understands value Whether you grew up comfortable or not, the 60s and 70s were an era before excess, convenience, and instant gratification. This actually made me laugh out loud. Not because I felt like we were rich or anything, but because my parents were older, and lived through the actual Great Depression. We didn't have scarcity. We had convenience. Sure, we weren't ordering crap from the then-nonexistent internet, but, and I can't stress this enough, we lived like royalty compared to the shit my parents went through when they were young. Not that there weren't people my age living in poverty, but you can say that about today's situation, too. 6. You developed patience and attention span from slower living And promptly threw them out the window as soon as I possibly could. There's more at the link, but my attention span is already overtaxed. The one thing I'll say that stands out to me: when I was a kid, we only had to find the fit-in / stand-out balance with maybe 100-1000 other kids, at school or whatever. Call it 500 for argument's sake. You can find a way to stand out against a crowd of 500. Maybe you're the best musician. Maybe you're the best artist. Maybe you're the best at math, or science, or bike-riding, or delinquency. Or you excel at football, or have the nicest tits. (For me, it was being a comedian.
Class clown five years running, baby!) Now, kids aren't competing against their school group, but against millions of other kids from all over the world. How do you stand out in a crowd like that? Some do, of course. The rest? Meh. Whatever. |
| I don't expect this BBC article to be of interest to everyone. But, as someone who was adopted, I have Opinions. I will note, for those not wanting to click on the link (but come on, it's the BBC, not some malware site), that the article opens with a big pic of Tom Hiddleston as Loki from the MCU. Hollywood blockbusters and horror films frequently using adopted children as psychopaths and villains causes harm in real life, adoptees have said. Okay. I'm not going to argue that such a thing can't cause, or hasn't caused, harm. I don't know. What I do know is that the same can be said for any minority: villain-casting has the potential to reinforce stereotypes if the writers are careless. Though the same can be said for hero-casting them. James Evans, 23, was two-and-a-half months old when he was removed from his birth family due to their inability to parent and harmful behaviour. Now with a master's degree in scriptwriting, James said films such as Thor, Annabelle and The Conjuring: The Devil Made Me Do It, among many others, made him "frustratingly uncomfortable" at how adoptees are depicted. I didn't see those last two, but they sound like horror movies. No one in a horror movie, generally, is portrayed in a great light. He was fostered by two families before Ruth and Andrew Evans adopted him when he was two and said no film or TV series had ever made him feel "properly seen". Well... if he feels that way, he feels that way. I suppose there might be unresolved trauma, given his unfortunate first two years. My situation was different (adopted as an infant and kept in a stable, if not perfect, home). (What family is perfect?) One of the most high-profile adoptees in cinema is the Norse god of mischief Loki in the Marvel films. This bit, though, I can say something about. First of all, in the first Thor movie, Thor himself starts out as a massive, throbbing cock, and he's the one who's the biological (or however it works in Asgard) offspring. The whole movie is his redemption arc. Loki is... complicated; I'd say he's more a foil than a villain there, and only really shows up as a major villain in The Avengers. But what the article doesn't talk about, and may not know, is that Loki gets his own redemption arc, in the form of the series that bears his name. I'm not going to spoil it, but damn, I can't think of a more powerful redemption arc in all of literature. Yes, I said literature. Shut up. These stories reinforce damaging stereotypes of adopted people as imposters or "devil children" where trauma is used as a "lazy" plot device for evil, he said. Okay, like I said, stereotyping is bad. But writers use trauma as a background for villainy with bio children, as well. The other end of the spectrum is the "grateful adoptee", when a child's adoption is seen as a fairy tale ending, such as Miss Honey taking in Matilda in the Roald Dahl book and subsequent films. Another one I'm not familiar with, but having read other Dahl, I find it difficult to believe it's truly a fairy tale ending. This ignores "the loss and grief" of children being taken away from their birth parents, James said. Sheesh, talk about stereotypes. I've no doubt there are Issues involved there. But, again, Issues can stem from a lot of childhood trauma, not just having been adopted. While James has been "loved and cared for" and has "the best support system" in parents Ruth and Andrew, he said that just because his trauma was invisible does not mean he did not need help. To be clear, I am not trying to minimize or mock his lived experience.
Again, though, lots of people need help. James said the portrayal of adoptees through the fairy-tale lens was as damaging as being presented as villains, because it tells society they are ungrateful if they behave outside this stereotype. You think that's bad? Try being a fairy-tale stepmother. "If an adopted child's parents are parenting them, they are their real parents. "They are the ones who are there every day fighting for their child and that is real parenting. Biology isn't fundamentally what defines parenting, it's what you do." On that, I am completely in agreement. My real parents are the ones who changed my diapers, kissed skinned knees, and put up with my teenage bullshit. Not the ones who happened to share five minutes of fun. Despite all this, James and Susie highlighted some good portrayals. And yet, the ones they highlighted don't paint the whole picture. See, there's another adoptee in literature. Probably the most famous one. The one that is the most canonically not a villain, but the absolute polar opposite thereof, despite his trauma. I'm talking about the guy in the blue suit and red cape. Now, I'm not saying they can't do better. That we can't do better, as writers. But I object to the idea that adopted people should never be villains. Just like with anyone else, they have agency. We're not angels; we're not monsters. We're just people. It would be like saying "We can't show this Black dude as a gangster, because that would make people think that all Black dudes are criminals." As for me, if I want to see a positive role model for my own experience, all I have to do is look. Up in the sky. |
| I generally prefer to share the less mainstream (at least in the US) material in here, but this one from CNN angered me enough to make me save it. Just yesterday, I was text-ranting to a friend about this very thing, not knowing that this one would come up at random the very next morning. Here is, in part, what I said (with maybe some clarifications and autocorrect decorrections): Look, lending has so far managed to mostly cover up just how POOR most people are. And it's worse than that. By making something affordable [through loans], sellers adjust prices upwards faster than otherwise. College tuition used to be pricey but not outrageous. Then student loans. Then it shot into orbit. Cars, also faster than inflation. I'm not judging people for going into debt. Mostly. I'm judging the fucking system. They want us to stop owning. They want us to rent everything. They want subscriptions, not sales. We're meat for the table. This isn't capitalism. This is feudalism. Neo-, maybe. Not bad for an off-the-cuff text rant, I guess, but not as much thought as I usually put into these blog entries. Hence, the article. Stubborn inflation continues to make the cost of living unbearable for many Americans. Is it inflation? Or is it no one wanting to pay them? A number of inventive solutions have emerged — but with a common theme: putting consumers deeper into debt. I don't think I have to tell you what I think of that, after the above. This week’s 50-year mortgage proposal from the Trump administration is the latest example of the trend. I swear to all the gods, I don't know how people survive. I don't have the best memory, I know, but I do remember seeing 60-year mortgages being offered a while back. Doesn't anyone else remember that? And by "a while back," I mean "just before the entire house of cards collapsed in 2008." They were bad ideas then, and they're bad ideas now (there's not a lot of difference between 50 and 60). I swear I'm not getting political, here. They're bad ideas no matter who's running the show, or pretending to. The potential for a 50-year mortgage comes as the auto industry has been pushing seven-year car loans, which have become an increasingly popular option with the average price of a new car hitting a new record of more than $50,000. Don't you get it? The prices will simply adjust upwards again to compensate for the increased demand. And the explosion of buy now, pay later options online and at brick-and-mortar retailers has normalized taking on longer-term debt for purchases as small as food delivery. Come to think of it, I haven't heard much noise about predatory payday lenders recently. Did someone finally put a band-aid on that chainsaw wound? Case in point: While a 50-year mortgage could lower monthly payments, the amount of interest a borrower would pay over 50 years could be double what would be paid at current rates over 30 years, the traditional length of most mortgages. Oh, but it's worse than that. Way worse. I can't be arsed to pull up an amortization calculator right now, but in very general terms, earlier mortgage payments are mostly interest, not much principal. It takes, if I recall correctly, about 20 years or so for a 30-year mortgage without pay-aheads to get to the point where one month's payment is half-interest, half-principal. That amortization would be even less favorable with a longer-term mortgage. You could be paying for years, and not build significant equity in your house. You're basically renting the damn thing from a bank. 
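If you do feel like checking the math yourself, here's a minimal Python sketch with made-up numbers (a $400,000 loan at a 7% rate, purely for illustration, not anyone's actual quote) comparing a 30-year mortgage to a 50-year one:

```python
# Back-of-the-envelope amortization comparison.
# Assumptions (hypothetical): $400,000 loan, 7% annual rate, fixed monthly payments.

def monthly_payment(principal, annual_rate, years):
    """Standard fixed-rate mortgage payment formula."""
    r = annual_rate / 12              # monthly interest rate
    n = years * 12                    # total number of payments
    return principal * r / (1 - (1 + r) ** -n)

def crossover_month(principal, annual_rate, years):
    """First month in which the principal portion of the payment
    exceeds the interest portion."""
    r = annual_rate / 12
    payment = monthly_payment(principal, annual_rate, years)
    balance = principal
    for month in range(1, years * 12 + 1):
        interest = balance * r
        if payment - interest > interest:
            return month
        balance -= payment - interest
    return None

loan, rate = 400_000, 0.07
for years in (30, 50):
    payment = monthly_payment(loan, rate, years)
    total_interest = payment * years * 12 - loan
    cross = crossover_month(loan, rate, years)
    print(f"{years}-year: ${payment:,.0f}/mo, "
          f"${total_interest:,.0f} total interest, "
          f"mostly-principal payments start around year {cross / 12:.0f}")
```

With those made-up numbers, the 50-year loan trims only a couple hundred dollars off the monthly payment, roughly doubles the lifetime interest, and keeps the payments mostly interest for around four decades. So yes: for most of the term, you're effectively renting from the bank.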
And it's a bank that, if someone fat-fingers something or just loses some paperwork, can take it right away from you. Not to mention that I question the ability of recently-built houses to even last 50 years. There's more at the link, but honestly, I'm too riled up to rant any more about it today. You may be wondering why this pisses me off so much, considering that I'm retired with a fully-paid-for house and car. Well, it's simple: I don't just think of myself, and I wish other people would do the same. Just remember: if you can't pay cash for it, you can't afford it. Yes, this includes houses, though there's usually good reason to accept debt for a house. And if you're thinking, "But, Waltz, I couldn't afford anything if I couldn't take out a loan for it," then you may be beginning to see my point here: that we're not free agents, but locked into perpetual servitude to our Overlords. Like I said. Neo-feudalism. EDIT: So, after some feedback, I realized I might not have been as clear as I thought. This still might not make things clear, but I'm going to try: Now, I could be *wrong* about any of these things. I'm not quite infallible. But I hope that clears up some of what I said up there. |
| Today, we have Mental Floss with another blurb about how fun English can be. 8 Words That Are Only Used in One Weirdly Specific Context Think about it: have you ever heard someone say they had “extenuating errands”? Well, I bloody well have now. (Or at least seen them type it.) The English language is certainly bizarre in the best way. For alternative definitions of "best." Some of it is totally run-of-the-mill, and some of it is full of words that only seem to appear in one extremely specific situation. I haven't seen MF try to explain idioms like run-of-the-mill (they might have and I missed it), but nothing about our language is truly ordinary, once you really look at it. So let’s take a little stroll through eight words that only show up in one weirdly specific context. You know, there's another way to look at this, too. What we call a "word" is pretty arbitrary. You can take two words and mush them together, like "homework" or "housework" (which are, absurdly, not the same thing at all). So another perspective is that these featured phrases are actually words that just happen to have a space inserted in them somewhere. Inclement (Weather) For that matter, no one ever describes the weather, or anything else, as "clement." Bode (Well/Ill) Bode is a free agent in theory, but let’s be honest: you’ve only ever seen it next to “well” or “ill.” Free agent? Nah, it's in a threesome. Hermetically (Sealed) Now, this word is a little “underground,” if you will. Hermetically sealed sounds like something out of a sci-fi lab, but it mostly refers to food packaging... This is where I get to rant: if it's being used for food packaging, that's- well, not wrong, per se, but a distortion of its original intent. It's a bit complicated, but despite Hermes being the Greek equivalent of Mercury, it has nothing to do with the planet Mercury or the element mercury. Instead, it traces to Hermes Trismegistus, a Greek-Egyptian mashup of Hermes and Thoth, the Egyptian god of wisdom and knowledge (I told you language was weird). Hermes Trismegistus was said to have invented a magic seal that could keep vessels airtight (presumably useful in certain Egyptian interment procedures). But he's best known for being the mythical founder of certain occult traditions, and the adjective "hermetic" originally referred to these spiritual practices, and was basically a synonym for "occult." So if you use, or see, "hermetically sealed," just remember you're communing with deep, ancient spirits. Treat them with respect. Pyrrhic (Victory) There's good reason for this one, which is derived from some ancient general who won a battle, but at too great a cost. It's not like you could have a Pyrrhic loss. Or a Pyrrhic anything else at all. Contiguous (United States) I'm going to quibble with this one; I'm pretty sure I've heard the word used in math contexts. More at the link if you're interested. Mostly I just wanted to rant about the "hermetically" entry. |
| I found another rare (these days) article from Cracked which isn't all about celebrities or movies. 15 Facts About New Space Discoveries In Case We All Need To Move There Someday There's some wild stuff going on out there! It's not that I have some sort of snobbish disdain for the entertainment industry. I like shows and movies. But I don't give a damn about their actors' personal lives, usually, and I'd simply rather talk about what I consider to be more important things, like the pooping from yesterday. So the website is kind of a pale echo of its former self, though I still get highlights from them. This was interesting enough to share. And wonky enough to quibble about. We learn more and more about our universe all the time, but there’s recently been a huge spike in stunning cosmic observations. Also a huge spike in stunning bullshit, like the nonsense about Comet 3I/ATLAS. Yes, it is, by the most technical interpretation of the term, an alien invader. No, it's not "aliens." One of the reasons I'm sharing this article is that it mentions that comet without giving the "aliens" hypothesis the oxygen it truly doesn't deserve. As an aside, I miss the days when they'd name comets after the people who discovered them. As I understand it, most early comet detection is automated now, hence the naming after robots instead of meat. Could we at least give them fun names? Even if we have to name them after celebrities or some shit. I'd suggest letting people bid on naming rights, but Muskmelon would win every time, and the comet names would be like X, X-1, 2X, X3, XXX, X35 (which is "sex" spelled backwards), or 3I/ATLAS. Anyway, the article is, as usual, a countdown list, and there are pictures, and I'm only going to comment on a few of them (no pix). 15 Astronomers recently discovered 128 new moons orbiting Saturn To date, Saturn now has 274 known moons. I'm pretty sure I've mentioned this, or something like this, before. You know how when they redefined what "planet" meant, and the definition ended up excluding Pluto, and a whole bunch of people who otherwise only gave a shit about celebrities and who they were doinking suddenly became space critics? Well, I think they need to do a definition for "moon." See, right now, as far as I know, there's no fixed lower limit on how big something has to be to be called a "moon;" it just has to orbit a planet or dwarf planet. So, Saturn also sports the second-biggest moon in the solar system: Titan, aptly named. That fat bastard's bigger than Mercury. But I'm guessing Saturn also controls one of the smallest moons in the solar system: a tiny grain of dust somewhere in its awesome set of rings. Yes, every particle in those iconic rings can be thought of as a moon, because it orbits the planet. Point is, Saturn doesn't have 274 moons; it has millions. Just because we can't resolve all of them individually doesn't make what I'm saying less true. We can't resolve individual stars in distant galaxies, either, but they're still stars. It is, however, a categorization issue, not a science one. 13 Uranus' 29th moon A convincing argument against free will is that you just now thought of a "your anus" joke. Hidden inside the planet's dark inner rings, new observations from the James Webb Space Telescope found Uranus’ 29th moon. And that you're trying desperately to make a "dark inner rings" joke to complement that. Anyway, see above for "what's a moon" quibbling. 11 Martian Dust Devils Someone really needs to make that a band name. Or a sportsball team name.
8 Jupiter’s neon light show This is cool and all, but this made me do actual research to see if "neon" meant the pretty colors, or the actual noble gas called neon that, when properly stimulated, makes pretty colors. But neon isn't a significant component of ol' Jupe's upper atmosphere, so I'm gonna go with "pretty colors." 5 Comet C/2025 R2 SWAN Seriously, folks. If you can change the definition of "planet," you can change the comet-naming convention. No one wants to call them by something more suitable for being an OnlyFans password. 4 The universe is getting colder and slower Aren't we all? Welcome to the party, pal. Like I said, there's more at the link, and pretty pictures too. You might have to put up with ads and popups; I don't know, because my computer wears two condoms when it probes the internet. |