Items to fit into your overhead compartment |
| Admittedly, sometimes I save a link just for the opportunity for snark. This Southern Living article is one of those. 10 Old-School Housekeeping Habits It’s Time To Let Go Of Plus, how to break them to create a healthier home. Is one of them "housekeeper?" Because who can afford a housekeeper nowadays? We all have habits that we learned from our parents that we may never have second-guessed. And most of those habits are probably healthy ones, but when our parents were growing up, standards were different. More true for some of us than for others. A generation later, we just know better on certain things—plus, we have so many more products and tools at our fingertips. Translation: This is an ad for cleaning products and tools. Old Habit: Using The Same Sponge For Everything To avoid cross-contamination, designate one sponge for each purpose. Disinfecting them in the dishwasher, or putting them in the microwave for two minutes, will kill a lot of the germs, but not all, so you also want to replace them often. This message has been brought to you by The Sponge Company. The Sponge Company. The Sponge Company: Be sure you have enough sponges! Fear the germs! Old Habit: Neglecting To Clean Your Cleaning Tools Just like a dirty sponge may actually make your “clean” dishes dirtier, the same goes for all household cleaning tools. Okay, to be fair, this one doesn't suggest buying new cleaning tools every week. They must have not gotten paid by The Mop Company. Old Habit: Using Too Much Laundry Detergent Oh, cool, this one's actually suggesting to use less laundry detergent, not more! Right? Take the time to actually read the packaging on your detergent to see how much the manufacturer recommends using per load. Chances are, it’s much less than you’d think. Nope. "Chances are" they still recommend too much, so you buy it more often. 
"Less than you'd think," maybe, but I'm guessing that's because modern laundry detergents are more concentrated, and if you're used to 1970s detergent measurements, yeah, you're probably using too much. It's like those toothpaste commercials that gleefully demonstrate the use of their toothpaste by squeezing out a long, curly glob that fills up the entire toothbrush. No, you don't need to do that. But if you do, you'll buy more toothpaste sooner. Speaking of toothbrushes (yeah, I'm skipping a few): Old Habit: Not Disinfecting Toothbrushes Growing up, the concept of cleaning a toothbrush was foreign. We just used them twice a day until the bristles were completely splayed out before finally replacing them. I don't remember how often we replaced teethbreesh when I was a kid. Knowing my parents' well-earned frugality, it wasn't very often. Rinse your toothbrush well after each use, and make sure you’re storing it in a place that will let it air-dry. That's easy to say, and do, if you don't have cats. Old Habit: Using Paper Towels To Clean Everything The world is a lot more environmentally conscious than it was a couple decades ago, and one old-school habit we know should be broken is using disposable paper towels to clean, well, everything. Oh, no, we should use sponges instead. You know, sponges made of some kind of plastic rather than paper towels made with vegetable matter. Sponges that, per the above, are always covered in lurking germs just waiting to pounce. Nah. I'm using paper towels to clean everything. It's more sanitary. It's not that I don't care about the environment; it's that my contribution means squattly-dick in the face of corporate shenanigans. Old Habit: Leaving Dusting Until Last You may have seen your parents dusting off furniture, ceiling fans, ledges, moldings, and the contents of the curio cabinet as the final step of cleaning a room—after the tidying and vacuuming was done. I'm not psychologically able to use "dust" as a verb. 
It's one of those contronyms, anyway: "to dust" can mean "to remove dust from" or "to sprinkle dust upon." And I don't have the mental energy to be careful with context every damn time. Okay, so, that article wasn't nearly as much of an ad as I'd feared. Still, you gotta look out for these things. There's more at the link, because I can reluctantly admit that there may be some good tips in there. |
| Today's link is from The Guardian, and it's long. Just a warning for short attention spans. At least it's text and not video. Apocalypse no: how almost everything we thought we knew about the Maya is wrong For many years the prevailing debate about the Maya centred upon why their civilisation collapsed. Now, many scholars are asking: how did the Maya survive? Though I hate that the headline leans into the "apocalypse" nonsense, and with a terrible pun no less, I wanted to know what they had to say. (Yes, I used a terrible pun for this entry title, too. I never said I wasn't a hypocrite.) Thanks to technological advances, we are entering a new age of discovery in the field of ancient history. Improved DNA analysis, advances in plant and climate science, soil and isotope chemistry, linguistics and other techniques such as a laser mapping technology called Lidar, are overturning long-held beliefs. Nowhere is this more true than when it comes to Maya archaeology. In other words, we move forward to look backward, which gives us tools to move forward. That is, assuming you view time as a linear thing with the future ahead of us. I did a whole entry on that a little while ago ("Fun Times"). When Estrada-Belli first came to Tikal as a child, the best estimate for the classic-era (AD600-900) population of the surrounding Maya lowlands – encompassing present day southern Mexico, Belize and northern Guatemala – would have been about 2 million people. Today, his team believes that the region was home to up to 16 million. That is more than five times the area’s current population. That's a hell of a difference. By way of comparison, that's almost twice the population of New York City. A bit more spread out, though. Think of it more as: almost twice the population of Virginia. This is how science works. And that's a good thing. Some Maya cities were established hundreds of years before the founding of Rome, and they included significantly larger architecture that still stands.
Both cultures developed sophisticated astronomy, mathematics, writing and agriculture, as well as elaborate trade arrangements across vast cosmopolitan lands. I'm still not convinced that Rome was all that great at math. Or maths, as The Guardian would spell it. Outsiders’ power over the story of the Maya is written into the people’s very name. After their arrival in the early 1500s, the Spanish named local populations “Maya” after the ruined city of Mayapán in present day Mexico. Yet the Maya never saw themselves as one people and were never governed under one empire. They spoke many languages – 30 of which are still around – and belong to an intricate mix of cultures and identities. This is, I think, similar to how they used to think of North American natives as "Indians" without much regard for the nations, tribes, clans, etc. that made up the diverse pre-Columbian population. Or how people think of "Africa" as one place while "Europe" is many places. Over time, some observers spread pseudoscientific stories claiming that Maya temples were more likely to have been built by aliens than by ancestors of local people. (Vikings, Mormon Nephites and other mysteriously vanished civilisations have also been dubiously credited with building the ancient sites.) Yes, because clearly, people with noticeable melanin in their skin couldn't possibly have achieved anything of greatness. Hence the "theories" about Maya, or about ancient Egyptians for that matter. Also, I thought we were done calling them "Vikings." Subsequent large-scale mappings led to Estrada-Belli’s estimate that between 9.5 and 16 million people once lived in the Maya lowlands. He calls the lowlands in the 700s a “continuously interconnected rural-urban sprawl”. This was a cosmopolitan region with high degrees of trade and settlements interconnected by a close web of causeways and roads. So, apparently, by the time Europeans showed up almost a thousand years after, most of it was reclaimed by the jungle. 
The ancient Maya did not use pack animals, or carriage wheels. Everything that was built and traded had to be carried by human force alone. The bit about carriage wheels is pretty famous. At least, I've known about it for a very long time. There's a lot of speculation about why, some of it centered on things like the above: that they were too primitive to invent such things. And yet, there's another possible explanation: that the wheel was entirely too sacred to use for mundane tasks such as carrying burdens. Calling something like that "sacred" may strike you as primitive, too, but from an outsider's perspective, some of the things you consider sacred are mere mundanities. Now... there's a lot more at the link. I did say it was a long read. The article itself says it's a long read. But if all you know about the Maya is the calendar thing, or the wheel thing (which I believe might be related), it's worth the time. |
| Oh hey, it's an article relevant to writers. Kind of. From The Marginalian: The Dictionary of Obscure Sorrows: Uncommonly Lovely Invented Words for What We Feel but Cannot Name As a reminder, all words were invented. Some were invented more recently than others. “Words are events, they do things, change things. They transform both speaker and hearer; they feed energy back and forth and amplify it. They feed understanding or emotion back and forth and amplify it,” Ursula K. Le Guin wrote in her exquisite manifesto for the magic of real human conversation. All due respect to Ms. Le Guin, but "real human conversation" can also be incredibly annoying. In the roots of words we find a portal to the mycelial web of invisible connections undergirding our emotional lives — the way “sadness” shares a Latin root with “sated” and originally meant a fulness of experience, the way “holy” shares a Latin root with “whole” and has its Indo-European origins in the notion of the interleaving of all things. Well, yeah, hence "holistic." I do appreciate etymology. Because we know their power, we ask of words to hold what we cannot hold — the complexity of experience, the polyphony of voices inside us narrating that experience, the longing for clarity amid the confusion. There is, therefore, singular disorientation to those moments when they fail us — when these prefabricated containers of language turn out too small to contain emotions at once overwhelmingly expansive and acutely specific. One thing I try to avoid is the phrase "words cannot describe." I call myself a writer. I need to find words that describe. I don't always succeed in my avoidance, mind you; some things are simply indescribable. John Koenig offers a remedy for this lack in The Dictionary of Obscure Sorrows... Yes, the article promotes a book. 
The title, though beautiful, is misleading — the emotional states Koenig defines are not obscure but, despite their specificity, profoundly relatable and universal; they are not sorrows but emissaries of the bittersweet, with all its capacity for affirming the joy of being alive: maru mori (“the heartbreaking simplicity of ordinary things”), apolytus (“the moment you realize you are changing as a person, finally outgrowing your old problems like a reptile shedding its skin”), the wends (“the frustration that you’re not enjoying an experience as much as you should… as if your heart had been inadvertently demagnetized by a surge of expectations”), anoscetia (“the anxiety of not knowing ‘the real you’”), dès vu (“the awareness that this moment will become a memory”). Well. All of that is nice. So are the other examples given, which I won't get into but are right there at the link. There's a problem, though. The purpose of language, in my view, is not for us, as writers, to demonstrate our superior intelligence, insight, vocabulary, and sexual attractiveness (though we certainly possess these qualities). It's not to smugly show how clever and erudite we are. No, the purpose is to communicate. If we're going to go by what Le Guin said in the quote above, words can certainly "do things, change things," but only if both the speaker and listener (or writer and reader) can agree, at least to some extent, on the meanings of those words. So if I'm going to a rock concert that I've been looking forward to for some time, and I feel like I should be enjoying it more, my friend might ask, "Hey, what's wrong?" And if I go, "I got the wends," unless they saw this article, they'd have no idea what the hell I'm talking about, and might even drag me to the first-aid station. Or, I could explain what "the wends" are, per the last paragraph I quoted, but at that point I wouldn't feel much like explaining anything.
Or I could just say "I'm not enjoying this as much as I'd hoped," and leave it at that. And I say this as someone who's made up words in the past and has had to explain what they mean. Consider how much harder it must be if you're using someone else's recently-made-up words. But hey, maybe some of these will catch on and become part of the lexicon. Language may reflect what's important to a culture, which is why we have dozens of words for death and fewer than a half-dozen for love. I'm sure there are linguists who disagree, but maybe, by changing the language, we can change minds. That power, however, can also be used for evil. So, I don't know. I guess when it comes to this stuff, I'm feeling agnosthesia. |
| Today's attempt to turn this into a food blog (because "carrion") is from TastingTable. I saved this link for a few reasons. One of them is to display an example of how you can do a headline without making it clickbait. "Some Tomatoes Have A White Ring Inside. How Dangerous Is It?" would be clickbait. Have you ever cut into a tomato and been perplexed to see a white ring? No. Oh, I've seen the white rings. I was just never all that curious about them. It just never occurred to me that it was anything but standard variation in quality. One of the primary causes of a white ring is a potassium deficiency in the soil when the tomato is growing. It's important to have an adequate concentration of potassium because, without it, the fruits may not absorb enough magnesium and calcium to properly ripen. ChEmIcAlzzz!!! However, too much sun exposure can also lead to your tomato growing with pale tissue. If the fruits are left out in the open at temperatures higher than 85 degrees Fahrenheit, they may turn white or yellow, and some areas may end up being dry or shriveled. Do what now? Okay, it's been a very long time since anything useful has grown near me, and even longer since I (or rather my parents) grew tomatoes, but I seem to remember "plant after last frost" and "plant in an area with full sun." And Potomac River basin temperatures stayed above 85F most of the summer, even before climate change started to accelerate. I could be misremembering. Also, what with selective breeding and genetic engineering and whatnot, I'm pretty sure there are hardier varieties. There are a few other reasons why you may see white flesh inside your tomatoes. If stink bugs, beetles, spider mites, and other bugs get under the fruits' skin and start feeding, they'll suck out the juice and insert their saliva, leading to a white spot rather than a white ring. You know how it goes: what's worse than finding a bug in your tomato? Finding half a bug. 
The good news is that if you spot a white ring inside your tomato caused by a potassium deficiency in the soil, it's typically safe to eat and can simply be removed by cutting... And why would you want to cut it out if it's safe to eat? Well, "safe to eat" isn't the same thing as "appetizing." I routinely excise the stem parts from tomatoes, just because I don't like them. Really, there's not much to this article, but it did sort of answer a question I never knew I had. |
| Now, here's a departure from my usual fare, thanks to Elisa, Stik of Clubs Post-Luxury Status Symbol #2: Wasteful Time We’ve spent two decades optimising ourselves into exhaustion, and now the flex is declaring you were never stressed in the first place. I suppose that's preferable to all the bragging about how busy one is. In Eat, Pray, Love, an Italian man tells our hapless protagonist her problem is that she’s American - Americans don’t understand pleasure because they believe it must be earned through exhaustion. Far be it from me to agree with anything from the genre I call divorce porn (chick gets divorced, goes to a foreign land to "find" herself, doinks a hunky local guy, leaves satisfied), but that one feels right, like when a Belgian tour guide told me Americans eat like we have free health care. Italians, he explains, have mastered il dolce far niente: the sweetness of doing nothing. Fifteen years later, that sweetness has become the ultimate luxury. Some might recall that I had an entry about doing nothing back on Groundhog Day: "Nothing Matters" Thorstein Veblen argued that people signal wealth through conspicuous consumption, conspicuous waste, and conspicuous leisure. Had he lived into the 21st century, he might have added a fourth: conspicuous grinding. The performance of perpetual productivity. Capitalism convinced us this is what rich people actually do. It isn’t. The biggest advantage to being rich is that you have the ability, and the resources, to do nothing. Or almost nothing. But grinding doesn't get you there. Your hustle mostly enriches someone else. Someone who is doing almost nothing. And yeah, you're surviving, maybe even thriving, but you're not going to become a billionaire that way. (Look, if the article can use the second-person pronoun with impunity, so can I.) Leisure makes you feel guilty because you’re not working. Working constantly feels virtuous because that’s what success demands. 
We optimised our work, then ourselves, then wondered why we felt empty. (And also the first-person plural pronoun.) What’s emerging now is a pendulum swing towards a new aspirational leisure class: people whose value isn’t tied to what they do, but to how effortlessly they exist. Insofar as people have "value," I balk at the notion that some are more valuable than others. Time itself has become precious, so the ultimate status is to be wasteful with it. Complete autonomy over your schedule. The ability to meet anyone, whenever, and always know the right spot. To decline opportunities based on values or vibes. To partake in long, leisurely meals with no rushed ending. I also balk at—nay, outright reject—the idea that such things are in any way "wasteful." Although many activities today would have been considered leisure by previous generations - skincare rituals, vinyl listening bars, elaborate dining experiences - the question remains: is it still leisure if an algorithm told you to do it? People can answer that question for themselves, I think. If nobody could see you, if you couldn’t post about it, would you still do it? That one, too. In my case, I do plenty of stuff that I don't post about. Some of it's not even embarrassing to admit; I just keep it private. If so, that’s neo-leisure. If not, it’s unpaid labour, the performance of joy for an invisible audience. Personally, I'm under the impression that a lot of that sort of thing isn't someone spontaneously deciding to, say, go to Tuscany and doink a local, but someone getting paid to promote Tuscany. This is the contradiction at the heart of Neo-Leisure: the moment you perform it, you’re optimising again. The ability to waste time becomes another metric to track, another behaviour to perfect. We’ve simply replaced productivity optimisation with leisure optimisation. One exhausting performance becomes another. When your hobby becomes your job, it loses a lot of what makes it interesting. 
Like, can porn stars ever have normal sex again? I'll never know the answer to that one. For marginalised communities, for precarious workers, for anyone without generational security, the luxury of wasting time remains inaccessible. They’re still grinding because they have to. The status symbol isn’t in wasting time. It’s in having enough capital that you don’t need to justify how you spend it. I don't think it's an epiphany to realize that leisure is tied to privilege. I know there's a bit going around about how feudal serfs had more free time than we do in our post-industrial dystopia, or about how hunter-gatherers work less than agriculturalists. I don't know how true any of that is. The article then goes into "leisure" products that I've never even heard of. Remember what I said about getting paid for seeming to perform leisure, up there? I suspect that this is product placement. True leisure, in my view, doesn't need a "product." Odell wrote that “nothing is harder to do than nothing”. In an era where attention and consumption are currency, wasting time becomes an act of resistance. Okay, but again, I must reiterate that I don't believe that these things are wastes of time. You know what is a waste of time? Doing work for a project that ultimately gets canceled. Even that isn't a complete waste of time if you learn something along the way. The greatest luxury might be doing nothing and feeling no need to signal it at all. Maybe. Or maybe the greatest luxury is to get paid to write blog entries. (To be clear, at the risk of repeating myself, I do not get paid to write blog entries.) |
| Many years ago, I had this recurring schtick about how a certain white kawaii cat from Japan was evil and taking over the world. I called her the Nefarious Neko. I dropped it because it got old for me, but along the way, I learned way more than I ever wanted to know about Hello Kitty. So this BBC article caught my attention: I, of course, never actually hated Hello Kitty. If there's something I actually detest, I normally just leave it alone, like the way I almost never talk about sports in here. The designer behind Hello Kitty is stepping down after 46 years, during which time she oversaw the feline character achieving world recognition. "World domination" is more like it. I remember some years ago, I saw an article about what was called "the most remote community in the world" or something, a tiny village in northern Siberia that was the only place of human habitation for many kilometers around. (Perhaps there are islands more remote, technically.) I don't remember many details, but it's the kind of thing that's only accessible by rail, and, because it's Siberia, only for like three months out of the year or something. Point is, it is quite literally the farthest corner of the world, and I distinctly remember, in one of the photos, a little girl wearing a Hello Kitty shirt. Yuko Yamaguchi took over design duties for the character - who isn't actually a cat, but a little girl from London - in 1980, five years after she first launched. It's not widely known, but yes, she's actually a little girl and she's actually British and her name is actually Kitty White. Yamaguchi herself often wore Kitty-style dresses in public and piled her hair in buns. One of these days, I'm determined to visit Japan. I'll need to brace myself for that sort of thing. The Hello Kitty character first appeared on a coin purse in 1980 and has become a global marketing phenomenon. This is where I started to metaphorically scratch my head. 
Up there, it said she took over "in 1980, five years after [HK] first launched." Seems to be a glitch in the timeline there. She has appeared on clothes, accessories, video games, and even an Airbus plane. And I saw an entire bullet train with a Hello Kitty theme. Well, pictures of it, anyway. Not to mention the Hello Kitty vibrator and the Hello Kitty assault rifle, which I called the HK-47. Unlike other Japanese exports such as Pokemon, there is little backstory to the character of Hello Kitty. Sanrio has said she "isn't a human, [but] she's not quite a cat either". As she has an actual cat as a pet, if she were a cat, it might raise questions about slavery and feline trafficking. She was born in London and has a twin sister named Mimmy and a boyfriend named Dear Daniel, according to Sanrio. And yet, the article fails to mention her name. Fortunately, I covered that up there ^ Kitty will make her cinematic debut in a Warner Bros film in 2028. She has already appeared in several animated series but has never spoken, as she doesn't have a mouth. Now, I want you to stop and think for a moment about the optics of a British little girl character, created in Japan some 30 years after WWII, whose distinguishing feature is being voiceless. And maybe you'll see one reason why I had that schtick going for a while. |
| Yes, this headline, from allrecipes, is clickbait, or perilously close to it. This Unexpected Trick Keeps Potatoes From Sprouting, According to an Expert We tested the popular hack to see if it really works. I'm actually more annoyed at the continued use of "hack" in this context. But I'm not sure if it's better or worse than "trick." Whether you like them fried, roasted, baked, or made into tots... "Boil 'em, mash 'em, stick 'em in a stew. Lovely big golden chips with a nice piece of fried fish" ...you probably have at least one favorite potato dish. Sure: Latkes. Plus, potatoes are inexpensive—you can often get a large 5-pound bag for just a few dollars. How much for the small 5-pound bag? But if you’re a household of one or two, it can be a challenge to eat all those potatoes before they go bad, no matter how much you like them. Whenever I see something about food "going bad," I imagine it standing on a street corner in a leather jacket and tattoos and chains, smoking a cigarette. That’s why videos of people stashing apples in their bags of potatoes to prevent sprouting have popped up all over social media. But does this trick work? We looked to the science, talked to an expert, and tried it ourselves to find out for you. No, I didn't just save this article so I could make potato jokes. That's just a bonus. I'm using this as an example of How To Do Science. Still. Remember the French phrase for "potato": pomme de terre, which literally translates to something like "earth apple" or "ground apple" (as in "the ground," not the past tense of "grind"). So I find the apple trick amusing, whether it works or not. But in addition to storing them in that cool, dark, and ventilated space, can putting an apple in your potato sack really stop, or at least slow, potato spoilage? Well, it’s a little complicated. So, how to do science. This is an easy and cheap experiment, though if you have an ethical issue with deliberately wasting food, maybe skip it.
Find a bunch of potatoes from the same harvest, get an apple, split the potatoes into two groups, put each group in a sack (one with the apple) and leave them in the same room under the same conditions, but not so close to each other that apple fumes transfer. Then simply check to see which batch, if either, grows eyes first. Of course, just one experiment won't cut it. It needs to be repeatable and verifiable. Also, best if you have a third batch of potatoes from the same source set out on the counter or something as a control group. There’s some scientific evidence to support this hack. Ethylene gas is a natural plant hormone produced by fruits like apples, bananas, and tomatoes. It plays a crucial role in the ripening process of fruits and the aging of vegetables. The theory behind the apple-potato trick is that apples release ethylene gas, and ethylene gas has been shown to inhibit potatoes from sprouting in at least one lab experiment. This is the "hypothesis" stage of science. Not "theory." It's the starting point. But one must also take into account the possibility that there are other factors in the pomme / pomme de terre synergy, not just ethylene gas. For that, you'd probably need a real lab setup. As a practical matter though, what you're really just looking for is extended shelf life on your spuds, so while the mechanism is interesting, just doing the experiment is good enough for the kitchen. Or, you know, trust the other scientists who have already done the experiment. The study showed that ethylene treatment delayed the sprouting of potatoes, at least under these tightly managed conditions. It's the "tightly managed conditions" that are often the stumbling point between experiment and practicality. To put this apple-potato trick to the test, I conducted a simple experiment in my home kitchen. I divided a bag of potatoes into two groups: one stored with an apple and the other without. 
I kept both bags in a relatively cool, dark pantry and checked on them every day for more than a week. See, what'd I tell you? Surprisingly, after seven days, I found that the bag of potatoes with the apple actually sprouted first, while the bag without the apple sprouted about 24 hours later, after eight days. It’s a puzzling result considering the research. A puzzling result, maybe, but a good result. Why good? Because it exposes a possible error in your hypothesis, or your method, and that's the fun part. My pantry isn’t a lab, and my climate control was anything but precise. Plus, different potato varieties may have varying susceptibility to sprouting. Okay, yeah, but like I said: get your potatoes from the same batch. If one batch was dug up 3 days ago and the other, 5, then the experiment has a fatal flaw from the get-go. There is no harm in trying this trick at home, says Jayanty. Whether you go with an apple or a banana, it won’t hurt the potatoes, and it just might delay sprouting. "No harm" unless you'd rather have an apple or a banana than a potato to eat. The article ends with actual scientifically-backed tips for extending tater life, so there's some practicality there, apart from the kitchen science stuff. |
| The absolute last thing I should be doing is talking about human interrelations. I know more about quantum physics, and I know almost nothing about quantum physics. But, in another example of random number clumping, here's another BBC article, this one on relationships: For once, a headline question can be answered "yes." Because for a question like that, all you have to have is one counterexample. And I have friends who are exes, so there you go. Story over. ...except, of course, we already know that I'm an exception, an outlier, a better person. I suspect many people would say "hell, no." Break-ups are tough - you suddenly lose the person you shared everything with. But staying friends with an ex can be equally as painful. Well, yeah, but presumably you liked them for a reason, right? A reason that might be common ground for friendship? Unless it was purely physical attraction, which there's nothing wrong with. I should note, however, that part of the "I don't know what I'm talking about" thing is that I'm not sure whether the article is talking about, like, just being cordial with the ex, or occasionally doing stuff together in a group, or someone you can call to help you move, or a full-on "help you move bodies" close friendship. That last one, admittedly, could get a bit awkward, I would imagine. It's like "Sure, I'll be there if you need me; I just find you unsuitable to live or mash genitals together with." "I don't have many friends who are friends with their exes, actually," says Olivia Petter, author of dating handbook Millennial Love. But she has managed it in a couple of cases. Obviously, this kind of thing is way behind me, anyway. So my interest is mostly sociological. Okay. I can't pass up noting yet another aptronym there. Almost makes me think "Olivia Petter" might be a nom de plume. But no one would be that obvious... would they? 1. How serious was it?
"There are one or two men I've had brief, casual romantic relationships with that have evolved into friendships," Olivia told BBC Radio 4's Woman's Hour... But when it comes to serious relationships, she says while she's on good terms with them they're not close friends. See, that's the sort of thing I'm talking about. I run into my first ex-wife occasionally. We talk and catch up. Then we forget about each other again. Meanwhile, a month-long fling I had back in the 90s is the kind of friend where we text fairly often. She's on the other coast, so I don't just run into her. We both did a Thanksgiving dinner last year, though, with a group of friends. No awkwardness involved. 2. Are you over them? One of the biggest obstacles is whether you are able to separate the romance from the person. "You need to have processed the break-up, not just moved on logistically, but emotionally," Kate says. Huh... that's perilously close to what I said above. Maybe I should write a book about relationships and have BBC promote it. 3. How much time has passed? Yeah, that actually makes sense. Get past the big emotions. And, obviously, some relationships end with unforgivable acts. 4. Is your new partner ok with it? If you do decide to stay friends, then Kate says you need to talk openly about what you are both going to do if the other gets into a new relationship. And if a new partner is uncomfortable with the friendship, Kate stresses you should take their concerns seriously. We didn't use the words "red flag" way back when I was dating, but one immediate "no" signal for me was when someone indicated that being friendly with an ex was a big "no" signal for her. Most of my friends are women. Some are exes. I wasn't about to give up long-term friendships because someone new doesn't trust it. 
You may need to have a conversation with your ex to adjust the friendship which could be "less frequent contact, more group settings, or being more transparent about what you're doing together," she says. And then, from my longer-term perspective, sometimes friendships just fade out, whether there was ever romance and/or doinking involved. People drift apart. It's just a fact of life. It's not always a, pardon me for adopting the term, "conscious decoupling" or whatever. (I'm applying this to friendships as much as romance.) I guess, for me anyway, I just feel like holding a grudge hurts you more than it does the other person, so I try not to do it. Doesn't mean we have to do stuff together, but it hurts me not at all to at least be cordial. Again, though, I have a longer perspective on these things. It's different for younger people, and society is different, too. Hence my interest in the subject remaining purely scientific. |
| I'm skeptical about a lot of things. Not denialist; skeptical. But there are a few things I'm absolutely certain of, and one of them is that plants die in my vicinity. Hell, one time I bought a cactus for my housemate. I never touched it. She was diligent with it. It died. So I don't believe for one second this article from BBC: I thought about getting plastic plants, but I expect those would die around me, too. Have you lost count of the times you've had high hopes for a pot plant... However, this lede is the real reason I saved this article. High hopes? Pot plant? I get they speak a different language over there, but either they ignored their US editors, or this is one of those times when the BBC gets a bit cheeky. It does that sometimes. It's often subtle. So I'm hoping—really hoping—that this particular double entendre was intentional. Still, in the US, we call them "potted" plants to distinguish them from weed. ...but despite careful positioning and diligent watering it always seems to die? Every. Time. Full disclosure: I've never tried with an actual (snicker) pot plant. Well you're not cursed and you don't need particularly green fingers for your foliage to thrive, you just need to know where you might be going wrong, experts say. You all know by now that I have no business with the supernatural. I mean, it's great as metaphor and in stories, and I enjoyed the long-running series by that name, but here in reality, I prefer science and reason. Still, if anything was going to pivot me to believing in curses and whatnot, it's the way plants die in my presence, as if realizing that they're stuck with me and have no legs to run away with. But—rationally—my cats have legs and are stuck with me, and they don't try to run away. Except the one who gets medicine twice a day, but she doesn't run far. Gardeners' World host Adam Frost and the Royal Horticultural Society's Clare Preston-Pollitt share their top tips for keeping your house plants alive and healthy. 
Clare's last name is perilously close to being an aptronym. Pollitt? Pollen? I'll be here all week. Adam's is the polar opposite of an aptronym. 1. Pick the right plant I don't want to seem ungrateful, here, but that's crap advice. As someone who appreciates form, but appreciates function even more, the only plants I really want to keep around are the ones you can use in cooking: mint, thyme, basil, etc. A ficus is useless and just takes up space. So I want to know how to keep, specifically, herbs alive, not some random spider plant that would look better in a vegan restaurant anyway. Many of us pick plants we think are pretty but making sure they are compatible with the conditions in our homes is key for survival, says Clare, RHS Garden Bridgewater's horticultural advisor. And if you live with pets, some plants are right out. Imagine me having a catnip plant. It wouldn't last a day. But, worse, some plants are outright toxic to pets, usually the same ones that pets absolutely love to munch on. 2. Don't overwater Yeah, thanks. I suspect the problem here is the precise opposite, but "underwater" means something else entirely. For common house plants like peace lilies and spider plants, brown leaves are a tell-tale sign of over or under watering. Check the dryness of the soil before topping them up. For others, like cacti and succulents, Clare says we mistakenly drown them by unnecessarily watering them. I am certain that this is not the case for the cactus I mentioned above. 3. Water less in winter Some regions experience winter differently. Specifically, some never get cold, while others never get warm. I suspect this article was written with the temperate and moist UK in mind. 4. Keep your Christmas poinsettia warm Remember how I said the thing about pets? Yeah, poinsettia and pets don't go together, despite being unable to spell "poinsettia" without "pets." Also, even I know poinsettias are from Central America and Mexico. 
Which is one of those regions I mentioned in the last section. Exposing them to cold is like mixing good tequila with Sprite. To keep them lasting longer than your New Year's resolutions, you should add plant food to your poinsettias each month, Adam says. In April, he suggests trimming the branches, before re-potting in May. And I'm including that bit to emphasize what I said up there about the BBC sometimes being cheeky. "lasting longer than your New Year's resolutions," indeed. I'm obviously leaving a lot out, but the link is there if you haven't yet given up on the entire idea of houseplants, like I have. |
| There was a slew of articles about Shakespeare not too long ago, probably paid advertising for the movie Hamnet. This might or might not have been one of them, from Mental Floss: Well, yeah, because no one knows everything about Shakespeare. Hell, no one even knows for sure what day he was born on (they have a baptism record and an assumption based on custom). As usual, don't trust MF for the facts. Or me, for that matter. This is, as certain "news" outlets disclaim, for entertainment purposes. Misconception: Historians debate whether Shakespeare really wrote Shakespeare. Shakespeare wrote a lot. He was also from the country town of Stratford-upon-Avon and didn’t go to university. So could this one “simple” guy write all these impressive high-brow works? So, two things here. I've heard this "theory" bandied about for as long as I've known about Shakespeare, and my first impression (as an impressionable kid) was "How is this relevant? The plays exist. Can we not separate them from the writer, if not from the time?" Later, I came to realize what this really was: arrant snobbery. Thus, I feel like I can safely ignore any snoot who proclaims that Shakespeare didn't write Shakespeare. And finally, they're only "impressive high-brow works" from our point of view today. At the time, they were the pop culture equivalent of monster truck rallies. Misconception: Shakespeare invented 1700 words. Today’s lexicographers have a lot more data and technology—and they know Shakespeare didn’t coin that many words. (Jonathan Culpeper, a linguistics professor at Lancaster University, has spent decades researching Shakespearean language. He believes Shakespeare coined around 400 words.) As I'm pretty sure I've noted before, I don't know how these things get determined, but it seems to me that, especially before the internet, words were passed around by, well, word-of-mouth before they were ever written down. 
So how do we know what was coined, and what was the pop culture equivalent of "six-seven" and "skibidi?" I will, however, give more weight to the opinion of a linguistics professor in the matter. Misconception: Saying Macbeth in a theater is dangerous. Okay. Okay, fine, so that's a misconception. And yes, actors are known for being a superstitious lot, which is why you tell them to "break a leg" before a performance instead of "good luck." However, that does not make this clip any less than one of the funniest things in the history of comedy: Misconception: Wherefore means “where.” An image you’ve probably seen countless times is Juliet decrying, “O Romeo, Romeo, Wherefore art thou Romeo?” It sounds like, “Where you at, Romeo?” And some performances even have Juliet physically searching for Romeo as she says those lines. But, at the time Shakespeare was writing, wherefore essentially meant “why.” Juliet is asking, “Why are you Romeo?” because it’s his name, attached to a family that’s feuding with hers, keeping them apart. Okay, I'm not going to argue about that. It just seems weird because it's not his name "Romeo" that's keeping them apart, but his family name. And the necessities of plot. I continue to insist that R&J is best interpreted as satire and/or parody. Misconception: As Hamlet says, “to be or not to be,” he’s holding a skull. Sometimes in pop culture, you encounter a Hamlet who’s holding a skull and reciting the “to be or not to be” speech. (Billy Madison is one example.) But Hamlet holds a skull during his speech in the churchyard that begins, “Alas, poor Yorick!” It happens in Act 5, Scene 1. “To be, or not to be—that is the question” comes two acts before that, in Act 3, Scene 1. Sure. Again, not arguing. But it's pretty damn famous that in all of Shakespeare's surviving works, only one stage direction stands out: "Exit, pursued by a bear." (The Winter's Tale). And there weren't many in his entire body of work. 
I don't recall, and am too lazy to look up, if those parts of Hamlet have actual stage directions, or if it's left up to directorial interpretation. So if you want to have Hamlet holding a skull, or a book, or a damn iPhone, in your production of Hamlet, I say go for it. Maybe he carries a skull around with him all the time. He certainly seems the type. Misconception: The Globe Theatre was round. The first Globe Theatre was completed in 1599. Shakespeare was a part-owner of the Lord Chamberlain’s Men company, which built the venue. And he may or may not have called it a “wooden O” in the prologue of Henry V. That being said, it wasn’t exactly a circle. It was a many-sided polygon. In fairness, calculus wouldn't be invented for nearly 100 years, so I don't think they'd be aware that, as the number of sides of a regular polygon increases, its resemblance to a circle also increases. At some point, architecturally, you have to say "Yeah, fine, that's a circle." It wasn't, however, a "globe." The Hayden Planetarium, on the other hand, is one. Now, there are a bunch I skipped because I had nothing to say about them. They're at the link if you're interested. |
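And since I brought up the polygon-to-circle thing, here's a quick Python sketch of it, in case you'd rather see numbers than take my word for it. (The function name is mine, not anything from the article.) The perimeter of a regular n-gon inscribed in a unit circle creeps up on the circle's circumference, 2π, as the number of sides grows:

```python
import math

def ngon_perimeter(n, r=1.0):
    """Perimeter of a regular n-gon inscribed in a circle of radius r.
    Each of the n sides has length 2*r*sin(pi/n)."""
    return n * 2 * r * math.sin(math.pi / n)

circumference = 2 * math.pi  # what the polygon is sneaking up on

for n in (3, 20, 100, 10_000):
    print(f"{n:>6} sides: perimeter {ngon_perimeter(n):.6f} "
          f"(circle: {circumference:.6f})")
```

By 100 sides you're already within about a hundredth of a percent of a true circle, which is presumably the point where an Elizabethan carpenter says "Yeah, fine, that's a circle."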
| Well, no, I haven't started suddenly following Wine Spectator. Though I might. Still, I'd rather drink it than read about it, except maybe in this case: France’s Organic Winegrowers Lose an Indispensable Tool. What Now? Health regulators are phasing out nearly all copper sprays due to heavy metal concerns—but that leaves few tools for fighting mildew Okay, well, if it's truly an indispensable tool, then the answer is "stop being 'organic'" or "move out of France." The irony (another heavy metal) here is that copper is considered "organic" for agricultural purposes. Which is distinct from the chemical meaning of "organic" (carbon, not copper, compounds), and the original meaning of "organic" (from organs). My amusement at this is tempered only by knowing that, in French, what we call organic in the agricultural sense is called biologique. It leaves organic winemakers confronting an existential question: How do you protect vines from downy mildew when your primary defense has been eliminated? Because no one in France has ever confronted an existential question before. “Copper is a natural element, naturally occurring in nature,” said Gérard Bertrand, a leading vigneron in southern France and advocate of organic farming. Uh huh. Okay. I'm letting the repetition of "natural" slide there because it's either a translation or someone's second or third language. But I'm not going to let the natural fallacy slide, oh hell no, not me. Once more for the back row: "natural" doesn't mean "good." Poison ivy is natural. Tobacco is natural. Arsenic is a naturally occurring element much like copper. Copper is a naturally occurring element and is approved worldwide for organic agriculture. Critically, there is no equivalent for organic farming. The other options are forbidden synthetic fungicides. Not, mind you, that I'm coming down on one side or the other here. 
I don't know enough about copper toxicity or viticulture in general to weigh in on what France did, and even if I did know enough, they wouldn't give one single shit about my opinion. All they care about is that I enjoy the finished product and keep sending them money in exchange. Copper, while it does protect vines from fungal disease, is a persistent metal that accumulates irreversibly in the top few inches of vineyard soils. In large quantities it disrupts essential microbial communities and earthworm populations that define healthy terroir. It can also contaminate the waterways that flow through wine regions. There are also mounting concerns about its impact on vignerons and vineyard workers themselves. Well, there you go. Look at that: something natural isn't good for you. Regarding a way forward, Jestin remains optimistic, hoping that scientific research can devise alternatives to copper. Honestly, I hope so too. French wine is expensive enough with tariffs. Trade body SudVinBio cautioned that producers may abandon organic practices altogether. About that, I don't care. Two copper products remain authorized, but they carry stringent restrictions, which may make them less practical. Without viable alternatives to copper, how does organic viticulture survive in regions where Bordeaux's Atlantic humidity, Burgundy's continental rainfall, Cognac’s and Champagne's persistent dampness all create conditions where mildew protection determines vineyard sustainability? I have another article in my pile about the American chestnut, which used to dominate Blue Ridge Mountain forests until a fungus destroyed the entire native population. It'll pop up eventually, but the point for now is that things change. And right now, things, climatically speaking, change even faster. France is very strict about its wine growing policies; for instance, they don't allow irrigation apart from whatever rains fall. 
So either they can become less strict, or wine growing will shift to some other region. Which would be a shame, but at least they'll still have cheese. Which I don't think has any copper in it, but I haven't tested it for that. |
| Today, a bit of a break as I use this space to participate in "26 Paychecks": Tell us about one (1) genre that you've never written anything for. Name the genre and tell us why it's not something that has sparked any writing from you. Thing is, I've been here for a while, and I've written a bunch of things both here and elsewhere. And I tend to experiment, sometimes. So I can't absolutely guarantee that I've never written in some particular genre. Hell, I probably have something listed under "Fashion," which is one genre I've always said I knew nothing about. But, upon browsing the list of genres here, I came across one that I don't think I've ever touched, which is Genealogy. There's nothing wrong with genealogy. I get wanting to know where people come from, much as I enjoy tracing word origins. It's just not a particular interest of mine. You'd think it would be, right? Since I was adopted as an infant, surely I must feel a burning need to track down my genetic ancestry! Well, um... no. I don't. Never have. Nothing more than idle curiosity or being able to answer inheritable disorder questions for the doctor. I have had some interest in tracing my adoptive parents' origins, but any attempts led me to dead ends and I gave up, never having written about it. Dad always warned me I'd find skeletons, anyway, and unlike a character in a horror story, I actually listen to warnings. I don't always heed them, mind. But I listen. Short one today to keep it "about 200+ words," but that suits my mood today just fine. |
| Another one that's not my usual thing, but it caught my brain. At first it looks like politics, which I try to stay away from, but it's kinda not. As we know, the answer to headline questions is "no" by default. But okay; I'm willing to listen. It’s no secret that Americans are more politically polarized today than we’ve ever been. Do you even remember a time when we weren’t this way? The only time I recall was the few months after September 11, 2001 – 24 years ago now. For a short but beautiful time, Americans really were united. No. We thought we were united. But it turned out that we were all mad for different reasons. One side was sad about the loss of life. The other seemed to be, but was really just pissed that someone caught us with our pants down. Our representatives in Congress came together to proclaim their commitment to working across the aisle and backed it up with some major bipartisan laws. Yeah. Really, really bad ones. We Americans have settled ourselves neatly into political tribes that don’t work together, don’t listen to each other, and often despise one another. Or both. I can despise both. Many people have been hurt by our current level of polarization, and there’s worse pain to come if things continue this way. If you haven't noticed, there's worse pain to come regardless. Anyway, the article goes into this "depolarization challenge," and I have no need to reproduce it here. At this point you might be thinking, “but why do I need to change? It’s those other people who are causing all the problems!” I can understand that thinking. There are certainly things I've given up on because we'd all have to do it, and that ain't gonna happen. But when you think harder, for stuff like what's in this article anyway, maybe you come to the conclusion that it's easier to change yourself than it is to change other people. Some people are not ready to step outside their comfort zone and change their mindset in this way. 
But those who do will be rewarded with a stronger sense of community, a more functional civic society, less heartache, better relationships, and a country that they can be proud of. And maybe, if enough of us do it over a period of time, our government can become less polarized too. Perhaps this is a noble goal. I'll have to think about it. |
| So, here's one from a source I don't usually follow, but it came to my attention thanks to Elisa, Stik of Clubs: How to Avoid Falling for Fake Videos in the Age of AI Slop Why fake videos spread so easily, and what actually helps people spot them We’re entering an era of what’s often called AI slop: an endless stream of synthetic images, videos, and stories produced quickly, cheaply, and at scale. I have to admit, I'm getting more than a little tired of hearing/seeing the words "AI slop." From what I've seen, AI output has become more polished and professional than about 75% of human-generated content. I think some people might be jealous. I ain't saying it's right, mind you. Only that it's prettier. Sometimes these fake videos are harmless or silly, like 1001 cats waking up a human from sleep. Harmless? You dare to call something that triggering harmless? I don't even allow the significantly fewer than 1001 cats in my household to wake me up. Other times, they are deliberately designed to provoke outrage, manipulate identity, or push propaganda. Because human-generated content would never do that. Just like no one ever got lost using paper maps in the time before GPS. To navigate this new information environment, we need to combine psychological literacy, media literacy, and policy-level change. And here's where it gets difficult for most of us. Why should we change? It's the world that needs to change, dammit! The article provides a road map (or, if you prefer, a GPS route) to us changing: 1) Understand Our Own Psychological Biases (Psychological Literacy) The psychology behind falling for AI-generated misinformation isn’t fundamentally new. The process is largely the same as with other forms of misinformation, but AI takes it to a whole new level: it dramatically lowers the cost and effort required to produce and spread it at scale. My own simple solution: Right now, most of us have a bias that says "I saw it, so it must be real." I suggest turning that around. 
Assume everything you see on the internet, or on TV, is fake. Like you're watching a fictional movie or show. The burden of proof thus shifts. The downside to this (every simple solution has a downside) is that you get so you don't believe anything. And for some of these content generators, that's the goal: make you question reality itself so they can swoop in and substitute it with their own version. Hell, religion has been doing this for as long as there's been religion. As Matthew has written before about fake AI accounts, people are motivated to believe what fits their values, grievances, and group identities, not necessarily what's true. When a video confirms what you already believe about politics, culture, or power, authenticity becomes secondary. I have noted this before: it is important to be just as, or preferably more, skeptical about the things that tickle our confirmation bias. The goal isn’t to suppress emotion. It’s to recognize when emotion is being used as a shortcut around verification, and being used to manipulate you. It sure would be nice to be able to suppress emotion, though. I've felt that way since watching Star Trek as a kid. Spock was my role model. 2) Lateral Reading Is Still the Best Tool We Have (Media Literacy) When people try to fact-check AI videos, their instinct is often to stare harder at the content itself, such as examining faces, counting fingers, looking for visual glitches. Guilty. I've been seriously considering wearing a prosthetic extra pinkie finger so that anyone who looks at a surveillance photo of me will immediately assume it's an AI fake. The most effective fact-checking strategy we have isn’t vertical reading (scrutinizing the video itself). It’s lateral reading—leaving the content entirely to verify it elsewhere. I do that here, especially with notoriously unreliable sources, which, since I try to use free and easily accessible content, is almost everyone these days. 
3) Policy Changes and Platform Accountability Individual skills matter. Community norms matter. But at this point, policy intervention is likely required. Well, I was trying to be funny with the "It's the world that needs to change" bit above, but I guess they're serious. Social media platforms are not optimized for truth, they’re optimized for engagement. I should fact-check this, but it aligns with what I already believe, so I won't. Conclusion The most dangerous thing about fake AI videos isn’t that people believe them once. It’s that repeated exposure erodes trust altogether: in media, in institutions, and eventually in one another. As I alluded to above, it makes us question the very meaning of "truth." I'd also add this: Be humble enough to know that you can be wrong. Be brave enough to admit when you're wrong. And allow space for the idea that sometimes, your ideological opponents are right. Not often, mind you. But sometimes. |
| You know those "which Hogwarts house are you?" quizzes designed to fill out your ad profile online? I don't know; maybe they've finally fallen out of favor. Here's a different kind to consider, from Big Think, and I'm not even building an ad profile of you: Which of the 5 philosophical archetypes best describes you? I'm definitely a Kitsune, but would a Kitsune actually say that? For clarity, that subhead there is the author describing himself as a Kitsune. I'm absolutely not a Kitsune, though I appreciate them. Sometimes. We are all philosophers. I don’t mean this in the “What do you make of Quine’s ‘Two Dogmas’?” sense. No, we are all philosophers in that we all do philosophy. Yeah, even that insipid song by Edie Brickell with the line "philosophy is the talk on a cereal box" is a kind of philosophy. Philosophy is a practice of wonder and logic; curiosity and introspection; dialectic and meditation; criticism and advocacy. I question the author's assertion here, but I guess that means I'm doing philosophy. So, without any empirical rigor whatsoever — another favorite characteristic of philosophy — I present here five different ways to be a philosopher. I feel like "The Fool" is conveniently left out, though maybe that's an aspect of the Kitsune. Yes, yes, I'm getting to what that is, if you don't already know. But that's because I assert that philosophers, by definition, have a stunted sense of humor, or none at all. We have a different word for philosophers with a sense of humor: comedians. The Sphinx The archetype: The Sphinx had the head of a woman, the body of a lion, and the wings of a bird. While that kind of chimera is probably highly symbolic, I don't know what the symbols might mean. Physical descriptions are probably the least important things in these archetypes. Each time, the Sphinx would ask a single riddle, the classic being, “What walks on four legs in the morning, two at noon, and three in the evening?” but I assume there were more. 
One of my favorite scenes in fiction is from a Zelazny novel. The MC meets a sphinx, who asks him a riddle. He asks, in return: "What's red and green and goes round and round and round?" This stumps the sphinx, because of course the sphinx isn't attuned to the modern definition of "riddle." He is thus able to pass while the sphinx ponders, much like when Spock set an android into an infinite loop with deliberate illogic. This is probably when I determined the essential difference between philosophers and comedians. Oh, the answer is "a frog in a blender." The Leviathan The archetype: The Leviathan is a demonic sea serpent that breathes fire. Its back is a row of shields and churns the oceans to a frothing boil. Not ever answered: what use fire-breathing has in a sea monster. This person has a transferable framework that they apply to everything. They’ve read a book, studied a philosophy, or watched a YouTube video and decided, “Yes, this idea is the one that will govern my life.” Every action in every minute of the day can be explained by this single system of ideas. Oh. That type. The Kitsune The archetype: In Japanese folklore, the kitsune is a fox spirit known for their ability to shapeshift. A kitsune might appear as a beautiful woman, an old man, a child, or a tree. Some are tricksters, and others are teachers. The "trickster" archetype can be funny. But not usually to the ones being tricked. The kitsune-person may say something outrageous and, when challenged, give a wide smile with a twinkle in their eye. They’re often impossible to argue with because they keep changing things. Oh, yeah, the goalpost-mover. The Minotaur The archetype: The Minotaur is a half-human, half-beast (typically a bull) locked in a labyrinth. The Minotaur is feral and brutal, no doubt — he will kill anyone he catches in his maze — but he is also lost and tormented. In my view, the "bull" part is essential to the minotaur's description. It's right there in the name. 
("But, Waltz, what about centaurs? They're part horse, not bull." "Turns out one possible etymology for 'centaur' is 'bull-slayer.'") The minotaur-philosopher is someone lost in the mire of human suffering, mortality, freedom, and absurdity. They never escape the labyrinth but make a dark, resigned home within it. Here, you’ll find Pascal, Dostoevsky, Heidegger, Sartre, Camus, and Simone de Beauvoir pacing about in anguish. No comment. The Garuda The archetype: The Garuda is a great eagle of Indian mythology and is associated with clear sight and the dispelling of poisons — especially those of serpents and nagas. The Garuda soars above the landscape and sees the structure of things. One might think that because it's a big-ass bird associated with purification, I'd identify most closely with this. One would be wrong. The Garuda-person asks, “What do you mean by that?” a lot. They hate vagueness and metaphor used as arguments and will often call out both — “What does that actually mean?” they say. They generally don’t have time for “lived experience” or emotional reasoning. Or, I don't know. Maybe that's pretty close. Fuller descriptions exist at the link, of course. While, as the author notes, the list is by no means exhaustive, I find it amusing. I'm also quite pleased that it's not limited to one set of mythology, though there are certainly others that could be included, from other cultures. Though the "trickster" archetype seems to be pretty universal. And most of us are composites — a little Sphinx when we’re unsure, a little Minotaur late at night, a little Garuda when we’re fed up with nonsense. I'd venture that most of us just are, without thinking about archetypes. Hm. Maybe Edie Brickell was onto something, after all. |
| Here's one for your inner 12-year-old, from Live Science: How many holes does the human body have? You might think that the human body has many holes, but that number shrinks when you stop to consider what counts as a hole. Because I know your inner 12-year-old immediately said "which sex?" The human body is extraordinarily complex, with several openings and a few exits. Cue Beavis and Butt-Head. But exactly how many holes does each person have? I imagine it not only depends on your definition of "holes," but how recently someone's been shot. Maybe that only applies in war or the US. But it's not quite that easy once you start considering questions like: "What exactly is a hole?" "Does any opening count?" And "why don't mathematicians know the difference between a straw and a doughnut?" I've noted before that a "hole" isn't a thing. However you conceive of the concept, a hole can only be defined by what's around it. You can't just point to a random location in space and say "that's a hole." Or, well, you can, but people would look at you funny. "Black holes" may be the only exception to this, but their name is more metaphorical. Oh, and the branch of mathematics that doesn't know the difference between a straw and a donut (and a coffee mug, for that matter) is called topology, where all of those shapes are considered toroids: one hole going all the way through. Topologically, we're all toroids (assuming we haven't been shot through recently). Most animal life on Earth is. Katie Steckles, a lecturer in mathematics at Manchester Metropolitan University in the U.K. and a freelance mathematics communicator, told Live Science that mathematicians "use the term 'hole' to mean one like the hole in a donut: one that goes all the way through a shape and out the other side." Look, I don't care if you call it doughnut or donut. The former is more British; the latter is more US. Just do try to keep it consistent, and if you're quoting a Brit, use the former. 
Or do what I do, and say "bagel" instead. But if you dig a "hole" at the beach, your aim is probably not to dig right through to the other side of the world. Totally tried to do that when I was a kid. It's good to have goals. Similarly, mathematical communicator James Arthur, who is based in the U.K., told Live Science that "in topology, a 'hole' is a through hole, that is you can put your finger through the object." Um. Phrasing? And if you ask people how many holes a straw has you will get a range of different answers: one, two and even zero. This is a result of our colloquial understanding of what constitutes a hole. Are... are you telling me language can be ambiguous? Say it ain't so! In topology, objects can be grouped together by the number of holes they possess. For example, a topologist sees no difference between a golf ball, a baseball or even a Frisbee. And I knew that, obviously, but it's also another excuse for people to grumble about "common sense," as if that were a thing that existed. Armed with the topologists' definition of a hole, we can tackle the original question: How many holes does the human body have? Let's first try to list all the openings we have. The obvious ones are probably our mouths, our urethras (the ones we pee out of) and our anuses, as well as the openings in our nostrils and our ears. For some of us, there are also milk ducts in nipples and vaginas. At least they addressed the 12-year-old directly and shut down its gigglesnorts with all kinds of formal medical words. Unfortunately, that sentence needs another comma near the end. In total there are potentially millions of these openings in our bodies, but do they all count as holes? This is a bit like asking if a tomato is a fruit or a vegetable, in that a scientist will give you a different answer than a sous-chef. "They're not actually holes in the topological sense, as they don't go all the way through," Steckles said. "They're just blind pits." Again. Phrasing. 
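For the record, the topologists' hole count isn't hand-waving; it's computable. Here's a minimal sketch (my own illustration, in Python, not anything from the article or from Steckles): for any shape that squishes down (deformation-retracts) to a graph, the number of topological holes, the first Betti number, is just edges minus vertices plus connected components.

```python
# Counting topological holes for shapes that squish down to a graph:
#   b1 = edges - vertices + components
# Illustrative sketch only; the shapes-as-graphs modeling is my assumption.

def count_holes(vertices, edges):
    """First Betti number of a (multi)graph: vertex list + edge-pair list."""
    parent = {v: v for v in vertices}

    def find(v):  # union-find root lookup, with path halving
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    for a, b in edges:
        parent[find(a)] = find(b)
    components = len({find(v) for v in vertices})
    return len(edges) - len(vertices) + components

# A straw (or donut, or coffee mug) squishes down to one loop: one hole.
straw = count_holes([0, 1, 2, 3], [(0, 1), (1, 2), (2, 3), (3, 0)])

# A "blind pit" squishes down to a single point: zero holes.
pit = count_holes([0], [])

# A figure-eight (two loops sharing a point): two holes.
eight = count_holes([0, 1, 2, 3, 4],
                    [(0, 1), (1, 2), (2, 0), (0, 3), (3, 4), (4, 0)])

print(straw, pit, eight)  # 1 0 2
```

The formula just counts independent loops, which is why a blind pit contributes nothing: it flattens away without closing up any loop. That's exactly Steckles's point.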
A pair of underwear, for example, has three openings (one for the waist and one for each of the two legs), but it's not immediately clear how many holes a topologist would say it has. And again, it probably depends on the sex and/or gender of the person wearing it. And here comes the 12-year-old, giggling again. So the mathematician's answer is that humans have either seven or eight holes. And my answer? None. Think about it: how many holes does your house have? You can ignore the drafty cracks for my purposes; I'm talking about, like, windows and doors. Open one window: no holes in the topological sense. Open a window and a door: suddenly you have a topological hole. Open three windows, and you get the situation the article refers to with underpants. Might be complicated if you also consider water and sewer systems. And your digestive tract is, also, usually closed at at least one end, like a door or a window in your house. So while you could, technically and topologically, thread a string from mouth to asshole (preferably not the other way around), in practice, we're usually pretty closed off, apart from respiratory functions. So, again: it's all about how you look at it. And if you're 12, this shit is funny as hell. |
| Yes, sometimes I link to Outside. It's better than actually going outside. Can Jumping 50 Times Each Morning Actually Improve Your Health? Here’s what the science says about the Internet's latest trend. I dunno about science, but I have some idea about what your downstairs neighbors would say. You’ve tried everything to feel more awake in the mornings—caffeine, sunlight, water, stretching—but no matter what, you still feel groggy and unready to face the day. Have you tried attuning your schedule to your chronotype, instead of trying to fit your chronotype into someone else's schedule? Yeah, yeah, I know, few have the privilege of being able to do that. I certainly did not for most of my life. There’s one thing you probably haven’t tried that’s taking social media by storm: jumping. If it's "taking social media by storm," a) I'd be the last to hear about it and b) I'd immediately distrust it, like I did the "walking backwards" fad from, what, a year ago? Two? Now, even though I don't practice this these days, I can accept that some exercise is better than no exercise. I can also accept that, sometimes, you gotta try something new to break up your routine a bit. As near as I can tell, if you don't live above someone else, or if you can do it (shudder) outside, there's nothing inherently wrong with this, and it doesn't make you look as dumb as walking backwards does. And yet, I'd still shun it, simply because it's a trend. The article goes on to list the "benefits" of this particular exercise. I won't rehash them here. Just assume I'm skeptical. Not in the denial sense, but in the "I'm not going to trust this one source" sense. Who Should Skip the Jumping This section header is the actual reason I saved this article. Skip? Jumping? I'm dying over here. 
You might want to think twice about participating in this trend if you have a weak pelvic floor, significant knee, hip, ankle, or foot pain, Achilles tendinopathy, plantar fasciitis, recent sprains, a history of stress fractures, or balance issues, Wickham says. I admit, though, that putting this here assuages some of my skepticism. To get the most out of your jumps, jump 50 times in place at a rapid, consistent speed, making sure to drive through the balls of your feet and land softly on the balls of your feet. If I tried that right now, I'd end up in the hospital. Meanwhile, I'll continue my usual jumping exercises: the ones that lead me to conclusions. |
| Speaking of time, there's this article from aeon: The shape of time In the 19th century, the linear idea of time became dominant, forever changing how those in the West experience the world It would be horribly remiss of me if I didn't include this famous quote: "People assume that time is a strict progression of cause to effect, but actually from a non-linear, non-subjective viewpoint - it's more like a big ball of wibbly-wobbly... timey-wimey... stuff." — The Doctor ‘It’s natural,’ says the Stanford Encyclopedia of Philosophy, ‘to think that time can be represented by a line.’ We imagine the past stretching in a line behind us, the future stretching in an unseen line ahead. I have heard that there is a culture, perhaps in Australia, or maybe Papua New Guinea or South America (I don't recall), where they think of the future as behind them and the past as in front of them. This is, if I remember right, because you can "see" the past but you cannot "see" the future. So, no, it's no more "natural" to "think that time can be represented by a line" than it is to think of time moving from left to right on a page, the way Westerners read language. Perhaps those who write from right to left see time progressing from right to left. Even writing is arguably not "natural." However, this picture of time is not natural. Its roots stretch only to the 18th century, yet this notion has now entrenched itself so deeply in Western thought that it’s difficult to imagine time as anything else. Except, perhaps, as a big ball of wibbly-wobbly, timey-wimey stuff. Let’s journey back to ancient Greece. Amid rolls of papyrus and purplish figs, philosophers like Plato looked up into the night. His creation myth, Timaeus, connected time with the movements of celestial bodies. The god ‘brought into being’ the sun, moon and other stars, for the ‘begetting of time’. They trace circles in the sky, creating days, months, years. 
While it seems to be true that Western culture borrows a lot from ancient Greece, there really were other cultures in the world. I'd think the whole "this started with ancient Greece" thing would have fallen out of favor by now. Guess not. Such views of time are cyclical: time comprises a repeating cycle, as events occur, pass, and occur again. I can kind of understand why people would think time is a cycle. As the article notes, things do seem to have cycles: day/night, moon, year, planet alignments, etc. But the idea that "events occur, pass, and occur again" just seems wrong to me. Even though there's, e.g., Groundhog Day every year, not every Groundhog Day is the same. It’s even hinted at in the Bible. For example, Ecclesiastes proclaims: ‘What has been will be again … there is nothing new under the sun.’ Of all the laughably wrong things in the Bible, "there is nothing new under the sun" might well be the most laughably wrong. Well, right up there with "we live on a flat Earth between two waters," anyway. Maybe also with "there was a global flood in human history." And yet, like Greek ideas, it's part of culture. To quote the Battlestar Galactica remake: "All of this has happened before. All of this will happen again." Importantly, medievals and early moderns didn’t literally see cyclical time as a circle, or linear time as a line. Yet in the 19th-century world of frock coats, petticoats and suet puddings, change was afoot. Gradually, the linear model of time gained ground, and thinkers literally began drawing time as a line. I believe it's important to note that, whether we conceive of time as cyclical, linear, wibbly-wobbly, or anything else we can come up with, this is a matter of perception, not reality. No one knows what time really is. People have guesses, and they'll tell you their guesses with great confidence, such as "time is like a river," but as we saw yesterday, some rivers are more rivery than others. But no. Time is time. 
The only thing I can say with great confidence is what it's not: an illusion. It may well be an emergent property of something deeper, but then, so is the chair you're sitting in right now. A crucial innovation lay in the invention of ‘timelines’. As Daniel Rosenberg and Anthony Grafton detail in their coffee-table gorgeous Cartographies of Time (2010), the ‘modern form’ of the timeline, ‘with a single axis and a regular, measured distribution of dates’, came into existence around the mid-18th century. In 1765, the scientist-philosopher Joseph Priestley, best known for co-discovering oxygen, invented what was arguably the world’s first modern timeline. What this brought to mind for me was the Periodic Table. Elements, like oxygen, exist with or without the Periodic Table, but Mendeleev's invention helped us visualize their relationships with each other, much like a timeline helps us visualize past events in relation to one another. Rosenberg and Grafton describe A Chart of Biography as ‘path-breaking’, a ‘watershed’. ‘Within very few years, variations on Priestley’s charts began to appear just about everywhere … and, over the course of the 19th century, envisioning history in the form of a timeline became second nature.’ Priestley’s influence was widespread. For example, William Playfair, the inventor of line graphs and bar charts, singled out Priestley’s timeline as a predecessor of his own work. I say this gives short shrift to Descartes, who basically invented graphs in the early 17th century. See, already I'm putting events on a timeline. The second key development concerns evolution. During the early 19th century, scientists created linear and cyclical models of evolutionary processes. For example, the geologist Charles Lyell hypothesised that the evolution of species might track repeatable patterns upon Earth. 
This led to his memorable claim that, following a ‘cycle of climate’, the ‘pterodactyle might flit again through the air.’ However, with the work of Charles Darwin, cyclical models faded. His On the Origin of Species (1859) conceives of evolution in linear terms. It literally includes diagrams depicting species’ evolution over time using splaying, branching lines. I think that once we realized that entropy only goes in one time direction, the old idea of cycles of time had to go right out the window. Entropy and time are intertwined, and physics's best guess as to the nature of time right now is that it is entropy. Now, I know some people mistakenly believe that evolution goes against entropy, but that discussion is outside the scope of this entry. The last development stemmed from mathematics: theories of the fourth dimension. Humans perceive three spatial dimensions: length, width, and depth. But mathematicians have long theorised there were more. In the 1880s, the mathematician Charles Hinton popularised these ideas, and went further. He didn’t just argue that space has a fourth dimension, he identified time with that dimension. Now that was something I wasn't aware of. I knew the idea of "spacetime" preceded Einstein, but I don't think I'd ever heard of Hinton. Nowadays, of course, mathematicians like to play with way more than four dimensions, and apparently, something like 10 (or 26, for the bosonic version) are required for string theory (which, if anything in science can be said to be "only a theory," it's string theory). Within history, conceiving of time as a line helped to fuel the notion that humanity is making progress. Joseph Priestley, our timeline inventor, is partly responsible for this. The man once listed inventions that have made people happier, including flour mills, linen, clocks, and window glass. Sadly, Priestley lived before sliced bread and the "Skip Intro" button. Within philosophy, conceiving of time as a line led to thinkers debating the reality of the past and future. 
Whereas I assert that only the past is real; the present is an illusion created by the very recent past, and the future doesn't even rise to the level of illusion, as it does not exist at all and won't until the past catches up. But I acknowledge that this, too, is a matter of perception and point of view. I've gone on long enough for today (see, I made a time reference there). The article also goes on for a while, but it's an interesting read. And an appropriate one for an outlet named aeon. |
| Some questions may not have meaningful answers, such as this one from LiveScience: What's the oldest river in the world? The oldest river predates the dinosaurs. But how do we know this? First, you have to define what you mean by "river," and that can be harder than it sounds. The dictionary definition (at least the first one I found) is: "a large natural stream of water flowing in a channel to the sea, a lake, or another such stream." (Oxford) So you've got "large," which is a judgement call; "natural," which is fuzzy; "stream," which implies flowing, but lots of water flows and some rivers sometimes don't; and "water," which rules out, for example, the L.A. River (most of the time); "flowing," which I say is redundant after "stream;" and "channel," which seems straightforward enough until you consider that some rivers are braided and/or deltaed with multiple channels. And then you have bodies like the Potomac River, which for much of its lower reaches, all the way up past DC, isn't so much a river as a tidal estuary that happens to be fed by a higher river. Oh, but that's not all. Dictionary definitions don't cut it here in this blog. I can use them as examples, but they don't resolve arguments. You know the old saying, "You can't step into the same river twice?" I think it's supposed to be about how things change over time. Water goes in, water flows out, evaporation happens, shores get eroded, sandbars form, megatons of soil get transported, etc. Thing is, rivers (and other streams) don't just change over time; they, like living bodies, are in a constant state of flux ("flux," incidentally, shares a root with "flow" and "fluid"). Consequently, I say you can't step into the same river once. Because between the time your foot touches the surface and the bottom, the river has already changed. Hell, the mere act of stepping into it changes it, however minimally. So when you're asking a question like the one in the headline, you have to be careful. 
Rivers may seem as old as the hills, but they have life cycles just like other natural features do. Yeah. Like hills. Some rivers last longer than others, however. So which river is the oldest in the world today? Remember that a river isn't just its water. Sometimes, it's not even its water, but just its channel, such as the aforementioned L.A. River (which also stretches "natural" to its natural breaking point). Channels change over geological time, though, carved and altered by water flow and other processes such as continental drift. The winner is older than the dinosaurs: The Finke River in Australia, or Larapinta in the Indigenous Arrernte language, is between 300 million and 400 million years old. I'm certainly not going to argue about that, though. Australia is a remarkably stable continent (or island or whatever name you slap on the land mass). If I remember right, some of the oldest rocks in the world are also found there, presumably guarded by dangerous wildlife, but don't get me started on how they define how old a rock is. The arid conditions in the center of the continent mean the river flows only intermittently; most of the year, it exists as a string of isolated water holes. See? There's a whole lot of semi-technical geological explanation for how they figured it out at the article. While I have some experience with geology, it was rather secondary to hydrology in my education, so I'm not going to quibble about it. It is interesting, at least to me. But no quotes here. "Rivers can disappear if a massive influx of sediment overwhelms them (e.g., volcanic eruptions) or if topography changes so dramatically that the flowing water takes a new course across the landscape (e.g., glacial advance and retreat)," Ellen Wohl, a geologist at Colorado State University, told Live Science in an email. Pretty sure there's more that can change or destroy a river. In the case of the Finke, Australia has been an unusually stable landscape for a very long time. 
Resting in the middle of the Australian Plate, the continent has experienced virtually no significant tectonic activity for the past several hundred million years, Baker explained. Like I said. Only with more detail. If the Finke ever dries up, the runner-up may be the New River, which today is about 300 million years old, Baker said, and runs through Virginia, West Virginia and North Carolina. And so we get to the final bit in the article, and the main reason I'm sharing this. Australia is on almost the opposite side of the world from me, and I've never been there, but the New River is practically in my backyard, globally speaking. I've known about its ancient age since college, when I took the aforementioned geology and hydrology courses. Unlike most Virginia rivers, it doesn't flow into the Chesapeake Bay and thence into the Atlantic; instead, it's part of the Mississippi River basin. Which technically flows into the Atlantic, too, but via the Gulf of Mexico. And unlike the Finke / Larapinta, the New River is always wet. And flowing. They take people whitewater rafting on it. Not me, obviously. But people. In the interest of full disclosure, I should note this quote, which has cited sources, from the New River article. The irony, of course, is that it's called the New River, and that's what I find endlessly amusing. |
| I'm linking to Lifehacker today. Yeah, yeah, I know. Bear with me. Let me guess: 1) You're 2) getting 3) money 4) for 5) this. Quality power tools are an investment, and if you take proper care of them, they’ll last a long time. It's been a while since I bought power tools, so I'm not even sure which brands can be trusted, these days. But power tools have seen a lot of advancement in recent years. While your old warhorses might still perform their core function well enough, if your drills, saws, and other power tools are five years old or older, it’s time to consider upgrading to a more modern version, for a range of reasons. Seriously, this strikes me less as helpful advice and more as tool companies paying for an ad that looks like an article. And, indeed, they mention some brands by name in the article. But let's see what they come up with: Advances in battery technology I suppose this is fair enough. But if you've purchased a battery-powered tool of any kind, hopefully you're aware that the battery isn't going to last forever, regardless. Such tools are going to need to be replaced sooner than corded ones, in general. Improved ergonomics This feels like a stretch. Get it? Ergonomics? Stretch? No? No. I'll be here all week. And yeah, it's looking more and more like a paid ad. More powerful motors Uh huh. If it was, and remains, adequate for what you need it for, are you just upgrading because you're a Manly Man Who Must Have More Power? Better safety features Seems to me that the best safety feature is familiarity (provided one doesn't get complacent). Smart technology Until it can do the job on its own, I'm not interested. I really didn't have much else to say, today. Just that stealth advertising sucks. |