Items to fit into your overhead compartment |
Okay, BBC again today; I expect everyone has heard the "signs of extraterrestrial life" story by now. For starters, the announcement followed the trajectory I expected: from "we found evidence of the signature of a gas in an exoplanet's atmosphere that, on Earth, is only produced by living organisms" (close to the truth) to "they found evidence of life" to "hey, they found aliens!" ...the news that signs of a gas, which on Earth is produced by simple marine organisms, has been found on a planet called K2-18b. At least the BBC isn't blowing it out of proportion. K2-18b is totally a Star Wars droid name, though. Now, the prospect of really finding alien life - meaning we are not alone in the Universe - is not far away, according to the scientist leading the team that made the detection. Okay, well, he'd know better than I would, but it seems to me that a flicker of light in a spectroscope isn't the same thing as finding alien life. And I really wish they'd worded that better, because in the popular imagination, "alien life" translates right to "little bald dudes with big eyes, death rays, and flying saucers." When what they really mean is "microbes." But all of this prompts even more questions, including, if they do find life on another world, how will this change us as a species? Ah, and now we get into the cutting-edge philosophical question. You know. The one science fiction has been exploring for over a century now. The one we've been mulling over at least since Schiaparelli spotted illusory channels on Mars and called them canali, which got translated to English as "canals," so of course everyone thought "Martians." And yet they present it as if it's some sort of strange, new idea. To its credit, the article does talk about the Mars thing. But decades on, what has been described as "the strongest evidence yet" of life on another world has come, not from Mars or Venus, but from a planet hundreds of trillions of miles away orbiting a distant star. I'd be remiss if I didn't note that, speaking of Venus, there was a big announcement a few years ago about finding spectroscopic evidence of life-produced gases in the thick, steamy atmosphere of that planet. Which turned out to be false and was summarily retracted. If we can get false positives from our closest orbital neighbor, I'm just that much more skeptical about finding it on Star Wars Droid Planet dozens of light-years away. Skeptical, I'll emphasize, doesn't mean "in denial." I'd love for it to be a solid discovery. I've said before that I really hope that we find indisputable evidence of extraterrestrial life during my own lifetime. Thing is, though, that it will be, as this article hints, a paradigm shift in our understanding of the Universe, and so the evidence needs to be more than circumstantial. "Extraordinary claims require extraordinary evidence," and all that. But just because I want it to be true doesn't mean I'm going to fall for hype. As these so-called exoplanets were being discovered, scientists began to develop instruments to analyse the chemical composition of their atmospheres. Their ambition was breathtaking, some would say audacious. The idea was to capture the tiny amount of starlight glancing through the atmospheres of these faraway worlds and study them for chemical fingerprints of molecules, which on Earth can only be produced by living organisms, so-called biosignatures. We don't know everything. This is a good thing, because it fuels exploration.
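Just to put "tiny amount of starlight" into perspective, here's a back-of-the-envelope sketch I threw together myself. To be clear, these are my rough numbers, not the BBC's or the team's: K2-18b is something like 2.6 Earth radii across, its host star is a red dwarf around 0.45 solar radii, and the "effective atmosphere thickness" is a flat-out guess.

```python
# Rough sketch of how small a transmission-spectroscopy signal is.
# All numbers are approximate and from memory; treat this as illustration only.

R_EARTH_KM = 6371
R_SUN_KM = 696_000

r_planet = 2.6 * R_EARTH_KM   # K2-18b radius, km (approximate)
r_star = 0.45 * R_SUN_KM      # host star radius, km (approximate)

# Transit depth: the fraction of starlight blocked when the planet crosses the star.
transit_depth = (r_planet / r_star) ** 2

# The atmosphere adds a thin translucent ring around the planet's disk.
# 300 km is a guess at how thick the detectable part of that ring is.
atmosphere_km = 300
atmosphere_signal = 2 * r_planet * atmosphere_km / r_star ** 2

print(f"Transit depth: {transit_depth * 1e6:.0f} parts per million")
print(f"Extra dip from the atmosphere: {atmosphere_signal * 1e6:.0f} ppm")
```

That works out to the whole planet blotting out roughly a quarter of a percent of the star's light, with the atmosphere's chemical fingerprint adding maybe a hundredth of a percent on top, split across different wavelengths. That's the whisper they're trying to read from dozens of light-years away. And, like I said, we don't know everything.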
But in this case, it means that just because a gas is only farted out by life here on Earth, that doesn't mean life is the only way to produce it. So, to me anyway, simply finding a biosignature is promising, but it's not enough for indisputable evidence of ET life. Prof Madhusudhan, however, hopes to have enough data within two years to demonstrate categorically that he really has discovered the biosignatures around K2-18b. And I "hope" to have enough money within two years to buy a Central Park West penthouse. But even if he does achieve his aim, this won't lead to mass celebrations about the discovery of life on another world. Instead, it will be the start of another robust scientific debate about whether the biosignature could be produced by non-living means. That's my point: finding these biosignatures is like a big "let's look at this more closely" signal. As the article notes, we've found nearly 6000 exoplanets (not to mention the other seven in our backyard, plus several potentially life-harboring moons). The signs help us decide which ones to focus on for more observations and data. A much more definitive discovery would be to discover life in our own solar system using robotic spacecraft containing portable laboratories. Any off-world bug could be analysed, possibly even brought back to Earth, providing prima facie evidence to at least significantly limit any scientific push back that may ensue. We've had this, too. I remember a meteorite they found in Antarctica, determined to have been blasted off Mars by some ancient impact there, that contained features associated with life. As with the Venusian atmosphere thing, this turned out to be a false positive. The European Space Agency's (ESA) ExoMars rover, planned for launch in 2028, will drill below the surface of Mars to search for signs of past and possibly present life. Given the extreme conditions on Mars, however, the discovery of fossilised past life is the more likely outcome. And look, let's not underplay that. Even finding fossilized (as we spell it here) former life on Mars would be a Big Fucking Deal. But again, it hasn't happened yet. Nasa is also sending a spacecraft called Dragonfly to land on one of the moons of Saturn, Titan, in 2034. It is an exotic world with what are thought to be lakes and clouds made from carbon-rich chemicals which give the planet an eerie orange haze, bringing The Beatles' song, Lucy in the Sky with Diamonds, to mind: a world with "marmalade skies". A side note: I've been playing the video game Starfield off and on for the past couple of years. It's not as good as Bethesda's prior offerings (Skyrim and Fallout 4, e.g.), but that's irrelevant. What's relevant is you can, in the game, fly around to other planets, moons, and star systems and find elements to mine to sell or to use in crafting. And one of the elements you can find on Titan is titanium. I don't think there's any scientific basis for that placement; I just find it to be an amusing pun. But I digress. The BBC article then emphasizes (or emphasises) what I consider to be the most important point in all of this: If simple life forms are found to exist, that is no guarantee that more complex life forms are out there. Prof Madhusudhan believes that, if confirmed, simple life should be "pretty common" in the galaxy. "But going from that simple life to complex life is a big step, and that is an open question. How that step happens? What are the conditions that govern that? We don't know that.
And then going from there to intelligent life is another big step." I've been saying that for years. I do wish he hadn't used the word "intelligent," though. That just begs someone to make the self-contradictory joke about not finding any intelligent life on Earth. So I'll just add that "intelligent" doesn't automatically mean they build telescopes and rockets (or flying saucers); lots of species here on Earth are intelligent without the capability or desire to do that. Dr Robert Massey, who is the deputy executive director of the Royal Astronomical Society, agrees that the emergence of intelligent life on another world is much less likely than simple life. Of course, I could be wrong, but it is nice to have some backup from actual scientists. Here's Massey: "When we see the emergence of life on Earth, it was so complex. It took such a long time for multi-cellular life to emerge and then evolve into diverse life forms. "The big question is whether there was something about the Earth that made that evolution possible. Do we need exactly the same conditions, our size, our oceans and land masses for that to happen on other worlds or will that happen regardless?" Not mentioned: as I understand things, the grand evolutionary leap that made what he's calling "complex" life possible was the merging of an archaeon and a bacterium, creating a more energy-efficient cell and leading, eventually, to every life-form we can see on Earth today: fish, trees, cats, and us, for example. The simple cells are called prokaryotes; the more complex ones with the nucleus and organelles and internal structure you might remember from high school biology are called eukaryotes. There is some question about whether that happened only once, or several times, but either way, it took a long damn time to happen. One of the many cool things about finding ET life, even simple life, will be the data it provides to help us understand that remarkable upgrade. As he puts it, centuries ago, we believed we were at the centre of the Universe and with each discovery in astronomy we have found ourselves "more displaced" from that point. "I think the discovery of life elsewhere would further reduce our specialness," he says. And that's where I diverge from him philosophically. Just being able to ask these questions makes us special. Potentially being able to answer them would only increase our specialness. In my opinion. At the same time, though, I think I understand the underpinning of his assertion: that humans don't get to proclaim that the universe was formed specifically for them. Never before have scientists searched so hard for life on other worlds and never before have they had such incredible tools to do this with. And many working in the field believe that it is a matter of when, rather than if, they discover life on other worlds. I can't argue with that. So, yeah, that's a lot of words, both in the article and here. And I've touched on many of these points in past entries, but this recent discovery prompted me to revisit my arguments. But if you skipped all the text until now, I would ask that you just remember this: finding simple life off-Earth does not mean there are other cultures Out There pointing telescopes back at us or getting ready to invade or whatever. But finding so much as a microbe, or its alien equivalent, would change our perspective in a big way. |
Occasionally, I like to share articles about the most profound inquiries into reality. Like this one from BBC: When you see pasta, your brain probably doesn't jump to the secrets of the Universe. Then you don't know my brain. Though it's usually beer that makes it jump to the secrets of the Universe. You might think physicists only ask the big questions. Yes, and this is one of them. But physicists, of course, have ordinary lives outside of the laboratory, and sometimes their way of questioning the Universe spills over to their daily habits. There's one everyday item that seems to especially obsess them: spaghetti. Come on, now. Just admit you like it. You don't have to claim it's "for science" like I do when I concoct a new cocktail recipe. The steady torrent of spaghetti science helps to demonstrate that deep questions lurk in our ordinary routines, and that there are plenty of hungry physicists who can't stop asking them. Me, I've always wondered why tomato sauce is hotter than anything else around it. It's basically food napalm. Like, maybe you have a pizza in the oven. You can pull that sucker out and touch the crust right away. Get any of the sauce on your hands, though, and you're looking at second-degree burns at least. Yes, I've looked it up and it has to do with heat capacity and water content combined with viscosity, or some shit like that, but that just pushes the question over to the next subject. Italian food: fueling scientific inquiry since 200 BCE. For example: how thin can spaghetti get? The article answers this question, though not without confusing the issue by using both SI and imperial units. Just stick with millimeters, okay? Spoiler: the thinnest comes in at 0.4mm. But recently, a team of researchers at the University College London wondered if 21st Century lab equipment could do better. They used a technique called "electro-spinning". First, they dissolved flour into a special, electrically charged solution in a syringe. Then they held the syringe over a special, negatively-charged plate. That's all very special. The world's thinnest spaghetti is just one recent example of how physicists can't seem to stop plying their tools on everybody's favourite carb. But physicists using their noodle on their noodles is no new thing. I would have been severely disappointed if the BBC had not made the "using their noodle" pun. You may think the BBC is a stodgy and serious news outlet, but I know better. Hell, their video player has a volume control that goes to eleven. In 1949, Brown University physicist George F Carrier posed "the spaghetti problem" in The American Mathematical Monthly, which he deemed to be "of considerable popular and academic interest". Essentially, the problem amounts to: "Why can't I slurp up a strand of spaghetti without getting sauce on my face"? His problem is he was American. Only Americans slurp spaghetti. Maybe Brits, too, but definitely Americans, and it's Wrong. You're supposed to twirl it around the fork and eat it neatly. This leads to another massive physics problem, though, which is the mathematics of strand entanglement. For now, no theoretical physicist has attempted the more complicated problem of two dogs slurping from either end of the same spaghetti strand. See? What'd I tell you? A Disney movie reference, of all things. The great mid-century American physicist Richard Feynman helped unlock the riddles of quantum mechanics, explaining how the elementary particles that make up atoms interact with one another. 
But Feynman's enormous contribution to spaghetti physics is less widely known. One night, Feynman wondered why it's almost impossible to break a stick of spaghetti into two pieces instead of three. I'd actually seen an article about this phenomenon before. I might have even blogged about it; I can't remember. But again, though Feynman was undeniably a genius (I believe he was even smarter than Einstein), he was also American, and thus didn't know that you're not supposed to break the spaghetti before cooking it. My (Italian-American) mother taught me to break a bundle of dry spaghetti in half before putting it in boiling water, so it fits horizontally in the pot. Yeah, well, it must be the "-American" part that thought that was a good idea. I guess Feynman did the same, but it's an outrage to many of the world's spaghetti-eaters. Kind of like putting pineapple on pizza. There are a few other examples of spaghetti science, then: Spaghetti physics even goes beyond the pasta itself – sauce is loaded with its own scientific mysteries. When eight Italian physicists met while doing research abroad in Germany, they found a shared frustration in the classic Roman dish cacio e pepe. Such a simple dish with few ingredients, and yet very difficult to perfect. "This is actually a very interesting problem," says Daniel Maria Busiello, co-author on the cacio study. "So we decided to design an experimental apparatus to actually test all these things." The "apparatus" consisted of a bath of water heated to a low temperature, a kitchen thermometer, a petri dish and an iPhone camera attached to an empty box. They invited as many hungry friends as they could find to Di Terlizzi's apartment and hunkered down to cook a weekend's worth of cacio e pepe. That. That is why I love science. The physics they used connects the clumping of cacio e pepe to ideas about the origin of life on Earth. And that. Not mentioned in the article: strand entanglement. I really want to know if they've solved the math behind that. It has nothing to do with quantum entanglement, though. ...or does it? |
I'm back from my trip to NYC. Gotta say, the skyline still doesn't look right to me without the Twin Towers. Speaking of places that are no longer there, here's a list from Mental Floss: Not even a compass, no, but those of us who read science fiction and fantasy are used to traveling to places that don't exist. At least in imagination. There are places that are hard to get to, places that are less explored than others, and places you’re forbidden from visiting... One of my other perennial sources, Atlas Obscura, is pretty good at the "hard to get to" and "less explored" places. As for "forbidden," sometimes I want to visit anyway, but I'm not bold or sneaky enough to do so. ...and then there are places that you might want to go to, but they no longer exist. From land masses wiped out by changing climates to waterfalls erased by human action, here are 10 spots you won’t be able to put on your vacation bucket list. I've used the term "bucket list" unironically before, so I don't have an inherent aversion to the term, but I'm not sure it's necessary here. I also have a "fuck-it list" for places that I intend to go if/when the mood strikes. Now, I should note two things before continuing: 1) There's a YouTube video embedded in the article. I didn't watch it. I'd rather read than watch. But it's there if you feel differently; it appears to cover the same topic. 2) There are also helpful pictures in the article, which I won't reproduce here. So I'm just going to highlight a few I have something to comment on. 2. The Pink and White Terraces New Zealand was once home to what was widely called the Eighth Wonder of the World: the Pink and White Terraces. I'd like to give them credit for the name. But I can't. It's descriptive and all, but if you're not going to use the native name for a thing, at least come up with something more creative. I'd probably have called them the Hello Kitty terraces, but these days, maybe the Barbie terraces. The Māori had long valued the Pink and White Terraces; they viewed them as taonga, meaning “a treasure.” Which still doesn't tell me what the Māori actually called them. [Image from Wikipedia.] Until 1886, that is. On June 10 of that year, Mount Tarawera erupted. A brief search didn't verify this, but at least that volcano name seems to be partly Māori. But my main point here is that it's not always humans at fault for making places disappear; geology does a good job of that by itself. 3. Rungholt There was once an island named Strand off the northwestern coast of what’s now northern Germany. In January of 1362, a cyclone known as the Grote Mandrenke, or “Great Drowning of Men,” caused a storm surge that wiped parts of the island off the map. With them went the medieval town of Rungholt. For centuries after Rungholt’s disappearance, people spoke of it as if it were a mythical lost city (its remains may have been found in 2023). It is possible that several "mythical lost cities" had their origin in real cities wiped out by natural disasters. 4. East Island People aren’t the only ones who suffer when islands disappear. After a 2018 hurricane, East Island—part of the French Frigate Shoals of the Hawaiian Islands—was swallowed by the sea. I used to be under the impression that hurricanes were Atlantic and, if a tropical cyclone formed elsewhere, it was called something else, like a typhoon. Turns out "hurricane" is apparently the right nomenclature for Hawai'i (central Pacific) as well. And the northeastern Pacific. 5.
Doggerland Doggerland was a large swath of land that once connected Great Britain to continental Europe. I've been wondering about that place since I first heard of it. Is it the origin of some flood myths? Has anyone done underwater archaeology there? (Turns out the answer to the second question is yes.) 9. Old Man of the Mountain For centuries, an old man’s face loomed over New Hampshire, peering out from the side of Cannon Mountain. The Indigenous Abenaki called him “Stone Face,” while the white settlers referred to him as the “Old Man of the Mountain.” Except it wasn’t an old man at all: It was a rock. Well, at least this one lists the Native name. Or a translation of it. Anyway, I remember when it crumbled. Not that I was there, but it was all over the news in 2003, a stark reminder that everything is ultimately ephemeral. That, I say, is how we know it's real. 10. Nuna Supermountains The Nuna supermountains stretched across an entire supercontinent and formed roughly 2 billion years ago. That one, I don't remember. I was too young. Have you been somewhere that no longer exists? I'd bet you have. For me, the WTC towers (as mentioned way up at the top here) are a notable example, but there are also some bars I used to go to that I dearly miss. |
Today, we'll talk about some moons that aren't The Moon, for once. This article, from PopSci, concerns something you may or may not have heard about last month: Which planet has the most moons? Saturn dethrones Jupiter. The International Astronomical Union recognized 128 newly discovered moons orbiting the ringed planet. Just to be clear, I'm not questioning the finding. What I am going to question is the definition of "moon." The ringed gas giant Saturn has officially replaced Jupiter as the planet in our solar system with the most moons. The International Astronomical Union officially recognized 128 new moons orbiting Saturn, bringing the new total up to 274 moons. And that's certainly a lot of moons. The moons were discovered by a group of astronomers from Taiwan, Canada, the United States, and France. Between 2019 and 2021, they used the Canada-France-Hawaii Telescope to repeatedly monitor the sky around Saturn. While one could wish for a more creative name, at least "Canada-France-Hawaii Telescope" is descriptive. As of February 2024, Jupiter has 95 moons. By comparison, Mercury and Venus are moonless, Earth has one moon, and Mars has two. Uranus and Neptune have 28 and 16 known moons, respectively. Despite not technically being a planet anymore, Pluto has five moons. And here's where I'm going to get picky. No, not about Pluto's designation. I honestly don't care what they call it. I understand why it got demoted, and can't fault the logic. But the point is there was a method to it. They decided what should constitute a planet, and Pluto didn't make the cut with the new definition. Also, if you want to get really technical about it, Pluto's largest satellite, Charon, is more like a companion world; they orbit a point between the two of them. Thing is, okay, so we have this definition of "planet" that excludes Pluto, Charon, Ceres, Vesta, Eris, etc. It's not solely about size, but the size of the world is a factor. So we get to "moon." The 128 new Saturnian satellites are all considered irregular moons. These are objects that orbit their host planet on an elliptical, inclined, or backwards path. Which, again, is fine, but at some point, don't you have to call them something else? The two moons of Mars are small and irregular, with odd orbits, and are probably captured asteroids. Some of the moons in the outer solar system are bigger than Mercury. There's a continuum in between. There's also a continuum of bodies orbiting a planet ranging from small-planet-sized all the way down to very small rocks, pebbles, grains of sand, even dust. And that's where my issue comes in. Saturn's rings have been known for a while now to be made up of really small chunks of mostly ice, though there are some larger bodies in there. Every one of those specks could be considered a moon, giving Saturn not 146, not 274, but probably millions of "moons." The Wikipedia bit on "moon" notes that there's no established lower size limit. One could argue, I suppose, that it's impossible for telescopes on or near Earth to resolve each speck of dust in Saturn's rings individually, so they shouldn't be called moons. But if so, come on, IAU: get together and agree on a definition like you did with "planet." We could use something new to argue about, because the Pluto thing is getting really stale. |
No! Not when I just learned it! Not really sure why this is in Atlas Obscura, but I'll run with it. It Might Be Time to Update the Old ‘Alfa-Bravo-Charlie’ Spelling Alphabet But it’s hard to break old habits. When someone on the phone—the doctor’s office, the bank, the credit card company—asks for my name, I always offer to spell it out—it’s a pretty uncommon surname. I've quit offering and just do it. This uses what’s called a “spelling alphabet,” or, confusingly, a “phonetic alphabet.” It is nearly impossible to distinguish, say, a B from a D, or an M from an N, without coming up with a word starting with one of those letters. But if you just pick one off the top of your head, you can make things worse. Like "B as in bed" gets heard as "so that's D as in dead?" That's why we have the standardized spelling alphabet. The British military came up with the first few examples, just for letters they found the most difficult: “P as in pip,” “B as in beer.” Yeah, the Puritanical Americans probably came up with something like "B as in boring." I can't complain too much, though; "Whiskey" is the official spelling word for W. A tremendous amount of research, time, and money was invested into figuring out the optimal spelling alphabet—at least for the three languages that the International Civil Aviation Organization (ICAO, the United Nations agency that handles air transportation) felt significant enough to have one (English, French, and Spanish). Perhaps we begin to see why some want to update the spelling alphabet: there are now many other languages in the chat. It’s certainly the most commonly used spelling alphabet in the world, but it is, as most of these alphabets are, exceedingly Anglocentric. And yet, they replaced Beer with Bravo, much to the detriment of, well, the world. Other languages have come up with their own spelling alphabets. Some needed wholly new ones, such as Russian, which uses the Cyrillic alphabet. “Г as in Григо́рий” is the Russian version of “G as in Gregory.” Japanese and Mandarin Chinese both have their own letter-based alphabets (Kana and Pinyin, respectively) in addition to their traditional logographic alphabets (in which symbols stand in for whole words or phrases, rather than just sounds). I must admit, it has crossed my mind in the past to wonder how the East Asian language speakers handled such things, but never enough to actually, you know, look it up. Some languages that use the Roman alphabet, as English does, have letters of their own. Take Æ in Danish and Norwegian, which is usually given “Æ as in Ægir,” a figure from Norse mythology. And also, like, how they handled accented letters in French and Spanish, which this article mentions briefly. Voice call quality has gone down over the past two decades. Yeah, it turns out no one really wanted to make phone calls, anyway. Sure, much communication has moved over to text, email, and social platforms, but everyone still needs to talk on the phone sometimes. And the most common phrase uttered in such a phone call is "What'd you say?" Independent of their use in military and aviation capacities, we sort of need spelling alphabets now more than ever. The problem is that what we’ve been given by the 50-year-old standard is deeply flawed for modern use. Only if you care about non-Anglophones. “We know in speech perception that frequent words are much more easily heard in noise than infrequent words,” says Hazan.
That's why it is a pretty poor choice to use, say, “S as in Sierra” (the standard) instead of “S as in sugar.” Yeah, I don't buy it. "Sugar" is pronounced with a very nonstandard 'sh' phoneme up front; "Sierra" is not. I'd pick "Sucks," myself, but that's not going to happen. Hazan, in 2006, was asked for a BBC Radio story to see if she could come up with a better spelling alphabet. There's more at the article, but basically: Turns out there was effectively no difference between the new, improved spelling alphabet and the old standard. If certain letters were in certain places in the nonsense combination, the new version might be more effective; in other places, the old version was. No difference! After all that! Hey, at least they were tested rather than just assumed to work, like with the older alphabets. This can be partly explained because people have just grown familiar with the whole “alfa-bravo-charlie” thing. It’s in books and movies, it’s just one of those things we absorb without thinking about it. And there's just something satisfying about saying Whiskey Tango Foxtrot, even though it's twice as many syllables as the original "What the fuck?" So, no, I don't think we need to come up with a new spelling alphabet, except in terms of expanding it to allow for different language alphabets. Besides, I just recently (almost) mastered this one, and I don't want to go through the whole memorization thing again. It would be like if English were to suddenly decide to simplify its spelling: sure it would be easier going forward, but all of us who have worked to learn and deal with the idiosyncratic spellings we have now wood be todaly steemed.
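Speaking of memorization: a spelling alphabet is, at heart, nothing more than a lookup table. Here's a toy sketch of that idea; the code-word mapping is the standard ICAO one, but the little helper wrapped around it is entirely my own throwaway invention and nothing official:

```python
# Toy spell-out helper using the ICAO ("alfa-bravo-charlie") alphabet.
# The mapping itself is the standard one; everything else here is just my own sketch.
ICAO = {
    "A": "Alfa", "B": "Bravo", "C": "Charlie", "D": "Delta", "E": "Echo",
    "F": "Foxtrot", "G": "Golf", "H": "Hotel", "I": "India", "J": "Juliett",
    "K": "Kilo", "L": "Lima", "M": "Mike", "N": "November", "O": "Oscar",
    "P": "Papa", "Q": "Quebec", "R": "Romeo", "S": "Sierra", "T": "Tango",
    "U": "Uniform", "V": "Victor", "W": "Whiskey", "X": "X-ray",
    "Y": "Yankee", "Z": "Zulu",
}

def spell_out(name: str) -> str:
    """Spell a name over a bad phone connection, one code word per letter."""
    words = []
    for ch in name.upper():
        if ch in ICAO:
            words.append(ICAO[ch])
        elif ch.isdigit():
            words.append(ch)  # digits are usually just read out as-is
        # anything else (spaces, hyphens, accented letters) gets skipped,
        # which is exactly the Anglocentrism problem the article complains about
    return " ".join(words)

print(spell_out("Waltz"))  # Whiskey Alfa Lima Tango Zulu
```

The "anything else gets skipped" bit is the article's whole complaint in miniature: twenty-six unaccented Roman letters and that's it. Sorry, Ægir. |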
Today, in "things we'd like to believe," from MedicalXpress: Well, sign me up! For some of those things, anyway. Never did get a taste for coffee. In all seriousness, though, we shouldn't be taking any of these nutrition science studies at face value as presented, whether they tell us what we want to hear or not. A diet rich in produce such as grapes, strawberries, açaí, oranges, chocolate, wine and coffee can reduce the risk of metabolic syndrome... Okay, even if this one study is definitive, what about other health issues besides metabolic syndrome? ...according to the findings of a study involving more than 6,000 Brazilians... Can't fault the sample size on this one, though. ...the largest in the world to associate the effects of consuming polyphenols with protection against cardiometabolic problems. "But, but, but, I can't pronounce polyphenols, so I shouldn't be eating them!" Seriously, though, at least they reveal the key chemical up front. Though calling it a chemical will freak some people out. I don't care. Everything you eat or drink contains nothing but chemicals. Polyphenols are bioactive compounds with well-known anti-oxidant and anti-inflammatory properties. Can't be arsed to look it up because I have limited time due to travel, but I'm pretty sure the polyphenols are what started the "red wine is good for you" craze a couple decades back. Since then, they've waffled back and forth on the subject, depending on what result whoever funded or did the study wanted to push. (This is why I do not trust nutrition science.) "This is good news for people who like fruit, chocolate, coffee and wine, all of which are rich in polyphenols. Although the link between consumption of polyphenols and a reduction in the risk of metabolic syndrome had already been identified in previous studies, it had never before been verified in such a large study sample [6,378 people] and over such a long period [eight years]," said Isabela Benseñor, a co-author of the article and a professor at the University of São Paulo's Medical School (FM-USP) in Brazil. To reiterate, though, what about other health issues? It's unlikely anyone's going to come out with a "fruit is bad for you" study, but fun-haters will definitely do everything they can to debunk the chocolate and wine part. And it's possible for an item to be good for you in some ways and bad in others. Like how aspirin has been shown to protect against heart attacks, but you have to balance that with the side effects of aspirin. As for coffee, it's still the only acceptable thing for Americans to be addicted to, because it aids productivity and allows people to function on less sleep. Gods forbid we actually enjoy something that makes us less functional. Detailed interviews based on questionnaires were conducted to find out about the participants' dietary habits and the frequency with which they ingested 92 polyphenol-rich foods. This is my other caution with nutrition science: methodology is often suspect. In this case, self-reporting was used, which is notoriously flaky. The main conclusion was that consumption of polyphenols from different foods at the highest estimated level (469 mg per day) reduced the risk of developing metabolic syndrome by 23% compared with the lowest polyphenol consumption (177 mg per day). I'd also like to point out that this doesn't do much to show causation rather than correlation. In other words, would the same benefit be seen if someone forces themselves to consume polyphenol-rich foods when they usually don't? 
How do we know it's not "people less prone to metabolic syndrome prefer to consume more fruits, coffee, etc.?" Also, while 23% is significant, I'm not sure if it's worth eating something you dislike. Like, if someone told me I'd have a 20% lower chance of prostate cancer if I drank coffee every day, well, first I'd have to know what my baseline chance is, because 20% off 10% is way less significant than 20% off 80%. And then I'd have to weigh the risk reduction against simply despising the taste of coffee. Additionally, some of those foods listed can be quite expensive, and not everyone can afford them. Anyway, like I said, I'm busy today. There's more at the link. I just wanted to throw this into the mix to show why we shouldn't just automatically believe headlines like the one in the article, whether we want to or not.
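Okay, one more thing before I run, because the baseline bit up there is where headlines mislead people the most. Here's the arithmetic I'm doing in my head, in code form; the 23% relative reduction is the study's number, but the baseline risks below are ones I made up purely for illustration:

```python
# Relative vs. absolute risk, the thing headlines routinely blur together.
# The 23% relative reduction comes from the study; the baseline risks are
# invented here purely to show why the baseline matters.

def absolute_reduction(baseline_risk: float, relative_reduction: float) -> float:
    """Return the absolute drop in risk for a given baseline and relative cut."""
    return baseline_risk * relative_reduction

for baseline in (0.10, 0.80):
    drop = absolute_reduction(baseline, 0.23)
    print(f"Baseline {baseline:.0%}: a 23% relative cut is {drop:.1%} in absolute terms "
          f"({baseline:.0%} -> {baseline - drop:.1%})")
```

Same relative cut, very different real-world payoff, which is why "reduces risk by 23%" on its own tells you almost nothing until you know the baseline. That's all; now I really am out of time. |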
I'll be traveling this week, so posts will be whenever I can find the time to make them. Like now, before I get some sleep so I can leave early in the morning. Another older article today, an Ars Technica piece from 2019. This is significant, because clearly, the "techniques" they discuss therein didn't work to combat the misinformation and anti-science rhetoric that amped up in the following year. Two tactics effectively limit the spread of science denialism Debunking the content or techniques of denialism mitigates their impact. Does it, though? Does it really? “Vaccines are safe and effective,” write researchers Philipp Schmid and Cornelia Betsch in a paper published in Nature Human Behavior this week. Again... 2019. “Humans cause global warming. Evolution theory explains the diversity and change of life.” But large numbers of people do not believe that these statements are true, with devastating effects: progress toward addressing the climate crisis is stultifyingly slow, and the US is seeing its largest measles outbreak since 2000. I checked the statistics, and yes, the one in 2019 was even larger than the current measles outbreak... so far. In their paper, Schmid and Betsch present some good news and some bad: rebutting misinformation reduces the ensuing level of science denialism, but not enough to completely counter the effect of the original exposure to misinformation. If what we've seen over the past five years is a reduction, I'd hate to have seen the unmitigated disaster. Schmid and Betsch make a point of emphasizing that science denialism is a universe away from a healthy skepticism. In fact, skepticism of existing results is what drives research to refine and overturn existing paradigms. Denialism, the authors write, is “dysfunctional” skepticism “driven by how the denier would like things to be rather than what he has evidence for.” There's also, I think, a knowledge gap involved. If you don't know how to fly a helicopter, don't get behind the controls of one. If you think you know how to fly a helicopter because you've seen action movies, you're wrong. Similarly, if you think you know everything about vaccines because you've watched a few videos online, you're wrong. I don't know everything about vaccines, but I have the advantage of living in the same house as an epidemiologist. And usually that of recognizing good science as opposed to bad. Schmid and Betsch focused on strategies to counter misinformation as it is being delivered during a debate, focusing on two possible approaches: correcting misinformation and laying bare the rhetorical techniques that are being used to obfuscate the truth. Maybe part of the problem is allowing it to get to the point of debate. When you get a flat-earther up on stage discussing the shape of the planet with a... well, with anyone with brains, you're putting them on equal footing. You shouldn't do that. Flat-earth nonsense needs to be nipped in the bud, even if it does make the flat-earther feel persecuted and perversely vindicated. They can have their own platform, not one shared with scientists. Flat-earth bullshit is only the most obvious of these types of "my ignorance is just as good as your knowledge" things, though. For instance, in the case of vaccine denialism, a denier might argue that vaccines are not completely safe. Correcting this misinformation (which Schmid and Betsch call a “topic” rebuttal) could take the form of arguing that vaccines in fact have an excellent safety record.
A “technique” rebuttal, on the other hand, would point out that demanding perfect safety is holding vaccines to an impossible standard and that no medication is 100 percent safe. "See? It's only 99.9999% safe! Why take the chance?" Because failing to vaccinate causes more death. The article goes into the methods used in the study, then: But one thing seems clear: it could be better to turn up and debate a denialist than to stay away, a tactic that is sometimes advocated out of fear of legitimizing the denialism. Which is exactly the opposite of what I just said up there. This can tell us three things: 1) I'm not an expert, either (but I can generally spot experts); 2) I can be wrong; 3) Unlike denialists, I can admit when I'm wrong. Still, I'm not going to debate any of these things in person. My memory is too crappy, my knowledge is too broad and not deep enough, and I'm not much of a public speaker. There's no way I could keep up with the flood of misinformation and outright lies that the denialist (of whatever) is spouting. If someone else wants to do it, someone with actual credentials and who's not going to freeze up on stage, go for it. But the bullshit comes too fast. A lie is wiping its dick on the curtains while the truth is still struggling to get the condom on. It's an uphill battle. Sisyphean, even, because once you push the boulder to the top of the hill, they'll roll it right back down again. And yet, I have to try. |
This Wired article is fairly old, and published on my birthday, but neither of those tidbits of trivia is relevant. Why a Grape Turns Into a Fireball in a Microwave Nuking a grape produces sparks of plasma, as plenty of YouTube videos document. Now physicists think they can explain how that energy builds up. No, what's relevant is that fire is fun. The internet is full of videos of thoughtful people setting things on fire. See? Here’s a perennial favorite: Cleave a grape in half, leaving a little skin connecting the two hemispheres. Blitz it in the microwave for five seconds. For one glorious moment, the grape halves will produce a fireball unfit for domestic life. Unfortunately, you can only see it through the appliance's screen door (that screen serves the important function of keeping most of the microwaves inside the microwave), and I don't know what it might do to the unit, so don't try this with your only microwave. Or at least, don't blame me if you have to buy a new one. I'm not going to pay for it. Physicist Stephen Bosi tried the experiment back in 2011 for the YouTube channel Veritasium, in the physics department’s break room at the University of Sydney. What's truly impressive is that Bosi, the grape, and the microwave oven were all upside-down. Off-camera, they discovered they had burned the interior of the physics department microwave. What'd I tell you? I'm not responsible if you blow up the one at work, either. Still, if the last person to use it committed the grave sin of microwaving fish, this might be an improvement. I should also note that the article contains moving pictures of the effect. These are cool, but you might hit a subscription screen. With my script blocker, I could see the text, but not the pictures. But it turns out, even after millions of YouTube views and probably tens of scorched microwaves, no one knew exactly why the fireball forms. As regular readers already know, this is the purpose of science. After several summers of microwaving grape-shaped objects and simulating the microwaving of those objects, a trio of physicists in Canada may have finally figured it out. At least they weren't upside-down. Sucks if they wanted to nuke some poutine, though. The fireball is merely a beautiful, hot blob of loose electrons and ions known as a plasma. The most interesting science is contained in the steps leading up to the plasma, they say. The real question is how the grape got hot enough to produce the plasma in the first place. And this is why some people think science sucks the joy out of everything. No, nerds: the fireball is the cool part. The science is merely interesting. Their conclusions: The grape is less like an antenna and more like a trombone, though for microwaves instead of sound. Huh. Never heard of a trombone exploding into a blaze of glorious fire, but I suppose it could happen. Better to save that fate for instruments that deserve it, like bagpipes, accordions, and mizmars. I joke, yes, but the article explains it rather well. If you have a subscription. Or can cleverly bypass that annoying restriction. The grape, incidentally, is the perfect size for amplifying the microwaves that your kitchen machine radiates. The appliance pushes microwaves into the two grape halves, where the waves bounce around and add constructively to focus the energy to a spot on the skin. Not explained: if the grape is "the perfect size," how come it works for grapes of different sizes?
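As for the "perfect size" thing, here's my own back-of-the-envelope stab at it; none of this is from the paper, and the numbers are the usual rough ones (a household magnetron runs at about 2.45 GHz, and water's refractive index at that frequency is around nine, since its relative permittivity there is roughly 80):

```python
# Back-of-the-envelope: why "grape-sized" keeps coming up.
# Inside something that's mostly water, the microwave wavelength shrinks
# by the refractive index, which is large at these frequencies.

C = 3.0e8            # speed of light, m/s
FREQ = 2.45e9        # typical household magnetron frequency, Hz
N_WATER = 80 ** 0.5  # rough refractive index of water at 2.45 GHz

wavelength_air = C / FREQ
wavelength_water = wavelength_air / N_WATER

print(f"Wavelength in air:   {wavelength_air * 100:.1f} cm")
print(f"Wavelength in water: {wavelength_water * 1000:.1f} mm")
```

A centimeter and change, which is conveniently about the size of a grape, and also about the size of a slightly larger or smaller grape, which I'd guess is why the trick isn't all that picky. But that's my hand-waving, not the authors'.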
A common misconception is that the microwave acts on the grape from the outside in, like frozen meat defrosting, says physicist Pablo Bianucci of Concordia University, who worked on grape simulations included in the paper. I don't know where Concordia University is, so I can't make jokes about its location. Oh, wait, I could look it up. ... Oh, it's in Montreal. That explains the poutine. Anyway, I didn't know people still thought microwaves heated from the outside in. We can't all be physicists, but I was under the impression that it's fairly common knowledge that the wavy EM thingies work by exciting the water molecules throughout the... whatever you put in there. That's why it's usually faster to nuke a cup of water than it is to boil it on the stove. The work has more serious applications too, Bosi says. Look, not everything needs to be useful for something. But when it is, that's pretty cool. His experiments with grape balls of fire... And there we have it, folks: the real reason I saved this article to share with all of you. ...began and ended with the 2011 YouTube video, but his curiosity did not. “I’m impressed with the scientific depth of the paper,” wrote Bosi in an email. In particular, he notes that the authors came up with mathematical rules for describing the grape hotspot. They could conceivably shrink these rules to a smaller scale, to create similar hotspots in nanoparticles, for example. Scientists use heated nanoparticles to make very precise sensors or to facilitate chemical reactions, says Bianucci. I'll take their words for it. During all their microwaving, they noticed that two grapes placed side by side repeatedly bump into each other, back and forth. They don’t know why that happens, and they’ll be studying that next, says Bianucci. Always something else to study. This is a good thing. Not mentioned in the article: how in the hot hell did anyone figure out that putting a grape, cut mostly in half but still connected by a tiny thread of grape skin, into a microwave would produce a "grape ball of fire?" It's not like we eat warm grapes. Even if we did, that's still a very specific configuration. Some mysteries, I suppose, will never be solved. And that's also a good thing. |
I'm more than a little pissed at Time right now because they reported the "dire wolf de-extinction" story as if it were true and not a steaming pile of bullshit. Don't know what I'm talking about? Use a search engine; I'll be damned if I'm going to give that crap any more boost by linking it. But I'm really hoping they got the science right on this article: "Surprising," I guess, if you're a prude. It makes me feel better to cuss, so I've always known it had health benefits (for me, not the people I'm cussing at). Still, it's good to have science backing me up. If it's true. After the "dire wolf" bullshit, I can't be sure. Many of us try to suppress the urge to blurt out an expletive when something goes wrong. And many of us try to hold sneezes in. That doesn't mean it's healthy. Research has found that using profanity can have beneficial effects on people’s stress, anxiety, and depression. In fact, there are numerous potential physical, psychological, and social perks related to the power of a well-timed F-bomb. "Social?" I guess it depends on the society. Cursing induces what’s called hypoalgesia, or decreased sensitivity to pain. Researchers have shown that after uttering a curse word, people can keep their hands submerged in ice water for longer than if they say a more neutral word. I get why they do the submerged in ice water thing. It's a low-risk means of inducing some level of pain in a test subject. Other kinds of pain may be unethical for scientists. But I wonder about the efficacy of low-risk pain inducement in a study such as this. For one thing, a big part of pain is the surprise. If you know you're going to get stuck with a needle at the dentist, you can control your reaction somewhat (though it's quite difficult to swear with your mouth wide open and the dentist's fingers in there). But here’s an interesting twist: “People who swear less often get more benefit from swearing when they need it,” he says. In other words, cursing all the time zaps the words of their potency. That's not surprising to me. I prefer to hold back the important words for when they can provide better emphasis. Swearing aloud is associated with improvements in exercise performance, including cycling power and hand-grip strength. This wouldn't surprise me either. I glanced at the study. Decent sample size, but restricted demographics (i.e. one of those studies that used students as swearing guinea pigs), and the control group used neutral language, presumably words such as "hit," "truck," or "bunt." A study in the European Journal of Social Psychology found that when people wrote about a time they felt socially excluded, then repeated a swear word for two minutes, their hurt feelings and social distress were significantly lower than for people who used a neutral word. Taken together with the findings about physical pain, this might lend more credence to the idea that physical pain and emotional pain are related in more ways than just being described with the same word. In another study, researchers found that when drivers cursed after being refused the right of way by another driver, or when they encountered a traffic jam caused by cars that were stopped illegally, cursing helped them tamp down their anger and return to a more balanced emotional state. I didn't look at that study. I've experienced this myself. And "cursing" in this context includes showing the offender my middle finger. There appear to be surprising social benefits associated with the well-timed use of profanity. 
“Some people believe that profanity can break social taboos in a generally non-harmful way, [which] can create an informal environment in which people feel like insiders together,” says Ben Bergen, a professor of cognitive science at the University of California, San Diego, and author of... This isn't on the same level as those other assertions. "Some people believe" is weasel words, which is why I'm not including the name of his book. I don't doubt that it does these things, but, as anyone who's been on WDC for a while can attest, cussing can also alienate some people. Of course, it is possible to overdo it. People who swear frequently are sometimes perceived as angry, hostile, or aggressive, so there’s a potential tipping point to using profanity. Again, I'm pretty sure that's true, but: what's the tipping point? I suspect it's different for different groups. Baptist church vs. biker bar, e.g. The article does address this qualitatively: It’s also important to know your audience. Swearing etiquette may depend on the social hierarchy and power dynamics in certain situations, such as the workplace, says Jay. Just because the boss uses curse words doesn’t necessarily mean you can get away with it. (You’ll also want to modify your language around young children.) Nah. I want young children to stay as far away from me as possible. If I cuss in public, their parents herd them away. I win. They win, too, because I have furthered their education. Not addressed in the article: whether writing "fuck" has similar benefits to saying it. I suspect not. Clearly, further study is needed. Can I get money for being a guinea pig in that study? |
I'm posting early today because I have a dentist thing that will a) take all morning and b) leave me in no shape to form coherent sentences (worse than usual, I mean) in the afternoon. Speaking of posting schedule, I'll be going on a little trip next week, so blog posts will be erratically timed. For today, though, I'll try not to make any tired old "place is in the kitchen" jokes about today's article from Gastro Obscura. No promises. Meet the Feminist Resistance Fighter Who Created the Modern Kitchen Margarete Schütte-Lihotzky left an indelible mark on Austria, architecture, and how we cook. Sexist jokes notwithstanding, this scene is set in Austria in the 1940s, and it was a central platform, in that era, of a certain political party led by a certain Austrian that women were for children, kitchen, and church. Which should be enough right there to rebel against the entire idea of rigid gender roles. Schütte-Lihotzky had been imprisoned since 1941 for her work as a courier for the Communist Party of Austria (KPÖ), which led the resistance against the Nazi regime in her home country. While she managed to narrowly avoid a death sentence, Schütte-Lihotzky remained in jail until the end of World War II in 1945. The incarceration would forever split her life in two. On the one side were her beginnings as a precocious and successful architect spurred on by the desire to create a better life for working-class women. On the other, what she would refer to as her “second life,” as an active communist, political activist, and memoirist who was professionally shunned in Austria for her political beliefs and received her much-deserved accolades only in the final decades of her life. I suppose it could have been worse. Some people don't get recognized until after they croak. Schütte-Lihotzky led a remarkably long and full life, dying a few days short of her 103rd birthday in 2000. But her name remains forever connected to a space she designed when only 29 years old: the Frankfurt Kitchen, the prototype of the modern fitted kitchen. Which is so ubiquitous in developed countries now that it's hard to imagine a time when it didn't exist. Designed in 1926 as part of a large-scale social housing project in Frankfurt, Germany, the “Frankfurt Kitchen” introduced many of the elements we now take for granted... So the concept of a kitchen as we know it today is just under 100 years old. That's not too surprising; 100 years ago, we were still arguing over things like the size of the Universe and what powers the Sun. Still, I'd have said "take for granite," because of the proliferation of granite countertops in kitchens and because I can't resist a gneiss play on words. ...a continuous countertop with a tiled backsplash, built-in cabinets, and drawers optimized for storage—all laid out with comfort and efficiency in mind. Whoever put my kitchen together must have forgotten about the "optimized for storage" bit. “She didn’t just develop a kitchen,” says Austrian architect Renate Allmayer-Beck. “It was a concept to make women’s lives easier by giving them a kitchen where they could manage more easily and have more time for themselves.” Thus leading inexorably to women joining the workforce, which, if you think that's a bad thing, boy are you reading the wrong blog.
The article even addresses the obvious: While the Frankfurt Kitchen was marketed as a kitchen designed for women by a woman, Schütte-Lihotzky resented the implication that her gender automatically endowed her with secret domestic knowledge, writing in her memoir that “it fed into the notions among the bourgeoisie and petite bourgeoisie at the time that women essentially work in the home at the kitchen stove.” I vaguely remember featuring a bit back in the old blog about the invention of the automatic dishwasher, which predated the Frankfurt kitchen (I suppose that rolls off the tongue and keyboard more easily than "Schütte-Lihotzky Kitchen") by a few decades. That, too, was a woman's work. And that's the closest I'm going to get to making a "women's work" joke; you're welcome. The Frankfurt Kitchen was efficiently laid out and compact, to save both on costs and the physical effort required to use it. Here, a woman could move from sink to stove without taking a single step. This quest for efficiency also led Schütte-Lihotzky to move the kitchen from a corner of the family room into its own space—a choice that baffled contemporary homemakers. And then, decades later, they'd take away the wall separating the kitchen from the family room, putting it back into one big open space. I spent my childhood in a house with an open-concept kitchen/living area, and I have nothing inherently against it. What I have a problem with is all the remodeling shows that insist on that kind of layout. Not because they insist on it, but because they're thinly-veiled ads for home improvement stores, and they enable that bane of the housing market in the US: house flippers. The article even addresses the open-concept change, if obliquely: When the Frankfurt Kitchen came under fire from second-wave feminists in the 1970s for isolating women in the kitchens and making domestic labor invisible, the critique hit her hard. She defended her design in her memoir. “The kitchen made people’s lives easier and contributed to women being able to work and become more economically independent from men,” she wrote. Still, she conceded, “it would be a sad state of affairs if what was progressive back then were still a paragon of progress today.” I feel like a lot of people would defend their life's work to the last, but that quote demonstrates a willingness to keep an open mind, even later in life, and to acknowledge that nothing is ever truly completed. As they used to say, "a woman's work is never done." There's a lot more at the link, which I found interesting because I was only vaguely aware that today's kitchen designs owed a debt to something called a "Frankfurt Kitchen," but I didn't know anything about how it came to be. I figured maybe someone else might want to know, too. |
I sure talk about the Moon a lot. We're coming up on another Full Moon, by some reckonings the Pink Moon, the first Full Moon after the Northern Hemisphere Spring Equinox. It's also a culturally significant Full Moon because it marks the start of Pesach, or Passover; and helps to define the timing of Easter. This will occur on Saturday, based on Eastern Standard Time. But this article, from aeon, isn't about Moon lore or cultural observances; quite the opposite. How the Moon became a place For most of history, the Moon was regarded as a mysterious and powerful object. Then scientists made it into a destination. On 25 May 1961, the US president John F Kennedy announced the Apollo programme: a mission to send humans to the Moon and return them safely to Earth within the decade. Specifically, white American male humans, but hey, one small step and all that. The next year, the American geologist Eugene M Shoemaker published an article on what it would take to accomplish the goal in American Scientist. It is an extraordinary document in many ways, but one part of his assessment stands out. ‘None of the detailed information necessary for the selection of sites for manned landing or bases is now available,’ Shoemaker wrote, because there were ‘less than a dozen scientists in the United States’ working on lunar mapping and geology. I had to look it up to be sure, but yeah, this was the same guy who co-discovered Comet Shoemaker-Levy 9, the one that impacted Jupiter back in the 1990s, right around the time we coincidentally started confirming the existence of exoplanets. That's a lot of astronomy wins for a geologist, especially considering that, technically, "geology" only applies to Earth. I think that's a word it's safe to expand the definition of, though; otherwise, we'll have selenology, areology, and any number of other Greek-rooted world names attached to -ology. The problem becomes especially apparent when you consider we also have geography, geometry, and geophysics. Some sources refer to him as an astrogeologist; I'm not really picky about the wording in this case, as long as we all understand what's meant, though technically "astro-" refers to stars, not moons or planets. Being picky about that would cast doubt on "astronaut" as a concept. Incidentally, he apparently died in a car crash in 1997, and some of his ashes got sent to the Moon with a probe that crashed into its south pole region. A fitting memorial, if you ask me. But I digress. The Moon is a place and a destination – but this was not always the case. Well, it was certainly a destination for Eugene M. Shoemaker. Or part of him, anyway. To geographers and anthropologists, ‘place’ is a useful concept. A place is a collision between human culture and physical space. People transform their physical environment, and it transforms them. People tell stories about physical spaces that make people feel a certain way about that space. And people build, adding to a space and transforming it even further. So, this is a situation where science, technology, anthropology, folklore, mythology, linguistics, engineering, and psychology (and probably a few other ologies) meet. In other words, candy for Waltz. Now, you might be thinking, as I did, "But science fiction treated other worlds as 'places' long before we sent white male American humans to the Moon." And you'd be right (because, of course, I was).
The key is in the definition of 'place' I just quoted from the article: the Moon became a real place, as opposed to the speculative place it had in science fiction and fantasy: Centuries ago, a major reconceptualisation took place that made it possible for many to imagine the Moon as a world in the first place. New technologies enabled early scientists to slowly begin the process of mapping the lunar surface, and to eventually weave narratives about its history. Their observations and theories laid the groundwork for others to imagine the Moon as a rich world and a possible destination. Then, in the 1960s, the place-making practices of these scientists suddenly became practical knowledge, enabling the first visitors to arrive safely on the lunar surface. One might argue that we lost something with that, like the folklore and mythology bits. But we gained something, too, and didn't really lose the folklore (though some of it, as folklore is wont to do, changed). For much of history, the Moon was a mythological and mathematical object. People regarded the Moon as a deity or an abstract power and, at the same time, precisely charted its movement. It seemed to influence events around us, and it behaved in mysterious ways. The connection between the Moon and tides was clear long before Newton explained gravity enough to demonstrate a causal relationship. There were some who thought about trips to the Moon. Stories in religious traditions across the world tell of people travelling to the Moon. There were some thinkers before and after Aristotle who imagined that there were more worlds than just Earth. The ancient atomists discussed the possibility of worlds other than Earth, while other Greeks discussed the possibility of life on the Moon. This included Plutarch, who wrote about the Moon as both mythical and a physical object. But, to the extent that the Moon was thought about as a place, the notion was largely speculative or religious. I sometimes wonder if, had we not had the big shiny phasey thing in the sky, our perception of space travel might have been different. The only other big thing in the sky is the Sun; all the other relatively nearby objects resolve to little more than dots: Venus, Mars, etc. I suspect that the presence of a visible disc, with discernible features even, might have served as a stepping-stone to imagining those other dots as worlds, once the telescope could start us seeing them as discs, too. It would certainly have made mythology and folklore a lot different, not having a Moon. The rest of the article is basically a brief (well, not so brief because it's aeon, but brief in comparison to human history) recap of our cultural relationship with the Moon. I don't really have much else to comment on, but I found it an interesting read, especially to see how our understanding has changed over time. |
Got this one from Time, and now it's Time to take a look at it. "Has become?" Always has been. Imagine walking through New York City, invisible. I don't have to. I've done it. People bumped into me (and didn't even pick my pocket), cars didn't stop at crosswalks, and taxicabs just zoomed on by when I hailed them. This is also known as "being in New York City." Marilyn Monroe, one of the most recognizable women in the world, once did exactly that. The article describes how no one recognized her until she started acting Marilyny. There's some irony (or whatever) there, because it wasn't Marilyn Monroe who (if the story is true) walked through NYC invisibly; that was Norma Jeane Mortenson. So who was being herself? Marilyn or Norma Jeane? Who is real and authentic: Superman or Clark Kent? (Yes, I know, trick question; they're both fictional.) Her story is extreme, but her struggle is not unique. Like Marilyn, many of us learn to shape ourselves into what the world expects. Refining, editing, and performing until the act feels like the only version of us that belongs. Well, yeah. And then you become the act. And that becomes your authentic, real, true self. This isn't news or something to be ashamed of; it's the essential process of life as a human. Today, even authenticity is something we curate, measured not by honesty but by how well it aligns with what’s acceptable. The pressure to perform the right kind of realness has seeped into every aspect of modern life. Oh, boo hoo hoo. "Today," my ass. We've been doing this since we figured out this newfangled "fire" shit, if not before then. I might even postulate that the pressure to fit in, to conform, to not act like but be the person your society expects was even stronger in pre-industrial times. Authenticity was supposed to set us free. Instead, it has become something we must constantly prove. In a culture obsessed with being “real,” we curate our imperfections, filter our vulnerabilities, and even stage our most spontaneous moments online. Who's this "we" person? I figured out a long time ago that I needed to be someone different at work than I was for, say, my role-playing game group. The latter helped with the former. Those who should know these things told me that people responded well to honesty and authenticity, so I learned to fake those qualities. Instead of naturally shifting between different social roles, we now manage a single, optimized identity across multiple audiences—our family, coworkers, old friends, and strangers online. Again, who the fuck is "we?" Not me. The bigger, paradoxical problem is, however, that the more we strive to be real, the more we perform; and in proving our authenticity, we lose sight of who we truly are. To me, this is like saying "No one sees how we truly look; they only see the wardrobe and hairstyle we choose." Hell, even nudists get to choose their hairdos. Who "we" are is always a performance. Eventually, the performance becomes who we are. Fake it 'til you make it, and all that. Think back to childhood. At some point, you probably realized that certain behaviors made people like you more. Maybe you got extra praise for being responsible, so you leaned into that. Maybe you learned that cracking jokes made you popular, so you became the funny one. Okay, now you're attacking me directly. Psychologists call this the “False Self”—a version of you that develops to meet external expectations. 
Well, far be it from me to dispute what professional psychologists say, but again, that's like saying "society expects us to wear clothing to cover our genitals, so the only way to be authentic is to be naked." And even then, which is more authentic: pre-shower, or post-shower? And do you comb/brush your hair? Then you're not being authentic; you're conforming to society's norms. My point here is that despite what the article says, authenticity isn't always a good thing. Maybe your "authentic" self is a thief, and you don't want to face society's punishment for that, so you choose not to steal stuff. You're tempted, sure, but you just walk past the shinies instead of pocketing them, or restrain yourself from picking an NYC pedestrian's pocket or running off with her purse. You become not-a-thief, and that eventually becomes your true self. Some of us are just naturally funny, but others have to work at it. The desire to work at it is just as authentic as the not-being-funny part. What's the point of trying to improve yourself if you then get slammed for being "unauthentic?" A violent person may want to do the work to stop being violent. A pedophile may choose to deliberately avoid being around children. Is that not a good thing for everyone? As for code-switching, are we supposed to wear the same clothes for lounging around the house, going to a gym, working, and attending a formal dinner? This is the same thing, but with personality. Authenticity isn’t something you achieve. It’s what’s left when you stop trying. Yet, the more we chase it, the more elusive it becomes. Well gosh, you know what that sounds exactly like, which I've harped on numerous times? That's right: happiness. Culture shifts when enough people decide to show up as they are. Naked with uncombed hair? Hard pass. |
It's nice to be able to see through optical illusions, as this article from The Conversation describes. It would be even nicer to be able to see through lies and bullshit, but that's probably harder. And I did find possible bullshit in this article, in addition to the slightly click-baity headline. Optical illusions are great fun, and they fool virtually everyone. But have you ever wondered if you could train yourself to unsee these illusions? I can usually see past the optical illusion once it's pointed out to me, or if I figure it out, but not always. Now, it should be obvious that there are pictures at the article. They'd be a pain to reproduce here, and why bother, when I already have the link up there in the headline? We use context to figure out what we are seeing. Something surrounded by smaller things is often quite big. Which is why it's important to hang out with people smaller than you are. Or bigger, depending on the effect you're looking for. How much you are affected by illusions like these depends on who you are. For example, women are more affected by the illusion than men – they see things more in context. The article includes a link to, presumably, a study that supports this statement. I say 'presumably,' because when I checked this morning, the link wasn't working. So I can't really validate or contradict that assertion, but I do question the validity of the "they see things more in context" statement. Young children do not see illusions at all. The link to that study did work for me, and from what I can tell, it was about a particular subset of illusions, not "all." The culture you grew up in also affects how much you attend to context. Research has found that east Asian perception is more holistic, taking everything into account. Western perception is more analytic, focusing on central objects. None of which fulfills the promise of the headline. This may also depend on environment. Japanese people typically live in urban environments. In crowded urban scenes, being able to keep track of objects relative to other objects is important. Okay, this shit is starting to border on racism and overgeneralization. Also, the glib explanation is the sort of thing I usually find associated with evolutionary psychology, which reeks of bullshit. However, what scientists did not know until now is whether people can learn to see illusions less intensely. A hint came from our previous work comparing mathematical and social scientists’ judgements of illusions (we work in universities, so we sometimes study our colleagues). Social scientists, such as psychologists, see illusions more strongly. And this is starting to sound like the same old "logical / creative" divide that people used to associate with left brain / right brain. Despite all these individual differences, researchers have always thought that you have no choice over whether you see the illusion. Our recent research challenges this idea. Whatever generalization they make, I can accept that there are individual differences in how strongly we see optical illusions. So this result, at least, is promising. Radiologists train extensively, so does this make them better at seeing through illusions? We found it does. We studied 44 radiologists, compared to over 100 psychology and medical students. And we finally get to the headline's subject, and I'm severely disappointed. 44? Seriously? There is plenty left to find out. I'll say. 
Despite my misgivings about some of the details described, I feel like the key takeaway here is that it may be possible to train people away from seeing a particular kind of optical illusion. But it may be a better use of resources to train them to smell bullshit. |
Once again, Mental Floss tackles the world's most pressing questions. Why Do So Many Maple Syrup Bottles Have a Tiny Little Handle? It’s not for holding, that’s for sure. Well, this one would be pressing if anyone in the US could still afford maple syrup. Ideally, you’d be able to hold the handle of a maple syrup container while you carry it and also while you pour the syrup onto pancakes, waffles, or whatever other foodstuff calls for it. Good gods, how big is your maple syrup container? I usually get the ones about the size of a beer bottle, which don't even require a handle. Or, you know, I used to, when we were still getting stuff from Canada. But the typical handle on a glass bottle of maple syrup is way too small and positioned too far up the bottleneck to be functional in either respect. So, why is it there? Why is anything nonfunctional anywhere? Decoration, tradition, or for easy identification, perhaps. The most widely accepted explanation is that the tiny handle is a skeuomorph, meaning “an ornament or design representing a utensil or implement,” per Merriam-Webster. I'm actually sharing this article not to complain about trade wars, but because I don't think I'd seen 'skeuomorph' before, and it's a great word. As the article goes on to note, it's apparently pretty common in software design. They use other examples, but here on WDC, we have a bunch of them. The magnifying glass for Search, the shopping cart (or trolley) for Shop, glasses for Read & Review, the gear icon for settings, and so on. I don't do website or graphic design, so I didn't know the word. But there are plenty of skeuomorphs that don’t involve the transition from analog to digital life, and the useless handle of a maple syrup bottle is one of them. I'd hesitate to call it "useless," myself. Obviously, it's not useful as a handle for carrying or pouring, but, clearly, it does have a purpose: marketing. Here’s one popular version of the origin story: The little handle harks back to the days of storing liquids in salt-glazed stoneware that often featured handles large enough to actually hold. Moonshine distillers, take note. (And yet, the article mostly debunks that origin story, as one might expect.) Maple syrup manufacturers had started to add little handles to their glass bottles by the early 1930s. This, apparently, was a bit of a marketing tactic. “Maple syrup companies weren’t so much retaining an old pattern of a jug as reinventing it and wanting to market their product as something nostalgic,” Canadian Museum of History curator Jean-François Lozier told Reader’s Digest Canada. Like I said. Perhaps one day, I will again have the opportunity to purchase delicious maple syrup. When I do, I'll be looking for the skeuomorph. |
While Cracked ain't what it used to be (what is, though?), here, have a bite of this: It should go without saying that "mutated" is a bit misleading, but here I am, saying it anyway. We know that companies keep tinkering with the recipes behind processed foods, changing nitrates or benzoates so you’ll become as addicted as possible. Wow, that would suck, becoming addicted to food. More basic foods, however, are more dependable. And, of course, here's the countdown list to contradict that. 5 Brussels Sprouts A couple decades ago, jokes on kids’ shows would keep saying something or another about a character hating Brussels sprouts. Pretty sure it was more than a couple of decades ago. But the Brussels sprouts thing didn't stick in my memory. Broccoli did. Of course, as I got older and didn't have to eat them the way my mom overcooked them, I learned to like both. And when I got even older, I had my mind blown with the fact that they are the same species. If you were around back then, you probably learned that Brussels sprouts tasted gross before you’d ever heard of the city of Brussels. Having been to Brussels, I still don't know what they call them there. Sprouts, probably, or whatever the French or Dutch word for sprouts is. Like how Canadian bacon is called bacon (or back bacon) in Canada, or French fries are called frites in Brussels because they're a Belgian invention, not French. Unlike French fries, Brussels sprouts actually have a connection to Brussels. Well, not the city. It's hard to find extensive vegetable gardens in most major cities. But they were grown extensively in the surrounding countryside, or so I've heard. Brussels sprouts used to taste bitter, but during the 1990s, we started crossbreeding them with variants that didn’t. When we were done, we’d bred the bitterness out. There's an incident stuck in my head from several years ago, back when I did my own grocery shopping, so at least six years ago and probably more, where I sauntered up to a supermarket checkout counter with a big bag of Brussels sprouts. The cashier started to ring me up, but then she looked me in the eye and said, "Can I ask you a question?" "Sure." She held up the bag o' sprouts. "How can you eat these things?" I was rendered speechless for a moment, but retained enough presence of mind to say "With butter and garlic." Or maybe I just sputtered, and then a week or so later, lying awake at night, I finally came up with a good comeback, and edited my memory to make me look better. Turns out there’s no moral law saying healthy stuff must taste bad. Shhh, you can't say that in the US. 4 Pistachios Pistachio nuts in stores used to always be red. I don't think I ever noticed that. Today, we instead associate pistachios with the color green, due to the light green color of the nuts and the deeper green color of the unripe shells. I associate them with a lot of work and messy cleanup, but damn, they taste good. 3 Jalapeños In the 1990s, the word “jalapeño” was synonymous with spicy. Again, this is a US-oriented site. For many Americans, mayonnaise is too spicy, and anything else is way too spicy. Today? Not so much. Maybe you’d call a habanero spicy, but jalapeños are so mild, you can eat a pile of them. That's... not entirely true. It's actually worse than that; jalapeños have wildly varying levels of capsaicin, making it difficult to control the flavor of one's concoction when using that particular pepper. 
Today, you might find yourself with one of the many other, hotter jalapeño varieties, but there’s a good chance you’ll find yourself with TAM II or something similarly watery. Which is why, when I want spicy peppers, I go with habanero or serrano. No, I don't use whole ghost peppers, but I do use ghost pepper sauce sometimes. 2 Sriracha Sauce You know Sriracha sauce? Its label says that the primary ingredient is “chili,” and the chili pepper they use happens to be a type of jalapeño. At least it used to be, until some recent shenanigans. I know it, and I sometimes use it, but my tongue refuses to pronounce it. It has no problem tasting it, though. 1 Apples I don't think it would surprise many people to know that this iconic fruit has been selectively bred into hundreds of different varieties. The most extreme victim of this aesthetics supremacy may be the Red Delicious apple. Today, it’s perhaps the most perfect-looking apple. It looks like it’s made of wax, and many say it tastes like it’s made of wax, too. Nah, more like cardboard. I know what cardboard tastes like because I ate a pizza from Domino's once. Buyers have started rebelling. If you aren’t satisfied with Red Delicious, you can try the increasingly popular Gala or Fuji apples. On the rare occasions that I actually buy apples for eating, those are my choices, because they're tasty and they're usually available. In summary, yeah, lots of foods have changed, and sometimes for the worse. What's remarkable isn't the change itself, but our ability to tinker with the genetics of what we eat. And we've been doing it for as long as we've been cultivating food. We can be quite clever, sometimes. But I question our collective taste. |
I wanted to share this article because a) it's interesting and I have stuff to say about it and b) I wanted to show that even the most serious science communicators, like Quanta, sometimes can't help using a pun in the headline. The Physics of Glass Opens a Window Into Biology The physicist Lisa Manning studies the dynamics of glassy materials to understand embryonic development and disease. If you're anything like me, you're wondering what the hell glass and biology could possibly have in common. Well, that's what the article's for. The ebb and flow of vehicles along congested highways was what first drew Lisa Manning to her preferred corner of physics... I can relate. I still remember the epiphany I got back in engineering school when I realized that the equations of traffic flow are the discrete-math versions of the equations of fluid flow. But it wasn’t until after she had earned her doctorate in physics in 2008 that Manning started applying that enthusiasm to problems in biology. I've noted before that, sometimes, an interdisciplinary approach can solve problems that a focus on one field cannot. Perhaps I'm biased because I prefer to know a little bit about a lot of things rather than to know a whole lot about one thing and nothing about anything else. ...she learned about what’s known as the differential adhesion hypothesis, an idea developed in the 1960s to explain how groups of cells in embryos move and sort themselves out from one another in response to considerations like surface tension. “It was amazing that such a simple physical idea could explain so much biological data, given how complicated biology is,” said Manning, who is now an associate professor of physics at Syracuse University. “That work really convinced me that there could be a place for this kind of [physics-based] thinking in biology.” "Amazing," sure, but to me, it's not surprising. Complexity emerges from simplicity, not the other way around. And biology is basically chemistry, which is basically physics, so even there, it should be no surprise that one field can inform the other. She took inspiration from the dynamics of glasses, those disordered solid materials that resemble fluids in their behavior. I'm going to digress for a moment, here. Glass has been described as a "solid liquid." When touring some historical site lo these many years ago—hell, it might have been Monticello—I heard a tour guide assert that being a solid liquid, glass flows very, very slowly under the influence of gravity, and that's why all these old windows are wavy and thicker at the bottom. This didn't sit right with me then, so I looked into it (this was pre-internet, so it involved an actual trip to an actual library). Turns out that no, glass is solid, period. It doesn't flow any more than rocks do, assuming ordinary temperatures (of course it flows when heated enough to change phase). The waviness of pre-industrial glass is a result of its manufacturing process, and apparently, they'd often install the panes with the thicker bits at the bottom, for whatever reason. Point is, people confuse "glass resembles a fluid" with "glass flows, albeit very slowly." Which is understandable, though really, tour guides should know better. The reason we say glass is fluid-like is that most solids have a crystalline structure of some sort, at the atomic level. But glass does not; its atomic structure is disordered. 
I mention all this in case someone's still got that idea in their head that glass is a slow-moving liquid; the article doesn't make it clear (see, I can pun, too) until it gets into the interview portion. Manning found that the tissues in our bodies behave in many of the same ways. As a result, with insights gleaned from the physics of glasses, she has been able to model the mechanics of cellular interactions in tissues, and uncover their relevance to development and disease. Unlike the relatively simple ideas about the atomic structure, or lack thereof, of various solids, the connection to biology is beyond me. The rest of the article is, as I said, an interview, which I'm not quoting here. While I can't pretend to understand a lot of it, I can appreciate her multidisciplinary approach and how insights from one branch of science can illuminate problems from another branch. Incidentally, I find it helps to use a similar approach to writing. Because as much as we like to categorize things, the boundaries tend to blur and become fluid. Like the view through an 18th century window pane. |
Almost everyone I know, when starting to read the headline from this Guardian article, would blurt out "forty-two!" What is the meaning of life? 15 possible answers – from a palliative care doctor, a Holocaust survivor, a jail inmate and more They'd be wrong, though. Forty-two is the "Answer to the Great Question of Life, the Universe, and Everything," as revealed by the great prophet, Douglas Adams. Says nothing about "meaning." As this article is an ad for a book, I conclude that the author's actual Meaning of Life is to sell as many books as possible. But in doing so, at least he includes others' points of view, opinions from those who probably aren't trying to sell a book. In September 2015, I was unemployed, heartbroken and living alone in my dead grandad’s caravan, wondering what the meaning of life was. And it never occurred to you that being broke, depressed, and trapped in a tin can with your dead grandpa might actually be the meaning of life? See, this sort of thing is why we push people to have jobs. Not so they'll have money, but so they'll be too busy to contemplate philosophical questions. What was the point to all of this? Apparently, selling books. Like any millennial, I turned to Google for the answers. Aw, this was too early. Try that now, 10 years later, and an AI bot will confidently and definitively answer your question. Or so I assume. I'm not going to try it, because I might not like the response. Or, worse, I might like it. I trawled through essays, newspaper articles, countless YouTube videos, various dictionary definitions and numerous references to the number 42... I told you 42 would be involved. It's a red herring. To be fair, so is everything else. ...before I discovered an intriguing project carried out by the philosopher Will Durant during the 1930s. The problem with letting philosophers have a go at this question is that none of them, not a single one, has a sense of humor (or humour, as this is The Guardian). And any answer to "What is the meaning of life?" that doesn't involve humor in some way is categorically and demonstrably false. We have a different name for philosophers with a sense of humor: comedians. Durant had written to Ivy League presidents, Nobel prize winners, psychologists, novelists, professors, poets, scientists, artists and athletes to ask for their take on the meaning of life. See? Not a single comedian in the bunch. In the 1930s, there were any number of humorists he could have polled, many of whom are still revered. The Marxists, er, I mean, the Marx Brothers had been active for at least a decade. There was a Laurel and a Hardy. The Three Stooges got their start in the late 1920s. I'd want to hear their answers. Nyuk nyuk. I decided that I should recreate Durant’s experiment and seek my own answers. I scoured websites searching for contact details, and spent hours carefully writing the letters, neatly sealing them inside envelopes and licking the stamps. I can almost forgive the low-tech throwback of writing letters, folding them into envelopes, and sending them through the post. What I don't get is stamp-licking. Here in the US, stamps have been peel-and-stick for decades; is it that different in the UK? What follows is a small selection of the responses, from philosophers to politicians, prisoners to playwrights. Some were handwritten, some typed, some emailed. Some were scrawled on scrap paper, some on parchment. Some are pithy one-liners, some are lengthy memoirs. 
When I saved this article (not that long ago), I had in mind to quote at least some of the responses. But upon reflection, I'm not going to do that. The answers are as varied as the people giving them. Some are non-answers. Some contain the barest glimmers of a sense of humor. Some are highly specific; as a trained engineer, I could very easily assert that designing systems that work to make people's lives better is the ultimate meaning of life, or, as an untrained comedian, I could just as easily state that the meaning of life is to laugh and to make others laugh. Or I could just say "cats." The point is, the answer is different for everybody, and even for one individual at different points in life. For some, perhaps even this author, the meaning is in the search. For others, there is no meaning; this can be horrifying or liberating, depending on one's point of view. In my more literal moments, I assert that the meaning, or at least the purpose, of life is to generate additional entropy, thus accelerating the inevitable heat death of the Universe. Mostly, though, I don't concern myself with meaning or purpose. A Jedi craves not these things. It's enough for me to occasionally sit outside on a nice day, listening to music and drinking a beer. |
In my ongoing quest to look at word/phrase origins, I found this explanation from Mental Floss, though I felt no urgency to share it. Well, I thought it was pretty common knowledge that it came from the medical field, but I've been surprised many times by what I thought was common knowledge that turned out to not be. The reason stat is short for statistics needs no explanation. Yeah, it kind of does. Because 'stat' is short for 'statistic,' and 'stats' is short for 'statistics,' at least in my country. The one thing about British English that I actually find superior is that they shorten 'mathematics' to 'maths,' while we use 'math.' If stats are statistics, why is math mathematics? Many things in language make little sense, and this is one of them. But that's not the 'stat' we're talking about. Stat simply means “immediately.” And has the advantage of one short, sharp syllable instead of an unwieldy and tongue-time-consuming five. You sometimes see it written in all caps, STAT, which could either be to add extra emphasis or because people assume it’s an acronym. Amusing thing: like many people, I have an ongoing prescription for a cholesterol-controlling medicine. My doctor's office, affiliated with the university here, has a computer system that always capitalizes STAT. Consequently, the prescription is for AtorvaSTATin. It’s possible that the all-caps custom is influenced by the fact that ASAP basically means the same thing and is an acronym (for as soon as possible). It's also possible that they just want it to stand out on reports for other medical professionals. "We need an X-ray of this leg stat" might be overlooked, but "We need an X-ray of this leg STAT" adds emphasis to the urgency. But stat is not an acronym: It’s an abbreviation for the Latin word statim, also meaning “immediately.” Oddly enough, 'immediately' is also a Latin derivative, but it appears to share its Latin root with 'mediate' and 'medium.' I don't have a good source for this, but I suspect the 'im-' prefix negates the 'mediate' root, conveying a sense of urgency as opposed to moderation. Like with 'immobile' or 'imprecise.' When stat first entered the English lexicon in the early 19th century, it was used by physicians clarifying that a drug or procedure should be administered immediately. Early 19th century? "Give me that jar of leeches, stat!" "Trepanning drill, stat!" Medical professionals still use stat today, sometimes to differentiate a medication that must be administered immediately from two other types of medication orders. There are scheduled ones, which “are typically utilized for medications that are designed to give a continuous effect over a certain period of time (e.g. antibiotics),” per a 2016 article in Pharmacy Practice; and PRN orders “for medications that are to be given in the event of specific signs or symptoms (e.g. analgesics and antipyretics for pain and fever, respectively).” PRN is Latin, too: It stands for pro re nata (literally, “for the affair born”), meaning “as needed.” There's a brewery near me called Pro Re Nata, and the R in their sign has the little x cross on the tail that signifies 'prescription.' I find this amusing. Their beer isn't bad, either. Next time I go there, I'll be like, "Pint of brown, STAT!" Though I'll have to pronounce it carefully, or they might think I'm ordering stout. Not that there's anything wrong with that. |
I know what day it is, but I'm just going about my business, here. This bit is from HuffPo, which I don't usually read, but this one caught my attention. I Moved Abroad For A Better Life. Here’s What I Found Disturbing During My First Trip Back To America. “The hardest part wasn’t seeing these differences – it was realizing I could never unsee them.” Well. Okay. I guess some people need to push outside their envelope to see what's inside it. When I left America last spring for a safer home for my family and a better quality of life, I thought the hardest part would be adapting to life in the Netherlands. It's nice to have the privilege to just up and emigrate somewhere, isn't it? Like, if you don't like your life in whatever country you're in, boy it sure would be nice to have another country you can go to where you're not treated like something lesser or illegal. “We just hired Riley’s college consultant,” my friend Jackie mentioned casually, sipping her drink. “Five thousand for the basic package, but you know how it is these days. Everyone needs an edge.” "Everyone needs an edge." Yeah. Think about that for a moment. When everyone gets an edge, nobody gets an edge. Or, perhaps, people able to drop five grand on the edge end up winning, which perpetuates the whole cycle of economic disparity. How could I explain that everything — from the massive portions before us to the casual acceptance of paying thousands to game the education system — suddenly felt alien? That I’d spent the past eight months in a place where success wasn’t measured by the size of your house or the prestige of your child’s college acceptance letters? Congratulations; you've achieved an outsider's perspective. The Dutch principle of “niksen” ― the art of doing nothing ― replaced our American addiction to busyness. We had him once, but he was forced to resign. Okay, bad Nixon pun. Seriously, though, how could you not see the problem when you were living here? Too busy, I guess. Living abroad hadn’t just changed my zip code — it had fundamentally altered how I viewed success, relationships and the American Dream itself. In the Netherlands, I’d learned that a society could prioritize collective well-being over individual achievement. But that's... that's... soshulizm! What I’ve learned is that feeling like a stranger in your own country doesn’t have to be purely painful — it can be illuminating. It shows us that another way of life isn’t just possible, it’s already happening elsewhere. I don't mean to be mean, but I've spent comparatively little time abroad and didn't need to spend any to figure out that what passes for culture in the US is fucked. Some people really do thrive on it, though, and it's good to have choices. Maybe we need more people willing to step outside the fishbowl and then return with fresh eyes. Maybe we need more voices saying, “This isn’t normal, and it doesn’t have to be this way.” And maybe some people can figure it out without having to spend a year living in another country. Because not everyone can do that. So, I hope you haven't spent this entire entry looking for an April Fools' prank. If you did, now is when I reveal that the only prank is that there was no prank. April Fools! |
Someone, a week or two ago, asked me something related to the perpetual belief that people do more crazy shit during a Full Moon than at other times during the lunar cycle. This has been a belief for a very long time, and we even have the word 'lunatic.' Does the Moon Affect Humans? Yes, the moon and its lunar cycles can impact you — but for other reasons than you may think. Well, the only thing I can think of that's special about the Full Moon is the amount of light we see. In the time before electric lights, this would have effectively extended the time when one could see well outdoors. More activity can lead to more perceived instances of people acting weird, because people have acted weird since there were people. Others, however, have, both historically and well into the current era, ascribed a more mystical connection to it. This is, I think, akin to the Bermuda Triangle mythology: that particular stretch of ocean has been perceived to be especially mysterious and prone to make ships and planes disappear; but, as it turns out, when you compare that area to other places with similar traffic, there are no more or fewer disappearances in the BT than elsewhere. And this is why we use science and statistics. For centuries, the moon and how it affects human behavior has been at the center of mythology and folklore around the world. The very word “lunacy” dates back to the 15th century when it was believed the moon and its phases could make people become more or less aggressive, depending on its place in the lunar cycle. So, I see four different possibilities: 1. The Full Moon causes people to do crazy shit, for some mystical reason; 2. People do more crazy shit during a Full Moon for some rational reason; 3. People don't do more crazy shit during a Full Moon; observation bias (as with the BT) makes people think it happens more then; 4. People think there's a link between Full Moon and Crazy, so they let their inhibitions loose, and it becomes self-fulfilling. Okay, 4 may be a subset of 2. I'm pretty sure regular readers already know I've ruled out #1. But I'm willing to keep an open mind. That's the only way we learn stuff. But then, of course, there are lesser stories that hold a darker tone — haunting tales of werewolves whose transformation is dependent on the full moon. It occurred to me the other day that, canonically, vampires shun sunlight and only come out at night. But moonlight is reflected sunlight, so maybe, just maybe, a Full Moon doesn't provide enough sunlight to fry a vampire, but just enough to turn them into a werewolf. I don't think anyone's written about that yet, so don't steal my idea; I may use it. When you set aside superstitions and longstanding myths, is there any scientific truth behind the way the moon bewitches us? Psychologist Susan Albers, PsyD, walks us through some of the research that’s been done on lunar cycles — and why we may just be changing our behaviors based on independent psychological reasons, instead. All organisms conduct natural biological cycles for survival. When we talk about biological cycles, we probably most often think of our circadian rhythm — our bodies’ internal 24-hour sleep-wake cycle — and infradian rhythms (cycles that last longer than 24 hours) like the 28-day menstrual cycle or seasonal affective disorder (SAD). Couple of things here. First of all, this is my introduction to the term "infradian rhythm," and I'm both happy to learn a new word and angry that it's taken me this long to discover it. 
Second, it's long been noted that the menstrual cycle is similar to the lunar cycle. It's right there in the name; 'menses,' 'moon,' and 'month' share a PIE root. Whether this is coincidence or causation is outside the scope of this entry, but if there were a causal link, you'd think everyone who menstruates would do so at the same time, but, as far as I know, they don't. And since our human bodies are made up of 55% to about 78% of water, there’s some reason to believe we, too, might be impacted by the moon, its light and its 27-day lunar cycle — especially when you consider the moon’s gravitational pull on the earth is powerful enough to affect the ocean tides. Here's where I start to really question the source material. For starters, the lunar cycle, from Full to Full or New to New, is about 29.5 days, not 27. I think the 27 comes from the Moon's orbital period, which is shorter because the Earth is simultaneously orbiting the Sun. But we're talking about Full Moons here, not the Moon's orbital return to a certain location against the stellar background, so 29.5 should be the operating number. Also, I've seen this comparison to tides before. Oh, we're mostly water, so we're also affected by tides? I call bullshit. There's nothing magical about water that makes it special for tides. The ground is subject to tidal forces, too, though with a much smaller effect. If this weren't the case, the Moon wouldn't be tidally locked to Earth, showing us the same side at all times. Point is, though, tides are basically caused by a different size gravity vector on one side of an object than another. Humans are quite small compared to a planet (or moon); I find it extraordinarily unlikely that gravity is involved, especially when there's an even bigger difference in gravity when the Moon is closer or further away in its elliptical orbit. I mean, do puddles experience tides? And let me digress on that "elliptical orbit" point for a moment: every time the Full Moon occurs near lunar perigee, my news feed gets inundated with articles about the impending Supermoon. I don't mind much; at least it gets people looking at the sky. But perigee changes from lunar month to lunar month; it doesn't always occur at a Full Moon. If there's an effect based on proximity, some force that makes people do crazy shit when the Moon is closer, it should happen every lunar month at a different phase, not be associated with the Full Moon. The Sun might present a confounding factor, but even there, the effect should be similar at a New Moon and a Full Moon, and less at the quarter-phases. Okay, back to the article. “Any research that’s been done has been considered controversial, in part, because studies on humans are conflicting,” says Dr. Albers. “In most cases, when there’s been discussion of the moon’s effect on humans, it’s been anecdotal.” And it's also controversial, I'd hazard to guess, because anyone who tries to study it immediately gets labeled a lunatic. Ask anyone about how a full moon affects our lives and you’ve probably heard stories about birth rates climbing, an increase in emergency room visits and an uptick in crime. As this review points out, there seems to be no correlation between the lunar cycle and those things. Well, there you go. The answer. Okay, no, I'm kidding. But, nothing happens in a vacuum. The Moon's orbit does! Some studies have shown a possible correlation between the moon and human activity. At least they were careful to use "correlation." 
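Since I keep throwing numbers around in that digression, here's a quick back-of-the-envelope sketch in Python. The constants are round textbook values I'm plugging in purely for illustration (nothing here comes from the article), but the orders of magnitude are the point: the ~29.5-day Full-to-Full synodic month falls right out of the ~27.3-day orbit plus Earth's trip around the Sun, and the Moon's tidal pull across a human body is absurdly tiny compared to its pull across the whole planet, or compared to the perigee-versus-apogee swing that nobody blames for crazy behavior.

```python
# Rough numbers for the lunar digression above. All constants are round
# textbook values assumed for illustration, not measurements.

G       = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M_MOON  = 7.35e22     # mass of the Moon, kg
D_MEAN  = 3.844e8     # mean Earth-Moon distance, m
D_PERI  = 3.633e8     # approximate perigee distance, m
D_APO   = 4.055e8     # approximate apogee distance, m
R_EARTH = 6.371e6     # Earth's radius, m

# 1) The "27 vs. 29.5 days" thing: the sidereal month (one orbit relative
#    to the stars) plus Earth's own motion around the Sun yields the longer
#    synodic month (Full Moon to Full Moon).
T_SIDEREAL = 27.32    # days
T_YEAR     = 365.25   # days
t_synodic = 1.0 / (1.0 / T_SIDEREAL - 1.0 / T_YEAR)
print(f"synodic month: ~{t_synodic:.2f} days")            # ~29.53

# 2) Tidal effects scale with the *difference* in the Moon's pull across an
#    object: roughly 2*G*M*dr/d^3 for an object of size dr at distance d.
def tidal_accel(dr: float, d: float = D_MEAN) -> float:
    """Approximate difference in lunar gravity across an object of size dr (meters)."""
    return 2.0 * G * M_MOON * dr / d**3

print(f"tidal difference across Earth:    {tidal_accel(R_EARTH):.1e} m/s^2")
print(f"tidal difference across a person: {tidal_accel(2.0):.1e} m/s^2")

# 3) For comparison, the change in the Moon's direct pull between perigee
#    and apogee, which dwarfs the tidal difference across a human body.
def pull(d: float) -> float:
    """Direct lunar gravitational acceleration at distance d (meters)."""
    return G * M_MOON / d**2

print(f"perigee-to-apogee pull change:    {pull(D_PERI) - pull(D_APO):.1e} m/s^2")
```

Run that and you get roughly a millionth of a meter per second squared spread across the whole Earth, something like 10^-13 m/s^2 across a person (more than ten trillion times weaker than ordinary surface gravity), and a perigee-to-apogee change in the direct pull that's millions of times bigger than the across-a-body difference. If gravity were the mechanism, puddles would have tides and every Supermoon would be riot season.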
The rest of the article is the "More research needs to be done" section, including a lot of discussion about what I hinted at up above: the self-fulfilling aspect of all this. (I'm going to utterly ignore the final section, which is about maintaining a positive, upbeat, and optimistic outlook, at which point I said "bite my ass" out loud.) When I was a kid, and way further into adulthood than I care to admit (okay, right up until the present moment), every time I'd spot a Full Moon, I'd howl like a wolf. The Moon doesn't make me howl; that's something I decided to do after reading too many werewolf stories. There is one thing I feel certain about, though: life is a lot more interesting with a Moon in the sky. And I don't need to be a mystic or a poet to see the beauty or insanity of it all. |