by Robert Waltz
Not for the faint of art.
A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is a square root of -1). For example, 3 + 2i is a complex number.
The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.
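If you want to poke at this yourself, many programming languages have complex numbers built right in. A quick sketch in Python (where the imaginary unit is written 1j):

```python
# Python's built-in complex type: real part + imaginary part.
z = 3 + 2j          # the complex number 3 + 2i

print(z.real)       # real part: 3.0
print(z.imag)       # imaginary part: 2.0
print(1j ** 2)      # i squared: (-1+0j), i.e. -1
```

Note that arithmetic on these works exactly as the algebra says it should; the language just keeps track of the real and imaginary parts for you.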
Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.
Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
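The most famous example is the Mandelbrot set, which comes from nothing fancier than repeatedly squaring and adding: z becomes z*z + c, over and over, for each point c in the plane. A minimal sketch (the grid size and iteration cap here are arbitrary choices for illustration, not anything canonical):

```python
def escapes(c, max_iter=50):
    """Iterate z -> z*z + c from z = 0; return the step at which |z|
    exceeds 2 (the orbit is then guaranteed to fly off to infinity),
    or max_iter if it stays bounded that long."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return max_iter

# Crude ASCII rendering: '#' marks points whose orbit never escaped.
for row in range(21):
    im = 1.2 - row * 0.12          # imaginary part, top to bottom
    line = ""
    for col in range(60):
        re = -2.0 + col * 0.05     # real part, left to right
        line += "#" if escapes(complex(re, im)) == 50 else " "
    print(line)
```

Even at this laughably coarse resolution, the familiar bug-shaped blob appears; zoom in anywhere along its boundary and the detail never stops.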
|This one's about one of my favorite subjects. Yes, it's from Cracked.
One of the few bright spots of living in this world today is that we can get food from practically anywhere. Not everything, sure; some things don't ship well, and there's not enough demand for others. But restaurants from other regional cuisines crop up everywhere, and I don't care what anyone says, this is an unmitigated boon for society.
But then there's stuff like this.
5. Suburban Ohio's Proudly Uncooked Pizza
...the local specialty of Ohio Valley pizza, where the cheese and toppings are added on cold, after the sauce and crust come out of the oven.
Everyone has an opinion about pizza. For me, it's that New York style pizza is pizza, and everything else (yes, including anything in Italy) is, at best, a pale simulation thereof; at worst, as with Chicago style "pizza," it's not pizza at all.
But I make an exception for Ohio.
The pizza there is different from New York style: thinner crust, and sometimes they cut it into (mostly) squares. But it's delicious. Many years ago, I used to find any excuse to go to Ohio just for Massey's Pizza (we can get NY style here in my town, but not Ohio style). Last time I went to one, though, either they'd changed the way it was made, or my tastes had altered -- whichever, it ain't what it used to be.
But this? This, I'd never heard of.
4. Hawaii's Guilty Pleasure Spam Sushi
"Decolonize your bookshelf" is one thing, but decolonizing your plate is impossible: So many foods that feel too normal to even think about happened because of colonialism. Without empires and armies, Italy wouldn't have tomatoes, Ireland wouldn't have potatoes, and the Pacific islands wouldn't have Spam, which is just as important as the other two.
When it comes to food, I don't give a shit about "cultural appropriation."
The most popular Spam dish by a mile, though, is Spam musubi, a kind of Spam sushi that consists of a fried slice of the pink stuff, a big ball of sticky rice, and some nori to wrap them together.
Unlike the pizza, this one I've seen. Note I said "seen." I cannot ever bring myself to eat Spam. No, the one Hawaiian dish I crave is Loco Moco, which is ground beef and fried eggs over rice. It's a week's worth of calories in one dish, and it's worth every pound.
But I can't be arsed to make it myself, so I'll have to wait until I can either go back to Hawaii, or the local Hawaiian place wises up and starts serving it.
3. Cincinnati's Challenging Chocolate-Cinnamon-Cheese Chili
On paper, it should be impossible to make chili weird enough to earn a spot on this list. The whole idea of chili is you do it your way, and say "screw you" to anyone without the taste buds to handle it. Veggies? Meat? Spice? Thickness? Toppings?
Oh, but they neglect to mention that there are people in Texas who will shoot you dead if you put beans in chili.
You can't go wrong ... but you can go to Cincinnati, and that's pretty close.
This doesn't just mean chili.
I don't know if I've had official Skyline chili as the article describes, but I did once try the general chili-pasta combination and... well, it's not bad.
2. A Sloppy Joe That Isn't The Joe You Know
...there's no real reason that a Sloppy Joe needs to be ground beef on a bun. If you feel crushed under the tyrannical boot of Manwich's dominion, then there's a place where you can escape: northern New Jersey, where a sloppy joe is a triple-decker deli sandwich with cold cuts, cheese, coleslaw and Russian dressing.
I've been to New Jersey way more than I've been to Ohio, and I've never seen this. Depending on the cold cuts, though, that could be damn delicious.
You might know similar versions of these sandwiches as a Reuben or a Rachel -- as if those names had any more meaning -- but these Sloppy Joes actually have a way better documented history than the cafeteria kind. According to the Town Hall Deli in South Orange, the deli sandwich is a recreation of a house specialty from Cuba -- specifically, the specialty of a spot in Havana named Sloppy Joe's.
Okay, no. There may be a superficial similarity, but a Reuben is hot. Always, always, always. Only idiots in the South try to serve it cold (by way of fairness, the North can't do grits or biscuits & gravy, so it tends to even out).
1. Dirt, Just Dirt
...you know what? I'm going to let you bask in this one all on your own.
And for once, after linking a food thing here, I find I'm not hungry at all. I mean, I started to feel some pangs in the middle, there, but after that last one, I don't think I'll be eating anything for a while.
Now I need an article on regional liquors and where I can find interesting ones. Eventually I'll be able to road trip again... I hope.
|Finally, vindication! This isn't confirmation bias talking at all.
Sarcasm Spurs Creative Thinking
Although snarky comments can cause conflict, a little verbal irony also stimulates new ideas
Yeah, sure it does.
“Sarcasm is the lowest form of wit but the highest form of intelligence,” that connoisseur of witticisms, Oscar Wilde, is said to have remarked.
No, fart jokes are the lowest form of wit.
The reason is simple: sarcasm carries the poisonous sting of contempt, which can hurt others and harm relationships. By its very nature, it invites conflict.
That's only if people detect it. Or, in my case, when I do detect it, I play it straight, like I didn't. That sometimes throws them off balance.
Sarcasm involves constructing or exposing contradictions between intended meanings.
Because no other forms of expression do that.
And yet behavioral scientists Li Huang of INSEAD business school, Adam D. Galinsky of Columbia University and I have found that sarcasm may also offer an unexpected psychological payoff: greater creativity.
Sure, yep. That's why I'm so damn creative.
And sarcasm can be easily misinterpreted, particularly when it is communicated electronically...
All of my attempts at turning Comic Sans into the Official Sarcasm Font have failed miserably.
Anyway, the article goes on to describe a psychology experiment -- like most psychology experiments, the guinea pigs were college students -- to test the author's claim in the headline.
Given the risks, your best bet is to keep conversational zingers limited to those you know well, lest you cause offense.
Nope. I'd rather piss off strangers on the internet. Not here, usually - I have other sites for that.
Anyway, it's an interesting result. To be serious about it, though, I have doubts concerning the sample size and the lack of diversity among the experimental subjects. But it's probably worth pursuing further. Not by me, though. I'll just have to read about it (this article is five years old, and for all I know, they've already followed up).
But I'm reminded of an old story -- could be a joke, could be real; I can't tell -- about a college English class where the professor was talking about the double negative. He said something like, "You can string two negatives together to make a positive, but there's no instance of stringing two positives together to make a negative."
To which one student muttered, "Yeah, right."
Great comments! Thanks for all of them. Hard to pick just one that stood out, because everyone had some real gems, but I'm going with Kåre Enga, P.O. 22, Blogville's five rules (though I must note that Lilli's point about tequila is absolutely true). For everyone else, I'll do this again soon! I really want to do MB entries more often; it's just that usually I get caught up in the post and then I forget. I'll try to be better at it, at least 2-3 per month (spaced out so I don't have to worry as much about CR eligibility).
|Well, after two days of heavy shit, how about a good old-fashioned Bloomberg roast?
After 45 Birthdays, Here Are '12 Rules for Life'
Don't lose friends over politics. Don't lose a spouse over pickles.
The headline made me laugh. The subhead made me stop laughing. But of course, the article is from early 2018, before politics became life-or-death, and pickles were cheaper.
Yesterday was Jan. 29, meaning that Oprah Winfrey and I are each a year older: 64 and 45.
Of all the crappy reasons to name-drop, sharing a birthday has got to be on top of the list. Why not brag about being an Aquarius while you're at it?
Forty-five is somehow a very definite year; there is no question that you are middle aged.
Average age of death is roughly 80 years, being generous and also noting that the writer is female (statistically longer-lived). Half of 80 is 40. So 31-50 is what should be considered middle age. Sorry, Megan, you're now (three years after this article) two years away from being an old lady.
At 45 one takes stock.
I think it's the rule that any Bloomberg article has to have the word "stock" in it.
The building years of your life are over, and what you are now is pretty much what you are going to be. Soon it will be what you were.
Bite my ass.
You can no longer tell yourself that you might move to Lisbon, learn Portuguese, and take up the guitar. You cannot learn Portuguese at your age. You can’t remember new words anymore; you can’t even remember where you have left your keys.
Bite my ass hard and then kiss it better. And you may want to go get a cognitive test. I started learning French two years ago, at much older than 45. Sure, it's taking me longer than it would some kid, and my memory is far from perfect, but I'm not ready to stop learning yet. J'apprends ce que je veux apprendre.
Oh, and I know exactly where my keys are (Je sais où sont mes clés). And my phone (mon portable), and my glasses (mes lunettes). Not sure about my cat (Je ne sais pas où est mon chat), but they tend to wander about on their own. (I'm not trying to pretend I know a lot of French, and I'm certainly not fluent, but I am making the point that she's wrong about the language thing.)
So it seems a good opportunity to do two things. First, to wish Oprah Winfrey a happy belated birthday.
She's not going to sleep with you.
And second, to address this “12 Rules for Life” meme that you young whippersnappers have got up to on the social medias.
Did I forget the "12 Rules for Life" meme? Oh, no, wait, I didn't give a shit or follow les réseaux sociaux so whatever.
I am probably more than halfway through my life now; I ought to have some rules.
If you're lucky, yeah, but again, might want to get that memory checked out. And no, it works exactly the other way around: the older you get, the less you need rules, and the more you can cheerfully break the ones society has laid out for you. To demonstrate this, I'mma destroy this punk's "rules."
1. Be kind. Mean is easy; kind is hard.
While it doesn't do to be a bully, sometimes you just have to kick someone's ass (metaphorically; I'm not into actual violence unless it's absolutely necessary). Someone says something stupid? Call 'em out. To be fair, that might be a lesson learned since early 2018.
2. Politics is not the most important thing in the world... If you have to choose between politics and a friendship, choose the friendship every time.
Yeah, no, I've cut off more than a few freakin' idiots who are on the wrong side of the aisle. Especially over the past 18 months or so.
3. Always order one extra dish at a restaurant, an unfamiliar one.
I'm all for trying new things from time to time, but dammit, commit. Just order the new thing. I hate taking home leftovers and I'm not ordering (and therefore eating) two meals in one sitting.
4. Give yourself permission to be bad. You know what you’re really good at? Things you’ve done many times before. Mastery is boredom.
Typed like someone who's never mastered anything.
5. Go to the party even when you don’t want to.
Okay, I'm an introvert and I've never not wanted to go to a party. Leave early? Sure, sometimes. I guess it's because I get invited to parties, like, once a year. Less now. So I have to take advantage of it when I am.
6. Save 25 percent of your income.
Yeah, right, this- no, wait, I can't argue with this one. She's right.
7. Don’t just pay people compliments; give them living eulogies.
Yeah, if you're a chick you can get away with this. If you're an old guy with a beard, you get labeled "creep."
8. That thing you kinda want to do someday? Do it now. I mean, literally, pause reading this column, pick up the phone, and book that skydiving session. RIGHT NOW.
Before-Time-like typing detected.
9. Early modernist critics used to complain about the sanitized unreality of “nice” books with no bathrooms. The great modernist mistake was to decide that if books without sewers were unrealistic, “reality” must be the sewers. This was a greater error than the one it aimed to correct.
And so we ended up with post-modernism, in which nothing matters, everything has to be dissected and deconstructed, and nothing can be enjoyed for what it is, only blasted for what its implications are.
Incidentally, I finally got to the point in my chronological Trek watching that I heard a reference to a bathroom. I think it was on DS9. It took until the mid-90s.
10. Don’t try to resolve fundamental conflicts with your spouse or roommates.
Don't have fundamental conflicts with spice or roommates. Better yet, don't have a spouse or a roommate if you're going to be picky about things. I mean, like... I have this thing about sticks of butter. They have to be cut square, or the rest of the stick is ruined. Fortunately, my housemate has her own butter stash, so it's not an issue.
11. Be grateful. No matter how awful your life seems at the moment, you have something to be grateful for. Focus on it with the laser-like, single-minded devotion of a dog eyeing a porterhouse.
Okay, I like that image, but still...
12. Always make more dinner rolls than you think you can eat.
Look, I'm a big fan of bread, and I like to eat, but I'm older than this lady, and there are limits to how much I can stuff into my dough-hole.
So of course, I had to come up with my own 12 Rules for Life. They don't parallel hers. But these are things I've learned with my far greater life experience:
1. Never go grocery shopping while hungry.
2. Don't pretend to like (or dislike) something just because everyone else likes (or dislikes) it.
3. Eat what you're craving (just maybe not too much of it).
4. Sleep when you're tired.
5. Sex is overrated.
6. Don't interrupt my writing unless something is on fire, or you want to be.
7. Always have a backup for critical systems (e.g. keep a paper map in your car for when your GPS craps out).
8. Make sure there's always booze in the house. (Or, you know, whatever your favorite drink is.)
9. Own, don't rent. (This isn't just about housing.)
10. Pay extra for quality shoes.
11. Money isn't everything, but it can buy you anything. Maybe it can't buy happiness, but it can buy beer, and that's good enough for me.
12. Automate everything you can.
Maybe you have your own Rules (or Guidelines). Which means it's finally time for another...
Merit Badge Mini-Contest!
Got your own Rules for Life? Doesn't have to be 12, but say a minimum of 5. Post them in a comment on this entry. It's okay to use one of the rules above (hers or mine), but put your own spin on it if you do so. The comment I like best will earn its author a Merit Badge tomorrow (Wednesday). As usual, the deadline is midnight WDC time at the end of today, Tuesday, September 14.
|Today's article is a long one, and I couldn't blame anyone for not reading the whole thing, because I started skimming it after a while, myself.
Meet the People Who Believe They’ve Traveled to a Past Life
Christopher was an ancient Egyptian prisoner. Stephanie's dating the man who had her murdered. They and many others swear by the controversial benefits of past-life regression.
I also couldn't blame anyone for going right now, "Waltz, why are you even giving this bullshit oxygen?"
Bear with me on this one. No, I don't "believe" in past lives. But we also don't understand how the mind works, and we have to start somewhere.
The unsettling visions and sensations Benjamin experienced while imprisoned thousands of years ago were part of what he thinks may have been a past life. His mind traveled to that time and place during a session of past-life regression, a practice in which a person, under hypnosis or in a meditative state, experiences a memory that they believe is from a time when their soul inhabited another body.
I get unsettling visions and sensations sometimes, too. They're commonly known as bad dreams. We may not know much about the mind, but we do know it's capable of giving us sensations with little to no connection to external reality.
The American Psychological Association is deeply skeptical of past-life regression’s viability, and there are serious questions about the ethics of using it as a treatment.
Skepticism is good. Outright rejection can be counterproductive.
But, Bliliuos says, “In hypnosis, you go always to the most important memory you’ve experienced,” whether that’s in this life or perhaps a previous one.
I feel obligated to point out -- in case you skim the part where the article mentions it -- that another thing we do know is that hypnosis is very, very good at installing false memories.
Notions of reincarnation are diverse and nuanced, but for past-life regression advocates like Eli Bliliuos, the New York hypnotist, “The basic belief is that we are souls; we choose to incarnate these bodies for purposes of learning from experience, growing from experience.”
I want to respect people's beliefs, but at the same time, how about some real evidence? Or, at the very least, a plausible mechanism whereby memories can be transferred? And what about other hypotheses about what's going on here? Like, for instance, as I noted above, false memories or dreams?
Full disclosure, though, I've had dreams that seemed to be about past lives, also. That doesn't mean that it's any more than "seemed to be." I've forgotten, on a conscious level, almost everything I've ever read or heard, but who's to say that some tidbit from history or a novel set in the past didn't lurk in my subconscious? As support for this, I also read a lot of science fiction, so one might expect unconscious visions of the future, spaceships or other worlds in such instances, and behold, yeah, I get those too.
Once a traditional psychotherapist, Weiss... has written that he was a past-life regression skeptic at first. But a hypnotized patient of his, whom he called “Catherine” in one of his books, recounted past-life memories that were so precisely outlined and, as it turned out, historically accurate, that he felt it was impossible she could have invented them.
One way to verify reincarnation would, of course, be to find details that fit historical facts. But the question would always remain: how can you tell the difference between that, and someone's vivid but subconscious memory of reading history books or memoirs?
Bliliuos and other advocates of past-life regression say that you don’t have to believe in reincarnation to benefit from the experience. He recalls one session with a client who told him that they thought their past-life regression was a figment of their imagination.
“That’s perfectly fine,” Bliliuos responded, but he asked them to at least consider whether the vision might be a message from their unconscious. If it had some relevance to their life now, then the important element of the experience was the lesson they took away from it.
“Who cares if it’s ‘real’?” he adds.
For me, though, this is the important bit. The fact of a phenomenon remains even if none of the explanations for it fit (I have another article in the wings about UFOs, or UAPs as they're trying to get us to call them, and I had many of the same thoughts reading that). I look at the past-life thing as a kind of a metaphor, and humans can be better at relating to metaphors than to reality. If it helps one deal with one's problems, then what's the harm?
Well, for one thing, the harm is that if you start believing bullshit, there's no end to the bullshit you could believe.
I once had a conversation with Kid Me. We were chatting back at the place where I grew up, a sunny summer day, everything golden, green and still, with the oppressive humidity that anyone in the general vicinity of Washington, DC knows all too well. Not to mention the clouds of gnats that we both swatted at during our discussion. Sight, smell, sound... all there, all achingly real. Going into the conversation, I had no memory of speaking to my older self. What we actually talked about, I don't remember.
Of course it was a dream. But I woke from the dream with the strange sensation of remembering the conversation from both points of view: Kid Me and Decrepit Me. More, though, I was at peace about... something. Damned if I remember what, but I vividly remember the sensation of being calm and content. It was as if the conversation had settled something that had been bugging the back of my head (like those goddamned gnats) for all those years.
I'm under absolutely no illusion that I "actually" went back in time and talked to my younger self. It was a dream, a construct of a sleeping mind. But the dream, or hallucination, or mental exercise, whatever you want to call it, had a real effect.
At least for a while. Then I had to go do something or other and it was back to the daily annoyances. But the point is that something about it changed my mental state, at least temporarily.
So, similarly, I wouldn't jump to the conclusion that "I" (however that is defined if there's no continuity of consciousness) experienced an actual past life, but if there's something in the story, metaphor, or memory that helps me come to grips with something... well, that's okay. But will it, though? That's the real question. As the article points out, it can also lead to harm.
I just wish people wouldn't attribute extraordinarily unlikely things to the activities of one's mind. Sure, I can't rule out reincarnation, any more than I could rule out unicorns or underground lizard people or Bigfoot or space aliens. But I need more than just "I experienced X, so Y *must* be true." It doesn't follow, and there's no plausible mechanism for it.
“My conclusion, then,” Andrade states, “is that it is better to play it safe.” He advocates that people seek out more evidence-based forms of treatment instead.
Absolutely agree with this. But at the same time, we (by which I mean, people who train for this sort of thing) should be working on finding out more about how the mind works. More evidence, as it were.
|This article is from back in April, and some things might have changed since then. But I'm linking it anyway, because I have things to say about risk management and science.
Deep Cleaning Isn’t a Victimless Crime
The CDC has finally said what scientists have been screaming for months: The coronavirus is overwhelmingly spread through the air, not via surfaces.
Last week [again, this is from back in April], the CDC acknowledged what many of us have been saying for almost nine months about cleaning surfaces to prevent transmission by touch of the coronavirus: It’s pure hygiene theater.
Some of us have been calling the stuff the TSA does at airports "security theater," and for similar reasons. It inconveniences and embarrasses people while not contributing significantly to actual safety -- but it makes it look like they're doing something when they make you take your shoes off because one fucking asshole tried to light a shoe bomb that one time, and failed.
But. To get one thing straight: science changes, refines, and sometimes backtracks. There is very little absolute truth in science, just levels of certainty. This bothers the living shit out of some people, who expect things to remain the same all the time. For many people, the first thing they hear is The Truth, and anything later that contradicts it must be a lie.
Science works exactly the opposite way. So when they say "clean surfaces to reduce possibility of transmission," then a few months later say, "Well, no, it turns out it's mostly airborne," that reflects the ongoing refinement of the body of knowledge. Neither is Truth or Falsehood; they're both the best they could do with limited information.
I'm not quoting much from the article, I know. You can read it at the link. Like I said, it's not the specific content I wanted to talk about, but the process behind it.
I said I also wanted to talk about risk assessment. Consider the following scenario:
There is a person who is scared shitless of flying. Absolutely terrified, to the point where they have to take a sedative on the flight. Of course, any flight includes a risk; airplanes have, in fact, been known to drop right out of the sky. That's scary, right? But this person has to fly anyway, for business or for family, whatever, so they suck it up, take the pills, and get it over with.
But what they should be worried about isn't the flight. The chances of a disaster are vanishingly small. Not zero, of course, but really damn tiny. No, if they're worried about being killed, what they should be concerned about is the drive to the airport (whether they're driving or ridesharing or on a bus). Your chance of being killed in an auto accident -- that is, doing something that most of us do every single day -- is many orders of magnitude greater than your chance of being killed on an airplane. And yet many people blithely drive around, maybe taking basic precautions but still doing stuff like texting while driving, without the massive anxiety that flying gives them.
Risk also isn't just about probabilities. It's also about consequences. Think of the probability of an event as a continuum from low to high. That's one axis. How bad the effects might be would be a continuum from "whatever" to "holy shit."
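You can put that two-axis picture into rough code form. To be clear, every number below is a made-up placeholder for illustration, not a real statistic; the point is the shape of the calculation, not the values:

```python
def risk_score(probability, severity):
    """A toy risk measure: how likely an event is, times how bad it
    would be if it happened. Both axes matter; neither alone is risk."""
    return probability * severity

# Two hypothetical hazards on the same trip (illustrative numbers only):
flight_crash = risk_score(probability=1e-7, severity=1.0)  # rare, catastrophic
car_crash    = risk_score(probability=1e-4, severity=0.5)  # common, often survivable

# The drive dominates: hundreds of times the combined risk.
print(car_crash / flight_crash)
```

The low-probability/high-consequence corner (the plane) and the high-probability/moderate-consequence corner (the drive) feel completely different to our guts, but a simple multiplication shows which one actually deserves the anxiety.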
It's like... I used to design dams. Not big honkin' dams like Hoover, but little pond and lake dams for stormwater management. Dams can fail, of course, just like anything else. So you design to minimize the risk of failure (you can't, of course, reduce the risk to zero; it reaches a point of diminishing returns). But what you do depends, in part, on what's downstream of the dam. If there's a subdivision there, you put in some more safeguards because there's the risk of serious damage. If it's just a bunch of wilderness and then a river, well, you still don't want the thing to fail, but the potential for damage or loss of life is significantly smaller.
Or take a thing that actually happened to me once. I was out at a jobsite, looking at how deep a gas line was after they exposed it with an excavator. Now, this wasn't an ordinary, residential natural gas line; it was the big-ass pipeline delivering methane to the entire city.
The excavator operator exercised extraordinary caution (we had to know how deep it was actually buried), and the lines had been built sturdily enough, but you never know with these things. So there I am, on the edge of the pit, watching as someone took out a tape measure. Some old guy, I don't even know who it was, comes up to the edge of the pit smoking a cigarette.
In that instance, the chance of there being a gas leak was really, really tiny. But the potential consequences of sparking that leak were absolutely disastrous - big boom, gas cut off to the entire city, etc.
Risk management also has to do with what steps one needs to take to mitigate the risk. In that example, there was one simple step, and only one, that needed to be taken: don't smoke near exposed gas mains. Cheap, simple, and you don't have to be an engineer to understand it.
But as I said, risk reduction reaches a point of diminishing returns. It's why most school buses don't have seat belts: in part, it's because the cost to install them is too high relative to the marginal increase in safety they'd bring. It's why levees aren't generally 150' tall, able to withstand any imaginable flood event, but rather designed for 100-year or 500-year storms, depending on what they're protecting. (The misconceptions about x-year storm events are also a problem, but this has gone on long enough already.)
The bottom line is, it's impossible to eliminate risk. The best any of us can do is risk management. Masks don't completely stop spread of aerosolized viruses, but they help. Vaccines don't completely stop the spread, but they're a huge help. And, as this article points out, surface transmission is probably a lesser factor, and it's nice to have clean things, but if you go overboard you're taking resources away from other options, and the marginal decrease in risk isn't worth heroic efforts.
I've seen people throw up their hands in despair: if something can't be perfect, might as well not do it at all! Why bother? Well, this is why we bother: even if the risk can never go to zero, reducing it is still a worthwhile goal, especially if the methods used to reduce it are cheap and simple.
And that's all the ranting I'm going to do about that for now.
|It might be that we're interpreting history wrong. Not to mention prehistory.
What the caves are trying to tell us
Whatever they once said to their authors, they scream their message of no message across the millennia to us now.
Every so often, I get the urge to drag someone into a cave, and show them something unspeakable.
Plenty of caves would do, but let’s take him to the Cueva de la Pileta in Andalucia, Spain. There, we’ll push him into one of its huge, damp, cool cathedral-halls of fractured rock, where the darkness and the vastness of empty space seem to press themselves tightly against your skin, close and clawed and ancient. We know that there were people here, some 20,000 years ago. They left their millstones and their axe-heads; they left walls blackened with soot from fires that went out eons ago, leaving traces across a chasm of time that could swallow up the entirety of recorded history four times over. They left the bodies of their dead. And they left marks on the walls. The people who lived in this cave 20,000 years ago, people who lived lives it’s impossible for us to even imagine, are still trying to talk to us.
Oh. Never mind.
Were our ancestors just playing, with a child’s hesitancy, at the perilous game of turning bits of pigment into an abstract form beyond space and time? Or had they, long before we realized, found a way to make dead objects speak?
Anyway, the article goes on to relate those cave drawings to thoughts on evolutionary psychology, which I've ragged on in here before because, at least in its popular version, it's extraordinarily unscientific.
Evopsych combines every unscientific pop-science trope that makes people feel smart for believing in bullshit: a fetishism of geneticism and evolutionary processes, a refusal of diachronicity, and a dogmatic insistence on the cosmological principle that blankets the universe and its past in crushing sameness.
It's one thing to appreciate science. It's another thing entirely to misuse it to bolster one's own biases. We saw it with eugenics, as people cited so-called science to enshrine their self-appointed genetic superiority; we saw it with quantum physics with mystics claiming it proved Eastern philosophy or some such. And we see it all the time with evolutionary psychology.
It works like this. You start with a vague stereotype about the failings of other people that you’d like to lend some scientific heft — to take Damore’s example, the idea that “women generally have a stronger interest in people rather than things, relative to men.” You note that this behaviour is not particularly useful in an environment where just about everybody has to feign interest in some kind of tedious nonsense just so they can feed themselves; it’s not, in evolutionary parlance, an adaptive trait. But humans are no longer biologically evolving; if people are behaving in this way, it must be because these traits evolved to be advantageous in what’s called the “Environment of Evolutionary Adaptedness:” an assumed, theoretical environment of pure biological utility which is supposed to have existed in the Pleistocene, the hunter-gatherer era stretching from two and a half million years ago to just ten thousand years short of the present, the age that produced those strange markings in the caves of Europe. This environment, it’s assumed, was exactly the same for everyone, and those primitive plains still haunt our perceptions today. If women aren’t making as much money programming Google gadgets to collect data on every aspect of our lives, it must be because evolution once gave men the skills needed to throw a stick at a reindeer, while women were stuck with the traits for childrearing and patience.
The obvious and glaring problem with these "theories" is not just that they beg the question (in the original sense of the phrase), but that males and females aren't different species, and we all have traits from both parents.
In scientific terms, this is bullshit. None of its accounts are testable or disprovable; evopsych is, for all its pretensions to rationality, a collection of just-so stories.
As I've been saying.
The real social danger of this sort of thinking is, as far as I can tell, related to the naturalistic fallacy. If, for example, you assume that every individual of a species must reproduce in order to advance the species, you end up marginalizing asexuals, homosexuals, and even people like me who make a conscious choice to avoid having kids. It completely bypasses the true nature of humanity -- we don't rely on genetic evolution, but on social evolution, which is a lot faster.
Why is pink associated with girls and blue with boys? Ignore the fact that as recently as the 1920s the gender-color identification went the other way around; it’s because women evolved to spot pink-colored berries in the forest, and men evolved to hunt between the open plain and the wide blue sky. Why is there still a gender wage gap? It’s not the fault of our own society; it’s the fault of the Stone Age.
Yep. You can come up with bullshit "evolutionary" explanations for just about anything.
The next part of the text goes back to focus on the cave paintings, with a brief history of what people thought about them (spoiler: it usually reflected the in-vogue philosophies of the times). It's fascinating, because it gives more insight into the thinking of modern scholars than into the cave-dweller mind. As is appropriate.
Within the mainstream, many theorists quietly assume that the caves served some kind of religious or proto-religious function. Their location deep in the bowels of the earth might have brought to mind some connection with a shamanic underworld or spirit realm where the animal-gods move in eternal masses.
Or -- and bear with me here -- they demonstrate survivorship bias. It could be that prehistoric humans drew and painted everywhere: cliffs, trees, each other, animal skins, the ground, as well as in caves, but only the caves could preserve the art for thousands of years. I've also often been suspicious of "It was a religious thing," because, well, how do we know that except by comparison to recent history?
Fortunately, the article addresses that, too.
Of course, as a writer, I have my own bias: the idea that the symbol (be it representational art or its successor, the written language) stimulates a mental connection to the object it represents. In a sense, the picture becomes the pictured. This is a reflex in humans. If I show you a picture of a duck, you may be just as likely to go "It's a duck!" as you are to go "It's a picture of a duck." At base, it's neither; it's a collection of pixels or paint on canvas or precise patches of pigment on photographic paper.
What we call something doesn't always reflect its reality. We interpret it as a duck, though, and for most purposes, that's all that matters. But drill down far enough, and you're left with quantum uncertainty and the squiggles of wave functions; pull out far enough, and you can't see it at all.
For me, it's meaningless to search for meaning beyond the personal. We can't even agree on the meaning of something that happened 20 years ago; how can we possibly hope to come to a consensus on what is a thousand times older than that?
Still, this doesn't mean we should stop trying. If there is meaning to life, it's in the trying, not the success or failure. We're making our own cave drawings now, and maybe whatever's around in 20,000 years will put their own spin on things.
|Today, I have an article about writing... sort of.
Even if you're not interested in writing essays, speeches, or other traditional vehicles for rhetoric, knowing the tools of the trade could be useful. If nothing else, it'll help you spot rhetorical tricks used to manipulate you. Maybe. Personally, I can never remember all the Greek-y words, but at least sometimes I can remember the devices they signify.
Rhetoric is often defined as “the art of language.” That might sound like a bit of a cliché (which it is), but it’s actually quite a nice way of saying that rhetorical devices and figures of speech can transform an ordinary piece of writing or an everyday conversation into something much more memorable, evocative, and enjoyable. Hundreds of different rhetorical techniques and turns of phrase have been identified and described over the centuries—of which the 21 listed here are only a fraction—but they’re all just as effective and just as useful when employed successfully.
I'm not going to hit all of them here. Just some comments.
You’ll no doubt have heard of hyperbole, in which an over-exaggeration is used for rhetorical effect, like, “he’s as old as the hills,” “we died laughing,” or “hyperbole is the best thing ever.” But adynaton is a particular form of hyperbole in which an exaggeration is taken to a ridiculous and literally impossible extreme, like “when pigs fly!” or “when Hell freezes over!”
"When pigs fly" is one of my favorite phrases. I have a collection of winged pigs. Nothing profound about them; I just enjoy the absurdity.
Pretty sure Hell froze over when the Cubs won the World Series, though.
You know when you pose a question for dramatic effect and then immediately answer it yourself? That’s anthypophora.
Ha, I see what you did there.
If you’ve ever friended or texted someone, emailed or DMed something, tabled a meeting or motorwayed your way across country, then you’ll be familiar with antimeria, a rhetorical device in which an existing word is used as if it were a different part of speech.
Verbing weirds language. -Calvin
Right. Okay. Here goes. Asterismos is the use of a seemingly unnecessary word or phrase to introduce what you’re about to say. Semantically it’s fairly pointless to say something like “listen!” before you start talking to someone, because they are (or at least should be) already listening.
Yeah... I do use this one a lot.
When you say that something is like something else (“as busy as a bee”), that’s a simile. When you say that something actually is something else (“a heart of stone”) that’s a metaphor. But when you just go all out and label something as something that it actually isn’t (“You chicken!”), that’s a hypocatastasis.
I'm just leaving this one here because now hypocatastasis is my new favorite word. For now.
Anyway, like I said, just informational today. I wanted to have time to do a review. Yesterday I went to the theater and saw a... I don't know if "movie" is the right word. It's not a documentary. Whatever; I saw a film about the band Rush, focused on performances from their 2015 tour, with a few other fun things sprinkled in, and here's the review.
One-Sentence Movie Review: Rush: Cinema Strangiato - The Director's Cut
You can either accept that Rush was unique, and one of the greatest bands of all time, or you can be wrong.
|Here's an article from 2017, but one that seems more relevant now than in the Before Time.
The Five Universal Laws of Human Stupidity
We underestimate the stupid, and we do so at our own peril.
Normally, I'd link to the original source, but it's paywalled. This is a reprint.
Stupid people, Carlo M. Cipolla explained, share several identifying traits: they are abundant, they are irrational, and they cause problems for others without apparent benefit to themselves, thereby lowering society’s total well-being.
I would be remiss if I didn't note that intelligence isn't the last word on human worth. Half the world's human population possesses below-average intelligence (more or less), and it's possible to be stupid and nice, just as it's possible to be smart and an asshole. A university professor like Cipolla might have had some bias in the matter, though.
The only way a society can avoid being crushed by the burden of its idiots is if the non-stupid work even harder to offset the losses of their stupid brethren.
Wait, you want me to be smart and work hard? Let's not go too far, okay?
Of course, everyone does stupid things from time to time. I think it's a matter of proportion.
Let’s take a look at Cipolla’s five basic laws of human stupidity:
Oh, good. Having "The Five Universal Laws" in the headline and not stating them in the body would be... well... stupid.
Law 1: Always and inevitably everyone underestimates the number of stupid individuals in circulation.
No matter how many idiots you suspect yourself surrounded by, Cipolla wrote, you are invariably lowballing the total.
Yeah, best to assume everyone's an idiot until they show otherwise, and then still be prepared to revise one's opinion of them upon further observation.
But like I said... it's about half the population (depending on what's meant by "average").
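Side note: the "half the population" line holds by definition for the median, but not necessarily for the mean. A toy sketch with made-up scores (my numbers, purely for illustration) shows the difference:

```python
# Test-score-like numbers for a toy population of five people (invented data)
scores = [85, 90, 100, 105, 170]

mean = sum(scores) / len(scores)           # one high outlier drags the mean up
median = sorted(scores)[len(scores) // 2]  # middle value of the sorted list

below_mean = sum(s < mean for s in scores)      # 4 of 5 are "below average"
below_median = sum(s < median for s in scores)  # exactly 2 of 5 below the median
print(mean, median, below_mean, below_median)
```

With one outlier, four out of five people sit below the mean, but only two sit below the median -- which is why "what's meant by 'average'" matters.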
Law 2: The probability that a certain person be stupid is independent of any other characteristic of that person.
Cipolla posits stupidity is a variable that remains constant across all populations. Every category one can imagine—gender, race, nationality, education level, income—possesses a fixed percentage of stupid people.
This may seem counterintuitive. Okay, it does seem counterintuitive. But the whole purpose of reasoning, and of science, is to check our intuition, which is usually less reliable than logic.
Law 3. A stupid person is a person who causes losses to another person or to a group of persons while himself deriving no gain and even possibly incurring losses.
Cipolla called this one the Golden Law of stupidity. A stupid person, according to the economist, is one who causes problems for others without any clear benefit to himself.
I'm just going to come right out and say it: anyone who trusts horse dewormer over a proven (yes, proven) preventative vaccine is, beyond any shadow of a doubt, in this group.
But to me, the important thing to know is the spectrum this economics professor proposed to map humanity:
This law also introduces three other phenotypes that Cipolla says co-exist alongside stupidity. First there is the intelligent person, whose actions benefit both himself and others. Then there is the bandit, who benefits himself at others’ expense. And lastly there is the helpless person, whose actions enrich others at his own expense.
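Cipolla's scheme is really just a 2x2 grid scored on two axes: does the action benefit the actor, and does it benefit everyone else? A toy sketch (the function name and the handling of zero are my own inventions, not Cipolla's):

```python
def cipolla_quadrant(benefit_self: float, benefit_others: float) -> str:
    """Place an action on Cipolla's two axes.

    Positive benefit_self means the actor gains; positive benefit_others
    means everyone else gains. Ties at zero are resolved arbitrarily here.
    """
    if benefit_self > 0 and benefit_others > 0:
        return "intelligent"   # everybody wins
    if benefit_self > 0:
        return "bandit"        # gains at others' expense
    if benefit_others > 0:
        return "helpless"      # enriches others at his own expense
    return "stupid"            # loses himself AND causes losses to others

print(cipolla_quadrant(-1, -1))
```

The "stupid" quadrant is the one where both numbers come out negative, which is exactly why Cipolla called it the most dangerous.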
You'll have to click on the link to see the graphic.
Law 4: Non-stupid people always underestimate the damaging power of stupid individuals. In particular non-stupid people constantly forget that at all times and places and under any circumstances to deal and/or associate with stupid people always turns out to be a costly mistake.
We underestimate the stupid, and we do so at our own peril.
There's a quote attributed to Robert A. Heinlein that's relevant and I've kept it in mind for lo these many years: "Never underestimate the power of human stupidity."
Law 5: A stupid person is the most dangerous type of person.
If "stupid" is defined as above, then yeah, I can believe that.
Declining societies have the same percentage of stupid people as successful ones. But they also have high percentages of helpless people and, Cipolla writes, “an alarming proliferation of the bandits with overtones of stupidity.”
“Such change in the composition of the non-stupid population inevitably strengthens the destructive power of the [stupid] fraction and makes decline a certainty,” Cipolla concludes. “And the country goes to Hell.”
So, based on this, the only way to keep the stupid in check is to actively increase one's position on the article's graph, up and to the right. Can it be done? I don't know. I'm too lazy to try, which I guess puts me in the "stupid" quadrant.
I think it takes an economics professor to think of things in this way. I have to wonder what his thoughts on Rational Market Theory were, considering the percentage of people who act irrationally. It was in vogue around the time he wrote the above. But I can't be arsed to look it up.
|This one might not make a lot of sense unless you're a Springsteen fan. It's also from The New Yorker, generating cognitive dissonance within me. In this case, though, the content supersedes the medium.
A Springsteen Mystery Solved
Jon Landau, the Boss’s longtime close collaborator in matters musical and financial, offers a definitive answer about what Mary’s dress is doing in “Thunder Road.”
Thunder Road happens to be one of my all-time favorite songs, one that cemented my lifelong appreciation for Springsteen. It's almost fifty years old now. Bruce himself has expressed amusement at his twentysomething self writing the line, "maybe we ain't that young anymore."
The Internet is an uneven contribution to the human prospect...
Thor's balls, TNY, just once get to the point at the beginning and stop meandering around like a rabbit in Central Park.
All the way in the middle of paragraph 3:
Early this month, on a day too grim for dogs or snakes, it was best to stay inside, scroll numbly through Twitter, and wait for a virtual brushfire. Maggie Haberman, the tireless chronicler of the Trump Administration for the Times, unintentionally provided one, tweeting a photograph of a half-empty theatre and stage along with the lyrics “A screen door slams, Mary’s dress sways.” The lyrics are the opening to “Thunder Road,” arguably the best song on Bruce Springsteen’s breakthrough album, “Born to Run.”
There is no "best song" on Born to Run. The entire album is a masterpiece.
Haberman was blasted for getting Springsteen’s lyric all wrong, and, in the days since, people have continued offering confident opinions. It’s not “Mary’s dress sways”! It’s “Mary’s dress waves.”
My original LP of BTR was lost in a flood many, many years ago, but I remember the lyric sheet. And I'm pretty sure it said "waves." But it's not like you can tell from the actual song; Springsteen is an epic songwriter and musician, but he's not the world's greatest enunciator.
The article goes on to re-create a Twatter spat and mention several other sources, none of which is very enlightening, in typical TNY fashion.
Both on Springsteen’s official Web site and in his songbook, the word is “waves.” And yet Springsteen uses “sways” on page 220 of his memoir, which is also called “Born to Run,” and in his handwritten lyrics, which were auctioned off a couple of years ago by Sotheby’s.
Yes, I have a copy of the book Born to Run. No, I don't remember the lyric being different. You think you can't understand Bruce's voice sometimes? Try reading his handwriting. Dude should have been a doctor. I'm not trying to be mean, here. My handwriting sucks ass, too. In Bruce's case, you don't have to have good handwriting to write powerful lyrics. Or, for that matter, write a damn good book.
I e-mailed Jon Landau, who, as a critic for The Real Paper, in 1974, declared Springsteen to be the future of rock and roll, and then became his close collaborator in matters musical and financial. Short of Springsteen himself, no one could answer the question more definitively than Landau.
That's from the penultimate paragraph. It's like they have to drive from midtown through Delaware to get to the Bronx.
“The word is ‘sways,’ ” Landau wrote back. “That’s the way he wrote it in his original notebooks, that’s the way he sang it on ‘Born to Run,’ in 1975, that’s the way he has always sung it at thousands of shows, and that’s the way he sings it right now on Broadway. Any typos in official Bruce material will be corrected. And, by the way, ‘dresses’ do not know how to ‘wave.’ ”
Well. I'd rather get confirmation from Bruce himself, but I guess Landau's the next best source.
And you know, now I think of it, I don't see how it could be any other way. Early Springsteen was absolutely obsessed with rhymes. The first song (Blinded by the Light) on his first album was, by his own admission, the result of him sitting down with a guitar and a rhyming dictionary. And the last word of the second line of Thunder Road? The one right after the Mary's dress one? It's "plays." Of course the first line had to end with "sways." Why I never saw this before, I don't know -- but now I'll never unsee it.
I remember the shock and dismay I felt when I found out some people don't care about lyrics. I mean, what the hell? Especially with people like Springsteen, they're, like, at least 70% of the power of the music. Regardless of the source, this was another surprise for me -- I'll try to remember it next time I'm belting out that song at karaoke.
Thunder Road breaks all the rules of songwriting. There's no real chorus. There's no melodic hook. It just jumps out (it was the first song on the album) like a bull in a rodeo, bucking and bellowing, never letting up... just, eventually, fading out like you're bored with the show and looking for something else to do, but you know that somewhere, the bull's still raging, and it will not quit until the last recording of it succumbs to the entropy that will eventually take us all.
And of course, I can't let this go without at least including the song in question.
The screen door slams, Mary's dress ~~waves~~ sways
Like a vision, she dances across the porch as the radio plays
Roy Orbison singing for the lonely
Hey, that's me, and I want you only
Don't turn me home again
I just can't face myself alone again...
|Today's article is more about reading than about my comments. But of course, I have comments, too.
The Baloney Detection Kit: Carl Sagan’s Rules for Bullshit-Busting and Critical Thinking
Necessary cognitive fortification against propaganda, pseudoscience, and general falsehood.
This article is over seven years old. I only wish more people had read it before... well. You know.
Carl Sagan (November 9, 1934–December 20, 1996) was many things — a cosmic sage, voracious reader, hopeless romantic, and brilliant philosopher. But above all, he endures as our era’s greatest patron saint of reason and critical thinking, a master of the vital balance between skepticism and openness.
I never read much of his stuff, and only saw a few of his videos, but I guess you could say he was an inspiration for me. I wouldn't call him a saint, though. It's not a good idea to elevate people like that. Besides, have you seen what passes for a saint lately? No thanks.
In a chapter titled “The Fine Art of Baloney Detection,” Sagan reflects on the many types of deception to which we’re susceptible — from psychics to religious zealotry to paid product endorsements by scientists, which he held in especially low regard, noting that they “betray contempt for the intelligence of their customers” and “introduce an insidious corruption of popular attitudes about scientific objectivity.”
At the same time, though, scientists have to make money somehow, and they're certainly not going to do it by stashing microchips in vaccines or hoaxing us into believing the planet is round and global warming is an existential hazard.
Because they're not doing either of these things.
Through their training, scientists are equipped with what Sagan calls a “baloney detection kit” — a set of cognitive tools and techniques that fortify the mind against penetration by falsehoods:
Ideally, yes. In practice, some slip through the cracks. But one doesn't have to be a scientist to use some of the tools.
The article goes on to list the actual thinking methods involved. I've highlighted a few of them in here before.
Just as important as learning these helpful tools, however, is unlearning and avoiding the most common pitfalls of common sense.
I've come to the conclusion that common sense is neither. It's often riddled with nonsense, and everyone's idea of "common" is different. Science tries to get past that and into some sort of objectivity. Often, its conclusions defy common sense, and yet are still true, to the extent that anything can be called "true."
I'd encourage everyone to read the next section, because it lists a bunch of fallacies; to avoid them, one must first recognize them. They're important, and I know I need to be reminded of them repeatedly for some of them to sink in.
The Demon-Haunted World is a timelessly fantastic read in its entirety, timelier than ever in a great many ways amidst our present media landscape of propaganda, pseudoscience, and various commercial motives.
And one of these days, I really should read that.
|Yes, this article is about math. Sort of. There's very little actual math involved, so no one's head should explode if you click it.
New mathematical record: what’s the point of calculating pi?
The famous number has many practical uses, mathematicians say, but is it really worth the time and effort to work out its trillions of digits?
I mean, really, when it comes right down to it, what's the point of doing anything? Why don't we just sit here and let shit happen to us, passively? Why climb Mount Everest? Why do research at the South Pole, why swim across the Atlantic, why break speed records, why build faster computers?
Swiss researchers have spent 108 days calculating pi to a new record accuracy of 62.8tn digits.
Okay, wow, that's a serious lot of digits.
It’s an impressive and time-consuming feat that prompts the question: why?
Because it's what we do if we're not uncuriously ignorant.
Jan de Gier, a professor of mathematics and statistics at the University of Melbourne, says being able to approximate pi with some precision is important because the mathematical constant has many different practical applications.
That's great. Even if it didn't, it would still be a good thing to do, as the impractical also has a purpose.
In maths, pi pops up everywhere. “You can’t escape it,” says David Harvey, an associate professor at the University of New South Wales.
This is true. It even shows up in basic cosmological equations.
“I can’t imagine any real-life physical application where you would need any more than 15 decimal places,” [Harvey] says.
This is also true. At least for now. Things have a way of expanding to require greater precision.
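A quick back-of-the-envelope check on Harvey's point (my numbers, not the article's): even lopping pi down to ten decimal places shifts the computed circumference of Earth's orbit by only a few dozen metres, out of nearly a trillion:

```python
import math

# Radius of Earth's orbit in metres (~1 astronomical unit; illustrative figure)
r = 1.496e11

# Circumference with full double-precision pi (~15-16 significant digits)
full = 2 * math.pi * r

# Circumference with pi truncated to just 10 decimal places
truncated = 2 * 3.1415926535 * r

# The absolute error: a few dozen metres on a ~9.4e11 m circumference
error = abs(full - truncated)
print(f"circumference: {full:.3e} m, truncation error: {error:.1f} m")
```

So ten digits already gets you within rounding distance of a city block at solar-system scale; fifteen digits pushes the error down to micrometres, which is why nobody needs trillions of them for engineering.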
Given that even calculating pi to 1,000 digits is practical overkill, why bother going to 62.8tn decimal places?
De Gier compares the feat to the athletes at the Olympic Games. “World records: they’re not useful by themselves, but they set a benchmark and they teach us about what we can achieve and they motivate others.
And sometimes, by achieving something like that, you discover new methods and other things -- some utilitarian, some not.
There is no such thing as useless knowledge. Everything is connected to everything else, and just because none of us can see the Big Picture, that doesn't mean that someone, somewhere, won't make a paradigm-shattering discovery just by playing around with what us ignorant shitheels call "trivia."
It's great to ask "why" in most circumstances. But when it comes to wondering why we should learn new things, push limits, search for new ways of looking at things... well, in those cases, it's perfectly acceptable to shrug and answer, "Because." Knowledge is its own justification.
|In a world where headlines are clickbait and ledes are buried deeper than COVID-19 mass casualties, it can be refreshing to come across a story, however bizarre, that actually qualifies as news.
That's right. You read it here, folks: an actual Man Bites Dog story!
A 32-year-old man is facing charges after allegedly biting a Vancouver Police dog on Thursday morning.
Now see? That's how you write a lede. It covers the basics: who, what, when, where, and why, with the "how" notably absent but able to be inferred ("with his teeth.") Also, the rest of the article covers that bit.
“The suspect allegedly resisted arrest and bit Police Service Dog Mando, which was assisting in the arrest,” VPD said.
Mando? Mando? Does he have a companion named Grogu? Perhaps with an insatiable hunger for frog eggs?
The suspect was bitten by the police dog during the arrest and treated at hospital. Police Service Dog Mando has minor injuries, VPD said.
Oh, you were doing so well until BOOM passive voice.
Still, "minor injuries" is encouraging.
Police said they are recommending “multiple charges” against the suspect.
Might want to test him for rabies while you're at it.
Hopefully, after this sad tail, the dog will recover fully and get a new leash on life. He can harness the paw-er of this experience for future encounters. As for the suspect, well... looks like he's boned.
|Just a few short hours ago, I had a few drinks with Same Ol' Sum1, who's visiting for a couple of days (and also with ArtemisMad), so it's just as well that today's randomly-selected article is easy to deal with. It does, however, touch on a controversial and divisive topic, one that has been known to destroy friendships, end marriages, and divide siblings. Yes, I'm talking about... cryptocurrency.
Or, rather, Cracked is doing most of the talking.
Well, I guess we know where the world's foremost intellectual site stands on the issue. Especially when they can't count.
Whether you think cryptocurrency is the second coming of Cyber-Jesus or that crypto is just stupid computer program money based on nothing (as opposed to fiat currency, which is also based on nothing), crypto's got some problems all the best Reddit-soaked minds haven't solved yet ...
Look, in the end, everything is based on nothing. The other alternative to fiat currency? Commodity-based currency. What's the actual, intrinsic value of, say, gold? Why, it's whatever people decide it is. Come the Apocalypse, gold won't be worth the paper it's printed on: it's shiny and pretty, but if you can't eat it or smite your enemies with it, it's worthless. What will be valuable after the complete collapse of civilization? Certainly not crypto, which requires electricity -- you know, that stuff that we won't have ANY of when the survivors emerge, starved and blinking, from their bunkers.
No, the Ruler of the Post-Apocalyptic Wasteland will be whoever has managed to stockpile the most coffee.
Anyway. Back to cryptocurrency, which I call Dunning-Krugerrands. As usual, Cracked counts it down.
6. Lose Your Password, Lose Your Money
One of the benefits of crypto is that it's decentralized. There's no central bank, no cabal of corporate overlords, Cultural Marxists, and whoever else we don't happen to like today, controlling your cash. Instead, crypto relies on a wallet, a password-protected application that holds your digital assets for you, like your own mini-bank (not of the spank variety).
Periodically, I forget the password to my primary bank account. I can recover it via a link (assuming I can remember the password to my email account). I don't think it works that way for Dunning-Krugerrands.
It's true that you can call up a professional to use brute force attacks on a crypto wallet to try and unlock it if you want to put your economic future in the hands of Dave Bitcoin.
So... it's not secure?
5. Letting Someone Else Hold Your Bitcoins ... Doesn't Always Go Well Either
I can't find a decent excerpt here. You're just going to have to go to the link to see how bitcoin exchanges can suck ass.
4. Crypto Anonymity is Only For the Sophisticated
Police are actually getting pretty good at tracing and seizing the stuff.
So... yeah, this. Think you're safe from prosecution because you hired that hitman (who was really an FBI agent) with Dogecoin? Such naïveté. wow. much laughter.
3. Crypto is Actually Hard to Use
Crypto relies on digital infrastructure alone, which means that if that infrastructure screws up, you can't buy a car, pay for pizza, or the bootleg Ninja Turtles boxset you've had your eye on. You're totally dependent on it to transact business.
Again, utterly useless come the Apocalypse. But even if we manage to escape that, how many times a day does YOUR internet glitch?
2. All Our New Renewable Energy Capacity is Being Eaten by Crypto
Hell, for that kind of energy, you could just go mine gold. Like, out of the ground and everything.
This has been bugging me for a while. So much energy is needed to mine twitcoin now that, well, there have got to be more efficient ways to make a profit, right? At some point, it will become cheaper to invent a fusion reactor that can literally turn lead into gold... at which point, gold will lose what value it has left.
1. Inequality is a Huge Problem in Crypto Too
Another one I can't do justice to. This isn't about racial inequality, though, but the ability for large players to do market manipulation.
You know, just like with fiat currency.
I'm not trying to come down hard against cryptocurrencies, mind you. I don't know enough about them, and I know I don't know enough about them. I just think it's important to know potential downsides whenever trying anything new. If you can still live with the disadvantages, great, have at it. But I've encountered too many people (all online, naturally) who have absolutely converted to the cryptocult, and there's no swaying them.
And it's still a sad state of affairs when those disadvantages are most clearly enunciated by a comedy website.
|Here's a Bloomberg link that's not nearly as pants-on-head stupid as the last Bloomberg article... though that's not a high bar to clear.
The Economics of Dining as a Couple
Sure, protectionism would guard your pommes frites. But free trade ensures that the best goods reach both sides of the table.
Actually, it's rather amusing, and I do like to see complicated concepts made relatable. I will point out that the article is five years old, but I don't think that matters here.
Marriage counselors tell us that couples frequently tie the knot without discussing the core matters that can cement or sunder their marriage: finances, children, religion. Well, let me add one under-discussed biggie to the list: restaurant dining.
Well, I guess it does matter. A lot of restaurants in my area never reopened. Some of those that have are operating under reduced hours and/or capacity, and workers are fed up with dealing with assholes while not getting paid enough to deal with assholes, so they're understaffed.
Still, you remember the concept of "restaurants," don't you?
But. To get back to the article. I'm sure some couples "forget" to discuss finances, children, and religion before getting hitched, though... I mean... why? However, I'd venture that most couples go on at least one restaurant date before getting married. If they don't, chances are they're not restaurant people to start with, so it won't matter.
I am eternally astonished to find not only that many couples I know failed to discuss this key area before they marched up to the altar, but also that many of them still have not developed a joint dining strategy even after 10 or 20 years together. This is madness.
This... is... Sparta!
Sorry, knee-jerk reflex.
As a romantic economist might put it in a wedding-reception toast, couples have the chance to jointly move to a higher utility curve.
You'll probably have to look that one up. I did.
There are basically four strategies that a couple can practice at a restaurant.
I should note here that this is something I'll personally never have to deal with again, but I do have an interest in economic theory, so I read on.
1. Autarky. This is when a country is closed to imports or exports, and produces everything it consumes. In the restaurant context, it means that you are each independently deciding on what to eat, with no input from the other person, and then solely consuming what you ordered.
Admittedly, this was always my preferred strategy. I was an only child, and while I'm happy to share a lot of things, my fries aren't one of them.
An economist will tell you that autarky is terrible. You’re missing gains from trade!
What gains? You steal my fries and offer me a bite of salad? Get bent.
2. Individual production with trade. Under this model, you both order whatever you want, and allow the other person to take a few bites in exchange for a few bites of their food. This is how the world economy works, and it is much better for dining than autarky.
The above notwithstanding, if we have an international trade agreement in place before we order, I'm open to negotiation... but only if you don't order something I'm going to hate, like any vegan dish, or anything with eggplant.
3. Individual property rights, with option trading. Now we’re moving toward a more centrally planned economy. The menu is individually consulted, and then the two parties state their preferences.
Which is what I was talking about above. Still, no: I don't care if it's someone I kiss all the time; her taking a bite out of my hamburger is completely out of the question.
4. Full food communism. A communist economy is a terrible idea. A communist dinner table, on the other hand, truly is a bounteous paradise.
No. No it is not. If I ordered a steak, I expect to eat the whole steak, and not trade some for your oysters (which are only technically food anyway).
Anyway, just a bit of fun here. I was especially amused, though, by the standard disclaimer at the bottom of the article:
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
Obviously, because it has the word "communism" in it. And rich people don't share. That's how they became rich.
And now it's time for me to drink some Romulan Ale and watch Lower Decks. No, you can't have any; make your own. I'd have watched it earlier, but Shang-Chi opened yesterday and of course I had to go see that. Consequently...
One-Sentence Movie Review: Shang-Chi and the Legend of the Ten Rings
I enjoy martial arts flicks and MCU movies; this one combines good stuff from both, though the Marvel tie-ins aren't nearly pervasive enough to shut out anyone unfamiliar with those films, and I feel like it can stand alone -- though one might not get the full effect from some of the in-universe references, there's plenty of action, fights, SFX, and even car chases to keep your mind off what you might be missing.
|Warning: This entry may be offensive to British readers. (It's still 18+ here though)
The article is over two years old, and for all I know fashion has changed radically in that time (apart from the addition of face masks as an accessory).
The “fanny pack” sold for $10 ($95 today). For the next several decades, it remained popular among recreational enthusiasts traveling by bike, on foot, or across trails where hands could be kept free and a large piece of travel luggage was unnecessary.
So, it's useful. That should be enough to keep it from being shunned, whether it's ugly or not.
From there, it morphed into a fashion statement, marketed by Gucci and Nike for decorative and utilitarian purposes in the 1980s and '90s, before becoming an ironic hipster joke.
But of course, when something becomes a "fashion statement" rather than merely utilitarian, it opens itself up to ridicule.
In the late 1980s, fashion took notice. High-end labels like Chanel manufactured premium fanny packs, often with the more dignified name of belt bag.
Perhaps they were more aware of the British use of the word "fanny" and wanted to market in the UK.
Like most trends, overexposure proved fatal. Fanny packs were everywhere, given out by marketing departments of major brands like Miller Beer and at sports arenas and stadiums.
Yeah, that would have been the end for me, too. Now if it had been sponsored by someone who made actual beer, it might have been different -- but there wasn't a lot of that in the 80s.
In 2018, fanny packs were credited with a surge in overall accessories sales, posting double-digit gains in merchandise. The fanny pack may have had its day as an accessory of mass appeal, but it’s not likely to completely disappear anytime soon.
Which just goes to show that if something is actually useful, it will not completely fall out of style.
Trousers... you're next.
|Welcome to September.
I'm not participating in "30-Day Blogging Challenge" [13+] this time around, because I volunteered to be a judge for a week later this month. That's right; I'm judging YOU. (If you're participating.)
There's a lot going on in September. Not only is there the WDC 21st birthday celebration, but the month also contains my own account anniversary, and soon, people can start signing up for "October NaNoWriMo Prep Challenge" [13+]. That's right, November is coming up faster than you could possibly want, so it's not too early to start thinking about NaNoWriMo if you're into that sort of thing.
It is, however, too early to start thinking about the December holiday season.
The best thing about September, though -- actually, one of the only good things, because I'm one of those weirdos who prefers the heat of summer to the dreariness of autumn -- is Oktoberfest.
Why, you ask, am I associating Oktoberfest with September? Good question. The official Oktoberfest season is mid-September to early (really early) October. I have no desire to go to the original (and couldn't this year anyway because it's cancelled), but that doesn't mean I can't take the opportunity to enjoy a fine Märzen lager. So that's one thing I always look forward to in September. Another is, believe it or not, pumpkin beer. A lot of them suck, yes, but occasionally you find a good one.
But with the coming of cooler weather here in the Best Hemisphere, one thing I can appreciate more than I do in the summer is darker beers. Specifically, stouts. While I'm not above enjoying any beer in any season, some of them just work better in different weather conditions. And my favorite style of stout is Russian Imperial Stout, which isn't actually a Russian style but a British one, and that's a story for another time (if I haven't already told it somewhere in here).
And, to segue into today's article, my favorite Russian Imperial Stout is made by North Coast in California, and it is called Old Rasputin. It even has a portrait of the Mad Monk on the label. Its motto is "Never Say Die," because Rasputin was famously hard to kill.
...or was he?
In the winter of 1911, a group of high-ranking Russian priests gathered in secret to lay a well-planned trap. Their target was none other than the notorious Siberian mystic Grigori Rasputin, who seemed to have established a mysterious hold over the Tsar. Rasputin had been lured to the meeting by his former friend, the Archimandrite Iliodor. But as soon as he breezed into the room, he was rushed by Iliodor and the “holy fool” Mitka Kolyaba, a one-armed epileptic who had been a previous favorite of the royal family. They grabbed hold of Rasputin’s penis and squeezed it, demanding that he confess his sins, while a hysterical bishop began beating him around the head with a huge crucifix, screaming “Devil!” with each blow. After all, it’s like the old saying goes: “Problems with a mad monk? Try crushing his junk!”
In case it's not blindingly obvious from that intro, or from previewing the URL at the link, this is from Cracked.
From the safety of the palace, he had his enemies exiled. But perhaps he should have listened to them, or at least learned some lessons about attending mysterious meetings alone. Because five years to the day after the priests’ attack, Rasputin agreed to pay a late-night visit to Prince Felix Yusupov, the richest man in Russia. His body was pulled from a frozen river the next day.
Some people never learn, I guess. But I suppose if you're buddies with the Tsar and banging the Tsarina (well, okay, allegedly), you might start to believe you're bulletproof.
As we all (hopefully) know, no one in that court turned out to be bulletproof.
You’ve probably heard the conventional version of the story, where he survived being poisoned, shot, beaten, stabbed and castrated, before finally drowning clawing at the ice of the frozen river Neva. That story actually has more problems than that math textbook Jay-Z wrote, but don’t worry! Everyone at Cracked is dressed in the full Lara Croft outfit, archaeology shorts included, and we’re ready to solve this historical mystery.
There follows the usual Cracked numbered list, but I'm not going to paste it all here; this has already gone on long enough and, frankly, the remnants of a hurricane are blowing through here right now, along with rain, floods, wind, thunder, lightning, and cats and dogs living together, and my internet could go out at any moment. Especially since I just (11:25 pm) got a tornado warning on my phone.
Still, some highlights:
And no matter what a certain animated movie may have claimed, he definitely did not rise from the dead to lead the Russian Revolution.
Look, anyone who thinks that ANY historical Disney movie has anything more than a wispy connection to the facts is beyond such warnings.
In 1914, he was stabbed by a noseless follower of the mad priest-monk Iliodor.
Why Disney has to make shit up when there's awesomeness RIGHT THERE is beyond me.
Seriously, though, the rest of the article is fascinating, poking holes in the "official" history while suggesting alternatives.
But one thing remains: whatever the actual facts are -- and we will probably never know for sure -- the mythology and speculation around those last years of the Tsars are compelling enough without needing to invent Zombie Rasputin.
Great beer, though.
|Eons ago, in the Before Time, I wrote a blog entry that mentioned the Decoy Effect. Here it is if you're interested: "Gindex" . I don't expect anyone to remember this. Hell, I barely remembered it; I had some vague feeling that I'd talked about it before so I did a search and that was the only entry that came up.
I just wanted to verify that I hadn't linked this particular article, which even predates that entry. Apparently, I hadn't. So here it is.
Price is the most delicate element of the marketing mix, and much thought goes into setting prices to nudge us towards spending more.
Everyone knows, or at least should know, about the bullshit that is the "sale." Nevertheless, it works. Say you want a widget. It's $100, which is outrageous. So you wait. A couple weeks later, you see: Widget Sale! 50% off! And you buy a widget for $50 plus tax.
Thing is, the widget cost the store $20. So you're still getting boned. Yeah, I know, they gotta make a profit, but really?
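The widget arithmetic above works out like this (using the hypothetical numbers from the example; the $20 wholesale cost is the scenario's assumption, not real retail data):

```python
# Hypothetical figures from the widget example: a $100 list price,
# a "50% off" sale, and a $20 wholesale cost to the store.
list_price = 100.00
sale_discount = 0.50
store_cost = 20.00

sale_price = list_price * (1 - sale_discount)
store_profit = sale_price - store_cost
margin = store_profit / sale_price  # store's profit as a fraction of what you paid

print(f"You pay:        ${sale_price:.2f}")   # $50.00
print(f"Store profit:   ${store_profit:.2f}") # $30.00
print(f"Store's margin: {margin:.0%}")        # 60%
```

Even at "half off," the store keeps 60 cents of every dollar you hand over. Hence: boned.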
There’s one particularly cunning type of pricing strategy that marketers use to get you to switch your choice from one option to a more expensive or profitable one.
It’s called the decoy effect.
Gosh wow, I've never heard of that! Except it's in the headline.
The article goes on to provide a retail example, but it also applies to menu options at a restaurant, as I noted in that long-ago entry.
The decoy effect is defined as the phenomenon whereby consumers change their preference between two options when presented with a third option – the “decoy” – that is “asymmetrically dominated”. It is also referred to as the “attraction effect” or “asymmetric dominance effect”.
You can tell that those phrases weren't marketed to consumers because they're from Latin roots.
What asymmetric domination means is the decoy is priced to make one of the other options much more attractive. It is “dominated” in terms of perceived value (quantity, quality, extra features and so on). The decoy is not intended to sell, just to nudge consumers away from the “competitor” and towards the “target” – usually the more expensive or profitable option.
Not sure why they'd necessarily push people to the more expensive option, unless it's something that confers bragging rights, like a car or a fine scotch. Usually the company only cares that they make more profit. If you can make $20 profit on a $100 woogie, or a $10 profit on a $200 woogie, you want to sell more of the $100 woogies.
The article then describes an actual psych experiment where this was demonstrated. However, the experiment, in my view, suffers from the same bias as almost every psych experiment: their guinea pigs weren't chosen at random from the general population, but from college students.
Still, the results were interesting and changed the way businesses priced things.
The decoy effect is thus a form of “nudging” – defined by Richard Thaler and Cass Sunstein (the pioneers of nudge theory) as “any aspect of the choice architecture that alters people’s behaviour in a predictable way without forbidding any options”. Not all nudging is manipulative, and some argue that even manipulative nudging can be justified if the ends are noble. It has proven useful in social marketing to encourage people to make good decisions such as using less energy, eating healthier or becoming organ donors.
I can't say I know a lot about economics (anyone who can say that is probably lying, to themselves or to others), but I've been told that one of the foundational principles of economics is: people respond to incentives. Those incentives can be financial, or, in the case of my town switching to single-stream recycling some years back, time savings.
Another example provided in the article talks about pricing a magazine subscription. I don't know; when I went through the example, I didn't choose the "forced" option (actually I didn't choose any of the options because I'm a cheap bastard, but I mean hypothetically). This is because, in that hypothetical situation where you can get online only, print only, or online and print, I have no interest whatsoever in print options because I get enough mail as it is, and most of it ends up in the aforementioned recycle bin without having been read.
The next example is more representative, I think, of people's daily choices.
Consider the price of drinks at a well-known juice bar: a small (350 ml) size costs $6.10; the medium (450 ml) $7.10; and the large (610 ml) $7.50.
Which would you buy?
If you’re good at doing maths in your head, or committed enough to use a calculator, you might work out that the medium is slightly better value than the small, and the large better value again.
But the pricing of the medium option – $1 more than the small but just 40 cents cheaper than the large – is designed to be asymmetrically dominated, steering you to see the biggest drink as the best value for money.
Putting aside for a moment the fact that any of those prices is way too much to pay for a drink that isn't beer, I have to admit I don't see the point here. Even people who suck at math can figure out that the large is nearly twice the size of the small for a modest increase in price (the reason pricing scales that way probably has to do with cup and labor costs), and will tend to get the large. Unless -- like me at a coffee shop perusing the tea prices (because I don't like coffee) -- you decide that's just way too much drink, and pay the higher unit cost for a small because that's really all you want.
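For anyone who doesn't want to do the unit-price math in their head, here it is spelled out, using the sizes and prices quoted from the article:

```python
# Sizes and prices from the juice-bar example: (milliliters, dollars).
sizes = {"small": (350, 6.10), "medium": (450, 7.10), "large": (610, 7.50)}

for name, (ml, price) in sizes.items():
    per_100ml = price / ml * 100
    print(f"{name:>6}: ${price:.2f} for {ml} ml -> ${per_100ml:.2f} per 100 ml")
# small:  $6.10 for 350 ml -> $1.74 per 100 ml
# medium: $7.10 for 450 ml -> $1.58 per 100 ml
# large:  $7.50 for 610 ml -> $1.23 per 100 ml
```

The medium exists mostly to make the large look like a steal: per milliliter, the large is about 30% cheaper than the small. That's the asymmetric dominance doing its job.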
Still, the decoy effect is real; it's marketing and applied psychology. I'd be willing to bet, though, that there are plenty of people it's lost on. All I'm saying is: be aware of it, and buy what you want anyway; just know that what you "want" is, as always, being influenced by external forces.
|This... this is satire, right? This has to be some sort of philosophical parody.
Spare a Thought for the Billions of People Who Will Never Exist
As world population growth slows, the never-conceived are the ultimate forgotten ones.
Now, this source - Bloomberg - is obviously pro-capitalist, and capitalism absolutely relies on population growth. You need wage slaves and consumers, or your company doesn't make as much money, and that would be terrible. And the value of your company depends not on absolute profits, but on profit growth year over year. So it's not surprising that a bastion of capitalism would sound the bells of terror over the very idea that maybe future population will, not necessarily be lower, but not grow as quickly as they'd like.
But I don't know. That's speculation. The article is, at least on the surface, more philosophical.
A couple decides to have one child instead of two, or none instead of one. This happens all over the world. Billions of children are never conceived.
And? What matters are actual lives, not imaginary ones.
How real is the loss of a life that never began?
None. None real.
Is there a right to exist?
For a person who exists, usually yes. For a person who doesn't exist, well, that's like saying "There has not been a Martian born on the Moon, but there should be and therefore they have a right to exist!"
Is there an ideal size of the world population?
I'm sure there is, but ideal for what, and for whom? I wouldn't even attempt to guess at what the ideal size of the world population of humans is; we'd have to take into account arable land, pollution, resource availability, and myriad other factors, not the least of which would be weighing any benefit of a larger human population against the existence of other species.
These related questions become more pressing as population growth slows.
There is population. There is population growth. And then there is change in rate of population growth. These things can be seen as analogous to position, velocity, and acceleration. "As population growth slows" refers to a decrease in acceleration. And I absolutely reject the hypothesis that these questions are "pressing;" they're mere philosophical abstractions.
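The position/velocity/acceleration analogy can be made concrete with a few invented numbers (these are illustrative, not actual census figures):

```python
# A toy illustration of the analogy: population ("position"), its growth
# per year ("velocity"), and the change in that growth ("acceleration").
# The numbers are made up for demonstration.
population = [7.00, 7.08, 7.15, 7.21]  # billions, in successive years

growth = [b - a for a, b in zip(population, population[1:])]    # "velocity"
change_in_growth = [b - a for a, b in zip(growth, growth[1:])]  # "acceleration"

print(growth)            # all positive: population keeps rising
print(change_in_growth)  # all negative: the *growth* is slowing
```

"Population growth slows" describes the second list going negative while the first stays positive: the car is still moving forward, just easing off the gas.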
The late University of Oxford philosopher Derek Parfit wrestled with the question of the world’s ideal population in an influential 1984 book, Reasons and Persons. He didn’t delve into the carrying capacity of the planet, and he stayed away from the issue of abortion, which occurs after conception and thus raises a different set of concerns.
Well, that year doesn't have any dire connotations. Anyway, any meaningful discussion of ideal population size has to take into account "the carrying capacity of the planet," which to be fair is a shifting, chaotic target because so many things - like climate and technology - affect it.
Also, just to be crystal clear, I too am staying away from the issue of abortion. That has nothing whatsoever to do with any discussion here, and I only mention it to dismiss the topic out of hand.
In an abstract, theoretical way, the British thinker presented what he called the “Repugnant Conclusion.” Here’s how he stated it: “For any possible population of at least 10 billion people, all with a very high quality of life, there must be some much larger imaginable population whose existence, if other things are equal, would be better, even though its members have lives that are barely worth living.”
As a "conclusion," that is, on the surface, utter nonsense. But one must delve into how he reached that conclusion in order to consider specific arguments. I haven't read the book, so I have to rely on this author's summary.
Parfit’s utilitarian logic was that if each person on the planet is happier alive than dead—even if just barely—then the total amount of happiness in an extremely large population, let’s say hundreds of billions, would be greater than the total happiness of a smaller population whose average happiness is greater.
That's... that's not logic. First it relies on extremely questionable premises; second, why would the "total" matter?
I guess they're considering "not born" to be the same as "dead"? That's crap. Also, it is impossible to be happy, or feel any other emotion, if you don't exist; most living people would rather stay alive, sure, but dead (or never-existent) people don't have the ability to "rather." And finally, what the declarative fuck does happiness have to do with anything, and how do you define it? It sounds like the author assumes an absolute scale, like the kelvin temperature scale, with "dead" as the "absolute zero" of happiness, so it can only go up from there. But... depending on how you define happiness... it's more like the Celsius scale, where 0 happiness is some arbitrary point, and it's absolutely possible to be alive and to feel negative happiness.
If you don't believe me, congratulations; obviously, you've never been unhappy. Good for you.
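For what it's worth, here is the totals-versus-averages comparison at the heart of Parfit's argument, sketched with utility numbers I invented (Parfit doesn't assign actual values, and neither scale nor figures here are anything but illustrative):

```python
# Parfit's "simple arithmetic," with made-up utility numbers:
# a smaller population of very happy people vs. a huge population
# of people whose lives are "barely worth living."
small_pop, small_happiness = 10_000_000_000, 90   # 10 billion, very happy
huge_pop, huge_happiness = 500_000_000_000, 2     # 500 billion, barely positive

total_small = small_pop * small_happiness
total_huge = huge_pop * huge_happiness

print(total_huge > total_small)            # True: huge world "wins" on totals...
print(small_happiness > huge_happiness)    # True: ...but loses badly on averages
```

Note that the whole comparison depends on the "barely worth living" utility being pinned just above zero on an absolute scale. Allow negative values (the Celsius scale, per above) and the totals argument falls apart on its own terms.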
It’s simple arithmetic.
No. No, it is not. How in the hell do you put a numeric value on an abstract concept? Any attempt to do so is necessarily subjective. And before anyone goes "but money is an abstract concept and we put a value on that," yes, but money still has a tangible reality aspect, whereas happiness, like love, pain, or confidence, can only be self-reported, and only against one's prior experience.
One way to escape the Repugnant Conclusion is to maximize average happiness instead of total happiness.
Another way is to stop worrying about happiness. Surely there are other means of self-evaluation?
Another possible escape from the dilemma is to assert that some irreplaceable things are lost in the transition from a smallish, well-off population to a huge population of people just getting by. As Parfit put it, first Mozart goes away, then Haydn, etc., until all that’s left is “muzak and potatoes,” no amount of which can compensate for the loss of Mozart.
Counterpoint: we also end up with an absolute fewer number of murderers, rapists, cannibals, etc. For every Mozart there's a Dahmer.
Oxford philosopher Hilary Greaves wrote in 2017 in an article titled “Population Axiology” in the online journal Philosophy Compass that there’s no way out of Parfit’s conundrum without surrendering one or another moral intuition, so one’s solution to it “appears to be a choice of which intuition one is least unwilling to give up.”
All due respect to any philosophers involved, but there is definitely a way out: don't use "happiness" as any kind of marker. As I said, it's squishy and undefinable, and there has got to be a more objective measure, like life expectancy or crime rate or how well any individual's objective needs, such as clean air, water, food, and companionship, are being met. Sure, one could say that fulfilling those needs provides "happiness," but I still say that a lot of people living unhappy lives would rather never have been born in the first place.
The question of the ideal world population size may never be resolved by philosophers.
Clearly not. That's going to take actual science.
In conclusion: bullshit.
|Here's something I've been ranting about for a long time, but this time with arguments from... well, someone other than me. In this case, an actual sociology professor; that is, someone who actually studies this crap.
Comment: Bye, boomers; and millennials and Gen X and Z
Naming generations works against understanding societal similarities and differences. Stop it.
I used to wonder what they'd call the generation after Z, which is, of course, the final letter of the alphabet. Then I thought "what does it matter? We're not going to last that long." Boringly, the people whose job it seems to be to slap labels on everything decided to call them Alpha or some shit like that, which will definitely end well, considering what little conversation I've been privy to from the manosphere.
Consider these facts: The tennis champion Williams sisters are a generation apart, according to the Pew Research Center. Venus, born 1980, is part of “Gen X”; Serena, born 1981, is a “millennial.” Meanwhile, Donald Trump and Michelle Obama are both in the same generation. The former was born in 1946 while the latter was born in 1964, making them both “baby boomers.”
I've been saying shit like this for a while now, without those particular examples. Consider, for example, a hypothetical pair of twins born 10 minutes apart -- one at 11:55 pm on December 31, 1964, and the other being born at 12:05 am on January 1, 1965. The former is officially a Baby Boomer. The latter is officially Gen-X. Whatever. Meh.
I'm an early Gen-Xer by any definition in use. The whole "generations" astrology thing proposes that I have more in common with someone born in -- depending on who you ask -- 1984 (my first year of college) than with someone born in 1964 (with whom I could have gone to middle school). This is grade-A cow manure.
People are, for better or worse, born every year. It's a continuum, not a series of quantum jumps. Well, okay, but only very tiny quantum jumps, definitely much finer-grained than this "generations" crap.
Anyway, back to the not-me argument.
Generation labels, although widely adopted by the public, have no basis in social reality. In fact, in one of Pew’s own surveys, most people did not identify the correct generation for themselves, even when they were shown a list of options.
I called it "astrology" above for a reason.
Instead of asking people which group they feel an affinity for and why, purveyors of social “generations” just declare the categories and start making pronouncements about them. That’s not how social identity works.
The practice of naming “generations” based on birth year goes back at least to the supposed “lost-generation” of the late 19th century. But as the tradition devolved into a never-ending competition to be the first to propose the next name that sticks, it has produced steadily diminishing returns to social science and the public understanding.
It's still a thing that exists, so it's helpful to know its origins.
There is no research identifying the appropriate boundaries between generations, and there is no empirical basis for imposing the sweeping character traits that are believed to define them. Generation descriptors are either embarrassing stereotypes or caricatures with astrology-level vagueness. In one article you might read that millennials are “liberal lions,” “downwardly mobile,” “upbeat,” “pre-Copernican,” “unaffiliated, anti-hierarchical, [and] distrustful” -- even though they also “get along well with their parents, respect their elders and work well with colleagues.”
They can go sod right off with that "pre-Copernican" bullshit. I don't know exactly what they mean by that, but to me it signifies an abandonment of scientific principles and the belief that the Earth is the physical as well as metaphorical center of the Universe.
But what’s the harm? Aren’t these tags just a bit of fun for writers? A convenient hook for readers and a way of communicating generational change, which no one would deny is a real phenomenon? We in academic social science study and teach social change, but we don’t study and teach these categories because they simply aren’t real. And in social science, reality still matters.
I could dispute that last claim, but I'm wallowing in confirmation bias here and don't feel like it. Besides, I'm cynical and disaffected. I know this because I'm Gen-X.
Worse than irrelevant, such baseless categories drive people toward stereotyping and rash character judgment.
So we're trying to stop judging people by racial stereotypes and now we're judging them by some purely arbitrary chronology.
There are lots of good alternatives to today’s generations. We can simply describe people by the decade they were born. We can define cohorts specifically related to a particular issue, such as 2020 school kids. With the arrival of “Generation Z,” which Pew announced with fanfare, there has never been a better time to get off this train.
If only. No, this crap will continue because people love shortcuts.
When I was a kid, younger folks were blaming older folks for all of society's ills. Now I'm older, and younger folks are blaming older folks for all of society's ills. Specifically, the same older folks who once set out to change the world, and either failed miserably, or succeeded miserably.
An argument could be made by any younger cohort that everything's the older cohort's fault, because, after all, the older cohort has had the opportunity to do stuff or not, while the younger cohort has, for the most part, not. Meanwhile, the older cohort can always blame the younger cohort for not having the life experience necessary to understand why the older cohort chose the things they did.
As far as I can tell, this has been going on since at least the time of Socrates: “The children now love luxury; they have bad manners, contempt for authority; they show disrespect for elders and love chatter in place of exercise. Children are now tyrants, not the servants of their households. They no longer rise when elders enter the room. They contradict their parents, chatter before company, gobble up dainties at the table, cross their legs, and tyrannize their teachers.”
But more likely, since way before then.
I can acknowledge that there might be some value in identifying and naming generational cohorts. But the way it's done seems to me to be counterproductive to anything except finding new ways to divide us at a time when we need to be united.
As for me, I've always had trouble identifying with a group. And when they try to shove me in one, I rebel. Again, it's like with astrology: Us Aquarians don't believe in that stuff.
|Thanks for the MB, Andy~Stargate WDC is live !
Today we're going back to that trusted source for all things financial, Cracked.
Lots of people get money things wrong. Like when they insist they'll take home less money if they move to a higher tax bracket. Well, that might happen if tax laws change year to year, but no, otherwise that can't happen. Or when people conflate "income" with "wealth." Someone could make a million dollars a year and still have nothing left over; contrariwise, you could make 100K a year and save half of it.
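To see why the bracket myth can't happen (barring changes to the law itself), here's a toy progressive-bracket calculator. The thresholds and rates are invented for illustration; they are not any real tax table:

```python
# Toy progressive tax brackets: (lower threshold, marginal rate).
# These numbers are made up; they are NOT real tax law.
BRACKETS = [(0, 0.10), (10_000, 0.20), (50_000, 0.30)]

def tax(income: float) -> float:
    """Tax owed: each slice of income is taxed only at its own bracket's rate."""
    owed = 0.0
    for i, (lo, rate) in enumerate(BRACKETS):
        hi = BRACKETS[i + 1][0] if i + 1 < len(BRACKETS) else float("inf")
        if income > lo:
            owed += (min(income, hi) - lo) * rate
    return owed

def take_home(income: float) -> float:
    return income - tax(income)

# Crossing into the 30% bracket never reduces take-home pay:
print(take_home(50_000))  # 50000 - (10000*0.10 + 40000*0.20) = 41000.0
print(take_home(50_001))  # only the one extra dollar is taxed at 30%
```

The higher rate applies only to the dollars above the threshold, so take-home pay rises monotonically with income. The "raise pushed me into a higher bracket and now I make less" scenario requires brackets that don't work this way.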
But I think this article is more general than that.
Our lives are one long timeline of discovering everything we thought we knew about money is wrong. Early on, we're shocked to learn fairies don't really hand out money for teeth but people do voluntarily exchange money for vegetables.
Actually, I knew that last bit from an early age, as we sold our surplus vegetables.
5. Everything People Think About What Companies/Billionaires Are Worth
In general, any given thing is "worth" exactly what people are willing to trade for it, usually denominated in some currency.
And what about those richest people in the world, going from net worths of $100 billion to more like $200 billion in no time at all. Where's all that money coming from?
Built on the backs of labor. Rise up and- oh, wait, no.
They're about stocks, and stocks are a special kind of made-up value. Like, you might say that all kinds of value are made-up, but stocks are super made-up, and a lot of people don't get this.
The article goes on to explain the stock market in what I consider simple terms. Fair warning, though: there is math involved. In conclusion (for this first one, anyway):
But you don't have to get extra worked up over the terrifying spectre of unrealized capital gains. You're worrying about money that doesn't actually exist.
And that's the -- pun intended -- money quote. Capital gains aren't taxed until they're realized. We can argue about how much they should be taxed, but keeping one's money in the stock market isn't "evading taxes"; it's kicking the tax can down the road.
4. "You Can't Pay Employees $15 An Hour And Profit!"
"We won't make money if we pay employees more!" say companies. And the general populist response to this is "lol, fine." Higher wages are about letting employees live better, not about keeping profits high, right?
Again, the article goes into the math involved. But my own take on that nonsense: If you pay people more, they will have more income (duh) and they will be able to spend it on products and services, thus keeping the economy rolling.
3. "All Of Our Tax Money Goes Into Defense"
"We'd have enough money to pay for health care for everyone and wiping out debt and universal basic income and more if we only trim that ridiculous defense budget," people say. And the US does spend a lot on defense, more than the next several countries combined. But did you realize that the government spends more on Medicare than on the entire Department of Defense? Going by 2020 numbers, it's $917 billion on Medicare and $714 billion on defense.
And to be clear, we should spend money on defense. And we also should spend money on health and infrastructure. The issue is more complicated than I (or Cracked) can really grok.
So if you're wondering why Bruce Wayne wastes money on batarangs rather than solving crime by lifting all of Gotham out of poverty, maybe it's because his fortune can only go so far in a city of 10 million. Or, if you're wondering why Bill Gates doesn't just end world hunger tomorrow, well, we'd love to hear what your plan is for permanently feeding billions of people using only tens of billions of dollars. The government would probably like to hear that as well. They have tens of billions too.
2. "College Students Are So Narcissistic Today You Can't Hire Them"
The generation coming out of college right now harbors deep self-loathing. A constant need for approval doesn't signal a high opinion of oneself but rather the complete opposite. Many youngsters today don't even want to be alive. They're bitterly desperate to be hired and will even utterly debase themselves in the process.
Which is, like, the polar opposite of narcissism.
Every generation ends up complaining about "kids these days." We have short (and selective) memories.
1. The Myth Of "Self-Made" Fortunes
Few things get our eyes rolling harder than hearing a newly rich entrepreneur described as a "self-made millionaire" or "self-made billionaire." Because outside of the very rare exception (say, Oprah), these people didn't start from humble beginnings at all, did they?
That is, of course, a rhetorical question.
A self-made billionaire is someone who started their own business and grew their money to a billion, even if they had some money to begin with. If you start out with $1 million and grow it to a billion thanks to the company you start, you are a self-made billionaire.
It's a matter of definition. I figure if you generally grow your net worth, or at least have a plan when you don't, you're in good financial shape. I don't worry much about "self-made." The only time it matters to me is when someone claims to be "self-made" when they, as the saying goes, scored a home run while starting on third base. It sends the wrong message, among other problems.
On the other hand, if you put a bunch of cash into Bitcoin or other Dunning-Krugerrands a while back, you may have a few million bucks on paper, but the above comments about unrealized capital gains apply.
And then there's the author:
For what it's worth, if I inherited $500 million, I would never manage to grow it to $1 billion or more. I would immediately spend the bulk of it creating Robot Island, and I'd deliberately run the place using a nutty business plan that loses money, as I build ever more elaborate singing animatronics. What do I care about becoming a self-made billionaire? I'm rich.
And you certainly wouldn't be writing for Cracked.
Anyway, the article's a good read, with the usual humorous take on things, but still largely factual.