Rated: 18+ · Book · Personal · #1196512
Not for the faint of art.
Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
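The classic example is the Mandelbrot set, which does nothing more than iterate z → z² + c and ask whether the result stays bounded. Here's a minimal Python sketch of the idea (a quick illustration, not any kind of formal definition):

```python
# Minimal sketch of the Mandelbrot set: a very simple transformation on
# the complex plane that produces a famously intricate fractal. Python's
# built-in complex type handles all the arithmetic.

def in_mandelbrot(c: complex, max_iter: int = 100) -> bool:
    """Return True if the orbit of 0 under z -> z*z + c stays bounded."""
    z = 0j
    for _ in range(max_iter):
        z = z * z + c       # the entire "very simple transformation"
        if abs(z) > 2:      # once |z| > 2, the orbit is guaranteed to escape
            return False
    return True

# Crude ASCII view of the complex plane: real part runs left to right,
# imaginary part top to bottom.
for im in range(12, -13, -2):
    print("".join(
        "#" if in_mandelbrot(complex(re / 20, im / 10)) else "."
        for re in range(-40, 21)
    ))
```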




August 11, 2020 at 12:20am
#990492
This is probably the last bit about panpsychism I'll be linking. For now. Maybe.

But let's turn this into a Merit Badge Mini-Contest. See below.

https://www.scientificamerican.com/article/does-consciousness-pervade-the-univer...

Does Consciousness Pervade the Universe?
Philosopher Philip Goff answers questions about “panpsychism”


I have to wonder about a journal called "Scientific" anything posting stuff about something that's intrinsically unscientific. But okay -- it's a hypothesis, of sorts, and might turn out to be testable at some point.

Also, I should note that I don't buy it. I've gone into some reasons before, and I'll go into more reasons below, but people keep talking about it so I figure it's a good thing to know what they're talking about. If, indeed, they know what they're talking about.

One of science’s most challenging problems is a question that can be stated easily: Where does consciousness come from?

Look, I'm well aware that this stuff is way above my pay grade, but that's not going to stop me from commenting on it. Just know that everyone involved, including me (especially me), might be way off about all of this.

What if consciousness is not something special that the brain does but is instead a quality inherent to all matter?

What if a unicorn took a shit in my back yard?

It is a theory known as “panpsychism,” and Goff guides readers through the history of the idea, answers common objections (such as “That’s just crazy!”) and explains why he believes panpsychism represents the best path forward.

It's not a "theory." That much I know. Not in the scientific sense. Oh, sure, maybe in the colloquial sense of the word, as in "conspiracy theory." But in science, a theory is a system -- it's testable, falsifiable, and makes predictions.

Honestly, SA should know better.

Here's the important part so I can get it out of the way:

Can you explain, in simple terms, what you mean by panpsychism?

In our standard view of things, consciousness exists only in the brains of highly evolved organisms, and hence consciousness exists only in a tiny part of the universe and only in very recent history. According to panpsychism, in contrast, consciousness pervades the universe and is a fundamental feature of it.


There's a lot more in the actual article, but that's the elevator pitch.

But it’s at least coherent to suppose that this continuum of consciousness fading while never quite turning off carries on into inorganic matter, with fundamental particles having almost unimaginably simple forms of experience to reflect their incredibly simple nature. That’s what panpsychists believe.

Yeah... I'm not at all on board with the idea that it's "coherent."

There is a deep mystery in understanding how what we know about ourselves from the inside fits together with what science tells us about matter from the outside.

No argument there. But I'm not sure that "everything has some form of consciousness" as a solution is anything more than hand-waving or wishful thinking.

But in my new book, I argue that the problem of consciousness results from the way we designed science at the start of the scientific revolution.

Yeah, yeah, I know. Guy's plugging his book. I've said this before and I'll say it again: I'm not going to slap anyone for trying to shill their book. This is a writing site, after all. Lots of us want to sell our books, at least in theory (in the colloquial sense of the word).

The article goes on to explain how consciousness is necessarily outside the domain of science, because it's inherently qualitative and not quantitative. So I ask myself again: what does this philosophical discussion have to do with actual science?

The starting point of the panpsychist is that physical science doesn’t actually tell us what matter is.

That's fair enough, depending on what the actual definition of "is" is, to paraphrase a certain former president. Current understanding, as far as I'm aware, is that matter is a complex wave function that describes the probable positions of what we know as "subatomic particles," but that doesn't really explain much. But so goes the history of scientific discovery: we find that matter is made of atoms, and that the atoms are made of protons, neutrons and electrons, and the first two of those are made of quarks, and... well, that's as far as we get with matter before you have to start thinking of it as energy. Point is, each of these "discoveries" just kicks the can of "what 'is' it" down the road.

Physics tells us absolutely nothing about what philosophers like to call the intrinsic nature of matter: what matter is, in and of itself.

I'm... not sure that's entirely accurate, but it may be a question of semantics.

But many of our best scientific theories are wildly counter to common sense, too—for example, Albert Einstein’s theory that time slows down when you travel very fast or Charles Darwin’s theory that our ancestors were apes.

Okay, I'm going to nip that shit right in the bud, right now. Those discoveries were made by observation and mathematics, and confirmed by observation and mathematics. They're systematic descriptions of processes; in other words, they do deserve the word "theory" in the scientific sense. They make declarations, yes, but they also make predictions that can, and have been, tested. Here's the important part, though: yeah, relativity and evolution both went against the grain of what was accepted at the time, but just because something goes against the grain doesn't mean it's true. This "but other theories that seemed outrageous turned out to be true, so my pet insight must be true too" is utter hogwash.

We know that consciousness exists not from observation and experiment but by being conscious.

I'll grant that much, anyway. "I am conscious" (a variant of Descartes' famous proclamation) might very well be the only true thing one can say. The difficulty is that it's not clear if it's true when someone else says it. We each, individually, only know it for ourselves. This observation can easily lead one down the rabbit hole of solipsism, though.

Bottom line, for me: this looks an awful lot like animism, the spiritual belief that all things -- that rock, this tree, the stream over there, whatever -- have a spirit. That's nothing new; as I understand it, there's evidence that this was a common belief in prehistoric societies. So in that sense, panpsychism packages what may be the oldest human belief in modern terms and couches it in scientific language.

But it is not, in itself, science. As philosophy goes, though, I suppose it's just as interesting as any other idea.

Here's my take, for whatever it's worth: Early life would have had to have some recognition of its environment, perhaps to find food or to avoid becoming food. It wouldn't "think" in those terms, of course, or at all, but, at base, it would need some way of distinguishing food from not-food, or it wouldn't survive very long and be able to pass along its genetic information. As evolution progressed, this sense broadened and branched out, eventually becoming sight, smell, taste, touch, etc., and along the way it would need a way to coordinate all of the sensory information -- and this is what we know as consciousness.

As this article points out, consciousness exists on some sort of continuum, or spectrum; even plants have some version of it -- trees, for example, sense when winter's around the corner and take appropriate action. But the thing is, non-life has no need for that. A rock (probably) just is. It doesn't act or react; it's simply subject to the forces around it such as wind or water. So, no, consciousness isn't needed to explain what happens to rocks or even water, but it is needed for that which we define as "alive."

But again, that's just my purely amateur observation. I could be wrong, and I'm willing to entertain alternatives (provided, of course, they fit observations). Therefore, one simple question for today's contest:

*StarB* *StarB* *StarB*


Merit Badge Mini-Contest


What is consciousness?


You can be scientific, funny, philosophical, whatever. I'm just interested in seeing what people come up with. The answer I like best gets its author a Merit Badge (I'll pick one I think is appropriate) tomorrow. Deadline is midnight tonight, WDC time.
August 10, 2020 at 12:03am
#990401
Today's article is about an article.

https://www.bbc.com/culture/article/20200109-is-this-the-most-powerful-word-in-t...

Is this the most powerful word in the English language?

Personally I would argue that "fuck" is the most powerful word in the English language. I mean, consider how people react to it, and its versatility. Here on WDC, it's the dividing line between 13+ and 18+ ratings. Well... along with certain other words, but I've already used up my sexually-derived word quota now. So on with the article.

‘The’. It’s omnipresent; we can’t imagine English without it. But it’s not much to look at. It isn’t descriptive, evocative or inspiring. Technically, it’s meaningless. And yet this bland and innocuous-seeming word could be one of the most potent in the English language.

Long ago, I looked up the definition of the word "the." Since I've been drinking and it was long ago, I just looked it up again. The dictionary entry is entirely too long to quote here, so here it is.  

Words are split into two categories: expressions with a semantic meaning and functional words like ‘the’, ‘to’, ‘for’, with a job to do. ‘The’ can function in multiple ways. This is typical, explains Gary Thoms, assistant professor in linguistics at New York University: “a super high-usage word will often develop a real flexibility”, with different subtle uses that make it hard to define.

One thing I've noticed in my studies of French, as well as in my very surface skimming of linguistics in general, is that different languages use articles differently. The link above from BBC goes into this to some extent, but, for example, if you ever want to fake a really bad Russian accent, just eliminate articles entirely. "Is glorious day in America, comrade." I mean, come on. Just try to say that without sounding like a cartoon Russian.

Helping us understand what is being referred to, ‘the’ makes sense of nouns as a subject or an object. So even someone with a rudimentary grasp of English can tell the difference between ‘I ate an apple’ and ‘I ate the apple’.

This author might be surprised at what even native English speakers are entirely ignorant of, at least over here on this side of the pond.

There are many exceptions regarding the use of the definite article, for example in relation to proper nouns. We wouldn’t expect someone to say ‘the Jonathan’ but it’s not incorrect to say ‘you’re not the Jonathan I thought you were’.

I have a vague memory of the first time I saw or heard "the" used before a proper name. I don't really remember what the name was, but the context was something like: "So, you're the James T. Kirk." It sounded weird then, and it sounds almost as weird now.

This simplest of words can be used for dramatic effect. At the start of Hamlet, a guard’s utterance of ‘Long live the King’ is soon followed by the apparition of the ghost: ‘Looks it not like the King?’

Of course my mind immediately went to a Monty Python bit from Holy Grail: "Hey look, it's the King." "How do you know it's the King?" "He hasn't got shit all over him."

‘The’ can even have philosophical implications. The Austrian philosopher Alexius Meinong said a denoting phrase like ‘the round square’ introduced that object; there was now such a thing. According to Meinong, the word itself created non-existent objects, arguing that there are objects that exist and ones that don’t – but they are all created by language. “‘The’ has a kind of magical property in philosophy,” says Barry C Smith, director of the Institute of Philosophy, University of London.

Don't get me started on the philosophy of linguistic paradox and the nature of reality. Oh, hell, I have plenty of blog entries about that, in the past and probably in the future as well. But I'll say this here: "a round square" brings the paradoxical object into being, at least Platonically, just as thoroughly.

The British philosopher Bertrand Russell wrote a paper in 1905 called On Denoting, all about the definite article. Russell put forward a theory of definite descriptions. He thought it intolerable that phrases like ‘the man in the Moon’ were used as though they actually existed.

All due respect to Russell, he of teapot fame, you can also speak of Santa Claus, sans definite article, as if he actually existed. And there is a huge difference between "god" and "the god."

Scandinavian languages such as Danish or Norwegian and some Semitic languages like Hebrew or Arabic use an affix (or a short addition to the end of a word) to determine whether the speaker is referring to a particular object or using a more general term. Latvian or Indonesian deploy a demonstrative – words like ‘this’ and ‘that’ – to do the job of ‘the’. There’s another group of languages that don’t use any of those resources, such as Urdu or Japanese.

My vague, distant memories of learning Latin tell me that Latin didn't have a definite article, either. I could be wrong about that, though. But it does make me wonder why French, a Romance language, uses one (actually three: le, la, les) in ways we'd never dream of in English -- and then omits it in ways we'd never dream of in English.

French is a mess. From an English-speaking perspective, of course. Objectively, it makes so much more sense.

Conversely, Smith describes Russian friends who are so unsure when to use ‘the’ that they sometimes leave a little pause: ‘I went into... bank. I picked up... pen.’

We go to party, drink vodka, yes?

In some ways, this cartoon Russian accent makes Russians sound uneducated, kind of like a Southern accent in the US. Nothing could be further from the truth. They just don't have the dependence on articles that English does.

Even within the language, there are subtle differences in how ‘the’ is used in British and American English, such as when talking about playing a musical instrument. An American might be more likely to say ‘I play guitar’ whereas a British person might opt for ‘I play the guitar’.

One of the things that messed me up in English when I was a kid was that I'd devour British books right alongside US ones, so I was always confused about which side of the pond some word or phrase, or spelling, comes from. Didn't help that some of my favourite bands were British. Dammit, there I go... "favorite."

They invented the language. We perfected it. Still... there are many British expressions that we might do well to adopt.

According to Culpeper, men say ‘the’ significantly more frequently. Deborah Tannen, an American linguist, has a hypothesis that men deal more in report and women more in rapport – this could explain why men use ‘the’ more often.

Oh, shit, here we go. Okay, no, I'mma stop this right here and move on to the next subject.

The letter y in terms like ‘ye olde tea shop’ is from the old rune Thorn, part of a writing system used across northern Europe for centuries. It’s only relatively recently, with the introduction of the Roman alphabet, that ‘th’ has come into being.

I have other vague memories of doing a whole newsletter on thorn (þ) a while back. Fantasy newsletter, maybe? Of course, its use wasn't just in the word "the," but it was, like, the Nordic version of the Greek theta. In Iceland, if I recall correctly, it's actually still a letter in their alphabet, whereas in English it's been completely dropped from the alphabet and its sound replaced by "th."

What intrigues me most, though, about "the" is this: you can pronounce it "thee" or "thuh," depending on what comes next, but the rules for that are not nearly as clear-cut as the ones surrounding the indefinite article (where it's "a" before a consonant sound and "an" before a vowel sound).
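The indefinite article, by contrast, is nearly mechanical -- close enough that a few lines of code get most words right. Here's a naive sketch (even this leaks, since the real rule follows the sound, not the spelling):

```python
def indefinite_article(word: str) -> str:
    """Naive spelling-based sketch of the a/an rule.

    The actual rule depends on the sound that follows, not the letter,
    so words like "hour" and "university" defeat this version.
    """
    return "an" if word[:1].lower() in "aeiou" else "a"

print(indefinite_article("apple"), "apple")    # an apple
print(indefinite_article("guitar"), "guitar")  # a guitar
print(indefinite_article("hour"), "hour")      # a hour -- wrong; we say "an hour"
```

No such tidy function exists for "thee" versus "thuh."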

Never did fully understand that, but then, if you devote too much time to thinking about "the," you might just wrap your head into a pretzel. And that would be a thorny situation, indeed.

Oh, and just because it would be wrong of me to leave this blog entry here without some explanation for the title:

August 9, 2020 at 12:07am
#990304
Today I get to talk about some of my favorite things.

https://www.smithsonianmag.com/science-nature/how-cheese-wheat-and-alcohol-shape...

How Cheese, Wheat and Alcohol Shaped Human Evolution
Over time, diet causes dramatic changes to our anatomy, immune systems and maybe skin color


You aren’t what you eat, exactly. But over many generations, what we eat does shape our evolutionary path. “Diet,” says anthropologist John Hawks, of the University of Wisconsin-Madison, “has been a fundamental story throughout our evolutionary history. Over the last million years there have been changes in human anatomy, teeth and the skull, that we think are probably related to changes in diet.”

Well, that doesn't mean I'm a pizza, no, but at least on a fundamental level, we know that we're made up of atoms taken from the substances we breathe and ingest.

I do have some issues with the way this article so cavalierly uses the word "evolution." I get the feeling an evolutionary biologist would have a cogent argument against it. I'm not one, so I'm just going to go with the flow here, for now.

When mammals are young, they produce an enzyme called lactase to help digest the sugary lactose found in their mothers’ milk. But once most mammals come of age, milk disappears from the menu. That means enzymes to digest it are no longer needed, so adult mammals typically stop producing them.

Thanks to recent evolution, however, some humans defy this trend.


Another interpretation: some of us retain childlike features. As should be apparent if you've been reading my stuff.

Hawks explains why being able to digest milk would have been such a boon in the past...

Speculation without supporting evidence. I mean, if you look at the article, the explanation makes sense, but I shy away from such just-so stories on principle. The facts are what they are; the explanation for the facts is suspect.

The article then goes on to talk about grains.

Since wheat and rye became a staple of human diets, however, we've had a relatively high frequency of celiac disease. “You look at this and say how did it happen?” asks Hawks. “That's something that natural selection shouldn't have done.”

Translation: "I don't understand how natural selection actually works."

Yet despite the obvious drawbacks of celiac disease, ongoing evolution doesn't seem to be making it less frequent. The genetic variants behind celiac disease seem to be just as common now as they've been since humans began eating wheat.

That's because evolution doesn't care if an individual gets sick or has dietary limitations. Evolution only cares if someone lives long enough to reproduce. Well, I'm speaking metaphorically. Evolution neither cares nor doesn't care. It just is. Point is, unless there's something that kills people at an early age, the genes for it happily pass on. Okay, that's another metaphor. Genes don't feel happiness.
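If you want to see that logic in action, here's a toy model (a sketch with made-up numbers, not real population genetics): give an allele one knob -- how often its carriers survive to reproduce -- and watch what selection can and can't see.

```python
def simulate(generations: int, survival_to_reproduce: float,
             start_freq: float = 0.2) -> float:
    """Toy haploid model of one allele with a single fitness knob.

    survival_to_reproduce is the chance a carrier lives long enough to
    breed. Deaths *after* reproduction never enter the model at all,
    which is exactly the point: they're invisible to selection.
    """
    freq = start_freq
    for _ in range(generations):
        surviving = freq * survival_to_reproduce
        freq = surviving / (surviving + (1 - freq))
    return freq

# An allele that kills at 40, after breeding, is neutral: frequency holds.
print(round(simulate(50, survival_to_reproduce=1.0), 3))  # 0.2
# One that kills half its carriers in childhood is all but gone in 50 generations.
print(round(simulate(50, survival_to_reproduce=0.5), 5))  # 0.0
```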

Then the author gets into the discussion of how diet could influence skin color, over time and in populations, which is kind of a minefield these days. Keep in mind it's not promoting any shade of skin as better or worse, in terms of evolution, than another, just describing what could have influenced skin tone changes in humans.

But then you get sentences like:

Mostafavi's genetic research also revealed that some variants that actually shorten human life, like one that prompts smokers to increase their consumption above smoking norms, are still being actively selected against.

Again: evolution is concerned with reproduction. Poor health outcomes from things like smoking or alcoholism don't tend to show up until later in life, after a human has had plenty of opportunity to reproduce (whether such habits influence the portion of the process of evolution called "sexual selection," I leave as an exercise for the reader). Point is, you can have some disease that kills you at, say, 30 or 40, but it doesn't stop you from having reproduced by then, so the genes that affect the expression of said disease aren't selected against.

So if there is indeed research on that subject, I hope it's peer-reviewed, because as it's presented here it certainly doesn't inspire any confidence from me.

Overall, I expected better from Smithsonian Magazine.
August 8, 2020 at 12:01am
#990216
Sometimes, in here, I talk about deep philosophical shit.

https://www.borninspace.com/musician-performs-pink-floyds-great-gig-in-the-sky-o...

Musician Performs Pink Floyd’s ‘Great Gig In The Sky’ On Theremin


But sometimes it's just some random musician playing cool instruments.

For reference, here is the original, from one of the greatest albums of all time.

A theremin is intrinsically fascinating. It gives music an ethereal, futuristic sound and has been used in soundtracks to convey a sense of weirdness or otherworldliness.

But the theremin was invented in late 1920, meaning that in two months, the instrument will be 100 years old. It came along just after the end of WWI and the Russian Revolution.

If you're not familiar with the instrument, here's the Wikipedia page.  

One of the coolest things I ever saw was a Japanese orchestra playing the music of an Austrian composer on a Russian instrument, partly in an American musical style. Here's that.

To me, that just demonstrates the universality of music.

And if you don't want to click on the above link to borninspace.com, I'm embedding the theremin Great Gig in the Sky video here (he also uses other specialty instruments):


August 7, 2020 at 12:02am
#990157
The way language evolves and changes over time is pretty cool.

http://nautil.us/blog/the-english-word-that-hasnt-changed-in-sound-or-meaning-in...

The English Word That Hasn’t Changed in Sound or Meaning in 8,000 Years


So linguistics is something I'm interested in, and I wish I knew more about it, which is why I read articles like this one -- though this one is more about what doesn't change, which is equally fascinating.

“One of my favorite words is lox,” says Gregory Guy, a professor of linguistics at New York University. There is hardly a more quintessential New York food than a lox bagel—a century-old popular appetizing store, Russ & Daughters, calls it “The Classic.” But Guy, who has lived in the city for the past 17 years, is passionate about lox for a different reason. “The pronunciation in the Proto-Indo-European was probably ‘lox,’ and that’s exactly how it is pronounced in modern English,” he says. “Then, it meant salmon, and now it specifically means ‘smoked salmon.’ It’s really cool that that word hasn’t changed its pronunciation at all in 8,000 years and still refers to a particular fish.”

Well, now I'm all hungry. I haven't had a lox bagel in months. There's a place near me that makes a good one but... well, you know.

So, I've known about Proto-Indo-European, or PIE, for some time. The idea is that a large number of languages originated from one source, a language that is so lost in the mists of time that we don't even know what it was called, or if it had a name at all. But I like pie almost as much as I like lox, so that will do.

In modern English, well over half of all words are borrowed from other languages.

I'd argue that all of our words are borrowed from other languages, but, well, I'm not a linguist.

Analyzing the patterns of change that words undergo, moving from one language to another, showed how to unwind these changes and identify the possible originals.

I can't help but notice the similarities to genetics, where biologists can trace common ancestry through genetic drift. The really obvious difference there is that horizontal gene transfer is rare in organisms with mitochondria in their cells, whereas horizontal meme transfer happens in language all the time. (I'm using "meme" in the original sense proposed by Dawkins.)

The word lox was one of the clues that eventually led linguists to discover who the Proto-Indo-Europeans were, and where they lived.

This is where it starts to get really cool, because it's never been clear, at least to me, where the people who spoke PIE lived, or how their language, in the form of its descendants, came to dominate half the world.

In reconstructed Indo-European, there were words for bear, honey, oak tree, and snow, and, which is also important, no words for palm tree, elephant, lion, or zebra. Based on evidence like that, linguists reconstructed what their homeland was. The only possible geographic location turned out to be in a narrow band between Eastern Europe and the Black Sea where animals, trees, and insects matched the ancient Indo-European words.

No reason to name something that you don't have in your area. But that alone is a hypothesis that requires further investigation. So they did further investigation.

In the 1950s, archaeological discoveries backed up this theory with remnants of an ancient culture that existed in that region from 6,000 to 8,000 years ago. Those people used to build kurgans, burial mountains, that archaeologists excavated to study cultural remains. In that process, scholars not only learned more about the Proto-Indo-Europeans but also why they were able to migrate across Europe and Asia.

And to summarize the rest of it: horses and the wheel made this culture extremely mobile, so they were able to spread themselves, or at least their language, far and wide.

Which still doesn't explain why different languages sometimes have completely different words for the same thing, like "cheval" in French and "horse" in English. But I suppose they have theories for that, too.

And yet the word for "cat" is so similar across so many different languages that I have to wonder if they brought their feline companions with them as they traveled and/or conquered.

Which would also be pretty cool. After all, someone has to eat all the leftover salmon.
August 6, 2020 at 12:03am
#990052
We should all know by now that when a headline asks a question, the answer is almost certainly "no."

https://blogs.scientificamerican.com/observations/could-multiple-personality-dis...

Could Multiple Personality Disorder Explain Life, the Universe and Everything?
A new paper argues the condition now known as “dissociative identity disorder” might help us understand the fundamental nature of reality


I will, however, give them credit for referencing Douglas Adams.

In 2015, doctors in Germany reported the extraordinary case of a woman who suffered from what has traditionally been called “multiple personality disorder” and today is known as “dissociative identity disorder” (DID). The woman exhibited a variety of dissociated personalities (“alters”), some of which claimed to be blind. Using EEGs, the doctors were able to ascertain that the brain activity normally associated with sight wasn’t present while a blind alter was in control of the woman’s body, even though her eyes were open. Remarkably, when a sighted alter assumed control, the usual brain activity returned.

That's actually pretty cool. I don't know much about psychology and even less about psychiatry, but it's still cool.

The article goes into some detail about DID, before getting into the question in the headline.

Now, a newly published paper by one of us posits that dissociation can offer a solution to a critical problem in our current understanding of the nature of reality. This requires some background, so bear with us.

Background? All the paragraphs leading up to this quote (most of which I didn't copy, but they're at the link) were background. But okay -- like I said, I know very little so that was a decent introduction.

According to the mainstream metaphysical view of physicalism, reality is fundamentally constituted by physical stuff outside and independent of mind. Mental states, in turn, should be explainable in terms of the parameters of physical processes in the brain.

Any essay that uses "metaphysical" unironically is suspect.

But there's a brief introduction there to the "hard problem of consciousness," which I'm pretty sure I've mentioned in here before, in connection with panpsychism -- a philosophy I provisionally reject. "Provisionally," because there's no real science behind it, only conjecture. Could be true. Could be false. Can't be tested at this time.

To circumvent this problem, some philosophers have proposed an alternative: that experience is inherent to every fundamental physical entity in nature.

Yep. Panpsychism. But okay, I'll bite.

However, constitutive panpsychism has a critical problem of its own: there is arguably no coherent, non-magical way in which lower-level subjective points of view—such as those of subatomic particles or neurons in the brain, if they have these points of view—could combine to form higher-level subjective points of view, such as yours and ours. This is called the combination problem and it appears just as insoluble as the hard problem of consciousness.

From my purely amateur perspective, it sounds more like the same problem, only stated differently.

The obvious way around the combination problem is to posit that, although consciousness is indeed fundamental in nature, it isn’t fragmented like matter. The idea is to extend consciousness to the entire fabric of spacetime, as opposed to limiting it to the boundaries of individual subatomic particles.

Ever seen a map of the universe? They've made some. Here's one. I've noted a superficial similarity to a neural network before. But physical similarity isn't support for this hypothesis; it could be coincidence.

And here is where dissociation comes in.

This view—called “cosmopsychism” in modern philosophy, although our preferred formulation of it boils down to what has classically been called “idealism”—is that there is only one, universal, consciousness.

Pretty sure philosophers have posited something similar since time immemorial. That's also the basis for at least one conception of the Divine. But just because a philosopher (or a scientist for that matter) says something, doesn't mean it's fact.

You don’t need to be a philosopher to realize the obvious problem with this idea: people have private, separate fields of experience.

Basically, if there's only one Universal Consciousness, how come we all appear to live separate existences?

So, for idealism to be tenable, one must explain—at least in principle—how one universal consciousness gives rise to multiple, private but concurrently conscious centers of cognition, each with a distinct personality and sense of identity.

That's not the only thing that "one must explain" to accept this, at least in theory. There's also the small matter of communication across vast distances, communication that can't be attributed to "quantum entanglement." Consciousness requires communication, and the speed of light is a hard limit; this has been supported by evidence as well as mathematics.

And here is where dissociation comes in. We know empirically from DID that consciousness can give rise to many operationally distinct centers of concurrent experience, each with its own personality and sense of identity. Therefore, if something analogous to DID happens at a universal level, the one universal consciousness could, as a result, give rise to many alters with private inner lives like yours and ours. As such, we may all be alters—dissociated personalities—of universal consciousness.

Okay, I have to admit here that although I categorically dismiss the idea, it is attractive to me. The idea of universal oneness, of breaking down barriers; that all is, indeed, One -- well, I think most people have to drop acid to get into that mindset, and I've never dropped acid. But I think a lot of problems would be ameliorated if we stopped thinking of things as separate entities but as parts of a whole.

That doesn't mean I'm going to embrace this stuff, of course. Just that, if it turns out to be the case, it would have beneficial implications. For instance, that there is no "us" and "them," only "us."

Moreover, as we’ve seen earlier, there is something dissociative processes look like in the brain of a patient with DID. So, if some form of universal-level DID happens, the alters of universal consciousness must also have an extrinsic appearance. We posit that this appearance is life itself: metabolizing organisms are simply what universal-level dissociative processes look like.

Whether factual or not, the other thing that appeals to me is its syncretic nature -- to take ideas from several different branches of science and philosophy and put them into a coherent whole. Wait -- that's just another aspect of universal oneness, isn't it? We make distinctions between, say, cosmology, psychology, biology, chemistry, physics... when in reality the boundaries are far more blurry, if they exist at all.

Insofar as dissociation offers a path to explaining how, under idealism, one universal consciousness can become many individual minds, we may now have at our disposal an unprecedentedly coherent and empirically grounded way of making sense of life, the universe and everything.

Insert "42" joke here. You know you want to.

Just remember: the key word in the last sentence I quoted is "may." I find it highly unlikely, personally. Just to be clear, though: I do believe that everything is One Thing; just not necessarily in the manner described here. Like I said, this stuff is appealing -- which is all the more reason to be skeptical about it.
August 5, 2020 at 12:11am
#989926
I was an "only child," so this article caught my interest. So I'm sharing it. See? We can share.

https://qz.com/980226/neuroscience-shows-that-only-children-are-more-creative-mo...

Neuroscience shows that our gut instincts about only children are right


That headline can bite me. That's not the thrust of the article at all, as we shall see.

Conventional wisdom has it that only children are smarter and less sociable.

"Conventional wisdom" is almost always wrong. But flattery will get you everywhere with me.

Conversely, since those only children never have to share a toy, a bedroom, or a parent’s attention, it is assumed they miss out on that critical life skill of forever-having-to-get-along.

"Critical," my solitary ass.

But are their actual brains different?

OH COME ON.

Jiang Qiu, a professor of psychology at Southwest University in Chongqing, China and director of the Key Laboratory of Cognition and Personality in the ministry of education, led a team of Chinese researchers that sought to answer this question with more than 250 college-aged Chinese students.

FWEEEEET! Flag on the play. Minuscule sample size, representative of just one culture. 10 yard penalty, fourth down.

Hey, I just made a sportsball joke.

They used standard tests of intelligence, creativity, and personality type to measure their creativity, IQ, and agreeableness.

Fuck agreeableness.

On the behavioral tests, only children displayed no differences in terms of IQ, but higher levels of flexibility—one measure of creativity—and lower levels of agreeableness than kids with siblings.

You're also going to have to explain to me how to actually quantify creativity. I mean, I know there are issues with IQ tests, but at least some measures of intelligence can be quantified within a particular cultural and socioeconomic context.

Having worked on the 1896 study “Of Peculiar and Exceptional Children,” Hall cast only children as “oddballs” and “permanent misfits,” descriptions that have stuck over the years with remarkable persistence. “Being an only child is a disease in itself,” he claimed.

As developed countries transitioned from agrarian to industrial economies, family sizes tended to get smaller. I mean, I can't be arsed to find the data on that or anything, but I'm pretty sure it's the case. So, naturally, in the 19th century, growing up with a bunch of siblings would be considered "normal" and not doing so would be considered "abnormal." I assert that had the opposite been the case, it would be having siblings that would be considered "a disease in itself."

There is ample evidence suggesting the stereotypes of the “lonely only” are wrong.

As I'm pretty sure I've mentioned in here before, being sibling-free taught me a lot about self-reliance. I wouldn't want it any other way.

They found that only children, along with firstborns and people who have only one sibling, score higher IQ marks and achieve more, but aren’t markedly different personality-wise.

Which right there contradicts the Chinese study, throwing both into question.

Jiang and his co-authors hypothesized a few reasons for their findings.

You can hypothesize all day. I mean, I do. Until you do the science, it remains opinion and educated musings.

Creativity—defined as having original ideas that have value—is strongly influenced by everything from family structure and parental views, to interactions and expectations.

Okay, so you can define creativity, but again -- how do you measure it? If it can't be quantified, it's not science. And "original ideas that have value?" We share a world with seven billion other humans. I've had original ideas that, it turned out, someone else had already thought of.

The article goes on to talk about some measures of creativity: flexibility, originality, and fluency. Those sound more like general intelligence to me, but what the hell do I know?

Look, I'm not trying to argue from one datum (me), and if I hadn't been drinking wine I might seek out other sources to delve deeper into this, but so far it's not passing the bullshit test for me. Then again, I'm just not very creative, so maybe I'm just inflexible enough to provisionally reject this.

He also noted that just like IQ tests, creativity tests are not perfect measures of the thing they are measuring.

Duh.

Creativity involves spontaneity and intrinsic motivation—things which are a bit hard to assess on a test.

Pretty much by definition, you can't make a standardized test that measures creativity in a meaningful way. I mean, a truly creative person could find a way to bypass the strictures of the test, making it invalid. Right? Think about how James T. Kirk beat the Kobayashi Maru test. In either universe.

To summarize, this article panders to confirmation bias.

So back to my wine, and rewatching old episodes of Star Trek because it's been years since I actually watched them. Damn, some things were cringeworthy in the 1960s. But first...

*StarB* *StarB* *StarB*


Mini-contest results!


As is appropriate for a question about decisions, this was a tough decision for me. I appreciated all of the comments. From them I gathered that we all have different ways of making decisions, as I expected.

What clinched it for me, though, was the line: "I chart a course I want to take and weigh my decisions based on whether or not they move me along the course." I like that this implies a focus on making decisions based on achieving one's overall goals, and not just the pros and cons of an individual decision -- something that neither I nor the article's author seems to have considered, but in retrospect it seems important.

So this time, the Merit Badge goes to Elisa the Bunny Stik -- but again, I liked all the comments and everyone will get another chance soon.

I should also note that no, I'm not taking sides on which decision-making process is "best." Whatever works for you, works. But I hope that seeing how other people approach it helps us all with our own decision-making. As always, thanks for reading and commenting!
August 4, 2020 at 12:02am
#989830
Today, I'm going to actually talk about religion.

Well, sort of. Not really. Well, you'll see.

As a bonus, though, it's time for another Merit Badge Mini-Contest! Details below.

https://theconversation.com/what-a-16th-century-mystic-can-teach-us-about-making...

What a 16th-century mystic can teach us about making good decisions


Decision-making is a complex process.

Well, it can be. When I pick an article to highlight in the blog, I use a random number generator. So it's not always. But I'm sure they're talking about big decisions here, like where to go to college, whether or not to break up with your boyfriend, what kind of house to buy, or where to go get beer tonight.
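(For the curious, that shortcut looks something like this; the reading list is invented for illustration.)

```python
import random

# Hypothetical stand-in for my saved reading list -- the real one is longer.
saved_links = [
    "https://www.scientificamerican.com/...",
    "https://www.smithsonianmag.com/...",
    "https://theconversation.com/...",
]

print(random.choice(saved_links))  # today's blog fodder, no agonizing required
```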

As individuals, working through our daily lives, we often take a number of shortcuts that may not always serve us well.

To be fair, if one is faced with a slew of decisions while going about one's daily business, it might be good to take a few shortcuts. The process described here can get fairly long and involved.

Among the many decision-making methods for life’s big decisions, one that stands out is from an early 16th-century soldier-turned-mystic, St. Ignatius of Loyola.

Anyone who's been following along knows that I'm not a big fan of religion. And yet, I have a lot of respect for the Jesuits, which this guy helped to create, with their focus on science and education.

Ignatius uses the language of faith, but, I believe, anyone can apply his method to make more informed decisions.

I'm also not going to reject something just because it comes from a religious person. That would be silly and self-destructive.

The article goes into a brief bio of Ignatius before getting into the "process."

1. Rely on reason and feelings

Ignatius advises creating a list, but also takes it a step further by urging people to listen to their feelings as they consider the pros and cons for each option.


What I think a lot of people miss, probably because of Spock from Star Trek, is that "logic" (or reason) doesn't stand alone. That is, one must consider emotions as part of the logic of human existence, because we have emotions. And so does everyone else, which should be taken into account when making decisions. Those who don't are known as sociopaths.

Ignatius teaches that freedom from attachment to a particular choice or outcome is essential.

That's kind of Buddhist, isn't it?

Ignatius also advises that individuals share their deliberations with a confidant, advice that he followed when making his own decisions. Modern psychological science too has found that the process of sharing emotions with others helps make sense of our thoughts and feelings.

I mean, I do that. Doesn't everyone do that? Well, probably not "everyone," but I think it's rather common.

He also urged people to make decisions for the “greater glory of God.” How can non-religious people use this advice? I argue they can consider how their decisions will affect the vulnerable, the poorest and the most marginalized.

This is probably good advice for even the most mundane decisions. For instance, buying cheap shoes from Wal-Mart or more expensive ones from elsewhere? The former might have been made in a sweatshop by people who get like $1 a week for their labor or whatever. On the other hand, I've argued before that if we all boycotted sweatshop clothing, they might go from $1 a week to $0 a week, and how is that helping? And you also have to take your own circumstances into account; not everyone can afford the more expensive items.

Point is, though, I agree that we should give these things some thought, and I like the non-religious alternative this author proposes.

2. Imaginative reflection

Ignatius offers three imaginative exercises if no clear choice emerges:


I won't copy them here; the link is there above. To summarize, though: 1) Imagine that a friend comes to you with the same situation; 2) Imagine you're on your deathbed reflecting on this choice; and 3) Imagine a conversation with the Divine.

The author provides an alternative to that last one:

Those who do not believe in a God could have an imaginary conversation with someone they loved and trusted and who has passed away.

Hey, little secret about atheists: if we're going to hold imaginary conversations, they might as well be with God, because the conversation is imaginary anyway. I mean, you can do what the author suggests, but it's not like you're breaking the Atheist Code by pretending for the sake of this exercise that you're talking to a god or goddess. (There is no Atheist Code -- though many, like me, have ethical standards.)

3. Seek confirmation

Ignatius advises individuals to act on reason, feeling confident that they have invested their time and energy to make a good choice. But he also says that people should seek out additional information to see if reason confirms the choice.

Here, I'm a little lost. Was one of those "reason" instances supposed to be "emotion"? Because otherwise I'm not sure that this bit makes a lot of sense.

The emotions they feel following a decision, such as peace, freedom, joy, love or compassion, might give an indication if it is the right choice.

Sometimes it's just relief that you've finally made a decision after going through all that.

Anyway, I thought this would be helpful. I think I've been using a process much like this already; I rarely act on reason alone (even if I do act on emotion alone far too often). At the same time, there are decisions I've been putting off because there's no clear "better" outcome. So this will lead us into today's...

*StarB* *StarB* *StarB*


Merit Badge Mini-Contest!


In the comments below, tell me: What process do you use to come to big decisions? Do you do something like Ignatius suggests, or something else entirely? Logic, meditation, prayer, a ritual circle, "what would *insert deity here* do," hard exercise to distract the mind? Maybe getting insight into others' process can help someone make their own decisions.

As usual, the deadline will be midnight WDC tonight, and the comment I like best will get its author an appropriate Merit Badge tomorrow.

(Additional disclaimer: I spent part of the evening giving out a bunch of MBs for the Quills, and I recognized a few of the names as people who comment here from time to time. If one of those authors wins, I'll delay the MB for two weeks for CR eligibility.)
August 3, 2020 at 12:05am
#989758
Today's article was published in the Before Times, but its theme is very relevant today.

https://theconversation.com/when-science-gets-ugly-the-story-of-philipp-lenard-a...

When science gets ugly – the story of Philipp Lenard and Albert Einstein


Never heard of Philipp Lenard? Neither had I until I read this article. His relative (pun intended) obscurity undermines the point that I think the article is trying to make. Let's take a look, shall we?

Scientists are not always as scientific as many suppose.

Well, duh. Scientists are humans. I know shows like to portray them as some sort of wizards, the Gandalfs of technology, but they're all people and thus subject to all of the biases, contradictions, prejudices, and just plain wrongness that plague the rest of us.

Recent well-publicized cases of scientific fraud prove that scientists can be as susceptible to the allures of wealth, power and fame as politicians...

Interesting comparison there. In science, theories that turn out to not fit observations are eventually discarded into the trash heap. In politics, well, for example, here in the US we have two houses of Congress and three branches of government, lots of different people thinking about stuff, so that eventually bad politics also gets discarded. Usually. Eventually.

Such breaches prove that scientists do not always base their work strictly on rigorous experimentation, data collection and analysis, and hypothesis testing.

"Prove" is kind of a dirty word here. I might have picked "demonstrate." That is, like I said, scientists are humans and thus fallible. However, that we know about these instances of bad science shows that science, as a process, is working as intended -- to smooth out the bumps in the road caused by individual bias and error.

In fact, scientists frequently disagree with one another, both as individuals and as representatives of competing schools of thought.

Feature. Not a bug.

The article goes on to provide a brief bio of both Lenard and Einstein. Lenard was "a German experimental physicist," while Einstein was "a Swiss theoretical physicist."

It's no secret that experimental and theoretical physicists are often at odds. They need each other, and collaborations have happened, but my impression is that, historically, members of each group consider the other somehow inferior. But, again, this is the intersection of humanity and science; eventually, the experimenters will either find results consistent with the theories, or they won't, in which case the theory has to be abandoned or modified.

Again, this is how science is supposed to work. It's not like scientists are supposed to be all one happy family, always agreeing with each other.

Lenard, meanwhile, was soon swept along in a wave of German nationalism that accompanied World War I. He became increasingly convinced of the existence of a distinctively German physics that needed to be defended against the plagiarized or frankly fabricated work emanating from other countries. Lenard also became more and more mired in anti-Semitism, accusing the “Jewish press” of, among other things, promoting Einstein’s dangerous work on relativity.

Ideally, science is universal. But scientists aren't always.

Lenard’s attacks on Einstein became increasingly vitriolic. He compared theoretical physicists to Cubist painters, who in his view were “unable to paint decently.” He lamented the fact that a “Jewish spirit” had come to rule over physics.

From everything I've heard, Einstein wasn't even a practicing Jew. So it wasn't even about religion; this was about ancestry.

Lenard’s conviction that science, “like everything else man produces,” was somehow grounded in bloodlines led him to become one of the early adherents of National Socialism.

And just to get this out of the way: Nazis were "socialists" the way North Korea is a "democratic republic." I can call myself a unicorn, but that doesn't make me a unicorn.

The story of Philipp Lenard reminds us that even scientists of the very highest caliber sometimes think, speak and act in utterly unscientific ways, swayed by prejudices that have no scientific basis.

And yet, as I said above, Einstein is a household word these days, while Philipp Lenard has been all but forgotten. Einstein's theories have been supported by evidence time and time again, from measurements of the precession of Mercury's orbit to the experiments in gravitational-wave detection. This is how it's supposed to work and, in general, it does.
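(That Mercury test makes for a fun back-of-the-envelope check, by the way. General relativity predicts a perihelion advance of 6πGM/(c²a(1−e²)) radians per orbit; plug in standard textbook values for Mercury, as in this quick Python sketch, and out pops the famous 43 arcseconds per century.)

```python
import math

# Standard rounded values for the Sun and Mercury's orbit.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M = 1.989e30         # solar mass, kg
c = 2.998e8          # speed of light, m/s
a = 5.791e10         # Mercury's semi-major axis, m
e = 0.2056           # Mercury's orbital eccentricity
orbits_per_century = 100 * 365.25 / 87.969  # Mercury's year is ~88 days

# GR's perihelion advance per orbit, in radians:
per_orbit = 6 * math.pi * G * M / (c**2 * a * (1 - e**2))

arcsec_per_century = math.degrees(per_orbit * orbits_per_century) * 3600
print(f"{arcsec_per_century:.0f} arcseconds per century")  # 43
```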

And so science progresses. It's wrong to revere Einstein the man; I'm certain that he had his flaws just like everyone else. But again, his science has held up -- even though there seems to be growing evidence that they're going to have to tweak the equations to account for what they're calling dark matter and dark energy.

The problem comes in when people hear a pronouncement by someone with a degree or certification, take it as fact, and then ignore anything that contradicts it. So it's the last sentence of the article that concerns me:

They are human beings too, and members of the general public need to be careful to distinguish between a scientist whose arguments are based in evidence and one whose pronouncements stem from other, less reliable sources of conviction.

That's not always easy to do. Evidence, especially at the highest theoretical levels, isn't really accessible to "members of the general public." I mean, it is accessible, but not always comprehensible. But I know this much: if a so-called scientist, or doctor, or whatever, starts spouting off about demon sperm and lizard aliens, I think she can, and should, be summarily ignored -- at least until we have some hard evidence of lizard aliens.
August 2, 2020 at 12:26am
Over at 30DBC, they're still in Antarctica.

Go where your cruise ship can’t — hop aboard a small, sturdy inflatable boat and buzz between the icebergs and around the mountains...


...Another cool and unexpected aspect of this research center is the Vernadsky Station Lounge, one of the southernmost bars in the world. Try the vodka, which has been distilled on site.


But no, I'mma stay where it's warm.

https://www.mentalfloss.com/article/91594/theres-wire-above-manhattan-youve-prob...

There's a Wire Above Manhattan That You've Probably Never Noticed


It's hard to imagine that anything literally hanging from utility poles across Manhattan could be considered "hidden," but throughout the borough, about 18 miles of translucent wire stretches around the skyline, and most people have likely never noticed. It's called an eruv (plural eruvin), and its existence is thanks to the Jewish Sabbath.

Well, NOW people will notice.

On the Sabbath, which is viewed as a day of rest, observant Jewish people aren't allowed to carry anything—books, groceries, even children—in public places (doing so is considered "work"). The eruv encircles much of Manhattan, acting as a symbolic boundary that turns the very public streets of the city into a private space, much like one's own home. This allows people to freely communicate and socialize on the Sabbath—and carry whatever they please—without having to worry about breaking Jewish law.

You know what *actual* work is? Actual work is having to memorize the Tanach, Talmud and Midrash, and be expected to know every detail of not only religious law, but every possible interpretation of religious law. But no, my people don't see it that way; apparently studying these texts is one of the few things you *can* do on Shabbat.

Much of the interpretation of what is and is not acceptable between sunset Friday and sunset Saturday is the result of later texts trying to make sense of commandments in the Scriptures.

But I think they're missing something important, here, and I'm going to change the world with this entry.

New York City isn't the only metropolis in the U.S. with an eruv. They can also be seen (or not seen) in St. Louis, Atlanta, Baltimore, Chicago, Dallas, and numerous other cities across the country.

So, just to be clear here:

*Bullet* There are things that can be done on Shabbat and things that cannot be done, according to law and tradition.
*Bullet* There are things that can be done on Shabbat in the home that cannot be done outside the home.
*Bullet* One can extend the definition of "home" to one's neighborhood, as long as there is an eruv to serve as a boundary.
*Bullet* I think that's cheating, but I'm not a rabbi, so whatever.

With me so far? I might still be drunk from earlier, so let me know if I'm not being clear. It helps to read the linked article and maybe click on the video there.

So, here's my world-changing, earth-shattering proposition, which no one will listen to because I'm not actually a Talmudic scholar.

Take an eruv. That is, imagine that you're sitting in, I dunno, Central Park, and you're surrounded by an eruv.

Now. Imagine that eruv expanding. The area of "home" gets bigger and bigger. It grows to encompass New Jersey, Long Island, Connecticut. But don't stop there. Bigger. Even bigger than that. It stretches until it covers all of North America, South America, the Atlantic. But keep going. Bigger. Pretty soon it's the size of the circumference of the Earth.

But don't stop there, either. Now it starts getting smaller as it moves around to the other side of the planet. You're still on the inside, sitting there in Central Park, and the "outside" of the circle is smaller than the "inside."

You see where I'm going with this, right? Eventually, you've got an eruv in the middle of the Indian Ocean, and everything in the world except for maybe 10 square meters (just an arbitrary number) of ocean is inside the eruv, and then Jews all over the world, unless they are in that particular patch of ocean, can do anything they can normally do at home on Shabbat.

Of course, there's no reason to keep that eruv in the Indian Ocean. It would be tough to erect one underwater. Pick a spot, any spot. Say... I dunno... go to Antarctica (*shudder*). Put up ten poles (ten is an important number in Jewish lore). String an eruv around the poles. Then simply declare that the bulk of the globe of the Earth is inside, while everything else (a few square meters of desert) is outside.

There might be a few penguins who can't carry groceries on Shabbat, but honestly, penguins aren't known for carrying groceries around anyway.

There you go. And no, it's not cheating, any more than the eruv itself is cheating. Okay, so I'm not a rabbi, but I know a loophole when I see one, and that's a loophole. It's a simple matter of spherical geometry: any circle on the surface of a sphere has an "inside" and an "outside" only by convention and declaration. Take the equator, for example: Is the northern hemisphere "inside" or "outside" of the circle of the equator?
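To put some math behind that (a quick sketch, using nothing fancier than the standard spherical cap formula): the area enclosed by a circle of angular radius \theta on a sphere of radius R is

A(\theta) = 2\pi R^2 (1 - \cos\theta)

At \theta = \pi/2 the circle is a great circle like the equator, and each side covers 2\pi R^2 -- exactly half the sphere, with nothing but convention to say which half is "inside." Push \theta toward \pi and the "inside" approaches the sphere's full area of 4\pi R^2, while the leftover cap, 2\pi R^2(1 + \cos\theta), shrinks toward those few square meters of Antarctic desert.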

And if you think what I just said is a stretch (which, topologically speaking, it totally is), then you've obviously never read the Talmud.
August 1, 2020 at 12:02am
Apparently, today's premier virtual travel blog challenge is about camping in Antarctica. Clearly, this month's challenge is SO not for me.

Instead, why don't we talk about memory?

https://www.mentalfloss.com/article/585887/mandela-effect-examples

10 Examples of the Mandela Effect


Would you believe us if we told you the most famous line of 1980’s Star Wars sequel, The Empire Strikes Back, was never uttered? Darth Vader doesn’t reveal his paternity to Luke Skywalker by saying, “Luke, I am your father.” He actually says, “No, I am your father.” The line is but one instance of what blogger Fiona Broome dubbed the “Mandela Effect” a decade ago, after she learned that a number of people shared her erroneous belief that human rights activist Nelson Mandela had perished in prison in the 1980s.

Also, nobody ever said "Beam me up, Scotty" in the original Star Trek.

The Mandela Effect is basically just false memory writ large. It happens to all of us -- perhaps some more than others. But these particular accounts are more than just one person's false memory; they're shared by many people.

This is not the same as lying. This is something that people actually remember, but they remember it incorrectly.

With apologies to conspiracy theorists, the idea of a shared false memory isn’t proof of alternate realities.

Yeah, I wanted to quote that here because it really isn't proof of alternate realities. We have no proof of alternate realities.

1. The Monopoly Man’s Monocle

For decades, Rich Uncle Pennybags (or Mr. Monopoly) has been the de facto mascot for Monopoly, the Parker Brothers (now Hasbro) game that somehow made real estate exciting.


And also somehow ended numerous relationships.

2. Jiffy Peanut Butter

If you looked forward to your school lunch break because your parent or guardian packed a Jiffy peanut butter sandwich, your childhood may be a lie.


I don't think I ever thought there was a "Jiffy." I distinctly remember the very effective line from the Jif commercial when I was a kid: "Choosy mothers choose Jif." Of course, that memory could have been falsified also. I could probably find one of those old commercials on YouTube to check it, but I can't be arsed.

I'mma skip a few here, but I think you get the idea. Or you could, you know, click on the link and read the article.

9. Risky Business

Remember Tom Cruise dancing in his underwear, a dress shirt, and Ray-Bans while home alone in 1983’s Risky Business? Your brain got most of it right.


I never really gave that much thought. I mean, sure, it's a memorable scene, but... shades or no shades? Never even thought about it until I saw this article.

I spent most of last week rewatching all of the Mission: Impossible movies. If you'd told me after Risky Business that I'd be a fan of a rebooted M:I with Tom Cruise in it, I'd have laughed.

Should these processes that lead to false memories be considered flaws? Not exactly. Current theories in psychology are exploring the idea that our ability to cull details from past experiences to create theoretical concepts is actually part of a survival mechanism.

Let's just say I'm skeptical. Not everything has an evolutionary explanation. Some things are just kind of hangers-on, neither beneficial nor detrimental. It's clear that memory itself has survival value, but obviously it's got its flaws. It is often better to remember something poorly than to not remember it at all -- but when it comes to something as useless as movie quotes or whether or not a game mascot has a monocle, I'd expect it to be irrelevant to the course of evolution, which, after all, is only a process by which species either continue or don't.

Still, we know that memories are suspect. This has important consequences. A few decades ago, there was a massive moral panic about Satanic abuse at day care centers. It was all over the nascent internet, as I recall (which may not mean much). Turned out to be false memories; no evidence of ritual abuse was ever uncovered apart from the faulty reminiscences of supposed victims, which were induced by some shrink or something. And yet, during this panic, several people lost jobs and reputation, at the very least, and may even have gone to prison over it. I don't remember.

And yet, if someone tells you a story that turns out not to be true, maybe don't immediately jump to the conclusion that they're lying. Memory isn't a videotape; it's more of a cobbled-together hologram of chaos. (Hologram of Chaos can be the name of my Journey cover band.) So it's likely that it is the truth as they remember it; they just have no idea they're remembering it wrong.

And at the same time, none of us can fully trust our own memories.

This can be a scary thought, I know, but I've learned to come to terms with it. At least I think I have.

The only instance of false memory that I can think of offhand is that sometimes I'll remember a quote from a book or movie, and the next time I read the book or see the movie, it'll turn out I was wrong about it. But I'm pretty damn sure a lot of my memories are wrong, conflated with other memories, or otherwise suspect.

How about you? Have you experienced the Mandela Effect or discovered any false memories?
