About This Author
Come closer.
Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
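A concrete (and hypothetical, purely illustrative) sketch of that last claim: iterating the very simple map z → z² + c over points c of the complex plane yields the Mandelbrot set, probably the most famous of those intricate fractal structures. Python's built-in complex type makes the whole thing a few lines:

```python
# Iterating z -> z*z + c from z = 0 for each point c in the complex plane.
# Points whose orbit never escapes |z| > 2 belong to the Mandelbrot set.

def escape_count(c, max_iter=50):
    """Return how many iterations until |z| exceeds 2 (or max_iter
    if it never does -- those points are in the set)."""
    z = 0 + 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return max_iter

# A crude ASCII view of the complex plane near the set:
for im in range(10, -11, -2):          # imaginary axis, top to bottom
    row = ""
    for re in range(-20, 8):           # real axis, -2.0 to 0.7
        c = complex(re / 10, im / 10)
        row += "#" if escape_count(c) == 50 else "."
    print(row)
```

The resolution and iteration cap here are arbitrary; crank them up and the famous cardioid-and-bulbs outline emerges from nothing but repeated squaring and adding.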




Merit badges and awards:

Quill Awards ( [Link To Item #quills] ) for  [Link To Item #1196512] :
- Best Blog: Honorable Mention 2018; winner 2019, 2020, 2021, and 2022. Sponsored by the blogging consortium:  [Link To Item #30dbc] ,  [Link To Item #blogcity] ,  [Link To Item #bcof] , and  [Link To Item #1953629] .
- Best in Genre (Opinion): winner 2022 and 2023.

30-Day Blogging Challenge ( [Link To Item #30dbc] ):
- First place: January 2019, May 2019, September 2019, September 2020, January 2021, May 2021, November 2021.
- Second place: January 2020, May 2020, July 2020, November 2020.

Other merit badges:
- Highly Recommended: "I highly recommend your blog."
- Opinion: for diving into the prompts for Journalistic Intentions.
- High Five: for inventive entries in  [Link To Item #2213121] .
- Enlightening: for winning 3rd place in  [Link To Item #2213121] .
- Quark's Bar: for a Klingon Bloodwine recipe from  [Link to Book Entry #1016079]  "that deserves to be on the topmost shelf at Quark's."



April 30, 2021 at 12:02am
#1009392
Well, this will probably be the last random article for a while; let's see what we end up with.

The Forest Spirits of Today Are Computers
We’ve made an artificially panpsychic world, where technology and nature are one.


Oh joy, panpsychism again.

Well... sort of.

Years before smart homes became a thing, I replaced all the switches in our house with computerized switches. At first, it was just a way to add wall switches without pulling new wire. Over time, I got more ambitious. The system runs a timer routine when it detects no one is home, turns on the basement light when you open the door, and lights up rooms in succession on well-worn paths such as bedroom to kitchen. Other members of the family are less enthusiastic. A light might fail to turn on or might go out for lack of motion, or maybe for lack of any discernible reason. The house seems to have a mind of its own.

I can see the appeal, but the tech just isn't quite up to Star Trek standards yet.

Also, to me, "seems to have a mind of its own" translates to "does stuff at random times because of power fluctuations, cosmic ray events, or loose wires." Never ascribe to consciousness what can better be explained by glitches -- unless, of course, you can get laughs by doing so.

Modern hardware and software have gotten so complicated that they resemble the organic: messy, unpredictable, inscrutable.

On a superficial level, maybe. I'd prefer to say that it's gotten so complicated that there are exponentially more opportunities for things to simply go wrong.

Gradually, we are turning an old philosophical doctrine into a reality. We are creating a panpsychic world.

Sigh. No, we're not.

All the computers with which we surround ourselves are starting to be endowed with a rudimentary sentience.

On the other hand, there might be something to that, after all. The easiest way to shop for something online, these days, is to talk about it near my phone. Within an hour I'll have fifty different ads for the product. Like, I'll say something like, "I'm going to bed now," and I wake up and there's ads for Sealy clogging my screen.

Which is not to say I have a problem with targeted ads in general. I'd rather get ads for beer, movies, and cat food (all relevant to me) than for hemorrhoid cream, tampons, or dog biscuits (all utterly useless to me).

We are placing minds everywhere and instilling seemingly inanimate objects with mental experience.

Are we, though? I'm sure several philosophers would disagree that these devices necessarily have "mental experience."

By dispersing intelligent artifacts, humanity is awakening the material world.

Oh, sure, and because of that, my grass curses my ancestors every time I have it mowed.

Proponents make three main arguments. The first is that there doesn’t seem to be any principled way to draw the line between conscious and non-conscious. If we are conscious, why not a dog? A paramecium? A protein molecule? A proton? These systems lie on a continuum with no obvious break.

Just because there's no obvious break doesn't mean there isn't a line there. Just because we haven't figured out where it is doesn't mean it doesn't exist.

Second, panpsychism would solve the hard problem of consciousness.

And love can be reduced to chemical reactions in the mind. So?

Third, several of today’s leading theories of consciousness imply panpsychism.

Before Einstein, leading theories of cosmology implied that the Universe was contained within a few thousand light-years, and said nothing about being unable to accelerate past the speed of light.

When debating panpsychism, the question is not whether, but when. Either the world already is panpsychic or it will be.

As usual, binary ideas like this are misleading. Even if we end up building gadgets that could be called "conscious," they're not going to imbue the rock I stubbed my toe on yesterday with consciousness. I kinda wish they would, because then the rock might appreciate the words I had for it.

The books paint a fascinating picture of a fully sentient world. People can telepathically communicate not just with friends and family, but with atoms, burbling brooks, and the planet as a whole. If your friends come over for dinner, the group forms a temporary collective mind that you can commune with. Every act becomes a negotiation: You had better apologize to the brook for urinating on its bank and talk nice to your hand tools. Say the right words to the right atoms and you can heal wounds or fly like Superman. On the downside, villains can brainwash atoms, unraveling the fabric of reality.

And you get to hear the screams of everything you eat. Yes, even the plants. Especially the plants. That's more horror than science fiction.

So anyway, despite my snark, the article's a good read; I don't have to agree with something to appreciate the arguments.
April 29, 2021 at 12:03am
#1009331
One of the stupidest things about this timeline is how much weight people put on what celebrities say.

Science and philosophy can, and should, coexist
MC Hammer's recent tweet has sparked a conversation about the complementary nature of these two fields


Another stupid thing about this timeline is Twitter.

Nevertheless, I can't find any real fault in what MC Hammer said here.

MC Hammer recently brought the hammer down on those who see science and philosophy as fundamentally opposed disciplines. He first tweeted a link to this paper, showing that STEM fields account for 21.3% of citations of philosophy of science journals. Some tweeters responded with unflattering, and inaccurate, characterizations of philosophy that put it at odds with science. In response, MC Hammer had some words of advice praised by scientists and philosophers alike: "It's not science vs Philosophy ... It's Science + Philosophy. Elevate your Thinking and Consciousness. When you measure include the measurer."

As both science and philosophy are interests of mine, I do see that sometimes they're at odds. And they have completely different ways of approaching ideas. I see this as a good thing.

MC Hammer's insistence on the complementary nature of science and philosophy is in line with this 2019 opinion paper, published in PNAS. The authors described a continuum of science and philosophy, as the two fields share "the tools of logic, conceptual analysis, and rigorous argumentation.”

The big difference, as I see it at least, is that philosophy doesn't have to cleave to objective truth. One can draw perfectly logical conclusions from false (or at least unverifiable) premises, leading to faulty conclusions even with rigorous logic, but that's not what science is supposed to be about.

Tweets by MC Hammer promoting these views will hopefully also help to break down harmful stereotypes of the disciplines that might prevent scientists and philosophers from working together for the good of society.

I still think entertainers, no matter how talented or intelligent, have entirely too much sway over popular opinion. It's like... I'm a Springsteen fan because I like his music, but anything he says about anything other than music carries no more weight with me than when anyone else says such things.

In any case, it's always been clear to me that, deliberately or not, philosophy guides science, and science can inform philosophy. Examples? Well, consider animal testing in the biological sciences. For a while, it was generally accepted, but as attitudes changed and we found out more (partly through animal testing), such things were curtailed and subject to ethical review. There's no purely scientific rationale for not doing tests on animals (including human animals), but there's plenty of philosophical reasons not to.

After all, sometimes I wonder if the phrase "curiosity killed the cat" is more about a human's curiosity than the poor cat's.

On a personal note, for anyone who's tired of seeing these things pop up in here, I've signed up for the 30DBC again for next month. It's not like I don't have dozens more articles in the queue; it's just time for something different again.

Speaking of something different, today (Thursday) is the day I officially reach full vaccine protection, two weeks after my second shot. While it's not a good idea to start running around licking doorknobs or bathing in people's sneezes, I intend to celebrate by going to *gasp* a movie theater.

Godzilla vs. Kong looks good. If by "good" I mean "thin plot, mediocre acting, but kaiju battles with budget-breaking CGI." Which in fact I do.
April 28, 2021 at 12:15am
#1009252
This article is a few years old now, but I haven't seen anything to contradict it in the last four years, so it's probably still current science.

New Model of Evolution Finally Reveals How Cooperation Evolves
By treating evolution as a thermodynamic process, theorists have solved one of the great problems in biology.


It also addresses a few of the misconceptions about evolution, but there are a few points I'd quibble with.

The first misconception is that "survival of the fittest" necessarily equates with "survival of the strongest." Bunnies, for example, didn't get to where they are by being strong; they exist because they're famously fast and, even more famously, prolific reproducers. And they're very cute.

One of the great unanswered questions in biology is why organisms have evolved to cooperate. The long-term benefits of cooperation are clear—look at the extraordinary structures that termites build, for example, or the complex society humans have created.

In other words, we know that intraspecies cooperation (and sometimes interspecies cooperation) exists, but how such cooperation evolved has been an ongoing subject of debate.

But evolution is a random process...

Uh... not really. There's a random element, sure, but chance isn't the biggest factor. I mention this because some people have used "random chance" as a specious argument against evolution. No, we humans didn't get to where we were able to invent computers merely "by chance," and bunnies didn't suddenly and randomly develop the ability to hop.

...based on the short-term advantages that emerge in each generation.

That whole sentence is problematic, but I understand that its purpose is to provide background for later arguments. It's close enough for that purpose, I think.

Of course, individuals can cooperate or act selfishly, and this allows them to accrue benefits or suffer costs, depending on the circumstances. But how this behavior can spread and lead to the long-term emergence of cooperation as the dominant behavior is a conundrum that has stumped evolutionary biologists for decades.

I'll take that as given. All I'll add is that neither cooperation nor competition is the only way for a species to evolve. I'm sure you can think of plenty of examples of both, but in many cases it's some mixture of the two.

Today, that could change thanks to the work of Christoph Adami and Arend Hintze at Michigan State University in East Lansing. They have created a simple mathematical model using well understood physical principles to show how cooperation emerges during evolution.

So, basically, while it's a model with solid grounding, it's not like an experimentally verified result. Still, it's worth noting, because the first thing you have to do to draw a conclusion is ensure that your hypothesis is at least plausible.

Their model suggests that the balance between cooperation and selfish behavior, called defection, can undergo rapid phase transitions, in which individuals match their behavior to their neighbors. What’s more, a crucial factor turns out to be the process of punishment.

Well, good news for BDSM fans.

The article goes on to explain, in what I think are very accessible terms (so don't be scared off by the "mathematical model" phrase above), the way they looked at the problem.
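To make the "cooperation vs. defection with punishment" idea concrete, here's a toy sketch. This is emphatically NOT the Adami/Hintze model (their statistical-physics treatment is far more careful); it's just a minimal spatial Prisoner's Dilemma where agents imitate their best-scoring neighbor, with made-up payoff numbers, showing how a punishment cost on defection can tip the population:

```python
import random

# Toy payoffs: mutual cooperation = 3, mutual defection = 1,
# defecting against a cooperator = 5 (temptation), being suckered = 0.
# "punishment" is an extra cost subtracted from every defection.

def play(a, b, punishment):
    """Payoff to player a when playing strategy b."""
    if a == "C" and b == "C": return 3
    if a == "D" and b == "D": return 1 - punishment
    if a == "D":              return 5 - punishment
    return 0

def simulate(punishment, rounds=200, n=100, seed=42):
    random.seed(seed)
    pop = [random.choice("CD") for _ in range(n)]   # agents on a ring
    for _ in range(rounds):
        # each agent scores against both ring neighbors
        score = [play(pop[i], pop[(i - 1) % n], punishment) +
                 play(pop[i], pop[(i + 1) % n], punishment)
                 for i in range(n)]
        # crude update rule: copy the best-scoring of {left, self, right}
        pop = [pop[max([(i - 1) % n, i, (i + 1) % n],
                       key=lambda j: score[j])]
               for i in range(n)]
    return pop.count("C") / n   # fraction of cooperators left

print(simulate(punishment=0))   # cheap defection: cooperation collapses
print(simulate(punishment=3))   # costly defection: cooperation tends to spread
```

Even this crude version shows the flavor of the result: with no punishment, defectors' temptation payoff lets them invade and take over; make defection costly enough and the imitation dynamics flip toward cooperation, fairly abruptly.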

That’s an interesting result. It implies that behavior can be manipulated on a large scale by the introduction of certain costs. It also implies that the result can be modeled using relatively simple physics.

Well, they could have just taken the route of "ask an economist." The only danger there is that if you ask n economists a question, you'll get n+2 different answers.

Anyway, it seems to my completely unschooled opinion to boil down to game theory. But it struck me as an interesting result because I'd been thinking about how competition and cooperation interact. Like in a game. Two players sit down to play chess. At first glance, it's a competition: either one will win and the other will lose, or it will end in an unsatisfying stalemate. But the thing is, both players have agreed, formally or not, on a set of rules governing that competition. Someone who breaks the rules is called a "cheater" and no one likes a cheater. The rules are a form of cooperation, and the social disapprobation heaped upon cheaters serves to discourage such behavior.

Doesn't matter if the game is chess, football, Monopoly, or whatever. You have to cooperate before you can compete.

And even when there's not an actual game involved ("game theory" isn't actually about games), there are rules and consequences for breaking them -- even in the ultimate human competition called war. I know the saying is "all's fair in love and war," but that's simply not the case.

Like I said, the title is a bit misleading, because nothing's been settled (or "solved") here -- but it does provide a fresh way of looking at things, and sometimes that's all it takes to eventually make a breakthrough. Another advantage of cooperation.
April 27, 2021 at 12:02am
#1009201
Well, it looks like another easy one for me today. It doesn't usually happen two days in a row, but such are the dangers of random selection.



I've described myself as uncreative before, but by this vlogger's definition, I suppose "creativity" applies even to me.

Are you skeptical about creative advice that requires a belief in the supernatural? Curious about what works in the real world? Or just interested in creativity? Join Freethought Blogs writers T.D. Walker, William Brinkman, and Megan Rahm as they discuss creativity without the woo.

Now, videos aren't my favorite means of absorbing information. I do better with written articles, as anyone who's been following along should know. There are some things that pretty much have to be video, like when you're talking about an explosion; I wanna see that thing, not read a description of it. This video in particular could probably have better been an essay, but... whatever. I think it's good information, and at least it's not vertically formatted.

I'm sharing it because it's relevant to what we do as writers. And because I don't accept the idea of the supernatural, so it never really occurred to me that some other people believe that creativity is something that comes from outside oneself.

Certainly it can be inspired by something external, but the idea that you can take that inspiration and "create" something from it, well, I consider that just another function of the mind and consciousness. In other words, you are your own muse.

But the video should be useful to all creators, not just heretics like me. So, you can click the link above to see it on YouTube, but then you might be tempted to read the comments, which is rarely a good idea on YT (I didn't venture down there, myself). So here's the video, embedded:

April 26, 2021 at 12:01am
#1009149
Not much to copy here today; it's mostly screenshots, so you'll have to go to the link if you're interested in writing tips.



Yes, the originals are from Twatter, which, while I admit that sometimes good stuff happens there, is a site that I have no intention of visiting, let alone signing up for. Fortunately, the good stuff gets reposted somewhere.

Anyone who embarks upon the ever trying journey of wrestling the blank page will become terribly familiar with the grueling process that can be just trying to spew some words out. So, this helpful little Twitter thread that includes some knowledge gleaned from a creative writing class might just come in handy.

Now, some of these tips are things I was already aware of, so presumably you are too. But really, it can't hurt to review, and who knows? You might find something new.

So yeah, that's about it today. Just thought someone might benefit from the post.
April 25, 2021 at 12:04am
#1009082
I mentioned before that I had at least one more thing about Leonard Cohen's song Hallelujah and that I'd get to it one day.

Today is that day. And oddly enough, the source is Cracked.



Incidentally, free idea for Cracked (hey, one of them might read this, just like Halle Berry might come by my house in the catsuit carrying two growlers of craft beer): Start a dating app for comedians. Call it Crackd.

You're welcome. And while I did say "free," if you wanted to funnel me, say, 5% of the profits, I wouldn't complain. Much.

Okay, anyway, after that massive side of aside, back to the actual article.

According to Leonard Cohen and also, like, the Bible or whatever, there's a certain chord that God particularly likes, but everyone rudely refuses to tell us what it is.

Add that to the list. Hell, no one even knows God's actual name. Some actually refer to God as HaShem, which translates to "The Name."

In the unlikely event that you're still here reading this and you're not familiar with the lyrics from which this springs, here's the first verse of the song:

Now I've heard there was a secret chord
That David played, and it pleased the Lord
But you don't really care for music, do you?
It goes like this, the fourth, the fifth
The minor falls, the major lifts
The baffled king composing Hallelujah


It's entirely likely that the chord was meant to remain, you know, a secret, but plenty of people think Cohen left us clues as to what at least he thought the secret chord was.

I'd lay all kinds of money that it's not the mysterious chord that opens The Beatles' A Hard Day's Night. Maybe the one at the end of A Day in the Life? ...Nah.

Or maybe it's a B chord. Ever try to play a B chord on guitar without retuning or faking it? ...Okay, so I never got too far with guitar lessons.

In fact, Cohen's secret chord may not be a chord at all. It might be a metaphor for divine inspiration, and the subsequently described chord progression could be a code leading the listener to certain Bible verses, which are indeed eerily appropriate.

A metaphor? In a Leonard Cohen song?! Impossible!!!

That's sarcasm, by the way. You can tell by the plurality of punctuation marks.

But the biblical David was, in fact, a musician, and Cohen's song is more about bangin' than religion, so let's assume he's speaking literally because that's also a lot more fun and involves something called the Devil's Interval.

Unless he was a liar. Liar... lyre? Get it? Crap. Back to the professionals.

We may not be totally clear on how God jams, but it turns out "the devil's music" isn't just a term applied by grandmas to bands with haircuts they don't like. "The devil's interval" is a chord a half-step below a perfect fifth that sounds so dissonant to our ears that it was banned in churches in the Middle Ages because it was thought to induce sinful thoughts.

Bet the organists back then (or whatever, if this was before organs) snuck into the church at night just to play that banned chord.
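For the curious, the "dissonance" here isn't mystical; it falls out of simple arithmetic. In equal temperament, each semitone multiplies frequency by 2^(1/12). Consonant intervals land near simple whole-number ratios; the tritone (six semitones, the devil's interval) lands on √2, about as far from a simple ratio as you can get. A quick sketch:

```python
import math

# Equal temperament: each semitone multiplies frequency by 2**(1/12).

def ratio(semitones):
    """Frequency ratio of an interval that spans the given semitones."""
    return 2 ** (semitones / 12)

print(round(ratio(7), 3))               # perfect fifth: ~1.498, near 3/2
print(round(ratio(6), 3))               # tritone: exactly sqrt(2), ~1.414
print(math.isclose(ratio(6), math.sqrt(2)))
```

A fifth's near-3:2 ratio means the two waveforms line up every couple of cycles; √2 is irrational, so a tritone's waveforms never quite line up, which is one common account of why it grates on the ear.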

According to one theory, if the devil loves this chord so much, God's fave must be the opposite: a perfectly harmonious major chord. Probably C major, which also happens to be the key of "Hallelujah."

C major is, coincidentally or not, also the easiest key on piano, because you don't have to mess with those annoying black keys.

But maybe we're thinking too hard: When Cohen recites the chord progression, he also plays it, so maybe the secret chord is just the chord he plays when he says "secret chord," which would be A minor.

And A minor uses the same keys as C major. Look, don't ask me; I've forgotten more music theory than I ever learned, but that little tidbit stuck with me. What's the difference, then? Go ask an actual musician.
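Since I punted on the theory, here's the short version in code form (a hedged sketch, not a theory lesson): both scales are built from interval patterns over the twelve semitones, and A natural minor's pattern, started on A, happens to land on exactly the same seven notes as C major's pattern started on C. The difference is the tonic, which note feels like "home."

```python
# Build scales from their whole/half-step patterns (in semitones)
# and compare the resulting note sets.

NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
MAJOR = [2, 2, 1, 2, 2, 2, 1]            # whole-whole-half-whole-whole-whole-half
NATURAL_MINOR = [2, 1, 2, 2, 1, 2, 2]

def scale(root, pattern):
    """Walk the pattern up from the root, wrapping around the octave."""
    i = NOTES.index(root)
    out = [root]
    for step in pattern:
        i = (i + step) % 12
        out.append(NOTES[i])
    return out[:-1]                      # drop the repeated octave note

c_major = scale("C", MAJOR)
a_minor = scale("A", NATURAL_MINOR)
print(c_major)                            # the seven white keys, from C
print(a_minor)                            # the same seven notes, from A
print(sorted(c_major) == sorted(a_minor)) # same pitch set, different tonic
```

Same keys, different starting point and step pattern: that's the whole relative-major/relative-minor relationship in one comparison.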

Or maybe it's something totally different. Go onto any music theory subreddit and ask people what they think the secret chord is and you'll get 100 different, equally compelling and confusing answers.

Also coincidentally, that's what happens when you ask anything on reddit.

In any case, mostly I wanted to post about this because I don't have many opportunities to make a lyre pun, and I'll take one whenever I can get one. And also because, like I said, I'll never get tired of that song.

Mini-Contest Results!


Thanks for all the opening sentences / paragraphs from yesterday. Really, I liked all of them, from the flowery to the laconic. It was hard to choose just one favorite, but I suppose the poetic imagery in the one Kåre เลียม Enga posted just barely edged out the others. So a MB to Kåre เลียม Enga will be on the way soon, and we'll do this again sometime.
April 24, 2021 at 12:13am
#1009033
So let's talk about actual writing again.

A Close Reading of the Best Opening Paragraph of All Time
From Shirley Jackson's We Have Always Lived in the Castle, of course


I don't effing think so.

One hundred and one years ago today, Shirley Jackson was born.

Article is dated December 15, 2017, so that's the "today" she's talking about. That's fine; that's how time works. Just making it clear.

I'll skip to the actual opening paragraph in question.

My name is Mary Katherine Blackwood. I am eighteen years old, and I live with my sister Constance. I have often thought that with any luck at all I could have been born a werewolf, because the two middle fingers on both my hands are the same length, but I have had to be content with what I had. I dislike washing myself, and dogs, and noise. I like my sister Constance, and Richard Plantagenet, and Amanita phalloides, the death-cup mushroom. Everyone else in my family is dead.


I'll be honest: None of this interests me in the slightest. By which I mean, okay, fine, whatever; don't care. I'm neither tempted to read on nor ready to throw the book against the wall. As opposed to this article, which does evoke emotion in me, all of it negative.

I'm willing to admit that this might be because I lack historical context; the werewolf thing has been done to death now, but was probably somewhat fresh when it was written.

Back to the article:

It almost seems like overkill to explain why this paragraph is so wonderful.

No. No, it doesn't. Explain to me, a lifetime reader of science fiction, fantasy, horror, and supernatural genres, exactly what makes you think this is anything but boring.

You must have a certain sensibility to truly appreciate its charms.

Oh, hell no; you're not going to get away with tricking me into liking it by making me feel inadequate. I could just as easily say "You must have a certain idiocy to think this paragraph is anything but mediocre."

The rhythm is key.

Bovine excrement.

It begins straightforwardly, with our narrator’s name—a somewhat old-fashioned way of opening a book, appropriate for our somewhat old-fashioned, or at least sheltered, perhaps even stunted, narrator.

Sure, it's old-fashioned, but that's not a problem; it's arguably better than waiting until some arbitrary later point to give the reader the protagonist's name. The problem is that nothing else in that paragraph says "sheltered" or "stunted." On the contrary, you're not sheltered if most of your family is dead and you're not stunted if you're familiar with the botanical binomial of a fungus, nor if you feel like you have to explain said fungus by also using its colloquial name.

And what a name it is—a somewhat old-fashioned name, Mary Katherine Blackwood, evocative of witch trials and cultists, dense trees in far-away continents and Nancy Drew mysteries

All I could tell was that the name was so white I could barely make it out against the website's background.

She tells us she is eighteen, but by the very next sentence, she already sounds younger, and she sounds younger still by the third (“I dislike washing myself” is a prim schoolgirl’s complaint—well, prim in tone if not in meaning). This too presages what we will come to discover about Merricat (for that is what she is most often called), who lives by a logic quite disconnected from that of the adult world, i.e. the world of men, the world exterior to her cherished sororal bond, and who will aggressively reject all encroachments by same.

Oh, now what you're saying is that at least part of the supposed charm of the opening paragraph can only be discerned as it relates to the rest of the story. The story that I'm not going to read if the first paragraph doesn't hook me. Fail. I'm skipping over a bit here, but...

The perfect surreality of her matter-of-fact association—the length of her fingers to her potential as a werewolf—signals that Merricat is a magical thinker, and a confident one.

Again, I read fantasy and horror. This sort of thing neither disturbs me nor intrigues me. A person "confident" in magical thinking is, as far as I'm concerned, an unsympathetic character.

Some readers might be tempted to wonder if I'd say the same thing if the protagonist's name were, say, Mark Aloysius Blackwood. The answer is yes. I don't give two shits about the gender of a protagonist.

Submit to her logic or give up now.

I choose the latter.

As for her likes: first we get a second pointed mention of Constance in twice as many lines, which should alert us to her importance.

I can actually agree with this bit, though again, it's only on reading the actual story that the theory "Constance is important" will play out or not.

So at any rate, now we know we’re dealing with a teenage girl bizarre and erudite enough to name a member of an ancient English dynasty, killed before he could achieve the goal he’d fought his whole life for, as one of her three favorite things.

Thus once again contradicting the "sheltered" and "stunted" descriptions.

The third favorite thing, of course, is a poisonous mushroom. This should set off certain alarm bells, especially when it is immediately followed by the revelation that everyone in her family is dead. I also count this revelation as a third mention of Constance.

That's because you read the story, which, I reiterate, won't happen if the opener fails to hook me. A good opener has to stand by itself, and not be good only in hindsight.

This paragraph is brilliant because of Merricat’s voice, and so is the rest of the book. It immediately teaches us who she is, and what this book is going to be like.

Oh? Well, then HARD pass on the rest of it.

And now that I've forever destroyed any chance of getting published on LitHub, I'll stop ragging on the article and on a story I've never read. Yes, of course I've read other stuff by Shirley Jackson; it was required reading in high school. And I liked it. I'm not criticizing Shirley Jackson here.

So what does Waltz think is the greatest opening paragraph of all time? Well, obviously, to answer that, I'd have to have read every story ever written, and that's an even less attainable goal than my desire to visit All The Breweries.

But the one that has always stuck out in my mind is the opening paragraph of To Reign in Hell, a somewhat obscure novel by one of my favorite authors, Steven Brust:

Snow, tenderly caught by eddying breezes, swirled and spun in to and out of bright, lustrous shapes that gleamed against the emerald-blazoned black drape of sky and sparkled there for a moment, hanging, before settling gently to the soft, green-tufted plain with all the sickly sweetness of an over-written sentence.


What can I say? I prefer my fiction to be self-aware.

How about the rest of you? Any favorite opening sentences/paragraphs? Why not make it a

Merit Badge Mini-Contest!


Give me your favorite opening line(s) (including title and author). Copy them in if you can, or link to them. Yes, you can try entering your own effort, or that of other WDC authors. My requirement is that I be able to read it, else I won't be able to tell whether I like it or not. The one I like best will get a Merit Badge sometime on Sunday. As usual, you have until midnight tonight, the end of the day on Saturday, in accordance with WDC time.
April 23, 2021 at 12:01am
#1008989
Not doing my usual random article thing today. I found out after I posted yesterday that one of my favorite songwriters died on the 19th, so this one's about Jim Steinman.

Jim Steinman obituary
Composer and songwriter who masterminded Meat Loaf’s Bat Out of Hell, one of the biggest-selling albums of all time


I'll let the obituary do most of the work, here. I'll be getting together with a friend of mine on Tuesday, someone who's also a fan, and we'll pour one out for the Master of Melodrama.

Jim Steinman, who has died of kidney failure aged 73, made a spectacular career of being bigger and more bombastic than the rest, and his achievement in masterminding Meat Loaf’s album Bat Out of Hell will guarantee his immortality.

Meat Loaf is, of course, a great singer and performer, and one of those whose career survived having been in Rocky Horror. But he wouldn't have enjoyed nearly the success that he did without Steinman's songwriting.

Bat’s producer, Todd Rundgren, thought the album was supposed to be a parody of Bruce Springsteen – members of Springsteen’s E Street Band played on it – and Steinman did not entirely reject parallels with the Boss. His own songs, Steinman said, “are dream operatic, his are street operatic. He’s more West Side Story and I’m more Clockwork Orange.”

I am, of course, also a fan of Springsteen, and I noticed long ago that they shared a lot of backup musicians. Roy Bittan's piano in particular lent itself just as well to Steinman's operatic compositions as it did to Springsteen's more mainstream rock and roll.

Steinman’s career as a solo artist only managed to encompass the album Bad for Good (1981), which reached the UK Top 10 and spun off the modest hit single Rock and Roll Dreams Come Through, but he was in demand from many directions.

I had that album. Still do, actually. Most of its songs have been redone by Meat Loaf and other artists, because Steinman's voice just wasn't up to cashing the checks that his fingers wrote. Still, I have a fondness for the album.

In the 1990s Steinman ventured into musical theatre by collaborating with Andrew Lloyd Webber on Whistle Down the Wind (based on the 1961 film).

Because really, there was only one composer who could out-angst Steinman, and that's Webber.

He reunited with Meat Loaf for the last time for the singer’s album Braver Than We Are (2016).

I hope they finally got over their turbulent past for that. Never got that album, though; it was after albums stopped being a thing, really. But now I'm going to have to track it down. (That's a pun, yes. "Track?" No? No? Damn. Just not in the groove today.)

The obit I linked includes a video of I Would Do Anything for Love (concerning which comments about what "that" is will be received with great disdain), but I thought I'd include a video of the song I rule at, when I have a suitable partner, on the karaoke stage.



It never felt so good, it never felt so right
And we're glowing like the metal on the edge of a knife...
April 22, 2021 at 12:02am
#1008945
There's a whole lot wrong with the phrase, "Money can't buy happiness," starting with the assumption that happiness is some sort of worthwhile goal. So here's an article about it that's also wrong.

How to Buy Happiness
The joys of money are nothing without other people.


To quote the Beatles, "Money can't buy me love." To instead quote the Beatles, "Money don't get everything, it's true, but what it don't get, I can't use."

In 2010, two Nobel laureates in economics published a paper that created a tidal wave of interest both inside and outside academia. With careful data analysis, the researchers showed that people believe the quality of their lives will increase as they earn more, and their feelings do improve with additional money at low income levels. But the well-being they experience flattens out at around $75,000 in annual income (about $92,000 in today’s dollars).

Nobel laureates: appeal to authority. A paper: oh, boy, a paper. Data analysis: wow, then it must be true.

I can't be arsed to look at the actual paper (which I've been hearing about for 11 years now), but when someone's making more than $100K, they usually have other stressors that might limit their sense of well-being. In other words, once again, correlation isn't causation.

This January, another economist published a new paper on the subject that found that even beyond that income level, well-being continues to rise.

Dueling papers!

The lesson remains the same as it was a decade ago: At low levels, money improves well-being. Once you earn a solid living, however, a billionaire is not likely to be any happier than you are.

Yeah, but again, why is happiness the goal? What if I don't want to be happy, but just want to fly around in a Gulfstream and sail around in a private yacht?

Yet for the most part, this truth remains hard for people to grasp. Americans work and earn and act as if becoming richer will automatically raise our happiness, no matter how rich we might get. When it comes to money and happiness, there is a glitch in our psychological code.

I'm not sure that's the case. We're talking about Americans here. No matter how much money you make per year, no matter how much savings you manage to accumulate, one medical event can wipe it all out entirely, leaving you in poverty and/or debt. Money is a cushion against such things; the more money you have, the more likely you are to financially survive one trip to the hospital.

In other words, money is security. It's not really about happiness.

Research shows that how the wealthier among us spend their money makes all the difference for their well-being. Specifically, spending money to have experiences, buying time, and giving money away to help others all reliably raise happiness. Thus, if you have a little excess income, it’s best to use it on those three things.

[citation needed]

The key factor connecting all those approaches is other people. If you buy an experience, whether it be a vacation or just a dinner out, you can raise your happiness if you share it with someone you love.

Eh, no, not really. Adding other people into the mix means you have to recognize their desires. Say you go to Vegas with a friend. You want to gamble; they want to see a comedy show. So either one of you does something you don't want to do, or you each go your own way, in which case why bother with the friend? I mean, sure, traveling with friends can be fun, but for the most part, I travel alone so I don't have to have endless discussions over what to do next, what to eat, when to sleep, etc.

Friends and family are two key ingredients in well-being, and fun experiences with these people give us sweet memories we can enjoy for the rest of our lives—unlike the designer shoes that will wear out or go out of style.

Designer shoes may go out of style, but friends end up ghosting you, and as for family, I realized recently that my definition of "family" is "the people who, no matter what you do, it's never good enough for them and they'll be disappointed in you." Nah, I'd rather have money.

Likewise, if you pay someone to do something time-consuming that you don’t like to do (for example, cutting your yard), and don’t waste the time you gain on unpleasant things like doom-scrolling on social media, you can get a happiness boost by spending those extra hours with others.

I play video games (the single-player kind) while someone that I'm paying is mowing my lawn. That makes me feel good.

Left to our urges and natural desires, we can get stuck in a cycle of dissatisfaction, in which we work, earn, buy, and hope to finally get happier.

Yeah, well, at least part of the problem is a constant barrage of advertisements designed to make us dissatisfied with our lives, and if we only bought This Thing, we'd surely be happier. I know I've said this before, but that makes people think that happiness is having what you want; I say that happiness (if it's even a worthwhile goal in the first place) is wanting what you already have.
April 21, 2021 at 12:07am
#1008886
Well, I haven't mentioned this subject in a while. The article is from 2018, but I just found it recently.



I think the last time I ragged on panpsychism was about a year ago. This article has the added benefit of the word "vibrate" in the headline, which makes me want to make vibrator jokes. But those would be too easy, so I'll try to restrain myself.

Why is my awareness here, while yours is over there? Why is the universe split in two for each of us, into a subject and an infinity of objects? How is each of us our own center of experience, receiving information about the rest of the world out there? Why are some things conscious and others apparently not? Is a rat conscious? A gnat? A bacterium?

"Infinity?" [citation needed]

Over the last decade, my colleague, University of California, Santa Barbara psychology professor Jonathan Schooler and I have developed what we call a “resonance theory of consciousness.” We suggest that resonance – another word for synchronized vibrations – is at the heart of not only human consciousness but also animal consciousness and of physical reality more generally. It sounds like something the hippies might have dreamed up – it’s all vibrations, man! – but stick with me.

Because if there's any place where such a hypothesis could emerge, it would be Santa Barbara.

Based on the observed behavior of the entities that surround us, from electrons to atoms to molecules, to bacteria to mice, bats, rats, and on, we suggest that all things may be viewed as at least a little conscious. This sounds strange at first blush, but “panpsychism” – the view that all matter has some associated consciousness – is an increasingly accepted position with respect to the nature of consciousness.

I've laid out my problem with panpsychism before. In brief: where's the evidence, for or against? Is it falsifiable? Can we do experiments on it? Because otherwise it's just an idea. Maybe it's a good idea - I mean, can it really hurt to consider all things as related? But to me it has one great flaw, which is that if you claim that every atom has some kind of consciousness, you dilute any definition of consciousness. Like when you call time an illusion, you change the definition of "illusion."

Still, the article is worth reading, because who knows? Maybe they're onto something (as opposed to on something, which is the usual state necessary to claim "It's all ONE, dude.") From a purely physical standpoint, it's clear that there's a deep connection between all life, and for that matter, all matter. What's not clear to me is that consciousness, which from the standpoint of parsimony might be better described as an emergent property of an individual's neural connections, also comes from some sort of lower-level consciousness.

It's a lot to think about, and it's not like I can come up with a more compelling idea. So I keep reading on the subject, and thus, sharing what I find.

After all, that's what group consciousness is all about, yes?
April 20, 2021 at 12:03am
#1008815
It's been said that with genius comes a certain level of insanity.



Would that the reverse were true. Anyway, from Cracked...

Pop culture loves to portray geniuses as dragging us ungrateful schmucks through history, whether we want to come along or not. So it's always good to remember that brilliant people still have the same problems as the rest of us ... except when their problems get much, much stupider than ours ...

I don't know; some of us have some really stupid problems.

4. Vladimir Nabokov Hated Sleep, Conducted Weird Dream Experiments

Vladimir Nabokov, the famous author of Pale Fire and that novel about a pedophile that's awkward to read on the bus, is considered a titan of literature, especially by dudes with bad facial hair who keep cutting off other people in their MFA classes. He was also a lifelong insomniac, and the condition took him to some weird, weird places.


Probably because it's been shown to a high level of certainty that lack of sleep messes you right up.

For starters, Nabokov called sleep "mental torture," "the most moronic fraternity in the world," and "the nightly betrayal of reason, humanity, genius," among other things offensive to anyone who considers napping their greatest skill.

For me, it's between napping and drinking.

3. George S. Patton Thought He'd Have To Fight Forever

Admittedly, he also wrote poems with titles like "The Turds of the Scouts," but his belief in reincarnation was legit, and he spent both World Wars claiming to know his way around old European towns and battlefields thanks to his time as a 14th-century French knight. Unfortunately, Sir Patton was one of many Frenchmen to meet his end at the Battle of Crecy, a crushing victory for the English, but it would have been suspicious if he'd claimed nothing but a string of huge successes.


What's suspicious is that all of his purported past lives were European. Couldn't throw some Native American, African, or Asian warriors in the mix? Racist fuck.

Exactly how Patton reconciled this with his devout Christian beliefs remains unclear, but he also claimed that he was at the Siege of Tyre, served under Caesar and Marc Antony, fought with the English at Agincourt because apparently when you reincarnate during a lengthy conflict, you can ask to switch to what looks like the winning side, served the House of Stuart during the English Civil War, and was a bigshot in Napoleon's Grande Armee.

See? If his past lives were any whiter, he'd be a ghost. Oh, wait.

Anyway, this may all sound ridiculous, but reincarnationresearch.com thinks he's currently living as James Mattis, and surely if anyone would know, it's those guys.

I have to wonder if Mattis knows about this.

2. Kurt Godel Would Only Eat Food Prepared By His Wife ... And Then She Got Sick

Kurt Godel is one of those mathematicians who was so smart you need a good math background to appreciate his incredible influence. His incompleteness theorems, contributions to set theory and modal logic, and many other accomplishments are all way over our head.


Mathematicians tend toward weirdness anyway. That's why they become mathematicians.

And Cracked, what's with omitting the umlaut? It's Kurt Gödel. It's not that hard to do, and it hasn't been fashionable to piss off Germans since 1945. Also, dude was metal, and nothing says "metal" like an umlaut.

Godel became obsessed with the thought of being poisoned, either intentionally via food or accidentally through, for example, the venting of gas from his refrigerator. His wife kept his fears in check by preparing and tasting his food for him, and long talks with some guy named Einstein helped keep him mentally grounded, but he was never the most stable genius.

Hm, take the slight risk of possible poison, or the sure risk of dying from malnutrition? If only there were a logical construct for managing risk.

1. Edward Gibbon's Balls Were Literally Too Big

18th-century politician and historian Edward Gibbon's big claim to fame is The History of the Decline and Fall of the Roman Empire, an epic 1.5 million word landmark in historical writing. While its central argument -- that Rome's decline can be blamed on its adoption of Christianity -- is considered wrong today, it's still seen as a huge leap forward in research standards and an entertaining read to boot. Books, shows, and games have aped its title, it's been referenced in pop culture like Mad Men, and countless dads once got copies in book subscription programs that they displayed prominently but never got around to actually reading. Whatever Gibbon's flaws, he was an esteemed, serious scholar ... who was undone by an inflamed testicle.


Right now you might be going "But Waltz, that's not insanity." I have no personal experience with this, but I'm pretty sure an inflamed ball would make any man nuts. So it fits. Also, it gives me an opportunity to make a "nuts" pun.

The intent here is not to mock a great man of history, although dying from big balls is very, very funny (Gibbon himself joked about it with close friends as he neared the end). The procedure was simply the best available at the time, and the risk was considered worth it, given what the unchecked problem would mean for his life. You are simply invited to reflect on the fact that, while you probably won't go down in history as a genius, you at least don't live in a time when preposterously swollen genitalia can take you to your grave.

No, it just gets you fame on the Dark Web.

Or so I've heard.
April 19, 2021 at 12:01am
#1008753
Hey look, an actual article about writing.

How to Write a Novel, According to 10 Really Good Novelists
Take notes everywhere, embrace Wikipedia wormholes and other handy tips


A long, long time ago, back in the first lockdown, you probably told yourself that now – right this moment, in the middle of a pandemic – was the perfect time to conceive, plot, write, revise, rewrite, complete and publish a novel which completely transformed what we thought it was possible to express in the English language.

Nope, I said, "Hey, let's play video games and binge-watch TV shows." Also, "Hey, what would happen if I blogged every day around midnight?"

It wasn't. Obviously it wasn't. You know that now. But even if it turned out a year-long period of isolation and anxiety actually wasn't much good for your inner David Foster Wallace, there's no bad time to start writing.

Part of it is that all we could think about was the pandemic, and what you think about is what you write. And I don't want to see one single goddamned story, book, movie, or TV show about the pandemic. The title "Love in the time of [insert disease here]" is completely played out, and as soon as I see something about two (or more) people falling in love while masking and social distancing, I am going to break something. I am also going to break something if it's a story about how lonely people are. I don't care. No. Do not want. Authors writing such tripe need to be sent to Gitmo.

While we're at it, I propose imprisonment for vertical video and a minimum of 10 years of hard labor for writing anything longer than flash fiction in the present tense.

To help you along, we asked 10 established and emerging writers for the rules of thumb they use to find ideas, to get words onto the page, and to turn an interesting first draft into something more substantial.

And you know, I'm just going to leave this here because I'm wiped. No, it's not the second vaccine shot; I'm over that. It's that I managed to score a bottle of rare, artisanal, small-batch gin from Utah (don't ask). (No, really, don't ask.) It's very, very good gin. The downside of it is that I can't be arsed to comment on the tips. I will say I do some of them already, I might try others, but what works for that editor won't necessarily work for me, and what works for me won't necessarily work for you.

So if you're writing a book, or thinking of writing one, I guess you'll just have to click on the link. I'mma go pass out.
April 18, 2021 at 12:01am
#1008685
Today, just a gentle reminder that we don't always know what we think we know. From Cracked, so it's also amusing.



Only it's not limited to science.

We all love science, sure. It's what makes things explode when we want them to, and keeps things from exploding the rest of the time.

Leave it to Cracked to succinctly explain the primary purpose of science.

5. "You Are What You Eat"

And if you eat a lot of a nutrient, you build a whole lot of it into your body. Right? That sounds reasonable enough. But that very last observation is kind of only true with fat, which is something your body likes to store away (and even fat storage is more complicated than that). With other stuff, you have to eat it, but if you eat extra, your body just kind of discards it.

This is demonstrably true, and the article does a pretty good job of explaining why it's so.

Anyway, my mom always used that line on me: "You are what you eat." She was a lousy cook and I was a lousy kid, so one time, I said "You're saying I'm garbage?"

That was the last time I said that.

4. "Save The Rain Forest, It's The Earth's Lungs"

Most oxygen doesn't come from trees at all. It comes from algae in the ocean. And you don't get many trendy charities for looking after algae, because no one had fun climbing algae as a kid.


I gotta give 'em this one. Assuming it's as well-researched as the other points, it's something I never looked into, so I didn't know. In any case, there are plenty of other reasons to save the rainforest.

3. "Let's Keep Squeezing Solar Panels Everywhere!"

Having conceded that trees are great, though not in the way some people think, let's also reassure you that we think solar power is great. The tech works, it's advanced enormously recently, and we should build a bunch of new solar arrays. But many people seem to take this to mean that we need a solar panel, like, right there, right where they’re staring right now.


Minor quibble with this one: even deserts have biodiversity and what life exists there is dependent on sunlight. Build giant solar arrays out in Nevada and what happens? Shade on the ground. The ground that's harboring some really delicate life. So if we can put solar panels on things that are already blocking out the ground, like roofs and roads, I have no problem with it. But the article also goes into the economics of such.

2. We Have No Way Of Picturing Extreme Probabilities

This one's a pet annoyance of mine. Large and small numbers are incomprehensible. Hell, to most people, numbers are incomprehensible. It doesn't help that a lot of people don't see the world in terms of probabilities; only in black and white. Either I win a hand of poker, or I don't. Either I win the lottery, or I don't. If I get vaccinated, I could still catch the disease, so why bother? (The answer to that last one is because your chance of catching the disease drops by several orders of magnitude.)

1. None Of Us Can Picture Distances In Space

There's a Douglas Adams quote about how big space is, something about how it's considerably more than even the distance from here to CVS.

The actual quote is “Space [...] is big. Really big. You just won't believe how vastly hugely mind-bogglingly big it is. I mean, you may think it's a long way down the road to the chemist, but that's just peanuts to space.” I first read it when I was fairly young, so I didn't know "chemist" was Brit for "pharmacist," and was confused. What's really confusing is that their word for "peanut" is the same as ours.

In any case, this one is kind of related to #2. Some things are just hard for our earthbound ape minds to comprehend.
April 17, 2021 at 12:02am
#1008644
Okay, so, look.

When I got my second Covfefe shot on Thursday, the nurse or tech or whateverthefuck, the chick with the needle, advised me to "drink plenty of fluids."

Beer is, in fact, a fluid.

So it's a good thing that this article came up in my random selection rather than, say, something deep about philosophy, science, or philosophy of science. Or even music.



Just to be clear, here, this is an infographic that shades each state in the US based on which state the people in it hate the most.

There's really not much for me to add. It's just that the more you look at it, the funnier it gets. Like, for instance: exactly how many states hate California. Or that my own state is the only one that hates West Virginia, while both West Virginia and Maryland return the favor.

There are some exclusive mutual hatreds going on, but for maximum lulz, just look at New Jersey.

But my absolute, all-time favorite is Florida.

Seriously, just go there and look at the graphic. I'mma go pass out again.
April 16, 2021 at 12:03am
#1008600
Today's article talks about something I've harped on before -- the unreliability of food science studies and reporting.

Here’s Why It’s So Impossible to Get Reliable Diet Advice From the News
What’s good for you seems to change every week. Maybe we should stop blaming the media and look at the studies underneath the stories, too.


And maybe we should stop blaming ourselves and being so damn neurotic about these things.

It is easy, especially as someone who is on the research side of things most of the time, to fault the media for sensational coverage of individual studies that fails to consider the broader context. And certainly there is a healthy dose of that all around us (for example, why write a headline like “Do Tomatoes Cause Heart Attacks?” when the answer is “no”?).

Uh, I know that was a rhetorical question, but the answer is "for clicks." It's best to assume that the answer to any headline in the form of a question is "No."

I would argue the main problem is that the studies that underlie this reporting are themselves subject to significant bias.

I've been saying this for years. If you've been following along, you're probably sick of hearing me go on about it.

Take coffee as an example. On the one hand, coffee drinkers are more educated on average, and they exercise more.

How do you know that? A study? But didn't you just say that studies are biased? And of course coffee drinkers want to think they're smarter, but if they're so smart, how did they get addicted to caffeine? I should commission a study that shows how guys with long hair who wear Birkenstocks and Hawaiian shirts are far, far sexier than average.

I set out to analyze—using a top-notch data set and standard analysis techniques—a simple question: “Which foods make you thin?”

None. None foods make you thin. Lack of food makes you thin. Duh.

In the data, eating quinoa is associated with a whopping 4.2 point reduction in BMI, whereas eating breakfast sandwiches is associated with more than a 2 point BMI increase.

So if you eat a quinoa breakfast sandwich, it balances out.

There are other issues with quinoa, cultural ones. But this isn't a post about American cultural imperialism.

The article goes on to describe kind of a study of studies, with lots of bar graphs and explanation. It's worth at least glancing through if you're interested in this sort of thing.

But if not, here's the bottom line: Educated rich people are healthier than ignorant poor ones. (Educated people who wear Birkenstocks and Hawaiian shirts are also way more attractive.)

Or something like that. Honestly, she loses me about halfway through, but that's probably because I have a bit of a mind fog right now. No, I haven't been drinking; I got my second Trump Mumps shot yesterday and, while I'm generally feeling pretty good, my brain's kinda wonky.

In any case, it's pretty well established that eating vegetables is better for your health than eating candy bars. I don't think any of us need a study to know that, but since candy bars taste way better than any vegetable, it helps to be reminded from time to time.
April 15, 2021 at 12:52am
#1008535
How about a trip out on a limb, today?

The Hard Science of Reincarnation
All over the world, scholars studying reincarnation are making findings even skeptics have difficulty explaining.


Let me preface this by saying: I don't believe in this stuff.

Nevertheless, just like with out-of-body experiences, hauntings, UFOs, cryptids, and other phenomena considered "fringe," something is going on. Some of it is hoax, some false perception, and a lot of attention-seeking behavior is involved; but even discounting those things, there's something there that, if it doesn't tell us anything about the world outside, can shed some light on how the mind works.

The trap is falling into the belief in these things wholeheartedly, finding "evidence" that supports said belief, and discounting that which does not. That's the worst kind of bias, and it's the sort of thing that divorces people from reality. For instance, the phenomenon of sleep paralysis is fairly well documented and partially understood. But it often tracks neatly with descriptions of alien abduction. Having experienced sleep paralysis myself, I can absolutely understand how some people can be convinced they were teleported up to the mothership and experimented upon.

It's also been established that certain frequencies of sound waves, too low to be perceived by our ears, can induce fear and hallucination. Hence, ghost stories.

While I'm absolutely on the side of science when it comes to these things, part of science is admitting that we don't know everything. Another part is trying to fix that. It's an ongoing process, hopefully one that never ends.

But the history of science is littered with the dashed hopes of people who wanted a thing to be true, a thing that turned out to not be the case. And I think "reincarnation" is going to end up being one of those things.

You'll notice I'm not quoting much from the article today. It's an interesting article, if a bit credulous, though it includes some of the skepticism involved. But one of the things it focuses on is my own former university's research into these phenomena, in a department I've known about since I started going there lo these many years ago. This was right around the time Ghostbusters came out, and afterwards, it was probably embarrassing for a major state university to have a Parapsychology Department, so I suspect that's why they ended up changing the name.

It's this quote from a skeptic, from the linked article, that I find most appealing:

“There could only be two possibilities. One is that there is something genuinely paranormal happening, and if that is true, that would be amazing,” he told me. “Or, alternatively—which is more the line that I do favor—it tells us something very interesting about human psychology. So either way, it's worth taking seriously.”

While I object to the characterization of "only two possibilities," I certainly can't think of others right now.

As for reincarnation itself, we know this: on a purely physical level, of course we're recycled. Whatever they do with your body after you kick it, those atoms don't just poof from existence; they get incorporated into the environment, perhaps even into other living things. You've almost certainly eaten carbon atoms that were once in the body of an ancestor of yours; you've definitely breathed air that has spent some time in other living beings. (People who fart in elevators cannot be trusted. Just saying.)

That's so obvious as to be trivial, which is why we don't think about it much.

What reincarnation purports to be, though, is a kind of continuation of consciousness, or at least of the memories that are incorporated into consciousness. And we don't understand consciousness, or even the way memories are formed and stored, so there's no way to say with certainty "this is utter bullshit." And yet, it's probably utter bullshit, or at least something else is going on that we don't fully understand.

Given consciousness's utter inability to conceive of its own nonexistence, the belief persists. So I'm all for the research. Let's just not be too trusting of our own conclusions about it.
April 14, 2021 at 12:02am
#1008482
Just an interesting tale of tales today.

Fairy Tales Could Be Older Than You Ever Imagined  Open in new Window.
Jack may have been climbing that beanstalk for more than 5,000 years


Though I'm not sure how solid the foundation for this finding is, it's about storytelling, so it's actually relevant for once.

In a new study published in the journal Royal Society Open Science, a folklorist and anthropologist say that stories like Rumpelstiltskin and Jack and the Beanstalk are much older than originally thought. Instead of dating from the 1500s, the researchers say that some of these classic stories are 4,000 and 5,000 years old, respectively. This contradicts previous speculation that story collectors like the Brothers Grimm were relaying tales that were only a few hundred years old.

Before writing, we're pretty sure people passed down their stories mouth to ear. There's no reason to believe that wouldn't have continued; not everyone in every society was - or is - literate.

It turns out that it’s pretty hard to figure out how old fairy tales are using simple historical data. Since the tales were passed down orally, they can be almost impossible to unwind using a historian or anthropologist’s traditional toolbox. So the team borrowed from biology, instead, using a technique called phylogenetic analysis. Usually, phylogenetic analysis is used to show how organisms evolved. In this case, researchers used strategies created by evolutionary biologists to trace the roots of 275 fairy tales through complex trees of language, population and culture.

While that's interesting and all that, I'd want to hear how they justify using phylogenetic analysis for something it wasn't designed for.
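As I understand it, the basic move is to treat each language group's repertoire of tale types as the analogue of an organism's traits, then build a tree from pairwise similarity. Here's a toy sketch of that idea (not the study's actual method, and the group names and most of the data are made up; ATU 328 really is Jack and the Beanstalk's type, and ATU 500 is Rumpelstiltskin's), using Jaccard distance on presence/absence sets:

```python
from itertools import combinations

# Hypothetical presence/absence data: language group -> set of tale-type IDs.
# A real analysis would use hundreds of types across dozens of populations.
tales = {
    "GroupA": {"ATU328", "ATU500", "ATU330"},  # ATU 328 = Jack and the Beanstalk
    "GroupB": {"ATU328", "ATU500", "ATU425"},  # ATU 500 = Rumpelstiltskin
    "GroupC": {"ATU425", "ATU300"},
}

def jaccard_distance(a, b):
    """1 - |intersection| / |union|: 0 means identical repertoires."""
    return 1 - len(a & b) / len(a | b)

# Find the pair of groups with the most similar tale repertoires --
# the kind of signal a phylogenetic tree would be built up from.
closest = min(combinations(tales, 2),
              key=lambda pair: jaccard_distance(tales[pair[0]], tales[pair[1]]))
print(closest)  # -> ('GroupA', 'GroupB')
```

The real study also had to control for the fact that neighboring populations borrow stories from each other, which is exactly the part I'd want to see justified.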

Using the Aarne-Thompson-Uther Classification of Folk Tales, a kind of über index...

More like Uther index, amirite?

...that breaks fairy tales down into groups like “the obstinate wife learns to obey” and “partnership between man and ogre,”...

Both groups being what today we'd call "fantasy."

...the team tracked the presence of the tales in 50 Indo-European language-speaking populations.

It might also be illuminating to use this technique (if it's not found to be fatally flawed) on non-PIE-derived languages. There's a rich culture of fairy tales, or their equivalent, in the East.

But not everyone is certain that the study proves fairy tales are that old. As Chris Samoray writes for Science News, other folklorists are finding fault with the study’s insistence that The Smith and the Devil dates back to the Bronze Age—a time before a word for “metalsmith” is thought to have existed.

Clearly, skepticism is warranted (and encouraged), but in that particular example, so what? The thing about oral traditions is that each generation adds its own spin on the tradition. If environmental, cultural, or (in this case) technological changes occur, they'd probably morph the story to make it more accessible to a younger generation, one that lacks historical context. If you don't believe me, go watch one of the approximately two thousand remakes Hollywood puts out in a year.

Point being, maybe that particular story started out as "The Flint-Knapper and the Evil Spirit." No need for "metalsmith" to have been invented; that would come in after metalworking got going.

So, yeah, I wouldn't take anything here as hard evidence of the age of stories... but we're pretty sure storytelling itself is as old as humanity. Maybe it's when our ancestors started telling stories that we could finally call them "human."

But of course a writer would think that.
April 13, 2021 at 12:05am
#1008426
More science.

Fossil Discoveries Challenge Ideas About Earth’s Start  Open in new Window.
A series of fossil finds suggests that life on Earth started earlier than anyone thought, calling into question a widely held theory of the solar system’s beginnings.


There are some compelling hypotheses about how life got its start. There are also not so compelling ones, like "it came from somewhere else in space." That one just kicks the can down the road and doesn't explain how life first arose out of non-life; it just postulates that it didn't begin on this planet. Regardless, the question of "when" might shed some light on "how."

In the arid, sun-soaked northwest corner of Australia, along the Tropic of Capricorn, the oldest face of Earth is exposed to the sky. Drive through the northern outback for a while, south of Port Hedland on the coast, and you will come upon hills softened by time. They are part of a region called the Pilbara Craton, which formed about 3.5 billion years ago, when Earth was in its youth.

That's if you can avoid the drop bears and everything else that's trying to kill you in Australia. Really, it's a wonder anyone can live there at all. Maybe the reputation is exaggerated and promoted by the locals to keep tourons away.

In the past year, separate teams of researchers have dug up, pulverized and laser-blasted pieces of rock that may contain life dating to 3.7, 3.95 and maybe even 4.28 billion years ago.

Those are all a Really Damn Long Time Ago. For reference, the Sun is about 4.6 billion years old, roughly a third the age of the Universe itself. Also for reference, human civilization is maybe 10,000 years old - not even as big as the error bars in whatever they used to date those (possible) fossils.

Taken together, the latest evidence from the ancient Earth and from the moon is painting a picture of a very different Hadean Earth: a stoutly solid, temperate, meteorite-clear and watery world, an Eden from the very beginning.

When you consider that all early life, up until (they think) about half a billion years ago, was aquatic, the water could have provided some protection from asteroidal catastrophes.

The article includes a timeline for early Earth/Moon formation, with graphics that for some reason amuse me.

Anyway, the rest of the story lays out the evidence and discusses some of the debate surrounding it. No need to reproduce most of it here.

“Are there other explanations than life? Yeah, there are,” Bell said. “But this is what I would consider the most secure evidence for some sort of fossil or biogenic structure.”

Thing is, just as it's important to be really, really sure when we find evidence of extraterrestrial life (by which I mean microbes, not Vulcans), it's also important to be skeptical of this sort of finding. Likely there will need to be additional evidence.

Far from fussy and delicate, life may have taken hold in the worst conditions imaginable.

I don't know about "worst." Venus is way worse. (Before anyone starts quoting the bit about finding signs of life in Venus' atmosphere, that phosphine finding has since been seriously disputed.) But worse than we imagined, sure. We already know that simple life can withstand conditions that more complex life-forms cannot, so it's at least believable.

There's also a video worth watching in the article, if you're interested in this sort of thing. With bonus explosion graphics. (You just have to suspend disbelief with all the sound effects about things going boom and thunk and whoosh in the near-vacuum of space.)

The other reason this sort of thing is interesting is the implication for finding (simple) life elsewhere -- something the article doesn't get into until the very end:

If there was no mass sterilization at 3.9 billion years ago, or if a few massive asteroid strikes confined the destruction to a single hemisphere, then Earth’s oldest ancestors may have been here from the haziest days of the planet’s own birth. And that, in turn, makes the notion of life elsewhere in the cosmos seem less implausible. Life might be able to withstand horrendous conditions much more readily than we thought. It might not need much time at all to take hold.

And again, as a caution, I wouldn't make the leap to the kind of sophisticated life it would take to, say, make radio stations or flying saucers. Nothing in evolution requires that result. But it makes ideas like finding microbes or their equivalent on, say, Mars, Europa, or Titan that much more likely.

Which would be cool.
April 12, 2021 at 12:01am
#1008370
Science.

Cloud-Making Aerosol Could Devastate Polar Sea Ice  Open in new Window.
An overlooked but powerful driver of cloud formation could accelerate the loss of polar sea ice.


A word about "aerosol." Well, a few words. This might have shifted because of information about viral transmissibility, but it occurs to me that normal people don't use "aerosol" the same way scientists do. This is probably because a while back, there was the ozone hole crisis, which was popularly reported as being caused by, among other things, aerosol propellants in cans of shit like hair spray. But that's a specific use of the term; in general, it refers to particles suspended in air. Or something like that. I'm not an expert; I just like reading about this stuff.

Now, while studying the atmospheric chemistry that produces clouds, researchers have uncovered an unexpectedly potent natural process that seeds their growth.

So this isn't just another climate change scare article? Don't get me wrong - I'm firmly in the "anthropogenic climate change" camp. I just don't care about it anymore. The time to do something about climate change was 30 years ago. Now it's too late. Enjoy the slide to oblivion. Still, there's some interesting science coming out of its study.

This discovery emerged from studies of aerosols, the tiny particles suspended in air onto which water vapor condenses to form clouds. As described this month in a paper in Science, researchers have identified a powerful overlooked source of cloud-making aerosols in pristine, remote environments: iodine.

But... don't they use silver iodide to seed clouds for rain? How was this overlooked?

Anyway, I'm not going to quote any more of the article; I'd have to leave too much out, so it's better to just read the thing. I'm mostly just leaving this here as a case study in science advancement and reporting.

Because if nothing else, we're at least still learning new things.
April 11, 2021 at 12:01am
#1008209
Today in You're Doing It Wrong: "Thinking."

Your Brain Doesn't Work the Way You Think It Does  Open in new Window.
A conversation with neuroscientist Lisa Feldman Barrett on the counterintuitive ways your mind processes reality—and why understanding that might help you feel a little less anxious.


This is one of those articles that is basically a book promotion, but to reiterate: This is a writing site, and I'm not going to rag on anyone for promoting their book, here or elsewhere. And honestly, this one looks like it might be interesting.

People tend to feel like we’re reacting to what’s actually happening in the world. But what’s really happening is that your brain is drawing on your deep backlog of experience and memory, constructing what it believes to be your reality, cross-referencing it with incoming sense data from your heart, lungs, metabolism, immune system, as well as the surrounding world, and adjusting as needed. In other words, in a process that even Dr. Barrett admits “defies common sense,” you’re almost always acting on the predictions that your brain is making about what’s going to happen next, not reacting to experience as it unfolds.

Or as I like to put it: we think we have free will, but that's an illusion; we don't so much make choices as rationalize our choices after the fact.

“Predictions transform flashes of light into the objects you see. They turn changes in air pressure into recognizable sounds, and traces of chemicals into smells and tastes. Predictions let you read the squiggles on this page and understand them as letters and words and ideas,” Barrett writes. “They’re also the reason why it feels unsatisfying when a sentence is missing its final.”

Oh, she's smart and funny? I think I'm in love.

Anyway, my own brain is (metaphorically) running on fumes right now, so I don't have a lot more to say. The article proceeds in interview Q&A form, and both Qs and As are interesting and insightful. But one quote by Barrett stands out for me:

Your brain is making guesses about what is going to happen next, so it knows how to act next to keep you alive and well. It’s continuously drawing on your past experiences to create your present. The really cool thing about this? It's really hard for people to change their past. However, by changing your present, you are cultivating a different future. By changing what you do and say, and feel, you are seeding your brain to predict differently in the future.

This, of course, has implications for psychology and probably even philosophy. I think I suspected this on some level; it's why I never stop trying to learn new things. Thus:

The actions and the experiences that your brain makes today become your brain's predictions for tomorrow. So making an effort to cultivate new experiences and learn new things today is an investment in who you will be tomorrow. Some people have control over many things in their lives, and some people have less control because of their life circumstances, but everyone can control something.

So it's worth a read; the article isn't all that long. And for once, I'm actually tempted to buy the book. Or, to put it another way, I've been conditioned to accept advertisements disguised as journalism, and when said ads spark my confirmation bias, my brain wants me to buy the book so I can learn more.


© Copyright 2024 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.
