About This Author
Come closer.
Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is a square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, b.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
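
To make that last point concrete, here's a minimal sketch (Python, using its built-in complex type) of one such simple transformation: the iteration z -> z*z + c that generates the Mandelbrot set. Points c whose orbits stay bounded under this map belong to the fractal.

```python
# Minimal Mandelbrot membership test: repeatedly apply z -> z*z + c on the complex plane.
# Points whose orbit stays bounded after many iterations are "inside" the set.

def in_mandelbrot(c: complex, max_iter: int = 100) -> bool:
    z = 0 + 0j
    for _ in range(max_iter):
        z = z * z + c          # the "very simple transformation"
        if abs(z) > 2:         # orbit escaped; the point is outside the set
            return False
    return True

# Crude ASCII rendering of the region -2 <= a <= 1 (real), -1 <= b <= 1 (imaginary)
for b in [y / 10 for y in range(10, -11, -1)]:
    print("".join("*" if in_mandelbrot(complex(a, b)) else " "
                  for a in [x / 30 for x in range(-60, 31)]))
```

Running it prints a rough silhouette of the set; the intricacy lives along its boundary.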




Merit Badge in Quill Award: Congratulations on winning Best Blog in the 2021 edition of [Link To Item #quills]!

Merit Badge in Quill Award: Congratulations on winning the 2019 Quill Award for Best Blog for [Link To Item #1196512]. This award is proudly sponsored by the blogging consortium including [Link To Item #30dbc], [Link To Item #blogcity], [Link To Item #bcof] and [Link To Item #1953629]. For more information, see [Link To Item #quills].

Merit Badge in Quill Award: Congratulations on winning the 2020 Quill Award for Best Blog for [Link To Item #1196512]. This award is sponsored by the blogging consortium including [Link To Item #30dbc], [Link To Item #blogcity], [Link To Item #bcof] and [Link To Item #1953629]. For more information, see [Link To Item #quills].

Merit Badge in Quill Award 2: 2022 Quill Award - Best Blog - [Link To Item #1196512]. Congratulations!

Merit Badge in Quill Award 2: Congratulations! 2022 Quill Award Winner - Best in Genre: Opinion - [Link To Item #1196512]

Merit Badge in Quill Award 2: Congratulations! 2023 Quill Award Winner - Best in Genre: Opinion - [Link To Item #1196512]

Merit Badge in 30DBC Winner: Congratulations on winning the Jan. 2019 [Link To Item #30dbc]!

Merit Badge in 30DBC Winner: Congratulations on taking First Place in the May 2019 edition of the [Link To Item #30DBC]! Thanks for entertaining us all month long!

Merit Badge in 30DBC Winner: Congratulations on winning the September 2019 round of the [Link To Item #30dbc]!

Merit Badge in 30DBC Winner: Congratulations on winning the September 2020 round of the [Link To Item #30dbc]! Fine job!

Merit Badge in 30DBC Winner: Congrats on winning 1st Place in the January 2021 [Link To Item #30dbc]! Well done!

Merit Badge in 30DBC Winner: Congratulations on winning the May 2021 [Link To Item #30DBC]! Well done!

Merit Badge in 30DBC Winner: Congrats on winning the November 2021 [Link To Item #30dbc]! Great job!

Merit Badge in Blogging: Congratulations on winning an honorable mention for Best Blog at the 2018 Quill Awards for [Link To Item #1196512]. This award was sponsored by the blogging consortium including [Link To Item #30dbc], [Link To Item #blogcity], [Link To Item #bcof] and [Link To Item #1953629]. For more details, see [Link To Item #quills].

Merit Badge in Blogging: Congratulations on your Second Place win in the January 2020 Round of the [Link To Item #30dbc]! Blog on!

Merit Badge in Blogging: Congratulations on your second place win in the May 2020 Official Round of the [Link To Item #30dbc]! Blog on!

Merit Badge in Blogging: Congratulations on your second place win in the July 2020 [Link To Item #30dbc]!

Merit Badge in Blogging: Congratulations on your Second Place win in the Official November 2020 round of the [Link To Item #30dbc]!

Merit Badge in Highly Recommended: I highly recommend your blog.

Merit Badge in Opinion: For diving into the prompts for Journalistic Intentions - thanks for joining the fun!

Merit Badge in High Five: For your inventive entries in [Link To Item #2213121]! Thanks for the great read!

Merit Badge in Enlightening: For winning 3rd Place in [Link To Item #2213121]. Congratulations!

Merit Badge in Quarks Bar: For your awesome Klingon Bloodwine recipe from [Link to Book Entry #1016079] that deserves to be on the topmost shelf at Quark's.

Quill Award winner signatures: 2018 (Honorable Mention), 2019, 2020, 2021, 2022, 2023.



July 31, 2022 at 12:01am
#1035901
Of all the science fiction authors I've read—and there have been quite a few—this one has some of the more memorable works.

Philip K Dick: the writer who witnessed the future
Forty years since the death of the sci-fi author – whose stories have inspired films like Blade Runner and Minority Report – Adam Scovell explores how prophetic his work has been.


This is a BBC article from back in March, published approximately on the anniversary of PKD's death.

This is 2022. And 2022 is a Philip K Dick novel.

Well. He was, after all, very good at writing believable dystopias.

Writers of science fiction often feel more prescient than others. Whether it's the threat to women's rights in the work of Margaret Atwood, the architectural and social dystopias of JG Ballard's novels, or the internet-predicting world of E M Forster's The Machine Stops (1909), the genre is replete with prophetic writers dealing with ever more familiar issues.

Look, here's the thing about that: it's not prescience or prophecy. Plenty of things SF authors have written about haven't happened; of those that have, most of them have done so in ways they didn't envision. And as the article points out later, that's also not their purpose. It's just that when someone spends a great deal of time looking at society and technology, and thinking up ways to extrapolate both into the future, they're going to sometimes come pretty close. Also, actual inventors of technology often read SF, so they invent shit that fits what they've read about, consciously or not.

In a remarkably prolific 30-year period of work, Dick authored 44 novels and countless short stories, adaptations of which went on to redefine science fiction on screen – in particular Ridley Scott's Blade Runner (1982), which was based on Dick's story Do Androids Dream of Electric Sheep? and Paul Verhoeven's Total Recall (1990), which took his 1966 short story We Can Remember It for You Wholesale as its source material. More recently Dick's novel The Man in the High Castle (1962) has been turned into a hit Amazon series.

Blade Runner remains my favorite movie of all time (the director's cut, that is). As for High Castle, I read that book long ago and couldn't imagine it becoming a movie or TV show—and yet they slam-dunked it.

Also, the less said about the Total Recall remake, and the Blade Runner sequel, the better. But I think his writing has been adapted even more than that of Alan Moore, which is saying something.

Incidentally, one night I watched Minority Report, Blade Runner, and A Scanner Darkly (a far lesser-known movie using rotoscope technology). I bear the scars to this day of Too Much Dick.

Dick was not simply an effective writer of strange fictions, but an unusual person in his own right. Burdened by deteriorating mental health, visions, and what he alleged were all manner of paranormal experiences – many of which were woven into his expansive oeuvre – Dick had a troubled and fragmented relationship with reality.

I believe this is BBC-speak for "Dick was stoned out of his gourd and went completely barmy."

Celebrated science-fiction and fantasy author Stan Nicholls suggests Dick's work is prescient because it explored the future through the then-present. "His stories posited the ubiquitousness of the internet, virtual reality, facial recognition software, driverless cars and 3D printing," Nicholls tells BBC Culture – while also pointing out that "it's a misconception that prediction is the primary purpose of science fiction; the genre's hit rate is actually not very good in that respect. Like all the best science fiction, his stories weren't really about the future, they were about the here and now."

I'm mostly including the above quote because I Chekov's Phasered it above. I'd add that a lot of SF, including a lot of Dick, is supposed to be a warning, not a blueprint.

Nevertheless, the way he also anticipated particular technological and societal developments remains striking. "He had a lot of scientific images of the way the future would work," says Anthony Peake, author of the biography A Life of Philip K Dick: The Man Who Remembered the Future (2013). "For instance, he had a concept that you would be able to communicate advertising to people directly, that you'd be able to know them so well that you could target the marketing precisely to their anticipations. And this is exactly what is happening online."

I've said before that I despise ads and go out of my way to avoid them. Still, it's impossible to avoid them entirely and still live in the actual world, and the truth is, I'd rather have targeted ads for things like bar supplies, cat food, and t-shirts than get spammed by purveyors of, say, adult diapers, religion, tampons, or hemorrhoid cream. The difference between targeted ads and a Dick world is that in a Dick world, it's ultra-invasive. That's the dystopia element.

Screen adaptations have often latched on to the invasive nature of advertising in his work, yet the writer explored the theme in far more detail than as merely a background aesthetic (which is how it manifests on screen).

I'd venture a guess that this is because shows and movies rely to one extent or another on advertising, and it wouldn't do for ad agencies to allow their antics to be portrayed in too negative a light.

Dick's work often had a political dimension, too. The Man in the High Castle, for example, imagines an alternative history in which the Nazis won World War Two.

You know, that's been the elevator-pitch description of that story for as long as I can remember. The novel, and the TV show based on it, are way more complex than that.

Dick was altogether anti-establishment: his stories feature authorities and companies consistently abusing their power, especially when it comes to surveillance. His worlds are ultra-commodified and their citizens addicted to materialism, while celebrity, media and politics meld to create nightmarish, authoritarian scenarios, usually topped off with a heavy dose of technocracy and bureaucracy.

Nah. Not prophetic at all.

In the 1969 novel Ubik, a character ends up arguing with the door to his apartment, as he doesn't have the change to get in via its coin-operated mechanism.

Unrealistic. Today they'd sell access on a month-to-month recurring subscription plan, like BMW and their goddamned seat heaters (from "Heated Discussion").

Putting aside Dick's ability to foreshadow the future we now take for granted, his most unnerving vision was of the world itself ultimately being a simulation.

Oh for fuck's sake, not this shit again. It's not "unnerving." Even when I was reading his books when I was much younger, that was already a bridge too far for me. But it does illustrate that the "reality is a simulation" nonsense is older than most of us think it is. Hell, it basically has its roots in millennia-old Eastern religion and Gnosticism: the idea that what we think is reality is actually illusion—a worldview that I long ago put in the manure pile, because even if it's true, and there's no certain way to test if it is or is not, what does it matter? Still, I have no inherent objection to exploring that or other speculations in science fiction.

Dick's reality was already a fragile and complex one. In many of his later books, the idea of reality being a façade grew as a dominant theme.

That's because, as noted above, dude was completely stoned.

Anyway, I saved this one to share because it's a decent, if brief, description of a very prolific, albeit relatively short, writing career.
July 30, 2022 at 12:09am
#1035861
On the heels of Turkey DrumStik's last blog entry, "Foods and stoods", it's appropriate (and coincidental) that the random number generator directed me to this article about cooking.

Can’t cook, won’t cook? Here are the tips that saved me from a life of terrible meals
I used to survive on tinned lentils, microwaved eggs and the kindness of more gifted cooks. Then I learned to pull my weight in the kitchen – and if I can do it, so can anyone


Oh, look at Ms. Posh over here actually cooking lentils and eggs instead of ordering takeout like a proper lazy Brit.

(Guardian link so English English.)

(I'm going to refrain from making tired jokes about British food here. You can find them elsewhere, or make up your own.)

For most of my life, I have been a terrible cook. Some people say they are terrible cooks then whip up a perfectly palatable meal for four. I have given myself food poisoning, twice.

Only twice? And only yourself? You're doing better than a lot of people.

My regular diet used to include microwaved scrambled eggs, children’s lunchbox cheeses, tinned lentils mixed with tinned tomatoes, bowls of garden peas, and the “complete food” powder Huel.

You know, I've tried those microwave scrambled eggs. They're not completely terrible, unless you overcook them, at which point they're suitable for fixing tire (or tyre) leaks. Not nearly as good as actual eggs, of course.

Also, what the Huel is Huel?

For once I was actually arsed to look something up. Here you go. I haven't seen it marketed in the US, so I think it's primarily a British thing.

Pro: Their motto is "Don't be a dick," and it's in huge type on one of their walls.

Cons: Huel is a portmanteau of Human and fUEL, and their employees are called Hueligans.

According to a 2014 YouGov survey of 10,000 Britons, one of the largest ever conducted about food, 10% of us cannot cook a thing – equating to 5 million people. A smaller survey in 2018 found that 25% of respondents could only make three dishes (including boiled egg and soldiers, and porridge).

Translations:

Boiled egg and soldiers: Mostly a kid thing, soft-boiled eggs (the kind they make cups for) with strips of toast (the "soldiers").
Porridge: Oatmeal (usually). You knew that from stories.

She goes on to describe how to go from "I'm not a cook" to "I'm a cook":

See if you can identify the source of your belief that you are someone who can’t cook. You might uncover a false assumption – for example, that you don’t deserve to enjoy food, that any time not spent working is wasted, or that cooking is anti-intellectual or even women’s work.

It's work, I'll grant that. Sometimes too much work. In an effort to eat healthier, I've been buying more fresh fruits and vegetables, and, apart from bananas and carrots and maybe apples, they're a massive pain in the ass. Lettuce, for example. Gotta wash it, right? But then you have wet lettuce. So you try to dry it. Lettuce is wrinklier than an 80-year-old swimmer, with more folds in a head than an origami crane, so it's impossible to get all the water off. "So buy a salad spinner, Waltz." No. I don't have the room, and between me and my housemate there are already too many gadgets to clean.

But "women's work?" What is this, the 1920s? Hell, my dad grew up in that decade, and he only called it "women's work" when he wanted to get a rise out of my mom (it worked every time). Also, and I know I've said this before but I can't emphasize this enough, my mother was a lovely person but couldn't cook her way out of a shoe.

"Anti-intellectual?" I can't even.

Start by deciding that you can cook, “then prove it to yourself with small wins”, writes Clear.

Isn't that good advice for any task?

Practising my cooking felt a bit like practising my French with a native speaker who is also fluent in English: insisting on imposing my incompetence on others, at the expense of everyone’s enjoyment.

I've done that, too. But only with his permission, and not for very long.

After four months of living alone, I have learned that I cannot be without Greek yoghurt, kale, cannellini beans, peanut butter, sour cream, chilli flakes, spinach and frozen chapati breads.

Greek yogurt: I eat this stuff from time to time because of its purported health benefits. It's basically liquid chalk.
Kale: I've ranted about kale before. I've eaten it. I'd rather not.
Peanut butter: I despise peanuts. I'm okay with real peanut butter, the kind that's not 60% sugar. I'd rather have almond butter, but I can afford it because I don't eat it that often.
Spinach: Not a pantry item. I'll buy a bag of spinach (pre-washed of course) for salads and omelets sometimes and it wilts within 48 hours, sometimes less. Frozen spinach is foul. Canned spinach isn't food unless you happen to be a one-eyed sailor. Also, spinach recipes crack me up: "5 cups raw spinach (1 tsp cooked)"

Self-help guru and entrepreneur Tim Ferriss surveyed more than 100,000 of his (mostly male) Facebook fans to discover what turned them off cooking and found an array of reasons: too many ingredients or tools, intimidating skills, different dishes finishing at different times, standing at the stove, food waste.

I've learned to be okay with food waste. Shitty of me, I know, but they just don't sell certain things in sizes I can use all of before they go off. My biggest problem with cooking? I almost always cook for myself, and it irks me that it'll take an hour to make something where the recipe specifically says 15 minutes, and then I spend all of 5 minutes eating it. Nothing should take longer to cook than to eat, in my opinion. I'm getting over that, but I still resent it sometimes.

One thing I'm really good at? Having everything finish at about the same time. I can't estimate quantities to save my life (I once measured what I thought was a cup of broccoli but was only a quarter cup, and what I thought was a teaspoon of oil but was actually half a cup), but I'm damn good at timing, so long as I ignore the fantasy times listed on the recipes.

For me, it’s my tendency to get distracted, especially if I am reading a recipe on my phone. I will flick to another browser tab and forget to stir, or put a pan on a high heat then wander off. The result is a dish that tastes like burnt pan bottom.

Yeah... that's definitely a you problem.

Then I got a smart speaker.

Aw HELL no.

Not every recipe is reliable. Cookbooks have at least gone through a process of recipe-testing and copy-editing; a top Google search result can just reflect good SEO. “Frankly, there are a lot of bad recipes out there,” Johansen says.

Yeah, my comment on the Stik's entry above reflects that: "Then there was the time I got something like '1 whole rotisserie chicken, 7 stalks of celery, 3 beefsteak tomatoes, (spices), wrap in a single 6-inch flour tortilla.'"

Either way, it is important to find a source that resonates with you.

You know what hacks me right off about recipes you find online? First they have to write a PhD dissertation on the dish. Scroll, scroll, scroll, scroll, come ON, scroll, scroll, oh THERE's the recipe. It involves five pans, a crockpot, six hours, an oven, a blender, a microwave, and spices that are only available in Thailand? Crap.

If the recipe calls for 45 minutes in the oven, but it looks done to you after 30 minutes, try it. If you really like garlic, double the suggested cloves and see.

Double? More like quadruple. Yeah, I like garlic.

For example: it’s important that aubergines are “very finely” sliced, but you can usually fudge onions – and always save the pasta water.

Aubergines (that's eggplant for us yanks) aren't food.

Incidentally, I've been wondering why Brits use the French words for eggplant and zucchini (aubergine, courgette) but not the French word for spinach. My best guess is that, being Brits, it would take approximately five microseconds for someone to shorten épinards to "nards," and that'll be all she wrote.

Fuck it. I'm calling them nards from now on.

As for onions, well, we pretty much agree with the French there: oignon.

Even the simplest meals can be elevated above a survival mindset: in Solo, Johansen dedicates a chapter to things on toast.

I suppose it's too much to hope for that the chapter be titled "Shite on a Shingle."

Part of scaling up in the kitchen, Johansen suggests, is finding “like-minded” people to cook for, who are less concerned about what’s on the menu than enjoying each other’s company.

Yeah, no. With all the different dietary preferences and rampant food allergies (and "allergies") out there, I'mma just keep cooking for myself, and sometimes my housemate.
July 29, 2022 at 12:19am
#1035823
Okay, look. I drank yesterday. Quite a bit. Then I passed out. Then I woke up, hung over and still drunk. But here I am, posting (nearly) at my usual time. Screw you, everyone who said I was irresponsible and can't keep a schedule or commitments.

However, when I selected the article at random from my queue, I couldn't for the life of me remember why I saved it there to begin with. Or reading it in the first place. So... whee! Let's figure this out together.



Why would I decide that, one day, I would like to blog about this? It is, by the way, clearly marked as a book excerpt, so again, we have someone trying to sell their book. As I've noted repeatedly in here, this is a writing site, so I'm okay with that.

I’m driving the three-minute scoot to the supermarket to pick up a few boxes of very safe, instantly dissolving toddler cookies called something like Nom-Noms, which is really what all cookies should be called (and, while we’re at it, all food).

The only thing I hate worse than self-absorbed blather is self-absorbed blather from mommies. So I should have given this article a complete miss. But I have to admit it: this first sentence reeled me in. Take note.

While in the car, I’m listening to the writer Elizabeth Gilbert on Oprah’s Super Soul podcast.

And then you lost me again. It's a three-minute drive, but you need to distract yourself with a podcast? (Full disclosure: I have never, not once in my life, listened to a "podcast.") Also, Oprah? Now I know that, despite your remarkable opening line, we are complete opposites and have absolutely nothing in common.

And finally? If it only takes three minutes to drive somewhere, why not walk?

Gilbert is the author of Eat, Pray, Love, the 2006 best seller about her soul-awakening travels to India, Italy, and Bali; it’s a book I love and have read an embarrassing number of times.

Now I'm starting to have a vague recollection of why I saved this article lo these many weeks ago: I hate nearly every single thing about it.

Well, good. That suits my mood right now. Damn, I type loud.

Nom-Noms are these magical little biscuits that are probably about 99 percent air. The rest is a mysterious blend of, I think, sweet-potato juice and Styrofoam.

My mother wouldn't have indulged me like that. Spoiled brat kid.

Every cookie is reliably about five inches long and shaped like a mini-surfboard with very minor irregularities around the edge. (I’m sure they could be baked to be perfectly smooth, but I think they’re going for some kind of wabi-sabi “hand-hewn” aesthetic, which I appreciate in theory, but it also feels like an unnecessary effort given the audience?)

You... you don't understand marketing, do you? The audience isn't your crotchfruit. The audience is you, and it's working exactly as planned.

The day stretching out before me as I drove the three minutes from my house to the supermarket was itself a bit like a Nom-Nom; it would be the same as all the other days I’ve been living since my son was born, since we moved to Los Angeles, and since I’ve been working part time.

Ah. Got it. L.A. That's why she can't walk. There's probably a parking lot, er, freeway involved, which makes me doubt the "three minutes" thing.

As I finally threw the Nom-Noms into my shopping cart, Gilbert was talking about the archetypal “hero’s journey” and how throughout the history of literature, the hero’s journey has been represented as, specifically, a man’s journey to a faraway place.

Hey folks, it's sexism time!

I hadn’t heard of the book Gilbert referenced at length, Joseph Campbell’s The Hero With a Thousand Faces, in which he distills the 17 universally traversed steps of this tale as it’s been told forever by cultures around the world. I realize as a writer I probably should have read (or at least known about?) Campbell’s book, but there are so many episodes of The Bachelor to watch that I’m not sure where I would have ever found the time.

Yes. Yes, you should have. Shallow excuses won't cut it.

Even after I loaded my recyclable bags (good person) into the back of my car and began the drive home, that concept and those words, “hero’s journey,” kept echoing inside me.

Hey, "good person." Do a little research and you'll find out that those "recyclable bags" are worse for the environment than plastic bags (though they don't generally introduce quite as many microplastics, which is a separate issue). I know you live in California and plastic bags are no longer an option there, but you're actually best off with the old-fashioned brown paper bags. Environmentally-speaking.

Gods, I hate virtue signaling, even when it's self-aware.

Campbell’s conception of the journey begins with a potential hero who is just going about his life as normal — you know, texting and taking antidepressants or whatever.

And it's this sort of thing that, despite my polar opposition to everything that this woman stands for, I kept reading for. I mean, not only is she talking about one of the greatest archetypes in storytelling (albeit from a newbie perspective), but it's this sort of turn of phrase that makes me appreciate the writing.

In this moment of silent anticipation, for the first time since my son was born — having spent each day since feeling invisible to the mainstream world, over the hill, like a Swiffer on legs, wiping his nose with my hand and not having sex and generally functioning as a kind of automated milk-and-comfort-dispensing machine — I began to entertain a thought …

You know, it's your choice to pop out a sprog. You probably spend a lot of time justifying your decision to do so: "Oh, it's SO worth it!" But these little complaints are far more revealing than all the "parenthood is wonderful" anecdotes in the world.

Is it possible I’ve been on a hero’s journey this whole time? Is it possible I am on one right now?

No.

The only qualification for shitting out a sextrophy is the proper biology.

What shook me about Campbell’s words is how perfectly they describe motherhood.

That's a stretch.

Pun intended.

And of course, these are the exact same moments in which there is no more “superhuman deed” than steadfastly caring for and feeding your child and not giving in to the temptation to flee the entire situation.

Mantra: I chose this life. I chose this life. I chose this life.

And you cry because this is why you chose his name: Asher, Hebrew for “happy,” the emotion you’ve struggled so hard to feel your whole life.

The meaning is closer to "blessed." I'll give this author the benefit of the doubt; based on her last name, she very well might be Jewish and gets to give her brat an unvarnished Hebrew name without being a cultural appropriator. Also, I'm pretty sure the vast majority of names common in English are Hebrew in origin, or at least Aramaic. But the implication here is that happiness depends on reproducing, which, while admittedly a very Jewish idea, just doesn't track today.

So I have been thinking and thinking about this. Is it really possible that my trip to buy Nom-Noms is part of a meaningful narrative, a hero’s journey?

No, you self-important idiot. It's a goddamned trip to the nethers-licking grocery store. I'd bet you could order a case of those nasty things and have them delivered right to your door. Even in a backwoods hellhole like Los Angeles.

To illustrate, I invite you to investigate your gut reaction to the term “mommy blog.”

Hurk.

I guarantee you if Ernest Hemingway were alive and writing an online column about his experience of being a father, no one would call it a “daddy blog.” We’d call it For Whom the Bell Fucking Tolls.

I guarantee you Ernest Hemingway wouldn't make the main focus of his blog his experience of being a father.

The truth is that motherhood is a hero’s journey.

And yet, every other species on Earth does it without turning it into an epic saga.

Every mother you know is in this fight with herself. The sword that hangs over her is a sword of exhaustion, of frustration, of patience run dry, a sword of indignation at how little she feels like a human when she so often has to look and behave like an animal. Mostly, it is the sword of rage: the rage and shock of how completely she must annihilate herself to keep her child alive.

Ah. There it is. The confession that made me hit "save" on this article.

Okay, look, like I said, I'm writing this with a pounding headache, so I'm probably just a bit less chill than I usually am. And I'm not trying to deny how important motherhood is. Hell, my mom did all that without even the dubious benefit of having popped me out herself. I genuinely like this lady's writing, regardless of its content. So here I am, sharing it.
July 28, 2022 at 12:03am
#1035779
They say money doesn't buy happiness. I disagree. It buys beer, and beer is happiness.

How Much Money Is 'Enough'? This Simple Thought Experiment Gives You an Exact Number to Aim For
Constantly chasing more and more will make you miserable. The right goal gets you off the treadmill.


This financial article has been lazing around in my queue, not even earning interest, for a long time. It's a couple of years old, but always relevant (at least until our inevitable doom).

Have you ever read those articles where some extremely well-off family details their budget and then bemoans that they're barely getting by?

Yes, and I've also read ones that talk about how a young couple managed to buy (gasp) a house in this economy. Always, without fail, their technique involves having rich parents who help out.

It's ridiculous that anyone could complain about raking in $350,000 a year, and it's clear many of these folks are wildly out of touch with how privileged they are.

Or it would be ridiculous, except that it seems like most people think "rich" means "making lots of money." It does not. It means "having money." It does no good to be making $350,000 a year if you spend all of it (and definitely if you spend more than that). Someone making $50K and saving $10K of that is richer; they're just not going to have nearly as many toys.

It's not just the wealthy who fall into the trap of earning more only to spend more and feel just as dissatisfied.

How do you get off this treadmill?


Most people also think of a treadmill as a means for getting fit and/or controlling one's weight. It was originally designed as a human-powered source of work, and later repurposed for punishment.

Research shows that up to a certain threshold (studies consistently put it at about $75,000 dollars a year, give or take a bit depending on cost of living) money has a big impact on both day-to-day happiness and life satisfaction.

I've never been entirely convinced by those studies. Not that I think they were poorly designed, but a) that's an average, so some would settle for less while others need more to hit that threshold; and b) again, it's not what you make, but what you save.

But perhaps the best way to get a feeling for your goal number isn't math but a simple thought experiment from writer Brad Stollery:

Before I paste that thought experiment here, I'll add something else. The article points out that accumulating about 25 times your expected annual expenses is the path to financial independence. I'm not going to argue that; it tracks with the common advice to live in retirement off of 4% of your liquid net worth. Fine. I will say, though, that it's important to decide, when you reach that savings goal (if you ever do), whether enough is actually enough. You find yourself in retirement, especially a theoretical early retirement, and suddenly you have all the time to do all that traveling you never got to do because vacations were discouraged at work—but traveling ain't cheap.

I won't paste the entire thought experiment, just the—pun intended—money quote:

How much money would you have to be paid, right here and now, to retire today and never receive another dollar of income (from any source) for the rest of your life?

The catch this time is that whoever among the five players writes the lowest amount on the check will be paid that sum. The other four players will get nothing.


Now, I'm well aware that some people actually enjoy working for a living. I do not. It gets in the way of my video gaming. If you're one of those people, this article is not for you—but I still think it's important to have money stashed away in the event of disability or extended job loss.

This thought experiment forces you to cut away the natural impulse to aim ever upward (if you do that you'll bid too high and get nothing). The result: however much you ask for is your number, the amount you'd need to live comfortably and pursue your goals if status and lifestyle inflation weren't a factor.

Your expenses and desires can be infinite. If you don't want to chase them miserably forever, you need to put a cap on your financial ambitions yourself.

I dunno; do the famous billionaires (you know who they are) seem stressed or miserable to you?

On that note, there's a cultural bias surrounding the nice round sum of one million dollars (obviously, this has an American slant). "If I had a million dollars" isn't just the name of a catchy song by Barenaked Ladies (who are from Canada but the point remains); it's the number a lot of people have ingrained in their heads as the cutoff between "rich" and "well-off."

But even before this year's higher inflation, a million bucks wasn't what it used to be. Using the 4% rule I talked about earlier, that translates to an income of $40,000 a year—roughly what a $20 an hour job pays.

On the one hand, if you reverse that math, if you're making $20 an hour, then your effective net worth is $1M. On the other hand, $40K annually doesn't do much these days. It's just slightly more than enough to cover the basics in most places in the US, and actually less than you'd need to live in one of the big coastal cities.
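
For anyone who wants to check that arithmetic, here's a minimal sketch in Python (the figures are the illustrative ones from above, not financial advice):

```python
# The "enough" math from this entry: a target of ~25x annual expenses,
# and the 4% rule run in both directions. Figures are illustrative only.

def fi_target(annual_expenses: float, multiple: float = 25.0) -> float:
    """Savings needed to support given expenses at a ~4% withdrawal rate."""
    return annual_expenses * multiple

def annual_income_from(nest_egg: float, withdrawal_rate: float = 0.04) -> float:
    """Yearly income a nest egg supports under the 4% rule."""
    return nest_egg * withdrawal_rate

print(fi_target(40_000))              # 1,000,000: a $40K/yr lifestyle needs about $1M
print(annual_income_from(1_000_000))  # 40,000: $1M throws off about $40K/yr
print(20 * 40 * 52)                   # 41,600: roughly what a $20/hr full-time job pays
```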

There's another aspect that the article doesn't cover, and that's taxes. Since state taxes vary (and don't exist in some states), you'd have to do your own research there. (This does NOT mean looking at YouTube videos until you find one that tickles your confirmation bias.) But the most effective place to keep a million bucks is in the stock market. Sure, it has its ups and downs, but over a long enough time frame, it usually beats inflation. And here's the thing about stocks: realizing long-term capital gains usually results in a lower tax burden than working for a living. I mention this because it's possible to have a lower overall tax burden if you're living off of saved money, which means you might not need as much as you think you do. You'd have to look into that yourself, though.

Lots of people seem to have a problem with this, but remember, the money you put into securities (apart from certain retirement accounts) has already been taxed.

Okay, I'll shut up now. Mostly I think I just saved the article because I thought it was a different way than usual to look at things. And of course I had my own commentary to add.

Also, since I mentioned a certain song, and YouTube:



Haven't you always wanted a monkey?
July 27, 2022 at 12:02am
#1035740
Today's article, sent to me a while back by Turkey DrumStik, is a long read, and despite the partisan title, it has far more general implications than just one political party's shenanigans. Consequently, any political bile spewed at me for this will be summarily ignored.

The Losing Democrats Who Gobbled Up Money
Amy McGrath and other Senate candidates deceived donors to rake in far more cash than their Republican opponents. They got crushed anyway.


There has, historically anyway, been a correlation between money raised and election success. The problem, as is the case with most issues involving correlation, is that it doesn't imply causation. In other words, were politicians who raised more money successful because they raised more money? Or was it because the tactics needed to raise money also translate to garnering votes? Or, another hypothesis which is mine, are the people who raise more money able to do so because they already have a wide base?

As this article points out, spending more money on your campaign doesn't always affect the only poll that matters.

The long-term danger is that small donors, barraged with overheated email pleas that range in veracity from half-truths to outright bullshit, will eventually catch on.

Nah. To do that, they'd have to read this article, and it's about 20 times longer than the average voter's attention span. It's also about twice as long as mine, so I'm not going to bore you by repeating a lot of quotes here. Just the stuff I find most interesting or amusing.

Josh Nelson, a Democratic digital strategist, is among the operatives, most of them from the Democrats’ progressive wing, trying to get the party to abide by more ethical standards. “I just don’t think you can view people on your list as ATM machines, or like they’re idiots,” he said. “It might work for a while, but it’s not sustainable.”

Protip: Do not take any advice from someone who calls it an Automated Teller Machine machine.

In the long lead-up to elections, there are only two metrics for the public to follow. One is polling. The other is the money. Campaigns boast about their one-day hauls and quarterly totals, and the political press covers it as if it’s a sport. And there is useful information in the fundraising numbers. Who’s giving and how much. But the biggest truth gets buried: Money is the most overrated factor in politics.

Full disclosure here: I have never contributed money or effort to a political campaign. You know those boxes on your tax form where you can donate to the Presidential election fund (for y'all foreigners, that's really a thing, but it doesn't change your tax owed)? I've never checked those. I consider it aiding and abetting a politician in the commission of an election.

Yes, I vote. Yes, I'm pretty one-sided in my voting. I just thought the process sucked even before I read this article.

Because who does that fund-raising really help? Not the politicians (at least if they don't skim off the bottom). It's kind of like hiring someone to do SEO for your business. Once everyone does it, search rankings return to what they would have been without optimization. No, the donations contribute to ad agencies, media outlets, consultants, etc. And certainly it doesn't help the donors, because they'll never get that money back.

On the other hand, I don't have any good ideas for what they call campaign finance reform. So I just stay out of the process as much as possible.

Even in the digital age, local broadcast TV still accounts for the biggest share of campaign advertising, as high as 60 percent.

I guess some people assume that I don't watch broadcast or cable TV because I'm some sort of hipster. Nothing could be further from the truth (okay, maybe when it comes to beer I'm a hipster). It's because of ads. 99% of all ads suck, and if I have some burning desire to watch one of the other 1% because people will not fucking shut up about it, it'll be on YouTube. I despise ads with the fiery passion of ten thousand suns; the only thing I hate worse than ads would be having to pay to watch them, and that's the reason I never had cable.

But of all the drooling, mouth-breathing ads out there, by far the absolute worst are political campaign ads. Granted, because of my aversion, I haven't seen a whole lot of them, but those that I have been somehow subjected to (there are often TVs in bars and at friends' houses), they lie even worse than product commercials and, worse, the ones I've been subjected to would rather tear down the other candidate than prop up the one they're supposed to promote. "Vote for Smith! Because Jones has been accused of eating babies and kicking puppies. Do you want a baby-eating puppy kicker representing YOU?"

When campaigns are so flush, they do not have to spend with much discipline. To give one example, McGrath’s advisers were skeptical about the impact of direct mail—figuring that in lots of households campaign literature went straight into the trash.

And that's another example of being treated like subhuman scum by the people running these ad campaigns. I don't throw campaign literature or church fliers in the trash. Hell no! They go into recycling, in the hopes that they'll be repurposed into something actually useful to society. Like maybe joint rolling papers or beer bottle labels.

Anyway, the article's there if you care. I actually managed to read the whole thing. It didn't send me into paroxysms of outrage, not like the ads themselves do, but it did convince me that Something Needs To Change.

As to what that Something is, well... shrug.
July 26, 2022 at 12:02am
#1035699
Oh, those crazy mad scientists...

What the heck is a time crystal, and why are physicists obsessed with them?
Some of today’s quantum physicists are tinkering with an esoteric phase of matter that seems to disobey some of our laws of physics.


You'd think scientists would learn from science fiction: don't mess with time crystals.

(No, this isn't a spoiler for the latest Star Trek series.)

You’re probably quite familiar with the basic states of matter—solid, liquid, gas—that fill everyday life on Earth.

And hopefully you're not intimately familiar with a fourth, plasma (not to be confused with blood plasma, which it might have been named after), which is a gas in a high enough energy state to be stripped of its electrons. Such as lightning.

But those three different sorts of matter that each look and act differently aren’t the whole of the universe—far from it. Scientists have discovered (or created) dozens of more exotic states of matter, often bearing mystical and fanciful names: superfluids, Bose-Einstein condensates, and neutron-degenerate matter, to name a few.

I always loved the phrase "neutron-degenerate matter." I always picture a bunch of drunk neutrons sitting around smoking and watching porn. But you know they're not actually doing anything illegal—otherwise they'd be charged.

I'll be here all week. Be sure to tip your server.

In the last few years, physicists around the world have been constructing another state of matter: a “time crystal.” If that seems like B-movie technobabble, it’s technobabble no longer.

Yeah, except in the movies, it's usually something that lets you see or travel through time. Don't be fooled by the name; this is nothing like that. Still cool, but not that universe-breaking.

In practice, that works something like this. You create a crystal whose atoms start in one state. If you blast that crystal with a finely tuned laser, those atoms might flip into another state—and then flip back—and then flip again—and so forth, all without actually absorbing any energy from the laser.

If you step back, what you’ve just created is a state of matter that’s perpetually in motion, indefinitely, without taking in any energy.


Okay, but you're still introducing energy in the form of the laser. That's not perpetual motion which, yes, is still fantasy.

That’s no small feat. It beats against one of classical physics’ most sacred tenets: the second law of thermodynamics. That law states that the amount of entropy, or disorder, always tends to increase.

Errgh. Wrong. What the SLoT says is that entropy (which isn't actually disorder) in a closed system always tends to increase or remain constant. I don't pretend to understand all the physics involved, but no, it doesn't shatter the Second Law.

The closed system thing is important, too. Idiots have used the SLoT to "prove" that evolution couldn't be possible because it would violate that principle, forgetting or ignoring that Earth's biosphere isn't a closed system; we get copious energy from the sun. This article is light on details (light? because laser?) but the crystal itself isn't a closed system either.
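
For reference, the textbook form of the statement I'm paraphrasing is (roughly):

$$\Delta S \geq 0 \quad \text{for an isolated (closed) system}$$

with equality only for reversible processes. A crystal being driven by a laser is exchanging energy with its surroundings, so it isn't isolated, and the inequality is never in danger.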

But these latest time-crystal-tinkerers did something different. They turned to Google and used a quantum computer: a device that takes advantage of the quirks of quantum mechanics, the seemingly mystical sort of physics that guides the universe at the tiniest scales.

Oh, it's Google? Then they'll probably yank it out from under us like they did Reader, Finance, and a bunch of other cool things.

Also, stop with the "seemingly mystical" garbage. Everything seems mystical until it's understood—though to be fair, there may be three people in the world who actually understand quantum physics, and I'm certainly not one of them.

So, could these time crystals indeed lead to a new wave of nascent time machines?

No. Look, I'm as big a fan of science fiction as anyone, but come on, stop with the bait. There is no reason to try to inject fantasy into what's already pretty cool. This is known as the "truth is stranger than fiction" principle.

But they might help make quantum computers become more robust. Engineers have struggled for years to create something that could serve as memory in quantum computers; some equivalent to the silicon that underpins traditional computers. Time crystals, physicists think, could serve that purpose.

And finally, an actual (if currently hypothetical) application. Not that physics needs an application, but it's good to know the real reason for the hype, which is not so you can go back in time and kill your grandfather before your mother was born, thus creating a world-breaking paradox.

To sum up, science is cool. Science fiction is cool. One should always heed the warnings of science fiction when doing science. But when you're writing about science, don't hype what ain't there.

If your brain won't explode, you can read more detailed information about time crystals here. As that is Wikipedia, don't, like, use it for a dissertation or anything, but it's probably a decent overview.
July 25, 2022 at 12:01am
#1035662
It is fitting that I'm just now getting around to sharing today's article, which is from 2014.

Why We Procrastinate
We think of our future selves as strangers.


Can I just say how refreshing it is to have the answer to the implied headline question right there in the subhead? Neat, short, concise even.

Of course, the article delves into much more detail.

The British philosopher Derek Parfit espoused a severely reductionist view of personal identity in his seminal book, Reasons and Persons: It does not exist, at least not in the way we usually consider it. We humans, Parfit argued, are not a consistent identity moving through time, but a chain of successive selves, each tangentially linked to, and yet distinct from, the previous and subsequent ones.

You know I find philosophy intriguing. But this? This sounds like a modern-day version of one of Zeno's Paradoxes.

Parfit’s view was controversial even among philosophers.

I'm sure it was. Not that it's definitely wrong, mind you; it's just that I can't think of a way to test it. Therefore, it stays in the realm of speculation—just another way of looking at things.

But psychologists are beginning to understand that it may accurately describe our attitudes towards our own decision-making: It turns out that we see our future selves as strangers.

Your future self is entirely theoretical. Your past self is not (except in the view of fringe philosophers). There's a big difference there.

That bright, shiny New Year’s resolution? If you feel perfectly justified in breaking it, it may be because it feels like it was a promise someone else made.

That's a non sequitur if I've ever seen one. In that case, you're breaking a promise your past self (aka you) made. Now, whether your past self has any business dictating stuff to your future self, well, that's another issue entirely.

Using fMRI, Hershfield and colleagues studied brain activity changes when people imagine their future and consider their present. They homed in on two areas of the brain called the medial prefrontal cortex and the rostral anterior cingulate cortex, which are more active when a subject thinks about himself than when he thinks of someone else. They found these same areas were more strongly activated when subjects thought of themselves today, than of themselves in the future. Their future self “felt” like somebody else. In fact, their neural activity when they described themselves in a decade was similar to that when they described Matt Damon or Natalie Portman. And subjects whose brain activity changed the most when they spoke about their future selves were the least likely to favor large long-term financial gains over small immediate ones.

I'd definitely keep a promise I made to Natalie Portman. Just saying.

The disconnect between our present and time-shifted selves has real implications for how we make decisions. We might choose to procrastinate, and let some other version of our self deal with problems or chores.

Well, absolutely. I don't want to deal with them. Let someone else do it. Sure, that someone else is future-me, but hey, I might get lucky and die before I have to deal with them.

Anne Wilson, a psychologist at Wilfrid Laurier University in Canada, has manipulated people’s perception of time by presenting participants with timelines scaled to make an upcoming event, such as a paper due date, seem either very close or far off. “Using a longer timeline makes people feel more connected to their future selves,” says Wilson. That, in turn, spurred students to finish their assignment earlier, saving their end-of-semester self the stress of banging it out at the last minute.

Wait, what? Some people don't get stuff done at the last minute?

Of course, the way we treat our future self is not necessarily negative: Since we think of our future self as someone else, our own decision making reflects how we treat other people.

"Other people?" You mean the game's NPCs?

So anyway, yeah, short article, interesting implications (if true). Eventually you might even get around to reading it. Not sure it has much to do directly with the Parfit philosophy from the first quote up there, but I suppose it fits that model.

Meanwhile, I'll just give myself a pat on the back for finishing writing this at 11:30 instead of 11:59.
July 24, 2022 at 12:01am
#1035616
For a dedicated indoorsman, I sure post a lot of articles about the (shudder) outdoors.

11 Common Survival Mistakes That Can Get You Killed
These simple mistakes can lead to catastrophic scenarios. Here’s how to avoid them


1. Going outside
2. Going outside
3. Going outside
...
11. Going outside.

But, you know, staying at home has its hazards too. Falling down stairs. Slipping in the bathtub. Owning a weasel. That sort of thing. So let's see what the actual article has to say.

We all make mistakes. It’s only human. But what happens when our mistakes are combined with treacherous conditions in the outdoors? What happens when our blunders combine into a series of unfortunate events?

Evolution in action.

1. Going Alone

Well, shit. As an introvert, I do this all the time.

After my divorce lo these many years ago, I got it in my head that I was going to travel across the continent, from the easternmost point to the westernmost point of the contiguous US. You can read all about it in my offsite blog (link in the sidebar on the left), which I haven't used lately but some of the earliest entries were about it. So I'm not going to rehash the whole thing here—just the endpoint. This involved parking at a ranger station in western Washington and hiking three miles through climax rainforest to the Pacific.

At that time, hiking three miles was easy for me. I made it to the ocean, everything was fine. No one was around, though campsites indicated that it was a fairly popular spot. But this was November, when the weather can be a bit iffy in the PNW. Still, that day, the temperature was perfect for hiking; the sun was shining just a few degrees above the western horizon...

...and that's when I realized I still had to hike three miles back and it was about to get very, very dark.

Long story short, I made it, but I got back to the car when it was full dark out. Obviously, I wasn't eaten by a bear, but that's more luck than planning. My whole body shook from exertion, and I could not get warm enough, even with the seat heater on and the vents blasting on full. Insult to injury, I had to spend that night in a motel in Forks. This was right after Twilight came out, so you couldn't walk down the street without tripping over a cardboard cutout of a lame character from the movie.

Point is, yeah, I made this mistake, and got lucky.

2. Getting Lost

Again, not something that you have to worry about if you stick to buildings and roads (and poorly maintained rainforest trails), and have some basic directional sense. Having a GPS doesn't hurt, either, but one shouldn't be over-reliant on technology.

Prevention: The best way to prevent getting lost is to get more familiar with land navigation. It’s not enough to bring a map and compass with you into the wild. You’ll need to know how to shoot an azimuth and back azimuth, account for declination, and determine distances (among many other skills). You’ll also need to know that steel or iron objects can pull your compass needle away from magnetic north, so keep your rifle barrel away from your compass as you take bearings.

Prevention: Don't go outside.

3. (Not-so) Edible Plants and Mushrooms

I had a whole entry about poisonous shrooms exactly one month ago: "Everyone Calls Me Mushroom Because I'm Such A Fun Guy". Short version: Don't go outside.

4. Failure To Light

It’s ironic but true. The times when we need a fire the most are the times when fire can be the hardest to produce. Cold, windy, rainy conditions are prime time for hypothermia (the dangerous lowering of your body’s core temperature), and these are the toughest conditions for building a fire.


No, it's not "ironic." This, too, can be solved by utilizing our thousands of years of technological innovation and sticking to gas or electric heating inside a comfortable building. I mean, shit, our ancestors built for us a splendid mansion, and y'all wanna sleep in a mud hut.

Skipping a few here. If you refuse to heed my advice and want to go into the not-indoors, you can look at the link.

10. Ignoring Your Instincts

You mean like my instinct to find the nearest bar or craft beer dispensary and settle in there?

11. Underestimating The Risks

There are risks involved in enjoying the great outdoors. Every year, a handful of people run into real trouble in the wilderness – often because they didn’t understand the risks they were truly facing.


So, like I said, there are risks involved with staying home, too. Sure, I continue to make jokes about not venturing into the not-so-great outdoors, but—as my anecdote above indicates—I do like to go out from time to time. Perhaps I make the opposite mistake: overestimating the risks. But to me, that's preferable. A life of complete safety is no life at all (and also unattainable), but I'd rather take my chances with shitty motels than deal with tents, campfires, and nosy elephants (or whatever animals are around; I don't know).

It's not about not taking risks. It's about risk management, balanced with whatever enjoyment you might get out of the adventure. I will never again go camping, for several reasons, but if I did, I'd go with someone who knows what the hell they're doing.

Because that person is not me.
July 23, 2022 at 12:01am
#1035578
A little bit more serious than usual today. Just the way the dice fall.

Bussed out
How America moves its homeless


The article is from way back in the Before Time, December of 2017. Something might have changed since then, what with travel restrictions for the pandemic. But it's still a good read.

Quinn Raber arrived at a San Francisco bus station lugging a canvas bag containing all of his belongings: jeans, socks, underwear, pajamas.

Pajamas? Luxury!

Cities have been offering homeless people free bus tickets to relocate elsewhere for at least three decades. In recent years, homeless relocation programs have become more common, sprouting up in new cities across the country and costing the public millions of dollars.

If only there were another way to use those millions to help the homeless. Gosh. I don't know. The answer eludes me.

But until now there has never been a systematic, nationwide assessment of the consequences. Where are these people being moved to? What impact are these programs having on the cities that send and the cities that receive them? And what happens to these homeless people after they reach their destination?

My guesses:

1) Anywhere else;
2) They're probably swapping so it's a wash;
3) They're still homeless.

In an 18-month investigation, the Guardian has conducted the first detailed analysis of America’s homeless relocation programs, compiling a database of around 34,240 journeys and analyzing their effect on cities and people.

That's a lot of work. Glad I didn't do it.

Some of these journeys provide a route out of homelessness, and many recipients of free tickets said they are grateful for the opportunity for a fresh start.

Well. Again, it pays to be pessimistic. That's pleasantly surprising.

While the stated goal of San Francisco’s Homeward Bound and similar programs is helping people, the schemes also serve the interests of cities, which view free bus tickets as a cheap and effective way of cutting their homeless populations.

I haven't been to SF since the Before Time, but spoiler: it's not working (sort of; you'll see later in the article). Also the font from the linked story is pretty small on my screen, so at first I thought that word was "culling."

Jeff Weinberger, co-founder of the Florida Homelessness Action Coalition, a not-for-profit that operates in a state with four bus programs, said the schemes are a “smoke-and-mirrors ruse tantamount to shifting around the deck chairs on the Titanic rather than reducing homelessness”.

You know, the "rearranging deck chairs on the Titanic" phrase is clichéd, but dammit, it's descriptive.

I'm skipping a bunch here. The good news is there's more nuance than I expected. The bad news is it's still not a comprehensive solution.

The interviews the Guardian has conducted with recipients of bus tickets indicate the outcome of their journeys can vary hugely.

Yes, better than I expected, but it sounds like some of the outcomes were even worse:

Last year Fort Lauderdale sent Fran Luciano, 49, back to her native New York to stay with her ex-husband, according to program records.

Lots of people get along with their exes. I do. But living with one? Fuck that.

But the money spent on bus tickets does not necessarily address the root causes of homelessness. “There may be cases where you have good intentions of trying to return that person back to that family”, but the family is “why they were homeless in the first place”, said Bob Erlenbusch, a longtime advocate based in Sacramento, California.

There's still this pervasive mythology about family here in the US (and elsewhere). The myth goes that family is better for you than friends, acquaintances or strangers. Obviously, this is not always the case.

The article ends, as is appropriate, by revisiting the Raber story it started with. Basically, he ended up going back to SF and:

Today his circumstances are almost exactly the same as they were before he left. He spends part of the month crashing in a friend’s room in a rundown residential hotel, and the rest bedding down on the sidewalk with a blanket and pillow in the gritty Tenderloin neighborhood, where many other homeless people congregate. He drags his belongings around in a suitcase with broken wheels, and hopes to move into a discarded tent that he found recently.

Discarded, I'm betting, by someone else who got bussed out of San Francisco.

Now, I've noticed that whenever some bleeding-heart suggests actually housing the homeless, the response from some asshole on the other side is always "start with your house." Let me nip that one right in the bud, if any of that type are still reading this: I own two houses. One of them is occupied by a husband and wife who are both disabled and would otherwise have no place to stay. So I do that. I say that not to virtue-signal, which I hate doing, but only to keep that comment from happening here.

I can only help a limited number of people myself. That's why we need to acknowledge that we live in a society, and it's everyone's responsibility.

After all, you may think it won't happen to you. But it could.
July 22, 2022 at 12:03am
#1035539
I'm linking this article, from Cracked, because it illustrates a disturbing trend that needs to die.



No, the "disturbing trend" isn't people continuing to buy BMWs. On my extensive list of "things people do that I don't understand," it gets lost somewhere in the middle. You want an overpriced car? I'm not going to stop you.

No, it's this "subscription" shit that has to stop. It makes sense for things like streaming services and news outlets, because you're getting new content on the regular—and if you're not, you can cancel the subscription. But this? This sort of thing enrages me.

It’s the perfect situation for the companies that offer one. They get to present a highly reasonable visual price to the people considering it, providing that, as is true for most people, they’d rather have a fingernail removed than do any form of math.

Sadly, this is true. I'd even add "without anesthesia."

Subscriptions sneak under the radar, slowly building on your bank account as you quietly repeat to yourself ad nauseum “what’s $5 a month?” until suddenly you are a gaunt, bloodless husk.

Hey, Cracked. Pay me and I'll be an editor. "Nauseum," forsooth. (Pay me more, and I'll be a writer.)

A 2021 poll found that the average consumer spends 273 dollars a month on subscription services, or a whopping $3,276 a year, which is roughly enough to buy 9 Arizona Green Teas a day for a whole year.

I keep close track of mine, and it's less than that—and my income is above average. This includes my Writing.Com premium membership, which is sometimes paid for by Gift Points.

None of it is too surprising, and in honesty, it’s impressive in the blood-soaked way that would make bankers laugh at a white-tablecloth lunch. Income inequality and, lately, inflation, have left whatever we’re calling “middle-class” consumers with less liquid spending money than ever. However, to increase wages and salaries on these consumers would, of course, negatively impact the profits and overhead of these same companies. Therefore, the solution is to proffer in hand, outstretched with a grin, a virtual piggy bank that paycheck-to-paycheck masses can drop one of their hard-earned quarters in each week in order to own the same items their bosses can buy for their mistresses without a second thought.

It's more than that, in my opinion. People figured out a while back that if you can offer something for a monthly price, you can jack up the total cost. Nowhere is this more obvious than with college tuition. Once student loans became widely available, tuition started increasing much faster than the price of everything else, largely because people see "$300 a month" as a much easier pill to swallow than "$30,000 and done." Even if, over the length of the loan, you end up spending closer to $40,000. (There are nuances involving the time value of money here, and it has to be compared to inflation, but I'm trying to keep it simple. Sometimes a loan makes sense.) So if, say, you were offering your software for $3,000, you might look at your balance sheet and decide you'd make more money over the life of the application by charging $250 a month instead. Most people, like this author says, won't do the math. A certain computer-aided design software company pulled this shit over ten years ago, and as an engineer, I kind of do math, so it was a major factor in my deciding to retire.
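
If you'd rather see the arithmetic than take my word for it, here's a minimal sketch. The $30,000 sticker price is from the paragraph above; the 6% rate and 10-year term are hypothetical round numbers I picked just to show the shape of it:

```python
# Standard amortized-loan payment: principal * r / (1 - (1 + r)**-n)
# The $30,000 principal comes from the entry; 6% and 10 years are assumptions.
principal = 30_000
annual_rate = 0.06          # assumed
years = 10                  # assumed

r = annual_rate / 12        # monthly interest rate
n = years * 12              # number of monthly payments
payment = principal * r / (1 - (1 + r) ** -n)

print(f"Monthly payment: ${payment:,.2f}")      # roughly $333 a month
print(f"Total paid:      ${payment * n:,.2f}")  # roughly $40,000 over the loan
```

Same principle as the $250-a-month software: the monthly number looks smaller, so almost nobody multiplies it out.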

Of course, there’s a limit to these asks. Push your subscription fees even a dollar too high, as Netflix recently learned, and you risk shattering the entire facade, causing all your supplicants to suddenly awake from their stupor like a very broke Neo covered in goo.

Netflix is kind of a special case here. As I said, subscriptions to streaming services make sense if you use them, because you're getting new material. Some people aren't into the new material, in which case the subscription no longer makes sense. (For me, it'll be worth it just for The Sandman, coming out next month.)

Articles cover a recent new… I’m not sure what to call it. Program? Purchase option? Gleeful middle finger? Whatever the correct nomenclature is, BMW has attempted to board the Money Train by offering BMW owners a subscription–to the heated seats that are already installed in the car they have purchased. If you own a BMW, and should you like to sit in the gentle warming lap of luxury, you can now do so in multiple countries, including the United Kingdom, for the low, low price of $18 a month.

No. Just... no.

Okay, so here's my heated seat experience. They came with the Subaru I bought back in 2010. Average high temperatures in January in my part of Virginia are around mid-40s F. Average lows, mid-20s. Being the average, sometimes it's quite a bit lower. Point is, heated seats are something I might use a few times a year at home. More if I take a road trip to Minnesota in anything but July. Over the 11 years I owned the car, I think I turned on the seat heater less than 50 times, total.

If I had to pay for it on a monthly basis, I would forego the luxury entirely.
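
For the math-averse, here's the back-of-the-envelope version, using the $18-a-month figure quoted above and my own rough count of about 50 uses over 11 years (a sketch, obviously; your climate and commute will vary):

```python
# Rough cost of a heated-seat subscription over one car's lifetime.
# $18/month is from the quoted article; 11 years and ~50 uses are my own numbers.
monthly_fee = 18
months_owned = 11 * 12
total_uses = 50

total_cost = monthly_fee * months_owned
print(f"Total paid:         ${total_cost}")                   # $2,376
print(f"Cost per warm ride: ${total_cost / total_uses:.2f}")  # about $47.52
```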

My plan is to buy another Subaru if the chip issue is ever sorted. If they pull this subscription shit on me, though, I'm walking.

But yes, I am here to tell you, all of the cars are fully outfitted with everything they need to heat those comfy seats. And yes, as part of a removed, self-reliant system, heating those seats transfers absolutely zero cost to anyone beyond the person who owns the car and the battery under the hood. BMW has just, while looking directly into our souls, installed a software block in the cars that says “ATTENTION CAR: do not activate heat this poor serf’s seat until they pay their monthly tithe to the Car Lord.”

To be pedantic, if the engine is running, the battery is barely even involved. The heaters draw power from the alternator, which the engine spins, same as your lights do. Unless you have an EV, and I can't be arsed to find out if BMW makes EVs.

A BMW spokesperson did give a statement, as follows: “ is part of a global aftersales strategy that BMW has introduced in various markets around the world, including on a small-scale basis in the U.S.” Which is like asking someone, “Why are you pissing on my head?” and them answering “Me urinating on your head is part of a personal piss strategy I have introduced to your head.”

This is called trickle-down economics.

It’s a bit strange that of all car companies to introduce something like this, a luxury producer like BMW would employ it. The type of people who own new BMWs would be, you’d assume, flush with plenty of cash to pay a measly fee every month.

I never did follow this logic. What if you've used up all your cash and credit to buy the BMW? There's a never-ending cycle. "If you can afford the car, you can afford the insurance." "If you can afford the insurance, you can afford the stupid subscription." "If you can afford the stupid subscription, you can afford to give money to the homeless." "If you can afford to give money to the homeless, you can afford to get a new paint job every year." Stop that.

Also, rich people are notorious cheapskates. That's how a lot of them get and stay rich.

It's also clear to me that BMW must charge a per-use fee for their turn signals. Else more BMW owners would actually use one. When I was in the L.A. area earlier this year, I witnessed a BMW driver using their turn signal. Almost gave me another heart attack. I didn't get video of this miracle, but NaNoNette was driving and she can back me up on this.

The “strategy” has inspired a huge consumer backlash and a flurry of incredulous articles, which is what businesses refer to as a “negative response.” Weird, you’re telling me people are upset that you paywalled their car? But if people spend the money, BMW will happily take a momentary PR hit.

And that's the real problem: people allow this shit to happen. Companies wouldn't do it if people didn't pay for it. But people pay for it. I know I've harped in here before about other issues, like climate change, that shouldn't be on us to change. But this is different: it's one manufacturer out of a dozen or so, in a semi-free market. One can choose a different car. If you gotta be a prick, there are plenty of other dick-substitute cars/SUVs (BMW also makes motorcycles, but their seats are automatically heated because the engine is right under them) on the market in the same price range.

If this effort succeeds, though, it's only going to usher in more subscriptions. Unless, of course, we boil ourselves to death first. At which time, who would need seat heaters?
July 21, 2022 at 12:06am
#1035507
Today, July 21, is the anniversary of the greatest achievement of humankind (apart from beer): the first dudes on the moon. Civilization has mostly gone downhill ever since, though the internet sure is nice.

The date is sometimes given as July 20. That's because a) the landing was a few hours before the moonwalk, and b) the timestamp for that first boot on the moon was 0256 UTC on July 21, which is Greenwich time, but as the rocket was launched from Florida, it's sometimes recorded in US Eastern Time, four or five hours earlier (I forget if DST was in effect in 1969).
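
If you'd like the computer to settle the daylight saving question for you, here's a quick sketch using Python's zoneinfo (assuming Python 3.9 or later; the historical time zone rules come from the bundled tz database):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+; on Windows, 'pip install tzdata' first

# First boot on the moon, per the timestamp above: 0256 UTC, July 21, 1969
step = datetime(1969, 7, 21, 2, 56, tzinfo=timezone.utc)
eastern = step.astimezone(ZoneInfo("America/New_York"))

print(eastern)           # should print 1969-07-20 22:56:00-04:00
print(eastern.tzname())  # 'EDT', so yes, DST was in effect that July
```

So: late evening of July 20 on the East Coast, small hours of July 21 in Greenwich. Both dates are right, depending on whose clock you ask.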

Today's article has nothing to do with that, but I couldn't let it pass without mention. I wouldn't expect any of my readers here to make such rookie mistakes, but it never hurts to have a refresher. Plus, I have jokes and commentary.



Want to irritate a young person? Text in complete sentences and put a period at the end. That drives them absolutely up the wall, and it cracks me up.

The article's slant has to do with job hunting, but the rules are universal. Yes, even in texting. As an aside, the reason I mentioned the young person thing is my friend's kid, now all grown up and about to go to college, texted me out of the blue the other day. I was pleased to note that his texts were properly punctuated. I can only assume the college was impressed by his punctuation skills. He even mostly eschewed the use of emojis, but that "mostly" is forgivable; even I use them sometimes. But I get the impression he's an outlier in the texting world.

Good punctuation can go a long way when it comes to job hunting and being taken seriously at work. The average recruiter spends about 7.4 seconds scanning a resume.

Oh, it's worse than that. When I was running a business, I'd share the most egregiously ungrammatical resumes with my business partner, and we'd have a good laugh before hiring someone else. "But Waltz, since when do engineers have to know how to write?" Since forever, because written reports are a thing, and I didn't want to always be the one writing those boring bastards.

Based on comments and responses from our readers and listeners, here are 11 common punctuation mistakes that irritate people the most:

We get these articles from time to time, but bad punctuation lives on somehow. As far as I'm concerned, if you can't get these basic things right, either you don't care, or you're deliberately trolling; either way, I'll ignore you.

I'm not going to comment on every one of these. Check the article if you want to see others.

2. Misusing "it's"

We hear managers rant about this mistake so often that we decided to talk about it separately: Apostrophes show possession, but there's a huge exception: Its and it's.

Really, this applies to any pronoun. My other annoyance is when people mix up there, they're, and their. Or there's and theirs. Or your's. Worse is when Southerners type "ya'll." Y'all oughta know better. And what's up with people typing "loose" when they clearly mean "lose?"

3. Incorrect quotation marks

About a month ago, on a rare venture into the not-so-great outdoors (actually downtown, but close enough), I saw a chalkboard sign that read something like:

"TUTU"
Turkish Street
"FOOD"


I changed the name to protect me from lawyers. Now, I'm sure if they had food, it would be delicious. But they don't. They have "food." Though their happy hour beer prices (also posted on the sign) almost tempted me into the place anyway, before I started to wonder why a Turkish place is selling beer.

6. Using semicolons like commas

I probably overuse the semicolon; I'm aware of this stylistic flaw. But I think I use them properly—most of the time. One of the hardest things about learning French, to me, isn't the words or the grammar; it's that they use commas where we would use semicolons, and omit commas where we would have them. It makes sense within the context of the language, but as the rules are different, it took me a while to figure them out. Also, what's up with the space before the ? or ! ?

7. Putting two spaces after a period

Back in the old days of typewriters (and early computers), there was a hard and fast rule: Two spaces after a period.


You know, I'm kind of a stickler for grammar, but that sort of thing doesn't bug me in the slightest. It's a visual thing, and it's the way I learned it, as I learned how to type before personal computers came out. I did unlearn that, but if someone else hasn't? Whatever. I think the reasoning is that the two-spaces thing was helpful to distinguish the end of a sentence from, say, "Mr. Magoo." And it became obsolete with proportional fonts, or whatever you call them: fonts that don't give every letter the same width. Like the one you're reading now, as opposed to this one.

10. Capitalizing too many words

Sometimes people capitalize a word to make it stand out. But not every word can or should be capitalized.


There's a time and place to do that, mostly for comedic effect—which is not something you want on a resume unless you're applying for a court jester position. Another oddity of French as compared to English: they don't capitalize a lot of words that we do. For instance, we type Monday; they type lundi. We say July, they say juillet. This is in contrast to German, which, as far as I understand it, capitalizes every Noun.

11. Using too many exclamation points

Hell, using any bangs in a resume, unless you worked for a company whose listed name is something like Yahoo!, would be a hard pass for me.

I'm going to add one:

Using emojis, or their archaic text equivalents. Again, there's a time and place for that, and it's not on a work document. I once got a design plan comment letter from a municipality which was just riddled with smiley faces and (even more) frowny faces. Stop that. You're a professional. I'm a professional. We can communicate without you trying to ease the sting of your criticism by making funny faces with colons, dashes and parentheses.

Like everyone, I make mistakes now and then. Sometimes I'll catch them on rereading; sometimes I won't. So I don't sweat the occasional mistake of others, because I generally don't hold other people to higher standards than I do myself. And like most grammar rules, there are occasions to ignore the rules.

A resume isn't one of those.
July 20, 2022 at 12:02am
#1035465
I'm sure most people experience this sort of thing:

What the Tip-of-the-Tongue Phenomenon Says About Cognitive Aging
While word-finding failures can be taken as evidence of memory problems, they may not be harbingers of befuddlement after all.


Still, most of us call it a "brain fart" because it's much shorter and easier to remember. It is extremely unlikely that anyone will ever forget the word "fart," even for a second, because y'all are so attached to the word.

I guess when you're MIT, you can't just use "brain fart," though.

Have you ever had trouble thinking of someone’s name? Perhaps you can even see the face of the person in your mind’s eye, and you would immediately recognize the name if a friend suggested it to you.

I find it sometimes very difficult to recall the names of actors. Even one I've seen hundreds of times, like James Spader. I can only assume that this is because actors take on different names and identities, but sometimes I can't remember those, either. This leads to multiple "You know, that woman. In that movie. The one with the kitchen cabinets? She played a drug dealer or something." moments.

This doesn't happen to me nearly as often with musicians.

In fact, psychologists refer to such experiences as tip-of-the-tongue (TOT) states. But are they really the harbingers of befuddlement that they appear to be?

Having dealt with dementia in both parents, I'm naturally averse to going through it myself. No, there's no genetics involved, but that doesn't mean I won't lose my cognition. Honestly, I'd rather just die suddenly than decline like they did.

Fortunately, I don't see these... TOTs... as being a sign of decline, because, like most people, I've always experienced them on occasion. This is why I'm a writer and not an extemporaneous public speaker; if I'm writing, I can just move on to other things while my subconscious works on finally barfing up the right word or name like a cat with a hairball.

Much like astronomers who study ephemeral phenomena like supernovas, researchers know that TOT states will eventually happen, but not exactly when. This uncertainty has led to two distinctly different ways of investigating TOTs: via naturalistic methods and by experimentally inducing word-finding failures in laboratory settings.

And this is the interesting part: the science.

Diary studies, in which people write down every time they experience a TOT state, allow researchers to assess both frequency and resolution rates. The results suggest that college students experience about one to two TOT states a week, while for people in their 60s and early 70s, the rate is slightly higher. Research participants in their 80s, however, experience TOT states at a rate almost twice as high as college students.

As it is unlikely I'll live that long, that's kind of a... what's the word... relief.

We need to be cautious, however, when interpreting such naturalistic data. It may be the case that older adults, who are more concerned about their memory lapses, will be more likely to record such instances. They may be more conscientious about writing them down, perhaps because their lives are less hectic than those of younger participants.

Which is why you work to minimize confounding variables in scientific studies.

The alternative method for studying word finding is to experimentally induce a TOT state.

When I got to this sentence, I pictured invasive electrodes delivering electric shocks to participants' brains. The reality was a bit of a letdown.

They found that simply giving participants dictionary definitions of uncommon English words would often trigger a word-finding failure. An example from their study was “A navigational instrument used in measuring angular distances, especially the altitude of the sun, moon, and stars at sea.”

Yeah, that would be a bad example for me. Dad was a sailor, and I inherited his sextant, which I keep on a shelf.

But I'm sure others would have worked.

In this study, the participants were often able to provide the desired word without difficulty. On other occasions, the subjects had no idea what word the definition was describing. However, if they found themselves in a TOT state, Brown and McNeill asked them additional questions. The researchers discovered that, while in such a state, people can report partial information about the sought-after word, even as the word itself eludes their grasp.

I've experienced this, too. It's like playing charades with yourself. "Oh, damn, I know this word. Starts with an S. Two syllables. I could make a pun out of it if I remembered it. The pun would have to do with doinking. Ah! That's it! Sextant!"

When given the definition for “sextant,” the participants sometimes responded with “astrolabe” or “compass.” However, they also sometimes offered up words that only sounded like the intended term. The definition for “sextant” also led to responses of “sextet” and “sexton.”

You can almost feel these authors refraining from using salacious examples. Yes, I had to stop to think of the word "salacious."

Incidentally, in case you don't know, it's called that because the name is derived from the Latin for "six" (a sextant's arc spans one-sixth of a circle), not anything to do with the English word "sex." This, of course, doesn't stop puns from happening. It's almost as reflexive as when someone mentions the name of the seventh planet from the sun.

As with many issues in cognitive aging, we can view the increase in TOT states as a glass half empty or half full. On the one hand, these retrieval failures can be taken as evidence of weakening connections between the meanings of concepts and the words that denote them in long-term memory. It’s also possible that the increase in word-finding problems with age reflects something very different.

Psychologist Donna Dahlgren has argued that the key issue is not one of age but one of knowledge. If older adults typically have more information in long-term memory, then as a consequence they will experience more TOT states. It’s also possible that TOT states are useful: They can serve as a signal to the older adult that the sought-for word is known, even if not currently accessible.


To me, this is the important part of the discussion. While I'm not a fan of people trying to find the positive side of freaking everything, in this case, it would be good to know that having a brain fart doesn't mean your mind is going down the shitter. I say "would be," because that entire second paragraph I just quoted is speculation; clearly, more science needs to be done.

Viewed this way, TOT states might represent not retrieval failures but valuable sources of information. If you are an older adult and still worried about the number of TOT states that you experience, research suggests you might have fewer such episodes if you maintain your aerobic fitness.

Ooooookaaaaay, that exercise thing just came swinging out of nowhere. Oh well. Hopefully, we already knew that; I know that I can write better after some mild exercise. Not excessive exercise, though; that just makes me tired.

Because like everyone else, I'm getting old.
July 19, 2022 at 12:14am
#1035385
For the record, the only times I've ever urged someone to smile was when I was doing photography.

The smile: a history
How our toothy modern smile was invented by a confluence of French dentistry and Parisian portrait-painting in the 1780s


In part, this is because people have pushed back on it. Telling strangers to smile is rude, patronizing, and entitled. But even before I realized that, I never prompted anyone to smile (outside of a photography setting) because I can't smile.

The smile is the most easily recognised facial expression at a distance in human interactions.

Oh, I dunno about that. Red-faced rage might be slightly easier to spot.

As well as being easy to make and to recognise, the smile is also highly versatile. It may denote sensory pleasure and delight, gaiety and amusement, satisfaction, contentment, affection, seduction, relief, stress, nervousness, annoyance, anger, shame, aggression, fear and contempt.

You know, the word "fuck" does all that and more, and it's much, much easier to produce than a smile.

The smile comes easy to human beings.

Well, I guess that proves it, then: I'm not a human being.

I mean, it's not that I can't turn up the corners of my mouth; it's that people expect to see teeth (which when you really think about it is weird), and the only way I can show my teeth is if I draw my lips back in a rictus of agony—which will never be mistaken for a smile.

The smile may even predate the human species. Many great apes are known to produce them, suggesting that the smile first appeared on the face of a common ancestor well before the existence of Homo sapiens.

When other animals show their teeth, it means something very different, usually. And I'd be very wary of attributing the same underlying emotion to the smile on a nonhuman animal, even that of a chimp.

The smile has always been with us then, and it would appear it’s always been the same. It seems only one step further to claim that the smile has no history. But this would be far from the truth. In fact, the smile has a fascinating, if much-neglected past.

I'm deferring to the author who seems to be an expert on this facial expression (but why?), although it's been my understanding for a long time that smiles can mean different things in different cultures. So this article has a decidedly European bent.

The ubiquity and polyvalence of the smile means that, in social circumstances, for example, it is not enough to see someone smiling. One has to know what the smile intends. The expression needs untangling, deciphering, decoding. In this, it resembles the wink. As the anthropologist Clifford Geertz pointed out in 1973, the wink is physiologically identical to the involuntary eyelid twitch we call a blink.

I'm not exactly well-versed in decoding people's facial expressions, but what do you mean, a wink is the same as a blink? A wink is generally performed by one eye. Blinks happen in both simultaneously (excluding certain medical conditions). They feel different to execute. Try it. Maybe I'm just weird, like with being unable to grin, but to me they're vastly different despite using the same body part.

In the West, we tend to acknowledge the variability of codes in terms of space and diversity: there is a sense that Western smiling culture differs from that to be found, for example, in Japanese and Chinese societies.

Okay, so the author does acknowledge this.

The article goes on to link the etymology of words meaning "smile" in various languages to the evolution of the meaning of the smile. It's all quite fascinating, but doesn't lend itself well to copy-pasting.

Yet if the smile was alive and well in Western culture, it was not yet our own. In Western art, it differed in one highly significant respect: the smiling mouth was almost always closed. Teeth appear in facial representations extremely rarely.

Likely because dentistry wasn't a thing. Or maybe I'm a genetic throwback to those people.

It is tempting to ascribe this state of affairs to the unhygienic state of mouth. But, in fact, skeletal remains from late medieval cemeteries suggest that teeth were then less affected by cavities than they would become from the 18th century onwards, with the mass advent of sugar into Western diets.

Okay, yeah, he owned me there.

Three factors operated to minimise representation of the expression. First, there was a close association between the open mouth and the lower orders.

We still call them "mouth-breathers," for reasons not entirely clear to me. Though I think the term is meant to apply to people of limited mental capacity; to the elite, that means everyone else.

Or – and this was the second factor in play, in art as in life – it betrayed a loss of reason. The mouth lolling open was an accepted way of depicting the insane, but it went further than that, and encompassed the representation of individuals whose rational faculties had been placed in abeyance, by passions or base appetites.

Heresy. My base appetites are completely rational.

The third factor explaining the absence of positive depictions of open mouths in Western art related to what were known as ‘history paintings’ depicting scenes from ancient history or scripture.

Props to this guy for distinguishing "history" from "scripture." This part gets a little complicated, and I won't copy all his arguments here.

For Le Brun, it followed that, when the soul was calm and tranquil, the face was perfectly at rest.

This Le Brun person obviously never heard of "resting bitch face."

In late 18th-century Paris, a new phenomenon had marked its arrival in Western culture, transgressing all the norms and conventions of Western art. The modern smile was born.

After which Parisians completely forgot how to smile.

That's a joke. In the article, the author connects the visible-tooth smile to La Révolution. Somehow. Seriously, just read it. I'm not convinced, but it's a new take on the subject, at least for me.

The virtuous and transcendent smile showcasing healthy white teeth in the novels became a model for the Parisian social elite in real life. It became not only acceptable but even desirable to manifest one’s natural feelings among one’s peers. English travellers expressed amazement at how frequently Parisians exchanged smiles in everyday encounters. The city had become the world’s smile capital.

Wait, they smiled even upon spotting an English person?

If the cult of sensibility gave novel readers the wish to smile in this fashionable way, Parisians were also lucky enough to have technical assistance at hand. The French capital had become a renowned centre for dental hygiene.

And now we're back to dentistry.

The new dentists could clean, whiten, align, fill, replace and even transplant teeth so as to produce a mouth that was cleaner, healthier and – in smiling – attractive.

I feel obliged to point out that all of this is set in the late 18th, maybe early 19th, centuries—and dental anesthesia wasn't invented until the mid-19th. I guess they're right when they say "fashion is pain." (Do they say that? I say that.)

The smile went into hibernation as a public gesture in the West for more than a century... As anyone with family photograph albums going back that far will discover, it was from the 1920s and ’30s that smiles appear for the first time – precisely the period when individuals began to say ‘cheese’ when confronted with a camera. Portraiture had become democratised – and smilified.

Now, reviving the smile for photographic portraiture—that's a hypothesis I can get behind. Incidentally, never tell someone, even if they're French, to say "fromage" instead of "cheese." Their expression makes it look like they've just caught a whiff of some Limburger.

From the turn of the 21st century, iPhone photography and social media confirmed the preferred individual expression of social identity was through smiling.

It's worth noting that, before emoji, we had to use ASCII characters to emulate expressions. I actually preferred those simpler times. But the point is, as far as I know, the first documented ASCII art that found its way into proto-internet messaging was the colon-dash-parenthesis: the "smiley face."

In 2019, the long march of the modern (Western) smile received a severe jolt, with the appearance of COVID-19. Suddenly, that expression retreated behind a surgical mask. It is true that the perceptive among us may have been aware that a genuine and sincere smile causes a detectable crinkling of muscles around the eyes.

Which is one reason I like masks: I don't have to look at people's teeth. My own preferred face mask is a parody of the smile: the lower part of a cat's face, with the mouth open wide, showing pointy teeth and raspy tongue. It suits me, and I will continue to wear it, because I get compliments on it—and I have never, ever gotten a compliment on my actual face. Positive reinforcement: it works.

I am fortunate in that I've never had to work in a place where one is forced to smile. To me, that would be actual hell.
July 18, 2022 at 12:01am
#1035341
Today's article is brought to you by Turkey DrumStik, and it comes from her city.



A newly arrived airliner at Minneapolis-St. Paul International Airport doesn't have wings, or wheels, or engines. It'll never leave the gate. But it will help more people gain the skills and confidence needed to make their travel goals a reality.

Look, I'm not going to rag on this concept. I think it's great. Kind of a dress rehearsal for the real thing.

The former Delta Air Lines cabin simulator, lined with rows of Boeing 737 seats, is now located in Concourse C at MSP, in what may be a first-of-its-kind airport installation.

The article includes some pictures, including the interior of the thing, and while I'm not sure if I've ever been in a 737, it certainly seems to replicate the "packed in like pickles" experience. (Sardines are cliché, and pickles is alliterative, so I'm going with that.)

"The whole goal is giving people — as well as people with service dogs — that preflight experience that helps them overcome some of those challenges without being on a real, live flight," said airport spokesperson Jeff Lea.

Bit awkwardly phrased there, Jeff. Are you saying "people with service dogs" are not the same thing as "people?" I doubt it, but he could have phrased it better.

That Navigating MSP program includes using ground transportation, getting through security, getting in and out of plane seats — everything but getting off the ground, in preparation for an actual trip.

I imagine that one's first time flying can be stressful, or, as the article points out, the first time flying after becoming disabled. It sounds like it's got most of the inconveniences covered.

But I see no mention of it being mechanized. That is, a lot of people have a fear of flying. Not of the inconveniences of getting through security theater with one's dignity intact, or of sitting next to a 400 pound individual; those are just annoyances. No, there's no mention of it being rocked around like crazy, like planes sometimes do, freaking out the passengers who are absolutely certain that the plane's about to crash.

It doesn't look like it can drop 1000 feet in 20 seconds or whatever, putting you in free-fall and validating the seat belt requirement.

It does not replicate the existential terror of knowing that you're only held up by chaos.

I mean that literally. Lots of people think it's the Bernoulli effect, which isn't all that complicated: the faster air moves, the lower the pressure, so if you get air moving faster across the top of the wing, then the pressure on the bottom lifts the plane. While that accounts for some of the lift, in reality, it's far more complicated than that, and no one really understands all the science.
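
For the curious, the textbook version of that first argument is just Bernoulli's relation along a streamline, which assumes steady, incompressible, frictionless flow (in other words, it politely ignores the turbulent part):

\[
p + \tfrac{1}{2}\rho v^{2} = \text{constant}
\]

Here p is static pressure, ρ is air density, and v is flow speed; faster air over the top of the wing means lower pressure up there, and the pressure difference times the wing area gives a crude estimate of the lift. Useful approximation, not the whole story, which is rather the point.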

Think about that. Somewhere in the neighborhood of 100,000 flights take off every day, worldwide. The airline industry is remarkably safe—you're far more likely to be in a car accident on the way to or from an airport. And yet, the science behind it relies in part on turbulent flow, which is poorly understood. Hell, it might as well be fairies doing the lifting.

And turbulent flow is in the domain of the branch of physics known as chaos theory.

There's no question that it works. But as an engineer, "it just works" is usually an unsatisfactory answer for me.

Another question: does the simulation include screeching children, smelly emotional support animals, and crappy movies that you have to pay extra for and watch on your own damn phone? Because those are the reasons that, given the choice, I'll usually seek other forms of transportation. I have no problem with actual service dogs, but if some asshole with an emotional support badger gets seated next to me, we're going to have a problem.

Again, I think this is a great idea, doing a dry run so people know a bit about what to expect when traveling by air. I just think that when they get to the real thing, they might be in for a rude awakening.

*Movie**Film**Film**Film**Movie*


I haven't done one of these in a while, because I've been stuck at home for various reasons. But yesterday, I ventured forth to see this movie. And I hope to get back into my once a week habit soon.

One-Sentence Movie Review: Thor: Love and Thunder

I like comics-based movies, and I like movies that don't take themselves too seriously, and therefore I like comic book movies that don't take themselves too seriously—but this one maybe takes that concept a bit too far, verging on self-parody and making light of some very serious situations; nevertheless, despite some choppy editing in its earlier scenes, it's a fun foray into the MCU, with the best soundtrack since the last Thor movie (though not as good as the ones from the Guardians movies).

Rating: 4/5
July 17, 2022 at 12:02am
#1035301
From the BBC, we have a non-boring article about boring people.

Why we snap-judge some people as 'boring'
Why do people with certain professions and interests make us yawn – even before we get to know them?


Imagine you are at a party, and your friend calls you over to meet their cousin Barbara. Your friend peppers his introduction with a few facts: Barbara lives in a small town and works as a data analyst for an insurance agency. Her favourite pastime is watching television. You may find yourself groaning at the mere thought of the meeting – and that reaction may say as much about you as it does about data analysts who enjoy a bit of trash TV.

No, all I want to know is: Is Barbara single and heterosexual? Because she sounds like the perfect partner, someone not inclined to pester me to go hiking in (shudder) the outdoors, or demand we go out dancing or whatever it is the kids do these days.

That is, unless her TV tastes include "reality" shows. That's a hard pass.

I guess that makes me boring, too.

According to recent research, people have many preconceptions of what features make up a stereotypical bore.

Mine is "slow-speaking math professor" (or, since this is the BBC, "maths professor"). This may be because of personal experience.

People judge those who match ‘boring’ stereotypes harshly, considering them less competent and warm than the average person, and unfairly shunning them in social interactions – before they have even opened their mouths.

Yeah, no, I'd rather wait until someone actually starts speaking to decide whether they're boring or not. And even if they are, so what?

Van Tilburg’s research builds on more than two decades of scientific interest in people’s experiences of boredom.

You want to know what's boring? Studying boredom for twenty years. Twenty goddamned years.

In 2014, for instance, researchers at the University of Virginia, Charlottesville asked participants to spend 15 minutes in a sparsely-furnished room.

Oh, great, as if my hometown needed more negative publicity.

The participants did not have their mobile phones, computers or any reading materials – but there was a device that delivered a small electric shock at the press of a button. Despite the obvious pain that this would bring, 18 of the 42 participants decided to do this at least once to break up their boredom.

I'm guessing this study involved students, like most university psychology studies. It tends to skew the results. Students, used to a hectic, fast-paced lifestyle, don't know what the hell to do with themselves when confronted with nothing to do. Me, I'd just treat it as a novelty. For an hour, maybe.

Still, I'd probably hit the button just once, more out of curiosity than boredom.

Speaking of curiosity, I was wondering how long this exile lasted for. An hour? A day? A week? So I clicked through to the actual study. From the abstract: "In 11 studies, we found that participants typically did not enjoy spending 6 to 15 minutes in a room by themselves with nothing to do but think, that they enjoyed doing mundane external activities much more, and that many preferred to administer electric shocks to themselves instead of being left alone with their thoughts."

Six to 15 minutes? What the idling fuck is wrong with people? I've spent way more than that sitting in dentists' waiting rooms or train stations, without even using my phone.

You may wonder whether this reaction was peculiar to the set-up of the experiment – but it has now been replicated in other situations.

Okay, good to know.

In one later study, participants were forced to watch a tedious film that played the same 85-second scene on repeat for an hour. When given the opportunity, many participants chose to play with a device that delivered an uncomfortable zap of electricity.

That's an entirely different scenario, in my opinion. If I had to watch a boring 85-second scene on repeat, I'd try to find a way to up the amperage (not the voltage—it's the current that'll kill you). No, I'd rather have no stimulation than a crappy video.

This may explain why it is so insufferable to be stuck with a bore at a party, while we can hear all the other excited conversations around us. While we are obliged to hear about the minutest details of our new acquaintance’s job, we are missing the chance to make a deeper social connection to someone who would be much better suited to our personality.

Yeah, but what if a boring person is better suited to your personality?

The article goes on to note how people pre-judge other people as boring or not.

Working with Eric Igou at the University of Limerick and Mehr Panjwani at the London School of Economics and Politics, Van Tilburg first asked a group of 115 US residents to describe the most typical qualities that they associated with boring people.

I'd be like, "Being a professor at a British or Irish school." Because I'm a troll.

These results were, in themselves, highly revealing. According to Van Tilburg’s participants, data-entry workers, accountants and tax officers were considered to be the most boring professionals.

I mean, okay, tax officers are a lot of things, but "boring" isn't a word I'd use.

Hobbies seen as boring included going to church, watching TV and sleeping.

Sleeping is not a hobby.

In terms of personality, bores were thought to be closed-minded with a narrow range of interests, and to lack a sense of humour or strong opinions on any issue. They were also thought to be overly negative complainers, whinging about every issue.

Lacking a sense of humour (or even humor) would certainly be a deal-breaker for me. I mean, come on, you know me. But how can you complain about every issue if you don't have strong opinions about them?

How to be interesting

I'd say "Write a daily blog about stuff you find on the internet," but I may be biased.

His first is to consider whether you can reframe your job description. Data analysis might, at first glance, be seen as a boring profession – but perhaps you are contributing to a bigger endeavour, such as scientific research. In general, scientists were thought to be much less boring than data workers – so emphasising the scientific element of your job could help to bypass people’s biases.

These people obviously haven't met actual scientists (apart from the ones conducting the study). While we hear about exciting scientific breakthroughs all the time—some of them right here in this blog—they're almost always the result of years or decades of plodding and meticulous research. In other words, actually doing science is boring as hell unless that's something you want to dedicate your life to.

Remember that bores, in general, were considered to be closed-minded with few passions. Almost everyone enjoys TV, after all, and if you list that as your only hobby you are inevitably going to seem bland. But what are your more individual obsessions? Things like gardening, journaling, fishing and knitting were all viewed relatively positively.

About the only pastime I'd consider more boring than fishing is watching golf. Or maybe watching fishing.

Van Tilburg points out that people are much more likely to apply negative stereotypes to others when they feel threatened. By judging you unfairly harshly for your job or your hobbies, someone may just be covering up their insecurities.

Exactly how one can feel threatened by a boring person is beyond me.

And I'll just leave you today with this. (From this delightful book.)
July 16, 2022 at 12:03am
#1035263
Today's article is from 2020, hence the COVID quarantine angle. But the interesting bits are historical, so feel free to ignore the COVID part just like you're ignoring COVID itself.

Ten Surprising Facts About Everyday Household Objects
While COVID-19 has us homebound, it’s a good time to reflect on the peculiar histories of housewares we take for granted


Throughout the world, from the humblest abode to the most lavish mansion, our homes have always been a respite from the world.

That is, until recently, when the internet brought the world inside our houses.

When we think of the technology that makes our homebound life bearable, we call to mind those electronic devices that allow us to remain connected to the outside world.

Nah, I think of beer. And modern beds. A few years ago, I sprung for a memory-foam mattress with an adjustable frame so I could sit up to read in it. Worth every penny, and it even came with a remote control. Now that's a fine use of modern technology.

But yeah, the internet has been competing with books, bed, and beer for my attention since the early 90s.

However, it might surprise us to know that, for our ancestors, many of the objects we take for granted, like napkins, forks and mattresses, were also once marvels of comfort and technology—available to only the few.

Well, not that surprising. Things like adjustable beds—or even nice comfortable mattresses—pretty much everything was once a luxury item, until it could be mass-produced enough to be accessible to us peons.

Our temperature-controlled homes filled with comfortable furniture and lights that turn on at the flick of a switch are luxuries unfathomable to the kings and queens of the past.

Yeah, but kings and queens had servants.

Perhaps, like me, you’ll find yourself grateful for our ancestors who suffered with stone or wooden headrests, stiff-backed chairs and cold nights before feather-stuffed pillows and fluffy duvets were part of everyday life (and appreciative of those who imagined that things could be better).

Non-homeless-like typing detected.

In The Elements of a Home: Curious Histories Behind Everyday Household Objects, from Pillows to Forks, I’ve uncovered the stories behind the objects that fill our homes and our lives.

And there it is, folks: the "Buy My Book" portion of the article. Look, the article's free; there's gotta be an ad of some sort. Apparently they're also getting paid by the History Channel (ugh) and of course they want you to subscribe to the magazine. For every great advance in technology and society, there's a price. Today, the price is goddamned ads. (Or, like me, you can install a script blocker. I had to open the article in another browser to see the actual ads.)

I'm not going to quote every section here. Just a few highlights.

The fork was once considered immoral, unhygienic and a tool of the devil.

This should not be surprising. There's a certain segment of the population that considers everything new to be a tool of the devil. It takes a generation or two, at least, for it to become mainstream (see also: Dungeons and Dragons). To be fair, though, a fork literally looks like a tool of the devil.

In 1004, Maria Argyropoulina, niece of the Byzantine emperors Basil II and Constantine VIII, was married to the son of the Doge of Venice.

Much nobility. Wow.

She brought with her a little case of two-pronged golden forks, which she used at her wedding feast. The Venetians were shocked, and when Maria died three years later of the plague, Saint Peter Damian proclaimed it was God’s punishment. And with that, Saint Peter Damian closed the book on the fork in Europe for the next four hundred years.

Okay, so what about the rest of the world? Europe was hardly the pinnacle of culture at the time.

Keys weren’t always pocket-sized.

Locks as we know them are a surprisingly modern invention. So are the doorknobs they generally inhabit.

However, the ones that opened the wooden locks of the massive marble and bronze doors of the Greek and Egyptians could be three-feet in length, and so heavy that they were commonly carried slung over the shoulder—a fact that is mentioned in the Bible.

I'm thinking Superman's neutronium key to the Fortress of Solitude here.

Ancient Romans, who lived extravagantly in most other aspects of their lives, were surprisingly spartan when it came to their bedrooms.

The poor slept on a straw mattress set in a simple wooden frame. If your purse allowed, the frame was cast in bronze or even silver, topped with a mattress stuffed with wool or down. The bed—and only the bed—resided in a room called a cubiculum (from which we get the word cubicle), a small space with tiny windows that let in little light.


And what do they tell us these days? "Use the bedroom only for sleeping and sex, and keep it dark. Especially for sex." Well, in that spirit, screw you; I read in there (see above). Still, in this respect (and a few more), the Romans were rather ahead of the curve.

Plates were once made out of bread.

Well, the dishwasher is a thoroughly modern invention, requiring running water and electricity. Without a dishwasher, I'd be eating off of bread too. Mine's broken right now. I mean, it works, but one of the rack rollers popped off. Fortunately, this is something I can fix myself once the parts I ordered come in. As opposed to my clothes washer, which finally died after 30 years and I had to buy a new one (so I got a new dryer while I was at it). Having to replace yet another critical appliance right now would... well, I could do it, but it would still be a massive pain in the ass.

Playing cards came from the only nation with the paper-making technology to pull it off: China.

The first known cards, developed in the ninth century A.D. were the size of dominoes.

Like I said, Europe wasn't exactly the pinnacle of culture around this time.

And yet, the standard cards we use now are based on French and English designs.

In 969 A.D., when Emperor Muzong of Liao capped off a 25-day drinking binge by playing cards with his empress, it’s doubtful he had any idea that his favorite pastime would travel the Silk Road through India and Persia before igniting a frenzy for the game in Europe.

Props to that guy for going on a 25-day drinking binge. That's impressive.

Eating on a bare table was once something only a peasant would do.

Yes, because unlike nobility, peasants didn't have servants to clean the damn tablecloths. I'm betting that having such things as tablecloths indicated that you were rich enough to have servants (in the days before washing machines), because it would have been just another thing to beat in the wash tub. It's kind of the same thing as, today, buying the most expensive car your credit score will let you.

If you could afford it (and maybe even if you couldn’t), the table would be covered by a white tablecloth, pleated for a little extra oompf. A colored cloth was thought to impair the appetite. (The exception to the white-only rule was in rural areas where the top cloth might be woven with colorful stripes, plaids or checks.)

Now that's interesting. If you want to indicate "country" these days, you show a picnic table covered with a checkered tablecloth.

I guess some fashions don't change as quickly as others. Like calling any innovation a "tool of the devil."
July 15, 2022 at 12:07am
#1035218
Anyone hungry? Today's link, from Cracked, might make your stomach gurgle... or twist.



Having too many choices can be worse than having only a few, as evidenced by countless hours spent opening and closing freezer doors in the ice cream aisle.

People act like "shrinkflation," the reduction of a product's volume while the package price stays the same (so the per-unit price quietly goes up), is a product of our current economic situation. It is not. Ice cream makers have been doing this crap for decades. Therefore, I don't buy ice cream. Also, I don't muck around with supermarkets; I let a gig worker do that for me.

But that doesn't stop me from shaking my head at the bewildering array of Oreo flavors, some of which are unholy abominations from the infernal abyss.

The same goes for nutritional knowledge, with a glut of articles, studies, op-eds, and propaganda all contending amongst each other like sports analysts arguing about Tom Brady's hair.

I've mentioned before that it's hard to trust nutritional advice. The science isn't usually rigorous, and bias is always creeping in. Hence, fads.

Luckily, the following article aims to separate the chaff from the tasty bits...

Not that I'd take nutritional advice that's coming from a comedy site as absolute truth, either, but at least it's amusing.

Remember, these headers are the wrong information: misconceptions, as in yesterday's entry, only this time about food instead of guillotines.

4. GMOs Are A New, Human Phenomenon

I don't get all the freaking out about GMOs anyway. Phobia of science, I suppose. And sometimes, such a phobia makes sense, like when they marched people under a mushroom cloud to prove how safe the fallout was (spoiler: it wasn't). But it's one thing to be skeptical, and another thing entirely to flat-out reject anything new.

Besides, if y'all are going to keep increasing the population, we're going to have to do something to feed all those hungry mouths, and the only way to do that is through science. Science like genetic modification of crops.

...if GMO implies unnatural, then literally every food is unnatural because everything has been modified past resemblance to its wild form. And the genetic modification process existed long before humans ever built subterranean food labs to test which new Pop-Tarts flavors kill lab rats.

Which is what I've been saying.

One time I saw a video put out by creationists where they waxed poetic about how the banana is a miracle of God or whatever. Perfectly designed to fit in one's hand, plentiful, easy-to-remove peel, no seeds. I laughed about that for hours, because God didn't make the bananas we know today. Neither did natural selection. We did that. Kind of like how we did pugs, only not nearly as butt-ugly, but just as prone to disease.

And, as the Cracked link points out, even horizontal gene transfer (an organism's DNA picking up genes from another organism entirely, not just changing through selective breeding) happens "naturally." (I put that in quotes because we're part of nature, and anything we do is therefore natural, but it's useful sometimes to distinguish the product of human intervention from what happens without our meddling.)

3. Fresh Food Is Healthiest And Leftovers Suck

Leftovers are much maligned. Not just because the sulfurous, fishy vapors emanating from a coworker's reheated salmon-cauliflower mash leave the entire real estate office smelling like a medieval brothel.

Hey now, I just read a few days ago about a guy who actually died from eating leftovers.

Sure, he'd set them on the counter and ate them like five days later, so it was his own stupid fault. Natural selection at work.

Leftovers are derided as something that was once great and is now a dim specter of its former self, like a culinary Ozymandias or the post-Macaulay Culkin Home Alones. But eating leftovers can be good for you in numerous ways.

Not to mention there are some foods that are simply better as leftovers. Putting chili in the fridge overnight enhances the flavor, for example. And pizza is great the next day, provided it's reheated properly (some people even like it -- ugh -- cold).

Many carb-based foods that have cooled emerge from the cold with fewer calories and more fiber because their starches transform into resistant starches that pass unchanged through the small intestine.

Next up: scientists find a way to replicate that (which is fine) and then companies find a way to market it to us at a higher price point because it's healthy (which is not fine).

The reductions in calories, the insulin production, the lessening of other health risks, the promise of a nicer butt, and the prebiotic effect all have important implications for the over 2 billion people who derive "up to 80% of their energy intake" from this ancient staple food crop.

I'm just leaving this here to point out that it's not just about food consumption in developed countries.

2. Natural Is Better Than Unnatural

Yeah, this is one I've harped on before.

Bad food is made of chemicals; therefore, chemicals are bad. And bad food is bad. That makes sense, and I can't fault that logic. Especially when the entity dispensing it is wearing sprouted-mung-bean sandals and clutching a turtle-safe shopping bag made of upcycled cigarette butts from Serbian playgrounds.

But here's a life-improving pro-tip: if a person has sleeve tattoos of flowers and birds, ignore their (nutritional and health) advice. Because hey, good food has chemicals too. In fact, it might or might not be surprising to learn that all food is made of chemicals because pretty much everything is a chemical.

Which is exactly what I've been saying.

Because as chemistry teacher James Kennedy says, "In reality, 'natural' products are usually more chemically complicated than anything we can create in the lab."

Not only that, but poison ivy, hemlock, death cap mushrooms, and snake venom (to name but a few) are not exactly lab-created. To equate "natural" with "good" and "human-made" with "bad" is not just an oversimplification; it's dangerous.

1. Fancy Greens Like Kale Are The Most Nutritious

Leafy greens, microgreens, macrogreens, and fractal lettuce are mainstays of the produce section at stores like Nugget or Whole Foods. Kale sales are solely responsible for at least two of Jeff Bezos' blood billions. But these overpriced veggies are no more nutritious, and sometimes less so, than the greens blooming from the gutters of San Francisco.

I'm not eating anything from the gutters of San Francisco.

Also, fractal lettuce? Closest thing I can find is Romanesco broccoli, which is admittedly awesome-looking. Guess they put that in there for the comedy value. Protip: if you're going to make a joke, make one that your audience is going to understand.

Anyway, people make fun of kale, and act like it hasn't been around very long. We grew that stuff in the garden when I was a kid. I personally didn't like it then (nor did I enjoy any of the other cultivars of the same species, which include cabbage, broccoli, brussels sprouts, and cauliflower—yes, those are all the same species). I still don't much care for kale (but I'll eat the hell out of that other stuff), but hey, eat it if you like it. Same goes for avocado toast, another food people like to make fun of.

So there's the science, delivered with the added calories of humor.
July 14, 2022 at 12:01am
#1035176
So. I'm cheating today. Rather than picking from my list at random, I'll be featuring an article from my queue about the French Revolution.

Obviously, because today is le quatorze juillet.



The French Revolution is a landmark period in European history, but people still get a lot of it wrong.

Are you kidding? We Americans can't even get our own history right. E.g.: The Alamo.

1. Misconception: Les Misérables takes place during the French Revolution.

Confession time: I have never read Les Mis. I have never seen the play. I have never seen any of the movies. It's not that I don't want to; it's just that I've never once sat down and gone, "You know what I should do? I should see Les Mis." Someday. But today is not that day, because as this article points out, it's not about la révolution at all.

After seeing the 2012 film adaptation of the musical Les Misérables in theaters, historian Julia Gossard caught a snippet of some other viewers’ conversation. “So, that was the French Revolution?” one woman asked. “And it was unsuccessful?”

Look, I'm not a historian. Nor have I, as noted above, ever been exposed to any version of Les Mis. And yet, somehow, I knew this.

The article goes on to explain the historical background of the story, and contains spoilers. Insofar as you can spoil a novel that was published 160 years ago.

2. Misconception: Rebels stormed the Bastille to free political prisoners.

This is, of course, what led me to link this article today.

When the Bastille was stormed on July 14, 1789, there were only seven inmates. One was a wayward relative sent by his family, four were serving time for forgery, and two had been committed due to insanity—not the political prisoners you might have imagined. But if the goal wasn’t to free prisoners, why attack a prison? The real reason, according to most historians, was for ammunition.

But why was... never mind. It's in the article.

3. Misconception: French reformists all wanted to end the monarchy.

Well, duh. That would be like saying that everyone who wanted taxation with representation in the Colonies wanted to revolt against King George III.

The economic crisis illuminated some of France’s long-standing issues, perhaps most notably the insidious effects of feudalism. Not only did the nobility and high-ranking clergy members own most of the land, but their positions came with a lot of perks and exempted them from a lot of taxes.

Hm... do I hear an echo?

But it wasn’t like “Death to the monarchy” and “Long live the monarchy” were your only two options. In fact, many political factions simply wanted a constitutional monarchy. And when the National Assembly created its constitution, that’s what it was for.

Well, it worked for England. France, however, has a long history of doing the exact opposite of what England was doing, just because.

4. Misconception: The guillotine was invented during the French Revolution.

Not only did Joseph-Ignace Guillotin not invent guillotines, he didn’t even design the ones used during the French Revolution. All he really did was suggest that France standardize executions.

I guess there weren't too many anti-death-penalty Revolutionaries.

Joseph-Ignace Guillotin was against the death penalty altogether, but apparently, he realized that France was nowhere near being ready to give it up.

Ironically, Guillotin was one of them.

“With my machine,” he explained, “I strike off your head in the twinkling of an eye and you won’t feel a thing.”

"My machine?" I thought you said... aw, forget it, Jacques, it's France.

Incidentally, I'm sure they thought it was humane, but it turns out that you're aware for possibly up to ten seconds after your body is separated from your head. Ten seconds of rolling around in a basket probably seemed like an eternity. So much for "humane."

5. Misconception: Marie Antoinette said “Let them eat cake.”

Hopefully, everyone already knew this. I'm sure it's a useful myth for a country to tell itself, but it's still false.

Just for pedantry’s sake, the sentence in French is “Qu’ils mangent de la brioche,” which literally means “Let them eat brioche.” Brioche is a rich, buttery bread that’s way more extravagant than what poor peasants would’ve been eating—it’s not exactly cake, but it doesn’t really change the supposed sentiment.

Not unless Marie was about to open the pantries of Versailles and personally distribute said brioche to the revolting peasants.

But the correct French phrasing does shed a little light on how it got popularized. The earliest known record of “Qu’ils mangent de la brioche” comes from philosopher Jean-Jacques Rousseau’s Confessions, penned in the 1760s. In Book Six, he wrote, translated from French: “At length, I recollected the thoughtless saying of a great princess, who, on being informed that the country people had no bread, replied, ‘Then let them eat brioche!’”

Which, by itself, doesn't mean that Marie couldn't have said it. That horrid movie about her back in the noughties kept mum on the subject, too. If I recall it correctly. I'm not about to watch that pile of merde again to check my memory.

So anyway, in honor of Bastille Day, there it is: the record (kind of) set straight.

July 13, 2022 at 12:04am
#1035132
Speaking of numbers (the musical definition this time)...

A Stanford Psychologist Says He’s Cracked the Code of One-Hit Wonders
What separates Blind Melon from Shania Twain?


"What separates Blind Melon from Shania Twain?" The US-Canadian border.

In September 1992, the band Blind Melon released their self-titled debut album. The record was mostly ignored until a music video for the song “No Rain,” featuring a girl in glasses dressed as a bumblebee, went berserk on MTV. The song rocketed up the Billboard Hot 100 charts.

You know, I'm obviously not opposed to the concept of music videos. Some of them can be quite clever. When that was all MTV did, it was pretty awesome—though I could only watch it at other people's houses because I always refused to get cable.

What I have a problem with is a song becoming popular because of the video. It should stand on its own. To be fair, "No Rain" did. Literally. But one thing I've learned is that other people don't appreciate music so much as they love dancing, spectacle, and a performer's brand.

I blame MTV for killing Cyndi Lauper's career, for example. Amazing musician. Really unconventional looks. What they used to call a "face for radio."

Two decades later, Rolling Stone named “No Rain” one of the biggest one-hit wonders of all time.

It never even occurred to me that the band didn't have other hits. I wasn't that into it. But at least I'd heard the song, as opposed to...

Soon after Blind Melon topped the charts, another artist had a breakout moment. Shania Twain released her second album, The Woman in Me, which included the No. 1 hit “Any Man of Mine.”

It's not that I dislike Shania Twain. I just never listened to her music. Wasn't on my radar; still isn't. It's not like her fellow Canadian, Celine Dion, whose music I actively despise.

Whatever the polar opposite of a one-hit wonder is, that’s what Shania Twain turned out to be. She became one of the most consistent hitmakers of her era, and the only female artist ever with three straight albums certified Diamond, meaning more than 10 million copies sold.

Gotta admit, that's impressive. But when it comes to country-adjacent music, give me Brandi ♥ Carlile any day.

For decades, psychologists have puzzled over the ingredients of creative popularity by studying music, because the medium offers literally millions of data points.

Snort. Be attractive and know how to dance. That's it. Musical talent is underappreciated. Hell, if you can put on a spectacle, these days, they'll even autotune your crappy voice.

Is the thing that separates one-hit wonders from consistent hitmakers luck, or talent, or some complex combination of factors? I did my best to summarize their work in my book, Hit Makers.

Oh hey, another ad for a book. Well, that's okay.

This month, the Stanford psychologist Justin Berg published a new paper on the topic and argued that the secret to creative success just happens to hinge on the difference between “No Rain” and Shania Twain.

"This month" apparently being this past April, based on the date of the article.

He used an algorithm developed by the company EchoNest to measure the songs’ sonic features, including key, tempo, and danceability.

Ugggghhhh. "Danceability" is bullshit.

This allowed him to quantify how similar a given hit is to the contemporary popular-music landscape (which he calls “novelty”), and the musical diversity of an artist’s body of work (“variety”).

Probably using applied mathematics.
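
Okay, cheap shot. If you're wondering what that actually looks like under the hood, here's my rough guess, and I stress guess, since I haven't seen Berg's actual code: treat each song as a vector of sonic features, score "novelty" as distance from the average of what was charting at the time, and score "variety" as the average spread within an artist's own catalog.

import numpy as np

# Toy sketch only; feature names and numbers are made up for illustration.
def novelty(song, contemporaries):
    # Distance from the average feature vector of songs charting at the
    # same time. Higher = less like everything else on the radio.
    return float(np.linalg.norm(np.asarray(song) - np.mean(contemporaries, axis=0)))

def variety(catalog):
    # Average pairwise distance between an artist's own songs.
    # Higher = a more musically diverse body of work.
    songs = [np.asarray(s) for s in catalog]
    pairs = [(a, b) for i, a in enumerate(songs) for b in songs[i + 1:]]
    return float(np.mean([np.linalg.norm(a - b) for a, b in pairs])) if pairs else 0.0

# Pretend features: [key, tempo, danceability], already scaled to 0-1.
charts_1992 = [[0.2, 0.60, 0.60], [0.5, 0.55, 0.50], [0.4, 0.65, 0.55]]
no_rain = [0.3, 0.90, 0.30]
print(novelty(no_rain, charts_1992))
print(variety([no_rain, [0.35, 0.85, 0.30], [0.30, 0.90, 0.35]]))

Feed something like that real Echo Nest-style features and you get numbers; whether the numbers mean anything is the part that takes a psychologist.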

Blind Melon’s “No Rain” rated extremely low on novelty in Berg’s research. Dreamy, guitar-driven soft rock wasn’t exactly innovative in 1992.

It hasn't been innovative since the Beatles, but at least it's got the potential for showcasing talent.

By contrast, Twain’s breakout hit rated high on novelty in Berg’s research.

I'm betting "novelty" has to be assessed based on historical context. Hoping, anyway.

Berg’s research also found that musical variety (as opposed to novelty) was useful for artists before they broke out. But down the line, variety wasn’t very useful, possibly because audience expectations are set by initial hits. “After the first hit, the research showed that it was good for artists to focus on what I call relatedness, or similarity of music,” he said. Nobody wants Bruce Springsteen to make a rap album.

Okay, if you're going to bring up Springsteen...

Bruce's first album, Greetings from Asbury Park, had variety in spades. Slow, fast, upbeat, depressing... it's all there, and all with his signature rhyming poetic lyrical style. Far as I can recall, it produced no hits... but Manfred Mann climbed the charts with their overproduced cover of "Blinded by the Light."

Bruce's second album, The Wild, The Innocent, and the E Street Shuffle, was an artistic miracle and a commercial flop.

His third album was Born to Run, and thus history was made.

As for "nobody" wants him to make a rap album, I wouldn't be so sure about that. Last I checked, he was the most prolific songwriter in the business, and has done just about every other genre in existence. I'd buy it.

This second finding about the benefits of early variety is similar to a model of creativity known as explore-exploit. The Northwestern University economist Dashun Wang has found that artists and scientists tend to have “hot streaks,” or tight clusters of highly successful work.

In the case of Einstein, this was absolutely true. 1905 was all Al, baby.
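
For anyone who hasn't met "explore-exploit" outside of creativity research: it's the same tradeoff computer scientists study as the multi-armed bandit problem, and the simplest strategy for it, epsilon-greedy, fits in a few lines. To be clear, this is a toy illustration of the idea, not anything from Wang's actual analysis.

import random

def epsilon_greedy(estimated_payoffs, epsilon=0.1):
    # With probability epsilon, explore: try something at random.
    # Otherwise, exploit: do more of whatever has paid off best so far.
    if random.random() < epsilon:
        return random.randrange(len(estimated_payoffs))
    return max(range(len(estimated_payoffs)), key=lambda i: estimated_payoffs[i])

# Early career: no idea what works yet, so explore a lot (big epsilon).
# After the breakout hit: crank epsilon down and keep doing what sells.
styles = ["folk", "arena rock", "rap"]
payoffs = [0.2, 0.9, 0.1]  # made-up "how well has this gone so far" scores
print(styles[epsilon_greedy(payoffs, epsilon=0.05)])

Which is more or less the hot-streak pattern: sample widely until something lands, then mine that vein.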

Berg’s and Wang’s research suggests three rules of thumb that may come in handy for creative work.

This is, of course, how the article is relevant here, where most of us are doing creative work of some sort. It's the last paragraph in the article, though, and to avoid lawyers, I'll suggest you go ahead and click through to find these "rules" if you're interested.

Instead, I'll leave you today with what I consider to be the greatest one-hit wonder of all time—from a band that went on to record some truly great, but completely obscure, songs. As a bonus, the lead singer, Paul Roberts, is also an accomplished graphic artist, and as far as I know, he painted all his own album covers.



Pick up your feet
You've got to move to the trick of the beat
There is no elite
Just take your place in the driver's seat

July 12, 2022 at 12:02am
#1035065
It's not very often that I talk about the plain meaning of this blog's title. I generally let the various puns on it rule here. "Complex" has several meanings, as does "numbers."

But today is one of those days. I know a lot of readers don't "get" math, but the video I'm linking today is more about history than it is about equations and such—though there are certainly equations used to illustrate the points. Periodically (pun intended), I go down a YouTube math/science video rabbit hole (though I have enough sense not to link a lot of them here), but I can't remember if this link was the result of one such journey, or if someone brought it to my attention.

Since it's a YouTube video, I'm linking it here as usual, but also embedding the video if you can't be arsed to click through.



Since it's a video, I can't do my usual quote-mining here, but to paraphrase what I feel are the important takeaways: Mathematics started out as a way to quantify the world. As such, things like negative numbers were anathema for a very long time (you can have three oranges, but you can't have negative three oranges). It was only when mathematicians let numbers be the abstractions that they actually are that math became a powerful tool for science, paradoxically allowing us to gain a much deeper understanding of reality.

Such abstractions shouldn't be so difficult to understand. As writers, we work in abstractions too—we call them metaphors, and as with math, they often reflect a deeper reality than the one in front of our noses.

This, by the way—the ability to think in metaphor—is, in my estimation, the thing that makes humans different from all other animals on Earth. It's why we're able to send rockets into space, some of them containing powerful space telescopes that search deep into space and time, giving us an even greater understanding of the universe.

(The link in the previous paragraph is to the BBC article talking about the first image released from the Webb telescope, just yesterday. I have some issues with the way the article describes things, but this entry isn't about that, and it's good enough to get a sense of what's going on. Also the picture is really damn awesome.)

So it turns out that so-called imaginary numbers and, by extension, complex numbers, really do describe reality at its most fundamental (so far as we've been able to determine) level.
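
And if "most fundamental level" sounds hand-wavy, the poster child is quantum mechanics: the Schrödinger equation, the one the whole theory runs on, has an i sitting right in it:

i ℏ ∂Ψ/∂t = Ĥ Ψ

That i isn't decoration; it's what makes the solutions behave like waves instead of like heat spreading through a pan, which is rather the point of quantum mechanics.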

A far cry from counting grains of wheat, for sure. But it all comes from the same mindspace.

The video quotes Freeman Dyson (yes, the guy who conceptualized the Dyson sphere): "...nature works with complex numbers and not with real numbers."

The phrase "real numbers" has a specific definition in mathematics, but the rest of us have a different conception of the meaning of "real." Using the layman's understanding of "real," well, all numbers are real, even the imaginary ones that don't seem, at first glance, to correspond to anything we can see or touch.

Or, alternatively, all numbers are metaphors, describing an abstraction of what we can see or touch.

Either way, math is just crazy good at describing the world we know of as "real." In many cases, such as the one in the video above, the math existed first, only to later find a use in science.

It's all real. Even the stuff we don't understand. Especially the stuff we don't understand.


© Copyright 2024 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.
