Native to the Americas, the turkey vulture (Cathartes aura) travels widely in search of sustenance. While usually foraging alone, it relies on other individuals of its species for companionship and mutual protection. Sometimes misunderstood, sometimes feared, sometimes shunned, it nevertheless performs an important role in the ecosystem.
This scavenger bird is a marvel of efficiency. Rather than expend energy flapping its wings, it locates rising columns of air and spirals within them to climb to greater heights. This behavior has been mistaken for opportunism, interpreted as if the bird were circling doomed terrestrial animals destined to be its next meal. In truth, the vulture takes advantage of these thermals to gain the altitude needed to glide longer distances, flying not out of necessity, but for the joy of it.
It also avoids the exertion necessary to capture live prey, preferring instead to feast upon that which is already dead. In this behavior, it resembles many humans.
It is not what most of us would consider to be a pretty bird. While its habits are often off-putting, or even disgusting, to members of more fastidious species, the turkey vulture helps to keep the environment from being clogged with detritus. Hence its Latin binomial, which translates to English as "golden purifier."
I rarely know where the winds will take me next, or what I might find there. The journey is the destination.
Oldest Son started dating a Chinese political refugee last year. She was kicked out of China for posting something critical of the government on the internet. Well, they didn't exactly kick her out, but they blocked her from being employed, renting an apartment, registering a car ... they essentially blocked her from life.
Soooooo ... she lives with us now. And, yes, she is a fourth child to me. Although her English is very good, there remains somewhat of a language barrier. There are cultural differences. Oh, and don't get me started on China's One Child policy that has created a person who has no self-awareness because she was doted on by her parents and grandparents and never told to consider others. (Like chewing with the mouth closed. Like leaving some of the fun food things for others. Like not giggling loudly in the middle of the night when others are trying to sleep.)
Argh.
Well, she loves my son. Or she loves having a free place to live. I don't care. Even with all my above gripes, she is good company. She's easygoing, funny, creative, and carries her own weight in the household. She has her own car, and she gives rides when the other cars are booked up.
And, and this is crucial, when I sit her down and explain life in a large-ish household, she immediately adapts and starts following the rules.
I like robots ... but according to all the movies and books ... they are going to be our downfall. They could really be a big help ... but our government is going to turn them toward war, etc.
Re #10: My first thought was of Data connecting with Locutus-Picard after he was rescued from the Borg. As for direct ad injection, of course it's going to happen, because it would be way better than subliminal advertising.
I have always hated being asked, "How are you?" Sadly, for most of my life, "Fine" would have been a lie.
I don't like to lie--even if it is to the checker at the grocery store who is routinely asking, "How are you?" to every customer.
Over the years, I have used various answers. One for a good day was "fair to middling". At other times, I have responded, "Crappy, but thanks for asking!" Cashiers usually don't know how to respond to that one.
A fella from church asked me, "How are you?" and I did my turn it around non-answer: "How are YOU?" to which he replied, "Short, fat, and ugly." That answer stopped me in my tracks.
Recently, I came up with an answer to "How are you?" that I love so much I tend to say it even when I'm feeling great: "I plead the fifth." Saying it always makes me smile--I think it is so clever!
Wow, this public comment could almost be a blog entry, eh? Unusual for me...
I've always liked the general idea of a "jury of your peers" giving you a fair trial. Then I served on a jury, and it was honestly kind of scary how people have to leave their fate in the hands of twelve individuals who, in many cases, struggle with basic deductive reasoning and critical thinking skills.
"The jury is made up of citizens selected basically at random from public records, such as voter registration lists."
I think it's from DMV records, because I (not a citizen) get jury summons every few years. I check a box that says that I'm a foreigner and they back off for another couple of years.
"A lot of people complain about jury duty, and go to great lengths (including committing perjury) to get out of it."
I made sure to stay a foreigner. As a mother of three young boys who were capable of getting me called away from whatever I was doing because they caused trouble at school, I didn't think I wanted to be involved in this judicial process. Now that they are adults with their own jury summons, I can't become a citizen because the government has started snatching and arresting people on the way to their swearing-in ceremonies.
I've been on a jury once. The only female. It was interesting and I enjoyed the experience. The only thing that irked me was the men kept asking if I understood! Like being female meant I didn't have a brain!
I haven't been chosen since because I answer their questions honestly. Most folks involved in jury selection don't like honesty.
Anyway it's all been interesting and a little entertaining.
Yeh, I had to pretend to look like an ass so that I wasn't selected when I had jury duty last year. I mean, I wasn't really pretending.
Granted, they let me know at 6pm the night before, and I was sick, and there was no way to ask to be excused on such short notice.
I'd like to do it for real one day, maybe. But also not. Maybe for an easy case. The case I would have been on was a murder case, and I'm not about that life.
I'm disappointed one of the answers wasn't "Nothing; it's the thought that counts." But as I've said, philosophers have no sense of humor. We have a different name for philosophers with a sense of humor: comedians.
But there was one answer that really got me thinking. I am sure it was meant as a joke, but you have to be careful joking with the philosophically minded. Because quite a few people said “purpose” or “meaning.”
Case in point. And yes, you do "have to be careful joking with the philosophically minded." Like little kids, they can be annoyingly literal. Source: me, who can be annoyingly literal.
In his book, Mortal Questions, [Thomas] Nagel has an entire essay devoted to the “absurd.” Absurdity — traditionally represented by Albert Camus — is the philosophical position that humans are caught in this dreadful existential disappointment: We are a meaning-seeking, meaning-needing species, and yet the Universe is meaningless. We’re wired to want a thing that the Universe cannot provide.
Which is not to say that this is right or wrong. Personally, I disagree with some of the premises there, but it's not about whether we agree or not.
Nagel, though, thinks that all this talk of “meaning” is a misguided fool’s errand. In his essay, Nagel argues that we can identify three different types of meaning-grasping angst in the philosophical literature, and all of them are logically flawed.
Kind of ironic, isn't it? To spend so much time arguing against something that, by your own definition, isn't meaningful? Sometimes I think the only true philosophers are the ones who don't find their meaning in publishing philosophical essays.
First, the Argument of Time
When we think “everything I’ve done will end in death” and “nothing will matter in a thousand years,” that alone is enough to push us into an existential crisis.
Or, as I like to put it, "There's no such thing as a happy ending, only stories that end too early."
So, imagine that on Christmas Day, you open a box containing a magic amulet that gives you immortality... Is this any more meaningful a life than the one you have now?
I think it's already been pretty well established that death is part of what gives life what meaning it has.
Second, the Argument of Size
But I'm assured that size doesn't matter.
Now, imagine you open a present that contains an elixir that makes you the size of the Universe... Would you now have any more purpose to your existence?
Purpose? No. Something else to do? Sure.
Third, the Argument of Use
What is the point of anything at all? We waste our lives trudging to jobs we hate, to talk with people we don’t like, to live in a town we want to leave, and aspire to a future that was never what we wanted.
Need a blankie?
Nagel’s point to all of this is that when we talk about “meaning,” we often talk about it as a question without an answer.
And? Isn't that what Zen koans are, except the questions are at least more poetic in nature?
But while Nagel argues that Camus’s scorn and defiance are a little bit dramatic, he does agree that the best approach to “questions of meaning” is to live ironically.
Okay, I can get behind that.
We need to commit to life seriously while knowing that it has no “meaning” beyond what it is.
But I can't get behind that. At least, not for the standard definition of "seriously." I can fulfill my obligations as a human without being all serious about them. In fact, if it's not obvious, my personal philosophy is that everything is, or can be, funny, so a sense of humor is far more important to me than a sense of meaning.
As I like to say, if you want meaning, grab a dictionary.
This is the SF equivalent of that carved bone thing from a couple of days ago.
After reading some of these, we might honestly head for the hills and start a new life amongst the trees.
And if you also read fantasy, you'd know why that's a bad idea.
15 Hybrid human-AI co-embodied intelligence
You know, this, or something like it, is quite literally the oldest theme in science fiction.
Researchers have begun deploying robots guided by AI to run chemistry experiments, handle materials, data analysis, and lab workflows.
I don't expect a whole lot from Cracked these days, but that sentence doesn't say "co-embodied intelligence" to me.
14 Lethal autonomous weapons
This is one of those things that was going to happen with or without SF.
12 We’re very close to mind-reading
Doubt.
A recent study reports that a new neurotechnology can now predict preconscious thoughts — i.e. indicating what a person is about to think before they consciously realize it.
Yeah, not exactly, and even the article notes that it would need to hold up under further scientific scrutiny.
10 Brain-computer interfaces
This is a more modern staple of SF, and it can, like most technology, be either good or evil. My own thought is that they'll immediately figure out how to project ads directly into our brains with it.
6 Social credit & behavioral scoring
Yeah, considering how well the actual credit scoring system works, you know that will not end well, even if you haven't seen Black Mirror.
3 Predictive policing algorithms
Isn't that, like, the heart of every techno-dystopia?
1 “Chemputation”
Speaking of dystopias, I was pretty sure that cutesy portmanteau was of "chemical" and "amputation." Nope, turns out it's about computation.
Much less dystopian.
Going to leave it at that for today; for some reason, I'm having to wrestle extra-hard with the text editor. There are, of course, more items at the link.
Because my schedule got disrupted with jury duty today, I'll talk about that instead of... you know, whatever the random number voices tell me I should talk about.
Given the responses to my notebook post this morning
(In reference to the gif I used, which includes "Justice will be served") Always Humble Poet PNG- 📓: I'll have mine with neeps & tatties, please.
And the next comment, Jeromée: nipps and tats for mine please
To which I feel obliged to say: I'd rather have nips and tits. Preferably someone else's.
Cray Cray ☮: I want to know more! We don't have jury duty here in Malaysia.
Okay, well, the short version is that the US Constitution entitles people accused of a crime to certain rights, among them being "presumed innocent until found guilty" and "trial by jury." The jury is made up of citizens selected basically at random from public records, such as voter registration lists. The theory behind it, as I understand it, is to have a suspect's fate determined by their peers: actual human beings and not lawyers. Partly this is because, most times, the law (and what a person actually did) is open to some interpretation; and sometimes, the law is utter bullshit.
The jury selection process, at least from my point of view this time, was: Go in, get assigned a number. Wait. Get called into the courtroom and listen to what the judge says. Get sworn in with questions like "Are you a US citizen and a resident of (whatever)?"
Then, out of the 40 people who showed up, they selected 20 at random. I mean, like, they put numbers in a bag and pulled them out one at a time while joking about getting a bingo machine (I don't expect judges to have a sense of humor, but this one did). Those 20 (of which I was not one) go through a striking process, where attorneys for the prosecution and defense can disqualify a juror, usually based on their answers to other questions. For example, an attorney in an illegal drug case might ask, "Do you have a family member or friend with a drug problem?" If a juror says yes, that indicates they might have a bias.
This wasn't one of those cases, but that's the idea. Obviously it's not possible to eliminate all bias, but they do try to ensure fairness. In the US, the jury determines, based on the evidence provided by the prosecution (and possibly the defense), whether the defendant is guilty of the particular crime or not. I'm not sure about this part, but it's my understanding that a judge can overrule a jury's guilty verdict, but not their not-guilty verdict.
Anyway, striking reduces those 20 down to 12. Or maybe fewer, in which case they pull replacements from the remaining 20, which is why we had to sit around and wait. Yeah, lots of sitting around and waiting happened.
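Since I compared it to bingo: the whole thing is basically sampling without replacement, twice. Here's a toy simulation of what I watched, with the strike count invented for illustration (the real number varies by jurisdiction and case):

```python
import random

# Toy model of the jury selection I watched: 40 people show up, 20 get
# drawn bingo-style from the pool, then attorney strikes whittle the
# panel down to 12. The "8 strikes" figure is made up for this sketch.

pool = list(range(1, 41))          # juror badge numbers
panel = random.sample(pool, 20)    # the numbers pulled from the bag

strikes = set(random.sample(panel, 8))  # prosecution + defense strikes
jury = [n for n in panel if n not in strikes]

print(f"seated jury of {len(jury)}: {sorted(jury)}")
```

Not that the courthouse runs Python, mind you. The bag of numbers works fine.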
Bottom line is I got through it without them calling me, so I didn't get to see the courtroom thing play out. On the plus side, I got sent home a lot earlier than the actual selected jurors. On the minus side, there's a superflu going around and I was one of like three people who bothered with a mask. So if I die in the next couple of weeks, that's why.
B here is referring to a post I made in her rant forum: "Jury Rigged" , in which I sought fashion advice.
You'll be disappointed to know that all I did was put on a nice warm shirt over my T-shirt, the kind with a collar and buttons up the front, in a solid color, and wore actual shoes with socks. Oh, and I did remember to wear jeans as well. The only people there who were dressed more formally were one young potential juror who wasn't old enough to stop giving shits, and the attorneys and defendant. And the judge, presumably, under the traditional black robe, though for all I know he had his dick hanging out under there.
🌝 HuntersMoon: At least you're on the right side of the jury box...
My wife says it is because I always suggest submerging the accused in a frigid stream. The innocent will float to the top.
Or is it the guilty ... I get confused with that part.
And that is an example of why a juror might get struck off the list.
One final note: A lot of people complain about jury duty, and go to great lengths (including committing perjury) to get out of it. That ain't me. Maybe if I had to do it more than about once every decade, it would become burdensome, but as it is, it's an opportunity to learn something and contribute to one of the few remaining principles that we have as a country.
What's a US philosopher? One who thinks before pulling the trigger?
Can one individual truly change the world?
I've had my doubts about this for a while.
US philosophers Michael Brownstein, Alex Madva and Daniel Kelly believe individuals can make a difference.
And all you have to do is buy our book!
No, seriously, they want you to buy their book. The article says so.
The authors aim to show readers how certain personal choices can alter the “structures” and “systems” that govern the myriad decisions we make, usually quite passively.
Um, that sounds like just another inspirational "self-help" book.
Written for a general audience, the book captures the gist of their argument in the words of US environmental activist Bill McKibben. “The most important thing an individual can do,” he once said, “is become somewhat less of an individual.”
Pithy and all, but I think it undermines the idea that one person can change the world, at least by themselves. You might have a good idea or a brilliant invention, but it doesn't do any good if other people don't adopt it.
As for changing the world for the worse, well, I think it's trivial to say that yes, for the worse, one individual can make a difference.
This book is timely. In the Anglosphere, and further afield, many people are unhappy.
Ah, yes, the essence of advertising: You're missing a piece. You're unhappy. I can help, for a price.
One response has been a widespread loss of belief that joining an established political party or even voting in an election can achieve change.
As they say, if voting could actually change things, it would be illegal. (Yes, I do it anyway.)
The authors are strong critics of the “self-responsibilisation” that big fossil fuel, tobacco, betting and other companies have foisted on people to head off real systemic change.
Now, on that, I can tentatively agree. I've been saying for years that it shouldn't all be on us.
They advocate an approach in which individuals focus on those activities most likely to trigger other people into changing entrenched structures.
That, at least, has potential.
In many respects, the book is inspiring. The examples show how some ordinary people can become change agents without, metaphorically, having to climb Mount Everest.
And that's cool and all, but, first of all, that sounds like those business success stories you always hear: Ordinary people, hard work, grit, determination, getting up at 4am, blah, blah... sure, but what about the vastly larger number of people who did all that and still failed?
Second of all, one of my favorite memes is the one that goes "Every corpse on Mount Everest was once a highly motivated person."
Anyway, there's a lot more at the article which, despite being a blatant book ad, is an interesting read.
From Nautilus, one of those questions whose answers we're never going to agree on.
What Is Intelligence? At a church in Italy, we sought to shed an old definition for one that could save us
"Save us?" Okay, clickbait. Tell me how it's going to "save us" to define a word.
We were in the Tuscan countryside on an impossibly green hilltop, nothing but sheep bleating in the distance, and the creak of iron gates, flanked by carved stone lions, at the end of a gravel drive lined with Italian cypress trees.
Okay, now you're just bragging.
Gleiser fixed up the 500-year-old chapel with a dream of turning it into a think tank and named it the Island of Knowledge.
There is something immensely satisfying about turning a church into a place where knowledge is sought, not repressed.
We were here to come up with a new definition of intelligence. The old one, according to Gleiser, won’t do. “We have an ideology of infinite growth on a finite planet,” he said.
And? I've been saying this for years, and so have others, and yet no one with the power to do anything about it has ever done anything about it. I guess except maybe once, when China had a one-child policy, but they abandoned that because people are gonna people no matter what.
“That’s obviously not sustainable. What kind of intelligence are we using to create this scenario? That keeps me up at night.”
Maybe if you were intelligent, you'd know it was out of your hands and get some better sleep.
To expand the definition of intelligence, Gleiser brought together cognitive neuroscientist Peter Tse; astrophysicist Adam Frank; evolutionary ecologist Monica Gagliano; philosopher Evan Thompson; technology critic and essayist Meghan O’Gieblyn; and Indigenous scholar Yuria Celidwen.
Kind of like one of those carefully diverse superhero teams, I guess.
Celidwen handed us each a dried leaf, which she produced from a small pouch, then told us to taste it. “Let it explore your palate,” she said. I pretended to comply but palmed mine, wondering what it would be like to be the kind of person who puts a strange thing in their mouth just because someone tells them to.
I think I'm beginning to better understand "intelligence."
And, you know, so much for turning a church into a place to explore knowledge.
This was not going to be a typical scientific conference. Which I suppose made sense when you’re trying to overhaul typical scientific ideas. Poems would be recited. Tears would be shed. We weren’t allowed to wear shoes.
There's a big part of my psyche that is forever salty that I was born too late to experience the sixties in all their glory. But then I see something like this and go, "Nah."
Intelligence is usually understood as the ability to use reason to solve problems, skillfully wielding knowledge to achieve particular ends.
Crucial point here: Intelligence is not the same thing as knowledge.
In 1949, at Manchester University, a computer scientist, a chemist, a philosopher, a zoologist, a neurophysiologist, and a mathematician got together to debate whether intelligence could ever be instantiated in machines.
In a bar? Please tell me it was in a bar. Or, wait. Manchester: Pub. Whatever.
One of the participants, Alan Turing, inspired by the discussion, went home and wrote up his “imitation game,” now known as the Turing test, where a machine is dubbed intelligent if, through text conversation alone, it can fool us into thinking it’s human.
It is funny how no one talks about the Turing test anymore. Thing is, I have met scadoodles of humans who could not pass the Turing test. I figure it's likely that the concept inspired PKD to come up with the fictional Voight-Kampff test to tell replicants from humans in the story that became Blade Runner.
Thing is, from what little we know about the VK test, Dick seems to have been more focused on emotion than on intelligence, which, again, I suspect many humans (e.g. sociopaths and some of the neurodiverse) wouldn't pass, either.
Seventy-five years later, we’ve got chatbots acing the Turing test, and science conceiving of brains as Turing machines. Is it possible we’re missing something?
Of course we're missing something. I'm just not sure that "something" is hippie crap.
Inside the church, I could feel Gleiser’s urgency as he launched the discussion. Could the world agree on a new definition of intelligence before our collective stupidity destroys us?
It's not our stupidity that will destroy us. Lots of animals are stupid, by at least some definition, and most of them don't show any signs of wanting to destroy the world. And intelligence can be used for positive or negative things, and anything in between. No, if we destroy ourselves, it won't be a matter of intelligence, by any plausible definition, but shortsightedness. And maybe a little game theory: the first person or group who deliberately puts themselves at a disadvantage will be overrun by the groups that don't.
When nothing matters, nothing is a problem. Nothing means anything. “People call large language models ‘stochastic parrots,’ ” Thompson said. “But I think it’s insulting to parrots.”
Congratulations; you have reinvented nihilism.
There's quite a bit more at the link, though I wonder at the intelligence of trying to redefine an old word instead of coming up with and defining a new concept that might do a better job convincing the general public as well as those who might actually be able to do something about the problems. Despite my snark above, and a lingering doubt about their methods, I gotta give them props for trying to do something. And, if nothing else, at least they got to go to a retreat in fucking Tuscany.
It’s easy to forget that new words, just like everything else that comes in and out of fashion, are being coined all the time.
One of the things I'm most salty about in life, and no I will not get over it, is that no one recognizes that I coined the word "rad."
Well, not the word, but the meaning, as in "that shit's totally rad, bro."
Now, it's entirely possible that someone else came up with it independently. Still, it stings to never get the credit I deserved.
...as best as their research can tell us, all the words and phrases in this list were coined precisely 50 years ago, in 1976.
I have my doubts. Still, it's probably a decent look at what words/phrases hit the mainstream in '76, which, for context, is the year Jimmy Carter was elected President.
As usual, I'm only going to cover a few of these.
Athleisure
The likes of tracksuits, spandex, and sneakers began to step out of the gym and into everyday fashion in the 1970s, leading to an athleisure trend that has continued to grow ever since.
I haven't heard or seen that word, outside of this article, in some time, so I don't know if it's still relevant. What is relevant is that it started, or perhaps elevated, a trend of stupid fucking portmanteaux that all need to die.
Butterfly Effect
The popular metaphor of the tiny flap of a butterfly’s wings sparking an eventually large-scale chain reaction has been discussed since the early 1970s at least. But both Merriam-Webster and the Oxford English Dictionary have traced the very earliest written record of the butterfly effect to an article published in the scientific journal Nature in 1976.
Okay, maybe. That one's related to chaos theory: the idea that a butterfly flapping its wings in the Amazon can, due to the way chaos works, lead to a typhoon in the Pacific. Not that such a thing can be predicted or controlled; that's why it's called chaos. But I'm pretty sure chaos theory was introduced in the early 60s, and even before that, there was Bradbury's "A Sound of Thunder," which speculated on large-scale changes in time thanks to some idiot stepping on a butterfly after time-traveling to the distant past.
That's not an obscure SF short story, either; it's one of the acknowledged all-time greats.
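And since I brought up chaos theory: if you want to see what "sensitive dependence on initial conditions" actually looks like, here's a minimal sketch using the logistic map, a standard chaos-theory toy model (my own illustration; nothing from the article):

```python
# The butterfly effect in miniature: the logistic map x -> r*x*(1-x)
# in its chaotic regime (r = 4). Two starting points that differ by
# one part in a billion end up nowhere near each other within about
# fifty iterations.

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

a, b = 0.2, 0.2 + 1e-9  # two nearly identical "butterflies"
for step in range(1, 51):
    a, b = logistic(a), logistic(b)
    if step % 10 == 0:
        print(f"step {step:2d}: a = {a:.6f}, b = {b:.6f}, diff = {abs(a - b):.6f}")
```

Every step is computed exactly; it's the starting measurement you can never pin down precisely enough. That's why nobody will ever forecast a typhoon from a butterfly.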
Couch Potato
TV played such a big part in 1970s homelife that this was the era when the couch potato was born. Defined by Merriam-Webster as “a lazy and inactive person, especially one who spends a great deal of time watching television,” etymologically, the term might just allude to the dormancy of potatoes below ground.
In the late 70s / early 80s, I proposed alternatives to this one: Bench Fry, and Sofa Spud. Neither of those caught on.
French Press
The very first kettle- or pitcher-like devices for brewing loose coffee, which can then be pushed to the bottom of the vessel using a metal plunger, were supposedly developed in France in the mid-1800s.
You know, with the exception of "fries," I can't think offhand of any phrases that start with "French" that aren't inherently sexual. Okay, maybe fries, too.
Meme
Richard Dawkins coined the word meme as “a unit of cultural transmission” in his groundbreaking book exploring gene-centered evolution, The Selfish Gene, in 1976.
Lest we forget what that word is supposed to mean. That its popular definition has changed since then is one of the finest examples of irony that I know.
Radical
Derived from the Latin word for a plant’s root, radical first emerged in medieval English in its original and literal sense, to refer to anything growing or deriving from a root, and therefore vital or essential to life or survival.
And I'm the one who shortened it, dammit!
Wuss
No one is entirely sure where the word wuss comes from...
Oh come on. As the article suggests, it's a combination of "wimp" and "pussy." My guess is someone started to say "wimp" but his (it was almost certainly "his") tiny brain tried to change it to "pussy" halfway, and behold, a new insult was born.
I don't know why "pussy" got associated with wimpiness in the first place, anyway. Those things evolved to be tough. Their male counterparts are far more fragile.
And I do live in pawpaw country. Or I did. They grew wild in the woods where I spent my childhood.
As the weather got colder last week, I decided it was the perfect time to make pawpaw ice cream.
I was wondering why there's a December article about ice cream in the northern hemisphere, but really, ice cream is a forever treat.
I tested several recipes I thought would work well with the fruit’s flavor—a mix of banana, mango, and durian.
None of which, I must emphasize, are native to Virginia. Not by a long shot. Hell, durian grows on damn near the opposite side of the planet. (I'll refrain from making durian jokes this time after a faux pas in the newsfeed yesterday.)
Flavor is flavor, though. My dad always called them "Virginia bananas."
I chose a simple ice cream recipe, a mixture of pawpaw puree, sugar, cream, and milk.
Unfortunately, Dad never figured out how to prepare pawpaw, and my mother refused to. Just as well, considering her other attempts at cooking. She tried, she really did, but just never got the hang of it.
Since pawpaws are notoriously difficult to cultivate, foraging is the best way to obtain a large amount.
But that's work.
On the other hand, it's probably cheap, or even, in my case, free.
The author apparently lives in New York, and honestly, I didn't know pawpaws ranged all the way up there. Nice to learn new things.
Making ice cream is, of course, also work, so I won't be doing it. Still, it's nice to know that this relatively obscure wild fruit, connected to my personal history in some small way, is getting the respect it deserves.
The headline says the article is about misinformation. And then it goes calling people from ancient Greece and Rome "scientists," when that concept wasn't really a thing until something like the 1500s, and the term itself wasn't coined until the 1800s (because some chick was doing science, and up until that point, people who did that were called "men of science").
It's a little bit like calling an abacus a calculator: kind of true, but misleading.
Oh, and who the hell dismisses ancient, um, natural philosophers? People quote Aristotle to this day, and Archimedes is practically a god to engineers.
Greek philosopher Thales of Miletus, often described as the West’s first scientist, believed the whole Earth was suspended on water.
If you go by "belief," that's not science.
Roman encyclopaedist Pliny the Elder recommended entrails, chicken brains, and mice cut in two as topical remedies for snakebite.
And? Newton believed in astrology and alchemy. That doesn't make his actual scientific (and mathematical) insights any less profound. The whole point of science is to cut through the bullshit. I feel like this article is verging on anti-science propaganda.
The lone ancient Greek thinker who believed Earth orbits the Sun – Aristarchus of Samos – was universally dismissed by his contemporaries.
I want to try to be crystal-clear here, so bear with me:
The most commonly quoted example of "people who were mocked and/or dismissed by their peers but were later shown to be right" is probably Galileo. But there's a real danger here of falling into a trap: there's a common trope both in fiction and in real life to invoke the name of Galileo when someone has an idea that's being mocked and/or dismissed, the implication being that "Well, Galileo turned out to be right, and one day, my theory about birds flying due to phlogiston radiation will turn out to be right, and then they'll see. Everyone will see!"
In other words, yes, sometimes you'll turn out to be right. Sometimes that's by accident. Sometimes it's because you did actual observation, testing, etc. If you're wrong, that's quickly forgotten. It's like when a fortune-teller tells you "you will find love soon," and then you find love, and you're like "Wow! She was right!" ignoring the fact that love is pretty thick on the ground and you were going to find it anyway if you bothered to look.
However, thinkers 2,500 years ago already faced many problems that are today amplified by social media and artificial intelligence (AI), such as how to tell truth from fiction.
That's probably because social media and AI are both products of humanity, merely scaled up in speed and volume.
But hey, I could be wrong.
Here are five lessons from ancient Greek and Roman science that ring surprisingly true in the face of misinformation in the modern world.
While I still cringe at "science" and "scientist" being applied to that time period, perhaps the overall thrust of the article is useful. Let's see:
1. Start with observations
Almost every ancient scientific text offers advice about observing or collecting data before making a decision.
Okay. I'm with you so far. I just said observation is important. It's not enough, of course, but it's a first step. It seems almost tautological, though: how can you draw any conclusions, right or wrong, if you don't make some sort of observation first?
2. Think critically
Ancient scientists insisted their readers think critically, encouraging us to analyse the claims made by other people.
I can't dismiss the importance of thinking critically. Two problems, though: 1) It's often easier said than done, and, like any skill, requires training and practice; and 2) It is absolutely possible to build an entire skyscraper of thought that rests on a swampy foundation.
Ancient scientists encourage us to think critically about information we read or hear, because even well-meaning sources are not always accurate.
For instance, calling natural philosophers "scientists."
Though, to be fair, I'm willing to accept that this is a categorization issue, not a misinformation one. Like calling the Blue Ridge Mountains "mountains," which people from the Rockies always have a good laugh about.
3. Acknowledge what you don’t know
Another skill ancient scientists encourage is acknowledging our limits. Even Greek and Roman scientists who claimed to be experts in their field frequently admitted they didn’t have all the answers.
Fair enough. I've said for a long time that this is important. There exist, however, numerous people who believe, against all evidence, that it's better to be confident and wrong than skeptical and right.
4. Science is part of culture
Ancient thinkers understood that science was part of culture rather than separate from it, and that an individual’s beliefs and values will have a significant impact on the information they promote as “factual” or “truthful”.
I'd be really, really careful with this one. It can lead to beliefs like "science claims to have falsified this aspect of traditional medicine, but that's because it's from a different culture," and the ultimate dismissal of science as "western imperialism" or whatever.
5. Science is for everyone
Ancient scientists understood the importance of deferring to specialists and listening to expert advice. However, they were also keen for their readers to understand where scientists acquire knowledge and how scientific facts can be verified.
I agree with the "science is for everyone" bit. Ever diagnosed a problem with your computer, or lawnmower, or whatever? Then you've basically done science. Observe, hypothesize, test, etc.
And it's important to "understand where scientists acquire knowledge and how scientific facts can be verified." The problem is, there's way more science out there than any one person can possibly verify, and it's truly impossible to verify everything yourself. You can verify some things, and share that knowledge, but the people you share it with should think critically, too, thus compounding the problem.
Oh, you don't say? You mean people don't care about things until it affects them personally? Why, I never noticed that! Except I did.
"That disease is affecting those people, so, so what?" / "That disease killed my child! I'm going to start a crusade!"
"That country is being invaded? Sucks to be them." / "My country is being invaded! Send help!"
It's human nature, and a well-documented aspect of human nature. Thing is, there's lots of things that are human nature that we try to rise above, like punching someone who looks at you funny, and this should be one of them.
That said, please keep in mind that the tag on this article is "Opinion."
Recognizing that climate change is immediate, close, and affecting people's way of life is one of the key messages we need to communicate to spur them to act.
I believe that there are people who will sit on their low-lying island until it's inundated by rising sea levels, convinced that it couldn't possibly be due to anthropogenic climate change, because their favorite echo chamber said it's not.
But in order to meaningfully limit warming, we need to enact policies that will alter the lives of billions of people.
That always goes over so well. Besides, it would take a dictator. A benevolent dictator, not the one we've got.
And this needs to begin with individual action — getting people to care enough to alter their behavior around climate change.
And with that, you've lost me.
There are times for individual action, sure. But this is one of those times when the levers are moved by much bigger entities.
Also... I remember when Wonder Woman 1984 came out. Big letdown after the first WW movie, but that's not my point here. It was released in the depths of the COVID crisis, when smart people were like "Masks and social distancing" and the toddlers were like *stomp* "NO!"
So the central plot point of that movie (which I am now spoiling, but you weren't going to watch it anyway) is that everything can be fixed if everyone would just, you know, listen to Diana and think a certain thought at a certain time, like in Horton Hears a Who.
It's a nice thought. But by the time the movie was released, it was plain as the screen in front of you right now that it would never, ever, happen, for any reason, not even if the chick suggesting it was incredibly hot. And in this case, if you can't get everyone on board, and you also can't get the people in power to move their levers, then there's no fucking point in me switching to paper straws or otherwise inconveniencing myself in the slightest.
We recruited more than 3,000 participants across six countries to see what would make them more or less motivated to help climate causes. Pro-environmental actions are often costly, incurring financial, time, and physical costs.
Then there's my inherent laziness.
The findings confirm some things that we know to be true about human behavior. It's the same reason why people have a greater connection to news that is local to their area, or to their interests. When it's personal, when it's close, when it affects our usual way of life, it lands.
Like I said.
When rising water levels increase the risk that our property is going to be flooded (because events that were previously likely to happen once in 100 years are increasingly common), protecting our way of life requires us to take action, rather than do nothing.
I'm not going to get into the math here, but that's not what a 100-year flood is. I get that the name is confusing, leading people to think it is. A 100-year event is one that, based on prior observations, has a 1 in 100 chance of occurring in any given year. To understand the difference requires some knowledge of probability and statistics, which I'd expect any scientist to have. But it also means you have to have learned some basic hydrology lingo, which, in fairness, neuroscientists rarely do.
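Okay, fine, a tiny bit of math after all, because the distinction fits in a few lines (my own back-of-the-envelope, not anything from the article):

```python
# A "100-year flood" has a 1-in-100 chance of occurring in ANY given
# year; it is not scheduled once per century. Assuming independent
# years, the chance of at least one over a 30-year mortgage:

p_annual = 0.01
years = 30
p_at_least_one = 1 - (1 - p_annual) ** years
print(f"{p_at_least_one:.1%}")  # about 26.0%
```

So even with stable statistics, that's better than a 1-in-4 chance over a typical mortgage. If the events start showing up more often than that math predicts, that's when you look at the climate signal.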
Which is not to say that extreme storm events aren't increasing in frequency and intensity, as generally predicted by climate models, but it's rare that I get to be pedantic about something I've actually gotten a degree in.
We know addressing climate change will require systemic change from governments and business. But we need to start somewhere, and getting people to see the changes happening around them may just be a small step that leads to major shifts.
So in general, I'm in agreement with the article on that. What we seem to disagree on (and I admit I'm probably the one in the wrong here) is whether it's psychologically possible to get enough people on the sinking boat to start bailing.
Some people, I've found, mispronounce things. My ex, for example, pronounced "picturesque" as "picture-skew," and I thought it was so cute that I never said anything. To this day, over a quarter-century later, I have no idea if she was doing it on purpose or not, the way I pronounce "homeowner" as "ho-meow-ner."
I never rag on anyone for this, though. It means they read, but they just interpreted the word as being pronounced differently. Same thing happened to me when I first read about quinoa. I pronounced it like "Genoa," and got laughed at. Bastards. It's not like we're born knowing these things.
What I do admit to sometimes looking down my nose upon, though, is the reverse: people who hear things and then proceed to write them down wrong. A common example is when someone writes something like, "John was unphased." It's supposed to be "unfazed." Read more books.
An eggcorn is a mistaken word or phrase that makes almost as much sense as the correct version. The term eggcorn was coined by linguist Geoff Pullum in 2003 as a nod to people’s habit of mistaking the word acorn for eggcorn.
And yet, no one uses words that I coined, except me. In fairness, I'm not a linguistics professor.
Feasible arguments are a big element of eggcorns: There’s no overlord deciding which language errors are logical enough to be official eggcorns and which ones are just plain mistakes.
I know a few people who, given the chance, would absolutely sign up to be that overlord, the Chief of the Language Police. Hell, I'm one of them.
Which is not to say I never make mistakes, of course.
I'm not going to note all of them here.
For All Intents and Purposes vs. For All Intensive Purposes
The former phrase is the correct one, as the article explains. But that's a phrase I like to have fun with, too, pronouncing or spelling it as "For all intensive porpoises."
Coleslaw vs. Cold Slaw
Absolutely did this one when I was a kid. But when kids do it, it's kind of cute. Well, except me. I was never "cute."
On Tenterhooks vs. On Tenderhooks
"On tenterhooks" was one of my dad's favorite phrases. As in, "You're finally home. Your mom was on tenterhooks." (Actually, she was cool with it. My dad was the one who was worried about what Teen Me was getting up to at 2am, most likely for liability reasons.)
Happy as a Clam vs. Happy as a Clown
Honestly, I can absolutely see this one. As the article notes, the correct version (the first one) requires some knowledge of context, because, apart from that phrase itself, no one has ever considered a clam to be happy; contrariwise, clowns often have the smiles painted right on their faces.
Deep-Seated vs. Deep-Seeded
Also, this one. Having done more than my share of planting as a kid (before I started coming home at 2am), either can work, which I guess is the point of an eggcorn.
Hair’s Breadth vs. Hare’s Breath
Not so much this one. As thin as a hair might be, it still has some physicality to it, unlike "breath," which, like "wind," describes something usually unseen.
Make Ends Meet vs. Make Ends Meat
And the "meet/meat" homonym strikes again. Personally, I find the wrong one ("meat") to be hilarious.
Again, more at the link. Just remember: these phrases are pretty much all clichés, so we shouldn't be using them much anyway. They still have a place in written dialogue, though, so it pays to get them right, or at least get them wrong on porpoise.
New research links this good habit to slower aging, stronger brain health, and a longer lifespan. Doctors share what makes the biggest difference.
Okay, but, bear with me here: what's the point of living longer if you're spending all that extra time asleep?
The old saying “I’ll sleep when I’m dead” is deeply flawed.
I'm not sure why that sentence elicits what is, for me, a strong anger reaction. Maybe if I were in therapy, I could figure it out, but I'm too depressed to expend the effort required to look for a shrink, so that's not going to happen.
I suspect it's the Puritanical thrust of the saying: that sleep is some sort of optional thing, something that gets in the way of Holy Productivity. The smug "I'm strong and you are weak" implication.
In any case, it pisses me off when I hear it. Or see it.
It’s not just your diet and exercise habits that are linked with your lifespan. You should also be prioritizing getting enough high-quality sleep on a nightly basis.
Of those three things, guess which one I don't fail at. Go ahead. Take a wild guess.
Getting too little sleep is linked with chronic and sometimes life-threatening health concerns, including high blood pressure, stroke, heart disease, and kidney disease, according to the National Heart, Lung, and Blood Institute.
I'd like to hear from a source that's not heavily biased toward hearts, lungs, and blood.
And it’s not just about quantity: Disturbances in your sleep quality, such as waking up a lot during the night, are tied to a number of signs of genetic aging, according to a 2022 study...
Okay, okay, being serious for a moment, once more for the delinquents in the back row: correlation isn't causation. I didn't look at the study, but it seems to me that aging is a cause of sleep disturbances, not the other way around.
All that said, it’s easier said than done to actually find the time to get more sleep.
Not for me! I'm retired. And what's another definition of "retire?" To go to sleep. (In case it's not obvious, I'm done being serious for now.)
1. Put Down the Phone
Or tablet or laptop. And turn off the TV. The light from these screens suppresses your body’s natural production of the hormone melatonin, which would otherwise help you fall asleep, says Dr. Scott Rosenberg...
Great Scott!
Presumably, the Paperwhite Kindles are just fine.
In a perfect world, you’d power down electronics at least an hour and preferably two before bed, says Dr. Aatif Mairaj Husain...
Yeah, we don't live in a perfect world, do we?
2. Create a Bedtime Routine
a) brush teeth
b) take pants off
c) turn off light
d) move cats to edge of bed
That's a routine, right?
3. Prep Your Bedroom
Cool, that's covered under (d) above, right?
Your bedroom should be conducive to your best night’s sleep. Sleep experts often recommend your room be cool, dark, and quiet.
Gosh. I never would have thought of those things. Being in the dark and silence helps you sleep? Huh. You learn something new every day.
“The ideal temperature for sleep should be in the mid-sixties,” Dr. Rosenberg says.
Aw, hell to the power of no.
Anecdote: One time, the power went out and my generator chose that moment to get borked, so I was without power for several hours in the dead of winter. My furnace is gas, but the fans and starters run on electricity, so no heat. I put on extra clothes, piled on extra blankets, and tried to sleep, but it was just too frackin' cold. I lay there, shivering in the darkness, trying to will my body to produce enough heat to be trapped by said clothing and blankets. The night stretched on, one of those absolutely frigid January nights with a weakened polar vortex. I lay awake, certain that the inside temperature was creeping inexorably toward the outside temperature, because I still remember some thermodynamics from college.
About, oh, 2am or some such (which is usually before my bedtime, but not when I can't use computers or have enough light to read by), the light in my room came on. I crawled out of my freezing covers to check the thermostat, which helpfully also reports the interior and exterior temperatures. Exterior: I don't remember exactly, but it was like 20F, which isn't too out of the ordinary for my area at night in January.
Inside temperature? 62F.
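For the record, the thermodynamics I was half-remembering is Newton's law of cooling: an unheated house decays exponentially toward the outside temperature instead of dropping straight to it. A quick sketch, with numbers I made up for illustration (the time constant depends entirely on your insulation):

```python
import math

# Newton's law of cooling: T(t) = T_out + (T0 - T_out) * exp(-t / tau)
# Hypothetical numbers: house starts at 70F, it's 20F outside, and the
# house has a thermal time constant (tau) of 20 hours.

T_out, T0, tau = 20.0, 70.0, 20.0
for hours in (0, 2, 4, 8):
    T = T_out + (T0 - T_out) * math.exp(-hours / tau)
    print(f"after {hours} h: {T:.1f}F")
```

Which explains the 62F reading: the house closes the gap quickly at first, then more and more slowly, and never quite gets all the way to the outside temperature.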
Yeah, don't talk to me about sleeping in the mid-sixties Freedom units. I have my thermostat set to keep the house at 74F, and I still sometimes need extra blankets.
In summary, not everyone has the same temperature requirements (newsflash!).
4. Watch What You Eat and Drink
Oh, I do. I watch it go straight down my gullet.
Avoid big meals within a few hours of bed.
Get out of here with that. Big meals give me better dreams, and better dreams means a more creative Waltz.
Be mindful of water intake.
Coke Zero count?
Cap caffeine in the early afternoon.
Guess not.
Limit alcohol in the evenings.
Oh, I already do that. Well, most of the time. Okay, sometimes. My serious drinking is done before sunset.
(I don't believe in the term "day drinking." There is only "drinking.")
5. Move More During the Day
Ah... I see this publication has succumbed to the propaganda of Big Exercise.
Wait, it's called Outside. That's probably part of their mission statement.
6. Talk to a Doctor
Ha. Hahaha. HAHAHA! Now you're the one being not-serious. I live in the US, dummy. You should have known that when you started quoting temperatures in F. Or is Liberia your target audience? Point is... "Talk to a doctor." And here I thought *I* was the comedian.
Think about it this way: If you’re waking up to an alarm rather than waking up on your own, you’re probably depriving yourself of at least some sleep, and “ultimately, it’s going to catch up to you,” he says.
The irony of Outside magazine giving tips and pointers for staying Inside is not lost on me.
Okay, fine, another moment of semi-seriousness: I missed out on a lot of sleep back when I was working. Not out of that Puritanical work ethic, hell no, but just because I was trying to fit in work, leisure, and sleep, and of those three, sleep is the one that lost out.
Ended up having a heart attack. That was 12 years ago, and I haven't had a heart attack since. Nor have I worked. See? I, too, can ignore the correlation vs. causation thing.
I'm still wrestling with formatting issues related to the new editor. It's still in beta, so that's not unexpected. Just a reminder that things might continue to be a bit inconsistent here until some wrinkles get ironed out.
I think I stole today's link from Elisa, Snowman Stik, because it's not the sort of thing I usually cover here. Well, at least not concerning YouTube, which is one of the few "social media" sites I actually visit sometimes. From ZDNET:
There are, I shouldn't have to mention but I will, many scams on YouTube. Most of them seem to be just your basic bullshit videos, spreading mis- and dis-information, promoting pseudoscience, or trolling, like trying some modern twist on the old "stick your iPhone in the microwave to charge it" thing.
A malicious network of videos hosted on YouTube has been discovered by researchers who branded it "one of the largest malware operations seen on YouTube."
Malware is a different beast, though. Instead of hacking your brain, it hacks your device. I'm not sure that this isn't worse.
On Thursday, Check Point researchers published a report that revealed the scam, dubbed the YouTube Ghost Network, which they tracked for over a year.
By "Thursday," based on the article's datestamp, they mean way back in October.
The YouTube Ghost Network has likely been active since 2021, with videos posted consistently over the years -- until 2025, when the number of videos tripled.
Spooky.
(Ghost? Spooky? October? Come on, I work hard at these.)
Over 3,000 YouTube videos, described as part of a "sophisticated malware distribution network," contained tutorial-style content that enticed viewers with promises of free or cracked software, game hacks, and game cheats.
Okay, well, my sympathy for the victims just tanked. I've known since about 1993 to never trust cracked software, game hacks, or game cheats on the internet. I ain't saying they deserved it, just that it's the second-oldest scam on the internet, right after porn malware.
Once downloaded, users are told to temporarily disable Windows Defender before extracting and installing the file contained in the archive.
Are you fucking kidding me? That's, like, an ocean of red flags right there. The Pacific Ocean's worth. Like an entire spinning discarded plastic gyre of crimson banners.
I can understand relying solely on Windows Defender. I don't expect everyone to be as careful as I am, which approaches paranoia because I'm really not all that tech-savvy compared to some others (I'm somewhere in the middle of the spectrum between "totally clueless" and "hiding behind seven proxies and fourteen firewalls").
Still, if you're going to rely on Windows Defender, don't freaking disable it.
If you're trying to use cracked software, you'd probably want to disable security protections, so the need to stop Windows Defender from catching a pirated file makes sense -- even though it's dangerous.
And look, confession: I've used cracked software, and I've pirated movies. I'm not proud of it, and I haven't done it recently, but I'm not going to hide it, either. Still, I have limits, and that limit is metaphorically pulling down my pants so someone can punch me in the balls.
The operators of the scam are using fake and compromised YouTube accounts not only to upload videos, but also to post links and archive file passwords, and to interact with watchers -- posting positive feedback that makes the cracks and tutorials appear genuine and safe.
So it's a sophisticated scam, which only makes me double down on not clicking on naked links.
How to stay protected
I'm not going to paste the tips here. They're at the link. Most of it is stuff I was already aware of, but, like I said, I don't expect others who maybe haven't been around the cyberblock a few times to be as paranoid, so go look at it if you're concerned.
But from what I'm gathering from the article, there's not much they can do to frag you if you don't click on the links (and take off your condoms) in the first place.
Sometimes I think these scams are run by professionals who are trying, via a kind of psychological warfare, to stop piracy and file-hacking. But if so, so what? The end result for the user is about the same either way.
You stepped on a pop top, cut your heel, had to cruise on back home?
A snake bit you?
You stepped on a Lego?
You got infected with tetanus?
He fixed the cable?
...I can think of many different outcomes of a barefoot hike, but few of them are appealing.
Whenever I see Kim McAdams, she is never wearing shoes.
It occurs to me that there are foot fetishists out there who would find this more erotic than the display of certain other anatomical features.
I had heard of people who kick off their shoes to connect with Earth, and it always sounded so calming. But in a parking lot littered with who knows what underfoot?
"It always sounded so calming?" I mean, it doesn't seem like something that takes a lot of prep. Just footin' do it.
Also, above snark notwithstanding, in my experience, your feet get used to barefoot, to the point where Jimmy Buffett's pop top isn't even going to break skin.
Back in the 1970s, she said, nobody was talking about grounding, also known as earthing. “For a long time, I didn’t know it was a thing,” she continued.
We lived in different 1970s, apparently.
People are aiming to improve their health and get in touch with nature, she told me. “Everybody wants to be grounded, I think it’s because there’s so much crap going on and there’s so much stuff in our food, the chemicals that are in stuff.”
Okay, look. You wanna go barefoot? Doesn't affect me in the slightest. I don't even care if you're barefoot on a plane (just not socked). As spiritual or spiritual-adjacent practices go, that's tame. Some might object on health grounds, but you're only endangering yourself. "Dirty feet?" You think shoes are pristine? Point is, I don't give a shit.
But for fuck's sake, you lost me with "the chemicals that are in stuff." What the organic hell do you think you're made of, lady? Spirit and soul? Sugar and spice? Oh, wait, those are chemicals, too.
Now, I take my shoes off whenever I walk in the yard, and I find the sensation of scuffing my feet through the grass to be strangely comforting.
Congratulations. You have taken the first step (pun intended) on the road to naturism. I'm pretty sure there are still nudist camps out there that will help you take the next leap.
I was onto something. Thousands of Canadian doctors are now prescribing nature to their patients, including Dr. Melissa Lem, a cofounder of PaRX, Canada’s national nature prescription program.
All due respect to my Canadian friends, but Canadian nature will freeze your toes right off.
So, I wondered, how much more of a difference does it make if you kick off your shoes and connect directly with Earth?
You could just... do it. And not write articles with clickbait headlines about it. But then you wouldn't be connecting with money, I suppose. (In fairness, he probably didn't write the headline.)
The 2019 documentary “The Earthing Movie” posits that the human body is both a biological and an electrical organism, making it receptive to the charges that are constantly radiating from the ground beneath our feet.
Oh for...
Just to be clear, that's a prime example of pseudoscience. Yes, we're biological, obviously. Yes, there are electrical impulses in your body, perhaps less obviously, but without them you couldn't do EEGs or ECGs or whatever. Yes, the Earth also has electrical charges; that's in part what causes lightning. But then they take the unsubstantiated leap to "receptive" and "constantly radiating," and claim that's a good thing.
The scientific evidence on grounding is still emerging, Lem said.
And yet, I'm pretty sure some people will believe that documentary's assertion and completely ignore the "evidence... is still emerging" part.
But to help us understand the theory behind it, she explained it like this: “We build up positive charges in our bodies, free radicals. Earth’s surface has a negative charge and so getting your body flooded with those negative ions and charges helps reduce the overall free radicals and reduce inflammation.”
No. And it's not a theory.
...but like I said, if it feels good, do it. I don't care. I do care about the spread of misinformation and pseudoscience.
But his film has found an audience of at least 8 million people on YouTube who are open to hearing its message or already believe it.
People believe a lot of weird shit without evidence. That does not make them right.
“Before I was born in 1944, you couldn’t get out of the dirt,” [Clint Ober, 81] told CNN. “Couldn’t get ungrounded if you wanted to. We stepped out of nature 65 years ago, and since then, everybody started developing these inflammation-related health disorders.”
81 years old and never learned the difference between correlation and causation.
The founder of Earthing, a company that manufactures grounding mats, mattress covers and pillows, Ober became more widely known...
Oh, look. A scam.
Research funded by Ober’s company shows some benefits of grounding, but so far, independent research into grounding has been limited. No studies have provided any certainty backing the movement, and any evidence still seems to be anecdotal.
I... okay, look. I've harped on this kind of crap dozens, maybe hundreds of times. Regular readers are no doubt tired of it. But once more for the back of the room: one of the first things to look for when assessing a study or, more likely, the breathless reporting of the study, is who funded it. It's not a guarantee that the study is biased, but there's a damn good chance that it is, even if only unconsciously on the part of the researchers.
To this author's credit, he discloses that part in bold with a hyperlink (which I admittedly didn't follow). But it's buried somewhere in the middle of the breathless (and soleless) reporting, like nails are sometimes buried just under the surface of the ground.
I shouldn't even have to rage about "anecdotal evidence," so I won't. This time.
For research purposes, I went shoeless myself.
"Research." No. Just another anecdote.
The early going was not as relaxing as I’d hoped. I had never previously noticed the crushed gravel under my sneakers at Leita Thompson Memorial Park in Roswell, but I was surprised at how quickly I was able to navigate the prickly terrain once I got used to it.
Again, I'm not ragging on the practice. Just the pseudoscience and monetization. If anything, the practice should cost you less money (from not having to buy new shoes as often), not more.
And I have no evidence to back this up, but I wouldn't be a bit surprised if they sold apps to track your... tracks. With ads.
But then, I noticed one significant and immediate change. I usually wake up at least once or twice every night, but for the next week I slept until morning without any interruption. Was it the grounding or something else? Perhaps walking on the gravel had triggered the pressure points on my feet and given me some kind of natural reflexology treatment. I can’t say for sure, but my sleep definitely improved that next week.
I've noted this before, but in my experience, making any change can be positive (especially if you prime yourself to believe that it will be), because you're not just going through the motions, but changing the way you think by concentrating on something out of your ordinary. I've said similar things about people who deliberately get up earlier.
“We do find some of the effects that the grounding people report,” [Mat White, of the University of Vienna] explained to CNN. “But we’re not claiming it’s anything to do with electrical currents in the ground.”
I'll give the author some credit here for at least trying to be fair.
“There’s a lot of good-quality evidence showing that the more you physically touch the natural environment, the more you’ll pick up a complex microbiome,” he explained.
...and then they have to go on about microbiomes, the study of which is still an emerging science.
How to ground yourself
Um. Just. Take off your shoes and watch for sharp, pointy things?
Once more for the skimmers: It's not hurting anyone else, so do it if you want to. Just, please, don't believe the pseudoscience, or the ones who are just trying to sell you more shit that you don't really need.
Before we can build the future, we have to imagine it.
Progress in steel and silicon has long been preceded by progress in imagination. Jules Verne’s novels prepared readers for submarines and space travel. Star Trek’s communicator device inspired engineers to create the mobile phone.
This is not an argument for "culture." This is an argument in favor of science fiction (which is certainly part of culture, but not all of it), and one that I've made some versions of in the past.
And interesting that the author focuses on the communicator. Because when I think of "progress," I don't just think of technological progress, but also social, and that's where Trek set the ideal: a blueprint for a future where we actually get along with each other. If one can ignore or embrace the frequent absurdity in that franchise.
We usually think of infrastructure as bridges, satellites, and fiber-optic cables. But beneath steel and concrete lies something less tangible but just as powerful: culture — the stories and symbols that make some futures seem absurd, others inevitable, and a few worth building.
This is almost trivial, I think. But it probably helps to say it aloud (or type it): that before we can bring something new into existence, first we have to imagine it. That concept goes back to at least Plato.
What's not talked about as much, though, is the far more common occurrence: where we imagine something that will never happen, maybe can never happen, like (if we're sticking with Star Trek) transporter technology.
The Enlightenment was not an engineering project but a cultural shift. In coffee houses and pamphlets, curiosity and reason became public virtues. That shift created the conditions for modern science and the Industrial Revolution.
You're not exactly helping your case for positive progress there.
Scientist Michael Nielsen has given us a useful term for understanding how culture shapes progress: the hyper-entity. He defines it as an “imagined hypothetical future object or class of objects” — something that exists in the collective consciousness before (maybe) becoming reality. Today’s hyper-entities include AGI, space elevators, Martian settlements, the Singularity, and universal quantum computers.
I'm not on board with "hyper-entity" for that. That would make a damn good supervillain name, though: Spider-Man vs.... the Hyper-Entity!
But again, let's be clear, here: all those things the author is calling hyper-entities made their debut in science fiction.
SF can also imagine entities we'd rather not have: xenomorphs, uppity androids, deadly lab-produced plagues. So yes, sometimes it's a warning about what not to do, but it can also be a blueprint for what might actually work.
National projects can start as hyper-entities as well. The Apollo program didn’t begin with rockets. It began with a story. President Kennedy’s line, “We choose to go to the Moon,” transformed space exploration into a cultural commitment.
No. It began with fantasy, and then science fiction. The basic idea that we can visit the moon is as old as stories. Only relatively recently did science fiction authors imagine how it might actually happen, and they inspired rocket scientists as well as politicians.
Yet hyper-entities are not always benign. Once they move from the imagination to the real world, they can take on unforeseen characteristics or become rigid and incapable of evolving with the times. The modern education system, once a breakthrough in spreading knowledge, is now often criticized for lagging behind the needs of a changing world. Bureaucracies, once cultural advances in coordination, can calcify into obstacles. Hyper-entities can magnify human potential, but they can just as easily magnify inertia.
Well, yeah. Nothing comes without a price. Probably the most obvious invention with clear upsides and downsides is nuclear energy.
If hyper-entities are the heavyweights of culture, then memes are its quick strikes.
In this section, the author, I think, falsely conflates Dawkins' original idea of a "meme" with the bumper-sticker philosophy added to cute cat pictures and all manner of other images floating around the internet, which are called "memes," but don't usually rise to the level required to be a true means of cultural idea transmission.
That the word "meme" itself has changed meaning, or at least connotation, over time, is a source of amusement to me. The word was coined as a cultural equivalent to "gene," the unit of trait transmission in biological entities. So, just as genes sometimes mutate and change the course of entire species, so too did the connotation of "meme."
Most memes burn out quickly, but some prove powerful. A few compress complicated ideas into such simple, contagious forms that they shape how people think and act at scale.
Just as genetic mutation usually produces something neutral or maladaptive, while occasionally creating something useful, like, I don't know, opposable thumbs or really sharp teeth. And damn, but there are a lot of maladaptive memes (modern connotation) out there.
“Move fast and break things” began as an internal motto at Facebook. Within a few years, it had spread across Silicon Valley, becoming shorthand for the entire startup ethos: experiment quickly, worry less about rules, and treat disruption as a virtue. Five words shifted an industry’s attitude.
This, too, has very obvious downsides.
To her credit, the author does address this:
Sparks don’t always start the fires you want, though. The same speed that makes memes powerful also makes them volatile. And what’s true for memes is true for culture more broadly: It doesn’t always drive society forward.
The idea that progress (of any kind) is some kind of bar that always goes up is a myth, and not the good kind of myth. It's more of a "three steps forward, two steps back" kind of thing. Sometimes, three steps forward, five steps back.
Sometimes culture stalls innovation. Genetically modified crops promised higher yields and reduced pesticide use, but public fear — shaped by stories of “Frankenfoods” — slowed adoption in many parts of the world, including places where GMOs might have reduced hunger and improved health.
This is, to me, the Platonic ideal of an example of that.
The Enlightenment, the Apollo program, and even today’s debates over AI all show that progress depends not only on technology and institutions, but also on our feelings about the future. Are we complacent, fearful, or hopeful?
Depending on my mood, I could be any one of those things. Well, not so much "fearful." I'm entirely too cynical and fatalistic for that. But somewhere on that spectrum, sure. Today, though, is traditionally a day to express hopefulness, so I'll leave that discussion for another time.
Optimism assumes things will work out no matter what. It’s a sunny outlook, but a passive one. If progress is guaranteed, there’s little reason to struggle for it.
The argument I think the author is making here is that hope is a better worldview than optimism, for that reason.
Hope is active where optimism is passive. Hope assumes risk and uncertainty. It isn’t blind to challenges, but it believes they can be navigated. Hope doesn’t wait for good outcomes to arrive on their own. It frames progress as something worth fighting for.
And, all too often, it's kicked right in the balls.
In my own work, I’ve noticed how often people look puzzled when asked to imagine positive futures. They can easily list disasters — pandemics, climate collapse, runaway AI — but when pressed for hopeful scenarios, they hesitate. That hesitation is telling. It shows how little scaffolding mainstream culture gives us for constructive imagination.
Takeaway: people need to pay more attention to Star Trek.
There is, of course, quite a bit more at the link. I think she makes some good points. I'm still not making any New Year's resolutions. But it's very likely I'll read and watch more science fiction.
Slop may be seeping into the nooks and crannies of our brains.
Let me tell you, whoever first called "AI" output "slop" should be outed as the most influential person of the decade. Sadly, it wasn't me, this time.
If you think of something to say and say it, that could never be AI slop, right? In theory, all organically grown utterances and snippets of text are safe from that label.
Welllll... philosophically, do you really know you're not artificial? I mean, really, really know? There are a whole lot of "this is all a simulation" folks out there, some of whom may or may not be bots, but if they're right (which they probably aren't, but no one can prove it either way), then you're just as much AI as your friendly neighborhood LLM. Just, maybe, a little more advanced. Or maybe not. I can point to a few supposedly organic biological human beings who make less sense than chatbots. Flat-earthers, for example.
But our shared linguistic ecosystem may be so AI-saturated, we now all sound like AI.
For variant values of "we" and "all," okay.
Worse, in some cases AI-infected speech is being spouted by (ostensibly human) elected officials.
Well, those are all alien lizard people anyway.
Back in July of this year, researchers at the Max Planck Institute for Human Development’s Center for Adaptive Rationality released a paper on this topic titled “Empirical evidence of Large Language Model’s influence on human spoken communication.”
Today (or, rather, when I first saved this article earlier this month) I learned that there's a Center for Adaptive Rationality, and that it's named after someone better known for defining the smallest units of length and time that can be meaningfully measured. There's a metaphor in there somewhere, or at least a pun, but I haven't quite teased it out yet. Something about human rationality being measured in Planck lengths. Most people wouldn't get the joke, anyway.
As Gizmodo noted at the time, it quantified YouTube users’ adoption of words like “underscore,” “comprehend,” “bolster,” “boast,” “swift,” “inquiry,” and “meticulous.”
And? All that shows is that some tubers' scripts may have been generated or assisted by AI.
That exercise unearthed a plausible—but hardly conclusive—link between changes to people’s spoken vocabularies over the 18 months following the release of ChatGPT and their exposure to the chatbot.
See that? That pair of em-dashes in that quote right there? That's also a hallmark of LLM output. There is absolutely nothing wrong with using em-dashes—I do it from time to time, myself, and have been since long before this latest crop of generative language models. But now, thanks to LLMs, you can't use one without being accused of AI use. Unfortunately, I fear the same is going to happen to semicolons; those few of us who know how to use them correctly are going to be scrutinized, too.
But two new, more anecdotal reports suggest that our chatbot dialect isn't just something that can be found through close analysis of data. It might be an obvious, everyday fact of life now.
I must underscore that these are, indeed, anecdotes. Which can bolster understanding, but fall short of the meticulous standards needed for science. Many people don't comprehend that scientific inquiry requires more than just stories, though people are more swift to relate to stories than to dry data. That's why many science articles boast anecdotes in their ledes—to hook the reader, draw them in before getting to the dry stuff.
I really, really, hope you see what I did there.
Anyway, the money quote, for me, is this one:
As “Cassie,” an r/AmItheAsshole moderator who only gave Wired her first name, put it, “AI is trained off people, and people copy what they see other people doing.” In other words, Cassie said, “People become more like AI, and AI becomes more like people.”
You humans—er, I mean, we humans tend to hold our intelligence in high regard, for inexplicable reasons. It's right there in the official label we slapped on ourselves: homo sapiens, where "homo" isn't some derogatory slur, but simply means "human." The Latin root was more like "man," and also gave French the word "homme," and Spanish "hombre," which mean adult male human, and we can argue about the masculine being the default, as in "all men are created equal," though I agree that usage is antiquated now and that we should strive to be more inclusive in language. The important part of that binomial for this discussion, though, is "sapiens," which can mean "wise" or "intelligent," which we can also argue isn't the same thing (it certainly is not in D&D).
But I've noted in the past that our so-called creative process relies primarily on soaking up past inputs—experiences, words, mannerisms, styles, etc.—and rearranging them in ways that make sense to us and, sometimes, if we're lucky, also to someone else. Consequently, it should shock or surprise no one that we're aping the output of LLMs. I've done it consciously in this entry, but I have undoubtedly done it unconsciously, as well.
We can assert that this is the difference: consciousness. The problem with that assertion is that no one understands what consciousness actually is. I'm convinced I'm conscious (cogito ergo sum), but am I, really, or am I just channeling the parts of Descartes' philosophy that I agree with? And as for the rest of you, I can never truly be sure, though it's safest to assume that you are.
We're all regurgitative entities, to put it more simply (though with an adjective I apparently just made up). Everything we think, say, do, or create is a remix of what the people before us have thought, said, done, created, etc.
Despite my stylistic choices here, I did not use AI or LLMs to write anything in this entry, or for that matter any other entry, ever. The image in the header is, of course, AI-generated, but not the text. Never the text, not without disclosure. You might not believe that, and there's not much I can do about it if that's the case. But it's true. Still, the influence of LLMs is apparent, is it not? At the very least, without them, I would never have had occasion to write this entry.
Looks like I get to balance out yesterday's "man" entry with a "woman" one, this being from Women'sHealth (don't ask me why it's all one word like that):
Maybe, if they can get through reading the study without getting distracted.
But new research suggests that having ADHD comes with some upsides too.
This shouldn't be surprising. Though as usual, "research suggests" is a far cry from the headline's promised "Just Revealed Exactly Why."
For the study, researchers asked 200 adults with ADHD and 200 adults without ADHD to examine how strongly they identify with 25 positive characteristics, like humor, creativity, and spontaneity.
I just want to know if the control group finished faster.
Yes, I am going to continue to make "distraction" jokes.
People with ADHD were more likely to strongly endorse 10 strengths they had over those without the condition. Those included:
(a list of five completely unsurprising traits)
One of the traits listed is "spontaneity." I'd argue that's not a strength. It's an annoyance to others, like chewing with your mouth open.
But the findings suggest that adults with ADHD who are aware of their strengths and actually use them have higher confidence and quality of life as a result, Ammon says.
This seems like more than just "looking on the bright side." It's like Rudolph the Red-Nosed Reindeer, who taught us all an important life lesson: that deviation from the norm will be punished and shunned unless it can be exploited, and that the exploited should be happy about it. By labeling ADHD as such, you're slapping on a "diagnosis," one with the words "deficit" and "disorder," both of which have negative connotations. If you can see things from a different point of view, and recognize that you actually have a superpower, well, then Santa finally has a use for you.
In case it's not clear, what I mean is that those who deviate from the norm should be accepted for who they are, even if they're not useful.
Hoogman says she hopes the findings will help people understand the strengths associated with ADHD. “My other studies show that adults with ADHD frequently, in addition to their deficits, also experience benefits from their ADHD characteristics,” she says.
These entries haven’t been chosen based on pure nostalgia, nor the viability of their comebacks — many have a poor chance of resurrection indeed. Rather these are simply things that it would genuinely be nice to see revived, and in many cases wouldn’t need to supplant culture’s current offerings, but could co-exist as happy supplements alongside them — additions that would make for richer and more varied lives.
In other words, opinion. That's fine. Nothing wrong with opinion, so long as it's based on facts. Well, maybe I have a different opinion.
Soda Fountains
In an age where fewer people are drinking alcohol, the soda fountain just might be the third space we need again.
Better idea: bring back drinking alcohol.
Attention Spans
While I admit I agree with this one, it still comes across as grumpyoldmanish.
Carrying Cash
But there are times and places where cash still comes in handy: a high school basketball game, the bait shop in the middle of nowhere, the after-hours campground fee box, the valet who deserves more than a muttered thank you.
None of those situations apply to me. Also, I don't like lying to beggars when they ask me if I have any spare cash.
Eccentricity
There’s lots of evidence that people, on the whole, are getting less weird. Less deviant, less creative, less inclined to divert from the standard societal lockstep. It seems like we have less eccentrics than we used to — those oddballs who dressed differently, read strange books, and didn’t care if anyone understood them.
Whaaaat? Have you seen the internet?
Paper Maps
They don’t buffer, they don’t die at 3%, and they don’t reroute you into a lake.
Look, I'm a huge fan of both GPS and paper maps, and on long trips, I always keep a road atlas with me as backup. But anyone who thinks no one ever got turned around by using paper maps is fooling themselves and bordering on Luddism. In truth, GPS is excellent at getting you un-lost when you've relied too much on paper maps. And anyone who blindly follows a computer's directions into a lake, well, honestly, that's on them.
Not to mention I'm the only person in the world who actually knows how to re-fold one the right way.
Door-to-Door Knife Sharpeners
You’ve heard of door-to-door salesmen, but did you know there used to be door-to-door knife sharpeners?
Right, because letting a stranger into your house to play with your knives is such a great idea.
Penmanship
It’s particularly rewarding to master cursive — a skill that’s especially endangered, not only in regards to writing it but even reading it.
Yeah, no. Although I was learning that shit back in the pre-PC days, I never got the hang of writing in cursive. And I never could read it, most of the time.
You know what I am pretty good at, though? Writing neatly in the all-caps block style preferred by engineers. I was good at it before I became an engineer. It got me my first job in an engineering office, as a drafter, before CAD took over.
I went through a phase where I tried to learn calligraphy. That was easier than cursive (with the right tools, anyway). Kids these days (dammit, now you got me doing it) probably think of cursive as just as antiquated as I thought of calligraphy.
Real Dates
Yeah, right. For me, that would require a woman, and that ain't gonna happen. Though the authors seem to be talking about it in terms of what other people do, and to that I say: mind your own business.
Typewriters
I learned to type on a typewriter. First a manual one, then an electric one, then a fancy IBM Selectric with limited editing functions, kind of a middle step between typewriter and word processor/printer.
Word processors are superior in every way.
Landlines
No.
Only reason to have one, for me, is that they tend to work even when the power is out. I have a generator for that now, and a spare battery I keep charged, one that can transfer charge to a mobile phone. And that niche case doesn't even come close to making up for the telemarketing calls I'd receive at all hours.
Record Players + Vinyl Records
In a world of endless, algorithmically curated streaming playlists, listening to music on a record player makes music listening feel like an event, not just background hum.
Where have these authors been? Those still exist. And I agree that vinyl records have their charms. But as is often the case with me, convenience wins out. Also, there was the flood incident back in the 80s that destroyed my extensive vinyl collection, and no, I will never get over that.
Colorful Insults
Modern insults are pretty boring — mostly the same set of expletive-laden put-downs.
On this one, I am in complete agreement. Not that I don't get extensive use out of the "expletive-laden put-downs."
Neckties
Neckties aren’t expected in many situations anymore — but that’s exactly what makes them meaningful. Wearing one signals intention, care, and the willingness to rise above the bare minimum.
I'm pretty sure neckties got started before the collar button was perfected. Their whole purpose was to keep your shirt closed so no one had to look at that chest hair and get all excited (or disgusted). They then became an impractical fashion accessory. They're also exceedingly dangerous in a fight.
Film Cameras
I have mixed feelings about this one. First of all, as with vinyl records, they absolutely do still exist. They're just not as much a part of everyday life as they used to be. As someone who used to use them to make beer money, I do feel a kind of nostalgia for them, and even more for the darkroom skills I carefully cultivated.
But, again... I'm too damn lazy, and there's no going back now. Digital cameras have amazing quality these days, and they're convenient. Also, one of the best things about film cameras was Kodachrome (cue Paul Simon here), and they stopped making that.
Wood-Burning Fireplaces
Most fireplaces now run on gas. Flip a switch and you get some instant heat and ambiance.
I do appreciate the radiant heat of a fireplace. But, again, I think of all the wood I had to saw, chop, and split as a kid, and then I have nightmares for a week. No, thanks. Not to mention atmospheric particulates.
Anyway, there's more at the link, if you can overcome your reluctance to click on a site called Art of Manliness.
I'm reminded of those horse carriages in Central Park. I don't know if they still have those; I haven't seen one since before Covid. They'd line up at the south corner, Fifth and 59th, right across from the giant modernist cubic Apple store, and you'd get liveried around in a carriage with the clop clop and the plop plop. I never did it, just saw it. I don't think they treated the horses very well, but that's not my point; my point is that, sometimes, it's better to just let things die a natural death and stop longing for a past that, honestly, wasn't that great.
Neuroscientists studying the shifts between sleep and awareness are finding many liminal states, which could help explain the disorders that can result when sleep transitions go wrong
As with most things in life, "awake" and "asleep" aren't truly binary. There's always that transition. Sometimes it's gradual. Sometimes, like when you hear a cat puking at 4am, it's almost instantaneous. But "almost" isn't a true switch-flip; it's just faster.
For a very long time, I wondered if it were possible to catch that exact moment when awake becomes asleep, or vice-versa, but not only would that require consciousness on one side, but there's also not an "exact moment."
And I learned the adjectives describing these transitions: hypnagogic, for falling asleep; and hypnopompic, for awakening.
Look, when you have a tendency toward sleep paralysis, you learn these things, okay? There are nouns for the states, too: hypnagogia, for example.
The pillow is cold against your cheek. Your upstairs neighbor creaks across the ceiling. You close your eyes; shadows and light dance over your vision. A cat sniffs at a piece of cheese. Dots fall into a lake. All this feels very normal and fine, even though you don’t own a cat and you’re nowhere near a lake.
Worse, you don't have an upstairs neighbor.
To fall asleep, “everything has to change,” says Adam Horowitz, a research affiliate in sleep science at MIT.
Yes, I can feel my bones warping, my flesh shifting... oh, you mean everything in the central nervous system.
It’s still largely mysterious how the brain manages to move between these states safely and efficiently.
It's still largely mysterious to me how they define "safely and efficiently." You know that thing where you're falling asleep and suddenly you're literally falling? Okay, not "literally" literally, but your brain thinks it is and you wake up with your heart pounding? Yeah, that's not "safe" for some of us. That's called a hypnagogic jerk, incidentally, and by "jerk" it's not making a value judgement.
Sleep has been traditionally thought of as an all-or-nothing phenomenon, Lewis says. You’re either awake or asleep. But the new findings are showing that it’s “much more of a spectrum than it is a category.”
Much like life vs. death.
In the early 1950s, the physiologist Nathaniel Kleitman at the University of Chicago and his student Eugene Aserinsky first described the sleep stage categorized by rapid eye movement, or REM sleep—a cycle the brain repeats multiple times throughout the night, during which we tend to dream.
For some reason, I thought REM sleep was described way earlier than this. Must have dreamed it.
Though some evidence indicated that the brain could exist in a state that mixed sleep and wakefulness, it was largely ignored. It was considered too complicated and variable, counter to most researchers’ tightly defined view of sleep.
This sort of thing can encourage binary thinking: all or nothing, black or white. "It's too hard to study" is a legitimate thing when you're first delving into something, but the truth is usually more complicated. It's like the joke about physicists: "First, assume a perfectly spherical cow..."
Around the time that Loomis was conducting EEG experiments in his mansion, [Salvador Dali] was experimenting with his own transitions into sleep. As he described it in his 1948 book, 50 Secrets of Magic Craftsmanship, he would sit in a “bony armchair, preferably of Spanish style,” while loosely holding a heavy key in one palm above an upside-down plate on the floor. As he drifted off, his hands would slacken—and eventually, the key would fall through his fingers. The sudden clack of the key hitting the plate would wake him.
I remember reading that book, years ago, because I've long been a fan of the dreamlike images of surrealism. I remember he called it "sleep with a key." The key, I felt, was of course symbolic, as in unlocking a mysterious door; it could presumably have been any similar object, like a nail or a large coin.
Other great minds, including Thomas Edison and Edgar Allan Poe, shared his interest in and experimentation with what is known as the hypnagogic state—the early window of sleep when we start to experience mental imagery while we’re still awake.
Edison can bite my ass, but that does explain quite a bit about Poe.
In 2021, a group of researchers at the Paris Brain Institute, including Andrillon, discovered that these self-experimenters had gotten it right. Waking up from this earliest sleep stage, known as N1, seemed to put people in a “creative sweet spot.” People who woke up after spending around 15 seconds in the hypnagogic state were nearly three times as likely to discover a hidden rule in a mathematical problem. A couple years later, another study, led by Horowitz at MIT, found that it’s possible to further boost creativity in people emerging from this state by guiding what they dream about.
Much more recent research, and, I imagine, of particular interest to writers.
“We could think that there’s a function” to these mental experiences, says Sidarta Ribeiro, a neuroscientist at the Federal University of Rio Grande do Norte in Brazil. “But maybe there isn’t. Maybe it’s a byproduct of what’s going on in the brain.”
I feel like the Industrial Revolution trained us all to think in terms of function or purpose. "We kept cats around because they're good at pest control; therefore, if a cat is not good at pest control, it has no value." Which is, of course, bullshit; what value does art have? Of course, most art doesn't wake you up by puking at 4am, but my point stands.
Other times, there are things we think have no function, but we discover one, like the vermiform appendix in humans.
Mostly, though, I think even if there's not a clear evolutionary advantage to some feature, we can turn it into one, and I think the hypnagogic state might be one of those things, turning a byproduct of our need for sleep into a wellspring of creativity.
The article goes on to explore that more mysterious side of things, awakening. I'm skipping that bit, even though it's interesting. Then they get into sleep disorders, which of course are interesting to me, but your experience may vary.
Worst of all, though, for others if not for me, is that I've discovered in myself a tendency to pun in my sleep, and sometimes even remember the puns upon awakening. Hence the title of today's entry.
It's funny. Honesty is touted as a virtue, and yet there are multiple situations, such as this one, where one is expected or even encouraged to lie out their ass, and that's considered a virtue, too.
Within six weeks in 2014, [Nora McInerny's] father passed away, her husband died of brain cancer, and she miscarried her second child.
You know how some people try to one-up you when something goes wrong? You're like, "my dog died," and they have to be all, "Well, both of my dogs died and my toddler got run over by a truck." I think this qualifies her for the World Championship of one-upsmanship. And, if she has any musical talent at all, the country music charts.
It makes sense, then, how much time she’s spent pondering what to say when someone asks you how you are, and the truth isn’t “good.”
I have two simple go-to responses, myself: "Horrible," and "could be worse." Because, as we have just seen, it could always be worse.
About a year ago, Jennifer C. Veilleux set a goal for herself: She would try never to answer “I’m fine” or “I’m good” if she wasn’t really feeling that way.
I'm also getting the impression that women are expected to lie if they're not doing "fine," more than men are. Just another double standard.
“We know what we’re supposed to say: ‘I’m fine, how are you?’ Yet that’s often not true,” says Veilleux, a professor of clinical psychology at the University of Arkansas, Fayetteville, who studies emotion.
Huh. Higher education in Arkansas. Who'd have guessed?
Research suggests that suppressing emotions is linked to increased anxiety, depression, and stress, as well as poor relationships.
The answer, of course, is to not have emotions. Or relationships.
First, gauge someone’s capacity for the truth
Are you seriously telling me to "read the room?" Get out of here.
Keep these handy responses close
Now I'm picturing some chick getting asked "How are you?" and then she holds up a finger, digs through her purse, pulls out a small pack of cards with canned responses, and draws one at random.
Even when you’re not, “fine, thanks” sometimes does the trick
Ooooh, way to negate the rest of the article.
No, seriously, though, I think the point is to be more honest with people you already know, not random minimum-wage workers who are asking as part of their script. You already know they're not fine, because they're minimum-wage workers having to follow a script, and probably listen to brain-rotting "music" all day. They're not your friend, coworker, casual acquaintance, or therapist. Telling them the truth fixes nothing, and only makes things worse. On the other hand, being too chipper can also make them feel rotten about their lot in life. So yeah, in that case, just go through the pro-forma motions, like when you end a letter with "Sincerely."
Remember: most people care
Snort.
Some people care. Some just pretend to care. Others take perverse pleasure in your misfortune, feeling superior when they find out your life is worse than theirs. Still others will engage in one-upsmanship, as above.
Anyway, yeah, I joke. If I'm being honest, sometimes I joke to cover up how I'm feeling. I keep imagining going to a shrink and having them, very seriously, talk to me about "defense mechanisms" and "letting yourself feel" and "being honest with people."
Not today, though. I'm actually doing just fine, thanks.