About This Author
Come closer.
Carrion Luggage

Native to the Americas, the turkey vulture (Cathartes aura) travels widely in search of sustenance. While usually foraging alone, it relies on other individuals of its species for companionship and mutual protection. Sometimes misunderstood, sometimes feared, sometimes shunned, it nevertheless performs an important role in the ecosystem.

This scavenger bird is a marvel of efficiency. Rather than expend energy flapping its wings, it instead locates uplifting columns of air and spirals within them to rise to greater heights. This behavior has been mistaken for opportunism, interpreted as if it is circling doomed terrestrial animals destined to be its next meal. In truth, the vulture takes advantage of these thermals to gain the altitude needed to glide longer distances, flying not out of necessity, but for the joy of it.

It also avoids the exertion necessary to capture live prey, preferring instead to feast upon that which is already dead. In this behavior, it resembles many humans.

It is not what most of us would consider to be a pretty bird. While its habits are often off-putting, or even disgusting, to members of more fastidious species, the turkey vulture helps to keep the environment from being clogged with detritus. Hence its Latin binomial, which translates to English as "golden purifier."

I rarely know where the winds will take me next, or what I might find there. The journey is the destination.


January 2, 2026 at 9:55am
#1104913
This style of headline, as seen in CNN of all places, has possibly transcended clickbait and moved into cliché.


You stepped on a pop top, cut your heel, had to cruise on back home?

A snake bit you?

You stepped on a Lego?

You got infected with tetanus?

He fixed the cable?

...I can think of many different outcomes of a barefoot hike, but few of them are appealing.

Whenever I see Kim McAdams, she is never wearing shoes.

It occurs to me that there are foot fetishists out there who would find this more erotic than the display of certain other anatomical features.

I had heard of people who kick off their shoes to connect with Earth, and it always sounded so calming. But in a parking lot littered with who knows what underfoot?

"It always sounded so calming?" I mean, it doesn't seem like something that takes a lot of prep. Just footin' do it.

Also, above snark notwithstanding, in my experience, your feet get used to barefoot, to the point where Jimmy Buffett's pop top isn't even going to break skin.

Back in the 1970s, she said, nobody was talking about grounding, also known as earthing. “For a long time, I didn’t know it was a thing,” she continued.

We lived in different 1970s, apparently.

People are aiming to improve their health and get in touch with nature, she told me. “Everybody wants to be grounded, I think it’s because there’s so much crap going on and there’s so much stuff in our food, the chemicals that are in stuff.”

Okay, look. You wanna go barefoot? Doesn't affect me in the slightest. I don't even care if you're barefoot on a plane (just not socked). As spiritual or spiritual-adjacent practices go, that's tame. Some might object on health grounds, but you're only endangering yourself. "Dirty feet?" You think shoes are pristine? Point is, I don't give a shit.

But for fuck's sake, you lost me with "the chemicals that are in stuff." What the organic hell do you think you're made of, lady? Spirit and soul? Sugar and spice? Oh, wait, those are chemicals, too.

Now, I take my shoes off whenever I walk in the yard, and I find the sensation of scuffing my feet through the grass to be strangely comforting.

Congratulations. You have taken the first step (pun intended) on the road to naturalism. I'm pretty sure there are still nudist camps out there that will help you take the next leap.

I was onto something. Thousands of Canadian doctors are now prescribing nature to their patients, including Dr. Melissa Lem, a cofounder of PaRX, Canada’s national nature prescription program.

All due respect to my Canadian friends, but Canadian nature will freeze your toes right off.

So, I wondered, how much more of a difference does it make if you kick off your shoes and connect directly with Earth?

You could just... do it. And not write articles with clickbait headlines about it. But then you wouldn't be connecting with money, I suppose. (In fairness, he probably didn't write the headline.)

The 2019 documentary “The Earthing Movie” posits that the human body is both a biological and an electrical organism, making it receptive to the charges that are constantly radiating from the ground beneath our feet.

Oh for...

Just to be clear, that's a prime example of pseudoscience. Yes, we're biological, obviously. Yes, there are electrical impulses in your body, perhaps less obviously, but without them you couldn't do EEGs or ECGs or whatever. Yes, the Earth also has electrical charges; that's in part what causes lightning. But then they take the unsubstantiated leap to "receptive" and "constantly radiating," and claim that's a good thing.

The scientific evidence on grounding is still emerging, Lem said.

And yet, I'm pretty sure some people will believe that documentary's assertion and completely ignore the "evidence... is still emerging" part.

But to help us understand the theory behind it, she explained it like this: “We build up positive charges in our bodies, free radicals. Earth’s surface has a negative charge and so getting your body flooded with those negative ions and charges helps reduce the overall free radicals and reduce inflammation.”

No. And it's not a theory.

...but like I said, if it feels good, do it. I don't care. I do care about the spread of misinformation and pseudoscience.

But his film has found an audience of at least 8 million people on YouTube who are open to hearing its message or already believe it.

People believe a lot of weird shit without evidence. That does not make them right.

“Before I was born in 1944, you couldn’t get out of the dirt,” [Clint Ober, 81] told CNN. “Couldn’t get ungrounded if you wanted to. We stepped out of nature 65 years ago, and since then, everybody started developing these inflammation-related health disorders.”

81 years old and never learned the difference between correlation and causation.

The founder of Earthing, a company that manufactures grounding mats, mattress covers and pillows, Ober became more widely known...

Oh, look. A scam.

Research funded by Ober’s company shows some benefits of grounding, but so far, independent research into grounding has been limited. No studies have provided any certainty backing the movement, and any evidence still seems to be anecdotal.

I... okay, look. I've harped on this kind of crap dozens, maybe hundreds of times. Regular readers are no doubt tired of it. But once more for the back of the room: one of the first things to look for when assessing a study or, more likely, the breathless reporting of the study, is who funded it. It's not a guarantee that the study is biased, but there's a damn good chance that it is, even if only unconsciously on the part of the researchers.

To this author's credit, he discloses that part in bold with a hyperlink (which I admittedly didn't follow). But it's buried somewhere in the middle of the breathless (and soleless) reporting, like nails are sometimes buried just under the surface of the ground.

I shouldn't even have to rage about "anecdotal evidence," so I won't. This time.

For research purposes, I went shoeless myself.

"Research." No. Just another anecdote.

The early going was not as relaxing as I’d hoped. I had never previously noticed the crushed gravel under my sneakers at Leita Thompson Memorial Park in Roswell, but I was surprised at how quickly I was able to navigate the prickly terrain once I got used to it.

Again, I'm not ragging on the practice. Just the pseudoscience and monetization. If anything, the practice should cost you less money (from not having to buy new shoes as often), not more.

And I have no evidence to back this up, but I wouldn't be a bit surprised if they sold apps to track your... tracks. With ads.

But then, I noticed one significant and immediate change. I usually wake up at least once or twice every night, but for the next week I slept until morning without any interruption. Was it the grounding or something else? Perhaps walking on the gravel had triggered the pressure points on my feet and given me some kind of natural reflexology treatment. I can’t say for sure, but my sleep definitely improved that next week.

I've noted this before, but in my experience, making any change can be positive (especially if you prime yourself to believe that it will be), because you're not just going through the motions, but changing the way you think by concentrating on something out of your ordinary. I've said similar things about people who deliberately get up earlier.

“We do find some of the effects that the grounding people report,” [Mat White, of the University of Vienna] explained to CNN. “But we’re not claiming it’s anything to do with electrical currents in the ground.”

I'll give the author some credit here for at least trying to be fair.

“There’s a lot of good-quality evidence showing that the more you physically touch the natural environment, the more you’ll pick up a complex microbiome,” he explained.

...and then they have to go on about microbiomes, which is still an emerging science.

How to ground yourself

Um. Just. Take off your shoes and watch for sharp, pointy things?

Once more for the skimmers: It's not hurting anyone else, so do it if you want to. Just, please, don't believe the pseudoscience, or the ones who are just trying to sell you more shit that you don't really need.
January 1, 2026 at 10:51am
#1104848
Yay. The odometer clicked over. Here's a somewhat appropriate article from Big Think:

Before we can build the future, we have to imagine it.

Progress in steel and silicon has long been preceded by progress in imagination. Jules Verne’s novels prepared readers for submarines and space travel. Star Trek’s communicator device inspired engineers to create the mobile phone.

This is not an argument for "culture." This is an argument in favor of science fiction (which is certainly part of culture, but not all of it), and one that I've made some versions of in the past.

And interesting that the author focuses on the communicator. Because when I think of "progress," I don't just think of technological progress, but also social, and that's what Trek has set the ideal for: a future where there's a blueprint for getting along with each other. If one can ignore or embrace the frequent absurdity in that franchise.

We usually think of infrastructure as bridges, satellites, and fiber-optic cables. But beneath steel and concrete lies something less tangible but just as powerful: culture — the stories and symbols that make some futures seem absurd, others inevitable, and a few worth building.

This is almost trivial, I think. But it probably helps to say it aloud (or type it): that before we can bring something new into existence, first we have to imagine it. That concept goes back to at least Plato.

What's not talked about as much, though, is the far more common occurrence: where we imagine something that will never happen, maybe can never happen, like (if we're sticking with Star Trek) transporter technology.

The Enlightenment was not an engineering project but a cultural shift. In coffee houses and pamphlets, curiosity and reason became public virtues. That shift created the conditions for modern science and the Industrial Revolution.

You're not exactly helping your case for positive progress here.

Scientist Michael Nielsen has given us a useful term for understanding how culture shapes progress: the hyper-entity. He defines it as an “imagined hypothetical future object or class of objects” — something that exists in the collective consciousness before (maybe) becoming reality. Today’s hyper-entities include AGI, space elevators, Martian settlements, the Singularity, and universal quantum computers.

I'm not on board with "hyper-entity" for that. That would make a damn good supervillain name, though: Spider-Man vs.... the Hyper-Entity!

But again, let's be clear, here: all those things the author is calling hyper-entities made their debut in science fiction.

SF can also imagine entities we'd rather not have: xenomorphs, uppity androids, deadly lab-produced plagues. So yes, sometimes it's a warning about what not to do, but it can also be a blueprint for what might actually work.

National projects can start as hyper-entities as well. The Apollo program didn’t begin with rockets. It began with a story. President Kennedy’s line, “We choose to go to the Moon,” transformed space exploration into a cultural commitment.

No. It began with fantasy, and then science fiction. The basic idea that we can visit the moon is as old as stories. Only relatively recently did science fiction authors imagine how it might actually happen, and they inspired rocket scientists as well as politicians.

Yet hyper-entities are not always benign. Once they move from the imagination to the real world, they can take on unforeseen characteristics or become rigid and incapable of evolving with the times. The modern education system, once a breakthrough in spreading knowledge, is now often criticized for lagging behind the needs of a changing world. Bureaucracies, once cultural advances in coordination, can calcify into obstacles. Hyper-entities can magnify human potential, but they can just as easily magnify inertia.

Well, yeah. Nothing comes without a price. Probably the most obvious invention with clear upsides and downsides is nuclear energy.

If hyper-entities are the heavyweights of culture, then memes are its quick strikes.

In this section, the author, I think, falsely conflates Dawkins' original idea of a "meme" with the bumper-sticker philosophy added to cute cat pictures and all manner of other images floating around the internet, which are called "memes," but don't usually rise to the level required to be a true means of cultural idea transmission.

That the word "meme" itself has changed meaning, or at least connotation, over time, is a source of amusement to me. The word was coined as a cultural equivalent to "gene," the unit of trait transmission in biological entities. So, just as genes sometimes mutate and change the course of entire species, so too did the connotation of "meme."

Most memes burn out quickly, but some prove powerful. A few compress complicated ideas into such simple, contagious forms that they shape how people think and act at scale.

Just as genetic mutation usually produces something neutral or maladaptive, while occasionally creating something useful, like, I don't know, opposable thumbs or really sharp teeth. And damn, but there are a lot of maladaptive memes (modern connotation) out there.

“Move fast and break things” began as an internal motto at Facebook. Within a few years, it had spread across Silicon Valley, becoming shorthand for the entire startup ethos: experiment quickly, worry less about rules, and treat disruption as a virtue. Five words shifted an industry’s attitude.

This, too, has very obvious downsides.

To her credit, the author does address this:

Sparks don’t always start the fires you want, though. The same speed that makes memes powerful also makes them volatile. And what’s true for memes is true for culture more broadly: It doesn’t always drive society forward.

The idea that progress (of any kind) is some kind of bar that always goes up is a myth, and not the good kind of myth. It's more of a "three steps forward, two steps back" kind of thing. Sometimes, three steps forward, five steps back.

Sometimes culture stalls innovation. Genetically modified crops promised higher yields and reduced pesticide use, but public fear — shaped by stories of “Frankenfoods” — slowed adoption in many parts of the world, including places where GMOs might have reduced hunger and improved health.

This is, to me, the Platonic ideal of an example of that.

The Enlightenment, the Apollo program, and even today’s debates over AI all show that progress depends not only on technology and institutions, but also on our feelings about the future. Are we complacent, fearful, or hopeful?

Depending on my mood, I could be any one of those things. Well, not so much "fearful." I'm entirely too cynical and fatalistic for that. But somewhere on that spectrum, sure. Today, though, is traditionally a day to express hopefulness, so I'll leave that discussion for another time.

Optimism assumes things will work out no matter what. It’s a sunny outlook, but a passive one. If progress is guaranteed, there’s little reason to struggle for it.

The argument I think the author is making here is that hope is a better worldview than optimism, for that reason.

Hope is active where optimism is passive. Hope assumes risk and uncertainty. It isn’t blind to challenges, but it believes they can be navigated. Hope doesn’t wait for good outcomes to arrive on their own. It frames progress as something worth fighting for.

And, all too often, it's kicked right in the balls.

In my own work, I’ve noticed how often people look puzzled when asked to imagine positive futures. They can easily list disasters — pandemics, climate collapse, runaway AI — but when pressed for hopeful scenarios, they hesitate. That hesitation is telling. It shows how little scaffolding mainstream culture gives us for constructive imagination.

Takeaway: people need to pay more attention to Star Trek.

There is, of course, quite a bit more in the article. I think she makes some good points. I'm still not making any New Year's resolutions. But it's very likely I'll read and watch more science fiction.
December 31, 2025 at 8:54am
#1104784
Let's wrap up the Gregorian calendar year with what may or may not be an AI-generated or -assisted article from Gizmodo.

Slop may be seeping into the nooks and crannies of our brains.

Let me tell you, whoever first called "AI" output "slop" should be outed as the most influential person of the decade. Sadly, it wasn't me, this time.

If you think of something to say and say it, that could never be AI slop, right? In theory, all organically grown utterances and snippets of text are safe from that label.

Welllll... philosophically, do you really know you're not artificial? I mean, really, really know? There are a whole lot of "this is all a simulation" folks out there, some of whom may or may not be bots, but if they're right (which they probably aren't, but no one can prove it either way), then you're just as much AI as your friendly neighborhood LLM. Just, maybe, a little more advanced. Or maybe not. I can point to a few supposedly organic biological human beings who make less sense than chatbots. Flat-earthers, for example.

But our shared linguistic ecosystem may be so AI-saturated, we now all sound like AI.

For variant values of "we" and "all," okay.

Worse, in some cases AI-infected speech is being spouted by (ostensibly human) elected officials.

Well, those are all alien lizard people anyway.

Back in July of this year, researchers at the Max Planck Institute for Human Development’s Center for Adaptive Rationality released a paper on this topic titled “Empirical evidence of Large Language Model’s influence on human spoken communication.”

Today (or, rather, when I first saved this article earlier this month) I learned that there's a Center for Adaptive Rationality, and that it's named after someone better known for defining the absolute lower limit on the amount of size and time that can be meaningfully measured. There's a metaphor in there, somewhere, or at least a pun, but I haven't quite teased it out, yet. Something about human rationality being measured in Planck lengths. Most people wouldn't get the joke, anyway.

As Gizmodo noted at the time, it quantified YouTube users’ adoption of words like “underscore,” “comprehend,” “bolster,” “boast,” “swift,” “inquiry,” and “meticulous.”

And? All that shows is that some tubers' scripts may have been generated or assisted by AI.

That exercise unearthed a plausible—but hardly conclusive—link between changes to people’s spoken vocabularies over the 18 months following the release of ChatGPT and their exposure to the chatbot.

See that? That double emdash in that quote right there? That's also a hallmark of LLM output. There is absolutely nothing wrong with using emdashes; I do it from time to time, myself, and have been long before this latest crop of generative language models. But now, thanks to LLMs, you can't use one without being accused of AI use. Unfortunately, I fear the same is going to happen to semicolons; those few of us who know how to use them correctly are going to be scrutinized, too.

But two new, more anecdotal reports, suggest that our chatbot dialect isn’t just something that can be found through close analysis of data. It might be an obvious, every day fact of life now.

I must underscore that these are, indeed, anecdotes. Which can bolster understanding, but fall short of the meticulous standards needed for science. Many people don't comprehend that scientific inquiry requires more than just stories, though people are more swift to relate to stories than to dry data. That's why many science articles boast anecdotes in their ledes: to hook the reader, draw them in before getting to the dry stuff.

I really, really, hope you see what I did there.

Anyway, the money quote, for me, is this one:

As “Cassie” an r/AmItheAsshole moderator who only gave Wired her first name put it, “AI is trained off people, and people copy what they see other people doing.” In other words, Cassie said, “People become more like AI, and AI becomes more like people.”

You humans... er, I mean, we humans tend to hold our intelligence in high regard, for inexplicable reasons. It's right there in the official label we slapped on ourselves: homo sapiens, where "homo" isn't some derogatory slur, but simply means "human." The Latin root was more like "man," and also gave French the word "homme," and Spanish "hombre," which mean adult male human, and we can argue about the masculine being the default, as in "all men are created equal," though I agree that usage is antiquated now and that we should strive to be more inclusive in language. The important part of that binomial for this discussion, though, is "sapiens," which can mean "wise" or "intelligent," which we can also argue isn't the same thing (it certainly is not in D&D).

But I've noted in the past that our so-called creative process relies primarily on soaking up past inputs (experiences, words, mannerisms, styles, etc.) and rearranging them in ways that make sense to us and, sometimes, if we're lucky, also to someone else. Consequently, it should shock or surprise no one that we're aping the output of LLMs. I've done it consciously in this entry, but I have undoubtedly done it unconsciously, as well.

We can assert that this is the difference: consciousness. The problem with that assertion is that no one understands what consciousness actually is. I'm convinced I'm conscious (cogito ergo sum), but am I, really, or am I just channeling the parts of Descartes' philosophy that I agree with? And as for the rest of you, I can never truly be sure, though it's safest to assume that you are.

We're all regurgitative entities, to put it more simply (though with an adjective I apparently just made up). Everything we think, say, do, or create is a remix of what the people before us have thought, said, did, created, etc.

Despite my stylistic choices here, I did not use AI or LLMs to write anything in this entry, or for that matter any other entry, ever. The image in the header is, of course, AI-generated, but not the text. Never the text, not without disclosure. You might not believe that, and there's not much I can do about it if that's the case. But it's true. Still, the influence of LLMs is apparent, is it not? At the very least, without them, I would never have had occasion to write this entry.
December 30, 2025 at 7:25am
#1104728
Looks like I get to balance out yesterday's "man" entry with a "woman" one, this being from Women'sHealth (don't ask me why it's all one word like that):
ADHD Can Be A Superpower—And Science Just Revealed Exactly Why
Researchers hope the findings will boost confidence and well-being for those with the condition.

Maybe, if they can get through reading the study without getting distracted.

But new research suggests that having ADHD comes with some upsides too.

This shouldn't be surprising. Though as usual, "research suggests" is a far cry from the headline's promised "Just Revealed Exactly Why."

For the study, researchers asked 200 adults with ADHD and 200 adults without ADHD to examine how strongly they identify with 25 positive characteristics, like humor, creativity, and spontaneity.

I just want to know if the control group finished faster.

Yes, I am going to continue to make "distraction" jokes.

People with ADHD were more likely to strongly endorse 10 strengths they had over those without the condition. Those included:

(a list of five completely unsurprising traits)

One of the traits listed is "spontaneity." I'd argue that's not a strength. It's an annoyance to others, like chewing with your mouth open.

But the findings suggest that adults with ADHD who are aware of their strengths and actually use them have higher confidence and quality of life as a result, Ammon says.

This seems like more than just "looking on the bright side." It's like Rudolph the Red-Nosed Reindeer, who taught us all an important life lesson: that deviation from the norm will be punished and shunned unless it can be exploited, and that the exploited should be happy about it. By labeling ADHD as such, you're slapping on a "diagnosis," one with the words "deficit" and "disorder," which all have negative connotations. If you can see things from a different point of view, and recognize that you actually have a superpower, well, then Santa finally has a use for you.

In case it's not clear, what I mean is that those who deviate from the norm should be accepted for who they are, even if they're not useful.

Hoogman says she hopes the findings will help people understand the strengths associated with ADHD. “My other studies show that adults with ADHD frequently, in addition to their deficits, also experience benefits from their ADHD characteristics,” she says.

Just don't put them where they can see squirrels.
December 29, 2025 at 10:27am
#1104664
If there's one thing you can trust from grumpy old men, it's a sentence that starts "Back in my day..." So here's a list from Art of Manliness.


Fear not; I won't be going over all 57.

These entries haven’t been chosen based on pure nostalgia, nor the viability of their comebacks — many have a poor chance of resurrection indeed. Rather these are simply things that it would genuinely be nice to see revived, and in many cases wouldn’t need to supplant culture’s current offerings, but could co-exist as happy supplements alongside them — additions that would make for richer and more varied lives.

In other words, opinion. That's fine. Nothing wrong with opinion, so long as it's based on facts. Well, maybe I have a different opinion.

Soda Fountains

In an age where fewer people are drinking alcohol, the soda fountain just might be the third space we need again.


Better idea: bring back drinking alcohol.

Attention Spans

While I admit I agree with this one, it still comes across as grumpyoldmanish.

Carrying Cash

But there are times and places where cash still comes in handy: a high school basketball game, the bait shop in the middle of nowhere, the after-hours campground fee box, the valet who deserves more than a muttered thank you.


None of those situations apply to me. Also, I don't like lying to beggars when they ask me if I have any spare cash.

Eccentricity

There’s lots of evidence that people, on the whole, are getting less weird. Less deviant, less creative, less inclined to divert from the standard societal lockstep. It seems like we have less eccentrics than we used to — those oddballs who dressed differently, read strange books, and didn’t care if anyone understood them.


Whaaaat? Have you seen the internet?

Paper Maps

They don’t buffer, they don’t die at 3%, and they don’t reroute you into a lake.


Look, I'm a huge fan of both GPS and paper maps, and on long trips, I always keep a road atlas with me as backup. But anyone who thinks no one ever got turned around by using paper maps is fooling themselves and bordering on Luddism. In truth, GPS is excellent at getting you un-lost when you've relied too much on paper maps. And anyone who blindly follows a computer's directions into a lake, well, honestly, that's on them.

Not to mention I'm the only person in the world who actually knows how to re-fold one the right way.

Door-to-Door Knife Sharpeners

You’ve heard of door-to-door salesmen, but did you know there used to be door-to-door knife sharpeners?


Right, because letting a stranger into your house to play with your knives is such a great idea.

Penmanship

It’s particularly rewarding to master cursive — a skill that’s especially endangered, not only in regards to writing it but even reading it.


Yeah, no. Although I was learning that shit back in the pre-PC days, I never got the hang of writing in cursive. And I never could read it, most of the time.

You know what I am pretty good at, though? Writing neatly in the all-caps block style preferred by engineers. I was good at it before I became an engineer. It got me my first job in an engineering office, as a drafter, before CAD took over.

I went through a phase where I tried to learn calligraphy. That was easier than cursive (with the right tools, anyway). Kids these days (dammit, now you got me doing it) probably think of cursive as just as antiquated as I thought of calligraphy.

Real Dates

Yeah, right. For me, that would require a woman, and that ain't gonna happen. Though the authors seem to be talking about it in terms of what other people do, and to that I say: mind your own business.

Typewriters

I learned to type on a typewriter. First a manual one, then an electric one, then a fancy IBM Selectric with limited editing functions, kind of a middle step between typewriter and word processor/printer.

Word processors are superior in every way.

Landlines

No.

Only reason to have one, for me, is that they tend to work even when the power is out. I have a generator for that now, and a spare battery I keep charged, one that can transfer charge to a mobile phone. And that niche case doesn't even come close to making up for the telemarketing calls I'd receive at all hours.

Record Players + Vinyl Records

In a world of endless, algorithmically curated streaming playlists, listening to music on a record player makes music listening feel like an event, not just background hum.


Where have these authors been? Those still exist. And I agree that vinyl records have their charms. But as is often the case with me, convenience wins out. Also, there was the flood incident back in the 80s that destroyed my extensive vinyl collection, and no, I will never get over that.

Colorful Insults

Modern insults are pretty boring — mostly the same set of expletive-laden put-downs.


On this one, I am in complete agreement. Not that I don't get extensive use of the "expletive-laden put-downs."

Neckties

Neckties aren’t expected in many situations anymore — but that’s exactly what makes them meaningful. Wearing one signals intention, care, and the willingness to rise above the bare minimum.


I'm pretty sure neckties got started before the collar button was perfected. Their whole purpose was to keep your shirt closed so no one had to look at that chest hair and get all excited (or disgusted). They then became an impractical fashion accessory. They're also exceedingly dangerous in a fight.

Film Cameras

I have mixed feelings about this one. First of all, as with vinyl records, they absolutely do still exist. They're just not as much a part of everyday life as they used to be. As someone who used to use them to make beer money, I do feel a kind of nostalgia for them, and even more for the darkroom skills I carefully cultivated.

But, again... I'm too damn lazy, and there's no going back now. Digital cameras have amazing quality these days, and they're convenient. Also, one of the best things about film cameras was Kodachrome (cue Paul Simon here), and they stopped making that.

Wood-Burning Fireplaces

Most fireplaces now run on gas. Flip a switch and you get some instant heat and ambiance.


I do appreciate the radiant heat of a fireplace. But, again, I think of all the wood I had to saw, chop, and split as a kid, and then I have nightmares for a week. No, thanks. Not to mention atmospheric particulates.

Anyway, there's more at the link, if you can overcome your reluctance to click on a site called Art of Manliness.

I'm reminded of those horse carriages in Central Park. I don't know if they still have those; I haven't seen one since before Covid. They'd line up at the south corner, Fifth and 59th, right across from the giant modernist cubic Apple store, and you'd get liveried around in a carriage with the clop clop and the plop plop. I never did it, just saw it. I don't think they treated the horses very well, but that's not my point; my point is that, sometimes, it's better to just let things die a natural death and stop longing for a past that, honestly, wasn't that great.

December 28, 2025 at 10:15am
#1104580
From Smithsonian (though a reprint from Quanta), an edgy article:


Neuroscientists studying the shifts between sleep and awareness are finding many liminal states, which could help explain the disorders that can result when sleep transitions go wrong

As with most things in life, "awake" and "asleep" aren't truly binary. There's always that transition. Sometimes it's gradual. Sometimes, like when you hear a cat puking at 4am, it's almost instantaneous. But "almost" isn't a true switch-flip; it's just faster.

For a very long time, I wondered if it were possible to catch that exact moment when awake becomes asleep, or vice-versa, but not only would that require consciousness on one side, but there's also not an "exact moment."

And I learned the adjectives describing these transitions: hypnagogic, for falling asleep; and hypnopompic, for awakening.

Look, when you have a tendency toward sleep paralysis, you learn these things, okay? There are nouns for the states, too: hypnagogia, for example.

The pillow is cold against your cheek. Your upstairs neighbor creaks across the ceiling. You close your eyes; shadows and light dance over your vision. A cat sniffs at a piece of cheese. Dots fall into a lake. All this feels very normal and fine, even though you don’t own a cat and you’re nowhere near a lake.

Worse, you don't have an upstairs neighbor.

To fall asleep, “everything has to change,” says Adam Horowitz, a research affiliate in sleep science at MIT.

Yes, I can feel my bones warping, my flesh shifting... oh, you mean everything in the central nervous system.

It’s still largely mysterious how the brain manages to move between these states safely and efficiently.

It's still largely mysterious to me how they define "safely and efficiently." You know that thing where you're falling asleep and suddenly you're literally falling? Okay, not "literally" literally, but your brain thinks it is and you wake up with your heart pounding? Yeah, that's not "safe" for some of us. That's called a hypnagogic jerk, incidentally, and by "jerk" it's not making a value judgement.

Sleep has been traditionally thought of as an all-or-nothing phenomenon, Lewis says. You’re either awake or asleep. But the new findings are showing that it’s “much more of a spectrum than it is a category.”

Much like life vs. death.

In the early 1950s, the physiologist Nathaniel Kleitman at the University of Chicago and his student Eugene Aserinsky first described the sleep stage categorized by rapid eye movement, or REM sleep—a cycle the brain repeats multiple times throughout the night, during which we tend to dream.

For some reason, I thought REM sleep was described way earlier than this. Must have dreamed it.

Though some evidence indicated that the brain could exist in a state that mixed sleep and wakefulness, it was largely ignored. It was considered too complicated and variable, counter to most researchers’ tightly defined view of sleep.

This sort of thing can encourage binary thinking: all or nothing, black or white. "It's too hard to study" is a legitimate thing when you're first delving into something, but the truth is usually more complicated. It's like the joke about physicists: "First, assume a perfectly spherical cow..."

Around the time that Loomis was conducting EEG experiments in his mansion, [Salvador Dali] was experimenting with his own transitions into sleep. As he described it in his 1948 book, 50 Secrets of Magic Craftsmanship, he would sit in a “bony armchair, preferably of Spanish style,” while loosely holding a heavy key in one palm above an upside-down plate on the floor. As he drifted off, his hands would slacken—and eventually, the key would fall through his fingers. The sudden clack of the key hitting the plate would wake him.

I remember reading that book, years ago, because I've long been a fan of the dreamlike images of surrealism. I remember he called it "sleep with a key." The key, of course, I felt was symbolic, as in unlocking a mysterious door; it could, presumably, have been any similar object, like a nail or a large coin.

Other great minds, including Thomas Edison and Edgar Allan Poe, shared his interest in and experimentation with what is known as the hypnagogic state—the early window of sleep when we start to experience mental imagery while we’re still awake.

Edison can bite my ass, but that does explain quite a bit about Poe.

In 2021, a group of researchers at the Paris Brain Institute, including Andrillon, discovered that these self-experimenters had gotten it right. Waking up from this earliest sleep stage, known as N1, seemed to put people in a “creative sweet spot.” People who woke up after spending around 15 seconds in the hypnagogic state were nearly three times as likely to discover a hidden rule in a mathematical problem. A couple years later, another study, led by Horowitz at MIT, found that it’s possible to further boost creativity in people emerging from this state by guiding what they dream about.

Much more recent research, and, I imagine, of particular interest to writers.

“We could think that there’s a function” to these mental experiences, says Sidarta Ribeiro, a neuroscientist at the Federal University of Rio Grande do Norte in Brazil. “But maybe there isn’t. Maybe it’s a byproduct of what’s going on in the brain.”

I feel like the Industrial Revolution trained us all to think in terms of function or purpose. "We kept cats around because they're good at pest control; therefore, if a cat is not good at pest control, it has no value." Which is, of course, bullshit; what value does art have? Of course, most art doesn't wake you up by puking at 4am, but my point stands.

Other times, there are things we think have no function, but we discover one, like the vermiform appendix in humans.

Mostly, though, I think even if there's not a clear evolutionary advantage to some feature, we can turn it into one, and I think the hypnagogic state might be one of those things, turning a byproduct of our need for sleep into a wellspring of creativity.

The article goes on to explore that more mysterious side of things, awakening. I'm skipping that bit, even though it's interesting. Then they get into sleep disorders, which of course are interesting to me, but your experience may vary.

Worst of all, though, for others if not for me, is that I've discovered in myself a tendency to pun in my sleep, and sometimes even remember the puns upon awakening. Hence the title of today's entry.
December 27, 2025 at 9:16am
#1104529
This one's from Time, and I don't know how many ads, popups, warnings, or other crud you're going to get if you click on it.


It's funny. Honesty is touted as a virtue, and yet there are multiple situations, such as this one, where one is expected or even encouraged to lie out their ass, and that's considered a virtue, too.

Within six weeks in 2014, [Nora McInerny's] father passed away, her husband died of brain cancer, and she miscarried her second child.

You know how some people try to one-up you when something goes wrong? You're like "my dog died," and they have to be, "Well, both of my dogs died and my toddler got run over by a truck." I think this qualifies her for the World Championship of one-upsmanship. And, if she has any musical talent at all, the country music charts.

It makes sense, then, how much time she’s spent pondering what to say when someone asks you how you are, and the truth isn’t “good.”

I have two simple go-to responses, myself: "Horrible," and "could be worse." Because, as we have just seen, it could always be worse.

About a year ago, Jennifer C. Veilleux set a goal for herself: She would try never to answer “I’m fine” or “I’m good” if she wasn’t really feeling that way.

I'm also getting the impression that women are expected to lie if they're not doing "fine," more than men are. Just another double standard.

“We know what we’re supposed to say: ‘I’m fine, how are you?’ Yet that’s often not true,” says Veilleux, a professor of clinical psychology at the University of Arkansas, Fayetteville, who studies emotion.

Huh. Higher education in Arkansas. Who'd have guessed?

Research suggests that suppressing emotions is linked to increased anxiety, depression, and stress, as well as poor relationships.

The answer, of course, is to not have emotions. Or relationships.

First, gauge someone’s capacity for the truth

Are you seriously telling me to "read the room?" Get out of here.

Keep these handy responses close

Now I'm picturing some chick getting asked "How are you?" and then she holds up a finger, digs through her purse, pulls out a small pack of cards with canned responses, and draws one at random.

Even when you’re not, “fine, thanks” sometimes does the trick

Ooooh, way to negate the rest of the article.

No, seriously, though, I think the point is to be more honest with people you already know, not random minimum-wage workers who are asking as part of their script. You already know they're not fine, because they're minimum-wage workers having to follow a script, and probably listen to brain-rotting "music" all day. They're not your friend, coworker, casual acquaintance, or therapist. Telling them the truth fixes nothing, and only makes things worse. On the other hand, being too chipper can also make them feel rotten about their lot in life. So yeah, in that case, just go through the pro-forma motions, like when you end a letter with "Sincerely."

Remember: most people care

Snort.

Some people care. Some just pretend to care. Others take perverse pleasure in your misfortune, feeling superior when they find out your life is worse than theirs. Still others will engage in one-upsmanship, as above.

Anyway, yeah, I joke. If I'm being honest, sometimes I joke to cover up how I'm feeling. I keep imagining going to a shrink and having them, very seriously, talk to me about "defense mechanisms" and "letting yourself feel" and "being honest with people."

Not today, though. I'm actually doing just fine, thanks.
December 26, 2025 at 8:41am
#1104468
I've noted many times that the answer to a headline question is almost always "No." I'm willing to believe this El Pais article is an exception.


Can cheese protect brain health? This is what the science says
A controversial study suggests that consuming these dairy products may have a protective effect, but experts aren’t so sure

Yes. It's another article about cheese.

Eating more high-fat cheese and cream may be associated with a lower risk of developing dementia, according to a study published on December 17 in the academic journal Neurology.

Of course, I should issue my usual disclaimers about one study, peer review, replication, and so on. And to be very careful about who funded the study; even if the scientists involved were trying to be objective, bias can creep in. Like when Willy Wonka funded those studies that insisted that chocolate is good for you.

Still. Just as with chocolate, I don't much care whether it's good for you, only whether it's good. And it is.

The analysis — based on data collected from nearly 30,000 people — challenges the previous scientific belief that a low-fat diet could have a protective effect against dementia.

Well, I guess I at least can accept the sample size, for once.

Although its conclusions are quite dramatic, it’s an observational study that doesn’t prove causation.

Science doesn't "prove;" it supports or falsifies. But yes, objectively, we also have to be concerned about correlation vs. causation. Except in this case, when, let's be honest, I'm going to eat cheese anyway.

Researchers analyzed data from 27,670 people in Sweden, with an average age of 58 at the start of the study.

Well, there goes my lack of concern about the sample size. Sweden isn't exactly famous for ethnic diversity.

At that time, participants recorded their food intake for one week and answered questions about how frequently they had consumed certain foods in recent years.

Not a great methodology, in my view. Self-reporting is notoriously hit-or-miss.

After adjusting for age, gender, education and overall diet quality, the researchers found that people who reported consuming more high-fat cheese had a 13% lower risk of developing dementia than those who consumed less.

I mean, I'll take what I can get, but I think 13% doesn't mean much on an individual level, only in aggregate.
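To show what I mean by that, here's a minimal C sketch of the arithmetic, using a made-up baseline lifetime risk (the article doesn't give one, so the 10% below is purely my own assumption for illustration, not anything from the study):

```c
#include <stdio.h>

int main(void) {
    /* Hypothetical numbers for illustration only; the study reports a
       relative reduction, not anyone's actual baseline risk. */
    double baseline_risk      = 0.10;  /* assumed 10% lifetime dementia risk  */
    double relative_reduction = 0.13;  /* the study's reported "13% lower risk" */

    double cheese_eater_risk = baseline_risk * (1.0 - relative_reduction);

    printf("Assumed baseline risk:  %.1f%%\n", baseline_risk * 100.0);
    printf("Risk with more cheese:  %.1f%%\n", cheese_eater_risk * 100.0);
    printf("Absolute difference:    %.1f percentage points\n",
           (baseline_risk - cheese_eater_risk) * 100.0);
    return 0;
}
```

Against that assumed baseline, the headline "13% lower risk" works out to a little over one percentage point for any given person. Worth having with crackers, sure, but not worth reorganizing your life around.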

Naveed Sattar, a professor of cardiometabolic medicine and an honorary consultant physician at the University of Glasgow, is highly critical of the study.

And that's okay. This is how science works.

While all experts point to the importance of lifestyle and healthy choices for maintaining optimal brain health, most of what determines whether a person develops dementia is beyond their control.

Which is why I don't worry too much about it. I'm of the considered opinion that, for myself at least, the stress of always having to do the Right Thing, and deprive myself of simple pleasures such as consuming delicious cheese, has a more negative effect than just doing what feels good.

Which, I know, is the basic definition of hedonism. I'm okay with that.
December 25, 2025 at 8:16am
#1104403
I'd saved this Quanta article just because I thought it was interesting, especially as someone who is learning a new language later in life.

Is language core to thought, or a separate process? For 15 years, the neuroscientist Ev Fedorenko has gathered evidence of a language network in the human brain — and has found some similarities to LLMs.

See, I'd never wondered whether language was core to thought or not; for me, it absolutely is. I think in words. Sometimes also pictures, but also words (numbers are words, too, like seventeen or one-eighty-five).

Even in a world where large language models (LLMs) and AI chatbots are commonplace, it can be hard to fully accept that fluent writing can come from an unthinking machine.

I thought AI chatbots were LLMs, but whatever.

That’s because, to many of us, finding the right words is a crucial part of thought — not the outcome of some separate process.

I expect this is especially true for writers.

But what if our neurobiological reality includes a system that behaves something like an LLM?

It's funny. As technology advanced, we kept coming up with new terms to compare to how the brain works. Near the beginning of the industrial revolution, it was "gears turning" (that one persisted). Later, some compared neuronal signaling to telegraph lines. A while back, people started saying our brains are "hardwired" to do this or that. Now it's "the brain works like an LLM."

The joke is that a) no, the brain doesn't work like any of those things; it's just a useful metaphor and b) if anything, LLMs are like the brain, not the other way around. (In math, A=B is the same as B=A, but not necessarily in language.)

Long before the rise of ChatGPT, the cognitive neuroscientist Ev Fedorenko began studying how language works in the adult human brain.

The brain is, however, notoriously hard to study, because it's complicated, but also because we're using a brain to study it with.

Her research suggests that, in some ways, we do carry around a biological version of an LLM — that is, a mindless language processor — inside our own brains.

I'd want to be more careful using the word "mindless." I'm pretty sure I know what the author means, but one of the great mysteries left to solve is what, exactly, is a mind.

“You can think of the language network as a set of pointers,” Fedorenko said. “It’s like a map, and it tells you where in the brain you can find different kinds of meaning. It’s basically a glorified parser that helps us put the pieces together — and then all the thinking and interesting stuff happens outside of [its] boundaries.”

I'm no expert at coding, but I know some computer languages have variables called "pointers" whose data is solely where to find other data. Don't ask me; I never did get the hang of them. But again, we have a technological metaphor for the brain. These are like the Bohr model of the atom: useful for some things, but not reflective of reality. So when I read the above quote, that's where my brain went.
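For anyone else who never got the hang of them, here's a minimal C sketch of the idea (my own toy example, nothing from the article): the pointer itself holds no meaning, only a location, and you have to follow it to get at the actual data.

```c
#include <stdio.h>

int main(void) {
    int meaning = 42;        /* an ordinary variable holding actual data   */
    int *where  = &meaning;  /* a pointer: its only content is an address  */

    /* Following ("dereferencing") the pointer retrieves the value it
       points at; printing the pointer itself just shows a location. */
    printf("address: %p, value: %d\n", (void *)where, *where);

    *where = 7;              /* writing through the pointer changes the
                                original variable, not the pointer itself  */
    printf("meaning is now: %d\n", meaning);
    return 0;
}
```

Which is roughly the shape of Fedorenko's metaphor as quoted above: the network doesn't contain the meaning; it just knows where to look for it.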

Unlike a large language model, the human language network doesn’t string words into plausible-sounding patterns with nobody home; instead, it acts as a translator between external perceptions (such as speech, writing and sign language) and representations of meaning encoded in other parts of the brain (including episodic memory and social cognition, which LLMs don’t possess).

Yet.

A lot of the article is an interview with Fedorenko, and I don't really have much more to say about it; it's just a bit of insight into how thinkers think about thinking, from a physical point of view.
December 24, 2025 at 10:57am
#1104343
Pardon the mess while I experiment with new formats thanks to Our Glorious Leader's fun new editing interface.

Meanwhile, I think this is the first time I've highlighted a link from Snopes:

How to spot suspicious health supplements and avoid getting scammed
Snopes readers regularly ask whether supplement brands like Neurocept and Burn Peak are legit. Here's how you can tell yourself.

Well, it's simple, see: if it's a supplement, it's a scam.

Oh, sure, not always. But like getting phone calls from an unknown number, it's best to assume the worst rather than take chances.

Many supplement brands readers ask about use unethical business practices to sell products that simply do not work.

Or, worse, will actively make you sick.

While deepfakes may be difficult for many internet users to spot, many of the health supplement products that seek to trick people into parting with an excessive amount of money have common red flags in their online presence that take no research or special knowledge to be able to spot.

For starters, they advertise on the internet.

In this article, Snopes will guide you step-by-step through how to easily spot a potential health supplement scam.

Of course, these scams predated the internet by decades, the most famous one being the promotion and sale of fake snake oil.

Turns out actual snake oil may have some beneficial properties, but that wasn't the problem (except from the point of view of snakes). The problem was they weren't selling actual snake oil, but whatever ingredients they could obtain cheaply.

I also did an entry last month on a fun supplement containing radium: "It Got Glowing Reviews". There, I also ragged on "supplements" being promoted today.

You should talk to your doctor if you think there is a supplement that might be beneficial to you.

These things always say "you should talk to your doctor." Bitch, I'm in the US. You know how hard it is to even get a chance to wait in the lobby?

Still, yes, you should talk to your doctor. Just remember that they're people, too, and they have enough knowledge to absorb without trying to keep track of every mostly-unregulated placebo (or worse) hawked by unscrupulous vendors.

It's worth noting that we were unable to reach out to the companies mentioned in this story to inquire about their business practices and the efficacy of their supplements because they all either did not list contact information or had nonfunctional contact links on their websites, a common practice for the sellers of unproven supplements.

Well, I'd consider that the third red flag, right after "it's a supplement" and "it's hawked in a popup ad": if they won't let you contact them, then they're almost certainly a scam.

The rest of the article is mostly about what Snopes considers to be the red flags. Personally, I prefer to keep things simple and avoid these products entirely. I think I can trust aspirin made by well-known manufacturers, but after that, my inner skeptic raises his ugly head.

And the link's there if you want it. I've spent all my energy this morning playing with the new text editor, which is very cool but there's a bit of a learning curve for those of us who have spent 20+ years learning the ins and outs of WritingML. Now I have low energy. Maybe I should go to a gas station and buy one of those untested, unregulated five-hour energy shots.
December 23, 2025 at 8:48am
#1104258
Here's an interesting article from the BBC which, despite having a few reservations, I thought I'd share.



The very first humans millions of years ago may have been inventors, according to a discovery in northwest Kenya.

Like I said, reservations.

Let's start with the definition of "human." The Latin binomial homo sapiens translates to "wise person" or "knowledgeable person," but that's just a matter of labeling, and besides, there's a big difference between "wise" and "knowledgeable."

To the extent of my understanding, evolution is generally a gradual process. Each generation broadly resembles the one before, but distant generations do not necessarily resemble each other. It's like... if you look in the mirror today, you will see the same face as you did yesterday, when you saw the same face as the day before. (Barring injury or hangover or the like, of course). Take any two consecutive days from your life, and neither you nor facial recognition software would be able to distinguish them. And yet, you look markedly different from 10-year-old you, who looked markedly different from 20-year-old you, etc.

Point is, I think it's really, really hard to point to one particular generation, especially millions of years in the past where the fossil record is sparse, and say, "These were the first humans."

But if pressed, I'd say there's one philosophical hump our ancestors had to go through. It wasn't tool use; lots of animals, as we now know, use tools. Some other animals even crudely make tools. But what probably distinguishes "sapiens" from other animals is: using tools to make other tools. That's a huge leap, in my opinion.

My point is: "the very first humans were inventors" is, philosophically at least, a tautology.

Second quibble: humans and other animals display a wide range of what we call "intelligence" within a species. In other words, there are geniuses, average specimens, and dummies. But what the vast majority of us have in common is that we're very, very good at mimicry. There's a reason "to ape" is a verb: the actual distinguishing mental characteristic of an ape is its ability to copy what others do (though other animals have this ability as well; parrots, e.g.)

So, no, I don't believe that "the very first humans were inventors." Just as we had luminaries like Nikola Tesla and whatever genius figured out how to make beer, all it takes is one human to make a mental leap, invent something truly new and useful, and next thing you know, the other humans have followed that lead. Some certainly improve on the invention, like how once someone created an incandescent bulb (I don't believe for a moment that it was Edison himself, but it could have been), almost anyone with the right equipment could create it, and someone else standardized the sockets, and another added frosted glass, etc.

And my point there is that, millions of years ago, the very first inventor invented inventing, and the rest of our ancestors just kept the momentum going: slowly at first, but eventually building on previous work until we could send robots to Mars and argue on the internet.

And, finally: like I said, the fossil/archaeological record is sparse. I don't see how they can definitively claim that they found the first invention. Plus, early humans seem to have been scattered in tribal or clan groups, just as we are today, but didn't have the internet—so it's entirely possible that inventing was invented in more than one place, separately, much as Newton and Leibniz both invented (or discovered, depending on your point of view) calculus.

Researchers have found that the primitive humans who lived 2.75 million years ago at an archaeological site called Namorotukunan used stone tools continuously for 300,000 years.

Quibbles aside, though, I'm not saying the article isn't worth reading. For instance: 2.75 million years? I can't remember hearing about any evidence of tool use that's that old. It also predates the generally accepted advent of what they call "anatomically modern" humans by, like, a shitload.

I've droned on enough; the article is there if you're interested. I'll just quote one more passage, from the end:

"We have probably vastly underestimated these early humans and human ancestors. We can actually trace the roots of our ability to adapt to change by using technology much earlier than we thought, all the way to 2.75 million years ago, and probably much earlier."

"And probably much earlier." That just supports my points above.
December 22, 2025 at 7:49am
#1104171
I'll have to wash my hands and bleach my eyes after this one, because it's from (ugh) Martha Stewart. But since it's about cheese, I'll allow it... this time.

    How to Eat Blue Cheese the Right Way, According to a Cheesemonger
Our guide to enjoying one of the cheese world’s most misunderstood stars.


"How to eat blue cheese?" For fuck's sake, just slide that shit onto some bread and stick it in your mouth.

Blue cheese has a reputation: bold, tangy, and sometimes intimidating—not to be confused with a bully, but easily misunderstood.

What's to misunderstand? It's cheese. It has blue stuff. It's delicious.

We spoke with an American Cheese Society Certified Cheese Professional and blue cheese lover to learn more about this sometimes maligned type of cheese and find out how to eat blue cheese.

You... chew. And swallow. Come ON.

Blue cheeses are defined by the blue or green veining, a specific type of mold, that streaks throughout the cheese. “This mold presence is very intentional—not just any cheese can grow mold and become a blue," says Lauren Toth, ACS CCP, cheesemonger and director of curriculum and talent development at Murray's Cheese.

I'm going to go ahead and assume that at least two of those Cs stand for "cheese."

The mold is formed when the cheesemaker introduces a specific strain of bacterial cultures, typically Penicillium roqueforti, into the milk.

I know, I know, some people freak out about microorganisms. If only they could internalize that they, themselves, are more microorganism than primate.

If your previous blue cheese experience came from a sad wedge at a salad bar or the bottled dressing of childhood, there’s good news: gentle, approachable blues exist, and they’re genuinely delightful.

Wait... are we still talking about cheese, or have we switched to sex?

Her top pick for a blue cheese newbie is Cambozola Black Label, a buttery, triple-crème hybrid of camembert and gorgonzola.

Ohhhhh... it's an AD.

Blues vary dramatically in texture, determined by milk type, curd handling, aging time, and piercing—encouraging more veining to develop. Toth says, “blues come in all different styles and textures—creamy, grainy, fudgy, crumbly, even fairly firm,” and those differences hint at how best to use them.

Still don't know whether this is supposed to make me hungry or "thirsty."

Blue cheeses tend to be bold, making their best partners sweet, rich, or fruity.

To be serious for a moment, my favorite New Year's Eve pairing with the traditional sparkling wine is a blue cheese and sliced pears. Also, walnuts. Perfect flavor combination.

When choosing what to drink with blue cheese, avoid overly tannic red wines when tasting new blues. Opt instead for sweet and sparkling wines to tame sharpness and highlight creaminess.

While it's traditional to drink wine with cheese, I find beer makes an excellent lubricant as well. Not just any beer, though; as much as I love Belgians, they're probably too strong for the cheese. I'd go with a more hoppy variety, just not an IPA. I don't like IPAs in general, though, so you'd have to experiment, were you so inclined.

For many people—Toth's own mother included—blue cheese is synonymous with bottled dressing, and that unfortunate association keeps people from discovering truly exceptional cheeses.

I'm also a fan of blue cheese dressing, just not the mass-produced kind.

So ends another cheesy entry. I have a few more in the pile, but I have no idea when another will grace us with its presence.
December 21, 2025 at 9:50am
#1104100
As today is the only day around this time of year that has any meaning for me at all, I thought it would be an appropriate time to take a break from our usual programming for a personal update.

A couple of weeks ago, without noting it (or expecting anyone else to), I posted an entry here that marked six straight years of daily blog posts. The streak started in the previous blog, but when that one filled up, I just continued here.

I feel accomplished and all, but at the same time, it means I haven't done anything in the past six years significant enough to make me take a break from blogging. Even the trip to Europe, which had been put off since certain events restricted travel, featured a daily blog post, if only a brief one.

On the other hand, it also means I didn't get sick enough to miss a day, and hell, that's a good thing, especially considering the "certain events" I just mentioned made a lot of people sick or dead.

Still, this time of year always fucks me up, no matter how good I've objectively got it, and not to brag, but I've objectively got it damn good. That could, of course, change at a moment's notice, and almost certainly will now that I've mentioned it out loud. I'm taking the chance because it's relevant to the rest of my rant here.

This December has been worse than others, though. I've withdrawn even more than usual, avoiding as much human contact as possible (and accepting as much feline contact as possible). Again, I emphasize that this is not due to anything bad happening to me in life, or any of the myriad stupid things I've done; it's just the way the season works for me. The season, layered on with the existential dread of facing one of those horrid "multiples of 10" birthdays early next calendar year.

Hell, it's gotten so bad that I haven't had a single delicious fermented and/or distilled beverage since the 5th (an appropriate day, because 5 December is the anniversary of the 21st Amendment, which repealed prohibition). I'm used to going days between drinks, public perception (which I promote) to the contrary, but over two weeks? I don't think I've abstained for that long since grade school.

It's not that I've forced abstention on myself; I just plain haven't felt like drinking, and I don't usually make myself drink when I don't feel like it. Next calendar month will probably be an exception—the concept of Dry January is personally offensive to me, so I'll try to do what I did last year, barring illness or other extenuating circumstances: have at least one alcoholic drink every day. Point is, I usually drink when I'm feeling really good or really bad, and this month has mostly just been a mental shade of beige.

Don't congratulate me on that, by the way. It's not some sort of big accomplishment. I'm not recovering from anything, and, planned observation of Ginuary or not, I'm sure I'll feel like having a beer soon. Perhaps even today, to mark the solstice in my usual fashion.

And the solstice does represent change to me. Which is odd, because the whole point is that the path of the Sun in the sky remains very close to the same for the days preceding and following it. Still (pun intended because "stice" means "still"), there's an objective, measurable moment (10:03am this year, based on my time zone, which should be shortly after this gets posted) when the Sun seems to hang perfectly above the Tropic of Capricorn, pausing there as if to rest before resuming its half-year journey northward.
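For the technically inclined: here's a quick sketch of how one could pin down that moment with the skyfield Python library. This is my own toy example, not anything official; it assumes skyfield is installed, and the de421.bsp ephemeris file downloads on first run.

    # Find the December solstice with skyfield's seasons() almanac function
    # (event code 3 is the December solstice). Times come back in UTC.
    from skyfield import api, almanac

    ts = api.load.timescale()
    eph = api.load('de421.bsp')

    t0 = ts.utc(2025, 12, 1)
    t1 = ts.utc(2025, 12, 31)
    times, events = almanac.find_discrete(t0, t1, almanac.seasons(eph))

    for t, e in zip(times, events):
        if e == 3:  # December solstice
            print(almanac.SEASON_EVENTS[e], t.utc_strftime('%Y-%m-%d %H:%M UTC'))

Convert from UTC to your own time zone, and you get that oddly specific minute when the Sun stands still.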

This is, as I've harped on for many years, the Reason for the Season.

Anyway.

Astronomical shifts or not, tomorrow, I'll probably get back to the usual format in here. I just wanted to take the time to inflict some of my not-feelings on everyone. Thing is, I don't want anything to change, right now. Any change will almost inevitably be for the worse. But if it does, I hope I can continue to face it with a laugh in my heart, the wind in my hair, and a beer in my hand.
December 20, 2025 at 8:22am
#1104020
A bit more serious rant today, thanks to this article from The Conversation:



Well, obviously, They're pushing conspiracy theories on us to distract from the real problems going on. Study it out, sheeple!

(I said "a bit more" serious, not "completely" serious.)

Everyone has looked up at the clouds and seen faces, animals, objects.

One time, I saw a giant Middle Finger. I felt that was appropriate.

But some people – perhaps a surprising number – look to the sky and see government plots and wicked deeds written there.

Not to mention aliens.

Conspiracy theorists say that contrails – long streaks of condensation left by aircraft – are actually chemtrails, clouds of chemical or biological agents dumped on the unsuspecting public for nefarious purposes. Different motives are ascribed, from weather control to mass poisoning.

Here in reality, meanwhile, weather patterns are shifting due to climate change, and mass poisoning is absolutely occurring due to pollution. There are people who refuse to accept that those things are happening, so there must be a grand, evil design behind it all.

I’m a communications researcher who studies conspiracy theories. The thoroughly debunked chemtrails theory provides a textbook example of how conspiracy theories work.

Translation (for conspiracy theorists): "I'm part of the cover-up."

More seriously, while this article focuses on the chemtrail nonsense, it provides insight into conspiracy "theories" (I really hate calling them that) in general.

But even without a deep dive into the science, the chemtrail theory has glaring logical problems. Two of them are falsifiability and parsimony.

This can, of course, be said of most conspiracy accusations. The Apollo one, for example. What does it really take for someone to believe that thousands, maybe millions, of people from all over the world faked the moon landing and managed to keep a lid on it, with our biggest rivals at the time playing along? That the USSR wouldn't have been the first to claim it was a hoax? That the US government, which these same people insist is utterly incompetent at anything, could manage to orchestrate such a grand conspiracy without leaving behind a single piece of actual evidence?

According to psychologist Rob Brotherton, conspiracy theories have a classic “heads I win, tails you lose” structure. Conspiracy theorists say that chemtrails are part of a nefarious government plot, but its existence has been covered up by the same villains.

Any new data, to them, is either part of the cover-up, or supports their belief.

Therefore, no amount of information could even hypothetically disprove it for true believers. This denial makes the theory nonfalsifiable, meaning it’s impossible to disprove. By contrast, good theories are not false, but they must also be constructed in such a way that if they were false, evidence could show that.

Bit of a quibble here: it's absolutely possible to have a good hypothesis that is later falsified. In science, this happens all the time, and it's just part of the process. Right now, there's evidence calling into question some of cosmologists' most cherished previous conclusions about the age and structure of the universe, and you know what? That's a good thing.

Nonfalsifiable theories are inherently suspect because they exist in a closed loop of self-confirmation. In practice, theories are not usually declared “false” based on a single test but are taken more or less seriously based on the preponderance of good evidence and scientific consensus.

Again, I'm not thrilled with the casual use of "theories" here, but I know the author means it in the colloquial, not the scientific sense. In that spirit, take one of my own pet theories: that sentient, technology-using life is vanishingly rare in the universe. I could change my mind about that in a heartbeat, if, for example, a flying saucer containing bug-eyed aliens landed in my front yard when I knew I hadn't been tripping.

Like most conspiracy theories, the chemtrail story tends not to meet the criteria of parsimony, also known as Occam’s razor, which suggests that the more suppositions a theory requires to be true, the less likely it actually is. While not perfect, this concept can be an important way to think about probability when it comes to conspiracy theories.

In fairness, Occam's Razor isn't a law carved in stone; it's a guide for choosing between hypotheses. Sometimes, things really are complicated. And sometimes, cover-ups happen. I was just reading about a possible link between Parkinson's and certain kinds of water contamination. In that case, though, the cover-up seems to have failed.
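To put rough numbers on the parsimony idea (a back-of-the-envelope sketch of my own, not anything from the article): if a claim only works when a bunch of independent assumptions all hold, even generous odds for each one multiply down fast.

    # Toy illustration of parsimony: a theory that needs n independent
    # assumptions, each individually pretty plausible (80% here, a made-up
    # number), becomes unlikely in a hurry as n grows.
    p_each = 0.8

    for n in (1, 3, 5, 10, 20):
        print(f"{n:2d} assumptions -> P(all hold) = {p_each ** n:.3f}")

One assumption, fine. Twenty, and you're down around one percent. That's the razor doing its work, no cover-up required.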

Of course, calling something a “conspiracy theory” does not automatically invalidate it. After all, real conspiracies do exist. But it’s important to remember scientist and science communicator Carl Sagan’s adage that “extraordinary claims require extraordinary evidence.”

This is why I get all ragey when I see yet another unsubstantiated claim about Comet 3I/ATLAS, which I vaguely remember mentioning in here recently.

If the evidence against it is so powerful and the logic is so weak, why do people believe the chemtrail conspiracy theory? As I have argued in my new book...

Oh hey, a stealth ad for a book! It's a conspiracy!

...conspiracy theorists create bonds with each other through shared practices of interpreting the world, seeing every detail and scrap of evidence as unshakable signs of a larger, hidden meaning.

Really, that sounds a lot like religion. And, in a way, it is: in that worldview, we're all at the whim of higher powers.

Conspiracies are dramatic and exciting, with clear lines of good and evil, whereas real life is boring and sometimes scary. The chemtrail theory is ultimately prideful. It’s a way for theorists to feel powerful and smart when they face things beyond their comprehension and control.

That's one reason why I can't completely dismiss all conspiracy "theorists" as absolute nutters. They're victims, too—victims of a world grown beyond any one person's comprehension. Also, it makes me realize that we all have the potential to fall for misinformation. Articles like this (I'm not going to buy the book, sorry) help me remember how to combat that tendency in myself.
December 19, 2025 at 10:38am
#1103950
Yeah, this one's been in my pile for a few months, and it's dated early July. I doubt many people are swimming there today.

    Paris reopens Seine River for public swimming
Parisians have begun bathing in the Seine for the first time in over 100 years after a ban was lifted. The French capital has created three swimming zones along the river as part of its Olympic legacy.


If I were a lesser person, I'd make a joke about "Parisians have begun bathing..." But I'm not going to stoop so low as to make that implication. Why, I wouldn't even approach it tangentially.

France's capital Paris reopened the Seine River to swimmers on Saturday for the first time in over a century.

Journalism at its finest, folks. I'm sure no one had any idea that Paris is the capital of France.

Paris authorities have created three outdoor pool zones, complete with changing rooms and showers and supervised by lifeguards.

If the Seine is so clean, why would they need showers?

The swimming zones also have beach-style furniture, offering space for 150 to 300 people to sunbathe.

No word on nudity?

Bathing in the Seine was officially banned in 1923, primarily due to health risks from pollution.

"We can either fix pollution, or ban swimming. Let's ban swimming."

Around €1.4 billion ($1.6 billion) was spent on improving water quality, which officials promised would benefit not just the Olympic athletes but residents and tourists for years to come.

Jokes aside for the moment, this is a civil engineering effort, so of course I appreciate it. I'll probably never know the details of all they did, but it's one of those things where the work goes largely unnoticed by the general public even as it serves them.
December 18, 2025 at 9:34am
#1103881
I always like knowing the origins of words, at least as far back as we can trace them. Here's one I knew from being raised around agriculture, but NPR explains it better than I could have:



Broadcasters keep popping up in the news.

Really? Because it seems to me actual "broadcasting" is dying.

Commercial TV networks have made headlines: CBS announced the cancellation of The Late Show with Stephen Colbert this summer. ABC drew ire in September when it yanked Jimmy Kimmel off the air...

Those sources are, of course, two of the three "classical" commercial TV networks, the other being NBC.

Broadcasting — distributing radio and television content for public audiences — has been around for a century, but is facing a uniquely challenging landscape today.

And I can kinda see the word being used for cable TV, but I'm not sure about streaming. Would you call Netflix a "broadcaster?" I wouldn't. That sort of thing is a "streamer," a word which has an almost opposite implication: broadcasting is wide; streaming is narrow.

It's kind of like "podcast," itself a portmanteau of iPod and broadcast. Though they stopped making iPods, so that word is an anachronym.

[Broadcast] originally described a method of planting seeds, particularly for small grains like wheat, oats and barley.

Oh, and here I thought it referred to hiring the female lead in an old movie. (Broad? Cast? I'll be here all week, folks; try the veal.)

Various dictionaries have traced the verb's first written use — to sow seed over a broad area — to 1733 and 1744.

Modern farms broadcast seeds mechanically, but the basic technique is the same: just scatter the little suckers.

The use of the term "broadcasting" to describe radio first hit the mainstream in the early 1920s. Radio signals (formerly called "wireless telegraphy") and amateur broadcasts existed before that, Socolow says.

What I think the article glides past is the "wireless telegraphy" part. A telegraph, like the later telephone, had a sender and a recipient. Broadcasting has a sender and a large number of potential recipients.

It was a piece of legislation that officially cemented broadcasting's new definition: The Communications Act of 1934 defined it as "the dissemination of radio communications intended to be received by the public, directly or by the intermediary of relay stations."

Technically, broadcast TV uses radio waves, just in a different segment of the EM spectrum.

These days, people tend to use the word to describe any sort of dissemination of information — even if it comes from cable news networks, social media platforms and streaming services, which are not technically broadcasters under the government's definition, Socolow says.

Well, okay, then. Obviously, words change over time. I just hadn't heard of Netflix or Prime Video ever referred to as "broadcasters."

The article tries to explain why it matters, but to me, it's just another quirk of language. And a source of really bad, sexist puns. After all, dames don't like to be called broads.
December 17, 2025 at 9:57am
#1103825
I have decided to turn this blog into a cheese-themed one.

Okay, no, just kidding. But cheese is my favorite condiment (bread is the only food; everything else is a condiment), so you get cheesy links from time to time. This one's from Salon:

    No one can resist a good cheese ball
Let alone these two: a sweet version — swirled with fruit jam — and a savory, covered in bacon and Parmesan crisps


As with any recipe article, they can't just give you the recipe. Oh, no, can't have that. Gotta write the Ph.D. thesis first, then get to the recipe.

I know it's for search engine fuckery. I don't have to like it.

In this case, it gives me something to comment on.

For years, the cheese ball has been my quiet party superpower.

Me too! I'm always a cheese ball at parties.

A well-made cheese ball has gravitational pull.

If only I could attract people by being a cheese ball. No, wait, I don't really want that. Then there'd be people around.

Visits to my grandmother’s house always began in the same place: the refrigerator.

How's that dissertation coming along?

[Her cheese ball] was a marvel of its genre: cream cheese, sharp cheddar, a splash of Worcestershire, a spoonful of sugar, crushed pineapple, pecans.

Snark aside, that does sound damn delicious.

That cheese ball has stayed with me all these years...

Around me, it wouldn't last three hours.

And cheese has, of course, become the modern shortcut. The board. The wedge. The baked brie doing its annual molten collapse.

Confession time: I'm not a big fan of baked brie.

Don't get me wrong: I'll eat it before I eat anything that's not cheese or bread. It's just too messy. Like a sloppy joe when I could have a hamburger. I'm just not into messy food, is all.

“Cheese balls are celebratory and fun, but sometimes the flavors feel a little outdated,” Erika Kubick, author of “Cheese Magic”, told me in a recent email.

Some flavors never go out of style. Like the myriad flavors of cheese.

(Her most popular recipe from her first cookbook, “Cheese, Sex, Death,” was the Everything Bagel Goat Cheese Ball, if you were wondering.)

And if you were wondering, yes, that parenthetical sentence is the real reason I'm sharing this.

There's more dissertation there before it gets into the recipes. Yes, recipes, plural. Now, I haven't made these. I'm probably never going to make these. There's no point to doing a cheese ball for oneself; not when one can simply add different toppings to one's sad, lonely cracker.

But maybe someone out there will like them.
December 16, 2025 at 9:56am
#1103757
I'm not saying I agree with this Nautilus article. Or disagree. I just find the temptation too great.

    We Owe It All to Figs
Our primate ancestors’ love of the complex fruit changed the world


See, the Forbidden Fruit in the Garden of Eden is commonly translated to English as "apple." But in the original Hebrew, from what I recall anyway, the word used is a more generic word for "fruit."

And since fig leaves were canonically right there, covering up all the fun parts, now I'm wondering if it should have been "fig."

When you bite into an apple, a pear, or a peach, you bite into the result of thousands of years of interactions between these fruits and primates.

Otherwise known as selective breeding, or proto-genetic-engineering.

When you let a fig squish in your mouth, you are savoring an even more ancient story.

I was way older than I should have been when I realized "ficus" was "fig." That's apropos of nothing, really, it just came to mind.

Before the fruit, in a beginning, all of the seeds that dangled from trees fell from those trees. These were gravity’s seeds.

This is some poetic shit right here, harking back to the supposed "beginning" with the forbidden fruit and all, and tying it to another mythical apple, the one that supposedly fell on a certain philosopher's head.

No idea if it's backed up by evolutionary research, but I'll acknowledge the poetic license.

Then, some plants evolved fruits. Fruits were a radical evolutionary innovation; they surrounded seeds and attracted animals in order that those animals might consume them and ingest their seeds. They called out, “Eat me.”

And so we get to the real reason I saved this article to share: "Eat me!"

They evolved in the tropics, around 60 million years ago, in the shadow of the extinction of the dinosaurs. Those first primates have been hypothesized to have consumed the fruits of trees as well as flowers, and then, also, insects attracted to fruits and flowers.

"Hypothesized" doesn't mean much of anything without evidence.

Over the succeeding tens of millions of years, some of the descendants of those first primates became more dependent on fruits. Meanwhile, many trees grew increasingly dependent on those fruit‑eating primates for dispersal of their seeds; this was a relationship of mutual benefit and dependency.

I'm pretty sure, however, that it wasn't just primates who were attracted to fruit and therefore helped with the fruit tree's reproductive strategy. I remember reading a bit about how avocados co-evolved with some sort of megafauna, which later became extinct and almost took the avocado with it.

Metaphorically, a forest can walk across a landscape inside the gut of a primate, traveling one defecation at a time.

You know, I appreciate biology and poetry, but some metaphors, I could do without.

There's a bunch more at the link, but I got a bit exhausted with all the metaphors and speculation. As I mentioned, I'm not saying it's wrong or right. And it is, after all, an ad for a book.

I just thought the comparison with the Eden story, and the Newton myth (one wonders if that was actually a fig, too, leading to the popular Fig Newton), was too good to pass up.

Also, "Eat me."
December 15, 2025 at 8:34am
#1103695
This PopSci piece is nearly a month old, which matters when you're talking about transient space phenomena. Still, I'm sure most people remember the subject.

    New NASA images confirm comet 3I/ATLAS is not aliens
The fast-moving comet likely comes from a solar system that is older than our own.


As I've been saying: It's not aliens.

Provisionally.

It's obviously "alien" in the sense that it comes from a whole 'nother part of space. Few would doubt that, and those few would be in the same category as young-earth creationists and flat-earthers: complete deniers of piles and piles of evidence.

The only "controversy" - mostly manufactured - was whether it was the product of alien sentience. The problem with any "sentient alien" hypothesis, though, is the same as the problem with Bigfoot: we can't prove Bigfoot doesn't exist; we can only continue to not find evidence of her existence.

During a press conference on November 19, NASA confirmed the icy rock poses no danger to Earth, and contrary to certain conspiracy theories, is not an alien spacecraft.

The "alien" people weren't necessarily spouting conspiracy theories, though. Just wishful thinking and projection. Any true conspiracy theorist would take one look at NASA's denial, and consider it proof that they're hiding something.

“It expanded people’s brains to think about how magical the universe could be,” said Dr. Tom Statler, lead scientist for solar system small bodies, during the livestream announcement.

I remember that, when I was a kid fascinated by astronomy, there was talk of comets or other visitors from other star systems. Much like with the detection of extrasolar planets, though, it was only recently that we actually confirmed their existence.

The universe is strange enough on its own, but people insist on trying to make it even stranger. I actually kind of love that about people. It's only when they take it too far and replace reality with one of their own that I get disgusted.

There's more at the link, including actual images from actual telescopes, but fair warning: the images aren't exactly breathtaking, not like the famous pictures of nebulas and such from Webb or Hubble.

Mostly I just wanted to reiterate that it's not aliens.

It's still pretty cool, though.
December 14, 2025 at 8:32am
#1103632
Here's a rare occasion when I talk about sex, thanks to Nautilus.

    How Monogamous Are Humans Actually?
How we rank among species on fidelity to a single partner may have shaped our evolution


And already I have issues.

The headline may seem neutral enough, but then you get to the subhead, and it uses "fidelity" as a synonym, which conveys an implicit bias due to the positive connotations of "fidelity." And then you get to the evolution part, and wonder about direction of causality: did sexual practices shape our evolution (other than in the obvious sense of enabling evolution to continue), or were our sexual practices shaped by evolution? Or some synergy between them?

Well, substitute "I" for "you" in that paragraph. You know what I mean.

And let's not undersell the worst implicit assumption there: the primacy of heterosexuality.

Across cultures and millennia, humans have embraced a diversity of sexual and marital arrangements—for instance, around 85 percent of human societies in the anthropological record have allowed men to have more than one wife.

"Allowed?"

And yes, it's almost never the other way around.

Still, remember what Oscar Wilde said: "Bigamy is having one wife too many. Monogamy is the same."

Anyway. If that 85% figure is correct, and I have no facts to contradict it, then we should be considering polygamy—not monogamy, not polyandry, not any other mutually agreed-upon relationship—to be the default for humans.

The problem with polygamy as a cultural norm, though, apart from Wilde's quip, is math. The proportions just don't work out, unless you send a lot of your young men off to die in war. Which, of course, a lot of cultures did. Or unless you also accept polyandry, which, given the patriarchal, hierarchical nature of most societies, ain't gonna happen.

I'd be remiss if I didn't note that the -gamy in these words technically just means "marriage"; the precise terms for "multiple wives" and "multiple husbands" are polygyny and polyandry. In practice, though, "polygamy" almost always gets used to mean multiple wives, which reinforces the male-primacy point of view. Words change meaning over time, and for the sake of convenience, just assume that whenever I use a word like that, I'm referring to sexual partners of any gender.

But in the broader evolutionary picture, some researchers have argued that monogamy played a dominant role in Homo sapiens’ evolution, enabling greater social cooperation.

"Some researchers." Right. It couldn't be the ones with an agenda to push, could it?

This theory aligns with research on mammals, birds, and insects, which hints that cooperative breeding systems—where offspring receive care not just from parents, but from other group members—are more prevalent among monogamous species.

I'm not sure we should be labeling it a "theory" just yet.

To decipher how monogamous humans actually have been over our evolutionary history, and compare our reproductive habits to other species, University of Cambridge evolutionary anthropologist Mark Dyble collected genetic and ethnographic data from a total of 103 human societies around the world going back 7,000 years. He then compared this against genetic data from 34 non-human mammal species. With this information, Dyble traced the proportion of full versus half siblings throughout history and across all 35 species—after all, higher levels of monogamy are linked with more full siblings, while the opposite is true in more polygamous or promiscuous contexts.
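To make that full-versus-half-sibling statistic concrete, here's a toy simulation; it's my own sketch, not Dyble's actual method or data, and every number in it is invented. The point is just that the more often successive kids share a father, the larger the share of sibling pairs who are full siblings, which is the signal being read off the genetics.

    # Toy Monte Carlo: fraction of sibling pairs that are full siblings,
    # as a function of how likely each successive child is to have the
    # same father as the previous one. All parameters are made up.
    import random
    from itertools import combinations

    def full_sibling_share(p_same_father, n_mothers=2000, kids_per_mother=4):
        full = half = 0
        father_id = 0
        for _ in range(n_mothers):
            fathers = []
            for i in range(kids_per_mother):
                if i == 0 or random.random() > p_same_father:
                    father_id += 1  # a new father enters the picture
                fathers.append(father_id)
            for a, b in combinations(fathers, 2):
                if a == b:
                    full += 1
                else:
                    half += 1
        return full / (full + half)

    random.seed(42)
    for p in (1.0, 0.8, 0.5, 0.2):
        print(f"P(same father) = {p:.1f} -> full-sibling share = {full_sibling_share(p):.2f}")

Which, note, only counts the kids who actually got born.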

I have many questions about this methodology, not the least of which is this: humans don't have sex for procreation. We have it for recreation. Procreation is a byproduct, not a purpose, most (not all) of the time. A study like that, concentrating on births, completely ignores the purely social aspect of multiple partners.

As support for my assertion there, I present the bonobos, our closest primate relatives, who engage in recreational sex all the time.

Most species don't seem to use sex for recreation, though I'm hardly an expert in that regard. This makes humans (and some other apes) different from the birds and the bees, and using other animals as models for the ideal human behavior is a good example of the naturalistic fallacy.

Point is, I submit that using live births as the sole indicator of degree of polygamy is just plain wrong, and will lead to incorrect conclusions.

“There is a premier league of monogamy, in which humans sit comfortably, while the vast majority of other mammals take a far more promiscuous approach to mating,” Dyble said in a statement, comparing the rankings to those of a professional soccer league in England.

And with that, he betrays his bias.

This doesn't mean that the study doesn't have merit, mind you. It might be useful for drawing other conclusions. I just don't think it means what he, and the article, claim it means.

Meanwhile, our primate relatives mostly sit near the bottom of the list, including several species of macaque monkeys and the common chimpanzee.

Heh heh she said macaque.

Let's not forget another important thing: a species will almost always have the reproductive strategy that works for that species. It could be pair-bonding. It could be complete promiscuity. It could be something in between. Whatever works for that niche.

Personally, I'd hypothesize that humans fall somewhere in the middle for the simple reason that we live for drama, and what's a better source of drama than who's doinking who? Hell, it's the basis for at least half of our mythology, and all of our soap operas.

There's more at the article, obviously. I just want to close with this:

Unlike other humans, I don't care who's doinking who. The only thing I care about is that everyone involved be consenting or, better yet, eager. You want to be monogamous? Find another monogamist. Poly? Find other polys. Single? Get some good hand lotion. I. Don't. Care.

But if you agree to a particular lifestyle, whatever that may be, and then you go behind your partner's or partners' backs? That's what I consider wrong. Not, like, on the same level as murder or theft or wearing socks with sandals, but still, it's something to be ashamed of.


© Copyright 2026 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.
