About This Author
Come closer.
Carrion Luggage
![Traveling Vulture blog header image](/main/images/action/display/ver/1741870325/item_id/2336297.jpg)
Native to the Americas, the turkey vulture (Cathartes aura) travels widely in search of sustenance. While usually foraging alone, it relies on other individuals of its species for companionship and mutual protection. Sometimes misunderstood, sometimes feared, sometimes shunned, it nevertheless performs an important role in the ecosystem.
This scavenger bird is a marvel of efficiency. Rather than expend energy flapping its wings, it instead locates uplifting columns of air, and spirals within them in order to glide to greater heights. This behavior has been mistaken for opportunism, interpreted as if it is circling doomed terrestrial animals destined to be its next meal. In truth, the vulture takes advantage of these thermals to gain the altitude needed to glide longer distances, flying not out of necessity, but for the joy of it.
It also avoids the exertion necessary to capture live prey, preferring instead to feast upon that which is already dead. In this behavior, it resembles many humans.
It is not what most of us would consider to be a pretty bird. While its habits are often off-putting, or even disgusting, to members of more fastidious species, the turkey vulture helps to keep the environment from being clogged with detritus. Hence its Latin binomial, which translates to English as "golden purifier."
I rarely know where the winds will take me next, or what I might find there. The journey is the destination.
---
We haven't had one of these for a little while: a language-related listicle from Mental Floss.
Well, now next time someone invites me somewhere, I can say I have extenuating errands.
The English language is certainly bizarre in the best way.
For instance, there are values of "best" I wasn't aware of until just now.
Some of it is totally run-of-the-mill, and some of it is full of words that only seem to appear in one extremely specific situation.
There's another listicle somewhere that explains "run-of-the-mill." Maybe I already featured it. Maybe it's coming. Maybe I forgot to save it. I don't recall. It's really not hard to guess at, but I think it's good for writers to think about these things.
So let’s take a little stroll through eight words that only show up in one weirdly specific context.
Including the word "weird" in the headline (and to a lesser extent, here) is way too close to clickbait. But have you ever wondered why "weird" is so weird? I mean, it doesn't even follow the well-known "I before E except after C" spelling rule.
Anyway, I'm not going to cover all of them.
Inclement (Weather)
If you’ve ever heard the word “inclement” outside of local news broadcasts, please step forward, because we know you’re lying.
This is, of course, our clue to use it to describe something other than weather. The situation at the office, maybe, or a police raid.
What the article doesn't note, but I will, is that this is one of those Latin-root words whose cousins appear every now and then. The obvious example is the name "Clementine," or the citrus fruit that has that name. But my dictionary source says "clement" can describe someone's demeanor (synonym: merciful), so why can't "inclement?"
Diametrically (Opposed)
Diametrically has one job: heighten drama. No one is ever diametrically aligned, and no one is diametrically friends.
That's because the meaning is something like "directly and completely," and it doesn't hurt that the word contains many of the same sounds. This isn't a case like "literally," which is often used to mean "figuratively or metaphorically" and also to heighten drama. "Figuratively" and "metaphorically" have definitions that should be diametrically opposed to that of "literally."
Bode (Well/Ill)
Bode is a free agent in theory, but let’s be honest: you’ve only ever seen it next to “well” or “ill.”
Again, an opportunity to get this one to stretch a bit.
Hermetically (Sealed)
Now, this word is a little “underground,” if you will. Hermetically sealed sounds like something out of a sci-fi lab, but it mostly refers to food packaging and those little foil seals you peel off with your teeth, even though you’re not supposed to.
This one's a little trickier. It doesn't mean what I thought it meant for most of my life. I thought it was related to mercury (the element, not the planet or the god), by extension from Mercury to Hermes. I thought it had to do with how liquid mercury could form seals. In my defense, I wasn't too far off, but it referred to an entirely different god: Thoth, the Egyptian god of knowledge who, when the Greeks took over Egypt, became identified with Hermes. How that led to things being described as hermetically sealed is interesting, but beyond today's scope.
If that's a little confusing, don't worry. I was confused, too. The link in the article only goes to something that explains what a hermetic seal does, without going into the word origin.
Pyrrhic (Victory)
Of all the words here, I think this one makes the most sense to be paired with only one other word. Again, it's of ancient origin, and it's derived from a person's name: Pyrrhus, a Greek king of Epirus who fought the Romans. In one particular battle, his forces won, but only at great cost. It became the Platonic ideal (Platonic, of course, being another word derived from a name, but which can pair with several other words besides "ideal") of a victory that only comes with tremendous losses.
Contiguous (United States)
Contiguous technically means “touching,” but 99% of its appearances involve either a map or the phrase “excluding Alaska and Hawaii.”
Yeah, but that other 1% exists (though I think the percentages here are pulled from thin air), and those other uses are mostly technical jargon.
English is full of these little linguistic oddities. Some may be outdated, sure, but one thing remains true: We sound incredibly smart when we use them the right way!
I think we can sound even smarter if we come up with new ways to use them, perhaps even all of them in one contiguous sentence. Which I'm entirely too lazy to do right now, so I'll settle for just the one.
---
Here's one of those reports that seems like it should be fake, but in this case, probably isn't. From ScienceNews:
Though, apart from the whole "never-before-seen" thing, I'm not sure why a predator catching prey is such a big deal. Cats catch birds in flight. Frogs catch flies in flight. I guess there's some poetry because "rat" and "bat" rhyme in English.
The observation happened by chance, says Florian Gloza-Rausch, a biologist at the Museum of Natural History in Berlin. He and colleagues had been studying a colony of 30,000 bats overwintering in a cave about 60 kilometers north of Hamburg.
I suppose if the bats in question were endangered, there'd be an issue, but that does not appear to be the case.
Brown rats (Rattus norvegicus) figured out how to get inside the kiosk and climb up to the bats’ landing platform at the entrance, using a curtain the researchers placed inside the kiosk for filming purposes.
Rats are scary smart. More importantly, they adapt to what we do.
Out of 30 filmed predation attempts, 13 were successful. The attacks happened in complete darkness, so the researchers suspect that the rats sensed the bats with their whiskers.
"Rat-sense. Tingling." Seriously, though, I'd love to see a follow-up study to determine if the rats get better at it over time. And also a follow-up to see if it is their whiskers, or if rats have a heretofore unknown echolocation sense like the bats have. Unlikely, as rats are probably the most-studied animals in the world. Still, nature surprises us all the time, as this article demonstrates.
Look, I have nothing against bats (or rats); predation is just part of nature. However, a lot of the rat population is the result of human activity, so maybe this happens because of us, collectively? I don't know. It's kind of like how some people insist cats should be kept indoors to protect birds. As if cats were the invasive species, and not us.
A colony of just 15 brown rats could reduce the cave’s population of 30,000 bats by 7 percent each winter, Gloza-Rausch and colleagues estimate.
I imagine you never have a colony of "just" 15 rats. At least not for very long. And, okay, the researchers would know better than I do what the conservation issues might be.
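Doing the math on that estimate, for scale (my arithmetic, not the researchers'):

```latex
% 7% of a 30,000-bat colony, split among 15 rats:
\[
0.07 \times 30{,}000 = 2{,}100 \text{ bats per winter}
\qquad
\frac{2{,}100}{15} = 140 \text{ bats per rat}
\]
```

That's a lot of flying dinners per rodent, which maybe explains why the researchers consider it a conservation concern.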
Bats are, of course, just as important to the ecosystem as rats, however maligned both critters may be. Perhaps not as majestic as the turkey vulture, but they are cuter.
---
This one, from Atlas Obscura, has been languishing in my pile for a very long time. Probably not since last July, when it was published, but it's been a while. So whatever reason I had for saving it, I don't remember what it was, other than my general appreciation for Brutalism, a much-maligned architectural style.
Now, a couple more disclaimer-type things: First, the article is a podcast transcript. I don't listen to podcasts. I'd rather read text. But if you're the other way around, I think there are links in the article. And second, my appreciation for Brutalism is a direct result of me being a function-over-form engineer who has worked in the concrete industry, not because I have a developed sense of aesthetics. Still, my favorite architecture is both functional and pretty, though of course, "pretty" is subjective.
Diana Hubbell: Whether or not you saw the movie The Brutalist, you’ve probably heard a lot about it.
This is literally the only place I've seen, or heard, anything about that movie.
In the film, Brutalist architecture serves as a metaphor for resilience and transformation.
I also appreciate metaphor.
Viewers of Brutalist architecture over the years have accused it of being drab and utilitarian. They’ve said these hulking concrete buildings looked more like fortresses. More than a few have accused them of being ugly. And while I can kind of see their point, there’s something powerful about these buildings when you consider them in the context they were made.
As I said, it's subjective. Thing is, there are a lot of ugly buildings around (there's one close to me which actually got the nickname "Big Ugly," and it's your classic colonial Virginia brick-with-white-trim, not Brutalist). We have a Brutalist building downtown, though. Used to house a spy agency. The blacked-out windows were uglier than the concrete framing them.
A couple years ago, I visited the Rothko Chapel in Houston, Texas, and it challenged my idea of what a church could be.
It's a building. It can be lots of things. We used to have a Catholic church around here with a very interesting design, including a rounded triangular shape (apparently representing the Trinity) and a reverse steeple. Yep, the ceiling dipped down in the middle, I guess to emphasize that God is coming "down" to Earth instead of people reaching "up" to God. Or something. I'm better at metaphor when it's written.
What was I doing in a Catholic church? I've been in lots of churches. Occupational hazard of having been a wedding photographer.
In true Brutalist fashion, it bears more resemblance to a bunker than a Gothic cathedral. The stark exterior is an irregular octagon done up in rose stucco.
I'm sure there was a religious reason for all of that, too. I just can't figure what it might have been.
It seems appropriate to me that the Rothko Chapel isn’t really a church in the traditional sense. Although the de Menils who commissioned it were devout Catholics, this place is non-denominational. Great art has the power to move anyone, regardless of their faith.
I think this may have been the bit that made me save this article. I find that last sentence to be true, at least for me. Once I see a great work of architecture or art, or hear music, it doesn't matter whether it had a religious, or spiritual, purpose to it; I just appreciate the artistry.
A handful of years after the Rothko Chapel was completed in the ’70s, another Brutalist structure was being built, this time on the other side of the world. Roxanne Hoorn brings us that story.
I mean, sure, another continent, but hardly the other side of the world.
Roxanne Hoorn: Shrouded by the forest and perched on a sloping hillside in northwest Bosnia and Herzegovina sits a massive marble structure. Split down the middle, its two towering concave walls reaching as high as a basketball court is long. They curve inward, reaching for one another, shadowing the two-story atrium between them.
One of my failings as a writer is that it's hard for me to do descriptions like this. So I appreciate them when I see them.
Still, I'm a little unclear on how marble fits into the Brutalist category.
There's more at the link, including the history behind this second structure. It makes me want to see the things, which I suppose is the whole point of AO (I used that site to help guide my European visit a couple years back). But mostly, I just wanted to cast (pun intended) another vote in favor of Brutalism.
---
This Smithsonian article was probably timed for Presidents' Day, but for some reason it seems appropriate enough that it came up at random for me today. Maybe it has something to do with reminding us what US presidents used to be like.
From Abraham Lincoln’s patent to James A. Garfield’s geometry proof, learn how these 19th- and 20th-century commanders in chief shaped their legacies beyond politics
It would also have been impossible for me not to save a link with "teenage diplomat" in the headline, thanks to Bruce Springsteen's Blinded By The Light: "Madman drummers bummers and Indians in the summer with a teenage diplomat..."
I gave up on trying to understand those lyrics years ago, but it remains one of my favorite songs. Not the Manfred Mann version. The original Springsteen. But I doubt he was speaking of the president in question.
But I digress. We were talking about former presidents.
In 1876, when James A. Garfield was serving his seventh term in Congress, he devised an original proof for the Pythagorean theorem. A classics scholar who’d taught math, history, philosophy, Greek, Latin and rhetoric at an Ohio college, the 20th president was also a preacher, a Union major general during the Civil War and a lawyer.
Elitist! Out of touch with the common citizen!
John Quincy Adams was a teenage diplomat and polyglot.
Well, that settles the headline, if not the song.
As president in the 1820s, Adams was an early, vocal proponent of astronomy, mocked when he advocated for America to build “lighthouses of the skies” that would rival European observatories.
I can appreciate that. I'd be remiss if I didn't point out that Jefferson (for all his well-known faults) was also a fan of astronomy. Lore at UVA is that he'd originally designed the Rotunda for use as an early version of a planetarium.
William Henry Harrison was the only U.S. president to attend medical school.
And yet...
William Henry Harrison had the shortest tenure of any American president, dying just 31 days after he delivered a two-hour long inaugural address in the rain, without wearing a coat or a hat.
In fairness, as the article points out, Harrison didn't actually finish medical school.
Abraham Lincoln is the only president to hold a patent.
It's unfortunate that he couldn't patent his beard.
Between 1858 and 1860, Abraham Lincoln delivered multiple lectures across Illinois on the vital importance of “discoveries and inventions” to the progress of mankind. Yet he never told audiences that he was responsible for one such innovation: U.S. Patent No. 6,469, a device for “buoying vessels over shoals.” Lincoln is the only American president to hold a patent.
Fun fact: my father held a patent. It was also ocean-related. My father's excuse was that he was a sailor, not a lawyer.
James A. Garfield devised a proof for the Pythagorean theorem.
This bit just expands on what they said in the lede. I'm including it again because I like math.
Garfield’s Pythagorean proof offers just a glimpse into the brilliance of America’s 20th president, who was shot by a disgruntled lawyer in July 1881, just four months into his term.
I'm beginning to see a pattern here: the smart ones dying too early.
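Since the article's lede mentions the proof but I like math, here's a sketch of Garfield's trapezoid argument. Take a right triangle with legs a and b and hypotenuse c. Two copies of it, plus an isosceles right triangle with legs c, tile a trapezoid with parallel sides a and b and height a + b. Computing the trapezoid's area both ways:

```latex
% Trapezoid area two ways: the standard formula, and the three triangles.
\[
\tfrac{1}{2}(a+b)(a+b) \;=\; \tfrac{1}{2}ab + \tfrac{1}{2}ab + \tfrac{1}{2}c^2
\]
% Multiply by 2, expand, and cancel the 2ab terms:
\[
a^2 + 2ab + b^2 = 2ab + c^2 \quad\Longrightarrow\quad a^2 + b^2 = c^2
\]
```

Not bad for a sitting congressman.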
Herbert Hoover and his wife were giants of mining engineering.
Yeah, and he's the one with a dam named after him. But the big deal here is the "wife" bit. As the article notes, his wife "was the first woman to graduate from Stanford with a geology degree," though sexism kept her from finding a job in the field.
In the early 1900s, the Hoovers, who were then living in London, learned that no one had yet published an English translation of De Re Metallica, a seminal 16th-century mining text.
...and nothing else matters.
Yes, I can quote Springsteen and Metallica in one blog post.
Jimmy Carter was a pioneering nuclear engineer and Renaissance man.
I think most people knew that, regardless of their opinion of him as President. And this one almost makes up for the other smart ones dying too early.
For me, though, his greatest accomplishment was signing legislation that permitted homebrewing of beer, which led to an explosion of craft breweries, which brought us real alternatives to mass-produced, rice-adjunct swill. It also brought us an unfortunate deluge of IPAs, but there's nothing so good that there isn't some bad in it.
Anyway, there's a few I skipped, and there's a lot more detail at the link.
February 28, 2026 at 8:35am
---
No, I don't understand it, either, but I saved this article from Quanta anyway.
Decoherence? More like incoherence, amirite?
None of the leading interpretations of quantum theory are very convincing.
This much, I think I do understand: quantum theory works. The math is sound. Experiments match predictions (mostly). What's up in the air is what it all means (interpretation) and why the math works so well.
But the whole thing is so alien to our everyday perceptions that I doubt any interpretation would be convincing, at least to anyone who prides themselves on "common sense."
They ask us to believe, for example, that the world we experience is fundamentally divided from the subatomic realm it’s built from. Or that there is a wild proliferation of parallel universes, or that a mysterious process causes quantumness to spontaneously collapse.
Okay, no, they don't "ask us to believe." Religion peddlers ask us to believe. Politicians ask us to believe. Scientists do research and try to explain the results.
This unsatisfying state was a key element of Beyond Weird, my 2018 book on the meaning of quantum mechanics.
Yes, I'm okay with book plugs in here. Usually.
But after reading Decoherence and Quantum Darwinism, a book published in March 2025 by the physicist Wojciech Zurek, I’m excited by the possibility of an answer that does away with all those fanciful notions.
The title alone makes me skeptical.
See, Darwin was absolutely groundbreaking. Completely turned "common sense" on its head. His work laid the foundation for pretty much all of modern biology, even though science has since updated and refined pretty much every detail (from what I understand, anyway). But invoking Darwin is one of the things evolution deniers love to do to make science and those who follow it look bad. Consequently, every time I see "Darwinism" out of context, I cringe.
If you know something about quantum mechanics...
Which you don't. Neither do I.
...you can be forgiven for thinking that the big, strange deal is the quantum part: the idea that the world at the finest scales is grainy, that particles can only change their energy in abrupt quantum jumps by exchanging little packets of energy with fixed sizes.
"Strange" isn't just a type of quark; it's a matter of perspective. To the quantum world, our more deterministic, macroscopic existence is what's strange.
Ultimately, the arguments over quantum mechanics have much bigger stakes: what reality is.
And everyone's selling their own version of what reality is.
Quantum uncertainty, the physicist and philosopher Jeffrey Bub of the University of Maryland told me, “doesn’t simply represent ignorance about what is the case, [but] a new sort of ignorance about something that doesn’t yet have a truth value, something that simply isn’t one way or the other before we measure.”
We're conditioned to accept that everything is either true, or it's false. This is false. There are other possibilities: A statement can be neither true nor false. It can be both true and false. It can be obviously false and yet hold truths, like fiction writing. It can be non-determinable, as in quantum theory. And it can be paradoxical, like the famous "This sentence is false" and its equivalents. (There are probably other truth values, but those are the ones I thought of off the top of my limited mind.)
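Incidentally, that "neither true nor false" case has a formal treatment: three-valued (Kleene) logic, where indeterminacy propagates through NOT, AND, and OR unless one side settles the matter on its own. A minimal sketch in Python, using None as the "no truth value yet" state (the function names are my own invention, not from any library):

```python
def k_not(p):
    """NOT: an indeterminate input stays indeterminate."""
    return None if p is None else not p

def k_and(p, q):
    """AND: a definite False wins outright; otherwise indeterminacy propagates."""
    if p is False or q is False:
        return False
    if p is None or q is None:
        return None
    return True

def k_or(p, q):
    """OR: a definite True wins outright; otherwise indeterminacy propagates."""
    if p is True or q is True:
        return True
    if p is None or q is None:
        return None
    return False
```

Note that k_and(False, None) is still False: if one side is already false, the unmeasured side can't rescue the conjunction. That asymmetry is what distinguishes "not yet determined" from plain ignorance.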
Which is to say that maybe the problem lies not in the quantum world, but in our limited minds.
But why should there be two distinct types of physics — classical and quantum — for big and small things? And where and how does one take over from the other?
Those are the main questions addressed in the article, so I'm quoting them.
The central element of Zurek’s approach is the phenomenon called quantum entanglement, another of the nonintuitive things that happen at quantum scales.
Again, though, I think it's a perspective thing. Entanglement seems weird to us (and there's a lot of misleading speculation about it out there, which one has to be wary of), but it's perfectly normal in the quantum realm.
The molecules in an apple are described by quantum mechanics, and photons of light bouncing off the surface molecules get entangled with them. Those photons carry information about the molecules to your eyes — say, about the redness of the apple’s skin, which stems from the quantum energy states of the molecules that constitute it.
This, for example. I'm not saying it's wrong—the author knows a hell of a lot more than I do—but I'm not saying it's right, either.
There's a lot more explanation at the link, by the way, and don't worry; there's no math involved.
Zurek’s theory of quantum Darwinism — which, again, uses nothing more than the standard equations of quantum mechanics applied to the interaction of the quantum system and its environment — makes predictions that are now being tested experimentally.
This, though... this is the important part. Maybe those experiments will support the theory. Maybe they'll falsify it. Maybe they'll leave it in an indeterminate state. But the important thing is that it's testable at all, which the highly popular (as in, fiction writers use it a lot) many-worlds interpretation isn't. At least not at our current level of understanding. Well, not "our" current level of understanding, which is pretty much limited to what celebrities are getting up to in their private lives, but scientists' current level of understanding.
This leads us to another revelation of decoherence theory, the one that persuades me that Zurek’s theory now tells a complete story. It predicts that all the imprints must be identical. Thus, quantum Darwinism insists that a unique classical world can and must emerge from quantum probabilities. This imposition of consensus obviates the rather mysterious and ad hoc process of collapse, in favor of something more rigorous.
I find myself wondering if this could also put to bed, finally, the question of "why is there something rather than nothing," which has always stuck in my craw like a sideways pretzel rod.
Now, if you didn't follow any of that, that's okay. I might blog about food again tomorrow, or rant about misinformation, or just tell jokes. But the article itself explains things rather well, I think.
February 27, 2026 at 10:23am
---
We're back to food today. Vegans might want to skip this one. From allrecipes:
I figured it's because food always tastes better when someone else is cooking and serving it.
We’ll let others settle the debate on whether smash burgers or steakhouse burgers reign supreme.
That debate will never be settled, unlike the one about the superiority of New York style pizza over every other style pizza.
“Restaurant burgers usually taste better because we obsess over the details most home cooks don’t consider,” admits Jeff Martin, chef and partner at Park Cafe, Germantown Cafe and Karrington Rowe.
That's not an "admits." That's a humblebrag. "Admits" would be something like "Yeah, the secret is we hawk a big ol' loogie into every burger."
“The biggest key is restraint. Restaurants don’t necessarily do more; we do less, but better.”
"We cheap out but we know what we're doing. Usually."
“If home cooks focus on the fundamentals—few, high-quality ingredients, heat, seasoning, and confidence to let the burger cook without fussing—their burgers would immediately get better. When you stack a few small advantages together, the difference becomes obvious in the first bite,” Martin tells us.
I'm all about the "few ingredients" thing. The less I have to do, the better. Know how I make a burger at home? Beef and premade hamburger seasoning. That's it. That's all. I mean, there's the cooking it part and the building it part and the eating it part, but the patty itself? Minimalist.
That may not work for everybody, but I'm not cooking for everybody; I'm cooking for me. Besides, no one else wants their hamburgers on an English muffin like I do.
You need not seek out something as exclusive as coveted Kobe (a premium type of Japanese Wagyu), asserts Aram Mardigian. However, all five chefs agreed you should reach for freshly-ground, high-quality 80/20 ground beef, because “fat equals flavor and juiciness,” Martin notes.
Yeah, see, here's the thing: I do not like meat fats. A little is okay (and probably necessary), but beef tallow, lard, whatever? No, not for me. Ideally, I get 95/5 ground beef. I'll put up with 90/10 (it's a lot cheaper). If I have to use 80/20, those are going to be some well-done burgers so all the fat renders away. I want my beef leaning so hard it falls over.
No need to fret if you only have a less expensive or leaner style of ground beef, assures Diego Chaparro. If that’s the case, Chaparro has a genius tip: He recommends grating a few tablespoons of frozen butter into the ground beef, which adds both fat and flavor.
That's... well, maybe it is a "genius tip," but I've been assured by people in the restaurant industry that butter is their actual dirty little secret: they add that to everything to make it taste better.
If your ground beef is frozen, transfer it to your fridge 24 hours in advance. Then 30 minutes prior to when you plan to start cooking, move your meat from the fridge to a counter and allow it to sit at room temperature.
You want me to *snicker* plan 24 hours in advance? No. If I have frozen patties, they come out of the freezer when I start thinking "You know, I could really go for a burger right now." Then they get dipped in lukewarm water for about 20-30 minutes. This thaws them right up.
Now, I'm pretty sure you can find tips like "Whatever you do, don't thaw your ground beef in lukewarm water because it will KILL you." Some things are worth the risk, and my convenience, and delicious hamburger, are two of them.
If you ask Mardigian, “the biggest mistake made by home cooks, and restaurant cooks for that matter, is a lack of seasoning.”
Didn't someone just say it was overseasoning? I can't keep up. I believe what they're saying about salt in the article, but to a large extent, seasoning is a matter of personal taste.
Handle the beef just enough to form a round patty of your desired weight and thickness, then you’re ready to cook.
I can admit that this is the actual line that made me save this article. I can no more resist a double entendre than I could resist buying hamantaschen at the bakery this morning.
All six chefs agreed that high heat is essential for that satisfying sear, no matter if you’re whipping up a thick patty or a smash burger. Not only will this form a tempting crust, but it will also help reduce the risk that the burger will stick to the surface, Mardigian says.
I do not like the cleanup involved in frying burgers. Splatter everywhere, and then you gotta scrub the pan. No, thanks. If I don't want to grill, I'll use a broiler. Those have high heat, and the drippings collect on something disposable like aluminum foil (sorry, Earth, my convenience is more important than your resources). This has another advantage, illustrated by the article's next section:
Don’t fuss with or press the patties.
Yeah. Just leave them alone. That's easier when they're in a broiler.
Now, one thing the article doesn't address: if you have a perfectly round, even patty, when you cook it and don't smush it, it tends to turn into a UFO-looking thing, way thicker in the middle than around the edge. This also screws up the cooking. The solution to that, though, is dead simple: don't make an evenly thick patty from the raw beef; instead, press your thumb into the middle of the thing. That's it. That's all. (Though depending on the size of your fingers, you may need to use more than just your thumb.)
Also, don't forget shrinkage. Your raw patty should be way bigger than the bun (or in my case, English muffin) within which it is destined to reside. It'll then cook down to something close to the right size, kind of like how five cups of raw spinach cooks down to one teaspoon of cooked spinach.
Be thoughtful with the toppings, condiments, and buns.
This should not need to be said. But, again... personal taste.
Once your burger has reached your desired temperature (measure this with an instant-read thermometer for quick and accurate confirmation that you’re on target), allow it to rest for 5 minutes as you assemble the supporting cast.
I absolutely agree with the resting time (during which the burger is actually still cooking and flavors are merging), but one shouldn't need a thermometer once one knows one's equipment.
Practice, practice, practice.
Yes, now I'm going to have to make more burgers. You know. For science.
February 26, 2026 at 10:50am
---
Admittedly, sometimes I save a link just for the opportunity for snark. This Southern Living article is one of those.
Is one of them "housekeeper"? Because who can afford a housekeeper nowadays?
We all have habits that we learned from our parents that we may never have second-guessed. And most of those habits are probably healthy ones, but when our parents were growing up, standards were different.
More true for some of us than for others.
A generation later, we just know better on certain things—plus, we have so many more products and tools at our fingertips.
Translation: This is an ad for cleaning products and tools.
Old Habit: Using The Same Sponge For Everything
To avoid cross-contamination, designate one sponge for each purpose. Disinfecting them in the dishwasher, or putting them in the microwave for two minutes, will kill a lot of the germs, but not all, so you also want to replace them often.
This message has been brought to you by The Sponge Company. The Sponge Company. The Sponge Company: Be sure you have enough sponges! Fear the germs!
Old Habit: Neglecting To Clean Your Cleaning Tools
Just like a dirty sponge may actually make your “clean” dishes dirtier, the same goes for all household cleaning tools.
Okay, to be fair, this one doesn't suggest buying new cleaning tools every week. They must have not gotten paid by The Mop Company.
Old Habit: Using Too Much Laundry Detergent
Oh, cool, this one's actually suggesting to use less laundry detergent, not more! Right?
Take the time to actually read the packaging on your detergent to see how much the manufacturer recommends using per load. Chances are, it’s much less than you’d think.
Nope. "Chances are" they still recommend too much, so you buy it more often. "Less than you'd think," maybe, but I'm guessing that's because modern laundry detergents are more concentrated, and if you're used to 1970s detergent measurements, yeah, you're probably using too much.
It's like those toothpaste commercials that gleefully demonstrate the use of their toothpaste by squeezing out a long, curly glob that fills up the entire toothbrush. No, you don't need to do that. But if you do, you'll buy more toothpaste sooner.
Speaking of toothbrushes (yeah, I'm skipping a few):
Old Habit: Not Disinfecting Toothbrushes
Growing up, the concept of cleaning a toothbrush was foreign. We just used them twice a day until the bristles were completely splayed out before finally replacing them.
I don't remember how often we replaced teethbreesh when I was a kid. Knowing my parents' well-earned frugality, it wasn't very often.
Rinse your toothbrush well after each use, and make sure you’re storing it in a place that will let it air-dry.
That's easy to say, and do, if you don't have cats.
Old Habit: Using Paper Towels To Clean Everything
The world is a lot more environmentally conscious than it was a couple decades ago, and one old-school habit we know should be broken is using disposable paper towels to clean, well, everything.
Oh, no, we should use sponges instead. You know, sponges made of some kind of plastic rather than paper towels made with vegetable matter. Sponges that, per the above, are always covered in lurking germs just waiting to pounce.
Nah. I'm using paper towels to clean everything. It's more sanitary. It's not that I don't care about the environment; it's that my contribution means squattly-dick in the face of corporate shenanigans.
Old Habit: Leaving Dusting Until Last
You may have seen your parents dusting off furniture, ceiling fans, ledges, moldings, and the contents of the curio cabinet as the final step of cleaning a room—after the tidying and vacuuming was done.
I'm not psychologically able to use "dust" as a verb. It's one of those contronyms, anyway: "to dust" can mean "to remove dust from" or "to sprinkle dust upon." And I don't have the mental energy to be careful with context every damn time.
Okay, so, that article wasn't nearly as much of an ad as I'd feared. Still, you gotta look out for these things. There's more at the link, because I can reluctantly admit that there may be some good tips in there. |
February 25, 2026 at 9:04am
| |
Today's link is from The Guardian, and it's long. Just a warning for short attention spans. At least it's text and not video.
Though I hate that the headline leans into the "apocalypse" nonsense, and with a terrible pun no less, I wanted to know what they had to say. (Yes, I used a terrible pun for this entry title, too. I never said I wasn't a hypocrite.)
Thanks to technological advances, we are entering a new age of discovery in the field of ancient history. Improved DNA analysis, advances in plant and climate science, soil and isotope chemistry, linguistics and other techniques such as a laser mapping technology called Lidar, are overturning long-held beliefs. Nowhere is this more true than when it comes to Maya archaeology.
In other words, we move forward to look backward, which gives us tools to move forward.
That is, assuming you view time as a linear thing with the future ahead of us. I did a whole entry on that a little while ago ("Fun Times" ), and the irony is that it doesn't appear that the Maya civilization had a linear view of time. You know, what with their whole famous calendar thing, which I'll get to in a little bit.
When Estrada-Belli first came to Tikal as a child, the best estimate for the classic-era (AD600-900) population of the surrounding Maya lowlands – encompassing present day southern Mexico, Belize and northern Guatemala – would have been about 2 million people. Today, his team believes that the region was home to up to 16 million. That is more than five times the area’s current population.
That's a hell of a difference. By way of comparison, that's almost twice the population of New York City. A bit more spread out, though. Think of it more as: almost twice the population of Virginia.
This is how science works. And that's a good thing.
Some Maya cities were established hundreds of years before the founding of Rome, and they included significantly larger architecture that still stands. Both cultures developed sophisticated astronomy, mathematics, writing and agriculture, as well as elaborate trade arrangements across vast cosmopolitan lands.
I'm still not convinced that Rome was all that great at math. Or maths, as The Guardian would spell it.
Outsiders’ power over the story of the Maya is written into the people’s very name. After their arrival in the early 1500s, the Spanish named local populations “Maya” after the ruined city of Mayapán in present day Mexico. Yet the Maya never saw themselves as one people and were never governed under one empire. They spoke many languages – 30 of which are still around – and belong to an intricate mix of cultures and identities.
This is, I think, similar to how they used to think of North American natives as "Indians" without much regard for the nations, tribes, clans, etc. that made up the diverse pre-Columbian population. Or how people think of "Africa" as one place while "Europe" is many places.
Over time, some observers spread pseudoscientific stories claiming that Maya temples were more likely to have been built by aliens than by ancestors of local people. (Vikings, Mormon Nephites and other mysteriously vanished civilisations have also been dubiously credited with building the ancient sites.)
Yes, because clearly, people with noticeable melanin in their skin couldn't possibly have achieved anything of greatness. Hence the "theories" about Maya, or about ancient Egyptians for that matter.
Also, I thought we were done calling them "Vikings."
Subsequent large-scale mappings led to Estrada-Belli’s estimate that between 9.5 and 16 million people once lived in the Maya lowlands. He calls the lowlands in the 700s a “continuously interconnected rural-urban sprawl”. This was a cosmopolitan region with high degrees of trade and settlements interconnected by a close web of causeways and roads.
So, apparently, by the time Europeans showed up almost a thousand years later, most of it had been reclaimed by the jungle.
The ancient Maya did not use pack animals, or carriage wheels. Everything that was built and traded had to be carried by human force alone.
The bit about carriage wheels is pretty famous. At least, I've known about it for a very long time. There's a lot of speculation about why, some of it centered on things like the above: that they were too primitive to invent such things. And yet, there's another possible explanation: that the wheel was entirely too sacred to use for mundane tasks such as carrying burdens.
Calling something like that "sacred" may strike you as primitive, too, but from an outsider's perspective, some of the things you consider sacred are mere mundanities.
Now... there's a lot more at the link. I did say it was a long read. The article itself says it's a long read. But if all you know about the Maya is the calendar thing, or the wheel thing (which I believe might be related), it's worth the time. |
February 24, 2026 at 8:36am
| |
Oh hey, it's an article relevant to writers. Kind of. From The Marginalian:
As a reminder, all words were invented. Some were invented more recently than others.
“Words are events, they do things, change things. They transform both speaker and hearer; they feed energy back and forth and amplify it. They feed understanding or emotion back and forth and amplify it,” Ursula K. Le Guin wrote in her exquisite manifesto for the magic of real human conversation.
All due respect to Ms. Le Guin, but "real human conversation" can also be incredibly annoying.
In the roots of words we find a portal to the mycelial web of invisible connections undergirding our emotional lives — the way “sadness” shares a Latin root with “sated” and originally meant a fulness of experience, the way “holy” shares a Latin root with “whole” and has its Indo-European origins in the notion of the interleaving of all things.
Well, yeah, hence "holistic." I do appreciate etymology.
Because we know their power, we ask of words to hold what we cannot hold — the complexity of experience, the polyphony of voices inside us narrating that experience, the longing for clarity amid the confusion. There is, therefore, singular disorientation to those moments when they fail us — when these prefabricated containers of language turn out too small to contain emotions at once overwhelmingly expansive and acutely specific.
One thing I try to avoid is the phrase "words cannot describe." I call myself a writer. I need to find words that describe.
I don't always succeed in my avoidance, mind you; some things are simply indescribable.
John Koenig offers a remedy for this lack in The Dictionary of Obscure Sorrows...
Yes, the article promotes a book.
The title, though beautiful, is misleading — the emotional states Koenig defines are not obscure but, despite their specificity, profoundly relatable and universal; they are not sorrows but emissaries of the bittersweet, with all its capacity for affirming the joy of being alive: maru mori (“the heartbreaking simplicity of ordinary things”), apolytus (“the moment you realize you are changing as a person, finally outgrowing your old problems like a reptile shedding its skin”), the wends (“the frustration that you’re not enjoying an experience as much as you should… as if your heart had been inadvertently demagnetized by a surge of expectations”), anoscetia (“the anxiety of not knowing ‘the real you'”), dès vu (“the awareness that this moment will become a memory”).
Well. All of that is nice. So are the other examples given, which I won't get into but are right there at the link. There's a problem, though.
The purpose of language, in my view, is not for us, as writers, to demonstrate our superior intelligence, insight, vocabulary, and sexual attractiveness (though we certainly possess these qualities). It's not to smugly show how clever and erudite we are. No, the purpose is to communicate. If we're going to go by what Le Guin said in the quote above, words can certainly "do things, change things," but only if both the speaker and listener (or writer and reader) can agree, at least to some extent, on the meanings of those words.
So if I'm going to a rock concert that I've been looking forward to for some time, and I feel like I should be enjoying it more, my friend might ask, "Hey, what's wrong?" And if I go, "I got the wends," unless they saw this article, they'd have no idea what the hell I'm talking about, and might even drag me to the first-aid station. Or, I could explain what "the wends" are, per the last paragraph I quoted, but at that point I wouldn't feel much like explaining anything.
Or I could just say "I'm not enjoying this as much as I'd hoped," and leave it at that.
And I say this as someone who's made up words in the past and have had to explain what they mean. Consider how much harder it must be if you're using someone else's recently-made-up words.
But hey, maybe some of these will catch on and become part of the lexicon. Language may reflect what's important to a culture, which is why we have dozens of words for death and fewer than a half-dozen for love. I'm sure there are linguists who disagree, but maybe, by changing the language, we can change minds. That power, however, can also be used for evil.
So, I don't know. I guess when it comes to this stuff, I'm feeling agnosthesia. |
February 23, 2026 at 8:59am
| |
Today's attempt to turn this into a food blog (because "carrion") is from TastingTable.
I saved this link for a few reasons. One of them is to display an example of how you can do a headline without making it clickbait. "Some Tomatoes Have A White Ring Inside. How Dangerous Is It?" would be clickbait.
Have you ever cut into a tomato and been perplexed to see a white ring?
No. Oh, I've seen the white rings. I was just never all that curious about them. It just never occurred to me that it was anything but standard variation in quality.
One of the primary causes of a white ring is a potassium deficiency in the soil when the tomato is growing. It's important to have an adequate concentration of potassium because, without it, the fruits may not absorb enough magnesium and calcium to properly ripen.
ChEmIcAlzzz!!!
However, too much sun exposure can also lead to your tomato growing with pale tissue. If the fruits are left out in the open at temperatures higher than 85 degrees Fahrenheit, they may turn white or yellow, and some areas may end up being dry or shriveled.
Do what now? Okay, it's been a very long time since anything useful has grown near me, and even longer since I (or rather my parents) grew tomatoes, but I seem to remember "plant after last frost" and "plant in an area with full sun." And Potomac River basin temperatures stayed above 85F most of the summer, even before climate change started to accelerate.
I could be misremembering. Also, what with selective breeding and genetic engineering and whatnot, I'm pretty sure there are hardier varieties.
There are a few other reasons why you may see white flesh inside your tomatoes. If stink bugs, beetles, spider mites, and other bugs get under the fruits' skin and start feeding, they'll suck out the juice and insert their saliva, leading to a white spot rather than a white ring.
You know how it goes: what's worse than finding a bug in your tomato? Finding half a bug.
The good news is that if you spot a white ring inside your tomato caused by a potassium deficiency in the soil, it's typically safe to eat and can simply be removed by cutting...
And why would you want to cut it out if it's safe to eat? Well, "safe to eat" isn't the same thing as "appetizing." I routinely excise the stem parts from tomatoes, just because I don't like them.
Really, there's not much to this article, but it did sort of answer a question I never knew I had. |
February 22, 2026 at 11:06am
| |
Now, here's a departure from my usual fare, thanks to Elisa the Bunny Stik:
I suppose that's preferable to all the bragging about how busy one is.
In Eat, Pray, Love, an Italian man tells our hapless protagonist her problem is that she’s American - Americans don’t understand pleasure because they believe it must be earned through exhaustion.
Far be it from me to agree with anything from the genre I call divorce porn (chick gets divorced, goes to a foreign land to "find" herself, doinks a hunky local guy, leaves satisfied), but that one feels right, like when a Belgian tour guide told me Americans eat like we have free health care.
Italians, he explains, have mastered il dolce far niente: the sweetness of doing nothing.
Fifteen years later, that sweetness has become the ultimate luxury.
Some might recall that I had an entry about doing nothing back on Groundhog Day: "Nothing Matters" 
Thorstein Veblen argued that people signal wealth through conspicuous consumption, conspicuous waste, and conspicuous leisure. Had he lived into the 21st century, he might have added a fourth: conspicuous grinding. The performance of perpetual productivity. Capitalism convinced us this is what rich people actually do. It isn’t.
The biggest advantage to being rich is that you have the ability, and the resources, to do nothing. Or almost nothing. But grinding doesn't get you there. Your hustle mostly enriches someone else. Someone who is doing almost nothing. And yeah, you're surviving, maybe even thriving, but you're not going to become a billionaire that way.
(Look, if the article can use the second-person pronoun with impunity, so can I.)
Leisure makes you feel guilty because you’re not working. Working constantly feels virtuous because that’s what success demands. We optimised our work, then ourselves, then wondered why we felt empty.
(And also the first-person plural pronoun.)
What’s emerging now is a pendulum swing towards a new aspirational leisure class: people whose value isn’t tied to what they do, but to how effortlessly they exist.
Insofar as people have "value," I balk at the notion that some are more valuable than others.
Time itself has become precious, so the ultimate status is to be wasteful with it. Complete autonomy over your schedule. The ability to meet anyone, whenever, and always know the right spot. To decline opportunities based on values or vibes. To partake in long, leisurely meals with no rushed ending.
I also balk at—nay, outright reject—the idea that such things are in any way "wasteful."
Although many activities today would have been considered leisure by previous generations - skincare rituals, vinyl listening bars, elaborate dining experiences - the question remains: is it still leisure if an algorithm told you to do it?
People can answer that question for themselves, I think.
If nobody could see you, if you couldn’t post about it, would you still do it?
That one, too. In my case, I do plenty of stuff that I don't post about. Some of it's not even embarrassing to admit; I just keep it private.
If so, that’s neo-leisure. If not, it’s unpaid labour, the performance of joy for an invisible audience.
Personally, I'm under the impression that a lot of that sort of thing isn't someone spontaneously deciding to, say, go to Tuscany and doink a local, but someone getting paid to promote Tuscany.
This is the contradiction at the heart of Neo-Leisure: the moment you perform it, you’re optimising again. The ability to waste time becomes another metric to track, another behaviour to perfect. We’ve simply replaced productivity optimisation with leisure optimisation. One exhausting performance becomes another.
When your hobby becomes your job, it loses a lot of what makes it interesting. Like, can porn stars ever have normal sex again? I'll never know the answer to that one.
For marginalised communities, for precarious workers, for anyone without generational security, the luxury of wasting time remains inaccessible. They’re still grinding because they have to. The status symbol isn’t in wasting time. It’s in having enough capital that you don’t need to justify how you spend it.
I don't think it's an epiphany to realize that leisure is tied to privilege. I know there's a bit going around about how feudal serfs had more free time than we do in our post-industrial dystopia, or about how hunter-gatherers work less than agriculturalists. I don't know how true any of that is.
The article then goes into "leisure" products that I've never even heard of. Remember what I said about getting paid for seeming to perform leisure, up there? I suspect that this is product placement.
True leisure, in my view, doesn't need a "product."
Odell wrote that “nothing is harder to do than nothing”. In an era where attention and consumption are currency, wasting time becomes an act of resistance.
Okay, but again, I must reiterate that I don't believe that these things are wastes of time. You know what is a waste of time? Doing work for a project that ultimately gets canceled. Even that isn't a complete waste of time if you learn something along the way.
The greatest luxury might be doing nothing and feeling no need to signal it at all.
Maybe. Or maybe the greatest luxury is to get paid to write blog entries. (To be clear, at the risk of repeating myself, I do not get paid to write blog entries.) |
February 21, 2026 at 9:17am
| |
Many years ago, I had this recurring schtick about how a certain white kawaii cat from Japan was evil and taking over the world. I called her the Nefarious Neko. I dropped it because it got old for me, but along the way, I learned way more than I ever wanted to know about Hello Kitty. So this BBC article caught my attention:
I, of course, never actually hated Hello Kitty. If there's something I actually detest, I normally just leave it alone, like the way I almost never talk about sports in here.
The designer behind Hello Kitty is stepping down after 46 years, during which time she oversaw the feline character achieving world recognition.
"World domination" is more like it. I remember some years ago, I saw an article about what was called "the most remote community in the world" or something, a tiny village in northern Siberia that was the only place of human habitation for many kilometers around. (Perhaps there are islands more remote, technically.) I don't remember many details, but it's the kind of thing that's only accessible by rail, and, because it's Siberia, only for like three months out of the year or something. Point is, it is quite literally the farthest corner of the world, and I distinctly remember, in one of the photos, a little girl wearing a Hello Kitty shirt.
Yuko Yamaguchi took over design duties for the character - who isn't actually a cat, but a little girl from London - in 1980, five years after she first launched.
It's not widely known, but yes, she's actually a little girl and she's actually British and her name is actually Kitty White.
Yamaguchi herself often wore Kitty-style dresses in public and piled her hair in buns.
One of these days, I'm determined to visit Japan. I'll need to brace myself for that sort of thing.
The Hello Kitty character first appeared on a coin purse in 1980 and has become a global marketing phenomenon.
This is where I started to metaphorically scratch my head. Up there, it said she took over "in 1980, five years after [HK] first launched." Seems to be a glitch in the timeline there.
She has appeared on clothes, accessories, video games, and even an Airbus plane.
And I saw an entire bullet train with a Hello Kitty theme. Well, pictures of it, anyway. Not to mention the Hello Kitty vibrator and the Hello Kitty assault rifle, which I called the HK-47.
Unlike other Japanese exports such as Pokemon, there is little backstory to the character of Hello Kitty. Sanrio has said she "isn't a human, [but] she's not quite a cat either".
As she has an actual cat as a pet, if she were a cat, it might raise questions about slavery and feline trafficking.
She was born in London and has a twin sister named Mimmy and a boyfriend named Dear Daniel, according to Sanrio.
And yet, the article fails to mention her name. Fortunately, I covered that up there ^
Kitty will make her cinematic debut in a Warner Bros film in 2028. She has already appeared in several animated series but has never spoken, as she doesn't have a mouth.
Now, I want you to stop and think for a moment about the optics of a British little girl character, created in Japan some 30 years after WWII, whose distinguishing feature is being voiceless. And maybe you'll see one reason why I had that schtick going for a while. |
February 20, 2026 at 10:45am
| |
Yes, this headline, from allrecipes, is clickbait, or perilously close to it.
I'm actually more annoyed at the continued use of "hack" in this context. But I'm not sure if it's better or worse than "trick."
Whether you like them fried, roasted, baked, or made into tots...
"Boil 'em, mash 'em, stick 'em in a stew. Lovely big golden chips with a nice piece of fried fish"
...you probably have at least one favorite potato dish.
Sure: Latkes.
Plus, potatoes are inexpensive—you can often get a large 5-pound bag for just a few dollars.
How much for the small 5-pound bag?
But if you’re a household of one or two, it can be a challenge to eat all those potatoes before they go bad, no matter how much you like them.
Whenever I see something about food "going bad," I imagine it standing on a street corner in a leather jacket and tattoos and chains, smoking a cigarette.
That’s why videos of people stashing apples in their bags of potatoes to prevent sprouting have popped up all over social media. But does this trick work? We looked to the science, talked to an expert, and tried it ourselves to find out for you.
No, I didn't just save this article so I could make potato jokes. That's just a bonus. I'm using this as an example of How To Do Science. Still. Remember the French phrase that translates to "potato": pomme de terre, which literally translates to something like "earth apple" or "ground apple" (as in "the ground," not the past tense of "grind"). So I find the apple trick amusing, whether it works or not.
But in addition to storing them in that cool, dark, and ventilated space, can putting an apple in your potato sack really stop, or at least slow, potato spoilage? Well, it’s a little complicated.
So, how to do science. This is an easy and cheap experiment, though if you have an ethical issue with deliberately wasting food, maybe skip it. Find a bunch of potatoes from the same harvest, get an apple, split the potatoes into two groups, put each group in a sack (one with the apple) and leave them in the same room under the same conditions, but not so close to each other that apple fumes transfer.
Then simply check to see which batch, if either, grows eyes first.
Of course, just one experiment won't cut it. It needs to be repeatable and verifiable. Also, best if you have a third batch of potatoes from the same source set out on the counter or something as a control group.
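If you wanted to keep honest records of that two-sack-plus-control setup, it amounts to nothing more than logging the day each batch first sprouts and sorting. Here's a trivial sketch; every number in it is a hypothetical placeholder, not a real observation:

```python
# Record-keeping sketch for the apple-in-the-potato-sack experiment.
# Values are the day (since the start) each batch first sprouted;
# None means it hasn't sprouted yet. All numbers are made up.
observations = {
    "with_apple": 7,       # hypothetical: sack with the apple
    "without_apple": 8,    # hypothetical: sack without the apple
    "control_counter": 5,  # hypothetical: loose on the counter (control)
}

def sprout_order(obs):
    """Return batch names ordered by sprout day, earliest first,
    skipping any batch that hasn't sprouted yet."""
    sprouted = {name: day for name, day in obs.items() if day is not None}
    return sorted(sprouted, key=sprouted.get)

print(sprout_order(observations))  # earliest-sprouting batch listed first
```

Repeat the run a few times with fresh potatoes, and the log is what lets you say "the apple sack sprouted first" with a straight face instead of from memory.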
There’s some scientific evidence to support this hack. Ethylene gas is a natural plant hormone produced by fruits like apples, bananas, and tomatoes. It plays a crucial role in the ripening process of fruits and the aging of vegetables. The theory behind the apple-potato trick is that apples release ethylene gas, and ethylene gas has been shown to inhibit potatoes from sprouting in at least one lab experiment.
This is the "hypothesis" stage of science. Not "theory." It's the starting point. But one must also take into account the possibility that there are other factors in the pomme / pomme de terre synergy, not just ethylene gas. For that, you'd probably need a real lab setup.
As a practical matter, though, what you're really looking for is extended shelf life on your spuds, so while the mechanism is interesting, just doing the experiment is good enough for the kitchen.
Or, you know, trust the other scientists who have already done the experiment.
The study showed that ethylene treatment delayed the sprouting of potatoes, at least under these tightly managed conditions.
It's the "tightly managed conditions" that are often the stumbling point between experiment and practicality.
To put this apple-potato trick to the test, I conducted a simple experiment in my home kitchen. I divided a bag of potatoes into two groups: one stored with an apple and the other without. I kept both bags in a relatively cool, dark pantry and checked on them every day for more than a week.
See, what'd I tell you?
Surprisingly, after seven days, I found that the bag of potatoes with the apple actually sprouted first, while the bag without the apple sprouted about 24 hours later, after eight days. It’s a puzzling result considering the research.
A puzzling result, maybe, but a good result. Why good? Because it exposes a possible error in your hypothesis, or your method, and that's the fun part.
My pantry isn’t a lab, and my climate control was anything but precise. Plus, different potato varieties may have varying susceptibility to sprouting.
Okay, yeah, but like I said: get your potatoes from the same batch. If one batch was dug up 3 days ago and the other, 5, then the experiment has a fatal flaw from the get-go.
There is no harm in trying this trick at home, says Jayanty. Whether you go with an apple or a banana, it won’t hurt the potatoes, and it just might delay sprouting.
"No harm" unless you'd rather have an apple or a banana than a potato to eat.
The article ends with actual scientifically-backed tips for extending tater life, so there's some practicality there, apart from the kitchen science stuff. |
February 19, 2026 at 10:54am
| |
The absolute last thing I should be doing is talking about human interrelations. I know more about quantum physics, and I know almost nothing about quantum physics. But, in another example of random number clumping, here's another BBC article, this one on relationships:
For once, a headline question can be answered "yes." Because for a question like that, all you need is one counterexample. And I have friends who are exes, so there you go. Story over.
...except, of course, we already know that I'm an exception, an outlier, a better person. I suspect many people would say "hell, no."
Break-ups are tough - you suddenly lose the person you shared everything with. But staying friends with an ex can be equally as painful.
Well, yeah, but presumably you liked them for a reason, right? A reason that might be common ground for friendship? Unless it was purely physical attraction, which there's nothing wrong with.
I should note, however, that part of the "I don't know what I'm talking about" thing is that I'm not sure whether the article is talking about, like, just being cordial with the ex, or occasionally doing stuff together in a group, or someone you can call to help you move, or a full-on "help you move bodies" close friendship.
That last one, admittedly, could get a bit awkward, I would imagine. It's like "Sure, I'll be there if you need me; I just find you unsuitable to live or mash genitals together with."
"I don't have many friends who are friends with their exes, actually," says Olivia Petter, author of dating handbook Millennial Love. But she has managed it in a couple of cases.
Obviously, this kind of thing is way behind me, anyway. So my interest is mostly sociological.
Okay. I can't pass up noting yet another aptronym there. Almost makes me think "Olivia Petter" might be a nom de plume. But no one would be that obvious... would they?
1. How serious was it?
"There are one or two men I've had brief, casual romantic relationships with that have evolved into friendships," Olivia told BBC Radio 4's Woman's Hour...
But when it comes to serious relationships, she says while she's on good terms with them they're not close friends.
See, that's the sort of thing I'm talking about. I run into my first ex-wife occasionally. We talk and catch up. Then we forget about each other again.
Meanwhile, a month-long fling I had back in the 90s is the kind of friend where we text fairly often. She's on the other coast, so I don't just run into her. We both did a Thanksgiving dinner last year, though, with a group of friends. No awkwardness involved.
2. Are you over them?
One of the biggest obstacles is whether you are able to separate the romance from the person.
"You need to have processed the break-up, not just moved on logistically, but emotionally," Kate says.
Huh... that's perilously close to what I said above. Maybe I should write a book about relationships and have BBC promote it.
3. How much time has passed?
Yeah, that actually makes sense. Get past the big emotions. And, obviously, some relationships end with unforgivable acts.
4. Is your new partner ok with it?
If you do decide to stay friends, then Kate says you need to talk openly about what you are both going to do if the other gets into a new relationship.
And if a new partner is uncomfortable with the friendship, Kate stresses you should take their concerns seriously.
We didn't use the words "red flag" way back when I was dating, but one immediate "no" signal for me was when someone indicated that being friendly with an ex was a big "no" signal for her. Most of my friends are women. Some are exes. I wasn't about to give up long-term friendships because someone new doesn't trust it.
You may need to have a conversation with your ex to adjust the friendship which could be "less frequent contact, more group settings, or being more transparent about what you're doing together," she says.
And then, from my longer-term perspective, sometimes friendships just fade out, whether there was ever romance and/or doinking involved. People drift apart. It's just a fact of life. It's not always a, pardon me for adopting the term, "conscious decoupling" or whatever. (I'm applying this to friendships as much as romance.)
I guess, for me anyway, I just feel like holding a grudge hurts you more than it does the other person, so I try not to do it. Doesn't mean we have to do stuff together, but it hurts me not at all to at least be cordial.
Again, though, I have a longer perspective on these things. It's different for younger people, and society is different, too. Hence my interest in the subject remaining purely scientific. |
February 18, 2026 at 8:11am
| |
I'm skeptical about a lot of things. Not denialist; skeptical. But there are a few things I'm absolutely certain of, and one of them is that plants die in my vicinity. Hell, one time I bought a cactus for my housemate. I never touched it. She was diligent with it. It died. So I don't believe for one second this article from BBC:
I thought about getting plastic plants, but I expect those would die around me, too.
Have you lost count of the times you've had high hopes for a pot plant...
However, this lede is the real reason I saved this article. High hopes? Pot plant? I get they speak a different language over there, but either they ignored their US editors, or this is one of those times when the BBC gets a bit cheeky. It does that sometimes. It's often subtle. So I'm hoping—really hoping—that this particular double entendre was intentional.
Still, in the US, we call them "potted" plants to distinguish them from weed.
...but despite careful positioning and diligent watering it always seems to die?
Every. Time. Full disclosure: I've never tried with an actual (snicker) pot plant.
Well, you're not cursed, and you don't need particularly green fingers for your foliage to thrive; you just need to know where you might be going wrong, experts say.
You all know by now that I have no business with the supernatural. I mean, it's great as metaphor and in stories, and I enjoyed the long-running series by that name, but here in reality, I prefer science and reason. Still, if anything was going to pivot me to believing in curses and whatnot, it's the way plants die in my presence, as if realizing that they're stuck with me and have no legs to run away with.
But—rationally—my cats have legs and are stuck with me, and they don't try to run away. Except the one who gets medicine twice a day, but she doesn't run far.
Gardeners' World host Adam Frost and the Royal Horticultural Society's Clare Preston-Pollitt share their top tips for keeping your house plants alive and healthy.
Clare's last name is perilously close to being an aptronym. Pollitt? Pollen? I'll be here all week.
Adam's is the polar opposite of an aptronym.
1. Pick the right plant
I don't want to seem ungrateful, here, but that's crap advice. As someone who appreciates form, but appreciates function even more, the only plants I really want to keep around are the ones you can use in cooking: mint, thyme, basil, etc. A ficus is useless and just takes up space. So I want to know how to keep, specifically, herbs alive, not some random spider plant that would look better in a vegan restaurant anyway.
Many of us pick plants we think are pretty but making sure they are compatible with the conditions in our homes is key for survival, says Clare, RHS Garden Bridgewater's horticultural advisor.
And if you live with pets, some plants are right out. Imagine me having a catnip plant. It wouldn't last a day. But, worse, some plants are outright toxic to pets, usually the same ones that pets absolutely love to munch on.
2. Don't overwater
Yeah, thanks. I suspect the problem here is the precise opposite, but "underwater" means something else entirely.
For common house plants like peace lilies and spider plants, brown leaves are a tell-tale sign of over or under watering. Check the dryness of the soil before topping them up.
For others, like cacti and succulents, Clare says we mistakenly drown them by unnecessarily watering them.
I am certain that this is not the case for the cactus I mentioned above.
3. Water less in winter
Some regions experience winter differently. Specifically, some never get cold, while others never get warm. I suspect this article was written with the temperate and moist UK in mind.
4. Keep your Christmas poinsettia warm
Remember how I said the thing about pets? Yeah, poinsettia and pets don't go together, despite being unable to spell "poinsettia" without "pets."
Also, even I know poinsettias are from Central America and Mexico. Which is one of those regions I mentioned in the last section. Exposing them to cold is like mixing good tequila with Sprite.
To keep them lasting longer than your New Year's resolutions, you should add plant food to your poinsettias each month, Adam says. In April, he suggests trimming the branches, before re-potting in May.
And I'm including that bit to emphasize what I said up there about the BBC sometimes being cheeky. "lasting longer than your New Year's resolutions," indeed.
I'm obviously leaving a lot out, but the link is there if you haven't yet given up on the entire idea of houseplants, like I have. |
February 17, 2026 at 11:31am
| |
There was a slew of articles about Shakespeare not too long ago, probably paid advertising for the movie Hamnet. This might or might not have been one of them, from Mental Floss:
Well, yeah, because no one knows everything about Shakespeare. Hell, no one even knows for sure what day he was born on (they have a baptism record and an assumption based on custom).
As usual, don't trust MF for the facts. Or me, for that matter. This is, as certain "news" outlets disclaim, for entertainment purposes.
Misconception: Historians debate whether Shakespeare really wrote Shakespeare.
Shakespeare wrote a lot. He was also from the country town of Stratford-upon-Avon and didn’t go to university. So could this one “simple” guy write all these impressive high-brow works?
So, two things here. I've heard this "theory" bandied about for as long as I've known about Shakespeare, and my first impression (as an impressionable kid) was "How is this relevant? The plays exist. Can we not separate them from the writer, if not from the time?" Later, I came to realize what this really was: arrant snobbery.
Thus, I feel like I can safely ignore any snoot who proclaims that Shakespeare didn't write Shakespeare.
And finally, they're only "impressive high-brow works" from our point of view today. At the time, they were the pop culture equivalent of monster truck rallies.
Misconception: Shakespeare invented 1700 words.
Today’s lexicographers have a lot more data and technology—and they know Shakespeare didn’t coin that many words. (Jonathan Culpeper, a linguistics professor at Lancaster University, has spent decades researching Shakespearean language. He believes Shakespeare coined around 400 words.)
As I'm pretty sure I've noted before, I don't know how these things get determined, but it seems to me that, especially before the internet, words were passed around by, well, word-of-mouth before they were ever written down. So how do we know what was coined, and what was the pop culture equivalent of "six-seven" and "skibidi"?
I will, however, give more weight to the opinion of a linguistics professor in the matter.
Misconception: Saying Macbeth in a theater is dangerous.
Okay. Okay, fine, so that's a misconception. And yes, actors are known for being a superstitious lot, which is why you tell them to "break a leg" before a performance instead of "good luck." However, that does not make this clip any less than one of the funniest things in the history of comedy:
Misconception: Wherefore means “where.”
An image you’ve probably seen countless times is Juliet decrying, “O Romeo, Romeo, Wherefore art thou Romeo?” It sounds like, “Where you at, Romeo?” And some performances even have Juliet physically searching for Romeo as she says those lines.
But, at the time Shakespeare was writing, wherefore essentially meant “why.” Juliet is asking, “Why are you Romeo?” because it’s his name, attached to a family that’s feuding with hers, keeping them apart.
Okay, I'm not going to argue about that. It just seems weird because it's not his name "Romeo" that's keeping them apart, but his family name. And the necessities of plot.
I continue to insist that R&J is best interpreted as satire and/or parody.
Misconception: As Hamlet says, “to be or not to be,” he’s holding a skull.
Sometimes in pop culture, you encounter a Hamlet who’s holding a skull and reciting the “to be or not to be” speech. (Billy Madison is one example.) But Hamlet holds a skull during his speech in the churchyard that begins, “Alas, poor Yorick!” It happens in Act 5, Scene 1. “To be, or not to be—that is the question” comes two acts before that, in Act 3, Scene 1.
Sure. Again, not arguing. But it's pretty damn famous that in all of Shakespeare's surviving works, only one stage direction stands out: "Exit, pursued by a bear." (The Winter's Tale). And there weren't many in his entire body of work. I don't recall, and am too lazy to look up, if those parts of Hamlet have actual stage directions, or if it's left up to directorial interpretation.
So if you want to have Hamlet holding a skull, or a book, or a damn iPhone, in your production of Hamlet, I say go for it. Maybe he carries a skull around with him all the time. He certainly seems the type.
Misconception: The Globe Theatre was round.
The first Globe Theatre was completed in 1599. Shakespeare was a part-owner of the Lord Chamberlain’s Men company, which built the venue. And he may or may not have called it a “wooden O” in the prologue of Henry V. That being said, it wasn’t exactly a circle. It was a many-sided polygon.
In fairness, calculus wouldn't be invented for nearly 100 years, so I don't think they'd be aware that, as the number of sides of a regular polygon increases, its resemblance to a circle also increases. At some point, architecturally, you have to say "Yeah, fine, that's a circle."
It wasn't, however, a "globe." The Hayden Planetarium is a "globe" (inside a cube). At best, the Globe Theatre approximated a cylinder.
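Since I brought up the math: here's a quick sketch of my own (not from the article) showing how fast a regular polygon closes in on a circle, by comparing the perimeter of an n-gon inscribed in a unit circle against the circle's circumference. The 20-sided figure is just an illustration; accounts of the Globe's actual number of sides vary.

```python
import math

def ngon_perimeter(n, r=1.0):
    """Perimeter of a regular n-gon inscribed in a circle of radius r."""
    return 2 * n * r * math.sin(math.pi / n)

circle = 2 * math.pi  # circumference of the unit circle

for n in (4, 8, 20, 100):
    p = ngon_perimeter(n)
    print(f"{n:>3} sides: perimeter = {p:.4f} ({p / circle:.1%} of the circle)")
```

A 20-sided polygon already comes within about half a percent of the circle's circumference, so "Yeah, fine, that's a circle" checks out.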
Now, there are a bunch I skipped because I had nothing to say about them. They're at the link if you're interested. |
February 16, 2026 at 9:18am
| |
Well, no, I haven't started suddenly following Wine Spectator. Though I might. Still, I'd rather drink it than read about it, except maybe in this case:
Okay, well, if it's truly an indispensable tool, then the answer is "stop being 'organic'" or "move out of France."
The irony (another heavy metal) here is that copper is considered "organic" for agricultural purposes. Which is distinct from the chemical meaning of "organic" (carbon, not copper, compounds), and the original meaning of "organic" (from organs).
My amusement at this is tempered only by knowing that, in French, what we call organic in the agricultural sense is called biologique.
It leaves organic winemakers confronting an existential question: How do you protect vines from downy mildew when your primary defense has been eliminated?
Because no one in France has ever confronted an existential question before.
“Copper is a natural element, naturally occurring in nature,” said Gérard Bertrand, a leading vigneron in southern France and advocate of organic farming.
Uh huh. Okay. I'm letting the repetition of "natural" slide there because it's either a translation or someone's second or third language. But I'm not going to let the appeal-to-nature fallacy slide, oh hell no, not me.
Once more for the back row: "natural" doesn't mean "good." Poison ivy is natural. Tobacco is natural. Arsenic is a naturally occurring element much like copper.
Copper is a naturally occurring element and is approved worldwide for organic agriculture. Critically, there is no organic-approved equivalent; the other options are synthetic fungicides, which are forbidden.
Not, mind you, that I'm coming down on one side or the other here. I don't know enough about copper toxicity or viticulture in general to weigh in on what France did, and even if I did know enough, they wouldn't give one single shit about my opinion. All they care about is that I enjoy the finished product and keep sending them money in exchange.
Copper, while it does protect vines from fungal disease, is a persistent metal that accumulates irreversibly in the top few inches of vineyard soils. In large quantities it disrupts essential microbial communities and earthworm populations that define healthy terroir. It can also contaminate the waterways that flow through wine regions. There are also mounting concerns about its impact on vignerons and vineyard workers themselves.
Well, there you go. Look at that: something natural isn't good for you.
Regarding a way forward, Jestin remains optimistic, hoping that scientific research can devise alternatives to copper.
Honestly, I hope so too. French wine is expensive enough with tariffs.
Trade body SudVinBio cautioned that producers may abandon organic practices altogether.
About that, I don't care.
Two copper products remain authorized, but they carry stringent restrictions, which may make them less practical. Without viable alternatives to copper, how does organic viticulture survive in regions where Bordeaux's Atlantic humidity, Burgundy's continental rainfall, Cognac’s and Champagne's persistent dampness all create conditions where mildew protection determines vineyard sustainability?
I have another article in my pile about the American chestnut, which used to dominate Blue Ridge Mountain forests until a fungus destroyed the entire native population. It'll pop up eventually, but the point for now is that things change. And right now, things, climatically speaking, change even faster. France is very strict about its wine growing policies; for instance, they don't allow irrigation apart from whatever rains fall. So either they can become less strict, or wine growing will shift to some other region.
Which would be a shame, but at least they'll still have cheese. Which I don't think has any copper in it, but I haven't tested it for that. |
February 15, 2026 at 10:30am
| |
Today, a bit of a break as I use this space to participate in "26 Paychecks" [E]
Tell us about one (1) genre that you've never written anything for. Name the genre and tell us why it's not something that has sparked any writing from you.
Thing is, I've been here for a while, and I've written a bunch of things both here and elsewhere. And I tend to experiment, sometimes. So I can't absolutely guarantee that I've never written in some particular genre. Hell, I probably have something listed under "Fashion," which is one genre I've always said I knew nothing about.
But, upon browsing the list of genres here, I came across one that I don't think I've ever touched, which is Genealogy.
There's nothing wrong with genealogy. I get wanting to know where people come from, much as I enjoy tracing word origins. It's just not a particular interest of mine.
You'd think it would be, right? Since I was adopted as an infant, surely I must feel a burning need to track down my genetic ancestry! Well, um... no. I don't. Never have. Nothing more than idle curiosity or being able to answer inheritable disorder questions for the doctor. I have had some interest in tracing my adoptive parents' origins, but any attempts led me to dead ends and I gave up, never having written about it. Dad always warned me I'd find skeletons, anyway, and unlike a character in a horror story, I actually listen to warnings. I don't always heed them, mind. But I listen.
Short one today to keep it "about 200+ words," but that suits my mood today just fine. |
February 14, 2026 at 9:50am
| |
Another one that's not my usual thing, but it caught my brain. At first it looks like politics, which I try to stay away from, but it's kinda not.
As we know, the answer to headline questions is "no" by default. But okay; I'm willing to listen.
It’s no secret that Americans are more politically polarized today than we’ve ever been. Do you even remember a time when we weren’t this way?
The only time I recall was the few months after September 11, 2001 – 24 years ago now. For a short but beautiful time, Americans really were united.
No.
We thought we were united. But it turned out that we were all upset for different reasons. One side mourned the loss of life. The other seemed to, but was really just pissed that someone had caught us with our pants down.
Our representatives in Congress came together to proclaim their commitment to working across the aisle and backed it up with some major bipartisan laws.
Yeah. Really, really bad ones.
We Americans have settled ourselves neatly into political tribes that don’t work together, don’t listen to each other, and often despise one another.
Or both. I can despise both.
Many people have been hurt by our current level of polarization, and there’s worse pain to come if things continue this way.
If you haven't noticed, there's worse pain to come regardless.
Anyway, the article goes into this "depolarization challenge," and I have no need to reproduce it here.
At this point you might be thinking, “but why do I need to change? It’s those other people who are causing all the problems!”
I can understand that thinking. There are certainly things I've given up on because we'd all have to do it, and that ain't gonna happen. But when you think harder, for stuff like what's in this article anyway, maybe you come to the conclusion that it's easier to change yourself than it is to change other people.
Some people are not ready to step outside their comfort zone and change their mindset in this way. But those who do will be rewarded with a stronger sense of community, a more functional civic society, less heartache, better relationships, and a country that they can be proud of. And maybe, if enough of us do it over a period of time, our government can become less polarized too.
Perhaps this is a noble goal.
I'll have to think about it. |
February 13, 2026 at 10:23am
| |
So, here's one from a source I don't usually follow, but it came to my attention thanks to Elisa the Bunny Stik:
We’re entering an era of what’s often called AI slop: an endless stream of synthetic images, videos, and stories produced quickly, cheaply, and at scale.
I have to admit, I'm getting more than a little tired of hearing/seeing the words "AI slop." From what I've seen, AI output has become more polished and professional than about 75% of human-generated content. I think some people might be jealous.
I ain't saying it's right, mind you. Only that it's prettier.
Sometimes these fake videos are harmless or silly, like 1001 cats waking up a human from sleep.
Harmless? You dare to call something that triggering harmless? I don't even allow the significantly fewer than 1001 cats in my household to wake me up.
Other times, they are deliberately designed to provoke outrage, manipulate identity, or push propaganda.
Because human-generated content would never do that. Just like no one ever got lost using paper maps in the time before GPS.
To navigate this new information environment, we need to combine psychological literacy, media literacy, and policy-level change.
And here's where it gets difficult for most of us. Why should we change? It's the world that needs to change, dammit!
The article provides a road map (or, if you prefer, a GPS route) to us changing:
1) Understand Our Own Psychological Biases (Psychological Literacy)
The psychology behind falling for AI-generated misinformation isn’t fundamentally new. The process is largely the same as with other forms of misinformation, but AI takes it to a whole new level– it dramatically lowers the cost and effort required to produce and spread it at scale.
My own simple solution: Right now, most of us have a bias that says "I saw it, so it must be real." I suggest turning that around. Assume everything you see on the internet, or on TV, is fake. Like you're watching a fictional movie or show. The burden of proof thus shifts.
The downside to this (every simple solution has a downside) is that you end up not believing anything. And for some of these content generators, that's the goal: make you question reality itself so they can swoop in and replace it with their own version. Hell, religion has been doing this for as long as there's been religion.
As Matthew has written before about fake AI accounts, people are motivated to believe what fits their values, grievances, and group identities, not necessarily what’s true. When a video confirms what you already believe about politics, culture, or power, authenticity becomes secondary.
I have noted this before: it is important to be just as, or preferably more, skeptical about the things that tickle our confirmation bias.
The goal isn’t to suppress emotion. It’s to recognize when emotion is being used as a shortcut around verification, and being used to manipulate you.
It sure would be nice to be able to suppress emotion, though. I've felt that way since watching Star Trek as a kid. Spock was my role model.
2) Lateral Reading Is Still the Best Tool We Have (Media Literacy)
When people try to fact-check AI videos, their instinct is often to stare harder at the content itself such as examining faces, counting fingers, looking for visual glitches.
Guilty.
I've been seriously considering wearing a prosthetic extra pinkie finger so that anyone who looks at a surveillance photo of me will immediately assume it's an AI fake.
The most effective fact-checking strategy we have isn’t vertical reading (scrutinizing the video itself). It’s lateral reading—leaving the content entirely to verify it elsewhere.
I do that here, especially with notoriously unreliable sources, which, since I try to use free and easily accessible content, is almost everyone these days.
3) Policy Changes and Platform Accountability
Individual skills matter. Community norms matter. But at this point, policy intervention is likely required.
Well, I was trying to be funny with the "It's the world that needs to change" bit above, but I guess they're serious.
Social media platforms are not optimized for truth, they’re optimized for engagement.
I should fact-check this, but it aligns with what I already believe, so I won't.
Conclusion
The most dangerous thing about fake AI videos isn’t that people believe them once. It’s that repeated exposure erodes trust altogether: in media, in institutions, and eventually in one another.
As I alluded to above, it makes us question the very meaning of "truth."
I'd also add this: Be humble enough to know that you can be wrong. Be brave enough to admit when you're wrong. And allow space for the idea that sometimes, your ideological opponents are right.
Not often, mind you. But sometimes. |
© Copyright 2026 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved. Robert Waltz has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.
|