
Carrion Luggage


Native to the Americas, the turkey vulture (Cathartes aura) travels widely in search of sustenance. While usually foraging alone, it relies on other individuals of its species for companionship and mutual protection. Sometimes misunderstood, sometimes feared, sometimes shunned, it nevertheless performs an important role in the ecosystem.

This scavenger bird is a marvel of efficiency. Rather than expend energy flapping its wings, it locates rising columns of air and spirals within them to climb to greater heights. This behavior has been mistaken for opportunism, interpreted as if it were circling doomed terrestrial animals destined to be its next meal. In truth, the vulture takes advantage of these thermals to gain the altitude needed to glide longer distances, flying not out of necessity, but for the joy of it.

It also avoids the exertion necessary to capture live prey, preferring instead to feast upon that which is already dead. In this behavior, it resembles many humans.

It is not what most of us would consider to be a pretty bird. While its habits are often off-putting, or even disgusting, to members of more fastidious species, the turkey vulture helps to keep the environment from being clogged with detritus. Hence its Latin binomial, which translates to English as "golden purifier."

I rarely know where the winds will take me next, or what I might find there. The journey is the destination.


April 13, 2025 at 12:35am
#1087131
I'll be traveling this week, so posts will be whenever I can find the time to make them. Like now, before I get some sleep so I can leave early in the morning.

Another older article today, an Ars Technica piece from 2019. This is significant, because clearly, the "techniques" they discuss therein didn't work to combat the misinformation and anti-science rhetoric that amped up in the following year.

    Two tactics effectively limit the spread of science denialism
Debunking the content or techniques of denialism mitigates their impact.


Does it, though? Does it really?

“Vaccines are safe and effective,” write researchers Philipp Schmid and Cornelia Betsch in a paper published in Nature Human Behavior this week.

Again... 2019.

“Humans cause global warming. Evolution theory explains the diversity and change of life.” But large numbers of people do not believe that these statements are true, with devastating effects: progress toward addressing the climate crisis is stultifyingly slow, and the US is seeing its largest measles outbreak since 2000.

I checked the statistics, and yes, the one in 2019 was even larger than the current measles outbreak... so far.

In their paper, Schmid and Betsch present some good news and some bad: rebutting misinformation reduces the ensuing level of science denialism, but not enough to completely counter the effect of the original exposure to misinformation.

If what we've seen over the past five years is a reduction, I'd hate to have seen the unmitigated disaster.

Schmid and Betsch make a point of emphasizing that science denialism is a universe away from a healthy skepticism. In fact, skepticism of existing results is what drives research to refine and overturn existing paradigms. Denialism, the authors write, is “dysfunctional” skepticism “driven by how the denier would like things to be rather than what he has evidence for.”

There's also, I think, a knowledge gap involved. If you don't know how to fly a helicopter, don't get behind the controls of one. If you think you know how to fly a helicopter because you've seen action movies, you're wrong. Similarly, if you think you know everything about vaccines because you've watched a few videos online, you're wrong. I don't know everything about vaccines, but I have the advantage of living in the same house as an epidemiologist. And usually that of recognizing good science as opposed to bad.

Schmid and Betsch focused on strategies to counter misinformation as it is being delivered during a debate, focusing on two possible approaches: correcting misinformation and laying bare the rhetorical techniques that are being used to obfuscate the truth.

Maybe part of the problem is allowing it to get to the point of debate. When you get a flat-earther up on stage discussing the shape of the planet with a... well, with anyone with brains, you're putting them on equal footing. You shouldn't do that. Flat-earth nonsense needs to be nipped in the bud, even if it does make the flat-earther feel persecuted and perversely vindicated. They can have their own platform, not one shared with scientists.

Flat-earth bullshit is only the most obvious of these types of "my ignorance is just as good as your knowledge" things, though.

For instance, in the case of vaccine denialism, a denier might argue that vaccines are not completely safe. Correcting this misinformation (which Schmid and Betsch call a “topic” rebuttal) could take the form of arguing that vaccines in fact have an excellent safety record. A “technique” rebuttal, on the other hand, would point out that demanding perfect safety is holding vaccines to an impossible standard and that no medication is 100 percent safe.

"See? It's only 99.9999% safe! Why take the chance?" Because failing to vaccinate causes more death.

The article goes into the methods used in the study, then:

But one thing seems clear: it could be better to turn up and debate a denialist than to stay away, a tactic that is sometimes advocated out of fear of legitimizing the denialism.

Which is exactly the opposite of what I just said up there. This tells us three things:

1) I'm not an expert, either (but I can generally spot experts);
2) I can be wrong;
3) Unlike denialists, I can admit when I'm wrong.

Still, I'm not going to debate any of these things in person. My memory is too crappy, my knowledge is too broad and not deep enough, and I'm not much of a public speaker. There's no way I could keep up with the flood of misinformation and outright lies that the denialist (of whatever) is spouting. If someone else wants to do it, someone with actual credentials and who's not going to freeze up on stage, go for it.

But the bullshit comes too fast. A lie is wiping its dick on the curtains while the truth is still struggling to get the condom on.

It's an uphill battle. Sisyphean, even, because once you push the boulder to the top of the hill, they'll roll it right back down again.

And yet, I have to try.
April 12, 2025 at 3:22am
#1087073
This Wired article is fairly old, and published on my birthday, but neither of those tidbits of trivia is relevant.

    Why a Grape Turns Into a Fireball in a Microwave
Nuking a grape produces sparks of plasma, as plenty of YouTube videos document. Now physicists think they can explain how that energy builds up.


No, what's relevant is that fire is fun.

The internet is full of videos of thoughtful people setting things on fire.

See?

Here’s a perennial favorite: Cleave a grape in half, leaving a little skin connecting the two hemispheres. Blitz it in the microwave for five seconds. For one glorious moment, the grape halves will produce a fireball unfit for domestic life.

Unfortunately, you can only see it through the appliance's screen door (that screen serves the important function of keeping most of the microwaves inside the microwave), and I don't know what it might do to the unit, so don't try this with your only microwave. Or at least, don't blame me if you have to buy a new one. I'm not going to pay for it.

Physicist Stephen Bosi tried the experiment back in 2011 for the YouTube channel Veritasium, in the physics department’s break room at the University of Sydney.

What's truly impressive is that Bosi, the grape, and the microwave oven were all upside-down.

Off-camera, they discovered they had burned the interior of the physics department microwave.

What'd I tell you? I'm not responsible if you blow up the one at work, either. Still, if the last person to use it committed the grave sin of microwaving fish, this might be an improvement.

I should also note that the article contains moving pictures of the effect. These are cool, but you might hit a subscription screen. With my script blocker, I could see the text, but not the pictures.

But it turns out, even after millions of YouTube views and probably tens of scorched microwaves, no one knew exactly why the fireball forms.

As regular readers already know, this is the purpose of science.

After several summers of microwaving grape-shaped objects and simulating the microwaving of those objects, a trio of physicists in Canada may have finally figured it out.

At least they weren't upside-down. Sucks if they wanted to nuke some poutine, though.

The fireball is merely a beautiful, hot blob of loose electrons and ions known as a plasma. The most interesting science is contained in the steps leading up to the plasma, they say. The real question is how the grape got hot enough to produce the plasma in the first place.

And this is why some people think science sucks the joy out of everything. No, nerds: the fireball is the cool part. The science is merely interesting.

Their conclusions: The grape is less like an antenna and more like a trombone, though for microwaves instead of sound.

Huh. Never heard of a trombone exploding into a blaze of glorious fire, but I suppose it could happen. Better to save that fate for instruments that deserve it, like bagpipes, accordions, and mizmars.

I joke, yes, but the article explains it rather well. If you have a subscription. Or can cleverly bypass that annoying restriction.

The grape, incidentally, is the perfect size for amplifying the microwaves that your kitchen machine radiates. The appliance pushes microwaves into the two grape halves, where the waves bounce around and add constructively to focus the energy to a spot on the skin.

Not explained: if the grape is "the perfect size," how come it works for grapes of different sizes?
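As an aside, the "perfect size" claim can be sanity-checked with a back-of-the-envelope calculation (my own sketch, not from the article): a kitchen microwave runs at about 2.45 GHz, and water's refractive index at that frequency is roughly the square root of 80, so the wavelength inside a watery object is much shorter than the 12 cm free-space wavelength.

```python
import math

c = 3.0e8                 # speed of light, m/s
f = 2.45e9                # typical oven frequency, Hz
n_water = math.sqrt(80)   # approximate microwave refractive index of water

wavelength_air = c / f                           # ~12.2 cm in free space
wavelength_in_grape = wavelength_air / n_water   # ~1.4 cm inside a watery sphere

print(f"free-space wavelength: {wavelength_air * 100:.1f} cm")
print(f"wavelength inside a watery grape: {wavelength_in_grape * 100:.1f} cm")
```

That ~1.4 cm figure is right around grape diameter, which is presumably what "perfect size" means. And since a resonance like this isn't razor-thin, grapes of somewhat different sizes can still focus the energy, which may partly answer my own question.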

A common misconception is that the microwave acts on the grape from the outside in, like frozen meat defrosting, says physicist Pablo Bianucci of Concordia University, who worked on grape simulations included in the paper.

I don't know where Concordia University is, so I can't make jokes about its location. Oh, wait, I could look it up.

...

Oh, it's in Montreal. That explains the poutine.

Anyway, I didn't know people still thought microwaves heated from the outside in. We can't all be physicists, but I was under the impression that it's fairly common knowledge that the wavy EM thingies work by exciting the water molecules throughout the... whatever you put in there. That's why it's usually faster to nuke a cup of water than it is to boil it on the stove.

The work has more serious applications too, Bosi says.

Look, not everything needs to be useful for something. But when it is, that's pretty cool.

His experiments with grape balls of fire...

And there we have it, folks: the real reason I saved this article to share with all of you.

...began and ended with the 2011 YouTube video, but his curiosity did not. “I’m impressed with the scientific depth of the paper,” wrote Bosi in an email. In particular, he notes that authors came up with mathematical rules for describing the grape hotspot. They could conceivably shrink these rules to a smaller scale, to create similar hotspots in nanoparticles, for example. Scientists use heated nanoparticles to make very precise sensors or to facilitate chemical reactions, says Bianucci.

I'll take their words for it.

During all their microwaving, they noticed that two grapes placed side by side repeatedly bump into each other, back and forth. They don’t know why that happens, and they’ll be studying that next, says Bianucci.

Always something else to study. This is a good thing.

Not mentioned in the article: how in the hot hell did anyone figure out that putting a grape, cut mostly in half but still connected by a tiny thread of grape skin, into a microwave would produce a "grape ball of fire?" It's not like we eat warm grapes. Even if we did, that's still a very specific configuration.

Some mysteries, I suppose, will never be solved. And that's also a good thing.
April 11, 2025 at 9:13am
#1087018
I'm more than a little pissed at Time right now because they reported the "dire wolf de-extinction" story as if it were true and not a steaming pile of bullshit. Don't know what I'm talking about? Use a search engine; I'll be damned if I'm going to give that crap any more boost by linking it.

But I'm really hoping they got the science right on this article:



"Surprising," I guess, if you're a prude. It makes me feel better to cuss, so I've always known it had health benefits (for me, not the people I'm cussing at). Still, it's good to have science backing me up. If it's true. After the "dire wolf" bullshit, I can't be sure.

Many of us try to suppress the urge to blurt out an expletive when something goes wrong.

And many of us try to hold sneezes in. That doesn't mean it's healthy.

Research has found that using profanity can have beneficial effects on people’s stress, anxiety, and depression. In fact, there are numerous potential physical, psychological, and social perks related to the power of a well-timed F-bomb.

"Social?" I guess it depends on the society.

Cursing induces what’s called hypoalgesia, or decreased sensitivity to pain. Researchers have shown that after uttering a curse word, people can keep their hands submerged in ice water for longer than if they say a more neutral word.

I get why they do the submerged-in-ice-water thing. It's a low-risk means of inducing some level of pain in a test subject; inflicting other kinds of pain would be unethical for scientists. But I wonder about the efficacy of low-risk pain inducement in a study such as this. For one thing, a big part of pain is the surprise. If you know you're going to get stuck with a needle at the dentist, you can control your reaction somewhat (though it's quite difficult to swear with your mouth wide open and the dentist's fingers in there).

But here’s an interesting twist: “People who swear less often get more benefit from swearing when they need it,” he says. In other words, cursing all the time zaps the words of their potency.

That's not surprising to me. I prefer to hold back the important words for when they can provide better emphasis.

Swearing aloud is associated with improvements in exercise performance, including cycling power and hand-grip strength.

This wouldn't surprise me either. I glanced at the study. Decent sample size, but restricted demographics (i.e. one of those studies that used students as swearing guinea pigs), and the control group used neutral language, presumably words such as "hit," "truck," or "bunt."

A study in the European Journal of Social Psychology found that when people wrote about a time they felt socially excluded, then repeated a swear word for two minutes, their hurt feelings and social distress were significantly lower than for people who used a neutral word.

Taken together with the findings about physical pain, this might lend more credence to the idea that physical pain and emotional pain are related in more ways than just being described with the same word.

In another study, researchers found that when drivers cursed after being refused the right of way by another driver, or when they encountered a traffic jam caused by cars that were stopped illegally, cursing helped them tamp down their anger and return to a more balanced emotional state.

I didn't look at that study. I've experienced this myself. And "cursing" in this context includes showing the offender my middle finger.

There appear to be surprising social benefits associated with the well-timed use of profanity. “Some people believe that profanity can break social taboos in a generally non-harmful way, [which] can create an informal environment in which people feel like insiders together,” says Ben Bergen, a professor of cognitive science at the University of California, San Diego, and author of...

This isn't on the same level as those other assertions. "Some people believe" is weasel words, which is why I'm not including the name of his book. I don't doubt that it does these things, but, as anyone who's been on WDC for a while can attest, cussing can also alienate some people.

Of course, it is possible to overdo it. People who swear frequently are sometimes perceived as angry, hostile, or aggressive, so there’s a potential tipping point to using profanity.

Again, I'm pretty sure that's true, but: what's the tipping point? I suspect it's different for different groups. Baptist church vs. biker bar, e.g.

The article does address this qualitatively:

It’s also important to know your audience.

Swearing etiquette may depend on the social hierarchy and power dynamics in certain situations, such as the workplace, says Jay. Just because the boss uses curse words doesn’t necessarily mean you can get away with it. (You’ll also want to modify your language around young children.)

Nah. I want young children to stay as far away from me as possible. If I cuss in public, their parents herd them away. I win. They win, too, because I have furthered their education.

Not addressed in the article: whether writing "fuck" has similar benefits to saying it. I suspect not. Clearly, further study is needed. Can I get money for being a guinea pig in that study?
April 10, 2025 at 12:42am
#1086956
I'm posting early today because I have a dentist thing that will a) take all morning and b) leave me in no shape to form coherent sentences (worse than usual, I mean) in the afternoon. Speaking of posting schedule, I'll be going on a little trip next week, so blog posts will be erratically timed.

For today, though, I'll try not to make any tired old "place is in the kitchen" jokes about today's article from Gastro Obscura. No promises.

    Meet the Feminist Resistance Fighter Who Created the Modern Kitchen
Margarete Schütte-Lihotzky left an indelible mark on Austria, architecture, and how we cook.


Sexist jokes notwithstanding, this scene is set in Austria in the 1940s, when it was a central platform of a certain political party, led by a certain Austrian, that women were for children, kitchen, and church. Which should be enough right there to rebel against the entire idea of rigid gender roles.

Schütte-Lihotzky had been imprisoned since 1941 for her work as a courier for the Communist Party of Austria (KPÖ), which led the resistance against the Nazi regime in her home country. While she managed to narrowly avoid a death sentence, Schütte-Lihotzky remained in jail until the end of World War II in 1945. The incarceration would forever split her life in two. On the one side were her beginnings as a precocious and successful architect spurred on by the desire to create a better life for working-class women. On the other, what she would refer to as her “second life,” as an active communist, political activist, and memoirist who was professionally shunned in Austria for her political beliefs and received her much-deserved accolades only in the final decades of her life.

I suppose it could have been worse. Some people don't get recognized until after they croak.

Schütte-Lihotzky led a remarkably long and full life, dying a few days short of her 103rd birthday in 2000. But her name remains forever connected to a space she designed when only 29 years old: the Frankfurt Kitchen, the prototype of the modern fitted kitchen.

Which is so ubiquitous in developed countries now that it's hard to imagine a time when it didn't exist.

Designed in 1926 as part of a large-scale social housing project in Frankfurt, Germany, the “Frankfurt Kitchen” introduced many of the elements we now take for granted...

So the concept of a kitchen as we know it today is just under 100 years old. That's not too surprising; 100 years ago, we were still arguing over things like the size of the Universe and what powers the Sun. Still, I'd have said "take for granite," because of the proliferation of granite countertops in kitchens and because I can't resist a gneiss play on words.

...a continuous countertop with a tiled backsplash, built-in cabinets, and drawers optimized for storage—all laid out with comfort and efficiency in mind.

Whoever put my kitchen together must have forgotten about the "optimized for storage" bit.

“She didn’t just develop a kitchen,” says Austrian architect Renate Allmayer-Beck. “It was a concept to make women’s lives easier by giving them a kitchen where they could manage more easily and have more time for themselves.”

Thus leading inexorably to women joining the workforce, which, if you think that's a bad thing, boy are you reading the wrong blog.

The article even addresses the obvious:

While the Frankfurt Kitchen was marketed as a kitchen designed for women by a woman, Schütte-Lihotzky resented the implication that her gender automatically endowed her with secret domestic knowledge, writing in her memoir that “it fed into the notions among the bourgeoisie and petite bourgeoisie at the time that women essentially work in the home at the kitchen stove.”

I vaguely remember featuring a bit back in the old blog about the invention of the automatic dishwasher, which predated the Frankfurt kitchen (I suppose that rolls off the tongue and keyboard more easily than "Schütte-Lihotzky Kitchen") by a few decades. That, too, was a woman's work. And that's the closest I'm going to get to making a "women's work" joke; you're welcome.

The Frankfurt Kitchen was efficiently laid out and compact, to save both on costs and the physical effort required to use it. Here, a woman could move from sink to stove without taking a single step. This quest for efficiency also led Schütte-Lihotzky to move the kitchen from a corner of the family room into its own space—a choice that baffled contemporary homemakers.

And then, decades later, they'd take away the wall separating the kitchen from the family room, putting it back into one big open space. I spent my childhood in a house with an open-concept kitchen/living area, and I have nothing inherently against it. What I have a problem with is all the remodeling shows that insist on that kind of layout. Not because they insist on it, but because they're thinly-veiled ads for home improvement stores, and they enable that bane of the housing market in the US: house flippers.

The article even addresses the open-concept change, if obliquely:

When the Frankfurt Kitchen came under fire from second-wave feminists in the 1970s for isolating women in the kitchens and making domestic labor invisible, the critique hit her hard.

She defended her design in her memoir. “The kitchen made people’s lives easier and contributed to women being able to work and become more economically independent from men,” she wrote. Still, she conceded, “it would be a sad state of affairs if what was progressive back then were still a paragon of progress today.”


I feel like a lot of people would defend their life's work to the last, but that quote demonstrates a willingness to keep an open mind, even later in life, and to acknowledge that nothing is ever truly completed. As they used to say, "a woman's work is never done."

There's a lot more at the link, which I found interesting because I was only vaguely aware that today's kitchen designs owed a debt to something called a "Frankfurt Kitchen," but I didn't know anything about how it came to be. I figured maybe someone else might want to know, too.
April 9, 2025 at 11:19am
#1086899
I sure talk about the Moon a lot. We're coming up on another Full Moon, by some reckonings the Pink Moon, the first Full Moon after the Northern Hemisphere Spring Equinox. It's also a culturally significant Full Moon because it marks the start of Pesach, or Passover, and helps to define the timing of Easter. This will occur on Saturday, Eastern Time.
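That tie between the Paschal full moon and Easter is pinned down by the church's computus. For the curious, here's a minimal sketch of the standard "anonymous Gregorian" algorithm for Western Easter (my own addition, not from any of the linked articles):

```python
def gregorian_easter(year: int):
    """Anonymous Gregorian computus: returns (month, day) of Western Easter."""
    a = year % 19                 # position in the 19-year Metonic cycle
    b, c = divmod(year, 100)
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30   # epact-like quantity
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) // 451
    month, day = divmod(h + l - 7 * m + 114, 31)
    return month, day + 1

print(gregorian_easter(2025))  # → (4, 20): April 20, 2025
```

All arithmetic on calendars and a synthetic "ecclesiastical moon" rather than the actual sky, which is why church full moons occasionally drift a day or so from astronomical ones.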

But this article, from aeon, isn't about Moon lore or cultural observances; quite the opposite.

    How the Moon became a place
For most of history, the Moon was regarded as a mysterious and powerful object. Then scientists made it into a destination


On 25 May 1961, the US president John F Kennedy announced the Apollo programme: a mission to send humans to the Moon and return them safely to Earth within the decade.

Specifically, white American male humans, but hey, one small step and all that.

The next year, the American geologist Eugene M Shoemaker published an article on what it would take to accomplish the goal in American Scientist. It is an extraordinary document in many ways, but one part of his assessment stands out. ‘None of the detailed information necessary for the selection of sites for manned landing or bases is now available,’ Shoemaker wrote, because there were ‘less than a dozen scientists in the United States’ working on lunar mapping and geology.

I had to look it up to be sure, but yeah, this was the same guy who co-discovered Comet Shoemaker-Levy 9, the one that impacted Jupiter back in the 1990s, right around the time we coincidentally started confirming the existence of exoplanets. That's a lot of astronomy wins for a geologist, especially considering that, technically, "geology" only applies to Earth. I think that's a word it's safe to expand the definition of, though; otherwise, we'll have selenology, areology, and any number of other Greek-rooted world names attached to -ology. The problem becomes especially apparent when you consider we also have geography, geometry, and geophysics. Some sources refer to him as an astrogeologist; I'm not really picky about the wording in this case, as long as we all understand what's meant, though technically "astro-" refers to stars, not moons or planets. Being picky about that would cast doubt on "astronaut" as a concept.

Incidentally, he apparently died in a car crash in 1997, and some of his ashes got sent to the Moon with a probe that crashed into its south pole region. A fitting memorial, if you ask me.

But I digress.

The Moon is a place and a destination – but this was not always the case.

Well, it was certainly a destination for Eugene M. Shoemaker. Or part of him, anyway.

To geographers and anthropologists, ‘place’ is a useful concept. A place is a collision between human culture and physical space. People transform their physical environment, and it transforms them. People tell stories about physical spaces that make people feel a certain way about that space. And people build, adding to a space and transforming it even further.

So, this is a situation where science, technology, anthropology, folklore, mythology, linguistics, engineering, and psychology (and probably a few other ologies) meet. In other words, candy for Waltz.

Now, you might be thinking, as I did, "But science fiction treated other worlds as 'places' long before we sent white male American humans to the Moon." And you'd be right (because, of course, I was). The key is in the definition of 'place' I just quoted from the article: the Moon became a real place, as opposed to the speculative place it had in science fiction and fantasy:

Centuries ago, a major reconceptualisation took place that made it possible for many to imagine the Moon as a world in the first place. New technologies enabled early scientists to slowly begin the process of mapping the lunar surface, and to eventually weave narratives about its history. Their observations and theories laid the groundwork for others to imagine the Moon as a rich world and a possible destination.

Then, in the 1960s, the place-making practices of these scientists suddenly became practical knowledge, enabling the first visitors to arrive safely on the lunar surface.


One might argue that we lost something with that, like the folklore and mythology bits. But we gained something, too, and didn't really lose the folklore (though some of it, as folklore is wont to do, changed).

For much of history, the Moon was a mythological and mathematical object. People regarded the Moon as a deity or an abstract power and, at the same time, precisely charted its movement. It seemed to influence events around us, and it behaved in mysterious ways.

The connection between the Moon and tides was clear long before Newton explained gravity enough to demonstrate a causal relationship.
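For what it's worth, the causal part Newton supplied is just differential gravity: the Moon pulls harder on the near side of Earth than on the center. A back-of-the-envelope sketch (standard values, my own addition, not from the article):

```python
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_moon = 7.35e22     # lunar mass, kg
R_earth = 6.371e6    # Earth radius, m
d = 3.844e8          # mean Earth-Moon distance, m

# Tidal (differential) acceleration across Earth's radius: the near side
# is pulled harder than the center by roughly 2*G*M*R/d^3.
tidal = 2 * G * M_moon * R_earth / d**3
print(f"lunar tidal acceleration: {tidal:.2e} m/s^2")
```

That works out to about a ten-millionth of Earth's surface gravity, which is tiny per water molecule but adds up over an entire ocean.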

There were some who thought about trips to the Moon. Stories in religious traditions across the world tell of people travelling to the Moon. There were some thinkers before and after Aristotle who imagined that there were more worlds than just Earth. The ancient atomists discussed the possibility of worlds other than Earth, while other Greeks discussed the possibility of life on the Moon. This included Plutarch, who wrote about the Moon as both mythical and a physical object. But, to the extent that the Moon was thought about as a place, the notion was largely speculative or religious.

I sometimes wonder if, had we not had the big shiny phasey thing in the sky, our perception of space travel might have been different. The only other big thing in the sky is the Sun; all the other relatively nearby objects resolve to little more than dots: Venus, Mars, etc. I suspect that the presence of a visible disc, with discernible features even, might have served as a stepping-stone to imagining those other dots as worlds, once the telescope could start us seeing them as discs, too.
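The discs-versus-dots point is easy to quantify. A quick sketch (the sizes and distances are my own rough numbers, not the article's; planet distances are near closest approach):

```python
import math

def angular_size_arcsec(diameter_m: float, distance_m: float) -> float:
    """Apparent angular diameter as seen from Earth, in arcseconds."""
    return math.degrees(2 * math.atan(diameter_m / (2 * distance_m))) * 3600

moon = angular_size_arcsec(3.474e6, 3.844e8)      # roughly half a degree
venus = angular_size_arcsec(1.210e7, 4.1e10)      # near inferior conjunction
jupiter = angular_size_arcsec(1.398e8, 5.9e11)    # near opposition

print(f"Moon: {moon:.0f} arcsec, Venus: {venus:.0f} arcsec, Jupiter: {jupiter:.0f} arcsec")
```

Naked-eye resolution is around 60 arcseconds, so the Moon shows a disc some thirty times wider than the biggest planets ever appear, and the planets sit right at the edge of dot-ness until a telescope intervenes.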

It would certainly have made mythology and folklore a lot different, not having a Moon.

The rest of the article is basically a brief (well, not so brief because it's aeon, but brief in comparison to human history) recap of our cultural relationship with the Moon. I don't really have much else to comment on, but I found it an interesting read, especially to see how our understanding has changed over time.
April 8, 2025 at 10:11am
#1086823
Got this one from Time, and now it's Time to take a look at it.



"Has become?" Always has been.

Imagine walking through New York City, invisible.

I don't have to. I've done it. People bumped into me (and didn't even pick my pocket), cars didn't stop at crosswalks, and taxicabs just zoomed on by when I hailed them.

This is also known as "being in New York City."

Marilyn Monroe, one of the most recognizable women in the world, once did exactly that.

The article describes how no one recognized her until she started acting Marilyny. There's some irony (or whatever) there, because it wasn't Marilyn Monroe who (if the story is true) walked through NYC invisibly; that was Norma Jeane Mortenson. So who was being herself? Marilyn or Norma Jeane? Who is real and authentic: Superman or Clark Kent? (Yes, I know, trick question; they're both fictional.)

Her story is extreme, but her struggle is not unique. Like Marilyn, many of us learn to shape ourselves into what the world expects. Refining, editing, and performing until the act feels like the only version of us that belongs.

Well, yeah. And then you become the act. And that becomes your authentic, real, true self. This isn't news or something to be ashamed of; it's the essential process of life as a human.

Today, even authenticity is something we curate, measured not by honesty but by how well it aligns with what’s acceptable. The pressure to perform the right kind of realness has seeped into every aspect of modern life.

Oh, boo hoo hoo. "Today," my ass. We've been doing this since we figured out this newfangled "fire" shit, if not before then. I might even postulate that the pressure to fit in, to conform, to not act like but be the person your society expects was even stronger in pre-industrial times.

Authenticity was supposed to set us free. Instead, it has become something we must constantly prove. In a culture obsessed with being “real,” we curate our imperfections, filter our vulnerabilities, and even stage our most spontaneous moments online.

Who's this "we" person?

I figured out a long time ago that I needed to be someone different at work than I was for, say, my role-playing game group. The latter helped with the former.

Those who should know these things told me that people responded well to honesty and authenticity, so I learned to fake those qualities.

Instead of naturally shifting between different social roles, we now manage a single, optimized identity across multiple audiences—our family, coworkers, old friends, and strangers online.

Again, who the fuck is "we?" Not me.

The bigger, paradoxical problem is, however, that the more we strive to be real, the more we perform; and in proving our authenticity, we lose sight of who we truly are.

To me, this is like saying "No one sees how we truly look; they only see the wardrobe and hairstyle we choose." Hell, even nudists get to choose their hairdos. Who "we" are is always a performance. Eventually, the performance becomes who we are. Fake it 'til you make it, and all that.

Think back to childhood. At some point, you probably realized that certain behaviors made people like you more. Maybe you got extra praise for being responsible, so you leaned into that. Maybe you learned that cracking jokes made you popular, so you became the funny one.

Okay, now you're attacking me directly.

Psychologists call this the “False Self”—a version of you that develops to meet external expectations.

Well, far be it from me to dispute what professional psychologists say, but again, that's like saying "society expects us to wear clothing to cover our genitals, so the only way to be authentic is to be naked."

And even then, which is more authentic: pre-shower, or post-shower? And do you comb/brush your hair? Then you're not being authentic; you're conforming to society's norms.

My point here is that despite what the article says, authenticity isn't always a good thing. Maybe your "authentic" self is a thief, and you don't want to face society's punishment for that, so you choose not to steal stuff. You're tempted, sure, but you just walk past the shinies instead of pocketing them, or restrain yourself from picking an NYC pedestrian's pocket or running off with her purse. You become not-a-thief, and that eventually becomes your true self.

Some of us are just naturally funny, but others have to work at it. The desire to work at it is just as authentic as the not-being-funny part.

What's the point of trying to improve yourself if you then get slammed for being "unauthentic?" A violent person may want to do the work to stop being violent. A pedophile may choose to deliberately avoid being around children. Is that not a good thing for everyone?

As for code-switching, are we supposed to wear the same clothes for lounging around the house, going to a gym, working, and attending a formal dinner? This is the same thing, but with personality.

Authenticity isn’t something you achieve. It’s what’s left when you stop trying. Yet, the more we chase it, the more elusive it becomes.

Well gosh, you know what that sounds exactly like, which I've harped on numerous times? That's right: happiness.

Culture shifts when enough people decide to show up as they are.

Naked with uncombed hair?

Hard pass.
April 7, 2025 at 9:25am
#1086745
It's nice to be able to see through optical illusions, as this article from The Conversation describes. It would be even nicer to be able to see through lies and bullshit, but that's probably harder.



And I did find possible bullshit in this article, in addition to the slightly click-baity headline.

Optical illusions are great fun, and they fool virtually everyone. But have you ever wondered if you could train yourself to unsee these illusions?

I can usually see past the optical illusion once it's pointed out to me, or if I figure it out, but not always.

Now, it should be obvious that there are pictures at the article. They'd be a pain to reproduce here, and why bother, when I already have the link up there in the headline?

We use context to figure out what we are seeing. Something surrounded by smaller things is often quite big.

Which is why it's important to hang out with people smaller than you are. Or bigger, depending on the effect you're looking for.

How much you are affected by illusions like these depends on who you are. For example, women are more affected by the illusion than men – they see things more in context.

The article includes a link to, presumably, a study that supports this statement. I say 'presumably,' because when I checked this morning, the link wasn't working. So I can't really validate or contradict that assertion, but I do question the validity of the "they see things more in context" statement.

Young children do not see illusions at all.

The link to that study did work for me, and from what I can tell, it was about a particular subset of illusions, not "all."

The culture you grew up in also affects how much you attend to context. Research has found that east Asian perception is more holistic, taking everything into account. Western perception is more analytic, focusing on central objects.

None of which fulfills the promise of the headline.

This may also depend on environment. Japanese people typically live in urban environments. In crowded urban scenes, being able to keep track of objects relative to other objects is important.

Okay, this shit is starting to border on racism and overgeneralization. Also, the glib explanation is the sort of thing I usually find associated with evolutionary psychology, which reeks of bullshit.

However, what scientists did not know until now is whether people can learn to see illusions less intensely.

A hint came from our previous work comparing mathematical and social scientists’ judgements of illusions (we work in universities, so we sometimes study our colleagues). Social scientists, such as psychologists, see illusions more strongly.


And this is starting to sound like the same old "logical / creative" divide that people used to associate with left brain / right brain.

Despite all these individual differences, researchers have always thought that you have no choice over whether you see the illusion. Our recent research challenges this idea.

Whatever generalization they make, I can accept that there are individual differences in how strongly we see optical illusions. So this result, at least, is promising.

Radiologists train extensively, so does this make them better at seeing through illusions? We found it does. We studied 44 radiologists, compared to over 100 psychology and medical students.

And we finally get to the headline's subject, and I'm severely disappointed. 44? Seriously?

There is plenty left to find out.

I'll say.

Despite my misgivings about some of the details described, I feel like the key takeaway here is that it may be possible to train people away from seeing a particular kind of optical illusion. But it may be a better use of resources to train them to smell bullshit.
April 6, 2025 at 7:50am
#1086678
Once again, Mental Floss tackles the world's most pressing questions.

    Why Do So Many Maple Syrup Bottles Have a Tiny Little Handle?
It’s not for holding, that’s for sure.


Well, this one would be pressing if anyone in the US could still afford maple syrup.

Ideally, you’d be able to hold the handle of a maple syrup container while you carry it and also while you pour the syrup onto pancakes, waffles, or whatever other foodstuff calls for it.

Good gods, how big is your maple syrup container? I usually get the ones about the size of a beer bottle, which doesn't even require a handle. Or, you know, I used to, when we were still getting stuff from Canada.

But the typical handle on a glass bottle of maple syrup is way too small and positioned too far up the bottleneck to be functional in either respect.

So, why is it there?


Why is anything nonfunctional anywhere? Decoration, tradition, or for easy identification, perhaps.

The most widely accepted explanation is that the tiny handle is a skeuomorph, meaning “an ornament or design representing a utensil or implement,” per Merriam-Webster.

I'm actually sharing this article not to complain about trade wars, but because I don't think I'd seen 'skeuomorph' before, and it's a great word.

As the article goes on to note, it's apparently pretty common in software design. They use other examples, but here on WDC, we have a bunch of them. The magnifying glass for Search, the shopping cart (or trolley) for Shop, glasses for Read & Review, the gear icon for settings, and so on. I don't do website or graphic design, so I didn't know the word.

But there are plenty of skeuomorphs that don’t involve the transition from analog to digital life, and the useless handle of a maple syrup bottle is one of them.

I'd hesitate to call it "useless," myself. Obviously, it's not useful as a handle for carrying or pouring, but, clearly, it does have a purpose: marketing.

Here’s one popular version of the origin story: The little handle harks back to the days of storing liquids in salt-glazed stoneware that often featured handles large enough to actually hold.

Moonshine distillers, take note. (And yet, the article mostly debunks that origin story, as one might expect.)

Maple syrup manufacturers had started to add little handles to their glass bottles by the early 1930s. This, apparently, was a bit of a marketing tactic. “Maple syrup companies weren’t so much retaining an old pattern of a jug as reinventing it and wanting to market their product as something nostalgic,” Canada Museum of History curator Jean-François Lozier told Reader’s Digest Canada.

Like I said.

Perhaps one day, I will again have the opportunity to purchase delicious maple syrup. When I do, I'll be looking for the skeuomorph.
April 5, 2025 at 9:46am
#1086609
While Cracked ain't what it used to be (what is, though?), here, have a bite of this:

    5 Foods That Mutated Within Your Lifetime
We finally figure out what happened to jalapeños


It should go without saying that "mutated" is a bit misleading, but here I am, saying it anyway.

We know that companies keep tinkering with the recipes behind processed foods, changing nitrates or benzoates so you’ll become as addicted as possible.

Wow, that would suck, becoming addicted to food.

More basic foods, however, are more dependable.

And, of course, here's the countdown list to contradict that.

5 Brussels Sprouts

A couple decades ago, jokes on kids’ shows would keep saying something or another about a character hating Brussels sprouts.


Pretty sure it was more than a couple of decades ago. But the Brussels sprouts thing didn't stick in my memory. Broccoli did. Of course, as I got older and didn't have to eat them the way my mom overcooked them, I learned to like both. And when I got even older, I had my mind blown with the fact that they are the same species.

If you were around back then, you probably learned that Brussels sprouts tasted gross before you’d ever heard of the city of Brussels.

Having been to Brussels, I still don't know what they call them there. Sprouts, probably, or whatever the French or Dutch word for sprouts is. Like how Canadian bacon is called bacon (or backbacon) in Canada, or French fries are called frites in Brussels because they're a Belgian invention, not French.

Unlike French fries, Brussels sprouts actually have a connection to Brussels. Well, not the city. It's hard to find extensive vegetable gardens in most major cities. But they were grown extensively in the surrounding countryside, or so I've heard.

Brussels sprouts used to taste bitter, but during the 1990s, we started crossbreeding them with variants that didn’t. When we were done, we’d bred the bitterness out.

There's an incident stuck in my head from several years ago, back when I did my own grocery shopping, so at least six years ago and probably more, when I sauntered up to a supermarket checkout counter with a big bag of Brussels sprouts. The cashier started to ring me up, but then she looked me in the eye and said, "Can I ask you a question?"

"Sure."

She held up the bag o' sprouts. "How can you eat these things?"

I was rendered speechless for a moment, but retained enough presence of mind to say "With butter and garlic." Or maybe I just sputtered, and then a week or so later, lying awake at night, I finally came up with a good comeback, and edited my memory to make me look better.

Turns out there’s no moral law saying healthy stuff must taste bad.

Shhh, you can't say that in the US.

4 Pistachios

Pistachio nuts in stores used to always be red.


I don't think I ever noticed that.

Today, we instead associate pistachios with the color green, due to the light green color of the nuts and the deeper green color of the unripe shells.

I associate them with a lot of work and messy cleanup, but damn, they taste good.

3 Jalapeños

In the 1990s, the word “jalapeño” was synonymous with spicy.


Again, this is a US-oriented site. For many Americans, mayonnaise is too spicy, and anything else is way too spicy.

Today? Not so much. Maybe you’d call a habanero spicy, but jalapeños are so mild, you can eat a pile of them.

That's... not entirely true. It's actually worse than that; jalapeños have wildly varying levels of capsaicin, making it difficult to control the flavor of one's concoction when using that particular pepper.

Today, you might find yourself with one of the other many hotter jalapeño varieties, but there’s a good chance you’ll find yourself with TAM II or something similarly watery.

Which is why, when I want spicy peppers, I go with habanero or serrano. No, I don't use whole ghost peppers, but I do use ghost pepper sauce sometimes.

2 Sriracha Sauce

You know Sriracha sauce? Its label says that the primary ingredient is “chili,” and the chili pepper they use happens to be a type of jalapeño. At least it used to be, until some recent shenanigans.


I know it, and I sometimes use it, but my tongue refuses to pronounce it. It has no problem tasting it, though.

1 Apples

I don't think it would surprise many people to know that this iconic fruit has been selectively bred into hundreds of different varieties.

The most extreme victim of this aesthetics supremacy may be the Red Delicious apple. Today, it’s perhaps the most perfect-looking apple. It looks like it’s made of wax, and many say it tastes like it’s made of wax, too.

Nah, more like cardboard. I know what cardboard tastes like because I ate a pizza from Domino's once.

Buyers have started rebelling. If you aren’t satisfied with Red Delicious, you can try the increasingly popular Gala or Fuji apples.

On the rare occasions that I actually buy apples for eating, those are my choices, because they're tasty and they're usually available.

In summary, yeah, lots of foods have changed, and sometimes for the worse. What's remarkable isn't the change itself, but our ability to tinker with the genetics of what we eat. And we've been doing it for as long as we've been cultivating food. We can be quite clever, sometimes. But I question our collective taste.
April 4, 2025 at 10:04am
#1086555
I wanted to share this article because a) it's interesting and I have stuff to say about it and b) I wanted to show that even the most serious science communicators, like Quanta, sometimes can't help using a pun in the headline.

    The Physics of Glass Opens a Window Into Biology
The physicist Lisa Manning studies the dynamics of glassy materials to understand embryonic development and disease.


If you're anything like me, you're wondering what the hell glass and biology could possibly have in common. Well, that's what the article's for.

The ebb and flow of vehicles along congested highways was what first drew Lisa Manning to her preferred corner of physics...

I can relate. I still remember the epiphany I got back in engineering school when I realized that the equations of traffic flow are the discrete-math versions of the equations of fluid flow.
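For anyone curious about that epiphany, the parallel looks something like this (a sketch using the standard Lighthill-Whitham-Richards traffic model; I make no claim this is exactly how my textbook put it):

```latex
% Conservation of cars in the LWR traffic model:
% density \rho(x,t), speed v(\rho), flow q = \rho\, v(\rho)
\frac{\partial \rho}{\partial t}
  + \frac{\partial}{\partial x}\bigl(\rho\, v(\rho)\bigr) = 0

% Continuity equation for a compressible fluid -- same form,
% with the velocity field \mathbf{u} in place of v(\rho):
\frac{\partial \rho}{\partial t}
  + \nabla \cdot (\rho\, \mathbf{u}) = 0
```

Cars, like fluid parcels, are conserved: they don't appear or vanish mid-highway, so the same continuity equation governs both, which is why traffic develops shockwaves (phantom jams) just like compressible fluids do.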

But it wasn’t until after she had earned her doctorate in physics in 2008 that Manning started applying that enthusiasm to problems in biology.

I've noted before that, sometimes, an interdisciplinary approach can solve problems that a focus on one field cannot. Perhaps I'm biased, because I'd rather know a little bit about a lot of things than a whole lot about one thing and nothing about anything else.

...she learned about what’s known as the differential adhesion hypothesis, an idea developed in the 1960s to explain how groups of cells in embryos move and sort themselves out from one another in response to considerations like surface tension. “It was amazing that such a simple physical idea could explain so much biological data, given how complicated biology is,” said Manning, who is now an associate professor of physics at Syracuse University. “That work really convinced me that there could be a place for this kind of [physics-based] thinking in biology.”

"Amazing," sure, but to me, it's not surprising. Complexity emerges from simplicity, not the other way around. And biology is basically chemistry which is basically physics, so even there, it should be no surprise that one field can inform the other.

She took inspiration from the dynamics of glasses, those disordered solid materials that resemble fluids in their behavior.

I'm going to digress for a moment, here. Glass has been described as a "solid liquid." When touring some historical site lo these many years ago—hell, it might have been Monticello—I heard a tour guide assert that being a solid liquid, glass flows very, very slowly under the influence of gravity, and that's why all these old windows are wavy and thicker at the bottom. This didn't sit right with me then, so I looked into it (this was pre-internet, so it involved an actual trip to an actual library).

Turns out that no, glass is solid, period. It doesn't flow any more than rocks do, assuming ordinary temperatures (of course it flows when heated enough to change phase). The waviness of pre-industrial glass is a result of its manufacturing process, and apparently, they'd often install the panes with the thicker bits at the bottom, for whatever reason.

Point is, people confuse "glass resembles a fluid" with "glass flows, albeit very slowly." Which is understandable, though really, tour guides should know better. The reason we say glass is fluid-like is that most solids have a crystalline structure of some sort, at the atomic level. But glass does not; its atomic structure is disordered.

I mention all this in case someone's still got that idea in their head that glass is a slow-moving liquid; the article doesn't make it clear (see, I can pun, too) until it gets into the interview portion.

Manning found that the tissues in our bodies behave in many of the same ways. As a result, with insights gleaned from the physics of glasses, she has been able to model the mechanics of cellular interactions in tissues, and uncover their relevance to development and disease.

Unlike the relatively simple ideas about the atomic structure, or lack thereof, of various solids, the connection to biology is beyond me. The rest of the article is, as I said, an interview, which I'm not quoting here. While I can't pretend to understand a lot of it, I can appreciate her multidisciplinary approach and how insights from one branch of science can illuminate problems from another branch.

Incidentally, I find it helps to use a similar approach to writing. Because as much as we like to categorize things, the boundaries tend to blur and become fluid. Like the view through an 18th century window pane.
April 3, 2025 at 10:20am
#1086501
Almost everyone I know, when starting to read the headline from this Guardian article, would blurt out "forty-two!"



They'd be wrong, though. Forty-two is the "Answer to the Great Question of Life, the Universe, and Everything," as revealed by the great prophet, Douglas Adams. Says nothing about "meaning."

As this article is an ad for a book, I conclude that the author's actual Meaning of Life is to sell as many books as possible. But in doing so, at least he includes others' points of view, opinions from those who probably aren't trying to sell a book.

In September 2015, I was unemployed, heartbroken and living alone in my dead grandad’s caravan, wondering what the meaning of life was.

And it never occurred to you that being broke, depressed, and trapped in a tin can with your dead grandpa might actually be the meaning of life? See, this sort of thing is why we push people to have jobs. Not so they'll have money, but so they'll be too busy to contemplate philosophical questions.

What was the point to all of this?

Apparently, selling books.

Like any millennial, I turned to Google for the answers.

Aw, this was too early. Try that now, 10 years later, and an AI bot will confidently and definitively answer your question. Or so I assume. I'm not going to try it, because I might not like the response. Or, worse, I might like it.

I trawled through essays, newspaper articles, countless YouTube videos, various dictionary definitions and numerous references to the number 42...

I told you 42 would be involved. It's a red herring. To be fair, so is everything else.

...before I discovered an intriguing project carried out by the philosopher Will Durant during the 1930s.

The problem with letting philosophers have a go at this question is that none of them, not a single one, has a sense of humor (or humour, as this is The Guardian). And any answer to "What is the meaning of life?" that doesn't involve humor in some way is categorically and demonstrably false. We have a different name for philosophers with a sense of humor: comedians.

Durant had written to Ivy League presidents, Nobel prize winners, psychologists, novelists, professors, poets, scientists, artists and athletes to ask for their take on the meaning of life.

See? Not a single comedian in the bunch. In the 1930s, there were any number of humorists he could have polled, many of whom are still revered. The Marxists, er, I mean, the Marx Brothers had been active for at least a decade. There was a Laurel and a Hardy. The Three Stooges got their start in the late 1920s. I'd want to hear their answers. Nyuk nyuk.

I decided that I should recreate Durant’s experiment and seek my own answers. I scoured websites searching for contact details, and spent hours carefully writing the letters, neatly sealing them inside envelopes and licking the stamps.

I can almost forgive the low-tech throwback of writing letters, folding them into envelopes, and sending them through the post. What I don't get is stamp-licking. Here in the US, stamps have been peel-and-stick for decades; is it that different in the UK?

What follows is a small selection of the responses, from philosophers to politicians, prisoners to playwrights. Some were handwritten, some typed, some emailed. Some were scrawled on scrap paper, some on parchment. Some are pithy one-liners, some are lengthy memoirs.

When I saved this article (not that long ago), I had in mind to quote at least some of the responses. But upon reflection, I'm not going to do that. The answers are as varied as the people giving them. Some are non-answers. Some contain the barest glimmers of a sense of humor. Some are highly specific; as a trained engineer, I could very easily assert that designing systems that work to make peoples' lives better is the ultimate meaning of life, or, as an untrained comedian, I could just as easily state that the meaning of life is to laugh and to make others laugh. Or I could just say "cats."

The point is, the answer is different for everybody, and even for one individual at different points in life. For some, perhaps even this author, the meaning is in the search. For others, there is no meaning; this can be horrifying or liberating, depending on one's point of view. In my more literal moments, I assert that the meaning, or at least the purpose, of life is to generate additional entropy, thus accelerating the inevitable heat death of the Universe.

Mostly, though, I don't concern myself with meaning or purpose. A Jedi craves not these things. It's enough for me to occasionally sit outside on a nice day, listening to music and drinking a beer.
April 2, 2025 at 8:54am
#1086432
In my ongoing quest to look at word/phrase origins, I found this explanation from Mental Floss, though I felt no urgency to share it.

    Why Does ‘Stat’ Mean “Immediately”?
It was originally a medical thing—here’s why.


Well, I thought it was pretty common knowledge that it came from the medical field, but I've been surprised many times by what I thought was common knowledge that turned out to not be.

The reason stat is short for statistics needs no explanation.

Yeah, it kind of does. Because 'stat' is short for 'statistic,' and 'stats' is short for 'statistics,' at least in my country. The one thing about British English that I actually find superior is that they shorten 'mathematics' to 'maths,' while we use 'math.' If stats are statistics, why is math mathematics? Many things in language make little sense, and this is one of them.

But that's not the 'stat' we're talking about.

Stat simply means “immediately.”

And has the advantage of one short, sharp syllable instead of an unwieldy and tongue-time-consuming five.

You sometimes see it written in all caps, STAT, which could either be to add extra emphasis or because people assume it’s an acronym.

Amusing thing: like many people, I have an ongoing prescription for a cholesterol-controlling medicine. My doctor's office, affiliated with the university here, has a computer system that always capitalizes STAT. Consequently, the prescription is for AtorvaSTATin.

It’s possible that the all-caps custom is influenced by the fact that ASAP basically means the same thing and is an acronym (for as soon as possible).

It's also possible that they just want it to stand out on reports for other medical professionals. "We need an X-ray of this leg stat" might be overlooked, but "We need an X-ray of this leg STAT" adds emphasis to the urgency.

But stat is not an acronym: It’s an abbreviation for the Latin word statim, also meaning “immediately.”

Oddly enough, 'immediately' is also a Latin derivative, but it appears to share its Latin root with 'mediate' and 'medium.' I don't have a good source for this, but I suspect the 'im-' prefix negates the 'mediate' root, conveying a sense of urgency as opposed to moderation. Like with 'immobile' or 'imprecise.'

When stat first entered the English lexicon in the early 19th century, it was used by physicians clarifying that a drug or procedure should be administered immediately.

Early 19th century? "Give me that jar of leeches, stat!" "Trepanning drill, stat!"

Medical professionals still use stat today, sometimes to differentiate a medication that must be administered immediately from two other types of medication orders. There are scheduled ones, which “are typically utilized for medications that are designed to give a continuous effect over a certain period of time (e.g. antibiotics),” per a 2016 article in Pharmacy Practice; and PRN orders “for medications that are to be given in the event of specific signs or symptoms (e.g. analgesics and antipyretics for pain and fever, respectively).” PRN is Latin, too: It stands for pro re nata (literally, “for the affair born”), meaning “as needed.”

There's a brewery near me called Pro Re Nata, and the R in their sign has the little x cross on the tail that signifies 'prescription.' I find this amusing. Their beer isn't bad, either.

Next time I go there, I'll be like, "Pint of brown, STAT!" Though I'll have to pronounce it carefully, or they might think I'm ordering stout. Not that there's anything wrong with that.
April 1, 2025 at 10:18am
#1086338
I know what day it is, but I'm just going about my business, here. This bit is from HuffPo, which I don't usually read, but this one caught my attention.

    I Moved Abroad For A Better Life. Here’s What I Found Disturbing During My First Trip Back To America.
“The hardest part wasn’t seeing these differences – it was realizing I could never unsee them.”


Well. Okay. I guess some people need to push outside their envelope to see what's inside it.

When I left America last spring for a safer home for my family and a better quality of life, I thought the hardest part would be adapting to life in the Netherlands.

It's nice to have the privilege to just up and emigrate somewhere, isn't it? Like, if you don't like your life in whatever country you're in, boy it sure would be nice to have another country you can go to where you're not treated like something lesser or illegal.

“We just hired Riley’s college consultant,” my friend Jackie mentioned casually, sipping her drink. “Five thousand for the basic package, but you know how it is these days. Everyone needs an edge.”

"Everyone needs an edge." Yeah. Think about that for a moment. When everyone gets an edge, nobody gets an edge. Or, perhaps, people able to drop five grand on the edge end up winning, which perpetuates the whole cycle of economic disparity.

How could I explain that everything — from the massive portions before us to the casual acceptance of paying thousands to game the education system — suddenly felt alien? That I’d spent the past eight months in a place where success wasn’t measured by the size of your house or the prestige of your child’s college acceptance letters?

Congratulations; you've achieved an outsider's perspective.

The Dutch principle of “niksen” ― the art of doing nothing ― replaced our American addiction to busyness.

We had him once, but he was forced to resign.

Okay, bad Nixon pun. Seriously, though, how could you not see the problem when you were living here? Too busy, I guess.

Living abroad hadn’t just changed my zip code — it had fundamentally altered how I viewed success, relationships and the American Dream itself. In the Netherlands, I’d learned that a society could prioritize collective well-being over individual achievement.

But that's... that's... soshulizm!

What I’ve learned is that feeling like a stranger in your own country doesn’t have to be purely painful — it can be illuminating. It shows us that another way of life isn’t just possible, it’s already happening elsewhere.

I don't mean to be mean, but I've spent comparatively little time abroad and didn't need to spend any to figure out that what passes for culture in the US is fucked.

Some people really do thrive on it, though, and it's good to have choices.

Maybe we need more people willing to step outside the fishbowl and then return with fresh eyes. Maybe we need more voices saying, “This isn’t normal, and it doesn’t have to be this way.”

And maybe some people can figure it out without having to spend a year living in another country. Because not everyone can do that.

So, I hope you haven't spent this entire entry looking for an April Fools' prank. If you did, now is when I reveal that the only prank is that there was no prank. April Fools!


© Copyright 2025 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.
