Native to the Americas, the turkey vulture (Cathartes aura) travels widely in search of sustenance. While usually foraging alone, it relies on other individuals of its species for companionship and mutual protection. Sometimes misunderstood, sometimes feared, sometimes shunned, it nevertheless performs an important role in the ecosystem.
This scavenger bird is a marvel of efficiency. Rather than expend energy flapping its wings, it instead locates uplifting columns of air, and spirals within them in order to glide to greater heights. This behavior has been mistaken for opportunism, interpreted as if the bird were circling doomed terrestrial animals destined to be its next meal. In truth, the vulture takes advantage of these thermals to gain the altitude needed to glide longer distances, flying not out of necessity, but for the joy of it.
It also avoids the exertion necessary to capture live prey, preferring instead to feast upon that which is already dead. In this behavior, it resembles many humans.
It is not what most of us would consider to be a pretty bird. While its habits are often off-putting, or even disgusting, to members of more fastidious species, the turkey vulture helps to keep the environment from being clogged with detritus. Hence its Latin binomial, which translates to English as "golden purifier."
I rarely know where the winds will take me next, or what I might find there. The journey is the destination.
It's been years since I grew tomatoes and never knew any of this. Now that I know, I'm with Brandiwyn: I'm not eating any tomatoes with chemicals in them.
Hm. Where I live right now, we've had days over 85F in January. Summers get up into triple digits. I guess my tomatoes will have white, shriveled rings?
Tomatoes need a very balanced diet to grow well. In addition to the potassium, which prevents white rings, they need calcium to prevent blossom end rot.
When your hobby becomes your job, it loses a lot of what makes it interesting.
I'm not sure if I'm the right gauge, since I have a disorder that makes my hobby-turned-career more difficult and in some cases, impossible, but I still play piano for fun occasionally. Not nearly as often as I used to.
Doing nothing feels different depending on location.
When I do nothing at home, I have dishes and dusty floors silently judging me.
Second Son got stung on the cheek by a bee in Rome and his face swelled up like that of a cartoon character. We spent half the day resting in the hotel room. It was the most glorious doing nothing ever. We were in one of the oldest cities in Europe in a modern hotel doing nothing.
The best doing nothing in many years. Except for the bee sting.
I have a lot of leisure time..most of the time. I used to feel guilty thinking I needed to be doing something constructive. I'm getting past that..I worked my entire life..now it's time to relax and do what I want..if it's nothing..so be it.
All I really know about it is that there was a police department that had an issue with cops who liked to abuse their power. They tried yellow bands (or whatever), but other officers thought they looked cool, and intentionally got into trouble to get them. So, they decided to punish the officers with Hello Kitty stickers - hard to look intimidating, or cool, with one of those stuck to your breast pocket.
As a reminder, all words were invented. Some were invented more recently than others.
"Words are events, they do things, change things. They transform both speaker and hearer; they feed energy back and forth and amplify it. They feed understanding or emotion back and forth and amplify it," Ursula K. Le Guin wrote in her exquisite manifesto for the magic of real human conversation.
All due respect to Ms. Le Guin, but "real human conversation" can also be incredibly annoying.
In the roots of words we find a portal to the mycelial web of invisible connections undergirding our emotional lives - the way "sadness" shares a Latin root with "sated" and originally meant a fullness of experience, the way "holy" shares a Latin root with "whole" and has its Indo-European origins in the notion of the interleaving of all things.
Well, yeah, hence "holistic." I do appreciate etymology.
Because we know their power, we ask of words to hold what we cannot hold - the complexity of experience, the polyphony of voices inside us narrating that experience, the longing for clarity amid the confusion. There is, therefore, singular disorientation to those moments when they fail us - when these prefabricated containers of language turn out too small to contain emotions at once overwhelmingly expansive and acutely specific.
One thing I try to avoid is the phrase "words cannot describe." I call myself a writer. I need to find words that describe.
I don't always succeed in my avoidance, mind you; some things are simply indescribable.
John Koenig offers a remedy for this lack in The Dictionary of Obscure Sorrows...
Yes, the article promotes a book.
The title, though beautiful, is misleading - the emotional states Koenig defines are not obscure but, despite their specificity, profoundly relatable and universal; they are not sorrows but emissaries of the bittersweet, with all its capacity for affirming the joy of being alive: maru mori ("the heartbreaking simplicity of ordinary things"), apolytus ("the moment you realize you are changing as a person, finally outgrowing your old problems like a reptile shedding its skin"), the wends ("the frustration that you're not enjoying an experience as much as you should... as if your heart had been inadvertently demagnetized by a surge of expectations"), anoscetia ("the anxiety of not knowing 'the real you'"), dès vu ("the awareness that this moment will become a memory").
Well. All of that is nice. So are the other examples given, which I won't get into but are right there at the link. There's a problem, though.
The purpose of language, in my view, is not for us, as writers, to demonstrate our superior intelligence, insight, vocabulary, and sexual attractiveness (though we certainly possess these qualities). It's not to smugly show how clever and erudite we are. No, the purpose is to communicate. If we're going to go by what Le Guin said in the quote above, words can certainly "do things, change things," but only if both the speaker and listener (or writer and reader) can agree, at least to some extent, on the meanings of those words.
So if I'm going to a rock concert that I've been looking forward to for some time, and I feel like I should be enjoying it more, my friend might ask, "Hey, what's wrong?" And if I go, "I got the wends," unless they saw this article, they'd have no idea what the hell I'm talking about, and might even drag me to the first-aid station. Or, I could explain what "the wends" are, per the last paragraph I quoted, but at that point I wouldn't feel much like explaining anything.
Or I could just say "I'm not enjoying this as much as I'd hoped," and leave it at that.
And I say this as someone who's made up words in the past and has had to explain what they mean. Consider how much harder it must be if you're using someone else's recently-made-up words.
But hey, maybe some of these will catch on and become part of the lexicon. Language may reflect what's important to a culture, which is why we have dozens of words for death and less than a half-dozen for love. I'm sure there are linguists who disagree, but maybe, by changing the language, we can change minds. That power, however, can also be used for evil.
So, I don't know. I guess when it comes to this stuff, I'm feeling agnosthesia.
I saved this link for a few reasons. One of them is to display an example of how you can do a headline without making it clickbait. "Some Tomatoes Have A White Ring Inside. How Dangerous Is It?" would be clickbait.
Have you ever cut into a tomato and been perplexed to see a white ring?
No. Oh, I've seen the white rings. I was just never all that curious about them. It just never occurred to me that it was anything but standard variation in quality.
One of the primary causes of a white ring is a potassium deficiency in the soil when the tomato is growing. It's important to have an adequate concentration of potassium because, without it, the fruits may not absorb enough magnesium and calcium to properly ripen.
ChEmIcAlzzz!!!
However, too much sun exposure can also lead to your tomato growing with pale tissue. If the fruits are left out in the open at temperatures higher than 85 degrees Fahrenheit, they may turn white or yellow, and some areas may end up being dry or shriveled.
Do what now? Okay, it's been a very long time since anything useful has grown near me, and even longer since I (or rather my parents) grew tomatoes, but I seem to remember "plant after last frost" and "plant in an area with full sun." And Potomac River basin temperatures stayed above 85F most of the summer, even before climate change started to accelerate.
I could be misremembering. Also, what with selective breeding and genetic engineering and whatnot, I'm pretty sure there are hardier varieties.
There are a few other reasons why you may see white flesh inside your tomatoes. If stink bugs, beetles, spider mites, and other bugs get under the fruits' skin and start feeding, they'll suck out the juice and insert their saliva, leading to a white spot rather than a white ring.
You know how it goes: what's worse than finding a bug in your tomato? Finding half a bug.
The good news is that if you spot a white ring inside your tomato caused by a potassium deficiency in the soil, it's typically safe to eat and can simply be removed by cutting...
And why would you want to cut it out if it's safe to eat? Well, "safe to eat" isn't the same thing as "appetizing." I routinely excise the stem parts from tomatoes, just because I don't like them.
Really, there's not much to this article, but it did sort of answer a question I never knew I had.
Post-Luxury Status Symbol #2: Wasteful Time
We've spent two decades optimising ourselves into exhaustion, and now the flex is declaring you were never stressed in the first place.
I suppose that's preferable to all the bragging about how busy one is.
In Eat, Pray, Love, an Italian man tells our hapless protagonist her problem is that she's American - Americans don't understand pleasure because they believe it must be earned through exhaustion.
Far be it from me to agree with anything from the genre I call divorce porn (chick gets divorced, goes to a foreign land to "find" herself, doinks a hunky local guy, leaves satisfied), but that one feels right, like when a Belgian tour guide told me Americans eat like we have free health care.
Italians, he explains, have mastered il dolce far niente: the sweetness of doing nothing.
Fifteen years later, that sweetness has become the ultimate luxury.
Some might recall that I had an entry about doing nothing back on Groundhog Day: "Nothing Matters"
Thorstein Veblen argued that people signal wealth through conspicuous consumption, conspicuous waste, and conspicuous leisure. Had he lived into the 21st century, he might have added a fourth: conspicuous grinding. The performance of perpetual productivity. Capitalism convinced us this is what rich people actually do. It isn't.
The biggest advantage to being rich is that you have the ability, and the resources, to do nothing. Or almost nothing. But grinding doesn't get you there. Your hustle mostly enriches someone else. Someone who is doing almost nothing. And yeah, you're surviving, maybe even thriving, but you're not going to become a billionaire that way.
(Look, if the article can use the second-person pronoun with impunity, so can I.)
Leisure makes you feel guilty because you're not working. Working constantly feels virtuous because that's what success demands. We optimised our work, then ourselves, then wondered why we felt empty.
(And also the first-person plural pronoun.)
What's emerging now is a pendulum swing towards a new aspirational leisure class: people whose value isn't tied to what they do, but to how effortlessly they exist.
Insofar as people have "value," I balk at the notion that some are more valuable than others.
Time itself has become precious, so the ultimate status is to be wasteful with it. Complete autonomy over your schedule. The ability to meet anyone, whenever, and always know the right spot. To decline opportunities based on values or vibes. To partake in long, leisurely meals with no rushed ending.
I also balk at—nay, outright reject—the idea that such things are in any way "wasteful."
Although many activities today would have been considered leisure by previous generations - skincare rituals, vinyl listening bars, elaborate dining experiences - the question remains: is it still leisure if an algorithm told you to do it?
People can answer that question for themselves, I think.
If nobody could see you, if you couldn't post about it, would you still do it?
That one, too. In my case, I do plenty of stuff that I don't post about. Some of it's not even embarrassing to admit; I just keep it private.
If so, that's neo-leisure. If not, it's unpaid labour, the performance of joy for an invisible audience.
Personally, I'm under the impression that a lot of that sort of thing isn't someone spontaneously deciding to, say, go to Tuscany and doink a local, but someone getting paid to promote Tuscany.
This is the contradiction at the heart of Neo-Leisure: the moment you perform it, you're optimising again. The ability to waste time becomes another metric to track, another behaviour to perfect. We've simply replaced productivity optimisation with leisure optimisation. One exhausting performance becomes another.
When your hobby becomes your job, it loses a lot of what makes it interesting. Like, can porn stars ever have normal sex again? I'll never know the answer to that one.
For marginalised communities, for precarious workers, for anyone without generational security, the luxury of wasting time remains inaccessible. They're still grinding because they have to. The status symbol isn't in wasting time. It's in having enough capital that you don't need to justify how you spend it.
I don't think it's an epiphany to realize that leisure is tied to privilege. I know there's a bit going around about how feudal serfs had more free time than we do in our post-industrial dystopia, or about how hunter-gatherers work less than agriculturalists. I don't know how true any of that is.
The article then goes into "leisure" products that I've never even heard of. Remember what I said about getting paid for seeming to perform leisure, up there? I suspect that this is product placement.
True leisure, in my view, doesn't need a "product."
Odell wrote that "nothing is harder to do than nothing". In an era where attention and consumption are currency, wasting time becomes an act of resistance.
Okay, but again, I must reiterate that I don't believe that these things are wastes of time. You know what is a waste of time? Doing work for a project that ultimately gets canceled. Even that isn't a complete waste of time if you learn something along the way.
The greatest luxury might be doing nothing and feeling no need to signal it at all.
Maybe. Or maybe the greatest luxury is to get paid to write blog entries. (To be clear, at the risk of repeating myself, I do not get paid to write blog entries.)
Many years ago, I had this recurring schtick about how a certain white kawaii cat from Japan was evil and taking over the world. I called her the Nefarious Neko. I dropped it because it got old for me, but along the way, I learned way more than I ever wanted to know about Hello Kitty. So this BBC article caught my attention:
I, of course, never actually hated Hello Kitty. If there's something I actually detest, I normally just leave it alone, like the way I almost never talk about sports in here.
The designer behind Hello Kitty is stepping down after 46 years, during which time she oversaw the feline character achieving world recognition.
"World domination" is more like it. I remember some years ago, I saw an article about what was called "the most remote community in the world" or something, a tiny village in northern Siberia that was the only place of human habitation for many kilometers around. (Perhaps there are islands more remote, technically.) I don't remember many details, but it's the kind of thing that's only accessible by rail, and, because it's Siberia, only for like three months out of the year or something. Point is, it is quite literally the farthest corner of the world, and I distinctly remember, in one of the photos, a little girl wearing a Hello Kitty shirt.
Yuko Yamaguchi took over design duties for the character - who isn't actually a cat, but a little girl from London - in 1980, five years after she first launched.
It's not widely known, but yes, she's actually a little girl and she's actually British and her name is actually Kitty White.
Yamaguchi herself often wore Kitty-style dresses in public and piled her hair in buns.
One of these days, I'm determined to visit Japan. I'll need to brace myself for that sort of thing.
The Hello Kitty character first appeared on a coin purse in 1980 and has become a global marketing phenomenon.
This is where I started to metaphorically scratch my head. Up there, it said she took over "in 1980, five years after [HK] first launched." Seems to be a glitch in the timeline there.
She has appeared on clothes, accessories, video games, and even an Airbus plane.
And I saw an entire bullet train with a Hello Kitty theme. Well, pictures of it, anyway. Not to mention the Hello Kitty vibrator and the Hello Kitty assault rifle, which I called the HK-47.
Unlike other Japanese exports such as Pokemon, there is little backstory to the character of Hello Kitty. Sanrio has said she "isn't a human, [but] she's not quite a cat either".
As she has an actual cat as a pet, if she were a cat, it might raise questions about slavery and feline trafficking.
She was born in London and has a twin sister named Mimmy and a boyfriend named Dear Daniel, according to Sanrio.
And yet, the article fails to mention her name. Fortunately, I covered that up there ^
Kitty will make her cinematic debut in a Warner Bros film in 2028. She has already appeared in several animated series but has never spoken, as she doesn't have a mouth.
Now, I want you to stop and think for a moment about the optics of a British little girl character, created in Japan some 30 years after WWII, whose distinguishing feature is being voiceless. And maybe you'll see one reason why I had that schtick going for a while.
I'm actually more annoyed at the continued use of "hack" in this context. But I'm not sure if it's better or worse than "trick."
Whether you like them fried, roasted, baked, or made into tots...
"Boil 'em, mash 'em, stick 'em in a stew. Lovely big golden chips with a nice piece of fried fish"
...you probably have at least one favorite potato dish.
Sure: Latkes.
Plus, potatoes are inexpensive - you can often get a large 5-pound bag for just a few dollars.
How much for the small 5-pound bag?
But if you're a household of one or two, it can be a challenge to eat all those potatoes before they go bad, no matter how much you like them.
Whenever I see something about food "going bad," I imagine it standing on a street corner in a leather jacket and tattoos and chains, smoking a cigarette.
That's why videos of people stashing apples in their bags of potatoes to prevent sprouting have popped up all over social media. But does this trick work? We looked to the science, talked to an expert, and tried it ourselves to find out for you.
No, I didn't just save this article so I could make potato jokes. That's just a bonus. I'm using this as an example of How To Do Science. Still. Remember the French phrase that translates to "potato": pomme de terre, which, literally, translates to something like "earth apple" or "ground apple" (as in "the ground," not the past tense of "grind"). So I find the apple trick amusing, whether it works or not.
But in addition to storing them in that cool, dark, and ventilated space, can putting an apple in your potato sack really stop, or at least slow, potato spoilage? Well, it's a little complicated.
So, how to do science. This is an easy and cheap experiment, though if you have an ethical issue with deliberately wasting food, maybe skip it. Find a bunch of potatoes from the same harvest, get an apple, split the potatoes into two groups, put each group in a sack (one with the apple) and leave them in the same room under the same conditions, but not so close to each other that apple fumes transfer.
Then simply check to see which batch, if either, grows eyes first.
Of course, just one experiment won't cut it. It needs to be repeatable and verifiable. Also, best if you have a third batch of potatoes from the same source set out on the counter or something as a control group.
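The bookkeeping at the end of such an experiment is simple enough to sketch. Here's a minimal Python sketch of tallying the three batches; the sprout-day numbers are hypothetical placeholders, not real observations.

```python
# Tally a potato-sprouting experiment: for each group, record the day
# on which each potato first "grew eyes," then compare group averages.
from statistics import mean

# Hypothetical first-sprout days per potato, by storage group
groups = {
    "with apple": [7, 8, 8, 9],
    "without apple": [8, 9, 9, 10],
    "control (counter)": [7, 8, 9, 9],
}

for name, days in groups.items():
    print(f"{name}: mean first-sprout day = {mean(days):.2f}")
```

With only a handful of potatoes per sack, a one-day difference between groups (like the article's result) is well within noise, which is why the repeatability point matters.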
There's some scientific evidence to support this hack. Ethylene gas is a natural plant hormone produced by fruits like apples, bananas, and tomatoes. It plays a crucial role in the ripening process of fruits and the aging of vegetables. The theory behind the apple-potato trick is that apples release ethylene gas, and ethylene gas has been shown to inhibit potatoes from sprouting in at least one lab experiment.
This is the "hypothesis" stage of science. Not "theory." It's the starting point. But one must also take into account the possibility that there are other factors in the pomme / pomme de terre synergy, not just ethylene gas. For that, you'd probably need a real lab setup.
As a practical matter though, what you're really just looking for is extended shelf life on your spuds, so while the mechanism is interesting, just doing the experiment is good enough for the kitchen.
Or, you know, trust the other scientists who have already done the experiment.
The study showed that ethylene treatment delayed the sprouting of potatoes, at least under these tightly managed conditions.
It's the "tightly managed conditions" that are often the stumbling point between experiment and practicality.
To put this apple-potato trick to the test, I conducted a simple experiment in my home kitchen. I divided a bag of potatoes into two groups: one stored with an apple and the other without. I kept both bags in a relatively cool, dark pantry and checked on them every day for more than a week.
See, what'd I tell you?
Surprisingly, after seven days, I found that the bag of potatoes with the apple actually sprouted first, while the bag without the apple sprouted about 24 hours later, after eight days. It's a puzzling result considering the research.
A puzzling result, maybe, but a good result. Why good? Because it exposes a possible error in your hypothesis, or your method, and that's the fun part.
My pantry isn't a lab, and my climate control was anything but precise. Plus, different potato varieties may have varying susceptibility to sprouting.
Okay, yeah, but like I said: get your potatoes from the same batch. If one batch was dug up 3 days ago and the other, 5, then the experiment has a fatal flaw from the get-go.
There is no harm in trying this trick at home, says Jayanty. Whether you go with an apple or a banana, it won't hurt the potatoes, and it just might delay sprouting.
"No harm" unless you'd rather have an apple or a banana than a potato to eat.
The article ends with actual scientifically-backed tips for extending tater life, so there's some practicality there, apart from the kitchen science stuff.
The absolute last thing I should be doing is talking about human interrelations. I know more about quantum physics, and I know almost nothing about quantum physics. But, in another example of random number clumping, here's another BBC article, this one on relationships:
For once, a headline question can be answered "yes." For a question like that, all you need is one counterexample. And I have friends who are exes, so there you go. Story over.
...except, of course, we already know that I'm an exception, an outlier, a better person. I suspect many people would say "hell, no."
Break-ups are tough - you suddenly lose the person you shared everything with. But staying friends with an ex can be equally as painful.
Well, yeah, but presumably you liked them for a reason, right? A reason that might be common grounds for friendship? Unless it was purely physical attraction, which there's nothing wrong with.
I should note, however, that part of the "I don't know what I'm talking about" thing is that I'm not sure whether the article is talking about, like, just being cordial with the ex, or occasionally doing stuff together in a group, or someone you can call to help you move, or a full-on "help you move bodies" close friendship.
That last one, admittedly, could get a bit awkward, I would imagine. It's like "Sure, I'll be there if you need me; I just find you unsuitable to live or mash genitals together with."
"I don't have many friends who are friends with their exes, actually," says Olivia Petter, author of dating handbook Millennial Love. But she has managed it in a couple of cases.
Obviously, this kind of thing is way behind me, anyway. So my interest is mostly sociological.
Okay. I can't pass up noting yet another aptronym there. Almost makes me think "Olivia Petter" might be a nom de plume. But no one would be that obvious... would they?
1. How serious was it?
"There are one or two men I've had brief, casual romantic relationships with that have evolved into friendships," Olivia told BBC Radio 4's Woman's Hour...
But when it comes to serious relationships, she says while she's on good terms with them they're not close friends.
See, that's the sort of thing I'm talking about. I run into my first ex-wife occasionally. We talk and catch up. Then we forget about each other again.
Meanwhile, a month-long fling I had back in the 90s is the kind of friend where we text fairly often. She's on the other coast, so I don't just run into her. We both did a Thanksgiving dinner last year, though, with a group of friends. No awkwardness involved.
2. Are you over them?
One of the biggest obstacles is whether you are able to separate the romance from the person.
"You need to have processed the break-up, not just moved on logistically, but emotionally," Kate says.
Huh... that's perilously close to what I said above. Maybe I should write a book about relationships and have BBC promote it.
3. How much time has passed?
Yeah, that actually makes sense. Get past the big emotions. And, obviously, some relationships end with unforgivable acts.
4. Is your new partner ok with it?
If you do decide to stay friends, then Kate says you need to talk openly about what you are both going to do if the other gets into a new relationship.
And if a new partner is uncomfortable with the friendship, Kate stresses you should take their concerns seriously.
We didn't use the words "red flag" way back when I was dating, but one immediate "no" signal for me was when someone indicated that being friendly with an ex was a big "no" signal for her. Most of my friends are women. Some are exes. I wasn't about to give up long-term friendships because someone new doesn't trust it.
You may need to have a conversation with your ex to adjust the friendship which could be "less frequent contact, more group settings, or being more transparent about what you're doing together," she says.
And then, from my longer-term perspective, sometimes friendships just fade out, whether there was ever romance and/or doinking involved. People drift apart. It's just a fact of life. It's not always a, pardon me for adopting the term, "conscious decoupling" or whatever. (I'm applying this to friendships as much as romance.)
I guess, for me anyway, I just feel like holding a grudge hurts you more than it does the other person, so I try not to do it. Doesn't mean we have to do stuff together, but it hurts me not at all to at least be cordial.
Again, though, I have a longer perspective on these things. It's different for younger people, and society is different, too. Hence my interest in the subject remaining purely scientific.
I'm skeptical about a lot of things. Not denialist; skeptical. But there are a few things I'm absolutely certain of, and one of them is that plants die in my vicinity. Hell, one time I bought a cactus for my housemate. I never touched it. She was diligent with it. It died. So I don't believe for one second this article from BBC:
I thought about getting plastic plants, but I expect those would die around me, too.
Have you lost count of the times you've had high hopes for a pot plant...
However, this lede is the real reason I saved this article. High hopes? Pot plant? I get they speak a different language over there, but either they ignored their US editors, or this is one of those times when the BBC gets a bit cheeky. It does that sometimes. It's often subtle. So I'm hoping—really hoping—that this particular double entendre was intentional.
Still, in the US, we call them "potted" plants to distinguish them from weed.
...but despite careful positioning and diligent watering it always seems to die?
Every. Time. Full disclosure: I've never tried with an actual (snicker) pot plant.
Well, you're not cursed, and you don't need particularly green fingers for your foliage to thrive; you just need to know where you might be going wrong, experts say.
You all know by now that I have no business with the supernatural. I mean, it's great as metaphor and in stories, and I enjoyed the long-running series by that name, but here in reality, I prefer science and reason. Still, if anything was going to pivot me to believing in curses and whatnot, it's the way plants die in my presence, as if realizing that they're stuck with me and have no legs to run away with.
But—rationally—my cats have legs and are stuck with me, and they don't try to run away. Except the one who gets medicine twice a day, but she doesn't run far.
Gardeners' World host Adam Frost and the Royal Horticultural Society's Clare Preston-Pollitt share their top tips for keeping your house plants alive and healthy.
Clare's last name is precipitously close to being an aptronym. Pollitt? Pollen? I'll be here all week.
Adam's is the polar opposite of an aptronym.
1. Pick the right plant
I don't want to seem ungrateful, here, but that's crap advice. As someone who appreciates form, but appreciates function even more, the only plants I really want to keep around are the ones you can use in cooking: mint, thyme, basil, etc. A ficus is useless and just takes up space. So I want to know how to keep, specifically, herbs alive, not some random spider plant that would look better in a vegan restaurant anyway.
Many of us pick plants we think are pretty but making sure they are compatible with the conditions in our homes is key for survival, says Clare, RHS Garden Bridgewater's horticultural advisor.
And if you live with pets, some plants are right out. Imagine me having a catnip plant. It wouldn't last a day. But, worse, some plants are outright toxic to pets, usually the same ones that pets absolutely love to munch on.
2. Don't overwater
Yeah, thanks. I suspect the problem here is the precise opposite, but "underwater" means something else entirely.
For common house plants like peace lilies and spider plants, brown leaves are a tell-tale sign of over or under watering. Check the dryness of the soil before topping them up.
For others, like cacti and succulents, Clare says we mistakenly drown them by unnecessarily watering them.
I am certain that this is not the case for the cactus I mentioned above.
3. Water less in winter
Some regions experience winter differently. Specifically, some never get cold, while others never get warm. I suspect this article was written with the temperate and moist UK in mind.
4. Keep your Christmas poinsettia warm
Remember how I said the thing about pets? Yeah, poinsettia and pets don't go together, despite being unable to spell "poinsettia" without "pets."
Also, even I know poinsettias are from Central America and Mexico. Which is one of those regions I mentioned in the last section. Exposing them to cold is like mixing good tequila with Sprite.
To keep them lasting longer than your New Year's resolutions, you should add plant food to your poinsettias each month, Adam says. In April, he suggests trimming the branches before re-potting in May.
And I'm including that bit to emphasize what I said up there about the BBC sometimes being cheeky. "lasting longer than your New Year's resolutions," indeed.
I'm obviously leaving a lot out, but the link is there if you haven't yet given up on the entire idea of houseplants, like I have.
There was a slew of articles about Shakespeare not too long ago, probably paid advertising for the movie Hamnet. This might or might not have been one of them, from Mental Floss:
Well, yeah, because no one knows everything about Shakespeare. Hell, no one even knows for sure what day he was born on (they have a baptism record and an assumption based on custom).
As usual, don't trust MF for the facts. Or me, for that matter. This is, as certain "news" outlets disclaim, for entertainment purposes.
Misconception: Historians debate whether Shakespeare really wrote Shakespeare.
Shakespeare wrote a lot. He was also from the country town of Stratford-upon-Avon and didn't go to university. So could this one "simple" guy write all these impressive high-brow works?
So, two things here. I've heard this "theory" bandied about for as long as I've known about Shakespeare, and my first impression (as an impressionable kid) was "How is this relevant? The plays exist. Can we not separate them from the writer, if not from the time?" Later, I came to realize what this really was: arrant snobbery.
Thus, I feel like I can safely ignore any snoot who proclaims that Shakespeare didn't write Shakespeare.
And finally, they're only "impressive high-brow works" from our point of view today. At the time, they were the pop culture equivalent of monster truck rallies.
Misconception: Shakespeare invented 1700 words.
Today's lexicographers have a lot more data and technology, and they know Shakespeare didn't coin that many words. (Jonathan Culpeper, a linguistics professor at Lancaster University, has spent decades researching Shakespearean language. He believes Shakespeare coined around 400 words.)
As I'm pretty sure I've noted before, I don't know how these things get determined, but it seems to me that, especially before the internet, words were passed around by, well, word-of-mouth before they were ever written down. So how do we know what was coined, and what was the pop culture equivalent of "six-seven" and "skibidi?"
I will, however, give more weight to the opinion of a linguistics professor in the matter.
Misconception: Saying Macbeth in a theater is dangerous.
Okay. Okay, fine, so that's a misconception. And yes, actors are known for being a superstitious lot, which is why you tell them to "break a leg" before a performance instead of "good luck." However, that does not make this clip any less than one of the funniest things in the history of comedy:
Misconception: Wherefore means "where."
An image you've probably seen countless times is Juliet decrying, "O Romeo, Romeo, wherefore art thou Romeo?" It sounds like, "Where you at, Romeo?" And some performances even have Juliet physically searching for Romeo as she says those lines.
But, at the time Shakespeare was writing, wherefore essentially meant "why." Juliet is asking, "Why are you Romeo?" because it's his name, attached to a family that's feuding with hers, keeping them apart.
Okay, I'm not going to argue about that. It just seems weird because it's not his name "Romeo" that's keeping them apart, but his family name. And the necessities of plot.
I continue to insist that R&J is best interpreted as satire and/or parody.
Misconception: As Hamlet says, "to be or not to be," he's holding a skull.
Sometimes in pop culture, you encounter a Hamlet who's holding a skull and reciting the "to be or not to be" speech. (Billy Madison is one example.) But Hamlet holds a skull during his speech in the churchyard that begins, "Alas, poor Yorick!" It happens in Act 5, Scene 1. "To be, or not to be: that is the question" comes two acts before that, in Act 3, Scene 1.
Sure. Again, not arguing. But it's pretty damn famous that in all of Shakespeare's surviving works, only one stage direction stands out: "Exit, pursued by a bear." (The Winter's Tale). And there weren't many in his entire body of work. I don't recall, and am too lazy to look up, if those parts of Hamlet have actual stage directions, or if it's left up to directorial interpretation.
So if you want to have Hamlet holding a skull, or a book, or a damn iPhone, in your production of Hamlet, I say go for it. Maybe he carries a skull around with him all the time. He certainly seems the type.
Misconception: The Globe Theatre was round.
The first Globe Theatre was completed in 1599. Shakespeare was a part-owner of the Lord Chamberlain's Men company, which built the venue. And he may or may not have called it a "wooden O" in the prologue of Henry V. That being said, it wasn't exactly a circle. It was a many-sided polygon.
In fairness, calculus wouldn't be invented for nearly 100 years, so I don't think they'd be aware that, as the number of sides of a regular polygon increases, its resemblance to a circle also increases. At some point, architecturally, you have to say "Yeah, fine, that's a circle."
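And that convergence happens fast. Here's a quick sketch, if you'll indulge me, using the standard formula for the perimeter of a regular n-gon inscribed in a circle (the function name and the sample side counts are mine, just for illustration):

```python
import math

def ngon_perimeter(n, r=1.0):
    """Perimeter of a regular n-gon inscribed in a circle of radius r.

    Each side subtends an angle of 2*pi/n at the center,
    so each side has length 2*r*sin(pi/n).
    """
    return n * 2 * r * math.sin(math.pi / n)

circumference = 2 * math.pi  # unit circle
for n in (6, 20, 100, 1000):
    err = (circumference - ngon_perimeter(n)) / circumference
    print(f"{n:5d} sides: perimeter {ngon_perimeter(n):.6f}, off by {err:.4%}")
```

By 100 sides the perimeter is already within a small fraction of a percent of the circumference, which is about the point where an Elizabethan carpenter would say "Yeah, fine, that's a circle."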
It wasn't, however, a "globe." The Hayden Planetarium is a "globe" (inside a cube). At best, the Globe Theatre approximated a cylinder.
Now, there are a bunch I skipped because I had nothing to say about them. They're at the link if you're interested.
Well, no, I haven't started suddenly following Wine Spectator. Though I might. Still, I'd rather drink it than read about it, except maybe in this case:
Okay, well, if it's truly an indispensable tool, then the answer is "stop being 'organic'" or "move out of France."
The irony (another heavy metal) here is that copper is considered "organic" for agricultural purposes. Which is distinct from the chemical meaning of "organic" (carbon, not copper, compounds), and the original meaning of "organic" (from organs).
My amusement at this is tempered only by knowing that, in French, what we call organic in the agricultural sense is called biologique.
It leaves organic winemakers confronting an existential question: How do you protect vines from downy mildew when your primary defense has been eliminated?
Because no one in France has ever confronted an existential question before.
Uh huh. Okay. I'm letting the repetition of "natural" slide there because it's either a translation or someone's second or third language. But I'm not going to let the natural fallacy slide, oh hell no, not me.
Once more for the back row: "natural" doesn't mean "good." Poison ivy is natural. Tobacco is natural. Arsenic is a naturally occurring element much like copper.
Copper is a naturally occurring element and is approved worldwide for organic agriculture. Critically, there is no organic-approved equivalent: the other options are synthetic fungicides, which organic rules forbid.
Not, mind you, that I'm coming down on one side or the other here. I don't know enough about copper toxicity or viticulture in general to weigh in on what France did, and even if I did know enough, they wouldn't give one single shit about my opinion. All they care about is that I enjoy the finished product and keep sending them money in exchange.
Copper, while it does protect vines from fungal disease, is a persistent metal that accumulates irreversibly in the top few inches of vineyard soils. In large quantities it disrupts essential microbial communities and earthworm populations that define healthy terroir. It can also contaminate the waterways that flow through wine regions. There are also mounting concerns about its impact on vignerons and vineyard workers themselves.
Well, there you go. Look at that: something natural isn't good for you.
Regarding a way forward, Jestin remains optimistic, hoping that scientific research can devise alternatives to copper.
Honestly, I hope so too. French wine is expensive enough with tariffs.
Trade body SudVinBio cautioned that producers may abandon organic practices altogether.
About that, I don't care.
Two copper products remain authorized, but they carry stringent restrictions, which may make them less practical. Without viable alternatives to copper, how does organic viticulture survive in regions where Bordeaux's Atlantic humidity, Burgundy's continental rainfall, and Cognac's and Champagne's persistent dampness all create conditions where mildew protection determines vineyard sustainability?
I have another article in my pile about the American chestnut, which used to dominate Blue Ridge Mountain forests until a fungus destroyed the entire native population. It'll pop up eventually, but the point for now is that things change. And right now, things, climatically speaking, change even faster. France is very strict about its wine growing policies; for instance, they don't allow irrigation apart from whatever rains fall. So either they can become less strict, or wine growing will shift to some other region.
Which would be a shame, but at least they'll still have cheese. Which I don't think has any copper in it, but I haven't tested it for that.
Today, a bit of a break as I use this space to participate in "26 Paychecks" [E]
Tell us about one (1) genre that you've never written anything for. Name the genre and tell us why it's not something that has sparked any writing from you.
Thing is, I've been here for a while, and I've written a bunch of things both here and elsewhere. And I tend to experiment, sometimes. So I can't absolutely guarantee that I've never written in some particular genre. Hell, I probably have something listed under "Fashion," which is one genre I've always said I knew nothing about.
But, upon browsing the list of genres here, I came across one that I don't think I've ever touched, which is Genealogy.
There's nothing wrong with genealogy. I get wanting to know where people come from, much as I enjoy tracing word origins. It's just not a particular interest of mine.
You'd think it would be, right? Since I was adopted as an infant, surely I must feel a burning need to track down my genetic ancestry! Well, um... no. I don't. Never have. Nothing more than idle curiosity or being able to answer inheritable disorder questions for the doctor. I have had some interest in tracing my adoptive parents' origins, but any attempts led me to dead ends and I gave up, never having written about it. Dad always warned me I'd find skeletons, anyway, and unlike a character in a horror story, I actually listen to warnings. I don't always heed them, mind. But I listen.
Short one to keep it "about 200+ words," but that suits my mood today just fine.
As we know, the answer to headline questions is "no" by default. But okay; I'm willing to listen.
It's no secret that Americans are more politically polarized today than we've ever been. Do you even remember a time when we weren't this way?
The only time I recall was the few months after September 11, 2001 (24 years ago now). For a short but beautiful time, Americans really were united.
No.
We thought we were united. But it turned out that we were all mad for different reasons. One side was sad about the loss of life. The other seemed to be, but was really just pissed that someone caught us with our pants down.
Our representatives in Congress came together to proclaim their commitment to working across the aisle and backed it up with some major bipartisan laws.
Yeah. Really, really bad ones.
We Americans have settled ourselves neatly into political tribes that don't work together, don't listen to each other, and often despise one another.
Or both. I can despise both.
Many people have been hurt by our current level of polarization, and there's worse pain to come if things continue this way.
If you haven't noticed, there's worse pain to come regardless.
Anyway, the article goes into this "depolarization challenge," and I have no need to reproduce it here.
At this point you might be thinking, "But why do I need to change? It's those other people who are causing all the problems!"
I can understand that thinking. There are certainly things I've given up on because we'd all have to do it, and that ain't gonna happen. But when you think harder, for stuff like what's in this article anyway, maybe you come to the conclusion that it's easier to change yourself than it is to change other people.
Some people are not ready to step outside their comfort zone and change their mindset in this way. But those who do will be rewarded with a stronger sense of community, a more functional civic society, less heartache, better relationships, and a country that they can be proud of. And maybe, if enough of us do it over a period of time, our government can become less polarized too.
We're entering an era of what's often called AI slop: an endless stream of synthetic images, videos, and stories produced quickly, cheaply, and at scale.
I have to admit, I'm getting more than a little tired of hearing/seeing the words "AI slop." From what I've seen, AI output has become more polished and professional than about 75% of human-generated content. I think some people might be jealous.
I ain't saying it's right, mind you. Only that it's prettier.
Sometimes these fake videos are harmless or silly, like 1001 cats waking up a human from sleep.
Harmless? You dare to call something that triggering harmless? I don't even allow the significantly fewer than 1001 cats in my household to wake me up.
Other times, they are deliberately designed to provoke outrage, manipulate identity, or push propaganda.
Because human-generated content would never do that. Just like no one ever got lost using paper maps in the time before GPS.
To navigate this new information environment, we need to combine psychological literacy, media literacy, and policy-level change.
And here's where it gets difficult for most of us. Why should we change? It's the world that needs to change, dammit!
The article provides a road map (or, if you prefer, a GPS route) to us changing:
1) Understand Our Own Psychological Biases (Psychological Literacy)
The psychology behind falling for AI-generated misinformation isn't fundamentally new. The process is largely the same as with other forms of misinformation, but AI takes it to a whole new level: it dramatically lowers the cost and effort required to produce and spread it at scale.
My own simple solution: Right now, most of us have a bias that says "I saw it, so it must be real." I suggest turning that around. Assume everything you see on the internet, or on TV, is fake. Like you're watching a fictional movie or show. The burden of proof thus shifts.
The downside to this (every simple solution has a downside) is that you get so you don't believe anything. And for some of these content generators, that's the goal: make you question reality itself so they can swoop in and substitute their own version. Hell, religion has been doing this for as long as there's been religion.
As Matthew has written before about fake AI accounts, people are motivated to believe what fits their values, grievances, and group identities, not necessarily what's true. When a video confirms what you already believe about politics, culture, or power, authenticity becomes secondary.
I have noted this before: it is important to be just as, or preferably more, skeptical about the things that tickle our confirmation bias.
The goal isn't to suppress emotion. It's to recognize when emotion is being used as a shortcut around verification, and when it's being used to manipulate you.
It sure would be nice to be able to suppress emotion, though. I've felt that way since watching Star Trek as a kid. Spock was my role model.
2) Lateral Reading Is Still the Best Tool We Have (Media Literacy)
When people try to fact-check AI videos, their instinct is often to stare harder at the content itself: examining faces, counting fingers, looking for visual glitches.
Guilty.
I've been seriously considering wearing a prosthetic extra pinkie finger so that anyone who looks at a surveillance photo of me will immediately assume it's an AI fake.
The most effective fact-checking strategy we have isn't vertical reading (scrutinizing the video itself). It's lateral reading: leaving the content entirely to verify it elsewhere.
I do that here, especially with notoriously unreliable sources, which, since I try to use free and easily accessible content, is almost everyone these days.
3) Policy Changes and Platform Accountability
Individual skills matter. Community norms matter. But at this point, policy intervention is likely required.
Well, I was trying to be funny with the "It's the world that needs to change" bit above, but I guess they're serious.
Social media platforms are not optimized for truth; they're optimized for engagement.
I should fact-check this, but it aligns with what I already believe, so I won't.
Conclusion
The most dangerous thing about fake AI videos isn't that people believe them once. It's that repeated exposure erodes trust altogether: in media, in institutions, and eventually in one another.
As I alluded to above, it makes us question the very meaning of "truth."
I'd also add this: Be humble enough to know that you can be wrong. Be brave enough to admit when you're wrong. And allow space for the idea that sometimes, your ideological opponents are right.
You know those "which Hogwarts house are you?" quizzes designed to fill out your ad profile online? I don't know; maybe they've finally fallen out of favor. Here's a different kind to consider, from Big Think, and I'm not even building an ad profile of you:
For clarity, that subhead there is the author describing himself as a Kitsune. I'm absolutely not a Kitsune, though I appreciate them. Sometimes.
We are all philosophers. I don't mean this in the "What do you make of Quine's 'Two Dogmas'?" sense. No, we are all philosophers in that we all do philosophy.
Yeah, even that insipid song by Edie Brickell with the line "philosophy is the talk on a cereal box" is a kind of philosophy.
Philosophy is a practice of wonder and logic; curiosity and introspection; dialectic and meditation; criticism and advocacy.
I question the author's assertion here, but I guess that means I'm doing philosophy.
So, without any empirical rigor whatsoever (another favorite characteristic of philosophy), I present here five different ways to be a philosopher.
I feel like "The Fool" is conveniently left out, though maybe that's an aspect of the Kitsune. Yes, yes, I'm getting to what that is, if you don't already know.
But that's because I assert that philosophers, by definition, have a stunted sense of humor, or none at all. We have a different word for philosophers with a sense of humor: comedians.
The Sphinx
The archetype: The Sphinx had the head of a woman, the body of a lion, and the wings of a bird.
While that kind of chimera is probably highly symbolic, I don't know what the symbols might mean. Physical descriptions are probably the least important things in these archetypes.
Each time, the Sphinx would ask a single riddle, the classic being, "What walks on four legs in the morning, two at noon, and three in the evening?" but I assume there were more.
One of my favorite scenes in fiction is from a Zelazny novel. The MC meets a sphinx, who asks him a riddle. He asks, in return: "What's red and green and goes round and round and round?" This stumps the sphinx, because of course the sphinx isn't attuned to the modern definition of "riddle." He is thus able to pass while the sphinx ponders, much like when Spock set an android into an infinite loop with deliberate illogic.
This is probably when I determined the essential difference between philosophers and comedians.
Oh, the answer is "a frog in a blender."
The Leviathan
The archetype: The Leviathan is a demonic sea serpent that breathes fire. Its back is a row of shields, and it churns the oceans to a frothing boil.
Not ever answered: what use fire-breathing has in a sea monster.
This person has a transferable framework that they apply to everything. They've read a book, studied a philosophy, or watched a YouTube video and decided, "Yes, this idea is the one that will govern my life." Every action in every minute of the day can be explained by this single system of ideas.
Oh. That type.
The Kitsune
The archetype: In Japanese folklore, the kitsune is a fox spirit known for their ability to shapeshift. A kitsune might appear as a beautiful woman, an old man, a child, or a tree. Some are tricksters, and others are teachers.
The "trickster" archetype can be funny. But not usually to the ones being tricked.
The kitsune-person may say something outrageous and, when challenged, give a wide smile with a twinkle in their eye. They're often impossible to argue with because they keep changing things.
Oh, yeah, the goalpost-mover.
The Minotaur
The archetype: The Minotaur is a half-human, half-beast (typically a bull) locked in a labyrinth. The Minotaur is feral and brutal, no doubt (he will kill anyone he catches in his maze), but he is also lost and tormented.
In my view, the "bull" part is essential to the minotaur's description. It's right there in the name. ("But, Waltz, what about centaurs? They're part horse, not bull." "Turns out one possible etymology for 'centaur' is 'bull-slayer.'")
The minotaur-philosopher is someone lost in the mire of human suffering, mortality, freedom, and absurdity. They never escape the labyrinth but make a dark, resigned home within it. Here, you'll find Pascal, Dostoevsky, Heidegger, Sartre, Camus, and Simone de Beauvoir pacing about in anguish.
No comment.
The Garuda
The archetype: The Garuda is a great eagle of Indian mythology and is associated with clear sight and the dispelling of poisons, especially those of serpents and nagas. The Garuda soars above the landscape and sees the structure of things.
One might think that because it's a big-ass bird associated with purification, I'd identify most closely with this. One would be wrong.
The Garuda-person asks, "What do you mean by that?" a lot. They hate vagueness and metaphor used as arguments and will often call out both. "What does that actually mean?" they say. They generally don't have time for "lived experience" or emotional reasoning.
Or, I don't know. Maybe that's pretty close.
Fuller descriptions exist at the link, of course.
While, as the author notes, the list is by no means exhaustive, I find it amusing. I'm also quite pleased that it's not limited to one set of mythology, though there are certainly others that could be included, from other cultures. Though the "trickster" archetype seems to be pretty universal.
And most of us are composites: a little Sphinx when we're unsure, a little Minotaur late at night, a little Garuda when we're fed up with nonsense.
I'd venture that most of us just are, without thinking about archetypes. Hm. Maybe Edie Brickell was onto something, after all.
Here's one for your inner 12-year-old, from Live Science:
How many holes does the human body have? You might think that the human body has many holes, but that number shrinks when you stop to consider what counts as a hole.
Because I know your inner 12-year-old immediately said "which sex?"
The human body is extraordinarily complex, with several openings and a few exits.
Cue Beavis and Butt-Head.
But exactly how many holes does each person have?
I imagine it not only depends on your definition of "holes," but how recently someone's been shot. Maybe that only applies in war or the US.
But it's not quite that easy once you start considering questions like: "What exactly is a hole?" "Does any opening count?" And "why don't mathematicians know the difference between a straw and a doughnut?"
I've noted before that a "hole" isn't a thing. However you conceive of the concept, a hole can only be defined by what's around it. You can't just point to a random location in space and say "that's a hole." Or, well, you can, but people would look at you funny.
"Black holes" may be the only exception to this, but their name is more metaphorical.
Oh, and the branch of mathematics that doesn't know the difference between a straw and a donut (and a coffee mug, for that matter) is called topology, where all of those shapes are considered toroids: one hole going all the way through.
Topologically, we're all toroids (assuming we haven't been shot through recently). Most animal life on Earth is.
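For the curious, topologists actually have a way to count those through-holes: the Euler characteristic. For a closed surface built from vertices, edges, and faces, V - E + F = 2 - 2g, where g (the genus) is the number of holes. A minimal sketch, using the standard counts for a cube (topologically a sphere) and the Császár torus (the smallest triangulated donut); the function name is mine:

```python
def genus(vertices, edges, faces):
    """Number of through-holes (genus) of a closed orientable surface,
    via the Euler characteristic: chi = V - E + F = 2 - 2*g."""
    chi = vertices - edges + faces
    assert (2 - chi) % 2 == 0, "not a valid closed orientable surface"
    return (2 - chi) // 2

# A cube: 8 vertices, 12 edges, 6 faces. Topologically a sphere: no holes.
print(genus(8, 12, 6))   # 0
# The Csaszar torus: 7 vertices, 21 edges, 14 triangles. One hole, donut-style.
print(genus(7, 21, 14))  # 1
```

Which is why the golf ball, the baseball, and the Frisbee all come out at genus zero, and the straw, the donut, and the coffee mug all come out at genus one.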
Katie Steckles, a lecturer in mathematics at Manchester Metropolitan University in the U.K. and a freelance mathematics communicator, told Live Science that mathematicians "use the term 'hole' to mean one like the hole in a donut: one that goes all the way through a shape and out the other side."
Look, I don't care if you call it doughnut or donut. The former is more British; the latter is more US. Just do try to keep it consistent, and if you're quoting a Brit, use the former. Or do what I do, and say "bagel" instead.
But if you dig a "hole" at the beach, your aim is probably not to dig right through to the other side of the world.
Totally tried to do that when I was a kid. It's good to have goals.
Similarly, mathematical communicator James Arthur, who is based in the U.K., told Live Science that "in topology, a 'hole' is a through hole, that is you can put your finger through the object."
Um.
Phrasing?
And if you ask people how many holes a straw has you will get a range of different answers: one, two and even zero. This is a result of our colloquial understanding of what constitutes a hole.
Are... are you telling me language can be ambiguous? Say it ain't so!
In topology, objects can be grouped together by the number of holes they possess. For example, a topologist sees no difference between a golf ball, a baseball or even a Frisbee.
And I knew that, obviously, but it's also another excuse for people to grumble about "common sense," as if that were a thing that existed.
Armed with the topologists' definition of a hole, we can tackle the original question: How many holes does the human body have? Let's first try to list all the openings we have. The obvious ones are probably our mouths, our urethras (the ones we pee out of) and our anuses, as well as the openings in our nostrils and our ears. For some of us, there are also milk ducts in nipples and vaginas.
At least they addressed the 12-year-old directly and shut down its gigglesnorts with all kinds of formal medical words. Unfortunately, that sentence needs another comma near the end.
In total there are potentially millions of these openings in our bodies, but do they all count as holes?
This is a bit like asking if a tomato is a fruit or a vegetable, in that a scientist will give you a different answer than a sous-chef.
"They're not actually holes in the topological sense, as they don't go all the way through," Steckles said. "They're just blind pits."
Again. Phrasing.
A pair of underwear, for example, has three openings (one for the waist and one for each of the two legs), but it's not immediately clear how many holes a topologist would say it has.
And again, it probably depends on the sex and/or gender of the person wearing it. And here comes the 12-year-old, giggling again.
So the mathematician's answer is that humans have either seven or eight holes.
And my answer? None.
Think about it: how many holes does your house have? You can ignore the drafty cracks for my purposes; I'm talking about, like, windows and doors. Open one window: no holes in the topological sense. Open a window and a door: suddenly you have a topological hole. Open three windows, and you get the situation the article refers to with underpants. Might be complicated if you also consider water and sewer systems.
And your digestive tract is, also, usually closed at at least one end, like a door or a window in your house. So while you could, technically and topologically, thread a string from mouth to asshole (preferably not the other way around), in practice, we're usually pretty closed off, apart from respiratory functions.
So, again: it's all about how you look at it. And if you're 12, this shit is funny as hell.
I dunno about science, but I have some idea about what your downstairs neighbors would say.
You've tried everything to feel more awake in the mornings (caffeine, sunlight, water, stretching), but no matter what, you still feel groggy and unready to face the day.
Have you tried attuning your schedule to your chronotype, instead of trying to fit your chronotype into someone else's schedule?
Yeah, yeah, I know, few have the privilege of being able to do that. I certainly did not for most of my life.
There's one thing you probably haven't tried that's taking social media by storm: jumping.
If it's "taking social media by storm," a) I'd be the last to hear about it and b) I'd immediately distrust it, like I did the "walking backwards" fad from, what, a year ago? Two?
Now, even though I don't practice this these days, I can accept that some exercise is better than no exercise. I can also accept that, sometimes, you gotta try something new to break up your routine a bit.
Near as I can tell, if you don't live above someone else, or if you can do it (shudder) outside, there's nothing inherently wrong with this, and it doesn't make you look as dumb as walking backwards does.
And yet, I'd still shun it, simply because it's a trend.
The article goes on to list the "benefits" of this particular exercise. I won't rehash them here. Just assume I'm skeptical. Not in the denial sense, but in the "I'm not going to trust this one source" sense.
Who Should Skip the Jumping
This section header is the actual reason I saved this article. Skip? Jumping? I'm dying over here.
You might want to think twice about participating in this trend if you have a weak pelvic floor, significant knee, hip, ankle, or foot pain, Achilles tendinopathy, plantar fasciitis, recent sprains, a history of stress fractures, or balance issues, Wickham says.
I admit, though, that putting this here assuages some of my skepticism.
To get the most out of your jumps, jump 50 times in place at a rapid, consistent speed, making sure to drive through the balls of your feet and land softly on the balls of your feet.
If I tried that right now, I'd end up in the hospital.
Meanwhile, I'll continue my usual jumping exercises: the ones that lead me to conclusions.
The shape of time: In the 19th century, the linear idea of time became dominant, forever changing how those in the West experience the world
It would be horribly remiss of me if I didn't include this famous quote:
"People assume that time is a strict progression of cause to effect, but actually from a non-linear, non-subjective viewpoint - it's more like a big ball of wibbly-wobbly... timey-wimey... stuff." — The Doctor
"It's natural," says the Stanford Encyclopedia of Philosophy, "to think that time can be represented by a line." We imagine the past stretching in a line behind us, the future stretching in an unseen line ahead.
I have heard that there is a culture, perhaps in Australia, or maybe Papua New Guinea or South America (I don't recall), where they think of the future as behind them and the past as in front of them. This is, if I remember right, because you can "see" the past but you cannot "see" the future.
So, no, it's no more "natural" to "think that time can be represented by a line" than it is to think of time moving from left to right on a page, the way Westerners read language. Perhaps those who write from right to left see time progressing from right to left. Even writing is arguably not "natural."
However, this picture of time is not natural. Its roots stretch only to the 18th century, yet this notion has now entrenched itself so deeply in Western thought that it's difficult to imagine time as anything else.
Except, perhaps, as a big ball of wibbly-wobbly, timey-wimey stuff.
Let's journey back to ancient Greece. Amid rolls of papyrus and purplish figs, philosophers like Plato looked up into the night. His creation myth, Timaeus, connected time with the movements of celestial bodies. The god "brought into being" the sun, moon and other stars, for the "begetting of time". They trace circles in the sky, creating days, months, years.
While it seems to be true that Western culture borrows a lot from ancient Greece, there really were other cultures in the world. I'd think the whole "this started with ancient Greece" thing would have fallen out of favor by now. Guess not.
Such views of time are cyclical: time comprises a repeating cycle, as events occur, pass, and occur again.
I can kind of understand why people would think time is a cycle. As the article notes, things do seem to have cycles: day/night, moon, year, planet alignments, etc. But the idea that "events occur, pass, and occur again" just seems wrong to me. Even though there's, e.g., Groundhog Day every year, not every Groundhog Day is the same.
It's even hinted at in the Bible. For example, Ecclesiastes proclaims: "What has been will be again ... there is nothing new under the sun."
Of all the laughably wrong things in the Bible, "there is nothing new under the sun" might well be the most laughably wrong. Well, right up there with "we live on a flat Earth between two waters," anyway. Maybe also with "there was a global flood in human history."
And yet, like Greek ideas, it's part of culture. To quote the Battlestar Galactica remake: "All of this has happened before. All of this will happen again."
Importantly, medievals and early moderns didn't literally see cyclical time as a circle, or linear time as a line. Yet in the 19th-century world of frock coats, petticoats and suet puddings, change was afoot. Gradually, the linear model of time gained ground, and thinkers literally began drawing time as a line.
I believe it's important to note that, whether we conceive of time as cyclical, linear, wibbly-wobbly, or anything else we can come up with, this is a matter of perception, not reality. No one knows what time really is. People have guesses, and they'll tell you their guesses with great confidence, such as "time is like a river," but as we saw yesterday, some rivers are more rivery than others.
But no. Time is time. The only thing I can say with great confidence is what it's not: an illusion. It may well be an emergent property of something deeper, but then, so is the chair you're sitting in right now.
A crucial innovation lay in the invention of "timelines". As Daniel Rosenberg and Anthony Grafton detail in their coffee-table-gorgeous Cartographies of Time (2010), the "modern form" of the timeline, "with a single axis and a regular, measured distribution of dates", came into existence around the mid-18th century. In 1765, the scientist-philosopher Joseph Priestley, best known for co-discovering oxygen, invented what was arguably the world's first modern timeline.
What this brought to mind for me was the Periodic Table. Elements, like oxygen, exist with or without the Periodic Table, but Mendeleev's invention helped us visualize their relationships with each other, much like a timeline helps us visualize past events in relation to one another.
Rosenberg and Grafton describe A Chart of Biography as "path-breaking", a "watershed". "Within very few years, variations on Priestley's charts began to appear just about everywhere ... and, over the course of the 19th century, envisioning history in the form of a timeline became second nature." Priestley's influence was widespread. For example, William Playfair, the inventor of line graphs and bar charts, singled out Priestley's timeline as a predecessor of his own work.
I say this gives short shrift to Descartes, who basically invented graphs in the early 17th century. See, already I'm putting events on a timeline.
The second key development concerns evolution. During the early 19th century, scientists created linear and cyclical models of evolutionary processes. For example, the geologist Charles Lyell hypothesised that the evolution of species might track repeatable patterns upon Earth. This led to his memorable claim that, following a "cycle of climate", the "pterodactyle might flit again through the air." However, with the work of Charles Darwin, cyclical models faded. His On the Origin of Species (1859) conceives of evolution in linear terms. It literally includes diagrams depicting species' evolution over time using splaying, branching lines.
I think that once we realized that entropy only goes in one time direction, the old idea of cycles of time had to go right out the window.
Entropy and time are intertwined, and physics' best guess as to the nature of time right now is that its arrow is set by increasing entropy.
Now, I know some people mistakenly believe that evolution goes against entropy, but that discussion is outside the scope of this entry.
The last development stemmed from mathematics: theories of the fourth dimension. Humans perceive three spatial dimensions: length, width, and depth. But mathematicians have long theorised there were more. In the 1880s, the mathematician Charles Hinton popularised these ideas, and went further. He didn't just argue that space has a fourth dimension; he identified time with that dimension.
Now that was something I wasn't aware of. I knew the idea of "spacetime" preceded Einstein, but I don't think I'd ever heard of Hinton.
Nowadays, of course, mathematicians like to play with way more than four dimensions, and apparently, 10 are required for string theory (and if anything in science can be said to be "only a theory," it's string theory).
Within history, conceiving of time as a line helped to fuel the notion that humanity is making progress. Joseph Priestley, our timeline inventor, is partly responsible for this. The man once listed inventions that have made people happier, including flour mills, linen, clocks, and window glass.
Sadly, Priestley lived before sliced bread and the "Skip Intro" button.
Within philosophy, conceiving of time as a line led to thinkers debating the reality of the past and future.
Whereas I assert that only the past is real; the present is an illusion created by the very recent past, and the future doesn't even rise to the level of illusion, as it does not exist at all and won't until the past catches up.
But I acknowledge that this, too, is a matter of perception and point of view.
I've gone on long enough for today (see, I made a time reference there). The article also goes on for a while, but it's an interesting read. And an appropriate one for an outlet named aeon.
First, you have to define what you mean by "river," and that can be harder than it sounds. The dictionary definition (at least the first one I found) is: "a large natural stream of water flowing in a channel to the sea, a lake, or another such stream." (Oxford)
So you've got "large," which is a judgement call; "natural," which is fuzzy; "stream," which implies flowing, but lots of water flows and some rivers sometimes don't; and "water," which rules out, for example, the L.A. River (most of the time); "flowing," which I say is redundant after "stream;" and "channel," which seems straightforward enough until you consider that some rivers are braided and/or deltaed with multiple channels.
And then you have bodies like the Potomac River, which for much of its lower reaches, all the way up past DC, isn't so much a river as a tidal estuary that happens to be fed by a higher river.
Oh, but that's not all. Dictionary definitions don't cut it here in this blog. I can use them as examples, but they don't resolve arguments.
You know the old saying, "You can't step into the same river twice?" I think it's supposed to be about how things change over time. Water goes in, water flows out, evaporation happens, shores get eroded, sandbars form, megatons of soil get transported, etc. Thing is, rivers (and other streams) don't just change over time; they, like living bodies, are in a constant state of flux ("flux," incidentally, shares a root with "flow" and "fluid").
Consequently, I say you can't step into the same river once. Because between the time your foot touches the surface and the bottom, the river has already changed. Hell, the mere act of stepping into it changes it, however minimally.
So when you're asking a question like the one in the headline, you have to be careful.
Rivers may seem as old as the hills, but they have life cycles just like other natural features do.
Yeah. Like hills.
Some rivers last longer than others, however. So which river is the oldest in the world today?
Remember that a river isn't just its water. Sometimes, it's not even its water, but just its channel, such as the aforementioned L.A. River (which also stretches "natural" to its natural breaking point). Channels change over geological time, though, carved and altered by water flow and other processes such as continental drift.
The winner is older than the dinosaurs: The Finke River in Australia, or Larapinta in the Indigenous Arrernte language, is between 300 million and 400 million years old.
I'm certainly not going to argue about that, though. Australia is a remarkably stable continent (or island or whatever name you slap on the land mass). If I remember right, some of the oldest rocks in the world are also found there, presumably guarded by dangerous wildlife, but don't get me started on how they define how old a rock is.
The arid conditions in the center of the continent mean the river flows only intermittently; most of the year, it exists as a string of isolated water holes.
See?
There's a whole lot of semi-technical geological explanation for how they figured it out at the article. While I have some experience with geology, it was rather secondary to hydrology in my education, so I'm not going to quibble about it. It is interesting, at least to me. But no quotes here.
"Rivers can disappear if a massive influx of sediment overwhelms them (e.g., volcanic eruptions) or if topography changes so dramatically that the flowing water takes a new course across the landscape (e.g., glacial advance and retreat)," Ellen Wohl, a geologist at Colorado State University, told Live Science in an email.
Pretty sure there's more that can change or destroy a river.
In the case of the Finke, Australia has been an unusually stable landscape for a very long time. Resting in the middle of the Australian Plate, the continent has experienced virtually no significant tectonic activity for the past several hundred million years, Baker explained.
Like I said. Only with more detail.
If the Finke ever dries up, the runner-up may be the New River, which today is about 300 million years old, Baker said, and runs through Virginia, West Virginia and North Carolina.
And so we get to the final bit in the article, and the main reason I'm sharing this. Australia is on almost the opposite side of the world from me, and I've never been there, but the New River is practically in my backyard, globally speaking. I've known about its ancient age since college, when I took the aforementioned geology and hydrology courses.
Unlike most Virginia rivers, it doesn't flow into the Chesapeake Bay and thence into the Atlantic; instead, it's part of the Mississippi River basin. Which technically flows into the Atlantic, too, but via the Gulf of Mexico.
And unlike the Finke / Larapinta, the New River is always wet. And flowing. They take people whitewater rafting on it. Not me, obviously. But people.
In the interest of full disclosure, I should note this quote, which has cited sources, from the New River article on Wikipedia: "...a claim that the river is the second oldest in the world is disputed by the West Virginia Geological and Economic Survey and the National Park Service." It is, however, still damn old.
The irony, of course, is that it's called the New River, and that's what I find endlessly amusing.
Let me guess: 1) You're 2) getting 3) money 4) for 5) this.
Quality power tools are an investment, and if you take proper care of them, they'll last a long time.
It's been a while since I bought power tools, so I'm not even sure which brands can be trusted, these days.
But power tools have seen a lot of advancement in recent years. While your old warhorses might still perform their core function well enough, if your drills, saws, and other power tools are five years old or older, it's time to consider upgrading to a more modern version, for a range of reasons.
Seriously, this strikes me less as helpful advice and more as tool companies paying for an ad that looks like an article.
And, indeed, they mention some brands by name in the article. But let's see what they come up with:
Advances in battery technology
I suppose this is fair enough. But if you've purchased a battery-powered tool of any kind, hopefully you're aware that the battery isn't going to last forever, regardless. Such tools are going to need to be replaced sooner than corded ones, in general.
Improved ergonomics
This feels like a stretch.
Get it? Ergonomics? Stretch? No? No. I'll be here all week.
And yeah, it's looking more and more like a paid ad.
More powerful motors
Uh huh. If it was, and remains, adequate for what you need it for, are you just upgrading because you're a Manly Man Who Must Have More Power?
Better safety features
Seems to me that the best safety feature is familiarity (provided one doesn't get complacent).
Smart technology
Until it can do the job on its own, I'm not interested.
I really didn't have much else to say, today. Just that stealth advertising sucks.
For many in the business world, a return to work after the winter break will mean once again donning the dreaded suit and tie.
Pretty sure that's falling out of fashion, except for, like, lawyers.
The corporate neckwear is the everyday counterpart to the traditionally more luxurious cravat: a voluminous neckscarf that conjures up images of opulent dinners aboard a yacht sailing through the Mediterranean.
It does no such thing for me. But I do know that what we call "a tie" is called "une cravate" in French, and France has a Mediterranean coast, so... whatever; I don't really have a point here, unlike my ties.
Yes, I do own some.
President Abraham Lincoln wore cravats, as did Hollywood actor Cary Grant and the extravagant entertainer Liberace.
At least one, possibly all three, of those men were gay. Nothing wrong with that, of course, at least not from today's perspective; I'm just pointing out that it might be a factor.
In more recent times, the garment has been popularized in the American mainstream by the likes of Madonna and the late Diane Keaton.
Fashion has been moving toward more unisex styles, from what little I know of it. Nothing wrong with that, either.
In this installment of NPR's "Word of the Week" series we trace the origins of the "cravat" (borrowed from the French "cravate") back to the battlefields of 17th century Europe and explore its links to the modern day necktie, patented in New York more than 100 years ago.
That is, honestly, more recent than I thought modern neckties were.
"Scarves worn around the neck existed long before, but the story of the cravat truly begins in the Thirty Years' War when it first gained wider European recognition," explains Filip Hren...
As someone who has studied fighting skills, albeit briefly and without much enthusiasm, I've often wondered about that. Something worn tied around one's neck is a liability in a fight. Unless it's a fake, designed to throw the opponent off-guard when they grab it to strangle you, and it instead comes off in your hand, giving you at least a temporary advantage.
Hren is referring to the 1618-48 conflict fought between Catholics and Protestants and known as Europe's last religious war.
Heh. That's funny.
The word "cravate" first appeared in the French language to describe military attire worn by Croatian mercenaries who were renowned among their enemies for their brutal fighting prowess.
Looking like a fighter is at least half the battle. Not sure if Sun Tzu wrote that, but I believe it to be true.
Made of silk or cotton, the cloth is said to have been used to protect their faces against cold weather and smoke in battle, and to treat injuries.
For a while, neckwear existed with a practical purpose (for non-warriors): shirts didn't have top buttons, or had really bad top buttons, so they used ties (of various styles) to hold the collar closed for a cleaner, more formal look.
For fighters, I can only imagine that they could turn its inherent disadvantage into an advantage: "I can kick your ass even with this liability looped around my most vital body connection."
"The scarves took their names from Croats. It was tied in a Croatian manner, or, in French, 'a la Croate,'" explains Filip Hren.
And that, I didn't know until I read this article.
As an aside, Croats should not be confused with the Croatoan, a Native American tribe largely in what is now North Carolina.
King Louis XIV introduced the cravat into French fashion and from Paris it soon spread across Europe.
And who, in the history of the world, has had more impact on clothing fashion than the French? No one, I say.
They also kick military ass. Coincidence? I think not.
Over the years, the necktie has come to symbolize success, sophistication and status, but has also been criticized by some as a symbol of power, control and oppression.
I don't really understand fashion, but I am rather attuned to symbolism (pretty much have to be, as a writer).
Remaining unexplained, however, is the continued popularity of its cousin, the bowtie.
Not that self-promotion is inherently bad. But check how many times he (yes, I'm assuming gender) pushes his podcast, newsletter, book, etc.
This does not mean the content is bad, either.
Have you ever heard of the term "conflict entrepreneur"?
Until my conversation with Martin Carcasson, I hadn't heard it.
That's because someone made it up. All words and phrases are made up, of course, just some more recently than others. This particular one isn't catchy or short enough to ever catch on, the way other phrases like "concern troll" have.
I propose "strifemonger."
...the idea is simple: A conflict entrepreneur is someone who makes money and/or generates a large following by intentionally pitting people against each other.
And they have been around since long before the internet.
Unfortunately, conflict entrepreneurship is big business, and it's scary.
One of those things is opinion.
It's scary because it's easy to rile up people's sensitivities and emotions.
You take that back RIGHT NOW!
Perhaps most unsettling, it takes zero experience, financial backing, wisdom, or talent to become a successful conflict entrepreneur.
Eh, I don't know about that. You gotta want to do it, and have some efficacy at it, and what's that besides talent? And you can earn experience along the way.
We see example after example in popular media of people who make their living off of reducing complicated issues into black-and-white binaries, removing nuance from conversation in favor of parroted talking points, and stereotyping the many based off the actions of the few.
This is, I think, the important part.
Think about, for example, kiddy-diddlers. I know you don't want to think about kiddy-diddlers, but I'm making a point here. There's a meme (original sense of the word) going around that drag queens are bad and they shouldn't be around children because they'll diddle them. Whereas, here in reality, the vast, vast majority of kiddy-diddlers who aren't family (happens a lot) are fine, upstanding church or school leaders. And yet if ONE trans person got caught diddling a kid, they'd say it's because they're trans; while the fine, upstanding church or school leaders who diddle kids are "mentally ill" and "don't reflect the values of the group."
In other words, if someone in the in-group does something bad, it's their fault (or we ignore it, as has been the case lately). If someone in the out-group does something bad, it's the entire out-group that's at fault.
To a conflict entrepreneur, your anger and your discontent are their supply. Your desire to withdraw into a tribe and demonize anyone outside of it is the capital a conflict entrepreneur needs to continue to build their empire.
Like I said.
Our anger sustains them. Our frustration feeds them. We're raging all over the internet, and they're sitting there chuckling.
Curious questions stop them in their tracks.
Okay, first of all, no; second, first you'd have to find and identify them.
This process of asking yourself questions, asking questions about others, and asking questions of others is at the heart of the...
...thing he's self-promoting.
As usual, I'm not avoiding talking about something in here just because someone's trying to sell a book. We're mostly writers and readers here, with many interested in selling their books and many more (hopefully) interested in reading them. And I think the basic points here are sound: that strifemongers exist, that they're manipulating people for fun and profit, and there are ways to aikido the hell out of them.
Now if I could just remember this the next time someone posts something deliberately inflammatory.