About This Author
Come closer.
Complex Numbers
A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.
The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.
Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.
Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
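(If you want to see what I mean, here's a quick Python sketch of my own; nothing official, and the function name and the grid are just my choices for illustration. Iterate the dead-simple rule z → z·z + c, and the points of the complex plane whose iterates never fly off to infinity trace out the Mandelbrot set, intricate boundary and all.)

    # Toy sketch: iterate z -> z*z + c over a grid of complex numbers c.
    # Points whose iterates stay bounded belong to the Mandelbrot set.

    def escape_time(c: complex, max_iter: int = 50) -> int:
        """Return how many iterations |z| stays bounded, up to max_iter."""
        z = 0 + 0j
        for n in range(max_iter):
            z = z * z + c
            if abs(z) > 2:  # once |z| exceeds 2, the iterates are guaranteed to escape
                return n
        return max_iter

    # Crude text rendering: '#' marks points that never escaped.
    for y in range(10, -11, -1):
        row = ""
        for x in range(-40, 21):
            row += "#" if escape_time(complex(x / 20, y / 10)) == 50 else "."
        print(row)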
January 26, 2025 at 9:08am
My weekly roll of the dice landed, this time, on an entry the significance of which I could not have known at the time: "Unwritten, Expanded"
Written on December 14, 2019, it's just over five years old.
I'm back! Actually been back a while, but it's taken me this long to recover enough from my trip to concentrate on a blog entry.
The entry was my first in over a week. The significance is that it was the last time I wrote anything like that, because it's the entry that began my current >5 year daily blogging streak. It was also only ten days after the subject of my last "Revisited" post. Random numbers do things we don't expect, like clustering.
As for the actual subject of the entry, of course it's a riff on an article, one which still exists. The article is about trends; as I said, this was over five years ago, before covfefe, and before Twitter became Xitter (pronounced "shitter"), so it's likely everything in it is out of date.
Article headline: 'Use at least one Emoji per text'
Me: I, and the people I text with regularly (a select few), use emojis sparingly, if at all. We also use complete words, spelled correctly (absent the inevitable occasional typo), and proper punctuation.
There have been arguments about whether one should use periods at the end of a text message. My answer: Yes, absolutely, if you're texting a complete sentence. Younger folks say it makes you seem "curt" or whatever. You know what's curt? Fucking emojis. Younger folks aren't always right, and they don't get to make rules. That's the job of us oldsters.
And here's a rule I made that I'd like to see enforced: no more vertically-oriented video. Ever. If I see a vertical video, I ignore it. I don't care if it's a warning about an incoming tsunami or something cute your cat did. (One exception is rocket launches, where it makes sense. Another is video of a CEO jumping out of a skyscraper.)
Me: If someone has my phone number, I expect they might call at some point. That's okay. As long as they don't expect me to answer, especially if they're not on my contact list. Any unidentified number is a health insurance telemarketer, as far as I'm concerned.
I'm sure I mentioned this somewhere, but there was a period when I was getting dozens of calls per day from telemarketers. Maybe I put my phone number somewhere I shouldn't, but don't blame the victim (me). That's reduced now to maybe once a week, but I still get scam calls of different sorts 2-3 times a day, on average. Consequently, I don't answer my phone unless I know the number or am expecting a call from, like, a doctor or whatever.
Me: So... wait... we're supposed to, on the one hand, condense our texts into emojis and "u" for "you" and acronyms such as IDK or LOL or whatevs, but you're going to get pissed because someone typed "K" for "OK?"
That wasn't the only "rule" I found to be stupid and inconsistent.
The entry continues, with me mostly getting more and more agitated about these arbitrary and changeable "rules," but I don't have any further comments on what I wrote back then. I'll just add one of my own about punctuation in text messages:
Except in truly extreme circumstances, one ? or ! (not "and;" "or") is more than sufficient. Also, if you end each of your texts with ... I will end up ignoring you. I actually don't care if you end a text with a period/full stop or not, but don't give me shit about it when I do.
So, in conclusion, the rules might have changed since then—I neither know nor care—but one thing hasn't: my lack of patience with arbitrary "rules." Except, of course, the ones I make up.
January 25, 2025 at 7:30am
I'm used to stepping into puddles with indeterminate depths, but this one's going to test the integrity of my hip-waders. From The Conversation, and also from last month, hence the theme:
I have a vivid memory of the moment I realised Santa didn't exist. I was around six years old, it was the height of summer, and I was sitting on the step outside our back door, thinking about God. The existence of God, back then, was something that annoyed me: it meant that every Sunday, we had to go to church.
Then I realised: there isn't actually any evidence God exists. I only think God exists, because this is something people have told me.
I'll be honest here: that was way younger than I was when I had that particular epiphany. Well, it wasn't an epiphany for me, and not just because the word itself has religious connotations. No, it was more a gradual realization in my case.
I remember bounding up, excited, ready to share with my family this wonderful news. No longer would we be forced to endure the drudgery of weekly Sunday schools and sermons. But then I remember checking myself and thinking, oh no. If God doesn't exist, by the same logic, Santa must be made up as well.
I want to doubt that a six-year-old could leap to such logical conclusions, but people's minds work differently. This author went on to be, as the headline notes, a philosopher. I did not. I have too much of a sense of humor to be a philosopher. You know what we call a philosopher with a sense of humor? A comedian. Pays better, though that isn't saying much.
Perhaps this was the moment I became a philosopher (though I should say that as an adult, I no longer think that the analogy between God and Santa really holds).
You know what the difference is? In most cases, I think, parents know that Santa is mythology, while remaining somewhat certain about the existence of God.
Here's a philosophical question: Telling someone something that isn't true is called lying (again, as noted in the article's headline). But isn't there a difference between telling them something that you know isn't true, and telling them something that you believe is true, but isn't?
Like, if I'm absolutely convinced that it's about to rain. Clouds are gathering on the horizon, and there's that smell in the air and maybe the wind is picking up. So I turn to my imaginary friend and say, "It's about to pour down rain here." But over the next hour or so, the clouds move off to the north (or whatever), leaving the area where I'm standing completely dry. Did I lie? Obviously, I was wrong (hey, it happens, however rarely). But was it a lie when all signs supported my conclusion?
But now the tables have turned. Now I am a parent of young children, and I am the one enforcing hegemonic myths about Santa.
Ah, yes, yet another difficult decision avoided by me not wanting or having offspring.
We all do it, of course.
No, "we" don't. Even if he meant "we parents," not all parents do it.
Our culture expects parents, basically, to lie to our children that their presents were left by a jolly fat man who flies in a sleigh pulled by reindeer through the sky.
Hell, even my parents did that, and they weren't Christians. Santa's not known for leaving Hanukkah presents. That's jolly old Judah Maccabee.
We all surely want our children to grow up to be honest people. Shouldn't we set a good example, as far as possible, by telling them the truth?
Except that, sometimes, honesty doesn't serve us very well. For instance, when someone asks us, "Does this dress make my ass look fat?" we don't necessarily want to tell the bare, unvarnished truth.
Here's the thing about lying: it demonstrates empathy. To tell a deliberate lie requires enough knowledge about the other person's mental configuration to be able to tell them what they want to hear. As with all of our superpowers, this can be used for good or evil. But someone who "cannot tell a lie," like the mythological version of George Washington, might very well lack empathy.
Amusingly, and possibly even ironically, the "cannot tell a lie" cherry-tree incident is itself a made-up story. A myth. A lie.
To which I would say: well, no. We shouldn't be honest about Santa – at least not at first. It is morally OK, to the point of being actively morally good, for parents to participate in the grand Santa lie.
And the rest of the article explains why this particular author believes this.
If I felt compelled to tell my children everything, I would pull no punches in relating the wretched state of the world, of existence, of my still-deepening resignation that nothing positive can be done about it. I would inflict the full brunt of my money worries, my health concerns, my (mostly irrational) worries about them.
Oh, joy! More existential conundrums that I neatly avoided by means of a simple medical procedure.
Now, here's the fun part:
Just two days before this piece was published, the same outlet posted an article by a different philosopher (also from the UK, though Scotland, not England): "Why you shouldn't lie to your children about Father Christmas, according to philosophers"
I'll just note that in the UK, "Santa Claus" and "Father Christmas" are essentially the same figure. The history there is kinda cool; basically, Santa was a US thing after we divorced from the UK, but then the name went back across the pond and, eventually, the stories merged.
Point is, these articles, together, present two sides of the argument. That might leave parents even more confused than they were before they read the articles, but it just leaves me with a certain sense of smugness for never having to make that decision.
Well, there was the one time I was at a friend's house and accidentally dropped the truth bomb about Santa Claus within earshot of their six-year-old (or thereabouts). I haven't been invited back since. That then-six-year-old would be in college now. I wonder if she's studying philosophy.
January 24, 2025 at 8:48am
You are being watched. Every move. Every breath. Whatever the hell else Sting sang about in that stalker song. From The Conversation:
From self-service checkouts to public streets to stadiums – surveillance technology is everywhere.
I'll note that the authors are Australian (hence the Commonwealth alternate spellings), but I'm pretty sure this applies to all developed countries, including the US.
This pervasive monitoring is often justified in the name of safety and security.
It's well-known that all-around awesome dude Ben Franklin once wrote, "Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety." Well, it turns out that the quote has lost its original context, and as smart as Ben was, I'm pretty sure he didn't envision the surveillance technology of 250 years in his future.
I'm just pointing that out, because different context or not, it has relevance.
Surveillance isn't just changing our behaviour – it's altering how our brains process information, operating largely outside our awareness.
I'd be interested to know whether this is different depending on religion (or lack thereof). Does it matter if the person involved believed that God (or whatever) was watching every single move they made?
Humans have evolved the crucial ability to detect another person's gaze to navigate social situations. This allows us to discern friend from foe, interpret emotions and understand intentions.
I'm going to tamp down my knee-jerk reaction to that, as it edges into evolutionary psychology (aka guesswork and bullshit). I expect it's true as far as it goes, but the trait is hardly unique to humans.
Surveillance may be subtly amplifying this ancient survival mechanism, placing our brains on high alert for social cues.
I know that, for a while now, I've simply assumed that everywhere I am, everywhere I go (including at home, in the shower, whatever), someone is watching. Why they'd want to see me take a shit is beyond me, but I'm not kink-shaming here.
A total of 54 people participated in our study – all of whom were undergraduate students. They performed a visual task while being monitored by CCTV cameras. A control group performed the same task without surveillance.
And this is where I normally close the laptop screen. That sample size is crap, the demographics of it are questionable, and any result from such an experiment should include the caveat "needs to be replicated with a larger, more diverse sample." I'm assuming the undergrads just happened to be volunteers at University of Technology Sydney who wanted some beer money. I'd stake money on that assumption. Real money, not roobucks or whatever they use for currency down there.
The article goes into more detail on the methodology used, which I don't have a strong opinion about.
This seemingly subtle shift in perception may have profound consequences.
"May have?"
Our findings are especially timely given recent pronouncements by tech industry leaders for more surveillance. For example, Larry Ellison, the world's fifth richest man and CEO of computer technology company Oracle, has pitched his vision for an always-on, AI-powered surveillance state.
I do have concerns about combining so-called AI tech with universal surveillance. It's one thing to know my every move and breath is being recorded. It's another thing entirely to know that there's a supercomputer that can flip through my (and everyone else's) surveillance data and spit out a pattern in seconds.
My criticism of the methods aside, it does seem like an idea worth pursuing further, with larger studies. But there's a potential problem to be resolved: how do you convince the control group that they're not being watched, during a study that's specifically about people being watched? Or maybe there's a better way I could put it, but I can't think of one right now. Basically, if you know you're in a study, you can assume you're being monitored, whether they tell you or not.
Because someone is watching. Always. Without exception.
January 23, 2025 at 9:52am
Ah, yes, bacon. The candy of the meat world. I never thought I'd post a link to Martha Stewart, but that's the kind of universe we're living in now, I guess.
Remember a week ago, when I did an entry on lab-grown meats? This one: "Meat Me In the Lab".
One thing I'd meant to mention there, but forgot, is the most important question I had about the technology: Can it create bacon? Or would it be limited to the same kind of fakon that gave us turkey strips and the like? Not that turkey strips are bad, mind you; I enjoy them. They're just not bacon.
Until such tech reaches the masses, though, people will still cook bacon, and some of them will cook it badly.
There are few foods more satisfying than a strip of perfectly cooked bacon, whether served with pancakes for breakfast, in a BLT at lunch, or in a dinner pasta.
Or, you know, on its own as a snack.
Cooking bacon isn't as simple as it seems, as proven by the burnt (or gummy) strips many of us have eaten in the past.
This is complicated by the annoying tendency of different people to have different tastes and preferences. For my own purposes, I like a very crisp bacon strip, one where there's no chewy fat whatsoever. Hold it up horizontally by one end, and it should have the structural integrity to remain horizontal along its entire length. I also have no use for the fat drippings; I'll cook the eggs in butter, thanks. Unfortunately, not everyone has the same bacon requirements, so any article (or blog post by me) purporting to know The Best Way to cook bacon isn't going to be universal.
So when I say "If you're cooking it in a pan, you're already Doing It Wrong," that's a proclamation with the caveat of "if you're cooking it for me," which you're not.
Many recipes start with preheating a pan, but the method for cooking bacon is different. It's recommended to start with a cold pan, which gives the bacon fat time to render (release)...
And see, you're already Doing It Wrong.
It can be tempting to cook bacon straight from the fridge, especially when you're in a rush, but you might want to let it rest at room temperature first...
Yes, everyone surely has the time and planning foresight to do this.
Packing as much bacon as you can fit into the pan might seem like a win because you'll have cooked more bacon, but it's a common bacon cooking mistake.
Raw bacon takes up approximately 5 times the surface area of cooked bacon (at least if you cook it right). How a pan can get overcrowded with that, assuming you're making the mistake of cooking it in a pan in the first place, is beyond me. Unless you're trying to stack it, in which case you should know better.
Whether you're feeding a crowd or meal prepping, not using the oven to cook bacon is a major mistake.
And finally, we get to "don't cook it in a pan at all." Detailed instructions follow.
What the article leaves out is my lazy method to perfect bacon every time:
1. Place three sheets of absorbent paper towels on a large plate.
2. Lay up to five bacon strips on top.
3. Cover with three more sheets of paper towels.
4. Microwave for however long it takes for the towels to soak up all the nasty grease and the bacon to be cooked. (Did I mention the plate needs to be nuke-safe? Well, it does.)
While I generally object to microwaving meat of any kind, bacon is an exception. This method creates perfect bacon strips (by my definition) with minimal clean-up and in less than five minutes.
I'm sure someone read all that and cried "Heresy! Blasphemy! Abomination!" Well, I'll just be over here going to Hell while enjoying my perfectly-cooked microwaved bacon strips.
January 22, 2025 at 10:29am
The article I'm playing with today is quite old by internet standards, dating all the way back to 2013. It's likely that a few things have changed since then—but not human nature.
Funny thing about "deserved:" I, for one, would hate to have fame. A bit more reach for my writing, maybe, but I don't want the spotlight or red carpet or cameras in my face or any of that crap. If I became famous, and someone told me I "deserved" it, I'd ask what sin I committed to make me deserve that kind of nonsense.
We humans are storytelling and story-finding machines: homo narrativus, if you will.
I most certainly will not. Not that I disagree with the storytelling bit. It's one reason I'm here. Just stop with the faux binomials already.
In our everyday, human stories, far away from science, we have a limited (if generous) capacity to entertain randomness—we are certainly not homo probabilisticus.
I can only assume this author is a specimen of homo annoyingcuss.
We also instinctively build our stories around individuals.
Yeah, that's, like, so basic you have to know it before even getting accepted into Writing 101.
Both stories tether the complex, stochastic narrative of the larger population to that of an individual. We can't blame the Times here: This kind of narrative works. We can put ourselves into that person's mind, walk in their shoes, and travel in their story.
That's a lot of words to rephrase the old saying that a thousand deaths are a statistic, while one death is a tragedy.
I'm not going to quote a lot more, here. The basic argument seems to be that fame is due more to the characteristics of the people who make someone or something famous, rather than some intrinsic quality of the famous person or thing.
And the author (who, as far as I know, isn't famous) lays out a decent case for that, but it gets kinda long and maybe even a bit mathy. Mathive? Yeah, I'm going to go with mathive, as in it was a mathive pain in the butt to read all that.
So just one more, then:
The data implies that there is no such thing as fate, only the story of fate. This idea is encoded in the etymology of the word: "fate" derives from the Latin fatus, meaning "spoken"—talk that is done—in direct opposition to the root of "fame," which is fāma, meaning "talk."
Retreating to etymology when in doubt about a concept is a trick of mine, too. But I like to think that when I do it, it makes more sense than trying to tease out the difference between "spoken" and "talk." Even if the difference is one of past vs. present continuous tenses (which that quote seems to imply), that's hardly "direct opposition."
Of course, it may be that I'm missing something.
But you know what I'm not missing? Being famous.
January 21, 2025 at 9:36am
You know how cats like to bat random things around? Well, that's a metaphor for me with articles like this ancient one (from Inc.):
And right there in the headline should have been my clue to click away: "to literally get smarter." I'll even let the split infinitive pass; that's a stupid rule anyway. But it's bad enough that "literally" has come to also mean "figuratively"; that's not even what it's doing in the headline. It's an unnecessary intensifier, and the headline would be stronger without it.
Studies repeatedly show that you can make yourself smarter.
Yeah, I'mma need a citation or three for that.
High intelligence is nothing more than a great ability for pattern recognition and problem solving, all which is trainable.
"Nothing more than?" Seriously?
1. Keep intelligent company
You might've noticed that people of high intelligence often group together and this is because they want to discuss a broad range of topics freely without objections from their companions.
Oh. Sure. That's gotta be the only reason.
2. Read
You don't need to stick to self-help books or dry, scientific tomes either. Books like Lord of The Rings, Oliver Twist, and even Pride and Prejudice can improve your mind and impart life lessons.
What? No! I draw the line at Austen.
3. Rest
Hey, now we're on the right track!
The lack of sleep, relaxation, and excessive stress can diminish your brain's capacity.
Yeah, and stressing about it makes you lack even more sleep.
4. Eat brain food
I wouldn't trust this section.
5. Play brain games
Once the brain realizes it's good at something, it stops trying, just like any one of us.
Um... duh. Because we all have brains (some maybe more than others).
6. Keep a journal
What do Einstein, Isaac Newton and Thomas Jefferson have in common? They were all diary keepers.
They had other things in common, the most obvious probably being that they were all men. There were also plenty of diary keepers who never made it into history. Correlation isn't causation.
Anyway, yeah, it was probably beneath me to snark on this. There's plenty of fluff in my writing, too, and I may be the literal pot calling the figurative kettle metaphorically black.
But I couldn't let that use of "literally" in the headline slide.
January 20, 2025 at 9:32am
This article from Discover isn't very old, and it's about one of the most important international cultural figures.
Eh, if it'd been that urgent, we would have taken better care of the Earth for the past 70 years.
The 2024 Nobel Peace Prize has been awarded to Nihon Hidankyo, the Japan Confederation of A- and H-bomb Sufferers Organizations.
The Peace Prize is a joke and has been since it got awarded to a terrorist back in the 1970s. While last year's doesn't stoop to that level, I don't know how many have been awarded to an organization instead of an individual.
Fortunately, this article isn't actually about spurious awards; that was just what passes for a human-interest lede.
Around the same time that Nihon Hidankyo was formed, Japan produced another warning: a towering monster who topples Tokyo with blasts of irradiated breath. The 1954 film "Godzilla" launched a franchise that has been warning viewers to take better care of the Earth for the past 70 years.
A couple years back, a remastered Godzilla (more properly Gojira, from what I understand) got a screening in my local theater, in all its 1950s black-and-white glory. I canceled actual social plans to go see it, because it's Godzilla. I don't think I'd ever seen the entire movie start-to-finish until that night, and, from what I've been told, the version released in the US was severely edited from the original.
Point is, even if I had seen the whole film, it might have been the idiot version; also, I would have been a kid, and the whole metaphor would have been lost on Kid Me.
In our view, these films convey a vital message about Earth's creeping environmental catastrophe.
Godzilla is as sudden and relentless as a nuclear weapon. The thing about a creeping environmental catastrophe is that it's, well, creeping. There's a parable about a frog in a kettle as the water slowly comes to a boil; it's inaccurate as biology, but still a decent metaphor for living in a crumbling ecosphere.
Superman's famous origin story comes to mind. In brief, Jor-El figured out that the planet Krypton was in imminent danger (details vary among retellings), but no one listened to him, so he sent his kid to Earth to become a god. When I was a kid, I was like "But why didn't they listen to Jor-El?" Well, now I know.
"Godzilla" is full of deep social debates, complex characters, and cutting-edge special effects for its time. Much of the film involves characters discussing their responsibilities – to each other, to society, and to the environment.
I've said that while American movies are about bad guys getting the shit kicked out of them by the good guys, Japanese movies are about honor. This tracks here, as honor is tied to responsibility. Another way to look at it is that American movies are about rights, while Japanese ones are about responsibilities. It's obviously not that simplistic in reality, and you can find all kinds of exceptions. (Meanwhile, French movies are about managing to have sex.)
Anyway, the article goes into how Gojira evolved from villain (sort of) to hero (sort of), and some of the other highlights and lowlights of the movie series. It pays special attention to the metaphors, but, really, there's nothing wrong with enjoying a good kaiju fight movie.
With a purposeful grimace and a terrible sound
He pulls the spitting high tension wires down
Helpless people on a subway train
Scream bug-eyed as he looks in on them
He picks up a bus and he throws it back down
As he wades through the buildings toward the center of town
January 19, 2025 at 9:25am
Our delve into the past today takes us back to just before the beginning of my still-going blogging streak, early December of 2019, when I saved this entry: "Science Is Almost Always Wrong, And That's A Good Thing"
Other than noting that it's maybe—just maybe—the longest entry title I've ever typed, there's not a lot to say about the entry itself. The article featured is from Quanta, and it's still available.
There are a few things I see now from a different, older, wiser perspective:
I mean, seriously, how hard is it to get the damn science right? I personally know a guy who works as a science consultant to certain TV shows, and the problem is they don't listen to him.
Turns out it's kinda hard to "get the damn science right." Sometimes the necessities of plot just get in the way.
That's going to be my new curse. "Hey, Waltz, someone keyed your car." "Galileo's balls!"
Unsurprisingly, I promptly forgot about that expletive, and I don't think I ever actually used it in speech or writing.
What that entry is, then, is probably a good representation of my postings here, but that's about all.
January 18, 2025 at 8:57am
People like a happy-ending success story, I hear. This one's from Nautilus, and dated just last month:
When the two Voyager probes launched into space in 1977, they were headed to uncharted territory. It was the first time humanity had sent robot spacecraft to study up close the four giant outer planets of our solar system: Jupiter, Saturn, Uranus, and Neptune.
I'll pause a moment for everyone to get the Uranus jokes out of their system.
...
Okay? Good.
Let me then emphasize that they did all this with 1970s technology. Personal computers hadn't even been invented yet.
Once the Voyagers' tour of the four planets was complete in 1990, the world's attention faded; but the probes continued to provide remarkable insights into the dynamics of the solar system, including ultraviolet sources among the stars and the boundary between the sun's influence and interstellar space.
As I said above: success.
More than 45 years after they first launched, the Voyagers are now NASA's longest-lived mission and the most distant human-made objects from the Earth—but they will one day soon go offline and drift silently into the final frontier, perhaps for eternity.
Well, maybe one of them will gain sentience and surprise future starship captains.
For McNutt, it's a "pleasant surprise" that the Voyagers are still working after all these years: "I joke with people: If you go back and look at the original papers, the Voyagers were designed to work for four and a half years," he says. "We've outlived the warranty by a factor of 10."
It seems like a lot of NASA missions went on long past their expiration dates. I suppose that balances out the few that fail early in the mission.
Even when the Voyagers can no longer communicate with Earth, it will not be the end of their mission. Both probes bear the famous 12-inch "golden record" of the sounds of Earth, greetings in more than 50 languages, music by Mozart and Chuck Berry, and a star map showing how to get here.
Kid Me thought that was a bad idea back then, and I still think it's a bad idea. It's like setting your home address in GPS: someone steals the GPS, knows where you live and that you're not home. Only in this case they'll know where we live and it doesn't matter if we're home.
But whatever. The chances of meeting other technological beings in this galaxy are minuscule at best, and if it happens, it'll be a long, long time from now, and that'll be the last remaining shred of human artifice.
There are no happy endings, you see. There are only writers who decide to stop the story early.
January 17, 2025 at 11:03am
I tried to avoid becoming a superconductor, but I just could not resist.
They pun; I pun.
To the uninitiated, electricity might seem like a sort of hidden magic.
Any sufficiently advanced technology...
It plays by laws of physics we can't necessarily perceive with our eyes.
Well, then, it clearly doesn't exist.
Anyone who has ever lived through a power outage knows how inconvenient it is.
Yes, which is why I got myself a whole-house backup power generator.
"If I lose electricity, I lose telecommunications. I lose the financial sector. I lose water treatment. I can't milk the cows. I can't refrigerate food," says Mark Petri...
Glad he's there to give us the dish.
The universe as we know it is governed by four fundamental forces: the strong nuclear force (which holds subatomic particles together inside atoms), the weak nuclear force (which guides some types of radioactivity), gravity, and electromagnetism (which governs the intrinsically linked concepts of electricity and magnetism).
The weak nuclear force is always described in pop science articles (and this one's from actual Popular Science) as being related to "some types of radioactivity" or "radioactive decay." That's all a lot of people know, which is already more than most people know. I looked more deeply into this a while back and gave up. Also, apparently, they didn't see fit to explain gravity, which is really the most important force for our day-to-day lives, as without it, we wouldn't have lives.
Some scientists and engineers think of electricity as a bit like water streaming through a pipe.
I'm the first to admit that, despite being an engineer, and despite articles like this one, I still don't really understand electricity. Might as well be sorcery. I'm not that kind of engineer. But what I do understand is water flow through pipes, and, yes, the equations have similarities. You know what else is similar to water flow? Traffic flow.
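(Since I brought up the similar equations: here's a tiny Python sketch of my own, with numbers I made up, just to show that Ohm's law and the laminar pipe-flow law have the same shape. In both cases, flow equals a driving difference divided by a resistance.)

    # Ohm's law: current = voltage / resistance
    def current_amps(voltage_v: float, resistance_ohm: float) -> float:
        return voltage_v / resistance_ohm

    # Laminar pipe flow, Hagen-Poiseuille form: flow = pressure drop / hydraulic resistance
    def flow_m3_per_s(pressure_drop_pa: float, hydraulic_resistance: float) -> float:
        return pressure_drop_pa / hydraulic_resistance

    print(current_amps(120.0, 60.0))      # 2.0 amps through a 60-ohm load on 120 volts
    print(flow_m3_per_s(2000.0, 1.0e6))   # 0.002 cubic meters per second through a narrow pipe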
Anyway, the article does manage to keep things simple. There's not even any math. I don't know how useful it is. But I couldn't let that pun in the subhead go without trying to top it.
January 16, 2025 at 9:13am
Today's article possesses the unfortunate combination of being long, four years old, and from a source (Town & Country) that I don't usually link.
Fake Steak, Well Done
Science is promising us steak that's heart-healthy, eco-friendly, and still decadent. But will we eat filet mignon from a bioreactor?
I don't know about "we," but I absolutely would—assuming it's comparable in taste and texture to actual dead cow. When it comes to eating meat, I don't actually enjoy the idea of killing animals. It's just that the delicious taste of it is more than enough for me to overcome any moral objections.
See? I have principles. They're just selfish ones.
Now, like I said, this is a long article, so I'm only going to include a few highlights.
Aleph is one among an expanding field of companies racing to bring to market what they would rather not be called "lab-grown meat" (they prefer "cultivated" or "slaughter-free").
What you call something matters. For instance, they could have marketed GMOs better as, I don't know, Power Plants or something. Instead, the freaks and astroturfers got a hold of the idea of calling it "frankenfood" and that was so catchy that it caused people to actually believe that there's something wrong with GMOs.
Though the technology did not exist even just a few years ago, today at least 33 startups in 12 countries are producing a variety of meats—from dog food to foie gras, pork to duck, chicken nuggets to beef patties. Some are promising cultivated meat in stores next year.
As far as I've heard, cultivated meat didn't, in fact, make it to stores that next year, which would have been 2022. Nowhere have I found anything that indicates that the technology has been scaled up for mass production yet.
Thanks to our palates, Americans don't generally eat bats, the animals most widely suspected of harboring SARS-CoV-2's precursor, but two other potentially fatal viruses, the influenza strains H1N1 and H5N1, have come from poultry and livestock in recent years—suggesting that more are on the way.
On the other hoof, this sentence turned out to be prescient.
And if pandemics aren't enough to convince people, maybe antibiotic resistance is. Cattle producers discovered some time ago that giving their animals antibiotics to head off any possibility of bacterial infection also causes even healthy cattle to grow faster.
I will give them points for this: every other time I've seen cattle antibiotics mentioned anywhere, the phrase used is "pump them full of antibiotics." It's long past being a cliché, and I'm glad the article avoids this overworn, overused, tired phrase that nevertheless attempts to sensationalize the practice.
Overuse of antibiotics has accelerated the evolution of bacteria that can resist them, and now around 700,000 people all around the world die every year from what should be treatable infections.
Of all the things science should have seen coming, this is right there at the top of the list. Or maybe they did see it coming, but figured we'd just make better antibiotics. I don't know. But it stems from a basic principle of evolution: the organisms that have resistance to pressures can go on to pass that resistance on to their offspring.
Look at it this way: suppose that, every once in a while, a bulletproof deer is born. Normally, the bulletproof trait doesn't breed true. But after you send hunters into the woods with the mandate to reduce the deer population, the bulletproof deer won't be leaving with them. No, it'll stay in the woods with a few lucky stragglers. Eventually, another bulletproof deer is produced. Produce enough of them, and let them breed together, and within a few years, you have an entire population of bulletproof deer who proceed to take over the forest.
The only thing left to do, then, is nuke the site from orbit, which has the unfortunate side effects of a) destroying the entire ecosystem and b) accelerating the rate of genetic mutation in the nearby populations.
I was at the dentist yesterday, which meant I was subjected to television ads. They've only gotten worse since I last saw them. But I digress. One of them was, appropriately enough for the location, for mouthwash. "Kills 99% of germs that cause bad breath!" the commercial proclaimed.
Okay, yeah, sure. Even if that's true (which it's not, except maybe in a petri dish or something), that other 1%? Yeah, those will go on to reproduce into an entire population of Listerine-resistant bad-breath germs. Result: in 20-200 years or so, everyone will have bad breath and there won't be anything we can do about it except maybe nuke the site from orbit.
All of which is to say that bacteria evolving antibiotic resistance through selection pressures was entirely and completely predictable, even by non-scientists with some knowledge of science.
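(And if you want to watch the prediction play out, here's a toy Python simulation of my own; the numbers are invented purely for illustration. Each round, the treatment kills 99% of the susceptible bugs, spares the resistant ones, and then everything regrows.)

    # Toy selection model: antibiotics kill 99% of susceptible microbes,
    # spare the resistant ones, then both groups regrow before the next round.
    susceptible, resistant = 1_000_000, 10  # invented starting populations

    for round_number in range(1, 6):
        susceptible = int(susceptible * 0.01)  # treatment wipes out 99% of the susceptible
        for _ in range(4):                     # regrowth: a few doublings for both groups
            susceptible *= 2
            resistant *= 2
        total = susceptible + resistant
        print(f"after round {round_number}: {resistant / total:.1%} of the population is resistant")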
And that's a major digression, so I won't be quoting any more from the article. My only excuse is that this sort of thing won't apply to lab meats (or whatever marketing name they settle on).
As I said, I would absolutely try this sort of thing. Hopefully, it'll be better for the environment. I'm sure there are people, and some are mentioned in the article, who will oppose it on principle. Maybe because they're trying to promote their own meat alternatives. Whatever.
But I want to throw a hypothetical out there. If you can do this with pig and cow and chicken... what, besides the ew factor, would stop them from creating lab-grown human meat? There's already talk about 3-d printing human organs for replacement, and no one seems to have ethical issues with that.
I guarantee you, I'm not the only one who's had that thought.
January 15, 2025 at 9:21am
The link I'm sharing today from Mental Floss is over 10 years old, published on an April Fools' Day, and really, really short. Which is fine, because I don't have a lot of time before I have to leave for an appointment.
I saw the headline and thought, "Is there some reason besides pranks being jokes put into practice? What have I been getting wrong all my life now?"
Turns out, nothing. Well, except maybe overestimating the linguistic intelligence of Anglophones.
Every year on April Fools Day, you might find yourself the victim of a practical joke or two...
No, because I trust no one on that day, and I try to spend it in hiding. I call it Comedy Christmas, but the only gifts I want involve other people pranking other people. A prank pulled on me is, by definition, not funny.
But why are these jokes called practical?
I think I get the confusion. We use "practically" as a synonym for "almost," and "practical" as a synonym for one sense of "virtual", as in "Her victory was a practical certainty." Look, the biggest prank ever played on us is the English language itself.
Prop-based hijinks are called "practical jokes" because they require action—like slipping a Whoopee cushion onto someone's chair—to be put into "practice."
See? I wasn't wrong, after all. Unless this article is a prank.
According to the Online Etymology Dictionary, the term was first used in 1804; before that, it was called a "handicraft joke," a term coined in 1741.
And it might be less confusing to go back to that nomenclature, except then everyone will expect their pranks to involve knitted fabrics.
That is, if the etymology isn't also a prank.
"Practical joke" also distinguishes such pranks from strictly verbal or intellectual jokes, such as the one about the Grecian Urn.
That's a pun, which is the highest form of humor and definitely intellectual. But it works better in spoken English than written.
January 14, 2025 at 9:37am
Whaddaya know—it turns out that nobody's perfect, though some seem to be less perfect than others. Here's a Cracked article for examples, because I am in no way referring to recent real-world revelations.
Yes, even I make mistakes sometimes.
5 Benjamin Franklin's Turkey Electrocution
At one point, he became convinced an electrocuted turkey would be tastier than a normal one, and in attempting to demonstrate this, he accidentally touched the electrified wire intended for the turkey and electrocuted himself instead.
As far as I can tell, this, unlike certain other Founding Father stories, actually happened. Except for the word "electrocute." That was unknown until a century or more later, when Edison coined the word in an attempt to make Nikola Tesla look bad (long story). It's a portmanteau of "electro-" and "execute," as in, one can only be electrocuted if one dies from it. As Franklin continued to live, he wasn't electrocuted.
4 Francis Bacon's Frozen Chicken
Mmmm... bacon and chicken.
In 1626, Bacon was determined to prove that you could freeze and preserve a chicken by stuffing it full of snow.
Okay, fine, a hypothesis. Which he went and tested. This is science. Every scientist makes hypotheses that reach dead ends. So, in this case, it's a literal dead end, which is why this counts as a failure, and not merely a falsified hypothesis.
3 Thomas Edison's Creepy Talking Doll
In 1890, the world wasn't ready for Chatty Cathy.
It may be that the word "electrocute" from above was the only thing this guy ever actually invented, rather than stole from his employees. Point being, he probably didn't actually invent the talking doll, either. But he certainly marketed them, which led to people finding dolls creepy forevermore.
2 Mark Twain's Ill-Fated Start-ups
Twain was one of the highest paid authors in 19th-century America, but it somehow wasn't enough for him.
Some people are good at lots of things. Others, not so much. Thing is, you never know which you are until you try, and possibly fail.
1 Albert Einstein's Cosmological Constant
When Einstein was forming his theory of general relativity, it was believed the universe was static, so when his equations kept predicting some wacky expanding universe, he added a term he called the cosmological constant to make them work.
I'm not sure this was such a failure. He seemed to think it was, but it's just an extra term in the equation, easy enough to take out. We did it in engineering school all the time; we called it Ff (read "F sub F"), for Fudge Factor. Let's also remember that this was back when people thought our galaxy was the extent of the universe.
I see it more as a lesson about examining your assumptions, even the unspoken ones. Which we should all be doing, scientist or not.
January 13, 2025 at 9:48am
Somehow, I thought I'd covered this Mental Floss piece before, but it didn't come up in the search which probably took up more of my time than writing this entry will.
As witnessed by, well, everyone, not everybody talks good.
English, the language of Shakespeare and the internet, is often touted for its flexibility and adaptability. But with great flexibility comes great inconsistency.
Perhaps the inconsistency of the language is what makes so many English-speaking people comfortable with contradictions in other areas.
1. Tenses don't respect times.
"So this guy walks into a bar …" We know a story is coming, and it's clearly a story about something from the past—and yet, the word walks is in the present tense.
Jokes are traditionally rendered in present tense. I can't articulate why, but it just works better. Unfortunately, people have started using it for entire novels, and I find it fatiguing in that context.
In the sentence If it rained tomorrow, I would stay home, the past tense rained is used to refer to a future event.
I'm no grammartalker, but I thought that's a conditional tense, and it just happens to be the same spelling as the past tense.
2. Definites can be indefinite.
The word this is a definite determiner: It picks out referents that are specific and identifiable. If someone says "This is the right one," they do so because they expect the listener to know which one they mean. But in the story that starts with this guy walks into a bar, this guy doesn't necessarily refer to any person the speaker expects you to be able to identify.
Someone really likes "walks into a bar" jokes. That's okay. So do I: Two guys walk into a bar. The third one ducks.
3. Dummy pronouns serve as subjects.
...But in weather sentences like "it's snowing" or "it's sunny," it doesn't replace any noun phrase. What is the "it" that's snowing? The sky? The clouds? Linguists call this it a "dummy pronoun,"...
It is true that I have wondered in the past about the antecedent for the pronoun in "it's snowing," but never enough to look up the answer. Now I know, and I feel like a dummy.
4. Objects can be "raised."
On the surface, the sentences She persuaded them to try it and She intended them to try it seem pretty similar, but they differ in syntactic structure.
This one gets a little esoteric. It's one of those things that I somehow knew intuitively, but couldn't put a name to the grammar used. Well, I guess now I can.
5. Number agreement doesn't always agree.
I have to admit, this is one I make mistakes with on occasion. Rather than fumble around with grammar rules, though, I generally opt to rewrite the sentence to avoid the awkwardness in the first place.
Alternatively, I just don't notice when it's wrong.
Which, really, can be said for all of these rule-breakers.
January 12, 2025 at 9:31am
Today, being a Sunday, is Time Travel Day. Back near Halloween of 2021, I wrote an entry about squash and its botanical relatives: "Squash Court"
The more I learn about squash and its relatives, the less I know. That's a bit of a cliché, but it describes my confusion fairly accurately. It's not like the cabbage cultivars, also mentioned in that entry, which are varied but somewhat limited. In contrast, there's a dizzying variety of gourdlike berries, and even calling them "berries," while apparently correct by botanical definition, only adds to the confusion. Some are edible. Some are not. Many of the hardier ones are used as decoration, including the always-popular pumpkin.
Funny thing about pumpkins. They're all over the place in the fall, which is the season when I wrote that entry. Then they disappear for like 9 months, only to reappear again the following September, along with their associated spices and, largely unrelated but important, Oktoberfest beer.
It's not like we have to eat seasonally. One of the few things keeping us from sliding into full-blown dystopia is being able to eat pretty much everything year-round, including stuff from halfway around the world. Some people object to this availability. I do not. And yet, here in the middle of blasted winter, I'd completely forgotten about the existence of pumpkins until my random number generator pointed me to that entry.
Sure, it's understandable, as they're a major symbol of fall, at least in my part of the world. But, for instance, zucchini (related to squash) is most definitely a summer vegetable—using the culinary definition now—but they're available year-round. Which makes them harder for me to avoid. Yes, I've gotten over my dislike of zucchini, as noted in the linked entry, but that doesn't mean I go seeking them out.
As for the availability of the actual prompt for that entry, the delicata squash, I still don't know; I'm not sure I've ever seen it in grocery stores or on restaurant menus.
One thing I neglected to do back then was look into the etymology of "squash." As a kid, I always figured it was related, somehow, to the verb. Like you're supposed to squash the stuff like you mash potatoes. This annoyed my mother, but that was just a bonus for me; it was easier to pick up bits of squashed squash and eat it with the almost-edible other stuff on my plate. So I never gave it much thought. Until now.
Turns out the word squash, used for the food, is completely unrelated to the other definitions of squash (the verb or the ball game) and is, at least according to one source, from the Narragansett word asquutasquash. English people being English, that became squash. Presumably, other Native peoples had different names for it, but I can't be arsed to look up all of them.
So there it is: a long-standing mystery to me, finally resolved. Now if I could just figure out all the phylogenic tangle for all their many and various relatives, I might feel a sense of accomplishment.
January 11, 2025 at 11:16am
At the end of last year (almost two whole weeks ago), I linked an article that tried to refute arguments against Mars colonization. Today's article, from Inverse, takes a somewhat different trajectory.
And already I'm giving it the side-eye because of the headline. "Combine science and sci-fi?" Look, I'll give "sci-fi" a pass (I have to because it's an official genre name here, though I never liked that particular abbreviation), but that phrase is meaningless. A thing that exists is either science or it's fiction, literary genres notwithstanding. I take it to mean that Mars habitats will require technology that doesn't exist yet, except in the realm of speculation (or speculative fiction). But by the time we're colonizing Mars—if we do—it had better be science, not fiction, in the habitat design.
As for the sub-head, I think we have over 100 years of awesome SF novel cover art to look at for "striking" and "weird."
On Earth, the buildings and dwellings humans spend the majority of their lives in serve as reflections of our society's culture, beliefs, and values. So if the shelters we make for ourselves truly mirror and influence our everyday lives, how might that sentiment be translated to living in space?
I'm not a big fan of this first paragraph, either. Sure, architecture is partly art, but it's also partly pragmatic. We didn't start building stuff out of wood because we thought the result was pretty, or because we hated trees; it happened because the trees were available and useful. A slanted roof didn't get invented to make a statement; it exists to keep the rain out and the snow from piling up too high. Once you get these and other functional basics covered, then you can start thinking about aesthetics.
Sure, maybe I have an inherent bias for function over form, but does anyone seriously think the design of the first Mars habitat is going to be driven by looks rather than function and durability?
One of the most accurate parts of the 2015 movie The Martian is that when Matt Damon ends up stranded on Mars, everything about the planet is trying to kill him and his habitat holding up is his only hope of survival.
Okay, I will admit here that I haven't seen that movie (yet). I did, however, read the book, where the main character wasn't Matt Damon (yet). Apart from being jealous that the author managed to get his first novel published (bastard), the survival thing wasn't a "part," but the entire fucking plot. By which I mean, like, in your usual novel, you have a protagonist and an antagonist, right? Unless you're going for snooty literary-genre crap, in which case you can throw out plot, characterization, and making even the slightest bit of sense, usually you have what boils down to good guy / bad guy, and the plot is the tension between the two. In this book, the antagonist is the damn planet, and the hero survives by using his brain.
Which is not to say it's a bad book. It's a good book. All I'm saying is that calling this central conflict "one of the most accurate parts" of the story is like calling chocolate one of the main ingredients in a Three Musketeers bar (okay, that's funny because Three Musketeers are produced by Mars Inc., get it?) (Yes, I know it's not real chocolate; shut up, that's not my point.)
Yet some sci-fi classics like Star Wars also offer alternative, less bleak visions of humanityâs future off-Earth.
And with that, the article completely lost me.
Star Wars isn't science fiction. It's fantasy with science fiction props.
I've banged on at length about why this is, but for now I'll just stick to the article: it doesn't present humanity's future off-Earth, but rather famously takes place "a long time ago in a galaxy far, far away."
Please note that this isn't a value judgement, just a categorization one.
Perhaps the author meant Star Trek, which is science fiction, set in the future largely off-Earth, and generally not bleak. But anyone who confuses the two is either a) trolling, which I can appreciate to some extent or b) a complete idiot. As I detect no other signs of trolling in the vicinity, Captain, I'm going to go with b, which means anything else the article has to say can be safely dismissed.
Well, I didn't dismiss it entirely. I read the whole thing, and there's some good stuff there. I just don't feel the need to quote from it further. I'll just note one final thing, since we're talking about science fiction.
As far as I'm aware, there are no fictional depictions of a Mars colony that do not end in revolution and independence. The idea of a Mars colony remaining indefinitely subject to Earth control runs completely counter to history, psychology, technology, and all of known science. That's something else to keep in mind before we go running off building Mars habitats.
January 10, 2025 at 11:01am
Sometimes, I just find something I think explains something pretty well. This is one of those times. From Quanta:
Meet the Eukaryote, the First Cell to Get Organized
All modern multicellular life – all life that any of us regularly see – is made of cells with a knack for compartmentalization. Recent discoveries are revealing how the first eukaryote got its start.
While it's amusing to think of cell organization as a bunch of oppressed worker cells getting together and striking for more pay and better benefits, that's not quite what happened.
Three billion years ago, life on Earth was simple.
Ah, yes, the good old days.
Single-celled organisms ruled, and there wasn't much to them. They were what we now call prokaryotic cells, which include modern-day bacteria and archaea, essentially sacks of loose molecular parts.
In fairness, I know a few people who are little more than sacks of loose molecular parts.
Then, one day, that wilderness of simple cells cooked up something more complex: the ancestor of all plants, animals and fungi alive today, a cell type known to us as the eukaryote.
Think of the eukaryotic cell as like those old Reese's commercials where "You got chocolate in my peanut butter!" "You got peanut butter in my chocolate!"
"Eukaryotes are this bananas chimera of bacteria and archaea," said Leigh Anne Riedman, a paleontologist who studies early life at the University of California, Santa Barbara.
It took me a minute to grok that "bananas" in this case was used in its metaphorical sense of "wild and crazy." Though bananas are eukaryotic life, too.
The eukaryotes invented organization, if we use the literal definition of "organize": to be furnished with organs.
It's not like that was directed by consciousness, but okay.
For many decades, biologists considered eukaryotes to be one of three main domains of life on Earth. Life is composed of three distinct cell types: bacteria and archaea, which are both prokaryotic cells with some key differences – for example in their cell membranes and reproductive strategies – and then there are eukaryotes, which are a much different kind of cell. Experts believed that bacteria, archaea and eukaryotes each evolved independently from a more ancient ancestor.
I know some people might be upset when science changes its mind. They want everything to be known, settled, certain. I kind of get that. But life doesn't work that way. Science correcting its own misconceptions is part of why it's awesome.
Then there's a bit in there about some researchers finding some archaea that might be like the ones that first became eukaryotes, and the whole family is named from Norse sagas, which I find amusing.
There's speculation out there that the origin of eukaryotic life was an extremely unlikely, once-in-a-planetary-lifetime event, and one that would have to be repeated for complex life to evolve on other planets. Since we haven't found even bacteria-equivalents on other planets yet, it remains exactly that: speculation. But we owe our existence to the microbial equivalent of a Reese's Cup. |
January 9, 2025 at 10:09am January 9, 2025 at 10:09am
|
Yes, it's a BBC article about a Japanese immigrant to California.
And no, he didn't change it with his katana.
An hour's drive north of San Francisco, rows of gnarled and twisted vines terrace up the slopes of gently rolling hills in Sonoma County, California - which, alongside its neighbour, Napa, has been one of the world's premier wine-growing regions for more than a century.
Much to the annoyance of the French.
But California might never have earned such viticultural acclaim if it weren't for the little-known story of a Japanese immigrant named Kanaye Nagasawa.
So, you're telling me that immigrants can be a benefit?
Born into a samurai family and smuggled out of Shogunate Japan, only to become a founding member of a utopian cult and eventually known as the "Wine King of California", Nagasawa led a life that was stranger than fiction.
Okay, you had me at "utopian cult."
Nagasawa's extraordinary story goes back to 1864, when 19 young samurai from the Satsuma peninsula of Kagoshima were smuggled out of fiercely isolationist Edo-era Japan on a secret mission to study science and technology in the West.
While the BBC seems to have these archaic principles called "journalistic standards," I do not, and therefore I can write that this sounds like the coolest thing ever and why isn't it a movie already?
The youngest of the group, 13-year-old Hikosuke Isonaga, went to Scotland, changing his name to Kanaye Nagasawa to protect his family, since at the time it was illegal to travel outside Japan. There, he came into the orbit of a charismatic religious leader named Thomas Lake Harris, who was recruiting followers to his version of ecstatic transcendentalism called The Brotherhood of the New Life.
Ah, yes, the utopian cult in question.
Naming the estate Fountaingrove after a year-round spring on the property, Harris set out to grow grapes, putting Nagasawa in charge of the operation. The winery soon prospered, but the "Eden of the West", as the commune described itself, became ever more wild, making headlines in San Francisco for its bacchanalian parties that eventually led to Harris' ignominious departure.
Hey, Hollywood: I know you're kind of busy right now what with the wildfires and all, but when shit stops being on fire, make this fucking movie.
All this came to an end during one of the darkest chapters of California's history, when Fountaingrove was seized by the government as part of the state's discriminatory Alien Land Laws, which were instituted in 1913, expanded in the 1920s and forbade Asian nationals from owning land or businesses.
Ah, there it is: the America I know. That plus the internment camps a couple decades later, which also feature in the article.
And that's why Hollywood won't touch it: there's no happy ending. It would have to be a Japanese movie. They're not focused on happy endings; what's important is that the protagonists maintain their honor. |
January 8, 2025 at 10:32am January 8, 2025 at 10:32am
|
This Cracked article comes from the other side of the Earth's orbit: it covers US National Parks, and it ran when it was summer there, so "freezing you to death" isn't on the list.
Like I said, "don't go outside."
There's an anecdote that floats around lots of National Park-adjacent towns: A tourist is preparing to take their family out on a day-long hike, so they line up their kids and hit 'em with the essential repellants: first the sunscreen, then the bug spray, then the bear spray.
A friend of mine keeps bear spray in his car as a self-defense thing. Now he's facing charges for actually using it.
This story almost definitely never happened, but at its core is a universal truth. Locals will say that truth is: "Tourists are stupid and can't be trusted with access to the wild."
Or, to be fair, access to anywhere.
Over 300 people die on National Park land every year.
Oh, no, over 300 people. That's like one plane crash. Or, to be fair, compare that to the number of people who die in their own bathroom in the US every year, which I can't find the damn numbers on, but it's way more than 300. I just mention that to underscore that my hatred of the outdoors isn't rooted in fear of death, which can happen anywhere, but because it's the goddamn outdoors.
Still, you can't die by National Park if you don't go onto National Park land. I guess technically, you could; the chance is very low but not zero.
5 Lethal Selfies
According to one study published in the Journal of Travel Medicine, the world saw 379 fatal selfies between 2008 and 2021.
Those are rookie numbers! We have to pump those numbers up!
Well, no, not really, but it is true that I find it very difficult to work up sympathy for selfie-related deaths.
4 Nuzzled to Death by Bison
Another one it's hard to get teary about.
Avoiding bison is surprisingly easy. Billions of people do it every day. But each year, one or two budding influencers are attacked by bison after venturing too close to get a sick shot for the grid.
Especially if it's an influenza wannabe.
That same season, a woman was gored by a charging bison, and although she and her companion appear to have been doing the right thing (hightailing it) at the time of the attack, she was hurt pretty badly (although, thankfully, survived).
Sometimes, though, it's not idiocy, and in those cases, I do feel bad for them.
3 Boiled in Hot Springs
Hot springs' body count is more than double that of bison and bears combined. Yellowstone is a literal minefield of geothermal activity.
Admittedly, I've never been to Yellowstone. But from what I understand, there's all kinds of signage warning you against going into the hot springs. Another group for whom I have extremely limited sympathy is the "you can't tell me what to do and I go where I want" crowd.
2 Too Darn Much Water
If you don't bring enough water on a long hike, you're gonna have a bad time. But it's too much water that can really sneak up on you.
Yep, dihydrogen monoxide is a killer.
Let's start with the frozen kind: Avalanches on National Park land have killed 37 people.
Movies, shows and cartoons from when I was a kid led me to vastly overestimate the number of times I'd encounter an avalanche. And quicksand, for that matter.
When all that snow thaws out, it gets significantly deadlier. Between 2014 and 2019, there were 314 drowning deaths in the parks, second only to motor vehicle accidents at 354. Those tend to happen when someone falls off a boat in a lake or a river, and they're often selfie-assisted.
I don't mean to be rude, here, but there's a reason why life jackets exist.
But the scariest water-related death, for my money, is flash flooding. The slot canyons of the Grand Canyon are enormous tributaries, geologically designed over millennia to collect rainwater from intense desert storms and deliver it to the Colorado River far below within minutes.
I'll let "designed" slide. But yeah, flash flooding is scary as hell.
1 Just Freaking Explode Everything
Let's travel back to Yellowstone for a moment. The source of all that geothermal activity is our old friend, the Yellowstone Caldera.
Huh, and here I thought it was because it contained a direct portal to Hell.
That's a supervolcano lurking just beneath the surface that has erupted every half-million years or so, and is currently running 40,000 years late.
That's not how return periods work.
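Quick tangent for the nerds: here's a minimal sketch, in Python, of why "running late" is meaningless if eruptions behave like a memoryless random process. The ~700,000-year average interval and the 10,000-year window are my own illustrative assumptions, not numbers from the article or from any volcanology source.

```python
# A minimal sketch, not a volcanology model: assume eruptions follow a
# memoryless (Poisson) process with an assumed average recurrence
# interval of ~700,000 years. All numbers here are illustrative.
import math

mean_interval = 700_000      # assumed average years between eruptions
window = 10_000              # how far ahead we're asking about
rate = 1 / mean_interval     # eruptions per year

# For an exponential waiting time, P(eruption within the next `window`
# years) = 1 - exp(-rate * window), no matter how long it's been since
# the last eruption. That's the memoryless property.
p_next_window = 1 - math.exp(-rate * window)

print(f"P(eruption in next {window:,} years): {p_next_window:.3%}")
# ~1.4% -- and it stays ~1.4% even if the caldera is "overdue."
```

Under that assumption, the odds for the next ten millennia come out around 1.4 percent whether the last eruption was "on schedule" or not, which is the whole point: a return period is an average, not a timetable.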
Should the Great Pimple ever decide to pop - which it could at literally any moment - it's estimated that 90,000 people would die instantly, and lava could splash from Calgary to Los Angeles.
Just to underscore the point I made earlier: your chance of dying by National Park if you never set foot in a National Park is low, but never zero.
Still, we're way more likely to get nuked to oblivion first. |
January 7, 2025 at 9:11am January 7, 2025 at 9:11am
|
A little less rant-worthy today—some history from Smithsonian.
But why didn't he just look it up on the internet?
Before Benjamin Franklin became a printer, newsman, author, inventor, philosopher, diplomat and founding father of the United States, he failed math twice.
They forgot "epic troll." Also, I'd never heard this "failed math twice" thing before, and couldn't find other support for the claim apart from things linked within the Smithsonian article. Those lead to Franklin's words, and, I reiterate, the man was an epic troll and in no way a reliable narrator. This doesn't mean it's false; I just can't confirm or refute it. We like to hear about great people's failings, and there was a similar rumor about Einstein that turned out to be pants-on-fire false. Also that thing about Franklin's fellow Founding Father George Washington chopping down a cherry tree was false. So you'll have to deal with my skepticism.
It doesn't change what the article is mainly about, which is the math textbook.
Yet the story of the "book of arithmetic" that finally helped Franklin master the subject is little known today - another irony, because in his day, it was every bit as famous as he was.
Probably even more famous, at least when he was a brat.
Cocker's Arithmetick was probably the most successful elementary math textbook published in English before the 19th century. It epitomized an age in which the expanding worlds of commerce and capitalism, education and Enlightenment, coalesced to make basic arithmetic the classroom staple it is today.
Oh, I thought they'd given up on teaching math. It sure seems that way.
Though perhaps unfamiliar to modern readers, in the 18th century, Cocker's Arithmetick was as close to a household name as any math textbook is likely to ever be. Edited from the writings of London-based teacher Edward Cocker and published posthumously in 1678, the book included lessons on basic arithmetic with a commercial slant, posed as a set of rules to be memorized, as was typical of educational books of the day.
So it was kind of like the Ferengi Rules of Acquisition, but more math-oriented?
Cocker's lessons covered addition, subtraction, multiplication and division, as well as calculation with pre-decimal British currency and a gentle introduction to 17th-century England's bamboozling array of weights and measures.
I often found it ironic (or whatever) that, upon revolting, the US adopted a decimal money system while keeping the "bamboozling array of weights and measures." I think I did an entry about that a while back. Meanwhile, the UK kept the non-decimal money system and adopted, at least officially, SI units. If I remember right, the UK didn't switch from pounds/shillings/pence until 1971.
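To give a sense of what Cocker's readers were up against, here's a toy sketch of old-money arithmetic. The conversion factors are the standard ones (12 pence to the shilling, 20 shillings to the pound); the bill and the function names are my own invention, not anything from the book.

```python
# A toy sketch of pre-decimal British money arithmetic
# (12 pence = 1 shilling, 20 shillings = 1 pound).
PENCE_PER_SHILLING = 12
SHILLINGS_PER_POUND = 20
PENCE_PER_POUND = PENCE_PER_SHILLING * SHILLINGS_PER_POUND  # 240

def to_pence(pounds, shillings, pence):
    """Flatten an amount in pounds/shillings/pence to total pence."""
    return pounds * PENCE_PER_POUND + shillings * PENCE_PER_SHILLING + pence

def to_lsd(total_pence):
    """Convert total pence back to (pounds, shillings, pence)."""
    pounds, rem = divmod(total_pence, PENCE_PER_POUND)
    shillings, pence = divmod(rem, PENCE_PER_SHILLING)
    return pounds, shillings, pence

# Add up a small bill: 2 pounds 13s 9d + 17s 6d + 1 pound 8s 11d
items = [(2, 13, 9), (0, 17, 6), (1, 8, 11)]
total = sum(to_pence(*item) for item in items)
print(to_lsd(total))  # (5, 0, 2) -> 5 pounds, 0 shillings, 2 pence
```

Totting up even a short bill means carrying in base 12 and base 20 at the same time, which is exactly the sort of thing a merchant's apprentice needed a textbook for.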
Finally, the textbook taught arithmetic skills for business, such as dividing profits equitably between partners.
No wonder the book fell out of favor. Nowadays, we like books about how to screw over the other partners.
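If you're curious what that kind of lesson looked like, it was presumably something like proportional division: split the profit in the same ratio as each partner's stake. Here's my own minimal sketch of that idea; the names and numbers are invented, not Cocker's.

```python
# A toy sketch of the proportional-split problem old commercial
# arithmetic posed: divide a profit among partners in proportion
# to what each put in. Names and numbers are purely illustrative.
from fractions import Fraction

def split_profit(investments, profit):
    """Return each partner's share, proportional to their investment."""
    total = sum(investments.values())
    return {name: profit * Fraction(amount, total)
            for name, amount in investments.items()}

investments = {"Abel": 300, "Baker": 200, "Cocker": 100}
shares = split_profit(investments, profit=90)
for name, share in shares.items():
    print(name, float(share))  # Abel 45.0, Baker 30.0, Cocker 15.0
```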
Whatever else a student might learn in Franklin's grand scheme of general instruction, which included everything from oratory and ancient customs to drawing and geography, he maintained that arithmetic was "absolutely necessary."
Nah, best to keep the populace ignorant of math so they don't notice when you short their paychecks or mess with their timeclocks.
Toward the end of his life, in a well-known letter of 1784, Franklin reflected in satirical fashion on the importance of waking early in order to save money that would otherwise be spent burning candles at night.
I'm just including this bit to emphasize that the "early to bed, early to rise" thing was satire.
Lots more history at the link, of course. Mostly, I just found it interesting that I'd never heard of the book before. Probably because no one can make money advertising it. |
© Copyright 2025 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved. Robert Waltz has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.
|