About This Author
Come closer.
Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is a square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
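
To make that concrete, here's a minimal Python sketch (my own illustration, with made-up sampling parameters) of the most famous such transformation, the Mandelbrot iteration z -> z^2 + c:

def in_mandelbrot(c: complex, max_iter: int = 100) -> bool:
    # Iterate z -> z^2 + c; c is in the set if z never escapes.
    z = 0j
    for _ in range(max_iter):
        z = z * z + c          # the entire "very simple transformation"
        if abs(z) > 2:         # escaped: c is definitely not in the set
            return False
    return True

# Crude ASCII rendering of a patch of the complex plane.
for im in range(10, -11, -2):
    print(''.join('#' if in_mandelbrot(complex(re / 20, im / 10)) else '.'
                  for re in range(-40, 21)))

The entire fractal falls out of one line of arithmetic, which is rather the point.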




Merit Badge in Quill Award: Congratulations on winning Best Blog in the 2021 edition of [Link To Item #quills]!
Merit Badge in Quill Award: Congratulations on winning the 2019 Quill Award for Best Blog for [Link To Item #1196512]. This award is proudly sponsored by the blogging consortium including [Link To Item #30dbc], [Link To Item #blogcity], [Link To Item #bcof] and [Link To Item #1953629]. For more information, see [Link To Item #quills].
Merit Badge in Quill Award: Congratulations on winning the 2020 Quill Award for Best Blog for [Link To Item #1196512]. This award is sponsored by the blogging consortium including [Link To Item #30dbc], [Link To Item #blogcity], [Link To Item #bcof] and [Link To Item #1953629]. For more information, see [Link To Item #quills].
Merit Badge in Quill Award 2: 2022 Quill Award, Best Blog, [Link To Item #1196512]. Congratulations!
Merit Badge in Quill Award 2: Congratulations! 2022 Quill Award Winner, Best in Genre: Opinion, [Link To Item #1196512].
Merit Badge in Quill Award 2: Congratulations! 2023 Quill Award Winner, Best in Genre: Opinion, [Link To Item #1196512].
Merit Badge in 30DBC Winner: Congratulations on winning the Jan. 2019 [Link To Item #30dbc]!
Merit Badge in 30DBC Winner: Congratulations on taking First Place in the May 2019 edition of the [Link To Item #30dbc]! Thanks for entertaining us all month long!
Merit Badge in 30DBC Winner: Congratulations on winning the September 2019 round of the [Link To Item #30dbc]!
Merit Badge in 30DBC Winner: Congratulations on winning the September 2020 round of the [Link To Item #30dbc]! Fine job!
Merit Badge in 30DBC Winner: Congrats on winning 1st Place in the January 2021 [Link To Item #30dbc]! Well done!
Merit Badge in 30DBC Winner: Congratulations on winning the May 2021 [Link To Item #30dbc]! Well done!
Merit Badge in 30DBC Winner: Congrats on winning the November 2021 [Link To Item #30dbc]! Great job!
Merit Badge in Blogging: Congratulations on winning an honorable mention for Best Blog at the 2018 Quill Awards for [Link To Item #1196512]. This award was sponsored by the blogging consortium including [Link To Item #30dbc], [Link To Item #blogcity], [Link To Item #bcof] and [Link To Item #1953629]. For more details, see [Link To Item #quills].
Merit Badge in Blogging: Congratulations on your Second Place win in the January 2020 Round of the [Link To Item #30dbc]! Blog on!
Merit Badge in Blogging: Congratulations on your second place win in the May 2020 Official Round of the [Link To Item #30dbc]! Blog on!
Merit Badge in Blogging: Congratulations on your second place win in the July 2020 [Link To Item #30dbc]!
Merit Badge in Blogging: Congratulations on your Second Place win in the Official November 2020 round of the [Link To Item #30dbc]!
Merit Badge in Highly Recommended: I highly recommend your blog.
Merit Badge in Opinion: For diving into the prompts for Journalistic Intentions. Thanks for joining the fun!
Merit Badge in High Five: For your inventive entries in [Link To Item #2213121]! Thanks for the great read!
Merit Badge in Enlightening: For winning 3rd Place in [Link To Item #2213121]. Congratulations!
Merit Badge in Quarks Bar: For your awesome Klingon Bloodwine recipe from [Link to Book Entry #1016079], which deserves to be on the topmost shelf at Quark's.



November 30, 2022 at 12:01am
#1041155
This article is a few years old now, but that's a rounding error compared to the subject matter.

500 Years Later, MIT Proves That Leonardo Da Vinci's Bridge Design Works
If accepted at the time, the design would have likely revolutionized architecture.


More like engineering.

Researchers at MIT have proven Leonardo da Vinci correct yet again, this time involving his design for what would have been at the time a revolutionary bridge design.

Leonardo was undeniably a genius (though, as with anyone, not always right), but one limitation on genius is the mindset of the people around you.

When Sultan Bayezid II of the Ottoman Empire put out a request for proposals for a bridge connecting capital city Constantinople (now Istanbul) with its neighbor city Galata, da Vinci was eager for the chance to win the contract.

I'm just leaving this here so you can take the time to get They Might Be Giants out of your head. It is not possible to contemplate Istanbul (not Constantinople) without thinking of their song.

Da Vinci's proposal was radically different than the standard bridge at the time. As described by the MIT group, it was approximately 918 feet long (280 meters, though neither system of measurement had been developed yet) and would have consisted of a flattened arch "tall enough to allow a sailboat to pass underneath with its mast in place...but that would cross the wide span with a single enormous arch," according to an MIT press statement. It would have been the longest bridge in the world at the time by a significant measure, using an unheard-of style of design.

And see, that's why I'm putting this in the realm of engineering, not architecture. I admit I may be biased on those subjects, but bridge design is solidly in the realm of civil engineering, no matter how elegant the design may be.

There is, of course, significant overlap in those disciplines. But if the focus of the construction is structure and transportation, I'd call that civil engineering.

It wasn't just length or style that set da Vinci's bridge apart. It also had safety features unheard of at the time. One of the biggest challenges facing any bridge design is that it has to exist in nature no matter the conditions, including wind.

In theory. In practice, lots of things have brought bridges down, including unexpected floods and, yes, wind loads.

Strong winds have forced many bridges, including relatively modern bridges from the 20th century, into lateral oscillations leading to collapse.

I don't think it's possible to become a civil engineer without seeing the video of the Tacoma Narrows Bridge failure.

To be clear, though, that collapse was due to aerodynamic effects that were barely understood even in the mid-20th century, and as smart as Leonardo was, the math for it didn't exist in his time.

Since building a full-scale bridge would have been unwieldily,[sic] the team resorted to building a model. Using 126 blocks, they built the bridge at a scale of 1 to 500, making it around three feet long.

Modeling things like this properly is a challenge in itself. You run into things like the square-cube law, which has to be taken into account.
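
For instance, here's the square-cube arithmetic at the model's 1:500 scale, as a back-of-the-envelope Python sketch (my numbers, not the MIT team's):

# Under the square-cube law, self-weight scales with volume (s^3) while
# load-bearing cross-section scales with area (s^2), so self-weight stress
# scales linearly with s.
s = 1 / 500                                # geometric scale factor
weight_ratio = s ** 3                      # model weight vs. full scale
area_ratio = s ** 2                        # cross-sectional area vs. full scale
stress_ratio = weight_ratio / area_ratio   # = s = 1/500

print(f"self-weight stress in the model: {stress_ratio:.0e} of full scale")

A model that stands under 1/500th of the stress tells you the geometry works, which is exactly what the next quote is getting at.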

"It's the power of geometry" that makes it work, she says. "This is a strong concept. It was well thought out." Further tests showed that the bridge could have even stood its own against earthquakes to an extent far beyond other bridges at the time.

Math: it works.

There are still mysteries surrounding the project. "Was this sketch just freehanded, something he did in 50 seconds, or is it something he really sat down and thought deeply about? It's difficult to know."

It's entirely possible that the sketch built on things Leonardo would have already been thinking about. I mean, it's basically a freestanding arch, right? They figured arches out long before his time. Putting that together with other concepts, such as the wind loads mentioned above, might not have taken him very long at all (genius, remember).

It's this combination of disparate ideas that's the hallmark of true genius, and it's one reason there's no such thing as useless knowledge.

While it's difficult to know da Vinci's intentions, one thing is now relatively certain: the bridge would have worked.

And I gotta admit, it looks cool.
November 29, 2022 at 12:01am
#1041125
Science is hard.

Controversy Continues Over Whether Hot Water Freezes Faster Than Cold
Decades after a Tanzanian teenager initiated study of the “Mpemba effect,” the effort to confirm or refute it is leading physicists toward new theories about how substances relax to equilibrium.


Not all counterintuitive results involve quantum effects.

It sounds like one of the easiest experiments possible: Take two cups of water, one hot, one cold. Place both in a freezer and note which one freezes first. Common sense suggests that the colder water will.

And this is one reason I don't trust common sense.
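
For the record, the model behind that common sense is Newton's law of cooling, under which the hotter cup can never overtake the colder one. A minimal Python sketch (illustrative made-up numbers, not from the article):

import math

# Newton's law of cooling: T(t) = T_env + (T0 - T_env) * exp(-k*t).
# The Mpemba effect is precisely what this model cannot produce.
T_ENV = -18.0   # freezer temperature, deg C
K = 0.05        # cooling constant, 1/min (assumed)

def minutes_to_reach_freezing(t0: float) -> float:
    # Time for water starting at t0 deg C to cool to 0 deg C.
    return math.log((t0 - T_ENV) / (0.0 - T_ENV)) / K

print(f"cold cup (30 C): {minutes_to_reach_freezing(30.0):.0f} min")  # ~20
print(f"hot cup  (90 C): {minutes_to_reach_freezing(90.0):.0f} min")  # ~36

So if hot water ever does freeze first, something beyond simple exponential cooling has to be going on.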

But luminaries including Aristotle, Rene Descartes and Sir Francis Bacon have all observed that hot water may actually cool more quickly.

To be fair, Aristotle was wrong about a lot of things; those other dudes were right about a lot of things, but they were still subject to human biases and observational inconsistencies. This is why we do science: to counteract the effects of human bias, error, and "common sense."

The modern term for hot water freezing faster than cold water is the Mpemba effect, named after Erasto Mpemba, a Tanzanian teenager who, along with the physicist Denis Osborne, conducted the first systematic, scientific studies of it in the 1960s.

Probably you've seen those videos of idiots living in the snow who throw boiling water from the pot and watch it freeze solid before it hits the ground. This is not the same thing; that happens, they're pretty sure, at least partly because hot water evaporates quickly, and because throwing it increases the surface area, which means it can evaporate even more quickly.

They're not idiots for doing this; they're idiots for living where the world is trying to turn them into corpsicles.

No, this is talking about hunks of water under conditions identical except for the temperature. Even then, as it turns out, it's not always observed.

Over the past few years, as the controversy continues about whether the Mpemba effect occurs in water, the phenomenon has been spotted in other substances — crystalline polymers, icelike solids called clathrate hydrates, and manganite minerals cooling in a magnetic field.

All of which may have profound technological implications, but we want to know about water. You know, that stuff we literally can't live without.

“A glass of water stuck in a freezer seems simple,” said John Bechhoefer, a physicist at Simon Fraser University in Canada whose recent experiments are the most solid observations of the Mpemba effect to date. “But it’s actually not so simple once you start thinking about it.”

One cool thing (pun absolutely intended) about this article is how it highlights the international, multicultural process of science.

“My name is Erasto B. Mpemba, and I am going to tell you about my discovery, which was due to misusing a refrigerator.” Thus begins a 1969 paper in the journal Physics Education in which Mpemba described an incident at Magamba Secondary School in Tanzania when he and his classmates were making ice cream.

Okay, see, right there I'm already running into a contradiction. Ice cream has a high water content, sure, but it's not water. Most water isn't just water, either; it contains various minerals. Even distilled water is rarely 100% pure, and even if it were, doing experiments with it doesn't necessarily translate to real-world applications.

Basically, I suspect that at least part of the problem with replicating the results here is that different trace elements in a sample of water can cause it to behave differently in certain circumstances, like when you're doing freezing experiments.

I don't know that for sure, though. I'm sure the scientists involved have already thought of that, but if so, the article doesn't say much about it.

Space was limited in the students’ refrigerator, and in the rush to nab the last available ice tray, Mpemba opted to skip waiting for his boiled-milk-and-sugar concoction to cool to room temperature like the other students had done. An hour and a half later, his mixture had frozen into ice cream, whereas those of his more patient classmates remained a thick liquid slurry. When Mpemba asked his physics teacher why this occurred, he was told, “You were confused. That cannot happen.”

Some teachers don't have open minds. Many kids do, at least until teachers close them.

Over the decades, scientists have offered a wide variety of theoretical explanations for the Mpemba effect. Water is a strange substance, less dense when solid than liquid, and with solid and liquid phases that can coexist at the same temperature.

And that's another problem: even if it happened with consistency, you have to come up with a reason why it happens, and then test that.

Still, Burridge and Linden’s findings highlight a key reason why the Mpemba effect, real or not, might be so hard to pin down: Temperature varies throughout a cup of rapidly cooling water because the water is out of equilibrium, and physicists understand very little about out-of-equilibrium systems.

Stacking on yet another problem.

Statistical physicist Marija Vucelja of the University of Virginia started wondering how common the phenomenon might be. “Is this like a needle in a haystack, or could it be useful for optimal heating or cooling protocols?” she asked.

Just leaving this bit here because I like it when my alma mater is involved.

If nothing else, the theoretical and experimental work on the Mpemba effect has started giving physicists a handhold into nonequilibrium systems that they otherwise lack.

Another fun thing about science: sometimes a result, or even a failure, can have secondary benefits.

Fortunately for lazy-ass me, the article ends with a brief description of whatever happened to Mpemba himself:

After igniting a decades-long controversy with his teenage interrogations, Mpemba himself went on to study wildlife management, becoming a principal game officer in Tanzania’s Ministry of Natural Resources and Tourism before retiring.

So he didn't end up working on pure science, and the effect named after him might never have had any impact on his life. Remember that next time some kid complains that they'll "never use" whatever they're studying at the moment; it's still important.

Osborne, discussing the results of their investigations together, took a lesson from the initial skepticism and dismissal that the schoolboy’s counterintuitive claim had faced: “It points to the danger of an authoritarian physics.”

And also to the danger of a stubborn "common sense."
November 28, 2022 at 12:02am
#1041092
This one's been in my queue for a long time, such that every time I see it on the list, I go, "Now what am I going to say about this?"

Kurt Vonnegut’s Greatest Writing Advice
"Literature should not disappear up its own asshole," and other craft imperatives


After all, how can I, being just me, comment on writing advice from one of the most acclaimed, award-winning, and all-around great writers of the 20th century? Someone who is, moreover, dead and therefore can't rebut anything I say?

...Easy. I just read, paste, format, and type.

(Article is from 2017, the 10th anniversary of Vonnegut's tragic, untimely and completely unexpected death, but that shouldn't matter.)

Today, if you can believe it, makes it ten years since we lost one of the greatest American writers—and, no matter how he tried to deny it, one of the greatest writing teachers. Certainly one of the greatest writing advice list-makers, at any rate.

Like I said, 2017. As list-making became the default method of communication on the internet, once again, Kurt was ahead of his time.

Plus, it’s no-nonsense advice with a little bit of nonsense. Like his books, really.

I don't want anyone getting the idea I didn't like Vonnegut. Far from it. But he's not up on a pedestal for me like Twain or Poe.

Find some of Vonnegut’s greatest writing advice, plucked from interviews, essays, and elsewhere, below—but first, find some of Vonnegut’s greatest life advice right here: “I tell you, we are here on Earth to fart around, and don’t let anybody tell you different.”

On that point, at least, we can agree completely.

On proper punctuation:

Here is a lesson in creative writing. First rule: Do not use semicolons. They are transvestite hermaphrodites representing absolutely nothing.


These days, Vonnegut would be soundly thrashed on social media for using the words "transvestite hermaphrodites"; times and words change.

You'll note I used a semicolon in the previous sentence; I do that when I damn well feel like it.

I agree it probably shouldn't be (over)used in creative writing; this is a blog entry, so I can get away with it. However, someone as well-versed in satire as Vonnegut must have known that there's a place for semicolons even there, once you know the "rules" so well you can break them.

On having other interests:

I think it can be tremendously refreshing if a creator of literature has something on his mind other than the history of literature so far. Literature should not disappear up its own asshole, so to speak.


And yet, like all art, it often does.

What's not clear to me here is how he meant "literature." For me, the term encompasses all fiction writing. Depending on context, though, it can be used by snobs to differentiate high art from low art, which they snobbily call "genre writing." News flash: every work of fiction is genre fiction in that it has a genre. So that quote is kinda rich coming from someone who resisted acknowledging that he wrote science fiction because some science fiction writers were hacks.

On the value of writing:

If you want to really hurt your parents, and you don’t have the nerve to be gay, the least you can do is go into the arts.


Or, you know, lean into the stereotypes and do both.

On plot:

I guarantee you that no modern story scheme, even plotlessness, will give a reader genuine satisfaction, unless one of those old-fashioned plots is smuggled in somewhere.


Seriously, the link is worth clicking on just to read this one section. I can't do it justice here with cherry-picked quotes; just trust me on this.

Also, I'm skipping a few of them here, so if you want to read more, well, there's the link.

On a good work schedule:

I get up at 7:30 and work four hours a day. Nine to twelve in the morning, five to six in the evening. Businessmen would achieve better results if they studied human metabolism. No one works well eight hours a day. No one ought to work more than four hours.


Everyone has their own preferred schedule, but I'm pretty sure the "eight hours a day" thing is industrialist nonsense. Thing is, though, writing is different from some other professions: whether you're actively typing or not, your brain is going through plot, characterization, descriptions, whatever. At least that's how it works for me. Like I said above, every time I saw this one in my list, I started thinking about what I was going to write. Some of that even made it in here.

On “how to write with style,” aka List #1:

Reverse the numbers in this section and it might as well have come from Cracked. Not that it's especially amusing, but this list, more than his other advice, rings true for me. Too long to quote; just give it a look if you care.

I ignored his gushing about Joyce, of course.

I am, however, going to reproduce the second list in its entirety. It's brief, and it's been passed around quite a bit already, so of course I have something to say about it.

On how to write good short stories, aka List #2:

1. Use the time of a total stranger in such a way that he or she will not feel the time was wasted.


Great advice, but the devil's in the implementation.

2. Give the reader at least one character he or she can root for.

So many stories these days don't bother to do that.

3. Every character should want something, even if it is only a glass of water.

This is probably Vonnegut's most quoted piece of advice. It's definitely important, but it's insufficient by itself.

4. Every sentence must do one of two things—reveal character or advance the action.

See my multiple rants about overlong descriptions.

5. Start as close to the end as possible.

Lots of writers lately have taken this literally, starting at the very end, thus relegating the rest of the story to flashbacks. Nothing inherently wrong with that, but sometimes it's difficult to follow too many jumps around in time.

6. Be a sadist. No matter how sweet and innocent your leading characters, make awful things happen to them—in order that the reader may see what they are made of.

On that point, I agree.

7. Write to please just one person. If you open a window and make love to the world, so to speak, your story will get pneumonia.

I came up with this one independently, I think. I noted that some of the greatest works of literature (using the term in its broadest sense, as above) were written with just one person, or possibly a small group of people, in mind; they didn't get their start from an author using focus groups or brainstorming what demographic he or she was shooting for. Examples include Alice in Wonderland, pretty much anything by Poe, and freaking Lord of the Rings.

My personal corollary to this is: you have to expect that some people won't like it. That's still better than blandly trying to please everyone.

8. Give your readers as much information as possible as soon as possible. To heck with suspense. Readers should have such complete understanding of what is going on, where and why, that they could finish the story themselves, should cockroaches eat the last few pages.

I can't completely get behind this one, but like I said: he's Vonnegut and I'm not. I'm mostly commenting here from a reader's perspective, and two of my preferred genres are fantasy and science fiction. In those, you want to withhold some information; it helps keep the reader reading so as to eventually discover, say, why the world is a post-apocalyptic wasteland, or who the MC's father really is (yes, that's mostly a Star Wars reference). I'm not quite as big a fan of mystery, but in that genre, it should be glaringly obvious that you withhold some information. As for horror, if you explain the monsters too much, they cease being monsters.

The danger of putting too much information up front is obvious: you end up writing The Silmarillion before Lord of the Rings, and you'll lose readers. No, I say (again from a reader's perspective): start with the story, not the Book of Genesis.

On ignoring rules:

And there, I’ve just used a semi-colon, which at the outset I told you never to use. It is to make a point that I did it. The point is: Rules only take us so far, even good rules.


Told you so.
November 27, 2022 at 12:01am
#1041062
Today's blast from the past is a short entry I wrote near the end of August, 2008: "I can't get no..."

I've been trying to put words to the malaise that seems to have overtaken my life. It's not that I'm not happy, or I'm severely lacking in anything (except maybe motivation).

I don't think I wallow in self-pity to that degree anymore. If I do, I don't write about it, because that leads to a) people shunning you even more; b) advice that doesn't work for me (as per some of the comments on that entry); or c) people being happy about it because at least they're not you.

Anyway, I did finally figure out what was bugging me, I think: work. The fact that I had to do it. It was interfering with my video game time.

What I want to focus on, though, is the first comment, from someone I sadly haven't seen in a while but used to comment here a lot:

It's "midlife crisis," buddy. I've experienced it and know men who experience it until either they self destruct or they do something to sate it.

Some people treat a midlife crisis, especially in dudes, as a joke. And to be fair, sometimes it really is funny, like when someone goes out and blows all their money and credit on a Porsche and a 21-year-old hooker (neither of which I've ever done). But that shit's real, and as much as society pushes men to squash their emotions, doing so is generally a Bad Idea.

That shit can ruin lives. I had an uncle who had an affair with a grad student, and ended up destroying his family. Fortunately, the kids reconciled with him before he croaked, and he expressed regret at his actions.

Anyway, I don't think that was one; I've always been prone to depressive episodes, regardless of age. And even if it was, I didn't have a family to ruin. Sure, I ended up getting divorced the following year, something which was in no way my fault [Narrator: It was a little bit his fault]. But having glanced at a few of the intervening blog entries, they weren't all gloom. Some of them were about my epidural for back pain, and anyone who's experienced chronic back pain can tell you it definitely affects one's mental health. And then there was a vacation, which apparently helped, too.

I did end up, over the following year, buying a new car, retiring, and traveling (in that general order). But the car was a Subaru, not a Porsche, and traveling is something I'd always wanted to do but was difficult while working full-time.

Whilst out and about yesterday in my new-to-me Subaru—after over a year without a car, I wanted to see what changes happened in my town, and besides, it was sunny and 70 damn degrees outside—I saw that we now have a Porsche dealership in my town.

I wasn't even the slightest bit tempted.
November 26, 2022 at 12:01am
#1041037
I've done entries about avocados before. Here are a couple: "The Devil's Avocat" and "Another Avocado Article". This one's not about the fruit so much as it is about the product.

How Marketing Changed the Way We See Avocados
Once upon a time, Americans didn't know what to do with "alligator pears." Now we can't get enough


I have a confession to make: When I was a kid, I hated avocados. "Zaboca" as they are called in Trinidad (and maybe elsewhere in the Caribbean) were mushy and gross, in my young, uninformed opinion. They just didn’t taste like anything. Plus, I believed my parents ate some weird things in general.

I'm pretty sure all kids, from all cultures, have different approaches to food than adults. Hell, most of 'em probably don't like beer, even. But we usually grow up to actually like some of those yucky things mom and/or dad tried to shove down our gullets when we were little.

In the beginning of the 20th century, they were called “alligator pears.” Their bumpy, olive skin connected them to those denizens of the swamp, and its shape resembled, well, a pear.

They had a marketing problem.


In addition to already discussing avocados, I've mentioned a few marketing problems in here before. Can't be arsed to find those entries, but I vaguely remember something about orange juice. And maybe chicken wings.

Along these lines, the California Avocado Grower’s Exchange launched a petition to change the name of the fruit, formally. They were pushing to get back to the cultural roots of avocados: The word “avocado” is derived from the Aztec “ahuacacuahatl.” This renaming was meant to further exoticize the product, lending credence to the idea that it was a special treat.

Another source puts it as āhuacatl. I don't know enough about the Aztec language to know if someone made a mistake or if maybe they used both words, or the one was a shortening of the other. Doesn't matter much, I suppose, but I do like to get these things right.

That was all well and good until nutrition experts began to promote a low-fat diet.

Wait, I thought that wasn't a thing until like the 1990s. Is this article skipping whole decades?

The public didn’t differentiate between saturated fats, which were the target of this movement, and monounsaturated fats, which are “good” fats. Avocados came under fire.

Avocados under fire are disgusting. No, seriously, nothing's worse than a heated avocado.

So the avocado growers rallied. They funded research and put out studies meant to extoll the virtues of the fruit.

This. This is why people don't trust nutrition science.

The turning point for avocados was their integration with the [name of copyrighted sportsball game that takes place in Feburary redacted].

With enough money, anyone can advertise anything during that game and people will flock to it. If you put a halftime ad there selling boxes of unprocessed human shit, your shit supply would run out the next day.

The public’s investment and interest in the Su[bleep]wl cemented guacamole as a snack item, giving the avocado a foothold it needed. Access to avocados also increased as the previous ban on the import of this item was lifted in 1997, and fruits from Michoacan began to flow across the border.

Then why do I remember avocados from way back in the 70s? It's possible my memory is faulty, or maybe I shifted timelines.

At the end of the day, avocados have a place on today’s table thanks in part to a tireless campaign to redefine and redraft their identity. Some of it was misguided, some of it was weird and some of it was good. That is the nature of advertising.

I put up marketing articles here from time to time because many writers need to know how to do marketing. I mean, I would, but no matter how much I learn about it, I'm utterly incompetent at marketing (maybe because I won't pay for an ad spot during the game-that-shall-not-be-named). Doesn't stop me from reading about it, though.
November 25, 2022 at 12:01am
#1041003
Wrong time of year for this, but that's never stopped me before.

The Ancient Math That Sets the Date of Easter and Passover
Why don’t the two holidays always coincide? It is, to some degree, the moon’s fault.


And yes the headline uses the M word.

Passover is a springtime Jewish festival celebrating the early Israelites’ exodus from Egypt and freedom from slavery. Jews observe it by hosting a ritual dinner, called a seder, and then by abstaining from eating all leavened bread for about a week.

Hey, it's the yeast we could do.

Easter is a springtime Christian holiday celebrating the resurrection of Jesus Christ and freedom from sin and death. It is preceded by a series of holidays commemorating Jesus’s path to the cross. One of these holidays is Maundy Thursday, which, aside from being a great name for a holiday, is a remembrance of the Last Supper, which was a seder. In the United States, many Christians observe Easter by attending a ritual meal between breakfast and lunch, called a brunch.

That part cracked me up.

These holidays have a lot in common: They share themes of liberation and triumph; they both involve buying a lot of eggs; they were both a pretty big deal for Jesus.

To acknowledge that I'm writing this in late November, just when the winter holiday marketing season is gearing into overdrive and rolling coal at our collective Priuses, I'll note that there is something of a parallel here with Christmas and Hanukkah. There is, however, one incredibly important difference: while Easter was built off of Passover (and both holidays stole from Pagans), Hanukkah and Christmas (also stolen from Pagans) have fuck-all to do with each other, apart from generally happening when it's way too bloody damn cold in the northern hemisphere.

Without going into detail, Hanukkah isn't "Jewish Christmas" (my friend likes to call it "Blue and Silver Christmas"). But, like the holidays in the article, sometimes they happen to overlap (like this year), and sometimes they don't.

In the Gospels, the existential drama of Easter happens against the backdrop of Passover. Yet about 15 percent of the time, the two holidays actually occur a month apart.

Those are good years for me. See, my cousin usually wants me to travel for Passover. Which is fine. Except when Passover falls on Easter, in which case traveling up the Northeast Corridor is the First Circle of Hell.

Anyway, the rest of the article goes into the differences between the Hebrew lunisolar calendar and the Christian solar calendar.

During the month of Adar (which directly precedes the Passover month of Nisan), the ancient rabbinical court would decide if it was springy enough outside for Passover. If spring seemed to be on track, Nisan could occur. But if it wasn’t warm enough outside yet, the rabbis would tack on another month of Adar. They called this leap month Adar II.

Early Rabbinical Judaism was very creative with names.

Today Roman Catholics and most Protestant traditions now celebrate Easter after March 21 on the Gregorian calendar. But the Eastern Orthodox Church uses the older version of that calendar, known as the Julian, to determine the date of Easter and other festivals.

So it's not just different religions' calendars that cause issues, but that of different sects of the same religion. This in no way surprises me.
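
For the Gregorian side, at least, the date of Easter is mechanically computable. Here's the well-known anonymous computus (often credited as the Meeus/Jones/Butcher algorithm) in Python, included as an illustration rather than anything from the article:

import datetime

def gregorian_easter(year: int) -> datetime.date:
    # Anonymous Gregorian computus (Meeus/Jones/Butcher algorithm).
    a = year % 19                        # position in the 19-year Metonic cycle
    b, c = divmod(year, 100)
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30   # epact: approximate lunar phase
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) // 451
    month, day = divmod(h + l - 7 * m + 114, 31)
    return datetime.date(year, month, day + 1)

print(gregorian_easter(2022))  # 2022-04-17

The Hebrew calendar's rules (leap months, plus the postponements mentioned below) take considerably more code.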

Thanks to tiny wobbles in Earth’s orbit, some years are a second or two longer or shorter than others. So every year, the International Earth Rotation and Reference Systems Service announces whether to add a leap second in order to align Earth time with solar time.

Yeah, it looks like they're abandoning, or at least pausing, this practice.

And as it happens, the first night of Passover can never fall on Maundy Thursday, even though that holiday commemorates a seder. That’s because Passover can never begin on Thursday, ever. “The calendar is rigged so that [seder] can fall only on certain days of the week,” Dreyfus told me. “If Passover started Thursday night, it would push Rosh Hashanah the following year to start on Saturday night.” And neither Rosh Hashanah nor Yom Kippur, the two High Holidays of the Jewish year, can fall the day after Shabbat.

Just in case you thought it wasn't complicated enough.

But no, to directly address the article's subhead, it's not the moon's fault at all. Nor is it the sun's. "The fault, dear Brutus, is not in our stars, but in ourselves, that we are underlings." When it comes to holidays and observances, we're all servants of the calendar, and the calendar—any calendar—is purely arbitrary. I could argue that the Hebrew calendar, tied as it is to lunar cycles, is somewhat less arbitrary than the Gregorian, which is tied to nothing (well, sort of, mostly). But it's still arbitrary: Hebrew calendar months start on new moons. Why not full? Or half?

Sometimes I think we should just abandon the whole thing and adopt the Tranquility Calendar. Other times I go in the complete opposite direction and want a calendar more obviously tied to actual astronomical observations: solstices, equinoxes, moon phases.

Hardly matters, though. Inertia favors the Gregorian. We can't even agree on stopping this Daylight Saving Time nonsense; calendars ain't gonna change.

Well. Eventually they will, because even the orbits of the Earth and Moon are, over a long enough timescale, chaotic. Or we could disappear, along with our calendars. But we're stuck with these complicated things for the foreseeable future. Fortunately, you don't have to make any of the observations and computations yourselves; someone else will tell you when it's time to celebrate whatever.

*Movie**Film**Film**Film**Movie*


Because yesterday was Thanksgiving, I had nothing better to do than go to the movies.

One-Sentence Movie Review: Bones and All:

A meaty movie with a side of mashed metaphors and symbolism sauce, this story of two fine young cannibals in the 1980s was filling Thanksgiving Day fare for me.

Rating: 3.5/5
November 24, 2022 at 12:03am
#1040976
You've certainly heard the cliché, "the greatest thing since sliced bread," which implies that sliced bread was the greatest invention. This is obviously false, as beer was invented before sliced bread.

Sometimes it takes a lack of something to truly appreciated it. This was true during Prohibition, and apparently, also true during World War II.

Remembering When America Banned Sliced Bread
During World War II, the U.S. government turned to drastic rationing measures.


The year was 1943, and Americans were in crisis. Across the Atlantic, war with Germany was raging. On the home front, homemakers were facing a very different sort of challenge: a nationwide ban on sliced bread.

While it's true that there were far worse things in WWII than a lack of sliced bread, I can see how that would be frustrating.

The ban on sliced bread was just one of many resource-conserving campaigns during World War II. In May 1942, Americans received their first ration booklets and, within the year, commodities ranging from rubber tires to sugar were in short supply.

These days, of course, such measures would be "government overreach," "tyranny," "an attack on muh freedumbz," and "cause for riots in the streets."

So by January 18, 1943, when Claude R. Wickard, the secretary of agriculture and head of the War Foods Administration, declared the selling of sliced bread illegal, patience was already running thin. Since sliced bread required thicker wrapping to stay fresh, Wickard reasoned that the move would save wax paper, not to mention tons of alloyed steel used to make bread-slicing machines.

Okay, so I can see the wax paper thing (plastic wasn't much used then), but didn't the slicers already exist? Sliced bread was invented in 1928, and was pretty much everywhere five years later—ten years before the ban.

On July 7, 1928, the Chillicothe Baking Company in Missouri first put his invention to use, saying it was “the greatest forward step in the baking industry since bread was wrapped.”

So, really, the expression should be "the greatest thing since wrapped bread."

As an aside, I'm picky enough to get my bread from a local bakery rather than the supermarket. They'll slice it right there upon request, and slide it into a plastic bag with a twist tie. Since it's fresh bakery bread without preservatives, if it's not wrapped, it becomes a rock within 24 hours.

Sliced bread really took off in 1930, when the Continental Baking Company’s pre-sliced Wonder Bread made its way into American homes.

Ugh. Foul. Disgusting.

After a few years of aggressive marketing, the pillowy, preservative-laced loaves were synonymous with modernity and convenience.

They're synonymous with American lack of taste, along with American cheese and light beer.

On January 24, less than a week after the ban, the whole thing began to unravel. New York Mayor Fiorello LaGuardia made a public announcement that bakeries that already had bread-slicing machines could carry on using them.

See?

No wonder he got an airport named after him.

One baker by the name of Fink, who also happened to be a member of the New York City Bakers Advisory Committee, publicly advocated for the ban, then was fined $1,000 (more than $14,000 today) for sneakily violating it.

Tempting as it may be to claim that his name was the source of the verb "to fink," no, it wasn't.

By March 8, the government decided to abandon the wildly unpopular measure. “Housewives who have risked thumbs and tempers slicing bread at home for nearly two months will find sliced loaves back on the grocery store shelves tomorrow in most places,” noted the Associated Press.

Government overreach, tyranny, muh freedumbz.

In the end, no thumbs were severed and Americans were reunited with the sliced bread they had learned to hold so dear.

Once you get used to an invention, especially one that saves time and work, it's very, very hard to do without it. If necessity is the mother of invention, laziness is the milkman.

Also, fuck Wonder Bread.
November 23, 2022 at 12:02am
#1040944
This one's been hanging out in my queue since October, but whatever. Poe is timeless.



Article is from Cracked, so take it with a grain of gothic black salt.

As the original goth boi, it’s only fitting that Edgar Allan Poe’s death was as mysterious and haunting as one of his stories. Just before he died at age 40, he seemed to drop off the face of the Earth for a week, and his death has been attributed to everything from low blood sugar to murder.

There was a movie called The Raven about 10 years ago, starring John Cusack as Poe. Critically panned and engendering lukewarm audience response at best, I felt like it was severely underrated. Not that it was a great movie, but it didn't suck, either. I think most people missed the point. The movie makes the most sense, I think, if you remember the Poe quote, "All that we see or seem is but a dream within a dream." Or maybe I'm enough of a fan of both Poe and Cusack to have enjoyed it anyway.

15. Missing Poe-son

Oh, how clever. A Poe pun. Well, I shouldn't complain too much; I have every intention of adopting a tomcat just so I can name him Edgar Allan Purr. Bonus points if I can find a black one.

On September 27, 1849, Poe left Richmond, Virginia, where he’d been busy talking his childhood sweetheart into marrying him, for Philadelphia for a job editing a poetry collection (it needed more symbolism or something), after which he intended to head back to New York, where he lived.

Lots of places claim Poe, and for good reason. He belongs to everywhere, but really, he was a Virginian.

14. October 3, 1849

Four days before his death, Poe resurfaced in Baltimore, if you can call the gutter a surface.


Isn't that just basically Baltimore?

13. Poe’s Death

According to the most likely account, he never got it together long enough to explain how a business trip turned into a disoriented game of dress-up before he died on October 7.


Of course it was October.

For we knew not the month was October,
And we marked not the night of the year


12. Poe’s Cause of Death

Cracked's attempts to fit this story into its usual bite-sized countdown chunks seem forced here, so I'm skipping a few; basically, the next several points involve the various ideas about what might have led to his death. There are a lot of them, and as far as I can tell, none of them really fit.

My personal theory? He fell ill from gothic ennui.

The only person who saw Poe alive after he was brought to the hospital was Dr. John Moran, who kept changing his story.

Keep writing unreliable narrators and you, too, can have your last days confounded by an unreliable narrator.

You know what would be really helpful here? An autopsy. A death certificate. Any records of any kind. None have survived, if they ever existed, and no autopsy was ever performed on a famous writer who died a bizarrely mysterious death. We don’t know, guys. Our money’s on the hospital administration.

Well, this is America we're talking about. Most likely he died of shock after seeing the hospital bill.

To die laughing must be the most glorious of all glorious deaths!
         —E. A. Poe
November 22, 2022 at 12:01am
#1040914
Nope.

I Spent the Winter Solstice in One of the Darkest Places on Earth
During the phenomenon of polar night, parts of the Arctic don’t see the sun for weeks or months at a time. The darkness drives some people insane, but for others, it opens a gateway into wonder and peace.


Well. A qualified "nope," anyway.

About eight years ago, I stepped through the unlocked door of a 1915 cabin-turned-chapel in Wiseman, Alaska, an Arctic settlement of about a dozen people roughly seven hours north of Fairbanks.

I looked up Wiseman when I found this article. The "dozen people" thing appears to be true, though some sources say less. Oddly, there is a bed and breakfast there, called Arctic Getaway.

It is quite literally in the middle of nowhere.

The pastor, who had lived in Wiseman for decades, described the inexorable march of darkness as a force both terrifying and beautiful. She spoke of chopping wood, preserving berries, and squeezing the joy out of every moment of daylight before a winter in which, for more than a month, the sun never rises above the horizon.

That's the actual definition of "existing north of the Arctic Circle." You also get a month or so of permanent daylight in the summer. Given my complicated relationship with the accursed daystar, I'm not sure which is worse.

The notion of such sustained darkness in a remote corner of the planet unnerved me. Residents of the Arctic tell stories of people losing their minds in the black of polar night. But I also felt strangely curious—and drawn to return one day.

I, too, admit to some curiosity. But not enough for me to actually go haring off to the Arctic. It's cold and there's probably no internet. On the plus side, there's the aurora borealis. I wouldn't mind seeing that once.

It’s not exactly easy to get to at any time of year and services like hotels and transport are few.

Well, there is that B&B. And being Alaska, don't they all get around by small aircraft? Also, Maps shows an actual state road going through it (apparently built to support the Alaska Pipeline), but I can't be arsed to see if it's passable in the winter.

I did, however, note that there's a place just south of Wiseman called Coldfoot. I immediately assumed that this referred to frostbite, but Wikipedia has other ideas:

Coldfoot is a census-designated place in Yukon-Koyukuk Census Area in the U.S. state of Alaska. The population was 34 at the 2020 census. It is said that the name was derived from travelers getting "cold feet" about making the 240-some-mile journey north to Deadhorse.


So apparently there is also a town (or whatever you want to call it) named Deadhorse. Okay, Alaska.

But last summer, a friend forwarded me an email about a tiny off-grid six-person retreat center that had just opened outside of Wiseman. The owners were hosting a week-long trip that included yoga and exploring the Arctic wild with skis, snowshoes, and dogsleds, and the dates fell right on the winter solstice.

Nope, nope, nope, and nope. Also nope. But at least it's (probably) not hot yoga, though I'm not sure if freezing-your-ass-off yoga would be any better.

I’m not exactly a cold-resistant creature: I’ve suffered from hypothermia multiple times and frostbite that turned my feet white and wooden. I’m generally dressed in a sweater and jeans when my friends are wearing shorts and flip flops. Even at much more temperate latitudes, seasonal affective disorder runs in my family.

I consider anything below 70F to be "cold." Anything below about 60F is "too damn cold." Like, tonight, it was around 40F and in order to take my recycling to the curb, I had to put on my battery-powered vest, scarf, heavy coat, and ushanka hat. I experience seasonal depression, too, but it's not the darkness that would stop me; it's the temperature, and, again, I can't emphasize this enough, no fucking internet.

I also contemplated the wisdom of traveling during a pandemic, and the carbon emissions of flying long distances.

Look how virtuous I wanted to be, but I did it anyway, tee hee.

Soon after arriving, I tugged my snowpants over my jeans, donned both my down jacket and an insulated parka, and pulled on my warmest hat for a short walk.

Whyyyyyy?

The cold blew through it all in seconds. My eyelashes froze and my nose hairs crinkled. The liquid on my eyeballs felt like it was turning to slush. Even the slightest breeze lacerated my cheeks, and my mind felt tight with a barely concealed panic.

Look, I'm not going to berate anyone for stepping outside their comfort zone. I need to do it myself, from time to time. But there's leaving your comfort zone, and then there's going to goddamn Alaska in cocksucking December.

Between November 30 and January 9, the residents of Wiseman, Alaska, do not see the sun. They lose about 12 to 15 minutes of light each day until the solstice and then gain it back just as quickly. The future always looks scarier from the confines of imagination, and polar night was not so unnerving once I was in it. It was actually brighter than I anticipated—locals like to say that on the winter solstice, there are still five hours when it’s light enough that you can’t see the stars.

That's the thing a lot of people don't get about the Arctic Circle (or its southern counterpart). Sure, there are stretches of time when the sun is below the horizon, but depending on how far toward the pole you are, this can be basically a really long twilight.
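
If you want to check the math yourself, the standard textbook formula for day length uses the sunrise hour angle: cos(H) = -tan(latitude) * tan(solar declination). A rough Python sketch (approximate values, my own illustration, not from the article):

import math

def daylight_hours(lat_deg: float, decl_deg: float) -> float:
    # cos(H) outside [-1, 1] means the sun never rises (or never sets).
    x = -math.tan(math.radians(lat_deg)) * math.tan(math.radians(decl_deg))
    if x >= 1.0:
        return 0.0    # polar night
    if x <= -1.0:
        return 24.0   # midnight sun
    return 2 * math.degrees(math.acos(x)) / 15  # hour angle -> hours

# Wiseman sits at roughly 67.4 degrees N; solar declination is about
# -23.4 degrees at the winter solstice.
print(daylight_hours(67.4, -23.4))  # 0.0: the sun stays below the horizon

Note this counts direct sunlight only; the hours of "can't see the stars" twilight the locals describe are extra.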

Not that I've ever ventured that close, myself. The furthest north I've ever been was southern Scotland, unless you count the path my plane took to get there (which happened much closer to the summer solstice, so I could see the midnight sun shining through the plane's windows).

Anyway. The rest of the article attempts to wax poetic about the author's experience and, while I can appreciate the language, every other sentence just made me yelp "nope" again.

One night, I wandered out of my cabin, wrapped in a sleeping bag I had brought just in case, and watched slack-jawed as the northern lights whirled across the dome overhead like a luminous river. After many days, the formidable peaks of the Brooks Range finally disrobed from their mantle of clouds and shone resplendent in the moonlight.

That bit, though... that almost makes me want to visit.

Almost.

At least there's pictures and descriptions that I can see and read because I have a freakin' internet connection in a heated house.
November 21, 2022 at 12:02am
#1040879
One of the more interesting branches of folklore is kidlore.

Why Did We All Have the Same Childhood?
Children have a folklore all their own, and the games, rhymes, trends, and legends that catch on spread to many kids across time and space.


Though I'm pretty sure some of the stuff circulating in school when I was a kid would get someone in huge trouble nowadays.

For example, we had a song we sang to the tune of "Battle Hymn of the Republic":

Mine eyes have seen the glory of the burning of the school
We have wiped out all the teachers, we have broken every rule
We got sent up to the office and we shot the principal
Our school is burning down
Glory, glory, hallelujah!
Teacher hit us with the ruler
Glory, glory hallelujah
Our school is burning down


...yep, these days that would result in forced therapy at best and juvie at worst. Just for singing it.

You might not think of typing “BOOBS” on a calculator as cultural heritage, but it is.

Definitely. We also would type 7734 which, when turned upside down, said "hell." This was the height of hilarity in like 4th grade. In our defense, this was very early in the history of pocket calculators. They ran on 9V batteries and the screen was red LED, and most of them could do no more than add, subtract, multiply, and divide. And spell hell and boobs.
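
For the record, the whole trick is just a digit-to-letter mapping on an upside-down seven-segment display. A minimal Python sketch (my own, obviously not from the article):

# Flip the calculator over: reversing the digits and mapping each one
# gives the "word" you see.
FLIP = {'0': 'O', '1': 'I', '2': 'Z', '3': 'E', '4': 'h',
        '5': 'S', '6': 'g', '7': 'L', '8': 'B', '9': 'G'}

def read_upside_down(digits: str) -> str:
    return ''.join(FLIP[d] for d in reversed(digits))

print(read_upside_down("7734"))   # hELL
print(read_upside_down("58008"))  # BOOBS

Fourth grade, automated.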

This sacred communal knowledge, along with other ephemera of youth—the blueprints for a cootie catcher, the words to a jump-rope rhyme, the rhythm of a clapping game—is central to the experience of being a kid.

We didn't call 'em cootie catchers. In fact, cooties were kind of a foreign concept in my school, something talked about like it was from another culture. No, what they're calling a cootie catcher, we called a fortune teller, and there were intricate ways of labeling and using the thing.

When children are together, they develop their own rituals, traditions, games, and legends—essentially, their own folklore, or, as researchers call it, “childlore.”

I like my "kidlore" better. Rolls off the tongue more easily.

Even seemingly more modern inventions, such as the “cool S”—a blocky, graffiti-ish S that has been etched into countless spiral-bound notebooks—are a shared touchstone for many people who grew up in different times and places in the U.S.

The main graphic at the link displays the S in question. I always somehow associated it with the band Styx, even though their stylized logo S was somewhat different.

Indeed, thinking back to the lore of my own youth, I have no idea how my friends and I thought to give each other “cootie shots” with the lead of a mechanical pencil, or why everyone in my elementary-school art class would smear their hands with Elmer’s glue, wait for it to dry, and then methodically peel it off (other than the fact that it was super fun and I would do it again right now if I had some glue nearby). These things were almost like analog memes, micro-bits of culture that seemed to come from nowhere and everywhere.

I'mma stop you right there: "almost like analog memes," my ass. You know what the actual definition of a meme is? Unlike kidlore, we know exactly where and when the word and concept of "meme" came from: Richard Dawkins, who meant it as "a unit of cultural transmission" and the definition got into a dictionary as "an element of a culture or system of behavior that may be considered to be passed from one individual to another by nongenetic means, especially imitation." By the definition of meme, kidlore is memes, full stop, end of discussion.

Labeling funny images with text "memes" came later. I'm not saying it's wrong; definitions change and in the immortal words of Robert Plant, you know sometimes words have two meanings. But saying that a particular bit of kidlore is "like" a meme is ignorant as shit; it is absolutely a meme.

Anyway...

The main way childlore spreads is, perhaps obviously, by children teaching it to one another. Older kids mentor younger ones both at school and at home, where siblings play a vital role in passing jokes and games down through generations.

And most of this lore is stuff that would horrify parents, who will conveniently forget that they participated in the same rituals as kids.

Parents and teachers share nursery rhymes, folk songs, and games with kids, and adults create the movies, books, and TV shows that kids consume.

Yeah, but those are deliberately sanitized, adult propaganda to convey what adults think children should know. Kidlore is different, and often includes elements that would never make it into a Disney movie. Like the song parody above, or one of the many racist chants I was subjected to as a kid.

The author does point this out later, albeit parenthetically.

Although some elements of childlore last and last, others come and go with the culture of the moment. But even then, Willett told me, kids often build on what came before, whether they realize it or not. For instance, COVID-19 has shown up in many kids’ games, including coronavirus tag— which is, of course, built on one of the most classic kids’ games there is. (Roud suspects that in Victorian times, European children played cholera tag or something similar.)

Interestingly enough, the rhyme "Ring around the rosie" was not, as is commonly assumed, about the Plague.

Another example Willett gave, from one of her studies, was a game based on the Weeping Angels from Doctor Who—monsters that can move only when you’re not looking at them.

I just gotta say, that's awesome.

And so, as we come of age, we may lose an understanding of something we once knew in our very bones: that typing 8-0-0-8-5 on a calculator is not just naughty and fun, but important. The rebellious thrill, the intense comradery, the urge to pass the knowledge along (and pretend you came up with it yourself)—all of these things fade with time.

I'm sure there's a lot of it I've forgotten. But I knew then, and I know now, that these things were important. And they were just as much a part of learning as any formal schooling. Rhyming taught me wordplay. Figuring out how to make a simple pocket calculator say a naughty word led to my interest in computers (now I can make them say ALL the naughty words). Folding and scribbling on a fortune-teller led to an interest in random numbers and topology.
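
And since the calculator bit is the one piece of kidlore here you can actually code up: the trick works because certain digits pass for letters. A minimal Python sketch, purely for illustration; the digit-to-letter mapping is my own assumption, not any official kidlore canon.

# Decode "calculator words" -- digits a bored kid reads as letters.
LETTERS = {"0": "O", "1": "I", "3": "E", "4": "H", "5": "S", "7": "L", "8": "B"}

def calculator_word(digits: str) -> str:
    """Read a string of digits the way a fifth-grader would."""
    return "".join(LETTERS.get(d, "?") for d in digits)

print(calculator_word("80085"))  # -> BOOBS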

The racist chants, though, those I could have done without... but I guess they taught me that some people are best avoided. By which I mean the ones perpetuating those memes.
November 20, 2022 at 12:01am
#1040853
One of the reasons I've been randomly digging into the archives on occasion is that things change, especially me.

"RarityOpen in new Window. is from mid-2009, and it made me wonder just how much some things actually change. The entry was done in what would eventually become my standard format, commenting on an article I found on the internet. Being from 2009, though, the actual article is gone, but hopefully there's enough in my entry to get the general idea: some employers were allegedly having trouble filling positions.

Like I said, made me wonder just how much some things actually change.

In 2009, the US was in the middle of the Great Recession, marked in part by high unemployment rates. It peaked then at about 10%, and that number would steadily decline over the next several years until it shot up again in early 2020 (gosh I wonder what happened then)—but we couldn't have known that at the time. The current value is around 3.5%, close to what it was in the Before Time.

With an unemployment rate that low, I can understand employers not being able to find workers. At 10%, though? Well, I guess the original article was about professionals and trade workers, not unskilled labor.

What stood out for me in my original entry above was this:

Businesses try to create a glut of workers so that they can have more control over the workers. Supply and demand, folks. If there are 30 engineers competing for one position, they'll go with the competent one who can live on the lowest salary. Just business. If, however, there are 30 possible positions for each engineer, the engineer's in the catbird's seat.

Now, I'm not sure why I phrased it that way, implying that businesses somehow create workers. I mean, I guess they do to some extent when they go to the press with "we have a shortage of [nurses|engineers|welders|whatever], so kids should go learn these trades!" But I can't see that making a huge difference.

What I probably should have said was that businesses like it when there's an abundance of workers. And I still think that way. Oversupply of workers leads to management deciding they can make more demands of them, while a scarcity of potential employees means someone looking for a job can be the one doing the demanding.

Embarrassing moment for me in the entry:

Let's play SAT test (don't worry; no math in this one).

Oh, I'll need to pull cash from the ATM machine and use my GPS system to get to the testing facility. Oy. Sorry about the redundancy. I make mistakes from time to time. I know, I know; it was hard to believe when I wrote yesterday that I don't always get everything right. But this is proof.

In any case, I was struck by the similarities between the 2009 article (at least the excerpts that survived in my entry) and a lot of the rhetoric businesses are spouting today. I tried to find a similar article from this year, or at least last year, to see what "the hardest jobs to fill" are right now, but my search came up with nothing relevant. I did find this, but there's no date on the page (apart from a copyright that probably updates every January 1), and the salaries listed appear to be from 20 years ago. So I have no idea what the hardest jobs to fill right now are.

I suspect "blogger" isn't on the list, though.
November 19, 2022 at 12:02am
#1040820
Today's article is about avid readers, which I'm sure all of my readers are. Else you wouldn't be reading this.

Words we think we know, but can't pronounce: the curse of the avid reader
Do you know how to say apropos? What about awry? We want to know which words you’ve mispronounced – and how you found out your mistake


This is from a couple of years ago, but that's probably irrelevant; I just found it last month. What is relevant is that it's an article about English pronunciation in the Guardian (British), written by an Australian woman, and it's well-known that the US, the UK, and Oz pronounce certain words in different ways. "Privacy," for example. I think only the US pronounces that with a long i. So just keep that in mind.

When I mispronounced tinnitus (ti–nuh–tuhs is correct, ti-nai-tis is not) recently and was kindly corrected, my embarrassment was a fraction of when I said apropos (a–prow–pow instead of a-pruh-pow) to a large table of people in London when I was in my 20s. That day I was not kindly corrected, but only realised my mistake after howls of laughter and a whispered, “Maybe that’s how they say it in Australia?”

Now, see, I always thought it was ti-nai-tis. Some sources say both are correct. Officially, the first syllable should be emphasized. If enough people pronounce it the "wrong" way long enough, though, it becomes an alternative pronunciation.

As for "apropos," well, at least she didn't pronounce the s at the end, right?

Since then, I have learned that mispronunciation is often the downfall of people who read widely as children and form the incorrect pronunciation in their mind before actually hearing the word said aloud.

You know what? That shouldn't be embarrassing. It means you read. What should be embarrassing, but too often isn't, is the polar opposite: when you try to write something as it's pronounced, and you spell it wrong. I worked with a guy who kept writing stuff like "part in parcel" (should be "part and parcel"), "save and accept" (save and except), and "beckon call" (beck and call). Those errors aren't proof of illiteracy per se (he would write "per say"), but they do indicate a lack of interest in reading. And don't get me started on affect/effect.

What's even worse, of course, is mixing up things like it's and its; there, they're, and their; and your and you're.

Now, I'm not saying I always get everything right. Far from it. Only that I have more respect for people who mispronounce things because they read a lot than (not "then") I have for people who misspell things because they hardly read.

My ex-wife, for example, pronounced "picturesque" like "picture-skew." I thought it was adorable and never corrected her. Though in hindsight I should have used it against her in the divorce.

In short, I'd rather deal with someone who mispronounces "apropos" than with someone who writes it "apropoe."

Annals (not ay-nals), Hermione, misled (does not rhyme with thistled) and glower...

Look, it should be blindingly obvious that annals isn't pronounced the same way as anals. No one in the US knew how to pronounce Hermione until the first Harry Potter movie came out. I never thought misled rhymed with thistled. As for glower, well, honestly, I never was very sure about that one, so I avoided saying it (turns out it's pronounced like flower).

A colleague pronounced facade with a k sound, another thought burial rhymed with Muriel and yet another was mortified to discover that segue was not pronounced seeg.

At least two of those are a result of not knowing the French influence on English.

English pronunciation can be tricky like that, anyway. We've borrowed so many words from other languages, words where you have to know a bit about the language to pronounce them correctly. Like, if you see the word "sake," you need to know if it's preceded by "oh for fuck's" or if you're talking about delicious Japanese rice wine.

French words very often leave English-speakers flummoxed. I’ve heard canapés pronounced in quite creative ways, and amuse-bouche, prix fixe and hors d’oeuvre have seen the odd food lover come a cropper.

Before I started learning French, I had a lot of fun deliberately mispronouncing French words. Canapés became can o' peas, for example, and hors d'œuvres became, to my vast personal amusement, horse doovers. And the surest way to annoy a French person is to say "Par-lezz-vouse fran-kais."

What word have you always mispronounced?

The article recommends commenting there with an answer to that. I wouldn't advise it as, again, this is over two years out of date. It also recommends tweeting same, which I definitely don't recommend right now.

But if you want to give me your examples below, feel free. Me? I don't know which words I'm mispronouncing. If I did, I wouldn't mispronounce them anymore. I know I used to think that rappelling (the practice of descending a rock face on ropes) was like rapple-ing, but once I was laughed at and corrected, I said ra-PELL-ing like you're supposed to. But that was back in high school.

I guess what we need is a verbal version of spell check. Something that makes a red squiggly line appear in your vision when you're about to mangle a word that you've only ever seen in print. Alas, we'll have to wait until cyborg technology is more advanced for that.
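
Until the cyborgs arrive, a pronouncing dictionary is about the closest thing we have. A minimal sketch, assuming the third-party Python package pronouncing (a wrapper around the CMU Pronouncing Dictionary) is installed; the function name is my own invention.

# A poor man's verbal spell check: look the word up before saying it aloud.
# Requires: pip install pronouncing
import pronouncing

def how_do_i_say(word: str) -> None:
    phones = pronouncing.phones_for_word(word.lower())
    if phones:
        # ARPAbet notation; a word may have more than one accepted pronunciation
        print(f"{word}: {' | '.join(phones)}")
    else:
        print(f"{word}: not in the dictionary -- wing it and hope")

how_do_i_say("apropos")
how_do_i_say("tinnitus")

No red squiggly line in your vision, but it beats howls of laughter at a London dinner table.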
November 18, 2022 at 12:02am
#1040776
Hell of a hazard.

A Woman Was Caught Whacking a Golf Ball into the Grand Canyon, and the Feds Aren’t Happy
The latest story of a tourist behaving badly in a national park is a real head scratcher


Source is (shudder) Outside magazine. I still don't know why I keep reading their stuff.

Somewhere in the dark recesses of my memories lives my long-forgotten teenager sensibilities. This is the version of myself that delighted in immature pranks, like toilet papering a classmate’s cottonwood trees and playing ding-dong ditch.

Both of which are annoying but relatively harmless. If no one was injured, a school wasn't evacuated, and nothing caught on fire, you were a goody-two-shoes.

I'm not admitting to anything, by the way. Just saying.

I’ll admit it: my teenaged self would absolutely understand the allure of whacking a golf ball off of the side of the Grand Canyon and watching it disappear into the chasm below.

Okay, so true story: they taught us the basics of golf in sophomore gym class in high school. As I recall, we split up into pairs and each pair got a golf club (don't ask me what its number was or whether it was iron or wood) and a wiffle ball the size of a regulation golf ball. The idea was to learn our swings and recover the ball easily.

I was paired up with the class stoner, who, with a level of perception and intelligence only displayed by a high school stoner, found the one real ball in the box of wiffles. One of each pair of students teed up, and on the coach's command, wound up and swung.

Everyone else's ball caught a bit of air and then dropped down to bounce sadly on the grass. Ours, however, made a perfect golf-ball arc through the air and ended up 300 yards downrange.

Coach got up in our faces. "WAS THAT A REAL BALL?"

Stoner nodded. (Like anyone could have done that with a wiffle.)

"YOU GO GET THAT RIGHT NOW."

So we trudged through the bushes. As soon as we were out of sight, my teammate produced a joint from his pocket and sparked it.

Like I said, perceptive and intelligent.

As I recall, we did find the ball, and both got lousy grades in that gym class. Which was the only class my parents didn't care what grade I got in, so it all worked out for everyone involved. Coach got to get in someone's face; stoner got to get high in class; and I got a story.

Anyway. The relevant thing is that, while I certainly got up to some shady shit in high school, I don't think I ever considered whacking a golf ball off the rim of the Grand Canyon. For one thing, it's 2100 miles away. For another, I don't play golf (it generally requires being *shudder* outside).

On Thursday, National Park Service officials posted an update on the Grand Canyon’s official Facebook page about a woman who filmed herself hitting a golf ball into the canyon, which she then uploaded to TikTok. In the video, the woman also loses the grip on a golf club and flings it off the cliffside.

If it wasn't recorded and posted on social media, it didn't happen.

Now, look. The problem with lofting a golf ball into the Grand Canyon isn't that it might hit someone. The chance of that is infinitesimally small, though admittedly, if it did happen, the consequences would be terrible. No, it's that it might inspire other people to do it. TikTok trends are a thing, and I can definitely see the "Grand Canyon Hole In One Challenge" going viral. With that many balls flying through the air, the chance of hitting someone increases significantly... as does the amount of litter, which is the real problem here. (The article does mention all this later.)

Officials acted swiftly, and with the help of the general public, were able to track down the woman.

Snitches.

At Outside, we come across a litany of stories of people behaving badly in the outdoors, and this year has been a busy one.

More reasons not to go into the outdoors.

There were the high schoolers who booted a football off of Colorado’s Uncompahgre Peak...

Touchdown!

...the dudes who were photographed scrawling graffiti in chalk on a rock at the Grand Canyon...

At least it was chalk and not spray paint?

...and the never-ending march of tourists getting too close to animals at Yellowstone National Park.

Those, I have mixed feelings about. I mean, it might injure the poor animal if it attacks and kills the stupid human, but other than that, it only harms the stupid human. So while it's true that one shouldn't get cuddly with a Yellowstone grizzly, bison, unicorn, or whatever they have out there in the wilderness, the penalty is built right in.

Article doesn't mention my favorite Yellowstone idiocy, which is people who see the pretty pools of azure water and decide to go off the very clearly marked trail festooned with danger signs and warnings (you can't tell me what to do! mah freedumz! stoopid government!) to take a dip. Some of those pools are near boiling and have a pH of like 1.5. Here's someone of whom nothing was left but a foot.

I mean, that sort of thing is just par for the course.
November 17, 2022 at 12:01am
#1040741
Speaking of food, I continue to see people raving about sourdough, which apparently a lot of people played with when they were stuck at home over the past couple of years. Me, I just continued buying bread from the bakery as nature intended. Every time I've tried to bake bread, bad things happened.

Anyway, this article isn't about pandemic sourdough bakers; it goes back a bit further than that.

San Francisco’s Famous Sourdough Was Once Really Gross
Gold miners made themselves sick on smelly, hard loaves.


Long before it became a viral food trend or social-media sensation, American sourdough was surprisingly gross.

So, more like the other meaning of "viral."

San Franciscans proudly trace their city’s iconic bread back to the Gold Rush of 1849.

That's fair, but it should be noted that before people knew what yeast actually was, almost all bread was "sourdough." I think some was made with repurposed beer yeast, but the origins of sourdough extend back to long before San Francisco was a thing.

The men who flocked to Northern California in search of gold made bread in their wilderness camps not with store-bought yeast, but with their own supply of sour, fermented dough.

Yeah, I could be wrong, but I don't think there was a lot of store-bought yeast there at that time.

Letters, diaries, and newspaper articles written by and about the 49ers, lumberjacks, and pioneers of the American West are full of complaints about horrible and inedible sourdough. Could bad bread really have inspired San Francisco’s most beloved loaf?

Or it could be gold miners protecting their hoard. "Don't come out here. The weather is shit, people are camping everywhere, and the bread sucks." You know, a bit like San Francisco today.

In 1849, when gold miners began arriving in San Francisco, most Americans didn’t bake or eat sourdough bread. American bakers typically leavened their bread with “barm” (a yeast derived from beer brewing) or one of several relatively new commercial yeast products.

Yeah, see? Look, I comment on these things as I go, and it's nice to turn out to be mostly right.

Hm, I wonder... why, yes, that is the origin of the mostly British word barmy.

These commercial yeasts were easy to work with, didn’t require constant maintenance, and produced reliable results. They also produced bread that appealed to American taste buds.

American taste buds suck. They think Budweiser is beer, pasteurized process cheese food is cheese, and Wonder Bread is bread.

Most 19th-century Americans preferred bread that was sweet rather than sour. According to one 1882 advice book for housekeepers, the “ideal loaf” was “light, spongy, with a crispness and sweet pleasant taste.” Sour bread was a sign of failure. As a result, bread recipes from the period used commercial yeasts along with considerable amounts of sugar or other sweeteners to speed up fermentation and avoid an overly sour flavor.

The ideal bread is a French baguette with a crispy crust. Period.

Sourdough required only flour, water, and fresh air. A sourdough “start” needed care, attention, and regular feedings but offered an inexhaustible, self-perpetuating supply of leavening agent, even in the wilderness.

Some brewers make beer with wild yeast, too. The results are all over the place. Some of them are also described as "sour."

Bread was baked under difficult circumstances—outdoors, over a campfire or hot coals, and sometimes in the same flat pan used for panning for gold—leading to inconsistent and unsanitary results.

But they could have sold it as artisanal sourdough with flaked gold.

Sourdough baked by pioneers wasn’t just gross and unappetizing; it could also make you sick.

Well, duh. Not all the microorganisms floating around are beneficial.

Across the American West, sourdough was considered a food for unmarried men who didn’t know how to cook.

As opposed to married men who didn't know how to cook.

So it’s worth asking: If sourdough bread baked by miners was so terrible, how did it become one of San Francisco’s most beloved foods?

It all came down to the success of the city’s French and Italian bakeries.


Yep. That'll fix it, alright.

By the second half of the 20th century, tourism boards in San Francisco were placing the 49ers at the center of the city’s history, idealizing life on the frontier and playing up links between the Gold Rush and the City by the Bay. San Francisco bakeries joined in, crafting stories about partnerships between bakers and miners and attempting to market the bread nationwide.

And once again, we see that no matter how disgusting something may be, if you market it right, it'll become popular.
November 16, 2022 at 12:01am
#1040706
Yeah... I can relate.



Of course, ordering a pizza takes less than 30 minutes. Sure, you usually gotta wait longer than that for it to show up—Domino's nixed the whole "30 minutes or less" thing years ago, and they suck anyway—but at least you can be playing video games while you wait.

We’ve all fallen for the trap before. Wooed by the promise of pan-seared chicken thighs in 30 minutes, only to be defeated and left overanalyzing what went wrong more than an hour later. Or worse, we’ve thrown some onions in a pan to caramelize while we’re searing a batch of burgers, only to find ourselves still stirring the onions dejectedly, 45 minutes later.

It's not just recipes, either. "Minute" rice can take way longer than a minute to cook.

It’s right there, staring at me. Cook time: 30 minutes. But a closer look at the ingredients says otherwise. Five garlic cloves, minced. One stalk of celery, thinly sliced on the bias. Two carrots, peeled and chopped. One yellow onion, finely diced. There go 15 minutes already (on a good day, with a sharp knife, and no distractions), which doesn’t even account for the five minutes needed to compose myself after tearfully hacking at an onion. And that’s only half the battle, if we’re counting the unglamorous process of washing and thoroughly drying all of those vegetables.

Not to mention the half hour or so you spend cleaning up what your roommate left in the sink and on the stovetop.

Look, I'm a big fan of mise en place, and was even before I started seriously learning French. Get all that measuring and chopping crap out of the way before you start cooking, and you're not stuck watching your pot burn while chopping the onions in the middle of it all. There will be at least one onion that's started to go bad, too, so you always need more onions than you think.

I have managed to keep onion juice from messing with my eyes, though, so there's that.

But the problem with mise en place is that you're not multitasking, so it usually takes more time to cook something if you're careful about getting everything all set before you fire up the stove.

Recently I fell into a similar trap after being convinced by a trusted blog that 35 minutes was all I would need to make mapo tofu in my Brooklyn kitchen.

Gotta get that humblebrag in. At least in Brooklyn, if you suddenly find you're out of ingredients, it's a much quicker trip to get more than in most parts of the world.

After pulverizing Sichuan peppercorns with a mortar and pestle, peeling and mincing a three-inch knob of ginger, finely chopping half a head of garlic, and rummaging through my dish rack to get enough small bowls and spoons to premeasure the rest of the ingredients, I’d already blown past the 20-minute mark, and I hadn’t even turned on the stove.

The peppercorn thing is way too much work. Ginger is way too much work, too, but it's worth it. And don't get me started again on peeling garlic.

Beneath a fish taco recipe advertised as a “fast dinner for hungry, busy people” in the New York Times, a comment reads, “It’s unbelievably condescending to claim this meal takes 30 minutes. It took me 15 minutes just to make the salsa, 7 for the mayo, 10 to warm all the tortillas, and a full 30 to fry all the fish in batches…. Great recipe, horrifically underestimated execution time, especially for those with kids running around.”

If it's taking you 10 minutes to warm up tortillas, either you're feeding the Mexican army, or you're doing something very, very wrong.

The conditions we’re under have their own matrices of variables. “Part of it is that recipes don’t account for skill levels—such as how fast you chop or mince and the equipment you have at your disposal,” says Kelly Chan, a Queens-based nonprofit analyst who’s often folding dumplings or prepping Cantonese-style stir-fries. Recipes are written with the presumption that all home cooks have speedy, chef-like knife skills to whiz through a mountain of shallots and tomatoes, or that they know how to butterfly a chicken without pausing, washing their hands, and looking up a YouTube tutorial. Even the Instant Pot—widely adored among home cooks for its shortcuts to complex 20-minute pho broths or five-minute steel-cut oats—still needs time to preheat and depressurize, effectively tripling the cooking time in some cases. (But of course no one tells you that, because it’s called an Instant Pot for a reason.)

I've never used an Instant Pot, but my gut told me the name was an exaggeration. Not just that, though, but also the work involved in cleaning it keeps me from buying one (my housemate has one, but rarely uses it, and I'm concerned I'll muck it up). Every time I consider a new kitchen gadget, I mentally figure out how much work cleaning it will be, and usually don't bother. One exception is a blender; those are usually worth the work to clean.

Real cooking proficiency isn’t about whipping things up without a recipe—it’s about reading between the lines of that recipe and knowing when an hour means two hours.

I usually mentally double a recipe's stated cooking time, and it still often runs longer than that. One time, I was trying to make latkes. I knew going in that it would be labor-intensive; that's just the way it goes. What I didn't account for, though, was that the damn things took three times as long to cook as I expected. To be fair, this doesn't always happen; one should use russet potatoes for latkes, and I had to get some other kind because the store was out of russets (this was around Hanukkah a couple years back; I guess everyone else was making latkes, too. Everyone's Jewish for latkes.)

My biggest gripe about cooking for myself, which is the usual case, is that I generally think that a dish shouldn't take longer to cook than it does to eat. Sometimes I do it anyway, for practice. But after laboring over a hot stove for two hours and finishing the resulting meal in less than five minutes, I'm left with the distinct impression that I've wasted my time.

Maybe I should buy more frozen dinners.
November 15, 2022 at 12:01am
#1040668
Look, sometimes the Random Number Generator (peace be unto it) gives me the same source twice in a row. But I promise you this one's worth it, because it resolves a question I've had since I first learned what a question was.



I know I've commented on this before. For a while, I was methodically going through every single episode of every Star Trek show, plus the movies, in chronological order. I think the first time I noticed any reference to a toilet (or that would be head, since it's a ship) was sometime in the early 90s, some 25 years or so after the show's beginning.

But, apparently, I'd missed some. We'll get to that.

Without its fantastical future technology, Star Trek would just be a series about people who love sitting in comfortable chairs.

People watch shows like that all the time.

What has wowed audiences for decades are inventions such as the faster-than-light warp drive, the matter transportation system, and the Starfleet human resources nightmare that is the Holodeck.

Someone, I think it was Dave Barry, noted that the holodeck would be humankind's last invention. Personally, I don't think we'll ever invent it. Not because we won't be able to, but because, well, look at the Metaverse. Any real-life implementation of a holodeck will be absolutely loaded with safeguards, to the point where no one will be able to do anything fun with it, it will be a joke, and everyone will laugh at the inventor (who will consequently become a supervillain: "Laugh at ME, will you? Let's see who laughs now muhuahahaha!!!")

But anyone who’s ever watched any Star Trek TV shows, movies, or adult movies probably has some serious questions about how this fictional universe really works – perhaps the biggest being: where the hell does everyone go to the bathroom?

This is, indeed, one of the biggest questions in the universe.

I should note that, at about the same time DS9 was airing, and also around the same time I saw someone in the Trek universe finally acknowledge the existence of a toilet, Babylon 5 (another SF show set on a space station that stayed in one place) not only acknowledged it, but set scenes in it.

Anyway, the next bit points out that there was a door labeled HEAD on the bridge of the Enterprise-D. That was TNG, before DS9.

And while the production “did not design or build the inside of that bathroom,” it was still there, just in case Number One had to take a … well, you know. Also, in the crew members' “various living quarters,” there is an unopened door that “we assume led to a bathroom.”

I should also note that this was around the time when, in attempting to answer my own version of the question (and the Trek novels were no help in that regard, either), I finally decided the key had to be transporter technology. Think about it. A transporter, under normal use, records the position and connections of every molecule, atom, proton, electron, and so on in your body (they have "Heisenberg compensators" to handle the Uncertainty Principle) and your clothes and equipment, right? And when you arrive at your destination, your clothes aren't somehow bonded to your body, etc.

Well, part of your body is the shit in your intestines and piss in your bladder. Sorry, but it's true: transporting someone necessarily requires transporting whatever waste products are awaiting exit. And with the transporter able to easily distinguish between different molecules, certainly it can tell shit from shinola. So. All you have to do is program the transporter to image you, then pull out the waste. No need to even strain on the throne; just push a button and boom, it's gone.
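
Put another way, my theory amounts to a filter operation. A tongue-in-cheek Python sketch; everything here (the Molecule type, the tags) is invented for the joke, not anything from canon.

# My transporter theory as code: image the body as tagged molecules,
# then decline to rematerialize the waste. Starfleet has not reviewed
# this implementation.
from dataclasses import dataclass

@dataclass
class Molecule:
    kind: str  # e.g., "protein", "water", "waste"

def transport(body: list[Molecule]) -> list[Molecule]:
    """Rematerialize everything except what was awaiting exit."""
    return [m for m in body if m.kind != "waste"]

crew_member = [Molecule("protein"), Molecule("water"), Molecule("waste")]
print(len(transport(crew_member)))  # 2 -- no straining on the throne required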

Where it goes is... well, let's keep reading. My theory turns out to be wrong, anyway. But it could have been right.

So we know that the Enterprise is equipped with toilets, but do we know what happens to the human waste? Do they make Chief O’Brien beam it out? More sensibly, one would think that the Enterprise crew could simply fire their poop into space from time to time, like a smellier version of Spock’s funeral.

Send it all to the Klingons like Scotty did with the tribbles?

But it turns out that this might be a terrible, terrible idea.

It is, for many reasons, not the least of which is that the next starship zipping through the area would come into spacedock with skid marks.

Dr. Siegel speculated that, while none of the Trek shows go into much detail about toilet-based issues, “every bit of eliminated human waste is useful matter that you can reconstitute into something else.” Meaning that any waste created could potentially be used to “power the Replicators” – you know, the device that people of the future use to create small objects, cups of piping hot tea, and food.

Now, look. Some people are going to be grossed out by this idea. But come on. How do you think it works right here on Earth, and mostly without technological interference?

Everything you eat contains atoms that were once part of poo. Everything you drink contains atoms that were once pee. Every breath you take inhales atoms that were in a fart. Hell, worse, they've all been part of dead things. I don't mean fresh dead things like the kind you eat (even if you're vegan), but yummy, delicious carrion... oh, sorry, channeling my spirit animal again.

A replicator would just speed up that process.

Part of civil engineering, though not a part I ever participated in directly, is sewage treatment. You take all that waste and process it, and (in theory anyway) the water becomes clean enough to dump into a river. It's then either evaporated, falling back as rain that you might eventually drink; or it's processed by fish that you might eventually eat. Solid waste is sterilized and becomes fertilizer. Trek would just use the technological evolution of the same sort of processes.

According to Dr. Siegel, this is a pretty solid plan, although admittedly, you have to get over the “gross factor.” Like if you were to replicate a clarinet, à la Harry Kim in Voyager, you’d have to look past how “the atoms making up the clarinet that I'm putting in my mouth and playing right now were defecated out by me and a thousand other crew members onboard the ship.”

Again, the only difference between that and the way things occur naturally here on Earth is scale.

Still, seems to me that in Trek it would be simpler to cut out the whole "plumbing" bit and use the transporter like I said. But I guess that might set things up for some really gross practical jokes.
November 14, 2022 at 12:02am
#1040628
Today's article, another one from Cracked, is a fun (and sometimes funny) exercise in speculation.



And I do mean speculation. Which is okay; you gotta start somewhere, and speculating is better than being incurious.

Humanity's most important question, excluding pop media relationship statuses, Incognito rash inquiries, and whether or how various animals would wear pants, is this: are we alone?

Well, some of us definitely are.

Oh, you mean "we" as a species. Or maybe "we" as a biosphere.

Are there any intelligent aliens we could have a brew and watch the game with, or are they all crabs?

There was an idea floating around a while back that came out as "everything eventually evolves into a crab." This is, of course, crabshit, as a moment's thought about how evolution creates and fills environmental niches will confirm. The original idea is that many crustaceans, specifically, eventually take on the morphology of crabs, and there may be something to that. But it does represent one important leap in popular speculation about extraterrestrial intelligence: the idea that the human form isn't necessarily what evolution works toward (it actually doesn't "work toward" anything).

Speaking of which, this article uses "intelligence." I've beaten that dead crab a few times; basically, let's not conflate intelligence with technological capability. And please, please, stow the tired old "but we're not intelligent either" jokes. This article has enough of them.

Chillingly, we may be the gleaming example of advanced life in the entire universe.

Maybe. The Universe is a big place, though. So I doubt it.

In its usual style, the list is in reverse numerical order. Look, it's just their brand.

4. It's Not Just What Other Life Looks Like, But How We See It

Plants are green because they reflect green light. But the chlorophyll that powers their photosynthetic planty prowesses is extra reflective in near-infrared. Sadly, we're limited to seeing the visible spectrum of light, which is a tiny portion of the entire spectrum.


This is, essentially, true. But there's a decent reason why we see the sliver of spectrum that we do, and not way out in other wavelengths: it's the relative transparency of water (where our distant ancestors evolved) and air (where our more recent ancestors evolved) to those particular wavelengths. Now, some species see higher or lower wavelengths, but our red-to-violet vision is more than acceptable for what evolution produced vision for in the first place: seeing predators coming, and seeing prey.

The rest of this section goes deep into the speculation bit, and it has helpful images designed to be seen by our puny-sliver-of-spectrum-seeing eyes.

3. Aliens Could Look And Maybe Even Communicate Like Us, Dawg

Regarding convergent evolution, maybe nature isn't as creative as we thought and survival problems "only have a few good solutions."


Again, not borne out by evidence right here on our own planet. Every single living thing right now has been subject to evolution just as long as humans (and crabs) have, and the results include such varied survival strategies as nonskeletal molluscs (octopuses), opposable thumbs (primates), claws (crabs), bills (ducks), mushiness (jellyfish), and the wildly different body plans of ants and trees.

Photosynthesis necessitated loads of tweaks to many cell types, so plants produce oxygen and not, perhaps, farts. So, such intelligence as ours may occur on only 1 in 100 trillion habitable worlds. But while there may not be any civilizations in our galaxy, it's quite possible that the Milky Way still harbors tens of billions of planets covered in prokaryotic purple slime.

This aligns with what I've been saying all along. But remember, we have a sample size of exactly one when it comes to "examples of worlds with life on them." It could be that technology (again, not "intelligence") arises on 1 in 10 habitable worlds. It could be 1 in a googolplex. Personally, I suspect it's closer to the latter than the former. We don't know.
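
Just to put numbers on the quoted estimate: take the article's 1-in-100-trillion odds at face value, and assume (my round number, not theirs) something like 40 billion habitable worlds in the Milky Way. The expected count of technological civilizations per galaxy comes out well under one:

# Back-of-envelope check. The odds are the article's; the habitable-world
# count is my own round-number assumption.
habitable_worlds = 40e9   # assumed habitable worlds in the Milky Way
p_tech = 1 / 100e12       # the article's "1 in 100 trillion"

expected = habitable_worlds * p_tech
print(f"{expected:.4f} civilizations per galaxy")  # 0.0004 -- probably zero

Which squares with "no civilizations in our galaxy" while leaving room for all that purple slime, if slime turns out to be more common.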

2. They might be robots, or robotic brains the size of your city

The problem with organic "wet" brains is that they're limited by size and processing power. Similar challenges are faced by the organic “wet” under-parts that get so many of us in trouble today. But inorganic brains theoretically have no limits of perception or conception, and robotification may be the ultimate destiny of all lifeforms that don't nuke themselves into glowing dust.


This is certainly not a new idea. Our own history of space exploration is "send the robot first." It's entirely likely that if another tech civilization exists, we'll meet their robot probes first. Or they'll meet ours; whatever. The logical extension of that would be consciousness transfer to robotic forms, which isn't remotely possible with our current technology (not to mention we don't really know what consciousness is), but hey, we're speculating here.

1. We May Be All Alone

Or maybe advanced aliens don't look like anything because they don't exist. We may be the only intelligent (ish) life in the universe. Based on statistical models, Oxford researchers say "average evolutionary transition times likely exceed the lifetime of Earth." And the universe is only a few Earth-ages old.


Oh, it's worse than that, though. Life as we know it depends on certain heavier elements. Not just the molybdenum mentioned in a recent blog entry, but something as seemingly basic as oxygen, or the iron that makes our blood work. And such elements just weren't present in the early universe; they have to be forged in stars, supernovae, and things like neutron star collisions. This takes time. It's not like life as we know it could have begun early on. "Sure, but what about life not as we know it?" Sure, we can just make stuff up.

Still, let's not put too much stock in statistical models, even if they do come from a reputable place like Oxford. Again, we have one data point.

My favorite theory? The one with the greatest potential for mindscrewiness: that aliens may, or may have initially, looked like us.

That's not a theory. That's more speculation. I also want to take this opportunity to reiterate what I've said in the past: there is no universal law of evolution that requires the emergence of a technology-using species. Plenty of species get along just fine without building computers or rockets.

They may even address us in English, which isn't as crazy as it sounds. If they can traverse space, why couldn't they learn our language by jamming to Spotify in Moon orbit?

More likely they'll be speaking Mandarin Chinese or Spanish; more people speak those. As for what they'd listen to, I'll just note that the radio waves with the greatest chance of punching through Earth's atmosphere are in the FM band.

So I really, really hope they're listening to NPR and not some morning show shock jock.

When they do show up, just stay out of range of their pincers and you should be fine.
November 13, 2022 at 12:01am
#1040597
Once again, taking a look at a past entry. This one's from way, way back in October of 2008: "Never ends".

It's hard for me to remember last week, let alone that far back. I used Wikipedia to remind me of stuff that happened around the time of the entry. At that point, George W. Bush was still President (look, I'm not getting political here, just stating facts), and we were still a month away from electing Obama. The Great Recession, as it came to be called, had just begun; Bush had, a few days earlier, bailed out some of the banks involved. I was still working, still married. None of that mattered to me on October 7-8 because, apparently, on those days, I was in severe pain.

This was, obviously, from a very different time in my life, so I'll try to take it step by step.

Up until a week and a half ago, I had that nasty, unrelenting back pain and sciatica, the only relief for which was lying down and taking lots of meds.

Oh, yeah. I do remember how bad my back pain could get back in the noughties. At some point, I got steroid shots for it, and it got better. You know, those great big needles that they insert into your actual spinal nerves? Yeah. They suck. But not as hard as back pain.

I was okay for a week, then. I mean, I still had twinges in my back and leg, but nothing major.

Memory of pain is weird. It all blends together for me. Back and neck pain was just part of my existence back then. But this particular episode stands out as being utterly incapacitating.

Then, as I reported here, I got sick on Sunday, cutting short our anniversary celebration. This continued through Monday.

Hm. Somehow I had it in my head that our anniversary was closer to the end of October. In my defense, once she dumped me, I could release that date from long-term memory storage, so I did. I couldn't find anything in the archives about an anniversary celebration, only about everyone in the house, including the cats, being sick.

Monday night, I slept for a few hours, then was wide awake for a few hours, until maybe 15 minutes before my alarm went off. When I woke up, my neck and shoulders were stiff. No big deal, except my stomach was still upset, so I went to work. I left work early afternoon, figuring what I needed was to lie down with some heat on my neck.

No, I couldn't call in sick. Hard to do that when you own the company and don't have employees (not at that time anyway). The effects of the Great Recession on the business hadn't taken hold yet. Can't recall what projects we were working on that month, only that we were still able to make money.

So I heated up a neck thing and went to lie down, and pain exploded between my shoulder blades such as made the worst pain I experienced with my lower back (not to mention appendicitis) seem like a pleasant day in the Caribbean.

Like I said, memory of pain is weird. If I were to rank my pain as I remember it now, that day would only be about #4, behind the appendicitis and my heart attack (which happened later) and that time I got stung by a whole nest of yellow jackets (which happened in the eighties).

I couldn't move. Oh, I could move my legs and, to some extent, my arms, but I couldn't sit up or roll over. I couldn't even play dead because I kept looking for a position that minimized the pain.

Ha! "play dead." I crack me up.

My mobile phone was not nearby, so I couldn't reach it to call anyone. Every time I tried, I felt like someone was pushing a knife into my upper spine.

I do remember this particular episode of pain. Until I found this blog post at random, though, I couldn't even have guessed at the year, only that it had to be sometime in the noughties because my wife was involved.

I think I dozed off for a while. My phone rang. I had no way to get to it. I could only hope that my wife would come home before she went out to dance practice.

These days, I'd be utterly boned. Someone would find my emaciated, cat-chewed corpse.

Fortunately, she did. Unfortunately, she had no way of moving me. Fortunately, one of our close friends is a chiropractor. Unfortunately, the chiropractor was still at work. Fortunately, we were able to leave a message. Unfortunately, the ditzbrain who took the message didn't give it to her. Fortunately, I called her mobile phone an hour later to see if she got the message. Unfortunately, she hadn't. Fortunately, she was still able to come over and fix it so I could at least stand up - albeit with intense pain.

Remember a week ago I said some of these entries made me cringe? This is one of them. It's a bit embarrassing to me now. The idea of going to a quack to crack my back wouldn't fly with me these days. Sometimes you have to learn these things the hard way, I guess. It's entirely possible that chiropractic was the actual cause of much of my back pain in that era, though obviously there was some short-term relief from it.

Once I stood up, holding my head straight and not twisting or raising an arm, I was okay. We got back to the chiropractic clinic and she worked on me some more on the table. Then she said I couldn't get on the computer, so I sat with ice on my upper back.

Like I said, short-term relief. I haven't been to a chiropractor in well over ten years, and I rarely have these bouts of pain anymore. The one time I remember since then was neck pain coinciding with my month-long trip to Maui in... 2017, maybe? Some February in the teens. Really cramped my style; it's hard to snorkel when you can't move your head around to see where you're going. Bouncing around on the roads wasn't pleasant, either. At least there was copious alcohol.

I can only imagine how antsy I was without being able to compute. I don't think I've gone a day since 1979 (with the possible exception of a couple of vacation trips) without using some computer, somewhere, for work or school, or the internet or gaming. Not even that day; I would have used one at work the day of the incident, and obviously I was using one to make the blog post about it the following day. The thought of going without a computer for so much as a day fills me with the dread of possible boredom.

And look, I'm not trying to come down hard against alternative medicine. And I'm certainly not dissing my friend (I still call her my friend even though we've barely seen each other in the last decade or more. People drift apart; it happens.) It's just that these days, I need more scientific evidence before trying a course of treatment. Chiropractic has been shown to work for several things, but it's also been reported that there's a risk involved with adjustments, especially neck adjustments. To me, right now, the risk isn't worth the benefit.

I might change my mind if I find my neck in that much pain again, though. People in general will go to great lengths to make pain go away, especially hedonists like me. I'd be like "Fine, even having a stroke would be better than enduring this much agony."

But it was around that time that I came up with this:

I used to say "My back is killing me!" Now I say, "My quack is billing me!"
November 12, 2022 at 12:01am
#1040558
Hope you're not hungry.



"Delicious" is, of course, a matter of opinion. The headline would have probably been too long if Cracked had qualified that, though.

As usual with such a long list, I'm only going to share a few of these.

15. Lobster

Lobster was so plentiful in the areas colonized by early Americans that they stopped eating it as soon as they could. Only prisoners, the poor, and livestock -- which were pretty much considered the same things -- would deign to eat it, and it was even used as fish bait.


You know, I've been hearing this for so long and with such certainty of delivery that I started to question it. So I checked a source that's marginally more verifiable than Cracked, and discovered that while this probably is the case, in Europe lobster was often associated with wealth before a bunch of Europeans came over here.

So this is more of a case of changing popularity over time, which happens with many foods.

Also, keep in mind that sometimes the wealthy like to eat expensive things just because they can, and because the poor can't. This has little to do with the actual taste of the food. See also: caviar. That shit's disgusting.

14. Chicken Wings

Though now a staple of [sportsball game whose name is copyrighted] parties and other manly gatherings, chicken wings were considered the grossest part of the chicken, to be either unceremoniously thrown out or used for soup stock at the most.


Ah, one of my favorite things to rag on. "Let's take the chicken part that used to be made into dog food and turn it into sports food." Look, they're still the grossest part of the chicken (except maybe the beak and intestines) and are only popular because they're marketed to be. And because of the hot sauce, of course.

12. Foie Gras

There is so much wrong with this entry that I'm not even going to paste it here. Also, everything about foie gras is foul. Pun intended.

9. Peanuts

Peanuts came to America from Africa, and like most delicious African foods, they were immediately dismissed by colonizers as unfit for humans until three things happened. First, the Civil War reduced people to choking down whatever protein they could get their hands on, and peanuts were definitely preferable to rats.


I'm no fan of peanuts, but yes, if I had a choice between peanuts and rats, I'd eat the peanuts.

Then P.T. Barnum began selling peanuts as circus food.

Having been marketed by the Platonic ideal of "huckster" ("sucker born every minute," etc.) is not something I'd consider a recommendation for a product.

Finally, peanut butter happened. Even the most frothing bigot can’t resist a spoonful of peanut butter.

Admittedly, I'm okay with peanut butter. I still don't know why I like peanut butter (but only the real kind, not the candy kind like Skippy) and not peanuts, but I never claimed to be entirely consistent.

7. Mushrooms

The Western world shunned mushrooms on account of their tendency to make you see God and/or kill you until the French insisted they were the height of cuisine in the 18th century


I do like (commercially available) mushrooms. I know several people who can't stand them, mostly due to the texture (they say). I can understand that. When you really think about it, eating mushrooms is weird. It's a fungus, neither animal nor plant (but, oddly, closer to the former than to the latter), and thrives in shit. There aren't many fungi that we eat. Yeast, sure, but we were ingesting that (as part of bread or fine fermented beverages) long before we knew what yeast actually was.

On the other hand, when you really think about a lot of things that we eat, you start to question them. Eggs, for example. Or:

3. Oysters

The story of oysters is a very straightforward one of supply and demand. They were once so plentiful that Charles Dickens characters looked down on the patrons of oyster houses that lined the London streets one “to every half-dozen houses.” Then we filled the oceans with so much pollution that it was hard to get a good oyster, prices went up, and the rich just equated “expensive” with “good.”


Like I said, sometimes rich people do rich people things just because the poor can't.

Still, you have to wonder how people figured out oysters in the first place. "Let's crack open this rock and see if there's a tasty treat inside."

1. Burgers

There are people who, not unreasonably, would take a juicy burger over the finest steak any day, but back in the Upton Sinclair days, ground beef was seen as unclean at best and possibly containing dude at worst.


It's actually more complicated than that, and ground beef certainly predates fast food. I'm pretty sure, though, that the popularity of hamburgers didn't take off until there was enough supply to make them cheaply, and that required access to electricity to power the grinders.

When I was a kid, it confused the hell out of me that hamburgers didn't contain ham. Just another linguistic weirdness of English, in this case with the word deriving from the city of Hamburg, which may or may not have had anything to do with the invention of the hamburger. Nowadays you can talk about beef burgers, veggie burgers, turkey burgers, even cheeseburgers (which aren't made out of cheese), but never ham burgers. And hardly anyone calls it a hamburger anymore, either; it's just a burger.

Now you're hungry, aren't you?
November 11, 2022 at 12:01am
#1040530
Just like UFOs are only UFOs until they're identified (then they're just FOs), it's only a cryptid until you know what it is.

Beware Montana’s Shunka Warak’in, the ‘Rocky Mountain Hyena’
Is one of these crafty cryptids on display in a small museum?


Not to be confused with the Rocky Mountain Oyster.

Something has been preying on domesticated animals across the plains of Montana for centuries.

Yeah, wolves.

It has been given many names over the years, below most of which burn angry red squiggly lines when typed into Microsoft Word: Shunka warak’in. Ringdocus. Guyasticutus.

All of which would be awesome band names. Still, "wolf" doesn't freak out my spell checker.

But it’s also been called the Beast and the Rocky Mountain hyena—in fact, any name but wolf, although the creature could easily be called a wolf.

Which is what I've been saying.

Perhaps that’s because wolves were extinct in the state for about half of the 20th century.

Yeah, sure they were.

Here in Virginia, and down into North Carolina, in the Blue Ridge Mountains, people occasionally claim to see a mountain lion (which also has multiple names: puma, cougar, whatever). Officially, mountain lions are extinct in the eastern part of the US. Unofficially, everyone knows there are mountain lions up there. Lamest cryptid ever: unlike the Jersey Devil or the Mothman, we're pretty sure what a mountain lion is.

I'm not saying cougars aren't cool. Just that we lack imagination when it comes to cryptids here in the Blue Ridge.

If only we had a carcass, we could figure out what this creature is once and for all.

Oh, wait. Turns out, we do. It’s on display in a museum in Montana.


There are museums in Montana?

(I know at least two of my occasional readers are from Montana. Relax. I'm joking.)

The article (which is actually a book excerpt, but whatever) goes on to describe how someone actually killed one, and it ended up stuffed and mounted because that's what we do, apparently. Then:

The ringdocus outlasted Sherwood and was on display at least into the 1980s. And then it disappeared.

Probably stolen by Bigfoot.

Or maybe a wolf.

Meanwhile, Lance Foster, a historic site preservationist, paranormal enthusiast, and member of the Ioway tribe, speculated that the beast could be a shunka warak’in, a canid non-wolf beast from Native American lore that would sneak into camps at night and make off with dogs (the name translates to “carries off dogs”).

Okay, fine. Not a wolf. Maybe Bigfoot's dog.

Apparently, they tracked down the taxidermied whatever-it-was (turns out it wasn't stolen by Bigfoot, but just transferred to a museum in the one state even less likely than Montana to have a museum) and put the thing back on display.

Today, the creature is the museum’s most popular exhibit. They just call it the Beast.

And we're back to no imagination.

Or is it just a bad taxidermy mount? Only a DNA test could tell, and all interested parties have decided not to do that. The mystery of the shunka warak’in has gone on so long that nobody wants to risk solving it.

It may be the case that some mysteries are best left unsolved, but in this case, come ON. It's like when they tested hair that supposedly got rubbed off of a Bigfoot, and it turned out to be bear or cow or whatever. People still believe in Bigfoot after all that, because it's hard to prove a negative. (See my entry from last year, "Tales from the Cryptid".) Even though we have hard evidence that all the blurry pictures of that particular cryptid were definitely hoaxes.

It would be like refusing to test the DNA from the multiple taxidermied jackalopes in neighboring Wyoming: they just don't want people to think that jackalopes are completely made up.

