Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is a square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
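The iconic example is the Mandelbrot set: repeatedly apply z → z² + c and ask whether the point ever escapes to infinity. Here's a minimal Python sketch (the 100-iteration cap is an arbitrary illustration choice; the escape radius of 2 is the standard one):

```python
def escapes(c: complex, max_iter: int = 100) -> bool:
    """Return True if iterating z -> z**2 + c escapes (|z| grows past 2),
    meaning the point c lies outside the Mandelbrot set."""
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:  # once |z| > 2, divergence is guaranteed
            return True
    return False

print(escapes(0 + 0j))   # False: 0 stays at 0 forever, so it's in the set
print(escapes(1 + 0j))   # True: 1 -> 2 -> 5 -> 26 -> ... blows up fast
print(escapes(-1 + 0j))  # False: -1 -> 0 -> -1 -> 0, a bounded cycle
```

Color each point of the plane by how quickly it escapes, and the familiar infinitely detailed fractal boundary appears.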




Merit Badges and Awards

Quill Awards ( [Link To Item #quills] ), for  [Link To Item #1196512] :
- 2018: Honorable Mention, Best Blog
- 2019: Best Blog
- 2020: Best Blog
- 2021: Best Blog
- 2022: Best Blog; Best in Genre: Opinion
- 2023: Best in Genre: Opinion
The Best Blog awards were sponsored by the blogging consortium including  [Link To Item #30dbc] ,  [Link To Item #blogcity] ,  [Link To Item #bcof]  and  [Link To Item #1953629] .

30-Day Blogging Challenge ( [Link To Item #30dbc] ):
- First Place: January 2019, May 2019, September 2019, September 2020, January 2021, May 2021, November 2021
- Second Place: January 2020, May 2020, July 2020, November 2020

Other merit badges:
- Highly Recommended: "I highly recommend your blog."
- Opinion: for diving into the prompts for Journalistic Intentions
- High Five: for inventive entries in  [Link To Item #2213121]
- Enlightening: for Third Place in  [Link To Item #2213121]
- Quarks Bar: for a Klingon Bloodwine recipe from  [Link to Book Entry #1016079]  that deserves to be on the topmost shelf at Quark's



December 24, 2024 at 2:28am
#1081447
Posting early today because, like many people, I have stuff to do later. In my case, though, the stuff is completely unrelated to tomorrow's holiday.

I've written about Betelgeuse before, most recently here: "Betelgeuse 2". This is, however, a different article, more recent, from Big Think.

    This is what we’ll see when Betelgeuse goes supernova
The closest known star that will soon undergo a core-collapse supernova is Betelgeuse, just 640 light-years away. Here’s what we’ll observe.


And already I have Quibbles.

1. "what we'll see." It's extremely unlikely that anyone alive as I write this will see it happen. I'm a gambling man, and I wouldn't bet on it, not unless some bookie was offering billion-to-one odds and I could bet, like, a dollar. The headline uses the same value of "we" as people do when they talk about when "we" will colonize distant star systems (hopefully not Betelgeuse).

2. "will soon undergo." As with "we," they're using a variant value of "soon." Best estimate I've seen is within 100,000 years. That's soon in cosmic terms. It's not soon in human terms. Hell, 100,000 years ago, we'd (entirely different definition of "we" this time) barely started using fire.

3. "640 light-years away." Yeah... maybe. For whatever technical reason (it's been explained to me, but it's over my head), B's distance has been tricky to pin down. Wiki claims 400-600 light years, and that's a huge margin which doesn't even include 640.

I should reiterate here that even if it's at the low end of that scale, astronomers expect no ill effects for any life remaining on Earth when it happens. Of course, astronomers have been known to be wrong, from time to time.

But, okay. Issues with the headline don't always transfer to the actual text. It's just that it's the first thing we see, so getting it right is kinda a big deal. I'm not saying that it's clickbait, but it is a bit sensationalized.

The stars in the night sky, as we typically perceive them, are normally static and unchanging to our eyes. Sure, there are variable stars that brighten and fainten, but most of those do so periodically and regularly, with only a few exceptions. One of the most prominent exceptions is Betelgeuse, the red supergiant that makes up one of the “shoulders” of the constellation Orion.

Hence the title of today's entry.

Over the past five years, not only has it been fluctuating in brightness, but its dimming in late 2019 and early 2020, followed by a strange brightening in 2023, indicates variation in a fashion never before witnessed by living humans.

It is necessary for a human to be living in order to witness anything (metaphysics and religion aside), but I think they mean it's weirder than it's been for the past 100 years or so.

There’s no scientific reason to believe that Betelgeuse is in any more danger of going supernova today than at any random day over the next ~100,000 years or so, but many of us — including a great many professional and amateur astronomers — are hoping to witness the first naked-eye supernova in our galaxy since 1604.

As unlikely as it might be, I've said before that it would be very, very cool if I got to see it. I'm just not betting on it.

Located approximately 640 light-years away, it’s more than 2,000 °C cooler than our Sun, but also much larger, at approximately 900 times our Sun’s radius and occupying some 700,000,000 times our Sun’s volume. If you were to replace our Sun with Betelgeuse, it would engulf Mercury, Venus, Earth, Mars, the asteroid belt, and even Jupiter!

See, those numbers don't hit very well with people, including me. Even comparing the size to our solar system doesn't give us a visceral idea of just how fucking huge that star is (not to mention I'd question the Jupiter orbit thing, because red giants like that just don't have a well-defined surface in the way that we think of the Sun as having one).

This image might help with the size comparison.
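For the curious, the article's figures can be sanity-checked in a few lines. This sketch uses standard reference values not given in the article (solar radius ≈ 695,700 km; Jupiter's orbital semi-major axis ≈ 7.785 × 10^8 km):

```python
R_SUN_KM = 6.957e5          # solar radius, standard reference value
JUPITER_ORBIT_KM = 7.785e8  # Jupiter's semi-major axis, standard value

radius_ratio = 900                # the article's "900 times our Sun's radius"
volume_ratio = radius_ratio ** 3  # volume scales as the cube of radius
print(f"{volume_ratio:.2e}")      # ~7.29e8, matching "some 700,000,000 times"

betelgeuse_radius_km = radius_ratio * R_SUN_KM
print(betelgeuse_radius_km / JUPITER_ORBIT_KM)  # ~0.80
```

So the volume figure checks out, but at 900 solar radii the surface would fall about 20% short of Jupiter's orbit; the "even Jupiter" claim seems to need the higher end of the (very uncertain) size estimates, which fits my quibble about that fuzzy, ill-defined surface.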

Even when it transitions to the more advanced stages of life within its core, from carbon-burning to then neon and oxygen and eventually silicon fusion, we won’t have any directly observable signatures of those events.

Dude, people are easily confused. I get that stars, like people or cats, have a birth, time of existence, and death. As far as we know, though, stars themselves don't harbor life. Yes, the universe is weird, and it's fun to speculate that maybe they do, but "life" in this case is a metaphor for how stars change over time. Calling it life just begs people to misunderstand, deliberately or not, what's meant.

The article goes on to describe what whoever's on Earth when it happens can expect to experience when the event finally occurs. Not going to quote more, but it's pretty interesting, in my opinion. Pay no mind to the "it really happened 640 [or whatever] years ago" thing, though; it's irrelevant except as a way to acknowledge that information has a maximum speed.

Naturally, being science, everything there is based on our best knowledge at this point in time. I'd also expect surprises. But those surprises will only serve to advance the science.

Unless, of course, they're wrong about the "it won't irradiate and sterilize the Earth" thing.

Sleep tight!
December 23, 2024 at 7:31am
#1081423
In my partial listing of intelligent species yesterday, I left out an important one. But my random number generator reminded me today, by pointing to this article from Atlas Obscura:

    Can You Outsmart a Raccoon?
Recent studies show just how tricky these trash pandas can be, from opening locks to nabbing DoorDash orders.


Well, that last bit won't happen to me because I don't use DoorTrash. But I have caught those masked marauders in the actual trash. I also once caught one that had opened the door to the house, snuck in, and scarfed down the cat food.

While many other species around the world are in decline, raccoons are actually thriving, and do particularly well in urban areas, says lead author Lauren Stanton, a cognitive ecologist at the University of California, Berkeley.

Okay, that tracks, but... "cognitive ecologist?"

Raccoons are strong—they can push a cinder block off a trash can—and tenacious. The more we do to keep them out, the more skills they learn for breaking in, leading to a cognitive arms race between people and raccoons.

You know how people keep saying that if cats ever got opposable thumbs, we'd be in big trouble? Well, raccoons don't have opposable thumbs, either, but their little paws grip stuff just fine without them.

In 2016, for example, the city of Toronto spent 31 million CAD (that’s about $23 million) on raccoon-resistant waste bins. While they deterred most would-be robbers, certain tricksters had no problem solving the new puzzle. The city has continued to release new versions of the bin, trying to outsmart Toronto’s most persistent trash invaders.

All due respect to our Canadian friends, that right there cracked me up.

The word raccoon can be traced back to the Proto-Algonquian word ärähkun, deriving from the phrase “he scratches with his hands.”

The name was more directly from a specific Algonquian language, spoken by the Powhatans here in what would become Virginia.

While it’s hard to compare intelligence across species, says Stanton, she says that some recent studies show the neural density of raccoons is “more similar to primates than other carnivore species.”

Also, from what I've been hearing, neither brain size nor neural density is strongly correlated with those traits we call intelligence. Still, there's no mistaking that at least some raccoons exhibit advanced problem-solving skills.

However, we have learned that raccoons, once thought to be solitary, are in cahoots with each other far more than we knew.

Great. Now we have to deal with raccoon gangs.

Lots more at the article, which, most importantly, features multiple pictures of impossibly cute raccoons.
December 22, 2024 at 6:24am
#1081398
As usual for Sunday entries these days, I selected an older entry at random to take a second look at. This time, I didn't land all that far in the past, just over a year (one year being my minimum for these exercises), and found this: "The Trouble with Quibbles".

Being relatively recent, the linked article from Vox is still up, and I didn't see any indication that it's been revised since then.

I will therefore address, primarily, my own thoughts at the time.

Quibble 1: "Intelligent." I've railed on this before, but, to summarize: What they really mean is "technology-using."

I have, in fact, bitched about this sort of thing on numerous occasions, and for reasons I go over in that entry. But, even apart from the tiresome jokes about humans not being intelligent, we know of other intelligent life on Earth: octopodes, dolphins, cats, crows, etc. It took entirely too long, from the perspective of our time as a species, to recognize these intelligences. Our communication with them is limited in scope; those species are related to us, so you'd think there would be enough common ground to establish a dialogue, but no. How much worse might it be to communicate with someone from an entirely different genetic lineage?

Of course, there's always the most basic form of communication: violence. I know it's fashionable to think that any culture advanced enough to get to the stars will have put all that behind them, but I'm skeptical. We certainly haven't. Humans fear the Other, and there are probably sound evolutionary reasons for that, but nothing would be more Other than space aliens. To them, we're the space aliens.

We're looking for signs that some extraterrestrial species has developed communication or other technology whose effects we can detect. This technology would indicate what we call intelligence, but not all intelligence develops technology. One might even successfully argue that it's kinda dumb to invent technology.

Quibble 2: UAP may be more or less silly than UFO, but I believe it to be a better fit.

UAP may have less baggage than UFO, but we have a history of taking new, neutral terms and giving them the same encrustations of connotation that we give the old terms. Like how "retarded" started out as a value-neutral term to describe humans of lower intelligence (see above), to replace words like idiot, cretin, and moron, which had turned into insults. Then "retarded" turned into an insult, and some say we shouldn't be using the term at all. Well, that's special.

Point is, I give UAP (unidentified anomalous (originally aerial) phenomena) about five more years before they have to come up with something new because UAP studies have taken a turn for the edge of the world.

And I don't doubt that there's something to study. Sure, there are hoaxes; there are always hoaxes, like with Bigfoot, but they're probably not all hoaxes. I just don't jump straight to the conclusion that if there's a sighting of something that can't be immediately identified, it must therefore be aliens. That's just retarded.

Quibble 5: What's the first thing we did when we started exploring space? Sent robots, not people. No reason to assume hypothetical aliens wouldn't do the same.

This can probably be nitpicked because some of our early ventures into space were crewed: first person in space, first human on the moon, etc. Still, Sputnik (not really a robot but not living, either) preceded Gagarin, and lunar probes preceded Tranquillity Base, and since then pretty much everything outside of Low Earth Orbit has been a robot.

Well, that's all I'm going to expand on today. My thoughts haven't changed much in the 14 months since that entry, and we have found no extraterrestrial life, intelligent or otherwise, during that time, so the search continues.
December 21, 2024 at 11:36am
#1081375
A neutron walks into a bar. "What'll it be?" asks the bartender. "How much for a beer?" "For you, no charge!"



While this Quartz article is from the faraway year of 2017, I found enough to snark on to make it worth sharing.

You’ve likely been asked how you see the proverbial glass: half full or half empty? Your answer allegedly reflects your attitude about life—if you see it half full, you’re optimistic, and if you see it half empty, you’re pessimistic.

I'm an engineer. All I see is an overdesigned glass. Or, depending on my mood, an inefficient use of available storage space.

I'm also a pessimist, but at least I'm a pragmatic one.

Implied in this axiom is the superiority of optimism.

Also, I don't know if even the most dedicated pessimist, unless they knew of and were deliberately following this particular cliché, would seriously consider "half-empty" to be a thing. Almost everything is related to full. Like, if your fuel gauge is in the middle, you say "we have half a tank of gas," not "the tank's half-empty."

Thus, the good answer is to see the glass half full. Otherwise, you risk revealing a bad attitude.

Shut the fuck up about my attitude.

Actually, the glass isn’t half full or half empty. It’s both, or neither.

Come on now. No. It's not in a state of quantum indeterminacy. Or, at least, no more than any other object in view.

Things aren’t mutually exclusive, awesome or awful. Mostly they’re both, and if we poke around our thoughts and feelings, we can see multiple angles.

On that bit, though, I'm fully on board. I really hate it when people put things into two boxes: "awesome" and "suck." The moment Netflix went to shit was the moment it switched from star ratings to thumbs up or down. Of course, I'm fully aware that by hating it, I'm putting the idea of the awesome/suck binary into the "suck" box. Everyone is a hypocrite, including me.

Neutrality sets us free. It helps us see something more like the truth, what’s happening, instead of experiencing circumstances in relation to expectations and desires.

Ehhhh... nah. Pessimism, and only pessimism, sets us free. An optimist is doubly disappointed when their imaginings fail to materialize: from the positive outcome having not worked out, as well as the ego hit from being wrong. A neutral person risks never experiencing the joy of anticipation. It is only the pessimist who, if their prediction falls flat, still takes a win: either something good happens, which is good; or they turn out to be right, which is a pleasant feeling.

The article goes on to relate the quality of neutrality to Buddhism, I suppose in an effort to give neutrality some ancient gravitas, but instead, it only makes Buddhism seem even less appealing to me.

But hey, it's not about me. On the subject of whether this applies to anyone else or not, well, I'm neutral.
December 20, 2024 at 10:25am
#1081341
Solstice tomorrow (around 4:20 am here), so this is my last entry of astronomical fall. Today's article, from BBC, has nothing to do with seasons, though, and it's a subject I really shouldn't be weighing in on—but of course, I'm going to do it anyway.



What inspired me to jump in above my head here was the lede:

Far from triumphantly breezing out of Africa, modern humans went extinct many times before going on to populate the world, new studies have revealed.

Now, there's a poorly phrased sentence if I've ever seen one. It should be blindingly obvious to everyone who can read this that modern humans didn't go extinct. This is a fact on par with "Earth is roughly spherical" and "Space is mostly vacuum." Actually, wait, no, I'm even more sure that modern humans didn't go extinct than I am about those other things, because, last I checked, there were about 8 billion modern humans running around. Or sitting around. Whatever. You do you. Point is, we're not extinct yet.

Likely, the author meant "sub-populations of modern humans went extinct many times," which, okay, I guess they have science to back them up on that, and I'm not going to argue about it. But I feel like the way it's phrased would be like if they said "humans went extinct in Pompeii in 79 C.E."

The new DNA research has also shed new light on the role our Neanderthal cousins played in our success.

This is, I think, the interesting bit here. But I'd like to emphasize the "cousins" metaphor there. Sapiens and neandertals (the spelling of the latter appears to have legitimate variants) shared a common ancestor. A certain ape population separated at some point, genetic drift and selection occurred differently in each population, until you got groups with clear physiological differences in the fossil record. But, apparently, the physiological differences weren't enough to prevent interbreeding.

The definition of a species is, to my understanding, a bit of a slippery concept in biology. That is, it's not always obvious what constitutes a separate species. If it were as easy as "a population that can breed to produce fertile offspring," we wouldn't consider sapiens and neandertals separate species because, according to DNA evidence, they produced fertile offspring together.

While these early European humans were long seen as a species which we successfully dominated after leaving Africa, new studies show that only humans who interbred with Neanderthals went on to thrive, while other bloodlines died out.

Again, I feel like this is poorly phrased, and puts too much emphasis on Europe. Apparently, there are populations in sub-Saharan Africa today with no neandertal genes and, again, obviously they didn't die out. And they're the same species as the rest of us mostly-hairless bipeds.

Apart from these nitpicks, I think the new findings are fascinating, delving into how, basically, hybridization led to greater hardiness. As with all science, it may be overturned or refined through later studies, but this article itself describes an overturning of previous hypotheses about early human ancestry. And it has helpful infographics and pictures.

But unless we invent time travel (unlikely), all we can do is make hypotheses and test them. It's really a very human thing to do.
December 19, 2024 at 8:22am
#1081306
Show of hands, please: how many of you are planning on eating over the next week or so?

Huh, that's a lot.

An eating-related article from PopSci:

    Is the five-second rule true? Don’t push your luck.
The scientific research on floor food has a clear answer.


I never heard about this "five-second rule" until I was fully an adult. Now, remember, I spent my childhood out in the country, on a farm, and we had our own vegetable garden. The door to the house opened into the kitchen. Clean floors were a "sometimes" thing. But I honestly can't remember what my mom (it was almost always my mom) did if something dropped onto said floor. Probably wouldn't have mattered because I used to pick veggies straight from the garden and eat them. Yes, even carrots. Especially carrots. I wouldn't eat vegetables that she'd cook, but I ate the hell out of raw, dirty-root carrots.

Nurses are always surprised and distrustful when they see "no known allergies" on my chart, but I credit my finely-tuned immune system (quite unscientifically) to eating dirt as a kid.

Anyway, I never believed the five-second rule, and now there's some science to back me up on this.

According to this popular belief, if you drop a piece of food on the floor and pick it up in less than five seconds, then it’s safe to eat. The presumption is that bacteria on the floor don’t have enough time to hitch a ride on the food.

Right, because bacteria are just little animals that take more than five seconds to realize there's a tasty treat above and jump onto it. Snort. No, any bacteria (or other unwanted contamination) hitches a ride on the floor dirt that the dropped food picks up immediately. And I don't care how clean you think your floor is; if it's just been cleaned, there's cleaning agent, which is also not very good for you; and if it hasn't, there's dirt.

In 2003, Jillian Clarke, a senior at the Chicago High School for Agricultural Sciences in Illinois, put the five-second rule to the test.

I will reiterate here that, as a high-schooler, she was younger than I was when I first heard about the five-second rule. Also, we never got to do cool science projects like that in my high school.

Clarke and her coworkers saw that bacteria transferred to food very quickly, even in just five seconds, thus challenging the popular belief.

While this supports my own non-scientific conclusion, one study, performed by a high-school team, is hardly definitive.

A few years later, food scientist Paul Dawson and his students at Clemson University in South Carolina also tested the five-second rule and published their results in the Journal of Applied Microbiology.

Additional studies and replication, now... that starts to move the needle to "definitive."

When they dropped bologna sausage onto a piece of tile contaminated with Salmonella typhimurium, over 99% of the bacteria transferred from the tile to the bologna after just five seconds. The five-second rule was just baloney, Dawson concluded.

One might think that the main reason I saved this article to share was because of the bologna/baloney pun.

One would be correct.

But in 2014, microbiology professor Anthony Hilton and his students at Aston University in the United Kingdom reignited the debate... According to their results (which were shared in a press release but not published in a peer-reviewed journal), the longer a piece of food was in contact with the floor, the more likely it was to contain bacteria. This could be interpreted as evidence in favor of the five-second rule, Hilton noted, but was not conclusive.

Well, maybe UK bacteria are less aggressive.

This prompted food science professor Donald Schaffner and his master’s thesis student, Robyn C. Miranda, at Rutgers University in New Jersey to conduct a rigorous study on the validity of the five-second rule, which they published in the journal Applied and Environmental Microbiology... By analyzing bacterial transfer at <1, 5, 30, and 300 seconds, they found that longer contact times resulted in more transfer but some transfer took place “instantaneously,” after less than 1 second, thus debunking the five-second rule once and for all.

Now that "definitive" needle has moved substantially. But shame on the source for applying "once and for all" to science.

“Based on our studies, the kitchen floor is one of the germiest spots in the house,” Charles P. Gerba, a microbiologist and professor of virology at the University of Arizona, tells Popular Science. Believe it or not, “the kitchen is actually germier than the restroom in the home,” he added.

I get really tired of the "more germs than a toilet seat" scaremongering.

The next time you’re tempted to eat that cookie you just dropped, remember: bacteria move fast.

Or they're hitching a ride on the larger particles that stick to the toast that you inevitably dropped butter-side-down.

Anyway, I'm not sharing this to shame anyone for eating stuff off the floor. You do you, as they say. Just don't make me eat it. My dirt-eating days are long behind me.
December 18, 2024 at 9:58am
#1081275
Getting back to science today, here's one from Quanta for all the opponents of nihilism out there.

    What Is Entropy? A Measure of Just How Little We Really Know.
Exactly 200 years ago, a French engineer introduced an idea that would quantify the universe’s inexorable slide into decay. But entropy, as it’s currently understood, is less a fact about the world than a reflection of our growing ignorance. Embracing that truth is leading to a rethink of everything from rational decision-making to the limits of machines.


It makes all kinds of sense that it took a French person to figure this out.

Life is an anthology of destruction. Everything you build eventually breaks. Everyone you love will die. Any sense of order or stability inevitably crumbles. The entire universe follows a dismal trek toward a dull state of ultimate turmoil.

That sounds more like a French (or possibly Russian) philosophy book than science, but I assure you, it's science (just without the math). As I've said before, philosophy guides science, while science informs philosophy.

To keep track of this cosmic decay, physicists employ a concept called entropy.

Keeping track of decay may sound like a paradox, and, in a way, it is.

Entropy is a measure of disorderliness, and the declaration that entropy is always on the rise — known as the second law of thermodynamics — is among nature’s most inescapable commandments.

That's slightly simplified. The Second Law of Thermodynamics states that in a closed system, entropy can never decrease. It can remain constant, just never decrease. And it specifies "closed system," which the Earth most definitely is not; we have a massive energy source close by (in cosmic terms), at least for now.

Order is fragile. It takes months of careful planning and artistry to craft a vase but an instant to demolish it with a soccer ball.

I've also noted before that creation and destruction are actually the same thing. What we call it depends on our perspective at the time. Did you create a sheet of paper, or did you destroy a tree? Well, both, really, but maybe you needed the paper more than you needed the tree, so you lean toward the "creation" angle.

We spend our lives struggling to make sense of a chaotic and unpredictable world, where any attempt to establish control seems only to backfire.

Who's this "we?"

We are, despite our best intentions, agents of entropy.

At the risk of repeating myself once more, it could well be that the purpose of life, if such a thing exists at all, is to accelerate entropy.

But despite its fundamental importance, entropy is perhaps the most divisive concept in physics. “Entropy has always been a problem,” Lloyd told me. The confusion stems in part from the way the term gets tossed and twisted between disciplines — it has similar but distinct meanings in everything from physics to information theory to ecology. But it’s also because truly wrapping one’s head around entropy requires taking some deeply uncomfortable philosophical leaps.

Uncomfortable for some, maybe.

As physicists have worked to unite seemingly disparate fields over the past century, they have cast entropy in a new light — turning the microscope back on the seer and shifting the notion of disorder to one of ignorance.

What he's basically saying here, if I understand correctly (always in question), is that they're trying to fit entropy into information theory. Remember a few days ago when I said information theory is a big deal in physics? It was here: "Life Is".

The notion of entropy grew out of an attempt at perfecting machinery during the industrial revolution. A 28-year-old French military engineer named Sadi Carnot set out to calculate the ultimate efficiency of the steam-powered engine.

It's important, I think, to remember that the steam engine was the cutting-edge of technology at the time.

Reading through Carnot’s book a few decades later, in 1865, the German physicist Rudolf Clausius coined a term for the proportion of energy that’s locked up in futility. He called it “entropy,” after the Greek word for transformation.

I find that satisfying, as well, given my philosophical inclination concerning creation and destruction. If they're the same thing, then "transformation" is a better word.

Physicists of the era erroneously believed that heat was a fluid (called “caloric”).

Yes, science is sometimes wrong, and later corrects itself. That should not, however, be taken as justification for assuming that the Second Law will somehow also be overturned (though, you know, if you want to do that in a science fiction story, just make it a good story).

This shift in perspective allowed the Austrian physicist Ludwig Boltzmann to reframe and sharpen the idea of entropy using probabilities.

So far, they've talked about a French person, a German, and an Austrian. This doesn't mean thermodynamics is inherently Eurocentric.

The second law becomes an intuitive probabilistic statement: There are more ways for something to look messy than clean, so, as the parts of a system randomly shuffle through different possible configurations, they tend to take on arrangements that appear messier and messier.

The article uses a checkerboard as an example, but as a gambler, I prefer thinking of a deck of cards. The cards come in from the factory all nice and clean and ordered by rank and suit. The chance of that same order being recreated after shuffling is infinitesimal.
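Just how infinitesimal? It's easy to put a number on. A minimal Python sketch (the deck size is the only input):

```python
import math

# Number of distinct orderings of a standard 52-card deck
orderings = math.factorial(52)

# Chance that a fair shuffle lands back on the factory order
p = 1 / orderings

print(f"52! = {orderings:.3e}")        # ~8.066e+67
print(f"P(factory order) = {p:.3e}")   # ~1.240e-68
```

That's one chance in roughly 10^68. There are vastly more "messy" arrangements than ordered ones, which is the whole probabilistic point.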

Entropy experienced a rebirth during World War II.

Now, there's a great double entendre. I wonder if it was intentional.

Claude Shannon, an American mathematician, was working to encrypt communication channels... Shannon sought to measure the amount of information contained in a message. He did so in a roundabout way, by treating knowledge as a reduction in uncertainty.

Sometimes, it really does take a shift in perspective to move things along.
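Shannon's "reduction in uncertainty" has a precise form: the entropy of a source is H = −Σ p·log₂(p), measured in bits. A minimal sketch (the coin probabilities here are made up for illustration):

```python
import math

def shannon_entropy(probs):
    """Entropy in bits: H = -sum(p * log2(p)), the expected surprise of a source."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per flip
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin carries less information per flip
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

The more predictable the source, the less each message tells you, and the lower its entropy.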

In two landmark papers in 1957, the American physicist E.T. Jaynes cemented this connection by viewing thermodynamics through the lens of information theory.

Okay, so the connection between entropy and information isn't exactly new.

However, this unified understanding of entropy raises a troubling concern: Whose ignorance are we talking about?

And that's where I stop today. There is, of course, a lot more at the link. Just remember that by increasing your own knowledge, you're accelerating the entropy of the universe by an infinitesimal amount. You're going to do that whether you read the article or not, so you might as well read the article. As it notes, "Knowledge begets power, but acquiring and remembering that knowledge consumes power."
December 17, 2024 at 7:01am
#1081249
Today, from the Land of Party Poopers (actually, from Thrillist):

    No, That Isn't Duct Tape on Your Plane's Wings
An aircraft mechanic explains what the tape you sometimes see on plane wings really is.


Why Party Poopers? Well, because they're taking one of my few precious joys out of life.

See, I don't fly all that often. Once a year, maybe. (Okay, twice, if you count the round trip as two trips.) So I don't get to do this often, but when I see that tape on a plane, I usually wait until the plane starts to taxi away from the gate to loudly exclaim, "Hey, look, the wings are being held together by duct tape!"

I also find the fake outlets at waiting areas incredibly hilarious, though I've never done that prank, myself.

Those are little moments of happiness for me, but this article sucks the joy out of the first one. Well, at least, it would, if people actually read Thrillist. Maybe my faux-freakout over the tape will still have its desired effect.

Anyway, after all that, I'm sure you're dying to know what it really is on the wings.

As a passenger, noticing that your plane's wings are seemingly held together by the same silver duct tape that your dad uses to fix anything around the house is, by all means, a frightening sight.

Or, you know, it would be, if duct tape weren't so damn useful.

"That's not actually duct tape," says an aircraft mechanic in a TikTok video addressing the issue. "That's speed tape, [...] and speed tape is an aluminum-base tape that's designed specifically for aviation due to the large speeds and the large temperature differentials that aircraft are subjected to."

I actually knew that. But knowing that it's called "speed tape" doesn't help for shit. Like, from the sound of it, it should make the airplane go faster, but if that were the case, the whole plane would be covered in it, right? If it has something to do with the "large speeds" (eyetwitch) as well as temperature differentials, why call it speed tape and not cold tape?

Instead, sometimes, it's used as a temporary sealant to prevent moisture from entering specific components.

Uh huh. Okay. Doesn't tell me why it's called speed tape.

"Speed tape, also known as aluminum tape, is a material used for temporary, minor repairs to nonstructural aircraft components," an FAA spokesperson told Thrillist.

And it's called that because...?

Yes, I know I could ask some random AI the question and get some kind of answer, but that's not the point. The point is, why does the article purporting to explain all about speed tape not even bother to explain why it's called speed tape?

You can relax now and enjoy your flight stress-free.

HA! Like there aren't 498 other things about flying that cause stress.

Oh, right: 499 if I'm around.
December 16, 2024 at 7:51am
#1081213
Way the hell back in 2018, Quartz published the article / stealth book ad I'm linking today.



Does it? Does it really remain at the center of dining controversy? Because I thought that in 2018, and even now, the "center of dining controversy" is how to handle mobile phones at the table.

On June 25, 1633, when governor John Winthrop, a founding father of the Massachusetts Bay Colony, took out a fork, then known as a “split spoon,” at the dinner table, the utensil was dubbed “evil” by the clergy.

While this article is US-centric, and makes no attempt to be otherwise, other sources show that the fork has been considered a tool of the devil since it was introduced to Europe. This is, naturally, just another in a long list of humans considering anything new and different to be necessarily evil, because we're kinda stupid like that.

Forks were pretty much unheard of during Winthrop’s era. People would use their hands or wooden spoons to eat. The Museum of Fine Arts (MFA) in Boston says that only “a handful of well-to-do colonists” adopted the use of the fork.

I mean, technically, you're using your hands either way.

When Americans finally started their love affair with the fork, their dining etiquette compared to their international peers became a source of controversy for centuries, whether it’s the way the fork is held, only eating with the fork, or using the “cut-and-switch.“

Oh, no, different countries do things differently. The horror.

During the time it took for Americans to widely start using the fork, dining cutlery was evolving in England. Knives changed to have rounded blade ends, since forks had “assumed the function of the pointed blade,” says Deetz.

I'm betting there were other reasons for the switch, like, maybe, deweaponization?

So if you've ever wondered why some cultures point fork tines up while others point them down, well, the article explains that. Sort of. Unsatisfactorily. Still not mentioned: why formal place settings are the way they are.

Also not mentioned in the article (perhaps one of the books it promotes says something about it, but it's unlikely I'll ever find out) is the abomination known as the spork.
December 15, 2024 at 8:49am
#1081177
It's time-travel time again. Today's random numbers brought me all the way back to July of 2008, with a short and ranty entry: "Those Naughty Brits".

Apparently, there was a link to a (London) Times article, in the chick section, about "kinky sex." It should be surprising to no one that the link is dead and now just redirects to the Times main page, which I didn't bother looking at.

"Why do many of us like kinky sex?" apparently opened the original article, based on what I said in that entry. These days, I have preconceived ideas about headline questions: First, if it's a yes/no question, the answer is probably "no." Second, if it's a "why" question, the answer is probably "money."

I think I'm wrong about the second idea, but only this time.

2008 Me: Why is this in the "women" section? Men don't want to read about kinky sex? Please.

I'm guessing men are less likely to consider it kinky, outrageous, or naughty. But what the fuck do I know (pun intended)?

2008 Me: In conclusion, the article seems to be designed to be provocative, but semantically null.

I guess that was me, waking up to the practices of major information outlets.

2008 Me: What happened to investigative journalism? Hell, what happened to comprehensive news stories?

Gods, 2008 Me was so young and naïve.

2008 Me: ...an excuse to link the blog of a friend of mine...

Said blog no longer exists, and I have no recollection of who the friend was now.

2008 Me: Journalism may not be dead yet, but it's starting to wander and stink.

Dead now. Mostly.

2008 Me: I blame bloggers.

Clearly, that was an attempt at irony. The reality was, and is, way more complicated than one single reason, as these things usually are. I'm not getting into it here, and I'm probably wrong, anyway. But this look into the far-distant past has been enlightening, and maddening. Still, one constant that hasn't changed, and was an old constant even in 2008: sex sells. And, apparently, kinky sex sells more.
December 14, 2024 at 9:32am
#1081148
Appropriately enough, the first entry after the completion of my five-year daily blogging streak is from Cracked:

    How the Tomato Became Torn Between the Lands of Fruits and Vegetables
A confusing, red, plant-based chimera


Right, because the most important characteristic of a tomato is which category we pigeonhole it into. But, okay, I'll play along.

I don’t know what it is about the fruit-versus-vegetable designation of a tomato that I find so particularly annoying, but it twists in my brain like a knife.

That sounds serious. Maybe, instead, take a break and think about Pluto for a while.

As it sits today, the tomato is indeed, botanically a fruit. At the same time, it is legally a vegetable...

Yes, and my mom was, to me, my mother, but to my dad, she was his wife. So?

First, let’s stick to the science, which decidedly declares a tomato a fruit according to botanical guidelines.

Well, botanically, it's a berry. And, according to botany, strawberries, raspberries, and blackberries are not berries. Why this would matter to anyone trying to fix dinner or dessert, though, is beyond my comprehension.

Where the other side of the argument comes from is the culinary world, the place where most people are interacting with tomatoes on a daily basis. It’s also the dominant layman’s classification, probably due to the fact that it’s based in common fucking sense.

Here's where I usually rant about how common sense is usually wrong and needs to be superseded by science. But the classification of a tomato isn't like studying what its nutritional characteristics are. Categories and classifications are imposed from the outside and are supposed to help us make sense of the universe, like what the definition of "planet" or "mammal" should be. Then something like a platypus comes along to remind us that the universe fundamentally doesn't make sense and we shouldn't expect it to. Point is, we could just as well say "any topping on a Big Mac is officially a vegetable," which might settle the tomato question once and for all, but move the discussion to whether cheese should be called a vegetable or not.

And yet, no less of an authority than the Supreme Court has ruled differently. Unsurprisingly, it’s money-related, specifically to do with tariffs. In the late 1800s in America, the taxation on fruits and vegetables was starkly different. Fruits could be imported with impunity, while bringing in foreign veggies would demand a steep 10-percent tariff. An importer named John Nix saw opportunity in the science, and refused to pay tariffs on a shipment of tomatoes, since they were technically fruits. The case climbed all the way to the Supreme Court, where it was heard in 1893.

It also should come as no shock that, in some cases, a thing can be categorized as one thing in one context, and another thing in other contexts. Like, astronomers consider any element that's not hydrogen or helium to be a "metal." That works for astronomy. It doesn't work for structural engineering.

As I read it, the Supreme Court agrees with the people, issuing the legal equivalent of “sure, technically, but come on, dude.”

Leaving aside for the moment that botanists and biologists are also (usually) people, all that means is that, in the US, tomatoes are vegetables by legal definition. I vaguely remember some nonsense a while back about whether ketchup, which doesn't have to be made from tomatoes but usually is, should also be considered a vegetable for the purpose of school lunch nutrition or something.

Left unsettled, then, is still the question of whether a hot dog (with or without ketchup) is a sandwich, and I maintain that no, it's a taco.
December 13, 2024 at 6:26am
#1081102
1827.

No, I'm not referring to the year. 1827 is what you get when you multiply 365 by 5, and then add 2.

Yes, today is not only Friday the 13th, but it's also the day I claim a five-year daily blogging streak, having shat an entry out every single day between December 14, 2019, and today: one thousand eight hundred twenty-seven entries. (There were two leap days in there, hence the "add 2.")
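If you don't trust my arithmetic, Python's datetime module will do it for you (the dates are taken straight from this entry):

```python
from datetime import date

start = date(2019, 12, 14)   # first day of the streak
end = date(2024, 12, 13)     # today, per this entry

# Inclusive count of days: the span between the dates, plus the starting day itself
streak = (end - start).days + 1
print(streak)  # 1827
```

The subtraction automatically picks up the two leap days (February 29 of 2020 and 2024), which is where the "add 2" comes from.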

Granted, they weren't all great entries. Some of them were probably not even very good. But I put thought and effort into each of them, and I really did do one every day (as defined by WDC time, midnight to midnight); we're not set up here to release entries at some scripted time, or to make up for lost days.

But that's the limit. There will be no six-year blogging streak, at least not in this item. With fewer than 100 entries left in its capacity, the end looms like a kaiju over Tokyo. I thought about ending it today, but nah. Or maybe on the solstice, because that would be appropriate. Or on December 31, because the very first entry was on a January 1. No, I think I'll make the attempt to continue until entry #3000, and then... hell, I don't know. Take a break? Start a new one? Retire from writing? I haven't decided yet, and, knowing me, won't decide until the very last possible minute.

Well, I promised something different today, and there it is: a great big brag. Tomorrow, I'll be back to my usual humble self. Hey... stop laughing.
December 12, 2024 at 9:23am
#1081071
Another day, another book ad. But an interesting book ad, this one from Big Think. I promise something different tomorrow.

     A bold challenge to the orthodox definition of life
In “Life As No One Knows It,” Sara Imari Walker explains why the key distinction between life and other kinds of “things” is how life uses information.


I'm not going to weigh in on whether she's right or wrong, or somewhere in between. That's above my pay grade (not that that's ever stopped me before). I do think it's an interesting approach that adds to the conversation of science, even if it's ultimately a categorization issue, like the planetary status of Pluto or the sea status of the Great Lakes.

Sara Imari Walker is not messing around. From the first lines of the physicist’s new book, Life As No One Knows It, she calls out some big-name public intellectuals for missing the boat on the ancient, fundamental question, “What Is Life?”

I'm not sure how ancient, or fundamental, that question really is. With regards to humans and other animals, our distant ancestors could pretty much figure out the difference between life and not-life. With plants, it may have been a bit trickier, as they tend to not move even when they're alive. But I think Jo Cavewoman would scoff at the question. Dog: life. Rock: not life. (Yes, I'm aware that belief in animism might counter what I just said, but I'm talking in generalities here.)

It probably took until we started looking through microscopes that we began to question the boundaries. Is a spermatozoon "life?" How about a virus?

Since then, it's my understanding that people have proposed several different definitions for life, all necessarily based on conditions on Earth, and scientists and philosophers have been arguing ever since, as scientists and philosophers love to do.

Subtitled The Physics of Life’s Emergence, one of the book’s major themes is a critique of the orthodox view in the physical sciences that life is an “epiphenomenon.”

"Epiphenomenon" is another word with a kind of slippery definition. I don't like to quote dictionaries as sources, because they're descriptive and not prescriptive, but the definition I found was "a secondary effect or byproduct that arises from but does not causally influence a process." Which, well, thanks? That doesn't help.

The Wikipedia article on it is similarly confusing, at least to me, with the added bonus of also coming from a source people don't like to cite.

What's worse, in my view, is when people conflate "epiphenomenon" with "illusion":

This is the argument, often heard in mainstream popular science, that life is a kind of illusion. It’s nothing special and fully explainable by way of atoms and their motions.

To address the latter assertion first: "nothing special" is a value judgement, and "fully explainable" is laughable hubris.

As for the "illusion" thing, well, I've banged on in here on several occasions against the "time is an illusion" declaration. But that can be generalized to anyone airily calling anything an "illusion." To me, an illusion is something that, upon further study, goes away: a stage magician's trick, or those seemingly moving lines in a popular optical illusion picture. But no matter how much we study, for instance, time, it doesn't go away. I think a better description would be "emergent phenomenon," meaning that it's not fundamental, but rather a bulk property. Like temperature. One atom doesn't have a temperature; it only has a vibration or speed or... whatever. Get a bunch of atoms together, though, and the group has an average speed, which we read as temperature.
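The temperature analogy can be made concrete. In kinetic theory, the temperature of an ideal monatomic gas is just rescaled average kinetic energy, T = 2⟨KE⟩/(3·k_B); no single atom has one. A toy sketch (the speed distribution here is invented for illustration, and the atom mass is argon's):

```python
import random

K_B = 1.380649e-23  # Boltzmann constant, J/K
M = 6.63e-26        # mass of one argon atom, kg (illustrative choice)

# One atom has only a speed; "temperature" is a property of the crowd.
random.seed(42)
speeds = [random.gauss(400.0, 120.0) for _ in range(100_000)]  # m/s, made up

mean_ke = sum(0.5 * M * v * v for v in speeds) / len(speeds)
temperature = 2 * mean_ke / (3 * K_B)  # kinetic-theory definition

print(f"{temperature:.0f} K")  # ~280 K for this made-up distribution
```

The point isn't the particular number; it's that temperature only appears once you average over the crowd, exactly like the "emergent, not illusory" distinction above.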

Or, to use everyone's favorite example, the chair you're sitting in. "It's an illusion," some philosophers claim (generally after taking a few bong hits). "It's not real." Well, look, any philosophy that doesn't start with "the chair is real" is a failure, in my view. Your ass isn't sinking through it; therefore, it's real. Sure, it's made of smaller pieces. On the macro scale, it's got a seat, probably a back (sometimes in one continuous piece), maybe arms, legs and/or casters, maybe a cushion for said ass. This doesn't make the chair any less real; it just means there's a deeper level to consider.

Similarly, the cushion, for example, is usually a fabric stretched over some stuffing. The fabric itself can be further broken down into individual fibers. The fibers, in turn, are made of molecules, some of which have a particular affinity for one another, giving the fiber some integrity. The molecules are made of atoms. The atoms contain electrons, protons, and neutrons. Those latter two, at least, can be further broken down until you're left with, basically, energy. And maybe there's something even more fundamental than that.

None of that makes the chair any less real. It just shows that our understanding can go deeper than surface reality. But surface reality is still reality.

And so it is with life. I know I'm alive, for now, and that's reality. I'm pretty sure my cats are, too, and the white deer I saw munching on leaves in my backyard yesterday. Not so sure about the leaves, it being December and all, but I am as certain as I can be of anything that they are a product of life.

Whew. Okay. Point is, I'd like to see these macro-level phenomena labeled something other than "illusion." It's misleading.

In the standard physics perspective on life, living systems are fully reducible to the atoms from which they are constructed.

Yeah, well, physics gonna physic. Just as with your chair, things can be studied at different scales. Biology is usually the science concerned with life. But biology is basically chemistry, and chemistry is basically physics. This doesn't make biology an illusion, either.

Still, they will argue, nothing fundamentally new is needed to explain life. If you had God’s computer you could, in principle, predict everything about life via those atoms and their laws.

I'm gonna deliberately misquote James T. Kirk here: "What does God need with a computer?"

Walker is not having any of this. For her, the key distinction between life and other kinds of “things” is the role of information.

Well, that's amusing. Not because it's not true—like I said, I'm not weighing in on that—but because from everything I've read, physics is moving toward the view that everything is, at base, information. Yes, that might be what energy can be broken down into. Or maybe not. I don't know. But "information theory" is a big deal in physics.

Whether there's something even more fundamental than information, I haven't heard.

Life needs information. It senses it, stores it, copies it, transmits it, and processes it. This insight is, for Walker, the way to understand those strange aspects of life like its ability to set its own goals and be a “self-creating and self-maintaining” agent.

Okay. Great. Let's see some science about it.

As usual, there's more at the link, if you're interested. Might want to sit down for it, though. You know. On that chair which is definitely real and hopefully not alive.
December 11, 2024 at 12:20pm
#1081035
A Slate article from half a year ago takes on an issue I've been wondering about for a long time.

    On Both Sides of My Brain
For years now, I’ve been puzzled—and annoyed—by the way people seem to insist on labeling what type of person one can be. I’ve finally solved my problem.


Ah, I recognize that personality type! The author must be a non-labeler!

June 25, 2024 5:40 AM

Yeah, I don't usually copy timestamps in here, but this one gave me pause. Is the writer an extreme night owl, or an extreme early bird? (Or was the article's publication time scripted? Different time zone? Who knows?)

Recently, after I did a silent retreat, I was trapped on a five-hour car journey (long story) with someone who was obsessed with labeling everything. People have “math brains” or “creative brains,” there are “boy chores” and “girl chores,” and in any relationship you will have “the person who reads the map” and “the one who is social.”

Well, there's your problem: you did a silent retreat, and then got stuck as a captive audience while someone spewed out all the words they couldn't during the retreat.

This labeling tic is all over the internet too; indeed, much of the content I see online seems premised upon the idea that everything can be better understood if we simply group it as a type.

Yes, maybe we should call that kind of person a Tag Hag.

The relief in the comments is palpable: Oh, I’m that label! Everything makes sense now.

That's kind of what's been bugging me about labeling, to be serious for a moment. I am what I am (to quote either God or Popeye), so how does putting a label on it help? Like, we all know I'm into science, pedantry, gaming, and science fiction; how does it help me or anyone else if I get put in the "nerd" box?

Usually when I find myself dealing with a “labeler” in real life, it’s because this idea of there being two types of brains has come up.

There are two kinds of people in the world: Those who think there are two types of brains, and those of us who know that's been thoroughly and completely debunked.

It’s not just attachment styles. All over those platforms, you see vlogs and infographics declaring that people can be understood best as bundles of fixed, unchanging symptoms, related to corresponding bundles of trauma, grouped neatly under buzzy labels.

This is, of course, nonsense. People can best be understood by their astrological natal chart.

Yes, I'm back to making jokes. But speaking of astrology:

Then there is the enormous popularity of astrology meme accounts. I find it hard to take exception with this iteration of labeling, though, because my star sign is Aquarius, so the @costarastrology account (with its 2 million followers) always presents me with flattering personality reads that position me as a cool, aloof, intellectual sort.

That's not your star sign. It's your sun sign. Your personality is also influenced by what sign the moon was in at your birth, and which one was intersected by the eastern horizon (which is what I said above). As I'm an Aquarius sun and moon (rising sign unknown), I know that astrology is complete horseshit (but sometimes fun horseshit).

On that five-hour car journey with the labeler, though, I could not simply go outside. They were always rushing to finish my sentences too, with an ending they expected might fit with the kind of thing I had been saying. There was a manic, frantic energy to every exchange. As if something terrible might happen if I were permitted to finish a sentence by myself.

Note to self: if I ever do a silent retreat (which I won't), arrange my own damn transportation. Alone.

And speaking about it, I should admit, to my psychoanalyst a few days later helped me clarify my thoughts further. (That’s right, my psychoanalyst. This essay was not eccentric and unhinged enough already.)

Right, because everyone who sees a shrink is eccentric and unhinged? Come on, lady, if you're going to rage against labels, at least stop enhancing the stigma surrounding mental health issues.

But in the wake of the silent retreat, everything seemed bathed in a rosy glow of calmness and goodwill. My thoughts were infused with peace and love and so forth. So, after my frustration had exhausted itself (and, mind you, that did still take a while), I had a sort of epiphany. After all, wasn’t there some of the labeler in me? Even by calling this person a labeler, I was assigning them a type.

I can almost forgive the dig on psychotherapy after seeing this level of self-awareness. Almost. Not quite.

There is, of course, more to the article, including another epiphany about some people needing to maintain control over social situations, or something. I don't know; I'm not sure if that revelation makes things better or worse. Just like with labels. So, I leave the article, my own curiosity unresolved, more confused than ever.

Maybe I should see a shrink.
December 10, 2024 at 12:23pm
#1081001
From aeon, a tale as old as time. Well, as old as civilization, anyway:

    The fermented crescent
Ancient Mesopotamians had a profound love of beer: a beverage they found celebratory, intoxicating and strangely erotic


So, they were human.

I should note that, like many free articles, this is a stealth ad for a book. But, for once, it's a book I would buy. (I'm not going to add to the advertising; the details are there at the link.)

Hamoukar, Syria. 20 May 2010. We are midway through what will be the last excavation season at the site for some time. The following spring will see the outbreak of a long and brutal civil war.

I don't talk about them much in here, but I do keep up with current events. Still, I find it hard to follow all the ins and outs of the disturbances in Syria. Nevertheless, yes, I heard about Assad, and I remember my inner cynic (which, frankly, is just Me) going, "Oh great, what fresh hell will Assad be replaced with?"

Today, though, the archaeologist Salam Al Kuntar, balanced on tiptoe at the bottom of a tomb, has just uncovered a little green stone. It is a cylinder seal, an ancient administrative device. We roll the tiny seal in clay – just as its former owner once would have – to reveal an impression of the intricate scene carved into its surface. It may not be the finest seal ever seen, but the tableau is eye-catching: a stick-figure man and woman are having sex, the man standing behind the woman, who bends over to drink from a jug on the ground. And is that a straw emerging from the mouth of the jug?

I, of course, instantly knew the implication: chick was drinking beer, maybe because her partner was ugly. I once saw a shirt that read: "BEER: Helping ugly people have sex since 1862!" And I snorted and said, "Yeah, right. More like 6000 BCE."

It may surprise you that our ancestors had sex, until you stop and think about how they became our ancestors.

Indeed, the drinkers of ancient Mesopotamia often drank via straw – though not always, shall we say, in this particular position.

While drinking beer through a straw today is as much a social faux pas as serving warm white wine, it was kind of necessary then because, apparently, the beer had floaty things in it and the straw kept them from getting swallowed. It would filter out the biggest and grossest solids.

Yes, the Sumerians (probably) invented beer. No, it wasn't the tall, frosty Pilsener of today. For one thing, no refrigeration. For another, no hops. But it was still fermented grain, hence: beer.

Banquets were a key part of the social calendar in Mesopotamia, and beer was an essential element. But people also drank beer at home, on the job, in the tavern, in the temple, pretty much everywhere.

Before we knew shit about microbes, beer was often a better choice than water because the process requires boiling water, which we know now destroys bad microbes. Hell, a big part of beer production today involves letting the proto-beer (called wort) cool enough for the yeast (good microbes) to be able to survive and work their magic. They wouldn't have known exactly why beer was good, only that it was.

Perhaps you have encountered the notion that beer was ‘invented’ in Mesopotamia. That is a hypothesis at best.

Yeah, well, it's still better supported than other hypotheses.

And, as the global search for earlier and earlier traces of alcoholic beverages gains steam, there is at least one key takeaway: beer was invented (or discovered) many times in many different places.

I'm okay with that clarification, and will note that yes, it is one of those things where you can have a legitimate philosophical argument over "invention" vs. "discovery."

The famous ‘land between the rivers’ was also the land of Ninkasi, goddess of beer. When Ninkasi poured out the finished beer, ready to drink – a Sumerian song tells us – it was like ‘the onrush of the Tigris and the Euphrates’.

Before you rush out and claim a name, there's already Ninkasi Brewing. It's located in Oregon, so I haven't tried many of their offerings, but I seriously doubt they used the ancient Sumerian recipe. However, as the article later attests, other brewers have attempted Sumerian beer. One of them was named Gilgamash, which utterly delights me (hence the entry title today, which, now you know, wasn't an original Waltz pun). But I'm getting ahead of myself, here.

The article goes on to discuss several aspects of ancient beer culture, including a paraphrased version of when Inanna got Enki so drunk that he gave her all his prized possessions. I'm sure I've covered that in here before.

Beer was brewed at home, in neighbourhood taverns, and in breweries managed by palace and temple authorities. In some cases, we know the names of the brewers – for example, homebrewers Tarām-Kūbi and Lamassī (both women), tavern-keepers Magurre and Ishunnatu (both women), and palace brewers Qišti-Marduk and Ḫuzālu (both men).

While your image of a brewer today probably involves a very large, very bearded man, historically, beer has been either a female project or ungendered.

The most detailed account of the brewing process appears in the ‘Hymn to Ninkasi’, goddess of beer. But this lyric portrait of Ninkasi at work in the brewery is hardly a set of instructions for brewing beer.

I will defer to this author's greater experience in the historical arena, but everything else I've read does call it a recipe of sorts. Not standardized like today's recipes, with their precise measurements and somewhat detailed instructions following about 50 pages of backstory, but more like a mnemonic, which the brewers were expected to fill in with passed-down knowledge and maybe even proto-science.

I'm not sure the distinction is overly important to us. Hell, we have problems re-creating other recipes from a century or more ago, precisely because a lot of the handed-down knowledge is lost. What's more important is that beer was important enough to write hymns to the gods about.

The author has a lot more to say about this, and I can, again, provisionally defer to his greater knowledge.

In conclusion, yes, I would read that book for sure. I might wait to buy it until after the holiday season, though, just in case someone wants to give me one as a present.

No, that's definitely not a hint.

Or is it?
December 9, 2024 at 11:19am
#1080964
This one's been hanging out in my queue for a long time, but it's not exactly time-sensitive. As they say, time and tide wait for no one.

    Lord Kelvin and His Analog Computer
This tide-predicting machine was one of many advances he made to maritime tech


The source is a publication of the IEEE, the electrical engineering professional organization. But fear not; the article isn't very technical.

Civilizations recognized a relationship between the tides and the moon early on, but it wasn’t until 1687 that Isaac Newton explained how the gravitational forces of the sun and the moon caused them. Nine decades later, the French astronomer and mathematician Pierre-Simon Laplace suggested that the tides could be represented as harmonic oscillations. And a century after that, [William] Thomson used that concept to design the first machine for predicting them.

Thomson was Lord Kelvin and, yes, he's the one the temperature scale is named after. One wonders what it would have been called had Thomson not become a noble, because Thomson is a boring name for a unit of measure.

Thomson’s tide-predicting machine calculated the tide pattern for a given location based on 10 cyclic constituents associated with the periodic motions of the Earth, sun, and moon. (There are actually hundreds of periodic motions associated with these objects, but modern tidal analysis uses only the 37 of them that have the most significant effects.)

Translation: it's complicated.

The most notable one is the lunar semidiurnal, observable in areas that have two high tides and two low tides each day, due to the effects of the moon.

Which is what most of us think of when we think of tides, but it's not as simple as "it's high tide when the moon is directly overhead." There's a lag, and there are local conditions that affect the timing of tides (such as sea floor depth).

On Thomson’s tide-predicting machine, each of 10 components was associated with a specific tidal constituent and had its own gearing to set the amplitude.

Basically, it's a very complicated clock. Sure, the article calls it an analog computer, and I'm not going to argue with professionals (especially ones not in my field) but I think that's a categorization issue. At some point of increasing complexity, a clock stops being a clock and starts being an analog computer. But in my view, if it involves the timing of natural phenomena like the movement of solar system bodies, it's a clock.
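Clock or computer, the harmonic method the article describes is easy to sketch in software: each constituent is a cosine wave with its own amplitude, speed, and phase, and the predicted water level is their sum plus mean sea level. Here's a toy Python version. The constituent names (M2, S2, K1) and speeds are real, but the amplitudes and phases below are invented for illustration; a real prediction would use values fitted to tide-gauge data for a specific port.

```python
import math

# Each tidal constituent: (name, amplitude in metres, speed in degrees/hour, phase in degrees).
# Names and speeds are real constituents; amplitudes and phases here are made up.
constituents = [
    ("M2", 1.20, 28.984, 45.0),   # principal lunar semidiurnal
    ("S2", 0.35, 30.000, 80.0),   # principal solar semidiurnal
    ("K1", 0.25, 15.041, 120.0),  # lunisolar diurnal
]

def tide_height(t_hours, mean_level=2.0):
    """Predicted water level: mean sea level plus the sum of the cosine constituents."""
    return mean_level + sum(
        amp * math.cos(math.radians(speed * t_hours - phase))
        for _name, amp, speed, phase in constituents
    )

# Tabulate a day of hourly predictions, much as the machine's paper trace did.
levels = [tide_height(t) for t in range(24)]
```

Thomson's machine did this sum mechanically, with one pulley-and-gear assembly per constituent; the software version just swaps brass for cosines.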

The device marked each hour with a small horizontal mark, making a deeper notch each day at noon. Turning the wheel rapidly allowed the user to run a year’s worth of tide readings in about 4 hours.

But this bit, an output device, is probably what pushes it into the computer category. It was also designed for prediction, not for reading what the tide is right now.

As with many inventions, the tide predictor was simultaneously and independently developed elsewhere and continued to be improved by others, as did the science of tide prediction.

One thing I'm still unclear on when it comes to tidal prediction: the wind plays a role. And wind is way, way harder to predict than the future relative position of the sun and moon. I'd ask my dad, the sailor, but I seem to have misplaced my Ouija board.

In any case, mostly, I just liked the article and the history lesson, and I wanted to muse about the differences between clocks and analog computers.
December 8, 2024 at 9:27am
#1080931
It's time for another foray into the jungles of the past. This one comes from way back in 2018—Christmas Eve, to be exact: "Millennials Killed Millennials."

The entry featured an article from The Atlantic; this was before I started using xlink tags for articles, so it's a raw URL link. That source has changed its policy since then, and every time I open it now, I hit a paywall.

I'm not above paying to read or watch something. I have a few subscriptions. But even I can't subscribe to (or keep up with) everything and, more importantly, I try not to link paywalled articles here (as I said, that entry from 2018 was from before they installed a paywall).

Point is, unless you subscribe to The Atlantic or have found some way around the paywall that even I haven't been able to figure out, you're only going to get the first few paragraphs of the original story. Oh, sure, it talks about "subscribe" and "free trial," but I'm deeply suspicious of any "free trial."

You'd think they'd lift that restriction for older articles, but apparently not.

Not ragging on them, by the way. They should be able to make money if their content is useful. I used to link a bunch of their stuff, and I'm not judging anyone who subscribes. I'm just explaining why 1) I don't feature Atlantic articles anymore and 2) today's entry is more about my earlier entry than it is about the original article.

Having said that, let's look at what I was thinking six years ago this month.

The entire concept of demographic "generations" annoys the shit out of me. And it only gets worse as time goes on and I read more crap like this.

I've since moderated my feelings about that topic. As with most other things, it's not binary; I'm not obliged to either love it or hate it, with nothing in between. My stance on the practice is complicated, but I'll try to explain my current thoughts:

1. Yes, the "generation" labels are pretty arbitrary. So are Gregorian calendar months, but it's an established system that has its uses.

2. In this system, I'm early Generation X, which hardly anyone ever talks about, opting instead to rag on "boomers" or "millennials" or "Gen-Z." Or praise them, depending on the author's beliefs and age.

2a. Gen-X is supposed to be, among other things, the slacker generation. Am I a slacker because I was born under the Slacker sign, and therefore it's expected of me; or is it just my basic nature?

3. It's one thing to draw conclusions about a subgroup and market to that subgroup. It's quite another to point at a single individual from that subgroup and simply assume that they have all the traits associated with that subgroup. I don't really have a problem with the former, at least not anymore.

There's more, but I'm not writing a dissertation, here; what I'm really trying to say is that my own views have shifted over the years.

On top of which, you'd have to convince me that our 1983-born X-er has more in common with someone born in 1966 than with someone born in 1986.

That is something that no one has been able to convince me of, yet. Some people have tried to get around it by slicing generations more finely; you get, like, the "Oregon Trail generation," which of course refers to the classic "you have died of dysentery" computer game and not actual westward-ho pioneers. But all that does is support my point: slice finely enough, and you're back to taking each person individually, rendering the whole marketing concept useless.

Also, some things suck and other things get better. This is due not to a single "generation" or cadre of ages, but every single person doing his or her own thing.

I'm not sure exactly what I was thinking when I wrote that, but I recognize it's probably unclear. Much of what sets "generations" apart, in modern terms, has to do with technological advancements and societal changes, all of which require more than one person. Like, someone had to invent Crocs, sure, but also, someone had to be convinced that they're not ugly and to start wearing them. And then other people had to think that the original wearer was cool enough to start a fashion trend.

I'm pretty sure the whole "generations" thing is just another way to divide us, like politics or countries. It distracts us from the real issues, which we can either work to solve, or ignore, depending on one's individual preference.

Again, my hardline stance on that has evolved, though I still have a measure of distrust. These days, for instance, the popular usage of "boomer" and "millennial" doesn't comport with their marketing definitions; a "boomer" is simply someone older than you whose attitude you don't like; and a "millennial" is someone younger than you whose attitude you don't like. This is one reason Gen-X gets ignored (but don't worry; we're used to it).

But it is marketing. Not science.

So you know what I want to see Millennials finally kill?

Generations.


This harks back to the original article, which was apparently about the Millennial generation killing off cultural institutions beloved by Boomers. Never mind that these same Boomers killed off cultural institutions beloved by their parents' generation.

Anyway. Changed perspective or not, that entry's still there, even if accessing its linked article remains a pain in the ass. But, to borrow the rallying cry of my generation: "Whatever."
December 7, 2024 at 10:51am
#1080897
Speaking of time, here's a Guardian article about people who had more of it than usual.

    Never take health tips from world’s oldest people, say scientists
Scientists still trying to work out why some people live beyond 100, but agree it is best to avoid taking advice from centenarians themselves


No, we should definitely take health tips from people who die young, instead.

The death of the world’s oldest person, Maria Branyas Morera, at the age of 117 might cause many to ponder the secrets of an exceptionally long life, but scientists say it could be best to avoid taking advice on longevity from centenarians themselves.

Far as I can tell, the secrets to an exceptionally long life include such gems as "stay alive" and "don't die."

According to the Guinness World Records website, Branyas believed her longevity stemmed from “order, tranquility, good connection with family and friends, contact with nature, emotional stability, no worries, no regrets, lots of positivity and staying away from toxic people”.

Also, unicorns and fairies. I mean, those are probably more real than her litany.

However, Richard Faragher, a professor of biogerontology at the University of Brighton, said that in reality scientists were still trying to work out why some people lived beyond the age of 100.

Because they didn't die.

Or, in some cases, because they assumed the identity of their deceased parent so they could go on collecting the... whatever benefits.

Faragher said there were two main theories and they were not mutually exclusive.

The first, he said, was that some individuals were essentially just lucky.


At some point, though, lucky stops being "you didn't die" and starts being "you died."

The second theory, he said, was that centenarians had specific genetic features that equipped them to live a longer life

So, a different kind of luck. But still luck.

Faragher said both theories, however, resulted in the same warning: “Never, ever take health and lifestyle tips from a centenarian.”

Certainly, if I'm unlucky enough to live that long, I'd troll the hell out of anyone asking me about health tips. "See, now, the key is to kick a puppy every day. Doesn't have to be your puppy. Doesn't even have to be the same puppy. But it's gotta be a puppy, not a dog. Or a kitten or a kid. Puppy."

He added: “What you see with most centenarians most of the time – and these are generalisations – is that they don’t take much exercise. Quite often, their diets are rather unhealthy,” noting that some centenarians were also smokers.

What would amuse the hell out of me would be if no exercise, diets considered unhealthy, and smoking really were the keys to long life.

“The fact that [centenarians] do many of these unhealthy things and still just coast through [life] says they’re either lucky or typically very well endowed [genetically],” he said.

Again, both of these things are luck.

Faragher added that many of the mooted possibilities for why centenarians live longer could actually be examples of reverse causation. For example, the idea that having a positive mental outlook can help you live for a very long time might, at least in part, be rooted in people being more sanguine because they have better health.

Glad they acknowledged this. I mean, they could do studies to gain insight into it, but apparently it's more important to promote "positive mental outlooks," especially in a world falling apart all around us.

“From about 100 years ago, what we started seeing was huge advances in life expectancy driven by improvements in reducing the likelihood that children die,” said David Sinclair, the chief executive of the International Longevity Centre, noting that was largely down to the introduction of vaccinations and clean water.

Well, that lasted about a century.

“What we’ve had over the last 20 years, and we’re going to see over the next 20 years, is a similar focus in terms of old age,” Sinclair said, adding that included improvements in vaccines for flu and shingles, statins, and other medications that would help increase life expectancy among older people.

Sure, because, obviously, life expectancy is a more important metric than life quality.

But he said governments also needed to take action to help individuals to make healthier choices – choices that would ultimately help them live longer – adding that many people lived in environments where it was difficult to exercise, eat well or avoid pollution.

Ah yes. "Want to live longer? It's your fault if you don't, even though you're economically stranded in a polluted area with little access to fresh food, thanks to our policies. So instead of fixing economic disparity, despair and environmental degradation, we'll make it illegal for you to eat cheeseburgers. There, we fixed it!"

As Sinclair said, while news stories about centenarians tended to be upbeat, it often emerged that such individuals faced challenges, such as living alone for many years.

Challenges? Hell, that's probably the real reason they lived longer: not having to deal with other people's bullshit all the damn time.
December 6, 2024 at 9:37am
#1080874
The Big Think article that came up for me today involves math. Fair warning so you don't end up defenestrating your device.

    Time: Yes, it’s a dimension, but no, it’s not like space
The fabric of spacetime is four-dimensional, with three for space and only one for time. But wow, time sure is different from space!


Time is different from space? I never would have guessed, what with them having different names and all.

When did you first realize that the shortest distance connecting any two points in space is going to be a straight line?

I'm not sure that's a fair question. It's something I'd consider intuitive. What's hard to grasp, sometimes, are the cases where the shortest distance between two points isn't a straight line, because that runs counter to our everyday experience.

In fact, that realization, as far as human knowledge is concerned, comes from a place we might not realize: the Pythagorean theorem.

"In fact," I think they've got this backwards. The Pythagorean Theorem may quantify the "shortest distance" intuition, but I'm pretty sure humans knew about the straight-line thing before they had numbers or geometry. (Also, the idea predated Pythagoras by hundreds or thousands of years; the Greeks didn't invent everything.)

Taking all three of these dimensions into account — so long as we assume that space is still flat and universal — how would we then figure out what the distance is between any two points in space? Perhaps surprisingly, we’d use the exact same method as we used in two dimensions, except with one extra dimension thrown in.

I feel like the only "surprising" thing here is that the math is basically the same.
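And it really is the same math: the Pythagorean theorem just picks up one extra squared term per dimension. A minimal sketch (my own illustration, not from the article):

```python
import math

def distance(p, q):
    """Euclidean distance: the Pythagorean theorem, one squared term per dimension."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# The identical formula handles 2D and 3D; the extra dimension is just one more term.
d2 = distance((0, 0), (3, 4))         # 5.0
d3 = distance((0, 0, 0), (3, 4, 12))  # 13.0
```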

Thinking about distances like we just did provides a great description of what we’ll wind up with if we consider flat, uncurved space on its own. What will happen when we fold time, as a dimension, into the equation as well? You might think, “Well, if time is just a dimension, too, then the distance between any two points in spacetime will work the same way.”

At which point, unsurprisingly, the math isn't basically the same.

There are two fundamental ways that time, as a dimension, is different from your typical spatial dimension. The first way is a small but straightforward difference: you can’t put space (which is a measurement of distance, with units like feet or meters) and time (which is a measurement of, well, time, with units like seconds or years) on the same footing as each other right from the start.

Feet? Footing? Get it? Haha.

Fortunately, one of the great revelations of Einstein’s theory of relativity was that there is an important, fundamental connection between distance and time: the speed of light.

Yes, and the invariance of that speed is still very, very hard to wrap your head around, because it, unlike the "straight line" thing, runs counter to everyday experience.

However, there’s also a second way that time is fundamentally different from space, as a dimension, and this second difference requires an enormous leap to understand. In fact, it’s a difference that eluded many of the greatest minds of the late 19th and early 20th centuries. The key idea is that all observers, objects, quanta, and other entities that populate our Universe all actually move through the fabric of the Universe — through both space and time — simultaneously.

Turns out that everything in the universe is moving at a constant speed... through spacetime. Once that was pointed out to me, a whole lot of other stuff started to make more sense.

It turns out that the faster (and the greater the amount) you move through space, the slower (and the lesser the amount) you move through time.

Like that, for instance.

There’s an even deeper insight to be gleaned from these thoughts, which initially eluded even Einstein himself. If you treat time as a dimension, multiply it by the speed of light, and — here’s the big leap — treat it as though it were an imaginary mathematical quantity, rather than a real one, then we can indeed define a “spacetime interval” in much the same fashion that we defined a distance interval earlier...

Great. Wonderful. Now we'll get "time is imaginary" on top of "time is an illusion" nonsense. I'd forestall this by pointing out that imaginary numbers aren't actually imaginary (or at least they're no more abstract than the "real" numbers), but that's not going to stop the airy pseudophilosophy.
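For what it's worth, the "big leap" in that quote boils down to a sign flip. Multiply time by c so it carries units of distance, treat it as imaginary, and squaring it gives (ict)² = -c²t²: the time term enters the interval with a minus sign where the spatial terms enter with a plus. A toy sketch, mine rather than the article's (the (-,+,+,+) sign convention is one of two common choices):

```python
C = 299_792_458.0  # speed of light in m/s (exact, by definition of the metre)

def interval_sq(dt, dx, dy, dz):
    """Squared spacetime interval, (-,+,+,+) convention.

    Treating time as imaginary (i*c*t) flips the sign of its squared term:
    (i*c*dt)**2 = -(c*dt)**2, so time is subtracted where space is added.
    """
    return -(C * dt) ** 2 + dx**2 + dy**2 + dz**2

# A light ray covers c metres of space per second of time, so its interval
# is exactly zero: the hallmark of a "null" (lightlike) path.
light = interval_sq(1.0, C, 0.0, 0.0)  # 0.0
```

That minus sign is the whole difference between a distance and an interval, and it's why time, mathematically as well as experientially, isn't just a fourth copy of space.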

There is, of course, quite a bit more at the link. I'm not sure if it's really that useful; I feel like there's either too much or not enough math to keep people interested enough to follow the arguments. But if all we can get out of it is "spacetime is weird, and time is different from space," maybe that's good enough.
December 5, 2024 at 10:02am
#1080846
Yeah... I'm just going to leave this here.



Really, isn't that what most of us want?

Quantum-enhanced metrology techniques are emerging methods that enable the collection of precise measurements utilizing non-classical states.

Non-classical... so, rock or hip-hop?

To realize a significant metrological gain above classical metrology techniques using quantum-mechanical principles, Xu and his colleagues set out to devise an approach that would enable the generation of Fock states with up to 100 photons.

Okay, sure, if that's your thing.

No, I don't really understand the article, either. Nor did I look up what a Fock state is; I will eventually, but it might get in the way of my amusement right now. Point is, I only saved this one at the behest of my inner 12-year-old.

24 Entries · Page 1 of 2 · 20 per page

© Copyright 2024 Waltz Invictus (UN: cathartes02 at Writing.Com). All rights reserved.
Waltz Invictus has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.

... powered by: Writing.Com
Online Writing Portfolio * Creative Writing Online