About This Author
Come closer.
Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
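
Here's a minimal Python sketch to make that concrete. It isn't part of the original description; the function name, the sampling window, and the iteration limit are arbitrary choices for illustration. It shows arithmetic with numbers of the form a + bi, and how iterating one very simple transformation on the complex plane (the familiar Mandelbrot map, z → z² + c) sorts points into an intricate fractal boundary.

```python
# Minimal sketch: complex arithmetic and a tiny text-mode Mandelbrot test.
# Names and the sampling window are arbitrary; this is illustrative only.

def stays_bounded(c: complex, max_iter: int = 50) -> bool:
    """Return True if c appears to stay bounded under z -> z*z + c."""
    z = 0 + 0j
    for _ in range(max_iter):
        z = z * z + c          # the "very simple transformation"
        if abs(z) > 2:         # escaped; definitely not in the Mandelbrot set
            return False
    return True                # still bounded after max_iter steps

if __name__ == "__main__":
    i = complex(0, 1)
    print(i * i)               # (-1+0j): i squared really is -1

    w = 3 + 2j                 # a complex number: real part 3, imaginary part 2
    print(w.real, w.imag)      # 3.0 2.0

    # Sample a patch of the complex plane and sketch the fractal in text.
    for row in range(21):
        b = 1.1 - row * 0.11
        line = ""
        for col in range(61):
            a = -2.1 + col * 0.045
            line += "#" if stays_bounded(complex(a, b)) else "."
        print(line)
```

Points that never escape the circle of radius 2 belong to the set; marking which ones escape, and how quickly, is where the intricacy and the beauty show up.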




April 30, 2024 at 9:51am
#1070196
Sometimes I have to be reminded of stuff, because apparently I have a problem with object permanence once an item drops off my Favorites sidebar. In particular, "JAFBG" [XGC] is still around, under new management, and there have been prompts to tackle since the beginning of the year.

As usual, I picked one at random:

How useful do you think anger can be?


Well, I'm not a psych-talker person, nor are emotions something I know much about. I can barely identify them in myself, let alone in others.

That said, it seems to me that anger is, at base, a manifestation of the well-known "fight or flight" response to a threat. That makes it related to fear, in my book (which I won't write because, like I said, not an expert). The "flight" part probably doesn't involve anger, but the "fight" part does. I hear that, in those cases, anger can be fairly useful to focus one's hostility into action... so long as it doesn't escalate into blind rage, which leads to random flailing around and, eventually, defeat.

I used language related to physical conflict there, but this also applies to the mental realm, like when the government passes a law you don't agree with, and anger leads you to mobilize.

"I'm as mad as hell, and I'm not going to take this anymore!"

The quote there is from a '70s movie, Network. I don't recall ever seeing the film, but the line is iconic, with or without the context provided by the story. So of course, I looked it up on Wikipedia, because it's not like I have to avoid spoilers for a movie that came out before Star Wars.

Which, if you take a look at the link there, provides a nice segue into the other half of the equation. Above, I touched on how useful anger might be to the person feeling it. But if you want to really see how anger is useful, you need to consider how it's used as a tool for manipulation.

An angry person doesn't think rationally. This is almost by definition: "mad" is a synonym for "angry," but it's also a synonym for "fucking nuts." If you don't want people thinking rationally, if you want them to react with their limbic system instead of their neocortex, as it were... get 'em angry. Convince them, through emotional language, that their social position, their way of life, their very lives are in imminent danger, and you'll sell more torches and pitchforks.

I always thought it would be fun to put a Torch and Pitchfork store into an RPG campaign, to provide helpful tools for the occasional riot. The proprietor would be bored most of the time, so, occasionally, they'd drum up some business by manufacturing some sort of outrage. "The King wants to increase taxes so he can gold-plate his throne! Meanwhile, honest merchants are struggling! Are we going to let that stand?"

Might be a little too close to reality for a fantasy role-playing game, though.

So, to answer the prompt, anger can be very useful... in the people you're trying to influence. Way more useful there, as an agent of chaos, than it can be for an individual reacting to an actual threat.

So, if a headline spurs you to anger... take a step back. While suppressing anger is probably bad (or so I've heard), it may be possible to calm down enough to let some rational thought take over. If I'm right about the fight-or-flight thing, which I can't guarantee, anger is a kind of burst emotion; I imagine a whole cascade of neurotransmitters firing indiscriminately, urging us to action. And that burst can't really be sustained, any more than more pleasant emotions can.

Anger is a useful tool for the people in charge. It's way more useful for the rest of us to recognize when we're being manipulated, and sidestep that.

In my unprofessional opinion.
April 29, 2024 at 9:25am
#1070095
Snake oil gonna snake.



Though snake oil doesn't deserve all of its reputation for fakery. It had a use in traditional medicine (whether it was actually efficacious, I don't know). The problem was, hucksters started promoting it as a magical cure-all, so now "snake oil salesman" is a synonym for fraudster.

The business of supplements is booming, and with all the hype around them, it’s easy to forget what they actually are: substances that can powerfully affect the body and your health, yet aren’t regulated like drugs are. They’re regulated more like food.

Okay, but let's be clear, here: food can "powerfully affect the body and your health." The line between food and drug isn't always sharp. Hence you get a long string of "superfood" fads.

It’s important to consider why so many people believe supplements can help them lead a healthier life. While there are many reasons, how supplements are marketed is undeniably an important one. In my years following the industry, I’ve found that three mistaken assumptions appear over and over in supplement marketing.

There follows a discussion of those "three mistaken assumptions" (I'd have phrased that differently, using words like misleading, false, or lie, but I'm not getting paid to mince words to avoid lawsuits).

1. The appeal to nature fallacy

The appeal to nature fallacy occurs when you assume that because something is “natural” it must be good. The word natural is used a lot in the marketing of supplements.


Leaving aside for the moment the inherent ambiguity of "natural," I've long wondered how they can take, say, a plant, extract its juices, distill out the desired chemicals (everything is chemicals), stuff it into a pill, and still call it natural.

To be clear, “natural” does not equate to “better,” but that’s what the marketing wants you to think.

Remember, poison ivy is natural. So are those mushrooms that'll kill you horribly and painfully. And the pollen which, even though I'm not technically allergic, is playing havoc with my sinuses right now (and for which I'll happily take a manufactured antihistamine).

2. The belief that more of a good thing is always better

There’s another assumption that piggybacks on the appeal to nature fallacy: If something is natural, it must be good, and more of it must also always be better.


This seems to be pervasive in human psychology, for whatever reason. I have to guard against it, myself, even though I know it's not the case. Not just with food, but activities as well. You can overdose on anything, even water.

3. The action bias

Finally, the supplement industry likes to capitalize on the idea that doing something is better than doing nothing. This is the action bias. Taking action makes people feel like they have more control of a situation, which is especially powerful when it comes to health.


This one's pervasive, too. It doesn't help that, in some cases, it seems to be true. Getting five minutes of exercise a day is said to be better than getting no exercise at all, though again, more is better... up to a point.

But in this particular case, taking a supplement might not be better than not taking one. Okay, that's a convoluted sentence with a lot of negations. Given the risks, it might be best to avoid supplements. As with most things, I feel like a person should consult with their doctor as to whether a supplement is useful or necessary, and safe. Doctors don't know everything, either, but they're more likely to have the relevant information, based on your own health profile, than good old Doctor Google.

On the flip side, if very few people bought these supplements, they wouldn't continue to be manufactured. This might have adverse effects on the people who, for whatever reason, do have to take supplements. It's like how I get snarky about "gluten-free," but the fad helped make available more products that people with legitimate gluten allergies can eat, improving their quality of life.

So, if you're a supplement fanatic, maybe rest easy knowing that you're making life better for other people... if not necessarily yourself.
April 28, 2024 at 10:25am
#1069997
As usual for Sundays, I picked an older entry at random to take another look at. This one's old, indeed, as these things go, from January of 2008. It's just a short personal update, so feel free to ignore it. I did: "Exercise"

I don't even recognize the person who wrote that. I assume it was a past version of me, someone who lived in my house and drank my beer.

Not that I don't still have intermittent back problems, but I've given up on a lot of things, fixing my back being one of them, and swimming being another.

I realize that other people's dreams are about the most boring things to relate, but since we're talking about exercise, I remembered that I keep having dreams about riding a bicycle. Not a motorcycle, but, like, a mountain bike or a 10-speed—something that I absolutely can't do now. It is, therefore, the only form of exercise that I actually want to do, and no, a stationary bike won't cut it.

They say you never forget how to ride a bike. I'm betting those dreams are just my brain making sure that's true.
April 27, 2024 at 10:14am
#1069877
Ever notice that it's possible for the same stimulus to evoke pleasure or pain, depending on context?

    Pleasure or Pain? He Maps the Neural Circuits That Decide.
The work of the neuroscientist Ishmail Abdus-Saboor has opened up a world of insights into precisely how much pleasure and pain animals experience during different forms of touch.


My only hesitation here is that doing research on pain often means deliberately causing pain, and to me, that's ethically questionable at best. Especially when the subjects can't consent. But it was only a hesitation; I'm doing this entry anyway. As an omnivore, there's a limit to how much hypocrisy I'll tolerate in myself when it comes to this sort of thing.

Ishmail Abdus-Saboor has been fascinated by the variety of the natural world since he was a boy growing up in Philadelphia.

Probably because there's so much of it in Philadelphia.

Yes, I know I've pointed out before that, as we are part of the natural world, so is everything we build. But I never let the facts get in the way of a good joke. Or a bad one. Especially a bad one.

Today, he is an associate professor of biological sciences at the Mortimer B. Zuckerman Mind Brain Behavior Institute at Columbia University, where he studies how the brain determines whether a touch to the skin is painful or pleasurable.

From Philadelphia to Manhattan... hm. Anything to stay out of New Jersey, I suppose.

To find those clues, Abdus-Saboor probes the nervous system at every juncture along the skin-to-brain axis. He does not focus on skin alone or home in on only the brain as many others do. “We merge these two worlds,” he said.

This kind of ties in with other things I've been saying. People tend to draw a boundary between body and brain, probably because of old ideas about mind/body duality, but the brain is part of the body.

Abdus-Saboor has also pioneered a new quantitative measure of pain in mice, a tool he and his team adapted to gather evidence for the transgenerational inheritance of opioid addiction.

The addiction thing aside, I'm intrigued about the "quantitative measure of pain" thing. Pain in humans is often subjective; hence the classic 1-10 scale doctors ask you to rate your own pain on. Pain in other animals is generally inferred by their reactions, but it gets tricky; for a long time, people thought nonhuman animals didn't actually experience pain as such. The latter part of the article discusses this further.

Quanta spoke with Abdus-Saboor about his penchant for starting over in science, his zebra fish eureka moment and his hopes for a newly imported naked mole rat colony.

Thus, the rest of the article is in interview format. I won't quote more, but it's there at the link. I find it interesting on several levels, not least of which is the actual science involved, but also because, to me, it illustrates why it's important to have people from different backgrounds, with different worldviews, working on scientific research.
April 26, 2024 at 3:55pm
#1069815
A friend of mine sent me this link a while back, and it's just interesting enough to comment on.



I have a board game called Lunar Rails. The idea is that you need to create rail links between lunar mines and settlements to facilitate transport of resources and passengers. Occasionally, something bad will happen like a solar flare, moonquake, or meteor strike, and you have to fix your routes or work around the disaster. I haven't played it in a while, because a board game requires friends who can all commit to the same time to play in person, and we have enough problems getting people together to play RPGs over the internet.

My point, besides bitching about how hard it still is to get people together even though we're no longer quarantined, is that the idea is hardly new. What is new is, apparently, that people are starting to examine the feasibility of turning it from science fiction into reality.

The first U.S. transcontinental railroad, completed with a spike hammered into the track in 1869, transformed the nation. Perhaps the same will happen on the moon.

The difference being that the US had major settlements on both coasts before the railroad was built. Last I checked, the number of permanent human settlements on the moon was still 0.

Also, that last railroad spike was, famously, made of gold. While there is gold on the moon, it's unclear to me how much there is, and how accessible. Not a lot of point wasting fuel lofting a golden metaphor up there, so they'd need a local source. But any gold found on the moon would probably be more useful for more practical reasons.

The Defense Advanced Research Projects Agency, or DARPA — an ambitious federal innovations division — has begun collaborating with over a dozen companies on potential future lunar technologies, including a moon railroad. It's called the 10-Year Lunar Architecture Capability Study, or LunA-10, and its mission is to find technologies that will catalyze a self-perpetuating lunar economy.

Yeah, so the headline is (big surprise here) a bit misleading. They're looking at the whole picture, which you have to do because it's not like you can go up there and wing it.

Now as NASA, global space agencies, and companies return to the moon with robotic spacecraft, the future Nayak sees is one that must be able to progress. It might be mining helium-3 (an extremely rare resource on Earth that could be used in medical imaging, computing, and even energy), harvesting water ice to create rocket fuel for deep space missions, and beyond.

Amusingly, helium-3 (a really incredibly rare, stable isotope of helium with just one neutron instead of the usual two) plays a key role in the video game Starfield. Also amusingly, you can mine it from the moon in the game.

DARPA recently chose the aerospace and defense giant Northrop Grumman to create the concept for the railroad. "The envisioned lunar railroad network could transport humans, supplies, and resources for commercial ventures across the lunar surface — contributing to a space economy for the United States and international partners," the company wrote.

Why I find the railroad thing interesting, above and beyond all the other shit you'd need to establish colonies and/or industry on the moon, is not just because I have the Lunar Rails game, or even because I studied transportation engineering. It's that we sometimes think of railroads as outdated technology, so we might not think about them in a space setting. Since then, we've made trucks and planes and huge cargo ships... none of which would be possible on the moon. Well, maybe trucks, but there's a question of how to keep them powered. You might think "rockets," but then you run into using fuel again.

The engineering challenges are interesting, though. I mentioned power above, but with rail, you can power it from various solar arrays at different locations (it's not like the moon has fossil fuels or wind power) and send it along the track like they do in the Northeast Corridor. There are the extremes of temperature to deal with; how do you design rails that will withstand both extreme heat and extreme cold? Not to mention what kind of design they might need to prevent derailments in the much lower gravity.

Well. At least they don't have to worry about wind resistance or rain.

Now, some people might consider the whole thing a waste of money and Earthly resources. I'm not going to justify it in this entry. Maybe another time.
April 25, 2024 at 2:18am
#1069630
This article is a couple of years old, and I'm sure some additional science has been done since then, but since it presses one of my hot buttons, well, here it is.

    Does Quantum Mechanics Reveal That Life Is But a Dream?
A radical quantum hypothesis casts doubt on objective reality


Every time I see a headline in the form of a yes/no question, I take the default position that the answer is "no." It's not impossible for the actual article (or other evidence) to convince me otherwise, but that's where I start.

Further, note the word "hypothesis." The translation of this word for non-scientists is something like "guess." There may be some good reasons to make the guess, but, as I understand things, it hasn't been subjected to rigorous testing, and therefore hasn't really been supported by evidence.

My girlfriend, “Emily,” often tells me her dreams, and I, less often, tell her mine, which are usually too murky and disjointed to share.

I guess that dynamic works for them. Rare is the occasion when someone's recounting of their dream is actually interesting to the listener(s).

Interpreting dreams is an imperfect, highly subjective art, as Sigmund Freud, in his rare moments of humility, would surely have granted.

That's a polite way of saying that dream interpretation is bullshit.

And yet making sense of dreams, it occurs to me lately, is not wholly dissimilar from making sense of “reality,” whatever that is. Yes, we all live in the same world. We can compare notes on what is happening, and draw inferences, in a way impossible with dreams.

If one assumes that solipsism is false, at any rate.

And yet your experience of the world is unique to you. So is your interpretation of it, which depends on your prior beliefs, yearnings and aversions, and on what matters to you.

This is hardly news. I'm pretty sure there were primitive humans arguing over what to hunt for dinner.

Science offers our best hope for achieving consensus about what happens.

On that point, at least, I can't disagree. But "best hope" doesn't imply certainty.

Scientists accumulate bits of evidence and try to assemble these fragments into a coherent story. After much haggling and second-guessing, scientists converge on a plausible narrative.

Remember a few days ago when I wrote about how stories are important and we shouldn't dismiss them as "just a story?" Yeah.

But subjectivity is hard to expunge even in physics, the foundation on which science rests. Quantum mechanics, a mathematical model of matter at very small scales, is science’s most rigorously tested theory. Countless experiments have confirmed it, as do computer chips, lasers and other technologies that exploit quantum effects.

I agree that physics is the foundational science. Psychology, for example, derives in part from biology, which is really just complicated chemistry, and, in turn, chemistry is, at its core, physics.

Also, I have no reason to doubt that QM "is science's most rigorously tested theory." I've seen that assertion in multiple, disconnected places.

Unfortunately, quantum mechanics defies common sense.

Well, yeah. That's one reason I disparage the idea of "common sense." Saying that QM "defies common sense" is an indictment of the concept of common sense, not science.

For more than a century, physicists have tried to interpret the theory, to turn it into a coherent story, in vain. “Every competent physicist can ‘do’ quantum mechanics,” a leading textbook says, “but the stories we tell ourselves about what we are doing are as various as the tales of Scheherazade, and almost as implausible.”

I think this is due, in part, to our limited macroscopic experience. A story requires some shared experience; at the very least, the reader must have at least one language in common with the writer for the reader to comprehend the story. And our language, which includes concepts like "something is either a wave or a particle, never both," is incompatible with the language of quantum mechanics.

Many physicists ignore the puzzles posed by quantum mechanics.

I wouldn't go that far. It's more like they have to decide whether the puzzle is relevant to the outcome.

A newish interpretation of quantum mechanics called QBism (pronounced “Cubism,” like the art movement) makes subjective experience the bedrock of knowledge and reality itself. David Mermin, a prominent theorist, says QBism can dispel the “confusion at the foundations of quantum mechanics.” You just have to accept that all knowledge begins with “individual personal experience.”

I can accept that assertion. What I don't accept, at least right now, is that it also ends there.

But QBism’s core message, science writer Amanda Gefter says, is that the idea of “a single objective reality is an illusion.” A dream, you might say.

And here's where I start having Issues.

I talked about the importance of stories. But what is a story? It can be passed verbally, but these days, it's transmitted through writing on a screen or in a book. We can each read a book and come away with different impressions of it, like, say, how I think James Joyce wrote dreck while other people worship the guy's stuff and even do entire college courses on it. This difference of opinion and viewpoint doesn't change the objective reality that Joyce wrote books.

If you want to get really technical about it, a book is a collection of (usually) formerly-living organic matter, bound with more formerly-living organic matter and containing marks made by matter that reflects light differently. Go another level in, and it's an object made of mostly hydrocarbon chain molecules. Another level beyond that, and it's electrons and quarks and whatnot: all energy of some kind. That may be the base "reality," but it doesn't make our ability to read the book and draw conclusions from it some sort of illusion. It's just a different level of reality.

Similarly, our different and varied worldviews aren't evidence of illusion.

Proponents bicker over definitions, and physicists and philosophers fond of objectivity reject QBism entirely. All this squabbling, ironically, seems to confirm QBism’s premise that there is no absolute objectivity; there are only subjective, first-person viewpoints.

Maybe it's irony that people are arguing over whether disagreement indicates that all this shit around us is an illusion. I don't know. I'm still a little hazy on the concept of irony, despite doing a whole entry about it a while back.

Some artists thwart our desire for meaning. T. S. Eliot’s poem The Waste Land is an anti-narrative, a grab bag of images that pop in and out of the void. The poem resembles a dream, or nightmare. Its meaning is that there is no meaning, no master narrative.

Some people might find an inherent contradiction in my disdain for Joyce and my appreciation of Eliot. I can only quote Walt Whitman, a poet I'm not generally fond of: I am large; I contain multitudes.

If you are a practical person, like one of the finance majors in my freshman humanities class, you might conclude, along with T. S. Eliot, that efforts to comprehend existence are futile. You might urge friends majoring in philosophy to enjoy life rather than fretting over its meaning.

I do consider myself a practical person, mostly. But the idea "enjoy life rather than fretting over its meaning" is itself a philosophy.

I'd go so far as to say that I consider our different and varied viewpoints to be evidence of reality, not of illusion. If everyone agreed on every detail, then I'd be suspicious that I was living in some holodeck simulation.

In the end, the article fails to convince me that "life is but a dream." It may be different things to different people, but we're here, alive at least for now, to argue about it. And unless you're a solipsist, well, there's the foundation of our shared reality.
April 24, 2024 at 1:02am
#1069547
Yes, the article I'm sharing today was one of my inspirations for the Comedy newsletter editorial that just came out: "Keep the Tip". It's a coincidence that my random number generator picked it out of my queue today.



I've also done other blog entries on the subject before, but this is a relatively new take, from NPR.

Businesses that never seemed to ask for a tip before — like grocery stores, self-checkout machines and fast food restaurants — are now asking for one these days.

As I noted in the editorial, I can't personally confirm the self-checkout tip chutzpah, but this is not the only source that asserts that it's happening. It's rare to find begging that blatant and uncalled-for.

If a business you don't expect to ask for a tip is suddenly asking you for a tip, what should you do?

Ideally, take your business elsewhere.

But Shubhranshu Singh, a marketing professor at Johns Hopkins University, likes to leave a 10% tip. If an establishment is asking for a tip, it's often an indication that the workers there are not getting paid a minimum wage.

If the workers aren't getting paid a minimum wage (outside exempt employees such as restaurant servers, who are some of the few that should be tipped), that business needs to face consequences for it. How that would work, though, I don't really know.

Some businesses load their payment systems with default minimum tip options of more than 20%. If you don't want to give that much, don't worry about holding up the line to take an extra moment to select the "custom tip" option, says Singh.

In those situations, I select the "no tip" option, and definitely try to find somewhere else to shop in the future. I know it's usually not the workers' responsibility to set those defaults. Still, it's bloody outrageous.

Don't forget to tip people who you might not have a direct interaction with, like hotel housekeepers, says Singh.

I'm torn on the whole tipping hotel housekeepers thing, honestly. I've done it. I've skipped out on doing it. I feel like it should be done for extended stays, on the theory that a tipped housekeeper might go the extra mile for you and hopefully not put itching powder in your undies. This makes it more of a bribe than a tip.

Tipping is also a way to pay workers more without actually raising their wages. It allows restaurants to get more money to workers while still keeping their prices low, says Sean Jung, a professor at Boston University's School of Hospitality Administration.

But if tips are expected and customary, as with restaurants, the "low" prices (they've still gone up along with everything else recently) are a lie. Nevertheless, some restaurants have experimented with no-tip options. Usually, this means raising their prices by 20-25%. Then, when comparison shopping, a potential customer sees higher prices and balks, ignoring which places take tips and which don't. Every place I know that tried this route went out of business.
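
To put rough numbers on that comparison-shopping trap, here's a quick back-of-the-envelope sketch. The prices are hypothetical, mine rather than anything from the article:

```python
# Hypothetical prices, for illustration only: the no-tip menu looks pricier
# even though the diner pays the same amount either way.

entree = 20.00
tip_rate = 0.20

tipping_menu_price = entree                       # what the comparison shopper sees
tipping_total_cost = entree * (1 + tip_rate)      # what they actually pay

no_tip_menu_price = entree * (1 + tip_rate)       # "tip" baked into the listed price
no_tip_total_cost = no_tip_menu_price

print(tipping_menu_price, tipping_total_cost)     # 20.0 24.0
print(no_tip_menu_price, no_tip_total_cost)       # 24.0 24.0
```

Same $24 out the door, but only one of those menus scares people off at a glance.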

In other words, it needs to be across the board, which means either a) collusion, which, as I understand things, is illegal; or b) regulations, which are already overdone.

Tipping culture in the US is so ingrained that there's not much will to do either. And even if they did, people will still try to tip, because it's what they're used to doing.

Oh, well... at least this article, unlike some others on the topic that I've shared here, doesn't rehash the false TIPS backronym, which, as I've noted repeatedly, has been thoroughly debunked. And yet, the article is light on practical information and encourages way too much work for the consumer, who just wants a damn hamburger.

I'm not saying people shouldn't get decent pay. Quite the contrary. I'm just saying it shouldn't be laid directly on our shoulders. I'd prefer to see the actual price of things up front (and this also goes for hidden fees from businesses like airlines, ticket sellers, and hotels). Until that utopia comes into existence, though, I continue to tip waitstaff and bartenders.
April 23, 2024 at 11:38am
#1069487
One thing that always strikes me about New York City is the scale of the dog infestation.

I'm pretty sure that having a dog is a requirement of living here. I wouldn't be surprised if people are issued one when moving in: "Welcome to New York. Here's your dachshund." At the very least, even though the internet is telling me that having a dog in NYC requires a license, I'd bet they charge you a fee if you don't have a canine.

And the furry barking tail-waggers are everywhere. At any given moment on a typical block (on the Upper West Side, because I can't be arsed to take a subway ride just to support a blog entry), there are at least three dog-walkers on the sidewalk with an average of two dogs on leashes. Doing some quick math leads me to believe that not a single resident of the city is dog-free. Maybe they occasionally just appear in your expensive, cramped apartment via wormhole, complete with a regulation 6-foot-or-shorter leash and a roll of plastic doggie doo bags.

At least New Yorkers tend to be good about using both of these items. Whether that's out of a heightened sense of civic duty, or because they have a healthy fear of the NYPD, I don't know, and it doesn't matter, because the result is the same.

This isn't a case of "stop liking what I don't like." It's not that I dislike dogs, any more than I dislike kids just because I don't want to own one. Although I can't be arsed to do all the work a dog requires (and I absolutely will not pick up dog shit, with or without gloves), I understand that a lot of people want to hang out with the mopey, needy quadrupeds. It's just that I know what the housing situation tends to be around here, and, even with daily walks, having a retriever in an apartment the size of my spare bathroom can't be good for the dog.

Not only that, but right now, it's pretty close to the middle of astronomical spring. This, right now, is one of two weeks of the year when it's actually pleasant to be in NYC, the other being the week of the fall equinox. Between these two events, in one direction, it's either pouring down rain or oppressively hot, or both; in the other direction, it's either pouring down rain or snow, or oppressively cold, or both. And yet, every day, you gotta walk those dogs. Twice, at least. While juggling the three jobs and two side gigs you need to afford to live in NYC.

I get that some people want to live with dogs. I get that some people want to live in NYC. I'm just left shaking my head in confusion over the intersection of the two.
April 22, 2024 at 2:00am
#1069370
As I noted yesterday, entries might be short for a few days while I'm out of town, and at odd times. Fortunately, I don't have a lot to say about this one, except for NOOOOOOO.



This is an Outrage and Something Must Be Done.

Hops give bitter its taste but the plant doesn't like the hotter, drier conditions we've experienced in recent decades and production has plummeted.

Researchers in Kent are isolating hop genes in the hope of producing more climate-change resilient varieties.


I should note that what the British call "bitter" is still heavy on the malt side. But hops are still used for flavor and aroma, and as a preservative.

Yes, the generally super-bitter IPA is originally a British style, but it's more associated with the US.

They also want to produce more intense flavours that are now becoming popular.

Please don't.

"We are just going to be importing beer and we won't have the culture that goes with it anymore."

Gosh, if only you could import only one of the ingredients instead of the whole beer. Hops, for example.

Anyway, most of the article is about what scientists are doing to mitigate this Very Important Problem, and of course, I think it's awesome that they're devoting their best brainpower to solve this major issue. I don't mean climate change; I mean disappearing beer. Priorities, folks.

Don't agree with me that it would be a disaster to go without beer? Well, climate change is also killing off coffee. Oh, now you want to do something about it.
April 21, 2024 at 9:17am
#1069306
I'll be on what passes for vacation for me this week—I have nothing to vacate from, but I do like to go elsewhere from time to time—so entries will happen whenever I get a chance, and they'll probably be short and even more pointless than usual.

For today, though, I'll do my usual Sunday thing and look back at an old entry. This one's from 2020: "Hack This"

Worth reading if you want a decent takedown of "lifehacks." Or even if you don't. Especially if you don't.

I still see "lifehacks" from time to time, but I see more parodies of them, which warms my heart. Things like: "Life hack: Don't have a mental breakdown at home. Have it at work so at least you're getting paid for it."

The article I linked (Medium, 2017) is still up as of today. And my opinions haven't changed much, but maybe a little, and maybe some points need clarification. But first, I'll address the end of the entry, where I discover, too late, that this advice article that is a takedown of advice books and articles is actually an ad for the author's advice books.

Also, I hate reading this far along a halfway decent article only to find that it's a commercial in disguise. Bah.

Yes, I have said numerous times that I don't mind taking a look at book ads here, on a site that caters to readers and writers, so long as the content is worth commenting on. This one, however, did manage to catch me by surprise. Usually the book-flogging is near the beginning, or in a sidebar, or otherwise obvious when you start reading the article. This one was, I felt, deceptive—more so because, as I wrote then: "I feel like he has good points, but those are somewhat muted by the fact that he's doing exactly what he's railing against."

Now... one might say, "But Waltz, here you are doing it too." Yes. I am. But I'm not trying to sell anything.

Besides, if I really objected to it, I'd have scrapped the entry and done a different one.

Rereading this entry, I realize it might read as if I'm against self-help books and articles in general. I'm not, necessarily. It's just that most of them only "self-help" the author to make money. There's nothing wrong with making money; most of us want to do that. But doing it by misleading others into doing something that doesn't help, and may even actually harm, is generally called "fraud," and is frowned upon.

So, just one more comment on my previous comment:

I keep seeing that the true enemy is "processed" or "overprocessed" foods, but I haven't found a good definition for those, yet. I mean, technically, cooking is a process, and - raw-food-diet bullshit aside - cooking is what makes a lot of food more nutritious and digestible. Potatoes, for example. It's probably what allowed our ancestors to evolve these great big brains that most of us don't use.

Obviously, it's been four years, so a) "processing" has been better-defined, while at the same time I haven't seen much about it lately; I wonder what the next anti-fad will be. And b) I shouldn't have typed that last sentence; it's misleading and does what I'm railing against here: assertion without evidence or experience. Not that I'm above doing that, but in this particular case, it's evolutionary guesswork, which I have issues with, and it's also what's commonly known as a "chicken/egg" scenario: did we evolve big brains because our ancestors cooked their food, or did they cook their food because they had big brains from some other adaptation? Or was it a synergy of some kind? My only point should have been that cooking food is good.

In reality, the "chicken/egg" scenario is easily resolved: eggs existed long before what we call chickens, and the first chickens hatched from eggs laid by some dinosaur descendant that was almost a chicken. That's still evolutionary speculation, but at least in this case, it fits with what we know of evolution.

Therefore, it was the egg that came first.

Fortunately for chickens and those of us who enjoy eating them, it wasn't cooked.
April 20, 2024 at 11:45am
#1069258
I have almost no formal training in philosophy. This doesn't stop me from discussing philosophy. And today's article is, nominally, all about philosophy, so here I am, writing about it.

    Philosophy is an art
For Margaret Macdonald, philosophical theories are akin to stories, meant to enlarge certain aspects of human life


Well, that's one point of view. And as we're all about writing, here, the connection to "stories" is a fruit too low-hanging for me to ignore.

Note: I'm unclear on whether she styled her surname Macdonald or the more common MacDonald. Sources differ. I'm sticking to the linked article's style.

‘Philosophical theories are much more like good stories than scientific explanations.’ This provocative remark comes from the paper ‘Linguistic Philosophy and Perception’ (1953) by Margaret Macdonald. Macdonald was a figure at the institutional heart of British philosophy in the mid-20th century whose work, especially her views on the nature of philosophy itself, deserves to be better known.

Full disclosure: I'd never heard of Margaret Macdonald before reading this article, confirming the "deserves to be better known" bit above.

As for the actual assertion she made, I don't find it provocative. Maybe this is because, in the single human lifetime since that paper was published, some of that idea has permeated public consciousness; but if so, why would a 2024 essay call it thus?

Philosophy, stories, and science are all interests of mine, so that sort of statement is just begging me to comment on it. So I will:

I don't think I've ever held the view that philosophy is scientific. I have, I'm pretty sure, noted in previous entries that philosophy and science are different, and complementary: philosophy can guide science, and science can inform philosophy. I mean, sure, in the early days of science, what we call physics was known as "natural philosophy," but in those same early days, you had scientists studying alchemy or astrology or other subjects we consider mystical; things change.

For instance, science achieves results through, among other things, experimentation. But it's philosophy that tells us that certain kinds of experimentation, on humans or other animals for example, are ethical violations. Even that philosophy, though, has changed over time as certain experiments have provided evidence that some nonhuman animals feel pain and suffer. (I have another article in my queue about that very thing.)

As for the asserted closer connection between philosophy and stories, I'm willing to listen to that argument.

I'm going to give the phrase "philosophical theories" a pass for now. The article gets to it, eventually.

Early proponents of the ‘analytic’ method in philosophy such as Bertrand Russell saw good philosophy as science-like and were dismissive of philosophy that was overly poetic or unscientific.

All due respect to Russell—some of his ideas contributed greatly to advances in computing, among other things—that's a narrow view.

But where would philosophy be if philosophers always agreed with each other?

Russell’s view of what counts as good philosophy was not one that Macdonald shared. In her 1953 paper, she embraces comparisons between philosophy and literature, poetry and art. For Macdonald, philosophical theories are very much like ‘pictures’ or ‘stories’ and, perhaps even more controversially, she suggests that philosophical debates often come down to ‘temperamental differences’.

I'm going to be using the word "art" in my commentary in the broadest sense, as in the article's headline. Literature (also in its broadest sense to mean "any work of fiction") and poetry are forms of art, as are music and dance.

For example, whether you are willing to believe (in accordance with thinkers like René Descartes) that we have an immaterial soul will come down to more than just the philosophical arguments you are presented with. Your view on this matter, Macdonald thinks, will more likely be determined by your own personal values, life experiences, religion and so on.

I think it's important to acknowledge that we all have different worldviews. Unsurprisingly, I have an article in the queue about that subject, as well. The truth of someone's idea—and I also acknowledge that "truth" can be a slippery concept—isn't necessarily dependent on other ideas the person has. What I mean by that is, for example, Descartes is best known for two things: "I think; therefore, I am" and the idea to plot coordinates on a grid. Those ideas seem unrelated. One can argue the truth of the first one without reference to the second, which, reductively, is simply a very convenient way to visualize data. And neither of them depends on mind/body dualism, though I can see how both could have come out of that worldview.

Anyway. The article also provides some background on Margaret Macdonald, who's the philosopher we're actually talking about here, and I won't quote it much. In brief, she was abandoned as a child and dealt with sexism in her professional career, both of which surely shaped her worldview. And it discusses how she got started in linguistic analysis.

Linguistic analysis involves paying attention to and drawing conclusions from the language used in particular contexts, including philosophical debates, scientific theories, and ordinary (common-sense) language.

That sound like the birth of postmodernism to you? It does to me. I object to the use of "common-sense," because I do not philosophically believe that it exists, but that's a quibble.

Putting the tools of linguistic analysis to work, Macdonald focuses her attention on the word ‘theory’. What do philosophers mean when they talk about philosophical ‘theories’? And is it the same thing that scientists mean when they use the word ‘theory’? Macdonald’s answer is a categorical ‘No’.

Gotta agree with her here. Also, both are different from what ordinary people think of as a "theory."

She claims that, when scientists put forward theories, they do so to explain empirical facts. Scientists put forward hypotheses (eg, ‘Earth is round’ or ‘physical objects are governed by laws of gravity’), which can then be verified (or falsified) by experiments and observations, leaving behind only plausible theories, and eliminating those that are refuted by factual evidence.

I'm not sure that states things as clearly as it could. A theory, in the scientific sense, is an explanation that can be, or has been, thoroughly tested and compared against evidence. It can be falsified, as with the old idea that heat is a substance in itself, or the luminiferous ether. It can be supported by overwhelming evidence, as with evolution. It can never be fully "proven," a common misconception about the purpose of science.

According to Macdonald, philosophical theories cannot be tested. Is that true? What might she mean by this? Once again, she uses the philosophy of perception as her example.

I mean, sure, it feels true. But we need to beware: if a philosophical theory can be tested, supported or falsified, it suddenly becomes a scientific theory.

Thus, the first step in Macdonald’s meta-philosophical argument is to show that philosophical theories are not ‘theories’ in a scientific sense since they lack the essential criterion of being confirmed or refuted by fact. For this reason, she argues, philosophical theories, unlike scientific theories, are not in the business of discovering new facts.

Fair enough.

Macdonald claims that philosophy’s value is much closer to that of art, literature or poetry than science. She explains that the arts inform us that ‘Language has many uses besides that of giving factual information or drawing deductive conclusions.’ A philosophical theory may not provide ‘information in a scientific sense’, she writes, ‘but, as poetry shows, it is far from worthless.’

Gonna have to side with Macdonald here.

At this point, one might think: enough is enough. It’s all very well to consider how philosophy overlaps with the arts, but surely Macdonald has gone too far when she suggests that philosophical theories are just ‘good stories’.

My problem there is the adverb "just." Not only is that a slippery word in itself, with several possible meanings, but in this context, it's dismissive. "He's just a kid." "She's just a woman." My worldview, it should come as no surprise, includes the idea that "good stories" are one of our most important and effective ways of communicating with each other. There's no "just" about it; calling something a good story is one of the highest forms of praise.

If philosophical debates come down to ‘temperamental differences’, then it looks like there’s no real right or wrong (or true or false) – any more than it’s right or wrong to prefer John Keats to Shelley, or Sally Rooney to James Joyce.

That's a leap, and the leap involves the idea that right and wrong are absolute and binary: something is either right, or wrong; it's either true, or false. Life doesn't usually work that way, much as some of us would like it to. We (justly) (see what I did there?) praise people for helping little old ladies across the street and refraining from kicking puppies, but what about someone who helps a little old lady across the street, kicking a puppy out of the way in the process?

And sure, an argument can be made that not helping someone across the street serves a greater public good, possibly eliminating someone who's a drain on society. That's a worldview commonly referred to as "evil." Delving into that philosophical morass is beyond my scope here.

Is this really true? Are judgments about art, literature and poetry purely a matter of subjective preferences? Some might be tempted to answer ‘Yes’. If I like my child’s hand painting more than a piece hanging in Tate Modern, I might be inclined to say that, for me, it is a better piece of art. Similarly, if I get more enjoyment reading Rooney’s novel Normal People than Joyce’s Ulysses, then who’s to say that Joyce is a better writer.

It is an unassailable fact that Joyce was shit, the literary equivalent of Jackson Pollock. Okay, okay, no, that's just my opinion. But I stand by it.

The worry might persist that surely there’s the matter of truth to contend with. Philosophical theories might be like good stories, but surely only one of those stories can be true, or at least closer to the truth than another?

It's been expressed by better minds than mine: Fiction always contains truth. Well, except maybe Finnegans Wake. Fine: most fiction contains truth.

Well, I've banged on long enough, and there's a lot more at the link. You might not agree. That's fine. My main purpose here was to share my new knowledge of a philosopher I hadn't heard of, and look at what she was trying to express. Her ideas are at least worth contemplating... but so are Russell's.
April 19, 2024 at 10:31am
#1069178
I've done entries before about this sort of thing, but it doesn't look like I've addressed this particular article. It's quite ancient by internet standards, but I don't get the impression that much has changed on this subject since 2016.



These aren't necessarily bad things, but being aware of them might at least give you a little smug feeling next time you encounter one.

A restaurant’s menu is more than just a random list of dishes.

"Random" doesn't mean what they think it means.

It has likely been strategically tailored at the hands of a menu engineer or consultant to ensure it's on-brand, easy to read, and most importantly, profitable.

Funny, I don't remember menu engineering as an option when I went to engineering school.

As for "profitable," don't say that like it's a bad thing. Sure, focus on profits above all else is greedy, but if a business doesn't turn a profit, it's not a business much longer.

1. THEY LIMIT YOUR OPTIONS.

The best menus account for the psychological theory known as the “paradox of choice,” which says that the more options we have, the more anxiety we feel.


This is also not a bad thing. Your options usually aren't actually limited: you could go to a different restaurant, eat at home, or choose not to eat.

Some big chain restaurants have enormous menus with lots of options. Less "menu" and more "book." Think Applebee's, though I haven't been to one of those in so long that I don't know if it's still the case that their menus are about the length of War and Peace. People keep going back, though, as evidenced by the fact that many of them still exist, having survived the pandemic.

The problem with those encyclopedic menus, for me, isn't the overwhelming number of choices. No, it's the knowledge that the only way I know of to prepare that much of that variety of food is to have frozen pre-packaged portions that get microwaved.

2. THEY ADD PHOTOS.

Including a nice-looking picture alongside a food item increases sales by 30 percent, according to Rapp.


Oh no! The horror! We're being manipulated by being shown images of what we might want to eat!

No, the only problem with this is when the actual served dish is far removed from the idealized photo. Food menu photography is an art of its own. I dabbled in it a bit back in college. The photography part wasn't hard. The hard part was faking the food item in such a way that it looked hyperreal, like the actual food but... more appetizing, even though it wasn't made of real stuff.

Still, it's helpful to the consumer if the image roughly corresponds with the ingredients and serving size you're actually going to get.

3. THEY MANIPULATE PRICES.

One way to encourage you to spend more money is by making price tags as inconspicuous as possible. “We get rid of dollar signs because that’s a pain point,” says Allen. “They remind people they’re spending money.” Instead of $12.00 for that club sandwich, you’re likely to see it listed as 12.00, or even just 12.


Meh, whatever. That's on you.

One thing that has changed since this article originally came out, and it happened very recently, is that Wendy's announced that it would implement "surge pricing" much like Uber does, charging more at peak hours. Easy enough to do now that fast-food menus are mostly digital display screens, I suppose.

The problem is, in my amateur opinion, that they went about it in exactly the wrong way. They said "we'll increase prices at peak times," and people were outraged. How dare they? Whereas if they'd quietly increased prices across the board, people probably would barely notice, especially if they don't eat fast food very often. More, if they'd framed it as "we're going to decrease prices during off-peak hours," my feeling is that the vast ravening herd would have praised them for being so generous, even if it were accompanied by a quiet, general price-hike.

Don't believe me? Consider the popularity of matinee movies and happy hours, which are marketed exactly like that.
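To make the framing point concrete, here's a minimal sketch with invented numbers (Wendy's published no actual figures that I know of). A "peak surcharge" and an "off-peak discount" can land on the identical price schedule; only the announcement differs.

    # Toy illustration, hypothetical prices only: "surge pricing" vs.
    # "off-peak discount" framing can yield the exact same numbers.

    BASE = 6.00           # made-up everyday price of a combo meal
    PEAK_MARKUP = 0.20    # 20% more at the lunch rush

    # Framing A: announce a surcharge at peak times.
    surge = {"off_peak": BASE, "peak": round(BASE * (1 + PEAK_MARKUP), 2)}

    # Framing B: quietly raise the everyday price, then advertise an
    # "off-peak discount" that lands right back at the old base price.
    new_base = round(BASE * (1 + PEAK_MARKUP), 2)
    discount = {"off_peak": round(new_base / (1 + PEAK_MARKUP), 2), "peak": new_base}

    print(surge)     # {'off_peak': 6.0, 'peak': 7.2}
    print(discount)  # {'off_peak': 6.0, 'peak': 7.2}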

I don't care, because I eat at Wendy's maybe once a year, if that. But last I heard, the company's damage-control machinery was backpedaling at full blast.

4. THEY USE EXPENSIVE DECOYS.

On menus, perspective is everything. One trick is to include an incredibly expensive item near the top of the menu, which makes everything else seem reasonably priced. Your server never expects you to actually order that $300 lobster, but it sure makes the $70 steak look positively thrifty, doesn’t it?


I've written about this decoy pricing before, most recently here: "The Real Decoy."

5. THEY PLAY WITH YOUR EYES.

Just like supermarkets put profitable items at eye level, restaurants design their menus to make the most of your gaze.


Again, this isn't necessarily a bad thing.

6. THEY UTILIZE COLORS.

According to Allen, different colors help conjure feelings and “motivate” behavior.


Some of these color associations strike me as having about as much rigor as astrology, but, I reiterate, so what? Marketers have been using color to catch our eyes for as long as there's been color.

7. THEY USE FANCY LANGUAGE.

Longer, more detailed descriptions sell more food.


Duh. You put a menu item up that says "green beans," and I think of the slop my mother used to make of those stringy bastards. Label it "haricots verts" (French for, literally, "green beans," in case you didn't know) and describe it as "lightly braised in olive oil with crushed roasted almonds and just a hint of garlic," and I'll take a double portion, please.

All food is better in French. That's just a fact.

Related:

“People taste what you tell them they’re tasting,” Rapp says. Consider this: In another study, researchers presented two different groups with the same red wine but with different labels. One label said North Dakota (do they even make wine there?), the other said California. In taste tests, the “California” wine squarely defeated the “North Dakota” wine even though both groups' glasses were filled with “Two-Buck Chuck”.

1) Yes, they make wine in North Dakota. From what I hear, it has a terroir reminiscent of fracking.

2) This is why I hate blind taste-tests. I eat and drink for the whole experience, not just the stimulation of one sense.

3) That "study" involved more psychological manipulation than any of these listed "tricks."

8. THEY MAKE YOU FEEL NOSTALGIC.

We all have that one meal that takes us back to childhood. Restaurants know this tendency, and they use it to their advantage.


Maybe I'm in the minority here, but I assert there's nothing inherently wrong with nostalgia. As long as you identify it for what it is, and understand that not everything "back then" was roses and sunshine.

For instance, when I was a kid, there wasn't a publicly-available internet. Some might think that was better. I do not.

In any case, I think it helps to know these things, even though I don't think any of them are inherently deceitful. At least, not any more than most marketing. But being aware might help you catch when someone really is trying to trick you, usually some university psychologist who's feeding you Two-Buck Chuck.
April 18, 2024 at 10:15am
#1069054
This Guardian article should be uncontroversial.

    ‘Outdated and misleading’: is it time to reassess the very concept of money?
It’s regularly being created and destroyed – and economic models that don’t reflect that fact are not even slightly useful


...not.

The article itself, which begins with the simplistic concept of banking enunciated in the well-known movie It's a Wonderful Life, isn't as amenable to out-of-context quotations as some of the stuff I post here. I haven't seen that movie in decades, by the way, but I do remember the scene in question: George Bailey (James Stewart) dealing with a potential run on the bank.

When you borrow money and your bank credits your loan account, the account balance is created anew, “from thin air”, not from or in relation to existing deposits or other existing money. And as you repay the loan principal, the money created at the time of the loan gradually disappears, reverting to its previous form of airy nothingness.

I'm no economist money talker person, but to me, that just sounds like someone figured out that negative numbers exist and can be operated upon just like positive numbers.
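As a minimal sketch of what the article is describing (invented numbers, and obviously not how real banking software works), here's the double-entry arithmetic: the loan credits a deposit into existence, and repaying the principal removes it again.

    # Toy double-entry sketch, invented numbers: a loan creates a matching
    # deposit "from thin air," and repaying the principal destroys it again.

    deposit = 0.0   # your account balance (the bank's liability to you)
    loan = 0.0      # your loan balance (the bank's asset, your negative number)

    def borrow(amount):
        global deposit, loan
        deposit += amount   # money appears in your account...
        loan += amount      # ...matched by the debt you now owe

    def repay(amount):
        global deposit, loan
        deposit -= amount   # money leaves your account...
        loan -= amount      # ...and the matching debt shrinks away

    borrow(10_000)
    print(deposit, loan)   # 10000.0 10000.0  (money "created")
    repay(10_000)
    print(deposit, loan)   # 0.0 0.0          (money "destroyed")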

Your deposit account is a liability for your bank and, as a depositor, you are no more than one among many of the bank’s unsecured creditors.

Leaving aside for the moment the mistaken assumption that everyone has enough unspent money to even have a deposit account, this has been the way it works for at least as long as I've been alive. Which, of course, doesn't mean it's the only way to do things.

In normal times a promise from a private bank is nearly as good as a promise from a government or a central bank. But in a crisis the promise is worth much less, and can be worth as little as nothing at all.

That may be true in the UK, where this source is based (I don't know), but here, we have federal deposit insurance for that sort of thing, as a result of the Great Depression.

If money is as unreal and ephemeral as they claim, though, a run on the bank shouldn't make a difference. The bank, or its insurance agency, could just conjure more out of the same "thin air" that they do when you get a loan. That it does make a difference tells me there's something incomplete about this article's assertions.

Details aside, the US and UK banking and government systems aren't all that different from each other where finance is concerned, so I'm not saying this article is completely useless on my side of the pond.

There's a bit more I can't really quote out of context, then:

Crucially, society as a whole needs to think differently about the nature of money – possibly by first discarding the term itself. “Money” encompasses a range of phenomena that have intrinsically different purposes and risks.

Well, yeah. I've known that for a while. Look up "money supply." Oh, wait, no, I just did that so you don't have to.

The article's conclusion:

A new financial conversation would make the causes of inter-generational inequities more explicit, and it would allow the community to rethink allowing “too big to fail” banks to earn, year in year out, enormous risk-free profits.

There may be ways to accomplish these seemingly worthy goals without drastically switching to what appears to be a Star Trek economy, which, in truth, would never work without access to near-unlimited free and clean energy. Not that the economics are ever properly explained in the Star Trek universe, except maybe by holding up the capitalistic Ferengi as a counterexample.

The truth that no one seems to acknowledge is that our current system requires an underclass. Okay, that might not be a "truth," but only my opinion, and, like I said, I'm not an expert. The underclass doesn't have to be a particular demographic, though, sadly, it has been. Gotta have people desperate enough to eat and otherwise stay alive, so they'll take the crappy but necessary jobs that almost no one would freely choose.

That's not freedom. That's economic coercion. But any suggestions to improve it are above my pay grade.

Pun intended.
April 17, 2024 at 10:24am
#1068972
Something interesting from the BBC last year:



I thought all Australians lived underground. Otherwise, they'd fall up into space.

I mostly saved this article so I could use the entry title I did. I'm inordinately proud of that pun. But it's also a fascinating concept, at least to me.

These are the first signs of Coober Pedy, an opal mining town with a population of around 2,500 people. Many of its little peaks are the waste soil from decades of mining, but they are also evidence of another local specialty – underground living.

That name sounds as Australian as kangaroos and Crocodile Dundee. I wonder if the article will state its origin.

In the winter, this troglodyte lifestyle may seem merely eccentric. But on a summer's day, Coober Pedy – loosely translated from an indigenous Australian term that means "white man in a hole"...

Of course it did, and it's hilarious.

...needs no explanation: it regularly hits 52C (126F), so hot that birds have been known to fall from the sky and electronics must be stored in fridges.

Yeah, I prefer heat to cold, but that's just going too far. I thought Phoenix was bad in the summer.

As the blistering three-month heatwave continues in the US – with temperatures even cacti can't handle – and wildfires incinerate swathes of southern Europe, what could we learn from Coober Pedy's residents?

Again, this article is from last year. August, specifically. At least it was winter then in Australia.

People have been retreating underground to cope with challenging climates for thousands of years, from the human ancestors who dropped their tools in a South African cave two million years ago, to the Neanderthals who created inexplicable piles of stalagmites in a French grotto during an ice age 176,000 years ago.

Well, duh. It's not exactly news that humans and related apes lived in caves.

Caves served another very important purpose, as well. As the article notes, most underground locations that are deep enough—provided one isn't near a volcano or whatever—stay around 13C, or 55F. Before refrigeration, this was a common way to keep beer and wine relatively chilly, and thus preserve them longer. Even now, a lot of fermented beverages are best served at "cellar temperature."

Apart from comfort, one major advantage of underground living is money.

That, and your HOA can't give you shit if you happen to paint your house the wrong shade of pink.

On the other hand, many underground homes in Coober Pedy are relatively affordable. During a recent auction, the average three-bedroom house sold for around AU$40,000 (£21,000 or US$26,000).

I'd imagine that at least part of that is its remote location.

The question is, could underground homes help people to cope with the effects of climate change elsewhere? And why aren't they more common?

Guessing that at least part of the answer to that last question is "rock" and "water tables." And also "building codes" and "zoning laws." It's my understanding that, to be considered a bedroom in the US, a room must have a window. Windows aren't exactly a common feature of underground living.

Besides, if your neighbor lives in a cave, how are you supposed to judge their lifestyle by how they keep their yard?

There are several reasons why making dugouts in Coober Pedy is uniquely practical. The first is the rock – "It's very soft, you can scratch it with a pocket knife or your fingernail," says Barry Lewis, who works at the tourist information centre.

Okay, so I might have cheated a bit with my "rock" answer, because further up in the article, they mentioned it was sandstone and whatnot.

However, the feats at Coober Pedy would not be possible everywhere. One major challenge with any underground structure is damp.

I did say "water tables." Civil engineering education is good for something.

But in Coober Pedy, which sits on 50m (164ft) of porous sandstone, conditions are arid even underground. "It's very, very dry here," says Wright. Ventilation shafts are added to ensure an adequate supply of oxygen and to allow moisture from indoor activities to escape, though these are often just simple pipes sticking up through the ceiling.

The downside of building, even underground, in a desert is the other side of that equation: where do you get your water? I missed it if the article addresses this. I imagine there's an aquifer further down, but I don't know.

The article also doesn't note another important thing about learning how to live underground. If we end up with people living on the Moon and/or Mars, they're pretty much going to have to do so in underground habitats. Not only does the rock above you provide a barrier to space radiation, but it all but eliminates potential problems with small meteorites punching a hole in your bio-dome or whatever.

Also, nearly unlimited beer storage.
April 16, 2024 at 10:19am
#1068893
Not every question has an answer, as illustrated by this Cracked piece about riddles.



By "fictional," they mean "appearing in works of fiction." Lots of riddles are fictional in that someone made them up.

The literary, cinematic and funny-paper canon is full of riddles because they’re as fun for the audience as they are dire for whichever hero must solve them to obtain the One Ring, whatever the hell a sorcerer’s stone is, etc.

The Sorcerer's Stone was meant to be the Philosopher's Stone, which is what it's called in the original British edition. Of all the ways they dumbed down that series for an American audience, that was probably the worst.

The Philosopher's Stone was, historically, the hypothesized material that alchemists thought could turn lead into gold, confer eternal youth, or perform whatever transformative trick was faddish back before science. Kind of the Unobtainium of its time, and its own unsolvable riddle. The more philosophical alchemists treated it not as an actual material, but as a transformative idea for achieving enlightenment or whatever.

5. Alice’s Adventures in Wonderland

Just a few pages after the Mad Hatter asks Alice, “Why is a raven like a writing desk?” in the rant against modern math most beloved by hippies, the answer becomes clear: He doesn’t know.


Likely my first encounter with the subject at hand. Pretty sure it annoyed me back then, but it taught me an important life lesson: Not every question has an answer.

As the article notes, people have stumbled around trying to find an answer, but, like the search for a material Philosopher's Stone, answering it misses the entire point.

4. Life Is Beautiful

Toward the end of the movie, Lessing begs Guido for help with a riddle, translated to English, “Fat, fat, ugly, ugly / All yellow / If you ask me where I am, I say ‘quack, quack, quack’ / Walking along, I say ‘poo poo.’” Sound like nonsense? It is.

In contrast to the Mad Hatter thing, I've never encountered this, which apparently comes from an Oscar-winning movie. But even had I seen the movie, I think I might have accepted that for the near-Brechtian absurdity that it obviously is.

3. Twelfth Night

This takes some explanation; fortunately, the article does just that. But you'll have to see it there, because I can't do it any justice with small excerpts.

Our theory is that Shakespeare wrote fart jokes and never intended anyone to think this hard.

On that point, I have to agree with Cracked.

2. The Hitchhiker’s Guide to the Galaxy

Every nerd’s favorite number is 42, supposedly “the answer to the ultimate question of life, the universe, and everything” calculated by the supercomputer Deep Thought in The Hitchhiker’s Guide to the Galaxy.

It is now absolutely impossible to have a philosophical discussion about the meaning of life without someone shouting that number out.

In fairness, that someone is often me.

1. Monty Python and the Holy Grail

Ah, yes, that other work of art that nerds can't help but quote. You know both this and #2 were created in the 1970s, right? Holy Grail in particular came out in 1975. For the math-challenged, that's nearly 50 years ago.

Wait... Shakespeare's and Carroll's works are much older than that. Some of us nerds like to quote them, too.

This has given comedy nerds a handy call and response by which to identify each other but also an actual riddle: What is the airspeed velocity of an unladen swallow?

It's true. I have been known to select people to hang out with based on whether they can do Holy Grail references. I knew one person who could quote the entire movie, verbatim.

That's dedication.

The real joke: It’s a trick. There’s no such thing as “airspeed velocity.” Sure, you could calculate the airspeed or the velocity of a swallow of any nationality.

"You're using coconuts."

Perhaps comedy is the actual Philosopher's Stone...
April 15, 2024 at 8:25am
#1068809
Not to brag, but I've known this for a long time. Still, there's always new stuff to learn. This is a fairly old article from Vox:



And collard greens. Also kohlrabi, but every time I mention kohlrabi people be like wut?

In some circles, kale has become really, really popular.

Too popular. I think people may have finally calmed down on it a bit since this article came out in 2015. I don't know what the latest fad food is, but I can almost guarantee that behind the fad is someone with money trying to make more of it.

Once a little-known speciality crop, its meteoric rise is now the subject of national news segments. Some experts are predicting that kale salads will soon be on the menus at TGI Friday's and McDonald's.

I don't know about Friday's, but apparently McD's did come out with a kale salad that had more calories than a Big Mac. Calories (if you haven't been following along, I talked about calories a few days ago) aren't the only indicator of healthiness in a food, but I do find that hilarious.

This makes it pretty interesting that kale and cabbage — along with broccoli, Brussels sprouts, cauliflower, collard greens, and kohlrabi, and several other vegetables — all come from the exact same plant species: Brassica oleracea.

Linguistic detour, as I often do: you might notice that most of those have some variant of "cole" in them. It's in the middle of broccoli, the beginning of cauliflower and collard and kohlrabi, and it's a slight vowel shift for kale. Not so much cabbage itself, but the syllable survives in that version's most perfect presentation, coleslaw. The odd one out is Brussels sprouts. One might be forgiven for thinking that this is where the species name brassica came from, but apparently not; brassica seems to have been Pliny the Elder's name for the cabbage group. No, Brussels sprouts were named for, believe it or not, Brussels, the one in Belgium, near which the plant was heavily cultivated.

Nor was Brussels apparently named after brassica. The similarity of the first syllables there seems to be another linguistic near-coincidence.

How is this possible? About 2500 years ago, B. oleracea was solely a wild plant that grew along the coast of Britain, France, and countries in the Mediterranean.

Because France doesn't count as "Mediterranean?" I guess they meant the north and/or west coasts of the country.

As for "how is this possible," obviously, the article goes into detail. But it's the same sort of thing as saying "How is it possible that a chihuahua and a Great Dane can be the same species?"

Short version: artificial selection; that is, trait selection by humans. As we, too, are part of nature, the distinction between artificial and natural selection is, well, an artificial one.

For the long version, there's the article.
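And for a toy middle version: here's a sketch, with entirely made-up numbers, of the principle at work. Keep only the extremes of one trait generation after generation, and that trait drifts far from the starting stock, while a line selected for a different trait drifts somewhere else.

    # Toy artificial-selection sketch (illustrative numbers only): keeping
    # the extremes of one trait each generation drags that trait's average
    # far from the wild-type starting point.

    import random
    random.seed(1)

    def select_for(trait, generations=50, pop_size=100):
        # each "plant" is [leafiness, bud_density]; wild type averages ~1.0 each
        pop = [[random.gauss(1.0, 0.1), random.gauss(1.0, 0.1)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda p: p[trait], reverse=True)
            parents = pop[: pop_size // 5]   # keep only the top 20%
            pop = [[random.gauss(p[0], 0.05), random.gauss(p[1], 0.05)]
                   for p in random.choices(parents, k=pop_size)]
        return [round(sum(t) / pop_size, 2) for t in zip(*pop)]

    print(select_for(0))  # leafiness climbs: a "kale-ward" line
    print(select_for(1))  # bud density climbs instead: a "broccoli-ward" line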

All this speaks to one thing: the remarkable power of human breeding and artificial selection.

Wow, that could have been worded better, couldn't it? Oh, well; I'm sure I've done worse.
April 14, 2024 at 7:19am
#1068727
Today's historical excavation takes us back to 2019, at a time when I was apparently participating in the 30-Day Blogging Challenge... though I neglected to note that in the entry: "Getting the Lead Out."

For context, my current >4 year daily blogging streak wouldn't start for about another month, and, obviously, this was in the Before Time.

The prompt I was writing to started with "What characteristics do you admire in a leader?" and continued to build on that. I'm not sure I really addressed all the nuances of the prompt, but I'm okay with that, especially now that it's >4 years later.

I am a Bad Example. I even printed up "business" cards with my name followed by "Bad Example."

My friend's ex-wife never liked me. She called me his "bad influence." I should have run with that, instead.

But I'm not always. Case in point: it might have been the month after this entry appeared, maybe not, but let me set the scene: California, the home of two of my friends, married to each other, and their kids. Breakfast time. Someone made pancakes. I poured a small amount of syrup onto said pancakes.

The mom, talking to the youngest kid, maybe 12 or so: "See how he doesn't drown the pancakes in syrup?"

The dad, to me: "Bet that's the first time you've been held up as an example of self-restraint."

Everyone involved, especially me, had a good laugh at that one.

Basically, I don't want to be a leader. I don't like it. The benefits don't outweigh the constrictions, for me. I admire those people who can do it...

I worded this poorly, I think. There are lots of people who are "leaders" that don't deserve, or get, my admiration, or even respect. Evangelical TV preachers, for example. Most politicians, from any political party. Certain billionaires. Thing is, those types get all the press. I'd be hard-pressed to cite an example of a leader I actually admire, at least in terms of naming someone well-known. Being well-known probably works against you, there; not one of us is without our follies and drawbacks, and being in the public eye tends to put a magnifying glass on them.

What I should have said was probably that I have some respect for those who can balance things well enough to be both a good leader and, mostly, still be what I'd consider a good person.

But there's one thing I hate more than being a leader, and that's being a follower.

Still true.

Fortunately, life isn't binary, and I choose Option C: going my own way. It's enough to be responsible for my own situation, and I'm not always so good at even that.

Not everyone has that option, I know.

Another thing I'm terrible at is motivating people. I like it when people motivate themselves. Any attempts I make in that direction always seem hollow to me. But I think good leaders find ways to motivate their team, though how they do it might as well be sorcery for all I understand it.

A few weeks after this entry, I did one on Emotional Intelligence, or EQ. I only remember this because I was skimming nearby entries while trying to remember the context of this one. I probably could have related the two concepts at that time, but I didn't. In summary, I question the whole concept of EQ because it seems self-referential: they define EQ as the quality needed to achieve success (based on a narrow definition of success), and to achieve that kind of success, you need a higher EQ.

My point being that I imagine it takes those same qualities to be an effective leader.

Anyway, yeah, this is kind of rambling, but the bottom line is: don't look to me to set examples, unless you enjoy drinking at bars.

Again, this was the Before Time, so the difference now is I'd end that sentence with "home."
April 13, 2024 at 2:25am
#1068648
Posting early today because it'll be a busy and exhausting day of visiting local wineries with friends.



For the sake of context, this 2013 article from Collectors Weekly came out during the time when same-sex marriage was still being debated in the US; it wouldn't be until 2015 that Obergefell v. Hodges settled the matter once and for all. Hey... stop laughing; I said once and for all.

I mention this only because, at the time, "traditional marriage" was a buzzword, and a dog whistle for "white Christian man marries white Christian woman; together they go on to produce 2.1 children, and the man makes all the decisions in the family." And even that is not what was, historically, "traditional marriage."

In reality, it’s the idea of marrying for love that’s untraditional.

This is not even getting into different cultures' traditions.

For most of recorded human history, marriage was an arrangement designed to maximize financial stability.

Much as I'd like to agree with that, it still seems Eurocentric. But, honestly, I don't know enough about other cultures to know all the nuances involved.

By the Middle Ages, gender inequality was not only enshrined in social customs, but also common law. In most European countries, married women were forced to give up control over any personal wealth and property rights to their husbands. Eventually, the system became known as “coverture” (taken from “couverture,” which literally means “coverage” in French), whereby married couples became a single legal entity in which the husband had all power.

One of the more common arguments against same-sex marriage back then was "What's next, you can marry your dog?" To me, that argument told me everything I needed to know about how the person making it would treat women. It's like the idea of "two willing adults wanting to enter into a mutually beneficial agreement" completely escaped them. Once you get to the "two willing adults" hurdle, it's not even a little bit of a stretch to consider that those adults can be any sex and/or gender.

I'd personally be perfectly content to extend that to more than two (for other people, not for me), but that's a fight for another time.

Under such laws, children were generally viewed as assets, in part because they were expected to work for the family business.

Another change: nowadays, they're liabilities. Or, at the very least, it's an emotional bond more than a business arrangement.

Despite the church’s staunch position on monogamy, in the late Middle Ages, a legal marriage was quite easy to obtain. However, as more couples attempted to elope or marry without consent, the old guard upped its game. To combat the spread of “clandestine” marriages, or those unapproved by parents, state officials began wresting the legal process of marriage from the church.

In my view, that sowed the seed that became part of the same-sex marriage debate, at least in the US. Religious people get married twice in the same ceremony: once sealing their bond in the eyes of their religious group, and once making it official to various government agencies, not least of which is the IRS. Much breath was wasted with people talking past each other, not understanding that one person meant the religious union while the other meant the civil one.

As this philosophical support for individual choice spread, more young people wanted some say regarding their future spouses. “Demands for consent from the people actually getting married were thought to be quite radical,” says Abbott. Even more radical was the idea that marriage might be entered into for emotional, rather than financial, reasons.

It's also apparently radical that a marriage be considered a partnership between equals.

In fact, for thousands of years, love was mostly seen as a hindrance to marriage, something that would inevitably cause problems. “Most societies have had romantic love, this combination of sexual passion, infatuation, and the romanticization of the partner,” says Coontz. “But very often, those things were seen as inappropriate when attached to marriage. The southern French aristocracy believed that true romantic love was only possible in an adulterous relationship, because marriage was a political, economic, and mercenary event. True love could only exist without it.”

In other words, they knew that love dies, but money is forever.

Anyway, the article goes on for a while, but, as it covers centuries of history, it seems to be a decent summary. It also emphasizes how laws are often slow to catch up to social realities. And yes, there's a nod to the then-current marriage debate.

The laws, if not the attitudes of certain kinds of people, have moved on since then, and we've shifted our focus as a society to trans issues, when we're not contemplating our looming climate apocalypse.

And no, the two have nothing to do with each other. But the shift in attitudes about marriage and the climate problem both seem to have their roots deep in the Industrial Revolution.
April 12, 2024 at 11:00am
#1068587
Gonna have to contradict folk wisdom again: there really is such a thing as coincidence. Cracked has a few examples here:



...except "same exact" is, for most of these, a stretch.

In March 1951, a new comic called Dennis the Menace debuted in the U.S. That very week, a different comic about a different mischievous boy debuted in Britain, and it was also called Dennis the Menace. Both became hugely popular, and neither adapted the other, and neither ripped off the other.

It would be weirder if coincidences never happened. Other things invented in disparate places simultaneously include calculus and the theory of evolution. Those are less coincidence, though, and more about the background having been laid out, setting the stage for ideas that were ready to be invoked.

6. The Tale of Hershey’s and Hershey’s

These two brands didn’t, say, form on different continents with the same name, like how there was one restaurant in Australia called Burger King and was unrelated to the famous burger chain. The two Hershey companies both formed in Pennsylvania, in the same county, within a decade of each other, by unconnected men named Hershey.

Trademark law is complicated and way outside my expertise, but from what I understand, it gets even more complicated when actual peoples' names are involved.

Milton Hershey created the Hershey Chocolate Company in Lancaster, Pennsylvania, in 1884. Jacob Hershey and his four brothers created the Hershey Creamery Company in Lancaster, Pennsylvania, in 1894. The Hershey brothers were not related to Milton Hershey, and they didn’t form a creamery to piggyback off Hershey chocolate. They formed a creamery because they were a family of farmers.

Still, ten years apart, same county? You'd think even back then some lawyer would have advised one of them to change their name.

5. The True Name of Dogs

We’re playing around with exactly how the conversation went down, but the basics are true: The host kept saying “dog,” and Dixon assumed he’d misunderstood the question. Against all likelihood, “dog” really was the Mbabaram word for dogs, though the language shares no other roots with English or with any of the many languages related to English.

There exist other linguistic coincidences, though they escape me at the moment and I can't be arsed to look for them. But given the number of extinct and extant languages in the world, combined with the relatively limited number of sounds a human can make, it would surprise me more if there were no such coincidences.
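To put a rough number on that intuition, here's a back-of-the-envelope sketch. The phoneme and language counts are deliberately crude assumptions, but even so, a "dog"-style collision comes out as close to inevitable.

    # Crude toy model, made-up inventory sizes: treat a word for "dog" as one
    # consonant-vowel-consonant syllable and ask how likely it is that at
    # least one of the world's other languages lands on that same syllable
    # for the same meaning purely by chance.

    consonants, vowels = 20, 5
    forms = consonants * vowels * consonants   # 2,000 possible CVC syllables
    languages = 7_000                          # rough count of living languages

    p_no_match = (1 - 1 / forms) ** languages
    print(f"Chance of at least one chance match: {1 - p_no_match:.0%}")   # ~97%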

4. We’re Stuck With Two Different Calories

We measure food using what’s called the large calorie, the dietary calorie or (most confusing of all) Calorie with a capital C. The other type of calorie is the small calorie, or calorie with a lowercase c.

And yet, people act like calories are an ingredient in food, not a measure of energy.

Obviously, we should not use the same word to describe these two different measures, and for a while, people tried calling Calories “kilocalories.” That made far too much sense, so it never caught on.

Pretty sure I've seen kCal in European food labeling, but I could be wrong. In any case, again, this is less coincidence and more sharing a common origin, unlike the dog example above.

Also, it's not that hard to determine which one is meant from context. If it's about stuff you're going to put into your gaping maw, it's Calories. If it's not, it's calories.
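For the record, the relationship between the two units is just a factor of a thousand (plus the conversion to joules, if you want SI). A quick sketch, using a roughly 550-Calorie Big Mac as the example:

    # Calorie bookkeeping: 1 food Calorie (kcal) = 1,000 small calories,
    # and 1 small calorie = 4.184 joules.

    SMALL_CAL_PER_CALORIE = 1_000
    JOULES_PER_SMALL_CAL = 4.184

    def food_calories_to_joules(big_calories):
        """Convert food-label Calories (kcal) to joules."""
        return big_calories * SMALL_CAL_PER_CALORIE * JOULES_PER_SMALL_CAL

    print(round(food_calories_to_joules(550)))   # 2301200 J, about 2.3 megajoules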

3. Nacho, and His Nacho Formula

Nachos were invented by Ignacio Anaya. “Nacho” was his nickname and has traditionally been the nickname of various people named Ignacio. You know that already if you have any friends named Ignacio, if you saw that Jack Black luchador movie or if you watched Better Call Saul.

Okay, gotta admit I didn't know that. I like Jack Black, but haven't seen that movie. Nor have I seen the referenced TV show.

In this example, the coincidence isn't, as one might expect, two dudes named Ignacio coincidentally inventing nachos at the same time. No, it's way cooler than that, at least if you know the slightest bit about chemistry:

The emulsification agent used in nacho cheese today is sodium citrate. Its chemical formula is Na3C6H5O7. NaCHO. It was destiny.

No, it's coincidence. Stay focused, okay?

2. We Keep Lolling Over Lol

Lol is Dutch for “laugh,” or “fun” or “joke.” This probably derives from an earlier word lallen, which referred to drunken slurring.

It shouldn't be surprising that English and Dutch, which (unlike the languages in the dog example above) are related, share similar words. Still, the similarity here does seem to be a coincidence: LOL is an English acronym that just happens to mean roughly the same thing as what this article claims is a Dutch word for laughter.

But, again, it's not that farfetched.

1. Everything Ends Up Crabs

No, everything really doesn't. Crabs are no more an inevitable end product of evolution than humans or bees. Some scientists a while back pointed out that different evolutionary lineages produced crablike forms, and the popular media ran with that and exaggerated it to "everything ends up crabs."

I can deal with some hyperbole on a comedy site, though.

You see, we have a lot of different types of crab, and they didn’t all form from one crab diverging into different species as evolution progressed. They formed because a bunch of different species all independently evolved into the same basic animal: the crab. Biologists call this process crabification.

Coincidentally (heh), "crabification" is what I call what people do at certain all-you-can-eat seafood buffets.

Often, when different species evolve in some convergent way, we can point to what’s desirable about that trait, which makes everyone naturally acquire it. When a bat and a dolphin each evolve methods to navigate through chirping, that’s weird, but we can easily say why multiple species would evolve echolocation: because it helps them get around. With crabs, we can only speculate on why creatures keep evolving these flat bodies and tiny tails instead of, say, all turning into lobsters.

Fair enough, but it doesn't rise to the level of coincidence if there's something in the marine environment that makes it useful to have a hard shell and pincers with a flattish body plan, even if we can't quite point to what that something is.

This would be like saying "what a coincidence that fish and marine mammals both have fins." Except we know that fins are useful appendages for underwater locomotion, and that underwater locomotion is helpful for underwater survival.

So, honestly, I mostly kept this article in the queue because of the sodium citrate nacho thing. But it's all interesting, as far as I'm concerned.

Because sometimes, there really is such a thing as coincidence.
April 11, 2024 at 10:38am
#1068505
By coincidence, this article from BBC, which I've had saved for a while, is almost exactly three years old.



Well, it can't be entirely lost, as we know about it.

Pity the event planners tasked with managing Cahokia's wildest parties. A thousand years ago, the Mississippian settlement – on a site near the modern US city of St Louis, Missouri – was renowned for bashes that went on for days.

Well, at least they didn't have the internet and its trolling, grifting denizens to deal with.

Throngs jostled for space on massive plazas. Buzzy, caffeinated drinks passed from hand to hand. Crowds shouted bets as athletes hurled spears and stones. And Cahokians feasted with abandon: burrowing into their ancient waste pits, archaeologists have counted 2,000 deer carcasses from a single, blowout event.

Pretty sure most of that still happens when the Cardinals win the World Series.

A cosmopolitan whir of language, art and spiritual ferment, Cahokia's population may have swelled to 30,000 people at its 1050 BCE peak, making it larger, at the time, than Paris.

Math aficionados might note that this would have been about 3,000 years ago, not "a thousand" like in the sentence I quoted up there. This is because, apparently, the BBC made a (gasp!) mistake here, at least according to this Wikipedia entry, which appears to be thoroughly sourced. The Beeb should have written 1050 CE, not BCE.
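For anyone who wants to check the arithmetic themselves (remembering there's no year zero in the BCE/CE convention), here's a two-liner, using 2024 as "now":

    # Years elapsed from a given year to 2024; BCE dates need the no-year-zero offset.
    def years_ago(year, era, now=2024):
        return now - year if era == "CE" else now + year - 1

    print(years_ago(1050, "CE"))    # 974  -> "a thousand years ago," as the article says
    print(years_ago(1050, "BCE"))   # 3073 -> about 3,000 years, which contradicts it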

I point this out only to show that everyone makes mistakes. I don't think that means we can't trust the rest of the article.

It's what Cahokia didn't have that's startling, writes Annalee Newitz in their recent book Four Lost Cities: A Secret History of the Urban Age. The massive city lacked a permanent marketplace, confounding old assumptions that trade is the organising principle behind all urbanisation.

Oh, look, a book promotion. Startling.

"Cahokia was really a cultural centre rather than a trade centre. It still boggles my mind. I keep wondering 'Where were they trading? Who was making money?'," Newitz said. "The answer is they weren't. That wasn't why they built the space."

Archaeology, or really any scientific discipline, is subject to human cultural biases. One reason for science's existence is to minimize, at least over time, the effects of bias. It helps to have people from diverse cultures studying things. We in the English-speaking world tend to assume everything's always been about money and trade, but this is not necessarily the case. (In fairness, it appears that math and written language in Eurasia both began as means to record business transactions, but even that appearance may be the result of some bias.)

When excavating cities in Mesopotamia, researchers found evidence that trade was the organising principle behind their development, then turned the same lens on ancient cities across the globe. "People thought that this must be the basis for all early cities. It's led to generations of looking for that kind of thing everywhere," Pauketat said.

Like I said.

They didn't find it in Cahokia, which Pauketat believes may instead have been conceived as a place to bridge the worlds of the living and the dead.

Yeah, let's not immediately leap to that conclusion, either. "May... have" doesn't fill me with great confidence. Nevertheless, it indeed seems to fit as a hypothesis.

Eventually, Cahokians simply chose to leave their city behind, seemingly impelled by a mix of environmental and human factors such as a changing climate that crippled agriculture, roiling violence or disastrous flooding. By 1400, the plazas and mounds lay quiet.

Great, more ammunition for the anti-climate-change crowd to seize on. "See? Climate always changes." I mean, sure, it does, but usually on a much longer timescale than what we're experiencing now. Plus, recall that this was in the Mississippi flood plain, and that river has been known to shift, even during a single human lifespan.

But it's more than that. Cahokians loved to kick back over good barbecue and sporting events, a combination that, Newitz noted, is conspicuously familiar to nearly all modern-day Americans. "We party that way all across the United States," they said. "They fit right into American history."

The only thing missing is beer. And I'm not going to wade into the cultural quagmire on that subject.

