About This Author
Come closer.
Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
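
To make that last point concrete, here is a quick sketch of my own (it is not part of the bio above, and every name in it is mine): the classic "escape-time" test behind the Mandelbrot set, which repeatedly applies the very simple transformation z -> z*z + c to points c in the complex plane and asks whether the result stays bounded. The boundary between "stays bounded" and "escapes" is the fractal.

# A minimal, hypothetical Python sketch -- nobody's official code.
# It crudely renders the Mandelbrot set as text: '#' marks points c whose
# iterates of z -> z*z + c never escape the circle |z| > 2.

def escapes(c: complex, max_iter: int = 50) -> bool:
    """Return True if iterating z -> z*z + c from z = 0 ever exceeds |z| > 2."""
    z = 0 + 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return True
    return False

for im in range(12, -13, -1):            # imaginary part b, from 1.2 down to -1.2
    row = ""
    for re in range(-40, 21):            # real part a, from -2.0 to 1.0
        c = complex(re / 20, im / 10)    # the complex number a + bi
        row += " " if escapes(c) else "#"
    print(row)

Run it and the familiar bug-shaped silhouette appears, built from nothing more than repeated multiplication and addition of complex numbers.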




Merit Badge in Quill Award: Congratulations on winning Best Blog in the 2021 edition of [Link To Item #quills]!
Merit Badge in Quill Award: Congratulations on winning the 2019 Quill Award for Best Blog for [Link To Item #1196512]. This award is proudly sponsored by the blogging consortium including [Link To Item #30dbc], [Link To Item #blogcity], [Link To Item #bcof] and [Link To Item #1953629]. For more information, see [Link To Item #quills].
Merit Badge in Quill Award: Congratulations on winning the 2020 Quill Award for Best Blog for [Link To Item #1196512]. This award is sponsored by the blogging consortium including [Link To Item #30dbc], [Link To Item #blogcity], [Link To Item #bcof] and [Link To Item #1953629]. For more information, see [Link To Item #quills].
Merit Badge in Quill Award 2: 2022 Quill Award - Best Blog - [Link To Item #1196512]. Congratulations!!!
Merit Badge in Quill Award 2: Congratulations! 2022 Quill Award Winner - Best in Genre: Opinion [Link To Item #1196512]
Merit Badge in Quill Award 2: Congratulations!! 2023 Quill Award Winner - Best in Genre - Opinion [Link To Item #1196512]
Merit Badge in 30DBC Winner: Congratulations on winning the Jan. 2019 [Link To Item #30dbc]!!
Merit Badge in 30DBC Winner: Congratulations on taking First Place in the May 2019 edition of the [Link To Item #30DBC]! Thanks for entertaining us all month long!
Merit Badge in 30DBC Winner: Congratulations on winning the September 2019 round of the [Link To Item #30dbc]!!
Merit Badge in 30DBC Winner: Congratulations on winning the September 2020 round of the [Link To Item #30dbc]!! Fine job!
Merit Badge in 30DBC Winner: Congrats on winning 1st Place in the January 2021 [Link To Item #30dbc]!! Well done!
Merit Badge in 30DBC Winner: Congratulations on winning the May 2021 [Link To Item #30DBC]!! Well done!
Merit Badge in 30DBC Winner: Congrats on winning the November 2021 [Link To Item #30dbc]!! Great job!
Merit Badge in Blogging: Congratulations on winning an honorable mention for Best Blog at the 2018 Quill Awards for [Link To Item #1196512]. This award was sponsored by the blogging consortium including [Link To Item #30dbc], [Link To Item #blogcity], [Link To Item #bcof] and [Link To Item #1953629]. For more details, see [Link To Item #quills].
Merit Badge in Blogging: Congratulations on your Second Place win in the January 2020 Round of the [Link To Item #30dbc]! Blog On!
Merit Badge in Blogging: Congratulations on your second place win in the May 2020 Official Round of the [Link To Item #30dbc]! Blog on!
Merit Badge in Blogging: Congratulations on your second place win in the July 2020 [Link To Item #30dbc]!
Merit Badge in Blogging: Congratulations on your Second Place win in the Official November 2020 round of the [Link To Item #30dbc]!
Merit Badge in Highly Recommended: I highly recommend your blog.
Merit Badge in Opinion: For diving into the prompts for Journalistic Intentions - thanks for joining the fun!
Merit Badge in High Five: For your inventive entries in [Link To Item #2213121]! Thanks for the great read!
Merit Badge in Enlightening: For winning 3rd Place in [Link To Item #2213121]. Congratulations!
Merit Badge in Quarks Bar: For your awesome Klingon Bloodwine recipe from [Link to Book Entry #1016079] that deserves to be on the topmost shelf at Quark's.



May 31, 2023 at 9:22am
#1050318
Hey look, we made it to the end of another month. Apropos of nothing:

    It’s Time to Ban “Right Turn on Red”
The dangerous maneuver is allowed thanks to a flawed idea about emissions from the 1970s. We don’t need it.


Last time I checked, RToR is the law of the land in the US, everywhere except New York City. That is, in not-NYC, a right on red after stopping is not only permitted but encouraged, unless there is a sign prohibiting it. In NYC, turning right at a red light is illegal, unless there's a sign permitting it.

There are no signs explaining this as you drive into NYC, or out of it. None. It's just something you have to know, and of course NYPD doesn't give a shit and you'll generate some revenue for the city if they catch you.

Even knowing the law, I've made the mistake in that city. No cops around, so no consequences. I was just so used to doing it that I forgot I was in NYC (not hard to do in most areas of the city that aren't Manhattan).

This article talks about the potential ban of the practice in DC. There, it makes more sense to ban it. Driving in DC is way worse than driving in NYC, and what's the point of defaulting to RToR, if something like 75% of your intersections are signed to forbid it? But, in that case, there's only a river along one edge; for most of the city's boundary, you may not know when you're in DC as opposed to Maryland.

Article is from 2022, and honestly, I can't be arsed to check to see if that ban has been implemented, because I only drive in DC under duress.

Regardless, I have a few minor issues with the article.

It’s an obsolete relic of the 1970s oil crisis.

"Obsolete" is an editorial value judgement.

It’s dangerous to pedestrians.

Read further, and you'll see that this statement isn't well-supported by data.

And, if you drive a car in the United States, you likely do it every day.

Unless you live in NYC or, like me, don't drive every day.

It’s time to get rid of right-turn-on-red.

Opinion.

Nothing wrong with stating your opinion, of course. Source: my opinion. But usually, it's clearly labeled as "opinion," not in the "Environment" section and written by someone whose byline says "News writer."

But in the United States, drivers are generally permitted to turn right at a red light, if there’s a big enough gap in the traffic for them to squeeze into. In fact, you’re likely to get honked at if you don’t do it.

Or, as we are talking about the US, shot at.

I've been honked at for not turning right on red at intersections clearly and plainly labeled No Turn On Red.

Article states RToR isn't a thing in Europe (would be LToR in the UK), but doesn't mention Canada, which I know has RToR. Apparently, it's not a thing in Mexico, but people there do it anyway. I don't know; I've never been to Mexico.

Still, sometimes drivers fail to yield to pedestrians who have the right of way in the intersection. The data on right-turn-on-red crashes might be scarce, but the existing studies suggest that these types of collisions—while rare—frequently involve a pedestrian or cyclist.

Like I said, not well-supported by data. However, from a driving point of view, it makes sense that there would be a hazard. You're at a red light, about to turn right, and you're twisting your neck way to the left to see when the break in traffic comes. When it does, you might peel out into the intersection without really looking to your right—which is, of course, when a pedestrian is just stepping off the curb in front of you.

As for bicycles, some drivers make a game out of hitting them on purpose, anyway.

Last week...

Again, article is from October of last year.

...the Washington, DC, city council voted to ban right-turn-on-red (RTOR) at most city intersections (and to allow cyclists to treat stop signs as yield signs). If Mayor Muriel Bowser signs on and the bill receives Congressional approval, DC will become the second US city after New York not to allow RTOR. DC, which has struggled to curb traffic fatalities, hopes that ending RTOR will make its streets safer for cyclists, pedestrians, and wheelchair users.

As I noted above, a large number of intersections in DC (my 75% number was pulled out of my ass based on experience in that city) are already signed NToR. Banning RToR would likely remove these signs. This seems like a very good way to increase revenue by catching unwary drivers from VA, MD, or points further away, who are used to it being permitted-unless-signed. As with many traffic laws and enforcement policies, they talk about safety, but they're really about revenue generation.

So, why do US cities allow RTOR in the first place? Blame the oil crisis.

The article proceeds to delve more deeply into this idea.

By 1980, RTOR was the law of the land nationally, except in New York City.

I didn't get my driver's license until 1982, incidentally.

Take into account the growing number of hybrid and electric cars, and RTOR makes even less sense. Schultheiss says that electric cars are actually likely to increase the number of RTOR crashes “because their acceleration rates are dramatically quicker than gas powered vehicles.”

This ties in to my "not looking to the right" comment above.

Critics of the DC bill have pointed out the lack of data showing the dangers of RTOR, but many people who don’t use cars know instinctively how dangerous turning vehicles can be.

And yet, they step in front of right-turning stopped vehicles anyway. And even people who do drive can be inattentive as pedestrians.

I'm not trying to victim-blame, here. Sometimes, it can be the driver's responsibility even when it's the pedestrian's fault.

Anyway, you know what would significantly improve pedestrian safety while at the same time reducing idle times at intersections?

Traffic circles.

"But, Waltz, there are traffic circles all over DC; in fact, they're one of the main reasons driving in DC sucks."

True. But those roundabouts are terribly designed, relics of Pierre L'Enfant's original horse-and-buggy design for the city.

Now, it's been a long time since I've been in continental Europe, since before I could legally drive on streets. But one thing I noticed when I was in England, more recently, was the absolute reign of, not QEII (at the time), but roundabouts. I never drove there, but I rode with other people, and circles are pretty much everywhere. So they don't need LToR.

After that rant, you might think I have a strong opinion about RToR in the US. I do not. (I do have strong opinions about the benefits of roundabouts, though.) It doesn't matter to me whether it's allowed or not. What I do care about is that it be consistent, which, given the major exception of NYC, and the proliferation of intersections where it's not allowed and sometimes poorly signed, it never has been.
May 30, 2023 at 7:57am
#1050264
Sweet dreams are made of cheese...

    The Next Generation of American Cheese
The founders of New School cheese say they are making the first “quality” American cheese


When I visited England some years ago, it took me a while to convince my British friends that there was (still is) a vibrant craft beer community in the US; they thought we all drank Bud Light or whatever. Then they asked, "What about wine?"

I was like, "Oh, definitely. Wineries everywhere, some even as good as or better than the French." (That didn't take nearly as much convincing.)

But then came the question I was dreading: "And cheese?"

I hung my head in shame.

Yes, some small, local artisanal cheeses exist, but they're not nearly as widely distributed as the biggies are.

Wisconsin notwithstanding, most of the actual cheese produced in the US is a copy of various European styles, such as cheddar, brie, or Swiss. A few are uniquely American... sort of. Monterey Jack might be an exception, though it was developed in an area that was, at the time, controlled by Spain via Mexico (which is part of North America, but despite the existence of it and Canada, when people say American, they generally mean the US). And then there's Colby, from Wisconsin, but it's basically cheddar that's mild enough to be suitable for a bland Midwestern palate, which generally thinks mayonnaise is quite enough spice.

The other big cheese in America is American, which in its most common form is a factory food.

My point being that when it comes to "craft" (as opposed to Kraft) products, cheese has a long way to go in the US if it's going to catch up to wine and beer.

This article is, as the headline suggests, a step in that direction.

The joke that people love to hate — or hate to love — American cheese is outdated. We’re at the point where we can all admit it’s good.

No. No, we are not. Because I do not admit any such thing. At best, it's a pale (sometimes literally) shadow of cheddar.

You ever top a burger with fresh mozzarella and watch as it sits there and does everything but melt?

No, because the only cheese I admit on a burger, if I have any kind of choice, is cheddar or Swiss. Maybe provolone. If I don't have any kind of choice, I put up with American.

Ever try putting a slice of cheddar on ramen only to witness the puddles of fat sweat out from the clumpy solids?

No, because who the fuck puts cheddar on ramen?

American cheese is not a quality product.

Finally, something I can agree with.

In fact, its lack of quality is often the point, a grand embrace of the lowbrow and cheap that is the cornerstone of so much comfort food.

I don't believe in the concept of comfort food. Food is either delicious, or it's not. Sometimes somewhere in between. The closest I come to the concept is making easy meals, like sandwiches.

In fact, Kraft Singles, the standard for American cheese, cannot legally be called American cheese, or even “cheese food,” due to being made with milk protein concentrate and consisting of less than 51 percent actual cheese. (The company itself refers to the product as a “pasteurized prepared cheese product.”) Cheese may be the first ingredient, but the slices are mostly made of whey, skim milk, and various preservatives.

I've described Singles as a product whose only relationship with cheese is that the truck passed a dairy farm on the way out from the factory.

That may be an exaggeration, but not by much.

For decades, the options have been to either accept American cheese as it is, or instead eat better-quality artisanal cheeses that, while delicious, are never quite right.

You shut your disgusting mouth.

Last fall, Greenspan and his friend Alan Leavitt, who has worked at and invested in early-stage packaged consumer goods companies, launched New School, what they claim is the first attempt by anyone to make a “quality” American cheese.

Eh. Maybe. Depends on definition of quality. Pre-industrialization, it was some bastard variant or mixture of cheddar or Colby.

“It’s hard to imagine processed food being good. What’s good? Does that mean taste? Does it mean quality? Does it mean health?” asks Helen Veit, associate professor of history at Michigan State University, who specializes in food and nutrition of the 19th and 20th centuries. Today, most Americans’ blanket assumption about processed foods is that they are unhealthy. But during the advent of many food processing methods in the early 20th century, “processed” wasn’t really a bad word. A processed food could actually be made well. Or at least better than what exists now.

I've touched on this subject before. "Processed" is now a buzzword, as in "processed food is bad for you." But when you start thinking about what "processed" really means, there are all kinds of gray areas. It's the new "don't eat anything you can't pronounce," which as I've noted in the past, does nothing but encourage ignorance.

“The initial intent of this product was to improve shelf-life of cheese shipped to warmer climates,” Zey Ustunol wrote in a 2009 article for the Michigan Dairy Review. In Illinois, James Kraft was working on a similar product, and received patents for processed cheese in 1916 and 1921, the latter of which was for processed cheese packaged in a loaf form.

The whole purpose of cheese in the first place was to preserve milk in a time before refrigeration. This is, I think, analogous to the addition of hops to beer, which had the effect of preserving it for a longer period of time.

What’s more, a new generation raised on processed food turned a more critical eye toward the entire food industry. “The broad skepticism about the benefits of processing was really a product of the 1960s and ’70s...”

You might recognize this period as also being when the use of refrigeration in the US and other industrialized nations really exploded. It is true that many preservatives aren't ideal in a person's diet, and refrigeration liberated us from so much need for preservatives.

There's a lot more at the link, and despite my conviction that the author must have lost their taste buds due to COVID, there's some interesting stuff in there. Including the shocking revelation that artisanal cheese is going to cost more than "pasteurized prepared cheese product." Well... duh.

If nothing else, read the end of the article for a truly groan-worthy pun.
May 29, 2023 at 10:40am
#1050227
Today in "You can get what you want and still not be happy":

    The Unbearable Costs of Becoming a Writer
After years of hard work and low pay, the risks I took to work in publishing are finally paying off. But now, I wonder about the price my family paid, and whether it was too steep.


When I was younger, I was always told: "Be careful what you wish for; you might just get it." So I stopped wishing.

My parents didn’t understand my job.

No one's parents understand them.

There were stretches when I made so little money writing or editing that I couldn’t blame my parents for assuming they were hobbies.

Most people think it's easy and anyone can do it. And then they try, and they think they've done a good job because they get positive feedback from those who are afraid of crushing them with negative feedback. But in reality, they suck.

I'm aware I'm probably in that category, too. After all, I've never made money from writing (or editing), unless you count GPs for WDC newsletters. That's how we're measuring success now, right?

But, okay. I get it. Sometimes outsiders don't understand that it's work, whether you're getting paid for it or not. Often rewarding work, and requiring very little physical activity, but it's no less work than, say, website development.

I became an editor by volunteering for an Asian American magazine, a nonprofit mission-driven labor of love where no one drew a salary. Ten to fifteen hours of unpaid labor a week in exchange for the editorial experience I wanted was, to me, an acceptable trade—nearly all my labor then was unpaid.

And I get this. (Obviously; I do that sort of thing, myself.) But it's a symptom of a larger problem, which is that what creative-types do is largely undervalued. Hey, can your band do this gig? We can't pay you, but you'll get experience and exposure! Say, I like your art. Can you do a poster for me? I don't have a lot of money, but you can put your contact info on it.

Then one of my favorite indie websites hired me to edit on a part-time basis. The job started at thirteen dollars an hour, twenty hours a week, and after a couple of months I was brought on full-time and granted a salary in the mid-30s.

Don't mistake my above complaints for bitterness. Would it be nice to be paid for my writing? Sure. But if that did happen, I know it would come with a raft of requirements. Like, right now, I can write about anything I want to, say anything I want to, be honest about my opinions (or at least honestly lie about them). My only restrictions are self-imposed. But consider what might happen if, say, someone approached me to do a beer blog. I'd then be beholden to them, and two of my favorite hobbies (writing and drinking beer) would become, by definition, work, with deadlines and expectations.

Everything becomes promotion, then. Hell, even the article I'm highlighting today is basically an ad for the author's book. (I know I've said this before, but that never stops me, by itself, from featuring something; writing ad copy is also a writing skill, and this is a site about writing.)

Fortunately, I don't need the money. So I can pursue it as a hobby. (Still, free beer would be a nice perk.)

But there was also a reality I had to face: I was running out of time to help my family. My diabetic father’s health was in decline, and my parents were struggling to pay for his care and medications. My husband and I couldn’t pay for full-time childcare for both our kids, let alone provide the kind of support my parents needed. The little assistance I could offer wasn’t enough to make a dent, and as my father grew sicker, my guilt and anxiety intensified.

This is understandable, too, though I neatly sidestepped a lot of it by not having kids I knew I wouldn't be able to afford.

I continue to grapple with the instability of this industry and what kind of opportunities will be available to me in the years to come, as well as larger questions about whether my editorial work was valued.

Editorial work is, if anything, even less valued than writing. Don't believe me? Pick a few articles off the internet and see if they look like an editor saw them. Or, hell, this blog, which certainly doesn't have an editor. I do try to catch obvious typos and awkwardly-phrased passages, but I don't always succeed.

I think about who gets to be a writer or an editor, who can afford to wait for that livable salary or that higher advance. Who can choose to prioritize their creative goals, take potentially career-making risks, invest precious years in this work without the guarantee of financial stability.

And this is why I chose engineering way back when. Fortunately, for me, it wasn't an either-or situation; I'm equally bad at both tech and creative stuff.
May 28, 2023 at 9:35am
#1050188
It's Sunday, so it's time to unearth an ancient fossil of a blog entry. This one's from early 2020 and references a Guardian article from half a year prior: "Poor You".

The article I referenced is still there, so context is easy to find.

The article, and my commentary, are pretty much just as relevant now as they were three years ago (though one might be tempted to wonder how the somewhat sheltered author navigated the pandemic): wages aren't growing much, prices are, and people are still people.

So, a few of my comments that might warrant additional explanation:

1) I'm only giving her a pass on the "gluten-free cookbook author" gig because she is apparently diagnosed with celiac.

What I meant by this is that, for a while, "gluten-free" was a fad, and I detest fads. Nevertheless, there are people for whom it's not a choice, but a necessity, and if the fad gave them more choices for a reasonably healthy lifestyle, great.

2) $15 an hour? Stocking groceries? I'm not saying that's a lot of money, but it's above average for unskilled workers.

I have no idea if that's still true in 2023, but for the time, I believe that to be correct.

And yeah, being broke, or close to it, damages one's mind. The crappy cheap food you have to buy if you're poor doesn't help with that.

This probably wasn't the best way I could have phrased that. I wasn't trying to imply that poor people are brain-damaged. I meant it as an indictment of the system that forces some people to be poor, not to rag on the victims of that system.

Tying health insurance to employment is good for employers. Not great for people who'd rather spend their time running a small business.

By "good for employers," I meant that the incentive of keeping one's health insurance is another trick employers can use to trap you in a dead-end and/or low-paying job. The article's author, for example, apparently took the job mostly because it offered health insurance, rare for a part-time gig. People will put up with a lot of bullshit if the alternative is worse. This might have changed a bit in the last three years, judging by all the shitty employers whining "no one wants to work anymore." (In reality, no one wants to put up with your bullshit anymore.)

Now, workers have a bit more power in negotiation. I hope that continues, but it probably won't.

Veganism is a product of privilege.

I stand by this assertion. But I'm not presuming to tell people what to eat. You do you. My objections only start when it becomes like a religion, with self-righteous adherents and attempts at conversion.

I'm aware of the contradiction inherent in my attempt to convert proselytizers to the practice of minding their own business, but I can live with that.

In closing, sometimes "poor" is a matter of one's own perspective. One doesn't have to have a lot of money or a high salary to be privileged. Reading between the lines, this author had, at minimum, a place to sleep, an internet connection, and something to write and post with (laptop, smartphone, something). And some sort of support system. Not to mention the wherewithal to pursue her preferred diet, which requires a good bit of thought, energy, and time. This may have been a step down from her previous experience, but it's not rock-bottom, even if it seemed so to her. Truly poor people are rarely able to be picky about what they eat (which is why I call veganism a product of privilege).

Life has its ups and downs, and I sincerely hope she's doing better. And understands that she could be doing worse.
May 27, 2023 at 11:37am
#1050145
Writing—the concept, not this website—has been around for more than five thousand years. It's evolved somewhat in that time.



It's widely known, thanks to movies and TV shows, that some of Marvel's characters are its take on Norse gods. This article delves a bit deeper into the past than the Asgardian characters.

Ancient Mesopotamia, the region roughly encompassing modern-day Iraq, Kuwait and parts of Syria, Iran and Turkey, gave us what we could consider some of the earliest known literary “superheroes”.

It also gave us beer. Coincidence? I don't think so.

One was the hero Lugalbanda, whose kindness to animals resulted in the gift of super speed, perhaps making him the literary great-grandparent of the comic hero The Flash.

I want to reiterate here that I interpret the words "literary" and "literature" in their broadest sense; hence, comic books are literature, while movies and TV are not. The difference, to me, is whether one reads the words, or listens to actors or storytellers speaking them (even if you tend to watch shows with subtitles, as I do). This isn't an attempt at a prescriptive definition, but the point of view I take when discussing these things.

Many ancient stories probably started out in an oral tradition, and only became literature when they were written down.

But unlike the classical heroes (Theseus, Herakles, and Egyptian deities such as Horus), which have continued to be important cultural symbols in modern pop culture, Mesopotamian deities have largely fallen into obscurity.

Except, if you're me, the ones involved with beer. And D&D.

An exception to this is the representation of Mesopotamian culture in science fiction, fantasy, and especially comics. Marvel and DC comics have added Mesopotamian deities, such as Inanna, goddess of love, Netherworld deities Nergal and Ereshkigal, and Gilgamesh, the heroic king of the city of Uruk.

Many writers delve into the past for inspiration; comics writers are no exception. Like when someone retells a Shakespeare play, as with West Side Story.

The Marvel comic book hero of Gilgamesh was created by Jack Kirby, although the character has been employed by numerous authors, notably Roy Thomas. Gilgamesh the superhero is a member of the Avengers, Marvel comics’ fictional team of superheroes now the subject of a major movie franchise, including Captain America, Thor, and the Hulk. His character has a close connection with Captain America, who assists Gilgamesh in numerous battles.

And no, he hasn't shown up in the MCU yet. Unless I've missed an Easter egg somewhere, which is always possible.

Gilgamesh’s first appearance as an Avenger was in 1989 in the comic series Avengers 1, issue #300, Inferno Squared. In the comic, Gilgamesh is known, rather aptly, as the “Forgotten One”. The “forgetting” of Gilgamesh the hero is also referenced in his first appearance in Marvel comics in 1976, where the character Sprite remarks that the hero “lives like an ancient myth, no longer remembered”.

That's recent, as comic book characters go. The genre is generally considered to begin with Superman in 1938.

Unlike many ancient tales and myths, I knew about Gilgamesh before the comics version. One should never assume that a comics version is a reliable adaptation of ancient stories.

Story-telling has been recognised since ancient times as a powerful tool for imparting wisdom. Myths teach empathy and the ability to consider problems from different perspectives.

I would argue that this is a primary purpose of storytelling, ancient and modern. Not limited to myths.

A recent study has shown that packaging stories in comics makes them more memorable, a finding with particular significance for preserving Mesopotamia’s cultural heritage.

While a single unreferenced study is no grounds for drawing conclusions, this tracks for me.

The myth literacy of science fiction and fantasy audiences allows for the representation in these works of more obscure ancient figures. Marvel comics see virtually the entire pantheons of Greece, Rome, and Asgard represented. But beyond these more familiar ancient worlds, Marvel has also featured deities of the Mayan, Hawaiian, Celtic religions, and Australian Aboriginal divinities, and many others.

That's because those of us who read science fiction and fantasy are, in general, more knowledgeable and intelligent (not to mention better looking) than those who don't.

Again, though, be wary of taking modern interpretations of these figures, whether in comics or RPGs, as definitive.

In the comic multiverse, an appreciation of storytelling bridges a cultural gap of 4,000 years, making old stories new again, and hopefully preserving them for the future.

Which leads me to my main purpose in featuring this article: the idea of storytelling itself.

Before writing, stories were passed down orally (though possibly with props, like an early form of theatre), and they would have changed with changing technologies and societies. Once you write something down, though, it's preserved, like a mosquito in amber. It becomes harder to interpret after, for example, war wipes out a neighbor that was mentioned in the story, or someone invents the axle or whatever, making the story less relevant. Just as someone today might view a movie from the 70s and note that a particular predicament might have been easily solved had mobile phones been a thing then.

So, in my view, these adaptations are necessary for preservation, breathing life into old tales.

Thus, we exist in a time when we can have the best of both worlds: the original, preserved; and the reinterpretation.

It could be—and has been—argued that today's comic book superheroes are the modern take on ancient mythology in general, what with their focus on exceptional abilities and superpowers. You get that a bit in other media, but nowhere is it more obvious than in the realm of graphic storytelling.
May 26, 2023 at 11:34am
#1050116
Feeling much better today, thanks.



This article, from NPR, is dated April 30 of this year, the actual anniversary in question.

"Imagine being able to communicate at-will with 10 million people all over the world," NPR's Neal Conan said...

9,999,990 of whom are idiots, trolls, bots, or some combination thereof.

..."Imagine having direct access to catalogs of hundreds of libraries as well as the most up-to-date news, business and weather reports...

Mostly fake, and/or stuck behind paywalls.

...Imagine being able to get medical advice or gardening advice immediately from any number of experts...

And having it turn out to be (sometimes dangerously) wrong.

"This is not a dream," he continued. "It's internet."

And it was fun for a couple of years.

On April 30, 1993, something called the World Wide Web launched into the public domain.

The web made it simple for anyone to navigate the internet. All users had to do was launch a new program called a "browser," type in a URL and hit return.

And then troubleshoot the URL typed in, because most people can't spell, and were hopelessly confused by the // thing.

This began the internet's transformation into the vibrant online canvas we use today.

Jackson Pollock's paintings can also be described as vibrant canvases. And any attempts to make sense of those are doomed to failure, or at least false success.

Anyone could build their own "web site" with pictures, video and sound.

And malware.

They could even send visitors to other sites using hyperlinked words or phrases underlined in blue.

Such as this one.

Okay. I'm being deliberately cynical. There's a lot to like about the internet, not the least of which is this website. But the web certainly has its downsides... just like everything else. Later, the article goes into some of these downsides.

CERN owned Berners-Lee's invention, and the lab had the option to license out the World Wide Web for profit. But Berners-Lee believed that keeping the web as open as possible would help it grow.

Well... he wasn't wrong. Fortunately for all of us, CERN was—and is—an international government science research organization, not a private company like ExxonMobil, AT&T, or Apple.

Today, nearly two-thirds of the world's population uses the web to visit hundreds of millions of active websites. Some of those pages belong to companies that are among the most valuable in history like Facebook, Amazon and Google.

None of which would exist without the internet as we know it.

With all of its problems, it's still one of the most influential inventions in history, if not the most. It's difficult to put an exact date on "this is when the internet was born," because there were all sorts of remote networks available before 1993, with many different individual inventors (none of whom were Al Gore), and of course the internet continues to evolve.

But the beginning of the World Wide Web is as good a milestone as any to commemorate.
May 25, 2023 at 8:45am
#1050071
Well, I had some dental work done yesterday. This resulted in far more pain than I'd expected (I mean, sure, I expected some; it's a dental procedure). OTC pain relievers do nothing. So I have, basically, two options:

1) Be in pain, leaving me unable to sleep or concentrate; or

2) Take the good drugs the dentist prescribed, leaving me unable to sleep or concentrate (but at least not in pain), and running the risk of addiction.

Yeah, that's right: opioids keep me from getting decent sleep. That alone is probably enough to keep me from developing a habit; I'd rather sleep.

Point is, I won't be doing my usual blogging today, or, really, much of anything except staring at streaming video. Hell, I won't even be able to drink (adverse drug interactions).

Of course, this will pass, and it's really not a big deal. Perhaps I'll feel better tomorrow.
May 24, 2023 at 7:52am
#1050033
In which The New Yorker discovers science fiction.

What a Sixty-Five-Year-Old Book Teaches Us About A.I.
Rereading an oddly resonant—and prescient—consideration of how computation affects learning.


Pick a random selection of science fiction from the past—I don't mean space opera, which is fine in its own right, but actual attempts to write about the intersection of society and advancing technology—and you'll find that some of them are, in hindsight, "oddly... prescient," while the vast majority are like "well, that never happened."

Neural networks have become shockingly good at generating natural-sounding text, on almost any subject. If I were a student, I’d be thrilled—let a chatbot write that five-page paper on Hamlet’s indecision!—but if I were a teacher I’d have mixed feelings.

If I were a teacher, I'd be like, "hey [chatbot], write a syllabus for a sophomore English class aimed at mid-range students."

On the one hand, the quality of student essays is about to go through the roof.

Yeah, except, well, see JayNaNoOhNo's rant about that sort of thing, here: "AI Detectors are Horseshit".

Luckily for us, thoughtful people long ago anticipated the rise of artificial intelligence and wrestled with some of the thornier issues.

Substitute pretty much any technological advancement for "artificial intelligence," and "thoughtful people long ago" anticipated it and thought about some of the possible consequences. That's basically the definition of science fiction.

But I don't expect anyone at TNY to understand that.

Their book—the third in what was eventually a fifteen-part series—is “Danny Dunn and the Homework Machine.”

Just including this bit so we all can see what they're talking about.

There follow several paragraphs of meanderings about the setup, plot, and characterization, before getting to the stuff that is really relevant to the topic. Typical TNY. Still, it's worth glancing over so you know what he's talking about later.

I hesitate to give away too much of the plot, but (spoiler alert!) two mean boys in their class, one of whom is jealous of Irene’s interest in Danny, watch them through a window and tattle to Miss Arnold.

Oh, no, wouldn't want to spoil a plotline from 65 years ago. (Spoiler alert: the author pretty much gives away too much of the plot.)

Incidentally, Rosebud was the sled.

He points out that Danny, in order to program Minny to do his homework, had to do the equivalent of even more homework, much of it quite advanced. (“Gosh, it—it somehow doesn’t seem fair,” Danny says.)

I must have read these books as a kid. I have a vague memory of them, anyway. But it may explain why, whenever I have to do anything more than once, I search for a way to automate it, and often spend more time crafting an Excel spreadsheet or some code than I would have spent on the projects.

“Danny Dunn and the Homework Machine” is ostensibly about computers, but it also makes an argument about homework.

And yet, there is still homework. I'm of the considered opinion that grade-school level homework has the primary purpose of making the kids leave their parents alone for a few precious minutes in the evenings.

The article spends an inordinate (to me) amount of time arguing about homework in general and not on the ethical implications of AI, but the main point remains: people have been discussing this sort of thing since long before it was technologically feasible.

Just like with the rest of science fiction.
May 23, 2023 at 10:06am
#1049998
I've ragged on this sort of thing before, but it's been a while.

    Why Authenticity Doesn’t Exist When It Comes To Food
Plus Ronnie Woo Shares a Recipe for Caramelized Hong-Kong Inspired Egg Tart


In my opinion, forcing "authenticity" on food leads to stagnation. It's a lot more interesting to mix things up a bit.

The debate over authenticity in food really comes down to how you define the word “authentic.” The word is often used to describe something that’s either fake or genuine, such as a brand name handbag or a pair of shoes, but in the case of food it doesn’t really apply (unless it’s plastic).

Most things come down to a matter of definition. As I said recently, I consider a hot dog to be a kind of taco. That depends on how you define (and serve) the hot dog, or a taco.

If every time we saw the words “authentic food” and replaced it with the word “traditional,” the sentence itself would probably be much less controversial. But even thinking of “traditional food” doesn’t maintain the intended meaning. I can guarantee that every time a recipe has been passed down to the next generation, changes were made.

Some of that is a search for novelty, but sometimes the changes are because of shifting availability of ingredients or cooking/prep methods.

Authenticity is simply a buzzword that some people have adopted as a way to declare that they are the real food-lovers and are somehow better than you based on what they perceive to be “real.”

And that's my main problem with it, I think: it's another form of gatekeeping.

Now, I know that, in the past, I've declared New York style pizza to be the One True Pizza, and American adjunct light beer to be not-beer. That's inconsistent with what I just said. It happens. I'm almost as loaded with contradictions as I am with pizza and beer. I am large; I contain multitudes.

But here's the important part, a rare instance of me completely agreeing with an article author:

I could care less if something is authentic, or even traditional for that matter – I just care that it’s delicious.

Well. Except that it should have been "I couldn't care less." Do English right! (In fairness, I don't know if English is Woo's first language or not, and here I am gatekeeping again.)

The beauty of having the privilege of eating food for pleasure is that we all are, and should be, allowed to mix and match whatever we want with reckless abandon. To me, the kitchen has no rules.

Yeah, no. The kitchen has plenty of rules. "Don't touch a hot stove." "Keep your knives sharp." "Always preheat the oven."

But now I'm being needlessly pedantic. I know what the author actually means: if you want to create a lo mein burrito, go for it.

Funny enough, there are numerous times where I actually like my interpretation of a traditional recipe more simply because I got to make it my own. Take my spicy almond pesto udon recipe, where I make a spicy version of a traditional Italian sauce and pair it with thick chewy noodles that are typically found in Japanese cuisine. Is this dish traditional? Absolutely not, but it sure is authentic to me (and, I should mention, absolutely delectable).

Many people associate pasta with "authentic" Italian cuisine. And tomatoes. But pasta is the Italian take on an Eastern innovation (noodles), and they wouldn't have tomatoes at all if not for the whole "invasion of the Americas" thing. Just because they added those ingredients long before we were born doesn't mean they're authentic or inauthentic.

While the line that differentiates appropriation and inspiration is not always crystal clear, it is important to not to erase the history or people from which a dish originated from.

This is an important point, though. And it's one that I'm not getting into, except to relate (possibly not for the first time) a personal anecdote:

I was sitting in my local bagel restaurant ("authentic" New York bagels in Virginia) many years ago, enjoying my carbohydrate toroid, when I overheard someone at the next table complain about cultural appropriation. I glanced over and noticed that she was eating a bacon, egg, and cheese bagel.

Why is this ironic? Because bagels are indisputably Jewish food (it's a cultural thing, not religious). But bacon is indisputably not. Nor is the concept of eating any kind of meat alongside any kind of dairy product.

Don't get me wrong; I enjoy bacon, egg and cheese bagels, myself. But I also don't complain about food as cultural appropriation.

The meaty (pun intended) part of the article ends with:

At best, authenticity in food is subjective because no single individual or society can define what it is. If everyone stopped viewing cuisine and culture from a stagnant perspective, paid more attention to the deeply rich experiences of cooks (and people) of color, and appreciated all culinary interpretations simply for what they are, the experience of eating could just be fun and delicious. And that’s exactly what I think it should be.

And I like that viewpoint.

The rest of the article is the promised Hong Kong-inspired egg tart, but I'll leave that for people who aren't as lazy as I am to peruse. I'll just add one more short anecdote about authenticity:

Driving through the wildscape of the Olympic Peninsula in the state of Washington, one day many years ago, I started to get hungry. So I pulled off into a strip mall. Said strip mall had not one, but two, restaurants billed as Mexican.

One of them had bright neon in the window, and ads for Corona and Bud Light and, if I recall correctly, even a dancing neon margarita glass. I may not remember the details very accurately, but hopefully I'm at least conveying the feel of the place.

The other one was a simple storefront with big glass windows and some signs in Spanish.

I entered the latter, where the TV was set to a Spanish language station, on mute but subtitled in Spanish, and had Mexican music playing. The food was excellent.

But I like to think that maybe the same people owned both storefronts (there was a Latin food shop in between them), and just ran the Bud Light one as a gringo trap.
May 22, 2023 at 8:15am
#1049962
Just an easy article today, and writing-related:



Clearly, they didn't Americanize the headline.

Born in Cambridge in 1952, Douglas Adams was best known for creating The Hitchhiker’s Guide to the Galaxy, a wildly successful project that began in 1978 as a science-fiction comedy radio series and eventually evolved to become something much larger, in many formats and in many languages, adored by many millions of people around the world.

I'll just point out that he did a lot more than that, including writing for Doctor Who, but it's Hitchhiker's that has embedded itself into public consciousness. Well, maybe not "public," but at least in the circles I frequent, one sign of certain intelligence is a good Hitchhiker's quote. Or a bad one.

This typically entertaining letter, which was actually a fax, was sent in 1992 to US editor Byron Preiss, whose company at the time was producing a comic book adaptation of Adams’ ever-expanding opus. Having noticed some unnecessary changes, Adams was keen to give some feedback.

I wish they wouldn't do that. Sure, Hitchhiker's has been translated into dozens of languages, but I find it offensive when they translate from British to American, as if we're too dumb over here to recogni[z/s]e that Brits spell a few words differently and have different words for several things than we do. I grew up reading both American and British literature, and while the differences did confuse me for a while, in the end, I feel like I became a better reader and writer for it.

Probably the worst offense was when they translated Philosopher's Stone to Sorcerer's Stone in the Potter books.

Anyway, the rest of the article is the letter itself, which is written, not unexpectedly, with hints of his signature dry humo(u)r. I'll just quote a few passages here.

A thing I have had said to me over and over again whenever I’ve done public appearances and readings and so on in the States is this: Please don’t let anyone Americanise it! We like it the way it is!

So it's not just me. Making it American would be like trying to do an American version of Black Adder, with American actors who have American accents. I know they did that sort of thing with a series (The Office, maybe?) but I never saw either version.

Though Hugh Laurie (who was in Black Adder), at least, does a great American accent.

The ‘Horse and Groom’ pub that Arthur and Ford go to is an English pub, the ‘pounds’ they pay with are English (but make it twenty pounds rather than five – inflation). So why suddenly ‘Newark’ instead of ‘Rickmansworth’? And ‘Bloomingdales’ instead of ‘Marks & Spencer’? The fact that Rickmansworth is not within the continental United States doesn’t mean that it doesn’t exist! American audiences do not need to feel disturbed by the notion that places do exist outside the US or that people might suddenly refer to them in works of fiction.

It is simply not possible to get the same effect if you substitute a British pub with an American bar. While there are a few drinking establishments in the US that approximate the British pub experience, the culture is still completely different.

Of course, this is only a problem for the very beginning of the story, when Earth still exists. (Spoiler)

Or we could even take the appalling risk of just recklessly mentioning things that people won’t have heard of and see if they survive the experience. They probably will – when people are born they haven’t heard of anything or anywhere, but seem to get through the first years of their lives without ill-effects.

Those sentences are pure Adams.

(Incidentally, I noticed a few years ago, when we still had £1 notes, that the Queen looked very severe on £1 notes, less severe on five pound notes, and so on, all the way up to £50 notes. If you had a £50 the queen smiled at you very broadly).

Quoting this because it legitimately made me laugh. Damn, I miss Douglas Adams. Neil Gaiman is the closest we have to him now, and while one of my favorite writers, he just can't do comedy.

One other thing. I’d rather have characters say ‘What do you mean?’ than ‘Whadd’ya mean?’ which I would never, ever write myself, even if you held me down on a table and threatened me with hot skewers.

I suspect the latter construction is a feeble attempt to render a British accent in dialogue. No. Make the thing British enough, and our minds will provide the accents.

I rarely close one of these entries with a direct quote, but it seems appropriate in this instance:

Otherwise it looks pretty good.
May 21, 2023 at 9:03am
#1049920
Spinning the dials on the ol' time machine, today we land on "Naturally", from just over two years ago.

The entry was a 30DBC response, and the prompt was: Write about your favorite outdoor activities to do in the summer. Are there any activities you haven’t done that you want to try?

I have a distinct memory of shuddering when I first saw that prompt. I have a bio somewhere—not on a dating site, because I don't do dating sites—that describes me as a "dedicated indoorsman."

So, I opted for humor; specifically, hyperbole:

Well, my only favorite outdoor activity in the summer is: rushing from an air-conditioned car into an air-conditioned bar; if the accursed daystar is burning, add "while shading myself as much as possible."

And that's not entirely true. I rarely drive to bars, on the theory that I might have to leave my car there to Uber home, thus necessitating another trip, the next morning, while hung over. So the "car" in question is a rideshare.

I'm just as likely to walk to the bar. This is true no matter the outdoor temperature. That doesn't mean I'm enjoying the outdoors; it just means I'm a cheapskate.

Still, given a choice between being too hot and too cold, I will pick too hot every damn time, so if I'm going to do anything outside, it'll be in the summer. Of course, one is never given such a choice; either it's winter and too cold, summer and too hot, or a week or so in between when it's actually pleasant to be outside for five minutes when it's not raining.

This is only a slight exaggeration. We usually get more than a week of mid-range temperatures. Sometimes as many as two!

But this does give me the opportunity to talk about a phrase that triggers my grumpiness, to a possibly irrational degree: when someone says something like "I'd rather it be cold because I can always put on more clothes, but there's only so many I can take off."

It's not that those people are different from me. As with being a morning or night person, people prefer different things and it's hard to change that, even if I wanted to, which I don't. If this weren't the case, we wouldn't have people living in Alaska or, conversely, Arizona. So, okay, you prefer the cold; that's good to know. We can be friends, but we'll never take a Caribbean cruise together.

I think it's partly the implied naughtiness: tee hee, now you're picturing me naked. Thanks, I'm already doing that (if you're female) or repulsed by the idea (otherwise).

Worse, though, is that it's dead wrong.

Ever seen people adapted to extreme heat? I don't mean the scantily-clad jungle-dwellers of various tropical societies; they've usually got some shade to cool off in. I mean, like, the desert-dwellers, the real-life inspiration for the Tusken raiders of Star Wars fame. Do they run around mostly naked? No, they're completely covered in loose-fitting clothing. This serves as portable shade and takes advantage of the slightest breeze to provide evaporative air conditioning.

And for me, when it's cold, it doesn't matter how many layers of clothing I wear; I'm going to freeze anyway. Once my hands or feet are cold, that's it. I'm done. If I wear enough clothing to delay this freeze, I'm uncomfortable and can barely move.

So I prefer to do as nature intended, stay indoors and run my A/C in the summer and heating in the winter. Yeah, compared to the vast bulk of humanity, both now and in the past, I'm living in privileged luxury. So? Might as well take advantage of it.

No, I'm of the firm opinion that we evolved to build shelters for a reason, and that reason is that so we could use them. Don't get me wrong; I love nature. I'm a big fan of watching webcams of natural areas.

And that's an excuse to re-link the webcam I talked about in that entry, since we're coming up on the summer solstice once again: The Brooks Falls Brown Bears, from Alaska.
May 20, 2023 at 11:37am
#1049871
Colorado is known for, well, several things, but mainly: mountains, weed, beer, and being a rectangle.

    Colorado Is Not a Rectangle—It Has 697 Sides
The Centennial State is technically a hexahectaenneacontakaiheptagon.


Oh, well, at least they've still got mountains, weed, and beer.

America loves its straight-line borders. The only U.S. state without one is Hawaii—for obvious reasons.

There are good reasons for that, mostly involving making sure surveyors don't just give up in the middle of marking a zigzag boundary.

West of the Mississippi, states are bigger, emptier, and boxier than back east. From a distance, all seem to be made up of straight lines.

Can't be arsed to look it up right now, but there was a shift in the way surveying was done between the time of East Coast European settlement, and massive migrations west. That's one reason you get a lot of near-rectangular counties out West, and almost none in the East.

Only when you zoom in do you see their squiggly bits: the northeast corner of Kansas, for instance. Or Montana’s western border with Idaho that looks like a human face.

Never noticed that before, and now I will never not notice it.

New Mexico comes tantalizingly close to having only straight-line borders. There’s that short stretch north of El Paso that would have been just 15 miles (24 kilometers) long if it were straight instead of wavy.

Just guessing here, but it looks like it's wavy because it followed the Rio Grande. I use the past tense because, looking at a map, it appears the river shifted but the boundary didn't (whether the river shift was intentional or not, I have no idea). There are great benefits in using rivers and other geological features as boundaries... at least until you remember that geological features change, and rivers in particular can rechannel themselves on human time scales.

No, there are only three states whose borders are entirely made up of straight lines: Utah, which would have been a rectangle if Wyoming hadn’t bitten a chunk out of its northeastern corner; Wyoming itself; and Colorado.

I've long been curious about why Wyoming bit Utah rather than the other way around, or why they didn't just settle on a diagonal compromise. But not curious enough to delve deeper.

I'm also going to quibble a bit. With a few exceptions, most boundaries are described by line segments. The exceptions include Delaware's northern boundary, but even the ones following the thalweg—that's Surveyese for the midline or channel of a river—are generally approximated by arbitrarily short line segments (putting survey monuments in a river channel tends to be cost- and labor-intensive).

Whether we perceive something as a whole lot of short line segments, or a squiggle, or an arc, depends on the zoom factor.

Except that they aren't, for two distinct reasons: because the earth is round, and because those 19th-century surveyors laying out state borders made mistakes.

And that's the other thing. On a Mercator and certain other map projections, latitudes appear as straight, horizontal lines. They are not. They are all circles of varying radius, centered on the poles. So when you're surveying a latitude line, you're actually describing an enormous arc (unless of course you're on the Equator). Said arc is generally approximated with line segments, as the variation from that big an arc to a line tends to be tiny.

Many states and countries have latitude lines as boundaries. Perhaps the most famous is the western segment of the border between the US and Canada. And even Eastern states (theoretically) have latitude boundaries, such as most of the line between VA and NC.

One begins to see why some of the US "Founding Fathers" were also surveyors.

Congress defined the borders of Colorado as a geospherical rectangle, stretching from 37°N to 41°N latitude, and from 25°W to 32°W longitude. While lines of latitude run in parallel circles that don’t meet, lines of longitude converge at the poles.

In contrast to latitude, longitude lines are, actually, straight (as mapped onto the ground). In theory.

Those longitude numbers seem like errors. I feel like they're measured from DC and not Greenwich, because there was a time when the US tried to measure everything from a longitude line passing through Washington, DC, and things like that tend to get entrenched into surveys. I couldn't find confirmation of this, but later in the article it acknowledges that the western boundary is more like 109°02’48″W, which supports my hypothesis.

This means that Colorado’s longitudinal borders are slightly farther apart in the south. So if you’d look closely enough, the state resembles an isosceles trapezoid rather than a rectangle. Consequently, the state’s northern borderline is about 22 miles (35 kilometers) shorter than its southern one.

I'd love to see the flat-Earther explanation for that one.
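
If you want to check that 22-mile figure yourself, here's a quick back-of-the-envelope sketch (my numbers, not the article's; it assumes a perfectly spherical Earth about 3,959 miles in radius and the full 7-degree longitude span, and ignores the surveying errors discussed below):

```python
import math

R_MILES = 3959        # mean Earth radius, spherical approximation
LON_SPAN_DEG = 7      # Colorado spans 7 degrees of longitude

def parallel_width_miles(lat_deg):
    """Length of a 7-degree slice of the circle of latitude at lat_deg."""
    return math.radians(LON_SPAN_DEG) * R_MILES * math.cos(math.radians(lat_deg))

north = parallel_width_miles(41)   # northern border, 41°N
south = parallel_width_miles(37)   # southern border, 37°N
print(f"north ≈ {north:.0f} mi, south ≈ {south:.0f} mi, difference ≈ {south - north:.0f} mi")
# Prints a difference of about 21 miles, in line with the article's "about 22 miles."
```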

That’s not where the story ends. There’s boundary delimitation: the theoretical description of a border, as described above. But what’s more relevant is boundary demarcation: surveying and marking out the border on the ground.

A friend once asked me whether the VA/NC boundary would shift with continental movements, to keep it roughly aligned with whatever latitude it's supposed to follow. No, it wouldn't; the boundary markers take precedence over the delimitation. Kind of like the NM/TX border near, but no longer on, the Rio Grande.

Unfortunately, 19th-century surveyors lacked satellites and other high-precision measurement tools.

I say they did remarkably well with what they had. Humans are clever when they want to be.

Let’s not be too harsh: considering the size of the task and the limitation of their tools—magnetic compasses and metal chains—they did an incredible job. They had to stake straight lines irrespective of terrain, often through inhospitable land.

I guess that's the polite way of saying they had to deal with mountains, Indians, and mountain Indians.

Whether they should have been messing around on land that didn't really belong to them is another issue for another time. The fact remains that they did mess around.

Located in a dusty, desolate corner of the desert, the Four Corners monument seems very far from the middle of anything. Yet this is the meeting point of four states: Utah, Colorado, New Mexico and Arizona. It is the only quadripoint in the United States. The monument’s exact location is at 36°59’56″N, 109°02’43″W.

It's not all that desolate. And yes, I've been there. Twice. Sure, it's a tourist trap, but I'm a tourist, and spent my career working with surveyors.

However, it’s not where Congress had decreed the four states to meet. That point is about 560 feet (170 meters) northwest of the quadripoint’s current location, at 37°N, 109°02’48″W. Did you drive all the way through the desert to miss the actual point by a few hundred feet?

Look, it's a nice drive from any direction.
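
And the "560 feet northwest" is easy enough to ballpark from those two coordinate pairs: the decreed corner sits 4 arcseconds north and 5 arcseconds west of the monument. A rough sketch (mine, not the article's), using roughly 101 feet per arcsecond of latitude on a spherical Earth:

```python
import math

FT_PER_ARCSEC_LAT = 101.3   # ~1 arcsecond of latitude in feet (spherical approximation)

dlat_arcsec = 4             # 37°00'00"N minus 36°59'56"N
dlon_arcsec = 5             # 109°02'48"W minus 109°02'43"W

north_ft = dlat_arcsec * FT_PER_ARCSEC_LAT
west_ft = dlon_arcsec * FT_PER_ARCSEC_LAT * math.cos(math.radians(37))  # longitude shrinks with latitude

offset_ft = math.hypot(north_ft, west_ft)
print(f"offset ≈ {offset_ft:.0f} ft ({offset_ft * 0.3048:.0f} m) to the northwest")
# Roughly 570 ft (about 175 m): the same ballpark as the article's 560 feet / 170 meters.
```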

The rest of the article goes into some of the more obvious deviations from straight-line surveying (though it still doesn't much acknowledge that latitude lines aren't "straight"). It's worth a read if you find this sort of thing interesting.
May 19, 2023 at 10:37am
#1049836
You know why you should never trust atoms? Because they make up everything.

    Why atoms are the Universe’s greatest miracle
With a massive, charged nucleus orbited by tiny electrons, atoms are such simple objects. Miraculously, they make up everything we know.


I could quibble about some of the words here, like "miracle," "simple," and "everything," but fine, they're not writing for scientists but to get ordinary people to know something about science. I can't be mad about that.

Similarly, I don't have much to say about the article itself. It's not technical, and it's got lots of cool illustrations, some of them animated. Highly recommended for anyone with curiosity.

One of the most remarkable facts about our existence was first postulated over 2000 years ago: that at some level, every part of our material reality could be reduced to a series of tiny components that still retained their important, individual characteristics that allowed them to assemble to make up all we see, know, encounter, and experience.

I mean, technically, light (which obviously enables us to see) isn't made of atoms. But it's generated from them, so okay.

Incidentally, I'm pretty sure that Democritus (the Greek who came up with the above idea) would be almost entirely forgotten had he not turned out to have been onto something. Lots of stuff the Greeks came up with didn't pan out (pun intended). And the Greek atomist theory wasn't exactly correct, either. It was more philosophy than science.

Everything that’s made up of normal matter within our Universe — whether solid, liquid, or gas — is made of atoms.

I could also quibble that this statement is a tautology, but why bother?

If all of human knowledge were someday wiped out in some grand apocalypse, but there were still intelligent survivors who remained, simply passing on the knowledge of atoms to them would go an incredibly long way toward helping them not only make sense of the world around them, but to begin down the path of reconstructing the laws of physics and the full suite of the behavior of matter.

That's assuming they'd have the time to do so, between running from predators, finding prey, and hiding from aliens and/or zombies.

This is, however, just their framing device to communicate the idea of building atomic theory from the ground up. Best to not take these things too literally.

Like I said, I don't have much else to add. Mostly I just wanted to hold it up as an example of how one might communicate complex ideas to a wider audience.
May 18, 2023 at 6:11pm
#1049803
While one should never get their legal advice from an online comedy site, you might like this Cracked article about laws.



Yeah, I don't know if these are unique. And it's not like other countries don't have stupid laws, too. Hell, some of them still punish you for blasphemy.

The United States is a pretty weird country.

Which one isn't? Oh, yeah, Canada. Never mind.

Even though what’s supposed to be the famous, usually screamed tenet of America is freedom, the actual freedoms we do and don’t have are cherry-picked and puzzling.

Yeah, right. "Freedom."

So it’s unsurprising that there’s a whole lot of regulations and laws in the U.S. that haven’t fallen far from the apple tree — at best confusing, at worst fully oxymoronic.

Or just moronic.

Here are five American laws that are likely, in the eyes of other modern governments, incredibly dumb.

I always liked those lists of weird laws still on the books, like needing a license to wear penny loafers, or whatever.

These aren't those, though.

5. Female Lawmakers’ Backwards Dress Codes

Another point on the high school side of the scale is the fact that, despite being our chief legislative body, Congress still enforces a fucking dress code. And like most dress codes, it’s a whole lot more draconian when it comes to the female members.


That's idiotic, sure, but then there are still countries where "female lawmaker" is semantically and legally impossible.

Sure, Britain isn’t much looser, but they also think “fanny” is a cuss, so is that such a win?

That's what I've been saying.

4. Kinder Surprise Eggs Banned

Another common feel in American law is the conflict between a country that’s supposed to be advocating for freedom above all, while seemingly convinced that every American has the death drive of a baby lustily staring at the forbidden liquids beneath the sink. One place this pops out is in the absence of the Kinder Surprise Egg in American stores.


Meanwhile, far more hazardous products remain legal. You know what I mean.

3. Weird Real Egg Laws

Right, because no one else has weird laws about food.

2. Pharmaceutical Advertising

Everywhere else foolishly believes that if you need medication, your doctor probably isn’t relying on you to provide suggestions. It doesn’t help that the advertising is just as predatory as usual, mostly suggesting that if you don’t fix your allergies, your child will spit on you and leave you to cry in a musty robe while they go to the park to play with their other parent, who they now like more.


I despise almost all advertising, and it is kinda strange to push prescription medicine on the TV, but there are worse things to advertise. Homeopathy, e.g.

1. Sex With A Porcupine

And I'm done with the internet for today.
May 17, 2023 at 8:14am
#1049703
It's been a while since I've ragged on the "time is an illusion" nonsense, so here we go again.



Despite my issues with the wording, it's an interesting article with some things I hadn't read before. Obviously, I can't do the whole thing justice here; that's what the link is for.

America's official time is kept at a government laboratory in Boulder, Colo., and according to the clock at the entrance, I was seven minutes behind schedule.

Not if time is an illusion, you weren't.

NIST broadcasts the time to points across the country. It's fed through computer networks and cellphone towers to our personal gadgets, which tick in perfect synchrony.

For various definitions of "perfect."

"A lot of us grow up being fed this idea of time as absolute," says Chanda Prescod-Weinstein, a theoretical physicist at the University of New Hampshire. But Prescod-Weinstein says the time we're experiencing is a social construct.

That's misleading at best. Sure, the way we slice time up into hours, minutes, and seconds is purely arbitrary (though it does have some basis in observation), but time's going to do its thing regardless of whether there's a clock to measure it. The orbits of the planets, for instance; they make an effective clock if you know how to read it.

How can I claim to know better than a theoretical physicist? It's the "theoretical" part. The smallest "things" we know of, quarks and electrons, don't experience... well, anything. But where time is concerned, subatomic particles follow laws that don't take much notice of time, if any. From that perspective, okay, time isn't fundamental, and that's the framework a theoretical physicist works in. But for everyday stuff? Time is real. The sun rises in the east (though the directions are also a social construct) and sets in the west. There is darkness, and then there is light.

The consensus I've seen is that it's a bulk property of matter, related to entropy. You know what else is a bulk property? Temperature. But there aren't pseudo-mystics floating around airily proclaiming that temperature is an illusion. Any that do need to be shipped to Antarctica in a pair of shorts to see if they can wish temperature away.

Real time is actually something quite different. In some of the odder corners of the Universe, space and time can stretch and slow — and sometimes even break down completely.

You're going to claim time is a human construct, and then, in the exact same paragraph, use the phrase "real time?"

Yes, that last quoted bit is correct, to the best of my knowledge. The thing is, though, we know exactly how time stretches, slows, and breaks down under acceleration (including acceleration due to gravity). There are equations for it. To me, an "illusion" wouldn't have that quality.
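
For instance, one of those equations, in its weak-field form: a clock sitting higher in a gravitational field runs faster by a fractional rate of roughly g*h/c^2. Here's a toy sketch of that (standard textbook approximation with numbers I've plugged in, not anything from the article):

```python
g = 9.81                    # surface gravity, m/s^2
c = 2.998e8                 # speed of light, m/s
SECONDS_PER_YEAR = 3.156e7

def fractional_speedup(height_m):
    """Weak-field gravitational time dilation: extra clock rate per unit height."""
    return g * height_m / c**2

# A clock about a mile (~1600 m) above sea level, roughly Boulder's elevation:
dh = 1600
print(fractional_speedup(dh))                       # ~1.7e-13, dimensionless
print(fractional_speedup(dh) * SECONDS_PER_YEAR)    # ~5.5e-6 s gained per year vs. sea level
```

A few microseconds a year doesn't matter for catching a bus, but it's exactly the kind of predictable, measurable effect that makes "illusion" the wrong word.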

Space is also different from one point to another, but only the most bearded philosophers claim space is an illusion.

For many people, this unruly version of time is "radical," she says.

It is, by definition and equations, ruly. Not unruly. Yes, it seems odd to us because it's outside our normal experience. But there's plenty of observational confirmation of the way time changes at different locations.

By averaging a subset of the 21 clocks together, NIST has created a system that can count the time to within one quadrillionth of a second. That means the government's clock can keep time to within a second over the course of about 30 million years.

At which point it'll be moot, because the Earth's rotation will have changed, and the second is based on the minute, which is based on the hour, which is based on the time it takes for the Earth to make a complete rotation during the current epoch.

I expect the second will remain the same, provided we last long enough to keep measuring time. But the length of the day will gradually increase, unless something catastrophic happens.
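
Also, the "30 million years" figure checks out, if you read "one quadrillionth of a second" as a fractional error of about one part in 10^15 per second. A sanity-check sketch (mine, not NIST's actual error budget):

```python
SECONDS_PER_YEAR = 3.156e7     # roughly 31.6 million seconds in a year

fractional_error = 1e-15       # one part in a quadrillion
years_to_drift_one_second = (1 / fractional_error) / SECONDS_PER_YEAR
print(f"{years_to_drift_one_second / 1e6:.0f} million years")   # ~32 million, i.e. "about 30 million years"
```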

The time from this lab is used to run our lives. It says when planes take off and land, when markets open and close, when schoolchildren arrive at class. It controls computer networks, navigation tools and much, much more.

And? I'm as lazy as anyone, but I still want to keep track of time if I'm doing something or meeting with someone.

Governments around the world aren't just providing the time as an altruistic service to citizens, Prescod-Weinstein argues. It's about keeping society organized and efficient. It's about increasing economic productivity.

Bit of a stretch, in my opinion.

"Capitalism sucks, and I think a lot of people's relationship to why time is not cool, is structured by the resource pressures that we feel," she says.

So she has an agenda.

I'm not going to get into the "capitalism sucks" debate, except to say that, well, we've tried some other systems, and, to borrow Churchill's line about democracy, it's the worst economic system except for all the others. I do hold out some hope that we'll find a replacement, à la Star Trek, or improve it so it's not so dehumanizing as we pursue peak efficiency in the name of Holy Productivity, and pretend that infinite growth is possible. So I can relate to that agenda. But really, none of that says anything about the concept or reality of time itself.

Wibbly wobbly timey wimey

I can't hate an article that has a Blink reference.

True time is actually much more flexible than most people realize, Prescod-Weinstein says. According to Einstein's general theory of relativity, space and time are tied together, and space-time can bend and curve.

Sure, but that has no practical value for us as we slog through our daily lives.

In places where gravity is very strong, time as we understand it can break down completely. At the edge of black holes, for example, the powerful gravitational pull slows time dramatically, says Prescod-Weinstein. And upon crossing the black hole's point of no return, known as its event-horizon, she says space and time flip.

I've seen that finding before, and it's got lots of theory supporting it. Obviously, there's no way to experimentally verify it. Again, no practical use for us. Not yet, anyway.

The Universe is expanding, and because of entropy, energy and matter are becoming more and more evenly spread out across the ever-growing void. In its final state, the Universe may end up as an inert cloud of energy and matter, where everything is evenly distributed.

Yeah, I've referred to that before. They call it the heat death of the Universe, because there will be no more heat transfer, because everything is already at maximum entropy. As I noted, time is probably the result of the one-way direction of entropy. No entropy change means no time. That's if our current cosmological models are correct, which is always an active question.

What this article fails to mention is that this is not an imminent existential threat. We're talking about something like 10^100 years from now, which is a number so large that you don't understand it. Hell, I barely understand it, myself. As a comparison, there are fewer than 10^100 atoms in the entire observable universe.

That exact number is called a googol, incidentally. Not to be confused with Google, who either deliberately or accidentally misspelled it.

Anyway.

So time, as we understand it, has some really big problems, but it also has some really tiny ones, too. In fact, some scientists who study the microscopic interactions of fundamental particles are questioning the idea of time itself.

Yes, we know, fundamental particle interactions are time-reversible. As I said up there, time is a bulk property, to the best of our knowledge.

Well, I've banged on long enough. There is, as I noted, a lot more stuff in the actual article.

If you have time to read it.
May 16, 2023 at 9:47am
#1049662
I never sausage a thing.

    Which Hot Dog Brand Is Best? A Blind Taste Test of Oscar Mayer, Hebrew National, and More
Because your summer BBQs deserve the best. (Or at the very least, not the worst.)


Dammit! It was sitting right there. They could have said "not the wurst," but no, they had to play it straight.

Do you know the difference between sausages, wieners, frankfurters, and hot dogs? If, like me, you hadn’t ever really thought about it and assumed they were all pretty much the same, I’m thrilled to tell you that you’re wrong.

Because of course we are.

It's been many years since I've actually eaten a hot dog, frankfurter, or wiener; anything requiring a hot dog bun. At home, I got really tired of the mismatch between the number of franks and the number of buns in their respective packages, and while out, there are other foods that appeal to me more. Not to mention I know what they're made of, but that doesn't stop me from eating breakfast sausages.

So, unlike those two other staples of American haute cuisine, hamburgers and pizza, I don't have a dog in this fight. Pun intended, as usual. I just relished the article and found it amusing.

Regardless of which type of sausage is your favorite, there’s one that screams summer louder than the others: hot dogs. Our staff tasters were sure childhood standbys like Oscar Mayer and Hebrew National would sweep the competition but, as always, the results of our taste tests are full of surprises.

Naturally, I enjoyed hot dogs when I was a kid. Our cylindrical delicacy of choice was, unsurprisingly, the Hebrew National brand. When those were unavailable for whatever reason, the replacement still had to be made of cow, because my mom tried to keep a kosher house as best she could out in the boonies.

To cut down on variables, we boiled and tasted only all-beef hot dogs.

So this is why I picked this article to go into my queue.

And though we offered up buns, ketchup, and mustard, most testers boldly chose to taste their dogs plain.

This makes sense from a pure taste-testing perspective, but out in the wild, you're looking for a whole experience, including bun and condiments. I believe that the right choice of bun influences that experience. Would you taste-test pizza without the crust?

As for condiments, in a taste-test, you at least want them to be consistent across all the samples.

And finally, I know they didn't do this in Chicago, because in Chicago, they track down anyone putting ketchup on a hot dog and run them out of town on a rail.

In the end we blind tasted seven of the most popular brands and judged them on flavor, casing snap, and the satisfying firmness of the meat in each bite.

Phrasing!

Of course, for full effect, you'll need to go to the article for details. I'm just highlighting things here.

The Biggest Loser: Oscar Mayer

Quelle surprise. Their dogs are terrible.

Unflinchingly Flaccid: Ball Park

I'm starting to think this author has issues.

Not because they don't like Ball Park. That's normal. It's just, again, phrasing.

Happily Herby: Hebrew National

Here's the thing: it's hard to be objective about food (or drinks) during a taste test. Taste is, well, a matter of taste. Beer, for example, is highly personal; some love *shudder* IPAs, while I prefer darker, less hoppy brews. For colas, Coke will, for me, always be far superior to Pepsi. And Hebrew National is always going to be my Platonic ideal of hot dogs, even if I don't eat them anymore, because it was our go-to brand when I was a kid.

The two that beat it on the list, Sabrett and Nathan's, were only available to Kid Me on trips to New York City. Either they hadn't yet expanded distribution to the rest of the country, or we just didn't get them in our off-the-beaten-path area. In both cases, I always wished it was HN.

So, in the end, you'll have to make your own taste test if you care to determine which is best. Or be like most people and just eat whatever's cheapest; this is why many Americans have no sense of taste.

But you like what you like and it's not my decision.

One final note: there is perennial debate over whether a hot dog, nestled as nature intended in its bun, is a sandwich. I've heard even a Supreme Court justice once weighed in on the matter (RBG, if it matters), though in an unofficial capacity.

This is, ultimately, a categorization problem, like whether the Blue Ridge are actually mountains, or Pluto's planetary status. So there's no official answer. Categories are, in the end, a social construct. However, when you consider that the hot dog is generally served with the split side of the bun facing upwards so that the toppings don't fall out—something you never do, and can never do, with a sandwich—and that the bun itself is always solid at the bottom, barring accidents, and also given its origins as handheld street food, there is only one True Conclusion at which someone can arrive:

A hot dog isn't a sandwich.

It's a taco.
May 15, 2023 at 9:21am
#1049626
Nothing is forever.



I didn't fact-check these, so beware. If true, there are some on this list I'd never heard of.

As per normal, I'm not copying all of them here.

1. Ansault pear

Unlike other items on this list, the Ansault pear appeared relatively recently. First cultivated in Angers, France, in 1863, the fruit was prized for its delectable flesh.


Angers, France? That's not Nice.

Irregular trees and the rise of commercial farming contributed to the fruit's demise.

Seems to me that, if we really wanted to, we could recreate this one. Fruit varieties are generally made by some sort of cloning or hybridization.

3. Auroch

You may have heard aurochs mentioned in Game of Thrones, but this creature doesn’t belong in the same category as dragons. The real cattle species was domesticated 10,000 years ago in the early days of agriculture. They were big (“little below the elephant in size,” according to Julius Caesar) and leaner than modern cows.


Apparently these lasted longer than I thought, all the way to the 17th century.

Here, the article leaves out an interesting bit about the aurochs: it was so important, so integral to developing civilizations, that the pictogram for it became a letter. Phoenicians called it aleph. The Hebrew script still does. In Greek, alpha. We know it as the letter A.

5. Dodo

Dutch sailors first visited the island chain of Mauritius in 1598, and less than two centuries later the archipelago's native dodo went extinct. Sailors relied on the birds as sustenance during long voyages at sea, but that isn't the primary reason they died out; habitat and the introduction of invasive species like rats and pigs ultimately wiped out the animal.


Pretty sure they mean "habitat loss," not "habitat."

It's my understanding that it was fairly common, at the time, for people of the European variety to believe that God put all the other animals (and plants, etc.) on Earth for our benefit, and would never allow one to become extinct.

That turns out not to be the case.

6. Steller’s sea cow

German naturalist Georg Wilhelm Steller identified the Steller's sea cow around the Commander Islands in the Bering Sea in 1741. Growing up to 30 feet long, it was significantly larger than the sea cows alive today.


Cue Hindenburg disaster narrator: "Oh, the huge manatee!"

7. Mammoth

Wooly mammoth meat was an important component of the diets of our earliest human ancestors. We ate so much of them that hunting may have contributed to their extinction around 2000 BCE (though climate change was likely a bigger factor).


So, apparently, there were mammoths wandering around at the same time as there were pyramids in Egypt.

Not in the same place, though.

8. Taliaferro apple

Thomas Jefferson cultivated Taliaferro apples at Monticello. In an 1814 letter to his granddaughter, Jefferson said the small fruit produced "unquestionably the finest cyder we have ever known, and more like wine than any liquor I have ever tasted which was not wine."


Including this one in my commentary for literal local flavor. But also because many people might not be aware that the Virginia pronunciation of Taliaferro is, inexplicably, Tolliver.

9. Great auk

Modern humans primarily killed great auks for their down, leading to the species’s extinction in the mid-19th century, but prior to that they were hunted for dinner.


I knew about this one because I had an English teacher in high school who loved to point out awkward sentences in his students' compositions by writing a big red AWK and circling the offending phrase. He called it the "Great Awk."

It should surprise no one that I had a truly stupendous Great Awk collection.

Anyway, there's obviously more at the link, and they're all interesting, even if, as is the case with the passenger pigeon entry, some of them are already well known.
May 14, 2023 at 9:27am
#1049580
Today, we're going all the way back to June of 2020 for an article about quantum mechanics: "Something about Nothing". Now, don't freak out; it's not a technical treatise in any way.

The article referenced there (really a book promotion masquerading as an interview masquerading as an article) is a year older than that, but it's still up and can be found here, with the misleading title of "Making Sense of Quantum Mechanics."

As I've noted, the main reasons I do these retrospectives are to see if there have been any updates to the material covered, and to elucidate how my own views on the subject may have evolved. I don't follow science news all that rigorously; that is, I'll read an article or a book, or watch a video, here and there, but it's not like I delve into much depth. But the one thing that comes to mind is that, recently, there was a lot of buzz about a Nobel Prize given to some scientists for their work on quantum entanglement.

Same as with everything else related to quantum theory, people got that stuff wrong, too. I don't mean the prize-winning scientists, who presumably had really tight results, but the way it was reported on made it look like the old "Einstein was Wrong" crowing, with an emphasis on how quantum entanglement means something travels faster than light.

It does not. The light speed barrier should more properly be termed the information speed barrier, and quantum entanglement does not, at least with our current understanding, imply the transmission of information faster than light. We can't use it to send instantaneous messages to Pluto, for instance. Mostly, from what I can tell, the usefulness is limited to the arcane workings of quantum computers. Perhaps there are other uses, or will be in the future, but mostly the prize was about experimental confirmation of a theory.

None of which really negates anything in the article I featured, as far as I can tell.

In that entry, I said:

I've always been of the opinion that anyone who claims to have figured out quantum mechanics is lying, at the very least to themself.

Nothing about that has changed.

But I do want to go back to that original article to note something I apparently missed the first time around:

Horgan: Good choice. What is the point, in our scientific age, of philosophy?

Albert: I'm not sure I see the connection. It's like asking, “What is the point, in our scientific age, of ice cream?" People like it. People - or some people - want to understand how things, in the broadest possible sense of the term, hang together. And it happens (moreover) that the business of trying to figure that out has had obvious and enormous and innumerable consequences for the history of the world. And if the thought is that what goes on in university science departments has lately somehow taken that function over, then that's just clearly and wildly wrong - and the fact that it's wrong, as I explained in my answer to your previous question, was part of the reason why I moved from a science department to a philosophy department.


Here, I think both people missed the mark.

Ice cream has nothing to do with science (except in the sense that everything does and that, reportedly, Einstein was a big fan). But—and I might have mentioned this before, but I don't remember—philosophy guides science, and science informs philosophy.

"Science" isn't a body of knowledge; it's a method. The scientific method is, at base, philosophy. It's philosophy that works, in that it's been shown to get useful results, unlike a lot of the mental self-pleasuring some philosophers do. But philosophy also has at least one other function in science, and that's to limit the lengths to which we'll go to investigate some hypothesis.

To note a basic example, in biology, animal testing is a thing. What limits animal testing isn't science itself, but ethics, which is a branch of philosophy. You can argue that the restrictions go too far, or, conversely, that they don't go far enough and maybe we shouldn't be doing animal testing at all. But by doing so, you're not doing science, you're doing philosophy.

As for "science informs philosophy," well, the thing about philosophy is that you can build entire logical edifices on the foundation of a false premise. One need look no further than the convolutions of a flat-earther to see what I'm talking about here, but, in general, if you're going to draw conclusions, it's best to start with a solid and well-tested premise, such as "the earth is basically round" or "gravity is an attractive force between masses."

Sometimes, when you do that, you might find a contradiction, or a paradox. That might lead to a revised premise, and that's okay.

My point is that the universe doesn't support our penchant for putting everything into neat little boxes. There's no sharp dividing line between, say, biology and chemistry. The boundary between our selves and our environment can get murky, too, and does so every time we take a breath, or eat, or shit.

So it is with science and philosophy. Though we were doing philosophy way before the beginnings of science as a discipline (physics was originally termed "natural philosophy"), it often led to some really wrong conclusions. Still does, of course.

Okay, enough of that. I guess I just had to defend why I bang on about both those things in here, when I'm not making dick jokes. And sometimes when I am.
May 13, 2023 at 8:22am
#1049551
From the "don't believe everything you hear" department (courtesy of The Guardian):

    Chocolate doesn’t cause acne – but carrots do help you see in the dark: the best and worst health myths and wisdom
True or false: cheese gives you bad dreams and oysters are aphrodisiacs? We investigate good, bad and mad health advice


Folk "wisdom" usually isn't wisdom, but mythology. People have always had a problem confusing correlation with causation.

Sometimes, though, like a sightless person throwing darts randomly and hitting a bullseye, it turns out to be right—at least provisionally.

How do you tell the difference? Science, of course.

I won't copy all of them here; there are quite a few. Just hitting some highlights that I wanted to comment on.

Chicken soup helps cure colds and flu

Works best if prepared by a Jewish mother.

Okay, no, that's a joke. But I'm pretty sure the canned kind is going to be inferior to the homemade variety. I'm wary of the word "cure" in the title; however, this falls into the "can't hurt and might help" category. Unless you're vegan, in which case, good luck.

Anyway, I've banged on about chicken soup in here before. The short version is, if it makes you feel better, and you like it, great.

Chocolate causes acne

This one's labeled "false." As with much of "nutrition science," the jury's still out.

An apple a day keeps the doctor away

Also labeled "false," but only on a technicality: you're going to get sick eventually, no matter what you do or don't do. But, as the article notes, it's not going to hurt you to eat a damn apple. And it's admittedly a catchy rhyme.

Going out with wet hair gives you a cold

"False." Duh. Colds are caused by viruses. Viruses that you pick up from *shudder* people.

Carrots help you to see in the dark

The article points to "true," but I have to say, not to the extent that mythology would indicate. This nonsense started, if I remember correctly, in England during WWII, when, not wanting "zee Germans" to know about the Allies' sophisticated (for the time) radar, they attributed early warning of aerial attacks to people eating carrots and therefore seeing the impending threat better in the dark.

But again, as with apples, it's not like eating a few carrots is going to hurt.

Cracking your knuckles will give you arthritis

This one's false, and I've known that for some time, but goddamn, it's annoying. So if you use it to scare kids into not cracking their knuckles, I can understand that.

I crack my knuckles all the time.

It takes up to seven years to digest swallowed chewing gum

Another false one intended to scare children straight.

Garlic under your pillow aids sleep

Labeled "false," but I'd call it true, if you're frightened of vampires; clearly, you're going to sleep easier if you know you're protected. It also keeps other people away because of the smell, so you get a better night's sleep.

Still, when it comes to garlic, I'm too busy eating it to put it under my pillow.

Urine relieves jellyfish stings

I'm including this one in my commentary because I'm still hearing this nonsense sometimes. I suspect it got started by someone with a pee fetish, which is way more common than I ever realized.

Oh, yeah, and it's false.

Cheese gives you bad dreams

I remember the first time I heard of this one. It was in Dickens' A Christmas Carol, as Scrooge attributed his nocturnal visitations to possibly having eaten cheese.

As the article notes, dairy products might actually help with sleep. Again, good luck, vegans.

Probiotics support your gut health

This is the last one on the list, and it doesn't surprise me in the yeast—er, I mean, least—that it's not entirely true. The magical benefits of probiotics are mostly marketing gimmicks.

Big surprise.
May 12, 2023 at 8:06am
#1049518
A little more depth than usual, today, but I'll try to make it worth your time.

    Descartes and the Discovery of the Mind-Body Problem
The French philosopher René Descartes is often credited with discovering the mind-body problem, a mystery that haunts philosophers to this day. The reality is more complicated than that.


Descartes was, of course, more than a philosopher. Probably most famous for "I think, therefore I am," he was also a scientist and mathematician (he's the guy who decided it would be cool to represent points on a two-dimensional grid with x and y axes, a scheme ever after known as the Cartesian coordinate system, after him). And he had the best hair of any philosopher. Newton's was arguably better, but his focus was mostly science, math, and alchemy; plus, I suspect his was a wig.

Consider the human body, with everything in it, including internal and external organs and parts — the stomach, nerves and brain, arms, legs, eyes, and all the rest.

Yeah, I know what you were thinking about with "all the rest."

Even with all this equipment, especially the sensory organs, it is surprising that we can consciously perceive things in the world that are far away from us.

Eh, not really. Most animals can do that. Kind of necessary for avoiding predators and finding prey.

For example, I can open my eyes in the morning and see a cup of coffee waiting for me on the bedside table. There it is, a foot away, and I am not touching it, yet somehow it is making itself manifest to me. How does it happen that I see it? How does the visual system convey to my awareness or mind the image of the cup of coffee?

"Even more importantly, I live alone!"

I am conscious of the cup, we might even say, though it is not clear what this means and how it differs from saying that I see the cup.

Everyone treats consciousness like it's gotta be this massive, complex thing, but if it turns out to be really simple, I'll laugh my ass off (from beyond the grave, if the dualists turn out to be right).

How did my neurons contact me or my mind or consciousness, and stamp there the image of the cup of coffee for me?

It’s a mystery. That mystery is the mind-body problem.


By this point in the article, the author could have, you know, drunk the coffee instead. Such are the perils of philosophy: your coffee gets cold while you wax philosophical about it.

Our mind-body problem is not just a difficulty about how the mind and body are related and how they affect one another. It is also a difficulty about how they can be related and how they can affect one another.

Plot twist: they're the same thing.

According to Descartes, matter is essentially spatial, and it has the characteristic properties of linear dimensionality. Things in space have a position, at least, and a height, a depth, and a length, or one or more of these.

Hence, Cartesian coordinates (extended into a third dimension).

Mental entities, on the other hand, do not have these characteristics. We cannot say that a mind is a two-by-two-by-two-inch cube or a sphere with a two-inch radius, for example, located in a position in space inside the skull. This is not because it has some other shape in space, but because it is not characterized by space at all.

*bong hit* duuuuuude.

Okay, but in seriousness, I believe this next bit is the actual crux of the matter under discussion:

What is characteristic of a mind, Descartes claims, is that it is conscious, not that it has shape or consists of physical matter. Unlike the brain, which has physical characteristics and occupies space, it does not seem to make sense to attach spatial descriptions to it. In short, our bodies are certainly in space, and our minds are not, in the very straightforward sense that the assignation of linear dimensions and locations to them or to their contents and activities is unintelligible. That this straightforward test of physicality has survived all the philosophical changes of opinion since Descartes, almost unscathed, is remarkable.

I don't know about that last sentence. There are philosophers that will tell you, with straight faces (if you can see said faces behind their beards) that what we know as matter is an illusion, and mind is the only thing that's real.

Oh, wait, that's from Descartes, too.

Descartes is untroubled by the fact that, as he has described them, mind and matter are very different: One is spatial and the other not, and therefore one cannot act upon the other.

And yet, one does act upon the other, as we prove every moment of every day, so if it didn't trouble him, it's not going to trouble me, either.

This is analogous to Zeno's paradoxes. The way they're formulated, nothing can move, and two people can never get close enough to kiss, and no one would ever be able to enter a room. All of Zeno's paradoxes were later resolved by calculus and limit theory (which came along a generation after Descartes and built on his work), but the philosophical point stands: when your philosophy doesn't mesh with reality, it's not reality that's wrong; it's your philosophy.
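
To make the dichotomy paradox concrete: the halves Zeno worries about form a geometric series whose partial sums never exceed 1, and whose limit is exactly 1. A toy illustration (my example, nothing from the article):

```python
# Zeno's dichotomy as a geometric series: 1/2 + 1/4 + 1/8 + ...
# Each partial sum stays below 1; the limit as the number of terms grows is exactly 1.
total = 0.0
step = 0.5
for _ in range(30):
    total += step
    step /= 2
print(total)   # 1 - 2**-30, i.e. within about a billionth of 1 after only 30 steps
```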

Descartes is surely right about this. The “nature” of a baked Alaska pudding, for instance, is very different from that of a human being, since one is a pudding and the other is a human being — but the two can “act on each other” without difficulty, for example when the human being consumes the baked Alaska pudding and the baked Alaska in return gives the human being a stomachache.

Okay, that was legitimately funny.

The difficulty, however, is not merely that mind and body are different. It is that they are different in such a way that their interaction is impossible because it involves a contradiction.

I would say: an apparent contradiction. Again, it just means that our understanding of one or the other is faulty.

My money's on "mind."

Mind is consciousness, which has no extension or spatial dimension, and matter is not conscious, since it is completely defined by its spatial dimensions and location.

Unless you subscribe to panpsychism, the philosophy that all matter has some rudimentary consciousness. (I do not, and have banged on about it at length in previous entries.)

Was there really no mind-body problem before Descartes and his debate with his critics in 1641? Of course, long before Descartes, philosophers and religious thinkers had spoken about the body and the mind or soul, and their relationship. Plato, for example, wrote a fascinating dialogue, the Phaedo, which contains arguments for the survival of the soul after death, and for its immortality.

I begin to see the problem. It starts with a false premise: that the mind, or soul, exists independently of the body and can survive the body's demise.

This is like believing that a candle's light can survive the snuffing of the candle. No. Except in the metaphorical sense, as those who saw the candle's flame can remember it.

What happens, if anything, for example, when we decide to do even such a simple thing as to lift up a cup and take a sip of coffee?

Well, after spending all your time philosophizing about it, that's when you discover it's cold. And I'm still not clear on how it appeared on your bedside table to begin with.

Anyway, I get around all this by understanding that the mind is a construct of the physical body. But what do I know? I'm not a philosopher.


© Copyright 2024 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.