About This Author
Come closer.
Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is a square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, b.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number a + bi has a unique representation on the complex plane, the point (a, b): some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
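For what it's worth, the most famous example is the Mandelbrot set, which comes from nothing more than repeatedly squaring a complex number and adding a constant. Here's a minimal sketch in Python (my own illustration, not taken from any particular fractal program; the function name and cutoff values are just conventional choices):

# Test whether a point c of the complex plane belongs to the Mandelbrot set
# by iterating z -> z*z + c and checking whether |z| stays bounded.
# Python's built-in complex type handles the arithmetic.
def in_mandelbrot(c, max_iter=100):
    z = 0 + 0j
    for _ in range(max_iter):
        z = z * z + c           # the "very simple transformation"
        if abs(z) > 2:          # once |z| exceeds 2, the orbit escapes for good
            return False
    return True

print(in_mandelbrot(complex(-1, 0)))  # True: -1 stays bounded
print(in_mandelbrot(complex(1, 0)))   # False: 1 escapes after a few steps

Color each point of the plane by how quickly it escapes, and you get the familiar, endlessly intricate pictures.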




Merit Badge in Quill Award: Congratulations on winning Best Blog in the 2021 edition of [Link To Item #quills]!
Merit Badge in Quill Award: Congratulations on winning the 2019 Quill Award for Best Blog for [Link To Item #1196512]. This award is proudly sponsored by the blogging consortium including [Link To Item #30dbc], [Link To Item #blogcity], [Link To Item #bcof] and [Link To Item #1953629]. For more information, see [Link To Item #quills].
Merit Badge in Quill Award: Congratulations on winning the 2020 Quill Award for Best Blog for [Link To Item #1196512]. This award is sponsored by the blogging consortium including [Link To Item #30dbc], [Link To Item #blogcity], [Link To Item #bcof] and [Link To Item #1953629]. For more information, see [Link To Item #quills].
Merit Badge in Quill Award 2: 2022 Quill Award - Best Blog - [Link To Item #1196512]. Congratulations!!!
Merit Badge in Quill Award 2: Congratulations! 2022 Quill Award Winner - Best in Genre: Opinion [Link To Item #1196512]
Merit Badge in Quill Award 2: Congratulations!! 2023 Quill Award Winner - Best in Genre - Opinion [Link To Item #1196512]
Merit Badge in 30DBC Winner: Congratulations on winning the Jan. 2019 [Link To Item #30dbc]!!
Merit Badge in 30DBC Winner: Congratulations on taking First Place in the May 2019 edition of the [Link To Item #30DBC]! Thanks for entertaining us all month long!
Merit Badge in 30DBC Winner: Congratulations on winning the September 2019 round of the [Link To Item #30dbc]!!
Merit Badge in 30DBC Winner: Congratulations on winning the September 2020 round of the [Link To Item #30dbc]!! Fine job!
Merit Badge in 30DBC Winner: Congrats on winning 1st Place in the January 2021 [Link To Item #30dbc]!! Well done!
Merit Badge in 30DBC Winner: Congratulations on winning the May 2021 [Link To Item #30DBC]!! Well done!
Merit Badge in 30DBC Winner: Congrats on winning the November 2021 [Link To Item #30dbc]!! Great job!
Merit Badge in Blogging: Congratulations on winning an honorable mention for Best Blog at the 2018 Quill Awards for [Link To Item #1196512]. This award was sponsored by the blogging consortium including [Link To Item #30dbc], [Link To Item #blogcity], [Link To Item #bcof] and [Link To Item #1953629]. For more details, see [Link To Item #quills].
Merit Badge in Blogging: Congratulations on your Second Place win in the January 2020 Round of the [Link To Item #30dbc]! Blog On!
Merit Badge in Blogging: Congratulations on your second place win in the May 2020 Official Round of the [Link To Item #30dbc]! Blog on!
Merit Badge in Blogging: Congratulations on your second place win in the July 2020 [Link To Item #30dbc]!
Merit Badge in Blogging: Congratulations on your Second Place win in the Official November 2020 round of the [Link To Item #30dbc]!
Merit Badge in Highly Recommended: I highly recommend your blog.
Merit Badge in Opinion: For diving into the prompts for Journalistic Intentions - thanks for joining the fun!
Merit Badge in High Five: For your inventive entries in [Link To Item #2213121]! Thanks for the great read!
Merit Badge in Enlightening: For winning 3rd Place in [Link To Item #2213121]. Congratulations!
Merit Badge in Quarks Bar: For your awesome Klingon Bloodwine recipe from [Link to Book Entry #1016079] that deserves to be on the topmost shelf at Quark's.

Quill Award winner signature images: 2018 Honorable Mention, 2019, 2020, 2021, 2022, and 2023.



May 31, 2024 at 7:14am
#1071930
If you're not conscious, this article isn't for you.

The nature of consciousness, and how to enjoy it while you can
In his new book, Christof Koch views consciousness as a theorist and an aficionado.


Well, I guess he's solved the Hard Problem, then.

Now, with AI systems behaving in strikingly conscious-looking ways, it is more important than ever to get a handle on who and what is capable of experiencing life on a conscious level.

Oh, that's an easy one. Only I experience life on a conscious level. The rest of you, including the AIs, only mimic actual consciousness.

Solipsism makes everything simpler.

Koch, a physicist, neuroscientist, and former president of the Allen Institute for Brain Science, has spent his career hunting for the seat of consciousness, scouring the brain for physical footprints of subjective experience.

So, okay, someone with credentials, and not just some guy (like me). Doesn't mean he's right, mind you (pun intended), but it does mean it catches my interest.

It turns out that the posterior hot zone...

Posterior hot zone? Seriously? That's what you geeks are going with? You're just begging for it, aren't you? Okay, I'll bite: "Scarlett Johansson has a gorgeous posterior hot zone."

(In reality, I don't find asses to be attractive. But that's never stopped me from making jokes.)

Seriously, though, shouldn't they have looked up those words in their handy Latin dictionaries and called it that, like they do with most chunks of anatomy? Google tells me it's "calidum zona," because I haven't had an actual Latin course in 40 years.

Moving on...

...a region in the back of the neocortex, is intricately connected to self-awareness and experiences of sound, sight, and touch.

This ties in with what I believed to be the reason for consciousness: the nervous system had to evolve in such a way as to integrate sensory experiences, and those mechanisms got hijacked into "awareness." But I'm just some guy, so you shouldn't take that seriously.

Dense networks of neocortical neurons in this area connect in a looped configuration; output signals feedback into input neurons, allowing the posterior hot zone...

Snort.

...to influence its own behavior. And herein, Koch claims, lies the key to consciousness.

Makes sense, sure, but has he, or anyone else, done the science to back that up?

This declaration matches the experimental evidence Koch presents in Chapter 6: Injuries to the cerebellum do not eliminate a person’s awareness of themselves in relation to the outside world.

Okay, so there is some support.

His impeccably written descriptions are peppered with references to philosophers, writers, musicians, and psychologists—Albert Camus, Viktor Frankl, Richard Wagner, and Lewis Carroll all make appearances, adding richness and relatability to the narrative.

I mean, that's probably good writing; I'm not sure it's good science. As this is basically a book ad, though, I can cope.

The takeaway from the first half of Then I Am Myself the World is that IIT might offer the best-fit explanation for consciousness—a view, it’s worth mentioning, that is highly contested by many other neuroscientists.

Good. That's how science gets done.

Koch discusses transformative states of consciousness in the second half of his book, including near-death, psychedelic, and mystical experiences.

Aaaaaand that's not.

He also discusses the expansive benefits of sustained exercise—drawing upon his personal experiences as a bicyclist and rock climber—through which a person can enter “the zone.”

The zone? You're telling us how to enter the posterior hot zone?

Koch suggests that exercise, meditation, and the occasional guided psychedelic might be beneficial to many people.

Jokes aside, he's not the first scientist to come up with that nugget. Timothy Leary comes to mind, though it's arguable whether psychologists are real scientists.

Oh, and no, he didn't solve the Hard Problem of anything except "how to market your book." Nevertheless, I found this review/ad interesting enough to share. Even if he is talking out of his posterior hot zone.
May 30, 2024 at 8:09am
#1071890
From Cracked, an example of how we can understand and not understand science at the same time. Kind of a superposition of states, like an unobserved particle. See? I can science metaphor. It's wrong, but I can do it.

4 Scientists Who Only Added More Mystery to the World
Their problem-solving skills are only rivaled by their problem-creation skills


The misunderstanding is that scientists solve problems. I mean, sure, you get some answers, but those answers always lead to more questions. This is good, though; it's job security.

You want actual problems solved? That's what engineers are for.

There’s a bit of an erroneous belief that in order to become a famous scientist, you have to solve problems.

At least the author admits that it's a misunderstanding.

However, there’s another, arguably vastly more annoying way to get your name on an enduring thought. That’s to come up with a brand new intellectual mess for all the other scientists to have to try to figure out.

Like I said. Job security.

4. Thomas Young

I will admit that I've either never heard of this individual, or forgot that I did.

In 1801, Thomas Young disagreed with the popular belief that light was made up of particles. He believed that light was, in fact, a wave, and so he cooked up an experiment to prove it. He cut two slits in a sheet of metal and shone light through them.

Oh, yeah, the double-slit experiment. I've certainly heard of that. Hell, in physics lab in college, we performed it, only we used lasers (which hadn't been invented yet in 1801). It's clearly more famous than its creator.

Young set out to perform a simple experiment with light, and ended up creating what would be called “the central mystery of quantum mechanics.”

Sometimes, complex questions have simple answers. This is a case of the opposite. The article goes on to explain exactly why, and, miraculously, it conforms with my prior knowledge (that is, I'm sure an actual physicist could pick it apart, but for a comedy site, it's remarkably accurate).

3. Fritz Zwicky

This one, I'd heard of.

You might not know the delightfully named Fritz Zwicky, but you have heard the two words he coined in combination: dark matter.

Thus also providing job security for science fiction writers who are free to give it all kinds of magical properties.

Dark matter, which is — keep in mind that as an art major, I am fighting for my life here — matter that contains mass but emits no light and therefore cannot be observed, was his best, confident attempt at making some very nonsensical measurements make sense.

I gotta say, I'm impressed that an art major got so many things right. No disrespect intended to art majors, but they're not known for understanding physics. On the flip side, I have some small understanding of and education in physics, but I can't do art to save my life, so it all balances out.

Anyway, in my own amateurish way, I tend to see the concept of "dark matter" as a placeholder, a concept expected to have certain properties. Kind of like the luminiferous ether proposed before we more fully understood the nature of light (as per #4 above). When we finally figure it out, I'd predict that "dark matter" will be an historical relic, like luminiferous ether or the humour theory of medicine.

2. Enrico Fermi

You know this guy developed other stuff, right? Like, his famous "paradox" (which I've insisted in the past is not an actual paradox) was kind of a footnote to an incredibly productive career? And that he has a whole class of subatomic particles named in his honor?

Honestly, I think if he hadn't been so prolific, no one would have paid attention to the "paradox."

Nobody was arguing that the question “does extraterrestrial life exist” was too easy to answer. Yet, a man named Enrico Fermi decided to add another layer of unsettling confusion to that little gray layer cake, just in case anyone was feeling they had a good handle on it. Even worse, he reportedly rattled off his new addition casually at lunch, and every scientist since has been dealing with his bullshit. Bullshit that’s most commonly referred to as the “Fermi Paradox.”

It's nice to see my opinion on the matter confirmed, even if it is by an art major writing on a dick joke site.

The article, again, does a good job explaining the questionable conundrum, so I won't repeat it here.

1. The Guy Who Invented Mystery Flavor Airheads

That wasn't a scientist. That was a marketer who probably got ordered to find something to do with the excess Airhead slurry left over after batches had been produced.

And we don't even know if it was a guy.
May 29, 2024 at 7:25am
#1071843
Well, if it's from MIT Press Reader, surely it has the weight of scientific research backing it up, right?

The Time Hack Everyone Should Know
Much like Dorothy discovers at the end of “The Wizard of Oz,” the key to hacking time is a tool we’ve had all along: Choice.


But then you won't get to meet friends, have adventures, and defeat angry old women. Unless you spend your time playing D&D.

I’m in a complicated relationship with my phone.

Oh, dear gods, this is going to be a screen rant, isn't it? FFS. Another thing we're all Doing Wrong.

So much so that I’ve never used the screen time function, choosing to live in denial rather than dealing with the hard truths of our relationship.

Or you could, and I know this is a novel concept here... just not worry about it.

Imagine my horror then, when my 14-year-old son surreptitiously turned it on and started reading off my statistics from the previous week.

It's your own damn fault for a) reproducing and b) not securing your phone.

We all know that time is our most precious resource: It’s the one thing money cannot buy.

Glib, but demonstrably false on two counts. One, the Beatles weren't lying about "money can't buy me love" (though it can buy beer, which is close enough). Two, if you manage to accumulate enough money, invested wisely, to live off the interest... well, then, you can quit working (also known as retirement) and thus use it to buy time.

And with smartphones in everyone’s pocket these days, we’ve never been more able to track how we use every minute of it.

Wrong pronoun. Should have been "they've," not "we've."

By pressing a button or downloading an app, we can track the time we spend exercising, sleeping, and even scrolling through our social media feeds.

That information is not there for your benefit or neurotic attention; it's there so that companies can track trends.

All of this reads like the "TV will rot your brain" rants from a few decades ago, or the "comic books will rot your brain" rants from before then, or the "radio will rot your brain" rants from before then. Hell, I'm willing to bet that as soon as someone invented fire, someone else was like "That stuff makes life too easy. Kids these days, so lazy!"

Take, for example, the American Time Use Survey.

Taking surveys is one way to waste time, I suppose.

The Bureau of Labor Statistics has been collecting data on a variety of time use markers for almost 20 years.

So they can say things like "Clearly, Americans have too much free time. Let's find ways to work them harder!"

According to their 2020 findings, the average American has enough leisure time to fit in lots of healthy and life-enriching activities: 5.5 hours per day to be exact.

Okaaaaay. Here we go.

The "average American" possesses slightly fewer than 2 legs. The "average American" sports fewer than 10 toes, and fewer than 10 fingers. I can't be arsed to find the precise mean, but considering that amputees exist, it's indisputable that the average (the mean) is lower than the most common number (2 in the case of legs, 10 in the case of fingers or toes). Probably around 1.98 legs and 9.5 fingers/toes. Maybe not those exact numbers, so don't quote me on that, but it's very likely some decimal number close to but less than 2 or 10. What matters in that case isn't the mean, but the mode. And, of course, that people who don't conform to the mode are accommodated in public, but that's not relevant to this discussion.

My point is that the "average" here is probably misleading, and meaningless. If two people are working full-time for every one who is not, that 5.5 hours may be low for the latter and high for the former. Plus, even full-time workers usually get weekends and holidays; I'm not sure if that's included in the average, but you might have a good 32 hours of leisure time (not working or sleeping) on the weekends and 0 on weekdays.

I've gone through the calculations before of a typical (not "average") full-time worker, and deduced that the actual number, when removing times for things like sleep, work, getting ready for work, commuting, dealing with household chores, etc., is closer to 1 hour on a work day. And that's assuming one job, which we just can't.

And that's not even getting into the bigger, less pedantic point, which is that as soon as you add "healthy and life-enriching activities" to your day, that time no longer counts as leisure time, because you're doing stuff you feel like you have to do instead of stuff you want to do.

Or that for some of us, screen time is "life-enriching." I've been using some variant of a computer every day since February of 1979. Yes, even before the internet. I played games, sure, but I also learned programming basics and other stuff. And then the internet came along and, well, here I am.

Pant. Pant. Back to the article.

But the survey also showed that we often eschew healthy, happiness-driving activities for passive, screen-based ones.

Newsflash: how we achieve happiness is very personal. I get that some people who use computers all day at work may feel the need to disconnect in downtime (I was never one of those people; I used computers for work and play.) I also get that some of those who do manual labor might want to crack open a beer and watch spurts or whatever. It's different for all of us.

The average American spends 22 minutes a day participating in sports, exercise, and recreation; 32 minutes per day socializing or communicating; and 26 minutes per day relaxing or thinking. In contrast, they spend 211 minutes per day watching TV. That’s 2.6 times more time watching TV than exercising, relaxing, and socializing combined.
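(As a quick check of that last figure, using only the numbers quoted above: 22 + 32 + 26 = 80 minutes, and 211 / 80 ≈ 2.6, so the multiplier itself is at least arithmetically sound.)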

Without reiterating my "average" rant from above, I'll just note that this is presented as a Bad Thing and Something That You're Doing Wrong.

Studies have shown that heavy TV watching is related to lower life satisfaction, and that people who use social media the most are also the most socially isolated.

Assuming those studies are even valid, how much of that is because you're watching shows or spurts, versus how the inevitable barrage of ads designed to make us feel worthless and incomplete is making us feel worthless and incomplete?

The article goes on to play more funny games with averages, and ends with some not-so-bad psychology that, while I call it not-so-bad, still won't work for everyone because, just like not everyone has 2 legs, not everyone responds in the same way to the same psychological mind-games.

For instance, the article seems to make a distinction between socializing in person and socializing over the internet. I've been at this for long enough that I don't make that distinction.

In the end, if you feel like you're too focused on something, whether it's exercise or screen time, change it if you want. If you don't, then you didn't really want to; you just thought you wanted to.

Relax. Have a beer (or whatever equivalent you use to say "fuck it" to the world). And don't let other people dictate how you should be spending your time... unless they're paying you.


May 28, 2024 at 1:00pm
#1071808
Short entry today. I had a dental checkup this morning, which took up way too much of the morning, and I just don't feel like doing a longer entry.

I will say this, though:

Afterwards, afflicted with a powerful thirst, I stopped at the nearby 7-11 to get some ice-cold Crack Zero for the purposes of slaking said thirst while at the same time getting the cloying taste of tooth polish out of my mouth.

The convenience store was, inconveniently, infested with an outbreak of teen. Kids running around, pushing each other, being loud, discussing their latest trivial but earth-shattering teen drama, flirting, and generally both ignoring and getting in the way of oldies like me who just wanted to do some commerce.

I started to prepare a "kids these days" rant in my head, but then, just as my fist closed over the smooth plastic of an ice-cold Crack Zero, the realization hit me just like some kid had hit the door as I held it open:

Given the way my friends and I acted in 7-11s when we were that age, I was experiencing no more and no less than long-delayed karmic retribution.

Okay, Universe. Okay. You win this time.
May 27, 2024 at 10:22am
#1071760
One reason I think science confuses people is that nutrition science, in particular, is a hot mess that can't seem to make up its mind about anything. Case in point, from Inverse (reprinted from The Conversation):

The Ultraprocessed Food Hype Is Masking This Other Major Health Predictor, 30 Years of Data Reveal
Much of the recent evidence related to ultra-processed foods tell us what we already knew.


Even I gave up on trusting nutrition science's results after several iterations of eggs being good, then bad, then good, then bad, then good, then bad, then good, etc. Not to mention all the questionable studies funded by people who push for a particular result. Nowadays, I mostly just eat for enjoyment; let other people get neurotic about what's healthy or not this week.

I recognize that part of the problem here is that biology is fiendishly complex, and has all sorts of mechanisms to return to equilibrium after a push away from it, some of which we don't fully understand. Another part of the problem is that it can be extraordinarily difficult to remove confounding variables from a nutrition experiment. And a big part of the problem is breathless reporting, eager to announce the Next Big Bad Thing or explain why the Last Big Bad Thing wasn't so bad after all—the latter of which seems to be the case for today's article.

In recent years, there’s been increasing hype about the potential health risks associated with so-called “ultra-processed” foods.

There's always a bogeyman, because we can't just let people enjoy their meals in peace. First it was fats (turns out we need some), then carbs (turns out we need some of those too), then gluten (still a source of grave misunderstandings) or whatever. Now, apparently it's ultra-processing, which I'm sure is defined somewhere.

But new evidence published this week found not all “ultra-processed” foods are linked to poor health. That includes the mass-produced wholegrain bread you buy from the supermarket.

Like I said, carbs went from good to bad to maybe good. The "maybe" is probably a matter of high vs. low glycemic index. Meaning, for example, whole wheat bread is probably better for you than white bread, which I did a whole entry about a couple of weeks ago.

Ultra-processed foods are industrially produced using a variety of processing techniques. They typically include ingredients that can’t be found in a home kitchen, such as preservatives, emulsifiers, sweeteners, and/or artificial colors.

I told you it was defined somewhere. Well, sort of. I'm pretty sure we all have things in those categories in our home kitchens. Salt, for example, is a preservative. Egg protein is an emulsifier (a thing that keeps oil and water from separating as nature intended). And so on.

Common examples of ultra-processed foods include packaged chips, flavored yogurts, soft drinks, sausages, and mass-produced packaged wholegrain bread.

In other words, anything that actually tastes good.

The new paper just published used 30 years of data from two large US cohort studies to evaluate the relationship between ultra-processed food consumption and long-term health. The study tried to disentangle the effects of the manufacturing process itself from the nutrient profile of foods.

I'm not reading that, so I'm not going to comment on the validity of the study. I'll just point out that this is just throwing another osis into our collective neuroses.

The study found a small increase in the risk of early death with higher ultra-processed food consumption.

The obvious question here is: correlation or causation?

Existing national dietary guidelines have been developed and refined based on decades of nutrition evidence.

But mostly based on pressure from lobbyists who work for manufacturers of ultra-processed foods, so that sentence makes me laugh. Nutrition science is convoluted enough as it is; throw in government bullshit, and you can see why I've given up.

Which doesn't mean I'm going to snack on Doritos all day. Just that I'm done worrying about every little thing that goes into my food and drink.
May 26, 2024 at 8:30am
#1071700
Way back in May of 2008, I did an early version of my "comment on a link" thing, combined with a mini-rant: "Big Bang".

As that was 16 years ago (I can math), it's not surprising that the link is no longer functional, but I think we can all get the general idea: someone found a stash of expired fireworks, and authorities decided to blow them up using plastic explosive.

Looking back on this now, I'd still take the attitude that they only did it because they could. From what I've heard since then, the proper and safe way to dispose of stale firebangers is to immerse them in water and let them soak for a good long time. But where's the fun in being proper and safe? I may be older and (arguably) wiser now than I was when I found that link, and I may have turned into the neighbor who moans about other people in the neighborhood illegally making booms and whistles around every Fourth of July, but that doesn't mean there isn't, somewhere inside me, Kid Me, who would definitely have wanted to see a bunch of old fireworks get blasted by 30 pounds of C-4.

The only difference is nowadays, I'd be more careful about where I did so.

As for the rant, it was about Mother's Day, which apparently was the day of that entry:

I thought about writing something about Mothers' Day here, but what's the point?

This year, I completely ignored it.

My mom died nine years ago this summer...

Obviously, that's 25 years ago now. That's a longer span of time than I spent living with her.

...and why the hell is there a special day reserved for people who managed to reproduce?

In fairness, there's a special day reserved for just about anything. Today, for example, is Sally Ride Day, celebrating the birthday and legacy of the first female US astronaut (who had one of the most awesome names in the history of names, thanks to a certain Wilson Pickett song). Now, there's a woman who accomplished something.

Hell, hamsters can do that. How about reserving a day for those who care enough about the planet and its other life forms that we did not breed like rabbits?

Note to self: stop mixing rodent and lagomorph metaphors.

And that goes double for Father's Day.

Lest anyone labor under the misconception that my rant was reserved for females alone.

Of course, at the end of the entry, I clarified that the whole rant was satire:

Oh, about the first paragraph? I'm just kidding. Mothers - and fathers - should definitely be acknowledged, not for breeding, but for bringing you up right.

If, that is, they did so.

About the only thing I truly regret about that long-ago entry was not being more explicit in tying the "Big Bang" of the fireworks disposal with the "Big Bang" usually associated with the conception of offspring. It's even possible that I didn't make the connection back then, but now, being older and (arguably) more foolish, it jumped right out at me like a rabbit. Or a hamster.
May 25, 2024 at 10:31am
#1071672
Fair warning: this article, from Mental Floss, never really answers the headline's question. I'm still posting it because it may be of interest to the readers and writers here.

What Was the First Banned Book in History?
Book bans are hardly a new practice.


For a country claiming to abide by freedom of speech, we sure do love our book bans and censorship. I suppose it's just human nature: it's not enough to get your own ideas out there; sometimes, you also have to suppress those of others.

Obviously, the US isn't the only place to ban books, but in our case, it's the hypocrisy that leaves me shaking my head.

There’s no more potent evidence of the power of the written word than the fact people have historically looked to ban them.

In my opinion, if your ideas can't hold up to scrutiny and argument, then they're not great ideas to begin with. Also see: blasphemy laws.

Cultural norms, politics, personal beliefs, school policy, and other factors can all conspire to deem a book too incendiary to circulate in America.

This article is from October, so I don't remember which specific book-banning event spurred its writing. There have been so many. As for school policies, though, there's a difference between outright censorship and desiring age-appropriate materials. Some "censorship" arguments actually boil down to differences of opinion over what's age-appropriate and what isn't. Obviously, we make those distinctions right here in this community.

I've seen (and participated in) plenty of discussions about age-appropriateness, in the context of content ratings here. There are legitimate differences of opinion there. So when I talk about book bans, it's usually about people who try to keep grown adults from making their own decisions about what to read or not.

But just how far back does this policy of thinly-veiled thought control go?

If I had to guess, I'd postulate that book-banning is as old as books. Some Sumerian probably cuneiformed a dirty joke into clay, and other Sumerians got offended and tried to burn the clay, which obviously would have had the opposite of the intended effect, leaving the joke literally etched in stone.

Shattering works much better on clay tablets.

As is often the case when you look back into history, there’s more than one possible answer. But one of the leading contenders has a fairly predictable culprit: the Puritans.

Ah, yes, that marginalized group who fled religious persecution in England so they could practice it themselves in America.

In 1637, a man named Thomas Morton published a book titled New English Canaan.

So, potentially the first banned book in the Americas, but that could hardly be the first banned book, period. Incidentally, I did a whole blog entry on that author a few years ago: "Vile, Virile Vices".

His book was perceived as an all-out attack on Puritan morality, so they banned it—and effectively banned Morton, too.

The real miracle here is that their descendants ended up signing on to the whole "freedom of speech and religion" thing when they grew up.

You can go further back to find more startling examples of banned books, though the definition would have to expand to include the execution of authors.

Yeah, writing may be a fairly safe activity now, free of the occupational hazards of, say, firefighting, but it hasn't always been the case.

In 35 CE, Roman emperor Caligula—certainly a man of strong moral stuff if ever there was one—discouraged people from reading Homer’s The Odyssey because it could give them a taste of what it meant to be free.

A lot of the stuff you've heard about Caligula might have been political bickering. It would be like if the only surviving history of Kennedy's presidency was written by Nixon. And the infamous movie with his name on it wasn't exactly a documentary.

Most telling, though, there's a huge difference between being "discouraged" from reading something, and having that something banned or burned.

What book bans and censors attempt to do in the curtailing of reading is often futile.

Here is, in my view anyway, the most insidious thing about censorship: the censor has either read the book, or has not read the book. (Reading enough of the book to know you don't like it counts as reading it, in this argument.) In the latter case, it's ridiculous to ban something you haven't even read. In the former, you're setting yourself up as an arbiter, someone more qualified to make that decision than, say, me. That's also ridiculous.

If your ideas are sound, they'll withstand argument. If not, and you try to do an end-run around public discourse by banning opposing viewpoints, well, that might just make you a tyrant. And at least here in the US, whenever someone bans a book, well... that's publicity for the book, isn't it?
May 24, 2024 at 7:04am
#1071617
As I normally travel alone, today's article seemed relevant to my interests. (Source is Condé Nast Traveller, if it matters.)

The golden rules of solo travel
We ask our editors and favourite solo travellers for their savviest tips and tricks


Apparently, there's a US and UK version of that outlet. The spelling gives away that it's from the latter. But as it's about travel, I don't think the country of origin matters much.

The joys of travelling solo are endless.

I wouldn't say "endless." Just "ending later than if you have someone with you to eventually argue with."

There is something truly freeing about exploring new places alone – you can go where you please, eat when you want, and have uninterrupted quality time with yourself.

No sleep-shaming, no pressure to fit too many things into one day, no bargaining about "If we do X today, I want to do Y tomorrow," etc.

The interest in solo travel has been slowly rising for a while, but new data from UK-based travel organisation ABTA shows that 16 per cent of travellers went on holiday by themselves in 2023, a five per cent increase from the previous 12-month period.

I wonder if there had been a global problem that made people tired of always seeing the same other people all the time.

But, if you’re not a seasoned solo traveller, it can be a daunting prospect. In an age of constant connectivity, the idea of being alone for an extended period of time is a convoluted one.

The only way I'd be "alone for an extended period of time" would be if I went hiking by myself in the wilderness, which is not only a bad idea to begin with (though I've done it), but it would involve being *shudder* outdoors.

Below, we spoke to travellers who frequently book solo trips about their golden rules for travelling alone.

"Rules?" Hey, I travel alone so no one gives me "rules."

Dining alone isn’t weird
For most people, the thought of dining alone is one of the biggest barriers to travelling solo.


Yeah, I just can't relate to that. If I'm alone, I can focus on the things that really matter: the dining and drinking experience. Besides, no one is there to tell me they just can't stand whatever cuisine I've decided to try.

Fake it til you make it
Most people feel nervous about meeting new people, and introverts especially can struggle to make the first move when arriving in a new place.


I'm more introverted than extroverted, but my only apprehension involves language and cultural barriers.

Book counter dining at restaurants
If you are someone who does feel uncomfortable about dining alone, opt for a bar or counter seat.


While I don't travel internationally as much as I'd like, here in the US at least, the bar usually doesn't require reservations or other planning ahead, apart from maybe figuring out a time to go when it's not too crowded.

Plan around cultural events
Arriving at a destination just as the locals are gearing up for an important cultural event can be an incredible way to immerse yourself straight away.


It's also an incredible way to have everything crowded and sold out. Hell no. Give me off-peak travel any day.

Exception: my desire to visit Scotland during the Islay Festival for the best whiskey in the world.

Build in group activities
Booking tours and group events is a great way to meet other travellers. Most hostels have a list of activities available for guests to sign up for, and if not, then there are walking tours or live music events at local bars.


Honestly, I'm torn about this bit, personally. First of all, I'm not interested in hostels, but let's leave that aside for now. And while I love music and bars, I despise music events at bars, because I can't hear the bartender.

My passport expires in 2026. I've never used this incarnation of it, because, well, you know. I want to use it at least once, and I don't mean crossing the border into Canada. As the person I was planning to go to Belgium with has other priorities now, I'll be going alone, which is fine. Maybe France, first, though... but not until after the Olympics (see above re: crowds).

Now I just have to get on my ass and make the plans.
May 23, 2024 at 10:01am
#1071563
Appropriately from Atlas Obscura, the tale of a great world explorer:

The First Viking Woman to Sail to America Was a Legendary Traveler
Back when the Icelanders called a part of Canada the “land of grapes.”


Now, it's possible that we shouldn't be using the word "Viking" like that, as I mentioned long ago here: "The Once and Future Viking". But I'm just quoting the article here.

Her full name, in modern Icelandic, is Guðríður víðförla Þorbjarnardóttir—Gudrid the Far-Traveled, daughter of Thorbjorn.

I'll just note here that some of those weird-to-us letters used to be in English, too.

She was born around 985 AD on the Snæfellsnes peninsula in western Iceland and died around 1050 AD at Glaumbær in northern Iceland.

Just looking at that, one might conclude that she didn't travel very far at all.

What little we know of her comes from the Saga of Erik the Red and the Saga of the Greenlanders. These are collectively known as the Vinland Sagas, as they describe the Viking exploration and attempted settlement of North America—part of which the explorers called “Vinland,” after the wild grapes that grew there.

A few entries ago, some article said that meant "Land of Wine," which may be inaccurate, but I like it better anyway.

Also, they freely mix fact with fiction. Their pages crawl with dragons, trolls, and other things supernatural.

How else are you going to scare the kiddies into behaving?

But the central tenet of the sagas has been proven by archaeology: In the 1960s, the remains of a Viking outpost were dug up at L’Anse aux Meadows, on the northern tip of Newfoundland.

I wouldn't say "proven." "Supported," maybe. It's not news anymore that Scandinavians made it to North America long before Italians did.

Among the rubble was found a spindle, used for spinning yarn, which was typical women’s work and thus possibly handled by Gudrid herself.

Right, because Gudrid sounds like the kind of chick who would do "typical women's work."

And in the Saga of the Greenlanders, Gudrid is called “a woman of striking appearance and wise as well, who knew how to behave among strangers.” That’s a trait that may have come in handy when dealing with the Native tribes of North America, whom the other Vikings dismissively called skrælings (“weaklings,” “barbarians”).

As I've noted before, who's the "barbarian" depends on who you're asking.

The article continues with a summary of the sagas involving Gudrid, and while I'm sure the originals (well, the original written-down versions, I mean) would be fascinating, the Cliff's Notes here seem to provide the pertinent details.

Another story from the sagas that has mystified readers for centuries because it mentions two “Gudrids” and has traditionally been dismissed as a ghost story could in fact be the earliest recorded conversation between a European and an American.

And no, they didn't discuss trade agreements or war. Or much of anything, considering they apparently didn't have time to learn each other's languages.

There's a lot more at the article, but as it notes, her relatives Erik and Leif got all the PR, but they didn't travel alone.
May 22, 2024 at 9:26am
#1071517
I'm usually a law-abiding citizen. Or, well, I try to be; sometimes it seems laws are designed so that if They want to get you for something, They can find a reason.

But some laws are fundamentally unjust, and need to be broken. Cracked provides some examples in:

6 Loopholes People Used to Break the Law and Get Drunk
Technically, if you’re on a train, everything is legal


6. Pay to See the Blind Pig

During Prohibition, a bar could not legally operate and sell alcohol. No one could legally sell alcohol (without receiving special exemptions, such as for medicinal use), or manufacture alcohol, or transport alcohol. The law didn’t ban drinking alcohol, however, or handing the stuff to someone else without charging them any money.


It could, of course, be argued that it's hard to drink alcohol if one is prohibited from buying, making, or moving it. But, at least here in the US, lawyers thrive on technicalities.

But suppose an establishment were to hand out a drink for free and charge customers for something else? Say, they charge a customer for some entertainment — for instance, the chance to look at a marvelous animal. As for the drink the barman serves the customer, well, no one’s purchasing that.

Not mentioned: how the pig got blind. Look, I'm all for eating the tasty critters, but mutilating them doesn't fly. This was nearly a hundred years ago, though, and people didn't generally think that way.

This idea is why one alternate name for a speakeasy is a “blind pig.” And if you’re wondering why the police would ever be fooled by this, know that plenty of police didn’t really care about Prohibition laws and were possibly drinking right along with everyone else.

Which may explain why some cops love to do drug busts: it provides them with free product.

5. Instructions on How to Absolutely Not Make Wine

They had this sort of thing for beer, too.

Individual families were still allowed to make a limited amount of wine, but if you were to sell people the raw materials for making wine, along with instructions, you might find yourself in trouble.

This would be like selling fertilizer with instructions on how to ensure it never becomes a bomb. Except there's good reason to keep people from making bombs.

“After dissolving the brick in a gallon of water, do not place the liquid in a jug away in the cupboard for twenty days, because then it would turn to wine.”

These days, of course, the surest way to get a certain group of people to do something is to tell them not to do it.

4. Stick Everyone on a Train

Even when alcohol is legal in the country, you need a license to sell it. One British gin maker, Tapling & Meegan Distilling, dutifully applied for this license, but it was taking too long to get approved. So they did the only reasonable thing and turned to steampunk.


Now, there's a story idea.

Within all the country’s many alcohol regulations is a line of law saying the usual license requirements do not apply to trains in motion.

The downside of this is obvious: motion sickness exacerbated by drunkenness.

3. Let’s Call Beer Something Else

Complex regulations define exactly what beverages are, which is why you generally cannot put lemonade in a bottle and sell it as tomato juice. In Texas, they had a rule about beer: It could not contain more than 4 percent alcohol by volume.


Ah, yes, the old "name a thing something else to get around regulations" trick. No wonder fermented beverage categories can be misleading.

A lager that appears to be beer by most conventional definitions would be labeled, in fine print, “In Texas, malt liquor.”

I'm pretty sure some states still have ABV maximums for beer. As the article notes, Texas isn't one of them. But it's not a good look for a state that prides itself on limiting government interference in people's lives. Which, incidentally, they clearly don't do.

2. Turning Nightclubs into Pop-up Restaurants

This next law lasted from 1935 to 2000, in Ireland, a place not entirely unfamiliar with alcohol.


Now that's a cheap shot. Pun intended.

Establishments were not allowed to serve alcohol at night unless they also served a meal.

This sort of thing has been the law here in the US, off and on, depending on where you are. Hell, even my state has a version of the rule, which is why you'll technically never find a bar in Virginia; only restaurants that happen to serve booze.

The dish of choice at these clubs? Curry.

Yeah, that couldn't have ended well.

1. The Inedible Sandwich

That Irish policy hearkens to an older and famous law from New York. Way back in 1896, the same time that they made the controversial decision to raise the drinking age from 16 to 18, the state passed a law saying bars couldn’t serve alcohol on Sundays.


Like I said.

Bars, which served no actual food, qualified for the exemption by offering a sandwich. Not sandwiches, but a singular sandwich that a bar would pass from customer to customer without anyone eating it. This was named the “Raines sandwich,” after John Raines, the senator who wrote the law.

I'd heard about this loophole before, of course. It still amuses me.

The takeaway here is that, if you're clever, you can find a way around unjust laws without flagrantly breaking them. And I approve.
May 21, 2024 at 7:17am
#1071456
Here's a book ad disguised as a self-help article from GQ for me to snark on.

How to Kick Your Bad Habits (And Why That's More Important Than You Think)
Plus: Why goals don’t work, and why your to-do list is wrecking your ability to do anything.


Hey. HEY! You're doing it wrong. Send me money and I'll tell you how to do it right! Until the next guru comes along to tell you that you're still doing it wrong.

A few years ago, we reached peak you're-doing-it-wrong with a video on how you're opening a banana from the wrong end. Since then, I've felt free to ignore any attempts to convince me I'm doing anything the wrong way.

You might not spend much time thinking about your habits. They are, after all, mindless.

No, they're really not. They may be comfortable, but they're not mindless.

James Clear, on the other hand, has made something of a living on it.

Lots of people make their living preaching that "you're wrong and the only way to be right is to follow me and give me money." Some of them are literally called preachers. It doesn't mean they're right.

One of his big takeaways is a bit unsettling when you consider all the habits you’ve sworn to kick (but haven’t), and all the habits you’ve really been meaning to start (but haven’t): habits, multiplied by time, equal the person you eventually become.

I get around that by not swearing to kick or start habits. Do this, combined with accepting yourself as you are, and your life becomes simpler and happier. You won't even be tempted to give what little money you have to people who are trying to convince you that there's something wrong with you and only they can fix it.

That advice, by the way, is 100% free.

In a year, the difference between a person who does 10 push ups a day and a person who eats one bag of Doritos a day is that one person has done 3650 push ups and one person is sad.

No, the difference is that one person has sore arms and pecs, and the other person is following their bliss.

Besides, this creates the illusion of a dichotomy: while it might be difficult to eat a bag of Doritos while doing push-ups, there's absolutely nothing that says you can't do both in one day.

The rest of the articlead is in interview format, and I won't quote most of it. But at one point, the interviewer poses:

But if you think about it, I feel like we're so often controlled by sort of nudges that we aren't even conscious of.

Yes, and one of those nudges is ads.

The way I see it, if you think you want to do something (or refrain from doing something), and you continue to do (or not do) it anyway, then you don't actually want to do it, and you should respect that.

Now... this may seem hypocritical of me, since I've been on a 4+ year daily streak in both blogging and language learning. Those are habits I picked up, and sometimes I go out of my way to practice them. I'm not saying we shouldn't try new things, or improve ourselves. It's the pressure that we're somehow lacking, and we should feel bad about ourselves, that I object to.

So I make it a habit to identify that pressure when it presents itself.
May 20, 2024 at 10:50am
#1071425
Speaking of science (yesterday's entry), today, the random numbers landed on this Big Think piece on the perception of science:

4 pervasive myths that cause us to abandon science
It’s not a gambit. It’s not fraud. It’s not driven by opinion, prejudice, or bias. It’s not unchallengeable. And it’s more than facts alone.


It's safe to say that I have a few issues with the wording in some places here.

When you think about what science actually is, how do you conceive of it?

I always picture Doc Brown's workshop in Back to the Future. Yes, of course I know it's wrong, but I'd rather be amused than right.

Do you do what most people do, and default to what you learned in our classes in school, with a layer of cynicism and skepticism layered atop it?

Hey! Cynicism and skepticism are my friends!

That’s understandable, as many of us remember being taught “facts” in our science classes that later were shown to be not as they seemed at the time.

This was, of course, not limited to science classes.

It’s as though, somewhere along the lines, many of us were taught isolated facts about the world, and our ability to regurgitate those facts was used as a metric to measure how good we were at science.

I've said this before, but memorizing trivia is an entirely different skill set from doing science. I'm not knocking trivia here; there's something to be said for remembering stuff without having to pull out your phone and google it. Plus, it's useful at bars sometimes.

Many of those facts may have felt absurd; many of the experiments we performed didn’t give the results we were told they should give; many of our experiences didn’t line up with what was written in our textbooks.

That's partly because some facts are, compared to our everyday experience, absurd. I mean, come on, the flow of time changes depending on how fast or slow you're going? That defies common sense! And yet it's demonstrably true; hence why I distrust "common sense."

If that’s what our experience with science is, then why should we be expected to believe that whatever science “tells us” is actually true?

Science isn't edicts handed down from on high. You're thinking of religion, or maybe law.

The way out, perhaps, is to slay the four most common myths that agenda-driven advocates leverage to sow doubt about bona fide scientific knowledge.

Science, in actuality, is both a process and the full body of knowledge that we have, and it adds up to the best, most truthful understanding we have of the world at any given time.

I'd put more emphasis on the "process" part, myself.

Here are four common lines of thought that we often make to argue against scientific findings, and why each of them is fundamentally flawed.

The article proceeds to do just that. It's not exactly short, and I'm not going to quote a whole lot.

(Myth) 1.) Science is biased by who funds it.

Because there are numerous instances throughout history where various industries have published junk science that does indeed promote their agenda — for example, the tobacco industry has famously been caught manipulating research — many people believe that this translates into all science being untrustworthy, particularly wherever it’s funded by some entity they view as ethically questionable.


One such instance that I remember referencing in here is how the "chocolate is good for you" result came from studies funded by Willy Wonka and performed by noted Oompa Loompa scientists who were paid in chocolate. It happens. (Okay, I'm being funny, but you get the point.) They usually get caught. The point here is that it's not right to generalize that to all science.

But part of valid skepticism is to look at the motivations behind research.

To be sure, there really is fraudulent research that gets conducted all the time; it’s one of the main reasons papers either get retracted or wind up being published in unscrupulous, sham journals.

One of the worst things about that is that people will remember the fraudulent science, and ignore or be unaware of the retraction.

(Myth) 2.) Science is driven by public opinion.

It’s often been said that, “Science doesn’t care what you believe,” but the truth is that science doesn’t care what any humans believe, in general. The Universe, as best as we can measure it, obeys certain rules and yields certain outcomes when we test it under controlled experimental conditions. The reason scientists so often find themselves surprised is that you cannot know the outcome of a new experiment until you perform it. The results of these experiments and the knowledge that comes along with it is available to anyone who reproduces the experiment. The results are found in nature, and anyone’s opinion on a scientific matter that’s been decided by the evidence is moot.

I'm just leaving this here because it's a pretty good summary of stuff I've been saying.

I just have one major quibble. Well, one major and one minor.

Major: saying that "anyone’s opinion on a scientific matter that’s been decided by the evidence is moot" is a bit misleading, in my opinion. For example, the chocolate nonsense I referenced above: you can have an opinion on it, but that opinion needs to be based on philosophy, not science. Like, I know the study was more than questionable, but I'll eat dark chocolate anyway, simply because it's delicious and I'm hedonistic.

Minor: there are some interpretations of quantum physics that point to the idea that the observer, in this case us, does influence the outcome of an experiment. What's in question is the validity of those interpretations, and how much they affect macroscopic phenomena.

I'm not quoting the other two points; they're also at the link.

In conclusion, for me anyway, while results are often provisional, science is the best method we've come up with for approaching legitimate knowledge. It's kind of like... maps. I've linked Atlas Obscura many times in here; hopefully, through that or by other means, you've seen historical maps. What was unknown on a map produced in, say, the 16th century, is generally presented as a best guess, an incomplete shoreline, a blob, a dragon representing the unknown, or whatever. As exploration improved, maps started to be drawn with greater and greater accuracy. Now, thanks to science, we have satellites doing our cartography with high precision.

No map, however, can have perfect accuracy. Zoom out too far, and small but important features disappear. Zoom in too close, and you lose the big picture. Rivers and oceans have variable water levels, so shorelines are, at best, averages. The Earth changes, and geological features (usually slowly) move: rivers change course, continental plates shift, erosion affects shorelines, etc.

This is an analogy and, like most analogies, it's not a perfect one. But if I'm going to try to circumnavigate the planet (which, by the way, is definitely round), I'll trust Google Maps before Magellan's sketches.
May 19, 2024 at 9:44am
#1071368
My archaeological excavation today takes us all the way back to January of 2008, meaning the entry would be eligible to get a driver's license in most states: "Science"

It's fairly short, and, as you can tell from the title, presages the themes of some of my numerous subsequent entries.

Well, my creationism rant inspired some comments, but not nearly the shitstorm I feared.

This refers to the previous day's entry, which included a bunch of links and presaged the content of most of my numerous subsequent blog entries.

The rant in question was quite short, and I railed on a then-recent poll that said 47% of Americans reject the science of evolution. The implication is that most, if not all, of them do so on religious grounds.

I did want to make clear, though, that my rant about creation "science" is not about Christians in general - in other words, I'm not attacking faith, but opinion.

I try (and sometimes fail) to keep religion out of my discussions here.

I responded to someone in a private email a clarification that I'll paste here...

That message takes up the bulk of the entry, and then...

That's the last word on it from me - at least for now.

I'm glad I hedged that with "at least for now." Because it absolutely wasn't the last word, and even the last word I wrote on the topic won't be.

But let's see what Wired Magazine has to say:

While the raw link in the original entry is broken, I found it using a web search. Apparently they just changed the URL and kept the content, though there's a broken YouTube embed there.

While the article itself touches on a Presidential race that's ancient history at this point, the philosophical issues remain, and I continue to address them, even now, sixteen years later.
May 18, 2024 at 7:40am
#1071313
Now, here's a kid who's going to be subject to government scrutiny for the next few years.



After which they'll probably recruit him.

Often called the father of mathematics, Archimedes was one of the most famous inventors in ancient Greece, with some of his ideas and principles still in use today.

While there is no doubt that Archimedes made important contributions to mathematics, calling him "the father of mathematics" is rather insulting to earlier mathematicians, not to mention horrifically Eurocentric.

But one fabled device has left scientists speculating on its existence for hundreds of years — the death ray. Now, a middle schooler may have some answers.

Hey, when I was that age, I was fascinated by death rays, too. But this was a simpler time, and as far as I know, I wasn't subject to government surveillance for my hobbies.

Brenden Sener, 13, of London, Ontario, has won two gold medals and a London Public Library award for his minuscule version of the contraption — a supposed war weapon made up of a large array of mirrors designed to focus and aim sunlight on a target, such as a ship, and cause combustion — according to a paper published in the January issue of the Canadian Science Fair Journal.

Also, calling it a "death ray" is probably sensationalism. Most sources I've seen use "heat ray," but I think even that is misleading. Regardless, the point is that people have argued over whether Archie could have actually constructed the weapon using Greek technology of the time, or if perhaps it was speculation (similar to da Vinci's later musings about flying machines or whatever). Hell, René Descartes dismissed the idea.

There is no archaeological evidence that the contraption existed, as Sener noted in his paper, but many have tried to recreate the mechanism to see if the ancient invention could be feasible.

Absence of evidence isn't evidence of absence, of course. Oh, and no one, as far as I know, is saying the idea is unworkable; only that it might not have been feasible using tech available in the 3rd century BCE.

At the same time, evidence that it could have been built and used isn't the same thing as evidence that it was built and used.

The article describes his setup, and then:

Writing in the paper, Sener said he found these results to be “quite remarkable as it suggests that light is going in all directions and that the shape of the concave mirror focuses the light waves onto a single point.”

Well... that's a bit disappointing. Was CNN clickbaiting us with that headline? While this is certainly what I'd call a good middle school science fair project, it's not like we didn't already know that focusing light rays intensifies the resulting temperature. Hell, all this is, is the reflection version of using a magnifying glass to burn ants.
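
To put the "focusing intensifies" point in concrete terms, here's a rough back-of-the-envelope sketch in Python. Every number in it is my own assumption for illustration; none of it comes from Sener's paper, CNN, or any ancient source.

# Back-of-envelope solar concentration. All values are illustrative assumptions.
SOLAR_IRRADIANCE = 1000.0  # W per square meter, roughly full midday sun at the surface
REFLECTIVITY = 0.6         # assumed average for polished bronze mirrors

def concentrated_power(num_mirrors, mirror_area_m2):
    """Watts of reflected sunlight delivered to the target spot."""
    return num_mirrors * mirror_area_m2 * SOLAR_IRRADIANCE * REFLECTIVITY

def concentration_ratio(num_mirrors, mirror_area_m2, spot_area_m2):
    """How many 'suns' land on the spot, compared to unconcentrated sunlight."""
    return (num_mirrors * mirror_area_m2 * REFLECTIVITY) / spot_area_m2

# Hypothetical scenario: 100 soldiers each aim a 1 square-meter mirror at a 1 square-meter patch of hull.
print(concentrated_power(100, 1.0))        # 60000.0 watts
print(concentration_ratio(100, 1.0, 1.0))  # 60.0 "suns"

Whether sixty suns' worth of light actually ignites a damp, moving hull at range, with imperfect aim, is precisely the part people keep arguing about.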

While the experiment doesn’t offer “anything significantly new to the scientific literature … his findings were a nice confirmation of the first law of thermodynamics,” which states energy or heat can be transferred, Ho said.

Blink. Blink.
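
For anyone blinking along at home: the first law of thermodynamics is conservation of energy, not merely "energy or heat can be transferred." In its usual form, ΔU = Q - W, where ΔU is the change in a system's internal energy, Q is the heat added to it, and W is the work it does on its surroundings.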

Look, I'm not trying to diminish the kid's accomplishment. Hell, I couldn't ever come up with an idea for a science project at that age. It's not that I wasn't interested; it's that I wasn't creative enough. Still not. But it proves nothing about Archie's heat ray.
May 17, 2024 at 10:59am
#1071264
Looks like someone gave Cracked a makeover. The most obvious change is that their logo no longer resembles that of the second-rate Mad ripoff they started out as. Not changed: countdown lists.

    5 Words We Only Use Because the Old Ones Were Too Dirty
People only started talking about light meat and dark meat because they were too embarrassed to talk about breasts and legs


Yeah, and now "light meat" and "dark meat" have sexual connotations, because humans can be assholes. (When I was a kid, they called it "white meat," which I suppose comes with its own dark connotations.)

A friend is getting married, and you open up their gift registry. You see a “Dutch oven” listed there, and you suppress a giggle.

No, I don't, because 1) All my friends are past the "gift registry" stage and 2) I don't find the dirty version amusing in the slightest.

Later, at the ceremony, you deliver the toast, and you mention your Dutch oven observation. This angers and confuses people, who have no idea what you’re talking about. You are expelled from the venue.

As some people here can attest, my wedding toasts are simultaneously funnier and more cringeworthy than that.

5 Light Meat and Dark Meat

Light meat and dark meat are two different kinds of flesh on a bird. Confusingly, these names have nothing at all to do with white meat and red meat — all poultry is white meat, but some is light while some is dark.


And yet, we manage to infer the intended meaning from context.

Dark meat is fattier than light meat and arguably tastier.

Not to me. You could say I'm a breast guy.

But today, “light meat” and “dark meat” also sound sexual, so let’s just say whatever’s easiest.

Like I said.

4 Rooster

If we’re talking chicken double entendres, we of course need to talk about cocks. The cock is the male chicken, otherwise known as a rooster.


Honestly, this one's so obvious, I feel like the author got lazy.

3 Missus

The title in front of your name may reveal your gender, marital status, profession or level of nobility. The most common one for men is Mr., while women have Mrs., Miss and Ms. “Mr.” is short for mister. “Miss” isn’t short for anything. “Ms.” is pronounced miz but doesn’t represent any word other than “Ms.” itself. As for “Mrs.,” that’s pronounced missus of course.


And don't get me started on how some women take serious offense at being called "ma'am." In the US, this seems to be a North vs. South thing, though it's unlikely to start another Civil War. No, we have other things that will cause that.

2 Canola Oil

A lot of our vegetable oil comes from a plant named Brassica napus. The common name for this plant is rapeseed, with the “rape” part deriving from Latin word for turnip.


I'm almost sure I've written about this before, but I'm not about to put "rape" into the search box to verify.

1 Rabbit

Rabbits used to be called “coneys.” You’ll know that if you’re a fan of Lord of the Rings, in which Sam assures us that cooking a rabbit in a stew is the “one way to eat a brace of coneys.” He pronounces the word “cone-y,” the same way we pronounce Coney Island today, but originally, this word was pronounced “cunny.”

In the 16th century, “coney” became a pet name for women. Meanwhile, people already had the Latin word cunnus for “vulva,” which also gave rise to such words as cunnilingus and...


...couldn't they find a way to do this without typing the next word?

(Look, there's no reason why I can't say "cunt" once in an entry, but the opportunity to make the "couldn't" pun was overwhelming to me.)

Anyway, this is similar to why we don't call cats "pussy" anymore, except to set up or follow through with jokes. For example, long ago, Zsa Zsa Gabor was on The Tonight Show, way back in its heyday when Johnny Carson hosted. She came in with a cat, which perched on her lap.

Gabor: "Would you like to pet my pussy?"

Carson: "I'd love to, but you'll have to remove that damn cat."

...well, actually, that never happened.

But it's still funny.
May 16, 2024 at 8:52am
#1071194
Jumping back into food today, courtesy of the BBC...



Didn't they try to make "healthier white bread" some decades ago? By removing pretty much everything healthy about bread and fortifying the bland, spongy result with "vitamins and minerals?"

I'll grant we know more now. Just saying, sometimes technology gets ahead of itself.

Scientists are trying to create a new type of bread that is just as healthy as wholemeal but looks and tastes like its white counterpart.

Like I said, the article's from the UK, so there may be some phrases lost in translation. Though it's easy enough to figure out what they mean by "wholemeal."

Aimed at lovers of white bread, the project has been funded by the government to improve the health benefits of UK food.

Resisting the temptation to make a lame joke about British food. Instead, I'll just say, "Yeah, good luck on that with Scotland."

The researchers plan to add small amounts of peas, beans and cereals to the bread mix, as well as bran and wheat germ that are normally removed from white flour.

So, take out the bran and wheat germ, and then put them back in?

Another potential lost-in-translation issue: "cereal" can refer to any grain, such as maize (UK for corn), barley, rye, spelt... and wheat, which is what gives me pause, because isn't the wheat already there?

Dr Catherine Howarth of Aberystwyth University, who is one of its leaders, said scientists had begun to analyse the detailed chemical composition of existing white flour.

It only occurred to me yesterday that some people act like gluten is some sort of food additive, like the TBHQ from the Pop-Tarts entry a few days ago. Which it isn't. It's a naturally-occurring protein in many grains, including wheat.

Of course "natural" doesn't necessarily mean "good," but in this case, my point is that there are no evil corporations going, "I know! Let's save money by throwing in gluten."

It involved adding back smaller quantities of the wheat germ and part of the bran that is taken out in the milling process, she said, as well as adding other grains that are richer in vitamins, minerals and fibre such as quinoa, teff, sorghum and millet.

Today's new word: teff. I'd never heard of it. Turns out it's from East Africa, and it's a grass seed. Before that freaks anyone out, remember that maize is also technically a grass.

"Using other cereals we can enhance the iron, zinc and vitamin levels and most importantly the fibre content, because white bread has very little fibre, which is so important for good health."

So, Wonder Bread with fiber (look, I'm using US spelling in my commentary, because it's what I'm used to) re-inserted.

I concluded long ago that this quintessential American mass-produced white bread is called that because you have to Wonder what's actually in it.

“Most people know that wholemeal bread is better for you, but a lot of them are put off by the flavour, or because it’s not what they are used to and they are simply not interested,” he said of the challenge.

I've pointed out before that, to me, bread is food; everything else is a condiment. But just like I'm a snob about wine, beer, cheese, and other products of fermentation and/or distillation, I'm also a snob about bread. I like whole wheat bread just fine; I think the flavor usually has more depth. But one thing I can't deal with: whole wheat baguettes. No one's been able to match the deliciousness and texture of a real baguette by using whole grains.

Mr Holister used me as a guinea pig for an early prototype made from a mixture of normal white flour and some added grains and peas.

It was crustier than the white loaves you get from the supermarket - but otherwise looked and tasted like white bread. But there is a lot more work to be done.


To me, "crustier" is a good thing. The brown strip on the edge of your ordinary white bread hardly qualifies as "crust."

White bread has to have minerals and vitamins added to it by law to make up for the goodness that's lost in the refining process.

Pretty sure that's the case in the US, too.

"Critics would say that it is tricking people into improving their diet, but nutritionists would say it doesn’t matter how it’s done - it’s important to get it down people’s throats to improve their health!

"But the jury's out as to whether this new approach will work,” he added.


This critic doesn't say it's tricking people into improving their diet. The only "trick" would be if it wasn't labeled appropriately. No, my concern is that food is meant to be enjoyed, not to be used as medicine.

Still, even the attempt could yield good science, and I can't rail against that.
May 15, 2024 at 9:55am
#1071131
"Only a Sith deals in absolutes." —Obi-Wan Kenobi



The problem with that quote from one of the Star Wars prequels is that it is, itself, an example of absolute thinking. This has been pointed out by better minds than mine, but it seems relevant to today's article (from Aeon). Of course, the Jedi weren't as pure and noble as their PR tried to paint them to be; controlling others' minds is generally considered "evil," regardless of purpose.

But if there's anything Star Wars is known for, it's lightsabers. If there's anything else it's known for, it's bad dialogue.

Anyway, to the point:

Think of the most happy and well-adjusted person you know – what can you say about their thinking style?

Well, that would be me. (If that thought doesn't scare you, nothing will.)

Are they dogmatic, with an all-or-nothing outlook on the world? Do they place totally rigid demands on themselves and those around them? When confronted with stresses and misfortunes, are they apt to magnify and fixate on them? In short, do they have an absolutist thinking style?

I absolutely do not. Well, with some exceptions.

Absolutist thoughts are unqualified by nuance and overlook the complexity of a given subject.

I've railed on this before. One solution would be to drink more Absolut.

There are generally two forms of absolutism; ‘dichotomous thinking’ and ‘categorical imperatives’.

These $2 words are explained in the article, and I'll quote them, too:

Dichotomous thinking – also referred to as ‘black-and-white’ or ‘all-or-nothing’ thinking – describes a binary outlook, where things in life are either ‘this’ or ‘that’, and nothing in between.

Some rare things are binary, or effectively so. Flip a fair coin, for example, and the chance of it landing on edge is minuscule; the practical outcome is either heads or tails. But, agreed, most things aren't. Like "alive" and "dead." At first, that seems binary, but there are nuances to life: someone in a coma, for example.

Categorical imperatives are completely rigid demands that people place on themselves and others. The term is borrowed from Immanuel Kant’s deontological moral philosophy, which is grounded in an obligation- and rules-based ethical code.

Hey, great job explaining "dichotomous" and "categorical," but you had to go and throw in "deontological," didn't you?

Yet we all, to varying extents, are disposed to it – why is this?

"We all?" Wow, that's some absolutist shit right there.

Primarily, because it’s much easier than dealing with the true complexities of life.

I'm all for "easy," but not if it's going to make things more complicated in the long run.

The article (which is from 2018, so things might have changed in the field since then) goes on to describe the author's work on the subject which, like all psychological science, should be read skeptically. I'm not saying it's wrong, but I'm not saying it's right, either. See? I'm not an absolutist.

These findings support the recent ‘third wave’ therapies that have entered clinical psychology. The most well-known of these is ‘mindfulness’, but they all advocate a flexible outlook, acceptance, and freedom from attachments.

Ugh. "Mindfulness" again. I'm in favor of flexible outlooks and whatnot, but I have yet to accept the usefulness of "mindfulness," and I might never do so. Then again, I might. Thus once again showing that I have some flexibility.

Many argue that the world is a harsh place, and that it is the stresses and misfortunes in life that make people depressed, not their thinking style. Wrong!

Once again, the irony of proclaiming "WRONG!" in an article like this doesn't escape me.

Countless people suffer misfortunes and do not get depressed or anxious, while others seemingly suffer no misfortune at all, and are blighted with depression and anxiety.

Or it could be, though I don't have studies and shit to pull out, that depression and anxiety are health issues, caused not by absolutist thinking but by chemical processes in the brain. This is like saying that lots of people slip and fall on ice, but not all of them break their legs doing so.

So, yeah, I'm not taking the words here to be absolute truth, but I found it to be an interesting enough take on things to share it.
May 14, 2024 at 9:58am
#1071051
While we spent yesterday discussing the origin of Pop-Tarts, today's article, from Scientific American, touches on something almost as profound.

    Is There a Thing, or a Relationship between Things, at the Bottom of Things?
Quantum mechanics inspires us to speculate that interactions between entities, not entities in themselves, are fundamental to reality


Let me start with this: I was, at first, a bit thrown off by the headline. Maybe it's just me, being used to seeing headlines phrased as yes/no questions, which I thought this was at first. Then I realized that they're asking which is more fundamental: things, or relations? Just in case some reader experiences the same confusion, I thought I'd attempt to interpret.

Which makes it still a binary question, but I'll run with it for now.

What’s at the bottom of things?

It's turtles all the way down.

If we keep asking “Why?” where do we end up?

If my childhood was representative, the answer to that is "In our rooms, alone."

The monotheistic faiths assert that our questions must culminate in God, a solitary, supernatural creator.

No, I'm not going to get into that argument today.

Dissatisfied with that hypothesis, physicists postulate that everything stems from a single primordial force or particle, perhaps a supersymmetric string, from which flow the myriad forces and particles of our fallen world.

One could still, presumably, reconcile those two worldviews by assuming, for example, that God created the "primordial force or particle."

I don't buy that either, by the way. But like I said, I'm not here to argue about it.

(From what I understand, supersymmetric string theory is all but ruled out, though I still maintain that a perfectly logical "string theory" is: "The Universe is a big ball of string, and God is a cat.")

Notice that, for all their differences, religion and physics share the ultrareductionist conviction that reality comes down to one thing.

Yeah, okay, but as one of those "things" is supposedly infinitely complex, and the other is infinitesimally simple, that's a pretty damn big difference.

Call this the oneness doctrine.

This may be confusing, too, as, at least to me, "oneness" conjures up images of hippies sitting around going, "Whoa, it's, like, all ONE, man."

So, I’m intrigued by the conjecture that at the heart of reality there are at least two things doing something to each other.

This may also be a bias due to our species having evolved via two things doing something to each other.

In other words, there is an interaction, a relationship. Call this the relationship doctrine.

Another point of clarification: These days, you see the word "relationship," and the implication is sexual and/or romantic. But the word "relationship" is much broader than that, meant to describe how any thing relates to any other thing. Like how the Earth and the Moon relate via gravity.

This kind of definition creep is the same sort of thing that leads us to make jokes about names like the town of Intercourse. When the town, or village, or whatever, was first named, "intercourse" described interaction between people. Later, people started talking about "sexual intercourse," and because all we do is think about sex, when this was later shortened back to "intercourse," the original meaning was all but forgotten, except among pedants like me.

Point being, erase that connotation of "relationship" from your mind when you think about how the author uses the word. Yes, despite my joke above about "two things doing something to each other."

Anyway, the article cites a bunch of thinkers who saw the "relationship" bit as being fundamental. Then:

Part of me finds the relationship doctrine, and especially Gefter’s you-centered metaphysics, beautiful and consoling, a welcome alternative to mindless materialism. The relationship doctrine also seems intuitively sensible. Just as words must be defined by other words, so we humans are defined, and to a certain extent brought into existence, by other human beings.

If there's only one thing to know about science, it's this: it doesn't care whether you find it beautiful and consoling or not. No matter how appealing an idea is, it needs to be falsifiable, and it needs to be tested.

Moreover, as I mentioned above, I have a long-standing aversion to the oneness doctrine. This antipathy dates back to a drug trip in 1981, during which I felt myself becoming a solitary consciousness, the only one in the universe.

I knew that was coming. I bet you saw it coming a light-year away, too.

I thought: What is the difference between one thing and nothing? One thing only exists in relation to something else.

I mean, I don't disagree with the philosophy, but my main takeaway here is, "Wow, someone actually remembered details of their acid trip."

And yet I have doubts about the relationship doctrine, as I do about all metaphysical systems that privilege mind, consciousness, observation, information. They smack of narcissism, anthropomorphism and wishful thinking.

This is good. Having doubts and expressing them is what saves this article.

In conclusion, for me anyway, the headline question is less meaningful than it seems. Leaving aside the imprecision of the word "thing" (which I've harped on in a recent entry), it's meaningless to consider objects without some sort of relationship between them; and you can't speak of the relationships between objects without acknowledging the objects. Also, as the drug-tripping author notes, if there's only one object, it's not much of a universe at all.

So my philosophy is "both." We have to consider both things, and the interactions between things, to get anywhere.
May 13, 2024 at 8:01am
#1071005
"Mommy, where do Pop-Tarts come from?" "The supermarket, kid."

    The Contentious History of the Pop-Tart
In the 1960s, two cereal giants raced to develop a toaster pastry


In September 1964, Kellogg’s changed breakfast forever by introducing Pop-Tarts to the world.

Yeah, it sure did change forever. Now instead of an unhealthy breakfast, we can eat a prepackaged unhealthy breakfast.

What made Pop-Tarts so innovative wasn’t just the sweet filling in various flavors squished between two thin pastry crusts. Or that they could be eaten toasted or cold.

I mean, sure, technically, they can be eaten cold, just like leftover pizza technically can be eaten cold. If you're a savage.

It was the convenience with which adults and children alike could open and instantly devour them.

It's not like they didn't have packaged prepared convenience foods in the 1960s. It's just that maybe PTs were the first ones to be marketed as breakfast.

Pop-Tarts’ ingredients mean that they don’t need to be refrigerated, and their foil packaging ensures they can be stored for months.

That's a funny way to phrase "Pop-Tarts contain preservatives." Obviously, most packaged food products contain preservatives. I'm not one of those ooh-booga-booga "all preservatives are bad" people, but it's entirely possible that some are worse than others.

The ingredients are right there on the package, and most of them are pretty straightforward: sugar, corn syrup, mirror-universe sugar, high fructose corn syrup, etc. But one of them is just called TBHQ.

Now, I'm also not one of those "only eat things you can pronounce" people. As I've noted before, first, it encourages ignorance; second, I did recreational chemistry as a kid (my father had a degree in the field), and I learned how to pronounce lots of things. But I feel like calling it TBHQ deliberately hides a fell secret; it's short for tertiary butylhydroquinone, which I'm sure, if spelled out, would freak out your average shopper way more than MSG (monosodium glutamate).

All of which is to say that even I, a big fan of both convenience and better living through chemistry, have my limits.

Throughout the 1950s and 1960s, hyper-sweetened food products that could be eaten on the go exploded in popularity, especially among children.

This just in: children love sweet things. Who knew?

This is all the more ironic because Will Keith Kellogg and John Harvey Kellogg, two brothers from Battle Creek, Michigan, were initially invested in providing healthy foods and cereals to improve digestion when launching their company in 1906.

I suppose you can call that irony. I call it a natural progression. Anything that starts out healthy eventually gets mass-marketed as candy. Cereal products are just one example. Also see: coffee, yogurt.

Just to be clear: I'm not anti-Pop-Tarts. I just like to know what stuff's made of and make my own decisions.

Upon his death on February 10, 2024, William Post was widely identified as leading the team that created the Pop-Tart. Post told southwest Michigan’s Herald-Palladium back in 2003 that Kellogg’s approached him when he was the manager of a Keebler Foods plant in Grand Rapids, where they asked him to develop the revolutionary breakfast food.

On the official Pop-Tarts website there’s no mention of Post.


Of course there isn't. That would be like if Coca-Cola hired some guy named John Pepsi to develop a new soft drink, or if Wal-Mart tasked Betty Amazon to design their new stores.

In order to spread the word of its creation, Kellogg’s used many television shows to introduce Pop-Tarts throughout the last months of 1964. Advertising appeared on “Beverly Hillbillies,” “My Favorite Martian,” “What’s My Line,” “Huckleberry Hound,” “Yogi Bear,” “Woody Woodpecker,” “Quick Draw McGraw,” “Mighty Mouse” and across daytime television.

Product placement is a legitimate marketing strategy. What makes it sneaky is that even if people go, "Hey, that's product placement," it still works. The entire article I'm linking today, for example, is basically an ad for Seinfeld's movie which, since I'm not getting paid either way, I won't name. Nor do I have any desire to watch it, even though it wouldn't cost me a dime to do so as it's on a streaming service I'm already subscribed to.

Ad or not, though, the historical information is interesting to me. I suspect the product was a part of the childhood experience of most people born around the time I was and, as the article notes, it's not like they're going out of style anytime soon.
May 12, 2024 at 9:33am
#1070954
In nearly 1700 entries over almost two decades, the only shocking thing is that I haven't repeated titles more than I actually have. Going back in time today, I'm going to resurrect the entry with this title from June of 2020: "Under Pressure"

The linked article itself, from six months earlier, is still around. Not too surprising, since it's from Atlas Obscura.

In the entry, I started with:

Turns out that, like gravity, crumpets, Doctor Who, and Russian Imperial Stout (no, really), champagne was a British innovation -- and the reason why will shock you! (Hey, look, I can write clickbait! Pay me!)

Do they still use that kind of clickbait title? I don't think I've seen it in a while. I'm guessing people get wise to the tactics so they no longer work, so they move on to different, more subtle tactics.

Anyway, none of this is clickbait.

Also, it is now canon that the Doctor went back in time, met Newton, and changed history by leading him to create the Theory of Mavity, not gravity. Well, it was actually his companion's fault, but that's not the point.

I also wrote:

They only lost 10% of their forests and timber supply? That doesn't sound so bad. Oh. No, the author is just misusing "decimate."

While I will never not be pedantic about the definition of "decimate," I get that it's a futile battle.

Since there's really not much else I found to criticize about that four-year-old post, I'll just point out that I did, eventually, tell the story of Russian Imperial Stout... two years later, in this entry: "The Yeast You Can Do"

So, bonus link there. Fear not, though; it's short. Like today's.


© Copyright 2024 Waltz Invictus (UN: cathartes02 at Writing.Com). All rights reserved.
Waltz Invictus has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.
