Complex Numbers
A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is a square root of -1). For example, 3 + 2i is a complex number.
The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.
Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.
Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
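If you want to see the seed of that intricacy, here's a minimal sketch in Python (my own illustration; the function name and iteration cap are arbitrary): it iterates the Mandelbrot map z -> z*z + c and reports how long each point's orbit takes to escape, if it ever does.
def escape_time(c, max_iter=50):
    """Iterations until |z| > 2 under z -> z*z + c, or None if the orbit stays bounded."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:  # once |z| exceeds 2, the orbit is guaranteed to diverge
            return n
    return None  # probably inside the Mandelbrot set

for c in [0j, -1 + 0j, 0.3 + 0.5j, 1 + 1j]:
    print(c, escape_time(c))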
November 19, 2024 at 8:09am
I'll admit it. The only reason I'm sharing this Atlas Obscura article is that I'm actually 12 years old.
Always good to read about an organ being put to good use.
Along Shrewsbury Road in Penistone, England, an unassuming single-screen theater keeps cinematic nostalgia alive.
There's something to be said about having a single screen rather than dividing your attentions between a dozen.
Inside, the star of the show is the Compton organ. The instrument was originally built by the John Compton Organ Co. in 1937. It was first installed in Birmingham's Paramount Theatre, where it entertained audiences for over 30 years. In 1988, it was bought by a private cinema owner and installed in the Regal Cinema at Oswestry in Shropshire.
I suppose I'm disappointed that it didn't come from Scunthorpe.
After four years at Oswestry, it was brought to the Penistone Paramount Cinema by organist Kevin Grunill. The instrument was restored in 2000 and again in 2013.
Only 13 years between restorations, for such an old organ?
Yeah, that's all I have today. I'm spent.
November 18, 2024 at 10:25am
Big Think offers up another way for people to annoy pedants, and vice-versa.
By "we," I suppose they mostly mean "Americans." But even people in countries that use SI units will use kg as if it's a unit of weight. This is fine, as far as I'm concerned, because damn near 100% of us live on the Earth's surface. And while there are gravity variations due to latitude, elevation, or anomaly (such as the one from "Anomalies" a couple of weeks ago), they mostly don't make much difference unless you're a scientist who requires greater precision.
Still, as a pedant, I think it's important to know the difference. Also, as a pedant, I think it's important to keep it to oneself if one does not wish to be uninvited from social gatherings for being a pedant.
Since this is my blog, I'm making an exception here.
Conventionally, here on the surface of the Earth, we can convert between the two using only a minimal amount of effort: 1 kilogram is 2.205 pounds, and vice versa, 2.205 pounds converts to 1 kilogram. Going back-and-forth requires only multiplication or division, which seems easy enough.
"Easy?" Have you met people? Some of them break out in hives if you ask them to add 10% to something.
Incidentally, I don't usually bother with the 0.005. For most practical purposes, 2.2 is close enough and much easier to do head-math with.
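For the head-math skeptics, a quick sketch (Python; 2.20462 is the commonly cited factor, and the sample masses are arbitrary) showing what the 2.2 shortcut actually costs:
KG_TO_LB = 2.20462  # commonly cited pounds-per-kilogram factor
for kg in (1, 5, 80):
    exact = kg * KG_TO_LB
    quick = kg * 2.2  # the head-math version
    print(f"{kg} kg = {exact:.2f} lb; head-math says {quick:.2f} (off by {100 * (exact - quick) / exact:.2f}%)")
About a fifth of a percent, which is noise for anything short of laboratory work.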
A kilogram is an example of mass, not of weight, while a pound is an example of a weight, not of a mass. It's only here on the surface of the Earth, where we're at rest relative to the rotating Earth, that these two concepts can rightfully be used interchangeably.
I mean, as long as we're being pedantic, it's a really stupendously big universe and it wouldn't surprise me if there are other planets with the same acceleration due to gravity as that of the Earth.
But for nearly 100% of the volume of the universe, weight is irrelevant and only mass matters (pun absolutely intended).
If you release an object from rest and allow it to fall, it falls straight down, accelerating at a constant rate. It gains speed directly proportional to the amount of time that it's been falling, and the distance it covers is proportional to the amount of time squared that the object has been falling.
Again, if you're going to get technical about it, you'd add "without air resistance."
This phenomenon, however, appears to not depend on mass or weight. A light object will fall just as quickly as a heavy object, especially if air resistance isn't a factor.
There it is.
Someone did that on the Moon, incidentally. This proved two things: 1) Physics is right; 2) at least that particular Moon expedition wasn't faked on an Earthbound sound stage.
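The kinematics in question fit in a few lines. A sketch (mine, using textbook surface gravities and the no-air-resistance caveat baked in); note that mass appears nowhere:
def fall(t, g):
    """Speed (m/s) and distance (m) after t seconds of free fall from rest."""
    return g * t, 0.5 * g * t ** 2

for body, g in [("Earth", 9.81), ("Moon", 1.62)]:  # textbook values, m/s^2
    v, d = fall(2.0, g)
    print(f"{body}: after 2 s, speed {v:.1f} m/s, distance {d:.1f} m")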
The article proceeds to go into a lengthy (and weighty) discussion of the differences between scales and balances, and how the former measures weight while the latter measures mass. It's remarkably math-light.
Then:
We can sum up the difference succinctly: your mass is an inherent quality of the atoms that make up your body, but your weight is dependent on how those atoms accelerate under the influence of all the factors and forces acting on it.
Which I suppose is the main point of the article; everything else exists to support it.
There are plenty of physics textbooks (and physics teachers) that ignore this difference, and simply state that your weight, W, always obeys the equation W = mg. This is incorrect; it is only true when you are at rest on the surface of the Earth.
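The more general relation is W = m(g + a), with a the upward acceleration of whatever you're standing on. A sketch with made-up numbers (mine, not the article's):
m, g = 70.0, 9.81  # illustrative mass (kg) and surface gravity (m/s^2)
for label, a in [("standing still", 0.0), ("elevator starting down", -1.5), ("free fall", -9.81)]:
    print(f"{label}: apparent weight {m * (g + a):.0f} N")
Which is the elevator trick quoted below: accelerate downward and the scale reads less; cut the cable and it reads zero.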
There's a popular trick question: "Which weighs more, a pound of feathers or a pound of lead?" It's a trick for several reasons.
We often talk about "watching our weight" or "trying to lose weight," but if that was truly your goal, you could simply go to a higher elevation, move to a different planet, or even get into an elevator and wait for the door to close after you hit the "down" button.
Yes, "simply... move to a different planet."
Sounds like a good idea for other reasons, these days.
November 17, 2024 at 7:37am
Diving into the past again today, I came up with a relatively recent entry, from May of last year: "A Frank Discussion"
The entry revolved around a piece on the bon appétit site; as of right now, the article is still there. And it's about figuring out which big-brand hot dogs are best.
I doubt the world of wieners has changed much in a year and a half. The article has a "summer's coming so here's something about grilling" slant, which, of course, it's the precise wrong time of year for, here in the One True Hemisphere.
Apparently, at the time, I'd overlooked one of the article's biggest flaws: while aimed at summer grill cooks, the testing featured boiled hot dogs. As anyone with taste buds knows, boiled dogs taste way different from grilled dogs.
Me: It's been many years since I've actually eaten a hot dog, frankfurter, or wiener; anything requiring a hot dog bun.
And now it's been many years plus a year and a half.
Not to mention I know what they're made of, but that doesn't stop me from eating breakfast sausages.
I've also eaten way worse than breakfast sausages, since then.
Here's the thing: it's hard to be objective about food (or drinks) during a taste test. Taste is, well, a matter of taste. Beer, for example, is highly personal; some love *shudder* IPAs, while I prefer darker, less hoppy brews.
Additionally, taste changes over time, and can be affected by numerous factors, such as your overall health and the last thing you ate or drank.
I concluded that entry with what may have been my first assertion in here that a hot dog is actually a taco. I'm not sure I actually believe it, myself, but it does tend to get one to think about categorization problems and their edge cases.
November 16, 2024 at 8:59am
It's okay, folks. No need to worry about global warming; we've got the important bit covered:
Thanks to an experiment started before the Great Depression, researchers have pinpointed the genes behind the remarkable adaptability of barley, a key ingredient in beer and whiskey. These insights could ensure the crop's continued survival amidst rapid climate change.
Whew!
Coffee and chocolate future still uncertain, but those are less important.
Grown everywhere from Asia and Egypt to Norway and the Andes mountains of South America, barley is one of the world's most important cereal crops and has been for at least 12,000 years. As it has spread across the globe, random changes to its DNA allowed it to survive in each new location.
I know it's necessary to summarize for an article, but "random changes" are only one component of adaptability.
The article talks about the experiment promised in the headline, and I won't quote it; I have little to say about the details except that it seems legitimate to me.
Then, towards the end:
Using modern technology like genome engineering and CRISPR, researchers could try to engineer other crops that flower at specific, more advantageous times.
And approximately 15 seconds later, someone's going to screech about genetically engineered crops.
But the really important quote, they saved for the end:
"Barley's ability to adapt has served as a cornerstone to the development of civilization. Understanding it is important not just to keep making alcoholic beverages, but also for our ability to develop the crops of the future and enhance their ability to adapt as the world changes," Koenig said.
Or, to put it in layman's terms: Beer. Is there anything it can't do?
November 15, 2024 at 10:20am
We all know I'm not religious. I'm also not spiritual, unless by "spirit" one means "distilled beverage." The closest I get is when I listen to music. But, of course, it has to be good music—which has nothing to do with whether it was created with religion in mind or not. So here's a Cracked article about songs you might not have realized were religious (and, somehow, Leonard Cohen isn't on the list).
For example, you might realize the band you're listening to is named "Creed," and the song is talking about going "higher, to a place where blind men see." Clearly, this is about heaven. And yet the band actually insists they are not religious, as they're not trying to preach anything to anyone.
Songs (and other works of art) can be religious without being preachy. However: 1) A band naming itself "Creed" and claiming to not be religious sets off my bullshit detectors; 2) regardless of their inspiration, they suck. I like to change the lyrics of their most famous song to "With Legs Wide Open."
Or, you might hear that 1960s classic "Spirit in the Sky" and note the lyric, "Gotta have a friend in Jesus." You then might conclude it's a Christian song. Singer-songwriter Norman Greenbaum is actually Jewish, but penning that song was a smart move.
Insert Jewish joke about "business is business" here.
4 Imagine Dragons' "Radioactive" Is About Leaving Mormonism
If you were told that one of Imagine Dragons' songs is religious, maybe you'd go with "Believer," because it talks about being a believer. Maybe you'd go with "Demons" because it talks about demons. Maybe you'd go with "Thunder" because that one's so terrible that only divine intervention can explain its success.
Okay, that's funny. But... who?
The band formed at Brigham Young University, and when you go on to leave the Church of Latter-day Saints, the memory of your time there stays with you. We know for sure that one song of theirs is about that subject: "Radioactive."
Yeah, so, kind of the opposite of religious, I guess? Hell if I know. I didn't watch the video they embedded in the article, so I still don't think I've ever heard any of their stuff.
3 Simon and Garfunkel's "Cecilia" Is St. Cecilia
The idea that "Cecilia" is a song directed at a saint sounds absurd.
Yeah, it kind of does, especially since Paul Simon and Art Garfunkel are in the same tribe as Norman Greenbaum. But hey, Simon said it, and while he may have been trolling, it kind of works on a metaphorical level.
It doesn't help that the song is probably the duo's least awesome.
2 "Dem Bones" Is a Spiritual About the Promised Land
Unlike the others, this isn't a relatively recent pop song, but, as the article notes, an older gospel tune. Which means it's not "secretly" religious at all, but I guess there's a secularized version for broader appeal.
Today, the song is most important during Halloween, because it gets children talking about spooky skeletons. That's fine. When you get down to it, Halloween is the holiest night of the year.
And technically, anything involving an afterlife has religious connotations.
1 "The Star-Spangled Banner" Is About the Greek Pantheon
This is the one I saved this article for, even though the header is misleading. "The Star-Spangled Banner" is the name of Key's poem, which, as we all know, was set to an existing tune.
If you're a fan of trivia, you might already know that this melody came from a British drinking song.
What's weird to me is, I've known this for decades, but I never could be arsed to find out exactly what the drinking song was. Well, now I know...
The song is about a bunch of Greek gods arguing with each other. When the poet Anacreon urges humanity to drink and screw, the god Jupiter gets angry and considers intervening. Apollo disagrees, claiming Jupiter's mighty thunderbolts are nothing against the power of music.
And whaddaya know; Apollo was right.
The United States may not have been founded as a religious country, but if you ever feel the need to pledge allegiance to one nation under God, feel free to specify which god it is.
Why, Dionysus, of course.
November 14, 2024 at 11:16am
Back in the early days of this blog, I had a running joke about ragging on Hello Kitty. Very recently, I found this article from Fast Company:
At least, I thought of it as a running joke. Looking back, it might have seemed like a serious hate-on. Either way, though, the whole reason for the gag was the Nefarious Neko's worldwide appeal, how the character seemed to be everywhere, on all things, in all contexts.
This, of course, is successful marketing. As I've said before, I sometimes put marketing articles in here because many people have things they want to promote, like, maybe, their books. And also, the psychology of it can be interesting.
Hello Kitty turns 50 on Friday.
For context, if you don't want to click on the link, "Friday" was November 1 of this year.
As a tabula rasa open to interpretation, the non-threatening creation was the perfect vehicle for making money, she said.
This strikes me as opposite to most marketing advice, which is to know your audience and pander to it. In the past, I'd have taken issue with the "non-threatening" description, but again, I think those jokes fell flat.
There have been anniversary editions of merchandise ranging from pet collars, cosmetics and McDonald's Happy Meals to Crocs and a Baccarat crystal figurine.
On the other hand, I'd have had a field day with Hello Kitty Crocs. Hell, I might still be able to work up a rant about that particular combination.
By the late 1970s, Sanrio revealed the character's name as Kitty White, her height as five apples tall and her birthplace as suburban London, where the company said she lived with her parents and twin sister Mimmy.
That much, I knew. ("Know your enemy," I might have said in the past.)
Her TV appearances required co-stars, including a pet cat named Charmmy Kitty that made its debut 20 years ago.
This was also always suspect to me. A cat owning a cat? Isn't that, like, slavery?
But Hello Kitty's 40th birthday brought an update that astonished fans. Sanrio clarified to a Los Angeles museum curator that Kitty, despite her feline features, was a little girl.
Thus sidestepping the "slavery" issue and pushing the conversation back to animal rights. Just kidding; I doubt anyone else thought deeply about the ethics involved.
"She is supposed to be Kitty White and English. But this is part of the enigma: Who is Hello Kitty? We can't figure it out. We don't even know if she is a cat," art historian Joyce S. Cheng, a University of Oregon associate professor, said. "There is an unresolved indeterminacy about her that is so amazing."
Like if she were in a box with a radioactive atom. Schrödinger's Hello Kitty.
Part of the confusion stems from a misunderstanding of "kawaii," which is Japanese for "cute" but also connotes a lovable or adorable essence.
It may be cliché, but some things really do get lost in translation.
During a presentation earlier this year in Seoul, Hello Kitty designer Yamaguchi said one of her unfulfilled goals was finding a way "to develop a Hello Kitty for men to fall in love with as well." But she's still working on it.
I have some ideas, but I try to keep this blog 18+.
Even if I'd been serious about my ragging on Hello Kitty, that came to an end some years ago, when I read an article (which I can't find now) featuring some village in Northern Siberia, considered the most remote human settlement in the world (pretty sure they didn't include Antarctic research bases, just places where humans naturally settled). Of course, the village isn't entirely unreachable, or they wouldn't have done an article on it, but apparently, it's only accessible by some rickety train line in the week when there's summer there.
The article included pictures, and in one of them, a little girl in this remote village unconnected to the world at large was wearing a Hello Kitty shirt. That is when I knew that her dominance was complete, and no amount of rage, joking or otherwise, would end her hegemony.
Honestly, these days, I think we could do a lot worse.
November 13, 2024 at 11:26am
Another from the same source as yesterday; this time, about inventions of the more tangible kind.
5 Mark Twain's Bra Straps
No doubt, they phrased it that way deliberately.
Missouri's favorite literary son was also a celebrated inventor, but you probably didn't realize he had a hand in your over-the-shoulder boulder holder.
Definitely deliberate.
4 Roald Dahl's Brain Shunt
Roald Dahl is best known for writing children's books...
And promoting racism.
...but his arguably more significant contribution to society was saving their lives.
Oh, then, that totally makes up for the racism.
After his infant son was hit by a car, he was left with a condition that causes fluid to build up in the brain, and finding the valve meant to relieve it insufficient, Dahl invented a new kind of brain shunt with the help of a neurosurgeon and a toymaker that wound up in the skulls of thousands of children.
A neurosurgeon, a toymaker, and a writer walk into a bar...
3 Marlon Brando's Drumhead-Tightener
No, this wasn't a kid Brando hired, with or without innuendo, but another invention.
2 A Dentist's Cotton-Candy Machine
Now, look, this is supposed to be about unexpected inventors. A dentist inventing a thing sure to give dentists more business is hardly unexpected.
1 Penn Jillette Invented a Vibrator
And I'm just going to leave this sitting there, untouched.
November 12, 2024 at 8:41am
Sometimes, I link to Cracked for its take on serious topics. Sometimes, though, it's because of jokes.
Historical blonde jokes aren't as well-documented as other classics, like fart jokes, for example.
Fart jokes may be "classic" (the oldest recorded joke is one of them), but they are the lowest form of humor, unlike highbrow amusements such as dead-baby jokes.
The people who study this type of thing tend to agree that they're just lazily recycled versions of older jokes that replace racism and xenophobia with sexism.
Yeah, not exactly a step up.
But the genre does have some notable subversions!
The thing about subversion is you need to be familiar with the thing that's being subverted, and hardly anyone tells blonde jokes anymore, so some people might not get the twists.
5 âThe First Officially Recorded Dumb Blondeâ
This is included for historical context.
Catherine-Rosalie Gerard Duthé was a French nun, ballet dancer and courtesan in the late 18th and early 19th centuries.
We should be concentrating on French jokes, not dumb blonde jokes.
She had a particularly notable habit of "pausing for extended periods of time before speaking." Some might call this act "thinking," but the French decided this made her dumber than a beret full of baguettes.
One wonders whether this contributed directly to the French Revolution.
4 Meta Blonde Jokes
You know how a blonde joke works. A brunette, a redhead and a blonde do something, the blonde does it wrong, because she's dumb and incapable, the patriarchy grows stronger.
Worth going to the link just for the joke here.
3 The Counter-Revolution
A notable, though flawed, early ally was cartoonist Murat Bernard Young, who drew two extremely popular comic strips. Dumb Dora was about a space cadet brunette, while Blondie was about a smart, capable mother whose husband was the bumbling, glassy-eyed doofus.
The "bumbling, glassy-eyed doofus" trope continues to this day (Homer Simpson, e.g.), but apparently, it's okay to make men look stupid.
2 Existentialist Blonde Jokes
The New Yorker's "Existentialist Blonde Jokes," written by Alex Baia back in 2021, subverts some of the most tired blonde jokes out there...
I rag on The New Yorker on a regular basis, because it's pretentious as fuck, but one thing it's usually gotten right is comedy. I'll just copy one of the jokes here:
How do you drown a blonde?
Remind her that life is inane, repetitive and intrinsically meaningless.
1 Blonde Jokes for Men
Ever wonder about the difference between blonde and blond? It's not like gray/grey, which is largely a difference in US and UK/Commonwealth English. It's because "blond(e)" is French, so it possesses grammatical gender. Women are blonde; men are blond. It's one of those few words we stole from French that kept its inflections, like fiancé/fiancée.
Anyway, like I said, there's no need to specify the man's hair color. Many of us don't have any, anyway, and we're all stupid and bumbling, regardless of coiffure.
November 11, 2024 at 9:17am
Still not quite sure what to make of this Quartz article. It's from 2018, but new to me, though I don't know if the book it's promoting is still available or not.
I don't know; a while back, I arrived at a pretty simple meaning of life: it has no meaning except for what we impose upon it. Impose a complicated meaning, and of course your life is going to be complicated.
Some people seem to spend their whole lives dissatisfied, in search of a purpose.
Or at least in search of that One Perfect Product that will finally live up to its advertising and fix everything wrong with your life.
But philosopher Iddo Landau suggests that all of us have everything we need for a meaningful existence.
The first thing you need for a meaningful existence is an existence.
Philosophers' answers to this question are numerous and varied, and practical to different degrees. The 19th-century philosopher Friedrich Nietzsche, for example, said the question itself was meaningless because in the midst of living, we're in no position to discern whether our lives matter, and stepping outside of the process of existence to answer is impossible.
I'm not sure Nietzsche is the best example to feature, but okay, whatever.
Those who do think meaning can be discerned, however, fall into four groups, according to Thaddeus Metz, writing in the Stanford Encyclopedia of Philosophy. Some are god-centered and believe only a deity can provide purpose. Others subscribe to a soul-centered view, thinking something of us must continue beyond our lives, an essence after physical existence, which gives life meaning. Then there are two camps of "naturalists" seeking meaning in a purely physical world as known by science, who fall into "subjectivist" and "objectivist" categories.
I'm not well-read enough to pick sides there, but the only context in which I know the word "objectivist" has to do with philosophers like Ayn Rand, who can go fuck off.
Landau argues that meaning is essentially a sense of worth which we may all derive in a different way – from relationships, creativity, accomplishment in a given field, or generosity, among other possibilities.
It seems self-evident that we each value different things, so what's meaningful to one person won't do much for another. For instance, lots of people seem to get a sense of accomplishment from working. I, on the other hand, feel accomplished if I can avoid work.
For those who feel purposeless, Landau suggests a reframing is in order. He writes, "A meaningful life is one in which there is a sufficient number of aspects of sufficient value, and a meaningless life is one in which there is not a sufficient number of aspects of sufficient value."
I don't claim to be an expert on logic or philosophy, but that statement strikes me as both circular and weaselly.
Landau argues that anyone who believes life can be meaningless also assumes the importance of value. In other words, if you think life can be meaningless, then you believe that there is such a thing as value.
I'm not sure I agree with that. If you believe life is meaningless and that's okay, I think you're acknowledging the existence of value, but not necessarily its importance.
In Philosophy Now, Tim Bale, a professor of politics at Queen Mary University of London in the UK, provides an extremely simple answer: "The meaning of life is not being dead."
See? This sort of thing is why I picked engineering over philosophy.
There are a few other examples of simple answers to the question of life's meaning, and I'm really quite proud of myself for making it this far without quoting "42" as the answer, which, honestly, I'm only doing to forestall the inevitable comment about it.
I'm not sure I buy any of the arguments, though. Perhaps I find meaning in skepticism.
November 10, 2024 at 9:43am
No, the title isn't political commentary. My random selection of past entries yielded this bit from the end of 2018: "Worst Year Ever"
The (very short) entry is basically a few comments on a link to an article from Science.org, which is still available and claims that the year 536 C.E. was "the worst [year] to be alive."
My commentary back then was pretty much focused on praising the interdisciplinary science that produced this result. But now, six years on, I might have a few more things to say about the article itself.
Ask medieval historian Michael McCormick what year was the worst to be alive, and he's got an answer: "536."
"Worst" is obviously a matter of opinion. It's also dependent on the opinion-holder's experience and knowledge. Ask people what the worst movie ever made is, and you'll get a bunch of different answers, but none of them are definitive because I guarantee you none of them have seen every single movie ever made, and one person's "worst" may be another's "yeah, it was pretty bad."
In this case, we've got a guy whose area of focus seems to be medieval Europe, so his answer is understandably Eurocentric.
A mysterious fog plunged Europe, the Middle East, and parts of Asia into darkness, day and night – for 18 months.
See? Nothing about North America, which had a decent population at the time, or anything in the Southern Hemisphere. It might have been a shitty time in those places, but maybe not the worst year to be alive. Especially for the native Americans or Australians.
Temperatures in the summer of 536 fell 1.5°C to 2.5°C, initiating the coldest decade in the past 2300 years.
Hey, guys, I think I've figured out how to stop global warming. If we can't make a volcano go boom, there's always the similar effects of nuclear winter.
Historians have long known that the middle of the sixth century was a dark hour in what used to be called the Dark Ages, but the source of the mysterious clouds has long been a puzzle.
Spoiler: volcanic eruptions in Iceland.
Probably. I mean, it's science, so results can have varying confidence levels, and I haven't seen any updated articles since then. Still, my main point remains: when different disciplines cross-reference each other, you get better science.
November 9, 2024 at 10:50am
When science meets art, are they both elevated? Or do they make each other suck worse? From CNN:
I usually groan when I see "scientific theory" in a headline, because the word "theory" means something different in science than it does in everyday speech. In this case, though, my fears proved unfounded.
Which doesn't mean I don't have some issues with the rest of the piece.
Now, a new analysis by physicists based in China and France suggests the artist had a deep, intuitive understanding of the mathematical structure of turbulent flow.
Or, and hear me out here, he was an artist and thus observed turbulent flow in, perhaps, a river or whatever, and incorporated that observation without being able to do math beyond "this paint costs 3 francs and this other one costs 4; which one is cheaper?"
As a common natural phenomenon observed in fluids – moving water, ocean currents, blood flow, billowing storm clouds and plumes of smoke – turbulent flow is chaotic, as larger swirls or eddies form and break down into smaller ones.
"Chaotic" is another word that means something different in science than it does in everyday speech. Again, though, credit where it's due; the article uses it in the scientific sense:
It may appear random to the casual observer, but turbulence nonetheless follows a cascading pattern that can be studied and, at least partially, explained using mathematical equations.
"Partially" is doing a lot of the work in that sentence.
"The Starry Night" is an oil-on-canvas painting that, the study noted, depicts a view just before sunrise from the east-facing window of the artist's asylum room at Saint-Rémy-de-Provence in southern France.
There are, I think, a few paintings that even the art-blind (like me) can identify at first glance. Mona Lisa. That Michelangelo thing with God and Human. The Scream. Maybe that one with the farmers. And The Starry Night. But, on the off-chance you have no idea what we're talking about, the article provides helpful illustrations.
Using a digital image of the painting, Huang and his colleagues examined the scale of its 14 main whirling shapes to understand whether they aligned with physical theories that describe the transfer of energy from large- to small-scale eddies as they collide and interact with one another.
I would love to have seen their grant proposal. "Yeah, we're going to study... art."
The atmospheric motion of the painted sky cannot be directly measured, so Huang and his colleagues precisely measured the brushstrokes and compared the size of the brushstrokes to the mathematical scales expected from turbulence theories. To gauge physical movement, they used the relative brightness or luminance of the varying paint colors.
One wonders if they already had a conclusion in mind when they picked those criteria, which would make it questionable science.
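That said, the bones of such an analysis are standard. A sketch of a radially averaged power spectrum (Python with NumPy; the function name is mine, and I'm assuming you've already got the painting as a 2-D luminance array, which is the hard part):
import numpy as np

def radial_power_spectrum(luminance):
    """Mean spectral power in integer-wavenumber bins of a 2-D luminance array."""
    f = np.fft.fftshift(np.fft.fft2(luminance))
    power = np.abs(f) ** 2
    ny, nx = luminance.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(x - nx // 2, y - ny // 2).astype(int)
    sums = np.bincount(r.ravel(), weights=power.ravel())
    counts = np.bincount(r.ravel())
    n_bins = min(ny, nx) // 2  # stay inside the fully sampled disk
    return sums[1:n_bins] / counts[1:n_bins]  # bin 0 is just mean brightness
The slope of log(spectrum) against log(k) over the inertial range is what gets compared to turbulence predictions, Kolmogorov's -5/3 being the famous one.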
Huang and the team also found that the paint, at the smallest scale, mixes around with some background swirls and whirls in a fashion predicted by turbulence theory, following a statistical pattern known as Batchelor's scaling. Batchelor's scaling mathematically represents how small particles, such as drifting algae in the ocean or pieces of dust in the wind, are passively mixed around by turbulent flow.
And here's where most of the red flags appear, to me. Paint is, and you might want to sit down for this one, a fluid. Granted, that's just about the limit of my knowledge of artists' paint, but I have a high degree of confidence in my assertion, having seen paint in its fluid form. I've even seen paint mixed, and noted the swirls and eddies of turbulence. This is kind of like seeing milk added to coffee, and I don't drink coffee either.
Where I become more speculative is in thinking: well, he painted the thing with wet paint, so of course there's turbulence at the small-scale boundaries of brushstrokes.
Beattie agreed: "It's an amazing coincidence that Van Gogh's beautiful painting shares many of the same statistics as turbulence," he said.
While there is, indeed, such a thing as coincidence, I don't agree that this is an example of it.
The study team performed the same analysis and detected the same phenomenon in two other images, one a painting, "Chain Pier, Brighton," created by British artist John Constable in 1826-7, and the other a photograph of Jupiter's Great Red Spot, taken by NASA's Voyager 1 spacecraft on March 5, 1979.
Now, that seems a little more like what I'd expect from science. First, another painting; perhaps as a control of sorts. Hell if I know; I don't know that painting, and CNN neglected to illustrate it for us (there is, however, a link). As for Jupiter, we're pretty sure it exhibits all the hallmarks of turbulence on its visible surface, so it's a check on their modeling assumptions.
And yet, it shouldn't boggle anyone's mind that an artist noticed turbulence and tried to recreate it, or that one as brilliant as van Gogh was able to do it.
November 8, 2024 at 9:10am
Today, we have an exercise in metaphor-stretching:
This starts out looking like a piece about housing:
When I was a kid, I lived in a bungalow for a while. You know, one of those houses that's compact and square and has a half-upper floor that's basically just a loft with sloped ceilings.
Yeah, well, the house I lived in as a kid was originally a shotgun shack.
Now if someone told me to live in a loft with sloped ceilings I'd ask if there was any regular house available, please.
Nice to be able to be picky, isn't it?
Something else Iâve noticed thatâs shaped like a bungalow is the way we talk to each other. Human speech.
And that's the metaphor part, which, as it turns out, is what the post is really about.
We waste an unbelievable amount of time – in our daily lives, on podcasts, in interviews, in blogs and articles, even in Tweets and Notes for God's sakes – qualifying everything we say with caveats. We say "now I know not all x are y, and I know that historically abc, and I'm not trying to say that lmnop so please don't take this wrong…"
Granted, I avoid X/Twatter like the plague it is, and I don't listen to podcasts. But while I've seen what this dude's talking about, and even engaged in it on occasion (for instance, when I note that I might be talking about something with a US bias), I don't think disclaimers like that rise to the level of wasting "an unbelievable amount of time."
Saying anything even remotely controversial on the internet is terrifying.
Is it though? Is it more terrifying than saying it in a crowd of Americans with guns?
No matter what argument you make on the internet, you will get people who reject it wholesale because you forgot an asterisk. Because you forgot to mention their particular edge case.
Yes, and?
We all know that not all heterosexual dating advice or sex advice applies to transgender people. It doesn't even apply to all straight people. We are all intelligent enough to know that.
[citation needed] on the "intelligent" bit.
People have managed to make it insensitive to speak about things that the majority of people deal with. People have managed to make it insensitive to be normal.
And what, exactly, is "normal?"
And then the tyrannical minority ostracizes you for it, and in turn makes it okay for everyone else to ostracize you.
"Tyrannical minority?" Loud, sometimes, maybe, but that seems like an oxymoron. Unless the "minority" is an actual, political tyrant, a minority of one.
If you drove down a street consisting entirely of overly-decorated bungalows, with nice upper windows and big, furnished porches, you'd call bullshit on the entire street.
No, I don't think I would.
Everywhere I look, I see people decorating their speech with nuance when all they really want to say is some simple, normal thing. It feels like bullshit.
Well, it's not everywhere I look. Perhaps examine your own biases, first?
People don't change their minds on the internet. They usually do that in books, battlefields, or not at all.
Yeah, that's a little bleak.
Some people are just actively searchlighting for reasons to get outraged. They arenât worth listening to.
On that, we can agree.
People who need that much nuance werenât going to learn from your argument anyway.
Aaaannnd you've lost me again.
As I've said before – we as a culture have become profoundly unserious.
To me, the opposite of "serious" is "funny." So the opposite of "unserious" would be "unfunny." Me? I'd rather be funny.
Maybe if we just start saying what we mean and placing the impetus on the reader to read nuance into the topic, we'll all grow up a little.
Well, sure, calling me childish certainly helps your argument.
People are tired of having to decorate their speech to make it marketable.
I think by "people," he means "I." As in him, not me.
I think we're past peak Woke, and I think part of what that means is that we're past peak not-being-able-to-speak-like-adults.
Ah, there it is.
I'm not dismissing his argument, mind you. I read the whole thing, top to bottom (it's really not that long). I simply don't agree with most of it, though I accept that my opinion could change.
If you're going to do a metaphor, make sure it's one that we can actually relate to. Bungalows may not be exotic or ritzy, but they're generally better than no home at all.
So that's me, saying what I mean.
November 7, 2024 at 10:08am
I've linked Aeon articles in here before. This seems to be an affiliated site, Psyche. Apparently, someone loves Ancient Greek.
How to do mental time travel
Feeling overwhelmed by the present moment? Find a connection to the longer view and a wiser perspective on what matters
Bullshit sense... tingling!
But let's give this a chance.
You have a remarkable talent – the ability to step outside the present, and imagine the past and future in your mind's eye.
Hey, you know what else you can imagine? Things that never happened, places that don't exist, and impossible scenarios. It's a rather important ability for fiction writers. Or city planners.
Some people apparently don't have the capability for visualization, while others can experience it more vividly. But I think it's fair to say most people have some ability to imagine.
Known as "mental time-travel", some psychologists propose it's a trait that allowed our species to thrive.
Well, at least this is the polar opposite of the "staying in the present moment" crap that's been circulating. I'll give it that.
If I ask you to imagine what you did yesterday, or what you're planning for tomorrow, you can conjure up rich scenes in the theatre of your mind.
Well. Some of us can. I doubt the author intended to be ableist, but this is a bit like saying, "If I ask you to walk a mile, you can get up and walk a mile," ignoring or forgetting that paraplegics exist.
In the accelerating, information-rich, target-driven culture of the early 21st century, the present often dominates thoughts and priorities instead.
And some people seem to want to make "the present" the only priority.
We need to be present-minded sometimes. However, too much focus on the "now" can also lead to the kind of harmful short-termism that infuses business, politics and media – a near-term perspective that worsens many of the long-term challenges we face this century, such as the climate crisis.
I can't really disagree with that. I've said similar things. Whether we can visualize things or not, we can learn from the past and make plans for the future, and both of those things are important.
But it's also compounded by a host of unhelpful human habits and biases too, such as our "present bias", whereby we tend to prioritise short-term rewards over long-term benefits (the classic example is the marshmallow test, in which some children can't resist eating a single treat now, rejecting the chance to chomp two later on).
Yeah... that might not be the best example to quote.
A longer view provides a deeper, richer awareness of how we fit into the human story – and the planet's – and reveals just how fortunate you are to be here, right now. The geologist Marcia Bjornerud calls this perspective "timefulness".
I'm not sure I like that name any more than I like its apparent inspiration, mindfulness.
Why a geologist gets to weigh in at all should be obvious: they generally cultivate a real sense of deep time, working as they do with rocks that sometimes predate eukaryotic life.
In this Guide, I'll share practical tips and exercises that can help you escape the unwanted, short-termist distractions of the present, and discover the upsides of a longer time perspective.
The author proceeds to do just that, and at length. I don't think I need to copy anything else; if you're interested, go to the link (hopefully it won't rot anytime soon). I will note, however, that he does turn back to "mindfulness" at one point in there.
As for my bullshit sense, well, jury's still out for me. I tend to distrust pop psychology (for example, the marshmallow study, above), though that doesn't mean it's all bullshit. But right now, I have plans for the rest of the day because, no, I don't live in the present.
November 6, 2024 at 9:03am
Well, that's finally over. Now I just get to grump at holiday season chatter.
Today's article, from BBC, has nothing to do with politics or seasons, and everything to do with the invention you're using right now.
There have been differences of opinion concerning the actual beginning of the internet. It's not like a human birth, or Armstrong's boot on the moon: a clear and obvious transition point. In my view, this article is more about a precursor technology, but a vital one for what the internet became.
On 29 October 1969, two scientists established a connection between computers some 350 miles away and started typing a message. Halfway through, it crashed.
1969 would have been long before "try rebooting and reinstalling all your drivers" would become tech support's second suggestion, after "make sure it's plugged in."
At the height of the Cold War, Charley Kline and Bill Duvall were two bright-eyed engineers on the front lines of one of technology's most ambitious experiments.
I should note, for context, that this was the same year as the aforementioned moon landing. Unlike NASA's stated mission, though, this early attempt at remote networking was in service of more military pursuits.
Funded by the US Department of Defense, the project aimed to create a network that could directly share data without relying on telephone lines. Instead, this system used a method of data delivery called "packet switching" that would later form the basis for the modern internet.
Like I said, not the actual invention of the internet, and military.
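The core idea is simple enough to fake in a few lines, though. A toy sketch (mine, and nothing like the actual 1969 protocol): chop a message into numbered packets, let the network scramble them, and reassemble at the far end.
import random

def to_packets(message, size=2):
    """Split a message into (sequence number, chunk) pairs."""
    return [(seq, message[i:i + size]) for seq, i in enumerate(range(0, len(message), size))]

packets = to_packets("LOGIN")
random.shuffle(packets)  # packets may arrive out of order
print("".join(chunk for _, chunk in sorted(packets)))  # sequence numbers restore "LOGIN"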
It was the first test of a technology that would change almost every facet of human life. But before it could work, you had to log in.
Some things don't change.
But Kline didn't even make it all the way through the word "L-O-G-I-N" before Duvall told him over the phone that his system crashed. Thanks to that error, the first "message" that Kline sent Duvall on that autumn day in 1969 was simply the letters "L-O".
And that's what I find amusing about the story: it's very Biblical. "Lo!" As in "Lo and behold."
On the other talon, I want to think that once they got it working (which, as the article notes, they did, after about an hour), the second message sent over this proto-internet was "Send nudes."
The BBC spoke to Kline and Duvall for the 55th anniversary of the occasion.
The rest is a transcript of that interview. It goes into more depth over what happened (or didn't happen) at the time, but there's no reason to repeat it here.
As compelling as this origin story is, I had this vague memory of a different origin story for the internet, one which took place some years later. In a rare case of me actually looking something up, I found this entry from 2019: "Birthed in Beer"
So if I had to choose which one was the actual invention of the internet, I'd pick the 1970s one, because it involved beer.
November 5, 2024 at 10:17am
Ah, yes, November 5, and Election Day in the US. The UK will be celebrating an attempted terrorist attack, while over here, we're trying to avoid a terrorist attack.
The Random Number Gods have chosen to bless us with another Cracked link today.
Okay, but, no, there are no spots on Earth where the laws of physics actually fall apart. Well, maybe, sometimes, at CERN, but they do it on purpose.
Still, these are rather interesting.
There are some places on the planet where things get weird. For instance, you ever heard of the Bermuda Triangle? Well, it turns out there's nothing weird about that bit of the ocean at all – it sees a lot of traffic, but vessels that travel there are no more at risk than those anywhere else.
Like Bigfoot, that's not going to stop humans from making shit up about it.
5 Gravity Drops Near Sri Lanka
Once you learn that the force of gravity is slightly variable across the planet, it stands to reason that there are some spots where it's less and others where it's more. Finding out where it's less, though, that's what science does.
Gravity varies from place to place – and in some places, it varies a lot. In the ocean near Sri Lanka, gravity is so much weaker than in the rest of the world that the sea level is more than 300 feet lower than it would otherwise be.
Hey, I just came up with a fix for rising sea levels! Just increase the gravity of the Indian Ocean, and presto!
To know why gravity's so low there, we'd have to burrow deep into the planet, and possibly cut it in two, which is inadvisable.
Awww.
Now, look, what that article's not telling you is that it's a minuscule effect. The variation from average, above or below, is about 0.5%. We'd never feel it. Sure, it has a profound effect on sea level, but look at what the comparatively really very tiny effect of the Moon's gravity does to that on a daily basis.
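To put a number on "we'd never feel it" (my arithmetic, with an illustrative 70 kg person, not anything from the article):
m, g = 70.0, 9.81  # kg, m/s^2
print(f"{m * 0.005 * g:.1f} N out of {m * g:.0f} N")  # a 0.5% swing: ~3.4 N, roughly a sandwich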
4 The Toasty Bit of Norway
In the Norwegian Sea, we have one disturbing bit called the Lofoten Vortex, where the water stores an unusual level of heat.
Lofoten Vortex can be the name of my Bjork cover band.
Yes, yes, I know Bjork is from Iceland, but come on, look at a globe. (A flat map inevitably distorts the distance between Iceland and Norway.)
3 On Top of Paraguay, Magnetism Disappears
I recently saw an argument for why the Earth's magnetic field isn't as important as we thought for maintaining our atmosphere. But this isn't about that.
Without looking at the map, if you had to guess one spot where the magnetic field gets weird, maybe you'd point your finger at one of the poles. But the planet's rotational axis, which defines where we put the north and south poles, isn't the same as its magnetic axis (which creates the magnetic field). As a result, we have this belt of radiation around the globe that dips down and comes close to us at this unlikely spot above Paraguay.
So, it doesn't "disappear." It just gets weaker there. How much weaker, I can't be arsed to look up. I can forgive a comedy site for hyperbolic headers, but they made it sound like, I don't know, compasses won't work in Paraguay or something.
2 The Tulsa Center of the Universe
You might not have known that the center of the universe is in Tulsa, but that's what this spot is named, and Oklahoma wouldn't lie to us.
Can't be the center of the universe; I don't live in Tulsa.
We'd go investigate ourselves, but that would mean having to spend time in Tulsa.
Good reason.
1 The Cave Where Energy Comes from Rock
I mean, technically, coal is rock. Uranium ore is rock.
But then you have Movile Cave in Romania. The interior is totally cut off from the outside world, and the creatures in there get no energy through photosynthesis, either directly or indirectly. Instead, the producers of this food web are bacteria that get their energy through chemosynthesis.
Here, though, they're talking about an entire cave ecosystem that doesn't rely, at its base, on solar energy input (apart from, you know, it not being frozen solid and all). Which is outside our normal experience, and definitely doesn't break the laws of physics any more than the other examples do, but is really interesting. Because now we know for sure that life can exist without photosynthesis.
Exist, yes, but imagine crawling out of that cave to go vote. Who are the candidates, now?
November 4, 2024 at 8:17am
This Cracked article might have been better to post before Halloween, but remember, tomorrow is Election Day here. Scary stuff is still in our future.
I have often looked at articles about what some scientists are doing and think: "You fools! Have you never seen a horror movie?"
Animal testing is an unfortunate but necessary part of certain scientific fields – that is, until the scientists decide to go all Dr. Moreau on some rats. Then it's just weird. It's like they've never seen a single monster movie.
See? I'm not the only one. (Sometimes, "horror" gets replaced with "science fiction," but usually, horror is still a subgenre.)
5 The Spider Goat
Spider silk is super useful for a lot of different things, but there's a reason you don't see a lot of spider farms.
I saw a video recently on how silkworms do their thing. Silkworms are larval moths, and generally unpleasant to look at, but they lack the visceral horror of arachnids. Still, spiders are cool... from a distance.
In 2012, Utah State University geneticists rectified that problem by splicing spider DNA into goat embryos, who eventually grew up to lactate spider silk.
Friendly neighborhood spider-goat.
4 The Man Mice
In 2013, scientists at the University of Rochester implanted human glial brain cells into the brains of newborn mice, who became much smarter and learned faster than other mice as a result.
"What are we going to do tomorrow night, Brain?"
Also, anyone who has ever seen a horror movie should have implored them to stop.
3 Acid Elephant
In the '60s, experimenting with LSD was all the rage, in both the scientific and "hanging out in your friend's cousin's basement" senses of the word. By 1962, scientists at the University of Oklahoma had run out of ideas until one of them asked, "What if we gave an elephant a thousand doses?"
Which makes me wonder: if we see pink elephants, what do elephants see? Gray humans?
2 Magnetized Cockroaches
"Can you turn a cockroach into a magnet?" is a question you only ask after you've been seriously scientifically jaded...
...or on a serious acid trip.
1 Zombie Dogs
Believe it or not, we do know how to bring once-living creatures back from the dead.
Thus shifting the definition of "dead." Used to be: heart stopped = death, but then CPR came along, and now doctors have to pretty much guess when the point of no return occurs.
Cornish hoped to try his method on humans, specifically a recently executed prisoner, but the government forbade it not out of any fear of a zombie apocalypse but because they weren't sure how double jeopardy laws applied to a revived corpse.
Seems like an important legal loophole to fix.
Add to this mix the penchant of certain scientists to revive millennia-old bacteria found in ice cores, and you definitely have the makings of a horror movie. But still not as scary as tomorrow's election.
November 3, 2024 at 6:49am
I have an article in the queue about author deaths. Back in June of 2021, I shared a different article about author deaths: "Inevitable"
As you know, one reason I do these entries is to see how things have changed. I can assure you that every person listed in that original article is still dead.
The original article, from LitHub, is still up. Though I clicked on it to check, I no longer follow that site and want nothing to do with them, which is why we haven't had any LitHub links in over a year.
So here's my 2024 take on my 2021 words:
Most of us don't choose the time and place of our demise, with notable exceptions such as Hunter S. Thompson.
On manner of death, Thompson, of course, plagiarized Hemingway.
Quote from the article: "Camus died in a car crash. Simple enough, right? ...Apparently, Camus once said that the most absurd way to die was in a car accident." My response:
I can think of far more absurd ways to die, but most of them involve alcohol and maybe prostitutes.
I think, with that, I was trying to invoke the same ironic twist ending that Camus experienced.
Interesting as some of these are, I think it's better to be remembered for how you lived than for how you died. But, failing the former, I'll take the latter.
November 2, 2024 at 9:58am
I guess sometimes Lifehacker is good for a laugh. Laughacker? I don't know.
Settle down, Beavis.
Roast chicken is an everyday pleasure – a good fit for both special occasions and midnight snacks.
Which is why my grocery store sells rotisserie chicks.
While you might be familiar with the classic roasting style, with trussed legs and tucked wings, this method can lead to overcooked breasts and soggy thighs, two phrases I want nowhere near my chicken.
Heh heh heh huh huh
Thereâs a better way to roast your chicken for more even cooking: spatchcocking.
Bwaaaaahahahaha
You can spatchcock, or butterfly, any bird.
That bit might only be funny if you're familiar with British slang.
Traditional roasting puts the driest cut of meat (the breast of the chicken) at the top, often closest to the heating element – before you've even turned up the heat, it's a recipe for overcooking. The parts that are juiciest (i.e. the thighs) are lower, if not completely under the rest of the body, and shielded from direct heat.
Somehow I'm hungry for wings, now, and I hate wings.
Illustrated instructions follow. I'd suggest not going to the link if you're vegan or vegetarian, or, like me, are allergic to hard work.
November 1, 2024 at 11:01am
Did my early voting today, so I can spend Election Day doing more important things, like drinking.
Unrelated to that or, well, anything else, really, is this Atlas Obscura article featuring an obscure city in an obscure state:
Greg Brick knew it was there, lurking beneath his city, hidden within the Minneapolis water and sewer system: an enticing geologic anomaly called Schieks Cave.
That sort of thing fascinates me, but I can't be arsed to do all the work or break the laws necessary to explore for myself.
When Brick arrived at the cave, he shined a flashlight into the thick darkness of the city's underbelly. He found not just the natural void but also concrete walls that previous generations of civil engineers had built to support the natural structure.
If those civil engineers were anything like me, they wrote something like "put a concrete wall here" on a map, and some minimum-wage temps did the actual work.
The water was about 20 degrees hotter than it was supposed to be. It was more like the groundwater of Mississippi than that of Minneapolis. Something was warming the water beneath his city.
Well, clearly, that's because Minneapolis is actually a gateway to Hell.
[The cave's] discovery in 1904 by a city sewer engineer was initially kept secret lest the public fear Minneapolis had been built on unstable ground.
Apparently, it was, else generations of civil engineers wouldn't have specced out concrete support walls in the cave.
He finally had a way to access the cave – but there was another problem. Along the route, the raw sewage poured from shafts overhead, shooting bacteria into the air as it splashed down and creating what's politely known as "coliform aerosols." Unsurprisingly, Brick got sick.
Another reason for us civil engineers to stay at our desks.
In 2008, a separate team from the University of Minnesota had predicted that heat from Minneapolis's urban surface was conducting itself deep underground, heating the groundwater there like a metropolitan microwave.
As the article notes, this turned out to be the correct explanation, not the "gateway to Hell" one. Much to my disappointment.
But it's not all bad news: Canadian and European researchers recently suggested recycling underground heat and using it as a low-carbon way to heat homes, while also cooling the groundwater back down to normal temps.
I hope their "suggestion" included exactly how to do that.
Now, from the headline, I was expecting some discovery of dark matter or a violation of the Second Law of Thermodynamics or whatever. But no. Still, at least it's not boring.
October 31, 2024 at 9:44am
Samhain, and the temperature is supposed to get up into the 80s (Freedom Units) today. Not that I'm unhappy about that, but it is somewhat unnerving for the end of October.
This Halloween night coincides closely with a new moon (8:47 Eastern tomorrow), so it's not nearly as cool as when it coincides with a full moon.
So I'm doing... absolutely nothing for the occasion. Except, of course, trying to keep both of the black cats indoors, because some people are idiots.
Speaking of the moon, though, here's a lunar mythology article from Discover:
The moon is something all people on Earth, no matter where or when they've lived, have in common.
Unless, I suppose, you spend your entire life underground.
That we all share the moon does not mean that we imagine it the same way, however. In our myths and stories, the moon plays many different roles.
We know more about the moon than ever before, and we continue to learn more. That dead hunk of rock in the sky still has secrets to give up. Myths and legends don't tell us much about the moon, but they do reveal a lot about us.
The relationship between the moon and the tides, for example, was clear to early peoples who lived near the sea, explains Tok Thompson, an anthropologist at the University of Southern California who specializes in folklore and mythology.
That's probably an example of noticing a correlation without understanding the deeper effects (in this case, gravity) behind the correlation. In other words, the "observation" part of the scientific method.
The moon, with its regular phases, also functioned as a calendar. You could track the days by the sun, but for longer periods, the moon was useful. The English word moon comes from mensis, the Latin word for month, which is also the origin of the word "menstruation."
Much as I enjoy a good etymology, I'm not sure this is right. Wiktionary traces the word back through Proto-Germanic, all the way to PIE. This suggests to me that, rather than having its origin in Latin, the Latin and the Germanic words share a root.
I'm not an expert, by any means, but this is enough to make me question the "comes from Latin" assertion.
Pretty sure Luna is from Latin, though; that was the name of a moon goddess.
All that aside, let's stop pretending that Gregorian calendar months are related to lunar cycles.
So yes, ancient cultures knew a lot about the moon, and they wanted to share that knowledge to keep a record of it.
I also question the qualifier "a lot" in that sentence.
Thompson's favorite myth involving the moon is a creation story from the Tlingit people of the northwest coast of North America. In this story, an old man keeps all the light in the world stashed away in a box. Through wiles that vary from telling to telling, the trickster Raven steals the box and releases the Sun, the moon, and all the stars, bringing light to the world.
One could be forgiven for wondering how Raven was able to see the box, considering there was no light at the time, but myths follow their own, dreamlike, rules.
In Chinese mythology, a woman named Chang E drank a magic elixir, whereupon she floated all the way to the moon, and there she lives still – with, in some versions, a rabbit.
Radical feminists: "Hey, gimme some of that elixir!"
We still have some fascinating myths about our favorite satellite. Probably the most persistent is that the moon can drive us mad.
I find that it's necessary to separate the two meanings of "myth." A foundational story, like the ones about Raven or Chang E, is not the same thing as a persistent falsehood. I don't think this article draws the distinction.
Many people – including some healthcare workers and police officers – believe that crime, traffic accidents, psychiatric hospital admissions, and even suicides spike during a full moon.
Or that could be cognitive bias. You notice things more when they fit your pre-existing worldview. Like when you're frustrated by machines and bureaucracy, and think "Mercury must be in retrograde."
These days, we don't blame the effect on moon goddesses or magic elixirs but on something at least quasi-scientific: the moon's gravitational effect on the water in our bodies. On its surface, that makes a lot of sense.
No. No, it does not.
After all, the moon creates tides, and roughly half our body weight is water.
Okay, but have they ever observed tides in a mudpuddle?
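There's real arithmetic behind the mudpuddle snark: the moon's tidal stretch across an object scales with the object's size, roughly da = 2*G*M*L/d^3, so planet-sized things feel it and person-sized things don't. A sketch with standard constants (my numbers, not the article's):
G, M_MOON, D = 6.674e-11, 7.35e22, 3.84e8  # SI units; D = Earth-Moon distance
for label, L in [("a person (1.8 m)", 1.8), ("Earth, pole to pole", 1.27e7)]:
    print(f"{label}: {2 * G * M_MOON * L / D ** 3:.1e} m/s^2")
That's about thirteen orders of magnitude below g for the person. The ocean's only advantage is being planet-sized.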
And indeed, though a few small studies have found some possible effects of the moon on mental health, research over the past decades has not borne this out. For example, a Swiss study published in 2019 looked at almost 18,000 cases of in-patient psychiatric admissions over ten years and found no correlation between the phase of the moon and psychiatric admissions or length of stay.
There's some nuance there, but basically, the "moon causes madness" bit is the second definition of myth.
Pausing to gaze at the moon can create an intense feeling of both awe and peace. Rather than making us mad, the moon might just make us a little bit more sane.
Nice to think, but there's little evidence for that, too.
Still... go look at the moon. Might want to wait a few days, though; it's too close to the sun right now.
© Copyright 2024 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved. Robert Waltz has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.