Complex Numbers
A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.
The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.
Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.
Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
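The most famous example is probably the Mandelbrot set, which comes from doing nothing more than repeating z -> z*z + c for every point c of the plane and watching whether the result stays bounded. Here is a minimal Python sketch of that idea (the grid size, character ramp, and cutoff are my own illustrative choices, not anything from the text above):

    def escape_time(c, max_iter=50):
        z = 0j
        for n in range(max_iter):
            z = z * z + c          # the whole "very simple transformation"
            if abs(z) > 2:         # once |z| exceeds 2, it is guaranteed to fly off to infinity
                return n
        return max_iter            # never escaped; treat the point as inside the set

    def ascii_mandelbrot(width=60, height=24):
        chars = " .:-=+*#%@"
        rows = []
        for j in range(height):
            row = []
            for i in range(width):
                # map the character grid onto the rectangle [-2, 1] x [-1.2, 1.2] of the complex plane
                c = complex(-2 + 3 * i / width, -1.2 + 2.4 * j / height)
                t = escape_time(c)
                # scale the escape time (0..50) onto the character ramp
                row.append(chars[min(t * len(chars) // 51, len(chars) - 1)])
            rows.append("".join(row))
        return "\n".join(rows)

    print(ascii_mandelbrot())

Run it and the familiar bug-shaped blob appears, boundary filigree and all, out of one line of arithmetic.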
October 31, 2023 at 10:59am
It's a 2.5-year-old article from LitHub, but it's not like style ever goes out of style, right?
Parul Sehgal once argued that style "is 90 percent punctuation."
Sure, not like anyone grumps about word choice.
As John Mayer once apparently said, for some reason, "Ladies, if you want to know the way to my heart: good spelling and good grammar, good punctuation, capitalize only where you are supposed to capitalize, it's done."
Apparently, dudes could get away with anything.
After all, even among experts, there are disagreements, some of them oddly vehement. (What inner forces would compel someone to demonize or deify the semicolon?)
Oh, I don't know... search my blog for "semicolon" and you might find out. (I'm a fan of it, obviously.)
Then the article goes into the punctuation marks in general. I'll only highlight the ones I agree with:
(semicolon) In compiling the sentence, efficacy—or, more precisely, precision—is important; capacity is important; and clarity is important. This kind of writer, at least, doesn't think in little stoppered declarative sentences. It isn't like that. Not really ever. Perhaps for some people. But not for us. For those of us whose thoughts digress; for whom unexpected juxtapositions are exhilarating rather than tiresome; who aim, if always inadequately, to convey life's experience in some semblance of its complexity—for such writers, the semi-colon is invaluable. -Claire Messud
I have never heard of this author before, but that makes me want to seek her work out.
(exclamation point) Keep your exclamation points under control. You are allowed no more than two or three per 100,000 words of prose. -Elmore Leonard
I think this is one of those things that's going to be different depending on what you're writing. But yeah, I believe in using bangs (easier to say than "exclamation point") sparingly so as to maintain their power. Unlike cuss words, which, as my mom's family was from New York City and my dad was a sailor, to me, are really just punctuation marks.
(em-dash) Don't you find it annoying—and you can tell me if you do, I won't be hurt—when a writer inserts a thought into the midst of another one that's not yet complete? Strunk and White—who must always be mentioned in articles such as this one—counsel against overusing the dash as well: "Use a dash only when a more common mark of punctuation seems inadequate." - "many editors"
Contrary to what I said above, I don't really agree with this, but as with the bang, sparingly is good. I can see right through this writer's ham-handed attempt to convert people to their point of view by deliberately overusing it.
(comma) [On her editor, Bob Gottlieb, who famously "was always inserting commas into Morrison's sentences and she was always taking them out"] We read the same way. We think the same way. He is overwhelmingly aggressive about commas and all sorts of things. He does not understand that commas are for pauses and breath. He thinks commas are for grammatical things. We have come to an understanding, but it is still a fight. -Toni Morrison
Oscar Wilde rather famously once wrote: "In the morning I took out a comma, but on mature reflection, I put it back again." As the most-used punctuation mark apart from the period ("full stop" for those of you across the pond), it's subject to a great deal of wasted words and time. I admit to feeling somewhat judgemental toward people who misuse them, but I can't always articulate why they're misused in a given instance.
I'm more judgemental about apostrophe abuse.
(hyphen)
No quote here. I did a whole blog entry on it a while back: "Dashing". In it, I overused and misused em-dashes, so take it as you will.
(period)
Do we really have to argue about full stops?
James Joyce is a good model for punctuation. -Cormac McCarthy
James Joyce is not a good model for anything, and now I have even less desire to read anything McCarthy wrote.
I mean, if you write properly you shouldnât have to punctuate. -McCarthy again
And that sentence, folks, should be in the dictionary as a sample under "irony."
To be clear, I'm not trying to claim I'm always right. I'm aware that I overuse semicolons sometimes (as well as parentheses), and get things wrong on occasion. Especially here, where every entry is a first or second draft. But that doesn't mean I don't consider these things.
October 30, 2023 at 9:32am
So you don't want to live on this planet anymore?
The idea of life on Mars (whether alien or transplanted from Earth) is obviously one of the oldest themes in science fiction and fantasy. It's so pervasive that some people take for granted that 1) we will colonize Mars and 2) at some point after we do, the colonists will rebel and declare independence, most likely resulting in a war.
If 1 then 2, because Mars is, after all, a god of war, and because all colonies eventually revolt; but 1 is still questionable.
At first glance, Mars seems pretty nice.
Compared to the other planets we know of that aren't Earth, sure.
No other world in the solar system offers us this chance. Mercury is way too close to the sun. Nearby Venus has far too much atmosphere, whose pressure and noxious gases would crush and choke visitors from Earth.
I vaguely recall a thing I linked here a while back that ran the numbers and concluded that the closest planets to Earth, on average, are 1) Mercury 2) Venus 3) Mars. I think it depends on how you calculate it. If this doesn't make sense, consider all the time that these planets spend on the other side of the solar system from us.
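If you want to run those numbers yourself, here's a rough sketch of one way to do it (my simplification, not necessarily how that original link did it: circular, coplanar orbits with uncorrelated positions):

    import math, random

    ORBIT_RADII_AU = {"Mercury": 0.387, "Venus": 0.723, "Mars": 1.524}
    EARTH_AU = 1.0

    def mean_distance_to_earth(radius_au, samples=200_000):
        total = 0.0
        for _ in range(samples):
            # pick independent random positions for Earth and the other planet
            a = random.uniform(0, 2 * math.pi)
            b = random.uniform(0, 2 * math.pi)
            dx = radius_au * math.cos(a) - EARTH_AU * math.cos(b)
            dy = radius_au * math.sin(a) - EARTH_AU * math.sin(b)
            total += math.hypot(dx, dy)
        return total / samples

    for name, r in ORBIT_RADII_AU.items():
        print(f"{name}: about {mean_distance_to_earth(r):.2f} AU away, on average")

Under those assumptions Mercury does come out closest on average, then Venus, then Mars; the time every planet spends on the far side of the sun swamps the familiar closest-approach figures.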
And yet, thanks to quirks of orbital dynamics, it's easier to get to Mars. Which doesn't mean it's easy.
At night, temperatures drop to -100 degrees Fahrenheit. Dust devils and shifting sands cover up solar panels and will test even the most tightly sealed spacesuits and habitats. During dust storm season, Martian winds can stir up haboobs that cover the entire globe in clouds of sun-blotting microscopic particles.
That's with an atmosphere that's less than 1% the density of our own troposphere. Also, "haboobs" still makes me chuckle because I'm actually 12 years old.
Humans have been slinging spacecraft Marsward for 57 years, and we're still not even batting .500.
It has been pointed out that Mars is the only world believed to be populated entirely by robots.
So far, the U.S. is the only country to land anything on Mars, and we've stuck the landing on eight of nine attempts.
If I recall correctly (and I might not), the reason the one mission failed is that someone forgot to convert metric to imperial units or vice-versa, which is actually kinda hilarious.
Anyway, the article continues with details of some of the technical challenges, which I won't go into. Numerous SF authors have, naturally, imagined possible ways around each of them, up to and including terraforming. I've done a bit of fictional speculation along those lines.
But I've read enough science fiction to know that it's not going to end well, in any case.
October 29, 2023 at 11:11am
Today's callback is a difficult one, and I'm not sure I have the mental energy for it this morning. The original entry was from about two years ago: "Man"
The article I referenced there was pretty new when I wrote that, and it's still up. If you don't want to read the earlier entry (understandable, though I promise the entry, if not the article, is short), here it is again.
My intro then: "I can admit when I just don't understand something. This is one of those times."
And I can't say I've developed a full, nuanced understanding now, two years later. The comments and later blog entry on the topic from Turkey DrumStik did help me to comprehend some things, but it's just not a subject I'm attuned to.
For one thing, I thought, for the vast majority of my life, that being a man simply meant that I was in possession of a todger. Presumably, when I was born, the doctor or nurse checked me out, saw a tallywhacker, and said something like "It's a boy!" I learned of this practice at a fairly young age, and also saw it used on farm and domestic animals, so my mind went "If you see a wangdoodle, that means it's male. Therefore, 'man' is simply defined as an adult human with a pecker."
All that other superfluity, such as "boys have shorter hair" (I didn't always, and don't now) or "girls wear dresses" (they don't always, and historically, boys wore dresses too) just seemed like a decision we collectively made as a society, one which could be reversed. Now I see that this stuff is what people mean by "social construct." And it still feels optional to me.
After all, my cats don't have a gender identity. Their gender is simply their biological sex. The queens don't wear makeup or hair bows, and the toms don't deliberately grow bushy beards or smoke pipes. Even being surgically altered doesn't change what pronouns you use for the animal.
More recently, I realized that this kind of thinking gets you labeled a transphobe, and no amount of denial of that on your part can ever change that label. It's far easier to change your gender than to get people to stop calling you a transphobe. It's kind of like if you say "I'm not an alcoholic," everyone will conclude that you are, in fact, an alcoholic. Especially if you're drunk and slur the words.
Anyway, that kind of thinking is why I always considered those outward markings of "manliness" to be superficial and optional.
Then, later in life, I realized that I was wrong. I can admit that, too; I'm man enough.
So I won't bother offering any support for my assertion that I in no way fear (or hate) trans people. Think what you will of me; I don't care.
And that brings me to the actual crux of the matter, at least for me, and for now: people seem to care a great deal about what others think of them, even, in some cases, when they claim not to. In another grand cosmic coincidence (see yesterday's entry for the last time this happened), one of my favorite webcomics touched on this very subject today. This link should be as permanent as anything is on the internet: https://www.smbc-comics.com/comic/escape-2
In the comic, a man (you can tell from his clothes, hairstyle, and facial features) goes to visit the archetypal Hermit on the Mountain (you can tell from the mountain, his robe/dress, and his beard). His question begins: "Wise Master, how do I escape? I'm a social ape. I'm obsessed with status. All my actions, even private ones, can be perfectly explained if you assume I'm seeking the esteem of other apes."
I'll tell you what, though: once I found a way to live that does not require the approval, or escaping the disapproval, of other social apes, I found a freedom that most people can't even conceive of. Once I stopped trying to peacock or scrabble for this mysterious "status" crap, life became a whole lot easier.
In short, I'm becoming the Hermit on the Mountain, and I'm okay with that.
One other unrelated thing about my earlier entry: Yes, I used to do Merit Badge Mini-Contests. I stopped because readership dwindled (or at least the number of commenters; being not obsessed with status, I don't often check my readership statistics) and because I quit doing entries at midnight in favor of being free to drink in the evenings. The schedule is much more flexible, these days, so it's harder to do deadlines.
But I'm also inclined to be generous, so who knows? I may start that up again in some form.
October 28, 2023 at 10:00am
Full moon tonight. This article being chosen randomly from my rather lengthy queue is pure cosmic coincidence.
Most sources on the internet insist that full moon names are linked to Gregorian calendar months. They are wrong. Giving names to full moons was a tradition of several different and disparate societies before that calendar was forced down their throats at gunpoint. Yes, this is related to my sporadic harping against the false definition of "Blue Moon."
Today's full moon is the Hunter's Moon, the second full moon after the northern hemisphere fall equinox. The corrupted sources say it's the full moon in the calendar month of October, but it doesn't always fall in October. Some cultures have different names for this instance of the full moon, but we're going with Hunter's Moon as it's certainly the best-known. So, shout-out here to our very own HuntersMoon!
The naming of full moons is a folkloric tradition (which is decidedly biased toward the northern hemisphere seasons), so has little to do with any scientific facts about the moon. But it's my self-imposed Quixotic duty to rail against attempts to shoehorn those names into a calendar that was never meant to track celestial events (though it's certainly useful for calculating solar returns).
Hence, to the article, which, like all good fact articles, is from Cracked:
If you haven't been to the Moon, everything you think you know about it might be wrong. "The Moon could be made of green cheese," people used to say, and we all know how wrong those people turned out to be.
I have another article in the queue that addresses the "green cheese" cliché. Maybe it'll turn up on the next full moon (which will be the Beaver Moon). But probably not.
Well, okay, no one ever really thought the Moon was made of green cheese. That was a figure of speech about people believing the absurd.
I wouldn't say "no one." Never underestimate the power of human ignorance.
However, people really were uncertain about the nature of the Moon until we reached there.
Well, yeah, that was one of the reasons we went there. The other being a dick-measuring contest against the USSR. Whatever; it got results.
I could smugly assert that I knew all these things already. And I'd be right. Except it's almost never right to be smug.
5. It's Hot
Mind you, not all of the Moon is hot. Some of it's very cold. But many people imagine the whole thing's cold, because space is cold, and that isn't true at all.
Neither is it true that space is cold. "Cold" and "hot" are imprecise terms for, respectively, things at a temperature below about 75F and things at a temperature above 75F. But in order to have a temperature, something has to be a... thing. A rock, a volume of water, even a gas above a certain density (our troposphere, e.g.). As a vacuum isn't a "thing" in this sense, it cannot have a temperature. The moon is a thing, so it does have a temperature, one which varies quite a lot depending on where you are on or in it.
The article goes into details, but also can't resist making a Uranus joke, so I won't give it the dignity of quoting from this section further.
4. It's Dull
Hey, now, it may not be as smart as the Earth, but that's a low bar to clear.
Though the Moon is white, it doesn't reflect light very well. That's because it's so rough. It's not a polished cue ball, it's rock and dust, which means it doesn't have much specular reflectivity.
This, too, is misleading. While there's some color on the moon, it's mostly various shades of gray, ranging from nearly black to not quite as black. If it were white, even with a surface just as rough, it'd appear a hell of a lot brighter. It's kinda-sorta related to this optical illusion.
Again, folks, don't get your science facts from Uranus joke sites.
Still, apart from that, this section has some cool photos.
3. It's Smooth
Well, yeah, in stories and poetry, it is often linked to romance.
A few seconds ago, we argued that the Moon is rough, not smooth. On the other hand, it's also a lot more smooth than people imagine.
So's Earth. Sure, up close, we have mountains, valleys, and a nice beard of forest, but I've read (though never bothered to confirm) that those variations are proportionately less than those of a cue ball.
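A quick back-of-envelope check of that claim, using commonly cited figures (with the usual caveat that this only compares big relief features against the ball's diameter tolerance, not fine-scale surface texture):

    EARTH_DIAMETER_KM = 12_742        # mean diameter
    EVEREST_KM = 8.8                  # highest point above sea level
    MARIANA_TRENCH_KM = 11.0          # deepest point below sea level

    CUE_BALL_DIAMETER_MM = 57.15      # 2.25 inches
    CUE_BALL_TOLERANCE_MM = 0.127     # the commonly cited plus-or-minus 0.005 inch spec

    earth_relief = (EVEREST_KM + MARIANA_TRENCH_KM) / EARTH_DIAMETER_KM
    ball_tolerance_band = (2 * CUE_BALL_TOLERANCE_MM) / CUE_BALL_DIAMETER_MM

    print(f"Earth's total relief:    {earth_relief:.3%} of its diameter")
    print(f"Cue-ball tolerance band: {ball_tolerance_band:.3%} of its diameter")

That works out to roughly 0.15% versus 0.44%, so the planet's biggest bumps and dents would indeed hide inside the allowed wobble, though an actual cue ball is far smoother up close.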
2. It Rings
Worth reading this section, but I'll just say this shouldn't be surprising. Sound propagates through solids.
1. It's Far
Only by Earth standards. By cosmic standards, the distance to the moon is undetectable.
But, again, the article goes into way more detail, because it's hard for us humans to contemplate cosmic distances. And this time, I have no real quibbles.
This section's references to Starfield make me want to play that game. But I'm already hopelessly immersed in Baldur's Gate 3, so maybe later.
I'll just close by suggesting that, if it's clear tonight, venture (shudder) outdoors and take a look at the Hunter's Moon. Whatever the facts... it's just plain cool.
October 27, 2023 at 10:14am
I always wondered about "carmine." Is it a bomb you put on a car to make it blow up? A hole in the ground where you dig for cars? Or is it pidgin for "my car?" Oooh, I know: it's a corruption of "karma mine," which is a place where you take out only what you put in.
Well, no. Sadly, it's none of those things, although they would absolutely make the world a more interesting place.
It is a not-unpopular name, especially for those of Italian origin, and one with several cognates, including Carmen (as in Sandiego) and Carmelita. But it turns out that the name has two origins, like a river with a fork: one from Latin, meaning song, and one from Hebrew, meaning garden. Not that famous "first" garden, though; that was something like paradise. (No, really, the original Hebrew word for it is also translated as paradise.) Just your regular old garden, though presumably one with flowers and bushes rather than cabbages. If one were poetically inclined, one could say that the name Carmine might mean "garden of song," which would presumably be attached to Leonard Cohen's "Tower of Song."
The way Hebrew might have gotten mixed up with Latin to create that name, I leave as an exercise for the reader.
I say all this because one might be forgiven for thinking that the name and the word for a certain deep red color share an origin. They do not. Though one Carmine I heard of, Carmine Infantino, was a comic book artist well-known in some circles (comic readers) for a great run on the Deadman series, and Deadman wore a carmine-colored onesie. Coincidence? Well, actually, yeah. Infantino was drawing an established character. Also, only the reader (and maybe other dead people in the comic) could see Deadman, as he was a ghost. His superpower was possession, incidentally.
What's not a coincidence is that carmine and crimson, the color words, share a linguistic root. In the analogy above, that would be two rivers sharing a single source, which is kind of uncommon for rivers when compared to the other way around, but not so rare in language.
No, carmine-the-color came from bugs.
Apparently, there was this one species of insect which, when tortured, killed, and ground up, produced the carmine dye. I have no idea if this is still the practice, but at least no insects were harmed in making that color in your photo editing software. After all, something's gotta live in your garden.
October 26, 2023 at 10:06am
I put up with reading articles like this one only so I can rag on them.
And already I'm negative about it.
Are you clinging to something that is not serving you?
My cats, for some reason, refuse to serve me; they insist that the other way around is the natural order of things.
Do you feel doubt? Failure? Depressed?
Are you about to tell me that I can think my way out of clinical depression?
It's time to let go of negativity and pain and accept life as it is.
Ha, ha, very funny.
...oh, wait. You're serious.
We encounter numerous toxic people every day who deliver subtle negative messages and instill negative ideas constantly.
And so we come to the root of why they want us to banish the negative and plaster smiles on our faces and pretend everything's sunshine and peaches: being a grump inconveniences other people.
However, in order to cultivate a positive mindset, you need to develop a strong mentality and separate yourself from the general crowd.
Once you do those things, you won't need to cultivate a positive mindset. Life will automatically be less stressful.
By giving more power to negativity, you will send out negative thoughts to the universe and end up attracting and manifesting negative things only.
Ah, yes, the power of magical thinking. Kind of the opposite of "develop a strong mentality" here.
If you let negativity creep into your mind, it will steal all your peace and happiness and put you in a dark pit of suffering.
Follow Master Yoda you must. Heed his words you should. "Fear is the path to the dark side. Fear leads to anger. Anger leads to hate. Hate leads to suffering."
To be clear here, I work to live my life without fear, anger, or hate. If those are the "negative" things they want banished, well, you're never going to banish them entirely, not without giving up everything that's worthwhile in the process. But that doesn't mean you have to let these things control you.
Shift your focus to the positive aspects of life, as there is always a silver lining in even the darkest situations.
Conversely, there's always a dark cloud in even the brightest situations.
Your reality is not based on what others think or say about you.
Neither does it depend on your negative thoughts or self-doubt.
... but you just said that it did, that negative thoughts manifest themselves through the universe's karmic channels or whatever. Which is it? (Answer: neither. Reality includes your thoughts, but is independent of your thoughts. Perception, sure. But not reality.)
So start by letting go of negative perspectives, grudges, jealousy, fear, pain, self-doubt, and toxic relationships that might hold you back.
I could take these one by one, but I have a dentist appointment later this morning, so I have limited time. While I tentatively agree that a lot of those things stunt one's personal growth, a healthy dose of self-doubt is important to keep a person from being too arrogant. And don't get me started on "pain." Assuming that means mental anguish and not physical pain, first, it's not so easy to get past trauma, and second, sometimes you have to wallow in it for a while before you get better.
When you let go of negativity from your past, you will be more mindful and bring your awareness to the present moment and stop worrying about what you cannot control.
I'm all for not worrying about what you can't control, but screw this "present moment" and "mindfulness" bullshit. We're nothing without our pasts, and if we don't concern ourselves with the future, it'll smack us in the faces.
Here are a few proven ways to help you release negativity from your life.
"Proven?" [citation needed]
I won't go into the "few ways." The article is there if you want to draw your own conclusions.
I'm not even saying everything in the article is wrong. As I said above, I'm not a fan of letting anger control me, and I don't live my life in fear. But there's also a hefty dose of airy nonsense and denial of reality, or at least a lack of acknowledgement thereof. Banish anger completely, and there goes one motivation to make positive changes. Banish fear, and you might think it's okay to take that shortcut through Mugger Alley.
Perhaps getting stuck in Pollyanna world is preferable to getting stuck in Eeyore world. But I'd rather not get stuck anywhere.
October 25, 2023 at 9:51am
More reasons why Prohibition was a bad idea, from Cracked:
Admittedly, the headline turns out to be a bit misleading, but at least it's not from some other shameless clickbait site.
My first clue that the article wouldn't match the headline was the opening image which, if you don't feel like clicking, features a pool upon which floats a life saver ring, and inside the ring is nestled a can of Bud Light. So, water floating in water.
A deep love of partying is generally not the path to a longer life.
No, just a more fulfilling one.
Alcohol is a poison, but one that offers some good stories and a possibility to meet your future spouse at some karaoke bar.
Too much of anything is poison, and I still follow my ironclad rule, scraped from experience, to never pick up women in a bar. No matter how much I've indulged.
But what if drinking could save your life?
That would be known as a coincidence.
That said, there are a couple people throughout history who have somehow caught a lucky break off a usually harmful substance.
Okay, okay, we get it: you're badmouthing booze so you're not responsible if someone walks away from the article with "I should drink more." Enough with the moralizing.
4. Mark Wahlberg
Mark Wahlberg, or Marky Mark, was a bona fide rock star back in the day.
As I did not know, until reading this article, that these were the same dude, I reject the label "bona fide."
When an opportunity to go party at the Toronto Film Festival presented itself, he canceled his earlier plans, bailing on a flight he'd already booked for the next morning from Boston to L.A.
In short, the flight he missed was on the morning of 9/11/01. The article implies that Wahlberg wanting to party saved his life. A bit of a stretch. You know what would be really weird? If something like this didn't happen. There's probably at least one cancellation on every commercial flight; this one just happened to be someone moderately famous, and the flight he missed became more than moderately famous.
If he'd been a "bona fide rock star," what was he doing flying commercial?
3. Clifton Vial
Not even Warhol-famous, this time.
Essentially, dude drove around Nome (not exactly known for its tropical climate, lying as it does just a few miles south of the Arctic Circle) with nothing but a 12-pack of Coors Light.
He had decided to drive out on the roads to, and I am not joking here, see how bad the roads were. They were bad.
I've done that, admittedly. But I live in a well-populated area.
When his car got lodged in a snow bank where it, and he, remained for three days, he turned to the 12 Silver Bullets in his trunk for sustenance.
Might as well have just melted some of the snow...
Not a bad idea, given that any beer drinker can tell you that Coors Light is basically water anyways.
See?
2. Moe Berg
The amount of alcohol involved in this story isn't as well documented... a dinner party where the guest of honor was famous physicist Werner Heisenberg...
One might say the amount of alcohol was... uncertain.
You could be forgiven for thinking that I linked and commented on this article because of the drinking bits. But you'd be wrong. No, I'm highlighting this article because it gave me an opportunity to make that pun.
The guest we're talking about, however, was a man named Moe Berg, who was a major league baseball catcher, a polyglot and an American spy.
I bet he got all the chicks.
Seriously, though, read about this guy. As the article notes, he really did fucking rule.
In summary, Berg's mission was to essentially butter up Heisenberg to see how close the Germans were to making a working fission bomb. If the answer was "close," Berg (in his profession as spy, not catcher) was meant to assassinate the scientist.
I do have to wonder why he didn't just do the dirty deed anyway, considering that Heisenberg was, at the time, working for the Nazis, and Berg (despite having part of the physicist's name) was the son of Jewish Ukrainian immigrants to the US.
According to some interpretations of quantum theory (of which Heisenberg was a pioneer), there's another universe where he did just that. Probably one where no one at the dinner party had been drinking.
1. People with Antifreeze Poisoning
After that last one, this is a bit anticlimactic. Antifreezeactic? Whatever.
If someone finds themselves in a situation where they've ingested antifreeze or another substance containing ethylene glycol, a drink even more dangerous than rail tequila, alcohol can be used to save their life.
Now, look. I don't usually do disclaimers here. But don't take medical advice from a dick joke site. Or from me. Even the article is aware of this:
Of course, don't read this and think if you accidentally chug some car juice, you can head for the liquor cabinet instead of the hospital and sleep it off. Anytime you drink poison, it's best to have a doctor involved.
...unless that poison is ethanol, naturally.
October 24, 2023 at 9:31am
You know what's bugged me for a long time? Well, yes, okay, "lots of things." But to keep things related to the topic:
Hunting and fishing are two of our oldest occupations. They existed in our evolutionary history long before we were recognizably human, and these occupations are followed today by, for example, cats, as well as many humans, whether for survival, commerce, or recreation. Yes, I know the joke about prostitution being the oldest profession, but the point there is the exchange of goods for services, whereas hunting and fishing provide their own goods, ones you could eat. Or trade to the prostitute.
That's not what's bugged me, though. That's just the background. The annoyance is this: in English, one who hunts is called a hunter. Okay, that makes sense and follows the general rules of English. And yet, one who fishes is, traditionally, called a fisherman.
So why is that, or, alternatively, why isn't one who hunts called a hunterman? It's a deep linguistic mystery.
Now, yes, the language seems to be moving, in its slow plod toward inclusivity, toward "fisher." But that's not my point. My point is that, while both occupations can be and, apparently, were practiced by all genders, we still ended up with "hunter" for anyone who hunts, but "fisherman" for someone who fishes (and "fishwife" for someone who sells the fish, but in that case "wife" meant "woman" and not "married woman," as it used to in Middle English and earlier).
As a side note, anthropologists used to assume that in hunter-gatherer societies, it was the men who hunted and the women who gathered. This was an unfortunate projection of then-dominant gender roles upon primitive peoples, which is yet another reason evolutionary psychology is not to be trusted. Turns out, in most cases, everyone participates. Or they don't survive.
Anyway. None of this explains why "hunter" also became the name for a shade of green.
Apparently, though I don't have a reliable source for this, hunters used to wear that color (which hunters, I don't know; I assume it's ones in the US and/or UK because we're talking about English, here). But by the time "hunter" was used to describe that shade of green, hunters had already started switching to more subdued, camouflage-like colors.
As with many words, though, the definition stuck even as the thing it referred to changed. Now you're more likely to see hunters in the US wearing bright orange camouflage—I gather this works because their principal prey, deer, don't have the visual color receptors to distinguish that from green/olive camo, but it does help the hunters to be seen by other hunters, reducing the incidence of hilarious tragedy.
October 23, 2023 at 10:00am
"Sorry, we can't hire you. Your natal chart shows Mercury retrograde in Pisces, and Venus square Jupiter, with Neptune ascendant."
Wait, did we forget to actually ask you any questions? That doesn't matter. Your responses wouldn't have affected the accuracy of your results, and the $4 billion personality-assessment industry has already decided the outcome for you in any case: As far as it's concerned, you're all of the above, your path in life has been determined and there's not a damn thing you can do to change that.
I've been likening that crap to astrology for a while, now. Thing is, yes, in my view, your path in life has been determined... but no one has any way of knowing what it is: no star charts, no genetic assessment, no divine revelation, no personality test, no Tarot reading. The future can go places that the past never dreamed of; you can't change what you can't predict, and if you could, that would lead to paradox. The difference between past and future is this: the past leaves evidence (memories, scars, bloodstains, ash, omelets), and the future does not.
As a rough analogy, we've gotten pretty good at predicting the general weather, short-term. While not 100% accurate, most of the time, if it says it's going to rain tonight, it's best to bring an umbrella. This comes from centuries of study, decades of computer modeling, and our very human trait of being able to project past trends into future probabilities. What we can't do, what we will never be able to do, is to predict precisely where and when the first raindrop will hit the ground. And yet, that raindrop was always going to land at that exact location at that precise moment.
On being handed his results, John raised an eyebrow at the fact that, according to the test publisher's blurb at the top, it had color-coded his personality according to the "four humors," the bodily fluids — yellow bile, black bile, blood and phlegm — that Ancient Greek medical theory held as responsible for regulating both people's physical health and the underlying aspects of character, or "temperaments."
As utterly silly as that test is, though (I'd be tempted to believe that the publisher was trolling), it, and other personality tests, don't purport to predict the future. They do, as far as I've heard, claim to predict how someone would react in a given situation—as if that's fixed, independent of other environmental factors, and completely discounting our ability to learn from our mistakes.
Did he really want to work for a company where teams were structured according to human-resources thinking from a time when motor function was attributed to animal spirits that roamed through the muscles and lived in the brain? "'Where's David?'" says John, mentally sketching a regular day at the office. "'Oh, he's in Meeting Room 2 at the moment, with the leeches on him.'"
Turns out leeches do have some legitimate, though limited, medical use... but it's not because of unbalanced humors.
The most well-known hot-take taxonomy of all, the Myers-Briggs Type Indicator (MBTI), is famously the product of a homeschooled Philadelphia novelist, Isabel Briggs Myers, feverishly working in the 1940s and 1950s to graft her own notions about personality onto a typology proposed by her intellectual hero, Carl Jung.
Let's be fair, here: "homeschooled" is irrelevant. "Philadelphia" is twice as irrelevant. "Novelist" is misleading: Carl Sagan was a novelist, but he was also a brilliant scientist and communicator of nonfiction. That other Carl, Jung, is worthy of study. But the problem there is not any of that, but the "graft her own notions about personality..." which is kind of like when a pharmaceutical company salts their testing results to get the outcome they want, such as "this drug is entirely nonaddictive and totally doesn't cause liver damage."
In her jargon, you were either a "thinking" person or a "feeling" person (your nailed-down psyche wasn't allowed to straddle both), you were either "introverted" or "extroverted" and you went about the world either "sensing" or "intuiting" it, and either rationally "judging" or irrationally "perceiving" things.
It is the binary nature of these purported attributes that should be a massive red flag. Most of us do "straddle both" of any opposing qualities.
Meanwhile, throughout the long decades since World War II, while this and similar systems have been refined, promoted and steadily entrenched as indispensable weapons of hiring and firing, pretty much the entire community of professional academic psychologists has been quietly coughing into its hand and saying "bullshit."
For reasons, I'm sure, beyond merely the "binary" objection I have; but then, I don't exactly have the qualifications to be an expert on this sort of thing, either. Having grown up on a farm, I can generally smell bullshit, but that doesn't mean I always know where it lies.
(I'm pretty damn proud of that "lies" pun, though.)
But, as the article points out, actual experts are calling bullshit.
In any case, the article may be of interest whether you love, hate, or fall somewhere on the love/hate spectrum with regard to these sorts of assessments.
October 22, 2023 at 8:22am
Today is a Sunday, on which it's been my habit, recently, to take another look at random older blog entries. This time, the dice landed on one from just under two years ago: "A Gift Beyond Price, Almost Free"
In it, I comment on a New Yorker article from the previous year, which in turn was a book review. The article is still available as of today.
Now, usually, I go through and comment on things that have changed, or point out where I embarrassed myself with my comments. For the latter, I will admit to misspelling Rush drummer/lyricist Neil Peart's name in that entry. Apologies, Neil Peart's Ghost. Not that he read my blog even when he was alive.
For the former, I don't think much of anything has changed, which only supports my continued refrain in that entry every time the article points out where capitalism is exploitative: we obviously don't really care, because we don't do anything about it.
Even back in 2021, when I wrote the entry in question, I'm pretty sure I'd given up on humanity being able to take collective action for its own good. As if continued denial of climate change wasn't enough proof of that, we were in the midst of a pandemic, as you may recall, one which depended on everyone banding together to keep things from getting worse. Instead, a good half of the US decided the measures asked were too much, or that they knew better than experts.
You remember that Dr. Seuss book, "Horton Hears a Who?" I barely do, myself, but two things stand out in my memory:
First, the titular elephant is trying to save the microscopic universe of Whoville, and faces ridicule because "anything that can't be seen or heard is nonexistent." In reality, rather than a children's book, those naysayers would win, because all it would take would be one of them to mess up everything (I've referred to this as "Lone Asshole Theory," but the truth of it is, in these cases, it's not just one lone asshole, but a whole group of them).
And second, focusing instead on the microcosm of Whoville itself, it turns out (spoiler alert) that the only way to save the village is for everyone in the village, without exception, to make enough noise so that the aforementioned macroscopic deniers can hear them. Almost everyone makes noise like it's Purim and the rabbi's about to say the name Haman. Once the last holdout is convinced to scream "Yopp!" (which I'm almost certain is a Walt Whitman allusion), the disbelievers finally obtain the evidence of their own senses. Again, in reality, as opposed to a children's story, about half of Whoville would disbelieve that all the noisemaking is necessary, and a significant number of them would file complaints against their neighbors in the Mayor's office—that is, they would, if they weren't about to be utterly destroyed in an apocalyptic event.
Not only that, but as the deniers had already made up their minds, no amount of evidence would ever convince them otherwise.
Yes, people have misinterpreted that book for their anti-choice agenda, too. And those people are also part of the problem.
Apart from climate change and pandemic denial ("anything that can't be seen or heard is nonexistent"), there was one more instance, after I wrote that blog entry linked above, that reinforced my conviction that we'll never get enough people to agree on a course of action to produce meaningful change, and that was the 2022 World Cup.
Going in, we all knew, or should have known, about the host country's use of involuntary labor, housed in foul conditions and worked to the bone, to build the shiny facilities for the event. We had an opportunity, then, to speak on their behalf, to sound our not-so-barbaric Yopp over the roofs of the world. To send a clear message to all the sponsors and advertisers and organizers: "We will not stand for this." But no... we (by which I mean "you," because I did not watch) sitting at home or gathering in sports bars or even attending in person, sent a different message: "Please, go ahead and enslave people, so long as our entertainment is cheap or free."
This sort of thing is why urging individual action will never work to solve our problems, or to save the unheard, invisible people.
October 21, 2023 at 9:29am
It's a five-year-old article, but what the hell; it's new to me. From GQ:
Why in the ever-loving shit would I want to fix what ain't broken? How can I get back what I never had? And most of all, am I going to have to once again rage against the misuse of evolutionary psychology in articles?
And why do I bother with GQ if I know I'm going to rant about it? Okay, that one I can answer: because it's fun.
Ever had someone tell you to just cheer up? Did it drive you crazy? Well, turns out that someone telling you to "be happy" isn't just annoying—it's also wildly unhelpful.
It's especially unhelpful for people with clinical depression.
Seligman compares being happy to falling asleep: it's not something you can actively do—in the way you can get stronger by lifting more weights. It just kind of has to happen.
Okay, sure, but unless you're tired past the point of exhaustion, the way to fall asleep is to get into a comfortable position and pretend to sleep until, at some point, you either fall asleep or say "screw this" and go play a video game.
My point is that in order to sleep, we usually first have to act like we're asleep. In that analogy, we'd have to pretend to be happy in order to be happy. So, do you still want to compare being happy to falling asleep?
Now, yes, I've said in here at least a dozen times that happiness isn't a goal, but a byproduct. So, sure, I don't completely disagree with that premise; I just had to nitpick the analogy.
And as the father of positive psychology—the study of what makes a good or meaningful life—much of Seligman's work has dealt with trying to help people figure out how to make it happen.
Hurk.
"Half the world is on the low positive affective spectrum," he says, referring to positive affectivity, a trait that usually correlates with sunnier dispositions. "I'm part of it, and a lot of the justification for what I work on, and what I write, is to try to help half the world, who is not naturally positive affective, to be more positive and optimistic."
WHY?! OH GODS WHY?!!!
What if... what if the only people who benefit from someone being mindlessly optimistic are 1) those around them who don't have to deal with someone's annoying pessimism and 2) our overlords, who will more easily control a populace who believes the best will happen?
What he has learned is that well-being can be broken into five elements: positive emotion, engagement, relationships, meaning, and accomplishment (PERMA).
Boy, shrinks love their mnemonic acronyms.
Why does it seem like we are wired for pessimism?
The species that [was] going through the Ice Ages had been bred, and selected, through pessimism.
Not only is that abysmally terrible evo-psych, but it's easily falsifiable terrible evo-psych. Not that I've disproven it, mind you; but consider: not all of humanity was directly affected by an ice age. Many populations were equatorial or near-equatorial. Are their descendants happier today, controlling for all other variables?
But the main reason why it's horrible evo-psych is that, as usual, it assumes that the only evolution that mattered to us started with humans, instead of us having ancestors dating back to the dawn of life, each of which contributed factors to evolution.
So is this at odds with something like mindfulness, which argues you should be present in the moment? If you're focusing on optimism, you're also sort of missing the present moment, right?
Well, I think, if you look at what people are doing, and what you're doing right now when we're talking, you're prospecting into the future.
But if positive psychology is at odds with mindfulness, and I also despise mindfulness, oh no! Cognitive dissonance! Wait, no, I've got it—I can hate both.
I've read a few people have said that you might be better off cultivating a sort of non-attachment to well-being: be mindful that a lot of life is going to be suffering, and if you can find contentment in that, you might be better off than seeking out happiness.
I think the good thing about meditation—mindfulness, concentrating on the present, detaching—is as good anti-anxiety, anti-anger tools.
"A few people have said," interviewer? A few people? You do know that that's Buddhism, one of the most widespread spiritual practices in the world, don't you?
Of all the things you've studied, or learned, is there one idea you constantly find yourself encountering most frequently?
I think it's hope.
Despite all my rage, I am still just a rat in a cage... I do believe in hope. I'll leave off quoting the article and explain that now.
So, let me simplify things, but with a simplification that can easily be extended to our more complex lives:
You have two possible outcomes, A and B. Maybe they have predefined probabilities, like drawing a certain card from a shoe in blackjack, or maybe not, but it doesn't matter. Let's say you label A bad and B good, or at least less bad.
From what I understand of positive psychology, it tells us that you should believe that B will happen. You should manifest that B will happen. But then if A happens, which it still might because no amount of manifesting will change the fact that it can happen, you're crushed, devastated, forlorn, lost. Whereas if B happens, you might feel a fleeting jolt of accomplishment, serotonin momentarily coursing through your neural network, and then it's gone.
On the other hand... if you convince yourself that A will happen, if you predict A, if you act as if A were the only way that the universe could possibly work... when A happens, you're not nearly as devastated, because you expected it. Whereas if B happens, you're not just experiencing fleeting pleasure, but absolute joy.
In other words, being pessimistic, seemingly paradoxically, must lead to greater overall happiness. But that's probably only true if you still hold out some hope that B might occur; and that's what I mean by "I do believe in hope."
This philosophy is most usually expressed by "expect the worst, but hope for the best."
No, I'm not deliriously happy all the time. But I have something I consider far more valuable: contentment. I may not be where I envisioned myself when I was younger, but I'm doing okay.
And that's the real trap of optimism: you think you can always do better, so you strive, you make changes, you expect things to improve, and then you're discontented when they don't. You're disappointed that you don't have what you want, instead of being satisfied with wanting what you have.
Or hell, I don't know. Maybe optimism works for you. I'm not judging. I only take issue with the idea that everyone should strive for some nebulous, glorious state of "happiness" at all times.
Incidentally, this is not the last GQ article in my queue. And the other? Well, it's even worse...
October 20, 2023 at 11:11am
Of all the weird, strange, or just plain incomprehensible names for colors, "navy" stands out as one that almost makes actual sense... from a certain point of view. Under certain ideal conditions, the ocean appears that deep, dark blue color, for reasons I can't be arsed to go into right now but you're right; it does involve physics.
But our word, navy, comes from a Latin word that referred more to ships than the ocean. Still, you know, you can't have a ship without something to float it in. Which reminds me that another word from the Latin navis is navigate, whose meaning should be limited to finding one's way around on a body of water. But, naturally, it's not, because we're pretty good at making new meanings for old words, and new words for old meanings.
Take, for example, space. Thanks to years and years of science fiction, we know what a vessel that carries things around in space is: a spaceship. And again, that makes sense from a certain point of view. But the harsh reality is that if we do get to the point where we have spacecraft transporting people or things around out there, the vessels will have far more in common with a submarine than they do with a ship. And, by naval convention, submarines are always boats, not ships.
"Spaceboat" just doesn't have the same ring to it, though, does it?
Don't ask me to define the difference between boats and ships further. My dad was a sailor, and I never fully understood the distinction he made, when he bothered to make one. Near as I can tell, a boat goes out from port or ship and returns to the same port or ship, while a ship carries cargo and/or passengers from one port to another (hence the verb "to ship," which also refers to sending parcels by road or rail). But by that definition, a ferryboat should be a ship, but it's not (I sidestep this by calling it a "ferry"). A ship can carry a boat, but a boat can't carry a ship, though a tugboat can push a ship, despite "tug" having the connotation of "pull."
Anyway. The other thing we get from science fiction is the use, in spaceships, of a "navigator," like on the original Enterprise. Properly, this should be "astrogator," but that would just lead to alligator puns (though some SF does use this term), so probably best to repurpose the word. And don't get me started on the etymology of "bridge," as in a spaceship's control room, which takes its origin from Mississippi River steamboats... dammit, I said don't get me started.
Still, sometimes it bugs me that landlubber GPS uses "navigation." I mean, we want fewer people not paying attention and driving their cars into lakes, right? But again, there's not much to choose from in terms of better words. "Orienteering" is the process of finding one's way around on land, but that doesn't really work in cars or trucks, does it? So we're stuck with navigation.
But even that makes more sense than calling a web browser a "navigator." Which you don't see much these days in English, but one early web browser was Netscape Navigator. And the French word for browser is "navigateur." This wouldn't bother me so much if the defining metaphor of the Internet weren't a spiderweb, rather than an ocean.
You could say all the contradictions give me the blues.
October 19, 2023 at 10:39am
I've been saying that every cliché started out as profound insight.
While they call out Mental Floss, the above link is from LitHub.
Worn-out phrases can make a reader roll their eyes, or worse—give up on a book altogether.
"Roll their eyes" is itself a cliché.
Clichés are viewed as a sign of lazy writing, but they didn't get to be that way overnight; many modern clichés read as fresh and evocative when they first appeared in print...
Which is what I've been trying to say.
Of course, many clichés are tired and worn-out, but they have to be used sometimes, or else how will people know when you subvert them or make a joke out of them?
Add Insult to Injury
The concept of adding insult to injury is at the heart of the fable "The Bald Man and the Fly." In this story—which is alternately credited to the Greek fabulist Aesop or the Roman fabulist Phaedrus...
Look, if anything's that old and passed into cultural mythology, it's not a cliché; it's an allusion. Or just part of the language, like a word, only it's a phrase. Like "part and parcel" or "cease and desist." Though no one knew the origin of this phrase. Except us, now.
Albatross Around Your Neck
If you studied the Samuel Taylor Coleridge poem "The Rime of the Ancient Mariner" in English class...
We skipped that one, and I still haven't read it, but I haven't seen this phrase used enough to consider it cliché. Also, it's an allusion, too.
Forever and a Day
This exaggerated way of saying "a really long time" would have been considered poetic in the sixteenth century.
My nitpicky mind always thinks "but when time runs out, we have no way of knowing the length of a day."
Happily Ever After
This cliché ending line to countless fairy tales originated with The Decameron, penned by Italian writer Giovanni Boccaccio in the fourteenth century.
Okay, look, no. Used at the end of a fairy tale, it's not a cliché; it's a formula, the flip side of opening it with "Once upon a time." Other languages use different formulae. You might as well claim that "amen" at the end of a prayer is a cliché.
It Was a Dark and Stormy Night
Edward Bulwer-Lytton's 1830 novel Paul Clifford opens with "It was a dark and stormy night."
Oh, come ON. I thought this list was about overused phrases that were fresh and wondrous in the beginning, but this infamous story-opener was widely hailed as "bad" from the get-go. That's why it makes great comedy material.
There are a few more, but I'll be honest, here: Yesterday, I finally broke down and purchased Baldur's Gate 3. And I'm in a hurry to get back to gaming. So feel free to see for yourself; I don't have the same sort of commentary on the others, anyway.
October 18, 2023 at 10:23am
Mostly, I just think this is cool, so I'm sharing it. (From BBC)
What did Stonehenge sound like?
New research into the prehistoric site's acoustical properties is revealing that the stone circle may have been used for exclusive ceremonies.
While fascinating, I'll note that there's still a lot of educated speculation in here.
Through the doors of a university building, down a concrete hallway and inside a foam-covered room stands a shin-high replica of one of the most mysterious monuments ever built: Stonehenge.
Shin-high replica of Stonehenge? Did they borrow it from This Is Spinal Tap?
"We know that the acoustics of places influence how you use them, so understanding the sound of a prehistoric site is an important part of the archaeology," said Trevor Cox, professor and acoustics researcher at the University of Salford in Manchester.
As long as the theory is sound.
(You're goddamn right pun intended, and I'm WAY more proud of that one than I have any right to be.)
Despite being the world's best-known and most architecturally sophisticated ancient stone circle, archaeologists still don't know who built Stonehenge or what it was used for.
We know who built it: People. That's right; not aliens. Probably.
Thanks to Cox's recent studies, however, we now know a fascinating detail about one of the world's most enigmatic sites: it once acted as a giant echo chamber, amplifying sounds made inside the circle to those standing within, but shielding noise from those standing outside the circle.
Which is cool and all, but did the ancient builders do that on purpose? If so, how, without a scientific theory of sound, did they know? Well, it's very likely that Stonehenge didn't spring suddenly out of nowhere; I'd bet money that earlier henges were made out of material slightly easier to obtain, transport, and build with, such as wood. So, I'd guess (but it's only a guess) trial and error.
Unless, of course, it was aliens.
Once the stones were painted grey and arranged in the correct distribution according to the computer model, the challenges of the testing process began.
Another thing I wonder is: why gray? (Look, they use British spelling and I use US spelling.) Does color somehow influence the acoustic properties? I'd have guessed it would be more about material and texture than color, which just goes to show that my guesses are just guesses.
Through mathematical processing, Cox was able to create a computer model that simulates the acoustic properties of Stonehenge and can distort voices or music to give a sense of what they would sound like within the circle. The results surprised him: although Stonehenge has no roof or floor, sound bounces between the gaps in the stones and lingers within the space. In acoustics, lingering sound is known as reverberation.
Which musicians use to great effect, mostly electronically these days.
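(Side note for the acoustics-curious: the textbook back-of-the-envelope estimate for reverberation is Sabine's formula, RT60 = 0.161 * V / A, which ties how long sound lingers to a space's volume and how absorbent its surfaces are. Here's a tiny Python sketch of that idea - my own illustration with completely made-up placeholder numbers, not Cox's actual model, and obviously a roofless ring of stones isn't a tidy concert hall:)

# Sabine's estimate: RT60 = 0.161 * V / A
#   RT60 = seconds for a sound to decay by 60 decibels
#   V    = volume of the space in cubic metres
#   A    = total absorption (sum of surface area times absorption coefficient)

def rt60_sabine(volume_m3, surfaces):
    """surfaces: list of (area_m2, absorption_coefficient) pairs."""
    total_absorption = sum(area * alpha for area, alpha in surfaces)
    return 0.161 * volume_m3 / total_absorption

# Made-up numbers, just to show the trade-off: hard stone reflects (tiny alpha),
# while the gaps and open sky effectively "absorb" everything (alpha = 1).
print(rt60_sabine(volume_m3=10_000,
                  surfaces=[(1_500, 0.02),    # stone faces: very reflective
                            (2_000, 1.00)]))  # openings: sound just leaves
# prints roughly 0.79 (seconds)

The point being: more hard, reflective surface relative to openings means longer reverberation, which is presumably the sort of thing the scale model was measuring in miniature.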
These results showed that Stonehenge would have allowed people inside the circle to hear each other quite well, while those outside would have been excluded from any ceremonies taking place. Cox's research adds to a growing body of evidence that Stonehenge may have been used for rituals reserved for a select few, with one study even pointing to the possibility of a hedge grown to shield the view from those not participating.
While this is sensible knowing what humans can be like (sometimes elitist), I wonder whether the people excluded, assuming this speculation is true, were also the people who did the hard manual labor of cutting, moving, and erecting the stones. That, too, would be typical human behavior. Thus, probably not aliens.
Cox acknowledges that unanswered questions about the real Stonehenge make it difficult for him to draw definitive conclusions from his work with the scale model.
Like I said above: educated speculation.
But, in short, apparently people living in England have been rocking out for a lot longer than we thought. |
October 17, 2023 at 9:15am October 17, 2023 at 9:15am
|
Unlike with most words, I have a vague memory of my first known encounter with this one. I know it was in a comic book, but (this is where the vague comes in) I can't remember whether it referred to the color of Superman's cape, or whether it was The Flash being called by one of his nicknames, the Scarlet Speedster.
It would be many years before I was forced to read Hawthorne's The Scarlet Letter as part of an AP English curriculum in high school, so never let it be said that we can't learn things from comic books.
The upshot of this is that, for me, the word and color "scarlet" would always be associated with heroism, and not, as is apparently the case in our complicated psychological color map, with sin. The shade of red associated with romantic love, e.g. on V-Day, is much darker (as is appropriate), and actual primary-color red mostly just has "stop!" and "expense!" connotations. Which also reminds me of romance.
The red used in comics, though, is red, not scarlet. There's a historical reason for this: the most common (probably cheapest) technology for printing in something other than monochrome was the four-color technique. There's a lot of technical stuff at that link that's irrelevant right now, but you might recognize that the system is still in use. You might even have one in your home and/or office, composed of a cheap-ass loss-leader printer, using four cartridges of ink that, ounce for ounce, is probably more expensive than gold.
Hence, Superman (or The Flash) was rendered, in comics and in the Sunday newspaper, mostly in bright primary and complementary colors: red, green, blue, magenta, cyan... even yellow, which doesn't always show up well against a white background.
Actual scarlet, which is on the red side of reddish-orange, was probably too subtle for the four-color process. But there, I'm just guessing. "Scarlet speedster" was likely used more for its alliteration than chromatic accuracy.
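If you want to see what the four-color process actually does with a color, the naive RGB-to-CMYK conversion is just a bit of arithmetic. Here's a quick Python sketch (the RGB value I'm using for "scarlet" is my own approximation, and real presses do a lot more than this simple math):

def rgb_to_cmyk(r, g, b):
    # Normalize 0-255 channel values to the 0-1 range
    r, g, b = r / 255, g / 255, b / 255
    k = 1 - max(r, g, b)          # black: how far the color is from full brightness
    if k == 1:                    # pure black; avoid dividing by zero
        return 0.0, 0.0, 0.0, 1.0
    c = (1 - r - k) / (1 - k)
    m = (1 - g - k) / (1 - k)
    y = (1 - b - k) / (1 - k)
    return c, m, y, k

# My approximation of scarlet (red leaning toward orange) vs. plain comic-book red
for name, rgb in [("scarlet", (255, 36, 0)), ("red", (255, 0, 0))]:
    c, m, y, k = rgb_to_cmyk(*rgb)
    print(f"{name:8} C={c:.2f} M={m:.2f} Y={y:.2f} K={k:.2f}")
# scarlet  C=0.00 M=0.86 Y=1.00 K=0.00
# red      C=0.00 M=1.00 Y=1.00 K=0.00

The only difference is a touch less magenta, which is about as subtle as I figured: easy to lose in a cheap printing process.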
As I can't seem to do one of these entries without researching etymology, though, I did so, and discovered that, apparently, scarlet was a relatively early word adopted into English: it appears in Middle English texts as far back as about 1250 C.E., while the color itself stretches back into the first millennium B.C.E.
And, like I said above, it's often associated with sin, because of English translations of the Bible. But I reject that association and substitute my own. |
October 16, 2023 at 9:50am October 16, 2023 at 9:50am
|
This is pretty cool, recent, and informative, though of course I have some quibbles.
Quibble 1: "Intelligent." I've railed on this before, but, to summarize: What they really mean is "technology-using." The signals postulated in the article would be the signs of tech-using beings. It is possible to be intelligent and never invent radio, spaceships, lasers, or Tamagotchi. In fact, we did not, up until a few decades ago, which is an insignificant sliver of time compared to the age of the universe. And, finally, using "intelligence" just invites tired, outworn clichés like "we can't even find intelligent life on Earth," which is one of those rare statements that automatically disproves itself because one must have a minimum level of intelligence to utter it and have it be understood.
So. When the article says "intelligent," substitute "technological," and you'll be closer to what I believe the intent to be.
This summer, a stony-faced David Grusch, a former US Air Force intelligence officer, sat before a House Oversight subcommittee and made some extraordinary claims. Chief among them is that the American government has a clandestine program that locates then reverse engineers unidentified aerial phenomena (UAPs) - an ostensibly less-silly way of saying unidentified flying objects, or UFOs - and that US operatives were in possession of nonhuman biological matter.
Quibble 2: UAP may be more or less silly than UFO, but I believe it to be a better fit. Many phenomena formerly attributed to UFOs weren't "flying objects" at all, but mirages or electrical activity. At which point they weren't "unidentified" either, but I'm willing to bend on that one, as everything is unidentified until it's identified.
It has the added advantage of not yet having accumulated years of fringe.
Quibble 3: I am in possession of nonhuman biological matter, too. They're called cats.
Grusch didn't provide an ounce of verifiable evidence, citing only anonymous sources telling him vague things. When pressed for confirmation, he said because this was all so exceedingly classified, he was unable to provide specific details while under oath.
Not-a-quibble: uh huh.
Let's get something straight: Congressional hearings are not the way we are going to discover the existence of intelligent alien life. They are a distraction from the bona fide alien-hunting work - the sort that doesn't involve grandstanding individuals and showy stunts, but scientists searching a sea of stars for the sounds or sights of extraterrestrial intelligence.
Quibble 4: We're also looking for signs of general life: byproducts of biological processes. There's difficulty there, because we only have one data point (Earf) for what to look for, but you gotta start somewhere.
Because space is inconveniently enormous and traversing it so intensely time-consuming (without bending the fabric of space-time to your will, anyway), it's exceedingly more likely that humanity's first brush with extraterrestrials (ETs) will come in the form of eavesdropping on radio transmissions they've sent, or seeing a sign of technological civilization with a telescope, than recovering a pancaked little green wayfarer from a crashed capsule.
Quibble 5: What's the first thing we did when we started exploring space? Sent robots, not people. No reason to assume hypothetical aliens wouldn't do the same. (No reason to assume they would, either; just gotta keep the possibility open.)
"If we detect a civilization, that means civilizations can exist for a reasonable amount of time and overcome their issues and problems," says Ravi Kumar Kopparapu, a planetary habitability researcher at NASA. "That means there's great hope for us." (Or, if the grim history of colonization has anything to say about it, great peril.)
Quibble 6: This does not follow, logically, and rests on a biased premise. Aliens would be, by definition, alien, and their "issues and problems," if they have any, may not bear much resemblance to ours.
If they were to discover that there is life out there - intelligent life that has forged a civilization - it would first mean that biology is not a fluke. Instead, it is something that can take root on many worlds; something that does not merely arise but repeatedly produces thinking, technological, curious creatures, those that may wish to share their knowledge of the universe, and their way of traversing or surviving it, with others. And if this civilization existed on a world very different from Earth, it would demonstrate that the largely unlivable cosmos is populated by myriad different isles of habitability.
Quibble 7: There's a lot of assumption to unpack here. For starters, it is possible (I would even say likely) that we'd see signs of life first, not technology. As I noted above, on our world, the kind of technology that produces signs we could, in theory, detect has been around for less than an eyeblink compared to how long life has existed. There is nothing about evolution that requires the eventual appearance of a curious species with the right combination of intelligence (using that word now in its general sense), manual dexterity, language, socialization and other factors to begin to develop complex tools. And, as the article notes toward the end, we might not find anything; but if we don't make the attempt, we definitely won't (unless, of course, it comes to us first).
Okay, that's all I'm going to quibble about. The rest of the article goes into great detail about the actual search, and it's probably worth at least skimming if you're interested in this sort of thing. Just keep in mind that, given the above quibbles, I'm not 100% on board with the speculative aspects. |
October 15, 2023 at 9:53am October 15, 2023 at 9:53am
|
Reaching back to August of 2019, I landed on this entry: "Headline Questions Are Usually Answered "No.""
It contains a raw link (not an xlink) to a Nautilus article. The link no longer works, but I guess they simply switched around the way they handled URLs in the intervening years, because a quick search found the original article. Here it is in my more current format:
Now, the article itself is from even further back in the past: February of 2017. So the field might have experienced some changes since then, especially what with a few years where a lot of people, if they could get psych help at all, did it remotely. Which seems to me like a recipe for disaster, especially for extroverts, but what do I know?
So, I'm still not going to cherry-pick quotes from the article; as I said last time:
Now, usually, I mine quotes from the articles I link, but few of them in this one are really worth isolating; I think one needs to read the article to get the idea.
This, then, will focus on some of the things I wrote back then in the Before Time.
I have problems with how evolutionary "explanations" for this and that and the other thing are generally portrayed.
Speculation from evolutionary psychology has become so pervasive in human biology reporting that, unless the article in question is especially compelling, I just quit reading when I get to lines similar to: "This makes sense from an evolutionary perspective. In the distant past, our ancestors would have..."
Because it's almost always speculation, and it almost never takes into account that our evolved traits, whatever they may be, include holdovers from even more distant ancestors than our lonely great^great-grandparents on the African savannah.
They're origin myths, "Just So" stories, only written to appeal to a slightly more scientifically literate audience. The only difference between them and "God made us that way" is a lack of supernatural references, an acknowledgement that we are, in fact, products of evolution. Which is better, but still not great.
With evolution, not every trait is a survival trait. Some are vestigial or effectively so. Others are incidental. One way evolution works is that incidental traits sometimes end up aiding survival and/or reproduction, so those traits can get passed on. Vestigial traits like - I want to say the appendix, but I've heard that might actually be part of the immune system - whether or not you can wiggle your ears are generally neutral to survival, but might have had some benefit in a distant ancestor.
In that case, by "distant ancestor," I meant pre-hominid.
As for the appendix thing, since then, I've heard that the appendix does indeed have a function: as a store of beneficial gut microbes. When the intestines are subject to disease or poison, disrupting the flora there, it's supposedly meant to pump the good bugs back into the tube.
Now, it's not like whoever told me that was a doctor or even a biologist, so I wouldn't take that as Absolute Truth or anything. From a purely anecdotal perspective, I never had big problems with weight control before the appendectomy I endured back in the early noughties, but that could easily be correlation without causation.
I mention this because there also seems to be a correlation between gut microbe health (or, more properly, lack thereof) and mental health, so it might actually be relevant to the article. Or it might not. I don't know.
Also I should add that while I've been depressed, I've never been suicidal; I don't know what the stats on that are, but I think the article conflates "depression" and "suicidal ideation," and that pisses me right off.
Still does. I'm still subject to the occasional round of depression, and the closest I've ever come to contemplating suicide has been from a place of writer's curiosity: if I were to write a character deliberately ending their life, how could I do it in a way that seems realistic to readers?
From that perspective, and no others, I've also contemplated such things as rape, theft, murder, war, and what it would be like to be Southern Baptist. Stuff I'd also never attempt. So I don't think it counts.
Still, the article makes some interesting points and I'm curious to see if this line of speculation goes anywhere.
Despite my gut feeling (pun intended, of course), there could be something to the "evolutionary psychology" explanation. But there needs to be more than just guesses, though I have no idea how one could go about doing the requisite science to support or falsify such a hypothesis.
One final thing: the end of the original article provides a link and a phone number for suicide prevention. I'm not 100% sure, but I think they've changed the phone number so you can get there by calling 988.
Which makes me glad we don't use rotary phones anymore. I can just imagine someone getting to the second digit, waiting for that goddamned dial to spin back around, knowing there's still another one to endure, then going "fuck it" and blowing their brains out. |
October 14, 2023 at 9:18am October 14, 2023 at 9:18am
|
The article today, from The Conversation, is about a year old, but that shouldn't matter.
But debates about quantum mechanics - be they on chat forums, in the media or in science fiction - can often get muddled thanks to a number of persistent myths and misconceptions.
Oh, it's way worse than that. I think there are still authors out there promoting books about harnessing the power of the quantum realm with your mind, or some such gobbledygook.
Remember: the bolded and italicized bits below, taken straight from the article, are the misconceptions. So if you skim this entry, please don't walk away thinking I'm endorsing misinformation.
1. A cat can be dead and alive
Erwin Schrödinger could probably never have predicted that his thought experiment, Schrödinger's cat, would attain internet meme status in the 21st century.
I've met people who knew about Schrödinger's Cat, but weren't aware that it was a thought experiment. They believed Schrödinger had actually stuffed a cat in a box with a quantum choice contraption. I'm not ragging on them, but I think it's important to note that, to the best of my knowledge, no cats were harmed (or not harmed, or a superposition of the two) in the pursuit of knowledge about quantum physics.
Which is way more than other branches of science can say.
It suggests that an unlucky feline stuck in a box with a kill switch triggered by a random quantum event - radioactive decay, for example - could be alive and dead at the same time, as long as we don't open the box to check.
The obvious issue with this thought experiment is that, if it requires consciousness to collapse a quantum state, a human doesn't need to open the box; a cat possesses consciousness and knows it's alive (or doesn't know anything, if it's dead).
Is it really both alive and dead as long as we don't open the box? Obviously, a cat is nothing like an individual photon in a controlled lab environment; it is much bigger and more complex.
And that's the non-obvious issue.
In any event, Schrödinger came up with his Rube Goldberg cat-quantumizing machine idea to refute certain ideas about quantum physics, not to demonstrate its truth.
2. Simple analogies can explain entanglement
This one was way more immediately relevant last year, when the article came out, because 2022's Nobel Prize in Physics was all about quantum entanglement (this year's was about attosecond pulses of light, which, well, look it up; it's cool as hell).
There's a lot to absorb here, and I can't really do this section justice with cherry-picked quotes, but the upshot of it is this: There's no suitable macro-world analogy for quantum entanglement.
I'd also add that QE doesn't imply superluminal information transfer, as some people insist it means. It's plenty weird, but it doesn't defy the cosmic speed limit.
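To put a little math behind that claim: whatever one person does to their half of an entangled pair, the statistics the other person can measure locally don't change at all, which is exactly why no information gets transmitted. Here's a minimal numpy sketch of a Bell pair - standard textbook stuff, not anything from the article:

import numpy as np

# Bell state |Phi+> = (|00> + |11>) / sqrt(2); basis order is |00>, |01>, |10>, |11>
psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

def alice_local_state(state):
    """Trace out Bob's qubit: everything Alice can possibly measure on her own."""
    rho = np.outer(state, state.conj()).reshape(2, 2, 2, 2)  # indices (a, b, a', b')
    return np.einsum('abcb->ac', rho)                         # sum over Bob's index

# Bob applies some local operation (an arbitrary rotation) to HIS qubit only
theta = 0.7
u_bob = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]], dtype=complex)
psi_after = np.kron(np.eye(2), u_bob) @ psi

print(alice_local_state(psi).real)        # [[0.5 0. ] [0.  0.5]]
print(alice_local_state(psi_after).real)  # identical: Bob's choice is invisible to Alice

The spooky correlations only show up after the two parties compare notes over ordinary, slower-than-light channels.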
3. Nature is unreal and ânon-localâ
Another reminder that the above heading is false. But in this case, I'd add "probably."
Despite Bell's theorem, nature may well be real and local, if you allowed for breaking some other things we consider common sense, such as time moving forward.
I've banged on in here about time on numerous occasions. Suffice it to say that, in the quantum realm, common sense needs to go right out the window.
I hate the concept, anyway.
Put another way, quantum equations, insofar as I understand them, don't have a time arrow. Time, then, is best viewed as an emergent property of macroscopic matter. Which is fine; lots of perfectly real things, such as temperature or life itself, are emergent phenomena.
However, most options on the table - for example, time flowing backwards, or the absence of free will - are at least as absurd as giving up on the concept of local reality.
This sentence, of course, is one of the main reasons I saved this article. The absence of free will isn't absurd at all; it is, as far as I'm concerned anyway, settled science.
4. Nobody understands quantum mechanics
But I'm not always right. For instance, I've crafted similar sentences to this #4 heading. This is my chance to qualify it; I do believe it's correct in at least one sense.
A classic quote (attributed to physicist Richard Feynman, but in this form also paraphrasing Niels Bohr) surmises: "If you think you understand quantum mechanics, you don't understand it."
I believe that quote is correct for people like you, me, and the author of Using Quantum Jedi Mind Tricks to Win the Lottery and Get Laid (or other books to that effect).
Quantum physics is supposedly impossible to understand, including by physicists. But from a 21st-century perspective, quantum physics is neither mathematically nor conceptually particularly difficult for scientists. We understand it extremely well, to a point where we can predict quantum phenomena with high precision, simulate highly complex quantum systems and even start to build quantum computers.
And while this is true—the calculations are, from what I've heard, far more accurate than in any other branch of science—that doesn't mean there aren't still arguments over what it all means. That is, questions of interpretation, like "many-worlds," are still open.
Where the true difficulty lies, perhaps, is in how to reconcile quantum physics with our intuitive reality.
Fair, because they are very different. I certainly don't claim to have it all figured out (unlike some writers), but as with anything else, that's not going to stop me from blogging about it. |
October 13, 2023 at 7:44am October 13, 2023 at 7:44am
|
Way back in the murky mists of deep time, during a period when I was on the fence about being childfree or not, I knew what I wanted to name a daughter: Amethyst. That would even be my red flag, I decided. When I'm dating someone, I find out if she likes the idea of naming a girl Amethyst and, if not, we weren't going to go any further.
This lasted, oh, about a month or so, until I dated a woman who hated the idea but was really smoking hot, so all of those plans went right out the window. It wasn't long after that, probably, that I decided my actual red flag was "I want kids."
But I digress; the point is, I liked the sound of that word for as long as I can remember. Which is actually a fairly long time, as I was told early on that it's my "birth stone," just because I arrived in February.
"Birth stone" is, of course, a transparent marketing gimmick, like those silly lists of anniversary gifts. Regardless, I liked the sound of the word amethyst, and I liked the deep purple tint of the stone.
It's just quartz, you know. Silicon dioxide, the second most common mineral of the Earth's crust. (The first is feldspar, which really shouldn't count, as it isn't always composed of the same elements the way SiO2 is.)
Of course, amethyst isn't really "just" quartz. That color comes from the occasional iron atom in the crystal lattice. Don't ask me why that makes it purple, though; it probably involves quantum effects.
So yeah, common or not, I have a thing for amethyst. Or, at least, I did, until I found out the etymology of the word. It's probably common knowledge by now, but I'll reiterate it here anyway: the word comes to us from ancient Greece, though they certainly weren't the first people to know about the mineral. They assigned it the mystic property of protecting a person against drunkenness, so they named it not-intoxicated, or, in their words, a-methyst.
From what I understand, they even made goblets of carved amethyst on the theory that you could drink all you wanted to out of them and not get drunk. If they'd been half the scientists people think they were, though (and I have an article in the queue that touches on the ancient Greek penchant for natural philosophy), they might have done controlled, double-blind tests and realized that no, it possesses no such property, and any perceived resistance to the blessings of Dionysus was essentially a placebo effect.
I don't know how this belief didn't piss off Dionysus. And you don't want to piss off Dionysus; he's a mean drunk. I, however, am not a mean drunk; I'm a lot meaner when I'm sober. So even though it didn't possess this magical quality, the mythology made it lose some of its sparkliness for me.
Therefore, it's just as well I never had kids to saddle one with a name I'd grow to distrust. |
October 12, 2023 at 10:31am October 12, 2023 at 10:31am
|
Another scholarly linguistics article today... wait, did I say scholarly? I meant amusing, because it's from Cracked.
When picking a new insult to throw at someone, current comedic convention suggests you string together a series of random incongruous words. You type, "Yeah, like I'm going to take advice from a lopsided milk-stained piss plank."
This works great when you're typing, because you have time to think and/or randomly choose the next word. Not so easy in person, but for that, there's always "Your mama."
5. A Geek Was a Carnival Worker Who Bit Heads Off Chickens
I'm old enough to, if not remember this definition, at least remember older people remembering this definition.
A geek worked at a carnival, in an act called a geek show. Some carnival performers exhibited impressive talents, and the freaks showed off strange physical features, but here's what the geeks did: They bit the heads off of life animals.
Another expression that's lost its meaning is "copy editor."
"Geek" started to attain its current meaning in the 1980s.
Wrong. 1970s. A wrestler called Fred Blassie (among other monikers, but that one was actually based on his real name) had used the catchphrase "Pencil-necked geek," and he wrote a song called that in, like, '75. I'm pretty sure he was the main reason the meaning changed.
4. An Idiot Was Anyone But a Politician
I'm pretty sure I've mentioned this sort of thing before.
An idiot was someone with an I.Q. less than 25, while other words described people that fell in other ranges - an imbecile scored between 25 and 50, while a moron managed between 50 and 75.
No matter what words we come up with to describe those of lower than standard intelligence, they will always, always morph into a general insult, requiring us to come up with new connotation-neutral words to describe them, which will inevitably morph into a general insult, ad infinitum.
But this is the interesting part (and it does seem to be at least partially true):
"Idiot" has deeper roots, however. It comes from the Greek idiotes, which described a private person... A private person wasn't someone with a private personality but the opposite of a public figure. It meant a non-politician.
So, one of those cases where the word came to mean its opposite.
3. Dicks Were Older Than Penises
No real surprise here. Linguistically, anyway.
Naturally, Dick has been a name for many centuries, while "dick" has only meant penis since the 19th century. Less obviously, a dick meant a man since before it meant a penis. In the 16th century, the word just meant "guy," and you'd call someone an odd dick just as you'd call them an odd fellow.
Really, just about any word can mean penis if you want it to, depending on context. Like geek, for example. "That woman only dates geeks," someone might say, to which a guy might respond, "She can try my geek."
Okay, maybe that doesn't always work.
2. A Barbarian Spoke Gibberish
Today, a barbarian is a specific type of warrior, capable of relentless rage and proficient with medium armor.
Depends on your preferred game system. I think it's D&D 5th edition where a barbarian can actually add their Constitution modifier to Armor Class while unarmored (monks get the Wisdom version), thus negating the need for armor if you happen to have a decent Con score. This, I think, was meant to explain how famous literary barbarians such as Conan and Red Sonja could wear a loincloth and a chain bikini (respectively) and still be decent in combat.
But yes, in D&D 5 and Pathfinder (a fork of D&D 3), barbarians can rock medium armor, such as hide or chain mail.
Before that, it was a word to levy at any of various peoples to dismiss them as savages. But let's go even further back, to the Greeks, who originated the term. They used the word to describe anyone who didn't speak Greek.
Many of the peoples called barbarians by the Greeks and, later, Romans, had a well-developed culture. The Norse, e.g.
The reason they came up with the word barbarian (or the root, barbaroi) was that people speaking anything but Greek sounded to them like they were just saying "ba-ba." Barbarians were "blah blah" speakers.
And in earlier versions of D&D, they were generally illiterate. Which is also unrepresentative of actual barbarians.
1. "Weird" Meant You Have the Power to Magically Control Fate
Being literate, I knew this. But it's still an interesting case study of how words change in meaning.
Originally, the word said nothing about the wrong kind of nonconformity but instead referred to the magical power to control fate.
The article provides examples.
Wyrd was an old Norse word meaning "fate," and in its earliest English form, it was associated with the Fates from Greek mythology.
See? Barbarians can and do contribute to culture. |
© Copyright 2024 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved. Robert Waltz has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.
|