Complex Numbers
A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.
The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.
Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.
Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
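As a small illustration (my own sketch, not part of the original text), Python's built-in complex type makes it easy to see this in action. The most famous example is the iteration z → z² + c behind the Mandelbrot set: points c whose orbits stay bounded are in the set, and the boundary between "bounded" and "escapes" is the fractal.

```python
# A trivially simple rule on complex numbers that produces fractal
# structure: iterate z -> z^2 + c and check whether the orbit of 0
# stays bounded (inside the Mandelbrot set) or escapes to infinity.

def escapes(c: complex, max_iter: int = 100) -> bool:
    """Return True if the orbit of 0 under z -> z^2 + c escapes."""
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:  # once |z| > 2, the orbit is guaranteed to diverge
            return True
    return False

print(escapes(0 + 0j))   # False: 0 maps to 0 forever, inside the set
print(escapes(1 + 0j))   # True: 0, 1, 2, 5, 26, ... blows up quickly
```

Sweep c over a grid of the complex plane and color each point by how fast it escapes, and the "enormous intricacy" appears on its own.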
September 30, 2022 at 12:07am
My main sources for scientific discussions are YouTube (I must have broken their algorithm because everything I see on the sidebar is comedy, booze, science, philosophy, music, math, or some combination thereof) and, of course, Cracked.
You can go down your own YouTube rabbit hole; today I'm linking the latter.
Science is generally self-correcting. Theories are discarded or adjusted when new experimental data comes in, or is interpreted differently. But one thing that remains constant, apart from the speed of light, is that no matter how weird we think the Universe is... it always turns out to be weirder.
After all, we're only here because of a giant explosion that occurred everywhere yet also nowhere, and for no reason, during a time before time. It may have happened infinite times to create an infinite multiverse. It may be happening "now," though "now" is relative and doesn't really mean anything worth jack.
Yeah, it's not really an explosion because there was nothing for it to explode into. If it actually happened. There's still some pushback to the Horrendous Space Kablooie theory.
5. Do We Have Another Planet? It May Be A 13.8-Billion-Year-Old Primordial Grapefruit 10 Times Heavier Than Earth
To quote Douglas Adams, "Space [...] is big. Really big. You just won't believe how vastly hugely mind-bogglingly big it is. I mean, you may think it's a long way down the road to the chemist, but that's just peanuts to space." Well, compared to the vast distant reaches of space, even the outer reaches of our solar system might as well be a trip to the chemist (that's pharmacy for us Americans). And, as this section indicates, we don't even know how many things orbit our sun.
One thing that probably will be solved in our lifetime, cyborg or not, is among the most mouth-watering space mysteries: the existence and identity of Planet Nine. Is our solar menagerie hiding at least one more big planet? Or just another derelict Kmart? The unequivocal answer is maybe.
Your lifetime, maybe; probably not mine.
If a hidden planet exists, it's 400-800 times farther away from the Sun than Earth. So it would trace an elongated 20,000-year orbit. Humans were still wiping themselves with their hands last time it swung around our star.
To be clear, we still do; there's just usually paper in said hand. Unless you're fancy and own a bidet.
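That 20,000-year figure, by the way, falls straight out of Kepler's third law: for anything orbiting the Sun, the period in years is the semi-major axis in AU raised to the 3/2 power. A quick sanity check (my own sketch, not from the article):

```python
# Kepler's third law for solar orbits: P [years] = a [AU] ** 1.5
# Checking the quoted Planet Nine distances of 400-800 AU.

def orbital_period_years(a_au: float) -> float:
    """Orbital period around the Sun, in years, for a semi-major axis in AU."""
    return a_au ** 1.5

for a in (400, 600, 800):
    print(f"{a} AU -> ~{orbital_period_years(a):,.0f} years")
```

That gives roughly 8,000 years at 400 AU and roughly 22,600 at 800 AU, so "20,000-year orbit" is in the right ballpark toward the far end of the quoted range.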
Being so far away, scientists can't see P9, but infer it's there because nearby objects seem to be pulled by the gravity of something hefty.
On the one hand, that's a totally legitimate way to determine if there's another planet involved; it worked to discover Neptune, as I recall. On the other hand, discrepancies in Mercury's orbit compared to prediction led to the hypothesis that there was another planet orbiting even closer to the sun. They even named the planet: Vulcan. Plot twist: Vulcan doesn't exist; the perturbations turned out to be due to relativistic effects.
Point being, maybe there's another planet; maybe our theory of gravity needs to be tweaked again; or maybe it's Maybelline.
If they do find that planet, please lobby for it to be named Maybelline.
Let's not entertain such dismal, unexciting possibilities. A much more excellent idea is that P9 isn't a planet but a black hole. A primordial black hole, dating to the first second of creation. As in, the first second ever.
No.
But it's not as farfetched as you might think. After a certain radius, black holes act just like any other mass. If our sun were to be suddenly replaced by a black hole of the same mass, all the planets would continue orbiting sedately just as they do now, not get sucked directly in like pulp SF would have you believe. Which we'd notice just before we froze to death.
4. We Can Theoretically Wee Actual, Individual Aliens By Making A Solar System-Sized "Virtual Telescope"
Please don't wee aliens. They might get pissed off.
Scientists just achieved the most amazing visual feat ever: capturing an image of Sagittarius A*, the bulldog of a black hole at the center of our Milky Way.
This article came out about the same time as JWST started producing real images, and some of them are pretty amazing. Still, imaging something that far away is pretty amazing, sure.
It has the mass of 4 million Suns, packed into an infinitely tiny point.
That's a bit misleading. The dimensions of a black hole are undefined because dimension implies space, and the whole point (see what I did there) of a black hole is that it warps space so completely that the concept of "distance" or "radius" or "diameter" is meaningless. And the event horizon, whose diameter can be inferred, isn't infinitely tiny. Whatever. I'm quibbling about language written on a dick joke site that can't even edit out obvious typos in headers.
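For what it's worth, the inferable scale of the event horizon follows from the mass alone, via the Schwarzschild radius r_s = 2GM/c². A quick computation (my own sketch, not the article's) for Sagittarius A*:

```python
# Schwarzschild radius r_s = 2 G M / c^2 for Sagittarius A*
# (~4 million solar masses). Not remotely "infinitely tiny".

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
AU = 1.496e11      # astronomical unit, m

def schwarzschild_radius_m(mass_kg: float) -> float:
    return 2 * G * mass_kg / C ** 2

r_s = schwarzschild_radius_m(4e6 * M_SUN)
print(f"r_s ~ {r_s:.2e} m, or about {r_s / AU:.2f} AU")
```

That works out to roughly 0.08 AU, about a fifth of Mercury's orbital radius: small for four million solar masses, but a long way from infinitely so.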
Anyway, the image is helpfully included in the article, but it hardly matters; you've seen it. It's become iconic. Even Star Trek started riffing off of it. The real point is HOW they did it, which involves turning widely-spaced Earthbound telescopes into a giant virtual telescope, which is indeed cool. Now imagine putting a bunch of space telescopes in orbit around the sun, say between Earth and Mars, and turning them into a virtual telescope. That's what they mean when they claim we could "wee" aliens. If aliens existed.
3. Sadly, No Telescope Can See An Anti-Universe With Backward Time Which Could Explain Dark Matter. But Math Can
I mean, okay. This is speculation. Math is very, very good at describing the universe (and even perhaps a multiverse), but it doesn't follow that just because math shows something, it must have a physical interpretation.
To be clear, no one yet knows exactly what dark matter is. If it's even matter. That's cool; it means there's more to discover.
2. Neutrinos Will Blow Your Ass Out Of Its Mind
In addition to the Big Bang and other energetic origins, neutrinos come from nuclear reactions in stars.
Neutrinos are another thing we don't fully understand. As with anything else we don't understand, speculation runs rampant. Like in this section. Still, as long as we know it's speculation, it's interesting to read about.
1. An Infinite Universe Guarantees The Occurrence Of Things That Are So Unlikely It's Literally Impossible To imagine, Comprehend, or Perceive
I mean, sure. Infinity itself is impossible to imagine, comprehend, or perceive. It also might not really exist, being perhaps one of those mathematical concepts that doesn't have a real-universe counterpart.
Infinity means that any non-zero event will occur.
I think they mean "any non-zero probability event."
Infinitely.
Not necessarily.
The simplest conception of infinity is "the set of all positive integers." We have notation that can render any arbitrary integer, like 2, or 42, or 10^100 (a googol) — which is just as far away from infinity as 2 is. But each of those numbers is unique. Sure, our notation uses some combination of ten arbitrarily-assigned digits, so two numbers may seem to have something in common, like 543 and 1345, but just because you label something "infinite" doesn't mean, as is sometimes popularly reported, that there's an exact copy of you running around on some exact copy of Earth somewhere else in the universe.
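As an aside on that notation point (my own illustration, nothing from the source): Python's arbitrary-precision integers will happily render a number like 10^100 in full, exactly, even though it's hopelessly beyond anything physical.

```python
# Our notation can render any arbitrary integer exactly, however large.
# Python's int type has no fixed size, so a googol is just another number.
googol = 10 ** 100
print(googol)            # 1 followed by 100 zeros
print(len(str(googol)))  # 101 digits
```

And yet, as the entry says, it's exactly as far from infinity as 2 is.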
And since the universe appears to be infinite, this forces us to ask if infinity exists, or is just a human failing to comprehend the true nature of the cosmos.
Or, you know, I could be wrong about that. It's not like I'm any better at comprehending the nature of the cosmos, or infinity, than any other ape.
And to be strict about it, there are different kinds of infinity.
Hope that clears everything up.
September 29, 2022 at 12:01am
I'm not really happy with this article.
It's not that I disagree. Anyone who's been following along should know that I think happiness is overrated (and fleeting at best). It's just that there are a few red flags here.
We think we want to be happy.
Assumes facts not in evidence. I'm not part of that "we," and I suspect I'm not alone.
Yet many of us are actually working toward some other end, according to cognitive psychologist Daniel Kahneman, winner of the 2002 Nobel Prize in economics.
Possible argument from authority. While I admit that economics and psychology have some overlap, a Nobel Prize is not an indicator of general intelligence, but of a successful breakthrough in a certain field (exception: the Peace Prize, which hasn't meant shit since at least the early 70s).
Kahneman contends that happiness and satisfaction are distinct.
Assertion without evidence. I mean, sure, I agree, but I need to be critical of articles that agree with me. Helps me avoid confirmation bias.
For example, in Kahneman's research measuring everyday happiness—the experiences that leave people feeling good—he found that spending time with friends was highly effective. Yet those focused on long-term goals that yield satisfaction don't necessarily prioritize socializing, as they're busy with the bigger picture.
Fails to account for differences between extroverts and introverts; also nothing is said about those on the autism spectrum.
In an October interview with Ha'aretz (paywall), Kahneman argues that satisfaction is based mostly on comparisons. "Life satisfaction is connected to a large degree to social yardsticks—achieving goals, meeting expectations."
How much of that is actual personal achievement, and how much of that is due to comparing one's outward measures of success to others'? Like, are you really happy just because you've hit your mark, or do you see that the neighbor has a better car or a greener lawn and decide you have to do better? (I'm the former.)
He notes that money has a significant influence on life satisfaction, whereas happiness is affected by money only when funds are lacking. Poverty creates suffering, but above a certain level of income that satisfies our basic needs, wealth doesn't increase happiness.
Pretty sure I've noted before that the study he's referencing there is highly suspect. Ah, here it is, and it was only a couple of weeks ago: "Gut Wrenching"
The key here is memory. Satisfaction is retrospective. Happiness occurs in real time. In Kahneman's work, he found that people tell themselves a story about their lives, which may or may not add up to a pleasing tale. Yet, our day-to-day experiences yield positive feelings that may not advance that longer story, necessarily. Memory is enduring. Feelings pass. Many of our happiest moments aren't preserved—they're not all caught on camera but just happen. And then they're gone.
Again, doesn't jibe with my personal experience. My memory is shit. For example, I couldn't remember how long ago I'd written that entry I just referred to. When I saw it was just two weeks, I was surprised. I don't remember many details of events or conversations—but the way that event made me feel is indelibly burned into my synapses. So, no, I have a much better memory of emotions than I do people, places, and happenings.
This theory helps to explain our current social media-driven culture. To some extent, we care less about enjoying ourselves than presenting the appearance of an enviable existence. We're preoccupied with quantifying friends and followers rather than spending time with people we like. And ultimately, this makes us miserable.
Ah. Now I begin to understand why I'm so different. That quote doesn't describe me at all. I despise social media and its pretensions.
We feel happiness primarily in the company of others, Kahneman argues.
I feel happiness primarily in the company of beer.
However, the positive psychology movement that has arisen in part as a result of his work doesn't emphasize spontaneity and relationships.
Wait, is this guy the one responsible for "Toxic Positivity"? If so, no wonder I'm squicked out at this whole article.
Kahneman counts himself lucky and "fairly happy." He says he's led "an interesting life" because he's spent much of his time working with people whose company he enjoyed. But he notes that there have been periods when he worked alone on writing that were "terrible," when he felt "miserable."
People are different. I know people who are happy working "alone on writing." Hell, that might describe some of my regular readers.
Still, it's worth asking if we want to be happy, to experience positive feelings, or simply wish to construct narratives that seem worth telling ourselves and others, but don't necessarily yield pleasure.
On that point, I conditionally agree: it's worth thinking about. For me, I'm content to be happy on occasion, and coast through everything else. I'd prefer to avoid misery, but to me that's not the same thing as seeking happiness; it's just pain avoidance.
September 28, 2022 at 12:02am
And now for a truly impurrtant article.
My love language is avoiding people who use the phrase "love language." Stop it. Still, as it's an article about cats, I read it anyway.
On the not-so-infrequent nights when I'm plagued by insomnia, no combination of melatonin, weighted blankets, and white noise will do. Just one cure for my affliction exists: my cat Calvin, lying atop my shoulder, lulling me to sleep with his purrs.
I get it. When my cat is tired, I'm tired. Since she's a cat, that's pretty much most of the day.
But purrs—one of the most recognizable sounds in the animal kingdom—are also one of the most mysterious. "No one, still, knows how purring is actually done," says Robert Eklund, a phonetician and linguist at Linköping University, in Sweden.
And I hope it never is. Life is better when some things are mysteries, especially regarding cats.
Cats purr when they're happy—but also sometimes when they're anxious or afraid, when they're in labor, even when they're about to die. Cats are perhaps the most inscrutable creatures humans welcome into our homes, and purring might be the most inscrutable sound they make.
As I noted in a recent Comedy newsletter (yes, really), I've witnessed the distress purr. It can be disconcerting if you aren't expecting it.
There is, at least, some consensus on what purring is. In the strictest sense, the sound is a rhythmic, rumbly percolation that's produced during both exhales—as is the case with most typical animal vocalizations—and inhales, with no interruptions between.
And that's really weird, when you think about it.
Purring isn't easy to study: Felines aren't usually keen on producing the sound around researchers in labs.
That's because the phrase "curiosity killed the cat" isn't referring to the feline's curiosity.
Carney told me that in some animals, purring could be a sort of vocal tic, like nervous laughter; cats might also be trying to send out pleas for help or warning messages to anyone who might dare approach. Or maybe bad-times purrs are self-soothing, says Jill Caviness, a veterinarian and cat expert at the University of Wisconsin at Madison, and parent to a feline named Electron. They could even be a catâs attempt to dupe its pain-racked body into a less stressed state.
I happen to favor this latter hypothesis myself, in a very unscientific way. The frequency of sound that a cat emits with a purr might have some healing properties.
I had a cat for a long time who wasn't the most affectionate creature. She was what some people think of as the archetypal cat: nervous, aloof, sometimes even mean. Not a lap cat at all, and I still bear some scars to prove it. But when I was lying in bed, immobile due to some kind of pulled muscle or whatever, this cat curled up right on top of where the pain was and started purring. Like she knew, and was trying to help.
In the early aughts...
Noughties, dammit.
...a researcher proposed that purring might even have palliative properties for cats—pinging out vibratory frequencies that could, for instance, speed the healing of wounds or broken bones. The thought isn't totally bonkers, Eklund told me.
Like I said.
Carney has had plenty of clients who "swear that the cats lying in bed, purring beside them while they were ill, kept them from passing away," she told me. But alas: Although cats can purr at frequencies that overlap with those used in vibration therapy, none of the research on these treatments has actually involved felines. "I don't think we have any studies that are like, I sat with a purring cat on my broken leg for 15 minutes a day; I healed more rapidly than someone else," Caviness told me; the same goes for the effects of purrs on the purrer.
So my story remains anecdote, but it seems like it'd be worthy of further study.
If people feel better around their cats, that might be less about purringâs direct mechanical effects on human tissues, and more about the entire companion animal being a psychological balm.
Which I'd buy if my cats weren't so damn annoying sometimes.
And unlike many other cat noises, purrs stubbornly elude human imitation (though some people on YouTube might beg to differ). Humans can easily meow back at their cats; "it's like a very rudimentary pidgin language," Eklund said.
I used to meow back at my cats. Their reactions indicated that I should stop, like, right now. So I did.
Purring is a language barrier we have yet to surmount. Which, in some ways, is so, so cat. Humans have spent generations breeding dogs to emote in very people-esque ways, using their soulful eyes and slobbery, smiley mouths. Cats, though, continue to thrive on subtlety; their mugs aren't evolutionarily set up for obvious expressions, defaulting instead to "resting cat face."
Don't draw the conclusion from any of this that I don't like dogs. That's what most people leap to, like they think I hate children because I never wanted to own one. But that doesn't follow. I just never wanted the extra work, in either case.
Plus, I want to bust the stereotype of the crazy cat lady by being a crazy cat dude. But I can't bring myself to have more than three at the same time, so I'll probably never qualify. Right now, I have two, and so does my housemate. The threshold for crazy cat person status is N+3, where N is the number of humans in the house. Neither of us is there. But she also has guinea pigs, so if one of us is crazy, it ain't me.
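For the record, that threshold rule formalizes into a one-liner (a joke made executable, obviously not rigorous science):

```python
# The (tongue-in-cheek) crazy cat person test: a household qualifies
# once it holds at least N + 3 cats, where N is the number of humans.

def is_crazy_cat_household(humans: int, cats: int) -> bool:
    return cats >= humans + 3

print(is_crazy_cat_household(humans=2, cats=4))  # False: two people, four cats
print(is_crazy_cat_household(humans=1, cats=4))  # True: one person, four cats
```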
September 27, 2022 at 12:09am
Today's article, in contrast with the last couple, is mercifully brief. It's also either a book review or an ad, or possibly both. But whatever; the subject matter is something I'm interested in.
Easy as ABC
What were the origins of the alphabet, before the Greeks?
The alphabet—in its various forms, from different cultures—is something we tend to take for granted, like numbers or language itself, but at some point, someone had to invent that shit. Some cultures didn't, and they got along just fine without it, but as it's a thing writers use, it pays to think about it sometimes.
Few technologies are as important to our daily lives as the alphabet. But, as Johanna Drucker argues, we rarely give its history any thought at all.
And if we do, we probably use the alphabet itself to think about it.
The alphabet has been continually reinvented by each generation of thinkers in a story that meanders from Herodotus to the present day, via Jewish mystics, Arabic scholars, early modern typographers and 18th-century antiquarians.
And yet, we still have superfluous letters.
As Drucker writes, the idea that the Greeks invented the alphabet is deeply ingrained in modern thought.
Say WHAT now? No. Who thinks that? Racists who can't possibly conceive of anyone nonwhite inventing anything of value? Like how Egyptians (not white despite Hollywood's attempts) couldn't possibly have built pyramids with such great precision, so it must have been aliens?
No, I learned at an early age that it was probably the Phoenicians, a Semitic people. I also heard that "phonics" and "phonetics" come from their name, but I'm not sure about that and can't be arsed to look into it right now. Point is, no one I know ever thought the Greeks invented the alphabet. That's like saying the US invented English (but of course we perfected it). Though some Jewish mystics asserted that Hebrew was the foundational language of the Universe. Hell, early Hebrew is nothing like the modern calligraphic script, so they're obviously engaged in wishful thinking, too.
Anyway, whether it was the Phoenicians or not, it definitely wasn't the Greeks.
But this is the opposite of what the Greeks themselves thought; they were clear that it was borrowed.
More like stolen.
From the Greek perspective, the alphabet was invented either by the Phoenicians and given to the Greeks by Cadmus (this is the account given to us by Herodotus) or invented by the Egyptian god Thoth (as in the account of Plato).
Aaaaaand we're back to aliens.
For most of the 19th and 20th centuries scholarship lauded the "genius" of the Greeks for adding vowels to the existing consonantal alphabet used by the Phoenicians.
Gotta admit, the whole "vowel" thing is very helpful.
Inventing the Alphabet raises all of the questions that have vexed historians. The Bible presents insoluble problems.
Not if you think of it as just another book.
If God wrote the Ten Commandments for Moses, what language were they in?
Assumes facts not in evidence.
What alphabet?
Now I have this indelible image in my head of emojis carved on stone tablets.
(old bearded guy) (check mark)
(week with a halo on Saturday)
(circle with slash) (dagger)
(circle with slash) (eggplant) (peach)
That sort of thing.
If it was the first ever written text, how did Moses know how to read it?
Look, once you accept "God did it" as an answer, that's the answer to everything. In this case, God put the knowledge in Moses' head. I mean, no, but if you accept one you might as well accept the other.
The 18th century saw a flourishing of interest, not just in Greek, Roman and Hebrew writing, but also northern European writing systems such as runes.
Anyone with even a passing knowledge of runes can tell that there was significant overlap between runic and Roman scripts. The "R" symbol is the same, for example.
And then this is what leads me to believe this is a review and not an ad:
The book is dense in places and technical terms are not always adequately explained. Readers are left to their own devices with matres lectionis (literally "mothers of reading", but also a term for the diacritical marks used to indicate vowels in some writing systems which do not have them).
Hebrew is a well-known example of this. The written language existed for millennia (with changes of course) before they started adding markings to indicate vowels. If I recall correctly, that process started in the first century C.E.—so likely they "borrowed" back from the Greeks. It's the circle of life.
There are some small factual errors on the ancient side—describing Persian as a Semitic language, when it is Indo-European, for example.
I'm no linguist, but that's not a "small factual error"; it's a pretty major one. If I were editing that book, I'd have red-flagged it.
In conclusion:
At the heart of it all is the alphabet: an invention that is both ubiquitously banal and world-changingly innovative.
Early alphabets were indistinguishable from magic. Like with numbers. I guess when something's used every day, it loses its magic, but that doesn't stop it from being an interesting topic of research.
September 26, 2022 at 12:01am
Speaking of luck, my random number generator popped up this article right after yesterday's piece about Fortune.
This one's from Wired, so it's probably an easier read than yesterday's.
Really, what more can or should be said about the future? Look around and see what happens. You can look for your crypto windfall. You can look for the love of your life. You can look for the queen of hearts. Seek and ye might find. You can even look for a four-leaf clover, though the chances are about 1 in 10,000. But if you find one, the shamrock is no less lucky because you looked for it. In fact, it's luck itself.
Sounds more "looky" than "lucky." Also, a shamrock is a particular clover-adjacent plant; the four-leaf clover can be pretty much any clover. Except, probably, alfalfa. Which may or may not be considered a variety of clover. It's all very complicated and I can't be arsed to sort it all out tonight.
"Diligence is the mother of good luck" and "The harder I work the luckier I get"—these brisk aphorisms get pinned on Ben Franklin and Thomas Jefferson, lest we earnest Americans forget that salvation comes only to individuals who work themselves to dust.
Hahahaha.
In truth, the luck = work axiom does nothing but serve the regime and the bosses, by kindling credulity in a phantom meritocracy instead of admitting that virtually every single advantage we get in the world is one we lucked into...
Which is what I've been saying.
How about we invert the meritocratic fallacy in those aphorisms and create a new aphorism that makes "work" the delusion and "luck" the reality? "The luckier I get, the harder I pretend I've worked." An excellent way to describe the people born on third who believe they hit a triple.
There you go. That's more like it.
Disclaimer: I'm lucky. Fortunate. Whatever you want to call it. Life is fundamentally unfair, but it's usually unfair in my favor. I make no apologies for this, nor seek penance. But I do consider it my obligation to point out that luck is a far more reliable indicator of success than hard work is. As I've said numerous times, if hard work were all it took to achieve financial success, sharecroppers would be billionaires.
After all, the chances of the precise sperm colliding with the exact egg in the right fallopian tube and convening to make you—or me—are so low as to be undetectable with human mathematics.
Yeah, no, it's detectable, though very low. It's also irrelevant. As I've also said numerous times, no matter what the odds were, once you've won the lottery, the odds of having won the lottery are 1:1. All the "you were extremely unlikely" phrase accomplishes is putting an upper limit on how many other people might share your precise genetic makeup.
But work and diligence can never be the parents of luck, because luck has no mother, no father, no precedent or context. Luck is a spontaneous mutation, signaling improbability; it shows up randomly, hangs around according to whim, and—as every gambler knows—makes an Irish goodbye. Mischievous luck is fun, a shamrock, a "lady." It's worlds away from grinding toil.
"Irish goodbye" is probably more than a little on the stereotyping side, having associations with other negative Irish stereotypes. But whatever; I didn't write it.
The "spontaneous mutation" bit reminded me of a story by Larry Niven. In Ringworld, one of the characters is lucky. Turns out that, in the story, she was the result of several generations of people winning the birth lottery (or some such; it's been a while). Point is, he treated luck as a genetically inheritable trait, rather than a matter of circumstance.
Einstein didn't like the idea of God "playing dice" with the world. Lucky for Einstein, dice, in a world determined by luck, are not thrown by anyone, much less a God who is said to have Yahtzee skills. Instead, the chips fall where they may—and really they just fall, unpredictably, spontaneously. We then look for patterns in them.
Einstein's comment is often misrepresented, so I'll point out two things: 1) He apparently didn't believe in any of the traditional interpretations of God, instead using that word as a three-letter shorthand for "the laws of physics." 2) That particular quote was his visceral reaction to the idea that, at base, everything is probabilistic, as implied by quantum physics. Thing is, he was a pretty smart dude, and it's my understanding that evidence brought him around to randomness in the end.
Living by a doctrine of luck promotes at least five excellent things that have got to be good for your brain.
"have got to be" is an opinion, but let's look at the "five excellent things" and form our own opinions.
1. Active skepticism about "meritocracy."
Yeah, okay. I've railed on that in here before, and probably will again.
2. Recognition of the utter contingency of one's own advantages. An act, if I may, of "checking your privilege."
Again, I recognize it; I check it; and I wallow in it.
3. Appreciation for the spontaneity, serendipity, and unpredictability of the universe.
I don't know if that's good for one's brain or not, but it does bring me joy.
4. A way to practice "gratitude" without doing calligraphy in $75 journals. All you have to do is say, every time it hits you that life is OK and could be otherwise, "What luck!"
Ugh.
5. A way to make more luck in your life.
Which is the point of the article (if the article can be said to have a point at all), and I'm still not fully convinced.
Anyway, like I said, it's sheer luck that this came up back-to-back with yesterday's; I have a few dozen articles in my queue and not many of them deal with this subject. So you probably won't hear more on this from me for a while.
Lucky you.
September 25, 2022 at 12:01am
Today's article is fairly long, but worth it if you have the time.
Fortune's Wheel
For many in Western history, games of chance represented a portal of possibility, not a heresy to be demonized or a statistical probability to be managed.
In the American mythology of success, labor is the only path to prosperity.
Ha!
The affluent can cleanse their cash by claiming they worked hard for it; mastering fate and controlling outcomes bestow moral legitimacy on their earnings.
HA!
Many moralists throughout American history have affirmed that merit matches reward and that people get what they deserve, in this world and the next.
At the risk of repeating myself, ha.
This all ties in to another thing I featured over two years ago: "On Merit"
But a heresy against this faith in hard work has stubbornly survived nearly two centuries' preaching of the virtues of America's civil religion.
"Civil" religion, my lazy heretical ass. This nonsense about "hard work will get you ahead" can be directly traced to Calvin. The preacher, not the cartoon kid.
The lottery ticket, humble as it is, serves as a passport to a more fluid moral economy, where fate can be cruel or kind but is always arbitrary—where luck, as even Horatio Alger realized, matters more than pluck. And this culture of chance more closely resembles the world in which most people live than the one prescribed by the dominant mythology of success, which can aptly be called a culture of control.
Much has been said—including by me—about the problems with playing the lottery. But for many, it's the only thin straw they can grasp at to maybe escape their life at the bottom of the social hierarchy, at least for a while (more on that in another entry).
But it does suggest that gambling is about more than mere money. Modern games of chance reenact ancient rituals of divination—casting lots, throwing pebbles, bones, shells, or dice—designed to provide glimpses of the sacred and to conjure luck or its spiritual equivalent, grace. Rather than the static and timeless cosmic order of orthodox monotheism, the sense of the sacred sought by diviners was a pluralist plentitude, symbolized in Western tradition by inconstant Fortuna and by similar figures in American Indian and African traditions.
Yeah, that might be a stretch.
The article goes on to describe the history of Fortuna, which is quite fascinating in itself.
But Fortune did not fit well with Christian ideas of Providence. To early Christians, the divine plan unfolded as mysteriously as the fluctuations of luck, but however remote the planner or apparently perverse his decrees, his purpose was ultimately benign.
To me, as an outsider, "divine will" is indistinguishable from random chance working through the laws of physics.
Beginning in the fifteenth century, Protestant reformers assaulted these rituals as part of a broader war on the medieval culture of chance. Taking their cues from John Calvin, theologians disparaged Fortuna, deriding belief in her powers as a pagan excrescence on the Church.
See? I told you we'd get to Calvin.
Only by God's "secret plan," Calvin wrote, do "some distinguish themselves, while others remain contemptible."
In the millennia of lies foisted on an unsuspecting public, this one is in the running for the worst in terms of the effect it had on society.
As early as 1653, when dissenting sects proliferated amid the English Civil War, a female sectarian confessed that she could not stand to see her neighbors prosper, as it meant they had prayed more than she had.
That, for example.
Despite occasional revolts, faith in Fortune endured in a variety of ways, even in the language that people used to describe their circumstances. The word "happiness" has long been linguistically dependent on chance. The thirteenth-century English substantive "hap" derives from the Old Norse "happ," meaning "chance" or "good luck." The verb "happen" and the adverb "haply" (by chance) emerged from this root in the fourteenth century, as did "happy," which originally meant "prosperous" and by the sixteenth century had acquired the connotation of contentment.
I'm just leaving this there as further support for my continued ragging on the incessant popular harping about "happiness."
The Protestant war on Fortune, declared by John Calvin centuries earlier, also allied itself with Newtonian science, whose practitioners were less interested in denying chance than in containing it. By the end of the eighteenth century, Jacob Bernoulli, Adolphe Quetelet, and other statisticians had developed modern probability theory, reducing chance to a predictable outlier or a standard deviation. Both statisticians and devotees of "rational religion" hastened the shift from a respectful and even fearful Renaissance vision of Fortune as a goddess to a modern, more confident understanding of chance as a condition to be managed.
Yeah, I don't fully buy that bit. Newton envisioned a purely deterministic universe, and probability theory is kind of not that.
As Wall Street sanitized speculation as "investment," and the mere manipulation of money through complex financial instruments became a path to self-made manhood, gambling continued to be stigmatized by society.
This I don't buy either. Sure, you can enter the stock market speculatively, risking capital on short-term investments. But there's one very important difference between investing and speculation: Insofar as investing can be considered subject to the whims of chance, historically, long-term investments are the equivalent of owning the casino, giving the investor the house edge.
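To put a toy number on that "house edge" point (all figures here are my own illustrative assumptions, not the article's): lotteries typically pay out only around half of ticket sales as prizes, while broad stock indexes have historically compounded at something on the order of 7% a year. Put the same money into each for a few decades and the expected values diverge wildly:

```python
# Illustrative expected-value comparison: lottery vs. long-term index
# investing. All constants below are rough assumptions for the sake of
# the sketch, not figures from the article.
ANNUAL_STAKE = 1_000     # dollars committed each year
YEARS = 30
LOTTERY_PAYOUT = 0.5     # assumed expected return per dollar of tickets
MARKET_RETURN = 0.07     # assumed average annual index return

# Lottery: negative expected value, and losses don't compound into anything.
lottery_ev = ANNUAL_STAKE * LOTTERY_PAYOUT * YEARS

# Index fund: deposit at the start of each year, then grow the balance.
market_value = 0.0
for _ in range(YEARS):
    market_value = (market_value + ANNUAL_STAKE) * (1 + MARKET_RETURN)

print(f"Lottery expected value after {YEARS} years: ${lottery_ev:,.0f}")
print(f"Index fund after {YEARS} years: ${market_value:,.0f}")
```

With these assumptions, $30,000 in tickets is expected to return about $15,000, while the same money invested grows to roughly $100,000. The difference isn't luck versus skill; it's negative expected value with no compounding versus positive expected value with it.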
Mason Locke Weems, whose early biography of George Washington introduced American readers to the story of the cherry tree, condemned gambling's effect on the "social body."
As the cherry tree story is pretty firmly in the realm of mythology, I wouldn't trust anything else that guy wrote, however well-intentioned.
The emergence of gambling as mass entertainment can be traced to several sources, including the willingness of cash-strapped state governments to substitute gambling revenues for taxation and the decline of job security among gamblers themselves—in a contingent labor market, where one can be dismissed for reasons having nothing to do with one's performance, the disjunction between merit and reward is more painfully apparent than ever. If hard work gets you nowhere fast, why not have a fling with Fortune?
They sold us on a lottery here in Virginia, back in the 80s, saying the profits would go to transportation and education. As far as I know, they did—but the legislature ended up cutting off more traditional sources of funding to those areas, leaving us worse off than before.
The willingness to relinquish control over outcomes—to play—promoted the insouciance toward money that lies at the core of the culture of chance. This outlook arose from an insight common among Fortuna's children: the recognition that you don't get what you deserve. You get what you get.
The article, and I, mentioned Newton up there. As I said, he saw a deterministic world, where calculations had fixed results and if you but had enough information, you could make any prediction accurately.
And that turns out not to be the case.
so at this hour
without delay
pluck the vibrating strings;
since Fate
strikes down the strong man,
everyone weep with me!
—O Fortuna |
September 24, 2022 at 12:02am
|
Today's article has about as much relevance to me as that going (shudder) outside one a couple of days ago. But other people probably need to see it.
As we all know, I really am positive: for example, I'm positive we're doomed.
Eight years ago, when Whitney Goodman was a newly qualified therapist counselling cancer patients, it struck her that positive thinking was being "very heavily pushed", both in her profession and the broader culture, as the way to deal with things. She wasn't convinced that platitudes like "Look on the bright side!" and "Everything happens for a reason!" held the answers for anyone trying to navigate life's messiness.
"You have cancer? Look on the bright side! You'll be dead soon and won't have to worry about it!"
As for "everything happens for a reason," fuck that noise. I mean, sure, you can find reasons if you look hard enough, or make them up for yourself, but no, some things are basically a roll of the dice.
This stayed with her and, in 2019, she started an Instagram account, @sitwithwhit, as a tonic to the saccharine inspirational quotes dominating social media feeds. Her posts included: "Sometimes things are hard because they're just hard and not because you're incompetent…" and "It's OK to complain about something you're grateful for."
I'd almost visit Instagram just to see that. Almost. Still, some things are hard because you're incompetent.
Goodman's new book, Toxic Positivity, expands on this thinking, critiquing a culture – particularly prevalent in the US and the west more broadly – that has programmed us to believe that optimism is always best. She traces its roots in the US to 19th-century religion, but it has been especially ascendant since the 1970s, when scientists identified happiness as the ultimate life goal and started rigorously researching how to achieve it.
Ah. Now I begin to understand why I have such a strong negative reaction to positivity. Other than, you know, the whole "universal balance" and "symmetry" thing.
More recently, the wellness movement – religion for an agnostic generation – has seen fitness instructors and yogis preach about gratitude in between burpees and downward dogs. We all practise it in some way. When comforting a friend, we turn into dogged silver-lining hunters. And we lock our own difficult thoughts inside tiny boxes in a corner of our brains because they're uncomfortable to deal with and we believe that being relentlessly upbeat is the only way forward. Being positive, says Goodman, has become "a goal and an obligation".
Or as I like to put it, every silver lining has a cloud.
Toxic Positivity is among a refreshing new wave of books attempting to redress the balance by espousing the power of "negative" emotions. Their authors are hardly a band of grouches advocating for us to be miserable.
And just to be clear, I'm not advocating that, either. Going all the way in the other direction to overcompensate is a very American thing (see also: McMansions vs. tiny houses), but it's not my way. Besides, seeking out misery for the sake of misery also comes from religion.
The road to the good life, you see, is paved with tears and furrowed brows as well as smiles and laughter. "I think a lot of people who focus on happiness, and the all-importance of positive emotions, are getting human psychology wrong," says Paul Bloom, a psychology professor at Yale and the author of The Sweet Spot, which explores why some people seek out painful experiences, like running ultra marathons and watching horror movies. "In a life well lived, you should have far fewer negative than positive emotions, but you shouldn't have zero negative emotions," adds Daniel Pink, the author of The Power of Regret. "Banishing them is a bad strategy."
Or as the great philosopher put it, "If everything was cool, and nothing sucked, how would we know what was cool?"
It's tougher making an argument for regret, which might be the world's most maligned emotion, but Pink is game. From a young age we are instructed to never waste energy on regrets. The phrase "No regrets" is inked into arms and on to bumper plates and T-shirts.
I have no idea if it's real or a Photoshop job, but there's a picture circulating of some dude who got "No Regerts" tattooed on his skin.
"Regrets clarify what matters to us and teach us how to do better. That's the power of this emotion – if we treat it right."
I've spent my life toiling in the regrettium mines of Regrettistan, so I can dig it.
Telling others about it lightens the weight. Complaining is perfectly natural, says Goodman. And articulating it helps us pinpoint what it is that's bothering us, because language converts this "menacing cloud" into "something concrete", says Pink. That disclosure could be to a friend, therapist or total stranger.
Just first make sure the person you're kvetching to isn't a positivist. Or, you know, start a blog.
Your next step will likely depend on the nature – and severity – of the emotion. To help us sit with sadness, Russell advocates being in nature.
Does looking out a window count?
"It sounds a little 'woo', but there are lots of studies about the effectiveness of reading therapy and looking at a piece of art – and how music can change our moods," she says. "Sad music can act as a companion when we're feeling sad, rather than making us feel lower. I do think it's liberating when you finally kind of surrender to it all."
This is what I already do. Happy music makes me stabby. Sad songs make me happy. As they put it on a well-known episode of Doctor Who: "What's wrong with sad? It's happy for deep people."
Leaning into negative thoughts should ultimately leave you with a sense of fulfilment. While we might instinctively think that filling our days solely with joy and excitement is the dream, "if we want to live a meaningful and purposeful life, a lot of pain is going to be part of it", says Bloom. "What I really want is for people to be able to enjoy the full range of the human experience," adds Goodman.
Also, booze works for me. I know, I know, you're not supposed to say that, because, supposedly, alcohol is a depressant. But I've never been happier than when I'm sitting at a bar or on a patio, having a beer (or something else with ethanol), listening to depressing music.
Maybe I'm just weird. But I don't think I'm that weird.
I know I've shared this page before, but it's relevant here, and one of my favorite image series is Fitness quotes over pictures of drinking. |
September 23, 2022 at 12:03am
|
Now that autumn is here (dammit), it seems appropriate that this article about fusion came up tonight. The equinox, which occurred yesterday, marks the point where the sun appears to cross the Earth's equator, and of course the sun is powered by fusion.
Yeah, yeah, it's a stretch, I know.
I see even Fast Company isn't above making puns in its headlines. Nuclear? Booming?
Anyway, as the article points out, Earthbound fusion reactors seem to be just a few years away—just as they've been for at least my entire life.
With energy prices on the rise, along with demands for energy independence and an urgent need for carbon-free power, plans to walk away from nuclear energy are now being revised in Japan, South Korea, and even Germany. Last month, Europe announced green bonds for nuclear, and the U.S., thanks to the Inflation Reduction Act, will soon devote millions to new nuclear designs, incentives for nuclear production and domestic uranium mining, and, after years of paucity in funding, cash for fusion.
It's important to note that while fission and fusion are both technically "nuclear power," there's a huge difference; fusion (at least theoretically) isn't anywhere close to as dangerous, despite the incredible temperatures involved.
You've likely heard this one before. The running joke is that economically harnessing fusion power, which is what a star or hydrogen bomb does, is about 30 years away. What's not a joke is that we have about zero years to stop powering our civilization with earth-warming energy.
Thirty years? I've been hearing "twenty" for at least fifty.
One milestone came quietly this month, when a team of researchers at the National Ignition Facility at Lawrence Livermore National Lab in California announced that an experiment last year had yielded over 1.3 megajoules (MJ) of energy, setting a new world record for energy yield for a nuclear fusion experiment. The experiment also achieved scientific ignition for the first time in history: after applying enough heat using an arsenal of lasers, the plasma became self-heating.
There's a very good brewery right across the street from that lab. I'm pretty sure I drank with some of the people working there.
And you'd be forgiven for missing another milestone in July, when the Energy Dept. announced awards of between $50,000 and $500,000, to ten fusion companies working on projects with universities and national labs. Here are a few of the awardees, who include some of the industry's leading companies, and whose projects offer a sampling of the opportunities—and hard problems—in fusion.
Those awards seem lame in comparison to the total investment in fusion research, but I'm sure every little bit helps.
The rest of the article talks about said awardees (understandable, considering the source), and it's all very interesting, but I want to point out a more philosophical angle on fusion.
Pretty much every source of energy we have right now is, when it comes right down to it, solar.
Hydroelectric systems work by harnessing the energy of surface runoff, which got to where it was because the sun evaporates water, which turns into rain, which turns into runoff. Wind is largely driven by heat differences; the heat source is the sun. Fossil fuels such as coal or natural gas were the result of living things, which also ultimately draw their power from the sun. Solar is, of course, direct from the accursed daystar. And while fission isn't "solar" per se, the fissile material (uranium or whatever) was forged in the incomprehensible power of a supernova.
Fusion, at least in principle, doesn't rely on the sun. Oh, sure, the daystar is composed primarily of hydrogen, but most hydrogen was created in the Big Bang itself. (Though I should note that the article points out that tritium, which is actually a hydrogen atom with two neutrons instead of zero, is derived from lithium, which wasn't a huge proportion of the primordial elements.)
In any case, fusion is kind of the Holy Grail of power. While I can't speak to the monetary costs involved (the article does that to some extent), it doesn't seem to directly cause global warming the way fossil fuels do, or result in radioactive waste like fission reactors, or fuck up fish habitats like hydroelectric, or take over a landscape like wind farms.
I'm sure it has its drawbacks, too. Water use, maybe? I don't know. The only really clean alternative would be to not use electricity at all, and that ain't gonna happen.
At least until the inevitable end of civilization. |
September 22, 2022 at 12:01am
|
Again with the (shudder) outdoors.
1. Don't go in the wild.
2. See #1
But I suppose people don't heed this advice (or any other), and they find themselves over a mile from the nearest 7-11 from time to time, so it's important to know how to survive in such a situation.
As far as I can tell, the article actually provides helpful how-to instructions. I wouldn't know, because I don't intend to ever be in the kind of situation that requires them ever again, so I'm not double-checking.
1. Navigate with a Map and Compass
Triangulation has kept outdoorsfolks found and alive as long as magnetized needles have been pointing north, and it's still the most effective method of getting unlost that doesn't involve an orbiting satellite.
I love GPS. I think it's the greatest invention since beer (sliced bread doesn't even come close). But one cannot rely solely on it; batteries die, and satellites get shot out of the sky. I know how to read a map. Hell, I even know how to fold one.
2. Call for Help with a Signal Mirror
Long before the era of SIM-card GPS messengers, calling in the cavalry required little more than a shaving mirror and a rudimentary grasp of Morse code.
Difficulty: receiver needs a rudimentary grasp of Morse code. I mean, all I know is S, O, and S, but hopefully that would be enough.
3. Live to Tell the Tale
My closest call in 40 years of wilderness travel came about—not surprisingly—on a Field & Stream assignment in Alaska.
Reading this author's harrowing story would be enough, by itself, for me to limit my exploration to areas with breweries nearby.
4. Learn How to Snare a Rabbit
Or—and bear with me here; this is complicated—always know where the nearest grocery store is, making sure it's not more than a couple of miles.
5. Make a Tinder Bundle
Sadly, this has nothing to do with a popular hookup app.
Timeless Gear: The Old Flame
The Doan Magnesium Fire Starter was invented in 1973 by an outdoorsman named Sol Levenson, who came up with the idea while on a trip to South America...
I went through that entirely irrelevant-to-me article only to find it was a crummy commercial!? Son of a bitch! |
September 21, 2022 at 12:03am
|
I only had a few articles about avocados in my queue, but for some reason the random number generator decided to lump them all into one month. This month. I think this (from The Guardian) is the last one... for now.
Does the pumpkin seed paste come with pumpkin spice? I'm asking because, though the article is from last November, it's almost the autumn equinox here and now—peak pumpkin spice season. And I'm not immune: I just picked up a six-pack of Atomic Pumpkin Voodoo Ranger, a seasonal beer offering from New Belgium out of Colorado.
On the one hand, they are deliciously creamy, versatile and gloriously Instagrammable.
That last bit means nothing to me, but I suspect they're only photogenic during that fifteen-second period between "too hard" and "rotten."
On the other, they have an enormous carbon footprint for a fruit, require up to 320 litres of water each to grow and "are in such global demand they are becoming unaffordable for people indigenous to the areas they are grown in", according to Thomasina Miers, the co-founder of the Mexican restaurant chain Wahaca.
On the gripping hand, I'm not sure how much I trust anyone who deliberately misspelled Oaxaca apparently just so gringos won't mangle the pronunciation thereof.
For some time, the chef has struggled to balance the devastating environmental impact of avocado production with her customers' appetite for guacamole. Now, she thinks she has found the answer: a vibrant, green guacamole-inspired dip, made from fava beans, green chilli, lime and coriander.
Eh, whatever. I rarely eat guac anyway. And when I do, it's spicy enough so I probably wouldn't care if it contained actual avocado or not.
Sucks if you're one of the people for whom coriander tastes like Dawn, though.
The dip – called Wahacamole...
And this is the place where I'd normally close the website window in disgust. But for the sake of my millions of avid followers, I read on. But first I'll note that I would definitely go there if I could just so I could order "whack-a-mole."
In Toronto, the Mexican chef Aldo Camarena recently suggested a guacamole alternative made with courgette and pumpkin seed paste.
That's "zucchini" for Americans. And to be honest, the fava beans one sounds much better.
Last year, the chef Santiago Lastra included a guacamole-style dip made from pistachios and fermented gooseberries on the menu at Kol, his Mexican restaurant in London.
Spain colonizes Mexico. Mexico produces guacamole, then England gets Mexican restaurants and changes the guac. It's the circle of life.
"A few years ago, I was quite well known for my use of avocados in my cooking – so much so that I dedicated a whole Instagram account... to my love for them," says the vegan cookery writer Bettina Campolucci Bordi.
Wow. I really, really wish I'd stopped reading when my gut screamed at me to do so a couple of paragraphs ago.
She decided to cut back when she moved to the UK, having previously lived in Spain, where she could source avocados locally. "My favourite recipe to date uses British peas instead. I blanche the peas before crushing them and mixing them with plant-based sour cream, salt and pepper, a little grated garlic and a spritz of lemon juice."
Because before this, mashed peas were unheard of in the UK.
(That's a joke. Mashed peas are the traditional accompaniment to fission chips.)
But avocados are challenging to replace – as are their derivatives, avocado oil and avocado butter, which are important in gluten-free and vegan baking.
Hang on, I gotta find my microscope so I can use it to search for my tiny violin.
Plus, for many fans of the fruit, a dip made from beans, nuts, seeds or vegetables is no more a replacement for guacamole than smashed broad beans on toast (as suggested by Tom Hunt's recipe for not-avocado on toast) is an alternative to smashed avocado.
Because before this, beans on toast were unheard of in the UK.
Now, just to be clear, I'm not arguing for or against avocados. Personally, I can take them or leave them; the avocado industry could dry up (pun intended) tomorrow and I doubt I'd miss it much. I just thought some of the guac alternatives were creative, so I'm sharing even though the article treats food as a fashion trend rather than what it actually is.
I know I've said this before (sometimes I repeat myself without meaning to because my memory is shit), but one of the few benefits of living on civilization's downhill slide to oblivion is that we can get food from all over. No, I don't give two shits about sourcing food locally; I'm happy to eat tomatoes from California in January, when they're not available from nearby farms, and imported curry from India (for example) all year. But when your only source for a food is a place where it takes all the water to make said food, well, even I will think twice about that.
I'm just not going to take this article's word for that being the case. |
September 20, 2022 at 12:01am
|
Today, we're back to talking about marketing. You won't believe what happens next!
I'd argue that all of them are.
In the modern world, marketing pervades just about every aspect of our lives, from what we eat to how we spend our free time to who we rub our genitals on. Even some of the most basic facts of life and institutions we hold most sacred can be traced back to some guy who wanted to sell us something.
I've said before that science is value-neutral, but it can be used for good or evil. Marketing is an example of using the soft science of psychology for evil.
Well, okay, it's not always "evil." But I wouldn't call it "good" for anyone but the person making money and/or gaining power from it. So I think it's important to know the tricks, if only to protect yourself from it.
Since this is Cracked, the list counts down. And since there are 15 of them, I'm going to be choosy.
15. Coffee Breaks
...but it didn't become widespread and wasn't even called a "coffee break" until it reached the ears of the Pan American Coffee Bureau in 1952, when they launched their "Give Yourself a Coffee-Break" campaign to take advantage of the opportunity to sell more coffee.
No amount of marketing can convince me that coffee is something I should drink. On the other hand, Coke's marketing worked on me.
13. Bacon
Remember back in the 2000s, when everything from chewing gum to lube came in bacon? It was a confusing and chaotic time, and it was all because the health craze of the '80s plummeted pork sales, so executives crafted a backlash as a way to, again, sell more bacon.
On the other hand, bacon is delicious, and I've often wondered if Atkins was getting kickbacks from Big Pork.
Oh, and that decade was the noughties. Goddammit, I am going to make that a Thing. Even if I have to resort to marketing to do it.
11. Diamond Engagement Rings
I think people have finally caught on that this was a gigantic marketing gimmick by DeBeers, and yet the "tradition" (less than 100 years old, per this article) continues that one must gift slave carbon to one's betrothed.
10. Wedding Registries
By the 1920s, the ancient practice of showering newlyweds with weird symbolic gifts had mostly died out, and only their closest friends and family were expected to do anything more than party their faces off. That didn't suit department store Marshall Fields, who invented the wedding registry in 1924 to sell more merchandise under the guise of providing the couple with everything they need to start their new life (except any kind of useful sex education).
Speaking of symbolic gifts, periodically, I'll see one of those "anniversary" lists that show what you're "supposed" to give each other for particular wedding anniversary milestones. I've wondered why more people didn't market based on that.
8. Fatherâs Day
Look, almost every holiday has been either co-opted by marketers, or came about specifically to sell shit. We know this. Father's Day is just one of the more recent of these. (To her credit, the lady who promoted Mother's Day in the US was reportedly incensed about all the selling surrounding it.)
4. Ten Commandments Monuments
People are pretty protective over the monuments to the Ten Commandments that appear on government property across the country considering they didn't exist until Cecil B. DeMille needed to promote a movie.
Worst of all, the vast majority of the Ten Commandments (whichever version you believe in) have absolutely no bearing on the modern legal system. But that's a blog for another time.
1. Santa Claus
Of course, Santa as we know him wasn't popularized until 1931, when Coca-Cola unveiled its latest holiday campaign featuring the red-coated, red-nosed, "huh, suspiciously red-focused" Santa of the modern era, inspired by the description in "'Twas the Night Before Christmas."
Perhaps one of the most successful marketing campaigns of all time, though I'm not sure how effective it is anymore at selling fizzy beverages. But considering I'm already seeing Santa-themed bullshit everywhere in September and I don't even leave the house, well, someone's making money somewhere.
And again, there's nothing wrong with wanting to sell something actually useful. It's only when that power is used for wicked ends, like overpriced fake-wellness products and chicken wings, that I have a problem. |
September 19, 2022 at 12:02am
|
Ooooh, scary robot vehicles. Ooga booga.
While the article is interesting and details a different approach to implementing AVs, that's not what I want to talk about.
Instead, I want to talk about how AVs can't get implemented fast enough.
"But Waltz, what about all the stories about autonomous vehicles killing people?" Yes, they happened. Sometimes airplanes kill people, too, and that's considered the safest mode of transportation. Every AV death, and I suspect most injuries caused by them, gets reported. Loudly. Meanwhile, if they did that with every death attributable to human-controlled vehicles? You'd be inundated with horror stories. About a hundred a day, in fact—just for the ones here in the US. One every 10 to 15 minutes, on average. That's just fatalities, not survivable injuries.
Of course, there are substantially fewer AVs than... humVs? Nah, that's already taken. HVs. So you can't really compare them like that. But the point is that, right now, about 40,000 people a year die in HV incidents in the US. While AV technology isn't mature yet, if it can reduce that number by a significant amount, it's a net benefit. What's "significant"? As far as I'm concerned, one standard deviation would do it.
It will not reduce it to zero. That shouldn't be the expectation, or the goal. We've already decided, as a society, that 40K a year is an acceptable death rate for the privilege of driving around. But we also implement things in an effort to reduce that number.
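The arithmetic behind those rates is easy to check; here's a quick sketch, using the approximate 40,000-per-year figure mentioned above:

```python
# Back-of-the-envelope check of the US traffic fatality rates quoted above.
# 40,000 is an approximation; the exact annual figure varies.
ANNUAL_DEATHS = 40_000

per_day = ANNUAL_DEATHS / 365          # roughly 110 deaths per day
minutes_between = (24 * 60) / per_day  # roughly one every 13 minutes

print(f"~{per_day:.0f} deaths per day")
print(f"one every ~{minutes_between:.0f} minutes, on average")
```

That works out to about 110 a day, or one every 13 minutes or so, consistent with the "hundred a day" and "every 10 to 15 minutes" figures above.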
I know I've said stuff like this before, but it's been a while.
There are, of course, significant hurdles to overcome, even apart from the technology needed.
First, public acceptance. As I noted up there, robots are scary. "Common sense" tells you that it's better to have a person in control. Common sense is, as usual, wrong. But that's a big barrier to jump over, considering that many people are scared shitless of airplanes, which are, as I said, about as safe as they come.
Second, and this is even more pernicious, there are a lot of people making a lot of money from human-driven vehicles. I don't just mean Uber drivers, long-haul truckers, cabs, etc.; those are small-time, though that many people losing jobs at once might finally trigger a long-needed revolution. I mean governments issuing speeding tickets, using red light revenue cameras, draining the coffers of drunk drivers (of course drunk driving is bad, but as it stands there are institutions using it as a revenue stream, which means they need it to continue so they can keep getting money), and so on. And AVs don't speed, run red lights, or drive drunk.
Such institutions have a vested interest in keeping that revenue stream coming, so if they haven't already, they're going to campaign against AVs. This won't be limited to lobbying; at this point, I figure any scare article about AVs is being funded by the Fraternal Order of Police or some mayor's office or whatever. No more using speed traps in small towns as a revenue source. No more harassing drivers based on race (unless you count AI as a race). What are cops going to do if they can't pull speeders over? Solve actual crimes? Ha! No money in figuring out who stole your bicycle.
Do things need to be improved before it's implemented? Absolutely. Are there issues to be settled, such as liability, privacy, and how to handle the unexpected? Of course. But the benefits make the work worth it. Well, for other people doing the work, anyway. Me, I'm just hoping the day soon comes when I can take one home from the bar. It's not drunk driving if you're not driving; it's the same as taking an Uber, only without the judgmental glare from the driver's seat. |
September 18, 2022 at 11:31am
|
I recently wrote about the origins of avocados. Those are in here, but so are some other delicious foods and beverages.
Link is from Cracked, so I'd rely more on the comedy than on the facts. But I don't think they got the facts horribly wrong.
It's fascinating to look back through history to see why we humans ended up eating the food we do. You come upon such explanations as "a company exec thought of it," "an advertising exec thought of it," "a marketing exec thought of it," and, of course, "war."
There's also "food is scarce so let's figure out how to make this disgusting thing marginally edible." The threat of death is part of the "necessity" that's allegedly the mother of invention (laziness is the milkman).
5. We Drank Milk For Thousands Of Years Before Evolving The Ability To Digest It
And some of us never quite did.
The article talks about a scientific study done to refute the idea that the ability to digest lactose gave some humans an evolutionary advantage. Then:
And then, over the course of the next 3,000 years, that gene variant, which was theoretically so useful ... didn't spread at all. It remained just as rare. The vast majority of people lacked any ability to digest dairy, but everyone went on eating it just the same. "This cheese is too good to give up," they said, blasting noisy diarrhea, for millennium after millennium.
The problem with that quote: while the purpose of cheese is to make milk last longer, it has the side effect of making it more digestible to lactose-intolerant humans.
We used to tell each other that lactase persistence offered a large evolutionary advantage, but it apparently did not. Maybe this was for the same reason that lactase persistence offers no boost to your life expectancy today.
Fweet. Flag on the play. Life expectancy isn't directly correlated with evolutionary advantage. Evolution only "cares" that you live long enough to reproduce. Anything after that may have some social benefit, but it's not a direct driver of natural selection.
4. We Domesticated Chickens For 7,500 Years Before We Figured Out Eating Them
You know, when you think about it, eating pretty much anything for the first time required some kind of leap of faith. Especially mushrooms, but also eggs. "I'mma eat the next thing that comes out of this chicken's ass. Hold my beer." The birds themselves, though? This chickenless past is genuinely surprising, considering how delicious those descendants of dinosaurs can be, and ancient humans must have witnessed other animals chowing down on poultry.
Take the Bible, which is full of talk of people slaughtering lambs, goats, and fatted calves. Notice what no one eats in the Bible? Chicken. There are special rules if you're thinking of eating a camel or a badger, or a vulture or an ostrich, but no mention of eating chicken. (Which means we can assume chicken's not unclean, but still, no one in the Bible goes and eats one.)
No, chicken is absolutely kosher. Well, like any meat, it has to be prepared in a kosher manner, but that's a more recent development.
I'd always assumed they weren't in the Bible because they were native to the other side of Asia and hadn't quite spread to the Middle East.
3. The Goats Who Danced And Taught Us To Drink Coffee
That would make an excellent band name.
But we do know it took a lot of trial and error, based on evidence we have of prior attempts to eat that bush. Before humans ever made the beverage we call coffee, they made coffee protein bars by mashing the berries and mixing them with animal fats.
I dislike animal fat (except for delicious bacon), and I also never got a taste for coffee, so all I can say is:
Ew.
2. We Can Track The Rise Of Alcohol To Our Evolution As Primates
This one, of course, is the most relevant to my interests.
Humans have been making alcohol for quite a while, maybe even for 9,000 years. We've been drinking alcohol much longer than that, however. The first time anyone drank alcohol was when some ancient ancestor of humans picked up a fruit that had naturally fermented. They ate it, immediately spat it out, then ate some more, because it might have tasted foul but they liked what it was doing to them.
This is, of course, pure speculation, but I like it.
This section gets a little technical for a dick joke site, but it's a fascinating look into the biology involved.
1. Avocados Are So Huge Because Giants Swallowed Them
Fruit really is pretty cool, isn't it? You may think wild berries don't taste as good as, say, ham, but we carnivores evolved a taste for ham to recognize the nutrition present in meat, despite pigs not wanting us to eat them. Fruit, on the other hand, evolved to cater to our tastes, becoming more tasty because it wants to be in your mouth. Fruits contain seeds. The tastier the fruit, the more likely animals are to eat the seeds, then defecate them somewhere distant, spreading that plant's genes wide.
This is, obviously, simplistic and overgeneralized. Plenty of fruits don't want to be eaten. Chili peppers, for example. That spicy flavor comes from capsaicin, as you probably know, and it evolved so animals would take one bite and go "nope." Humans being humans, though, we're like "I'mma eat that hot thing. Hold my beer. BY QUETZALCOATL'S SWINGING BALLS GIMME BACK MY BEER."
(Capsaicin-producing plants originated in Central America and yes, those peppers are technically fruits. Like tomatoes.)
Lots of fruits had bigger seeds before we selectively bred the seeds out and shifted to cloning. Bananas were small and full of lumpy seeds before we fixed them.
I once saw a video made by a young-Earth creationist touting the banana as "proof" that God created the perfect fruit. I laughed. Bananas are evidence that selection pressure can change an organism.
Anyway, I went on about avocados in that entry I linked above, so no need to rehash it here. Re-guac it. Whatever.
Many plants were spread exclusively by megafauna. That means they would have gone extinct the same time megafauna did, had humans not cultivated them. In addition to avocados, these plants include pumpkins and gourds...
When we humans die out, all these crops will vanish. If we care about them surviving after we are gone, we have only one responsible course of action ahead of us: We must bring extinct megafauna back to life, and set them loose as farmers. We must unleash the mammoth, the dire wolf, and, yes, even the Megalania, a 4,000-pound venomous lizard.
I take back what I said up there. The awesome band name would be Megalania. |
September 17, 2022 at 12:01am
|
Hey look: a headline question to which the answer is "yes."
Does time really exist?
We take for granted that time is real. But what if it's only an illusion, and a relative illusion at that? Does time even exist?
Astute readers who follow the above link will note that, above said headline, is an important string of characters; to wit: "April 26, 2022." In other words, they're asking if time exists right after specifying the time coordinate (to within 24 hours) when the article was published.
Still, even though I take a very materialistic stance on the subject, it's worth thinking about, and the article makes some interesting points. It also ends up agreeing with my philosophical stance, so there's that.
In a philosophical sense, we're taught to doubt and question everything.
No, "we" are not. For various values of "we."
Even the reality of ourselves and our own experiences are up for debate, as we have to make certain assumptions about how trustworthy our sensors (and our own senses, for that matter) actually are in order to arrive at any satisfactory conclusions.
Stub your toe on the foot of a piece of furniture and then tell me it doesn't exist.
I used to work with a guy who was devoutly religious. He kept showing me optical illusions, which of course I appreciate. But it dawned on me one day that he was sneakily trying to tell me, "Look here, you can't trust your senses. You can only trust in God." Whereas the conclusion I draw from optical illusions is: "Things can be different from what they seem. Therefore, you need science to explain why your senses are deceiving you." Because every one of those illusions had a very good explanation.
Sure, certain things might appear real, but isn't it possible that those appearances are deceiving, and that quantities or concepts that we take for granted might be nothing more than very convincing illusions?
Again: if it can be measured, quantified, and tested; or if it gets in the way of your really very pain-sensitive toe, it's "real." The chair I'm sitting in keeps me from falling to the floor. The floor itself keeps me from falling to the center of the Earth. All due to gravity, which can't be seen, but can damn sure be measured. To claim otherwise is to muddle the distinction between "reality" and "illusion."
Sure, an argument can be made—and I've made it myself—that what we sense isn't the whole picture. That's fine. Zoom in far enough and everything's energy, space, and quantum fields. What we sense is what was useful for our ancestors, going back to the first molecule of proto-life, to find energy and avoid becoming someone else's source of energy for as long as possible. It may all—consciousness, solidity, time, space—be an emergent property of some other process, but that doesn't make it any less real.
We've learned lots of surprising and counterintuitive lessons from our investigations of time. Time is relative, not absolute.
True. Doesn't mean it's an illusion. Sure, your "now" is (very, very slightly) different from mine, but we both sense the flow of time.
And as I've also banged on about in here, if anything's an illusion, it's the concept of "now." You don't actually live in the present; only the past and the future are real. By the time you realize it's "now," it's not "now" anymore.
Time always marches forward, not backward, but we still lack an explanation for the arrow of time.
Eh. Maybe.
Thermodynamically, the Universe has an arrow of time, which "flows" in the same direction as increasing entropy.
We, as the sensor, are also subject to the laws of thermodynamics—which are, indeed, emergent properties; they are the result, not of a single particle or field, but of a collection of wave functions. Heat, for example, isn't the property of an individual atom, but of billions of them all bouncing off each other. Is heat, therefore, an illusion? If you think it is, you've got some 'splainin' to do to some really sweaty folks.
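That many-particles point is easy to demonstrate numerically. Here's a minimal sketch (my own, not from the article) of the Ehrenfest urn model, a standard toy system for the thermodynamic arrow of time: every individual move is perfectly reversible, yet a system that starts in an ordered state drifts, statistically, toward the disordered 50/50 split and hovers there.

```python
import random

def ehrenfest(n_balls=1000, steps=20000, seed=42):
    """Ehrenfest urn: each step, pick a ball at random and move it
    to the other box. Each step is reversible, but the count still
    drifts toward n_balls / 2, the maximum-entropy state."""
    rng = random.Random(seed)
    in_left = n_balls  # start maximally ordered: every ball in the left box
    history = [in_left]
    for _ in range(steps):
        # the chosen ball is in the left box with probability in_left / n_balls
        if rng.random() < in_left / n_balls:
            in_left -= 1
        else:
            in_left += 1
        history.append(in_left)
    return history

history = ehrenfest()
print(history[0], history[-1])  # starts at 1000, ends hovering near 500
```

Reversing the movie of any single step looks fine; reversing the whole run (watching all the balls march back into one box) essentially never happens. That asymmetry of the aggregate, not of the parts, is the arrow.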
And when we investigate the Universe on a fundamental level, it turns out that time may not be fundamental at all.
"Not fundamental" doesn't equate to "not real."
When it comes to the question of existence, physics is very simple and straightforward about what it considers to be a satisfactory answer.
Can you measure it?
Can you quantify it?
Can you define it in a mathematically self-consistent way?
Is it, itself, an observable quantity, and do other observables depend on it in an inextricable way?
If your answers to these questions are all in the affirmative, there's no way out of it: you've got yourself a quantity that exists.
Which is, to an extent, what I've been saying. Only the article is far more rigorous.
Then the article applies these questions to time:
You might think, then, that perhaps time itself is pathological [their word for, basically, nonsensical and not understood by science]. Sure, we can measure it, quantify it, and even observe both its passing and the consequences of its passing. But shouldn't it matter that your measurements of "how much time has passed" between the start and end of an event depends entirely on where you are and how you're moving when you're making those observations?
My answer: no, because we can quantify and measure and define the differences in the way time works in different frames of reference. If we couldn't, GPS devices would simply not work. For example.
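The GPS claim is checkable with first-year physics. Satellite clocks tick fast because they sit higher in Earth's gravity well (general relativity) and slow because they're moving (special relativity); the net drift of roughly +38 microseconds per day has to be corrected for, or position fixes would wander by kilometers within a day. A rough first-order estimate, using standard textbook constants (this is my back-of-the-envelope sketch, not anything from the article):

```python
import math

# Approximate physical constants and GPS orbital parameters
GM = 3.986e14       # Earth's gravitational parameter, m^3/s^2
c = 2.998e8         # speed of light, m/s
R_earth = 6.371e6   # Earth's radius, m
r_sat = 2.656e7     # GPS orbital radius, m (~20,200 km altitude)
day = 86400         # seconds per day

v_sat = math.sqrt(GM / r_sat)  # circular orbital speed, ~3.9 km/s

# General relativity: clocks higher in the gravity well run faster
gr_shift = (GM / c**2) * (1 / R_earth - 1 / r_sat) * day

# Special relativity: moving clocks run slower
sr_shift = -(v_sat**2 / (2 * c**2)) * day

net = gr_shift + sr_shift
print(f"net satellite clock drift: {net * 1e6:+.1f} microseconds/day")
```

The two effects pull in opposite directions (about +46 and -7 microseconds per day here), and engineers build the correction into the satellite clocks before launch. Relative time, precisely quantified.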
And I'm not going to paste the long-winded example, taken pretty much directly from Einstein, but spoiler: I wasn't too far off the mark.
But could it be the case that, perhaps, we only perceive time to exist, and that it isn't, in fact, actually real?
We can consider this from a particular perspective: looking at the notions of symmetries in physics. After all, the laws of physics, at least as we know them, are time-symmetric.
The laws of physics are, yes. Most of them, anyway. The laws of thermodynamics, however, are not.
But there are two ways to identify a physical difference between progressing forward in time and backward in time. The first is by looking at reactions that proceed via the weak nuclear force, such as radioactive decays.
The article gets a little technical here, but it should be clear to anyone with a little knowledge in the area that uranium decays into other stuff, which then doesn't turn around and undecay back into uranium. For example.
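The one-way character of that decay falls straight out of the math: a constant fraction of whatever remains decays per half-life, so the amount only ever goes down. A trivial illustration (uranium-238's commonly cited half-life plugged in as an example figure):

```python
def remaining_fraction(elapsed, half_life):
    """Fraction of a radioactive sample remaining after `elapsed` time,
    given exponential decay with the stated half-life."""
    return 0.5 ** (elapsed / half_life)

U238_HALF_LIFE = 4.468e9  # years, commonly cited value

# After one half-life, half remains; after two, a quarter.
# There is no input that makes the fraction go back up: the arrow points one way.
print(remaining_fraction(U238_HALF_LIFE, U238_HALF_LIFE))      # 0.5
print(remaining_fraction(2 * U238_HALF_LIFE, U238_HALF_LIFE))  # 0.25
```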
The second way, however, is even more familiar to most of us. Every time you:
scramble an egg,
drop a full glass of water onto the ground and watch it shatter,
or simply open the door between a hot room and a cold one,
you are creating a situation where there will be a thermodynamic arrow of time.
And that process is also going on in your brain, which is observing all that.
However, there are two important caveats to this discussion. While itâs true that time is real, itâs important to keep the following facts in mind.
Too much text to copy here; you'll have to look at the article. Basically, we don't know everything. Well, duh. We knew that we don't know everything. That's what makes science so damn interesting.
Despite the popular trend to question the nature of time, its physical "realness" is not in doubt. Time is an integral part of the Universe, and the boundary between events that have been observed or measured to have a definitive outcome and those whose outcome has not yet been decided is the best way we have to define, precisely, what we mean by the moment of "now."
And, again, philosophically, it's important to think about these things. But as I've said before, if time is an illusion, then space is also an illusion, which makes everything we sense an illusion, which in turn calls into question the definition of "illusion."
So, as I said above, this is a rare case of the answer to a headline question being a "yes." It is, of course, a qualified "yes," as with most things in science. But the fact that there's a headline that can be answered with a "yes" calls into question the very foundations of the laws of the universe. That's what we should be worried about, not that time might be illusory. |
September 16, 2022 at 12:05am
|
You know, I used to think it would be cool to be able to somehow record your dreams in the same way a video camera records reality.
Why and How Do We Dream?
Dreams are subjective, but there are ways to peer into the minds of people while they are dreaming. Steven Strogatz speaks with sleep researcher Antonio Zadra about how new experimental methods have changed our understanding of dreams.
Dreams are so personal, subjective and fleeting, they might seem impossible to study directly and with scientific objectivity. But in recent decades, laboratories around the world have developed sophisticated techniques for getting into the minds of people while they are dreaming.
SCIENCE!
In the process, they are learning more about why we need these strange nightly experiences and how our brains generate them.
For some of us, they're not just nightly. How about a little acknowledgement of different circadian rhythms?
In this episode, Steven Strogatz speaks with sleep researcher Antonio Zadra of the University of Montreal about how new experimental methods have changed our understanding of dreams.
Ugh. Podcast. No thanks; I prefer reading. Fortunately, this article is a transcript.
In this episode, we're going to be talking about dreams. What are dreams exactly? What purpose do they serve? And why are they often so bizarre? We've all had this experience: You're dreaming about something fantastical, some kind of crazy story with a narrative arc that didn't actually happen, with people we don't necessarily know, in places we may have never even been. Is this just the brain trying to make sense of random neural firing? Or is there some evolutionary reason for dreaming?
While they can't answer all those questions, at least not fully, it seems (from the article) that they're making some progress at them.
The bulk of the transcript is an interview between the author and a Dr. Antonio Zadra from Montréal, and that person's name would be excellent for a supervillain. But he's not. Apparently.
Even when dreams are studied in the laboratory, you can look at what's going on in the brain or body while the person is dreaming (for instance, in REM sleep), but what they are dreaming about at that moment, we usually can only know once we wake up the individual, and he or she tells us about the dream they were experiencing. So dreams are a private, subjective experience.
Well, that's something of a relief.
Why, after I said I wanted to find a way to have them recorded?
Because we've seen what capitalism does to web browsing. Hell, one time I was just having lunch with a friend, on the patio of a local restaurant. We were talking about how much we'd like to go back to Vegas. After about a minute of this, our phones beeped nearly simultaneously (I'm Android and he's Apple, so they're different). We checked.
It was a message from Caesar's Palace.
Now imagine what marketers will do when they get access to our actual dreams.
But we know for instance, if we take the most vivid dreams, those that tend to occur in REM sleep, well, we know that the secondary visual areas are activated. And that makes sense because dreams are highly visual experiences. So the primary visual areas aren't activated for the simple reason that your eyes are closed, there's no visual input entering through your retina. So your brain is creating this. We also know that your motor cortex, the part of your brain that controls motor movement, is activated. And that probably is one of the things that helps give us the impression that we are moving through a real three-dimensional physical world in our dreams. We know that the limbic system is also activated, and the amygdala, which probably helps explain why many dreams contain various degrees of emotions, so we are emotionally engaged in them. And we know that parts of the prefrontal cortex, the part of your brain that sits about an inch or so above your eyes, is deactivated. And so this also explains why these areas of the brain are important for what we call executive functions, judgment, critical thinking, planning, things that are usually absent in our dreams.
This might also explain why, most of the time, we don't "know" we're dreaming. Some people try different techniques to induce lucid dreaming, which sounds interesting to me, but too much work; without that, for me at least, once you figure out you're dreaming, the dream generally ends or takes on a different form.
Now, admittedly, the article is pretty long (one reason I prefer reading to listening/viewing is that I can skim if I want, and I did). And there's a lot more at the link; more than I'd care to sift through for this entry. But here are a few choice quotes that I found especially enlightening:
For every two hours we spend awake, it appears that the brain needs to shut off all external input for an hour to make sense of what we've experienced. And that is what sleep is in part.
That might be the most concise explanation for why sleep is a thing that I've ever seen.
And even in phenomena such as lucid dreams, dreams in which you know that you're dreaming, you have little idea of what happens next in your dream. Your brain is keeping this information from you. So in a lucid dream, you might make a dream character appear, for instance, but then if you ask them a question (Who are you? What are you doing in my dream? What is the most important thing I should remember out of this?), you have no idea what the character is going to say. But your brain does. Your brain is what is creating this character.
Which should be especially pertinent to us writers.
And so when people say, "Oh, you can do anything in your dream," or "You are the producer and main actor of your dreams," I don't think that's correct. You're not at the wheel of the dream construction process; your brain is. And your brain intentionally keeps much of the information of what's going to be happening next, and how things unfold, away from you.
Here, though, our supervillain makes a distinction between "you" and "your brain," which I find distasteful. Fortunately, the interviewer calls him out on it.
Anyway. There's a long bit about lucid dreaming, dream communication with the outside world, and the potentially Orwellian "dream engineering," which is even more scary to think about than dream tracking for marketing:
And I wouldn't want my great grandchildren to have to pay $10 a month to opt out of advertisement in their dreams, Scott.
But I'm certain it's coming. Only it'll probably be $100 a month. Inflation, you know. |
September 15, 2022 at 12:01am
|
Short one today, to balance out yesterday's. From Atlas Obscura:
John Calvin's Chair
Cathédrale de Saint-Pierre
Geneva, Switzerland
A plain wooden seat that once belonged to one of the most prominent figures of the Protestant Reformation.
In the Cathédrale de Saint-Pierre in Geneva, Switzerland, a high-backed wooden chair sits in a place of honor. It's roped off so that nobody can sit in the seat, where the French preacher John Calvin sat more than 500 years ago.
At least they're not claiming it's made of wood from the True Cross.
From the pulpit at St. Pierre Cathedral, he preached about the importance of religious scriptures and the concept of predestination, which held that certain people were set on a path for salvation from the very beginning of their lives.
Which apparently involved a lot of sitting.
The chair of Calvin is a plain-looking wooden chair with a trapezoidal base and narrow backrest.
I just wanna know one thing:
Did it have room for Hobbes, too? |
September 14, 2022 at 12:05am
|
Just found this one today, and it already showed up. This means I might actually remember what I was going to say about it.
I know I've gone on about happiness in here before. In short, I think there are more worthwhile pursuits, and some of them actually can lead to happiness.
Should you trust your gut?
Gut, heart, brain, gonads. Whatever. I trust my liver.
We're making all kinds of decisions every day. Most of them are trivial, like what to cook for dinner. Some of them are monumental, like whether to change jobs or sell your house.
We probably only have the illusion of making decisions, but again... whatever.
But every time we make these decisions, we make them on the basis of some feeling or evidence. Sometimes we just go with our intuition, with what feels right. And sometimes we lean on our reason. We weigh the options, consider all the factors, and follow the logic wherever it leads.
Somehow I don't think "logic" figures into it for many people.
A new book by Seth Stephens-Davidowitz, called Don't Trust Your Gut, argues that our "gut" (or whatever you want to call it) is usually wrong.
I'll buy that. The argument, that is; not shelling out for the book.
And itâs wrong because our intuitions are often influenced by false impressions or dubious conventional wisdom.
"Conventional wisdom" sounds an awful lot like "common sense," which, as I've noted repeatedly, is neither.
Stephens-Davidowitz is an economist and a former Google data scientist.
I give economists a hard time, I know, but their thinking is surprisingly applicable to situations other than financial. Which doesn't mean they're right.
So I invited him to join an episode of Vox Conversations to talk about it.
I don't do podcasts, and I'm not about to start now. It's at the link if you're interested. Or at least I assume it is; my script blocker keeps me from seeing much more than the text. So I'm going by the excerpt provided; the rest of the article is in interview format. Thus, it's hard to quote much of it.
I always feel like if our lives are inefficient enough, you can make decisions that win on every possible dimension.
Now, that's a statement I need clarification on. But the article doesn't provide it.
So I talk about the data on happiness, and particularly the Mappiness Project, by George MacKerron and Susana Mourato, where they asked people on their iPhones: Who are you with? What are you doing? And how happy are you, 0 to 100? And they built this chart, a happiness activity chart.
It's notoriously hard to quantify subjective feelings. Chances are, you're familiar with the pain scale at the doctor's. What's your current level of pain, from 1 to 10, with 10 being the worst pain you've ever felt? Someone who's led a sheltered life might call a banged toe a 10. For someone like me, having had appendicitis, back pain, and a heart attack, it's more like a 4. The point here being that just as you can't compare pain between people, you can't compare happiness either. Different things make us happy, and my happiness at, say, drinking a fine Scotch might actually be greater than your happiness at finally getting laid.
Speaking of which:
So the happiest activity, according to Mappiness (and actually every experience sampling project has landed on the same exact finding) is that sex and intimacy and making love are the happiest activity, which is not so surprising.
I've had way better experiences. Not all of them even involve alcohol.
Gardening ranks really high. Theater, dance shows, sports, running, exercise, singing, performing (so karaoke, really good), talking, chatting, socializing, bird-watching, nature-watching, walking, hiking, hunting, and fishing.
Most of those things are mildly pleasurable for me at best and, at worst (sports, hunting, fishing, e.g.) are things I actively hate. Some are just boring.
You know, another interesting thing about a lot of those activities near the top is that they donât require a lot of money. A lot of them you can do for free.
Not sex, though. But here's the money question (literally):
Which I guess prompts the question: Did you find that having more money makes us happier? Do you find that happiness tends to scale with income?
Anyone who believes that the best things in life are free has never had a really good bottle of single-malt scotch.
There's this famous idea that once you get above $75,000 in income, there's no gain to money. That is kind of a famous idea: You just need at least $75,000 income, then it stops.
Matthew Killingsworth at UPenn did a study. He found that's not true, that there's no point at which money stops giving people happiness. But it levels off. So it's always going up, but it's going up at a smaller and smaller rate.
Yeah, that $75K idea never really sat well with me. But I can believe the "smaller and smaller rate" thing. It's a principle from economics (unsurprisingly, considering the source) called diminishing marginal utility. Or something like that.
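Killingsworth's finding is usually summarized as well-being scaling with the logarithm of income: each doubling buys about the same bump, so the gains never stop but each extra dollar matters less. A toy illustration of that shape (the log model matches the study's reported pattern; the specific numbers here are made up for demonstration):

```python
import math

def wellbeing(income, base=10.0, slope=1.0):
    """Toy log-utility model: reported happiness grows with log(income).
    `base` and `slope` are arbitrary illustrative parameters."""
    return base + slope * math.log2(income)

# Each doubling of income adds the same fixed increment...
gain_low = wellbeing(50_000) - wellbeing(25_000)
gain_high = wellbeing(800_000) - wellbeing(400_000)
print(gain_low, gain_high)  # both about 1.0: equal boost per doubling

# ...so the happiness bought per *dollar* keeps shrinking.
per_dollar_low = gain_low / 25_000
per_dollar_high = gain_high / 400_000
print(per_dollar_low > per_dollar_high)
```

That's all "smaller and smaller rate" means: the curve keeps rising but flattens out, with no cliff at $75K or anywhere else.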
There's another study by four professors, most of them at the Harvard Business School, that found that there's an additional boost if your net worth gets above $8 million. And I think one of the reasons for that is, if you think of the activities that are really at the bottom of the happiness activity chart, there are these annoying things that modern life forces us to do. And once your net worth gets to $8 million, you really can stop doing them. So you just have a chef cooking your meals and you have a housekeeper who's cleaning up after everything. And maybe you have a personal assistant who's doing all your chores, and you have a personal driver, so you're not commuting on a subway.
Sorry, guys. Rich people really are happier, and they get their happiness by using the rest of us. (You think the rich bastard's housekeeper is swimming in serotonin?)
Well, the trap is that work is the second most miserable activity according to [scholars Alex Bryson] and MacKerron, which shocked me because I had grown up with this idea that work is where you get a lot of your fulfillment and joy and purpose.
Yeah, that's another lie they tell us along with "money can't buy happiness" and "the best things in life are free." It's designed to keep us from wanting more.
You have a chapter in there about parenting and kids. What did you find that makes a good parent? What did the data tell us about how to parent better?
I've heard a lot of people say they derive their happiness from having kids. I'm sure there are some people like that; I knew from an early age I wouldn't be one of them. True happiness, for me, requires that I not have too many obligations, and kids are a huge obligation.
They also go on to talk about happiness in marriage, and the bit I find interesting there is this:
The thing that predicts happiness (by far the most important predictor of whether you're happy in your romantic relationship) is whether you're happy outside your relationship.
This is not what I'd consider new information, but it's quite telling that the data seem to back it up: that if you're not content in yourself, you're not going to find contentment with someone else. People searching for a "relationship" in order to be happy are looking in the wrong place. Which is not to say people can't be happy in a relationship; just that you gotta work on yourself first.
I would say that the most depressing finding in the book is probably also the least surprising, which is that basically being good-looking is the most predictable determinant of success in almost every sphere of life.
It's only "depressing" if you're already ugly. We knew this, too, though; rich and attractive people generally seem happier than poor ugly slobs.
To the extent that I sounded a skeptical note, part of what I was getting at is, I just feel like human beings are just sort of hopelessly contradictory. You know, like if every day was a perfect day, then pretty soon the things that made it perfect would cease to satisfy us. Right? You know, the sweet is only sweet because of the sour, and all that.
Both the people involved in this interview admitted to having been philosophy majors in college. Nothing wrong with that, but that bit of philosophy right there is an echo of what the great philosophers Beavis and Butt-Head once said: "If everything was cool, and nothing sucked... how would we know what was cool?"
In the end, I think the data here are important, but they shouldn't guide anyone. It's like... statistics show that there are more cats than dogs kept as pets in the US (though more households have dogs, there are more cats per household, on average). From that data alone, you might draw the conclusion "Cats are better pets than dogs. Therefore, I should adopt cats." Now, I happen to agree with that—but I have a very good friend who's more of a dog person, and she wouldn't agree. And that's okay. All I'm saying is, just because the data says, on average, "you'll be happier if you took a long walk in nature and end up doinking your romantic partner on a lakeshore" (which apparently it does), that doesn't mean that'll do it for all of us. Especially indoor-dwelling asexuals.
Not to mention the findings about work. I know a few people for whom work gives their life meaning, and meaning leads to some happiness. Again, not me, but that's my point: we're all different, and we all enjoy different things to different degrees.
Despite my objections, though, I can't say the discussion was uninteresting. Though still not interesting enough to listen to the 'cast or buy the book. Just interesting enough to produce enough food for thought to create a blog entry. So... here it is. |
September 13, 2022 at 12:02am September 13, 2022 at 12:02am
|
Hey, look, a headline question to which the answer isn't "No."
Well... okay. Technically, the answer is "No." The headline doesn't specify human brains, and while most nonhuman animals have brains, it's not clear that they all have a sense of number.
No, all y'all for whom math provokes dread, you only think you're bad at it. It becomes a self-fulfilling prophecy, like when I say I'm not creative.
Ken Ono gets excited when he talks about a particular formula for pi, the famous and enigmatic ratio of a circle's circumference to its diameter. He shows me a clip from a National Geographic show where Neil deGrasse Tyson asked him how he would convey the beauty of math to the average person on the street.
In reply, Ono showed Tyson, and later me, a so-called continued fraction for pi, which is a little bit like a mathematical fun house hallway of mirrors. Instead of a single number in the numerator and one in the denominator, the denominator of the fraction also contains a fraction, and the denominator of that fraction has a fraction in it, too, and so on and so forth, ad infinitum. Written out, the formula looks like a staircase that narrows as you descend its rungs in pursuit of the elusive pi. The calculation—credited independently to British mathematician Leonard James Rogers and self-taught Indian mathematician Srinivasa Ramanujan—doesn't involve anything more complicated than adding, dividing, and squaring numbers. "How could you not say that's amazing?" Ono, chair of the mathematics department at the University of Virginia, asks me over Zoom.
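For the curious, this kind of formula is easy to play with in code. Here's a minimal Python sketch that evaluates a truncated continued fraction from the bottom up. To be clear, I'm using the well-known continued fraction pi = 3 + 1²/(6 + 3²/(6 + 5²/(6 + ...))) purely as an illustration; it uses only the operations the article mentions (adding, dividing, squaring), but it may not be the exact formula Ono showed.

```python
import math

def pi_continued_fraction(depth: int) -> float:
    """Evaluate a truncated continued fraction for pi, from the bottom up.

    Uses pi = 3 + 1^2/(6 + 3^2/(6 + 5^2/(6 + ...))) -- an illustrative
    continued fraction, not necessarily the one from the article.
    """
    value = 0.0
    for n in range(depth, 0, -1):  # start at the deepest "mirror" and work out
        value = (2 * n - 1) ** 2 / (6 + value)  # just squaring, adding, dividing
    return 3 + value

print(pi_continued_fraction(100), math.pi)  # the two agree to several digits
```

The deeper you truncate, the closer you get to pi, which is the whole fun-house-mirror appeal: an infinite staircase of simple arithmetic converging on something transcendental.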
On the other hand, maybe I'm just linking this because it calls out my alma mater.
But not everyone sees beauty in fractions, or in math generally. In fact, here in the United States, math often inspires more dread than awe.
Story idea: Horror novel, but the antagonist is math.
Math anxiety seems generally correlated with worse math performance worldwide, according to one 2020 study from Stanford and the University of Chicago.
Seriously? Someone funded a study about that?
Some scholars argue that American cultureâs saturation with negative stereotypes around math in combination with current approaches to teaching the subject are perpetuating anxiety, making some kids think they are bad at math, and preventing them from excelling.
I can't forget when they put out a Barbie doll that voiced her frustration with math. And several other phrases promoting stereotypes and gender roles. Yes, that was already a thing in the early 90s.
In other words, we are primed to do basic math, but culture gets in the way.
This is one of the few areas where I genuinely believe the phrase "You can do anything you set your mind to." If you think you're good at math, you'll be good at math. At least enough to do some basic calculations, like tips and imperial-to-metric conversions.
This ability to estimate and understand quantity may have evolved as a basic survival skill. It could have helped our ancestors and members of other species quickly assess whether they were outnumbered by predators, for example, or to forage in places with more available food relative to others.
And then the article has to go into evolutionary speculation. It's just that: speculation. At least the author acknowledges that.
There seems to be consensus among scientists, however, that only humans mentally represent numbers precisely and with symbols, and that we need some kind of education to do so. This is potentially because many higher math skills, including arithmetic, depend on the use of language—a symbols-based system—whereas quantity-based judgments are pre-verbal.
I mean, when you think about it, numbers are quite abstract. If you have a small group of trees, and the same number of hogs, it's a pretty wild mental leap to say that the number "3" (or whatever) can apply to both of them.
Indeed, arithmetic is difficult to do if one does not have the language for it.
Just ask a Roman.
If we have the innate ability to understand math and acknowledge its importance in making sense of the world, why are so many—in Western culture, mainly—averse to it? Scholars who study math and math education have a hard time answering this question.
Oh, but I bet we're going to try.
Perhaps it is related to the way we in the U.S. tend to conceptualize math ability. People in the West often say, "I'm bad at math," as though it were a personality trait or even a badge of honor.
Of all the things that piss me off about people, one of the biggest is when they make ignorance a positive trait. Yeah, yeah, I know; I've done it myself, like when I proclaimed I've never seen Titanic and never will. The difference is, Titanic isn't going to help me keep a budget or measure quantities in a recipe. Plus, I never claimed to be internally consistent. No one is.
Of course, some kids do have clinical learning disabilities that are not caused by anxiety, cultural myths, or poor teaching.
And none of what I'm saying here is meant to rag on them. Learning disabilities are what they are; willful ignorance is the problem.
Evidence suggests that cultural assumptions that women are less skilled at math than men may account for much of the gender gap in math performance.
While one data point is meaningless, the top math-doer in my high school class was a girl. So I never bought into the idea that there's an innate gender difference. What, estrogen means you can't count? Come on. No.
Math anxiety may be transmitted from teachers and parents to kids, too. "Evidence from the United States suggests that children who interact with high-math-anxiety adults show impaired math performance relative to their peers," the authors of a 2017 study wrote.
I will say this, though: math education has changed since I was a kid. Now it's all that common-core stuff. I'm not saying it's wrong or bad; just that it wasn't the way I learned. So when a friend of mine got me to help her kid with their math homework once, I took one look at the common-core textbook and noped out.
It's not that I can't learn it. It's just that the kid already knew more than I did about it.
The article touches on that subject. Then:
A seemingly opposing perspective comes from Barbara Oakley, author of A Mind For Numbers: How to Excel at Math and Science, who argues that, ultimately, learning math is more like learning a language or learning music than it is like some other forms of learning.
I tend to agree with this. Mathematics is a language. Conversely, language is math.
Ultimately, individual kids have specific needs, and parents can work with teachers on figuring out tailored approaches. But, generally, parents can create a positive attitude toward math by refraining from sharing their own math anxieties and encouraging creative problem solving.
And while that's important, I'd like to see a push toward convincing adults that, no, math isn't going to pull them down into the gutter and suck out their blood or whatever. So, maybe don't write that horror story after all.
Meanwhile, can we please stop pandering to the "when are we going to use this?" crowd? Let's try to instill a love of learning for the sake of learning. There's no such thing as useless information—except maybe what some internet influenza is trying to tell us. |
September 12, 2022 at 12:02am September 12, 2022 at 12:02am
|
Today's article is all about my favorite pastime. Okay, second favorite pastime.
Now, the article is from 2008. I'd like to say that doesn't make a difference, but science has a way of correcting itself and changing course. It's not nutrition science, though, so the answer probably hasn't gone from yes to maybe to no to possibly to no to yes during the last 14 years.
Let's do some sleep math.
And you just lost half your readers.
You lost two hours of sleep every night last week because of a big project due on Friday. On Saturday and Sunday, you slept in, getting four extra hours.
Duh, five times two is ten minus four is *counts on fingers* *shrugs*
...still, I was under the impression that sleep deficit doesn't actually work that way. It's not like paying for stuff with a credit card. It's always possible that I was misinformed, however.
Sleep debt is the difference between the amount of sleep you should be getting and the amount you actually get.
If it were a debt you could pay off, I'd still be asleep from the day I retired.
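Taking the article's credit-card framing at face value (and, as I said, I'm skeptical that sleep actually works this way), the sleep math from the opening example is just a little ledger arithmetic:

```python
# The naive "sleep debt" ledger from the article's example. This is a sketch
# of the credit-card metaphor only, not a claim about how sleep actually works.
hours_short_per_night = 2   # lost two hours every night last week
work_nights = 5
weekend_recovery_hours = 4  # slept in on Saturday and Sunday

remaining_debt = hours_short_per_night * work_nights - weekend_recovery_hours
print(remaining_debt)  # 6 hours still "owed" on this model
```

At the article's suggested repayment rate of an extra hour or two a night, that remaining six hours would take roughly three to six nights to clear.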
"People accumulate sleep debt surreptitiously," says psychiatrist William C. Dement...
Snicker. I'm sorry. I know we shouldn't make fun of people's names. But come ON.
Studies show that such short-term sleep deprivation leads to a foggy brain, worsened vision, impaired driving, and trouble remembering.
Hm. I know something else that causes those symptoms, and it's a lot more fun than merely not-sleeping: booze. That would be my favorite pastime, as I noted above.
A 2005 survey by the National Sleep Foundation reports that, on average, Americans sleep 6.9 hours per night—6.8 hours during the week and 7.4 hours on the weekends. Generally, experts recommend eight hours of sleep per night, although some people may require only six hours of sleep while others need ten.
Seven hours? LUXURY.
The good news is that, like all debt, with some work, sleep debt can be repaid—though it won't happen in one extended snooze marathon. Tacking on an extra hour or two of sleep a night is the way to catch up.
Great. Now: should that be at the beginning or the end, or both?
As you erase sleep debt, your body will come to rest at a sleep pattern that is specifically right for you. Sleep researchers believe that genes—although the precise ones have yet to be discovered—determine our individual sleeping patterns.
Yeah... one thing I've learned in the last 14 years since this article came out is that's not really how genes work. There's not one gene, or a group of them, for a particular trait; rather, it's a complicated mess that's way above my pay grade. Still, I can accept that it's hereditary as opposed to learned.
As I've mentioned before, I seem to be naturally biphasic. I sleep in the late afternoon/early evening, wake up, and sleep again in the early morning. Actual times vary. One of the perks of being retired. The only consistency is that I'm almost always awake at noon and midnight, which is one reason I choose to do these entries around midnight. I say "almost always" because alcohol can throw the schedule off, and so does traveling.
That more than likely means you can't train yourself to be a "short sleeper"—and you're fooling yourself if you think you've done it. A 2003 study in the journal Sleep found that the more tired we get, the less tired we feel.
I vaguely recall this being a "life hack" a while back: people insisting that you can train yourself to operate on four or two or whatever hours of sleep. I recall, less vaguely, going "bullshit." As I do with most "life hacks."
Also, apparently there's a journal called Sleep. This amuses me because I imagine the articles are a cure for insomnia.
So earn back that lost sleep—and follow the dictates of your innate sleep needs. You'll feel better. "When you put away sleep debt, you become superhuman," says Stanford's Dement, talking about the improved mental and physical capabilities that come with being well rested.
Unfortunately, we live in a society that doesn't value sleep, seeing it instead as a necessary evil. When you're sleeping, you're not being productive, and there's nothing more sacred than Holy Productivity. We're conditioned to shame people for sleeping too long, or at the "wrong" times. Naps are scorned. "Sleep is for the weak." "I'll sleep when I'm dead." Fuck all that noise.
And I've done the math in here before, but essentially, considering all the things we need to or should do in a day, most of us just don't have the time to sleep. Theoretically, it's 8-8-8: 8 hours each of work, leisure, and sleep. But almost no one devotes only 8 hours to work; there's a significant chunk of time where you're doing work-related things, such as getting dressed or commuting, that cuts into either leisure or sleep. And if you have kids? Well, forget it. People use lack of sleep as some sort of social cred, too, and that's perverse.
What needs to change is our attitudes about sleep. And maybe shorter work days.
Not that it affects me, not anymore. But I'm incensed on everyone else's behalf. |
September 11, 2022 at 12:01am September 11, 2022 at 12:01am
|
Today's article (actually an essay) has been languishing in my queue for a while, and today is its day. I don't have a lot to say about it, and it may be of only marginal interest to most people, but I found it to be an interesting glimpse into some of the cultural tensions that we've been experiencing.
And it's a good thing I don't have a lot to say about it, because I'm really not in the mood today. Yesterday would have been my dad's birthday had he lived (he was born 105 years ago), and I've been practicing avoidance. It's raining, the days are getting shorter, and things are getting colder. I know a lot of people love fall. I do not. I'm a summer guy. If it's too hot for you, it's just right for me. Anything below 20C is "cold" and anything below 15C is "freezing" as far as I'm concerned, and the rain just makes me more miserable. The only good thing about the approaching season of dying is beer: Oktoberfest lagers and pumpkin ales make their brief seasonal appearance, and I might have had a few of those yesterday.
None of which has anything to do with the article (except for maybe the "avoidance" bit), but dammit, this is my blog and I get to rant about my state of mind if I want.
So I'm just leaving the essay here for your benefit. I'm taking a mental health night. I did read it (twice - once when Turkey DrumStik sent it to me and again just now to see if I had any coherent thoughts about it, to which the answer is "not really") but while I think it's important, I'm in no mood to articulate exactly why. But it does have implications for my life, and for that of my father before me (I knew he'd figure into this somehow). I spent my childhood on land appropriated from an Indigenous people; I've donated the artifacts found on it to their tribal council, but is that the end of my responsibility, or just the beginning? Or is it even on me at all?
I don't know, and in a few years, it won't matter anyway. |
© Copyright 2024 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved. Robert Waltz has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.
|