Complex Numbers
A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is a square root of -1). For example, 3 + 2i is a complex number.
The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, b (the coefficient of i).
Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.
Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
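None of this needs special software to poke at. As a sketch (my own illustration, not from any article above): Python has a built-in complex type, and the "very simple transformations" that produce fractal intricacy can be shown with the classic Mandelbrot iteration, z → z² + c.

```python
# Python's built-in complex type uses j for the imaginary unit.
z = 3 + 2j
assert z.real == 3.0 and z.imag == 2.0
assert (1j) ** 2 == -1  # i^2 = -1

# The Mandelbrot set: iterate z -> z^2 + c and ask whether |z|
# stays bounded. Points that never escape belong to the set.
def in_mandelbrot(c: complex, max_iter: int = 100) -> bool:
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:   # once |z| > 2, it provably escapes to infinity
            return False
    return True          # still bounded after max_iter steps

# c = 0 stays bounded forever; c = 1 escapes after a few steps.
```

That one-line update rule, applied over the complex plane, is the entire recipe behind the famous fractal images.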
October 31, 2024 at 9:44am
Samhain, and the temperature is supposed to get up into the 80s (Freedom Units) today. Not that I'm unhappy about that, but it is somewhat unnerving for the end of October.
This Halloween night coincides closely with a new moon (8:47 Eastern tomorrow), so it's not nearly as cool as when it coincides with a full moon.
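(For the curious: you can rough out lunar phase timing from the average synodic month. The sketch below is my own back-of-the-envelope approximation, using a commonly cited reference new moon; it's good to within a day or so, not an ephemeris.)

```python
from datetime import datetime, timezone

SYNODIC_MONTH = 29.530588853  # mean length of a lunar cycle, in days
# A commonly used reference new moon: 2000-01-06 18:14 UTC.
REF_NEW_MOON = datetime(2000, 1, 6, 18, 14, tzinfo=timezone.utc)

def moon_age_days(when: datetime) -> float:
    """Days since the last (mean) new moon -- a rough approximation;
    real lunations vary in length by several hours."""
    days = (when - REF_NEW_MOON).total_seconds() / 86400
    return days % SYNODIC_MONTH

# The new moon after Halloween 2024 fell at 8:47am Eastern on Nov 1
# (12:47 UTC), so the moon's "age" wraps around very close to zero there.
age = moon_age_days(datetime(2024, 11, 1, 12, 47, tzinfo=timezone.utc))
assert min(age, SYNODIC_MONTH - age) < 0.5
```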
So I'm doing... absolutely nothing for the occasion. Except, of course, trying to keep both of the black cats indoors, because some people are idiots.
Speaking of the moon, though, here's a lunar mythology article from Discover:
The moon is something all people on Earth, no matter where or when they've lived, have in common.
Unless, I suppose, you spend your entire life underground.
That we all share the moon does not mean that we imagine it the same way, however. In our myths and stories, the moon plays many different roles.
We know more about the moon than ever before, and we continue to learn more. That dead hunk of rock in the sky still has secrets to give up. Myths and legends don't tell us much about the moon, but they do reveal a lot about us.
The relationship between the moon and the tides, for example, was clear to early peoples who lived near the sea, explains Tok Thompson, an anthropologist at the University of Southern California who specializes in folklore and mythology.
That's probably an example of noticing a correlation without understanding the deeper effects (in this case, gravity) behind the correlation. In other words, the "observation" part of the scientific method.
The moon, with its regular phases, also functioned as a calendar. You could track the days by the sun, but for longer periods, the moon was useful. The English word moon comes from mensis, the Latin word for month, which is also the origin of the word "menstruation."
Much as I enjoy a good etymology, I'm not sure this is right. Wiktionary traces the word back through Proto-Germanic, all the way to PIE. This suggests to me that, rather than having its origin in Latin, the Latin and the Germanic words share a root.
I'm not an expert, by any means, but this is enough to make me question the "comes from Latin" assertion.
Pretty sure Luna is from Latin, though; that was the name of a moon goddess.
All that aside, let's stop pretending that Gregorian calendar months are related to lunar cycles.
So yes, ancient cultures knew a lot about the moon, and they wanted to share that knowledge to keep a record of it.
I also question the qualifier "a lot" in that sentence.
Thompson's favorite myth involving the moon is a creation story from the Tlingit people of the northwest coast of North America. In this story, an old man keeps all the light in the world stashed away in a box. Through wiles that vary from telling to telling, the trickster Raven steals the box and releases the Sun, the moon, and all the stars, bringing light to the world.
One could be forgiven for wondering how Raven was able to see the box, considering there was no light at the time, but myths follow their own, dreamlike, rules.
In Chinese mythology, a woman named Chang E drank a magic elixir, whereupon she floated all the way to the moon, and there she lives still, with (in some versions) a rabbit.
Radical feminists: "Hey, gimme some of that elixir!"
We still have some fascinating myths about our favorite satellite. Probably the most persistent is that the moon can drive us mad.
I find that it's necessary to separate the two meanings of "myth." A foundational story, like the ones about Raven or Chang E, is not the same thing as a persistent falsehood. I don't think this article draws the distinction.
Many people, including some healthcare workers and police officers, believe that crime, traffic accidents, psychiatric hospital admissions, and even suicides spike during a full moon.
Or that could be cognitive bias. You notice things more when they fit your pre-existing worldview. Like when you're frustrated by machines and bureaucracy, and think "Mercury must be in retrograde."
These days, we don't blame the effect on moon goddesses or magic elixirs but on something at least quasi-scientific: the moon's gravitational effect on the water in our bodies. On its surface, that makes a lot of sense.
No. No, it does not.
After all, the moon creates tides, and roughly half our body weight is water.
Okay, but have they ever observed tides in a mud puddle?
And indeed, though a few small studies have found some possible effects of the moon on mental health, research over the past decades has not borne this out. For example, a Swiss study published in 2019 looked at almost 18,000 cases of in-patient psychiatric admissions over ten years and found no correlation between the phase of the moon and psychiatric admissions or length of stay.
There's some nuance there, but basically, the "moon causes madness" bit is the second definition of myth.
Pausing to gaze at the moon can create an intense feeling of both awe and peace. Rather than making us mad, the moon might just make us a little bit more sane.
Nice to think, but there's little evidence for that, too.
Still... go look at the moon. Might want to wait a few days, though; it's too close to the sun right now.
October 30, 2024 at 9:37am
From Cracked, an extraordinarily important article:
Puns are the highest form of humor, though the joy of them comes not from an audience's response, but from the punster watching the audience's response. And we have to watch carefully, lest one of them yeet something harder than a rotten tomato at us.
But there are two things about puns that I don't like: 1) When someone steals one of mine, even inadvertently; and 2) when someone comes up with one that I really should have, but didn't.
Such is the case with the second one at the link. Because it's Cracked, it's a countdown list, so it's the one numbered 13.
13. My girlfriend is the square root of negative 100. She's a perfect 10, but also imaginary
Our relationship is complex.
See? See? May the creator of that one be darned to the grayest middle parts of Heck.
Obviously, I'm not going to reproduce all of them here. But this one made me laugh:
6. I told my husband he was awful at directions
He got so mad that he packed his bags and right.
Now, I should come up with a really witty pun of my own to wrap up this entry. I should, but I'm coming up empty. My muse is on vacation. So none of us get to be a-mused.
October 29, 2024 at 9:35am
From Big Think, an article about how the numbers sometimes lie:
Or do they?
In 2014, I took a weekend break to York.
You know what sometimes lies? Anecdotes.
York is a lovely city in the north of the UK, with an ancient cathedral, quaint cobbled roads...
Oh, hell no.
...and an interactive Viking experience.
Is that where you get hung by the ankles and forced to watch your clan murdered and village pillaged while you're upside-down?
The reason for the anecdote is quickly revealed: the author made the mistake of trusting online reviews whilst choosing a restaurant, Skewers, which turned out to have high ratings due to drunks lovingly appreciating their late-night-early-morning fare.
Let's be clear, here: This is not a failure of TripAdvisor. It's not a failure of the food establishment. It's certainly not a failure of the well-meaning drunks. The blame lies entirely on the author. I travel quite a bit, and I prefer to form my own opinions of places I visit, rather than relying on others' (sometimes real, sometimes not) experiences. Sometimes, it sucks, but usually, I'm pleasantly surprised. Either way, I get to write reviews of my own, which may or may not factor into someone else choosing to visit.
The story of Skewers is an example of the McNamara fallacy, and learning about it can help us all (especially the underprepared tourists among us).
In spite of my misgivings about the opening anecdote, I'm willing to read on.
Fortunately, the author again gets quickly to the point, which is how I know I'm reading Big Think and not The New Yorker:
The McNamara fallacy is what occurs when decision-makers rely solely on quantitative metrics while ignoring qualitative factors.
So, like, when you rely only on the studies that claim booze is bad for you, and ignore how good it makes you feel.
The fallacy is named after Robert McNamara, the U.S. Secretary of Defense during the Vietnam War, where his over-reliance on measurable data led to several misguided strategies where considering certain human and contextual elements would have been successful.
I'm no expert on war history, but claiming that different decisions "would have been successful" strikes me as arrogant. I'd weasel out with "might have been successful," instead.
The McNamara fallacy is not saying that using data is bad or that collecting as much information as you can is wasted time.
I'm just leaving this here lest someone be thinking, "What's the point in data, then?"
For example, it's not uncommon for someone to deeply love a book that has few or no reviews on Goodreads. It's possible to enjoy a restaurant or a movie in spite of what others say. Data is a great starting point, and a great many idiotic and dangerous things are done when we ignore data, but it doesn't always make for the best decisions.
I'm very familiar with that assessment, anyway. People are different, and you might love something others hate, or vice-versa. Reviews and the like are aggregations, and blindly following them is roughly the same thing as blindly following the crowd. You can do that if you want, but I don't, because that way, I miss out on the joy of making my own discoveries.
If everyone liked the same things, there'd only be one beer, and that would be a boring world indeed.
I have a much bigger problem with the explicit comparison of "finding a place to eat" and "running a war where people die."
Still, I suspect the basic ideas of the fallacy are sound, and the article goes into other examples and applications thereof, which I don't feel the need to reproduce here.
Fear not, though; it's short, and there are very few actual numbers. So, a solid four stars out of five from me. Your experience may vary.
October 28, 2024 at 9:54am
I don't know if I've used anything from Inverse before. This article caught my eye a while back.
Just as the answer to any yes/no headline question is "no" by default, the answer to "why do [businesses] do [thing]" is almost always "money."
It's been a while since I saved this article, so let's see if I'm right.
We live on a planet where people still die of starvation, and yet we still waste so much food. It's a problem, and not just for sustenance but the environment.
We can solve the problem, or we can learn to live with it. I did.
Part of it is just a category problem: Americans are used to seeing a wide and alluring variety of foods on shelves, and a lot of it, especially for produce and meats.
One of the first times I set foot in a Whole Fools, long before they merged with Amazon, I noticed the marketing gimmick of their produce section: an abundance of everything, all bins topped off and on the verge of overflowing. This was obviously a long time ago, but I remember thinking: "Some marketing expert decided that the store has to look like it'll never run out." I imagined stock clerks (or "associates" or whatever) being paid minimum wage to watch the produce section, and run in and replace every cantaloupe or courgette some customer put in their cart.
This is in contrast with other local grocery stores, who seemed to have no problem letting bins get nearly empty before replacing the contents. I imagine that's bad in a couple of ways. Some people, seeing that there are only three oranges left, will move on to buy something else, thinking, perhaps, "Those three must have been the ones no one else wanted," or, if they're more generous in their estimation of humans, "Let someone who really needs them have them." Others will grab all three, thinking something like "I'd better get that now," and then they don't eat them, thus simply moving the food waste problem from the producer to the consumer.
To be clear, though, I have no idea if there's any difference in food waste between WF and, say, Kroger.
So what do the stores do? They overpurchase, "knowing that some food waste must be built into their bottom line," says Jennifer Molidor, senior food campaigner at the Center for Biological Diversity.
And what do they do with the unbought food? Do they donate it to homeless shelters? Not from what I've heard; they just toss it into a dumpster under a surveillance camera to make sure no one's getting free food.
Profit margins on perishable foods are so high that stores would rather overstock so as not to miss even one sale.
Yeah, I'm going to need a citation for that. But it does tie in to my "money" answer above.
On the high-tech side, retailers are starting to use artificial intelligence to better determine how much and when to order food items.
Using AI, huh? Well, that should calm consumer fears.
Another high-tech solution is "dynamic pricing," or flexible price points that can shift depending on real-world market factors, in this case allowing stores to discount items that are getting close to the end of their shelf life.
That's actually pretty cool, in my opinion. Though stores have done a version of that for as long as I can remember, selling yesterday's bakery products, for example, at a discount. (The article does point that out.)
Still, when food does go unsold, it can go somewhere more productive than a landfill, and there, too, ReFED has seen progress. In the Pacific Coast Food Waste Commitment, signatories increased the percentage of unsold food that was composted by 28 percent and donated by 20 percent.
Okay, so maybe some of it does get donated. My inner cynic (which is like 90% of me) wants to know if the companies get tax breaks for doing so.
One challenge for dynamic pricing is how to reduce prices without a ton of extra labor. Some startups are experimenting with digital labels that could be easily changed.
If a price can be easily changed, it can be easily increased. When gas stations moved from physical signs to digital ones, their prices became more dynamic, too, probably because it takes less labor to change them.
There are also plenty of hurdles for finding useful places for food waste to go. For one thing, "landfilling in the United States is dirt cheap," Sanders says, so doing just about anything else with unsold food costs more.
"Landfilling." "Dirt cheap." I see what they did there.
Also, again: money.
Donating food, meanwhile, not only comes with liability risk to keep the food well preserved, but it also requires new processes and labor for store employees to collect the food and new partnerships for how it gets picked up and where it goes.
Yeah, you have to wonder about liability in those situations. That's gotta affect their bottom line, too.
At some point, change will require mandatory measures, not just voluntary ones, to disrupt that dynamic. "We need laws and regulations from the government to hold industry accountable and make food waste prevention a requirement," Molidor says.
Oh, more regulations? You just lost half your support.
Anyway, I don't have solutions. Like I said, I forced myself to get comfortable with food waste, because it's like when you were a kid and you wouldn't eat your vegetables and your mom was like "Think of all the starving children in Ethiopia!" (or whatever place was well-known for having famine when you were a kid). And you pushed the plate toward her and said "Send it to them!"
October 27, 2024 at 9:02am
Almost four years ago, I wrote this entry in response to a 30DBC prompt: "Stylish"
The prompt was fairly long for a prompt, so the entry was fairly long for an entry. Prompt: "What is your blogging style? In your response, consider the following questions: What is your process of writing a blog entry - do you plan it out in advance, or just start writing? Who is your ideal reader? How did your unique blogging style emerge? Has your blog changed over time?"
Upon re-reading the entry, I was a bit disappointed that few of the answers would change between then and now. There's something to be said for consistency, but there's also something to be said for growth.
Yesterday, I made my habitual Monday foray to the local taphouse.
That's one thing that's changed. I still like that taphouse, but I don't go regularly anymore.
I sit outside, on their patio, because it's a better bet than dining indoors. There are no guarantees, of course, but the science points to a lower risk of Trump Mumps transmission if you're not inside.
This has changed, of course, because it's not 2020 anymore. As far as I know, I still haven't gotten Trump Mumps... but I've had a few bouts of cold-type illness, and never did bother getting tested.
None of which really has anything to do with the prompt, except to illustrate that I don't really plan out these entries, other than maybe giving them an hour or so of thought, usually while doing something else (in this case watching YouTube videos about science, philosophy, and the philosophy of science), and then, in the entry, I could write about almost anything.
My YouTube consumption has diminished since then, also. I despise ads, and they want to charge too much for no-ads. And even without ads, I got really weary of everyone telling me to "like and subscribe." Urging me to like, subscribe, comment, etc. is an absolutely certain way to ensure I never will.
I don't want to get tied down to any one subject, because so many things are interesting to me, and in the end, I'm not writing for any particular type of reader, but just to write.
That... wasn't exactly true then, and it's not exactly true now. It occurred to me several years ago that some of the best works of fiction were written, not with a particular demographic in mind, targeted to what market research suggests would be the most lucrative audience, but for an individual or small group of people. So, I imagine writing for one particular person. It's not always the same person. Still, it's not a lie; I don't write for a "type."
Thus, I really haven't tried to push myself into a "style." Sometimes I'm funny (or try to be; jury's still out), and sometimes I'm completely serious. Sometimes both at the same time. The problem there is I'm not sure anyone can tell the difference.
This bit hasn't changed.
When I started blogging, lo these many years ago, it was mostly about personal stuff, like the crap I started with today.
Clearly, I still do that when the situation warrants, like when I was traveling and had experiences to relate.
Obviously, I do still talk about personal shit sometimes, but I've arranged my life specifically to avoid drama, so very little happens to me that anyone else would consider interesting.
The downside of this is that when it does, I barely have the practice to handle it.
But writing isn't work. I suspect that if I ever made actual money from it, I'd probably start to consider it work and break out in hives.
Still true.
As I noted recently, I've managed to add an entry every day of this calendar year thus far, and I'm hoping to make it to December 31 (not that I'll stop then, but I do expect to take a couple of breaks next year).
Those breaks didn't happen, and I'm closing in on a five-year uninterrupted streak.
So, I guess, some things did change, but mostly things outside my control.
October 26, 2024 at 9:26am
From SciAm, and an author I hung out with for a week a few years ago:
I can smugly assert that this is something I'd already figured out.
I remember watching the full moon rise one early evening a while back.
You'd think astronomers would spend more time watching moonrises.
As it cleared the horizon, the moon looked huge!
Eastern Colorado, where this occurred, is basically a continuation of Kansas: Flat as the proverbial pancake. There's an actual horizon there.
Anyone who is capable of seeing the moon (or the sun) near the horizon has experienced this effect.
I don't have an unobstructed view of the horizon here, but I've spent time on the coasts, so yes.
But it's not real. Simple measurements of the moon show it's essentially the same size on the horizon as when it's overhead. This really is an illusion.
As opposed to time, which really is not (yes, another time article is in the queue).
Attempts to explain it are as old as the illusion itself, and most come up short. Aristotle wrote about it, for example, attributing it to the effects of mist.
As smart as Aristotle was, he got lots of stuff wrong. This was one of those things.
A related idea, still common today, is that Earthâs air acts like a lens, refracting (bending) the light from the moon and magnifying it.
The "simple measurements" noted in an earlier quote refute that bit instantly. Refraction does make it appear redder than usual, though, just like it does with the rising or setting sun.
Another common but mistaken explanation is that when the moonâs on the horizon, youâre subconsciously comparing it with nearby objects such as trees and buildings, making it look bigger. But that canât be right; the illusion still occurs when the horizon is empty, such as at sea or on the plains.
But, see, coming up with explanations like that, even when they can be disproven... that's part of science.
So what's the cause? Like so many things in science, there are two effects at play here.
Phil indeed goes on to explain the cause, but no need for me to quote it word-for-word. The first part of the answer is related to this well-known optical illusion.
The second part is devoted to showing that we don't see the sky as a hemisphere, but as a nearly-flat surface. This is the harder part to accept, I think, but next time you're out on the plains or the ocean on a partly cloudy day, note how the bottoms of the clouds make the sky seem flat. This is for the same reason that the Earth itself appears (but only appears, dammit) flat in such locations.
So you've got two (nearly) planes, earth and sky (the temptation exists to make a plains/planes pun, but I'm not in the mood), but only one moon. Still, presumably, you've seen the moon high in the sky quite often, so then when you see it near the horizon, your brain goes back to remembering seeing that.
Moon illusion misconceptions still abound, and like so many myths, they likely won't go away no matter how much someone like me writes about them.
Yes, this is, indeed, the curse of fact-checkers.
October 25, 2024 at 9:56am
Today in You're Doing It Wrong...
Most likely they asked "an Italian chef" because if they'd asked 10 Italian chefs, they'd have gotten a hundred different opinions.
If you think salting your pasta water is going to get dinner on the table faster, think again.
I have to wonder how that bit got started. Adding salt to water increases its boiling point, making it take longer to come to a boil. The higher-temperature salt water would then, presumably, cook the noodles faster. It would be pretty simple to do an experiment to see which effect takes precedence... and then you probably find that the amount of salt used in cooking makes no meaningful difference to boiling temperature.
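The napkin version of that chemistry, spelled out (the salt and water amounts below are my own assumptions, chosen just to show the scale of the effect):

```python
# Boiling-point elevation: dT = i * Kb * m
# Kb for water is 0.512 degC*kg/mol; NaCl dissociates into 2 ions (i = 2).
# Assume a generous tablespoon of salt (~18 g) in 4 L (~4 kg) of water.
MOLAR_MASS_NACL = 58.44   # g/mol
KB_WATER = 0.512          # degC*kg/mol (ebullioscopic constant of water)
I_NACL = 2                # van 't Hoff factor for NaCl

grams_salt, kg_water = 18.0, 4.0
molality = (grams_salt / MOLAR_MASS_NACL) / kg_water  # mol solute per kg water
delta_t = I_NACL * KB_WATER * molality
# delta_t comes out to roughly 0.08 degC -- far too small to matter
# either way for how fast dinner gets on the table.
```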
To separate pasta protocol from the false and fabricated, we asked an expert about the biggest pasta myths, mistakes and misnomers that could be ruining your rotini and putting your pappardelle in peril.
Really, I'm just quoting this bit to illustrate how proud the author is of his alliteration.
Filippo de Marchi is chef de cuisine at De Majo Restaurant & Terrace.
Stealth ad!
We grilled Marchi on nine of the top-circulated pasta cooking myths.
Tough to grill pasta.
"Cooking pasta isn't difficult at all. It's all about timing and the right water-to-pasta ratio," he says.
So, let me tell you what I remember of my mom's pasta-cooking technique. First, fill a small pot halfway with water. Then throw in the pasta (breaking the hell out of it if it's spaghetti). Then put it on the stove over high heat. Wait 45 minutes. Drain and serve.
My mom had no Italian heritage.
1. Throwing pasta against a wall to see if it sticks proves it's done
Chef's take: FALSE
Of course this is false. I've always known it to be false. And yet, the false information persists, as it usually does.
2. Adding olive oil to pasta water keeps noodles from sticking
Chef's take: FALSE
"The oil just floats on top of the water and doesn't coat the pasta effectively," says de Marchi.
Pretty sure there's more to it than that. Once the water boils, the oil gets stirred in more, though it's still going to be gloppy and not stick to the noodles.
3. Fresh pasta is always better than dry pasta
Chef's take: FALSE
Pretty sure that's a question of individual taste and opinion, not fact or fiction.
It's all about personal preference. Fresh, dry or frozen; chefs aren't here to dictate what your taste buds like and don't like.
Like I said.
4. Leave the pot covered while the pasta is cooking
Chef's take: FALSE
Of all the "myths" on the list, this is the only one I'd never even heard of. All pasta-making recipes I've seen demand open-top pots.
On the flip side, everyone's other favorite cooked starch, rice, requires a covered pot. Unless you cheat and get instant rice or whatever.
5. Adding salt helps water boil faster
Chef's take: FALSE
As I noted above, middle-school chemistry disproves this one.
"If you're cooking without enough salt, the pasta can end up tasting a bit bland," warns de Marchi, whose signature dish at NHC Murano Villa is a spaghetti alle vongole.
The stealth ad continues!
6. Drain pasta until it's completely dry
Chef's take: FALSE
Besides, "completely" is misleading, here. Presumably, the pasta started out dry; that's why you stick it in boiling water.
7. You should run cooked pasta under water before serving
Answer: FALSE
Yeah, I'm pretty sure there are situations where you do want to do that (creating pasta salad, e.g.), but for your hot spaghetti dishes? Never.
8. You should precook sheets of lasagna
Answer: FALSE
I have to admit, this one confused me for a long time. Most store-bought lasagna has pre-cooking on the box instructions, as I recall (I haven't made it in a very long time). It wasn't until I met my second wife, who also had zero Italian ancestry, that I was introduced to the lazy wonders of cook-it-in-the-baking-pan lasagna.
Left out of the article: the other two cooking tips mentioned above (they said nine total), which presumably aren't myths. I guess we'll all have to go to this guy's restaurant and ask him ourselves.
October 24, 2024 at 9:49am
I've written before about the bad-news bias toward lottery winners, where the media latches on to a tragedy connected to someone who won the lottery as if to say: "See? You have to work for your money, or bad things will happen." Whereas in reality, the majority of big-ticket lottery winners go on to live decent lives. Sure, they eventually get sick and/or die, but that's going to happen whether you have lots of money or not.
Well, here's a rare article, from Business Insider, that takes an honest look at one guy's lottery win.
The temptation there is to react with envy or sour-grapes. I'm just looking at it as one person's experience, which of course may not be representative of everyone's.
This as-told-to essay is based on a transcribed conversation with Timothy Schultz, who won the Powerball Lottery in 1999.
So, a quarter-century ago. Plenty of time for someone to figure out what to do with all that money.
I started playing the lottery once or twice weekly, buying a single ticket. I visualized winning and told people about it. They said, "Well if anyone's going to win, you're going to win."
Then I did.
Plenty of people "visualize" winning and continue to lose. I keep visualizing a date with Halle Berry, but it hasn't happened yet.
A press conference announced I had won the $28 million Powerball lottery. After that, our phone was inundated with messages. People I knew congratulated me, but there were stacks of letters from strangers, some of whom asked for money.
I'd like to see a study about whether this happens more often to major lottery winners than to other rich people.
I'd always imagined what I'd do if I ever won: pay off debt and put myself through college, but I'd never thought about how it would change my life.
I immediately question how a 21-year-old without student loans can get themselves into debt so deep the lottery has to get involved. I've been in debt, but it took me longer than 21 years to get there.
Suddenly, I'd gone from a gas station attendant to retired at 21. I felt like I was holding a magic wand. Everything was possible, but I also wanted to be financially responsible.
Some would say that one important aspect of financial responsibility is to not play the lottery. Those who are irresponsible before winning a jackpot will generally continue to be irresponsible. There are, of course, exceptions.
Before turning in the ticket, I consulted with wealth professionals to understand how much I could afford to spend and give to others.
Not a bad idea, depending on how much the "wealth professionals" gouge you for the advice.
Before I received the money, I set up a plan with advisors to invest it. We invested conservatively so the returns could last me over a lifetime.
Also a good idea.
Basic napkin calculation: you win $20M. Assuming that's the actual lump sum payout and not the amortized version, you then want to do immediate things with it like, maybe, set your family up with houses or give some to the local animal shelter or whatever. Maybe take a round-the-world cruise. Whatever floats your boat. Say you set aside half of it just for funsies. That leaves $10M to invest.
Conservative investments tend to return an average of 8% a year. This is the stock market we're talking about, so some years it might be negative, while other years it'll be higher. In 1999, the dot-com crash was still a few months away, but you'd have no way of knowing that.
Because inflation is also a thing, there's a rule of thumb about living on 4% of the investment account balance on an annual basis. Some question that rule, but again, just being quick here. 4% of $10M is $400K. That's more than most people make in the US, even now, let alone in 1999.
Keep in mind that these assumptions maintain, or possibly slightly grow, the balance. It can stay there, working for you, while you sit on the beach very much not-working, until a) someone fucks up the portfolio, b) someone gets irresponsible with it, c) society collapses, or d) hyperinflation takes over.
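Spelling out that arithmetic (fixed 8% returns and a fixed $400K draw are simplifying assumptions of mine; real portfolios don't behave this politely):

```python
# Napkin math: $10M invested, 4% annual withdrawal, 8% average return.
principal = 10_000_000
withdrawal_rate = 0.04
avg_return = 0.08

annual_income = principal * withdrawal_rate  # $400,000 per year

# With returns above the withdrawal rate, the balance grows on average:
balance = principal
for year in range(10):
    balance = balance * (1 + avg_return) - annual_income
# After a decade of (unrealistically) steady 8% years, the balance
# ends up larger than it started, even while paying out $400K a year.
```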
But as a 21-year-old, the first thing I bought was the latest video game system.
Honestly, I can't blame him one bit. And what's that cost in 1999? $500? That's a rounding error.
I went back to college to study film and broadcast journalism, a dream come true.
Perhaps the best thing about having money at a young age would be the ability to pursue whatever college degree you wanted, without concerning yourself about ROI on tuition. Or maybe that's not everyone.
People were supportive, but some treated me differently. Some tried to get closer to me, which made me feel like a walking, talking ATM.
It may be true that, once you have money and people know it, you always wonder if someone's becoming your friend because you're cool, or because they see a way to get money out of you.
When you win the lottery, people don't view the money as something you've earned. A family member explicitly told me I got something for nothing by winning the lottery and should keep giving them and others money.
Bullshit.
Also, "I want unearned income too!" is massively hypocritical.
It was a steep learning curve navigating the social aspect of winning the lottery.
Technically, it would have been a shallow learning curve, as per an entry I did here a while back, but there's no accounting for English idioms.
People ask all the time, "Does money buy happiness?" Money doesn't necessarily change who you are. It can affect happiness by buying time, providing opportunities, and alleviating stress about debt.
I know I've harped on this before, but "Does money buy happiness?" is the wrong question, with wrong assumptions. But whoever says it doesn't has never enjoyed a really good single-malt scotch.
I wish I had invested in bitcoin a few years ago, but that's my only regret about how I've spent the winnings.
An investor's worst enemy is hindsight. Personally, I'm glad I never got into Dunning-Krugerrands. Don't trust 'em.
These days, I don't buy anything too crazy. Like many people, I live within a budget.
Budgeting is important, rich or poor.
But if I were 21 now and had the option, I would consider claiming the prize anonymously, especially if it was a large prize.
I'd recommend that for lottery winners at any age.
So. Nothing exceptionally bad happened to him, some good things did, but it did change his life and how he looked at things. Basically, life happened.
This is just one person, though, as I noted. But it's nice to see an honest look at the results for that one person, without sensationalism. Even if the article is an ad for his media conglomerate. |
October 23, 2024 at 8:02am October 23, 2024 at 8:02am
|
Comedy meets science in another Cracked article.
Right, because not knowing everything is the same as knowing nothing.
Science isn't a discipline known for moving fast, breaking things or shooting first and asking questions later.
Well... kind of. For various definitions of "fast."
4. The Riemann Hypothesis
It's ridiculously complicated, but let's just say that according to the hypothesis, proposed by Bernhard Riemann, there's a pattern to the distribution of prime numbers along the number line.
Oh, come on, that's not science; that's math, a separate discipline that will never know everything.
3. Goldbach's Conjecture
If 160 years seems like a long time, try 282. That's how long ago Russian mathematician Christian Goldbach theorized that "every even positive integer greater than 3 is the sum of two (not necessarily distinct) primes."
I invented the principle "never let facts get in the way of a joke," but again... math, not science.
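For what it's worth, the conjecture is trivial to check by machine for any particular even number; what nobody can do is prove it for all of them. A quick brute-force sketch:

```python
# Brute-force check of Goldbach's conjecture for small even numbers.
# This verifies; it doesn't prove -- which is the whole problem.
def is_prime(n):
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return False
        f += 2
    return True

def goldbach_pair(n):
    """Return primes (p, q) with p + q == n, or None if no pair exists."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return p, n - p
    return None

# Every even number from 4 up to 10,000 turns out to have a pair.
assert all(goldbach_pair(n) for n in range(4, 10_001, 2))
print(goldbach_pair(100))  # (3, 97)
```

Computers have pushed this kind of check far beyond 10,000, but "we haven't found a counterexample yet" is not a proof.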
2. The Clarendon Dry Piles
It sounds like a sex act that only exists on Urban Dictionary, but it may be even weirder. It's two brass bells connected by a pair of batteries called dry piles powered by electrostatic forces. The amount of charge carried between the bells is so small that it appears to be a perpetual motion machine, but the truth is, the batteries are just draining extremely slowly.
So science has, in fact, answered a question: extremely slow drain. At least this one is actually science, not math.
It's anyone's guess when the batteries might die, but we'll probably never know because the hardware will probably break first.
Seems to me that question will eventually be answered, so this section still doesn't meet the promise of the headline.
1. The 500-Year Microbiology Experiment
Also science.
It's a stunning act of futility and optimism to bet on society surviving another 500 years, but that's what scientists at the University of Edinburgh did in 2014 when they endeavored to find out if certain strains of dried bacteria can survive five centuries.
This headline sucked. That's another question that will eventually be answered.
That is, if, as the article notes, science is still around in 490 years. |
October 22, 2024 at 10:16am October 22, 2024 at 10:16am
|
And now, one of my rare posts about actual writing...
It is a truth universally acknowledged that of all the bad opening lines of literature, with the possible exception of the one that begins, "It is a truth universally acknowledged," the one that begins, "It was a dark and stormy night" is the nadir of all possible opening lines. This was from Bulwer-Lytton's novel Paul Clifford, about which literally no one outside of academia knows anything apart from the opening line.
Like all truths universally acknowledged, it's not necessarily true. But it still provides the impetus for writers to try to do worse, generally to hilarious effect.
I've posted about this before, I know, but today's entry is about this year's "winners."
Now, you'll have to go to the site to see the Grand Prize Winner. Suffice it to say that I don't agree. It's bad, but not dark-and-stormy-night bad.
I'll do what that site doesn't, which is to first paste the inspiration for the contest, the actual opening line of Paul Clifford (they put it way down the page):
It was a dark and stormy night; the rain fell in torrents—except at occasional intervals, when it was checked by a violent gust of wind which swept up the streets (for it is in London that our scene lies), rattling along the housetops, and fiercely agitating the scanty flame of the lamps that struggled against the darkness.
Now, some of my personal favorites:
It was a dark and stormy night, which makes perfect sense when you realize we're on Neptune, with a mean distance from the Sun of 4.5 billion kilometers (or 30 astronomical units), and winds that howl at 100 meters per second, composed of mostly hydrogen and helium (and only trace amounts of methane), which is way better than Uranus, which stinks to high heaven.
—Jon A. Bell, Porto, Portugal
Yes, I'd probably have selected that one as the Grand Prize Loser, mostly because I'm sick and tired of seventh-planet puns, but also because of the completely unnecessary science data.
And, of course, I'm quite fond of the Vile Puns section:
"I do enjoy turning a prophet," said Torquemada, as he roasted the heretic seer on a spit.
—A. R. Templeton, Stratford, Canada
And:
"My laddies may not be the fastest sugar cane harvesters," Fergus confessed, "but they're not as slow as my lasses..."
—Mark Meiches, Dallas, TX
Just one more; like I said, the page is there to view the other stinkers at your leisure:
Ralf Smalborgson kept a small shop in Direperil, Minnesota, and his goods consisted only of medieval stringed instruments, lanyards and backstays, and some limited apothecary supplies, giving the store its uninviting signage: Lute, Rope, and Pillage.
—Ciarán McGonagle, Derry, Northern Ireland
Which reminds me of my D&D-adjacent bard, one of whose catchphrases is, "Come on, baby, fight my lyre!" |
October 21, 2024 at 10:16am October 21, 2024 at 10:16am
|
From the BBC, an article at the intersection of science and mythology.
The myths that hint at past disasters
Myths and fables passed down over thousands of years are full of fantastic creatures and warring gods. But they also might contain evidence of environmental disasters of the past.
It's long been assumed that many, if not all, myths contain some elements of fact. (Like the article, I'm using "myth" in the sense of a passed-down story, not the more modern meaning of falsehood.) The tough part is teasing out the fact from the fiction.
The article opens with a bit about expected sea level rise, comparing it to past sea level rise. This might add fuel to the "climate's changed before and we're still here" campfire of climate change deniers, but so be it.
With the possibility of a catastrophic global sea level rise of 3ft (1m) by 2050 which could force millions of people to leave their homes, researchers have now started to look at ancient stories about land lost to the sea and drowned cities in a new way.
Hell, it might have looked like the whole world was flooding, leading to one extraordinarily popular myth (though that one might have been due to its Mesopotamian origins).
Some researchers argue that tales of hot boulders thrown into the sea or the building of sea walls comprise factual information, albeit exaggerated and distorted to some extent.
Well, okay, argue all you want, but how about some science to back it up? I have this hypothesis that the "nephilim" briefly mentioned in Genesis are actually a passed-down memory of Neanderthals. There's no definitive translation for the word, and the KJV muddied the waters by rendering it as "giants," with even less evidence (we didn't know about Neanderthals at the time). Like I said, it's my hypothesis. I have literally no way of supporting or disproving it.
These researchers are geomythologists.
As good a name as any, and better than most.
"Geomyths represent the earliest inklings of the scientific impulse," says Adrienne Mayor, folklorist, historian of ancient science and research scholar at Stanford University, California, and author of the important The First Fossil Hunters, "showing that people of antiquity were keen observers and applied the best rational, cohesive thinking of their place and time to explain remarkable natural forces they experienced."
I suppose, for various definitions of "rational" and "cohesive," if such words can be applied to deciding that floods are due to gods yeeting boulders and whatnot.
There's a lot more at the article, including specifics; I won't quote much more.
Back when the glaciers retreated, humanity was already infesting the European lands (I'm including Neanderthals in that group). In that era, Britain wasn't an island; it was connected to the mainland by a low-lying chunk of land now called Doggerland. This wasn't particularly ancient, in geological terms; apparently, it flooded out about 6,000 years ago, around the time civilization was flourishing in Mesopotamia, thanks to beer.
I'd often wondered about undersea archaeology in that region, and what it might find in terms of artifacts. Doggerland is often called a land bridge, but that implies humans mostly lived in Britain or the Netherlands, only crossing the "bridge"; there's no reason why people wouldn't have lived there.
Turns out, of course, that actual scientists were way ahead of me.
The whole thing's fascinating to me, though I doubt the article's thesis that there's much practical use in terms of dealing with catastrophic modern climate change in an era when there are billions of people, not hundreds or thousands, living near a coast. But finding more truth about ancient humans is practical enough for me. |
October 20, 2024 at 9:02am October 20, 2024 at 9:02am
|
Going back over 5 years, today's peek into the past takes us to an entry from July of 2019: "Unlucky"
I've written about luck several times in here, but this entry isn't directly about that; it's a play on the word "potluck." The entry is a response to a prompt, most likely from 30DBC: "What is your go-to dish to bring to a potluck? Does your family have any traditional recipes? (In Hawaii, these appetizers are called 'pupus.')"
What I apparently didn't catch at the time was the reference to appetizers; in mainland North America, a potluck is when you bring a dish of any sort to a communal gathering. The dish could be an appetizer, but it could also be a main course (my entry goes on about casseroles for a good bit) or a dessert.
Casseroles, however, mystify me. I'm not sure why. They can't be that hard to do, or millions of Southern Baptists wouldn't do them, in all their various incarnations. You'd think that would teach them something about variety and the wonders of diversity, but... no. They just plop down their armies of Corningware, oblivious to the obvious metaphor.
Since then, I've tried my hand at various casseroles, with a good bit of success. They're less work than other dishes, but still harder than, say, popping a frozen thing into a microwave.
But what stumps me is that I don't know what the essential thing is that makes a casserole a casserole and not something else. Baking? You also bake lasagna; is that a casserole? Meat? Nonsense; there are plenty of vegetarian casseroles.
I've also had several entries about classification problems, like my perennial assertion that a hot dog is not a sandwich, but a taco.
So, yeah, no traditional family recipes here. Just stuff I've picked up along the road.
As I noted in that entry, if there were traditional recipes in my family, they'd be borderline inedible, and I'd have no problem letting them fade away.
Fortunately, one of the comments helped me to clarify the "what exactly makes a casserole a casserole" conundrum. I occasionally get comments here that thank me for helping the commenter learn something new, but I assure you, that works both ways.
Just lucky, I guess. |
October 19, 2024 at 9:39am October 19, 2024 at 9:39am
|
A Guardian article from earlier this year promotes a "radical" idea.
Or, as I call it, eating.
"Intuitive eating" is an anti-diet that helps reconnect us to internal cues. But how does it work?
Well, first of all, if it's intuitive, then you don't need rambling "how does it work" articles, now, do you?
Figuring out what to eat is complicated. What are you in the mood for? What do other people in your household want? What can you afford? What do you have time to prepare?
Wow. If that's complicated, I hope you never have to be a writer working on a deadline. Oh, wait.
Add the ambient pressure of a culture that loudly celebrates certain foods, bodies and lifestyles as desirable while vilifying others, and the simple question of what to have for dinner becomes fraught.
Oh, but it's even worse than that when you add in vegans, gluten-avoiders, the allergic and the "allergic," religious restrictions, etc.
Recently, there has been renewed interest in an approach called "intuitive eating", based on the idea that your body already knows exactly what it needs.
Yes, it absolutely does, and apparently, it "needs" to be overweight.
Apparently, though, that's an expected result of intuitive eating. Just try telling that to my doctor.
She recommends people ask themselves questions like: what does it mean for me to feel healthy? Am I taking into consideration my mental health, emotional health and relational health? Which of my beliefs come from accurate, scientific information — and which come from movies, TV shows or ads in magazines?
Well, I'm not going to argue with this. It parallels my feelings about drinking.
Given how emotional this work can be, experts recommend working with a professional when possible.
Again... this kind of nullifies the "intuitive" part.
Giving yourself permission to eat what you want may involve wrestling with how your culture's foods might have been villainized by mainstream, white American culture, Johnson explains.
Oh, give me a fucking break. I get that it's fashionable to turn mainstream white American culture into the Bad Guy, but how about doing it by ragging on deep-fried butter with mayonnaise, or whatever culinary abomination came out of some Midwestern state fair this year?
Also not mentioned: the overwhelming amount of privilege it takes to "eat what you want." You think our distant, or even not-so-distant, ancestors were able to put together a shopping list or order from Instacart, like I do? No, they hunted or gathered whatever was available that week, or, later, were mostly restricted to plants and animals suited for domestication. Eating what you want is largely an industrial-age possibility, a product of privilege, and this includes veganism or gluten avoidance.
My point being that, as much as I distrust unsupported evolutionary arguments, I don't think we evolved to eat what we want, but to eat what was available, and sometimes, there wasn't a lot of that.
On the other hoof, a lot of people back then didn't live very long, so it's hardly an ideal to strive for. |
October 18, 2024 at 11:02am October 18, 2024 at 11:02am
|
It's science, not magic; and it's not secret anymore, thanks to Noemi. Fair warning, this is a long read:
The Secret, Magical Life Of Lithium
One of the oldest, scarcest elements in the universe has given us treatments for mental illness, ovenproof casserole dishes and electric cars. But how much do we really know about lithium?
It may surprise some people to know that I don't have the Periodic Table memorized. I go, "Hydrogen, helium, lithium..." and then I have to look it up. Fortunately, I live in the 21st century, and it's easy to look things up rather than commit them to long-term memory.
(Beryllium's next, by the way, followed by boron and carbon.)
The universe was born small, unimaginably dense and furiously hot. At first, it was all energy contained in a volume of space that exploded in size by a factor of 100 septillion in a fraction of a second.
This view of universal origin may be in need of revision, thanks to JWST observations. It does not change the material (pun intended) facts of the piece.
Quarks and gluons had congealed to make the first protons and neutrons, which collided over the course of a few minutes and stuck in different configurations, forming the nuclei of the first three elements: two gases and one light metal.
Those would be the ones I mentioned above.
For the next 100 million years or so, these would be the only elements in the vast, unblemished fabric of space before the first stars ignited like furnaces in the dark to forge all other matter.
"Unblemished" is surely poetic license. Also, again, details subject to change pending study of recent observations.
Almost 14 billion years later, on the third rocky planet orbiting a young star in a distal arm of a spiral galaxy, intelligent lifeforms would give names to those first three elements.
Closer to 13.8 billion, by current estimates, and they need to be careful about calling us "intelligent," as doing so invites self-contradictory jokes about intelligence. (Self-contradictory, because one needs to be intelligent to make such a joke, but it's still stupid.)
"Lithium has one of the most complex stories," Fields said. "The oxygen you're breathing, the carbon in your DNA, the iron in your blood — that came later, out of stars. But lithium comes straight out of the Big Bang."
This is misleading, as lithium can also be produced post-Bang, but the article does mention that later.
But the sources of lithium in the universe have, since the late 1980s, presented astronomers with an odd problem. As Fields told me, astronomers studying the abundance of elements after the Big Bang have created complex calculations that account for the expansion of the universe, nuclear reactions and the behavior of subatomic particles like photons and neutrinos. The math pencils out for hydrogen and helium — the measurements match the predictions. Not so with lithium — only a third or less of the expected amount of lithium is observable in the universe.
This sort of thing is exactly why we do science.
The story of how humans discovered lithium goes back to the late 18th century and a Brazilian scientist, statesman and poet named José Bonifácio de Andrada e Silva who was hopscotching around Europe on a sort of early study abroad program.
I'm betting he studied more than "a" broad.
Because the element was discovered in a mineral, Arfwedson and Berzelius named it "lithia," after the Greek word "lithos," for stone.
It should surprise no one that I find the origins of element names fascinating. Apart from obvious ones like iron and gold, known from antiquity, their names tend to reflect how we discovered them. Hydrogen, made from water. Helium, first detected in the sun. Beryllium, found in the mineral beryl (emeralds, for example). That sort of thing.
Finally, in 1855, two chemists — Robert Bunsen and Augustus Matthiessen — were able to isolate lithium in a quantity large enough to study its properties.
Yes, that Bunsen, namesake of every chemistry student's favorite flame source.
The article goes on to describe the beginnings of what might be lithium's most famous application: treating mental disorders. Don't read it if you like guinea pigs.
Research in Texas, Greece, Lithuania and elsewhere has found that naturally occurring lithium in drinking water is associated with lower rates of suicide and violent crime, prompting some scientists to advocate adding it to municipal water supplies, much like fluoride is sometimes added to strengthen teeth.
1) I want to take this opportunity to point out that Lithuania and lithium are not etymologically related.
2) Oh, adding elements to drinking water is sure to be uncontroversial.
The article then turns to what's probably the second most famous application of lithium (and might have already claimed the #1 spot): batteries.
Thomas Edison died in 1931 with a tin of lithium on his desk. One of the final projects in the great inventor's life was experimenting with new chemistries in what he called "storage batteries."
Yeah, right. "Great inventor," my fat ass. Edison was a hack with a flair for marketing and melodrama, who coasted to most of his patents on the writhing backs of his underpaid servants. Could be he invented a thing or two by himself along the way, but he was a massive dick. Muskmelon reminds me of him.
Anyway, the article ends with a section highlighting lithium's ubiquitous role in today's technological life. As I expected, nothing magical about it, but it is interesting to read about. |
October 17, 2024 at 10:19am October 17, 2024 at 10:19am
|
In a sane world, this PopSci article would put an end to one of the stupidest conspiracy "theories" immediately.
Of course, in a sane world, the "theory" would never have gotten traction in the first place.
If you've ever looked up at a mostly blue sky and seen straight white lines criss-crossing the horizon, or watched a plane puff out a plume as it passed above, you may have wondered, "what causes that?"
Yeah, back when I was a kid. This was before the internet, so I used the version of Wikipedia we had back then: I asked my dad, who launched into a long-winded explanation of what he called vapor trails.
I think that was his way of keeping me from asking too many questions: answer so thoroughly that I'd get bored and stop asking questions.
And no, they're also definitely not "chemtrails."
I'd like to think that, even without the influence of my father's scientific mind, I still wouldn't have bought into that idiocy. But I can never be sure.
What they are, instead, is frozen water vapor crystallized on soot particles, both of which are standard byproducts of a jet's combustion engine.
Well, there you go: the answer.
And though conspiracy theories about airplanes distributing mind control chemicals aren't correct, contrails are having a negative planetary effect.
Because of course they are. Nothing we do is allowed to actually be good for the environment.
Atmospheric conditions have to be just right to enable the jet vapor to crystallize: moist enough that the water doesn't evaporate, and cool enough that it freezes.
But as everyone knows, thanks to the internet, nothing that's described as "moist" can ever be cool.
A growing body of scientific research, including studies conducted by both Barrett and Stettler, indicate that contrails cumulatively have a warming effect, contributing to human-caused climate change.
These studies are important, because the leading hypothesis that I'd heard was that, being basically cirrus clouds, they'd have the effect of reflecting some sunlight, reducing the "heat in" side of the climate warming equation.
Perhaps they do, but other things about them overshadow (pun intended) that.
Since contrails, like other clouds, are bright white and insulating, they reflect light and heat, and also trap it. On a sunny day, a contrail does two things simultaneously. It "acts like a blanket," preventing heat radiating from Earth's surface from escaping to space, says Barrett — this is the warming effect. Simultaneously, it also reflects sunlight from space away from Earth's surface, in a cooling effect. Unfortunately, even when the sun is shining, generally the blanket effect outweighs the reflector effect, says Barrett.
At least according to this guy.
Anyway, the article goes into what can be done about them, and, surprisingly, the answer isn't "just stop flying" or "stop using plastic straws." In other words, this is one contributor to climate change that can be mitigated fairly easily, and it's not even about us doing penance for how sinful humans are (just corporations).
And yet, some of us are just going to blithely keep on believing that they're mind-control chemicals. |
October 16, 2024 at 8:12am October 16, 2024 at 8:12am
|
From The Takeout, answers to questions you never asked.
No, it's not "for her pleasure."
You might think the ribbed design on canned food is just a quirky touch meant to make jellied cranberry sauce look more interesting, but it actually serves a crucial purpose.
I might be alone in this, but I like the jellied cranberry sauce way more than the other canned kind, and way, way more than homemade. The uniformity of it appeals to me. I like to punch out both ends of the can and use one of them as a plunger, pushing out precisely 1/4" of the stuff at a time, then slicing it off at the can top. It's a far superior experience and taste.
Not that I'd ever tell that to someone who sweated over a hot stove to prepare homemade.
This is because the corrugated ridges add strength to the tin by creating a series of tiny arches along the surface. These in turn more evenly distribute pressure, similar to how the arches of a bridge help it carry heavy loads.
Always amusing to see someone unfamiliar with a subject (in this case, structural engineering) attempt to describe it to others.
Wait... is that how I sound all the time?
But the ridges have another job that goes beyond just strength.
Thus once again demonstrating the concept that things don't have to have just one purpose.
In addition to making the can stronger, the ridges on a can help it withstand heat. During the canning process, the contents are sealed and then brought to high temperatures to kill off any bacteria, helping to ensure that the food inside stays safe to eat.
In theory.
These ridges also help the can remain intact after it leaves the factory, when it might have to endure extreme temperatures during transport and storage.
I have a severe addiction to Coke Zero, and I buy it by the case. Obviously, Coke cans aren't ribbed, though their bottoms and tops are reinforced (because of how cylinders perform with pressure inside). Sometimes, on rare occasions, the reinforcement fails, and the bottom goes from concave to convex. On other occasions, it's the top that blows out, making it a whole lot harder to open the damn can. Unlike with canned food, this doesn't indicate that the contents have gone bad.
I suppose it's possible that some cans fail along the cylinder, too, and those are just culled before sale. But it seems to me that the sides of the can are not the failure points.
In the early 19th century, food was stored in glass jars that were heated to high temperatures to kill bacteria and then sealed to keep the contents safe.
Home "canners" still use glass jars. Duh.
The next big leap came in 1810 when English inventor Peter Durand developed tin-plated iron cans... Before the invention of the can opener nearly 50 years later, people had to use tools like hammers and chisels to access the food inside.
Fifty years? Man, humanity really dropped the ball on that one. I guess they were too busy opening cans to develop a better way of opening cans. |
October 15, 2024 at 7:42am October 15, 2024 at 7:42am
|
This BBC article (whose headline resembles a Cracked headline) is a few years old now, but all that means is they might be missing a few recent ones.
In recent history, a few individuals have made decisions that could, in theory, have unleashed killer aliens or set Earth's atmosphere on fire. What can they tell us about attitudes to the existential risks we face today?
Already, I have a bad feeling about this. Sounds like an Agenda, which I don't expect from the BBC.
In the late 1960s, Nasa faced a decision that could have shaped the fate of our species. Following the Apollo 11 Moon landings, the three astronauts were waiting to be picked up inside their capsule floating in the Pacific Ocean — and they were hot and uncomfortable. Nasa officials decided to make things more pleasant for their three national heroes. The downside? There was a small possibility of unleashing deadly alien microbes on Earth.
No. No, there wasn't.
They thought there might be such a possibility, so I understand the caution, but in hindsight, there was never any danger from that.
A couple of decades beforehand, a group of scientists and military officials stood at a similar turning point. As they waited to watch the first atomic weapon test, they were aware of a potentially catastrophic outcome. There was a chance that their experiments might accidentally ignite the atmosphere and destroy all life on the planet.
Again: no, there wasn't. They thought there might be such a possibility, and they went on with the tests anyway, which says more about humans than I care to admit.
This sort of writing demonstrates some sort of fallacy that, well, I don't know what it's called, and it's hard to back-search these things.
The people in those cases were acting on imperfect knowledge. I mean, we all are, all the time, but not usually with imaginary stakes so high.
Let me maybe give you an example. You're walking through the woods one day, and you come across a pedestal with a single, big, red button on it. There's no writing, no pictograms, nothing to indicate what (if anything) pressing the button might do. From cartoons, you know that pressing the Big Red Button is generally a Bad Idea, but those are cartoons, not reality.
You're human, and you're faced with a mysterious button in a strange location. The urge to push it is real, palpable. How can you, curious ape, just... walk past it? You can't. Billions of years of evolution can't be overcome by the moral lessons in cartoons, not easily. Maybe it opens a secret cache with 10 million dollars in nonsequential, unmarked bills. Maybe a genie would come out to grant wishes, and you could wish for more than $10M. Okay, that's unlikely. Maybe nothing will happen. That seems most likely. Maybe a piano will drop onto your head when you press it, killing you. Or maybe it connects to a gigantic planet-exploding device hidden deep in the core. Wow, that's really unlikely.
What you're doing, whether you realize it or not, is risk assessment. Each possible event has a probability. Each possible event also has a desirability. Assigning probabilities is tough, and it's even tougher without Bayes' Theorem to help you assess the probability through the lens of prior knowledge. It's also very difficult if you have no way of knowing what every possible outcome is. But the point is, if you can do that, even roughly, and multiply the probability by the desirability, you can more easily rank your choices. Finding a stash of money? Low probability but desirable. Blowing up the planet? Really low probability, but undesirable.
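That "multiply the probability by the desirability" step is just an expected-value calculation. Here's a sketch of the Big Red Button decision; every probability and payoff below is invented purely for illustration:

```python
# Expected-value ranking of the Big Red Button scenarios.
# Every probability and utility here is made up for illustration;
# the probabilities roughly sum to 1.
scenarios = {
    "nothing happens":     (0.97,      0),
    "cache of $10M":       (0.0099999, 10_000_000),
    "genie grants wishes": (1e-7,      100_000_000),
    "piano drops on you":  (0.02,      -50_000_000),  # roughly "you die"
    "planet explodes":     (1e-9,      -1e15),        # everyone dies
}

expected_value = sum(p * u for p, u in scenarios.values())
print(f"Expected value of pressing: {expected_value:,.0f}")

# Ranked by contribution, so you can see which outcomes dominate:
for name, (p, u) in sorted(scenarios.items(), key=lambda kv: kv[1][0] * kv[1][1]):
    print(f"{name:>20}: {p * u:,.0f}")
```

Notice that with these made-up numbers, the two catastrophic outcomes swamp the sum despite their tiny probabilities, which is exactly why the Trinity-test and Apollo decisions were fraught even if, in hindsight, the real probability was zero.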
In the end, billions of years of evolution win out, and you press the button.
Only through hindsight can you really know if you made the right choice or not.
Getting back to the examples from the article, yes, of course, I'd heard about them before. Take the astronauts thing. High probability: no deadly space microbes. Result: no disaster. Low probability: deadly microbes. Result: Astronauts die, Earth saved. Or, Result: Everyone dies. That last one would be Bad, so I don't blame them for hedging with the "deadly microbes" scenario.
It's kind of like the Trolley Problem, only your action shifts the trolley onto a random track where a random number of people are tied.
I don't know if I'm explaining all this very well, but my main point is, the article does a really bad job of distinguishing "what we thought the probability was at the time" from "what the probability actually was, in hindsight."
The remainder of the article does a much better job explaining risk and our reaction to it, but the way it's described at the beginning stuck in my craw, so I could no more resist posting about it than you could resist pushing that Big Red Button.
October 14, 2024 at 9:27am
From Cracked, a countdown of the unseen:
All around you are signals you have no way of perceiving on your own. For example, did you know that invisible waves travel through the air, transmitting words and images, decodable by the right device?
We only see a remarkably thin slice of the electromagnetic spectrum. There are good evolutionary reasons for this, but why interrupt a fact-comedy with pure fact?
Not everything imperceptible is on the EM spectrum, though.
5. Seismometers Spot Quakes and Save Trains
Not a single person has died in an accident aboard Japan's bullet trains.
There was a movie a couple of years back called Bullet Train, set on a Japanese bullet train. Action/comedy, so the title is a pun. It's not much of a spoiler to say that bad things happen on the train and people die. But not in an accident caused by an earthquake. I had the opportunity to watch the movie again on a recent long flight, and it was even better than I thought it was the first time.
Even in 2011, when that 9.0 earthquake hit Japan, collapsing a bunch of bullet train stations and ripping up the track, seismometers on the ocean floor detected the first tremors and shut down the trains in time.
EM waves, including radio, travel faster than seismic waves, so these early detection systems aren't magic, but science.
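The timing margin is easy to work out. Here's a back-of-envelope sketch, using textbook approximations for wave speeds and hypothetical distances (a real system like Japan's is far more sophisticated):

```python
# Rough arithmetic behind earthquake early warning: the first, weaker
# P-waves reach an offshore sensor, a radio alert travels at ~light
# speed, and the slower, destructive S-waves arrive later.
# Speeds are textbook approximations; distances are hypothetical.

RADIO_SPEED_KM_S = 300_000   # EM waves: effectively instantaneous
P_WAVE_KM_S = 7.0            # first tremors, detected by sensors
S_WAVE_KM_S = 4.0            # slower, more destructive shaking

def warning_seconds(sensor_to_epicenter_km, train_to_epicenter_km):
    """Seconds between the radio alert arriving and the S-wave arriving."""
    detect = sensor_to_epicenter_km / P_WAVE_KM_S
    alert = detect + train_to_epicenter_km / RADIO_SPEED_KM_S
    shaking = train_to_epicenter_km / S_WAVE_KM_S
    return shaking - alert

# A seafloor sensor 20 km from the epicenter, a train 100 km away:
print(round(warning_seconds(20, 100), 1))  # ~22 seconds of warning
```

Twenty-odd seconds doesn't sound like much, but it's plenty of time for an automated system to cut power and start braking.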
4. The Mountain Pool of Ultrapure Water
Japan is also home to Super-Kamiokande, an observatory for detecting neutrinos. It's a tank a hundred feet tall and a hundred feet wide, located in a mine a thousand feet under a mountain.
Neutrinos are the second hardest things to detect, after dark matter. Well, maybe third, after dark matter and something so invisible we don't even know it exists, yet.
Also, I'm pretty sure the tank is measured in SI units, as is its depth.
Neutrinos travel at the speed of light and pass through the planet without affecting much of anything, but we still manage to detect them using Super-Kamiokande.
Far be it from me to quibble about something on a comedy site, but I'm going to do it anyway: Neutrinos have mass, and therefore don't quite travel at the speed of light. But, you know... close enough. (There was some sensationalist reporting a while back on neutrinos that appeared to exceed the speed of light, but that turned out to be a mistake.)
3. ShotSpotter, the Gunshot Detector
Sometimes, you hear a gunshot outside, and you shrug your shoulders, saying, "Eh, what can you do?"
Emigrate?
Starting in 1997, several cities in the U.S. set up a system of sensors that pick up the sounds of gunfire and relay the location of the source directly to police. In D.C., for example, the sensors picked up some 40,000 shots in a 20-square-mile area over the course of seven years.
That's the sound of freedom.
2. The Detectors that Detect Radar Detector Detectors
Naturally, such devices use waves of their own, which means they're vulnerable to being spotted by radar detector detectors. Of course, that means you should equip your radar detector with a radar detector detector detector, which will shut your device down as soon as it realizes it's being spotted.
And so the arms race continues.
Pretty sure my state is still the only one in the US where radar detectors are illegal, but I can't be arsed to find out for sure. Yes, I drive around a good bit. No, I don't have a radar detector.
1. The Parkinson's Sniffer
This one may seem out of place, being a mutant superpower instead of a tech device. But if you read to the end, you see that they used the mutant's superpower to create the tech needed for a Parkinson's test.
No mention in the article of radiation and Geiger counters, but I suppose most people know about that one, anyway. In any case, all of these were known invisible things. It's the unknown invisible things you really have to worry about.
October 13, 2024 at 10:31am
We're only going back two years in today's trip to the past, to a brief entry that linked an Atlas Obscura article: "I Wood Knot"
Short one today, to balance out yesterday's.
Yeah, I looked, and indeed, I'd written an entire encyclopedia entry the day before.
Quote from the article: In the Cathédrale de Saint-Pierre in Geneva, Switzerland, a high-backed wooden chair sits in a place of honor. It's roped off so that nobody can sit in the seat, where the French preacher John Calvin sat more than 500 years ago.
I've looked more into Calvin since then, and there's almost nothing about his philosophy or theology that sits right with me. (Pun intended. Chair? Sits right? Yeah, baby, I'm BACK.)
Me: At least they're not claiming it's made of wood from the True Cross.
At least one of the innumerable churches, cathedrals, basilicae, shrines, whatever that I saw in Europe claimed to have such a splinter.
Article: From the pulpit at St. Pierre Cathedral, he preached about the importance of religious scriptures and the concept of predestination, which held that certain people were set on a path for salvation from the very beginning of their lives.
Regardless of my rejection of the theology, there's no denying that he was a major influential figure. Perhaps he was predestined to be so.
Article: The chair of Calvin is a plain-looking wooden chair with a trapezoidal base and narrow backrest.
Appropriate that it's not all gilded and ornate. Credit to him for not being too hypocritical.
Me: I just wanna know one thing:
Did it have room for Hobbes, too?
I have no doubt whatsoever that John Calvin would disapprove of that bit of levity. Any worldview that doesn't put a sense of humor high on the list of important traits is not one I can ever accept.
October 12, 2024 at 8:39am
Tomorrow, I intend to get back to my usual annoying self here by posting a Revisited entry. As for today, well, I'm still in recovery, so I'll just summarize some of my observations on the trip that I may or may not have already blogged about (because I can't remember and can't be arsed to re-read what I've already posted).
I gained a greater appreciation of the intricacies of French wine. Contrary to French beliefs, it's not necessarily superior to other countries' wine, but it does take a different approach. Not that I know a whole lot of details; that takes years of study and practice. I don't mind practicing.
It's popular in some circles to tout the general superiority of Europe over the US. In other circles, the opposite is the case. As it turns out, in my opinion at least, they're just different. Europe does some things better, like public transportation and healthcare delivery. The US does other things better, like... um... well, I can't think of anything offhand, but I'm pretty sure I would if I weren't so tired. I guess we invented the Internet and GPS, so there's that.
There are whole sections of Brussels where there are more chocolatiers on each block than all other businesses combined.
There are others that are pretty much entirely restaurants.
I gather that it is easier and takes less time to become a medical doctor anywhere than it does to become a chocolatier in Belgium.
Real estate (both renting and buying) is amazingly cheap there, compared to the US. At least in the country and small towns. And that's with a denser population.
Unlike my above comment about French wine, Belgian chocolate really is the best. Sorry, Switzerland. At least you won't send assassins after me for that observation. ...Right?
The old joke about how Brits think 100 miles is a long way, while Yanks think 100 years is a long time? It applies to mainland Europe as well. You just have to convert to SI units.
Best quote of the trip, by a Belgian tour guide, uttered between bites of a Belgian waffle topped with whipped cream and chocolate: "You Americans eat like you have free health care."
So that's all I'm going to post about the trip... unless, of course, something else reminds me of a thing I did there. Like I said, tomorrow, we're back to our regularly scheduled programming.
© Copyright 2024 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved. Robert Waltz has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.