About This Author
Come closer.
|
Complex Numbers
A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is a square root of -1). For example, 3 + 2i is a complex number.
The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, b (the coefficient of i).
Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.
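For the programmatically inclined: Python happens to have complex numbers built in (it writes i as j, an electrical-engineering convention), and the real and imaginary parts map directly onto the two axes of the complex plane. A quick sketch of the ideas above:

```python
# Python writes the imaginary unit as j, not i.
z = 3 + 2j

# The real part is the horizontal coordinate on the complex plane,
# the imaginary part the vertical coordinate.
print(z.real)  # 3.0
print(z.imag)  # 2.0

# Verify the defining property: i squared is -1.
i = 1j
print(i ** 2)  # (-1+0j)

# If a == b, the number is equal parts real and imaginary,
# landing on the plane's diagonal.
w = 4 + 4j
print(w.real == w.imag)  # True
```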
Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
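The most famous such fractal is the Mandelbrot set, which falls out of iterating the almost absurdly simple map z → z² + c. A minimal sketch (the iteration cap of 100 and the escape radius of 2 are conventional choices, not sacred ones):

```python
def in_mandelbrot(c: complex, max_iter: int = 100) -> bool:
    """Iterate z -> z**2 + c starting from z = 0; if |z| ever
    exceeds 2, the orbit escapes and c lies outside the set."""
    z = 0
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return False
    return True

# Points near the origin stay bounded; points farther out escape fast.
print(in_mandelbrot(0))       # True
print(in_mandelbrot(-1))      # True
print(in_mandelbrot(1 + 1j))  # False
```

Color each escaping point by how many iterations it took to escape, and the intricate boundary of the set emerges.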
|
Wrapping up another month in the relentless march to our individual and collective doom. As it is Sunday, I wanted to pull out a past entry and take another look at it.
As I've noted before, I don't take anything from the past year for revisiting. Today, though, cuts it pretty close; it's an entry from April 11 of last year: "Asking Persimmon"
Since it's so recent, not much has changed. The link, to an Atlas Obscura article, still works.
I made my perennial complaint about the inconsistency of labeling delicious fermented beverages:
In general, beer is fermented grain, while wine is fermented fruit. However, exceptions abound. Cider, usually from apples or pears, is its own category. Japanese sake is most often known as "rice wine" in English, probably because its alcohol content is higher than most beer and it's generally not bubbly. For example. So if someone wants to call this "persimmon beer," well, they don't need my permission.
Well, it's not a complaint, exactly. Just an observation. In the end, though, it doesn't matter all that much, as long as you pretty much know what you're drinking. That is, don't make a sparkling plum wine and call it peach beer. Or whatever. I had a mead yesterday that was also somehow categorized as beer because, it seems, there were grains somewhere in the brewing process. My understanding is that, to be called mead, most of its fermentation product has to come from honey—kind of like something has to be 51% de agave (and from a particular region) to be called tequila.
What I'm getting at is, as with many things such as mountains/hills and planets/dwarf planets, sometimes you run into a categorization issue. I'm mostly mentioning this stuff again because, from the article and my quote thereof, there's something that I apparently didn't bother talking about:
...fermenting Diospyros virginiana, the diminutive North American persimmon, with sugar, honey, and yeast...
So what I'm not real clear on is how much of the fermentation product is from persimmon, sugar, or honey. Persimmon seems to have enough sugars in it by itself for fermentation; I'd imagine the sugar and honey were about flavor. I highly doubt the ratio of those ingredients was consistent, though.
Probably doesn't matter. The story remains the same, and it's still just as interesting as it was last year when I linked it, or the year before when the article came out. |
|
I found more new-to-me breweries today.
But let me start by recapping yesterday.
The town I was in was, if you care to look at it on a map, Wytheville, VA. I'd never spent any time in it before, but it's exactly what I expected from a small town in the mountains: two stoplights, 500 people, 600 churches, and a rustic nonfunctional clock on the courthouse.
And, as I noted, two breweries. Here's the funny part, though: as far as I know, the breweries aren't related. But one is called 7 Dogs, named after seven dogs the owner had rescued (with, fortunately, a nod to the rescued cat who kept the dogs in line).
The other is named Seven Sisters. Yes, this is important because they both had the number seven in the name. But as I approached this second brewery, I pondered: would it be a reference to actual sisters, or to the star cluster variously known as the Seven Sisters, the Pleiades, and Subaru? Because as you know, I'm a sucker for anything astronomy. And I drive a Subaru.
So I was pleased to see a nice big print of the Pleiades as I entered.
And yet, the beers bore traditional women's names. Like Rosie, Julia, Edith, etc. Which is fine; it's good to have a theme. But one thing disappointed me: there was no beer named Kate.
You see, I like to order tasting flights, which usually consist of 4-6 sampler-sized beers. The missed opportunity was that I could have had my Kate, and Edith too.
I will just pause while you absorb the greatness of that joke.
Ready?
Okay. So, after passing out and sobering up, I spent the better part of the day driving across Virginia, to the east, all the way to the Richmond area, as prompted by my random number generator. Nice drive, great weather. I don't get to the southern part of my state very often, and never before along an east-west route. It could just as easily have landed me back in West Virginia, or Ohio, Kentucky, or Tennessee. Or maybe even a Carolina. But instead, it kept me in Virginia. Fine. Such is the nature of randomness.
Here, I found a couple of breweries in Chester, which is a city south of Richmond but north of Petersburg. No, it confuses me, too; as far as I'm concerned, all those cities and everything else in the vicinity is Richmond. There really ought to be a rule: unless there's a significant river (the James doesn't count) or a lot of trees and/or farmland in between, just fucking merge the cities.
If you still don't know what the hell I'm talking about, try Google Maps again. I've been drinking.
Just one thing of note on the drive:
I saw a wake of buzzards.
By "buzzards," I mean turkey vultures, Cathartes aura, my namesake and the closest thing I have to a spirit animal. It's not unusual to see turkey vultures around here, but usually, they're loners. This is one reason why they're my spirit animal. Sometimes, you'll see a bunch ("wake" is apparently the official collective noun for turkey vultures) of them hanging around, but that's rare. So it surprised me when I saw not two, not three, but five of them just chilling together. I had the windows down, but I didn't see or smell any carrion in the vicinity, so maybe they were just having a chat. Nonsocial birds being social.
Now I see that they're only called a "wake" when they're munching on a carcass. As these weren't, there are various possible collective nouns. "Committee" seems to be the leading candidate. A committee of turkey vultures.
But I digress.
Did the Universe send me a committee of vultures as a Sign? A Message? Well, no, of course not. But that doesn't mean I can't remark on how cool and unusual it was to see a committee meeting.
Ugly-ass birds, but they're magnificent anyway. |
|
Yesterday's entry was short because I didn't have a comfortable place to type. Today, I found a hotel with a desk in the room, so that's an improvement. But I'm still going to keep it short, because I have beer to introduce myself to and I want to get to it.
In contrast to yesterday's lodging, this one has quite ancient amenities, dating back to maybe 2012. The building itself is nearly a hundred years old, but they do have wi-fi and USB charging ports.
Most importantly, though, the hotel is within stumbling distance of the two breweries in town, about midway between. This is a good thing. Sure, I'm paying a premium for proximity, but if I stayed in the cheap area, I'd have to pay for Uber. That is, if I can even get Uber here.
Since I have nothing to complain about (yet), this is going to be even shorter than yesterday's entry. I will say, though, that driving across mountains is a lot of fun. |
|
It's going to be a short one today.
I decided to do one of my random road trips. Today, I ended up in White Sulphur Springs, West Virginia.
The only hotel in town (other than The Greenbrier, which is very expensive but you should definitely Google it because it's got a fascinating history) is conveniently located about a block away from the town's single brewery, Big Draft. I have already gone there to sample their wares, and they're very good.
This hotel - converted from a schoolhouse, because why would West Virginia need schoolhouses? - is less than a year old, so it's "smart." So smart, in fact, that when I checked in, they had to give me the user manual for the room controls.
Everything, and I mean everything, in the room is controlled by a touchscreen. The fucking toilet is controlled by a touchscreen. The icons on that touchscreen are incomprehensible, and they are NOT IN THE USER MANUAL.
And I'm not about to call the front desk and ask them how to use the goddamned toilet. Which has a bidet built in. While that's a great idea, again, the controls might as well be in French. Hell, if the controls were in French, I'd actually have a shot at understanding them.
Adding injury to insult, the room doesn't even have a desk. Which means I have to balance the laptop on a TV dinner stand. Which in turn means no mouse, and I hate using the touchpad. No, I'm not going to use my phone to blog; that's even worse. This is why I'm keeping this entry short, and not linking anything.
I hate this future. You remember the Daffy Duck cartoon where he's in a push-button house? Well, if you don't, Google that too because linking anything right now is a massive pain in the ass (unmitigated by the presence of a bidet and - I am absolutely, 100% not kidding here - the massage capabilities of the room's toilet seat). I thought it was funny when I was a kid. Not any more. I'm in a literal push-button hotel room.
The one bright spot in all of this, apart from the proximal brewery (which, again, is very good), is that the hotel has a bar. And I will be availing myself of its services shortly.
I just hope it's not staffed by robots. |
|
It may seem unbelievable today, what with the push toward so-called "natural" foods and supplements, but there was a time when technology was all the rage in popular culinary culture.
This article is from December of 2019, so the writer was contractually obligated to mention the "holiday season."
I'll just conveniently ignore that some Americans never used lard for religious reasons and take the history at face value.
But for all Crisco’s popularity, what exactly is that thick, white substance in the can?
If you’re not sure, you’re not alone.
I can't remember the last time I saw Crisco. I'm vaguely aware that it's still around, but it's just not part of my worldview. Neither is lard. Oh, I long ago got over my upbringing with its aversion to all things swine; I simply detest animal fats. I won't even cook my eggs in bacon grease. Nor will I accept bacon with too much fat on it.
But okay, that's me, and I know I'm an outlier.
For decades, Crisco had only one ingredient, cottonseed oil. But most consumers never knew that. That ignorance was no accident.
Lesson learned: keep the populace ignorant and you can sell them anything.
A century ago, Crisco’s marketers pioneered revolutionary advertising techniques that encouraged consumers not to worry about ingredients and instead to put their trust in reliable brands.
Think that would work today? No? Have you seen the supplement industry? "Made from all-natural ingredients" can mean anything from diddly to squat.
For most of the 19th century, cotton seeds were a nuisance.
I suspect that for at least one demographic, for most of the 19th century, anything related to cotton was a nuisance.
When cotton gins combed the South’s ballooning cotton harvests to produce clean fiber, they left mountains of seeds behind. Early attempts to mill those seeds resulted in oil that was unappealingly dark and smelly. Many farmers just let their piles of cottonseed rot.
Admittedly, I don't know a lot about the life cycle of the cotton plant. I could learn, of course, but I don't care enough right now. But before hybridization and complete corporate control over seed stock, couldn't they have, you know, replanted the seeds? I guess not. I'll delve into that when I have more time and energy.
It was only after a chemist named David Wesson pioneered industrial bleaching and deodorizing techniques in the late 19th century that cottonseed oil became clear, tasteless and neutral-smelling enough to appeal to consumers.
Yes, this is the third blog entry in a week focusing on technological innovations of the late 19th century. I assure you this is a randomly-generated coincidence.
Oh, and you've probably heard the name Wesson before. I don't know if it's still around, but when I was a kid, his name was on cooking oil bottles.
The key word in the quoted bit there, though, is "industrial."
While lard prices stayed relatively high through the early 20th century, cottonseed oil was abundant and cheap.
Americans, at the time, overwhelmingly associated cotton with dresses, shirts and napkins, not food.
Which is understandable, but, for example, trees are associated with food, lumber, and firewood, not limited to one use. Things can have more than one association. Hemp, e.g., or barley, which is versatile enough to make beer or whiskey.
Nonetheless, early cottonseed oil and shortening companies went out of their way to highlight their connection to cotton.
I'd think that by that point, cotton could have acquired a bad reputation since it was just as associated with slavery as with clothing. But then I stop and think some more, and realize... nah, few would have cared.
When Crisco launched in 1911, it did things differently.
Meaning, as the article points out, it didn't mention cotton at all.
But it was also a new kind of fat – the world’s first solid shortening made entirely from a once-liquid plant oil. Instead of solidifying cottonseed oil by mixing it with animal fat like the other brands, Crisco used a brand-new process called hydrogenation, which Procter & Gamble, the creator of Crisco, had perfected after years of research and development.
It would take nearly 100 years for it to be properly demonized as a trans fat (which can unfortunately be confused with a fat trans, so please don't make that joke because it's rude). Again, though, at the time, it was innovative.
Instead of dwelling on its problematic sole ingredient, then, Crisco’s marketers kept consumer focus trained on brand reliability and the purity of modern factory food processing.
Sounds a bit discordant to modern ears, doesn't it? Back then, though, it worked.
Today, Crisco has replaced cottonseed oil with palm, soy and canola oils.
Canola oil is another product of marketing.
Based on my limited understanding, it was developed in Canada (hence the "can" part), with the -ola suffix associated with oils (see also once-popular margarine brand Mazola, contracted from "maize"). But that naming convention conveniently hid the true origins of the oil: a mustard family crop called rape.
Some other Anglophone countries are okay with this linguistic coincidence. Not us.
Words have power, and nowhere is this more obvious than in marketing.
Anyway. I'm pretty sure palm oil is even worse for you than cottonseed oil, notwithstanding their vegetative origins. But I've been informed that they cut out most of the trans fats in their new formulation.
Once ingredient labeling was mandated in the U.S. in the late 1960s, the multisyllabic ingredients in many highly processed foods may have mystified consumers. But for the most part, they kept on eating.
I've said this before and I'll say it again: to encourage people to "not eat anything you can't pronounce" is to encourage ignorance. Everything you eat is made up of chemicals. Here, take a glance at this paper which lists several of the compounds present in that Platonic ideal of health foods, an apple. You can't pronounce most of them. Hell, I have trouble with some of them and I have a background in chemistry. Spoiler: most of them are "healthy."
That said, I'm pretty sure it was the 1960s that also saw the shift begin away from processed foods and toward so-called "organic" (all food is organic) produce.
Was that a backlash to mandatory ingredient labels? Maybe. We're talking about a populace that rejected a 1/3 pound burger because it sounded smaller than a 1/4 pound burger.
Anyway, I'm not here to try to tell you what to eat and what not to eat. No one's paying me and I'm not selling anything. I'm only here to make jokes and note things that I find interesting. |
|
By coincidence, today's article also talks about a late nineteenth century innovation, one which was apparently met with some pushback.
Pun, of course, intended.
The electric push button, the now mundane-seeming interface between human and machine, was originally a spark for wonder, anxiety, and social transformation.
Because it's obviously sorcery.
As media studies scholar Rachel Plotnick details, people worried that the electric push button would make human skills atrophy. They wondered if such devices would seal off the wonders of technology into a black box: “effortless, opaque, and therefore unquestioned by consumers.”
Does this sound to you like almost every other complaint about new technology and/or ways of doing things? It does to me.
Right after we invented fire, a bunch of skin-wearing humans probably freaked out, too. "What is this 'starting a fire' bullshit? In MY day, we'd steal it from a burning tree after the gods smote it with lightning, as they intended. This is going to make people lazy and entitled, you mark my words."
“Some believed that users should creatively interrogate these objects and learn how they worked as part of a broader electrical education,” Plotnick explains. “Others…suggested that pushing buttons could help users to avoid complicated and laborious technological experiences. These approaches reflected different groups’ attempts at managing fears of electricity.”
One thing I remember about the show The Jetsons from when I was a kid (haven't seen it since) was that George Jetson's job in our utopian Eloi-like future involved pushing seemingly random buttons all day, to the point that his finger would cartoonishly swell and throb in pain.
Well. At least they got that part right (he typed on his computer keyboard). Still no flying cars, though. By the time we get those, it's all going to be touchscreens and voice interfaces, both of which are going to make people lazy and entitled, you mark my words.
Electric push buttons, essentially on/off switches for circuits, came on the market in the 1880s.
For those of you who didn't bother with yesterday's entry, that's more than a decade after the invention of blue jeans.
The word “button” itself comes from the French bouton, meaning pimple or projection, and to push or thrust forward.
I don't claim to be an expert in French, but I learned "bouton" to mean like the button on a shirt (or Levis), and that word doesn't look like a verb form to me. Perhaps archaic French; je ne sais pas.
Those who promoted electricity and sold electrical devices, however, wanted push-button interfaces to be “simplistic and worry-free.” They thought the world needed less thinking though and tinkering, and more automatic action. “You press the button, we do the rest”—the Eastman Company’s famous slogan for Kodak cameras—could be taken as the slogan for an entire way of life.
Like I said, lazy.
What I didn't say (at least not yet in this entry) is that I consider that a good thing. Necessity may be the mother of invention, but laziness is the milkman. Bill Gates supposedly once said that he hires lazy people because they'll figure out the most efficient way to do things. I have no idea if that quote is correctly attributed, but someone said (or wrote) it, and it rings true.
Ultimately, the idea that electricity was a kind of magic would triumph over a more hands-on, demystifying approach.
Guilty. For all I understand it on anything but the most theoretical level, it might as well be sorcery. Yes, I'm an engineer by training. No, that training didn't include electrical.
Plotnick quotes an educator and activist from 1916 lamenting that pushing a button “seems to relieve one of any necessity for responsibility about what goes on behind the button.”
Yep. History echoes once again.
150 years later, we equate button-pushing with "easy." That's the whole point of Staples' ad campaign (from a while back; I have no idea if it's still going on.) But I don't think they invented the concept; they just took an existing concept ("at the push of a button") and made it into a marketing gimmick.
No... technology is supposed to make our lives easier, make things more convenient. That's what it's for.
Kids these days with their simple-to-use devices, I'm telling you, mark my words... |
|
It is now time for me to wrap up my April entries into "Journalistic Intentions" [18+]. Of the nine prompts I haven't tackled yet, which one will my random number generator choose this morning? Let's find out...
denim
Oh, glorious.
My dad always called blue jeans "dungarees." His attire of choice was denim overalls, because by the time I became self-aware, he had retired from the military and become a farmer, and just as military officers wear uniforms, farmers wear overalls. He didn't call the overalls "dungarees," though. I don't know why and I certainly can't ask him now. Probably, like about 80% of what he did, it was to annoy me, because I liked wearing blue jeans and hated the word "dungarees" because it started with the sound "dung," which, as farmers, we were all too familiar with. Still, many people, especially in his day, used the words interchangeably.
Nowadays I like wearing black jeans, often faded to what my friends called "gamer gray," but while the color and the waist measurement have changed, they still come from Levi Strauss, complete with rivets at the seam junctions.
Turns out that denim and dungaree fabric, while similar, aren't actually the same thing. The latter was apparently invented in India many centuries ago, but the particular combination of dye and weave that go into a pair of blue jeans dates back to maybe the 19th century, with the invention of blue jeans coming shortly after the end of the US Civil War.
If you bother going to those links above, you'll see what people like to ding Wikipedia on: one calls dungaree a predecessor or precursor to denim, while the other asserts that the connection is unclear. This is why consulting footnotes and other sources is important. But I'm aware that's a lot of work, and in spite of my predilection for wearing what was originally a working-class pair of pants, I'm allergic to work.
No disrespect intended to the Indian subcontinent origin of dungaree—that culture invented many essential things for society, including our counting numbers (they're called Arabic numerals, but it turns out the Arabs borrowed the system from India). It's clear to me, though, that denim as we know it today had its origin in the same place so much fashion came from: France. Specifically, the town called Nimes, in the south. De Nimes apparently got Anglicized as denim.
Which makes me think that "jeans" should be pronounced "zhahns" in the manner of Jean-Luc Picard, but that turns out not to be the case. Apparently (at least according to one of the non-Wikipedia sources I looked at), its etymology comes from the port of Genoa, which in French is or was called Genes, not to be confused with chromosomes. Why this is the case, I couldn't be arsed to figure out; I did look at a map, which shows that while Nimes is close to la mer Méditerranée, there are at least two perfectly good French ports (Marseille and Nice) between Nimes and Genes.
In any case, you know what the French word for jeans is? It's not pronounced like Admiral Picard's name. No, it's pronounced almost exactly like Anglophones pronounce it (the 'j' sound is a bit different), which must be a source of early heart attacks for members of l'Académie française.
But jeans as we know them aren't French; they're indisputably a Jewish invention.
Gotcha. No, Levi Strauss (German-Jewish family) didn't invent blue jeans; a tailor of Lithuanian-Jewish origin named Jacob Davis (last name changed upon immigration, as with many immigrants) invented them (or at least perfected their durability by using rivets) in Nevada (which had just barely become a US State), but apparently couldn't keep up with demand, so he partnered up with Levi-Strauss to meet production needs.
The rest, as they say, is history.
To summarize, it's an Indian inspiration for a French fabric that was perfected as work clothing by Jewish immigrants in the Wild West of the US. It would be difficult to get more multicultural than that, but I suspect you can find other examples of intercultural fashion development if you try.
I mentioned in a previous entry the persistence of rivets in Levis despite advances in clothing technology that render them mostly decorative now. At the time of their invention, though, riveting the weak spots was the innovation that made jeans the preferred outfit of the working class.
They never did, as far as I know, rivet the crotch. I mean, I can kind of understand why; metal tends to get hot in the sun, and the last place you want to heat up is right under your balls. Or, if you don't have them, your cooch.
So that's what always fails first for me on a pair of black Levis (though usually only after a few good years of stalwart service). Hopefully not while I'm in public. Nothing's perfect.
But in a world dominated by throwaway fashion, it's good to know some clothing is still built to stand the test of time.
Speaking of which, there's one final thing I want to add: that little pocket on the right where you might keep an emergency quarter or two, or a key? The original purpose of that was to hold a pocket watch. Because no matter how hard you work, you still need to keep track of time. |
|
Today is the day for me to delve into the past to see what I was up to then and how things might have changed.
The RNG took me all the way back to September of 2021 this time, so let's see what I had to say in "Cryptonite" ...
...a controversial and divisive topic, one that has been known to destroy friendships, end marriages, and divide siblings. Yes, I'm talking about... cryptocurrency.
So not much has changed.
Unsurprisingly, the original article link, from Cracked, is still there.
It's almost inevitable that I've picked up a few things in the year and a half or so since I wrote the entry, but I still won't touch crypto, not even with someone else's pole.
If I remember the timeline correctly (and I probably don't), it was the following February that crypto was pushed heavily during the big sportsball game commercials. Not that I saw them, but it's hard to avoid hearing people talk about them. After that, the price of the leading cryptocurrency, bitcoin, crashed.
So, not trusting my own memory, I looked up this chart, which tracks btc vs usd for the last 2 years.
Such are the perils of hindsight (and my memory). Someone could easily have come up to me a month after I wrote that September entry and crowed, "You're an idiot. The value of bitcoin has skyrocketed!" Because it had. And then I could have come back pretty much any time after that and said, "No, you're the idiot," because it never again reached the stratospheric level of October 2021 prices. Or even September.
Thing is, though, that whatever the value of it, the fact that it's quoted in US dollars doesn't say much about the intrinsic value of Bitcoin, but it says a lot about the intrinsic value of the USD.
It's basically a gamble. Not that I think there's anything inherently wrong with gambling; I do it, myself—on an occasional basis, and always with a self-imposed limit that is so low that it's basically my entertainment budget. Other people go to sporting events or rock concerts or Broadway plays; I sometimes go to casinos. So it would be hypocritical of me to dismiss it just because it's gambling.
"But the stock market is also a gamble, Waltz!"
Well, no. I mean, it can be, if you focus on short-term trading, which is what I think most people do because you're always hearing about day traders and the Robinhood app or whatever. But there's a big difference between investing—that is, long-term buy and hold, much less sexy than the adrenaline rush of day trading—and gambling.
A stock generally has some intrinsic value (which is almost always lower than the share price). This is because it represents partial ownership in a company, which will ideally have assets and a semi-reliable income stream, putting a floor on its valuation. Sure, sometimes individual companies get caught cooking the books (Enron, e.g.) or their income stream dries up, or they take on more debt than they have assets, and the share price falls to near zero. But more often, they have an interest in keeping on making money, which drives up the share price because people think it'll be more valuable in the future (this is a gross oversimplification, I know). Overall, the latter outnumber the former, which is why investment advisors harp on about diversification: the wins tend to more than make up for the losses.
There's still risk. But the analogy I like to make is that speculation and short-term trading are like playing in a casino, while long-term investing is like owning the casino. You can still fail, but you have the house edge.
But what's the actual inherent value of crypto? From what I can tell, it's pure speculation.
Still, no, it's not the gambling aspect that keeps me away. It's the same reason I shy away from gold trading: advertising. Also, to a lesser degree, not knowing the odds.
It's not just my inherent hatred of ads, either. Consider: if someone pays a premium for a 30-second spot during Big Sportsball Game, that means that they: a) want your fiat currency more than they want whatever they're selling; and b) think that they can make enough fiat currency to offset the high cost of said commercial. Now, to be fair, sometimes what they're promoting isn't the crypto itself, but a new way to store it. Like banks, which run commercials all the time. Since I wrote that original article, at least some of those crypto exchanges have utterly failed, the most public of which was the very well-covered fall of FTX. Still, even in that case, point b above applies.
So if someone is promoting, say, Dogecoin on TV, they're already giving away their game, which is "I'd rather hold dollars than dogecoins." If dollars are more valuable to them, taking into account the cost of advertising, why shouldn't they be more valuable to you?
I'll never ding someone just for wanting to gamble, because, like I said, I do it myself. Though I have an awareness of the odds, and I don't treat it as a get-rich-quick scheme.
But hey, there's at least one actual, legitimate, and valuable use of Bitcoin: Last time I checked, which was around December, WDC is taking it as payment for membership. |
|
I was hoping something would come up from the depths of my blog fodder folder today that was somehow related to Earth Day, so that I could once again remind everyone that we're doomed and there's nothing we can do about it, so we might as well just enjoy the slide into oblivion.
Close enough.
Storm Thorgerson and George Hardie’s image of a light shining through a prism and emerging as a spectrum of color—pitched dead center, slightly raised, cast against a black backdrop—is probably the most famous in rock history.
One might even say... iconic? Younger Me spent many an hour listening to the album while contemplating the relationship of the cover art to the music. And I never did come to a satisfactory conclusion. Like most album art at the time, I figured it was either an in-joke or something way beyond my teenage depth, like a New Yorker article.
Only thing I could come up with was that the traditional rainbow has seven colors and a musical scale has seven notes. But it couldn't have been as simple as that, not from Pink Floyd... could it?
I hadn't tried drugs yet.
There's a lot more at the link, and even if you're not a big fan of Pink Floyd, it's an interesting (if somewhat circuitous) read.
...the band spent four years indulgently experimenting, with mixed results, before it pared back, tightened up, and achieved global fame with the breakthrough of The Dark Side of the Moon in 1973.
Some of their output before Dark Side is... how shall I put this charitably... unlistenable. But it did give the world such glorious song titles as "Several Species of Small Furry Animals Gathered Together in a Cave and Grooving with a Pict," so it wasn't a total bust.
The week of Dark Side’s release, the no. 1 album in both the U.S. and U.K. was Elton John’s Don’t Shoot Me I’m Only the Piano Player, whose singles, “Crocodile Rock” and “Daniel,” demonstrate the two dominant trends of that moment. The first song is a kitschy, glammy celebration of rock ’n’ roll; the second is a rippling, downbeat soft-rock song about a wounded Vietnam War vet returning home.
As different as they are, I want to emphasize that Don't Shoot Me is also on my list of greatest albums of all time.
So Dark Side is likely something that someone would be exposed to for the first time as a teenager, which also means it’s exactly what someone would play when they’re getting into weed.
As I mentioned above, I became a Pink Floyd fan before I smoked a single joint. I appreciated the music for what it was, absent mental amplification.
That said, yeah, okay, weed helps. But if I had to choose between Pink Floyd or weed, I'd pick the former every time.
The album begins at birth and then presents the three major preoccupations of modern life—the scarcity of time, the dominance of capital, and the conflicts between people and nations—before ending by basically asking listeners, “You will go mad and die, so how will you resolve these challenges?”
I noticed that thematic progression on, I don't know, my thirty-third or thirty-fourth listening. Figured maybe it was just me projecting my own biases onto the music, like it was a musical Rorschach test or Pollock painting.
For the album’s 40th anniversary in 2013, the acclaimed playwright Tom Stoppard wrote Darkside, a comedic radio play about various moral philosophers (Nietzsche, Kant, Hobbes) that’s scored by the entirety of The Dark Side of the Moon.
You know that, now that I know about it, I have to find it.
Few, if any, teenage totems have come close to Dark Side’s longevity and reach. Its claimed sales are 45 million copies, making it the fourth-best-selling album of all time.
I was curious, so I looked it up: the three that beat it are Jackson's Thriller (fair), AC/DC's Back in Black (also fair), and the soundtrack to The Bodyguard (wtf?)
I would be remiss if I didn't point out #5 and #6, which are, respectively, the Eagles' Greatest Hits and Meat Loaf's Bat out of Hell, both of which are on my own list of greatests. (Source)
Anyway, the article delves into the (sometimes unfortunate) history of the band and its members, and I won't quote much more from it. I just want to point out this bit:
Perhaps no example of the mythos around Dark Side is as well known, or as ridiculed, as The Dark Side of the Rainbow, the act of syncing up Dark Side to The Wizard of Oz. In 1995, an article in The Fort Wayne Journal Gazette described how specific songs complement scenes from the first 45 minutes of the 1939 movie.
Obviously, I knew about this, but never could be arsed to take the time to sync the album to the movie, myself. Fortunately, someone did that and posted it on YouTube; the video is embedded in the article I'm referencing here. But a few years ago, NBC released a show called Emerald City, which was—how to describe this—a dark and gritty postmodernist reimagining of Wizard of Oz. It was roundly panned and canceled after one season, but I liked it—at least at first; it later devolved into a Game of Thrones parody. In it, Dorothy (with Toto, who's either a German shepherd or Belgian Malinois, I forget, in this one) meets the Wizard, who is also from Earth, maybe halfway through the season. And during this peak, important, emotional scene, when she encounters the Wizard (played by Vincent D'Onofrio, who is excellent in everything), he's got Dark Side of the Moon playing.
They knew. Damn right they knew.
There's a lot more in the article, but I'll just leave you with this tribute video below. I've posted it before, but it's very appropriate here: an AI-generated music video for the entire album, in honor of 50 years of Dark Side.
"There is no dark side of the moon really. Matter of fact it's all dark."

April 21, 2023 at 11:26am
Oh good. If this prompt for "Journalistic Intentions" [18+] hadn't come up at random, I might have had to write an extra entry.
boning
Ever boned a fish?
I haven't, because I don't fish, but I've seen people do it. And not on one of those websites, either. Seems like a lot of work, though as with anything, practice improves skill. Some fish just don't like to be boned, though, which can lead to oral discomfort or even penetration.
The word "deboning" in this context means the same thing, which is just one of those weird things about English, like flammable/inflammable or contronyms like "cleave."
But for this blogging activity, which is all about clothing, we're not talking about fish, or anything we do with our clothes (mostly) off. And for once, I don't have to do extensive research to know what it means.
Not fish, then. But whales, which live with fish but aren't them. Back before the inspired invention of the brassiere, for a while, corsets were all the rage. And back before plastics (see previous entry on nylon), corsets were stiffened by whale bones.
Which is oddly appropriate. See, "corset" comes from an old French word meaning "body." The modern version is "corps" (pronounced more like "core" just like with the Marine Corps, and not to be confused with "cœur," which translates to "heart" and is pronounced more like "curr," as in "no1curr") and from one of these we get the English "corpse." So you had people (it's associated with women's wear but for a long time all genders wore a kind of corset) wearing parts of whale corpses to support part of their own living bodies. An endoskeleton turned exoskeleton.
We (mostly) don't kill whales for fashion, or lighting, or meat, anymore. Instead, we use older corpses. Not of dinosaurs, per popular misconception and one oil company's brontosaurus logo, but ancient marine life. They're pumped out of wells, refined, and some of them get molded into plastic boning (yes, people still wear corsets; see any RenFaire).
Eventually, and not too far in the future, we'll run out of the corpses that took hundreds of millions of years to accumulate.
And then, we're well and truly boned.

More proof that we're utterly obsessed with size:
Out there in the Universe, size definitely matters.
Always lead with a subtle dick joke.
Objects that are stable, both microscopically and macroscopically, are described by measurable properties such as mass, volume, electric charge, and spin/angular momentum.
And, lately, political affiliation.
But “size” is a bit of a tricky one, particularly if your object is extremely small.
Snort.
After all, if all the mass and energy that goes into making a black hole inevitably collapses to a central singularity, then what does the concept of “size” even mean?
Obvious jokes aside, I've long wondered how you can measure the size of something that warps spacetime, which makes the concept of "size" very slippery (as if covered with K-Y).
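As it happens, the simplest working answer for a non-rotating black hole is the Schwarzschild radius, r_s = 2GM/c², the radius of the event horizon. A quick back-of-the-envelope sketch (constants are approximate, and this is my gloss, not the article's):

```python
# Schwarzschild radius: the event-horizon radius of a non-rotating black hole.
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2 (approximate)
c = 2.998e8       # speed of light, m/s (approximate)
M_SUN = 1.989e30  # solar mass, kg (approximate)

def schwarzschild_radius(mass_kg: float) -> float:
    """Return the event-horizon radius in meters for a given mass."""
    return 2 * G * mass_kg / c**2

# Squeeze the Sun's mass into a black hole and its "size" is about 3 km:
print(f"{schwarzschild_radius(M_SUN) / 1000:.2f} km")  # roughly 2.95 km
```

Which is one reason "size" gets slippery: the Sun itself is about 700,000 km in radius, so the same mass can have wildly different "sizes."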
The first thing you have to know about a black hole is this: in terms of its gravitational effects, especially at large distances away from it, a black hole is no different from any other mass.
I've mentioned something like this before, I know, but can't be arsed to find it right now. Black holes have some legitimately scary properties, but if they were going to eat entire galaxies, they'd have done it by now.
After all, we’re taught that black holes have an irresistible gravitational pull, and that they suck any matter that comes too close to their vicinity irrevocably into them. But the truth is that black holes don’t “suck” matter in any more than any other mass.
I'm not well-versed in the math involved, but there's a region around a black hole within which orbits decay in a predictable way. Outside that region, black holes act like any other mass.
This is complicated by the prevalence of gas, dust, and debris in the vicinity, all of which would cause any orbit to eventually decay.
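I can't resist sketching what that region probably is: for a non-rotating black hole, general relativity puts the innermost stable circular orbit (ISCO) at r = 6GM/c², three times the Schwarzschild radius; inside it, no stable circular orbit exists and matter spirals in. A rough sketch (constants approximate, and again this is my gloss, not the article's):

```python
# ISCO: inside this radius, no stable circular orbit exists around a
# non-rotating (Schwarzschild) black hole.
G = 6.674e-11     # m^3 kg^-1 s^-2 (approximate)
c = 2.998e8       # m/s (approximate)
M_SUN = 1.989e30  # kg (approximate)

def isco_radius(mass_kg: float) -> float:
    """Innermost stable circular orbit, equal to 3x the Schwarzschild radius."""
    return 6 * G * mass_kg / c**2

# For a solar-mass black hole, about 8.9 km:
print(f"{isco_radius(M_SUN) / 1000:.1f} km")
```

Outside that radius, orbits behave just as they would around any other mass of the same value, which is the article's point.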
When a black hole rotates, it no longer has just one meaningful surface that’s a boundary between what can escape and what can’t; instead, there are a number of important boundaries that arise, and many of them can make a claim to being the size of a black hole, depending on what you’re trying to do. From the outside in, let’s go through them.
The rest of the article does just that; it's very interesting, and only a little bit math-oriented, but there's no need for me to comment on it. After all, I just wanted to make the point (pun intended) that even science writers are not above making the occasional dick joke.

April 19, 2023 at 10:57am
Today, another entry for "Journalistic Intentions" [18+]:
Nylon
"There's a great future in plastics. Think about it. Will you think about it?"
-Mr. McGuire, The Graduate
If Dustin Hoffman's character had actually heeded those words (frankly, I don't remember enough about the movie to know whether he did or not), he might have found himself in trouble. Shortly after the movie was released, "just one word: plastics" started to fall out of favor.
But there was a time when nylon, and other plastics, were heralded as positive agents of change.
As with most innovations of the 20th century, this was largely driven by marketing, with a hint of racism thrown in (nylon was heralded as an alternative to silk, which came from those countries.)
I don't need to rehash the Wikipedia page for nylon, so for facts (or near-facts) and history, you can go there. Warning: a lot of it is chemistry. You can skip that part. I did, and I have some knowledge of chemistry.
But I do want to focus on one nearly throwaway sentence from that article:
In 1946, the demand for nylon stockings could not be satisfied, which led to the Nylon riots.
The... Nylon... RIOTS?
This, I gotta learn more about.
The nylon riots were a series of disturbances at American stores created by a nylon stocking shortage.
Anyone else spot the inherent pun there? Anyone? I'll ruin it by explaining it: they had trouble stocking stockings.
During World War II, nylon was used extensively for parachutes and other war materials, such as airplane cords and ropes and the supply of nylon consumer goods was curtailed.
I'm just going to leave this here and hopefully remember to, at some point, research the effects of the rise of synthetic fabrics on the demonization of hemp. I can't help but think there was a connection, but it's irrelevant to this discussion.
The riots occurred between August 1945 and March 1946, when the War Production Board announced that the creation of Du Pont's nylon would shift its manufacturing from wartime material to nylon stockings, at the same time launching a promotional campaign.
On the other hand, I think I have a pretty good grasp on the link between overpromising stuff in promotional campaigns, and consumer riots.
It is imperative that I add: it sounds like there was a run on stockings.
In one of the worst disturbances, in Pittsburgh, 40,000 women queued up for 13,000 pairs of stockings, leading to fights breaking out.
If only we had camera phones then. Imagine the beautiful videos that would have resulted.
It took several months before Du Pont was able to ramp up production to meet demand, but until they did many women went without nylon stockings for months.
Imagine the sheer horror (pun intended).
During World War II, Japan stopped using supplies made out of silk, and so the United States had difficulty importing silk from Japan.
Pretty sure there were other reasons why importing silk from Japan wasn't possible during WWII.
Nylon stockings became increasingly popular on the black market, and sold for up to $20 per pair.
WTF, Wikipedia. Dollar amounts are useless unless we can compare them to today's dollars. Here, I'll help: $20 in 1945 is like $335 today.
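For the curious, that conversion is just a ratio of consumer price indices: multiply by CPI_now / CPI_then. A sketch, with approximate annual-average CPI-U values (the exact figures here are my assumptions; substitute your preferred index):

```python
# Inflation adjustment via CPI ratio: price_then * (CPI_now / CPI_then).
CPI_1945 = 18.0   # approximate annual-average CPI-U for 1945 (assumption)
CPI_2023 = 303.0  # approximate CPI-U for early 2023 (assumption)

def adjust_for_inflation(amount: float, cpi_then: float, cpi_now: float) -> float:
    """Convert a historical price to its rough present-day equivalent."""
    return amount * cpi_now / cpi_then

print(round(adjust_for_inflation(20.0, CPI_1945, CPI_2023)))  # roughly 337
```

Same ballpark as the $335 figure; the spread depends on which month's index you pick.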
There's more at the link. Maybe you'd heard of them; I had not, and despite the very real plight of the American stocking-wearers (or, technically, stocking-not-wearers) in the mid-1940s, I had a good laugh reading the section.
I have no idea what the current status of demand for nylon stockings is. I don't see them much, but I don't go to department stores or frequent the kinds of events where people would wear stockings or hose or pantyhose or whatever. Most of the ladies I see pictures of on the internet don't seem to wear them, but that may be selection bias, as most of the ladies I see pictures of on the internet don't wear much of anything.
No, I think people these days prefer "natural" fabrics, but not fur, though fake fur can be made of nylon or nylon blends. I mostly see nylon in more bulk form, not fabric: fishing lines, zippers, even car parts. The first Wiki link above lists a bunch of uses. As a plastic, it's very versatile. But as a fabric? I can't remember the last time I could identify a pure nylon fabric. I think I had a raincoat made of it as a kid, but I'm not sure even of that.
I guess you could say that if you're looking for nylon stockings today, you might be hosed.

April 18, 2023 at 10:09am
Ah, a rare example of the wild Cracked article that I have Issues with.
I can only expect so much from a dick joke site, though.
Science, at least the way it’s taught to us at an elementary and high-school level, feels like it’s all about proof.
That much, I can agree with. The proof thing is wrong, but the statement is fairly accurate.
In reality, there’s a huge amount of science that, even after tests and studies, remains at best a heavily supported theory.
*record scratch*
Couldn't even finish the first paragraph before I found the first Issue.
I know I've written this before, but in science, a "theory" doesn't mean the same thing as when you or I use the word. I can't be arsed to go into detail right now, but "a heavily supported theory" is what used to be known as a "law" in science, as in "Newton's Second Law." It's based on observation and/or experiment (it's hard to do experiments on, say, a planetary nebula) and makes predictions. When lots of those predictions are confirmed, as is the case with the theories of relativity, gravity, quantum mechanics, and evolution—to name a few—it's basically settled science.
Which doesn't mean it can't be refined, or even overturned. And some have a higher confidence level than others. But science is all about theory (in the scientific sense).
And hey, keep in mind that this is likely going to be a fairly surface level exploration from a verified Bachelor of Fine Arts holder. Scientists, I personally invite you to absolutely go off in the comments.
I'm not a scientist, and I'm not commenting there. Instead, I'm ranting here.
4. Planet Nine/Planet X
Planet Nine is the name given to a theoretical, massive planet deep out in the reaches of our solar system, and a name that truly salts Pluto’s fresh wounds of being kicked out of the club.
Again with the Pluto thing. I've ranted about that before, too. But this is an example of something that's neither right nor wrong. It's not "theory" (in either sense) that demoted Pluto to dwarf planet status. It's us humans needing to characterize and organize things.
The reason scientists think it exists comes from that classic moment that sparks so many discoveries: a scientist going, “Well, that’s weird.”
Or if they're being less flippant about it, "This observation does not conform with previous observations."
Long ago, scientists looked at the orbit of Mercury and said, "Well, that's weird." So they proposed several solutions, one of which was an as-yet-undiscovered inner planet, which they named Vulcan. I'm going to pause here so you can think of Spock jokes.
Ready?
Okay. Turns out the weirdness was because their theory of gravity was incomplete. Once Einstein figured out the whole "relativity" thing, the oddities of Mercury's orbit were explained perfectly (or nearly so), and Vulcan disappeared. Not that it was ever there. The idea disappeared.
All of which is to say that there's some evidence that's leading some scientists to believe the theory of gravity needs another tweak. Would this account for Planet Nine? Hell if I know.
3. Dark Matter
Decent SF show from Netflix. Oh, wait, they mean actual dark matter.
For a pretty complicated scientific theory, it’s surprising how commonly known of a term “dark matter” is. This can most likely be attributed to it sounding absolutely bad-ass, more like something the Avengers have to steal from a space prison than something based on high-level mathematics.
That's a pretty common issue with the unknown: it allows writers to speculate, after which readers/viewers might think the speculation is actual fact. I've seen shows where "dark matter" is used to produce all kinds of plot motivations, such as superpowers.
Again, some peculiarities in gravity are what tipped off science to the idea of this unobserved type of material.
While this isn't the whole story of dark matter, it's not a bad brief summary.
This, incidentally, is why I said above that some scientists believe the theory of gravity needs another tweak. Maybe dark matter isn't matter at all, but some other effect.
This is analogous, I think, to the old idea of the luminiferous ether. Basically, back when the ascendant idea was that light was a wave, they were stuck on the idea that it needed material to wave through, like a sound wave needs matter to propagate, or an ocean wave needs water. The ether, then, had to have certain properties, not the least of which was being entirely, or almost entirely, intangible to matter, or else the planets' orbits would decay rather quickly.
Again, though, Einstein saved the day, and destroyed ether theory with the observation, in this case, that light had properties of both wave and particle, and didn't need matter to propagate through.
Now, I might have fallen into the same trap as this Cracked article, because I'm not an expert. But even if I got the details wrong, my point stands: dark matter might not be matter at all; the observations could fit another theory entirely. One which hasn't been discovered.
Which is a good thing. I'd hate to reach the limits of discovery.
2. Unknown Elements
Another staple of science fiction.
You'll have to read the article on this one, I'm afraid, but it always bugs me when SF shows throw out "new element" as if you can slot something between an n-proton atom and an n+1-proton atom.
Which doesn't mean that there can't be exotic matter, elements made of something other than protons, neutrons, and electrons. But that's mostly speculation, too.
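If it helps, the reason there's no "between" is that an element's identity is its atomic number Z, an integer proton count. A toy sketch (the three-element table below is obviously just an illustration, not real chemistry software):

```python
# An element is identified by its atomic number Z: a whole-number proton count.
ELEMENTS = {26: "iron", 27: "cobalt", 28: "nickel"}  # a tiny slice of the table

def element_name(protons: int) -> str:
    """Look up an element by proton count; there are no fractional slots."""
    if protons not in ELEMENTS:
        raise KeyError(f"no element with Z={protons} in this toy table")
    return ELEMENTS[protons]

print(element_name(26))  # iron
# There is no Z = 26.5: proton counts are integers, so "between iron and
# cobalt" is not a place where an undiscovered element could be hiding.
```

Undiscovered superheavy elements can only extend the table past the largest known Z, never squeeze in between.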
1. Pathogenic Fungus Capable of Infecting Humans
This one was quite popular recently. There was a whole show about it, the most impressive thing about which was that it was apparently a good show based on a video game, which is rarer than unobtainium (see what I did there?)
If you were hoping this was nothing more than a spooky story, I have bad news: It’s very possible.
In the broadest sense of "possible."
Of course, our body temperatures would have to lower first, or fungi would have to evolve to live at higher temperatures. Well, why not both?
Sure, there's nothing in principle that would prevent either of these things.
Climate change has been forcing fungi to adapt to naturally higher temperatures, and human body temperatures have been dropping, with humans today having a body temperature of approximately 1 degree Fahrenheit lower than in the early 1800s.
And this is my last Issue (promise). The drop in human body temperature can be explained by improvements in medical care over the last 200+ years. A "human body temperature" is necessarily an average of a sample, and back in the early 19th century, that sample would be more likely to include individuals with low-grade infections that cause fever.
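The sampling effect is easy to see with made-up numbers: mix a few untreated low-grade fevers into an otherwise healthy sample and the average creeps up. (Every figure below is invented purely for illustration.)

```python
# Toy demonstration: including feverish individuals raises a sample's mean.
def mean(xs):
    return sum(xs) / len(xs)

healthy = [97.9] * 90    # 90 people at a modern "normal" temperature, deg F
feverish = [100.4] * 10  # 10 people with untreated low-grade fevers (assumption)

modern_sample = healthy             # better medicine: fevers treated or absent
older_sample = healthy + feverish   # 19th-century-style sample

print(round(mean(modern_sample), 2))  # 97.9
print(round(mean(older_sample), 2))   # 98.15 -- a quarter-degree higher
```

No actual physiological change needed: just a change in who gets counted while sick.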
That's the flip side of there being things we don't understand: we can use them to scare the living shit out of each other.

Time for another clothing-related prompt from "Journalistic Intentions" [18+]. Unlike earlier prompts, this time, the RNG gave me something I've actually heard of.
Waistcoat
Now, here's a refreshing change: a name for an article of clothing that actually makes linguistic sense. Sort of. Except that "waistcoat" and "vest" are synonyms. And also, some sources claim the name is a pun: the vest was made from the waste material from tailoring the jacket and pants, hence, waste-coat. As it hugs the waist (not to be confused with a cummerbund, which is worn around the waist), the pun was really inevitable.
I've long been operating under the assumption that it's a "vest" when it's the outer layer of an outfit, and "waistcoat" when it's worn as, say, part of a three-piece suit. Three-piece suit is, itself, a misleading phrase, as one almost always also wears socks, shoes, a shirt, and a tie, making it, at the very least, a seven-piece suit. Not to mention underwear (bottom and top) (optional), cufflinks, a belt or suspenders, a hat (some of us still wear them), and my favorite, the pocket square. Though I'll grant that you normally buy those accessories separately, with the trousers-waistcoat-jacket part sold as a unit.
That assumption turns out not to be the case.
Of course, it's possible to have trousers, waistcoat, and jacket be of different materials, colors, and patterns. But then it's not a three-piece suit, is it?
Look, I know more about traditionally men's fashion than women's. This should not be a surprise to anyone.
As an aside, I mentioned up there that "vest" is effectively a synonym for "waistcoat." This isn't strictly true. In the UK, a "vest" is what we Yanks unfortunately have taken to calling a "wife-beater," from the stereotype of the man wearing only a sleeveless undershirt who engages in domestic abuse. This is, of course, not the only fashion-related linguistic trap between US and UK English. "I'm wearing pants" has a whole different meaning, too.
Adding to linguistic confusion is that "to vest" is a verb that has several meanings, including "to put on clothes." Whether those clothes involve waistcoats or not.
All of which is to say that even in this relatively simple case, nothing in fashion is as straightforward as one might think.
For example. I've been wearing suits (occasionally) (yes, I own suits) since I was a kid, and it took me until like 2019 to discover that you're not supposed to close the bottom button of a waistcoat. I knew about that rule with jackets, but no one told me about vests.
As an engineer, this really annoys me beyond all reason. While I understand, intellectually, the idea of nonfunctional junk on clothing (the rivets on a pair of Levi's served a purpose once, but nowadays they're just tradition, for example), why make it look like you're supposed to button something when you're not?
The only answer I can come up with is that they do it as a visual kind of shibboleth: if you don't conform, then those in the know have yet another reason to look down upon you.
Which is why I love pocket squares. They make excellent gags.

The random number generator for my trips to the past led me, today, to that rarest of all beasts: a short blog entry.
This one's from January of 2009: "Calvin tribute"
You'd expect, given the proliferation of short-attention-span media such as Twatter, that brevity would be more popular. But no, the view count on this one is actually somewhat lower than for many of my weightier tomes from the same era.
Anyway, since it's short, there's not much to talk about, but I'm going to try anyway.
First, the link: it's dead. Not too surprising, given its age, but the website it was from still exists. Just not the article. Nor did I do my usual (even for the time) selective quoting, so I can't recall exactly what it was about. It was not, as the title may suggest, referring to the stick-in-the-mud Protestant proponent of predestination, but to the play-in-the-mud comic strip kid.
Doesn't matter much. People (including me) are still singing the praises of this now-classic comic strip. One of my most prized possessions is the three-volume hardcover Complete Calvin & Hobbes, and I still occasionally come across retrospectives and "What's that crazy Bill Watterson up to these days?" articles.
Not to mention that at this old entry, I provide a link to a story I wrote imagining a middle-aged Calvin. That one's still in my port here.
What I've hardly ever mentioned is that C&H was instrumental in cementing my decision to never have kids. He was fun to read about, but, as I recall commenting back when the strip was in its heyday of the 1980s, "Calvin is the kid I wish I'd been, and also the kid I desperately don't want to ever have."
Not that this was the only reason, mind you. There's also laziness, appreciation for uninterrupted sleep, and a strong desire to have things I own remain intact and be wherever I left them. Just to name a few.
But I digress. After all these years, those strips still make me laugh uncontrollably.

April 15, 2023 at 10:36am
I've been wondering what to say about this one.
While the article emphasizes the US, it's been clear to me, thanks to the international reach of the internet, that this is a human problem, not just an American one. But as I live in the US, I'll run with what it says.
It may sound like an insensitive statement, but the cold hard truth is that there are a lot of stupid people in the world, and their stupidity presents a constant danger to others.
That depends on how you define "stupid," I suppose. I think they mean it in the sense of "stubbornly and willfully ignorant," because I fail to see how someone with a genuine cognitive disability (Down syndrome, for example) presents a constant danger to others.
It would not be a stretch to say that at this point in time, stupidity presents an existential threat to America because, in some circles, it is being celebrated.
Which fits "stubbornly and willfully ignorant." Also any politician (of any party) who runs on a "common sense" platform.
Although the term "stupidity" may seem derogatory or insulting, it is actually a scientific concept that refers to a specific type of cognitive failure. It is important to realize that stupidity is not simply a lack of intelligence or knowledge, but rather a failure to use one's cognitive abilities effectively.
Okay, so a better definition than I had.
To demonstrate that stupidity does not mean having a low IQ, consider the case of Richard Branson, the billionaire CEO of Virgin Airlines, who is one of the world’s most successful businessmen.
Clearly, this article was written before Virgin Orbit fell into a black hole.
Still, business failure can't always be attributed to stupidity (as defined here).
Branson has said that he was seen as the dumbest person in school, and has admitted to having dyslexia, a learning disability that affects one’s ability to read and correctly interpret written language.
I've known several people with dyslexia, and this may be some sort of selection bias on my part, but as a whole, they seem to have an above-average intelligence. Mostly, they just have trouble with words and spelling. While I pride myself on my spelling, it's not proof of my intelligence.
Branson’s smarts come from his ability to recognize his own limitations, and to know when to defer to others on topics or tasks where he lacks sufficient knowledge or skill.
I hope his rocket science competition here in the US is taking notes.
Stupidity is a consequence of a failure to be aware of one’s own limitations, and this type of cognitive failure has a scientific name: the Dunning-Kruger effect.
Ironically, people throw around "Dunning-Kruger effect" like they know what they're talking about, even if they don't. Shit, I'm guilty of that, myself.
It is easy to think of examples in which failing to recognize one’s own ignorance can become dangerous. Take for example when people with no medical training try to provide medical advice. It doesn’t take much Internet searching to find some nutritionist from the “alternative medicine” world who is claiming that some herbal ingredient has the power to cure cancer. Some of these people are scam artists, but many of them truly believe that they have a superior understanding of health and physiology.
I'd reverse that. Many of these people are scam artists, but some of them truly believe they have a superior understanding.
There are many people who trust these self-proclaimed experts, and there is no doubt that some have paid their lives for it.
My Platonic ideal of the successful businessperson is not Richard Branson, but Steve Jobs. No, I'm not an Apple fanboi, but consider: he started a company with a couple of other geeks in his garage, and lived just long enough to see it become the most valuable company in the world (by market capitalization), exceeding far older corporations such as ExxonMobil or IBM. Sure, he was, by all accounts, a massive dick, but that's irrelevant to being a successful businessman.
And yet, he might have lived longer if he hadn't fallen for the alternative medicine scams.
What’s particularly disturbing about the Dunning-Kruger effect is that people are attracted to confident leaders, so politicians are incentivized to be overconfident in their beliefs and opinions, and to overstate their expertise.
"The best lack all conviction, while the worst / Are full of passionate intensity." -W.B. Yeats
It is only when we try to become an expert on some complex topic that we truly realize how complicated it is, and how much more there is to learn about it.
As a dedicated dilettante, interested in a broad range of topics yet not willing to delve too deeply into any of them, this is something I have to guard against, myself.
What we are dealing with here is an epidemic of stupidity that will only get worse as divisions continue to increase. This should motivate all of us to do what we can to ease the political division.
Sure, it should. But it won't.
We are all victims of the Dunning-Kruger effect to some degree. An inability to accurately assess our own competency and wisdom is something we see in both liberals and conservatives. While being more educated typically decreases our Dunning-Kruger tendencies, it does not eliminate them entirely. That takes constant cognitive effort in the form of self-awareness, continual curiosity, and a healthy amount of skepticism.
Clearly, I'm better at this than anyone else.
That's a joke. I'm deliberately overstating my competence to demonstrate the D-K effect.
The article is optimistic about being able to fix this. But there's only one thing more dangerous than willful stupidity, and that's optimism.

April 14, 2023 at 11:31am
Another fashion entry, courtesy of "Journalistic Intentions" [18+]...
Swing
Wait, wait, hold up.
"Swing" has like 48 definitions, and exactly zero that I'm aware of fit the fashion theme. (But I know at least four that refer to sex.)
Time, then, to become aware.
For the first time in history, Wikipedia fails me here. There's the usual disambiguation page, which parses out many of the 48 (I pulled that number out of my ass, but it doesn't seem to be far off from reality) possible meanings of the word "swing." Not one of these refers to clothing.
So let's try that old standby, Merriam-Webster.
Nope.
Now, look, I'll save us both some time here: best I can figure out, the only clothing-related use of the word "swing" that I could find anywhere refers to a dress style from the 1950s.
I hadn't been born yet; I'm generally bewildered by nuances of fashion; I don't wear dresses; I've never been in a serious relationship with someone who regularly wore dresses. So my lack of knowledge here shouldn't be surprising (though it's at least partly fixed now). Still, it makes me wonder why both Wikipedia and one of the leading American dictionaries failed me.
Apparently, from what little I can piece together (most of the search results I found were vendors, which are even less reliable than Wikipedia, and hopefully this morning's frantic Googling will confuse the living shit out of someone at the NSA), the swing dress was for dancing to... wait for it... swing jazz.
Which I have heard of, because unlike fashion, music is an actual interest of mine. Though I'm not generally fond of jazz.
But there is one advantage of ignorance: I can use it to make shit up.
So, obviously, the first thing I thought of was that it might be the clothes that swingers wear. As that might be ambiguous, I'll note that by "swingers," I mean the partner-swapping scene. But that wouldn't make sense; as far as I know, swingers don't use clothing to differentiate themselves from monogamists.
Maybe... clothes to wear on a swing? When I was a kid and could still fit on swings, we just wore whatever we were wearing. Which led to quite a few skinned knees.
Or, ooh, I know! It's clothing that'll change your political affiliation. "Here, wear this vest. It'll make you vote for the Libertarian and like it."
Compared to even my lousy imagination, though, the reality is—as per usual—quite disappointing, at least to me. That is, of course, assuming that I'm somehow right, which is a massive assumption. It's disappointing to me because, to me, there are only three kinds of dresses: 1) Wedding dress; 2) Bridesmaid's dress (look, I used to be a wedding photographer); 3) Dress.
I don't believe in only learning shit about shit that affects me. But even I can't hold unlimited information in my noggin, which is why things like search engines and dictionaries and encyclopedias are my friends (and auxiliary memory). Even an established nerd like me needs to let some things pass unnoticed, or I'd be overwhelmed with trivia. It's annoying and frustrating to me when such references fail.
But hey, at least I learned something new today. Again.
|
I have a system (not a perfect one) in place to avoid duplicating links here. I also try not to repeat entry titles, but with 2300 entries, I know I've got some duplicates and maybe even triplicates.
What's worse, though, is when I thought I did an entry about something and now I can't find it. Such is the case with famed proto-astronomer Tycho Brahe. I mentioned him briefly in an entry a while back in response to someone's comment, but that seems to be the extent of things.
Until now.
Unlike most Cracked articles, this one's not a numbered list.
To non-astronomers, there aren’t a huge amount of A-list astronomers. Most people could probably name Copernicus, Galileo and Hubble, but realistically, they learned those names from the dog in 1955 in Back to the Future, the lyrics to Bohemian Rhapsody and a big-ass telescope that occasionally features in “yo mamma’s so fat…” jokes.
Hubble's successor, the JWST, is not named after an astronomer but an administrator. Which I think is a step down, and that's not even getting into the controversy around Webb's policies.
Amazing telescope, though.
One guy whose name might ring only a faint bell is Tycho Brahe, a 16th-century Danish astronomer who made some remarkable innovations that helped to bring about the Scientific Revolution.
Speaking of telescopes, he did all his work without one because they hadn't been invented yet.
For exactly how influential this guy was, there's the article, and of course also the Wiki page about him (which may or may not be less error-prone). I'm skipping it because the clickbait headline is about his personal life (and death), which was almost as interesting. I will note, however, that his contributions were significant enough for later astronomers (ones with telescopes) to name one of the largest and most recognizable lunar craters after him.
Speaking of cutting edges, Brahe had his nose sliced off at the age of 20. He got in a duel with his third cousin, who accidentally sliced a big chunk of his honker off...
And this is what he's probably most well-known for. No, I have no idea how he smelled.
The big-brained, metal-nosed polymath undone by humble urine.
No, he didn't discover Uranus. That would have to wait for telescopes.
The story goes that Brahe was at a banquet and needed a pee, but etiquette forbade him from leaving, so he held it in. Unfortunately, when he got home and could safely drain the lizard, nothing was forthcoming.
Sounds to me like he wasn't done in by piss, but by stringently following etiquette. The cautionary tale here is: fuck etiquette.
And then, in the words of Victor Thoren and John Christianson’s 1990 book The Lord of Uraniborg: A Biography of Tycho Brahe, “through five days and nights of sleepless agony, he pondered the agony of paying so great a price for, as he thought, having committed such a trivial offense.”
Uraniborg is unrelated to urine. It was, if you read the text I'm leaving out here, one of his pre-telescope observatories. Nor is it related to Uranus (other than maybe etymologically), or the Borg ("Borg," by my understanding, is or was a Danish word for "castle.")
He had several more days of pain, fever and delirium before dying. He was 54, and had spent 38 years documenting the stars.
Which is approximately 39 years longer than I can ever concentrate on one subject.
Just in case readers might be worried that this could happen to them, the article reassures us:
“The good news is that, while uncomfortable, occasionally holding back a pee shouldn’t harm us,” explains Ajay Deshpande, senior clinical lead at London Medical Laboratory.
Also, medical science has advanced somewhat in more than half a millennium. Hell, they can even reconstruct noses now. To your specifications. Which makes a plastic surgeon's office the only place where it's socially acceptable to pick your nose.
In other words, it isn’t out of the question that the father of astronomy sat on his keys, filled up with pee and died. All that time looking up, not enough looking down.
Perhaps this inspired Oscar Wilde's famous quotation: "We are all in the gutter, but some of us are looking at the stars."
|
Another entry for "Journalistic Intentions" [18+] today:
Charmeuse
Unlike some people, I'm not proud of my ignorance. Nor do I pretend I don't have any. It's always a great moment for me when I learn something new, even if it's about a topic I don't have a lot of interest in... such as fashion, which is the theme of this month's JI.
So until these prompts were posted, I don't think I'd ever even seen or heard the word "charmeuse." And I also restrained myself from looking it up, just in case it became one of the eight out of sixteen prompts that I'd end up picking at random and I could proclaim my ignorance in public.
And behold, it was.
So let's see. Before I look it up, I have some guesses.
The word "Charmeuse" is in the Fabrics section of the prompts, along with nylon, silk, and wool—all of which I am familiar with, so why couldn't I have gotten one of those... oh well, I'd have less material (pun absolutely intended) to work with. So I will start out by assuming (and shut up about "ass-u-me" already; we all make assumptions all the time just to get by) that charmeuse is a fabric.
Easy enough. What else? Well, the word seems French, so my next assumption is that the initial sound is more like sh as in shit than ch as in chit. Look, even after three and a half years of French lessons, I don't consider myself fluent, but I see patterns (though I have no idea, as yet, if charmeuse is a patterned fabric or not). And if I had to take a stab at "charmeuse," I'd guess it's the feminine word for "charmer" or "one who charms" (the masculine version being most likely "charmeur" if it indeed exists in the language). This is like how "singer" (the vocal thing, not the fabric sewing machine company) can be translated as chanteur or chanteuse.
I should emphasize that I'm using masculine and feminine in their linguistic meanings, not the sociopolitical meanings, which have become a minefield.
Alternatively, it refers to someone who prefers Charmin toilet paper, much as "bounty hunter" refers to someone who was searching for paper towels during the pandemic supply shortages.
It being French tracks, because when you think of fashion, you think of Italy, France, and maybe Mobile, Alabama... I mean, New York City, the latter of which uses Italian and French names for fashion to make itself seem superior.
I can also provisionally rule out nylon, silk, and wool as being synonyms, so it's probably a completely different material... though my confidence level on that isn't very high, as you also have, say, felt (the fabric, not the past tense verb), which is a different form of wool in much the same way as graphite is a different form of carbon.
And that's the limits of my guesswork. Let's see how I did, courtesy of Wikipedia:
Charmeuse (French: [ʃaʁmøz]), from the French word for female charmer...
Il est d'accord. Point pour moi. ("It agrees. Point for me.")
...is a lightweight fabric woven with a satin weave, in which the warp threads cross over four or more of the backing (weft) threads.
Another thing I learned fairly recently: satin isn't a base material, like wool or silk, but a particular kind of weave. But satin is, as I understand it, most closely associated with silk. So deduct a point for me guessing it didn't have crossover with the other prompts.
I have a vague idea what warp and weft are in relation to fabric, but only vague, and there's only so far I'm willing to go down a Wiki rabbit hole this morning.
Charmeuse differs from plain satin in that charmeuse has a different ratio of float (face) threads, and is of a lighter weight.
"Plain satin?"
Charmeuse may be made of silk, polyester, or rayon.
Cue the "one of these things is not like the others" earworm.
It is used in women's clothing such as lingerie, evening gowns, and blouses, especially garments with a bias cut.
See? I told you bias was everywhere.
It is occasionally used in menswear.
And yet it's still called charmeuse in that context.
There's more detail at the Wikipedia link, though I shouldn't have to remind anyone that I wouldn't use it as a definitive source (and I can't be arsed to follow all those links at the bottom).
Any day when I learn something new is, in my opinion, a good day. Especially when it's source material (yes, I used that pun before) for comedy.
April 11, 2023 at 10:16am
|
Speaking of words...
We’ve all experienced how certain sounds can grate on our nerves, such as the noise made by dragging your fingernails across a blackboard or the cry of a baby...
I suspect most people don't know what their own fingernails sound like dragged across a blackboard. Partly because who uses blackboards anymore, since like the 90s, and also partly because most people aren't evil enough to do it themselves. I, however, know exactly what my fingernails sound like dragged across a blackboard.
Certainly I was a baby, but I don't remember how annoying my cries were. Knowing me, "very."
...but it turns out that the sounds of some words (like “virus”) can also affect how we feel and even give us a clue to what they mean (something to avoid).
I'd expect it to be the other way around: the name for something you want to avoid takes on a bad connotation. But that's why I'm reading the article, isn't it?
This phenomenon, where the sound of a word triggers an emotion or a meaning, is referred to as “sound symbolism”. Yet the idea that there might be a link between the sound of words and their meaning flies against accepted linguistic thinking going back more than a century.
And? Sometimes paradigms get reversed.
In our book...
Of course it's another book ad. What else is free on the internet anymore? Apart from this blog, I mean.
...we outline a radically new perspective on how we, as humans, got language in the first place, how children can learn and use it so effortlessly, and how sound symbolism figures into this.
Okay. Well, that first part sounds like speculation, but okay.
For example, if you pick a language at random that has the concept of “red”, the corresponding word is more likely than not to have an “r” sound in it — such as “rød” in Danish, “rouge” in French, and “krasnyy” (красный) in Russian.
Um.
The way an R is pronounced can be way different in different languages, such as the trilled R of Spanish or the nearly-German-CH sound of the letter in French.
Made-up words can be sound symbolic too.
All words are made up. Some just longer ago than others. And I'd be more surprised if newly minted words didn't have sound symbolism. Like when people go "whoosh" when a joke flies over someone's head.
In a classic study from 1929, the German psychologist Wolfgang Köhler observed that when Spanish speakers were shown a rounded shape and a spiky one and asked which one they thought were called “baluba” and which “takete”, most associated baluba with roundedness and takete with spikiness.
So much for flying against "accepted linguistic thinking going back more than a century."
Computer modelling of how children learn language has revealed that, as a child’s vocabulary grows, it becomes harder and harder to have unique sounds to signal different aspects of meaning (such as that all words relating to water should start with a “w”). Indeed, in a study of English sound-meaning mappings, we found that words that tend to be acquired earlier in development were more sound symbolic than words that are acquired later.
Computer modeling is a powerful tool, but it's only part of science. Also, I'm quite surprised these authors don't go into the "m" sound widely present in words associated with motherhood. Maybe they do, in the book.
I don't have much else to say, really (except "moist"). The article is less substantive than most book ads (which, again, I don't have a problem with here on a writing site), but the little bit of speculation I saw in it doesn't make me want to delve deeper by buying the book... and I'm predisposed to appreciating books on linguistics.
Still, I feel like it's something to keep in mind while writing, especially (but not limited to) writing that's going to be spoken, like a speech or screenplay: that the sound of words matters as well as their literal meaning.
© Copyright 2024 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved. Robert Waltz has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.