Complex Numbers
A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.
The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.
Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.
Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
|
As it is April Fools' Day, how about an article that actually fits with the plain (as opposed to metaphorical) theme of the blog?
Sure, you can skip this if you want; it won't break my heart. But for anyone who's ever laughed at the idea of "imaginary numbers," this one's for you. After all... all numbers are imaginary, in a sense; there's just a subset with the official name of "imaginary."
Mathematicians were disturbed, centuries ago, to find that calculating the properties of certain curves demanded the seemingly impossible: numbers that, when multiplied by themselves, turn negative.
They weren't as disturbed as non-mathematicians, I'm pretty sure of that. Want to disturb a mathematician? Ask one how to divide by zero.
All the numbers on the number line, when squared, yield a positive number; 2^2 = 4, and (-2)^2 = 4. Mathematicians started calling those familiar numbers "real" and the apparently impossible breed of numbers "imaginary."
As I've noted, all numbers are already abstractions. These are just... I guess... more abstract?
Yet physicists may have just shown for the first time that imaginary numbers are, in a sense, real.
Okay, fine; I still say the opposite is true.
"These complex numbers, usually they're just a convenient tool, but here it turns out that they really have some physical meaning," said Tamás Vértesi, a physicist at the Institute for Nuclear Research at the Hungarian Academy of Sciences who, years ago, argued the opposite. "The world is such that it really requires these complex" numbers, he said.
Even my own surface exploration of math and physics has led me to the conclusion that if the math exists, eventually some physicist will find an application for it. I may be wrong, but it's happened too many times to count (see what I did there?)
The earlier research led people to conclude that "in quantum theory complex numbers are only convenient, but not necessary," wrote the authors, who include Marc-Olivier Renou of the Institute of Photonic Sciences in Spain and Nicolas Gisin of the University of Geneva. "Here we prove this conclusion wrong."
Now, I've always heard that complex numbers also show up in, say, electrical engineering. I wouldn't know. Electricity might as well be magic, as far as I'm concerned. I got somewhat familiar with the math involved because the Mandelbrot set is just plain fascinating.
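For anyone curious what "very simple transformations" actually means: the Mandelbrot set comes from iterating z -> z^2 + c and checking whether z stays small. Here's a rough sketch in Python (which handles complex numbers natively) -- the iteration cap and the escape radius of 2 are just the usual conventions, nothing sacred:

```python
# Crude sketch of the Mandelbrot iteration: z -> z*z + c, starting from z = 0.
# A point c is (probably) in the set if z stays bounded after many iterations.
def in_mandelbrot(c, max_iter=100):
    z = 0 + 0j
    for _ in range(max_iter):
        z = z * z + c          # the "very simple transformation"
        if abs(z) > 2:         # once |z| passes 2, it's escaping to infinity
            return False
    return True

# Quick-and-dirty ASCII picture of a chunk of the complex plane.
for im in [y / 10 for y in range(10, -11, -1)]:        # imaginary axis, top to bottom
    print("".join("#" if in_mandelbrot(complex(re, im)) else " "
                  for re in [x / 20 for x in range(-40, 11)]))   # real axis, -2.0 to 0.5
```

Squint at the output and you can make out the famous bug-shaped blob; do it with real graphics instead of ASCII art and the detail along the edge never runs out.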
Anyway, the article goes on to describe the experiment in question, whereupon it quickly loses me.
What does this all mean? Well, nothing much to everyday life. Maybe it adds another layer of flavor to this blog's title knowing that what is imaginary possesses a kind of reality. Like I said, I just find it interesting, so y'all get to read about it here. |
March 31, 2021 at 12:02am
|
Going back to Scientific American for this one.
Just to be clear, here, I haven't used psychedelics. Some people I know have. What I have had is depression, and meds that never worked for it.
In 2012, I had my first psychedelic experiences, as a subject in a clinical trial at Johns Hopkins University School of Medicine's Behavioral Pharmacology Research Unit.
Pretty sure you don't have to go to Johns Hopkins to find shrooms in Baltimore.
Prior to their 1971 prohibition, psilocybin and LSD were administered to approximately 40,000 patients, among them people with terminal cancer, alcoholics and those suffering from depression and obsessive-compulsive disorder. The results of the early clinical studies were promising, and more recent research has been as well.
Some drugs are legitimately dangerous. Some are only dangerous in context; I wouldn't want to encounter a tripper (or a drunk) driving on the road. Many are considered dangerous but probably have medicinal benefits; I've always seen shrooms as one of those. The main reason for prohibition in their case is we can't have people thinking for themselves, making their own decisions, or exploring alternative mental states; if people did all that, where would capitalism find its wage slaves?
Eight years after my sessions, researchers continue to prove the same point again and again in an ongoing effort to turn psychedelic drug therapy into FDA-sanctioned medical treatment.
To be fair, "prove" is a bad word here. Research supports (or doesn't); it can disprove, but it can't "prove."
Psychedelic drug therapy subverts the timeworn patriarchal hierarchy by creating an atmosphere of cooperation and trust rather than competition and domination.
Gender-Studies-like-typing detected. Doesn't mean it's wrong, though.
MDMA (3,4-methylenedioxymethamphetamine) is rapidly proving effective in treating PTSD.
Chemical names can sound scary (if, that is, one can pronounce them at all). They shouldn't. As I've harped on too many times already, everything we ingest is made up of chemicals, and I guarantee you a lot of them have long names. Granted, this one includes "methamphetamine," but it's important to note that MDMA (aka Ecstasy or Molly) is not meth.
The success of the cancer studies has led to investigational treatment for patients suffering from intractable depression, early-stage Alzheimer's, anorexia nervosa and smoking addiction.
On the other foot, there's a long history of Puritanical pearl-clutching in the US, and any medicine that alters one's mental state is frowned upon (including alcohol, which in fairness is probably way more dangerous than cannabis or psychedelics). This alone is probably what's been stalling approval of effective therapies. There's also this persistent idea that we should be self-sufficient, and not turn to altered states of consciousness to relieve the pain of everyday life. That's pernicious.
Clearly, abuse of any drug (or any substance in general) has its dangers, but personally, I'd like to see a follow-up on these studies of psychedelics. I still have little desire to use them recreationally, but this is one thing science is for: to get closer and closer to the truth. Some say psychedelics help with that on a personal level; why not combine the two?
Anyway, check out the article, if you can get past the gender-studies lingo involved. |
March 30, 2021 at 12:01am
|
I've been waiting for a very, very long time to have an appropriate use for this entry's title.
The American Obsession with Lawns
Lawns are the most grown crop in the U.S. -- and they're not one that anyone can eat; their primary purpose is to make us look and feel good about ourselves
Yes, the article is nearly four years old. Still relevant.
It's the time of year when the buzz of landscaping equipment begins to fill the air, and people begin to scrutinize their curb appeal.
Not this people. I have no intention of selling my house or moving, so I couldn't care less what other people think. Well, except for the minimum standards required by local ordinance. Fortunately (and not by accident), I don't live in a place with a Homeowners' Association.
The goal -- as confirmed by the efforts of Abraham Levitt in his sweeping exercise in conformity (although it had been established well before that) -- is to attain a patch of green grass of a singular type with no weeds that is attached to your home.
Mine currently has dandelions. They add color and nonconformity. When they seed, I'm sure my downwind neighbors curse my name and my ancestors.
Why do Americans place so much importance on lawn maintenance?
I've been asking myself that for a very long time. It's always struck me as horribly inefficient and useless.
The state of a homeowner's lawn is important in relation to their status within the community and to the status of the community at large.
Oh, right. Status. Status is always inefficient; that's the whole point.
To have a well maintained lawn is a sign to others that you have the time and/or the money to support this attraction.
I have the time. I have the money. What I don't have is the ability to be arsed.
It signifies that you care about belonging and want others to see that you are like them.
Look, by not having the Perfect Lawn, I am actually doing my neighbors a service. Everyone wants to emulate the best lawn in the neighborhood (well, everyone but me), but they'll settle for not having the worst. By having the worst lawn, I'm lowering the bar so my neighbors don't have to work so hard.
You're welcome.
Many homeowner associations have regulations to the effect of how often a lawn must be maintained.
Hence why I don't live in an HOA neighborhood. Sure, the city has standards; the grass can't be more than 12" tall. That's way more lenient than any HOA standards.
But lawns are a recent development in the human history of altering our environment.
And that's the other thing: caring about the environment is incompatible with caring about your lawn. Mowing uses gasoline, albeit a small amount. Fertilizer gets into waterways and contributes to algae blooms and nutrification (which despite its name is actually a Bad Thing). And a perfectly manicured lawn isn't exactly what you'd call conducive to habitat diversity. Probably worst of all, though, is the utter waste of water used in sprinklers.
Lately there have been efforts to mitigate all of these things, but they could be eliminated almost entirely if we collectively decided "fuck this lawn shit."
The article goes into the history of lawns in the US, and it's enlightening to read, if peppered with editing problems.
The rise in economic opportunities meant that homeowners who were inclined to pursue a green carpet of grass could hire someone to attend to its needs -- another indicator of success.
Guilty. I pay a service $50 a month to keep the lawn somewhat trimmed.
My home, for example, sits between two extremes: on one side, we have neighbors who meticulously care for their lawn -- they have a sprinkler system and regularly scheduled lawn maintenance -- and on the other, we have neighbors who let their lawn run wild and will mow once or twice a season -- their lawn is riddled with dandelions and other weeds. The homeowner in the former instance stopped by to tell us that she was seeing signs of crabgrass on her lawn. She scrutinized our patch of greenery as well as the other neighbors' before going home. While this overall dynamic is common throughout our neighborhood -- there is a mix of maintenance -- she was specifically concerned on how this would impact and reflect on her.
The neighbor on one side of me used to have the perfectly maintained lawn (she died a few years ago and it's declined a bit); I'd see a Chemlawn van out there every month or so, which sucked because I'm downhill from that house and that shit seeped over the property line. The neighbors on the other side seem to have the same attitude to lawn care that I do. Neither of them ever said word one to me about the lawn, though I suspect the snooty one once called the city on me one time when the grass was 12.1" tall. Nor have I heard anything from the other homeowners in the neighborhood. With one exception (and you know the type), we tend to all mind our own business.
We are at a moment when the American Dream, inasmuch as it still exists, is changing. The idea of homeownership is untenable or undesirable for many. While green spaces are important, a large area of green grass seems to be a lower priority for many. With a growing movement that embraces a more natural lifestyle, there is a trend toward the return of naturalized lawns that welcome flowering weeds, and subsequently support a more diverse entomological ecosystem.
Translation: bugs are good now.
Lawns are American. But they're also an anomaly. And they may no longer fit the realities of the world we live in.
Sounds familiar. |
March 29, 2021 at 12:01am
|
Let's take another crack at Cracked today.
Tell us how you really feel.
DISCLAIMER: None of the following is to be taken as anything even approaching professional financial advice. It's simply the griping of a grumpy millennial who will probably die without ever owning property larger than a nice couch.
Ooooh, look at Mr. Moneybags Millennial over here with the nice couch.
For a minuscule percentage of the population, their finances are something they enjoy looking at. They call their Swiss accountant every morning and say things like, "Read the numbers to me again, Quincy!" while chuckling over a mug of warm stem cells.
Okay, that's legitimately funny. Except that Quincy isn't exactly the first name I think of when imagining a Swiss person.
For the rest of us, opening our checking account app feels similar to cranking a jack-in-the-box that pops out and tells you you're eating noodles this month.
Sure, because there's nothing in the middle of these extremes.
Now there's no shortage of financial BS that starts from the top, raining lukewarm piss all the way down, like the fact that the federal minimum wage hasn't increased in over 10 years while the cost of living has gone up by 20%.
Well, they called it "trickle-down" for a reason.
And then of course there's the numbered list, because Cracked.
5. Modern Credit Scores Controling[sic] Everything
The credit score, for some reason, seems to have established itself as some sort of historical constant, despite the fact that the modern FICO credit score wasn't even established until 1989. Given the importance placed on it, you'd think the colonists showed up and started immediately issuing credit scores to the Native Americans along with their smallpox blankets. In reality, those numbers between 350-800 that decide pretty much anything you can do financially are probably younger than your parents.
Try younger than I am.
These days, credit scores are pulled basically every time you want to use a restaurant's bathroom, and unfortunately, they're here to stay. The best any of us can do is to generally try to think of our credit score as an ill-tempered romantic partner and try to minimize what we do to send them flying off the rails.
While it's probably true that the credit score has an outsized influence on everyday life, the article omits one of the main reasons we use standardized scores now: before them, you'd go apply for a loan, the lender would note your skin color, and assign creditworthiness in inverse proportion to its melanin content. This was, rightly, deemed unsuitable in an egalitarian society.
4. Useless Interest Rates On Your "Savings" Accounts
Provided you're one of the few that can actually afford to put any significant money into your bank's savings account and aren't hopping from paycheck to paycheck like a series of precarious stones, you may have noticed the abysmal interest rate offered by most.
Guy has a point here. However, the common reaction to seeing sub-1% rates on savings isn't "I'mma put this money into an index fund," but "I might as well spend it." This helps the economy and rich people. It does not help ordinary people, unless you only spend your money on locally-sourced, vegan, gluten-free, artisanal products, in which case you're not an ordinary person.
Now, I'm not poor, but I get excited whenever I see a penny lying on the sidewalk. There are those who feel that the effort needed to bend over and pick it up, combined with the potential for germification, makes it not worth it. I'm not one of those people. I pick that sucker up and put it in the back pocket of my jeans, and then wash my hands first chance I get. Also, I always forget about change in my back pocket, so they show up later, nice and sanitary, clinking around in the clothes dryer. Anyway, the point is, even a 0.1% interest rate is better than nothing, even though both 0.1% and nothing are below the rate of inflation. And you can find higher interest rates on savings than that.
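If you want to see the depressing arithmetic for yourself, here's a back-of-the-envelope sketch in Python. The rates are made-up illustrations (including the 2% inflation guess), not quotes from any actual bank:

```python
# What $1,000 is worth in today's dollars after ten years, compounded at a
# given rate while ~2% inflation chews away at it. All rates are illustrative.
def real_value(principal, rate, inflation, years):
    nominal = principal * (1 + rate) ** years
    return nominal / (1 + inflation) ** years   # deflate back to today's dollars

for label, rate in [("sock drawer", 0.0), ("big-bank savings", 0.001),
                    ("online savings", 0.005), ("index fund, historical-ish", 0.07)]:
    print(f"{label:28s} {rate:6.2%} -> ${real_value(1000, rate, 0.02, 10):,.2f}")
```

The exact numbers don't matter; the point is that anything growing slower than inflation is leaking value, just more slowly than the sock drawer.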
And there's never been a more important time to sock money away than now; uncertainty is high and you might find yourself jobless, and you'll need something to tide you over whilst looking for a new job when the one you have dumps you for posting stupid shit on Twitter. The best thing to tide you over is money; your comics collection ain't gonna cut it.
3. 401(k) Replacing Pensions
Retirement plans at all seem like a luxury these days when it feels like the main occupation of the younger generation is three occupations. Millennials and Zoomers are lucky to get health insurance, much less realistic retirement options since Ublyft or Doormates or your employer of choice plans to ax you scot-free through their use of at-will employment as soon as you deliver a half-melted ice cream cone anyways.
While this has some truth to it, the real problem is that if a company does offer a pension, they'll lay you off before you're vested in it. The 401(k) and IRA plans are way more useful when the average length of time someone's in a job is approximately three days (Yes, that's hyperbole. I can't be arsed to look up the actual number, but it's low compared to the "entire career" our forebears "enjoyed.")
Because 401ks expose you and your hopes of a cabin on Golden Pond to the stock market in a way that traditional pensions do not. Anyone living in America over the last decade or so should be rightly distrustful of putting their future in the hands of the stock market, and indeed, we saw in grim relief what can happen with the recession decimating the 401ks of many citizens and forcing your grandpa to work as a Walmart greeter into his 70s.
Um... no. The stock market crashed hard in 2008. It reached a low point in early 2009. Since then, it's had the usual bumps and dips, but the overall trend has been upward. Of course, that won't last; as soon as the average person starts trusting the stock market again, it'll go bear and people will once again freak out.
Point is, if you just look at the last decade, it's been a bull market of historic proportions, and anyone who threw all of their money into it in 2009 could have seen a 600% return on investment.
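In case that 600% figure sounds invented, the sanity check is one line of arithmetic (reading "600% return" as a 600% gain, i.e., every dollar becoming seven):

```python
# Annualizing a 600% gain over roughly a dozen years (2009 to 2021).
years = 12
total_multiple = 7.0                              # 1 original dollar + 6 dollars of gain
cagr = total_multiple ** (1 / years) - 1
print(f"about {cagr:.1%} per year, compounded")   # works out to roughly 17-18%
```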
Is it risky? Sure. So is keeping your money in your sock drawer where it's guaranteed to lose value due to inflation. The trick is to keep holding (and maybe even buying) through dips, and not selling everything when you hear the bear start to roar. Most people can't do that, and they end up selling low and buying high, the polar opposite of what they should be doing.
2. Banks Being Closed On Weekends
If you're someone with an aggressive work schedule, you've likely experienced the pain associated with finally squeezing a bank visit into your calendar, only to hear a discouraging "clank" as you finally get to the door of the bank. Even today a huge amount of banks are closed on weekends, making it difficult for those who work many hours to get anything done in person.
Okay, I get that this can be inconvenient, but really, who goes to banks anymore? The last time I set foot in one was sometime last year when I needed a notary public. And technically, it was a credit union, not a bank. A credit union that has Saturday hours. Seriously, the problem here is that people keep flocking to the major banks, whose primary goal is to fuck you.
1. Having To Prepare Your Taxes
Yeah, it's annoying, but what's the alternative? How about this one: you just get sent your forms filled out, check them, and sign them if everything looks good. That's not only NOT a pipe dream; it's already currently how portions of Europe handle taxes. So if that's true, why do Americans continue to take a math exam where a failing grade results in committing a crime every year?
Well, a big reason is because the people who help you take that exam have a pretty significant investment in keeping them complicated. Reporting from ProPublica shows that Intuit (of TurboTax) and H&R Block spent a combined $5 million in 2016 lobbying to prevent pre-filled returns from being offered to taxpayers. They then have the gall to pump out commercials starring an attractive 27-year-old from central casting that's never seen a W2 telling you about how they're here for YOU, to get you the MOST money from the big bad IRS because they are your FRIEND and they let you have half a bagel that one time, don't you remember?
This, now... this is legitimately enraging.
But hey, think of it this way: you don't really want to put everyone who works in the tax preparation business out of work, do you? Spend more money! Contribute to the economy! |
March 28, 2021 at 12:03am
|
When you want to learn reliable facts about stuff, you can always turn to... Cracked?
Okay, maybe, maybe not. Some of these track with what I know. And no, I'm not going through all 55; you can read the article for that; they're all short blurbs because after all, it's Cracked, not The New Yorker. Which also means it's actually funny.
Ever since 1978, when it became illegal to teach history in schools, all our knowledge on the subject has come from movies, TV shows, and vague chitchat. As a result, we're all buried under misinformation. For example, someone's been spreading weird rumors about how the education system changed in 1978.
Okay, reliable except when they turn to obvious sarcasm.
The article also provides handy links for fact-checking, but of course to be thorough, one should independently verify each claim. If one can be arsed. I can't.
4. Cowboy Hats
American cowboys wore all kinds of hats. Bowler hats were most common. You'd also see sombreros or top hats.
The one time I was at a dude ranch, the only hat I had available was a Panama Jack. None of the Stetsons they sold us city folks fit my head (all those brains take up so much room, you know), so I probably have the distinction of owning the only Panama Jack with a stampede strap.
10. Cavalry Charges
In a movie, when the backup army arrives on horses, they tear through the men or orcs on foot. But in reality, horses smartly refuse to plow through soldiers bearing pikes, so cavalry divisions would dismount before fighting.
Thus once again making one wonder who the truly smart species is.
15. American Cowboys
Cowboys are a Mexican invention. There were vaqueros in Mexico decades before settlers came even to Plymouth. By the 19th century, a third of cowboys in America were Mexican -- and one quarter were Black.
Speaking of cowboys.
Also, I always wondered about that word. I mean, yeah, I knew just enough Spanish to understand what "vaquero" means (along with hola, gracias, and marijuana), but it's gotta confuse non-English-speakers. They ride horses, not cows (yes, I understand they herd cows), and they're generally too old to be "boys." Besides, "cow" is always female. Also, why does every other kind of animal get a generic singular word, but not cattle? Like, you can't point at a single cow, bull, or steer and say, "That's a cattle," because cattle is plural. You have to call it a cow, bull, or steer. You don't have that problem with mares, stallions, and geldings; you can point and go "Horsey!"
English is weird.
36. Ninjas
Ninjas didn't dress in black. They disguised themselves in whatever costumes best blended in. Even skulking at night, they didn't wear black, which would stand out like a silhouette.
Fact: Almost every photograph contains at least one ninja.
39. The Library of Alexandria
The burning of the Library of Alexandria really wasn't much of a tragedy. The library had already fallen into disrepair for centuries -- not due to fire but due to budget cuts.
Gosh, that sounds familiar.
46. Martin Luther
Luther didn't actually nail 95 theses to a church door, as cool as that would have been.
Look, this is one of those cases where it's much more amusing to accept the myth, like with George Washington and the cherry tree.
51. Pocahontas
Pocahontas was at most 12 years old when the Jamestown settlers came by. She was also bald and naked, as was customary for her tribe. Good then that she and John Smith didn't really fall in love.
No one tell Disney.
54. The Alamo
The Alamo wasn't a heroic last stand that saved Texas. The rebels only stuck around there because they ignored smart orders, and defending the place didn't help the war effort at all.
Oh, Cracked wants to get banned in Texas, I see.
Actually, I'm only including this one because my favorite drafthouse theater chain is called Alamo because they started in Texas. As I went there quite a bit before everyone started passing dangerous germs around, I became a member of their frequent-flyer club. I'm still amused at the name of the frequent-flyer club: Alamo Victory.
My local Alamo seems to have survived the pandemic and bankruptcy closings that hit some of that chain. I hope it lasts. Soon as I can, I'm going back. |
March 27, 2021 at 12:03am
|
Today in Adventures in Confirmation Bias...
Confirmation bias applies because a) I like to think I'm smart and b) I appreciate being alone. You may not agree with (a). Hell, most of the time, I don't agree with (a). But let's proceed as if I'm really quite clever and see where this takes us.
Psychologists have a pretty good idea of what typically makes a human happy. Dancing delights us.
No.
Being in nature brings us joy.
Being in nature brings me rashes, sunburns, and bug bites.
And, for most people, frequent contact with good friends makes us feel content.
That would be stretching the definition of "frequent" for me; I'm an introvert, not a misanthrope. Most of the time.
That is, unless youâre really, really smart.
Got me.
In a paper published in the British Journal of Psychology, researchers Norman Li and Satoshi Kanazawa report that highly intelligent people experience lower life satisfaction when they socialize with friends more frequently. These are the Sherlocks and the Newt Scamanders of the world -- the very intelligent few who would be happier if they were left alone.
Who the hell is Newt Scamander? Oh, it's a Potterverse thing. I wasn't aware that anyone had actually watched those Fantastic Beasts movies.
To come to this conclusion, the researchers analyzed the survey responses of 15,197 individuals between the ages of 18 and 28.
Thus forms the first crack in the foundation of Confirmation Bias: because naturally, any study done on 18-28 year olds must extend to people of all ages.
That was sarcasm. Us supergeniuses use it frequently.
Analysis of this data revealed that being around dense crowds of people typically leads to unhappiness, while socializing with friends typically leads to happiness -- that is, unless the person in question is highly intelligent.
Oh my gosh, you're telling me that a scientific study shows that being among dense crowds of people makes a person unhappy? Who'd have thought that was true? Hey, I have a proposal for a study to find out if water is wet; can I have a grant?
(That was more sarcasm.)
The authors explain these findings with the "savanna theory of happiness," noting how different our world is than that of our Pleistocene-era ancestors.
Oh hell, not this shit again.
The savanna theory of happiness is the idea that life satisfaction is not only determined by what's happening in the present but also influenced by the ways our ancestors may have reacted to the event.
And the last pebbles of Confirmation Bias crumble to dust.
Look.
Evolution is an actual ongoing process, I'm certainly not denying that. Evolutionary psychology, though... well, I call it by another name: Unsubstantiated Guesswork. It's right there in the last quote: "the ways our ancestors may have reacted to the event."
I see this all the time in certain kinds of articles, usually psychology. They see that humans have psychological trait X, and they argue thusly: "We have X because our ancestors needed X to survive because situation Y must have occurred to select for trait X." Or some so-called logic to that effect. It's usually circular reasoning, because we don't know exactly how our prehistoric ancestors lived.
Moreover, it's not like evolution started the moment our line split off from chimps and bonobos something like six million years ago. We carry remnants of evolutionary features from way before then, both physical and mental. Trying to explain human behavior by how we "must have" acted and reacted on the plains of Africa leaves out several billion years of evolution.
I'm not saying they're necessarily wrong, mind you. They often make compelling cases. But where's the evidence?
"In general, more intelligent individuals are more likely to have 'unnatural' preferences and values that our ancestors did not have," Kanazawa tells Inverse. "It is extremely natural for species like humans to seek and desire friendships and, as a result, more intelligent individuals are likely to seek them less."
Okay, now I'm beginning to actively hate this "theory." (Which of course doesn't mean it's not valid.) For starters, "unnatural?" That word has been used to justify all kinds of oppression. "Homosexuality is unnatural." "Any sex besides missionary position is unnatural." "Polygamy is unnatural." "Being trans is unnatural." "Their rituals are unnatural while ours are ordained by Deity." Moreover, how can anything we do be unnatural? We're part of nature.
For seconds, that last sentence seems to imply that smarter people actively do things the speaker defines as unnatural.
And finally, how do you know whether prehistoric people who were smarter (they had a range of intelligence, just as we do) didn't also seek out solitude? There's plenty of evidence that people did that sort of thing within historical times. Thoreau, for example. Not that he was right in any way, but I can't deny he had smarts.
Intelligence is believed to have evolved as a psychological mechanism to solve novel problems -- the sort of challenges that weren't a regular part of life. For our ancestors, frequent contact with friends and allies was a necessity that allowed them to survive. Being highly intelligent, however, meant an individual was more likely to be able to solve problems without another person's help, which in turn diminished the importance of their friendships.
"...is believed to have evolved?" I don't even know where to begin, so let's start with the weaselly passive voice. Believed by whom? Though my personal working hypothesis is that a very few people actually innovate; the thing that enabled humans to dominate the planet was our ability to communicate, and, once a thing is discovered, to understand and copy the techniques involved (think fire or flint-knapping or the Theory of Relativity or whatever) -- one person figured it out and taught others. He or she could not have done so without contact with other humans.
"In general, urbanites have higher average intelligence than ruralites do, possibly because more intelligent individuals are better able to live in 'unnatural' settings of high population density," says Kanazawa.
City-dweller-like typing detected. I mean, seriously, what the hell is this? First you connect intelligence with the need for solitude, and then you proclaim that this solitude takes its best form in crowded places? You define intelligence as the ability to solve problems without other people, and then you claim that people who do this prefer city life, where almost everything is cooperative?
That certainly doesn't mean that if you enjoy being around your friends that you're unintelligent. But it does mean that the really smart person you know who spends much of their time alone isn't a sad loner -- they probably just like it that way.
Here's an alternative, equally valid "theory:" Smart people want to be left the fuck alone so we don't have to put up with unverifiable evo-psych arguments. Or, alternatively, we just get easily bored by small talk. |
March 26, 2021 at 12:02am
|
I mentioned dreaming, mostly as an aside, yesterday, but today's article is entirely about some research into dreaming.
Not, of course, interpretation of dreams, but the process itself.
When he was two years old, Ben stopped seeing out of his left eye. His mother took him to the doctor and soon discovered he had retinal cancer in both eyes.
And so we start with a nightmare. Well, not a nightmare; worse, purportedly an actual thing that happened. Don't worry; it's not the rambling anecdotal "style" of The New Yorker; there's a point to this. I think.
But by the time he was seven years old, he had devised a technique for decoding the world around him: he clicked with his mouth and listened for the returning echoes.
Batboy is real! Still waiting for the point...
How could blindness give rise to the stunning ability to understand the surroundings with one's ears? The answer lies in a gift bestowed on the brain by evolution: tremendous adaptability.
Ah, there it is. Or at least we begin to see a glimmer of a point.
What does brain flexibility and rapid cortical takeover have to do with dreaming? Perhaps more than previously thought. Ben clearly benefited from the redistribution of his visual cortex to other senses because he had permanently lost his eyes, but what about the participants in the blindfold experiments? If our loss of a sense is only temporary, then the rapid conquest of brain territory may not be so helpful.
And this, we propose, is why we dream.
Determining the "why" of something is, to me, a bit slippery. Why do we have four fingers with an opposable thumb? So we can grasp tools? I mean, that's the result, but a distant ancestor might have said it was so we could climb trees.
Evolution doesn't have a "why;" only a "how."
But okay, I'll grant that one result of dreaming could be to help make new neural connections.
We suggest that the brain preserves the territory of the visual cortex by keeping it active at night. In our "defensive activation theory," dream sleep exists to keep neurons in the visual cortex active, thereby combating a takeover by the neighboring senses.
However, I'm not sure framing it in terms of competition is useful. I mean, yeah, the popular view of evolution is one of a process driven by competition. I think it's at least as much about cooperation. If you think about everything in terms of competition, you can start thinking that everything's a win/lose scenario. This is not borne out by evidence. I have another article in my queue that talks about this sort of thing; I'll probably get to it eventually.
REM sleep is triggered by a specialized set of neurons that pump activity straight into the brain's visual cortex, causing us to experience vision even though our eyes are closed. This activity in the visual cortex is presumably why dreams are pictorial and filmic. (The dream-stoking circuitry also paralyzes your muscles during REM sleep so that your brain can simulate a visual experience without moving the body at the same time.)
Yeah, and in some of us, that mechanism can work too well (sleep paralysis) or not well enough (sleepwalking).
For example, because brain flexibility diminishes with age, the fraction of sleep spent in REM should also decrease across the lifespan. And that's exactly what happens: in humans, REM accounts for half of an infant's sleep time, but the percentage decreases steadily to about 18% in the elderly. REM sleep appears to become less necessary as the brain becomes less flexible.
I can't be arsed to delve deeper into this right now (because at the moment, it feels like the only active neurons in my head are the ones responsible for reminding me about how bad pain can be), but could they have switched cause and effect there? That is, instead of "REM decreases because our brain becomes less flexible with age," maybe it's "our brain becomes less flexible with age because REM decreases." I mean, that would be more in line with the authors' hypothesis here, wouldn't it?
Dream circuitry is so fundamentally important that it is found even in people who are born blind. However, those who are born blind (or who become blind early in life) don't experience visual imagery in their dreams; instead, they have other sensory experiences, such as feeling their way around a rearranged living room or hearing strange dogs barking.
Oddly, if this is the case, my own experience is that dreams can be, but are not always, visual. I have clear memories of dreams that took place in darkness, with my dream-self's knowledge of the outlines of objects coming from -- I don't know where. Some other dream-sense besides sight.
Since the dawn of communication, dreams have perplexed philosophers, priests, and poets. What do dreams mean? Do they portend the future? In recent decades, dreams have come under the gaze of neuroscientists as one of the field's central unsolved mysteries. Do they serve a more practical, functional purpose? We suggest that dream sleep exists, at least in part, to prevent the other senses from taking over the brain's visual cortex when it goes unused. Dreams are the counterbalance against too much flexibility.
Though I'm convinced that science is the best tool we have for understanding the universe (including ourselves), there are probably some things best left to philosophy. What function dreaming has is likely a reasonable scientific pursuit. What they "mean," however, is far more subjective.
For a long time, now, I've made puns in my dreams (and sometimes even remembered them). Lately, French, which I continue to practice daily, has crept in. Perhaps they're helping me learn the language.
In any case, these authors' research is worth noting, I think -- though as with all new science, I suggest we don't take it as authoritative fact, but rather as a different way of looking at things -- until such time as the hypothesis is disproven. |
March 25, 2021 at 12:01am
|
Not only is this one ancient now (2014), but it's been hanging out in my queue like panhandlers outside of McDonald's for several months now.
Look, I'm all for embracing uncertainty, but "planning is pointless?" Clickbait? Or something worse?
Life doesn't go according to plan, and while a few people might do exactly what they set out to do, you never know if you're one of those. Other things come along to change you, to change your opportunities, to change the world.
I've found the trick lies in retconning your ambitions. Like, say you're a teenager and want to be a doctor. You end up running a backhoe at the local landfill (nothing wrong with that; it's just about as far from "doctor" as I could think of). So you retcon Teenage You into someone who always wanted to run a backhoe.
Hey, it worked for me; I never wanted to be an astronaut.
So if you can't figure out the future, what do you do? Don't focus on the future. Focus on what you can do right now that will be good no matter what the future brings. Make stuff. Build stuff. Learn skills. Go on adventures. Make friends. These things will help in any future.
Nothing wrong with any of those activities, but screw "don't think about the future."
One of the most important skills you can develop is being okay with some discomfort. The best things in life are often hard, and if you shy away from difficulty and discomfort, you'll miss out.
On the contrary, I find that the best things in life are free, easy, and instantly gratifying. If you work hard, something will come along to change something in yourself or your environment, and all your hard work won't have paid off. Still gotta think about the future, though.
Learning is hard. Building something great is hard. Writing a book is hard. A marriage is hard. Running an ultramarathon is hard. All are amazing.
I've never run an ultramarathon, fortunately, but I've done those other things. I wouldn't call any of them "amazing."
Try writing a blog or meditating every day.
Um, hello.
If you're good at discomfort and uncertainty, you could do all kinds of things: travel the world and live cheaply while blogging about it, write a book, start a business, live in a foreign country and teach English, learn to program and create your own software, take a job with a startup, create an online magazine with other good young writers, and much more.
You can also do all those things if you're independently wealthy. I don't want to travel cheaply, though. I want to experience the finer things, like great wine and five-star hotels.
All of this is useless if you can't overcome the universal problems of distraction and procrastination.
Welp, I'm boned.
Learn about how your mind works, and you'll be much better at all of this. The best ways: meditation and blogging. With meditation... you watch your mind jumping around, running from discomfort, rationalizing. With blogging, you are forced to reflect on what you've been doing in life and what you've learned from it.
Nah, it's more fun to blog about peoples' deadline-driven articles. As for meditation, I've tried it; I fall asleep every time.
Speaking of which, have you ever had -- I don't know what it's called, but the exact polar opposite of a nightmare? Like, in a nightmare, you wake up sweating, heart racing, maybe even crying out in terror (or in my case, being unable to cry out in terror because no voluntary muscles work). With what I'm talking about, you wake up at utter peace with yourself and the world. And get your mind out of the gutter; there was nothing sexual about the dream, though I don't much remember what it was about. Mostly I just remember the feeling of great well-being, the opposite of the fear that a nightmare engenders.
It's happened to me a few times, most recently just two nights ago. Since there's not a name for it as far as I know, I have no idea if others experience that sort of thing.
I don't think money is that important, but making money is difficult.
On the contrary, I think money is incredibly important. Enough money and you can basically tell everyone who's trying to tell you what to do to fuck off. You can set your own schedule. You don't have to worry so much about where your next meal is coming from. You don't have to spend time and energy on trying to find the cheapest everything (though it can be fun to do so). Most of all, if you get sick, you can actually get medical treatment (this last bit only applies in backwards countries). That old canard about how money can't buy happiness? Maybe, but it can buy beer, and that's close enough for me. It brings its own set of problems, of course, but they're better problems to have than "oh shit I can't afford this year's rent increase."
Protip: save an emergency fund, then start investing your earnings in an index fund and watch it grow over your lifetime.
Oh, look, one gold nugget amidst all the pyrite.
So, I don't know. The article is obviously aimed at someone much younger than me, and the world has moved on since he and I were starting out. As hard as things were in the 80s, they're exponentially tougher now. I think the basic advice has some merit, but not everyone's cut out for a life of pure uncertainty.
And yet, there's enough uncertainty that we should definitely plan for it.
Good luck with that. |
March 24, 2021 at 12:01am
|
Look, sometimes I just like to trace a word's etymology.
But I often get more than I bargained for. After all, no word exists by itself.
If you're looking for evidence on how language can change, look no further than William Safire's 1980 "On Language" column in the New York Times discussing collegiate slang -- or, as Safire puts it, "campusese."
I have a vague memory of that, but I couldn't recall the contents if you paid me.
According to that column, easy courses were called "guts," and people who would do anything for an A were called "throats," short for "cutthroats." "Throats" was a replacement for "grinds," itself a replacement for "bookworm."
I have an alternative idea for why people who would do anything for an A were called "throats," but let's leave that alone for now. And when I went to college just a few years later, the word "gut" was absolutely still in use for easy courses. The course I took in film, for example, was officially titled "Cinema as an Art Form," which we changed to "Cinema as a Gut Form," often shortened to "Cinema-gut."
"At Yale, the grind is a 'weenie,'" and "at Harvard, the excessively studious student is derided as a 'wonk,' which Amy Berman, Harvard '79, fancifully suggests may be 'know' spelled backward. (In British slang, 'wonky' means 'unsteady.')"
As I had no contact with Ivy League people -- well, except my cousin who had gone to MIT, but we didn't speak much -- this is all news to me.
Merriam-Webster defines "wonk" as "a person preoccupied with arcane details or procedures in a specialized field" or, broadly, a "nerd."
Meanwhile, the distinction between "nerd" and "geek" is still hazy to me. If pressed, I'd say a nerd is someone who likes to learn stuff, while a geek has more of a math/science focus. Everyone is a nerd about something, but not everyone can claim geekitude.
Webster's New World College Dictionary, the one favored by the Associated Press, calls "wonk" slang. And though its synonyms are slightly insulting terms like "nerd," "geek," and "dweeb," "wonk" is worn as a badge of honor by many people who are specialists in their field (or claim to be).
Yeah, look, "nerd" and "geek" haven't been insulting for many years now. We reclaimed them long ago. As for "dweeb," well, that's just a nerd in a bowtie.
Turns out, the word "wonk" has had many (unconnected) meanings over the years.
This is why I'm a nerd about etymology.
We are still not any closer to the etymology of "wonk." The OED has a clue, in its entry for "wonky," which it traces to 1919: "Of a person: shaky, groggy; unstable," the British definition Safire cited. "Of a thing: faulty, unsound; unreliable." The OED says its etymology is "Obscure: the German element wankel- has similar force."
I am forced to assume that "wonk" and "wonky" have entirely different etymologies, like "candle" and "candy." Just because words sound or are spelled similarly doesn't mean they come from the same place.
The article doesn't, however, make a connection with Willy Wonka, which is a terrible oversight; seems to me that fictional character's name would have connections to both "wonk" and "wonky." You know, because he was an expert on candymaking, and more than a little eccentric.
In the end, words mean what we collectively decide they mean, and only the most passionate grammar wonk won't admit that language itself is wonky as hell. |
March 23, 2021 at 12:08am March 23, 2021 at 12:08am
|
Seems to me I encountered the author of this article somewhere before. Perhaps a different article, or a YouTube video. He seems to be quite prolific, so it's hard to tell. Anyway, for background: Sean B. Carroll, who wrote this, is an evolutionary biologist; I seem to run into them a lot even though biology isn't my main scientific interest.
But the article isn't about biology; if anything, it's about sociology or, as I like to put it, how some people love to wallow in ignorance.
The article is dated last November, so it was quite timely when it came out, considering the other denial that ran rampant at the time.
I won't quote from the beginning, which goes into the history of polio and the vaccine developed to prevent it -- as well as the denial that such a vaccine even worked.
One group in particular did not welcome the vaccine as a breakthrough. Chiropractors actively opposed the vaccination campaign that followed Salk's triumph. Many practitioners dismissed the role of contagious pathogens and adhered to the founding principle of chiropractic that all disease originated in the spine.
If it ducks like a quack...
Look, I'm not ragging on chiropractic in general here. There are things it helps with. I had a good friend who's a chiropractor, and have benefited from their treatments myself -- for back and neck pain. In general, they have to be pretty smart and go through intensive, long training in anatomy and technique. The point is they're not usually what I'd call ignorant -- and yet, this happens.
Opposition to the polio vaccine and to vaccination in general continued in the ranks such that even four decades later, long after polio had been eradicated from the United States, as many as one third of chiropractors still believed that there was no scientific proof that vaccination prevents any disease, including polio.
Often, once a person believes a certain thing, no amount of "proof" will convince them otherwise; they will instead dig in and fortify their position, rather than admit being wrong.
The vaccine is widely viewed as one of medicine's greatest success stories: Why would anyone have opposed it? My shock turned into excitement, however, when I began to recognize the chiropractors' pattern of arguments was uncannily similar to those I was familiar with from creationists who deny evolutionary science. And once I perceived those parallels, my excitement became an epiphany when I realized that the same general pattern of arguments -- a denialist playbook -- has been deployed to reject other scientific consensuses from the health effects of tobacco to the existence and causes of climate change. The same playbook is now being used to deny facts concerning the COVID-19 pandemic.
This is where we get into the meat of the article. The rest of it summarizes, and then expands on, the six pillars of the "denialist playbook," which I won't reproduce here; the article is there to read or skim.
If we hope to find any cure for (or vaccine against) science denialism, scientists, journalists and the public need to be able to recognize, understand and anticipate these plays.
And it's worth reading because, as I said, it's timely and pertinent to a lot of the stuff that's going on today. There are other issues people have raised regarding the Covid vaccines besides the general denialism, usually based on fear instead of actual evidence. For example, there was some concern about blood clotting issues from one of the vaccines. This was studied, and determined to be Not An Issue. But once the "blood clotting" thing got in peoples' heads, fear took over.
My take on it? No treatment is, or can be, perfect. But people generally suck at risk management. Yes, it is possible, in general, to die or suffer ill effects from a vaccine. But if the chance of a negative outcome is 0.0001%, while the chance of a negative outcome from the disease itself is (just pulling a number out of my ass here) 10%, that's a huge difference, and I'll take my chances with the vaccine, thank you very much.
I see that sort of thing as an Applied Trolley Problem. You know the Trolley Problem; it's been spoofed enough by now, but the basic conundrum applies: If you do nothing, five people will die. If you do something, one person will die. No, you're not allowed to think outside the box on this one; stop trying to find loopholes right now. Those are the choices, period. Only with vaccines, it's maybe a hundred thousand people on the "do nothing" track and one on the "do something" track. The ethical logic is clear, at least to me. Fucking do something.
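Put my made-up numbers into that trolley framing and the arithmetic isn't remotely close. To be clear, these percentages are illustrative guesses pulled from the same place as the 10% above, not actual clinical data:

```python
# Expected bad outcomes per million people: vaccinate vs. do nothing.
# Both probabilities are invented for illustration, not real clinical data.
population = 1_000_000
p_vaccine_harm = 0.000001    # 0.0001% chance of a serious reaction to the shot
p_disease_harm = 0.10        # 10% chance of a bad outcome from the disease itself

print(f"vaccinate:  {population * p_vaccine_harm:,.0f} expected bad outcomes per million")
print(f"do nothing: {population * p_disease_harm:,.0f} expected bad outcomes per million")
```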
That said, this article coming up today (the chance was about 1 in 50) is serendipitous; I got notified yesterday that I finally have an appointment to get my Trump Mumps vaccine. True to form, the universe had a small chuckle at my expense; the only time I could do it was Thursday. And I've had oral surgery scheduled for Thursday afternoon for a month now.
Sure, I was able to schedule the vaccine for the morning while the surgery is in the afternoon. And it's the first dose, so it probably won't kick my ass. But now I have to go get my shot in the morning and then sit through agonizing dentistry for most of the afternoon.
All of which is to say that either a) the vaccine will go fine and I won't have a problem getting my mouth cut up, which will then knock me on my ass for a while; or b) I'll have an adverse reaction to the vaccine, which will knock me on my ass for a while, and I won't be able to get the surgery done, thus forcing me to wait another month or more for the next appointment and pay a no-show fee. Consequently, either way, I expect to be utterly useless (more so than usual) on Thursday night, so don't expect the usual blog entry first thing after midnight Friday.
Still.
It's worth it. |
March 22, 2021 at 12:02am March 22, 2021 at 12:02am
|
There's a reason that the "mad scientist" archetype exists.
This article is from last August, and it speaks of a biographical movie about Tesla -- one which, apart from this article, I've heard nothing about. But long ago, I read a few books about him, and so I sometimes forget that many people only know the name because of Elon Musk or, sometimes, a mediocre band.
The opening of a new biopic of Tesla provides a timely opportunity to review the life of a man who came from nowhere yet became world famous; claimed to be devoted solely to discovery but relished the role of a showman; attracted the attention of many women but never married; and generated ideas that transformed daily life and created multiple fortunes but died nearly penniless.
That's a decent enough summary, I suppose. Read the article for details; I'm not going to paste a lot of it here. As a bonus, there are some awesome pictures.
Essentially, without Tesla, life today might have been very, very different. Oh, sure, most of his inventions would have been created by someone else eventually; such is the nature of inventions. But every once in a while, a whole lot of genius is concentrated in a single person. Everyone is familiar with Einstein and Newton; Tesla's main rival, Edison, mostly just had a big team and competent public relations. (Ironically, Elon Musk himself is more of an Edison than a Tesla.)
Also, as you'll see, Tesla was the very definition of "eccentric," which I'm sure didn't help much in the publicity department.
His money long gone, Tesla spent his later years moving from place to place, leaving behind unpaid bills. Eventually, he settled in at a New York hotel, where his rent was paid by Westinghouse.
Amusing thing about that hotel: it's still around. Well, it's a hotel again, anyway; for a while there it was some cult headquarters or something, but now it's once again the New Yorker hotel, and it's located not far from Penn Station, in midtown Manhattan.
I stayed there once on a trip to New York, and while I didn't get Tesla's room (I didn't want to be That Guy), I made a pilgrimage to the floor it was on and stood outside the door.
Sadly, I didn't feel a spark.
Fortunately, these days, thanks to a few movies and such (one of which, The Prestige, had him portrayed by none other than David Bowie), more people know about Nikola Tesla, but I'm leaving this here just in case. The article is short, and really just hits the highlights.
Maybe I'll see if I can track down the biopic mentioned in the article. |
March 21, 2021 at 12:02am March 21, 2021 at 12:02am
|
I thought about writing this entry in ironic corporatespeak, but I can't bring myself to do it.
Look, the English language isn't exactly pure as driven snow.
They're unavoidable -- corporate buzzwords and gobbledygook.
Oh, they're avoidable, alright - you just have to be old enough to retire.
Belittled and unloved, corporate jargon endures, even thrives. There is no movement to rip down the wallpaper. And let's be honest fellow desk jockeys. Not only have we heard these words and phrases. We've probably used them ourselves.
Even worse than the author's use of corporate jargon are the atrocities committed against the English language in that paragraph.
The use of jargon is often tied to where people stand in a social hierarchy, according to a new paper from three social scientists... People with less prestige in an organization are more likely to use buzzwords. Like interns, new hires and first-year students.
Oh, it was in a paper; it must be true.
She makes a distinction between useful jargon in specialized fields such as medicine, science and law -- and the workplace language so prevalent today, a hybrid of business school lingo and Silicon Valley hype. The latter, she says, is littered with "BS words -- like orientate or guesstimate, or omnichannel or core competency."
To be fair, "orientate" is probably just fine in Britain, and "core competency" might actually have real meaning. The word "guesstimate" has been around for at least 45 years, and I have hated it for 50 of those years. Make a guess. Or an estimate. Pick one.
Okay, so apparently "guesstimate" has actually been around since the 1930s. That doesn't mean we can't make it stop if we really wanted to.
Let's take a word that suddenly became popular a few years ago: efforting. On the surface it sounds ridiculous.
Let's dig a little deeper... nope, it sounds ridiculous below the surface, too. But we English speakers love to verb nouns, and then gerundize the resulting verb. I remember when "parent" was strictly a noun, for example.
To quote the great sage Calvin, "Verbing weirds language."
On a more consequential scale of slippery is the word "synergy," a longtime favorite of corporate executives extolling the benefits of mergers.
And this really pisses me off, because "synergy" is a great word with a specific meaning. But like many other words, it's been beaten into oblivion by oblivious idiots.
Being able to complain about language changes is one of the many perks of getting older, along with joint pain and ragging on "kids these days." But I've always hated corporate jargon, and did my best not to practice it when I was running a company. I should have had a swear jar, except it'd be called a jargon jar. While swearing in a professional setting is generally frowned upon, buzzwords are far, far worse offenses. |
March 20, 2021 at 12:01am
|
Just some fun historical trivia today.
I could list a few dishes that became generally popular due to Jews fleeing religious persecution. The Irish-American penchant for corned beef and cabbage on St. Patrick's Day, for starters.
The powerful pairing of fish and chips has long been considered a British staple. Dubbed "the undisputed national dish of Great Britain" by the National Federation of Fish Friers, it's been enjoyed on the island for over a century, with an estimated 35,000 chip shops in business by 1935.
I made sure to eat plenty of fish&chips when I was in England, though I was unaware of its historical origins. The best I had was in this little town on the east coast called Whitby, where they go out and catch the fish in the morning so it's all ready for lunch.
From the 8th to the 12th century, Jews, Muslims, and Christians lived in relative peace in Portugal, known as Al-Andalus under Moorish rule.
I'm guessing that was the first and last time.
As religious violence worsened, many fled Portugal and resettled in England, bringing with them culinary treasures founded in Sephardic cuisineâincluding fish.
There were two main branches of European Judaism: the Ashkenazim in the east, and the Sephardim in the west. Culturally, they were quite different. It was mostly Ashkenazim who immigrated to the US, bringing with them such things as corned beef, bagels, and latkes.
The dish of white fish, typically cod or haddock, fried in a thin coat of flour, was a favorite particularly among Sephardic Jews, who fried it on Fridays to prepare for the Sabbath, as the Mosaic laws prohibited cooking.
Hence why it's called "FRY day."
That's a joke. Friday was named after a Norse goddess, like most of our days of the week are named after Norse deities.
But the Friday-night tradition was likely chipless until the late-19th century. The general popularity of the potato bloomed late in Europe, and it wasn't until the late 1800s that the tuber was accepted, due especially to the promotional efforts of a French scientist.
And yet what we call French fries and the British called chips were, if my sources haven't failed me, a Belgian invention.
There are also competing theories about who created the pairing of, as Churchill called them, "good companions." Most trace it back to the early 1860s, when Joseph Malins, a Jewish immigrant, opened up a fish and chips shop in London. Others point to John Lee, a man living outside of Manchester, who ran a "chipped potato" restaurant that sold the beloved pairing.
Because London and Manchester need another excuse for a rivalry.
British natives and immigrants alike began slathering their cod in batter and frying up husky chips. Industrialization in the 19th and early 20th centuries launched the fish dish to even greater heights, as it became a favorite for factory and mill workers in London and beyond.
One of the great things about food, besides being necessary for continued existence, is how different cultures adapt and expand on other cultures' cuisines. Some call this cultural appropriation. While that's not impossible, I see it mostly as a beneficial thing. I've mentioned in here before that certain Italian cuisines, based on pasta and tomatoes, couldn't exist without noodles from Asia and tomatoes from the Americas. The potato itself, of course, was a New World plant as well.
We live in a time when we can sample foods (and booze) from all over the world, and as far as I'm concerned, that's a wonderful thing.
What really matters with food is not where it came from, but how delicious it is. And it's hard to get more delicious than fish and chips. |
March 19, 2021 at 12:03am
|
Well, here's a source that's new to me. Or at least it was when I found this article. It's been in my queue for a while. No, I haven't been putting it off, but I'll say I have for the joke.
Seven tactics? What is this, a numbered list? That's Cracked's schtick. Well, okay, these aren't numbered.
For tens of thousands of years, people have been procrastinating just like you do today: They put things off, delayed, made excuses, and wished their deadlines would disappear. And just as it does with you, this caused them anxiety, made them piss off their colleagues and families, and, worst of all, wasted time.
Yes, but at least it's a source of comedy material.
Fortunately, unlike our ancient counterparts, we have ages of wisdom to help us avoid the mental traps that lead us to procrastinate.
Unfortunately, unlike our ancient counterparts, we have Netflix and social media.
Action by action
Don't let your imagination be crushed by life as a whole... Stick with the situation at hand, and ask, "Why is this so unbearable? Why can't I endure it?" You'll be embarrassed to answer. --Marcus Aurelius, Meditations, 8.36
"Action by action?" That doesn't even make sense. Or, rather, it can make at least three different kinds of sense.
While it can be productive to think about the troubles that might lie ahead -- the Stoics used an exercise called premeditatio malorum, or "premeditation of evils," to prepare for potential adversity -- imagining the worst usually just causes us to become paralyzed with fear.
No, imagining the worst means I can only be pleasantly surprised, or right. Either way, I win.
This is why Marcus Aurelius' advice was to keep in mind that a life is built action by action.
Oh, so that's the sense the heading means.
No author ever writes a book, he would say. Instead, they write one sentence and then another and then another.
That's, like... the worst way to write a book.
Create a routine
In many circumstances, we do not deal with our affairs in accordance with correct assumptions, but rather we follow thoughtless habit. --Musonius Rufus, Lectures and Fragments, 6.7
Thoughtless habit can be a good thing. Not always. But it can be.
Really, I won't be quoting all of it; the article speaks for itself. But that's a taste.
I'll just have one more thing to say:
Well-being is attained by little and little, and nevertheless is no little thing itself. --Zeno
That's rich, coming from the guy who famously said it's impossible to go anywhere. It's almost as if philosophers have always been self-contradictory.
Anyway, yeah, I'm a professional crastinator, but this advice might help. If I ever get around to reading the rest of the article. |
March 18, 2021 at 12:01am
|
Wednesdays have become gaming days for me, and thinking too hard after a game gives me a headache. Still, as long as I'm not too wiped, I try to post something early Thursday. Today's random link is about defining life.
The article, which as of this posting is recent, talks about someone's book. A lot of articles pushing books are annoying, so I'm dropping a warning, but as always, I'm not going to rag on someone for flogging their book, not on a site for writers.
"It is commonly said," the scientists Frances Westall and André Brack wrote in 2018, "that there are as many definitions of life as there are people trying to define it."
As an observer of science and of scientists, I find this behavior strange. It is as if astronomers kept coming up with new ways to define stars. I once asked Radu Popa, a microbiologist who started collecting definitions of life in the early 2000s, what he thought of this state of affairs.
"This is intolerable for any science," he replied. "You can take a science in which there are two or three definitions for one thing. But a science in which the most important object has no definition? That's absolutely unacceptable. How are we going to discuss it if you believe that the definition of life has something to do with DNA, and I think it has something to do with dynamic systems? We cannot make artificial life because we cannot agree on what life is. We cannot find life on Mars because we cannot agree what life represents."
I imagine that this is made even more difficult when we only have one "kind" of life to study: Earth life. It's conceivable that we won't recognize it when we find it elsewhere (and, barring utter catastrophe, I'm sure that we will -- I mean simple life here, not little green women). So thinking about what makes something "alive" or "not alive" is worth pursuing.
With scientists adrift in an ocean of definitions, philosophers rowed out to offer lifelines.
I'm not sure that helps.
In 2011, Trifonov reviewed 123 definitions of life. Each was different, but the same words showed up again and again in many of them. Trifonov analyzed the linguistic structure of the definitions and sorted them into categories. Beneath their variations, Trifonov found an underlying core. He concluded that all the definitions agreed on one thing: life is self-reproduction with variations. What NASA's scientists had done in eleven words ("Life is a self-sustained chemical system capable of undergoing Darwinian evolution"), Trifonov now did with three.
So the answer lies not in philosophy or science, but in linguistics? Color me skeptical. Especially since the "number of words" thing is spurious. Yes, fewer words may indicate elegance. But to be equivalent, Trifonov's definition has five words, not three, because "life is" precedes both.
I'd also argue that "chemical system" may or may not hold true; energy-based life has been hypothesized.
But perhaps I'm being pedantic.
In the 1940s, Wittgenstein argued that everyday conversations are rife with concepts that are very hard to define. How, for example, would you answer the question, "What are games?"
If you tried to answer with a list of necessary and sufficient requirements for a game, you'd fail. Some games have winners and losers, but others are open-ended. Some games use tokens, others cards, others bowling balls. In some games, players get paid to play. In other games, they pay to play, even going into debt in some cases.
I only read this after doing my last Fantasy newsletter, which was about games.
A group of philosophers and scientists at Lund University in Sweden wondered if the question "What is life?" might better be answered the way Wittgenstein answered the question "What are games?" Rather than come up with a rigid list of required traits, they might be able to find family resemblances that could naturally join things together in a category we could call Life.
And this is where we find that the "lifeline" offered by philosophers, above, only muddies the waters. That's what philosophers do, though they seem to argue otherwise.
One philosopher has taken a far more radical stand. Carol Cleland argues that there's no point in searching for a definition of life or even just a convenient stand-in for one. It's actually bad for science, she maintains, because it keeps us from reaching a deeper understanding about what it means to be alive. Cleland's contempt for definitions is so profound that some of her fellow philosophers have taken issue with her. Kelly Smith has called Cleland's ideas "dangerous."
Nothing good has ever resulted from someone calling an idea "dangerous." Either the idea is false, in which case it eventually falls out of favor, or it's true, in which case it becomes the favored hypothesis. Since philosophers also tell us that there are things that are both true and false, or neither (such as the sentence "This sentence is a lie"), again - muddy water.
The article goes on to talk about the best evidence we have yet for past extraterrestrial life, found in a meteorite. It's worth reading about the discussion; I won't paste it here because this is already longer than I wanted to type tonight. In summary, it's not incontrovertible evidence.
The trouble that scientists had with defining life had nothing to do with the particulars of life's hallmarks such as homeostasis or evolution. It had to do with the nature of definitions themselves -- something that scientists rarely stopped to consider. "Definitions," Cleland wrote, "are not the proper tools for answering the scientific question 'what is life?'"
Another example of philosophers fucking everything up. But I have to admit she has a point -- sometimes, definitions get in the way of understanding.
Anyway, I don't have anything profound to add. It's just worth reading, though I ended up with no desire to buy the book being promoted there. Just some things to think about. Personally, I think it's pretty clear that there's no binary "life" and "not life"; there's middle ground, like viruses. It's rare that something is truly a binary choice -- heads or tails, no other options -- and I think that any working definition of life needs to take these gray areas into account.
But it's not like I'm a scientist or a philosopher, and I don't have any skin in the game, so take my opinion with a big grain of non-living salt. |
March 17, 2021 at 12:03am
|
Check out this One Weird Trick.
You might have noticed that I have a wide range of interests. I could never choose between math things and language things. As a result, I never got really good at any of them.
But here I am spending my time attempting to improve my writing and learn new languages. For me, learning won't stop until I'm brain-dead.
I was a wayward kid who grew up on the literary side of life, treating math and science as if they were pustules from the plague.
This article is from 2016. It turns out most of us don't actually avoid the plague.
So it's a little strange how I've ended up now -- someone who dances daily with triple integrals, Fourier transforms, and that crown jewel of mathematics, Euler's equation. It's hard to believe I've flipped from a virtually congenital math-phobe to a professor of engineering.
And already the author is beyond my paltry knowledge of mathematics.
But these hard-won, adult-age changes in my brain have also given me an insider's perspective on the neuroplasticity that underlies adult learning.
The point being that if a person wants to learn something, is motivated to do it, they usually can.
If you can explain what you've learned to others, perhaps drawing them a picture, the thinking goes, you must understand it.
Pretty sure Richard Feynman, the late physicist, was an advocate of that technique.
The problem with focusing relentlessly on understanding is that math and science students can often grasp essentials of an important idea, but this understanding can quickly slip away without consolidation through practice and repetition.
I'd say this is the case for anything. Since my memory isn't great, I have to repeatedly review earlier language lessons, doing exercises over and over. I still don't always get it right, but I get better.
There is an interesting connection between learning math and science, and learning a sport. When you learn how to swing a golf club, you perfect that swing from lots of repetition over a period of years. Your body knows what to do from a single thought -- one chunk -- instead of having to recall all the complex steps involved in hitting a ball.
Again, that's probably the case everywhere. A while back, I came to the conclusion that learning is largely about consolidation. Simple concepts build up to more complex ones. You see this a lot in math, where they often consolidate complex equations into one symbol for ease of manipulation, and then unpack it later. It's kind of like a compressed file on a computer.
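A trivial illustration of what I mean (my own toy example, not from the article): take the quadratic formula. The messy bit under the square root, b^2 - 4ac, gets bundled into a single symbol, D, the discriminant. Once it's chunked that way, you can reason at a higher level -- D > 0 means two real solutions, D = 0 means one, D < 0 means you're off into complex-number territory -- and you only unpack D back into b^2 - 4ac when you actually need to crunch numbers. Compress, work with the compressed version, decompress when needed.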
But in my case, from my experience becoming fluent in Russian as an adult, I suspected -- or maybe I just hoped -- that there might be aspects to language learning that I might apply to learning in math and science.
This, too. Math is really just another language, or at least that's one way to look at it.
I think this is the money quote, though:
Understanding doesn't build fluency; instead, fluency builds understanding. In fact, I believe that true understanding of a complex subject comes only from fluency.
Her path is the opposite of mine -- she learned languages first and then mathematics, while I went the other way -- but I think her insights are helpful in either direction. If I have one quibble with what she says here -- and there's a whole lot that I'm leaving out, so definitely check out the article -- it's the focus on learning for a purpose rather than learning for the sake of learning. However, if having a purpose for what you're learning keeps you motivated, I can't fault that.
But in my view, the biggest obstacle to learning anything is when you tell yourself you can't. Sometimes this comes from outside, as when a parent or a teacher might have discouraged you from pursuing a particular field. But sometimes it comes from having tried, and failed. If you think you can't, then it becomes a self-fulfilling prophecy. But once you open your mind to the idea that you can learn something, then you will, at least to some extent.
And I say this as someone who has been convinced, from time to time, that he can't. But I want to change that. And I think I can. |
March 16, 2021 at 12:02am
|
Today, we veer dangerously close to some of the Forbidden Subjects. Fortunately, the responsibility for them is on this Cracked article, not on me.
It is not a well-kept secret that we're in an era ripe with the trappings of digital-isolation.
We all make errors. Yes, even me. I'm sure you've seen some and have been too polite to mention them. In my defense, I write these things in an hour or less, and I don't have an editor. Moreover, I'm not getting paid. And sometimes I've been drinking (not tonight, though).
But if you're going to write (or edit or publish) a piece on the dangers of disinformation, it behooves you to get certain things right. To wit:
ripe: adj. having reached a state of readiness or maturity
rife: adj. full of
These words are spelled similarly and their definitions aren't that dissimilar (there are more definitions of both), but they are different enough that I catch it when someone misuses "ripe" when what they really mean is "rife." This is not as blatant as when someone says they weren't "phased" by something (it should be "fazed"), but again - it's an article about being misinformed about things.
But really, that's minor compared to the bullshit illuminated in the article.
While some argue that we're actually closer than ever -- being able to keep in contact with folks scattered across the globe -- the quality of these interactions is less than those of the in-person kind. There's something magical about that face-to-face interaction with our fellow human beings.
I have not found this to be the case, but then, I'm an introvert with misanthropic leanings.
People are now desperately seeking out that feeling of closeness and those feelings have left space for something dangerous to sneak: misinformation and conspiracies ...
To be fair, though, people have been believing bullshit for far longer than there's been pandemic-inspired social distancing. It may not feel that way, but the pandemic is only technically a year old, and bullshit is at least as old as civilization.
5 Online Groups Are Getting Really Weird
When missing that sense of community, online groups have become a quick fix for the isolated.
I've been part of one online group or another for years. I don't see the difference, except that I can't pour the other people a drink online.
There's a player in this game that, while a lot less prominent than the social media giant, has a lot more danger tied into it: Nextdoor. If you aren't familiar, Nextdoor is a social media app with about 10 million users that focuses on hyper-locality rather than the large scale of Facebook. It's more like a hot garbage fire engulfing your neighbor's trash can rather than your city's dump. You have to give your address to make an account, and your "group" are your literal neighbors. Apps like these are particularly infamous, from everything to exaggerating fear in local crime and to helping enable racial profiling.
Okay, confession time: I follow my neighborhood in Nextdoor. I have never participated in a discussion there, but I lurk. And while I've seen more bullshit than I usually like, I've also seen it get piled on and corrected. Nor have I personally witnessed some of the darker side described here. But I live in a progressive, diverse neighborhood in a college town, so I accept that my experience may not be the norm.
Rather than bringing communities together, and helping raise that social capital, they just tear people apart and devolve into massive unmoderated fights over everything from vaccines containing microchips and that tea tree oil is a suitable replacement to washing your hands, to Trump being God's gift to America fighting against cannibals and pedophiles. Seeing your neighbors spout such insanity creates a McCarthy-level paranoia, leaving people wondering if that kind person from down the street, who always drops a pie around the holidays, is really a card-carrying Antifa assassin or QAnon quack.
As Nextdoor is the polar opposite of anonymous internet shitposting, this should be easy enough to figure out.
4 Yoga And New Agey Types Are Getting Into It
Surprise: wellness communities are falling into QAnon conspiracies. Who could have expected that the people who believe crystals have magic healing powers have some hot takes on Covid?
But if you just align your chakras with the proper crystals, you'll be immune to Covid. And it just so happens that I have a perfectly tuned set, which can be yours for the low, low price of $179.99...
What may have started off as a holistic medicine post like, "Here's a turmeric-infused green juice immunity shot that will help give you a boost," can turn into a complete rejection of modern medicine and common-sense recommendations of doctors and researchers. It seems that the acceptance of alternative belief systems made it rather easy to accept alternative facts.
I don't want to rag on peoples' beliefs, but if those beliefs contradict science, we have a problem.
While I appreciate the core tenants of spiritualism, as well as keeping an open mind to the truths around you, it seems counter-intuitive to accept a "truth" that rejects all other truths. If you're able to derive meaning from an Insta or TikTok tarot reading, then here's hoping you can learn to derive reading from actual medical research and preachings from true field experts.
I've generally seen what's called "alternative medicine" as an adjunct to, you know, actual medicine. Not as a replacement for it. I get that people have spiritual needs, and if they want to do yoga or meditate or consult Tarot or pray or whatever, I'm not going to gainsay them. But when they say shit like "an angel told me that Covid is a punishment from God for our evil ways," again, we have a problem.
3 Teachers Are At Risk
We've all had that one out-there teacher who was the champion of the insane. My personal favorite was my high school physics teacher, who had the genius idea of building a hot air balloon that could lift himself off the ground as a class project. Unfortunately, the administration caught on to his idea and made him scale it down to a bowling ball.
This is funny, but it's not a great example. Sure, things could easily go wrong, perhaps even spectacularly, but the science behind hot-air ballooning is well-known and settled, and has been since the time of Archimedes (at least in theory; it took a couple millennia to put it into practice). On the other hand, there is no science behind a lot of the bullshit that's out there.
A video of an English professor at Mesa Community College in Arizona was leaked online by a student, where he went on a 14-minute rant touting QAnon to his class. He was thankfully subsequently fired, as baseless and dangerous conspiracy theories are rightfully considered more dangerous than a hot air balloon made by a bunch of teenagers.
The danger of ideas is far more pernicious than the danger of falling out of a balloon. And while, throughout history, many ideas have been labeled "dangerous," history forgives the dangerous ideas that are based in fact. Not so much the fictional ones.
There's implicit trust in teachers and a given assumption for children (and a lot of adults) that things you are taught in the classroom are facts.
I lost my naïveté in that area in eighth grade. I've told the story here before. In that case, the teacher was misinformed, but not deliberately so. She didn't have an agenda for spreading nonsense; she simply misunderstood physics. That would be bad enough in a generic teacher, but she was a science teacher.
It really doesn't help that a third of all Americans believe that the election was fraudulent or that 86% of teachers haven't even addressed Trump and his claims about voter fraud with students. It's hard to know what you're supposed to believe when the adults in your life don't even seem to know what the truth is, and no one will even talk to you about it.
Okay, but I can kind of understand the 86% thing. Stating your beliefs on the subject of the last Presidential election, whichever side you come down on (truth or fiction), is going to be seen as a political statement -- and political statements don't belong in primary schools. Hell, once I say "Joe Biden was the legitimate winner of the 2020 Presidential election," half the country would dismiss everything I had to say. Or, you know... vice versa.
This is why I try to stay away from politics, one of the Forbidden Subjects.
By the way, Biden was the legitimate winner.
This is the one in-person outlet that these incredibly impressionable youths are getting during the pandemic to even interact with other people. The level of influence that this one and only exposure will have on them can't be understated, and it's best if it isn't flooded with falsity soapboxing.
Or, really, any kind of soapboxing, at least insofar as it's unsupportable by facts.
I remember that the worst thing I had to deal with in high school was trying to create funny content for my Snap story. What a time to be a school kid, with the looming decline of democracy hovering over you like an unsafe hot-air balloon.
Oh, boo hoo. We expected the Bomb to drop at any moment.
2 Given The Coup, We Clearly Need To Address This With The Military
One in five of the participants who stormed the capital had a military history, which when you consider that veterans only make up 7% of the population, not 20%, it's clear how extreme the skew is.
No argument here, but one must be very careful playing with statistics. To hear some people talk, the military is rife (see, that's how you use the word) with right-wing Nazis. I do not believe that this is the case. Are they in there? Sure. There are also a bunch of liberals in there, and a whole lot of centrists. So while the Koup Klux Klan there in Washington in January had a disproportionate number of ex-military folks, let's not go condemning the military itself. Correlation isn't causation.
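To put rough numbers on that skew (my arithmetic, not the article's): if veterans are about 7% of the general population but roughly 20% of those arrested, that's an overrepresentation factor of about 20 / 7, or a bit under 3. Ex-military folks showed up at nearly three times the rate you'd expect from their share of the population -- a real signal about who misinformation campaigns reach, but still a statement about the people who were there, not about the millions of veterans who weren't.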
The literal last thing we want is the military involved in a coup against the American government. They're literally supposed to be protecting us from insurrectionists, not being the actual goddamn invaders. Hopefully, this really underlines the threat of misinformation campaigns with a bright yellow wide-tipped highlighter.
Can we literally stop using "literally" as an intensifier? No? Oh well, I guess that ship has sailed. Point is, though, that as far as I know, they were all acting as private citizens. While I have respect for the military in general, it's made up of people, and people can be subject to believing bullshit.
What all this really shows is that we need to create a strategy for defending against misinformation. At the very least, we could provide some basic media literacy training for our armed forces.
I have no idea if this is a thing or not, but basic media literacy training sounds like a good idea for everyone in general. It doesn't have to be politically biased.
I should also point out that neither I, nor the author of this article, is singling out the conservative side of things. Point 4 above was about new-age stuff, which is almost exclusively the province of the left. You won't hear me say "both sides are bad," but what I am saying is that bullshit doesn't take political sides; it's an equal-opportunity brain rotter.
But now we get to the other Forbidden Subject.
1 Churches Are A Fertile Ground For Misinformation
This is, quite honestly, a weird one. I mean, is it really shocking that churches house conservative messaging? But while there are definitely some churches out there that are openly advocating QAnon gospel, it's weirdly the nature of how the conspiracy is told that appeals to Christian sensibilities.
Far be it from me, an atheist, to defend churches, but I'm certain that, as with the military, they're not all conservative in the political sense. People tend to go among like-minded people, though, and if a homophobe, for example, finds himself in a church led by a gay minister, he's going to find another church, one that reinforces his beliefs. And the other way around, of course. So, no, it's not shocking that some churches house conservative messaging, but let's not delude ourselves into thinking that all churches are alike. If they were, there wouldn't be hundreds of different sects.
Once you think about it, it actually makes some sense that QAnon is filling this void of religion. Dark and terrible forces influencing the earth? A lone savior here to bring justice, who works in secret and mysterious ways? The storyline is surprisingly ... Christian. (You could have gone with Blade instead of Q, but nooo.)
Oh, come on now, Superman is the obvious choice.
So even if you wanted to do the right thing as a religious leader -- preach tolerance, forgiveness, to look out for deceivers and misleaders -- you don't really have space, time, and community to foster these connections with people and guide them away from dangerous rhetoric.
Again, I'm in the awkward position of having to defend religion here, but I'm absolutely sure that there are many religious leaders (Christian and otherwise) who are positive and inclusive in their messaging. The ones with the "dangerous rhetoric" tend to have a bigger megaphone, it's true, and they're the ones you hear about (usually right before getting caught banging a prostitute, but that's another issue). But as with the military, we can't just lump them all together in the "believes whacko conspiracy theories" pile.
Fertile ground, though? Sure. Once you start believing one unsupportable thing, you can be open to believing more. That's one reason I hammer on about science.
And while we definitely do live in the darkest timeline, don't give in to fear.
During the nadir of our current situation, I texted a friend of mine like: "What if, in the future, we invent time travel? And there are time cops who go back and fix the timeline to produce the most optimal outcome? ... And what if we're living in the result?"
"You shut your whore mouth right now," was the response.
Point is, maybe we are living in the darkest timeline. Maybe we're not. I like to make jokes about that, myself. I can't count the number of times I've said, "This timeline sucks." But for me it's more an expression of frustration than anything else; fiction aside, I don't think there's any way to change the past.
Nor can we change the future. But we create the future all the time. And I'd rather live in a future where we can still have different opinions about politics, religion, etc., and even argue about such things -- but those opinions are all based on facts.
And to those out there facilitating the spread of misinformation, here's a little biblical quote courtesy of Rob Buckingham, a pastor of Bayside Church in Melbourne: "Ephesians 4:25: So, stop telling lies. Let us tell our neighbours the truth."
And there it is, folks: probably the one and only time I'll quote the Bible in here, and it's buried three layers deep in someone else's quote. |
March 15, 2021 at 12:01am
|
Oh, wait, that's supposed to be "Ides," not "Ideas."
Here's the thing: I've been up way longer than usual at the moment; I've had a couple of beers, and I've been busy doing stuff all day. So I'm a little burned out.
I pick articles at random to feature here. As luck would have it, just now, my usually friendly random number generator had a good laugh at my expense, and came up with this ponderous pondering:
By the way: If you're put off by the word "mathematics," don't be; there's not a single equation in the article. As far as I can tell, it could just as well have been titled "The philosophy of mind-time."
This was fascinating when I first found it, for some reason, and it continued to stretch my brain today. The only problem is, today, my brain is already stretched as far as it can get right now.
So... I'm just going to leave this here. I hope someone else will be interested. I neither agree nor disagree with the article, incidentally; I just like the way the author presents their case. This stuff is seriously above my pay grade, but at the moment I can't even think of ways to riff off of it for humorous effect.
Yeah... I know... I'm cheating today. I'll give myself this one. Hopefully, I won't make a habit of it.
I'm going to go turn off my consciousness now. Tomorrow we'll see if I can make jokes, or at least be coherent. |
March 14, 2021 at 12:01am
|
I don't know if I can do this one without offending someone. But I'm going to try.
On Nov. 22, 1968, an episode of "Star Trek" titled "Plato's Stepchildren" broadcast the first interracial kiss on American television.
So, that would be five years to the day after the JFK assassination, well into the civil rights / desegregation era. Laws changed. Lots of peoples' minds did not.
I was too young to watch it when it came out, but of course when I got old enough to catch reruns, I saw that and all the other episodes of Trek. I don't think it ever occurred to me to note the "interracial" aspect, and I was certainly ignorant of the history behind it. Kid Me was too busy being bothered by other aspects of that particular episode, as we'll see in a bit.
The episode's plot is bizarre: Aliens who worship the Greek philosopher Plato use telekinetic powers to force the Enterprise crew to sing, dance and kiss.
You know, as far as Trek episode summaries go, that's pretty tame. I'm trying to imagine being Roddenberry or Coon or whichever producer was responsible for hearing all the elevator pitches (yes, there were elevators back then), and having to go, "No... weirder."
Anyway, that was the part that I recall being scarred by as a child: the idea of aliens using mind powers to force others to do their bidding. Dance, my puppets! DANCE! (The episode, by the way, is worth watching just to see Nimoy doing his "I'm being forced to do this" dance.)
It was only later that the cultural significance of the Shatner/Nichols scene was pointed out to me. I mean, I lived in the South, not far from where the "Loving" in "Loving v. Virginia" came from, so I wasn't exactly sheltered from racism; I just thought of it as something only ignorant people believed in.
I guess I still do.
The smooch is not a romantic one. But in 1968 to show a black woman kissing a white man was a daring move.
As I see it, one of the main purposes of science fiction is not to show how things will be, but how they could be. This includes social change.
But just as significant is Nichols's off-screen activism. She leveraged her role on "Star Trek" to become a recruiter for NASA, where she pushed for change in the space program. Her career arc shows how diverse casting on the screen can have a profound impact in the real world, too.
The rest of the article pretty much focuses on Nichelle Nichols, which as far as I'm concerned is a good thing. But it's impossible to talk about Nichols without mentioning Uhura, and especially that episode of Star Trek. So it's good to get that bit out of the way.
Nichols' controversial kiss took place at the end of the third season.
Flag on the play: it was the middle of the third (and final) season.
It's worth taking a look at the bits I'm not quoting here; the article is pretty short and details Nichols' efforts to advocate for inclusion.
Star Trek is, of course, fiction, but the stories we're told influence the way we see the world. The show was far from perfect, but it tried to demonstrate the benefits of diversity at a time of great social change. And who knows; maybe someday life will imitate art. |
March 13, 2021 at 12:01am
|
We've all known for a while that corvids are smart (and smart-asses), but it's always good to have some science backing things up.
As usual, I have a few quibbles with the article, but I think it's generally effective in communicating what's going on.
Whether crows, ravens, and other "corvids" are making multipart tools like hooked sticks to reach grubs, solving geometry puzzles made famous by Aesop, or nudging a clueless hedgehog across a highway before it becomes roadkill, they have long impressed scientists with their intelligence and creativity.
Don't know why they had to put the quotes there. And it's not only scientists who are aware of these birds' big brains -- anyone who's watched them long enough can tell you they're scary smart.
Research unveiled on Thursday in Science finds that crows know what they know and can ponder the content of their own minds, a manifestation of higher intelligence and analytical thought long believed the sole province of humans and a few other higher mammals.
1) This article is dated last September. "Thursday" doesn't help much now.
2) I've known some humans who can't "ponder the content of their own minds."
3) The author uses the adjective "higher" here and in a couple other places. I don't like it. "Higher" implies a hierarchy that can be misleading, or it could mean "more intelligent," in which case it's redundant in an article talking about intelligence. No species is really more advanced than any other, in general; they've merely evolved different survival strategies, only one of which is what we'd call "intelligence."
Knowing what you know is also a form of consciousness, and the discovery that more and more nonhumans seem to have it raises tricky questions about how we treat them.
Unless they're also delicious, like pigs.
The article goes on to describe, in detail, an experiment that demonstrated the cognitive abilities of crows, and another that studied the neuroanatomy of sky rats. Er, I mean, pigeons. I have to admit I kinda zoned out on this part, more interested in the conclusions than the methodology.
"Besides crows, this kind of neurobiological evidence for sensory consciousness only exists in humans and macaque monkeys."
Snort. He said "macaque monkeys." Snicker.
"In theory, any brain that has a large number of neurons connected into associative circuitry ... could be expected to add flexibility and complexity to behavior," said Herculano-Houzel. "That is my favorite operational definition of intelligence: behavioral flexibility."
I do wonder about this, but the woman quoted there is a scientist and I'm just an onlooker. Either way, though, I find this sort of thing fascinating. It's like how a crow can't resist a shiny thing, or how a raven can't resist annoying an angsty poet. |
© Copyright 2024 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved. Robert Waltz has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.
|