Complex Numbers
A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.
The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, b (the coefficient of i).
Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.
Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
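To make that concrete, here's a minimal sketch in Python (my own illustration, not part of the article above): iterate the dead-simple map z -> z^2 + c for each point c in the plane, and the points whose orbits stay bounded trace out the Mandelbrot set, probably the most famous fractal of all.

```python
# Minimal Mandelbrot sketch: iterate z -> z*z + c and see how quickly
# each point c "escapes" (once |z| > 2, the orbit is guaranteed to diverge).

def escape_time(c: complex, max_iter: int = 50) -> int:
    """Count iterations of z -> z*z + c before |z| exceeds 2."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return max_iter

# Coarse ASCII rendering of the region [-2, 1] x [-1.2, 1.2].
for row in range(24):
    im = 1.2 - row * (2.4 / 23)
    line = ""
    for col in range(72):
        re = -2.0 + col * (3.0 / 71)
        line += " .:-=+*#%@"[min(escape_time(complex(re, im)) // 6, 9)]
    print(line)
```

A dozen lines of looping, and out comes an infinitely detailed boundary - which is the "enormous intricacy" part.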
December 31, 2018 at 12:25am
I've been thinking about what to post for this final blog entry in 2018.
I thought about doing a year in review, but everyone does that, and it's gotten boring. (I will, however, link this article about this year's science accomplishments. Some of them are more "mind-blowing" and "life-changing" than others. https://www.businessinsider.com/biggest-scientific-discoveries-of-2018-2018-12)
I also considered looking ahead. But that way lies madness. I've never yet been able to accomplish any of the goals I set at the end of a year, and I'm sure as hell not going to start now. Resolutions piss me off. At least I'm consistent at succeeding at being an utter failure.
Then I thought, maybe I'll talk about how arbitrary the new year is and how marking it is a fool's game. But I've already covered that in my post about the solstice, and what the hell, let's let everyone have their own fun. Don't forget to tip your Uber and/or Lyft driver; it sucks having to work New Year's Eve.
So, fuck it. I'm done. See you all next year.
December 30, 2018 at 12:17am
So, you thought 2018 was bad...
https://www.sciencemag.org/news/2018/11/why-536-was-worst-year-be-alive
Not much to say here, really, except that interdisciplinary science is cool.
"We've entered a new era with this ability to integrate ultra–high-resolution environmental records with similarly high resolution historical records," Loveluck says. "It's a real game changer."
Whenever you see someone on TV with a PhD, the implication is "wow, this person is really smart and knows a lot about their field." In reality, though, PhDs tend to focus on minute details of one aspect of one branch of study. Perhaps it's just because I'd rather know a little bit about a lot of things than a lot about a single thing, but I find this sort of syncretic science fascinating.
Both approaches are necessary, of course. But you don't often get archaeologists, climate scientists, historians, and practitioners of other disciplines working together. When you do, things get interesting because you have multiple insights into the same era, painting a better picture of what really happened.
In any event, while the past sure is interesting, I'm not one of those people who romanticizes it. I'd rather be in my own era. Or possibly in the future. Still sore about not having a flying car. So this article puts things in perspective for me: 2018 was not, after all, the Worst Year Ever.
December 29, 2018 at 12:46am
Today's science article is, I'm afraid, fairly long - but it's written to be easily understood:
https://www.nytimes.com/interactive/2018/11/15/magazine/tech-design-ai-predictio...
Randy Buckner was a graduate student at Washington University in St. Louis in 1991 when he stumbled across one of the most important discoveries of modern brain science. For Buckner — as for many of his peers during the early ’90s — the discovery was so counterintuitive that it took years to recognize its significance.
Now, there's a lot to unpack at that link. The title, I think, is a bit sensationalist: "The human brain is a time traveler."
Well, as it turns out, kind of. What he means is only that we have the ability to contemplate both past and future, sometimes skipping between the two in less than a heartbeat. Okay, fine. Then he proposes that, perhaps, this ability is what makes humans human and not, say, sponges or dogs.
Jury's out on that. Every time someone comes up with a single thing that claims "This... this is what gave us the ability to go to the moon / create reality TV / contemplate quantum mechanics / build skyscrapers / whatever," someone else comes up with an example of someone in the animal world having the same ability. Language. Tool use. Opposable thumbs. Etc. Apparently, it's not any one attribute, but the whole complex of them.
But, anyway, let's take this at face value for now. Like I said, there's a lot to read in that article, but I'd like to pay particular attention to something that's more between the lines.
I've been hearing a lot more lately about "mindfulness," usually in the context of "living in the moment" or in the present. This idea has never sat very well with me. Based on the information in this article, it turns out I may be onto something.
First of all, I don't think there is a "present." Or, if there is, it's something akin to the infinitesimals they use in calculus - something arbitrarily small that helps us to calculate a broader function. And I've just lost at least half my audience, so I'll move on to my next point.
Seems to me that, given this information about how our brain works in its resting state, any attempt to live in the present makes us less than human. And while there are a lot of things that humans do that we ought to be ashamed of, is the answer really to attempt to revert to some sort of pre-cognitive state?
I don't know. I did say I was biased. But I'll continue to contemplate the past and the future, because, as the author points out - that's the essence of storytelling.
Sure, the future is uncertain, and the past can be murky because our memories aren't exactly precise. But they're all we've got.
December 28, 2018 at 12:34am
Today's linked article turns feel-good stories into feel-bad stories.
https://thinkprogress.org/do-the-impossible-never-complain-live-the-dream-the-da...
Stories like this keep popping up on Twitter like zits on a prepubescent forehead: The sunshiney announcement about the GoFundMe for the guy with leukemia who can’t pay for his own medical costs. (He is employed by an organization whose owner has a net worth of $5.2 billion.) The dad who works three jobs to support his family saving up to buy his 14-year-old daughter a dress for an eighth grade dance. The college student who ran 20 miles to work after his car broke down and whose boss rewarded him for this effort by giving him his own car.
Do you get a sinking feeling when you read these stories? This feeling like, while of course you are impressed by the tenacity and generosity on display, you still want to vomit?
Behold, the rise of the feel-good feel-bad story.
I don't disagree with the thrust of the article. Providing for others may be more rewarding to accomplish directly, rather than indirectly through mandates. But it's been obvious to me for a while that our system is broken.
And yet - and here, I'm going to venture into controversy - I'm not convinced about the value to society at large of paid parental leave.
We don’t have federally-mandated maternity leave in the United States, making us one of the only nations on the face of the Earth to deny our citizens this basic and vital thing.
Essentially, I disagree that it's "basic and vital."
Of course, it's nice. It's definitely a benefit for parents, and probably for their children. Likely, not so much for everyone else.
The above quote links to a Pew Research study from 2016: http://www.pewresearch.org/fact-tank/2016/09/26/u-s-lacks-mandated-paid-parental...
Nowhere in the article, or in the Pew study, do I see anything that explains to me how parental leave is a good thing for society. It's asserted as a given, without evidence, without even argument.
Consider this:
1) Having children is a choice. Especially in the contemporary world, virtually no one can say they brought a child into the world by accident. Pregnancy can be accidental. Birth is not.
2) While one is on leave, the employer must either do without a person's services, or find a temporary replacement. While no one is irreplaceable, this puts an undue burden on the employer.
3) As we move into a more automated workforce, the need for human workers is decreasing; therefore, it is not to the benefit of future society to encourage population growth, or even stagnation.
4) Likewise, the primary driver of many of the problems facing us today - pollution, climate change, hunger, maybe even war - is population growth. Reverse that, and some problems are mitigated.
5) Parental leave is unfair to those of us who have chosen not to reproduce. Yes, some organizations provide "family leave" for reasons other than childbirth, but we're talking about state or federally mandated leave, not a choice made by an employer.
On the other hand, I do recognize that in order to continue capitalism as we know it - profits driven by eternal growth - we need the population to increase. So you'd think more companies would be on board with this (more consumers = more profit). Again, though, I don't see how this could be a good thing overall.
Now, I can see how a policy of "no paid parental leave" affects women disproportionately. And that's not cool. But nothing I've ever seen has convinced me that mandated paid parental leave is a benefit to society in general (again, yes, it is a benefit to the individuals involved). The idea of its benefit is stated, as I've pointed out, without argument or evidence.
Thing is, I have an open mind. If you're so inclined, give me some good rationales and I could change my opinion. Not that my opinion matters either way, but at least I might understand the reasoning. (Anecdotes probably won't work; I've already acknowledged that it's a benefit for the individuals taking the leave.)
December 27, 2018 at 12:22am
Like many people, I struggle with weight issues.
It is a source of endless frustration to me, knowing that the things that I want to eat are generally unhealthy. We all know this, of course, and yet some people are better at eating healthy foods. I am not one of them.
I waver. Sometimes, this is because I am basically lazy. It is easier to eat premade food than it is to make my own. Sometimes, even something as simple as cooking up a serving or two of broccoli (a vegetable that I actually like) seems like a monumental task, especially when factoring in having to clean up afterward.
The ideal "healthy" food, to me, is carrots. Minimal preparation, great when eaten raw (in my opinion anyway), and little to no cleanup. But when I'm hungry, do I think of grabbing a handful of "baby" carrots, a conveniently prepackaged serving of bite-sized root vegetables? No, of course not. I end up going to Taco Hell.
So yes, I'm one of those people who would rather have a pill to take to get and/or keep my weight down, something that would enable me to eat whatever the hell I want. I desire this even more than flying cars, jetpacks, or warp drive. Maybe not more than holodecks though. A fully programmable holodeck would be ideal.
But I digress.
Almost every article I've seen about Western obesity makes some version of the following observation: "We live sedentary lifestyles and eat junk food."
This observation is usually unsupported by evidence - at least, when comparing the people of today to the people of, say, the early 70s. It's an assertion that implies that the reader will agree with the writer. And, usually, because so many such articles have been written, it tends to be taken as fact.
But I remember the early 70s. One thing I remember clearly is the news reporting on this new "jogging" trend. People actually went outside and ran! This was unheard of! It's preposterous, and look what they're wearing!
As a group, we are more aware of the benefits of exercise than ever before. Office jobs have existed for over half a century, and even back then, people would come home and plop themselves in front of the TV. And yet, as a group, we're far heavier than we used to be.
Even accounting for the proliferation of "junk food" (which, I can assure you, existed in vast quantities 50 years ago), I've always felt that something else must be at work. That something else, I hypothesized, was that, at about the same time as the obesity "epidemic" began, there had been great advances made in the creation of antibiotics. And these antibiotics (again, according to my amateur hypothesis) might have attacked certain beneficial gut microbes, ones that assist in digestion and help us keep from becoming too fat.
I'm not a scientist, of course, and as I said, I'm lazy. So I never really followed up on it. Something at the back of my mind, that part of me I rarely pay attention to, also said, "Come on, Waltz. That's just an excuse. Eat more carrots and fewer cheeseburgers, and exercise." And, indeed, I'd never seen any legitimate scientists explore that area of research.
Not that they would. There's a pernicious streak of puritanism lurking in American society. The idea that there might be an easy answer is anathema. Rather than working on ways to reduce the bad effects of alcohol, for example, we urge people not to drink. Instead of concentrating on a cure for lung cancer, we tell people not to smoke. Anything perceived as a "vice" has to be rooted out and destroyed, rather than mitigated by the awesome power of science - because we just have to ensure that anything remotely enjoyable has consequences.
And if we're fat, this puritanical streak insists that it's our fault. A moral failing.
Now, look, I admit to many moral failings, including laziness and apathy. Maybe even gluttony sometimes. But that's no excuse not to pursue avenues of research that could lead to some solution other than "if you don't follow our dietary and exercise advice, you're worthless." Especially when, as I've noted repeatedly, nutritional science is rife with bias and bad data - and extremely questionable conclusions.
Anyway, enough. Turns out I might have been onto something.
https://getpocket.com/explore/item/how-the-western-diet-has-derailed-our-evoluti...
I'm not taking this as the absolute truth either, of course. But it's worth studying further. That is, by people who aren't lazy.
December 26, 2018 at 12:02am
I can venture forth again!
Not that I want to, but if I did, I could.
Yes, the incessant holiday tune barrage is (probably) over. There's only New Year's ahead, and that only has a couple of songs attached. So it's safe for me to go out again. Still not going to retail stores, though, because they'll be packed with ungrateful returners. But supermarkets are probably safe. So are bars.
My self-imposed exile has resulted in at least one benefit: I actually wrote a story.
Yes, yes, I know. Need some smelling salts?
I wrote it for a contest, and because I'm more interested in writing than in winning - and because one of the organizers is a friend - I'd love to see some other entries. You can find it here:
Invalid Item: #2176427 (this item number is no longer valid).
There's still most of a week to enter, and the word/line count requirement is fairly low.
Maybe - I don't know; I can't promise, but maybe - I can actually write more soon. That is, after all, why I came here in the first place.
December 25, 2018 at 12:32am
The tradition on Christmas, I believe, is to post something happy and uplifting, some sort of paean to the best side of human nature and to hope for the future.
I should hope you know me better than that.
https://getpocket.com/explore/item/the-day-dostoyevsky-discovered-the-meaning-of...
One November night in the 1870s, legendary Russian writer Fyodor Dostoyevsky (November 11, 1821–February 9, 1881) discovered the meaning of life in a dream — or, at least, the protagonist in his final short story did.
What is the polar opposite of everything that is happy and uplifting? Russian literature, of course.
Not that I've actually read any. Well, I did once read The Master and Margarita, but apart from that, I settle for excerpts like what you have here.
But that fatalistic, profoundly hopeless style appeals to me, and perhaps I'll tackle Dostoyevsky at some point, or maybe Tolstoy because what the hell, right?
Anyway, that article I linked quotes Dostoyevsky extensively and kind of sums up a lot of the existential issues I've been dealing with here. Dreams. Depression. Emotions. Symbolism. Questioning everything.
And yet...
And yet, reading through to the end, something else happens. Something, perhaps, not very Russian at all, but something that maybe touches on, if not the meaning of life, at least a reason to live.
I don't believe there is a meaning of life. Or, rather, I believe that we must impose our own meaning upon it - or not, as suits each of us. Me, I long ago abandoned the idea of "meaning" or "purpose" or "goals" and just started living.
I'm not sure if that was the best course of action. But I'm not about to give up on it now.
(Please don't say "42" in the comments. One, I've already thought of it; two, 42 is the Answer to the Ultimate Question, not the meaning of life.)
December 24, 2018 at 12:18am
https://www.theatlantic.com/ideas/archive/2018/12/stop-blaming-millennials-killi...
The American system has thrown them into debt, depressed their wages, kept them from buying homes—and then blamed them for everything.
No, idiots. The American system didn't do that last bit. The media did. And you're part of the media.
The entire concept of demographic "generations" annoys the shit out of me. And it only gets worse as time goes on and I read more crap like this.
Just think about it for a couple of minutes. You have someone born in, say, 1986, supposedly an early year of the "Millennial" generation. And then you have someone born in 1983, supposedly a late Gen-X. You're telling me there is, somewhere in '84 or '85, a hard cut-off, where someone born one day is Gen-X and someone born the next day is a Millennial? Hell, you can even imagine identical twins, one born during the last seconds of Gen-X, and the other born in the first second of Millennial.
On top of which, you'd have to convince me that our 1983-born X-er has more in common with someone born in 1966 than with someone born in 1986.
It's fucking astrology, is what it is, and without the interesting math involving apparent planetary motion.
I look at it this way:
There has always been, and always will be (at least until the inevitable apocalypse), a continuum of ages. In any given year, a wide variety of people were / will be born. A sharecropper's kid has very little in common with a trust fund baby. Two people in the same socioeconomic stratum can also be as different as heads and tails on a coin. There is as much variation within a group of people born in a given year as there is between groups of people born 25, 50, or even 75 years apart.
Also, some things suck and other things get better. This is due not to a single "generation" or cadre of ages, but every single person doing his or her own thing.
And attitudes change over time. There's nothing new about this observation. But people pick up different attitudes depending on their upbringing and the stories to which they're exposed. And who does the upbringing, and the story-telling? Older people, subject to their own biases. Yes, ideas evolve. No, you don't get to blame an entire cohort for that.
I'm pretty sure the whole "generations" thing is just another way to divide us, like politics or countries. It distracts us from the real issues, which we can either work to solve, or ignore, depending on one's individual preference.
Sure, there are going to be differences between someone like me who remembers (barely) the first moon landing, and someone who grew up sucking on an iPad (which, incidentally, has exponentially more processing power than the Saturn V did). But let's not fool ourselves: for as long as there has been civilization, old people and young people have complained about each other, and yet people continue to produce young people like there's no tomorrow (in fairness, there might not be).
So you know what I want to see Millennials finally kill?
Generations.
Fly, my pretties!
December 23, 2018 at 12:56am
I'm not one for tracking celebrities. For the most part, they're people just like the rest of us, who might be prettier, or more talented, or just happened to be in the right place at the right time. They're not superior in any other way, and they're certainly not deities.
I make an exception for Bruce Springsteen.
https://www.esquire.com/entertainment/a25133821/bruce-springsteen-interview-netf...
There is, I think, a certain relationship between depression and creativity. It can be an insidious one, though - you can get more creative, but you can also get more melodramatic. Or the depression can give you all kinds of ideas, but prevent you from gathering the energy to express them. Bruce seems to have found the balance needed to overcome this.
Springsteen has always been an inspiration for me, ever since I first heard Born to Run from a tinny AM radio a few years after it came out. That combination of words and music - to me, it's true poetry. Words alone can be inspiring. Music alone can be uplifting. Put them together in the right way, and they become transcendent.
There have been few experiences in my life more fulfilling than a Springsteen concert. Make of that what you will.
Now, I guess it's off to Netflix to see the video of the Broadway show.
December 22, 2018 at 12:26am
As you know, I really like science. It's not that I'm actually trained as a scientist or anything; it's just that I know it's the best tool we have for understanding ourselves and the universe. As a non-scientist, it's more like I'm checking out its butt as it goes by - I don't understand much of it, and I'll never be very close to it, but it sure does look good.
But, as I've noted before, I have serious issues with nutritional science - or, at least, how it's reported on. It's difficult to do it right, and even more difficult to get science reporters to really understand the findings. That's why you get the sense that they keep going back and forth on what's good for you and what isn't.
Some things are known to a high degree of certainty - for instance, the bad effects of trans fats on cholesterol. But other things keep changing: eggs have gone from bad to good to bad to good in my memory, and diet advice keeps changing with every study - biased or not - that seems to produce some result or another.
Worst of all, a lot of studies are biased because they're funded by people who want a certain result, and that can creep into the reports.
That's why I never did buy into the low-carb nonsense. Part of that was that I simply didn't want to: bread is the greatest of all foods, and if eating it makes me die sooner, then so be it; a life without bread is a life not worth living.
As it turns out, though, that may not be a problem.
https://qz.com/quartzy/1487485/the-scientific-case-for-eating-bread/
Bread has long been a foundational part of the human diet, but a revolt against it has been building for years—and seems to be reaching a crescendo. Today, many regard bread as a dietary archvillain—the cause of bigger waistlines and the possible origin of more insidious health concerns. Popular books and health gurus claim that bread and the proteins it harbors can cause or contribute to foggy thinking, fatigue, depression, and diseases ranging from Alzheimer’s to cancer.
But go digging through the published, peer-reviewed evidence on bread and human health, and most of what you’ll find suggests that bread is either benign or, in the case of whole-grain types, quite beneficial.
So there you have it. Time to make bread great again.
Now if only someone would admit that donuts are a health food...
December 21, 2018 at 12:32am
Every once in a while, I get pushed something like this:
https://getpocket.com/explore/item/a-list-of-8-core-values-i-live-by
Coming up with one's own core values is certainly preferable to having them dictated by something outside yourself. Let's take the first one on that list, "authenticity," as an example:
I don't think that's something I could adopt. Perhaps that makes me a bad person; I don't know. But the author explains it thus: "Be the same person at every occasion in life. Don’t act differently in front of your parents, friends, co-workers, in-laws, and strangers. Stay your true self. And never be afraid of other people’s judgments."
That's not me. My "true self" is a bit of a chameleon. I'm going to act differently in front of friends and strangers (I no longer have anyone in those other categories in my life).
Now that I think of it, this may be one reason I'm lousy at parties. I don't know who to be at any given moment. I feel the others sizing me up: "He's going for the cheese dip again, like he needs it." Well, I don't need it, but I enjoy it. How am I supposed to explain to people that I've never met that the joy I get from food is more important to me than being fit, or even my health?
I guess I'm doing that now, trying to explain something to people I've never met.
It wasn't always this way. When I was younger, I had more self-control. With the future ahead of me, I figured I could try on different personas, find one that fits.
Nothing ever did, so I essentially gave up. And now I don't know how to start giving a shit again. I guess it's just not that important to me.
What is important to me? Solitude - not all the time, but good chunks of it; I suppose this is helpful for a writer. Learning stuff - I feel good when I learn something new, even if it challenges what I thought I already knew. Comfort - also not all the time, but having someplace to which to retreat (I suppose this is related to solitude). Freedom - that is, being able to make my own choices without those choices affecting other people.
There's probably more, but those don't strike me as lofty ideals at all.
I read somewhere recently that who you are is more about what you do than about what you believe. Like, say you believe that it's important to stop climate change - but you still drive a low-mpg car and make unnecessary trips in it. Or if you believe in the fundamental equality of all people, but you ignore injustice when it benefits you.
In that case, does it really matter what your "core values" are? Or, perhaps, would you be better off simply deriving them from what you do, even if that's not always flattering? In which case, "procrastination" would head the list of my core values.
Which means I guess I'll have to think about it and work on self-improvement later.
December 20, 2018 at 1:14am
Another turn of the wheel.
https://www.vox.com/science-and-health/2018/12/18/18144477/winter-solstice-2018
As I wrote last month, this is one of the things that, were we so inclined, we could all agree on. It affects the entire human race (at least for now; I'm still hoping for space travel), even if its effects depend on your current latitude.
I should add a couple of observations to the article I linked above.
First, while the vast majority of humans live in the Northern Hemisphere, that's no excuse for hemispherism. Most of us see it as the Winter Solstice but, as noted, it's the Summer Solstice in the south. To avoid being exclusionary, I've seen it referred to as the December Solstice. A better name - one that's not tied to our arbitrary calendar - might be the Capricorn Solstice. However, that would make the other solstice the Cancer Solstice, and I can see people objecting to that.
Second, unless I missed it (I tend to skip boring parts, which is one reason I don't like internet videos), it doesn't define what "solstice" really means. Yes, it says "Technically, the solstice occurs when the sun is directly over the Tropic of Capricorn, or 23.5° south latitude." But the word is from the Latin solstitium: sol (sun) + sistere (to stand still). So literally, it means "the time the sun stands still." Which is pretty cool. Of course, the sun doesn't actually stand still - figuratively or literally - it's a bit more complicated than that. From the point of view of any observer on Earth, the sun's position at its highest daily point in the sky (local noon, which is usually not when the clock says noon) appears slightly farther south each day between late June and late December. As the calendar approaches the December solstice, that southward drift slows; on the solstice itself, the noon sun reaches its southernmost point and, for a day or two, barely moves at all - hence the name. Technically, the solstice occurs at local noon at whatever longitude the sun happens to be crossing when it's at zenith at the Tropic of Capricorn.
Yes, I know. If I'd been reading this, I'd have skipped the last paragraph.
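But for the not-bored, the drift is easy to approximate. Here's a rough Python sketch of my own, using the standard first-order cosine approximation for solar declination (the latitude the sun is directly over at local noon). The numbers it prints aren't precise, but they show the noon sun all but stopping near the solstice:

```python
import math

def solar_declination_deg(day_of_year: int) -> float:
    """Approximate solar declination in degrees: about -23.44 at the
    December solstice (Tropic of Capricorn), +23.44 at the June solstice."""
    return -23.44 * math.cos(2 * math.pi * (day_of_year + 10) / 365)

# Near an equinox (~day 80) the noon sun moves ~0.4 degrees per day;
# near the December solstice (~day 355) it barely moves at all.
for day in (80, 81, 352, 355, 358):
    print(day, round(solar_declination_deg(day), 2))
```

That flattening at the bottom of the cosine curve is the "standing still" the Romans were naming.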
Third, did you see the gif in the linked article with the time-lapse of Earth from space? It's pretty cool. With a little digging, I found that the satellite that shot these pictures is a weather satellite in geostationary orbit. When I saw the gif, my first thought was "that satellite is directly over 0/0." 0/0 being the point where the Prime Meridian crosses the equator. It's in the Gulf of Guinea, in the Atlantic off the coast of West Africa. I'm not so jaded that I find photos taken from geostationary satellites to be a mundane thing. Really, I'm not sure whether it's over the actual origin or not, but it's pretty close.
Anyway, science in general is cool, astronomy in particular is fascinating, and culturally, we have a long history of marking the solstices and equinoxes. As I've mentioned before, I don't observe Christmas, so the day I care about is the solstice. It's a time to consider rebirth and renewal; the moment when, just as everything seems to be about to plunge into eternal darkness, there's the promise of more light to come. It's more interesting to me than the very arbitrary New Year's Day (the proximity of which to the solstice isn't accidental), and more meaningful.
In the past, I've marked it for myself by staying awake through the longest night, maintaining a vigil for the returning, invincible sun. I think I might revive that tradition for myself on Friday.
December 19, 2018 at 1:35am
Spoiler alert: I lived.
I really do not like air travel. I mean, look: you're flying. And you're getting from one city to another in a timeframe that would shock even science fiction writers a century ago. That shit's inherently cool, and sometimes, looking out the window of an airplane, I can still feel that sense of wonder (if there aren't any barking dogs or wailing children in my vicinity). This is a triumph of science and technology, and I don't care what anyone says about the wastefulness or contribution to climate change, it is fucking cool.
And yet, we've managed to turn it into a shitshow what with security theater and its petty annoyances, the sardine-packing of the airplanes, and the labyrinthine airports themselves with their overpriced shops catering to a captive audience.
On top of which, the flights I took did this thing where you could access in-flight entertainment on your own tablet, phone, laptop, whatever - which is fine; I'd rather use my own device anyway - but they didn't have anywhere to plug the goddamn things in, rendering the feature all but useless unless you carry a spare battery or two with you. You know, those lithium batteries that are known to fucking explode. I thought the point was to not explode.
I enjoyed the trip, at least the times that didn't involve the actual traveling part, though I won't want to do it again for a while.
But... I did get to drive a Tesla for a few minutes (speaking of science fiction brought to life). That made everything else worth it all by itself. Not everything about this timeline sucks.
December 8, 2018 at 12:33am
A couple of days ago I mentioned the survivorship fallacy. Well, here's a bit on that and a few other cognitive biases we tend to have.
https://getpocket.com/explore/item/5-common-mental-errors-that-sway-you-from-mak...
1. Survivorship Bias.
I won't go into this further. I think the article is clear enough, and I already talked about it.
2. Loss Aversion.
Loss aversion refers to our tendency to strongly prefer avoiding losses over acquiring gains. Research has shown that if someone gives you $10 you will experience a small boost in satisfaction, but if you lose $10 you will experience a dramatically higher loss in satisfaction. Yes, the responses are opposite, but they are not equal in magnitude.
I've tried to train myself to reverse this. That's helpful for me because I'm a gambler. I also have money in the stock market. No, those are not the same thing, but they do have one commonality: you win sometimes, and you lose sometimes. But I see loss aversion catered to in stock market analysis all the time; lately, the markets have been on a bear tear and people are freaking the fuck out. Any downturn is exacerbated by panic selling.
Don't get me wrong; losing money never feels good. But it can feel less bad if you're aware of this kind of bias.
3. The Availability Heuristic.
The Availability Heuristic refers to a common mistake that our brains make by assuming that the examples which come to mind easily are also the most important or prevalent things.
For example, research by Steven Pinker at Harvard University has shown that we are currently living in the least violent time in history. There are more people living in peace right now than ever before. The rates of homicide, rape, sexual assault, and child abuse are all falling.
Okay, we're going to use fancy words like "heuristic?" That strikes me as being obfuscatory. What this bit boils down to is this: While anecdotes can make compelling narratives, intelligent decisions can only be made on hard data. I'm sure you've already heard that "data" isn't the plural form of "anecdote."
For example, many people are afraid of flying. While some fear is understandable because you're putting your life into someone else's hands, much of the fear - of being killed in a plane crash, of getting hijacked, etc. - is almost groundless (see what I did there?) You have a much greater chance of being killed driving to and from an airport than in transit between airports. But what happens is, all but the worst highway incidents don't make international news, while every incident involving an airplane gets reported all over the world, so we start to think of flying as a hazardous proposition.
Make no mistake, there are dangers - life is inherently risky - but really, you're more likely to kick it by slipping in the shower than in a plane crash.
Or hey, maybe I'm just trying to psych myself up; I have a plane to catch in about 8 hours.
4. Anchoring.
There is a burger joint close to my hometown that is known for gourmet burgers and cheeses. On the menu, they very boldly state, “LIMIT 6 TYPES OF CHEESE PER BURGER.”
My first thought: This is absurd. Who gets six types of cheese on a burger?
My second thought: Which six am I going to get?
I didn't realize how brilliant the restaurant owners were until I learned about anchoring. You see, normally I would just pick one type of cheese on my burger, but when I read “LIMIT 6 TYPES OF CHEESE” on the menu, my mind was anchored at a much higher number than usual.
I have to admit I fall for this one all the time. But I'm working on it.
5. Confirmation Bias.
The Granddaddy of Them All. Confirmation bias refers to our tendency to search for and favor information that confirms our beliefs while simultaneously ignoring or devaluing information that contradicts our beliefs.
Yeah, this is a big one. It's one of the reasons I try not to stay in an information bubble. But it's also the hardest to deal with, because it works both ways. The article uses the example of climate change. It's a hot-button (see what I did there) topic in many forums on the internet. As far as I've been able to determine, not one single person's mind has changed on the issue based on arguments started on the internet. On the contrary, at some point, any information contrary to what a person believes only causes that person to believe even more deeply. Data is dismissed as falsified. Charts are thrown out as bogus.
Internet arguments are stupid for many reasons, but this is a big one: you're not going to change anyone's mind with any number of anecdotes, any amount of data, any quantity of reason, or any pure charismatic persuasiveness.
I've found this to be the case even with things that have a lot less potentially at stake than climate change. For instance, a lot of people seem to have internalized the definition of "blue moon" as the second full moon in a calendar month. This has been proven to be incorrect from a historical perspective, and yet when I present evidence of that, people go right on believing, promoting, and propagating the incorrect definition. Now, you could say that I'm being stubborn in clinging to the older, true definition, and you'd be right. That doesn't change the fact that the "second in a month" definition is based on an admitted error in a publication from the 1940s. To me, it's a perfect example of how falsehoods get entrenched and then promoted as fact.
If you can't persuade someone of something with such an insignificant outcome, how can you hope to persuade them about, say, vaccinations, climate change, or the nearly-spherical nature of the planet?
As a result of confirmation bias, any internet argument that doesn't dissolve into chaos will always bog down into epistemology. This is Waltz's Second Rule of the Internet, and I haven't found any evidence to contradict it. But then, I'd probably ignore said evidence anyway.
(For anyone interested, Waltz's First Rule of the Internet is that any post attempting to correct someone's spelling, grammar, or punctuation will inevitably contain spelling, grammar or punctuation errors.)
NOTE: I'm going to be traveling over the next 11 days, so updates will be spotty if they happen at all. But if you don't hear from me after that, feel free to enjoy the irony of me dying in a plane crash mere hours or days after insisting how safe air travel is.
December 7, 2018 at 12:03am
Another year, another shitstorm about holiday songs.
https://www.bbc.com/news/entertainment-arts-46425160
Baby, It's Cold Outside is one of those Christmas songs that's about as traditional as mince pies.
But an American radio station's decision to pull it from playlists because it's seen as unsuitable in the #MeToo era has reignited a debate about the song, and raised questions about other potentially questionable Christmas classics.
First of all, let me state outright that my life would not be lessened in any way if I never heard another holiday song. It would, in fact, be improved. Ideally, I would never step into a store and be bombarded with "White Christmas" or "Santa Claus is Comin' To Town" or especially "I'll be Home for Christmas."
There are a few that I can tolerate, but for the most part, if I never had to hear another Christmas song, I'd be happy. Or at least a little less depressed, which for me amounts to the same thing.
Okay, now that that's out of the way, I'm going to talk about the controversial songs.
What a lot of people miss when consuming media - be it audio, video, or whatever - is historical context. The platitude about people who don't learn from history being doomed to repeat it holds true here.
Things were different in the past. That's why we call it the past. Lots of things that were created in the past couldn't be created today, not in the same way. Blazing Saddles comes to mind. That movie was a product of the 1970s just as sure as Watergate and disco were. Movies like that one were created to shine a spotlight on the stupid attitudes of the time, and played a role in eliminating them. Once these attitudes became obsolete, a lot of the context of the movie was lost. If you don't know what race relations were like in the 70s, you'll miss a lot of the subtext.
As writers, one of the greatest things we can hope for is that something that we write - be it a story or a song or whatever - helps to change things. In the process, often the writer's work becomes obsolete. People later will miss the point, unless they understand the historical context. It's a bit like doctors working to eradicate disease - once it's eradicated, people in the future will start to wonder what all the fuss was about. Hence you get some of the most egregious types of reactionary in the present, like anti-vaxxers. Lacking the historical context of polio outbreaks and measles epidemics, they just don't understand why people make a big deal out of vaccination.
The song in question - and make no mistake, I think it's just as crap as any other "holiday" song - was a product of its time, and a wish for things to be different. Well, now things are different. We're closer to social equality than ever before. We're open to different expressions of gender identity. Things that once were forbidden are now commonplace - and that's not a bad thing.
But if we forget where we came from, the road that took us to this place where we are, we risk going back to it. You can try to make this a generational thing - I'm going to save my rant about "generations" for another entry, as this one's going on long enough - but it's not so much about generations as it is about ignoring history. Yes, things were worse in many ways in the 40s. But to know exactly how they were different can only highlight how far we've come, and how far we still need to go.
And I think the bigger problem is that once-respected news organizations like the BBC are quoting Twatter. If anything needs to go away, it's not old Christmas songs; it's shallow social media.
Let's work on that one, shall we?
I'll wrap this up with one of the few holiday songs I can stand.
December 6, 2018 at 12:33am
I wanted to share this, not because I have a stake in the topic being discussed, but because my lack of a horse in the race means I have some tangential thoughts to share.
https://www.nytimes.com/2018/11/27/well/family/the-fallacy-of-the-i-turned-out-f...
The column is ostensibly about child-spanking. To summarize, a soi-disant "parenting expert" from Down Under asserts that spanking should be a criminal offense (or offence or however they spell it in Oz), is nonplussed when a bunch of people disagree with him, and then goes on to demolish what he sees as their arguments.
I'm not overly familiar with the political situation in Australia, but first of all, it seems to me that "this practice is ineffective and probably dangerous" shouldn't automatically translate to "this practice should be a criminal offense." It's dangerous in itself to turn everything that we don't like into ban laws; also, a law of this sort strikes me as being especially easy to abuse. It's a basic idea in a so-called free country that not everything is either "permitted" or "banned." Some bad ideas can be handled with social pressure. For instance, I dare you to speak out against public breast-feeding. Go on. I triple-dog-dare you. There's no law against speaking out against public breast-feeding, nor, by our right to freedom of speech, should there be. But I guarantee that you will face a typhoon of scorn and ridicule that will make you wish you had never been born, let alone breast-fed.
I also wanted to point out that there is such a thing as making things too safe, for kids as well as for adults. Take away all the world's sharp corners, foam-pad all the walls and floors, and rope off anything that might pose even the slightest hazard to people, and that's what people will come to expect. They'll be wired to think that if something might be dangerous, there will be a barricade. And that if there is no barricade, it can't possibly be dangerous. Or, worse - you rope off areas of both minor and major hazard, so people get complacent about ignoring barricades. And that's how we get things like people sliding into the Grand Canyon or boiling to death at Yellowstone.
Life, in short, is full of risk, and there may be paradoxically more risk in protecting people all the time than in allowing for the occasional bump on the noggin. Or, to stay more on topic, the occasional light swat on the tuchis.
Unfortunately, if you don't foam-pad everything, lawsuits happen, so the world just gets more and more covered with the illusion of safety.
Getting back to the article I originally linked, however, I'll just point out one fallacy that the author didn't even consider, which is the survivorship fallacy. "We didn't have car seats when I was a kid and I'm still alive." He covers that as "anecdotal fallacy" but doesn't go far enough. It doesn't take into account the loose kids bouncing around in station wagons who didn't make it to adulthood. Such a logical fallacy doesn't even require hazardous situations. You see it all the time in articles with such titles as "The Five Traits of a Successful Business Leader." The implication is that anyone with these traits will be successful, and completely ignores the vast majority of people with those traits who are standing in line at the food bank.
Well, that's enough for now. But I did want to say, one more time: "parenting expert."
December 5, 2018 at 12:45am
I am fortunate in that, most days, I can set my own sleep schedule. Most people can't, due to work, kids or other obligations.
Previous entries have gone into more detail about this, but left to my own devices as I usually am, I'm basically biphasic, sleeping twice in a 24 hour period.
I've long believed, without any real evidence to support this, that everyone has their own sleep preferences, and that to disturb these sleep cycles leads to more stress. Well, now I might have some evidence.
https://getpocket.com/explore/item/if-you-re-just-not-a-morning-person-science-s...
I've seen some of these assertions before - that humans have a greater-than-24-hour natural cycle, for instance. I recall one study that left people in a windowless, indoor environment without clocks for an extended period, and let them sleep and wake on their own. I think the study concluded the natural cycle is about 24.5 hours. Which is weird, because a) the earth's rotation is gradually slowing down due to tidal friction from the moon, so if anything we should have inherited a less-than-24-hour cycle from our distant ancestors, and b) Mars' day is approximately 24.5 hours, and we're probably not Martians. So I don't know what's up with that; as far as I know, no one does.
What's not controversial is that there are individual differences from the "standard" sleep cycle of 11pm-7am or thereabouts. I was on such a schedule for most of my working life, and it messed with me. The times when I could sleep past 7 were rare, but I always felt that those were the occasions when I got my best sleep. Further, trying to fall asleep around 11pm was a chore for me.
I still have questions, though, such as: what happens when someone with a nonstandard monophasic cycle switches time zones? Say, moving to another longitude. We know that jet lag is a thing, but usually it's overcome in a few days of adjusting one's internal clock to local time. But can this be used to adjust, say, someone who sleeps from 2-10 in one time zone to adapt to an 11-7 schedule in another, without too much stress?
Seasonal variations in daylight hours almost certainly play a role in this as well. Right now where I am, it gets dark around 5; in the summer, during DST, darkness might not occur until 9pm, with sunrise coming earlier (I don't know offhand when sunrise happens, for the obvious reason that I'm asleep at the time). And we know that many people, myself included, "enjoy" seasonal mood swings related to the amount of light we get from the accursed daystar.
But probably the worst thing about being a night owl is suffering the scorn of morning people. I used to vacation in the Outer Banks of North Carolina every year, with a bunch of friends, and I was usually the last one to go to sleep and the last to awaken (I don't believe in keeping strict time schedules on vacation). Inevitably, I'd get up around the crack of noon, and everyone else would be all sarcastic with their "Morning, sunshine" and "It's alive!" It got on my nerves, and then my friends wondered why I'm antisocial.
Now, I could probably fix some of that by kicking the caffeine habit, but I don't really want to. In fact, I tried it a few times when I was working, but all it did was make me grumpier and give me a headache, so these trials never lasted very long. Perhaps now, when I can pretty much set my own schedule, I could try tapering off on the wakey-wakey juice to see what my unmedicated cycle might be like. But honestly, for the same reason that I can, I don't feel a pressing need to do so. I have far more urgent habits to revise, such as eating better and getting more exercise. Fortunately, there's a 24-hour gym nearby. Unfortunately, I can't be arsed to go to it at any time. Too many other things to do.
So I'll just urge everyone here to recognize that sleeping late is in no way a moral failing; it's natural for many people. Leave us alone and can the snark. We'll get up early if there's something to do; otherwise, let us sleep and don't give us shit about it.
December 4, 2018 at 12:43am
I got hit by anxiety yesterday.
Historically, this isn't something that happens to me. I mean, sure, maybe if there's a specific thing generating it, like public speaking or having to go to a party where I don't know many people, I'll feel a bit nervous, or whatever you call it. But this is the first time I can remember just feeling it with no obvious reason.
I don't like it.
The only thing I can figure is after a long time, I'm finally feeling like being social again. I feel something other than a generalized sense of malaise and the certainty of impending doom. Those things, though, I've gotten used to. They comfort me. If this is what actual emotion feels like, then... no, thanks. I'd rather stay depressed.
Fortunately, there is beer.
This evening, BJ's (a chain brewery, but a decent one) had a Belgian Grand Cru style beer. It is delicious. I almost didn't go home.
But even I can't drink beer all the time. Mostly I imbibe Coke Zero; I want the caffeine but not the sugar/HFCS of regular sodas. And don't give me that "aspartame will kill you" crap; there are worse things out there.
Also, there's tea.
https://www.nbcnews.com/better/health/best-teas-energy-digestion-sleep-more-ncna...
Tea is the second most widely consumed beverage in the world (next to water), according to the Food and Agriculture Organization of the United Nations. While coffee drinking overshadows tea consumption in the United States, the Tea Association of the U.S.A. reports that 80 percent of American households have some form of tea in the cupboard, and more than 159 million Americans drink it on a daily basis.
I utterly despise coffee. Never got a taste for it. This, apparently, sets me apart from civilization in general. As an American, I'm supposed to be addicted to coffee. I think it's written into the Constitution or the Declaration of Independence: "We hold these truths to be self-evident, that coffee is the national beverage..."
But no, apparently I never got the memo. Can't abide the stuff. The smell gives me a headache, and being around people with coffee breath is probably the source of my antisocial tendencies. But that's okay, because once I tell people I don't drink coffee, they look at me like I'm a reptilian alien from Antares 7. It's like:
"I don't have kids."
"That's cool."
"I'm an atheist."
"Hey, I understand."
"I don't drink coffee."
"WHAT THE STEAMING PILE OF FUCK IS WRONG WITH YOU?"
So like I said, there's tea. "So why don't you just drink tea instead of that nasty cola?" Well, because for me, even the most highly caffeinated teas put me to sleep. I can drink a pot of black tea just before bedtime and sleep like a... well, not like a baby, because babies wake up every hour and a half which is why I never wanted one, so let's say sleep like a computer that's been powered down and unplugged.
If I drink it with breakfast, all I want to do is go back to sleep. It's nice when I want to relax, though. I can drink non-caffeinated herbal teas, but there's no freakin' point in drinking decaffeinated tea.
My favorite tea isn't on the list I linked above. It's harder to find in the US, but I've ordered it from Amazon. Its name is going to trigger your inner Beavis and Butthead, so let's just get that out of the way right now:
https://en.wikipedia.org/wiki/Pu%27er_tea
When you're done snickering like a 10 year old, I'll continue. Meanwhile, I'm going to go make me a pot of the stuff.
Ready?
Okay.
So yeah, Pu'er is strong, dark, earthy and bold. I drink it straight, or sometimes with just a bit of lemon because steeping it too long sometimes brings out a hint of bitterness. With a fairly high level of caffeine, it does what no other drink can do for me: make me relaxed and alert at the same time. It's a weird feeling.
And it sure beats the hell out of anxiety.
December 3, 2018 at 12:30am
They say it is important to have goals.
https://thetakeout.com/ramen-lord-perfect-bowl-of-ramen-mike-satinover-183012867...
There's focus, and then there's obsession. At least the guy admits to the obsession.
I just consider myself a ramen nerd. A wholly obsessed ramen nerd who loves to read academic articles about gluten development in noodle doughs, a nerd who’s spent way too much money on pasta machines that then broke through the rigor of making noodle dough, a nerd who—intentionally—owns six different types of bowls to suit various ramen styles.
"I don’t count the 99-cent noodle packets" - 99 cent? Where in the slippery hell are they charging that much for ramen? I shop at a moderately upscale grocery store and they're selling 6-packs for a buck and a half. That's 25 cents a pack for the math-challenged. Not that I eat much of that anymore, myself - my cardiologist would kill me if the sodium didn't do the job for him.
Dude must be shopping at Whole Foods.
Not that I never eat it. It's just too cheap and simple to pass up. And yes, I've eaten the similar Vietnamese-derived dish called pho.
Pho, as you probably already know, is pronounced more like "phuh." Which almost makes me wish I had what it took to start my own line of instant pho. I have good reasons for this.
I'd call it Pho King. The tagline would be "Awesome." So on every pack it'd say "Pho King Awesome." You buy the instant pho mix because you're Pho King lazy.
The basic package would be called the Pho Kit. Not feeling like standing in a Pho queue? Don't want to cook an elaborate meal? Pho Kit!
Being cheaply produced, though, it might make people sick - but then they can just Pho Cough.
But alas, it is not to be. Still, if I see it anywhere, I'll have this entry to point to and say, nope, I thought of it first; give me kickbacks.
Anyway. Ramen.
Learning to craft an A+ bowl of ramen is now my life goal. Maybe A+ is never possible, perfection is an impossible pursuit after all, but the path of getting closer and closer to that immaculate, flawless bowl intrigues me.
That's remarkably similar to something I've said about writing. How we can keep practicing, writing, trying different styles, and yet never be perfect. And yet, the reward is in the attempt.
I'm not used to thinking that way. This is new to me.
I've never stuck with one thing long enough to get really good at it. There's always something else to try, something shiny off in the corner of my vision. Again, this has its advantages, but seeing this description of someone doing a deep-dive into soup - so to speak - makes me think I might be missing out.
Meanwhile, now I'm hungry. Dammit. Wish I had a Pho Kit.
December 2, 2018 at 12:31am
You will be replaced by a machine.
Yes, you. Well, unless you retire or croak before the machine gets a chance. Many replacements have already happened (assembly line workers, cashiers), and many more are on the way (see: autonomous vehicles).
But maybe you never considered that a computer could write better than you can. Not yet, but they're working on it.
https://getpocket.com/explore/item/the-six-main-arcs-in-storytelling-as-identifi...
Until that happens, there are some pretty good takeaways in the above link for us soon-to-be obsolete human writers:
A story's emotional arc can be used to assist in plotting the narrative. The article identifies several of these arcs, and hints at the idea of nesting them within a longer story. We can use this consciously, whereas I think a lot of writers do it unconsciously.
Be aware of which words people find to be "happy" or "unhappy," which may not correspond to your own opinion. For example, "laughter" is in the list of happy words, but because of unresolved issues from childhood, it doesn't register that way for me.
Have a look at the "Rags to Riches" archetype graph. It's maybe 2/3 down the page. Well, I was expecting a kind of ascending line; I mean, that's the implication, right? You're poor, which sucks; good shit happens, and then you're rich, which is awesome. But no, instead we have something resembling a partial sine curve, right? Curves down, bottoms out near the beginning, rises more or less smoothly to a peak... but the peak is not at the end; there's a downturn right before the end. It's implying that a rags-to-riches story can't just stop at the high point; there has to be, perhaps, some acknowledgement that everything's not as awesome as we'd hope. I mean, that drop-off at the end is pretty sharp.
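Out of curiosity, here's a toy Python sketch of how an arc like that gets computed. To be clear, this is my own simplification: the tiny valence lexicon below is made up, standing in for the large crowd-rated word list the researchers actually used; only the sliding-window idea is the real technique.

```python
# Toy emotional-arc sketch: average word "happiness" over a sliding window.

VALENCE = {  # hypothetical mini-lexicon; real studies rate thousands of words
    "love": 2, "laughter": 2, "rich": 1, "hope": 1,
    "poor": -1, "loss": -1, "death": -2, "despair": -2,
}

def emotional_arc(words, window=50):
    """Return the average valence of each window of consecutive words."""
    scores = [VALENCE.get(w.lower().strip(".,!?"), 0) for w in words]
    return [
        sum(scores[i:i + window]) / window
        for i in range(max(len(scores) - window, 1))
    ]

# A crude "rags to riches" text: misery first, then happiness.
text = "poor despair loss " * 40 + "hope love laughter rich " * 40
arc = emotional_arc(text.split())
print(round(arc[0], 2), round(arc[-1], 2))  # about -1.33 -> 1.5
```

Plot that list against position in the story and you get the kind of curve the article graphs - minus, for this toy text, the dip at the end.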
Going back to near the beginning of the article, yeah, I'm gonna part ways with ol' Kurt on his opinion of the similarity between the Cinderella narrative and the Bible. I mean, I think he's right about the Cinderella plot shape, that works, but to apply that to the Bible you pretty much have three chapters of Genesis, then you fall into a great big hole, and then you don't climb back out until you get to the New Testament. It ignores the shapes of all the stories in between, such as the whole "escape from Egypt" and "Joshua kills half the Canaanites and enslaves the rest" bits. Besides, the Fall was the best thing that ever happened to us. Or, you know, it would be if I took the story seriously.
I want to see this work replicated independently. I'm not sure there wasn't some kind of selection or other bias.
Anyway, thought I'd link something that's actually about writing, for once. Not that it'll do me any good. I'm still too lazy to work on my own stories. At least I can get these entries done.
© Copyright 2024 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved. Robert Waltz has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.