About This Author
Come closer.
Complex Numbers
A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.
The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.
Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.
Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
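To make that concrete, here's a rough sketch in Python (mine, not part of any formal treatment) of the single transformation z -> z*z + c that generates the Mandelbrot set, probably the most famous of those fractals. Squaring a number and adding a constant is about as simple as a transformation gets; the intricacy comes from asking, for each point c of the complex plane, how long the iteration takes to escape.

# Minimal sketch: iterate z -> z*z + c for each point c of a grid on the
# complex plane and count how many steps it takes for |z| to exceed 2.
def escape_count(c, limit=30):
    z = 0j
    for n in range(limit):
        z = z * z + c          # the "very simple transformation"
        if abs(z) > 2:         # once |z| > 2, the point escapes to infinity
            return n
    return limit               # treated as "never escapes" (inside the set)

for im in range(12, -13, -1):          # imaginary axis, top to bottom
    row = ""
    for re in range(-40, 21):          # real axis, left to right
        c = complex(re / 20, im / 10)  # a + bi for this grid cell
        row += " .:-=+*#@"[min(escape_count(c), 8)]
    print(row)

Points that never escape (the '@' characters) belong to the set; everything else is shaded by how quickly it flies off to infinity.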
February 28, 2021 at 12:00am
Hello darkness, my old friend...
Yeah, yeah, I know. Look, too much light can be a bad thing, you grok?
One of the many things I hate about the beach is how relentlessly bright it is. You're lying there on your towel or whatever, and because walking in sand is intrinsically fatiguing and because of all the heat and humidity and the having to deal with other people, you're tired. So you close your eyes. But that doesn't induce darkness, no, all you see is the pulsating deep red of the skin and veins of your eyelids. The accursed daystar still burns, beating down upon you and trying to give you cancer, blasting its blackbody radiation -- a misleading term, but that's what physicists call it when something shines in a spectrum -- into your retinas.
Even worse is when a cloud passes between you and the solar orb, giving you relief that is as short-lived as it is illusory.
No.
Make it go away. At least for a few hours.
In darkness, there is comfort. In darkness, there is no ugliness, no beauty. In darkness, no one can see you smile.
Turn off the light.
February 27, 2021 at 12:02am
As you probably know by now, sometimes I like to toss in a quantum bomb just for the fun of it.
And it's a whole lot of fun to talk about overturning "cherished assumptions about reality."
Before I get into the article, though -- and trust me, I'm not going to get too deep into it; it's there if you want to read it, but I just have a few observations to make -- a couple of quick disclaimers:
1) This stuff is way above my pay grade, but it doesn't stop me from being fascinated by it.
2) It is a Bad Idea to make general conclusions about the nature of reality from reading about quantum physics.
To give you an example of (2), consider the phenomenon of "entanglement." Entanglement is famous for appearing to violate the lightspeed limit of the universe. So SF writers, philosophers and others like to hand-wave FTL communication by using entangled particles. But the thing is, entanglement doesn't actually violate FTL limitations because there is no way, even theoretically, for it to allow the transfer of information faster than light.
You know what else violates the speed limit? Your vision. Go outside at night. Look at a star. Then look at another star. Congratulations, your vision just traveled faster than light. But so what? It's a limit on information, not observation.
Anyway, the point is, maybe I draw some of those general conclusions myself; mostly, I'm just trying to understand.
Take his fellow physicist Erwin Schrödinger's famous thought experiment in which a cat is trapped in a box with poison that will be released if a radioactive atom decays. Radioactivity is a quantum process, so before the box is opened, the story goes, the atom has both decayed and not decayed, leaving the unfortunate cat in limbo -- a so-called superposition between life and death. But does the cat experience being in superposition?
Here's the thing that always bugged me about the Schrödinger's Cat "experiment." Well... apart from the cavalier disregard for a poor kitty's life; fortunately, it was only a thought experiment and, as far as I know, no one has attempted it in a laboratory. The other thing that bugs me is this: If, as the theory suggests, it takes an observer to collapse the wave function and determine "alive" or "dead," how do we define "observer?" I mean, other people have asked this question, too, but I'm talking about it here anyway.
What I mean is, are we really going to assert that it requires a human consciousness? Because I know cats, and I can tell you that cats are conscious animals. They probably don't understand quantum mechanics, and they probably don't have our obsession with questions of the meaning of existence or of life and death, but the cat in Schrödinger's box would "know" whether it is alive or dead (well, if dead, it wouldn't know anything because it's dead, but you get my point). To clarify further, replace the cat in the thought experiment with a human. Does it really matter if an outside observer doesn't see the result? Because the person in the box does.
So, the article gets into this and even marries the cat-in-the-box thing with quantum entanglement, and like I said, I don't pretend to understand all of it. And it further describes several competing interpretations, or ways to understand what's "actually" going on.
It's easy to draw mystical conclusions from this, and quacks have done so. "Look, science says that nothing really exists until we see it, so we can manipulate reality by choosing what to see" or some such dreck. No. That's not how it works. I don't know how it works, but that's not it.
I'm also still not entirely convinced by many-worlds interpretations -- the article describes this too, but in brief, the idea is that any quantum event that can have more than one outcome (which as I understand it is all of them) produces every possible outcome, each splitting off its own universe. This may fit the data, according to physicists, but there are other possible explanations and, well, that just seems like exactly what Occam's Razor is for: "entities should not be multiplied unnecessarily."
It is, though, great fodder for science fiction. However, some writers have interpreted this to mean "every decision we make splits off new universes," but I'm also not convinced that we actually make decisions; rather, we do what we must, and make up stories about it afterward.
So yeah, I'm not quoting any more of the article. It's there if you want to read it. I just wanted to add my own thoughts.
February 26, 2021 at 12:03am
Ever wonder about grapefruit?
You know, there are quite a few weird fruits on the planet. Durian comes to mind. And wtf is up with breadfruit? But yeah, okay, maybe grapefruit is a bit weird, especially since grapes are already fruits and they have nothing to do with citrus so what's with the name?
Right from the moment of its discovery, the grapefruit has been a true oddball. Its journey started in a place where it didn't belong, and ended up in a lab in a place where it doesn't grow. Hell, even the name doesn't make any sense.
See?
The current theory is that somewhere around five or six million years ago, one parent of all citrus varieties splintered into separate species, probably due to some change in climate. Three citrus fruits spread widely: the citron, the pomelo, and the mandarin.
You don't see pomelo much around here. I have a vague memory of eating one, long ago, in a foreign land. At the time, I thought it was a cross between an orange and a grapefruit. I guess that was backwards.
With the exception of those weirdos like the finger lime, all other citrus fruits are derived from natural and, before long, artificial crossbreeding, and then crossbreeding the crossbreeds, and so on, of those three fruits. Mix certain pomelos and certain mandarins and you get a sour orange. Cross that sour orange with a citron and you get a lemon. It's a little bit like blending and reblending primary colors. Grapefruit is a mix between the pomelo -- a base fruit -- and a sweet orange, which itself is a hybrid of pomelo and mandarin.
Yeah, I know. I got lost too.
Speaking of all these names, let's discuss the word "grapefruit." It's commonly stated that the word comes from the fact that grapefruits grow in bunches, like grapes. There's a pretty decent chance that this isn't true. In 1664, a Dutch physician named Wouter Schouden visited Barbados and described the citrus he sampled there as "tasting like unripe grapes." In 1814, John Lunan, a British plantation and slave owner from Jamaica, reported that this fruit was named "on account of its resemblance in flavour to the grape."
Yeah... no.
This is largely guesswork, almost all of it, because citrus is a delightfully chaotic category of fruit. It hybridizes so easily that there are undoubtedly thousands, maybe more, separate varieties of citrus in the wild and in cultivation.
Seriously, though, the vast variety of citrus and its ease of modification is pretty fascinating.
The article goes on to describe how grapefruit, and other citrus, led to Florida becoming Florida, so there's another reason for me to hate grapefruit.
It also talks about the very interesting discovery that grapefruit completely fucks with some medications.
Now, I've said this before but I'll say it again: I've never liked grapefruit. I mean, I never really hated it; if it's there I'll eat it but I never sought it out, or deliberately obtained grapefruit juice to drink. It was just something that was there. That is, until I started taking a statin, at which point I got really intense cravings for grapefruit.
The surest way to get me to want something is to tell me I can't have it. I mean, it's possible that if someone told me "you can't eat eggplant or your blood pressure will go through the roof," I'd want to go out and buy bushels of eggplant.
Possible, but I doubt it. At least I always acknowledged that grapefruit was edible.
Anyway, I'm not going to quote the circuitous part of the article that goes into the discovery of grapefruit interactions, but basically, it can have the effect of making us metabolize more of certain medicines than expected.
I know a guy who takes advantage of this. He's poor and has shit insurance, so he stretches out his statin supply by taking 1/2 the recommended dose and munching on grapefruit.
Pretty sure that's not recommended.
"There are a fair number of drugs that have the potential to produce very serious side effects," says Bailey. "Kidney failure, cardiac arrhythmia that's life-threatening, gastrointestinal bleeding, respiratory depression." A cardiac arrhythmia messes with how the heart pumps, and if it stops pumping, the mortality rate is about 20 percent. It's hard to tell from the statistics, but it seems all but certain that people have died from eating grapefruit.
And see, I'd rather die from eating something that I actually like.
February 25, 2021 at 12:01am
You'd think that, as a long-time beer snob, I'd like Guinness.
You'd be wrong.
I mean, it's not bad, and it's certainly an important beer for many reasons. I'm sure I'd like it better if I were to go to Ireland and drink it from the tap there. And given a choice between Guinness and that watered-down rice water that passes for "beer" that's mass-produced in the US, I'll take Guinness every time.
But I made the mistake once of drinking a pint at a concert. The mistake was, before the concert, I'd gone to a taphouse and had a really good stout. By comparison, the Guinness just didn't measure up.
Like I said, though, it's a culturally important beer for many different reasons, one of them being just how long it's been brewed.
Beer drinkers are used to seeing eye-catching beer labels -- some good, some not so good. But stroll past a beer fridge and you might see a label that's simple, nostalgic, and a throwback to the 1930s: the Guinness toucan.
Beer art is flourishing right now. Graphic artists are probably having a field day with all the logos and labels and tap handle designs, to say nothing of posters and taphouse decorations.
"The Guinness family did not want an advertising campaign that equated with beer," the UK History House writes. "They thought it would be vulgar. They also wanted to stress the brew's strength and goodness. Somehow it led to animals."
At least they didn't pick an elephant.
There was the pelican that stole everyone's beers with the copy, "My goodness -- my Guinness!" Also the sea lion that had a habit of stealing Guinness. And the turtle that steals Guinness on its back. A lot of thirsty animals, basically. None earned as much fame as the toucan, however.
I don't think I was ever aware of these other mascots.
The toucan's brightly colored beak contrasts nicely with two dark glasses of Guinness (almost always two glasses, playing off of the similar sounds of "too can" and "toucan").
As I'm only aware of the ones with draft pints, I never would have made the "two can" connection. No cans, no pun.
The toucan and its gang of animal friends graced Guinness ads for decades. Then, in 1982, Guinness stopped working with S.H. Benson and dropped the animals. In recent days the toucan has made some appearances, including a limited-edition can released in 2016. But it primarily lives on in the memories of Guinness lovers and collectors.
They're more than happy to sell toucan merchandise in dedicated Guinness stores (yes, these exist), along with the famed bar towels.
So yeah, even with the proliferation of beer choice nowadays, there will always be a place for Guinness. But that's as much a testament to the power of marketing as to the beer itself.
February 24, 2021 at 12:01am
I'm guessing... no one, because there's still a pandemic going on.
Yeah. I know. No one cares anymore. And I get that -- I quit caring about a lot of things, myself, like the environment or adhering to society's expectations of me. But not about the pandemic. Too many people have suffered long-term effects, and I'm just done with everyone who's pretending it doesn't exist. They've made it so that, instead of me sometimes going out and doing stuff, I have to never go out and do stuff. And I miss doing stuff.
But you know what I don't miss? I don't miss having to do the work involved with having guests over. Or, more usually, me doing the work and then people cancelling at the last minute.
Just can't trust anyone.
Fortunately, I've entered Grumpy Old Man territory and I no longer have to put up with shit that I don't want to put up with, such as people who are annoying. Which is almost everybody.
Don't get me wrong - if I'm invited to dinner, once I can move around again, I'll be as polite and respectful as I know how to be (though you'll still have to put up with my jokes). But no more hosting for me. Too much work for not enough reward.
Yeah, I'm in a bad mood right now. Like I said: Grumpy Old Man.
February 23, 2021 at 12:02am
Well, it looks like I have a new mission in life. In addition to the beer odyssey.
The mission is to visit every state and verify -- or disprove -- this list.
"All of humankind has one thing in commonâthe sandwich," renowned late-aughts philosopher Liz Lemon once theorized, on NBC's 30 Rock. "I believe that all anyone really wants in this life is to sit in peace and eat a sandwich."
Late-noughties. There's no excuse to call the 2000-2009 decade anything else. (Spare me the pedantry of pointing out it's "really" 2001-2010.)
Roughly as old as the country and invented by the Earl of Sandwich, an Englishman who never seemed to have time for a proper sit-down meal, Americans have spent the entirety of our nation's existence seeking to perfect the humble art form.
You know, that "Earl of Sandwich" origin story never quite sat well with me, like the aftereffects of... well, of a sandwich that's maybe a bit too greasy.
https://en.wikipedia.org/wiki/Sandwich
The sandwich is named after its supposed inventor, John Montagu, 4th Earl of Sandwich. The Wall Street Journal has described it as Britain's "biggest contribution to gastronomy."
If you follow the links from that Wikipedia page, you'll find that Montagu, Earl of Sandwich, didn't invent the sandwich any more than Amerigo Vespucci "discovered" America, and yet they became their respective namesakes. So okay, fine, the concept existed before they named it that.
Anyway, back to the original article.
There were strict parameters; no burgers, no hot dogs, no burritos, no tacos, and in nearly all cases, no barbecue. Sandwiches or not sandwiches, they can go ahead and get their own lists.
Want to start an argument? Ask if a hot dog is a sandwich.
I mean, sure, it's meat and stuff eaten with bread, but technically it's not two pieces of bread, but a single hinged bun. Consequently, it's a taco. I have spoken.
Now, obviously I'm not going to comment on all of the examples here. For starters, I haven't eaten any of the examples. But I do have some things to say -- obviously.
California
If anyone asks you what the 1970s were like in Los Angeles, drag them down -- immediately, if not sooner -- to Langer's Deli, the best Jewish deli in America, for the pastrami.
That's a bold statement right there. Now, granted, I haven't been to Langer's, but for some reason pastrami sandwiches are quite popular throughout the West, and I've had a few -- and not a single one of them can compare to the ones you get in New York City (or even one I've had right here in my Virginia town). However, I'm a reasonable person, and will withhold judgment until I get to L.A.
At least they're better than most of the South, which somehow came to the mistaken conclusion that it's okay to serve cold pastrami. It is not.
Kentucky
We like to think of the hot brown as the beginnings of a fine turkey club: roast bird, strips of bacon, and slices of tomato on toast. Plot twist -- the whole thing is then flooded with rich Mornay sauce before hitting the broiler, emerging a delicious mess that requires a knife and fork to consume.
I have no doubt that it is delicious, but this is not a sandwich. Yes, I know that "open-face sandwiches" are supposedly a thing, but if it's a piece of bread with toppings on it, it's not a sandwich; it's a pizza. Regardless of toppings.
Massachusetts
Surely there are lobster rolls in a coastal state bookended by Maine and Connecticut, but we're too busy filling up on clam rolls, which are the first meal we think of when we think Massachusetts, or at least the very large amount of the state located by the ocean.
I have nothing to say about this creation that I haven't tried yet, but I'm including this to point out that I went to a BBQ place in central Massachusetts once that billed itself as having "The Best BBQ in the US." Being from the South, I had to go in and dispute that claim. Well, I did -- but I have to admit, it didn't suck.
Nebraska
...that other delightful, if less widely-renowned, Nebraska invention, the deep-fried grilled cheese sandwich.
Go. Go look at the picture of it at the link. I'm getting a heart attack just looking at it.
Pennsylvania
A serious jostling in the mosh pit that is South Philly's perennially cramped John's Roast Pork rides high atop our list of post-pandemic musts, partly just to feel something after our terrible year of No Touching, but also for a sandwich, a roast pork sandwich please, the one any right-minded Philadelphian can tell you is the one you go looking for once you're ready to dine sober, in the sunlight, like a whole adult.
I'm including this one so my friends from Pennsylvania can go find the article's authors and shove some Philly cheesesteak down their gullets.
Virginia
In a perfect world, the I-81 struggle through America's Secretly Biggest State would have been long ago ameliorated by modern conveniences including a third lane in each direction; in the meantime, better to think of this essential leg of the Northeast to Deep South fast route in terms of the many small detours one can make in order to feel human again, after yet another hour of staring at the rear end of the same tractor trailer.
Virginia is not a state to travel through. It is a state to travel to. Or preferably to be from.
I haven't tried the sandwiches mentioned here but you'll note that unlike most of the other states' offerings, there are two choices. Now I have to try them, but I seriously doubt they'll be better than the pastrami and swiss with spicy mustard and onion on a sub roll that's served at a lunch counter here in my town.
So... road trip, anyone? As long as I also get to try the nearby breweries.
February 22, 2021 at 12:09am
Mostly I'm just linking this one because I ran across it one day and thought it was interesting. Not enlightening. Interesting.
This just in: Weird things happen to your mind if you take a break from the usual routine and try something new.
Now, look, I'm not being skeptical of the skeptic here. I have no reason to believe he's not relating anything but the truth about his lived experience. I'd just, personally, rather read about it than do it.
So I recently put my skepticism to the test by going on a weeklong silent Buddhist retreat, which my pro-Buddhism friends Lisa and Bob argued was my moral obligation.
So, the rule is: if someone critiques a philosophical viewpoint, they have a moral obligation to spend a week immersed in that philosophy?
The retreat cost $1800, and we were encouraged to give the Lama a donation at the end.
So, at the very least, your bank account gets "enlightened."
Look, even monks gotta make a living, I get it. And $1800 plus tip to do essentially nothing but navel-gaze for a week isn't so bad, if food, lodging and activities (such as they are) are provided.
Each day's schedule, which lasted from 6 A.M. to 9:15 P.M...
That's a hard pass for me right there. I'm sure your mind is more open -- or, on the flip side, more susceptible to suggestion -- if you're tired. And I know for some people that's something like a normal schedule. But not me.
The retreat convinced me that contemplation can reproduce the effects of psychedelics, a claim I have long doubted. On the retreat, as during a trip, I saw life's inexplicability and improbability, which I like to call "the weirdness." On psychedelics, the weirdness screams at you. On the retreat, the weirdness murmured.
Well? The only way you can find out if it's true or not is to try both. Perhaps at different times. Perhaps at the same time. Maybe all of the above. You know. For science.
Now that I'm back in the real world (which, given the digital distractions, is more virtual than real)...
I really dislike this dichotomy. Lots of things you experience are real, whether in person or mediated through a communication device. The people you interact with online are real. The stuff you get from the internet is real. Or, well, to take the Buddhist angle, as real as anything else, anyway. Or are you going to assert that talking to someone on the phone, back when we did such things, wasn't a real interaction? The internet is just an extension of communication.
And that's always been my issue with many philosophies: they turn shit around so up is down and left is right. They claim that what's real is an illusion and the shit your mind comes up with is reality. All this does is muddle the definitions of "real" and "illusion" until those are just meaningless sounds.
There's an argument to be made that we can only perceive things as they're filtered through our senses and processed by brains. But that doesn't make the chair I'm sitting in any less real, and it doesn't make the random thoughts I have any more real.
But hey, like I said, I'm not ragging on anyone else's experience, here. The article is worth a read in its entirety, I think (or I wouldn't have read it in its entirety).
And you're not going to convince me that the article isn't real, even if it is nothing but electrons flitting around in cyberspace.
February 21, 2021 at 12:03am
This one's a little on the deep side, and I can't claim to understand all of it. But I'm putting it out here anyway in the hope that someone will get something out of it.
Incidentally, my browser's spellchecker doesn't recognize the legitimacy of "untestable." There's probably a metaphor in there somewhere, but I'm entirely too sober to tease it out right now.
It's an interesting time to be making a case for philosophy in science. On the one hand, some scientists working on ideas such as string theory or the multiverse -- ideas that reach far beyond our current means to test them -- are forced to make a philosophical defense of research that can't rely on traditional hypothesis testing. On the other hand, some physicists, such as Richard Feynman and Stephen Hawking, were notoriously dismissive of the value of the philosophy of science.
I expected better from Feynman, and Hawking always struck me as having something akin to a philosophical streak.
That value is asserted with gentle but firm assurance by Michela Massimi, the recent recipient of the Wilkins-Bernal-Medawar Medal, an award given annually by the U.K.'s Royal Society. Massimi's prize speech, delivered earlier this week, defended both science and the philosophy of science from accusations of irrelevance. She argues that neither enterprise should be judged in purely utilitarian terms, and asserts that they should be allies in making the case for the social and intellectual value of the open-ended exploration of the physical world.
Personally, I think judging anything on "purely utilitarian terms" misses an important part of the human experience. Which is not to say I don't have a utilitarian perspective on a lot of things; I just don't think that's the only viewpoint worthy of consideration.
The article goes on to interview Massimi, and while I won't quote most of it here, it's worth seeing both the questions and her answers. But this is one question that struck me as interesting -- not for the answer, appropriately enough, but for the question itself:
One criticism made is that science moves on, but philosophy stays with the same old questions. Has science motivated new philosophical questions?
Her answer is important from one perspective, but I'll offer another point of view: Yes, science motivates new philosophical questions. It does this all the time. These discussions even seep into popular culture. The very first true science fiction book, Shelley's Frankenstein, delves into the philosophical repercussions of applied science (albeit from a purely fictional perspective). The question isn't answered, of course -- we're still asking it, in stories, right up to the present day. And undoubtedly beyond.
Without science offering the possibility of creating something new in the world, there would be little reason to ask those questions on a serious level.
Later, part of her answer to another question goes like this:
Obviously it is not the job of philosophers to do science, or to give verdicts on one theory over another, or to tell scientists how they should go about their business. I suspect that some of the bad press against philosophers originates from the perception that they try to do these things. But I believe it is our job to contribute to public discourse on the value of science and to make sure that discussions about the role of evidence, the accuracy and reliability of scientific theories, and the effectiveness of methodological approaches are properly investigated.
Ethics is in the purview of philosophy. Perhaps not so much in physics, but other branches of science, particularly biology and related fields, absolutely have to consider ethics. Animal testing? Hell, human testing? The potential for cloning? The quote from the original Jurassic Park movie comes to mind: "Your scientists were so preoccupied with whether they could, they didn't stop to think if they should."
But this is, for me, the most important point she makes:
In this respect, I see philosophy of science as delivering on an important social function: making the general public more aware of the importance of science. I see philosophers of science as public intellectuals who speak up for science, and rectify common misconceptions or uninformed judgments that may feed into political lobbies, agendas and ultimately policy-making.
This article is from 2018, and since then I think at least one major thing has happened that demonstrates the value of science in the public sphere. While there is certainly research that is, at best, questionable -- nutrition science comes to mind -- I think one of the greatest issues facing us right now is general distrust of science and the dismissal of experts, in science or other fields, as no more useful than your uncle's deranged Failbook posts.
So yes, by all means, keep up the philosophy. I won't understand all of it. But I don't have to.
February 20, 2021 at 12:01am
Okay, it's called "Food Grammar," but "Grub Grammar" is alliterative so I'm exercising my literary license. Hey, that also alliterates.
One of the common experiences of humankind -- well, actually, of pretty much all animals -- is that we eat. Humans make it social, though, so you'd think we'd have more customs in common. That turns out not to be the case.
Serve spaghetti and meatballs to an Italian, and they may question why pasta and meat are being served together.
"Because that's how we do it in the US."
Order a samosa as an appetizer, and an Indian friend might point out, as writer Sejal Sukhadwala has, that this is similar to a British restaurant offering sandwiches as a first course.
Depends on the sandwiches.
Offer an American a hamburger patty coated in thick demi-glace, and they'll likely raise an eyebrow at this common Japanese staple dubbed hambagoo.
Now I want to try that.
Each of these meals or dishes feels somehow odd or out of place, at least to one party, as though an unspoken rule has been broken. Except these rules have indeed been discussed, written about extensively, and given a name: food grammar.
I'm calling it grub grammar anyway. Because I'm contrary like that.
Yes, much like language, cuisine obeys grammatical rules that vary from country to country, and academics have documented and studied them.
Sounds to me more like an excuse to try a bunch of different foods.
A culinary grammar can also provide insight into how an assortment of ingredients becomes a meal, much like how a jumble of words becomes a sentence.
So, in this analogy, cooks are all writers, and really good chefs are famous novelists or poets.
The article provides many examples; if you're the slightest bit interested in international cuisine or other cultural dining practices, give it a look.
In Italy, pasta and rice dishes are served before meat rather than alongside it; in Italian-American restaurants, however, fish is often perched on risotto, and meatballs take their starring place atop spaghetti in the eponymous classic.
You know, something about spaghetti and meatballs always bugged me. It's not the taste; it's usually delicious. It's more about the work you have to do in order to eat it. Like... okay, say you have a dish of spaghetti in some other kind of sauce, no meatballs. Whether it's Italian or not is irrelevant here, the point is, it's pasta and sauce. You stick a fork in, twirl it around, and stuff it into your face. But with meatballs, unless they're really really tiny meatballs (which can also work) you have the extra step of breaking up the meatball with the fork or with the aid of a knife. Then you have to get just the right proportion of meatball fragment to pasta to sauce. I mean, you can't just eat the meatball by itself, and you can't just eat a forkful of spaghetti by itself; otherwise, why bother with one or the other?
But this has nothing to do with the national origin of the dish. As far as I'm concerned, S&M (which I enjoy calling spaghetti and meatballs because I'm hilarious) is just as proper a food as anything you get in Italy, or anywhere else for that matter.
I mean, it's not like Italy invented pasta, and it's not like the tomato that forms the basis for so many pasta sauces was native to Italy. They had to put those things together once they got noodles from the east and tomatoes from the west (see the discussion of tomatoes a few entries back). So the Italian cooks, at some point, used these foreign concepts and made their own dishes out of it. S&M is just more of the same.
None of which says a single word about the taste of the dish. And that's what I care about in a dining experience. I don't give two shits if the dish is "authentic."
"Japanese people will take anything and make it theirs," says Albala, citing shokupan, a Japanese white bread that's even sweeter and softer than American Wonder Bread.
This... this is nothing to be proud of. If you want to cite one of the greatest achievements of Japan, it's how they were able to steal whiskey from Scotland and come up with distilled spirits even better than most Scotch.
Oh, yeah, I said that out loud. Okay, fine, it's not technically "food," but we were talking about how Japan will take anything and improve upon it.
I'll finish this with my own first known experience with grating grub grammar. And it didn't happen in some exotic foreign land, but in New York City, and it wasn't some other culture's cuisine (there are many such in NYC, which is one thing I love about the place), but in my very own aunt's house.
You'd think that she and her sister, my mom, would have similar ideas about how food is served. And they did, in some ways, though my mom was also influenced by my father's New Orleans background. But as a kid, I was always presented with meal courses in the following order: salad, main course (that is, what we call an entree, which is not the same thing as the French entrée, which is an appetizer), dessert. I mean, that's what you do, generally, right? With of course variations depending on what's being served.
But then I go to my aunt's house and suddenly I'm expected to eat the main course first and then the salad.
I don't know why this freaked me right out. I mean, I can understand having them both at the same time and switching back and forth; that is, treating the salad as a side dish. But my aunt treated it like First Dessert. So I guess that was grub grammar, even though I didn't know what to call it then, and really, didn't know until I read the article I linked above last month.
And now? Now I'm all hungry.
February 19, 2021 at 12:07am
Well, it looks like I didn't get as astronomical as I expected, so here's my usual just-after-midnight post. Last night, after completing my blog entry, I ended up getting completely danchu, sampling from many different bottles. Predictably, I felt like shit all day (worth it). Hence, tonight, I ended up just taking my traditional couple of shots of birthday tequila, and I also had some port with dessert. Consequently, I'm functional right now.
I'm sure you've all heard about the successful Perseverance rover landing on Mars yesterday. Technology works, and we can do some pretty awesome shit when we put our minds to it. By the time we finally land human boots on the fourth planet, there will be a thriving robot society there. They'll probably pass resolutions to send us back home, or maybe put us in concentration camps to discourage further immigration.
So with the success of the landing, which is designed to search for signs of past life on the Red Planet, it's appropriate that this subject came up at random from my stash.
I'm somewhat sure that there is no alien life on the internet, politicians' tweets notwithstanding. Oh, wait, that prepositional phrase is meant to modify "discussion," not "alien life." English is fun.
Anyway, this blogger, PZ Myers, is one of the few that I follow regularly, because he usually has something interesting to say, even if he does have an obsession with arachnids. Fortunately, spiders are (we think) from Earth so there aren't any in this particular post.
I like speculations about alien life, just as I appreciate the diversity of life on Earth, the different forms of life in the past, and the prospect of evolution in the future, but every time I read about this stuff in astronomy-related journals, I feel like they're making an effort to reduce my intelligence.
I know the feeling.
The problem is that they have no imagination and no biology, but they're trying to imagine the nature of alien biology, and all they end up doing is running around in circles trying to figure out why little grey humanoids aren't landing their flying saucers en masse to march out and shake hands with the president.
I know I've ranted about this sort of thing in here before, so here's another guy's take on the subject.
So, in this post, Myers does what I usually do, which is comment on someone else's article. So here I am commenting on an article that comments on an article. If someone quotes this entry and comments on it, that would be really meta.
This is my problem with the general tenor of these speculations. They all assume that we, that is human-like intelligences, are desirable, inevitable, and the only proper kind of life; they've read far too many science-fiction novels prophesying a colonialist destiny led by strong-jawed Anglo-Saxons with glinting eyes and a finger on the trigger of their blaster.
I mean, hey, I love those stories as much as anyone, though I prefer phasers.
They're always going on and on about the likelihood of finding an intelligent civilization like ours. Why not speculate about finding a planet that has produced kangaroos? Or stomatopods? Or baobab trees? These are all unlikely outcomes of a contingent, complex process that produces immense diversity, and they're all wondering what the "barrier" is that prevents our kind from winning the cosmic lottery every time. Get over it, we're not a favored outcome, there's no direction to evolution, and that's why there aren't smarty-pants bipeds tootling about the galaxy stopping by for tea.
I kinda hate that Myers puts this more eloquently than I have in the past. But I agree.
Anyway, I'm not quoting the quotes here, and I'm leaving out a lot of the commentary, so of course I recommend reading the source. Whether you read the article he's ragging on is up to you; I certainly didn't.
Point is, though, that this is a biologist's perspective on the question, but it's similar to one I came up with previously: that if we find life "out there" -- be it living life, or, as is hoped with the Mars robot, evidence of life that existed in the past -- it will, in all likelihood, be something other than a technological civilization. And yet, we've been so conditioned to equate "life" with "intelligent life" (as usual, please refrain from making silly dismissive comments about "intelligence," as the fact that you can make such a comment immediately contradicts it) that it's almost impossible to find someone who doesn't immediately think of bulbous-headed big-eyed gray guys, or Klingons, or other generally humanoid, tool-using, spaceship-flying aliens.
That said, I really hope this Mars rover finds unequivocal evidence of some sort of living, or once-living, organism on Mars. Because that by itself would be an incredibly important discovery, never mind what sorts of things we might learn from it.
Thanks for all the mixed drink recipes! I definitely saw a few I'd like to try. Rolling the Virtual Dice for the nine commenters (sorry, one comment came in just after midnight), the result is 3, so the Merit Badge will go to... Cubby for the mojito recipe (and I do love mojitos).
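(If anyone's curious, the "virtual dice" are nothing fancier than something like the following Python sketch -- the names here are placeholders, not the actual commenters:)

import random

# Hypothetical names standing in for the nine qualifying commenters.
commenters = ["commenter_" + str(i) for i in range(1, 10)]

roll = random.randint(1, len(commenters))   # the "virtual dice": 1 through 9
print("Rolled", roll, "-- Merit Badge goes to", commenters[roll - 1])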
Still, I appreciated all the comments and we'll do this again soon!
February 18, 2021 at 12:04am
I'm going to start this entry by saying that I've never read this book; I've never heard of this book; and I have not a single clue what the book is about. I mean, really, it could be almost anything:
A person with more than one personality
A haunted house
A church or other religious building
A columbarium
Seriously, just consider all of the various possible meanings and/or connotations of "house" and "spirits" and you're good to go.
Incidentally, because of that last bullet point, this is cheating on the part of whatever author wrote it (like I said, I haven't the faintest clue about the book in question and I'm deliberately not looking it up before doing this entry). It's cheating, because when you have words with that many definitions and connotations, you can employ a great deal of parallelism, and people respond to symbolic parallelism.
And so I'm going to talk about my favorite thing that could be considered The House of the Spirits:
A bar.
I miss bars. I miss them a great deal. Oh, sure, I've gotten more into mixing my own drinks -- everyone needs a pandemic hobby -- but there was something about sitting at a bar and having a lot of options to choose from, and not having to mix them myself.
I will say this, though: one drink I made called for mint leaves. I tried to get mint leaves from the supermarket, but they were out. Instead, I opted for a mint plant, an actual living potted thing with stems and leaves.
Plants never last long around me. They usually take one look at me, say "Oh hell no," and commit suicide. This one has somehow stuck around for... I don't know... two weeks? Something like that. Time has little meaning anymore. More than a week. Less than a month. I mean, it was kind of iffy there for a while, and my housemate helped, but somehow this plant still lives; I've been pruning it occasionally for leaves, and there's even new growth starting.
But see, if I could go to a bar, I could just order a drink and the mint leaves are already cut and the bartender does all the actual work. And I wouldn't be limited to just the 25 bottles I have on hand.
The drink in question, by the way, is called a Ginger Rogers and I mostly just started making them because I had a metric ton of ginger to use up and I like gin a lot. Yes, the name is a pun because of course it is.
But since I had to also buy the mint plant, now I'm almost out of ginger and have plenty of mint, so soon I'll have to buy more ginger (and gin) in order to use the mint because I can't just let the plant grow out of control. This happens a lot. Say you want ham and cheese sandwiches. So you buy ham, cheese, bread, lettuce, tomatoes, whatever. You run out of ham, so you buy more. But then you run out of bread and you can't let the ham and cheese and veggies go to waste so you buy more bread. Then the cheese runs out and you can't have a ham and cheese sandwich without cheese, so you buy more cheese. Meanwhile, the remaining veggies have gone bad because it's been two days, so you have to buy fresh ones of those and while, sure, you can probably just make a salad you still have ham, cheese and bread so you make sandwiches but then the ham runs out again and you buy more ham... the cycle never, ever ends.
And that's another reason I prefer to go to bars and restaurants.
It's been my tradition for a while now, on my birthday, to drink tequila. Lately this has happened at one or another local tequila bar. Not this year. But I'm prepared. I have a bottle of tequila, a bottle of triple sec, some limes, salt. But I'll probably run out of one or another of those things and have to replace it, and then run out of another and have to replace that. Until my spirit gives up the ghost. The ham and cheese sandwich effect, you know.
So since it's my birthday, it's time for me to give someone a present...
Merit Badge Mini-Contest!
Give me a mixed drink recipe. Link it or type it in, I don't care. And I don't want to leave out my non-drinking readers, so I'm not saying there has to be alcohol involved, just make it an interesting mixed drink (e.g. no "virgin screwdriver," which as far as I know is just orange juice). Out of all qualifying comments, I'll pick one commenter at random to give them a Merit Badge tomorrow. Deadline, as usual, is midnight tonight, Thursday, WDC time.
And if I like a recipe I might even try it out sometime.
One thing, though: I usually post just after midnight because it suits my schedule. This will probably not happen tomorrow because, well, see above about "tequila on my birthday." I will post and choose a MB recipient sometime later on Friday, though, if the tequila doesn't kill me.
February 17, 2021 at 12:02am
You say toe-MAY-toe, I say toe-MAH-toe...
Clearly, it was because no one knew how to pronounce it. Hey, let's mess with everyone's heads and start pronouncing it toe-MATT-oh. Who's with me? While we're at it, we can restart the old argument over whether it's a fruit or a vegetable (truth: it's either, depending on whether you ask a botanist or a chef).
In the late 1700s, a large percentage of Europeans feared the tomato.
To be sure, there was a lot to fear in late 1700s Europe, and not just if you were a French noble.
A nickname for the fruit was the "poison apple" because it was thought that aristocrats got sick and died after eating them...
Which is how the tiny violin got invented.
...but the truth of the matter was that wealthy Europeans used pewter plates, which were high in lead content.
And they probably ate mercury off of them, too. No, really: apparently mercury was considered a cure for syphilis. This is one reason why I keep saying the past sucked.
Because tomatoes are so high in acidity, when placed on this particular tableware, the fruit would leach lead from the plate, resulting in many deaths from lead poisoning. No one made this connection between plate and poison at the time; the tomato was picked as the culprit.
This is why science is important. You need a double-blind test. Give 1/4 of the aristocrats pewter plates with tomatoes on them. Give another 1/4 of them pewter plates with placebos on them. The third 1/4 gets placebo plates with tomatoes, and the remainder get placebo plates and placebo tomatoes. Record how many of each group dies. Then guillotine the survivors just in case.
Around 1880, with the invention of the pizza in Naples, the tomato grew widespread in popularity in Europe.
Cultural appropriation! The tomato isn't native to Italy, so it's inauthentic cuisine! Oh, well, we got 'em back when New York perfected the pizza.
Before the fruit made its way to the table in North America, it was classified as a deadly nightshade, a poisonous family of Solanaceae plants that contain toxins called tropane alkaloids.
This, in spite of European invaders literally seeing Americans eating the things.
Like similar fruits and vegetables in the solanaceae family -- the eggplant, for example -- the tomato garnered a shady reputation for being both poisonous and a source of temptation.
To be fair, eggplant isn't actually food. Oh, sure, I understand that some people eat it, but I refuse to classify it as edible. Consequently, it is not a source of temptation for me. Neither is the tomato, which I can take or leave, but at least I acknowledge it as food.
Gerard's opinion of the tomato, though based on a fallacy, prevailed in Britain and in the British North American colonies for over 200 years.
Hence the importance of fact-checking. Fake news didn't originate with the internet. Or even with the printing press.
By 1822, hundreds of tomato recipes appeared in local periodicals and newspapers, but fears and rumors of the plant's potential poison lingered.
Again, a testament to the problem of anchoring bias, where the first piece of information you learn about something sticks with you in spite of later corrections. Still, if that first piece of information is "that shit'll kill ya," this can be somewhat forgivable.
Today, tomatoes are consumed around the world in countless varieties: heirlooms, romas, cherry tomatoes -- to name a few. More than one and a half billion tons of tomatoes are produced commercially every year. In 2009, the United States alone produced 3.32 billion pounds of fresh-market tomatoes. But some of the plant's night-shady past seems to have followed the tomato in pop culture. In the 1978 musical drama/comedy "Attack of the Killer Tomatoes," giant red blobs of the fruit terrorize the country. "The nation is in chaos. Can nothing stop this tomato onslaught?"
Awww. I was hoping that the article wouldn't mention this seminal work of cinema, so I could write this entry about it. Oh, well, I guess I'll just keep it in the title.
There's more to the article than the bits I riffed off of here; it's worth reading for the description of the dreaded Green Tomato Worm (which is not, in fact, a worm that infests green tomatoes, but a green worm that infests all sorts of tomatoes -- one of the many ambiguities possible with the English language, which is what makes it so much fun).
But mostly I'm glad this article came up in my random roll today, because it's science, history, food, cinema and the potential for comedy all rolled into one, like a burrito. A burrito with chopped tomatoes. Okay. Now I'm hungry. Tomorrow's entry will almost certainly be darker, for reasons that should become apparent.
February 16, 2021 at 12:02am
You're all expecting me to rag on Hubbard here, aren't you? Aren't you? But no, the organization he founded has armies of lawyers and this shit is public.
Instead, I'm going to take this opportunity to talk about science fiction.
I talk about it sometimes in my Fantasy newsletter editorials, because there's a lot that fantasy has in common with science fiction. Incidentally, I really hate calling it "sci-fi," but I have been known to abbreviate it as SF. Rarely, though, because that could also mean "speculative fiction," which is probably a broader concept but I can't be arsed to get into the technical nuances of the differences between genres. I'm certain that for some people, it's their hill to die on.
But, as much as SF has in common with fantasy, there are important differences. I mean, yes, I'm intimately familiar with Clarke's Third Law ("Any sufficiently advanced technology is indistinguishable from magic.") I'm also familiar with the people who think they're oh-so-clever and turn it around to "Any sufficiently advanced magic is indistinguishable from technology." That's not nearly as clever as you think it is.
And I'm not saying that there aren't shades between the two genres. Hell, some of my favorite stories are somewhere on the spectrum between the two extremes. And by "extremes" here, I mean that on the one pole, you have magic and maybe the supernatural. The Platonic ideal of this is, of course, Tolkien. On the other pole you have pure science and technology, no supernatural elements, everything is explainable in terms of laws of the universe. I can't be arsed to come up with a Platonic ideal for that.
I do know it's not Battlefield Earth.
And no, it's not Star Trek, either. I love Trek, but it takes too many liberties with the laws of physics to be pure science fiction. Just to be clear, I have no problem with that; it's just a mental categorization thing.
I can also tell you that the difference between fantasy and SF isn't about time period. There's fantasy set in the future, SF set in the past, and all kinds of variations thereof. There's also fantasy set so far in the future that the basis for the magic is actually science, in accordance with Clarke's Third Law.
Before I stopped reading Orson Scott Card's books, I attended a book signing he did. As a published writer of both fantasy and science fiction, he's probably more qualified to discuss the differences than I am, a mere reader and unpublished writer. And he said: "Look at the book covers. Fantasy has trees. Science fiction has rivets."
This was before steampunk, though. Lots of rivets. No grounding in science.
Also, to be perfectly clear, Star Wars is fantasy. Sure, yes, I know, spaceships, warp drive, robots, whatever. All the tropes and trappings of science fiction. But it's not science fiction; it's fantasy that uses SF props. That is a hill that I will die on.
Again, it doesn't matter. I don't feel the need to choose, any more than I need to choose between Wars and Trek, between Marvel and DC. But genre has one important function: marketing. Some people prefer one over the other, but almost everyone wants to know, at least vaguely, what they're getting into when they start a book or movie or whatever. If it bills itself as horror, it probably shouldn't focus on romance. If it's supposed to be a detective novel, maybe don't turn it into a gothic vampire story (or if you do, at least warn us).
Other people might disagree. And that's where the battle comes in.
February 15, 2021 at 12:05am
It is not likely that every problem caused by technology has a technological solution. But many do. Here's an example.
Packaging from the grocery store, lint from our clothing, plastic shopping bags -- plastics and microplastics are everywhere, and they're not going anywhere.
Mr. McGuire: I want to say one word to you. Just one word.
Benjamin: Yes, sir.
Mr. McGuire: Are you listening?
Benjamin: Yes, I am.
Mr. McGuire: Plastics.
Benjamin: Exactly how do you mean?
Mr. McGuire: There's a great future in plastics. Think about it. Will you think about it?
I'm sure this is not what McGuire meant in The Graduate, but yes, indeed there was a great future in plastics. A very, very long one indeed.
In order to speed up this decomposition process, scientists from Rice University are transforming these discarded plastics into non-toxic, naturally occurring materials. They're doing this by using a newly developed technique called "flash Joule heating" to rapidly heat plastic materials to very high temperatures.
"But where's the energy coming from to-" "Shut up."
Currently, there are a few plastic recycling techniques that are widely used, with differing results.
Now, I've heard -- without a lot of confirmation -- that many plastics don't get recycled at all, even if they're labeled with recycle symbols, sorted carefully into categories, left in a recycling container instead of the trash can (rubbish bin for my Brit friends), and picked up by a green truck with a great big RECYCLE logo on it. This is, I've heard, because plastics are generally difficult to recycle. Aluminum and other metals? Easy; melt it down and you get... aluminum or whatever. Glass? Also easy. Melt it down and you get glass. Melt plastic down and those handy long-chain hydrocarbons break apart and you get methane, carbon dioxide, and all sorts of fun-to-pronounce chemicals.
Even more shockingly, each plastic bag is used for, on average, less than one hour.
Depends on what you mean by "use," and what you mean by "bag." Why, I have plastic bags in my freezer that have been in use since 1996. I suppose one of these days I should remove the 25-year-old meat, but I can't be arsed.
Okay, that's hyperbole. Still, it might be time to go through the freezer.
In contrast, the "flash Joule heating" method turns plastic into graphene, which is highly recyclable and very stable. Graphene itself is incredibly strong and stretchy -- 200 times stronger than steel. Graphene is a single layer form of graphite, a naturally occurring carbon-based mineral that is commonly found as pencil lead.
What the hell, MassiveSci? I expect better than this from you. Graphite (as well as graphene) is carbon-based in the same sense that the clear liquid coming out of your faucet is "water-based." That is, it's pure carbon. Like diamond, only generally not mined by 8-year-old slaves.
By directly generating graphene from plastic waste, it is possible to reduce its production cost.
Again, graphene is carbon. Plastics are based on long-chain hydrocarbons. I don't see very much in the article about what happens to all the other little atoms. "Some hydrogen and hydrocarbons." Okay, then what?
Generating graphene directly from plastic could disrupt the graphite supply chain, thereby decreasing mining activity and reducing pollution caused by mining.
Job killer!
In the meantime, you can help fight plastic waste by making one small but significant change in your life: bring your own reusable fabric bag to the store with you.
You know. I'm old enough to remember that groceries used to be packed in brown paper bags exclusively. Paper, of course, is both fairly easy to recycle or, if you can't be arsed to recycle it, biodegradable. Then the grocery stores started stocking those flimsy plastic bags that (con) couldn't carry much more than three tomatoes without breaking but (pro) had handles. Plastic bags were better for the stores because they were cheaper and lighter, and you could store like 1000 of them where you could only store like 100 paper bags (note: those numbers are wild-ass guesses, but the idea is valid). There was a transition period you might recall when shoppers were offered a choice. "Paper or plastic?" became even more ubiquitous for a few years than "Credit or debit?" and "You want fries with that?"
Finally they trained consumers to not want paper bags anymore, and they quit asking, instead just dumping the groceries into plastic bags. This end of the "Paper or plastic?" era somehow coincided with the first scaremongering about OMG MICROPLASTICS WTF NONBIODEGRADABLE IT'S EVERYWHERE. Hell, some areas flat-out banned plastic bags, and didn't go back to paper. And bringing my own bags to the grocery store is Not Going To Happen. (I don't go to the store anymore anyway, instead opting for delivery -- which usually comes in plastic bags. I don't give a shit.)
I'm not saying there isn't a problem with microplastics. I'm saying, like, why the fearmongering? Unless it's a massive campaign financed by pissed-off paper bag manufacturers, or the makers of reusable cloth grocery bags. Yeah... that's my working theory. It's like how I've always thought that PETA is actually run by the soybean industry in an effort to sell more tofu.
Anyway. Point is, this is pretty clever technology and I hope it actually becomes a thing. |
February 14, 2021 at 12:35am
|
As I refuse to acknowledge what today is, I'll just post a random link.
Fortunately, "trying to stay optimistic" isn't a trap I fall into. So, what's FONO? I guess I have to give them clicks to find out!
When her patient started talking about sick notes, neuropsychologist Judy Ho decided to intervene. Her client, a wildly successful entrepreneur, was rich, happily married, and well-regarded by his peers.
So, not someone anybody else is going to feel sorry for no matter what happens. Only in Bloomberg.
The problem was the days when he felt depressed and run-down but unable to admit it. The only way to address it, he felt, was to regress, like a schoolboy, and look for permission from a doctor to regroup. "He knew he wasn't sick, but he'd go in and make something up," she says, "just so he could take a day off and be OK with himself."
Free business idea for you: sell miniature violins.
She recognized he was suffering from a surging contemporary malaise. "He always had to demonstrate his worth to people," she continues. "He was thinking, 'I must exude this image of success and a happy life that everybody has come to know about me, and I don't want to ever change that image.' That's toxic positivity."
Oh. Damn. I was hoping "toxic positivity" referred to the tendency of people to exhort other people to always look on the bright side of life. Those people make me grumpy. I mean, good for them, but bah humbug.
This is bad enough, though, I suppose. Kind of like the social media hounds who are always portraying themselves as shiny and happy all the time.
Call it FONO, or fear of a negative outlook. Also known as "dismissive positivity," it's expressed as an overbearing cheerfulness no matter how bad things are, a pep that denies emotional oxygen to anything but a rictus grin.
Ah, there's the definition. Nah... too close to FOMO for my taste. Used to be referred to as Pollyanna syndrome or something, after a character who was always, always cheerful no matter what happened.
You see it on Instagram, where the affective filter is always upbeat, usually followed by the hashtag #blessed.
No, I don't, because I avoid Instagram even more strenuously than I avoid Bookface and Twatter. And hashtags are a plague upon the land.
You hear it from the SoulCycle instructor exhorting every rider to swaggeringly sweat through the pain.
No, I don't, because... well, you know.
It's available from the newly anointed chief creative officer for Vital Proteins, actress Jennifer Aniston, who claims that renewal isn't only a result of its powders: Instead, "it's within us."
So what do we need of your patent nostrums?
You might even recognize it in the boss who insists that colleagues start every Zoom meeting by sharing a piece of good news to help keep moods buoyant amid the gloom.
Now, look, maybe this is going a bit too far. I don't think there's anything wrong with acknowledging that there's some good news during times of crisis. It's not just that every silver lining has a cloud, it's that there may be benefit, when you're bombarded by bad news all the time, to try to find one positive thing in your life. The problem comes when you focus only on the positive things, just as it would be a problem to focus only on the negative things.
In my own opinion, anyway.
For example: I hate February. I mean, I utterly despise this fucking month with a passion fit to burn stars. It's cold, it contains stupid observances like Groundhog Day, President's Day, and... well... that which shall not be mentioned but falls on today. It hosts a stupid sportsball game with stupid commercials. My birthday occurs in this hated month, which is a constant reminder that I have fewer of those in front of me than I have behind me. It's dreary and confining and depressing. On the bright side, most years it's only 28 days. As of today, it's half over. See? I can find a light in the darkness when I try.
Think of this mindset as one that responds to all human anxiety, or sadness, with uncompromising optimism. It can be found in sentences that start with those negating words "At least," which are followed by a suggestion that however bad you're feeling, at least you've got plenty else that should offset and outweigh it.
If I trust you enough to bitch about something, don't ever start a sentence with "At least." I will punch you, and then I'll feel bad about it, but at least I'll have had the satisfaction of punching you.
Ordinary Americans, casting around for inspiration and reassurance, became prime targets for these peddlers of perkiness.
"Peddlers of perkiness" would be an awesome name for a retro swing punk band.
Such magical thinking has paralleled the rise of professionals hired to be a personal cheerleader. Membership of the International Coaching Federation, a credentialing and training program accrediting body, has soared from almost 4,700 worldwide in 2001 to more than 41,000 today.
Wait. This exists? This is a thing? Forget what I said earlier. There is no positive side to this.
Successful people are the most likely to fall prey to this way of thinking, says Naomi Torres-Mackie.
I think I might be successful investing in tiny-violin futures.
For the current generation, the origins of this emotional cure-all lie in the 1990s, when then-president of the American Psychological Association, Martin Seligman, posited that pessimism is a learned behavior. Therefore it both could and should be avoided.
Okay, but I'm pretty sure faux positivity started long, long before this asshole. Let's see... oh, yeah, here. "The modern positive thinking movement started in the late 1800s with a watchmaker named Phineas Quimby. Quimby became fascinated with the practice of mesmerism (a.k.a. hypnotism). He became the apprentice of a famous French mesmerist and traveled New England learning the trade. Once he could hypnotize on his own, he opened a practice and started having some success alleviating the symptoms of psychosomatic disorders. This led him to believe that the body was a reflection of the mind and that all illness was caused by false beliefs."
Ah, yes, one of the oldest fallacies in the book. "An herb cured my illness, so ALL illness must be curable by herbs!"
My apologies to anyone who actually clicks on that second link. I started feeling nauseated just glancing at it. Consequently, I would say their assertion about Quimby is, at the very least, suspect. But my point is that people have been pushing bullshit positivity since LONG before the 1990s.
That observation snowballed into bestsellers such as The Secret, first published in 2006 by Australian TV executive-turned-author Rhonda Byrne. It was popularized after Oprah Winfrey championed its ethos. That breakout bunkum bible was essentially built on claims that the power of positive thinking would provide whatever you want, be it a baby or a Mercedes-Benz.
"Breakout bunkum bible?" I'm thinking bardcore / metal / steampunk fusion band.
But yes, "The Secret" is, with the possible exception of Twilight, the stupidest piece of trash published in this century. And yet, here I am, unpublished, so not so stupid for the author, is it?
So, How to Cope?
This is the central question behind about 90% of the shit I see online these days.
Ho, the neuropsychologist, has an unexpected suggestion to help calibrate a Pollyanna perspective: a session watching Disney-Pixar's Inside Out, which animates and dramatizes human emotions.
This therapy session has been brought to you by Disney. Disney: We bring good things wherever we go! Be sure to subscribe to our streaming service, and remember, our theme parks are still open and waiting for your smiling (but masked and socially distant) faces! Remember Disney for all of your entertainment needs. Now featuring Star Wars, Marvel Comics, National Geographic, and the best lawyers in the business. Disney!
It's no surprise that Byrne would also return now. Her sequel, The Greatest Secret, came out in November. Read it, the blurbs tout, and you can remove all negativity--as if doing so should be a central goal in life. (More than 80% of Amazon.com Inc.'s user reviews gave it five stars. It would be too negative to be negative, it seems.)
I so very badly want to give it a one-star review on general principles, but I have too much of a sense of honor to review something I haven't read, way too much self-respect to actually read it, and no intention of actually paying money for that dreck.
So there's my rant for the day, which I'll end by reiterating: I hate February. |
February 13, 2021 at 12:03am
|
Sometimes, I just like to have fun with these things.
And sometimes those inner narratives lead to nutjob whackaloon conspiracy "theories."
We are all storytellers; we make sense out of the world by telling stories.
Occasionally, that story is "Aliens did it."
And science is a great source of stories. Not so, you might argue. Science is an objective collection and interpretation of data. I completely agree. At the level of the study of purely physical phenomena, science is the only reliable method for establishing the facts of the world.
Facts which can then be twisted into science fiction. Hey, I'm not knocking science or science fiction, but it's important to keep the two separate in one's head.
There are no naked facts that completely explain why animals sacrifice themselves for the good of their kin, why we fall in love, the meaning and purpose of existence, or why we kill each other.
Not "completely," but we have pretty good ideas about some of these. That last one, for example. It's because you're playing what you think is "music" too loudly.
For all of the sophisticated methodologies in science, we have not moved beyond the story as the primary way that we make sense of our lives.
To get serious for a moment, I happen to agree with this. But then, I fancy myself a writer.
Let's begin with an utterly simple example of a story, offered by E. M. Forster in his classic book on writing, Aspects of the Novel: "The king died and then the queen died." It is nearly impossible to read this juxtaposition of events without wondering why the queen died.
It's because she said, "Let them eat cake."
Okay, look, Marie Antoinette probably never uttered those words (not even the French equivalent). Supporting this, there was a story written by Jean-Jacques Rousseau long before the French Revolution where he had a princess utter those words (except it was "brioche," not "cake," but whatever). Here. (That wiki page also has a very helpful photograph of a brioche, and now I'm hungry.)
That's a story we tell ourselves. Some stories are based on fact. That one appears to be the other kind.
Once a relationship has been suggested, we feel obliged to come up with an explanation. This makes us turn to what we know, to our storehouse of facts. It is general knowledge that a spouse can die of grief. Did the queen then die of heartbreak? This possibility draws on the science of human behavior, which competes with other, more traditional narratives. A high school student who has been studying Hamlet, for instance, might read the story as a microsynopsis of the play.
Nah, I'm sticking with the French Revolution. Vive la guillotine!
The pleasurable feeling that our explanation is the right one--ranging from a modest sense of familiarity to the powerful and sublime "a-ha!"--is meted out by the same reward system in the brain integral to drug, alcohol, and gambling addictions.
Which goes a long way toward explaining why so many writers have drug, alcohol, and/or gambling addictions.
The article goes on to discuss the science behind this. I couldn't think of any jokes about it, but it's a good read.
An efficient pattern recognition of a lion makes perfect evolutionary sense. If you see a large feline shape moving in some nearby brush, it is unwise to wait until you see the yellows of the lion's eyes before starting to run up the nearest tree.
I'm too old to run up trees, and besides, who runs up a tree? I'd want to pet the kitty. It may be the last thing I ever do, but it'd be worth it.
Science is in the business of making up stories called hypotheses and testing them, then trying its best to make up better ones.
That's... actually a really good synopsis of science.
But there is also a problem. We can get our dopamine reward, and walk away with a story in hand, before science has finished testing it. This problem is exacerbated by the fact that the brain, hungry for its pattern-matching dopamine reward, overlooks contradictory or conflicting information whenever possible.
Hence why I'm always rambling on about confirmation bias in here. There are other biases and fallacies, of course, but that's the one I know I'm inclined toward.
Because we are compelled to make stories, we are often compelled to take incomplete stories and run with them. With a half-story from science in our minds, we earn a dopamine "reward" every time it helps us understand something in our world--even if that explanation is incomplete or wrong.
And now they've just explained 90% of the internet.
Good science is a combination of meticulously obtained and analyzed data, a restriction of the conclusions to those interpretations that are explicitly reflected in the data, and an honest and humble recognition of the limits of what this data can say about the world.
Nowhere is this illustrated better than in the insistence of certain authors to ascribe mystical properties to quantum phenomena.
The article has a lot more than I'm quoting here, and I think it's important to read. It shows, at least in part, why I'm sometimes in here ragging on science reporting, while still praising (most of) the science on which it reports.
And it helps me to remember that, believe it or not, I, too, am not always right. |
February 12, 2021 at 12:01am
|
Neat how this one comes up the same week I do a Fantasy newsletter editorial about bread. But it was, indeed, randomly chosen.
I haven't read the book in question. I should, though; the bullshit surrounding nutrition science is, as you probably know, an interest of mine.
Thing is, from what I've seen of it, it would be largely confirming my own beliefs about things. "An incendiary work of science journalism debunking the myths that dominate the American diet and showing readers how to stop feeling guilty and start loving their food again--sure to ignite controversy over our obsession with what it means to eat right."
But, see, part of the problem with all the different diet and exercise and other health publications, programs, whatever out there is that they follow the standard marketing playbook: 1) Convince people that they have a lack and will not be happy until that lack is fulfilled; 2) sell the thing that will fill the hole you've just created in their psyche.
I'm no expert, but as far as I've been able to tell, that is the essence of advertising. The process itself is independent of whether the product you're selling is genuinely a good thing, or utter bullshit. And the blurb I quoted above (found via Google when I searched for the book) follows the script, also throwing in the "controversy" angle, which makes people want to see what the fuss is supposedly all about.
Still, I want to believe the book is worthwhile, because I long ago grew weary of the endless succession of diet and exercise fads, each marketed to a neurotic public, each in turn fading into obscurity as the new fads roll out. Different foods are by turns demonized or extolled, until it's impossible to tell what one "should" really be doing to maybe eke out another year or two of existence.
This is by design, as it keeps the publishing industry in business.
As for gluten itself, the whole "gluten-free" fad (which finally shows signs of winding down) misled people in a big way. It didn't help that "gluten" isn't a very pretty word, so it was easy to manipulate people into believing that it's Bad For You, in much the same way as it was easy to manipulate people into thinking that something labeled "All-Natural" is Good For You. (Incidentally, gluten, tobacco, and poison ivy are all-natural. Cognitive dissonance also sells product.) I'd even heard of people demanding gluten-free foods simply because they thought it was some kind of additive, and additives are, of course, Always Bad For You. (Anything can be an additive.)
One good thing has come from this bullshit: since less than 1% of the population has legitimate problems with eating foods containing gluten, there wasn't a lot of incentive for companies to create products that people with celiac disease could eat, which limited their diet options to food that is legitimately no fun at all. But suddenly a whole lot of people convinced themselves they had gluten sensitivities, companies marketed to them, and suddenly actual gluten-intolerant people had more options. So that worked out well for them.
Anyway. Like I said, I haven't read this book so I have no idea if what I've just said supports or contradicts anything in it. It just gave me a chance to once again rant about nutrition science, and the reporting and marketing thereof. |
February 11, 2021 at 12:01am
|
Well, the random number gods have frowned upon me and revealed unto me yet another Atlantic article. I suppose I could have ignored the result and picked something else, but I've got a system. Besides, at least it's not on the same topic.
It is possible that you might hit a paywall with this one. There are ways around it if you care.
While the article is six months old, I don't think it's outdated yet.
My friends were getting honest about how hard it is to raise children right now.
People have, in my experience, always been honest about how hard it is to raise children. But they almost always end the discussion with some variation of "But it's worth it." Reading between the lines here, I'm wondering if the author is responding to the absence of the "But it's worth it." But that may be my own bias talking.
I also read it as an indirect plea to not take my child-free privileges for granted.
Life without children can be easy sometimes. But it's worth it.
I've always been ambivalent about whether I would have children, but as I entered my early 40s, I started exploring the possibility of having a child on my own. And then the pandemic happened.
I will point out here that any discussion of whether or not to become a parent is different depending on the gender of the writer. One of society's many double standards is that, in general, men who choose to avoid becoming a father aren't treated with the level of disrespect often shown to women who choose to avoid becoming a mother.
This article is written from a female point of view, and obviously my commentary isn't.
The COVID-19 crisis has revealed the brokenness of America's institutions: police violence and the inhumane criminal-justice system, a medical system that lacks infrastructure and essential equipment, precarious employment for an in-debt population getting by month to month, the toxic effects of globalization and climate change. Add to that list middle-class parenting, long an aspirational experience, whose social protections are now showing themselves to be a bit of a charade.
Most of which I've cited in the past as reasons why I didn't want to be responsible for bringing a child into society. Not the only reasons, but reasons. It didn't take a pandemic for me to see the writing on the wall.
While the parents in my life have been openly acknowledging the challenges of parenting during the pandemic, my child-free friends have for the first time been sharing that they are relieved they don't have children.
Again, a very different experience. I hear some variant of "I'm so glad I never had kids" fairly often. Incidentally, I appreciate the author's use of the adjective "child-free;" many would use "childless," which implies a lack or an emptiness.
Hm... I feel like I should note that I'm not trying to rag on anyone's choices here. I understand that being a parent is very fulfilling for some, and I respect their choices. I would only ask that they do the same for the child-free.
An essay series in The Guardian, called "Childfree," explores that decision, with reasoning that runs the gamut: not enough money, focusing on your own life, the climate crisis, being fine with being alone.
Well. The Guardian is one of my usual sources, but I've missed those essays until now. I made it a hyperlink here mainly so I can be reminded to go look at it later. I can't be arsed right now.
In response to a harmless tweet from a parent about how "non-parents have no idea how hard it's been" to parent during the pandemic, thousands of people chimed in with some version of: Yes, we do--that's why we don't have kids.
I know what a terror I was growing up, and I wouldn't subject anyone to that -- least of all ME.
This is hardly the first moment that the idea of marriage and a baby as the primary path for women has come under scrutiny.
Again, the female point of view. Which I'm by no means trying to downplay, but until we work out issues surrounding cloning, it takes two, directly or indirectly, to procreate. I feel the need to add that my own point of view comes from someone who has never had children (please keep "...that you know of" jokes to yourself; they're tiresome, false, and sexist), not like some guys I know who are childfree by way of abdicating their responsibilities. I don't think much of those guys.
For heterosexual parents, the bulk of the child care falls on the mother. The global health crisis has worsened this sexist division of labor, and the long-term effects could damage women's careers and, despite the best intentions, become a new norm.
Which is another societal double standard that really should be addressed.
For people who were planning to have a child, those plans might now be on hold; the process of seeking fertility treatments, for example, has gotten more complicated as access to medical procedures for elective reasons has been limited.
In a perfect world, as I see it, everyone who wants to have children would, while everyone who doesn't want to wouldn't. As this is not a perfect world, you have people desperately wanting to reproduce who can't, and people who desperately don't getting stuck. Also if I were inclined to pass judgment on anyone, it wouldn't be people who choose to have kids or people who choose not to, but the ones who blow all their resources on fertility treatments when they could adopt.
Taking away a lot of the stigma around adoption might serve to alleviate this disparity. Maybe.
Childless people have long been chastised for being selfish or for not fulfilling a role their body seemingly bound them to.
Here the author reverts to the other word. And selfish? What's more selfish than insisting on bringing children into a deteriorating world? Stipulating that everyone does everything for selfish reasons, if you want to have kids and be less "selfish," then adopt. Also, biology isn't destiny. My body also evolved for chasing prey across the savanna, but I can't be arsed to do that, either.
As a child-free woman in my 40s, I've been tasked with taking care of my parents.
Because of course one of the main reasons to have kids is to give you free convalescent care later in life.
One legacy of the pandemic may be less judgment of the child-free.
I won't be holding my breath.
Anyway, it's a point of view, and I thought it was interesting... but again, I admit to some confirmation bias here. |
February 10, 2021 at 12:01am
|
This article is a lot of words for a simple answer: telemarketers.
But still not as many words as if it were a New Yorker article.
See, if someone had been walking their dog in Central Park and a guy jumped out of the bushes and bit the dog, it would be reported in different ways by three different New York publications.
The New York Times:
Man Bites Dog
A man leapt from the bushes and bit the flank of a golden retriever belonging to Jamie Sands of West 72nd Street yesterday, sources say.
The New York Post:
Man Bites Dog
It finally happened!
This reporter has been waiting for this moment since his first year at J-school!
The New Yorker:
Incident in Central Park
A cool breeze brushes the tops of the trees as Jamie Sands, a 42-year-old mother of three human children and one golden retriever named Spike, strolls along one of the park's many carefully-maintained paved paths. "I walk Spike here nearly every day," she says, sighing and staring off into the distance, eyes seemingly focused on one of Manhattan's new pencil towers, its slender spire seeming to scrape the sky, to coin a phrase. "I never thought something like this could possibly happen," says Sands, her gaze shifting to her peripatetic canine companion. The dog pulls her along a winding trail through manicured lawns, stately trees, and trimmed shrubberies.
It was from one of those shrubberies, with its dense concealing foliage, that the unthinkable occurred.
Sands moved to the Upper West Side in 2006, upgrading her apartment from the 300-square-foot flat in the hipster enclave of Greenwich Village. "It was the third child that did it," she says, leashing Spike back from his investigation of a particularly brave squirrel. Squirrels have claimed Central Park since the moment it was founded, finding homes in treetops and terrorizing passers-by...
(there follows 360 column-inches of meandering, descriptive, wistful, breathless, pointless, post-modernist writing, with the description of the actual dog-biting incident buried approximately 4/5 of the way down.)
But The Atlantic? The Atlantic wouldn't touch a "Man Bites Dog" story. No, too pedestrian.
They apparently will, however, stretch out a "Why we don't answer the phone anymore" article to the point of absurdity.
The telephone swept into Americans' lives in the first decades of the 20th century. At first, no one knew exactly how to telephone.
Verbing weirds language. Also, of course no one knew how to do it at first. This is true about any new invention. "At first, no one knew how to fly an airplane." "At first, no one knew how to attach wheels to a cart." "At first, no one knew exactly how to cultivate crops." I mean, come ON.
People built a culture around the phone that worked.
Um... I'm not sure it actually worked.
In the moment when a phone rang, there was an imperative. One had to pick up the phone. This thinking permeated the culture from adults to children. In a Hello Kitty segment designed to teach kids how the phone worked, Hello Kitty is playing when the phone starts to ring. "It's the phone. Yay!" she says. "Mama! Mama! The telephone is ringing. Hurry! They are gonna hang up."
That right there would have been enough to put me off phones for life.
Not picking up the phone would be like someone knocking at your door and you standing behind it not answering. It was, at the very least, rude, and quite possibly sneaky or creepy or something.
Well, too bad. I'm busy. That phone is there for my benefit, not yours.
I attach no special value to it. There's no need to return to the pure state of 1980s telephonic culture.
I'll give the author this much: there's a simple discussion about the way things used to be, without the sense of nostalgia that so often accompanies such articles. "I remember sitting at my drawing-table, endlessly tracing cursive letters until my hand cramped. Kids these days just don't understand how beautiful handwriting can be. Those days were simpler, life more elegant..."
There are many reasons for the slow erosion of this commons. The most important aspect is structural: There are simply more communication options.
No, the most important aspect is every time I'd pick up, someone would be trying to sell me insurance, or claim they're from the IRS and I'm boned, or warn me that my car's warranty was about to expire.
You've got your Twitter, your Facebook, your work Slack, your email, FaceTimes incoming from family members. So many little dings have begun to make the rings obsolete.
Oh, how I long for a future without Twitter or Facebook.
Finally the author gets to the main point. This is called "burying the lede," and it's despicable.
There are unsolicited telemarketing calls. There are straight-up robocalls that merely deliver recorded messages. There are the cyborg telemarketers, who sit in call centers playing prerecorded bits of audio to simulate a conversation. There are the spam phone calls, whose sole purpose seems to be verifying that your phone number is real and working.
Incidentally, this article is nearly three years old, so the data in it is from 2018. I doubt anything's significantly changed since then. There was a period in late 2019 when I remember getting, no shit, no exaggeration, two to three dozen spam calls a day. Blocking the numbers doesn't help, because they're spoofing. Answering only makes them call more. It got to the point where I simply turned my ringer off. I missed a couple of calls from friends doing that, but it was worth it.
And then when I'd check my voicemail, I'd find messages in Mandarin. MANDARIN. Again, I'm not joking. I'm just surprised none of them were in Russian.
The spam has diminished to two or three a day, now, so I keep the ringer on. Still, I want it to go to zero.
Anyway, that's my rant for the day. |
February 9, 2021 at 12:01am
|
I always find it amusing, in a dark-humor sort of way, how often groups of people flee religious persecution so that they can install their own.
Few examples are more obvious than the Puritan settlers of Massachusetts. And while the land they settled eventually came around to the ideals of freedom of religion and expression -- well, at least up to a certain point -- their insistence on a narrow view of morality echoes to this day like the reverberating sound of a turd hitting the slurry at the bottom of an outhouse.
Apparently, Thomas Morton didn't get the memo. The English businessman arrived in Massachusetts in 1624 with the Puritans, but he wasn't exactly on board with the strict, insular, and pious society they had hoped to build for themselves.
Why would such a person brave an uncertain ocean journey with such horrible people? Why, money, of course.
The Puritans' move across the pond was motivated by both religion and commerce, but Morton was there only for the latter reason, as one of the owners of the Wollaston Company.
Who?
His business partner--slave-owning Richard Wollaston--moved south to Virginia to expand the company's business...
I'd suggest cancelling this guy, but it seems history already has.
...but Morton was already deeply attached to the land, in a way his more religious neighbors likely couldn't understand. "He was extremely responsive to the natural world and had very friendly relations with the Indians," says Heath, while "the Puritans took the opposite stance: that the natural world was a howling wilderness, and the Indians were wild men that needed to be suppressed."
It's funny because the researcher's name is Heath, which is a word for an area in a state of nature, which gave us the word "heathen."
After Wollaston left, Morton enlisted the help of some brave recruits--both English and Native--to establish the breakoff settlement of Ma-Re Mount, also known as Merrymount, preserved today in the Quincy neighborhood and park of the same name.
Incidentally, if you're not aware, Quincy is pronounced like "quin'-zee." Just in case you find yourself in the Boston area one day, don't get tripped up by this shibboleth.
The Puritan authorities didn't see Merrymount as a free-wheeling annoyance; they saw an existential threat.
Of course they did. The thing most frightening to a Puritan is the horrible idea that someone, somewhere, is having a good time.
With Plymouth's monopoly dissolved and its perceived enemies armed, Morton had perhaps done more than anyone else to undermine the Puritan project in Massachusetts.
I should absolutely build a shrine to this guy.
Worse yet, in the words of Plymouth's governor William Bradford, Morton condoned "dancing and frisking together" with the Native Americans--activities that were banned even without Native American participation.
Snort. "Frisking." Snort. Which reminds me of a joke. Why aren't Puritan kids allowed to make out? Because it might lead to dancing.
There could be no greater symbol of such misrule than Morton's maypole. Reaching 80 feet into the air, the structure conjured all the vile, virile vices of Merry England that the Puritans had hoped to leave behind.
I'm absolutely, totally stealing "vile, virile vices." Oh, wait, I already did, for this entry's title.
The article goes on to talk about the book in the title of the piece, and I feel like I really should own a facsimile of it (I'm not quite impressed, or rich, enough to try to track down a first edition).
After publishing the book, Morton braved a venture back to his beloved Massachusetts, only to be turned right back around upon arrival. He tried to cross the Atlantic once again in 1643, and was this time exiled to Maine, where he died.
There are probably worse fates, but I can't think of any offhand.
So, a short read, and worth it -- and the source, Atlas Obscura, is a wonderful fount of information. The only problem is they keep posting places to visit in Belgium, and I don't know if I'll be able to go to them all. |
© Copyright 2024 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved. Robert Waltz has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.
|