Complex Numbers #996298 added October 20, 2020 at 12:04am
Predictions
The thing about predictions is that people tend to remember the ones that come to pass, and selectively forget those that were off.
Though I'm still sore about not owning a flying car. I was promised a flying car long before 2020. Instead I got a pandemic, during six months of which I didn't even drive my boring, surface-grubbing Subaru. Yes, I know, prototypes exist for flying cars. Prototypes exist for a lot of things. The point is, I don't have one.
Now, Hawking had, beyond all doubt, a brilliant mind. He thought deeply and logically and, most importantly, had a sense of humor. That doesn't mean he knew everything or had a crystal ball, though.
This article is two years old, but it's not like Hawking could change his predictions, as he remains deceased.
The late physicist Stephen Hawking’s last writings predict that a breed of superhumans will take over, having used genetic engineering to surpass their fellow beings.
I, too, watched Star Trek. Still watch it, in fact, in its many incarnations. I can't imagine Hawking didn't, as he had a cameo in TNG at one point.
I don't pretend for a moment that its future will come to pass. Oh, sure, pieces of it, maybe, but it is, in the end, like all science fiction: there to make us think, to entertain us, and possibly to serve as a warning or a roadmap.
Hawking delivers a grave warning on the importance of regulating AI, noting that “in the future AI could develop a will of its own, a will that is in conflict with ours.”
Again, anyone who has read more than a little bit of science fiction is going to be familiar with this trope. Hell, it's the entire plot of Terminator, to name just one of the more popular franchises.
Every movie, book, or TV show that I've seen on the subject dances around one simple question: how would such a thing be powered, and why can't we just, you know... flip a switch or pull the plug? When Trek brought it up, the rogue AI tapped into the purely fictional, near-unlimited power of the Enterprise's matter/antimatter engines and created for itself a force field to keep it from being unplugged. Ultron had Iron Man's plot-device power source. That sort of thing.
In objective reality, we have neither limitless energy nor force fields... though I will concede that it's possible an AI could invent these things before making its nefarious intentions known.
The bad news: At some point in the next 1,000 years, nuclear war or environmental calamity will “cripple Earth.” However, by then, “our ingenious race will have found a way to slip the surly bonds of Earth and will therefore survive the disaster.”
Optimism and pessimism in the same paragraph. I'm impressed.
Once such superhumans appear, there are going to be significant political problems with the unimproved humans, who won’t be able to compete. Presumably, they will die out, or become unimportant.
Yeah, I like the X-Men stories too.
Hawking acknowledges there are various explanations for why intelligent life hasn’t been found or has not visited Earth. His predictions here aren’t so bold, but his preferred explanation is that humans have “overlooked” forms of intelligent life that are out there.
Every time I talk about something like this, I have to pre-emptively thwart any jokes or snide remarks about there not being any intelligent life down here, either. The sense of "intelligence" used here is the ability to use technology and be curious and self-aware. We fit the definition, even if some of us are dumber than a box of rocks and twice as dense.
That said, I've made my opinion known on this subject on multiple occasions, so I'll just summarize: I would be greatly surprised if we were the only "intelligent" life in the universe, but just as surprised if we weren't the only ones in our galaxy. Intelligence is not an inevitable product of evolution, and most species here on Earth get along without it just fine -- some better, in fact, before we came along with our tools and machines and pesky communication skills.
Skipping the "Does God exist?" section here. His opinion is no more informed on the subject than mine or yours, and I already did my religion argument for the month, back on the 14th.
The biggest threats to Earth: Threat number one is an asteroid collision, like the one that killed the dinosaurs. However, “we have no defense” against that, Hawking writes. More immediately: climate change. “A rise in ocean temperature would melt the ice caps and cause the release of large amounts of carbon dioxide,” Hawking writes. “Both effects could make our climate like that of Venus with a temperature of 250C.”
I wouldn't say we have "no" defense against asteroids. While the chance of one in any given year is vanishingly small, the cumulative probability increases over time. Eventually, a giant rock is going to be on track to slam into the Earth. It's inevitable. But we keep improving rocket technology and sending people and robots into space. A rock big enough to cause a catastrophe will be seen early, and, in the near future, we will in fact have the technology to do something about it... that is, if the other thing doesn't happen to us first.
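To put a rough, purely illustrative number on that "eventually": if the yearly odds of a civilization-threatening impact are some small p, the chance of at least one such hit over N years is 1 - (1 - p)^N, which creeps toward certainty as N grows. Assume, just for the sake of the arithmetic, p of one in a hundred thousand; over a million years that works out to better than 99.99%. The real odds are anybody's guess, but the shape of the math is why "inevitable" is the right word.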
For my fellow Americans, 250C translates to "Really goddamned hot. You think Phoenix in the summer is hot? You ain't seen nothin' yet."
The best idea humanity could implement: Nuclear fusion power. That would give us clean energy with no pollution or global warming.
And power the vicious AIs bent on world domination and enslavement of humans, super and otherwise. I mean, come on, is it too much to ask for some consistency from one of the greatest minds of our generation?
© Copyright 2020 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved. Robert Waltz has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.