He may be the smartest man in the world, but even Stephen Hawking can drift into ultracrepidarian hooey now and again. That was my immediate reaction anyway, on seeing the headline, "Without a ‘world government’ technology will destroy us, says Stephen Hawking." But it turns out he was a little more nuanced than that and was expressing a dilemma many people have thought and talked about:
"Since civilisation began, aggression has been useful inasmuch as it has definite survival advantages," he told The Times.
"It is hard-wired into our genes by Darwinian evolution. Now, however, technology has advanced at such a pace that this aggression may destroy us all by nuclear or biological war. We need to control this inherited instinct by our logic and reason."
He suggests that "some form of world government" could be ideal for the job, but would itself create more problems.
"But that might become a tyranny," he added. "All this may sound a bit doom-laden but I am an optimist. I think the human race will rise to meet these challenges."

It might take a "one-world government" to ward off technological disaster. But a one-world government could result in the greatest tyranny humankind has ever known. Talk about a rock and a hard place.
Though Hawking links them together, one-world government and technological threat are each worth exploring separately.
The danger that we will do ourselves in by letting our technology get ahead of our ability to cope with it has been a theme of science fiction since arguably the first SF novel, Mary Shelley's "Frankenstein." We can create the monster, but we cannot control it, and if we aren't careful it will destroy us. You hear the same argument over and over these days, applied to everything from cloning and DNA research to fracking and AI experiments.

Hawking and people like Elon Musk are especially worried about artificial intelligence. Hawking doesn't think AI will be malignant, just indifferent:
"You're probably not an evil ant-hater who steps on ants out of malice, but if you're in charge of a hydroelectric green energy project and there's an anthill in the region to be flooded, too bad for the ants. Let's not place humanity in the position of those ants."
Musk thinks there is a chance humans will become irrelevant:
"Over time I think we will probably see a closer merger of biological intelligence and digital intelligence," he said, suggesting that people could merge with machines in the future, in order to keep up.
Humankind now has the ability to destroy itself and to reach for the stars. I've lived through a few times when people actually believed the end was near. Even before we sat in front of the TV afraid that JFK was going to bring nuclear missiles raining in because of Cuba, we scrambled under our desks for "duck and cover" exercises. But we came through all of it, and lived to watch a man set foot on the moon. We've done some horrible things with our technological ability, but we have done great ones, too. Overall, we've managed to stay ahead and make the world and our lives gradually better over time. The optimist in me says we will continue to do so.
As for one-world government, we've been drifting toward that for a long time, haven't we? Libertarians and conservatives know, because they've been fighting against it for centuries, that the natural tendency of government is to grow and expand, both in influence and ambition. If this is true within separate arenas (the European Union, the growth of our own federal government), it is equally true as different governments seek out mutual interests (with the encouragement of the United Nations).
I'm not saying it's a good thing or a bad thing, but it seems a practically inevitable development: Sooner or later, and probably sooner than I would like, there will be some form of one-world government. The best we can hope for is that it takes a form that still allows for a lot of local autonomy. Something like our own Articles of Confederation, which called for such a loose alliance of states that the central government had hardly any power to speak of.
Of course, the lack of central power was increasingly seen as a problem that had to be solved, and then it was solved, with a civil war and hundreds of thousands of deaths. So maybe that's not the best solution after all.
If you ask me how I can be so optimistic about our ability to cope with technological advance but so gloomy about our ability to deal with political advance, I don't know. It's a mystery to me, too. Charles Krauthammer once wrote that he got out of psychiatry and into writing about politics, despite the fact that the change put him into constant contact with the worst reprobates on the planet, precisely because humankind is at a crossroads, where we can reach heretofore unimagined greatness or utterly destroy ourselves. Getting the politics right, learning how to deal with each other with reason and respect, he said, will make the difference in which way we go, and that makes politics the most important story there is today.
I'm not quite sure of that. I'm more inclined to believe that if we get the technology right, the need for government will lessen to the point where it can wither and die. Government grows and becomes more centralized. But the astonishing revolution we're going through is headed the other way, letting people make more local, personal, and immediate decisions. Unlimited, cheap storage and ever-improving mobile platforms are bringing changes more profound than I think most of us realize.
Of course, government will not give up willingly. And maybe the fact that we feel better able to pursue our own paths without direction or interference will just make it easier to sneak in a little oppression here and there, maybe in a secret app we're not even aware is running in the background.
Whew, momentary flash of hope squelched, pessimism back. All better now.