A stray thought I had last night. I was thinking about the compulsion we in computer science have to develop artificial intelligence. My experience is that most of my colleagues in the field simply don't question whether it's a good thing that they are working to build algorithms smarter than we are. Whatever our reasons, it's by and large not because we've thought it through with an open mind and decided it's a great idea. Instead, we are compelled by powerful unconscious motivations, and then justify it after the fact.
The analogy that occurred to me is to the physicists of the first half of the twentieth century, who worked out nuclear physics ("splitting the atom") and eventually developed nuclear weapons. Those weapons remain humankind's most destructive. And yet, in a strange way, they have led to marked moral and spiritual progress on the part of the species. They were used twice, in 1945, and we've refrained from using them in anger ever since. As a result, there has been no open war between major powers since 1945. To see how remarkable that is, here's a list of major wars in Europe: there had been wars between major powers every few decades since time immemorial. But the prospect of nuclear war was so awful that we finally learned to stop. At least, I hope it stays that way.
So perhaps that's the hope here. In starting to build something with the potential to tear our society apart, maybe we will finally be forced to confront the unconscious forces that drive us to innovate blindly and grow our economy, whatever the cost. Being a bit more conscious about where we want to go would be a good thing.