The Obama administration is planning a decade-long scientific effort to examine the workings of the human brain and build a comprehensive map of its activity, seeking to do for the brain what the Human Genome Project did for genetics.
George M. Church, a molecular biologist at Harvard, said he was helping to plan the project, the Brain Activity Map. The project, which the administration has been looking to unveil as early as March, will include federal agencies, private foundations and teams of neuroscientists and nanoscientists in a concerted effort to advance the knowledge of the brain’s billions of neurons and gain greater insights into perception, actions and, ultimately, consciousness.
Scientists with the highest hopes for the project also see it as a way to develop the technology essential to understanding diseases like Alzheimer’s and Parkinson’s, as well as to find new therapies for a variety of mental illnesses.
Moreover, the project holds the potential of paving the way for advances in artificial intelligence.
Indeed it does. This illustrates exactly why I claimed that the approach to the singularity is ultimately the greatest threat to us; we are completely culturally blind, across the political spectrum, to the idea that innovation could be bad for us.
But doesn't the displacement of Americans by machines, and the suffering that results, at some point reach a threshold where the public has that "Ah hah!" moment?
I don't know what that threshold might be, but it seems to me that at some point it has to become obvious to the public at large that businesses and governments are choosing money (efficiency, productivity, GDP growth, etc.) over humans. At that point, do we see an "American Spring"? Or do Americans just walk quietly into the singularity?
I can't see how artificial intelligence can adapt facets of, or in any way be similar to, biological intelligence...
I'm not saying AI can't be better, just that the two are not and cannot be similar... they are fundamentally different... like expecting a factory to flow like a river or grow like a tree...
The vulgarity behind this hope, and the mindset associated with it, is much more dangerous than its coming to fruition (which won't happen).
I'd say our highest priority should be improving mental health and "cultural health" to reduce the likelihood of people using new tech to sabotage society, possibly terminally.
New tech is unstoppable, whether it's computer science that enables automation and malignant AI, or genetic engineering that enables biological warfare, or computer networks that enable malware, or nanotech that enables "grey goo", or remote control tech that enables cheap, remote warfare.
Our tools will get more and more powerful - we need to improve our individual and collective mental health so that we have a chance of using them properly, and a chance of preventing guerrilla warfare by deranged kids with the wrong tech kits.
Unfortunately, at the moment I don't see mental/cultural health improving nearly as quickly as our tools...
"The real problem of humanity is the following: we have Paleolithic emotions, medieval institutions, and god-like technology." (E. O. Wilson)
Note that there are similar initiatives, with an equivalent discourse around them, in Europe; for instance:
http://www.youtube.com/watch?v=DsZ_LBdthC0
Neelie Kroes is responsible for "information technology" at the European Commission, or something like that.
As for the "scientist": a typical guy who sounds like a charlatan.
(even though I'm not able to listen to this kind of junk for very long)
Stuart,
I could never get too excited about "the singularity". Maybe you can help me understand something.
As far as I have ever been able to tell, this supposed future rendezvous between human beings and AI has been dubbed "the singularity" because it will supposedly have such profound implications that we cannot predict what the consequences will be...
Fair enough?
While all this sounds very intriguing, how is it that nothing else we might consider about the future rates as "a singularity"?
I mean, isn't there already sufficient uncertainty about future events that anyone could claim that "a singularity" already exists?
Or are we certain enough about our trajectory through the next 50 years of future history that we can say that "the singularity" is the only uncertainty we really need to worry about?
Brian:
I agree that there will be a huge political backlash at some point, but we aren't there yet.
Nick G:
ReplyDelete"New tech is unstoppable" is not a law of nature, but rather a defining feature of western culture. We do have a choice, albeit that cultural change is very hard. Even within the context of western culture, we do put a lot of friction in the way of certain kinds of innovation (eg we make it incredibly painful to bring a new drug to market).
Lucas:
I think the original rationale for the "singularity" metaphor was that once machines could design themselves, progress would go faster and faster, eventually accelerating beyond comprehension. I think that's possible, but somewhat speculative - I don't think we actually know anything about how difficult it will be for machines twice as smart as us (whatever that means) to design machines four times as smart as us.
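To make that concrete, here is a toy back-of-the-envelope sketch (the growth rule and all the numbers are purely illustrative assumptions, not anything we actually know): whether self-improvement "accelerates beyond comprehension" or fizzles out depends entirely on how the difficulty of the next doubling scales with the current level.

```python
# Toy model (purely illustrative assumptions, not anything claimed in the post):
# how long successive capability doublings take if each generation of machines
# designs the next, under different guesses about how hard the next doubling is.

def years_to_reach(target, difficulty_exponent, base_years=2.0):
    """Total years for capability to go from 1x to `target`x by repeated doubling.

    Each doubling from level I to 2*I is assumed to take
    base_years * I**difficulty_exponent years:
      exponent < 0  -> smarter machines double faster (finite-time "singularity")
      exponent = 0  -> every doubling takes the same time (plain exponential growth)
      exponent > 0  -> each doubling gets harder (progress slows instead)
    """
    level, years = 1.0, 0.0
    while level < target:
        years += base_years * level ** difficulty_exponent
        level *= 2.0
    return years

if __name__ == "__main__":
    for exponent in (-0.5, 0.0, 0.5):
        t = years_to_reach(target=1024, difficulty_exponent=exponent)
        print(f"difficulty exponent {exponent:+.1f}: ~{t:6.1f} years to reach 1024x")
```

With the negative exponent the total time converges even as the target grows without bound (that's the "singularity"); with the positive one it stretches out over centuries. The point is that the outcome hinges entirely on a scaling law nobody knows.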
I like the metaphor of the singularity for a different reason: when you fall into a black hole, you get pulled apart by tidal gravitational forces long before you reach the singularity itself (as you get close, the difference between the strength of gravity at your head and at your feet becomes great enough to tear you apart). I think this is a decent metaphor for the economic effects of the approach to the singularity on society: ever-increasing automation and globalization deliver ever greater benefits to those near the top of the pyramid, while leaving a larger and larger pile of unemployable folks at the bottom.
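For anyone who wants the physics behind that image, here is a rough Newtonian sketch (the black-hole mass and body size are my own illustrative choices, not part of the original argument): the head-to-feet tidal stretch grows like 1/r^3, so it becomes unsurvivable long before you reach r = 0.

```python
# Rough Newtonian estimate (illustrative numbers only) of why you are torn apart
# well before reaching the singularity: the tidal stretch across a body scales
# as 1/r^3, so it diverges as r shrinks.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 3.0e8            # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg

def tidal_stretch(mass_kg, r_m, body_height_m=2.0):
    """Approximate head-to-feet difference in gravitational acceleration, ~2*G*M*h/r^3."""
    return 2.0 * G * mass_kg * body_height_m / r_m ** 3

if __name__ == "__main__":
    m = 10 * M_SUN                 # a stellar-mass black hole
    r_s = 2.0 * G * m / C ** 2     # Schwarzschild radius, roughly 30 km here
    for r in (100 * r_s, 10 * r_s, r_s):
        print(f"r = {r / 1e3:9.1f} km   tidal stretch ~ {tidal_stretch(m, r):9.2e} m/s^2")
```

Even at a hundred Schwarzschild radii the stretch in this example is already a couple of hundred m/s^2 (tens of g); by the horizon it is tens of millions of g.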