I have been thinking a lot lately about the concept of the technological singularity as defined by Ray Kurzweil and Vernor Vinge. I recently read Kurzweil's newest book, "The Singularity Is Near," and came away with an extremely mixed impression. Kurzweil argues that based on extrapolations of current trends in information technology, namely the "doubly exponential" growth of computation through reduction in size, increase in efficiency, and the resulting increase in cost-effectiveness, we are heading towards a point of "singularity". The term, borrowed from mathematics, is used as a metaphor for technological progress exceeding the current human ability to understand it. This would occur through the development of greater-than-human AI, or through the enhancement of human faculties (Kurzweil argues that the former will come first, and indeed he intends to play a leading role in the process).
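As an aside, "doubly exponential" means growth whose rate is itself growing exponentially. A minimal sketch of the difference, with made-up constants chosen purely to show the shape of the curve:

```python
# Plain exponential: f(t) = a**t (doubling at a fixed rate).
# Doubly exponential: f(t) = a**(b**t) (the doubling rate itself doubles).
# Constants here are illustrative, not Kurzweil's figures.
for t in range(1, 6):
    exponential = 2 ** t          # doubles at a fixed rate
    doubly = 2 ** (2 ** t)        # the rate of doubling itself doubles
    print(t, exponential, doubly)
# by t = 5, plain exponential gives 32; doubly exponential gives 4294967296
```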
Kurzweil believes that we will reach the stage where technology spirals far beyond our control well before the middle of the 21st century, and that $1,000 of computer equipment will be able to simulate the human brain by 2029. The results of this process, according to Kurzweil, will be profound, leading ultimately to human immortality through software, nanotechnology, and quantum computing. Kurzweil addresses most of the important criticisms he has received in the last chapter of the book, and to my eyes there appears to be no reason why what he says could not come to pass. Kurzweil claims that the apparent lack of progress in software, as opposed to hardware, will be overcome by the reverse engineering of the human brain, which will provide a toolkit for the creation of strong AI.
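Out of curiosity, I ran the arithmetic behind the 2029 date. The brain estimate (~10^16 calculations per second for functional simulation) is the upper end of the range Kurzweil cites; the 2005 baseline and the one-year price-performance doubling time are my own illustrative assumptions, not his exact numbers:

```python
# Back-of-the-envelope check of the "$1,000 brain by 2029" claim.
BRAIN_CPS = 1e16              # ~calculations/sec for functional simulation
base_year, cps = 2005, 1e9    # assumed cps per $1,000 at the baseline
doubling_time = 1.0           # assumed years per price-performance doubling

year = base_year
while cps < BRAIN_CPS:
    year += doubling_time
    cps *= 2

print(f"$1,000 of hardware reaches ~{cps:.1e} cps around {year:.0f}")
# -> $1,000 of hardware reaches ~1.7e+16 cps around 2029
```

Under those assumptions it takes 24 doublings, which does land on 2029; change either assumption and the date moves, which is rather the point of the critics.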
My major problems with Kurzweil arise from his reductionist view of humanity and its place in the universe. Kurzweil sees human beings, with their "version 1.0" bodies, as nothing more than inefficient computational devices. He sees technological "evolution" as an outgrowth of biological evolution, going so far as to plot the development of prokaryotic and eukaryotic cells on the same graph as the development of PCs and the Internet. This is based on what he calls the "Law of Accelerating Returns", wherein each technology provides the means necessary to surmount itself on an ever-steepening exponential curve (e.g., the development of hominids took half as long as that of mammals, which took half as long as that of reptiles, and so on).
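To see why this pacing produces a "singularity"-shaped curve: epoch durations that halve each time form a geometric series, so the entire infinite tail of remaining epochs takes only as long as the epoch before it. A toy sketch, with an epoch length I made up for illustration rather than Kurzweil's data:

```python
# Each epoch takes half as long as the one before it, so the durations
# form a geometric series: d, d/2, d/4, ...  The whole infinite tail
# after any epoch sums to exactly that epoch's duration, which is why
# the timeline appears to converge on a finite "singularity" point.
first_epoch = 2e9  # illustrative first-epoch length in years

durations = [first_epoch / 2**k for k in range(8)]
for k, d in enumerate(durations):
    print(f"epoch {k}: {d:.3g} years")

print(f"all epochs combined: < {2 * first_epoch:.3g} years")  # geometric sum
```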
Kurzweil states that he believes the Universe itself is destined to move beyond its current "dumb" state through the development of an intelligent species that merges with its technology and uses its nearly infinite knowledge to convert the very fabric of the universe into a giant computational device. The ability of black holes to retain information, and the emergence of an intelligent carbon-based species (reliant on a universe rich in carbon), serve for Kurzweil as verification of the anthropic principle. In this he takes his Law of Accelerating Returns to its logical extreme.
While the notion of nearly infinite exponential growth in computation is not disprovable, I find Kurzweil's application of Moore's Law to the entire universe questionable. He relies on an extremely "Whiggish" view of history, one not at all dependent on individuals or small groups, concerning himself only with the megatrends of technological development and "progress". Also, I think more than a few evolutionary biologists would take issue with his direct correlation of technological and biological development, and with his belief that this correlation implies something fundamental about the universe.
The book itself (The Singularity Is Near) is sloppy and repetitive, and the degree to which Kurzweil believes in the inevitability of his predictions is evident in his almost total neglect of, or even condescension towards, political, moral, and philosophical considerations. Kurzweil is neurotic and egocentric in the extreme. He contends that any attempt to stall the inexorable march of technological progress will only force dangerous emerging technologies to the societal fringe, where they could be extremely deadly for humanity.
Nevertheless, Kurzweil is no chump; he is one of the foremost inventors working today, and he has a very accurate track record for predicting developments in AI and computation. He has legions of slavish fans who believe his predictions. I don't believe that such an explosion in intelligence is necessarily imminent, but it certainly could be. Some thinkers, uncomfortable with the increasing ease with which people will be able to synthesize viruses and bacteria and weaponize nanobots, believe that mere human intelligence cannot be trusted with the responsibility. I believe that throwing our hands in the air, admitting that we are incapable of dealing with our problems, and handing ourselves over into the care of intelligent machines would represent a profound failure of mankind. I am not necessarily a technophobe, but I see a future pioneered by ethical human beings, with their limitations and strengths, as far preferable to giving ourselves up to a program of neo-eugenics and robotics. However, Kurzweil's megatrend analysis is insidious: when we consider the commercial as well as military applications of emerging technologies, we can see that we are separated from his future by baby steps, not leaps.
I would be interested to see the opinions of The Philosophers on issues such as accelerating change and futurism in general. Also, I'm new here! Hello!