The Evolution of Human Consciousness and the Idea of the Technological Singularity

Einherjar86

Active Member
Jan 15, 2008
18,490
1,959
113
The Ivory Tower
Many people are divided over theories of artificial intelligence, but no one can deny that technology is becoming more complex and advanced at an exponential rate. The Law of Accelerating Returns, proposed by Ray Kurzweil, hypothesizes that technology will continue to grow exponentially to the point where humanity can no longer predict or understand where it will go. Essentially, it will venture on its own into territories that we cannot rationally fathom.
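To make the "exponential" part concrete, here's a minimal sketch in Python (the 2-year doubling period is just the classic Moore's-law figure, and the starting capability of 1 is an arbitrary baseline, so treat the numbers as illustrative rather than a real forecast):

```python
# Toy illustration of the kind of growth Kurzweil describes:
# capability doubling at a fixed interval compounds very quickly.
def capability(years, doubling_period=2.0, start=1.0):
    """Relative capability after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

for y in (10, 20, 40):
    print(f"after {y} years: {capability(y):,.0f}x")
# after 10 years: 32x
# after 20 years: 1,024x
# after 40 years: 1,048,576x
```

The point is just that the same constant-rate doubling that looks tame over a decade produces numbers over a few decades that are hard to reason about intuitively, which is the intuition behind the singularity claim.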

Some people argue that a technological singularity will never occur because, in order to create technology more intelligent than a human being, we would need to understand how such technology works; we would thus already know everything it knows and be smarter than it. So, in essence, a person cannot create something smarter than himself or herself.

I partially agree with this statement, which is why I believe that if a technological singularity were to occur, it would coincide with a rapidly evolving human consciousness. Hubert Dreyfus contends that there is a fundamental difference between the way computers operate and the way the human mind operates. In order to create a truly formidable AI, I believe computers would need to somehow evolve to process information the way the human brain does.

I've become increasingly interested in this topic since reading the first two books of Dan Simmons's Hyperion Cantos and Vernor Vinge's A Fire Upon the Deep. I'm also currently halfway through Charles Stross's Singularity Sky, which deals with a speculative future in which a technological singularity gives birth to an entity that calls itself the Eschaton and proceeds to assert its dominance over humanity. All of these books deal with various forms of super-advanced AI that effectively establish themselves as tyrannical entities in control of humanity, to an extent. Many of them also ask what constitutes a technological singularity and how human beings could hope to understand such a phenomenon.

My purpose in this thread is to generate some discussion about the concept of AI, what people here think about the ever-increasing amount of AI and the rate at which it is advancing, and the relationship between man and machine, especially regarding consciousness. Can human beings create machines smarter than themselves? Would a technological singularity coincide with the advent of the next phase of human evolution, such as a higher level of consciousness; and if so, would it even be a technological singularity? How akin to what we perceive as "God" could a machine become? For example, in Stross's Singularity Sky the Eschaton can wipe people from the face of the earth in the blink of an eye, transport them from planet to planet, travel through time (thus preventing anyone from ever trying to stop it from being created, or, as it puts it, "violating causality"), and hurl space-borne objects to obliterate entire star systems; quite God-like, in my opinion.

Lastly, would a technological singularity open a pathway for the next step in human evolution, or would the next step in human evolution (especially involving, again, consciousness) make a technological singularity obsolete, or even impossible? What I mean is: will human beings always advance faster than machines, or will an advancement in technology help humanity achieve a higher level of consciousness (or perhaps even transcendence)?

I know this is a lot, so let's just take it a little bit at a time. For starters, we can begin by talking about artificial intelligence and what we think about its nature, or the possibility of it even becoming more powerful than humanity.
 
You are way over my head on this one, Jarman, but it seems there have been plenty of sci-fi movies about this. Not my prediction, but we do know that old sci-fi became reality over the decades... without the Hollywood drama.

My largest concern with all technology for decades has been how it takes work away from humans, and today we are having a fairly significant problem with employment and what exactly people are supposed to do with their lives.

It doesn't seem as though "computers" could take over the world, but who knows. Still, it seems to me that it's nothing a little stray voltage couldn't fix in short order.

Could it take us to a higher level? I guess that depends on a person's opinion of what they consider higher or an improvement. As you know, my feeling is that we have been headed in the wrong direction.
 
The problem I hold with technology is in the user. Since any sort of technology, whether hardware or software, is created and produced, it must meet the parameters of its designer or creator. Although AI can think, it is still artificial: it thinks within its given parameters, and even if we could create self-reproducing technology, it would still have its roots on an assembly line, etc. So one must question the creator's intent, in use or misuse. As many articles already penned have pointed out, our technological advancement has far outstripped our ability to weigh the ramifications.

The singularity is an entirely different animal, because now we are talking about the "post-human". I specifically detest this concept, and admit to doing so due to my "religious" beliefs.
 
I think whatever level the technology reaches, it only displays our own power, so it can never be smarter than us, imo. Could it destroy us? I think that depends on how it's used.

My largest concern with all technology for decades has been how it takes work away from humans, and today we are having a fairly significant problem with employment and what exactly people are supposed to do with their lives.

Yes, but that creates other opportunities for OTHER people (especially those of the upcoming generations): if a machine takes away someone's job, another person will have a job as a machine technician or whatever. On a big scale it doesn't seem to hurt, but on an individual scale it does... a lot.
 
You are way over my head on this one, Jarman, but it seems there have been plenty of sci-fi movies about this. Not my prediction, but we do know that old sci-fi became reality over the decades... without the Hollywood drama.

My largest concern with all technology for decades has been how it takes work away from humans, and today we are having a fairly significant problem with employment and what exactly people are supposed to do with their lives.

It doesn't seem as though "computers" could take over the world, but who knows. Still, it seems to me that it's nothing a little stray voltage couldn't fix in short order.

Could it take us to a higher level? I guess that depends on a person's opinion of what they consider higher or an improvement. As you know, my feeling is that we have been headed in the wrong direction.

Good point. By "higher level" I meant achieving a more advanced level of consciousness (i.e. deeper understanding of the nature of the universe, etc.). If we were able to perfect some form of superhuman AI, it might be able to direct us towards such a state; if, that is, it deemed it practical to do so.

However, in achieving this transcendental state, the AI would enter quasi-divine and murky territory in which it might become something akin to a god-like figure; which is, I believe, why Dak holds certain reservations about such an idea. I'm interested to hear what Dakryn means by "post-humanity."

In the book Singularity Sky there's a particularly interesting comment one character makes, along the lines of: "The Eschaton [the name of the singularity] did not create the universe. It merely lives in it, like us." It's implicit, then, that the AI is not God. I don't think a singularity would become an all-powerful, omniscient being. If it comes into being after the creation of the universe, then it seems to me that it could never be omniscient, since it is still bound by time.

In Singularity Sky the singularity is capable of time travel, but it cannot violate time before time existed. No matter how I try to wrap my head around it, I have a difficult time imagining a being that can understand everything if it wasn't created at the beginning of everything.
 
No matter how I try to wrap my head around it, I have a difficult time imagining a being that can understand everything if it wasn't created at the beginning of everything.

If I remember my high school physics classes correctly: in physics, a part of a system cannot contain the whole system. We as humans, or any other creatures created after the Big Bang, cannot contain the whole universe; but I don't know whether "contain" also covers "understanding," since I'm not sure how my brain functions to store data.
 
Re the bit about 'can't create something more intelligent than ourselves': that really comes down to how you define intelligence. We already have tools with many times our capacity in specific realms, and we just ask them nicely to spit out information in forms we can deal with. We understand the processes (i.e. calculations, word searching, mapping, whatever) but lack the capacity. So once we understand 'general intelligence processes' properly, there should be nothing to stop us creating something with greater capacity...
 
Well, in a way, doesn't it all already control us? Adding to bureaucratic red-tape kinds of things, "no human voice" on the other end to get to the bottom of issues or to understanding. Maybe applying a narrower "perfect world" logic over rationalism? One of the beauties of humanity is individualism... as well as one of its detriments in some cases. With a machine you don't have this; it's just a dry personality, free of any emotion, with only one purpose and mission.

Still, for complete control it would take so much to "have eyes everywhere," and it would be very vulnerable to anything electrical; that's its food for life. In the end, just like any machine, while it may perform work far beyond that of a single human, it's still at the mercy of the human controlling it.

Hex - they have never matched employment for employment ever since mechanization took hold. It's been well hidden with smoke and mirrors for many decades, a century's worth for the most part, but today we are seeing a clearer picture as the smoke clears and the mirrors break.
 
However, in achieving this transcendental state, the AI would enter quasi-divine and murky territory in which it might become something akin to a god-like figure; which is, I believe, why Dak holds certain reservations about such an idea. I'm interested to hear what Dakryn means by "post-humanity."

My understanding of the singularity is "post-humanism": the eventual melding of man and machine to create super yet "post" humans.
While there are obvious ethical problems with this that many techies gloss over with "ooos" and "aaahhs" at the possibilities, my religious qualms don't lie in a fear of a "god-like" being, but rather in this: since, according to the Bible, man is made in the image of God, any attempt to create a "post-human" has to be Satanic in design (whether consciously or not), and basically a "fuck you" to God.
 
There is a huge difference between a machine assisting an organ in its normal function and replacing organs, bones, tissues, and tendons with machines.
 
How about forgetting any belief system and just considering how disgusting such a thought is. My life would be over if I became so dry and programmed; I don't find any of that cyborg bullshit attractive in the least, and I'm sure such business would meet an overwhelming negative response from humans... unless they fall for more smoke and mirrors, which in this case would be far more extreme than any blinders in the past.
 
palm plant - humans... so easily wrapped up in a distraction

Who gives a fuck where he draws his line in the sand; is it that hard to get the difference between a pacemaker and a robot? Further, what have his feelings about pacemakers got to do with the topic?

Now here we'll go into pages of religious bullshit. My observation is that most anti-religion types are far pushier than their adversaries.
 
Re the bit about 'can't create something more intelligent than ourselves': that really comes down to how you define intelligence. We already have tools with many times our capacity in specific realms, and we just ask them nicely to spit out information in forms we can deal with. We understand the processes (i.e. calculations, word searching, mapping, whatever) but lack the capacity. So once we understand 'general intelligence processes' properly, there should be nothing to stop us creating something with greater capacity...

I'm interested to hear more about what you mean here, as I'm not entirely sure I understand. The first thing that pops into my mind is that we have machines that can go on for years and years computing more and more digits of pi, whereas it would take a far greater number of humans much, much longer to do the same amount of work. Is this what you're saying? I realize it's a somewhat simplified example, if so.

My understanding of the singularity is "post-humanism": the eventual melding of man and machine to create super yet "post" humans.
While there are obvious ethical problems with this that many techies gloss over with "ooos" and "aaahhs" at the possibilities, my religious qualms don't lie in a fear of a "god-like" being, but rather in this: since, according to the Bible, man is made in the image of God, any attempt to create a "post-human" has to be Satanic in design (whether consciously or not), and basically a "fuck you" to God.

How about forgetting any belief system and just considering how disgusting such a thought is. My life would be over if I became so dry and programmed; I don't find any of that cyborg bullshit attractive in the least, and I'm sure such business would meet an overwhelming negative response from humans... unless they fall for more smoke and mirrors, which in this case would be far more extreme than any blinders in the past.

The book I'm reading now deals quite a bit with this issue of blending man and machine. There are some characters that have been "blended" by the Singularity. The story is extremely complex, but in a nutshell an AI system called "The Festival" kills human bodies and extracts their minds for use within mechanical beings; so, in this sense, many of the humans are re-embodied against their will. These characters then reappear with grotesque extremities.

However, there are other forms of "blending" in the novel that seem much more appealing, most likely because they were performed not by the Festival, but through humanity's own technological intervention. For instance, some characters have the ability to call up a digital information display in their minds (i.e. you could see the time of day in the upper-right corner of your vision at any moment, if you so chose). This form of blending doesn't seem as repulsive (I don't think) as the stereotypical apocalyptic cyborg-blending done by the Singularity (and that we see in popular cinema). What exactly are everyone's thoughts on this? Are there some forms of physical unity between man and machine that could be beneficial (without us tending more toward the "machine" side of things)?
 
How about forgetting any belief system and just considering how disgusting such a thought is. My life would be over if I became so dry and programmed; I don't find any of that cyborg bullshit attractive in the least, and I'm sure such business would meet an overwhelming negative response from humans... unless they fall for more smoke and mirrors, which in this case would be far more extreme than any blinders in the past.

What if you couldn't even notice it? You'd go through it gradually and unconsciously.
I've noticed how society here (I don't know about other places, though) is "programmed"; everybody has the same scenario: 18 years of education, the next two years are for a job, the first big amount of money you get should be spent on a marriage right away, have kids, raise them, home-work and work-home, retire, and fucking die.
Society already makes us sort of programmed, without any major intervention of technology.
 
However, there are other forms of "blending" in the novel that seem much more appealing, most likely because they were performed not by the Festival, but through humanity's own technological intervention. For instance, some characters have the ability to call up a digital information display in their minds (i.e. you could see the time of day in the upper-right corner of your vision at any moment, if you so chose). This form of blending doesn't seem as repulsive (I don't think) as the stereotypical apocalyptic cyborg-blending done by the Singularity (and that we see in popular cinema). What exactly are everyone's thoughts on this? Are there some forms of physical unity between man and machine that could be beneficial (without us tending more toward the "machine" side of things)?


Speaking for myself (and I wonder if others are like this), I don't want any digital information screen popping up in my mind; as it is, I wish I could shut off my mind most of the time. It's tough living in here... LOL. I never like the data that churns and churns all day. Does anyone else have this issue? Or is it time for me to try meds?... LOL

Hex - that's just life; humans and all animals have been like that since the beginning of time. Functioning on autopilot: daily regimen, efficient routine, gather food and eat... work and eat. Don't you have any squirrels or beavers in your neck of the woods? I have little doubt that they are descendants of ancient cyborgs... :lol:
 
Who gives a fuck where he draws his line in the sand; is it that hard to get the difference between a pacemaker and a robot? Further, what have his feelings about pacemakers got to do with the topic?

The clear difference between pacemakers and cyborgs is why I chose the pacemaker example. I'm interested in what level of artificial enhancement is deemed 'acceptable,' and why that level is acceptable but a greater level is not. Deep brain stimulation implants are a significant jump along the evolutionary ladder towards cyborg; would it help your poor black-and-white mind to think about those instead?
 
I'm interested to hear more about what you mean here, as I'm not entirely sure I understand. The first thing that pops into my mind is that we have machines that can go on for years and years computing more and more digits of pi, whereas it would take a far greater number of humans much, much longer to do the same amount of work. Is this what you're saying? I realize it's a somewhat simplified example, if so.

That's pretty much it... we understand the process but machines can do it better. I imagine the same will be true for 'intelligence' once we understand the process.
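For what it's worth, the pi example is easy to make concrete. Here's a minimal sketch in Python using Machin's formula with plain integer arithmetic (the function names and the 10 guard digits are my own choices, not anything from the thread): a few seconds of machine time here corresponds to what was once years of hand calculation.

```python
# Machin's formula: pi = 16*arctan(1/5) - 4*arctan(1/239),
# evaluated with exact integer arithmetic so no float precision limits apply.

def arctan_recip(x, scale):
    """Return arctan(1/x) * scale using the Taylor series, integers only."""
    power = scale // x                       # (1/x)^(2k+1) * scale, k = 0
    total = power
    k = 1
    while power:
        power //= x * x                      # advance to the next odd power
        term = power // (2 * k + 1)
        total += -term if k % 2 else term    # alternating series signs
        k += 1
    return total

def pi_digits(digits):
    """Return pi scaled by 10**digits (i.e. its first `digits` decimals)."""
    scale = 10 ** (digits + 10)              # 10 guard digits absorb truncation
    pi_scaled = 16 * arctan_recip(5, scale) - 4 * arctan_recip(239, scale)
    return pi_scaled // 10 ** 10

print(pi_digits(50))
# 314159265358979323846264338327950288419716939937510
```

We understand every step of this process perfectly well; the machine just executes it millions of times faster than we can, which is exactly the "capacity vs. process" distinction being made above.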