Discussion Board

Topic: Singularity Summit

From: patrick
Date: 07/23/2009

Below are comments from such notable people as Hans Moravec, Ray Kurzweil, and others regarding the Singularity Summit conference in '07.

http://www.singinst.org/summit2007/index.html


Nick Bostrom: One consideration that should be taken into account when deciding whether to promote the development of superintelligence is that if superintelligence is feasible, it will likely be developed sooner or later. Therefore, we will probably one day have to take the gamble of superintelligence no matter what. But once in existence, a superintelligence could help us reduce or eliminate other existential risks, such as the risk that advanced nanotechnology will be used by humans in warfare or terrorism, a serious threat to the long-term survival of intelligent life on earth. If we get to superintelligence first, we may avoid this risk from nanotechnology and many others. If, on the other hand, we get nanotechnology first, we will have to face both the risks from nanotechnology and, if these risks are survived, also the risks from superintelligence. The overall risk seems to be minimized by implementing superintelligence, with great care, as soon as possible.


Sir Martin Rees: I certainly think that humans are not the limit of evolutionary complexity. There may indeed be posthuman entities, either organic or silicon-based, which can in some respects surpass what a human can do. I think it would be rather surprising if our mental capacities were matched to understanding all the key levels of reality. Chimpanzees' certainly aren't, so why should ours be? So there may be levels that will have to await some post-human emergence.


Ramez Naam: In the end, this search for ways to enhance ourselves is a natural part of being human. The urge to transform ourselves has been a force in history as far back as we can see. It's been selected for by millions of years of evolution. It's wired deep in our genes, a natural outgrowth of our human intelligence, curiosity, and drive. To turn our backs on this power would be to turn our backs on our true nature. Embracing our quest to understand and improve on ourselves doesn't call into question our humanity; it reaffirms it.

Re: Singularity Summit

From: Al Brady
Location: Sydney
Date: 11/05/2009

I hope Mr Bear won't mind me mentioning this on his web forum, but there is an awesome (very) short story on the development of superintelligence by Ted Chiang called Understand.
The premise is a drug that helps people recover from brain damage and then boosts them to a super-high IQ, but the way the author explores qualitatively new forms of thinking is really, well, mind-blowing.

Re: Singularity Summit

From: Greg Bear
Date: 11/12/2009

Ted's one of our best. I'll go back and find that story!
