Re: Yudkowsky's AI (again)

Anders Sandberg (asa@nada.kth.se)
26 Mar 1999 16:21:09 +0100

"Billy Brown" <bbrown@conemsco.com> writes:

> So, remind me again, why exactly are we so worried about a human upload?
>
> The last time I looked, our best theory of the human brain had it being a
> huge mass of interconnected neural nets, with (possibly) some more
> procedural software running in an emulation layer. That being the case, a
> lone uploaded human isn't likely to be capable of making any vast
> improvements to it. By the time he finishes his first primitive neurohack
> he's going to have lots of uploaded company.

Exactly.

I think a lot of the discussions about the emergence of Powers and SI are hampered by a lack of what Lyle Burkhead calls 'calibration' - are these ideas really checked against reality? Technological development does not necessarily jump: there are always economics involved, humans often act together in a social manner, if somebody has technology/knowledge X then it is very likely that many others have it or are close to it too, Hollywood memes are not necessarily true, and so on. I think I need to write something like www.geniebusters.org for uploading and the brain :-)

-- 
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y