"Michael E. Smith" <mesmith@home.net> writes:
> Anders Sandberg wrote:
> "I wouldn't call that unextropian. After all, extropianism
> is about other things than believing in specific technologies."
> Actually, I agree with you about it not being "unextropian",
> which is why I put the word in quotes. However, I have
> noticed that not all "extropians" are as open-minded as you
> about some things. If you imply that it is okay for an
> extropian to believe that AI is impossible, I find that
> extraordinary and atypical, especially if you follow that
> belief to its logical conclusions.
So? Extropians are in general atypical.
I think FTL travel is impossible (without shortcuts and tricks like wormholes). But I might be wrong; it might turn out that we discover a way to do it. Have I been unextropian for not believing in this possibility? Hardly; my position was based on known facts and rational thought. The same goes for AI: we may debate whether it is going to work for a long while yet, and the interpretation of the facts isn't as clear, but in the end you can reach the conclusion that AI isn't possible given what you know and estimate, and still be an extropian. The unextropian thing would be to irrationally claim "AI *must* be possible".
[Excerpts from The Diamond Age snipped]
> Comment: Now who would argue that this in some sense conflicts with
> extropianism?
So? Suppose it really turns out that there are souls (of the kind people commonly believe in). Would that invalidate extropianism? Not in the least; it would just mean we have to adjust our plans, for example by studying how to make AIs with souls.
-- 
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y