From: Bryan Moss (bryan.moss@dsl.pipex.com)
Date: Mon Jul 21 2003 - 08:45:44 MDT
Brett Paatsch wrote:
> What do you take to be the Yudkowskian Singularity? I am curious if it
> would overlap with Eliezer's view of the Singularity.
That's just the term I like to use for the standard Vingean, accelerating
superintelligence scenario.
> [...] It seems to me that the biological revolution will ensure that
> people will "not stop caring" about computing in the near future.
I should distinguish the general idea of computation, which will be widely
applied, from the "cultural artifact" of the computer, be it the mainframe,
the personal computer, or the supercomputer. It's the latter that I think has
had its day. For example, computers may play some role in neuroprosthesis,
but the idea of a brain-machine interface beyond a prosthetic role, or beyond
a simple interface (i.e., "mind control" of existing interfaces), is unlikely.
> Are you suggesting Artificial Intelligence is impossible?
No, I'm suggesting that the philosophical argument, the question of whether
computers are capable of genuine thought, is no basis for a research
programme. (To be honest, I don't think it's even really interesting as
philosophy.) If you want to create a brain, study the brain. The idea that
we can simply sit down and program "thought" is absurd.
> [...]
> There may be *no* effective political place to draw the line for *any*
> organisation that starts with the priority of confronting individual
> mortality first, unless that organisation decides to pursue its goal
> deliberately leaving some part of the human species outside its
> considerations.
This is a difficult issue. My partial solution is to argue that death, as
an event, is culturally determined, and to ask that our view be given as
much consideration as the "death is inevitable/desirable" view, under the
auspices of the prevailing pluralism in society. Avoiding senescence would
be a choice, and one that casts itself as somewhat arbitrary, rather
than a goal we apply to all humanity (the "millions of lives will be lost
unless we cure aging" mentality). This requires two things: (1) greater
understanding of what I just described as the "death is
inevitable/desirable" view; and (2) a more detailed account of our own
position and how it relates to the prevailing cultural situation. The
tricky part is how to inaugurate our "sub-culture" without appearing
arrogant; jettisoning the (mock) horror and confusion we feel at the idea
that some people want to die is only the start. My own hope is that the
tension between wanting to get the word out, to secure funding, etc., and
wanting a morally defensible exclusionary practice (so we don't just seem
like a bunch of arrogant Westerners who want to live forever while half the
world is starving) will be alleviated simply by elucidating our position.
My feeling is that our goals here aren't a simple product of hubris.
> [...]
> The telephone is inanimate.
A book is inanimate. Yet it embodies the ideology of its author, its time,
its culture. Of course, the book is a particular kind of artifact designed
to communicate, but I doubt anyone now thinks a book communicates only that
which it was specifically designed to communicate. It's a reasonable
extension to suggest that a technology, designed with a particular function
in mind, brings with it certain other "functions" reflective of the time and
place of its inception. In technology, these cultural biases are usually
expressed through certain notions of efficiency and optimality. When a
technology is transplanted from one culture to another, the result is far
more complicated than a simple "progress." (We can accept this while
rejecting the stronger thesis that there is no such thing as progress.)
BM