Re: Keeping AI at bay (was: How to help create a singularity)

From: John Marlow (johnmarlow@gmx.net)
Date: Tue May 01 2001 - 22:13:31 MDT


None of this speaks to "value." It also seems extraordinarily
arbitrary. Why choose 'classification'?

jm

On 1 May 2001, at 17:13, Mitchell, Jerry (3337) wrote:

> John Marlow wrote:
> <snip>
> Also, it seems the greater the gap in intelligence, the less value the thing
> on the high end ascribes to the things on the low end. For example, rank in
> order of value/respect: bacterium, cockroach, toad, mouse, dog, monkey,
> human. Many humans place the value cutoff at the human line.
> Where will an AI draw that line, hmm..?
> <snip>
>
> Ayn Rand seemed to have a reasonable segmenting of the different levels of
> thought that may apply to this. She broke it down into three levels.
>
> Sensation: The experience of receiving input directly on a nerve cell; when
> the stimulus is gone, so is the perception. Bacteria and very simple
> critters rely on this direct stimulus-and-react method.
>
> Perception: The experience of taking multiple sensations and being able to
> integrate them into the understanding that there is a particular "thing"
> that one is observing (the cumulative input into your eye, from all those
> rod and cone cells, turns out to be "rock", "tree", "car"). Dogs and cats
> generally live at this layer.
>
> Conception: The experience of taking perceptions and understanding that
> there are similarities between certain entities that can be categorized and
> classified. This is where we go from "rock" to "a family of objects I call
> 'rock', of which this is simply one particular instance", or to concepts
> like love, bravery, etc. Here's where we humans and AIs live.
> I don't think many people would argue that baser animals exist on the
> conception level (although a few tests have shown that some chimps make a
> good play for it at times). Unless someone could prove that there is another
> level of consciousness above the conceptual, the AI will ALWAYS be on the
> same level as us, differing only in scale and speed. Sure, you can say they
> MIGHT develop something new, but that's speculation and not based on fact.
> If it's this easy to draw lines about consciousness (except for the chimps),
> I'm sure an AI would have NO problem developing a system of classification
> as such. It would be hard pressed to draw lines based on processing speed
> and memory, though. Is the AI going to say to the less intelligent AI,
> sorry... you don't classify as intelligent just because you're running
> Pentium 3s and I'm using Pentium 4s? And I'm not going to sidestep the
> issue by saying 'Well, I'm not an AI so I can't make any intelligent
> predictions about what an AI might do.' Sure you can. Remember, conception
> is the top of the food chain, unless someone has PROOF otherwise.
>

John Marlow


