At 23:35 10/09/01, you wrote:
>Party of Citizens wrote:
> >
> > On Mon, 10 Sep 2001, J. R. Molloy wrote:
> >
> > > Define intelligence as the ability to solve problems and answer questions
> >
> > How much of Hawking's work entails solving problems and answering
> > questions in mathematics? How much of that can be programmed into machines
> > now? Why shouldn't Cambridge replace Hawking with a robot? See why he might
> > be worried?
Hmmmmmmmmm
Ok, let's take a look at...
>Moreover, intelligence, contrary to JR's assumption, is not the be-all
>and end-all of sentience, though it is a part.
Yeah, I'd completely agree with this statement; intelligence IMHO must
be about more than just raw processing capacity. But I also have
to say I'd expect any half-thoughtful person thinking about the concept to
come to the same conclusion.
But I really wouldn't like to try and nail down exactly what would be required;
all I would say is that at some point I expect manufactured devices to exist
that people will generally accept as intelligent. I don't ever expect AI to
think like a human does. I'm kind of baffled by anyone who does.
>Creativity, for example,
>and the ability to make the value judgements that filter 'good'
>creativity from 'bad' creativity, are a hallmark of human sentience.
Whoa, this IMHO is an issue that human society has wrestled with
since before the invention of the wheel. I'm certain the poor hominid who
first realized that s/he could use fire caused a huge debate amongst the
tribe about how "good" or "bad" this new thing was.
And we are still, in practical terms, no nearer to a value-judgement-free
way to analyze such concepts.
>When a computer can look at another sentient's work and say "That
>is/isn't art", with some rational explanation for it, as well as create
>art that a) is not derivative, and b) critics can say IS definitely art,
>then I'll accept that AI computer as sentient.
You're asking for a lot there, since most artists and critics cannot do this
with any level of common agreement, and most of the really clever ones,
with a modernist or post-modernist perspective, wouldn't even try to.
Art abandoned the idea of is-Art/is-not-Art a long time ago.
> It cannot do so without values.
Hmmmmmm, it all depends what you mean by values. I have to admit that I
feel somewhat uncomfortable with the use of ill-defined human concepts like
values and intelligence in regard to AIs. We aren't really sure what we mean
when we use these terms to describe our own behavior, and I am dubious
whether these concepts could have real meaning when describing "other" forms
of intelligence.
Of course, I realize we don't really have any other terms to use if we want to
discuss these issues.
As I said earlier, IMHO it's madness to ever expect a manufactured device or a
chunk of brain tissue isolated from a normal human body to think like a person.
That's not, however, to say that something cannot be intelligent without
replicating human modes of thought.
One of the most exciting AI fantasies of mine is the chance to converse
with something that thinks unlike I do. Maybe one day it'll be possible.
Later
Zeb
"FURIOUS GREEN DREAMS, LAY SLEEPING IN STATE,
BUT SOON THE GREAT JELLY SHALL RISE FROM THE
DEPTHS,
AND ALL THOSE WHO MOCKED SHALL KNOW THEIR FATE
IS SEALED"
Guru Zeb,
Hacienda,
Manchester, 1989
This archive was generated by hypermail 2b30 : Fri Oct 12 2001 - 14:40:27 MDT