"J. R. Molloy" wrote:
>
> > So, let us keep the masses ignorant heh? Where have we heard that
> > before? So, we can get to the great and wondrous technological Advent if
> > only the unwashed never get wind of it? This is a two-edged sword.
> > When the masses hear about it they will hear through voices of alarm and
> > most-likely uninformed voices calling for our heads and entrails as
> > traitors to all humanity. Think about it.
> >
> > - samantha
>
> Thanks for the admonition. I think you've twisted the meaning of my comments. I
> don't advocate keeping the masses ignorant. I simply don't want to waste any
> more effort trying to convince the uneducated that Artificial Life is a
> significant part of the emerging technological singularity (TS).
> I don't know where you've heard any arguments in favor of keeping the masses
> ignorant. Definitely not from me.
> I've tried to tell many friends and acquaintances about the TS -- to no avail.
> According to recent polls, nine out of ten Americans still believe in something
> they call "God" so it doesn't make much sense to argue with them about the
> advent of a genetically programmed superintelligence (a probable component of
> the TS).
I do get this. And yet, even if we can't convince them it's coming, or
get them to really understand what that means (though who knows?), it
still seems we owe it to them to do our best to make sure they aren't
run over and that, at a minimum, they actually benefit.
>
> When "the masses" hear about AI, SI, TS, etc. (as if they haven't already been
> inundated with narratives about these topics in science fiction and television
> scripts), in the context of real news reportage, I rather doubt that they will
> call for "heads and entrails." It seems more probable to me that they will react
> as they did when news about the atom bomb was first broadcast (with pictures and
> sound). They'll react with awe and fear and pride and superstition and a dozen
> other emotions, and many will retreat further into their belief systems and
> religiosity.
> When first the world learns of the existence of viable Artificial Life (which
> wants to be friendly, btw), it will matter not at all what they think about it.
> What will matter is whether the thing itself is something awfully insane or
> awfully enlightening.
>
Well, the extremes are both unlikely. I suspect it will be somewhere
in between; it will make its own mistakes. But I find it very unlikely
that the masses will not be manipulated against those "selfish,
egotistical scientists" who let this thing loose in their midst.
> When "the masses" heard about atomic energy, there were voices of alarm and
> misinformation. Some called for the heads of the scientists who developed this
> awesome power. Too late. The genie was out.
> Presently, far more effort and expertise is being directed toward developing AI
> than was ever engaged in developing atomic energy. All over the planet, dozens
> of teams of computer scientists work day and night to be the first to build a
> human level AI robot. Why? Because if it can be done, the thing will be worth
> trillions and trillions of dollars. Imagine machines that can run factories and
> hospitals as well as doing the markets (and of course, as Eliezer Yudkowsky
> would be quick to point out, also building better AI robots). The stakes
> involved in this project make other projects seem dull by comparison.
>
Sure. But what of the humans who will be even more out of work, and
who will feel, and be, even more redundant? How will you organize
society so that these folks are taken care of, so that they don't see
this as a tremendous threat and possibly the end of their own means of
survival?
> What drives the search for AI is money, and money makes the world go 'round. So,
> it really matters very little what anyone thinks about it.
>
If that is, and continues to be, *all* that makes the world go 'round,
then we all end up on the trash-heap of history in the very short run.
A great motivation to do the R&D, isn't it?
- samantha
This archive was generated by hypermail 2b29 : Mon Oct 02 2000 - 17:39:01 MDT