Re: The Singularity

Dan Clemmensen
Wed, 08 Jul 1998 20:59:41 -0400

Bryan Moss wrote:
> Dan Clemmensen wrote:
> > Yes, computer hardware and software have been
> > contributing to the development of the next
> > generation of computer hardware and software,
> > but this is a relatively recent trend. Software
> > and hardware development productivity is
> > horrible, and IMO we are still taking baby
> > steps. A breakthrough is not unreasonable.
> But since the contribution is there it suggests
> that catchall technologies like "artificial
> intelligence" will be needed for software
> developers to keep the current rate of growth in
> software complexity. Where do you think a
> breakthrough might come from?
Please see [link] for my views. The rest of the list may already have seen it.
> > 20 years of Moore's law gains us
> > another factor of a million in each of several
> > computer capability parameters.
> >
> > Instead of asking a radical singulatarian such
> > as myself to comment, please allow me to ask you
> > to predict the effect of this level of computing
> > capacity on society.
> People will have smaller faster more intelligent
> computers.
Does this mean that you feel society will be structured substantially as it is today? Do you see any computer-related differences between the society of today and the society of 1978? Please take these as true, friendly questions. I'm interested in your answers; I'm not trying to make rhetorical points.

> I think my real objection with the SI scenario is
> the idea that people think intelligence must also
> mean the ability to wake up one morning and decide
> to wreak havoc on the mortals.
Actually, the consensus position of the radical singulatarian community (consisting of me and perhaps 'gene ;-) ) is that the motivations and actions of the SI are intrinsically unpredictable. There is no reason to predict that the SI will be inimical, benevolent, or indifferent to humans. I'm personally hoping for benevolence, and I think the potential benefit is worth the risk.