Re: The Singularity

Bryan Moss
Wed, 8 Jul 1998 17:23:24 +0100

Dan Clemmensen wrote:

> > 2) Artificial Intelligence speeds up the
> > creation of more powerful computers and
> > intelligences. This seems wrong because
> > hardware and software power is already a major
> > factor in the increase of hardware and
> > software power. There is no reason to suggest
> > this trend would suddenly change.
> Yes, computer hardware and software have been
> contributing to the development of the next
> generation of computer hardware and software,
> but this is a relatively recent trend. Software
> and hardware development productivity is
> horrible, and IMO we are still taking baby
> steps. A breakthrough is not unreasonable.

But since that contribution already exists, it suggests that catch-all technologies like "artificial intelligence" will be needed if software developers are to sustain the current rate of growth in software complexity. Where do you think a breakthrough might come from?

> > 3) A Super Intelligence emerges from a
> > distributed network, such as the Internet.
> > I think this goes against current
> > network/software/hardware models and that a
> > distributed intelligence would only emerge
> > under certain (possibly engineered)
> > circumstances that current trends in hardware,
> > software and the economy would not support.
> In my personal model, the SI is not purely an
> emergent phenomenon of the net. The net serves
> as the raw material for a directed augmentation
> of an initial proto-SI that first emerges as the
> result of a catenation of a set of development
> tools and a human programmer.

Creating any kind of universal intelligence would be a massive task, even with genetic algorithms (which haven't had much success on large projects, let alone human-level intelligence). And if you're expecting such an SI to "evolve" in the network, you would need a very specific environment to get it to display human-like qualities. This is unlikely to happen without a large co-operative effort to build the SI, which is in turn unlikely because of the risk factors involved; anyone with the money and resources necessary to pull off such a project would have no obvious reason to do so.
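For readers unfamiliar with the technique, a genetic algorithm is easy to sketch on a toy problem. The sketch below evolves a bitstring toward all ones ("OneMax"); the fitness function, population size, and mutation rate are all illustrative assumptions, and the point is how trivially simple the objective has to be for a GA to succeed, compared with anything resembling intelligence.

```python
import random

# Minimal genetic-algorithm sketch (OneMax: evolve a bitstring of
# all ones). All parameters here are illustrative assumptions; the
# toy fitness function stands in for a vastly harder real objective.
def evolve(length=32, pop_size=50, generations=100,
           mutation_rate=0.02, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    fitness = sum  # fitness = number of ones in the bitstring
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]      # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, length)
            child = a[:cut] + b[cut:]       # one-point crossover
            for i in range(length):         # point mutation
                if rng.random() < mutation_rate:
                    child[i] ^= 1
            children.append(child)
        pop = children
    return max(pop, key=fitness)

best = evolve()
print(sum(best), "ones out of", 32)
```

With a fitness function this simple the population converges quickly; the open problem the text alludes to is that nobody knows how to write a comparably evaluable fitness function for "human-level intelligence".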

> OK, let's take something very simple: Moore's
> law. We all know that Moore's law is simply an
> observation of a historical trend and cannot
> be assumed to have any predictive power, but the
> trend is quite robust and has already survived
> through several technological generations.
> Furthermore, it's fairly easy to see the next
> several steps, getting us through the next 20
> years without recourse to fundamental
> breakthroughs. 20 years of Moore's law gains us
> another factor of a million in each of several
> computer capability parameters.
> Instead of asking a radical singulatarian such
> as myself to comment, please allow me to ask you
> to predict the effect of this level of computing
> capacity on society.

People will have smaller, faster, more intelligent computers.
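The quoted factor-of-a-million figure is worth checking. It holds if capability doubles once a year; common statements of Moore's law use anywhere from 12 to 24 months per doubling, so the 12-month period below is an assumption.

```python
# Moore's law modelled as periodic doubling. The doubling period is
# an assumption; 12-24 months are all commonly quoted.
def moores_law_factor(years, doubling_period_years=1.0):
    """Capability multiplier after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

# 20 years of annual doubling gives roughly a factor of a million:
print(moores_law_factor(20))  # 2**20 = 1048576
# With an 18-month doubling period, the gain is closer to 10,000x:
print(moores_law_factor(20, doubling_period_years=1.5))
```

So "a factor of a million" is the optimistic end of the range; the more conservative 18-month reading still yields four orders of magnitude over 20 years.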

I think my real objection to the SI scenario is the assumption that intelligence must also bring the ability to wake up one morning and decide to wreak havoc on the mortals.