den Otter wrote:
> > From: Eliezer S. Yudkowsky <firstname.lastname@example.org>
> > I see two possible problems:
> > 1) We're so busy trying to sabotage each other's efforts that we all
> > wind up getting eaten by goo / insane Powers.
> Let's hope we can somehow prevent that...
Ooh! Yeah! Great plan!
> > 2) Iraq gets nanotechnology instead of the US / the AI project has to
> > be run in secret and is not subject to public supervision and error correction.
> Note that "curbing" AI (or any other dangerous technology) by no means
> has to involve government-imposed bans. There are other (better) ways to
> do this.
> Besides, do you really believe that the US military would ever drop their
> nano/AI research projects because of some sissy civilian ban? They're
> not *that* stupid. Not that such a ban would be likely to be imposed in
> the first place; there's too much money riding on this.
What, you mean like with strong encryption and control of the digital economy?
> No, scaring the
> public and the government would more likely result in a tightening of
> project security, which is quite good because it would buy us some time.
Buy *who* some time?
> > "Trying to suppress a dangerous technology only inflicts more damage."
> > (Yudkowsky's Threats #2.)
> How defeatist. I'd say that suppressing the proliferation of nukes, for
> example, was a *great* idea. Otherwise we probably wouldn't be here
> right now. Stupid as they may be, big governments do offer fairly good
> stability, on average.
Yes, nuclear weapons are an interesting case. I should say that trying to suppress the *creation* of a technology - research and development - only inflicts more damage. I'm fully in favor of suppressing the *proliferation* of dangerous technology.

Once Zyvex has nanotechnology, I'd be fully in favor of their immediately conquering the world to prevent anyone else from getting it. That's what should've been done with nuclear weapons. If the "good guys" refuse to conquer the world each time a powerful new weapon is developed, sooner or later a bad guy is going to get to the crux point first. Alas, I don't think Zyvex's resources will suffice for the "matter programming" needed.
> > Can anyone really think that ve can panic politicians or the media into
> > pushing for laws that suppress nanotech/AI without their noticing the
> > existence of AI/nanotech/uploading/neurohacking/genetic engineering?
> > You push for the "Nanotechnology Suppression Act" and you'll get the
> > "Comprehensive Ultratechnology Regulation Bill".
> So what? Laws can be broken, twisted, evaded. Like we were waiting for
> the government's blessing in the first place.
Okay, now I don't get it. Are you under the impression you'll find it easier to evade nanotechnology laws than I'll find it to evade AI laws?
> > Who are the "Transtopians", anyway? My guess is
> > den Otter.
> The writings are mine, obviously. Anyone who agrees with the principles
> can call himself a "Transtopian". And yes, there are actually
> like-minded people out there, strangely enough. Of course, as this is
> the fringe of a fringe movement, you can't expect it to be very big.
An... interesting... perspective.
-- 
email@example.com         Eliezer S. Yudkowsky
http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way