Lyle Burkhead wrote:
> So far, no one has attempted to defend the Assembler Breakthrough as
> described in Engines. Even my most hostile critics say that the
> Breakthrough with a capital B isn't going to happen (and they are hostile,
> apparently, because they have just glanced at the site, and they think I am
> attacking nanotechnology in general, or technology in general, or AI in
I think I would probably count myself as a "hostile critic". Polite, yes, but still hostile. Your entire Web site does seem to be a straw-man argument; if you really aren't attacking nanotechnology and AI in general, you're doing a very poor job of conveying that important piece of information. If you're really devoting an entire Web site to arguing against the "Great Breakthrough" described in _Engines_, then say "I think nanotechnology will happen _this_ way", so it's clear to everyone that you're attacking only one scenario, not nanotechnology in general.
Discussions of nanotechnology and AI have advanced far beyond the popularized simplification you call the "genie" hypothesis; Drexler has _Nanosystems_, I have _Coding a Transhuman AI_, and neither could plausibly (chronologically or literarily) have been included in _Engines of Creation_. You're stomping on the greasy smear where there used to lie a dead horse. No offense.
If what you mean is that nanotechnology and transhuman AI are technologies so powerful as to shatter our reality, not just manufacture frisbees, then say so. Don't talk about how they're "impossible"; don't imply that because a particular unimaginative fantasy of omnipotence is unlikely, the technology powering it is flawed. If you think nanotechnology or transhuman AI are impossible, then explain why, and talk about _Nanosystems_ or _Coding_, not _Engines_. If you think that advanced nanotechnology requires AI, then analyze what can be done using intermediate nanotechnology.
Sort out your arguments. You're mixing them together, failing to indicate what you're arguing against at any given time, or what implies what, and never distinguishing between specific and general cases.
So, yes, I suppose I consider myself a "hostile critic".
--
email@example.com
Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/singul_arity.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.