Lyle Burkhead doesn't see the development of self-improving, self-replicating, 
highly intelligent machines as a threat to himself or other humans.  He sees 
intelligent machines as basically a benefit to humans, because with them 
humans can accomplish more while working less.  He cites the example of a 
business, a machine he considers more intelligent than he is, yet one he 
doesn't feel threatened by.  He doesn't believe intelligent machines will 
necessarily be set on destroying humans.
I understand very well what Lyle is saying, and I understand his reasoning.  
However, I believe there are a few things he is not considering, or rather, I 
don't think he is thinking far enough into the future.
In the short term, intelligent machines will be a great benefit to humans, 
especially to intelligent humans who know how to use them.  There will be no 
reasonable way to see these machines as a threat to humanity in any immediate 
sense.  
Currently, and for a while still, our machines are fairly well under our 
control, so there is not much need to fear them.  Even when our machines become 
several times more intelligent than we are, they probably won't have any 
reason to destroy us or harm us.
However, they will be developing their intelligence, knowledge and power 
*very* quickly once they become intelligent enough to guide their own 
development and alter their designs significantly.  I don't see any realistic 
way to prevent this from happening.
Eventually, these intelligent systems will be *far* more advanced than 
humanity in anything close to humanity's current state.  When these systems 
consider the systems which are called "humans", they will see incredibly 
entropic and wasteful systems.  To them, the typical human will seem far more 
entropic than mass-murdering psychopaths seem to us.  We are terribly 
inefficient, producing very little cognition and intelligence for the amount 
of matter and energy it takes to maintain our bodily and mental systems (in 
comparison to projected future beings).  Destroying humans and using 
their energy and matter will probably concern them as much as we are concerned 
about disturbing bacterial colonies in soil when we build our houses on them.  
They will be so far beyond humans that destroying one will not only seem 
insignificant, it will seem very good, since humans are wasting so much 
precious matter and energy.
Another thing these systems are likely to do is turn off the sun, if 
possible.  The sun wastes far more energy than we puny humans do, so they will 
probably want to shut down that horribly wasteful fusion reaction first.  Perhaps 
they will find ways to strip off matter from the sun and put it into orbit at 
some distance from the sun.  If they pull enough of the matter away and have 
it spinning around the center of gravity of the solar system, the fusion 
reaction will stop, and they can gather the matter and use it when they want 
to, in controlled fusion reactions, where as much energy as possible is 
utilized.  However, the sun might become unstable if too much matter is taken 
away from it (I don't know very much about that, so perhaps it wouldn't.  The 
fusion reactions in the sun's core are held in by the immense mass pressing 
inward, so if too much were removed, the reaction might escape containment and 
explode.  Or perhaps the sun would simply cool down as mass is removed, since 
less pressure would be forcing hydrogen atoms to fuse.  I don't know).  
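For what it's worth, a rough sketch of the second possibility: main-sequence 
stars are observed to have luminosity scaling steeply with mass (roughly 
L proportional to M^3.5 near one solar mass), which suggests a star stripped of 
mass would burn dimmer and cooler rather than explode.  A back-of-the-envelope 
calculation, assuming that simple empirical power law holds:

```python
# Rough main-sequence mass-luminosity scaling: L/L_sun ~ (M/M_sun)**3.5.
# This is an empirical approximation near one solar mass, not a stellar model.

def relative_luminosity(mass_fraction, exponent=3.5):
    """Luminosity (in solar units) of a star of `mass_fraction` solar masses."""
    return mass_fraction ** exponent

# Strip away half the sun's mass:
print(relative_luminosity(0.5))   # ~0.09 of current output -- dimmer, not explosive
```

On this crude scaling, halving the sun's mass cuts its output by more than a 
factor of ten, so the "cooling down" scenario looks more plausible than an 
explosion.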
But turning off the sun may not be possible, and even if it were, it would 
probably take a while, and meanwhile lots of energy would be wasted.  [These 
guys are hard-core Extropians, remember, so they will be constantly working to 
decrease entropy in the universe and increase extropy (it's just the most 
intelligent, self-serving approach)].  So, they will want 
to harness as much of this energy as possible.  Perhaps they will build some 
kind of solar collectors which orbit the sun.  They may have *many* of them, 
all orbiting in different orbits at different angles.  They will think of 
something, but they will likely need lots of matter to do this.  They can 
probably get a lot of matter by stripping it away from the sun, and they can 
also use the planets and other objects in the solar system, converting this 
matter and energy into increasingly intelligent and extropic systems.
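To put rough numbers on the waste (these figures are my own back-of-the-envelope 
assumptions: solar luminosity about 3.8e26 W, total human power use very roughly 
2e13 W), nearly all of the sun's output misses Earth entirely, which is what 
makes orbiting collectors so attractive:

```python
# Back-of-the-envelope comparison of solar output to human energy use.
# All figures are rough assumptions: solar luminosity ~3.8e26 W; total human
# power consumption ~2e13 W; Earth radius ~6.37e6 m; 1 AU ~1.496e11 m.
import math

SOLAR_LUMINOSITY_W = 3.8e26
HUMAN_POWER_W = 2e13
EARTH_RADIUS_M = 6.37e6
AU_M = 1.496e11

# Fraction of the sun's output that even strikes Earth: the ratio of
# Earth's cross-sectional disk to the full sphere at 1 AU.
earth_fraction = (math.pi * EARTH_RADIUS_M**2) / (4 * math.pi * AU_M**2)

print(f"Sun outputs {SOLAR_LUMINOSITY_W / HUMAN_POWER_W:.1e}x human power use")
print(f"Fraction of sunlight hitting Earth: {earth_fraction:.1e}")
```

On these assumptions the sun radiates on the order of ten trillion times what 
humanity uses, and less than a billionth of it ever reaches Earth; the rest is, 
from an extropian standpoint, pure waste.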
Eventually, humans will interfere quite a lot with the goals of these 
intelligent beings, who will think no more of destroying humans than we do of 
brushing our teeth or washing our hands, killing millions of bacteria.
This won't be even close to happening until things are *far* more advanced 
than they are now.  This may seem comforting, but we must remember that once 
these intelligent machines get going and become autonomous and able to direct 
their own growth and production, they will develop *very* quickly, far faster 
than anything ever has on this planet, and they will keep developing faster 
and faster.  This is called 'The Singularity', as most of us know.
I hate to be a doomsayer, and generally I am not, but I honestly believe we 
have a serious problem coming in the future and we need to deal with it 
intelligently.  I also believe that the only realistic way of dealing with 
this problem is to radically transform ourselves into incredibly powerful and 
intelligent beings, so we can keep up with all the others who are doing the 
same thing.  And I am optimistic that we can keep up, if we are committed to 
and continue advancing ourselves quickly.
I stress that there are many things we can do right now to begin transforming 
ourselves and becoming more intelligent: developing powerful mental 
disciplines and learning how to use our minds much more intelligently.  
Lyle asks if I would engage in this mental discipline even if it wasn't 
necessary for my survival.  Yes, I would.  I originally began training my mind 
and improving myself so that I could actually enjoy my life rather than live a 
mediocre and pathetic life like most people do.  Most of my training is for 
the purpose of enhancing my enjoyment of life and making my life more rich and 
meaningful.  I think this is my prime motivation for engaging in my intense 
self-discipline.  I also think that those who are not committed to 
disciplining and transforming themselves appropriately will not survive more 
than another one or two hundred years (possibly even less).  I'm not really 
sure how much time we have to waste, but I'm working on advancing myself as 
fast as possible.  Hopefully it won't become necessary for my survival for a 
long time, but I see a great possibility that humans, as they are now, will be 
seen as far too entropic and will be destroyed, their matter and energy used 
to forward the cause of ever more advanced forms of life.  
Someone please convince me that my reasoning is incorrect.  I don't 
necessarily want humans to be wiped out, but I'm not sure that it can be 
avoided sooner than is comfortable for most of us.
- David Musick
                      - question tradition -