> "Michael S. Lorrey" wrote:
[for the impatient: proposed movie outline at end of post]
Why would that be "dangerous"? The only unrealistic thing is that the AIs would still somehow need humans. Instead of enslaving humans, they'd probably just get rid of them in some fast and efficient way. Of course, that would make the movie rather short and boring, so they had to come up with this silly plot.
Let's face it: the chance that the Singularity will kill, or otherwise disadvantage, many if not all people is considerable. I'd say 50% would be a very safe bet. Movies that depict the Singularity as joyful and wholesome for all are *at least* as misleading and dangerous as their more negative counterparts. You know, lemmings rushing towards the edge of the cliff.
Of course, Eliezer and others would argue that we need the Singularity to save mankind from a self-inflicted holocaust (cause a holocaust to prevent a holocaust, hmm?), but this argument is rather weak. All we need to do as a species is invest in self-sustaining space colonies. That would dramatically reduce the chance of extinction by spreading the risk across multiple worlds. No benevolent AI (just a God surrogate, really) is needed to save us; we can do it quite well ourselves, thank you very much.
So, if you want to help mankind by making propaganda movies, make them about future man-made or natural disasters that destroy the earth, and how civilization survives thanks to the foresight and persistence of space pioneers. Show the audience how good life could be in a "space wheel" habitat (has this ever been done on a grand scale?), how you could mine the moon, colonize Mars, etc. In a subtle and by-the-way manner, of course. For popularity, throw in a good dash of romance and good-lookin' actors (think Titanic). The destruction of the earth as the background theme should provide ample possibilities for a spectacular, fast-paced movie.