AI motivation

Rob Harris (rob@hbinternet.co.uk)
Mon, 25 Oct 1999 09:31:10 +0100

>Eliezer's pointed out the incoherence of believing you can hard wire
>high level beliefs or motivations and I quite agree. You do get to
>specify what kind of feedback-inducing behavior gets reinforced or
>attenuated though.

Absolutely; I'm not completely fresh to this subject. The fact remains that rewarding a particular behaviour provides motivation in itself. Whether you type the line "Survive at all costs" into the "moti-con" or reward existence-preserving behaviour, the result is the same: a "motivation" to survive. So my point remains - who is going to create an AI with gene-being-style motives, then grant that AI the powers necessary to seriously challenge human affairs? That basically amounts to building an extremely complicated Armageddon device. A bomb would be far, far easier...

Rob.