From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Jan 07 2003 - 19:20:47 MST
Robert J. Bradbury wrote:
>
> I think the comments by Damien's friend have some merit. This goes
> fairly deep into some philosophical topics I sometimes try to touch
> upon. There is no such thing as an objective "better". "Better" must
> always be entirely subjective (even if you get 99% of the rest of the
> humans on the planet to agree that *your* version of better *really*
> is better). [All you have to do is get one post-human AI which is
> clearly superior to a human to say your version of "better" is caca
> and you are s.o.l.] I can make a completely reasonable argument
> that humans in the OEM form (i.e. natural) are "better" than anything
> we can invent ourselves.
>
> The site should at the least be changed to something like
> www.betterhumansIMO.org, or something similar with lots
> of disclaimers on the first page that they are not trying
> to assert an absolute "better". (Which is really hard to
> do if you choose to use that word.)
If all "better" is "subjective" in the sense you claim, then this would be
a mere unnecessary duplicative redundancy.
Incidentally, you can't get 99% of the rest of the humans on the planet to
agree on objective truths, either.
If I may be permitted to rephrase the problem: any transhumanist looking
at www.betterhumans.com automatically sees "we want better humans", like
www.worldpeace.com, but the rest of the planet automatically sees "the web
site where better humans hang out", like www.atheists.com.
-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence