Emmanuel Charpentier wrote:
> "Eliezer S. Yudkowsky" wrote:
> > Or to summarize: "Either life has meaning, or it doesn't. I can act as if it
> > does - or at least, the alternative doesn't influence choices. Now I'm not
> > dumb enough to think I have the vaguest idea what it's all for, but I think
> > that a superintelligence could figure it out - or at least, I don't see any
> > way to figure it out without superintelligence. Likewise, I think that a
> > superintelligence would do what's right - or at least, I don't see anything
> > else for a superintelligence to do."
> I think I like your straight and seemingly elegant use of logic, but I
> hate the consequences. Then I will not act logically! (considering that
> particular framing of logic anyway, I'm sure we can discuss the
> probabilities and the fact you use fuzzy logic and yes/no state)
Yeah, learning to accept the consequences is the hardest part of building a logical moral system. Most philosophers have preferred the traditional method: start from the desired consequences and invent the system to fit. But that's what separates logic from fantasy. Objective thought is what tells you something you didn't know, and tells you to do something you didn't want to do.
> One: if you do not believe in the Christian God, there are two possibilities:
> if the Christian God exists, you go to Hell; if it doesn't, nothing happens.
> Two: if you do believe in God and it doesn't exist, then there is no
> consequence, but if it does (finally) exist, then Jackpot!!! you go to Heaven.
The fallacy is the assumption that there are only two possibilities and that they are 50/50. The probability that Christianity is true is as close to zero as makes no difference. Now, you can talk about infinite payoffs and claim the arithmetic still works, but if you assume a smaller set of the postulates assumed by the Christians, i.e. only that "beings of infinite power will someday judge me", you get a substantially higher probability, which suggests a better course of action: live an ethical life. The Christian chain is admittedly nonzero, but it is vastly outweighed by the generic theist chain, which in turn is outweighed by the hedonistic qualia chain, which in turn is IMO outweighed by the Singularity chain.
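The expected-value arithmetic behind this ordering can be sketched in a few lines. The probability figures below are invented purely for illustration (they are not figures from this discussion); the argument needs only that each more generic set of postulates is strictly more probable than the fuller set it is a subset of, so acting on it dominates given comparable payoffs.

```python
def expected_value(p_true, payoff_if_true, payoff_if_false=0.0):
    """Standard expected value of acting on a hypothesis."""
    return p_true * payoff_if_true + (1.0 - p_true) * payoff_if_false

# Hypothetical probabilities for illustration only. The generic theist
# chain assumes a subset of the Christian postulates, so its probability
# can only be equal or higher.
p_christian = 1e-9   # full Christian postulate set
p_theist    = 1e-6   # only "beings of infinite power will someday judge me"
payoff      = 1.0    # same finite payoff for acting on either chain

ev_christian = expected_value(p_christian, payoff)
ev_theist    = expected_value(p_theist, payoff)

# Given equal payoffs, the more probable (more generic) chain dominates.
assert ev_theist > ev_christian
```

The point is not the particular numbers but the structure: with finite, comparable payoffs, the ordering of the probabilities fixes the ordering of the expected values.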
Incidentally, I sincerely doubt the Singularity will be interested in judging our past lives. Even if so, I have stronger logical chains to act on.
> And, do you currently think that there exists an objective morality?
I'd say there's between a 20% and 70% probability. Even 10% would be good enough for the logic to hold both arithmetically and intuitively.
--
email@example.com          Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/sing_analysis.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.