Re: Where's God?

Eric Watt Forste
Wed, 25 Jun 1997 17:14:20 -0700

Erik Moeller writes:
> So you say that computing power and high quality information are the
> same?

I said nothing of the kind. I said that you can't define the
sense of the word "quality" that you are using, and you've done
nothing so far to convince me otherwise.

> And why should any posthuman entity talk to non-posthuman entities?

Comparative advantage. Of course, that's a term of art from the
field of economics, which you can't be bothered to study,
because you're so smart you've already figured out that it's all
nonsense.

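Since the term keeps coming up, here's a toy sketch of comparative
advantage (all the numbers are invented for illustration): even an
entity that is better at producing *everything* still gains by trading
with an inferior producer, because what matters is relative cost, not
absolute ability. That's why a posthuman might bother to talk to us.

```python
# Toy comparative-advantage sketch (all numbers invented for illustration).
# The posthuman produces both goods faster than the human, yet both
# parties gain by specializing and trading.

# Hours needed to produce one unit of each good:
posthuman = {"theorems": 1, "widgets": 2}   # better at everything
human     = {"theorems": 8, "widgets": 4}   # worse at everything

# Opportunity cost of one theorem, measured in widgets forgone:
post_cost  = posthuman["theorems"] / posthuman["widgets"]   # 0.5 widgets
human_cost = human["theorems"] / human["widgets"]           # 2.0 widgets

# The posthuman is the *relatively* cheaper theorem-producer, so it pays
# the posthuman to make theorems and trade them for the human's widgets
# at any exchange ratio between the two opportunity costs (0.5 and 2.0).
assert post_cost < human_cost
print(f"posthuman's theorem costs {post_cost} widgets, "
      f"human's costs {human_cost}: any ratio in between leaves both better off")
```
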
> High-quality information is the only *scarce* resource. One planet, one
> solar space station, provides enough energy for a posthuman think tank.
> The sensors will need some more but it will never get into the area of
> scarcity. This trend is visible even today. Look at the internet. OK, it
> needs some energy with all those computers and telephone lines, but it's
> a HUGE information storage, even without nanotechnology, and it doesn't
> take any considerable part of this planet's total energy.

Oh, yeah, all the computers and storage media devices will just
grow on trees. St. Drexler says so! (I hope that if Eric Drexler
reads this, he will forgive me for calling attention to a popular
misapprehension of his work, a misapprehension against which he
has worked and for which he is not to be blamed.) And yeah, the
"quality" will just happen magically. Never mind, Erik Moeller,
that you have no clue what the causes of quality are. If you do
have a clue as to what causes quality, perhaps you'd like to share
it with us.

> > When you give me an apple and I give you an orange, and this happens
> > voluntarily, then you have an orange (which you prefer to an apple)
> > and I have an apple (which I prefer to an orange). We both benefit,
> > positive sum game, and furthermore because our act (a transaction)
> > has infinitesimally affected the market price of apples and oranges,
> > we have anonymously transmitted some information to the world about
> > our relative preferences for apples and oranges. There's an
> > information-communication feat very difficult to pull off reliably
> > in an engineered system.

> Problem number one is, as I already said, that I think posthumans will
> only appear as collectives. If you assume that there is something like a
> "meaning of life", then all posthumans will probably search for it and
> act like a collective.

Well, your faith is very touching, Erik, but some of us have open
minds. It is entirely possible that I will be absorbed into some
delicious ubercollective in the next forty years, and I might even
decide that this is the right thing to do before it comes down to
that. But right now, I don't know, and right now, that is not to
my taste (in part because it does not accord with my theories about
how quality emerges, which I will keep to myself because I haven't
tested them for long enough yet), and I certainly wouldn't expect
to win any arguments by handing people my Article of Faith that
We're All Going to End Up in a Big Collective Anyway.

> Problem number two is that money is simply dispensable. If you assume
> that there aren't collectives but individuals, you must also accept that
> there is nothing like a "meaning of life" or that not everyone wants to
> search for it. In this case, every posthuman has different interests. It
> is likely, or "probably unavoidable", that those with the same interests
> will form collectives _again_.

I think every posthuman has different interests, and the thing that
I find particularly annoying about you is how determinedly you
throw yourself into the world trying to force everyone to have the
same interests. "Not everyone wants to search for it"... you have
hit the nail on the head, right there, Erik, but apparently you
don't realize this. Different people, even different postpeople,
have different values, and this is PERFECTLY OKAY. It's only a
problem when they start killing each other over their differences...
I think you and I agree on that. But trying to smoosh everyone's
values into one Big Collective Value just makes everyone desperately
afraid that the final Big Collective Value is not going to end up
looking even remotely like their current values, and then their
egos are terrified and they start fussing and fighting and blowing
things up. We are still just apes with our heads bloated with
absorbed partial computational results (culture) from the big
memetic evolutionary process. And we can do a hell of a lot of
wonderful things with that... but not if we pretend that it's not
so, or that we are somehow perfect already. You can't force people
to be bodhisattvas, especially not by taking away all their money.
They have to do it from their own hearts or not at all.

Money is a communications medium that helps to automatically and
nonviolently reconcile the differing, varied values of different
people. You can only dispense with it if you accept the Moeller
Dogma that Everyone Ought to Have the Same Values. I utterly reject
that Dogma at present, so if you want to talk me into believing
it, I invite you to get down to business. So far you've used that
dogma to defend your arguments, but you haven't defended the dogma
itself.

It would be a hell of a lot easier to explain money to people if
we still used chemical elements instead of gummint counterfeit,
because in a chemical-element currency system it's fairly easy to
see that money is just a special-case refinement and technical
improvement of ordinary barter. And barter is just a means by which
people can cooperate to use their varied resources to fulfill their
varied values.

> If you think a world with several posthuman individuals is likely,
> similar to today's world just with cleverer inhabitants, you must
> consider that other posthuman collectives might be worried about this.
> For those high-powered individuals might destroy information sources the
> collectives need. In this case, the individuals would be "assimilated".

Why is this concern of the collectives more legitimate than the
concern of the individuals that the high-powered collectives might
destroy information sources the individuals need? Your argument is
curiously one-sided.

> If posthumans leave the collective for a while to gather information
> separately, and they meet other posthumans, they will try to find out
> their interests, and if they are equal again, they will probably join.

> And for the exchange between several different collectives: that's the
> same as exchange between individuals. Assimilation or ignorance. Only
> in very few cases will it work like "I tell you this, you tell me that",
> or "Give me 10$ (!!!) and I will tell you this, for with the 10 $ I can
> buy that other information from the posthuman guy next door, positive
> sum game, but consider inflation, too".

I see a lot of assertions above, and a couple of nice just-so
stories, but I don't see anything that looks like deduction,
argument, or evidence. Am I supposed to believe this stuff just
because you say so? "They will probably do this, they will
probably do that, this other thing probably won't happen." Why
are you bothering to post such drivel?

Another interesting thing about this argument is that I'm not making
any claims about what sorts of things posthumans are likely to
value (except that they'll probably value more than JUST ONE THING),
whereas you seem to be claiming that not only do you know that
all posthumans will value JUST ONE THING, but also that you know
what that ONE THING is. A rather presumptuous claim, it seems to
me. It's entirely possible that posthumans will spend all their
time chasing after things you and I simply haven't got the
conceptual apparatus to even begin to imagine. Can we have a
little acknowledgement of that, please? To quote Dr. Evil,
"Throw me a bone here."

I mean, "I'm Erik Moeller, smartest collectivist uebermensch in
the galaxy. You people are all helpless-minded, like children."
might make you feel good, and it might even be true for all I know,
but it's certainly not going to persuade anybody on the Internet.

> And you bet that Microsoft would run around shooting and bombing
> (kidnapping would be too expensive) people in a "UIF" world. OK, that's
> exaggeration, but it is obvious that they would develop some nasty
> tricks to stop illegal copying.

I'm no fan of Microsoft's, but you're just making yourself look
like a paranoid 2600 reader, like that guy who was writing in to
say that the Federal government ought to take over the American
phone companies. Microsoft really don't care about illegal copying
unless it hurts sales, which means the only illegal copying they
worry about is illegal copying within large corporate and governmental
organizations (such as is rampant in China). Microsoft
are hardly going to be shooting or bombing punks like you and me
(except for the unfortunate few who are "made examples of"), and
large corporations and some governments respond to lawyers much
better than they do to troops. People who hack corporate systems
for a living (like Bill Gates does) understand this.

> The government is not the evil player in our world's game. It's those
> who are supporting it, the big industry and the banks, who are "evil" in
> the sense that their actions endanger mankind.

I think the things that endanger mankind are considerably more
complex than capsule summaries such as this one. Socrates and my
dad both claim that evil is illusory, and that it is only
ignorance that endangers mankind, and I haven't been able to
refute either one of them yet, although I do maintain my doubts.
You might like to know that the world is full of people who
immediately turn off their ears whenever they see someone on a
soapbox prating about good and evil (not that I'm one of them or
anything).

> In order to reach posthumanity [as soon as possible], we must avoid the
> destruction of our home planet.

Maybe. I would like to believe this is true. Expectation foils
perception. I am quite fond of our home planet, except for the
inconveniently intense gravitational field (which I understand was
necessary to get things started). I'm also not quite sure that
Reaching Posthumanity (quickly or otherwise) is the Main Thing,
since no one seems to agree on what that means, much to my
chagrin.

> I'm not saying that I don't want to use knowledge. Using knowledge is
> what I do every day. I say that any posthuman entity has no interest
> whatsoever in using knowledge, except for obtaining more knowledge (so
> the knowledge is just a BASIS [upon which new knowledge is placed] or a
> TOOL [for instance to build special external sensors], but never [or
> rarely] a CURRENCY).

Well, if knowledge is the only thing you value, then why are you
blathering instead of studying? I know that's a terribly snide
thing to say, but value-monists are always leaving themselves
open for remarks like that.

Oh, right, you're a *collectivist* value-monist, so you want to
make sure that *I* learn your knowledge too. Look at the little
memetic salmon rushing madly upstream, propagate, propagate,
propagate, MUST SPAWN! (Not that I disapprove of literary
ambitions... far from it. I was once there too. I just try to be
a little more leisurely and deliberate about it these days.)

Just please don't send me to room 101 for my indoctrination. I
actually like all the salmon, but I don't care for rats. Or
stool pigeons.

> > What you want is power to help the poor and suffering: stop
> > pretending otherwise. You mean to rule wisely and well, but you
> > mean to rule.

> I don't want to rule. I want the rulers to rule better.

Same damn thing.

I want the rulers to stop ruling, so people can learn how to rule
themselves. You want to keep them "helpless-minded like children"
(as you once asserted most people already are) so you can inculcate
them with your Values, which are, after all, Correct.

Eric Watt Forste ++ ++ expectation foils perception -pcd