RE: Socialism, Intelligence, and Posthumanity

Billy Brown
Tue, 12 Jan 1999 10:14:13 -0600

Eliezer S. Yudkowsky wrote:
> "Keith M. Elis" wrote:
> > How small would you say?
> Me, you, Greg Egan, Mitchell Porter, Billy Brown, Eric Watt Forste and
> Lee Daniel Crocker; possibly Anders Sandberg, Max More, Walter John
> Williams, Lawrence Watt-Evans, Carl Feynman and Socrates; and quite a
> few people I can't think of offhand. But almost certainly no
> more than twenty all told.

Thanks. I don't have any arguments with your list. I would put Eric Drexler on the 'possible' list - I don't know him well enough to form a firm opinion, but he seems to understand personal, conscious meme-pruning.

> I think it's because most people, and certainly almost all philosophers,
> can't keep track of what's real when they're operating at that level of
> abstraction.

Agreed. You can only build a chain of ungrounded symbols so long before the system starts spitting out garbage. IMO there are at least two limitations at work here: you can only handle so many layers of indirection, and you have very little working memory with which to perform mental operations. Since human memory and logic are fuzzy, overflowing either boundary simply produces nonsense output - and since people don't normally have error checking at this level, they usually don't realize they aren't making sense.
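As a loose analogy (a toy sketch, not a cognitive model - the symbols, names, and depth limit here are all invented for illustration), the failure mode looks like a resolver that follows chains of symbol definitions under a hard depth bound. With error checking, exceeding the bound fails loudly; without it, you'd silently get whatever was lying around:

```python
# Toy analogy: following chains of ungrounded symbols with a hard
# capacity limit. MAX_DEPTH stands in for limited working memory.
MAX_DEPTH = 4

def resolve(symbol, definitions, depth=0):
    """Follow symbol -> symbol definitions until something concrete.

    The depth check plays the role of "error checking at this level":
    without it, an overflowing chain would just return nonsense silently.
    """
    if depth > MAX_DEPTH:
        raise RecursionError(
            f"lost track after {MAX_DEPTH} layers of indirection")
    value = definitions[symbol]
    if value in definitions:          # still ungrounded: another symbol
        return resolve(value, definitions, depth + 1)
    return value                      # grounded in something concrete

# A short chain resolves fine within capacity.
defs = {"justice": "fairness", "fairness": "equal treatment"}
print(resolve("justice", defs))

# A long chain of indirection overflows the bound and fails loudly.
deep = {f"s{i}": f"s{i+1}" for i in range(10)}
deep["s10"] = "a concrete referent"
try:
    resolve("s0", deep)
except RecursionError as e:
    print(e)
```

The point of the sketch is only the shape of the failure: the chain itself is fine, but the resolver's capacity, not the chain, determines where the output stops being meaningful.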

> I can always keep track of the concrete consequences of
> any philosophical proposition; for me the words are only descriptions of
> my visualization and not the visualization itself. I can concretely
> define any term whatsoever, as it is actually used - in terms of our
> cognitive handling of the word, if necessary. And that's where
> virtually my entire facility with philosophy comes from. When I talk
> about "objective morality", I am talking about an amorphous
> (unknown-to-us) physical object which possesses an approximate
> correspondence to our cognitive goals and which motivates
> superintelligences, not "objective morality".

I would presume you still have the limited-working-memory problem when working with complex idea systems. If not, please explain - I've been trying to generalize around that one for years now.

> The "mortals", not the "gaussians". I'm a Specialist, not a gaussian; I
> possess some slight variance relative to the unmodified human race, but
> it is not significant on a galactic or supragalactic scale. I am well
> within the mortal side of any mortal/Power duality. If it comes down to
> Us or Them, I'm with Them, but it would be megalomaniac to expect this
> to result in any sort of preferential treatment.

Unless they use rationality as the criterion dividing people from animals. Then people who fit their definition of rational, within the limits of available processing power, might survive a general extermination. Not a likely scenario, but not impossibly unlikely, either.

Billy Brown, MCSE+I