Re: Why the future won't accommodate everyone!

From: Michael S. Lorrey (mike@datamann.com)
Date: Fri Mar 17 2000 - 11:23:35 MST


Sarah Marr wrote:

> Mike Lorrey wrote, in reply to my post:
>
> > They own it [stock] whether they know it or not...
>
> Perhaps, but I don't think that indirect ownership acts to effect the
> suggested benefits of the original post; i.e. it doesn't provide freedom
> from governmental interference.

True, however I think that as more people become better informed and independent
minded, they will start to understand what is really going on: other people are
making stock market profits on their money. The promise of reduced risk through
separating the person from actual ownership of stock is a false one, because if
the true stockholder goes belly up, the government bails them out, and we are the
ones who pay the tab anyway. Granted, trained individuals using expert methods,
including the Black-Scholes equation, can reduce the risk quite a bit, but with
smart stock-shopping software such expertise should be built in... so the home
stock owner can enjoy the same risk-reduction methods.
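For anyone curious what "building in" Black-Scholes expertise amounts to, here is a minimal sketch of the standard closed-form Black-Scholes price for a European call option. The parameter values in the example are hypothetical, chosen only for illustration:

```python
# Minimal Black-Scholes European call pricer -- a sketch, not trading advice.
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    """Standard normal CDF, expressed via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S: float, K: float, T: float, r: float, sigma: float) -> float:
    """Price of a European call: spot S, strike K, time to maturity T (years),
    risk-free rate r, volatility sigma (both annualized)."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Hypothetical inputs: at-the-money call, one year out, 5% rate, 20% vol.
print(round(black_scholes_call(100, 100, 1.0, 0.05, 0.2), 2))  # prints 10.45
```

A home-investing program could run this same formula over a whole portfolio, which is the kind of "expert method baked into software" the paragraph above is imagining.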

> > Since increased artificial mind augmentation will add increased
> > logical abilities to
> > the average person, then the amount of illogical reasoning will
> > by definition
> > decline...
>
> That doesn't follow, Mike. Increased logical reasoning does not
> automatically decrease illogical reasoning. For logical reasoning to 'win
> out' it has to be considered favourable to that of illogical reasoning, and
> there's no guarantee that augmentation will or, indeed, should, bring that
> about.

No guarantee, but I would be surprised if it did not come about. As far as I can
tell, most educated people tend to act logically more frequently than uneducated
people do. Making it easier for people to learn, through augmentation, should
increase this effect.

> > So long as productivity growth and resource availability increase
> > faster than
> > population growth, then we remain on the track toward relatively
> > unlimited resources.
>
> But not to unrestricted and equal access to those resources for all, which
> was my point.

However, there is no such thing as unrestricted and equal access. A person's own
sense of utility acts as a natural restraint upon unlimited consumption. I may be
able to buy as much Guinness as is available, but there is only so much Guinness
I need, want, or can consume in a given amount of time. This, coupled with the
fact that each new generation of technology uses resources more efficiently,
indicates to me that we are on the curve toward 'practical' unlimited wealth for
all (not total unlimited wealth).

> > The market will decide what is good and what is bad.
>
> No, it won't. It will merely decide what the majority consider good and bad,
> as it does now. You can't claim universal 'goodness' or 'badness' for
> something.

Well, you can, but that doesn't make it true. However, as I have argued in the
past, it's my opinion that the laws of nature, physical and biological, impose a
set of game rules on living in this universe. Using game theory, it should be
possible to figure out which strategies are 'good' and which are 'bad', given
agreed-upon goals. Deciding what the goals are is the real question. Given an
agreed-upon goal, there will be a set of 'good' and 'bad' things within the
gamespace.
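The idea above can be made concrete with the textbook iterated prisoner's dilemma. This toy sketch (standard payoffs, standard strategy names, nothing here is from the original post) takes "total payoff" as the agreed-upon goal and ranks a cooperative strategy against a pure-domination one:

```python
# Toy iterated prisoner's dilemma: rank strategies against an agreed goal
# (total payoff). Standard textbook payoff matrix: C = cooperate, D = defect.
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(opponent_history):
    """Cooperate first, then mirror the opponent's last move."""
    return opponent_history[-1] if opponent_history else 'C'

def always_defect(opponent_history):
    """Pure power-domination play: defect every round."""
    return 'D'

def play(s1, s2, rounds=200):
    """Run an iterated match and return each strategy's total score."""
    h1, h2, score1, score2 = [], [], 0, 0
    for _ in range(rounds):
        m1, m2 = s1(h2), s2(h1)   # each strategy sees the other's history
        p1, p2 = PAYOFF[(m1, m2)]
        score1, score2 = score1 + p1, score2 + p2
        h1.append(m1)
        h2.append(m2)
    return score1, score2
```

Over repeated play, mutual tit-for-tat earns 3 points a round while always-defect drags both players down to 1, so the defector's early wins don't compensate — a small formal version of the claim that domination strategies lose in the long run.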

> > Individuals,
> > as they gain more
> > logical, computational abilities, will understand more and more
> > that power domination
> > strategies are not beneficial to one's true self interest.
>
> Why wouldn't they be? Note, I'm not asking why _are_ they in one's self
> interest, I'm asking why they are actively _against_ that best interest.

Because everyone has a self interest, and individuals who play power games may
win for a short period of time, but the stats show that in the end they typically
end up losing, especially as they become more radical in their domination,
because they just piss off too many other people whose self interests are
violated. I read of a study last year showing that the chance that a national
leader will be assassinated is something like 30-40 percent. I think stats like
that show there is a limit to how much power games can contribute to an
individual's true self interest.

> > > Oh, I don't know. Perhaps 'guilt' and 'loneliness' will
> > be considered
> > > negative, 'bad', emotions, and someone will have decided to
> > wire them out,
> > > as you suggested.
> >
> > Anyone who thinks that eliminating emotions is a good thing has
> > some issues to work
> > out. Being transhuman does not mean less human, it means more
> > human, better human. I
> > don't see how you can acheive this by subtraction.
>
> I agree with you one hundred percent: that was the point of my comment.
>
> > > We are not about to overcome all future suffering, and
> > the removal of
> > > death, and resurrection of the dead, will not end all evil.
> >
> > If our projections of the future are reasonably accurate...
> <snip>
> > I estimate that this means about a 90% decrease in the fatality rate.
>
> Sorry, my comment wasn't clear. I should have written, "We are not about to
> overcome all future suffering. Furthermore, the removal of death, and
> resurrection of the dead, will not end all evil." I do think removal of
> death is entirely possible, but that won't end evil or suffering. Why should
> it?

True. Some suffering is good, educational, even beneficial in the end. As in my
previous comment, those who don't think so have some issues to deal with...

> > Nobody who is not frozen can truly be resurrected...
>
> I agree.
>
> > Many, but not all people, are happy to die. Many, but not all
> > people are not happy with
> > their lives... see a resemblance?
>
> Yes, but not a one-to-one mapping by any means.

True, but I would not be surprised if there were a significant correlation.

> > However, there are things that
> > all sane people do
> > agree on. Those things can be defined as 'good'.
>
> There are very few things that _all_ sane people agree on, and you are in
> grave danger of creating a circular definition of sanity, whereby those who
> are to be considered sane are those who agree on certain things.

Good point. Who gets to choose the base set?

> > Also, due to evolutionary fitness,
> > there are things which are more good or more bad than others.
>
> This is, at best, a restrictive definition of 'good' and 'bad'. "Good" and
> "bad" go far beyond evolutionary fitness. For an assessment of something's
> being 'good' or 'bad' to rest on its evolutionary value all other
> considerations (cultural, societal, moral...) would have to be indeterminate
> or ignored. For me, that's a 'bad' thing.

I don't think they would be ignored, because those things are evolved things as
well.

--
TANSTAAFL!!!

Michael S. Lorrey
Member, Extropy Institute http://www.extropy.org
Member, National Rifle Association http://www.nra.org
"Live Free or Die, Death is not the Worst of Evils."
 - General John Stark



This archive was generated by hypermail 2b29 : Thu Jul 27 2000 - 14:05:34 MDT