Calculating a personal long term good

From: Adrian Tymes
Date: Sun Apr 09 2000 - 17:00:32 MDT

Robert Bradbury wrote:
> AAHEMMM... Please note the subject line "Nanotech Restrictions" *NOT*
> "Cuban-American" Politics and *NOT* "WACO (yet again)".

Well...technically, both sides of the thread were/are off-topic, then.

> Somedays I wonder if we will ever get to soar like the eagles when
> members of the tribe display repeated patterns of attempting takeoffs
> with bricks attached to their feet.

That's actually one of the reasons I began tinkering with enlightened
greed. If I can soar under my own power, but I would never get off the
ground if I tried to take said members with me...maybe seeing me flying
would encourage them to take the bricks off?

> On Sat, 8 Apr 2000, Harvey Newstrom wrote:
> > "Adrian Tymes" <> wrote on Saturday, April 08, 2000 3:38
> > PM,
> > > I call it "enlightened greed", and it is driven by the principle that
> > > the correct action is that which will benefit me most in the long term.
> > >
> > > The important thing is that nowhere in the decision is there
> > > consideration of what would be good or bad for anybody else,
> >
> > Wouldn't this system lead to mass criminal behavior? If I am a skilled
> > hacker, then I can make money easier by stealing it than by working as a
> > consultant. According to your system, that would be best for me. Does your
> > system lead to this conclusion?
> Well, if as a result of criminal behavior, you get imprisoned, presumably
> there is a real hit to your long term benefit. So, unless you come up
> with the foolproof perfect crime, enlightened self-interest dictates
> a fair amount of other-interest.

That's about the only explanation I can think of for why I'm seeing the
results that I do. Of course, it does require one to be intellectually
honest - stealing an extra couple hundred bucks to spend on a dose of
cocaine, or getting an orgasm and hearing a rape victim's cries, are not
worth the consequent jail terms.

> If you are going to live 2000 years, sooner or later you are going to
> have to deal with cleaning up the mess of your misspent youth
> (e.g. pollution). So a *real* "long-term" perspective includes an
> accounting for the commons, a desire to utilize and allocate resources
> most efficiently, and a desire to uplift others to your level (because
> it's going to get damn boring talking to 10^10 copies of yourself), etc.

Uplifting others goes beyond the boredom. If you can uplift someone so
they can do a task more efficiently, and their doing that task is of aid
to you, and they wish to do that task anyway, then uplifting that person
has a much more direct payback.

An example from my work: I was hired to create, among other things,
software tools for use by other employees. However, some of these
employees ask for or can benefit from so many tools, and typically have
a better internal model of what they need than they can communicate,
that it is more efficient to teach them the basics of programming, to
the point where they can create the tools themselves - the tools being
simple automation of manual tasks, not advanced computer science. They
can then create a tool whenever they want, faster than it would take
them to decide it might be worth asking me, ask me (the development
time for some of these being practically nil), and then adjust to any
differences between how I perceive the data they work with and how they
perceive it. Their products also tend to be more useful to them, having
been crafted with a better sense of how they will be used, and they get
a better sense of what can be automated, which has directly led to more
of their work being automated - which leads to better productivity,
which means more clients served, which leads to the value of my stock
in the company going up (even after accounting for any pay raises they
may request for having obtained this skill). It also leaves me time to
do other things; the payoff usually takes less than a month.
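The sort of tool in question really is tiny. A hypothetical sketch (the
filenames, column names, and task are invented for illustration, not
taken from my actual workplace) might look like:

```python
# Hypothetical "simple automation of a manual task": filter a report
# down to the rows one person actually needs, instead of deleting rows
# by hand in a spreadsheet every morning.
import csv

def filter_report(src, dst, column, wanted):
    """Copy rows from CSV file src to dst, keeping only rows where
    row[column] == wanted."""
    with open(src, newline="") as fin, open(dst, "w", newline="") as fout:
        reader = csv.DictReader(fin)
        writer = csv.DictWriter(fout, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            if row[column] == wanted:
                writer.writerow(row)
```

Ten minutes of teaching someone loops and file handles pays for itself
the first week they stop doing this by hand.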

> The whole nano-terrorist perspective has problems because it's applying
> current "scarcity" economics to a post-scarcity environment (unless
> one invokes the making-10^10-copies-of-oneself caveat). The only
> terrorist situation I see that makes any sense is an "irrational"
> scenario where individuals are motivated by beliefs and not by needs.

Though there are plenty of those around. But is it easy to learn GNR
when your mindset puts your beliefs ahead of rationality? Genetics
alone insists that there is not an absolute difference between humans
and other primates. Those who insist that Deity created humans from mud
would reject that, and be unable to proceed.

> For example, Stephenson's "The Diamond Age" provides no explanation
> as to why the benefits of the technologies should not be available
> to everyone. It implicitly assumes there are some limits on energy
> or mass (for an approx. current Earth level population) and that simply
> *is not* true.

For those who know enough to take advantage of future tech, there is
potential for boundless resources, given time. But at any specific date
in the future, there will always be a finite number of people aware of
the realities of nano, and a finite amount of time between now and then.
Suppose that, starting tomorrow and for the next year, I operate a
mobile food production factory that can produce, each day, enough food
to feed up to ~1 billion people for a year, give away as much output as
people care to haul away, *and* take it on a tour that visits every
population center in North America with at least ten human beings.
There would still likely be people starving in North America at the end
of that year, unaware of my service. And that is on what is probably
the most media-saturated continent on Earth, with the most resources
that would act independently of me to get the word out to everyone
about such a thing.
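To put a rough number on the tour itself (the count of population
centers below is a figure invented purely for illustration, not a
census number):

```python
# Back-of-envelope sketch of why the tour alone is the bottleneck.
# "centers" is an assumed, illustrative count of North American
# population centers with at least ten people - not real data.
centers = 50_000          # assumed number of qualifying population centers
days = 365                # length of the tour

stops_per_day = centers / days
minutes_per_stop = 24 * 60 / stops_per_day

print(f"{stops_per_day:.0f} stops/day, {minutes_per_stop:.1f} min/stop")
# → 137 stops/day, 10.5 min/stop
```

Even before anyone hears about the service, the schedule leaves about
ten minutes per stop, around the clock, with zero travel time budgeted.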

Unlimited production does not mean immediately infinite supply, no
matter how much we wish it were otherwise.

This archive was generated by hypermail 2b29 : Thu Jul 27 2000 - 14:09:11 MDT