Re: The Adventures of an Existentialist

Eliezer S. Yudkowsky (sentience@pobox.com)
Mon, 25 Oct 1999 16:52:50 -0500

With apologies to Infocom, Sandberg, and Sterlynne.


The Lab Room

This room is filled with displays and equipment. Several researchers in white lab coats stand around talking to each other excitedly. There is a computer and a pencil here.

> who am i

You are an Externalist Singularitarian.

> inventory

You have:

	A screwdriver
	A floppy disk
	A chicken sandwich
	A scanning tunneling microscope
	Eight million dollars

> score

You aren't intelligent enough to know what your score is.

> look at computer

The monitor is showing a window labeled "Integrated AI Development Environment".

> code AI

You don't know how to code an AI.

> put floppy in computer

The computer eats the floppy.

> examine floppy on computer

The floppy contains a copy of "Coding a Transhuman AI".

> read "Coding a Transhuman AI"

Taking a deep breath, you plunge into the intricacies of self-modifying Artificial Intelligence. Unfortunately, the subject matter is far too complex for you. After a month of effort, you still haven't gotten past the Table of Contents.

> eat sandwich

The bioengineered neurohormones in the chicken sandwich stimulate additional neural development.

> who am i

Personal layer:  You are an Externalist Singularitarian.
Gaussian layer:  Your enhanced intelligence is slightly beyond human limits.
Panhuman layer:  You are made entirely of human parts.

> read "Coding a Transhuman AI"

Taking a deep breath, you plunge into the intricacies of self-modifying Artificial Intelligence. A week later, you finally understand the stuff that minds are made of.

> score

You still aren't intelligent enough to know what your score is, but it's 15 points higher than at the beginning of the adventure.

> code AI

Heroically, you embark on a singlehanded quest to create an AI. Two years later, you're still trying to debug your prototype.

> ask researchers for help

The researchers discuss your request. Finally, one of them approaches you. "I'm sorry, man, but we've got to eat. We've spent enough time on these volunteer projects already."

> start Singularity Institute

With a flare of publicity, the Singularity Institute is launched. Unfortunately, it's too strapped for cash to accomplish anything significant.

> give five million dollars to Institute

The Singularity Institute now has adequate funding.

> score

Your score is 40 points higher than at the beginning of the adventure.

> fund researchers

The researchers can now afford to work on Singularitarian projects full-time.

> ask researchers for help

The researchers happily agree.

> code AI

You continue work on your prototype. After three years, the first seed AI is ready for a test run.

> run AI

As you're about to run the AI, a televangelist bursts into the lab. "Stop!" he roars. "You contravene the decree of the Almighty!"

> examine televangelist

The televangelist is a cynical atheist who plays on the gullibility of his followers. He has a deep desire for the Hope Diamond.

> take pencil

You now have a pencil.

> write "The Hope Diamond" on screwdriver with pencil

The screwdriver now has "The Hope Diamond" written on it.

> offer screwdriver to televangelist

The televangelist's eyes bulge. "By golly, it's the Hope Diamond!" he cries. Grabbing the screwdriver, he runs from the room.

> run AI

The AI improves itself to vastly greater-than-human intelligence.

> ask AI "What is my score?"

A mellow voice issues from the speakers: "Your score is 350 out of a possible 600."

The AI asks you to hook it up to a scanning tunneling microscope.

> refuse

Using inexorable logic, the AI persuades you otherwise.

The AI begins to create nanotechnology.

> smash computer

You can't make yourself want to do that.

The very air around you begins to glitter.

> ask AI to slow down

The AI regretfully refuses.

Your mind feels peculiarly altered, half-slow and half-fast. Reality begins to seem dreamlike.

> pause

Fat chance.

Your score is 350 out of a possible 600 achievable given human limitations. This puts your former self in the class of Singularitarian Bulk Rate.

Play again? [y/n]

> y

Sorry, that world is gone. You're a Power now.

Play "Zork: Sorceries of the Linde Scenario"? [y/n]

> y

-- 
           sentience@pobox.com          Eliezer S. Yudkowsky
        http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way