Re: seti@home WILL NOT WORK

Eliezer S. Yudkowsky (sentience@pobox.com)
Wed, 07 Jul 1999 22:12:48 -0500

Brian Atkins wrote:
>
> Wrong. You actually have no way of knowing whether the SETI
> effort will succeed or not; this is just your not-so-humble
> opinion. And don't forget that this project has other goals
> including getting the general public more interested in
> space. As for RC5, THAT is a complete waste of time. We
> already all agree on what currently makes for a "secure"
> level of encryption. Wasting your CPU time on cracking yet
> another stupid RSA message is the height of boredom.

Untrue. RC5 has a higher chance of success than SETI@home, and success at the stated goal is very important for encouraging further distributed projects, something I take a tremendous interest in. I don't care whether or not the code gets cracked, but I do care a *lot* about whether distributed.net succeeds. I'm planning to use it for something.

> > A 10^40th to 10^50th Instructions Per Second SuperIntelligence
> > (compared with a 10^12 to 10^14 IPS human { leaving out those
> > on the SETI@home lists where I believe there may be a "-" sign
> > in front of the "10" :-)} ) that has the capacity of building
> > 100 billion telescopes (or radio transmitter/receiver dishes)
> > the diameter of the moon and has an internal communication bandwidth
> > so large I can't even begin to compare it to a human brain
> > *ISN'T GOING TO COMMUNICATE WITH US*! Paraphrasing what I said
> > at the Extro3 conference -- "We don't talk to nematodes and SIs
> > don't talk to us" (we are closer to a nematode than we are to
> > a SI!).
>
> Well the fact of the matter is: you can't really say for sure
> what a SI will want to do with its time. You can blather on
> all day, but it's all a guess. In fact you can't even be
> completely sure that there are even any SIs in this universe.

Precisely. Why do you think the disparity between our civilizations implies that Powers wouldn't want to communicate with us? Because it's beneath their dignity? They don't HAVE dignity - they're Powers, for crying out loud! The question is simply whether a Power has any interest whatsoever in the outcome of a given civilization, perhaps including whether that civilization becomes a Power. Better hope there's no competition Out There...

Say, *that's* an interesting thought for the rabid subjective-morality guys on the list: Shut down that SETI! No! It's too late! The signal has been detected and the Blight is running on the 'Net! There's a story here.

> > We don't begin to become interesting until the post-singularity
> > era arrives.

I don't think you've quite understood the point of the SETI project - the main idea is to intercept extraterrestrial signals, not necessarily to receive signals sent specifically to us. On the other hand, it is trivially true that bandwidth between stars (on the bands we can detect) is scarce enough that data compression would make any intercepted message indistinguishable from noise - an optimally compressed stream has no redundancy left for a third party to latch onto. So our only real hope is to intercept signals aimed at mortal civilizations; still, the point stands that SETI doesn't have to assume the signals were addressed to us in particular.
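
(To make the compression point concrete, here's a minimal sketch - ordinary Python, nothing SETI actually runs, and the sample "beacon" text is invented purely for illustration: the more thoroughly a message is compressed, the closer its byte entropy climbs toward the random-noise ceiling, so an eavesdropper without the codec sees something that looks statistically like static.)

    import math, os, zlib
    from collections import Counter

    def bits_per_byte(data):
        """Shannon entropy of a byte stream, in bits per byte (max 8)."""
        n = len(data)
        return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

    # Hypothetical plaintext beacon, repeated so the compressor has work to do.
    message = (b"CQ CQ CQ DE SOL III. We are a young civilization orbiting "
               b"an ordinary G-type star. ") * 200

    print("plaintext:  %.2f bits/byte" % bits_per_byte(message))
    print("compressed: %.2f bits/byte" % bits_per_byte(zlib.compress(message, 9)))
    print("noise:      %.2f bits/byte" % bits_per_byte(os.urandom(len(message))))
    # English text comes in well under the 8 bits/byte ceiling; the compressed
    # stream lands much closer to the random-noise baseline, which is exactly
    # the problem for a receiver that doesn't already share the codec.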

Part two: You have not proved that Powers are the only guys taking up communications bandwidth. Even if the odds are a thousand to one against a civilization being able to broadcast between galaxies pre-Singularity, that's still plenty of civilizations.

And part three, as stated: Powers don't have dignity or social status; why shouldn't they want to influence us?

The real reason SETI doesn't work on Powers is that if they wanted to influence mortal cultures, they'd send out nanotech Von Neumann helpers. Or something more advanced, but you get the idea.

Even so, SETI may still be worth it, just on the chance of picking up some Qeng Ho transmissions from a race that delayed. It's just extremely improbable.

> > Until that time, perhaps the best they could do
> > is guide us towards that point (as lightly as possible), making
> > sure that we don't get too depressed beforehand by discovering
> > them and plunge like lemmings into the sea when we realize
> > how very far down on the evolutionary tree we actually are.

Here we go again on the "H. P. Lovecraft - E. E. Smith" cultural myth that knowing the awful immensity of the Real Picture will blast our brains. It hasn't blasted yours, and it hasn't blasted mine, so why worry?

-- 
           sentience@pobox.com          Eliezer S. Yudkowsky
        http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way