Re: [isml] Making HAL Your Pal (fwd)

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Apr 24 2001 - 12:31:24 MDT


Matthew Gingell wrote:
>
> This is generally called a "macro preprocessor." Try not to put your
> eye out.

Matt, since I do not have a complete Flare document at this time, I
can only convey information about Flare insofar as you are willing to
assume that whatever small statements I have time to make are not stupid.
I know what a macro preprocessor is.

> Then unsubscribe, do some work, and accomplish something that can
> speak for itself. Otherwise, if you want to keep playing visionary,
> get used to answering questions about the vision.

I am quite happy to answer questions about "Friendly AI", a vision which
has now been written up. I will be quite happy to answer questions about
Flare when Flare has been written up. *You* raised the subject of Flare
again. Not me. I was talking about Friendly AI, which has now been
written up, when you switched the subject to Flare. Do you see what I
mean here? Writing up a vision takes time. So can we please stop
discussing the inadequate documentation of visions whose writeups are
still on the "To Do" list and discuss one of the visions I've had time to
document? Say, causal goal systems with Bayesian reinforcement subsuming
some of the functionality of pain and pleasure, or feature extraction for
linear and temporal sensory modalities?

Considering that I was right in the middle of announcing Friendly AI when
this happened, do I have any reason to believe that you won't switch from
complaining about Flare to complaining about the documentation of the
codic sensory modality if I *did* drop everything and work on Flare? The
fact is that I'm on a big task, and the pieces of the overall structure
I've described are being documented one by one. There will always be
plenty of stuff I haven't gotten to yet. If you want to discuss details,
I suggest you focus on what *has* been documented, instead of demanding
that I change my schedule.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
