david gobel wrote:
> This list frustrates me. It feels like the sailors on the admiral's flagship
> discussing geopolitics in the mess. Good ideas, but the Admiral makes all
> the calls, and he NEVER goes to the enlisted man's mess.
The thing is, this list isn't very well suited to serious discussion. A few years ago, when I first came on, things were better. Now, most of the serious guys have talked all the real subjects to death and left. I'm really just here to read Sandberg's weekly updates, the other news that gets posted, and a few favorite list members who've hung around or just arrived.
This list isn't being used to discuss immediate choices. It's being used to debate the outcome and ideology of futuristic scenarios. It has attracted listeners and posters who are interested in that particular debate.
If I were doing anything serious - say, writing a site that introduced a hundred unique visitors to the Singularity each day, or gathering data on their reactions - I wouldn't say so here. Instead of any useful feedback, I'd get a dozen people screaming because I didn't conform to their version of the dogma. Ideology, not choices.
Likewise, I'm not asking any of the questions I'm interested in discussing - whether publicizing the Singularity is a good idea to begin with, how much work it takes to start a foundation, and so on - because I wouldn't get any useful answers. After a couple of attempts at getting useful advice out of this list, I gave up. The last time I even reported on what I was doing was during the Singularity discussion six months back. This list has evolved for the discussion of futuristic scenarios, and now even that seems to be pretty much played out.
As for the projects that need doing - why bother? Is there anyone here with the money and the will to fund a project that needs doing? If so, I haven't seen it. I haven't seen any sort of practical, intelligent discussion on this list, or discussion of business issues, which is why I haven't brought up those questions. I certainly don't need anyone else's permission to work on those projects I feel need doing, and since it doesn't seem likely that I'll get any help, why ask?
If anyone wants to sponsor a list solely for discussion of immediate choices faced, from which all discussion of ideology would be excluded - and restricted (although not moderated), so that people can discuss sensitive subjects - I'm for it.
> So...what have you DONE...or are ACTUALLY DOING - to move the curves to the
> right?! Please excite me!
Usually I work with Outliers - AIs, neurohacks, and the like. I haven't spent much time pushing at the curve, unless you count trying to "turn people on" to the Singularity. The last time I tried anything like what you're doing was about three or four years ago, when I published an article called "Why Schools Don't Work", which I've since deleted from my Web site. I'll email it to you, if you like.
What have I done?
I've published about 450K on cognitive engineering, 350K in "Coding a Transhuman AI" (CaTAI) and 100K in the "Algernon's Law" guide to neurohacking. As far as I know, these are the first serious, workable proposals for creating an AI that can go "all the way" and for neurosurgically enhancing human intelligence - although only the former is likely to be permitted by the law.
If anyone wants to turn CaTAI into a foundation, they're welcome to call me about it. Right now, I continue to work on it, unpaid, whenever I can spare an hour.
In the past week or so, I've gotten a dozen people excited enough about the Singularity to state that they've "decided to devote their lives to ultratechnology", while almost a score more have cried "Blasphemy!"
As for personal improvement, I'm sorry, but personal improvement doesn't count as an accomplishment - except, of course, personally.
--
email@example.com  Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/singul_arity.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.