Re: Neurohacking

jmcasey@pacific.net.sg
Mon, 9 Aug 1999 13:57 +0000

>Mostly about reallocating neurons from one ability to another - robbing
>Peter to pay Paul - by placing the "Paul" abilities under constant
>stimulation, by hijacking the (hopefully localizable) interface between
>emotions and intelligence, by modifying off-the-shelf brain-stimulation
>devices originally developed for stuff like epilepsy.

The first part of this sentence makes it sound like you don't believe there are enough neurons to go around. "Reallocating neurons" sounds a lot to me like "learning."

It also sounds like you don't think emotions are a good thing. Might not emotions, properly trained, be an incredibly powerful parallel processor that we as a race simply don't use properly?

>I'm talking about neurosurgery.
>
>See http://pobox.com/~sentience/algernon.html

I'll look this up next time I'm online. Meanwhile, while I certainly see the value in surgery for "remedial" purposes, I presume you also have "generative" ideas in mind?

Or are you talking about taking the parallel-processing capabilities implicit in the emotional brain and using them for intellectual problem-solving activities? If so, I agree there's value in this, though I still don't see the need for surgery.

>There's a difference between software and hardware optimization. There
>are things you can do with hardware that you just can't do with software.

Perhaps you can name a few? I don't mean this to sound facetious, because I've discovered at least one way to overcome almost every perceived nervous system limitation I've ever come across. But I imagine my goals and yours differ.

> Q: are we evolved enough for this technology?
>(1) Of course not; we aren't evolved enough to not grow tobacco plants.
>I believe the phrase is "necessary risk".

Who makes that decision, then? I read "conflict of interest" all over this.

On deeper reflection, I decided "evolved" wasn't really the word I was after anyway. I was trying to suggest that most people would end up using neural enhancement for entertainment, or for their own gain, and potentially to the detriment of society (I've another post up on this, so I won't repeat myself).

But then, if we restrict the technology, who decides who gets to use it and who doesn't? For that matter, who decides who decides? Suddenly we're hard up against libertarian principles -- unless the market is compensated for the risk that the man-machine might harm it, it will oppose his creation. That is, unless the spooks fund it for their own reasons, in which case we're on very iffy ground in terms of libertarianism and externalities.

>(2) You'd better hope I'm evolved enough...

I wouldn't presume to know based on a few dozen text strings.

My criteria would include:

--can you go for half an hour without a single mental image or word of internal dialogue entering your head?
--can you voluntarily slow your heartbeat to 50 beats a minute or speed it up to 100 while remaining seated?
--can you zoom your vision by will alone?
--can you learn a new language (other than Romance languages) in 72 hours or less?
--can you neutralise, or simulate, the effects of a powerful hallucinogen at will?

Not a complete list, and not a list that demands mental adepts. But I'd no more trust someone who can't do these things with enhanced neural hardware than I'd trust a child with a chainsaw.

My basic disagreement with the line of reasoning surrounding uploading, hardware modification and the like goes right to the fundaments of what we mean by evolution. In all the history of life on this planet, species have evolved by adding modifications onto an existing configuration -- never, or only rarely, by subtracting from it. Brain structure, skeletal structure and embryonic development, to name but three, attest to this. To the extent that we excise functions from our repertoire, even to add functions we (in our imperfect wisdom) believe are superior, we become potentially unstable. And if we ever reach the point where we can reproduce people with these unstable properties, we risk catastrophe.

jmc