Billy Brown writes:
> [the best point of uploading is that you can repair damage at data level]
> Hmm. Not exactly. At an absolute minimum level of understanding, you need
I was obviously being unclear. What I meant was: Drexlerian nanotechnology might be infeasible ('advocatus diaboli, at your service'), and a destructive scan of vitrified tissue plus reconstruction in situ, including retractable fractal scaffolding for rapid rewarming, certainly spells out a shitload of hairy technologies. With uploading, all you need is some kind of parallel MEMS cryomicrotome/scanning stage and an emulation computer made from molecular circuitry (nanotechnology made via the macromolecular autoassembly paradigm is sufficiently powerful for that), plus some algorithms for filtering/segmentation/feature extraction and simulation. There is a lot less nasty physics laying giant boulders in your path. I doubt anybody will ever be resurrected in the flesh; it is simply too expensive, particularly in a future world where biocompatible habitats would be fully synthetic.
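To make the division of labour concrete, here's a throwaway sketch of the data flow I have in mind (Python pseudocode; every stage name and data shape is mine, and each stub stands in for a large unsolved problem):

    def remove_vitrification_artefacts(volume):
        # stand-in for the artefact-removal filters; here it just passes data through
        return volume

    def segment_and_trace(volume):
        # stand-in for segmentation/tracing; would return neurite geometry and synapses
        return {"neurites": [], "synapses": []}

    def distill_state(traced):
        # stand-in for feature extraction into the higher-order representation
        return {"compartments": [], "weights": []}

    def upload(scanned_volume):
        clean  = remove_vitrification_artefacts(scanned_volume)
        traced = segment_and_trace(clean)
        return distill_state(traced)   # this is what gets downloaded into the emulation hardware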
> a simulation program that can duplicate the interaction of all the molecules
> in the brain. That requires a very sophisticated program incorporating
You almost certainly don't need MD level of detail. You might need that at the beginning, as a point of departure for your molecular or near-molecular tissue map, a transient means to distill your custom higher-order representation, perhaps tailored to the optimal ('omega') hardware I mentioned ( http://www.ai.mit.edu/~mpf/Nano97/abstract.html ), but there is a hierarchy of higher-level models: compartmental modelling, spiking codes, Darwin amongst cortex columns, whatever (please consult your resident genie from Thule for a less sketchy explanation). I'm almost certain the compartmental level is enough, and it is a far cry from MD level of detail.
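To be concrete about what 'compartmental level' means here, a toy two-compartment membrane patch stepped with forward Euler (all constants invented purely for illustration, nothing like a fitted model):

    DT   = 1e-4     # integration step [s]
    C_M  = 1e-10    # membrane capacitance per compartment [F]
    G_L  = 1e-8     # leak conductance [S]
    E_L  = -0.070   # leak reversal potential [V]
    G_AX = 5e-9     # axial coupling between the two compartments [S]

    def step(v_soma, v_dend, i_inj):
        # leak current plus axial current from the neighbouring compartment
        dv_s = (G_L * (E_L - v_soma) + G_AX * (v_dend - v_soma) + i_inj) / C_M
        dv_d = (G_L * (E_L - v_dend) + G_AX * (v_soma - v_dend)) / C_M
        return v_soma + DT * dv_s, v_dend + DT * dv_d

    v_s, v_d = E_L, E_L
    for _ in range(int(0.1 / DT)):                 # 100 ms of simulated time
        v_s, v_d = step(v_s, v_d, i_inj=2e-11)     # 20 pA into the "soma"
    print(v_s, v_d)                                # both settle at a depolarized steady state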
> pretty much everything we now know about chemistry and non-relativistic
> physics, with some special corrections in the areas where quantum effects
> creep in. That is perfectly doable, of course, but it isn't just a 'data
> filter'.
I was referring to the vitrification artefact removal and segmentation/tracing stage. These are indeed just some (though pretty damn smart) filters. The simulation stage is of course entirely different: you download the distilled state pattern as it emerges from the above filter pipeline into dedicated molecular hardware (probably mimicking some relatively abstract, highly anisotropic, complex excitable medium with diffusible gradients), and then pull the big switch, causing the cells to tick on as choreographed by the molecular hardware incarnation of your system Hamiltonian. In a sense, one could describe the mapping from your sensorics to your motorics vector as a very smart filter, but that would be stretching the analogy a bit too far (lots and lots and lots of hidden state in between). As to MD, that's supposed to be my academic area of expertise. It is indeed nothing like a data filter.
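For flavour, the kind of 'complex excitable medium with diffusible gradients' I mean, reduced to a toy: a 1D FitzHugh-Nagumo reaction-diffusion line stepped with forward Euler (parameters purely illustrative; the real substrate would be molecular hardware, not a numpy array):

    import numpy as np

    N, DT, DX = 200, 0.05, 1.0
    A, B, EPS, D = 0.7, 0.8, 0.08, 1.0      # FHN constants and diffusion coefficient

    u = np.full(N, -1.2)                    # fast (excitation) variable
    w = np.full(N, -0.6)                    # slow (recovery) variable
    u[:5] = 1.5                             # kick one end to launch activity

    def laplacian(x):
        # periodic-boundary second difference, the 'diffusible gradient' part
        return (np.roll(x, 1) - 2 * x + np.roll(x, -1)) / DX**2

    for _ in range(1500):
        du = u - u**3 / 3 - w + D * laplacian(u)
        dw = EPS * (u + A - B * w)
        u += DT * du
        w += DT * dw

    print(float(u.max()))                   # inspect the state of the medium after the run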
> It also isn't very practical - as I pointed out in a previous post, running
> it would require a computer faster than anything we are likely to get before
> the era of advanced nanotech. It should actually be much easier to do the
> upload at a higher level of understanding.
I'm physically incapable of commenting on every message I would like to, unfortunately. Some quick quibbles: at 10^9 neurons (ignoring the glia), 10^4 synapses/neuron, 10^2 ops/synapse and a simulation resolution of 1 ms (10^3 events/s), all very conservative estimates for neural emulation, I rather arrive at roughly 10^18 ops/s (totally ignoring the role of glia and dendritic processing here). I'm too tired and drunk to estimate the number of flops for MD level of detail, so I trust your 10^30 flops, or whatever the number was.
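The back-of-the-envelope, spelled out (these are assumptions, not measurements):

    neurons        = 1e9    # ignoring glia
    syn_per_neuron = 1e4
    ops_per_syn    = 1e2
    events_per_s   = 1e3    # 1 ms simulation resolution

    per_neuron = syn_per_neuron * ops_per_syn * events_per_s
    print(f"{per_neuron:.0e} ops/s per neuron")         # -> 1e+09
    print(f"{neurons * per_neuron:.0e} ops/s total")    # -> 1e+18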
> The obvious first step in taming the computation problem is to move up to
> the cellular level. Once we really understand all of the details of how
> neurons work, it should be possible to write a program that can simulate the
> behavior of one. This is an ambitious exercise in molecular biology, but it
> can be done in the near future even under conservative technology
> projections.
Very smart idea. Indeed, one could imagine an upwards-validating chain of methods reaching as far down as ab initio. Fully automagically, you distill MD forcefields from ab initio, compartmental magical constants from MD, etc. Top-down, you could use rich fingerprints such as MEG/fMRI and implanted multielectrode arrays for validation.
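A toy example of that kind of distillation: pretend the lower rung handed us current/voltage samples for a membrane patch, and we fit a single leak conductance and reversal potential for the compartmental rung (data synthetic, numbers invented):

    import numpy as np

    v = np.linspace(-0.08, -0.04, 20)                              # test potentials [V]
    i = 2e-8 * (v + 0.065) + np.random.normal(0, 1e-11, v.size)    # "measured" current [A]

    # I = g*(V - E)  =>  linear least squares for slope g and intercept -g*E
    A = np.column_stack([v, np.ones_like(v)])
    (g, c), *_ = np.linalg.lstsq(A, i, rcond=None)
    print(f"g = {g:.2e} S, E = {-c/g*1e3:.1f} mV")                 # recovers ~2e-8 S, ~-65 mV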
> Of course, once you have a computer simulation of a neuron there is no
> reason to stop there. Figuring out which behaviors are relevant to
> computation, and which are not, should also be just a matter of
> experimentation. Once you can weed out the irrelevant processes you should
> end up with something that can run on a few hundred MIPS at most.
Oh, assuming 10^4 synapses/neuron, 100 ops/synapse and a 1 ms tick, you should be able to run a realtime neuron with around 1 GIPS (10^9 ops/s).
> At that point you could simulate an entire brain with something like 10^10
> MIPS, which should be feasible by then. Of course, what you are running is
> now a fancy neural-net program, not an impenetrable blob of mysterious data.
> Figuring out how everything works is still a big job, but there is nothing
> impossible about it.
I'm a big fan of the opaque GA way of doing things. The amount of detail is otherwise a) overwhelming b) boring. Somebody/something please keep all the Brazil plumbing out of sight.
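The GA attitude, boiled down to a few lines (the fitness function here is a stand-in I made up; a real run would score a simulated module against recorded behaviour):

    import random

    TARGET = [0.3, -1.2, 0.8, 2.0]                  # hidden "right answer", illustration only

    def fitness(genome):
        return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))

    def mutate(genome, sigma=0.1):
        return [g + random.gauss(0, sigma) for g in genome]

    pop = [[random.uniform(-3, 3) for _ in TARGET] for _ in range(50)]
    for generation in range(200):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:10]                                          # truncation selection
        pop = parents + [mutate(random.choice(parents)) for _ in range(40)]

    print([round(g, 2) for g in max(pop, key=fitness)])             # lands near TARGET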
> As for making improvements, well, the brain does appear to have specialized
> functional regions. Once you can map the connections between one region and
> the rest of the brain, you can improve that module in isolation. An
It would be interesting to actually know the degree of modularity in the brain. We need methods to map neural tissue at nm resolution in the bulk.
> evolutionary process is the obvious technique to use, but you could also use
> traditional programming techniques in well-understood areas. You don't have
Do you have any ideas towards that end? I frankly don't see how traditional programming techniques could be instrumental.
> to have a complete understanding of everything in the brain to do this - you
> just need an approximate understanding of one specific region.
ciao,
'gene