"J. R. Molloy" <email@example.com> writes:
> "Anders Sandberg":
> > Hi again! After four months of conferences, symposia and courses on
> > transhumanism, psychology, neuroscience, biomodelling and everything
> > else I have finally time to settle back into the list. Seems
> > everything is as usual here, everybody happy and gay and debating AI,
> > guns, death penalties and cryonics :-)
> Welcome back, Anders.
> Yes, the list is about the same as ever. Same old topics, same old complaints,
same old... same old. Interesting that people who engage in a discussion of
> accelerating change (of the extropic kind) do not themselves seem to change
> their minds, their opinions, their politics, their attitudes, or their
> philosophical positions.
I doubt we all want to change our core values at an accelerating
pace. But sure, this list has remained remarkably constant. That is
one reason I could survive without it for so long :-) Seriously, I
think it is a good thing to break out of the old molds and create
anew. I guess this is why many old-timers are not around: they are out
there doing stuff. I foresee a very similar risk for me.
> You took courses on transhumanism!?!? Where do they offer courses on
Nowhere yet, to my knowledge. I attended the EU Advanced Computational
Neuroscience Course in Trieste for a month. Some interesting stuff,
especially if you, like me, love neuroscience. Perhaps the most
transhumanist talk was by Miguel Nicolelis, working on direct brain
control of robot arms in rats and monkeys.
I also gave a few lectures on AI, neural networks and the history and
future of computers to the participants in the Multimedia program at a
local university. Hmm, *they* got a lecture on transhumanism - when do
*I* get one? :-)
> > The idea that some will go on towards Singularity (or whatever) and
> > the rest will remain, is of course one of the main Scary Arguments
> > Against the Singularity (cf. Stewart Brand, _The Clock of the Long Now_,
> > and Jaron Lanier's Edge piece). Exponential growth might also imply
> > exponential growth of differences, and this is generally viewed as a
> > bad thing. After lecturing some first-year students on vaguely
> > transhumanism-related stuff, I noticed this was one of the arguments
> > against it that seemed to work best. Well worth considering.
> I don't understand how the exponential growth of differences (a bad thing) works
> best as an argument against technological singularity. Or do I misunderstand?
It might not carry much weight with you or even with the community here,
but among many people it is a compelling reason to think that there is
something undesirable about the singularity. A knee-jerk libertarian
reaction would of course be to blame egalitarianism, but I think there
are also deeper reasons. But it would be too complex to explain well
at this time of night; I think I'll save it for a more careful post.
--
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
firstname.lastname@example.org           http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:50:16 MDT