den Otter wrote:
> > From: Nick Bostrom <firstname.lastname@example.org>
> > Colin wrote:
> > > If the Argument is true then, by the very nature
> > > of it, there isn't squat we can do.
> > That's not true. Even under the interpretation that doom will strike
> > soon (just one among several interpretations), we can reduce the
> > risk of doom by reducing the various empirical threats - black
> > goo, meteor impact, high-energy physics experiments, nuclear or germ
> > warfare, environmental collapse, etc.
> Is there any proposed role for the WTA / ExI in this, apart from
> educating the (mostly online) public?
One can imagine several possibilities for the long and medium term, but for the near future, at least with the WTA, my guess is that the activity will mainly be educational. That includes educating the public (on- and offline), but it also includes developing our own thinking on these issues. I would like to see risks and downsides (and strategies for minimizing them) attain a much more prominent place within transhumanism.