> You described something like how a person would experience salty if
> he is only told about the salty experience.
> You forgot the corresponding 'lookup table' containing how it relates
> *internally* to other tastes, what emotions the person experienced
> when tasting it, etc.
> Actually, 'lookup table' is too simplistic an expression. How would
> you describe a trained artificial neural net semantically?
The same way abstract computer simulations describe or model anything
else. There is no more phenomenal information in any abstract computer
model than there is in what we can abstractly say to each other. As I
said, you can abstractly model everything with almost anything,
including speech and internal computer models, but any abstract model,
speech included, is still only abstractly like what it models. It
isn't fundamentally like a real feeling or phenomenal sensation.
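To make the point concrete, here is a minimal sketch in Python (all
names are hypothetical, chosen only for illustration) of the kind of
abstract 'lookup table' model the question above describes. A trained
neural net's weights are the same sort of thing, just with far more
relational structure:

    # An abstract model of 'salty': symbols related to other symbols.
    abstract_salty = {
        "contrasts_with": ["sweet", "sour", "bitter", "umami"],
        "typical_intensity": 0.8,
        "associated_emotions": ["craving", "thirst"],
    }

    def describe(model):
        # Everything we can extract from the model is just more symbols.
        for relation, value in model.items():
            print(relation, "->", value)

    describe(abstract_salty)

However much structure we add, the model only relates symbols to other
symbols; nothing in it, or in its output, is fundamentally like the
phenomenal sensation of tasting salt.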
Also, the neurons in our eyes and optic nerve learn how to abstractly
represent visual information and forward it to the visual cortex. But
we are not phenomenally conscious of this information until it arrives
at the visual cortex, where the phenomenally conscious representations
are produced. There is something more going on in the neurons of the
primary visual cortex than the abstract processing going on
subconsciously, as in the retinal neurons. I would bet that any neural
net of today is still, like the neurons of the retina, not yet
producing phenomenal sensations. We haven't yet discovered precisely
what this phenomenal process is, how it is formed into a unified and
emotional awareness, or why it is like what it is like.
Brent Allsop