Software virtues into AI

Steve Witham (sw@tiac.net)
Wed, 29 Jan 1997 00:16:57 -0500


Hi. I'm about 90 days backlogged in reading the list but I had a thought
you might like. (I feel like that character who only communicated by
(virtual) teletype in the dungeons-and-dragons world of Vinge's _True
Names_. The UPS guys would deliver this teletype to the Castle...
was it the Mailman?)

I like to wonder how AI might arise from software development efforts
going on right now. Also, I wonder how we might give AIs the qualities
we want them to have.

Maybe AIs will come about as assemblies of pieces of software that all
do sensible jobs that we understand: indexing, modeling, taking statistics
and making predictions, parsing and composing sentences, keeping schedules,
etc.
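To make that concrete, here's a toy sketch (entirely my own invention,
with made-up names like Indexer and Scheduler) of an "AI" that's
nothing but mundane, understandable pieces wired together:

    # A toy sketch: an "AI" as an assembly of pieces that each do a
    # sensible job we understand. All names here are made up.

    class Indexer:
        """Remembers what it has seen, keyed by topic."""
        def __init__(self):
            self.entries = {}
        def note(self, topic, fact):
            self.entries.setdefault(topic, []).append(fact)
        def lookup(self, topic):
            return self.entries.get(topic, [])

    class Scheduler:
        """Keeps a time-ordered list of things to do."""
        def __init__(self):
            self.tasks = []
        def add(self, when, task):
            self.tasks.append((when, task))
            self.tasks.sort()
        def due(self, now):
            return [task for when, task in self.tasks if when <= now]

    class Assistant:
        """The 'AI' is just these parts talking to each other."""
        def __init__(self):
            self.memory = Indexer()
            self.calendar = Scheduler()
        def hear(self, now, topic, fact):
            self.memory.note(topic, fact)
            self.calendar.add(now + 1, "follow up on " + topic)
            return self.memory.lookup(topic), self.calendar.due(now)

Nothing mysterious in any one piece; whatever "personality" the
assembly has comes from how the pieces behave.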

Maybe the personalities of these AIs will have a lot to do with the
personalities of the components. You know what I mean. Every piece of
software has some sort of personality. We notice whether
software is easy or hard, fun or boring, straightforward or devious,
dependable or flaky or downright infuriating...

If the personality of AI is going to flow from the personalities of its
components, that's scary, because a lot of software these days is nasty
and badly behaved. I spend a lot of time trying to deal with software that
seems to be written by people who have no respect for me, my time, my
sanity, my need to understand, or my peace of mind. People who seem not
to take responsibility for the context in which their software is going
to be used. These attitudes seem implicit in the behavior of software.
For instance, I install a new version of my OS and the TCP settings are
toast, and the new control panel doesn't make sense. Or it fails with a
completely uninformative and patently absurd message, with no clue where
to go from there. I call up the help wizard and it says, "Put your
DNS server address here," indicating a section of a form that has no
blanks to fill in. (This is Mac System 7.5.5's Guide on the TCP/IP
control panel. It draws a really beautiful animation of a red crayon
circle in an empty white space, even showing through the translucent
help box.)

I'm not so much complaining about the present as fearing for the future:
imagine an AI with all the worst personality flaws of the latest
releases from your favorite software companies. Larry Ellison and
Bill Gates on chips, or downloading intently into your WebTV. You get home
and your house has been upgraded with a butler that's like Word 6.

But on the other hand, maybe the path to nice AIs is to realize that it's
already gotten to the point where software needs virtues built into it.
People have already sensed (although not always catered to) needs for
clarity and usability. Sometimes even adaptability. How about
responsibility? How do you write software that knows when it's letting you down
and has some idea how to figure out what it's doing wrong, or what recourse
you can seek? I mean, haven't you ever wanted to whack a piece of software
on the side, but realized that software doesn't generally have a way of
dealing with whacks? I once saw a proposal for error messages with
explorable levels of detail--like stack traces with English subtitles--
but the idea hasn't caught on. What if software could show you its assumptions
and let you tell it which ones were wrong? But these things are hard.
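Hard, but maybe not unimaginable. Here's a rough sketch (my own guess
at the idea, not the proposal I saw, and not any real library) of an
error object with explorable levels of detail and a list of the
assumptions the code was making:

    # Sketch: an exception that carries explorable levels of detail
    # and the assumptions the code believed when it failed. All of
    # this is hypothetical, a guess at the idea.

    class ExplorableError(Exception):
        def __init__(self, summary, detail=None, assumptions=None):
            super().__init__(summary)
            self.summary = summary                # one plain-English line
            self.detail = detail                  # longer text, or a nested error
            self.assumptions = assumptions or []  # what the code believed

        def explain(self, depth=0):
            """Each extra level of depth peels off one more layer."""
            lines = [self.summary]
            if depth > 0 and isinstance(self.detail, ExplorableError):
                lines += ["  " + line for line in self.detail.explain(depth - 1)]
            elif depth > 0 and self.detail is not None:
                lines.append("  " + str(self.detail))
            for belief in self.assumptions:
                lines.append("  (assumed: " + belief + ")")
            return lines

    # The user decides how deep to dig.
    try:
        raise ExplorableError(
            "Couldn't reach the DNS server.",
            detail=ExplorableError("No route to the configured address.",
                                   detail="ICMP: network unreachable"),
            assumptions=["the TCP settings survived the OS upgrade"])
    except ExplorableError as err:
        print("\n".join(err.explain(depth=2)))

This only shows you the assumptions; the second half of the idea,
letting you mark one wrong and having the software recover from there,
is the part that's really hard.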

I guess what I'm saying is that maybe virtues--good qualities--aren't
just separate parts, but need to be built in throughout a system (or
personality), and only get there if someone takes responsibility for
putting them there. But if people do take that care, then you end up
with systems that are naturally more likely to turn out virtuous.

Which is really just saying that virtues are more like skills than
compulsions, which I believe.

--Steve

--
sw@tiac.net           Steve Witham            http://www.tiac.net/users/sw
"...the Vild, where the manifold was as dangerous and deranged as a
 Scutari shahzadix in heat." --David Zindell