"Raymond G. Van De Walker" <firstname.lastname@example.org> writes:
> Anders Sandberg (email@example.com) said:
> >The problem seems to be that it is impossible to test very complex
> >systems for all possible contingencies, and this will likely cause
> >trouble when designing ufog. How do you convince the customers it is
> >perfectly safe?
> I program medical and avionic systems, and the general criteria are
> pretty straightforward.
> You test the thing for designed behavior, and then you test it for
> benignity (e.g. operating-room equipment has saline solution poured on it,
> steel rods poked into open orifices, and various shorts on the power plug)
Sounds reasonable. But medical and avionics systems have to deal with fairly well-defined environments; the number of things that might be thrown at ufog in an ordinary home (just imagine what the toddlers do) is astronomical. Hmm, that actually suggests a ufog problem: getting foglets into liquids where they shouldn't be - how can we guarantee that none of the fog gets into our food?
> >You get the same problem with AI: what testing would be required
> >before an AI program was allowed to completely run a nuclear power
> Well, speaking as a professional safety engineer, I think this would be
> an easy argument. Just test the program with the same simulation used to
> train human operators. If it does ok, then it's ok.
> However, most regulatory environments require that no single fault should
> be able to induce a harmful failure (this is simple common sense).
> Therefore, one might have a much easier time with certification
> if there was a second AI, with a different design, to second-guess or
> cooperate with the first. This makes the system far less prone to fail
> the first time an AI makes a mistake. And it _will_, right?
To err is human? :-)
Yes, a double system would be reasonable. Most likely human supervisors will be kept too, so that there is somebody to personally blame for everything :-)
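The dual-AI idea above is essentially 2-out-of-2 voting with a fail-safe fallback; a toy sketch of how it might look (all names and the threshold here are hypothetical, not from any real control system):

```python
# Toy sketch of "two independently designed AIs" cross-checking each other:
# act only when both controllers agree; on disagreement, fall back to a
# safe action and escalate to the human supervisors.

def cross_checked_action(controller_a, controller_b, state, fail_safe="shutdown"):
    """2-out-of-2 voting: no single faulty controller can, by itself,
    induce a harmful action; disagreement triggers the fail-safe."""
    a = controller_a(state)
    b = controller_b(state)
    if a == b:
        return a        # both agree: proceed with the action
    return fail_safe    # disagreement: fail safe, alert the humans

# Two deliberately different "designs" for the same (made-up) control task:
def rule_based(state):
    return "lower_rods" if state["temp"] > 600 else "hold"

def table_based(state):
    return {True: "lower_rods", False: "hold"}[state["temp"] > 600]

print(cross_checked_action(rule_based, table_based, {"temp": 650}))
print(cross_checked_action(rule_based, lambda s: "hold", {"temp": 650}))
```

The point is that the two designs must fail differently: a common-mode bug (say, both trained on the same flawed simulation) defeats the redundancy.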
Anders Sandberg Towards Ascension! firstname.lastname@example.org http://www.nada.kth.se/~asa/GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y