In a message dated 98-11-08 11:51:52 EST, Spike Jones wrote:
> ralph, you have hit on the issue which will prevent computer controlled
> cars, as well as most future revolutions. if that computer fails and
> kills someone, it is unclear in our litigious society, who gets sued.
I have to beg to differ -- somewhat. The same basic Anglo-American common law system of liability has been in effect throughout the period in which we developed and implemented some of our most important technological innovations. While it is certainly true that the U.S. is the most litigious society on Earth, we are hardly the most backward, technologically. Japan, one of the least litigious societies, is not significantly quicker to adopt new consumer technologies. Let me address the questions you ask to demonstrate how the system currently in place will accommodate computer-driven vehicles.
> what if your 6 yr old takes your computer driven car and it
> hits someone?
Just as with any other potentially dangerous device, the answer depends on who or what CAUSED the accident. If the six-year-old input instructions into the computer to drive in the same fashion as he had recently done on his Nintendo 256 "DeathRace 2010" video game, then at least some liability will likely fall on the kid's parents for letting him have free access to the car. The same rules apply today to parents who carelessly allow their children to take control of dangerous instrumentalities.

If, instead, the accident was caused by a basic programming flaw in the machine as it was purchased by the child's parents, then the suit is a standard products liability suit -- no different from one brought today against automakers. In fact, since it's quite foreseeable that a six-year-old might get into an automated car, one can argue that safeguards against user inputs like those in "DeathRace 2010" should be built into the system. So the manufacturer might well share some liability with the parents even in the first scenario.
Do such suits impose costs on automakers? Of course, and most people believe they should. There are only two alternatives when it comes to product safety: private suits and central government regulation. Which is more consistent with individual liberty?
> what if you send your car unaccompanied to your
> friend's house and it hits someone?
Same answer as in the hypothetical with the child. If you programmed the computer to take the shortest route to your friend's house -- including shortcuts through your neighbor's bedroom -- then you'll likely bear some liability. On the other hand, Ford or GM should likely bear at least part of the financial responsibility if they built the car to accept such inputs.
> what if your computer car
> is sitting still at a light and some yahoo who wants a new car
> sees you asleep in the passenger side and backs into you, then
> claims you hit her?
The fact that the car has an automated guidance system doesn't change this scenario from the situation we have always faced: There will always be people who try to cheat, whether in the private tort system or in a less individualized, less flexible system of public responsibility.
>we insist that someone be responsible always,
> even if the computer driven car is ten times safer.
Actually, neither of these propositions is true. The defense of "unavoidable accident" is still alive and well in our tort system, which recognizes and enforces the principle that "sh*t happens."
Your second assertion actually cuts the other way: when a demonstrably safer technology becomes available, the civil tort system creates powerful incentives to adopt it. Imagine the field day a plaintiff's lawyer would have today in an auto accident case involving a current-model car NOT equipped with seatbelts.
> besides, if computer driven cars became accepted, microsoft
> would get into, then dominate that market. i would not only
> refuse to ride in a microsoft car, i would flee in terror from
> any such device. {8^D spike
Ahh, antitrust law -- not my bag :-§
Now, there IS a quite difficult legal issue buried in here -- one we recently thrashed out on the extropians lawnode list. What happens when the software agent at the heart of the guidance system becomes so sophisticated that it is capable of generating goals on its own initiative? I leave that knotty problem as an exercise for the reader.
Greg Burch <GBurch1@aol.com>----<burchg@liddellsapp.com>
Attorney ::: Director, Extropy Institute ::: Wilderness Guide
http://users.aol.com/gburch1 -or- http://members.aol.com/gburch1
"Good ideas are not adopted automatically. They must be driven
into practice with courageous impatience." -- Admiral Hyman Rickover