which should be the default more-reliable case in the event of conflicting input?
One of the biggest headaches in designing any of these systems is handling the handover from computer control to the "backup" meatsack.
When everything is going well, the computer will generally be able to do a better job of driving, and will even handle extreme control situations better than people do. For example, this is why some fighter jets that are far too unstable for a human to control are flown fine when the computer is working.
Unfortunately, the Plan B for almost all automated systems (autopilots, etc.) is to disengage and hand control back to the meatsack.
This introduces three major dilemmas:
1) Loss of situational awareness: The meatsack has not been involved in the control and is not sufficiently aware to take over. In the case of an autonomous car, the driver has probably been reading a book, LOLling on twitter or whatever. Suddenly the control system decides everything is too hard and dumps control in the lap of a driver who is not sufficiently informed to take effective control. By the time the driver works out what is going on, it is too late.
2) Exceeding human capability: Computers can cope with control situations that people cannot; most specifically, they can operate faster and with greater precision. If the computer is giving up, then the chances are a person is incapable of taking effective control. Bad things happen. This happened in an Air China crash some years ago where a thrust controller had kept tweaking things until it was forced to give up. As a result, the meatsack had no chance of recovering the plane and it crashed.
Thus, the control system has to be set up to give up while the meatsack still has a chance of coping - largely negating the benefits of the computer.
But is that the right decision? The computer is more skilled and probably more likely to recover from the situation. But if the computer tries for longer and then really does need to hand over, the pilot is in an even worse position.
Damned if you do, damned if you don't. Not at all easy to come up with a good decision.
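The "give up early vs. keep trying" trade-off can be sketched as a toy handover policy (everything here is a made-up illustration, not any real autopilot's logic): the computer may only disengage while the predicted time before the situation becomes unrecoverable still exceeds the human's reaction time plus a margin for them to act.

```python
# Toy sketch of the handover dilemma: disengage only while a human
# still has time to recover. All names and thresholds are hypothetical.

HUMAN_REACTION_S = 2.0  # assumed time for the meatsack to regain awareness
HUMAN_MARGIN_S = 3.0    # assumed extra time a human needs to act effectively

def should_hand_over(time_to_unrecoverable_s: float,
                     computer_confidence: float) -> bool:
    """Return True if control should be handed to the human now.

    time_to_unrecoverable_s: predicted seconds until nobody can recover.
    computer_confidence: 0..1 estimate that the computer can still cope.
    """
    human_needs = HUMAN_REACTION_S + HUMAN_MARGIN_S
    # Past this point, a later handover would land on a human who
    # no longer has time to react.
    human_still_has_a_chance = time_to_unrecoverable_s > human_needs
    computer_struggling = computer_confidence < 0.5
    return computer_struggling and human_still_has_a_chance

# The dilemma in miniature: the computer is struggling but the human
# still has time -> hand over early, wasting the computer's extra skill.
print(should_hand_over(time_to_unrecoverable_s=10.0, computer_confidence=0.4))  # True
# The computer tried for longer and now the human has no time -> trapped.
print(should_hand_over(time_to_unrecoverable_s=3.0, computer_confidence=0.1))  # False
```

Note the bind this toy policy makes explicit: any threshold conservative enough to guarantee the human a chance forces the computer to give up while it could probably still cope.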
3) Confusion of the control surface: As soon as there is more than one controller (two people, or a computer and a person), there is the opportunity for some control to fall through the cracks.
A classic example of this is the driver using cruise control for the first time. There has been more than one occurrence of rear-ending the car in front. Why? Well, in the driver's mind they have handed over speed control to the cruise control. Part of speed control is braking when required. Unfortunately, the cruise control does not brake when required. Drivers have literally watched the crash happen over a period of seconds, dumbfounded that the computer did not slow down.
Of course it all makes sense in the clear light of day, but under the stress of the event, the driver's brain shuts down some of its thinking ability. (See http://en.wikipedia.org/wiki/Incident_pit for an explanation why).
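The cruise-control gap can be shown with a toy model (purely illustrative numbers and names): a controller that regulates speed via throttle only, so when the lead car is slower, the following distance shrinks and nobody is responsible for braking.

```python
# Toy illustration of control falling through the cracks: a cruise
# controller that only manages throttle. Braking belongs to nobody,
# so the gap to a slower lead car just keeps shrinking.
# All numbers are made up for illustration.

def cruise_step(speed: float, target: float) -> float:
    """Throttle-only cruise control: speed up toward target, never brake."""
    if speed < target:
        return min(target, speed + 1.0)  # accelerate toward the set-point
    return speed  # at or above target: do nothing (no braking authority)

gap = 50.0        # metres to the lead car
our_speed = 30.0  # m/s, cruise set-point is 30
lead_speed = 20.0 # m/s, the car ahead is slower

for _ in range(10):  # ten one-second steps
    our_speed = cruise_step(our_speed, target=30.0)
    gap += lead_speed - our_speed  # gap closes by 10 m every second

print(gap)  # well below zero: the collision nobody braked for
```

The driver believes "speed control" includes slowing down; the controller's contract is only "hold the set-point". The collision lives in the difference between those two definitions.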
In short, mixing human and computer control is a bloody nightmare.
The only people who will definitely benefit are the lawyers, who outnumber engineers 5:1 in California.