Reply to post:

Tesla autopilot driver 'was speeding' moments before death – prelim report

EveryTime

The Mobileye system is the problem here.

It's good enough to explore the problem space, but far from good enough for deployment.

Especially since it screws up in scenarios that humans handle easily.

I'll call this Rule One of autonomous systems: even if you build a system that is demonstrably safer on average, if it screws up in a situation that is 'easy' for a human, it will be rejected. We can be amazed when it does the easy-for-a-computer things: tracking dozens of cars at once, including simultaneous merges from both sides into "blind spots", while spotting braking ten cars ahead and a pedestrian about to stroll into traffic. But if it occasionally mistakes the plain white side of a truck for a threat-free path, that's unacceptable.

Mobileye doesn't build a 3D model of the world and track objects in that model. Instead, it recognizes specific features in individual images. It locates and reports road signs. It tracks lane markings and reports on centering, upcoming curves, and features such as stop lines and crosswalks. And it has rudimentary pedestrian and vehicle detection and reporting.
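To make the architectural difference concrete, here's a minimal sketch. The names, the 1D geometry, and the alpha-beta filter gains are all my own illustration, not anything from Mobileye's or Tesla's actual stack:

```python
# Hypothetical sketch of the two architectures, not real product code.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str        # e.g. "vehicle", "stop_line"
    position: float   # along-road distance seen in this frame (m)

def per_frame_report(detections: list[Detection]) -> list[str]:
    # Per-frame recognition: each image is classified in isolation.
    # If the classifier misses the truck in this frame, the truck
    # simply does not exist as far as the system is concerned.
    return [d.label for d in detections]

@dataclass
class Track:
    position: float   # metres along the road
    velocity: float   # m/s

class WorldModel:
    # World-model tracking: objects persist between frames and are
    # coasted forward on their last known motion even when the
    # detector misses them for a frame or two.
    def __init__(self) -> None:
        self.tracks: dict[int, Track] = {}

    def update(self, measurements: dict[int, float], dt: float) -> None:
        # Predict every known object forward in time...
        for track in self.tracks.values():
            track.position += track.velocity * dt
        # ...then correct the ones the detector actually saw,
        # using simple alpha-beta filter gains.
        for obj_id, z in measurements.items():
            if obj_id not in self.tracks:
                self.tracks[obj_id] = Track(position=z, velocity=0.0)
                continue
            track = self.tracks[obj_id]
            residual = z - track.position
            track.position += 0.8 * residual
            track.velocity += 0.4 * residual / dt

wm = WorldModel()
wm.update({1: 30.0}, dt=0.1)   # truck first seen 30 m ahead
wm.update({1: 29.0}, dt=0.1)   # closing at roughly 10 m/s
wm.update({}, dt=0.1)          # detector misses a frame...
print(wm.tracks[1])            # ...but the track coasts forward anyway
```

The point: a missed detection in one frame is a non-event for the tracker, while the per-frame system has nothing to coast on.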

I call this the 'Eliza' of self-driving: a simple system that is impressive in demos, but falls apart in real-life use. The pattern-matching structure is too simplistic. You can train it with more patterns, but in the end there will always be a situation you missed.
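In miniature, the failure mode looks like this (a toy rule table of my own invention, nothing to do with the real feature set):

```python
# Toy Eliza-style matcher: looks smart for every scene someone
# thought to encode, and silently returns the default for the rest.
import re

RULES = [
    (re.compile(r"octagonal red sign"), "STOP"),
    (re.compile(r"pedestrian at curb"), "SLOW"),
    (re.compile(r"solid lane markings"), "KEEP_LANE"),
]

def classify(scene: str) -> str:
    for pattern, action in RULES:
        if pattern.search(scene):
            return action
    return "NO_HAZARD"  # the catch-all: anything unmatched is 'safe'

print(classify("pedestrian at curb"))                # SLOW
print(classify("white truck side across the lane"))  # NO_HAZARD -- oops
```

Adding rules shrinks the unmatched set; it never empties it.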

Here is an interesting experiment: tape a 15MPH speed limit sign to the rear of your car and pull in front of a self-driving Tesla on the highway. Now think about why they *really* aren't attempting to automatically enforce the speed limit.
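The obvious defense is cross-checking a detected sign against independent context, which is exactly what a pure per-image recognizer can't do by itself. A sketch, with entirely made-up thresholds and signals:

```python
# Hypothetical sanity-check logic, meant only to show why a
# detected sign can't be trusted on its own.
def plausible_speed_limit(detected_mph: float,
                          mapped_limit_mph: float,
                          current_speed_mph: float) -> bool:
    """Reject sign readings wildly inconsistent with independent context."""
    if abs(detected_mph - mapped_limit_mph) > 20:
        return False   # disagrees badly with the mapped limit
    if detected_mph < 0.3 * current_speed_mph:
        return False   # would demand an implausible panic slowdown
    return True

# The taped-on sign: 15 MPH read on a 65 MPH highway while doing 70.
print(plausible_speed_limit(15, mapped_limit_mph=65, current_speed_mph=70))  # False
```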
