Tesla autopilot driver 'was speeding' moments before death – prelim report

DanDanDan

Re: Not an AI

Agreed. I think "Nearly hit in the side door" means that it nearly hit the door, but instead hit the wheel arch. You'd need some sort of linguistics/anthropology degree or something to properly grok the syntax and sentence structure.

PatientOne

Re: Not an AI

Okay, motorbikes (SMIDSY) are a different issue, and have to do with how the brain cheats as much as with how the eye works.

The brain works with a snapshot then looks for changes. Certain shapes are prioritised for identification; vertical lines (like bikes) are not included, but horizontal shapes (like cars) are. So the eye may spot the biker, but the brain doesn't 'see' them immediately. This can be 'fixed' by taking longer to look, or by looking away and then back, as the brain is likely to notice a difference in the biker's position, which indicates movement, which is a priority for the brain to identify.

This is very much IT related, too, as it explains why AI development wasn't returning anticipated results despite the servers being 'as powerful' as an organic brain: the simple fact was the human brain was cheating (taking shortcuts) which the AI wasn't programmed to do. Oh, and this applies to other forms of processing, too: organic brains really do cheat/take shortcuts. It's why stereotypes are so important to us: the brain uses them to 'assume' information that isn't readily apparent based on a model or stereotype, and so doesn't wait for confirmation of said details, but proceeds and then reprocesses data as new information/corrections become apparent. Of course, poorly developed stereotypes are detrimental, and can lead to bad conclusions and end in death. Or an expensive lawsuit. Or the purchase of that horrid jumper...

Stoneshop
Silver badge

Re: Not an AI

"Small low-powered radio transmitters are not that expensive and could be retrofitted to older vehicles and made mandatory on new ones, sending out an 'I'm here' signal."

Which will only tell a receiver "there's a vehicle somewhere in the vicinity". No indication of distance and direction, unless you have a directional scanning receiver and a calibrated transmitter, and even then there's no sufficiently exact way that the receiver can indicate a particular vehicle on a potential collision course.

Darryl

Re: Not an AI

"Who is going to pay? The drivers, that's who."

So hundreds of millions of drivers have to pay to install new equipment in their vehicles, including the aforementioned hassles of taking the time to get it installed, etc. because a few hundred Tesla "drivers" can't be bothered to pay attention to where they're going?

Cynic_999
Silver badge

Re: Not an AI

"

Which will only tell a receiver "there's a vehicle somewhere in the vicinity".

"

It would be trivial to send burst transmissions of a few milliseconds at short but random intervals that contain GPS information on position, speed and direction. The in-car computer can interpret the information gathered from all vehicles within range to predict collision dangers.

Ships have been doing so for decades.
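
[What ships have done for decades is AIS: each vessel broadcasts position, course and speed, and receivers compute closest point of approach (CPA) and time to CPA. The collision-prediction step described above boils down to exactly that calculation. A minimal sketch, assuming a local flat projection of the broadcast GPS fixes; all names and numbers are illustrative:]

    import math

    def cpa(own_pos, own_vel, other_pos, other_vel):
        # Closest point of approach, assuming both vehicles hold course
        # and speed. Positions in metres, velocities in m/s.
        rx, ry = other_pos[0] - own_pos[0], other_pos[1] - own_pos[1]
        vx, vy = other_vel[0] - own_vel[0], other_vel[1] - own_vel[1]
        v2 = vx * vx + vy * vy
        if v2 == 0:                    # same velocity: range never changes
            return 0.0, math.hypot(rx, ry)
        t = max(0.0, -(rx * vx + ry * vy) / v2)   # no looking backwards
        return t, math.hypot(rx + vx * t, ry + vy * t)

    # Car doing 74 mph (33.1 m/s) north; truck crossing at 5 m/s, 400 m ahead
    t, d = cpa((0, 0), (0, 33.1), (-60, 400), (5, 0))
    if d < 10 and t < 15:
        print(f"collision risk in {t:.1f} s, miss distance {d:.1f} m")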

Stoneshop
Silver badge

Re: Not an AI

"... that contain GPS information on position, speed and direction."

Sure, but that's no longer a "simple radio transmitter". The receiver likewise has to be adapted to pass that data on to the anti-collision system so it can be used that way. And while it may cost just a few cents in hardware, if you want it traffic-certified it will be several hundred Euros/Pounds/Dollars.

Mark 85
Silver badge

@Darryl -- Re: Not an AI

"... because a few hundred Tesla 'drivers' can't be bothered to pay attention to where they're going?"

Why yes.... the 99% pay dearly for the 1% to have nice things.

Alan Brown
Silver badge

Re: Not an AI

" So the eye may spot the biker, but the brain doesn't 'see' them immediately. This can be 'fixed' by taking longer to look, or to look away, then back "

Or to actively look for bikes. You'll still see cars but you'll see the bikes too.

Vic

Re: Not an AI

Which will only tell a receiver "there's a vehicle somewhere in the vicinity". No indication of distance and direction, unless you have a directional scanning receiver and a calibrated transmitter

Not so. Have a look at the ADS-B system in use in aircraft; each transmitter is sending not just a carrier, but also position and speed vector information. You could make these quite cheaply in bulk[1].

Whether there would be sufficient precision for the distances involved on the road is another matter, of course...

Vic.

[1] I'm currently trying to see how quickly I can put one together for aircraft use. The commercially-available ones are a bit steep for my pocket...
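
[For anyone curious what's actually in those transmissions: an ADS-B extended squitter is 112 bits, and the basic fields can be picked apart in a few lines. This is only a sketch of the frame layout (decoding the actual position needs the CPR algorithm on top of this); the sample frame is a widely published test vector:]

    def adsb_fields(hexframe: str):
        # Layout of a 112-bit extended squitter: DF(5) CA(3) ICAO(24) ME(56) CRC(24)
        bits = bin(int(hexframe, 16))[2:].zfill(len(hexframe) * 4)
        df = int(bits[0:5], 2)         # downlink format; 17 = ADS-B
        icao = hex(int(bits[8:32], 2))[2:].upper().zfill(6)
        tc = int(bits[32:37], 2)       # type code, first 5 bits of ME
        return df, icao, tc

    df, icao, tc = adsb_fields("8D40621D58C382D690C8AC2863A7")
    print(df, icao, tc)                # -> 17 40621D 11 (airborne position)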

Will Godfrey
Silver badge
Unhappy

Waste of time

It doesn't matter what you do. You can't fix stupid.

P.S. That includes the truck driver pulling out without ensuring the road was clear.

AndyS

Re: Waste of time

You can't fix it, but you can eventually take it out of the loop. That's why there are a lot of companies, Tesla included, working so hard to produce self-driving cars - exactly (and sadly ironically) to prevent accidents like this happening.

Dale 3

Re: Waste of time

The truck was turning across traffic (the car hit at 90 degrees). Assuming the worst case, that it had started from stationary while waiting for oncoming traffic, a big heavy truck with low acceleration could easily take 30 seconds to complete the turn. A car going at 74 mph will cover over half a mile in 30 s. The road may have looked clear for half a mile when the truck driver started his turn. If it wasn't starting from stationary the distances will be smaller, but the point remains that cars going at 74 mph cover a lot more distance per unit of time than many people realise.

Sweep
Coat

Re: Waste of time

"cars going at 74mph cover a lot more distance per unit of time than many people realise"

I guesstimate that a car going at 74mph will cover around 74 miles per hour.

Charles 9
Silver badge

Re: Waste of time

Or, in more practical terms, over 108 feet per second (anything over 60 mph covers more than a mile a minute). Meaning, in the 30 seconds it might take an 18-wheeler to turn left across the intersection, a car going 74 mph would have covered about 3/5 of a mile (more precisely, 3256 ft).
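
[The arithmetic checks out, for anyone who wants to verify it:]

    mph = 74
    ft_per_s = mph * 5280 / 3600   # 108.53 ft/s, i.e. over 108 ft/s
    print(ft_per_s * 30)           # 3256.0 ft in 30 s, about 3/5 of a mile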

Queasy Rider

Re: ensuring the road was clear

More than once in my driving life I have pulled out into traffic believing I had plenty of time, only to realise that I had underestimated the speed of an approaching vehicle. I always managed to save myself by jumping on the gas and scooting out of the way. I suspect that something similar happened in this case, with the Tesla going 74 at impact, and the truck unable to scoot due to its great mass. I also suspect that if the Tesla had been going 65 it would have cleared the truck completely, or been automatically slowed down when its systems came face to face with the truck's rear wheels. Believing all of the above, I place no blame on either the trucker or Tesla. As usual it is the speeding driver's fault; he asked for it, he got it. Too bad, so sad.

PassiveSmoking
Coat

In Soviet Russia...

blueberries squash YOU!

Sorry...

Avatar of They
Meh

Just seems dumb not to watch the road.

I think the idea of Mansfield bars is that the front headlights and radiator take the brunt of the hit, and you decelerate from that hit, rather than the windscreen being the point of impact. From the old TV safety vids that have done the rounds, the bars take the hit and crumple just as much as the car's crumple zones, but they stop the top of the car being cut off.

A 74 mph impact probably would have killed him regardless: internal injuries, and even a Mansfield bar is going to struggle to crumple before the windscreen hits.

Should just keep your eye on the road.

Anonymous Coward

Dangerous attempts to fix stupid?

You can't fix stupid

That I agree with, but sometimes I have the feeling that by trying to out-engineer stupid we seem to entice stupid to play a bigger role. I personally think that the safety measures that the Autopilot brings could work quite well in situations where the driver remains in control (there are recorded cases where the car slowed down for pedestrians that the driver didn't spot) but I don't think we can ever out-engineer people becoming careless as a consequence of having this facility.

When ABS brakes were introduced, it initially resulted in elevated levels of nose-to-tail accidents, because some idiots thought it somehow gave them more margin to slow down - that has eventually settled (one would guess in a sort of Darwinian fashion). I see a repeat of the same here, and I think the issue starts with people ignoring the warning that this is BETA.

As I stated before, an active driver would have spotted something amiss because cars ahead would have started evading it, even if the truck itself was practically invisible against the background. Even the dumbest human being is still able to spot a break in an established pattern, an ability the driving AIs do not yet seem to have. We are, IMHO, a long way from reliable autopilot facilities - they work, but we have nowhere near enough data on edge cases yet.

Charles 9
Silver badge

Re: Dangerous attempts to fix stupid?

"As I stated before, an active driver would have spotted something amiss because cars ahead would have started evading it, even if the truck itself was practically invisible against the background."

Unless, of course, he was the FIRST car there, meaning there were no warning signs other than the truck itself. As for the truck yielding right of way, the driver may well not have seen the car prior to the actual turn. Remember, the car was doing nearly 75 mph. A car closes distance rapidly at that speed.

DanDanDan

Re: Dangerous attempts to fix stupid?

I think he must have been the first car there, otherwise the other cars in front would have been visible.

Unless the car in front slowed down and the Tesla switched lanes to pass it. That would also make sense. The car in front would block the rear of the truck a bit, and by the time the Tesla was in the outer lane, it might be difficult to see the truck because it's white on a white background.

Anonymous Coward

Re: Dangerous attempts to fix stupid?

Unless, of course, he was the FIRST car there, meaning there were no other warning signs other than the truck itself.

That does not make sense. We have a whole motorway full of people all moving at over 60 mph, yet somehow he's the first to approach (and fail to avoid) a vehicle forming a complete roadblock? There would have been drivers engaging in evasive manoeuvres in the few seconds this truck went from potential approaching risk to oh-my-friggin-God absolute roadblock, or it would not have been just him that died there.

Charles 9
Silver badge

Re: Dangerous attempts to fix stupid?

This isn't a motorway. It's an arterial, which means traffic lights. If he was FIRST out of the light, pulled ahead, and there's not much between the light and the truck, he could easily have a large opening in front of him before encountering the truck.

moiety

If he hit at 74 mph, then cowcatchers might not have helped that much. Even if they did slow him down to a stop before the front half of the roof was taken off (doubtful), decelerating from 74 mph in -guesstimate- 10 feet wouldn't do him much good.

Gotno iShit Wantno iShit

Exactly. It's similar to the tests done on (IIRC) Fifth Gear, which showed that when a Smart was driven into a concrete block at 70 mph the safety cell remained intact, with no external injuries to the occupants at all. Their internal organs, however, would be purée due to the deceleration. It took a couple of volunteers driving Smart cars into very solid objects and not surviving to prompt the test to be done.

A Tesla has more crumple zone than a Smart, and had there been bars on the lorry they would have deformed too - but enough to save this guy? I doubt it. His head would be less than 10 feet from the front of the vehicle; assuming negligible sideways movement of the fully loaded truck, that is the distance he would need to stop in.
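
[A back-of-envelope check of that 10-foot figure, assuming constant deceleration - a crude model, since real crash pulses peak far above the average:]

    v = 74 * 0.44704      # 74 mph in m/s, about 33.1
    d = 10 * 0.3048       # 10 ft in metres, about 3.05
    a = v ** 2 / (2 * d)  # constant-deceleration estimate
    print(a / 9.81)       # roughly 18 g average over the stop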

Nano nano

"volunteers not surviving" ? link please !

TRT
Silver badge

Crumple zone located between front and rear bumper. Makes you drive much more safely.

Richard 31
Paris Hilton

Only because there was no one in it: https://www.youtube.com/watch?v=mnI-LiKCtuE

8 mins in for the verdict.

Nano nano

So if the car's so clever, how was it letting him exceed the speed limit?

allthecoolshortnamesweretaken
Silver badge

Good point. And maybe an idea for future iterations of the "autopilot" feature: if the driver sets the cruise control part of the system to exceed the speed limit, the car hands control back to him / refuses to switch to autodrive.

I use HERE maps and a Sammy tablet as a satnav, and it pings me when I go too fast. The maps are pretty accurate regarding the local speed limits.

CowardlyLion

That was mentioned in the article. The software to monitor the current speed limit was not sufficiently reliable in practice so it is not enabled by default.

James Hughes 1

Even if he had been driving normally, given the speed of 74mph, would he have been able to stop anyway? Plenty of people get splatted like this even without the aid of cruise control.

MOV r0,r0
Stop

If the semi had been joining a freeway, the Tesla would not have encountered the trailer at the angle it did. Transposing: if full auto-assist use had been restricted to freeways, this 'accident' would not have happened.

The same factors that allow for higher speed limits on freeways make them the only safe place to utilise this technology at this time. We're talking cars and plebs in public spaces here, not $m planes and professional pilots in managed airspace.

TRT
Silver badge

They can tell the driver didn't see the truck...

because of the lack of skid marks.

Mark 85
Silver badge

Re: They can tell the driver didn't see the truck...

Maybe it's me... but with ABS, I've not seen skid marks. Damn tough to lock the brakes/wheels with that system.

Fungus Bob
Silver badge

Re: They can tell the driver didn't see the truck...

Or the lack of skid marks was in his underpants.

Alistair
Silver badge
Coat

truck image in article

> appears to be reversed.

Crossing traffic to turn indicates a left turn, not a right turn.

"Did not see trailer as it was white" -> not a good thing should be fixed. I'm wondering if the system spotted the *cab* since it would have lead the trailer out, and decided that the obstacle had cleared the field.

It would help if we had a map location to reference (i.e. curves in the road, trees, Bushes (there is one in Florida, you know), billboards, dead alligators, etc.).

Now, I'll admit there are always weird intersections out there, but those tend to be rare around highways, and 60 is a highway in Florida.

Eyeballs on the road. Period. Always. Autowhatsis or not.

Anything else, stupidity.

captain_solo

Having recently been engaged in teaching two children to drive, I think the problem is that people think these driver-assist systems are as good as, if not better than, a human driver. When someone is first learning to drive they are doing a serious amount of conscious multitasking: figuring out how to actually control the vehicle and maintain its attitude and speed correctly, respond to traffic signals, and keep an eye on other vehicles. The computer systems in modern vehicles are very good at helping with this part of driving, and cars have been slowly taking on this role for some time. Even my several-years-old car has a drive-by-wire accelerator pedal that takes the input from the driver into consideration along with a number of other factors to determine throttle position, for example.

Once you have been driving for some time, the operation of the systems to keep the vehicle in its lane and appropriate speed envelope, obeying signals, and monitoring other vehicles around you is basically an automatic, reflexive series of actions, and your higher brain functions (provided you have them...) are spent evaluating the driving environment for more subtle and distant cues. You are assimilating large amounts of input and discerning very nuanced responses at longer range, figuring out "what" to do instead of "how" to do it. At this point it should be obvious that this part of the driving process cannot reliably be passed off to what is not really even a true AI, but more an algorithmic approach that plays the odds and tries to come up with the best guess at how to respond to a given stimulus.

This does not get you to 100% autonomy, and it may never get there for all driving situations unless roadways are designed to eliminate the edge cases and you never let a human operate a vehicle completely manually alongside the autoautos. For the most part the traditional car companies are not dragging their feet on this; they just know more about how much testing and engineering work it takes to put a huge feature change into a vehicle and ensure that it is safe in all varieties of driving situations. This is something Silicon Valley, which in large part lacks experience building comprehensive systems for life-critical applications with complex human factors, thinks should be easy if you throw enough silicon and sensors at it. It's about the real human intelligence that can't yet be matched by an algorithm. That is in effect what happened in this case: the car did what it does, but was unable to respond properly to an unforeseen and perhaps extremely rare edge case.

JaitcH
Silver badge
Unhappy

I remember that accident - only in 1967!

Jayne Mansfield, who always carried an impressive load of melons, was a passenger in a car with three adults up front and children sleeping on the back seat. The three adults were Mansfield, driver Ronnie Harrison and lawyer Sam Brody.

They were decapitated.

Canadian Mansfield bars are stronger than American units, by law, but none are really that good. The BEST are MANAC dry van trailers, sold in the U.S. under the name TRAILMOBILE. Their advantage is that the supports of the Mansfield bars are towards the outer ends - where the worst damage occurs.

Angular hits suffer the greatest damage and deaths. Short-nosed vehicles, even hitting at 90 degrees, are also heavy losers.

Charles 9
Silver badge

Re: I remember that accident - only in 1967!

"The BEST are MANAC - dry van trailers in the U.S. under the name TRAILMOBILE. Their advantage is the supports of the Mansfield Bars are towards the outer end - where the worst damage occurs,"

But I doubt TRAILMOBILE rigs are recommended for areas with way-above-grade railroad crossings, since lowering the crash zone inevitably lowers the ride height, raising the chance of a hump taking the trailer off its wheels.

Jonathan 27

Re: I remember that accident - only in 1967!

So what you're saying is that I need to trade my Mazda 3 in for a car with a longer hood? New Mustang here I come!

Alan Brown
Silver badge

Re: I remember that accident - only in 1967!

"areas with way-above-grade railroad crossings"

Those crossings need to be regraded (not that difficult, as all that's effectively needed is earthen ramps) or made verboten for trucks. Grade crossings are inherently dangerous and the overall best course of action is to eliminate them wherever possible.

Stevie
Silver badge

Bah!

So, the "Autopilot" suite of vehicle "enhancements" does not include a proximity triggered "Yeehaaaa!" klaxon then? Serious oversight by Tesla.

Francis Boyle
Silver badge

Re: Bah!

Shouldn't that just be someone saying "you're already dead" in a suitably doom-laden voice?

Frank N. Stein

No thanks

I'll pass on the whole self-driving car thing. Don't need it. I pay attention and operate the vehicle. Non-adaptive cruise control is the only assistance I will use, thanks.

Charles 9
Silver badge

Re: No thanks

What if it becomes take-it-or-leave-it? As in: take the self-driving car, pay crazy car insurance to keep the privilege, or just get off the road?

EveryTime
Silver badge

The Mobileye system is the problem here.

It's good enough to explore the problems, but far from good enough for deployment.

Especially since it screws up in scenarios that humans can easily understand.

I'll call this Rule One of autonomous systems: Even if you build a system that is demonstrably safer on average, if it screws up in an 'easy' situation for a human, it will be rejected. We can be amazed if it does the easy-for-a-computer thing of tracking dozens of cars at once, including simultaneous merges from both sides into "blind spots" while spotting braking ten cars ahead and a pedestrian about to stroll into traffic. But if it occasionally confuses a plain white truck side for a threat-free path, that's unacceptable.

Mobileye doesn't build a 3D model of the world, and track objects in that model. Instead it recognizes specific features in individual images. It locates and reports on road signs. It tracks lane markings, and reports on centering, upcoming curves, and features (stop lines, cross-walks). And it has rudimentary pedestrian and vehicle detection and reporting.

I call this the 'Eliza' of self-driving. It's a simple system that is impressive in demos, but falls apart in real-life use. The pattern-matching structure is too simplistic. You can train it with more patterns, but in the end there will always be a situation that you missed.

Here is an interesting experiment: tape a 15 mph speed limit sign on the rear of your car and pull in front of a self-driving Tesla on the highway. Now think about why they *really* aren't attempting to automatically enforce the speed limit.
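
[The per-frame versus world-model distinction above is worth making concrete. This is only a toy contrast, not a claim about Mobileye's actual internals: with pure per-frame matching, an object the classifier loses for one frame simply ceases to exist, while a tracked world model lets it persist and coast on its last known velocity through the dropout. All names illustrative:]

    # Per-frame: a truck that fails to match in this frame just vanishes.
    def per_frame(detections):
        return [d for d in detections if d["confidence"] > 0.5]

    # World model: tracks persist through missed detections.
    class Track:
        def __init__(self, pos, vel):
            self.pos, self.vel, self.missed = pos, vel, 0

        def update(self, detection, dt=0.05):
            if detection is not None:
                self.pos, self.missed = detection, 0
            else:                        # coast on last known velocity
                self.pos += self.vel * dt
                self.missed += 1
            return self.missed < 20      # drop only after ~1 s unseen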

Charles 9
Silver badge

"But if it occasionally confuses a plain white truck side for a threat-free path, that's unacceptable."

But here's the catch: how do we know it would be easy for a HUMAN to see it, too? Sometimes we assume too much and don't allow for the possibility that the human could be just as confused. Or the human could be tricked by illusions and other conditions a machine would be less prone to. For example, an anamorphic painting of a kid in the middle of the road, or a whiteout condition.

The situation here is that human drivers and computer drivers approach perception from two completely different angles, and they don't overlap. The real question is which copes better in the overall scheme of things: human intuition that can't be taught because it's inborn even in toddlers (so we don't even know HOW we learn it), or tireless machine perception that's harder to fool objectively but likely easier to fool subjectively?

Someone Else
Silver badge
Stop

Fat effing chance

"Mansfield bars are mandatory on the rear of trucks in the US but, unlike Europe, not on the sides. Maybe that's something legislators might like to consider."

What? A US legislature of any type passing a law that would cost Corporate America™ money to potentially save lives of the Little People?

Shirley, you jest.

Charles 9
Silver badge

Re: Fat effing chance

Thing is, it ALSO saved Corporate America time AND money in lawsuits claiming a design flaw that doesn't take submarining into account. Handling the back, behind the rear wheels, was easy enough, but the sides (which affect ride height) are another matter.

SimonC
Paris Hilton

A small £5 camera pointing at the driver that records the last 10 minutes in rotation and saves it upon a crash would probably save Tesla -millions- of pounds in lost consumer confidence and investigations.

Paris, cos she knows about filming herself.
