Ghost in Musk's machines: Software bugs' autonomous joy ride

Last year, a dark historical landmark was reached. Joshua Brown became the first confirmed person to die in a crash where the car was, at least in part, driving itself. On a Florida highway, his Tesla Model S ploughed underneath a white truck trailer that was straddling the road, devastating the top half of the car. Brown’s …


  1. cbars Bronze badge

    Easy fix

    Train the NN during the Mormon cricket migrations in Idaho, that'll squash a few hundred thousand of the bugs

  2. John Robson Silver badge

    Really??

    "The Joshua Brown crash – driving at full speed into a clearly visible trailer – is arguably one such example as it “would never happen to a human being,” Hollander says."

    Has he never been on the road?

    There are *so* many cases of people driving into things that are perfectly visible (because there are very few things that aren't visible)

    And of course Joshua Brown is another of those - the human behind the wheel didn't brake in response to the trailer either. You could more reasonably attribute the death to the lack of safety features required by law in US HGVs.

    1. Anonymous Coward
      Anonymous Coward

      Re: Really??

      I would also contest whether it was a bug. It was clearly sub-optimal(!) but a bug is where something has been programmed incorrectly.

      In that case it just seems like it wasn't programmed to deal with that eventuality and relied on a different weighting of its sensors in that situation to define its parameters.

      If you have speech recognition, you would not call it a but if it didn't recognise every word or if it didn't recognise a certain accent. You would call it a bug if it recognised the phrase "call mom" but it actually dialled the emergency services.

      With a 'learning' system and pseudo AI there will always be scenarios where it won't be perfect - something like the Fairy Meadows Road is going to be almost impossible for an autonomous car to travel without being specifically programmed for that route, but it doesn't mean it is a bug.

      1. John Robson Silver badge

        Re: Really??

        "If you have speech recognition, you would not call it a but if it didn't recognise every word"

        I like what you did there ;)

        "I would also contest whether that it was a bug. It was clearly sub-optimal(!) but a bug is where something has been programmed incorrectly."

        That is also true, I've just got a vision of an old film with people carrying a plate of glass across a road ;)

    2. Anonymous Coward
      Anonymous Coward

      Re: Really??

      You could more reasonably attribute the death to the lack of safety features required by law in US HGVs

      I doubt that. JB's two tonne car was doing 74 mph when it hit the truck, it'd be a very impressive side under-run bumper that'd stop that. Even if it had, to avoid a similar fate, the vehicle has to stop in about four feet - which means that even if the bumper, the car body, and the airbags spread the deceleration evenly during the circa 0.05 seconds of the impact (which I doubt) then the driver would be subject to a minimum of about 60 G.
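
      For anyone who wants to sanity-check those figures, a rough sketch (my own arithmetic, assuming constant deceleration, which the real crash pulse certainly isn't):

          v = 74 * 0.447           # 74 mph in m/s, ~33.1
          d = 4 * 0.3048           # four feet in metres, ~1.22

          # stop dead in 0.05 s: a = v/t
          print(v / 0.05 / 9.81)          # ~67 g

          # stop in four feet at constant deceleration: a = v^2 / 2d
          print(v**2 / (2 * d) / 9.81)    # ~46 g, spread over ~0.07 s

      Either way it's a brutal pulse, so the ~60 G ballpark stands.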

      JB and his car had a part to play in his demise, but I'm unconvinced that a different trailer design could have saved him. However, the real root cause of this accident is the poor primary safety of US roads, often designed with uncontrolled flat 90 degree junctions on high speed roads (to save on the cost of alternative, safer layouts). These mix high speed through traffic with slow moving traffic crossing at right angles, and thus set up regular high risk conflict movements, regardless of whether a vehicle is self driving or meatsack controlled. Anywhere in the world where there is this toxic (and cheap) mix of high speed and flat junctions, there's a history of high damage accidents. There are three choices here, none of which has anything to do with self driving cars:

      1) Do nothing, live with the risks and consequences of a cheap road design.

      2) Pay to build or retrofit road layouts with better primary safety.

      3) Pay a bit less for controls such as traffic lights, along with more enforcement at flat junctions, and accept that there's still some risk, and a modest check on through traffic volumes and speed.

      1. John Robson Silver badge

        Re: Really??

        "You could more reasonably attribute the death to the lack of safety features required by law in US HGVs

        I doubt that. JB's two tonne car was doing 74 mph when it hit the truck, it'd be a very impressive side under-run bumper that'd stop that. Even if it had, to avoid a similar fate, the vehicle has to stop in about four feet - which means that even if the bumper, the car body, and the airbags spread the deceleration evenly during the circa 0.05 seconds of the impact (which I doubt) then the driver would be subject to a minimum of about 60 G."

        60g is survivable.

        OK, they have better safety harnesses etc, but F1 drivers often walk away from 50g crashes.

        https://en.wikipedia.org/wiki/Kenny_Bräck came away from a 200+g crash, and returned to racing...

        It wouldn't have prevented all injury, but it would have made a significant difference to the chances of survival (which were always ~0 without the bars). You take the collision down in speed over the first four feet and the A pillars would probably do more 'lifting' of the trailer, and get you even more deceleration time.

        You get to the point where survival is a possible outcome - and not a completely fluke one either. That's even ignoring the fact that having something of substance at that height would also likely have been sensed by the radar systems...

        1. Bronek Kozicki

          Re: Really??

          It was one of many dramatic mis-applications of existing code. Others include Ariane 5, and the bug which finished Knight Capital. The code itself was working according to the conditions for which it was written, but the software was applied in conditions for which it was not intended - steering a much larger rocket, actual live trading, or fully autonomous driving. There is not much blame you can put on the coder, and a lot of it on the organization itself.

        2. Anonymous Coward
          Anonymous Coward

          Re: Really??

          60g is survivable.

          It's also an average across the 0.05 seconds the car decelerates. At the first moment of impact there is zero deceleration, zero G, and the airbags have yet to be triggered, fired and inflated. Realistically the G is going to spike to a much higher value. And in such a fast crash, if the car stops in four feet and 0.05 of a second, then by the time the airbag is fully inflated (say 45 milliseconds from the crash sensor being triggered to full inflation), the initial impact is almost over. If it stops in five feet, the car's gone under more than half the trailer width, and although the G force may be lower, the loadbed of the trailer has probably come through the windshield and connected with the driver's head as they flop forward on the seatbelt.

          You get to the point where survival is a possible outcome

          I don't dispute that side under-run bars ought to be mandatory. A quick look supports my expectation that they offer protection up to 40 mph (Angelwing). A lighter car might be protected at higher speeds, but I'd be surprised if the kinetic energy of a two ton car could be stopped above 40 before the cabin is penetrated (look at the test pictures, and you'll see that at 40 a large car only just gets stopped before the A pillars get sliced). Now consider the two ton car in a perpendicular 74 mph impact - that's got 3.5x the kinetic energy of the same car at 40, so the impact is way beyond the design parameters of even a notably stronger-than-average under-run protector. The A pillars will never be strong enough to lift a trailer and buy more time. Look at the pics of the crash in question, and you can see that they left marks on the trailer, but clearly weren't able to lift it.
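
          Rough numbers behind that 3.5x, for the curious (my arithmetic, plain ½mv² and nothing else):

              m = 2200                            # kg - rough Model S figure, assumed
              ke = lambda mph: 0.5 * m * (mph * 0.447)**2
              print(ke(74) / ke(40))              # ~3.4 - the mass cancels, it's just (74/40)**2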

          1. John Robson Silver badge

            Re: Really??

            "60g is survivable.

            Its also an average across the 0.05 second time the car decelerates. At the first moment of impact, zero deceleration, zero G, the airbags have yet to be triggered, fired and inflate. Realistically the G it is going to spike to a much higher value. And in such a fast crash, if the car stops in four feet and 0.05 of a second, then by the time the airbag is fully inflated (say 45 milliseconds from the crash sensor being triggered to full inflation of the airbag), the initial impact is almost over. If it stops in five feet, the car's gone under more than half the trailer width and although the G force may be lower, the loadbed of the trailer's probably come through the windshield and connected with the driver's head as they flop forward on the seatbelt.

            You get to the point where survival is a possible outcome"

            Yes - but an F1 car comes to a stop from 150+mph in well under 4 feet fairly often (thankfully they are generally good enough drivers that it isn't *that* often, but it happens)

            No airbags, although better restraints/HANS devices etc.

            I'm not saying it's gone from a 'certain kill' to a 'will absolutely walk away from', but the chances of survival are dramatically better with than without.

      2. petur

        Re: Really??

        An under-run bumper would have shown up as 'not an empty space' so it very much would have saved his life, it would have triggered emergency braking

        1. Jellied Eel Silver badge

          Re: Really??

          An under-run bumper would have shown up as 'not an empty space' so it very much would have saved his life, it would have triggered emergency braking,

          That depends on the code. The description in the incident report makes it sound like a system primarily designed to prevent rear-end collisions. So a combination of camera & radar with a check against pre-defined vehicles. It doesn't say what it'd do if it detects an unknown posterior, or how much of the space ahead of the vehicle gets scanned. I'd hope it's the full profile of the car, but that presumably didn't happen in this accident. So 'operator error' caused by the driver's inattention, and possibly over-reliance on auto-pilot features that didn't exist.

          Under-run bumpers would probably help in other accidents, and perhaps the Tesla Truck will slap QR codes on its sides so airbags can be deployed in its cars.

      3. Muscleguy

        Re: Really??

        Here in Scotland such junctions are not uncommon, with the added wrinkle of corners, blind summits, fog, blizzards, ice etc. What happens is there is a spate of bad accidents, lives lost. The media get on their hobby horses, campaign groups are formed, local government is lobbied to lobby central government (Holyrood in the case of roads). Since we moved up here to Dundee at the end of '98, ALL the fast roads out - to Perth, to Forfar/Aberdeen, to Carnoustie/Montrose - have had grade separated junctions installed (on and off ramps, a bridge of some sort).

        It is now much safer to drive at 70mph on a dual carriageway A-road in Scotland. Though when it advises you to slow, it might be a good idea to do that. Oh and the biggest, longest stretch of single carriageway on the A9 (Perth to Inverness) now has a long stretch of dual carriageway. Part of the project to dual the entirety of it.

        I'm not sure there is any such national or even state programme in the US to invest in roading infrastructure. With the Tea Party they are instead focussed on paying ever less tax and wondering why their infrastructure is falling down.

    3. Jonathan Richards 1
      Stop

      Re: Really??

      > the human behind the wheel didn't brake ...

      It was alleged at the time that Mr Brown was engaged in watching a movie on a tablet. He may not have seen the trailer at all. If so, this was of course a fatal abuse of his vehicle, after which I believe Tesla stopped calling their software an "autopilot".

      1. Snowy Silver badge

        Re: Really??

        @Jonathan Richards 1

        As far as I can see they still call it "autopilot" when it should be considered more an advanced form of cruise control.

        1. Anonymous Coward
          Anonymous Coward

          Re: Really??

          Autopilot in an aircraft is just a more advanced version of cruise control (in 3 dimensions), sometimes with the ability to change course at preselected points.

          Even most of the advanced aircraft autopilots will not avoid a white truck flying through the air and stopping across your path.

        2. fearnothing

          Re: Really??

          As I suggested to my colleagues, 'Supercruise'

    4. My Alter Ego

      Re: Really??

      That was my thought when I read that quote. From what I know, the circumstances were that the side of the trailer was white and blended into the bright sky - something that can also happen very easily to humans. Anyone who's driven towards a low sun (especially during winter with wet roads) will know what it's like to be overpowered by the glare. The M40/A34 junction at Bicester was a prime example - during the winter there was almost a daily accident until they installed the traffic lights.

      This will sound awfully cold (and is no consolation to relatives), but autonomous driving will always be responsible for deaths. The question is, is it safer than us meat bags? And according to Tesla (who are not exactly unbiased), it is.

      1. cream wobbly

        Re: Really??

        "but autonomous driving will always be responsible for deaths"

        Ahem. That case had nothing to do with autonomous driving. Rephrased, then: the driver who engaged cruise control and then took his eyes off the road to enjoy a movie was responsible for his own death.

    5. yoav_hollander

      Re: Really??

      Just to clarify, when I said the JB case was “arguably one such example”, I did not mean it in the sense of “people never drive into visible obstacles” – clearly they sometimes do. I meant it in the sense that autonomous vehicles bring with them new failure modes – in this case the failure mode of users trusting that the machine can do more than it was actually designed to do.

      That was in the context of "expected vs. unexpected bugs". Sorry if that was unclear.

  3. Anonymous Coward
    Anonymous Coward

    What did for Toyota...

    Their code was not shown to be defective or to fail. What they failed to do was follow a standard, adopt "best practice" or be able to provide adequate evidence to support any claim that they had done so.

    Courts accept that systems will fail. Being able to prove you have done your best to prevent accidents makes the difference between "only" paying compensation or punitive damages and penalties.

  4. Anonymous Coward
    Anonymous Coward

    Adaptive AI

    Should they have to go through "I still know what this is" tests once a year / whenever they boot?

    E.g. holding up picture of cat results in "that's a cat". Holding up picture of the side of a truck results in "I see a road"...

    1. Anonymous Coward
      Anonymous Coward

      Re: Adaptive AI

      NOT hotdog.

  5. Doctor Syntax Silver badge

    "People are not hiring from among the ranks of the airline safety industry."

    Of course not. They'd just fire them for being "unhelpful", "obstructive" or whatever other term comes to hand* when the techies point out the gap between company policy and reality.

    *"Sneering" is just the latest.

  6. Dr Stephen Jones

    What?

    “Neural networks train themselves, and this might appear to remove the possibility of human error.”

    They don’t and it doesn’t.

  7. Warm Braw

    The solution is to modularise neural networks

    The solution is probably to bury a wire in the road - it seems a bizarre idea to want to replace human drivers with machines but leave in place the infrastructure that machines struggle to process.

    However, it is interesting that we seem prepared to accept a much greater degree of carnage provided it originates from people like ourselves. Speed limits, seat belts and alcohol-testing were all the subject of strong opposition despite burgeoning road fatalities. Yet if a professional driver causes an accident, there is an outcry. And as for a machine... Autonomous vehicles will have as much trouble negotiating the double standards as they will the road network.

    1. Chris G

      Re: The solution is to modularise neural networks

      Presumably, various of the NN modules will require some part of other modules' data at times; I can't wait to see what kind of conflicts arise from that.

  8. Anonymous Coward
    Anonymous Coward

    Another annoying trend...

    ...is the increasing tendency to mask actual hardware issues by issuing a software update.

  9. Nick Z

    Software testing is the key to knowing whether it works or not

    I'd say that automated testing of software functionality is the key to making sure that it works as intended.

    There is such a thing as test-driven development, where you write a test before you even write any code to make the program pass that test. And of course, all of these tests stay with the program, so that every time you make a change you run them again to make sure you haven't broken anything that was working before.

    This is the direction software development needs to go. Because you can artificially create very rare program states that seldom happen in real life. And you can run it in such a rare state repeatedly, until you iron out all the bugs. This way, rare states become as common as any other states for development purposes.
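
    To make the idea concrete, a toy sketch (entirely hypothetical, nothing to do with any real vehicle code) - the tests exist first and act as the specification:

        import unittest

        def safe_following_distance(speed_mph):
            # written only after the tests below existed and were failing
            return max(2.0, speed_mph * 0.1)   # crude "0.1 m per mph", floor of 2 m

        class TestFollowingDistance(unittest.TestCase):
            def test_never_below_two_metres(self):
                for speed in (0, 5, 30, 70):
                    self.assertGreaterEqual(safe_following_distance(speed), 2.0)

            def test_scales_with_speed(self):
                self.assertGreater(safe_following_distance(70),
                                   safe_following_distance(30))

        if __name__ == "__main__":
            unittest.main()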

    Test-driven development is actually how automated neural networks create their programs. But there is no good reason why it needs to be completely automated and left to the machines. Human developers should write tests for neural networks to increase their testing above and beyond what they do on their own automatically.

    Neural networks require a new discipline in software development. Which is writing automated tests for such networks to make sure they perform as expected.

    1. Outer mongolian custard monster from outer space (honest)

      Re: Software testing is the key to knowing whether it works or not

      That's interesting, but in short, it'll be transferring the primary source of bugs from the coder to the person who devises the unit test harness? So not much of a long-term final answer to the issue at all really.

      I see packages that are shipped when they pass a test harness each release, and every release new and interesting bugs are found that the test harness doesn't cover. Really a test harness will only detect things you already know about and have fixed popping back up into your code base.

      See the commentard earlier who mentioned that some of the systems failing and killing people were working as designed; it's just that the initial design wasn't sufficient in scope or definition to catch the oopsie that led to the accident.

      1. Nick Z

        Re: Thats interesting but in short...

        Testing is the basis of all science. That's what the scientific method is all about. Science itself is a type of test-driven development.

        And that's why I say that testing should be the basis of software development. Because otherwise you end up with a hodge-podge of some science mixed in with a lot of beliefs, superstitions, and ignorance.

        Computer programs are a reflection of people's minds. And it's important to remember that people have a long history of all kinds of superstitions, mistaken beliefs, and ignorance. The only thing that has helped people overcome such a state of being is science and its method of thorough testing before accepting any assumption or belief.

        1. Bronek Kozicki
          Joke

          Re: Thats interesting but in short...

          Testing is great, but don't you dare automate testing, because that takes you towards TDD and agile, away from the sacred lands of waterfall.

        2. tom dial Silver badge

          Re: Thats interesting but in short...

          It is beyond reasonable doubt that testing, and designing/writing the tests before code delivery (and by different people), is a Good Thing.

          Still, the tests will be designed and implemented by people who, generally speaking, are at least as imperfect as the software designer and coders, and inevitably will overlook things. That will lead to occasional misbehavior of machinery the software controls, and if the machinery is a software controlled car or truck, highway accidents.

          The quest for perfection is good, but we had best recognize that it probably is futile, and that the real question is whether these automatic vehicles will produce a lower accident rate than the human controlled ones they will replace. So far, it seems likely enough that they will.

          1. annodomini2

            Re: Thats interesting but in short...

            Whether it's testing or just development, all the behaviour of the system will be defined by requirements.

            If these requirements are incomplete or inadequate, you can test till the cows come home, but it won't find the bugs that fall outside them.

            The issue with Level 4/5 autonomy is the basic functionality is fairly simple, but the number of edge and corner cases out in the real world is huge!!! Millions upon Millions, only so many can be catered for and so there will be gaps in your requirements and testing will not cover these scenarios.

            The use of Neural Networks is an attempt to cover for the unexpected, but these systems will have limitations and we won't know what they are until we use them.

    2. Doctor Syntax Silver badge

      Re: Software testing is the key to knowing whether it works or not

      "There is such a thing as test-driven development, where you write a test, before you even write any code to make the program pass this test. And of course, all of these tests stay in the program, so that every time you make a change in the program, then you run these tests again to make sure that you haven't broken anything that was working before."

      It's a solution to Brooks' definition problem: is the product defined by the document or by an actual example? It can be defined by the tests instead. Brooks' example which posed the problem was the first shipped model of the 360, which left undocumented data in registers after an operation; developers started writing code that used that data, so future models had to behave in the same way - which wasn't intended. If the tests define the product then what the test says happens has to happen, but anything the test doesn't cover has to be taken as undefined.
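
      A toy illustration (made-up example): two routines that both pass the only test anyone wrote, yet disagree the moment you step outside it - everything the test doesn't pin down is undefined.

          def abs_v1(x):
              return x if x >= 0 else -x

          def abs_v2(x):
              return (x * x) ** 0.5      # passes the same test, but returns floats and overflows on huge ints

          def test_abs():
              assert abs_v1(-3) == 3     # this assertion is the entire "specification"
              assert abs_v2(-3) == 3

          test_abs()                                  # both implementations pass
          print(type(abs_v1(7)), type(abs_v2(7)))     # <class 'int'> <class 'float'>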

      Which brings us to the problem of test-driven development for critical stuff: how do you know the set of tests is complete and correct?

  10. SVV

    Autonomous vehicle software

    When it crashes, so does your car.

    Personally, having worked in software development for years, I'd rather be in a car with a drunk driver than a self-driving one. I'd consider it safer, no matter what Elon "why do we have to keep reading about this guy's nonsense schemes" Musk says (or possibly because of it).

  11. Lysenko

    First person?

    Joshua Brown became the first confirmed person to die in a crash where the car was, at least in part, driving itself.

    I think not. Cars have been "in part, driving themselves" since cruise control and automatic transmission were invented, and there have been fatal accidents attributable to such systems. In this case the cruise control might have been able to disengage itself but failed to do so - that's a big step up from being incapable of disengaging itself, ignoring manual override, or simply autonomously accelerating.

  12. Chris G

    Here's a thought

    We could use special training techniques on certain individuals; when they complete the course we could call them 'a driver'. Then include regular update training to allow for changing traffic conditions, and retest for ability and driving safety.

    You would not need any special programming or even need to input a destination, you could stop and look at the view on a whim, it would be a kind of automotive freedom.

    Alternatively you could buy a piece of not-ready-for-use marketing hype called an 'autonomous vehicle'.

    1. Throatwarbler Mangrove Silver badge
      FAIL

      Re: Here's a thought

      It all sounds good, but your cunning plan has a demonstrably high failure rate, and the so-called "driver" wetware is typically kept in service long past the point that it remains safe.

  13. /dev/null

    Can't see it ever happening...

    ...until you can trust a self-driving car not to say "you have control", when it decides it has no idea what is going on and you're 2 seconds away from colliding with something. And if you can't trust it not to do that, then you might as well drive the damn thing yourself.

    1. Anonymous Coward
      Anonymous Coward

      Re: Can't see it ever happening...

      Personally I'd prefer it hit the brakes 2 seconds away from the collision rather than spending those 2 seconds telling me I have control.

  14. Jonathan Richards 1
    Big Brother

    Who owns the camera feed?

    from TFA:

    > In theory, the more miles autonomous cars clock up, the more data they will have to learn by, and the safer they will be.

    I want to know if there will be a record of the autonomous driving sensor feeds, and what will happen to them. I think the answer to the first part is almost certain to be "yes", since otherwise there will be nothing to help with crash investigations.

    If the answer to the second part is "they're streamed or uploaded to Google | Tesla | Uber | Dept for Transport | ... " to assist with autonomous car development, then I'm much less happy.

    FWIW, I can't see myself ever driving (or giving control to) an autonomous vehicle, and I don't look forward to sharing the road with them.

  15. nagyeger
    Facepalm

    could set off on the right hand side of the road

    Been there, done that.

    It comes from just having spent ages driving on the wrong side, and thinking "Oh great, I'm home now, and can relax."

    Fortunately I was on my bicycle, so while I and the oncoming car were semi-shocked into a state of utter confusion about what on earth the other was doing on the wrong side of the road, it wasn't too hard for him to actually avoid me.

    1. Seajay#

      Re: could set off on the right hand side of the road

      I've done that too. Drove perfectly happily abroad then came home and pulled away from a t-junction on the wrong side of the road.

      It's odd that it's used as an example of an "autonomous vehicle only" bug when it's such a common thing for meatsacks to do.

  16. Anonymous Coward
    Anonymous Coward

    Having worked in automotive software for an independent company, we found it very difficult to win bids with OEMs that included safety-critical software. We found on the whole that they would find another company willing to do the job for 1/10 of the price, but one which (as we found out on several occasions) skimped on the software quality. The OEMs appear not willing to pay for high quality code.

  17. Throatwarbler Mangrove Silver badge
    Holmes

    I have the solution

    Autonomous vehicle companies should recruit exclusively from the ranks of Register commentards, who, based on the contents of their commentary, never make programming blunders, are expert at all kinds of programming, and have a flawless knowledge of business execution as well. Problem solved!

    1. Doctor Syntax Silver badge

      Re: I have the solution

      " based on the contents of their commentary, never make programming blunders, are expert at all kinds of programming, and have a flawless knowledge of business execution as well."

      On the contrary, we're well acquainted with what can go wrong. That's why at least some of us hope not to ever find our lives entrusted to autonomous road vehicles.

    2. Stoneshop
      Pirate

      Re: I have the solution

      and have a flawless knowledge of business execution as well.

      Yup. Guillotine, AK47, Browning machine gun if you want to take out the entire C*O bunch in one go.

  18. Mark 85

    Car manufacturers contacted by The Reg were unwilling to talk.

    A Reg request for clarification from Tesla went unanswered.

    I find these two bits from the story to be very troubling.

  19. Anonymous Coward
    Anonymous Coward

    There are techniques to greatly reduce the bugs in code

    They are used in many life critical industries. However, car companies like Tesla et al who make unrealistic promises about how soon autonomous vehicles will be available want to race to the finish so they can begin making money off them. I guess they figure all the extra profit from being early to market will pay for a lot of high powered lawyers to get them out of having to pay wrongful death judgments.

    While this can't completely eliminate bugs, at least we wouldn't have to live with several million bugs in an autonomous vehicle!

    As for neural networks, I think trusting that for a life critical system is a terrible idea. At least with traditional programming you know what the code does, you can do coverage tests and formally verify critical sections. With a neural network you don't really know what it is doing, so insuring it will act appropriately in a given situation is difficult at best.

  20. Doctor Syntax Silver badge

    "They are used in many life critical industries"

    Even so, are they used in a situation even an order of magnitude less complex than what an autonomous vehicle can encounter?
