Cops: Autonomous Uber driver may have been streaming The Voice before death crash

A woman in the driving seat of an autonomous Uber that hit and killed a pedestrian was likely streaming an episode of telly show The Voice on her phone immediately before the collision, according to reports. State police in Tempe, Arizona, reckoned the crash, which killed 49-year-old Elaine Herzberg, was "entirely avoidable" …

  1. Paul Crawford Silver badge

    What the "safety driver" was (or was not) doing is far less important than the safety/emergency brake system being turned off. Looking to blame them for the crash in some way is utterly missing the point:

    1) The car should have detected and stopped for anyone in its path. To miss an adult & bike is an utter travesty of the system's claimed suitability for real-world use.

    2) At what point did they test the car/configuration in a test range with typical objects?

    3) Most believed the driver was there to take over if the car gave ample warning of a fault or situation it could not handle.

    4) As seen in the Air France crash, even skilled pilots with minutes to react have real difficulties in taking over from an autopilot when conditions are too difficult for its capabilities.

    1. big_D Silver badge

      Yes and no. At the end of the day, the systems are still being tested, so the safety driver should be able to take over if they note the car isn't reacting properly (i.e. hasn't seen an obstacle). If you are just waiting for the vehicle to say it hasn't seen something, you'll be waiting until after the thump, as seen here.

      Also, if the vehicle says it can't cope, you need to already be aware of what is going on around you. You can't be distracted, concentrating on something else and expect to react in an emergency.

      The driver was paid to do one job and she was loafing off when it counted...

      I agree with you about Uber having turned off the on-board safety systems, but she was the driver, so it was still her responsibility to react if the car obviously wasn't.

    2. Jellied Eel Silver badge

      What the "safety driver" was (or was not) doing is far less important than the safety/emergency brake system being turned off. Looking to blame them for the crash in some way is utterly missing the point:

      I don't think so. The person in the driver's seat was in charge of the vehicle that killed a person. Especially if that's the same person that also turned off the safety feature. If someone else did that, then as long as the driver knew, the driver's still at fault.

      There are also wider issues given this was a test vehicle: even if the auto-brake had been enabled but failed to respond, the driver should still have been able to avoid the collision and then log the incident/defect. If features are being disabled, then test results may not be reliable and mistaken assumptions may be made about autonomous safety/driving features.

      Instead, someone died to demonstrate that drivers should not be watching TV whilst driving. Which drivers should know, and the driver should be prosecuted for vehicular homicide given they appear to have been grossly negligent.

      1. John Brown (no body) Silver badge

        "I don't think so. The person in the driver's seat was in charge of the vehicle that killed a person."

        Under normal law, yes, that's true. Anyone know what the new laws passed to allow these vehicles on the road actually say?

    3. jmch Silver badge
      FAIL

      The safety driver's job is... well, the clue is in the title isn't it?

      She was responsible for safety, she should have been looking at the road, AND she was specifically prohibited from using mobile devices by company protocol (not to mention probably also local law).

      If anyone runs over and kills someone because they were looking at their phone, I would expect them at the least to face manslaughter charges. Whether she thought the car would handle its own driving is pretty irrelevant.

      1. Robert Helpmann??
        Childcatcher

        You had just one job!

        The safety driver's job is... well, the clue is in the title isn't it?

        To which I reply "autopilot".

        All joking aside, simply based on what I have seen and read concerning this very tragic case I think charges should be brought. I am also interested in some of the back-story. Did Uber review tapes after each run and did this or any other driver behave in a similar manner? Uber has a long history of shirking their legal and social obligations, so it seems reasonable to look into their role in this rather than accept their shifting all blame onto the so-called safety driver alone.

        1. JimC

          Re: You had just one job!

          On what we've heard here it certainly sounds as if the driver bears a heavy responsibility, but the underlying culture needs looking at too.

          This being Uber, it's a fair bet their 'safety drivers' are minimum wage peons, when actually they need to be highly skilled with very fast reactions because their job is to evaluate when action is needed and take it as late as possible.

          Similarly the authorities will need to know if the drivers are just told to go out and sit there, or whether they are specifically briefed before each journey with a test plan.

          And yes, as in the above, the authorities will also want to look at to what extent Uber monitor that their safety drivers are doing exactly that, and not sitting there watching videos. Bearing in mind the demands of the role it would not be inappropriate for the drivers to be banned from taking phones with them. It's hard to imagine an aviation test pilot having their phone with them!

        2. Dazed and Confused

          Re: To which I reply "autopilot".

          The safety driver's job is... well, the clue is in the title isn't it?

          To which I reply "autopilot".

          To my mind this is the problem with the Tesla autopilot. They say that the human driver is responsible for safety and is supposed to step in and take control when they notice the automatics not doing the right thing. Well, this tragic case highlights a total failure to understand human nature. Here there was a "test driver" specifically tasked with monitoring a prototype out on the road, and yet they still couldn't keep their mind on what was happening. What chance is there that your average customer is going to be paying enough attention to make split-second decisions and override the vehicle's automatics?

    4. This post has been deleted by its author

      1. Alistair
        Windows

        Re: Headlights

        @BIIIG re headlights

        The forward facing videos from the Uber vehicle are tuned for *daylight* and do not accurately reflect the lighting in place. i.e. the videos give one an impression that the area is dark and bleak and the headlights are weak. Trust me on this one, Volvo XC90 headlights are not weak, and have been known to fully illuminate my bedroom when the dink that lives around the curve leaves at 3 in the morning.

      2. Anonymous Coward
        Anonymous Coward

        Re: Headlights

        > If I recall the video properly, it seems to be the car's headlights were very poorly aimed. They appeared to be pointed down, and narrowed inside, so as to almost give a tunnelvision field of view.

        And you would still operate a vehicle in such conditions?

    5. Anonymous Coward
      Anonymous Coward

      I thought it's already established the LIDAR, RADAR and Collision Detection systems on the Volvo were functional and HAD detected an object in the road ahead that it knew it was going to collide with at least 6 seconds before the impact?

      The problem was Uber disconnecting the onboard HUD safety warning system so the driver would've had no indication the object ahead had correctly been identified as 'Bicycle' and that the Volvo itself was unable to apply the braking system.

      1. Lee D Silver badge

        The Volvo wouldn't have emergency-braked until 1.something seconds from impact anyway, which means a very late decision at 40+mph (18m/s, so 18m of travel)

        "In Uber’s self-driving system, an emergency brake maneuver refers to a deceleration greater than 6.5 meters per second squared (m/s^2)"

        At best, it would have hit her at 11.5 m/s or thereabouts (24mph), even with everything turned on.

        It also detected her as a vehicle, a bike and a pedestrian within six seconds, each with different predicted paths / speed, so it wouldn't have necessarily applied any other braking / evasion until the inevitable was detected.

        Chances are it still would have hit her, it just may not have been instantaneous death.

        https://www.ntsb.gov/investigations/AccidentReports/Reports/HWY18MH010-prelim.pdf
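        As a rough illustration only, here is a minimal Python sketch of the arithmetic above, using the ~18 m/s (40+ mph) speed and the 6.5 m/s² emergency-braking figure quoted from the NTSB preliminary report. The one-second braking window, the helper name impact_speed and the constant-deceleration model are assumptions made for the sketch, not figures from the report.

        import math

        MPH_PER_MS = 2.23694  # 1 m/s expressed in mph

        def impact_speed(v0=18.0, decel=6.5, warning_time=1.0):
            """Speed (m/s) at the pedestrian's position if braking starts
            warning_time seconds before a constant-speed impact."""
            gap = v0 * warning_time              # distance remaining when braking begins
            v_sq = v0 ** 2 - 2 * decel * gap     # v^2 = v0^2 - 2*a*s
            return math.sqrt(v_sq) if v_sq > 0 else 0.0

        v = impact_speed()
        print(f"impact at ~{v:.1f} m/s ({v * MPH_PER_MS:.0f} mph)")
        # The cruder v0 - a*t estimate gives 18 - 6.5 = 11.5 m/s, the figure above;
        # accounting for the shrinking gap during braking gives a slightly lower speed.

        Either way the sketch supports the point above: with the quoted braking threshold the impact would have been softened, not avoided.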

        1. John Robson Silver badge

          "Chances are it still would have hit her, it just may not have been instantaneous death."

          Aren't the stats something like an 80% fatality rate at 30mph, and 80% survival at 20mph?

          Probably a fraction better than that given modern car frontal design for pedestrian impact mitigation (I refuse to call it safety).

          40mph is almost always going to be in instant-fatality territory, except in rare circumstances.

          20mph is nowhere near that - and getting that much braking done is massively valuable.

        2. John H Woods Silver badge

          At best...

          "At best, it would have hit her at 11.5 m/s or thereabouts (24mph)." - Lee D

          Firstly, any decent car can brake at 1g. An XC90 has a 100km/h stopping distance of 36m. So, a = v²/(2s) gives us a deceleration of 10.7m/s² or 1.1g

          (Surely the car deceleration increases as the velocity drops, so this is a minimum.)

          A single second (1.0s) of full braking could therefore reduce the impact speed from 40mph to 16mph. It's still gonna hurt, but it's an order of magnitude less likely to be fatal. Even if your calcs were right and you can only get down to 24mph, it must be at least 5x more likely an adult would survive such an impact than a full 40mph impact.

          Note also that you only need another ~0.5 seconds to avoid the impact altogether.
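          For what it's worth, a small Python sketch (illustrative only) of the arithmetic above: the deceleration implied by the quoted 100 km/h / 36 m stopping distance, the speed left after one second of full braking from 40 mph, and the time needed to stop completely. The variable names are invented for the sketch.

          G = 9.81                       # m/s^2
          MPH_PER_MS = 2.23694           # 1 m/s expressed in mph

          v100 = 100 / 3.6               # 100 km/h in m/s (~27.8)
          s = 36.0                       # quoted stopping distance, m
          decel = v100 ** 2 / (2 * s)    # a = v^2 / (2s) -> ~10.7 m/s^2, ~1.1 g

          v40 = 40 / MPH_PER_MS          # 40 mph in m/s (~17.9)
          v_after_1s = max(v40 - decel * 1.0, 0.0)   # ~7.2 m/s, ~16 mph
          t_to_stop = v40 / decel                    # ~1.7 s for a complete stop

          print(f"decel ~{decel:.1f} m/s^2 ({decel / G:.1f} g)")
          print(f"after 1 s of braking: ~{v_after_1s * MPH_PER_MS:.0f} mph")
          print(f"full stop from 40 mph needs ~{t_to_stop:.1f} s of braking")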

          1. DavCrav

            Re: At best...

            "Firstly, and decent car can brake at 1g. An XC90 has a 100km/h stopping distance of 36m. So, a=v²/2s gives us a deceleration of 10.7m/s² or 1.1g

            (Surely the car deceleration increases as the velocity drops, so this is a minimum.)"

            Brakes dump energy out of the system at roughly a constant rate. Since energy is proportional to the square of the velocity, deceleration increases as the car slows. Almost all of the deceleration happens right at the end, which is why you only get thrown forward right at the end of emergency braking.

          2. zaax

            Re: At best...

            Also, with a normal driver/car the driver would have swerved, which might have meant missing the cyclist entirely.

      2. Jellied Eel Silver badge

        I thought it's already established the LIDAR, RADAR and Collision Detection systems on the Volvo were functional and HAD detected an object in the road ahead that it knew it was going to collide with at least 6 seconds before the impact?

        From reading the NTSB report, yep. It had tracked and classified the threat, narrowing down the probability of interception in those 6 seconds. The way Uber had it set up, it wouldn't have warned the driver, but the data were logged. Again if the driver knew that, then they should have known to pay attention. I think the onboard camera footage is a bit misleading as it doesn't really show the driver's FOV. It does show the driver's inattention though.

        (And being Friday, I wonder how long it'll take to get an augmented reality feature. Pedestrian, 5 points. Cycle, 10. Fire truck, Game Over.)

    6. Anonymous Coward
      Anonymous Coward

      > As seen in the Air France crash

      Let us not go there. Your experience of autonomous driving is most likely zero and your flying experience definitely is.

      When was the last time you were exposed to the cognitive, situational and somatic environment of the en-route phase of a transoceanic crossing at night in an aeroplane cockpit as flight crew?

    7. not.known@this.address

      As seen in *which* Air France crash?

      The one where the autopilot incorrectly interpreted an approach path into a forest as an approach path into an airport?

      The one where an airliner stalled because the pitot tubes iced over, the pilots failed to understand how slowly they were flying and the autopilot handed control over to the meatsacks because it thought they had a better idea of what was going on?

      The one where they lost a Concorde, crew and passengers because the accountants refused to fit fuel tank liners?

      There is no excuse for not paying attention to the road if you are driving, or even if you are "only" providing a safety backup to an automated system. But if you really think automated systems *always* stopping if an obstacle doesn't move is a good idea, I hope you never meet a Bad Guy standing in the middle of the road in the middle of the night...

      1. Anonymous Coward
        Anonymous Coward

        Re: As seen in *which* Air France crash?

        I don't know how you managed to get the causes to all three accidents so completely wrong when the accident reports are public for all of us to read (and learn from). Head over to the BEA site for the relevant documents.

  2. DavCrav

    "The report has been submitted to prosecutors, who will now decide if Vasquez should face charges."

    My guess: yes.

    1. Paul Crawford Silver badge

      Sadly they might, but really it should be whoever turned off the safety brake system and those who approved the change (or had failed to implement a change approval system).

      As software reaches the point of actually and visibly killing people, those developing and testing it need to be held to the same standards that anyone designing a bridge, etc, would be.

      1. Anonymous Coward Silver badge

        Charging the driver doesn't preclude them from charging the company/developers also.

        Hopefully they will charge both parties.

        1. stiine Silver badge

          Not necessarily

          That would depend on whether the fact that the safety/braking system had been disabled had been thoroughly communicated to the 'driver'. If it was, then if she was indeed watching TV on her phone, she should be charged, in my opinion.

      2. Steve the Cynic

        As software reaches the point of actually and visibly killing people

        You are sadly behind the times. Bugs in the firmware of the Therac-25 killed three people in the mid-80s, more than thirty years ago.

  3. Anonymous South African Coward Bronze badge

    And that's why, once you're in a car and driving, your mobile/cellphone/phablet should be stowed away and be left alone for the duration of the trip.

    1. Jason Bloomberg Silver badge

      Except in this case she wasn't "driving". She was a sack of meat in the driver's seat and probably as bored shitless as many passengers become when being driven.

      Maybe she should have been more attentive but you can't just throw a driver in a driver seat doing nothing for hours on end and expect them to react as if they had been driving for all that time.

      That is an unreasonable expectation to start with and did Uber even test her attentiveness before putting her in that situation? Was the true situation explained to her or was she misled into believing she simply had to sit there and everything would be fine, that her responsibilities were less than they turned out to be?

      There are a lot of unanswered questions but it seems to me there is demonisation and a witch hunt against this 'driver' with some people determined to have her held to blame for the death for various agenda reasons.

      1. Jellied Eel Silver badge

        There are a lot of unanswered questions but it seems to me there is demonisation and a witch hunt against this 'driver' with some people determined to have her held to blame for the death for various agenda reasons.

        My agenda's really simple. If you're the person sitting on the left (US/LHD), you are responsible for that vehicle. AFAIK, that's still the current legal position as well. A person died because the driver wasn't paying attention.

      2. JimC

        > but you can't just throw

        > a driver in a driver seat doing nothing for hours on end and expect them to react as if they had been

        > driving for all that time. That is an unreasonable expectation to start with

        It's a perfectly reasonable expectation if they are trained and skilled and paid accordingly. We expect exactly that of pilots.

        Now I think about it the safety driver should probably be recording a continuous commentary of what's happening on the road and how well the car is reacting to it, which would both keep them alert and provide more feedback on the system. That would require someone who can actually produce a useful evaluation though.

        However I imagine what Uber are doing is employing minimum wage peons, and recording video from the car and only examining special events back in the office.

        1. Anonymous Coward
          Anonymous Coward

          Re: > but you can't just throw

          > Now I think about it the safety driver should probably be recording a continuous commentary of what's happening on the road

          That is exactly what we are trained to do in a multicrew cockpit and, more to the point, what the quarterly driving tests looked like at a former employer of mine. It is impressive how far ahead you can predict things in a car when you are properly trained.

          PS: We used to find it amusing to compare ourselves to fighter jocks. If you watch video of a fighter pilot you will notice how their heads are constantly bobbing. They are not falling asleep. They are continuously scanning the environment around (and inside) the aircraft for threats and anomalies (and there are physiological reasons why we need to rotate our heads and not just our eyes). We were trained to do the same thing on any vehicle.

      3. Anonymous Coward
        Anonymous Coward

        > Except in this case she wasn't "driving"

        One thing I am not comfortable with is the PR approach to autonomous / automatic driving, which seems to be based on science fiction stories of sentient machines rather than on our actual state of the art in transportation.

        A ship's Captain is almost never on the bridge, but he has ultimate authority and responsibility for the ship. Likewise, an officer of the watch (or a helmsman) may or may not actually be at the helm, but he is still responsible for ensuring safe passage.

        In the air, the pilots operate (and are of course responsible for) the aircraft regardless of the level of automation being used at any given time, which particularly on airliners is usually very high. What that does is free up your limited cognitive capabilities to do what we do best: plan and make decisions based on incomplete inputs.

        I don't know much about trains, but I believe it is a similar thing.

        For all the above we require years of intensive training, recent experience and frequent testing.

        So why would you then put a random anybody whose sole qualification is having a driving licence (which besides, in the States does not even prove that you can drive) to operate a complex, experimental system?

        To my knowledge, Google have used specifically trained engineers for this and so do Audi. There is clearly a right way and a wrong way to go about it.

        1. Stevie

          which besides, in the States does not even prove that you can drive

          I was with you 100% until you had to add that last dig.

          (Pauses to sigh deeply at once again having to explain this to people who have the word "engineer" in their job title and who have pieces of paper from places of Higher Education that say they are certified as clever)

          The matter of licensing people to drive is a STATE law thing, not a universal FEDERAL standard.

          Here in New York the test is demanding enough (as demanding as the test I went through in England before some of you were born) requiring a written test and driving about in a car test. So demanding in fact that the people you are obviously referencing don't attempt to take one, which is another problem for another thread. I imagine you have the same kinds of idiot in the UK though.

          If you've taken the test on Long Island and passed it, you are as able to drive as someone who has done the same thing in, say, Hatfield. i.e. It's a crapshoot. Some people are good at aiming a car, some aren't.

          But if you can tamp down on the yank-baiting, I think you might have a cogent argument with much merit. Pity you didn't feel invested enough to add a more personalized name to it.

          1. psdrake67

            Re: which besides, in the States does not even prove that you can drive

            I'm all for calling out gratuitous Yank-baiting, but it wasn't. You pointed out that driving tests in some areas are suitably rigorous, and not in others. In doing so you have made the point for the poster when they said that a drivers license in the US does not prove you can drive. You may be able to, but since there are different standards in different places, and some places lacking complex roads, and licenses are generally portable when moving from state to state, the basic point is made.

            In Nunavut there are no divided highways and no roads connecting to the rest of the continent. Heck, you could fit three Texas-sized states into Nunavut but the longest road is 13 miles. Get a license there and then drive a rental car all over the rest of North America with that Nunavut license.

          2. Anonymous Coward
            Anonymous Coward

            Re: which besides, in the States does not even prove that you can drive

            > Here in New York the test is demanding enough

            Yup. Now try taking the "test" in Tennessee.

  4. steviebuk Silver badge

    Lets...

    ...blame the person and ignore that Uber turned off the car's own braking system and put a car on the road that clearly, from the video footage, wasn't ready, considering how dark the image was.

    Yes, she is also at fault for not paying attention but, again, from the footage, she wouldn't have been able to stop in time herself. From the footage, the woman that was hit came out of the darkness. If it was that dark to the naked eye, then the safety driver wouldn't have been able to stop in time. The main point is, the Uber system surely should be able to see in the dark. It should have seen her way ahead of time so been able to break in time. It was clearly not fit for purpose. I think to save themselves now, they are looking to try to squarely blame the safety driver.

    1. Hans Neeson-Bumpsadese Silver badge

      Re: Lets...

      the woman that was hit came out of the darkness. If it was that dark to the naked eye, then the safety driver wouldn't have been able to stop in time

      From what I've read elsewhere, it was nowhere near as dark as the camera footage would have you believe. Think dual carriageway with regular lamp standards, not some unlit country road.

    2. iron Silver badge

      Re: Lets...

      LIDAR does not need light to see. The car had no problems spotting the poor victim well before it hit her.

      1. Dave 126 Silver badge

        Re: Lets...

        > Uber... ..put a car on the road that clearly, from the video footage, wasn't ready.

        Hence the testing with a human driver. How else can a car be made 'ready' for the road?

        If one wants to find an organisational failing, it might be in the area of psychology - i.e. not putting safeguards in place to ensure the human driver is fully engaged with the job in hand. Perhaps requiring them to give a running commentary on the road situation, as is done in training for advanced driving licences (police drivers etc). If this police report is verified, then the human driver wasn't expecting her employers to routinely review video of her eyes.

        The vast majority of vehicles on the road today have human drivers and no Volvo-style automatic braking.

        1. Stoneshop

          Re: Lets...

          If one wants to find an organisational failing, it might be in the area of psychology - i.e. not putting safeguards in place to ensure the human driver is fully engaged with the job in hand. Perhaps requiring them to give a running commentary on the road situation, as is done in training for advanced driving licences (police drivers etc).

          In Japan train drivers are required to acknowledge all signals they pass by pointing at them; this is being recorded and reviewed. And yes, this is on top of Automatic Train Control, which would slow or halt the train anyway if the driver ignored the signal.

        2. Anonymous Coward
          Anonymous Coward

          Re: Lets...

          > as is done in training for advanced driving licences (police drivers etc)

          Correct but you may want to point out, UK / EU / AUS / NZ police drivers.

          I do not think the Merkins get any particular training aside perhaps of the right way to hold the donuts and how to don the obligatory Ray-Bans so you can look "I'm-the-law" sort of tough.

    3. Anonymous Coward Silver badge
      Joke

      Re: Lets...

      >"so been able to break in time"

      It was already broken. What it didn't do was brake.

  5. Anonymous Coward
    Anonymous Coward

    Was the safety driver told the braking system had been degraded ?

    1. DiViDeD

      Re: Was the safety driver told the braking system had been degraded ?

      I imagine the only thing ever said to the 'safety driver' was "How would you like to earn 50 bucks? Easy money, just sit in the car for a couple of hours"

  6. Alister

    So in what way, then, was this car "autonomous", if the "safety driver" was supposed to be watching the road and liable to have to take over at any second?

    As far as I can see, the combination of Uber vehicle and safety driver were incapable of avoiding an accident, in conditions where a full time driver in a "normal" car would have avoided it.

    Indeed a normal production car with autobrake and collision avoidance (ie. what the Volvo should have been capable of) could have possibly avoided the accident without driver intervention, which makes the standard production Volvo more of an automaton than the prototype Uber vehicle.

    1. John Robson Silver badge

      "So in what way, then, was this car "autonomous", If the "safety driver" was supposed to be watching the road and liable to have to take over at any second."

      In the same way that anything autonomous is when under development and test.

      You don't do a first iteration (or a second, or a hundredth) and say 'that's autonomous' and just let it out. You watch it, test it, monitor it, evaluate it...

      That's the stage they're at. They just didn't have processes in place to ensure that the monitoring actually happened...

  7. Sureo

    If you give an unmotivated person a lengthy crushingly boring job requiring constant attention, don't be surprised at the result.

    1. 2Nick3

      "If you give an unmotivated person a lengthy crushingly boring job requiring constant attention, don't be surprised at the result."

      Getting a paycheck should be motivation enough to do your job. Should be, but obviously wasn't.

      1. Anonymous Coward
        Anonymous Coward

        There is a slight difference between the consequences of an un-motivated burger-flipper not paying attention and those of putting a ton or two of metal, capable of mayhem (anybody notice that terrorists are using vehicles these days?), with less-than-fully-tested software, in the hands of someone earning minimum wage.

        I'm sorry, I don't care to be part of this QA cycle vicariously. There are too many companies trying to be first-to-market in this space; I have serious questions about the calculus they are using in determining how 'safe' their offerings are, and what risk mitigation actions they need to be taking.

      2. Paul Crawford Silver badge

        @ 2Nick3

        Just how much was she being paid?

        What degree of training did she have on the capabilities (or otherwise) of the "automated driving" system?

        1. 2Nick3

          Re: @ 2Nick3

          "Just how much was she being paid?"

          Enough for her to accept the job.

          "What degree of training did she have on the capabilities (or otherwise) of the "automated driving" system?"

          That's a good question. While she took the job, we don't know what training was involved or what expectations were set, but watching a TV show on your phone while on the clock is probably not in the 'acceptable' column. The title is "Safety Driver," not "Safety Passenger."

          If you walk into McDonalds and the person there to take your order is playing Angry Birds and is very slow to respond to you being there, how do you react? In a retail store, you come to the cashier and they are engrossed in texting and don't notice you, what goes through your mind?

          I doubt you're thinking "They're doing a great job."

  8. Little Mouse

    "The crash has caused much soul-searching in the emerging self-driving industry"

    Have they found one yet?

    Badum-TSH!

    1. Jellied Eel Silver badge

      Re: "The crash has caused much soul-searching in the emerging self-driving industry"

      Yes! I just happen to have a newly restored red & white '58 Plymouth Fury. It's been through a few owners, but has autonomous driving features..

      1. Stevie
        Pint

        Re: "The crash has caused much soul-searching in the emerging self-driving industry"

        E-beer for you, Jellied Eel.

        Mind that The Shitters don't jog your elbow and spill it.

  9. DazzW

    I see 2 issues here.

    1. Calling it "autonomous" lulls the driver into a false sense of security. It's really just advanced driver assist.

    2. I would find it far harder to maintain focus on watching for anomalies than just driving the damn thing.

    1. 's water music

      Calling it "autonomous" lulls the driver in to a false sense of security

      Duh! they should of (sic) called it autpilot as any fule kno. Everyone understands what that means in terms of driver assist systems

  10. Camilla Smythe

    Dick Heads

    Let me just be rubbish here, feel free to down vote if I am or improve the suggestion.

    Require that at all times the human driver always has to drive the car whilst the car drives autonomously.

    Compare the demand from the human driver to the demand from the autonomous driver. Whilst they match, inside a set of defined limits for the available inputs, the autonomous driver has control.

    Ideally if the inputs match the human driver will not notice and if they do then they will be able to adapt accordingly and report those differences.

    If and when they differ outside that set of defined limits for the available inputs, either the car stops, preferably, or the human driver takes priority (a rough sketch of this comparison appears after this comment).

    If things are extremely out of whack then the human driver is looking at the fluff in their navel and MagnaVolt should be applied to the human driver after the vehicle has rendered itself safe. Call it an incentive.

    The same NTSB paper also reported that Uber's self-driving Volvo XC90's emergency braking systems were turned off "to reduce the potential for erratic vehicle behaviour". The operator of the vehicle was supposed to be watching the road and dealing with emergencies.

    ... "to stop the computers arguing with each other."?

    Presumably Uber was too busy programming a rate hike for situations when a cyclist gets pulped and local traffic is disrupted by the emergency services to bother about asking Volvo how the apparently disparate systems might be modified to play nice with each other.
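    Purely to illustrate the comparison rule proposed above (and not any real Uber or Volvo interface): a short Python sketch in which the autonomous system keeps control while the human's and the computer's control demands agree within per-input limits, and a mismatch either stops the car or hands priority to the human. The Demand structure, the limit values and the three-way split by mismatch size are all invented for the sketch.

    from dataclasses import dataclass

    @dataclass
    class Demand:                 # one set of control inputs (all normalised)
        steering: float           # -1 .. 1
        brake: float              #  0 .. 1
        throttle: float           #  0 .. 1

    # The "set of defined limits for the available inputs" (values invented)
    LIMITS = {"steering": 0.10, "brake": 0.15, "throttle": 0.15}

    def arbitrate(human: Demand, auto: Demand) -> str:
        """Autonomous keeps control while the two demands match within LIMITS;
        a large mismatch stops the car (preferred), a moderate one hands
        priority to the human."""
        diffs = {k: abs(getattr(human, k) - getattr(auto, k)) for k in LIMITS}
        if all(diffs[k] <= LIMITS[k] for k in LIMITS):
            return "autonomous"
        if max(diffs.values()) > 3 * max(LIMITS.values()):
            return "stop"
        return "human"

    print(arbitrate(Demand(0.05, 0.0, 0.30), Demand(0.02, 0.0, 0.32)))  # autonomous
    print(arbitrate(Demand(0.00, 0.9, 0.00), Demand(0.00, 0.0, 0.40)))  # stop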

    1. usbac Silver badge

      Re: Dick Heads

      I really do like the concept of recording "parallel" data sets. I think where the concept breaks down is that it would only work if the human driver was a highly trained professional driver. Letting just any driver be the source of one data stream isn't going to be a big help. Look at how many terrible drivers are on the road. Would you want them to be the "control" data set with which to compare the computer's idea of how to drive?

      1. DropBear

        Re: Dick Heads

        I have to assume you are aware of the reason the expression "back-seat driver" exists - and it would wreck this approach comprehensively. You will never, ever get two entities of any kind to drive a vehicle in any kind of vaguely similar style outside a strictly choreographed short run, regardless of how much you might try to "instruct" them to use a specific style. You need one of them to have absolute control authority and the other one to shut the hell up about how the first one is doing it all so horribly wrong.

  11. JLV
    Boffin

    Simples. Charge both Uber and the driver

    Uber disconnected the autobraking system and required the driver to annotate the drive experience on a touchpad. They, maybe because they were expecting her to be taking notes, totally failed to enforce good driver attention, despite having recorded video that shows her looking down and away. You're on a job and you know you are on video and yet you goof off? Somebody didn't tell her to take her job seriously. What were her qualifications? We know she's an ex-con - not something negative in itself, reintegration is important - but did she have the training and education to be doing this? Or was she just cheap for Uber to employ, in a position that was mandated by law, but considered onerous by Uber?

    If she was watching entertainment, it's entirely her fault but probably due in no small part to her feeling that she could get away with it. This lack of oversight - in a job where the job profile is pretty much conducive to folk slacking off by its very nature - and the overall lax safety attitude we've perceived from Uber so far well warrants Uber getting sued. Those facts can come out in court.

    If she was indeed watching entertainment, then she should be charged for distracted driving causing death. This local police department was also quick to absolve Uber, very early in the investigation, so they seem quite sloppy and a court should also look at the evidence.

    (quoting wikipedia)


    The driver said it was like a flash, the person walked out in front of them. His [sic] first alert to the collision was the sound of the collision. [...] it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway.

    — Chief Sylvia Moir, Tempe Police, San Francisco Chronicle interview, March 19, 2018[21]

    While we are at it, maybe also review what steps the Arizona DMV took to ensure public safety. Did they just trust Uber? Uber? They don't necessarily need to be sued, but a hearing should disentangle the facts about which steps Uber was tasked to take to ensure public safety. If a state like Arizona can regulate hairdressers, surely it can regulate autodriving cars. https://boc.az.gov/licensing

    1. Justicesays

      Re: Simples. Charge both Uber and the driver

      "did she have the training and education to be doing this?"

      There are people who have roughly the training and skill set to do this job: they are called "Driving instructors". Not hiring someone with equivalent skills means that Uber doesn't give a crap about the effectiveness of the person in the seat; it's just a cost they have to pay to meet the minimum legal requirement.

      Their immediate response: drop all blame on the driver, and run away to another state.

      The problem is that although they should be equally liable (for bad software, lack of supervision, insufficiently skilled employees), they will likely get away with it as the legal status of experimental self driving cars isn't being given sufficient attention.

    2. Mark 85

      Re: Simples. Charge both Uber and the driver

      I think you pretty much summed it up as to what needs to be looked at. Uber is playing the lawyer's game and pointing the finger. There should be a full investigation by a (let's say "disinterested" meaning nothing to gain) third party but it seems (on the surface at least) that there's politics and money involved also as to what is being said and investigated.

  12. RLWatkins

    Let's not lose our perspective here...

    ... and entirely forget that a pedestrian crossed a dark street in the middle of a block, stepping right into the path of an oncoming car which had its headlights, however ineffective from the standpoint of the driver, lit and clearly visible. The chances are better than even that even a human driver would have hit her.

    I'm all for making autonomous vehicles safe, and when they fail to meet their design requirements I'm all for making a stink about it, but in every case thus far which has made the news the automated system was not at fault: either the driver was depending upon it doing things it wasn't designed for, or the system performed as well as a human driver would have.

    In this case the autonomous system, had parts of it not been disabled, might have *outperformed* a human driver, but even misconfigured it did about as well as a live driver might have done.

    1. Anonymous Coward
      Anonymous Coward

      Re: Let's not lose our perspective here...

      @RLWatkins

      a pedestrian crossed a dark street in the middle of a block, stepping right into the path of an oncoming car

      You are wrong. The pedestrian was crossing a well-lit three-lane carriageway, and had already crossed two lanes before the car hit her. She did not "step into the path" of the car.

    2. Jellied Eel Silver badge

      Re: Let's not lose our perspective here...

      ... and entirely forget that a pedestrian crossed a dark street in the middle of a block, stepping right into the path of an oncoming car

      Ah. Victim blaming. Pedestrians and other road users do strange things though, which is why drivers are expected to drive with due care and attention. Which this driver wasn't. The image in the NTSB report made me a bit curious though-

      In this area, northbound Mill Avenue is separated from southbound Mill Avenue by a center median containing trees, shrubs, and brick landscaping in the shape of an X. Four signs at the edges of the brick median, facing toward the roadway, warn pedestrians to use the crosswalk. The nearest crosswalk is at the intersection of Mill Avenue and Curry Road, about 360 feet north of where the crash occurred.

      So I wondered why there was a pedestrianised median there, and why it appeared to dump pedestrians right onto Mill Avenue. If you look at the location, it makes a bit more sense, as there's a theatre and Bryan University close by, and a marked cycle route with a sign and street light right where the crash happened. So it doesn't seem to have been well thought out from a pedestrian/cycle safety POV, but then roads often aren't.

  13. doug_bostrom

    Considering that inattention is the autonomous driving product, it seems more interesting to wonder why Uber's vehicle was traveling at 44mph in a 35mph zone while overtaking another car in the wrong lane in the middle of an intersection.

    The promise was to remove the human factor from driving, but the car was behaving like a human.

  14. SimonHayterUK

    To be fair, streaming video within cars is growing. I personally listen to music, documentaries and storybooks on YouTube; just because there's video doesn't mean it's being watched, and in fact I stream at a low bitrate to save data on my sim-only contract. I also have YouTube playing in the background at the office, not openly visible in a tab.

    The Voice, for example, is very easy just to listen to without needing to watch the video. I'm not saying this is the case here, but to simply say the driver was distracted because of streaming video may not be as accurate as one may think; in fact, I bet the police are treating it as the smoking gun.

    I say fine UBER, sack the driver and pay compensation to the family, then put in place measures to prevent this from ever happening again.

  15. Alan Hope

    The only way to have a fully aware and instantly reactive driver is to have them actually perform the task of "driving".

    When you're driving you have a constant state of heightened awareness of what is going on around you, and what might happen at any time. As a passenger in a self-driving car you simply don't, and self-driving tech can never change basic human psychology. And, yes, I know we have our weak points too but road accident rates continue to fall.

    I am now convinced it's not going to happen until we have 100% autonomous vehicles that can cope with all conditions and eventualities. And that is a very very long way off. Driving as we know it is here to stay.

    1. DiViDeD

      @ Alan Hope

      "When you're driving you have a constant state of heightened awareness of what is going on around you"

      You've not driven in Sydney (or as we like to call it, Pinball Alley) have you?
