Fatal driverless crash: Radar-maker says Uber disabled safety systems

Uber reportedly disabled safety systems on the autonomous Volvo XC90 that killed a pedestrian Stateside last week, according to the makers of the car's sensors. "We don't want people to be confused or think it was a failure of the technology that we supply for Volvo, because that's not the case," Zach Peterson, a spokesman for …

  1. Daggerchild Silver badge

    Cause of Death: Ostrich Algorithm

    I'm wondering if the bike frame reflected a funky signal the software decided was impossible, and discarded, leaving a 'perfectly empty' road ahead...

    1. Anonymous Coward
      Anonymous Coward

      Re: Cause of Death: Ostrich Algorithm

      Or perhaps it was because Uber (reportedly) disabled safety systems.

      1. Steve K

        Re: Cause of Death: Ostrich Algorithm

        I think they disabled the standard Volvo safety systems because they were using their own and - as they are not integrated - did not want them to interfere with their sensors.

    2. Amos1

      Re: Cause of Death: Ostrich Algorithm

      I'm thinking it occurred because SOMEONE WALKED IN FRONT OF A MOVING CAR AT NIGHT THAT HAD ITS HEADLIGHTS ON.

      1. Amos1

        Re: Cause of Death: Ostrich Algorithm

        a.k.a. "You can't patch stupid."

        1. Someone Else Silver badge

          Re: Cause of Death: Ostrich Algorithm

          Stated another way: You can make it foolproof, but you can't make it damnfoolproof.

      2. Alister

        Re: Cause of Death: Ostrich Algorithm

        I'm thinking it occurred because SOMEONE WALKED IN FRONT OF A MOVING CAR AT NIGHT THAT HAD ITS HEADLIGHTS ON.

        I'm thinking you're an idiot.

        She was more than halfway across the road, which means she started crossing when the car was a long way away. The car was exceeding the speed limit, and so she probably misjudged the time she had to safely cross. She also probably assumed that the car would slow enough to let her get to safety as a human driver would do.

        1. Anonymous Coward
          Anonymous Coward

          Re: Cause of Death: Ostrich Algorithm

She might have been stupid to cross the road when/where she did, but it was inexcusable for a self-driving car to hit her. Even if it could only see as well as the front-facing camera (where she seemed to appear out of nowhere) it should have been able to stop. It didn't even apply the brakes!

I think anyone testing self-driving cars who is involved in an accident as ridiculous as this one should be banned from testing self-driving cars on US roads for a period of two years. That would set them far back, thus giving them an incentive to be REALLY sure they have things working well in internal testing before they decide they're ready to move onto public roads.

          1. Anonymous Coward
            Anonymous Coward

            Re: Cause of Death: Ostrich Algorithm

"I think anyone testing self-driving cars who is involved in an accident as ridiculous as this one should be banned from testing self-driving cars on US roads for a period of two years. That would set them far back, thus giving them an incentive to be REALLY sure they have things working well in internal testing before they decide they're ready to move onto public roads."

            Your 'solution' will only lead to many more deaths as people are killed by drunk, tired, high, distracted, or incompetent drivers - people whose lives could have been saved by an earlier introduction of autonomous vehicles.

          2. Sven Coenye
            Boffin

            Re: Cause of Death: Ostrich Algorithm

As to why she was crossing there, see the overhead of the accident spot: https://www.google.com/maps/@33.4362155,-111.9424977,163m/data=!3m1!1e3 . The big X in the median makes it very enticing to cross there if you're in that area, as otherwise you have to walk to the stop light, cross, then double back; or pedal up W Lake View Drive, then to the intersection, and then down.

            The original police report re. the vehicle's speed (doing 38 in a 35 mph zone) was not correct. If you walk Google Streetview back from the accident scene, you'll find a 45 sign just before the overpass. The limit before the overpass is indeed 35 (walk Streetview back to before the river to find that sign), so Uber's car was accelerating but within the limit as it struck Herzberg.

        2. Anonymous Coward
          Anonymous Coward

          Re: Cause of Death: Ostrich Algorithm

          "I'm thinking you're an idiot.

          She was more than halfway across the road, which means she started crossing when the car was a long way away. The car was exceeding the speed limit, and so she probably misjudged the time she had to safely cross. She also probably assumed that the car would slow enough to let her get to safety as a human driver would do."

          =

          1. The car was not greatly exceeding the speed limit. The speed was over the limit by a few miles an hour - probably less than a human driven car on an empty arterial road.

          1b. You cannot estimate the exact speed of an oncoming car, so you don't count on being able to do so.

          2. If there is one car coming on a mainly empty road, and there is any question of crossing in time, you wait for it to pass. If there are several cars you wait for a gap. If there are many cars you find an inherently safe crossing or become very patient. The video did not show a lot of cars.

          3. If you misjudge the speed of the car, and it arrives faster than you expect, you don't walk in front of it, you stop, or even back up.

          4. You never assume a driver will see you on an unlit road at night.

          5. You never assume what a driver will do if they see you unless they give some indication to you of their intent.

          6. You never assume a car will yield the right of way, unless mandated by a light, sign, crosswalk, or police officer. Even then, be ready for something else.

          1. AdamWill

            Re: Cause of Death: Ostrich Algorithm

            These are all good principles to apply if you're a pedestrian interested in staying alive. However, in the case that a pedestrian *doesn't* apply them, that does not automatically excuse the driver (human or robot) that hits them, if they could have reasonably avoided doing so. Death is not the legally mandated punishment for jaywalking, nor is Joe Random Person/Robot Driving By the legally mandated agency of punishment for jaywalking.

            1. Kevin McMurtrie Silver badge

              Re: Cause of Death: Ostrich Algorithm

              Jaywalking is ignoring a red light, short-cutting a crosswalk (painted or implied), and not yielding to cars/bikes already in your path. Crossing the middle of a road is, of course, legal if it's clear when you started. You can't fault somebody in AZ for walking in the middle of the night either, given that some days are hot enough to kill you.

Others driving the route at night (without artificially dark video) have shown that a pedestrian should have been visible long enough to make a graceful stop. Besides, it's against the law (and common sense) to drive at a speed where you can't avoid an obstacle in the road.

              1. Alister

                Re: Cause of Death: Ostrich Algorithm

                Others driving the route at night (without artificially dark video) have shown that a pedestrian should have been visible long enough to make a graceful stop.

                Exactly. See here

There seems to be an erroneous belief centered on the Uber video that the accident happened in the dark, whereas in fact the road was well lit.

                1. John Brown (no body) Silver badge

                  Re: Cause of Death: Ostrich Algorithm

                  "Exactly. See here

There seems to be an erroneous belief centered on the Uber video that the accident happened in the dark, whereas in fact the road was well lit."

Thanks for that. It's a very well lit area and I can see no reason for the "superior" sensing ability of this car not to have seen the victim in time to stop, unless there was something very badly wrong with the sensors or the data processing. She was about 2/3rds of the way across the road when she was killed. It was obvious she was a "hazard" from the car's point of view. It almost seems, as someone mentioned elsewhere on these forums, that the car's systems can't anticipate a collision, merely react to an imminent one (and it doesn't even seem to have reacted in this case).

              2. Alan Brown Silver badge

                Re: Cause of Death: Ostrich Algorithm

                "Jaywalking is...."

                A uniquely American concept created by a century of lobbying by the motor industry, along with a raft of other state-level laws that mandate that pedestrians shall yield to motor traffic other than at designated crossing points. (Ie, the best laws that money can buy)

                The car didn't react to the pedestrian because it wasn't expecting to see one at this point and it wasn't expecting one because it wasn't programmed to expect one. In any other country it would be expecting pedestrians or other obstacles at all times. Uber's level of FAIL is more epic than a Cecil B de Mille Bible movie.

                Humans might not expect to see an obstacle here, but they will process the exception and react to the error condition.

            2. Dodgy Geezer Silver badge

              Re: Cause of Death: Ostrich Algorithm

              No one said it EXCUSED them. It was a feature of the accident - nothing more...

          2. This post has been deleted by its author

        3. This post has been deleted by its author

        4. Miss_X2m1

          Re: Cause of Death: Ostrich Algorithm

Actually, the speed limit was 45 mph and the car was doing 40 mph; however, at that speed, in near total darkness, using its low-beam headlights, it was still overdriving its headlights, and even a human driver in full control would not have been able to stop in time to avoid hitting the woman. It's all about timing, braking distance, road conditions, etc. Essentially, the car was driving blind, and a human driver would have the same issue.

          1. Anonymous Coward
            Anonymous Coward

            Re: Cause of Death: Ostrich Algorithm

            "Essentially, the car was driving blind and a human driver would have the same issue."

            Or a human paying attention would have bothered to put the main beams on.

          2. John Brown (no body) Silver badge

            Re: Cause of Death: Ostrich Algorithm

            "Essentially, the car was driving blind"

So all this stuff about night vision and LIDAR is a crock from Uber and its partners?

      3. a pressbutton

        Re: Cause of Death: Ostrich Algorithm

        IMO

        all the posters going into great detail about how it was the human's fault for crossing the road in front of an autonomous car

        - and -

        all the posters going into great detail about how it was NOT the human's fault for crossing the road in front of an autonomous car

        all miss one big thing:

        In a public environment, Humans should not be killed by autonomous things ever.

        Autonomous cars are there for the benefit of humans, not the other way round.

        1. Dodgy Geezer Silver badge

          Re: Cause of Death: Ostrich Algorithm

          ...In a public environment, Humans should not be killed by autonomous things ever.

          Autonomous cars are there for the benefit of humans, not the other way round....

          Very good. Now consider this:

          National Highway Traffic Safety Administration (NHTSA) 2016 data shows 37,461 people were killed in 34,436 motor vehicle crashes, an average of 102 per day, in the US.

          That makes an average of one death per 94m miles driven. I understand that autonomous cars currently have one death per 130m miles. So if we changed all cars to autonomous ones at the current state of the technology, we would drop the deaths to 27,087 - saving over 10,000 deaths per year.

          Why do you think saving 10,000 lives a year is not for the benefit of humans?
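Taking the poster's figures at face value (the 2016 NHTSA death toll and the quoted per-mile fatality rates; the 130m-mile figure for autonomous cars is the poster's own claim, not an official statistic), the arithmetic does check out:

```python
# Sanity-check of the figures quoted above. All inputs are the thread's
# quoted numbers, not independently verified data.
human_deaths = 37_461
miles_per_death_human = 94e6   # one death per 94 million miles (human drivers)
miles_per_death_av = 130e6     # one death per 130 million miles (claimed, autonomous)

# Implied total annual miles driven, then projected deaths if every mile
# were driven at the claimed autonomous fatality rate.
total_miles = human_deaths * miles_per_death_human
av_deaths = total_miles / miles_per_death_av

print(round(av_deaths))                  # ~27,087 projected deaths
print(human_deaths - round(av_deaths))   # ~10,374 lives saved per year
```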

          1. Anonymous Coward
            Anonymous Coward

            Now consider this:

            An Ars Technica article pointed out that based on miles driven, Uber's self driving cars now have a much higher fatality rate than human drivers. With a sample size of 1, I know that is meaningless statistically.

But consider that their drivers are having to manually intervene every 8 miles, and you have a system that 1) doesn't appear to be even close to those of Waymo and Cruise and 2) requires an alert and involved human behind the wheel - which seems to have been lacking in this case.

    3. Daggerchild Silver badge

      Re: Cause of Death: Ostrich Algorithm

      Wow. Okay that didn't get the response I expected..

      The 'discard the impossible' coping strategy is something that satellites, spaceships and rockets have to do, because they *can't* just blue-screen and sulk when something goes funky - they *must* try and continue. They are forced to make a realtime judgement call over which source of contradictory sensory data is 'wrong', before they can decide how to react.

      The problems come when they guess wrong..
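The "discard the impossible" strategy described above can be sketched in a few lines. This is an illustrative toy, not Uber's or anyone's actual code: redundant range sensors vote, and any reading too far from the consensus is treated as faulty and dropped. The failure mode is exactly the one the comment describes: a real obstacle that looks like an outlier simply vanishes.

```python
# Toy sketch of outlier rejection across redundant range sensors.
# The sensor names, threshold, and fusion rule are all illustrative.
from statistics import median

def fuse_ranges(readings, max_deviation=5.0):
    """Return (fused range, kept readings), discarding readings that
    disagree with the median by more than max_deviation metres."""
    consensus = median(readings)
    kept = [r for r in readings if abs(r - consensus) <= max_deviation]
    return sum(kept) / len(kept), kept

# Three sensors agree the road ahead is clear (~200 m); one return says
# there is something at 30 m.
fused, kept = fuse_ranges([200.0, 198.0, 30.0, 201.0])
# The 30 m return is rejected as "impossible", leaving a 'perfectly
# empty' road -- the guess-wrong case the comment warns about.
```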

  2. Anonymous Coward
    Anonymous Coward

    Public roads are not test tracks

    From the referenced Bloomberg article: "the ability to detect and classify objects, is a challenging task"

    Then those vehicles should not be on the road. If competent government were at work, a number of Uber managers and engineers would be perp-walked by now.

    1. Anonymous Coward
      Anonymous Coward

      Re: Public roads are not test tracks

      "...a number of Uber managers and engineers would be perp-walked by now."

      No. Uber has a sterling reputation of adhering to only the highest ethical standards.

      1. Sgt_Oddball
        Coffee/keyboard

        Re: Public roads are not test tracks

        You sir, owe me a new keyboard......

    2. skeptical i
      Thumb Down

      Re: Public roads are not test tracks

      re: "If competent government were at work"

      Nope, it happened in Arizona at the behest of Governor Doug Ducey.

  3. sjsmoto

    Uber wrote the software? I'm surprised the car didn't erase the video and hide the body.

    Dear Uber: Do you want to make sure your driverless cars are really reliable? Then have a dozen of them circling your executive parking lot 24x7 at 100 mph.

    1. Anonymous Coward
      Anonymous Coward

      Then you'd read an article about how Uber execs are now commuting to work via helicopter.

    2. Anonymous Coward
      Anonymous Coward

      Uber wrote the software?

      "Uber wrote the software? I'm surprised the car didn't erase the video and hide the body."

      No, but it did increase the Uber using percentage of the population. The AI is working a little better than expected, that's all.

  4. Whiznot

    Self-driving cars will always be inferior to good human drivers because they cannot think. Humans are able to anticipate potential risks. Self-driving cars can only react.

    1. Francis Boyle Silver badge

      Better version

~~Self-driving cars~~ Uber executives will always be inferior to ~~good human drivers~~ humans because they cannot think. Humans are able to anticipate potential risks. ~~Self-driving cars~~ Uber executives can only react to the lure of profit.

    2. John H Woods Silver badge

      Re: "Self driving cars can only only react"

Reacting may well have been enough. One in three adult pedestrians dies in a 40 mph collision with a modern car, one in eight at 30, one in 25 at 20.

A Volvo XC90 can stop from 100 km/h in 36 metres in good conditions. This car was at about 2/3 of that speed. It looks from the video that the pedestrian was directly in the path of the car for a full 2 seconds, I reckon 40 m. A good driver would have been able to brake for at least one second, with every 0.1 s reducing the speed by 1-1.5 mph. A self-driving car should have been almost stationary on impact.

      This was not a failure of anticipation, but one of detection and reaction... If an autonomous car cannot detect an object in its path it is absolutely unfit for purpose.
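The quoted 36 m stop from 100 km/h can be turned into rough numbers (assuming constant deceleration; the speeds are the ones mentioned in the thread). It implies braking at roughly 1.1 g, which sheds about 2.4 mph per 0.1 s, so the comment's 1-1.5 mph per 0.1 s is if anything conservative:

```python
# Rough numbers behind the argument above, assuming constant deceleration.
G = 9.81
MPS_TO_MPH = 2.23694

v0 = 100 / 3.6                     # 100 km/h in m/s (~27.8 m/s)
decel = v0**2 / (2 * 36)           # implied deceleration: ~10.7 m/s^2 (~1.1 g)

mph_per_tenth = decel * 0.1 * MPS_TO_MPH   # ~2.4 mph shed per 0.1 s of braking

# Even a single second of such braking from 40 mph leaves ~16 mph at impact,
# which the survival figures quoted above suggest matters enormously.
v_impact_mph = 40 - decel * 1.0 * MPS_TO_MPH
print(round(decel, 1), round(mph_per_tenth, 1), round(v_impact_mph))
```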

      1. Noel Morgan

        Re: "Self driving cars can only only react"

        I timed the video and from when the pedestrian appears to when the video stops seems to be closer to 0.75 seconds rather than 2.

        Not sure I would have noticed, reacted and braked to a 'safe' speed in that time.

        The failure is not in autonomous cars per se, something went wrong here in that the 'extra' senses that the car should have did not operate or were turned off.

        1. Alister

          Re: "Self driving cars can only only react"

          I timed the video and from when the pedestrian appears to when the video stops seems to be closer to 0.75 seconds rather than 2.

          You cannot base any judgement on the video, as it is of such poor quality that it is in no way representative of reality. Human vision would have detected her much, much earlier.

          Don't forget, the lady didn't just suddenly step out in front of the car, she had left the median strip and already crossed one lane, and was nearly half-way across the second lane before she appears in the video.

However, I agree that this shows that either the detection systems on the car were inadequate, or the software inexplicably decided not to brake or avoid the obstruction.

          1. Mark 85

            Re: "Self driving cars can only only react"

            Human vision would have detected her much, much earlier.

            Human vision would have detected her much, much earlier if the driver had been paying attention to the road in front of the car instead of fiddling with whatever....

FTFY, as this happens a lot without a self-driving car involved. Driver distraction is a serious problem and will get worse with driverless cars. Once us old geezers have died off and the generation raised around these cars takes over, the cars will need to be 100% perfect or accidents will continue.

          2. Anonymous Coward
            Anonymous Coward

            Re: "Self driving cars can only only react"

            "You cannot base any judgement on the video, as it is of such poor quality that it is in no way representative of reality. Human vision would have detected her much, much earlier."

            You have absolutely no way to know that.

You do not know the sensitivity of the camera - some cameras see much better at night than the human eye, some do not, and some are about the same. In addition, the person becomes visible as the illumination increases. You do not know the pattern of the light emissions in relation to the movement of the person, which means the illumination fractions of a second before the person appears in the video could have been much lower - too low for a human, or for a machine perception system set up to register moving objects while travelling, to see someone. Remember that time exposure and motion do not play well together.

            You are just making assumptions, when facts are needed.

            1. Alister

              Re: "Self driving cars can only only react"

              You have absolutely no way to know that.

              The facts are that contrary to the widely held belief, the place where the accident happened was not a dark country road, it was a well lit urban street. The video footage released by Uber shows a very misleading view of the available light levels.

              If you look here then you might begin to understand that the pedestrian would have been in plain view for a long time before the accident.

              You are just making assumptions, when facts are needed.

              No, I'm actually looking at the available evidence instead of accepting things at face value.

            2. Anonymous Coward
              Anonymous Coward

              You have absolutely no way to know that.

              Yes, he does. Other people have posted video of the same location at the same time of night from their dash cams. All show a well lit stretch of street with good long range visibility.

        2. Daniel 18

          Re: "Self driving cars can only only react"

          "I timed the video and from when the pedestrian appears to when the video stops seems to be closer to 0.75 seconds rather than 2.

          Not sure I would have noticed, reacted and braked to a 'safe' speed in that time."

          You could not.

          It is generally accepted that it takes about 1 second to initiate braking once an obstacle becomes visible.

          1. Boo Radley

            Re: "Self driving cars can only only react"

            I'm certain that I react in well under one second upon becoming aware of something unexpected in the road. But then, I have had advanced driver training and have generally quick reflexes.

        3. Tom 64
          Windows

          Re: "Self driving cars can only only react"

          > "Not sure I would have noticed, reacted and braked to a 'safe' speed in that time."

          Perhaps not, but you would have damn well swerved in order to avoid taking the life of another human being. Wouldn't you?

          1. Yet Another Anonymous coward Silver badge

            Re: "Self driving cars can only only react"

            but you would have damn well swerved in order to avoid taking the life of another human being.

If it was a Volvo driver and the human being was on two wheels?

        4. This post has been deleted by its author

        5. John H Woods Silver badge

          Re: "Self driving cars can only only react"

          "I timed the video and from when the pedestrian appears to when the video stops seems to be closer to 0.75 seconds rather than 2." - Noel Morgan

You need a bigger screen or spectacles! Seriously, though, I suspect you are looking with human eyes... run the video frame by frame and note the frame where you can see the first hint of her shoes on the road: 2.96 seconds. Now if it really were that dark, a human might be excused for not recognizing that as a person, but she is presenting a full radar image at that point in the lane of travel. Impact is later than 4.28 seconds.

          In the following 32 frames (1.3 seconds) she has progressed from a quarter of the way across the lane to three quarters of the way across the lane when she is hit. Therefore it is reasonable to assume it took her at least 2 seconds to cross the left lane and get to the middle of the road. If the radar can see the entire roadway, it had a minimum of 3 seconds to react, if it can only see the lane of travel there was 1.5 seconds minimum. At an entirely reasonable 0.8g braking, that's enough to drop from 40mph to 20mph.
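The 0.8 g figure above can be checked against the thread's own timings (assumptions: constant 0.8 g deceleration, the 40 mph starting speed, and the 1.5 s / 1 s windows the comment derives):

```python
# Checking the 0.8 g braking claim with the timings given above.
G = 9.81
MPS_TO_MPH = 2.23694
decel = 0.8 * G                    # ~7.85 m/s^2

def speed_after_braking(v0_mph, seconds):
    """Speed in mph after `seconds` of constant 0.8 g braking from v0_mph."""
    return max(0.0, v0_mph - decel * seconds * MPS_TO_MPH)

# If the radar had the full 1.5 s minimum (lane-of-travel view only):
print(round(speed_after_braking(40, 1.5)))   # ~14 mph at impact
# With only 1 s of actual braking (allowing ~0.5 s to classify and react):
print(round(speed_after_braking(40, 1.0)))   # ~22 mph at impact
```

Either way the result lands around the "40 mph to 20 mph" reduction the comment claims, and the survival figures quoted earlier in the thread suggest that difference is decisive.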

        6. MrXavia

          Re: "Self driving cars can only only react"

          "I timed the video and from when the pedestrian appears to when the video stops seems to be closer to 0.75 seconds rather than 2."

True, but emergency braking as soon as the obstacle was detected would have reduced the impact speed and increased the chance of survival. I saw no braking at all..

But as other dash cams show, that area is well lit, even in the darker section, so the cyclist should have been seen earlier, and the car should have been slowing down already, anticipating the hazard, not accelerating towards 45...

Seeing someone crossing like that, I would have taken my foot off the accelerator and covered my brake (but not actually applied it until she crossed the middle line), so I would have been braking from the moment her wheel started to enter my lane and would have stopped in time to avoid her death...

But even if I was not paying attention, my 5-year-old car's radar would have braked automatically as soon as she stepped into my lane, even if I could not see her.. and that would have reduced the impact to a survivable speed....

    3. veti Silver badge

      Ah yes, the good old "artificial intelligence is impossible" line. I wondered how long we'd take to get to that.

Brains are not magic. Everything that goes on inside a human mind is something that can (in principle, if you really want to) be replicated in another environment. Of course most times we don't really want that, but the point is that any generalised statement about what one thing or the other "can" do is just magical thinking.

    4. Rustbucket

      > Self-driving cars can only react.

      But it didn't react. The woman was directly in front of the car for nearly a second and it made no attempt to brake.

      BTW all, Ars Technica posted a couple of images of the way the accident site is really lit. The second one in the article may be closer to what humans actually see.

      https://arstechnica.com/cars/2018/03/police-chief-said-uber-victim-came-from-the-shadows-dont-believe-it/

    5. Dodgy Geezer Silver badge

      How do you measure 'superior'? Because getting machines to fly aircraft has hugely improved the accident rate - nowadays there are very few accidents attributable to human error....

      1. Alan Brown Silver badge

        "nowadays there are very few accidents attributable to human error...."

        Actually, almost all aircraft crashes are attributable to human error and have been since the 1960s.

        EG: Pilots are supposed to know what an iced-up pitot messing up the instruments looks like. They even train for it. That didn't stop them getting confused and fighting over the controls on AF447

        Likewise they're supposed to divert and fly somewhere else if the weather is marginal, but pressure means that too many (usually ex-military) pilots will try to land in poor conditions and miss.

        The number of actual genuine equipment failure crashes only accounts for about 10% of aircraft crashes in the last 40 years (and only about 1% of car crashes)

The study of "human factors" (transportation psychology) has been the thing that's had the greatest effect towards the reduction of aircraft crash rates in the last 40 years (and one of the direct results of that study has been the discovery that military pilots make _lousy_ civilian ones, as the mindsets required are diametrically opposed to each other). This study has only recently started being applied to road transport and is still not accepted by many people, particularly those in UK county councils - who need to take on board that traffic psychology starts with roadside markings and furniture.

Any Robocar maker who doesn't have a bunch of people heavily involved in this field of study is going to kill a lot of people. Human-machine interaction is a complex field, and assuming _anything_ is guaranteed to be a fuckup waiting to happen.

    6. Alan Brown Silver badge

      "Humans are able to anticipate potential risks. Self-driving cars can only react."

      Self driving cars with knowledge of road conditions and hazards ahead can also anticipate.

      If a ball rolls onto the road, a robot can anticipate a child following as easily as a human, likewise if children are playing on the footpath, or small feet are visible under parked cars at the roadside.

      If anything, a well-programmed robocar should be able to track and react to far more simultaneous hazards than a human (humans are only capable of handling a couple - concentrating on that wobbly cyclist may mean entirely missing a pedestrian stepping onto a crossing, etc) and predicting intended paths of even stationary pedestrians is something AIs have proven surprisingly good at.

      The average speed of traffic in dense urban areas and shopping zones is only 10-15mph. I would have no qualms at all about recommending that cars be limited to this for such areas. Taking impatient human drivers out of the mix would probably result in faster overall progress whilst simultaneously allowing pedestrians to freely cross the roads.

  5. Rebel Science

    We must ban all self-driving cars on public streets now

    In my opinion, the @USDOT should immediately impose a moratorium on all autonomous vehicles on public roads. Deep Learning is not suitable for uncontrolled or open environments where humans can be harmed. A deep neural net is like a rule-based expert system: it will fail catastrophically if it encounters a situation it has not been trained on. The blame for the next fatality will rest on the shoulders of @SecElaineChao.

    The UK and other European nations should do likewise because more fatal accidents are coming. Guaranteed.

    1. veti Silver badge

      Re: We must ban all self-driving cars on public streets now

      And will you take the blame for the next pedestrian killed by a human driver who performs less well than a self-driving car would have done? Because that will happen sooner.

      Guaranteed.

      Even if self-driving cars are never perfect, they might still be better than the alternative. We want to get them to that stage sooner rather than later. That requires testing.

      1. Anonymous Coward
        Anonymous Coward

        Re: We must ban all self-driving cars on public streets now

        Even if self-driving cars are never perfect, they might still be better than the alternative. We want to get them to that stage sooner rather than later. That requires testing.

        You'll be happy for them to do it around your house then?

        1. Anonymous Coward
          Anonymous Coward

          Re: We must ban all self-driving cars on public streets now

          "You'll be happy for them to do it around your house then?"

          Yes.

    2. Anonymous Coward
      Anonymous Coward

      Re: We must ban all self-driving cars on public streets now

      "The UK and other European nations should do likewise because more fatal accidents are coming. Guaranteed."

      Don't be silly. Of course fatal accidents are guaranteed, and pedestrian deaths will continue until we ban pedestrians.

The question is whether there will be more or fewer deaths per million km with autonomous or human-driven vehicles. It is most likely that autonomous vehicles, while not perfect, will be much safer than human-driven vehicles.

      Any new technology will get better relatively quickly. Flying used to be marginally safer than Russian roulette. Now it is by far the safest form of long distance travel.

      Autonomous cars will follow a similar trajectory, replacing deeply flawed human drivers who never evolved to deal with such tasks and speeds with mechanisms engineered to do so reliably and effectively.

    3. Charlie Clark Silver badge
      FAIL

      Re: We must ban all self-driving cars on public streets now

      In my opinion, the @USDOT should

You seem to be forgetting the little thing of states' rights…

    4. Brangdon

      Re: We must ban all self-driving cars on public streets now

      Not all self-driving cars. Just the Uber ones. The Waymo ones are several years ahead of the Uber ones, have done many more miles, and haven't killed anyone yet. It'd be wrong to punish Waymo for Uber's mistakes.

    5. John Brown (no body) Silver badge

      Re: We must ban all self-driving cars on public streets now

      "In my opinion, the @USDOT should immediately impose a moratorium on all autonomous vehicles on public roads."

I agree. There are plenty of non-public places that are very similar environments where permissions could be gained, such as military bases, industrial estates, university campuses or, better yet, the large areas these autonomous car makers already mainly seem to operate from.

  6. This post has been deleted by its author

  7. Commswonk

    The Shape of Things to Come...

    i.e. different parts of the technology chain arguing that "it wasn't me guv".

    What it needs is a couple of major motor insurers to stand up and say "on the present showing we will not be quoting to insure autonomous vehicles". They are not under any legal obligation to provide insurance cover, so they will either refuse to quote or will come up with outrageous figures that will put everybody off.

    The technology companies involved are going to have to learn that they either stand by the integrity of their product (if it has any) or they drop the idea of participating in any further developments.

    "The Law" (both criminal and civil) is unlikely to tolerate a state of perpetual obfuscation generated by various component makers, including those who write the software. Neither the "victims" of traffic mishaps nor the drivers of the vehicles involved (including those in non-autonomous vehicles) should be expected to cope with it either. Justice delayed is justice denied...

    1. usbac Silver badge

      Re: The Shape of Things to Come...

      You talk like any tech company has ever stood behind their product. For the last 20 years or so the tech industry has been rushing buggy insecure products out the door to beta testers (paying customers). None of them have ever cared if their products actually work, why do you think autonomous vehicles would be any different?

      It's an industry mentality now...

      1. Paul Crawford Silver badge

        Re: The Shape of Things to Come...

        "why do you think autonomous vehicles would be any different?"

        Because those behind it should be facing jail time for injury or death unless they can show the highest standards for safety-critical code. You know, like the aviation industry does.

        What, that will make it too expensive to get rid of human taxi drivers?

        Sadly we have reached the point where software suppliers/licensor/whatever need to be held to account for a shitty job. Just now they can fob off all sorts of liability under the EULA, but cars are different - they actually do kill people when it goes wrong as we are sadly now discussing.

        1. Anonymous Coward
          Anonymous Coward

          Re: The Shape of Things to Come...

          "Because those behind it should be facing jail time for injury or death unless they can show the highest standards for safety-critical code. You know, like the aviation industry does.

          What, that will make it too expensive to get rid of human taxi drivers?"

          So you want to make a life-saving technology too expensive to use?

          Not the best plan.

          1. Paul Crawford Silver badge

            Re: The Shape of Things to Come...

            "So you want to make a life-saving technology too expensive to use?"

            I think we have just seen it doing quite the opposite.

    2. Charlie Clark Silver badge

      Re: The Shape of Things to Come...

      What it needs is a couple of major motor insurers to stand up and say "on the present showing we will not be quoting to insure autonomous vehicles"

      I think you'll find that insurers are pretty keen on autonomous vehicles. They know how shit a lot of human drivers are, and also the value of the data collected. In legal terms the case in San Francisco against GM is likely to be much more relevant than Uber's fuckup. Insurance is likely to be one of the biggest carrots for autonomous vehicles.

    3. Alan Brown Silver badge

      Re: The Shape of Things to Come...

      What it needs is a couple of major motor insurers to stand up and say "on the present showing we will not be quoting to insure autonomous vehicles".

      Not quite.

      If robocars can show that they will react to obvious hazards (and there are several standardised hazard courses that they can be tested on), then they can graduate to public roads for further testing. But what's clear is that they must NOT be allowed to run freely on roads when they're not going to react to a clear and present hazard in front of them that they need to stop for.

  8. David 45

    Uber software?

    Since when have Uber been programmers?

  9. EveryTime

    The XC90's safety system was why I initially believed that the collision wasn't preventable.

    Later, seeing the video, I wondered what went wrong.

    While this story isn't confirmed, it explains what went wrong.

    The decision to switch off an existing safety system moves this to the realm of criminal behavior. The safety system was presumably developed and tested with a careful methodology e.g. MISRA. There is quite a bit of evidence from court documents that Uber development ignored safety. There is other evidence that the Uber system was a sham demo largely based on careful mapping, close to a screenshot slideshow of a proposed system.

    This is a huge disservice to the development teams that *are* treating safety as the very top priority.

  10. tekHedd

    Human-To-Vehicle communications

    The answer is to fit not just bicycles, but all persons with human-to-vehicle location transponders. These can be permanently installed in or on the head. It's for safety, so we should start with the children. This will also double as a handy tracker so you can locate others if they become lost. And you'll always have a GPS with you wherever you go!

    I see no possible way this could have negative consequences.

    1. Anonymous Coward
      Thumb Up

      Re: Human-To-Vehicle communications

      The answer is to fit not just bicycles, but all persons with human-to-vehicle location transponders.

      I think Parliament was a little bit hasty in abolishing the Locomotive Act 1865. The subsequent locomotive acts which removed its restrictions should have only applied to vehicles under direct human control. The most important of these were the speed limit of 4 mph in the country and 2 mph in the city, and a man carrying a red flag walking in front.

      1. Allan George Dyer
        Pirate

        Re: Human-To-Vehicle communications

        @Smooth Newt: "a man carrying a red flag walking in front"

        Remember "Sudden unintended acceleration"? Who are you going to find who's stupid enough to intentionally walk in front of one of these?

        OTOH, perhaps it could be offered as a "community service" alternative to a prison sentence for Uber executives?

        1. John Brown (no body) Silver badge
          Joke

          Re: Human-To-Vehicle communications

          "Remember "Sudden unintended acceleration"? Who are you going to find who's stupid enough to intentionally walk in front of one of these?"

          Just attach the red flag to one of those little, slow moving delivery robots and kill two birds with one stone.

    2. Mark 85

      Re: Human-To-Vehicle communications

      I'll pass on having a tracking device installed on my person thank you very much. I don't like being tracked on the web, via my phone, and soon definitely not by my car.

      While I see the need for it or maybe something different and better, what with all the tracking and surveillance that happens just from the government side of things, this might just be over the line.

      1. Yet Another Anonymous coward Silver badge

        Re: Human-To-Vehicle communications

        The answer is to fit not just bicycles, but all persons with human-to-vehicle CIWS

        Phalanx is nice but I don't think you can beat goalkeeper for pure ludicrous mode overkill

    3. Alan Brown Silver badge

      Re: Human-To-Vehicle communications

      "The answer is to fit not just bicycles, but all persons with human-to-vehicle location transponders."

      Um, yeah. Good luck fitting that to a moose. Or a bear, 14-point stag, black cow, wild boar, sheep or horse.

      All of which I've encountered on the road. None of which I'd like to drive into.

      Some of the larger hazards in some countries take offence at being driven into and may sit on your car, vs merely coming through the windscreen and killing you.

  11. whatsyourShtoile
    Mushroom

    satan has taken another fresh young woman

    The car will surely be going to meet its master in hell.

  12. Anonymous Coward
    Anonymous Coward

    NTSB opens field investigation into Tesla X fatality

    Today it was announced that the NTSB has begun an investigation into the fatal crash of a Tesla model X last week in California after the vehicle hit a bridge abutment and burst into flames. It's unknown if the driver was using the "autopilot" hands-free mode which has resulted in fatalities in the Tesla model S vehicles. Clearly more federal and state oversight is required for any semi-autonomous vehicle operation or more people will die. Today the state of Arizona mandated that Uber remove all AVs from their roadways as they claim Uber failed to follow the state's safety requests.

    1. Phil O'Sophical Silver badge
      WTF?

      Re: NTSB opens field investigation into Tesla X fatality

      It's unknown if the driver was using the "autopilot" hands-free mode

      and

      Clearly more federal and state oversight is required for any semi-autonomous vehicle operation

      Bit of a non-sequitur there, no?

    2. Charlie Clark Silver badge

      Re: NTSB opens field investigation into Tesla X fatality

      Tesla saves a lot of money by not using LIDAR…

  13. Anonymous Coward
    Anonymous Coward

    Beware the Video

    1) Do they really use a camera that has such poor night vision?

    2) Bet with access to the original stream one could "see" the cyclist all the way across the road!

    3) Or have they set the playback to make it look like she came out of nowhere?

    1. TonyJ

      Re: Beware the Video

      Don't be daft.

      I mean, Uber doing something unethical and/or illegal?

    2. Robert Carnegie Silver badge

      Re: Beware the Video

      With eyesight as fuzzy as Uber's video I think you may not be allowed to drive. But I take from that that the driving computer isn't relying on this video. Possibly not on video at all.

      Also, if it was as dark as it looks, then I would drive with full beam headlights, which this apparently isn't - but since the location has street lights, it shouldn't be as dark as it looks. Except that some municipal authorities turn their street lights down or off when it's late, to save their electricity bill.

      And I don't drive; I cycle. But not like in this video. Although I have crossed the Blantyre A725 on foot, with my bike. Central barrier and all.

      The walking cyclist would have been seen sooner in full headlights. But I assume that the car doesn't need those, and they are liable to dazzle other drivers.

  14. EUbrainwashing

    A few points

    If the driver was in full ongoing control of the vehicle - driving it - and there was no autonomous system involved, the evidence of the driver video would be explanation enough of where fault lay. The driver was not looking at the road. That the pedestrian was apparently not looking too would be secondary.

    The video of the road cannot be sufficient to say the pedestrian was impossible to see in the light conditions. The video camera will adjust the aperture for optimum video - if it was exposing for the darker areas, the fully lit areas would be totally overexposed. So the video does not show what a human eye would see when looking into the apparent areas of darkness.
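    That exposure trade-off can be illustrated numerically. This is a deliberately simplified model of global auto-exposure, not any actual camera pipeline, and the luminance figures are invented for illustration:

    ```python
    # Simplified auto-exposure model: the camera maps scene luminance to
    # 8-bit pixel values through one global gain, so exposing for the
    # brightly lit areas crushes the dark areas to near-black even when
    # detail is present. All luminance numbers are invented.

    scene = {"street_lamp": 2000.0, "lit_road": 400.0, "shadowed_pedestrian": 8.0}

    # Gain chosen so the brightest region just avoids clipping at 255.
    gain = 255 / scene["street_lamp"]
    pixels = {k: min(255, round(v * gain)) for k, v in scene.items()}

    # The shadowed pedestrian lands at roughly 1/255: effectively black on
    # video, although a human eye, with far greater dynamic range, could
    # still make her out.
    print(pixels)
    ```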

    The autonomous system should have been able to detect the pedestrian regardless of lighting conditions. It is technically possible. My Audi would have flashed a 'pre sense' warning in this situation - day or night.

    It looks like the pedestrian made a very bad decision. Let us say she did; let's even presume she was in some way compromised (distraught, drunk/stoned, nuts). If you are driving on a highway out of town, it is true that you are moving faster than you can react if the 'unexpected' occurs. That is what happens when you hit a deer, or a drunk on a bike with no lights, on the freeway at night.

    If you are driving in an urban environment you have to drive in a much more cautious manner. Never drive faster than you can see. The reason is that there are people about, in the dark, in the rain, children running, making mistakes, distracted people crossing a wide road in the dark - even at a point where they should not be. A good driver is watching-out for this type of event when driving in town, all the time, and modifying their driving moment by moment to be safe.

    This is called predicting, and I do not believe artificial intelligence is close to being 'intuitive' in this way. It is about more than safety, too. We predict what other cars are doing to help the traffic flow and contribute to everyone getting home on time and safe. In experiments, junctions with no stop lines or traffic controls flow better and with fewer incidents than in their former configuration. Humans are very good at forming cooperative self-organising systems.

    I took the trouble to visit, via Google Street View, the location and area where this Uber accident occurred. So I know the placement of the street lights (they are sufficient), that there is a warning sign advising pedestrians not to cross where this person did, and more. What I noted, and would have noted as a driver, is that this looks like an area where there may be people about late at night, in the darkness. There are the riverside park areas, there are large covered areas under the freeways, and there is a park with covered shelters for barbecues.

    If I knew the town I would know the nature of the area, the risks. If I did not know the area I would be cautious because I would be conscious I did not know the area. That is what good human drivers do. They do not just deal with GPS and the rules of the road.

    I believe that we are a very long way from having autonomous driving systems that are safer than good human drivers in these sorts of situations. I do think it is strange this vehicle appears to have failed to 'see' the pedestrian; it should have been possible in this simple enough circumstance. I think the human 'driver' failed, obviously, but this is the weakness of an early driver-assist system - a human's attention will not remain engaged enough to avoid such occurrences. A human is not a robust enough backstop.

  15. Dodgy Geezer Silver badge

    Obligatory 2001 Quote

    Dave: How would you account for this discrepancy between you and the twin 9000?

    HAL: Well, I don’t think there is any question about it. It can only be attributable to human error. This sort of thing has cropped up before, and it has always been due to human error.

    Frank: Listen HAL. There has never been any instance at all of a computer error occurring in the 9000 series, has there?

    HAL: None whatsoever, Frank. The 9000 series has a perfect operational record.

    Frank: Well of course I know all the wonderful achievements of the 9000 series, but, uh, are you certain there has never been any case of even the most insignificant computer error?

    HAL: None whatsoever, Frank. Quite honestly, I wouldn’t worry myself about that.

  16. BoldMan

    Uber shills run riot!

    Wow, how many Uber shills are commenting and voting on the comments on this topic? It's quite astonishing!

  17. sjsmoto

    Reading here and on social media about the inevitability of deaths from driverless cars, so that we can be on our merry way to fewer deaths from cars, is grotesque. So I'm going to rant a bit.

    I agree with others who have stated that nothing less than no deaths is acceptable.

    Here's my take on requirements for this thing.

    1. It's my unsubstantiated guess that these cars are programmed to be in a state of "go" unless there is a good reason to slow down or stop. I would mandate the opposite - their normal condition must be "stop" unless there is a good reason to go.

    2. If the software is in a situation where it cannot accurately determine what's going on, it must immediately signal the driver and begin to slow to a stop or pull over and stop until the driver takes over. Continuing on with the "thought" that it will figure out what's going on is dangerous.

    3. Companies developing self-driving cars must, for a minimum of two years, equip electric golf carts with the same software and detection equipment and let them continually drive around their offices (except when recharging). If you need a number, let's say 5 golf carts per floor. Incident reports must be maintained. After two years with no accidents, the company will be allowed to place their software and detection equipment on a car, and let the car drive around their property (including parking lots). After a further year with no accidents, the company will be allowed to place one of these cars on a public road. (And so on.)

    4. Company executives will be personally liable for accidents and deaths from their driverless cars.
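    Requirements 1 and 2 above amount to a default-deny control policy, which can be sketched as a tiny decision function. This is purely illustrative; every name in it is hypothetical, and a real planner would of course be vastly more complex:

    ```python
    # Default-deny control sketch: motion requires positive evidence that the
    # path is clear, and any loss of perception confidence triggers a
    # handover and a controlled stop rather than pressing on.
    from enum import Enum

    class Action(Enum):
        STOP = "stop"
        GO = "go"
        HAND_OVER = "hand_over_to_driver"

    def decide(path_confirmed_clear: bool, perception_confident: bool) -> Action:
        if not perception_confident:
            # Requirement 2: don't carry on hoping to "figure it out".
            return Action.HAND_OVER
        if path_confirmed_clear:
            return Action.GO
        # Requirement 1: "stop" is the default state, not "go".
        return Action.STOP
    ```

    The permissive design the rant objects to inverts the last two branches: it keeps going unless an obstacle is positively confirmed, so a discarded or low-confidence detection reads as an empty road.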

    /rant

  18. Claptrap314 Silver badge

    Building the wall all around

    The autonomous vehicle advocates appear to consistently miss a crucial aspect regarding mass deployment of AVs: software security vulnerabilities. This is a failure mode which simply does not exist for human drivers. What is more, any attempts to quantify an upper bound on this risk are at best highly speculative.

    Is anyone here going to claim with a straight face that the security model is going to be reasonable? Personally, I expect it to be somewhere between pacemakers (where security has been non-existent) and what's currently in place for modern vehicles (totally p0wnable across the Internet en masse).

    Even the most basic question, "how does this software get updated?" leads to answers that no one here would be willing to bet their life on--and yet, that is exactly what is being expected.

    I'll pass.

  19. EUbrainwashing

    Freedom & Individualism - the strongest mechanism of all

    I wonder if there is a correlation between advocates of autonomous motor vehicles and socialists? Both believe there is an achievable 'scientific' mechanism for fulfilling a task of incalculable complexity better than the plethora of humanity, working individually and cooperatively, to form a self-regulating system which, if allowed, can solve every imponderable efficiently and flexibly. I am not a socialist, and I am not a believer in a scientific elite being any better than a wet weekend.
