Uber jams Arizona robo-car project into reverse gear after deadly smash

Uber has confirmed that it's shutting down its self-driving car operation in Arizona – without waiting for the conclusion of the official investigation into the death of a pedestrian in the US state in March. The taxi tech upstart employs about 500 people in Arizona working on self-driving cars and trucks. On Wednesday, staff …

  1. J. R. Hartley

    The title is no longer required.

    So it's moving really, rather than shutting down.

    1. Pascal Monett Silver badge

      Uber is moving, yes, but as far as the Arizona employees are concerned, unless they all move to Pittsburgh, it's shutting down.

      1. Anonymous Coward

        48 States to go, and then they'll just have to move their palm-greasing operation offshore to some developing country or other. As long as it has nice neat road grids somewhere, anyway.

      2. david 12 Silver badge

        They're shredding the documents, and firing anyone who could testify, yes, but as far as management is concerned, it's business as usual, and damn the consequences.

  2. Anonymous Coward

    Basic ethics

    Some instinctively embrace basic ethics. Others only discover the concept while it is being shoved down their throat.

  3. Anonymous Coward

    Autonomous vehicle safety ignored

    Uber and others have failed to design in vehicle safety, security, redundant systems, fail-safe designs, etc., and the fatalities confirm this. Shuttering their AZ site illustrates a lack of concern for the above issues. It's time that federal governments worldwide developed minimum mandatory safety, security, design, engineering, operational and maintenance standards for AVs before more people die. The rush-to-market mentality in the AV industry needs to be eliminated.

    1. John Robson Silver badge

      Re: Autonomous vehicle safety ignored

      One of them has... I'm not convinced you can say the same about the other companies.

      And this up against a bunch of halfwit meat sacks who kill on such a regular basis that it barely makes local news.

      1. Alister

        Re: Autonomous vehicle safety ignored

        And this up against a bunch of halfwit meat sacks who kill on such a regular basis that it barely makes local news.

        That's a flawed argument, though. If an autonomous vehicle can't at least match the safety standard of the meatsacks, then further work needs to be done before they are allowed on public roads.

        1. John Robson Silver badge

          Re: Autonomous vehicle safety ignored

          "If an autonomous vehicle can't at least match the safety standard of the meatsacks, then further work needs to be done before they are allowed on public roads."

          So learner drivers shouldn't be allowed on the road either then?

          That's the point of having a 'safety' driver - it's the equivalent of a driving instructor, there to ensure that the learner's mistakes don't cause a serious incident.

          In the one fatality that we know of from an autonomous vehicle (autonomous, not fancy cruise control) we absolutely know that the 'driving instructor' was watching their phone, not the road.

          Note that I do think that initial driver training (car control) should take place off the public highway, but there comes a point when you have to train any person or system using real world conditions. You do that by putting someone next to them in a seat so that they can take over if it all goes horribly wrong.

          1. Alister

            Re: Autonomous vehicle safety ignored

            So learner drivers shouldn't be allowed on the road either then?

            That's not a fair comparison either. Human learner drivers have built in instincts and abilities which are entirely absent from current autonomous vehicles.

            1. John Robson Silver badge

              Re: Autonomous vehicle safety ignored

              "That's not a fair comparison either. Human learner drivers have built in instincts and abilities which are entirely absent from current autonomous vehicles."

              Really?

              I'm not convinced that humans can instinctively control a vehicle, and I'm pretty sure that many of them have no idea what they are doing, even after significant training and a small test.

              Wait another couple of decades with no additional training or monitoring of their driving behaviour and the number who know what they are doing seems to fall :(

              The advantages conferred by the 360-degree sensors and the lack of fatigue or emotion are significant. And not something that a meat sack could ever hope to achieve.

              Think of it another way - if you came up with a car now... would it be allowed? A tonne-plus of metal that can trivially travel at 60, 70, 100 mph, which you can operate within inches of completely unprotected pedestrians and other road users for decades, having taken a single 30-minute test...

              You'd never get it approved, and quite rightly so.

              1. Alister

                Re: Autonomous vehicle safety ignored

                I'm not convinced that humans can instinctively control a vehicle, and I'm pretty sure that many of them have no idea what they are doing, even after significant training and a small test.

                If a human driver - even only a learner driver - was put in the same circumstances as the Uber crash, it would not have happened. A human driver would have seen the victim, and either slowed down or manoeuvred the vehicle to avoid a collision.

                Even a learner driver on their first ever outing would not have blithely continued and run into the victim. This is what I mean about the built in instincts.

                1. John Robson Silver badge

                  Re: Autonomous vehicle safety ignored

                  "I'm not convinced that humans can instinctively control a vehicle, and I'm pretty sure that many of them have no idea what they are doing, even after significant training and a small test.

                  If a human driver - even only a learner driver - was put in the same circumstances as the Uber crash, it would not have happened. A human driver would have seen the victim, and either slowed down or manoeuvred the vehicle to avoid a collision.

                  Even a learner driver on their first ever outing would not have blithely continued and run into the victim. This is what I mean about the built in instincts."

                  There *was* a human driver - they still ran straight into a pedestrian.

                  And there are enough cases across the world where people manage to hit clearly visible objects that I don't think you can reasonably assert that a meatsack wouldn't have blithely continued.

                  1. Alister

                    Re: Autonomous vehicle safety ignored

                    There *was* a human driver - they still ran straight into a pedestrian.

                    No, there wasn't. There was a human passenger with responsibility to monitor and override the vehicle.

                    You cannot equate that with a human who is fully engaged in driving the vehicle, the awareness and concentration required is completely different.

                    1. John Robson Silver badge

                      Re: Autonomous vehicle safety ignored

                      "There *was* a human driver - they still ran straight into a pedestrian.

                      No, there wasn't. There was a human passenger with responsibility to monitor and override the vehicle.

                      You cannot equate that with a human who is fully engaged in driving the vehicle, the awareness and concentration required is completely different."

                      That driver was in the same position as a driving instructor in a dual control vehicle (better actually since they also had a steering wheel directly in front of them).

                      In UK law at least, the supervising driver is considered to be 'in control' of the vehicle (even in cars *without* dual controls).

                      A quick google suggests that the same probably applies in the US as well:

                      https://www.smorganlaw.com/who-is-liable-when-a-student-driver-gets-in-an-accident/

                      https://www.nolo.com/legal-encyclopedia/who-is-liable-if-a-drivers-ed-student-crashes.html

                      (Both explicitly list instructors texting as a case where the instructor could be held liable)

                      So yes - there was a driver who was legally in control of the vehicle and managed to kill someone.

                      1. Alister

                        Re: Autonomous vehicle safety ignored

                        @John Robson

                        So yes - there was a driver who was legally in control of the vehicle and managed to kill someone.

                        You persist in trying to muddy the waters here, and I wonder why?

                        The supervising driver was at fault, because they were not concentrating on the road, and were not in a position to override the vehicle in time to prevent the accident, but the fact is that the vehicle should have been able to avoid the accident by itself, and didn't.

                        The circumstances of the accident were not some strange or random edge-case which caught the vehicle's logic out, there was a pedestrian crossing the road, in clear view for many seconds, and the vehicle drove into them without braking or trying to avoid them.

                        A human driver in manual control of a vehicle would not have behaved in that manner, they would have attempted to avoid the accident in some way.

                        1. John Robson Silver badge

                          Re: Autonomous vehicle safety ignored

                          "You persist in trying to muddy the waters here, and I wonder why?

                          The supervising driver was at fault, because they were not concentrating on the road, and were not in a position to override the vehicle in time to prevent the accident, but the fact is that the vehicle should have been able to avoid the accident by itself, and didn't."

                          I'm not muddying the waters - the fact is that the human driver allowed the car to plough into a pedestrian. Saying that no human would ever do so is therefore clearly tripe.

                          You only have to look at global accident rates to realise that people drive vehicles into perfectly visible objects on a frighteningly regular basis - your assertion that they don't simply doesn't hold water.

                          Today's report reveals, as had been suggested from the start, that the car's braking system had been explicitly *disabled*. The car *wanted* to stop, but was not allowed to.

                          Moreover it didn't even have the option to shout at the human driver either.

                          Uber also expected the human driver to be manipulating a touchscreen device to flag up 'interesting' logs - which would certainly put that driver on the wrong side of the law on this side of the pond, and I expect it would be similar on that side as well (see my previous links).

                        2. Alan Brown Silver badge

                          Re: Autonomous vehicle safety ignored

                          "The circumstances of the accident were not some strange or random edge-case which caught the vehicle's logic out"

                          The reason it caught the vehicle's logic out is simple.

                          Most US State laws say that pedestrians SHALL NOT cross the road except at designated crossing points and only when authorised to do so. They also say that pedestrians MUST give way to vehicles.

                          Put that as-is into a programming algorithm and you have a mindless death machine waiting for a victim to walk in front of it.
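
                          As a purely illustrative sketch (not Uber's actual logic, and with entirely hypothetical names), the difference between encoding the law as-is and encoding a defensive rule might look something like this:

                          # Hypothetical illustration only, not any vendor's real code.
                          from dataclasses import dataclass

                          @dataclass
                          class Detection:
                              kind: str
                              at_designated_crossing: bool
                              on_collision_course: bool

                          def law_literal_should_brake(obj: Detection) -> bool:
                              # Encodes "pedestrians shall not cross except at designated
                              # crossings" as a planning assumption.
                              return obj.kind == "pedestrian" and obj.at_designated_crossing

                          def defensive_should_brake(obj: Detection) -> bool:
                              # Brakes for anything on a collision course, whether or not
                              # it is legally "allowed" to be there.
                              return obj.on_collision_course

                          # A jaywalking pedestrian on a collision course:
                          jaywalker = Detection("pedestrian", at_designated_crossing=False,
                                                on_collision_course=True)
                          print(law_literal_should_brake(jaywalker))  # False - carries on
                          print(defensive_should_brake(jaywalker))    # True  - brakes

                          The first rule is legally tidy and lethally literal; the second treats the law as irrelevant to the physics of the situation.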

                    2. really_adf

                      Re: Autonomous vehicle safety ignored

                      You cannot equate that with a human who is fully engaged in driving the vehicle, the awareness and concentration required is completely different.

                      No, the awareness and concentration required is the same. What was actually given was different, hence the collision.

                  2. Anonymous Coward

                    Re: Autonomous vehicle safety ignored

                    If a human driver - even only a learner driver - was put in the same circumstances as the Uber crash, it would not have happened. A human driver would have seen the victim, and either slowed down or manoeuvred the vehicle to avoid a collision.

                    If you watch the video, then no, a human probably would not have seen the victim until too late. Granted, it didn't seem to be an HDR video, but machine sensors should have detected the pedestrian before human vision in this case.

                    1. Alister

                      Re: Autonomous vehicle safety ignored

                      The video released by Uber was highly misleading.

                      Take a look at this article on Ars Technica:

                      https://arstechnica.com/cars/2018/03/

                      1. Destroy All Monsters Silver badge

                        Re: Autonomous vehicle safety ignored

                        The video released by Uber was highly misleading.

                        Woah that software is NOT READY. It's a bit of a witches' brew.

                  3. ymjir

                    Re: Autonomous vehicle safety ignored

                    I think the issue a lot of people have is that autonomous vehicles, as they currently stand, are just that: automatons running off a pre-determined list, a script of nested if statements. If any part of the system is compromised (fewer or missing sensors compared with the original design) then the system will fail to "see".

                    A robot won't tire, yes, but a robot is currently incapable of intelligent decisions.

                    A trainer of a Human learner driver is usually experienced enough to gauge how ready the learner is.

                    The caretakers of these bots seem to have a problem with this specific point.

                2. Charlie Clark Silver badge

                  Re: Autonomous vehicle safety ignored

                  If a human driver - even only a learner driver - was put in the same circumstances as the Uber crash, it would not have happened.

                  A learner driver is likely to be more careful. Try that with a pissed up idiot and you'll see why drink-driving limits keep coming down around the world.

                  In this case it does look very much like Uber was at fault for taking the LIDAR-free shortcut; pulling out of Arizona is almost an acceptance of liability. But that is because they were trying to avoid using something that everyone else seems to consider essential, which somewhat undermines your argument.

                  AFAIK there are other cases working their way through the legal system which are likely to go the way of the cars.

              2. Alan Brown Silver badge

                Re: Autonomous vehicle safety ignored

                "Wait another couple of decades with no additional training or monitoring of their driving behaviour and the number who know what they are doing seems to fall"

                Agreed, along with the point about 360 sensors, constant attention and lack of emotions/impatience/fatigue (the single biggest cause of congestion is impatient drivers trying to jump queues and gumming things up)

                Robot drivers don't have to be perfect, just better than humans. That's not a high bar and Google have already achieved it. On the other hand Uber have not.

          2. Robert Helpmann??
            Childcatcher

            Re: Autonomous vehicle safety ignored

            That's the point of having a 'safety' driver - it's the equivalent of a driving instructor...

            It might be the legal equivalent, but I cannot believe a qualified driving instructor would fail to monitor the student to such a degree as was shown in the video released concerning the Uber pedestrian fatality. The issue is not just the tech. It is the ethics of those at the top of the company. It is the care with which they choose their employees and how they treat them. It is a direct reflection of the people running the company and it is a very scary sight to behold.

          3. Alan Brown Silver badge

            Re: Autonomous vehicle safety ignored

            "In the one fatality that we know of from an autonomous vehicle (autonomous, not fancy cruise control) we absolutely know that the 'driving instructor' was watching their phone, not the road."

            We also know that Google realised very quickly that _despite_ being told that they were safety supervisors and to pay attention at all times, the "driving instructors" were doing everything but - so they went all-out to ensure the cars were safe.

            As for national regulators - they ARE overwhelmingly cautious. Arizona is one of the exceptions, and the US in particular is the odd one out thanks to a century of lobbying, with vehicle-centric laws and a uniquely pedestrian-hostile culture and legislation in most areas. That makes it the worst possible place to develop automated cars, as assumptions which codify "pedestrians must not be here" laws turn robots into mindless killing machines where a human would (in most cases) take evasive action or stop.

        2. Charlie Clark Silver badge

          Re: Autonomous vehicle safety ignored

          That's a flawed argument, though. If an autonomous vehicle can't at least match the safety standard of the meatsacks

          I don't think the argument is flawed. Put someone entirely untrained in charge of a motor vehicle and you'll have a crash in minutes if not seconds.

          The bigger issue is the comparability of the statistics: autonomous vehicles already have the better accident per km statistics but these are in selected environments.

          John's original point does stand that, at least in the US, the number of road deaths is staggering but barely reported. If people focussed only on the numbers, car drivers, along with gun owners, would have to be considered domestic terrorists.

          1. Alister

            Re: Autonomous vehicle safety ignored

            I don't think the argument is flawed. Put someone entirely untrained in charge of a motor vehicle and you'll have a crash in minutes if not seconds.

            Nowhere did I suggest that we were discussing untrained humans. The comparison should obviously be with human drivers who are supposedly competent.

          2. Dodgy Geezer Silver badge

            Re: Autonomous vehicle safety ignored

            ...If people focussed only on the numbers car drivers, along with gun owners, would have to be considered domestic terrorists....

            At more than 40,000 for 2017, I don't think 'terrorists' quite fits the bill. Those death rates are more like the impact of a well-trained army. 40k+ is similar to the number of deaths in the war in Afghanistan...which has been going on for 40 times as long...

    2. Charlie Clark Silver badge
      Stop

      Re: Autonomous vehicle safety ignored

      Uber and others have failed to design in vehicle safety, security, redundant systems, fail safe designs, etc. and the fatalities confirm this

      It was only Uber that decided not to use expensive LIDAR sensors that other manufacturers use as part of their redundancy design.

      1. Martin Gregorie

        Re: Autonomous vehicle safety ignored

        It was only Uber that decided not to use expensive LIDAR sensors that other manufacturers use as part of their redundancy design.

        I'm a bit worried about the reliance on LIDAR for many of these vehicles.

        • For starters, LIDAR is an optical system, so subject to similar problems with airborne dust, smoke and fog as a human driver, yet I've seen no discussion about this or information about what backup systems the cars use when seeing is poor.
        • Secondly, how powerful are the lasers they use? At what distance can they harm pedestrians' and pets' eyes? What about the effect of a street packed with a LIDAR-equipped traffic jam?
        • Thirdly, how is LIDAR affected by reflective surfaces?

        1. Charlie Clark Silver badge

          Re: Autonomous vehicle safety ignored

          I'm a bit worried about the reliance on LIDAR for many of these vehicles.

          Nearly all the vehicles use multiple systems to reduce the limitations of each. LIDAR is currently the favourite for object detection because it's much faster than anything using video processing, though I think the hope is that real-time video processing will become possible in a few years, which should help in the situations where LIDAR is known to have problems.

          But then, for example, a small child in white clothing on a snowy street, or in dark clothing at night, presents problems for all kinds of drivers.
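
          As a rough sketch of that redundancy idea (illustrative only; the function, thresholds and confidence numbers are made up), fusing two imperfect sensors conservatively might look like:

          # Illustrative sensor-fusion sketch, not any manufacturer's real pipeline.
          # Assumption: each sensor reports a detection confidence between 0 and 1.
          def obstacle_detected(lidar_conf: float, camera_conf: float,
                                threshold: float = 0.6) -> bool:
              # Conservative OR: either sensor alone is enough to trigger a response,
              # so a weakness in one modality (fog for LIDAR, darkness for video)
              # is partly covered by the other.
              return lidar_conf >= threshold or camera_conf >= threshold

          # A dark-clad pedestrian at night: the camera barely sees them, the LIDAR does.
          print(obstacle_detected(lidar_conf=0.8, camera_conf=0.2))  # True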

      2. Alan Brown Silver badge

        Re: Autonomous vehicle safety ignored

        "It was only Uber that decided not to use expensive LIDAR sensors that other manufacturers use as part of their redundancy design."

        As the reports say: The sensors picked the pedestrian up just fine. The problem was in how the programming (failed to) respond.

  4. Anonymous Coward

    The Thousand Year Reich

    is coming to an end a little bit earlier than expected.

  5. Anonymous Coward

    Admittance of guilt?

    If they made no mistakes, there would be no reason to move this operation, right?

    If they accept they made a mistake, they would shut it down, not move it, right?

    Very suspicious activity. You know, like the person who jumps over your fence to look in your shed, but replies "I was not doing anything". Riiiiight!

    1. Gotno iShit Wantno iShit

      Re: Admittance of guilt?

      Seems to me more likely the culture at that site is deemed irretrievably Kalanick (I know few higher insults). Perhaps Khosrowshahi and Hart see a better culture in Pittsburgh. Cut out the dead wood, regroup, regrow.

    2. John Brown (no body) Silver badge

      Re: Admittance of guilt?

      If they made no mistakes, there would be no reason to move this operation, right?

      If they accept they made a mistake, they would shut it down, not move it, right?

      I was thinking the same thing. It smells rather like the companies who cease trading when there's a court case and/or fines in their near future, eg phone scammers and their ilk.

  6. Wolfclaw

    This smells like: we need to start saving millions to pay off a wrongful death suit from the victim's family, and the only way to do that is to sack staff!

  7. Pete 2 Silver badge

    AV's Hindenburg?

    It is not really about the absolute level of safety that will determine the future of autonomous vehicles, but the public perception. And thanks to a news media that lingers on every accident they have, that perception is increasingly negative.

    At what point will the public conclude (rightly or wrongly) that these vehicles are still "in beta" and refuse to adopt them? Will it take a really big and public disaster to consign self-driving cars (and lorries) to cold storage for a few decades until the tech is finally improved, or will they be like plane crashes, with the occasional fatal accident taken as "acceptable losses" (just so long as it doesn't happen to me)?

    1. Alister

      Re: AV's Hindenburg?

      It is not really about the absolute level of safety that will determine the future of autonomous vehicles, but the public perception. And thanks to a news media that lingers on every accident they have, that perception is increasingly negative.

      It's easy to blame the media, but both Tesla and Uber currently seem to have issues which make their vehicles unsafe by any standard.

      It's important to note that Tesla is not meant to be an autonomous vehicle, but even taking that into account, there have been three or four incidents where the vehicle did not detect large solid obstructions in its path, and did not brake or take avoiding action.

      And as for Uber, the vehicle failed to detect or react to a pedestrian pushing a bicycle who was in clear sight for hundreds of yards before the collision.

      Whilst ever autonomous vehicles share the road with non-autonomous vehicles, there will be accidents, this is accepted. What's not acceptable is when the accidents are of such a nature that they would not have occurred if the vehicle was being driven by a normally competent human in the same circumstances.

      1. John Robson Silver badge

        Re: AV's Hindenburg?

        "Whilst ever autonomous vehicles share the road with non-autonomous vehicles, there will be accidents, this is accepted. What's not acceptable is when the accidents are of such a nature that they would not have occurred if the vehicle was being driven by a normally competent human in the same circumstances."

        Which is why in all those cases the human behind the wheel is still responsible for the safe operation of the vehicle. We absolutely know that the Uber driver wasn't doing their job, and I think it's pretty clear that the Tesla owners weren't doing what they should have been either (remembering that the Tesla isn't an autonomous vehicle).

        1. Alister

          Re: AV's Hindenburg?

          Which is why in all those cases the human behind the wheel is still responsible for the safe operation of the vehicle. We absolutely know that the Uber driver wasn't doing their job, and I think it's pretty clear that the Tesla owners weren't doing what they should have been either (remembering that the Tesla isn't an autonomous vehicle).

          But that's dodging the issue. Let's ignore Tesla for the moment, whose cars are not autonomous; in the Uber case the supposedly autonomous vehicle wasn't capable of avoiding a simple collision without human intervention.

          I wrote:

          "What's not acceptable is when the accidents are of such a nature that they would not have occurred if the vehicle was being driven by a normally competent human in the same circumstances."

          Whether or not there was supposed to be a responsible human, that doesn't change the fact that the vehicle hit and killed a pedestrian, in circumstances in which a comparable human driven vehicle would not.

          1. batfink

            Re: AV's Hindenburg?

            Back to the basic comparison of competence then.

            You assert that a "comparable human driven vehicle would not" have killed the poor pedestrian.

            Humans kill quite a lot of pedestrians every day. There were 6,000 killed in the US alone in 2017, according to Forbes (https://www.forbes.com/sites/tanyamohn/2018/02/28/high-pedestrian-deaths-in-2017-risk-becoming-new-normal-report-finds).

            So, your assertion doesn't hold. It's based on the idea that all (human) drivers are competent, which is demonstrably not the case.

            What we need are comparisons of fatalities/injuries per mile travelled. Obviously this also varies by country...
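
            As a back-of-the-envelope sketch of that per-mile comparison (the figures below are rough, order-of-magnitude assumptions, not measured results):

            # Rough, illustrative figures only; see NHTSA for the real numbers.
            human_deaths = 37_000        # approx. US road deaths in a year
            human_miles = 3.2e12         # approx. US vehicle-miles travelled in a year
            human_rate = human_deaths / human_miles   # ~1.2e-8 deaths per mile

            av_deaths = 1                # hypothetical small AV fleet
            av_miles = 10_000_000        # hypothetical test mileage
            av_rate = av_deaths / av_miles            # 1e-7 deaths per mile

            print(human_rate, av_rate)
            # With these assumed numbers the AV rate looks roughly ten times worse,
            # but the sample is tiny and the driving environments aren't comparable.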

            1. Alister

              Re: AV's Hindenburg?

              You assert that a "comparable human driven vehicle would not" have killed the poor pedestrian.

              Humans kill quite a lot of pedestrians every day.

              Yes they do, for all sorts of reasons, both the pedestrians' fault, and the drivers' fault. That has nothing to do with this specific case.

              In the particular circumstances of the Uber accident: a well-lit multi-carriageway road, in dry weather, with good visibility, and a pedestrian crossing the road in plain view for many seconds, I put it to you that it is unlikely that a human driver of a non-autonomous vehicle would have killed the pedestrian.

              1. Anonymous Coward

                Re: People crash all the time.

                Yes, but if you wish to program "looking at the mobile phone instead of the road" into automated drivers, I don't think that is what we call an "improvement". Neither is it something I think people are willing to pay for, even if it matches or exceeds human ability.

        2. Destroy All Monsters Silver badge
          Paris Hilton

          Re: AV's Hindenburg?

          Which is why in all those cases the human behind the wheel is still responsible for the safe operation of the vehicle.

          Yeah, why have the autonomous thingamabob then?

          I think reading just the introductory notes on how human-machine systems work would reveal that this is a pious wish that will never come true. Or a marketing trick to shift blame onto the customer.

      2. Alan Brown Silver badge

        Re: AV's Hindenburg?

        "[Tesla] there have been three or four incidents where the vehicle did not detect large solid obstructions in its path, and did not brake or take avoiding action."

        The Tesla 'autopilot' manual specifically warns that above 50mph it cannot detect stationary objects in front of it.

        IE: this is something that is absolutely warned about, that drivers are told they must watch for, and yet.....

  8. not.known@this.address
    Black Helicopters

    Offshoring to France?

    This on the same day Uber announce plans to build a development plant in Paris to develop flying taxis - is it just that they have some people they are trying to move beyond prosecutors' reach (good luck with that, Uber - there's a little process called 'extradition' you might want to investigate), or do they hope that the Right-pondians won't notice the potential risks involved?

    The US Military is doing quite well with autonomous rotorcraft as warzone delivery vehicles and I'd far rather trust them than a company that closes one site to open another rather than allowing people to learn from mistakes...

  9. Prosthetic Conscience
    Joke

    "Uber, and its trucking subsidiary Otto"

    What's next, autonomous drone subsidiary Göring?

  10. Tikimon
    Devil

    Not ready for prime time

    We have been working for years and years to make autonomous robot assistants, butlers, assembly-line workers, etc. The challenges are the same as for self-driving cars: See it, Recognize it, Select appropriate action. They may have to deal with complex environments, but those environments don't change quickly and the droid is in no hurry. As far as I know, none of these are ready for prime time.

    How the devil can a fast-moving vehicle See, Analyze, and React any better in a constantly changing environment than a robot assistant in a fairly static environment?

    This next criticism addresses the Human vs Machine debate: the cars might be able to spot a pedestrian by the road, and might even recognize them as a human, but they cannot discern the INTENTIONS of that person, or of another driver. We can, based on their posture, face and other clues we're not consciously aware of. How many times do you KNOW what the other driver is going to do, letting you prepare to dodge or brake? Staying alive on my motorcycle depends on that ability. A ball rolls into the street - you know a child might follow; the AI does not. Self-driving cars cannot discern intentions, and won't be able to in my lifetime.

    I support the idea of assisted driving like the Tesla. Let the machine do the drudge work, and the human hits the brakes for situations the machine doesn't handle right.

  11. Tim Russell

    Fail Safe #Fail ?

    What I don't understand is the failure to have a fail-safe or alert system; this is, after all, a developmental vehicle???

    [quote from the NTSB report]

    According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.

    [/quote]

  12. Anonymous Coward

    Technology is not safe from Simple Human Screwup

    Any technology, especially complex technology, is not safe from Simple Human Screwup. Two things are suspicious in this case. First: did the employee have the safety systems engaged or not? Still not sure? Sounds like a typical employee/supervisor situation. Boss: "Did you..." Employee (knowing if they say 'yes' they're toast): "Ummmm...." Second: no matter how well a self-driving car is designed, a pedestrian will find a way to avoid all sensors and get hit. After all, suicide by truck or train is a known phenomenon, and there's no way self-driving cars are going to be able to stop it. If I am wrong about this second part, please go to Hizzoner the mayor of NYC and show him how he can achieve his goal of zero (0) pedestrian traffic deaths.

    1. Destroy All Monsters Silver badge

      Re: Technology is not safe from Simple Human Screwup

      > "Ummmm...."

      Logs.

      > suicide by truck or train is a known phenomena

      Yes, but this is not that. Here we have a sleepwalking-robot event.

  13. Anonymous Coward

    If we had real AI....

    It would have immediately informed the driver: "Someone has tampered with my emergency braking system, and my LIDAR is operating at a sub-par level; do you truly wish to continue in this dangerous travel mode?" on ignition of the vehicle.

    As autonomous vehicles stand, we do not have this level of sophistication. They are just bots: a series of scripts and nested if statements. None of them seem to have an inbuilt conditional safeguard to prevent the vehicle from operating when any of these protections have been turned off, for whatever reason.
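
    As a minimal sketch of the kind of interlock being described (all names here are hypothetical, not any real vehicle's API):

    # Purely illustrative pre-drive safeguard check.
    REQUIRED_SAFEGUARDS = (
        "emergency_braking_enabled",
        "lidar_operational",
        "driver_alert_system_enabled",
    )

    def preflight_check(vehicle_state: dict) -> bool:
        """Refuse to enter autonomous mode unless every safeguard is active."""
        missing = [name for name in REQUIRED_SAFEGUARDS
                   if not vehicle_state.get(name, False)]
        if missing:
            raise RuntimeError("Autonomous mode blocked; safeguards off: "
                               + ", ".join(missing))
        return True

    # Example: with emergency braking disabled, the check raises instead of
    # letting the vehicle pull away.
    # preflight_check({"emergency_braking_enabled": False,
    #                  "lidar_operational": True,
    #                  "driver_alert_system_enabled": True})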
