Two driverless cars stuffed with passengers are ABOUT TO CRASH - who should take the hit?

Data analysts don’t need philosophers to help them with ethical decisions - the “science” can figure that out, a top boffin said this week. Asked about the ethics of big data, the head of Imperial College’s new Big Data dept said: “We treat [ethical issues] not philosophically, but scientifically”. It’s a startling assertion …


  1. Anonymous Coward
    Joke

    "ethicists should be involved in such decisions"...

    Really?

    Two cars... About to collide... You know what?

    Let's call an ethicist... he'll know what to do!

    1. swschrad

      Ethernet already addressed that.

      conflict? "shut down the road, start timers, and one gets delayed."

      don't have room in the ROM? engage brakes, disable engine, "recalculating..."

      this is not rocket science.

      1. Anonymous Coward
        Anonymous Coward

        Re: Ethernet already addressed that.

        You do realize what the "CD" in CSMA/CD stands for, right?

        I think you missed the point of the exercise, and the meaning of the word "inevitable". There is no "the collision should be avoided in all circumstances" answer, because it is impossible for all collisions of autonomously driven vehicles to be avoided, unless you restrict their speeds to far below the speeds we travel at today.

        One car may slip on a patch of ice, or a blowout or other malfunction could cause a car in the opposite lane to swerve over in a split second, with no possibility for the other car to avoid the impact save deliberately crashing itself, possibly into a car in the next lane over.

  2. Marcelo Rodrigues

    Ethics vary from one culture to another

    But, I believe, we are reasonably safe taking the "least harm" road.

    Take the two aforementioned cars: what should be done?

    The decision should be the one that would do the minimum possible amount of harm. I believe we shouldn't base this on gender, age, job... a person is a person.

    1) Count the number of persons inside each car.

    2) Calculate the action that would pose the least risk to the greatest number of humans.

    3) Execute.

    It could be that one car would sacrifice itself. Or a head-on collision could be better - it depends upon speed, road conditions and car model.

    The more I think about it, the more I like Asimov's three laws. Easy, consistent and quite straightforward. Start factoring in gender, age and whatnot... no one would ever reach any conclusion - and all of them would be wrong to someone.
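    The three steps above can be sketched in a few lines. Everything here (the function name, the action labels, the risk figures) is invented for illustration, not taken from any real vehicle software:

    ```python
    # Hypothetical sketch of the "least harm" rule: pick whichever action
    # endangers the fewest people. Steps 1 and 2 collapse into a single
    # risk count per action; step 3 is just returning the chosen action.

    def least_harm_action(actions, occupants_at_risk):
        """actions: list of action names.
        occupants_at_risk: dict mapping action -> people endangered."""
        return min(actions, key=lambda a: occupants_at_risk[a])

    # Made-up scenario: braking endangers 5 people, swerving 2, head-on 7.
    choice = least_harm_action(
        ["brake", "swerve", "head_on"],
        {"brake": 5, "swerve": 2, "head_on": 7},
    )
    # choice == "swerve"
    ```

    The hard part, of course, is not the `min()` - it's producing those risk numbers in the split second before impact.
    
    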

    1. LucreLout
      Childcatcher

      Re: Ethics vary from one culture to another

      I believe we shouldn't base this on gender, age, job... a person is a person.

      I'm sure you knew someone would disagree with this, so, allow me....

      An 80 year old can't be considered worth the same in human terms as a child or 20 something. One has had most of their life, the other barely started. The minimum possible amount of harm would have to reflect the number of years of life lost as well as the number of people.

      It could be that one car would sacrifice itself.

      Mine won't be making that choice. If/when these hit the road, I'll be reprogramming mine to maximise my child's survival chances, and I won't be alone in this. If the other guy's car decides to go off the cliff, so be it, but I'd never allow something I own (or can exert control over) to wipe out my family because it suited someone else's ethical view. I think it would be such a common position, in fact, that both vehicles would be programmed from the factory to have the accident - as would happen with humans driving anyway.

      1. Jonathan Richards 1
        Stop

        Re: Ethics vary from one culture to another

        Ditto: I'm sure you knew someone would disagree with this, so, allow me also....

        > An 80 year old can't be considered worth the same in human terms as a child or 20 something.

        Agreed. A perfectly valid metric for our Robot Car Programming Overlord to employ would be the immediate economic value of the vehicle contents. On this scale a child and an 80 year old would weigh less than an employed adult. And then he's got to factor in the likely survival rates: front seat passengers more likely to die than back seat? Of course, badly injured survivors are more economically draining than fatalities (Bouncing Betty refers), so a particularly calculating RCPO might put in a branch where the car chooses to drive over the cliff: maximising the greater good, don'ch'a know.

        I really don't see how any of this could be ethically beta tested. The very idea that the RCPO would be held negligent at law in the event of injuries or fatalities that offend the human sense of fairness will stop any such thing being deployed in the near future.

        I would like to point out that we already have a transport system in which excursions off the route—and collisions—are very rare: railways.

      2. proud2bgrumpy

        Re: Ethics vary from one culture to another

        What about a car with a single lonely 80 year old heading towards a stolen car carrying four 20 year olds?

        What about a car full of 80 year olds heading towards a car with a 20 year old who has a terminal illness?

        Honestly, the list of variables is almost endless, so processing an effectively infinite list will simply take too long to be of any use, and anything less than a very, very long list will be pointless.

        Unless we go back to a practical view of Car A is a 4x4, car B is a CityCar, so in an inevitable head on crash, the CityCar would lose, so it might as well just self-detonate to save damage to the 4x4.

        There you go - problem solved, no progress made ;-)

    2. Martin-73 Silver badge

      Re: Ethics vary from one culture to another

      You are of course aware that the book (I, Robot) in which the three laws of robotics were introduced largely consists of a group of stories explaining how even such simple rules could have unforeseen, surreal, and negative consequences?

      I highly recommend the book, by the way

    3. Anonymous Coward
      Joke

      @Marcelo

      1) Count the number of persons inside each car.

      2) Calculate the action that would pose the least risk to the greatest number of humans.

      3) Execute.

      If you're ever in a position to propose this solution for real, I'd recommend a different choice of words for step 3.

    4. Anonymous Coward
      Anonymous Coward

      Re: Ethics vary from one culture to another

      "3) Execute."

      Umm... possibly not the *best* term to have used.

  3. DavCrav

    I think you might find it difficult to convince people to get into self-driving cars if they contain a bit of programming that tells the car to drive off a cliff to avoid an accident that you could well survive.

  4. Simon Williams 2

    I don't buy the premise

    Why would they be crashing? Any self-respecting driverless car system would include brakes and collision detection. The worst case scenario, assuming a single-track lane, is two stationary cars with no ability to pass each other. A better question in that instance would be: do you just all swap cars and carry on your journey?

    1. Goldmember

      Re: I don't buy the premise

      Yes, but collision detection systems can't currently see around corners. Imagine a single track mountain road with a 50 MPH speed limit, with steep hills and tight bends. No car has right of way, so there's a plausible scenario where collision detection systems wouldn't have enough time to stop the cars fully before impact, thereby forcing them to decide on either evasive action or simply allowing the accident to happen.

      GPS could help with this of course, but as well as military-grade GPS tech being fitted to each and every car, you'd need a global standard, with all vehicle manufacturers agreeing to send customer tracking data to a central source, for it to work.

      1. John Miles

        Re: collision detection systems can't currently see around corners

        It is easy - the car slows down to a speed from which it knows it can stop within half the distance it can see. Unlike human drivers, the computer won't get impatient and start taking risks if that speed is 5 mph.

        It could easily be allowed to see around the bend; it just needs things like sensors and cameras placed strategically along the road, transmitting data about conditions further along, with other cars similarly passing data along.
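        The half-sight-distance rule reduces to simple physics: braking distance at constant deceleration is v²/2a, so a maximum speed follows directly from how far the car can see. A rough sketch, assuming a typical dry-road deceleration of 7 m/s² and ignoring reaction time (a computer's is negligible) and grip variation:

        ```python
        import math

        # Speed from which the car can stop within HALF the visible distance.
        # Braking distance = v^2 / (2*a); solve for v with distance = sight/2.
        # decel=7.0 m/s^2 is an assumed dry-road figure, not a measured one.

        def max_safe_speed(sight_distance_m, decel=7.0):
            return math.sqrt(2 * decel * (sight_distance_m / 2.0))

        # 50 m of visible road on a blind bend:
        v = max_safe_speed(50)    # about 18.7 metres per second
        mph = v * 2.23694         # roughly 42 mph
        ```

        So even on a blind mountain bend, the physics doesn't force the car down to walking pace - it forces it down to a speed matched to its sight line, which is exactly what a good human driver does.
        
        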

        1. LucreLout

          Re: collision detection systems can't currently see around corners

          It is easy - the car slows down to a speed from which it knows it can stop within half the distance it can see. Unlike human drivers, the computer won't get impatient and start taking risks if that speed is 5 mph.

          Yes, the Golden Rule. So simple, so guaranteed, and yet so many people seem utterly incapable of following it.

          The sooner we have driverless cars, the sooner we can make the driving test properly hard. Can't pass? Ok, buy yourself a JohnnyCab or take the bus.

          1. Lusty

            Re: collision detection systems can't currently see around corners

            Agreed, the fact that so many people believe that there are unavoidable collisions just underlines how urgent it is to get the humans out of the driving seat!

        2. Graham Marsden

          Re: collision detection systems can't currently see around corners

          > speed where it knows it can stop within half the distance it can see

          Exactly.

          All (? Many? Most?) of these "Ethical Dilemmas" come down to people who a) don't understand safe driving principles, or b) don't understand how such principles would be programmed into self-driving cars, or c) both.

          Perhaps they should be required to sign up to their local RoSPA/ IAM/ Bikesafe/ equivalent so they can actually *learn* about sensible road use before commenting.

          There again, I think the same should go for most drivers and a lot of bikers too...

        3. Goldmember

          Re: collision detection systems can't currently see around corners

          "it just needs things like sensors and cameras placed strategically along the road transmitting data about conditions further along and other cars similarly passing data along."

          What, along every single track road? Do you have any idea of the cost of such an operation? There are miles and miles of single track, national speed limit roads in the Highlands of Scotland alone. Roads which don't see a lot of traffic, but have the potential to cause very serious accidents. They would all need a network of cameras/ sensors, which would need fitting and maintaining, and would need power etc. Then there's the proprietary standards system that all car makers would need to interface with.

          I'm not saying it isn't possible or worth doing, but it's certainly not "easy", and I seriously doubt it'll be in place by the time driverless cars are let loose on such public highways.

          1. John Miles

            Re: What, along every single track road?

            To start with, only those where the speed to which vehicles would need to drop to navigate a bend would cause major inconvenience or tailbacks - and there shouldn't be many of those if humans can currently drive them safely.

      2. jonathanb Silver badge

        Re: I don't buy the premise

        If you can't see round the corner, you shouldn't be driving at 50 mph. You should know what your stopping distance is, and be able to see that far ahead at all times. Besides, how many cars are capable of doing a hairpin bend at 50 mph without spinning off the road anyway?

      3. Vic

        Re: I don't buy the premise

        Yes, but collision detection systems can't currently see around corners.

        The Roadcraft Rule : "Always make sure you can stop on your own side of the road within the distance you can see to be clear".

        Vic.

    2. Someone Else Silver badge
      WTF?

      What I want to know is...

      ...what damfool, braindead, script-kiddie program allowed both these vehicles onto a single-lane mountain road going in opposite directions at the same time?

      Perhaps the "ethical" thing to do would be to "sacrifice" the programmer, and the CEO of the corporation that hired him/her.

      1. Goldmember

        Re: What I want to know is...

        "...what damfool, braindead, script-kiddie program allowed both these vehicles onto a single-lane mountain road going in opposite directions at the same time?"

        That's a stupid argument. Is every driverless car supposed to know the whereabouts of every other driverless car in the world, at all times? If that was true then yes, they could avoid driving down a single track road if they knew there were vehicles coming the opposite way. But in reality, how would it know a vehicle had driven onto the opposite end of the road, 30 miles away?

        1. Someone Else Silver badge
          Facepalm

          @ Goldmember -- Re: What I want to know is...

          Is it really? All the railroads (railways, if you're a Brit) in the country don't seem to think it's such a stupid argument, as a variation of it (we'll get to that in the next paragraph) is the premise of managing rail traffic, and has been for, oh, say a century-and-a-half or so.

          And besides, if you'd spent a little time reading my post with the reading comprehension lamp lit, you'd notice that nowhere did I postulate the concept of every car having to know the whereabouts of every other car in any place. A car only has to know the state of the road it's about to embark on. Any driverless car system would have to know if a segment of road was passable before it merrily entered it, or cars would haplessly pile into I-90/94 at rush hour well beyond that strip of road's capability to handle the load (kinda like what happens now, under the control of "human" drivers). It becomes an exercise in network management, and has fuck-all to do with "every driverless car supposed to know the whereabouts of every other driverless car in the world, at all times", numpty.

    3. Marcelo Rodrigues

      Re: I don't buy the premise

      "Why would they be crashing? Any self respecting driverless car system would include brakes and collision detection."

      Because no system is perfect. There are thousands of crazy situations that would make a collision unavoidable.

      Two cars, in opposite lanes. Suddenly, a rock falls from a cliff and gets in the way of one car. It can swerve, but doing so will put it in a head-on collision with the other.

      The other car is farther away from the rock - so the head-on collision will be gentler for both than the collision of the first car with the rock. What should happen?

      I just described one situation in which a collision is inevitable - and no driver's fault. With a little imagination we can come up with a huge number of weird accidents. And given the number of cars on the road, they will happen.

  5. Anonymous Coward
    Anonymous Coward

    You what?

    So we've got two self-driving cars on a collision course? Clearly they've got themselves into a situation that they shouldn't have, and if you can't trust them to drive, how can you trust their pre-programmed ethics?

    And if they are going to crash, why all this "suicidal avoidance" nonsense? We don't have that with aircraft collision avoidance systems; they just do their best and hope for the best. And that's how most logical drivers approach driving - you brake and hope you don't hit the pedestrian who walks out in front of you, rather than electing to mow down a bus queue of OAPs because their life-adjusted scores are lower than the callow youth in front of you.

    About time the ethicists were told to bugger off and stop being the modern day Red Flag Act.

    1. Professor Clifton Shallot

      Re: You what?

      "About time the ethicists were told to bugger off"

      Not a fan of The Only Way Is Ethics?

    2. Rol

      Re: You what?

      "Good morning Mr Clarkson"

      "BBC, I'm in an hurry, so quick"

      "Certainly sir"

      ......

      "I have spotted a dead fox in the road sir and have activated my ethics component"

      "and on that bombshell.....

      1. Anonymous Coward
        Anonymous Coward

        Re: You what?

        If, and that's a big one, Clarkson would ever surrender himself to a self driving car, wouldn't he stuff the virtual Stig in the dash?

  6. Tony Haines

    "...two autonomously driven vehicles, both containing human passengers, en route for an “inevitable” head-on collision on a mountain road."

    One might hope that autonomous cars would be programmed to drive defensively. Such a situation therefore *should not* occur. However, it *may* occur due to bugs (i.e. programmer error), malfunction or hacking. I don't think any of those cases warrant the other car sacrificing its passengers. Otherwise, we have the potential for an out-of-control car forcing numerous other vehicles off the road in serial encounters.

  7. Blergh
    FAIL

    Driverless motorbike?

    What would be the point of a driverless motorbike? I'm all for thinking out of the box when it comes to thinking up new ideas, but really! The whole point of a motorbike is either the driving or getting through traffic, neither of which you would get with a driverless version.

    I'm also not entirely clear on why you need to choose between which vehicle crashes. If they are driverless vehicles they should not be doing speeds they cannot stop from, even on a single track road with bad conditions. Of course a driverless vehicle can't completely account for other idiot non-driverless vehicles, but that isn't the question posed. The driverless vehicle should always take the action which causes least physical harm, which usually means stopping.

    If someone has disabled all the safety protocols of the driverless car and is going as fast as they can on the road for a thrill ride, well, it's their own fault for being an idiot.

  8. SW10
    Stop

    It won't take long for the lawyers...

    ...to sort this out:

    "Put brakes in there for emergencies, and maybe a steering wheel; then the occupants will cop the liability. The last thing we need is a class-action suit because of a ballsy belief in the superiority of our programming."

    1. Roland6 Silver badge

      Re: It won't take long for the lawyers...

      Agreed, the article omits an important part of the minefield: when the inevitable happens and two driverless cars crash, who is to blame and so picks up the bill?

      Also with driverless cars I can't see insurance companies offering a no claims discount, only a discount for using a particular driver system...

  9. Anonymous Coward
    Anonymous Coward

    En route for an “inevitable” head-on collision on a mountain road....

    In that particular situation I'd hope that the computer-controlled cars would both attempt to stop as quickly as possible. No ethics involved.

  10. S4qFBxkFFg

    It's obviously a tricky area, but I can't imagine that anything other than a driverless car doing "its best" to protect its passengers would be acceptable to the customer.

    Would anyone ride in a vehicle they knew/suspected would go into "sacrifice" mode if it came off worse in a cost/benefit analysis when compared with a packed school bus?

    The outcome will probably be that, taking the example given, no vehicle swerves to certain doom and both end up colliding. That's if the algorithms decide a head-on is slightly-less-certain doom.

    The person shouting "My car tried to kill me!" will probably receive more attention than the person shouting "Their car didn't try to sacrifice them to save my life!" and I'd bet on a judge and jury being more likely to favour the former.

  11. Sykobee

    The vehicles should aim for a square head-on collision and hope the airbags do their job, rather than sacrificing one car over the edge of the cliff based upon some algorithm of the sum worth of the occupants (to society, profit, the car manufacturer, etc).

    Make the front of the car boot space rather than engine space, to allow for a large crumple zone.

    How about deploying air bags in front of the car? If the car knows it's going to crash, then external air bags can be deployed to soften the impact.

  12. GettinSadda

    The faulty car should sacrifice itself!

    For two driverless cars to get into a situation where they are on a single-track road (otherwise they could just switch lanes) with an unsurvivable drop on one side and an unmountable slope or wall on the other, yet be travelling fast enough that a head-on collision would be fatal even with full brakes applied by both vehicles from the moment they entered each other's field of view - at least one car has to be seriously broken!

    1. The First Dave

      Re: The faulty car should sacrifice itself!

      Surely it is far more likely that the oncoming vehicle (the "high value" one) is erroneously signaling an unavoidable collision, than that an actual 100% certain to be fatal collision is imminent. Who in their right mind would chuck themselves off a cliff on the basis of a signal that logically we must presume is incorrect?

    2. Yet Another Anonymous coward Silver badge

      Re: The faulty car should sacrifice itself!

      What's wrong with the current solution?

      The most massive vehicle with the biggest bull-bars wins.

  13. Andy The Hat Silver badge

    Take for example the just-out-of-prison 57 year old ugly bloke in a rusty 4x4 versus the pretty, 20year-old blond in the sports car. Ethics says save the girl (on the basis of healthy, young, fertile) and he's a con, and an ugly, old one at that.

    Or perhaps he couldn't afford to pay his council tax and was banged up for two days, his physical attributes are not his fault and he has his grandson securely strapped into a car seat. She is a nut case of a driver, high on drugs with an Uzi stashed in the boot ...

    Stick with the science and 'take avoiding action according to the conditions'. Ethics has severe problems ...

    1. 's water music

      spelling

      the pretty, 20year-old blond in the sports car

      If that's how you spell blonde, can I suggest that you always apply the "Adam's apple" test before moving to second base in future?

  14. Cirdan
    Linux

    DR GUI VS ETHICS

    "Maybe boffins such as Gui hope “the computer” can tell us right from wrong - possessing a God-like authority. But this would require us to pretend that nobody programmed the computer, and it arrived at its decision using magic. ®"

    Well, there's your problem!

    Don't ask Dr Yike Gui...

    Ask Dr Command Line. (Though if you use sudo you still possess a godlike authority!)

    1. Swarthy

      Re: DR GUI VS ETHICS

      Dr DOS?

  15. Wombling_Free

    When I drive on twisty mountain roads

    for maximum safety and ethics, I drive a bulldozer. Especially in tunnels.

    1. DJO Silver badge

      Re: When I drive on twisty mountain roads

      Was that you in the opening of the original Italian Job?

    2. TeeCee Gold badge
      Happy

      Re: When I drive on twisty mountain roads

      You're far better off in a tank.

      That way, in the case of an inevitable head-on collision on a single track mountain road, you can take preemptive action to ensure that your paint doesn't get scratched.

  16. Destroy All Monsters Silver badge
    Pint

    A properly computable logic of ethics?

    Why does this article fall off a cliff at the end? It reads like an abstract. MORE!!

    “Ethics change with technology.”

    ― Larry Niven, N-Space

  17. Michael Hawkes
    Terminator

    Using vast quantities of data to predict future events? I think that's been described before. Tell Silicon Valley they're working on practical applications of Asimov's Psychohistory and they might actually try to do it.

  18. Sir Runcible Spoon
    Pint

    This is not even a logical question.

    "Last week Fujitsu CTO Joseph Reger raised the example of two autonomously driven vehicles, both containing human passengers, en route for an “inevitable” head-on collision on a mountain road"

    Unless the two AIs are in (fast) communication, there is only one decision process going on here.

    Each AI must make the best decision available to it for the humans it is currently responsible for. It is not in charge of the other vehicle's occupants and therefore cannot decide for them.

    What if the other car doesn't have an AI?

    What if the other car's AI has comms problems?

    What if the other AI is making a similar decision to sacrifice its own humans based on slightly differently biased information - everyone would die!

    The only thing we can program an AI with in terms of morality is the equivalent of a human.

    For example, you are driving your family along a road at night. All of a sudden a similar group of people to yours has just emerged from a hidden path and is now, on foot, directly in front of you.

    You only have time to

    a) brake as hard as you can and plough into the pedestrians

    b) take avoiding action and drive you and your family off the road, next to which is a 200 ft drop to a river meaning almost certain death.

    I would choose a) in an instant. I would also choose a) after some serious thought, since it offers the most chance to the most people. If I try to avoid them, me and my family would almost certainly die; if I hit them, me and my family would almost certainly survive. The pedestrians would certainly survive if I went off the cliff - but they may stand a better chance of surviving being hit by a car than we would of surviving a 200 ft nosedive into a river.

    In essence, most of the time you can only act to save yourself and those you are responsible for as it taps into your basic instincts for survival - anything else will take up precious moments and everyone could die.

    1. JeffUK

      Re: This is not even a logical question.

      In game theory that's called the minimax algorithm: find the 'worst case' outcome of every action, and pick the action with the best 'worst case'.
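      A minimal sketch of that idea, with made-up payoffs for the night-time scenario above (the function name and figures are illustrative, not any standard library's API):

      ```python
      # Minimax over one-shot actions: score each action by its worst-case
      # outcome, then choose the action whose worst case is best.

      def minimax_choice(outcomes):
          """outcomes: dict mapping action -> list of possible payoffs
          (here, expected survivors). Returns the max-min action."""
          return max(outcomes, key=lambda action: min(outcomes[action]))

      # Invented payoffs for the night-time scenario:
      outcomes = {
          "brake_hard": [4, 6, 8],    # worst case: 4 people survive
          "swerve_off_road": [0, 8],  # worst case: nobody in the car survives
      }
      best = minimax_choice(outcomes)  # "brake_hard"
      ```

      Which matches the instinct described above: braking hard has the least catastrophic worst case, so minimax picks it even though swerving has an equally good best case.
      
      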

