New work: Algorithms to give self-driving cars 'impulsive' human 'ethics'

In a version of the infamous Trolley Problem, you're sitting in a runaway train on a fatal collision course with five people. You can flip a switch to change tracks, but on the other track you'd still kill one person. Now change the numbers, who the people are, pretend the trolley drives itself, and welcome to the crazy world …

  1. frank ly

    ".. three uber-basic machine learning models ..."

    Did they go out and get drunk, develop algorithms to break local laws, cover it up and then start harassing the researchers?

    1. Anonymous Coward

      Re: ".. three uber-basic machine learning models ..."

      Nope. These are the machine learning models for cheap taxi services.

    2. big_D Silver badge

      Re: ".. three uber-basic machine learning models ..."

      All they need is a fresh cup of really hot tea.

    3. Anonymous Coward

      Better algorithms

      <blockquote>However, he added, "I do not think that real-time, human driving decisions can provide an acceptable basis for the ethical programming of self-driving cars" because "it is not a good idea to create rules based on impulses".</blockquote>

      As part of the emergency services, I was trained on how to deal with these situations from a practical point of view. The philosophical discussions are all very nice and interesting, but not useful in a real scenario. Same goes for emotion-based responses.

      In our case, the general principle was "treat those for whom treatment is most likely to be effective in terms of assuring their survival". In practice, this may mean having to treat an attacker in preference to one of your colleagues, to give an example, which can be emotionally taxing to say the least, but we were not there to play God.

      Also, the so-called trolley problem and similar thought exercises are too one-dimensional to be useful. E.g., what stops the signalman from politely asking the people gathered on the tracks to move elsewhere? Why were there people on the tracks? Why was the trolley not working as intended? Why were there no emergency brakes on the tracks? What if going for the group will likely result in multiple non-life-threatening injuries as opposed to an assured casualty? Etc., etc.

  2. Steve Button Silver badge

    Save the women and children first!

    That's pretty much what people do (when given enough time - 4 seconds - to think about it).

    So, it's basically OK to replicate this behaviour in automation? As neither a woman nor a child, I don't feel entirely comfortable about that.

    What about skin colour? Or hair colour? Would most people choose to eliminate a ginger first (yes, I'm one of them too!)

    I'm not sure I'm liking this RotM future.

    1. Andrew Commons

      Re: Save the women and children first!

      @ Steve Button

      Thinking much the same way. A 4-second decision is considered opinion and would probably not match instinctive reactions. The assumption is that this is a good thing but now you may be deliberately mowing down A to preserve B which will be making lawyers salivate.

      1. Steve Knox
        Joke

        Re: Save the women and children first!

        The assumption is that this is a good thing but now you may be deliberately mowing down A to preserve B which will be making lawyers salivate.

        Or run, depending on where they are with respect to the road...

    2. Jason Bloomberg Silver badge

      Re: Save the women and children first!

      What about skin colour? Or hair colour? Would most people choose to eliminate a ginger first (yes, I'm one of them too!)

      I am not convinced anyone would choose to eliminate someone first but I am sure tribalism does play a part in determining who one prefers to save. It is important to note that tribalism does not equate to racism and the like though the outcomes may often look very much like it.

      The difficulty here is that there is no universal truth, no absolute answer as to who should be saved and who should be sacrificed. The best we can do is analyse whether any outcome was legitimate in the circumstances or not and even that's hard, perhaps impossible. Lab testing can be influenced by people seeking to hide or suppress their tribalism to varying degrees, consciously and subconsciously.

      Has anyone analysed random choice; how 'shutting one's eyes and putting it in the hands of a higher power' works out?

      That would at least absolve mere mortals being held to blame for any outcome which some may disagree with.

    3. Neil Barnes Silver badge

      Re: Save the women and children first!

      Time to think is the wrong approach if you want to know what people actually *do*.

      Stick 'em in a driving simulator, let them drive real time for a while, and throw hazards at them in 'real life' situations... and see what they do.

      Some of them won't even notice the incident and will plough straight into the victim; some will notice the ball bouncing into the road and brake before the kid follows it; most will likely try to react, too late. Whichever they do, you'll have a good baseline which the AI merely has to beat.

      I am really unconvinced that people make ethical judgements in such circumstances; I do believe that they may attempt to make a choice of 'avoid hitting anything/avoid too much damage/hope the airbags work'.

    4. Anonymous Coward

      Re: Save the women and children first!

      The problem with the 'Trolley Problem', as usually described and as used in this article, is that only one of the two important factors is considered: the dilemma of choosing whether just one person dies or whether several people die.

      The other important factor is responsibility: if you remove yourself from the scenario then the route will not be switched and multiple people will die but as you are not present then you can bear no responsibility for any outcome. If you then add yourself back in to the scenario but do nothing then the outcome is the same: the same group of people will die, just as they would when you were not present.

      However, if you add yourself back in to the scenario but this time switch the route then one person, who would not have died, but for your action, will now die.

      Your two options, therefore, are not symmetric; if you switch the route then you become responsible for the death of the one person because they have died as a direct consequence of your actions but conversely, if you do nothing then you can't be held responsible for the deaths of the several people because their fate would be the same as if you were not present and unable to influence the outcome.

      Some countries have/are considering 'Samaritan' laws and these might be interpreted, in the 'Trolley Problem', as obliging you to switch the route and kill the one person but I don't know how they would then deal with your direct responsibility for killing the one person. And even if the Samaritan law could grant you immunity from criminal prosecution, could it also grant you immunity from civil prosecution?

      1. Steve Knox

        Re: Save the women and children first!

        "if you do nothing then you can't be held responsible for the deaths of the several people because their fate would be the same as if you were not present and unable to influence the outcome."

        No, because doing nothing when you have the capability to do something is a choice in and of itself. Hypothesizing about being removed from the situation is simply wishful thinking in an attempt to abdicate responsibility, not a valid logical argument.

        1. Anonymous Coward

          Re: Save the women and children first!

          Being able to make a choice, and responsibility are different things. For example, some people have the ability to act, but not the knowledge.

          So often, in real life, the trolley problem falls apart as people do not have any knowledge of the situation, or too little to have responsibility or to act. Such as not knowing who is in a cart/on a track or not knowing if the outcome will be worse or better... it could hit only one person, then fall off the track and hit a school of orphans. However, in many instances a "good Samaritan" law covers you if you act on the knowledge you have and act in a way to minimise damage.

          So applying the brakes to try to stop: even if it goes wrong and the trolley derails, you are absolved of responsibility for the derailment, provided you did not know it would derail, because you were trying to slow it down in a controlled way.

        2. Anonymous Coward

          Re: Save the women and children first!

          @Steve Knox: "because doing nothing when you have the capability to do something is a choice in and of itself"

          I believe that this is incorrect for two reasons. Firstly, having a choice does not equate to being in a position to make a choice, especially when that choice has to be made immediately and without time for analysis. Secondly, you seem to assume that everyone can make decisions easily and instantly when in reality many people find it difficult to make any decisions, let alone stressful ones; you can't simply claim that an inability to decide is a decision in itself.

          Perhaps you personally find it easy to make decisions - that's fair enough for you, but if you start projecting your decisiveness, or indeed any of your personal qualities, upon everyone else you're going to end up criticising everyone else for not being you.

          1. Steve Knox

            Re: Save the women and children first!

            Firstly, having a choice does not equate to being in a position to make a choice, especially when that choice has to be made immediately and without time for analysis.

            The parameters of the problem state that you are in a position to act, and most forms state that you do have time to make a choice, but not to analyze that choice. It's a snap decision, yes, but it is a decision.

            Secondly, you seem to assume that everyone can make decisions easily and instantly when in reality many people find it difficult to make any decisions, let alone stressful ones; you can't simply claim that an inability to decide is a decision in itself.

            This is the entire point of the Trolley Problem. It's an edge case exemplifying the line between thinking things through and snap decision making.

            Perhaps you personally find it easy to make decisions - that's fair enough for you, but if you start projecting your decisiveness, or indeed any of your personal qualities, upon everyone else you're going to end up criticising everyone else for not being you.

            I believe you misunderstand my point. My point is that if you are in the situation described by the Trolley Problem, then, whether you throw the switch or not, you are the only one with the capability to do so. Hence you have a burden of responsibility to do one or the other, and will personally have to live with the consequences of whatever you do.

            I am not ascribing judgement on either choice, nor am I suggesting legal culpability would be a good thing in this case (as others have mentioned, the Good Samaritan laws exist specifically to prevent heaping legal trouble on top of the moral conundrum this type of problem poses.)

            1. Anonymous Coward

              Re: Save the women and children first!

              @Steve Knox: "I believe you misunderstand my point."

              Fair enough, but I still can't agree with "...you are the only one with the capability to do so. Hence you have a burden of responsibility to do one or the other..."

              The problem I have with this is the implied obligation to act: the "burden of responsibility to do one thing or the other". I'll admit straight away that my objection is more philosophical than practical but if you accept the proposition that you are obliged to act then this raises two further issues: firstly, an obligation means you have no choice, and secondly, to whom (or what) are you obliged?

              It seems to me that if you are obliged to act then you're putting the need to act ahead of which act you perform: in the context of the Trolley Problem it means that it doesn't matter which action you take because the subject has changed from which action you perform to simply whether you act or not.

              The second issue, of to whom or what are you obliged, is a bit of a trick question. Unless you are owned, as in slavery, then any obligation you may feel you are under can only be granted by yourself, whether it be to an individual person or to a concept, like Humanity.

              The upshot is that, unless you're a slave, there can be no external obligation upon you, only the obligations you grant to others. Consequently, if there can be no external obligations imposed upon you then you can not be burdened with any responsibility to act, other than by yourself.

        3. Swarthy

          Re: Save the women and children first!

          @Steve Knox: "because doing nothing when you have the capability to do something is a choice in and of itself"

          Ah, but doing nothing, legging it out of the situation, and denying you were ever there can be the same as not being there (for legal purposes). But if you switch the points (especially if you leave fingerprints/evidence) then you were there, and may face legal consequences for your choice.

      2. Mark 85

        @LeeE -- Re: Save the women and children first!

        And even if the Samaritan law could grant you immunity from criminal prosecution, could it also grant you immunity from civil prosecution?

        It might grant you immunity from those things, but what about the human angle? In other words, the angst and perhaps guilt that you made the wrong decision even if it was made with the best of intentions? Humans are not computers and thus, these "tests" are really meaningless except to rationalize something that really can't be rationalized at an emotional level.

        1. Anonymous Coward

          Re: @LeeE -- Save the women and children first!

          In other words, the angst and perhaps guilt that you made the wrong decision even if it was made with the best of intentions?

          Let's take the speciesist angle out of it. Imagine you're barrelling along at a rate of knots, and you see a small yappy dog on the pavement; at the same time you see a rare opportunity to flatten a magpie that's on the carriageway directly in front of you, that (for once) has miscalculated.

          What do you do? My money's on letting the magpie live and going up on the pavement to score the yappy dog. And bonus points if you can get the owner.

      3. Anonymous Coward

        Re: Save the women and children first!

        > Some countries have/are considering 'Samaritan' laws and these might be interpreted, in the 'Trolley Problem', as obliging you to switch the route

        That is not correct. To quote the relevant instance I am most familiar with¹ (my emphasis):

        Anyone who, being able by his immediate action, and without risk to himself or to third parties, to prevent either a crime or an offence against the bodily integrity of a person, voluntarily refrains from doing so, shall be punished by five years' imprisonment and a fine of 75,000 euros.

        The same penalties shall apply to anyone who voluntarily refrains from rendering to a person in peril the assistance which, without risk to himself or to third parties, he could have rendered either by his own action or by summoning help.

        ¹ Art. 223-6, code pénal français.

        1. Anonymous Coward

          Re: Save the women and children first!

          @AC: "That is not correct. To quote the relevant instance I am most familiar with¹ (my emphasis):"

          Note that I said "might be interpreted" and not will be/is interpreted.

          But the legislation you quote only considers one party being at risk, and in the trolley problem there are two parties. As framed, this law leaves the way open for lawyers to prosecute you whichever way you act, because acting to help one party means that you haven't acted to help the other party.

          Of course this is nonsensical, but nonsense doesn't deter lawyers.

          1. Anonymous Coward

            Re: Save the women and children first!

            > Of course this is nonsensical, but nonsense doesn't deter lawyers.

            Or possibly it doesn't make sense to those who are not learned?

            There is no better example of nonsense than the post I am quoting. Would the gentleman like a bigger spade to continue his very illustrative endeavours?

    5. Clive Harris

      A third option .. and possibly a fourth

      The traditional runaway trolley-bus dilemma has it that you see a runaway trolley-bus approaching, whilst standing near the points. You have the option of diverting it to hit one or other of two groups of people. To make it interesting, one group is usually more "deserving" than the other - possibly a choice between a group of school children or a load of drunks who've fallen asleep on the tracks.

      There is, however, a third option. By throwing yourself into the path of the trolley-bus you could derail it, or at least slow it down, thus saving both groups. This option has the questionable advantage that you won't have to live with the consequences of your choice (or with anything else, come to that!). I suppose the motoring equivalent of that is where you could avoid a collision with both oncoming vehicles by steering into a concrete wall, or over a cliff. Your choice in that situation says a lot about the sort of person you are.

      There is another variant of the dilemma which offers a fourth option. Here, you're still standing near the points watching the runaway trolley-bus. But this time you're a policeman who's just apprehended someone suspected (but not convicted) of committing a particularly nasty crime. You thus have the option of throwing him in front of the trolley-bus instead of yourself. (I can't think of a motoring equivalent for that)

    6. Anonymous Coward

      Re: Save the women and children first!

      Would have loved to take part in the experiment purely for the troll value of it.

      I have not looked at the research so I do not know how they controlled for the fact that people knew they were in a simulator¹, not in a real-life situation, but just to teach them to pay attention to that, I would have mowed down the poor first, then children, then women, then minorities, then vaguely communist or homosexual looking people (unless rich), backing up as many times as necessary, while avoiding white anglo-saxon protestant males at all costs².

      Come on, what next? Make political decisions based on the points of view expressed in YouTube comments?

      Oh, hang on...

      ¹ Although simulators can be impressively convincing.

    ² Wasn't there a columnist here, whose mother was a psychiatrist, who used to troll the psychology students on his campus by giving the "wrong" answers to personality tests?

  3. Doctor Syntax Silver badge

    I suspect that in reality the human driver concentrates on trying to avoid either collision, and what actually happens is the failure mode (or the success, if the accident is actually avoided).

    1. wiggers

      From experience (my family were nearly wiped out) people's reactions are not always in favour of the 'minimise damage' ones. When a car pulls out from a turning on your side of the road, for example, many people will instinctively swerve to avoid it, ignoring oncoming traffic with double the impact speed. Surely the test of the decision-making algorithms is how much better do they do than the average human driver?

      1. Jason Bloomberg Silver badge

        When a car pulls out from a turning on your side of the road, for example, many people will instinctively swerve to avoid it, ignoring oncoming traffic with double the impact speed.

        It is counter-intuitive but impact speed isn't doubled, not when one sits down and works it out. Draw two cars driving head-on into each other. Erect two immovable walls between the two cars, have them both drive into those rather than each other. Where's the difference? There is none.

        Of course, if the car one is following is moving rather than stationary a head-on will increase the impact speed, but not double it.

        1. Len Goddard

          Zeno

          Oh yes, and Achilles never overtakes the tortoise.

        2. Holtsmark Silver badge
          Boffin

          The problem with your logic is that the car that pulls out from a turning on your side of the road is NOT an immovable wall. It is all about the energy that needs to be dissipated in the crash.

          Your car: E = ½mV²

          Other car on the road: E = ½mV²

          Car on side of road (very low speed): E = ½m·0² = 0

          Thus hitting the car on the side of the road means that half the energy must be distributed over both vehicles, when compared to impacting the one in the opposite direction.

          Rather than giving the cars knowledge of ethics, I'd prefer the car to be given knowledge of physics.

          Lots of energy can be dissipated in controlled crunching and spinning, rather than into acceleration of meatbags.

          1. Jason Bloomberg Silver badge
            Coat

            It is all about the energy that needs to be dissipated in the crash.

            But I was commenting on the "double the speed" claim, not energy. I guess I might as well throw in the towel and ride the downvote slide. Hope no one's coming the other way.

            1. Anonymous Coward

              The car coming the other way made no mistake; they were just driving. The car pulling out made a mistake. Given perfect information, and an AI deciding which direction to go, with the car itself guaranteed to be a total loss either way, who should have the accident?

            2. Anonymous Coward

              > But I was commenting on the "double the speed" claim, not energy

              Sum of the speeds would have been more accurate, but it is petty to pick on the poster's assumption that two vehicles travelling in opposite directions on the same road would be going at approximately the same speed.

          2. Anonymous Coward

            > Lots of energy can be dissipated in controlled crunching and spinning, rather than into acceleration of meatbags.

            In our advanced driving training we were constantly reminded of the need to avoid a head-on at all costs, swerving off the road if need be.

            And from experience as a first responder, I will take a rollover, even a drop off a cliff, over a head-on any day! Rollovers are impressively survivable in modern vehicles, as long as the occupants are wearing their seat belts properly. You should see what the victims of a head-on look like.

        3. john R

          Well... You're correct in saying that colliding with a car coming the other way is much the same as crashing into an immovable wall. But crashing into a stationary car is not the same as crashing into an immovable wall: the car that is crashed into will move which will reduce the impact.

          The first-order physical approximation of the "moving/stationary" collision is the zero-momentum frame where you model it as two cars heading towards each other, each travelling at half the speed of the original car. (Of course the moving road complicates things.)

          What most people care about is reducing injuries and we would need an RTA expert to tell us if the increased injury in the "stationary" car from the lower side protection and the sideways movement of the passengers exceeds the reduction of injuries from the smaller impact. And I would guess that braking may be more effective in this case.
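The energy bookkeeping a few posts up, and the zero-momentum picture above, can be checked with a short sketch. This is my own illustration, not any commenter's: masses and speeds are made-up round numbers, and collisions are modelled as perfectly inelastic (the cars stick together), which gives the dissipated energy as ½·μ·Δv² with μ the reduced mass.

```python
def dissipated_energy(m1, v1, m2, v2):
    """Kinetic energy (J) lost when two bodies collide and stick together."""
    mu = m1 * m2 / (m1 + m2)   # reduced mass
    dv = v1 - v2               # closing speed
    return 0.5 * mu * dv ** 2

m = 1500.0   # kg, both cars (illustrative)
v = 20.0     # m/s, roughly 45 mph (illustrative)

head_on    = dissipated_energy(m, v, m, -v)   # both doing v, opposite ways
stationary = dissipated_energy(m, v, m, 0.0)  # hitting a parked car
wall       = 0.5 * m * v ** 2                 # rigid wall: all your KE lost

print(head_on, stationary, wall)   # 600000.0 150000.0 300000.0
```

Per car, the head-on dissipates 300 kJ, the same as driving into a rigid wall at your own speed, which is the "two walls between the cars" point; hitting the stationary car costs each car only a quarter of that, exactly what the "two cars each at half the speed" zero-momentum picture predicts.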

        4. leonard.schneider@gmail.com

          Whatever

          So crashing into a wall at one speed is the same as crashing into it at double the speed?

      2. Doctor Syntax Silver badge

        "When a car pulls out from a turning on your side of the road, for example, many people will instinctively swerve to avoid it, ignoring oncoming traffic with double the impact speed. "

        Depending on the road the oncoming car has the possibility of swerving out the way.

        What's quite possible, of course, is that the car pulling out will accelerate and catch them both. I've seen something like that, except it was a car pulling away from the lights that accelerated towards a car which had, through no fault of its driver, been stranded in the middle of a crossroads.

  4. The Man Who Fell To Earth Silver badge
    FAIL

    intellectual masturbation

    This type of work is an exercise in intellectual masturbation (useless process that just makes those doing it feel good).

    When AI really is driving cars, the AIs will eventually be able to access a boatload of data and identify who they may be about to run over (LinkedIn, Facebook, people's CVs, facial recognition, etc.) and use that as part of the value proposition on who is most expendable in these situations. In which case, the AI's decisions will be based on all sorts of evaluations of the value the potential road kill has to society, as opposed to "save the baby" or "save the most people" propositions.

    And of course, society & industry being what they are, for those with the money, there will be a service offered that will instruct the AI to crash/run-over anyone else first.

    1. 's water music

      Re: intellectual masturbation

      This type of work is an exercise in intellectual masturbation (useless process that just makes those doing it feel good)

      I call Category Error. Masturbation is never useless.

      1. handleoclast

        Re: Masturbation is never useless.

        @'s water music

        If you're male, when you get old enough you find that your dangly bits rarely do more than dangle. In those circumstances masturbation is often useless.

        1. Paul Crawford Silver badge

          Re: Masturbation is never useless.

          "masturbation is often useless."

          Surely that is what Viagra was invented for? OK there is some risk of heart failure but we all have to go sometime.

      2. Anonymous Coward

        Re: intellectual masturbation

        > I call Category Error. Masturbation is never useless.

        You owe me a new keyboard.

        (no, not coffee)

    2. Anonymous Coward
      Terminator

      Re: intellectual masturbation

      @The Man Who Fell:

      What if the driver AI was developed or acquired by Facebook, or by LinkedIn's owner Microsoft? (Hmmm... I compute I cannot run that person over: they have a couple of Premium services on their LinkedIn account. That toddler is looking pretty expendable though.)

      1. Anonymous Blowhard

        Re: intellectual masturbation

        "That toddler is looking pretty expendable though"

        Don't worry, toddlers would kill you in a heartbeat...

        1. Clive Harris
          Childcatcher

          "The wheels on the bus go round and round"

          After watching that video, I tried the same experiment with my 4-year-old grandson. I didn't have a toy train or Lego people handy, so I improvised with a large wooden bus and four small toy cars. I lined up three of the cars in a row and a fourth car on its own to one side. I then explained to him that the bus was going to crash and had to hit one car, or group of cars. He carefully removed the car sitting on its own, explaining "That's MY car". He then ran the bus over the other cars, singing "The wheels on the bus go round and round".

          Not sure what that proves, but I suppose a claim of ownership over one of the "victims" could tip the moral balance.

    3. DJV Silver badge

      save the baby/most people

      Well, as long as they save the cheerleader - any alternative might be truly disastrous!

    4. Pedigree-Pete
      Joke

      Re: intellectual masturbation

      @The Man who fell to Earth... SaaS, Survival as a Service? PP

      Icon, perhaps you weren't joking..

  5. Anonymous Coward

    Mind my ass

    Any algorithms would of course need to take account of local laws, which means in the UK the car would have to be able to tell whether the potentially colliding object was a horse, cow, ass, mule, sheep, pig, goat or dog before framing a course of action.

    If we're talking about hitting people, though, I'd go for maximising the QALYs. A bit tough if the car swerves to avoid a baby in a buggy and then as a result mows down a small group of pensioners, but that's life. Or death.

    1. Anonymous Coward
      Stop

      Re: Mind my ass

      Aren't all swans in England still royal property? So AI can run over a goose, but if your AI mistakenly runs over a swan, then HM sends a pack of corgis to chew the developer's ass...

      1. Anonymous Coward

        Re: Mind my ass

        Nah, you're OK with swans as, despite being royal property, they are still beasts ferae naturae.

      2. Paul Crawford Silver badge

        Re: Mind my ass

        "HM sends a pack of corgis to chew the developer's ass..."

        Can we have this applied to more UK gov projects? Might just improve the on-time and on-budget chances.

  6. Anonymous Coward

    I think the easiest way to sort this is with the handbrake.

    Get the car sideways and take out both lanes or just drive down the middle with both side doors open.

    That's that particular ethics problem solved though to be fair it then raises another more serious ethical dilemma. It's hard this self driving car stuff, I hope they get it right.

  7. John Mangan

    Ethical decisions

    I've always wondered if, in the trolley problem, people would really throw the lever. It's one thing to say kill one person rather than five but if it requires a physical act to make that happen would most people actually do that or just stand frozen rather than taking responsibility?

    I would hope (yeah, I know) that as automation improves the various sensors and programming will allow improved anticipation of events (and as more and more automated cars appear the erratic human element will decrease) and reduce the need for some of these ethical decisions.

    But there will always be edge cases and few people will buy cars that will sacrifice the driver in those cases so working out 'acceptable' resolutions sounds like a good idea.

    1. Len Goddard

      Re: Ethical decisions

      There is a variant of the trolley problem. The trolley is bearing down on 5 children. You are on a bridge over the tracks with a very fat man. If you push him onto the tracks the trolley will be stopped before it hits the kids. You can't jump yourself as you do not have the body mass to stop the trolley.

      Amusingly, the 1-for-5 brigade will generally flip the switch but not shove the fat man.

      1. John Mangan

        Re: Ethical decisions

        I'm aware of the variant but I suspect that people's squeamishness will kick in earlier than these scenarios would suggest.

    2. Anonymous Coward
      Happy

      Re: Ethical decisions

      I've always wondered if, in the trolley problem, people would really throw the lever. It's one thing to say kill one person rather than five but if it requires a physical act to make that happen would most people actually do that or just stand frozen rather than taking responsibility?

      Of course not. If you don't throw the lever, it's a nasty accident and not your problem.

      If you do throw the lever, then you don't get some sort of medal - instead you are on trial for murder, you are being sued for everything you own by the relatives etc etc. So, a life of prison and penury. Society might win, but you personally lose, big time.

    3. allthecoolshortnamesweretaken

      Re: Ethical decisions

      Let's not kid ourselves here.

      Chances are that the "algorithm" that decides who should be saved and who is expendable will be based on which of the person(s) involved is a premium account holder and which isn't.

      1. Anonymous Coward
        Meh

        Re: Ethical decisions

        Chances are that the "algorithm" that decides who should be saved and who is expendable will be based on which of the person(s) involved is a premium account holder and which isn't.

        Yes - we'll quickly find that "high net worth individual" means what it says.

        1. quxinot

          Re: Ethical decisions

          Yes - we'll quickly find that "high net worth individual" means what it says.

          I'm gleefully awaiting hackers, aiming for the high score, then.

  8. earl grey
    Devil

    yeah, i'm not quite ready

    For a car that sacrifices me for toddlers, pensioners, bikers, or anyone else in the road who doesn't belong on the lane i'm in. no thank you.

    1. John Mangan

      Re: yeah, i'm not quite ready

      But you're happy/ier for the meatbag behind the wheel to randomly take you out - because you've got better odds of not being targeted? Even though the risk of this kind of avoidance manoeuvre being necessary in the first place may be higher with human drivers?

      1. The First Dave

        Re: yeah, i'm not quite ready

        Any self-driving car _should_ be driving within the limits of the road conditions, so there is no need for any ethical decisions.

        1. John Mangan

          Re: yeah, i'm not quite ready

          So no pedestrian will ever walk into the road from between parked vehicles without looking?

          1. The First Dave

            Re: yeah, i'm not quite ready

            @John

            Just like a human driver, I would expect a self-driving car to anticipate this.

            But then, I'm not clear on what _you_ would do under those circumstances - do you normally drive too fast to be able to stop?

            1. John Mangan

              Re: yeah, i'm not quite ready

              @The First Dave - last week someone at work sent round a video of various dashcam OMGs.

              One was taken from a van moving at what looked like a responsible speed down a narrow road with cars parked closely on each side, when a mother and two children walked straight out from behind a parked van mere feet from the moving vehicle. The mother looked up after clearing the parked van and stepped back; the driver reacted with commendable speed and stopped the van in very short order, but the two kids still made contact with the bonnet - although with no obvious damage.

              Shit happens. The world is not fully under our control. Idiots are abroad. Machinery fails; brake cables, for instance.

        2. Neil Barnes Silver badge

          Re: yeah, i'm not quite ready

          Which raises en passant the question of why a cruise control (e.g. the one on the Tesla that couldn't tell the difference between the sky and a truck) can be set at a higher speed than the local limit...

          1. Anonymous Coward
            Anonymous Coward

            Re: yeah, i'm not quite ready

            My Ford Focus which can supposedly read speed limit signs sometimes reads the signboards of passing petrol stations or phone numbers from the backs of lorries and then swears that the speed limit is 100mph or more.

  9. Named coward

    Has the trolley problem actually ever manifested itself in real life? We tend to over-complicate things. If you are driving and see an obstacle in your way, you brake. It might mean that the asshat tailgating you ploughs right into you... tough luck. Sometimes you might even need to swerve. If, by swerving, you find yourself in a new collision path, you try to avoid that as well, and hope that if it's a car its driver will also be trying to avoid the collision. Accidents happen and will always happen... instead of wasting processor resources trying to figure out who to drive over, those same resources should be engaged in trying to avoid any collision.

    1. John Mangan

      I believe the trolley problem is just a vehicle (pun intended) for trying to pry into how humans 'ideally' weight life and danger. It's meant to provide some way of addressing the ethical difficulties that real life can throw up - when someone's brakes fail, a pedestrian is paying more attention to their phone than the road, the cyclist who believes the public highway is the place to practice their wheelies, etc.

      1. Charles 9

        Plus there ARE some real-world occurrences of "crap happens": situations where the driver WILL lose no matter what: the I-35 bridge collapse, a small deaf child (too small to be seen behind a car, unable to hear the engine) running 3 feet in front of a car (no car can stop on a dime, even at neighborhood speeds). Another term that springs to mind is "Cold Equation," where the sheer math and physics say SOMEONE's gonna die no matter what.

  10. Tim Hughes

    4 seconds?

    4 seconds seems to be an odd choice if you are interested in how people react to an imminent collision - if I'm not mistaken, the length of time the traffic lights are at amber in the UK is 3 seconds or so, presumably going up as the speed limit increases, and in that time you are expected to recognize the change, decide what to do and brake to a stop before the line (or, you know, say fuck it and put your foot to the floor. But I digress). This is in a predictable environment, where drivers really ought to be expecting to have to stop, but there are still many people who make awful decisions every day.

    When someone/something appears in your path there is absolutely no way that you have anything like 4 seconds to make a weighted judgement and then (i.e. 4 seconds later) start to execute your plan. From personal experience I would wager that most people go into some extreme tunnel-vision-like instant reaction where they simply try to miss the thing that first grabs their attention as something to miss (which could be the ball or could be the child), so they hit the brakes and/or swerve.

    Only once the instinctive manoeuvre is started will they begin to focus on what comes next in the list of things not to hit given the new direction they are travelling (oncoming vehicle, tree, pensioner, more children, etc.).

    I don't really see where there is any ethical judgement being made by puny humans, and any machine capable of accurately recognising and categorizing all of these things in real time is also likely capable of driving entirely within its own abilities to miss everything. Whether or not people actually like being driven by machines that are that cautious around blind corners, junctions, etc. is another matter.

    Edit: Ninja'd by several people above.

    1. John Mangan

      Re: 4 seconds?

      Although I agree with your analysis of panicked human collision avoidance, I think the four seconds was chosen to try and tease out what humans would see as an 'acceptable' hierarchy of harm should such a thing come to pass.

      1. Tim Hughes

        Re: 4 seconds?

        I agree, and it is a worthwhile exercise, but it would be a shame if everyone got hung up on something without ensuring that this research is heavily caveated with a "in the real world people don't actually have time to make these judgements" message.

      2. Anonymous Coward
        Anonymous Coward

        Re: 4 seconds?

        As I've said before somewhere on the forums here, the choice for a self driving vehicle *should* be to endanger itself and its occupants, rather than other road users. Modern safety features mean that if the self driving car is travelling at a safe speed for an urban road then the occupants of the vehicle can walk away from a crash, whereas the muppet pedestrian / cyclist etc. who caused the crash won't walk away, and might never even breathe again.

        The flip side of this means that we'll have to bring in jay-walking laws in countries that don't have them, so that the liability for the accident falls on the individual who actually caused the accident. For the record: in the UK at least, cyclists are already bound by road traffic laws. Bear in mind that as soon as self driving cars become properly legal, there will be a complete record of every accident, and if your self driving car steers *towards* a pedestrian to hit them, it may well result in a temporary ban on every car that uses derivatives of the same firmware until the full circumstances of the accident are understood. Think about faults in aircraft that ground entire fleets until the problem is understood; *exactly* the same will apply to self driving cars.

        Disclosure:- I'm a cyclist, but not a psycho loony who doesn't think the law applies to them.

    2. Anonymous Coward
      Anonymous Coward

      Re: 4 seconds?

      > I think the length of time the traffic lights are at amber in the UK is 3 seconds or so, presumably going up as the speed limit increases, and in that time you are expected to recognize the change, decide what to do and brake to a stop before the line

      Digressing, but the Austrians have the green light blink four times before turning yellow, meaning that there is no possible excuse to run through a red (or yellow) light.

      Of course, it helps that people actually start slowing down when the blinking starts, instead of speeding up as no doubt would happen in other parts of the world (or Vienna, they seem to win their licences in a raffle there).

  11. Anonymous Coward
    Anonymous Coward

    Open the door, Hal

    I'm sorry, Dave, I can't do that.

    This sort of thing has happened before, and it's always been due to an unpredicted issue in my programming

  12. allthecoolshortnamesweretaken

    Re: Algorithms to give self-driving cars 'impulsive' human 'ethics'

    So, an "AI" driver capable of road rage, then?

  13. Toltec

    Artificial constructs

    The four seconds is problematic for me for a number of reasons. In an urban environment you should be able to come to a stop in much less distance than your vehicle can cover in four seconds; in a modern car this should even be possible if one of the braking circuits has failed*. This is with a meatbag in control; the reaction time of an autonomous car would be much quicker, and sounding the horn to warn the pedestrians would be near instantaneous.

    Even if we assume a trolley bus with a long stopping distance it is still absurd, as an automated system would not need to rely on sensors fitted to the bus alone; it would use external sensors fitted along the route. If it cannot stop in the distance that can be seen to be clear then it is going too fast!

    The scenario also fundamentally ignores that the pedestrians will also have four seconds or so to notice and react to the oncoming vehicle, particularly after it has sounded a warning; the chances of them standing still while the vehicle manoeuvres are very low. It is probably better to brake in a straight line, being predictable, and hope the pedestrians get out of the way.

    * An EV could have a tertiary braking system in the form of regen or even emergency reverse drive of the motor.
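    The claim above - that at urban speeds a car stops well inside the distance it would cover in four seconds - is easy to sanity-check with the standard reaction-plus-braking formula. A minimal sketch; the 30 mph speed, 1.5 s reaction time and 7 m/s² deceleration are illustrative assumptions, not measured figures:

    ```python
    # Back-of-envelope check: reaction distance plus braking distance (v^2 / 2a),
    # compared against the distance covered in four seconds at constant speed.
    def stopping_distance_m(speed_ms: float, reaction_s: float = 1.5,
                            decel_ms2: float = 7.0) -> float:
        """Distance travelled while reacting, then braking to a halt."""
        return speed_ms * reaction_s + speed_ms ** 2 / (2 * decel_ms2)

    speed = 30 * 0.44704  # 30 mph in m/s (about 13.4 m/s)
    print(f"Stopping distance: {stopping_distance_m(speed):.1f} m")   # 33.0 m
    print(f"Distance covered in 4 s: {speed * 4:.1f} m")              # 53.6 m
    ```

    Even with a leisurely human reaction time the car stops in roughly 33 m, well short of the 53.6 m a four-second window implies; an autonomous car's shorter reaction time would only widen the gap.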

    The scenarios should be far more inventive, for example, if an autonomous vehicle detects a human controlled or apparently out of control vehicle about to hit pedestrians should it intervene by hitting that vehicle?

    1. Jason Bloomberg Silver badge

      Re: Artificial constructs

      The four seconds is problematic ... The scenarios should be far more inventive

      You are being chased by some nutjob locals, with axes, cleavers and probably guns, whom you have accidentally pissed off. You jump in your car and tear off. They jump in their cars and follow.

      Foot to the floor, you turn into an underpass, the nutjobs follow. Ahead you see there's obviously been a major hit and run accident, dozens of injured people are lying all over the road.

      Stop you die. You can't turn round. You are going to hit at least some people. You're going to have to pick some strategy. Luckily it's a long tunnel. You have 30 seconds to decide what to do.

      1. Toltec
        Coat

        Re: Artificial constructs

        "you turn into an underpass"

        That would seem to be rather a rookie error for someone who gets into fights with armed nutjobs, particularly such a very long underpass. It will need to be at least a kilometre long, straight and level, to see an accident 30 seconds away at car chase speeds; you presumably have very good eyesight or some kind of image-enhancing goggles on.

        Clearly in your scenario you stop about 100m (depends how fast you can run) from the accident, slew your car sideways blocking the tunnel and run towards the exit. As the bad guys stop, exit their cars and give chase, you fire repeatedly into the fuel tank of your car, causing it to explode and deter the bad guys from continuing the pursuit. You run to the accident area, steal one of the cars of the good Samaritans who have stopped to help, and make your getaway.

        I've never played Grand Theft Auto so I just guessed that is where you got your scenario from. ;-)

        1. Anonymous Coward
          Anonymous Coward

          Re: Artificial constructs

          "Clearly in your scenario you stop about a 100m (depends how fast you can run) from the accident, slew your car sideways blocking the tunnel and run towards the exit."

          One of them is driving a heavy truck. They can just ram your sideways car and keep going shoving it along like a plow.

          "As the bad guys stop, exit their cars and give chase you fire repeatedly into the fuel tank of your car causing it to explode and deter the bad guys from continuing the pursuit."

          Busted by the MythBusters some time back. You'd have to be carrying incendiary rounds to even get a chance of lighting the gas in the tank, and even then it'll be a slow burn, time enough for them to climb over or otherwise get your car out of the way. Plus it won't do much versus the big truck driver.

          "You run to the accident area and steal one of the cars of the good samaritans that has stopped to help and make your getaway."

          Except the nearest cars happen to be police cars, and they tend to react VERY badly to having their cars stolen. Some of them start shooting.

  14. druck Silver badge
    Stop

    In harms way

    Choosing between hitting someone who has walked out into the road or swerving and hitting others on the pavement is not the right way to look at the issue, for a human or AI. The person who has stepped out into the road has deliberately put themselves in harm's way; the pedestrians on the pavement have not.

    If the person stepping out is aware enough they may have a chance of getting out of the way in time, the pedestrians on the pavement will be generally unaware of what is happening on the road and will have no chance - as we've seen too often recently in both lorry accidents and terrorist attacks.

    1. John Mangan

      Re: In harms way

      Whereas a child may be blissfully unaware of the danger they've put themselves in but the screaming adult on the pavement is fully aware - and you don't think a human would attempt collision avoidance in such a scenario?

      I think this discussion here completely validates this research. It's not a straight cut-and-dried answer and therefore needs research and thinking about.

  15. Anonymous Coward
    Anonymous Coward

    My Cynical View

    I think that before we try to program ethics into autonomous vehicles we must consider that if vehicle programming can be tweaked to avoid pollution tests they can be tweaked to avoid the most costly legal liability outcomes.

  16. Gene Cash Silver badge

    Don't really care

    I paid for the car, it sure as hell better protect ME first.

    If you pull out on me, walk out in the road, cross the centreline, whatever, that's YOUR lookout.

    1. Anonymous Coward
      Anonymous Coward

      Re: Don't really care

      Even if they have NO way to know you're coming?

  17. TheOtherMe
    Paris Hilton

    Cultural Bias?

    "Save the women and children first" - Is this just a reflection of the test subjects chosen for the exercise? Do all peoples from all cultures express the same value judgements? Would people from India for instance choose to hit the person rather than the cow, given their reverence of the sacred animal?

    This is a very tricky area for AI to navigate, research and discussion are important if we are going to trust these machines with our lives.

  18. Fonant

    Risk Compensation

    The problem isn't the AI, it's the trade-off between (a) taking risks to make sensible progress and (b) the rate of deaths and serious injuries that those risks result in, at a population level.

    At the moment, with human drivers, UK society seems comfortable with killing around 7 people every day of the year in return for the benefits of being able to drive as we do.

    With driverless cars, would we set the trade-off at, say 5 deaths per day? If we set the death rate too low then driverless cars will have to proceed extremely carefully to minimise the risk of death in the rare-but-possible event that a human pedestrian does something the AI does not or cannot expect. But could the families of those five dead people sue someone for deliberately designing the AI to take risks at a level that is known to result in deaths?
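    The population-level arithmetic behind that trade-off is worth making explicit. A trivial sketch using the comment's own figures (the 5-deaths-per-day target is purely hypothetical):

    ```python
    # Illustrative only: compares the quoted human-driver death rate with a
    # hypothetical acceptable rate for driverless cars, at a population level.
    HUMAN_DEATHS_PER_DAY = 7   # UK figure quoted above
    AI_DEATHS_PER_DAY = 5      # hypothetical driverless-car target

    lives_saved_per_year = (HUMAN_DEATHS_PER_DAY - AI_DEATHS_PER_DAY) * 365
    print(lives_saved_per_year)  # 730
    ```

    Even a modest reduction in the daily rate compounds into hundreds of lives per year - which is exactly why the liability question for those remaining deaths becomes so pointed.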

    Then what about deliberate "jay-walking" to play "chicken" with driverless cars? Imagine the fun to be had creating traffic jams by getting a bunch of hooligans to pretend to be about to step into the road!

    The problem is that we still allow fast-moving heavy machinery (cars, lorries) to exist in the same space as human beings (pedestrians, cyclists). Not something that is ever allowed on industrial sites, the HSE would require physical separation (like railways) or very low speed limits, banksmen for reversing vehicles, etc. See http://www.hse.gov.uk/workplacetransport/index.htm

  19. wayne 8

    A half century of driving

    Over a half century of driving, driving professionally for a time, driving a NYC taxi, driving long haul buses, commuting into NYC on motorcycles, etc. I have racked up 72K miles in one year.

    I have never had a real life trolley problem involving people, pets, or vehicles.

    Reduce speed when conditions become ambiguous. Be aware of the area your vehicle is entering. Constantly scanning for possible "targets".

    Situational awareness. [Also useful to survive as a pedestrian in current year London.]

    1. Charles 9

      Re: A half century of driving

      I guess you've never had one just suddenly jump out at you from a blind spot and too close to stop even at cautious speeds. There are also "crap happens" moments when you're simply screwed: such as driving on a bridge the day it suddenly collapses under you.

  20. Anonymous Coward
    Anonymous Coward

    Gender / Class / Race / Object Equality

    Whether you are a woman, a man, a chicken, a rich dude, a poor guy, a homosexual, a heterosexual, a goat, the boss, the founder, the CEO, a tree or a rock, if you are in the middle of the road, the self-driving car will mow you down, er, slow down, as much as the laws of physics allow. Fair game?

  21. tiggity Silver badge

    Pick your ethics

    As has been said, peoples ethics may differ.

    The assumption is that someone (in an animal v human situation) would always choose to avoid the human.

    What if your choice was between a cuddly animal or Theresa May / Michael Gove / Jeremy Hunt / Jeremy Corbyn / Nigel Farage / Piers Moron / etc. etc. - delete as applicable (i.e. animal vs someone you really dislike; go for a real Godwin-option figure if you like).

    What if the animal was something extremely endangered and so of "higher value" than a random magpie, squirrel, etc. (and arguably "higher value" than any random person, e.g. an animal species down to its last few dozen individuals)?

    Let's look at baby vs adult and idea of always avoiding the baby.

    What if you are in a country with a very high baby / child mortality rate (e.g. a 90% infant mortality rate due to disease and lack of vaccination or medical treatment), but once someone has reached adulthood they have a good long life ahead of them - it's not so long ago that most infants in the UK never reached adulthood...

    In the baby v adult situation, logic would suggest the baby is highly likely to die anyway, whereas the adult should have a good lifespan ahead, so the best option would be to avoid the adult.

    All sorts of "what ifs" you can apply to screw up the "obvious" ethical choice.

    1. earl grey
      Trollface

      Re: Pick your ethics

      Can we make it a game for points? Death Race 2000 style? Ms. May riding an albino drop bear on the back of Jeremy. More points if you get 'em all.
