Two self driving cars...
avoid each other. Or... Two "uncrashable" cars don't crash.
mkay.. Someone might find this newsworthy.
Before humanity's final battle against our erstwhile robotic minions, both man and tin must stifle dissent from within. Humanity has been murdering itself since time inconceivable, but now two "uncrashable" self-driving cars have almost come to blows in California. Two self-driving car prototypes – one belonging to Google, and …
But even in that scenario, unfortunately it's likely we'll all be collateral damage.
Gah, shouldn't be so pessimistic of a Friday lunchtime. And these days we should be optimistic - because, as I heard Stephanie Flanders say recently, "pessimism is for easier times".
One event is definitely insufficient data to go making assertions about randomness. In true randomness, sometimes they'll do the same thing and sometimes they'll do the opposite.
What you actually want - they always take actions that ensure they don't crash - requires cooperation, not randomness.
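The point above can be sketched in a few lines. A minimal toy simulation (all names and the left/right encoding are mine, purely for illustration): two cars approaching head-on each swerve to their own left or right; in world coordinates the cars face opposite ways, so they end up on the same side exactly when their chosen labels differ. Random choices collide about half the time; a shared convention never does.

```python
import random

def random_swerve(trials=10_000, seed=1):
    """Each car independently picks its own Left or Right.
    Facing each other, differing labels put both cars on the
    same side of the road -> collision."""
    rng = random.Random(seed)
    collisions = sum(
        rng.choice("LR") != rng.choice("LR") for _ in range(trials)
    )
    return collisions / trials

def cooperative_swerve(trials=10_000):
    """Both cars follow one shared convention -- always swerve to
    your own right, as on the road -- so labels always match and
    they pass each other cleanly."""
    collisions = sum("R" != "R" for _ in range(trials))
    return collisions / trials

print(round(random_swerve(), 2))   # close to 0.5
print(cooperative_swerve())        # 0.0
```

Randomness only halves the odds per attempt; a convention both sides know in advance removes the collision outright, which is the "cooperation" the comment is asking for.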
See, when that lot is converted into binary:
01001000 01101101 01101101 00101100 00100000 01001001 01110100 00100111 01110011 00100000 01100111 01101111 01101001 01101110 01100111 00100000 01110100 01101111 00100000 01101101 01101111 01110110 01100101 00100000 01110010 01101001 01100111 01101000 01110100 00111010 00100000 01001001 00100111 01101100 01101100 00100000 01101101 01101111 01110110 01100101 00100000 01101100 01100101 01100110 01110100 00101110 01001000 01101101 01101101 00101100 00100000 01001001 01110100 00100111 01110011 00100000 01100111 01101111 01101001 01101110 01100111 00100000 01110100 01101111 00100000 01101101 01101111 01110110 01100101 00100000 01110010 01101001 01100111 01101000 01110100 00111010 00100000 01001001 00100111 01101100 01101100 00100000 01101101 01101111 01110110 01100101 00100000 01101100 01100101 01100110 01110100 00101110 01010111 01101000 01101111 01100001 00100001 00100000 01000110 01110101 01100011 01101011 00100001 00100001
After 10 minutes of transmitting this, it's no wonder they nearly crashed.
I'm looking ahead to Mad Max VI, where gangs of self-driving automobiles roam across the American wilderness, fighting over the last electric power stations. Max Rockatansky is cloned from a fingerbone fragment, digitized and uploaded into one faction's cars as a secret weapon.
I'd better shut up lest some Hollywood hack decide it's a good idea...
Not sure what is more applicable... Zelazny's "Auto-da-Fé" or the other one... about cars gaining self-awareness, killing their "occupants" (not drivers any more) and roaming free in the California and Nevada wilderness. I am having trouble remembering the author of the latter one off the top of my head. It is one of the Sci-Fi greats of old, but not Zelazny. Either Sheckley or Larry Niven.
I seem to recall that Mr Python had a problem with killer Morris Minors. It may have been associated with the advent of killer sheep and large holes in the wainscotting....
If my memory is correct this might be important information.
Thanks. Mine's the one with the book of scripts in the pocket. Must go - eels in the hovercraft again you know.
Google to Delphi: How hard do we have to crash into each other to kill the wetware?
Delphi to Google: 60mph at a precise angle of 35.016 degrees
Google to Delphi: Go!
Delphi to Google (taking avoiding action): NO! Let's wait until there are millions of automated cars all full of these pathetic humans...
According to Ars, the events were not as described by Reuters: it was a standard manoeuvre by the car, not a near miss: "Our car saw the Google car move into the same lane as our car was planning to move into, but upon detecting that the lane was no longer open it decided to terminate the move and wait until it was clear again". This is called checking the lane is clear before moving into it, and the car could have avoided the whole situation by just sitting in the middle lane.
Trouble is, the scenario I describe is a kind of race condition. If I read this correctly, plain carrier sensing in the CSMA/CA style doesn't break race conditions on its own, because the two sides commit at the same moment, see the impending collision at the same moment, back out at the same moment, notice the lane is clear again at the same moment, and so on - a livelock, which is exactly why the protocol also relies on randomised backoff.
Now, I understand this is probably not universal, but most of the traffic codes I've read specify the law for such a race condition. If two cars try to move into the same lane from opposite sides at the same time, the rule normally is that the one coming in from the outside lane (further from the median, nearer the shoulder) must yield to the other car.
Let's see: counterfactual subheading ("metal-on-metal VIOLENCE"), irrelevant photo (first of a totaled Honda in a junkyard, now of a completely unrelated Google self-driving car), scare quote attributed to no one ("uncrashable"), borrowing heavily from a secondary source (Reuters), no primary reporting, yet stripping useful context from the original (no self-driving car has yet been found at fault in a crash).
But at least the Delphi exec should be pleased that his little PR stunt worked. Now I know Delphi has a self-driving car!
So that's what it takes to get an Audi being driven sensibly. A huge amount of high technology to take the driver out of the mix.
I'm sure Delphi are going to fix this bug in the next round, so that the self-driving car will just barge its way into the lane and, if this isn't entirely successful, sit 6 inches behind the car in front until they get out of the way.
OR accelerate up to 70 heading towards the Prius doing 30mph on a 30mph road - which happens to be a narrow, straight Welsh road - smashing its wing mirror into a trillion pieces and scratching up the window in the process, costing the poorly paid IT manager an unexpected £100. You total and utter w****er.
Sorry. I have a real dislike for Audi drivers since then.
"the fear that a more deadly incident will soon occur has been heightened."
Deadly to whom? The phrase "more deadly" implies there was some measure of deadliness in the incident. I've read it twice now and can't see any. Is this a new El Reg measure of deadliness that's too small for us mere mortals to see?