back to article Renewed calls for Tesla to scrap Autopilot after number of crashes

US advocacy group Consumer Watchdog has renewed its demands for a recall of Tesla's Autopilot feature following a number of crashes. Earlier this week, a Tesla smashed into a construction barrier truck on a German motorway at high speed. Consumer Watchdog said the "growing list of Tesla crashes demonstrates the urgent need to …

  1. Filippo Silver badge

    98%

    Uhm, so 2% of users do NOT understand that they are supposed to maintain control of the vehicle at all times? That's quite a lot of people. No wonder there's a number of crashes.

    1. CustardGannet
      Facepalm

      Re: 98%

      It's hard to legislate for the 2% of 'tards who can't be bothered to RTFM.

      Probably the same people who don't understand that a cup of coffee is hot, unless it says "WARNING ! CONTENTS ARE HOT !!!!!!!!!" on the label, or that packets of nuts may contain nuts.

      As a wise man (D. Adams) once said, "A common mistake that people make when trying to design something completely foolproof is to underestimate the ingenuity of complete fools".

      1. Neil Barnes Silver badge
        Boffin

        Re: 98%

        Making things foolproof simply means more fools breed more fools.

        1. johnfbw

          Re: 98%

          To be the 2% might not get to breed if they think a car will drive itself. (Though if they can afford a Tesla they are probably old enough to have breed)

          1. Nonymous Crowd Nerd

            Re: 98%

            Breeded? Or perhaps bread?

            1. Anonymous Coward
              Anonymous Coward

              Re: 98%

              or even bred?

              1. John Presland

                Re: 98%

                Nerd was funny, but you ...

      2. Vector

        Re: 98%

        "It's hard to legislate for the 2% of 'tards who can't be bothered to RTFM."

        No...it's not. It's simple really. DON'T ALLOW IMPLEMENTATION OF BETA SYSTEMS IN VEHICLES USED ON PUBLIC ROADS!!!

        I'm all for self driving cars but Tesla has jumped the gun in releasing this semi-autonomous system to the public. If even one person can misconstrue the name or significantly misuse the system, then that one person poses a hazard to navigation.

        Much as I want a self driving vehicle, all of these semi-autonomous steps along the way to that goal are going to add up to a nightmare on the highways.

        1. John Brown (no body) Silver badge

          Re: 98%

          "I'm all for self driving cars but Tesla has jumped the gun in releasing this semi-autonomous system to the public."

          Very few countries allow new car models to be sold to the public without all sorts of testing and safety analysis. Maybe the relevant authorities are as much to blame for allowing these on the road without demanding full and complete information be shoved in the driver's face, along with proper testing, especially real-world usability testing. The semi-autonomy of these vehicles is quite a significant step change from your average new car model with pretty standard controls and features that people get trained on when learning to drive.

      3. Ben Tasker

        Re: 98%

        > Probably the same people who don't understand that a cup of coffee is hot, unless it says "WARNING ! CONTENTS ARE HOT !!!!!!!!!" on the label

        To be fair, in that particular lawsuit the coffee wasn't just hot, it was scalding and far hotter than it ever needed to be. That and the resulting skin damage is what the lawsuit was about, not the fact they didn't mark it as hot, but that McD (IIRC) made it far hotter than any reasonable person might expect it to be (and IIRC didn't put the lid on properly)

        As far as TFA goes, just change the name, not that big a deal

      4. Pat Harkin

        Re: 98%

        "Probably the same people who don't understand that a cup of coffee is hot, unless it says "WARNING ! CONTENTS ARE HOT !!!!!!!!!" on the label, or that packets of nuts may contain nuts."

        Last time I was in the states, I got an ice cream milkshake in a cup which warned me "CHAUD! ATTENTION! HOT!"

        This was at the creamery in State College, Pennsylvania, where one of the rules is that you aren't allowed two flavours in the same bowl. I ordered Strawberry and Cookie Dough and got a look which would have peeled paint. The only person who has had two flavours together is Bill Clinton who did it while he was President. He went back after leaving office - and was refused!

    2. Fan of Mr. Obvious

      Re: 98%

      Sure, let's add legislation. How about a background check before you can buy a Tesla, same as before you can buy a firearm? This question should do nicely:

      What is Tesla Autopilot? A) Enables you to text and squeeze your passenger while the car drives itself, or B) Requires that you drive your car at all times.

      I am on the road (around stupid drivers) far more than I am around [stupid] people that have guns (probably around smart people with guns, but they are smart enough to not let me know), so I vote for the Tesla background checks! I would feel much better knowing that 100% fewer Teslas were likely to drive up my rear, even if it is only 2% fewer Teslas on the road.

      Whatever. Either allow the cars or don't. Sooo tired of "more" regulation that does not work.

    3. Doctor Syntax Silver badge

      Re: 98%

      "Uhm, so 2% of users do NOT understand that they are supposed to maintain control of the vehicle at all times?"

      I wonder how that compares with the percentage who think it's OK to drive singlehanded whilst holding a phone to the ear. Or zero handed whilst texting and checking emails, Facebook & tweeting "I think I just hit something".

    4. Nonymous Crowd Nerd

      Re: 98%

      My car radio has a handy feature... About 45 seconds after the engine fires up and as I'm turning out onto the main road, a screen full of smallish type appears ordering me to RTFM and not to get distracted by small things on the screen. Below the text is a small on-screen OK button. By tapping this button I agree that I will never look for things on the screen as small as this button. Perhaps I should turn onto the main road, stop the car, tap the button, then move off again. If I stall the engine, the sequence restarts from the beginning.

  2. Gideon 1

    "The inaccurate and sensationalistic view of Autopilot put forth by this group is exactly the kind of misinformation that threatens to harm consumer safety."

    The evidence suggests that Tesla's cars harm consumer safety.

    1. Adam 52 Silver badge

      Only if you're being obtuse. How many Fords crashed in the same time period?

      1. Manu T

        ... but they can't blame their own stupidity on "Autopilot".

        1. Neil Alexander

          Usually one can blame their stupidity on a different kind of autopilot. Certainly not all autopilot is of the technological variety. Just ask the morons who text whilst driving.

      2. Orv Silver badge

        I'm guessing there are a heck of a lot more Fords on the road than Teslas, so that statistic wouldn't be very meaningful.

        1. This post has been deleted by its author

          1. Terry 6 Silver badge

            Using statistics

            These figures actually make no real sense, to be blunt (except the 2%-who-just-don't-know-what-they-are-driving bit).

            To make any sensible comparison you'd need to break the figures down (or at least identify driving location and conditions), then compare the rate of accidents per km and/or per journey with the expected rate for cars of a similar performance and purpose. Currently we are just jumping up and down at every accident report. It probably owes more to confirmation bias than real information - with a gentle touch of building moral panic to keep it rolling on in the news.

        2. Adrian Midgley 1

          Normalise.

          Statistical technique used by people who don't guess how many Fords, but count them (in a sample, or from registrations of cars etc)

          If there are 1000 times as many Fords, divide the crashed Fords by 1000 to get a crude comparator.
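          A crude version of that normalisation can be sketched in a few lines of Python; the fleet sizes and crash counts below are invented for illustration, not real registration or crash figures:

```python
# Crude crash-rate normalisation: compare brands by crashes per
# registered vehicle instead of by raw crash counts.
# All figures below are invented for illustration only.

fleets = {
    "Ford":  {"registered": 10_000_000, "crashes": 50_000},
    "Tesla": {"registered": 100_000,    "crashes": 600},
}

# Crashes per registered vehicle for each brand.
rates = {brand: d["crashes"] / d["registered"] for brand, d in fleets.items()}

for brand in sorted(rates):
    print(f"{brand}: {rates[brand]:.4f} crashes per registered vehicle")
```

          Crashes per mile driven would be an even better comparator when the fleets differ in how much they are used, which is the point later comments make.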

    2. scarper

      >> The evidence suggests that Tesla's cars harm consumer safety.

      The post contains no evidence about Tesla's cars killing people more often than other brands do.

  3. I J Ellis
    Pint

    Lipstick on a Pig?

    Personally, if I'm going to spend £60k on a motor with a Tesla's performance I'd want to enjoy the experience of driving it myself. Having a 'self-driving' feature that you can't allow to actually control the vehicle is somewhat pointless, and its use in any case is very likely to encourage 'driving without due care & attention' at the very least. For those who want them, we already have fully self-driving cars, they're called taxis - they're even self-cleaning sometimes ;-)

  4. Gene Cash Silver badge

    Consumer Watchdog?

    Who? I've never heard of them before, and I've lived in the US for 50 years.

    Are these another smear group similar to the one that's currently having a go at Musk/Tesla/Solar City on titter twitter and other places?

    "The evidence suggests that Tesla's cars harm consumer safety."

    All cars harm consumer safety. I had a c*nt rear-end me a week ago, despite sitting at an in-use pedestrian crossing with 6ft flashing lights on both sides of the road. On the good side, at least I had a dozen pedestrians, a cop, and a school crossing guard willing to be a witness.

    1. ratfox

      Re: Consumer Watchdog?

      Consumer Watchdog is a recent org based in California. So far, I mostly noticed them complaining about Google, but it seems they have expanded their field of activity to Tesla.

      1. Anonymous Coward
        Anonymous Coward

        Re: Consumer Watchdog?

        Recent? Started up in 1985 actually.

    2. werdsmith Silver badge

      Re: Consumer Watchdog?

      I had a c*nt rear-end me a week ago

      If that c*nt had been driving a Tesla, or indeed my humble Nissan, the car would probably have stopped itself before he woke up.

    3. a_yank_lurker

      Re: Consumer Watchdog?

      Never heard of them. But Tesla screwed up by naming the driver assist system "Autopilot". A marketing blunder, because most are aware that aircraft routinely fly safely on an autopilot. So naming the system "Autopilot" implies it can do more than the typical driver assist system.

      1. scarper

        Re: Consumer Watchdog?

        >> A marketing blunder because most are aware that aircraft routinely fly safely on an autopilot. So naming the system "Autopilot" implies it can do more than the typical driver assist system.

        I fly an aircraft with an autopilot. It IS a typical driver assist system. What on earth makes you think otherwise ? Airlines secretly got their tech from space aliens, and haven't shared ??

      2. Anonymous Coward
        Anonymous Coward

        Re: Consumer Watchdog?

        The autopilot on a Tesla is way more advanced than that in many aircraft. Not sure any aircraft autopilot can avoid collisions with an unidentified aircraft like a glider.

        The safety of an aircraft autopilot is that there are fewer hazards up in the air than on the ground.

  5. Number6

    Darwin

    Why does this remind me of the old Darwin Awards myth about the Winnebago and cruise control?

    1. Anonymous Coward
      Anonymous Coward

      Re: Darwin

      They should call it a Darwin filter, like pollen filters, it gets rid of small annoyances.

    2. g e

      Re: Darwin

      From the same country as the consumer watchdog, to boot. Quelle coïncidence!

  6. John Robson Silver badge

    Maybe consumer groups...

    Who haven't bought a Tesla don't understand an autopilot.

    Maybe non pilots don't understand how little an autopilot does in the air...

    Maybe they need to look at human vehicles for their standard, not to some vision of the cars of the next generation(s) expecting those to materialise without first passing through this generation...

    1. Vector

      Re: Maybe consumer groups...

      "Maybe non pilots don't understand how little an autopilot does in the air..."

      So, then, perhaps non-pilots shouldn't use a system called autopilot in their cars.

      "Maybe they need to look at human vehicles for their standard, not to some vision of the cars of the next generation(s) expecting those to materialise without first passing through this generation..."

      But do we really need to test this generation on public thoroughfares?

      1. Dr. Mouse

        Re: Maybe consumer groups...

        "But do we really need to test this generation on public thoroughfares?"

        Should cruise control/lane guidance/collision warning & avoidance technology "not be tested on public thoroughfares"? These technologies are there to assist the driver. They can improve safety and the enjoyment of driving.

        All Tesla's Autopilot is is a smart cruise control system. Anyone who can't understand that (when it is in the manual, the car tells them every time they turn it on, and it was explained to them when they bought the car) is an idiot who shouldn't be on the road, and should be taken to court for driving without due care and attention.

        Personally, I believe they should change the name, just to reinforce the point. But the issue is, generally, idiots driving them (or not, as the case may be). The same idiots would end up in accidents anyway.

      2. John Robson Silver badge

        Re: Maybe consumer groups...

        "But do we really need to test this generation on public thoroughfares?"

        Yes we do - because it's only real world testing that will get us through this generation. I still don't see any evidence that the autopilot is worse than the vast majority of drivers (obviously excluding the reader who is perfect).

        If I could afford one then I would have one, and would use auto-pilot, and it would be of net benefit to my safety, and those around me (I, like most people, consider myself to be a competent driver. Unlike many motorists I also acknowledge that I can miss things and get tired/grumpy/distracted/indignant and that *none* of those are qualities I want to have when in control of a tonne of lethal weapon)

    2. John Brown (no body) Silver badge

      Re: Maybe consumer groups...

      "Maybe non pilots don't understand how little an autopilot does in the air..."

      I think that might well be it. Most people have probably heard of passenger aircraft with automatic landing systems, autopilot etc such that even in news stories we hear about aircraft which pretty much take off, fly to their destination and land almost without pilot input. To the layman, that's all one big fancy "autopilot" so the perception is that an "autopilot" does everything for you.

  7. Anonymous Coward
    Anonymous Coward

    Statistically....

    ...how many crashes should there be?

    Just on the roads by my house someone fails to drive in a straight line at least once a week. A Tesla crash world-wide, once a month seems about right.

    1. Eddy Ito
      Thumb Up

      Re: Statistically....

      In 2014 there were 1.08 fatalities per 100 million miles driven in the U.S. In October, Musk said that autopilot had 222 million miles on the clock. That's about 20M miles per month so it's probably safe to say that it's about 260 million by now and that would be 2.6 fatalities on autopilot and 33 fatalities overall given the total of 3,300 million miles logged.

      Of course, given the official launch of the Model S was 41 months ago in June 2012, the expected number would be 0.8 per month. In short, as much as I like torturing numbers, once a month is a pretty good estimate.
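      The arithmetic above can be reproduced directly. The rate and mileage figures are the ones quoted in the comment, taken at face value rather than independently verified (the exact products come out a touch above the rounded 2.6 and 33 in the post):

```python
# Expected fatalities = fatality rate per mile x miles driven.
# Figures are those quoted in the comment: the 2014 US rate of
# 1.08 fatalities per 100 million miles, plus the commenter's
# mileage estimates. None are independently verified here.

rate_per_mile = 1.08 / 100_000_000

autopilot_miles = 260_000_000    # commenter's estimate of miles on autopilot
fleet_miles = 3_300_000_000      # commenter's estimate of total fleet miles

expected_autopilot = rate_per_mile * autopilot_miles
expected_fleet = rate_per_mile * fleet_miles

print(f"expected fatalities on autopilot: {expected_autopilot:.2f}")
print(f"expected fatalities fleet-wide:   {expected_fleet:.2f}")
```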

      1. Anonymous Coward
        Anonymous Coward

        Re: Statistically....

        You're misusing statistics. Fatalities are not an even distribution over all miles driven. They are much more likely to happen in poor weather and where there is more traffic - the exact opposite of the conditions in which Tesla owners are likely to use autopilot. At least I doubt it is used nearly as much during heavy rain/snow, or on highways in dense urban areas where fatalities are the most common.

        You can't compare the overall statistics of fatalities per mile and the amount of miles driven on autopilot. That's like comparing the overall percentage of phones dropped in water with iPhone usage by lifeguards, and concluding iPhones are more likely to be dropped in water.

        Tesla needs to change the damn name; they deliberately chose it knowing it would imply it could do more than it was really capable of. Whether people are stupid for thinking that and getting into a crash is irrelevant in a country that labels those desiccant bags "do not eat" even though you'd have to be about 1000x stupider to think that's a good idea than to think a car feature called 'autopilot' doesn't drive itself.
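        The objection about uneven risk can be made concrete with a mileage-weighted rate. Every per-condition rate and mileage split below is invented purely to show the mechanism, not taken from any real dataset:

```python
# Why a fleet-wide fatality rate can't simply be applied to autopilot
# miles: if autopilot is engaged mostly in low-risk conditions, the
# relevant rate is the weighted rate of those conditions only.
# All numbers below are invented for illustration.

# Hypothetical fatalities per mile in each driving condition.
cond_rates = {"clear highway": 0.5e-8, "urban": 1.5e-8, "bad weather": 3.0e-8}

# Hypothetical share of miles in each condition (each mix sums to 1).
all_driving = {"clear highway": 0.50, "urban": 0.40, "bad weather": 0.10}
autopilot_use = {"clear highway": 0.90, "urban": 0.08, "bad weather": 0.02}

def weighted_rate(shares):
    """Mileage-weighted fatality rate for a given mix of conditions."""
    return sum(cond_rates[c] * shares[c] for c in cond_rates)

print(f"fleet-wide rate:    {weighted_rate(all_driving):.2e} per mile")
print(f"autopilot-mix rate: {weighted_rate(autopilot_use):.2e} per mile")
```

        Under this made-up mix, a mile on autopilot carries roughly half the fleet-wide risk, so judging autopilot by the fleet-wide per-mile rate would flatter it.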

        1. Shane Sturrock

          Re: Statistically....

          The name is fine as it is being used in the same way as it is with aircraft. A limited tool that can allow the pilot to pay attention to other things while the aircraft maintains altitude and airspeed. Autopilot on a Tesla maintains position in lane and speed while adding adaptive cruise control where it slows down in response to other traffic. It isn't self-driving and the term autopilot is simply being misunderstood by people to think it does mean self-driving. When Tesla can do self-driving, they'll call it autonomous driving or similar. That said, I don't care if they drop the name, because what I care about is the feature: I drive long distances regularly, and even something as simple as adaptive cruise control and lane control is a huge step up. It is also far more limited in scope than what the next generation includes, which really will be self-driving capable.

          1. Orv Silver badge

            Re: Statistically....

            Aircraft autopilots have a much less challenging environment, because they aren't generally expected to need to dodge objects in their path.

            If all Tesla cars were being monitored by traffic controllers that ensured they stayed 5 nm away from all other cars and at least 1500 feet above any terrain, then using Autopilot would be almost entirely safe, I think.

            1. Anonymous Coward
              Anonymous Coward

              Re: Statistically....

              > Aircraft autopilots have a much less challenging environment,

              For values of "less challenging" around Mach 0.92?

              > because they aren't generally expected to need to dodge objects in their path.

              That is true, because by the time you see them it's too fucking late.

              Ask me how I know.

              That said, this discussion is just silly. Why don't we move on to compare cruise ships and cruise missiles, since both have "cruise" in their names?

              1. L05ER

                Re: Statistically....

                I THOUGHT THE CRUISE CONTROL IN THIS NEW CHEVY WOULD GET ME TO BERMUDA!

                LAWSUIT! REGULATION! ANYTHING BUT EDUCATING MYSELF!

          2. Rich 11

            Re: Statistically....

            The name is fine as it is being used in the same way as it is with aircraft. A limited tool that can allow the pilot to pay attention to other things while the aircraft maintains altitude and airspeed.

            But how many people actually understand the limitation in aircraft? The term autopilot has a popular connotation, and that's going to lead to assumptions about its capabilities which aren't reflected in reality.

            1. Adam 52 Silver badge

              Re: Statistically....

              "But how many people actually understand the limitation in aircraft? The term autopilot has a popular connotation, and that's going to lead to assumptions about its capabilities which aren't reflected in reality."

              Shirley even the most dense idiot knows that Airplane was fiction and Otto isn't real.

              1. Anonymous Coward
                Anonymous Coward

                Re: Statistically....

                Otto isn't real.

                I can accept a lot of the things written in El Reg, but come on - there are limits....

          3. Doctor Syntax Silver badge

            Re: Statistically....

            "It isn't self-driving and the term autopilot is simply being misunderstood by people to think it does mean self-driving."

            That's the problem. It immediately suggests two solutions. Improve understanding or drop the name. Which is the more practical?

            1. L05ER

              Re: Statistically....

              since autopilot has never EVER meant "without any human backup."...

              i'd say the former.

          4. Kiwi
            FAIL

            Re: Statistically....

            ...tool that can allow the pilot to pay attention to other things while the aircraft maintains altitude and airspeed.

            So.. What you're saying is.... The plane basically flies itself while the pilot pays attention to other things (whether checking other important things about the state of the plane/trip, taking a leak/chatting to the co-pilot/chatting up the stewards/stewardesses...)

            I remember when I was a teen (way before all the stupidity), on my first international flight, being invited into the cockpit, and spending at least 10 minutes chatting with the crew. Little attention was paid to the instruments - cursory glances and so on. From my POV the plane was flying itself quite happily and did not need attention from the pilots. And today we have lots of stuff on TV, from idiotTV (NCIS, CSI etc etc) to "ain't modern tech wonderful" docos to you-name-it, telling us how the planes pretty much fly themselves and the pilots are really only needed for take off/landing, or if there is some serious problem the computers can't figure out.

            No wonder non-pilots get the idea of "auto pilot" wrong, and think it's something that can "pilot" the car "auto"matically.

            1. Anonymous Coward
              Anonymous Coward

              Re: Statistically....

              "while the aircraft maintains altitude and airspeed"

              Even my aging car can do the equivalent. I can put it on cruise control and on a flat road it will maintain altitude, direction and speed. I can go and sit in the rear while it does it if I want and it will still merrily comply.

              The Tesla autopilot is way more advanced than that, so maybe they should have called it "super autopilot" or autopilot+?

        2. Hairy Spod

          Re: Statistically....

          "You're misusing statistics. Fatalities are not an even distribution over all miles driven. They are much more likely to happen in poor weather and where there is more traffic -"

          Actually fatalities often drop in bad weather as drivers tend to slow down and take more care

          1. Anonymous Coward
            Anonymous Coward

            Re: Statistically....

            Not true, unless you think 16% of all vehicle miles driven in the US are in adverse weather - with 6% of all miles driven being in snow/slush/ice. While people do drive slower in such conditions, they STILL die at a higher rate despite that.

            http://ops.fhwa.dot.gov/Weather/q1_roadimpact.htm

        3. the Jim bloke

          Re: Statistically....

          @ DougS

          The purpose of statistics IS to be mis-used.

  8. Orv Silver badge

    The problem with the current "self-driving" systems is they assume that a human who isn't paying attention can suddenly be brought back into the decision loop in a split second, if the computer gets overwhelmed. Humans don't work that way. Until we have fully autonomous cars, these systems will continue to be highly dangerous.

    1. Anonymous Coward
      Anonymous Coward

      What's more likely - a purpose built computer running a defined function set getting 'overwhelmed' or a human being driver being faced with an unexpected scenario getting 'overwhelmed'?

      Personally I would say the latter - when the crap hits the fan let the computer do the work not the human, they normally screw it up. It's why cars already have ABS, brake assist, stability control because in nearly all circumstances the computer is a better bet than a human.

      If the computer screws up completely on a very infrequent basis then just press the switch marked "emergency stop" and the car can just pull over as quickly as possible. A standard car has got hundreds of things that could go wrong and cause you to crash, but we don't try to legislate that all cars should have run flat tyres or that immobilisers should be made illegal.

      It is a fear of the unknown (resistance to change), anthropomorphising the computer system and the idea that humans are better than a dumb computer that creates the backlash.

      1. Orv Silver badge

        "What's more likely - a purpose built computer running a defined function set getting 'overwhelmed' or a human being driver being faced with an unexpected scenario getting 'overwhelmed'?"

        Well, it's happened more than once in the case of Tesla's system. For example, the computer couldn't deal with a semi blocking the road, and expected a human to notice it, even though the human hadn't had to pay attention to the situation for probably an hour or so...

        I'm waiting for the first car-train accident involving Autopilot. A train of flat cars has very little side profile at night.

        1. Chris 239

          @orv re. Flat cars

          I reckon railroad flat cars would not be a problem as they are higher than many cars and so would be detected. The problem with the Tesla not detecting the semi trailer was the big (but sadly not big enough) space under it. I'd be very surprised if Tesla are not updating the system to cover this if they haven't already.

      2. Orv Silver badge

        "Personally I would say the latter - when the crap hits the fan let the computer do the work not the human, they normally screw it up. It's why cars already have ABS, brake assist, stability control because in nearly all circumstances the computer is a better bet than a human."

        I think humans are really good at some parts of driving, and really bad at others.

        We evolved to be good at distinguishing objects and determining their threat level. It was important to our survival. So we still have an edge over a computer when it comes to picking things out in difficult environments. An attentive human, for example, is probably going to notice cues that distinguish a white tractor-trailer from a bright sky. A computer, apparently, cannot.

        The other stuff you mention -- that stuff we're shitty at. Nothing in our evolution prepared us to control an object moving at 100 km/h.

        Unfortunately, another thing we're shitty at is paying attention to a situation that we've mentally ceded control over to someone or something else. And that's where the problem with almost-but-not-quite autonomous systems comes in. We're bad at being attentive when *we're* doing the driving. We're hopeless at it when a computer is doing most of it.

    2. Dr. Mouse

      'The problem with the current "self-driving" systems is they assume that a human who isn't paying attention can suddenly be brought back into the decision loop in a split second, if the computer gets overwhelmed'

      As long as you are talking about Tesla's Autopilot (a reasonable assumption given that's what this article was about), it is your attitude which is wrong.

      That is, the "human who isn't paying attention" should be paying attention!

      It is an intelligent cruise control system, not a self-driving car. The driver should always be concentrating and be ready to take control back. If they aren't, they are driving without due care and attention.

      1. Orv Silver badge

        "It is an intelligent cruise control system, not a self-driving car. The driver should always be concentrating and be ready to take control back. If they aren't, they are driving without due care and attention.:"

        Well, yes. But the problem with that - and this has also been shown to be a problem with pilots - is that when you take over the human's job for a while, when they no longer have to provide inputs, their attention wanders. Even if they don't start physically doing something else, their mind is no longer on the task at hand, because it's being done for them. It takes tens of seconds to come back from that.

        In airplanes they attempt to deal with this with training, and it also helps that things usually unfold pretty slowly at 30,000 feet. There's very little to hit up there, and very few situations require immediate action. Still, there are several accidents where the autopilot decoupled due to a problem with the airplane, and pilots acted inappropriately due to not being in the loop about what was happening.

  9. Terry 6 Silver badge

    Victims

    Some of the commentards above seem to have forgotten that when a Tesla hits another vehicle it might equally be the other car's unfortunate occupants who get killed. So that 2% who do think that they can abandon responsibility for driving their cars are genuinely scary if let loose.


    1. Anonymous Coward
      Anonymous Coward

      Re: Victims

      > So that 2% who do think that they can abandon responsibility

      You, and all the other armchair experts who saw it fit to comment before, are trying to prove the alternative hypothesis instead of rejecting the null hypothesis, and also fail to understand basic concepts such as power, significance, and confidence level.

      Without knowing the exact question(s) in the survey and the possible answers (plus sample size, methodology, etc.), there are no inferences at all that you can reliably make from the percentages in the answers.

      For all you know, those in the 2% did not answer at all, misinterpreted the question, misinterpreted the answer, did not give a serious answer, did not think the question or answers as posed made sense, etc., etc. Same goes for those in the 98% group, by the way.
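      The sample-size point can be illustrated with a normal-approximation (Wald) confidence interval for a survey proportion. The sample size of 400 below is an invented placeholder, since (as the comment says) the survey's methodology was not published:

```python
import math

# 95% normal-approximation (Wald) confidence interval for a survey
# proportion p_hat observed in a sample of size n. The n = 400 used
# below is invented; the real survey's sample size is unknown.

def proportion_ci(p_hat, n, z=1.96):
    """Return the (low, high) Wald interval, clipped to [0, 1]."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return (max(0.0, p_hat - z * se), min(1.0, p_hat + z * se))

low, high = proportion_ci(0.02, 400)
print(f"2% of 400 respondents -> 95% CI: {low:.1%} to {high:.1%}")
```

      With a proportion this close to zero the Wald interval is also known to undercover; a Wilson or exact interval would be the more careful choice, which only strengthens the comment's caution about reading much into the 2%.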

  10. L05ER
    Facepalm

    don quixote

    gotta stay relevant somehow... i guess.

  11. JeffyPoooh
    Pint

    Fundamental Rule...

    With safety critical systems (in the specifying, designing, and marketing of same) one's 'Confidence / Competence Ratio' is absolutely critical.

    As soon as this ratio exceeds unity, people get hurt.

    Tesla's (Musk's) 'Confidence / Competence Ratio' for AutoPilot was seemingly about 1.25 (my estimate, based on the evidence before us...).

    Note that the absolute value of Competence isn't the primary factor for safety (big mistake to believe it is). Even mediocre organizations can achieve perfect safety, if they're sufficiently paranoid and cautious. In other words, they adjust their self-confidence down to keep the ratio well below one.

    For safety, it's this 'Confidence / Competence Ratio' that is absolutely key.

    Hardly anybody knows this. But it's subtly built into many QA processes.

  12. Alienrat

    Go and read the Tesla website, it clearly says "All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver".

    And there is a video clearly showing that it doesn't need any input from the driver and it is fully self driving, so it seems a bit odd to criticize owners for assuming that it doesn't need any input from the driver and is fully self driving.

    And yes, maybe they didn't RTFM, but I can't recall ever RTFM of a car either apart from finding out what that mysterious unique switch on the dashboard does.

    1. John Robson Silver badge

      That hardware is there - the software isn't yet. So they don't do self driving yet - they do autopilot. Which is different.

      You can tell because the words aren't the same.

      1. Kiwi

        You can tell because the words aren't the same.

        Yes yes.. People should know. I mean, automatic gearboxes on cars have never been known to change gears for them. Automatic washing machines don't ever go through the full wash cycle on their own. Automatically defrosting fridges need to be de-iced every week. Monitors that automatically turn off need you to switch the power off after you turn your computer on, and need you to turn the power on for those that automatically turn themselves on. Automatic dishwashers don't wash and dry the dishes for you...

        Strange how people think the word/prefix "auto" means "it does it by itself without (further) instruction from you".

        1. John Robson Silver badge

          "Strange how people think the word/prefix "auto" means "it does it by itself without (further) instruction from you"."

          My auto gearbox doesn't change gears on its own - it depends on me applying more throttle, or the brakes.

          The automatic washing machine doesn't load itself, not even with soap/conditioner.

          How many people 'pilot' their cars? I don't know anyone who would use that word. Now if the Tesla starts flying then that might be a different area of confusion - but the Model F was, unfortunately, announced on the first day of the fourth month...

  13. M7S
    Coat

    All this talk of autopilot, and comparing it to aircraft functions of the same name

    It's just people droning on.

  14. Kartheg

    Misleading article

    The call is to remove the word 'Autopilot' from Tesla's website and marketing claims, and to call it something else, like Auto Steer. It is not a call for the removal of Autopilot itself.

    1. Orv Silver badge

      Re: Misleading article

      Auto Steer is better. "Cruise Control With Lane Assist" is too much of a mouthful but probably most accurate.

  15. Keefwivanef

    WHOMPY WHEELS

    The driver was very lucky not to be decapitated.

    Not only did Automatic Emergency Braking fail as usual on the Tesla but didn't anybody notice that the WHEEL HAS GONE WHOMPY?

    I did.

    I've reported it.

    NHTSA

    Complaint Number: 10928676

    Vehicle Identification Number: unknown

    Your Vehicle's Make Model and Model Year: TESLA MODEL S 2016

    Note: Your VIN, make, model, and year are all protected under the Privacy Act.

    What part of your car was affected? Brakes, Steering, Suspension

    What happened?

    Here is a report of a crash in Germany. link /lokales/bruchsal/tesla-schlaegt-in-sperrwand-ein1 The front tire is peeled off the rim. That could only have happened before the collision and is almost certainly caused by failure of the suspension. It would be almost impossible for the wheel to have turned a full revolution and peeled off the tire after the collision. The driver was seriously injured. Please take immediate action.

    Files you uploaded.

    tesla-model-s-nl-crash1-e1480454411971.jpg

    When did this happen? 11/28/2016

    Was there a Crash? Yes

    Was there a Police Report? Yes

    Were Vehicles Towed? Yes

    Was there a Fire? No

    Was there an injury or fatality? Yes

    Number of fatalities:

    Number of injuries 1

    Medical Attention Required? Yes

    How fast were you going? (in mph) 60

  16. Lee Zwager

    Not that many idiots driving Teslas

    They must be monied and looking to the future - so I doubt they are f*cktards.

    Could it be that insurance companies and their legal representation are just over using a scapegoat?
