Tesla's big news today:
sudo killall -9 Autopilot

Behind the smokescreen of its new onboard hardware announcement, Tesla is quietly killing off its controversial Autopilot feature in its new cars. The money shot is buried in this announcement emitted today titled All Tesla Cars Being Produced Now Have Full Self-Driving Hardware. The blog post is a masterpiece of misdirection …

  1. Mikel

    Seems prudent

    The autopilot is a big feature. But the primary draw for most Tesla drivers isn't autopilot. It's one of: electric car, ridiculous acceleration, Tesla brand.

    I hope they work it out. Either way, it's not like they're going to run out of buyers.

    1. bazza Silver badge

      Re: Seems prudent

      Hmmm, judging by some of the opprobrium flying around Tesla forums over Autopilot's imminent curtailment, it would seem that it (rather than the electric drivetrain, the ridiculous acceleration, or the Tesla brand) is the key reason why many owners have bought them.

      It would be kinda cool if it does work out; who wouldn't want a car to drive them home from the pub? But,

      "All Tesla Cars Being Produced Now Have Full Self-Driving Hardware."

      would be false advertising (if used as such). They cannot demonstrate that this is true. It may be something that they can get away with in the US, but in (for example) the UK and a lot of Europe there'd surely be ASA and Trading Standards complaints about such a claim.

      If I were an investor in Tesla I'd be worried about this. It's so unlikely that they will ever deliver a self-driving car. They're hyping up the idea now, but it is very likely to result in future disappointment amongst hitherto loyal customers who need to be persuaded to buy a new replacement. They're also increasing their production costs today with zero guarantee that it'll ever be worthwhile.

      But by then there will likely be other competitors in the market (e.g. BMW, Merc, VW, the Japanese manufacturers, etc), and they will be supremely competitive, and they know how to build a higher quality interior.

      They also have the industrial capacity to build an electric car for every market segment, something Tesla are understandably avoiding doing. But that is also dangerous - Toyota have for decades shown that the real money to be made is from well-priced, boring, reliable, comfortable and very well-made boxes; no-one earns much money from making only rocket ships. You have to go mass market at some point.

      Tesla have only a short while to cement their position as the manufacturer of the longest range electric car with the fastest charging time. If they ever fall behind on that then they're toast. Hyping self driving today is a costly distraction from that goal, especially if (as seems likely) everyone else gives up on the idea and public perception of self-driving becomes forever tarnished.

      1. Dan 55 Silver badge

        Re: Seems prudent

        Given they need to train up the self-driving systems now with big data, they might come unstuck a couple of years down the line and find that they do actually need more hardware. Legal requirements or something they haven't thought of now (e.g. rear radar).

        1. Alan Brown Silver badge

          Re: Seems prudent

          "Legal requirements or something they haven't thought of now "

          CAN bus and vehicle Ethernet mean that as long as you have spare ports and CPU, you can add more tech as needed (including more CPU if needed).
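
          (For illustration only: a minimal sketch of listening for frames from a hypothetical bolt-on sensor node over the CAN bus, assuming a Linux host with SocketCAN and the third-party python-can package; the 0x4A2 arbitration ID and the data layout are made up.)

          ```python
          # Minimal sketch: watch the bus for frames from a hypothetical add-on node.
          # Assumes SocketCAN ('can0') and the python-can package; ID/layout are made up.
          import can

          REAR_RADAR_ID = 0x4A2  # hypothetical arbitration ID for a bolt-on rear radar

          bus = can.interface.Bus(channel="can0", bustype="socketcan")
          try:
              while True:
                  msg = bus.recv(timeout=1.0)            # wait up to 1 s for a frame
                  if msg is not None and msg.arbitration_id == REAR_RADAR_ID:
                      range_cm = int.from_bytes(msg.data[:2], "big")  # illustrative decoding
                      print(f"rear radar range: {range_cm} cm")
          finally:
              bus.shutdown()
          ```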

          I hope those cameras are recording at a higher rate and feeding down into the logging system faster than the existing ones do. That would make crash forensics a lot easier even in clotheslining incidents.

      2. Adam 52 Silver badge

        Re: Seems prudent

        " it would seem that it ... is the key reason why many owners have bought them"

        Are they complaining that it doesn't drive itself, or that the assist features will be going temporarily (in a car that they haven't got)?

        The first group Tesla is better off without; they'll end up hurting the brand.

        The second group I feel a bit sorry for but they've still got their cars.

        It's the owners to be I feel sorry for, losing out because the lawyers got scared.

        1. Pascal Monett Silver badge

          Re: "losing out because the lawyers got scared"

          Well, Death does have a tendency to be scary.

          1. M7S

            Re: "Death does have a tendency to be scary"

            WHY? I'M ONLY HERE TO STOP THE PLACE FILLING UP. NOLI TIMERE MESSOREM.

            Squeak

            1. This post has been deleted by its author

      3. JetSetJim
        Facepalm

        Re: Seems prudent

        I don't think the mainstream press write-ups help things...

        http://www.bbc.co.uk/news/technology-37711489

        The BBC article leads with "Tesla to make all new cars self-driving"

        1. I ain't Spartacus Gold badge

          Re: Seems prudent

          The BBC article leads with "Tesla to make all new cars self-driving"

          Isn't that Tesla's fault though, for the way they worded the press release?

          They're trying to have their cake and eat it. Which in this case is stupid.

          Particularly as self-driving cars aren't going to be legal for the general public for many years to come. Sure there's some limited testing going on, and the tech is likely to come out safer than human drivers, but society tends to be rather conservative about these things. So I can't see it being legal to buy a self-driving car for at least a decade - and probably considerably more. By which time you're likely to have replaced any new Tesla you've bought this year.

          Plus, what if government mandates lidar instead of radar?

      4. Mikel

        Re: Seems prudent

        @bazza

        "Most do" doesn't preclude "many don't" in populations >4.

        They have no hope of scaling production fast enough to meet even half of the demand. Impatient cusses with their hearts set on autopilot NOW and nothing else will just have to buy from one of the many other self-driving car companies. It's not like Tesla needs their business.

    2. Syntax Error

      Re: Seems prudent

      In other words, an expensive Scalextric toy.

    3. macjules

      Re: Seems prudent

      Apparently the biggest cause of accidents with Tesla drivers is when they drive it out of the showroom, not realising what the acceleration is like.

      Sadly my Model S is now too old (end 2014) to get the autopilot. I shall just continue to be my family's "autopilot" instead. Not too sure about why they set the video to the Stones' Paint It Black though.

      1. Eddy Ito

        Re: Seems prudent

        I think the song was to express that this is a dark time as the car learns to drive and that it is doing so in California. It's clearly picked up the typical bad habits seen on Californian roadways.

  2. JeffyPoooh
    Pint

    Have they turned OFF the existing systems yet?

    I guess they're learning the lesson that's been learned over and over and over again since Minsky's time.

    "A.I. is hard." ...Especially in the real world.

    ("hard" is a comedically-vast understatement. It means 'nearly impossible'.)

    1. Anonymous Coward
      Anonymous Coward

      Re: Have they turned OFF the existing systems yet?

      "("hard" is a comedically-vast understatement. It means 'nearly impossible'.)"

      It also depends on how you define AI. Computers that could read moving car license plates would have been considered AI only 20 years ago - now they're everywhere. And there are loads of other examples of tech that was once considered AI and is now commonplace. The definition of AI is a moving goalpost; it's always what we haven't quite achieved yet. Once self-driving cars are on the roads it'll no longer be seen as AI.

      1. JeffyPoooh
        Pint

        Re: Have they turned OFF the existing systems yet?

        "...depends on how you define AI. ...read moving car license plates..."

        Solving the 'moving' part involves an old technology called a shutter (camera shutter). They take a picture, or they take dozens. The plate doesn't have to be 'moving' in the picture.

        OCR of a plate isn't life or death. If it was, then the gap between 100% and 99.9% would be on the news every evening, with tens of thousands killed daily.

        The algorithm and OCR neural network used to read plates is infinitely tighter in scope than 'driving'. Driving is open ended and literally infinitely complicated.

        Optimism in the A.I. field gets punished.

        'A.I. is hard.' Especially in the real world.

        1. Anonymous Coward
          Anonymous Coward

          Re: Have they turned OFF the existing systems yet?

          "Solving the 'moving' part involves an old technology called a shutter (camera shutter). They take picture, or they take dozens. The plate doesn't have to be 'moving' in the picture."

          Shutter? They use video cameras, not SLRs, FFS. And I don't think you understand quite how much complex mathematics goes into first finding the plates themselves in the general jumble of each image, then finding and decoding the characters, which will almost certainly be at various x,y angles relative to the camera and hence skewed, so simple template matching won't work - the skew angles need to be calculated first, then everything needs to be normalised before any recognition actually happens. Trust me, this isn't the sort of thing you could have done in real time on even a top-end PC 20 years ago.
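
          (For anyone curious what that pipeline looks like in outline, here is a very rough sketch, assuming opencv-python and the pytesseract wrapper; real ANPR systems are considerably more sophisticated, and corner ordering and lighting are glossed over here.)

          ```python
          # Rough sketch of the plate pipeline described above: find a plate-shaped
          # region, de-skew it, then OCR it. Assumes opencv-python and pytesseract.
          import cv2
          import numpy as np
          import pytesseract

          def read_plate(frame):
              gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
              edges = cv2.Canny(gray, 100, 200)
              contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

              for cnt in sorted(contours, key=cv2.contourArea, reverse=True):
                  rect = cv2.minAreaRect(cnt)            # rotated (skewed) bounding box
                  (w, h) = rect[1]
                  if min(w, h) < 10:
                      continue
                  aspect = max(w, h) / min(w, h)
                  if not 2.0 < aspect < 6.0:             # roughly number-plate shaped?
                      continue
                  # Normalise: warp the skewed box to an axis-aligned 200x50 patch
                  # (corner ordering is glossed over in this sketch).
                  src = cv2.boxPoints(rect).astype(np.float32)
                  dst = np.float32([[0, 0], [199, 0], [199, 49], [0, 49]])
                  m = cv2.getPerspectiveTransform(src, dst)
                  patch = cv2.warpPerspective(gray, m, (200, 50))
                  text = pytesseract.image_to_string(patch, config="--psm 7").strip()
                  if text:
                      return text
              return None
          ```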

          "OCR of a plate isn't life or death. If it was, then the gap between 100% and 99.9% would be on the news every evening, with tens of thousands killed daily."

          I never said it was. Voice recognition isn't life or death, but it's still a classic AI problem.

          "The algorithm and OCR neural network is read plates is infinitely tighter scope than 'driving'. Driving is open ended and literally infinitely complicated."

          Older recognition systems didn't use neural nets; not sure about the state of the art now. And yes, driving is more complex, but moving-plate recognition would still have been considered AI back in the day.

          1. JeffyPoooh
            Pint

            Re: Have they turned OFF the existing systems yet?

            boltar offered "...yes driving is more complex..."

            Yeah, that ---^ . Many orders of magnitude more complex.

            PS. 'shutter' can be read as 'frame grab'. It was to make a complex point in a compact manner, rebutting the 'moving' red herring. My video camera offered 'slow shutter mode', even if it was in a very detailed technical sense virtual. Again, the 'moving' was a red herring. The 'shutter' was a one-word rebuttal demolishing the red herring 'moving'. It wasn't intended to go off into a huge debate about shutters. Anyone dealing with captcha images of house numbers knows that, yes, while the Google car is moving, the resultant captcha image isn't. We could spend weeks debating the red herring 'moving'...

      2. Stevie

        Re: Have they turned OFF the existing systems yet? 4 boltar

        "Computers that could read moving car license plates would have been considered AI only 20 years ago"

        Only by marketers.

  3. JeffyPoooh
    Pint

    "...and a more powerful computer."

    "...eight cameras, 12 ultrasound detectors, a front-facing radar, and a more powerful computer."

    Don't forget the microphone to hear the sirens of the emergency vehicles giving chase, and to hear the unexpected blood-curdling screams of the mostly fallen-out passenger being dragged by his leg for the past 45 minutes.

    Don't forget to program the rear facing cameras to look for columns of smoke inexplicably always following behind, strangely coincident with sequential loss of various sensors, one after the other. Don't forget to include a software thread named 'Am_I_On_Fire?', because recognizing that one is on fire is a characteristic of intelligence. Not recognizing it is Artificial Stupidity.

    Another software thread called 'Where's_My_Roof?', just in case history repeats itself. Some aircraft have sensors called Frangible Switches to detect crashes, to trigger emergency systems. Do 'A.I.' cars realize when an unexpected crash has occurred? Are loud banging noises even noticed? Sideswipes even registered?

    Software thread 'Why_Is_My_Tire_Wobbling?' Don't forget vibration sensors, in the chassis and the steering system.

    A.I. in the real world requires I/O. Lots of I/O.
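
    (Tongue in cheek, here's roughly what those watchdog threads might look like; every sensor read and threshold below is made up for illustration.)

    ```python
    # Satirical sketch of the proposed watchdog threads. The sensor reads are
    # stubbed out; names and thresholds are invented for illustration only.
    import threading
    import time

    def watchdog(name, read_sensor, limit, period_s=0.1):
        """Poll a sensor and complain loudly if it exceeds its limit."""
        while True:
            if read_sensor() > limit:
                print(f"[{name}] limit exceeded - do something!")
            time.sleep(period_s)

    sensors = [
        ("Am_I_On_Fire?",            lambda: 21.0, 120.0),  # cabin temperature, deg C
        ("Where's_My_Roof?",         lambda: 0.0,  5.0),    # roof strain gauge
        ("Why_Is_My_Tire_Wobbling?", lambda: 0.2,  1.5),    # chassis vibration, g
    ]
    for name, fn, limit in sensors:
        threading.Thread(target=watchdog, args=(name, fn, limit), daemon=True).start()

    time.sleep(2)  # let the watchdogs run briefly in this demo
    ```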

    By the time you're done, the final computer will be 100x more powerful than this "more powerful" version.

    This is all so predictable.

    1. Anonymous Coward
      Anonymous Coward

      Re: "...and a more powerful computer."

      Your examples are all part of a car driving itself with no passengers at all, which is probably way further off than a self-driving car that is carrying passengers who could themselves look out for the roof coming off and hit an emergency button.

      A modern car today with relatively little computing power has sensors all over the place checking your emissions, tire/tyre pressures, temperatures, pollen ingress, distance from other cars, current speed vs revs, DPF state, multiple temperature sensors. Generally a modern car can tell if something significant is wrong before you do.

      There are currently driverless trains which have to have a certain number of sensors but don't have a computer that is amazingly powerful. Your house could catch fire, have an aircraft hit it, have a gas leak, have the electrics cause a potentially fatal injury, have a balcony fail etc. However we don't need to have sensors all over the house - just a smoke alarm and some failsafe breaker switches generally suffice. In fact more sensors (IoT) are generally frowned upon.

      All the (albeit understandable) scaremongering comes from the idea of an out-of-control or runaway car. However that happens with vehicles today, it's just so rare that it is dismissed.

      1. Anonymous Coward
        Anonymous Coward

        Re: "...and a more powerful computer."

        slightly worried about the personal and intrusive nature of some of these: "sensors all over the place checking your emissions"

        1. allthecoolshortnamesweretaken

          Re: "...and a more powerful computer."

          " slightly worried about the personal and intrusive nature of some of these: "sensor all over the place checking your emissions" "

          Okay, if my car scolds me for farting it might be a real AI.

      2. Anonymous Coward
        Anonymous Coward

        Re: "...and a more powerful computer."

        "...that happens with vehicles today, it's just so rare that it is dismissed."

        One of the lessons learned by young engineers is that basic safety expectations and requirements have changed. There are endless examples of systems that exist today which would never be permitted if they had just been invented.

        Another example is an avionics box on an airplane. It has three known defects: one serious defect that is easy to fix quickly, and two other trivial defects that'll take three years to fix. The airworthiness regulators that I've dealt with will not permit the quick fix until the other two are also fixed. This effectively delays fixing the more serious defect by years. Which is demonstrably stupid. They're maximizing the area under the risk-versus-time curve.

        Self-driving cars. It's 2016. They'll not be permitted if they're merely as safe as humans, or only an incremental improvement. US regulators have already stated that they need to be twice as safe.

        Plus, they need to avoid all daft accidents. Which is a tall order.

        1. Adrian 4

          Re: "...and a more powerful computer."

          "There are endless examples of systems that exist today which would never be permitted if they had just been invented."

          I was thinking that whilst standing on the tube today. I suspect we'd have to sit if there weren't years of perfectly reasonable usage with passengers standing.

          I wonder if human car drivers would be allowed if we didn't already have them ?

    2. Neil Alexander

      Re: "...and a more powerful computer."

      "Do 'A.I.' cars realized when an unexpected crash has occurred?"

      Non-AI cars know when they've been crashed. How do you think airbags are deployed?

      Also see Volvo pedestrian airbags, which deploy even if a human is hit without the front-end being damaged or crumpled.

      1. JeffyPoooh
        Pint

        Re: "...and a more powerful computer."

        NA "Non-AI cars know when they've been crashed."

        The infamous Tesla 'Truck Not Sky' accident seemed to hint that the car was unaware of the crash even after the crash. The car continued on, as opposed to slamming on the brakes.

        ...Based on reports.

        Main small point: A.I. for self-driving cars will need microphones. Anyone believing that an A.I. can be aware of its surroundings without hearing sirens, banging noises and screaming people is not thinking clearly.

      2. Anonymous Coward
        Anonymous Coward

        Re: "...and a more powerful computer."

        If a car could detect unexpected crashes I'd have it monitoring Windows Servers while I drive. I wish they'd hurry up and write a plugin to put a space in the reason box and click OK though.

        I hate taking my hands off the wheel on the motorway. Frightens the passengers.

    3. Indolent Wretch

      Re: "...and a more powerful computer."

      Your absolutely right I'm also pretty sure it's been scientifically proven that a human being travelling over 30 miles an hour always suffocates. This progress must be stopped!

      1. Uffish

        Re: "This progress must be ..."

        We all want progress (I think) but please, don't test out your algorithms on the roads I drive on. Where I drive the slow, ultra-safe Google cars would be a public nuisance and the Tesla cars wouldn't have the needed reactivity.

        1. Alan_Peery

          Re: "This progress must be ..."

          That's just NIMBY in a different place. It has to be tested *somewhere* even for the difficult cases, so why should your roads be special?

          There is also no indication that Teslas running under autopilot features are more dangerous than other cars under human piloting.

          1. JeffyPoooh
            Pint

            Re: "This progress must be ..."

            AP offered up a Trump-like Post-Truth factoid "There is also no indication that Teslas running under autopilot features are more dangerous than other cars under human piloting."

            Tesla Autopilot 1.0 couldn't distinguish a truck from sky. That's unacceptable. Full stop.

            The basis of comparison isn't the fleet of legacy cars. The proper basis of comparison is a modern car with all the latest safety features (including the latest crash avoidance features).

            Nobody is going to permit the basis of comparison to be the legacy fleet. It doesn't work that way for pollution standards, other safety requirements, crash standards, CAFE fuel standards, or anything else. Musk knows this. He's BS'ing you with those marketing claims.

            The question is: do we allow the extra step of allowing the driver to stop driving, pop a movie into his portable DVD player, and let the car do the driving? Will that be safer? We're a long way from that.

            A.I. is hard. Especially in the real world.

      2. JeffyPoooh
        Pint

        Re: "...and a more powerful computer."

        IW "Your absolutely right..."

        You're --^

        Progress is good.

        Over-optimistic naivety resulting in Tesla Autopilot 1.0 not seeing a truck is not good.

        These lessons about how "A.I. is hard" go back to the dawn of A.I. with Minsky.

        Anyone that goes into such a project without knowing this is, frankly, an uneducated moron.

  4. hypernovasoftware

    Gee, I wonder if the buyers will be compensated after purchasing a vehicle with non-working features?

    1. Anonymous Coward
      Anonymous Coward

      They sure will! In 5 years time, with a $20 off your next Tesla voucher, if they can provide proof of purchase and email evidence indicating they were planning to rely on the Autopilot feature. And once the lawyers have been paid off, naturally.

      1. King Jack
        Go

        Sony is that you?

        No need to hide behind a mask; we all remember Sony and the PS3 feature removal. Tesla look to be taking lessons.

    2. Anonymous Coward
      Anonymous Coward

      @hyper

      I'm not sure of course, but considering Tesla's eagerness to sue television shows which shared negative opinions about their cars, I wouldn't be surprised at all if they got this legally covered as well. Something tells me they have to; I don't think they can afford to get sued over a malfunctioning autopilot, for example...

      1. 404

        Re: @hyper

        Check the fine print - it'll be there. Bet.

        A 'you own the car but lease the software' kind of thing - Windows 10 without the Media Center you once purchased. You know ;)

    3. JeffyPoooh
      Pint

      "...purchasing a vehicle with non-working features?"

      Puh, it's just a wee bit of software. A.I. software. Software projects are easy, especially those involving A.I. and neurotic networks.

      Seeing as how it needs to be planned, documented and developed to some process resembling DO-178B or C, DAL perhaps A or B, so maybe ten lines of code a day per qualified coder drone, typically.

      So 250GB of tight code, at 500 bytes per Coder Drone day, it should be ready for beta release just in time to get tangled up in the Y10K problem.
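
      (Back-of-envelope check of that joke estimate, using only the numbers in the comment; the team size is an arbitrary assumption.)

      ```python
      # Sanity-check the joke arithmetic above: 250 GB of code at 500 bytes per
      # coder-day. The 1,000-strong team is an arbitrary assumption.
      codebase_bytes = 250e9
      bytes_per_day  = 500

      coder_days     = codebase_bytes / bytes_per_day   # 5.0e8 coder-days
      coder_years    = coder_days / 365                 # ~1.37 million coder-years
      team_size      = 1_000
      calendar_years = coder_years / team_size          # ~1,370 years even with 1,000 coders

      print(f"{coder_days:.1e} coder-days ~= {coder_years:,.0f} coder-years "
            f"~= {calendar_years:,.0f} calendar years for a team of {team_size}")
      ```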

      Oh, and you'll need an EVEN MORE (cubed) powerful computer.

      1. bazza Silver badge

        Re: "...purchasing a vehicle with non-working features?"

        @JeffyPoooh,

        "Seeing as how it needs to be planned, documented and developed to some process resembling DO-178B or C, DAL perhaps A or B,"

        Quite.

        The automotive industry has for years been "avoiding" this issue. They use the MISRA rule set for C programming as a means to justify claims that their software is "safe". The problem is that i) MISRA is more like smoke and magic than hard proof of correctness, ii) MISRA tool chains that I've used are perfectly capable of compiling correct source code to junk object code that doesn't implement the source code (it was optimisation bugs), and iii) there's no guarantee that their C libraries are themselves MISRA compliant. In fact one I've used most definitely was not MISRA compliant in its C library's source code, and the C library was buggy. Yet it had a tick box labelled 'MISRA', and was and still is widely and highly regarded throughout the community.

        Of course none of that has mattered, because in all cars actual safety has been provided by everything ultimately being mechanical or hydraulic, with software not taking a primary role in car control.

        But with things like self-driving? Yep, the applicable standards have indeed got to be things like DO-178B, etc.

        "so maybe ten lines of code a day per qualified coder drone, typically. So 250GB of tight code, at 500 bytes per Coder Drone day, it should be ready for beta release just in time to get tangled up in the Y10K problem."

        Neat way of providing investment guidance!

        An industry rule of thumb I picked up some time ago was more like one single line of code per coder per day across an entire software project of this type. After the design and specification were done the PMs would estimate the size of a project and do their cost estimation that way. And that was on systems that had to be correct but were still human supervised. I dread to think how slowly development would go for a truly safety-critical piece of software such as a self-driving car.

        Of course the self driving guys know this. So they're spinning up arguments in favour of rapidly developed code being approved as safe from usage statistics to grandfather their systems into autonomous use. Kinda like "it's not gone wrong yet in our trials, so therefore it must be OK for all eternity". Accepting code in this way would be unprecedented in the history of safety critical systems and transportation. There'd also be the potential for a systemic and hitherto unidentified fault causing mass carnage and the world's most expensive law suit.

        Personally speaking I find the industry's statistical argument for what a "safe" self driving car would be somewhat distasteful and implausible. Saying that it's as safe as the "average driver" is nuts; it'd mean that many passengers would statistically speaking be worse off. Terrific. The trouble is the people who will decide what's allowed or made compulsory aren't used to thinking 'personal'; they look at nationwide or insurance statistics, and see profit in reduced costs.

        Fortunately the State of California has published Google's test results, and they don't make for encouraging reading from Google or any other self driver's point of view. Google's data, if squinted at only slightly, implied an accident every 1500-ish miles had their cars been fully autonomous and unsupervised. Not a very good statistic in favour of approving full autonomy.

  5. frank ly

    Plastic functionality

    "... we will enable them over-the-air, together with a rapidly expanding set of entirely new features."

    Can they also be disabled over-the-air? I wonder if Tesla have protections against MITM and other attacks on their communication security/validation.

    I assume that Tesla owners will get an e-mail (or something) to tell them in advance that their car will have new features. Will they be able to opt out?
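
    (On the MITM point above: the usual defence against tampered over-the-air updates is a signed payload verified against a key baked into the vehicle. A minimal sketch follows, assuming the third-party 'cryptography' package and an Ed25519 key; this is a generic illustration, not a description of Tesla's actual scheme.)

    ```python
    # Generic sketch of signed OTA update verification (not Tesla's real scheme).
    # Assumes the 'cryptography' package; a throwaway keypair stands in for the
    # vendor's real signing key.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    vendor_priv = Ed25519PrivateKey.generate()   # held by the manufacturer
    vendor_pub  = vendor_priv.public_key()       # baked into the car's firmware

    def install_if_genuine(payload: bytes, signature: bytes) -> bool:
        """Only accept an update whose signature checks out against the vendor key."""
        try:
            vendor_pub.verify(signature, payload)    # raises InvalidSignature on mismatch
        except InvalidSignature:
            return False
        # ...flash the firmware here...
        return True

    update = b"firmware-v8.1-blob"
    good_sig = vendor_priv.sign(update)
    print(install_if_genuine(update, good_sig))                 # True
    print(install_if_genuine(update + b"tampered", good_sig))   # False
    # Signatures alone don't stop an attacker blocking or replaying updates, so
    # real systems also pin TLS certificates and check version/rollback counters.
    ```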

    1. JeffyPoooh
      Pint

      Re: Plastic functionality

      "...also be disabled over-the-air?"

      Wait until the NHTSA has finished its 'Truck, Not Sky' investigation.

      They'll likely require Autopilot 1.0 be disabled, or its use massively restricted.

  6. Donchik

    Hardly surprising!

    With Ford, GM etc. sinking billions into this money pit, it's not a surprise that Tesla's little operation has not been able to crack this.

    The reason the big players are not already in the marketplace appears obvious: with current technology it's not safe enough for them or their legally equipped customers.

    Tesla have been gambling with their customers' lives, and with those of any other road user or pedestrian unlucky enough to run into them (sic).

  7. Anonymous Coward
    Anonymous Coward

    Switched Off?

    I don't see anywhere in the article that says the existing auto-pilot will be switched off, only that cars with the new hardware will not have it switched on to start with.

    So my reading says that while the new hardware is learning, the old hardware and existing cars will continue to function with the existing technology.

    Someone please point out to me what I am missing here.

    1. diodesign (Written by Reg staff) Silver badge

      Re: Switched Off?

      "only that cars with the new hardware will not have it switched on"

      Correct.

      C.

    2. I ain't Spartacus Gold badge

      Re: Switched Off?

      You're not missing anything. The current users of "Autopilot" are the beta testers for the new users.

      Think of them like royal food tasters. They get to eat a lot of very nice food not otherwise available to men of their class, it's just that every so often one of them drops dead.

      1. Alan_Peery

        Re: Switched Off?

        Rather the opposite of that. "It's just that every so often one of the meals gives them superpowers." Superpowers like automatic braking that avoids an otherwise-occurring collision.

    3. JeffyPoooh
      Pint

      Re: Switched Off?

      NHTSA hasn't yet finished its 'Truck, Not Sky' investigation.

      My opinion is that the primitive Autopilot 1.0 will be either disabled by order, or retrofitted *and* tightly restricted by order.

      It'd be shocking if they allowed the status quo to stand.

  8. WereWoof

    "...including some standard safety features such as automatic emergency BREAKING . ." Sighs
