'We think autonomous coding is a very real thing' – GitHub CEO imagines a future without programmers

At Pier 70 in San Francisco, California, on Wednesday, where ships once were built, code-hosting biz GitHub held forth on building and shipping code, and on the people who do so. Following a manic Looney Tunes-inspired cartoon intro, Chris Wanstrath, cofounder and departing chief exec, opened the keynote presentation by …

  1. Gene Cash Silver badge

    So can we write an AI that writes code by Googling Stack Overflow? It can't do worse than Adobe...

    1. Anonymous Coward
      Anonymous Coward

      That's what "libraries" are today: scraps of code from Stack Overflow, and maybe a one-person project on GitHub that hasn't been updated in the last two years...

      1. Anonymous Coward
        Anonymous Coward

        Don't forget the bit where all the forum posts are "I have this problem too..." - cheers for that.

  2. Anonymous Coward
    Anonymous Coward

    In the future there will be no jobs whatsoever.

    1. Amorous Cowherder
      Boffin

      In the future the government wants "1984", we want "Utopia" but what we'll get is "MegaCity-One".

    2. rmullen0

      In the future there won't be humans so it won't matter

      Global warming is going to fix the problem.

      1. Mr. Moose
        Mushroom

        Re: In the future there won't be humans so it won't matter

        ... Or maybe Donnie Two-Scoops ;-)

  3. Adrian 4

    Coding is merely writing a specification for how a machine responds to its environment.

    The form in which the code is written may vary, but if you don't have a specification, how can you get what you want?

    1. Tomato42
      Unhappy

      And we all know how good the people "up the food chain" are at writing precise and unequivocal specifications...

      Yes, people copying solutions off Stack Overflow *ahem* programming will lose their jobs, just like drivers, lawyers and doctors, but programmers will not be the first to go.

      Either way, it doesn't look like the French Revolution will reign much longer as the bloodiest one...

    2. rtb61

      Specification vs Instruction

      There is a huge difference between specification and instruction. When you specify, you state what should be achieved; when you instruct, you state how it should be achieved.

      That's a major real-world difference. The need for coding will reduce, but it will not be eliminated. Obviously you will need to debug auto-generated code when results fail. Why did it fail? Was it an auto-coding mistake, or is your specification in error? Check the output, read the code; that's the only way to find out.

      So you will code much faster, because the bulk of the code will be auto-generated from the specification. You then check the code, and audit it for security, compactness and speed, which is far faster than trying to check your own mistakes.
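
      (A minimal Python sketch of the distinction, purely illustrative and not from the article: the specification states WHAT is wanted, the instruction states HOW to compute it.)

        data = [7, 3, 9, 1]

        # Specification: state WHAT you want -- the smallest element.
        smallest = min(data)

        # Instruction: state HOW to get it, step by step.
        smallest = data[0]
        for x in data[1:]:
            if x < smallest:
                smallest = x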

    3. Anonymous Coward
      Anonymous Coward

      >if you don't have a specification, how can you get what you want?

      Irrelevant - our coding Robot Overlords will instead ensure we get what we need.

  4. Anonymous Coward
    FAIL

    The Last One...

    Redux.

    Comes up every few years. Goes nowhere.

    1. colinb

      Re: The Last One...

      Oh my dear god, not seen that before.

      "Started by a former bankrupt millionaire with a weeks formal training in computers and a Somerset Type Dealer"

      Surely worth an El Reg article, or an Ealing comedy.

      1. colinb

        Re: The Last One...

        annnnd still around up to 5 years ago

        https://web.archive.org/web/20120204205001/http://www.tloconsultants.com/index.htm

        tloconsultants =The Last One Consultants

    2. Anonymous Coward
      Anonymous Coward

      Re: The Last One...

      "People write less code now than they once did because so much functionality comes from libraries"

      And libraries are what exactly? Bundles of code. This guy is a dickhead.

  5. Paul J Turner

    Has another five years gone by so soon?

    Again? - https://en.wikipedia.org/wiki/The_Last_One_(software)

    (Hah, Snap! @teahound)

    1. Peter Gathercole Silver badge

      Re: Has another five years gone by so soon?

      That immediately sprang to my mind as well.

  6. Anonymous Coward
    Anonymous Coward

    Blue pill please

    A world without developers. I ask that you don't wake me. Blue pill please.

    (sorry, no sarcasm icon available to us AC's)

    1. Anonymous Coward
      Anonymous Coward

      Re: Blue pill please

      To be honest, a world with H-1B visas, 'low cost centres' and Capita is akin to a world without developers.

  7. Anonymous Coward
    Anonymous Coward

    Aircraft? Never get off the ground...

    Until they did.

    Each iteration brings us closer. There are now robust no-code business application development platforms chomping their way through tasks that a few years ago needed skilled coders. They won't replace blue sky coding, but there will be a sea change in the next few years. DevOps will be fine, but it's a bleak outlook for grunt-work coding.

    1. Anonymous Coward
      Anonymous Coward

      Re: Aircraft? Never get off the ground...

      CASE never lifted off, even after all these years.... anyway, GitHub is showing why it's not wise to put your code in someone else's repository: it will be used against you....

    2. Bill Gray

      Re: Aircraft? Never get off the ground...

      "They laughed at Galileo, and they laughed at Einstein. But they also laughed at Bozo the Clown."

      Yes, sometimes a seemingly dumb idea (like aircraft) actually turns out to work. But I think this autonomous coding is more in the Bozo category.

      1. Ken Hagan Gold badge

        Re: Aircraft? Never get off the ground...

        Aircraft were never a seemingly dumb idea. There are these things called birds.

        Autonomous coding, on the other hand, is a seemingly dumb idea because we aren't even sure how people do it, and we *are* people.

  8. This post has been deleted by its author

    1. Chuunen Baka

      I've been coding for 40 years. When I started, everything was built from scratch. Now I glue open source libraries together. You still have to write reams of complex code but the relative increase in productivity is amazing. There are loads of user self-service products of varying qualities but there'll always be a role for specialised computer wranglers. Maybe not as many as now and definitely fewer doing low level coding.

      1. Martin M

        Arguably, integration of higher level libraries and components is *harder* than lower level coding, which is a fairly mechanistic process.

        As for roles for specialised computer wranglers, it's notable that as abstractions and high-level reuse have increased, the number of developers has too. Probably because increased productivity means more problems are economic to tackle. Presumably that will end sometime, but we certainly don't seem to be nearing the inflection point yet.

        What *may* happen is that there's a shift in skills requirements, with the ability to talk to end users to work together on accurate requirements being a bigger part of the job. Historically that's been mostly a different set of people than the coders (business analysts, UX designers, etc.) but it works a lot better when it's one person doing both that and the coding. People did some amazing stuff to efficiently solve business problems with 4GL languages and small teams back in the day, albeit the systems were difficult to maintain and often based on very proprietary underlying platforms.

        Hardcore system-level programmers will still be required to build the lower level platforms, of course.

        1. Martin M

          By the way, I should probably have said "*can be* a fairly mechanistic process" above.

          There's a world of difference between producing something that functionally works and creating tight, beautiful code. The latter is more like an art form, really.

    2. John Smith 19 Gold badge
      Unhappy

    In 1970... software packages that had abstraction levels tailored to business niches.

      "The users would then be able to produce their own custom application interactively."

      And behold....

      ABAP

      WordBASIC, Excel macro language --> VBA

      End users writing code is why COBOL had "PLUS" and "MINUS" in it, because "end users" were not thought smart enough to use "+" and "-".

  9. Vincent Manis

    Well, OK, then

    I'm a fan of the Fargo film and TV series, which depict decent people going to hell through a series of really bad decisions. Similarly, programming ended in the early 1960s, when FARGO, the programming system, was created [https://en.m.wikipedia.org/wiki/FARGO_(programming_language)]. Not!

    Of course, people don't write mortgage programs anymore, they use spreadsheets. We don't need to keep writing the same programs over and over.

    Software development has changed many times over the decades. What hasn't changed is the need to have smart people create solutions to problems that nobody has solved before, and other smart people evolve those to meet changing needs. Intelligent programming assistants can definitely be a part of that.

    1. John Smith 19 Gold badge
      Coat

      "FARGO (Fourteen-o-one [IBM 1401] Automatic Report Generation Operation"

      Holy s**t.

      For those who thought RPG was ancient.

  10. JimBob42

    Dooooooom

    In my experience, any system designed to allow non-coders to do any kind of application or data maintenance ends up being used solely by coders anyway.

  11. Anonymous Coward
    WTF?

    We already have autonomous code generation

    It's called a compiler.

    However, you still need people to write the compiler.

      A compiler that can write itself from scratch? Without human involvement? Last time I checked, this was considered an unsolvable problem. Evidently not in marketing.

    1. John Gamble

      Re: We already have autonomous code generation

      Yeah, that was my first thought as well. But what gets compiled will be an even higher level language.

      It's been decades since I've had to write anything in assembler -- compilers got much better, and the instructions that were known cycle hogs have been tamed. On the other end of the scale, I'm assuming my former co-worker who specialized in VAX COBOL has either learned new skills or retired.

      I can't be certain what the Next Big Thing will be (although I'm seeing signs that explicitly written loops will be the next thing to become a rarity, as ranges-as-objects become common), but it will certainly come, and we will adapt or retire.

      1. Anonymous Coward
        Boffin

        Re: We already have autonomous code generation

        > the instructions that were known cycle hogs have been tamed

        Oh I wasn't even referring to an optimizing compiler. Just a simple C compiler that does no optimizations at all.

        Yes, someone can write a compiler-compiler -- something that emits code which, when compiled, becomes a compiler. There are known examples of such a thing - Yacc/Bison for example. But it's an infinitely recursive problem: who writes the compiler-compiler, who writes the compiler-compiler-compiler, and so on.

        Someone who makes this type of general-purpose grandiose statement - that software will write itself some day - Real Soon Now - has no understanding of the process of bootstrapping a compiler.

        Why I keep yammering on and on and on about compilers: because without a compiler, there's no software.
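
        (Purely as illustration, and nothing like how Yacc works internally, here is a toy "compiler-compiler" in Python: a hand-written generator emits source code for an evaluator. The names are invented for the sketch. Note the point made above still holds: a human had to write the generator.)

          # A hand-written generator that EMITS source code for a tiny
          # expression evaluator. The emitted code is the "compiler";
          # this generator is the "compiler-compiler" -- and a human
          # still had to write it, hence the regress.
          def generate_evaluator(ops):
              lines = [
                  "def evaluate(tokens):",
                  "    acc = float(tokens[0])",
                  "    i = 1",
                  "    while i < len(tokens):",
                  "        op, rhs = tokens[i], float(tokens[i + 1])",
              ]
              for n, (sym, expr) in enumerate(ops.items()):
                  kw = "if" if n == 0 else "elif"
                  lines.append(f"        {kw} op == {sym!r}: acc = {expr}")
              lines += ["        i += 2", "    return acc"]
              return "\n".join(lines)

          # The "grammar": operator symbol -> how to combine values.
          source = generate_evaluator({"+": "acc + rhs", "*": "acc * rhs"})
          namespace = {}
          exec(source, namespace)  # "compile" the emitted evaluator
          print(namespace["evaluate"](["2", "+", "3", "*", "4"]))  # 20.0, left to right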

        1. John Gamble

          Re: We already have autonomous code generation

          "Oh I wasn't even referring to an optimizing compiler."

          Yeah, neither was I. I was referring to machine instructions, which back in the Olden Days one tried to avoid in tight loops (using, say, shift & add to multiply your variable by a constant, or even shift & subtract for the division equivalent). Changes in hardware are just one of the things that drive changes in software languages.
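
          (For anyone who never had to do this by hand, a small Python sketch of the shift & add / shift & subtract trick; modern compilers perform this strength reduction automatically.)

            # Multiply by a constant without a multiply instruction.
            # 10*x == 8*x + 2*x, and 15*x == 16*x - x.
            def times_ten(x):
                return (x << 3) + (x << 1)   # shift & add

            def times_fifteen(x):
                return (x << 4) - x          # shift & subtract

            assert times_ten(7) == 70
            assert times_fifteen(7) == 105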

          My point was more toward the fact that what we regard as a compiler will change in the coming decades, because the languages we use will have more features[1] and (one hopes) more safeguards.

          ---

          [1] Julia, in fact, has just-do-it operations in it that I could only dream about forty years ago.

  12. Denarius
    Meh

    haven't we been here before ?

    Ah yes, COBOL meant the PHB class and users would write their own code, then SQL meant management could directly query business data, followed by the great white hopes of VB and Delphi. At least the last two were not, AFAIK, seen or sold as "anyone can code" business solutions.

    Now we have a cultural elite and PHB class that despise technical knowledge and experience in the real messy world. This explains why wasting education time on coding instead of history, pure logic, maths and written communications has become an obsession with advisors.

    Andrew also touched on the biggest flaw in all of the machine-learning, Lego-style work: who _knows_ what flaws the libraries contain that may cause ML and AI failures? Wasn't there a recent El Reg article comparing two parallel coding libraries that gave different results? Lastly, hardware vs software speeds: spot on. I remember a 486DX with 16 MB of RAM and Win95C or early RedHat running faster than the 8-core, 8 GB machine running the Abomination or Debian that I use at the moment, doing the same tasks. If software gets much slower, SteamPunks building Babbage Machines will be doing spaceflight calculations better.

    1. bombastic bob Silver badge
      Terminator

      Re: haven't we been here before ?

      "Ah yes, COBOL meant the PHB class and users write their own code, then SQL meant management could directly query business data followed by the great white hopes of VB and Delphi. "

      Don't forget "Forest and Trees". That was an interesting thing. It died, like similar things. I think MS Access may have killed it.

      And one more point: SKYNET programmed itself, didn't it?

      Icon, because, SKYNET mention.

    2. Anonymous Coward
      Anonymous Coward

      Re: haven't we been here before ?

      VB and Delphi were called RAD - Rapid Application Development - solutions. They eased GUI construction, removing the need for long stretches of code (the "in Windows you need 100 lines of code to show 'Hello World'!" problem) and resource files. Sometimes they were sold as being able to build "codeless" applications just by using "components" (which are, after all, libraries, just with specific interfaces) and setting their properties to link them, but that worked only for very simple ones, good for demos, nothing more. Delphi came with a full, powerful, object-oriented language (and could be used without components as well, being fully able to call any OS APIs directly).

      But the emphasis on the "component model" made some executives believe real applications could be built just from pre-made components with very little glue to link them together. One executive in a company I worked for advocated a model where only low-skilled developers (those you could probably replace with some AI...) were needed to assemble high-level building blocks into a full application - he called them "blue-collar programmers", there just to tighten bolts (that company went bankrupt a few years after I left).

      It was around the same time CASE tools were being promoted, along with tools that were supposed to allow programming in a purely visual way, without coding, e.g. Borland ObjectVision. None went far.

      Let's see if AI can change it...

  13. FelixReg

    COBOL again?

    Yep, I remember reading an article in Datamation, the main magazine for data processing types, back in the late '50s or so.

    The article touted this great new thing, COBOL. COBOL would change the world. COBOL meant future managers could write their own programs rather than relying on pesky programmers!

    The article's prediction was correct.

    As time went on, "programmers" became indistinguishable from the "managers" of the article writer's imagination. And they wrote programs in COBOL.

    Besides, what programmer out there has not, at some point, been in the business of writing a replacement for themselves?

    GitHub is, like the Internet, a huge advance in productivity. We're all richer because of them. Good deal.

    1. Teiwaz

      Re: COBOL again?

      "The article's prediction was correct."

      Clearly for a given value of correct.

      I've seldom seen managers program - even in places with COBOL.

      - The whole point of being a manager is getting others to do the work.

    2. bombastic bob Silver badge

      Re: COBOL again?

      Thinking of COBOL...

      I never met Grace Hopper, but I saw her once [may have saluted her]: a four-striper [at the time] and rather skinny elderly lady with a bit of a smile on her face. It was in Orlando at the NTC in the early '80s, near her office actually...

  14. Amorous Cowherder
    Facepalm

    Maybe not for quite a while yet

    In 25 years of working with DBs I've yet to find a good SQL code generator. SQL is such a simple language: extremely simple to learn and use, and not that difficult to master. Yet when you go beyond 20-odd lines and use generators to build the code, they always come up with the most stupid code, never efficient. You always end up taking the generated skeleton code and rewriting it, or worse, re-jigging the database and data structures to suit it, because the code is too complicated to deal with but sort of works. Sometimes people would rather do even more stupid things, like duplicating the whole data set into more bizarre forms, or chucking indexes around like wedding confetti, killing any hope of efficiency.

    When I see a decent code generator for something as simple as SQL, one that produces a 1,500-line explain plan for a 50-table join and works perfectly, then I'll listen, and you can start talking about code generators for more complex languages like C#, Python, Node, C++, etc.

    1. Teiwaz

      Re: Maybe not for quite a while yet

      "when you go beyond 20-odd lines and use generators to build the code, they always come up with the most stupid code, never efficient"

      That's been true since at least dBase III.

      Even if the generated code is moderately efficient, it's usually totally unreadable, making extension/expansion impossible without a total rewrite.

  15. stratcat

    CASE

    I trained on ADW and also Texas Instruments' IEF about 25 years ago. There are multiple reasons these never truly caught on. (To be fair, they also ran on OS/2 at the time, which never caught on either.)

  16. rmullen0

    The guy is full of it

    I went to college in the early 90s. The head of the computing department said that CASE tools were going to replace programmers. It didn't happen. I don't see it happening now either. Things are getting worse, not better, with the JavaScript hell being inflicted on people. And honestly, I couldn't care less what some pinhead at GitHub thinks. With all due respect, GitHub's website isn't that complicated and didn't take a rocket scientist to figure out. So maybe they are correct after all: AI could build a weak website like GitHub.

  17. rmullen0

    Don't forget about security

    Let us not forget that the elite corporations espousing AI have yet to write secure software themselves. How about they first focus their efforts on writing code that isn't completely insecure and riddled with security holes? As far as I am concerned, they are doing a terrible job.

  18. Anonymous Coward
    Anonymous Coward

    the giants' shoulders

    Wanstrath suggested that coding, which has been likened to standing on the shoulders of giants, will become more like standing on each other's shoulders.

    Some of the code I see looks more like hanging off each other's testicles, trying not to fall into the abyss below.

    [Edit] Mind you, that's just my own code. What some of the others are writing does not bear thinking about.

    1. Bill Gray

      Re: the giants' shoulders

      If I have not seen as far as others, it is because giants were standing on my shoulders.

      (Not an original sentiment, but it does resonate with me some days.)

  19. Milton

    Hype overdrive?

    Notwithstanding that AI doesn't exist and machine learning works only in extremely specific cases, I submit that the gulf between "There are oodles of libraries (of widely varying, often questionable quality)" and "Robots will write all the code" is absurdly vast.

    And of course, GitHub might have some incentive to hype such an overblown prediction. I think a positively enormous pinch of salt goes with this.

  20. Steve Channell
    Boffin

    I can see a world without Ruby or JavaScript developers!

    Almost nobody is using Ruby anymore (and those who are spend so much time on performance analysis that C++ looks like a RAD tool), and JavaScript (ECMAScript) is being replaced by TypeScript.

    Unfortunately, inversion-of-control containers make true dependencies very difficult to find, and dynamic scripting (e.g. monkey patching) makes them almost impossible. But that is not what this is about... it's about "how f*cked am I when someone delists their library from NPM?" If you need to ask... you already know.

  21. bfwebster

    Been hearing this for over 40 years

    The fallacy that happens again and again is the presumption that the value in coding lies in combining things that already work. It doesn't. The value is in inventing new things that haven't been done before. Otherwise, you're simply solving a problem that's already been solved, and there's very little value in that.

    A great piece to read is "The Five Orders of Ignorance", an appendix in Philip Armour's book _The Laws of Software Process_.

    1. rmullen0

      Re: Been hearing this for over 40 years

      Yeah, thanks to open source and package management, I no longer have to write any code. I can just compose an app out of reusable components using a little Elmer's glue. NOT!

  22. tutnes

    No silver bullet still holds

    I think Fred Brooks' No Silver Bullet still holds water:

    "there is no single development, in either technology or management technique, which by itself promises even one order of magnitude [tenfold] improvement within a decade in productivity, in reliability, in simplicity."

    https://en.wikipedia.org/wiki/No_Silver_Bullet

    1. handleoclast

      Re: No silver bullet still holds

      There used to be silver bullets in the past.

      * Going from hard-wired (literally, using a wire-wrapping tool) programs to stored-program computers.

      * Assemblers.

      * Compilers.

      * Structured constructs like do loops to replace spaghetti gotos.

      Etc. These things provided major improvements in programmer productivity and code reliability.

      These days we just get a load of EPNS bullets. They look like silver. They're touted like silver. But years after, when all the hype has died away, they are quietly forgotten. Another management fad that died the death it truly deserved.

      Occasionally we have a turd wrapped in cooking foil pretending to be a silver bullet. Pascal comes to mind, here, as does any language in which Niklaus Wirth had any involvement. Ob Wirth Joke: how you pronounce "Wirth" depends on whether you call him by name or by value - it's either "weerth" or "complete fuckwit who should never be let near the design of a programming language."

      That's not to say that there are no more silver bullets waiting to be found. There may be. But going by the last few decades there probably aren't.

  23. Aging Hippy

    " Don't go into computing, there's no future in it now that computers can write their own programs" - my careers adviser (1966)

    1. Bucky 2

      @Hippy:

      My careers advisor told me the same thing, but in the '70s. I believed him. I wasted years trying to stay away from computer science. Eventually, I gave up trying to do anything "real" and did indeed become a programmer.

      I frequently wonder how my life would have worked out differently, if the industry hadn't been telling schoolkids it didn't need them anymore.

  24. tiggity Silver badge

    CASE

    I did play with CASE tools that went all the way from model to generating a code skeleton from that model.

    But, to be useful, the model had to be in so much detail that it took far longer than just writing the code by hand: the model really had to be at the every-i-dotted, every-t-crossed level (lots of stuff that's normally implicit had to be explicitly detailed), and the skeleton still needed fleshing out with some hand-crafted code modifications.

    ... and then the fun when model changes were needed ...

    So, CASE tools are interesting, but they still have a long way to go.

    1. Tomato42
      Joke

      Re: CASE

      I don't know about you, but we in the programming business have a name for a specification that has every i dotted and every t crossed - working code.

    2. acid andy

      Re: CASE

      You're right about CASE tools. It takes at least as long as writing the equivalent code would have done and then, more to the point, it's a whole lot less fun than just writing the code. Luckily, with most tools you can knock out the basic code first, click a button and have it generate the CASE model from that afterwards if your PHB insists on having one.

  25. EnviableOne

    Coders can be automated.

    Programmers can't

    Purely putting the methodology into whatever language you desire is automatable, but the bit that requires the skill, working out the algorithm, will still need some creativity, and that AI just can't do.

  26. John Smith 19 Gold badge
    FAIL

    "writing less code" blah blah "libraries, frameworks" blah blah asymptotic curve--> end of history.

    When I put it like that, does it not smell like odeur du taureau?

    IOW, the rate at which meatsacks write code may be falling (but then isn't the Excel macro language the most used programming "language" on the planet, by a very wide margin?), while the rate at which software "intelligence" (WTF that even means) improves is also slowing down, as the problems to be tackled in large systems become exponentially harder.

    Low "intelligence" --> "Pick n mix" code generator

    High "intelligence" --> "Write a new system called 'Universal Credit.' The project brief is in the 'projects' folder under 'UC'.* Come back to me with outstanding questions and an outline architecture. Your choice of development languages for the different environments, with benchmark data to deliver 2-3 second response times for major tasks."

    Anyone like to guess how many problems are in those four lines, how hard they are, and how far away any real system is from dealing with them?

    BTW it's interesting that decades of code studies show the amount of executable code programmers actually write remains in the 12-20 LOC a day range. So yes, size does matter.

    *I chose Universal Credit because that's a real system being built (because according to El Reg, passim, it's not there yet) by real people, on a serious scale, for serious money (with serious cost and schedule overruns).

  27. ocratato

    More than Code

    Building software of any complexity requires a design process. It may be a huge set of documentation, a few notes on a whiteboard, or some ideas floating around in someone's head, but it always exists, and in my experience it is rarely visible on GitHub or whatever repository the code is placed in.

    An attempt to use deep-learning AI to create an AI programmer from GitHub repositories will not produce something that can take a set of requirements and produce a program.

    Before we can use AI to do the work, we need to build repositories of designs for the AI to study.

  28. BarryUK

    And the hype continues...

    "People write less code now than they once did because so much functionality comes from libraries"

    Is this true, though? Where once people might have written a simple, standalone program, they now write a full-blown Enterprise Architecture-based solution for about the same effort, because it's so simple to use frameworks like Spring, Hibernate, etc. Does this mean people are actually writing that much less code?

    I remember 30 years ago my manager telling me programming would soon be obsolete thanks to CASE tools. 15 years ago Scott McNealy, no less, told me the same thing about OO. I reckon programming will see me through to retirement in another 15 years or so without any trouble.

  29. This post has been deleted by its author

  30. Fruit and Nutcase Silver badge
    Joke

    Code Reviewers

    Don't forget the Code Reviewers. A bit like cockroaches surviving a nuclear holocaust, a particular type of code-reviewing meatsack will survive the coding Armageddon, in order to ensure the code generated by the AI programmers is formatted to the letter of the style guide. They may not understand the code in front of them well enough to spot problems, but they sure will spot formatting that is inconsistent with the style guide.

  31. wmac

    It is called "Machine Learning". Tell him it is already here. You give it data (i.e. input and expected output examples), and a general machine-learning algorithm automatically builds a program that performs the task!!!
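
    (A minimal sketch of that idea in Python, assuming scikit-learn is installed; the data and model here are invented for illustration. The "program" is fitted from input/output examples rather than written.)

      from sklearn.tree import DecisionTreeClassifier

      # Input examples and their expected outputs: the label is 1 when
      # the pair sums to more than 10, 0 otherwise.
      X = [[1, 2], [9, 9], [4, 5], [8, 7], [2, 3], [6, 6]]
      y = [0, 1, 0, 1, 0, 1]

      # The algorithm "builds the program": it learns whatever rule
      # separates the examples (here a single threshold suffices).
      model = DecisionTreeClassifier().fit(X, y)
      print(model.predict([[10, 4], [1, 1]]))  # expected: [1 0]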
