Boffins foresee most software written by machines in 2040

Boffins at the Department of Energy's Oak Ridge National Laboratory speculate that by 2040 advances in AI disciplines like machine learning and natural language processing will shift most software code creation from people to machines. In a paper distributed via ArXiv, "Will humans even write code in 2040 and what would that …

  1. Anonymous Coward
    Anonymous Coward

    Websites designed by AI?

    I look forward to our nightmarish web 4.0 future, that is, if current AI-dreamed images are anything to go by.

    Or by 'software' do we not include anything user-facing?

    1. ForthIsNotDead
      Unhappy

      Re: Websites designed by AI?

      I'm getting sick of the AI moniker being attached to everything, to be honest.

      Your code editor offering auto-complete or recommending how to craft a particular line of code is not AI.

      Just fuck off with your AI bollocks. Seriously.

  2. oldtaku Silver badge
    Meh

    We've been here before...

    Yes, you may be able to get rid of code pigs - you may have something that does all the Java scutwork for your standard business reporting crap. Progress comes from encapsulating things - almost nobody needs to know asm any more, you don't need to draw your own UI windows, C# has data structures out the wazoo. But you're just moving the work higher, and then the work gets more complex. Maybe in the future database stuff will be so pedestrian it's seamlessly integrated.

    But now you're going to need someone to specify exactly what you want - and people asking for things are notoriously, provably, bad at not knowing what they actually want. I remember the last time AI was going to get rid of programmers, and it ran right up onto the shore on this problem (and terrible performance, but we'll assume we have enough horsepower now).

    Even if you assume the generic stuff is good enough for most cases, you're still not going to be able to get rid of the software/system engineers - engineers solve general problems given constraints, and if you solve /that/, you've solved problem solving - and 'no programmers' will be the least of the impacts on society. No deep learning network has demonstrated anything like general problem solving or any penchant for it. If you could perfectly encode every bit of your problem and required software solution in an input and output vector one could understand, and you could do the same thing on all existing software to train it, maybe it would surprise you. But software is not tolerant of minor faults the way images are, and who are you going to get to do that?

    Is the ratio of code pigs to engineers 4:1, giving you 80%? Maybe. I find Jeff Bigham's comments more believable. AI will let software engineers tackle bigger and better problems and not worry about the lower level stuff.

    1. Lysenko

      Re: We've been here before...

      almost nobody needs to know asm any more

      The "IoT" hypefest has clearly passed you by. Try building one of those billions of battery-powered sensor nodes that the (near) futurists are predicting without a solid understanding of ARM M0-M4 ASM. You can do a lot of it in higher level languages like C of course, but you'll still need to visit the basement from time to time. Java? .NET? or (I'm about to lose a rib here) functional languages with stack-busting recursion all over the shop? Hahahahaha.

      AI *might* be able to start writing general purpose code reliably a few decades after it completely masters synthesizing SQL from natural language, which is something it currently isn't even remotely close to achieving, despite the query space being precisely defined and constrained by the database metadata and decades of precursor work on QBE (Query by Example).

      1. oldtaku Silver badge

        Re: We've been here before...

        I still know and use x64 and ARM assembly (and a bunch of 8-bits, but sadly never get to use them) for things like patching binaries we don't have source for and the occasional really time-critical thing - like getting cycle counts cheap. It's why I said 'almost nobody' and not 'nobody'. I also know from trying to hire people that that skillset is incredibly rare.

        1. yoganmahew

          Re: We've been here before...

          Indeed, and even the problem is misunderstood. What temperature will my coffee be in 10 minutes? Well, it takes me fewer than 10 minutes to drink my coffee, so it'll be body temperature. When an AI asks that question, we're fooked; until then it is mostly spoof and nonsense.

          1. This post has been deleted by its author

            1. yoganmahew

              Re: We've been here before...

              @Symon

              "Ten minutes _is_ how long it took to drink the beverage, not _are_ how long it took."

              Ah, but that is a particular beverage of a full mug of coffee, filter style. An Americano might take fewer than 8 minutes. So the minutes are individual and can be counted differently as separate units of drink consumption time measurement. 10 minutes are taken to drink a cup of coffee... I am quite the passive.

              An espresso takes less than a minute ;)

              edit: on reflection, perhaps I should have originally said "it takes fewer than ten minutes for my coffee to be drunk by me"...

            2. Doctor Syntax Silver badge

              Re: We've been here before...

              "In this case, if you happen to be a prescriptive linguist, you should use less, because '10 minutes' isn't plural here."

              A handy rule, if you care about this sort of thing at all, is to do a units conversion. Would you write "it took fewer than 1/6th of an hour"?

              1. Charles 9

                Re: We've been here before...

                Or put simply, measurements are always taken as a singular since the unit (plural or not) is describing a single continuous thing: not the thing itself but an aspect of that thing, and that thing usually only has ONE of each aspect. You don't normally drive a kilometer 1,000 discrete meters at a time, nor do you hold a meter of ribbon in 100 separate 1cm pieces. Don't go by the unit; go by what the unit is describing.

        2. Loud Speaker

          Re: We've been here before...

          I also know from trying to hire people that that skillset is incredibly rare.

          The direct consequence of piss-poor pay for 30 years. Assembly language programmers are seen as the scrap metal workers of the engineering industry. Yes, there is a kind of respect, but not real respect, and definitely not the money they would get if seen as the precision machine operators in the development labs that they are.

          Disclaimer: I have written assembler for MIPS and Sparc, as well as Intel, and a bunch of 8 bit stuff best forgotten - I have made far more from writing PHP and C++.

      2. joeldillon

        Re: We've been here before...

        A specific selling point of the Cortex M is that (because its interrupt handlers use C calling conventions) you can write bare metal firmware for it without using any assembler whatsoever, actually.
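
        For instance, a minimal bare-metal sketch (the SysTick register addresses are the architecturally defined ones; the vector table and startup code are assumed to come from a CMSIS-style vendor file, so treat this as an illustration rather than a complete project):

        #include <cstdint>

        static volatile std::uint32_t g_ticks = 0;

        // On Cortex-M the NVIC follows the normal C/C++ calling convention (AAPCS):
        // the core stacks and unstacks the scratch registers itself, so a handler is
        // just a plain void function whose name matches the vector-table entry in the
        // vendor's CMSIS-style startup file.
        extern "C" void SysTick_Handler()
        {
            ++g_ticks;
        }

        int main()
        {
            // SysTick control and reload registers (architecturally defined addresses).
            volatile std::uint32_t* const SYST_CSR = reinterpret_cast<volatile std::uint32_t*>(0xE000E010);
            volatile std::uint32_t* const SYST_RVR = reinterpret_cast<volatile std::uint32_t*>(0xE000E014);

            *SYST_RVR = 48000 - 1;  // e.g. a 1 ms tick assuming a 48 MHz core clock
            *SYST_CSR = 0x7;        // enable counter, enable interrupt, use core clock

            for (;;) { /* everything else happens in interrupt handlers */ }
        }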

        1. Lysenko

          Re: We've been here before...

          A specific selling point of the Cortex M is that (because its interrupt handlers use C calling conventions) you can write bare metal firmware for it without using any assembler whatsoever, actually.

          You may not have to write much of it but you certainly need to read and understand it. Try diagnosing that without knowing assembly language.

    2. Nolveys

      Re: We've been here before...

      people asking for things are notoriously, provably, bad at not knowing what they actually want.

      The people I have worked with have been quite good at not knowing what they actually want.

      1. Anonymous Coward
        Anonymous Coward

        Re: We've been here before...

        Amusingly, Nolveys' comment focuses on part of TFA in which the author was provably bad at expressing what (s)he wanted to say.

    3. Anonymous Coward
      Anonymous Coward

      Re: We've been here before...

      Indeed, we have: Something I posted in 2012:

      https://forums.theregister.co.uk/forum/1/2012/05/11/ubuntu_emulates_amazon/#c_1408177

    4. Daniel von Asmuth
      Meh

      In the year 2000

      In the year 2000, Queen Elizabeth II will still be on the throne thanks to DNA therapy making her effectively immortal.

      In the year 2000, economic crises will be a thing of the past.

      In the year 2000, we will get around in flying cars.

      In the year 2000, routine factory work will be done by trained apes.

      In the year 2000, we will have melted the Arctic ice cap to shorten shipping routes.

      In the year 2000, users just have to say what they want in COBOL and the computer will write all the machine code.

      In the year 2000, mankind shall witness the second coming of the Lamb for the End is nigh.

      1. Loud Speaker

        Re: In the year 2000

        You forgot to mention: Voice recognition will be a solved problem, and robots will have taken over the world.

        And I wish to point out that your COBOL one has come true, which probably explains why the banking system is no longer reliable.

    5. Mage Silver badge
      Flame

      Re: We've been here before...

      Yes.

      You need EXPERTS to figure out what is really wanted and then design it.

      Remember Fortran, COBOL, The Last One, 4GLs.

      This is nonsense. All current AI relies on a huge amount of human-curated data input before it's let loose. At best it will only be a new programming language, one where you are less sure that it will do what you wanted at run time.

  3. Dagg Silver badge
    Pint

    The ultimate self modifying code

    Nuff said!

    I would think that this will just be more comprehensive higher-level libraries that you configure and string together. Payroll library, GL library, anyone...

    1. Destroy All Monsters Silver badge

      Re: The ultimate self modifying code

      Self-modifying code == Not-self-modifying code with a modifiable data structure.

      The Bad Developer is on the left side.

      1. Prst. V.Jeltz Silver badge
        Coat

        Re: The ultimate self modifying code

        I wrote an HTA app once that changed its own HTML as it went to display different things. Does that count?

      2. Dagg Silver badge

        Re: The ultimate self modifying code

        Nah, just think about it: in the machine code days all of the machine code was just data that the CPU 'processed' and gave you results..... <G>

        I remember some of the old machines I worked on - I'm talking core memory and 7400 TTL logic (can't remember the name of the thing) - where you actually had to write a JMP instruction into a specific memory location as the return from a subroutine, as it had no JSR or stack.

        Even with something like a PDP-11 under RT-11, if you loaded a relocatable module into memory you needed to go through it and modify the relative addresses to actual addresses. You could even do this several times as you moved the code around to free up memory. A bit like an early overlay.
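
        In that spirit, a toy sketch of the trick (a made-up three-words-per-instruction machine, nothing like the real iron): the caller plants the return target in the subroutine's closing JMP before jumping in, precisely because there is no stack to come back on.

        #include <cstdio>
        #include <vector>

        enum Op { HALT, PRINT, JMP, STORE };  // STORE dest value - code is just data

        int main()
        {
            // "Memory": three words per instruction (opcode, operand, operand).
            std::vector<int> mem = {
                /* 0*/ STORE, 16, 6,   // plant the return address (cell 6) in the sub's exit JMP
                /* 3*/ JMP,   12, 0,   // "call" the subroutine at cell 12
                /* 6*/ PRINT, 7,  0,   // ...we come back here
                /* 9*/ HALT,  0,  0,
                /*12*/ PRINT, 42, 0,   // the subroutine body
                /*15*/ JMP,   0,  0,   // exit slot - target patched by the caller above
            };

            for (int pc = 0;;) {
                const int op = mem[pc], a = mem[pc + 1], b = mem[pc + 2];
                switch (op) {
                    case HALT:  return 0;
                    case PRINT: std::printf("%d\n", a); pc += 3; break;
                    case JMP:   pc = a; break;
                    case STORE: mem[a] = b; pc += 3; break;  // self-modification: rewrite the "code"
                }
            }
        }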

        By 2040 I would say that the machine that writes the code will be involved in writing its own code, hence the concept of "ultimate self modifying code".

  4. Kevin McMurtrie Silver badge
    WTF?

    Clippy

    Why would you have an AI write code? I shudder when I think of what garbage it would get as training data. Take whatever it's supposed to be doing and convert it into libraries and language features.

    Software tends to have infinite requirements. Every time somebody swears they have an architecture to solve the problem of writing code, they instead create an architecture that solves one use case. The architecture is modified to handle more and more use cases until eventually the architecture is more complicated than writing code. This is the birth of a "legacy system" that people will curse for years.

    Recent advances in computer languages have been in better abstracting data transformations. You can describe what you want done using formulas and it gets taken care of. When those don't fit needs, plain old brute-force still works. The next step could be making it easy to declare, at will, a locally scoped DSL for performing a specific task - there would be some good science in figuring out how that would look.
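
    For the 'describe it as a formula' bit, something like C++20 ranges already gets part of the way there - a rough sketch with made-up data, where the pipeline is effectively a tiny, locally scoped description of the transformation:

    #include <iostream>
    #include <ranges>
    #include <vector>

    int main()
    {
        std::vector<int> readings = {3, 18, 7, 42, 11, 56};

        // The "formula": keep readings over 10 and double them. No explicit loop;
        // the pipeline is declared right where it is needed and nowhere else.
        auto cleaned = readings
                     | std::views::filter([](int r) { return r > 10; })
                     | std::views::transform([](int r) { return r * 2; });

        for (int r : cleaned)
            std::cout << r << '\n';   // 36, 84, 22, 112
    }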

    1. E 2
      Trollface

      Re: Clippy

      Feed it Clipper code!

      1. Prst. V.Jeltz Silver badge

        "AI doing coding" could mean anything

        Writing code just seems like the opposite of things an AI can do, given that a large part of writing code is understanding what the code is supposed to do - that means communicating with people - not something I've seen an AI do yet.

        On the other hand you could say that a high level language is basically telling the computer what you want it to do, and then the AI compiler writes the "machine code".

    2. GreenBit

      Re: Clippy

      Maybe that's where AI could help. A working system is in fact a kind of specification. When that system is 25 years of patches, the code itself can be unmaintainable goo, but running it demonstrates required behaviour (one hopes). Sometimes you look at 100,000 lines of code, and feel it in your bones that it could be rewritten in 20,000 but the task is just too daunting to undertake. And it's *menial*.

      So there's a domain (legacy bloatware) that's fairly unambiguous, with mountains of menial detail. Sounds like a job for a computer! Ok, crushing 100K lines of crufty C++ to 20K lines of well organized C++ might be a bit hopeful, but hell, even if it could just look at such a system and produce a specification, that would be brilliant. (Of course, even that's not necessary because we all diligently maintain our specs, right? )

      1. Doctor Syntax Silver badge

        Re: Clippy

        "So there's a domain (legacy bloatware) that's fairly unambiguous, with mountains of menial detail."

        What would the AI do with all the bugs which are in there but undiscovered because in the operational domain they're never triggered? It might add a few hundred K lines to deal with them.

      2. cavac

        Re: Clippy

        "Sometimes you look at 100,000 lines of code, and feel it in your bones that it could be rewritten in 20,000"

        I hear that a lot. But while writing the new code, more often than not, you will discover all the edge cases and special requirements the project also needs to handle[*]. And in many cases, you will end up with the same amount of code anyway. Except that it will be less reliable, because it is still missing about, uh, 25 years of bugfixing.

        [*] Things like "didn't you know, every other year we have an ISO-somethingorother audit, and we need to run this very complicated reporting thingy that would take us weeks to do by hand". Or the always fun "Ok, we upgraded all the systems to use your new API, except of course that Doohikey2000 thing back there in the corner. Changing that would cost too much and cause a downtime of weeks because of the re-certification required"

      3. Daniel von Asmuth
        Unhappy

        Re: Clippy

        Anybody can refactor 100,000 lines of code. Applications over 100,000,000 lines of code require a capable team and lots of time. Windows 2016 is a good deal bigger than that.

  5. Herby

    Wasn't COBOL....

    Designed to make it easy for accountants to code, back in the 1950s. You wouldn't need programmers, and anyone could "write code".

    This whole thing is a case of "been there, done that", and it will continue. We humans are the ones that think into the future and can "design" things. Very little (if any, as I can't think of anything) is designed without human input. I have strong doubts that this will change.

    Nice try though.

    p.s. COBOL is still here, writing paychecks and checking general ledger stuff.

    1. Dagg Silver badge
      Pint

      Re: Wasn't COBOL....

      Designed to make it easy for accountants to code, back in the 1950s. You wouldn't need programmers, and anyone could "write code".

      Bloody accountants!

      I'm maintaining a legacy system written by an accountant in C/C++: full of magic numbers, gotos, death by pointers, hard-coded everything. No documentation, and extremely short variable names - a 3-character name is a luxury.

      But... it is keeping me in a job. Sometimes I just feel like Wally: old, bald and maintaining the old legacy system that can never be replaced as no one knows what the hell it actually does.

  6. Long John Brass
    Terminator

    Stack Overflow

    I kind of just have to know vaguely what's possible, and then I find the specifics on Stack Overflow.

    Wow.... The devs on my site are all AI's :)

    1. JLV
      Trollface

      Re: Stack Overflow

      Good to see Nissan is ahead of its time:

      https://mobile.twitter.com/Scott_Helme/status/727832672551219201/photo/1

      which may explain why their security is very retro 90s-lets-trust-each-other

      https://www.troyhunt.com/controlling-vehicle-features-of-nissan/

      This article feels really 90s-y too: computer languages are improving on abstractions and expressivity, and libraries pack a lot of savvy. But, contrary to the claims here, at some point exact requirements - not probabilistic/AI/big data/generic consumer processing, and not part of a well-known and generic domain - will require someone to be highly specific about what they want done, and that's likely to remain looking strangely like computer code.

  7. inmypjs Silver badge

    The 80's are calling

    and want their ideas back.

    Developed by D.J. "AI" Systems and called 'The last one' because it was the only programming language humanity would ever need.

    https://en.wikipedia.org/wiki/The_Last_One_(software)

    1. Doctor Syntax Silver badge

      Re: The 80's are calling

      "and want their ideas back."

      Indeed. Will this really be the Last One?

  8. Ted's Toy

    Who will program the software writer?

    Without human input there can be no machine output.

    1. SVV

      Re: Who will program the software writer?

      Have written several code generators myself over the years, and used some others, so the answer is obviously computer programmers. They are good for basic gruntwork such as database access code, initial user interface generation and very basic data validation (eg order must be between 0 and 1000 quid). After that, it's diminishing returns as the complexity of what you need to define for the generator approaches the complexity of just writing the complete code in the first place.
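
      For flavour, the sort of gruntwork generator I mean, as a rough sketch (the field list and limits are invented for the example) - it only ever earns its keep on boilerplate like this:

      #include <iostream>
      #include <string>
      #include <vector>

      struct Field { std::string name; std::string type; double min; double max; };

      int main()
      {
          // The "spec" fed to the generator - in real life this would come from a
          // schema or a config file rather than being hard-coded.
          std::vector<Field> spec = {
              {"order_total", "double", 0.0, 1000.0},  // order must be between 0 and 1000 quid
              {"quantity",    "int",    1.0, 500.0},
          };

          // Emit one validator per field: pure boilerplate, which is exactly why a
          // generator pays off here and stops paying off anywhere more interesting.
          for (const auto& f : spec) {
              std::cout << "bool validate_" << f.name << "(" << f.type << " v)\n"
                        << "{\n"
                        << "    return v >= " << f.min << " && v <= " << f.max << ";\n"
                        << "}\n\n";
          }
      }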

      "But they point to recent Facebook research, saying it suggests machines may be able to negotiate with each other to communicate their requirements."

      Not a chance until humans can manage this in an efficient, complete, consistent way too (Note: anybody using the letters UML in a reply here will be ignored). Again, you'll just come up against the complexity problem where the requirements definition becomes more complex than the syntax definition of the target language.

  9. TRT Silver badge

    Prove it...

    Get it to the stage it can comment and document human written code first, then I might believe it to be possible.

    1. Dagg Silver badge
      Mushroom

      Re: Prove it...

      Get it to the stage it can comment and document human written code first, then I might believe it to be possible

      Extremely simple...

      // Start of crap human code that contains substandard business rules

      Large block of human code

      // End of crap human code that contains substandard business rules

      /sarcasm - reg where the hell is the icon

  10. Anonymous Coward
    Anonymous Coward

    Hidden comments too?

    <!-- Let the meat-bags try and figure that shit out! MQ7565422 -->

    <!-- Good one MQ7565422. JH87675 -->

  11. Khaptain Silver badge

    Silver Bullet Syndrome

    Never forget that clients are meatbags that are never truly sure of, nor fully understand, their own requirements, which change constantly during development phases...

    It's not AI that is required; you would need to be at least Deity Level to get coding done correctly without human intervention.

  12. Oengus

    Commonsense isn't common

    Even the best written spec today includes a lot of unstated commonsense assumptions.

    The problem with commonsense assumptions is that what is commonsense to one person is entirely different to another, particularly across cultures. I am dealing with these "commonsense assumptions" on a project I am currently on and it is causing a lot of grief...

    1. Charles 9

      Re: Commonsense isn't common

      For example, never assume everyone writes or types left-to-right (Hebrew and Arabic are both right-to-left, as are other Middle East languages).

  13. Dwarf

    I smell BS

    Even if it were possible, we would still need a way to explain what we want the system to do, which by definition would be a programming language

    Ok machine, I want you to take a reading from that sensor, convert the returned data from its packed form into something usable, modify the value by applying the calibration data in your EEPROM that I explained earlier, now take a rolling average over 500ms to stabilise it and remove noise/measurement uncertainty, and use the result as the stable and accurate sensor output

    Methinks I could write the C/C++ in less time. Wonder how I’d apply optimisation like bit shifting and Boolean operations to any of the above?
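
    Roughly what I mean, as a sketch (the packing format, calibration scheme and the read_sensor_raw()/read_calibration_offset() hooks are all invented for illustration - substitute the real hardware access). The shift-and-mask bits are the answer to my own optimisation question:

    #include <cstdint>

    // Hypothetical hardware hooks - stand-ins for the real driver calls.
    extern std::uint16_t read_sensor_raw();          // packed: 12-bit reading in bits 15..4
    extern std::int16_t  read_calibration_offset();  // calibration value stored in EEPROM

    // Assumed to be called from a timer at roughly 8 ms, so a 64-sample window is
    // about the 500 ms asked for. A power-of-two window means the average is a
    // shift and the index wrap is a mask.
    std::uint16_t stable_reading()
    {
        static std::uint16_t window[64] = {};
        static std::uint32_t sum = 0;
        static std::uint8_t  idx = 0;

        const std::uint16_t raw = read_sensor_raw() >> 4;      // unpack the 12-bit field
        std::int32_t cal = raw + read_calibration_offset();    // apply calibration
        if (cal < 0)      cal = 0;                              // clamp to the sensor range
        if (cal > 0x0FFF) cal = 0x0FFF;

        sum -= window[idx];                                     // rolling average: drop oldest,
        window[idx] = static_cast<std::uint16_t>(cal);          // add newest
        sum += window[idx];
        idx = (idx + 1) & 63;                                   // mask instead of modulo

        return static_cast<std::uint16_t>(sum >> 6);            // divide by 64 via a shift
    }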

    1. Anonymous Coward
      Anonymous Coward

      Re: Ok machine, I want you to...

      Start with something simple, like user interface design.

      OK machine, make that box green.

      Darker.

      Darker.

      Darker.

      Darker.

      Too dark... make it greener.

      Greener.

      Greener.

      Can you make it at the same time greenish and reddish?

      (dedicated to all designers/developers that had to deal with micromanaging types and did not kill them)

      1. Doctor Syntax Silver badge

        Re: Ok machine, I want you to...

        "dedicated to all designers/developers that had to deal with micromanaging types and did not kill them"

        We had a volume-at-11 shouting match between two of the client's directors in the middle of their general office as to how a particular batching operation should be carried out. Privately we thought it should be operator configurable and built that in. The configuration could be set to fit either of the directors' views - or anything in between and maybe more. During commissioning we set up something that looked reasonable. AFAIK it was never subsequently altered.

        1. Dagg Silver badge
          Pint

          Re: Ok machine, I want you to...

          "dedicated to all designers/developers that had to deal with micromanaging types and did not kill them"

          Had those; one wanted a button moved 1/5 of a pixel. Had a "UX graphic designer" who loved pastel colours which on her 2**100000000 colour monitor looked fantastic but on the crock-of-shit monitors the rest of us used they all looked the same.

          I need a beer.

    2. Jason Bloomberg Silver badge
      Paris Hilton

      Re: I smell BS

      Wonder how I’d apply optimisation like bit shifting and Boolean operations to any of the above?

      That's where you are going wrong, still trying to tell your 'computer code writing secretary' how it should be done rather than letting it get on with it.

      "Computer! Write me a program to control my country's defence system". It's that easy. You are over thinking it.

      1. Charles 9

        Re: I smell BS

        Because we realize computers can't predict what they don't know. Heck, WE can't handle a defense system properly without all the parameters. We can't expect man-made computers to be any better.

        1. Charles 9

          Re: I smell BS

          Replying to my own post, the main reason we need human programmers is that we never get the complete specs for a job at the start. Something always gets left out that then needs to be addressed in a hurry. A computer needs to be able to handle the job even when the specs change (sometimes drastically). It also needs to be able to handle vague specs and know whether to just assume something or to ask for more specifics, which may not be forthcoming.

  14. simonb_london

    1997? Pah!

    In 1987 I had a book for programming the Atari ST with the operating system (TOS) 68000 disassembly listing in half of it.

  15. joeldillon

    'The tools for writing apps that write other apps are also showing up in various programming languages, like Go'

    That link is a facility Go has added for running tools like Yacc as part of its build process. Yacc is 40 frigging years old and was old hat then (Yet Another Compiler Compiler). The facility is only necessary because Go manages its own build process; if you were writing C it would just go in your Makefile.

    The long-standing existence of tools which generate code to a programmer's specifications is ummm really not a good argument for most code being 'written by AIs' in the future or whatever they're yammering on about.

    Note how stuff like this is always 20 years in the future, by the way? In 20 years' time it'll be 20 years in the future too.

    1. Rafael #872397
      Holmes

      Re: In 20 years' time it'll be 20 years in the future too.

      I predict that in 20 years' time whoever is making money out of predictions will still be making money out of predictions.

  16. Anonymous Coward
    Anonymous Coward

    The less you know, the sooner you will be replaced

    It seems the people who write such papers - and obviously have no idea what they are writing about - will be replaced first.

    1. Doctor Syntax Silver badge

      Re: The less you know, the sooner you will be replaced

      "It seems the people who write such papers ...will be replaced first."

      They have been and the replacements keep arriving.

  17. Anonymous Coward
    Anonymous Coward

    Bollocks.

    And where's my flying car, futurists?

  18. Prst. V.Jeltz Silver badge

    This is akin to someone saying "Robots will soon be able to breakdance"

    To which my reply would be "Well in that case, long before that they should be able to bring me a tray full of beer, wash my car, cook me a lasagne, clean the bathroom, drive me to work and home from the pub, mow the lawn, do the shopping, weed the garden, etc." - and none of that has happened yet.

  19. Anonymous Coward
    Anonymous Coward

    The problem with building blocks is that you need to understand their internal constraints. Like Expert Systems - AI is going to have to "explain" what it has done and what assumptions have been made.

    I recently used an Arduino library for an RTC. The chip specification notes that you must access the registers in a particular order if you want to freeze the yyyymmddhhmmss fields' contents as a single atomic value. Otherwise it is possible to get the wrong time when a field rolls over between field accesses. I have no idea if the library has taken that into consideration.
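
    The sort of thing I mean, as an Arduino-style fragment (a hypothetical RTC at address 0x68, assumed to latch a shadow copy of its time/date registers when a read burst starts at register 0x00 - check the actual chip's datasheet; this illustrates the atomic-burst idea, it is not a drop-in driver):

    #include <Wire.h>

    const uint8_t RTC_ADDR = 0x68;

    struct RtcTime { uint8_t sec, min, hour, day, date, month, year; };  // raw BCD registers

    // Assumes Wire.begin() has already been called in setup().
    bool readRtcAtomic(RtcTime& t)
    {
        Wire.beginTransmission(RTC_ADDR);
        Wire.write((uint8_t)0x00);               // point at the seconds register
        if (Wire.endTransmission(false) != 0)    // repeated start, keep the bus
            return false;

        // One burst read of all seven registers, so the chip's latch (if it has
        // one!) gives a consistent yyyymmddhhmmss rather than seven separate
        // reads that can straddle a rollover.
        if (Wire.requestFrom(RTC_ADDR, (uint8_t)7) != 7)
            return false;

        t.sec  = Wire.read();  t.min   = Wire.read();  t.hour  = Wire.read();
        t.day  = Wire.read();  t.date  = Wire.read();  t.month = Wire.read();
        t.year = Wire.read();
        return true;
    }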

    1. Mage Silver badge

      I have no idea if the library has taken that into consideration.

      I spent more time fixing bugs in libraries than writing my own code. Many are not properly tested, or not properly designed, or don't do what they claim, or were written by students etc. for the experience because everyone else was too busy.

  20. Tom 38

    Warning: scientists

    Warning: there is a mahoosive generalisation coming up.

    Scientists of all disciplines suffer from being very smart, but generally focused on one single sphere of science. The combination of this extreme expertise in one sphere, and general smartness leads them to underestimate the complexity of tasks that are from other spheres - particularly coding (a scientist that can actually program is a rare thing).

    Since the 90s, scientists of all spheres have had to write more and more code. Almost universally, they hate it - they know their field and model perfectly, so why won't the stupid computer do it? - and they resent that they need real developers to help them, and they are protective of the code they wrote, even when it is a pile of dung.

    To these people, the "problem" of programming is just an extension of <some problem from their sphere>, and the "boring, rote work of programming" will be a solvable problem.

    1. Daniel von Asmuth
      Boffin

      Re: Warning: scientists

      Funny, almost all the programmers I met have been scientists with only a small percentage of computer science graduates amongst them.

  21. Brewster's Angle Grinder Silver badge

    We already have computer programs that write programs for us. They're called compilers.

    1. Brewster's Angle Grinder Silver badge

      And x86 chips analyse the instructions and attempt to organise them for the most efficient execution. And garbage collectors automate memory management. And the JavaScript VMs dynamically analyse your code to understand what it's doing in practice and optimise it.

      So the stack continues to get taller and smarter, and the code more generic. (Bonus side effect: tiny changes can flip performance from exceptional to atrocious, depending on whether or not the stack understood what you intended.) But there are still programmers sitting on the top, making it work.

      Good programming could get even harder as you have to mediate between a semi-intelligent agent working in natural language and a computer that's trying to ape its foibles.

  22. Anonymous Noel Coward
    Terminator

    Mwahaha

    Operation Skynet is going all according to plan.

  23. hatti

    two words

    paperless office

  24. Sykowasp

    If this was a study by experienced software engineers ... but it's not.

    I can see it taking over some aspects of business processes that are currently achieved via programming against a dataset. For example, reports etc. are manually programmed or configured, yet in the future AI will understand 'give me the KPIs on this XYZ' and do things automatically.

    But real programming? When nobody can specify anything clearly when a bit of software engineering is required?

    At the worst, current programmers will migrate to being 'specification engineers', i.e., turning the vague useless specification a business provides into something concise and specific, that a code generator - AI or not - will turn into a working system, removing all the drudge work (but IDEs and Frameworks seem to halve the drudge work every 5 years anyway already, without AI) and filling in the actual code to achieve the end aim.

    So yeah, better tools to remove drudge work for programs that don't need much flexibility. But that isn't what anyone understands by 'AIs will write software'.

    Btw, we'll need twice as many 'specification engineers' to do a given amount of work in 2040 as we need 'software engineers' to do the same amount of work today. That's how these things seem to work, so why would it change?

  25. Anonymous Coward
    Anonymous Coward

    The key word...

    ... of this article occurs in the first sentence.

    "Speculate".

    People were speculating in just the same way in the 1980s. The main thing that has changed since then, I hope, is that more software people (if not journalists) have adjusted their speculations to be more in line with hard reality.

  26. JeffyPoooh
    Pint

    One can make a lot of money by betting...

    Requirements driven programming requires AI, and likely Strong AI.

    One can make a lot of money by betting that Strong AI isn't as close as some folks seem to believe. This "AI is here soon" nonsense has been going on for many decades.

    Every now and then I'll see somebody actually admit that what they're talking about might be something like 50 or 75 years in the future. At that point I'll agree that it's certainly possible.

    PS: You might find that simple coding is faster than trying to explain everything about the Universe to a bucket of sand.

  27. handleoclast

    I wondered how long it would take

    Before some moron came up with this idea (AI couldn't have come up with this idea because AI isn't that fucking stupid).

    Allow me to summarize:

    Add 1 to 4GL giving 5GL.

    1. Destroy All Monsters Silver badge
      Terminator

      Re: I wondered how long it would take

      This is just a marketing term.

      Meanwhile: Search-based software engineering.

      Toasters with Guns soon.

  28. Fading
    Mushroom

    2 years after...

    The entire internet dies due to the UNIX 2038 bug? We'll be back coding with punch-cards in 2040.

  29. NOTiFY

    The Last One

    See "The Last One" circa 1981

    https://en.wikipedia.org/wiki/The_Last_One_(software)

    A casual phone conversation in early November sent David Tebbutt scurrying off to deepest Somerset where he discovered a program which could become the last one ever written by a human being.

    http://www.tebbo.com/presshere/html/pw8102.htm

  30. IGnatius T Foobar
    FAIL

    It's so easy!

    COBOL has syntax so easy to understand that even a manager can read it. Low level programmers will be a thing of the past.

  31. douggack

    In 1965, I was warned...

    I got into this crazy business in November of 1965. I started out in the 'tab room', running sorters, collators, etc. Early on, an old guy I worked with said I was doing the right thing, but I should avoid programming, because they were going to get computers to write their own code. Fifty-two years later, I'm still coding my fingers to the bone. We're not going to run out of work any time soon.

    Doug
