What’s the real point of being a dev? It's saving management from themselves

What’s the point of being a developer? My experience of being one taught me that it isn’t primarily the coding – it’s actually much more important than that. Last week at Parliament, discussion touched on the role The Register plays in the media landscape. The chairman of the House of Lords’ enquiry on Artificial Intelligence …

  1. Anonymous Coward

    When I was in college, we were told that 4G languages were the future. Drag and drop programs.

    I don't know how widespread that idea was at the time but it seemed obvious even then that it was bollocks.

    1. Doctor Syntax Silver badge

      "When I was in college, we were told that 4G languages were the future. Drag and drop programs."

      Wasn't that 5GL?

      1. Anonymous Coward

        No, it was 4GL. I think 5GL had been forgotten about by then. 4GL was still waiting for its turn to be the next big thing.

        1. James 51

          When I was a lad, 4th generation languages were Java and the like. C++ was 3.5 or 4 depending on which textbook you read.

          1. Anonymous Coward

            Maybe I've got the number wrong, but google suggests I was right.

            Either way, it didn't pan out.

    2. Anonymous Coward

      Oh this is a good laugh - the Wikipedia entry lists "Visual FoxPro" under their 4GL entry!

      1. Anonymous Coward

        4GL is a UI glued to a database, which I think VFP was.

        The only recent one is Oracle's abysmal Application Express, where you code webapps in PL/SQL. Blurgh.

  2. Doctor Syntax Silver badge

    "Deep in the mists of time, some 25 to 30 years ago, there was a belief that software production would change radically."

    I'm glad you went on to explain which particular belief you were thinking of. There've been so many "software that writes itself" fads. If one actually works it'll take everyone with real experience by surprise.

  3. sandman

    The Fads of Doom

    I've had the dubious pleasure of watching senior management's adoption of fads damage or destroy their companies. My favourite was "Shockwave is the future of the internet, we are going to create all our training materials using it!" (For youngsters: think Adobe Flash, but bigger and capable of doing more stuff.) The few of us with experience went "NO! That way lies madness!" So it proved: instead of an estimated take-up of 3,000, only 3, repeat 3, users got onto it (I'm not sure how, given that even the contact details and help were embedded in it). Not long after, the long-established company died.

    1. HmmmYes

      Re: The Fads of Doom

      I did come across a smallish but long-established company that bet the farm on Silverlight (I had to google this, I'd forgotten it).

      How I'd like to have been a fly on the wall when MS pulled the plug.

      Never invest in a language/tool supplied/patented by a single company esp. MS.

  4. Anonymous Coward

    Next up: "Serverless".

  5. Anonymous Coward

    OO came from industry

    At university twenty (or so) years ago the research was all about functional programming, plus some C for the applied world. OO was some technique from commercial tools that the academics were taking on. Certainly business and academia were aligned in the OO hoopla not much later.

    Meanwhile, twenty years later, the functional programming techniques are finally available in the commercial tools.

    Edit: OK, 24 years ago and 24 years later.

    1. HmmmYes

      Re: OO came from industry

      Really?

      20 years ago at Uni it was all OO via Java.

      There was some functional, mainly Haskell, if you were lucky.

      Have a google for Simon Peyton Jones doing his 'Haskell is useless' thang.

      1. Anonymous Coward

        Re: OO came from industry

        OK so I need to recognise that it is more than twenty years since I left uni.

        It's true that by 1996 Java was being taught in academia and the commercial world was singing the same tune, and some were lucky to get some Haskell. But a little earlier at uni, in 1993/1994, we were doing C, and Modula-2 with its exciting new 'modules' - OO was a rumour. I suppose a pedant could argue that OO was invented in the commercial world, with Smalltalk at Xerox.

        TBH, Andrew seems to be talking more about small chunks of software than the fine-detail objects. And on that basis, if you let a small chunk talk over an HTTP port I'd say you've got a microservice. The hype just continues.

        Thanks so much for the Simon PJ video, just fantastic :)

      2. Anonymous Coward

        "20 years ago at Uni it was all OO vi Java."

        Java was late to the party. It's just that Java went the way of a fundamentalist OO language, spreading the hate about OO... as if Smalltalk was not enough.

        Other languages introduced OO features without forcing you to become a worshipper - just a user of the good parts. E.g. Turbo Pascal got OO features in 1989, Apple's Object Pascal somewhat earlier.

      3. EarthDog

        Re: OO came from industry

        Don't forget Java came from Sun Microsystems

  6. HmmmYes

    Yeah, I'd have said the hype was on 4GL rather than OO.

    OO was just another way of doing stuff - the stuff that 4GL was supposed to do.

    I was never that sure what a 4GL language/system was - other than the future. The best I could get was some vague concept of it being a declarative language. Of course, this left out the detail of whether they would be general-purpose - 4GL for everything - or limited to certain problem domains. I think AI was meant to fix all that; in the meantime some pleb programmer was meant to implement some very high-level spec of 'Do everything I ask, cheaply, make $$$$ profits'.

    The OO diversion was minor in comparison. There was some idea that OO would fix the hard stuff that procedural programming runs into. Of course, everyone had different ideas on what OO was - Simula, C++, Smalltalk. All, bar Smalltalk, missed the concept of messages and state being important, esp. C++, which decided that the problem of controlling state - the biggy in all software: state, state, state - would be fixed by hiding all the state in 1000s of classes/objects.

    To give Objective-C its due, it did provide some help with a message-based runtime. C++ never managed to define a fucking ABI until 2005ish, or a decent C++ language compliance test.

    There does seem to be a 10-year cycle on this. Who remembers Bill Joy and his 'Grey stuff is going to take over the world!!!!' phase? Of course, he'd just brought out Jini, which was going to put Java everywhere and network and stuff. Me? I can't get different Java-based programs running, never mind syncing.

    Software is very hard. It does not seem to respond to throwing more people at it - mainly as there are fuck-all people with the skills to throw at 'it'. Neither does it seem to be beaten down by throwing capital at it - witness the current effort of GE trying to turn itself into a SW company (just like Apple, honest, we are management geniuses) by hiring 1000s of software people - well, people who claim to write software. If the skills are not there, a quick 2 weeks at 'code camp' will get them up to speed ....

    1. Doctor Syntax Silver badge

      "I was never that sure what a 4GL language/system was"

      I can only speak from experience with Informix 4GL. Essentially it was a structured procedural language (though why retain "LET" as a keyword for assignment statements?) which also incorporated SQL, form handling and report handling. The form and report handling incorporated event handling ("ON" as a keyword).

      Essentially it added higher-level abstractions (databases, menus and reports) to what previous languages had had, just as 3rd generation languages did in comparison to assembler, and as assembler did in comparison to machine code.

      I never quite escaped the feeling that the original non-procedural Informix applications, Perform and Ace (a report writer), were more truly what was touted as 4GL at the time, but, within its restrictions, 4GL itself was more practical - serious work with Perform & Ace required linking C code into their interpreters.

      The OO approach to incorporating SQL has been very different, as far as I've experienced it. There would be some object with a string component into which one would build the SQL with string operations, rather than directly writing it in the source as a statement in its own right. This seemed clumsy. It was no surprise that New Era, Informix's own attempt at introducing OO into 4GL, which did just this, was a failure. However, as time went on, Informix 4GL's approach of preparing a named statement from a string and then executing it became more practical than the original for many purposes, and still less OTT than the OO approach.
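
      For contrast, the "prepare a named statement from a string, then execute it" pattern is roughly what today's 3GLs give you anyway. A minimal sketch in Go's database/sql (the driver, connection string and table are invented for illustration, nothing Informix-specific):

      package main

      import (
          "database/sql"
          "fmt"
          "log"

          _ "github.com/lib/pq" // any database/sql driver will do; this one is assumed
      )

      func main() {
          db, err := sql.Open("postgres", "dbname=example sslmode=disable")
          if err != nil {
              log.Fatal(err)
          }
          defer db.Close()

          // The SQL lives in an ordinary string, not in the language's own syntax...
          stmt, err := db.Prepare("SELECT name FROM customers WHERE region = $1")
          if err != nil {
              log.Fatal(err)
          }
          defer stmt.Close()

          // ...and the prepared statement is executed later, possibly many times.
          rows, err := stmt.Query("EMEA")
          if err != nil {
              log.Fatal(err)
          }
          defer rows.Close()

          for rows.Next() {
              var name string
              if err := rows.Scan(&name); err != nil {
                  log.Fatal(err)
              }
              fmt.Println(name)
          }
      }

      Still string-building at heart, but at least the statement is prepared once and given a handle, which is close in spirit to the prepare-then-execute approach described above.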

      What we seem to have ended up with, for the most part, is 3rd generation procedural languages with OO features added. They don't really represent a higher level of abstraction and this, I think, is a failure.

      1. Gordon 11

        "Essentially it added higher-level abstractions (databases, menus and reports) to what previous languages had had"
        You mean it added new syntax to the language to do things that could be (and in practice probably were...) done by library code.

        What always amazed me was the unwillingness of academics and managers to believe that you could, and should, write commonly-needed functionality into callable library code.

  7. knarf

    what a load of Tosh!

    OO is a very big success and if you're coding and not using OO then I don't know wtf you're doing with your code.

    This sort of airy piece belongs in the broadsheet where snr managers can talk bollocks over it.

    1. richardcox13

      Re: what a load of Tosh!

      Exactly.

      OO is the basis of interface segregation, which allows dependency injection, inversion of control and everything else that enables testability.

      While still not Brooks's Silver Bullet, it is the closest we've come.

      Pre-built components ... that's a different matter.

      1. Raumkraut

        Re: what a load of Tosh!

        I wonder if the irony was lost on the author: they were rolling their eyes at using OO in development, but were happily using UNIX on the servers. UNIX, of the "do one thing, and do it well" creed; a bunch of self-contained, independent bits of code, easily bolted together through stable interfaces to provide complex functionality.

        1. Anonymous Coward

          "but were happily using UNIX"

          Sorry, the "chaining" in Unix has very little to do with OO. It's a very old and outdated model born when little code and data could stay in memory, so you had to "chain" applications and move data from one to another through I/O. OOP requires memory to work well.

          Not surprisingly, Unix is still very tied to C and abhors anything developed later. Even C++ is an attempt to make C developers endure OOP, and thereby it is one of the worst OO languages around.

          The lack of good OO languages, especially for quick GUI development, is one of the reason why Linux went nowhere as a desktop system. It's much easier to use OO widget libraries, compared to procedural ones.

      2. Anonymous Coward

        Re: OO is the basis of interface segregation which allows ...

        RE: OO is the basis of interface segregation which allows dependency injection, Inversion of control and everything else to enable testability.

        You're right that OO enables all that you state (and I love me some C# in this regard) but all three of the examples you gave are fully possible without 'classic' OO.

        Go has no inheritance (it is compositional - has-a not is-a) and yet easily handles DI, IoC and testing. It also has interfaces everywhere, again without 'classic' OO.
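
        To make that concrete, a minimal sketch (invented names, nothing more) of constructor injection in Go, with an interface but no inheritance anywhere:

        package main

        import "fmt"

        // Notifier is satisfied by any type with a matching Send method;
        // there is no "implements" declaration and no base class.
        type Notifier interface {
            Send(msg string) error
        }

        // EmailNotifier is one concrete implementation.
        type EmailNotifier struct {
            From string
        }

        func (e EmailNotifier) Send(msg string) error {
            fmt.Printf("email from %s: %s\n", e.From, msg)
            return nil
        }

        // Alerter has-a Notifier; the dependency is injected through the
        // constructor, which is about all the IoC most Go code needs.
        type Alerter struct {
            notifier Notifier
        }

        func NewAlerter(n Notifier) *Alerter {
            return &Alerter{notifier: n}
        }

        func (a *Alerter) Alert(problem string) error {
            return a.notifier.Send("ALERT: " + problem)
        }

        func main() {
            a := NewAlerter(EmailNotifier{From: "ops@example.com"})
            _ = a.Alert("disk nearly full")
        }

        In a test you inject a fake Notifier that records what was sent; no mocking framework, no class hierarchy.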

      3. matjaggard

        Re: what a load of Tosh!

        Some of the conclusions here are correct but O-O is so important to modern development that it's a crazy example.

      4. Anonymous Coward

        "Pre-built components ... that's a different matter."

        In the realm of UI widgets, OO helped a lot in creating good, easy-to-use and easily reusable libraries.

    2. find users who cut cat tail

      Re: what a load of Tosh!

      Sure, OO improved a number of things. At the beginning.

      Then it became a religion. If you do not transform all code into a tangle of silly NounizorFactoriers you are a heretic to be burned at the stake.

      OO created a class [no pun necessary] of ‘programmers’ who, when implementing a procedure to sum some numbers, as a side effect produce a hierarchy of classes for numeric types, arithmetic operators, representation of cardinals, abstraction of lambda calculus and a bunch of things that you never imagined could be related to the problem. Yes, and then their implementation also forgets to add the last number...

      1. Bronek Kozicki

        Re: what a load of Tosh!

        OO is not the only way to enable decoupling (testability, dependency injection, high cohesion etc.). However, it is relatively simple to implement (in a language) and use (for the programmer) with static type safety, hence its popularity. It comes at the cost of design artifacts which need to be studiously maintained, i.e. interface classes.

        The alternatives are polymorphism without explicit interfaces, either in a dynamically typed language (duck typing) or in a statically typed one (generics). Since dynamic typing makes early error discovery and enforcement of design tricky, the only tools remaining for large projects are either OO (well established) or generics (tricky to use).

        Personally I am not a fan of OO, because of the maintenance effort of the interfaces it drops onto developers, the stupid designs it enables (a singleton which is a factory to a ... value type because it only has getters and setters, WTFF!) and its general misuse as a very heavy hammer in the hands of a 500lb gorilla. However, there are (not that frequent) situations when dynamic dispatch of a few closely related functions, collected under the single umbrella of an interface class, makes sense - but this requires consideration and understanding of the associated (both maintenance and runtime) costs! More often than not, a single function pointer (or perhaps a small lambda) works better, with 10% of the associated cost in terms of lines of code which need to be read and understood every-f*g-time someone needs to touch the thing.
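
        A minimal Go sketch of that "single function pointer (or small lambda)" alternative (names invented): the one point of variation is a plain function value rather than an interface class.

        package main

        import (
            "fmt"
            "strings"
        )

        // Renderer is the whole "interface": one function type, no hierarchy.
        type Renderer func(record string) string

        // Export takes its single point of variation as a plain function value.
        func Export(records []string, render Renderer) []string {
            out := make([]string, 0, len(records))
            for _, r := range records {
                out = append(out, render(r))
            }
            return out
        }

        func main() {
            // Production behaviour: shout and add a delimiter.
            lines := Export([]string{"alpha", "beta"}, func(r string) string {
                return strings.ToUpper(r) + ";"
            })
            fmt.Println(lines)
            // A test passes a trivial lambda instead; nothing to mock.
        }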

        As for AI? Neural nets have their place, but they are not going to save us from ourselves.

    3. Anonymous Coward

      Re: what a load of Tosh!

      The problem is, that's not the limit of what OO was supposed to be.

      All this inheritance, polymorphism etc. that we use daily is great, but it was never the big idea of OOP, just a sideline.

      As someone else alluded to, it was supposed to be all about the interconnections, not the nodes - about the messages and how objects communicated. That's largely ignored (though Go can be quite good at this), whilst the focus on the sidelines is sometimes taken to extremes in codebases where everything is something else and the sleight of hand leads to questioning where the actual code is.
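
      For what it's worth, a minimal Go sketch (invented names) of that "it's about the messages" idea, where sender and receiver know each other only through the messages on a channel:

      package main

      import "fmt"

      // Request is the message; the sender knows nothing about the receiver's type.
      type Request struct {
          Payload string
          Reply   chan string
      }

      // worker is an "object" defined entirely by the messages it accepts.
      func worker(inbox <-chan Request) {
          for req := range inbox {
              req.Reply <- "processed: " + req.Payload
          }
      }

      func main() {
          inbox := make(chan Request)
          go worker(inbox)

          reply := make(chan string)
          inbox <- Request{Payload: "invoice 42", Reply: reply}
          fmt.Println(<-reply)
          close(inbox)
      }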

      1. This post has been deleted by its author

  8. Anonymous Coward

    I'd kill for OO right now

    The code I have to maintain is ancient (bordering on 30 years), huge, spaghetti, horrendous, incomprehensible, and patched to hell and back. It has several similar versions for each country, bugfixes don't propagate their way round, and similar features are coded in different ways for each branch. It's all based on transactions and message passing, so reusable OO would be a lifesaver.

    If management had been listening to the devs, I guess it would have been transitioned to C++ around 2000 or so. They are trying to dig their way out of the pit they've made, but I'm curbing my enthusiasm as they've decided to go with Java, where you can buy devs by the pound but which is completely unsuited to a high-transaction system such as this.

  9. Korev Silver badge
    Coat

    Hadoop

    Some bright Spark will mention Hadoop/bigdata soon...

    1. Anonymous Coward

      Re: Hadoop

      You just did.

  10. Michael H.F. Wilkinson Silver badge

    I aten't dead!

    OO is certainly not dead. I use C++ for most of my more complex pieces of code, although just plain C is used for most of the heavy-lifting image processing routines (I later tie some OO ribbons around that code to make it reusable ;-) ). What is essentially dead (bar some trivially short programs like "Hello world") is truly monolithic code, i.e. code that doesn't split functionality into separate functions and procedures, and where all flow control is through goto statements (preferably to labels you compute on the spot). THAT sort of code should be dead and buried by now.

  11. Robert 22

    I worked for a defense R&D organization. One of my colleagues was dabbling in AI at the end of the 1980s. Management decided that he should give a presentation on AI at a high-level departmental conference. When he gave them a dry run of his presentation, they were horrified to discover that it was of an introductory nature and wasn't going to impress the audience - the result was an immediate change to the conference program.

  12. Metal Marv
    Unhappy

    Devs getting listened to?

    In my personal experience, developers, engineers, etc. rarely, if ever, get listened to, and if they do, their words are misinterpreted or twisted to fit an agenda.

    So I have given up trying to voice any genuine concerns and now just take the King's shilling.

    1. Hazmoid

      Re: Devs getting listened to?

      Whilst ensuring that all requests are in writing for the CYA aspect ;)

  13. Jim 59

    OO

    some 25 to 30 years ago, there was a belief that software production would change radically...

    ...This was O-O, or object oriented software. Software production would radically change, the experts predicted... Of course, software production didn’t change radically into the component utopia that the academics had envisaged...

    I confess to leaving a C++ course in 1991 almost overcome with excitement about OO. At the time I was working for Texas Instruments, and OO seemed like an unbelievably powerful way to solve engineering problems. Select a few objects, glue them together, et voilà. An end to repetitive programming. An end to complexity. Write and exchange objects. Share and enjoy.

    High-level languages embraced OO. Even interpreted ones like Perl and VB. And whatever the current flavour-of-the-month language is, it is almost certainly OO.

    But orgasmic object sharing never materialized. Instead, standard "object libraries" became part of the language. Instead of being a super-duper, custom-built, polymorphous, multiply-inherited, operator-overloaded panacea that we would exchange and embellish and re-use, OO turned into a boring old standard library. Basically, OO == stdio.h.

    The OO model is still the best way to handle complexity as an application grows. Only the code re-use remains elusive.

    OK, now I must return to that program. Think I'll re-use that class from 2 years ago. No, it would be quicker to write a new object. After all, I am better at the language now, and conditions have changed, and the hardware is 10 times faster now, so that optimized cache is no longer needed, and, and...

    1. Paul Hovnanian Silver badge

      Re: OO

      "Only the code re-use remains elusive."

      I'm not so sure about that. OO didn't enable re-use, although it does enforce the modularity and structure needed to make it easier. But re-use has been with us from the first #include directive. How many people here are going to sit down and roll their own printf() or Xlib functions?

      Anecdote: Back when I did some software work for a Seattle area aviation company, we were interviewing candidates for a dev/admin position. One guy presented us with a code example in Perl, a program he had written to transfer some files between two servers. Good coding style, lots of useful comments. He obviously knew Perl. But it was about 6 pages long, forking off to launch an ftp child process and parsing the stdout and stderr responses. I asked him if he'd ever heard of CPAN. Nope. Well, here's a similar program I'd written. In 6 lines, the first being 'use Net::FTP;'. He wasn't so much of a code wizard after all.

      I'm not so much a fan of OOP (programming) as I am the underlying OOA (analysis) part of the process. Identify the interfaces, use cases, etc. properly and I don't care if you write your stuff in C or Basic.

      1. AdamWill

        er, wot?

        "He was't so much of a code wizard after all."

        Uh...well, that's an odd conclusion to draw. You say yourself, he clearly knew how to code quite well, with good style. Isn't that the *hard* part? Why would you draw teh conclusion "he wasn't so much of a code wizard" from the premise "he didn't know about CPAN"? Knowing about CPAN is the *easy* part. You solved it yourself in ten seconds!

        So now they guy *had* heard of CPAN and would be able to utilize his apparently unquestioned coding skills more effectively, why wouldn't you just hire him instead of (I infer) disqualifying him for the sin of not knowing about CPAN?

        1. Paul Hovnanian Silver badge

          Re: er, wot?

          "he clearly knew how to code quite well, with good style. Isn't that the *hard* part?"

          Not really. Ever since public school, the greatest emphasis has been on how to study, where to find resources and how best to use them. Sure, speling and neatness counted for something. But there's no way we were going to have our heads filled with everything we needed for life by graduation day. And that goes doubly so for fields where new discoveries and developments are continuously being made.

          So, I showed someone where CPAN was. What happens after I've been run over by a bus and the job is all his and he has to move beyond what is available there?

          Even worse story: Another applicant claimed to be a Perl programmer. So I opened up one of the top-level program files to show him our programming style, comments, revision control metadata, etc. As he was staring at the first few lines of the file (right at line 1: '#! /usr/bin/perl') he asked, "What language is this?"

  14. Lysenko

    ...and so the wheel turns...

    Software production would radically change, the experts predicted, so developers would be producing Lego-like components. What were called “monolithic” programs would become history.

    ... and we're back to the lego components, but this time they're in your node_modules directory and end in .js, not .vbx.

    Of course there are some genuine breakthroughs. The event loop in Windows 2.0 (sorry, Node.js) is a groundbreaking idea that has given us cooperative multi-tasking async programming, the UNIX (sorry, microservices) model of small programs cooperating to achieve a larger task is totally innovative, and as for schema-less data storage with MUMPS (sorry, NoSQL), well, it simply never occurred to anyone before.

  15. Commswonk
    Thumb Up

    Nice One...

    So the role of the programmers was one of silent insubordination...

    Why is it that "silent insubordination" sounds so much less naughty than "dumb insolence"?

    As far as I can see there is no real difference between them.

    1. Mike 16

      Re: Nice One...

      The difference can boil down to the sort of case where a developer, on their own time, noodles around with a "Plan B", because mgmt's Plan A is clearly bollocks, and when the wail "The sky is falling and we have two weeks to get this right or we shut the doors" goes out, a diving catch with the "hobbyist" Plan B saves the day. The alternative - the time I figured the architecture astronauts must have a handle on what looked like a risky practice, because mahogany row had so much faith in them, so I stuck to my assigned tasks - did in fact result in closing the doors.

      You're not being insolent when you do what you are paid to do while being paid to do it, but build an emergency exit on your own. Several companies I have worked at actually encouraged employee exploration, often with company materials and even some portion of the work week.

      BTW, re 4GLs: I just ran across a FARGO manual on bitsavers. For those not already spending their pension, this grew up into RPG. It was designed to allow folks with extensive experience on plugboard-programmed electro-mechanical accounting machines, but none with those new-fangled computers, to use the IBM 1401. The "You don't need to be a programmer" notion has been a thing for some time. So has the "You don't need any domain knowledge, just a C.S. degree" notion, to be fair.

  16. bfwebster

    I literally wrote an entire book on this syndrome

    Back in 1995 -- after shipping a desktop publishing system built using Objective-C, for NeXTSTEP -- I published a book titled "Pitfalls of Object-Oriented Development" (M&T Books, 1995). The first few chapters were pretty much about this tendency. Here are just a few of the many pitfalls discussed:

    Pitfall 1.1: Going object-oriented for the wrong reasons.

    Pitfall 1.2: Thinking objects come for free.

    Pitfall 1.3: Thinking objects will solve all problems.

    Pitfall 1.4: Thinking that object technology is mature.

    ...

    Pitfall 2.1: Not educating and enlisting management before the fact.

    Pitfall 2.2: Underestimating the resistance.

    Pitfall 2.3: Overselling the technology.

    Pitfall 2.4: Getting religious about object-oriented development.

    Pitfall 2.5: Not recognizing the politics of architecture.

    Pitfall 2.6: Getting on the feature-release treadmill.

    Pitfall 2.7: Betting the company on objects.

    ...

    Pitfall 3.1: Adopting objects without well-defined objectives.

    Pitfall 3.2: Cramming objects down the developers’ throats.

    Pitfall 3.3: Abandoning good software engineering practices.

    Pitfall 3.4: Not defining and using an effective methodology.

    Pitfall 3.5: Attempting too much, too soon, too fast.

    And so on, and so forth. What I have since discovered, as per the article above, is that this applies to any new technology or methodology. Some things never change. ..bruce..

    1. Bronek Kozicki

      Re: I literally wrote an entire book on this syndrome

      And to think that this book was printed around the time when I was trying to understand what is so great about OOP, and I never heard about it. Oh well, the omission is rectified now; an order will be delivered in 2 weeks or sooner.

  17. Tim 11

    Component Reuse

    I think the author is conflating object orientation with component reuse. Whilst neither has been a panacea for all software development problems (and I don't think anyone serious ever claimed they would be), both have been successful and improved developers' lives.

    OO (or at least the central concept of encapsulation) is pretty much universal in proper languages, and so is reuse of standard (often FOSS) code - even if it's not necessarily objects, it's JavaScript functions and microservices.
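
    As a minimal illustration of that encapsulation point (a Go sketch, names invented): the state is unexported and reachable only through the type's methods, whatever label you put on the style.

    package counter

    // Counter hides its state; code outside this package cannot touch n directly.
    type Counter struct {
        n int
    }

    func (c *Counter) Increment() {
        c.n++
    }

    func (c *Counter) Value() int {
        return c.n
    }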

    Try going back to VB3 or plain K&R C and see how long it takes to build an application from scratch (like almost all apps were in the 80's and 90's) compared to what you can achieve with modern tools.

  18. Boris the Cockroach Silver badge

    I did my studying back then

    in O-O at the Open University (for those that took the first M206 and M301 modules all I'll say is... the frogs... the frogs... and then... the bacon aieeeeeeeeeeeeeeeeeeeeeeeee)

    <3 hrs of therapy later>

    I come from an industrial background, and to me O-O was a natural progression from the modular design I was used to using in CNC control programs (particularly macros, where you pass a series of parameters to the module and let it get on with it) and from Z-80 assembly programs, where again I always used a module-based technique.

    The OU taught that O-O was the next big thing, and we could forget all about the programming we'd done before, except for one minor and tiny tiny problem.

    It's all well and good using the O-O design, but when it comes down to it, it's just a way of hiding what actually goes on inside a class - you've still got to program that class, AND make sure it can't be broken by bad inputs.

    And then you inherit and extend, and hope to god what you've inherited does actually work.

    It's good for a top-down systems design, where everything is planned out in advance, but it's no different to any one of a number of different programming techniques we have to use in order to get any code out of the door (or widgets in my case).

    You always use what's best for the problem at hand, and that's what we always have to hide from the manglement when they're wandering around playing buzzword bullshit.

    1. Bronek Kozicki

      Re: I did my studying back then

      The thing about OO is that it is being used to either replace or supplement composition with excessive levels of abstraction. This comes from poor education - the abstractions are useful, e.g. for dependency injection or compilation firewalls, but not nearly as often as OO evangelists made us believe. You can have a perfectly testable, non-trivial program which for most of its dependencies employs simple composition (possibly sprinkled with generics), rather than interface classes and the other tools of OOP.

      The missing bit of education was learning why and where abstractions are useful, e.g. the open-closed principle, dependency injection etc. Instead, programmers are handed tools and commandments (define your interfaces! use design patterns for everything!) without a clear explanation of the purpose, and the results are poor indeed. Because, as it turns out, simplicity is a virtue - and it got lost along the way.
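
      A minimal sketch of that "composition, possibly sprinkled with generics" style in Go (all names invented): the dependency is a concrete generic container that is simply composed in, and a test builds its own with canned data instead of mocking an interface.

      package pricing

      // Store is a tiny generic container, composed directly into whatever needs it.
      type Store[K comparable, V any] struct {
          items map[K]V
      }

      func NewStore[K comparable, V any]() *Store[K, V] {
          return &Store[K, V]{items: make(map[K]V)}
      }

      func (s *Store[K, V]) Put(k K, v V) { s.items[k] = v }

      func (s *Store[K, V]) Get(k K) (V, bool) {
          v, ok := s.items[k]
          return v, ok
      }

      // PriceList composes a concrete Store rather than depending on an
      // abstract "IPriceRepository" interface class.
      type PriceList struct {
          prices *Store[string, float64]
      }

      func NewPriceList() *PriceList {
          return &PriceList{prices: NewStore[string, float64]()}
      }

      func (p *PriceList) Set(sku string, price float64) { p.prices.Put(sku, price) }

      func (p *PriceList) Total(skus []string) float64 {
          var total float64
          for _, sku := range skus {
              if price, ok := p.prices.Get(sku); ok {
                  total += price
              }
          }
          return total
      }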

  19. Anonymous Coward

    Every Profession Has That Role

    So we have the same role as Sergeants in the military: ignoring the higher-ups to protect them - and us - from their bad decisions and making things work.

    I think I'm going to sew some stripes on my shirts.

  20. Bucky 2

    It's important to remember that the "Object Oriented" Lego set that was being hyped isn't the same as what we call object oriented programming today.

    What they envisioned was a set of tools you would actually manipulate as icons on a screen. You would stack them together (because they all interfaced with each other automatically), and what you stacked together would be your program.

    Remember Wesley Crusher reprogramming the tractor beam in "The Naked Now"? He dragged pictures around on a screen.

    Everything was supposed to work this way by the late 80s or early 90s. The occupation of "programmer" was supposed to have gone the way of the buggy whip.

  21. billdehaan

    Been there, done that, pulled the stitches

    Like the author, I've been doing this for too many years.

    Every few years, a new management fad comes along that will "solve everything!", just you wait.

    In the mid 1980s, it was CASE tools. When CASE tools didn't pan out, they rushed to Expert Systems. That was followed by Artificial Intelligence, which in turn was superseded by Object Orientation. After that, it was "Software ICs", with Ada being the language to rule them all. Then it was Adaptive Interfaces. Finally, the Internet popped up.

    The best way you can tell you're dealing with a fad is that (a) management is busy cooing over how wonderful things are going to be, (b) little if anything developed with the new tech actually works at the moment, and, most importantly, (c) you keep hearing the phrase "if you're not doing X, you're going to be out of business in a year".

    There are many shops that despite being Ada-free, CASE-free, Expert Systems-free, OO-free, Adaptive Interface-free, and Software IC-free, have managed to keep humming along for 30+ years.

    Mind you, some of these fads have some very good ideas behind them, and often there's some solid tech as well. But there's a vast chasm between "hey, object orientation gives us more code re-use and lets us generate higher-quality code" and "object orientation will change the way you do business". Using the tech doesn't mean buying into the often absurd claims associated with it.

  22. Anonymous Coward

    Blaming the tools for Poor Life Choices

    The first seven or eight years of my "coding" career occurred within a scientific and professional environment. Every degree in every division of my department was either physics, chemistry, geology, biology, biochemistry, or chemical engineering. Programming was not a product. It was an accelerant. We already knew how to solve our own problems. But done by hand, our solutions might take years, or decades. Computers got us there faster. Note: we were all scientists. There was not a "professional manager" amongst us.

    ["Professional Manager." Where did we find such a Gordian phrase? Is it oxymoron or catachresis? Whatever it is, it is not real and we should not fall for such folly as to mental ascent that flying pigs, or professional managers, ever could now, nor ever will exist.]

    Each of us chose our own platform for solutions; some went to assembly, some to C, some to C++, a few to SAS. Each of our teams chose the right tool for the use at hand -- a concept entirely foreign in business today.

    Fast forward three decades and here we are. Managers are choosing platforms, both operating systems and software, by assessing the quality of marketing materials and by "chatter." This would not be so were the users choosing their own tools like we did 30 years ago. But that cannot be allowed because...

    Okay, why is this not allowed? Are "professional managers" superior in technical acumen and enlightened in wisdom that elevates their choices above the teams over which they reign?

    In a word, "ha!" But they occupy that superior position nonetheless.

    Old Billy had it wrong. Lawyers do not inhabit the nightmares of programmers. Managers do. If he lived in this day, his quote would be, "First thing we do, let's fire all the managers."

    In more than 30 years of programming I have never heard a peer say, "I didn't finish on time because the language was wrong." Professionals do not blame their tools.

    But I have heard many a "professional manager" say things like, "The compression ratio difference between the operating system and the network protocol were too high and it turned all the data chartreuse... and Bob, over there, didn't account for it in his ETL script. That's why we're going to be a month late on our first sprint."

    Hacks blame their tools (and managers think of us as nothing more). And we only protect the problem when we protect the manager.

    It's not my fault that someone eminently qualified for the shovel and the pick should have chosen the life of "manager." They just made a poor life choice.

  23. CSYBERGIRL
    Happy

    Software Developers Should be paid more than management and lawyers.

    That's the real crux of it, and a coder should be paid the most :)
