IBM Java CTO: Devs shouldn't have to learn Docker, K8s, 30 other things to deploy an app

At IBM's Index developer conference in San Francisco, on Tuesday, The Register sat down with Big Blue's Java CTO John Duimovich to talk about the Java programming language, IBM, the cloud and other developer-oriented concerns. Duimovich made the case for IBM as a cloud platform partner, based on the company's Java expertise. …

  1. Anonymous Coward
    Trollface

    Headline begs the question

    Why should devs have to learn Java?

    The article doesn't really answer that one.

    1. Destroy All Monsters Silver badge
      Windows

      Re: Headline begs the question

      Because Idris is not yet industry ready. (But then: Introducing Idris on the JVM and an Idris Android example)

      But you can go for Kotlin.

    2. Ken 16 Silver badge

      Re: Headline begs the question

      Because Java is the Cobol of the 21st century

      1. Anonymous Coward
        Anonymous Coward

        Re: Headline begs the question

        And PWA should be its nemesis

      2. JLV

        Re: Headline begs the question

        Meh, I've done both and rather prefer COBOL's simplicity.

        COBOL could have aged better as a cross-platform language if not for MicroFocus's stranglehold and its eye-wateringly expensive x86 compilers: $3-5K was not unusual.

        GNU OpenCobol was too little, too exotic, too late.

  2. Erik4872

    ???

    "The notion that as a developer you'll have to learn Docker, Kubernetes, and 30 other things before you can even deploy an app is something I'd like to get rid of,"

    My brain is hurting. As an infrastructure person, I thought the endless parade of strangely-named tools, products and frameworks was to abstract away the hardware. You know, push the Build button and your code magically deploys to a serverless, infrastructureless layer on your cloud of choice.

    1. Claptrap314 Silver badge

      Re: ???

      He dropped some howlers in this interview, but this line actually makes sense. I know folks here like to dump on DevOps, but it looks to me that what he is saying is that a common view of DevOps is in fact toxic.

      People that talk about DevOps as "moving operations responsibilities to the left" or something like that are in fact stating that their devs need to acquire ops knowledge. Problem 1: Knowledge doesn't cut it. You need expertise. It's really, really hard to develop expertise in two disparate fields. And software engineering and operations are two quite disparate fields. Problem 2: Throwing bodies at a problem doesn't scale. You need to do things better. Problem 3: Waterfall was a strawman process that, according to the man who created it, is "doomed to fail". If you're thinking in waterfall terms, you're not thinking about proper DevOps.

      DevOps is using software to solve the operational problems inherent in growth generally and in microservices in particular. Good software engineering is first about understanding the problem space, and then figuring out how to confine the problem to code so that users don't experience it. A proper DevOps team creates simple tools for devs to use that enable them to put applications into production that meet the rigours of operational excellence without having to explicitly engage operations.

  3. Destroy All Monsters Silver badge
    Windows

    Why!!!

    He said the company sees quite a bit of interest in Node.js, particularly for front-end applications

    The gate to immense technical debt and technological craters ripped open by 20-something webmonkeys barely knowledgeable about what they are doing.

    Plan to throw away a Node.js application. Hell, plan to throw away two.

    1. Korev Silver badge
    2. sisk

      Re: Why!!!

      Node.js has yet to be my first choice of languages for any project, but it's not that bad. Then again, I still usually use PHP when I'm just messing around with quick little stuff for my own personal use, so maybe I'm just getting to be an old fart.

      1. Orv Silver badge

        Re: Why!!!

        Node.js is pretty nice when developing client/server stuff using websockets, since you can work with the same socket libraries on both sides. It greatly reduces the mental gear-shifting you have to do to go back and forth between browser-side Javascript and server-side PHP.

        I don't know if I'd try to write a huge, massively scalable service in Node, but I might very well prototype one in it.

        A lot of people's complaints about Node.js stability and maintainability are really complaints about npm. Nothing says you can't manage your modules manually, or nail down specific versions in npm, though. People just use it sloppily without thinking about the repercussions.

  4. jockbroon

    Hear fucking hear!

    Good god, how I detest browsing through open source code repos these days. I swear I've seen one-liner beginner-tier JavaScript and HTML demos packaged up with 300 auxiliary files like JSON, YAML and other assorted shite from the likes of Composer, Grunt, Puppet, Chef, Docker, Vagrant, Gulp, Chug, Swallow, and whatever other fucking ludicrous "hip" names trendy developers have decided to use for their "automagical" deployment trash.

    Call me a grumpy old bastard if you want, but I miss the days where wanting to make use of an external library just meant a single "require" line at the top of your code.

    Seriously, though... who the fuck thought it was a great idea to encourage developers to use package managers which blindly download and install hundreds of unknown packages from a remote server - with practically zero code visibility unless you're really anal about checking everything?

    1. Boothy

      Re: Hear fucking hear!

      If you're doing it right, the 'package managers' (Maven etc.) won't have any external access; everything they 'download and install' comes from an internal repository, such as Artifactory, where the contents have (hopefully) been checked and validated before being added.
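
      For reference, the usual way to enforce that with Maven is a mirror entry in settings.xml, forcing every download through the internal repo. A minimal sketch - the repository id and URL below are invented for illustration:

        <settings>
          <mirrors>
            <mirror>
              <id>internal-repo</id>
              <name>Curated internal mirror of everything</name>
              <mirrorOf>*</mirrorOf>
              <url>https://artifactory.example.internal/artifactory/maven-remote</url>
            </mirror>
          </mirrors>
        </settings>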

      1. hellwig

        Re: Hear fucking hear!

        Sorry boothy, but what world are you living in? Remember when that guy pulled his "add spaces" package from NPM and broke half the internet?

        Not only do people point to a public server and forget it, they use libraries and packages they could easily write themselves in five minutes, but OH, that wouldn't be trendy.
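
        For scale: the "add spaces" package in question (npm's left-pad) is the sort of thing that fits in a handful of lines of any language. A rough Java sketch, not production code:

          public class LeftPad {
              // pad a string on the left to a given width - roughly what left-pad did
              static String leftPad(String s, int width, char pad) {
                  StringBuilder sb = new StringBuilder();
                  for (int i = s.length(); i < width; i++) sb.append(pad);
                  return sb.append(s).toString();
              }

              public static void main(String[] args) {
                  System.out.println(leftPad("42", 5, '0'));   // prints 00042
              }
          }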

        1. Boothy

          Re: Hear fucking hear!

          But isn't that just poorly configured dev environments?

          One of our clients insists all developers use Ubuntu laptops, and these are locked down.

          Developed apps (Java) cannot have libs put directly into them; all builds and dependencies are managed by Maven.

          App builds (other than local IDE builds) and deployments all run on remote Jenkins environments that have no Internet access for dependencies, so Maven dependencies can only be downloaded from a local, curated repository.

  5. Kevin McMurtrie Silver badge

    What, some common sense?

    Somebody shouts "Cloud computing!" so you have a Java virtual machine in a virtual container on a virtual host in a virtual datacenter. On top of that, somebody shouts "Microservices!" so now you have that whole stack multiplied 10 to 30 times, with each piece using 5 database connections, each running the client library for every API of every other piece, each generating 1 TB a day of junk monitoring statistics, and each using 4GB of RAM to do pretty much nothing. Next come the complaints about how it's too expensive, it hurts the DB, it's too hard to debug, somebody always messes up deployment, the firewall is letting in hackers, and API changes are impossible. The proposed solution is then more layers of process!

    Java in the "cloud" went down a very dark path. Kill it and try again.

  6. Anonymous Coward
    Anonymous Coward

    Isn't using all those tools the DevOps dream? Don't need someone to manage the servers and deploy, just do it yourself.

    1. Claptrap314 Silver badge

      Nope, that's toxic DevOps.

  7. Anonymous Coward
    Anonymous Coward

    30 required ways to deploy?

    Maybe if you're using Glassfish (Oracle's Java application server) but definitely not with Tomcat.

    It's something I sometimes seriously fail to understand: why so many people insist on using pre-made solutions for this stuff, even when some of those are so horribly awkward to use. I mean... deployment of a war file requires, at its core, nothing more than placing it in the right directory. How hard can it be?

    I basically wrote an ant script for it, fully customized for use with my server park. Easy.
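
    At its core it really is just a file copy - the ant script above does essentially this. As a rough illustration of the same idea in plain Java (CATALINA_HOME is assumed to be set; the WAR path is made up):

      import java.nio.file.*;

      public class DeployWar {
          public static void main(String[] args) throws Exception {
              Path war = Paths.get("build/myapp.war");   // hypothetical build output
              Path webapps = Paths.get(System.getenv("CATALINA_HOME"), "webapps");
              // Tomcat's auto-deployer watches webapps/ and expands the WAR by itself
              Files.copy(war, webapps.resolve(war.getFileName()),
                         StandardCopyOption.REPLACE_EXISTING);
          }
      }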

    1. Steve K

      Re: 30 required ways to deploy?

      Isn’t WebLogic Oracle’s JEE application server?

      GlassFish is the open-source one (also more active under the Payara developers than Oracle!)

    2. Anonymous Coward
      Anonymous Coward

      Re: 30 required ways to deploy?

      But is Tomcat an application server?

      It doesn't automate to anywhere near the same level as servers from the likes of IBM, Oracle and RedHat, for example.

      If not for Spring, I don't think Tomcat would even be viable at times.

      Solutions we encounter that use it, we generally despise.

      RedHat's WildFly effort, IBM's Liberty profile and something just as slick from Oracle should kill Tomcat? :)

      Then we can start to use Jython via JMX to actually do something useful to dozens of servers at the same time, instead of just reading metrics, or writing our own MBeans.
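
      The JMX part needs surprisingly little code, incidentally. A rough sketch in plain Java rather than Jython, reading one standard platform MBean - the host, port and lack of credentials are assumptions for illustration:

        import javax.management.MBeanServerConnection;
        import javax.management.ObjectName;
        import javax.management.openmbean.CompositeData;
        import javax.management.remote.JMXConnector;
        import javax.management.remote.JMXConnectorFactory;
        import javax.management.remote.JMXServiceURL;

        public class JmxPeek {
            public static void main(String[] args) throws Exception {
                // hypothetical remote app server exposing JMX over RMI on port 9999
                JMXServiceURL url = new JMXServiceURL(
                        "service:jmx:rmi:///jndi/rmi://appserver01:9999/jmxrmi");
                JMXConnector jmxc = JMXConnectorFactory.connect(url);
                try {
                    MBeanServerConnection conn = jmxc.getMBeanServerConnection();
                    ObjectName mem = new ObjectName("java.lang:type=Memory");
                    CompositeData heap = (CompositeData) conn.getAttribute(mem, "HeapMemoryUsage");
                    System.out.println("Heap used: " + heap.get("used"));
                    // the same connection can invoke MBean operations, which is where
                    // doing "something useful to dozens of servers" comes in
                } finally {
                    jmxc.close();
                }
            }
        }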

      Deploying apps is not usually as simple as 'dropping a WAR file', even if it is a WAR file you are deploying?

      What about its resources? Messaging, transactions, DB connections, resource bundles, static content.

      Spring helps, but it's a complex beast. A one-week, 100+ slide course that leaves you confused and probably crying for mummy. I know... I have been there.

      I want maybe Apache at the edge and a 'propa' server beyond that.

      I don't want a web and JSP server turned App Server with a huge, complex, hard-to-debug framework, because someone uses cool phrases like 'cross-cutting concerns' and other assorted alchemicals (Spring in the Enterprise).

      I can excuse Spring's abstraction of connection factories and transaction handling for various bits of plumbing.

  8. Milton

    Call me a Colonialist Curmudgeon

    Call me an Auld (Colonialist) Curmudgeon, but I am also heartily sick of the bogglingly vast array of libraries and auxiliary tripe that infest modern systems, often doing little except pumping out logfiles no one will ever read, frequently providing just one or two percent of the functionality of the core spec, and inflating what could otherwise be cool, slim, well-written code by orders of magnitude with pointless and often quite shitty bloat.

    Note to bright-eyed, bushy-tailed and undoubtedly intelligent young-, hip- and, who knows, perhaps on the dark side even gang-sters: constantly virtualising and wrapping and abstracting what went before and then slapping a shiny new label on it is not necessarily the route to error-free, elegant efficiency.

    Some of us cranky old bastards cut our teeth on C, C++ and, in my case, too much Ada once upon a time; we happen to think that much modern practice is the absolute antithesis of good coding (most of which is now found only in life-and-death systems like airliners), and we never really cottoned to second-rate lingos like Java.

    Even when it was still called Batavia.

    1. hellwig

      Re: Call me a Colonialist Curmudgeon

      Some of us have ground our teeth to nubs on C and Ada to this day. Some things fancy, non-deterministic REs and VMs will never be able to replace.

      I wasn't ambulatory for the days of punch cards, but I cut my teeth on Assembly too. Ask any kid these days if they understand what that line of code they just wrote in some fancy 18th gen language compiles down to, and they'll probably just look at you funny. Do they understand that eventually everything still runs on the CPU?

      I actually had a depressing conversation with a couple of folks about how code on a real-mode CPU can only run one thing at a time. There's no magic that lets the OS or JVM sit live in the background monitoring what you do, because it's all just a single stream of command words sent to the CPU. That was a concept they couldn't quite wrap their heads around.

      1. BLwNKLJMOOSE

        Re: Call me a Colonialist Curmudgeon

        You touch on a very DEEP reality when you mention that "...CPU can only run one thing at a time". I think that most (yes, MOST!) developers today have no real fundamental understanding of how hardware does what it does, and Why This Is Important! This translates into blind willingness to accept "framework(s) du jour" as the norm. Blind willingness to pile "stuff" on top of more "stuff" because they simply want to use a tiny piece of functionality that (if they knew what they were doing in the first place) they could have written themselves with a couple lines of simple code. I know we all can't be uber wizards, capable of writing ARM assembly compiler code generators optimized to utilize concurrent pipelining; we shouldn't have to be! But if developers today had even twice their current understanding of the platforms they are targeting or porting to, they wouldn't put up with the proliferation of CRAP that ends up being listed in the "Required Skills" section of job description postings.

  9. JLV

    Trying for irony, is he? J2EE simple?

    Might have changed these days, with lessening popularity and ebooks.

    But a quick look at the Java book section 10 yrs ago would show heavy shelves creaking under endless alphabet-soup Java drudgery to get the simplest things done: Spring, JNI, JMA, JNDI, Hibernate... none of which could talk to non-Java heretics, JNI aside.

    And we aren't talking small books either - 400-500 pages was common for any one of those mind-bogglingly dull topics.

    1. Anonymous Coward
      Anonymous Coward

      --->Hibernate

      Can I just mention in passing that Hibernate was a redundant fustercluck that enabled you to produce the worst, shittiest database interaction using loads of XML?

      Hibernate basically was an attempt to enable people who had not a clue about databases or SQL to store relational data using Java. It sucked. It was awful.

      Sorry about the rant; cleaning up after someone used Hibernate and ended up with something that modified our carefully crafted schema (fortunately on development servers only) because "it didn't work otherwise" was several months of my life I could have used productively.

      Entrust an entire project to Hibernate? Excuse me while I gibber.
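
      For what it's worth, the schema-modifying behaviour usually traces back to a single setting. A hedged sketch of a plain Hibernate bootstrap - the property name and values are real Hibernate settings; everything else (a hibernate.cfg.xml with connection details and mappings) is assumed:

        import org.hibernate.SessionFactory;
        import org.hibernate.cfg.Configuration;

        public class HibernateBootstrap {
            public static void main(String[] args) {
                Configuration cfg = new Configuration()
                        .configure()   // reads hibernate.cfg.xml from the classpath
                        // "update" lets Hibernate issue its own DDL and alter the schema -
                        // the behaviour complained about above
                        .setProperty("hibernate.hbm2ddl.auto", "update");
                        // on a hand-crafted schema, "validate" (fail fast on mismatch) or
                        // leaving the property out entirely is the saner choice
                SessionFactory sf = cfg.buildSessionFactory();
                sf.close();
            }
        }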

      1. Friendly Neighbourhood Coder Dan

        Re: --->Hibernate

        "an attempt to enable people who had not a clue about databases or SQL to store relational data" - fully agree, and worth mentioning is that apart from the zillions of unnecessary database hits it also is ( was? ) terribly complicated - much more than the SQL it wanted to save a developer from. I also had to use Spring on top of that, to make everything unnecessarily even more difficult. While it was clear to me what Hibernate was trying ( and failing ) to do, I never understood what the whole point of Spring was, apart from making everybody waste time and the will to live. I hope it's gone out of fashion. Other than that, I really like plain normal Java...

        1. Anonymous Coward
          Anonymous Coward

          Re: --->Hibernate -> I never understood what the whole point of Spring was

          Oh the point of Spring is obvious. As with Hibernate, it helps keep up the Intel share price.

  10. IGnatius T Foobar
    FAIL

    devs shouldn't have to learn...

    Also, devs shouldn't have to learn Java to deploy an app.

    And increasingly ... they don't.

  11. carddamom250
    FAIL

    Seriously?

    Does this idiot know anything about what he is saying? Seriously, has he ever programmed in JEE to begin with? I mean the clusterfuck of annotations, factories, XML files and configuration files that make up JPA, JSF and everything else in JEE. Not even Spring is immune; in fact Spring is the worst offender - the motto of that framework should be: "Creating mindless zombies every minute!" Any sucker nowadays grabs Spring Boot and calls himself a developer. In fact, the reality that most Java code sucks when processing millions of transactions is the only good thing about it, since it keeps those suckers away from interesting programming positions - which are mostly ones that do not use Java and instead use something like Google Go to begin with.

    Moreover, if you really like programming and you don't do it only for the money, seriously try Google Go or anything else that doesn't force you to learn 20+ frameworks and APIs in order to create an application; the pleasure you will get from learning will be much more worthwhile than learning any of these Java frameworks.

    Also, to sum up what is posted on this Reddit thread (https://np.reddit.com/r/programming/comments/7zb7jt/ibm_java_cto_devs_shouldnt_have_to_learn_docker/dumthik/):

    > It's a little ridiculous the makers of the shitlord application called Websphere would say deploying an app should be less complicated
