
The Working Dead: Which IT Jobs Are Bound For Extinction? (infoworld.com) 581

Slashdot reader snydeq shares an InfoWorld article identifying "The Working Dead: IT Jobs Bound For Extinction." Here are some of its predictions.
  • The president of one job leadership consultancy argues C and C++ coders will soon be as obsolete as Cobol programmers. "The entire world has gone to Java or .Net. You still find C++ coders in financial companies because their systems are built on that, but they're disappearing."
  • A data scientist at Stack Overflow says "demand for PHP, WordPress, and LAMP skills is seeing a steady decline, while newer frameworks and languages like React, Angular, and Scala are on the rise."
  • The CEO and co-founder of an anonymous virtual private network service says "The rise of Azure and the Linux takeover has put most Windows admins out of work. Many of my old colleagues have had to retrain for Linux or go into something else entirely."
  • In addition, "Thanks to the massive migration to the cloud, listings for jobs that involve maintaining IT infrastructure, like network engineer or system administrator, are trending downward, notes Terence Chiu, vice president of careers site Indeed Prime."
  • The CTO of the job site Ladders adds that Smalltalk, Flex, and Pascal "quickly went from being popular to being only useful for maintaining older systems. Engineers and programmers need to continually learn new languages, or they'll find themselves maintaining systems instead of creating new products."
  • The president of Dice.com says "Right now, Java and Python are really hot. In five years they may not be... jobs are changing all the time, and that's a real pain point for tech professionals."

But the regional dean of Northeastern University-Silicon Valley has the glummest prediction of all. "If I were to look at a crystal ball, I don't think the world's going to need as many coders after 2020. Ninety percent of coding is taking some business specs and translating them into computer logic. That's really ripe for machine learning and low-end AI."


  • Short sight (Score:5, Insightful)

    by CRC'99 ( 96526 ) on Sunday May 21, 2017 @02:30AM (#54457933) Homepage

    The entire world has gone to Java or .Net

    *cough* What a crock of shit.

    • Re: Short sight (Score:5, Insightful)

      by Oscaro ( 153645 ) on Sunday May 21, 2017 @02:40AM (#54457957) Homepage

      Yep, for example: I work on medical diagnostic software, and the amount of data you need to manage and render on screen smoothly is so huge that C++ is the most reasonable and common choice (even if not the only possible one). There are lots of fields where C++ is still king. And that's a shame, because it's a crock of a language.

      • by Anonymous Coward

        It is the lack of compiler/preprocessor support for legacy standards of the language.

        Between changes in the standard headers, changes in keywords (without provisions to disable them for files written to older standards), and changes in API and ABI, there is a huge clusterfuck of underdocumented shortcomings in C/C++ that are mostly there because of standard ego-stroking. Many of them have no excuse for having shown up in the past decade, given that most of them manifest in open source software that could have been tested against in an automated fashion to ensure that new changes to the standard didn't break older code.

        • by arth1 ( 260657 ) on Sunday May 21, 2017 @07:54AM (#54458683) Homepage Journal

          Between changes in the standard headers, changes in keywords (without provisions to disable them for files written to older standards), and changes in API and ABI, there is a huge clusterfuck of underdocumented shortcomings in C/C++ that are mostly there because of standard ego-stroking. Many of them have no excuse for having shown up in the past decade, given that most of them manifest in open source software that could have been tested against in an automated fashion to ensure that new changes to the standard didn't break older code.

          I agree, for C++. Whenever I have breakages after upgrades, it's almost always C++. Programs have to be recompiled, because they've imported and extended templates that they themselves weren't in charge of. Even if the APIs remain the same, there are still breakages.
          For C, there are far fewer problems. Yes, someone might change an API, but the general consensus is not to do that and to provide new functions instead. New standards happen, but they only affect the source, not whether binaries continue to work, as can be the case with C++.

          C++ works well where you can control or dictate the runtime system, so it matches the developer toolchain. That's great for embedded-like systems where you can change the entire OS with upgrades, or long term stable systems like RHEL, where versions stay put for 10 years with only bugfix backports. But when binaries break after an OS update, they're almost always C++ ones. From big companies too.
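
          A minimal sketch of that template problem (the type, field names, and sizes here are invented for illustration): because a template's body lives in the header, it is compiled into every client binary, so even a "source-compatible" header change silently changes the layout old binaries were built with.

            // smallvec.h, library v1 -- SmallVec is a hypothetical example type.
            // The template body is compiled INTO each client, not into the library.
            template <typename T>
            struct SmallVec {
                T data[4];     // suppose v2 of the header enlarges this to data[8]:
                unsigned len;  // clients built against v1 and a library built against
            };                 // v2 now disagree on sizeof(SmallVec<T>) and misbehave,
                               // even though no function signature ever changed.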

          • Right: just within gcc, the C++ ABI has changed incompatibly at least three times, not to mention the incompatibilities between different compilers.

          • Between changes in the standard headers, changes in keywords (without provisions to disable them for files written to older standards), and changes in API and ABI, there is a huge clusterfuck of underdocumented shortcomings in C/C++ that are mostly there because of standard ego-stroking. Many of them have no excuse for having shown up in the past decade, given that most of them manifest in open source software that could have been tested against in an automated fashion to ensure that new changes to the standard didn't break older code.

            I agree, for C++. Whenever I have breakages after upgrades, it's almost always C++. Programs have to be recompiled, because they've imported and extended templates that they themselves weren't in charge of. Even if the APIs remain the same, there are still breakages.
            For C, there are far fewer problems. Yes, someone might change an API, but the general consensus is not to do that and to provide new functions instead. New standards happen, but they only affect the source, not whether binaries continue to work, as can be the case with C++.

            So when minor point releases of C libraries break ABI much more often than C++ libraries, that just doesn't happen? libssl, libpng, libflac, libwebp, etc. have all broken binary compatibility in minor releases. Note, btw, that templates changing definition doesn't really break much in C++ unless you export those templated classes over a module interface, which is a dangerous thing to do. Templates being in headers alone means they have no binary part whose compatibility they could break. Though of course you ca

      • Re: Short sight (Score:5, Insightful)

        by serviscope_minor ( 664417 ) on Sunday May 21, 2017 @03:07AM (#54458041) Journal

        There are lots of fields where C++ is still king.

        Anything requiring speed and expressivity.

        And that's a shame, because it's a crock of a language.

        I find modern C++ very pleasant on the whole.

        • Re: Short sight (Score:5, Interesting)

          by Anonymous Brave Guy ( 457657 ) on Sunday May 21, 2017 @03:38AM (#54458107)

          Anything requiring speed and expressivity.

          I wouldn't even say those are the big advantages of C and C++ any more.

          It's a relatively rare application these days that needs the kind of raw speed you can't achieve with other mainstream languages yet which relies on C or C++ for its performance-critical logic rather than either dropping to assembly (or linking to someone else's library that probably does) or resorting to some form of parallelism. I'm certainly not saying that set is empty, but it's probably getting smaller by the year.

          As for expressivity, if you mean how easy it is to express any particular idea in code, C and C++ are relatively weak compared to many other mainstream languages today. They lack many convenient language features widely available elsewhere, and their standard libraries aren't exactly broad and full-featured so you have to bring in additional dependencies to do almost anything useful.

          The area where C and C++ still shine compared to almost anything else (and I realise this might have been what you meant by "expressivity" instead) is the low-level control. You can deal with memory and ports and interrupts and so on very transparently in these languages, and they have a very lightweight runtime with minimal dependencies that makes them suitable for use in things like systems programming and writing software for embedded devices with limited resources.
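
          For a concrete taste of that low-level transparency, here is the classic memory-mapped register write (the address and register name are hypothetical, not a real part):

            #include <cstdint>

            // A made-up UART transmit register at a made-up bus address.
            volatile std::uint32_t* const UART_TX =
                reinterpret_cast<volatile std::uint32_t*>(0x4000C000u);

            void putc_hw(char c) {
                *UART_TX = static_cast<std::uint32_t>(c);  // one store straight to
            }                                              // hardware, no runtime in the way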

          • Re: Short sight (Score:5, Insightful)

            by angel'o'sphere ( 80593 ) <angelo.schneider ... e ['oom' in gap]> on Sunday May 21, 2017 @05:36AM (#54458321) Journal

            C++ meanwhile supports basically everything other modern languages provide.

            You seem to be stuck in 1998 or something ...

            C and C++ are not the same thing, so making a very long statement about both of them is most certainly always wrong for one of the two languages.

            • by w3woody ( 44457 )

              The two problems with C++ are the fragile binary interface problem, and the lack of memory management beyond 'new' and 'delete'. The latter can be resolved by creating a base class which provides reference counting and automatic deallocation. The former, however, cannot be easily resolved since virtual methods in C++ are dispatched by going through a virtual method table with fixed offsets (meaning adding new virtual methods may alter the index where a method entry point is dispatched), and the size and off
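
              To illustrate the vtable half of that point (Widget and its methods are invented for illustration): virtual calls compile down to an index into the vtable, so inserting a method renumbers every slot after it.

                // libwidget v1 -- a client compiled against this header calls
                // resize() as "jump through vtable slot 1".
                struct Widget {
                    virtual void draw();    // slot 0
                    virtual void resize();  // slot 1
                };

                // libwidget v2, shown as a second type for illustration: inserting
                // paint() renumbers the slots, so an old client binary calling
                // resize() now lands in paint(). Source-compatible, ABI broken.
                struct WidgetV2 {
                    virtual void draw();    // slot 0
                    virtual void paint();   // slot 1 (new)
                    virtual void resize();  // slot 2
                };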

          • by golodh ( 893453 ) on Sunday May 21, 2017 @06:52AM (#54458505)
            Libraries.

            Seriously, whilst C++ (and Fortran) are great for the heavy computational lifting, most of that heavy lifting that goes on in computational engines can be isolated in, and accessed from, a specialised library.

            After that you really don't need C++ anymore.

            In fact you'll realise big productivity (and reliability) gains by *not* coding e.g. business logic or HMIs in C++. Use a script language instead, and call those C++ libraries when you know exactly what you want done. I daresay that this is why languages like Python are so popular.
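
            The usual shape of that split, sketched from the C++ side (simulate() and its internals are invented for illustration): keep the heavy lifting in C++, and export a plain C entry point that any scripting language's FFI can bind.

              // core.cpp -- build as a shared library; Python (ctypes/cffi), Lua,
              // etc. bind to the extern "C" symbol below.
              #include <numeric>
              #include <vector>

              static double simulate_impl(const std::vector<double>& v) {
                  return std::accumulate(v.begin(), v.end(), 0.0);  // stand-in for real work
              }

              extern "C" double simulate(const double* data, unsigned long n) {
                  return simulate_impl(std::vector<double>(data, data + n));
              }

            From the script side that is one foreign-function call; the business logic around it never needs to touch C++.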

            In most applications that business logic and HMI fiddling is 95% of the code once you put the heavy computations inside a library call.

            The problem for C++ "coders" is that you don't want a load of mediocre C++ coders to build a library.

            Instead you want computational scientists and domain specialists to specify the algorithms, supported by a software engineer for systems design plus one or two really good C++ programmers who can both understand the algorithms and what they do, and who just so happen to be able to implement the design plus algorithms in high-quality, robust, efficient, and elegant code.

            • by dbIII ( 701233 ) on Sunday May 21, 2017 @07:26AM (#54458597)
              Interpreted languages are fine so long as there isn't a lot of code sticking together the stuff in those libraries that is nicely compiled for you.
              If there is a lot of code to interpret, they suck just as much as they always have - hence some really sloooooow stuff out there.
              There's some appallingly slow stuff running on fast hardware - things like GUIs that take a couple of seconds to respond to a mouse click and bring down a menu, despite being on a 4GHz machine that's not doing a lot other than waiting for input. That's the sort of thing that shows off the failure of lazy programming and using the wrong tool for the job (e.g. a massive lump of custom Java instead of handing over to a library).
              • Almost no mainstream programming language is truly interpreted any more, though, unless you're talking about something like a REPL interface. Even the JavaScript in your browser and that Python script you wrote the other day to parse a log file are actually JIT compiled behind the scenes these days, and will probably get within a factor of 2-3 of the performance of true compiled languages in most cases. The VM-hosted static languages like Java and C# have been in that position for a long time, and get close

          • It's a relatively rare application these days that needs the kind of raw speed you can't achieve with other mainstream languages yet which relies on C or C++ for its performance-critical logic rather than either dropping to assembly (or linking to someone else's library that probably does) or resorting to some form of parallelism. I'm certainly not saying that set is empty, but it's probably getting smaller by the year.

            Well, I tend to write those libraries. No one I know drops to assembly any more for perf

            • Yes, I agree that C++ (and C) are still good choices for the kind of work you described. I've worked on some of those too, and there wouldn't have been many other viable choices for those jobs. I just think the number of jobs where this is the case is trending downwards.

              In particular, I think it will trend down sharply if and when we reach the point that higher level languages with more expressive semantic models can be compiled to run more efficiently on the relevant hardware than C or C++ code that is wri

        • Speed is becoming less and less of an issue, as computers are getting faster all the time. Every day more sectors reach "good enough" speed, and they can then start focusing on other problems that languages like C and C++ may not handle as well. My biggest problem I have seen is deployment and change control: being able to safely test out alpha/beta code on production data with real-world usage. A lot of this stuff is under business processes. But there should be more tools and environments

          • Re: Short sight (Score:5, Insightful)

            by Anonymous Coward on Sunday May 21, 2017 @05:22AM (#54458303)

            Speed is becoming less and less of an issue, as computers are getting faster all the time.

            People have always been saying that, and it was never true, and still isn't.
            Why? Because software, software stacks, entire operating systems are becoming slower all the time, eating up all those resources that your newer computers are capable of.
            It has always been this way, and people like you have always been wrong.

        • by murdocj ( 543661 )

          When I used C++ I found it to be an interesting but ultimately failed experiment in layering OO on top of C. The end result was that the traps of C (invalid pointers etc) were still there, but now hidden under layers so that they were harder to detect / fix. For example, adding an element to one of the STL container classes could cause a reallocate, rendering any "references" that you had into that container invalid. So unlike OO languages that truly have references that don't magically become invalid, C

    • Re:Short sight (Score:5, Insightful)

      by Dutch Gun ( 899105 ) on Sunday May 21, 2017 @02:58AM (#54458003)

      C++ programmers will be employable in the videogame industry for the foreseeable future, at least. I presume that they'll also be employable for working on any large-scale applications that require support or compatibility beyond what some of the newer, safer, high-performance compiled languages can provide.

      People always talk about how terrible C++ is (and it's hard to argue with many of their points), but it continually shows up in the language rankings as a steady #3 to #7 or so, depending on how language "popularity" is figured. It benefits less from being "pure" and more from being incredibly pragmatic as a language, similar to C. R and Go are still lagging far behind, with D almost out of sight. Swift is moving up thanks to iOS, and maybe Kotlin will do the same thanks to Android (but we'll see - I'd literally never heard of it until recently), but those are almost pre-destined to be one-trick ponies due to their strong platform ties.

      Ultimately, the big problem is that I don't see a real universal contender for high-performance native code taking over from C/C++. There are a lot of promising languages, but at the moment, nothing is really taking off. Simple inertia is pretty hard to overcome, as it turns out.

      Final point:

      But the regional dean of Northeastern University-Silicon Valley has the glummest prediction of all. "If I were to look at a crystal ball, I don't think the world's going to need as many coders after 2020. Ninety percent of coding is taking some business specs and translating them into computer logic. That's really ripe for machine learning and low-end AI."

      Bwahahahahahaha! Oh damn, we can't even get our chat bots working reliably (we use them to auto-generate bugs and tasks). And in three years they're going to be replacing programmers? Fucking priceless!

      • Re:Short sight (Score:5, Insightful)

        by Kremmy ( 793693 ) on Sunday May 21, 2017 @03:09AM (#54458051)

        Ultimately, the big problem is that I don't see a real universal contender for high-performance native code taking over from C/C++. There are a lot of promising languages, but at the moment, nothing is really taking off. Simple inertia is pretty hard to overcome, as it turns out.

        The whole reason that people claim C/C++ are dying or going out of style is that they are entirely disconnected from this point. They explicitly overlook the fact that the languages they are always citing are written in C/C++, and rely to an extreme degree on libraries written in C/C++ even when they manage to self-host the languages. It's ignorance of what the tools they are using actually are.

        • Re: (Score:2, Insightful)

          by Anonymous Coward

          We wrote a language that is a subset of C, wrote a wrapper for all the APIs that are in C, wrote a wrapper for all the libraries that are in C/C++, then used a compiler written in C to compile the compiler, which is of course in C or C++, on an operating system that was written in C. We plan to make this a standard by making it mandatory or almost mandatory on this one platform we have control over, and that became popular by supporting C.

          Looks like C is dead!

        • by Sycraft-fu ( 314770 ) on Sunday May 21, 2017 @05:23AM (#54458305)

          These days, everything is a computer. Your stove, your car, your cable modem, your TV: all are computers. They all have microcontrollers or microprocessors in them to handle various functions. It is cheaper and easier than doing discrete dedicated logic, even for simple things. Well, those need software of course, and it turns out C/C++ are what gets used a lot, because you have little memory and power to work with. Pennies count in mass production, and the smaller a CPU, RAM, flash, etc. you can get away with, the better; but that means the code needs to be small. You aren't loading up Windows and running .NET on a microwave; you are getting a little PIC24 or something and putting on some highly efficient, directed code.

          Because of all these embedded devices, there's a lot of market for this kind of thing; it just isn't the trendy shit you see on hip "Web 3.0" sites. It gets done by people with engineering backgrounds at big companies.

          Also, speaking of small embedded computers, regular computers themselves have tons of computers in them. Crack open a desktop and you find a lot of chips in there, many of them computers in their own right. Your NIC is a computer. A simple one, to be sure, but it is a processor that runs code; it is not all hard-wired. Your SSD is a computer; it has a little chip (usually ARM) that runs the code it needs to do its job. Again, someone is writing the code for all that, and that code is not being written in Java.

          Even when you have a platform that at a high level runs Java/.NET/whatever, it has a bunch of lower-level code on it.

        • by Svartalf ( 2997 )

          Which makes them quite the Tool.

          This doesn't even get into the reality that 70% of all the "computers" are embedded beasties - all those "IoT" processors - and the bulk of them are programmed in C or C++. A Node.JS or Python option is available, but neither of those is what you'd call "secure". You might be able to get Go to "go" onto those platforms, or Swift, but they're a bit largish and don't really target the small stuff.

          The remark about .Net or Java means they're a real Headupassian. No clue whatsoev

      • Re:Short sight (Score:5, Insightful)

        by angel'o'sphere ( 80593 ) <angelo.schneider ... e ['oom' in gap]> on Sunday May 21, 2017 @05:55AM (#54458347) Journal

        Those "how popular a language" is topics are not really relevant.

        Most languages more or less work the same, the details and flaws are mostly in the libraries (see PHP) or in some niche corners of automatic type conversions (see JavaScript and also C).

        Bottom line: it does not really matter if you write Java or C++. A competent programmer should learn the other language in a day or two and get good at it in a few weeks or months.

        Of course there are edge cases. I don't expect everyone to become super fluent (quickly) in SQL, Smalltalk, Groovy, Lisp, Prolog, Haskell, OCaml, COBOL, or Fortran.

        However: even if you are not fluent in any of those languages, with a little bit of intelligence you should be able to fix simple bugs. Writing a new program from scratch is obviously more difficult. Look at COBOL, e.g., with its "strange" PIC data layouts and so many "divisions". I mean, I fixed about 1M lines of COBOL code for Y2K faults; however, I could not really "write COBOL".

        Kotlin has been around for about 5 or 6 years, I'm not sure. I don't see a big advantage over Scala or Java 8, though it is probably easier than Scala, as it is closer to Java. However, the company behind it, JetBrains, offers Kotlin-to-JavaScript and native-code compilation. Kotlin-to-native could be interesting on Android; on other platforms I fear they don't have the cross-platform libraries (GUI, networking, etc.)

        But the regional dean of Northeastern University-Silicon Valley has the glummest prediction of all. "If I were to look at a crystal ball, I don't think the world's going to need as many coders after 2020. Ninety percent of coding is taking some business specs and translating them into computer logic. That's really ripe for machine learning and low-end AI."

        This is actually true. I wrote a "spec" (as in heavily formalized use case descriptions) to Java/Groovy source code "converter" about 10 years ago. It was super easy to make proof-of-concept prototypes.

        Look e.g. at https://cucumber.io/ [cucumber.io]

        However, you are right, too. Programming and programmers won't go away for the foreseeable future, and most likely never will.

    • by Z80a ( 971949 )

      Someone has to write the interpreters/compilers for Java or .Net.

    • Re:Short sight (Score:5, Insightful)

      by TheRaven64 ( 641858 ) on Sunday May 21, 2017 @04:36AM (#54458209) Journal
      Let's look at some examples of things running on a typical computer:
      • Operating system: If it's *NIX, mostly C with some C++. If it's Windows, then mostly C++ with some C.
      • Web browser: Chrome, Firefox and Edge are all written in a mixture of languages, with C++ being the dominant one.
      • Office suite: Microsoft Office, Open/LibreOffice, and KOffice are all mostly C++.
      • Video and music player: The UI is often C# or Objective-C, but the code that handles the file metadata parsing, decoding, and playback is all C or C++, depending on the platform.

      Yup, sounds like the entire world has gone to Java or .NET to me...

    • Comment removed (Score:5, Informative)

      by account_deleted ( 4530225 ) on Sunday May 21, 2017 @04:36AM (#54458211)
      Comment removed based on user account deletion
    • by gweihir ( 88907 )

      Indeed. It may be true that coding jobs at a low skill level are often Java these days, but C is not vanishing at all. For example, I recently built a custom security filter component for a customer, and while in theory it is possible to do this in Java (I think), it would be a lot of additional effort, the component would not perform well, and maintenance would be a nightmare. Before that, I built a simulation tool, and again, nothing but C would do the job well for the core algorithms. Glue-code should of cou

    • Re:Short sight (Score:5, Insightful)

      by 0xdeadbeef ( 28836 ) on Sunday May 21, 2017 @01:15PM (#54459815) Homepage Journal

      Well, it was said by

      Elizabeth Lions, an executive coach, author, and president of Lionsology, a job leadership consultancy.

      A snake oil selling imbecile who hasn't the slightest clue about the things she talks about. Of course, we knew that when she called programmers "coders".

  • by tal_mud ( 303383 ) on Sunday May 21, 2017 @02:39AM (#54457949)

    "Ninety percent of coding is taking some business specs and translating them into computer logic".

    Business spec: Write a program that determines if any given program halts after a finite amount of time.
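
    That "spec" is the halting problem, and the classic diagonal argument shows why it can never be delivered; a sketch (halts() is the hypothetical oracle, deliberately left undefined because it cannot exist):

      // Hypothetical oracle: true iff p() eventually returns.
      // Declared but impossible to implement -- that is the point.
      bool halts(void (*p)());

      void trouble() {
          if (halts(trouble))  // if the oracle predicts we halt...
              for (;;) {}      // ...loop forever; if it predicts looping, return.
      }
      // trouble() halts exactly when halts(trouble) says it doesn't,
      // so no such halts() can be written.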

    • by Cederic ( 9623 ) on Sunday May 21, 2017 @02:48AM (#54457971) Journal

      Nicely put.

      Low-end AI? Translating user requirements into working software that actually meets their needs is in the same part of the AI difficulty list as cold fusion and solving world hunger.

      If you can actually interpret the business specs without a human putting them into a formal language, you don't need to translate them into computer logic at all. By then the AI can just execute them anyway.

      The moment you need that intermediary step involving a human and a formalised representation... we call that programming.

      • Consider a company the scale of Google, with hundreds, if not thousands, of software projects ongoing simultaneously. Suppose you assigned an AI to observe which user stories go in, and what code comes out as a result. How many programs would you have to complete before the AI is able to take over a majority of the work involved in building an application? Maybe it can't directly convert user stories, but it could probably handle many of the components that a user story is made of.

        I'd honestly be surprised

        • by DrYak ( 748999 ) on Sunday May 21, 2017 @05:03AM (#54458277) Homepage

          Suppose you assigned an AI to observe which user stories go in, and what code comes out as a result. How many programs would you have to complete before the AI is able to take over a majority of the work involved in building an application? {...} I'd honestly be surprised if they aren't already doing something like this.

          Yes it's done. Not by google, but by others.
          The short answer is that the deep neural nets produce text that looks like code at first glance, but doesn't even compile.
          E.g.: the variables aren't even properly declared. It can write a formula (like "a = b + c")
          but isn't even able to realise the link with the declaration of the variable (that the "int a;" 10 lines above is linked to the "a").

          The problem is the size and complexity required of modern AI:
          the size of the context it can consider,
          the number of abstract models hidden behind the code, etc.

          Currently, what AI has managed to recreate with deep neural nets is on the level of WW2's pigeon-guided bombs [wikipedia.org],
          i.e.: leverage an image-recognition net and similar basic tasks, and string a few together.

          The complexity required to write actual code is several orders of magnitude bigger.
          Even some humans can't do it reliably, and you hope to do it with what is currently the equivalent of the visual-cortex sub-part of a bird's brain.
          Good luck with that.

          Before achieving that we need:
          - more raw processing power (you'll need way more neurons than are currently used in today's deep neural nets)
          - advances in science, to better understand how to combine tons of such "function-specific nets" to build a higher level of AI
          (the same way a brain is a sum of lots of small specific regions, each linked to a higher-level, more abstract associative layer).

  • by thegarbz ( 1787294 ) on Sunday May 21, 2017 @02:42AM (#54457959)

    The entire world has gone to Java or .Net

    Wake me when either language runs on tiny embedded applications.

    • by Z00L00K ( 682162 ) on Sunday May 21, 2017 @03:21AM (#54458075) Homepage Journal

      I agree - a lot of embedded devices are C and even assembly, especially when you come down to small devices running an 8051 core or similar, where every byte counts.

      C is also one of the better languages to use if you want deterministic behavior from your code, as long as the coding is done right. Environments like Java and .Net aren't good enough in that area, since you have background threads for housekeeping that cause interference.
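
      The determinism point in miniature (the device, the register-read helper, and the buffer size are all hypothetical): typical bare-metal C keeps everything statically allocated, so an interrupt handler is the same handful of instructions every time, with no garbage collector able to preempt it.

        #include <stdint.h>

        #define BUF_SIZE 64u                 /* fixed at build time: no heap, no GC */
        static volatile uint8_t rx_buf[BUF_SIZE];
        static volatile uint8_t rx_head;

        extern uint8_t uart_read_byte(void); /* stands in for reading a data register */

        void uart_rx_isr(void) {             /* worst-case timing is knowable */
            rx_buf[rx_head] = uart_read_byte();
            rx_head = (uint8_t)((rx_head + 1u) % BUF_SIZE);
        }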

    • I guess you've never heard of the .Net Micro Framework [wikipedia.org]. Java also supports embedded systems [oracle.com]. Java is actually quite common on embedded devices, and processors have even been made that interpret Java bytecode at the hardware level.

    • Look at the JavaCard spec. Almost all smartcards run it. You'll find it on pretty much every SIM card, every credit card that has a chip, and a lot of corporate door-access cards. Does that count?
  • by Oscaro ( 153645 ) on Sunday May 21, 2017 @02:48AM (#54457973) Homepage

    I hate the fact that "Java programmer" is considered by some people to be a different job than "C++ programmer". A good programmer should be able to learn a language in a month and become proficient in three months at most. Functional languages aside, all languages are more or less the same. It doesn't matter if your hammer has a red handle or a green one, as long as you know how to hammer.

    • by phantomfive ( 622387 ) on Sunday May 21, 2017 @02:56AM (#54457999) Journal
      Because the person who can use the language today is more valuable than the person who can use the language in three months.
      • You kinda underestimate the number of Java libraries.

        And recognizing that you need to use a library, because you already know it exists, isn't something you can pick up in a month.

        If you are simply coding then sure. You can rely on an experienced programmer to know what's needed and simply code.

        Here's a good start:

        http://blog.takipi.com/the-top... [takipi.com]

    • by serviscope_minor ( 664417 ) on Sunday May 21, 2017 @03:17AM (#54458065) Journal

      I hate this fact that "java programmer" is considered by some people a different job than "C++ programmer". A good programmer should be able to learn a language in a month and become proficient in three months at most.

      Yes, but. There's also a difference between proficient and expert. Becoming an expert in either takes much, much longer. For example, a friend of mine is an expert in Java. I can hack code in the language and do plenty of things. He seems to have committed half the standard library to memory, and knows the JVM in depth too. There's all sorts of weird and wonderful stuff you can do if you know those manipulations.

      It would take me many years to reach his level.

      It doesn't matter if your hammer has a red handle or a green one, as long as you know how to hammer.

      Unless the hammer has a wooden head and is used for knocking chisels. It takes a long time to learn how to do that effectively no matter how well you can smash rocks with a sledge.

    • by Sique ( 173459 )
      I don't think so. Knowing a language does not only mean that you are able to write a syntactically correct program which compiles and does what it is supposed to do. It means that you have intimate knowledge of all the libraries and toolsets and coding environments that come with the language. And this is the real treasure of knowledge that makes the difference between a newbie to the language and the seasoned programmer. If you have enough experience you know which things are already invented, and how to us
    • Comment removed based on user account deletion
    • all languages are more or less the same

      It doesn't matter if your hammer has a red handle or a green one, as long as you know how to hammer.

      These two statements show just how narrow your view is on a large variety of topics. There is a wildly different set of hammers out there, each needing different techniques to get what you need. Would you use a sledgehammer like a watchmaker's chasing hammer? Of course not. The two are completely different in every way, including grip, body movement, and problem.

      Likewise I don't expect someone who grew up learning how to program brainfuck to understand the first thing about Java. For bonus points without googling

    • by jon3k ( 691256 ) on Sunday May 21, 2017 @09:33AM (#54459051)

      A good programmer should be able to learn a language in a month and become proficient in three months at most.

      This isn't, and shouldn't be, the case. There is a huge demand for work in higher level languages that can be done by less skilled programmers. Most of them wouldn't be capable of programming in C, and that's ok. If ALL programming needed to be done by a programmer who could become proficient in any language in three months we'd be 50 years behind in our use of software as a species. We could be pedantic and call that "scripting" and not "programming" but that's obviously a specious distinction.

  • by phantomfive ( 622387 ) on Sunday May 21, 2017 @02:55AM (#54457995) Journal

    C and C++ coders will soon be as obsolete as Cobol programmers. "The entire world has gone to Java or .Net.

    My God, no! I'd rather program in brainfuck

  • Well, I started off on a System/3 with a card reader, so I've had to keep my competencies up to date.
    In our industry more than most this is vital - I sometimes meet people bragging about having "20 years experience in x"; too often it's more like 2 years' worth of experience x 10.

    That said, I have some buddies still making decent coin on COBOL gigs...

  • Opportunity (Score:5, Insightful)

    by Areyoukiddingme ( 1289470 ) on Sunday May 21, 2017 @03:03AM (#54458023)

    But the regional dean of Northeastern University-Silicon Valley has the glummest prediction of all. "If I were to look at a crystal ball, I don't think the world's going to need as many coders after 2020. Ninety percent of coding is taking some business specs and translating them into computer logic. That's really ripe for machine learning and low-end AI."

    Sounds like a fantastic opportunity to get rich: fleecing poor bastards who actually believe this dreck. Ninety percent of coding is indeed figuring out how to wedge some business wonk's harebrained idea into the machine, but does this clown have any idea how broad a phrase "business specs" is? That's everything. I mean e-v-e-r-y-t-h-i-n-g.

    "Make my MRI machine work." Business spec. "Make my combine harvester work." Business spec. "Make my search engine work." Business spec. "Make my toy robot work." Business spec. "Present as many goddamned ad impressions as physically possible." Business spec. He's trying to claim that do-what-I-mean-not-what-I-say computers are just around the corner, readily (and cheaply) available. HA. No. You might, MIGHT be able to train a neural net to do a piece of one of those tasks. All of them? And all parts? Not even close. Not in three years.

    I'm sure nVidia's new Titan Xp is a marvelous thing, with its dedicated tensor accelerator hardware, but it's not do-what-I-mean hardware. It was just released last month, which means nVidia's next card is a year away. Does anybody think it's going to be do-what-I-mean hardware? No. How about the generation after that? Maybe another node shrink? Still no. How about three generations from now? If historical Titan benchmarks are anything to go by, it'll be twice as fast as a Titan Xp. It takes nVidia about 36 months to double performance. Is it going to be able to do-what-you-mean? Mmm, no.

    The world is going to need just as many coders in three years as it does now. It will probably need more. The coming wave of automation is not going to be self-programming, but it is coming. Somebody is going to have to write all that code. And baby all of those neural nets.

  • by serviscope_minor ( 664417 ) on Sunday May 21, 2017 @03:04AM (#54458033) Journal

    The president of one job leadership consultancy argues C and C++ coders will soon be as obsolete as Cobol programmers. "The entire world has gone to Java or .Net. You still find C++ coders in financial companies because their systems are built on that, but they're disappearing."

    The entire world has done what now? I work in the computer vision/data processing world. It's all written in C++ on the back end, often with Python driving code on the front. Currently, C++ is the only language with the expressivity, speed, and resource frugality required for the job.

    I've also worked on deep embedded stuff. Hell, some of the compilers don't even do C++ (looking at YOU IAR C/C++), so I wrote it in C. Otherwise I'd use C++, because there aren't any other languages with the resource control which will do the job.

    Lots of other stuff seems to run in the browser. All major browsers are implemented in C++ because... well, you get the idea. About the only thing which could potentially displace C and C++ is Rust, since it's basically the C and C++ model but with a syntax that excludes many common bugs. But it's a long way from being there yet.

    A data scientist at Stack Overflow "says demand for PHP, WordPress, and LAMP skills are seeing a steady decline, while newer frameworks and languages like React, Angular, and Scala are on the rise."

    There's a difference between decline and fall. The displacement is certainly happening, but you can't replace WordPress with Angular and Scala, because one is an entire CMS while the others are a framework and a language. That's not the same thing.

  • by Ambassador Kosh ( 18352 ) on Sunday May 21, 2017 @03:07AM (#54458045)

    Science and engineering continue to move towards doing more simulations, everything from chemical simulations to flow simulations. The more accurate these simulations are, the more computationally intensive they get, but also the more money you can make, since you have to do fewer real-world experiments to isolate the true running conditions, and the simulations can also be used as control systems, allowing you to operate closer to the true danger area.

    In most chemical plants, reactions are run FAR from the actual danger points in terms of product yield, purity, reaction speed, etc., because things like PID controllers just can't adapt to how chemical systems really work.

    The problem is that for this kind of work Java and .Net are SLOW. They can easily be 100x to 1000x slower than a program written in C, C++ or Fortran. The tooling to support High Performance Computing type applications really doesn't exist outside of C, C++ and Fortran; they have the most advanced optimizing compilers, profilers, debuggers, libraries, etc. What I often see is something like MATLAB for visualization, Python for command and control, and C/C++/Fortran for the actual simulation running on clusters.

    These newer microchips with more cores per chip are only going to continue to push things in that direction. It is easy to gain a little scaling with threads, but if you want to really get a program to run fast, you either need direct memory control or a far more efficient runtime than has ever been created so far.

    This may come as a surprise, but almost no normal software uses more than about 1% of a CPU's capabilities. Even most games are at 5%. You can see this when you run them under a good profiler like VTune. Sure, the CPU is technically busy running the software, but it is mostly just waiting for data and working with unoptimized data structures. To get over this barrier you need to make thousands of small changes to your program.

    If you need a program to run FAST, you need to eliminate false sharing. If you have two threads write to different indexes in an array, but the items are too close to each other in memory, they could be sitting on the same cache line, and this will cause the cores to have to resync and retry calculations based on which one committed first. The more cores you add, the worse this problem gets. I have worked on a program that went from 30 seconds on 128 cores to 0.03 seconds on 128 cores by removing all the false sharing.
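
    A minimal sketch of that false-sharing fix (64 bytes is the usual x86 cache-line size; the counter layout is illustrative, not from the program described above): give each thread's hot data its own cache line.

      #include <atomic>

      // BAD: eight adjacent 8-byte counters share cache lines, so threads
      // incrementing "their own" counter still invalidate each other's line:
      //   std::atomic<long> counters[8];

      // BETTER: pad each counter out to a full 64-byte line.
      struct alignas(64) PaddedCounter {
          std::atomic<long> value{0};
      };
      PaddedCounter counters[8];

      void worker(int id, long iters) {
          for (long i = 0; i < iters; ++i)
              counters[id].value.fetch_add(1, std::memory_order_relaxed);
      }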

    You also need fine-grained control over parallelization. You need to be able to decide that a function should be parallelized, and to what degree, based on the amount of data being handed into that function. That is why things like TBB and OpenMP allow this to be controlled at runtime. If you make a parallel version of quicksort and run each division in parallel recursively, you reach a point where you are creating parallel tasks that are far too small and have too much overhead. This means you need to understand how many CPU cycles an operation normally takes, and parallelize based on this information.
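
    A sketch of that grain-size control using std::async (the 10,000-element cutoff is an arbitrary illustration; TBB and OpenMP expose the same knob in their own ways): recurse in parallel only while a subrange is large enough to pay for the task overhead.

      #include <algorithm>
      #include <future>
      #include <iterator>

      template <typename It>
      void pquicksort(It first, It last) {
          const auto n = std::distance(first, last);
          if (n < 2) return;
          const auto pivot = *std::next(first, n / 2);
          It mid1 = std::partition(first, last,
                                   [&](const auto& x) { return x < pivot; });
          It mid2 = std::partition(mid1, last,
                                   [&](const auto& x) { return !(pivot < x); });
          if (n > 10000) {  // big enough to be worth a task: fork one half
              auto left = std::async(std::launch::async, pquicksort<It>, first, mid1);
              pquicksort(mid2, last);
              left.get();
          } else {          // below the grain size: plain recursion is cheaper
              pquicksort(first, mid1);
              pquicksort(mid2, last);
          }
      }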

    At this point I don't see any other languages really moving in to compete with C and C++. Sure, there are languages that do a lot of the high-level stuff that used to be done with C and C++, but the world has also moved on to harder problems, and C and C++ have moved on to those harder problems too. This is a problem you can't just buy more hardware to fix. Many of these simulations take days to run in highly optimized C and C++ code, and the Java/.Net versions would take a year to run. The time alone would kill the program's usefulness, and forget ever optimizing your system using the simulation.

  • Ninety percent of coding is taking some business specs and translating them into computer logic. That's really ripe for machine learning and low-end AI.

    For some reason this made me think of a homeless person on a sidewalk with a cardboard sign that reads "A.I. took my job". Also relevant: "I'm a people-person dammit! I'm good at working with people!"

  • I actually program in the field, and I admit .NET is pretty awesome if you program specifically on Windows and if you aren't looking for performance! It isn't meant for the high-performance edge applications that C and C++ are able to handle. Besides, all four languages (.NET, C, C++, and Java) have some similarities with each other. It's about as stupid as saying that because I drive an SUV, I can't drive a truck. Sure there are differences, but anyone who calls themselves a programmer should be able to switch fairly easily. Java itself is a horrible use of resources; the few games I've seen written in it run like molasses, and that problem hasn't really changed over the years.

    As far as AI taking programming jobs, don't make me laugh. AI is nowhere near that level yet. An AI that can program is an example of self-aware AI that can program itself. Even if it could, I'm not sure anyone would really want to risk it. Think about it: let's fire all the programmers and get the computer to program itself to help us. Eh, what if something goes wrong? What if it decides to go the way of Skynet?

    While the cloud may have its benefits, not everything will go there. Communication isn't that fast, and some information is best kept off the web.

    Written by a business type who has no idea how programming works beyond the next big thing, not even realizing how much stuff is based on legacy architecture.

  • What a load of BS (Score:5, Informative)

    by jonwil ( 467024 ) on Sunday May 21, 2017 @04:04AM (#54458147)

    C and C++ aren't going anywhere. Everything from operating system kernels to operating theater robots are programmed in C and/or C++.

    I count at least a dozen devices in my apartment that contain some sort of microprocessor and I would bet money that all of them are using C and/or C++ in some form as part of their software.

    Anyone who thinks C or C++ is going away anytime soon is either a clueless idiot or has some vested interest in pushing Java and .NET.

  • by fatp ( 1171151 )
    Because human beings are bound for extinction
  • by mccalli ( 323026 ) on Sunday May 21, 2017 @04:19AM (#54458167) Homepage
    Still employed.
  • I interviewed a guy recently for a C/C++ role, and while he had C down as a skill, he couldn't tell me the most basic things about multi-threading, GUI programming, or even C++ fundamentals. He had spent the last 15 years working on some kind of terminal-based software and simply didn't have a clue. Yeah, I get that his day job probably didn't require those things, but it says much about his temperament and inclination that he couldn't be bothered to learn in his own time either. He didn't get the job.

    Learning somet

  • > right now Java and Python

    Oh, come on. Java has been the leading language for over 15 years now.

    > not that many coders after 2020

    I heard the exact same thing in the 90s with Aspect Oriented Programming. Oh, we won't need programmers; you'd just pick your big building blocks, put them together, and voila!
  • by Assmasher ( 456699 ) on Sunday May 21, 2017 @07:10AM (#54458547) Journal

    ...clearly these are people who know absolute f*** all about creating software.

    C supposedly died long ago, and yet I find myself using it in critical situations, with libraries such as xmlsec, to build the underpinnings of an IdP mechanism - now, those underpinnings are used by golang, but that's been the way of software since the early 90's.

    I don't dream of writing C, and I think golang, .NET, Rust, et cetera are great and useful languages; but just because I've got some sexy new impact wrench, I still find myself reaching for my 30-year-old adjustable wrench on occasion...

  • by c ( 8461 ) <beauregardcp@gmail.com> on Sunday May 21, 2017 @07:24AM (#54458591)

    Ninety percent of coding is taking some business specs and translating them into computer logic.

    I think everyone has encountered code-written-to-spec at one point or another, and has had to rewrite that code such that it violates the spec in order to match reality.

    That aside, a more interesting question is "which programming language will those AI-compatible business specs be written in?"

  • Perl Developer (Score:4, Interesting)

    by Herkum01 ( 592704 ) on Sunday May 21, 2017 @09:12AM (#54458947)

    I have been doing Perl development for a long time, and in the last two years it has straight-out disappeared. You can still find Perl as a job requirement, usually as part of DevOps positions, but jobs actually writing apps in it are gone.

    I have noticed that the new fad for LAMP is Python; it has shown up everywhere, as PHP did years before it, while Perl has been relegated to being a systems administration tool.

    • Re:Perl Developer (Score:4, Interesting)

      by __aaclcg7560 ( 824291 ) on Sunday May 21, 2017 @10:04AM (#54459169)
      I get a lot of crap for stating this on Slashdot... but I thought Perl disappeared years ago. References to the LAMP stack were always to PHP or Python. Perl isn't being used to administer the Windows systems at my current government IT job. I haven't run into Perl in any of my private sector jobs in the last 20+ years.
      • It was always around, but it was not bringing in younger programmers. Instead, those guys were being pulled into the next fad. I cannot blame them; they needed a job, and a lot of the problems they were supposedly solving were more BS than real issues.

        It would keep getting deprecated: by Java and Ruby, then PHP, and now Python (this is over the course of the last 13 years or so).

        It has always existed in the *NIX administration space, and it is installed by default, and it does things that BASH simply canno

  • by tbuskey ( 135499 ) on Sunday May 21, 2017 @09:14AM (#54458955) Journal

    * The CEO and co-founder of an anonymous virtual private network service says "The rise of Azure and the Linux takeover has put most Windows admins out of work. Many of my old colleagues have had to retrain for Linux or go into something else entirely."

    * In addition, "Thanks to the massive migration to the cloud, listings for jobs that involve maintaining IT infrastructure, like network engineer or system administrator, are trending downward, notes Terence Chiu, vice president of careers site Indeed Prime."

    Everyone (and half the ones quoted in the OP) talks about programming, not IT as in the question.

    The IT dept. worries about desktops, data management (NAS/backups), security, connectivity from the desktop to the rest of the company/world, remote access, email, and other business apps (including databases). I think that kind of IT will be around for a while. The apps/email might be outsourced. The desktop will probably be Windows in most cases for a long time, unless MS really makes it unusable for most users.

    We're already seeing some examples of traditional IT moving away from Windows. Look at my kid's school: all Chromebooks and cloud. The school IT needs to do networking/WiFi and account management. I expect that data management and software upgrades are minimized. There would be security work in network configuration and policies for teachers/parents/students, and probably some internal applications (building management and phones?) that can't be outsourced to a web app. Everything else is outsourced to Google. They save lots on IT compared to the Windows/iPads I've seen at other schools.

  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Sunday May 21, 2017 @09:36AM (#54459067)
    Comment removed based on user account deletion
  • by zmooc ( 33175 ) <zmooc@zmooc.DEGASnet minus painter> on Sunday May 21, 2017 @09:39AM (#54459083) Homepage

    So I no longer need to struggle with C/C++ because I need consistently reliable sub-2ms response times and any auto-vectorization I can get. That's awesome! But... what exactly is the alternative supposed to be?
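
    For the auto-vectorization half of that, this is the shape of loop it refers to (saxpy is the standard example; __restrict is a common compiler extension rather than standard C++):

      #include <cstddef>

      // With -O2/-O3, gcc and clang compile this loop to SIMD instructions,
      // provided x and y can be proven not to alias (hence __restrict).
      void saxpy(float a, const float* __restrict x,
                 float* __restrict y, std::size_t n) {
          for (std::size_t i = 0; i < n; ++i)
              y[i] = a * x[i] + y[i];
      }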

    Also, all these people are talking about languages, not jobs. They appear not to understand that programmers can switch to other languages relatively easily and probably already use many languages.

  • by ilsaloving ( 1534307 ) on Sunday May 21, 2017 @09:53AM (#54459133)

    The list was all meh. Maybe. Um...

    Until I got to sysadmin. This is so mindbogglingly stupid that I'm amazed this guy can tie his own shoes in the morning. I didn't even bother looking at the rest of the list.

    If you think your company doesn't need sysadmins anymore just because your infrastructure is 'in the cloud', I REALLY REALLY want to see you do that. Just so I can laugh as your entire company collapses.

  • Oh, really? (Score:4, Insightful)

    by Shoten ( 260439 ) on Sunday May 21, 2017 @01:26PM (#54459851)

    The president of Dice.com says "Right now, Java and Python are really hot. In five years they may not be... jobs are changing all the time, and that's a real pain point for tech professionals."

    I think back to situations like steel workers or coal miners whose jobs disappear... and to the combination of where these people live, the lack of variety in the local economy, and the difficulty of translating their skills to other industries. These things combine to make it nearly impossible for them to maintain their livelihoods. Conversely, in the tech field, that constant rate of change makes it not only relatively easy to change specialties, it eliminates any stigma that comes from having done so.

    Yes, this means that fields and skills sometimes go out of favor...but at least you're not stranded when they do. You have options. Whether or not you exercise those options...that's another thing. I'd rather have options, and have it left up to me whether I fail or succeed.

The "cutting edge" is getting rather dull. -- Andy Purshottam

Working...