Top 10 Dead (or Dying) Computer Skills

Lucas123 writes "Computerworld reporter Mary Brandel spoke with academics and head hunters to compile this list of computer skills that are dying but may not yet have taken their last gasp. The article's message: Obsolescence is a relative — not absolute — term in the world of technology. 'In the early 1990s, it was all the rage to become a Certified NetWare Engineer, especially with Novell Inc. enjoying 90% market share for PC-based servers. "It seems like it happened overnight. Everyone had Novell, and within a two-year period, they'd all switched to NT," says David Hayes, president of HireMinds LLC in Cambridge, Mass.'"
  • c ? really? (Score:5, Insightful)

    by stoolpigeon ( 454276 ) * <bittercode@gmail> on Thursday May 24, 2007 @04:55PM (#19260803) Homepage Journal
    Doesn't really match up with my experience. And putting it next to PowerBuilder? That's just not right.
    • Re:c ? really? (Score:5, Insightful)

      by WrongSizeGlass ( 838941 ) on Thursday May 24, 2007 @05:04PM (#19260985)
      'C' will never die. Period. It has so many uses, from PCs and 'big iron' to embedded systems.
      • Re:c ? really? (Score:5, Insightful)

        by tha_mink ( 518151 ) on Thursday May 24, 2007 @05:39PM (#19261535)

        'C' will never die. Period. It has so many uses, from PCs and 'big iron' to embedded systems.
        What is 'C'? Is that a language? Like latin?

        I'm kidding, but only partially. I was a COBOL developer for lots of years, and I thought that COBOL would never die either. I would say "Too many companies are too invested ..." blah blah blah. I think that I actually even used the 'big iron' quote too when telling my friend how secure COBOL was. Um...I was wrong. I found that out with plenty of time to learn other stuff like Java and so forth but of course it's going to die. Just like C++ will die, just like Java will die, et al. If you've been in our business long enough, you should know better. Everything dies; it's just a matter of time. And if you think you're going to get 30 years out of the technologies that are new now, then you're wrong there too. That's the double-edged sword that is the IT business. Keep learning, keep growing, or start flipping burgers.
        • by Penguinshit ( 591885 ) on Thursday May 24, 2007 @05:51PM (#19261747) Homepage Journal
          Or do what I did and move into management...
        • Re:c ? really? (Score:5, Insightful)

          by WrongSizeGlass ( 838941 ) on Thursday May 24, 2007 @06:01PM (#19261897)

          And if you think you're going to get 30 years out of the technologies that are new now, then you're wrong there too.
          I've been coding in 'C' for 24 years, and unless OS's, drivers, embedded systems, et al, stop caring about performance I think 'C' will out last me in this industry (and probably out live me, too).

          That's the double-edged sword that is the IT business. Keep learning, keep growing, or start flipping burgers.
          I've coded in over 20 languages in my career, from assembly languages to proprietary 4GL's and everything in between, on more platforms than I have fingers. My library has programming texts older than most coders today. Keep learning? Great advice. Give up on 'C'? That's another story entirely ...
          • Re:c ? really? (Score:5, Insightful)

            by Plutonite ( 999141 ) on Thursday May 24, 2007 @06:57PM (#19262607)
            You are definitely right, and it's not just because what happened over the last 24 years of your engineering history is likely to carry over... it's also the nature of the language itself. There is a reason C (and C++) are so damn popular, and the reason is that they embody most, if not all, of what can be done with a general-purpose language. Things like Java and Python will stay for quite a while too, because the design there conforms more to object orientation while keeping most of the general-purpose flexibility, but C and the various assembly languages will never die. It would require a rewrite of the entire architectural basis of computing to throw them out, and the theoretical part of computation theory does not need features that are unavailable here (yet). Anything that can be done in any language can be done (albeit less elegantly) with the aforesaid.

            No one shall expel us from the Paradise that Ritchie has created. (Apologies to David Hilbert)
          • Re:c ? really? (Score:5, Insightful)

            by Anonymous Coward on Thursday May 24, 2007 @07:29PM (#19262999)
            Several times I've been told "nobody does assembler anymore", and yet I still keep needing to use it. I'm not writing whole applications in assembler of course. But the same reasons I keep having to do assembler are some of the same reasons that C is used. C and assembler may be less commonly used than in the past, but unlike COBOL, modern computer systems are still very heavily dependent upon them. Until we get a radically new form of computer architecture, which doesn't seem likely anytime soon, demand will remain for people who can write and maintain the guts of what happens underneath the applications. And you can't do that in Java or C# or Ruby.

            Computerworld Magazine, being an IT rag, is concerned about IT, not computer science or engineering. Thus it worries about product names, not categories. So they point out skills in SNA or Novell NetWare that aren't needed so much anymore, even though computer networking skills are even more popular and vital today than in the past. COBOL may be virtually dead, but dry and dusty applications for business purposes are alive and well and written by people who still wear ties.
        • Re:c ? really? (Score:4, Interesting)

          by Rakshasa Taisab ( 244699 ) on Thursday May 24, 2007 @06:04PM (#19261937) Homepage
          We've already gotten 35 years out of C, and it is still going strong. Not as much used as it used to be, but not insignificant.

          In addition to that, you could say that the C we have today is an old 'stable' fork of that language, which is now moving ahead in the form of C++.
        • Comment removed (Score:5, Interesting)

          by account_deleted ( 4530225 ) on Thursday May 24, 2007 @06:08PM (#19261975)
          Comment removed based on user account deletion
          • by gmack ( 197796 ) <gmack.innerfire@net> on Thursday May 24, 2007 @07:34PM (#19263087) Homepage Journal
            Phone systems are meant to just work, and often the idea is that if it's still working, it should be left that way. I contract for an ISP that has its own ADSL equipment, and I have an access card that gets me into several Bell Canada buildings in Montreal and one in Toronto.

            The telephone world is a weird mix of the state of the art and old.

            I regularly see software that comes on 9-track reels and other ancient equipment. My biggest shock was seeing equipment in downtown Toronto that still uses vacuum tubes.

        • 30 year technologies (Score:3, Interesting)

          by Colin Smith ( 2679 )
          Off the top of my head:

          Unix, shell scripting, C. There must be more.

          Just a thought, but it makes sense to invest skills in technologies with proven survivability.

           
        • I dare to disagree (Score:3, Interesting)

          by Opportunist ( 166417 )
          Cobol had one huge disadvantage compared to C: it was no "system" language. It was an application language. Whether it was good at that is up for debate; it certainly was better than most of the alternatives, but it was dependent on the applications it was used for.

          When the applications died, the language followed. I dare say, ABAP is going to suffer the same fate as soon as SAP wanes and The Next Big Thing comes along. 'til then, it is a get-rich-quick scheme in IT if there ever was one, granted.

          C, in its "pu
        • Re:c ? really? (Score:4, Insightful)

          by jesuscyborg ( 903402 ) on Thursday May 24, 2007 @06:27PM (#19262267)

          I'm kidding, but only partially. I was a COBOL developer for lots of years, and I thought that COBOL would never die either. I would say "Too many companies are too invested ..." blah blah blah. I think that I actually even used the 'big iron' quote too when telling my friend how secure COBOL was. Um...I was wrong. I found that out with plenty of time to learn other stuff like Java and so forth but of course it's going to die.
          But you're forgetting that the reason COBOL is dying is that the tech industry is moving away from the machines and operating systems that make use of COBOL. C will not die in the next half century, because the tech industry is moving closer to technologies that are built on C, like GNU/Linux. If anything, C will become more prevalent.

          In the next fifty years I imagine C's role more or less becoming that of the "mother language". 90% of the time everyone will be using higher level languages like Perl, Ruby, and Haskell on their Linux computers, all of which are programmed in C. Programmers will only need C when they need to change their lower level system tools, or to write new ones that perform very efficiently.

          The only way I can see C dying is if a kernel comes along with a Linux compatible system interface that's written in a language suited better to the massively parallelized CPUs of the future. And once the kernel moves away from C, applications are bound to follow.
        • by Javagator ( 679604 ) on Thursday May 24, 2007 @07:08PM (#19262733)
          I thought that COBOL would never die

          Wait until Y3K. Then everyone will come crawling back, offering COBOL programmers big bucks.

        • Re:c ? really? (Score:5, Informative)

          by nwbvt ( 768631 ) on Thursday May 24, 2007 @07:19PM (#19262905)

          Well, yeah, every language will eventually fade out. But C is still going strong, as it's still the language of choice for many low-level applications. I just searched Monster.com and found over 2500 jobs referencing C [monster.com] (it's possible that some of the results are because the term "C" is too generic, but most of the titles indicate that C programming is actually part of the job), while Python gets 419 [monster.com], Ruby gets 168 [monster.com], PHP gets 612 [monster.com], and JavaScript gets 1736 [monster.com]. How the hell can C be considered dead if it's one of the most popular languages around, and probably still the best available choice for a huge class of applications (just not web applications)?

          And in fact even the "dead and buried" Cobol is still alive, with 174 jobs [monster.com]. Now, it's not as many as the more popular languages, but it's still more than Ruby, which is supposed to be the next big thing.

          Anyways, from TFA:

          As the Web takes over, C languages are also becoming less relevant, according to Padveen. "C++ and C Sharp are still alive and kicking, but try to find a basic C-only programmer today, and you'll likely find a guy that's unemployed and/or training for a new skill," he says.

          Despite what this guy thinks, web programming hasn't "taken over", and never will. Yes, it has a large niche, but there are many systems out there that are not, and never will be, web applications. Unfortunately some people (like this guy; he owns some dumb .com company that no one has ever heard of, so how does that make him an expert on the subject?) have tunnel vision and think that since they work on web applications, everyone else must as well.

        • by Max Littlemore ( 1001285 ) on Thursday May 24, 2007 @07:33PM (#19263069)

          I found that out with plenty of time to learn other stuff like Java and so forth

          WTF? I mean, I can understand learning Java to update your skills, but forth? And how does learning java lead to learning forth anyway?

          Must be one of those ancient COBOL codgers who's lost his marbles. ;-P

        • Re:c ? really? (Score:4, Insightful)

          by Khashishi ( 775369 ) on Thursday May 24, 2007 @10:10PM (#19264629) Journal
          Any programmer worth his salt can pick up a new language in a couple hours. Hell, most languages today are just ALGOL with some syntactical refinements; if you know one, you know them all. I'm not worried if Java or C or Matlab dies out. What separates programmers is ability, not language experience.
          • Re:c ? really? (Score:5, Insightful)

            by Kjella ( 173770 ) on Friday May 25, 2007 @01:35AM (#19266309) Homepage
            Any programmer worth his salt can pick up a new language in a couple hours.

            If you mean whether it's curly braces or brackets or none at all and the syntax of basic control flow, then yes. If you mean being familiar with the standard library, the development tools and all the specific bits (Java generics, C++ templates, take your pick)? No.
          • Re:c ? really? (Score:5, Insightful)

            by shutdown -p now ( 807394 ) on Friday May 25, 2007 @03:44AM (#19267003) Journal

            Any programmer worth his salt can pick up a new language in a couple hours. Hell, most languages today are just ALGOL with some syntactical refinements; if you know one, you know them all.
            Surely you meant to write "all ALGOL-family languages"? Which includes most mainstream ones: C/C++, Java, C#, Python, Perl, BASIC, etc. But you can't easily jump from C++ to Lisp, Erlang, Prolog or FORTH (just to name a few), because they are different. Then there are people having trouble moving from class-based OOP (Java) to prototype-based (JavaScript). Etc... there is a lot out there, and it does differ.
      • Re: (Score:3, Informative)

        by Anonymous Coward
        C's use for PC application-level programming is over.

        C as a vehicle for embedded programming is very much alive. I work as an embedded programmer on devices ranging from 8-bit PICs to DSPs and most things in between.

        How would you like to code a TCP/IP stack in asm? It's not entertaining, and as low-power, low-cost embedded devices increasingly ship with Ethernet MAC and PHY layers embedded in them, C programming for these devices becomes more and more important. At the point where a $1.50 micro can
      • Re:c ? really? (Score:5, Interesting)

        by SadGeekHermit ( 1077125 ) on Thursday May 24, 2007 @09:05PM (#19264001)
        I think they're confused, anyway -- they're writers, not programmers. I bet I can even guess how they did their research: they called up all the recruiters they could find and asked each one to list the languages he/she thought were dead or dying. Then they compared notes on all the responses they got, and built their final list.

        I think the list should be called "top 10 languages recruiters don't want to hear about" because that would be more accurate.

        Realistically, as far as C goes I think the following factors should be considered before declaring it a dead language:

        1. Most of the more popular object oriented languages (Java, C#, C++) use C syntax. C++ is a superset of C.

        2. Java can use compiled C modules as an analog to C's old "escape to assembler" technique. In other words, you can call C code from Java when you have something you want to get "close to the metal" on. Thus, a "Java Programmer" may very well ALSO be a C programmer, even if technically that isn't on his resume or job description. I can do this; I imagine most other Java programmers can as well. What's funny is that, once you're calling C code, you can turn around and use the C code to call assembler, Fortran, or whatever else you like! What a weird world this is!

        (Links for the skeptical):
        http://www.csharp.com/javacfort.html [csharp.com] (Ironic that it's on a CSharp site, no?)
        http://www.mactech.com/articles/mactech/Vol.13/13.09/CallingCCodefromJava/index.html [mactech.com]
        http://java.sun.com/developer/onlineTraining/Programming/JDCBook/jniexamp.html [sun.com]

        3. Linux is still written in C, I believe. As are its drivers, KDE-related programs, Gnome-related programs, and whatnot.

        4. C is the modern version of assembler, isn't it?

        ANYway, I don't think C's going anywhere. You might not be able to get PAID for doing it, as your main speciality will probably be something more buzzword-heavy, but you'll probably be doing some of it as a part of whatever other weird and mysterious things you do in the ITU.

        Poor journalists... One suspects they're rather easily confused these days.

    • by khasim ( 1285 ) <brandioch.conner@gmail.com> on Thursday May 24, 2007 @05:06PM (#19261007)

      "C++ and C Sharp are still alive and kicking, but try to find a basic C-only programmer today, and you'll likely find a guy that's unemployed and/or training for a new skill," he says.

      Now I know some people who've learned on C#, but I'm sure that will change in the near future.

      Anyone who originally learned C, and is still writing code, has probably picked up a few other languages over the years.
      • by rs79 ( 71822 ) <hostmaster@open-rsc.org> on Thursday May 24, 2007 @05:46PM (#19261641) Homepage
        "Anyone who originally learned C, and is still writing code, has probably picked up a few other languages over the years.

        I learned assembly first and have programmed nearly every CPU except a VAX. I learned C second (in 1976) and still use it every day.

        I also know: apl, forth, snobol, fortran, rpg, cobol, lisp, smalltalk, pascal, algol and probably more I can't remember. I've never done much in them except for a tiny bit of forth. I know postscript pretty well and used it quite a bit. But I do C, day in day out and do not think very much of C++ or (worse) C#. The oddball languages have their place but for most it's a pretty narrow niche. I'd rather invert a matrix in Fortran than C though, I'll admit.

        Andrea Frankel said it best on usenet in the 80s: "If you need to hire a programmer ones with assembly are better than ones without".

        It worries me that people today don't actually know how computers work inside any more.
      • by Opportunist ( 166417 ) on Thursday May 24, 2007 @06:24PM (#19262233)
        To quote my Guru: "When you learned C, and you mastered it, you have learned every (procedural) language there is, for it is easier to take a step down rather than up."

        It's pretty much true. Look at the other languages you "should" learn today. Perl, PHP, Python, C#, Java... When you know your C well, learning them is fairly easy.
    • dovetail (Score:5, Informative)

      by Anonymous Coward on Thursday May 24, 2007 @05:06PM (#19261015)
      No better place to dovetail than first post.

      Here's a link to the print version [computerworld.com] for those who dislike clicking 18 times to read a news piece.

      And for those not wanting to feed the gossiping trolls altogether, here's the (pointless) "Top 10" list in short form.

      1. Cobol
      2. Nonrelational DBMS
      3. Non-IP networks
      4. cc:Mail
      5. ColdFusion
      6. C programming
      7. PowerBuilder
      8. Certified NetWare Engineers
      9. PC network administrators
      10. OS/2

      You may now return to the /. index in search of better things to quibble over.
      • Re:dovetail (Score:5, Insightful)

        by MightyMartian ( 840721 ) on Thursday May 24, 2007 @05:21PM (#19261277) Journal
        I don't think you can justify including C and Cobol. There are millions upon millions of lines of code in these two languages, and despite all the sexy new ones that have come along, these two still reign supreme; C is incredibly prevalent on dedicated systems and within a lot of operating systems, and mainframe Cobol code can still be found throughout the business world (though often cleverly disguised these days). I doubt a skilled Cobol programmer will be at risk of starving any time in the near future.
        • Re:dovetail (Score:5, Insightful)

          by Watts Martin ( 3616 ) on Thursday May 24, 2007 @07:51PM (#19263257) Homepage

          I think you (and many others) are somewhat missing the point of the article, although the somewhat histrionic headline encourages a "miss the forest for the trees" reading.

          I don't think anyone is expecting C or even COBOL to vanish with the speed of PowerBuilder or NetWare; the issue is whether those are actually "growth markets" any more. The article is asserting they're not, and particularly in COBOL's case I'm pretty sure that's correct. COBOL will probably live on for quite some time, but you don't hear much about people deploying new COBOL projects -- you hear about them supporting existing ones that haven't been replaced.

          As for "but the OSes are written in C!" as a battle cry: well, yes, they are. But 25 years ago, they sure weren't: C was just too damn big and slow to write an operating system in. What's happened since then? Computers have gotten orders of magnitude faster, RAM and disk space have gotten orders of magnitude bigger, and of course compiler technology has also just gotten better. Couple that with the fact that operating systems and the computers they run on are just a lot more complicated -- having a higher-level language to deal with that, even at the system level, is a real advantage. There's nothing that prevents you from writing an operating system in assembly language now, but under most circumstances you probably wouldn't want to.

          The thing is, unless you want to assert that computers twenty years from now will not be much faster and have much more storage and be much more complicated, you can't assert that moving to a higher-level language than C will never be either practical or beneficial even at a system level. I don't expect C to go away or even be relegated to "has-been" status, but I suspect in the long term it isn't a growth skill. It's going to move more deeply into embedded systems and other arenas where small is not merely beautiful but necessary.

          The comparison with COBOL may be overstated, but it may not be completely inapt: the fact that there are still COBOL jobs out there and they may actually be fairly high-paying ones doesn't mean that going to school, in 2007, in preparation for a career as a COBOL developer is a bright idea. The same isn't as true for C, but I'm not convinced that's going to stay true for that much longer, let alone indefinitely.

      • Re:dovetail (Score:5, Interesting)

        by StarvingSE ( 875139 ) on Thursday May 24, 2007 @05:32PM (#19261433)
        I've said it before, and I'll say it again... lists like this are ridiculously stupid and not thought out. It's like "hey, this is old, it must be obsolete."

        The first two items on the list made me not want to take the author seriously. The financial business is run on COBOL and flat files, and will continue for some time. The language is not pretty, but it was made for a specific purpose and it does it well. In fact, demand for COBOL programmers has risen dramatically as people retire, and it is 7 years after Y2K. I know people who were asked to come out of retirement to work on COBOL again, for very high salaries, because it is not taught to us youngens anymore.
      • Re:dovetail (Score:5, Funny)

        by Stormwatch ( 703920 ) <rodrigogirao@hotm[ ].com ['ail' in gap]> on Thursday May 24, 2007 @07:05PM (#19262691) Homepage
        10 PRINT "What, no BASIC?"
        20 GOTO 10
    • Re:c ? really? (Score:5, Insightful)

      by fyngyrz ( 762201 ) * on Thursday May 24, 2007 @05:20PM (#19261249) Homepage Journal

      No, C isn't in any way going out. C produces fast, tight code that so far, C++ and C# can't even begin to match. C++ is just C with a lot of baggage, a great deal of which you can implement in C in a completely controllable, transparent and maintainable manner. We use the most important of those regularly in C code, specifically objects and objects with methods. We obtain better performance, smaller executables, and smaller memory footprints than any company that makes similar software using C++ or Objective C's add-on paradigms.

      C, and the C sub-domain of C++ and so on, is no more "going away" than C++ itself is. C occupies a unique niche between the metal of assembly and the (so far) considerably less efficient higher level languages — I'm talking about results here, not code. I'm all about recognizing that a few lines of C++ are very convenient, but the cost of those lines is still too high to even think about abandoning C code for performance applications. For many, the object isn't finding the absolute easiest way to write code, but instead trying to find a balance between portability, reasonable code effort and high performance. C sits exactly in that niche.

      C++ is easier to write, almost as portable, but produces applications with large footprints, inherited, unfixable problems inside non-transparent objects (like Microsoft's treeview, to name one), and a considerable loss of speed as compared to a coder who has a good sense of just what the C compiler actually does (which usually means a C coder that has assembly experience, intimate knowledge of stacks and registers and heaps and so on.)

      Speaking as the guy who does the hiring around here: if your resume shows C and assembler experience, you've made a great start. Even just assembler. C or C++ only, and your odds have dropped considerably. C, assembler and either a great math background or specifically signal processing, and now we're talking. C++ doesn't hurt your chances, but you won't get to use it around here. :)
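      The "objects and objects with methods in plain C" approach the parent describes can be sketched roughly like this (all names here are illustrative, not from any actual codebase):

      ```c
      #include <assert.h>
      #include <stdio.h>

      /* A hypothetical "object": data plus function pointers bundled in a struct. */
      typedef struct Counter {
          int value;
          void (*incr)(struct Counter *self, int by);
          int  (*get)(const struct Counter *self);
      } Counter;

      static void counter_incr(Counter *self, int by) { self->value += by; }
      static int  counter_get(const Counter *self)    { return self->value; }

      /* "Constructor": wires the method pointers up. */
      static void counter_init(Counter *c) {
          c->value = 0;
          c->incr  = counter_incr;
          c->get   = counter_get;
      }

      int main(void) {
          Counter c;
          counter_init(&c);
          c.incr(&c, 5);
          c.incr(&c, 2);
          printf("%d\n", c.get(&c)); /* prints 7 */
          assert(c.get(&c) == 7);
          return 0;
      }
      ```

      Everything here is explicit: there's no hidden dispatch machinery, which is the "controllable, transparent" property being claimed, at the cost of writing the plumbing yourself.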

      • Re:c ? really? (Score:5, Insightful)

        by Chainsaw ( 2302 ) <jens,backman&gmail,com> on Thursday May 24, 2007 @05:47PM (#19261665) Homepage

        C produces fast, tight code that so far, C++ and C# can't even begin to match. C++ is just C with a lot of baggage, a great deal of which you can implement in C in a completely controllable, transparent and maintainable manner.

        Wow. You must have had some really shitty software engineers. It's very likely that you can create C++ code that is as fast or faster than C. Yes, I said it. Implementing virtual inheritance and method overloading in plain C is doable, but it will be very complex. Templates? Don't even want to think about it.

        C++ is easier to write, almost as portable, but produces applications with large footprints, inherited, unfixable problems inside non-transparent objects (like Microsoft's treeview, to name one), and a considerable loss of speed as compared to a coder who has a good sense of just what the C compiler actually does (which usually means a C coder that has assembly experience, intimate knowledge of stacks and registers and heaps and so on.)

        I have no idea what the MS Treeview problem is, but once again - the programmers that you have worked with must have sucked balls. I'm an old C coder, with some solid x86 assembler knowledge. As you say, it's possible to get very high performing applications using C. However, why would I do that, when I can create code that is just as fast and much more readable by using C++? Yes - even for embedded development, which is my dayjob.

      • Re: (Score:3, Insightful)

        by iamacat ( 583406 )
        a great deal of which you can implement in C in a completely controllable, transparent and maintainable manner

        So, when you have a small object with 10 methods, do you actually waste 40 bytes on 10 function pointers? Do you define your own virtual table structure, use var->vtable.move(var, x, y) kind of notation and require users to call non-virtual methods as ClassName_MethodName(obj, arg, ...)? What kind of C++ overhead are you avoiding here? How did you like the experience of incorporating a library wh
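        For what it's worth, the usual answer to the per-object-pointer overhead is the pattern the parent alludes to: one shared, statically allocated vtable per "class", so each object carries a single pointer no matter how many methods there are. A rough sketch, with made-up names:

        ```c
        #include <assert.h>

        /* One shared vtable per "class": each object stores a single pointer
           to it instead of one function pointer per method. */
        typedef struct Shape Shape;

        typedef struct ShapeVTable {
            double (*area)(const Shape *self);
        } ShapeVTable;

        struct Shape {
            const ShapeVTable *vtable;
            double w, h;
        };

        static double rect_area(const Shape *s)     { return s->w * s->h; }
        static double triangle_area(const Shape *s) { return s->w * s->h / 2.0; }

        /* Static, shared tables -- allocated once, not per object. */
        static const ShapeVTable rect_vt     = { rect_area };
        static const ShapeVTable triangle_vt = { triangle_area };

        int main(void) {
            Shape r = { &rect_vt, 3.0, 4.0 };
            Shape t = { &triangle_vt, 3.0, 4.0 };
            /* "Virtual" dispatch through the shared table. */
            assert(r.vtable->area(&r) == 12.0);
            assert(t.vtable->area(&t) == 6.0);
            return 0;
        }
        ```

        This is essentially what a C++ compiler generates for you behind the scenes; doing it by hand trades convenience for visibility.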
      • Re:c ? really? (Score:5, Insightful)

        by grumbel ( 592662 ) <grumbel+slashdot@gmail.com> on Thursday May 24, 2007 @06:14PM (#19262075) Homepage
        ### C produces fast, tight code that so far,

        So can other languages, I don't think that is the main selling point of C. I think the main selling point of C is that it is by far the most "compatible" language of all. Python, Perl, Ruby and friends are all based on C, if you want to extend the languages, you do so by writing a module using their C API. If you want to simply call C functions, you can do so from many other languages as well be it Lisp, Ada or whatever. If you want to talk to the kernel you do so in C. No matter what language you use, sooner or later you come to a point where you have to fall back and either write C or interface with C code, since C is the 'real thing', while everything else is just an ugly wrapper around C code, trying to hide it, but often failing at doing so properly.

        As long as a ton of stuff is based on C code it isn't going away, especially not in the OpenSource world where basically everything is based on C.

        Maybe one day some Java or .net based OS will take over, but I don't see that happening for many years or decade(s?) to come.
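        The "everything interfaces with C" point can be illustrated with a tiny example (file and symbol names here are made up for illustration):

        ```c
        #include <assert.h>

        /* A plain C function with default, unmangled linkage. Compiled into a
         * shared library (e.g. cc -shared -fPIC -o libcompat.so compat.c),
         * this exact symbol is what Python's ctypes, Perl's FFI, Ruby's
         * Fiddle, etc. would load -- the flat C ABI is the common currency.
         *
         * Hypothetical Python caller:
         *   from ctypes import CDLL
         *   assert CDLL("./libcompat.so").add_ints(2, 3) == 5
         */
        int add_ints(int a, int b) {
            return a + b;
        }

        int main(void) {
            /* Standalone sanity check; a real library would have no main. */
            assert(add_ints(2, 3) == 5);
            return 0;
        }
        ```

        Higher-level languages can talk to each other, but in practice they almost always do it by meeting at this C calling convention.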
        • Re:c ? really? (Score:5, Insightful)

          by atomicstrawberry ( 955148 ) on Thursday May 24, 2007 @09:06PM (#19264013)
          When you say 'based on C', do you mean that the compiler / interpreter is written in C, or that the language itself is derived from C? Because technically Ruby is based off Perl and Smalltalk, not C. The Perl side of things can be traced back to C, but Smalltalk's origins are in Lisp.

          However they are all implemented in C, as is PHP. In fact, I'm reasonably confident you'll find all of the web languages that the article declares are taking over are implemented using C. As is Apache, which is the backbone of the majority of internet servers. In fact, pretty much everything that provides important infrastructure is written in C.

          There may be demand right now for programmers that know the latest fad high-level language, but the demand for competent C programmers has hardly disappeared. The only reason that C would die is if another fast, portable, general-purpose language like it came along that offered significant benefits over C. I can't personally see that happening any time soon.
    • experience (Score:3, Insightful)

      by DrYak ( 748999 )
      In a similar way:

      Non-relational DBMS
      Yes, maybe they don't play as important a role as before on big iron. But they are actually encountered painfully often in science, where databases usually grow slowly out of small projects that subsequently undergo numerous hacks.
      I'm studying bioinformatics and proteomics; non-relational DBMSs are part of the standard curriculum (and often encountered in the wild).

      C programming
      Yes. Just try to tell that to the OSS community. Almost any cool piece of technology (most librar
  • by Sycraft-fu ( 314770 ) on Thursday May 24, 2007 @04:59PM (#19260865)
    But C? Really? I guess that the fact that nearly every game, every OS, almost every high performance computation tool and so on are written in it (or C++ which I keep under the same heading) doesn't count. While it certainly isn't the be-all, end-all, it is still widely used. Even games that make extensive use of scripting languages, such as Civilization 4, are still C/C++ for the core functions.

    Until there's enough spare processor cycles that it really doesn't matter how much CPU time you use, or a managed language gets as good at optimizing as a good C compiler/programmer combo (unlikely) I don't think C is going anywhere.
    • by LWATCDR ( 28044 ) on Thursday May 24, 2007 @05:04PM (#19260969) Homepage Journal
      C++ is still alive and well.
      I think they are wrong since C is still used on a lot of embedded systems where C++ is too heavy.
      BTW a good number of HPC tools and applications are still written in FORTRAN.
    • by khasim ( 1285 ) <brandioch.conner@gmail.com> on Thursday May 24, 2007 @05:17PM (#19261199)
      They phrased it very badly. C isn't going anywhere. But if all you know is C, then you are very rare.

      Most programmers who know C also know at least one other language.

      In any event, putting that on the list was just stupid.
  • ColdFusion Dead? (Score:5, Insightful)

    by AKAImBatman ( 238306 ) * <akaimbatmanNO@SPAMgmail.com> on Thursday May 24, 2007 @04:59PM (#19260875) Homepage Journal
    I can only hope. Terrible, terrible language. Of course, these days it's actually a template engine for a J2EE server. So it's not nearly as bad as it once was. Unfortunately, most of the ColdFusion projects are massive, sprawling directories from the CF4/CF5 days. You're not likely to see a nicely packaged JAR here. :-/

    Also, what's with "PC Network Administrators"? TFA must be referring to a rather specialized form of administrator, because last I checked we still needed someone to keep the desktops configured, the networks running, the file servers sharing, the login servers logging people in, and the IIS servers serving.
    • Re:ColdFusion Dead? (Score:4, Interesting)

      by funkdancer ( 582069 ) <funky.funkdancer@com> on Thursday May 24, 2007 @06:03PM (#19261929)
      I beg to differ. Whilst I agree that there's some shocking solutions out there, I develop applications using the http://mach-ii.com/ [mach-ii.com] framework and it makes for a great development platform; since version 6 (MX) it has supported components that allow for using object-oriented development principles.

      If one were to use crappy solutions as an argument, how come anyone is still using PHP, ASP, etc.? I think ColdFusion has copped it more than any due to its low threshold of entry, but just because one _can_ make a shit solution using a platform doesn't mean the platform is shit.

      Have been using ColdFusion for 11 years now and it keeps getting better. It's a great second language to know in addition to .net or Java. Unfortunately & ironically, the low availability of developers pushes the case for enterprises to phase the solutions out, as it is too hard to get people who are good at it - even though it is a perfect fit for a lot of the work.
  • by Rakishi ( 759894 ) on Thursday May 24, 2007 @04:59PM (#19260877)
    I mean, this is IT where things change quickly and at times unexpectedly. If you don't have at least a number of diverse skills then I can't say I feel sorry for you when your job gets axed. I may not be a guru in any one language but at least I won't be unemployed when that language dies out.
  • by Ckwop ( 707653 ) * on Thursday May 24, 2007 @05:01PM (#19260915) Homepage

    As the Web takes over, C languages are also becoming less relevant, according to Padveen. "C++ and C Sharp are still alive and kicking, but try to find a basic C-only programmer today, and you'll likely find a guy that's unemployed and/or training for a new skill," he says.


    What, the web can now allocate memory and talk to my hardware? Even if you're not a kernel programmer, the web has sucked and still sucks for application development. It will continue to suck for years, due to Internet Explorer. It's misleading to claim AJAX will solve all these problems, because it won't. In fact, it might even cause a few problems of its own. For example, do you really think all that AJAX is secure? In short, I think the web is taking over what naturally comes to that medium. It is wrong to say it's displaced C.



    Does this guy forget that the GNU/Linux kernel and base system are written in C? You know, the operating system that powers most web servers? I'll tell you one thing: C will still be here in twenty years' time, when Ruby on Rails is talked about much in the same way Blitz Basic is today. C is here to stay; it's immortal.



    Simon


    • Re: (Score:3, Funny)

      by anvilmark ( 259376 )
      This SO reminds me of a comment one of our business users made to my boss a couple years back:

      "We don't need sockets anymore, everything is going to Web now..."
  • LaTeX (Score:5, Informative)

    by Anonymous Cowpat ( 788193 ) on Thursday May 24, 2007 @05:02PM (#19260923) Journal
    with MS equation editor becoming passable, journals that will mark your work up for you and quasi-wysiwyg TeX editors, people who 'do' LaTeX are hard to come by. (Afaik, I was the only person out of ~60 in my year (of physicists) who typed their project report up in LaTeX as plain LaTeX markup. About 4 other people used an editor. Everyone else used Word.) Or maybe it's just that the students in my department are lazy and take little pride in the presentation of their work.
    • Re:LaTeX (Score:5, Funny)

      by serviscope_minor ( 664417 ) on Thursday May 24, 2007 @05:11PM (#19261095) Journal
      You're the lazy one. You know, avoiding all that tracking of cross references, mindless reformatting, applying styles, and doing battle with the (still) inadequate equation editor. Slacker.
    • The curse of good presentation skills is that no-one ever notices that you've used them, because you're good at presentation. :-)

      I'm currently having a similar debate at the office. We're working on a new tool, effectively a web front end for a simple database with a few handy UI gimmicks. In the grand scheme of things, it's a pretty simple tool, but it's part of a routine business process and literally thousands of people are going to be using it for a few hours each month.

      At a progress meeting yesterd

    • Re: (Score:3, Informative)

      by siwelwerd ( 869956 )
      LaTeX isn't going anywhere. It is the standard among mathematical journals for good reason.
  • True story... (Score:5, Interesting)

    by KingSkippus ( 799657 ) * on Thursday May 24, 2007 @05:02PM (#19260925) Homepage Journal

    When I started working at the huge multinational company I work at now, there were three things that I had very little experience with that everyone swore would last at the company for decades to come: Token Ring, Netware, and Lotus Notes. I insisted that within the next few years, these technologies would be dead and the company would have to change, and I was constantly reminded of the millions of dollars invested in them.

    It's eight years later. We have no Token Ring network. We have no Netware servers. I'm doing my damned best to convince people of how bad Lotus Notes sucks, and most everyone agrees, but we have a Notes support team that really likes their jobs and somehow manages to convince upper level management that it would cost billions of dollars to change to a real e-mail and collaboration solution. But I'm still holding out hope.

    Godwilling, Lotus Notes will soon be on this list as well.

  • If only... (Score:5, Funny)

    by 26199 ( 577806 ) * on Thursday May 24, 2007 @05:02PM (#19260933) Homepage

    ...writing unreliable, poorly-documented, just-about-does-the-job-and-only-if-you-get-lucky code would go out of fashion.

    Sadly it seems to be here to stay. In fact with the better availability/quality of scripting languages it is, if anything, becoming more popular...

  • by Anonymous Coward on Thursday May 24, 2007 @05:04PM (#19260971)
    1. secure software coding
    2. data management theory
    3. data modeling
    4. usability
    5. interface design
    6. use of testing, version control, refactoring, and other best practices
    7. space or time efficient algorithms
    8. general communications skills
    9. basic business concepts like ROI
    10. business ethics
  • Delphi (Score:3, Interesting)

    by CrazyTalk ( 662055 ) on Thursday May 24, 2007 @05:08PM (#19261047)
    Anyone out there still use Delphi? Does it even exist anymore? I'm a bit nostalgic for it - that was my first professional programming gig.
    • Re: (Score:3, Informative)

      by Anonymous Coward
      I'm nostalgic too. Delphi still is one of the best development systems for Windows out there. Too bad Borland/Inprise/Borland jumped on the .NET bandwagon and destroyed it. But if you get an old copy of D6 (or D7? cannot remember which one was the last non .NET-polluted version) you can write great software with it.

      Also, take a look at Lazarus [freepascal.org]. It's a multiplatform and open source Delphi clone that brought the beauty of Delphi to Linux.
      Note that it's 100% native on all platforms and produces 100% native cod
  • Web Design (Score:5, Insightful)

    by happyfrogcow ( 708359 ) on Thursday May 24, 2007 @05:11PM (#19261089)
    Judging by their web page, all design jobs are dead too. We should all just write web pages to serve ads, because C is dead.

    This article is trash, even if it does have some technologies that are irrelevant. It has very little value to the reader. I'd rather read a 10 top list for reasons Paris Hilton should be locked up for life.

  • by baggins2001 ( 697667 ) on Thursday May 24, 2007 @05:13PM (#19261137)
    I don't see where either of these are going away.

    There just aren't that many people who know networking outside of IT, and there are still a lot of people who get confused about what is going on. I've seen many people kludge together a network at their office, only to find out it sucks after a while, so they have to call somebody in to look at it.
    C programming going away? I'm always seeing algorithms with some part of C in them. Partly because these guys with VB skills say, hey, there's no reason to learn all that hard stuff, we'll just get more/bigger hardware. So far they have spent $300K on hardware and 5 man-years of programming. They've got a lot of code but nothing to show for it. Runs fast and cranks through a lot of data, but nobody can figure out what it's good for.

  • PC network admins? (Score:5, Insightful)

    by Volante3192 ( 953645 ) on Thursday May 24, 2007 @05:23PM (#19261305)
    With the accelerating move to consolidate Windows servers, some see substantially less demand for PC network administrators.

    Apparently this guy's never dealt with users. If there's a way to screw up a system, even a dumb terminal, they WILL find a way.
  • Cobol has died back as much as it's going to, same as Fortran. It won't reduce in scale any further, because of maintenance requirements, so it is meaningless to say it is "dying". It's a stagnant segment, but it's a perfectly stable segment.

    Non-IP networks are dying? Must tell that to makers of Infiniband cards, who are carving out a very nice LAN niche and are set on moving into the WAN market. Also need to tell that to xDSL providers, who invariably use ATM, not IP. And if you consider IP to mean IPv4, then the US Government should be informed forthwith that its migration to IPv6 is "dead". Oh, and for satellite communication, they've only just got IP to even work. Since they weren't using string and tin cans before, I can only assume most in use are controlled via non-IP protocols and that this will be true for a very long time. More down-to-earth, PCI's latest specs allow for multiple hosts, and PCI is becoming a LAN protocol. USB, FireWire and Bluetooth are all networks of a sort - Bluetooth has a range of a mile, if you connect the devices via rifle.

    C programming. Well, yes, the web is making pure C less useful for some applications, but I somehow don't think pure C developers will be begging in the streets any time soon. Device driver writers are in heavy demand, and you don't get far with those if you're working in Java. There are also an awful lot of patches/additions to Linux (a pure C environment), given this alleged death of C. I'd love to see someone code a hard realtime application (again, something in heavy demand) in AJAX. What about those relational databases mentioned earlier in the story? Those written in DHTML? Or do I C an indication of other languages at work?

    Netware - well, given the talk about non-IP dying, this is redundant and just a filler. It's probably right, but it has no business being there with the other claim. One should go.

    What should be there? Well, Formal Methods is dying, replaced by Extreme Programming. BSD is dying, but only according to Netcraft. Web programming is dying - people no longer write stuff, they use pre-built components. Pure parallel programming is dying -- it's far more efficient to have the OS divide up the work and rely on multi-CPU, multi-core, hyperthreaded systems to take care of all the tracking than it is to mess with very advanced programming techniques, message-passing libraries and the inevitable deadlock issues. Asynchronous hardware is essentially dead. Object-Oriented Databases seem to be pretty much dead. 3D outside of games seems to be dead. Memory-efficient and CPU-efficient programming methods are certainly dead. I guess that would be my list.

  • I guess this shouldn't surprise me. Even though Netware was pulling "five nines" (of reliability, for those not familiar with the term) long before anyone considered running any flavor of windows on a server, we see another article bashing Netware.

    Sure, its sales have declined drastically, but I wouldn't say that its relevance has. I'd be willing to bet that if we were to actually survey what file servers are still running out there, we'll see a much larger representation of NetWare. Just because people aren't buying the latest version doesn't necessarily mean that they aren't using the old ones.

    For two years, I managed the computer network of a daily newspaper - including through the election debacle of 2000 and the 9/11 events. We ran that network primarily off of four netware 4.11 (later netware 5.0) servers. One of those servers had been running for over 400 days continuously when I left, and it served files and print jobs. That kind of reliability is hard to match.

  • by i_like_spam ( 874080 ) on Thursday May 24, 2007 @05:33PM (#19261451) Journal
    Yeah! Does that mean that my FORTRAN programming skills are still marketable?

    "What will the language of the year 2000 look like? Nobody knows, but it will be called FORTRAN." John W. Backus
  • by bobdehnhardt ( 18286 ) on Thursday May 24, 2007 @05:35PM (#19261471)
    Back in the days of DOS, when everything had to fit in 640KB RAM (give or take), the prized skill was loading device drivers into UMBs and High Memory. Sure, there were tools you could use, like QEMM [wikipedia.org] or memmaker in MS-DOS 6, but Real Admins did it by hand.
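    For the youngsters: a hand-tuned CONFIG.SYS from that era looked roughly like this. This is a from-memory sketch; the exact driver names and paths varied from machine to machine, and the real art was in the ordering and the EMM386 switches:

```
REM Load the XMS driver first, then the expanded-memory manager
DEVICE=C:\DOS\HIMEM.SYS
REM NOEMS frees the page frame, giving more room for UMBs
DEVICE=C:\DOS\EMM386.EXE NOEMS
REM Load DOS itself into the HMA and enable upper memory blocks
DOS=HIGH,UMB
REM Push each driver into upper memory instead of conventional RAM
DEVICEHIGH=C:\DOS\SETVER.EXE
DEVICEHIGH=C:\DRIVERS\MOUSE.SYS
```

    Get the order or the switches wrong and you'd either lock the machine on boot or end up with less conventional memory than you started with.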

    I carried a specially tuned DOS disk around with me, and would whip it out whenever anyone complained that a certain program wouldn't load. Boot off the floppy (with around 630KB conventional memory available after all drivers loaded), run the program with no problem, deliver the classic "It works for me" tech support line, slip the boot disk back into my pocket, and leave the user convinced they're doing something wrong.

    Ah, good times, good times....
    • Re:Memory Tuning (Score:4, Interesting)

      by camperdave ( 969942 ) on Thursday May 24, 2007 @07:04PM (#19262685) Journal
      I remember being able to squeeze 704K of conventional memory out of some systems (shadow RAM at $A0000, and a video card that sat at $B0000), then have certain programs complain that there was insufficient memory... because their counters wrapped at 640. Good times indeed.
  • ColdFusion (Score:4, Interesting)

    by Goyuix ( 698012 ) on Thursday May 24, 2007 @05:42PM (#19261581) Homepage
    Wow. I didn't actually expect this to be on the list but I am not at all surprised. We use it at my work as the primary web platform and I can assure you of two things regarding the language: 1) it is very hard to find someone with development skills using it and 2) the ones who do have the skills are VERY expensive. That seems to go along nicely with the theme of the article that it is in fact a dying skill. While I personally have never developed much of a taste for it (I do post on /. after all - it would be like heresy / blasphemy) there are a few long-time-developers here that have an unholy allegiance to it, almost completely unwilling to even look at alternate environments or frameworks. I would guess that is probably similar for many of the languages/skills on this list and their long time supporters.
  • Typing (Score:3, Interesting)

    by radarsat1 ( 786772 ) on Thursday May 24, 2007 @05:45PM (#19261625) Homepage
    I was just copying down a quote from an article I was reading, and I realized that my typing skills haven't really been up to scratch, even though I spend hours and hours on the computer every day. For programming and general writing, I spend a lot more time thinking than actually writing, plenty of time to fix typing mistakes. Rarely do I ever just copy something directly, but this time I happened to be putting a long block quote into my document.

    It got me thinking.. secretaries used to be hired just based on their typing skills. Speed & accuracy. I remember when I took a typing class in high school the teacher made us cover the keyboard so we couldn't look at it while we were typing, and we especially weren't allowed to use the delete key so she could mark us on how many errors we made.

    But it's funny, that's so backward, of course. Since typewriters are no longer used, your typing speed _includes_ the time it takes to hit the delete key and fix what you did wrong. Your time further increases if you have to look at the screen and then find your place in the text. So typing speed is now the only thing that counts...

    Now add into that the fact that the days of the boss dictating memos to the secretary are mostly gone, and typing is really a skill that no longer matters. It certainly helps in day-to-day computer tasks, but it's no longer a make or break skill for IT and office people.
  • by sootman ( 158191 ) on Thursday May 24, 2007 @06:16PM (#19262119) Homepage Journal
    1) knowing what extensions are
    - Both the fact that they exist in the first place AND what the different ones mean--"ooh, should I click on hotsex.jpg.doc.exe.scr.pif?"

    2) looking at the URL in the status bar before clicking on a link
    - Apple: I love you, but you SUCK for having the status bar off by default in Safari.

    3) knowing where downloaded files go
    - Every phone-based support call I've ever made:
    a) Painfully (see #4) navigate to a URL.
    b) Painfully (see #5) instruct user to download a file.
    c) Spend 5 minutes telling them where that file is on their computer

    4) the difference between \ and /
    - these people saw a backslash ONCE in their lives while using DOS about twenty years ago, and now every time I tell them an address, it's "Is that forward slash or backslash?" (Despite the fact that I've told them a million times that they'll pretty much NEVER see a \ in a URL.) This is usually followed by the question "Which one is slash?" God damn you, Paul [nytimes.com] Allen. [wired.com]

    5) the difference between click, right-click, and double-click
    "OK, right click on My Computer... no, close that window. Now, see the mouse? Press the RIGHT BUTTON..."

    6) the concept of paths, root directories, etc.
    - Why do I have to explain fifty times a day how to get from example.com/foo to example.com?

    Admins can get whatever skills they want--they picked the career, they can accept the fact that things change. The backends are usually handled by people with some know-how. It's the end-users that cause all the problems. It'd be like driving in a world where people didn't know how to use turn signals, didn't check their blind spots, didn't know they shouldn't talk on the phone while making complicated maneuvers--oh, wait, bad example.
  • by Opportunist ( 166417 ) on Thursday May 24, 2007 @07:11PM (#19262785)
    A good network admin is sought after. And he will never be out of a job.

    Notice the "good" in the above statement, please!

    Unfortunately, network admins have already suffered for years from what we (programmers) are facing now: Clueless wannabes flooding the market. Sounds harsh, is harsh, but it's sadly true. Everyone who can spell TCP/IP and doesn't think it's the Chinese secret service calls himself a net admin. And since human resources usually can't tell a network cable from a phone cable, they hire the ones with the cutest looking tie. Or the one with the most unrelated certificates.

    Quite frankly, I have met so many people who claim to be net admins who know even LESS about networks than me. And I can barely cable my home net, and I can't solve the retransmission issues with my game machine that clog it. I do expect a lot from a net admin, granted, but for crying out loud, it's their JOB to know more about networks than I do, or I could do it myself!

    What you get today as a "network administrator" is some guy who can somehow, with a bit of luck, good fortune, a graphical interface and a step-by-step guide from the 'net, get the DHCP server on a Win2003 Server up and running. Don't bother trying to get a static IP or even a working DNS server from him. Not to mention that he'll look blankly at you when you ask him about splitting the 'net into smaller chunks. Anything in a netmask other than 00 or 0xFF (sorry: 0 and 255) is alien to him.

    That's not what I call a network administrator. That's what I call a clickmonkey.
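    Not that subnetting is arcane, mind you. Here's a quick sketch using Python's standard-library ipaddress module (Python is just my choice of calculator here; a real net admin does this on paper):

```python
import ipaddress

# Split a /24 into four /26 subnets -- the kind of thing that
# separates a network administrator from a clickmonkey.
net = ipaddress.ip_network("192.168.1.0/24")
subnets = list(net.subnets(new_prefix=26))

for s in subnets:
    print(s, "netmask", s.netmask)
# 192.168.1.0/26 netmask 255.255.255.192
# 192.168.1.64/26 netmask 255.255.255.192
# 192.168.1.128/26 netmask 255.255.255.192
# 192.168.1.192/26 netmask 255.255.255.192
```

    The .192 in that netmask is exactly the "anything other than 0 or 255" that sends the clickmonkey running for a step-by-step guide.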

    True network administrators who got more than an evening school degree are still rare. And they will have a job, with companies that know what to look for in a net admin.

    But the plague spreads. Recently we hired a "programmer" who doesn't know the difference between heap and stack. Or why inserting an inline assembler line of INT3 could do some good for his debugging problem.

    And we wonder about buffer overflow issues and other security problems in code? I stopped wondering.
  • by Locutus ( 9039 ) on Thursday May 24, 2007 @07:19PM (#19262895)
    That's right, after Microsoft shipped Windows 95, they dumped hundreds of millions on pushing Windows NT at the server markets. It was a full-blown marketing attack on UNIX, Netware, and Lan Manager/OS/2, and we know it was marketing that won the day and admins who lost. How many UNIX servers turned into a dozen WinTel PCs after they found out one WinTel PC couldn't run more than a few server processes and had to be split into one service per PC? Then they had to pull in replication to get anything close to the 99.9999% uptime of the UNIX systems.

    Yup, it's interesting how snake oil still gets sold year after year but only under a different name. IMO.

    Oh, and virtualization, that's all about moving all those single tasking servers back into one box where one crash won't take out the others. That's innovation for ya. Go Microsoft! :-/

    LoB
  • Other dead skills (Score:4, Interesting)

    by Orion Blastar ( 457579 ) <orionblastarNO@SPAMgmail.com> on Thursday May 24, 2007 @08:11PM (#19263431) Homepage Journal
    Turbo Pascal, phased out with Delphi and Free Pascal/Lazarus replacing it. I still know people who know Turbo Pascal and I learned Turbo Pascal in 1985.

    LANTastic, I recall some people were experts with this network. I can recall that when Windows for Workgroups came out with built-in networking, LANTastic went into decline.

    DBase and Clipper, I can recall porting databases and code written in them to MS-Access in 1996-1997.

    Wordperfect 5.0/6.0 macro writing. I know some small law firms that still have document templates automated with Wordperfect 5.0 for DOS or Windows. Hardly anyone else uses Wordperfect anymore; most have moved to MS-Word and use VBA for macros.

    AmigaDOS/AmigaOS it used to be the bee's knees for video and multi-media in the late 1980's, I am one of the few left that still has Amiga skills on my resume. AmigaOS reached 4.0 quite some time ago, but hardly anyone uses it anymore except in Europe for various niche markets.

    ProDOS, AppleDOS, I think the Apple // series is long since dead and buried, but still alive in some poor school districts that couldn't afford to replace them.

    Mac OS9 and earlier, I think Mac OSX is the top dog now. The Classic MacOS is no longer in demand, and 68K Macs are only used in school districts that couldn't afford to replace them.

    BeOS, despite attempts to bring it back from the dead using open source. BeOS used to be popular in the late 1990's and used to run on PowerPC Macs and Intel PCs. I can recall some of my friends used to develop software for BeOS, but not anymore.

    Wang, some people I know still list Wang skills on their resume. It used to be in high demand, but once Windows NT 4.0 and Windows 2000 Server came out, there was a mass migration from Wang, after Wang shrank and almost went out of business. They did have a Visual BASIC graphic tool called Wang ImageBasic, but I think Internet Explorer 4.0 or 5.0 broke it, and so did Visual BASIC 6.0. I think Leadtools replaced it.

    8 Bit Computers, nobody really uses them anymore. Big Businesses only used the Apple // or CP/M systems and the Atari, Commodore, Sinclair/Timex, etc were used in the home mostly.

    The Apple Newton, the Palm Pilot and Windows CE devices replaced it.

    Arcnet and Starnet cards, Ethernet replaced them. Token Ring is almost dead, but some die-hard IBM Fans still use it at their companies. Anyone remember making twisted pair and coaxial cable network wires for Arcnet and Starnet networks? I do.

    MS-DOS 6.X and Windows 3.X and earlier, like OS/2 they deserve to be mentioned. I think some older charities and non-profit organizations still use them on old 286/386 systems that cannot run even Windows 95, and they use a Netware 3.X server to share files with them.

    MS-Foxpro, does anyone still use it? After MS-Access got upgraded, and MS-SQL Server had more features added to it, MS-Foxpro became redundant.

    Assembly Language, Machine Language, remember writing native code for the 8088, 68000, 6502, 6809, IBM Mainframes, etc? Hardly any company wants us to write in Assembly or Machine language anymore. It seems like only hackers use these to do exploits and write malware.

    FORTRAN, I think BASIC and C sort of replaced it, and then C++ and Java replaced them. FORTRAN got NASA to the moon, but NASA uses Java or Python now.
  • by glwtta ( 532858 ) on Thursday May 24, 2007 @09:19PM (#19264143) Homepage
    Neither are C, ColdFusion, or NetWare certification - programming and software design are skills, as is network administration; what they list are called tools.
  • my experiment (Score:4, Interesting)

    by wellingj ( 1030460 ) on Thursday May 24, 2007 @10:09PM (#19264615)
    I went to dice.com and started a blank search.
    The number of jobs(posted in the last 30 days) that was listed if I picked C as a skill?
    Answer: 17139 jobs

    Java?
    Answer: 15760 jobs

    So.....Myth-busted?
  • by bscott ( 460706 ) on Friday May 25, 2007 @04:19AM (#19267189)
    0. Tweaking IRQs on PC clones to let soundcards work with any other card
    1. Knowing how to drop certain types of home computer to re-seat the chips
    2. Inserting 64k RAM chips with your bare hands to expand memory
    3. Cutting a notch in 5-1/4" floppies to use the other side
    4. Adjusting graphics by hand to NTSC-legal colors for decent video output
    5. Editing config.sys to push drivers into HIMEM in order to free up memory
    6. Crimping your own RJ45 connectors to save money
    7. PEEK and POKE locations to do cool stuff on the Commodore 64
    8. Manually configuring a SLIP connection to connect to the Internet (in pre-Winsock days)
    9. Removing adjectives and punctuation from code comments to fit into 1k of RAM
