Is Computer Science Dead?

warm sushi writes "An academic at the British Computer Society asks, Is computer science dead? Citing falling student enrollments and improved technology, British academic Neil McBride claims that off-the-shelf solutions are removing much of the demand for high-level development skills: 'As commercial software products have matured, it no longer makes sense for organizations to develop software from scratch. Accounting packages, enterprise resource packages, customer relationship management systems are the order of the day: stable, well-proven and easily available.' Is that quote laughable? Or has the software development industry stabilized to an off-the-shelf commodity?"
  • Wow! (Score:5, Interesting)

    by OverlordQ ( 264228 ) on Tuesday March 13, 2007 @03:58AM (#18329123) Journal
    Accounting packages, enterprise resource packages, customer relationship management systems are the order of the day: stable, well-proven and easily available.

    And who made those packages?

    Software don't write itself.
    • Re:Wow! (Score:5, Funny)

      by daranz ( 914716 ) on Tuesday March 13, 2007 @04:11AM (#18329215)
      They arrived from lands far away thanks to the magic we call outsourcing?
      • Econ 101 (Score:5, Insightful)

        by GCP ( 122438 ) on Tuesday March 13, 2007 @12:51PM (#18335621)
        I program because I love it. I've been doing it since before the Internet boom brought in all the carpetbaggers.

        Some years back, before the boom, I decided that Moore's Law (and other economic forces) were going to increase the number of programmable devices exponentially for decades to come, creating an insatiable economic demand for programmers. When the iBoom arrived, I saw it as a short-term overreaction, but still part of a long-term extreme ramp-up in demand for programmers.

        Then I started studying economics seriously and discovered the mistake in my thinking. Demand for programmers is not proportional to the amount of code running in the world. I've written code that will soon be on a billion (with a "B") devices, but it's the same code it was when it was on fewer than 100 M devices, and those of us who wrote it easily fit in one small cubicle pen.

        Real demand for programmers depends on how much NEW code has to be written and HOW FAST. (And by "new" I certainly include maintenance, glue code, customization of existing packages, etc.) If the number of programmable devices explodes (as I still believe--and observe), much of it will run code written by very few people, customized a bit, tweaked and glued by a few more people for other devices, and massively replicated. And if that customization can be done slowly enough, it can be done by an arbitrarily small group of programmers. Custom code for your own personal needs and those of your business group will constitute most new code, and that will be supported by tools that do what you want with a minimum of "programming" on your part--tools like Excel.

        Then the same Moore's Law and other forces that create the "everything will run software and be connected" world of the future also bring a hundred million or more new potential programmers into the developed world economy (without ever leaving their local undeveloped economies) each year to meet the demand for however much new code needs to be written each year, and the job of "programmer" is going to look more and more like various factory worker jobs (the decent ones, not the dangerous ones).

        So the professor is decrying the falling interest in Computer Science. How would enrollment look in a "Factory Science" department at his university, I wonder....

        • Re: (Score:3, Insightful)

          by misleb ( 129952 )

          Then the same Moore's Law and other forces that create the "everything will run software and be connected" world of the future also bring a hundred million or more new potential programmers into the developed world economy (without ever leaving their local undeveloped economies) each year to meet the demand for however much new code needs to be written each year, and the job of "programmer" is going to look more and more like various factory worker jobs (the decent ones, not the dangerous ones).

          What kind

    • Re: (Score:3, Funny)

      by sunami ( 751539 )

      Software don't write itself.
      All in good time... all in good time.
    • Re: (Score:3, Insightful)

      by jevring ( 618916 )
      Exactly. Even if there is an overflow of good developers at some point, they all retire (and eventually die), so then someone else is going to have to pick up the torch.
    • Re:Wow! (Score:5, Insightful)

      by codonaill ( 834165 ) on Tuesday March 13, 2007 @04:44AM (#18329407)
      There are 3 other jobs I can see that require CS skills, but are not product development/design jobs.

      1. Who buys it? What skills do Computer Scientists need to differentiate between Brand X and Brand Y billing system? Basically, proper product selection is as tough a job as product design - because you have to beat down sales jargon and work out what a system actually does - generally without unfettered access to the system itself.

      2. Who builds the Middleware/Integration layer? This is so specific to individual companies that you'll never get a solution that fits all the heterogeneous parts of your network.

      3. Who builds large networks of products - i.e. works out that Portal solution 1 goes well with reporting solution 2 and alarm system 3. Who breaks down the business flows between these and who keeps track of strategic direction in each area?

      Dunno, still think there's plenty of non-dev jobs out there for CS graduates...

      C.
      • Re:Wow! (Score:5, Insightful)

        by Scarblac ( 122480 ) <slashdot@gerlich.nl> on Tuesday March 13, 2007 @05:17AM (#18329639) Homepage

        How about imaging research (stuff like using image processing to learn about the state of food stuffs with infrared cameras), or the hard problems that need to be solved to get to the Semantic Web?

        There is a lot of CS work out there. But it's science work, not programming or product development. That's not CS, that's engineering or just programming.

        • Bingo (Score:5, Interesting)

          by benhocking ( 724439 ) <.moc.oohay. .ta. .gnikcohnimajneb.> on Tuesday March 13, 2007 @07:28AM (#18330405) Homepage Journal

          There is a lot of CS work out there. But it's science work, not programming or product development. That's not CS, that's engineering or just programming.

          Leaving aside the issue of whether there is plenty of programming or product development work still out there (I think there is), you're absolutely right. We might as well argue that physics is dead because there are so few jobs for physicists. The supply/demand ratio for physicists is quite high. However, that doesn't mean that there isn't plenty of good science left to do. (No talking about string theory here - too volatile a topic.)

          Examples of very interesting areas in computer science, besides software development, compilers, networking, programming languages, graphics, and architecture include: quantum computing, neural networks, genetic algorithms, and genetic algorithms with neural networks. (Perhaps I'm a wee bit biased here.) I guess to be fair I should also mention the tremendous growth in bioinformatics.

          • We might as well argue that physics is dead because there are so few jobs for physicists. The supply/demand ratio for physicists is quite high. However, that doesn't mean that there isn't plenty of good science left to do.

            But it is cheaper to do it overseas. If it does not involve local culture or heavy interaction with the customer, then firms will find it cheaper to offshore such work. The laws of physics and math are the same in Bangalore, but paychecks are not.

            I see IT in the US moving toward more han
            • Re: (Score:3, Interesting)

              by dgatwood ( 11270 )

              Nonsense. It is cheaper to do it at universities where you can pay the researchers next to nothing even by outsourced standards. Better still, foreign universities where you can pay next to nothing even by American university standards. :-)

              Corporate environments don't tend to lend themselves to heavy research. I'm sure there are exceptions, but they are the exception. If you want to do research, do a postdoc at a university. That sort of thing has limited potential for long-term financial stability,

              • I see no reduction in computer science work here at my university. The possibilities are increasing, not decreasing. One can bemoan the out-sourcing of American jobs, but that has nothing to do with the fact that computer science is not dead. Such a statement is as ridiculous as saying physics is dead.

                Computer science is still a very vibrant research field.

        • Re:Wow! (Score:5, Insightful)

          by morgan_greywolf ( 835522 ) * on Tuesday March 13, 2007 @07:49AM (#18330575) Homepage Journal
          Right. The term 'computer science' has become so muddled because people confuse applied computer science (computer information systems) with actual computer science. Computer science is pure science -- solving the hard problems to advance computing technology. People who have programming or computer engineering jobs are NOT computer scientists any more than mechanical engineers are physicists or pharmacists are chemists. Not that CS majors don't get jobs in the computer information systems arena, just as many physics majors go off and do engineering jobs. But the work of a software engineer just ain't science.
          • Re:Wow! (Score:5, Insightful)

            by Bill_the_Engineer ( 772575 ) on Tuesday March 13, 2007 @08:53AM (#18331293)

            But the work of a software engineer just ain't science.

            My colleagues and I, being software engineers in X-ray astronomy, disagree with you.

            Sure some CS majors go on to make a new computer language or new technique for image analysis, but that doesn't make the software engineer less scientific. The systems we develop are used by X-Ray astronomers and would not exist without the Electronic Technician, Electrical Engineers, Mechanical Engineers, Computer Engineers, and Software Engineers.

            It has been my observation that most of the science is done by physicists (and other scientists) who understand enough about computers to code their own small routine to illustrate their point, and hand it off to us software engineers to clean up, make reliable, and integrate into a complete hardware system that is capable of performing the science work they need.

            When spending millions of dollars on one-of-a-kind hardware, you not only depend on the computer algorithm being correct but also reliable, thoroughly tested, and an integral part of a well engineered system. All of our science is done in unmanned flights, so we can't simply reboot when something goes wrong.

            Before you correct me and say that we are not capable of computational science, my colleague developed a tracking system that calculates vehicle orientation based on images of stars captured by a telescope mounted on the vehicle...

            Anyway my point is that science is more engineering than algorithm these days. I'm not saying pure computer science is not important. I'm saying that we must introduce engineering practices into computer science to tackle the hard problems. This is why I believe that Computer Science is evolving into Software Engineering.

            As for the non-scientific information systems, that's a job for an MIS graduate... :P

            • Re:Wow! (Score:5, Insightful)

              by Anonymous Coward on Tuesday March 13, 2007 @09:13AM (#18331603)
              1. computational science is not the same as pure computer science; if anything, pure CS resembles discrete mathematics.
              2. computer science might be spinning off "engineering" disciplines, but only after certain hard problems are solved; I seriously doubt that in your effort to create your positional system you discovered anything new about graph theory, cryptography, or algorithm analysis.

              I work with people like you every day who think that because they write highly technical programs to convert their specialized knowledge into something that runs, they are doing "computer science." That is "applied" CS, and if you wanted to get into it, you are really doing an engineering task.

              I have an undergrad in mechanical engineering, an MS in CS, and am working on my PhD in CS right now. I know the difference between real CS and applied CS/software engineering - and it is vast. I'd also argue that MIS people are vastly more useful than people who call themselves computer scientists because they have a formal education in some technical discipline yet work mostly with computers.
              • Re:Wow! (Score:4, Insightful)

                by Bill_the_Engineer ( 772575 ) on Tuesday March 13, 2007 @10:53AM (#18333509)

                computer science might be spinning off "engineering" disciplines, but only after certain hard problems are solved; I seriously doubt that in your effort to create your positional system you discovered anything new about graph theory, cryptography, or algorithm analysis.

                Have you? I admit it's been a couple of years since I studied cryptography (S-boxes, Feistel networks, self-synchronizing stream ciphers, oh my!). However, I do perform algorithm analysis, and I use and try to improve the current working state of graph theory. You assumed that since I use engineering practices I am somehow incapable of performing science. I admit my last internal paper was over 2 years ago, but in lieu of publishing I have been working within a couple of science missions.

                I work with people like you every day who think that because they write highly technical programs to convert their specialized knowledge into something that runs, they are doing "computer science." That is "applied" CS, and if you wanted to get into it, you are really doing an engineering task.

                I'm sorry I thought science required using specialized knowledge to prove a hypothesis (or create a program that does). I'm glad you corrected me... I have worked with people who think that because they are pursuing a PhD, that somehow they know better than the rest of us on how things are done.

                I do work with people who have actual PhDs in CS (and physics), and they never once considered me a code jockey or strictly "applying" computer science. We have mutual respect in our field of work. Just because I have a degree in Software Engineering does not mean that I just code. I am not a manager of a large software project; I am a member of a 3-person software team (within a larger program) tasked with doing cutting edge work. If current technology can't do what we need, we must invent it. Admittedly, I do need to publish more.

                My point in my previous message was not that all software engineers are scientists, but rather some computer scientists are software engineers. Well, I'll let you get back to pumping lemmas...

                • Re: (Score:3, Insightful)

                  by kalaf ( 963208 )

                  He is correct, in so much as you basically described software engineering.

                  Engineering, in my mind, is finding solutions to problems. Science, on the other hand, is more like identifying new problems.

                  Just because your systems are designed to solve science problems doesn't make you a scientist. That said, it's not like there isn't any cross between the disciplines. Software engineering grads probably learn about P vs. NP and CS grads receive minimal instruction in software engineering (I say minimal s

      • Re:Wow! (Score:4, Informative)

        by StarvingSE ( 875139 ) on Tuesday March 13, 2007 @05:44AM (#18329791)
        Also, there will always be room for proprietary corporate development. Many corporations have very specific functions that can be automated using software, but no shrinkwrap solution exists.

        I wish I still had the textbook to grab the quote from, but it contained a case study comparing adapting a shrinkwrap HR system with writing one from scratch. It found that writing the system from scratch would have been much more cost-effective than trying to adapt a generic off-the-shelf solution.
        • Re: (Score:3, Interesting)

          by Toba82 ( 871257 )
          That's basically my job. I take business processes that my employer needs and turn them into applications to save other employees time. This will never be replaced by off the shelf software.
      • Re:Wow! (Score:4, Insightful)

        by khakipuce ( 625944 ) on Tuesday March 13, 2007 @07:57AM (#18330633) Homepage Journal
        But there are still plenty of development jobs.

        A lot of companies cannot stay ahead by buying off the shelf products - they NEED to be ahead of the game, and they recognise that a so called "off the shelf" product at the level of ERP, CRM, etc. is really just a bag of components that the vendor will integrate into a product for you - they will charge you to do the analysis for your industry sector, and then they will take the knowledge they have gained and sell it on to others in the same sector - you paid them to shape the product for your sector ... and they sold it to your competitors.

        On the legacy side, it's fine to buy a big integrated suite if you are a new start-up, but I have never worked anywhere that could contemplate stripping out all their apps and starting from scratch; it's an endless round of upgrade payroll, replace ERP, bring in CRM. And someone has to make all these work together.

        There are also those systems that no one writes - Engineering, Financial (as in city trading - I'm sure someone does do these as packages but in my experience there is a lot of in-house software development going on in this area), process monitoring and control ...
    • Re:Wow! (Score:5, Insightful)

      by -noefordeg- ( 697342 ) on Tuesday March 13, 2007 @04:54AM (#18329481)
      Not only that...

      "stable, well-proven ... "
      I've yet to see, say, a well written and stable ERP system.
      In Norway some of the more popular ERP/logistics and sales systems are CS (Client System), Movex, Visma Unique and IBX. Systems which are just "ok". Terrible modules, inane logic, most likely a lot of bad code all over, but since it's closed source it's impossible to tell. From all the errors (some really strange), the lack of updated documentation and integration specifications, the system resources used, and just from looking at the system documentation, you can easily tell that the systems are not "state of the art".

      What most of these complex systems really are is a collection of small modules, many of which were most likely written at different times, by different people, for different projects, and just barely work together. The companies developing the systems probably have thousands and tens of thousands of bugs and points for optimization which will never be fixed. Any work done on these systems which is not directly connected to a new deployment and paid for by one or many customers is simply a loss for the company.
      Much of the "valuable" experience people get from using such a system is actually how to use it without breaking it/how to use it despite all the bugs, errors and strange quirks and twists.

      What my small company has been busy with in recent years is moving a lot of logic and data outside such systems, because it's just too expensive to try and "upgrade" these huge behemoths. We develop external databases to store different data feeds, usually received in XML format, which some of these systems are not capable of using. Actually, one of those systems is only capable of importing/exporting data as fixed-length ASCII files.
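
      To give an idea of the kind of glue work that involves, here is a rough sketch in Python (the record layout and field names are invented for illustration; the real feeds are obviously more involved):

          # One of the systems only speaks fixed-length ASCII records, so
          # incoming XML gets flattened for it. The layout below is made up.
          import xml.etree.ElementTree as ET

          LAYOUT = [("order_id", 10), ("customer", 20), ("amount", 12)]

          def xml_to_fixed(xml_text):
              order = ET.fromstring(xml_text)
              fields = []
              for name, width in LAYOUT:
                  value = (order.findtext(name) or "")[:width]
                  fields.append(value.ljust(width))
              return "".join(fields)

          feed = "<order><order_id>1042</order_id><customer>ACME AS</customer><amount>199.50</amount></order>"
          print(repr(xml_to_fixed(feed)))  # a 42-character fixed-width record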

      I don't see any less work needing to be done on these systems in the near future. Rather, the need for more developers, both in-house and independent, to patch them up, make small adjustments here and there, and/or write "connectors" for logic/data processing will probably increase.
      • Re: (Score:3, Informative)

        by Treffster ( 1037980 )

        What my small company has been busy with in recent years is moving a lot of logic and data outside such systems, because it's just too expensive to try and "upgrade" these huge behemoths. We develop external databases to store different data feeds, usually received in XML format, which some of these systems are not capable of using. Actually, one of those systems is only capable of importing/exporting data as fixed-length ASCII files.

        I can attest to this. I'm a software developer for a small company

  • by ZombieEngineer ( 738752 ) on Tuesday March 13, 2007 @04:01AM (#18329147)
    Computer Science graduates can go one of two directions:

    Academic Research - Which has grown at a steady rate

    Corporate Development - Which collapsed at the end of the dot-com boom.

    There is still a need for "pure" computer science research for the next big improvement in the field of computing (where is the next "Google" going to appear?)

    ZombieEngineer
    • by kale77in ( 703316 ) on Tuesday March 13, 2007 @04:42AM (#18329395) Homepage

      Recently we mourned the Webmaster [slashdot.org], even though some of us were implicated in his murder.

      That's the kind of Computer Science that is dead: the kind that Computer Science, by its progress, leaves behind.

      A similar question might be: Is evolutionary science dead? Or was it just the dinosaurs that died?

    • by Moraelin ( 679338 ) on Tuesday March 13, 2007 @04:47AM (#18329427) Journal
      You know, I don't buy it. On one hand you have all the corporates bitching and moaning about how they don't have enough people to do the work, and how everyone should outright give citizenship to any immigrant who can use a computer. See Bill Gates's recent speech; it was linked right here on Slashdot. Plus, they've surely created a lot of jobs in India lately. And then we have guys like this one coming out and saying "oh, we just don't need more CS people." Something doesn't add up. Either one gang is right, or the other is right, but they can't both be right at the same time.

      Way I see it, reality is a lot more... perverse. Everyone still needs programmers, still needs an IT department, etc.; they just don't want to pay for it.

      And enrollment has just reflected this. Studying engineering or CS is hard work, and there are only a limited number of people who do it for fun. And even those can do it as a hobby at home if all else fails. For most people you have to pay well to get them to do the extra effort. If you don't pay up, they'll go do something else.

      At any rate, the jobs do exist. Sure, most of them don't involve researching the next great algorithm, but they exist. There are a ton of companies who need very specialized internal applications, or their own "B2B" applications, and I just don't see the off-the-shelf software that does those. Of course, most of it doesn't involve researching any new algorithms, but rather researching what the users really want. Then again, most computer-related jobs weren't exactly academic research in the past either. There were maybe more companies making compilers and new computers and what have you, but the bulk of the jobs was always in corporate software.

      At any rate, _maybe_ if all you're seeing yourself doing after college is researching the next paradigm shift in computing, yeah, that market has somewhat shrunk. If you don't have any qualms with writing some buzzword-laden application for some corporation, it's as strong as ever. It just doesn't pay as much as in the dot-com times any more.
      • by ZombieEngineer ( 738752 ) on Tuesday March 13, 2007 @05:32AM (#18329729)
        I don't know if this was meant to be flamebait but I'll bite.

        I am an engineer by trade (making training simulators for chemical plant operators) and I have encountered more than my fair share of Computer Science graduates.

        A lot of these people are focused on "how do I meet this product spec?" and not necessarily on a solution fit for purpose. I routinely encounter situations where enumeration comparisons are done using strings and searches are implemented as a linear search (I kid you not, I once reduced a program run from 90 mins to 4 mins by replacing a single linear search with a binary search). Just because every 6 months there is a more powerful CPU on the market doesn't justify increasingly sloppy coding.
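
        Roughly the sort of change I mean, as a toy sketch in Python (not the actual simulator code, and the tag names are made up):

            import bisect

            # Linear scan: O(n) per lookup -- fine for ten tags, painful for 100,000.
            def find_linear(tags, target):
                for i, tag in enumerate(tags):
                    if tag == target:
                        return i
                return -1

            # Binary search on a sorted list: O(log n) per lookup.
            def find_binary(sorted_tags, target):
                i = bisect.bisect_left(sorted_tags, target)
                if i < len(sorted_tags) and sorted_tags[i] == target:
                    return i
                return -1

            tags = sorted("TAG%06d" % n for n in range(100000))
            print(find_binary(tags, "TAG042000"))  # same answer as the linear scan, far fewer comparisons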

        There are a few people who are focused on "how do I make this better?". For these people, making a compiler that would recognise a linear search and automatically replace it with a more appropriate technique is their objective. Before people jump up and down saying there is no way a compiler could determine this, I will point out that there was a consulting company that 20 years ago had a FORTRAN compiler which would silently replace nested loops with equivalent BLAS matrix calculations (said consulting company was bought out by Intel several years ago). So what is the big deal? FORTRAN died several years ago... Well, it is a bigger deal today with dual-core processors, where things like BLAS calculations are perfectly suited to parallel processor architecture.
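
        To make the BLAS point concrete (a toy illustration in today's Python/NumPy terms, not what that compiler actually emitted): the hand-rolled triple loop and the single BLAS-backed call below compute the same product, but the second is what the clever compiler effectively substituted.

            import numpy as np

            def matmul_loops(a, b):
                # The kind of nested loop a naive program spells out by hand.
                n, k = len(a), len(b[0])
                out = [[0.0] * k for _ in range(n)]
                for i in range(n):
                    for j in range(k):
                        for m in range(len(b)):
                            out[i][j] += a[i][m] * b[m][j]
                return out

            a = np.random.rand(64, 64)
            b = np.random.rand(64, 64)
            fast = a.dot(b)  # one call into a tuned (and nowadays multi-core) BLAS routine
            slow = matmul_loops(a.tolist(), b.tolist())
            print(np.allclose(fast, slow))  # True, but the BLAS path is orders of magnitude quicker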

        Moving on to address some of your other comments: "Everyone still needs an IT department"
        If your IT department is stacked with CS people then someone isn't doing their job properly. I found IT support (I did it for a university department while working on my post-grad) is highly dependent on the level of planning and implementation. A well-planned system with appropriate lock-downs (era of Win 3.1; we mirrored the HDD of the local machine from the network server when people logged in) resulted in no viruses or other on-going issues (you had a network drive for personal storage but the desktops were a shared resource - you could install software, use it, but the moment you logged off and back on again - Poof!). Prior to having a planned strategy, IT support consisted of firefighting & band-aid patching.

        "There are a ton of companies who need very specialized internal applications, or their own "B2B" applications"
        Oh please!!! Specialised applications are a pain in the neck to support; the real issue here is that whoever implemented them did not fully understand what the end user requirements were. There is a real art to extracting that sort of information out of people, and it requires an inquiring mind, good communication and people skills. There are application houses that milk corporations of money due to scope changes because they couldn't get the original spec right (I am not going to enter into the argument of who is to blame for a defective spec; there are valid arguments for both sides).

        ZombieEngineer
  • Horology anyone? (Score:3, Interesting)

    by Tracer_Bullet82 ( 766262 ) on Tuesday March 13, 2007 @04:01AM (#18329149)
    I remember a few years ago, two at the minimum if my memory serves me, that watchmaking was considered a dead business. Even the US education dept. considered it dead and buried, with fewer than 100 students per year taking it.

    Today, though, with watchmaking (back) on the rise, the supply of workers is much less than the demand.

    Everything, well most things at least, is cyclical. We'd expect so-called researchers to have much longer timelines in their research than the immediate ones.
  • Don't tell that to my [acm.org] professor [uark.edu].
  • Where I work we are outsourcing work to India, China and Russia because it is impossible to deliver on our projects with people hired locally.

    When I was young you had to be a bit of a geek to tinker with computers at all. You had your 8 meg basic in rom, and a bit later, CP/M. Now people who want to tinker are building machines for gaming or some such and because what they are doing is much more mainstream, they don't think of it as being anything special so when they decide what to do for a living they don

    • by Alioth ( 221270 )
      8 *meg* BASIC, before CP/M? 8K maybe...
    • by cyclop ( 780354 ) on Tuesday March 13, 2007 @04:17AM (#18329261) Homepage Journal

      This doesn't mean CS is dead.

      Surely computing is much more accessible, and there is a hella lot more ready-to-go software and libraries compared to what was there 10 years ago, but this means nothing. New applications will always be needed/invented, and someone will need to code them. And even with the latest and easiest programming languages, doing things well needs some kind of education.

      I am a biophysics Ph.D. student. I have never had a formal CS education nor am I a code geek (although I like to code), and just building a relatively small data analysis application with plugin support in Python is making me smash my nose against things that would make my code much better, that are probably trivial for people with a CS education (what's currying? what is a closure? how do I implement design patterns? etc.) but that for me are new and quite hard (btw: a good book about all these concepts and much more?): so I understand why CS is of fundamental importance.
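
      The first of those that finally clicked for me was closures - as far as I understand it, a closure is just a function that remembers the variables of the scope it was created in, and currying/partial application is the related trick of fixing some arguments in advance. A toy Python sketch (corrections welcome):

          def make_threshold_filter(cutoff):
              # 'cutoff' lives on inside the returned function: that's the closure.
              def keep(sample):
                  return sample >= cutoff
              return keep

          above_noise = make_threshold_filter(0.05)
          print([x for x in [0.01, 0.2, 0.04, 0.9] if above_noise(x)])  # [0.2, 0.9]

          # Partial application (a close cousin of currying): fix the first argument.
          from functools import partial
          def scale(factor, value):
              return factor * value
          double = partial(scale, 2)
          print(double(21))  # 42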

      • I am a biophysics Ph.D. student.

        Just curious: what drove your choice of career? For me it was hacking with electronics as a 5-15 year old in the 1970's.

      • Re: (Score:2, Insightful)

        by Anonymous Coward

        (what's currying? what is a closure? how do I implement design patterns? etc.) but that for me are new and quite hard (btw: a good book about all these concepts and much more?)

        Modern CS doesn't teach these concepts either; try Wikipedia. I'm being serious.

      • Re: (Score:3, Insightful)

        by beelsebob ( 529313 )
        Just a note - I doubt very much you'll find many CS students who know what currying or a closure is. Most of them learn Java and think that it's the best thing since sliced bread. They don't even realise that functional programming exists, let alone what it is, what its benefits are, etc.
    • by dbIII ( 701233 )
      And in India the projects are worked on by students or recent graduates doing cut-price work until they have enough experience for a job with better money and greater responsibility - they can't even deliver there at those prices with experienced staff. It's funny seeing this outsourcing thing from the USA to the inexperienced in India happen when wages are not even a significant proportion of expenses - don't blame India, blame clueless management gambling the existence of the company on short-term gains.
  • Don't think so (Score:3, Interesting)

    by VincenzoRomano ( 881055 ) on Tuesday March 13, 2007 @04:03AM (#18329161) Homepage Journal
    Just as building construction science did not die with the Egyptians, Greeks, Romans, Chinese, Aztecs ... and so on, IT won't either.
    New technologies, new languages, new paradigms as well as new hardware will push forward the IT.
    I fear the sentence has come from some "old school" mind, still tied to old technologies. Which could really die sometime in the future.
  • It is true of course that most users of computers these days do not write their own accounting systems; do not write their own payroll systems; do not write their own word processors; and do not even keep a team of operating system tweakers in house ("system programmers" from the IBM mainframe days, needed just to keep the thing running).

    But ... someone has to write all this stuff!
  • by rucs_hack ( 784150 ) on Tuesday March 13, 2007 @04:06AM (#18329171)
    Over the last six years I've been increasingly worried by the falling level of ability in CS students.

    I've encountered CS students recently who in their third year are unable to do such basic things as understand memory allocation. As for algorithm design? Well that's simply unknown by the majority. That scares the shit out of me.

    The mantra is 'don't re-invent the wheel'. This is used as an excuse for students taking off-the-shelf components for assignments (sorting classes for Java being used for sorting assignments, for example), or being given virtually complete assignments by lecturers and being walked slowly through the assignment to the point where little or no original thinking is required.

    Now it is true that re-inventing the wheel is a bad move at times. However whilst studying for their qualification, they should learn how to build the wheel in the first place.

    Back to the memory allocation point. I currently know of no final year students with a decent understanding of this topic, and yet it is the main cause of security problems in code. They should at least have a working knowledge.

    The emphasis is more and more on using languages designed to try and remove the main problems in code, but who writes these languages? It sure isn't the people who are only taught to use them, not create them.

    The normal course of action is to blame Java, since it has led to a simplistic approach to CS assignments. I'd love to blame it, I ferkin hate the language, but that isn't the root cause.

    Computer science is a hard topic that they are trying to make simpler to encourage more students. This has the result that CS students are graduating with ever reducing levels of ability, so people no longer see it as a worthwhile topic. Nowadays a CS student who wants to do really well has to work on independent study entirely apart from the course they are attending, and has also to face the unpleasant reality that their education as provided by the university is so poor that they may face years of further study to gain a useful level of ability.

    Post graduate study can reduce this problem, but there are fewer post grads too, and often it is funding, not interest in a topic, that guides the selection of a course.
    • I am a first-year computer science undergrad in the UK, studying Java.
      Should I be intensely worried, or should I just do a LOT of self-study in my spare time over the next two/three years?
      • by rucs_hack ( 784150 ) on Tuesday March 13, 2007 @04:33AM (#18329357)
        Self study.

        Of the people I knew who did well, those who self studied alongside their normal course did things like website design, search algorithms, micro kernel design, robotics and advanced study in certain languages (lisp, c++, C, Object Pascal, assembler), everyone I knew did the last thing, but the languages varied.

        You can pass and get a 2.1 or 2.2 easily just by following the course guidelines. I got my phd offer not by doing this, but by cramming every day (almost every day, have a blowout night at the weekend, you've got to have some fun time) with additional study. I exceeded the requirements of every assignment (I wasn't alone in doing this), and studied around every topic taught. The result was a lot of very interesting phd offers when I graduated, it rocked. I was tired a lot, I will admit, but the benefit was vast, I was so far ahead of the students who just followed the course that I actually tutored some.

        Don't assume I'm that clever though, I sweated blood sometimes trying to get assignments done early, and the extra learning was oft times very difficult. Every evening spent on it was one well spent however.

        Most of the people I know personally who did this are now in great jobs, one heading towards millionaire status at 25. In his case he worked like a dog, even more than I did. You wouldn't believe what he was capable of on graduation.

        So work hard, and study around the subjects.
        • by DingerX ( 847589 ) on Tuesday March 13, 2007 @05:28AM (#18329717) Journal
          Parent, and ABG below. It's true for just about every undergraduate field.

          Undergraduate education has a few factors that drive the curriculum: one is enrollment (make it too hard, and nobody shows up; require everybody to take it, and everybody has to pass it), another is vocational preparation (what does the job market demand? or -- mixing enrollment and vocation -- what do the students think the job market demands?). The folks doing the teaching aren't really interested in either of these, and nor are the "good students".

          The "vocational" side of university education has always been there, and it's always been looked down upon by the really sharp people. And, you know what? In spite of the political rhetoric you hear around the US and Europe, the students who "Hit it out of the park" career-wise, the big successes, the Googles, Netscapes, Yahoos, Nokias and so on, aren't the ones who stick to a vocational curriculum. The ones who just did what the course told them to do are the guys who end up seeing their Technical Support jobs get outsourced.

          The "enrollment" issue is even more pernicious. No department wants to lose students -- since students are tied to money and power in the universities. So if a subject gets "less popular", the curriculum gets "easier" to boost retention.

          University courses, like other forms of professional formation, do teach a major professional skill: that to achieve results you need to be willing to do lots of crap-work, and that a good job involves doing boring stuff much of the time.
          Outside of that, the true strength of universities is that you're given some good resources to play with, and are surrounded by smart, curious, interested people. Find your passion, pursue it, and don't sweat money or jobs. Any employer you'd want to work for will recognize your abilities.
      • by Anonymous Brave Guy ( 457657 ) on Tuesday March 13, 2007 @05:00AM (#18329515)

        You shouldn't be intensely worried, but reading around your subject is pretty much always a smart move if you're a serious student. I learned this lesson very late in my academic career, and now wish I'd understood what the phrase really meant a couple of years earlier.

        In this business, knowing multiple programming languages (and in particular, knowing multiple programming styles -- OOP, procedural, functional, etc.) is a big asset. It helps you to think about problems in more varied ways, even if you will ultimately code the solution in whatever language is required by your particular professor or, in due course, employer.

        There are two suggestions I've heard in the past that I appreciate more as time goes by: try to learn a new programming language and to read a new book about programming every year. In the former case, if you're learning Java, that's OK, it's a pragmatic tool that's widely used in industry and it will teach you one way of thinking about a problem. I suggest the following as complementary languages, to be explored as and when you have the opportunity:

        • C, or even some version of assembler, to understand what's going on under the hood and what a low-level programming language really is;
        • Haskell or a dialect of ML, to understand that not all programming languages are block-structured procedural languages, and what a high-level programming language really is;
        • Python or Perl, to understand the costs and benefits of requiring less formal structure, and the use of dynamic type systems, and to learn a few neat ideas like regular expressions;
        • when you're ready, LISP, to understand what the old sayings "code is data" and "data is code" really mean, and what concepts like macros and metaprogramming are really all about.

        There are various other unique things you'll take away from each of the above, but if you spend perhaps a few months exploring each of them in some detail, it will make you a much more rounded programmer. I'd suggest either the above order, or swapping the first two around and going for a functional programming language and then something low-level. The requirements of your course or good advice from friends/teachers may guide you otherwise. Go with what works for you.

        To make your learning practical, pick some simple projects, perhaps to practise whatever algorithms you happen to be studying lately in other courses, and write a few small but real programs in each language. For example, if you're learning about operating system basics, try rewriting a couple of simple OS utilities or networking tools in C or assembler. If you're learning about databases, try writing a simple web front-end for a database, and power it with a few CGI scripts written in Perl or Python that use SQL to look up and modify the data in your database. If you're learning about graphics and image processing, write a simple ray tracer in Haskell or ML.
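
        To give one concrete flavour of the database exercise, here's a throwaway Python sketch using SQLite (the table and data are invented; a real project would put the CGI front-end on top of this):

            import sqlite3

            conn = sqlite3.connect(":memory:")  # use a file path for a persistent database
            conn.execute("CREATE TABLE books (title TEXT, author TEXT, year INTEGER)")
            conn.executemany(
                "INSERT INTO books VALUES (?, ?, ?)",
                [("SICP", "Abelson & Sussman", 1985),
                 ("The C Programming Language", "Kernighan & Ritchie", 1978)],
            )
            conn.commit()

            # The kind of query a simple web front-end would run for a search page.
            for title, year in conn.execute(
                "SELECT title, year FROM books WHERE year < ? ORDER BY year", (1990,)
            ):
                print("%d: %s" % (year, title))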

        Along the way, you'll develop potentially useful real world experience with things like OS APIs (and perhaps how they vary between platforms, and thus why standards are useful for these things), HTML/CSS and CGI for web development, SQL for database work, and so on.

        As you go through this, consider buying a good textbook on major subjects (programming languages, databases design and SQL, graphics algorithms, etc.) or make sure you've identified some good reference and tutorial material on the web. The latter is a big advantage for the modern compsci student, though you have to be careful to check your sources are well-regarded and not just a pretty web site with an authoritative tone of voice written by someone very enthusiastic but regrettably ill-informed. Things like FAQs and newsgroups can be valuable sources of information, but sometimes, there's just no substitute for a well-written, well-edited, authoritative textbook.

        Anyway, this post is now far too long, so I'll stop there. Please consider it "the approach I'd take if I could have my university days again" and take it for whatever it's worth to you. Good luck. :-)

    • by Rakishi ( 759894 )
      Don't claim it's everyone when all you have is likely some small selection of students or schools. Some schools are crap and some students are lazy/idiots. If all someone takes are the "easy A" courses that a monkey could do then why do you expect them to be more than a monkey?
    • I agree with a lot of what you said, but I also believe that many of your statements are very dependent on the school in question. At my school, CS grads are basically guaranteed a job immediately upon graduation with any number of big name companies, because the school I attend is very well known for providing a well rounded, in-depth, and difficult curriculum. I do believe, however, that my school is not the norm. Most colleges that I see around today don't teach CS, they teach coding, and as (I hope)
      • At my school, CS grads are basically guaranteed a job immediately upon graduation with any number of big name companies

        My advice: follow your nose. Work on what you enjoy. Big companies are okay but don't get stuck working on a production line.

        Mass production is how small companies become big.

    • by Geoffreyerffoeg ( 729040 ) on Tuesday March 13, 2007 @05:07AM (#18329575)
      The normal course of action is to blame Java, since it has led to a simplistic approach to CS assignments.

      You should blame Java. And you should blame C++, Python, and any other similar medium-high level language, if that's the intro language and your sole teaching language.

      Here at MIT we have 4 intro courses. The first, the famous Structure and Interpretation of Computer Programs [mit.edu], is taught entirely in Scheme, a purer and more pedagogical dialect of Lisp. You learn how to do all the high-level algorithms (e.g., sorting) in a purely mathematical/logical fashion, since Scheme has automatic object creation / memory handling, no code-data distinction, etc. At the end of the class you work with a Scheme interpreter in Scheme (the metacircular evaluator), which, modulo lexing, teaches you how parsing and compiling programs works.
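
      (If you've never seen the style, here's the rough flavour translated into Python - the course itself is all Scheme, and this is just my loose approximation: sorting written as a recursive definition, with no mutation and no explicit memory management.)

          def qsort(xs):
              # Quicksort as a definition rather than a procedure: no in-place swaps.
              if len(xs) <= 1:
                  return xs
              pivot, rest = xs[0], xs[1:]
              return (qsort([x for x in rest if x < pivot])
                      + [pivot]
                      + qsort([x for x in rest if x >= pivot]))

          print(qsort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]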

      The next two are EE courses. The fourth [mit.edu] starts EE and quickly moves to CS. You use a SPICE-like simulator to build gates directly from transistors. (You've done so in real life in previous classes.) Then you use the gate simulator to build up more interesting circuits, culminating in an entire, usable CPU. From gates. Which you built from transistors. The end result is, not only are you intimately familiar with assembly, you know exactly why assembly works the way it does and what sort of electrical signals are occurring inside your processor.

      Once you know the highest of high-level languages and the math behind it, and the lowest of low-level languages and the electronics behind it, you're free to go ahead and use Java or whichever other language you like. (Indeed, the most time-consuming CS class is a regular OO Java software design project.) You're not going to get confused by either theory or implementation at this point.

      So yes, blame Java, if you're trying to teach memory allocation or algorithm design with it.
  • by mccalli ( 323026 ) on Tuesday March 13, 2007 @04:07AM (#18329179) Homepage
    From the article: Here at De Montfort I run an ICT degree, which does not assume that programming is an essential skill. The degree focuses on delivering IT services in organisations, on taking a holistic view of computing in organisations, and on holistic thinking.

    ie. not Computer Science. For those not familiar with the UK education set-up I should also explain that De Montfort University is the old Leicester Polytechnic. The Polys were set up to provide much more practical education than the theoretical stances of the Universities, and a damned good job many did of it too - I'm certainly not playing the one-upmanship card that some do about the old polys; Leicester Poly was a good place and its successor De Montfort has reached even further.

    But the point stands - this point of view is coming from an academic teaching at a more practically-oriented institution and already running a non-science based course. His viewpoint should be considered against that background.

    Cheers,
    Ian
  • Has the software development industry stabilized to an off-the-shelf commodity?

    The US Department of Labor predicts the industry will "grow more slowly than average [bls.gov]." That is hardly dead.

    It is 2007 and we are still writing code using text editors, not giving verbal commands to sentient machines. Nothing to see here, move along.

  • The thing about academics is they often have no real world commercial/industry experience yet feel the need to comment on it.

    What's in your car? What's in your TV? What's running your website? Those are just three things that spring to mind that are not generally "off the shelf" - yes, off-the-shelf components maybe, but someone still has to integrate it all together. It's just madness to say that as computers become more prolific we need fewer computer scientists.
  • Not dead (Score:3, Insightful)

    by Zo0ok ( 209803 ) on Tuesday March 13, 2007 @04:16AM (#18329251) Homepage
    Compare computer science to other science - like architecture. Computer Science is still very immature with very few true best practices and standards. It will not die anytime soon.

    Remember the 4th-Generation-Languages that were supposed to make programming unnecessary? Where are they today?

    Ask innovative organisations like BitTorrent, Apple, Google or Blizzard if they see computer science becoming obsolete any time soon. I don't think so.
    • Re: (Score:3, Interesting)

      Compare computer science to other science - like architecture. Computer Science is still very immature with very few true best practices and standards. It will not die anytime soon.

      Maybe this is slightly off topic, but my wife is an architect, and any time I want to stir up one of her co-workers I tell him tales of version control, automated builds, automated unit testing and bug databases linked to revisions.

      None of this exists outside of the software business in anything like the same form. When it come

  • by cmholm ( 69081 ) <cmholm@m[ ]holm.org ['aui' in gap]> on Tuesday March 13, 2007 @04:18AM (#18329271) Homepage Journal
    Comp Sci has always been dead, and always will be. In 1982, one of my early CS professors claimed that the window of opportunity for a job as a programmer or s/w engineer was going to close soon as automatic code generators took over the task of raw code banging. Employers would just need a few engineers for design, and that would be it.

    But, I shouldn't be surprised that yet another generation of technology dilettantes think that they've reached the pinnacle of achievement in a line of endeavor, and from here on out it's just like corn futures (somebody oughta tell Monsanto to stop wasting time with GMO research). But seriously, when we've got bean counters like Carly Fiorina and whichever IBM VP it was claiming that the years of technical advance in IT are over, not to mention the author of the fine article, Mr. McBride, I see people who are in the wrong industry. Perhaps they should be selling dish washers, or teaching MCSE cram schools.

    McBride is whining because the students aren't packing his CS classes like they used to. His reasons whittle down to these: mature software packages exist to service a number of needs (which has always been true, to the contemporary observer), and it's too easy to outsource the whole thing to India. It is the writing of someone throwing in the towel. It's like the trash talk you hear from people who are about to leave your shop for another job. I won't be surprised to find him in fact "teaching" MCSE "classes" very soon. Good. His office should be occupied by someone who still has a fire in their belly.
    • Re: (Score:3, Insightful)

      one of my early CS professors claimed that the window of opportunity for a job as a programmer or s/w engineer was going to close soon as automatic code generators took over the task of raw code banging.

      I read once that assemblers and compilers were both described as enabling the "self programming computer" when they came out.

      Of course such things just increase productivity and open up new applications.

  • My friend has a CS degree from the local university. He is fully fluent in Java and VB. He had to do C++ and Haskell for course points but knows little about either. He's *not* a computer scientist by any stretch of the imagination.

    So it's probably a good thing.

    IT is so pervasive that the CS degree should fragment to suit the new world. My friend wouldn't stand a chance in front of an xterm, he's not even interested. To him, it's not a vocation, just a job (and fair enough).

  • by prefect42 ( 141309 ) on Tuesday March 13, 2007 @04:20AM (#18329293)
    "Neil McBride is a principal lecturer in the School of Computing, De Montfort University."

    De Montfort, one of the new universities that traditionally advertises on the TV and offers vocational courses in media and the like.

    Academic really doesn't mean much these days. He's not even consistent:

    "Interrupts, loops, algorithms, formal methods are not on the agenda."
    vs
    "The complexity of embedded systems, of modern computing applications requires a different way of thinking."

    I'd not like to use an embedded system he'd developed, unless by embedded he was thinking Windows Mobile + Flash.

    Sorry, a rant from someone who works at a real university, and knows he isn't an academic.
  • So, if computer science is dead, then who is going to develop the "accounting packages, enterprise resource packages, customer relationship management systems" that the article's author says are the order of the day?

    Seriously though, this is weird. How come we don't see posts every other week about how common university majors such as English, political science, mathematics, or say classics are dead, presumably because they don't teach any real world job skills. If there are reasons for those majors to exis
  • by SmallFurryCreature ( 593017 ) on Tuesday March 13, 2007 @04:24AM (#18329313) Journal

    For that matter so is education in general. I am not a computer scientist; my education is technical instead. (LTS/MTS/HTS for the Dutch)

    When I attended the LTS we had real shop class, learning how to work with wood, steel and electricity, with real-world equipment in an area that looked much like you would expect to find in industry.

    I recently had the occasion to visit a modern school that supposedly teaches the same skills, yet what I found was an ordinary classroom with a very limited and lightweight set of equipment. The kind of stuff you would find at home, NOT at work.

    Yet somehow todays kids are supposed to learn the same skills.

    And as if that ain't enough, the number of hours of shop class has been reduced while the number of theory hours has been increased. Worse, the amount of technical theory has decreased as well and instead the amount of soft theory like history and such has taken over.

    This has TWO negative impacts. First, young kids coming to work can't handle basic equipment and don't understand the theory behind it, and even worse, the kinds of kids (like me) that used to select a technical education because they don't like theory have that choice removed. I myself was far too restless to do a theoretical class; 18 hours of shop class per week however made the remaining theory that much easier to handle, and because theory and practice were linked it all made sense.

    Even worse, the modern education is supposed to make kids fit better into society, so how come they are bigger misfits than any generation before them?

    No this is not old people talk. Notice even here on Slashdot how the art of discussion is dying out: say anything remotely controversial and be labelled a flamebaiter or a troll by some kid who can't handle the heat. I actually had a 20-year-old burst into tears about two years ago because I chewed him out for drilling through the work bench. Modern education is so much about empowerment that kids who think they are the top of the top can't handle suddenly being the lowest of the low when they enter working life. This is already a shock simply because you just went from being the youngest in school to the oldest in school, and now suddenly you are the youngest again.

    Simply put, I think education in general is less and less about turning out skilled professionals and more and more about just keeping kids off the job market. Comp Sci ain't the only victim. Just try to get a good welder nowadays. Hell, I'd settle for anyone who knows the difference between a steel drill bit and a stone one. (And no, that doesn't mean one is made out of stone, rather what it is for drilling into.)

  • Computer Science ain't dead yet (although it may smell that way at times), you just need to have a degree program that's worthwhile. Crappy places churning out more idiots hoping to make a fast quid tend to die off at these times, but the better ones survive.
  • There are a lot of niche applications or in-house development jobs that are not covered by standard applications. Things like writing control software for machines that are typically done by a small team of developers at the hardware manufacturer.

    If jobs for creating office suites disappear, that will only affect a small part of the field.
  • Many of the students who would look for a degree to get rich were enrolling in CS. Now that the news is filled with stories of out-sourcing to India and the collapse of programming as a way to earn infinite wealth these students are no longer interested in CS and are pursuing careers as doctors and lawyers instead. Good riddance, I say, anyone who is only into programming for the money is probably not overly good at it.

    Programmers will always be needed. As tools become more capable and advanced, the only t

  • by HuguesT ( 84078 ) on Tuesday March 13, 2007 @04:35AM (#18329365)
    The person who wrote this article doesn't even know what CS is. CS is computer science. It will be dead when science is dead.

    CS won't be dead until all the interesting questions in the theory of computing are solved: is P != NP? What can a quantum computer achieve? What are the theoretical limits to computation in the physical world, beyond Turing machines? Given the truly enormous current output across all the branches of IT, from HCI to pure mathematics via signal and image processing, I would not be worried at all.

    Just to rehash, CS is not about designing the best accounting package. This is ICT, not CS. CS is a means to an end.

    As to ICT, I don't think the final word has been said either. Just look at the sad state of Vista, or for that matter, at just about any accounting package. Who can say with a straight face that's the best that can be done?
  • Are you mental? (Score:2, Interesting)

    by dintech ( 998802 )
    'As commercial software products have matured, it no longer makes sense for organizations to develop software from scratch.'

    This is equivalent to saying 'Off-the-shelf applications now fulfil all possible needs and changing requirements.'

    Surely not. The British Computer Society should really talk amongst themselves before releasing such obviously trollish public statements. This idea could get into the hands of people who would take it seriously...

    Some muppet in your management chain is trying to 'leverage'
  • I work in or near the bespoke software business in finance, and the increasingly powerful off-the-peg solutions that have emerged in the last 5-10 years certainly do compete with bespoke development. It's also generally true that it takes fewer developers to give 10 banks the _same_ software package than to give them each a bespoke package, making off-the-peg generally cheaper. But there are other differences.

    Projects go on forever either way so multiply by the number of years required :)

    Bespoke ap
  • by jandersen ( 462034 ) on Tuesday March 13, 2007 @04:49AM (#18329435)
    No, no, it just smells funny.

  • Offices will always want something that the COTS product does not do. I think third-party vendors should worry about becoming obsolete because operating systems are starting to incorporate their functionality into the OS itself. M$ is trying with virus scanning and the like. Not perfect yet, but I think the code is on the wall, so to speak.
  • The article makes a lot of shaky assertions, but it gets one thing right: computer science curricula in higher education are becoming something of a joke. I think it misfires in saying that the way to go is to be more practical and interdisciplinary; I think the problem is that computer science programs are too practical. "Computer science" has come to be less the study of algorithms and information management, and more a vocational degree--universities aren't graduating computer scientists so much as they'

  • by Bloke down the pub ( 861787 ) on Tuesday March 13, 2007 @05:05AM (#18329561)
    To get the headlines a hundred years ago, just replace "British Computer Society" with "Ye Fraternal Guild of Buggywhip Frossickers" and "off-the-shelf solutions" with "horseless carriages".
  • Think about it. How many people get to write Java rather than write applications using Java? Or how many people get to write a brand new sorting algorithm compared to how many people get to use it?

    I don't think there's anything wrong here. It makes perfect sense schools would create more consumers of computer science than computer scientists. If everyone coming out of these schools was a "creator", either the unemployment rates would go sky high, or there'd be a whole bunch of overqualified people working o
  • no, no, no (Score:5, Insightful)

    by tomstdenis ( 446163 ) <<moc.liamg> <ta> <sinedtsmot>> on Tuesday March 13, 2007 @05:07AM (#18329569) Homepage
    This has been asked repeatedly ever since I was a wee lad [20 years ago]. The idea back then was that BASIC would replace comp.sci because it was so simple to program in. Of course, that overlooked the fact that BASIC is wickedly inefficient. No, the answer is no. No. No. No. Why? Someone's gotta maintain the scene.

    For starters, the more automated tools are not efficient enough for most computing platforms (hint: think about running that nice VB.NET application in 32KB of RAM). Combine that with the need for real algorithms (re: 16MHz processors) and you can see that RAD tools don't apply.

    Tom
    • Re: (Score:3, Insightful)

      by maddogdelta ( 558240 )
      What tomstdenis said, but on a longer time scale. I coded my first Fortran in 1976. In 1980, somebody was showing me a system that would "eliminate programming" because you could just speak English to the computer (actually they were showing me a SQL system). In the late 80s and early 90s, "object oriented programming" "would eliminate the programmer" because all you would have to do is put components together. And now this clueless arsehole. But he got published, so I guess he accomplished his job. (Li
  • by Rik Sweeney ( 471717 ) on Tuesday March 13, 2007 @05:23AM (#18329677) Homepage
    That's why people don't do it. When I was at University in the UK (Portsmouth if anyone cares), I did Maths and Computing.

    The first year consisted of learning how to format a floppy disk and write a Word document. Oh, and there was some Java thrown in there, but people found Java too hard and complained. Java then got removed from the curriculum and we did crap like theories in Artificial Intelligence instead.

    We had the option of doing C++ in our final year, but this largely consisted of printing out to the console and writing some text to a file. No fancy shit like pointers or anything like that. Most people didn't elect to do this option, as programming is hard work, and just stuck to Matlab instead.
  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Tuesday March 13, 2007 @05:38AM (#18329753)
    CS isn't dying. Academia's monopoly on CS is dying. Forging swords was an expert's job 400 years ago; now it's a hobbyist's thing. I may not know my way around memory allocation that well anymore, simply because my last three PHP customers and I couldn't give a sh*t, but I did opcode/asm programming 20 years ago (to control single dots on my Sharp handheld's screen), and the book I need to learn C inside and out again is resting on a shelf two meters away.

    CS is sort of becoming a science like philosophy. There are people who study it and earn money with it, but anyone halfway interested can join an educated discussion with them on the topic. And, on top of that, the expert's view on the topic is usually quite strange and outside of common sense. You'll find tons of Wittgenstein crackpots in academic positions simply because they dig mental masturbation as a day job. The Schopenhauer guys all have occupations that are more 'real'.

    Nobody takes a guy seriously anymore who rants about how this PL is worse than that, how Java sucks and real men use C, how PHP is for sissies and Ruby is cool. They don't even want to hear from me that Zope is still light-years ahead of Rails ;-). People want the job done. And to move on.

    A (similar) case in point: nowadays nobody (not even academics) - except maybe a few people who build satellites and stuff - gives a rat's ass whether x86 sucks or not. It has won. Period. And I bet unemployed non-x86 hardware guys will tell you how crappy it is if you give them some change and a warm dinner.
    If some kid in India who's read a copy of Kernighan & Ritchie can solve my low-level problem with some Linux module that's getting in my way, I don't give a hoot whether he's an academic or not. Yet I bet he's got a similar skill set to one.

    Bottom line:
    Computers and their science have become mainstream and are slowly moving out of their steam age. Get with the program.
  • by rbarreira ( 836272 ) on Tuesday March 13, 2007 @05:40AM (#18329767) Homepage

    Neil McBride says computer science was populated by mathematicians and physicists but now virtual robots can be created by eight-year olds without needing programming, logic or discrete mathematics skills.

    1- even if that's true, the 8-year-old won't do anything revolutionary without knowing the details
    2- even if he could, it would probably be just a toy, not something usable in practice
    3- even if it was usable in practice, someone else with more knowledge could do something better
    4- etc etc etc

    Computer science has lost its mystique. There is no longer a need for a vast army of computer scientists. The applications, games and databases that students once built laboriously in final year projects are bought at bookshops and newsagents.

    Civil engineering has lost its mystique. There is no longer a need for a vast army of civil engineers. Apartments and houses that civil engineers once built laboriously in final year projects are bought at internet websites and real estate agents.
  • Good riddance (Score:3, Insightful)

    by Flambergius ( 55153 ) on Tuesday March 13, 2007 @05:44AM (#18329783)
    More changing than dying.

    There are problems. From my point of view (*), the typical graduating student is falling behind. It has always been the case that a young person entering the ICT field professionally has a lot to learn regardless of where he got his degree. Right now that knowledge gap is bigger than it has ever been in the 10 years I can speak about from experience. I see two main reasons for this. First, there simply is more to know. Basic skills like discrete math and coding aren't enough; you need at least strong design skills or near mastery of a specialty. In fact, if you can't know it all, it makes more sense to know a specialty. The four or five years you have in college are not long enough except for the most gifted students. The second reason is that ICT is in fact changing and education has been slow to respond. ICT is now more conceptual than before (some other people like to talk about "information intensive vs. data intensive"; I think they mean the same thing :-)). The required skill set is changing too: coding is losing out and modeling is winning.

    Of course, in absolute numbers people will write more code in the future than now. It may even be that the absolute number of people working mostly in coding will remain relatively static or even increase a bit, but I think it more likely that the number of coders will decrease at least moderately. In any case it is fairly certain that, relative to non-coders in ICT, the number of coders will decrease significantly.

    As to "computer science dying", well, it should have "information science" to begin with. So, in a sense, good riddance.

    (*) ICT within FE, lots of contact with students (comp.sci projects), lively but informal connections to industry, and I work hard to keep myself up to date. I would say I have a pretty good view.
  • Is Smelting Dead? (Score:3, Insightful)

    by theonetruekeebler ( 60888 ) on Tuesday March 13, 2007 @05:47AM (#18329813) Homepage Journal
    'As commercial metal products have matured, it no longer makes sense for organizations to develop iron from scratch. Cutlery, structural I-beams, sheet metal bending systems are the order of the day: stable, well-proven and easily available.' Is that quote laughable?

    Well, yeah. All my years working in tall buildings, and I never once panicked that the builders didn't know how to smelt, or that maybe they took one metallurgy course in engineering school but have forgotten everything about it except "don't drink incandescent liquids."

    Yet the whole world is made of stuff that was, at one point, smelted.

    What we're seeing is a new level of abstraction, with a much steeper amplification curve than ever before: the work of a very few extremely expert people becomes the building blocks for the work of a few very expert people, which becomes the building blocks for the work of somewhat more relatively expert people, which becomes the building blocks for the work of relatively many ordinarily skilled people, which becomes the building blocks for everything everybody else uses. There are enough layers between the guy designing the circuit traces for a chip's sign-extended add instruction and the guy writing an Excel macro that one chip designer can support the work of hundreds of millions of others. Compare this with the mid-1960s, when the guys writing the accounting package could walk into the machine room and physically rewire the machine to make sign-extended adds work faster with odd numbers.

    Computer science is not becoming dead, but it is becoming more focussed and more niche-oriented. There are so many things one can do with a computer without a CS degree that the lack of one is not a universal barrier, if it ever was. My last analogy for the day is the automobile factory. There's an assembly line in there full of people making cars who have never even heard of smelting, have no comprehension of what makes gasoline burn one way and diesel fuel another, and would be utterly hopeless at designing the ideal valve geometry; a whole industry full of people without the slightest clue about extractive metallurgy, yet here we are with hundreds of millions of rock-solid, reliable cars on the road.

  • by Dekortage ( 697532 ) on Tuesday March 13, 2007 @06:26AM (#18329997) Homepage

    It's hardly just CS. My major in college was studio art -- printmaking, illustration, photography, and graphic design -- and I've been a professional graphic designer for 20+ years. People just don't need the same kind of designers anymore. Advances in technology have made most graphic design tasks really easy, really automated. I bet most people reading this post think they can "do" visual design, when in fact they simply happen to own Photoshop/GIMP and some other graphics apps, plus some snazzy clip art off of iStockPhoto.com. I bet you can even create fliers or web pages that don't look awful; with a good template, they might even look good. But you still don't have a true understanding of color theory, typography, layout, negative space, photo manipulation, and all the other skills that make a good, creative, original designer. These advances in technology have led directly to the decline of art departments around the country (and the rise of smaller, higher-quality art schools such as Parsons, School of Visual Arts, RISD, etc.).

    This is completely analogous to supposed "CS" majors who don't understand efficient coding, memory allocation, reusable code, storage optimization, security models, etc. And heaven forbid they try to do interface design (which is the best marriage between visual design and software development). They may be smart enough to piece together some Java or C# snippets off the Internet into a program that, technically, produces the proper data output, but that's it.

  • by Paulrothrock ( 685079 ) on Tuesday March 13, 2007 @06:31AM (#18330023) Homepage Journal

    You wanna do research-level computing? You want to design and create brand new ways of computing? You want to work on AIs? Get a degree in CS.

    If you want to code or do networking or project management, there are plenty of other courses out there that'll give you a much better education for that sort of job.

    What happened towards the end of the dot-com boom is that people started to realize that CS wasn't exactly right for generating code monkeys, and colleges started offering different types of courses to fill these positions.

  • by LordLucless ( 582312 ) on Tuesday March 13, 2007 @06:34AM (#18330051)
    One problem is that the computing disciplines have become intermingled and are often used interchangeably. Let me outline my definitions:

    Computer Science: This is the theoretical, research-oriented discipline. It deals with developing new algorithms, optimization, and that side of things.
    Software Development: This is the application side of Computer Science. It takes the algorithms developed by CompSci and makes useful applications out of them.
    Information Technology: This is the techie discipline: building computers, setting up networks, administering systems. I'm not sure why it got that name, but it seems to have.

    The problem this guy has is that he has conflated Computer Science and Software Development. It used to be the case that they were pretty much mixed: if you wanted to program, you needed to understand all the theoretical stuff yourself. But in these days of large, freely available libraries and modular software design, the two have become very distinct disciplines.

    It's not that Computer Science is dying out; it's that it has subdivided into two separate disciplines, and of the two, there is a much greater demand for Software Developers than Computer Scientists.
    • by Lord Bitman ( 95493 ) on Tuesday March 13, 2007 @06:42AM (#18330091)
      To rephrase:
      "Research", "Design", "Practical Application"
      Computer Science looks into things which are not immediately practical.
      Software Development takes those ideas and makes them practical
      Information Technology applies the technology developed by Software Development on real data (ie: information)

      But then, by those definitions I'm just an I.T. guy, so that's no fun :\
  • Changes (Score:4, Informative)

    by DaMattster ( 977781 ) on Tuesday March 13, 2007 @07:07AM (#18330243)
    I don't really think Computer Science is dead, but its face and meaning are changing. Computer Science is now more than just software engineering; it encompasses networking, infrastructure, and information management. The internet has fundamentally changed Computer Science, and the curriculum of old has not adapted to the change. This quite possibly explains the drop-off in enrollment. Students see that commodity applications are being more widely used. However, there is still a need for competent web application developers, and competent web application developers are not just good software engineers; they have a thorough understanding of infrastructure and information management.

    Another possible reason for dropping enrollments could be disillusionment with the field, as good software engineering positions are outsourced to save money. In many ways, lots of positions become victims of globalization. Many companies use software engineers for projects or as long-term temporary employees to save on the bottom line. Software engineers may be better off seeking employment at companies that develop software rather than at, say, a bank.

  • by rlp ( 11898 ) on Tuesday March 13, 2007 @07:09AM (#18330253)
    I think it was the CEO in the boardroom with an outsourcing contract.
  • by AmiMoJo ( 196126 ) on Tuesday March 13, 2007 @08:14AM (#18330825) Homepage Journal
    The real reason for this problem is that there are too many computing graduates. In Britain, the government wants 50-60% of people to go to university, yet only about 5% of available jobs need a degree-level education. Meanwhile there is a shortage of plumbers and other people with practical, below-degree-level skills.

    Basically, graduates are fucked. You end up with £25,000 of debt and poor job prospects. This guy is right: we don't need that many graduate developers, and the ones we do need tend to need experience and training anyway.
  • by swschrad ( 312009 ) on Tuesday March 13, 2007 @08:39AM (#18331113) Homepage Journal
    until they get used to making consumers out of their wage slaves, the business looks like a dead end to college-age kids.

    because it is.

    look up Henry Ford in the encyclopedia. his $5 daily wage changed the economy, allowing workers to buy what they made. the trend now is to McWages, and that doesn't cut it, unless you are "Bob" in Bangalore.
  • by lpq ( 583377 ) on Wednesday March 14, 2007 @03:46AM (#18344533) Homepage Journal
    How many computer science graduates are actually doing computer science?

    Working as a programmer in an IT department doesn't usually have much to do with science -- it's about trying to create more standardized cogs from a growing number of previously coded cogs.

    How much research is being done today on how individual users can benefit from 32- to 128-core machines?

    Assuming some magic doesn't happen and clock speeds don't start climbing and doubling again without frying, it seems like chip makers are just raising the total GHz per chip by adding more cores at similar clock rates. How can this benefit the average PC user? If it can't be exploited for individual users, it sounds like the PC may be a thing of the past.

    Is that what people want? Right now, the only way of using multiple cores is usually running separate programs at the same time -- if you can use that -- but a home system doesn't usually need to serve web pages to thousands of users. Or you can divide your machine into VMs: nice for test/develop/production/redundancy, but again not too helpful for the average Joe, who just wants programs to run faster and smoother, with more user-friendliness.

    What will it take to use parallel computing in the Personal Computer industry? Doesn't all this sorta imply, if not the death of, then at least a serious problem for the Personal Computer industry?

    Do we have enough cores to start building some practical AIs? Can we develop special compilers and lightweight threads (i.e., not separate processes) that let an individual program use multiple cores dynamically? Loops that don't need the previous iteration's result could all be parallelized, if the cost of spinning up helper threads for a few to several loops had low enough overhead to make it worth it.
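    To make that concrete -- just a rough sketch, and it assumes nothing more exotic than a compiler with OpenMP support (e.g. gcc -fopenmp) rather than some purpose-built parallelizing toolchain -- a loop whose iterations are independent of each other can already be handed to a pool of lightweight threads with a single directive:

    #include <stdio.h>
    #include <omp.h>

    #define N 1000000

    int main(void)
    {
        static double a[N], b[N], c[N];
        int i;

        /* fill in some input data; every slot is independent of the others */
        for (i = 0; i < N; i++) {
            a[i] = i * 0.5;
            b[i] = i * 2.0;
        }

        /* no iteration reads a result produced by an earlier one, so the
           runtime is free to split the index range across all available cores;
           the loop variable is made private to each thread automatically */
        #pragma omp parallel for
        for (i = 0; i < N; i++)
            c[i] = a[i] * b[i];

        printf("c[42] = %f (up to %d threads)\n", c[42], omp_get_max_threads());
        return 0;
    }

    Whether the bookkeeping to create those helper threads pays off for a handful of iterations is exactly the overhead question above; for big, regular loops it usually does.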

    Seems like AI and parallelization are at least two areas that need computer science, but that doesn't seem to be what most people are doing these days. They might get used in voice and face recognition, but again, those are not very general tasks.

    What companies are doing Computer Science these days? It seems like most employers just need applications developed to fit the current paradigms. No "computer science" needed.

    Most high-level developers at companies -- even in Open Source (or at least the Linux kernel) -- are awfully conservative when it comes to doing computer science. They want tried-and-true, step-wise development rather than large-scale "disruptive technologies" that could enable whole new ways of doing things. The people at the top of most large projects (commercial and O.S.) are too conservative to be doing real computer science.

    Maybe research grants? It seems like the Bush idea of a research grant is something guaranteed to provide benefits in the near future (soon enough to be used in the theater of battle, for example).

    Is there any place for computer science research outside of getting your doctorate?

  • by Targon ( 17348 ) on Wednesday March 14, 2007 @07:09AM (#18345451)
    Back in the mid-to-late 1970s, computers were things you only heard about in movies or TV shows, at least for those of us growing up back then. Sure, there were some people who used computers, or who had access to them, but access was something that only very large corporations, schools, or certain government jobs (though not all) offered. The closest most people got to a computer was a terminal (a screen with a keyboard that connected to a computer).

    The result of this is that there was something mysterious about computers. When the first personal computers became available to the general public (many will remember the Tandy/Radio Shack TRS-80 models 1, 3, and 4, with the model 2 being more of a business machine, and the Apple II series), these machines became the first ones available to those who didn't have enough information to build their own computers. They were fun, allowed for playing some games, and this inspired many to keep learning how computers worked. There was also a good amount of encouragement given by teachers back in those days and into the 1980s.

    So, between some people having an interest in computers and technology and others encouraging them to continue learning, Computer Science grew in popularity. As time went on and computers became more and more common through the 1980s and into the 1990s, there was continued support, from educators and in general, for those who showed a true interest in computers.

    So, what happened to change this SHOULD be the question being asked, rather than just looking around and complaining about the current situation. As technology became more and more common, the number of jobs in the sector grew until the tech crash of 2001-2002, when the real downturn in the industry showed up. With many jobs lost, there was an excess of computer-science-trained people around.

    If you were in high school at that point and heard about tech jobs being hard to get, switching focus might have seemed like a good idea. For parents and teachers, encouraging people to go into a field where the job market wasn't very good also wouldn't have seemed like a good idea. And so, here we are in 2007: the job market has gotten a bit better but still isn't booming. Entry-level positions are hard (or harder) to find because of outsourcing. Reports of how programmers are treated by companies (generally long, long hours with little appreciation), and of the lack of control a junior-level programmer has over the development process, scare people away.

    The computer industry has also transitioned from "we need programmers because there are no pre-made applications that do what we want" to having distinct specialized areas. Now there are networking, system administration, Information Technology, database administration, and other specialties. As a result, those with an interest in computers will select a major that fits the area they are interested in. Why go Computer Science if an MIS degree will get you where you want to go?

    So, the way to get students interested in Computer Science is to convince them that it is an area that still needs people, and that it's not a major for those who are going to end up as "code monkeys". To be honest, the computer industry NEEDS true computer scientists, since most applications seem to have been slapped together by people who may be able to write code but can't figure out how to design an application (which is why multi-threaded applications are the exception in the MS Windows environment).
