NIST Estimates Sloppy Coding Costs $60 Billion/Year

An anonymous reader submits: "Computerworld is reporting on a just-released government study which finds that software bugs cost the U.S. economy an estimated $59.5 billion each year, with more than half of the cost borne by end users and the remainder by developers and vendors. Better testing could allegedly cut that by one-third."
  • Duh (Score:2, Insightful)

    by Dr. Bent ( 533421 ) <ben&int,com> on Tuesday June 25, 2002 @05:36PM (#3765268) Homepage
    This is not going to change until software companies become liable for damages when their software fails. Otherwise there is simply no economic incentive to hire competent programmers and do rigorous testing.
  • by shoppa ( 464619 ) on Tuesday June 25, 2002 @05:36PM (#3765270)
    If anyone believes that the greatest costs are due to "sloppy coding", they're wrong.

    Yes, sloppy coding has great costs, but blaming the coding for the state of software engineering is like blaming the rubber O-ring for the Space Shuttle disaster, when everyone knows that it was an organizational issue that prevented the knowledge of the critical temperature from getting out of engineering and into management. Most of the software industry (and I include system analysts, architects, and customers who don't know what they want and won't let us help them determine it) is responsible for:

    1. Poor requirements specification (e.g. specifying the platform and/or the language in the requirements - a big mistake, especially when the language they specify results in code riddled with buffer overflows, of the kind sketched below)
    2. Poor requirements confirmation (e.g. Everything anyone asked for goes into the requirements with no review at all to determine if it's needed, technically possible, or self-contradictory)

    It doesn't matter how well you code: if the contract requires the coder to deliver an unusable product, then that's what will be delivered.
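
    To make item 1 concrete, here is a minimal C sketch - my illustration, not anything from the study, with hypothetical function names - of the overflow-prone pattern a language mandate can bake into a project:

        #include <stdio.h>
        #include <string.h>

        /* Unsafe: any name longer than 15 characters writes past buf. */
        void greet_unsafe(const char *name) {
            char buf[16];
            strcpy(buf, name);                      /* no bounds check */
            printf("Hello, %s\n", buf);
        }

        /* Safer: a bounded write truncates instead of overflowing. */
        void greet_safer(const char *name) {
            char buf[16];
            snprintf(buf, sizeof buf, "%s", name);
            printf("Hello, %s\n", buf);
        }

        int main(void) {
            greet_safer("a-name-much-longer-than-sixteen-bytes");
            return 0;
        }

    When the contract dictates a language whose idioms default to the first form, no amount of downstream coding discipline fully undoes that choice.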

  • by lionchild ( 581331 ) on Tuesday June 25, 2002 @05:36PM (#3765271) Journal
    Some of what you're saying is true. However, not even half of the $60B is going towards fixing it. You have to realize that the figure we're looking at likely includes the cost of lost productivity for the poor soul who found the bug and now has to re-type their entire report, re-enter their entire manufacturing design, etc.
  • Ohh say 1/3... (Score:3, Insightful)

    by delphin42 ( 556929 ) on Tuesday June 25, 2002 @05:37PM (#3765279) Homepage
    Better testing could allegedly cut that by one-third

    What about the other 2/3? Is that wasted money that is completely unrecoverable?

    I don't understand how this figure could have been generated other than pulling it out of thin air.
  • by b0z ( 191086 ) on Tuesday June 25, 2002 @05:39PM (#3765296) Homepage Journal
    When presidents of companies that have ties to most of the politicians in the federal government hoard money and rip off the people that work for them, costing billions, it's looked at as an unfortunate mistake.



    But when overworked programmers make a few mistakes because they've been up all night working on something for an unreasonable deadline their manager demands, it's such a horrible thing that the government feels it necessary to look into?


    I bet that if there were some sort of programmers' PAC that invited Dick Cheney out for a round of golf once, then a report like this one wouldn't have seen the light of day.

  • OMG (Score:5, Insightful)

    by Zen Mastuh ( 456254 ) on Tuesday June 25, 2002 @05:39PM (#3765299)

    Wow. They have said the same thing about illegal drug usage and goofing off at work. I just hope that the politicians and the WSPTOTC (Won't Someone Please Think of the Children) don't create a crazy War on Sloppy Coders and start busting down doors, pointing automatic weapons at coders and shouting "You with the undocumented line of code! Hands off the KB or I shoot!" If I see a TV commercial where some guy with a pocket protector laments "I killed a judge with my sloppy coding", then I am moving to another planet.

    This article is typical alarmist FUD. No mention of how much money is saved each year by coders. The sloppiness comes with the territory: unrealistic deadlines, sloppy documentation, buggy interfaces, clueless management, and a changing world. This world of perfection exists only in the minds of the pencil pushers at NIST. In the real world, coders sometimes make mistakes because they are...human.

  • Unfairly harsh (Score:5, Insightful)

    by mactari ( 220786 ) <rufwork AT gmail DOT com> on Tuesday June 25, 2002 @05:40PM (#3765314) Homepage
    Here's a quick quote:
    [There are very few markets where "buyers are willing to accept products that they know are going to malfunction," said Gregory Tassey, the NIST senior economist who headed the study. "But software is at the extreme end, in terms of errors or bugs that are in the typical product when it is sold."]

    Need I remind anyone of the Pinto? How about the recalls we've had recently with everything from tires to airbags? Even if the failures aren't that harsh, who hasn't heard a mechanic say, "Welp, the '89 model uses a Peugeot transmission. They're not exactly, um, the best." Having spent some time under a few vehicles, let me assure you the mechanics aren't always lying. :^)

    So there's at least one market where one hardly hits perfection with version 1.0.

    Of course software has bugs, of course bugs take time to fix or for a user to work around. The trick is that no software will ever be released bug free that does much more than print "Hello World!" to the console. It's not how much time bugs cost; bugs are a fact of life. The question is whether fixing those bugs would be worth the time it'd take to get them out. With a quick angry glance at the IIS team, often it isn't.
  • by ryants ( 310088 ) on Tuesday June 25, 2002 @05:43PM (#3765337)
    This is called the broken window fallacy [progress.org].

    Money paid to programmers to fix bugs is money that isn't being used more productively somewhere else.

    Ce qu'on voit et ce qu'on ne voit pas ("What is seen and what is not seen" - Bastiat)

  • Not real money (Score:2, Insightful)

    by moorg ( 537751 ) on Tuesday June 25, 2002 @05:43PM (#3765339)
    Sloppy coding is the unscientific way to point the finger. More than likely it's not the code that's sloppy; it's the release dates set by Marketing and Sales, feature lists that are promoted before development begins, and poor testing by people more concerned that "this font isn't bold" than with testing the core product. *sigh*
  • Re:Not real money (Score:2, Insightful)

    by T3chnomonk ( 301162 ) <.t3chnomonk. .at. .hotmail.com.> on Tuesday June 25, 2002 @05:48PM (#3765382) Homepage
    Amen brother. Most programmers would produce good code given a set of requirements that don't change daily and given a reasonable amount of time to complete the given task.

    I place much of the blame on the management-level individual who is supposed to direct and manage development - this individual is the interface to the rest of the company, and it's his/her responsibility to inform and educate his peers and superiors.

    Developers implementing insanity can't help but let a little chaos loose.

  • by rutledjw ( 447990 ) on Tuesday June 25, 2002 @06:06PM (#3765490) Homepage
    Let's not forget that the vast majority of coding is for customer-specific business apps. (Source: The Cathedral and the Bazaar.) The vast majority of bad code is produced by you and me. Although MS may contribute...

    In the frequently encountered software development environment, good coders are typically placed into a situation where we have to cut corners to meet deadlines. A faulty design or architecture has to be forced in due to time and/or money constraints. How many people here have had to develop to poor and/or missing requirements? How did those projects work out?

    Another issue, IMHO, is the proliferation of "look, you don't have to think anymore" IDEs. I am currently working maintenance on an "enterprise" project (the only thing enterprise is the price tag) which was built using Java-J2EE (enterprise ready) and deployed on enterprise hardware and app/db servers. Yet our throughput is abominable and we have huge memory issues.

    The people who originally wrote this mutated plague 'o the land clearly had NEVER written a Java app in their lives and were writing code more-or-less out of the book (I was hired on when we started maintenance). However, it was written using a very powerful IDE which did a lot of code generation and allowed people with below-average IQs to become Java programmers... If they had tried to write this thing using a text editor, they would have been exposed and thrown out much earlier.

    The power of thought and understanding cannot be overstated. I am leery of any tool that separates the user from the underworkings of any system, be it OS, language, DB or other. In the end, these tools allow people who have little understanding to create something that barely works.

    Just because it will 'javac' doesn't mean it works for sh!t...

    /rant. But it feels good to get that out

  • by bhurt ( 1081 ) on Tuesday June 25, 2002 @06:10PM (#3765510) Homepage
    I agree, but it's more than just requirements. It's unreasonable schedules, a lack - in many cases - of release management or even version control, and a consumer environment that allows, even encourages, software companies to focus on new features instead of getting the features they already have to work right.
  • by Syncerus ( 213609 ) on Tuesday June 25, 2002 @06:20PM (#3765570)
    Statements like the above are very misleading. On most of the major projects I have worked with over the last five years, time to project launch was always the number one consideration. In every case, I was ground relentlessly by senior management on a continuous basis about time-to-launch. If I stated that a project would take eight months to complete, the next question from senior management would be on ways to release the product in six months.

    The following day, I would get grilled on how to release the product in five months, or maybe four months.

    I experienced these same conditions while working for a number of different employers; when I pushed back on the release dates, a reason always appeared why our company would collapse in wreckage if the project were not released by the nearer deadline.

    I and the programmers that I have worked with finished some projects on time; others we did not. We worked in deliberate languages (C, C++, Java) and RAD languages (Perl, Python). The specifics of the project changed, but the mentality of an uninformed management never changes.

    I am very proud of what we achieved, bugs and all. I think that we did a hell of a job under very difficult circumstances.

    The statement that software bugs cost American industry $60 billion makes me laugh when I hear it. For the most part, the bugs are caused by management refusal to spend the time and money it takes to write bug-free software.

    The overwhelming majority of bugs can be easily eliminated by good developers if the following elements are in place:

    Good Functional Specifications
    Explicit Coding Standards
    Unit Testing of Code Modules (see the sketch after this list)
    Peer Review
    Sane Project Deadlines

    There's not much else to it. That, and the desire to do good work will fix 90% of the so-called problem.
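
    A minimal sketch of the "Unit Testing of Code Modules" item - my own illustration, using a hypothetical clamp() module rather than anything from a real project - in plain C:

        /* Hypothetical module under test: clamp a value into [lo, hi]. */
        #include <assert.h>

        static int clamp(int v, int lo, int hi) {
            if (v < lo) return lo;
            if (v > hi) return hi;
            return v;
        }

        /* Unit tests: exercise the boundaries, where off-by-one bugs live. */
        int main(void) {
            assert(clamp(5, 0, 10) == 5);    /* in range: unchanged */
            assert(clamp(-1, 0, 10) == 0);   /* below range: pinned to lo */
            assert(clamp(11, 0, 10) == 10);  /* above range: pinned to hi */
            assert(clamp(0, 0, 10) == 0);    /* exact boundary values */
            assert(clamp(10, 0, 10) == 10);
            return 0;
        }

    Run it as part of every build; a failed assert aborts with the file and line, which is about as cheap as regression safety gets.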

  • by GuyMannDude ( 574364 ) on Tuesday June 25, 2002 @06:25PM (#3765600) Journal

    I skimmed through the article and I didn't see any place where it said how much it would cost to actually produce bug free code. I'm betting much more than the $60 billion arbitrary figure they came up with.

    (emphasis mine)

    I'm sorry, but I have to take exception to your tone. Simply skimming a news blurb does not give you the right to trash the NIST study or label their conclusions as "arbitrary." Until you've made an effort to digest whatever analysis is contained in the 309-page study (and there is a link to the PDF file, so don't say you can't do so), you really shouldn't be shitting on someone else's work. I don't think I'm being harsh by taking you to task over this. That news bulletin is an advertisement of sorts for the study. You can't possibly expect to get a full understanding of their analysis from a news blurb, for chrissake.

    I'm sorry but one of my pet peeves is armchair philosophers who seem to think that they can best the experts without actually doing any work.

    GMD

  • by erroneus ( 253617 ) on Tuesday June 25, 2002 @06:39PM (#3765662) Homepage
    **WARNING** Microsoft bashing!!

    It has been brought up many times that having bug-free software is not in Microsoft's best interest. It's the bugs, subsequent versions of the same software and dropped support for old versions that keep the company raking in the dough.

    I'm not sure I have to go into great detail to explain how that works, but clearly, there is a business model in the works here and not just sloppy coding.
  • by TibbonZero ( 571809 ) <Tibbon&gmail,com> on Tuesday June 25, 2002 @06:49PM (#3765710) Homepage Journal
    Everyone knows that today you need more hard drive space than ever. That is a fact.

    However, I think that a lot of that hard drive space is wasted on sloppy coding. At one point, hard drive space was so expensive that you couldn't afford to have sloppy and inefficient programs. 8MB of memory was standard; you didn't have 512MB to mess with and leak into. Look at your OS, even. Windows 95 fit into 30MB or so. You could squeeze it even more if you started deleting things that weren't needed (Paint, WordPad, etc...).
    Now, look at your WinME or WinXP install. I just did a clean install of XP Pro, and it's taking up 1,082,413,755 bytes (about a gig). Yeah, that's a bit much.
    OK, I will give it to you, WinXP Pro has more features and functionality than 95, but is it 34x the features and functionality? Is my computer 34x better, or does it have 34x more applications and programs that I use? No!!!

    I think that this code has been bloated to hell. At one point, running DOS/Win3.1 and even NT 3.51, I knew what 90% of the files on my computer were, when they were put there, and what they were for. Now, who knows what 'adsmw.dll' does? (OK, I know one of you guys probably knows, but...)

    Also, they haven't optimized anything, or made anything faster. Win95 could run great on a 486. WinXP chokes on a 300MHz PII with 64MB. Again, we have gotten more features, but for a system to run smoothly do we need 10-20 times the processing power to make it run well? I think that's absurd. Again, we have come far, but don't tell me that it couldn't be faster with current hardware.

    Linux isn't free from this either. Again, we have more features now, but look at the requirements from Red Hat 5.2 to 7.3 - they have gone up A LOT. And now it's THREE CDs!!! That's even beating MSFT. We have a lot of programs on Linux, but come on, that's crazy. No, you don't need all that stuff, but the size of the "standard" install has gone up dramatically.

    Now on to games. Let's take Ultima Online. I remember that in the alpha test and beta test, a 486 DX/4 was more than enough. I ran it on a Pentium 90; it worked great. Now the system requirements are what? Pentium 233 with 64MB of RAM? It doesn't sound like much (the engine is pretty dated), but they haven't added ANY new things that would have really increased processor usage. No new graphics in the 2D version, no new sounds really, just bad programming.

    Has anyone looked at the demo scene lately? I want them to program my OS and games for me!! They program stuff in 64KB that most games can't do in 64MB!!!

    I think that most of the time now, programmers are seeing these fast 2.4GHz systems and thinking "Who needs to optimize their loops on a system that fast?" and "Let's use long doubles for everything, even if it's only a bool - more consistent that way." I personally hope that they start seeing that their programs can run faster if they program better...
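
    A tiny C sketch - mine, not the parent's - of what the "long doubles for everything" habit costs:

        #include <stdio.h>

        int main(void) {
            /* Sizes vary by platform; long double is often 12-16 bytes on x86. */
            printf("char:        %zu byte(s)\n", sizeof(char));        /* always 1 */
            printf("long double: %zu byte(s)\n", sizeof(long double));
            /* A million yes/no flags stored as long doubles instead of chars
             * wastes on the order of 11-15 MB and pushes useful data out of
             * the CPU cache. */
            return 0;
        }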

  • Re:Missing figure? (Score:2, Insightful)

    by Titusdot Groan ( 468949 ) on Tuesday June 25, 2002 @06:54PM (#3765737) Journal
    Hmmmm, almost every study I've seen shows that doing things right (proper requirements, design, testing, etc.) actually saves time. It's certainly been my own experience that projects following a good methodology have been cheaper than projects that didn't. Interestingly enough, the projects that didn't follow a methodology were always the ones initially estimated to be inexpensive.

    How many fewer versions of Office and Windows would there be if Microsoft were designing their software for reliability and usability instead of finding a way to hack the browser into the OS to screw Netscape, or modifying libraries to cause Lotus 1-2-3 and WordPerfect to malfunction? And would developing those versions be cheaper or more expensive?
  • Re:hmmmm (Score:3, Insightful)

    by happyclam ( 564118 ) on Tuesday June 25, 2002 @07:14PM (#3765813)
    Testing of the software isn't that big of an issue. "Most" bugs that are found by customers are already present in the bug tracking databases of the people that developed the software.

    Not necessarily true. The most recent startup I worked for (and got laid off from) had a policy of "ship it as soon as it has a heartbeat, and we'll clean it up during customization."

    Get this: 18 developers, one QA person. 6 months development, 3 weeks QA. Some stuff didn't even make it into QA. Yes, I opposed that schedule and spoke out against it but was overruled by The Man.

    So, perhaps it's not "more testing" that's required, but rather "more integrity" on the part of management of software companies. I think we all know the valuation games executives play these days, often sacrificing quality or glossing over deficiencies.

  • by rice_burners_suck ( 243660 ) on Tuesday June 25, 2002 @07:52PM (#3765982)
    Better testing could allegedly cut that by one-third.

    Yeah. And a better education system would cut the other two-thirds. I believe that better programming practices would reduce the number of bugs introduced to begin with, thus reducing the amount of testing that must take place.

    Better programming begins with better design. The purpose of a program and its interface to the outside world (user, system, etc.) must be spelled out initially. Nothing that doesn't fall into the predefined limited functionality belongs in the program. Period.

    I mean, you can make a bunch of programs if you need a bunch of functionalities and stuff, but you shouldn't have a ton of junk in the same program. If you know what I mean. I guess you don't, cuz I can see the flames coming.

    The program would be well designed, meaning the basic flow of the program would be defined, and then any number of possible ideas and algorithms would be explored during design time. The program should be designed such that algorithms can be exchanged with as little fuss and trouble as possible. In other words, encapsulation and modularity.

    The design would require the highest level of efficiency possible, meaning that spatial and temporal resources aren't wasted. In English, that means that a text editor wouldn't require 128 megs of RAM and 6 gigs of hard disk space, and would use up as few processor cycles as necessary, by design. In other words, a totally unoptimized program would execute so efficiently that any optimization would practically be a waste of time. You'll find that programs designed for efficiency are generally more reliable as well, because more thought goes into them before any implementation takes place.

    Finally, robustness would be a pervasive part of development. Testing mechanisms would be built in to the programs, and by design, those mechanisms could be removed at compile time. I'm not talking about debug information generated by the compiler. I'm talking about tests the programmer, tester and "preview release" users can invoke. A testing suite would also be designed, at design time.

    Project management is also part of the design process... in other words, for each component of the system, what resources (folks, time, money, etc.) are required, what problems are anticipated, etc. All of this is done by the programming team, not by some stupid manager who doesn't know jack about schitt. Once a ton of design takes place, management reviews it. Which features will be present in which release, etc. As much of everything as possible is anticipated beforehand. (Yeah yeah, you'll never anticipate more than 0.0% of the problems beforehand, but it gets you thinking, and that in itself will greatly reduce the number of errors made later.) Management reviews the design, multiplies all time estimates (which are already over-estimated by the programmers) by 2.5, and signs off on it or whatever. They can't ask for any new features until the program is written, debugged, tested, released, feedback comes in, a "huge fixes" release is released, more feedback comes in, and it's time to figure out new features again. This could take weeks or months, depending on the scope of the project or whatever.
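
    A minimal C sketch of that "testing mechanisms built in, removable at compile time" idea - my own illustration, assuming a hypothetical SELF_TEST build flag:

        #include <stdio.h>

        #ifdef SELF_TEST
        /* Test builds (compile with -DSELF_TEST): report failed checks. */
        #define CHECK(cond) \
            do { if (!(cond)) fprintf(stderr, "check failed: %s (%s:%d)\n", \
                                      #cond, __FILE__, __LINE__); } while (0)
        #else
        /* Release builds: the checks compile away to nothing. */
        #define CHECK(cond) ((void)0)
        #endif

        int average(int a, int b) {
            int avg = (a + b) / 2;              /* can overflow for huge inputs */
            CHECK(avg >= (a < b ? a : b));      /* sanity checks, test builds only */
            CHECK(avg <= (a > b ? a : b));
            return avg;
        }

        int main(void) {
            printf("%d\n", average(4, 10));     /* prints 7 */
            return 0;
        }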

    Once all that designing takes place, implementation begins. That doesn't mean that design ends. In fact, there's no 'design -> write -> compile -> debug -> fix -> write...' cycle. All of those items are done all the time, like you're multitasking or something. Yeah, you could draw out an operational chart showing all those items, so you'd spend most of your time doing whichever stage that you're on (like debugging most of the time, doing everything else some of the time). This is useful because you might be debugging some obscure problem, and then one of your fellow programmers will say, "Hey, who needs this damn routine in the first place?" (The routine you're debugging.) Maybe the design can change a little bit and then the section that contains that damn bug isn't even necessary. Know what I'm saying?

    Also, when I spake of education, I was speaking about all the branches of logic, electronics, math, etc. that are no longer studied as part of the programming curriculum or whatever. I'm talking about heavy duty stuff that programmers need to know cold, because it's actually USEFUL! Seriously... learn a little bit about electronics and you'll be surprised at how a few hundred lines of code can literally be replaced by tens of much simpler lines that execute thousands of times faster (not kidding!) and produce exactly the same results, or better (meaning with fewer problems and side effects). Seriously... when you design electronics, there's no erasing that capacitor once it's soldered on the damn board. You gotta draw it right, because you can't erase the metal once it's cut. And if programmers thought like this more, and didn't think "oh yeah, we can always change it later," then programs would be better. Because, as easy as it is to delete a bunch of text and change it, you and I both know that sometimes you can't change stuff because of side effects. So yeah, it's easy to type, but as a programmer, you gotta pretend that you're carving the damn program in stone, because you'll probably find that it is carved in stone once you've written about 100,000 lines.

    So where was I? Oh yeah, programming... So while you're implementing, the design is reviewed often, perhaps every few days or once a week at least. Problems encountered during implementation are tested against the design, and changes to the design are almost invariably made, with a big eye towards reducing or eliminating coding changes.

    Oh yeah, and all this is done by the programmers again. The team meets often to review each other's work, ask questions, find errors of any kind, criticize, rip the code apart, put it back together, and stuff. This constant reviewing is similar to the peer reviews that used to take place back when compilers ran on mainframes, where folks paid hefty fees for computer time and had to schedule an appointment weeks in advance. In fact, I'd advocate dropping a mahogany grand piano hood (or whatever that cover is called) on the hand of the programmer who makes the most mistakes during a project. That'll teach him (and get him to type slower, possibly leading to more thought behind each character typed). In fact, I'd get rid of the backspace and delete keys on the keyboards, and make the programmers pay a penny for each character they want deleted, and this would have to be done by a specialist who can write machine code to go to that location on the hard disk and remove the character, moving all the others back by one. Some of us would have to give up our whole damn paycheck if that was implemented!

    The core routines of the program, holding the most important crap, would probably be written by the whole damn team on a whiteboard, while questions are asked and shit. I don't care how long this takes, it'll save a ton of time in the end. Shit, some of all ya'lls will have to get a huge room with whiteboards all around and sliding ladders like they have in the old-school libraries. (Damn, dude, I love libraries.)

    Once in a while, the programming team will actually take a break from the program they're working on (like every friday afternoon, a few hours before you get off work) and study a bit of Knuth or one of the other classics out there. (There are a bunch of them.) Most of those books contain exercises at the end of each chapter. Pick a tough one, and have it solved by noon on Monday. You can think about it all weekend, talk to each other, whatever. This will expand the whole team's range of thought, and cause them to produce better shit.

    Oh yeah, and every line of the program would be run through a debugger, in as many combinations as practical, in order to watch what happens when they execute. All the programming do's and don't's learned over the past 50 years would be reviewed constantly, and the programmers would be required to follow all of these (using common sense, obviously).

    And of course, testers would be hired who get paid a small bonus every time they get something to fail. They'll be present from the first day onwards, testing even the individual routines before the program runs as a whole. And once the program runs as a whole, they'll get paid even more for each failure. And they'll have full access to the source code, so they can audit it to their heart's content and figure out innovative new ways to crash it and twist it and make it screw up all evil and stuff.

    When a program is ready for release, its release is delayed for an additional two weeks (or month, if you can afford it) as the entire damn team tries everything they can to make it screw up. (And yeah, they obviously fix every crappy thing they find. If all ya'll did it right up to this point, there won't be any changes requiring the damn design to change. Just little coding typos and crap.) Then, it's released as alpha-testing to a small group of folks. Then, once that passes, it's released as beta-testing in three phases to increasingly larger audiences. (Each of these testers gets paid a small amount for each bug they find for the first time.) Then, it's released to the final audience as version 1.0.

    Now at this point, a ton of support calls will start coming in and stuff, and folks will get suggestions and stuff for wishlists. All this will be accumulated for a few months while the programming team visits each problem and tries to figure out possible solutions. No coding changes are made during this time. These will be explored in theory, with an eye towards killing as many flies as possible in one swat. After a shitload of stuff is accumulated, the programmers choose proposed solutions, go over them along with management, go over proposed solutions, and choose the best ones. The programmers choose the best ones, that is. Management signs it into law, and they can't change their minds. Only the programmers can, if it's important enough, then call a meeting with management again and get the new law signed into law. These are implemented (again, with the same torture-testing as before) and once again, released to increasingly larger audiences, until finally, it's released as a version 2.0. All even versions are "fix" versions. All odd versions are "new features" versions. They can be developed in parallel, with porting going on between versions. Something like that.

    I'd even say, during design time, figure out how many bytes of code something will be, how many lines it'll be in the source code, how much memory and time it'll require, etc. Make up limitations, kind of like how car engineers have to make a throttle body fit in a certain place and fit certain restrictions. They get to play inside that area, but other stuff goes in the other places, so you can't make a throttle body the size of the whole damn car and expect the other crap to be changed to fit. Programmers constantly just write crap and assume it'll fit. If you invent restrictions, you'll be surprised how much better the code will be. It'll also be a challenge to make it fit into the restrictions you choose.

    Also, here's something all ya'll programmers will like. When the design is reviewed by management, and you say, "Implementing this should take 4 days" (and "this" would be a sufficiently small part of the system that it stands alone and can be estimated, but large enough that it won't be a single line of code or a single routine or something silly like that). Management will say, "Ok. If you can get this part implemented and working properly (in other words, passing all the tester's tests) within 3 days, you'll get a bonus of X." X wouldn't be like 10,000 bucks, but it wouldn't be 2 cents either. It'll be a small amount that when added to the other bonuses you get for getting everything done on time, you'll have some real money. And, a final release date for the whole damn project is estimated by the programming team. This is multiplied by 2.5 by management, and that's their expected deadline. (Then, if it does get done in time, management claims to be "early" and then everyone looks good. Even if it takes twice as long to get done as anticipated. But if it takes two and a half times as long as anticipated, you're on time. And if it takes longer than that, you're only overdue by a very small amount. And if you're really overdue, you're stupid.) Now, if the program can get released within 90% of the originally estimated time (that is, not the 2.5 times more), then the whole damn team gets a hefty bonus. Just to make them think efficiently.

    Ooooooooh well.

  • by Citizen of Earth ( 569446 ) on Tuesday June 25, 2002 @08:36PM (#3766138)
    was built using Java-J2EE (enterprise ready) and deployed on enterprise hardware and app/db servers. Yet our throughput is abominable and we have huge memory issues.

    What, you think that Sun developed Java because they are interested in moving software?
  • by Anonymous Coward on Tuesday June 25, 2002 @08:38PM (#3766145)

    There was a time when American cars were the greatest cars in the world; the same applied to consumer electronics products. There was a time when the paradigm said you cannot combine price, quality, and time to market.

    Then there was W. Edwards Deming and his quality management, and something happened to American industry after some other guys listened to his ideas.

    And now a lot of people say that software can only be produced buggy, feature-laden, and costly.

    Well, there is a guy named Watts S. Humphrey, and people in other parts of the world are hearing his ideas about software engineering and the CMM. There is no question that someone will find out how to make software less buggy, on time, and at the right price.

    History will tell. Remember that those who forget history are doomed to repeat the same mistakes.

  • by GCP ( 122438 ) on Wednesday June 26, 2002 @12:06AM (#3766895)
    Using better languages could reduce software bugginess dramatically, but only if programmers admit they have a problem.

    Most developers overestimate the value of custom solutions to every problem. A language like C - with no built-in string type, no collections other than the most primitive array, no memory management above the finest-grained primitives, and so on - seduces programmers into working at too low a level of abstraction for most problems.

    The nature of C is such that it is a good choice for the software equivalent of "innermost loop" code: small bits of code in constant use with crystal clear functional requirements that never change. The 1% of code that accounts for 95% of all CPU cycles should probably be written in C, massively code reviewed, tested exhaustively, and then left alone to do its well-defined and unchanging job.

    Code of this sort should mostly be in the OS itself, or in libraries, although it's also appropriate in the "engine" portion at the heart of Big Apps, like database engines, spreadsheet recalc engines, text rendering engines, etc.

    But programmers flatter themselves that everything they write is this type of code, and use C for the other 99% for which it is ill-suited. Instead of working at a higher level, using built-in Unicode strings, well-designed collection classes (where you simply can't write "outside the box"), automatic memory management, etc., they foolishly trade safety for speed, reliability for customizability, and so on.

    It's so easy to trade safety for speed when you tell yourself that you're such a hotshot programmer that you don't overlook anything (hence no actual loss in safety) and that your code is so important that an increase in performance is a Big Deal (when it's probably undetectable to the end user).

    C is the perfect tool for programmers who think this way and a major reason for the general bugginess of software. It's an excellent choice for a small percentage of code and the biggest cause of bugs in the rest.
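
    To illustrate the level-of-abstraction point, here is a minimal sketch of my own - the join() helper is hypothetical - of what a trivial string operation demands in C:

        #include <stdlib.h>
        #include <string.h>

        /* Concatenate two strings into a freshly allocated buffer.
         * Every line is a bug opportunity that a built-in string type
         * with automatic memory management removes. */
        char *join(const char *a, const char *b) {
            size_t la = strlen(a), lb = strlen(b);
            char *out = malloc(la + lb + 1);   /* forget the +1 and you overflow */
            if (out == NULL)
                return NULL;                   /* forget this and you crash on OOM */
            memcpy(out, a, la);
            memcpy(out + la, b, lb + 1);       /* copies b's terminating '\0' too */
            return out;                        /* caller must remember to free() it */
        }

    In a language with a real string type this is one expression; in C it is a half-dozen decisions, each a classic source of bugs when someone gets it wrong.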
