The Economist Tackles Complexity in IT 270

yfnET writes "In recent weeks, The Economist has run a number of articles addressing the ever-increasing complexity of software systems. The magazine, with typical Economist wisdom, casts an eye towards past human endeavors for lessons on how today's IT industry can succeed in dealing with complexity. As part of last month's extensive survey of information technology (see Related Items sidebar), the magazine offers insight on the limits of real-world metaphors, the perils of managing a rat's nest of obsolescent systems, and the need for 'disappearing' technology. And hitting newsstands just today is an overview of development models for increasingly large and unwieldy software projects. Among other things, this article compares the open source model to Microsoft's efforts using a quasi-open license. It also describes the 'agile' programming movement and its potential to keep even the most gigantic of projects under control."
  • by Anonymous Coward on Friday November 26, 2004 @05:00PM (#10927189)
    Powerful languages like Smalltalk and Lisp help one handle complexity.
    • They help, but they don't have a monopoly on handling complexity. Any programming language that allows you to subdivide a program into chunks of some kind with good hiding of the innards of the chunks can be used to write complex programs. To really beat complexity, try shell scripts. Since you can't (really) do simple stuff like adding numbers, it forces you to hide the details, which reduces the feel of the complexity. And stuff.
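      A minimal sketch of "good hiding of the innards of the chunks" in Python (all names here are invented for illustration): callers see a two-method interface and never the representation behind it.

      ```python
      # Sketch: a chunk whose internals stay hidden behind record()/count().
      # Callers never touch the representation, so it can change freely.

      class Counter:
          """A chunk with a tiny public interface and hidden innards."""

          def __init__(self):
              self._events = []  # internal detail; free to change later

          def record(self, name):
              self._events.append(name)

          def count(self, name):
              return sum(1 for e in self._events if e == name)

      c = Counter()
      c.record("login")
      c.record("login")
      c.record("logout")
      print(c.count("login"))  # 2
      ```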
      • by Anonymous Coward
        We need to stop developing applications and start to think about platforms. Every class library is a quasi-language in which to describe a solution. Let them be full languages.

        We need to design very high level domain specific languages and write our applications in that. We can reuse the language (kind of like a high level virtual machine) as a platform for future versions of the application and could even let the user redefine the upper layers.

        We can hide a lot of control assembly in the structure and sy
        • by Anonymous Coward on Friday November 26, 2004 @06:30PM (#10927658)
          the advantage Lisp and Smalltalk have that other languages lack is that their syntax is so simple that extensions meld in as if they were part of the language.
        • Smalltalk and Lisp have always had this ability; I think it's time the rest of the programmers get with it and start using Language Oriented Programming.

          I used to think like you do, but now I'm better. Work in a palimpsest environment on a project that had 20 different coders on it, and you'll sing a different tune. The only examples of successful Smalltalk/Lisp projects I could find were 1-2 people working on the entire project for its entire lifetime. Um, that's not where the rest of us live.
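          The "class library as quasi-language" idea above can be sketched as a tiny embedded DSL in Python (the pricing domain and every name here are hypothetical, purely for illustration):

          ```python
          # Sketch of an embedded domain-specific language: the application is
          # written in terms of the mini-language, not in raw host-language code.

          def discount(pct):
              return lambda price: price * (100 - pct) / 100

          def tax(pct):
              return lambda price: price * (100 + pct) / 100

          def pipeline(*rules):
              """Compose pricing rules into one 'program' in the mini-language."""
              def run(price):
                  for rule in rules:
                      price = rule(price)
                  return price
              return run

          # An application written *in* the DSL rather than in plain Python:
          holiday_pricing = pipeline(discount(10), tax(5))
          print(holiday_pricing(200.0))  # 189.0
          ```

          The point is reuse: the same small vocabulary can carry future versions of the application, and upper layers can be redefined without touching the host language.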

    • Using a strange language that is not standard is what causes the problems in the first place. Did you read the articles at all?

      It's about a multitude of systems from many vendors, running middleware from many vendors, which run custom applications written in many different languages from different vendors, that somehow all must communicate together. Not just which editor you use or which language you think is cool.

      Why is Microsoft still gaining market share ahead of the supperior Unix? It's because they
      • by lawpoop ( 604919 ) on Friday November 26, 2004 @07:46PM (#10928048) Homepage Journal
        A language is not 'strange' just because few people use it. Here's what you should take away from the parent:

        Some languages are better than others. The parent argues that lisp and smalltalk are some of the best languages to use. With other languages, you get to a point where you can't do it with that language alone, and you start building a meta-language out of that language, and there is your complexity creep. Avalon, etc. are an attempt to address the cruft and complexity that have grown up around the Windows platform.

        As far as Microsoft gaining on "supperior [sic]" unix, keep in mind that Unix is not Lisp (TM). So your comment is irrelevant. Parent is arguing that if you don't want complexity creep, start out with something that you know can see you to the end. Parent claims this is smalltalk/lisp.

    • It's not a technical problem. Most programmers don't understand this because they are trained to provide technical solutions for problems.

      The problem with the software industry is that it doesn't seem to learn much from its past. People were talking about complexity and the growing scale of things in the sixties already. Many of the technical solutions (e.g. components, object orientation) suggested then have been more or less adopted now. This has allowed us to scale development up to the point where we h
  • Aren't these the same people who tried to tell us that "comparative advantage" meant we should give up our manufacturing and all get degrees in high tech? Why should we believe ANYTHING they have to say after they've lied to us for so long?

    Economics is a religion- and a failed mythology at that. Economists need to learn to examine and reconfigure their basic axioms before ANYBODY should ever listen to them again.
    • You just don't give up, you fool. We're supposed to be transitioning to a new industry now. Keep with the times.
      • And what new industry would that be? And why won't it go away too?
        • It's a combination of several, actually.

          1) Day-trading.
          2) Law.

          In all seriousness though, I've been on your side of the argument more times than I want to count. Here are some of their allegedly serious answers.

          A) Space tourism.
          B) Biotechnology.
          C) Marketing.
          D) Robotics.***

          The first, even assuming engineering miracles (and if anyone can give us those, it's probably Rutan), can't amount to more than a handful of US jobs. Most launches will be equatorial, meaning any ground crew will be somewhere else. T
          • Oddly enough- from my point of view, the fourth was the least dumb answer of the bunch- of course, the real problem with it is when the Robots start building other Robots- which will come *before* they're in general usage anyway.

            One thing that amazed me is still being done by human beings: toilets. I would have thought that would have been automated years ago- but apparently the clay they're made out of contains imperfections that can be sanded out once identified- and this is one more job that will have to
          • Those are all great points. IMHO the bottleneck in every field is politics. This country has politics to the most absurd degree. Everybody's trying to get better at politics so they can get a raise. I seriously can't think of one industry where getting better at your skills actually turns you into a manager overnight.

          • A) Space tourism.

            B) Biotechnology.
            C) Marketing.
            D) Robotics.
            Unfortunately only one of those options is realistic, and I'm not a good liar.
    • Uuuhhh... have you ever read The Economist? There's no "these people" there - it's a magazine, like Time or Newsweek, with a global perspective. But it's just a magazine, dude.
    • Judging by the article, it looks like those with high-tech degrees may still be in demand due to neglect, mergers, and other issues that are plaguing the datacenter.

    • Chinese Slave Labor (Score:2, Interesting)

      by Anonymous Coward
      The problem is that American politicians want to integrate the Chinese economy and our economy [tibet.org]. In this combined economy, whenever Beijing intervenes in the Chinese economy, that intervention also affects the American economy. Yet, American politicians continue lying, "The USA has a free market because Washington does not intervene."

      What we should do is to terminate all trade (including goods and services -- which includes labor) with mainland China. Ditto for India and Mexico. Until the Chinese, the

      • by Anonymous Coward
        Further, we deport all the H-1B workers.

        Tell you what, at this point, there's no point in doing this. Instead, I say we let these people come over and enjoy our crappy healthcare, crappy retirement fate, crappy work environment, and crappy housing situations, while India accepts the same number of people to take their American money and live like kings (or at least moderately wealthy princes) over there. Or maybe I'll go move to Spain. Work all morning, work in the evening, sleep all afternoon. Sounds
      • by tarunthegreat2 ( 761545 ) on Saturday November 27, 2004 @01:13AM (#10929355)
        Of course, I'm feeding a troll here, but India has been on a path to "westernization" for about 200 years now, ever since the British East India Company first set foot on the shores of Calcutta. India already has a "western-style" political and economic system, so STFU. Our laws are based on British Common Law - newsflash - so are American laws. We VOTE our leaders into power. When two companies have a dispute over a contract, they go to court. When we want to make laws, they have to be passed by two houses of parliament. Parliament happens to be this big place where elected representatives gather - to pass laws. Oh, and the unofficial official language of India is ENGLISH. Imposing this system on a culture which has been transforming and transitioning for the past 1900 years, going from Hindu - to Buddhist - to Muslim - to British, is going to produce results which will be very different from what a pitifully young country like America is used to seeing. So just get the fuck over it. Finally, this commitment to free trade that you talk about - the developing countries are ready & waiting for it. It is America that can't handle freeing trade in agriculture and industries like steel. It is the USA which puts quotas on garments. Look up any textbook: quota != Free Trade. If you think an Indian software engineer is cheap, wait'll you discover the price of an Indian orange, or an Indian T-shirt. But you won't know about those because trade in those items is not "FREE". And it's not free because the EU and America want it that way. Because Billy Bob with-mouth-in-straw living in a redneck county of a red state just voted the current monkey into the White House. Now go back to guarding the bridge, trolly-wolly.
    • by fiannaFailMan ( 702447 ) on Friday November 26, 2004 @05:49PM (#10927458) Journal
      Aren't these the same people who tried to tell us that "comparative advantage" meant we should give up our manufacturing and all get degrees in high tech? Why should we believe ANYTHING they have to say after they've lied to us for so long?
      Actually, I don't seem to remember that paper ever advocating any such thing. Nevertheless, a great many people fell for that one. A great many people still cling to it. And to say that they "lied" implies that they knew different but told us this anyway. Perhaps you can tell us what they hoped to gain from this grand deception. Or perhaps you'll just admit that you had an aversion to their endorsement of John Kerry.
      Economics is a religion- and a failed mythology at that.
      Nonsense. Economists, and the Economist newspaper, seldom have a problem admitting mistakes. Show me a religion that is willing to change its belief in a supernatural being after centuries of mounting evidence to explain away the mysteries that underpinned such superstition.
      • > Perhaps you can tell us what they hoped to gain from this grand deception.

        Brilliantly myopic of you to ask. What could a glut of intelligent high tech laborers gain them?

        And it's such a simple answer too: cheap labor. Cheap labor here, cheap labor there, cheap labor everywhere. Wise up and stop being a chump. They want to destroy the middle-class because a robust middle-class eats away at their bottom line where even enough is never enough.

        Learn something. Read: http://www.conceptualguerilla.com/les [conceptualguerilla.com]
      • Actually, I don't seem to remember that paper ever advocating any such thing. Nevertheless, a great many people fell for that one. A great many people still cling to it.

        Advocating free trade leads to this automatically.

        And to say that they "lied" implies that they knew different but told us this anyway.

          They should have known different- the end result of 50 years of subsidized education in one of the two most populous countries on earth is obvious in hindsight- and that information was never shown in foresight.
        • Advocating free trade leads to this automatically.

          Huh? Advocating free trade leads to us "giving up our manufacturing and all of us getting degrees in high tech?"

          They should have known different- the end result of 50 years of subsidized education in one of the two most populous countries on earth is obvious in hindsight- and that information was never shown in foresight.

          Huh? I'm sure you have an interesting point to make here. Please make it.

          Cheap labor and the destruction of the middle class- a mas

          • Huh? Advocating free trade leads to us "giving up our manufacturing and all of us getting degrees in high tech?"

            Of course it does- why else would we give up our cushy $15/hr factory jobs and let them ship those jobs overseas?

            Huh? I'm sure you have an interesting point to make here. Please make it.

            If we had been allowed to know two small facts about this whole globalization scheme, this disaster would have been foreseeable to many in the computer industry, particularly those working on networking. Thos
    • by Prof.Phreak ( 584152 ) on Friday November 26, 2004 @08:24PM (#10928208) Homepage
      Aren't these the same people who tried to tell us that "comparative advantage" meant we should give up our manufacturing and all get degrees in high tech?

      Many manufacturing jobs are cheaper overseas. Your point being? (ie: economist's point of view)

      If you got your degree in `high tech' because of what you heard an economist say... well, you deserve what you get then.

      If you're truly interested in computers and their capabilities (ie: a `computer scientist'), then your job cannot be outsourced.

      The coding jobs will go away---but that doesn't mean there aren't plenty of non-coding things to do (like design, research, etc.; until we have computers walking and talking around the world, there are still plenty of opportunities in IT---most of which pay off big time no matter where you do them).

      It's about time people with computer science degrees realized that they weren't training to become a code monkey.
      • I trained to be a software ENGINEER- that includes coding far better than what any given code monkey can do. But since the code monkeys are doing it, my talents went to waste for over two years as I scrambled to find some way to make a living.

        Science is about theory- ENGINEERING is about application of that theory. I'd much rather be doing the application of the theory than inventing the theory- but that's all been ripped away now.

        Plus, if you haven't noticed- Honda has a computer that walks, and Micros
        • There are things a _good_ coder (ie: a software engineer) should be able to do that average code monkeys cannot: like being able to build a complex system (something way above the level of a `vb db app').

          If you want some challenges, build a system that can recognize human speech well (no matter who's the speaker---and without training). I'm sure that will sell really well.

          The Honda robot may walk, but have you seen it walk? It walks slower than a slug. Build a system that can keep its balance while runnin
          • There are things a _good_ coder (ie: a software engineer) should be able to do that average code monkeys cannot: like being able to build a complex system (something way above the level of a `vb db app').

            True enough- but NONE of them pay, because business doesn't really need anything more complex than a vb db app.

            If you want some challenges, build a system that can recognize human speech well (no matter who's the speaker---and without training). I'm sure that will sell really well.

            Already is for $39.9 [handango.com]
            • What makes you think the code monkeys don't have an education? Better yet, what makes you think any company is going to be willing to pay American wages for this kind of research?

              Ok, I admit that many code monkeys have an education---in fact, many have a far better education than most. (ie: hire a BS degree in US or a masters degree in India?). That you can't do anything about (they went to school, got an education, etc.).

              The thing that you do control is your imagination, creativity, etc. Many people lac
    • ... which is a magazine, and not to be confused with the profession. In fact, I know of a few newspapers with that sort of name: The Economist and Sun, the Economist-Tribune.

      Newspaper and magazine names are usually historical, and the words they contain have often changed meaning. For example, there is a local newspaper north of Toronto called "The Liberal". This newspaper has, for example, no ties to the Liberal Party, is not particularly philosophically Liberal (whatever that means), and is basically
  • by Not_Wiggins ( 686627 ) on Friday November 26, 2004 @05:13PM (#10927248) Journal
    There's a serious problem with agile methodology and outsourcing (I didn't see any articles on Economist.com related to outsourcing, but may have missed it as I gave it only a cursory look).

    Large and unwieldy projects benefiting from agile methodologies? Yeah... when you have easy communication between the "customers" (business partners) and your IT staff.

    How does that happen when your developers are thousands of miles away, in a different timezone, with a totally different culture, and don't speak your native language?
    • I think this is one of the least recognised aspects of agile techniques, particularly XP. While at first glance they look quite chaotic, they actually require a huge amount of discipline and vast amounts of interaction. Obviously this is not an approach that scales well, so when the project team reaches double digits, as even moderate-sized projects frequently do, it's time to look for alternatives. Also, it's unusual for customers to be in a position to free their staff for the extended periods of time something li
    • Two weeks ago, the Economist published a survey on outsourcing [economist.com]. Their "surveys" are extensive reports, consisting of about 14-16 printed pages and 10-12 separate articles. It covers the topic from several different angles, and it's helpful to see outsourcing in all its complexity, rather than the "Indians took my job, outsourcing is evil" perspective that dominates here.
  • Translated from a French newspaper:
    "The incumbent operator, its subsidiary Orange, and Bouygues Telecom have all three suffered, in the last seven months, significant outages depriving their customers of telephone service for periods ranging from a few hours to almost two days. This succession of events highlights the growing complexity of telecommunications systems and the difficulties raised by the new requirements for interconnecting many disparate networks." A minister
  • by qualico ( 731143 )
    Hardware complexity can be reduced with homogeneous environments.

    Good planning, documentation and standards reduce complexity in software.

    How much more do they have to complicate the issue?
    • But heterogeneous environments help mitigate security problems. They are harder for a cracker to get through, less susceptible to viruses and malware, and when parts fail, they tend to be less catastrophic because you are running different architectures. The trade-off, of course, is in ease of administration, but in my experience the time spent running heterogeneous hardware/environments has paid off big time when there is a problem. Complexity makes things difficult, yes, but that's what I'm paid for, no?
  • Do simple things (Score:5, Insightful)

    by barcodez ( 580516 ) on Friday November 26, 2004 @05:16PM (#10927272)
    People tend to make things overly complex if you let them. Bosses want that technology they read about in whatever PHBs read these days. Such-and-such wants some TLA to do some other TLA.

    I know it's hard, but you have to tell them that these things don't add any value in and of themselves. You want the simplest possible system that will solve the problem at hand. Really, nothing more; don't implement something because you may want/need it tomorrow, 'cos when tomorrow comes it won't be right (and if it is, hey, you can implement it then).
  • by Greyfox ( 87712 ) on Friday November 26, 2004 @05:19PM (#10927294) Homepage Journal
    I've seen it work well and I've seen it work poorly. If all you have programming or managing on your project are chimpanzees, telling them to use scrum (or whatever) will not help you.

    I'd say just hire good people and keep the managers focussed on just a few projects, but this seems to be beyond the capabilities of most companies. So you end up with programmers who write Java like it was Fortran and managers who juggle so many projects that they barely know your name, much less what you do for the company. There doesn't seem to be a fix for this that companies will be willing to accept.

  • by museumpeace ( 735109 ) on Friday November 26, 2004 @05:21PM (#10927300) Journal
    fascinating read that, but facile too.
    All the management of complexity now takes place within the network, so that consumers no longer even know when their electricity or water company upgrades its technology. Thus, from the user's point of view, says Mr Benioff, "technology goes through a gradual disappearance process."
    [from the article]
    This push to make simple [for the user] what is after all increasingly complex can only hide, not eliminate, the role of the nerd class, a role the article disparages because nerds are presumed, as the inventors, to have foisted off complexity on the unwitting public. Was it Heinlein who said that "any sufficiently advanced [or was his word complex] technology is indistinguishable from magic"? The wish, on the part of typical users and marketers, that all the wonders of our age and those ages coming next should all just work like magic will in fact only ADD the complexity of UI technologies that are good at hiding the guts of the systems we depend upon. Then the engineers and technicians will be as needed as ever and get even less sympathy from a public that never sees directly what it is that the "nerds" are doing for them.
    • by Anonymous Coward
      Was it Heinlein who said that "any sufficiently advanced [or was his word complex] technology is indistinguishable from magic"?

      It was Arthur C. Clarke, I believe. And the corollary to that is something like:

      Any technology that IS distinguishable from magic is not sufficiently advanced.
    • This push to make [for the user] simple what is after all increasingly complex, can only hide, not eliminate the role of the nerd class, a role the article disparages because nerds are presumed, as the inventors, to have foisted off complexity on the unwitting public.

      I think the consideration you should take from that article is distinguishing between what we will call the "common nerd" and the "ubernerd".

      When a technology is in an underdeveloped state, the use of that technology is very complex. It requires a ce

  • by GreenCrackBaby ( 203293 ) on Friday November 26, 2004 @05:24PM (#10927323) Homepage
    For an outsider looking in, perhaps it's easy to look at something like Google's 23-word front page and say "why can't they all be like that!" Too bad most systems need more than one form element to allow the user to interact with the system. Can you imagine someone telling Adobe to reduce their Photoshop interface down to one or two buttons? It would make no sense simply because editing a digital image is far more complex a process than 'search the web for these terms' to a user (though both may have similarly huge code bases behind them).


    Complexity in IT isn't going to go away. In fact, I'd argue it is a necessity. There are some tasks that simply require complex systems and those complex systems require complex data and/or complex user interfaces.

    • by Billly Gates ( 198444 ) on Friday November 26, 2004 @05:38PM (#10927395) Journal
      But here is the problem. All the complex systems don't talk to each other.

      The IT manager who quit from JP Morgan was a perfect example. You have 450 applications talking to each other and a user calls the helpdesk and demands an answer right away. What caused the problem? Which layer? Which application was doing what to the data?

      Microsoft was hot for a while with the IT managers in corporations because all the DCOM/COM/OLE applications can interact with each other and become one. This can help the problem tremendously.

      However, there is no standard protocol between all the vendors. That needs to change before vendors start shipping their own proprietary versions that only work with their products.

      If an application uses several layers and it screws up there has to be a way to trace and find out what happened.

      Perhaps a new open-source protocol could help? I like that idea.
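      The tracing idea above ("there has to be a way to trace and find out what happened") can be sketched as a correlation id threaded through every layer, so a helpdesk question like "which layer touched this request?" has an answer. The layers and names here are invented for illustration:

      ```python
      # Sketch: one correlation id follows a request through every layer,
      # and each layer logs enter/exit against that id.

      import uuid

      log = []

      def traced(layer_name):
          def wrap(fn):
              def inner(payload, corr_id):
                  log.append((corr_id, layer_name, "enter"))
                  try:
                      return fn(payload, corr_id)
                  finally:
                      log.append((corr_id, layer_name, "exit"))
              return inner
          return wrap

      @traced("billing")
      def billing(payload, corr_id):
          return payload * 2

      @traced("middleware")
      def middleware(payload, corr_id):
          return billing(payload, corr_id) + 1

      corr_id = str(uuid.uuid4())
      result = middleware(10, corr_id)
      print(result)  # 21
      # Every log entry carries the same corr_id, so a failure in any layer
      # can be traced back to the original request:
      print(all(entry[0] == corr_id for entry in log))  # True
      ```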
      • by GreenCrackBaby ( 203293 ) on Friday November 26, 2004 @05:55PM (#10927483) Homepage
        All the complex systems don't talk to each other.


        I'd argue that is a near-impossible task. My background is in billing systems, so I'll give an example from that realm...

        Our company makes billing systems that end up producing your telephone bill. Sounds simple, but the billing system alone clocked in at over 6 million lines of code. Then you have the other two big necessary systems: a CSR application (such as Siebel) and a provisioning system. Not to mention hundreds of smaller apps that feed/collect data from each application. You have no idea how complex the infrastructure can get!

        But it's not just that complexity. In our billing system, a customer's account was important because we needed to know who to charge for each transaction. In a CSR application, they care about who to contact. In a provisioning app, they care about where the account is physically. This leads to a different approach to designing something as simple as the account structure in the database. It's not something that could be standardized because each application needs to look at the data differently.

        There was some hope of standardization with middleware applications like Vitria, but what we found is that we'd spend insane amounts of time building code that translated our account between our billing system and some common model held by the middleware. The complexity didn't go away -- it got worse!

        You won't ever see a standard vendor protocol. Not for lack of wanting one, but simply because it's impossible.
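        The point above, that billing, CSR, and provisioning each need a different view of the same account, can be sketched as per-system projections of one record. The field names here are invented, purely for illustration:

        ```python
        # Sketch: one account record, three system-specific projections.
        # A single "standard" model would have to satisfy all three at once.

        account = {
            "id": "A-1001",
            "payer": "ACME Corp",
            "contact": "Jane Smith <jane@example.com>",
            "service_address": "12 Exchange St",
        }

        def billing_view(acct):
            # billing cares about who gets charged for each transaction
            return {"id": acct["id"], "payer": acct["payer"]}

        def csr_view(acct):
            # customer service cares about who to contact
            return {"id": acct["id"], "contact": acct["contact"]}

        def provisioning_view(acct):
            # provisioning cares about where the account physically is
            return {"id": acct["id"], "address": acct["service_address"]}

        print(billing_view(account)["payer"])         # ACME Corp
        print(provisioning_view(account)["address"])  # 12 Exchange St
        ```

        The middleware pain described above is exactly the cost of translating between these projections and one common model.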

        • But it's not just that complexity. In our billing system, a customer's account was important because we needed to know who to charge for each transaction. In a CSR application, they care about who to contact. In a provisioning app, they care about where the account is physically. This leads to a different approach to designing something as simple as the account structure in the database. It's not something that could be standardized because each application needs to look at the data differently.

          And each p

        • by Mydron ( 456525 )
          Complexity may be inevitable, but for a moment let's put on our economist hat and look at what motivates companies to tolerate complexity. While some complexity is inherent, I believe that a lot of this complexity is often accidental, sometimes born out of naivete, and occasionally deliberate. However, at the end of the day, companies are not in a big hurry because the added cost of complexity often doesn't cost them anything; it is merely added to the cost of doing business and paid for by the consumer.

          Furth
    • by aoteoroa ( 596031 ) on Friday November 26, 2004 @05:58PM (#10927499)
      Quoting the parent:

      Can you imagine someone telling Adobe to reduce their Photoshop interface down to one or two buttons? It would make no sense simply because editing a digital image is far more complex a process than 'search the web for these terms' to a user (though both may have similarly huge code bases behind them).

      You hit the nail on the head. Software is complex when used to solve complex problems and easier for simple tasks.

      A simple accounting package such as QuickBooks can seem tough to use for a user who doesn't have a basic understanding of bookkeeping. OTOH, some software is so easy to use that people take it for granted. My girlfriend needed an office suite on her computer, so I installed OpenOffice. Because one word processor is very much like every other word processor, she started using OpenOffice right away and had no problems, even though the only other word processor she had used was WordPerfect.

    • Yes, Google is an extreme example, but it still illustrates an important point (which you alluded to):

      You have another good example: Photoshop. Photoshop is actually rather well organized (well, version CS, that is; 7.0 still feels awkward). Not every button is displayed on the interface. When a tool is selected from the tool palette, additional options appear at the top in a context-sensitive tool options toolbar.

      Contrast that to Word. In Word you open it up and to create a new document (i know i
    • You hit the truth.

      If something is simple, people will pay less for it. The money is in complex systems.

      If everything is made simple, a lot of companies (or even lone hackers) can provide the system. Then the competitive edge you get by using IT comes, again, from a complex system (because your competition has the simpler systems).

      Complexity isn't going to go away, because that is where the money is.
    • In 1998 someone might have said the same thing, putting a search engine in place of photoshop and saying "imagine completely stripping out all boolean operators from search engines to simplify the interface"- it would make no sense simply because searching through millions- billions of documents is a far more complex process than .

      However, here we are today, and I just type a few terms into google, and 95% of the time, I get exactly what I want... almost magically.

      I can imagine an interface to photoshop w
  • by Onimaru ( 773331 ) on Friday November 26, 2004 @05:24PM (#10927328)

    So a lot of this space was spent explaining to Joseph P. Siquespack, Esq. what a "protocol" was and the like, but there were two points in here that I'm really glad my great-grandboss might be reading:

    1. A system should be designed to fail in a predictable way. Much like a car body, it should crumple to protect its most valuable assets, and repairs should have obvious beginnings, middles, and ends.
    2. Obsolete systems will cause you more downtime in the end than incremental upgrades. And, what's worse, it will be all at once instead of at 4am twice a month on Saturday morning.

    Neither of the above is an impossible goal! They can be done with a little thought and elbow grease. And the great part is, they're probably already being done! Next time you're reading over your IT department head's recommendations for a project, call them up and ask WHY. You might be amazed at how awesome the answer is, and it might even persuade you to put away the "my way or the highway" stamp.

    • Obsolete systems will cause you more downtime in the end than incremental upgrades. And, what's worse, it will be all at once instead of at 4am twice a month on Saturday morning.

      The problem is, that's at odds with the "if it ain't broke don't fix it" wisdom.

      Also, there are no incremental upgrades if you're running custom software under MPE on an HP mainframe. There's only (usually very expensive) migration to something else.
      • The problem is, that's at odds with the "if it ain't broke don't fix it" wisdom.

        Of course you have a point, but there's something to be said for anticipating the need for change.

        Old hard drives are going to punch-out sooner or later, your webserver might work with IPv4 today but someday IPv6 might be your only option, etc.

        If-it-ain't-broke-don't-fix-it is a nice countermeasure to "change for change's sake", but it shouldn't exclude the opportunity to future-proof ourselves, right?

      • The problem is, that's at odds with the "if it ain't broke don't fix it" wisdom.

        That guy is my boss: we don't do patches, they only break stuff. Build a stable system and never touch it again until we throw it out (years later).

        Just today we had a box owned: Red Hat 7.3 (unpatched) and Tomcat 3.3. It took two years of negligence with that box. We have other boxes older than that ..

        Right now we are (i shit you not) migrating our self-service system to .not, including access to all your phone reco

    • Obsolete systems will cause you more downtime in the end than incremental upgrades. And, what's worse, it will be all at once instead of at 4am twice a month on Saturday morning.

      Not always true.

      Giving support to MSDOS 3.0? Yes, it will give you more downtime.

      Good old IBM mainframes? NO, IT WON'T. In fact, quite the opposite.
      --
      Information about Isaac Asimov [uchile.cl]
  • by Boss, Pointy Haired ( 537010 ) on Friday November 26, 2004 @05:29PM (#10927345)
    Sure, information technology is becoming ever more complex; but that is also a great opportunity for companies whose management can deal with that complexity to excel and outperform their competition.

  • Engineers love to break things into smaller parts, each part serving one and only one function, like pulleys, shafts, rotors, etc.

    For really effective design each part has to serve multiple functions, like evolution is able to do: The human mouth can be used to eat, breathe, talk, etc etc.

    That's why a robot can't compete with an animal: in a robot each part usually serves only one function, making the machine inefficient as a whole.

    This problem is just magnified in computer software and will only get worse unless engineers start changing their tune. I think the worst offender is object-oriented programming: it's the ultimate embodiment of this philosophy. Most big object-oriented software has only about 2-3% of code that performs any real work; the rest is only window dressing to fulfill the engineer's urge to "modularize".

    The best software I see seems to be written either in more pragmatic procedural styles, or with better mathematical underpinnings for its structure, like you'll find in functional programming languages (Haskell, Lisp, etc.)

    My apologies for living up to my user name!

    Conrad Barski
    • by TheLastUser ( 550621 ) on Friday November 26, 2004 @06:27PM (#10927645)
      To paraphrase: OO doesn't make messy code; messy people who misuse OO make messy code.

      I think the main problem with what, apparently, all of us have seen in OO code is the universities. The coding style that is taught in universities makes for really ugly, unmaintainable code.

      If you cram 100 abstractions and modularizations into a project in university, you get an A and everyone says how clever you are to have used all of these features in your project. Do the same in the real world and you are left with unmaintainable blech.

      People have to learn that the various OO features are there to help them simplify their code, not to make it more complex. If using a particular modularization technique, say an interface, doesn't remove complexity, then don't add it.

      Another really bad thing people do is add code for some unspecified future purpose. Maybe they are creating a class that does some math; they need an add method and a subtract method, so they think, what the hell, I'll add a multiply and divide too. Why? All this does is make the code less readable. Never implement anything that you don't need right now.
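      That last rule sketches out very simply in code. Here is a hedged illustration in C (the Accumulator type and function names are invented for this example, not from any real codebase): today's callers need only add and subtract, so multiply and divide simply don't exist yet.

```c
/* YAGNI sketch: a little math type with only the operations callers
   actually need today.  All names here are invented for the example. */
typedef struct {
    double value;
} Accumulator;

/* Add x to the running value and return the result. */
double acc_add(Accumulator *a, double x) {
    return a->value += x;
}

/* Subtract x from the running value and return the result. */
double acc_subtract(Accumulator *a, double x) {
    return a->value -= x;
}

/* No acc_multiply or acc_divide: nothing calls them yet, so every line
   they would add is pure reading overhead. */
```

      When a real caller finally needs multiplication, adding it then costs no more than adding it now, and until then the file stays shorter and easier to read.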
      • Never implement anything that you don't need right now.

        I'd prefer something like: "Don't bother implementing things you don't know you'll ever need." That still rules out the time-wasting feature creep, while acknowledging that it's often far more efficient to plan for likely future developments from the start rather than constantly evolving a system without any sort of "grand plan".

        All evolutionary development causes overhead, most of which is unnecessary if you can anticipate major future developmen

    • I have been developing software for 25 years and was well versed in structured techniques before moving on to object orientation.

      I find I can write more (internally) complicated programs because of OO. OO-based design patterns, such as Model-View-Control for interactive programs, are a God send. I can pull off computational tricks that, without OO techniques, would make an incomprehensible and unmaintainable mess of the code.

      And by making things more powerful internally, I can write applications that ar

    • Eh... unconvincing. Organic systems are great if you don't mind a little fudging of the numbers. For example, human memory is notoriously unreliable. Combining features tends to create single points of failure and organic systems have a very narrow optimal operating environment. Try dropping the temperature by a hundred degrees or so and see how you fare. The lack of validation, if you will, on inputs, creates a significant susceptibility to viruses, bacteria, food poisoning and the like.

      There are plenty o
    • you can't slam modularity w/o being self-inconsistent.

      why? because to slam something means to put yourself outside of it in the first place (and pour invective on it in the second). putting yourself outside of something (drawing a boundary around it) is the essence of encapsulation, which is one of the techniques of modularity.

      so, let's be charitable and assume you don't want to slam modularity full-force, but rather some of the other techniques for modularity that often find themselves badly under
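      The "drawing a boundary" point works even outside OO languages. A minimal sketch of encapsulation in plain C, using the opaque-type idiom (the counter type and function names are invented for this example): callers in other files would see only the incomplete type and the four functions, never the struct layout.

```c
#include <stdlib.h>

/* Opaque-type encapsulation in plain C.  In a real project the struct
   definition lives in the .c file, and the header exposes only the
   incomplete type "struct counter" plus these four functions. */
struct counter {
    int value;   /* hidden detail: callers never touch this directly */
};

struct counter *counter_new(void) {
    return calloc(1, sizeof(struct counter));
}

void counter_inc(struct counter *c) {
    c->value++;
}

int counter_get(const struct counter *c) {
    return c->value;
}

void counter_free(struct counter *c) {
    free(c);
}
```

      Because the boundary is the four functions, the innards can change (say, to a 64-bit value) without touching any caller, which is exactly the encapsulation the parent post describes.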

  • by Teancum ( 67324 ) <robert_horning AT netzero DOT net> on Friday November 26, 2004 @05:57PM (#10927496) Homepage Journal
    The problem with trying to compare the software industry to other endeavors of human experience is that, in the realm of the computer/electronics industry most software developers are dealing with, you are near the pinnacle of dealing with abstraction of complex systems. While there may be other very complex systems that on a large scale can compare to some computer networks or CPU designs, computer science is the practice of dealing with abstractions on many levels, sometimes simultaneously.

    Indeed, electronic state-machine digital computing devices (also called computers) have proven so successful, with software handling the various levels of abstraction, that they are used to control other complex systems: air traffic management, urban water system management, freeway traffic monitoring, and law enforcement dispatching. You've seen them, and they are out there.

    Some very talented engineers have done a surprisingly good job of simplifying the tasks and reducing the abstractions to the point that, for the most part, all you need to do is plug the gizmo in and watch it do its thing. What this article in The Economist seems to be doing is complaining that the job isn't finished, and that setting up a computer system for some project is more complicated than it should be. That is primarily because the author is using products that don't comply with standards (a real problem if standards don't exist yet for a certain concept or technology), or the wrong tool for the job, like using a hammer to put in a few screws. Sure, it will work, but it is aggravating, sometimes takes quite a bit longer, and can damage things around it as a side effect. How many software/electronic gizmos out there do you know get used like that?

    While I'm willing to acknowledge that I don't know everything there is to know about the management and organization of complex systems, I would be more inclined to get the opinion on such a subject from a computer programmer than from a plumber.
    • What this article in the Economist seems to be doing is complaining that the job isn't finished, and that complexities in setting up a computer system for some project is more difficult than it should be.

      Did you notice that the author seems to be complaining that enterprise datacenters are composed of products from multiple vendors? I can't see a datacenter turning into something simple and easy - that's like expecting an assembly line to come in a consumer version.

  • I cry bullshit... (Score:5, Insightful)

    by Linegod ( 9952 ) <pasnak AT warpedsystems DOT sk DOT ca> on Friday November 26, 2004 @06:52PM (#10927763) Homepage Journal
    I hear this every damn day: 'We need to make it simple', 'It is a simple service', or 'It is a low option service'. This may work fine for the sales and marketing drones that make their commission selling unnecessary services to uninformed customers, but as long as there is _choice_ out there, the backend is going to be complicated, and somebody, somewhere is going to have to know how all of it (or at least a major part of it) works.

    Try all you want, but unless the entire IT industry decides to switch to one massive global device that we all plug _directly_ into, you can't make video conferencing/VOIP/disaster recovery/etc. through 2 LANs, 3 service providers, and 10 different security layers a 'green/red' push-button operation.

    I've gotta go get drunk now....
    • by mankey wanker ( 673345 ) on Friday November 26, 2004 @07:15PM (#10927891)
      I think I can amplify on this point. We keep hearing about "killer apps" like cell phones and so forth, and why can't computer technology be as simple as a cell phone? Etc, etc, etc...

      Hell, I can barely work my cell phone! Why? Because I am too busy keeping up with computer technologies to worry about much more than what it takes to make the cell phone place a simple call, or return one. My point is this: the cell phone is not so simple unless you REALLY want to know how to use all of its features, spend some time with the manual, and get it all down to rote.

      If people felt that way about ANY computer technology beyond email clients and browsers they'd then have the exact same enthusiasm for the computer as they do for the cell phone.

      That's the bullshit part: the idea that most other technologies are any simpler. Remind your parents/clients how they can't program the VCR either. Confirm for them that it's mostly a matter of the will to achieve a thing.
  • The Economist (Score:2, Insightful)

    I like how this post instantly spawned a flamewar on the economy, mostly fueled by people that either don't understand
    1. that The Economist is a magazine, not the people referred to in the sentence "economists say".
    2. anything about economics.
    3. both.

    The Economist is a very good news magazine, full of reasonable articles and opinions, in all senses of the word "reasonable". There is not enough praise on Slashdot to do it justice. You should all subscribe, assuming that you are interested in knowing what happens
  • by LucidBeast ( 601749 ) on Friday November 26, 2004 @08:15PM (#10928176)
    The first article paradoxically claimed that software complexity is making things difficult, but of course, as anybody who has written code knows, if you want to make things easy for the user you need to write more code. Let's say you want to delete a file:
    rm foo
    vs. today
    point the mouse at a file, open a menu and select delete; or alternatively press Del; or drag the file to the trash can.

    Code for rm could be implemented in C with a handful of lines. Today's alternatives take thousands of lines of code, but to an end user the second alternative is simple. The user doesn't have to know what the commands are; just toss the file away, as you would a solid object.
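    To make the comparison concrete, here is roughly what that handful of C lines looks like. This is a bare-bones sketch, not the real coreutils source, and rm_paths is an invented helper name:

```c
#include <stdio.h>
#include <unistd.h>

/* The core of a bare-bones rm: one unlink(2) call per argument.
   rm_paths() is an invented name for this sketch, not real rm code. */
int rm_paths(int n, char *paths[]) {
    int status = 0;
    for (int i = 0; i < n; i++) {
        if (unlink(paths[i]) != 0) {  /* remove the directory entry */
            perror(paths[i]);         /* report the failure, keep going */
            status = 1;
        }
    }
    return status;
}
```

    Wrap that in a main() that passes argv + 1 and you have the whole two-word interface: rm foo. The GUI alternative needs windowing, icon rendering, drag-and-drop, and a trash can before it can do the same job.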

    So there we have it: a simple problem becomes complex from the implementation point of view. I once delivered a customer a new system that calculated the price and basic layout of the systems they were manufacturing. He joked that in spite of the fact that the calculations now took less than a tenth of the time, we could still improve it so it would do the calculations when he pressed a button while thinking of something else.

    It should also be noted that what was impossible a few years ago is now possible because of improvements to hardware. This adds layers of complexity, because implementers can actually use a modular approach instead of optimizing at the lowest possible level.

  • by wobblie ( 191824 ) on Friday November 26, 2004 @08:19PM (#10928193)
    Main problems in IT in the US:

    1. Companies have gotten too big.
    2. Companies try to centralize everything, instead of delegating duties to competent people (this practice also encourages hiring incompetent people). Competent IT workers are also very unhappy when their hands are tied by bureaucratic BS.
    3. Relating to (2), companies don't give raises or benefits anymore, which causes competent/motivated workers to hop jobs to get increases in pay.
    4. The people doing all the "centralizing" are ignorant of standards.
  • by Animats ( 122034 ) on Friday November 26, 2004 @08:27PM (#10928216) Homepage
    Aerospace has dealt with high complexity for decades, rather more successfully than the IT industry. Here's how.
    • Interface specifications dominate. If it doesn't work the way the spec says it does, fix the box, not the spec. If A won't talk to B, run the tests to check compliance with the spec. If you can't tell who's at fault, the spec is broken. This is why you can swap a Pratt & Whitney engine for a Rolls-Royce engine.
    • The buyer, not the vendor, decides what is a "defect". One of the fundamental problems in IT is that vendors have sole discretion to decide what is a defect and what isn't. That doesn't fly in aerospace.
    • Fix blame. In aerospace, people get blamed for screwing up. You do not want your name or the name of your company to appear in an NTSB crash report. If you screw up big time, it will. Mistakes in aerospace are publicized. There's an NTSB database of 140,000 crashes [ntsb.gov]. If it was a hardware failure, the vendor is named.
    • Warranties have real meaning. Airplanes come with good warranties, and so do all the parts that go into them. Commercial software doesn't.
    This runs engineering costs way up, and the life cycles are longer, but in IT, most of the commercial products are sold in large numbers, so you get to spread that engineering cost over a large number of items.

    It's time for computing to grow up and accept this kind of discipline. The automotive industry had to accept it in the 1960s, and cars got much better within a decade.

  • From the metaphor FA:

    Naturally, this has struck fear into Microsoft, whose Windows system runs 94% of the world's PCs and which sees itself as the ruler of the desktop.

    Really? I heard that 87.4% of all statistics are made up on the spot ... I just have to wonder if this one is part of that majority...

  • by TheLink ( 130905 ) on Saturday November 27, 2004 @07:16AM (#10930147) Journal
    Most people can only keep/visualize/instantiate about 7 distinct objects in short-term memory at any one time. You might want to stick to a max of 5 to cater to the slightly below average. (Maybe someone might want to be a hero and try to increase the population average to 10 items or something ;) )

    There are workarounds if you can easily get people to group a bunch of items as one object.

    If people have to remember 7 or more important things during the learning period, where short-term memory is used, then the thing is hard.

    Given the amount of choices and options possible with software, it's going to be hard.

    So the workaround is to split the learning into small, absorbable phases and give people time between them to commit things to long-term memory.

    If you use common/de facto UI standards, many users will already be familiar with them, and so the effective number of learning phases required drops.
  • by Linuxathome ( 242573 ) on Sunday November 28, 2004 @01:53AM (#10935568) Homepage Journal
    On the lighter side of things, I find it funny that IT departments often "re-invent" themselves by changing their names and acronyms--a complexity in itself. You find acronyms such as MIS, CLC, ITC, CIT, CITS, etc. But in essence they all stand for the same thing.

    My university's acronym is CITS (Computer and Information Technology Systems), and before that they were just the CLC (Computer Learning Center). Imagine if they kept the name "Learning" in the acronym somewhere, it could've been: Computer Learning and Information Technology Systems (CLITS). But somehow I don't see that happening.

"I've seen it. It's rubbish." -- Marvin the Paranoid Android
