IT Snake Oil — Six Tech Cure-Alls That Went Bunk

snydeq writes "InfoWorld's Dan Tynan surveys six 'transformational' tech-panacea sales pitches that have left egg on at least some IT department faces. Each of the six technologies — five old, one new — earned the dubious distinction of being the hype king of its respective era, falling far short of its legendary promises. Consultant greed, analyst oversight, dirty vendor tricks — 'the one thing you can count on in the land of IT is a slick vendor presentation and a whole lot of hype. Eras shift, technologies change, but the sales pitch always sounds eerily familiar. In virtually every decade there's at least one transformational technology that promises to revolutionize the enterprise, slash operational costs, reduce capital expenditures, align your IT initiatives with your core business practices, boost employee productivity, and leave your breath clean and minty fresh.' Today, cloud computing, virtualization, and tablet PCs are vying for the hype crown." What other horrible hype stories do our seasoned vets have?
  • by eldavojohn ( 898314 ) * <eldavojohn@noSpAM.gmail.com> on Monday November 02, 2009 @02:15PM (#29952666) Journal

    The bad news is that artificial intelligence has yet to fully deliver on its promises.

    Only idiots, marketers, businessmen and outsiders ever thought we would be completely replaced by artificially intelligent machines. The people actually putting artificial intelligence into practice knew that AI, like so many other things, would benefit us in small steps. Many forms of automation are technically artificial intelligence; it's just very simple artificial intelligence. While you might want to argue that the things we benefit from are heuristics, statistics and messes of if/then decision trees, successful AI is nothing more than that. Everyone reading this enjoys the benefits of AI, probably without knowing it. For instance, your hand written mail is most likely read by a machine that uses optical character recognition to decide where it goes with a pretty good success rate and confidence factor to fail over to humans. (A sketch of that kind of confidence-based failover appears at the end of this comment.) Recommendation systems are often based on AI algorithms. I mean, the article even says this:

    The ability of your bank's financial software to detect potentially fraudulent activity on your accounts or alter your credit score when you miss a mortgage payment are just two of many common examples of AI at work, says Mow. Speech and handwriting recognition, business process management, data mining, and medical diagnostics -- they all owe a debt to AI.

    Having taken several courses on AI, I never found a contributor to the field that promised it to be the silver bullet -- or even remotely comparable to the human mind. I don't ever recall reading anything other than fiction claiming that humans would soon be replaced completely by thinking machines.

    In short, I don't think it's fair to put it on this list, as it has had success. It's easy to dismiss AI if the only person you hear talking about it is the cult-like Ray Kurzweil, but I assure you the field is a valid one [arxiv.org] (unlike CASE or ERP). AI will never die, because the list of applications -- though small -- slowly but surely grows. It has not gone 'bunk' (whatever the hell that means [wiktionary.org]). You can say expert systems have failed to keep their promises, but not AI on the whole. The only thing that's left a sour taste in your mouth is salesmen and businessmen promising something they simply cannot deliver on. And that's neither new nor specific to AI.
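
    For anyone who wants to picture that failover step, here is a minimal sketch in Python. The classifier, threshold, and function names are hypothetical stand-ins, not any postal service's actual system:

        # Sketch: confidence-based failover from an OCR classifier to a human
        # keying station. `classify` is assumed to return (predicted_zip, confidence).
        def route_mail_piece(image, classify, confidence_threshold=0.95):
            predicted_zip, confidence = classify(image)
            if confidence >= confidence_threshold:
                return ("machine", predicted_zip)   # sorter trusts the OCR result
            return ("human", None)                  # low confidence: fail over to a person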

    • The same defense applies to pretty much all of these (except maybe CASE).

    • by John Whitley ( 6067 ) on Monday November 02, 2009 @02:28PM (#29952850) Homepage

      The people actually putting artificial intelligence into practice knew that AI, like so many other things, would benefit us in small steps.

      Actually, there was a period very early on ('50s) when it was naively thought that "we'll have thinking machines within five years!" That's a paraphrase from a now-hilarious film reel interview with an MIT prof from the early 1950s. A film reel which was shown as the first thing in my graduate level AI class, I might add. Sadly, I no longer have the reference to this clip.

      One major lesson was that there's an error in thinking "surely solving hard problem X must mean we've achieved artificial intelligence." As each of these problems fell (a computer passing the freshman calc exam at MIT, a computer beating a chess grandmaster, and many others), we realized that the solutions were simply due to understanding the problem and designing appropriate algorithms and/or hardware.

      The other lesson from that first day of AI class was that the above properties made AI into the incredible shrinking discipline: its successes weren't recognized as "intelligence", but they often did spawn entire new disciplines of powerful problem solving that are used everywhere today. So "AI" research gets no credit, even though its researchers have made great strides for computing in general.

      • by Chris Burke ( 6130 ) on Monday November 02, 2009 @03:10PM (#29953388) Homepage

        A film reel which was shown as the first thing in my graduate level AI class, I might add. Sadly, I no longer have the reference to this clip.

        Heh. Day 1 of my AI class, the lecture was titled: "It's 2001 -- where's HAL?"

        The other lesson from that first day of AI class was that the above properties made AI into the incredible shrinking discipline: its successes weren't recognized as "intelligence", but they often did spawn entire new disciplines of powerful problem solving that are used everywhere today. So "AI" research gets no credit, even though its researchers have made great strides for computing in general.

        Yeah that's when the prof introduced the concept of "Strong AI" (HAL) and "Weak AI" (expert systems, computer learning, chess algorithms etc). "Strong" AI hasn't achieved its goals, but "Weak" AI has been amazingly successful, often due to the efforts of those trying to invent HAL.

        Of course the rest of the semester was devoted to "Weak AI". But it's quite useful stuff!

    • ``Having taken several courses on AI, I never found a contributor to the field that promised it to be the silver bullet -- or even remotely comparable to the human mind.''

      The problem is that, if it isn't that, then what is "artificial intelligence", rather than flashy marketing speak for just another bunch of algorithms?

      • Re: (Score:3, Insightful)

        by DrVomact ( 726065 )

        The problem is that, if it isn't that, then what is "artificial intelligence", rather than flashy marketing speak for just another bunch of algorithms?

        Exactly. "Artificial intelligence" seems to serve various purposes—at best vacuous and at worst deceptive. How many millions of dollars have academicians raked in for various projects that involve research into "artificial intelligence"?

        What makes all this silliness sustainable is the philosophical fog that surrounds words such as "intelligence" and "thinking". Such words easily slip their moorings in our common language, and acquire some very strange uses. Yet, because they are recognizably legitima

      • Re: (Score:3, Insightful)

        by cecille ( 583022 )
        Maybe. But I think it's mostly just a disconnect between what the people who work in the field believe the term to mean and what the general public takes the term to mean. Some of that might just be naivete on the part of researchers. And maybe some bravado as well.

        When I hear about intelligent anything to do with computers, I just think of a system that learns. That, to me, is the key differentiator. On the other hand, my mom's friend was telling me one night at dinner that her son was taking a cla
      • by toriver ( 11308 ) on Monday November 02, 2009 @05:37PM (#29955396)

        Do you also complain when airplanes don't flap their wings? (Sci-fi's Ornithopters exempted, of course.)

        Knowledge systems/rules engines and neural networks can deduce answers, that is sufficient to be labeled "intelligence" in my book.

        • Re: (Score:3, Insightful)

          by LtGordon ( 1421725 )
          I'll call it truly intelligent when the computer can design its own algorithms. In the meantime, following an if/then tree is pretty weak AI.
    • by Wonko the Sane ( 25252 ) * on Monday November 02, 2009 @02:41PM (#29953010) Journal

      ERP could work if the vendors would realistically deal with GIGO.

      Unless you lock down the permissions so tightly that the system is unusable, your users will enter bad data. They'll add new entries for objects that already exist, they'll misspell the name of an object and then create a new object instead of editing the one they just created. They'll make every possible data entry error you can imagine, and plenty that you can't.

      We'd see a lot more progress in business software applications if all vendors would follow two rules:

      1. Every piece of data that comes from the user must be editable in the future.
      2. Any interface that allows a user to create a new database entry MUST provide a method to merge duplicate entries.
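
      A minimal sketch of what rule 2 can look like in practice (the schema and field names here are hypothetical; a real ERP system has far more cross-references to repoint):

          # Sketch: merging a duplicate master-data entry by repointing references.
          def merge_entries(records, references, keep_id, duplicate_id):
              # `records` maps entry id -> entry data; `references` is a list of
              # dicts (e.g. order lines) that point at entries via "entry_id".
              if keep_id not in records or duplicate_id not in records:
                  raise KeyError("both entries must exist before merging")
              for ref in references:
                  if ref["entry_id"] == duplicate_id:
                      ref["entry_id"] = keep_id     # repoint before deleting
              del records[duplicate_id]             # only now is the duplicate safe to drop
              return records, references
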
    • It's just that when some aspect of symbolic computing is successful, it's not really considered AI anymore and the goal changes. Or, alternatively, any computing technology that emerged from an AI laboratory was considered AI-ish.

      Some researchers divided this into "soft" and "hard" AI. The latter would be something like a conversational, human-like mentality. The former is any software technology along the way.
    • Having taken several courses on AI, I never found a contributor to the field that promised it to be the silver bullet -- or even remotely comparable to the human mind.

      Douglas Lenat [wikipedia.org], perhaps?

    • by Animats ( 122034 ) on Monday November 02, 2009 @02:58PM (#29953204) Homepage

      Having taken several courses on AI, I never found a contributor to the field that promised it to be the silver bullet -- or even remotely comparable to the human mind.

      Not today, after the "AI Winter". But when I went through Stanford CS in the 1980s, there were indeed faculty members proclaiming in print that strong AI was going to result from expert systems Real Soon Now. Feigenbaum was probably the worst offender. His 1984 book, The Fifth Generation [amazon.com] (available for $0.01 through Amazon.com) is particularly embarrassing. Expert systems don't really do all that much. They're basically a way to encode troubleshooting books in a machine-processable way. What you put in is what you get out.

      Machine learning, though, has made progress in recent years. There's now some decent theory underneath. Neural nets, simulated annealing, and similar ad-hoc algorithms have been subsumed into machine learning algorithms with solid statistics underneath. Strong AI remains a long way off.

      Compute power doesn't seem to be the problem. Moravec's classic chart [cmu.edu] indicates that today, enough compute power to do a brain should only cost about $1 million. There are plenty of server farms with more compute power and far more storage than the human brain. A terabyte drive is now only $199, after all.

    • by ceoyoyo ( 59147 ) on Monday November 02, 2009 @03:02PM (#29953256)

      "CASE" isn't entirely bunk either. CASE as CASE might be, but computer aided software design isn't. Perhaps most here are now too young to remember when, if you wanted a GUI, you had to design it by hand, positioning all the elements manually in code and then linking things up manually, in code.

      Now almost nobody designs a GUI without a RAD tool of some kind. You drop your GUI elements on the window and the tool generates code stubs for the interaction. That's way, way nicer, and way, way faster than, for example, setting up transfer records for a Windows 3.1 form.
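
      For anyone too young to have lived it, here is roughly what "by hand" means, sketched with Python's Tkinter (a far friendlier toolkit than the Windows 3.1-era ones, so this actually understates the pain). Every widget is positioned and every handler wired in code -- exactly the boilerplate a RAD tool generates from a drag-and-drop design:

          # Sketch: manual GUI layout and wiring, no designer tool involved.
          import tkinter as tk

          def on_submit():
              print("submitted:", entry.get())

          root = tk.Tk()
          root.title("Manual layout")
          tk.Label(root, text="Name:").grid(row=0, column=0, padx=4, pady=4)
          entry = tk.Entry(root, width=24)
          entry.grid(row=0, column=1, padx=4, pady=4)
          tk.Button(root, text="Submit", command=on_submit).grid(row=1, column=1, sticky="e", padx=4, pady=4)
          root.mainloop()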

    • Re: (Score:3, Funny)

      by stinerman ( 812158 )

      Only idiots, marketers, businessmen

      You repeat yourself, Mr. eldavojohn.

    • by ghostlibrary ( 450718 ) on Monday November 02, 2009 @03:29PM (#29953652) Homepage Journal

      AI already has successes. But, as an AI researcher friend of mine points out, once they succeed, it's no longer 'AI'. Things like packet routing used to be AI. Path-finding, as in games, or route-finding, as with GPS: solved. So yes, AI will never arrive, because AI is always 'other than the AI we already have.'

    • Re: (Score:3, Interesting)

      by Jesus_666 ( 702802 )

      For instance, your hand written mail is most likely read by a machine that uses optical character recognition to decide where it goes with a pretty good success rate and confidence factor to fail over to humans.

      In fact, the Deutsche Post (Germany's biggest mail company) uses a neural network to process hand-written zip codes. It works rather well, as far as I know. Classic AI, too.

      Plus, spam filters. Yes, they merely use a glorified Bayes classifier but, well... learning classifiers are a part of AI. Low-
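
      And for the curious, a "glorified Bayes classifier" really is only a few lines. A toy sketch in Python (add-one smoothing, naive whitespace tokenization, nothing you would deploy):

          # Toy naive Bayes spam scorer: per-class word counts plus log-probabilities.
          import math
          from collections import Counter

          def train(messages):
              # `messages` is a list of (text, label) pairs, label in {"spam", "ham"}.
              counts = {"spam": Counter(), "ham": Counter()}
              totals = Counter()
              for text, label in messages:
                  counts[label].update(text.lower().split())
                  totals[label] += 1
              return counts, totals

          def classify(text, counts, totals):
              scores = {}
              for label in ("spam", "ham"):
                  score = math.log((totals[label] + 1) / (sum(totals.values()) + 2))  # prior
                  n, vocab = sum(counts[label].values()), len(counts[label]) + 1
                  for word in text.lower().split():
                      score += math.log((counts[label][word] + 1) / (n + vocab))      # likelihood
                  scores[label] = score
              return max(scores, key=scores.get)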

  • by mveloso ( 325617 ) on Monday November 02, 2009 @02:19PM (#29952708)

    Not sure why virtualization made it into the potential snake-oil of the future. It's demonstrating real benefits today...practically all of the companies I deal with have virtualized big chunks of their infrastructure.

    I'd vote for cloud computing, previously known as utility computing. It's a lot more work than expected to offload processing outside your organization.

    • by i.r.id10t ( 595143 ) on Monday November 02, 2009 @02:25PM (#29952804)

      Yup, even for "just" development, virtualization has been a great gift. With one or two beefy machines, each developer can have an exact mirror of a production environment, and not cause issues on the production side or even for other developers while testing code and such.

    • by Anonymous Coward on Monday November 02, 2009 @02:26PM (#29952808)

      because virtualization only works for large companies with many, many servers, yet contractors and vendors sell it to any company with a couple of servers. You should virtualize your email ($2,000 by itself, give or take a little), web server ($2,000 by itself, give or take a little), source control ($1,000 by itself, give or take a little), and a couple of others. So you have maybe $10,000 in 5 to 6 servers needed to run a small to mid-size company, and you spend tens of thousands to put them on one super-server running a complex setup of virtualized servers...oh no, the motherboard died and the entire biz is offline.

      Virtualization has its place, but only at the larger companies.

      • Re: (Score:3, Insightful)

        by jedidiah ( 1196 )

        You could replace "virtualization" with "mainframe" or "big unix server" and still have the same issue.

        You would also end up with similar approaches to the problem. With some of these (mainframe), virtualization has been mundane/commonplace for decades.

      • by VoidEngineer ( 633446 ) on Monday November 02, 2009 @02:37PM (#29952960)
        Having been involved in a business start-up for a year or so now, I'd have to disagree. Virtualization is indispensable for QA testing. Being able to run a virtual network on a personal PC lets me design, debug, and do proofs of concept without requiring the investment in actual equipment. Virtualization isn't just about hardware consolidation: it's also about application portability. Small companies have just as much need for QA testing, hardware recycling, and application portability as the large ones.
      • by __aagmrb7289 ( 652113 ) on Monday November 02, 2009 @03:04PM (#29953296) Journal

        Spoken like someone who investigated the technology five years ago, and hasn't updated their information since.

        1. If a small business is running more than two servers, then it's likely it'll be cheaper, over the next five years, to virtualize those servers.
        2. If a small business needs any sort of guaranteed uptime, it's cheaper to virtualize - two machines and high availability with VMWare, and you are good to go.
        3. Setting up VMWare, for example, is relatively simple, and actually makes remote management easier, since I have CONSOLE access from remote sites to my machine. Need to change the network connection or segment for a machine remotely? You can't do it safely without virtualization.

        There is more, but I recommend you check this out again, before continuing to spout this stuff. It's just not true anymore.

      • by Just Some Guy ( 3352 ) <kirk+slashdot@strauser.com> on Monday November 02, 2009 @03:19PM (#29953508) Homepage Journal

        because virtualization only works for large companies with many, many servers

        You're full of crap. At my company, a coworker and I are the only ones handling the virtualization for a single rackful of servers. He virtualizes Windows stuff because of stupid limitations in so much of the software. For example, we still use a lot of legacy FoxPro databases. Did you know that MS's own FoxPro client libraries are single-threaded and may only be loaded once per instance, so that a Windows box is only capable of executing one single query at a time? We got around that by deploying several virtualized instances and querying them round-robin. It's not perfect, but works as well as anything could given that FoxPro is involved in the formula. None of those instances need to have more than about 256MB of RAM or any CPU to speak of, but we need several of them. While that's an extreme example, it illustrates the point: sometimes with Windows you really want a specific application to be the only thing running on the machine, and virtualization gives that to us.
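
        The round-robin part is nothing exotic. A sketch of the dispatch logic in Python, with hypothetical connection objects standing in for whatever client library actually talks to each instance:

            # Sketch: rotate queries across several single-threaded instances so
            # they don't all pile onto one box.
            import itertools
            import threading

            class RoundRobinPool:
                def __init__(self, connections):
                    self._cycle = itertools.cycle(connections)
                    self._lock = threading.Lock()

                def query(self, sql):
                    with self._lock:
                        conn = next(self._cycle)   # pick the next instance in turn
                    return conn.execute(sql)       # assumed to block for one query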

        I do the same thing on the Unix side. Suppose we're rolling out a new Internet-facing service. I don't really want to install it on the same system as other critical services, but I don't want to ask my boss for a new 1U rackmount that will sit with a load average of 0.01 for the next 5 years. Since we use FreeBSD, I find a lightly-loaded server and fire up a new jail instance. Since each jail only requires the disk space to hold software that's not part of the base system, I can do things like deploying a Jabber server in its own virtualized environment in only 100MB.

        I don't think our $2,000 Dell rackmounts count as "super-servers" by any definition. If we have a machine sitting there mostly idle, and can virtualize a new OS instance with damn near zero resource waste that solves a very real business or security need, then why on earth not, other than because it doesn't appeal to the warped tastes of certain purists?

      • Re: (Score:3, Interesting)

        by rhsanborn ( 773855 )
        I disagree. There are some real benefits for smaller companies who can afford to virtualize, more or less depending on the types of applications. Yes, I can buy one server to run any number of business critical applications, but I've seen, in most cases, that several applications are independently business critical and needed to be available at least for the full business day or some important aspect of the company was shut down. So while a single virtual server running everything sucks, you really can get
    • Yeah, I don't think this stuff can simply be called "snake oil". ERP systems are in use. They're not a cure-all, but failing to fix every problem doesn't make a thing useless. The current usefulness of "artificial intelligence" depends on how you define it. There are some fairly complex statistical analysis systems that are already pretty useful. Full on AI just doesn't exist yet, and we can't even quite agree on what it would be, but it would likely have some use if we ever made it.

      Virtualization is

    • Re: (Score:3, Interesting)

      Today, cloud computing, virtualization, and tablet PCs are vying for the hype crown. At this point it's impossible to tell which claims will bear fruit, and which will fall to the earth and rot.

      I agree with your post (not the article) - these technologies have all had success in the experimental fields in which they've been applied, but ESPECIALLY virtualization, which is way past experimenting and is starting to become so big in the workplace that I've started using it at home. No need to set up a dual boot with virtualization, and the risk of losing data is virtually removed (pun intended) because anytime the virtual machine gets infected you just overwrite it with yesterday's backup. No need to s

    • ``Not sure why virtualization made it into the potential snake-oil of the future. It's demonstrating real benefits today...practically all of the companies I deal with have virtualized big chunks of their infrastructure.''

      I am sure they have, but does it actually benefit them? In many cases, it seems to me, it's just people trying their best to come up with problems, just so they can apply virtualization as a solution.

      • Re: (Score:3, Interesting)

        by afidel ( 530433 )
        It saved us from having to do a $1M datacenter upgrade so yeah, I'd say it benefited us.
    • Re: (Score:3, Insightful)

      I think the issue I have with both virtualization and cloud computing is a lack of concrete assessment. They are touted as wunder-technologies, and while they have their place and their use, a lot of folks are leaping into them with little thought as to how they integrate into existing technologies and the kind of overhead (hardware, software, wetware) that will go with it.

      Virtualization certainly has some great uses, but I've seen an increasing number of organizations thinking they can turf their server r

      • by digitalhermit ( 113459 ) on Monday November 02, 2009 @03:14PM (#29953442) Homepage

        I administer hundreds of virtual machines and virtualization has solved a few different problems while introducing others.

        Virtualization is often sold as a means to completely utilize servers. Rather than having two or three applications on two or three servers, virtualization would allow condensing those environments onto one large server, saving power and data center floor space, plus allowing all the other benefits (virtual console, ease of backup, ease of recovery, etc.).

        In one sense it did solve the under-utilization problem. Well, actually it worked around the problem. The actual problem was often that certain applications were buggy and did not play well with other applications. If the application crashed it could bring down the entire system. I'm not picking on Windows here, but in the past the Windows systems were notorious for this. Also, PCs were notoriously unreliable (but they were cheap, so we weighed the cost/reliability). To "solve" the problem, applications were segregated to separate servers. We used RAID, HA, clusters, etc., all to get around the problem of unreliability.

        Fast forward a few years and PCs are a lot more reliable (and more powerful), but we still have this mentality that we need to segregate applications. So rather than fixing the OS we work around it by virtualizing. The problem is that virtualization can have significant overhead. On Power/AIX systems, the hypervisor and management required can eat up 10% or more of RAM and processing power. Terabytes of disk space are eaten up across virtual machines in multiple copies of the OS, swap space, etc. Even with dynamic CPU and memory allocation, systems have significant wasted resources. It's getting better, but still only partially addresses the problem of under-utilization.

        So what's the solution? Maybe a big, highly reliable box with multiple applications running? Sound familiar?

    • Re: (Score:3, Interesting)

      Yes, it helps, but it really only helps with under-utilized hardware (and this is really only a problem in Microsoft shops). It doesn't help at all with OS creep; in fact, it makes it worse by making the upfront costs of allocating new "machines" very low; however, it has been and continues to be marketed as a cure-all, which is where the snake oil comes in. VMware's solution to OS creep: run tiny stripped-down VMs with an RPC-like management interface (that will naturally only work with vSphere) so that the V
  • by Known Nutter ( 988758 ) on Monday November 02, 2009 @02:24PM (#29952770)
    very disappointed that the word "synergy" did not appear in either linked article or the summary.
  • My Meta-assessment (Score:4, Interesting)

    by Anonymous Coward on Monday November 02, 2009 @02:25PM (#29952790)

    IT snake oil: Six tech cure-alls that went bunk
    By Dan Tynan
    Created 2009-11-02 03:00AM

    Today, cloud computing [4], virtualization [5], and tablet PCs [6] are vying for the hype crown. At this point it's impossible to tell which claims will bear fruit, and which will fall to the earth and rot.

    [...]

    1. Artificial intelligence
    2. Computer-aided software engineering (CASE)
    3. Thin clients
    4. ERP systems
    5. B-to-b marketplaces
    6. Enterprise social media

    1. AI: Has to have existed before it can be "bunk"
    2. CASE: Regarding Wikipedia [wikipedia.org], it seems to be alive and kicking.
    3. Thin Clients: Tell that to the guys over at TiVo that thin-client set-top-boxes are bunk.
    4. ERP Systems: For low complexity companies, I don't see why ERP software isn't possible.
    5. Web B2B: He is right about this one.
    6. Social media: Big companies like IBM have been doing "social media" within their organization for quite some time. It's just a new name for an old practice.

    And as far as his first comment,

    "Today, cloud computing [4], virtualization [5], and tablet PCs [6] are vying for the hype crown. At this point it's impossible to tell which claims will bear fruit, and which will fall to the earth and rot."

    [4] Google.
    [5] Data Servers.
    [6] eBooks and medical applications.

    • by Tablizer ( 95088 ) on Monday November 02, 2009 @02:35PM (#29952936) Journal

      There's a pattern here. Many of the hyped technologies eventually find a nice little niche. It's good to experiment with new things to find out where they might fit in or teach us new options. The problem comes when they are touted as a general solution to most IT ills. Treat them like the religious dudes who knock on your door: go ahead and talk to them for a while on the porch, but don't let them into the house.

    • by jedidiah ( 1196 )

      > 3. Thin Clients: Tell that to the guys over at TiVo that thin-client set-top-boxes are bunk.

      Nevermind the Tivo. Web based "thin client computing" has been on the rise in corporate computing for over 10 years now. There are a lot of corporate Windows users that use what is essentially a Windows based dumb terminal. Larger companies even go out of their way to make sure that changing the setup on your desktop office PC is about as hard as doing the same to a Tivo.

      Client based computing (java or .net) is

      • by jimicus ( 737525 )

        > 3. Thin Clients: Tell that to the guys over at TiVo that thin-client set-top-boxes are bunk.

        Nevermind the Tivo. Web based "thin client computing" has been on the rise in corporate computing for over 10 years now. There are a lot of corporate Windows users that use what is essentially a Windows based dumb terminal. Larger companies even go out of their way to make sure that changing the setup on your desktop office PC is about as hard as doing the same to a Tivo.

        Client based computing (java or .net) is infact "all the rage".

        They've been doing that for years. Strangely, even when your desktop PCs are locked down so tight they may as well be dumb terminals, a lot of people will still scream blue murder if it really is a dumb terminal being put on their desk.

        • Re: (Score:3, Insightful)

          So don't tell them it's a dumb terminal. Put a thin client on their desk and tell them they're getting a 6 GHz octocore with 32 gigs of RAM and a petabyte hard drive. They'll never know. Most of them, anyway.
    • Re: (Score:3, Interesting)

      by Mike Buddha ( 10734 )

      2. CASE: Regarding Wikipedia [wikipedia.org], it seems to be alive and kicking.

      As a programmer, CASE sounds pretty neat. I think it probably won't obviate the need for programmers any time soon, but it has the potential to automate some of the more tedious aspects of programming. I'd personally rather spend more of my time designing applications and less time hammering out the plumbing. It's interesting to note that I'm familiar with a lot of the CASE tools in that Wikipedia article, although they were never referred to as CASE tools when I was learning how to use them. I think the CA

  • The Cloud (Score:5, Funny)

    by Anonymous Coward on Monday November 02, 2009 @02:25PM (#29952798)

    It has vaporware all over it.

    • Re: (Score:2, Funny)

      by Shikaku ( 1129753 )

      Clouds are actually water vapors. So it literally is vaporware.

      • by syousef ( 465911 )

        Clouds are actually water vapors. So it literally is vaporware.

        ...and since it's water vapour, it's no surprise that letting it anywhere near your computer hardware will instantly make that hardware go on the fritz.

  • by cybergrue ( 696844 ) on Monday November 02, 2009 @02:26PM (#29952830)
    It arises when the salesman tells the clueless management that "This product will solve all your problems!"

    Bonus points if the salesman admits that he doesn't need to know your problems before selling it to you.

  • by WormholeFiend ( 674934 ) on Monday November 02, 2009 @02:27PM (#29952846)

    Let's just say the technology is not quite there yet.

    • Re: (Score:3, Funny)

      "Let's just say the technology is not quite there yet"

      aka

      "Pertaining to the acceptability, us, speaking of the mechanical acumen almost has arrived, still"

  • I was surprised to find ERP on this list. Sure, it's a huge effort and always oversold, but there's hardly a large manufacturing company out there that could survive without some sort of basic ERP implementation.

    • Re: (Score:3, Insightful)

      The fundamental problem with ERP systems is that they are integrated and implemented by the second tier of folks in the engineering pecking order. Couple that fact with an aggressive sales force that would sell ice to eskimos and you've got a straight road to expensive failure.
      • Re:ERP? (Score:4, Interesting)

        by smooth wombat ( 796938 ) on Monday November 02, 2009 @03:01PM (#29953242) Journal
        you've got a straight road to expensive failure.

        Sing it brother (or sister)! As one who is currently helping to support an Oracle-based ERP project, expensive doesn't begin to describe how much it's costing us. Original estimated cost: $20 million. Last known official number I heard for current cost: $46 million. I'm sure that number is over $50 million by now.

        But wait, there's more. We bought an off-the-shelf portion of their product and of course have to shoe-horn it to do what we want. There are portions of our home-grown process that aren't yet implemented and probably won't be implemented for several more months even though those portions are a critical part of our operations.

        But hey, the people who are "managing" the project get to put it on their résumé and act like they know what they're doing, which is all that matters.

        an aggressive sales force that would sell ice to eskimos

        I see you've read my column [earthlink.net].
        • Re: (Score:3, Interesting)

          by q2k ( 67077 )

          Wow, an actively maintained ~ (tilde) web site. I don't think I've seen one of those since about 2002 ;) Your column is spot on.

  • by assemblerex ( 1275164 ) on Monday November 02, 2009 @02:33PM (#29952894)
    That went over real well once they saw user visits drop by almost half...
  • Expert systems (Score:4, Insightful)

    by michael_cain ( 66650 ) on Monday November 02, 2009 @02:35PM (#29952928) Journal
    Within limits, expert systems seem to work reasonably well. Properly trained software that examines x-ray images has been reported to have better accuracy than humans at diagnosing specific problems. The literature seems to suggest that expert systems for medical case diagnosis are more accurate than doctors and nurses, especially tired doctors and nurses. OTOH, patients have an intense dislike of such systems, particularly the diagnosis software, since it can seem like an arbitrary game of "20 Questions". Of course, these are tools that help the experts do their job better, not replacements for the expert people themselves.
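
    As a toy illustration of why such a system can feel like "20 Questions", here is a minimal rule-matching sketch in Python (the rules are invented for illustration, nothing clinical about them):

        # Toy expert-system loop: ask about symptoms until some rule's
        # conditions are all satisfied.
        RULES = {
            "common cold": {"runny nose", "sore throat"},
            "flu": {"fever", "muscle aches", "sore throat"},
            "allergy": {"runny nose", "itchy eyes"},
        }

        def diagnose(ask):
            # `ask(symptom)` returns True or False; answers are cached so the
            # patient is never asked the same question twice.
            answers = {}
            def remember(symptom):
                if symptom not in answers:
                    answers[symptom] = ask(symptom)
                return answers[symptom]
            for condition, symptoms in RULES.items():
                if all(remember(s) for s in symptoms):
                    return condition
            return "no rule matched -- refer to a human expert"

        # Example: diagnose(lambda s: s in {"runny nose", "itchy eyes"}) returns "allergy"
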
  • By the description in here the cloud didn't work because:

    Worse, users resented giving up control over their machines, adds Mike Slavin, partner and managing director responsible for leading TPI's Innovation Center. "The technology underestimated the value users place upon having their own 'personal' computer, rather than a device analogous -- stretching to make a point here -- to the days of dumb terminals," he says.

    So why does it look good now? Oh right, different people heard the setup and a new generation gets suckered by it.

    • Re: (Score:3, Insightful)

      by abigor ( 540274 )

      Because cloud computing doesn't require a thin client? The two things aren't related at all. Offloading processing and data makes perfect sense for many applications.

  • by harmonise ( 1484057 ) on Monday November 02, 2009 @02:36PM (#29952940)

    This is a bit OT but I wanted to say that snydeq deserves a cookie for linking to the print version. I can only imagine that the regular version is at least seven pages. I hope slashdot finds a way to reward considerate contributors such as him or her for making things easy for the rest of us.

  • by E. Edward Grey ( 815075 ) on Monday November 02, 2009 @02:39PM (#29952976)

    I don't know of a single IT department that hasn't been helped by virtualization of servers. It makes more efficient use of purchased hardware, keeps businesses from some of the manipulations to which their hardware and OS vendors can subject them, and is (in the long term) cheaper to operate than a traditional datacenter. IT departments have wondered for a long time: "if I have all this processing power, memory, and storage, why can't I use all of it?" Virtualization answers that question, and does it in an elegant way, so I don't consider it snake oil.

  • The crazy hottie (Score:5, Interesting)

    by GPLDAN ( 732269 ) on Monday November 02, 2009 @02:42PM (#29953024)
    I kind of miss the crazy hotties that used to pervade the network sales arena. I won't even name the worst offenders, although the worst started with the word cable. They would go to job fairs and hire the hottest birds, put them in the shortest skirts and low-cut blouses, usually white with black push-up bras - and send them in to sell you switches.

    It was like watching the cast of a porn film come visit. Complete with the sleazebag regional manager, some of them even had gold chains on. Pimps up, big daddy!

    They would laugh at whatever the customer said wildly, even if it wasn't really funny. The girls would bat their eyelashes and drop pencils. It was so ridiculous it was funny, it was like a real life comedy show skit.

    I wonder how much skimming went on in those days. Bogus purchase orders, fake invoices. Slap and tickle. The WORST was if your company had no money to afford any of the infrastructure and the networking company would get their "capital finance" team involved. Some really seedy slimy stuff went down in the dot-com boom. And not just down pantlegs, either.
    • Re: (Score:3, Insightful)

      by rossz ( 67331 )

      I kind of miss the crazy hotties that used to pervade the network sales arena. I won't even name the worst offenders, although the worst started with the word cable. They would go to job fairs and hire the hottest birds, put them in the shortest skirts and low-cut blouses, usually white with black push-up bras - and send them in to sell you switches.

      Booth babes are the best thing about trade shows.

  • by loftwyr ( 36717 ) on Monday November 02, 2009 @02:44PM (#29953054)
    Most of the technologies in the article were overhyped but almost all have had real value in the marketplace.

    For example, AI works and is a very strong technology, but only the SF authors and idiots expect their computer to have a conversation with them. Expert systems (a better name) or technologies that are part of them are in place in thousands of back-office systems.

    But, if you're looking for HAL, you have another 2001 years to wait. Nobody is seriously working toward that, except as a dream goal. Everybody wants a better prediction model for the stock market first.
  • I got interested in AI in the early 90's and even then the statements made in the article were considered outrageous by people who actually knew what was going on. I use AI on a daily basis, from OCR to speech and gesture recognition. Even my washing machine claims to use it. Not quite thinking for us and taking over the world, but give it some time :).

    Same with thin clients. Just today I put together a proposal for three 100 seat thin client (Sunray) labs. VDI allows us to use Solaris, multiple Lin
  • by turing_m ( 1030530 ) on Monday November 02, 2009 @02:51PM (#29953136)

    Apparently it cures everything but RSI.

  • Those aren't all (Score:5, Insightful)

    by HangingChad ( 677530 ) on Monday November 02, 2009 @02:59PM (#29953210) Homepage

    We used to play buzzword bingo when vendors would come in for a show. Some of my personal favorites:

    IT Best Practices - Has anyone seen my big book of best practices? I seem to have misplaced it. But that never stopped vendors from pretending there was an IT bible out there that spelled out the procedures for running an IT shop. And always it was their product at the core of IT best practices.

    Agile Computing - I never did figure that one out. This is your PC, this is your PC in spin class.

    Lean IT - Cut half your staff and spend 3x what you were paying them to pay us for doing the exact same thing only with worse service.

    Web 2.0 - Javascript by any other name is still var rose.

    SOA - What a gold mine that one was. Calling it "web services" didn't command a very high premium. But tack on a great acronym like SOA and you can charge lots more!

    All those are just ways for vendors and contractors to make management feel stupid and out of touch. Many management teams don't need any help in that arena; most of them are already out of touch before the vendor walks in. Which is exactly why they're not running back to their internal IT people to ask why installing Siebel is a really BAD idea. You can't fix bad business practices with technology. Fix your business practices first, then find the solution that best fits what you're already doing.

    And whoever has my IT Best Practices book, please bring it back. Thanks.

  • by FranTaylor ( 164577 ) on Monday November 02, 2009 @03:05PM (#29953330)

    "Artificial intelligence" - what's keeping the spam out of YOUR inbox? How does Netflix decide what to recommend to you? Ever gotten directions from Google Maps?

    "Computer-aided software engineering" - tools like valgrind, findbugs, fuzzing tools for finding security problems.

    "Thin clients" - ever heard of a "Web Browser"?

    "Enterprise social media" - That really describes most of the Internet

    As soon as I saw an opinion from "Ron Enderle" I knew this story would be BS.

  • by bazorg ( 911295 ) on Monday November 02, 2009 @03:17PM (#29953480)
    Funny the bit about ERP software. Essentially they say that ERP is not as good as people expected, but once you apply some Business Intelligence solutions you'll be sorted.
  • by S3D ( 745318 ) on Monday November 02, 2009 @03:17PM (#29953484)
    From TFA, philippic against social media:

    That's too much information. Before they know it, their scientists are talking to the competition and trade secrets are leaking out."

    I don't think the author has a clue. The secrets which could be accidentally spilled are not worth keeping. If a secret is that short, it is bound to be trivial; really essential results are megabytes and megabytes of data or code or know-how. Treat your researchers as prisoners, get prison science in return.

  • Java (Score:3, Insightful)

    by Marrow ( 195242 ) on Monday November 02, 2009 @03:21PM (#29953546)

    It was not too long ago that Java was going to:
    Give us applets to do what Browsers can never do: Bring animated and reactive interfaces to the web browsing experience!
    Take over the desktop. Write once, run anywhere and render the dominance of Intel/MS moot by creating a neutral development platform!

    Yes, perhaps it's found a niche somewhere. But it's fair to say it fell short of the hype.

  • by jc42 ( 318812 ) on Monday November 02, 2009 @03:27PM (#29953630) Homepage Journal

    The most obvious counterexample to the "AI" nonsense is to consider that, back around 1800 or any time earlier, it was obvious to anyone that the ability to count and do arithmetic was a sign of intelligence. Not even smart animals like dogs or monkeys could add or subtract; only we smart humans could do that. Then those engineer types invented the adding machine. Were people amazed by the advent of intelligent machines? No; they simply reclassified adding and subtracting as "mechanical" actions that required no intelligence at all.

    Fast forward to the computer age, and you see the same process over and over. As soon as something becomes routinely doable by a computer, it is no longer considered a sign of intelligence; it's a mere mechanical activity. Back in the 1960s, when the widely-used programming languages were Fortran and Cobol, the AI researchers were developing languages like LISP that could actually process free-form, variable-length lists. This promised to be the start of truly intelligent computers. By the early 1970s, however, list processing was taught in low-level programming courses and had become a routine part of the software developer's toolkit. So it was just a "software engineering" tool, a mechanical activity that didn't require any machine intelligence.

    Meanwhile, the AI researchers were developing more sophisticated "intelligent" data structures, such as tables that could associate arbitrary strings with each other. Did these lead to development of intelligent software? Well, now some of our common programming languages (perl, prolog, etc.) include such tables as basic data types, and the programmers use them routinely. But nobody considers the resulting software "intelligent"; it's merely more complex computer software, but basically still just as mechanical and unintelligent as the first adding machines.

    So my prediction is that we'll never have Artificial Intelligence. Every new advance in that direction will always be reclassified from "intelligent" to "merely mechanical". When we have computer software composing best-selling music and writing best-selling novels or creating entire computer-generated movies from scratch, it will be obvious that such things are merely mechanical activities, requiring no actual intelligence.

    Whether there will still be things that humans are intelligent enough to do, I can't predict.

  • by ErichTheRed ( 39327 ) on Monday November 02, 2009 @04:01PM (#29954106)

    I definitely agree with a lot of the items on that list. This time around, however, thin clients are definitely in the running because of all the amazing VDI, virtual app stuff and fast cheap networks. However, anyone who tells you that you can replace every single PC or laptop in your company needs to calm down a little. Same goes for the people who explain thin clients in a way that makes it sound like client problems go away magically. They don't - you just roll them all up into the data center, where you had better have a crack operations staff who can keep everything going. Why? Because if the network fails, your users have a useless paperweight on their desk until you fix it.

    I'm definitely surprised not to see cloud computing on that list. This is another rehashed technology, this time with the fast cheap network connectivity thrown in. The design principles are great -- build your app so it's abstracted from physical hardware, etc. -- but I've seen way too many cloud vendors downplay the whole data ownership and vendor lock-in problems. In my opinion, this makes sense for people's Facebook photos, not a company's annual budget numbers.

  • by HockeyPuck ( 141947 ) on Monday November 02, 2009 @04:20PM (#29954382)

    EMC, IBM, HDS and HP I'm looking at you.

    You've been pushing this Storage Virtualization on us storage admins for years now, and it's more trouble than it's worth. What is it? It's putting some sort of appliance (or in HDS's view a new disk array) in front of all of my other disk arrays, trying to commoditize my back end disk arrays, so that I can have capacity provided by any vendor I choose. You make claims like,

    1. "You'll never have vendor lock-in with Storage virtualization!" However, now that I'm using your appliance to provide the intelligence (snapshots, sync/async replication, migration etc) I'm now locked into your solution.
    2. "This will be easy to manage." How many of these fucking appliances do I need for my new 100TB disk array? When I've got over 300 storage ports on my various arrays, and my appliance has 4 (IBM SVC I'm looking at you), how many nodes do I need? I'm now spending as much time trying to scale up your appliance solution that for every large array I deploy, I need 4 racks worth of appliances.
    3. "This will be homogeneous!" Bull fucking shit. You claimed that this stuff will work with any vendor's disk arrays so that I can purchase the cheapest $/GB arrays out there. No more DMX, just clariion, no more DS8000 now fastT. What a load. You only support other vendor's disk arrays during the initial migration and then I'm pretty much stuck with your arrays until the end of time. So much for your utopian view of any vendor. So now that I've got to standardize on your back end disk arrays, it's not like you're saving me the trouble of only having one loadbalancing software solutions (DMP, Powerpath, HDLM, SDD etc..). If I have DMX on the backend, I'm using Powerpath whether I like it or not. This would have been nice if I was willing to have four different vendor's selling me backend capacity, but since I don't want to deal with service contracts from four different vendors, that idea is a goner.

    Besides, when I go to your large conferences down in Tampa, FL, even your own IT doesn't use it. Why? Because all you did was add another layer of complexity (troubleshooting, firmware updates, configuration) between my servers and their storage.

    You can take this appliance (or switch based in EMC's case) based storage virtualization and Shove It!

    btw: There's a reason why we connect mainframe channels directly to the control units. (OpenSystems translation: Connecting hba ports to storage array ports.) Answer: Cable doesn't need upgrading, doesn't need maintenance contracts and is 100% passive.

  • by bb5ch39t ( 786551 ) on Monday November 02, 2009 @04:48PM (#29954750)
    In honor of Arthur C. Clarke's famous words, I have a button which almost got me fired at work. "Any sufficiently advanced technology is indistinguishable from a rigged demo."
