IT Snake Oil — Six Tech Cure-Alls That Went Bunk

snydeq writes "InfoWorld's Dan Tynan surveys six 'transformational' tech-panacea sales pitches that have left egg on at least some IT department faces. Each of the six technologies — five old, one new — has earned the dubious distinction of being the hype king of its respective era, falling far short of its legendary promises. Consultant greed, analyst oversight, dirty vendor tricks — 'the one thing you can count on in the land of IT is a slick vendor presentation and a whole lot of hype. Eras shift, technologies change, but the sales pitch always sounds eerily familiar. In virtually every decade there's at least one transformational technology that promises to revolutionize the enterprise, slash operational costs, reduce capital expenditures, align your IT initiatives with your core business practices, boost employee productivity, and leave your breath clean and minty fresh.' Today, cloud computing, virtualization, and tablet PCs are vying for the hype crown." What other horrible hype stories do some of our seasoned vets have?
This discussion has been archived. No new comments can be posted.

  • The bad news is that artificial intelligence has yet to fully deliver on its promises.

    Only idiots, marketers, businessmen and outsiders ever thought we would be completely replaced by artificially intelligent machines. The people actually putting artificial intelligence into practice knew that AI, like so many other things, would benefit us in small steps. Many forms of automation are technically artificial intelligence, just very simple artificial intelligence. You might want to argue that the things we benefit from are heuristics, statistics and messes of if/then decision trees, but successful AI is nothing more than that. Everyone reading this enjoys the benefits of AI, probably without knowing it. For instance, your handwritten mail is most likely read by a machine that uses optical character recognition to decide where it goes, with a pretty good success rate and a confidence factor that fails over to humans when the machine isn't sure. Recommendation systems are often based on AI algorithms. I mean, the article even says this:

    The ability of your bank's financial software to detect potentially fraudulent activity on your accounts or alter your credit score when you miss a mortgage payment are just two of many common examples of AI at work, says Mow. Speech and handwriting recognition, business process management, data mining, and medical diagnostics -- they all owe a debt to AI.

    Having taken several courses on AI, I never encountered a contributor to the field who promised it would be a silver bullet -- or even remotely comparable to the human mind. I don't recall reading anything other than fiction claiming that humans would soon be completely replaced by thinking machines.

    In short, I don't think it's fair to put it on this list, because it has had successes. It's easy to dismiss AI if the only person you hear talking about it is the cult-like Ray Kurzweil, but I assure you the field is a valid one [arxiv.org] (unlike CASE or ERP). AI will never die, because the list of applications -- though small -- slowly but surely grows. It has not gone 'bunk' (whatever the hell that means [wiktionary.org]). You can say expert systems have failed to keep their promises, but not AI on the whole. The only thing that's left a sour taste in your mouth is salesmen and businessmen promising something they simply cannot deliver. And that's nothing new, nor anything specific to AI.
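
    To make the "confidence factor to fail over to humans" point concrete, here is a minimal Python sketch of confidence-gated automation. It is illustrative only: classify() is a hypothetical stand-in for a real OCR model, and the threshold is a made-up value.

        # Confidence-gated automation: the machine handles the easy cases,
        # a human handles the rest. classify() is a hypothetical stub that
        # would be a real OCR model in practice.
        CONFIDENCE_THRESHOLD = 0.90

        def classify(image):
            # Pretend model call; returns (predicted ZIP code, confidence).
            return "90210", 0.65

        def route_mail_piece(image):
            zip_code, confidence = classify(image)
            if confidence >= CONFIDENCE_THRESHOLD:
                return ("auto", zip_code)      # confident enough to route automatically
            return ("human_review", None)      # fail over to a human operator

        print(route_mail_piece(object()))      # -> ('human_review', None)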

  • by mveloso ( 325617 ) on Monday November 02, 2009 @02:19PM (#29952708)

    Not sure why virtualization made it into the potential snake-oil of the future. It's demonstrating real benefits today...practically all of the companies I deal with have virtualized big chunks of their infrastructure.

    I'd vote for cloud computing, previously known as utility computing. It's a lot more work than expected to offload processing outside your organization.

  • by i.r.id10t ( 595143 ) on Monday November 02, 2009 @02:25PM (#29952804)

    Yup, even for "just" development, virtualization has been a great gift. With one or two beefy machines, each developer can have an exact mirror of a production environment, and not cause issues on the production side or even for other developers while testing code and such.

  • by Anonymous Coward on Monday November 02, 2009 @02:26PM (#29952808)

    because virtualization only works for large companies with many, many servers, yet contractors and vendors sell it to any company with a couple of servers. You should virtualize your email server ($2,000 by itself, give or take a little), your web server ($2,000 by itself, give or take a little), your source control ($1,000 by itself, give or take a little), and a couple of others. So you have maybe $10,000 in the 5 or 6 servers needed to run a small to mid-size company, and you spend tens of thousands to put them on one super-server running a complex setup of virtualized servers... oh no, the motherboard died and the entire biz is offline.

    Virtualization has its place, but only at the larger companies.

  • by cybergrue ( 696844 ) on Monday November 02, 2009 @02:26PM (#29952830)
    It arises when the salesman tells the clueless management that "This product will solve all your problems!"

    Bonus points if the salesman admits that he doesn't need to know your problems before selling it to you.

  • by WormholeFiend ( 674934 ) on Monday November 02, 2009 @02:27PM (#29952846)

    Let's just say the technology is not quite there yet.

  • by assemblerex ( 1275164 ) on Monday November 02, 2009 @02:33PM (#29952894)
    That went over real well once they saw user visits drop by almost half...
  • by jedidiah ( 1196 ) on Monday November 02, 2009 @02:34PM (#29952920) Homepage

    You could replace "virtualization" with "mainframe" or "big unix server" and still have the same issue.

    You would also end up with similar approaches to the problem. With some of these (mainframe), virtualization has been mundane/commonplace for decades.

  • Expert systems (Score:4, Insightful)

    by michael_cain ( 66650 ) on Monday November 02, 2009 @02:35PM (#29952928) Journal
    Within limits, expert systems seem to work reasonably well. Properly trained software that examines x-ray images has been reported to have better accuracy than humans at diagnosing specific problems. The literature seems to suggest that expert systems for medical case diagnosis are more accurate than doctors and nurses, especially tired doctors and nurses. OTOH, patients have an intense dislike of such systems, particularly the diagnosis software, since it can seem like an arbitrary game of "20 Questions". Of course, these are tools that help the experts do their job better, not replacements for the experts themselves.
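
    For readers who have never poked at one, a toy rule-based system shows why the interaction can feel like "20 Questions". This is a hedged, purely illustrative Python sketch, not any real diagnostic product; the rules and answers are invented.

        # Toy "20 Questions"-style expert system: walk the rule list, ask only
        # the questions needed, and report the first rule whose conditions all hold.
        RULES = [
            ({"fever": True, "cough": True}, "possible flu"),
            ({"fever": False, "sneezing": True}, "possible allergies"),
        ]

        def diagnose(ask):
            answers = {}
            def answer(question):
                if question not in answers:
                    answers[question] = ask(question)   # ask each question only once
                return answers[question]
            for conditions, conclusion in RULES:
                if all(answer(q) == wanted for q, wanted in conditions.items()):
                    return conclusion
            return "no rule matched; refer to a human expert"

        # Example run with canned answers instead of interactive prompts:
        canned = {"fever": False, "cough": True, "sneezing": True}
        print(diagnose(lambda q: canned[q]))            # -> possible allergies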
  • by Tablizer ( 95088 ) on Monday November 02, 2009 @02:35PM (#29952936) Journal

    There's a pattern here. Many of the hyped technologies eventually find a nice little niche. It's good to experiment with new things to find out where they might fit in or teach us new options. The problem comes when they are touted as a general solution to most IT ills. Treat them like the religious dudes who knock on your door: go ahead and talk to them for a while on the porch, but don't let them into the house.
         

  • Re:ERP? (Score:3, Insightful)

    by cryfreedomlove ( 929828 ) on Monday November 02, 2009 @02:37PM (#29952958)
    The fundamental problem with ERP systems is that they are integrated and implemented by the second tier of folks in the engineering pecking order. Couple that fact with an aggressive sales force that would sell ice to eskimos and you've got a straight road to expensive failure.
  • by VoidEngineer ( 633446 ) on Monday November 02, 2009 @02:37PM (#29952960)
    Having been involved in a business start-up for a year or so now, I'd have to disagree. Virtualization is indispensable for QA testing. Being able to run a virtual network on a personal PC lets me design, debug, and do proofs of concept without requiring an investment in actual equipment. Virtualization isn't just about hardware consolidation: it's also about application portability. Small companies have just as much need for QA testing, hardware recycling, and application portability as the large ones.
  • by Wonko the Sane ( 25252 ) * on Monday November 02, 2009 @02:41PM (#29953010) Journal

    ERP could work if the vendors would realistically deal with GIGO.

    Unless you lock down the permissions so tightly that the system is unusable, your users will enter bad data. They'll add new entries for objects that already exist, they'll misspell the name of an object and then create a new object instead of editing the one they just created. They'll make every possible data entry error you can imagine, and plenty that you can't.

    We'd see a lot more progress in business software applications if all vendors would follow two rules:

    1. Every piece of data that comes from the user must be editable in the future.
    2. Any interface that allows a user to create a new database entry MUST provide a method to merge duplicate entries (a rough sketch of what that merge involves follows below).
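
    To spell out what rule 2 asks of vendors, here is a rough Python sketch of a duplicate merge over in-memory stand-ins for database tables. The table and field names are made up; in a real ERP these would be UPDATE and DELETE statements wrapped in a transaction.

        # Merge a duplicate customer into the canonical one: repoint child
        # records, then retire the duplicate.
        customers = {1: {"name": "Acme Corp"}, 2: {"name": "ACME Corporation"}}
        orders = [{"id": 101, "customer_id": 2}, {"id": 102, "customer_id": 1}]

        def merge_customers(keep_id, duplicate_id):
            for order in orders:                          # repoint child records
                if order["customer_id"] == duplicate_id:
                    order["customer_id"] = keep_id
            customers.pop(duplicate_id)                   # retire the duplicate record

        merge_customers(keep_id=1, duplicate_id=2)
        print(orders)      # both orders now reference customer 1
        print(customers)   # only the canonical record remains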
  • by loftwyr ( 36717 ) on Monday November 02, 2009 @02:44PM (#29953054)
    Most of the technologies in the article were overhyped but almost all have had real value in the marketplace.

    For example, AI works and is a very strong technology, but only the SF authors and idiots expect their computer to have a conversation with them. Expert systems (a better name) or technologies that are part of them are in place in thousands of back-office systems.

    But if you're looking for HAL, you have another 2001 years to wait. Nobody is seriously working toward that, except as a dream goal. Everybody wants a better prediction model for the stock market first.
  • Re:Thin Clients? (Score:3, Insightful)

    by abigor ( 540274 ) on Monday November 02, 2009 @02:53PM (#29953156)

    Because cloud computing doesn't require a thin client? The two things aren't related at all. Offloading processing and data makes perfect sense for many applications.

  • by MightyMartian ( 840721 ) on Monday November 02, 2009 @02:53PM (#29953168) Journal

    I think the issue I have with both virtualization and cloud computing is a lack of concrete assessment. They are touted as wunder-technologies, and while they have their place and their uses, a lot of folks are leaping into them with little thought as to how they integrate with existing technologies and the overhead (hardware, software, wetware) that will come with them.

    Virtualization certainly has some great uses, but I've seen an increasing number of organizations thinking they can turf their server rooms and big chunks of their IT staff by believing the hype that everything will become smaller and easier to manage. The technology is real, has some excellent uses and in a well-planned infrastructure upgrade can indeed deliver real results. But the sales pitch seems to be "replace 10 servers with 1, fire most of your IT department and away you go!"

    As to cloud computing, well, it's nothing more than a new iteration of a distributed computing model that dates back forty years or more. In the olden days (back when I was just a stripling) we called it the client-server model. Again, it's a technology with potentially excellent uses, but it, even more so than virtualization, has been hyped beyond all reason. There are profound security and data integrity issues that go along with cloud computing that seem to be swept under the rug. Again, the pitch is "put your data on the cloud, fire most of your IT department and away you go!"

    I'm fortunate in that I have a lot of say in how my budget is spent, but I've heard of guys who are basically having management shove this sort of stuff down their throats, and, of course, win or lose, it's the IT department that wears it when the bloom comes off the rose.

    Quite frankly I despise marketers. I think they are one of the greatest evils that have ever been created, a whole legion of professional bullshitters whose job it is to basically lie and distort the truth to shove out products that are either not ready for prime time or don't (and never will) deliver on the promises.

  • Those aren't all (Score:5, Insightful)

    by HangingChad ( 677530 ) on Monday November 02, 2009 @02:59PM (#29953210) Homepage

    We used to play buzzword bingo when vendors would come in for a show. Some of my personal favorites:

    IT Best Practices - Has anyone seen my big book of best practices? I seem to have misplaced it. But that never stopped vendors from pretending there was an IT bible out there that spelled out the procedures for running an IT shop. And always it was their product at the core of IT best practices.

    Agile Computing - I never did figure that one out. This is your PC, this is your PC in spin class.

    Lean IT - Cut half your staff and spend 3x what you were paying them to pay us for doing the exact same thing only with worse service.

    Web 2.0 - Javascript by any other name is still var rose.

    SOA - What a gold mine that one was. Calling it "web services" didn't command a very high premium. But tack on a great acronym like SOA and you can charge lots more!

    All those are just ways for vendors and contractors to make management feel stupid and out of touch. Many management teams don't need any help in that arena; most of them are already out of touch before the vendor walks in. Which is exactly why they don't run back to their internal IT people to ask why installing Siebel is a really BAD idea. You can't fix bad business practices with technology. Fix your business practices first, then find the solution that best fits what you're already doing.

    And whoever has my IT Best Practices book, please bring it back. Thanks.

  • by __aagmrb7289 ( 652113 ) on Monday November 02, 2009 @03:04PM (#29953296) Journal

    Spoken like someone who investigated the technology five years ago and hasn't updated their information since.

    1. If a small business is running more than two servers, then it's likely it'll be cheaper, over the next five years, to virtualize those servers.
    2. If a small business needs any sort of guaranteed uptime, it's cheaper to virtualize - two machines and high availability with VMWare, and you are good to go.
    3. Setting up VMWare, for example, is relatively simple, and actually makes remote management easier, since I have CONSOLE access from remote sites to my machine. Need to change the network connection or segment for a machine remotely? You can't do it safely without virtualization.

    There is more, but I recommend you check this out again, before continuing to spout this stuff. It's just not true anymore.

  • by FranTaylor ( 164577 ) on Monday November 02, 2009 @03:05PM (#29953330)

    "Artificial intelligence" - what's keeping the spam out of YOUR inbox? How does Netflix decide what to recommend to you? Ever gotten directions from Google Maps?

    "Computer-aided software engineering" - tools like valgrind, findbugs, fuzzing tools for finding security problems.

    "Thin clients" - ever heard of a "Web Browser"?

    "Enterprise social media" - That really describes most of the Internet

    As soon as I saw an opinion from "Ron Enderle" I knew this story would be BS.
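
    The spam example is the easiest to make concrete. Below is a hedged sketch of the crude end of that spectrum: a hand-weighted keyword score in Python. Real filters learn their weights from data (naive Bayes and friends); these numbers are invented for illustration.

        # Crude keyword-weight spam score -- the "heuristics and statistics"
        # flavor of AI. Weights and threshold are made-up illustrative values.
        SPAM_WEIGHTS = {"viagra": 3.0, "winner": 2.0, "free": 1.0, "meeting": -2.0}
        THRESHOLD = 2.5

        def spam_score(message):
            return sum(SPAM_WEIGHTS.get(word, 0.0) for word in message.lower().split())

        def is_spam(message):
            return spam_score(message) >= THRESHOLD

        print(is_spam("You are a winner claim your free prize"))  # True
        print(is_spam("Agenda for the free project meeting"))     # False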

  • by NotBornYesterday ( 1093817 ) * on Monday November 02, 2009 @03:13PM (#29953430) Journal
    So don't tell them it's a dumb terminal. Put a thin client on their desk and tell them they're getting a 6 ghz octocore with 32 gigs of ram and a petabyte hard drive. They'll never know. Most of them, anyway.
  • by S3D ( 745318 ) on Monday November 02, 2009 @03:17PM (#29953484)
    From TFA, philippic against social media:

    That's too much information. Before they know it, their scientists are talking to the competition and trade secrets are leaking out."

    I don't think the author has a clue. Secrets that could be accidentally spilled are not worth keeping. Anything short enough to leak in a conversation is bound to be trivial; the really essential results are megabytes and megabytes of data or code or know-how. Treat your researchers as prisoners and you get prison science in return.

  • Java (Score:3, Insightful)

    by Marrow ( 195242 ) on Monday November 02, 2009 @03:21PM (#29953546)

    It was not too long ago that Java was going to:
    Give us applets to do what browsers could never do, bringing animated and reactive interfaces to the web browsing experience!
    Take over the desktop. Write once, run anywhere, and render the dominance of Intel/MS moot by creating a neutral development platform!

    Yes, perhaps it's found a niche somewhere. But it's fair to say it fell short of the hype.

  • by jc42 ( 318812 ) on Monday November 02, 2009 @03:27PM (#29953630) Homepage Journal

    The most obvious counterexample to the "AI" nonsense is to consider that, back around 1800 or any time earlier, it was obvious to anyone that the ability to count and do arithmetic was a sign of intelligence. Not even smart animals like dogs or monkeys could add or subtract; only we smart humans could do that. Then those engineer types invented the adding machine. Were people amazed by the advent of intelligent machines? No; they simply reclassified adding and subtracting as "mechanical" actions that required no intelligence at all.

    Fast forward to the computer age, and you see the same process over and over. As soon as something becomes routinely doable by a computer, it is no longer considered a sign of intelligence; it's a mere mechanical activity. Back in the 1960s, when the widely-used programming languages were Fortran and Cobol, the AI researchers were developing languages like LISP that could actually process free-form, variable-length lists. This promised to be the start of truly intelligent computers. By the early 1970s, however, list processing was taught in low-level programming courses and had become a routine part of software developers' toolkits. So it was just a "software engineering" tool, a mechanical activity that didn't require any machine intelligence.

    Meanwhile, the AI researchers were developing more sophisticated "intelligent" data structures, such as tables that could associate arbitrary strings with each other. Did these lead to development of intelligent software? Well, now some of our common programming languages (perl, prolog, etc.) include such tables as basic data types, and the programmers use them routinely. But nobody considers the resulting software "intelligent"; it's merely more complex computer software, but basically still just as mechanical and unintelligent as the first adding machines.

    So my prediction is that we'll never have Artificial Intelligence. Every new advance in that direction will always be reclassified from "intelligent" to "merely mechanical". When we have computer software composing best-selling music and writing best-selling novels or creating entire computer-generated movies from scratch, it will be obvious that such things are merely mechanical activities, requiring no actual intelligence.

    Whether there will still be things that humans are intelligent enough to do, I can't predict.

  • by ghostlibrary ( 450718 ) on Monday November 02, 2009 @03:29PM (#29953652) Homepage Journal

    AI already has successes. But, as an AI researcher friend of mine points out, once they succeed it's no longer 'AI'. Things like packet routing used to be AI. Path-finding, as in games, or route-finding, as with GPS: solved. So yes, AI will never arrive, because AI is always 'other than the AI we already have.'
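
    Path-finding is a good example because the "solved" version fits in a few lines. Here is an illustrative breadth-first search over a toy graph; GPS routing uses weighted variants such as Dijkstra or A*, but the flavor is the same.

        # Breadth-first search on a toy unweighted graph: the once-"AI"
        # problem of path-finding as a routine textbook exercise.
        from collections import deque

        GRAPH = {
            "A": ["B", "C"],
            "B": ["D"],
            "C": ["D", "E"],
            "D": ["F"],
            "E": ["F"],
            "F": [],
        }

        def shortest_path(start, goal):
            queue = deque([[start]])
            seen = {start}
            while queue:
                path = queue.popleft()
                if path[-1] == goal:
                    return path
                for nxt in GRAPH[path[-1]]:
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(path + [nxt])
            return None

        print(shortest_path("A", "F"))   # -> ['A', 'B', 'D', 'F']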

  • by Anonymous Coward on Monday November 02, 2009 @03:30PM (#29953674)

    I went through the list and this Computer Aided Software Engineering appears to be a huge success to me.

    Look at the good IDEs and high level languages such as Java and PHP.

    I want to make a web program that sends me emails based on some form data using PHP. I don't need to know how TCP works, how the OS manages files, how email works, how data streams are used, or how memory is managed... I just mark certain fields as certain parameters and call a function to send the email with the parameters I want, in the order I want them, and I get the result I'm after.

    Yeah, optimized, large programs do need significant amounts of coding. But compare coding a large program with assembly and a notepad to coding that with Java and a good IDE and then tell me that Computer Aided Software Engineering has failed.
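
    The abstraction point holds outside PHP too. As a hedged illustration, here is the same form-to-email idea using Python's standard library; the mail host and addresses are placeholders, not real infrastructure.

        # Form data in, email out, with no knowledge of TCP, file systems,
        # or the SMTP wire protocol required. Host and addresses are placeholders.
        import smtplib
        from email.message import EmailMessage

        def email_form_submission(form):
            msg = EmailMessage()
            msg["Subject"] = "New form submission"
            msg["From"] = "webform@example.com"
            msg["To"] = "me@example.com"
            msg.set_content("\n".join(f"{k}: {v}" for k, v in form.items()))
            with smtplib.SMTP("mail.example.com") as server:
                server.send_message(msg)

        # email_form_submission({"name": "Ada", "comment": "Hello"})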

  • by Anonymous Coward on Monday November 02, 2009 @03:36PM (#29953758)

    Thin clients for Windows Machines don't make much sense in many environments. Thin clients for Macs do. There is one Mac terminal server (http://www.aquaconnect.net) which allows thin clients or Windows machines to run on a Mac server.

    The reason this makes sense is the price of a Mac as compared with a thin client or PC.

    Thin clients (even for Windows) make sense in secure environments and environments that have high turnover (i.e., a computer lab or internet cafe). The secure environment case is obvious: the data stays on the server and never leaves. A high-turnover environment becomes easier to administer, because you can wipe and restore the environment; there are other tools to do that, but they're not as convenient.

  • by Dog-Cow ( 21281 ) on Monday November 02, 2009 @03:41PM (#29953810)

    That's not CASE. CASE was about translating requirements into code without human involvement. Your examples are all about abstraction and APIs.

  • by DrVomact ( 726065 ) on Monday November 02, 2009 @03:44PM (#29953872) Journal

    The problem is that, if it isn't that, then what is "artificial intelligence", rather than flashy marketing speak for just another bunch of algorithms?

    Exactly. "Artificial intelligence" seems to serve various purposes—at best vacuous and at worst deceptive. How many millions of dollars have academicians raked in for various projects that involve research into "artificial intelligence"?

    What makes all this silliness sustainable is the philosophical fog that surrounds words such as "intelligence" and "thinking". Such words easily slip their moorings in our common language, and acquire some very strange uses. Yet, because they are recognizably legitimate words that do have perfectly legitimate uses, it is all too easy to fool people into uncritical acceptance of claims that, when analyzed, make little sense.

    When someone talks to me about creating "artificial intelligence", I never deny that this is possible. I can't deny a claim that I don't clearly understand. To evaluate a claim, I first need clarification of what is being asserted. In this case, it's as though someone were talking about the discovery of "artificial chenya". To even understand that claim, I have to know what natural chenya might be. So tell me, what is "natural intelligence"? You're going to have to be very clear about that before you can start writing the requirements document for your project.

  • by maxwell demon ( 590494 ) on Monday November 02, 2009 @03:57PM (#29954058) Journal

    When we have computer software composing best-selling music and writing best-selling novels or creating entire computer-generated movies from scratch, it will be obvious that such things are merely mechanical activities, requiring no actual intelligence.

    When looking at some best-selling stuff, I'm already not sure that you need intelligence to produce it. :-)

  • by cecille ( 583022 ) on Monday November 02, 2009 @03:59PM (#29954080)
    Maybe. But I think it's mostly just a disconnect between what the people who work in the field believe the term to mean and what the general public takes the term to mean. Some of that might just be naivete on the part of researchers. And maybe some bravado as well.

    When I hear about intelligent anything to do with computers, I just think of a system that learns. That, to me, is the key differentiator. On the other hand, my mom's friend was telling me one night at dinner that her son was taking a class where they're "building machines like brains". Well, he was clearly learning about ANNs in some undergrad CI course, but man, it sure sounds better when you say you're building brains, eh? It sounds like self-aware systems are a semester's worth of work away. Maybe he was trying to make his work sound more impressive. Or maybe his mom just took the wrong thing away from the conversation. Either way, he's talking about building an XOR and she's thinking Commander Data.

    To be honest, though, researchers aren't always clear on what the terms mean either. Don't get me wrong - you'd be hard pressed to find a researcher who genuinely believes they are going to build a self-aware system or anything of the sort, but I remember going to a conference with an hour and a half long panel discussion on whether or not the fields of AI and CI should be combined or separated in the conference, and what each encompassed. No one had a good answer.
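
    For the curious, the gap between "building an XOR" and Commander Data is easy to show. Here is a purely illustrative Python sketch of a fixed two-layer perceptron that computes XOR; the weights are hand-picked rather than learned, which is exactly how far this is from a brain.

        # A hard-wired 2-2-1 perceptron computing XOR with step activations.
        def step(x):
            return 1 if x > 0 else 0

        def xor_net(x1, x2):
            h1 = step(x1 + x2 - 0.5)     # fires if at least one input is 1
            h2 = step(x1 + x2 - 1.5)     # fires only if both inputs are 1
            return step(h1 - h2 - 0.5)   # "at least one, but not both"

        for a in (0, 1):
            for b in (0, 1):
                print(a, b, "->", xor_net(a, b))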
  • by Wonko the Sane ( 25252 ) * on Monday November 02, 2009 @03:59PM (#29954086) Journal

    I've never worked for a software company but as the "computer guy" I got to help move people from the "emailing spreadsheets around" workflow to basic MS Access database applications (I know just enough about databases to be horrified about the idea of using Access for critical business functions but it's better than Excel).

    As the maintenance manager of a factory I got to help the plant manager make software purchasing decisions. I've come to the conclusion that mid-sized to large corporations should just bite the bullet and hire their own programmers. If it makes sense to design your own product, your own assembly lines, and your own tooling, jigs and fixtures, then it makes sense to design your own software. Any cost savings you can achieve by outsourcing to a more specialized company never seems to materialize.

  • by ErichTheRed ( 39327 ) on Monday November 02, 2009 @04:01PM (#29954106)

    I definitely agree with a lot of the items on that list. This time around, however, thin clients are definitely in the running because of all the amazing VDI and virtual app stuff and fast, cheap networks. However, anyone who tells you that you can replace every single PC or laptop in your company needs to calm down a little. Same goes for the people who explain thin clients in a way that makes it sound like client problems go away magically. They don't - you just roll them all up into the data center, where you had better have a crack operations staff who can keep everything going. Why? Because if the network fails, your users have a useless paperweight on their desk until you fix it.

    I'm definitely surprised not to see cloud computing on that list. This is another rehashed technology, this time with fast, cheap network connectivity thrown in. The design principles are great -- build your app so it's abstracted from the physical hardware, and so on -- but I've seen way too many cloud vendors downplay the data ownership and vendor lock-in problems. In my opinion, this makes sense for people's Facebook photos, not a company's annual budget numbers.

  • by Intron ( 870560 ) on Monday November 02, 2009 @04:06PM (#29954188)
    One good example is speech recognition. This was a hot topic of research in the '70s. Now it's a $20 DSP chip.
  • by rossz ( 67331 ) <ogre&geekbiker,net> on Monday November 02, 2009 @04:47PM (#29954728) Journal

    I kind of miss the crazy hotties that used to pervade the network sales arena. I won't even name the worst offenders, although the worst started with the word cable. They would go to job fairs and hire the hottest birds, put them in the shortest skirts and low-cut blouses, usually white with black push-up bras - and send them in to sell you switches.

    Booth babes are the best thing about trade shows.

  • by bb5ch39t ( 786551 ) on Monday November 02, 2009 @04:48PM (#29954750)
    In honor of Arthur C. Clarke's famous words, I have a button which almost got me fired at work. "Any sufficiently advanced technology is indistinguishable from a rigged demo."
  • by murdocj ( 543661 ) on Monday November 02, 2009 @04:52PM (#29954822)

    Just out of curiosity... did you ever try to find out WHY people were making entries with invalid phone numbers? Is it at all possible that instead of your users being idiots, they HAD to make an entry, but the phone number was one piece of data that simply wasn't available?

    If I've learned anything over a lot of years of programming, it's that when your users absolutely insist on doing something contrary to what your program wants them to do, it's time to sit down and listen.

  • by turbidostato ( 878842 ) on Monday November 02, 2009 @05:01PM (#29954916)

    "We had a simple field on a form to "Supply a Telephone Number". The users didn't, so we used JS to validate they had filled it in."

    So instead of validation server-side you rely on validation client-side?

    "The more you Idiot-Proof a system, the smarter the Idiots become. Not smarter at actually entering the correct data, just smarter at bypassing the protections you put in place."

    Hummm... Why are your users entering telephone numbers like 1111111? Are you *sure* they're doing it by mistake? Or might it be that they don't *want* to give their telephone number to you, for their own valid reasons, and you still haven't added the option "I don't have a telephone number or don't want to share it"?

    I'm not sure which keyboard end is the idiot one in this case.
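
    For what it's worth, here is a hedged sketch of the server-side check being argued for, with an explicit opt-out rather than a dummy-number arms race. The field names, placeholder list, and length rule are invented for illustration.

        # Server-side validation with an explicit opt-out: accept a plausible
        # number or a declared refusal, reject obvious placeholders.
        import re

        PLACEHOLDERS = {"1111111", "0000000", "1234567"}

        def validate_phone(form):
            if form.get("no_phone"):              # "I don't want to share my number"
                return True, None
            digits = re.sub(r"\D", "", form.get("phone", ""))
            if len(digits) < 7 or digits in PLACEHOLDERS:
                return False, "Enter a valid phone number or tick 'no phone'."
            return True, None

        print(validate_phone({"phone": "1111111"}))         # rejected placeholder
        print(validate_phone({"no_phone": True}))           # explicit opt-out accepted
        print(validate_phone({"phone": "(555) 010-2345"}))  # plausible number accepted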

  • by mvdwege ( 243851 ) <mvdwege@mail.com> on Monday November 02, 2009 @05:13PM (#29955034) Homepage Journal

    A very good example, that. That $20 DSP does nothing but a brute force search on certain sound patterns. This is not in any way similar to how humans process speech.

    I am not in the camp that says humans have a certain ineffable something that computers can never replicate, but using brute force pattern matching is not the way to find out just how human perception works and reimplementing it in a machine. Chess, BTW, is an example of the opposite: even humans do a brute force search down the decision tree. Sometimes they're trained enough to prune the tree quickly, but that is no different from the common algorithms currently in use.

    As Douglas Hofstadter puts it, the most interesting things happen in those 100ms between seeing a picture of your mother and going 'Mom!', and we're nowhere near understanding that problem space enough to implement it in AI. At least, we weren't a couple of years back. I haven't kept up with current developments though.

    Mart

  • by publiclurker ( 952615 ) on Monday November 02, 2009 @05:19PM (#29955118)
    How about using VMWare to make sure you are not doing something decidedly stupid? I have VMWare images of every platform our software supports. I can easily verify that everything works as advertised without running all over the place snagging time on different machines. And if I encounter an issue on a particular setup, I can save a snapshot for later or restore the machine to its pre-install state and try again.
  • by amRadioHed ( 463061 ) on Monday November 02, 2009 @05:25PM (#29955218)

    This is not in any way similar to how humans process speech

    How do humans do it?

  • by Maxo-Texas ( 864189 ) on Monday November 02, 2009 @05:26PM (#29955230)

    I have to agree with this...

    When a corporation passes a certain size, having a packaged ERP is a good idea (for legal compliance).

    The problem is, generic accounting packages, ERP packages, etc. work best when you don't have a bunch of exceptional processes (our PO process had 17 variations, from as little as 2 lines on the form to a full form plus multiple attachments; people used the same form area for many different meanings, so the 1-page form was really about a 4-page form).

    The solution was to cut down most of the exceptions and standardize. We still do things that were added for an "edge case" 15 years ago that has never recurred.
    We have steps to address customers who are no longer customers. Yet the steps live on.

    Having an ERP and using it as specified (no unique coding unless it makes us boatloads of money) is the right choice for a big corporation.

  • by toriver ( 11308 ) on Monday November 02, 2009 @05:37PM (#29955396)

    Do you also complain when airplanes don't flap their wings? (Sci-fi's ornithopters exempted, of course.)

    Knowledge systems/rules engines and neural networks can deduce answers; that is sufficient to be labeled "intelligence" in my book.

  • by pudding7 ( 584715 ) on Monday November 02, 2009 @07:37PM (#29956790)
    So having all these idiot users put in 111111 made how many of those records useless? Did you ever send a memo saying "If you put in 11111111 you're wasting your time, since this record won't be used by anyone, ever"?
  • by Phantasmagoria ( 1595 ) <loban.rahman+slashdot@NoSpAm.gmail.com> on Monday November 02, 2009 @09:32PM (#29958234)

    Another modern and heavily used AI: vehicle control systems (especially fighter jets and race cars).

  • by LtGordon ( 1421725 ) on Monday November 02, 2009 @10:07PM (#29958640)
    I'll call it truly intelligent when the computer can design its own algorithms. In the meantime, following an if/then tree is pretty weak AI.
  • by Velex ( 120469 ) on Monday November 02, 2009 @10:55PM (#29959148) Journal

    How do humans do it?

    It's a fascinatingly complex process. Seriously, read up a bit on Wikipedia and perhaps take a few foreign languages. There are many, many points of failure. I think it's interesting to consider Orwell's argument about language in 1984. When thinking of Orwell, I'm glad that I've had the opportunity to be exposed to as many languages as I have. The more languages I learn, even if only a few words and concepts, the more modes of thinking I open myself up to. A new language can sometimes introduce a whole new viewpoint on the world, simply through its specific connotations and denotations. Denotations are usually easy to translate; connotations, however, can pose such a problem that sometimes we prefer to just outright borrow a word from another language to express precisely our meaning. Language can evoke all 5 senses.

    Personally, I'm fascinated by language, written and spoken. There are words I learned in Germany that I still use today even though I'm no longer anywhere near fluent (use it or lose it). For example, in English we have a "shortcut," but I can't readily think of the opposite unless I use the German word "Umweg." Another example: as I was looking at art in a store today I came across some Japanese characters (because we know that hanging up symbols you have no idea about is so cool), and I noticed that the kanji for woman was one of the radicals in a kanji that was translated as "tranquility." It made me wonder who, thousands of years ago, thought about the concept of tranquility and decided that the lower radical should be the symbol for "woman." I could go on like this. Suffice it to say, language is perhaps the single tool we use to define our consciousness as humans.

    I'd further pontificate that unless we create an AI for whom language is as pervasive as it is in the human mind, chasing strong AI will always result in failure.
