
Why Your IT Spending Is About To Hit the Wall

CowboyRobot writes "For decades, rapid increases in storage, processor speed, and bandwidth have kept up with the enormous increases in computer usage. That could change, however, as consumption finally outpaces the supply of these resources. It is instructive to review the 19th-century economics theory known as Jevons Paradox. Common sense suggests that as efficiency in the use of a resource rises, consumption of that resource goes down. Jevons Paradox posits that efficiencies actually drive up usage, and we're already seeing examples of this: our computers are faster than ever and we have more bandwidth than ever, yet our machines are often slow and have trouble connecting. The more we have, the more we use."
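To put rough numbers on the paradox (an illustrative sketch with assumed figures, not from the article): model demand for useful work with a constant price elasticity. An efficiency gain lowers the effective cost of computing, and whenever the elasticity exceeds 1, total resource consumption rises instead of falling.

    // Illustrative sketch of Jevons Paradox; the constants are assumptions.
    public class JevonsSketch {
        public static void main(String[] args) {
            double k = 100.0;          // arbitrary demand constant (assumed)
            double elasticity = 1.5;   // assumed; the paradox needs elasticity > 1
            double cost = 1.0;         // effective cost per unit of useful work

            // Demand for useful work at a given effective cost: k * cost^(-elasticity).
            double workBefore = k * Math.pow(cost, -elasticity);
            double workAfter  = k * Math.pow(cost / 2.0, -elasticity); // 2x efficiency halves cost

            // Resource consumed = work done / efficiency.
            double resourceBefore = workBefore / 1.0;
            double resourceAfter  = workAfter / 2.0;
            System.out.printf("resource use before: %.1f, after: %.1f%n",
                    resourceBefore, resourceAfter); // 100.0 vs ~141.4: usage rises
        }
    }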

  • by Anonymous Coward on Friday April 13, 2012 @07:47PM (#39681139)

    not really. seems the former are a bunch of fucking idiots constantly tampering with shit that doesn't need tampering with (gnome 3, anyone? unity?), while the latter are either refreshingly pragmatic, or, as you say, tedious old farts resisting progress for the sake of pretending to be smart (windows xp, anyone?)

    my view is fuck the both of them, and there must be a third school of thought. however, i'm too tired and drunk to think what it might be.

  • by Anonymous Coward on Friday April 13, 2012 @07:49PM (#39681165)

    1. Computer hardware is not a finite resource like coal or any other natural resource. Prices go up; somebody builds a plant to make more. Econ 101.

    2. This assumes that computer hardware will be used the same way it has been in the past. We are already seeing major changes: less individual storage and more online storage, devices that are less hardware-intensive, and computing being used differently - less desktop, more handheld, and all the differences down the chain from that.

    3. No mention of significant technology changes. Who's to say we'll still be using the current architectures, or even silicon tech, in the future? This assumes the same old same old going forward.

  • by DogDude ( 805747 ) on Friday April 13, 2012 @07:58PM (#39681237)
    In our company, IT spending is actually dropping, even as we expand. The cost of used hardware is insanely low because of all of the individuals and companies who still feel the need to buy "new" equipment so rapidly. We have no problem running Pentium 4s and Windows XP throughout our business, and will do so for the foreseeable future. We've moved our email/backup/web hosting services out to providers, and all of that is still insanely cheap. Tech has actually exceeded our needs, so our IT spending has dropped significantly. Keep buying new machines every few years, people! We're loving buying your completely functional equipment at yard-sale prices!
  • Peak Computing? (Score:5, Interesting)

    by slew ( 2918 ) on Friday April 13, 2012 @08:02PM (#39681257)

    If I understand what this article is speculating about, it's a phenomenon similar to peak oil.

    Peak oil doesn't necessarily mean that you run out of oil; it just means that the marginal cost of producing more oil reaches a point that causes the rate of oil production to decrease. Against a backdrop of increasing demand and limited supply, this implies a sharp downturn in the availability of oil at historical prices.

    If applied to computing, it would imply a limit to computing resources. I don't think we are there (although computing takes lots of electrical power, there seems to be enough semiconductor manufacturing capacity for the moment), but we may be at a point where demand increases beyond the rate at which technology can keep it on its historical increasing MIPS/$ trend. If the MIPS/$ trend flattens out, it may become difficult to find funding for new technological advances, which would fundamentally change the market for computing.
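    To put toy numbers on that (my own illustrative assumptions, not figures from the article): if demand for compute grows faster than MIPS/$ improves, the spend required just to keep up compounds year over year.

        // Toy projection with assumed growth rates, not real data.
        public class SpendProjection {
            public static void main(String[] args) {
                double demandGrowth = 1.40;        // assumed: MIPS demanded grows 40%/yr
                double mipsPerDollarGrowth = 1.25; // assumed: MIPS/$ improves 25%/yr
                double spend = 1.0;                // normalized year-0 IT spend

                for (int year = 1; year <= 10; year++) {
                    // Required spend scales with demand / (MIPS per dollar).
                    spend *= demandGrowth / mipsPerDollarGrowth;
                    System.out.printf("year %2d: relative spend %.2f%n", year, spend);
                }
                // ~12%/yr growth here; if MIPS/$ flattens (growth -> 1.0),
                // spend has to grow at the full demand rate instead.
            }
        }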

  • by Dolphinzilla ( 199489 ) on Friday April 13, 2012 @08:23PM (#39681407) Journal

    I read the headline for this story and laughed - it doesn't matter how much faster my computers or networks get, our IT department just installs more and more virus scanners, software maintenance tools, firewalls, monitoring tools, etc. Each computer I get has more CPU cores, more memory, and faster graphics, yet it can do less and less and takes longer and longer to boot. I figure before too long I'll have to go back to my old TI-30 calculator and some engineering graph paper, and I'll be equal in efficiency to my computer once I factor in all the time I spend waiting for it to get around to sparing 0.5% of its 12 CPU cores to run the actual software I need to use.

  • Re:slow where (Score:4, Interesting)

    by jrminter ( 1123885 ) on Friday April 13, 2012 @08:33PM (#39681479)
    Ding ding ding - we have a winner. Our IT folks put so much crapware on our corporate image that I had to take all my lab computers out of the domain and run vanilla installs with minimal antivirus plus our imaging hardware/software. Makes a BIG difference.
  • Re:Bloated apps. (Score:2, Interesting)

    by Anonymous Coward on Friday April 13, 2012 @09:52PM (#39682047)

    the industry is using more Java which is as slow as snot. The attitude seems to be that if it runs slow, then throw some more iron at it.

    It's time this myth was debunked.

    Java itself is not slow. Properly written and optimized Java code runs almost as fast as equivalent C/C++. (I know, I write such code, and I measure timings for operations in nanoseconds in Java.) The JIT compilers built into modern JVMs generate very optimized machine code. (I know, I've looked at the assembler output.)

    Unfortunately, Java has a tendency to magnify poor programming decisions, and it's easy to be an idiot and still write Java "code." Sources of poor Java performance are generally instantiating way too many objects per second, resulting in frequent garbage collector use, and poor choices of algorithms and data structures for the underlying problem. (The same thing could be said of C/C++, although with manual memory management in C/C++ you're far less likely to run into a problem allocating/freeing too much memory per second.)
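    As a minimal illustration of the allocation point (a hypothetical micro-benchmark of my own; for honest numbers use a harness like JMH, since naive timing loops are distorted by JIT warmup): the boxed version creates a throwaway Long on every iteration for the garbage collector to clean up, while the primitive version does the same arithmetic with zero allocation in the loop.

        // Hypothetical micro-benchmark; names and sizes are mine.
        public class AllocationPressure {

            // Boxed accumulator: every "total += i" unboxes, adds, and re-boxes,
            // generating garbage on each iteration.
            static long sumBoxed(int n) {
                Long total = 0L;
                for (int i = 0; i < n; i++) {
                    total += i;
                }
                return total;
            }

            // Primitive accumulator: same arithmetic, no allocation in the loop.
            static long sumPrimitive(int n) {
                long total = 0L;
                for (int i = 0; i < n; i++) {
                    total += i;
                }
                return total;
            }

            public static void main(String[] args) {
                int n = 50_000_000;
                long t0 = System.nanoTime();
                sumBoxed(n);
                long t1 = System.nanoTime();
                sumPrimitive(n);
                long t2 = System.nanoTime();
                System.out.printf("boxed: %d ms, primitive: %d ms%n",
                        (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000);
            }
        }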

    Personally, I'm fine with this as there will always be career opportunities in rewriting/optimizing someone else's crappy code.

  • by rev0lt ( 1950662 ) on Friday April 13, 2012 @10:15PM (#39682191)
    Most "modern" (3 year old and newer) machines do have Gigabit connectors, so why not use them? On local networks, there are several advantages:

    1) reduced latency (someone else has mentioned it) - it really helps some applications a lot;
    2) less time loading roaming profiles / less time spent refreshing network shares;
    3) increased bandwidth (even at 100Mbit) - Gigabit gear is usually more error-resistant and implements smarter, faster error correction;
    4) inter-department high-speed synchronization - good for replicating storage, machine snapshotting/CDP, distributed filesystems and such;
    5) instant 10x speed upgrade on recent infrastructure, since 1000BASE-T runs over Cat5 cabling (nothing to scrap except the switches).

    My internet connection alone has 120Mbps downstream. And yes, I use it.
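    If you want to sanity-check what a link actually delivers, here's a rough sketch (the port and transfer size are arbitrary, it measures loopback unless you split sender and receiver across two hosts, and a tool like iperf is the proper way to do this):

        // Rough throughput check; assumptions noted above.
        import java.io.InputStream;
        import java.io.OutputStream;
        import java.net.ServerSocket;
        import java.net.Socket;

        public class ThroughputCheck {
            public static void main(String[] args) throws Exception {
                try (ServerSocket server = new ServerSocket(9000)) {
                    // Sender runs in another thread so the sketch is self-contained;
                    // across a real LAN you'd run sender and receiver on two hosts.
                    new Thread(ThroughputCheck::send).start();
                    try (Socket sock = server.accept();
                         InputStream in = sock.getInputStream()) {
                        byte[] buf = new byte[64 * 1024];
                        long total = 0;
                        long t0 = System.nanoTime();
                        for (int r; (r = in.read(buf)) != -1; ) {
                            total += r; // count bytes until the sender closes
                        }
                        double secs = (System.nanoTime() - t0) / 1e9;
                        System.out.printf("%.0f Mbit/s%n", total * 8 / secs / 1e6);
                    }
                }
            }

            static void send() {
                try (Socket sock = new Socket("localhost", 9000);
                     OutputStream out = sock.getOutputStream()) {
                    byte[] buf = new byte[64 * 1024];
                    for (int i = 0; i < 16_000; i++) {
                        out.write(buf); // ~1 GB total, enough to smooth out startup cost
                    }
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            }
        }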
  • by HarrySquatter ( 1698416 ) on Saturday April 14, 2012 @08:19AM (#39684285)

    No, it had plenty of bugs before the Ajax rewrite. Like the longstanding pagination bugs that they eventually just gave up on.
