The Quiet Before the Next IT Revolution

snydeq writes: Now that the technologies behind our servers and networks have stabilized, IT can look forward to a different kind of constant change, writes Paul Venezia. "In IT, we are actually seeing a bit of stasis. I don't mean that the IT world isn't moving at the speed of light — it is — but the technologies we use in our corporate data centers have progressed to the point where we can leave them be for the foreseeable future without worry that they will cause blocking problems in other areas of the infrastructure. What all this means for IT is not that we can finally sit back and take a break after decades of turbulence, but that we can now focus less on the foundational elements of IT and more on the refinements. ... In essence, we have finally built the transcontinental railroad, and now we can use it to completely transform our Wild West."
  • by msobkow ( 48369 ) on Wednesday August 13, 2014 @02:26AM (#47660809) Homepage Journal

    The article is a rather simplistic, hardware-centric viewpoint. It doesn't even begin to touch on the areas where IT has always struggled: design, coding, debugging, and deployment. It ignores software development entirely, and instead bleats about how we can "roll back" servers with the click of a button in a virtual environment.

    Which, of course, conveniently ignores the fact that someone has to write the code that runs in those virtual servers, debug it, test it, integrate it, package it, and ship it. Should it be an upgrade to an existing service/server, add in the overhead of designing, coding, and testing the database migration scripts for it, and coordinating the deployments of the application virtual servers with the database servers (a sketch of such a migration script follows this comment).

    Are things easier than they used to be? Perhaps for the basic system administration tasks.

    But those have never been where the bulk of time and budget go.
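
    As a rough illustration of the software side the article skips, here is a minimal sketch of the kind of migration script being described, using Python's built-in sqlite3. It is hypothetical, not any particular tool: the migration list and the schema_version bookkeeping table are illustrative assumptions.

        import sqlite3

        # Ordered, named migrations; each runs at most once per database.
        MIGRATIONS = [
            ("0001_create_users",
             "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)"),
            ("0002_add_email",
             "ALTER TABLE users ADD COLUMN email TEXT"),
        ]

        def migrate(db_path):
            conn = sqlite3.connect(db_path)
            conn.execute("CREATE TABLE IF NOT EXISTS schema_version (name TEXT PRIMARY KEY)")
            applied = {row[0] for row in conn.execute("SELECT name FROM schema_version")}
            for name, sql in MIGRATIONS:
                if name in applied:
                    continue  # already applied during an earlier deployment
                with conn:  # each migration commits (or rolls back) atomically
                    conn.execute(sql)
                    conn.execute("INSERT INTO schema_version (name) VALUES (?)", (name,))
            conn.close()

        migrate("app.db")

    Recording applied migrations in the database itself is what makes coordinating the app-server and database-server deployments tractable at all.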

    • by jemmyw ( 624065 ) on Wednesday August 13, 2014 @02:35AM (#47660825)

      Indeed, and virtualization is a rapidly evolving part of infrastructure right now. We may no longer be upgrading the hardware as rapidly (although I'm not certain about that either), but the virtual layer and tools are changing, and upgrading those requires just as much upheaval.

      • Virtualization is a pain in the ass. Want a new prod server? *click*
        Want a new dev environment? *click*
        Want a new db server? *click*
        Need an FTP server? *click*
        Need an HTTP server? *click*

        Before you know it, deploying a small software change becomes a big deal because you have a billion bloody servers to update.
        Before virtualization (or at least the ease of virtualization) you took your time and planned - checked available resources, etc. Resources were scarce, RAM wasn't so abundant,
      • It seems too many forget that all this virtualization still runs on physical servers. Those physical servers still need hardware upgrades, monitoring, and resource management (especially when one starts oversubscribing). I don't get why people keep thinking hardware went away. Instead of lots of 1U servers, now you have big iron running lots of virtual servers.

        • by Hadlock ( 143607 )

          Yes, but now you have one, maybe two (hopefully super-smart) guys onsite with deep systems knowledge, instead of a fleet of screwdriver-wielding guys with an A+ certification who are as likely as not to screw up your system. Once it's up and running you just have to keep that machine and its backup going, and everyone can build on top of that in software, from anywhere in the world.

        • Nobody is forgetting that, because it's now partially irrelevant. Need to upgrade RAM in a server? Migrate the VMs to another, shut it down, upgrade, turn it back on. Have a server catch fire and die? HA has already migrated the VMs for you. Getting low on RAM, or CPU hits 100%? Look! An alert!

          Hardware does still matter, but it's no longer something that must be watched closely and in fear.
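
          As a sketch of that "migrate, then upgrade" step, with the libvirt Python bindings (the host URIs and guest name are made-up assumptions, and error handling is omitted):

              import libvirt

              # connect to the host being drained for the RAM upgrade, and its peer
              src = libvirt.open("qemu+ssh://host-a.example.com/system")
              dst = libvirt.open("qemu+ssh://host-b.example.com/system")

              dom = src.lookupByName("prod-web-01")  # hypothetical guest
              # VIR_MIGRATE_LIVE keeps the guest running during the move
              dom.migrate(dst, libvirt.VIR_MIGRATE_LIVE, None, None, 0)
              # host-a is now empty: shut it down, swap the RAM, power it back on
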
        • by sjames ( 1099 )

          Of course it didn't go away, but it did get a lot easier to maintain.

          I remember very well, back in the bad old days, that white-knuckle time between telling the remote server to reboot with a new kernel and ping starting again. And of course the advance setup, where you make the old kernel the default in hopes that, if it all goes sideways, you can call and find someone on-site who can manage to find and press reset should it hang or have some random problem that keeps it off the net.

          Then came nicer setups whe
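
          That white-knuckle wait is at least easy to script. A throwaway sketch of the "watch for ping to come back" step, assuming a Linux box with the standard ping binary (the hostname and ten-minute deadline are arbitrary):

              import subprocess, sys, time

              HOST = "remote-server.example.com"  # hypothetical
              deadline = time.time() + 600

              while time.time() < deadline:
                  # one echo request, two-second timeout; returncode 0 means it answered
                  if subprocess.run(["ping", "-c", "1", "-W", "2", HOST],
                                    stdout=subprocess.DEVNULL).returncode == 0:
                      print("host is back up")
                      sys.exit(0)
                  time.sleep(5)

              print("host never came back -- find someone near the reset button")
              sys.exit(1)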

    • by khasim ( 1285 ) <brandioch.conner@gmail.com> on Wednesday August 13, 2014 @03:17AM (#47660949)

      Are things easier than they used to be? Perhaps for the basic system administration tasks.

      But those have never been where the bulk of time and budget go.

      They could be, if you did not know what you were doing, as I suspect the author of TFA did not.

      From TFA:

      Where we once walked on tightropes every day doing basic server maintenance, we are now afforded nearly instant undo buttons, as snapshots of virtual servers allow us to roll back server updates and changes with a click.

      If he's talking about a production system then he's an idiot.

      If he's talking about a test system then what does it matter? The time spent running the tests was a lot longer than the time spent restoring a system if any of those tests failed.

      And finally:

      Within the course of a decade or so, we saw networking technology progress from 10Base-2 to 10Base-T, to 100Base-T to Gigabit Ethernet. Each leap required systemic changes in the data center and in the corporate network.

      WTF is 10Base-2 doing there? I haven't seen that since the mid-90's. Meanwhile, every PC that I've seen in the last 10 years has had built-in gigabit Ethernet.

      If he wants to talk about hardware then he needs to talk about things like Cisco Nexus. And even that is not "new".

      And, as you pointed out, the PROGRAMMING aspects always lag way behind the physical aspects. And writing good code is as difficult today as it has ever been.
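
      For reference, the "instant undo button" TFA praises (and this comment cautions about) is roughly the following, shown with the libvirt Python bindings; the guest name is a made-up assumption, and on a production system none of this protects in-flight transactions or connected clients:

          import libvirt

          conn = libvirt.open("qemu:///system")
          dom = conn.lookupByName("test-box")  # hypothetical guest

          # take a snapshot before a risky change...
          snap = dom.snapshotCreateXML(
              "<domainsnapshot><name>pre-update</name></domainsnapshot>", 0)

          # ...and the one-click rollback if the update goes sideways
          dom.revertToSnapshot(snap, 0)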

      • WTF is 10Base-2 doing there? I haven't seen that since the mid-90's.

        That was probably the "or so" that came after the word 'decade'.

        A "decade or so" could be taken to mean 2 of them, which puts it back in the mid 1990's.

        I think you're just being Captain Pedantic, when all the GP was really trying to say was that things move pretty quickly in IT.

      • by Iamthecheese ( 1264298 ) on Wednesday August 13, 2014 @07:16AM (#47661697)
        Writing good code isn't hard at all. What's hard is programming well when you're on the fifth "all hands on deck" rush job this year, you have two years of experience and no training because your company was too cheap to pay a decent wage or train you, a humiliating and useless performance review is just 'round the corner, and you doubt anything you type will end up in the final product. The problem is a widespread cultural one. When IT companies are willing to spend the time and money for consistent quality, that's when they'll start to put out quality products.
        • Re: (Score:3, Insightful)

          Agreed 100%. There is never time to do it right, but there is always time to do it over. Reviews that admit success but celebrate weakness are not positive experiences. There is another trend of third parties marketing infrastructure solutions to high-level management, skipping local subject matter experts. This triples the work we have to do. Change is fine, and embraced, but we are paid for something: provide stability and compliance in a rapidly evolving, globalized environment.
          • > There is another trend of third parties marketing infrastructure solutions to high level management, skipping local subject matter experts.

            Trend? I thought this was SOP for sales, because SMEs kill sales. This has been known by Salesmen and SMEs since time immemorial.

      • Where we once walked on tightropes every day doing basic server maintenance, we are now afforded nearly instant undo buttons, as snapshots of virtual servers allow us to roll back server updates and changes with a click.

        If he's talking about a production system then he's an idiot.

        Why? Is it your contention that the work of sysadmins and support personnel has just been trouble-free for decades, and all the problems were caused by a sysadmin "not knowing what they were doing"?

      • by sjames ( 1099 )

        WTF is 10Base-2 doing there? I haven't seen that since the mid-90's. Meanwhile, every PC that I've seen in the last 10 years has had built-in gigabit Ethernet.

        Right, within a decade, the '90s to be specific, all of those network technologies were in use at some point.

    • by dbIII ( 701233 )

      and instead bleats about how we can "roll back" servers with the click of a button in a virtual environment.

      Meanwhile, I'd like to be able to turn clusters into a virtual server instead of having to code specifically for clusters. Something like OpenMosix was starting to do that before it imploded: make several machines look like one big machine to applications designed to run only on single machines.

      • by Junta ( 36770 )

        www.scalemp.com does what you request.

        It's not exactly all warm and fuzzy. Things are much improved from the Mosix days in terms of having the right available data and kernel scheduling behaviors (largely thanks to the rise of NUMA architecture as the usual system design). However there is a simple reality that the server to server interconnect is still massively higher latency and lower bandwidth than QPI or HyperTransport. So if a 'single system' application is executed designed around assumptions of n

        • by dbIII ( 701233 )
          Cool, I'll definitely look into that. I thought that line of thought had been abandoned.
    • by SuricouRaven ( 1897204 ) on Wednesday August 13, 2014 @05:06AM (#47661335)

      Even software is slowing down, though. A lot of commodity software reached the point of 'good enough' years ago - look how long it's taken to get away from XP, and still many organisations continue to use it. The same is true of office suites: most people don't use any feature that wasn't already in Office 95. Updating software has gone from an essential part of the life cycle to something that only needs to be done every five years, sometimes longer.

      Around 2025 we will probably see a repeat of the XP situation as Microsoft tries desperately to get rid of the vast installed base of Windows 7, and organisations point out that what they have been using for the last decade works fine so they have no reason to upgrade.

      • by mysidia ( 191772 )

        Updating software has gone from an essential part of the life cycle to something that only needs to be done every five years, sometimes longer.

        Not if you actually care about security. Older software, and operating systems prior to Windows 8, don't support newer, more effective attack-mitigation approaches such as ASR.

      • A lot of the commodity software reached the point of 'good enough' years ago - look how long it's taken to get away from XP, and still many organisations continue to use it.

        I find it hard to believe that operating systems became "good enough" with Windows XP. Rather, Vista took so long to come out that it disrupted the established upgrade cycle. If the previous 2-to-3-year cycle had continued, Vista would have come out in 2003 (without as many changes, obviously), Windows 7 in 2005 and Windows 8 in 2007. We'd be on something like Windows 12 by now.

        It's good that consumers are more aware and critical of forced obsolescence, but I don't agree with the "XP is good enough" crowd. I

    • Well beyond hardware, software reliability over the past few decades has shot right up.

      Even Windows is very stable and secure. Over the past decade I have actually seen more kernel panics from Linux than BSODs. We can keep servers running for months or years without a reboot. Our desktops, laptops, and even mobile devices now perform without crashing all the time, and we work without feeling the need to save to the hard drive and then back up to a floppy or removable media every time.

      What changes have happened se

      • by Junta ( 36770 )

        Software reliability over the past few decades has shot right up.

        I think this is a questionable premise.

        1) Accurate, though it has been accurate for over a decade now.
        2) Things have improved security-wise, but reliability, I think, could be another matter. When things go off the rails, it's now less likely to let an adversary take advantage of that circumstance.

        3) Try/Catch is a potent tool (depending on the implementation it can come at a cost), but the same things that caused 'segmentation faults' with a serviceable stack trace in a core file cause uncaught exceptions with a

    • The article is a rather simplistic, hardware-centric viewpoint. It doesn't even begin to touch on the areas where IT has always struggled: design, coding, debugging, and deployment. It ignores software development entirely, and instead bleats about how we can "roll back" servers with the click of a button in a virtual environment.

      And now is when we have a long and stupid debate as to whether the term "IT" signifies a grouping of all computer-related work including development, or whether it's limited to workstation/server/network design, deployment, and support. And we go on with this debate for a long time, becoming increasingly irate, arguing about whether developers or sysadmins do more of the 'real' work, and...

      Let's just skip to the end and agree that, regardless of whether IT 'really' includes software development, it's pret

    • by jon3k ( 691256 )
      I kind of agree with TFA here -- hear me out. We went through a pretty fundamental shift in the datacenter over the last 10 years or so, and it's finally settling down. Of course there will be constant evolutionary progressions, updates, patches, etc., but we're basically done totally reinventing the datacenter. 10GbE, virtualization, the rise of SANs and converged data/storage, along with public/private/hybrid clouds - these huge transformative shifts have mostly happened already and we're settling into
    • The article starts with the observation that the hardware bottleneck is mostly gone: if you can afford to supply basic coffee to your employees, the IT hardware doesn't cost much more than that. Contrast that with 1991, when the PC on my desk cost 2 months of my salary and our "network" was a 4-line phone sitting next to it (modems came to our office 5 years later).

      Then, let's dream about what's next... you can dream, can't you?

    • by sje397 ( 2034296 )
      Huh? We just get monkeys to type in random things, and roll back if and when there's a bug.
  • by felixrising ( 1135205 ) on Wednesday August 13, 2014 @02:40AM (#47660847)
    I assume you are talking about the hardware... because once you have a "private cloud", the next step is moving away from setting up servers and configuring the applications manually, and getting into full-on DevOps-style dynamically scaled virtual workloads that are completely stood up and torn down (the VMs and their applications, the network configuration including "micro networks" and firewall rules) according to the demands of the customers accessing the systems.. those same workloads can move anywhere from your own infrastructure to leased private infrastructure to public infrastructure without any input from you... of course, none of this is new... but it's certainly a paradigm shift in the way we manage and view our infrastructure... hardly something static or settled. Really this is a fast-moving area that is hard to keep up with.
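
    At its crudest, the stand-up/tear-down loop described above reduces to something like this sketch. get_load, provision_worker, and teardown_worker are hypothetical hooks standing in for whatever orchestration API a given shop uses, and the thresholds are illustrative assumptions:

        import time

        SCALE_UP_AT, SCALE_DOWN_AT = 0.8, 0.2

        def autoscale(get_load, provision_worker, teardown_worker, workers):
            while True:
                load = get_load()  # fraction of capacity in use, 0.0-1.0
                if load > SCALE_UP_AT:
                    workers.append(provision_worker())   # stand a workload up
                elif load < SCALE_DOWN_AT and len(workers) > 1:
                    teardown_worker(workers.pop())       # tear one back down
                time.sleep(30)
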
  • As soon as we have 8K video commonly available, which could be as soon as 2020 if Japan gets to host the Olympic Games, we will run out of storage and out of bandwidth, and there is not even a standard for an optical disc that can hold the data at the moment. So our period of rest will not be too long.
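
    A back-of-the-envelope check of that claim, assuming 8K at 60 frames per second, 10-bit samples, and 4:2:0 chroma subsampling (all assumptions, since no 8K delivery standard existed at the time):

        # raw (uncompressed) 8K data rate
        width, height = 7680, 4320
        bits_per_sec = width * height * 1.5 * 10 * 60   # 4:2:0 -> ~1.5 samples/pixel
        print(f"raw 8K/60: {bits_per_sec / 1e9:.1f} Gbit/s")   # ~29.9 Gbit/s
        # even at a generous 300:1 codec ratio that is ~100 Mbit/s sustained --
        # beyond a 2014-era Blu-ray and most home links
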
    • by Anonymous Coward

      By then we will have the infamous Sloot Digital Coding System that will encode an entire movie down to 8KB so what are you so worried about?
      http://en.wikipedia.org/wiki/Jan_Sloot

    • The only problem there is that it is, for most purposes, pointless. Most people would be hard-pressed to tell 720p from 1080p on their large TV under normal viewing conditions. What we see really is a placebo effect, similar to the one that plagues audiophile judgement: when you've paid a heap of cash for something, it's going to sound subjectively better.

      • Personally, when reading PDF articles, I find 4K makes a world of difference over 1080p. The aliasing is almost invisible in 4K, and thin diagonal lines in plots are much more clearly defined. Because this difference is so large, I have little doubt that 8K will also make a difference, although maybe not as much as the jump from 2K to 4K. I have not yet had the pleasure of comparing video on 1080p and 4K.
        • It's easy to tell the guys from IT. Everybody else reads the PDF files for their content. The IT guy looks for artifacts in the font rendering.

      • by Lumpy ( 12016 )

        Exactly, yet you will have a TON of people claiming they can tell the difference. In reality they cannot.

        99% of the people out there sit too far away from their TV to see 1080p, unless they have a 60" or larger set and sit within 8 feet of it. The people that have the TV above the fireplace and sit 12 feet away might as well have a standard-def set.

        But the same people that CLAIM they can see 1080p from their TV 10 feet away also claim that their $120 HDMI cables give them a clearer picture.
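
        The 60-inch/8-foot figure checks out against the usual (simplified) 20/20-vision rule of about one arcminute of resolvable detail:

            import math

            diag_in, px_w = 60, 1920                     # 60" 16:9 set at 1080p
            width_in = diag_in * 16 / math.hypot(16, 9)  # screen width in inches
            pitch = width_in / px_w                      # pixel pitch in inches
            one_arcmin = math.radians(1 / 60)

            # farthest distance at which one pixel still subtends a full arcminute
            print(f"{pitch / one_arcmin / 12:.1f} feet")  # ~7.8 ft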

  • BS has arrived at Slashdot.
  • In recent decades we've been eyewitnesses to revolutionary breakthroughs in fields such as energy, transportation, healthcare, and the space industry, to name a few. The technologies that emerged are nowadays pretty much ubiquitous and impossible to go without in day-to-day life. Yet the hardware IT industry is stuck with Moore's law and silicon, and there's even an embarrassing retreat to functional programming on the software side.
  • Now standardize all your password requirements to a strength-based system without arbitrary restrictions or requirements, and standardize your forms' metadata so that they can be auto-completed or intelligently suggested based on information entered previously on a different website. Trust me, this sort of refinement will be greatly appreciated.
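
    As a sketch of what "strength-based, no arbitrary rules" could mean: score estimated entropy and nothing else, so a long all-lowercase passphrase passes while forced digits and symbols earn nothing special. The uniform-pool estimate is deliberately crude and the 60-bit threshold is an illustrative assumption; a real site would want a trained estimator.

        import math, string

        def strength_bits(pw):
            pool = 0
            if any(c in string.ascii_lowercase for c in pw): pool += 26
            if any(c in string.ascii_uppercase for c in pw): pool += 26
            if any(c in string.digits for c in pw): pool += 10
            if any(c not in string.ascii_letters + string.digits for c in pw): pool += 33
            return len(pw) * math.log2(pool) if pool else 0.0

        def acceptable(pw, minimum_bits=60):
            # no "must contain a digit" rule: length and variety are all that count
            return strength_bits(pw) >= minimum_bits

        print(acceptable("correct horse battery staple"))  # True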

  • I don't subscribe to this rose-tinted point of view, especially if you look at all this beautiful tech from the security standpoint.
    Most of the tech we deal with today was originally designed without security concerns. In most cases, security is an afterthought.
    So much for sitting back and taking a break.

    • Yeah, "... In essence, we have finally built the transcontinental railroad, and now we can use it to completely transform our Wild West."

      There are tunnels filled with dynamite all underneath the track just waiting for some wild west yahoo to push the detonator.

      Once a month, Microsoft issues security blankets in an effort to hide them.

      Until that problem is solved at the roadbed, no one's going to get a good night's sleep because those bird whistles you hear are not authentic, Kemo Sabe.

  • Moore's law has run out of steam. Yay!

    • I think it's more about the end of the MHz wars. Nowadays, to get more power, you add more cores. If you can't do that, you add more boxes.

      If you've got a single-threaded, million-instruction blob of code, it's not executing very much faster today than it was a few years ago. If you're able to break it into a dozen pieces, though, you can execute it faster and cheaper now than you could a few years ago.

      Moore's law hasn't really run out of steam; it's more that its rules have changed a bit - the raw power
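
      Amdahl's law (a standard result, not something from TFA) puts numbers on that: with parallel fraction p of the work spread over n cores, speedup = 1 / ((1 - p) + p/n).

          def speedup(p, n):
              return 1.0 / ((1.0 - p) + p / n)

          for p in (0.0, 0.5, 0.95):
              print(f"p={p:.2f}: 12 cores -> {speedup(p, 12):.1f}x")
          # p=0.00 -> 1.0x (the single-threaded blob), p=0.50 -> 1.8x, p=0.95 -> 7.7x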

      • by Lumpy ( 12016 )

        You can have 256 cores at 500MHz, and a 4-core 5GHz system will be a lot snappier and faster, because most of what is used for computing does not scale to multiple cores easily.

        I will take a 2-core 5GHz system over a 4-core 3GHz system any day.

      • Moore's law was about one node shrink every 18 months, meaning a reduction in structure sizes by sqrt(2), i.e. twice the number of transistors at the same die size. The reduction in size meant a reduction in gate thickness and operating voltage by sqrt(2), and a reduction of capacitances by a factor of two. Those allowed an increase in clock speed of sqrt(2) at constant power. None of that is happening any more.
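
      Spelling those per-node-shrink factors out (this is classic Dennard scaling as summarized in the comment above, not anything from TFA):

          import math

          k = math.sqrt(2)                              # linear shrink per node
          print(f"transistors per die:  x{k**2:.0f}")   # area scales as k^2 -> 2x
          print(f"voltage, gate size:   /{k:.2f}")      # each divided by sqrt(2)
          print(f"capacitance:          /2")            # falls by a factor of two
          print(f"clock speed:          x{k:.2f}")      # at roughly constant power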

  • Wrong (Score:2, Insightful)

    by idji ( 984038 )
    No, you IT people are no longer the great revolutionists - your time is gone. You are now just plumbers, who need to fix the infrastructure when it are broken. Other than that, we don't want to hear from you, and we certainly don't want your veto on our business decisions - that is why a lot of us business people use the cloud, because the cloud doesn't say "can't work, takes X months, and I need X M$ to set it up", but is running tomorrow out of operational budget.
    • by ruir ( 2709173 )
      Sure, the cloud runs on gremlins, fuck yeah. I guess you also don't care what your mechanic says and use "the garage", and don't care what your dentist says and go there once every 5 years. If you do not care what professionals advise you, you are an idiot and do not deserve competent people working for/with you. Douche bag.
      • With the increased reliability of modern cars, people do make fewer trips to the garage. So it's not unlikely that a car won't be in the garage more than once every 5 years.

        I guess the same fact being true for IT really bugs you. The IT drones where I work are right now in a tizzy because the corporate IT people in Mexico are taking over. Because they can, and it saves a lot of $$, and also because the local fucks just aren't needed much anymore. There's no need for a guy to clean the lint out, all the mice are o

        • by ruir ( 2709173 )
          Give me a car that is in the garage only once in 5 years, and I won't mind paying the price tag. There are IT drones/fucks and then there are IT people, and guess what, I do not belong to the former. There are fucks in any job, yours included: people that got all cozy, did not have the vision to get ahead, keep up with the times, and get away from the menial tasks. But guess what, the stuff does not automate itself and does not run alone. And when it seems to run alone, it is because people are
    • by fisted ( 2295862 )
      So essentially you're saying that you, as a technically illiterate person, don't give a crap about the opinion of your sysadmin in technical questions.
      Oh, wait, you've already mentioned you're a business person. Enjoy your Dunning-Kruger while it lasts.

      need to fix the infrastructure when it are broken.

      Shall we fix your understanding of the English language while we're at it? Or would that be too mission-critical a business decision?

      • by ruir ( 2709173 )
        No problem with that; we also think he is an even bigger and more useless idiot, and we will move on to better places ASAP.
    • by Lumpy ( 12016 )

      Awesome, we need to join the plumbers union and start getting $125 an hour then. Thanks for your support!

  • Oh lookie... (Score:4, Informative)

    by roger10-4 ( 3654435 ) on Wednesday August 13, 2014 @04:40AM (#47661227)
    Another submission of a superficial article from snydeq to drum up traffic for InfoWorld.
  • ...right before the next, undreamed-of computing revolution knocks everyone on their ass.

    • Don't be too hard on the guy. This is why evolution invokes death. Just about the time you get it all figured out and decide it's not worth going in and punching your time clock, nature punches yours.
  • by CaptainOfSpray ( 1229754 ) on Wednesday August 13, 2014 @05:03AM (#47661319)
    ...like a dinosaur in the last days before the meteor. The future is over there in the Makerspaces, where 3D printing, embedded stuff, robotics, CNC machines, and homebrew PCBs at dirt-cheap prices are happening. It's all growing like weeds, crossing the boundaries between all disciplines including art, and it is an essential precursor to the next Industrial Revolution, in which you and your giant installations will be completely bypassed.

    You, sir, are a buggy-whip manufacturer (as well as a dinosaur).
    • Seems like you are talking about general computing and related applications of computing. This guy is talking about business IT, which is a tiny subset of computing.

      It would not be inappropriate to mod parent off topic for posting a generalist reply to something written for a specific audience.

      • Not off-topic. TFA is claiming to know where "the next IT revolution" is coming from, and I'm saying he is looking in exactly the wrong direction.
  • by Anonymous Coward

    What the -- ?? "the technologies behind our servers and networks have stabilized" -- when did this happen? I'm not a datacenter person, but isn't the world filled with competing cloud providers with different APIs, and things like OpenStack? Did all this stuff settle down while I wasn't paying attention?

  • I think the following would be a better way of looking at what this article is on about.
    Back in the late '80s and early '90s, when I graduated and started my career in the networking industry, the OSI 7-layer model (https://en.wikipedia.org/wiki/OSI_model) was often referred to. You don't hear it mentioned much these days.
    If you applied IT history and economics to it, you'd find that each of those layers saw a period of fantastic growth & innovation (a few short years) before becoming an IT commodity and having little valu

  • by Charliemopps ( 1157495 ) on Wednesday August 13, 2014 @07:36AM (#47661769)

    Now that the technologies behind our servers and networks have stabilized, IT can look forward to a different kind of constant change, writes Paul Venezia.

    I don't think Paul Venezia works in IT.

  • Translation: Bandwidth and ubiquitous connectivity, along with a generation trained to have no privacy are in place. Let the police state begin.

    If you think things like rural electrification are about helping people, you have your head in the sand.

  • by Anonymous Coward

    At the risk of pissing off some folks, I must say I've worked in IT since before it was called IT, and I can honestly say no revolutions will come from that area. After all, IT isn't known for its innovative, R&D-driven atmosphere. IT is the result of cramming middle management, contractors, and novice-to-mediocre developers together in cubicles. Sure, it's a steady paying job, which is why most of us do it. The revolutionary stuff will continue to come from those who have the luxury of choosing not to

    • by ruir ( 2709173 )
      If you are still doing it in cubicles, you are doing something wrong. From a fellow IT guy...
  • The ongoing technology churn we've seen in the last decade is *not* a feature of a revolution in progress that may be coming to an end; it's a reflection of stagnation in technology, with no ideal data centre technology (at least in terms of software) having achieved any kind of dominance. There's been an endless parade of new web technologies, none of which is more than an ugly hack on HTML. Websites are better than they were twenty years ago, but certainly not 20 years' worth of progress better.

  • by labradort ( 220776 ) on Wednesday August 13, 2014 @11:25AM (#47663267)

    The concept is false. Things have changed in how they break and what we are concerned about on a daily basis. 10 years ago I didn't have compromised accounts to worry about every day. But I did spend more time dealing with hard drive failure and recovery. We are still busy with new problems and can't just walk off and let the systems handle it.

    If you believe IT is like running your Android device, then yes, there is little to be done other than pick your apps and click away. If you have some security awareness, you know there is much going on to be concerned about. When the maker of a leading anti-virus product declares AV detection dead, it is time to be proactive in looking at the problem. Too many IT folk believe that if there is malware, it will announce itself. Good luck with that assumption.

    • by ruir ( 2709173 )
      We don't need an AV maker to declare it dead; it has been a zombie for decades now.
