Building an IT Infrastructure Today vs. 10 Years Ago
rjupstate sends an article comparing how an IT infrastructure would be built today with how one was built a decade ago.
"Easily the biggest (and most expensive) task was connecting all the facilities together. Most of the residential facilities had just a couple of PCs in the staff office and one PC for clients to use. Larger programs that shared office space also shared a network resources and server space. There was, however, no connectivity between each site -- something my team resolved with a mix of solutions including site-to-site VPN. This made centralizing all other resources possible and it was the foundation for every other project that we took on. While you could argue this is still a core need today, there's also a compelling argument that it isn't. The residential facilities had very modest computing needs -- entering case notes, maintaining log books, documenting medication adherence, and reviewing or updating treatment plans. It's easy to contemplate these tasks being accomplished completely from a smartphone or tablet rather than a desktop PC."
How has your approach (or your IT department's approach) changed in the past ten years?
You don't build it (Score:5, Funny)
Re:You don't build it (Score:5, Insightful)
The cloud is fine and dandy until Microsoft Azure is unreachable for several hours ... again ...
http://www.theregister.co.uk/2013/11/21/azure_blips_offline_again/ [theregister.co.uk]
Re: (Score:1)
Doesn't matter. The only important consideration is that no one gets fired for going with the Microsoft solution.
Re: (Score:2)
Re: (Score:2)
The cloud is fine and dandy until Microsoft Azure is unreachable for several hours ... again ...
The cloud does not mean Azure.
The cloud means something like Rackspace, Softlayer, Slicehost, Linode, BuyVM, DigitalOcean.
There are plenty of other hosting providers that don't have a 4 hour outage every 3 months.
Re: (Score:2)
Re: (Score:2)
Don't forget to buy new iPads for all your employees as well so that they can get more work done!
Re: (Score:2)
From what/where/whom?
expect to allow intrusive oversight (Score:2, Insightful)
not much else has changed
There are limitations (Score:2)
Most enterprises rely upon one or more software packages from a vendor, often for critical functions. You can only do what your vendor's software allows. Not everything is tablet friendly or cloud happy.
Re: (Score:1)
HIPAA Privacy Rules (Score:2)
I believe these came into effect about 10 years ago. So aside from all the advances in "the cloud", I'd ask whether that would be secure enough -- and not just ask a bunch of Slashdotters. Ask the potential cloud providers if they are HIPAA compliant and can provide documentation to that effect.
Use GMail for transferring medical records and I'll guarantee you'll be swamped with ads for everything from Vi@gr@ to funeral services.
Re: (Score:3)
If only there were some way to look this up:
Re: (Score:3)
The effective compliance date of the Privacy Rule was April 14, 2003 with a one-year extension for certain "small plans"
Or pretty much 10 years ago.
Re: (Score:2)
press release [vmware.com]
How HIPAA works [hhs.gov]
Re: (Score:2)
HIPAA Compliancy has insane requirements... armed guard 24-7
HIPAA makes no mention of 'armed guards.'
In a nutshell, HIPAA states that --
1) You must protect health data, whether it is digital or in a filing cabinet.
2) There are defined penalties for a breach of that data.
Typing will always be better on a PC (Score:3)
The residential facilities had very modest computing needs -- entering case notes, maintaining log books, documenting medication adherence, and reviewing or updating treatment plans. It's easy to contemplate these tasks being accomplished completely from a smartphone or tablet rather than a desktop PC.
And by the time you've paired an external keyboard in order to key in all that stuff, you might as well just use a laptop PC.
In addition, some cloud solutions make dedicated desktop application suites or specific configurations unnecessary today. Browser-based options or virtual desktops have added appeal in health organizations because data is less likely to be stored locally on a device.
That'd double an organization's spending on operating system licenses because a Terminal Server CAL for Windows Server costs about as much as a retail copy of Windows for the client.
Re: (Score:2, Insightful)
And by the time you've paired an external keyboard in order to key in all that stuff, you might as well just use a laptop PC.
And when all you have is a hammer, everything looks like a nail. Seriously, I don't use a tablet for my work functions, but I use a smartphone to get my emails on the road. But I am not everybody; I have different needs. Sometimes a laptop isn't the answer for everyone.
That'd double an organization's spending on operating system licenses because a Terminal Server CAL for Windows Server costs about as much as a retail copy of Windows for the client
First of all, who says that the organization requires Terminal Server to use a cloud based system? Also, "browser-based" means that the solution can be OS agnostic. For example, SalesForce. In fact, some people might have these things
Four seconds and my laptop is out of sleep (Score:2)
Yeah, pairing a Bluetooth keyboard with an iPad or a Nexus takes FOREVER. It's not like the connection is on in moments from a cold start, and remembered until/unless you break the pairing.
Sarcasm detected. But the fact is, when I have tried using a keyboard with one tablet and then another tablet, it broke the pairing. And if you use a ZAGGkeys Flex or any of several other brands of Bluetooth keyboard with unrooted Android 4.3, it'll pair but you won't be able to type because certain Broadcom chipsets are misrecognized as "nonalphabetic keyboards", that is, gamepads.
You might as well break out that laptop, wait 2-3 minutes for it to boot up into a usable state
When I open my laptop's lid, it takes all of four seconds to come out of sleep and get the unlock prompt up. Dell Inspiron mini
Android 4.4 fixes the keyboard bug (Score:2)
And if you use a ZAGGkeys Flex or any of several other brands of Bluetooth keyboard with unrooted Android 4.3, it'll pair but you won't be able to type
Android 4.4 fixes this. I tried it on my own Nexus 7.
Not much difference (Score:1)
Re:Not much difference (Score:5, Informative)
In 2002, Sarbanes-Oxley was passed, forcing companies to buy SANs just to store E-mail for long-term archiving.
For the most part, things have been fairly static, except with new buzzwords and somewhat new concepts. A few things that have changed:
1: Converged SAN fabric. Rather than have a FC switch and a network switch, people are moving to FCoE or just going back to tried and true iSCSI which doesn't require one to fuss around with zoning and such.
2: Deduplication. We had VMs in '03, but now, whole infrastructures use that, so having disk images on a partition where only one image is stored and only diffs are stored for other machines saves a lot of space.
3: RAID 6 becomes necessary. I/O speeds haven't grown as much as disk capacities, so the time it takes to rebuild a blown disk is pretty big. RAID 6 becomes a must so a degraded volume can survive another failure while it rebuilds.
4: People stop using tape and go with replication and more piles of hard disks for archiving. Loosely coupled SAN storage in a hot recovery center becomes a common practice to ensure SAN data is backed up... or at least accessible.
5: VMs use SAN snapshots for virus scanning. A rootkit can hide in memory, but any footprints on the disk will be found by the SAN controller running AV software and can be automatically rolled back.
6: We went from E-mailed Trojans, macro viruses, and attacks on firewalls and unprotected machines to having the Web browser being the main point of attack for malware intrusion. It has been stated on /. that ad servers have become instrumental in widespread infections.
7: The average desktop computer finally has separate user/admin access contexts. Before Vista, this was one and the same in Windows, allowing something to pwn a box quite easily.
8: The OS now has additional safeguards in place, be it SELinux, Window's Low security tokens, or otherwise. This way, something taking over a Web browser may not be able to seize a user's access context as easily.
9: BYOD has become an issue. Ten years ago, people fawned over RAZR-type devices and an IT person had a Bat Belt of devices: the digital camera, MP3 player, PDA, pager, cellphone, and the Blackberry for messaging. Around '05, Windows Mobile merged all of this into one device, and '07 brought us the iPhone, which made the masses desire one device, not a belt full.
10: Tablets went from embedded devices to sitting on desktops as big media consumption items.
11: Music piracy was rampant, so one threat was people adding unexpected "functionality" to DMZ servers by having them run P2P functionality (AudioGalaxy, eMule, etc.)
12: We did not have to have a Windows activation infrastructure and fabric in place, where machines had to have some internal access to a KMS box to keep running. XP and Windows Server 2003 had volume editions which once handed a key would update and were happy for good.
13: UNIX sendmail was often used for mail before virtually everyone switched over wholesale to Exchange.
14: Hard disk encryption was fairly rare. You had to find a utility like SafeBoot or use loopback encrypted partitions on the Linux side for data protection. This was after the NGTCB/Palladium fiasco, so TPM chips were not mainstream.
15: One still bought discrete hardware for hosts, because VMs were present for devs, but not really "earned their bones" in production. So, you would see plenty of 2-3U racks with SCSI drives in them for drive arrays.
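A toy illustration of point 2's dedup idea -- content-addressed chunks, where identical blocks across VM images are stored once and each image keeps only a recipe of chunk hashes (a minimal sketch under simplified assumptions, not any vendor's actual engine; real systems often use variable-size chunking):

```python
import hashlib

CHUNK = 4096  # fixed-size chunks for simplicity

def dedup_store(images):
    """Store many disk images; identical chunks are kept only once."""
    store = {}    # sha256 digest -> chunk bytes
    recipes = []  # per-image list of digests, enough to rebuild it
    for img in images:
        recipe = []
        for off in range(0, len(img), CHUNK):
            chunk = img[off:off + CHUNK]
            digest = hashlib.sha256(chunk).hexdigest()
            store.setdefault(digest, chunk)  # only the first copy is stored
            recipe.append(digest)
        recipes.append(recipe)
    return store, recipes

def rebuild(store, recipe):
    return b"".join(store[d] for d in recipe)

# Two VM images that share a common "golden" base: most chunks dedup away.
base = bytes(range(256)) * 64        # 16 KiB repetitive base image
vm_a = base + b"A" * CHUNK           # base plus one unique chunk
vm_b = base + b"B" * CHUNK
store, recipes = dedup_store([vm_a, vm_b])
raw = len(vm_a) + len(vm_b)
stored = sum(len(c) for c in store.values())
print(stored, raw)                   # prints: 12288 40960
assert rebuild(store, recipes[0]) == vm_a
```

Two full images come to 40 KB raw, but only three unique chunks actually hit disk, which is the whole point of storing one base image plus diffs.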
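Point 3's rebuild-time worry is easy to put numbers on. A back-of-the-envelope sketch (the 4 TB size and 120 MB/s throughput are illustrative assumptions, not measurements):

```python
def min_rebuild_hours(disk_bytes, seq_mb_per_s):
    """Lower bound on rebuild time: every sector of the replacement disk
    must be written at sequential speed. Real rebuilds are slower because
    the array is serving production I/O at the same time."""
    return disk_bytes / (seq_mb_per_s * 1e6) / 3600

four_tb = 4e12
print(round(min_rebuild_hours(four_tb, 120), 1))  # prints: 9.3
```

Nine-plus hours, best case, during which a RAID 5 array has zero redundancy; with RAID 6 the degraded volume can still absorb a second failure while it rebuilds.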
Things that have stayed the same, ironically enough:
1: Bandwidth on the WAN. The big changes came and went after initial offerings of cable and DSL. After that, bandwidth costs pretty much have not changed, except for more fees added.
2: Physical security. Other than the HID card and maybe the guard at the desk, data center physical security has not changed much. Some places might offer a fingerprint or iris scanner, but nothing new there that wasn't around in 2003. Only major di
Re: (Score:3)
Another big difference which relates to the list you mentioned: almost nobody runs their own in-house mail anymore. It's too expensive (in time and experience, mostly) to maintain efficiently and effectively, in no small part due to spam. Even larger organizations have decided it's not worth the headache.
If there is in-house hosting of mail, it's due to complex requirements and the headache that migration would be to another system. Many of these have also put in place either Google or Microsoft frontend fi
Re: (Score:3)
almost nobody runs their own in-house mail anymore.
My experience is different from yours. I work for an IT service consultancy and we're trying to push a lot of customers to cloud based email but they're all sticking to their guns. No-one around here likes the cloud for key business functions, and the NSA press is keeping them firmly entrenched in their views. For most companies (less than 1000 users) Exchange is trivial to setup and maintain, and can be supported part-time or by outsourced support. Over 1000 users then you have a big enough IT team to loo
Re: (Score:2)
My experience mirrors yours.
Even the PHBs want to keep company E-mail in-house for fear that a provider could later use their communications, stored for 7 years under SOX rules, against them.
I've seen some places tend to have their top brass on an in-house Exchange system, while lower levels might end up on Azure or a cloud provider.
Exchange is pretty easy to get up and running, especially if AD is in place. It has decent anti-spam filters that you can turn on out of the box for the edge server,
Re: (Score:2)
One minor problem, Exchange requires Microsoft...
Re: (Score:2)
Re: (Score:3)
I work in physical security, so will mention some changes that your site may not have implemented but which many larger sites have.
1) Granularity of access - Formerly if you had an access card it got you into the data center and from there you had free rein. Today the data center is (or should be) compartmentalized, and access to each area depends on need.
2) Rack Access - There are now several brands of hardware that control technicians' access to individual racks, including front and/or rear rack door.
3
Comment removed (Score:5, Funny)
Re:Same now as it was back then . . . (Score:4, Insightful)
That's good, but reality is more like...
Determine the deadline; if at all possible, don't consult anyone with experience building infrastructure.
Force commitment to the deadline, preferably with hints of performance-review impact.
Ensure purchasing compliance via your internal systems, which eat up a minimum of 30% to 40% of the remaining schedule.
Leave the equipment locked in a storage room for a week, just to make sure. Or have an overworked department be responsible for "moving" it; that's about a week anyway.
Put enormous amounts of pressure on the workers once the equipment arrives. Get your money's worth, make them sweat.
When it's obvious they can't achieve a working solution in the 30% of the allotted time that remains (due to other blockers), slip the schedule a month three days before the due date, because it isn't really needed until six months from now anyway.
That's how it is done today. No wonder people want to rush to the cloud.
actually 10 years ago (Score:3)
Well.... Quite a bit has happened. (Score:4, Interesting)
We've consolidated all office application servers to 5 data centers, one per continent. Then we've rolled out end-point backup for some 80,000 laptops in the field and some 150,000 more PCs around offices across the world, which includes legal hold capabilities. Each country in which we're active has a number of mobile device options for telephony, most of them being Android and Win8 based nowadays since WebOS got killed.
Then we're in the process of building a European infrastructure where we have data centers for managed customer environments in every major market in Europe. I am currently not aware of what's going on in APJ or South America. This is important in Europe however, because managed European customers don't want to see their data end up in the States, and the same goes for those that use our cloud offerings.
Physical local IT staff presence in all countries has been minimized to a skeleton crew, not only because of data center consolidation but also because of the formation of a global IT helpdesk in low-cost countries and the rise of self-service portals.
The plethora of databases we had internally has been archived using Application Information Optimizer for structured data archiving. We are our own biggest reference customer in this regard. On top of that we've beefed up our VPN access portals across the world so as to accommodate road warriors logging in from diverse locations.
Lastly, we use our own Records Management software suite to generate 8,000,000 unique records per day. These are archived for a particular retention period (7 years, I believe) for auditing purposes.
Re: (Score:2)
Sorry to burst your bubble there.
I'm afraid to say I've been bleeding blue (HP Blue, not Dell, IBM, eh... never mind) since the dawn of time.
Re: (Score:3)
Yes they are. I work in the Information Management software division as a pre-sales, and I'm pretty much paid to tell subsets of the above to customers.
- We are our own reference customer for Connected backup for end-points.
- We are our own reference customer for TRIM, now known as HP Records Manager 8.0
- We are our own reference customer for Database Archiving, now known as HP Application Information Optimiser
So all of that is publicly available in white-papers and case-studies.
The fact that we're building
Re: (Score:3)
In the field of physical security, I've seen customers with 10 independent access control systems scattered around their various facilities condense into a single centralized and monitored system. Access control system panels used to be connected serially to a "server" which was a cast-off desktop PC shoved under a janitor's desk, but now are actual servers in server rooms, monitored and backed up by IT staff, communicating with panels that might be on the other side of the planet.
Security video was analog
Virtualization (Score:5, Insightful)
Re: (Score:1)
Re: (Score:2)
Information security and adhering to all manner of certification, both in terms of physical security and compliance to information management regulation, is usually a lot more stringent in a decent (professional) cloud environment than in people's own data center.
I'd be inclined to disagree with your assessment of hosted infrastructure, although quite honestly I am apprehensive about going to the cloud myself.
Maybe it's a psychological thing.
Re: (Score:1)
How does adding a virtual machine (and another OS copy) in between the OS and the server program improve hardware utilization (unless you're a hosting company that has to give access to several unrelated entities while protecting them from each other, of course)?
I mean, it certainly improves flexibility. But I don't see how it improves hardware utilization.
Re: (Score:2)
How does adding a virtual machine (and another OS copy) in between the OS and the server program improve hardware utilization?
Ummm.... Because most servers running natively on dedicated hardware are coasting most of the time? You don't really understand virtualization, do you.
Re: (Score:1)
What stops you from installing several servers on the same machine without a virtualization layer in between?
Re: (Score:1)
Re: (Score:3)
Amen to this. I'd say it's the single most important change for network admins in the past 15 years. Our server farm went from a 7 foot stack of pizza boxes with disparate hardware and OSs that we were paying oodles to be parked in a server farm; to one public VM host on the cloud and one private VM host running on my boss's desktop.
Re: (Score:2)
What a fucking maroon.
I don't quite understand your post. Do you think the writer is dark brownish-red, or do you think they were abandoned on a desolate island?
Re: (Score:2)
He thinks the writer is in a Bugs Bunny cartoon.
Re: (Score:2)
I think the writing is actually ok, but the web site is certainly abysmal.
Well... (Score:1)
Virtualization and Backups: These go hand in hand. Virtualize then backup a server, if the hardware implodes run it on a toaster oven. This allows people to be more promiscuous with consumer grade hardware for three 9's applications, and thus enables you to deploy more stuff given the software licensing expense is not full-on insane.
PC Miniaturization: Where you used to buy a purpose built box you can now buy a PC to do the same thing e.g. PBX, Video Conferencing, Security Camera's, Access Card system, et
Re: (Score:2)
Remember migrating off of win98?
Nope. IT departments were migrating from NT 3.51 to 2000. Home users were migrating from 98 to whatever you are implying (98SE being next, ME after that, and many waiting for XP). The move from 2000 to XP was easy. XP is what 2000 was supposed to be, so the fundamental differences between 2000 and XP were small, the real difference was that XP worked.
Re: (Score:1)
The only big difference was XP allowed DirectX 9. Windows 2000 always worked. Windows XP IS Windows 2000. You are too young to have fully experienced Windows 2000, Mr. 707885. Oh wait, you experienced it, but because it didn't run your games you poo-pooed it. For getting shit done, Windows 2000 is closer to Windows 7 than Windows XP will ever be.
Re:Well... (Score:4, Insightful)
Re: (Score:2)
Hardware is a frack of a lot more stable now too. When was the last time you had a video card or a NIC flake out? In a 900-desktop environment that used to be a daily occurrence.
Re: (Score:2)
Scaling is now stable. To setup 100 PC's with Windows 2000 is nothing like doing Windows 7.
Setting up 100 PCs with Windows 2000 was extremely easy. Windows 7 has become much harder because you can't edit the default user registry hive without Windows 7 freaking out. Microsoft still needs a good counterpart to /etc/skel/
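For readers unfamiliar with it, the /etc/skel mechanism the parent wants a Windows counterpart for just copies a skeleton directory into each new user's home at account-creation time. A minimal sketch of the idea (the skeleton contents here are made up for illustration):

```python
import pathlib
import shutil
import tempfile

# /etc/skel idea: files seeded in a skeleton directory land in every
# new user's home directory when the account is created.
tmp = pathlib.Path(tempfile.mkdtemp())
skel = tmp / "skel"
skel.mkdir()
(skel / ".bashrc").write_text("export EDITOR=vi\n")

home = tmp / "home" / "alice"
shutil.copytree(skel, home)  # roughly what `useradd -m` does with /etc/skel
print((home / ".bashrc").read_text())  # prints: export EDITOR=vi
```

On Windows the rough equivalent is the Default User profile, and the parent's complaint is that editing its registry hive is far more fragile than dropping files into a directory.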
Company in the 500-1000 employee count tier (Score:1)
We have divisions world-wide, but our Corporate/HQ division is located in America and consists of roughly 500 employees. At home, we have three facilities at different locations.
- The entire computer system is virtualized through VMware using VDIs with help from VMware View, and hosted at a major [unnamed] datacenter in Texas on a private network setup for our Company. We also have an identical setup at an Asian datacenter under the same provider, and both datacenters are linked together through VPN from th
Re: (Score:2)
>The network infrastructure is setup as a Class C 172.x.x.x
You mean Class B, or specifically the 172.16/12 private network. It may be further subnetted via CIDR, but only having 256 IPs (Class C) doesn't work well in most enterprise settings.
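A quick way to sanity-check the parent's correction with Python's stdlib ipaddress module (the specific host addresses are just examples):

```python
import ipaddress

# The RFC 1918 private block in question: 172.16.0.0/12, the old
# "Class B" private range, covering 172.16.x.x through 172.31.x.x.
net = ipaddress.ip_network("172.16.0.0/12")
print(net.num_addresses)                          # 1048576 addresses
print(ipaddress.ip_address("172.20.1.1") in net)  # True
print(ipaddress.ip_address("172.32.0.1") in net)  # False, just outside
# A single "Class C"-sized /24 has only 256 addresses:
print(ipaddress.ip_network("192.168.1.0/24").num_addresses)  # 256
```

Over a million addresses in the /12, versus 256 in a /24 -- which is why a literal "Class C" doesn't stretch far in an enterprise.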
abstraction (Score:4, Insightful)
The biggest difference in the past 10 years is that everything has been abstracted and there's less time spent dealing with trivial, repetitive things for deployments and upkeep. We support far more users per administrator now than we did back then -- by a massive margin.
No more clickclickclick for various installations on Windows, for instance. No more janky bullshit to have to deal with for proprietary RAID controllers and lengthy offline resilvers. These things have been abstracted in the name of efficiency and the build requirements of cloud/cluster/virtualization/hosting environments.
We also have a lot more shit to take care of than we did a decade ago. Many of the same systems running 10 years ago are still running - except they've been upgraded and virtualized.
Instead of many standalone systems, most (good) environments at least have a modicum of proper capacity and scaling engineering that's taken place. Equipment is more reliable, and as such, there's more acceptable cyclomatic complexity allowed: we have complex SAN systems and clustered virtualization systems on which many of these legacy applications sit, as well as many others.
This also makes our actual problems much more difficult to solve, such as those relating to performance. There are fewer errors but more vague symptoms. We can't just be familiar with performance in a certain context, we have to know how the whole ecosystem will interact when changing timing on a single ethernet device.
Unfortunately, most people are neither broad nor deep enough to handle this kind of sysadmin work, so much of the 'hard work' gets done by support vendors. This is in no small part due to in-house IT staffing budgets being marginal compared to what they were a decade ago, with fewer people at lower overall skill levels. Chances are that the majority of the people doing the work today are the same ones who did it a decade ago, in many locations, simply due to the burden of spinning up to the level required to get the work done. In other places, environments limp by on the sheer strength of many cheap systems being thrown at a complex problem, overpowering it with processing and storage that was almost unheard of even 5 years ago.
The most obnoxious thing which has NOT changed in the past decade is obscenely long boot times. Do I really need to wait 20 minutes still for a system to POST sufficiently to get to my bootloader? Really, IBM, REALLY?!
Re: (Score:2)
Instead of many standalone systems, most (good) environments at least have a modicum of proper capacity and scaling engineering that's taken place.
Except that has nothing to do with what year it is.
Re: (Score:2)
The most obnoxious thing which has NOT changed in the past decade is obscenely long boot times. Do I really need to wait 20 minutes still for a system to POST sufficiently to get to my bootloader? Really, IBM, REALLY?!
With virtualization it's very rare for me to have to reboot a physical host, and guests reboot in a couple of seconds. So overall that situation seems to have improved dramatically. In my environment, at least.
without (Score:2)
"it's easy to contemplate these tasks being accomplished . . ." without security, without reliability, without stability, without privacy, without confidentiality, without accountability, without redundancy.
If I were to do that, I'd be in breach of at least half of my NDAs, and a few of my SLAs.
Biggest change-Outsource everything! (Score:1)
The biggest change has been in management, who are now trained to outsource anything and everything. Their answer to every question is to outsource it. If an organization has developed internal expertise in some in-depth area, the management will outsource whatever it is, even if they throw away the expertise in the process. And they'll probably fire the employees with the now-useless expertise and give themselves bigger bonuses. So the move to the "cloud" is not being driven by technical people, it's drive
Ten years? Bah, humbug. (Score:3)
10 years ago really wasn't that big a deal. By 2003, VPN (IPSec and OpenVPN) was fairly robust, and widely supported. PPTP was on the way out for being insecure. Internet was most everywhere, and at decent-if-not-great throughput. Go back five or ten years before *that*, and things were much more difficult: connectivity was almost always over a modem; remote offices *might* be on a BRI ISDN connection (128 kb/s), probably using some sort of on-demand technology to avoid being billed out the wazoo due to US telcos doing this bizarre, per-channel surcharge for ISDN. PPP was finally supplanting (the oh, so evil) SLIP, which made things better, assuming your OS even supported TCP/IP, which was not yet clearly the victor -- leading to multiple stacks to include MS and Novell protocols.
All in all, 2003 was about when things were finally getting pretty good. Leading up to 2000 had been a tough row to hoe. And let's just not even go before that -- a mishmash of TCP/IP, SNA, SAA, 3270, RS-232, VT100, completely incompatible e-mail protocols, network protocol bridges, massive routing tables for SAPpy, stupid protocols... a 100% nightmare. Very, very glad to have left those days behind.
a decade ago was 2003 (Score:2)
As in, AD was mostly mature, Win2003 was out, Linux was real, and PCs were commodities. An IT infrastructure now vs _20_ years ago on the other hand would be more interesting. Not much has happened since 2003.
NSA, anyone? (Score:2)
One could almost make a living off of selling slackware boxes running sendmail with mimedefang and spamassassin as the
Look Brah, no wires! (Score:1)