From the standpoint of securing important documents and information, is adopting virtualization a mistake?
As a genealogist, I’m well aware of our dependence on the use of virtualization, computer networks and the internet by governments, businesses and organizations to digitize, store, safeguard, and make available highly valuable documents, publications, etc. The following article by Jesse Troy outlines the concerns. Christine
Virtualization’s promise started out large, and the concept has taken off like a freight train in the night since around 2005. It’s easy to see why: the technique lets organizations use computing resources far more efficiently than they previously could.
As numerous corporations and government entities adopt the decades-old technology in ever-increasing numbers, we should ask ourselves that question. Here are seven reasons for caution.
1. The largest problem is internal to your organization.
Virtually all corporations are already swamped with huge amounts of data; jumping on the virtualization train entails the creation of even more assets.
Discovery technology may help you to find what is already hidden in deep, dark corners, but ‘going virtual’ opens up a whole new dimension to your corner space.
This is known as sprawl.
2. Not all discovery tools recognize virtual machines and the data contained therein.
When pressed to the wall, it’s possible to put a search team onto the task of locating misplaced, mislabeled, or just plain lost data in the messy data center.
This is not the case where virtual information is concerned.
Like a ghost, it cannot be ‘seen’ directly.
3. Unknown assets create an unknown licensing scenario, in which it’s impossible to determine the correct number of licenses to purchase.
Automated licensing software is typically in place but impotent when it comes to tracking virtual machines, so IT departments may be ringing up unplanned and unnecessary costs.
4. Insecure default configurations can manifest.
When the ‘blueprint’ for virtualization is created, any flaw in it, including an exploitable security hole, is replicated.
Every virtual machine cloned from that blueprint will have the same bad padlock.
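As an illustration, a template can be audited for weak defaults before any machines are cloned from it. The setting names and rules below are hypothetical, purely to sketch the idea; a real tool would inspect the actual guest image rather than a dictionary of settings.

```python
# Sketch: audit a VM template's settings before cloning it.
# The setting names and rules here are illustrative assumptions,
# not drawn from any real virtualization product.

INSECURE_DEFAULTS = {
    "ssh_password_auth": lambda v: v is True,                    # password logins enabled
    "admin_password": lambda v: v in ("", "admin", "password"),  # weak or empty password
    "firewall_enabled": lambda v: v is False,                    # firewall switched off
}

def audit_template(settings: dict) -> list[str]:
    """Return a warning for each setting that matches a known-bad rule."""
    warnings = []
    for key, is_bad in INSECURE_DEFAULTS.items():
        if key in settings and is_bad(settings[key]):
            warnings.append(f"insecure default: {key} = {settings[key]!r}")
    return warnings

template = {"ssh_password_auth": True, "admin_password": "admin",
            "firewall_enabled": True}
for warning in audit_template(template):
    print(warning)
```

Fixing a flaw once, in the template, is far cheaper than fixing it later on every clone.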
5. When a server can be created with complete ease, the unfortunate fact is that many are then born.
Whether each is born out of necessity is one question; who maintains that server is another.
The IT department may be unaware of its existence when it’s created by a non-IT employee.
When that employee leaves the organization, the information on it may wither unattended, effectively departing along with them.
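One way to surface such orphans is to reconcile what the hypervisor reports as running against what IT actually tracks. The VM names and registry below are invented for illustration; a real inventory would come from the platform's management tools.

```python
# Sketch: find VMs that are running but unknown to IT.
# Both lists are illustrative; real data would come from management APIs.

running_vms = ["finance-db", "marketing-test", "intern-sandbox"]
owners = {"finance-db": "it-ops"}  # VMs with a registered, accountable owner

# Any running VM without a registered owner is an orphan candidate.
orphans = [vm for vm in running_vms if vm not in owners]
print(orphans)
```

Running such a reconciliation regularly keeps sprawl, and the abandoned data it hides, visible to someone accountable.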
6. Communication between different servers with different security clearances on the same machine is possible.
This presents an obvious system vulnerability.
If a hacker gains access to a less secure server, they can readily access information that is meant to be much more secure.
7. When the hypervisor, the software layer that manages the virtual machines, is attacked, all the servers under that umbrella are susceptible to infiltration.
Security patches must be kept current. Falling behind on updates puts all the information across the board at risk, rather than that on a few laggard machines.
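Keeping patches current starts with knowing which machines have fallen behind. The report below is a minimal sketch under assumed data: the inventory format, machine names, and 30-day threshold are all illustrative, not from any real patch-management product.

```python
# Sketch: flag machines whose last security patch is older than a threshold.
# The inventory, names, and 30-day cutoff are illustrative assumptions.
from datetime import date, timedelta

def stale_vms(inventory: dict, today: date, max_age_days: int = 30) -> list:
    """Return names of machines last patched more than max_age_days ago."""
    cutoff = today - timedelta(days=max_age_days)
    return [name for name, patched in inventory.items() if patched < cutoff]

inventory = {
    "hypervisor-01": date(2024, 5, 1),
    "guest-web": date(2024, 3, 12),   # lagging well behind
}
print(stale_vms(inventory, today=date(2024, 5, 20)))
```

A report like this, run on a schedule, turns “falling behind” from a vague worry into a concrete list to work through.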
Although these seven reasons – each pointing out a flaw in the technique – may appear to be serious grounds for avoiding virtualization, that is not the case.
Each point has a relatively simple solution, such as firewall installation between servers or complete data organization prior to server creation. Confronting the problems before they become security issues is the right approach.
Once those problems are addressed, the resource-utilization results have proven to be superb.