With Virtualization comes the potential to reduce costs and enhance productivity, realising savings across server count, carbon footprint, power consumption and cooling requirements. Whether a CIO is looking to go green for green's sake, enhance brand image or improve competitiveness, Virtualization is an attractive proposition.
At its core, Virtualization enables organisations to make the most efficient use of available system resources by consolidating applications onto fewer physical servers. As demands on data centre infrastructure change, or in response to traffic spikes, physical resources that aren't immediately required can be powered down, enabling a more efficient, environmentally friendly use of resources.
To execute a successful Virtualization deployment, CIOs must first be clear about what they wish to achieve and determine if the technology is a good fit for the business. Within any organisation, disruptive or revolutionary initiatives have the highest chance of failure. By approaching Virtualization as a step-by-step evolution, organisations can boost their success rate.
Moving to a virtual environment
Virtualization is becoming a commodity, and should be treated accordingly. As the market matures, CIOs are becoming more savvy and waking up to the potential and limitations of the technology and what can be expected from a vendor. It is a vendor's responsibility to offer guidance on best practice, and organisations should not be afraid to ask tough questions and demand answers.
Determining what percentage of an organisation's workload can realistically be virtualised is a good first step. We seldom see organisations immediately migrate the majority of workloads to a virtual environment; instead it is common practice to start with less critical workloads, gain experience with the platform and then expand to mission-critical ones.
It is advisable for organisations to migrate further workloads once they are confident with the platform and support offered by a vendor.
Rather than view Virtualization as a standalone project, organisations should take a deeper look at the internal processes that might be impacted by adopting Virtualization technology. Otherwise, they might find that the expected benefits of Virtualization, especially when it comes to improving IT agility, fail to materialise. In other words, often the biggest hurdle to successful adoption of Virtualization is not the technology itself, but the processes that surround it.
Such processes include provisioning and change management. Each application involved might compete for computing resources, and it is important for organisations to deploy software that grants greater visibility into the IT architecture to determine how applications are running. Greater visibility allows administrators to predict conflicts and monitor performance, ensuring critical applications receive priority and performance levels are met.
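As an illustrative sketch of how such prioritisation can be enforced at the hypervisor level — assuming a KVM/libvirt stack, the platform discussed later in this article, and with the guest name and values chosen purely as examples — a libvirt domain definition can weight CPU allocation so that a critical guest wins contention for resources:

```xml
<!-- Illustrative fragment of a libvirt domain definition (KVM).
     The guest name and share value are hypothetical examples. -->
<domain type='kvm'>
  <name>critical-app</name>
  <vcpu>4</vcpu>
  <cputune>
    <!-- Relative CPU weight: under contention, a guest with 2048 shares
         receives roughly twice the CPU time of a default (1024-share) guest -->
    <shares>2048</shares>
  </cputune>
</domain>
```

Weights of this kind only take effect when guests actually compete for the same physical CPUs, which is precisely the consolidation scenario described above.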
The true cost of Virtualization
Cost savings are the primary reason that enterprises look to move to a virtual environment. Reducing costs whilst driving productivity is in line with any business objective, and Virtualization is a money-saving technology. That is not to say there are no upfront costs: initial investment is necessary, and return on investment (ROI) will come from reductions in data centre footprint, hardware, maintenance, personnel and management costs. Virtualization technology alone, however, may not reduce OPEX.
Going green not only enhances a brand’s image but can also reap substantial cost savings. By reducing the number of physical servers, organisations can greatly reduce the amount of rack space required which equates to substantial savings, especially if an organisation is renting space from a data centre provider. In an organisation’s own data centre, decreased power and cooling coupled with recouped floor space can often see savings of up to 50 per cent.
Administration of virtual environments is critical, but maintenance hours are vastly reduced. Administrators need to support a smaller physical server footprint, and the virtual system allows them to add applications remotely while the system is still running, limiting the need for application downtime.
Deploying applications to a physical server often takes a lot of time and resources. This process can be reduced from days or hours to minutes in a virtual environment, allowing organisations to realise cost savings through reduced administration time and travel costs, and increased productivity. Virtualization reduces personnel costs as organisations no longer need to support sprawling physical hardware, allowing administrators to focus their attention on more important tasks.
Is my data secure?
Virtualization raises a new set of security concerns. As organisations migrate further workloads into a virtual environment, they expand their technology footprint and the amount of data exposed. In a non-virtual environment, a hacker might gain access to just one server, but in a virtual environment a compromised server exposes every virtual instance running on it.
Security risks can arise from personnel within an organisation or from third-party attackers. With an increase in employees wishing to connect personal devices to the office network, it is imperative for enterprises to ensure access to data from personal devices is not abused. Vendors can enable organisations to keep all data off their desktops and end-user devices to ensure data does not leave the virtual environment. Employees will be able to view data on their monitors but will be unable to copy or export data from the virtual environment. In fact, this is a common use case in the deployment of Virtual Desktop Infrastructure (VDI).
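One way this containment is commonly configured — sketched here under the assumption of a KVM/libvirt VDI stack using the SPICE remote display protocol — is to disable clipboard and file transfer between the guest and the client device in the guest's graphics configuration, so that data can be viewed but not exported:

```xml
<!-- Illustrative <graphics> fragment from a libvirt guest definition
     for a SPICE-based VDI desktop -->
<graphics type='spice' autoport='yes'>
  <!-- Block copy/paste between the remote viewer and the guest -->
  <clipboard copypaste='no'/>
  <!-- Block drag-and-drop file transfer into or out of the guest -->
  <filetransfer enable='no'/>
</graphics>
```

The user still sees the full desktop on their monitor; the data itself never traverses the channel to the end-user device.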
One way to ensure security is to opt for a vendor that offers a kernel-based security policy enforcement infrastructure. This strict security policy enforces isolation between virtual machines, and between each machine and the hypervisor. The Linux Kernel-based Virtual Machine (KVM), with SELinux technology, brings military-grade security into the Virtualization technology space and can help harden the system against bugs in the hypervisor that might otherwise be used as an attack vector against the host or another virtualised guest.
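As a sketch of what this looks like in practice on a KVM host with SELinux (the sVirt model), each running guest carries a security label in its libvirt definition; libvirt assigns a unique pair of MCS categories per guest, so no two guests — and no guest and the host — share a label. The category values below are examples only:

```xml
<!-- Illustrative sVirt security label as libvirt records it for a running
     guest. The MCS categories (c392,c574) are generated uniquely per guest,
     so a compromised guest process cannot access another guest's resources. -->
<seclabel type='dynamic' model='selinux' relabel='yes'>
  <label>system_u:system_r:svirt_t:s0:c392,c574</label>
  <imagelabel>system_u:object_r:svirt_image_t:s0:c392,c574</imagelabel>
</seclabel>
```

Because the kernel enforces these labels, even code that escapes the hypervisor process is confined to that one guest's category pair.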
Risk of vendor lock-in
For too long, vendors have offered closed proprietary technology stacks, with little or no focus on interoperability with other market players. With no pre-defined open standards, Virtualization can become the mother of all lock-ins.
Vendor lock-in can have a real impact on both an organisation's CAPEX and its IT efficiency.
The inability to move workloads across different platforms, and difficulty in extracting data from virtual environments, can restrict business. When infrastructure is defined in a way that suits IT vendors rather than customers, users who get stuck in one proprietary technology find it hard to move. Some of the leading vendors in this space are moving towards stricter licensing models that include significant charges for high-density workloads, limiting the amount of memory that can be allocated per CPU based on the customer's licence. This undermines one of the founding values of Virtualization: flexibility. With fluctuating workloads, nobody can predict what will be needed on a long-term basis.
By opting for an open source and open standards policy – focusing on interoperability and portability to end vendor lock-ins – enterprises can ensure they are in control. Harnessing the flexibility offered by Virtualization technology, organisations can react to immediate business needs and accelerate time to market of new initiatives.
Virtualization is the perfect component of a forward-thinking IT strategy, offering a natural progression to cloud computing: organisations can take existing virtualised workloads and move them into a cloud environment. Virtualization can be seen as preparing businesses for a move to the cloud, and is the perfect platform from which to migrate mission-critical workloads.