Not sure which way to go? I have the answers…

What’s Driving Virtualization Now?

Buzz aside, virtualization is not a new technology.  Mainframes have hosted multiple operating systems for more than 30 years.  Four recent trends, however, have pushed virtualization to the forefront of today’s computing environments.  Look at these trends and you can easily see that virtualization is much more than a fad, even coming from an industry that has produced more fads than the fashion business.

Trend #1: Hardware is underutilized

Gordon Moore, cofounder of Intel, observed back in 1965 that processor computing power was compounding, an observation now known as “Moore’s Law.”  Moore stated: “The complexity for minimum component costs has increased at a rate of roughly a factor of two per year.”  What does this mean in simple terms?  Every new generation of chip delivers twice as much processing power as the previous generation, at the same price.  For the last 10 years or so, the doubling period has been closer to 18 months than two years, and by some estimates closer to 12 months.  On a more technical note, every year or so twice as many components can be packed onto a silicon wafer of the same size, doubling the speed and efficiency of the processor with little or no change in its physical size.

This rapid doubling of processing power has had a tremendous impact on the daily computing life of consumers, and even more so on software developers and the technicians who design and support systems.  If power doubles roughly every 12 to 18 months, the machine on your desk today is somewhere between a hundred and a thousand times as powerful as the one you used just 10 years ago — and that gap is the product of a single decade, not the 45 years since Moore published his observation.

Keep in mind that to fully appreciate Moore’s Law, the numbers that are doubling are themselves getting larger.  For example, if in year one a processor delivers 100 million instructions per second (MIPS), then in year two you would expect 200 MIPS; in year three, 400 MIPS; and so on.  That sounds modest, but by year seven or eight the jump is from 6,400 to 12,800 MIPS in a single generation: an increase of 6,400 MIPS in one year, with the next year adding another 12,800.  It sounds almost unrealistic, but it’s fact!
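If you want to see how quickly those jumps compound, a few lines of Python are enough to print the progression from the example above (the 100 MIPS starting point and one doubling per year are simply the illustrative assumptions used in that example, not real benchmark figures):

```python
# Sketch of the doubling example above: start at 100 MIPS (assumed)
# and double every year, printing each year's total and its jump.
mips = 100
previous = mips
for year in range(1, 11):
    print(f"Year {year}: {mips:,} MIPS (+{mips - previous:,} over the previous year)")
    previous = mips
    mips *= 2
```

By year eight the script shows a single-generation jump of 6,400 MIPS, which is exactly the point: the doublings themselves keep getting bigger.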

Moore’s Law also illustrates the related principle of increasing returns: the amount of improvement itself grows over time.  The exponential increase in capacity and power with every processor generation is what has driven such sweeping improvements in computing in so short a time, and it in turn feeds the ever-increasing case for virtualization.

This is why the trend matters.  Ten years ago, programmers struggled to make software run acceptably on what we would now consider substandard hardware.  Today the hardware is so much more powerful that typical software uses only a small fraction of the available processing power, which creates a completely different kind of problem: unintentional overbuying, and underutilization you can’t easily avoid.

In today’s datacenters, most machines run at only 10 to 15 percent of their total capacity.  This means 85 to 90 percent of the machine’s power and architecture goes unused.  Seen this way, Moore’s Law can seem almost irrelevant to most companies: in today’s computing environments they simply cannot take advantage of the extra power they have purchased.

Think about it: if you had to carry a few hundred pounds of product to and from a customer on a regular basis, would you buy a Ford F-350 Super Duty turbo diesel for $60,000-plus, or an F-150 at less than half the cost?  Or would you simply use your own car?  And if you already owned that F-350, wouldn’t you market deliveries and pickups to your customers more aggressively to make better use of its capacity?  Remember, even the smaller truck still takes up a bay in your loading dock, still needs fuel and insurance, and when the business grows it can’t be ‘upgraded’ to haul much more.  In the same way, the long-term cost of today’s underutilized machine is nearly the same as if it were running at full capacity.

It doesn’t take a guru to realize that computing resources are being wasted.  And keep in mind that, under Moore’s Law, next year’s computers will have twice as much spare capacity as this year’s, and so on for the foreseeable future.

So there MUST be a better way to match computing capacity to the actual load…and that’s what virtualization does.  By letting a single piece of hardware transparently support multiple operating systems and software stacks, virtualization lets companies raise their effective hardware utilization from 10 or 15 percent to 70 or 80 percent.  Now you’re getting your money’s worth!
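To see the consolidation arithmetic behind those numbers, here is a small back-of-the-envelope sketch in Python.  The server count and utilization figures are hypothetical values chosen to match the ranges mentioned above, not measurements from any real datacenter:

```python
import math

# Hypothetical example: six servers each running at about 12% of capacity
# carry the same real load as one host running at about 72%.
servers = 6                 # physical machines today (assumed)
avg_utilization = 0.12      # typical per-server utilization (assumed)
target_utilization = 0.80   # how heavily we're willing to load one host

combined_load = servers * avg_utilization
hosts_needed = math.ceil(combined_load / target_utilization)
per_host_utilization = combined_load / hosts_needed

print(f"Combined real load: {combined_load:.0%} of one machine")
print(f"Hosts needed: {hosts_needed}")
print(f"Utilization per host after consolidation: {per_host_utilization:.0%}")
```

Run it and six lightly loaded boxes collapse onto a single host running at roughly 72 percent — exactly the 10-to-15-percent-becomes-70-to-80-percent shift described above.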

Moore’s Law has not only enabled virtualization; it effectively makes it a must, to some degree, in most organizations.  Without virtualization, ever-larger amounts of computing power will go to waste each year and our carbon footprints will continue to grow.  (Oh, and your GREEN will continue to flow out of your corporate pockets needlessly.)

So the chip industry is, in a sense, the source of the need for virtualization.  The same Moore’s Law dynamic shows up in data storage and networking trends as well: more processing power brings a more feature- and media-rich computing environment, which in turn demands greater storage capacity and faster networks.  Virtualization technologies for storage and networking are already in development and early use, but more on that in another installment of this series.
