Wednesday, October 19, 2011

Why virtualization will take over the desktop

One of the biggest problems that makers of desktop operating systems face is backwards compatibility. You can either be a slave to it (Microsoft) or strategically break it (Apple), but either way a major new release of an OS is going to have to deal with it.

And it's not hard to maintain compatibility back one or two releases. But if you have some really beloved Windows 3.1 application, the chances of it working now are pretty slim.

The answer is virtualization. What if, instead of having to maintain extensive backwards compatibility, you architected your system as a low-level host OS with a hypervisor on top of it (the way existing VM stacks already work), and then ran the user-facing OS in a VM?
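
If that sounds exotic, it's basically what KVM and libvirt already do on Linux. As a minimal sketch using libvirt's Python bindings (the guest name, disk image path, and memory size are made-up example values, purely for illustration), keeping an old OS bootable is just a matter of handing the hypervisor a machine description:

    import libvirt

    # Minimal libvirt domain description: an old guest OS kept alive on
    # top of a modern host's hypervisor. The name, disk path, and
    # memory size are invented example values.
    LEGACY_GUEST_XML = """
    <domain type='kvm'>
      <name>legacy-os</name>
      <memory unit='MiB'>512</memory>
      <vcpu>1</vcpu>
      <os>
        <type arch='x86_64'>hvm</type>
      </os>
      <devices>
        <disk type='file' device='disk'>
          <source file='/var/lib/libvirt/images/legacy-os.img'/>
          <target dev='hda' bus='ide'/>
        </disk>
        <graphics type='vnc'/>
      </devices>
    </domain>
    """

    def boot_legacy_guest():
        # Connect to the local hypervisor and start a transient guest
        # from the XML description above.
        conn = libvirt.open('qemu:///system')
        try:
            dom = conn.createXML(LEGACY_GUEST_XML, 0)
            print('Booted legacy guest:', dom.name())
        finally:
            conn.close()

    if __name__ == '__main__':
        boot_legacy_guest()

The guest sees the same virtual IDE disk and VGA console no matter what the host underneath turns into, which is exactly the point.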

When it came time to upgrade, the user could load a new OS and keep the old one around for as long as needed. Decades, maybe. If you're Apple, and it's time for your once-a-decade hardware shift, the VM could emulate the old hardware indefinitely, saving you from having to build something hacky.

In this world, the VMs don't have to provide complete isolation between the OSes. The VM could, for example, share the underlying file system (even modern OSes still support MS-DOS's FAT) and even handle some windowing integration (like VMware Fusion does on the Mac).
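
To make the non-isolation concrete: libvirt's domain XML can already pass a host directory straight through to a guest over virtio-9p, so the two OSes genuinely share files rather than living behind a wall. The directory and mount tag below are invented examples; the fragment would slot into the devices section of a domain definition like the one above:

    <!-- Share a host directory with the guest; dir and tag are example values. -->
    <filesystem type='mount' accessmode='passthrough'>
      <source dir='/home/user/shared'/>
      <target dir='hostshare'/>
    </filesystem>

Inside a Linux guest you'd then mount the tag with the 9p filesystem (mount -t 9p -o trans=virtio hostshare /mnt); VMware's shared folders and Fusion's window-level integration are the same idea delivered through different plumbing.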

For those of us on a Mac who need Windows applications, we're already getting there. I have a virtualized Windows XP that I can run forever without worrying about XP-specific programs breaking under new versions of Windows. For Windows users this is still a new concept, but it's one that's coming, I'm sure.

Microsoft: Bundle your VM into Windows.

Apple: Buy Parallels and make it a core component of your next OS.

Linux: Start putting VM support into base distros. The kernel pieces are mostly there already; see the sketch below.
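
"VM support in the base distro" mostly means KVM working out of the box. A quick illustrative check that a machine is ready to host guests:

    import os

    def kvm_ready():
        # Hardware virtualization flags advertised by the CPU:
        # vmx = Intel VT-x, svm = AMD-V.
        with open('/proc/cpuinfo') as f:
            cpuinfo = f.read()
        has_hw_virt = 'vmx' in cpuinfo or 'svm' in cpuinfo

        # /dev/kvm shows up once the kvm kernel modules are loaded.
        return has_hw_virt, os.path.exists('/dev/kvm')

    if __name__ == '__main__':
        hw, dev = kvm_ready()
        print('CPU virtualization extensions:', 'yes' if hw else 'no')
        print('/dev/kvm present:', 'yes' if dev else 'no')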

You all can thank me later.
