Virtual Appliances... poodoo
What I wish to address is this:
"With the rise of virtualization, applications will transform into virtual appliances. Virtual appliances simplify application deployment by delivering the complete software stack—from OS to the application—as a single, integrated unit." -SDTimes Web Seminar Abstract
Lest I be seen using VMs and accused of hypocrisy: virtualization is great for configuration testing. There is no better way to test against Oracle or Windows 98 than a VM, without having to install such icky things natively. "Virtual appliances", however, are a different story. Though they may "seem" exciting, and may be a practical solution given today's technology and today's problems, I find the trend disturbing. Why? Because a) the entire approach is grossly inefficient; and b) this is just another chapter in the long history of industry repetition. Setting aside the inefficiency aspect for a moment, isn't this form of "run anywhere" virtualization exactly what Java set out to accomplish? "Write once, run anywhere" was the mantra, but what happened?
As I see it, the blame lies in:
1) Improper or incomplete implementations, exacerbated by an overly complex design and a poor specification that made implementing the platform correctly difficult;
2) Changes in the platform. The platform should really be more of a constant than a variable. Too many versions and too much change turned “write once, run anywhere” into “write once, run where you have exactly XYZ version of the runtime”;
3) Performance problems. Only recently did Java even enter the C performance ballpark. Who wanted to "write once, run slowly everywhere"?
4) Politics. Sun didn't seem to find it important to keep everyone on board (e.g., Microsoft).
Doesn’t it seem possible for machine-level virtualization to succumb to some of the same vulnerabilities?
Now for the inefficiency aspect: I see running a virtual machine to host a single application as roughly equivalent to building a city for one person. Layers upon layers of infrastructure for running multiple processes, hosting hardware, sharing memory, and so on are basically squandered on a single application. Hardware vendors are now spending much of their resources optimizing hardware to support these inefficient monstrosities rather than further optimizing the abstractions we already have. How many technologies are being developed just to let VMs interact with each other in ways that have long been possible between OS processes? I realize that I am being a purist, but this usage of virtualization seems like such an admission of defeat. The touted benefit is simplicity, but I predict we will end up with more complexity.
As I see it, there are two problems that virtual appliances aim to resolve:
1) Dependency complexity
2) Portability
Can't we solve these problems fundamentally rather than resort to such a grossly inefficient architecture?!