Experiences with Replacing Desktops w/ VMs?
A user asks: "After years of dealing with broken machines, HAL incompatibility, and other Windows frustrations, I'd like to investigate moving to an entirely VM-based solution. Essentially, when an employee comes in in the morning, have them log in, and automatically download their VM from the server. This gives the benefits of network computing, in that they can sit anywhere, if their machine breaks, we can instantly replace it, etc., and the hope is that the VM will run at near-native speeds. We have gigabit to all of the desktops, so I'm not too worried about network bandwidth, if we keep the images small. Has anyone ever tried this on a large scale? How did it work out for you? What complications did you run into that I probably haven't thought of?"
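Before dismissing or embracing the bandwidth concern, it's worth doing the arithmetic. A quick back-of-envelope sketch (all numbers here are illustrative assumptions, not from the post):

```python
# Rough estimate of the morning "boot storm" when everyone pulls a VM image
# from the same server. Image size, user count, and link efficiency are
# assumptions for illustration only.

def pull_time_minutes(image_gb, users, link_gbps=1.0, efficiency=0.6):
    """Minutes for `users` simultaneous pulls sharing one server uplink.

    `efficiency` discounts for protocol overhead and disk contention.
    """
    usable_bytes_per_sec = link_gbps * 1e9 / 8 * efficiency
    total_bytes = image_gb * 1e9 * users
    return total_bytes / usable_bytes_per_sec / 60

# A "small" 5 GB image, 100 users arriving together, one gigabit server uplink:
print(round(pull_time_minutes(5, 100), 1))  # roughly 111 minutes
```

Gigabit to each desktop doesn't help much when everyone shares the same server uplink; staggered logins, local caching, or pushing deltas instead of full images would change the picture considerably.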
Why not just use sunrays? (Score:5, Insightful)
might not be cost-effective (Score:1, Insightful)
What about the documents people create and edit, as well as apps they might want to download or install themselves? If they store them "locally", they'll be gonzo when you swap in a new image. There'll be some unhappy campers.
Not so sure about the architecture... (Score:5, Insightful)
If you were going to use VMware, make a standard image and push it out to the local hard drives. Don't update that image unless it is time to push out a new set of Windows updates/etc. If you do need to update the image, though, that is going to be *hell* on your network/file servers.
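One way to soften that hit is to push the master image to a desktop only when its content has actually changed. A minimal sketch (the paths and the push step are hypothetical, not part of any particular deployment tool):

```python
import hashlib
import os
import shutil

def file_sha256(path, chunk=1 << 20):
    """Hash a disk image in 1 MB chunks so we never load it all into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def push_if_changed(master_image, local_image):
    """Copy the master image to the local drive only if the content differs.

    Returns True if a copy happened, False if the local image was current.
    """
    if os.path.exists(local_image) and \
            file_sha256(master_image) == file_sha256(local_image):
        return False  # already up to date; no network or disk traffic needed
    shutil.copy2(master_image, local_image)
    return True
```

In practice you'd cache the master's hash rather than recompute it per client, and a block-level delta (rsync-style) would cut traffic further, but the checksum gate alone avoids re-pushing unchanged multi-gigabyte images.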
I think it makes more sense to run a virtualized server than a desktop.
Also, you might end up paying for 2x the XP licenses since you'd have to pay for the host + guest operating systems.
And this would be an improvement how?... (Score:5, Insightful)
So a lot of expensive desktops emulating, um, pretty much themselves, using funky, somewhat pricey software, running substantial images pulled off of expensive servers over an expensive network (because GB'net or not, a building full of folks starting up in the morning is gonna hammer you). Then comes the challenge of managing all of those funky images, reconciling the oddities of an emulated environment, etc.
Could you make it work? Sure. But I gotta wonder if it'd be worth it.
Is it gonna be any better than a well-managed native environment? Or going Citrix clients? Or Linux/MacOS/terminals (choose your poison) boxes instead of MS Windows?
I hear your pain; I just think you're trading a known set of problems for a more expensive, more complex, more fragile, baroquely elaborate, well, more-of-the-same.
It doesn't sound like much of an improvement really, just new and more complex failure modes, at extra cost.
Though, I guess, if you're looking for a new, challenging, and complex environment this would be it; just take your current one and abstract it another level. I wouldn't want to be the one footing the bill, or trying to rely on any of it, but at least it'd be something different.
Solution to the wrong problem. (Score:1, Insightful)
I guess your HAL problems are the major issue. You CAN overcome over 95% of those issues with the MS deployment configuration tools and ghosting (here [microsoft.com] and here [microsoft.com] is a start). It takes some engineering commitment to get that up and going, but once the framework is in place, the mini-setup should not be a problem across different hardware. I really do think it is worth the initial time and effort for something like this.
Considering my above statements..
I have worked at many places and the ones with good backend engineering are much better off in the long run. I am not trying to knock anyone down here, but honestly, if your facility is run by tier technicians, you get what you have now. Imagine going through an upgrade or service pack release. Some companies can perform those on 500 PCs in a single night without ever actually visiting a PC. Some spend weeks doing one at a time. Unfortunately, the latter of the two is the nature of the business when "support" is contracted out. Someone doing engineering is nowhere to be found. The tools are freely available from MS and third parties to make all of your various PCs pretty much act as one.
Back in school... (Score:4, Insightful)
As an alternative to NIS, Netinfo does much the same thing, only it wasn't designed by people quite so sadistic as NIS. You'd still be using NFS though...
cya,
john
Independent Software Vendors wouldn't talk to you (Score:4, Insightful)
You're not qualified (Score:1, Insightful)
Several ways to fix this and get qualified:
1) Trial it on a small number of less important users. Get feedback. Make sure you listen to that feedback. Allow a decent period of time for the trial so initial teething problems can be sorted. Allocate sufficient resources to deal with early issues. This is the hard way to learn...through experience.
2) Hire expertise - someone that's done this before, to implement and advise. Make sure it's not a vendor, since you won't know if you're being screwed till it's too late.
3) Get some training.
DO NOT try to implement this for a large number of users in one hit. You're a fool if you do.
Smells like X (Score:3, Insightful)
we did this in the past... (Score:1, Insightful)
Re:Three different takes on this (Score:4, Insightful)
Not quite true. Yes, with the 3D. But the two main players (VMware and VPC) both support sound, and VMware even supports USB 1.1 passthrough.
With the thin-client option, Microsoft Terminal Services (if you're on a Windows platform) has good scaling capabilities. Though it might not go into the hundreds or thousands, it should get you into the high dozens. Since most of the Microsoft tools' DLLs are loaded once and shared between the clients, it has pretty good performance.
For linux, while SSH is always a favorite, look at NX-Servers (http://www.nomachine.com/ [nomachine.com] and http://freenx.berlios.de/ [berlios.de]) which is like X-forwarding with compression and caching.
It'll be difficult to have a fully virtualized solution. Going with thin clients, or a PXE-served image, would be a more viable approach (no matter how beefy your servers and how fast your network).
Re:Um, wouldn't a ... (Score:5, Insightful)
It's down to the configuration; the network itself can do it.
Re:Not so sure about the architecture... (Score:4, Insightful)
First off, I agree with you that this isn't a good application of a VM, considering the number of alternative options that exist already. The one area I will disagree with is the licensing, since you're in no way required to run Windows as your host OS. Just run a Linux-based host OS and problem solved. VMware runs just as well on both. I'm not sure about other options like Virtual PC or Qemu, but last I checked Qemu only worked on Linux, so you're still in a good position not to have to throw more money at Windows licensing.
Side topic: licensing has really gotten out of hand with pretty much every piece of commercial software. I think that's the real reason a lot of people are moving towards Linux. The learning curve required to administer Linux effectively is outweighed by the complicated licensing schemes of various companies, Microsoft especially. It is quite a challenge staying in compliance these days.
Back on topic: you could have a file server or three dedicated to the task, using a DFS root to link them logically and to keep them synchronized. Then you wouldn't have to worry about pushing images killing server performance. Combined with network load balancing, you could scale out as needed.
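The load-spreading part can be as simple as deterministically mapping each client to one of the synchronized replicas, so pulls spread evenly and a given user always hits the same (warm) server. A toy sketch (the server names are made up):

```python
import hashlib

SERVERS = ["fs01", "fs02", "fs03"]  # hypothetical DFS replica hosts

def pick_server(username, servers=SERVERS):
    """Deterministically map a user to one replica.

    Hashing the username spreads clients roughly evenly across servers
    and keeps each user pinned to the same one between sessions.
    """
    digest = hashlib.md5(username.encode()).digest()
    return servers[int.from_bytes(digest[:4], "big") % len(servers)]
```

Real DFS referrals and hardware load balancers do this (and failover) for you; the point is just that distributing image pulls across replicas doesn't require anything exotic.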
Re:Smells like X (Score:1, Insightful)
... which is fine if you can replace the existing environment with one based on X11 clients.
If not, and face it, most of the world isn't running Debian GNU/Linux like you'n'YT, this really isn't a feasible solution.
That said: <sigh>, yes, Xterms do provide a really nice, simple, centrally maintainable desktop solution that's likely 95%+ acceptable for most of the world. Damn that 5%....
Several options. Very workable for laptops. (Score:3, Insightful)
VMs can run off of network shares if you set things up right. With a fast network, you won't see a problem. I have run VMs off mirrored Ximeta netdisks over 100meg with NTFS as the partition type, and it worked great, although it was only about 4 machines accessing it at one time. For office apps and such, it's a piece of cake.
I encourage people to use VMware for laptops. Create an encrypted disk with the VMware image that they want to run; then if the laptop gets stolen, the thief has to decrypt the disk before getting to the really good stuff. Backups are easy, and it makes laptop "sharing" something you can do pretty easily if necessary; multiple shifts can share a PC the same way. It's also easier to fix problems and test updates by just snagging a copy of the image and monkeying with it.
Citrix and remote desktop have their places as well.
Re:No 3D (Score:5, Insightful)
These two are often not an issue in corporate environments though.
Sure, some exceptions depending on what kind of work you do, but still exceptions.
Re:Citrix (Score:2, Insightful)
Er... Why exactly?
A thin Linux desktop connected to a backend VMware server would provide exactly what the poster is looking for. VMware ESX seems perfect for this and eliminates the "download entire disk images" part. Basically, with ESX all of the VMs and associated images live and *run* on the server; the desktop is accessed via vmware-console, a little program that connects to the server and views the virtual machine, similar to Citrix/VNC/whatnot. With the clustering solutions available to ESX server and the ability to move running machines between nodes, this seems like a good idea. The only real downside would be if your day-to-day involved 3D acceleration or heavy sound, in which case any solution except a "real" local workstation currently falls a bit short.
I've used such a setup to run a Windows desktop for testing, and noticed no slowdown. This is even with 10 or so training and QA machines running on a P3 with 8G of RAM and lots of disk. No noticeable slowdown in performance even when the other machines were all doing CPU-intensive tasks.
Xen also seems like it's coming along nicely, but doesn't seem ready to provide for windows workstations on this scale, yet.
Re:Inevitably (Score:3, Insightful)
And yes, I'm blaming the victim. While there *should* be sense in saying that you ought to be able to walk anywhere without fear, if you keep going to drug-riddled areas downtown and getting mugged, then STOP GOING TO THOSE AREAS. Learn to take some responsibility for your own damn habits and learn a bit. You change the oil in your car and give it the occasional tune up... why not the same to your computer?
Another Possible Solution (Score:2, Insightful)
If a computer breaks and needs to be replaced, we can drop in a replacement PC, move it into the proper organizational unit in AD, do a group policy update on the box and it will install the appropriate software on startup (and with the exception of programs like VS.Net it goes fairly quickly).
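The swap-in workflow above boils down to a declarative mapping from organizational unit to required software, with the group policy engine computing what's missing. A toy sketch of that idea (OU names and package names here are invented, not real GPO syntax):

```python
# Hypothetical OU -> required-software mapping, in the spirit of linking
# Group Policy software installs to AD organizational units.
REQUIRED = {
    "Accounting": {"Office", "ERP-Client"},
    "Engineering": {"Office", "VS.Net", "CAD-Viewer"},
}

def to_install(ou, already_installed):
    """Return the packages a freshly swapped-in PC still needs,
    given its OU and what's already on the box."""
    return sorted(REQUIRED.get(ou, set()) - set(already_installed))
```

The appeal is that the replacement PC carries no special state: move it into the right OU and the declared set converges onto it at startup.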
Re:Inevitably (Score:5, Insightful)
From a practical perspective, telling college students not to download music, to avoid MySpace, and to not download seemingly harmless things like Screensavers and Wallpaper is about as effective as the rhythm method [wikipedia.org]. Sure, they're "sinners" with their pr0n and their music. How dare they? They get what they deserve by using a computer on the internet to download the information they want. That's a sin to be sure. It's strange how that apparently makes them culpable for systematic, intentional, and malicious exploiting of their computers. Of course, the long-term social effects of corporate self interest manipulating law and public opinion to create stigma in their economic interest is beyond the scope of a Slashdot comment.
Back to the technical issues. Understand that a lot of malware immediately turns off ActiveX security. They leave the door wide open behind them. In your perfect world, not only does every user have to be perfectly responsible and knowledgeable, but they also can't make even a single mistake--since this basically leaves them wide open (i.e. it doesn't ask, just downloads and installs any application that asks) in many cases.
Similarly, there is no safe site. The vast majority of the web is ad-sponsored. A single malicious banner ad can catch millions (the recent MySpace incident, for example). Expecting every user and every advertising company (with possibly tens of thousands of ads) to not ever make a single mistake is as unrealistic as it is lazy. The web can be secure if people would put the effort into getting secure systems developed and into place instead of blaming security problems on the sinners.
Ironically, one of your "solutions", Antivirus Software (a.k.a. stopgap measure or snake oil, depending on your inclination), is probably the reason things are as bad as they are. Rather than closing holes, AV just stomps the critters that run in through them. If users had insisted on fixes and security rather than installing Norton Antivirus (and considering it "fixed"), things probably wouldn't be nearly so bad as they are. It would also be nice if the economic disincentive for insecurity would lie with the vendors, where it belongs, not with each and every user.
People don't realize it, but this is really an old misconception. Make something illegal, and its sources become disreputable. This then reinforces the belief that it's inherently bad. My issues of concern are software licensing, patents, and copyright reform. I'm sure the same argument could (validly) be made for marijuana, prohibition, and prostitution.
Of course we've got a double-whammy with software security. Not only are the sinners browsing seedy sites, there is also no one responsible for protecting them (since the vendors have all licensed their cares away).
Re:No 3D (Score:3, Insightful)
If people would call virtualization "Virtual Hardware" (well, anything that doesn't start with M would be good), the confusion might not exist.
Re:No 3D (Score:5, Insightful)
That doesn't make sense. VMware should provide exactly the same virtual hardware to the guest no matter what physical computer you run the image on. In fact, that is one of the biggest selling points for VMware.
Are you creating the VMware image FROM the Dimension 620, or running a fresh "virtual" install of XP?
-matthew
Re:Citrix and/or DeepFreeze (Score:2, Insightful)
From the website: "Deep Freeze instantly protects and preserves baseline computer configurations. No matter what changes a user makes to a workstation, simply restart to eradicate all changes and reset the computer to its original state - right down to the last byte. Expensive computer assets are kept running at 100% capacity and technical support time is reduced or eliminated completely. The result is consistent trouble-free computing on a truly protected and parallel network, completely free of harmful viruses and unwanted programs."
At my company we use a combination of Citrix and DeepFreeze that allows users to roam from station to station while still having full access to all of their apps and data (stored on the network). DeepFreeze ensures that a user never messes up the local computer with anything that a reboot can't fix.
You could also just do DeepFreeze, profiles, and network-based app installs, which would ensure the apps and data are on the network.