Experiences with Replacing Desktops w/ VMs?

A user asks: "After years of dealing with broken machines, HAL incompatibility, and other Windows frustrations, I'd like to investigate moving to an entirely VM-based solution. Essentially, when an employee comes in in the morning, have them log in, and automatically download their VM from the server. This gives the benefits of network computing, in that they can sit anywhere, if their machine breaks we can instantly replace it, etc., and the hope is that the VM will run at near-native speeds. We have gigabit to all of the desktops, so I'm not too worried about network bandwidth, if we keep the images small. Has anyone ever tried this on a large scale? How did it work out for you? What complications did you run into that I probably haven't thought of?"
  • Re:user icons (Score:0, Insightful)

    by Anonymous Coward on Thursday August 17, 2006 @12:20AM (#15924746)
    So we could get inline goatse, rather than obfuscated links and ascii art?
  • by scubamage ( 727538 ) on Thursday August 17, 2006 @12:23AM (#15924763)
    Get some Sun Microsystems SunRays. Seriously... that's exactly how they work. Your session can be saved on the server and resumed anywhere else you plug in your smart card. One server and all of the terminals you need.
  • by Anonymous Coward on Thursday August 17, 2006 @12:29AM (#15924788)
    In terms of CPU cycles, that'll be a huge load on the servers while the desktops go underutilized (well, actually, those VM players seem to be pretty piggy; you need 2G RAM or you'll max out the CPU). And the interactivity won't be as good as native Windows desktops.

    What about the documents people create and edit, as well as apps they might want to download or install themselves? If they store them "locally", they'll be gonzo when you swap in a new image. There'll be some unhappy campers.
  • by steppin_razor_LA ( 236684 ) on Thursday August 17, 2006 @12:30AM (#15924791) Journal
    I'm a vmware/virtualization fan, but I don't think this is the best application. It seems to me that it would be smarter to use Terminal Services / Citrix / a thin-client approach.

    If you were going to use vmware, make a standard image and push it out to the local hard drives. Don't update that image unless it's time to push out a new set of Windows updates, etc. If you do need to update the image, though, that is going to be *hell* on your network/file servers.

    I think it makes more sense to run a virtualized server than a desktop.

    Also, you might end up paying for 2x the XP licenses since you'd have to pay for the host + guest operating systems.
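
    A minimal sketch of the "push a standard image to local disks" approach above, in Python: the share path, the IMAGE_VERSION marker convention, the local cache location, and the winxp.vmx file name are all illustrative assumptions, and launching with vmplayer assumes VMware Player/Workstation is installed on the client.

        #!/usr/bin/env python
        """Refresh a locally cached golden VM image only when the master copy changes."""
        import os
        import shutil
        import subprocess

        MASTER_DIR = "/mnt/images/winxp-standard"          # hypothetical read-only image share
        LOCAL_DIR = "/var/cache/vmimages/winxp-standard"   # per-desktop cached copy
        VERSION_FILE = "IMAGE_VERSION"                     # small marker file bumped on each rebuild

        def read_version(path):
            try:
                with open(os.path.join(path, VERSION_FILE)) as f:
                    return f.read().strip()
            except IOError:
                return None

        def sync_if_stale():
            # Re-copy the image directory only when the version marker differs, so a
            # building full of morning logins doesn't re-pull gigabytes every day.
            if read_version(MASTER_DIR) != read_version(LOCAL_DIR):
                if os.path.isdir(LOCAL_DIR):
                    shutil.rmtree(LOCAL_DIR)
                shutil.copytree(MASTER_DIR, LOCAL_DIR)

        if __name__ == "__main__":
            sync_if_stale()
            # VMware Player accepts a .vmx path on the command line.
            subprocess.call(["vmplayer", os.path.join(LOCAL_DIR, "winxp.vmx")])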

  • by maggard ( 5579 ) <michael@michaelmaggard.com> on Thursday August 17, 2006 @01:00AM (#15924868) Homepage Journal

    So a lot of expensive desktops emulating, um, pretty much themselves, using funky, somewhat pricey software, running substantial images pulled off of expensive servers over an expensive network (because GB'net or not, a building full of folks starting up in the morning is gonna hammer you). Then comes the challenge of managing all of those funky images, reconciling the oddities of an emulated environment, etc.

    Could you make it work? Sure. But I gotta wonder if it'd be worth it.

    Is it gonna be any better than a well-managed native environment? Or going with Citrix clients? Or Linux/MacOS/terminal (choose your poison) boxes instead of MS Windows?

    I hear your pain; I just think you're swapping a known set of problems for a more expensive, more complex, more fragile, baroquely elaborate, well, more-of-the-same.

    It doesn't sound like much of an improvement really, just new and more complex failure modes, at extra cost.

    Though, I guess, if you're looking for a new, challenging, and complex environment this would be it; just take your current one and abstract it another level. I wouldn't want to be the one footing the bill, or trying to rely on any of it, but at least it'd be something different.

  • by Anonymous Coward on Thursday August 17, 2006 @01:04AM (#15924880)
    I personally think your existing setup was not well thought out and planned, and you are now looking for a band-aid.

    I guess your HAL problems are the major issue. You CAN overcome over 95% of those issues with the MS deployment configuration tools and ghosting (here [microsoft.com] and here [microsoft.com] are a start). It takes some engineering commitment to get that up and going, but once the framework is in place, the mini-setup should not be a problem across different hardware. I really do think it is worth the initial time and effort for something like this.

    Considering my above statements...
    I have worked at many places, and the ones with good backend engineering are much better off in the long run. I am not trying to knock anyone down here, but honestly, if your facility is run by tier-1 technicians, you get what you have now. Imagine going through an upgrade or service pack release. Some companies can perform those on 500 PCs in a single night without ever actually visiting a PC. Some spend weeks doing one at a time. Unfortunately, the latter of the two is the nature of the business when "support" is contracted out and someone doing engineering is nowhere to be found. The tools are freely available from MS and third parties to make all of your various PCs pretty much act as one.
  • Back in school... (Score:4, Insightful)

    by SvnLyrBrto ( 62138 ) on Thursday August 17, 2006 @01:05AM (#15924884)
    They just used NIS and NFS, and the net effect was pretty much exactly what you describe... Sit down at any machine, log in, and your environment loads exactly the way you left it on the last machine; everything's safely backed up at the server end; the client machines are pretty much disposable and interchangeable; and so on. The only difference is that you're not farting around with virtual machines... i.e., you're not quite as "cutting edge", but on the desktops themselves, don't you want a more proven system? So why wouldn't you just do the same thing, and use said proven, if something of a pain to administer, system?

    As an alternative to NIS, NetInfo does much the same thing, only it wasn't designed by people quite as sadistic as the ones behind NIS. You'd still be using NFS, though...

    cya,
    john
  • An "unsupported configuration"...
  • by syousef ( 465911 ) on Thursday August 17, 2006 @01:19AM (#15924924) Journal
    That you're asking for advice on /. suggests you're not qualified.

    Several ways to fix this and get qualified:
    1) Trial it on a small number of less important users. Get feedback. Make sure you listen to that feedback. Allow a decent period of time for the trial so initial teething problems can be sorted. Allocate sufficient resources to deal with early issues. This is the hard way to learn...through experience.

    2) Hire expertise - someone that's done this before - to implement and advise. Make sure it's not a vendor, since you won't know if you're being screwed till it's too late.

    3) Get some training.

    DO NOT try to implement this for a large number of users in one hit. You're a fool if you do.
  • Smells like X (Score:3, Insightful)

    by Baloo Ursidae ( 29355 ) <dead@address.com> on Thursday August 17, 2006 @01:26AM (#15924943) Journal
    Sounds like you're trying to solve the same problem X11 [wikipedia.org] is designed to solve. Have you looked into getting a bunch of X terminals and one super-powerful machine?
  • by TwoEdge77 ( 92704 ) on Thursday August 17, 2006 @01:37AM (#15924986)
    It was called using a mainframe and 3270 terminals. Very reliable, easily updated.
  • by RShizzle ( 983535 ) on Thursday August 17, 2006 @02:30AM (#15925135) Homepage
    "You lose 3D, sound, and most of them run a bit slower than native."

    Not quite true. Yes on the 3D. But the two main players (VMware and VPC) both support sound, and VMware even does USB 1.1 passthrough.

    With the thin-client option, Microsoft Terminal Services (if you're on a Windows platform) has good scaling capabilities. Though it might not go into the hundreds or thousands, it should get you into the high dozens. Since most of the Microsoft tool's DLLs are loaded once and shared between the clients, it has pretty good performance.

    For Linux, while SSH is always a favorite, look at NX servers (http://www.nomachine.com/ [nomachine.com] and http://freenx.berlios.de/ [berlios.de]), which are like X forwarding with compression and caching.

    It'll be difficult to make a fully virtualized solution work, no matter how beefy your servers are and how fast your network is. Going with thin clients, or a PXE-served image, would be a more viable solution.
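
    For the "SSH is always a favorite" route, a minimal sketch of a thin-client wrapper that runs an application on a central server and displays it on the local screen over X forwarding; the apps.example.internal host name and the default xterm command are placeholders, not anything from the post above.

        #!/usr/bin/env python
        """Run a program on a central application server, displayed via X11 forwarding."""
        import subprocess
        import sys

        APP_SERVER = "apps.example.internal"   # hypothetical central application server

        def run_remote(command):
            # -X enables X11 forwarding; -C compresses the stream, which helps on slow
            # links (NX goes further by adding caching and a more efficient protocol).
            return subprocess.call(["ssh", "-X", "-C", APP_SERVER] + command)

        if __name__ == "__main__":
            sys.exit(run_remote(sys.argv[1:] or ["xterm"]))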

  • by moro_666 ( 414422 ) <kulminaator@gmai ... Nom minus author> on Thursday August 17, 2006 @02:36AM (#15925146) Homepage
    Hmm, I used Debian Linux on this setup, with a clunky Realtek 3189 network card, and my video over the Xv extension of the X server worked flawlessly, sound came through aRts over the net, and everything just works.

    It's down to the configuration; the network itself can do it.
  • by Vancorps ( 746090 ) on Thursday August 17, 2006 @02:43AM (#15925167)

    First off, I agree with you that this isn't a good application of a VM, considering the number of alternative options that exist already. The one area where I will disagree is the licensing, since you're in no way required to run Windows as your host OS. Just run a Linux-based host OS and problem solved. VMware runs just as well on both. I'm not sure about other options like Virtual PC or Qemu, but last I checked Qemu only worked on Linux, so you're still in a good position not to have to throw more money at Windows licensing.

    Side topic: licensing has really gotten out of hand with pretty much every piece of commercial software. I think that's the real reason a lot of people are moving towards Linux. The learning curve required to administer Linux effectively is outweighed by the complicated licensing schemes of various companies, Microsoft especially. It is quite a challenge staying in compliance these days.

    Back on topic: you could have a file server or three dedicated to the task, using a DFS root to link them logically and to keep them synchronized. Then you wouldn't have to worry about pushing images killing server performance. Combined with network load balancing, you could scale out as needed.
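
    DFS referrals are handled transparently by Windows clients, so no client-side code is needed there; purely to illustrate the same fall-back idea, here is a sketch that pulls an image from the first reachable replica in a list. The replica paths and cache directory are assumptions made up for the example, not part of any product.

        #!/usr/bin/env python
        """Copy a VM image from the first reachable replica in a hand-maintained list."""
        import os
        import shutil

        REPLICAS = [                        # hypothetical replicated image shares
            "/mnt/images-a/winxp-standard",
            "/mnt/images-b/winxp-standard",
            "/mnt/images-c/winxp-standard",
        ]
        LOCAL_DIR = "/var/cache/vmimages/winxp-standard"

        def pull_image():
            for replica in REPLICAS:
                if os.path.isdir(replica):  # crude reachability check
                    if os.path.isdir(LOCAL_DIR):
                        shutil.rmtree(LOCAL_DIR)
                    shutil.copytree(replica, LOCAL_DIR)
                    return replica
            raise RuntimeError("no image replica reachable")

        if __name__ == "__main__":
            print("pulled image from %s" % pull_image())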

  • Re:Smells like X (Score:1, Insightful)

    by Anonymous Coward on Thursday August 17, 2006 @02:48AM (#15925182)

    ... which is fine if you can replace the existing environment with one based on X11 clients.

    If not, and face it, most of the world isn't running Debian GNU/Linux like you'n'YT, this really isn't a feasible solution.

    That said: <sigh>, yes, Xterms do provide a really nice, simple, centrally maintainable desktop solution that's likely 95%+ acceptable for most of the world. Damn that 5%....

  • by mrcpu ( 132057 ) on Thursday August 17, 2006 @04:20AM (#15925376)
    VMware ACE would probably be a good choice; it allows you to lock down the host hardware, disabling various pieces.

    VMs can run off of network shares if you set things up right. Fast network, and you won't see a problem. I have run VMs off mirrored Ximeta NetDisks over 100-megabit with NTFS as the partition type, and it worked great, although it was only about 4 machines accessing it at one time. For office apps and such, it's a piece of cake.

    I encourage people to use vmware for laptops. Create an encrypted disk with the vmware image that they want to run; then if the laptop gets stolen, someone has to decrypt the disk before they can get to the really good stuff. Backups are easy, and it makes laptop "sharing", when necessary, something that you can do pretty easily as well - multiple shifts can share a PC easily. It's also easier to fix problems, test updates, and such by just snagging a copy of the image and monkeying with it.

    Citrix and remote desktop have their places as well.
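
    One way to approximate the encrypted-disk laptop setup described above on a Linux host, assuming a LUKS-encrypted partition holding the VM files; the device name, mapper name, mount point, and .vmx path are placeholders, and the script needs root. (VMware's own ACE/policy features are another route to a similar effect.)

        #!/usr/bin/env python
        """Unlock an encrypted container holding a VM image, run the VM, then lock it again."""
        import subprocess

        CONTAINER = "/dev/sda3"                  # hypothetical LUKS-encrypted partition
        MAPPED = "vmvault"                       # device-mapper name after unlocking
        MOUNTPOINT = "/mnt/vmvault"
        VMX = MOUNTPOINT + "/corp-xp/corp-xp.vmx"

        def run(cmd):
            if subprocess.call(cmd) != 0:
                raise RuntimeError("command failed: %s" % " ".join(cmd))

        if __name__ == "__main__":
            run(["cryptsetup", "luksOpen", CONTAINER, MAPPED])   # prompts for the passphrase
            try:
                run(["mount", "/dev/mapper/" + MAPPED, MOUNTPOINT])
                try:
                    run(["vmplayer", VMX])                       # blocks until the VM is closed
                finally:
                    run(["umount", MOUNTPOINT])
            finally:
                run(["cryptsetup", "luksClose", MAPPED])
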
  • Re:No 3D (Score:5, Insightful)

    by Jugalator ( 259273 ) on Thursday August 17, 2006 @06:02AM (#15925562) Journal
    "there's no 3D, no good audio etc"

    These two are often not an issue in corporate environments though.
    Sure, some exceptions depending on what kind of work you do, but still exceptions.
  • Re:Citrix (Score:2, Insightful)

    by MaerD ( 954222 ) on Thursday August 17, 2006 @07:51AM (#15925763)
    Citrix (or another similar product) is exactly what he should be looking into. Downloading entire disk images over a network is just a pain in the ass every time someone boots. However, Citrix isn't the solution to all things, yet it beats VMs for most practical applications.

    Er... Why exactly?

    A thin Linux desktop connected to a backend VMware server would provide exactly what the poster is looking for. VMware ESX seems perfect for this and eliminates the "download entire disk images" part. Basically, with ESX all of the VMs and associated images live and *run* on the server; the desktop is accessed via vmware-console, a little program that connects to the server and views the virtual machine, similar to Citrix/VNC/whatnot. With the clustering solutions available to ESX Server and the ability to move running machines between nodes, this seems like a good idea. The only real downside would be if your day-to-day involved 3D acceleration or heavy sound, in which case any solution except a "real" local workstation falls a bit short currently.

    I've used such a setup to run a Windows desktop for testing, and noticed no slowdown. This is even with 10 or so training and QA machines running on a P3 with 8G of RAM and lots of disk. No noticeable slowdown in performance even when the other machines were all doing CPU-intensive tasks.

    Xen also seems like it's coming along nicely, but doesn't seem ready to provide for Windows workstations on this scale yet.
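
    A sketch of the thin-desktop side of this, assuming the server-hosted guests expose RDP rather than going through vmware-console (either works in principle): a login hook looks up which VM belongs to the user in a simple mapping file and opens a full-screen session to it. The /etc/thinclient/vm-map.txt file and its format are made up for the example.

        #!/usr/bin/env python
        """Thin-client login hook: find the user's assigned VM and connect to it."""
        import getpass
        import subprocess

        VM_MAP = "/etc/thinclient/vm-map.txt"   # hypothetical "username vm-hostname" lines

        def vm_for(user):
            with open(VM_MAP) as f:
                for line in f:
                    parts = line.split()
                    if len(parts) == 2 and parts[0] == user:
                        return parts[1]
            raise LookupError("no VM assigned to %s" % user)

        if __name__ == "__main__":
            user = getpass.getuser()
            host = vm_for(user)
            # rdesktop: -f = full screen, -u = username presented to the Windows guest.
            subprocess.call(["rdesktop", "-f", "-u", user, host])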

  • Re:Inevitably (Score:3, Insightful)

    by Anonymous Coward on Thursday August 17, 2006 @08:21AM (#15925842)
    In every single case I've ever seen of being "constantly plagued by malware/spyware/etc," it was someone who was doing it to themselves. They were constantly stealing music, downloading porn or otherwise being stupid about their online activities. They didn't have automatic updates set, or were ignoring the stupid bubble that says they had updates waiting to be applied. They weren't running decent AV, probably never scanned their machine for spyware and so on.

    And yes, I'm blaming the victim. While there *should* be sense in saying that you ought to be able to walk anywhere without fear, if you keep going to drug-riddled areas downtown and getting mugged, then STOP GOING TO THOSE AREAS. Learn to take some responsibility for your own damn habits and learn a bit. You change the oil in your car and give it the occasional tune up... why not the same to your computer?
  • by JeepFanatic ( 993244 ) on Thursday August 17, 2006 @09:03AM (#15926013)
    I work for a .edu, and perhaps a solution we use here could be helpful if your main goal is a "clean" computer state at startup. We've (finally) moved to an Active Directory-based network. I'm now building .msi packages to be installed by AD and use a startup script to install a program called Deep Freeze, which prevents changes from being made to the system while it is in a "frozen" state. With Deep Freeze, any changes that are made to the system are removed on reboot. Any file storage is done on a networked home directory. Deep Freeze can be set up to "thaw" during the night so Windows and virus updates can run, and then "freeze" again after X amount of time has passed.

    If a computer breaks and needs to be replaced, we can drop in a replacement PC, move it into the proper organizational unit in AD, do a group policy update on the box and it will install the appropriate software on startup (and with the exception of programs like VS.Net it goes fairly quickly).
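
    The startup-script logic above boils down to "install the package silently if it isn't there yet". A sketch of that check in Python (in practice an AD startup script would more likely be a .bat or .vbs making the same msiexec call); the share path, package name, and the directory used as an installed-marker are placeholders.

        #!/usr/bin/env python
        """Install a package via msiexec on boot if it isn't already present."""
        import os
        import subprocess

        PACKAGE = r"\\fileserver\deploy\DeepFreeze.msi"      # hypothetical deployment share
        MARKER = r"C:\Program Files\Faronics\Deep Freeze"    # assumed install directory to test for

        def ensure_installed():
            if os.path.isdir(MARKER):
                return  # already installed, nothing to do on this boot
            # /i = install, /qn = no UI, /norestart = leave reboots to the admin
            subprocess.call(["msiexec", "/i", PACKAGE, "/qn", "/norestart"])

        if __name__ == "__main__":
            ensure_installed()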

  • Re:Inevitably (Score:5, Insightful)

    by KagatoLNX ( 141673 ) <kagato@@@souja...net> on Thursday August 17, 2006 @09:33AM (#15926167) Homepage
    Every time I hear this it amazes me how unrealistic this line of reasoning really is. The essential statement is that, since there was a way to prevent the problem, the onus is on the user to "know what to do" because they are obviously "ignorant". It's like returning a broken hammer to the hardware store only to be told you "shouldn't have hit something so hard". Have you considered that the real problem lies in allowing vendors to completely avoid liability for their mistakes? Perhaps the lack of an economic incentive to make a good product has created the environment where this is possible?

    From a practical perspective, telling college students not to download music, to avoid MySpace, and to not download seemingly harmless things like screensavers and wallpaper is about as effective as the rhythm method [wikipedia.org]. Sure, they're "sinners" with their pr0n and their music. How dare they? They get what they deserve by using a computer on the internet to download the information they want. That's a sin to be sure. It's strange how that apparently makes them culpable for the systematic, intentional, and malicious exploitation of their computers. Of course, the long-term social effects of corporate self-interest manipulating law and public opinion to create stigma in their economic interest are beyond the scope of a Slashdot comment.

    Back to the technical issues. Understand that a lot of malware immediately turns off ActiveX security. They leave the door wide open behind them. In your perfect world, not only does every user have to be perfectly responsible and knowledgeable, but they also can't make even a single mistake, since one mistake basically leaves them wide open (i.e., the browser no longer asks, it just downloads and installs any application that asks) in many cases.

    Similarly, there is no safe site. A vast majority of the web is ad-sponsored. A single malicious banner ad can catch millions (the recent MySpace incident, for example). Expecting every user and every advertising company (with possibly tens of thousands of ads) to never make a single mistake is as unrealistic as it is lazy. The web can be secure if people would put the effort into getting secure systems developed and into place instead of blaming security problems on the sinners.

    Ironically, one of your "solutions", antivirus software (a.k.a. stopgap measure or snake oil, depending on your inclination), is probably the reason things are as bad as they are. Rather than closing holes, AV just stomps the critters that run in through them. If users had insisted on fixes and security rather than installing Norton Antivirus (and considering it "fixed"), things probably wouldn't be nearly so bad as they are. It would also be nice if the economic disincentive for insecurity would lie with the vendors, where it belongs, not with each and every user.

    People don't realize it, but this is really an old misconception. Make something illegal, and its sources become disreputable. This then reinforces the belief that it's inherently bad. My issues of concern are software licensing, patents, and copyright reform. I'm sure the same argument could (validly) be made for marijuana, prohibition, and prostitution.

    Of course we've got a double-whammy with software security. Not only are the sinners browsing seedy sites, there is also no one responsible for protecting them (since the vendors have all licensed their cares away).
  • Re:No 3D (Score:3, Insightful)

    by perlchild ( 582235 ) on Thursday August 17, 2006 @10:43AM (#15926627)
    Because in both cases the VM stands for virtual machine.
    If people would call virtualization "virtual hardware" (well, anything that doesn't start with M would be good), the confusion might not exist.
  • Re:No 3D (Score:5, Insightful)

    by misleb ( 129952 ) on Thursday August 17, 2006 @10:50AM (#15926686)
    Where I work, I have had significant trouble with VMware images used on different makes/models of desktops. For instance, one XP image I made on a Dell Dimension 620 would come up with some random error when loaded on a Dimension 270, and vice versa. This problem is extremely prominent with Vista builds as well. There are a lot of unknowns like that to consider with such a large-scale use of VMware.


    That doesn't make sense. VMware should provide exactly the same virtual hardware to the guest no matter what physical computer you run the image on. In fact, that is one of the biggest selling points for VMware.

    Are you creating the VMware image FROM the Dimension 620, or running a fresh "virtual" install of XP?

    -matthew
  • by paradocity ( 324024 ) on Thursday August 17, 2006 @11:26AM (#15926982)
    I would also check out a product by Faronics called Deep Freeze.

    From the website: "Deep Freeze instantly protects and preserves baseline computer configurations. No matter what changes a user makes to a workstation, simply restart to eradicate all changes and reset the computer to its original state - right down to the last byte. Expensive computer assets are kept running at 100% capacity and technical support time is reduced or eliminated completely. The result is consistent trouble-free computing on a truly protected and parallel network, completely free of harmful viruses and unwanted programs."

    At my company we use a combination of Citrix and DeepFreeze that allows users to roam from station to station while still having full access to all of their apps and data (stored on the network). DeepFreeze ensures that a user never messes up the local computer with anything that a reboot can't fix.

    You could also just do Deep Freeze, roaming profiles, and network-based app installs, which would ensure the apps and data are on the network.
  • Re:Inevitably (Score:2, Insightful)

    by BluenoseJake ( 944685 ) on Thursday August 17, 2006 @11:26AM (#15926983)
    No, you're wrong. It is more like screwing a thousand people unprotected and then complaining when you get syphilis. The tools are there, the info is there; take the time to learn how to use the equipment that you are using. You don't just get into a car and drive it, so why can't people learn to use their computers?
