Windows Thin Clients - Worth Making the Switch?

Brendtron 5000 asks: "I work in the IT department of a major Canadian university. I've been given the task of investigating the pros/cons and costs associated with switching from Windows desktop machines to some kind of thin client solution. Both student lab and administrative machines are up for possible replacement. At first blush it seems that the cost savings will be considerable, given that thin clients are much cheaper and easier to maintain than a user-controlled desktop machine. What have your experiences been with switching to and managing thin client environments? Have the users been happy with thin clients? Did the cost savings materialize as expected?"
  • Yes, but not anymore (Score:5, Informative)

    by AKAImBatman ( 238306 ) * <[moc.liamg] [ta] [namtabmiaka]> on Tuesday May 09, 2006 @10:14PM (#15298277) Homepage Journal
    Back when Citrix produced an inexpensive version of Windows capable of supporting ~40 concurrent clients on a single dual-proc machine, the answer was "yes." The cost savings were huge. Not so much from the hardware, but from the support. Users were simply unable to screw up their desktops, could log in remotely over a modem (!), and IT could share the session with the user to fix the problem without ever leaving their desk.

    Then Microsoft got involved. They refused to license to Citrix again, and released their own Terminal Services. The price skyrocketed, the licensing became confusing, the protocol was much heavier, and the system became far less stable under load. Not much has changed.

    It was a wonderful thing while it lasted, but don't expect to see any real returns on a modern Terminal Services system. The only real uses they have these days are remote administration and centralized applications. And you can expect to pay for those features.
  • by AKAImBatman ( 238306 ) * <[moc.liamg] [ta] [namtabmiaka]> on Tuesday May 09, 2006 @10:51PM (#15298447) Homepage Journal
    Citrix does license much of Terminal Services from Microsoft.

    Replace "much" with "very little" and you're a lot closer.

    And their package is an add-on to Microsoft Terminal Services

    Precisely. Citrix WinFrame was a custom version of NT 3.51 modified by Citrix to handle thin clienting. Microsoft stopped licensing Windows NT immediately thereafter, and forced Citrix to become an expensive add-on to an already uber-expensive product. Oh, and Microsoft botched the job while they were at it. RDP sucks in comparison to ICA, and stability went down the drain.
  • My experience (Score:5, Informative)

    by Devoir ( 973883 ) on Tuesday May 09, 2006 @11:12PM (#15298536)
    I've done some work for an educational institution that wanted a multi-site, low-cost upgrade to their network.

    The decision was made to go ahead with a Windows Terminal Services-based operation. The decision was helped by the fact that they were an educational institution with volume licence key access, so there was no cost in relation to that.

    A Dell twin-processor server was ordered and set up with the standard host of software, integrated into their domain, and ran like a charm. The clients were set up with ThinStation, which is a phenomenal piece of software. This alone has enabled them to save tens of thousands of dollars simply on hardware. A new site which they took over had a number of machines which would be considered out of date, or subpar. This included first- and second-generation Pentium PCs and the like that would not have been considered for 'active duty' if they were required to run Windows XP with the latest and greatest productivity suites.
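
    For anyone curious, a ThinStation terminal is driven by a small text config served alongside the boot image. A minimal session definition looks roughly like the sketch below; I'm quoting the key names (SESSION_0_TYPE, SESSION_0_RDESKTOP_SERVER, and so on) from memory of the 2.x series, and the server address is a placeholder, so check them against the sample config that ships with your build:

      # thinstation.conf.network -- shared by every client booting from this server
      SESSION_0_TITLE="Terminal Server"
      SESSION_0_TYPE=rdesktop
      SESSION_0_RDESKTOP_SERVER=192.168.1.10   # placeholder; your Windows TS box
      SESSION_0_AUTOSTART=On
      SCREEN_RESOLUTION="1024x768"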

    At this point I should mention the initial deployment was planned for only the administration PCs, but due to the performance, savings and general ease of transition, management has indicated they would like to move forward with classroom deployment soon.

    All up, this single server is handling up to 50 clients at any one time, and because it's running Terminal Services, their remote-site bandwidth requirements have decreased significantly.

    The time it takes to set up the ThinStation software is far outweighed by the time it would take to create and deploy a full image, and there is an additional benefit that everything is exactly the same no matter which staff member accesses which terminal.

    I'm unsure how educational licences operate in the organisation mentioned in the OP, but if it's anything like my experience, then the labour costs, hardware costs and sheer frustration cut out from dealing with an equivalent non-TS environment are definitely worth it from the point of view of both myself and the client.
  • by bucketoftruth ( 583696 ) on Tuesday May 09, 2006 @11:19PM (#15298572)
    I have to give props to the parent. I've spent a while searching for a good thin client system and have finally settled on LTSP. Amazingly easy to set up and amazing documentation. The kind of documentation that gives real answers and doesn't just lead you on a wild Google chase. I'm impressed by how fast it is.
  • by Natasha ( 31280 ) on Tuesday May 09, 2006 @11:30PM (#15298605)
    We run about 50 Wyse thin clients at the credit union where I work. They have some definite pros and cons. The nicest thing for us is that we can see the desktops of our tellers at any of our 6 branches. Makes support a hell of a lot easier. Plus if there's a power outage where your clients are, when they come back up you're right where you left off (assuming your servers are on UPSs).

    Where you run into trouble is the shared server resources. If you have a few people using large Excel documents it can seriously affect performance. It can also be a problem if you allow things like Flash. A resource-heavy Flash program can eat up a single CPU (core, HT, whatever) pretty easily. Also, if your servers are in a different location from your clients, a network outage pretty much means you're dead in the water until it comes back up.

    Currently we're about 50/50 on thin clients to PCs because some apps are just better off on a PC. Also, when you buy the clients, you might want to get ones with legacy ports. Most USB devices won't forward to the desktop, but serial and parallel will.
  • by AKAImBatman ( 238306 ) * <[moc.liamg] [ta] [namtabmiaka]> on Wednesday May 10, 2006 @12:11AM (#15298749) Homepage Journal
    User's reasons include: insufficient bandwidth to display the graphics I use,

    This wasn't an issue with ICA. Modem speeds were more than enough for our users to feel like they were at work. 10 Mbit LAN connections hooked up to a hub made their thin clients seem like they were flying. (Granted, this was back when Windows was designed to run in 16- to 256-color modes.) I wouldn't recommend playing a video game over ICA (though you could), but everything else from Word to videos worked great.

    insufficient dedicated CPU time for the programs I need to run

    This wasn't an issue for us. For most office workers, CPU simply doesn't matter as it's underutilized anyway. You need very powerful apps that are generally outside the purview of office workers to make a dent in the CPU power.

    and "one network glitch and the whole enterprise stops working."

    This was always an issue. Thankfully, the network was stable and the machine was mostly stable, so we were usually able to schedule downtime outside of business hours. In the few cases where a reboot was necessary during business hours, it was usually quick and, from a user perspective, not much different from losing access to some sort of client/server application. Plus, they actually knew it was down, as opposed to getting a cryptic "cannot connect to server" message from the client software.

    Face it, I don't really think you are saving much in terms of central administration because you are going to have select users that need custom tools.

    Citrix WinFrame was so nice for this. You'd create a desktop type for each category of user. (For us it was by department.) You could configure this desktop to have access to specific application icons, and no others. Security could be reinforced with Windows ACL permissions aligned to the same users. You then saved that desktop configuration and assigned it to as many users as needed it. There were a few oddballs who had very specific requirements, but it was easy to meet their needs given all the regular desktop support we *didn't* have to do.

    When Microsoft released Terminal Services, they screwed up many (most?) of these features that made it a workable concept.
  • well (Score:2, Informative)

    by i_c_andrade ( 795205 ) on Wednesday May 10, 2006 @12:17AM (#15298765)
    I have to manage a 21-user CPA firm. Thin clients work great EXCEPT when you have to deal with software that does not, WILL NOT, work on a terminal server (Quickbooks, I am looking at you, and CCH/ProSystem fx). That, and all the horrible 1980's abominations of software that accountants like to use. From an admin point of view it's great, and users like it also. It's just that 70%+ of normal Windows software does not work on a server, let alone a terminal server; it wants to run on a workstation.
  • by SerialHistorian ( 565638 ) on Wednesday May 10, 2006 @12:39AM (#15298835)
    I have also deployed LTSP and PXES at call centers. We deployed Gnome on CentOS 4 to over 40 desktops. We ran into a few problems with the initial LTSP rollout, which prompted a switch to PXES in one case. Namely, LTSP depends on NFS to load the kernel, and for some reason we couldn't get NFS to work on the network in one office... and we still haven't figured it out, because it works fine at two other installs. It was a headache and a half: terminals would freeze halfway through the day when they lost the drive their kernel was on, terminals would never load, etc. PXES was much easier to deploy because clients load their kernel via TFTP on boot after PXE.

    The other struggle we ran into was that clients needed more RAM than we initially thought. To run several instances of OpenOffice, Firefox, and other daily-use applications, clients quickly ran out of memory... at which point all of the windows in Gnome closed suddenly on the user. I thought it was great... "Wow, a kernel that protects its own memory resources, and just shuts things down if it needs more!"... but users didn't. This could be solved two ways: by using a buggy NFS mount to the server as swap space, or by cramming more RAM in the box. We found that 256 MB was the minimum for a user who kept a lot of windows open. A locked-down machine that only allowed Firefox could probably get by with less, and recompiling things like the kernel, Gnome, Firefox, etc. with patches eyeing memory consumption would also allow less.

    We used LDAP as our authentication and password store, which had a lot of advantages as far as single sign-on and global authentication went. Unfortunately, the LDAP admin client that we chose was never implemented properly, so the advantages went unrealized in a lot of ways. We should've written our own admin client.

    The server hummed right along. OpenOffice will by far be your most intensive application; with 40-50 users running OpenOffice, you'd see almost 100% memory and processor utilization... but would not yet start to see lag. We were running on a dual Opteron server in 32-bit mode (due to the need for the Flash plugin, which was not available in 64-bit for Firefox/Linux at the time we were fscking with it), with 4 GB of RAM and SATA2 drives in 3 RAID1 arrays: one for user homes, one for system and applications, and one very small array hanging off the 2nd bus for swap, for performance.

    Be sure to include applications in your sizing and think through your hardware. Don't buy a vendor's bill of goods, and note that using extensive NAS or fibre-channel drive arrays, while fancy, might slow things down badly for your users. We ran NX for remote desktop purposes and had several users that worked from home.
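
    (For reference, the NFS piece of LTSP mostly boils down to the server exporting the client root tree read-only to the terminal subnet. This is just a sketch with placeholder paths and addresses, so adjust for your LTSP version and network:)

      # /etc/exports on the LTSP server -- terminals mount their root filesystem from here
      /opt/ltsp/i386    192.168.0.0/255.255.255.0(ro,no_root_squash,async)
      # after editing, reload the export table with: exportfs -ra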
  • by Eil ( 82413 ) on Wednesday May 10, 2006 @01:28AM (#15298974) Homepage Journal
    We've done exactly this for a number of customers. The thin clients boot up, grab a kernel from the LTSP machine, then start X and a copy of rdesktop pointed at the Windows terminal server. The only major drawbacks are that things like printers, USB devices, sound, and local removable media do not automatically get passed to the server. I understand that some of these are actually being worked on. We've also had a couple of employees complain that they couldn't play their music CDs in the machine. (And one of those actually had a brand new company-provided FM radio/CD player sitting on his desk right next to him...)
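
    (The rdesktop call the terminals end up running is nothing exotic; a hand-typed equivalent would look something like the line below. The host, user, and domain names are made up, and the -r redirection options only help if your rdesktop build supports them and the device is already mounted on the terminal, which is exactly the part that doesn't happen automatically today:)

      rdesktop -u jsmith -d CAMPUS -g 1024x768 -a 16 \
               -r sound:local -r disk:usbkey=/mnt/usb termserv1.example.edu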

    One word of caution: If you plan to run Windows 2003, do not expect certain peripherals (scanners and printers, mainly) and software to work properly. Since it's touted by MS as a server OS, many driver and application developers specifically exclude it from the software's internal compatibility list and the software will refuse to install. If you think you'll be hooking up a lot of peripherals or running a lot of odd little applications, consider Windows 2000 instead, which unfortunately is pretty much unsupported by Microsoft these days...
  • security is critical (Score:2, Informative)

    by datazone ( 5048 ) on Wednesday May 10, 2006 @02:45AM (#15299150) Journal
    Since no one here has yet mentioned the nightmare that security can be on a Windows TS system, I will go ahead and let you know. If you care anything about security you will be paying some good money to Citrix for their reporting tools that keep track of which apps which users run and such.

    The main issue is that when you have multiple end users coming from the same IP address (the TS), online fraud tracking can be almost impossible if the user hides their tracks.
    What do I mean?
    Let's say you have 10 users all running Firefox at the same time, and one of them uses a customer's credit card to buy stuff from some online store. How do you find that user? Well, hopefully the online store has some good logs to help you (yeah, right); otherwise you are going to be searching browser history and cookie data to see which user went to that site. If the user is smart, they will purge their data in such a way that only that entry is no longer in their privacy data.
    Your only option is to set up a proxy which requires per-user authentication and has damn good logging enabled.
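
    (With Squid, that boils down to a handful of lines like the fragment below. Directive names drift between Squid releases and the auth helper path is distro-specific, so treat it as a sketch rather than a drop-in config:)

      # squid.conf fragment -- force browser logins so the access log carries a username
      auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/passwd
      auth_param basic realm Terminal Server proxy
      acl ts_users proxy_auth REQUIRED
      http_access allow ts_users
      http_access deny all
      access_log /var/log/squid/access.log squid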

    Another issue is trying to lock down the TS. It can be done, but it takes a long time to get the right balance of security and usability for all the applications that your users will need to run. God help you when you get applications that refuse to install or require stupid permissions to run.

    In the end I found that TS works for general users that use a defined set of applications.
    In a perfect world these applications would be restricted to:
    - simple office documents
    - limited external websites (you better be using a secure proxy)
    - web based internal applications (you better have strong authentication and good audit trails)

  • by W. Justice Black ( 11445 ) on Wednesday May 10, 2006 @04:48AM (#15299494) Homepage
    Obligatory Background:

    I currently work for Sun's Network Systems Group (x64 servers). I use a SunRay to do the vast bulk of my work every day. I have run my own SunRay servers (running Linux) for over two years.

    I used to work for a company called Taos, whose user infrastructure was entirely Windows Terminal Services + Citrix Metaframe. Another SA and I ran their terminal servers for my entire tenure there (about 2.5 years, plus I was doing application development at the same time).

    The Response:

    Those who hate thin clients (TCs for short) tend to do so for the following reasons:

    1. Initial procurement cost and software licensure is no cheaper than desktops. For some software (the Citrix bits in particular), it's significantly more expensive.
    2. Users can't (or damn well shouldn't be able to) run arbitrary software--the joke "screensaver" that their friend sent them, for example ("screensavers" are just eye candy anyway--who cares about saving a CRT anymore?).
    3. Performance with certain apps (video in particular) is highly network-bound and potentially crappy.
    4. Limited number of points of failure, so a dead server can affect many people.
    5. "What do you mean I can't plug my webcam/phone/food processor in there?"

    Most of these arguments are lame, because:

    1. Thin client hardware has MUCH better longevity than its desktop brethren--five or more years out of TC hardware is the rule, not the exception.
    2. Users shouldn't be running arbitrary software anyway in most business settings.
    3. Performance with most other apps is stellar, as the first user to load an app "greases the skids," putting most of the app in cache for everyone else.
    4. If you do it right, you have configured your server to be relatively bulletproof, and have one or more backups (typically folks don't have backup desktop machines :-/ ). Plus, many people's work is network bound--a dead conventional server means people can't save stuff or can't update the company database (or whatever they're doing) anyway, making the desktop of little real use when their related servers (or the network generally) go down.
    5. Is more-or-less valid. There are some devices that simply will not work when attached to TC hardware (though a surprisingly large number of things will). Whether that's really a problem or not is in the eyes of the beholder.

    You also get the following benefits out of TCs:

    1. No crawling under a desk and facing the Dust Bunny Army (tm) to replace a dead drive, or removing 20 lbs of personal effects to upgrade someone's RAM.
    2. True centralized deployment of software--no guessing if an app got installed or WTF is actually on someone's hard drive (deployment solutions for PCs other than ghosting the whole drive have this nasty habit of being finicky).
    3. With some solutions (Citrix in particular), you can "publish" an application, making just one app available to those who MUST use a PC, so you can mix-and-match your clients if need be.
    4. Usability over slower WAN links is usually pretty good (especially with Citrix).
    5. Some solutions (particularly SunRay on Linux or Solaris) allow session "portability," which means that you can start typing a sentence, pull out your card, walk down the hallway (to, say, a meeting room), plop in your card and finish your sentence. To those that have never tried it, this seems silly. To those who use it daily, it's a Godsend (like those who are addicted to TiVo, SunRay session portability is something you just have to "get").
    6. TC hardware is generally SILENT and consumes very little electricity.

    SAs, like any users, hate TCs because they're "limited" in what they can do. The smart ones end up loving TCs precisely because users are limited in what they can do. That said, you also have to deal with the real TC problems:

    1. Some apps just won't behave, or they require a ton of work to behave. This problem has gotten better with time, but stories abound of ba
  • by Seraphnote ( 655201 ) on Wednesday May 10, 2006 @05:15AM (#15299563)
    My suggestion is to TEST IT yourself, to help you decide how to do it.
    But know that you need to answer at least the following first:

    What kind of server are you going to run? Windows TS, Citrix, or Linux? If you're a Windows Admin who knows user management, Active Directory, and GPOs already, then the learning curve is shortest to the Windows TS. Citrix will mean learning it as a whole new server application. And Linux will mean knowing Linux and having apps that run on Linux.

    What kind of thin client are you going to run? The thin client has to support connecting to the type of server you chose. Besides Linux thin clients, which by the way can connect to Windows TS using "rdesktop", and do so quite well, there are two kinds of Windows thin clients: CE and XP Embedded. OH, and just because you know Windows desktop OSes, don't think that immediately translates to knowing how to configure Windows thin clients, CE or XP Embedded. They are wholly separate beasts.

    And then you have to decide if you are going to centrally manage the images of the thin clients or not. You can configure them each individually, or you can set up a PXE server, and boot your thin clients from images you've prepared and stored there. I think some thin clients can be set to autoload their images from FTP or TFTP servers also. But centralized thin client management is a whole other project that you may not have the resources to implement and maintain.
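
    (The PXE side is mostly a matter of pointing DHCP at a TFTP server. An ISC dhcpd fragment along these lines, with made-up addresses and the usual pxelinux.0 loader, is all the terminals need to find their image:)

      # dhcpd.conf fragment -- hand thin clients a boot loader over TFTP
      subnet 192.168.10.0 netmask 255.255.255.0 {
          range 192.168.10.100 192.168.10.200;
          next-server 192.168.10.5;      # TFTP server holding the boot images
          filename "pxelinux.0";         # loader that then fetches the kernel/initrd
      }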

    As someone else on this thread mentioned, you have to know if the applications you want to run in this setup will run in the server/client configuration you choose. And the only way to know this may be to try it. For example, I'm currently implementing an ERP application that won't run via the Windows CE thin clients but will run via the Windows XP Embedded thin clients. (But I'm running them via Linux thin clients.)

    Remember you can test your thin clients with the administrative TS that comes with every Windows server.
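
    (In practice that just means pointing the Remote Desktop client at the box; the server name below is a placeholder. Remote Desktop for Administration gives you two concurrent connections without needing TS CALs:)

      rem ordinary remote session to the terminal server
      mstsc /v:termserv1
      rem attach to the server's console session instead
      mstsc /v:termserv1 /console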

    Here you'll find examples of some decent thin clients:
    http://h10010.www1.hp.com/wwpc/us/en/sm/WF04a/12454-321959-89307-338927-89307.html/ [hp.com] There are plenty of others.

    I'm using the HP t5525 Linux Thin Client with Windows Server 2003 R2 Terminal Services. This works great. I don't have time right now to deal with the central administration of images, so I spent a morning figuring out how I wanted to configure the t5525, and listed out instructions on how exactly to end up with the same config each time we need to configure one manually. Two of us set up a room of 10 of these, unpacking, hooking up, and configuring, and test connecting to Win TS in 40 minutes.

    I tried a t7510. It would be a great "grandparent PC" for a grandparent with broadband who wanted to web-browse (and maybe get sued by the RIAA for downloading music and videos to play in Windows Media Player). But it was too quirky and different from desktop XP to know instantly how to configure/maintain it. And again, I didn't need all the crap it came pre-loaded with. This is supposed to be a "thin client" running apps on the TS, not a "thin client" running apps on itself.

    Anyways, the cost of the thin-clients is so low, you really ought to get a couple and try them yourself, before you commit to your grand solution.
  • by override11 ( 516715 ) <cpeterson@gts.gaineycorp.com> on Wednesday May 10, 2006 @07:45AM (#15299899) Homepage
    The latest LTSP (which we recently upgraded to) supports locally connected USB devices (mice, thumbdrives, scanners, etc.), and they can even be shared by multiple users! We have 45 users running off a single LTSP server (dual 2.4 GHz, 4 GB RAM), providing OpenOffice, Mochasoft running through Wine (don't ask), Evolution email, Firefox, etc. Runs really nicely, and it's so easy to maintain!
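
    The client-side configuration all lives in one short lts.conf; roughly the sketch below. The paths, addresses, and especially the local-device key name vary between LTSP releases (4.2 vs. 5), so double-check against the documentation for your version:

      # lts.conf -- one [Default] section covers every terminal
      [Default]
          SERVER    = 192.168.0.1   # LTSP/application server (placeholder address)
          SCREEN_01 = startx        # boot straight into an X session
          SOUND     = Y
          LOCALDEV  = Y             # local USB storage; key name differs in older releases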
  • Citrix experiences (Score:2, Informative)

    by Caffeinated Geek ( 948530 ) on Wednesday May 10, 2006 @08:06AM (#15299952) Homepage
    To a great extent the answer is it depends. I have had experience with Citrix in two different environments. In the first I was not directly involved but it was closer to what you are describing. It was a school lab situation.

    The problems they ran into included simultaneous logins and reboots: you essentially have all the machines hitting the server at one time. The hardware requirements that their vendor recommended were woefully inadequate. What was supposed to handle 40 users started to degrade at 10 and would start to fail around 20 or 25. The initial server was scrapped and replaced with a much more robust solution. This was at least partially a vendor issue.

    The second implementation is the one I support now. From this one I can say that, in my environment, ICA over a busy WAN connection is not as fast as being on the local LAN. I have not measured this yet, but my theory is that this is more related to latency than to actual bandwidth exhaustion. On the LAN, Citrix and Terminal Server are both very nice. Printers are a beast. This won't be much of a problem in a lab environment, since it's just a matter of making sure you purchase printers that work, but for an environment where you have people using home equipment it's a complete headache. Citrix knows this is a problem and things are getting better.

    One of our apps is a resource hog so publishing it was horrible for the server. You definitely want to profile any applications before the implementation to make sure that they will work in a multi user environment. Thankfully this app is very rarely used by my Citrix users.

    Something you probably want to consider is teaming the servers so that if one goes down you do not completely lose access. Don't forget you have to be able to carry the whole load with machines missing or you are wasting your time.

    My experience in both environments is that you can never lock the desktop/server down as much as you would like to. If you have a "nice" user base this may not be a problem but if you have people who want to tinker this can be a huge problem. Citrix does offer some advice and it will get you part way. Unfortunately anything that people can do to your locked down PCs now they will be trying to do to your server in a thin client environment. Personally for ease of management, I like Deep Freeze. That is the product that the university I worked for went to a few months after I left and the last time I asked they were having great results. But I know that is not what you are looking for.
  • by addbo ( 165128 ) on Wednesday May 10, 2006 @11:59AM (#15301535)
    Background: I work for a Health Authority in Canada and support around 300 users. We currently have ~100 thin clients deployed in our organization. They connect to 2 Network Load Balanced Terminal Servers running on Windows 2003 Enterprise Edition. We're not using Citrix (too expensive for our little shop)

    Server Specs: IBM xSeries 345, Model: 8670-L1X, 2 x 2.8 Ghz Xeon, 4 GB RAM, RAID-1 of 2 x 36.4 GB for paging file, RAID 1-0 4 x 73.6 GB for OS.

    Currently have 40 users on one of the servers, CPU goes between 0 and 25%, RAM usage at 1.66 GB. So not exactly taxed.

    My experience is that thin clients are much easier to support. Thin clients out of the box can be configured and set up in about 10 minutes (we use Windows CE thin clients from HP; you just set up one thin client exactly how you like it, then export the settings to a file, and using that file you quickly set up all the other thin clients). Plus the ability to remote control a user's desktop from the Terminal Server manager is great for helping with various little problems and saves a lot of time. (You do everything from your workstation.)

    If you're on a Windows domain you can create an OU to place your Terminal Servers in, and then create an extremely locked-down Group Policy that applies just to those servers: disabling Control Panel, limiting the Start menu, even logging off users who leave a connection idle for more than a specified amount of time. Of course, how locked down you want to be will have to be tailored for each individual organization (for instance, we allow our users to add their own printers).
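
    (To give a feel for it, the settings we lean on hardest are listed below. The names are paraphrased from the Windows Server 2003 Administrative Templates, so the exact wording in your Group Policy editor may differ slightly:)

      User Configuration > Administrative Templates > Control Panel
          Prohibit access to the Control Panel                    = Enabled
      User Configuration > Administrative Templates > Start Menu and Taskbar
          Remove Run menu from Start Menu                         = Enabled
      Computer Configuration > Administrative Templates > Windows Components
          > Terminal Services > Sessions
          Set time limit for disconnected sessions                = Enabled
          Sets a time limit for active but idle Terminal Services sessions = Enabled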

    I don't recall having any network performance issues as even Windows Terminal Server is "thin" enough for a decent LAN/WAN environment... we have clinics connected to us all around our little town... I connect to our servers at home via VPN and RDP and don't notice a large lag... even when I'm travelling out of town. We have Gigabit switches connecting our servers and 100Mbit connections throughout though... with fibre connecting the clinics.

    The issue of "if your network goes down, so does your terminal server" is true... but then again, without the network we don't have the ability to print, access our file server, or authenticate. Plus anything you were working on when the network goes out isn't gone... your session is just in a disconnected state, and when you log on again your session is revived with all your work intact.

    Thing is, thin clients AREN'T for every user though... most users I've spoken to love the fact that the thin clients boot up so quickly and they can get to their work faster. But thin clients are really ideal for lots of users with a very standard set of application needs... Office 2003, a scheduling application, etc. There are some applications (like a few ERP and accounting applications) that aren't built too well for thin client use and, if put onto a terminal server, will chew through your server resources. If you have users who need to play with GIS data, or do AutoCAD, or graphics-intensive things... then you're better off keeping workstations for those users... but for users that use basic office productivity apps (I'm guessing like campus computer labs for students) then a thin client environment might be ideal.

    You don't have to go all or nothing right? Pick the tool best for the job ;)

    You also have a choice on the windows side of whether to go Windows CE or Windows XP embedded... having tried the two out... I would strongly recommend Windows CE... it's a much smaller OS which boots up quicker and the ability to lock it down is phenomenal... I setup our CE machines to be single button logon which just connects to an RDP session... nothing else... unless you need to install and share out printers from a thin client (then I would recommend XPe)... or need to register your assets into the domain?

    I find that I hardly ever have to do support calls for the Thin Clients, meanwhile for desktops/laptops they usually come up with quirky issues that take up a majority of my time. When I do have to support thin clients it can usually be centrally managed via the terminal server manager.

    Anyways that's my two cents.

    Addbo
  • by llefler ( 184847 ) on Wednesday May 10, 2006 @12:53PM (#15301953)
    Good luck buying a thin-client for under $400.

    You're just not looking very hard. I have two terminals that I purchased new for $150 each. Even if you match it with a 17" LCD you can still come in under $400.

    http://www.ntavo.com/ [ntavo.com]

    I'm not affiliated with them, just purchased a couple terminals to demo to clients. They support X and RDP.
  • Running an Opteron in 32-bit mode is a mistake...
    You could have a 64-bit kernel and a 32-bit userland, à la Solaris... But with a 32-bit kernel, you significantly lose performance when you go over 1 GB of RAM, since you have to use nasty kludges like highmem.
