Is Open Source too Complex? 356

Jason Pillai writes to tell us ZDNet is reporting that at last month's Microsoft Worldwide Partner Conference in Boston, Ryan Gavin, director of platform strategy, claimed that one of the big downsides to open source is complexity. From the article: "Gavin noted that the flexibility of open-source software in meeting specific business needs also means systems integrators and ISVs have to grapple with complexity costs. 'It's challenging for partners to build competencies to support Linux, because you never quite know what you're going to be supporting,' he added. 'Customers who run Linux could be operating in Red Hat, [Novell's] Suse, or even customized Debian environments,' he explained. 'You don't get that repeatable [development] process to build your business over time.'" More than once I have had complaints that my setup is more difficult than necessary. Is open source really that much harder, or just different from what most are used to?
  • Eh? (Score:5, Interesting)

    by bloodredsun ( 826017 ) <martin AT bloodredsun DOT com> on Tuesday August 08, 2006 @07:53AM (#15864867) Journal

    Sorry, but I read this as "Choice is confusing - stick with what you are comfortable with. Hey look, that's us!"

    This sort of gibberish is what you would expect from the maker of the most popular product on the market when it's being challenged for the first time in a while.

  • by inflex ( 123318 ) on Tuesday August 08, 2006 @07:55AM (#15864873) Homepage Journal
    Supporting -Linux- from a cold-start is a pain, not OpenSource.

    With Solaris and FreeBSD (as examples) you know what you're in for when you get there. With Linux you never quite know for sure. Sure, you can gear yourself up for most of the more common setups (Debian, RH, etc.), but beyond that things fracture into thousands of variants. From startup scripts to configuration files, it's a mess.
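    For instance, even restarting the same daemon looks different from distro to distro (the script and service names below are only illustrative and vary by release):

        # Restarting the same web server on a few different Linux distros
        # (paths and names are illustrative; they differ between releases)
        /etc/init.d/apache2 restart     # Debian-style init script
        service httpd restart           # Red Hat-style service wrapper
        /etc/rc.d/rc.httpd restart      # Slackware rc script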
  • Re:Learning curve (Score:0, Interesting)

    by Anonymous Coward on Tuesday August 08, 2006 @07:55AM (#15864875)
    All config-in-text-files is the biggest piece of junk, since every piece of software uses its own format, stores it in a different directory, and you need a different parser for each config file. A global registry like in Windows is the right way to go (although the Windows registry editor is crap).

    Even GNOME realized this and now stores its config data in a registry-like directory structure.
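    With GConf you get one tool and one key hierarchy for all of that, roughly like this (the key path is just an example and depends on the GNOME version):

        # Query and set a desktop preference through the GConf key tree
        gconftool-2 --get /desktop/gnome/interface/gtk_theme
        gconftool-2 --type string --set /desktop/gnome/interface/gtk_theme "Clearlooks"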
  • Zimbra (Score:1, Interesting)

    by jlebrech ( 810586 ) on Tuesday August 08, 2006 @07:59AM (#15864886) Homepage
    At work we have a project based on Lotus Notes which has taken 12 months so far.

    It's nowhere near completion and can't beat Zimbra or the alternatives. It looks appalling and cost the earth.

    And we bought this because Lotus Domino was supposed to be EASY to develop for. Sure, with the Notes client on everyone's PC, but this is web development.

    So which top-100 website uses Domino for its backend, eh?

  • by db32 ( 862117 ) on Tuesday August 08, 2006 @08:01AM (#15864900) Journal
    You forgot to mention the Novell integration stuff...the ActiveX stuff...then there is the whole MS-Java VM vs Sun-Java VM... I assume you have never been to a trade show either...where every vendor is willing to sell you a different "Document Management" system for upwards of $10,000 that really is just a stupid crutch replacement instead of having admins actually MANAGE the file storage and keep users from saving crap all over the network where they don't need to be. I mean...these vendors can't even explain what the hell their products do half the time...I just wander over, ask a few basic technical questions and the market bimbos (I will never understand selling software with sex appeal) are filling my bag with promo junk to make me stop asking questions in front of other potential dupes...er, customers. Yeah...the closed source world makes SO much more sense and is SO less complex. But hey...as long as it's going strong I can go to trade shows and get bags full of free goodies. The best stuff always came from the vendor that could actually answer my tech questions...they were generally happy to have someone who could speak intelligently with them about their product and thus broke out the expensive promo stuff. :)
  • It's about the same (Score:3, Interesting)

    by yancey ( 136972 ) on Tuesday August 08, 2006 @08:22AM (#15864980)
    Having managed both Windows and Linux systems in an environment with 500-1000 machines, I can say that the workload ends up being about the same. If someone were to tell me that managing Linux is "too complex", I would respond by saying that you just haven't yet learned Linux, but perhaps have learned a specific distribution. In essence, Windows is a single distribution and learning only one is easy enough. However, once you understand the fundamental concepts of Linux (or any unix-like OS), adapting to a new distribution is relatively easy. There is a learning curve with Linux, but there is with Windows too. Just ask anyone who has switched from a Mac to Windows. If you're not willing to learn, then you're just lazy.
  • by peragrin ( 659227 ) on Tuesday August 08, 2006 @08:28AM (#15865006)
    Ah, but that's the good thing about Linux that you cannot say about MSFT.

    You as a vendor only support RHEL, but that means your customers can get their Linux from Red Hat, CentOS, White Hat, or any of the other firms that take the RHEL sources, remove the trademarks and redistribute the binaries.

    You're still not locked to any one vendor for support or services.
  • by Noryungi ( 70322 ) on Tuesday August 08, 2006 @08:28AM (#15865010) Homepage Journal
    Long answer:

    Is open source difficult? Yes, if you are just an average user. No, if you are a system-administrator type of user and you manage information systems for a living.

    If you are just an end-user, someone who uses a computer to do something else (creative work, accounting, marketing, sales, whatever) and you don't know much about computers, then yes, I guess Open Source is still too difficult for you... unless you have a sysadmin close at hand to (a) install your machine and (b) make sure it's updated regularly. Then, Open Source can be -- should be -- just as easy as (if not easier than) Microsoft products. Open Source GUIs, such as XFCE, KDE or GNOME, once installed and configured properly, are just as easy and friendly as Windows. Of course, the ultimate in user-friendliness is Mac OS X, but that's another story.

    Please note that the term "user" -- as used above -- is not negative at all in my mind: I can perfectly understand that your job has nothing to do with computers, and that you don't have the time, or the inclination, to learn more about them. And no, I don't think there is such a thing as a "Power User". Either you know enough to manage your own machines, or you don't. People who know just enough to be dangerous, but not enough to clean up the mess they have made, are users in my mind. Dangerous ones, but users nonetheless.

    On the other hand, when it comes to system administrators, Open Source wins hands down. Things like Apache, vsftpd, NFS, CUPS, perl/python/shell scripting and, especially, OpenSSH make my life (and the lives of countless other people) so much easier than their Microsoft counterparts. Plus, they are a lot cheaper than all the Microsoft products, they are more reliable, easier to manage, upgrade, patch and install. Seriously, consider the following examples to upgrade a machine or an application:
    1. Debian: sudo apt-get update && sudo apt-get upgrade
    2. Slackware: sudo upgradepkg ./*.tgz
    3. OpenBSD: sudo pkg_add -u -vvv -i
    4. Etc...


    Sure, to get to the stage where you can actually type these commands over OpenSSH and know what they do, you need to put in a lot of work. But the result is worth it. And, if you are a sysadmin worth his/her salary, you'll probably have a passion for learning that kind of thing. Once learned, these commands result in less downtime, less cost, more customer satisfaction and a more efficient company. All in all, Windows, with its lack of security, its Registry database, its rather ugly GUI and its general flakiness, is not good enough or "simple" enough when it comes to systems that must run 24/7 and support dozens, or even hundreds, of users.
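    And because it all runs over OpenSSH, pushing the same upgrade to a whole rack of machines is a one-liner. A minimal sketch, assuming Debian boxes with key-based SSH access and passwordless sudo already set up (the hostnames are placeholders):

        # Run the same upgrade on several hosts over SSH
        for host in web1 web2 db1; do
            ssh "$host" 'sudo apt-get update && sudo apt-get -y upgrade'
        done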

    Linux, on the other hand, may not be ready just yet for the desktop. But it will be one day. Which is probably why there is an un-ending stream of FUD coming out of Redmond these days...
  • Two words.... (Score:3, Interesting)

    by moosesocks ( 264553 ) on Tuesday August 08, 2006 @09:15AM (#15865289) Homepage

    For my sins I have used a lot of operating systems over the years, and they all have their pros and cons. The one thing that seems common across them is that the scarier they look, the less likely they are to break, because people don't mess with the difficult ones. Most failures are caused by human error (it's just that no one admits to it), and making server OSes look familiar tempts people to fiddle.


    SCO OpenServer.

    Scary, old, unsupported (difficult to find anyone willing to work with it), and extremely easy to break, even by end users.

    Then again, this is probably the result of bad karma for still running a serial-line network in 2006.

    To draw another analogy, Token Ring is also scary, and very easy to break. On the flip side, Mac OS X is much "simpler" to an end-user than Windows is (and to a certain extent, to the developer as well) -- it is also much more difficult to break. I've never seen an OS X installation trashed in such a way that it couldn't be fixed by creating a new user profile. Granted, this is due to OS X's UNIX underpinnings, but it's pretty undeniable that it's a simpler system for the user.
  • by anothy ( 83176 ) on Tuesday August 08, 2006 @09:17AM (#15865302) Homepage
    if this is what you learned from Software Engineering 101, you should go demand your money back for your entire education. you've learned all the wrong lessons.

    yes, most software is complex, but it doesn't have to be. the complexity generally comes from a few areas, like legacy support and poorly thought out design compromises. compare, for example, the Plan 9 kernel, which is ~180k lines of code for about a half dozen architectures, to linux, which is... well, an order of magnitude more than that, at least, even stripping out the vast driver support. it's also better structured and more readable. then compare other components: plan 9's ndb with Unix's whole host of files in /etc (how many files contain some combination of hostname, ether addr, IP addr, and so on?). and that's just low-level stuff. move up the stack towards the user and it gets more and more true. Apple's Safari is such a great experience for most people who use it because it's much simpler than most of the alternatives, say IE 7. the land-line telephone world retains many of its customers because mobile phones are more complex to use. software doesn't have to be complex, and folks like you who assume it does produce most of the complex code, because you've given up. and once you give up on trying, sure, it all looks like it has to be complex. it's a nice self-reinforcing fatalist outlook.
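    a quick illustration of the point about /etc, for what it's worth (just a sketch; what it turns up varies wildly from one system to the next):

        # how many files under /etc mention this machine's hostname?
        grep -rl "$(hostname)" /etc 2>/dev/null | wc -l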

    sure, sometimes complexity is unavoidable. but we should strive to make that the exception rather than the rule. and it can be, if we put the effort into it.
  • by archen ( 447353 ) on Tuesday August 08, 2006 @09:46AM (#15865496)
    If administrators have to resort to reading the source when something fails, there is a problem.

    I've often found this to be sort of a plus, in a roundabout way. It seems that OSS applications more commonly spit out specific errors about what is wrong into the logs. This often doesn't mean anything to you, but a search often finds someone who did have the same problem and poked through the code to figure it out. When you're the administrator of a critical system you need it fixed. If MS just gives you an "an error occurred" message, then when push comes to shove, you may very well wish you could just look at the code.

    Although I'm pretty far removed from C/C++ nowadays, I've done searches on error messages and come up with the actual code where the program generates them. Sometimes it's easy enough to tell what conditions are causing the problems without being a programming guru.
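    In practice it's usually nothing fancier than grepping the source tree for the string you saw in the log. A rough example, assuming a vsftpd tarball (the version number and error text are just placeholders for illustration):

        # Unpack the source and find where a logged error string is generated
        tar xzf vsftpd-2.0.5.tar.gz
        grep -rn "cannot change directory" vsftpd-2.0.5/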
  • by swillden ( 191260 ) * <shawn-ds@willden.org> on Tuesday August 08, 2006 @10:47AM (#15865995) Journal

    It does make it more difficult for a large company to develop for a Linux crowd in general.

    Somewhat, but it's not that bad. I wrote a commercial, closed-source Linux app that had to run on multiple distros and it's really not that much of an issue. The app I built was one of the more difficult ones to support across distros, too, since it had to integrate with (or replace) the login process, screensaver, etc. These are areas where distros do things very differently. Normal applications have many fewer issues.

    Some things I learned:

    • At core, all major, modern Linux distros look the same. There are small differences in the way files are organized and larger differences in how packages are installed and managed, but these differences don't require much effort to work around, even when you're changing the most distro-specific elements of the system (boot processes, login processes, etc.). Liberal use of configuration files and glue scripts is a very good idea, so that you can reconfigure for lots of different environments without changing the binaries (see the sketch after this list).
    • Cross-distro development is easy. Cross-distro packaging is fairly easy. The hard part is the cross-distro testing, not because it's particularly difficult, but because you have to do it on every distro you're going to support. It's well worth automating as much of the testing as possible. I recently began trying to use a test-driven development process, and I wish I had used it on that Linux project.
    • Although it's possible to create a cross-distro installer, your customers will be much happier if you provide native installers that integrate properly with their package management system.
    • You cannot expect to make a single RPM package for all RPM-using distros. The RPM SPEC files won't differ hugely from distro to distro, but they will differ if you want seamless integration.
    • For a corporate app, you really only need to target a small set of Linux distros to cover nearly all of your market. Red Hat, SUSE and Debian cover it. In the case of Red Hat and SUSE you need to support both the current release and the previous release. For Debian/Ubuntu, it's less clear. Ask some customers and take a guess (and see my next point).
    • You can treat the too-many-distros problem as an opportunity. In my case, I recommended that we do three things: First, create packages that cover 95% of the target market. Second, provide a tarball with all of the binaries and some instructions on how configuration files need to be set up so that very technical customers (common in the Linux market) can figure out how to integrate it into the distro of their choice -- with the caveat that you will only support them in resolving issues they can reproduce on a supported distro. Third, (here's the opportunity part), offer to make it work in whatever distro they like, for an integration fee. Charge them enough that you can do a thorough job, including writing all of the automated test code for that platform, all of the documentation needed by your customer service department, etc., so that you can just add it to the list of "supported" distros, and make a profit doing it.
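    To make the glue-script point concrete, here is a rough sketch of the kind of install-time dispatch I mean. The release-marker files are real on those distros, but the package and script names ("myapp", "rc.myapp.*") are made up for illustration:

        # Pick distro-specific pieces at install time instead of baking them into the binaries
        if [ -f /etc/debian_version ]; then
            DISTRO=debian
        elif [ -f /etc/redhat-release ]; then
            DISTRO=redhat
        elif [ -f /etc/SuSE-release ]; then
            DISTRO=suse
        else
            echo "Unsupported distro; see the manual integration notes" >&2
            exit 1
        fi
        # Install the matching init script and config fragment for this distro
        install -m 755 "glue/rc.myapp.$DISTRO" /etc/init.d/myapp
        install -m 644 "glue/myapp.$DISTRO.conf" /etc/myapp.conf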

    Really, I think the biggest difficulty with selling commercial Linux apps is the relatively weak demand. Although I don't like Windows, it's still quite dominant, and Windows apps are almost guaranteed a larger market. If, however, you can find a niche where there is significant demand for a commercial Linux product, the multiple-distros issue isn't going to significantly increase your development cost, will perhaps double the cost of developing your installation packages, and (assuming you make good use of automated testing) will probably increase your testing costs by 10% or so. Net, I'd say it costs <5% more to develop a significant application for multiple Linux distros rather than just one distro.

  • by mrchaotica ( 681592 ) * on Tuesday August 08, 2006 @10:48AM (#15866005)
    every vendor is willing to sell you a different "Document Management" system for upwards of $10,000 that really is just a stupid crutch replacement instead of having admins actually MANAGE the file storage and keep users from saving crap all over the network where they don't need to be

    No kidding -- the company I work for is implementing SharePoint for that very reason, completely screwing over those of us who use anything other than Explorer to interact with the shared directories.

  • by Anonymous Coward on Tuesday August 08, 2006 @11:19AM (#15866301)
    That has to be one of the only reasons a good graphical installer for Linux doesn't exist today. I'm even disappointed in Ubuntu in that light - they're the closest in my mind to a full desktop solution.

    Do you smoke crack or something? Did you not notice the ugly blue/white text-only screens that you MUST go through to install Windows NT/2K/XP/2003?

    No, I would say Linux is WAY beyond Windows in capability during installation. Not only do most Linux distros fire up X to provide a graphical installer, but you have A LOT more control over the installer! With the Windows installer your only option for passing arguments along to change installation options is to use the damn function keys. For example, pressing F6 to load different SCSI drivers before the Windows installer goes through the worst possible method of trying to auto-detect controller cards: throw every freakin' driver into memory and just see what sticks. The problem with this is that the probing some drivers do can lock up some SCSI controllers before the driver for that controller loads, so when the correct driver finally gets loaded the card is in an invalid state and the driver cannot find it. I have had to boot off NT/2K install floppies MANY TIMES because of this! With a Linux installer you can pass arguments along right away at the boot prompt. And Linux installers PROBE the system to see what's out there BEFORE loading the device drivers, instead of the half-assed method MS uses. And even after a Linux GUI-based installer routine starts, you can still access a basic shell in most cases to see what's going on in the background and hack things that are acting funky. This is light years ahead of Microsoft.
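    For example, at a typical ISOLINUX-style boot prompt you can hand options straight to the kernel before anything is probed (the parameters below are only examples; what you actually need depends on the distro and hardware):

        # Typed at the installer's "boot:" prompt; options go straight to the kernel
        boot: linux acpi=off noapic console=ttyS0,9600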

    Of course you have to have a clue to install Linux, which is why all these MCSE dorks are whining and complaining. They want a cookie-cutter approach to everything in IT. Sorry, but the world doesn't work that way! I think these people are just upset because they feel the end coming and know their MCSE cert isn't going to mean dick in the new IT world...
  • by mrchaotica ( 681592 ) * on Tuesday August 08, 2006 @11:52AM (#15866721)
    'IE in a Tab' extension

    I discovered that just a few hours ago, actually! (Or rather, I knew about it before but didn't pay attention because I don't use Windows at home.)

    And yes, Sharepoint et al. are really just a way of getting everyone to organize better.

    The trouble is it screws up those of us who knew what we were doing to begin with. For example, I use a bash shell (via Cygwin) half the time to access the shared files, and I can't do that with SharePoint. Another developer has a backup shell script, which won't work with Sharepoint. Etc.
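    That kind of script is nothing exotic, either -- just a sketch of the sort of thing that works against a plain file share but not against SharePoint (the paths are placeholders; assumes the share is mounted as a drive letter and accessed through Cygwin):

        # Mirror a mounted network share to a local backup directory
        rsync -a --delete /cygdrive/s/projects/ /cygdrive/d/backup/projects/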

    If you think about it, every problem SharePoint "solves" already had a solution before, and in most cases the solution was simply a filesystem. SharePoint is just a crutch for people who don't understand how to use the tools they already have (or, from another perspective, a workaround for Windows' crap UI).

  • Re:Two words.... (Score:2, Interesting)

    by Kuxman ( 876286 ) <the_kux@yahoo.com> on Tuesday August 08, 2006 @01:17PM (#15867707) Homepage
    I (like you) also have my file system memorized. I know where everything is located, and am a sorting nazi, so "cd /home/user/docs/dir/stuff" is the way I navigate. But look at most people's filing cabinets, and it's a mess. Look at their computer... also a mess. For me, if I'm in unknown territory, it's quicker to click through a bunch of folders rather than cd folder... cd another_folder... etc.
  • Re:Two words.... (Score:3, Interesting)

    by styrotech ( 136124 ) on Tuesday August 08, 2006 @07:44PM (#15870668)

    One of the maxims of business is, has been, and always will be that the customer is always right. Why can't OSS defenders see this?


    They can - it's just that with most open source projects the end user isn't a customer. For the most part, end users are just freeloaders who don't give anything back to the project. For an open source project, a few good contributors (e.g. developers, testers, writers, artists, donors, etc.) are worth far more than hundreds of end users who don't contribute. Contributors are effectively the 'customers' of an open source project, not the end users. The project is the contributors.
