
Dual-core Systems Necessary for Business Users?

Lam1969 writes "Hygeia CIO Rod Hamilton doubts that most business users really need dual-core processors: 'Though we are getting a couple to try out, the need to acquire this new technology for legitimate business purposes is grey at best. The lower power consumption, which improves battery life, is persuasive for regular travelers, but for the average user there seems no need to make the change. In fact, with the steady increase in browser-based applications it might even be possible to argue that prevailing technology is excessive.' Alex Scoble disagrees: 'Multi-core systems are a boon for anyone who runs multiple processes simultaneously and/or has a lot of services, background processes and other apps running at once. Are they worth it at $1000? No, but when you have a choice to get a single-core CPU at $250 or a slightly slower multi-core CPU for the same price, you are better off getting the multi-core system, and that's where we are in the marketplace right now.' An old-timer chimes in: 'I can still remember arguing with a salesperson that the standard 20 MB hard drive offered plenty of capacity and the 40 MB option was only for people too lazy to clean up their systems now and then. The feeling of smug satisfaction lasted perhaps a week.'"
  • by The Crazed Dingus ( 961915 ) on Thursday March 23, 2006 @09:38PM (#14985020)
    It's true. I recently took a look at my own system's running processes, and while it only shows four or five icons in the system tray, I found that I have almost 50 background apps running and, to boot, almost 1,500 process modules loaded. This is way up from a year ago, and it points to something coming to the forefront: multi-core processors are going to become the norm.
  • by tlambert ( 566799 ) on Thursday March 23, 2006 @09:48PM (#14985073)
    "...getting a couple [for the executives]..."

    I can't tell you how many times I've seen engineers puttering along on inadequate hardware because the executives had the shiny, fast new boxes that did nothing more on a daily basis than run Outlook.

    Just as McKusick's Law applies to storage ("The steady state of disks is full"), there's another law that applies to CPU cycles: "There are always fewer CPU cycles than you need for what you are trying to do."

    Consider that almost all of the office/utility software you are going to be running in a couple of years is being written by engineers in Redmond on monster machines with massive amounts of RAM and 10,000 RPM disks, so that they can iteratively compile their code quickly. You can bet your last penny that the resulting code will run sluggishly at best on the middle-tier hardware of today.

    I've often argued that engineers should get fast, central build servers, but should run on hardware a generation behind, to force them to write code that will work adequately on the machines the customers will have.

    Yeah, I'm an engineer, and that applies to me, too... I've even put my money where my mouth was on projects I've worked on, and they've been the better for it.

    -- Terry
  • Re:Not really (Score:4, Interesting)

    by shawnce ( 146129 ) on Thursday March 23, 2006 @10:04PM (#14985157) Homepage
    In short: dual core, like most parallelized technologies, doesn't do nearly as much as you think it does, and won't until our compilers and schedulers get much better than they are now.

    Yeah, just like color correction of images, etc., done by ColorSync (done by default in Quartz) on Mac OS X doesn't split the task into N-1 threads (when N > 1, N being the number of cores). On my quad-core system, color-correcting the images I display takes less than 1/3 the time it does when I disable all but one of the cores. Similar things happen in Core Image, Core Audio, Core Video, etc. ...and much of this is vectorized code to begin with (i.e., already darn fast for what it does).

    If you use Apple's Shark tool to do a system trace you can see this stuff taking place and the advantages it brings... especially since, as a developer, I didn't have to do a thing other than use the provided frameworks to reap the benefits.

    Don't discount how helpful multiple cores can be right now, with current operating systems, compilers, schedulers and applications. A lot of the tasks folks do today (encoding/decoding audio, video and images, encryption, compression, etc.) amount to stream processing, and that can often benefit from splitting the load into multiple threads when multiple cores (physical or otherwise) are available.
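
    To make the stream-processing point concrete, here is a minimal sketch of the split-per-core pattern (a hypothetical per-byte "correction" standing in for the real work, not ColorSync's actual code):

    ```python
    # Minimal sketch: split a pixel buffer into one chunk per core and
    # process the chunks in parallel worker processes.
    import os
    from concurrent.futures import ProcessPoolExecutor

    def correct_chunk(chunk: bytes) -> bytes:
        # Toy stand-in for color correction: scale and clamp each byte.
        return bytes(min(255, int(b * 1.1)) for b in chunk)

    def correct_image(pixels: bytes) -> bytes:
        cores = os.cpu_count() or 1
        size = -(-len(pixels) // cores)  # ceiling division
        chunks = [pixels[i:i + size] for i in range(0, len(pixels), size)]
        with ProcessPoolExecutor(max_workers=cores) as pool:
            return b"".join(pool.map(correct_chunk, chunks))

    if __name__ == "__main__":
        image = bytes(range(256)) * 4096  # ~1 MB of fake pixel data
        print(len(correct_image(image)), "bytes processed on",
              os.cpu_count(), "cores")
    ```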
  • Obligatory Quotes: (Score:3, Interesting)

    by absurdist ( 758409 ) on Thursday March 23, 2006 @10:07PM (#14985173)
    "640 KB should be enough for anybody."
    -Bill Gates, Microsoft

    "There is no reason why anyone would want a computer in their home."
    -Ken Olsen, DEC
  • by ikarys ( 865465 ) on Thursday March 23, 2006 @10:07PM (#14985177)
    I will benefit from multi-core.

    I'm perhaps not a typical business user, but what business wants is more concurrent apps and more stability. Less hindrance from the computer, and more businessing :)

    Currently, I have a hyperthreaded processor at both home and work. This has made my machine immune to some browser memory-leak vulnerabilities, since only one of the logical CPUs gets pegged, at 50% of total capacity. (Remember the recent exploit that opened Windows Calc through IE? I could only replicate it on single-core chips.)

    Of course, hyperthreading is apparently all "marketing guff", but the basic principles are the same.

    I've found that system lockups are less frequent, and a single application hogging a "thread" doesn't impact my multitasking as much. I quite often have 30-odd windows open: perhaps 4 Word docs, Outlook, several IE and Firefox windows, perhaps an Opera or a few VNC sessions, and several Visual Studio instances.

    On my old single-threaded CPU this would cause all sorts of havoc, and I would have to terminate processes through Task Manager and pray that my system would be usable without a reboot. This is much less frequent with HT.
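
    A rough way to see the effect (a toy busy-loop standing in for the leaking browser, not the actual exploit): peg one core with a runaway child process and note that foreground work still finishes promptly on another core.

    ```python
    # Toy demo: a runaway child pegs one core while the parent keeps
    # doing useful work on another (assumes at least 2 cores/threads).
    import multiprocessing as mp
    import time

    def runaway():
        while True:  # burns 100% of whichever core schedules it
            pass

    if __name__ == "__main__":
        hog = mp.Process(target=runaway, daemon=True)
        hog.start()
        start = time.time()
        total = sum(i * i for i in range(10_000_000))  # the "real" work
        print(f"foreground work done in {time.time() - start:.2f}s "
              "despite the busy-looping child")
        hog.terminate()
    ```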

    With multi-core, I can foresee the benefits of HT plus the added benefit of having two real cores as opposed to two pseudo-cores.

    For games, optimised code should be able to actively run over both cores. This may not be so good for multitasking, but it should mean that slowdown in games is reduced, as different CPU-intensive tasks can be split across the cores and not interfere with each other.

    (I reserve the right to be talking out of my ass... I'm really tired)
  • Obviously, few (if any) business users need anything more than a Pentium III running at 500 MHz. That processor is perfectly acceptable for business applications like OpenOffice.

    Obviously you haven't gone and looked at what gets installed on many business PCs. My employer's standard systems are 2.4 GHz P4s on the desktop and 1.7 GHz P4-Ms on the laptops, with 512 MB to 1 GB of RAM depending on system usage.

    Everyone complains that they're slow. Why? Let's see:

    Software distribution systems that check through 5,000 packages every 2-4 hours for critical updates.

    Office add-ins, chiefly our document management system. Ripping all the add-ins out of Word cuts its startup time by roughly a factor of 5 for me. However, I'm pretty rare in not needing any of those add-ins.

    For a stripped-down system running just the office apps with no add-ins and some basic virus-scanning software, yes, old hardware does fine. But a business desktop does so much more than that, because a business needs to do more things to manage its computers. And the home user, who doesn't have all that, uses the processor power for games and for goofing around with pictures and home videos.

    And never underestimate how many business users don't just sit there making office documents. Large numbers do development, visualization, number crunching, and other compute-intensive tasks on a regular, if not continuous, basis.

    And what makes people think that these fancy web applications need less horsepower? JavaScript + DHTML is going to be less efficient for UI work than native code, thus requiring a faster processor.
  • by Rebelgecko ( 893016 ) on Thursday March 23, 2006 @11:29PM (#14985538)
    Maybe this is a dumb question, but how can you use over 100% of your CPU?
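
    The usual answer: "percent CPU" is measured per core and then summed by tools like Unix top, so on a dual-core box a busy multi-threaded process can report up to 200%. A minimal illustration of the convention (assumes the third-party psutil package is installed):

    ```python
    # Each core can hit 100%; some tools report the SUM across cores,
    # which is how a figure over 100% arises. Requires: pip install psutil
    import psutil

    per_core = psutil.cpu_percent(interval=1, percpu=True)
    print("per-core load:", per_core)
    print(f"summed 'top-style' figure: {sum(per_core):.0f}%")
    ```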
  • by Anonymous Coward on Thursday March 23, 2006 @11:29PM (#14985542)
    You get hot chicks with a Ferrari, and you repel hot chicks with a super-fast, top-of-the-line dual-core system.
  • by TheCarp ( 96830 ) * <sjc@NospAM.carpanet.net> on Thursday March 23, 2006 @11:41PM (#14985582) Homepage
    OK... I just had the dog-and-pony show from Intel themselves a few days ago at work. Nothing special, nothing you can't find online... just the standard demo.

    You are right... however, that's always the way of it.

    Build it and they will come. Once the technology exists, somebody is gonna do something way fuckin' cool with it, or find some great new use for it, and it's gonna get used.

    Research computing and number crunching will eat this stuff up first; then, as it becomes cheaper, it will make the desktop.

    Think of it for servers. Sure, you don't NEED it... but what about power? Rack space? You have to take those into account.

    Sure, you don't need more than a 500 MHz Pentium to serve your website. However, if you have a few of these puppies, you can serve the website off a virtual Linux box under VMware. Then you can have a database server and a whole bunch of other virtual machines, all logically separate... but on one piece of hardware.

    Far less power and rack space are consumed running 10 virtual machines on one multi-core, multi-"socket" (as the Intel rep likes to call it) box. Believe it or not, these issues are killers for some companies. Do you know what it costs to set up a new data center?

    Any idea what it costs to keep upgrading massive UPS units and power distribution units? These are big projects that end up requiring shutdowns and all manner of work before the big, expensive equipment can even be used.

    Never mind air conditioning. If you believe the numbers Intel is putting out, this technology could be huge for datacenters worried that their cooling units won't be adequate in a few years.
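
    Some back-of-the-envelope consolidation math (all wattage and price figures below are assumptions, not Intel's numbers):

    ```python
    # Hypothetical: 10 aging 1U servers vs. one dual-socket multi-core
    # host running them as virtual machines.
    old_servers = 10
    watts_each = 250        # assumed draw per legacy 1U box
    vm_host_watts = 500     # assumed draw for one loaded VM host
    kwh_price = 0.10        # assumed $/kWh, ignoring cooling overhead

    old_kw = old_servers * watts_each / 1000
    new_kw = vm_host_watts / 1000
    saved = (old_kw - new_kw) * 24 * 365 * kwh_price
    print(f"{old_kw:.1f} kW -> {new_kw:.1f} kW, "
          f"~${saved:,.0f}/year on electricity, 9 rack units freed")
    ```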

    Seriously, when you look at new tech, you need to realise where it will get used first, and who really does need it. I already know that some of this technology (not at the hardware level, but VMware virtual servers) is making my job easier.

    Intel has been working with companies like VMware to make sure this stuff really works as it should. Why wouldn't they? It's in their interest that this stuff works; otherwise it will never get past system developers and be implemented. (OK, that's a bit of a rosy outlook. In truth it would get deployed, fail, be a real pain, and Intel would make money anyway... but it wouldn't really be good for them.)

    The numbers looked impressive to me when I saw them. I am sure we will be using this stuff.

    -Steve
  • by Space cowboy ( 13680 ) * on Friday March 24, 2006 @01:19AM (#14985945) Journal
    It's very simple. Every time someone comes up with "most apps are useless at multi-processing," it's always a Windows app. Most Apple apps are already multi-threaded, for the reasons I state.

    It seems you can't point out a technical achievement (on either side of the fence) without some 'fanboy' accusation being levelled. [sigh]

    Simon.
  • by Perdo ( 151843 ) on Friday March 24, 2006 @01:52AM (#14986048) Homepage Journal
    A couple of seconds here and there; let's say 2 seconds in every sixty.

    Now cut that to one second in sixty with a faster machine, ignoring multiple cores for now.

    Gain a day of work for every sixty.

    Six days of work a year.

    A week of extra work accomplished each year with a machine twice as fast.

    You are paying the guy two grand a week to do AutoCAD, right?

    That two-year-old machine (machine performance doubles every two years, remember) just cost you two grand to keep, when a new one would have cost a grand.

    The real problem is, we are not yet at the point where you wait for your computer only 1 second in 60. It's more like 10 seconds in 60, which costs you $10,000 a year in lost productivity, and $20,000 if the machine is four years old.
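
    Spelling out that back-of-the-envelope math (the $2,000/week salary is from above; the 50 working weeks per year is an added assumption, so exact totals vary with the assumed work year):

    ```python
    # Lost productivity if the machine keeps a $2,000/week worker
    # waiting N seconds out of every 60.
    weekly_pay = 2000   # from the post
    work_weeks = 50     # assumed working weeks per year

    def yearly_loss(wait_seconds_per_minute: float) -> float:
        return weekly_pay * work_weeks * wait_seconds_per_minute / 60

    for wait in (1, 2, 10):
        print(f"{wait:>2} s/min waiting -> ${yearly_loss(wait):,.0f}/year lost")
    ```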

    That's why the IRS allows you to depreciate computer capital at 30% a year: because not only is your aging computer capital worth nothing, it's actually costing you money in lost productivity.

    Capital. Capitalist. Making money not because of what you do, but because of what you own. Owning capital that has depreciated to zero value, and that costs you expensive labor to keep, means that you are not a capitalist.

    You are a junk collector.

    Sanford and Son.

    Where's my Ripple? I think this is the big one.

    Dual core? That's just the way performance is scaling now.

    The best and brightest at AMD and Intel cannot make the individual cores any more complex and still debug them. No one is smart enough to figure out the tough issues involved in 200 million core-logic transistors, so we are stuck in the 120 to 150 million range for individual cores.

    Transistor count doubles every two years.

    Cores will double every 2 years.

    The winning curve will be to pack as many of the most complex debuggable cores as possible into the CPU architecture.

    Cell has lots of cores, but they are not complex enough; too much complex work is offloaded to the programmer.

    Dual, quad, and so on, at 150 million transistors per core, will rule the performance curve: they keep software development as easy as possible by retaining exceptionally high single-thread performance while still taking advantage of transistor-count scaling.

    Oh, and the clock speed/heat explanation for dual cores is a myth. It's all about complexity now.
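
    A toy projection of that scaling argument (the 2006 dual-core starting point is an assumption):

    ```python
    # If the total transistor budget doubles every two years while
    # per-core complexity stays capped (~150M transistors), then core
    # count doubles on the same schedule. Baseline assumed: 2 cores, 2006.
    BASE_YEAR, BASE_CORES = 2006, 2

    def projected_cores(year: int) -> int:
        return BASE_CORES * 2 ** ((year - BASE_YEAR) // 2)

    for year in range(2006, 2017, 2):
        print(year, projected_cores(year), "cores")
    ```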
  • by Jeremi ( 14640 ) on Friday March 24, 2006 @02:04AM (#14986076) Homepage
    With dual processors, Norton AV is grabbing more than 100% CPU at times, even with their own throttling settings enabled.


    I think Norton AV (and other products like it) are a hopelessly flawed way to try to provide "security". No matter how many CPU cycles they burn trying to detect viruses, there will always be a new virus with a new level of obfuscation that will slip past them. Therefore, dual core CPUs won't be sufficient for this task, because any number of CPUs would not be sufficient -- an unbounded number of CPU cycles (or a solution to the Halting Problem) would be required to do it. The only way to provide reliable security is to carefully design the OS to be secure, so that it doesn't matter if a virus runs -- because the virus won't be able to do any harm anyway.


    That said, I think dual-core CPUs are great, because it means (or will soon mean) that I can get a 2-processor machine for the price of a 1-processor machine, or a 4-processor machine for the price of a 2-processor one. Eventually we'll have things like 64 processors on a chip, which will be great fun to play with... my own Connection Machine, for cheap! :^)

    Need RAM? Try Kahlon.com [kahlon.com]. They have virtually everything you could think of. I don't work there; I'm just a very satisfied customer. (They were even very cooperative when I tried to pay them from Europe, which is where I live.)

    Obviously, few (if any) business users need anything more than a Pentium III running at 500 MHz. That processor is perfectly acceptable for business applications like OpenOffice.

    As for this comment: I know everybody is going to say that it isn't true. It is. I am writing this right now on a P-III 600 MHz laptop with 512 MB of RAM, and OpenOffice works just fine. It takes a bit to load, but once it's running, it runs fine. Of course, I know what runs on my computer, and right now only 30 processes are running. Far from typical in the Windows world, where everyone and his dog runs multiple spyware proggies ;-)
