Comment The simplest explanation... (Score 3, Interesting) 449

...is that they were horribly overpriced. I wanted a Windows tablet when they first came out, right up until I found them priced at $2000 and up. What the hell? You could get two nice laptops for that.

Even today they run about twice what they should. Apple waltzes in with a tablet half the cost of a Windows tablet, and it actually works well with its touch interface ... It is not at all hard to see why people liked it.

Comment Re:What's the deal with the rush of TSA stories re (Score 1) 1135

This is not true. Scramble time alone is around 40 minutes near Washington; that's why they had those "stay in your seats" periods, and that's pretty much the best case. And you can be very sure that they aren't going to shoot that plane down right away; they'll give it every chance, because a mistake would be very, very bad.

But it is moot. The ability to take an airliner 9/11-style didn't even last out the day of 9/11. Once passengers got the idea that the best thing to do was take down the terrorists, they did so on their own. All of the terrorist attack attempts on planes since then have been defeated by passengers, not the TSA or air marshals.

We are not going to see another 9/11. We are almost certainly going to see another Lockerbie though.

Comment Re:Steve Jobs has clout (Score 1) 681

You are making a couple of presumptions: first, that you're going to be able to "fix" the user; and second, that there is no suitable replacement tool that avoids the trouble.

The first is certainly not always true. Some people are difficult or impossible to retrain, yet in today's world they lose a lot if they can't use a computer. These people really want an appliance.

And that brings us to the second point. Windows is not the only viable choice in computing! That is *especially* the case for consumers, but it is becoming more and more the case in business too as business apps move to the web.

Back to the case of the Mac for my problem user: There is not much difference from the user perspective these days between a Mac and Windows box. They look almost the same, they work almost the same, there is plenty of software to do whatever you want to do as long as you aren't a hardcore gamer (and let's face it, the people with these problems are rarely if ever hardcore gamers).

That being the case, perhaps there are times when it's better to look at a different tool than to keep blaming the user, especially when blaming the user doesn't actually make the problem go away *and* better tools are readily available.

I note that it's not just blatantly stupid users who have problems with Windows. Malware infections are *endemic* on Windows. *Most* consumers get a malware infection within a year of getting a new PC, and most are completely incapable of removing it on their own, even with commonly used (and recommended) commercial software. Nor is the problem specific to consumers; businesses have fast re-imaging software because they *need* it. That is the elephant in the room when it comes to Windows: nobody likes to talk about how easily it gets screwed up, and from a consumer's point of view nobody likes to talk about how hard it is to fix problems once they crop up.

Consumers tend to deal with it by buying new PCs much more often than they really need to. "It got really slow" and "it does weird things" translate into "PC is broken," and since fixing the PC -- having a Geek Squad type person come and clean it up or reinstall -- can often cost nearly as much as buying a whole new one, they buy new ones. It's like replacing your car when the maintenance gets too expensive.

This is the cost of using Windows. Clearly business finds it an acceptable cost, but that cost is much higher for a consumer. For a long time the consumer really didn't have a whole lot of choice, especially at reasonable price points.

If we presume that this is happening with consumers, then a device that does not get messed up in this way, even if it costs more, may be a better solution. That is what we've got when we talk about Macs. They are more expensive (much more expensive at the low end) but they break much less often, and when they do break it is usually not difficult to fix them. The end result is much longer hardware life. My experience is that the lifetime is double or more. If the cost is less than double that of the Windows PC, and it is, then it's a win for the consumer financially. It's a win anyway because of the reduction in hassle, but there you have it.

I don't think the Mac is going to be a particularly good value proposition much longer, though, if it indeed is the best value even today. Like I said before, most want an appliance. That is what they're getting with an iPhone or Android phone today, although their limited screen size makes them relatively poor internet access devices. We see that kind of appliance scaling up though: The iPad is a terrific web access device, and Android tablets ought to be as well, and GoogleTV and its ilk certainly could work as well. Pricing on these things is already competitive with the least expensive PCs, and ought to be significantly better as volumes rise, simply because they don't need anything like the kind of hardware you need to run Windows effectively.

The iPhone gave the iPad a strong applications base right from the start, so it isn't hobbled there either, and it wasn't hard for many application writers to scale their phone software up to use a larger screen effectively. Android tablets and GoogleTV ought to benefit from the same effect as they hit the market over the next year.

As this happens consumers are going to be faced with new choices. Do they want an expensive PC to replace the nth PC they've had that stopped working? Or do they want to try something like the phone they have that never broke, and that is less expensive too? I bet more and more pick the latter over the next five years, and that the computing appliance becomes dominant not long after.

So who wins? Jim's Law: "The cheapest thing that gets the job done wins." I think that's Android and its ilk, although there is the possibility Apple will win because of its huge head start and economies of scale. I consider that fairly unlikely because we'll see such a variety of Android stuff that it'll fill all kinds of niches that Apple doesn't want (in particular, cheap-ass hardware that doesn't really work very well but sells simply because it's cheap). I think Apple ends up in roughly the same niche here as did the Mac previously, although probably with upwards of 25% market share.

Comment Re:Steve Jobs has clout (Score 1) 681

Obviously YMMV. I have had my share of weird problems with Macs, although none took more than 45 minutes to solve with the help of Google. Regarding hardware, there was a period around 2005 where their initial build quality left something to be desired: every system I bought in 2005 had a warranty claim for some hardware issue. Systems before 2005 and after have been very high quality. In toto, though, it's been much, much easier to keep them running ... and not one single full rebuild in the nine years since I started using OSX, aside from a total hard drive failure.

I bought my first OSX laptop in 2001 to replace my wife's Windows laptop. I had been forced to rebuild that Windows laptop every 3 months like clockwork. (This was Win98; XP hadn't hit the scene yet, although I'd been using NT for years on the desktop.) It drove me insane because rebuilds took 10 hours apiece between the OS reinstall and all the applications. (Reasonably priced imaging software was not yet available, nor back-up software for that matter.) We got the Mac (a Ti Powerbook) and I did almost nothing to it for its entire 5-year lifespan at home, and nothing at all for the 2 years after that, before the hinges broke from heavy use and destroyed the screen connection ribbon. Seven years out of that laptop, and I spent less than *one hour* keeping it running. That is one heck of an improvement.

I thought XP would make things better, but it didn't. The registry was (and is) still a huge disaster, but luckily (or not) most XP boxes are so hugely malware-infected within a year (sometimes within weeks) that you have to wipe and rebuild them. (Eradication is nigh impossible these days, and certainly much slower than a rebuild even when it works.) I don't own Acronis True Image because I felt like paying a bunch of money[1]; I own it as a purely defensive measure: the Windows systems get imaged at every major installation point so at least I can return them to a near-current configuration within about half an hour.

Malware infections happen despite antivirus software. In fact, I find they're worse when using something mainstream like Norton versus something more oddball like AVG ... and most people use mainstream products.

Then there are the users. I had one who would randomly delete things. Like drivers. Her system would just stop working in weird and inscrutable ways, and of course she had no idea what she did. I finally gave up and forced her onto a Mac. I have had to deal with fewer than one issue per *year* since. That is another big improvement, and I think it comes down to the nice separation between system and user permissions; she cannot delete system things willy-nilly.

This is of course possible on Windows systems too (in fact, I gave a talk on how to configure your NT system's security back at WinDev in 1996), but unfortunately a wide variety of applications simply stop working if you are not running as administrator, and people totally hate it if you lock the systems down so they can't install things. (That is true on Mac and Windows, although the Mac's security system is vastly less intrusive than UAC despite accomplishing the same thing.) The state of things on Windows has improved a lot since Vista (at least consumer games don't need admin rights just to run anymore), but I still run into it regularly with poorly written or legacy applications. It makes it quite difficult to convince users to run on securely configured systems.

I thought Vista would be a big improvement versus XP and pushed people to upgrade. I was mistaken. Everyone turns off UAC, the only significant improvement in the whole system, because it's just so intrusive. The first year to year and a half of Vista were disastrous due to immature and missing drivers too. But hey, most Windows users skipped Vista and went straight to Win7 so they missed that pain.

Win7 did not improve the malware situation over Vista, UAC or not. Both, according to the statistics, are vastly better than XP ... but it's very hard to tell based on the systems I get to disinfect. I think the quantity of successful malware infections is down, but where infected XP systems would come in infested with four or five things, Vista and Win7 systems usually have just one or (rarely) two. Unfortunately those one or two are often the meanest and hardest to clean. So, back to wipe and reinstall. Fun fun fun! And when they're not systems I administer frequently I don't have an image, and get to do the oh-so-lovely three-hour Win7 install-and-update rigamarole, then a few more hours reinstalling all the apps. I *hate* that.

YMMV. My personal XP, Vista, and Win7 systems have not had any stability or infection problems to speak of, modulo bad drivers early in Vista, and I spend little to no time managing them. I am clearly unusual, though, because I see plenty of damaged systems come through and spend stupid amounts of time fixing them. Macs, not so much. Every user who converted from Windows to Mac dropped off the radar in terms of administration overhead. The problem users no longer have systems that need major repair; the depth of the repairs fell way off, and now it's really easy things like putting the URL bar back onto Safari, or making the Dock stay put.

Oh yeah, no crapware to deal with on Macs either. And no need to buy and continually update antivirus and anti-malware software (at least not yet).

Again YMMV, but I deal with Windows, Linux, and MacOS every day and of the three MacOS is the easiest to keep running, with Linux following a little behind. Windows takes a lot of time, with more problems and much more time-consuming solutions. And it did not get much better with Vista and Win7.[2]

Regarding hardware quality, I agree that Acer does a much better job than Dell or Compaq. (For the last four years or so I have preferred Acer systems.) In the field they're not very common though, and in the final analysis the laptops are not anywhere near as durable as either a Thinkpad or Macbook[3]. All that plastic does not hold up that well. I realize it probably doesn't matter to you; I bet you replace such hardware no less often than every 3 years, and probably closer to every 2, just like I do with my Windows boxes. (Actually that's not quite true: the desktops get recycled into Linux servers and tend to pull another 3 to 4 years in that role ... I love Linux. But I have to repurchase hardware for Windows regularly because it continually gets bigger and slower, to a degree where it is difficult or impossible to expand the system to keep up. Hopefully the hardware is getting sufficiently expandable now that the cycle will get broken.) But the Macs ... 5 years is a minimum, and then they go off as hand-me-downs to someone else for years after. Newer versions of MacOS since 10.0 have worked better on the same hardware than the prior version: leaner, meaner, more efficient despite greater capabilities.[4] All told that is really good bang-for-the-buck in my book, particularly when combined with much lower administration costs.

jim frost
jimf@frostbytes.com

[1] As an aside, Time Machine does a great job of this out-of-the-box. The one time I had a full hardware failure (a Mac mini's drive failed) the reinstall from backup was remarkably straightforward without any external tools. And it took less than half an hour. I really wish Microsoft would do a better job with in-the-box backup ... granted it's much better today than it was with XP, but it is still pretty lousy particularly on restore. Did I mention how much I liked Acronis?

[2] Some of the problems got much weirder, although not ultimately inscrutable. For instance, I had a user with a legacy app that would save a file and then try to load it in another app ... and it was not there. Go back to the first app, there it is. This was because of the virtualization of "Program Files"; the app's default was to save into a subdirectory of its install directory, and Vista virtualizes those directories to improve stability. Run a different app and it has a different virtualized directory, so it can't see the file. What made it worse is that Search couldn't find the file in the virtualized directories -- they're specifically excluded (at least in Vista; I have not checked Win7) -- and of course the whole directory structure is hidden from the desktop. The user had no chance of finding that file and was completely perplexed. It took me almost an hour to figure out what was going on too, until I saw the virtualization directory in the CMD shell and it jogged my memory about directory virtualization. I think the virtualization is too simplistic; it needs to share data files between apps per-user, but isolate DLLs and the like per-app. And no matter what form virtualization takes, Search needs to be able to find stuff in those directories!
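
For anyone chasing the same symptom, a minimal diagnostic sketch of where to look. It assumes Vista/Win7's standard redirect location for virtualized writes (%LOCALAPPDATA%\VirtualStore); the app and file names are hypothetical, and this is just an illustration of the path mapping, not an official API for querying virtualization.

    import java.io.File;

    public class VirtualStoreLocator {
        public static void main(String[] args) {
            // Path a legacy app *thinks* it saved to (hypothetical example).
            String intended = "C:\\Program Files\\LegacyApp\\data\\report.dat";

            // Assumption: default VirtualStore layout under %LOCALAPPDATA%.
            String localAppData = System.getenv("LOCALAPPDATA");
            String virtualized = localAppData + "\\VirtualStore\\"
                    + intended.substring("C:\\".length());

            System.out.println("App saved to:       " + intended);
            System.out.println("Probably stored in: " + virtualized);
            System.out.println("Exists there? " + new File(virtualized).exists());
        }
    }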

[3] The Achilles heel of the Powerbooks was the power cord; when stressed, it was not too difficult to break things inside the laptop. Unlike the Dells and Compaqs this was a daughterboard item on the Powerbooks, and pretty easy and inexpensive to replace, so the laptop wasn't a loss if it happened. The move to the magnetic connectors has totally eliminated that particular problem. I rather wish Apple would license that to other manufacturers. What kills Mac laptops over the long haul today is monitor backlight failure. It's too early to know if the LED backlight systems common in Macs today will have similar issues to fluorescent backlights, although I suspect not.

[4] Win7 was the first new version of Windows in two decades that didn't require double or more the resources of the previous version (OK, ignoring WinME I suppose), and in the case of XP SP2 even a service pack doubled the RAM requirements. Win7 didn't get any better than Vista in my experience, despite claims otherwise, but thank god it wasn't any worse. The Vista upgrade required wholesale machine upgrades, very expensive....

Comment Re:Steve Jobs has clout (Score 1, Offtopic) 681

Like a bunch of others I use Firefox too, and recommend it, on MacOS X. Safari is fine these days, but for a long time I got more reliable results with Firefox and it's nice to have the same software everywhere.

The Apple tax bit is a little disingenuous. The mini is indeed expensive (but so very small and quiet and there is value in that) but above that the machines end up being pretty well price-competitive with similar hardware.

I hear "I can get a way better Dell laptop for $600" compared to a Macbook, but it isn't true. The display is crap, the build quality is worse than crap. A comparable laptop is a Thinkpad ... And the prices are damn near identical.

Last I checked that was true of all-in-ones too (not my cup of tea). The low end of the Pros is a little expensive, but by the time you're halfway up the line they're a bargain.

Mind you, it irritates me no end that there is no expandable desktop unit except at the high end. On the other hand, the G5 Quad I use for photography will be five years old in a couple of weeks and is still going strong. The typical Windows desktop (and Linux box, for that matter) lives no more than 3 years before it becomes difficult to expand it enough to run the latest software.

None of that is why I buy Macs, though. My time is valuable. I spend almost zero time maintaining Macs. No malware. No weird-ass registry issues that are only solvable by rebuilding the machine. Back-ups using in-the-box software that are unobtrusive, and restores that are fast and painless. Basic software that works at least reasonably well, and often extremely well, without having to buy anything extra.

I use and manage all of the versions of Windows manufactured in the last decade regularly (some much more often than the Macs). I find it telling that in order to make it run smoothly, reliably, you have to spend hundreds on aftermarket software, and recovery from malware is painful beyond belief if you don't have a recent image. Even migrating to a new box is painful. Dealing with these things costs time and money, and the problems are all but nonexistent on Macs. (Many are nonexistent on Linux too; I make heavy use of Linux for development and on servers. Great bang for the buck.)

From a consumer point of view Macs are a way better deal. Not so much in business, given the poor bulk management tools and Apple's legendarily bad business-class hardware support. Remember, though, that many of those tools exist primarily because it was impossible to manage the fragile Windows infrastructure without stuff like fast re-imaging. Windows breaks way more often than anything else and is the hardest system to repair without a full rebuild that I have ever seen (and that's saying something; I have worked with a lot of weird stuff).

Someday you should get me going about the design of the Windows VMM and NTFS; the apathy Microsoft shows toward improving basic functionality is mind-boggling. There is no reason I should have to defrag drives regularly, for instance; that was a solved problem in 1985, and Microsoft could have all but eliminated it with trivial (and backward-compatible) changes to the block allocator. Drives me nuts.

 

Comment Re:wrong OS? (Score 1) 1348

IMO there isn't a whole lot of difference in the basic UI of any of these things anymore. MacOS is easier to use than any of them because it is a whole lot more consistent within and between apps, but realistically things are not so bad anywhere else either.

Anyway, Linux is lacking a whole lot more than just major studio games, and WINE only closes the gap in a few places (and with significant irritations). I use Linux daily, so this is not just idle speculation. Creativity products in particular are nonexistent or weak (I'm looking at you, Open Office), with the exception of GIMP (and I still strongly prefer Photoshop). It's a superb programming environment, though; I wish I had valgrind for Windows. (I do have Purify. When it works it is great. Most of the time it does not work. Oh well.)

I use Windows daily too; many things are just not available anywhere else. I find that a pretty good development environment has Windows running native and Linux in a VM. (I'd rather run Linux native, but Windows has plenty of trouble being performant even when it's on the bare hardware; it's gawdawful in VMs. Linux works fine in VMs, excepting mediocre network performance.)

When it comes down to it my favorite desktop environment is the Mac. Excellent applications plus all the goodness that is UNIX, and it's easy enough to run Windows in a VM if I have to (though these days that is a pretty rare exception). I could do without a lot of aspects of His Steveness but I have to weigh that against the huge benefit of how easy it is to keep Macs running even in the hands of naive users.

Comment Re:wrong OS? NO! Wrong QUESTION! (Score 1) 1348

There are actually several fairly decent image editors on the web now (there weren't even a year ago), like pixlr.com. I'm not uninstalling my copy of Photoshop any time soon, for lots of reasons, but every passing day these programs get closer in functionality and for a whole lot of uses they're already there.

Regardless, I think content creation is going to need a PC or something like a PC for a good long time to come. The combination of high-bandwidth and precise input (keyboard[1], Wacom tablet) and horsepower is enough to take something like an iPad out of the picture completely for a lot of things. Of course, not very many people actually *do* those things, and for some very common tasks -- like constructing a presentation -- an iPad could well be superior. (I've done it with Keynote; It's *almost* there, but several UI annoyances are big enough to make me go back to the desktop. I could totally see using it exclusively with a few UI tweaks though, and in some cases it's already a lot better.)

[1] Of course, you can get a keyboard for an iPad if you want one. That kind of negates the beauty of the device if you ask me, though.

Comment Re:3 Menu Clicks (Score 1) 403

This is pretty similar to the position I'm in. I skipped the HTPC in favor of a Tivo, even though it was more limited, because every experiment I did with HTPCs ended with spending a ton of money to get something pretty fragile. OTA recording worked great, video capture through a cable box worked fine, but HDTV pretty much nuked it. The Tivo was a far better solution, both in that it works (and my wife loves it) and that it wasn't all that expensive over the long term.

Still, the Tivo has been a long-term disappointment. It does the DVR thing brilliantly, but it was obvious to me right from the outset that it could be the center of the AV stack if they put a little effort into it. But they didn't! And every new feature they add has leveraged Tivo's servers, which are so underprovisioned that you often get old waiting for a key click to be responded to. I will continue to use the Tivo until I find the cable connection to be redundant (5 years out, I bet) but I can already see its end-of-life.

$100 for an Apple TV is so cheap, and the interface so clean, that it's worth a shot just to see what it's like. Heck, I'd do it just for Netflix streaming. I will almost certainly buy one before Christmas, as soon as I get around to getting an HDMI switcher so I have an input to hook it up to.

I was surprised that the Logitech Google TV was $300, I expected closer to $200 based on the specs. I think that's going to be a hard sell in a recession economy (I'm certainly not lining up to buy one just yet and I'm very gadget-happy). My gut call is that Apple has the right idea, a dirt-cheap platform that tries to do a few things very well. If they manage to get enough TV content I would drop my cable subscription in a heartbeat, but even without it access to my iTunes database through my AV stack is worth $100. It's "good enough."

And I think my rationalizations get to the heart of the marketing problem with these devices: none of these will go mainstream without something in addition to TV, if only because none of them will have enough content to seriously compete with cable. There must be Something Else.

I think we're going to see the iTunes store for the Apple TV within the next year and they're going to push gaming hard. If they do that I could totally see this device doing a Wii and selling a gazillion units as a cheap little game platform. (Seriously, $5 games on my TV? I would totally do that.) Apple already proved they can sell iPod touches on that model very effectively. If they get that kind of volume the TV content people will sit up and notice. I wonder if that wasn't Jobs' game plan from the start of the Apple TV reboot.

Of course, Google TV could do the same thing (and I know they're talking about it). $300 though ... that is not a "take a chance" price, and despite huge gains they still don't have anything like the developer infrastructure of the iPhone/iPod touch to leverage.

No matter which way it goes I guess we will get cool new gadgets though, so bring it on :-).

Comment Re:What could possibly go wrong? (Score 1) 825

That was the general practice unless there were overriding concerns. Another requirement was a straight stretch of at least 1 mile (I might have the distance wrong) every so often so the road could be used by aircraft.

I don't have the energy to dig up my bibliography of sources but it was certainly not hard to find a lot of detail about the highway system. Quite a lot of thought went into its design, and the most interesting thing is that commerce was really a gravy side-effect.

I read an interview with one of the original architects and he said the one thing they really screwed up was running the interstates so close to cities, and having so many ramps in those areas.

Comment Re:What could possibly go wrong? (Score 1) 825

This is not selfishness; I live on the other side of the country, where there's no chance in hell of ever getting speed limits raised to 90mph. Nor do I think they should be. Rather, I think highway speed limits ought to be set at the 80th percentile. But I have spent lots of time in cars in NV and Utah and NM as a passenger, and frankly it seems that the biggest problem out there is simple highway hypnosis. It's a long $#@^ way between anything in those states. Shorter travel time can be a huge win.

Anyway, you are wrong about tighter speed limits, but that's probably because you have not looked at the literature.

What has been shown time and again is that reducing the speed differential within traffic reduces accidents and fatalities, independent of actual speed. Your chances of dying in a crash are higher if you're going faster, but if you can reduce the incidence of crashes it can be -- and is -- a net win. So the goal is to reduce accident rates as much as possible.

The engineers say that the way to do this is the 80th percentile rule: let traffic free-flow and watch how fast it goes, then set the speed limit at the 80th percentile speed, rounded up a little (5 mph in the US, 10 km/h elsewhere), and set the minimum 10 mph (20 km/h) lower.
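
To make the arithmetic concrete, here is a minimal sketch of that rule exactly as I just described it: take the 80th-percentile speed of free-flowing traffic, round up to the next 5 mph, and set the minimum 10 mph below that. The speed samples are made up for illustration.

    import java.util.Arrays;

    public class SpeedLimitRule {
        // Nearest-rank percentile of observed free-flow speeds.
        static double percentile(double[] speeds, double p) {
            double[] sorted = speeds.clone();
            Arrays.sort(sorted);
            int index = (int) Math.ceil(p * sorted.length) - 1;
            return sorted[Math.max(0, index)];
        }

        public static void main(String[] args) {
            // Hypothetical free-flow observations (mph) on one stretch of road.
            double[] observed = {61, 63, 64, 65, 66, 67, 67, 68, 69, 70, 71, 72, 74, 75, 78};

            double p80 = percentile(observed, 0.80);
            long limit = (long) (Math.ceil(p80 / 5.0) * 5.0);  // round up to the next 5 mph
            long minimum = limit - 10;                         // minimum 10 mph below the limit

            System.out.printf("80th percentile %.1f mph -> limit %d mph, minimum %d mph%n",
                    p80, limit, minimum);
        }
    }

Run against that sample data it prints a 75 mph limit and a 65 mph minimum, which lines up with the numbers below.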

The statistics say that traffic travelling 10mph faster *or* slower than average sees accident rates climb to 300% of normal. Moreover, the slower side sees multi-vehicle accident rates climb 900%! Slower drivers cause a lot of accidents, and their accidents involve other people much more often.

Now let's put that into the context of a typical 55mph US highway. Average traffic speed is 67mph on those highways. Minimum limits are 45mph. That means that someone -- legally -- going at the lower limit is actually going more than 20mph too slow! Very, very dangerous, both to themselves and to everyone else. But someone going 70mph -- 15mph too high according to the law -- is statistically very safe.

Given these numbers typical interstate traffic speed limits should be 70 or 75mph, not 55 or 65mph, and minimums should be 60 or 65mph respectively. That's what the engineering says. We have, unfortunately, eschewed engineering in favor of politics.

So, we have some great data from when the NMSL was repealed and a lot of limits jumped to 65mph. The first really interesting figure is that average traffic speeds jumped -- to 69mph. This put the lie to the idea that traffic will simply run at the police tolerance limit regardless of the posted speed limit. In fact, traffic tends to drive at "comfort" speeds, which unsurprisingly are somewhere near the design speed of the road.

With such a minimal increase in typical speed you wouldn't expect a large change in fatalities. There was a significant change in absolute numbers, but not when normalized for vehicle miles traveled. Moreover, the fatality rate for the road system as a whole dropped by something like 5%. It's believed that this is because the change in highway limits made drivers prefer the safer interstates to the less safe rural highways (since 70mph was now unlikely to get you a ticket).

Anyway, I spent a while researching this stuff a while back and may even still have a bibliography buried in my archives somewhere, but I encourage you to do the research yourself. Even Wikipedia mentions this stuff; you could start there.

I note that many of these figures are multinational. The data supports this in the US, the UK, France, and Germany at a minimum. The best studies of this are in France and Germany. (Germany is an odd man out though; the autobahn is pretty safe even though it has severe vehicle speed differences; driver training might have a lot to do with it. US driver training is pathetic, nigh on nonexistent, and I would not recommend autobahn-style laws here.)

Here are a couple of additional factoids for you:

- Average accident speed on non-interstates in the US is 27mph. Average accident speed including interstates is 29mph. What this means is that most accidents do not have speed as a significant factor. This is not actually terribly surprising when you go look at where high accident rates occur. It's not the highways! It's not even close. Which brings us to:

- Fewer than 10% of accidents happen on highways. The US interstate system has the safest roadways in the country by far.

So where are accidents and fatalities happening in large numbers? Surface streets, at intersections. Failure to obey traffic control devices and failure to yield are the two biggest causes. These accidents happen at relatively low speeds, but it turns out that even 30mph is a honking big hunk of energy when you're talking about a couple of two-ton vehicles colliding.

I note that US traffic safety programs target speed almost exclusively, with almost no effort spent on either training or enforcement at intersections. This is unbelievably stupid public policy. The fine structure shows this bias too; speeding fines are large and grow very rapidly, but fines for running a red light tend to be near nominal. The red light runner is far more dangerous.

Comment Re:What could possibly go wrong? (Score 2, Insightful) 825

You've never been to Nevada, have you? 90mph is not stupid fast in much of the state. Dead flat straight roads for hundreds of miles ... That's Nevada.

As a general rule the US interstate system was designed to be safe at 75mph in 1950s military vehicles. It is no great trick to be safe at higher speeds in modern cars, particularly in a big empty state like NV. Heck, in that area 80mph limits were the norm until they passed the national speed limit.

Comment Re:Yes and no... (Score 1) 397

Manual management, when it's done properly, certainly has a smaller footprint ... but you have to balance that against the much larger chance of making errors[1] -- not just a significantly increased tendency to leak, but also serious errors like double-frees and use-after-free, and the development time spent tracking that stuff down. (To say nothing of the costs of dealing with customers when their software crashes.) In addition, GC mechanisms can have lower overall CPU costs; there are interesting optimizations available when you're doing things in bulk, but you pay for that in less predictability.

There's give and take, and strong reasons to pick one or the other depending on the application type. If you step back and think about it, though, there aren't that many cases where the benefits of GC are outweighed by its costs. Software using GC tends to be easier to write and much less prone to crashing, and its unpredictability is not usually in the user-perceptible (to say nothing of critical) range. Given that most of the cost of software is in writing and maintaining it, anything you can do to drive down those costs is a big win.

Obviously there are cases where the tradeoffs are too expensive. Cellphones, as you point out, may be one of those -- but Android seems to be doing fine with Java as its principal runtime environment. (Honestly, your typical smartphone has way more memory and CPU than servers did not so long ago, to say nothing of the set-top boxes for which Java was originally designed.) Operating systems, realtime systems, and embedded systems are other cases.

[1] Brooks' _Mythical Man Month_ makes a strong case for development systems that optimize for reduced errors even at the cost of some performance. He was talking about assembly versus high level languages but (perhaps not surprisingly) the more things have changed the more they have remained the same. We have a lot of data on development and maintenance costs of various software environments now and costs tend to be lowest in the cases where the environment makes it harder for programmers to screw up. Usually by significant margins; my experience in comparing Java and C# versus C++ indicated an average time-to-completion differential of 300%, and a bug count reduction of more than 90%, over the long term. In some cases -- like network applications and servers -- the improved libraries found in Java and C# versus C++ yielded order-of-magnitude improvements. These are numbers you can take to the bank. Even if the first generation of the software is slow relative to a language like C++, the ability to rev the software three times as fast means much faster algorithmic development. It is often the case that the Java code would outperform C++ within three or four versions if the code was particularly complicated.

Of course C++ -- as with C before it -- is a particularly lousy language when it comes to memory management. It's a lot closer if you're using something that has, for instance, real arrays or heap validity checking. It annoys me to no end that none of the standard C++ development environments builds in debugging aids as a matter of course on non-release builds. If they exist at all they're hidden features (look at all the debugging heap features Microsoft Visual Studio will give you if you can figure out how to turn them on), and most of the time you have to go buy expensive add-ons (like BoundsChecker or Purify) to do things that would be trivial for the compiler writer to manage with a little extra instrumentation and integration with the library system. Alas. I actually had better debugging tools at my disposal for C++ in 1994 than I do today (although Valgrind is not bad at all), and that really pisses me off given how much money Microsoft gets for the tools I use.

Comment Re:Yes and no... (Score 1) 397

There are a lot of possible answers to that. The most obvious is that you want to limit the growth of the JVM; with garbage collection there is a tendency to grow the heap without limit.

That was never a satisfactory answer to me, though, because it is not at all difficult to set up a heuristic to watch GC activity and grow the available heap when it looks like memory is tight -- and certainly to try that before throwing OutOfMemoryError! In fact, it wasn't very long before JVMs that did this started popping up (I think Microsoft's was the first; it's pretty darn ironic that the best JVM out there when Sun sued Microsoft was actually Microsoft's, and by no small margin).

IMO it's an anachronism that Sun's JVMs still have the hard fixed limit without even the choice to turn it off. Larger Java applications (e.g. Eclipse, Maven, and of course the web app servers) regularly break for no other reason than running into heap limits even when there is plenty of memory available on the system. I find it a huge and unnecessary irritation.
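
If you want to see the ceiling in question, here is a quick sketch: the program just reports what the running JVM will allow itself, and the trailing comment shows the usual Sun/HotSpot knobs you end up tuning by hand (-Xms and -Xmx; exact defaults vary by JVM and version).

    public class HeapCeiling {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            long mb = 1024 * 1024;

            // maxMemory() is the hard ceiling the collector will grow the heap to;
            // on Sun's JVM it is fixed at startup (override with -Xmx).
            System.out.println("Current heap: " + rt.totalMemory() / mb + " MB");
            System.out.println("Heap ceiling: " + rt.maxMemory() / mb + " MB");
            System.out.println("Free in heap: " + rt.freeMemory() / mb + " MB");
        }
    }

    // Typical incantation when Eclipse or an app server hits the wall:
    //   java -Xms256m -Xmx1024m HeapCeiling
    // That raises the fixed ceiling to 1 GB; there is still no "grow as needed" option.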

Comment Re:Maybe. (Score 1) 397

It's always been the case that there are application-visible differences in JVMs. (Remember that Java is "write once, debug everywhere." It's not just a funny tag line.) You have to detect and work around them somehow, and the combination of vendor string and version is a reliable way to do that.
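
For what it's worth, the detection itself is just a couple of standard system properties; something like this minimal sketch (the property names are the standard ones, the workaround branch is obviously a made-up example):

    public class VmFingerprint {
        public static void main(String[] args) {
            // Standard properties every JVM is supposed to define.
            String vendor  = System.getProperty("java.vendor");
            String vmName  = System.getProperty("java.vm.name");
            String version = System.getProperty("java.version");

            System.out.println(vendor + " / " + vmName + " / " + version);

            // Hypothetical example of keying a workaround off the fingerprint.
            if (vendor.contains("Sun") && version.startsWith("1.4")) {
                System.out.println("Enabling workaround for a known 1.4 quirk");
            }
        }
    }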
