Comment This is nothing new (Score 1) 319

A few years ago NASA wanted to develop some form of on-line community similar to Second Life, so it sent out requests for ideas. I even submitted a few, figuring that if they did this right it could provide a serious environment for education and entertainment. NASA eventually announced a public hearing that potential developers could attend. What basically happened was that NASA had no funding for this; the proponents were expected to develop it at their own expense.

I saw the point here: you'd basically have to set up something that provided an environment for developing content, and you'd have to figure out how to monetize the system to cover its costs. Consider that, unlike games like World of Warcraft, the existing virtual worlds can be entered for free, and NASA wanted at least a minimum area you could enter for free, so monetization through admission (game-kit sales or monthly fees) was basically out. You'd either have to sell space or find some way to sell add-ons, and very likely NASA would have a veto over what content or user actions were allowed. All you'd get for your trouble was the privilege of using NASA's "meatball" logo as part of your project. As this wasn't much of an incentive - anyone who wanted to be in the virtual-world business was already there - the idea died on the vine.

I seriously believe a few thousand dollars could have allowed NASA to create a programmable on-line virtual-reality system that could have started small and been built up as its users figured out what to do with it, sort of the way Wikipedia bloomed from small and humble beginnings. But NASA wanted an unrealistic system without a means to finance it, and its unrealistic expectations got it exactly what could be expected: a nothing that went nowhere.

Comment This is old news (Score 1) 287

I tried posting the following reply on the website where the original article appeared, but their comment system kept returning an error, so I'll post it here and expand on what I would have said.

There is nothing new here; the problem of electronic data deteriorating or becoming unreadable because of the proprietary lock-in of various closed-source applications has been well known for more than 20 years that I'm aware of, and certainly a lot longer than that. Wire recorders, player-piano rolls, 78-RPM records, phonograph records, 8-track tapes, laserdiscs, 8" floppies, 5 1/4" floppies, Jaz disks, Zip disks, and now 3 1/2" floppies (and lots of other media I know I've forgotten) are all obsolete storage media, some of which may hold data that can no longer be retrieved because the hardware and/or software to read them is unavailable, lost, or forgotten.

RMS on Digital PDP minicomputers running RSX and RSTS, and on VAX machines running VMS. ISAM and PAM under the Univac VS/9 operating system on the 90/60, 90/70, and 90/80. VSAM on IBM mainframes (except the few places continuing to run z-System). The Control Data Cyber systems and their data file formats. Gould, Goodyear, Harris, and RCA mainframes. All of these are basically obsolete, most if not all are gone, and data stored on media from those systems, if written by a proprietary application, is for all intents and purposes probably lost forever even if the bits are still present: the media may have deteriorated, and the systems to read them are essentially nonexistent.

Mechanisms for regular conversion as technology changes have to be provided. This, however, requires that as the older media age, there be budget and personnel available to perform the conversion while both old and new media types are still available. As in the case of NASA cited in the article (an employee scrounged equipment and tapes on her own to keep the data alive until a means to retrieve it could be found), sometimes either or both may not be available.

Libraries have mentioned that their resources are stretched thin as it is; they may not have the funds or trained personnel to export old data to new media. And at the rate media keep changing, this is happening more and more frequently. Thirty years ago, in 1980, 250K 8" disks were still in use, and the 5 1/4" 360K disk was popular because of MS-DOS machines. Twenty years ago, in 1990, the 5 1/4" was still common and 3 1/2" floppies were becoming popular. Fifteen years ago a reasonable medium for high-capacity storage was the 100-megabyte Zip disk. Now I don't even have a 5 1/4" drive; my computer still has a 3 1/2" drive, but I don't own or use floppies, because I have a 4 GB jump drive that I wear on a lanyard around my neck and that cost ten bucks.

We've gone to digital storage because it's orders of magnitude cheaper than analog. I've pointed out in previous articles that with a digital camera and 4 GB SD cards, I can take thousands of pictures at an effective cost per picture that rounds to zero. A single photo might take 1/2 to 1 megabyte, which means that, without changing media, I can take upwards of 3,500 photos. The net cost is $10 when the media is bought, and nothing more unless I print an image. When I take pictures, I don't take one; I take 3, or 5, or 20, because the extra pictures are essentially free and I can delete the ones I don't want later. When I was using 35mm film, each photo, with film and developing, cost about 30c; a couple hundred pictures would set you back over US$50. Today, for $50 I can take more than 20,000 images.

But my sister still has an older digital camera that uses SmartMedia. She has to be careful to copy her images to hard disk when she uses it, because you basically can't buy SmartMedia anymore, and even when you could, the maximum size was 128 megabytes. Her photos are in the 150K range, so she can still take more than 400 photos on a 64 MB card, and again the cost is effectively zero.

And for current media it's still near zero per image. I bought a 1-terabyte drive (1000^4 bytes, not 1024^4). On one trillion bytes I can store just a hair under 1 million 1-megabyte images. The drive cost about $110 including tax, which works out to about $0.00011 per image stored, roughly 1/100 of one cent. Stored on a 4 GB DVD that costs about 25c, 3,500 images cost about $0.00007 each. Unless an image is printed out (and printing has also dropped in price; it's about 15c at a drug store, and the image printer can read the media directly), the cost of any image is so low as to be noise.
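The arithmetic above is easy to check with a back-of-the-envelope sketch (the prices and the 1 MB-per-image figure are just the ones quoted here, not measurements):

```python
# Back-of-the-envelope storage-cost-per-image check, using the prices quoted above.

def cost_per_image(media_cost_dollars, capacity_bytes, image_bytes=1_000_000):
    """Return the storage cost of one image, in dollars, for a given medium."""
    images = capacity_bytes // image_bytes  # how many images fit on the medium
    return media_cost_dollars / images

TERABYTE = 1000**4  # decimal terabyte, as the drive is marketed

print(cost_per_image(110.0, TERABYTE))       # ~0.00011 dollars: about 1/100 of a cent
print(cost_per_image(0.25, 4 * 1000**3))     # 4 GB DVD: on the order of 0.00006 dollars
```

Either way, the per-image cost rounds to noise, which is the point.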

But there are trade-offs. Information is so much cheaper to store digitally, but we have to rely on the systems needed to retrieve it remaining available. Photos can be viewed out of a shoebox in the middle of the Sahara desert as long as it's light or you have a candle or flashlight. But you'd better have a computer, a media reader if the image isn't on the computer's hard drive, and some software to view a digital image, or you'll never see it. Plus electric power for the computer, too.

Information of any value stored on non-machine media will be usable as long as human beings are around; machine-processed media is only good as long as the machines that process that media are around. Paper photos will decay to some extent, but they'll still remain photos without human intervention. While digital data will never decay as long as the media is intact, the ability to continue to read the media will regularly require human intervention and access to the machines (hardware and software).

The most important solution to this problem is to reject any software system that uses closed or undocumented proprietary storage formats in favor of those using open, non-proprietary ones. But this cuts into the profits of some organizations; witness Microsoft fighting so hard to force use of its proprietary and badly documented Word file format (to lock in users) and its fights to stop use of the openly documented Open Document Format (ODF), such as is used by OpenOffice.org. If files are in non-proprietary formats, the vendor can't hold your data hostage, either intentionally or by default (if the vendor goes out of business, or decides to stop supporting the program you're using but won't release details about the data formats in use).

As Maureen Smith pointed out in Robert A. Heinlein's To Sail Beyond The Sunset, "Whenever someone asks, 'why do they' or 'why don't they', the answer is almost always 'money'." Keeping data in digital format either saves a lot of money or makes possible things that would be impractical outside a computer, either because of the difficulty of manipulating images in the real world or because it would be exorbitantly expensive. Which is going to be cheaper: a movie rendered digitally using Blender, or one made by stop-motion photography of individual cels? They had to do it that way in the 1950s - labor was cheaper then, too - but today much more can be done with digital images.

But it's part of all development: every improvement has drawbacks. Paper was cheaper than clay tablets, but the acids in paper mean anything over a couple hundred years old is either a copy or well preserved; monks typically recopied manuscripts as the old ones started to decay. With acid-free paper we got spoiled by libraries that could last hundreds of years as long as the books never got wet. Electronic information has to be copied as older, less-efficient media are replaced; it's a drawback of the technology and a result of how much cheaper storage keeps becoming. It's no different than it has ever been - we've always had to replace things as they wore out - it's just that the cost to store each item has dropped so dramatically that we have a lot more of them.

Replacing media over time is a maintenance issue, and maintenance is always an issue that nobody wants to deal with or pay for. And that is the whole problem. We're not preparing for maintenance by expecting that it will need to be done and providing the necessary resources; we're letting ourselves be blindsided into being unable to perform it (by proprietary file formats and vendor lock-in); and then we're simply not doing it. So we pay the price of our short-sightedness.

Comment Michael Jackson (Score 1) 684

I thought VAC was pretty decent in preventing cheating in CS and Valve has been banning cheaters left and right?

I'm not going to rise to the bait and assume you seriously believed they were referring to "CS" as Counter-Strike as opposed to Computer Science. But I have my own story of people intentionally confusing others.

Over 20 years ago, when I lived in Southern California, KABC radio in Los Angeles had a (white) gentleman with an English accent as one of its talk-show hosts, whose name is Michael Jackson. So another guy on a BBS and I carried on a back-and-forth conversation about how he sure sounds different on his radio show, along with other comments that were clearly "wrong". Eventually someone took the bait and pointed out to us that they are two different people. That's when we revealed we already knew this; we just wanted to see how long it would take before someone noticed.

When the much more famous singer died, people went to the Hollywood Walk of Fame to look up the star for "Michael Jackson." Unfortunately, it wasn't his; it was the one that was given to the radio personality.

Comment My own Turbo Pascal story (Score 2, Interesting) 684

I used to offer help to other students at one college I would visit, and one guy had a problem with his Turbo Pascal (version 3) program: it wouldn't compile and he couldn't figure out why. It was a fairly long program, on the order of 20 screens (about 1,000 lines), so the problem wasn't immediately obvious. I asked him if I could make a few changes. I found a point where the code should be complete (no pending procedures) and inserted a 'BEGIN END.' there; the program up to that point compiled. I removed it, went further down, inserted 'BEGIN END.' again, and this time the compile failed with the same error he'd been getting. So I moved back up and tried again, and eventually - meaning in about 3 minutes - I found the problem. It was a subtle bug even I hadn't spotted: he had left an open brace { in his code, so the compiler treated everything from that point until the next closing brace (or the end of the program) as a comment.

The guy was absolutely amazed that I found the bug as fast as I had. He had spent over 2 hours trying to find it without success; I found it in less than five minutes. It could be that I was a fresh pair of eyes, or maybe the 3-plus years of programming experience I had by then didn't hurt either...
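The 'BEGIN END.' trick is really a manual binary search for the first line that breaks the compile. A sketch of the same idea in Python - the compiles() predicate here is a toy stand-in for actually invoking a compiler, and it assumes the failure is monotone (every prefix before the bad line compiles, every prefix including it fails, as with the unclosed brace):

```python
def first_bad_line(lines, compiles):
    """Binary-search for the index of the first line that makes the source
    fail to compile. `compiles(prefix_lines)` reports whether a given prefix
    of the file compiles cleanly. Assumes a monotone failure: all prefixes
    before the bad line compile, all prefixes including it fail."""
    lo, hi = 0, len(lines)  # invariant: prefix of length lo compiles, length hi fails
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if compiles(lines[:mid]):
            lo = mid
        else:
            hi = mid
    return hi - 1  # index of the first offending line

# Toy demonstration: pretend line 7 opens a brace that is never closed.
source = ["ok"] * 10
source[7] = "{ oops, unclosed comment brace"
fake_compiles = lambda prefix: "{" not in "".join(prefix)
print(first_bad_line(source, fake_compiles))  # -> 7
```

With roughly log2(1000) = 10 probes, a 1,000-line file narrows to the offending line in about ten compiles, which is why the manual version took minutes rather than hours.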

Comment Re:Be honest (Score 1) 684

Reminds me of a similar experience; I was writing .. something .. in assembly language that had a very subtle heisenbug that only came up after rigorous testing.

I figured -- if *I* can't find it, the TA sure as hell isn't going to.

I was right.

I know exactly how you feel. We had some kind of problem with one of the line printers at a college I attended, where you'd randomly get an extra " appearing at the end of a listing. This was one of those 'really fast' chain-drive printers hooked to a Hitachi or an Amdahl (an IBM System/370 mainframe work-alike); the printer was something like an IBM 1403, I think. It probably cost several thousand dollars then, and its speed was about 6 pages a minute (300 LPM), not even as fast as my $110 8-ppm laser printer today.

As it turned out, one time I was doing an assignment and by mistake ran the same job twice, one after the other. One of the listings had the spurious character and the other didn't, so I conclusively proved it was the printer causing the problem, not the computer somehow sending a spurious character.

Comment I did a couple of times (Score 2, Interesting) 684

I won't say where, but a couple of times I did someone else's college programming assignment for them, making sure I 'dumbed down' the quality of the work so it wouldn't be too obvious that someone with a considerable amount of experience (over 5 years at the time) had done it rather than a new student. What can I say other than that I was broke and needed eating money, and it was just as hard then as it is now to get a programming job without a degree. At least it was over 20 years ago, so the statute of limitations applies, presuming I did anything illegal. I can admit, however, that I think I actually learned a few things from having to do the assignments.

"About the things I've done in the past, I hope either they've been forgotten, or if not forgotten, covered by the Statute of Limitations."
— Robert A. Heinlein

Science

Why the First Cowboy To Draw Always Gets Shot 398

cremeglace writes "Have you ever noticed that the first cowboy to draw his gun in a Hollywood Western is invariably the one to get shot? Nobel-winning physicist Niels Bohr did, once arranging mock duels to test the validity of this cinematic curiosity. Researchers have now confirmed that people indeed move faster if they are reacting, rather than acting first."
Software

The Final Release of Apache HTTP Server 1.3 104

Kyle Hamilton writes "The Apache Software Foundation and the Apache HTTP Server Project are pleased to announce the release of version 1.3.42 of the Apache HTTP Server ('Apache'). This release is intended as the final release of version 1.3 of the Apache HTTP Server, which has reached end-of-life status. There will be no more full releases of Apache HTTP Server 1.3. However, critical security updates may be made available."

Comment Another Imbecile incompetently running video (Score 0, Offtopic) 249

I am guessing that the site was slashdotted, because the video never ran. Yet another example of some imbecile who designs their own video player and either can't serve the material correctly or can't handle the load. I see this over and over: someone - or some site - decides to run their own video player, and it's either inoperative or runs badly. I wrote on my blog in October 2008 about how so many places try - and fail - to run video properly.

You know, running video correctly isn't rocket science; YouTube does it fine under loads that would slashdot Slashdot. But do these stupidos use YouTube to serve their video? Noooo, they'd prefer to use some incompetent who can't provide it properly, probably because they're under the impression they'd lose ad revenue or something. But I see this all the time. The New York Times provides video for some of their stories, but their video doesn't work: it stalls, and there is no way to cache the video so that if it fails you can either get it to run smoothly or go back and run it again without downloading the entire video all over again. I guess they never thought about people having problems.

If these were streamed video like a live event, that would be one thing. But they do the exact same thing YouTube does, they feed stored video to a player written using Adobe Flash. So there's no excuse for their failures except pure incompetence and/or stupidity.

Earth

Minnesota Introduces World's First Carbon Tariff 303

hollywoodb writes "The first carbon tax to reduce the greenhouse gases from imports comes not between two nations, but between two states. Minnesota has passed a measure to stop carbon at its border with North Dakota. To encourage the switch to clean, renewable energy, Minnesota plans to add a carbon fee of between $4 and $34 per ton of carbon dioxide emissions to the cost of coal-fired electricity, to begin in 2012 ... Minnesota has been generally pushing for cleaner power within its borders, but the utility companies that operate in MN have, over the past decades, sited a lot of coal power plants on the relatively cheap and open land of North Dakota, which is preparing a legal battle against Minnesota over the tariff."

Comment Re:incompetence (Score 1) 242

Manager: Hmmm. Well, it needs to work by next Tuesday.
Coder: (very quiet expletive)

Code Monkey have boring meeting with boring manager Rob
Rob say Code Monkey very diligent
but his output stink
his code not functional or elegant
what do Code Monkey think
Code Monkey think maybe manager want to write goddamn login page himself
Code Monkey not say it out loud
Code Monkey not crazy just proud
-- Jonathan Coulton

Comment It's a spurious number (Score 1) 242

I remember someone quoting an estimated TCO (total cost of ownership) for PCs of something like $10,000 or $12,000 a year, because it priced everything at the maximum professional cost for software failures, installation of new software, and so on. Ignoring PCs in non-commercial environments: I have a PC here at home; it's what I'm using to write this. My labor costs me nothing, and the changes I make to my computer are for my own benefit. While my own labor might have some value to me, it costs me nothing beyond opportunity cost, which again involves an actual financial expenditure of zero.

If a team has to spend $10 million to develop an application because they had to do it twice, and doing it right the first time would have cost $3 million, you can claim it's a loss of $7 million, or you can - correctly - claim it's a system that cost $10 million to develop, including false starts. It all depends on how you want to cook the numbers.

If they are claiming that $6+ trillion represents complete failures, I find that a bit unlikely. But if you count the amounts wasted because the customer didn't know what they wanted, should we then count as failures the expense of all the people who gut perfectly working bathrooms and kitchens after a few years because they no longer like the way they look?

You can create any kind of number by how you count failure: whether you include redesigns for performance, redesigns to add features, or redesigns for maintenance. Or you can count as failures only systems that had to be scrapped because they had absolutely no usability for any of the problems they needed to solve; if that was what was being claimed, I would, again, find the number highly suspect. It all depends on where they get their numbers.

Let's also not forget, again, that this is an estimate, because most of these numbers are neither published nor available to anyone outside the company that developed the program or system. The number could be higher, or it could be lower. It reminds me of the supposed estimates of losses to pirated software, inflated to huge numbers by claiming every copy made was a lost sale at full, undiscounted price. Some kid who copied a program onto a disc a friend gave him is certainly not going to pay $200 for a copy, but the numbers presumed that the bootleg copy would have resulted in a full-price sale.

So if someone wants to claim that the total cost of software failures is US$6,200,000,000,000.00, I'd really like to know how they got this number. Are they pricing costs in Africa as if labor there cost the same as in New York City? Are they pricing labor in Los Angeles the same as in Baton Rouge, Louisiana? How are they determining costs?
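A toy sensitivity check makes the point: hold the assumed volume of "failure" work fixed and vary only the labor rate, and the headline total swings by more than an order of magnitude. Every figure below is made up purely for illustration; none of them come from the study being discussed.

```python
# Hypothetical sensitivity check on a software-failure cost estimate.
# All numbers are illustrative assumptions, not data from any study.

def total_cost(failure_hours, hourly_rate):
    """Dollar total implied by an assumed volume of failed work."""
    return failure_hours * hourly_rate

ASSUMED_HOURS = 50e9  # made-up worldwide hours attributed to failed projects

for label, rate in [("New York consultant", 200.0),
                    ("U.S. staff average", 60.0),
                    ("low-cost region", 8.0)]:
    print(f"{label:>20}: ${total_cost(ASSUMED_HOURS, rate):,.0f}")
```

The same assumed hours yield anywhere from roughly $0.4 trillion to $10 trillion; a headline figure like $6.2 trillion is meaningless without knowing which rate assumptions went into it.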

Paul Robinson <paul@paul-robinson.us> - My Blog

Comment They're a bunch of socialist whiners (Score 1) 782

The purpose of the GPL was to ensure that a distributor of software releases the source and, if they made any changes, that they release the source to those changes. The GPL was meant to ensure that a person has the freedom to know what the software contains and to modify and re-release it if they choose. The GPL in no way prevents or discourages anyone from selling a GPL-licensed product. In fact, the GPL specifically states that one may charge any fee for the application as long as access to the source is made available, either with the sale or, at a nominal charge, from some other point if copying is necessary. If you make the source available as part of the application, you have completely complied with your responsibility under the GPL, and you are entitled to charge whatever the traffic will bear.

If they don't like it, they should have written their own license that requires source release and prohibits charging. As it is, under the GPL you can charge anything you want and the people who originally released it have no right to object. You've got expenses, and for that matter, if you want to make a small profit - or even a large one - you're perfectly entitled to do so.

Nobody says a goddamn word when Red Hat or SUSE makes millions reselling a large collection of open-source applications without paying the developers anything at all. Of course, Red Hat does spend several million a year contributing code to the Linux distribution that everyone also gets to use, but that's beside the point: no one objects to Red Hat making a huge profit. If you can make something off a released GPL application and comply with the license, more power to you, and tell those whiners that the purpose of the GPL was to ensure protection of the user's freedom, not protection of getting something for nothing.
-----
Paul Robinson <paul@paul-robinson.us> - My Blog

Comment Re:Here we go again (Score 1) 258

The copyright of machine generated work has been a matter of law for more than a hundred years.

If you think this is in any way open to debate, ask yourself who drew Toy Story.

The computers simply rendered and drew the images, but human beings put considerable thought into the script, settings, voices, etc. This is a far cry from a computer amalgam of existing facts. Further, in order to claim anything other than actual damages, the work must be registered within three months of first publication. Unless Wolfram registers every search output, all they can get is provable damages; they can't get statutory damages, and I'm not sure whether they would even be eligible for attorney's fees on unregistered works. (This all applies to U.S. works; works from authors outside the U.S. might not have to register.)
