Networking

Journal: Where is there decent Internet?

A few of you have noticed my signature, in which I mention that my current ISP offers 100 Mbit fiber-to-the-home for $65/mo with no installation fee. Recently, I've discovered that while they do not filter, they do have a 20 GB/mo cap, alongside a vague policy about "more than five hours of video per week".

Of course, they also sell a TV service. I would bet that is where this limit comes from -- to prevent YouTube, Netflix, etc., from competing with Lisco TV.

Being unemployed, and living in a small town, I would not mind relocating to find a job. The question is: where? Is there anywhere with similarly priced Internet, unthrottled and uncapped -- or at least with a significantly higher cap? (Alright, there's Japan. Anywhere in the US?)

User Journal

Journal: 1-3% of all main-sequence stars have planets?

The venerable BBC is reporting that a survey of light emitted from white dwarfs showed that between 1% and 3% had material (such as silicon) falling into the star on a continuous basis -- potential evidence of dead worlds and asteroids. On this basis, the authors of the study speculate that the same percentage of main-sequence stars in the active part of their lives will have rocky matter. This is not firm evidence of actual planetary formation, since asteroids would produce the same results, but it does give an upper bound, and some idea of a lower bound, for the rate of planetary formation.

Aside from being a useful value for Drake's Equation, the rate of planetary formation would be valuable in understanding how solar systems develop and what sort of preconditions are required for an accretion disk of suitable material to form.
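That 1-3% range plugs directly into the fraction-of-stars-with-planets term of the Drake Equation. A minimal sketch of how the survey's bounds propagate through the equation -- every parameter value other than f_p below is an illustrative assumption, not a measurement:

```python
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """N = R* * fp * ne * fl * fi * fc * L
    (number of communicating civilisations in the galaxy)."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Lower and upper bounds on f_p from the white-dwarf survey;
# all other parameters are placeholder guesses for illustration.
for f_p in (0.01, 0.03):
    N = drake(R_star=7, f_p=f_p, n_e=0.5, f_l=0.5, f_i=0.1, f_c=0.1, L=10_000)
    print(f"f_p={f_p}: N = {N:.2f}")
```

Even with everything else held fixed, the factor-of-three spread in f_p translates directly into a factor-of-three spread in the final estimate, which is why pinning this number down matters.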

Because the test only looked for elements too heavy to have been formed in the star, we can rule out the observations being of cometary debris.

User Journal

Journal: Fireball, but not XL5

Four fireballs, glowing blue and orange, were visible last night over the skies of the Carolinas, on the southeast coast of the United States, followed by the sound of an explosion described as being like thunder. Reports of the noise came in from as far afield as Connecticut. There is currently no word from NASA or the USAF as to what it could be, but it seems improbable that anything non-nuclear the military could put up would be heard over that kind of distance. It therefore seems likely to be a very big meteorite.

The next question is what type of meteorite -- and this is not an idle question. The one that slammed into Sudan recently was (a) extremely big, at an estimated 80 tonnes, and (b) from the extremely rare F class of asteroid. If this new meteorite is also from an F-class asteroid, then it is likely associated with the one that hit Sudan. This is important because it means we might want to be looking very closely for other fragments yet to hit.

The colours are interesting and let us narrow down what the composition could have been, and therefore where it came from. We can deduce this because anything slamming through the atmosphere is essentially undergoing a giant version of the basic chemistry "flame test" for substance identification. We simply need to look up which metals produce blue, and in doing so we see that cadmium produces a blue-violet colour, with copper producing more of a blue-green.

Other metals also produce a blue glow, and tables of these colours abound, but some are more likely in meteoric material than others. Cadmium exists in meteorites -- well, all elements do, if you find enough meteorites -- but it exists in sufficient quantity that it could produce this sort of effect. (As noted in the chemmaster link, low concentrations can't be detected by this method, and that limitation will be vastly worsened by the fact that this isn't a bunsen burner, and the distance over which you're observing is extreme.)

Ok, what else do we know? The fireballs were also orange. Ureilites, such as the Sudan impactor, contain a great deal of calcium, which burns brick-red, not orange. This suggests we can rule out a common source, which in turn means we probably don't have to worry about being strafed the way Jupiter was by comet Shoemaker-Levy 9 (21 impacts).
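The reasoning here is essentially a table lookup: match observed fireball colours against known flame-test colours. A toy sketch of that lookup -- the colour names and the shortlist of metals below are illustrative assumptions for the sake of the sketch, not a spectroscopy reference:

```python
# Illustrative flame-test colours for a few metals plausibly present in
# meteoric material. Treat these entries as assumptions, not a reference.
FLAME_COLOURS = {
    "cadmium": "blue-violet",
    "copper": "blue-green",
    "calcium": "brick-red",
    "sodium": "yellow-orange",
}

def candidates(observed: set[str]) -> list[str]:
    """Return metals whose flame colour matches any observed colour term."""
    return [metal for metal, colour in FLAME_COLOURS.items()
            if any(obs in colour for obs in observed)]

print(candidates({"blue"}))  # both cadmium and copper match "blue"
```

As with the real flame test, a single colour term usually leaves several candidates standing, which is why the orange component is needed to narrow things further.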

What can we say about it, though? Well, provided the surviving fragments didn't fall into the ocean, every meteorite hunter on the planet will be scouring newspaper stories for clues as to where impacts occurred. Meteoric material is valuable, and anything on a scale big enough to be heard across the entire east coast of the US is going to be worth looking for. It split into four in the upper atmosphere, so you're probably looking at a few thousand fragments reaching ground level, some worth more than a year's average pay.

User Journal

Journal: Eric Raymond: GPL no longer needed

Eric Raymond, gasbag and/or Open Source leader, recently claimed the GPL is no longer needed: "if the market punished people for taking open source closed, then why do our licenses need to punish people for taking open source closed? That is why I don't think you really need GPL or a reciprocal licenses anymore. It is attempting to prevent the behavior that the market punishes anyway."

He has a point. If Open Source is truly a better development model, then it will beat closed source on its own merits and doesn't need viral clauses.

Flame on!

User Journal

Journal: Software I purchased in 2008

Here's the (Macintosh) software I purchased in 2008:

1) Adobe CS3 Master Collection (bought with the educational discount). I was originally going to buy just the web developer pack, but decided I might as well spend a little more and get everything. Big waste of money. Soundbooth (something I was interested in) has almost no features. Dreamweaver, Fireworks, and Photoshop are the only apps I've used (primarily Fireworks). The UI is slow, painful, and non-standard -- not entirely their fault, since these are basically applications designed before OS X. On second thought, if you're charging $2500 or more, it is your fault. Adobe: you suck. Shit or get off the can.

2) Coda. Coda looks and feels like Apple's take on Dreamweaver. It lacks some of the features, but not enough for me to stop using Dreamweaver completely. If they did a Fireworks-type app I'd never use Adobe again.

3) SubEthaEdit. A nice little text editor (used by Coda). It's designed for real-time collaborative editing over the internet, though I haven't tried that. For some reason, it feels a little bit nicer than TextWrangler, though I don't know why.

4) Versions. A Subversion browser. It's a little pricey (IMO), and yes, I know all the cool kids have switched to git. However, I still use svn for my repositories, and using a GUI is easier (and less error-prone) than typing out full pathnames for tagging and the like.

5) Tax Cut 2007. Almost forgot about that one, since I used it for a week and hopefully never will again.

Mozilla

Journal: "email this picture" on right-click

Why, oh why, did the devs at Mozilla put this option on the right-click menu of Firefox immediately under "Save Image As"? Were they purposely trying to make life even more difficult for drunk people?
User Journal

Journal: What constitutes a good hash anyway?

In light of the NIST complaint that there are so many applicants for their cryptographic hash challenge that a good evaluation cannot be given, I am curious whether they adequately defined the challenge in the first place. If the criteria are too loose, then of course they will get unsuitable entries. However, the number of hashes entered does not seem to be significantly more than the number of entries in the encryption-mode challenge. If this one is impossible for them to evaluate well, then maybe that one was too, in which case maybe we should take their recommendations on encryption modes with a pinch of salt. If, however, they are confident in the security and performance of their encryption-mode selections, what is their real objection in the hashing case?

But another question one must ask is why there are so many applicants for this, when NESSIE (the European version of this challenge) managed just one? Has the mathematics become suddenly easier? Was this challenge better-promoted? (In which case, why did Slashdot only mention it on the day it closed?) Were the Europeans' criteria that much tougher to meet? If so, why did NIST loosen the requirements so much that they were overwhelmed?

These questions, and others, look doomed to not be seriously answered. However, we can take a stab at the criteria and evaluation problem. A strong cryptographic hash must have certain mathematical properties. For example, the distance between any two distinct inputs must be unconnected to the distance between the corresponding outputs. Otherwise, knowing the output for a known input and the output for an unknown input will tell you something about the unknown input, which you don't want. If you have a large enough number of inputs and plot the distance of inputs in relation to the distance in outputs, you should get a completely random scatter-plot. Also, if you take a large enough number of inputs at fixed intervals, the distance between the corresponding outputs should be a uniform distribution. Since you can't reasonably test 2^512 inputs, you can only apply statistical tests on a reasonable subset and see if the probability that you have the expected patterns is within your desired limits. These two tests can be done automatically. Any hash that exhibits a skew that could expose information can then be rejected equally automatically.
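The distance-scatter idea can be sketched in a few lines. As a stand-in for a candidate hash, the example below uses SHA-256 (an assumption for illustration): for random inputs differing in a single bit, a strong hash's output Hamming distances should cluster tightly around half the digest length, with no dependence on the input distance.

```python
import hashlib
import random

def hamming(a: bytes, b: bytes) -> int:
    """Hamming distance between two equal-length byte strings."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

def avalanche_sample(n_pairs: int = 1000, msg_len: int = 32) -> list[int]:
    """For random input pairs differing in exactly one bit, collect the
    output Hamming distances. A strong hash should give distances
    clustered around half the digest length (128 bits for SHA-256)."""
    distances = []
    for _ in range(n_pairs):
        msg = bytearray(random.randbytes(msg_len))
        h1 = hashlib.sha256(bytes(msg)).digest()
        bit = random.randrange(msg_len * 8)   # flip one input bit
        msg[bit // 8] ^= 1 << (bit % 8)
        h2 = hashlib.sha256(bytes(msg)).digest()
        distances.append(hamming(h1, h2))
    return distances

dists = avalanche_sample()
print(sum(dists) / len(dists))  # should be close to 128
```

Any candidate whose mean drifts far from half the digest length, or whose output distances correlate with the input distances, can be rejected automatically before human analysis begins.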

This is a trivial example. There will be other tests that can also be applied automatically to weed out the more obviously flawed hashing algorithms. But this raises an important question: if you can filter out the more problematic entries automatically, why does NIST have a problem with the number of entries per se? They might legitimately have a problem with the number of GOOD entries, but even then, all they need to do is have multiple levels of acceptance and an additional round or two. E.g., at the end of human analysis round 2, NIST might qualify all hashes that succeed at that level as "sensitive-grade" with respect to FIPS compliance, so that people can actually start using them; then have a round 3 that produces a pool of 3-4 "classified-grade" hashes; and a final round to produce the definitive SHA-3. Adding more rounds takes longer, but by producing lower-grade certifications along the way, the extra time needed for a thorough cryptanalysis won't impede those who actually use such functions.

(Yes, it means vendors will need to support more functions. Cry me a river. At the current scale of ICs, you can put one hell of a lot of hash functions onto one chip, and have one hell of a lot of instances of each. Software implementations are just as flexible, with many libraries supporting a huge range. Yes, validating will be more expensive, but it won't take any longer if the implementations are orthogonal, as they won't interact. If you can prove that, then one function or a hundred will take about the same time to validate to accepted standards. If the implementations are correctly designed and documented, then proving the design against the theory and then the implementation against the design should be relatively cheap. It's crappy programming styles that make validation expensive, and if you make crappy programming too expensive for commercial vendors, I can't see there being any problems for anyone other than cheap-minded PHBs - and they deserve to have problems.)

PC Games (Games)

Journal: Open letter to EA (and other publishers)

Obligatory, pre-emptive MFD strip -- I know it's unlikely anyone from EA will read this. But I'm not the only one, I hope.

I live in a small town, with low cost of living. I'm single. I make a reasonable wage, so I have a ton of disposable income.

I'm a computer professional and enthusiast. I tend to spend a decent amount on hardware, and I do game. I also tend to download custom mods and such, even toy with level design from time to time -- in other words, I take full advantage of the fact that I'm on a PC, and not a console.

Now, I could tolerate most games being Windows only -- I don't have to like it, but I tolerate it. After all, I can always put it in a virtual machine, and even if I'm running it on the bare metal for performance, I don't game 100% of the time, and I generally don't do anything else when I'm gaming.

You have lost me as a customer because of DRM.

I'm not just talking about Spore.

I'm not fanatical about this. I happily buy Valve games over Steam, play them on Windows, and spend money to do so. I'll jump on anything decent coming through Penny Arcade's Greenhouse project. I play an MMO -- that means I pay a monthly fee, I have to use their software, and they can pretty much terminate my account whenever they want.

I want to give you money.

Here's what you did, in response to Spore -- and in your next game, apparently, not in Spore itself:

  - You upped the number of allowed installations from three to five. Some of us have more than five reinstalls per month.

  - You removed the need for the game to stay online -- it's now only needed during activation. I'm sure some users are grateful, but those users likely see it as exactly as small a gesture as increasing the number of installs. Why force them to be online in the first place?

  - You removed the CD copy protection. I haven't bought a game in years that used CD copy protection. What took you so long?

Here's what else is still a problem, for me:

  - Blacklisted programs. Daemon Tools, among other things -- it has legitimate uses other than piracy. I should be able to run whatever software I want on my machine -- it's mine, after all.

  - Reputation. SecuROM is widely known as the worst, and it isn't getting better. Many people report that it has trashed their system. Why should I trust it this time?

The freedom to do what I want, how I want, without having to solder things, is why I'm a PC gamer in the first place. DRM, by its very nature, limits that.

That's the damage. Here's the impact:

Your DRM, in the long run, does nothing to secure your product. Spore is one of the most widely pirated games ever, despite everything you did to inconvenience legitimate users. A skilled cracker can defeat all of these measures relatively quickly -- sometimes before the game is even released.

And that's the choice it has come to.

I want to play Mirror's Edge, badly. If there's ever a version of it for the PC, I've got money in hand to buy it, and a new computer, and a controller if needed -- I'm not sure how well the unique movement would map to a mouse, but maybe it will.

If Mirror's Edge comes, say, as a Steam game -- not like Bioshock, but actually just a Steam game, with no additional protection -- I'd buy it in a heartbeat. On opening day. Make it DRM-free, and I'll consider preordering.

If it comes with anywhere near the level of DRM you're currently requiring for Spore, even this "relaxed" version, I will head over to the nearest torrent site and download a copy. I have plenty of money to spend, yes, but not plenty of time to waste proving that I own something.

And I am not the only one who feels this way. Keep in mind: An unprecedented number of people gave Spore a low rating on Amazon because of its DRM. An unprecedented number of people have pirated Spore, mostly via torrent. Coincidence?

User Journal

Journal: Beowulf MMORPGs

Found this interesting site, which focuses on developing grid-computing systems for gaming. The software they seem to be using is a mix of closed and open source.

This could be an important break for Linux, as most of the open source software being written is Linux-compatible, and gaming has been Linux's biggest problem area. The ability to play very high-end games -- MMORPGs, distributed simulators, wide-area FPSes, and so on -- could transform Linux's standing in the gaming market from a throwback to the 1980s (as unfair as that is) to world-class.

(Windows machines don't play nearly so nicely with grid computing, so it follows that it will take longer for Microsoft and Microsoft-allied vendors to catch up to the potential. That is time Linux enthusiasts can use to get a head-start and to set the pace.)

The question that interests me is: will they? Will Linux coders use this opportunity of big university research teams and big vendor interest to leapfrog the existing markets completely and go straight for the market after? Or will this be seen as not worth the time, the same way a lot of potentially exciting projects have petered out (e.g. Open Library, Berlin/Fresco, KGI, openMosix)?

Microsoft

Journal: Microsoft buys ciao.com to boost e-shopping search

From the article: "Microsoft has agreed to buy Greenfield Online, owner of popular European price comparison website ciao.com, for about $486 million to boost its Internet search and e-commerce business in Europe."

Microsoft has tried to buy their way into online success before; will this one be the charm?

User Journal

Journal: Of Sapir and Whorf

I think I finally "get" Web 2.0.

It occurred to me when I started talking about The Cloud -- both loving the idea, and hating myself for using such an obvious buzzword. But I think I get it now.

It's about language.

Read 1984. And read about the Sapir-Whorf hypothesis. Maybe you'll see it, too -- our use of language has a profound impact on how we see the world.

There was a great story about how, when Europeans first came to America, some of the natives actually couldn't see the ships, because it was like nothing they'd ever seen before. They didn't have a word, or a frame of reference, for the huge cloud-like things they saw on the horizon -- so they just didn't see them.

I kind of doubt that story is true, but I do think it applies. How long did dynamic websites exist, with the ability for users to alter content, before anyone "got it" -- until we started calling it "Web 2.0"? How long did virtualization and CPU-power-on-demand services exist -- and, while there was some buzz about virtualization, no one really got it until we started calling it The Cloud.

This isn't new -- it has existed, really, as long as abstract concepts have, because language is the medium through which we understand and communicate abstract concepts. For an obvious example, take "Pro-Choice" vs "Abortionist" (or "Baby-Killer!"), and "Pro-Life" vs "Anti-Abortionist" (or "Woman-Hater!"). Quite often, people make the mistake of using the opposition's language in their own argument, trying to show its flaws, but really, that only strengthens the opposition's case. Who really wants to argue against choice, or life?

It's not always a good thing, and we should not always embrace new language. But neither should we be so quick to dismiss it as a "buzzword" -- after all, the Internet itself is perhaps the godfather of the modern buzzword. What we're really talking about is just another network -- which is really just a bunch of computers with wires running between them -- but now that we know it's something called "The Internet", our view changes, and it really becomes a world-changing phenomenon.

Understand: Not just appears to be, or appears to become. A random network of computers cannot change the world. The Internet can and has.

I now understand why RMS and friends insist on calling it "GNU/Linux", though I still don't agree with it. But you see... RMS understands the power of language.

(Edit: This could probably be applied to Memetic Engineering, if we ever implement that concept. The Anti-Meme would have to be very clearly defined in language for it to work.)

User Journal

Journal: The Lost Tapes of Delia Derbyshire

Two hundred and sixty-seven tapes of previously unheard electronic music by Delia Derbyshire have been found and are being cataloged.

For those unfamiliar with Delia Derbyshire, she was one of the top pioneers of electronic music in the 1950s and 1960s. One of her best-known pieces was the original theme tune to Doctor Who. According to Wikipedia, "much of the Doctor Who theme was constructed by recording the individual notes from electronic sources one by one onto magnetic tape, cutting the tape with a razor blade to get individual notes on little pieces of tape a few centimetres long and sticking all the pieces of tape back together one by one to make up the tune".

Included in the finds was a piece of dance music recorded in the mid-60s; contemporary artists who examined it said it would pass for better-than-average mainstream dance music today. Another piece was incidental music for a production of Hamlet.

The majority of her music mixed wholly electronic sounds, produced by a sophisticated set of tone generators and modulators, with electronically altered natural sounds, such as those that could be made from gourds, lampshades and voices.
