
Comment Random prefix workaround (Score 4, Interesting) 56

There may very well be something I'm missing here, but I have a suggestion for how to deal with the random prefix attack.

Keep a running count of the number of requests for non-existent subdomains. Once they exceed a certain number in a short period of time, cease to respond to requests for subdomains that aren't already cached as valid.

Example: foo.com, www.foo.com, and mail.foo.com are cached. A flood of requests for (random chars).foo.com starts up. Once this exceeds 100 requests in a minute, all requests for foo.com subdomains are ignored except for foo.com, www.foo.com, and mail.foo.com.

This would still cut off access to infrequently-accessed subdomains, but subdomains with enough traffic to be in the cache would remain reachable.
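For the curious, the counting scheme could be sketched in a few lines. Everything here (the class name, the thresholds, the cache interface) is hypothetical and just illustrates the idea, not an actual resolver implementation:

```python
import time
from collections import defaultdict, deque

THRESHOLD = 100   # max lookups for uncached subdomains...
WINDOW = 60.0     # ...allowed per this many seconds, per zone

class RandomPrefixGuard:
    def __init__(self, cached_names):
        self.cached = set(cached_names)    # names already known to be valid
        self.misses = defaultdict(deque)   # zone -> timestamps of uncached lookups

    def should_answer(self, name, zone, now=None):
        now = time.monotonic() if now is None else now
        if name in self.cached:
            return True                    # cached names always resolve
        q = self.misses[zone]
        q.append(now)
        while q and now - q[0] > WINDOW:   # drop entries outside the window
            q.popleft()
        return len(q) <= THRESHOLD         # under a flood, ignore random prefixes

guard = RandomPrefixGuard({"foo.com", "www.foo.com", "mail.foo.com"})
print(guard.should_answer("www.foo.com", "foo.com"))   # True
```

Once the flood of (random chars).foo.com lookups trips the threshold, only foo.com, www.foo.com, and mail.foo.com get answers until the window slides past the burst.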

Comment Not sure what is new here (Score 1) 102

Like a lot of people, I had incredibly shitty math teachers in school who managed to completely turn me off to the subject. Later in life, once I learned what mathematics is actually good for -- which is nearly everything -- I sat down with cheap used textbooks and Schaum's guides and started with algebra and worked through calculus, and then branched out into advanced mathematics. Right now, I'm teaching myself group theory. It is a bit harder to do it on your own without someone to answer questions when you get stuck, but I'm not sure that's actually a disadvantage in the long run: the concept you struggle to understand is remembered better than the one that is handed to you.

So now there are online courses. The difference between a MOOC and a book is what, besides lower information content?

Comment Re:Retina Displays? (Score 1) 377

I have no trouble looking at laser printer output and telling the difference between 600dpi and 1200dpi. The problem isn't that it's pointless for human vision, it's that many people have uncorrected or inadequately corrected bad vision, and the rest just don't pay attention. Consider seeing an optometrist.

But marketing to the common person in a way that is useful to them is not "bullshit".

True. But that's not what's happening here. What Apple is doing is using meaningless jargon to take advantage of their customers' technical ignorance. I don't actually have a problem with that in this case, since that ignorance is largely self-inflicted, but it's not customer service in any meaningful sense.

Comment Which is why I don't like Apple products (Score -1, Offtopic) 424

Technology must now work for everyone, not just 'computing enthusiasts.'

Not to single out Apple -- all of the major consumer tech vendors do the same thing to one degree or another -- but this is why I'm not terribly interested in their products. I am a technology enthusiast, and I do tweak settings and heavily customize my machines to suit me. As interfaces are dumbed down and the underlying feature sets are minimized, I have less and less use for the associated products. And to the extent that GNOME has followed suit, I find myself doing more and more at the command line, because there's a real limit to what can be accomplished efficiently with a mouse -- or at all by jabbing at a tablet with my fingers like a chimp in a lab.

But for the tendency of some major UI projects in the Open Source world to imitate corporate products, I wouldn't care. That tendency is frankly bizarre, since the average consumer doesn't care about any aspect of FOSS except free-as-in-beer, probably won't install any FOSS software anyway, and certainly won't ever crack open the hood. Meanwhile, skilled users who understand that a shallow learning curve also means a shallow power curve are increasingly out of luck.

Comment Only if there's severance pay involved... (Score 5, Insightful) 550

...and then be blandly pleasant. Otherwise, just don't do it. What are they going to do, fire you?

I'm always amused at the naive goodwill that people extend to their employers. Most of us live in at-will states, without unions, and without any real workers' rights that can be exercised without spending more on counsel than those rights are worth. These are the people who can fire you at any time for any reason, but they want two weeks' warning if you leave on your own. Why give them extra freebies?

Look, forget the employer-employee bullshit. You are a vendor, selling a service. Your employer is a customer. As long as they're buying what you're selling at the best price you can get (which includes work conditions and perceived job security as well as pay and benefits), the customer is always right. As soon as they stop buying, or you find someone willing to pay more, then go attend to your new customer. The old customer wants to take more of your time for free? Politely decline. You're running a business -- you -- and the only point in giving something away free is if it leads to another sale.

Don't bother with work ethic or pride in your job at this point. Those are good concepts and they have their place, but that place is well before anyone starts talking about exit interviews. If you're leaving voluntarily, they treated you well, and you feel like extending the courtesy, sure. But even then, don't say anything that can be used against you later. It's just business, and that's how they see it. Go and do likewise.

Comment Superficiality carried to its extreme (Score 4, Insightful) 639

Seriously? There was nothing more important or interesting going on than some nebbish mumbling about the importance of packaging? Even for Apple fanboyism, this reaches new depths. "The boxes sit on shelves serving as a constant reminder of the beauty within." I wish there were a more appropriate and genteel response to that than "Get a life!", but there you are.

Comment Missing the forest for the trees (Score 5, Insightful) 622

I've programmed in every major language and several minor ones from the 1970s to the present day, never mind design methodologies. They all have their relative strengths and weaknesses, but at the end of the day, the only things that really recommend one over another are a) what's available, and b) what you're most familiar with. No widely used language is "broken" any more than any natural spoken language is broken. No one ever says, hey, this novel would be much easier to write if we were taking advantage of the greater expressive power of Indonesian instead of kludgy old Lithuanian.

Aside from juvenile cliquishness and fashion obsession, every language flamefest starts with people obsessing on some awkward feature of the dominant language du jour, and then concluding that all of their problems would be solved if we all switched to some other language without that awkward feature. Of course, tomorrow's language (or methodology, editor, coding standard, platform) has its own awkward qualities that will only become apparent once it collides with the real world on a large scale, setting the stage for the day after tomorrow's language. Rarely does anyone pull their head out of their compiler/interpreter long enough to recognize that it's the real world that's awkward, and no amount of switching between fundamentally equivalent machine-parseable languages is going to change that.

Instead, we keep implementing the same stuff over and over in one language after another, until the pace of real progress slows so much that we can actually get excited that the document viewer we're trying to port everything over to is receiving "major" new features in HTML5 that will allow it to get a little closer to matching the desktop GUI functionality of twenty years ago -- only not as well, and with several orders of magnitude more hardware power required to keep it going.

But by all means, let's get rid of PHP if that makes it easier to imagine that we're doing something besides reinventing the same old wheel and doing it badly.

Comment Re:BULLSH!T (Score 5, Interesting) 147

Shannon's limit is a Mathematical principle.

Unfortunately, most people have next to no understanding of mathematics beyond some rote memorization from school. This is just another example of people confusing analog signals with magic. To be fair, the actual researchers involved probably understand this quite well, but the scientifically uneducated class from which science and technology journalists are drawn is another matter.

The non-mathematical version, for those interested, is that yes, analog signals are continuous and so can occupy an infinite number of states. The reason you can't get infinite bandwidth out of that is that both the transmitters and the receivers have limited precision, and that there is always noise, which is another manifestation of the Second Law. For example, there are an infinite number of real numbers between 0 and 1. If you could actually use all of that space, you could encode any amount of information in an arbitrarily short signal. (Well, there's a limit to that, too, for which see Georg Cantor.) In practice, you can't use all of that space, because your instruments might distinguish quite well between 0.001 and 0.002, but they can't reliably tell the difference between 0.001 and 0.0005. On top of that, there is noise, which is also a big topic, but you can think of it as a random fluctuation in the signal. If the ambient noise varies between 0.0 and 0.0005 in the same example, you can't even reliably tell the difference between 0.001 and 0.002.

What the parent is getting at is that laws of physics, being derived from observations of nature with limited precision, might occasionally be overturned by better observations. Fundamental mathematical principles, on the other hand, are much more reliable. There might be a difference between gravitational mass and inertial mass that we could exploit for thrustless propulsion. It's extremely unlikely, but it can't be ruled out. But there is zero possibility that 2 + 2 will ever equal anything other than 4. Shannon's limit and, for that matter, the Nyquist sampling theorem are a little more complex than a simple integer sum, but the actual math for both would fit on an index card, with plenty of room to spare to blather on about "infinite" analog signals. We use digital signals most of the time these days because it makes the hardware easier to design, but neither digital nor analog can be used to make an end run around the Second Law.
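Since we're talking index cards: the limit in question is the Shannon-Hartley theorem, C = B log2(1 + S/N). A minimal sketch, with illustrative figures (a telephone-line-like channel, not anything from TFA):

```python
import math

# Shannon-Hartley theorem: capacity C = B * log2(1 + S/N),
# where B is bandwidth in Hz and S/N is the linear signal-to-noise ratio.
def shannon_capacity(bandwidth_hz, snr_db):
    snr_linear = 10 ** (snr_db / 10)   # convert dB to a linear power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# ~3 kHz of bandwidth at ~30 dB SNR: roughly 30 kbit/s, finite no matter
# how "continuous" the underlying analog signal is.
print(shannon_capacity(3000, 30))
```

Note that noise enters the formula directly: as S/N grows the capacity grows, but only logarithmically, and it is finite for any finite signal-to-noise ratio.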

What the researchers in TFA claim to have figured out is another way to use part of the signal outside of the frequency domain to stuff data into. It's a really ingenious approach that might be quite useful if it pans out in actual practice, but it's not magic, and it's not infinite.

Comment This is why we're not here for the long haul (Score 3, Funny) 124

We figure out a way to enhance human mental acuity, and the very first thing we apply it to is training snipers. Interspecies communication? Military dolphins. Never mind nuclear physics.

If we were as good at anything as we are at killing each other and stealing each other's stuff, we might have a chance. Hell, if we were even more interested in something else -- and no, screwing doesn't count.

Comment Surprising, but it shouldn't be (Score 1) 69

So our first steps into bridging the biological/electronic divide involve what is essentially an artificial parasite? Somehow, I fear it will not be long before some medical researcher who has read too much Douglas Hofstadter thinks of taking it to the next level of abstraction by monitoring intestinal health using parasitic instruments attached to tapeworms.
