
Comment Re:"Science" == "Argumentum ab auctoritate" ?!?!?! (Score 4, Informative) 247

Cite Knuth... This is, of course, good science.

Well at least Professor Knuth is still alive, and I don't [YET!] need to refer to the poor man as spinning in his grave.

An AC posted an excellent response here. In the event you're filtering ACs, take the time to read it, as it's completely on point.

What I would add is this: if you've never completed a Master's thesis or doctoral dissertation, just try submitting one to your committee without adequate citations. If you write somewhere "I used well-known algorithm ABC because of XYZ" and you don't have a citation for that algorithm, you'll be sent back for rewrites pretty quickly to add appropriate citations.

By way of example, in my Master's thesis several years ago, I mentioned Unix diff without a citation. Why would this need a citation? It was mostly mentioned in passing, and every computer scientist under the sun knows what diff is, right?

The committee came back asking for further citations on a few things, including diff (which, for the record, is "Hunt, J. W., and McIlroy, M. D. An algorithm for differential file comparison. CSTR, 41 (1976).")

Using citations isn't an appeal to authority. It's akin to using an existing library call in programming. Just as you wouldn't roll your own quicksort when coding, someone writing a scientific paper doesn't re-derive every algorithm ever published. You find someone who has already done that work, and you cite them. The AOCP is useful in this regard because of the sheer number of algorithms Knuth describes. It's hard to go through a computer science program and not use one of them. Knuth himself likewise cites the sources for the algorithms in the AOCP, so it's not an appeal to his authority; he delegates that out appropriately. It's simply useful because instead of having to track down papers written in the 1960s on your own, you can cite Knuth, who cites those papers for you. This is why the AOCP is useful for a graduate student.

FWIW, I cited Knuth. I needed an algorithm to calculate variance, and another for the Box-Muller transform. The Art of Computer Programming had one for each, which I adapted for my needs and cited appropriately.
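For the curious, here's roughly what those two look like -- a minimal Python sketch rather than the code from my thesis, with function names that are purely my own:

    import math
    import random

    def running_variance(samples):
        # One-pass mean/variance recurrence (Welford's method, the kind of
        # stable update described in the AOCP, Vol. 2).
        n, mean, m2 = 0, 0.0, 0.0
        for x in samples:
            n += 1
            delta = x - mean
            mean += delta / n
            m2 += delta * (x - mean)
        return mean, (m2 / (n - 1) if n > 1 else 0.0)

    def box_muller():
        # Box-Muller transform: two uniform variates in (0, 1) become two
        # independent standard normal variates.
        u1, u2 = random.random() or 1e-12, random.random()
        r = math.sqrt(-2.0 * math.log(u1))
        return r * math.cos(2 * math.pi * u2), r * math.sin(2 * math.pi * u2)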

Yaz

Comment Re:knuth's art of computer programming (Score 5, Informative) 247

They're kind of dated, because few people do sorts and list manipulation at that level any more. I have both an original edition and a current edition of vols. 1-3, but haven't looked at them in years.

Sure, for the average programmer these days who relies on existing libraries, these probably aren't all that useful.

As a grad student working on a thesis and other papers however, Knuth's books are invaluable for citations. Need to defend the use of a specific algorithm? Cite Knuth. His books were invaluable citation material for when I wrote and defended my thesis a few years back.

This is, of course, good science. You may not need to use Knuth to program your own B* tree, but you have a pretty much universally accepted reference for citation if you use one in your research.

Yaz

Comment Re:Wasn't allocation always the problem? (Score 1) 306

I call BS, it would only take that long if it was a low priority job. If they were told in no uncertain terms to sort it out or be kicked out of the internet I'm sure they could deal with it much quicker than that.

Perhaps, but it's still potentially a very large, costly job, which probably won't recover enough addresses to make it worth anyone's while. It would still take them at least a few months.

The problem here is how organizations with a large allocation (like a /8) have assigned those addresses internally. Typically, they don't dole out the addresses in a completely contiguous manner -- they may have done something akin to setting up a /16 for each building (they would have received their address block before CIDR, and thus would have had to split things along classful lines), out of which different labs may have received a /24 to use however they wanted. Readdressing all of these and setting up new routes for all of these subnets is a big job for a large organization like MIT. You'd have to combine subnets, which would change the routing topology, and compress everything down into a few /16s to make the returned address space contiguous.

You could return non-contiguous space; however, this has a serious negative impact on worldwide routing tables. You can't just add a few million /28s to the global routing table (that is, you can't just say "hey, here's a few hundred thousand non-contiguous groups of 16 addresses we aren't using, let's give them back!").
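To make the routing-table point concrete, here's a quick sketch using Python's ipaddress module (the 18.x addresses are just standing in for a pre-CIDR /8):

    import ipaddress

    # 256 contiguous /24s (18.0.0.0/24 .. 18.0.255.0/24) collapse into a single
    # /16 -- one route announcement for the rest of the world to carry.
    contiguous = [ipaddress.ip_network(f"18.0.{i}.0/24") for i in range(256)]
    print(list(ipaddress.collapse_addresses(contiguous)))      # [IPv4Network('18.0.0.0/16')]

    # Scattered /28s pulled from all over the /8 don't aggregate at all; every
    # one of them would need its own entry in the global routing table.
    scattered = [ipaddress.ip_network(f"18.{i}.7.16/28") for i in range(0, 256, 8)]
    print(len(list(ipaddress.collapse_addresses(scattered))))  # 32 separate routes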

And after putting all that effort into making their address space more contiguous (while still allowing room for future growth), they'd probably wind up with enough addresses to extend IPv4 for a month or two at best -- at which point, they might as well have put the effort into migrating to IPv6 instead.

Giving back unused address space only slightly delays the inevitable; it doesn't postpone it indefinitely. If you're going to do the work, you might as well do it right the first time and get everything running on IPv6.

Yaz

Comment Re:About time! (Score 1) 306

Once we finally move on to IPv6, can we all have our own static IP?

That's a good reason to push it.

Actually, you get a prefix -- either a /48 or a /64, from which you can assign your own addresses. A /64 is enough to give you more addresses than the entire public IPv4 Internet. How you use them internally is up to you.
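If you want a feel for the numbers, a back-of-the-envelope sketch:

    # One /64 leaves 64 bits of host space; the entire public IPv4 Internet
    # only ever had 32 bits to work with.
    ipv4_internet = 2 ** 32       # 4,294,967,296 addresses
    one_slash_64  = 2 ** 64       # 18,446,744,073,709,551,616 addresses
    print(one_slash_64 // ipv4_internet)   # 4,294,967,296 -- about 4.3 billion IPv4 Internets per /64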

Yaz

Comment Re:About time! (Score 1) 306

if anyone back then had seen this coming that clearly, they'd have just used 64 bits to start with and we'd be fine for the next thousand years.

Except that on an 8-bit computer running at only a handful of MHz, using 64-bit addresses right from the start would have carried a performance penalty. There would have been more packet overhead and more address processing required.
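As a rough, made-up-numbers illustration of the overhead (assuming the non-address header fields stayed at the 12 bytes IPv4 actually uses):

    non_address_fields = 12                    # bytes of an IPv4 header besides the two addresses
    hdr_32bit = non_address_fields + 2 * 4     # 20 bytes: real IPv4 header with two 32-bit addresses
    hdr_64bit = non_address_fields + 2 * 8     # 28 bytes: hypothetical header with two 64-bit addresses

    payload = 64                               # a small packet, e.g. a single terminal keystroke
    print(hdr_32bit / (hdr_32bit + payload))   # ~0.24 -- roughly 24% of the packet is header
    print(hdr_64bit / (hdr_64bit + payload))   # ~0.30 -- roughly 30%, on hardware counting every byte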

This may not seem like a bad trade-off for clients, but you have to consider what would have happened to routers in the '70s and '80s had 64-bit addresses been the norm. Won't somebody please think of the routers?

Yaz

Comment Re:Bloody Idiot (Score 1) 588

How long have we been vaccinating kids for? How long have we known about "autism"?

History of Autism

"Autism" as it is currently used was defined in 1938. The first vaccines were developed in the late 1700s, however, the first of the components of what are now the MMR vaccine were introduced in 1963 with the first measles vaccine (Timeline of Vaccines)

.

They tried that crap on my own kid who didn't behave well in school. Instead, I tried more discipline and a stricter policy and now he's a "Straight A" student.

Really, so how many times a day should I beat my autistic daughter who is completely unable to speak because of her condition? Do you recommend using paddles, straps, or electrocution? Maybe I should just lock her in a closet and feed her a bucket of fish heads once a week? Please Doctor Anonymous, share your wisdom!

Yaz

Comment Re:Knowledge (Score 1) 1037

Well, not necessarily. There is no scientific way I could think of that lets us tell what happens with our "soul" after death.

Of course not. You'd have to prove the existence of a soul scientifically before you could even start to answer such a question.

It would be no different than asking a zoologist about the mating patterns of the one-eyed, one-horned, flying purple people eater. All research is built upon the foundations of prior research; as there is no scientific evidence for a one-eyed, one-horned, flying purple people eater, there is no logical place to start in trying to determine what its mating patterns might be like.

You have to be careful with such statements, as they're the sorts of arguments people of faith like to try to use against science (i.e.: "But science can't prove/disprove X", where X is some construct for which there is no scientific basis in the first place, but which the speaker treats as a given). This is a fallacious line of argument, one which nobody can ever actually learn anything from.

Yaz

Comment Re:Better encourage rather than confront (Score 1) 98

I was using unblock-us for a while, and it worked flawlessly. I only stopped as there wasn't enough additional content on US netflix for me to justify paying for it.

IPv6 tunnels are fortunately free. And as I mentioned, if you have router support for it, then every Mac, PC, and Linux box in your house will automatically be provisioned for end-to-end IPv6 access to Netflix (and anything else IPv6 accessible on the Internet), along with any set-top boxes which may use IPv6 (Apple TV apparently does, but I don't own one to be able to confirm this).

Yaz

Comment Re:Better encourage rather than confront (Score 1) 98

Canadian Netflix is pretty crappy compared to the American version and we don't have much else. It's not like the content companies want to sell their products here, at least in an easy to purchase downloadable format

Pro tip:

Netflix is fully IPv6 enabled, which is actually great news for Canadian Netflix users. Just set up an IPv6 tunnel to the nearest Hurricane Electric tunnel server (if you have a router that supports this, you can enable IPv6 invisibly for your entire home quickly and easily; Apple's routers all support this out of the box, for example), and presto -- you'll have US Netflix.

Note that this only works on IPv6-enabled devices, of course, so your set-top box or smart TV may not benefit. You also have to ensure the browser you're using properly supports Happy Eyeballs so that it will prefer IPv6 over IPv4. (Safari on Mac OS X since Lion prefers whichever connection responds fastest, which can cause it to initially load Netflix via IPv6 -- showing all the US content you can't otherwise see in Canada -- only to block you when you actually try to view something, if OS X has switched back to IPv4 for speed.)
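If you want to confirm the IPv6 path is actually there, independent of whatever your browser decides to do, here's a quick-and-dirty Python check (the hostname and port are just the obvious guesses; adjust to taste):

    import socket

    # Resolve an AAAA record for Netflix and open a TCP connection to it over
    # IPv6 only; if this succeeds, the tunnel is carrying traffic end to end.
    infos = socket.getaddrinfo("www.netflix.com", 443, socket.AF_INET6, socket.SOCK_STREAM)
    family, socktype, proto, _, addr = infos[0]
    with socket.socket(family, socktype, proto) as s:
        s.settimeout(5)
        s.connect(addr)
        print("IPv6 path looks good:", addr[0])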

As I have IPv6 tunnelling enabled right at the router, there is no software to install or anything to configure once this is set up, unlike VPN/proxy solutions. It's also fast -- even though the IPv6 is tunnelled, I can't perceive any speed issues when watching content this way.

Enjoy!

Yaz

Comment Re:That's only part of the story. (Score 2) 60

$5000 per infringer (not per infringement) is the maximum. The minimum is $100, and I've heard that courts are more likely to impose the minimum. The plaintiff either has to prove actual damages, or can apply for statutory damages of between $100 and $5000 at the judge's discretion. The Copyright Act stipulates that the judge needs to consider whether the infringement was for non-commercial purposes, whether it was for private purposes, and whether paying would constitute a hardship for the defendant.

Yaz

Comment Re:SAT is not a brute force loop (Score 3, Interesting) 189

SAT is clearly NP complete, and clearly the existence of good SAT solvers is not a proof that P=NP. This means that there will be relatively small problems that SAT solvers won't be able to solve.

I enjoyed your post, but I have a small quibble.

From a mathematical standpoint at least, being NP-complete doesn't imply that some problems are unsolvable; merely that they won't be solvable in any reasonable amount of computing time. Given a few hundred billion years of compute time, a SAT solver could solve even those small problems you mention. Of course, from a practical perspective, none of us would be around to see the result, which makes them unsolvable in practice.
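To put a purely illustrative number on that:

    # Brute force over n boolean variables means 2**n candidate assignments.
    # Assume a generous 10**9 assignments checked per second:
    SECONDS_PER_YEAR = 60 * 60 * 24 * 365

    def brute_force_years(n_vars, checks_per_second=10**9):
        return (2 ** n_vars) / checks_per_second / SECONDS_PER_YEAR

    print(brute_force_years(60))    # ~37 years
    print(brute_force_years(100))   # ~4e13 years, thousands of times the age of the universe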

(On the other hand, once the billions of aeons roll by and the machine goes 'ding' and spits out an answer, we do know that we can verify it in poly time. Huzzah!)

While all of this may seem ultra-pedantic, there is enough confusion about NP out there that someone reading your post might get the idea that NP-complete problems are unsolvable. They're not unsolvable -- we can typically fashion algorithms to solve them; it's just that the best known algorithms take exponential time in the worst case, and thus may have runtimes exceeding the expected lifetime of the solar system, even with every compute cycle ever manufactured thrown at them.

...unless, of course, someone comes up with a proof that P = NP, in which case all those NP-complete problems would turn out to be solvable in polynomial time. Sure, they might still take a few hundred billion years to solve, but at least we'd know how many hundreds of billions of years would be needed!

Yaz

Comment Re: Bullshit (Score 1) 389

It's really funny to think that Mac OS X, an OS which many Windows users consider to be aimed primarily at the least technically proficient users in the world, has had virtual desktops for seven years now. So if Apple can figure out how to provide this feature, why can't Microsoft?

Yaz
