Comment Re:Pay attention to the professor? (Score 2) 134

When I was in college (2 years ago) I brought my laptop to most classes, simply because I can type about 4-5x faster than I can hand-write. The only thing running was emacs, but none of my professors minded.

The thing that makes this work for college is that I *want* to be there. If I really don't want to take a class, I just don't register for it. So if I'm sitting in a classroom, it's because I'm actually interested in what the professor has to say. Such a thing would never work in high school and below, precisely because attendance there is compelled.

Comment Re:2012 Year of the Linux UI? (Score 2) 81

A good GUI is easy to find. Here goes my recommendation:

1. Are you willing to undergo a learning curve? If yes, then you probably want a tiling window manager like awesome. I've never heard anything but praise for tiling window managers from those who actually use them.
2. If not, try one of the boxen. I recommend fluxbox. It's nice because there's almost nothing to learn. No UI paradigms pushed on you. Add a panel and it has all the GUI complexity of Win95 (which I'm putting forward as a good thing).

You will get crap options if you go with one of the "big three" (GNOME, KDE, Unity), but nobody says you have to stay there. If you prefer a more integrated experience than a bare window manager, then go with XFCE.

Sure, they won't spin your desktop like a cube. They won't make your windows close in a puff of smoke. They won't animate everything in 3D. Because it's a desktop, not a fucking video game.

Comment Turnabout Is Fair Play? (Score 5, Insightful) 202

Slashdot has so many comments boiling down to "Judges don't understand technology, and they look foolish when they rule on it anyways."

Then we have this article and its responses, which basically boil down to "A bunch of technologists don't understand the law and the mechanism of precedent, and they look foolish when they comment anyways."

Comment Re:EULAs (Score 1) 384

IANAL, but no, EULAs are not the same as free software licenses. The primary difference is that one is a *licensing agreement*, and the other is a *license*.

The concrete difference between the two is that a licensing agreement rests on contract law, in that it is an agreement between two parties. Generally speaking this means it must have (at least) two parties (software vendor and user) and there must be mutual consideration (they give you a license to software, you provide the agreement not to do certain things, and often toss in some money). A license rests on copyright law. Thus it does not require two parties or mutual consideration. Rather it is the unilaterally granted permission of a copyright holder to someone else to use their work.

The best place to see the difference is in the consequences of failing to adhere. The only way to fail to adhere to most free software licenses is to distribute binaries without the source and/or the license du jour. If you fail to adhere to the conditions of the license, then you have no license and are distributing a copyrighted work without the permission of the author, which is copyright infringement. If you fail to adhere to the conditions of an EULA, then you are in breach of contract.

A helpful overview: http://www.law.washington.edu/lta/swp/law/contractvlicense.html

Comment Re:The Irony... (Score 2) 302

I hope this is a joke. But being paid well to burn out on a job is not slavery. Being compelled through the threat of violence to labor against your will to no benefit of your own is slavery. And sadly, human trafficking is a real thing in the world.

This is more like an investment banker who donates some of his money to groups fighting rape. Sure, some people overly fond of hyperbole might say that he has "raped" the economy, but that doesn't make him a hypocrite because he has never committed rape, and is doing real good in the world.

I understand that what Google is trying to do here is purchase good will. But it worked - they bought mine. It's not a bad trend to start. Any corporation that decides to donate considerable sums of money to ending atrocities in the world will get my consumer loyalty.

Comment No they haven't (Score 5, Interesting) 89

The title is correct; judging from TFA, the summary appears to be wrong. They are not open sourcing anything. To quote TFA:

On December 13th, NVIDIA announced that it will open up the CUDA platform by releasing source code for the CUDA Compiler.

They will let you look at the code, and they might let you send patches back to them. Nowhere that I can find did NVIDIA promise anything along the lines of an open license, or indeed any license at all. This is more like a Microsoft shared-source deal, where you can look, but no rights or privileges are transferred to you.

That said, it would still be cool to see.

Comment Re:No, we need one *better* language, not "more" (Score 4, Informative) 421

The idiocy of this comment stems from the fact that its author must have no experience in programming language design. We are all quite aware that humans are the primary users of our languages. The problem is that it's not helpful to have the peanut gallery always yelling "that one doesn't make me happy, make it more soft and people-like. I don't want to have to map my mental model onto its; make it map its onto mine."

It's all well and good to say "make it understand English", but there are two primary problems with this. First, natural language programming is hard. Really hard. Just getting a computer to understand English with any reasonable reliability is pretty far in the future, and we can't wait for that. Second, we as humans don't really have much success expressing exactly what we want. It's why the most insidious bugs are not in code, but in specification. We so often don't know quite what we want that restrictive languages are actually beneficial, in that they force us to reason consistently.

And it's not some "saying floating around the internet", it's a very famous quote from Structure and Interpretation of Computer Programs, a seminal text in basic programming language theory and compiler/interpreter design. Most importantly, it's probably the first book you should read if you want to intelligently discuss this topic.

Another quote you might find interesting:

When someone says "I want a programming language in which I need only say what I wish done," give him a lollipop.

- Alan Perlis

In short, from someone who likes to design programming languages - stop assuming that just because a problem is easy to understand, it is easy to solve. We're not all basement-dwelling geeks who think UNIX is the pinnacle of end-user usability and newbs should just get over it. We aren't pretending that there is no problem, and we're not refusing to educate ourselves on how to solve it.

Comment Do your homework before going GPU (Score 3, Informative) 205

As someone who has done some GPU programming (specifically CUDA), be aware that there is more to the GPU parallelism model than just "lots of threads". Many embarrassingly parallel problems translate very poorly to CUDA. The primary things to consider are:

1. GPUs are *data parallel*. This means you need an algorithm in which each and every thread executes the same instruction at the same time (just on different data). A cheap way to evaluate this: if you can't speed up your program by vectorizing it, then the GPU won't help. Of course, you can have divergent "threads" on a GPU, but as soon as you do you've lost all the benefit of using it, and have essentially turned your GPU into an expensive but slow computer. (See the sketch after this list.)

2. Moving data onto or off of the GPU is *slow*. So if you can leave all the data on the GPUs and none of the GPUs need to communicate with each other, this will work well. If the threads need to sync up globally on a frequent basis, you're going to be in trouble.
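To make point 1 concrete, here is a minimal CUDA sketch (the kernel names, sizes, and branch condition are made up purely for illustration): the first kernel is data parallel, the second branches on the data itself and so causes warp divergence. The host code also hints at the "keep data resident on the GPU" idea from point 2.

#include <cuda_runtime.h>
#include <cmath>

// Data parallel: every thread executes the same instruction on a different element.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

// Divergent: the branch depends on the data, so threads within a warp take
// different paths and the hardware serializes them; the GPU advantage is lost.
__global__ void divergent(int n, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n)
        return;
    if (((int)x[i]) & 1)
        y[i] = sqrtf(fabsf(x[i]));
    else
        y[i] = logf(fabsf(x[i]) + 1.0f);
}

int main()
{
    const int n = 1 << 20;
    float *x, *y;
    cudaMalloc(&x, n * sizeof(float));
    cudaMalloc(&y, n * sizeof(float));
    // Point 2 in practice: cudaMemcpy your inputs up once, launch as many
    // kernels as you like on data that stays resident on the GPU, and only
    // copy the results back at the end.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    divergent<<<(n + 255) / 256, 256>>>(n, x, y);
    cudaDeviceSynchronize();
    cudaFree(x);
    cudaFree(y);
    return 0;
}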

That said, if you have the right kind of data parallel problem, GPUs will blow everything else out of the water at the same price point.

Comment Re:SSNs? (Score 3, Insightful) 279

I'm assuming this is sarcasm. If it is not, my apologies.

But no, there is no possible way to lock down a computer to prevent data from leaving it. You can mitigate the risk by limiting the amount of data that can leave, but you can't prevent data from coming off it (at least not while it is being used for its intended purpose at the DMV).

Sure, you can install filtering software to DPI everything leaving the machine, uninstall all text editors, and remove the ability to install additional software. But at some point the operator has to read information off the screen, and that's the analog hole right there. Even if sufficient surveillance is employed to prevent employees from writing SSNs down on paper, there is still the possibility of an employee just remembering them. Given that a 9-digit SSN doesn't really carry 9 full digits of entropy and names are generally easy to remember, I'm guessing an employee using mnemonics could easily recall 3 identities per shift. At $200 or more per identity, an extra $600 per shift is enough of a payday to motivate someone to try.
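For reference, a rough upper bound on that entropy (treating all nine digits as random, which real SSNs historically were not, since area and group numbers followed known patterns):

$H \le \log_2(10^9) = 9 \log_2 10 \approx 29.9\ \text{bits}$

That's on the order of the information in two or three random dictionary words, which is well within what a motivated person can memorize.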

Comment Re:Good job, wants some cheese for your whine? (Score 1) 357

This isn't a binary blob driver. It is an Oracle-maintained open-source driver.

Correct, which is why I never said it was a binary blob driver. I was referring to the OP's comment that refusing to support binary blobs is tantamount to NIH syndrome. I was explaining why it is a reasonable thing to do.

Comment Re:Good job, wants some cheese for your whine? (Score 2) 357

As a developer, I understand the frustration of dealing with someone elses shitty software that you have absolutely no control over..... has documented when they occur as proof, which means fixing them should be fairly trivial as well.

If you truly believe that just having a large collection of triggers to a bug is all that is required to render fixing that bug "fairly trivial", then I sincerely hope I never find myself on the same dev team as you.

Denying support for binary-blob drivers is a perfectly reasonable thing to do. The kernel developers have finite time for support. If they choose to spend that time investigating issues where they are not blocked by arbitrary restrictions on the tools they need to do their job, then fine. After all, given how difficult it is to debug without source (which you, as a developer, surely understand), I find it quite plausible that in the time it takes to fix one bug caused by an external binary-blob driver, they could fix ten others.

Remember, this isn't a case of somebody just whining instead of doing something useful. This is a comment by somebody who is not doing this useful thing *because they are busy doing other useful things*.

Comment Re:big loss (Score 1) 1251

It is possible to develop a statistical method that determines to an arbitrary level of confidence, if species A could have evolved from species B given time duration T

Not really. There are *many* more variables than the two species and the time. For example, there are thousands of environmental variables over the course of that evolution that we cannot possibly know, beyond whatever estimates we can extrapolate from the evidence that survived erosion and weathering over the intervening years. Many factors could influence the rates of mutation, birth, and death, along with the other drivers of evolution. There isn't a fixed speed of evolution.
