Comment Re:where does the burden of proof lie? (Score 5, Insightful) 747

Did you read the article? It said exactly that increased levels of CO2 will be mitigated by increased growth of green plant life, and that the current models are too aggressive in their estimations of negative effects. And this report was not from the oil industry, but from NASA and NOAA, both of whom have been vocal supporters of existing global warming models. Why did you immediately dismiss this new report in favor of scientists who lived one hundred years ago?

Why is it that when someone questions evidence of human-caused global warming, he's labeled a "denier" (a term intentionally chosen to evoke images of Holocaust denial, by the way), but when someone questions evidence that it's not as bad as previously thought, he's just doing the right thing?

The bottom line is that we don't really know what's going on. Ignoring evidence that doesn't support your claims is just bad science.

Comment Re:TSA (Score 1) 480

I think you missed the point. The comment about going to college wasn't meant to imply that a college degree will help you handle packages better. The idea was that if you go to college, you won't have to take a job that requires you to be out in the rain and cold at 0400. The main point, however, was that no matter what job you accept, you should be honorable enough to do it well, even when the conditions aren't to your liking. You've accepted a wage to provide a service; it's now your responsibility to provide that service to the best of your ability. If your answer is "they only pay me $x/hr, so I'm going to put my steel-toed boot through your Dell," then you need to find another job. I used "you" in this post as a generic pronoun. I wasn't talking about YOU, c0lo.

Comment Re:You're right. (Score 1) 1348

Or if you like upgrading to a better video card or CPU, or playing RTS and RPG games, or using a keyboard for FPS, or any one of many other reasons desktops will thrive as gaming platforms for the foreseeable future. I guess you could argue that consoles will come with keyboards at some point to make some of these things easier but I don't think that's a fair position. You'll still have trouble with upgrades, mods and other customizations.

Comment Re:Ignorance (Score 1) 490

"Look, the bottom line is that Apple users like Apple. So what?"

It happens to be relevant to the discussion, in this case why they have such dramatically increased loyalty to what amounts to an inferior system on an inferior network.

I think that the answer is exactly what I said. iPhone users like the iPhone more than they dislike AT&T.

If the inferior system you're referring to is the iPhone, I don't think that's a fair assessment. Certainly it's a different system, but for the vast majority of iPhone users I'd argue it's a superior one. Most users don't care about the closed nature of the platform and appreciate that they have about 250k apps available, compared to the 40k or so Droid apps. Just because other phones are arguably better for a power user doesn't mean they're better for the average user.

Comment Re:3M (Score 1) 646

There is no non-arbitrary starting point for a decade. The current decade could be 9 years and 7 months old, 2 years and 1 day old, or three seconds old.

The modern convention is that decades start in years ending with a 0. Yes, there are douchebags who will say "nu-uh, there was no year 0, so the decade doesn't end until the 0 year is over!!oneone!11! I burn you with my wiked smatzzz!" but people probably do equally douchey things like saying "If you're from Phoenix, does that make you a Phoenician?".

They fail to see that the whole damn system is arbitrary and that nobody is any more or less correct than any other person when choosing a starting point for his decade.

Comment Re:My Math Prof used Excel 4.blah on Win3.1 for th (Score 1) 131

Except that NumPy uses LAPACK and BLAS for its linear algebra, making it far more efficient. Try a QR decomposition on a matrix of any significant size in VB, then do the same decomposition using LAPACK, and you'll see a huge difference. As for numerical analysis being about writing efficient algorithms, sure, that's true, but why would you want to rewrite those algorithms when highly optimized versions come by default?

Disclaimer: Yes, I'm sure you could get VB to use LAPACK and BLAS, but Python will do it by default.
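To make the point concrete, here's a minimal sketch of the kind of QR decomposition being discussed. `np.linalg.qr` dispatches to the LAPACK routines shipped with NumPy; the matrix size here is just an illustrative choice:

```python
import numpy as np

# Build a random 500x500 matrix and factor it as A = QR.
# np.linalg.qr calls into LAPACK under the hood, so this is
# fast even for matrices of significant size.
rng = np.random.default_rng(0)
A = rng.standard_normal((500, 500))
Q, R = np.linalg.qr(A)

# Sanity checks: Q has orthonormal columns and Q @ R reconstructs A.
print(np.allclose(Q @ R, A))                 # True
print(np.allclose(Q.T @ Q, np.eye(500)))     # True
```

Hand-rolling the same factorization in straight VB (e.g. with naive Householder reflections and no optimized BLAS underneath) would be dramatically slower.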

Comment You really need to read some papers (Score 1) 279

What concerns me most is that you said that you haven't ever read a complete research paper. How can you be sure that your idea is new and worth publishing if you haven't done the requisite literature reviews? You really need to do your homework before you submit anything to a journal. Most journals will select referees for papers based on that paper's references. If you reference some papers by Dr. Smith and he is still alive, there's a good chance that Dr. Smith will be asked to referee. If you don't have reasonable references, your submission may be rejected out of hand.

Academic publication can be a very slow process. Don't be surprised if it takes a year for your paper to be officially accepted, and another six months before it shows up in print. DO NOT submit your paper to more than one journal. If you don't hear back, don't assume that you've been rejected. Contact the journal and find out what's going on before you send your submission elsewhere. The last thing you want is for more than one journal to accept your paper.

Comment Re:Par for the course? (Score 1) 510

Yes. In addition to a PS3, X360, and Wii there's also the iPhone and iPad that can be bricked via forced updates.

I've had an iPhone for quite some time and I don't ever remember a forced update. I always get a message informing me that an update is available and asking if I want to install it.

Comment DNA fingerprints are NOT UNIQUE (Score 1) 544

TFA is dead wrong. While DNA evidence can prove that a person didn't commit a crime, a false positive is still possible. If we collect DNA from everyone in the country as suggested, the number of coincidental matches will grow accordingly. Even if the odds of a false positive are only about 1 in a billion (Google it if you don't believe that number), then among the world's roughly seven billion people you'd expect several to match your DNA fingerprint. And that's just the ones who are currently alive.
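The expected number of coincidental matches is just the match probability times the population size. A quick back-of-the-envelope check, using the 1-in-a-billion figure from above and rough population estimates (both are assumptions, not exact numbers):

```python
# Expected coincidental DNA matches = match probability x population.
p_false_match = 1e-9        # assumed 1-in-a-billion random-match probability
us_population = 3.0e8       # rough US population
world_population = 7.0e9    # rough world population

print(p_false_match * us_population)     # ~0.3 expected matches in the US
print(p_false_match * world_population)  # ~7 expected matches worldwide
```

Small expected counts per country, but nonzero, which is the whole point once a database covers everyone.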

Comment When does hacking become an act of war? (Score 2, Interesting) 149

At some point (I think we're already there) our computer infrastructure becomes so important to a nation that a cyber attack could be construed as an act of war. I wonder how long it will be before we see a physical military response to a cyber attack. We've already seen evidence that China attacked Google's corporate infrastructure a few months ago. Is this really all that different from Chinese agents coming to the US and physically breaking into Google's buildings? To relate things to the article: if it could be shown that Iran was indeed attacking CIA sites, would the US be justified in bombing Iranian intelligence facilities? Just some food for thought.

Comment Are we moving toward 'iconerate' users? (Score 1) 951

I read a book some time ago in which most people were 'iconerate' as opposed to being fully literate. It meant that they could recognize specific icons and symbols and understand them, but that user interfaces had become so sophisticated that understanding beyond that wasn't needed.

While this isn't exactly the idea you've described, there is more than a passing similarity. Users fly through messages and menus quickly because they know what to expect. They remember that the blue thing with the W on it starts Word, and the green one with the X starts Excel. The problem with using this kind of system for error messages is that unless you're capturing very common errors, users will never get to the point where they immediately associate an icon with a specific problem. And if users see an error often enough to reach that level of understanding, they probably already know how to deal with it themselves. Which, come to think of it, may not be bad if they could come to associate an icon with specific recovery steps. But it's not going to help with the complex issues, which is where the help desk really needs it.

The puppy error idea may work in the short term, but I think the novelty will soon wear off and they'll all blur into "some cartoonish icon" in the user's mind.
