
Comment Re:Gov't contractors are not paid by the hour (Score 1) 253

>Forcing long hours on contractors and saying "well, we pay your company hourly" is an immoral load of bullshit. This is nothing less than government-sanctioned overtime fraud.

The federal government is the worst offender against labor laws in the country.

It's nice being part of the same organization that investigates and prosecutes offenses, isn't it?

Comment Re:Problems in C++ (Score 1) 386

>1) No, that's not the case. The difference between .at and operator[] is that .at() has a const overload and operator[] is not.

Are you sure about that?

http://www.cplusplus.com/refer...

>2) Most high performance apps (eg. games) turn off bounds checking in any case for performance reasons.

Which is fine.

Personally, I'd have preferred it if I could simply enable or disable bounds checking on [], so I could test my code to make sure I'm not going to fry memory, and then disable bounds checking on performance critical code for release.
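Something like this works as a stopgap - a minimal sketch, with a made-up BOUNDS_CHECK macro, routing element access through a helper instead of using [] directly:

#include <cstddef>
#include <vector>

// Hypothetical BOUNDS_CHECK macro: define it in debug builds, leave it out for release.
template <typename Container>
typename Container::reference elem(Container& c, std::size_t i)
{
#ifdef BOUNDS_CHECK
    return c.at(i);    // checked: throws std::out_of_range on a bad index
#else
    return c[i];       // unchecked: full speed, fingers crossed
#endif
}

// elem(v, 5) = 42;    // usage is the same either way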

>Uh, stringstreams? http://stackoverflow.com/quest...

I'm not saying that you can't write your own functions, just that the STL string class is not a drop-in functional replacement for string.h.

Comment Re:Problems in C++ (Score 2) 386

>>1. Dude ... if you want to query the size of an array, use a vector. No, it doesn't make your code less readable. And, no, you don't have to use .at() everywhere: C++ has this thing called operator overloading. Maybe you've heard of it. You can use array syntax with vectors. Use vectors.

Dude, don't use square brackets with STL arrays and vectors just to make your code more readable. The [] operator skips bounds checking, which is the main reason for using these classes in the first place. .at() is the proper method to use in pretty much every case, unless you are so confident in your bounds that it's worth the trivial speed increase in access time.
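For anyone who hasn't seen the difference, roughly (a minimal sketch):

#include <iostream>
#include <stdexcept>
#include <vector>

int main()
{
    std::vector<int> v(10);
    try {
        v.at(20) = 1;              // .at() checks: throws std::out_of_range
    } catch (const std::out_of_range& e) {
        std::cout << "caught: " << e.what() << "\n";
    }
    // v[20] = 1;                  // [] doesn't check: silently stomps memory
}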

>>2. I'd like to know what functionality you think you need string.h for when using C++ strings. I've found the standard library string type quite feature-complete.

The biggest gaps were filled in C++11 with replacements for atoi() and so forth, but there's still no replacement for strtok or some of the other functions in the core language.
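You can approximate strtok with getline on a stringstream, but it's a workaround rather than a replacement - a sketch, single-character delimiter only:

#include <sstream>
#include <string>
#include <vector>

std::vector<std::string> split(const std::string& s, char delim)
{
    std::vector<std::string> tokens;
    std::istringstream in(s);
    std::string tok;
    while (std::getline(in, tok, delim))   // getline doubles as a crude tokenizer
        tokens.push_back(tok);
    return tokens;
}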

>>3. C++ isn't an interpreted language; of course it won't have much reflection.

Sure. Makes life more difficult though by pushing those tests out to the linker instead of being able to code them directly from the language itself.

>>4. Forward declarations are not for saving the compiler time. They are for declaring a linkage interface with external code. If you ever even thought seriously about writing a C++ compiler you would know the language is not designed to make doing so easy.

Not my point. Quite obviously you need declarations for extern names not found in the file scope. But for functions within the same file scope, you still need forward declarations, which exist only to spare the compiler an extra pass over the code. That might have made sense in the 80s, but not today. Maybe it's not a huge deal, since it's just a copy, paste, and edit every time you change a function definition you're working on, but it otherwise serves no benefit.
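To be clear, the case I mean is use-before-definition within one file, which still requires a declaration:

#include <iostream>

void helper(int n);                            // forward declaration, or caller() won't compile

void caller() { helper(42); }                  // helper() is used before it is defined

void helper(int n) { std::cout << n << "\n"; }

int main() { caller(); }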

Also, there shouldn't be much need for #ifdef guards any more. It's 2015. We should be able to include the same function definition twice without the universe breaking.
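It's not like the guard boilerplate is hard, which is sort of the point:

// widget.h -- classic include guard, so a second #include is a no-op
#ifndef WIDGET_H
#define WIDGET_H
struct Widget { int id; };
#endif

// Or at the top of the header instead of the guard (non-standard, but near-universally supported):
// #pragma once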

Comment Problems in C++ (Score 1) 386

>What kind of complexities has modern C++?

1. C++ still doesn't let you query a C-style array to determine its size, even though that information is tracked for dynamic arrays anyway, and can be calculated for statically defined arrays within their own scope.

So every function using C-style arrays must also pass in a size_t holding the array size. This hurts readability by wasting room in the parameter list, and exposes you to buffer overflow errors.

Legal:
int arr[10];
for (int i : arr) cout << i;

Illegal:
void write_array(int arr[]) { for (int i : arr) cout << i; }

STL arrays and vectors are obviously better, at the cost of decreased code readability. Square-bracket accesses are easier to read than .at()s everywhere.
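With a vector the size travels with the object, so the illegal example above becomes something like this (a minimal sketch):

#include <iostream>
#include <vector>

void write_array(const std::vector<int>& arr)
{
    for (int i : arr) std::cout << i;    // the size is carried by the container
}

int main()
{
    write_array({1, 2, 3});
}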

2. Strings. Even with the string type, it is still shitty to use, still has terrible support, and you still have to use the C library string.h for some functionality, since they've been too lazy to rewrite all of it into the C++ standard library. This means that people wanting to use the C++ string class still need to know the C way of doing things, and are still vulnerable to the same off-by-one errors that have been around for decades.
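The classic trap, for anyone who hasn't hit it - strncpy doesn't guarantee a null terminator, while the std::string version has no length bookkeeping at all (a sketch):

#include <cstring>
#include <string>

void copy_name(const char* src)
{
    char buf[8];
    std::strncpy(buf, src, sizeof(buf));   // if src has 8+ chars, buf is NOT null-terminated
    buf[sizeof(buf) - 1] = '\0';           // the fix everyone forgets at least once

    std::string safe = src;                // no buffer size, no terminator, no off-by-one
    (void) safe;
}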

3. Almost no reflection capabilities.

4. The language still enforces rather idiotic rules about class and function definitions that modern languages have done away with, and for the better. It's not like putting #ifdef guards on your code is difficult, but in these modern times it should not be necessary. And forward declarations are a way of saving compiler time at the expense of programmer time. This is the opposite of what should be happening. Compilers are there to make programmers' work easier, not the other way around.

5. C++11 and later have made great strides in simplifying the life of programmers, but the language's accumulated cruft still shows.

Comment Re:Stop trying to win this politically (Score 1) 786

>If you want to talk about science, then show me a tested climate model that has been subjected to an empirical test of its validity. It isn't that hard guys. We have a lot of very accurate historical data. Feed in past climate data and see if your climate model can predict the past or the present accurately. The first model that can do that which isn't just a collection of plug variables is something worth taking seriously.

What? No.

You have it completely backwards. All serious models are trained on and tested using historical data. If they can't even predict the past, what use are they?

But - here's the key point - predicting the past is *worthless* other than as a sanity check. As Garrison Cottrell told me, predicting the past is easy (even trivial). It's predicting the future that is hard.

The only way to really know for sure if a model works is to test it moving forward. And the IPCC doesn't have a great track record at that.

Comment Re:C++ (Score 1) 242

>STL sucks, I still have to do single character input and output from files, so much for getline BS

Gah, I/O in C++ is so horrible. In just the last month I've come across the following:

1) No platform-independent way to do non-blocking I/O.

2) No iostream-compatible way of doing dup() or dup2(). You can change the buffers on iostreams, but this is not the same thing.

3) Just how shitty iostreams are at processing input files in a fault-tolerant manner. On any major project, I always seem to end up dropping down to reading files one character at a time.
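By "one character at a time" I mean roughly this - keep reading no matter what garbage is in the file (a minimal sketch; data.txt is just a placeholder name):

#include <fstream>
#include <iostream>
#include <string>

int main()
{
    std::ifstream in("data.txt", std::ios::binary);   // hypothetical input file
    std::string current;
    char c;
    while (in.get(c)) {                 // get() reads one char: no skipping, no parsing
        if (c == '\n') {
            std::cout << current << "\n";   // handle the accumulated line, bad data and all
            current.clear();
        } else {
            current += c;
        }
    }
}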

Comment Re:No locks (Score 1) 449

>Also, some problems can't be done in parallel, but we won't know how many can until we start trying....and then try for a few decades.

Right, but there's also a grey area between completely serial and embarrassingly parallel, in which methods like this will allow scaling algorithms up from "a few" computation nodes to "many", with the optimal numbers depending on the specific algorithms.

The biggest problems are still the same ones that existed when I got my Master's over a decade ago. Language support for parallelism isn't very good (I personally used MPI, which was awkwardly bolted on top of C++), it requires a certain amount of specialized knowledge to write parallel code that doesn't break or deadlock your machine (and writing optimized code is a bit more advanced than that), and library calls aren't all threadsafe. On the plus side, a lot of frameworks and libraries are now multithreaded by default, which nicely isolates the problems of parallel computing away from people who haven't been trained in it, and gives the benefits of parallel computing with only the downside of having to use a framework. =)

Comment No locks (Score 2) 449

Ungar's idea (http://highscalability.com/blog/2012/3/6/ask-for-forgiveness-programming-or-how-well-program-1000-cor.html) is a good one, but it's also not new. My Master's is in CS/high performance computing, and I wrote about it back around the turn of the millennium. It's often much better to have asymptotically or probabilistically correct code than perfectly correct code, when perfect correctness requires barriers or other synchronization mechanisms, which are the bane of all things parallel.

In a lot of solvers that iterate over a massive array, only small changes are made at any one time. So what if you execute out of turn and update your temperature field before a -0.001 C change comes in from a neighboring node? You're going to be close anyway. The next few iterations will smooth out those errors, and you'll get far more work done, in a far more scalable fashion, than if you maintain rigor where it isn't strictly needed.
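As a rough sketch of what I mean (not Ungar's actual code): threads update a shared temperature field with relaxed atomics and no barriers, and stale neighbor reads just get smoothed out by later iterations:

#include <atomic>
#include <cstddef>
#include <thread>

// Shared 1-D temperature field; each cell relaxes toward the mean of its neighbors.
std::atomic<double> field[1024];

void relax(std::size_t lo, std::size_t hi, int iters)
{
    for (int it = 0; it < iters; ++it)
        for (std::size_t i = lo; i < hi; ++i) {
            // Neighbor reads may be slightly stale -- that's the point: no locks, no barriers.
            double left  = field[i - 1].load(std::memory_order_relaxed);
            double right = field[i + 1].load(std::memory_order_relaxed);
            field[i].store(0.5 * (left + right), std::memory_order_relaxed);
        }
}

int main()
{
    std::thread a(relax, 1, 512, 100);     // two workers; reads overlap at the seam
    std::thread b(relax, 512, 1023, 100);
    a.join();
    b.join();
}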

Comment San Diego (Score 3, Informative) 285

I live in San Diego some of the time, and similar results were reported here, too. The increase in rear-end collisions from people slamming on the brakes negates any benefit from reduced T-bone collisions.

San Diego also reduced yellow light times, sometimes to below the legal limit, in order to boost revenue.

A judge looked at the program in 2001, said, "That's bullshit," and banned it for a year; the government finally ended it on its own in 2013.

Comment Re:freedom 2 b a moron (Score 1) 1051

>Why? Excluding religion, there is no reason to believe that vaccines cause any harm: literally every study attempting to find otherwise has either failed or been proven fraudulent.

Uh, no. You're grossly misrepresenting the case.

"Any harm" - really? All vaccines (heck, all medicine in general) carry a risk of adverse effects. There are common and minor adverse effects, and rare and serious adverse effects, including febrile seizures, allergy to the eggs used in the formulation, and so forth. What the scientific consensus is is that *vaccines are still worth it despite the risks*. That's why we don't give vaccines any more for viruses no longer in the wild - the benefit is no longer worth the risk.

From the CDC (http://www.cdc.gov/mmwr/preview/mmwrhtml/00046738.htm), adverse effects include:
1) HepB: Pain at the injection site (3%-29%)
2) HepB: Fever over 100°F (1%-6%)
3) HepB: Anaphylaxis (1 in 600,000)
4) MMR: Fever over 103°F (5%-15%)
5) MMR: Rashes (5%)
6) MMR: Joint pain (3%)
7) MMR: Febrile seizures, which caused the vaccine to be reformulated to reduce the risk
8) MMR: Aseptic meningitis, which led to a switch of vaccine strains in some countries

And so forth. All of these figures come from studies that found harm from vaccines - contrary to your claim that every such study has failed or been proven fraudulent.

I think you read a headline once that said, "No link between autism and vaccines" and falsely extrapolated that to mean "no reason to believe vaccines cause any harm".

Comment Re:Metacritic (Score 1) 91

>It is a big deal. The game comes with a built-in expiration date, which is a mystery. When EA is done with it, you're done with it. And to rewind...

The multiplayer may very well come with an expiration date. EA is pretty horrible in that respect.

The single player works even if the servers are down, and single player is the focus of the game.

>This is why I don't give money to fuckheads like EA and Ubisoft, and why you shouldn't either, and why I think you're an asshole for doing so. You're helping fuckheads be fuckheads.

I haven't given a dollar to Ubisoft since they implemented UPlay (except once accidentally when I bought a game without checking the publisher). On Origin, I own exactly 2 games (Mass Effect 3 and DA:I). On Steam I own 438. Their DRM system is the main reason why I refuse to support them with my dollars. But as I said, the DA:I DRM isn't as mind-bogglingly stupid as SimCity's.

Comment Metacritic (Score 2, Informative) 91

The user Metacritic scores were very low for this game, whereas the critics' reviews were pretty high. This is the first time I can remember actually siding with the critics over the users. As far as I can tell, the users were just giving it bad scores because of the DRM. After debacles like SimCity, people are very, very leery of EA's DRM policies, and in fact DA:I has presented some problems for people doing benchmarks and the like (it detects the hardware changes and locks you out of the game after 4 or 5 changes). That said, DA:I will continue to work even if the EA servers go down (which they have) - you just can't play multiplayer. No big deal.

The game itself is amazing. Great story, amazing graphics, open(-ish) world with non-linear(-ish) design, challenging combat (I'm playing on Hard, so I can't comment on other modes), and an absolute ton of side missions to do with your companions that tie back into the non-interactive missions you can send your army on across the world. I highly recommend it for anyone who likes RPGs. It's the best CRPG I've played since Fallout: New Vegas.

Comment Re:So it is not an accurate Documentary Film? (Score 1) 289

>Just to clarify, I believe Kip Thorne is the physicist who was a consultant on Interstellar, who made efforts to make the move more scientifically accurate than what Nolan could do on his own

And failed utterly. There are so many horribly bad manglings of physics in the movie that he's reduced to salvaging his reputation by saying that, on just one of the two dozen serious errors, it's maybe sorta possible it could be that way.

He should be ashamed of himself for granting the movie his imprimatur.
