Comment: Re:Still half-assed C++11 support (Score 1) 198

by AstrumPreliator (#45161759) Attached to: Visual Studio 2013 Released

My last post was a bit sparse on details, so I'll try to improve upon it here. The main point I was trying to make was that defining a function as constexpr doesn't mean it will run at compile time; that only happens under certain conditions. This obviously isn't a bad thing, as you can reuse the function at runtime.

If you have a constexpr function, all of its arguments are constant expressions (such as literals), and it is used in a constant expression (a switch case label, an array size, a non-type template parameter, etc.) or to initialize a constexpr variable, then it is guaranteed to run at compile time according to the standard. Otherwise it is up to the compiler. I've noticed a lot of people think that merely passing constant expressions to a constexpr function is enough to guarantee compile-time evaluation, but it's not. For instance:

#include <iostream>
using namespace std;

constexpr int factorial(int n) { return n <= 1 ? 1 : n * factorial(n - 1); }

int main() {
      char c[factorial(5)]; // Compile time: array bounds must be constant expressions
      int n = 3;
      switch (n) {
            case factorial(3): // Compile time: case labels are constant expressions
            break; }
      constexpr int i = factorial(4); // Compile time: initializes a constexpr variable
      cout << factorial(8) << endl; // *Compile time or runtime*: up to the compiler
      int j;
      cin >> j;
      int k = factorial(j); // Obviously runtime
}

Here is a page that talks about it (especially in the comments). I was also mistaken about clang and gcc. At higher optimization levels the factorial of 8 will be computed at compile time rather than runtime.

Comment: Re:Still half-assed C++11 support (Score 2) 198

by AstrumPreliator (#45160131) Attached to: Visual Studio 2013 Released
You need to be careful with constexpr, as it is not guaranteed to be evaluated at compile time. I don't have a link handy, but if I remember correctly, the only time a constexpr function is guaranteed to be evaluated at compile time is when all of its arguments are constant expressions and it is used in a constant expression. Compilers are of course free to evaluate constexpr functions in other situations, although to my knowledge neither clang nor gcc does this yet.

In your example, "show"_hash and "fill"_hash should be evaluated at compile time. However, if you had someFunc(int hash, int runTimeParameter); and you passed "show"_hash to someFunc, there is no guarantee it would be evaluated at compile time.

Comment: Re:Only one thing to do! (Score 2) 322

by AstrumPreliator (#44649847) Attached to: Open Source Mapping Software Shows Every Traffic Death On Earth
You know, if you want gun control, how about actually digging deep and making a decent argument for it?

You say there were 30,000 gun deaths in 2010 but provide no reference. Okay, fine, I'll give a source. As it turns out, the number is 31,672 firearm deaths in the United States in 2010. If you open the PDF you'll find that 61.2% of those (19,383) were suicides and 35% (11,085) were homicides. I haven't looked at exactly where justifiable homicide is counted, but according to the FBI statistics it's only a few hundred.

In either case, the bulk of the deaths come from suicide and homicide. Let's focus on suicide real quick. One argument could be made that stricter gun control would lessen the number of firearm-related suicides, given how accessible and lethal firearms are. According to the data, suicide was the 10th leading cause of death in the United States in 2010, at 38,364 deaths. I assume this includes the previously mentioned 19,383 suicides by firearm. So would stricter gun control lessen the overall number of suicides, given that only around 50% of them involve firearms? If so, by how much? That is a valid topic of debate and one you could put forth in favor of stricter gun control.

Now let's focus on the homicides, 35% of all firearm-related deaths. This is actually a pretty complex subject that is rather hard to find data on. For instance, would better education lower firearm-related deaths such as gang shootings? Would stricter gun control have a similar effect, or would those homicides simply be shuffled into other categories such as blunt objects or knives? How does the prohibition on drugs, and the resulting black market and illicit trade, affect violent crime, particularly with firearms? I don't have any data to link to off the top of my head. Feel free to supply data, studies, and sources that strengthen your position that gun control would reduce these numbers, preferably more than other means would.

Now see, that is the beginning of a good debate on gun control. Granted, it's only one facet of the issue, but it's better than throwing around a misleading number and claiming gun control will solve everything.

P.S. Chances are you'll die of heart disease or cancer. Smoking and being out of shape are vastly more likely to kill you than a gun is.

Comment: V12 (Score 4, Informative) 68

by AstrumPreliator (#41416037) Attached to: Torque3D Engine Goes Open-Source
Actually, Tribes 2 used the V12 engine. This later became the Torque Game Engine, then Torque Game Engine Advanced, then Torque3D, if memory serves. The V12 engine was itself an improvement over the Darkstar engine used for the original Tribes. Before that I have no idea, but this engine has been getting updates for at least 15 years.

It's not exactly the best engine in the world, but open-sourcing code is never bad. So thanks, GarageGames!

Comment: Re:Linking to Wikipedia to explain math (Score 1) 102

by AstrumPreliator (#41297369) Attached to: Possible Proof of ABC Conjecture
Then Wikipedia math articles should never *ever* be referred to in a general context to introduce an unfamiliar subject to anyone.

Given that most readers of this website are in the software engineering field and haven't studied advanced mathematics (though a lot probably have), I'd say you're right in this context. If this were posted on a math forum, the Wikipedia article would probably be an appropriate explanation. Similarly, I probably wouldn't link to the Wikipedia article "P versus NP problem" on a chemistry-centric website, since it wouldn't explain much to readers who lack the necessary background. It's all a matter of who your audience is.

This is a damning indictment of Wikipedia's mission. What good is information if it cannot be understood? It may as well be Viking runes.

Even though number theory wasn't my area of study, I understood the article well enough. It's not indecipherable, merely specialized.

The http://abcdathome.com/ website provides an explanation of what ABC triples are, how to figure them, and what the conjecture is, its implications, and all that, in plain, understandable English, because ABC triples are just mere arithmetic when you get down to it. But you wouldn't know that from the Wikipedia page, which is a disaster.

The website you linked (even though the link has a typo in it) is a teaching resource. Like I said, it's far better at conveying the meaning in an understandable way to someone unfamiliar with the subject matter. I also agree that something like it should have been used rather than the Wikipedia article, given this website's demographic.

Comment: Re:Linking to Wikipedia to explain math (Score 1) 102

by AstrumPreliator (#41297063) Attached to: Possible Proof of ABC Conjecture
Of course Wikipedia isn't a great teaching resource; it is ostensibly a databank for knowledge (and crazy admins). Teaching resources take that knowledge and convey it in some meaningful, understandable way to someone who doesn't already know what it means.

It is by no means a dick measuring contest, at least not in the way you are saying. Math, computer science, and physics are areas I am very well versed in, and Wikipedia articles on those subjects are very easy for me to read and comprehend. However, if you point me to an article on a subject I'm weak in, I find it very difficult to read. Take the article on photosynthesis, for instance. It contains so many words that I can barely pronounce, let alone define, that I could easily assume it was a giant circle jerk by the biology community to include as much technobabble as possible. Yet it isn't, and it would be easy reading for someone well versed in that field.

What you should have told that high school student is that Wikipedia is secondary reference material, not a learning aid. If you need help learning a subject, ask a tutor, teacher, or instructor, or consult an educational textbook, not Wikipedia.

Comment: Re:My reason (Score 1) 369

The biggest disappointment to me is that, as with property generally, I cannot choose to disclaim ownership: for most of what I write, I'd rather simply disclaim it to the public domain. Whilst those using my work in an academic context will be bound by academic rules in terms of citation and the like, if someone else can benefit, great. Since copyright was neither a driver nor an enabler of the creation of the work, I'm unconvinced that copyright should exist over that work; but, since it does as a matter of law, I'd like to refuse to accept it. Which I can't...

You know, I rarely use my moderation points since I don't come across many really good posts. In fact, my moderation points just expired last night. This is one of the best things I've read on Slashdot in a while, though. I wish I had points to mod you up.

Comment: Re:Rushing?! For What?! (Score 2) 446

by AstrumPreliator (#39239189) Attached to: Math Textbooks a Textbook Example of Bad Textbooks
The reason they publish so many editions is to combat used-textbook sales, especially for freshman- and sophomore-level undergrad courses. Professors sometimes write their own books as well, which you're required to buy for their course, which in turn is required for various degrees. The book for a linear algebra course I once took was written by the department head, if memory serves. The first edition had algebra misspelled as "algegra" on the binding. It understandably got a second edition. Too bad the book itself was quite horrible.

Of course, when you continue through a math degree as I did, they tend to use gold-standard textbooks that haven't changed in years or decades, receiving new editions only rarely. By about my junior year we were using a lot of books from Springer, which is a pretty decent publisher. Sometimes we'd use reference books from Dover, mostly translated Russian and German texts that are quite old. Other courses, such as differential geometry, used "standard" textbooks like do Carmo's "Differential Geometry of Curves and Surfaces". I was even fortunate enough to have some really awesome professors. My differential equations instructor didn't even use the department's required textbook (some 50th-edition book). Rather, he taught from a bunch of his graduate textbooks, which I actually bought after asking him about them.

That's not to say there should never be a revised edition. Errors slip through no matter how hard you try to prevent them, and sometimes a new edition benefits from recent advancements in the field. This is less of a concern in math, where new advancements are generally far above the level of an introductory textbook, but in areas such as computer science an occasional new edition could definitely be good for students.

I think it comes down to how much publishers think they can make off of students. A lot of undergraduate mathematics is required by so many different fields that it makes sense why they do this (to prevent used-book sales and make more money). When you start to advance toward a graduate program, the number of people who need those courses drops off a lot. Perhaps the relatively recent F/OSS textbook movement could help here, although I doubt it. When it comes to K-12, I'm sure the situation gets a lot cloudier because of ever-changing standards. Then again, a lot of schools have relatively little money; the high school I went to gave us extra days off and kept only half of the lights on because it couldn't afford utilities. So perhaps F/OSS textbooks would work really well there.

Comment: Re:Both... (Score 1) 124

by AstrumPreliator (#39223625) Attached to: Video Games: Goods Or Services?
This is what I see in games: publishers cherry-picking the best of both worlds.

For instance, say I buy a physical copy of an Xbox 360 game. If I lose or break the disc I'm boned; I'll need to buy it again. Of course, I can also sell it to someone else when I'm finished with it. So it seems like a good to me. However, what happens if part of the game is only available if you buy it new? All of a sudden I don't own part of the game; it's more like a service (a non-transferable license). When it comes to digital distribution like Xbox Live or Steam, things do a 180. If my Xbox 360 blows up, I can just download the game from Xbox Live on another machine, but I can't sell it to anyone else. So now I don't seem to own it at all; I merely have a non-transferable license.

Basically, if it's a good I should be able to do with it as I please, including selling it to someone else. If it's a service, then I'd like to be able to download it and use it anywhere (within reason, of course), even if the physical copy is destroyed.

Comment: Re:Ordinary Mortals (Score 1) 40

by AstrumPreliator (#38770650) Attached to: Book Review: OpenCL Programming Guide
A GPU is a computing device, but it's not just another CPU. While it may be fairly flexible, it's still designed with one thing in mind. Debugging isn't nearly as nice as it is on the CPU: you can't do things such as print to the console from within a kernel (on a GPU) without an extension, and if you kick off a very time-consuming kernel, your display will probably lock up until it finishes. Not to mention memory management is difficult on the GPU, since you have to think about things such as coalesced reads/writes and cache coherency. Branching is also comparatively slow on a GPU, which needs to be taken into account.

Lastly, OpenCL is not just for GPUs; it targets a bunch of different devices, including CPUs, FPGAs, and DSPs, all of which have their own quirks. I'm not saying it's impossible to abstract all of this away, but it's not as easy as you make it sound. And if you can't work at this level, you shouldn't be doing this at all, since you'll have to work at this level whether you use a wrapper or not. I'm sure in 5-10 years the situation will change, but until then I agree with the GP.
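As a sketch of the extension caveat mentioned above (the kernel and names here are illustrative; cl_amd_printf is one such vendor extension, and availability varies by driver), printing from an OpenCL 1.1-era GPU kernel typically required something like:

```c
// OpenCL C kernel source, not host code. On OpenCL 1.1 GPU devices, printf is
// not part of the core kernel language and must be enabled via a vendor extension.
#pragma OPENCL EXTENSION cl_amd_printf : enable

__kernel void debug_example(__global const float* in) {
    if (get_global_id(0) == 0) {
        // Without the extension enabled, this line fails to compile on the device.
        printf("first element = %f\n", in[0]);
    }
}
```

OpenCL 1.2 later made printf part of core OpenCL C, but the broader point stands: debugging facilities differ per device and vendor.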

Comment: Re:Wrong idea (Score 2) 281

by AstrumPreliator (#37269514) Attached to: Will Climate Engineering Ever Go Prime Time?
Right, because settlers on another planet, moon, or exoplanet wouldn't have to worry about conserving every little thing they possibly could to survive. Releasing CO2 into the atmosphere would be colossally stupid, since they'd most likely have a closed system where the plants use that CO2 to provide oxygen and food in return. Plus, it's not like we'll be using oil as an energy source: not only would it probably be nonexistent on another world, it would also require oxygen to combust, which is better saved for the people. Perhaps one day, after a few hundred years of terraforming produce a near-Earth atmosphere, plus a steady supply of oil from Earth (assuming it hasn't run out by then), everyone will get nostalgic, buy SUVs, and cause global climate change all over again, but I'm not seeing it.

Most worlds out there have no ecosystem to destroy, almost no atmosphere to pollute, and are inhospitable to all but the most resilient forms of microbial life. So how exactly are we going to repeat "the whole damn shit again"? Hell, colonization would probably help here, since colonies would need to recycle everything they possibly could at the highest efficiency possible. They'd also need the cheapest, easiest, and most efficient energy sources to power the colony.

"One day I woke up and discovered that I was in love with tripe." -- Tom Anderson

Working...