
Comment Re:Knowing when not to (Score 1) 345

You are not 100% correct.

LAPACK and BLAS outperform Eigen for large decompositions. For work like computer vision, you often have huge heaps of small, fixed-size matrices. In such cases, the overhead of the function calls and blocking that make LAPACK fast makes it much, much slower than naive algorithms. I'm not sure where the crossover point is, but it's somewhere in the mid to high tens in size.
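To illustrate (a minimal sketch of my own, not any particular library): a fixed-size matrix type keeps everything on the stack and lets the compiler unroll the loops, which is exactly the overhead that BLAS-style blocking can't pay for at these sizes.

```cpp
#include <array>
#include <cstddef>

// Toy fixed-size matrix: dimensions are template parameters, so the
// storage lives on the stack and the compiler can fully unroll the
// loops. No dispatch, blocking or allocation overhead.
template <std::size_t R, std::size_t C>
struct Mat {
    std::array<double, R * C> d{};
    double& operator()(std::size_t r, std::size_t c) { return d[r * C + c]; }
    double operator()(std::size_t r, std::size_t c) const { return d[r * C + c]; }
};

// Naive triple loop: for matrices in the tens of elements, this avoids
// the per-call machinery a blocked BLAS routine pays for up front.
template <std::size_t R, std::size_t K, std::size_t C>
Mat<R, C> mul(const Mat<R, K>& a, const Mat<K, C>& b) {
    Mat<R, C> out;  // zero-initialised, on the stack
    for (std::size_t r = 0; r < R; ++r)
        for (std::size_t c = 0; c < C; ++c)
            for (std::size_t k = 0; k < K; ++k)
                out(r, c) += a(r, k) * b(k, c);
    return out;
}
```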

Anyway, other C++ numerics libraries wrap BLAS and LAPACK for large objects.

Comment Rubbish (pun intended) (Score 4, Insightful) 371

I RTFA'd the article on contamination.

It's very disingenuous.

For example, it's all "oh noes, toxic inks removed from paper during recycling are landfilled", so recycling is bad! Somehow this is different from dumping the very same toxic inks into a landfill while temporarily attached to paper.

The same complaint is repeated throughout the article.

Basically they're blaming recycling for the toxic crap that's in stuff, while ignoring the fact that landfilling toxic crap has exactly the same problems.

And lead-based spray paints? Apart from historical reconstruction work, lead paint has been illegal here since 1992.

Comment Re:Experts... (Score 1) 345

Memory objects that clean themselves up after they go out of scope are the devil's work. The devil, I say!

Exactly. The devil creates work for idle hands, does he not? And what does not having to do vast amounts of error-prone grunt work do, if not leave hands idle?

Comment Re:Knowing when not to (Score 4, Insightful) 345

If you need really high performance you don't use most of the C++ features anyway, and end up basically writing straight C.

Nope nope nope nope nope nope nope nope nope.

That is far, far, far away from correct.

Check out something like Eigen or TooN (a somewhat more obscure library used in the vision world for things like PTAM). They are very far from C. The code you write reads more like maths: there are no explicit loops and no explicit memory allocation. They're both high-performance libraries used in challenging applications (seriously, download and run PTAM, it's amazing).
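A toy flavour of that style (my own sketch; the real libraries go much further with expression templates): overload the operators and the call sites read like the maths.

```cpp
#include <array>

// Minimal 3-vector with overloaded operators. An illustration only;
// Eigen and TooN avoid temporaries via expression templates.
struct Vec3 {
    std::array<double, 3> v{};
};

inline Vec3 operator+(const Vec3& a, const Vec3& b) {
    return {{a.v[0] + b.v[0], a.v[1] + b.v[1], a.v[2] + b.v[2]}};
}
inline Vec3 operator*(double s, const Vec3& a) {
    return {{s * a.v[0], s * a.v[1], s * a.v[2]}};
}
inline double dot(const Vec3& a, const Vec3& b) {
    return a.v[0] * b.v[0] + a.v[1] * b.v[1] + a.v[2] * b.v[2];
}
```

A call site then reads like the formula it implements: `Vec3 c = a + 2.0 * b;` with no loop, index or allocation in sight.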

They're also fully templated so you can stick in an AD type instead of a normal number and get derivatives out automatically with no extra effort.
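For instance, a bare-bones forward-mode AD type (my own sketch, not a real AD library) drops straight into any templated function:

```cpp
// Dual number: a value plus its derivative. The overloaded operators
// propagate derivatives automatically.
struct Dual {
    double val, der;
};
inline Dual operator+(Dual a, Dual b) { return {a.val + b.val, a.der + b.der}; }
inline Dual operator*(Dual a, Dual b) {
    return {a.val * b.val, a.der * b.val + a.val * b.der};  // product rule
}

// Written once; works on double or Dual alike.
template <typename T>
T square_plus(T x, T c) { return x * x + c; }
```

Calling `square_plus(Dual{3, 1}, Dual{2, 0})` gives value 11 and derivative 6, with no changes to `square_plus` itself.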

Writing high performance C++ is nothing like writing high performance C. It's much better. All the messy C details of allocation and so on simply vanish, leaving clean, nice-looking code which is straightforward to read, write and debug.

Or another example from today for me. I need to find the best N (lowest cost) elements during some sort of complex search. It's easy. Just create a priority_queue<pair<double, int>> where the double is the cost and the int is the index of the object.

Then about three lines of logic keep the pqueue updated with the best N so far.

All the irrelevant logic of how a queue works is written and debugged by someone else and hidden from me, without ever losing either performance or type safety. For bonus points, if I find I'm searching very small things and memory allocation becomes a penalty, I can switch the entire thing to stack allocation with almost no effort at all!
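Roughly what that looks like (a sketch of the idea, not my actual code): the default max-heap keeps the worst of the current best N on top, so one comparison decides whether a new candidate displaces it.

```cpp
#include <cstddef>
#include <queue>
#include <utility>
#include <vector>

// Keep the N lowest-cost (cost, index) pairs seen during a search.
std::vector<std::pair<double, int>>
best_n(const std::vector<double>& costs, std::size_t n) {
    std::priority_queue<std::pair<double, int>> q;  // max-heap on cost
    for (int i = 0; i < (int)costs.size(); ++i) {
        if (q.size() < n) {
            q.push({costs[i], i});
        } else if (costs[i] < q.top().first) {  // beats the worst kept?
            q.pop();
            q.push({costs[i], i});
        }
    }
    std::vector<std::pair<double, int>> out;
    for (; !q.empty(); q.pop()) out.push_back(q.top());
    return out;  // worst of the best comes out first
}
```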

Comment Re:Masters know their limitations. (Score 3, Informative) 345

This 100%. C++ has become a clusterfuck of over-engineering

Nope. It's mostly hampered by the need to be backwards compatible. The committee know full well that if they make serious breaking changes, the C++ world will split into two distinct languages. Many, many people won't adopt the new one, and vendors will continue to support them because there's money there.

It's a blessing and a curse. The curse is that it's hard to do anything but add features. The blessing is that code I debugged 13 years ago still works.

The committee would rather argue over the rare case of multi-dispatch / multi-methods than fix core issues.

You're basically being an arse. You're picking out one rather speculative paper designed to help C++ stay modern a ways in the future (and somehow labelling this as a bad thing) while knowingly ignoring the many papers on fixing core issues.

Modules have been in a constant state of on-again-off-again for over 10 years:

Yeah, they should just slap in some module crap and if it's wrong, just rip it out and replace it. Who cares about getting things right and not breaking working code, eh? They learned a very hard lesson with export templates and are keen not to learn it again.

There are no standard pre-processor macros for function names as a string. GCC has the excellent __func__ which Microsoft finally got around to implementing C99 N2340 in Visual Studio 2015!

So you submitted a proposal, right? Or do you expect other people working for free to magically know your personal problems from your whining on slashdot?

Comment Re:Knowing when not to (Score 4, Insightful) 345

This is complete and utter tosh.

No one I know who does high performance code (such as numerics, real time computer vision, that sort of thing) uses anything but C++, especially for new projects. There is nothing out there that combines the speed and expressivity of C++, and when you know performance is going to be a factor at some point, C++ is the only choice.

Frankly, for a lot of scientific and numerics style coding, I often reach for C++ initially even when performance isn't required, since it often maps on to those problem domains better than any other language I've used.

Oh, and then there's the embedded world, where your choices are C and C++. Plenty of people use C++, since, like C, it scales all the way down to 8-bitters like Atmel's.

Comment Re:Simple ... (Score 3, Interesting) 345

Wow, the smug condescension is strong in this one.

I for my part wrote an STL clone around 1993 when the STL was just a lab experiment at HP.

The hard bit about the STL is the whole concept of, well, concepts, which Stepanov finally hammered out. The STL itself, especially the 1993-era version, is not all that complex.

Well, iostreams and their interaction with locales are deeply fiddly and I'd steer clear of those. But the basic containers and algorithms, you know, vector, list, set/map/multiset/multimap, sort, heap, priority_queue and so on are not too bad.

Not to say it's not an achievement, but it's not enough to convince me that you're an uber-guru. I've written STL compatible containers, and STL like algorithms for things that weren't in there. Apart from quantity the principle is the same.

Perhaps you should read what I wrote: I have roughly 15 years consecutive C++ experience from 1989 till 2005, plus random 3 or 4 years over the last decade.

15 years experience, or 1 year of experience repeated 15 different times?

Given you've never seen code without new in it (as your other post claimed), I'm inclined to think the latter, because you seem to be deeply ignorant of whole swathes of C++ style. In a lot of code, you never see new and delete: everything is managed by containers. I work on computer vision systems, and you can build entire working, robust systems without a new anywhere in sight. The custom containers might have a new inside, but that's---well, let me check the library I'm using---80 instances of new in 40k lines. Most of those are in old code from before TR1 gave us standardised smart pointers, others could easily be replaced with a std::vector (the code's not perfect, it's been hacked on for the last 15 years), some are strange, silly uses and the rest initialise the now-deprecated auto_ptr.

There's about one legit one which uses placement new and posix_memalign.

With spare time, I could easily get that down to one new in 40k lines of code. In fact, that's happening slowly as the library is transitioned to C++14.

I find it terrifying that someone who puts themselves forward as a super C++ guru splatters new all over the place like that. But you won't believe me, because without knowing anything about me you've convinced yourself that you're superior.

Let's see what a real, certifiable C++ guru says:

http://www.informit.com/articl...

Bjarne doesn't like new/delete either. No offence, but I'll take his invention and stewardship of the language over your 1 year of experience repeated 15 times.

I doubt you regularly find one here on /. who has significantly more C++ experience.

Out of interest, do you have any T-shirts with disparaging things written about n00bs on them? And are those slogans visible under the cheeto dust?

Comment Re:Given how C++ is taught. (Score 1) 345

A smart pointer has the same basic interface as a pointer, namely it supports * and ->, but not necessarily arithmetic. They're not supposed to be smarter than you, they're smarter than dumb pointers, because you can put logic into the various parts.

But yes, the standard ones essentially do lifetime management for allocated stuff, for when your object lifetime doesn't match the stack. They're pretty good things :)
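A minimal sketch of what that looks like in practice (`Sensor` and `demo` are made-up names for illustration):

```cpp
#include <memory>
#include <string>

// Made-up example type.
struct Sensor {
    std::string name;
    int read() const { return 42; }  // placeholder reading
};

int demo() {
    auto s = std::make_unique<Sensor>(Sensor{"imu"});
    Sensor& ref = *s;    // * works just like a raw pointer
    (void)ref;
    return s->read();    // and so does ->
}  // Sensor destroyed here when s goes out of scope; no delete anywhere
```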

Comment Re:Simple ... (Score 1) 345

The "masters" figured after 10 to 20 years of C++ that being a master is just mental masturbation, so they moved on to .Net and JVM based languages like C# or managed C++ and Java/Scala.

It's ironic how you accuse people of mental masturbation in a sentence that sounds like so much wankery. I was going to correct you and point out where C++ is the superior alternative to almost everything else, but I came to realise that you're not actually interested in learning. You seem more keen on letting the world know about your supposed mental superiority.

It also sounds like you're interviewing for C++ jobs you don't want purely so you can lord it over the interviewers. Get over yourself.

Comment Given how C++ is taught. (Score 2) 345

You should certainly be familiar with the syntax.

You should almost never see a new, and never a delete, in normal code (rare exceptions for guru library writers only). If you do, you're almost certainly making life hard for yourself.
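A small sketch of what normal new-free code looks like (the names are made up for illustration):

```cpp
#include <string>
#include <vector>

// The container owns the storage: it allocates, grows and frees it.
// No new, no delete, and no leak on an early return or an exception.
std::vector<std::string> make_labels(int n) {
    std::vector<std::string> labels;
    labels.reserve(n);
    for (int i = 0; i < n; ++i)
        labels.push_back("item " + std::to_string(i));
    return labels;  // moved out cheaply; freed by whoever holds it last
}
```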

Another thing is programming with concepts. These aren't part of the language yet, but they are part of the culture and of the design of the STL, and hopefully they will make it into the language. Types which model the same "concept" support the same operations with the same semantics.

This is like a field in mathematics: you have addition, multiplication, subtraction and division (well, really the additive and multiplicative inverses). If you get the proofs right, they'll work on any field. This is why you can multiply with a modular FFT.

For example, ints, floats and std::complex all obey the same concept, that of a number. There are more, such as the automatic differentiation types provided by external libraries.

Another example is vectors in "metric spaces". A VP-tree is like binary search, but instead of working on an array of numbers, it works on a multidimensional metric space of vectors. Normal 3D vectors in Euclidean space form a metric space (distances obey the triangle inequality), but interestingly so do bit vectors with Hamming distance, and even strings with edit distance. The underlying algorithm of a VP-tree relies on just a few semantics: you need to be able to measure distances and update some lower and upper bounds. That is all.

The art of concepts is writing the algorithm using only the abstract interface of the concept it operates on. This has several nice properties. Firstly, you use nothing more than the concept: if you use an operation which isn't part of the concept, it's almost certainly a bug. Secondly, the algorithms are much clearer, because they don't mix the implementation of one specific instance of a concept (e.g. building edit distance right into your VP-tree) with the underlying algorithm. And finally, once you've done that, you get genericity for free: stick a template around the class and you have a working, debugged algorithm which works on everything it could work on.
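A small sketch of the style (my own toy example, much simpler than a VP-tree): the algorithm is written purely against the concept "a distance exists for T" and never looks inside T.

```cpp
#include <cstddef>
#include <vector>

// Linear nearest-neighbour search over any type T, given only a
// distance function. Swap in Euclidean, Hamming or edit distance
// without touching the algorithm.
template <typename T, typename Dist>
std::size_t nearest(const std::vector<T>& pts, const T& query, Dist d) {
    std::size_t best = 0;
    for (std::size_t i = 1; i < pts.size(); ++i)
        if (d(pts[i], query) < d(pts[best], query))
            best = i;
    return best;
}

// One model of the concept: numbers with absolute difference.
inline double abs_diff(double a, double b) { return a < b ? b - a : a - b; }
```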

This is how the STL works. For instance, std::sort requires two concepts: the range spanned by the iterators passed to it must support random access, and you must be able to compare the elements with <. Given that, it can sort anything.
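For a made-up type of my own, that really is all it takes:

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Any type with operator< sitting in a random-access container
// satisfies std::sort's two concepts.
struct Match {
    double cost;
    std::string name;
};
inline bool operator<(const Match& a, const Match& b) { return a.cost < b.cost; }

std::vector<std::string> sorted_names(std::vector<Match> ms) {
    std::sort(ms.begin(), ms.end());  // uses the operator< above
    std::vector<std::string> out;
    for (const auto& m : ms) out.push_back(m.name);
    return out;
}
```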

A good stage to get to as a C++ programmer is to write code like that, not necessarily because you need the genericity, but because it forces you to separate the underlying abstract algorithm from the concrete datatype it operates on this time. Doing so has many benefits.

Comment Re:Down with "research"! (Re:Wow, just wow...) (Score 2) 490

No: you said it was a choice, specifically.

There's plenty of evidence that human sexuality is flexible, and it's certainly shaped by cultural constructs. Doesn't mean I could go fuck a dude now, however.

are mutually-exclusive. And yet, the same people tend to argue for both of them depending on the talking point du jour.

Only in your silly world of absolutes. In the real, nuanced world, both play a part.

Ancient Romans and Greeks both tended to ridicule homosexuals.

No, they ridiculed sub/bottoms. Not homosexuality. Way to fail at reading.

It seems clear, that your own classical education is rather selective...
And you massively cherry-picked and ignored context.

Comment Re:Down with "research"! (Re:Wow, just wow...) (Score 2) 490

Sexuality, you see, is a "social construct" now (and since 2004!)

I take it from your scare quotes that you strongly disagree. Perhaps you should read about some socially different societies, such as ancient Greece. The whole rigid split between "gay" and "straight" is a rather modern invention. In other words, a social construct. Sure, men were expected to get a wife in order to produce a son, but that had little bearing on what they stuck their dick in for fun.

They didn't have a term for gay or straight; instead they had generalised terms for top and bottom, the latter of which also applied to females.

So yes, sexuality as you think of it is a social construct. The evidence is written all over history.
