
Comment Re:javas not dead! (Score 1) 577

No, I'm saying it performs as well as C++ in most cases. Virtual method handling is one example of why: at execution time the JIT has a better view of what can and can't be inlined, so it can inline much more than a statically compiled C++ program possibly can.

You realize that a JIT is inherently limited to the small piece of the program it is compiling? The JVM can spend neither the time nor the RAM to build a whole-program tree and make global optimizations the way compilers can. And by the way, if you think compilers are limited to static analysis only - there's also profile-guided optimization.

Well, that's precisely the problem you face if you don't have an explosion of optimised binaries, unless you accept that the JVM is going to optimise more efficiently. It's not just about compiling for different architectures; it's about the JIT automatically being able to optimise to take advantage of extensions and other hardware that may be present too. It can optimise depending on the amount of RAM, cache sizes, etc. - something that just isn't known when you compile a plain old generic C++ binary for, say, the generic x86 platform.

But there are a number of other things it can do better too - better loop vectorisation (as a result of better inlining of virtual functions) and more efficient heap allocations for example.

In theory, it could do that. But if you do a reality check, you'll find that JVMs right now are pretty mediocre compilers that lack even basic optimizations. Again, everything that a JVM does can be done by a compiler, but not vice versa. Compilers have nearly unlimited time and can spend gobs of RAM analyzing the program. They can use profile-guided optimization, allowing you to gather stats from a compiled program and then recompile it to better account for runtime behavior - if needed.

Oh, and while we're at it: fine-tuning assembly with a specific CPU in mind does not matter much these days, except for SIMD ops. Waiting for memory accesses dominates CPU time - and here Java is at an inherent disadvantage, because you cannot really control the memory layout of your data.

"Server software does not [need to] have single-thread performance because it's more often I/O bound - that means that CPU vendors can get away with CPUs like Bulldozer or SPARCs that suck at IPC (instruction per clock) performance."

This is nonsense. It depends entirely on the application. A heavily loaded web server, for example, may not be I/O bound in the slightest, depending on its size and what it does. Bulldozer is designed to optimise performance per watt; you're again confusing cause and effect as to why some things are the way they are.

Before you call this nonsense, go read some analysis and check benchmarks.

"That's not a problem of Java, though, but all managed languages - .NET also sucks."

Really, the problem is simply that you don't understand managed languages. Your understanding of the optimisations performed by JIT compilers is clearly too woefully inadequate to be making this sort of comment. Your comments on server applications mostly don't even make sense, to the point that I'm not sure you have the slightest grasp of what sort of things servers commonly serve.

"Microsoft tried to build an OS which would be .NET based - they wasted like 6 years on that and ultimately had to abandon the idea. Now they are going native :)"

This is just further nonsense. There was a Microsoft research project to try and build such a thing, and they did, and open sourced it. I don't know what you mean by "Now they are going native :)"; they've always been native with their operating systems. If you were expecting their managed OS to cause them to throw out three decades of legacy code then you have a disturbing view of how software is developed.

It was a research project and nothing more, and even then it wasn't purely managed; they still had to bootstrap natively, because no one ever pretended that managed languages are designed for such low-level operations. You can find out more about it here:

It's worth noting though that some of the things learnt from this research project have already made their way into Windows, but that's kind of the point of research.

You probably don't remember that Vista (Longhorn back then) was supposed to be .NET-only, with the core API of the OS written in a managed language. Microsoft was not able to implement that efficiently and had to cut it - this is speculated to be one of the reasons for Vista's delay.

Nowadays they are phasing out .NET in general, promoting C++ as the primary language for the platform.

"Sure, when I'm forced to use Java (e.g. Android), I immediately use the JNI window to escape. I am not interested in "benefits" of Java, if it means that I need to waste even more time trying to profile the application."

Right, and most people won't be interested in the downsides of your approach either, because it simply means you're producing software much more slowly and with more scope for fatal bugs and security vulnerabilities.

You sound like one of those developers who has his little comfort zone and just can't deal with change. Everything should be written in assembly like it used to be! This is evident from your lack of knowledge about both JIT technology and server-side software and hardware.

Like it or not, times have changed; there are better ways of doing things now, such that the window of cases where C and C++ are the best tool for the job is rapidly diminishing. They're not worthless - they still very much have their place, low-level operating system development being one example, some embedded development cases being another. But the fact is these languages have no tangible benefits over their managed counterparts for most real world scenarios that exist today, whether that's building desktop applications, creating dynamic web pages, or building HPC trading systems. They do, however, have a number of downsides - a slower development cycle, and development that is less secure and more error-prone by default, being the obvious ones.

You either need to get over your fear of change and learn a bit more about these sorts of technologies, and understand why much of what you said is wrong, or just stick to what you know and shut up about things you don't. Either way, sticking to what you know and complaining about what you don't just results in you spouting nonsense, as you have thus far in this thread with your simply incorrect comments about JIT technologies and server applications and hardware.

By that reasoning, one can argue that everything will be written in JavaScript and/or HTML5. No, there are no major differences in productivity between Java coding and C++ coding. And C++ is not "a legacy language". In these days of multithreaded programming, it turns out that we are again changing the paradigm - we are using Data-Oriented Design to gain efficiency, and Java, with its "everything is a non-trivial object" philosophy and lack of POD types, fares very badly in that regard.

C++ hits the sweet spot of being pretty high-level (heard of meta-programming?) while also allowing you to go all the way down to assembly when needed. With Java or any other managed language, you are inherently limited by what the JVM provides, and JVMs have more concerns to care about than runtime efficiency, so they will always offer some kind of trade-off - that's why I called them "generic". While you can probably fine-tune them within reason, you cannot find a JVM that would, say, completely disable all runtime checks because your specific app does not need them.

If you ever need to code an application with tight performance requirements, like being required to draw a complicated scene in under 16 ms, you will understand better what I am talking about. So far you seem to be looking only at the boring side of programming :) Try doing realtime graphics! ;-)

Comment Re:Java is faster than C++ (Score 1) 577

You are very optimistic. Right now even C++ compilers (which, believe me, are very much performance-oriented and not exactly memory-constrained) have problems producing good vectorized code, but thank God we have assembly intrinsics, and we use them a lot. For a JVM that is even harder, for multiple reasons (and the unfortunate - historical - choice of Java bytecode is one of them). Sure, there's a broad class of software where performance does not matter, but as I said, that is boring software I don't want to work on. Writing such software is better outsourced somewhere where people crave money more than I do.

As for HFT, I don't think that using Java is a good decision. If you can optimize for specific (best-in-class) hardware, why would you jump through all the extra abstraction layers of Java? Sure, you probably can, but it's like artificially limiting yourself.

Comment Re:javas not dead! (Score 1) 577

JIT is generic in the sense that each program (and even different parts of a single program) is different, and you cannot base them all on a common framework. E.g. in C++ you sometimes have to abandon the STL entirely because you cannot allow dynamic memory allocation (and the memory fragmentation it causes). I wouldn't say that "Java performs as well as C++", unless you are speaking about UI-heavy programs where the bottleneck is user input - or, alternatively, about C++ programs written by people who don't know how CPUs implement a virtual method call and why it's slower than a non-virtual one.

Yeah, with native languages you are bound to a specific architecture (and even variations of it, e.g. AVX, SSE), but is it better to be a jack of all trades and master of none? And anyway, there's no "explosion of binaries" anymore - unfortunately, the number of architectures available keeps shrinking (which kind of undermines that design goal of Java). "Boring" native software that is not performance-tuned may very well ship just a generic binary targeted at, say, all Pentium IV and later CPUs.

Server software does not [need to] have single-thread performance because it's more often I/O bound - that means that CPU vendors can get away with CPUs like Bulldozer or SPARCs that suck at IPC (instruction per clock) performance. That is also the reason why Java can be used server-side without many problems. However, once you bring it to the desktop, where performance matters, it starts to suck immediately. That's not a problem of Java, though, but all managed languages - .NET also sucks. Microsoft tried to build an OS which would be .NET based - they wasted like 6 years on that and ultimately had to abandon the idea. Now they are going native :)

Sure, when I'm forced to use Java (e.g. Android), I immediately use the JNI window to escape. I am not interested in "benefits" of Java, if it means that I need to waste even more time trying to profile the application. Is there any low-level Java profiler, by the way, that would tell you where the CPU is burning cycles in your code? Down to the level of assembly - i.e. something like perf annotate.

Comment Re:javas not dead! (Score 1) 577

I know that some HFT platforms are written in Java, but I think that is for reasons other than trying to attain maximum performance. E.g. it is certainly harder to find programmers who have both high- and low-level skills (i.e. who know hardware well and also have a good understanding of the math).

As for the "better than JIT" argument, I think you are putting too much trust in a rather generic approach. Even traditional (and performance-oriented) compilers don't handle all use cases well, and you can hit a roadblock there too - let alone all the fundamental problems with "managed" code (e.g. random memory access patterns which hurt CPU caches, various safety checks, and a stack-based VM design that doesn't map well to register-based hardware - stack-based processors are at an inherent performance disadvantage, by the way, which is why Intel shunned the FPU in favor of register-based SSE). My claim about JIT weakness is supported by the well-known fact that server-side software rarely has great single-thread performance (and often it doesn't need it, but that's another topic), which is why server CPUs tend to have gobs of cache to alleviate that.

Again... the language (syntax, etc.) doesn't matter much to me; I wouldn't mind Java if it allowed me to get as close to the hardware as possible, even via non-portable extensions. It's the layers of code to profile and debug through, and the resulting feeling of not being in control, that I don't like.

Comment Re:javas not dead! (Score 1) 577

I like that comparison with soulless apartment blocks. To be fair, it wasn't only the Soviets who built those - I have seen similar unimaginative housing in the US, too. Either way, Java indeed feels like a typical, "no nonsense" business solution, and I can picture the grey, dull world of a corporate 9-to-5 programmer for whom Monday is the worst day of the week. I don't want to work like that.

Comment Re:javas not dead! (Score 3, Insightful) 577

Java abstracts away all the fun in programming. You cannot really target specific hardware with it, which means you cannot get (close to) maximum performance on any given piece of hardware - you are deprived of control over the processor cache, the memory locality of data and code, and access to vector instructions. A JIT, which "compiles" the code piecewise, is at an obvious disadvantage compared to a proper compiler that has a global view of the program (the JIT is also more time-constrained - a compiler can spend hours compiling the code), and an uncontrollable garbage collector means that you will have a hard time enforcing even "soft realtime" requirements.

Different people may see it differently, but for me all that means Java is suboptimal for games and other heavily performance-oriented stuff, and that is the only kind of software I enjoy programming. Making performance-insensitive backends full of "business logic" is for someone who is in the software industry for the money only...

Comment Re:javas not dead! (Score 1) 577

I don't think "an app" applies to web "applications", and I'm not sure those two will ever blend. The dependency on a (good) connection is still a problem if you are commuting or, say, torrenting. Even on mobile devices, people want offline maps and offline dictionaries precisely for the reason of being location-independent.

Comment Re: Are you F*cking kidding me!!! (Score 1) 195

It still drives down wages. [...] wages fall, maybe not to third world levels, but below what the market would normally dictate.

Why do you limit the "market" to a single country? One day we will witness the formation of the United States of Earth, and artificial obstacles to population movement will be quickly forgotten. The market is already global.

Comment Re:Ubuntu is a has-been. (Score 1) 183

Kind of disagree. Granted, this is anecdotal evidence, but I have brought my Linux desktop down just by allocating - and actually "touching" - too much memory (like 2x more than physical RAM) in a program of mine. While Windows may also suffer if you decide to memset() a 32GB array, I have yet to see it become as unresponsive as Linux did - there, not even Ctrl-Alt-F1 worked.

Also, Linux the kernel is one thing and Linux the OS is another. The Linux graphics stack is certainly not "lightyears ahead" of Windows, where you can reset/reinstall graphics drivers as if they were userland programs - quite the opposite: we still have an X server that probes the PCI bus (try grep -i pci /var/log/Xorg.0.log).
