
Comment Re:LOL Java (Score 1) 233

> Did you perchance use JVisualVM to look at what your program was doing? What was it doing when spending time? Sometimes the hotspot will become obvious

Yeah, with various tools (JProfiler, VisualVM, sysprof, strace, latencytop, etc.).

The problem with running Java profiling tools is that you have to leave UsePerfData on - and it's a cause of jitter in the VM itself.

The code is as optimised as it can get - I'm bypassing the "helper" classes included with JavaJack and using the raw mapping to the Jack C API.

When running, the program sits at about 2% CPU - so it's not a CPU starvation problem. I've run it under both regular and realtime Linux - the Linux thread priority is set high by Jack itself - and I've tried lowering the priority of the other VM threads via native C calls. No dice.

> What was the delay? It should have been sub-millisecond. It is definitely not the source of the 40 ms delay that you see.

The problem, I think, is that it's competing with the compilation / profiling and housekeeping threads for CPU, and the VM can halt your thread whenever it decides the profiling data or other housekeeping needs updating.

I should mention that I'm outputting two periods of 1024 samples - so Java is in effect failing to meet a 20 millisecond deadline every now and then.
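The arithmetic behind that deadline can be sketched as follows (the 48 kHz sample rate is my assumption - at 44.1 kHz the numbers shift slightly - but it reproduces the ~20 ms per-period and ~40 ms total figures):

```java
// Back-of-the-envelope check of the deadline arithmetic: two periods of
// 1024 samples. The 48 kHz sample rate is an assumption, not something
// stated elsewhere in this thread.
public class JackLatency {
    // Time (ms) available to fill one period before the DSP thread underruns.
    static double periodMs(int samplesPerPeriod, int sampleRate) {
        return 1000.0 * samplesPerPeriod / sampleRate;
    }

    // Total output latency (ms) of an n-period ring buffer.
    static double totalMs(int samplesPerPeriod, int periods, int sampleRate) {
        return periods * periodMs(samplesPerPeriod, sampleRate);
    }

    public static void main(String[] args) {
        System.out.printf("deadline per period: %.1f ms, total latency: %.1f ms%n",
                periodMs(1024, 48000), totalMs(1024, 2, 48000));
    }
}
```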

> I'm not sure whether you are aware of this, but to get low jitter you must yield.

I'm not very happy with that as a solution - if anything it shows the model itself is broken.
I can't even try it - the callback thread is created inside the Jack shared library and "enters" the VM only when it needs to make a callback. I can't block or do anything in this thread other than return the data Jack needs.

It's been an entertaining discussion, but I think we're at an impasse. I've demonstrated some scenarios in which Java either lacks expressive tools or fails to offer the guarantees that native C++ can.

Please don't forget - I am a Java developer - I'm not pooh-poohing the language out of bias - but recognising the limitations of the tools we use is important for identifying the situations in which they are appropriate.

Cheers

Comment Re:LOL Java (Score 1) 233

> One thing I forgot to mention. You've seen the JOAL library, yes? That's what I'm using for audio.

Funny you should mention that - the DSP isn't happening in Java.

> As I mentioned in an earlier post, the strength of Java is in its vast wealth of libraries.

But that's not a Java library - it's a Java veneer on top of a native library.

Comment Re:LOL Java (Score 1) 233

All the games you mention bar Minecraft have a native core engine that isn't written in Java.

Minecraft itself isn't exactly a good advertisement for Java performance in games - and I'm not blaming Notch for that. Most of the bad press it gets performance-wise is due to VM stalls.

> And as I will repeat, my *experience* with using Java for a game is that the CPU components run as smooth as silk and I have low jitter.

I suspect the jitter is below your perception level since the GPU is doing the heavy lifting.

In my case I have Java DSP code generating audio samples - no filesystem reads / writes. Floats are generated as a block into the NIO buffer.

I'm using JavaJack as the audio connector to Jack - the standard Java audio engine is terrible.

Basically, the Jack DSP thread makes a callback that provides NIO buffers for the DSP calculations to place their results in (simply generating a 440 Hz sine wave, for example). The lowest output latency I can get with Java without jitter causing pops and clicks is about 40 ms (1024 samples × 2 periods), as I mentioned. I've verified zero memory allocations and turned off UsePerfData too (UsePerfData can cause VM stalls due to the mutex around writing the performance data).
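For illustration, a minimal sketch of what the allocation-free callback body looks like - I'm deliberately not reproducing the JavaJack callback signature here, just the DSP inner loop, and the 48 kHz sample rate is an assumption:

```java
import java.nio.FloatBuffer;

// Sketch of the allocation-free body of a Jack process callback: fill a
// preallocated FloatBuffer with a 440 Hz sine. The surrounding JavaJack
// plumbing is omitted and the 48 kHz rate is assumed for illustration.
public class SineFill {
    private static final int SAMPLE_RATE = 48000;
    private static final double FREQ = 440.0;
    private double phase; // carried across callbacks so the wave is continuous

    public void process(FloatBuffer out, int nframes) {
        double inc = 2.0 * Math.PI * FREQ / SAMPLE_RATE;
        out.clear();
        for (int i = 0; i < nframes; i++) {
            out.put((float) Math.sin(phase)); // no object allocation in the loop
            phase += inc;
            if (phase >= 2.0 * Math.PI) phase -= 2.0 * Math.PI; // keep phase bounded
        }
    }
}
```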

When tracing the VM using sysprof I can see the VM trapping a signal to do its housekeeping every X iterations, which halts all running threads. This is on both OpenJDK 1.8 and Sun's JDK 1.8.

Using latencytop on the machine at the same time shows the VM spending time in userspace lock contention and waiting for events (timeouts for the housekeeping threads).

I notice you didn't mention realtime Java. I'll ask again: if the Java VM doesn't have jitter, why do the realtime specification and VMs exist? (Realtime is all about predictability.)

Comment Re:LOL Java (Score 1) 233

> On GUI toolkit memory allocations - This is true if you are dealing with smartphones. In that case GCJ is an alternative, or C++. However, it is hard to find a *current* system that is so memory limited that this is an issue.

Unnecessary memory allocations increase cache pressure on the CPU, introducing extra memory stalls and forcing the VM to interrupt threads.

> Multimedia is soft realtime. You simply spend memory to read ahead. If you are talking about controlling stepping motors in software (which I've done in the past), then it is a problem. If you are controlling stepping motor device driver hardware then it is not a problem. Sorry, I just can't see a case where you need a hard realtime requirement and C++ would make a difference compared to Java. You would always do any hard realtime in hardware when you are worried about microsecond jitter (since it is the *operating system* that screws your jitter up, the difference in userspace Java vs C++ is negligible if you have idle cores ready to respond to work)

You seem to neglect the importance of low jitter - and you're writing a game? I write audio applications and do my prototyping in Java, but the real application in C++. I've measured the difference, and it's night and day. Java introduces another jitter source (VM stalls due to signals for housekeeping) that dwarfs OS-level jitter. It's not just a case of turning on the parallel garbage collector, either. Your threads are still interrupted. No such interruption occurs with native threads - even on a non-realtime OS.

> But let us suppose for a moment you do have a hard-realtime requirement in your software. Then I would say, "Go ahead and use C++ if you think it would make a difference". It turns out that the amount of software that fits this niche is small and getting smaller. But again I will remind you that the difference in determinism between C++ and Java is negligible if you have ever cared to measure them on a modern desktop O/S. It is the O/S that is the major source of jitter that will break your realtime requirements, not the software stack you are using.

I have measured it. The lowest audio output latency with Java in a no-allocation loop is around 40 ms on a 2.4 GHz Core 2 - on the same machine with C++ I'm down in the sub-millisecond range. It's that big a difference.

> but I would imagine it is so complex to do (retrofit to an existing C++ program) that even after several years it has not yet happened. One cannot simply dismiss the superior language constructs for synchronization when it comes to development effort. By having these (portable) language constructs it makes it easier to get massive multi-threading working correctly in Java relative to C++, and hence the jitter goes down and total throughput goes up. Conclusion: for less effort you get better total throughput with Java (assuming your problem can be parallelized, as most complex applications can).

You seem to have the idea that C++ is "C with objects". Threads in modern C++ are no more difficult to use or pool than in Java. But C++ threads won't get blocked unless you choose to block them. Modern C++ has finer-grained synchronisation primitives too, with varying levels of relaxation for atomics. In C++ you get to deal with the actual primitive - not an abstracted Java Object that increases pressure on the CPU cache.

> I'm glad you concede that NIO can solve the problem. Yes, they are slightly unnatural compared with the simplicity of most of Java. However, I would argue that they are a lot safer and more portable than fudging around with pointers to achieve the same goal (as in, with pointers you have all sorts of awful macros to handle alignment, memory model and word size differences on different platforms; in Java this is transparent to you).

Again, you've missed how modern C++ deals with packed array structures. Using vector<Struct> in C++ is worlds more elegant than fudging it with long offsets.
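To make the comparison concrete, here's roughly what the offset-fudging looks like on the Java side - field names and layout are made up for illustration, where C++ would just declare a struct and use vector<Particle>:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Hypothetical illustration of the offset juggling described above: a packed
// array of {float x; float y; int id;} "structs" emulated on a direct
// ByteBuffer. Every field access goes through a manually computed offset.
public class PackedParticles {
    private static final int X = 0, Y = 4, ID = 8; // byte offsets within a record
    private static final int STRIDE = 12;          // record size in bytes

    private final ByteBuffer buf;

    public PackedParticles(int count) {
        buf = ByteBuffer.allocateDirect(count * STRIDE).order(ByteOrder.nativeOrder());
    }

    public void set(int i, float x, float y, int id) {
        int base = i * STRIDE;
        buf.putFloat(base + X, x);
        buf.putFloat(base + Y, y);
        buf.putInt(base + ID, id);
    }

    public float x(int i)  { return buf.getFloat(i * STRIDE + X); }
    public int   id(int i) { return buf.getInt(i * STRIDE + ID); }
}
```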

> So, I personally am not swayed by your reasons for choosing C++ over Java. Your argument for C++ comes down to the embedded application domain, and really only for the very tiniest of devices (have you seen how powerful and capacious smartphones and tablets are these days?). Counting clock cycles is a thing of the past - as I said, in the vast majority of modern, complex, multi-threaded applications you are always waiting for something other than the CPU (e.g. network, user, database, memory bus, GPU etc).

Well, I'm sure it's good enough for you, but when you start putting some actual logic into your flight sim for your fluid dynamics and attempt to do it with low latency, you'll start to see Java's jitter problems.

There's a reason that modern games are still written in C++ rather than a managed language - when you need to extract performance in a predictable way you want determinism. Managed languages can't provide that guarantee.

Sun themselves admitted that predictability is a problem - that's why the realtime specification and VMs exist - regular Java has jitter.

Comment Re:LOL Java (Score 1) 233

> This destroys C++ (or any other software implementation) for performance.

Off the top of my head:

  * Java doesn't allow management of packed arrays of structs. This means you can't do things like cache optimisation of data structures. You can sort of get there by using NIO buffers and ugly map-to-offset style tricks, but it's just ugly and unwieldy. Cost for no reason.
  * Java suffers from predictability problems. Jitter introduced by the system management threads and mutexes causes stalls in threads that you have no control over. There is a realtime Java VM, but it's not really "Java", as you can't use the standard Java libraries and classes.
  * Related to the point above: in code with strict scheduling deadlines you can't use dynamic memory allocation (this goes for C++ as well as Java), but any of the standard Java libraries and/or objects allocates memory all over the place. You can use things like the Javolution collections to get around this.
  * Java's GUI toolkit (Swing) allocates memory all over the place, meaning all the hard work you put into a no-allocation loop is wasted when a JTabbedPane allocates massive amounts of memory just because you moved your mouse over it.
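The no-allocation discipline from the third point boils down to preallocating all working storage up front and reusing it - a sketch with illustrative names:

```java
// Illustrative sketch of the "no allocation in the hot loop" discipline:
// all working storage is allocated once up front and reused, so the loop
// itself gives the garbage collector nothing to do.
public class MixLoop {
    private final float[] scratch; // preallocated once, reused every call

    public MixLoop(int frames) {
        scratch = new float[frames];
    }

    // Mix two input blocks into the scratch buffer without allocating.
    public float[] mix(float[] a, float[] b) {
        for (int i = 0; i < scratch.length; i++) {
            scratch[i] = 0.5f * (a[i] + b[i]);
        }
        return scratch; // caller must consume this before the next call
    }
}
```

Every call hands back the same array, which is exactly the kind of API the standard libraries don't give you.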

I like Java as much as the next Java programmer, but the picture isn't that rosy when it comes to things needing real performance. C++ provides quite a few tools that can blow Java out of the water, purely because there's no runtime machinery underneath and the overhead you do pay is deterministic. Add in the ability to manage cache line access and C++ will quickly run away with the title.

C++ doesn't make you pay for anything you aren't using and offers quite a few tools that allow micro-management of performance in a way Java simply cannot.

Comment Re:Go Green (Score 1) 312

> Are there really modern (i.e. within the last 2-3 years) computers that don't support WOL?

In all honesty I'm not sure - but the network card firmware, the network card drivers, and possibly the BIOS and/or EFI all have the potential to get in the way of this working out of the box.

The OP mentioned they are rentals - not owned - which makes things a bit tricky. They may have to purchase this functionality if it is available.
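For what it's worth, if the hardware cooperates, the WOL "magic packet" itself is trivial: 6 bytes of 0xFF followed by the target MAC repeated 16 times, sent as a UDP broadcast. A sketch (the MAC in the test is a placeholder, and port 9 is just the conventional choice):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

// Sketch of a Wake-on-LAN "magic packet": 6 x 0xFF followed by the target
// MAC address repeated 16 times, sent as a UDP broadcast. Port 9 (discard)
// is conventional; any MAC used with this is a placeholder.
public class Wol {
    public static byte[] magicPacket(byte[] mac) {
        byte[] packet = new byte[6 + 16 * mac.length];
        for (int i = 0; i < 6; i++) packet[i] = (byte) 0xFF;
        for (int i = 0; i < 16; i++) {
            System.arraycopy(mac, 0, packet, 6 + i * mac.length, mac.length);
        }
        return packet;
    }

    public static void send(byte[] mac) throws Exception {
        byte[] data = magicPacket(mac);
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.setBroadcast(true);
            socket.send(new DatagramPacket(data, data.length,
                    InetAddress.getByName("255.255.255.255"), 9));
        }
    }
}
```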

Comment Re:never understood the appeal (Score 1) 189

Can you imagine if you'd started playing Doom and halfway through the game you had an accident and ended up with anterograde amnesia?

The use of emulators would allow you to groundhog day that same level for the rest of your life with the same pleasure as if it was the first time around.

(Yes, I feel bad. At least I didn't make a forking dick joke.)

Comment Re:RHEL version? (Score 1) 269

I don't know the date of their tools release, but doing a "one shot" installation if you want to play around is pretty easy:

tar zxf gcc-4.8.0.tar.gz
cd gcc-4.8.0
./contrib/download_prerequisites
cd ..
mkdir gcc-4.8.0-build
cd gcc-4.8.0-build
../gcc-4.8.0/configure --prefix=/SOMEINSTALLDIR --disable-multilib --disable-bootstrap --enable-languages=c,c++,objc,obj-c++,fortran
make -j #NUMPROCESSORS
make install

then add to PATH, LD_LIBRARY_PATH etc

That'll get you up and running.

Comment Re:Someone didn't do their homework... (Score 1) 337

> Was there blatant bullying? Of course not, and I never suggested such.

So did I misinterpret your comment:

> But it was killed off because Red Hat and nVidia didn't like it.

I inferred from that that Red Hat and nVidia actively tried to stop it. I apologise if that wasn't what you meant; it just came off that way.

Comment Re:Someone didn't do their homework... (Score 1) 337

Moving to Xegl would have been a step backwards until appropriate approaches to the problems I mentioned could be found. (How would existing stereo 3D GLX applications work? That's very important to the people who use them. Video was a mess too, and what about multi-screen setups?) David was rather flippant about these problems at the time, but they were real roadblocks.

AIGLX provided a simpler route that didn't lose functionality in the meantime and didn't require writing a new driver. It wasn't just Red Hat / nVidia bullying their way through here (your premise up the thread) - it gathered momentum because it was, at the time, the simpler way forward that didn't break anything. I think some of the Mesa drivers were the first to implement AIGLX.

I'll agree Xegl did show promise in providing a single simple approach for driver development though.

So, given the nature of open source - if Xegl was superior, why didn't someone keep working on it? Why aren't we discussing Mir and Xegl instead of Mir and Wayland?

Comment Someone didn't do their homework... (Score 5, Informative) 337

> We could have had a modern display server years ago with XGL/Xegl. But it was killed off because Red Hat and nVidia didn't like it.

The disagreement was purely technical.

The XGL approach caused a bunch of performance problems for various rendering scenarios (stereo 3D, overlays like video) - XGL forced everything through a pixmap to be rendered by GL.

There was no GPU acceleration for video, scaling or anything else.

XGL was cool because it was first and everyone got googly-eyed at the effects. It probably was a catalyst in getting to the right solution (AIGLX), too.

Comment Re:Shocking... (Score 1) 104

> A breach with only an account or two stolen makes no sense.

I'm afraid the real world has a few more shades of grey than hacked or not hacked.

The bad guys get caught with varying levels of "in" in the DMZ. High-value single-account targets are of interest to the bad guys too. A shotgun attack can set off alarm bells where a surgical strike can go unnoticed for a bit longer.

Banks in particular have improved over the last few years with two-factor auth and by dropping the "smart client" (Java / Flash) mess, but the bad guys are just as inventive - social engineering has been on the rise to counteract some of these advances.

I realise I'm not going to convince you without any factual backup. On the other hand, I'm not willing to put former colleagues and employers in the spotlight.

Comment Re:Shocking... (Score 1) 104

> I can think of a number of companies such as banks that have simply never been hacked

Having worked for a couple of banks in my time and had the ear of some of the security chiefs, I can tell you that it does happen. Unless it's a particularly visible breach (multiple account details stolen, loss of funds with transfers), very little of it makes it to the media. For obvious reasons.

> I can think of a number of companies such as banks that have simply never been hacked, but even outside of that has Amazon ever been hacked?

What makes you think you'd hear about it if it happened? Most companies will only hold up their hands and admit problems when the evidence is undeniable. See Sony.
