By kbr on May 11, 2008
We were fortunate to obtain a dev kit from NVidia about a week and a half before JavaOne. At the time we were multitasking on other work that had to be done by the conference. With help from the Java ME team, I successfully compiled the CVM, a highly portable Java SE-class VM, for the device. This took about 24 hours. This stack includes the Connected Device Configuration (CDC) and Foundation Profile (FP), but no graphics stack. At that point we had a headless Java VM with JNI support. We also compiled in the java.nio Buffer classes from the JSR-239 reference implementation, because these are used to hand data down to the OpenGL layer in both the desktop JSR-231 and mobile JSR-239 implementations. However, we were not aiming to run JSR-239 on the device.
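The reason the java.nio Buffer classes matter is that both JSR-231 and JSR-239 hand vertex and pixel data to the native GL through direct buffers, whose storage lives outside the Java heap where JNI can reach it without copying. A minimal sketch of the usual allocation pattern (the class and method names here are my own illustration, not API from either JSR):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class DirectBufferDemo {
    // Allocate a direct FloatBuffer in native byte order, the form the
    // GL bindings expect so vertex data can cross JNI without copying.
    public static FloatBuffer newDirectFloatBuffer(int capacity) {
        return ByteBuffer.allocateDirect(capacity * 4)   // 4 bytes per float
                         .order(ByteOrder.nativeOrder()) // match the hardware
                         .asFloatBuffer();
    }

    public static void main(String[] args) {
        // Three 2D vertices of a triangle, ready to hand to the GL layer.
        FloatBuffer verts = newDirectFloatBuffer(6);
        verts.put(new float[] { -1f, -1f,   1f, -1f,   0f, 1f });
        verts.rewind();
        // Direct: the native side can obtain its address via
        // the JNI GetDirectBufferAddress entry point.
        System.out.println(verts.isDirect());
        System.out.println(verts.remaining());
    }
}
```

Note that this is exactly why GetDirectBufferAddress mattered later in the port: without that JNI entry point, the native GL binding has no way to find the buffer's storage.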
Anthony Rogers, lead designer of JavaFX, created a demo design we were aiming for, including 3D graphics, spatialized audio, and touch screen input. In support of his design, I then took out the machete and whacked JOGL down to about 10% of its original size, changing its build process to parse the OpenGL ES 2.0 headers instead of the desktop OpenGL 2.1 headers, and eliminating all utility classes that had dependencies on immediate mode, the fixed-function pipeline, the AWT, or Java 2D. I ported NVidia's basic OpenGL ES 2.0 demo, a spinning triangle with colored vertices, from C to Java, targeting this modified JOGL. After working through several issues, including needing to write an OpenKODE-based launcher for CVM (using kdMain() instead of a normal main()), we had a window on the screen, but no graphics. Fortunately, while whittling JOGL down to the OpenGL ES 2.0 headers, I had left in the composable pipelines, in particular the DebugGL. Plugging in the DebugGL immediately isolated the problem to a missing glUseProgram call. After fixing this, we had a spinning triangle on the screen being driven by a very thin Java stack interfacing directly with the graphics hardware via OpenGL ES 2.0. This was working the Friday night before the conference. Here is a picture of this demo running on the APX 2500, plugged into the "wingboard", which provides power as well as several breakout connectors.
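The DebugGL that caught this is a decorator in JOGL's composable pipeline: it forwards each GL call downstream, then polls glGetError() so a failure surfaces at the offending call rather than frames later. The sketch below illustrates the idea with a simplified stand-in GL interface of my own, not JOGL's actual classes:

```java
// Sketch of the DebugGL idea: wrap every GL call, then check
// glGetError() immediately, so a mistake like a missing glUseProgram
// is reported at the exact call that failed.
public class DebugPipelineSketch {
    interface GL {
        void glDrawArrays(int mode, int first, int count);
        int glGetError(); // 0 == GL_NO_ERROR
    }

    // Stand-in "driver" that raises the error ES 2.0 gives when you
    // draw with no shader program bound.
    static class FakeGL implements GL {
        int pendingError = 0x0502; // GL_INVALID_OPERATION
        public void glDrawArrays(int mode, int first, int count) { }
        public int glGetError() { int e = pendingError; pendingError = 0; return e; }
    }

    // DebugGL-style decorator: forward the call, then poll for errors.
    static class DebugGL implements GL {
        private final GL downstream;
        DebugGL(GL downstream) { this.downstream = downstream; }
        public void glDrawArrays(int mode, int first, int count) {
            downstream.glDrawArrays(mode, first, count);
            check("glDrawArrays");
        }
        public int glGetError() { return downstream.glGetError(); }
        private void check(String call) {
            int err = downstream.glGetError();
            if (err != 0) throw new RuntimeException(
                "GL error 0x" + Integer.toHexString(err) + " after " + call);
        }
    }

    public static void main(String[] args) {
        GL gl = new DebugGL(new FakeGL());
        try {
            gl.glDrawArrays(4 /* GL_TRIANGLES */, 0, 3);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // pinpoints the failing call
        }
    }
}
```

Because the wrapper implements the same interface it wraps, it can be dropped into an existing pipeline with one line and removed for production builds, which is why leaving it in the slimmed-down JOGL paid off.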
At that point we focused our efforts on audio. Our first attempts to use the device's hardware support for OpenMAX didn't work, and since it was a Saturday by this point we had to find a fallback. We found that we could send bits out the Windows waveOut device, so I built a very simple pseudo-spatialization engine in pure Java which models the positions of the speakers in 3D and treats the sound as a point emitter in the 3D space, with 1 / (1 + r^2) falloff between the source and the speakers. This engine then sends its output down to the Windows waveOut device with a very small amount of native code. This took somewhat longer than expected or desired, and we hit a hiccup: we could only initialize the waveOut device on the APX 2500 in mono mode, not stereo. We contacted NVidia about this, and the cause turned out to be a bug in my code where I had incorrectly computed the average bytes per second. Fixing this allowed us to get full stereo out of the APX 2500, by the Monday morning before the conference.
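To make the two pieces above concrete, here is a sketch (my own illustration, not the engine's actual code) of the per-speaker gain under the 1 / (1 + r^2) falloff, alongside the average-bytes-per-second computation whose miscomputation was the mono-only bug:

```java
public class SpatializerSketch {
    // Gain contribution of a point emitter at one speaker:
    // 1 / (1 + r^2), where r is the source-to-speaker distance.
    static double gain(double[] src, double[] speaker) {
        double dx = src[0] - speaker[0];
        double dy = src[1] - speaker[1];
        double dz = src[2] - speaker[2];
        double r2 = dx * dx + dy * dy + dz * dz;
        return 1.0 / (1.0 + r2);
    }

    // The waveOut format field that was miscomputed: average bytes
    // per second = sample rate * channels * bytes per sample.
    static int avgBytesPerSec(int sampleRate, int channels, int bitsPerSample) {
        return sampleRate * channels * (bitsPerSample / 8);
    }

    public static void main(String[] args) {
        double[] left  = { -1, 0, 0 };
        double[] right = {  1, 0, 0 };
        double[] src   = { -1, 0, 0 }; // source sitting on the left speaker
        // Left speaker gets full gain; the right is attenuated by distance.
        System.out.printf("L=%.3f R=%.3f%n", gain(src, left), gain(src, right));
        // 44.1 kHz, stereo, 16-bit:
        System.out.println(avgBytesPerSec(44100, 2, 16)); // 176400
    }
}
```

Getting avgBytesPerSec wrong is easy to do (e.g. forgetting the channel count), and since the driver validates the format against it, an inconsistent value makes stereo initialization fail exactly as described above.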
At that point we had to get the graphics portion of the demo working. We decided to fall back to using the OpenGL ES 1.1 support on the device because we didn't have the time to write extensive utility libraries to make it easy to write pure shader-based code. While I was working on downgrading the modified JOGL from OpenGL ES 2.0 to OpenGL ES 1.1, Sven Gothel, author of GL4Java, started working on some backup content. That night I ran into extensive problems, including needing to implement the JNI GetDirectBufferAddress entry point in order for JOGL to work properly. I was seeing mysterious JVM crashes while trying to get the simplest red square OpenGL demo (ported to vertex buffer objects for OpenGL ES 1.1) on the screen.
In the meantime, Sven single-handedly ported the San Angeles OpenGL ES 1.1 demo from the Assembly 2004 competition from C to Java in a single night, first getting it to run on the desktop using an unmodified JOGL. He then modified the demo slightly to fit the framework we had running on the APX 2500. (I didn't have time to fully abstract away the EGL, so there were a few places in the code where EGL-specific constructs had to be used.) While I was struggling mightily to get any graphical output out of the device, Sven handed me a binary and said, "Come on, give it a try." The first run yielded a NullPointerException. The second run worked. An absolute miracle and an awesome team effort. Here are a couple of pictures of the demo running on the NVidia APX 2500.
On Sven's suggestion we dropped in a sound sample of the wind blowing, spatialized to oscillate between the right and left speakers, which gave the demo a haunting feel. Because the OpenGL ES 1.1 driver emulates the fixed-function pipeline at the driver level using shaders, we were probably using more CPU than we should have, so there were some dropouts in the audio when our Java-based mixer thread was starved. We'll fix this in the coming days.
Overall the port of Java, JOGL and 3D spatialized audio to the device, as well as the San Angeles demo, took six days.
We would like to thank Brian Bruning and Keith Galocy from NVidia for getting us the device and helping us quickly with our questions; Nandini Ramani for facilitating the contact with NVidia; Hinkmond Wong, Chris Plummer and Dean Long from the CVM team for getting us a working JVM so quickly; Chris Oliver for extremely helpful discussions and advice; and Anthony Rogers for the original demo idea, although I didn't succeed in executing his vision — yet. :)