Java and the Raspberry Pi Camera (Part 1)
By speakjava on Jun 14, 2013
The Raspberry Pi Foundation recently announced the launch of a camera that plugs into one of the two ribbon cable connectors on the board.
I thought it would be an interesting idea to see how easy it would be to get this working with a Java or JavaFX application.
There are three utilities that are available for testing the camera: raspistill, raspiyuv and raspivid. These allow you to grab a frame or video from the camera and store it in a file. This seemed to be a good starting point for figuring out how to use the camera and get the frame data into a Java application, ideally as a BufferedImage (I decided to start with simple image capture and look at video streams later).
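As a crude first step, before any JNI work, a Java application could simply shell out to raspistill and read the JPEG it writes back in with ImageIO. This is only a sketch of that idea: the class name, the file handling and the choice of flags are my own, not a finished design.

```java
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class StillGrab {
    // Run raspistill to capture a frame to a file.
    // -t sets the preview delay in milliseconds, -o names the output file.
    public static void capture(File jpeg) throws Exception {
        Process p = new ProcessBuilder(
                "raspistill", "-t", "1000", "-o", jpeg.getPath()).start();
        p.waitFor();
    }

    // Load the captured JPEG as a BufferedImage.
    public static BufferedImage load(File jpeg) throws Exception {
        return ImageIO.read(jpeg);
    }

    public static BufferedImage grab(File jpeg) throws Exception {
        capture(jpeg);
        return load(jpeg);
    }
}
```

This works, but going through the filesystem for every frame is exactly the overhead that a direct MMAL-to-Java path should avoid.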
I downloaded the code from GitHub and started looking at what it does and how it works. Initially I thought it would make sense to use a toolchain to cross-compile the code on my quad-core Linux box. However, having spent a day on this and failed to get the code to compile cleanly (even using the Raspberry Pi Foundation's own toolchain), I decided to build natively on the Raspberry Pi instead: slower, but at least it worked.
I also found a useful post from Tasanakorn Phaipool, who had created a couple of sample applications that made use of the camera and linked to the OpenCV libraries. This provided a good starting point, as it simplified things compared to the raspistill application and enabled me to figure out a relatively simple build environment (I don't have time right now to climb the learning curve required for CMake).
Getting the code to compile and run was really quite challenging. I will confess it's been a while since I've done any C coding, but most of the issues I experienced were to do with getting the build process to work correctly. I used an iterative approach to creating a Makefile, simply resolving issues as I found them, gradually adding header file references and libraries until the code compiled cleanly.

To use the camera we need the multi-media abstraction layer (MMAL) API. Broadcom have very kindly made this available as source, but documentation-wise you pretty much have to dig through the source code (there is a big comment at the top of the mmal.h file, which is the best documentation I've found so far).

Once I'd got the code to compile and link it still would not run, which puzzled me for quite some time until, by comparing the raspistill executable to the one I'd built, I found that I needed to include libmmal_vc_client.so in the list of libraries to link. (This really does confuse me: the library is not required to resolve any function references, so the code compiles and links correctly without it, but without it the necessary camera configuration is not registered and the call to mmal_component_create() will fail.)
At this point I have some code that will talk to the camera and display the preview image on the video output (HDMI). Next I need to modify this so it can be used via JNI and integrate it with a new subclass of ImageInputStream, which can then be used to create a BufferedImage in a Java application.
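Before the ImageInputStream subclass exists, the JNI bridge might be sketched with a plain byte array. The names below (picam, grabFrame) are my placeholders, not a final API: the idea is that the native side fills the array with packed RGB pixels from MMAL, and the Java side wraps it in a BufferedImage.

```java
import java.awt.image.BufferedImage;

public class PiCamera {
    // Implemented in C on top of MMAL; returns one frame as
    // packed 24-bit RGB bytes (name and signature are assumptions).
    private native byte[] grabFrame(int width, int height);

    public BufferedImage capture(int width, int height) {
        // Load the (hypothetical) JNI wrapper library here rather than in a
        // static initialiser, so the pure-Java helper below stays usable
        // without the native library present.
        System.loadLibrary("picam");
        return toImage(grabFrame(width, height), width, height);
    }

    // Pure-Java helper: convert packed RGB bytes into a BufferedImage.
    static BufferedImage toImage(byte[] rgb, int width, int height) {
        BufferedImage img = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int i = (y * width + x) * 3;
                int pixel = ((rgb[i] & 0xFF) << 16)
                          | ((rgb[i + 1] & 0xFF) << 8)
                          |  (rgb[i + 2] & 0xFF);
                img.setRGB(x, y, pixel);
            }
        }
        return img;
    }
}
```

Wrapping the same byte stream in an ImageInputStream subclass instead would let the standard ImageIO plumbing do the decoding, which is where I want to end up.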
One other thing that is interesting: when I run the simple test program the preview is displayed, and very shortly afterwards the network stops working (all the LEDs on the Pi except the power light go out). I assume that is a bug somewhere. Fortunately, I have a serial console connected so I can still access the Pi via PuTTY.
I will update my blog as I make more progress on this.