Friday Oct 04, 2013

JavaOne Afterglow

Last week was the eighteenth JavaOne conference and I thought it would be a good idea to write up my thoughts about how things went.

Firstly thanks to Yoshio Terada for the photos, I didn't bother bringing a camera with me so it's good to have some pictures to add to the words.

Things kicked off full-throttle on Sunday.  We had the Java Champions and JUG leaders breakfast, which was a great way to meet up with a lot of familiar faces and start talking all things Java.  At midday the show really started with the Strategy and Technical Keynotes.  This was always going to be a tougher job than in some years because there was no big shiny ball to reveal to the audience.  With the Java EE 7 spec having been finalised a few months ago and Java SE 8, Java ME 8 and JDK 8 not due until the start of next year, there was not going to be any big announcement.  I thought both keynotes worked really well, each focusing on the things most important to Java developers:


One of the things that is becoming more and more prominent in many companies' marketing is the Internet of Things (IoT).  We've moved from the conventional desktop/laptop environment to much more mobile, connected computing with smart phones and tablets.  The next wave of the internet is not just billions of people connected, but tens or hundreds of billions of devices connected to the network, all generating data and providing much more precise control of almost any process you can imagine.  This ties into the ideas of Big Data and Cloud Computing, but implementation is certainly not without its challenges.  As Peter Utzschneider explained, it's about three Vs: Volume, Velocity and Value.  All these devices will create huge volumes of data at very high speed; to avoid being overloaded, these devices will need some sort of processing capability that can filter the useful data from the redundant.  The raw data then needs to be turned into useful information that has value.  To make this happen will require applications on devices, at gateways and on the back-end servers, all very tightly integrated.  This is where Java plays a pivotal role: write once, run everywhere becomes essential, and having nine million developers fluent in the language makes it the de facto lingua franca of the IoT.  There will be lots more information on how this will become a reality, so watch this space.


How do we make the IoT a reality, technically?  Using the game of chess Mark Reinhold, with the help of people like John Ceccarelli, Jasper Potts and Richard Bair, showed what you could do.  Using Java EE on the back end, Java SE and JavaFX on the desktop and Java ME Embedded and JavaFX on devices they showed a complete end-to-end demo.  This was really impressive, using the 3D features of JavaFX 8 (included with JDK 8) to make a 3D animated Duke chess board.  Jasper also unveiled the "DukePad", a home-made tablet using a Raspberry Pi, touch screen and accelerometer.  Although the Raspberry Pi doesn't have earth-shattering CPU performance (about the same level as a mid-1990s Pentium), it does have really quite good GPU performance, so the GUI works really well.  The plans are all open sourced and available here.  One small, but very significant, announcement was that Java SE will now be included with the NOOBS and Raspbian Linux distros provided by the Raspberry Pi Foundation (these can be found here).  No more hassle having to download and install the JDK after you've flashed your SD card OS image.  The finale was the Raspberry Pi powered chess-playing robot.  Really very, very cool.  I talked to Jasper about this and he told me each of the chess pieces had been 3D printed and then he had to use acetone to give them a glossy finish (not sure what his wife thought of him spending hours in the kitchen in a gas mask!)  The way the robot arm worked was very impressive, as it did not have any positioning data (like a potentiometer connected to each motor), but relied purely on carefully calibrated timings to get the arm to the right place.  Having done things like this myself in the past, I know how easy it is for a small error to get magnified into a very big mistake.

Here's some pictures from the keynote:


The queue to get in.  Back at the Moscone for the keynote this year, which was nice.


The "Dukepad" architecture


Nice clear perspex case so you can see the innards.


The very nice 3D chess set.  Maya's obviously a great tool.


The robotic chess player.

After the keynotes it was sessions, hands on labs, BoFs and parties for the next four days.  Here's a few highlights:

  • Anything Lambda related was packed.  Good to see that there's lots of interest and people are really keen to use this great new feature.  For me, the real power is in the changes to the libraries that use the Stream and related classes.  I helped run the Lambda programming Hands on Lab.  If you're interested Stuart Marks has posted the materials on his blog.
  • My session on the Raspberry Pi JavaFX Carputer went really well.  Since I couldn't bring my car with me I'd made a short video of the system in action.  It was one of those rare occasions when I knew that my demo would work!  I also managed to get my simulator working while I was at JavaOne so was able to show data recorded from a real run being played back on my device.  There will be more blog entries to follow on this shortly.
  • My other session was on JavaFX with the Leap Motion controller.  Thankfully for this I had the expert help of Gerrit Grunwald, Johan Vos and José Pereda who came with some great demos to complement my rather basic ones.  During the week I was lucky enough to go and visit Leap Motion, who are based in San Francisco, and talk about some of the great stuff they're doing to make the controller even better.
  • The Java leaders visit to the baseball game was fun (unless you're a Giants fan).  Not totally convinced about baseball, but then compared to cricket, it's actually quite a fast paced game.
  • I didn't go to the appreciation event this year on Treasure Island.  The idea of queuing for a bus for an hour to get a free burger and beer and listen to Maroon 5 was less appealing than a quiet dinner with my colleagues (and a bit of a break from the non-stop Java action).
On the last day it was the Community Keynote, which was the highlight of the week for me (watch it here).  It's always great to celebrate the way that community makes Java different to other programming languages, but this year the organisers excelled themselves.  James Gosling was back at JavaOne again, talking more about Liquid Robotics and showing some of the exciting things he's doing with JavaFX.  My favourite quote was when he came on stage and said, "I guess NetBeans is the new PowerPoint", a reference to the fact that most of the presenters had eschewed slides in favour of code and they'd all used NetBeans (if you haven't done so yet you should really try out the 7.4 release candidate).  Stephen Chin was also on stage to show a LEGO robot Duke that he'd built, which worked as a Segway using the recently released Mindstorms EV3 kit and the Java SE environment.  One sad piece of news related to this is that later in the day, Stephen, Angela Caicedo and I went for a cup of coffee and Angela's car was broken into.  All that was stolen was a bag containing Duke.  If you see him in a downtown San Francisco dumpster make sure he gets home.

By far the best part of the keynote was where my good friend Arun Gupta's son, Aditya, got up on stage and showed over 1500 people how to hack Minecraft.  As a presenter he was flawless: he seemed confident, his demos worked and he explained the concepts clearly.  Hard enough for a seasoned presenter, but consider that Aditya is only TEN YEARS OLD!  There's no way I could have done that at his age.  He deserves major respect for this, which is probably why he got a standing ovation when he finished.

You can watch the video of his performance here.

So that was JavaOne 2013.  Another great event and it will be even harder to top that next year.  One challenge I have taken away from this is that my son, Dylan, is only 7 years old.  I have less than three years to get him on stage talking about Java during the keynote at JavaOne!

Sunday Aug 04, 2013

The Raspberry Pi JavaFX In-Car System (Part 4)

It's been a while since my last blog entry about my in-car system, which has been due to a number of other things taking priority.  The good news is I now have more to report in terms of progress.

The first thing is that I decided to extend the scope of my project in terms of integrating with my vehicle.  Originally, I had planned to add a 7" touch screen somewhere that was visible whilst driving.  Given the attention to detail that Audi's designers have taken over the interior this was not going to be simple.  The company I had originally ordered the touchscreen from ran into production problems and after several months admitted that delivery of the screen would not be for "some time".  Since I needed this for JavaOne in September I cancelled the order and started looking for a replacement.  eBay is a great place to find items like this and I found a screen being marketed for the Raspberry Pi which was a "double DIN" fitting (which actually means it is twice the height of the ISO 7736 standard).  Some more searching on eBay turned up a bezel that would enable me to replace the existing navigation/entertainment system in my car with my new, Raspberry Pi powered one (Given how much functionality the existing system has I don't see this as a long term replacement, more for experimentation).

Having received my screen I decided that for development and testing it would be better if I did not need to keep changing the centre console, so I set about making the screen/Pi combination easier to use standalone.  Unfortunately, I couldn't find the perfect sized box at RS, but got one that could be adapted to my needs (the problem was it was too shallow, so I added some longer bolts and spacers).  First up was to fit the screen into the top of the box, as shown in the pictures



I was happy that my project already required the use of some wood, as I believe all great software projects should involve some woodwork.

To mount the Raspberry Pi I used the two vacant mounting points on the screen and attached a small perspex sheet to act as a platform for the Pi.

Pi mounting

Getting the holes in the right position took three attempts, as the positioning of the external cables was a bit tricky given the available space.

The Raspberry Pi was then mounted using the bolts shown above with some plastic spacers

Raspberry Pi mounted

The USB cables provided connections for a USB port and SD card reader which are part of the screen bezel.  In the end I removed these as I did not plan to use them and they were taking up too much space.

Fitting the HDMI cable was a bit of a challenge.  The distance between the HDMI port on the Pi and the one on the screen is about 3cm.  The shortest cable I had was 1m!  Using some cable ties and a sharp knife I was able to come up with a workable solution (not exactly pretty, but it works and won't be seen in the finished 'product').

HDMI cabling

Since I wanted to include an accelerometer I mounted that on the bottom of the box so it wouldn't move around during development.  The final internals are shown below.  I added a short ethernet extension lead to simplify cabled network access, the WiPi dongle could be left in place and I ran a USB extension lead from the Pi to simplify switching between the touch screen and an external keyboard.


When assembled I had a pretty nifty looking Raspberry Pi computer

pi computer

In the next installment I'll cover how I started on the JavaFX part to deliver realtime data on the screen.

Wednesday Jul 17, 2013

Trying to build the Internet of Things

For quite a long time I've been interested in the idea of wireless sensor networks.  I remember reading about "Smart Dust" and even went to visit a company called Crossbow, who were developing "motes", about 10 years ago.  This then led on to working with Sun SPOTs and finding out about things like 802.15.4, ZigBee and personal area networks (PANs).  More recently, here at Oracle, we've been looking at all sorts of ways we can promote the many great advantages of Java in such environments (cross-platform support, ease of development, etc.), and feeding all the data created by these devices into the data centre (thus our new Device to Datacentre initiatives).

More recently I came across an article about a startup called Wimoto, who are developing wireless sensors to measure temperature, soil conditions and water levels, each for less than $40.  What interested me about this was that using Bluetooth LE technology the sensors could be powered for up to a year using a single coin cell power source.  Clearly Moore's law has finally brought us to a point where meaningful sensor networks are a commercial reality.

This got me thinking about how I could build a demo using this type of sensor technology combined with something like the Raspberry Pi to act as a concentrator using Java and the Oracle Event Processing software.

The first part was to figure out how to get some small low cost sensors for my network.  Doing some research I found what I thought was an ideal component, the BL600-SA from Laird Technologies, a low power bluetooth module with built in antenna.  Here's the basic feature set:
This is an incredible piece of technology when you realise that the price is less than £10!  This looked ideal, so I ordered two along with some coin cell battery holders.  Time to build my own mote.

Here's a picture of the module:
BL600 module top

and here's the other side with the solder pads (ignore the fact, for now, that some of them look a bit used).

BL600 solder pads

You will notice that the solder pads for the 44 possible connections are very, very small, literally about 0.5mm across with a gap of about 0.2mm.  Surface mount soldering is used in almost everything electronic these days, but sadly I am not equipped for flow soldering, either at home or in the office (Oracle health and safety people actually get very upset if you start soldering in an open plan office).

For once I was glad that I am really short-sighted.  By using uncorrected vision I can get really close to the module and hopefully not burn myself while soldering.  Looking at the pin configuration it seemed that I needed 10 connections to the module:
  1. GND
  2. Vcc (3V)
  3. UART TX
  4. UART RX
  7. I2C SCL
  8. I2C SDA
  9. Analog input 1
  10. Analog input 2
I figured this would give me the ability to program the module and connect a number of different sensors.  Not too bad, as most of the connections are well spaced out.  Unfortunately, the UART ones are all grouped together.  My first attempt wasn't bad, but I was a bit concerned that I had a very small short between two of the UART connections.  In my attempt to use a scalpel to cut the solder I managed to remove the entire solder pad for the UART RX connection, which was the end of that module (good job they're cheap).

Having learnt some lessons from the first module, my second attempt seemed better.  Here's a picture, taken with a microscope, of the four UART connections.


It's not exactly pretty, but I'm reasonably certain there's no short.  Having made the necessary connections I mounted the module using some blu tack (this is, after all, a hi-tech project) on a piece of veroboard, thus:


In theory, at this point I should be able to put a cell in the holder, connect the UART via something like a MAX3232 and use PuTTY to talk to the module.  Unfortunately, this is not the case.  I put a cell in and started by measuring the voltage from Vcc to GND, which should have been 3V (or a little over with a new cell).  What I get is 1V.  I assumed this meant I had a short somewhere, but when I checked the resistance between Vcc and GND it was somewhere in the region of 2 Megohms, which does not seem like a short to me.

At this point I need to have a bit of a think to see if I can figure out where the problem is.  It seems my internet of things is still some way off...

Friday Jun 28, 2013

The Raspberry Pi JavaFX In-Car System (Part 3)

Having established communication between a laptop and the ELM327 it's now time to bring in the Raspberry Pi.

One of the nice things about the Raspberry Pi is the simplicity of its power supply.  All we need is 5V at about 700mA, which in a car is as simple as using a USB cigarette lighter adapter (which is handily rated at 1A).  My car has two cigarette lighter sockets (despite being specified with the non-smoking package and therefore no actual cigarette lighter): one in the centre console and one in the rear load area.  This was convenient as my idea is to mount the Raspberry Pi in the back to minimise the disruption to the very clean design of the Audi interior.

The first task was to get the Raspberry Pi to communicate using Wi-Fi with the ELM327.  Initially I tried a cheap Wi-Fi dongle from Amazon, but I could not get this working with my home Wi-Fi network since it just would not handle the WPA security no matter what I did.  I upgraded to a WiPi from Farnell and this works very well.

The ELM327 uses ad-hoc networking, which is point-to-point communication.  Rather than using a wireless router, each connecting device has its own assigned IP address (which needs to be on the same subnet) and uses the same ESSID.  The settings of the ELM327 are fixed to an IP address of and using the ESSID, "Wifi327".  To configure Raspbian Linux to use these settings we need to modify the /etc/network/interfaces file.  After some searching of the web and a few false starts here are the settings I came up with:

auto lo eth0 wlan0

iface lo inet loopback

iface eth0 inet static

iface wlan0 inet static
    wireless-essid Wifi327
    wireless-mode ad-hoc

After rebooting, iwconfig wlan0 reported that the Wi-Fi settings were correct.  However, ifconfig showed no assigned IP address.  If I configured the IP address manually using ifconfig wlan0 netmask then everything was fine and I was able to happily ping the IP address of the ELM327.  I tried numerous variations on the interfaces file, but nothing I did would get me an IP address on wlan0 when the machine booted.  Eventually I decided that this was a pointless thing to spend more time on, so I put a script in /etc/init.d and registered it with update-rc.d.  All the script does (currently) is execute the ifconfig line; now, having installed the telnet package, I am able to telnet to the ELM327 via the Raspberry Pi.  Not nice, but it works.
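For what it's worth, the same sanity check can be done from a small Java program instead of telnet.  This is only a sketch of the idea (the adapter's IP address is deliberately left as a command-line argument, and AT I simply asks the ELM327 to identify itself):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.InetSocketAddress;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class Elm327Probe {
    /* ELM327 commands must be terminated with a carriage return (0x0D) */
    static byte[] frame(String command) {
        return (command + "\r").getBytes(StandardCharsets.US_ASCII);
    }

    /* Connect to the adapter, send AT I and return whatever it says back */
    static String identify(String host, int port) throws IOException {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), 5000);
            s.getOutputStream().write(frame("AT I"));
            byte[] buf = new byte[256];
            InputStream in = s.getInputStream();
            int n = in.read(buf);
            return n < 0 ? "" : new String(buf, 0, n, StandardCharsets.US_ASCII);
        }
    }

    public static void main(String[] args) throws IOException {
        if (args.length == 0) {
            System.out.println("usage: Elm327Probe <adapter-ip>");
            return;
        }
        System.out.println(identify(args[0], 35000));  /* port the Wi-Fi adapter listens on */
    }
}
```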

Here's a picture of the Raspberry Pi in the car for testing

In Car

In the next part we'll look at running the Java code on the Raspberry Pi to collect data from the car systems.

Friday Jun 14, 2013

Java and the Raspberry Pi Camera (Part 1)

I've always liked the idea of computer vision and on the very long list of things I'd like to spend more time exploring are the OpenCV libraries, which have a handy set of Java bindings.  In the past I've experimented with, and used, some of the other frameworks that are available for image capture in Java, specifically the Java Media Framework (JMF) and Freedom for Media in Java (FMJ), mostly around the idea of integrating images from a webcam into an application like a security monitoring system.  Sadly, JMF has grown a little dusty over time, with the last release being way back in 2002 (you have to be amused when you see that the hardware requirements for this are a 166MHz Pentium processor and 32MB of RAM).  FMJ is a little more modern, but was last updated in 2007.

The Raspberry Pi Foundation recently announced the launch of a camera that plugs into one of the two ribbon cable connectors on the board (as shown below):

Raspberry Pi Camera

I thought it would be an interesting idea to see how easy it would be to get this working with a Java or JavaFX application.

There are three utilities that are available for testing the camera: raspistill, raspiyuv and raspivid.  These allow you to grab a frame or video from the camera and store it in a file.  This seemed to be a good starting point for figuring out how to use the camera and get the frame data into a Java application, ideally as a BufferedImage (I decided to start with simple image capture and look at video streams later).
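As a stop-gap while working out the native route, a frame can also be pulled into Java by simply running raspistill as a child process and reading the JPEG it writes.  This is just a sketch of that idea, not the approach described below (it assumes raspistill is on the PATH; -w, -h, -t and -o are its standard width, height, timeout and output options):

```java
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import java.util.Arrays;
import java.util.List;
import javax.imageio.ImageIO;

public class PiStillGrab {
    /* Build the raspistill command line: -w/-h set the capture size,
     * -t the delay before capture in milliseconds, -o the output file */
    static List<String> stillCommand(int width, int height, int timeoutMs, String outFile) {
        return Arrays.asList("raspistill",
                "-w", Integer.toString(width),
                "-h", Integer.toString(height),
                "-t", Integer.toString(timeoutMs),
                "-o", outFile);
    }

    /* Run raspistill and load the captured frame as a BufferedImage */
    static BufferedImage grabFrame(int width, int height)
            throws IOException, InterruptedException {
        File tmp = File.createTempFile("still", ".jpg");
        tmp.deleteOnExit();
        Process p = new ProcessBuilder(stillCommand(width, height, 500, tmp.getAbsolutePath()))
                .inheritIO().start();
        if (p.waitFor() != 0)
            throw new IOException("raspistill failed");
        return ImageIO.read(tmp);
    }
}
```

Shelling out per frame is obviously far too slow for video work, which is why the MMAL route below is the interesting one.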

I downloaded the code from GitHub and started looking at what it does and how it works.  Initially I thought it would make sense to use a toolchain to cross compile the code on my quad-core Linux box.  However, having spent a day working on this and failed to get the code to compile cleanly (even using the download of the Raspberry Pi Foundation's toolchain) I decided to build on the Raspberry Pi itself; it might be slower, but at least it worked.

I also found a useful post from Tasanakorn Phaipool who had created a couple of sample applications that made use of the camera and linked to the OpenCV libraries.  This provided a good starting point as it simplified things compared to the raspistill application and enabled me to figure out a relatively simple build environment (I don't have time right now to climb the learning curve required for cmake).

Getting the code to compile and run was really quite challenging.  I will confess it's been a while since I've done any C coding, but most of the issues I experienced were to do with getting the build process to work correctly.  I used an iterative approach to creating a Makefile, simply resolving issues as I found them, gradually adding header file references and libraries until the code compiled cleanly.  To use the camera we need the multi-media abstraction layer (MMAL) API.  Broadcom have very kindly made this available as source, but documentation-wise you pretty much have to dig through the source code (there is a big comment at the top of the mmal.h file which is the best documentation I've found so far).  Once I'd got the code to compile and link it still would not run, which puzzled me for quite some time until, by comparing the raspistill executable to the one I'd built, I found that I needed to include the in the list of libraries to link.  (This really does confuse me because this library is not required to resolve any function references, so the code compiles and links correctly, but without it the necessary camera configuration is not registered and the call to mmal_component_create() will fail.)

At this point I have some code that will talk to the camera and display the preview image on the video output (HDMI).  Next I need to modify this so it can be used with JNI and integrate this with a new subclass of ImageInputStream which can then be used to create a BufferedImage in a Java application.

One other thing that is interesting is that when I run the simple test program the preview is displayed and very shortly after the network stops working (all the LEDs on the Pi except the power light go out).  I assume that is a bug somewhere.  Fortunately, I have a serial console connected so can still access the Pi via PuTTY.

I will update my blog as I make more progress on this.

Wednesday Jun 12, 2013

The Raspberry Pi JavaFX In-Car System (Part 2)

In my last post (which was rather further back in time than I had planned) I described the ideas behind my in-car Raspberry Pi JavaFX system.  Now it's time to get started on the technical stuff.

First, we need a short review of modern car electronics.  Things have certainly moved on from my first car, which was a 1971 Mini Clubman.  This didn't even have electronics in it (unless you count the radio), as everything was electro-mechanical (anyone remember setting the gap for the points on the distributor?)  Today, in Europe at least, things like anti-lock brakes (ABS) and stability control (ESC) which require complex sensors and electronics are mandated by law.  Also, since 2001, all petrol driven vehicles have to be fitted with an EOBD (European On-Board Diagnostics) interface.  This conforms to the OBD-II standard which is where the ELM327 interface from my first blog entry comes in. 

As a standard, OBD-II mandates some parts while other parts are optional.  That way certain basic facilities are guaranteed to be present (mainly those that are related to the measuring of exhaust emission performance) and then each car manufacturer can implement the optional parts that make sense for the vehicle they're building. 

There are five signal protocols that can be used with the OBD-II interface:
  • SAE J1850 PWM (Pulse-width modulation, used by Ford)
  • SAE J1850 VPW (Variable pulse-width, used by General Motors)
  • ISO 9141-2 (which is a bit like RS-232)
  • ISO 14230
  • ISO 15765 (also referred to as Controller Area Network, or CAN bus)
You can think of this as the transport layer, which can be changed by the car manufacturer to suit their needs.  The message protocol which uses the signal protocol is defined by the OBD-II standard.  The format of these commands is pretty straightforward, requiring a sequence of pairs of hexadecimal digits.  The first pair indicates the 'mode' (of which there are 10); the second, and possibly third, pair indicates the 'parameter identification' or PID being sent.  The mode and PID combination defines the command that you are sending to the vehicle.  Results are returned as a sequence of bytes that form a string containing pairs of hexadecimal digits encoding the data.
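To make the format concrete, here's a small example of my own (not taken from the standard's text): mode 01 with PID 0C requests the engine RPM, so the request string is "01 0C"; the response echoes the mode with 0x40 added ("41") and repeats the PID ("0C"), followed by two data bytes A and B, with the RPM given by the standard formula ((A * 256) + B) / 4.

```java
public class ObdDecode {
    /* Decode a mode 01 / PID 0C (engine RPM) response such as "41 0C 1A F8".
     * The first byte is the request mode plus 0x40, the second echoes the
     * PID, and the remaining two bytes carry the data */
    static int engineRpm(String response) {
        String[] hex = response.trim().split("\\s+");
        if (Integer.parseInt(hex[0], 16) != 0x41 || Integer.parseInt(hex[1], 16) != 0x0C)
            throw new IllegalArgumentException("Not a mode 01 / PID 0C response");
        int a = Integer.parseInt(hex[2], 16);
        int b = Integer.parseInt(hex[3], 16);
        return (a * 256 + b) / 4;   /* standard OBD-II RPM scaling */
    }

    public static void main(String[] args) {
        System.out.println(engineRpm("41 0C 1A F8"));   // prints 1726
    }
}
```

Each PID has its own scaling formula like this, so decoding is mostly a matter of looking up the right arithmetic for the PID you asked for.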

For my current vehicle, which is an Audi S3, the protocol is ISO 15765 as the car has multiple CAN buses for communication between the various control units (we'll come back to this in more detail later).

So where to start?

The first thing that is necessary is to establish communication between a Java application and the ELM327.  One of the great things about using Java for an application like this is that the development can easily be done on a laptop and the production code moved easily to the target hardware.  No cross compilation tool chains needed here, thank you.

My ELM327 interface communicates via 802.11 (Wi-Fi).  The address of my interface is (which seems pretty common for these devices) and uses port 35000 for all communication.  To test that things are working I set my MacBook to use a static IP address on Wi-Fi and then connected directly to the ELM327 which appeared in the list of available Wi-Fi devices.  Having established communication at the IP level I could then telnet into the ELM327.  If you want to start playing with this it's best to get hold of the documentation, which is really well written and complete.  The ELM327 essentially uses two modes of communication:
  • AT commands for talking to the interface itself
  • OBD commands that conform to the description above.  The ELM327 does all the hard work of converting this to the necessary packet format, adding headers, checksums and so on, as well as unmarshalling the response data.
To start with I just used the AT I command which reports back the version of the interface and AT RV which gives the current car battery voltage.  These worked fine via telnet, so it was time to start developing the Java code. 

To keep things simple I wrote a class that would encapsulate the connection to the ELM327.  Here's the code that initialises the connection so that we can read and write bytes, as required:

  /* Copyright © 2013, Oracle and/or its affiliates. All rights reserved. */

  private static final String ELM327_IP_ADDRESS = "";
  private static final int ELM327_IP_PORT = 35000;
  private static final byte OBD_RESPONSE = (byte)0x40;
  private static final String CR = "\r";
  private static final String LF = "\n";
  private static final String CR_LF = "\r\n";
  private static final String PROMPT = ">";

  private Socket elmSocket;
  private OutputStream elmOutput;
  private InputStream elmInput;
  private boolean debugOn = false;
  private int debugLevel = 5;
  private byte[] rawResponse = new byte[1024];
  protected byte[] responseData = new byte[1024];

  /**
   * Common initialisation code
   * @throws IOException If there is a communications problem
   */
  private void init() throws IOException {
    /* Establish a socket to the port of the ELM327 box and create
     * input and output streams to it
     */
    try {
      elmSocket = new Socket(ELM327_IP_ADDRESS, ELM327_IP_PORT);
      elmOutput = elmSocket.getOutputStream();
      elmInput = elmSocket.getInputStream();
    } catch (UnknownHostException ex) {
      System.out.println("ELM327: Unknown host, [" + ELM327_IP_ADDRESS + "]");
    } catch (IOException ex) {
      System.out.println("ELM327: IO error talking to car");
    }

    /* Ensure we have an input and output stream */
    if (elmInput == null || elmOutput == null) {
      System.out.println("ELM327: input or output to device is null");
    }

    /* Lastly send a reset command and turn character echo off
     * (it's not clear that turning echo off has any effect)
     */
    debug("ELM327: Connection established.", 1);
  }

Having got a connection, we then need some methods to provide a simple interface for sending commands and getting back the results.  Here are the common methods for sending messages.

   * Send an AT command to control the ELM327 interface
   * @param command The command string to send
   * @return The response from the ELM327
   * @throws IOException If there is a communication error
  protected String sendATCommand(String command) throws IOException {
    /* Construct the full command string to send.  We must remember to
     * include a carriage return (ASCII 0x0D)
    String atCommand = "AT " + command + CR_LF;
    debug("ELM327: Sending AT command [AT " + command + "]", 1);

    /* Send it to the interface */
    debug("ELM327: Command sent", 1);
    String response = getResponse();

    /* Delete the command, which may be echoed back */
    response = response.replace("AT " + command, "");
    return response;

   * Send an OBD command to the car via the ELM327.
   * @param command The command as a string of hexadecimal values
   * @return The number of bytes returned by the command
   * @throws IOException If there is a problem communicating
  protected int sendOBDCommand(String command)
      throws IOException, ELM327Exception {
    byte[] commandBytes = byteStringToArray(command);

    /* A valid OBD command must be at least two bytes to indicate the mode
     * and then the information request
    if (commandBytes.length < 2)
      throw new ELM327Exception("ELM327: OBD command must be at least 2 bytes");

    byte obdMode = commandBytes[0];

    /* Send the command to the ELM327 */
    debug("ELM327: sendOBDCommand: [" + command + "], mode = " + obdMode, 1);
    elmOutput.write((command + CR_LF).getBytes());
    debug("ELM327: Command sent", 1);

    /* Read the response */
    String response = getResponse();

    /* Remove the original command in case that gets echoed back */
    response = response.replace(command, "");
    debug("ELM327: OBD response = " + response, 1);

    /* If there is NO DATA, there is no data */
    if (response.compareTo("NO DATA") == 0)     
      return 0;

    /* Trap error message from CAN bus */
    if (response.compareTo("CAN ERROR") == 0)
      throw new ELM327Exception("ELM327: CAN ERROR detected");

    rawResponse = byteStringToArray(response);
    int responseDataLength = rawResponse.length;

    /* The first byte indicates a response for the request mode and the
     * second byte is a repeat of the PID.  We test these to ensure that
     * the response is of the correct format
    if (responseDataLength < 2)
      throw new ELM327Exception("ELM327: Response was too short");

    if (rawResponse[0] != (byte)(obdMode + OBD_RESPONSE))
      throw new ELM327Exception("ELM327: Incorrect response [" +
          String.format("%02X", rawResponse[0]) + " != " +
          String.format("%02X", (byte)(obdMode + OBD_RESPONSE)) + "]");

    if (rawResponse[1] != commandBytes[1])
      throw new ELM327Exception("ELM327: Incorrect command response [" +
          String.format("%02X", rawResponse[1]) + " != " +
          String.format("%02X", commandBytes[1]) + "]");

    debug("ELM327: byte count = " + responseDataLength, 1);

    for (int i = 0; i < responseDataLength; i++)
      debug(String.format("ELM327: byte %d = %02X", i, rawResponse[i]), 1);

    responseData = Arrays.copyOfRange(rawResponse, 2, responseDataLength);

    return responseDataLength - 2;
  }

  /**
   * Send an OBD command to the car via the ELM327. Test the length of the
   * response to see if it matches an expected value.
   * @param command The command as a string of hexadecimal values
   * @param expectedLength The expected length of the response
   * @return The length of the response
   * @throws IOException If there is a communication error or wrong length
   */
  protected int sendOBDCommand(String command, int expectedLength)
      throws IOException, ELM327Exception {
    int responseLength = this.sendOBDCommand(command);

    if (responseLength != expectedLength)     
      throw new IOException("ELM327: sendOBDCommand: bad reply length ["
          + responseLength + " != " + expectedLength + "]");

    return responseLength;
  }

and the method for reading back the results:

  /**
   * Get the response to a command, having first cleaned it up so it only
   * contains the data we're interested in.
   * @return The response data
   * @throws IOException If there is a communications problem
   */
  private String getResponse() throws IOException {
    boolean readComplete = false;
    StringBuilder responseBuilder = new StringBuilder();

    /* Read the response.  Sometimes timing issues mean we only get part of
     * the message in the first read.  To ensure we always get all the intended
     * data (and therefore do not get confused on the next read) we keep
     * reading until we see a prompt character in the data.  That way we know
     * we have definitely got all the response.
     */
    while (!readComplete) {
      int readLength =;
      debug("ELM327: Response received, length = " + readLength, 1);

      String data = new String(Arrays.copyOfRange(rawResponse, 0, readLength));
      responseBuilder.append(data);

      /* Check for the prompt */
      if (data.contains(PROMPT)) {
        debug("ELM327: Got a prompt", 1);
        readComplete = true;
      }
    }
    /* Strip out newline, carriage return and the prompt */
    String response = responseBuilder.toString();
    response = response.replace(CR, "");
    response = response.replace(LF, "");
    response = response.replace(PROMPT, "");
    return response;
  }

Using these methods it becomes pretty simple to implement methods that start to expose the OBD protocol.  For example to get the version information about the interface we just need this simple method:

  /**
   * Get the version number of the ELM327 connected
   * @return The version number string
   * @throws IOException If there is a communications problem
   */
  public String getInterfaceVersionNumber() throws IOException {
    return sendATCommand("I");
  }

Another very useful method is one that returns the details about which of the PIDs are supported for a given mode.

  /**
   * Determine which PIDs for OBDII are supported. The OBD standards docs are
   * required for a fuller explanation of these.
   * @param pid Determines which range of PIDs support is reported for
   * @return An array indicating which PIDs are supported
   * @throws IOException If there is a communication error
   */
  public boolean[] getPIDSupport(byte pid) throws IOException, ELM327Exception {
    int dataLength = sendOBDCommand("01 " + String.format("%02X", pid));

    /* If we get zero bytes back then we assume that there are no
     * supported PIDs for the requested range */
    if (dataLength == 0)
      return null;

    int pidCount = dataLength * 8;
    debug("ELM327: pid count = " + pidCount, 1);
    boolean[] pidList = new boolean[pidCount];
    int p = 0;

    /* Now decode the bit map of supported PIDs, most significant bit first */
    for (int i = 0; i < dataLength; i++)
      for (int j = 7; j >= 0; j--) {
        if ((responseData[i] & (1 << j)) != 0)
          pidList[p++] = true;
        else
          pidList[p++] = false;
      }

    return pidList;
  }

The PIDs 0x00, 0x20, 0x40, 0x60, 0x80, 0xA0 and 0xC0 of mode 1 report back the supported PIDs for the following 32 values as a four byte bit map.  There appear to be definitions only for commands up to 0x87 in the specification I found.
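As an illustration of decoding such a bit map, here is a standalone sketch (separate from the ELM327 class above; the sample bytes are invented, and the most significant bit of the first byte is taken to correspond to the first PID of the range):

```java
public class PidBitmap {

    /* Decode a PID-support bitmap: bit 7 of the first byte corresponds to
     * PID base+1, bit 6 to base+2, and so on through the array */
    static boolean[] decode(byte[] data) {
        boolean[] pids = new boolean[data.length * 8];
        int p = 0;
        for (byte b : data)
            for (int bit = 7; bit >= 0; bit--)
                pids[p++] = (b & (1 << bit)) != 0;
        return pids;
    }

    public static void main(String[] args) {
        /* Hypothetical data bytes from an "01 00" request */
        byte[] data = { (byte) 0xBE, (byte) 0x1F, (byte) 0xB8, (byte) 0x10 };
        boolean[] pids = decode(data);
        for (int i = 0; i < pids.length; i++)
            if (pids[i])
                System.out.printf("PID %02X supported%n", i + 1);
    }
}
```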

In the next part we'll look at how we can start to use this class to get some real data from the car.

Thursday Apr 25, 2013

The Raspberry Pi JavaFX In-Car System (Part 1)

Raspberry Pi JavaFX Car System (Pt 1) As part of my work on embedded Java I'm always on the look out for new ideas for demos to build that show developers how easy it is to use and how powerful.  In some of my recent web surfing I came across an interesting device on eBay that I thought had real potential.  It's called an ELM327 OBDII CAN bus diagnostic interface scanner.  It is a small box that plugs in to the service port of a modern car and provides an interface that allows software to talk to the Electronic Control Units (ECUs) fitted in your car.  The one I bought provides a Wi-Fi link and also includes a USB socket for wired connectivity.  Similar products are available that provide a Bluetooth interface, but the various opinions I read indicated that these were not as easy to use.  Considering it cost a little over £30 I thought it was well worth it for some experimentation.

Here's a picture of the device:


And here it is plugged into the service port located near the pedals on my car. 


The only downside is that the orientation of the socket means that you can't see the status lights when it's plugged in (at least not without a mirror).

My initial thoughts were to look at what kind of data could be extracted from the car and then write some software that would provide a realtime display of things that aren't shown through the existing instrumentation.  I thought it would also be fun to record journey data that could be post-analysed, in much the same way Formula 1 uses masses of telemetry to let the drivers know where they could do better.

Since I wanted to use embedded Java the obvious choice of processing unit was the Raspberry Pi.  It's cheap, I have a whole bunch of them and it's got plenty of computing power for what I have in mind.  It also has some other advantages:
  • Low power consumption (easy to run off the 12V cigarette lighter supply)
  • Support for JavaFX through some nice touch screens from Chalkboard Electronics (so I can go wild with the interface)
  • Easily accessible GPIO pins
The last point got me thinking about what other possibilities there were for my in-car system.  Recently my friend and colleague Angela Caicedo did a session at Devoxx UK entitled, "Beyond Beauty: JavaFX, Parallax, Touch, Gyroscopes and Much More".  Part of this involved connecting a motion sensor to the Raspberry Pi using the I2C interface that is also available.  The particular sensor she used is from Sparkfun and uses a very cool single chip solution from InvenSense, the MPU-9150.  This provides 9-axis motion data, which means acceleration and rate of rotation for the X, Y and Z axes as well as a compass sensor that works regardless of the orientation of the sensor.

Having studied physics at university (a long time ago, in a galaxy far, far away) I vaguely remember that if I combine acceleration data with the mass of the car and things like engine speed I can calculate the horse power of the engine as well as the torque being generated.  Throw that into the mix and this could make a really fun project.
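As a back-of-the-envelope sketch of that calculation (all the numbers here are invented illustrative values, and it ignores drag, rolling resistance and drivetrain losses):

```java
public class EnginePower {

    /* Instantaneous power: P = F * v = m * a * v */
    static double powerWatts(double massKg, double accelMs2, double speedMs) {
        return massKg * accelMs2 * speedMs;
    }

    /* Torque from power and engine speed: P = tau * omega */
    static double torqueNm(double powerWatts, double engineRpm) {
        double omega = 2 * Math.PI * engineRpm / 60.0;  // rad/s
        return powerWatts / omega;
    }

    public static void main(String[] args) {
        /* Illustrative values: 1500 kg car at 20 m/s accelerating at 2.5 m/s^2 */
        double p = powerWatts(1500.0, 2.5, 20.0);
        System.out.printf("Power: %.0f W (%.1f hp)%n", p, p / 745.7);
        System.out.printf("Torque: %.1f Nm at 3000 rpm%n", torqueNm(p, 3000.0));
    }
}
```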

As further inspiration I came across this video recently:

There's also an interesting one from Tesla, who use a 17" touch display as their centre console.

In the follow up parts to this blog entry I'll detail how the project evolves.

Monday Jan 21, 2013

Building an SD Card Image For a Raspberry Pi Java Hands On Lab

Building an SD Card Image For a Raspberry Pi Hands On Lab Last year we ran a very successful hands on lab for developers at Devoxx in Antwerp.  The concept was to have 40 people in a room, give them all a Raspberry Pi, cables and a pre-configured SD card and get them to build cool JavaFX apps.  One of the things I had to do was organise all the equipment and make a suitable image for the SD card.  As this was before Oracle had announced the early access of JDK8 for the Raspberry Pi with hard float support we used the soft float of Java SE embedded version 7 for ARMv6 and a non-production build of JavaFX.  As we're repeating this lab at JFokus in a couple of weeks I thought it might be useful to write up how I built the SD image as there may well be people who want to run something similar.

Hardware Setup

To simplify matters from a hardware perspective (and to make the lab economically viable) we decided not to provide attendees with monitors, keyboards and mice.  All interaction with the Pi would need to be via the network connection.  To eliminate the need for two power outlets per attendee we also decided to use USB Y power cables that can draw power from two USB ports on the attendee's laptop.  Since USB ports are rated at 500mA two would give us more than the minimum 700mA required for the Pi (as a side note I've found that you can happily boot a Pi from one USB port on a MacBook Pro - although that is without any USB peripherals attached to the Pi). 

With no monitor or USB keyboard/mouse all interaction would be via the network connection.  Again, to simplify the infrastructure we provided all attendees with an ethernet cross-over cable.  One end is connected to the ethernet port on the attendee's laptop, the other to the ethernet port on the Raspberry Pi.

The hardware setup is shown in the diagram below:
Machine setup

Software Setup

For this part I'll describe the setup necessary for the Raspbian distro so we can use the new JDK8 EA build.  One issue with this is that the JavaFX libraries included no longer support rendering via X11.  Since the ARM port of JavaFX is aimed at embedded devices like parking meters and point-of-sale devices we don't expect these to use an X based desktop underneath.  Now that rendering is only supported directly to the framebuffer (which gives us significantly better performance), projecting the JavaFX applications back to the attendee's laptop via VNC will no longer work.  Although there is a package called fbvnc this will not work, as the rendering on the Pi does not use areas of memory that are accessible this way.

Here is a step-by-step guide:
  1. Install the Raspbian distro on an SD card.  I use 4GB SanDisk class 4 cards which provide enough space, work with the Pi and are cheap.  When you need to replicate a significant number of cards, smaller is quicker.  To install the distro either use DiskImager (on Windows) or a simple dd command on Linux or Mac (detailed instructions can be found on the Raspberry Pi web site).
  2. Put this in a Pi and boot.  I do this with a monitor and USB keyboard connected to make life simpler.  When the Pi has finished booting you will be presented with a screen as shown:
  1. Move down to expand_rootfs and select this by pressing RETURN.  This will expand the filesystem to fill the available space on the SD card.
  2. Select overclock and accept the notice about potentially reducing the lifetime of your Pi.  Remember: live fast, die young.  Seriously, though, given the cost of the Pi and the fact that the manufacturers will honour the warranty for anything up to a 1GHz clock rate, I think this is pretty safe.  I go for the medium setting of 900MHz.  This has not given me any issues, although you may want to go higher or lower as preferred.
  3. Select SSH.  I think this is enabled by default, but just to make sure select it.
  4. Lastly on this screen select update.  This will update any packages necessary in the Linux distribution.  Obviously for this you will need your Pi connected to a network where it can reach the internet, picking up its settings via DHCP, etc.
  5. Tab to 'Finish', hit RETURN and you will be dropped into a shell.
  6. Being an old school UNIX hacker I really don't like sudo, so the first thing I do is sudo bash and then set a password for root so I can su whenever I need to.
  7. There is a user account, pi, that is created by default.  For our labs I create a separate account for attendees to login as.  Use something like useradd -u 1024 -d /home/lab -m -s /bin/bash lab.  Remember to set the user's password: passwd lab.
  8. Since we want things to be as simple as possible we set up a DHCP server on the Pi.  Before we do that we need the Pi to use a static IP address.  Edit the /etc/network/interfaces file and change

     iface eth0 inet dhcp

     to

     iface eth0 inet static


    In these settings I've used a class A private network which is the same as the one I use in my office.  This makes things easy, as I can also configure the gateway so that the Pi can access the internet which will be required for the next stages.  If you are using a class C private network (like 192.168.0.X) you will need to change this accordingly.
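Put concretely, the static stanza in /etc/network/interfaces looks something like this (the addresses here are illustrative examples for a class A private network; substitute your own):

```
iface eth0 inet static
    address
    netmask
    gateway
```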

    At this point I reboot the machine with the monitor and keyboard disconnected and switch to doing everything over SSH.

  10. Login over the network using SSH (use either the lab account or the pre-installed pi one) and su to root. 
  11. Install the DHCP server package, apt-get install isc-dhcp-server
  12. Configure DHCP by editing the /etc/dhcp/dhcpd.conf file.  Under the comment line 
     # This is a very basic subnet declaration.

    subnet netmask {

    This will provide an IP address in the range from 164 to 170. Having seven available addresses is a bit of overkill, but gives us some flexibility (change your IP addresses as necessary).  In addition you must comment out these two lines at the start of the file:

    option domain-name "";
    option domain-name-servers,;

This was one of the things that changed between the soft float Wheezy distro and the hard float Raspbian distro.  It took me ages to figure out why the DHCP server would not work properly on Raspbian.  When I used the soft float distro all I needed to do was add the subnet and range definition.  On Raspbian the DHCP server refused to serve IP addresses even though the log messages seemed to indicate that it was fine.  After I did a diff on the dhcpd.conf files from both I noticed the two lines that had been uncommented.  I commented them out again and everything worked fine.
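For reference, a working subnet declaration of that shape looks something like this (the addresses are illustrative, matching the 10.0.0.X network and the 164 to 170 range described above; adjust as necessary):

```
subnet netmask {
  range;
}
```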
For our lab the attendees wrote the code on their laptops using the NetBeans IDE and then transferred the project across to the Pi to run.  To make life as easy as possible the Pi is configured to support multiple ways of getting files onto it: FTP, NFS and Samba.
  1. Install the FTP server package, apt-get install proftpd-basic.  Although it would seem logical to want to run this from inetd, choose the standalone option as this actually works better and gets started, quite happily, at boot time.
  2. Configure the FTP server by editing the /etc/proftpd/proftpd.conf file.  This is not strictly necessary, but if you want to be able to use anonymous ftp then uncomment the sizeable section that starts with the comment,

    # A basic anonymous configuration, no upload directories.

  3. Install the necessary packages for NFS server support, apt-get install nfs-kernel-server nfs-common
  4. Edit the /etc/exports file to add the user home directory,

    /home/lab     10.0.0.*(rw,sync,no_subtree_check)
At this point you would think, like I did, that rebooting the machine would give you a functioning NFS server.  In fact on the soft float Wheezy distro this is exactly what happened.  As with DHCP there is some weirdness in terms of changes that were made between the soft float Wheezy distro and the Raspbian one.  With Raspbian if you use the showmount -e command, either locally or remotely you get the somewhat cryptic error message, clnt_create: RPC: Port mapper failure - RPC: Unable to receive.

I'm sure with hindsight I should have been able to solve this quicker, but having had it working fine on Wheezy I just couldn't figure out why the same thing didn't work on Raspbian.  Eventually, after much Googling and head scratching, I determined that it was down to the RPC bind daemon not being started at boot time.  Some kind and thoughtful person decided that RPC didn't need to run at boot time.  Rather than leaving the package out so that when it's needed it gets installed and correctly configured they just moved the links from /etc/rc2.d and /etc/rc3.d from being S (for start) to K (for kill), so it doesn't start.
  1. Make the RPC bind daemon start at boot time by running update-rc.d rpcbind enable (as root)
  2. Install the Samba packages with apt-get install libcups2 samba samba-common
  3. Configure samba.  Edit the /etc/samba/smb.conf file and add the following at the end of the file:

    [lab]
    comment = Raspberry Pi Java Lab
    path = /home/lab
    writable = yes
    guest ok = yes
If you want the attendees to be able to project the desktop of the Pi to their laptops then you will need VNC.
  1.  Install the VNC server, apt-get install tightvncserver
  2. As the lab user, set a password for the VNC server with tightvncpasswd.  When doing this you can set different passwords for a fully interactive session and a view only one.
  3. Run tightvncserver :1 to generate all the necessary configuration files.  You will now be able to access the Raspberry Pi desktop remotely using a VNC client (I use [the bizarrely named] Chicken of the VNC on the Mac, RealVNC on Windows and xtightvncviewer on Linux).
  4. In order for the VNC server to start up whenever the system boots a script is required in the /etc/init.d directory.  I call it tightvncserver, for which the code is:

    #!/bin/sh -e
    # Start/stop VNC server

    ### BEGIN INIT INFO
    # Provides:          tightvncserver
    # Required-Start:    $network $local_fs
    # Required-Stop:     $network $local_fs
    # Default-Start:     2 3 4 5
    # Default-Stop:      0 1 6
    # Short-Description: tightvncserver remote X session projection
    # Description:       tightvncserver allows VNC clients to connect to
    #                    this machine and project the X desktop to the
    #                    remote machine.
    ### END INIT INFO

    . /lib/lsb/init-functions

    # Carry out specific functions when asked to by the system
    case "$1" in
      start)
        echo Starting tightVNC server
        su lab -c 'tightvncserver :1 > /tmp/vnclog 2>&1'
        ;;
      stop)
        echo Stopping tightVNC server
        su lab -c 'tightvncserver -kill :1'
        ;;
      restart)
        echo Restarting vncserver
        $0 stop
        $0 start
        ;;
      *)
        echo "Usage: /etc/init.d/vncserver {start|stop|restart}"
        exit 1
        ;;
    esac

    exit 0

    Make sure that this script has execute permission.  To create the necessary links into the /etc/rc*.d directories run update-rc.d tightvncserver defaults.  Note that this provides the desktop of the 'lab' user.  If you want to support a different user change the name.  More users can be supported by creating additional servers running on screens other than :1.
To avoid having to provide printed instructions for the lab or distribute files on a CD or memory stick I also configure Apache on the Pi so that once the Pi is connected to the attendee's laptop they can simply open a web page and have whatever instructions and software available from there.
  1. Install Apache, apt-get install apache2
  2. Create your HTML content and put it in /var/www
  3. Finally install the Java runtime.  I put it in /opt and set the PATH environment variable in the user's .bashrc file.
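For that last step, the lines added to the user's .bashrc would be something like the following (the JDK directory name is illustrative; use the path of whichever runtime you unpacked into /opt):

```
export JAVA_HOME=/opt/jdk1.8.0
export PATH=$JAVA_HOME/bin:$PATH
```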

Tuesday Oct 16, 2012

Mind Reading with the Raspberry Pi

Mind Reading With The Raspberry Pi At JavaOne in San Francisco I did a session entitled "Do You Like Coffee with Your Dessert? Java and the Raspberry Pi".  As part of this I showed some demonstrations of things I'd done using Java on the Raspberry Pi.  This is the first part of a series of blog entries that will cover all the different aspects of these demonstrations.

A while ago I had bought a MindWave headset from Neurosky.  I was particularly interested to see how this worked as I had had the opportunity to visit Neurosky several years ago when they were still developing this technology.  At that time the 'headset' consisted of a headband (very much in the Bjorn Borg style) with a sensor attached and some wiring that clearly wasn't quite production ready.  The commercial version is very simple and easy to use: there are two sensors, one which rests on the skin of your forehead, the other is a small clip that attaches to your earlobe.

Neurosky product image 1 Neurosky product image 2

Typical EEG systems used in hospitals require lots of sensors, all of which need copious amounts of conductive gel to ensure the electrical signals are picked up.  Part of Neurosky's innovation is the development of this simple dry-sensor technology.  Having put on the sensor and turned it on (it is powered by a single AAA battery) it collects data and transmits it to a USB dongle plugged into a PC, or in my case a Raspberry Pi.

From a hacking perspective the USB dongle is ideal because it does not require any special drivers for any complex, low level USB communication.  Instead it appears as a simple serial device, which on the Raspberry Pi is accessed as /dev/ttyUSB0.  Neurosky have published details of the command protocol.  In addition, the MindSet protocol document, including sample code for parsing the data from the headset, can be found here.

To get everything working on the Raspberry Pi using Java the first thing was to get serial communications going.  Back in the dim distant past there was the Java Comm API.  Sadly this has grown a bit dusty over the years, but there is a more modern open source project that provides compatible and enhanced functionality, RXTXComm.  This can be installed easily on the Pi using sudo apt-get install librxtx-java.

Next I wrote a library that would send commands to the MindWave headset via the serial port dongle and read back data being sent from the headset.  The design is pretty simple: I used an event-based system so that code using the library could register listeners for different types of events from the headset.  You can download a complete NetBeans project for this here.  This includes javadoc API documentation that should make it obvious how to use it (incidentally, this will work on platforms other than Linux.  I've tested it on Windows without any issues, just by changing the device name to something like COM4).
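The event-based design is the classic listener pattern; stripped right down it looks like the sketch below (the interface and class names here are invented for illustration and are not the actual API of the library):

```java
import java.util.ArrayList;
import java.util.List;

/* Hypothetical names for illustration - not the real library API */
interface HeadsetListener {
    void attentionReceived(int value);
}

class HeadsetReader {
    private final List<HeadsetListener> listeners = new ArrayList<>();

    public void addListener(HeadsetListener l) {
        listeners.add(l);
    }

    /* In the real library this would be driven by data arriving on the
     * serial port; here we just fire an event directly */
    public void fireAttention(int value) {
        for (HeadsetListener l : listeners)
            l.attentionReceived(value);
    }
}

public class ListenerDemo {
    public static void main(String[] args) {
        HeadsetReader reader = new HeadsetReader();
        reader.addListener(v -> System.out.println("Attention: " + v));
        reader.fireAttention(57);
    }
}
```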

To test this I wrote a simple application that would connect to the headset and then print the attention and meditation values as they were received from the headset.  Again, you can download the NetBeans project for that here.

Oracle recently released a developer preview of JavaFX on ARM which will run on the Raspberry Pi.  I thought it would be cool to write a graphical front end for the MindWave data that could take advantage of the built in charts of JavaFX.  Yet another NetBeans project is available here.  Screen shots of the app, which uses a very nice dial from the JFxtras project, are shown below.

JavaFX Mind Reader

JavaFX Mind Reader

I probably should add labels for the EEG data so the user knows which trace is the low alpha, which the mid gamma, and so on.  Given that I'm not a neurologist I suspect that it won't increase my understanding of what the (rather random looking) traces mean.

In the next blog I'll explain how I connected a LEGO motor to the GPIO pins on the Raspberry Pi and then used my mind to control the motor!

Monday Sep 17, 2012

Two Weeks To Go, Still Time to Register

Yes, it's now only two weeks to the start of the 17th JavaOne conference!

This will be my ninth JavaOne; I came fairly late to this event, attending for the first time in 2002.  Since then I've missed two conferences: 2006 for the birth of my son (a reasonable excuse I think) and 2010 for reasons we'll not go into here.  I have quite the collection of show devices; I've still got the WoWee robot, the HTC phone for JavaFX, the programmable pen and the Sharp Zaurus.  The only one I didn't keep was the homePod music player (I wonder why?)

JavaOne is a special conference for many reasons, some of which I list here:

  • A great opportunity to catch up on the latest changes in the Java world.  This is not just in terms of the platform, but as much about what people are doing with Java to build new and cool applications.
  • A chance to meet people.  We have these things called BoFs, which stands for "Birds of a Feather", as in "Birds of a feather, flock together".  The idea being to have sessions where people who are interested in the same topic don't just get to listen to a presentation, but get to talk about it.  These sessions are great, but I find that JavaOne is as much about the people I meet in the corridors and the discussions I have there as it is about the sessions I get to attend.
  • Think outside the box.  There are a lot of sessions at JavaOne covering the full gamut of Java technologies and applications.  Clearly going to sessions that relate to your area of interest is great, but attending some of the more esoteric sessions can often spark thoughts and stimulate the imagination to go off and do new and exciting things once you get back.
  • Get the lowdown from the Java community.  Java is as much about community as anything else and there are plenty of events where you can get involved.  The GlassFish party is always popular and for Java Champions and JUG leaders there's a couple of special events too.
  • Not just all hard work.  Oracle knows how to throw a party and the appreciation event will be a great opportunity to mingle with peers in a more relaxed environment.  This year Pearl Jam and Kings of Leon will be playing live.  Add free beer and what more could you want?

So there you have it.  Just a few reasons for why you want to attend JavaOne this year.  Oh, and of course I'll be presenting three sessions which is even more reason to go.  As usual I've gone for some mainstream ("Custom Charts" for JavaFX) and some more 'out there' ("Java and the Raspberry Pi" and "Gestural Interfaces for JavaFX").  Once again I'll be providing plenty of demos so more than half my luggage this year will consist of a Kinect, robot arm, Raspberry Pis, gamepad and even an EEG sensor.

If you're a student there's one even more attractive reason for going to JavaOne: It's Free!

Registration is here.  Hope to see you there!

Thursday Aug 16, 2012

JavaFX Interface For Power Control

Power Control JavaFX Interface
Having completed the construction of my power control system I've finally found time to build the software interface using the Arduino board I included in it.

First off I needed some code on the Arduino that would listen for commands coming via the USB connection and then take the appropriate action.  Since all that is required is to set one of two pins either high or low the protocol is trivial.  The C code for the Arduino is shown below:
#define SOCKET_PIN_1 3
#define SOCKET_PIN_2 2
#define SOCKET_1_ON 65
#define SOCKET_1_OFF 97
#define SOCKET_2_ON 66
#define SOCKET_2_OFF 98

void setup() {
  /* Initialise the serial port and configure the output pins */
  Serial.begin(9600);
  pinMode(SOCKET_PIN_1, OUTPUT);
  pinMode(SOCKET_PIN_2, OUTPUT);

void loop() {
  int incomingByte = 0;
  /* Wait for control command from the PC */
  if (Serial.available() > 0) {
    // read the incoming byte:
    incomingByte =;
    switch (incomingByte) {
      case SOCKET_1_ON:
        digitalWrite(SOCKET_PIN_1, HIGH);
        Serial.println("Socket 1: ON");
      case SOCKET_1_OFF:
        digitalWrite(SOCKET_PIN_1, LOW);
        Serial.println("Socket 1: OFF");
      case SOCKET_2_ON:
        digitalWrite(SOCKET_PIN_2, HIGH);
        Serial.println("Socket 2: ON");
      case SOCKET_2_OFF:
        digitalWrite(SOCKET_PIN_2, LOW);
        Serial.println("Socket 2: OFF");
All this does is initialise the serial port to work at 9600 baud and configure pins 2 and 3 as outputs.  Why use pins 2 and 3 and not 0 and 1, I hear you ask?  The answer is that pins 0 and 1 are also used for accessing the UART of the Arduino.  Once programmed and up and running this is no problem, but if you have an application running that is using these pins you can't then upload a program to the Arduino through the USB port.  This caused me some problems to start with until I found a blog reference to this elsewhere.  To make life easier I switched to using pins 2 and 3.

The loop function looks for bytes being sent via the USB serial connection and takes the appropriate action in setting the pins high or low.  To keep things simple I used 'a' and 'A' for socket 1 and 'b' and 'B' for socket 2.  Lower case sets the pins low (turning the socket off) and upper case sets the pin high (turning the socket on).  To test this all you need to do is use the Serial Monitor in the Arduino IDE and type the appropriate character.

Next we need some way of sending the appropriate bytes from the controlling PC.  Java has long had the JavaComm API, which covers all things serial and parallel.  The PC I'm using for the UI is running Ubuntu Linux, so I used the available librxtx-java package.  This has a rather frustrating limitation, that I would describe as a bug.  Plugging the Arduino USB into my machine automatically creates me a device to use to access this, which is what we need.  In this case the device is /dev/ttyACM0.  The problem is that librxtx-java will only recognise serial ports of the form /dev/ttyS{number}.  To get round this I created a symbolic link from /dev/ttyACM0 to /dev/ttyS4 (since I actually have physical serial ports on my machine using ttyS0 to ttyS3).  The big drawback to this is that when the machine is rebooted the OS very thoughtfully removes my symbolic link.  At some point I need to try and figure out if there is a way through udev to make this work properly.

The code below shows part of the class I created to handle communication with the Arduino through the serial port:
 public ArduinoComms(String portName) throws ArduinoCommsException {
   debug("AC: opening port: " + portName);
   CommPortIdentifier portIdentifier = null;
   CommPort commPort = null;

   try {
     portIdentifier = CommPortIdentifier.getPortIdentifier(portName);
     debug("AC: Got portIdentifier");
   } catch (NoSuchPortException ex) {
     debug("AC: getPortIdentifier failed");
     throw new ArduinoCommsException(ex.getMessage());
   }

   if (portIdentifier.isCurrentlyOwned())
     throw new ArduinoCommsException("Error: Port is currently in use");
   else {
     try {
       commPort =, 2000);
       debug("AC: Opened port");
       if (commPort instanceof SerialPort) {
         SerialPort serialPort = (SerialPort) commPort;
         serialPort.setSerialPortParams(9600, SerialPort.DATABITS_8,
             SerialPort.STOPBITS_1, SerialPort.PARITY_NONE);
         debug("AC: Set parameters");

         in = serialPort.getInputStream();
         out = serialPort.getOutputStream();
         debug("AC: Got input/output streams");
       } else {
         System.out.println("ERROR: Not recognised as a serial port!");
         throw new ArduinoCommsException(portName);
       }
     } catch (PortInUseException |
              UnsupportedCommOperationException |
              IOException ex) {
       throw new ArduinoCommsException(ex.getMessage());
     }
   }
 }
Passing /dev/ttyS4 to this constructor provides the application with an InputStream and OutputStream to communicate with the Arduino.  To simplify things further I subclassed my Arduino communications class to make it specific to my power controller, adding some useful methods shown below:
  /**
   * Turn socket one on
   * @throws IOException If this fails
   */
  public void socketOneOn() throws IOException {
    debug("PC: socketOneOn");
    if (out == null)
      throw new IOException("Output stream is null!");
    out.write('1');  /* command byte is illustrative; send whatever your Arduino sketch expects */
    out.flush();
  }

  /**
   * Turn socket one off
   * @throws IOException If this fails
   */
  public void socketOneOff() throws IOException {
    debug("PC: socketOneOff");
    if (out == null)
      throw new IOException("Output stream is null!");
    out.write('2');
    out.flush();
  }

  /**
   * Turn socket two on
   * @throws IOException If this fails
   */
  public void socketTwoOn() throws IOException {
    debug("PC: socketTwoOn");
    if (out == null)
      throw new IOException("Output stream is null!");
    out.write('3');
    out.flush();
  }

  /**
   * Turn socket two off
   * @throws IOException If this fails
   */
  public void socketTwoOff() throws IOException {
    debug("PC: socketTwoOff");
    if (out == null)
      throw new IOException("Output stream is null!");
    out.write('4');
    out.flush();
  }
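The pattern behind these methods, one command byte written to the serial stream per action, can be exercised without any hardware by pointing the same code at an in-memory stream.  In this sketch the class name and command byte are illustrative, not the actual bytes my Arduino sketch uses:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Hardware-free sketch of the one-byte-per-command protocol: the socket
// method writes a single command byte to whatever OutputStream it was
// given. Here that stream is an in-memory buffer rather than a serial port.
public class PowerCommandSketch {
    private final OutputStream out;

    PowerCommandSketch(OutputStream out) { this.out = out; }

    void socketOneOn() throws IOException {
        if (out == null)
            throw new IOException("Output stream is null!");
        out.write('1');  // illustrative command byte
        out.flush();
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        PowerCommandSketch pc = new PowerCommandSketch(buffer);
        pc.socketOneOn();
        System.out.println(buffer.toString());  // the byte that would go over the wire
    }
}
```

Swapping the buffer for the stream returned by serialPort.getOutputStream() gives the real behaviour, which is what the subclass above does.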

All that is required now is a user interface to provide a way of sending the appropriate character when the user wants to change the power state.  I borrowed some button graphics from Jasper's JavaOne Kinect demo last year and a nice background I found here.  The result is shown below:
screen shot 1

screen shot 2

screen shot 3

The code for the JavaFX part is shown below:

    /* Background */
    URL resourceURL = PowerUI.class.getResource("resources/background.png");
    Image backgroundImage = new Image(resourceURL.toExternalForm());
    ImageView background = new ImageView(backgroundImage);

    /* Images for switches */
    resourceURL = PowerUI.class.getResource("resources/power-off.png");
    Image powerOffImage = new Image(resourceURL.toExternalForm());
    resourceURL = PowerUI.class.getResource("resources/power-on.png");
    Image powerOnImage = new Image(resourceURL.toExternalForm());

    final ImageView powerOffSocketA = new ImageView(powerOffImage);
    final ImageView powerOnSocketA = new ImageView(powerOnImage);
    final ImageView powerOffSocketB = new ImageView(powerOffImage);
    final ImageView powerOnSocketB = new ImageView(powerOnImage);

    Font f = new Font(18);

    /* Label and control for the first socket (layout values are illustrative) */
    Rectangle r = RectangleBuilder.create().
        width(120).height(40).
        arcWidth(10).arcHeight(10).
        fill(Color.DARKGRAY).build();
    Text socketALabel = TextBuilder.create().
        text("POWER 1").
        font(f).build();
    Group labelA = GroupBuilder.create().
        children(r, socketALabel).build();

    powerOffSocketA.setOnMouseClicked(new EventHandler<MouseEvent>() {
      @Override
      public void handle(MouseEvent t) {
        try {
          debug("PUI: Socket 1 ON");
          power.socketOneOn();             /* power is the serial controller instance */
          powerOffSocketA.setVisible(false);
          powerOnSocketA.setVisible(true);
        } catch (IOException ex) {
          System.out.println("ERROR: " + ex.getMessage());
        }
      }
    });

    powerOnSocketA.setOnMouseClicked(new EventHandler<MouseEvent>() {
      @Override
      public void handle(MouseEvent t) {
        try {
          debug("PUI: Socket 1 OFF");
          power.socketOneOff();
          powerOnSocketA.setVisible(false);
          powerOffSocketA.setVisible(true);
        } catch (IOException ex) {
          System.out.println("ERROR: " + ex.getMessage());
        }
      }
    });

    /* Label and control for the second socket */
    r = RectangleBuilder.create().
        width(120).height(40).
        arcWidth(10).arcHeight(10).
        fill(Color.DARKGRAY).build();
    Text socketBLabel = TextBuilder.create().
        text("POWER 2").
        font(f).build();
    Group labelB = GroupBuilder.create().
        children(r, socketBLabel).build();

    powerOffSocketB.setOnMouseClicked(new EventHandler<MouseEvent>() {
      @Override
      public void handle(MouseEvent t) {
        try {
          debug("PUI: Socket 2 ON");
          power.socketTwoOn();
          powerOffSocketB.setVisible(false);
          powerOnSocketB.setVisible(true);
        } catch (IOException ex) {
          System.out.println("ERROR: " + ex.getMessage());
        }
      }
    });

    powerOnSocketB.setOnMouseClicked(new EventHandler<MouseEvent>() {
      @Override
      public void handle(MouseEvent t) {
        try {
          debug("PUI: Socket 2 OFF");
          power.socketTwoOff();
          powerOnSocketB.setVisible(false);
          powerOffSocketB.setVisible(true);
        } catch (IOException ex) {
          System.out.println("ERROR: " + ex.getMessage());
        }
      }
    });
One of the things I've just started really using when developing JavaFX is the Builder classes.  These are great for creating Nodes and setting numerous attributes without having to call each method individually on the object.
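The idea is easy to see in a plain-Java sketch with no JavaFX dependency.  LabelSpec and LabelSpecBuilder are made-up names that mirror the shape of TextBuilder: each attribute method returns the builder so the calls chain, and build() produces the configured object.

```java
// Minimal sketch of the builder idiom used by the JavaFX *Builder classes.
// The types here are hypothetical stand-ins, not part of any real API.
class LabelSpec {
    final String text;
    final double fontSize;
    LabelSpec(String text, double fontSize) {
        this.text = text;
        this.fontSize = fontSize;
    }
}

class LabelSpecBuilder {
    private String text = "";
    private double fontSize = 12;
    static LabelSpecBuilder create() { return new LabelSpecBuilder(); }
    LabelSpecBuilder text(String t) { text = t; return this; }       // returns this, so calls chain
    LabelSpecBuilder fontSize(double s) { fontSize = s; return this; }
    LabelSpec build() { return new LabelSpec(text, fontSize); }
}

public class BuilderSketch {
    public static void main(String[] args) {
        // One fluent expression instead of a declaration plus setter calls.
        LabelSpec label = LabelSpecBuilder.create().
            text("POWER 1").
            fontSize(18).build();
        System.out.println(label.text + ":" + (int) label.fontSize);
    }
}
```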

I guess the next thing is to make this into a simple web service so I can control my Raspberry Pi and Beagle Board from a web browser anywhere in the world.
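For a first cut the JDK's built-in com.sun.net.httpserver would probably do.  Here's a rough sketch: the /power endpoint and its query parameters are my invention, and the handler just echoes the query rather than touching the serial port.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;

// Sketch of a minimal HTTP front end for the power controller.
public class PowerWebService {
    public static void main(String[] args) throws IOException {
        // Port 0 asks the OS for any free port; a real deployment would fix one.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/power", exchange -> {
            // A request like /power?socket=1&state=on would be parsed here and
            // forwarded to the serial controller (omitted in this sketch).
            String response = "OK: " + exchange.getRequestURI().getQuery();
            exchange.sendResponseHeaders(200, response.length());
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(response.getBytes());
            }
        });
        server.start();
        System.out.println("Power web service started on port "
            + server.getAddress().getPort());
        server.stop(0);  // started and stopped immediately so the sketch exits
    }
}
```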

Monday Jul 02, 2012

The Power to Control Power

I'm currently working on a number of projects using embedded Java on the Raspberry Pi and Beagle Board.  These are nice and small, so don't take up much room on my desk as you can see in this picture.

Desktop embedded systems

As you can also see I have power and network connections emerging from under my desk.  One of the (admittedly very minor) drawbacks of these systems is that they have no on/off switch.  Instead you insert or remove the power connector (USB for the RasPi, a barrel connector for the Beagle).  For the Beagle Board this can potentially be an issue; with the micro-SD card located right next to the connector it has been known for people to eject the card when trying to power off the board, which can be quite serious for the hardware. The alternative is obviously to leave the boards plugged in and then disconnect the power from the outlet.  Simple enough, but a picture of underneath my desk shows that this is not the ideal situation either.

Under desk wiring

This made me think that it would be great if I could have some way of controlling a mains voltage outlet using a remote switch or, even better, from software via a USB connector.  A search revealed not much that fit my requirements, and anything that was close seemed very expensive.  Obviously the only way to solve this was to build my own.

Here's my solution.  I decided my system would support both control mechanisms (remote physical switch and USB computer control) and be modular in its design for optimum flexibility.  I did a bit of searching and found a company in Hong Kong that was offering solid state relays for 99p each plus £2.99 shipping, which still made the total price very reasonable.  These would handle up to 380V AC on the output side, so they were more than capable of coping with the UK 240V supply.  The other great thing was that, being solid state, the input would work with a range of 3-32V and required a very low current of 7.5mA at 12V.  For the USB control an Arduino board seemed the obvious low-cost and simple choice.  Given the current requirements of the relay, the Arduino would not need an additional power supply and could be powered just from the USB.

Having secured the relays I popped down to Homebase for a couple of 13A sockets, to RS for a box and an Arduino, and to Maplin for a toggle switch.  The circuit is pretty straightforward, as shown in the diagram (only one output is shown to keep it as simple as possible).  Originally I used a 2-pole toggle switch to select the remote switch or USB control by switching the negative connections of the low voltage side.  Unfortunately, the resistance between the digital pins of the Arduino board was not high enough, so when using one of the remote switches it would turn on both of the outlets.  I changed to a 4-pole switch and isolated both positive and negative connections.

Power switch circuit

IMPORTANT NOTE: If you want to follow my design, please be aware that it requires working with mains voltages.  If you are at all concerned with your ability to do this please consult a qualified electrician to help you.

It was a tight fit, especially getting the Arduino in, but in the end it all worked.  The completed box is shown in the photos.

Controller inside

Connected with USB

The remote switch was pretty simple, just requiring squeezing two rocker switches and a 9V battery into the small RS-supplied box.  I repurposed a standard stereo cable with phono plugs to connect the switch box to the mains outlets: I chopped off one set of plugs and wired it to the rocker switches.  The photo shows the RasPi and the Beagle Board now controllable from the switch box on the desk.

Remote control

I've tested the Arduino side of things and this works fine.  Next I need to write some software to provide an interface for control of the outlets.  I'm thinking a JavaFX GUI would be in keeping with the total overkill style of this project.

Friday Jun 08, 2012

Call For Papers Tips and Tricks

This year's JavaOne session review has just been completed and by now everyone who submitted papers should know whether they were successful or not.  I had the pleasure again this year of leading the review of the 'JavaFX and Rich User Experiences' track.  I thought it would be useful to write up a few comments to help people in future when submitting session proposals, not just for JavaOne, but for any of the many developer conferences that run around the world throughout the year.  This also draws on conversations I recently had with various Java User Group leaders at the Oracle User Group summit in Riga.  Many of these leaders run some of the biggest and most successful Java conferences in Europe.

  • Try to think of a title which will sound interesting.  For example, "Experiences of performance tuning embedded Java for an ARM architecture based single board computer" probably isn't going to get as much attention as "Do you like coffee with your dessert? Java on the Raspberry Pi". 
  • When thinking of the subject and title for your talk try to steer clear of sessions that might be too generic (and so get lost in a group of similar sessions). 
  • Introductory talks are great when the audience is new to a subject, but beware of providing sessions that are too basic when the technology has been around for a while and there are lots of tutorials already available on the web.
  • JavaOne, like many other conferences has a number of fields that need to be filled in when submitting a paper.  Many of these are selected from pull-down lists (like which track the session is applicable to).  Check these lists carefully.  A number of sessions we had needed to be shuffled between tracks when it was thought that the one selected was not appropriate.  We didn't count this against any sessions, but it's always a good idea to try and get the right one from the start, just in case.
  • JavaOne, again like many other conferences, has two fields that describe the session being submitted: abstract and summary.  These are the most critical to a successful submission.  The two fields have different names and that is significant; a frequent mistake people make is to write an abstract for a session and then duplicate it for the summary.  The abstract (at least in the case of JavaOne) is what gets printed in the show guide and is typically what will be used by attendees when deciding what sessions to attend.  This is where you need to sell your session, not just to the reviewers, but also the people who you want in your audience.  Submitting a one line abstract (unless it's a really good one line) is not usually enough to decide whether this is worth investing an hour of conference time.  The abstract typically has a limit of a few hundred characters.  Try to use as many of them as possible to get as much information about your session across.  The summary should be different from the abstract (and don't leave it blank as some people do).  This field is where you can give the reviewers more detail about things like the structure of the talk, possible demonstrations and so on.  As a reviewer I look to this section to help me decide whether the hard-sell of the title and abstract will actually be reflected in the final content.  Try to make this comprehensive, but don't make it excessively long.  When you have to review possibly hundreds of sessions a certain level of conciseness can make life easier for reviewers and help the cause of your session.
  • If you've not made many submissions for talks in the past, or if this is your first, try to give reviewers places to find background on you as a presenter.  Having an active blog and Twitter handle can also help reviewers if they're not sure what your level of expertise is.  Many call-for-papers have places for you to include this type of information. 

It's always good to have new and original presenters and presentations for conferences.  Hopefully these tips will help you be successful when you answer the next call-for-papers.

Tuesday May 29, 2012

Adding a serial console to the Raspberry Pi

I finally found some time to do some more work on the Raspberry Pi.  As part of a much bigger project I'm working on, it looked like I'd need to rebuild the kernel to include and modify some drivers (which will take me way, way back into my past doing UNIX development at AT&T).  Knowing the way these things go I decided to prepare for the worst, i.e. a kernel that fails to boot properly.  To get as much information as possible it's always good to be able to see the boot messages directly and, after a bit of searching, I found that the kernel boot messages are all output to the serial port on the Raspberry Pi.

Serial ports can be a bit misunderstood, since some people tend to use RS-232 and serial communication interchangeably, which can lead to some serious disappointment (and cost).  RS-232 uses ±12V signalling, whereas the UART (Universal Asynchronous Receiver/Transmitter) that generates the bits is either a 3.3V or 5V device (depending on the board design).  To have the UART on the Pi communicate with the serial port on my PC I would need a conversion circuit.  I've used this before, when I did a demo using the WowWee RoboSapien where I had a Sun SPOT connected to the console of the robot via a serial connection.  The necessary circuit is shown below:

MAX3232 Circuit

Note that the 9-pin D-type connector is shown from the front, so you need to remember that when soldering the wires on.

The connections to the Raspberry Pi are made on the GPIO expansion header.  A diagram showing which pins are which can be found here.

Having tested the configuration on a breadboard (always a good idea) I made a small veroboard implementation.  The final result is shown here:

Raspberry Pi Serial Console

To communicate I use good old PuTTY.  The necessary settings are:

  • Speed: 115200 baud
  • Data bits: 8
  • Stop bits: 1
  • Parity: None
  • Flow control: None

Using this I can now see all of the boot messages (and won't lose them if the video were to stop for whatever reason) and I can login over this connection as an alternative to SSH.

Monday Apr 30, 2012

JavaFX on the Raspberry Pi

Over the last few days I've been playing with the Raspberry Pi board (I was lucky enough to secure one of these as part of the work Oracle is doing to ensure that Java runs smoothly on it). 

Initial setup was pretty straightforward, although I did need to check the current rating of the power supply I was using as the board needs 700mA, which is more than a normal USB port or hub supplies.  I used Win32DiskImager to copy the OS image to an SD card and then gparted to resize the partition to use the whole 8GB on the card rather than just 2GB.  Since I don't have a spare monitor I decided to do most things remotely over the network and set up sshd without any problem.

There is an OpenJDK build for ARM, but it doesn't have JIT support, so performance is not optimal.  Oracle provides a commercial implementation, which does have JIT support, so I downloaded and installed this which was painless. (The only thing to note here is I used the vfp version of the JDK).

Next, I figured I'd have a go at building JavaFX from source.  I grabbed a snapshot and set about seeing what needed to be done.  The first problem was that the reported platform didn't match anything in the build description (ARM v. x86).  Being a bit lazy, I cheated.  Rather than configuring a whole new set of build rules I just overrode the architecture setting and forced it to i586.  Since the compiler is ARM, what's the worst that could happen?  This got me started, but then I bumped into issues with some of the native media code compiling with SSE2 optimisations that don't exist on ARM.  A short learning curve later I switched to the ARM NEON equivalent, changed some definitions and got a bit further.  There were a few more issues around missing packages and left-over .o files, and then a stroke of real luck.

Someone in Oracle contacted me to say that we already had an internal build of JavaFX for ARM which had been done for the Beagle Board and pointed me to where I could download it, which I duly did.  I copied over all the necessary bits as well as one of my own, simple JavaFX apps compiled into a jar.  After a few command line mis-starts I had JavaFX up and running on the Raspberry Pi!

Here's a quick video that shows the results:

At the moment this is as far as I've got.  I did try a more complex application, but ran into a problem with a missing library.  Something to track down later.

Getting the remote desktop working was a bit of a challenge, but not because of anything to do with the Raspberry Pi or Java.  Initially I thought I could use ssh -X and then project the X application (JavaFX) back to my Mac so I could do the screen capture.  Although this worked fine when the client machine was Linux, every time I tried it on Mac OS X I got an access error for MIT-SHM.  It seemed the X environment was trying to use shared memory, which, when the client and server are on different machines, is not going to work.  I tried several ways to convince the system not to use the shared memory extensions, but it just wasn't happening.  In the end I gave up and decided the path of least resistance was to use VNC and have the whole desktop visible on the Mac.

Stay tuned for more updates.


A blog covering aspects of Java SE, JavaFX and embedded Java that I find fun and interesting and want to share with other developers. As part of the Developer Outreach team at Oracle I write a lot of demo code and I use this blog to highlight useful tips and techniques I learn along the way.

