Friday Oct 25, 2013

Integrating NetBeans for Raspberry Pi Java Development

The Raspberry Pi is an incredible device for building embedded Java applications but, despite being able to run an IDE on the Pi, doing so really pushes the hardware to its limit.  It's much better to use a PC or laptop to develop the code and then deploy and test on the Pi.  What I thought I'd do in this blog entry was run through the steps necessary to set up NetBeans on a PC for Java code development, with automatic deployment to the Raspberry Pi as part of the build process.

I will assume that your starting point is a Raspberry Pi with an SD card that has one of the latest Raspbian images on it.  This is convenient because these images now include JDK 7 as part of the distro, so there is no need to download and install a separate JDK.  I will also assume that you have installed the JDK and NetBeans on your PC.  These can be downloaded here.

There are numerous approaches you can take to this including mounting the file system from the Raspberry Pi remotely on your development machine.  I tried this and I found that NetBeans got rather upset if the file system disappeared either through network interruption or the Raspberry Pi being turned off.  The following method uses copying over SSH, which will fail more gracefully if the Pi is not responding.

Step 1: Enable SSH on the Raspberry Pi

To run the Java applications you create, you will need to start Java on the Raspberry Pi with the appropriate class name, classpath and parameters.  For non-JavaFX applications you can either do this from the Raspberry Pi desktop or, if you do not have a monitor connected, through a remote command line.  To get the remote command line you need to enable SSH (a secure shell login over the network) and connect using an application like PuTTY.

You can enable SSH when you first boot the Raspberry Pi, as the raspi-config program runs automatically.  You can also run it at any time afterwards by running the command:

sudo raspi-config

This will bring up a menu of options.  Select '8 Advanced Options' and on the next screen select 'A4 SSH'. Select 'Enable' and the task is complete.

Step 2: Configure Raspberry Pi Networking

By default, the Raspbian distribution configures the ethernet connection to use DHCP rather than a static IP address. You can continue to use DHCP if you want but, to avoid having to potentially change settings whenever you reboot the Pi, using a static IP address is simpler. To configure this on the Pi you need to edit the /etc/network/interfaces file. You will need to do this as root using the sudo command, so something like:

sudo vi /etc/network/interfaces

In this file you will see this line:

iface eth0 inet dhcp

This needs to be changed to the following:

iface eth0 inet static
address 10.0.0.2
gateway 10.0.0.254
netmask 255.255.255.0

You will need to change the address, gateway and netmask values to suit your own network.

Step 3: Create a Public-Private Key Pair On Your Development Machine

How you do this will depend on which operating system you are using.

Mac OS X or Linux

Run the command:

ssh-keygen -t rsa

Press ENTER/RETURN to accept the default destination for saving the key. We do not need a passphrase, so simply press ENTER/RETURN for an empty one and once more to confirm. The key will be created in the file .ssh/id_rsa.pub in your home directory. Display the contents of this file using the cat command:

cat ~/.ssh/id_rsa.pub

Open a window, SSH to the Raspberry Pi and log in. Change directory to .ssh and edit the authorized_keys file (don't worry if the file does not exist). Copy and paste the contents of the id_rsa.pub file into the authorized_keys file and save it.

Windows

Since Windows is not a UNIX-derived operating system, it does not include the necessary key-generating software by default. To generate the key I used puttygen.exe, which is available from the same site that provides the PuTTY application, here.
Download this and run it on your Windows machine. Follow the instructions to generate a key. I removed the key comment, but you can leave that if you want. Click "Save private key", confirm that you don't want to use a passphrase and select a filename and location for the key. Copy the public key from the part of the window marked "Public key for pasting into OpenSSH authorized_keys file". Use PuTTY to connect to the Raspberry Pi and log in. Change directory to .ssh and edit the authorized_keys file (don't worry if this does not exist). Paste the key information at the end of this file and save it. Log out and then start PuTTY again.

This time we need to create a saved session using the private key. Type the IP address of the Raspberry Pi into the "Host Name (or IP address)" field and expand "SSH" under the "Connection" category. Select "Auth" (see the screenshot below). Click the "Browse" button under "Private key file for authentication" and select the file you saved from puttygen. Go back to the "Session" category and enter a short name in the saved sessions field, as shown below. Click "Save" to save the session.

Step 4: Test The Configuration

You should now be able to use scp (Mac/Linux) or pscp.exe (Windows) to copy files from your development machine to the Raspberry Pi without needing to authenticate by typing in a password (which means we can automate the process in NetBeans). It's a good idea to test this using something like:

scp /tmp/foo pi@10.0.0.20:/tmp

on Linux or Mac, or

pscp.exe foo pi@raspi:/tmp

on Windows. (Note that on Windows we use the saved session name instead of the IP address or hostname so that the saved private key is picked up.) pscp.exe is another tool available from the creators of PuTTY.

Step 5: Configure the NetBeans Build Script

Start NetBeans and create a new project (or open an existing one that you want to deploy automatically to the Raspberry Pi). Select the Files tab in the explorer window and expand your project. You will see a build.xml file.
Double-click this to edit it. The file will mostly be comments. At the end (but before the closing </project> tag) add a "-post-jar" target, which NetBeans' Ant build runs automatically after the jar file has been built:

<target name="-post-jar">
  <echo level="info" message="Copying dist directory to remote Pi"/>
  <exec executable="scp" dir="${basedir}">
    <arg line="-r"/>
    <arg value="dist"/>
    <arg value="pi@10.0.0.20:NetBeans/CopyTest"/>
  </exec>
</target>
For Windows it will be slightly different:

<target name="-post-jar">
  <echo level="info" message="Copying dist directory to remote Pi"/>
  <exec executable="C:\pi\putty\pscp.exe" dir="${basedir}">
    <arg line="-r"/>
    <arg value="dist"/>
    <arg value="pi@raspi:NetBeans/CopyTest"/>
  </exec>
</target>

You will also need to ensure that pscp.exe is in your PATH (or specify a fully qualified pathname, as here). From now on, when you clean and build the project, the dist directory will automatically be copied to the Raspberry Pi ready for testing.

Friday Oct 04, 2013

JavaOne Afterglow

Last week was the eighteenth JavaOne conference and I thought it would be a good idea to write up my thoughts about how things went. Firstly, thanks to Yoshio Terada for the photos; I didn't bother bringing a camera with me, so it's good to have some pictures to add to the words.

Things kicked off full-throttle on Sunday. We had the Java Champions and JUG leaders breakfast, which was a great way to meet up with a lot of familiar faces and start talking all things Java. At midday the show really started with the Strategy and Technical Keynotes. This was always going to be a tougher job than some years because there was no big shiny ball to reveal to the audience. With the Java EE 7 spec having been finalised a few months ago, and Java SE 8, Java ME 8 and JDK 8 not due until the start of next year, there was not going to be any big announcement. I thought both keynotes worked really well, each focusing on the things most important to Java developers:

Strategy

One of the things that is becoming more and more prominent in many companies' marketing is the Internet of Things (IoT). We've moved from the conventional desktop/laptop environment to much more mobile connected computing with smart phones and tablets. The next wave of the internet is not just billions of people connected, but tens or hundreds of billions of devices connected to the network, all generating data and providing much more precise control of almost any process you can imagine.
This ties into the ideas of Big Data and Cloud Computing, but implementation is certainly not without its challenges. As Peter Utzschneider explained, it's about three Vs: Volume, Velocity and Value. All these devices will create huge volumes of data at very high speed; to avoid being overloaded, these devices will need some sort of processing capability that can filter the useful data from the redundant. The raw data then needs to be turned into useful information that has value. To make this happen will require applications on devices, at gateways and on the back-end servers, all very tightly integrated. This is where Java plays a pivotal role: "write once, run everywhere" becomes essential, and having nine million developers fluent in the language makes it the de facto lingua franca of IoT. There will be lots more information on how this will become a reality, so watch this space.

Technical

How do we make the IoT a reality, technically? Using the game of chess, Mark Reinhold, with the help of people like John Ceccarelli, Jasper Potts and Richard Bair, showed what you could do. Using Java EE on the back end, Java SE and JavaFX on the desktop, and Java ME Embedded and JavaFX on devices, they showed a complete end-to-end demo. This was really impressive, using 3D features from JavaFX 8 (which is included with JDK 8) to make a 3D animated Duke chess board. Jasper also unveiled the "DukePad", a home-made tablet using a Raspberry Pi, touch screen and accelerometer. Although the Raspberry Pi doesn't have earth-shattering CPU performance (about the same level as a mid-1990s Pentium), it does have really quite good GPU performance, so the GUI works really well. The plans are all open sourced and available here. One small but very significant announcement was that Java SE will now be included with the NOOBS and Raspbian Linux distros provided by the Raspberry Pi Foundation (these can be found here).
No more hassle having to download and install the JDK after you've flashed your SD card OS image. The finale was the Raspberry Pi powered chess-playing robot. Really very, very cool. I talked to Jasper about this and he told me each of the chess pieces had been 3D printed and then he had to use acetone to give them a glossy finish (not sure what his wife thought of him spending hours in the kitchen in a gas mask!) The way the robot arm worked was very impressive, as it did not have any positioning data (like a potentiometer connected to each motor) but relied purely on carefully calibrated timings to get the arm to the right place. Having done things like this myself in the past, I know how easy it is for a small error to get magnified into very big mistakes.

Here's some pictures from the keynote: the queue to get in (back at the Moscone for the keynote this year, which was nice); the "DukePad" architecture, with a nice clear perspex case so you can see the innards; the very nice 3D chess set (Maya's obviously a great tool); and the robotic chess player.

After the keynotes it was sessions, hands-on labs, BoFs and parties for the next four days. Here's a few highlights:

• Anything Lambda-related was packed. Good to see that there's lots of interest and people are really keen to use this great new feature. For me, the real power is in the changes to the libraries that use the Stream and related classes. I helped run the Lambda programming hands-on lab. If you're interested, Stuart Marks has posted the materials on his blog.

• My session on the Raspberry Pi JavaFX Carputer went really well. Since I couldn't bring my car with me, I'd made a short video of the system in action. It was one of those rare occasions when I knew that my demo would work! I also managed to get my simulator working while I was at JavaOne, so I was able to show data recorded from a real run being played back on my device. There will be more blog entries to follow on this shortly.
• My other session was on JavaFX with the Leap Motion controller. Thankfully for this I had the expert help of Gerrit Grunwald, Johan Vos and José Pereda, who came with some great demos to complement my rather basic ones. During the week I was lucky enough to visit Leap Motion, who are based in San Francisco, and talk about some of the great stuff they're doing to make the controller even better.

• The Java leaders' visit to the baseball game was fun (unless you're a Giants fan). Not totally convinced about baseball but, compared to cricket, it's actually quite a fast-paced game.

• I didn't go to the appreciation event this year on Treasure Island. The idea of queuing for a bus for an hour to get a free burger and beer and listen to Maroon 5 was less appealing than a quiet dinner with my colleagues (and a bit of a break from the non-stop Java action).

On the last day it was the Community Keynote, which was the highlight of the week for me (watch it here). It's always great to celebrate the way that community makes Java different to other programming languages, but this year the organisers excelled themselves. James Gosling was back at JavaOne again, talking more about Liquid Robotics and showing some of the exciting things he's doing with JavaFX. My favourite quote was when he came on stage and said, "I guess NetBeans is the new PowerPoint", a reference to the fact that most of the presenters had eschewed slides in favour of code, and they'd all used NetBeans (if you haven't done so yet, you should really try out the 7.4 release candidate). Stephen Chin was also on stage to show a LEGO robot Duke that he'd built, which worked as a Segway using the recently released Mindstorms EV3 kit and the Java SE environment. One sad piece of news related to this is that later in the day Stephen, Angela Caicedo and I went for a cup of coffee and Angela's car was broken into. All that was stolen was a bag containing Duke.
If you see him in a downtown San Francisco dumpster, make sure he gets home. By far the best part of the keynote was when my good friend Arun Gupta's son, Aditya, got up on stage and showed over 1,500 people how to hack Minecraft. As a presenter he was flawless: he seemed confident, his demos worked and he presented the concepts clearly. Hard enough for a seasoned presenter, but consider that Aditya is only TEN YEARS OLD! There's no way I could have done that at his age. He deserves major respect for this, which is probably why he got a standing ovation when he finished. You can watch the video of his performance here.

So that was JavaOne 2013. Another great event, and it will be even harder to top next year. One challenge I have taken away from this is that my son, Dylan, is only 7 years old. I have less than three years to get him on stage talking about Java during the keynote at JavaOne!

Sunday Aug 04, 2013

The Raspberry Pi JavaFX In-Car System (Part 4)

It's been a while since my last blog entry about my in-car system, which has been due to a number of other things taking priority. The good news is I now have more to report in terms of progress. The first thing is that I decided to extend the scope of my project in terms of integrating with my vehicle. Originally, I had planned to add a 7" touch screen somewhere that was visible whilst driving. Given the attention to detail that Audi's designers have taken over the interior, this was not going to be simple. The company I had originally ordered the touchscreen from ran into production problems and, after several months, admitted that delivery of the screen would not be for "some time". Since I needed this for JavaOne in September, I cancelled the order and started looking for a replacement.
eBay is a great place to find items like this, and I found a screen being marketed for the Raspberry Pi which was a "double DIN" fitting (which actually means it is twice the height of the ISO 7736 standard). Some more searching on eBay turned up a bezel that would enable me to replace the existing navigation/entertainment system in my car with my new, Raspberry Pi powered one. (Given how much functionality the existing system has, I don't see this as a long-term replacement; it's more for experimentation.)

Having received my screen, I decided that for development and testing it would be better if I did not need to keep changing the centre console, so I set about making the screen/Pi combination easier to use standalone. Unfortunately, I couldn't find the perfect sized box at RS, but I got one that could be adapted to my needs (the problem was it was too shallow, so I added some longer bolts and spacers). First up was to fit the screen into the top of the box, as shown in the pictures. I was happy that my project already required the use of some wood, as I believe all great software projects should involve some woodwork. To mount the Raspberry Pi I used the two vacant mounting points on the screen and attached a small perspex sheet to act as a platform for the Pi. Getting the holes in the right position took three attempts, as the positioning of the external cables was a bit tricky given the available space. The Raspberry Pi was then mounted using the bolts shown above with some plastic spacers. The USB cables provided connections for a USB port and SD card reader which are part of the screen bezel. In the end I removed these, as I did not plan to use them and they were taking up too much space. Fitting the HDMI cable was a bit of a challenge: the distance between the HDMI port on the Pi and the one on the screen is about 3cm. The shortest cable I had was 1m!
Using some cable ties and a sharp knife I was able to come up with a workable solution (not exactly pretty, but it works and won't be seen in the finished 'product'). Since I wanted to include an accelerometer, I mounted that on the bottom of the box so it wouldn't move around during development. The final internals are shown below. I added a short ethernet extension lead to simplify cabled network access, so that the WiPi dongle could be left in place, and I ran a USB extension lead from the Pi to simplify switching between the touch screen and an external keyboard. When assembled, I had a pretty nifty looking Raspberry Pi computer. In the next installment I'll cover how I started on the JavaFX part to deliver real-time data on the screen.

Friday Jun 28, 2013

The Raspberry Pi JavaFX In-Car System (Part 3)

Having established communication between a laptop and the ELM327, it's now time to bring in the Raspberry Pi. One of the nice things about the Raspberry Pi is the simplicity of its power supply. All we need is 5V at about 700mA, which in a car is as simple as using a USB cigarette lighter adapter (which is handily rated at 1A). My car has two cigarette lighter sockets (despite being specified with the non-smoking package and therefore no actual cigarette lighter): one in the centre console and one in the rear load area. This was convenient, as my idea is to mount the Raspberry Pi in the back to minimise the disruption to the very clean design of the Audi interior. The first task was to get the Raspberry Pi to communicate using Wi-Fi with the ELM327. Initially I tried a cheap Wi-Fi dongle from Amazon, but I could not get this working with my home Wi-Fi network since it just would not handle the WPA security no matter what I did. I upgraded to a Wi-Pi from Farnell and this works very well. The ELM327 uses ad-hoc networking, which is point-to-point communication.
Rather than using a wireless router, each connecting device has its own assigned IP address (which needs to be on the same subnet) and uses the same ESSID. The settings of the ELM327 are fixed to an IP address of 192.168.0.10 and the ESSID "Wifi327". To configure Raspbian Linux to use these settings we need to modify the /etc/network/interfaces file. After some searching of the web and a few false starts, here's the settings I came up with:

auto lo eth0 wlan0

iface lo inet loopback

iface eth0 inet static
address 10.0.0.13
gateway 10.0.0.254
netmask 255.255.255.0

iface wlan0 inet static
address 192.168.0.1
netmask 255.255.255.0
wireless-essid Wifi327
wireless-mode ad-hoc

After rebooting, iwconfig wlan0 reported that the Wi-Fi settings were correct. However, ifconfig showed no assigned IP address. If I configured the IP address manually using

ifconfig wlan0 192.168.0.1 netmask 255.255.255.0

then everything was fine and I was able to happily ping the IP address of the ELM327. I tried numerous variations on the interfaces file, but nothing I did would get me an IP address on wlan0 when the machine booted. Eventually I decided that this was a pointless thing to spend more time on, so I put a script in /etc/init.d and registered it with update-rc.d. All the script does (currently) is execute the ifconfig line, and now, having installed the telnet package, I am able to telnet to the ELM327 via the Raspberry Pi. Not nice, but it works. Here's a picture of the Raspberry Pi in the car for testing. In the next part we'll look at running the Java code on the Raspberry Pi to collect data from the car systems.

Friday Jun 14, 2013

Java and the Raspberry Pi Camera (Part 1)

I've always liked the idea of computer vision, and on the very long list of things I'd like to spend more time exploring is the OpenCV libraries, which have a handy set of Java bindings.
In the past I've experimented with, and used, some of the other frameworks that are available for image capture in Java, specifically the Java Media Framework (JMF) and the Freedom for Media in Java (FMJ), mostly around the idea of integrating images from a webcam into an application like a security monitoring system. Sadly, JMF has grown a little dusty over time, with the last release being way back in 2002 (you have to be amused when you see that the hardware requirements for this are a 166MHz Pentium processor and 32Mb of RAM). FMJ is a little more modern, but was last updated in 2007.

The Raspberry Pi Foundation recently announced the launch of a camera that plugs into one of the two ribbon cable connectors on the board (as shown below). I thought it would be an interesting idea to see how easy it would be to get this working with a Java or JavaFX application. There are three utilities that are available for testing the camera: raspistill, raspiyuv and raspivid. These allow you to grab a frame or video from the camera and store it in a file. This seemed to be a good starting point for figuring out how to use the camera and get the frame data into a Java application, ideally as a BufferedImage (I decided to start with simple image capture and look at video streams later). I downloaded the code from GitHub and started looking at what it does and how it works. Initially I thought it would make sense to use a toolchain to cross-compile the code on my quad-core Linux box. However, having spent a day working on this and failed to get the code to compile cleanly (even using the download of the Raspberry Pi Foundation's toolchain), I decided that compiling on the Raspberry Pi might be slower, but at least it worked. I also found a useful post from Tasanakorn Phaipool, who had created a couple of sample applications that made use of the camera and linked to the OpenCV libraries.
This provided a good starting point, as it simplified things compared to the raspistill application and enabled me to figure out a relatively simple build environment (I don't have time right now to climb the learning curve required for cmake). Getting the code to compile and run was really quite challenging. I will confess it's been a while since I've done any C coding, but most of the issues I experienced were to do with getting the build process to work correctly. I used an iterative approach to creating a Makefile, simply resolving issues as I found them, gradually adding header file references and libraries until the code compiled cleanly. To use the camera we need the multi-media abstraction layer (MMAL) API. Broadcom have very kindly made this available as source but, documentation-wise, you pretty much have to dig through the source code (there is a big comment at the top of the mmal.h file which is the best documentation I've found so far). Once I'd got the code to compile and link, it still would not run, which puzzled me for quite some time until, by comparing the raspistill executable to the one I'd built, I found that I needed to include libmmal_vc_client.so in the list of libraries to link. (This really does confuse me, because this library is not required to resolve any function references, so the code compiles and links correctly; but without it the necessary camera configuration is not registered and the call to mmal_component_create() will fail.) At this point I have some code that will talk to the camera and display the preview image on the video output (HDMI). Next I need to modify this so it can be used with JNI, and integrate it with a new subclass of ImageInputStream which can then be used to create a BufferedImage in a Java application. One other thing that is interesting is that when I run the simple test program the preview is displayed and, very shortly after, the network stops working (all the LEDs on the Pi except the power light go out).
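As a sketch of where the JNI work is heading (my own illustration, not code from the project, and assuming the camera hands over interleaved 8-bit RGB, one of the formats MMAL can be configured to produce), packing raw frame bytes into a BufferedImage might look like this:

```java
import java.awt.image.BufferedImage;

public class RawFrame {
    // Pack interleaved 8-bit RGB bytes (three bytes per pixel, row-major)
    // into a TYPE_INT_RGB BufferedImage.
    public static BufferedImage toImage(byte[] rgb, int width, int height) {
        BufferedImage img = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int i = (y * width + x) * 3;
                // Mask with 0xFF to undo Java's sign extension of bytes
                int pixel = ((rgb[i] & 0xFF) << 16)
                          | ((rgb[i + 1] & 0xFF) << 8)
                          |  (rgb[i + 2] & 0xFF);
                img.setRGB(x, y, pixel);
            }
        }
        return img;
    }
}
```

The real version would of course pull the byte array across JNI rather than receive it ready-made, but the Java-side conversion stays the same.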
I assume that is a bug somewhere. Fortunately, I have a serial console connected, so I can still access the Pi via PuTTY. I will update my blog as I make more progress on this.

Wednesday Jun 12, 2013

The Raspberry Pi JavaFX In-Car System (Part 2)

In my last post (which was rather further back in time than I had planned) I described the ideas behind my in-car Raspberry Pi JavaFX system. Now it's time to get started on the technical stuff. First, we need a short review of modern car electronics. Things have certainly moved on from my first car, which was a 1971 Mini Clubman. This didn't even have electronics in it (unless you count the radio), as everything was electro-mechanical (anyone remember setting the gap for the points on the distributor?) Today, in Europe at least, things like anti-lock brakes (ABS) and stability control (ESC), which require complex sensors and electronics, are mandated by law. Also, since 2001, all new petrol-driven vehicles have had to be fitted with an EOBD (European On-Board Diagnostics) interface. This conforms to the OBD-II standard, which is where the ELM327 interface from my first blog entry comes in. As a standard, OBD-II mandates some parts while other parts are optional. That way certain basic facilities are guaranteed to be present (mainly those that are related to the measuring of exhaust emission performance), and then each car manufacturer can implement the optional parts that make sense for the vehicle they're building. There are five signal protocols that can be used with the OBD-II interface:

• SAE J1850 PWM (Pulse-width modulation, used by Ford)
• SAE J1850 VPW (Variable pulse-width, used by General Motors)
• ISO 9141-2 (which is a bit like RS-232)
• ISO 14230
• ISO 15765 (also referred to as Controller Area Network, or CAN bus)

You can think of this as the transport layer, which can be changed by the car manufacturer to suit their needs.
The message protocol, which runs over the signal protocol, is defined by the OBD-II standard. The format of these commands is pretty straightforward, requiring a sequence of pairs of hexadecimal digits. The first pair indicates the 'mode' (of which there are 10); the second, and possibly third, pair indicates the 'parameter identification', or PID, being sent. The mode and PID combination defines the command that you are sending to the vehicle. Results are returned as a sequence of bytes that form a string containing pairs of hexadecimal digits encoding the data. For my current vehicle, which is an Audi S3, the protocol is ISO 15765, as the car has multiple CAN buses for communication between the various control units (we'll come back to this in more detail later).

So where to start? The first thing that is necessary is to establish communication between a Java application and the ELM327. One of the great things about using Java for an application like this is that the development can easily be done on a laptop and the production code moved easily to the target hardware. No cross-compilation toolchains needed here, thank you. My ELM327 interface communicates via 802.11 (Wi-Fi). The address of my interface is 192.168.0.10 (which seems pretty common for these devices) and it uses port 35000 for all communication. To test that things were working I set my MacBook to use a static IP address on Wi-Fi and then connected directly to the ELM327, which appeared in the list of available Wi-Fi devices. Having established communication at the IP level, I could then telnet into the ELM327. If you want to start playing with this, it's best to get hold of the documentation, which is really well written and complete. The ELM327 essentially uses two modes of communication:

• AT commands for talking to the interface itself
• OBD commands that conform to the description above
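To make the mode-and-PID encoding concrete, here's a small decoder sketch of my own (an illustration, not code from the car project). Engine RPM is mode 01, PID 0C, and per the standard PID definitions the two data bytes A and B in the response encode ((A * 256) + B) / 4 rpm:

```java
// Sketch: decode engine RPM from a mode 01 / PID 0C OBD-II response string
// such as "41 0C 1A F8". The leading 0x41 is the request mode 0x01 plus the
// 0x40 response offset; the 0x0C echoes the PID.
public class ObdDecode {

    static int engineRpm(String response) {
        String[] bytes = response.trim().split("\\s+");
        if (!bytes[0].equals("41") || !bytes[1].equals("0C"))
            throw new IllegalArgumentException("not a mode 01 / PID 0C response");
        int a = Integer.parseInt(bytes[2], 16);  // first data byte (A)
        int b = Integer.parseInt(bytes[3], 16);  // second data byte (B)
        return ((a * 256) + b) / 4;              // standard PID 0C formula
    }

    public static void main(String[] args) {
        // ((0x1A * 256) + 0xF8) / 4 = 1726 rpm
        System.out.println(engineRpm("41 0C 1A F8"));
    }
}
```

Each PID has its own conversion formula like this, so a conversion layer of this kind sits naturally on top of the raw command handling.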
The ELM327 does all the hard work of converting this to the necessary packet format, adding headers, checksums and so on, as well as unmarshalling the response data. To start with I just used the AT I command, which reports back the version of the interface, and AT RV, which gives the current car battery voltage. These worked fine via telnet, so it was time to start developing the Java code. To keep things simple I wrote a class that would encapsulate the connection to the ELM327. Here's the code that initialises the connection so that we can read and write bytes, as required:

/* Copyright © 2013, Oracle and/or its affiliates. All rights reserved. */
private static final String ELM327_IP_ADDRESS = "192.168.0.10";
private static final int ELM327_IP_PORT = 35000;
private static final byte OBD_RESPONSE = (byte) 0x40;
private static final String CR = "\r";
private static final String LF = "\n";
private static final String CR_LF = "\r\n";
private static final String PROMPT = ">";

private Socket elmSocket;
private OutputStream elmOutput;
private InputStream elmInput;
private boolean debugOn = false;
private int debugLevel = 5;
private byte[] rawResponse = new byte[1024];
protected byte[] responseData = new byte[1024];

/**
 * Common initialisation code
 *
 * @throws IOException If there is a communications problem
 */
private void init() throws IOException {
    /* Establish a socket to the port of the ELM327 box and create
     * input and output streams to it
     */
    try {
        elmSocket = new Socket(ELM327_IP_ADDRESS, ELM327_IP_PORT);
        elmOutput = elmSocket.getOutputStream();
        elmInput = elmSocket.getInputStream();
    } catch (UnknownHostException ex) {
        System.out.println("ELM327: Unknown host, [" + ELM327_IP_ADDRESS + "]");
        System.exit(1);
    } catch (IOException ex) {
        System.out.println("ELM327: IO error talking to car");
        System.out.println(ex.getMessage());
        System.exit(2);
    }

    /* Ensure we have an input and output stream */
    if (elmInput == null || elmOutput == null) {
        System.out.println("ELM327: input or output to device is null");
        System.exit(1);
    }

    /* Lastly, send a reset command and turn character echo off
     * (it's not clear that turning echo off has any effect)
     */
    resetInterface();
    sendATCommand("E0");
    debug("ELM327: Connection established.", 1);
}

Having got a connection, we then need some methods to provide a simple interface for sending commands and getting back the results. Here are the common methods for sending messages:

/**
 * Send an AT command to control the ELM327 interface
 *
 * @param command The command string to send
 * @return The response from the ELM327
 * @throws IOException If there is a communication error
 */
protected String sendATCommand(String command) throws IOException {
    /* Construct the full command string to send. We must remember to
     * include a carriage return (ASCII 0x0D)
     */
    String atCommand = "AT " + command + CR_LF;
    debug("ELM327: Sending AT command [AT " + command + "]", 1);

    /* Send it to the interface */
    elmOutput.write(atCommand.getBytes());
    debug("ELM327: Command sent", 1);
    String response = getResponse();

    /* Delete the command, which may be echoed back */
    response = response.replace("AT " + command, "");
    return response;
}

/**
 * Send an OBD command to the car via the ELM327.
 *
 * @param command The command as a string of hexadecimal values
 * @return The number of bytes returned by the command
 * @throws IOException If there is a problem communicating
 */
protected int sendOBDCommand(String command) throws IOException, ELM327Exception {
    byte[] commandBytes = byteStringToArray(command);

    /* A valid OBD command must be at least two bytes to indicate the mode
     * and then the information request
     */
    if (commandBytes.length < 2)
        throw new ELM327Exception("ELM327: OBD command must be at least 2 bytes");

    byte obdMode = commandBytes[0];

    /* Send the command to the ELM327 */
    debug("ELM327: sendOBDCommand: [" + command + "], mode = " + obdMode, 1);
    elmOutput.write((command + CR_LF).getBytes());
    debug("ELM327: Command sent", 1);

    /* Read the response */
    String response = getResponse();

    /* Remove the original command in case that gets echoed back */
    response = response.replace(command, "");
    debug("ELM327: OBD response = " + response, 1);

    /* If there is NO DATA, there is no data */
    if (response.compareTo("NO DATA") == 0)
        return 0;

    /* Trap error message from CAN bus */
    if (response.compareTo("CAN ERROR") == 0)
        throw new ELM327Exception("ELM327: CAN ERROR detected");

    rawResponse = byteStringToArray(response);
    int responseDataLength = rawResponse.length;

    /* The first byte indicates a response for the request mode and the
     * second byte is a repeat of the PID. We test these to ensure that
     * the response is of the correct format
     */
    if (responseDataLength < 2)
        throw new ELM327Exception("ELM327: Response was too short");

    if (rawResponse[0] != (byte) (obdMode + OBD_RESPONSE))
        throw new ELM327Exception("ELM327: Incorrect response [" +
            String.format("%02X", rawResponse[0]) + " != " +
            String.format("%02X", (byte) (obdMode + OBD_RESPONSE)) + "]");

    if (rawResponse[1] != commandBytes[1])
        throw new ELM327Exception("ELM327: Incorrect command response [" +
            String.format("%02X", rawResponse[1]) + " != " +
            String.format("%02X", commandBytes[1]) + "]");

    debug("ELM327: byte count = " + responseDataLength, 1);

    for (int i = 0; i < responseDataLength; i++)
        debug(String.format("ELM327: byte %d = %02X", i, rawResponse[i]), 1);

    responseData = Arrays.copyOfRange(rawResponse, 2, responseDataLength);
    return responseDataLength - 2;
}

/**
 * Send an OBD command to the car via the ELM327. Test the length of the
 * response to see if it matches an expected value
 *
 * @param command The command as a string of hexadecimal values
 * @param expectedLength The expected length of the response
 * @return The length of the response
 * @throws IOException If there is a communication error or wrong length
 */
protected int sendOBDCommand(String command, int expectedLength)
        throws IOException, ELM327Exception {
    int responseLength = this.sendOBDCommand(command);

    if (responseLength != expectedLength)
        throw new IOException("ELM327: sendOBDCommand: bad reply length [" +
            responseLength + " != " + expectedLength + "]");

    return responseLength;
}

and the method for reading back the results:

/**
 * Get the response to a command, having first cleaned it up so it only
 * contains the data we're interested in.
 *
 * @return The response data
 * @throws IOException If there is a communications problem
 */
private String getResponse() throws IOException {
    boolean readComplete = false;
    StringBuilder responseBuilder = new StringBuilder();

    /* Read the response.
Sometimes timing issues mean we only get part of * the message in the first read. To ensure we always get all the intended * data (and therefore do not get confused on the the next read) we keep * reading until we see a prompt character in the data. That way we know * we have definitely got all the response. */ while (!readComplete) { int readLength = elmInput.read(rawResponse); debug("ELM327: Response received, length = " + readLength, 1); String data = new String(Arrays.copyOfRange(rawResponse, 0, readLength)); responseBuilder.append(data); /* Check for the prompt */ if (data.contains(PROMPT)) { debug("ELM327: Got a prompt", 1); break; } } /* Strip out newline, carriage return and the prompt */ String response = responseBuilder.toString(); response = response.replace(CR, ""); response = response.replace(LF, ""); response = response.replace(PROMPT, ""); return response; }  Using these methods it becomes pretty simple to implement methods that start to expose the OBD protocol. For example to get the version information about the interface we just need this simple method:  /** * Get the version number of the ELM327 connected * * @return The version number string * @throws IOException If there is a communications problem */ public String getInterfaceVersionNumber() throws IOException { return sendATCommand("I"); }  Another very useful method is one that returns the details about which of the PIDs are supported for a given mode.  /** * Determine which PIDs for OBDII are supported. The OBD standards docs are * required for a fuller explanation of these. 
* * @param pid Determines which range of PIDs support is reported for * @return An array indicating which PIDs are supported * @throws IOException If there is a communication error */ public boolean[] getPIDSupport(byte pid) throws IOException, ELM327Exception { int dataLength = sendOBDCommand("01 " + String.format("%02X", pid)); /* If we get zero bytes back then we assume that there are no * supported PIDs for the requested range */ if (dataLength == 0) return null; int pidCount = dataLength * 8; debug("ELM327: pid count = " + pidCount, 1); boolean[] pidList = new boolean[pidCount]; int p = 0; /* Now decode the bit map of supported PIDs */ for (int i = 2; i < dataLength; i++) for (int j = 0; j < 8; j++) { if ((responseData[i] & (1 << j)) != 0) pidList[p++] = true; else pidList[p++] = false; } return pidList; }  The PIDs 0x00, 0x20, 0x40, 0x60, 0x80, 0xA0 and 0xC0 of mode 1 will report back the supported PIDs for the following 31 values as a four byte bit map. There appear to only be definitions for commands up to 0x87 in the specification I found. In the next part we'll look at how we can start to use this class to get some real data from the car. Thursday Apr 25, 2013 The Raspberry Pi JavaFX In-Car System (Part 1) Raspberry Pi JavaFX Car System (Pt 1) As part of my work on embedded Java I'm always on the look out for new ideas for demos to build that show developers how easy it is to use and how powerful. In some of my recent web surfing I came across an interesting device on eBay that I thought had real potential. It's called an ELM327 OBDII CAN bus diagnostic interface scanner. It is a small box that plugs in to the service port of a modern car and provides an interface that allows software to talk to the Electronic Control Units (ECUs) fitted in your car. The one I bought provides a Wi- Fi link and also includes a USB socket for wired connectivity. 
Similar products are available that provide a Bluetooth interface, but the various opinions I read indicated that these were not as easy to use. Considering it cost a little over £30 I thought it was well worth it for some experimentation. Here's a picture of the device:

And here it is plugged into the service port located near the pedals on my car. The only downside is that the orientation of the socket means that you can't see the status lights when it's plugged in (at least not without a mirror).

My initial thoughts were to look at what kind of data could be extracted from the car and then write some software that would provide a real-time display of things that aren't shown through the existing instrumentation. I thought it would also be fun to record journey data that could be post-analysed in much the way Formula 1 uses masses of telemetry to let the drivers know where they could do better. Since I wanted to use embedded Java the obvious choice of processing unit was the Raspberry Pi. It's cheap, I have a whole bunch of them and it's got plenty of computing power for what I have in mind. It also has some other advantages:

• Low power consumption (easy to run off the 12V cigarette lighter supply)
• Support for JavaFX through some nice touch screens from Chalkboard Electronics (so I can go wild with the interface)
• Easily accessible GPIO pins

The last point got me thinking about what other possibilities there were for my in-car system. Recently my friend and colleague Angela Caicedo did a session at Devoxx UK entitled "Beyond Beauty: JavaFX, Parallax, Touch, Gyroscopes and Much More". Part of this involved connecting a motion sensor to the Raspberry Pi using the I2C interface that is also available. The particular sensor she used is from Sparkfun and uses a very cool single-chip solution from InvenSense, the MPU-9150. This provides 9-axis motion data, which means acceleration and rate of rotation for the X, Y and Z axes, as well as a compass sensor that works regardless of the orientation of the sensor.

Having studied physics at university (a long time ago, in a galaxy far, far away) I vaguely remember that if I combine acceleration data with the mass of the car and things like engine speed I can calculate the horsepower of the engine as well as the torque being generated. Throw that into the mix and this could make a really fun project. As further inspiration I came across this video recently:

There's also an interesting one from Tesla, who use a 17" touch display as their centre console. In the follow-up parts to this blog entry I'll detail how the project evolves.

Monday Jan 21, 2013

Building an SD Card Image For a Raspberry Pi Java Hands On Lab

Last year we ran a very successful hands-on lab for developers at Devoxx in Antwerp. The concept was to have 40 people in a room, give them all a Raspberry Pi, cables and a pre-configured SD card, and get them to build cool JavaFX apps. One of the things I had to do was organise all the equipment and make a suitable image for the SD card. As this was before Oracle had announced the early access of JDK 8 for the Raspberry Pi with hard-float support, we used the soft-float Java SE Embedded version 7 for ARMv6 and a non-production build of JavaFX. As we're repeating this lab at JFokus in a couple of weeks I thought it might be useful to write up how I built the SD image, as there may well be people who want to run something similar.

Hardware Setup

To simplify matters from a hardware perspective (and to make the lab economically viable) we decided not to provide attendees with monitors, keyboards and mice. All interaction with the Pi would need to be via the network connection.
To eliminate the need for two power outlets per attendee we also decided to use USB Y power cables that can draw power from two USB ports on the attendee's laptop. Since USB ports are rated at 500mA, two would give us more than the minimum 700mA required for the Pi (as a side note, I've found that you can happily boot a Pi from one USB port on a MacBook Pro, although that is without any USB peripherals attached to the Pi). With no monitor or USB keyboard/mouse, all interaction would be via the network connection. Again, to simplify the infrastructure, we provided all attendees with an ethernet cross-over cable. One end is connected to the ethernet port on the attendee's laptop, the other to the ethernet port on the Raspberry Pi. The hardware setup is shown in the diagram below:

Software Setup

For this part I'll describe the setup necessary for the Raspbian distro so we can use the new JDK 8 EA build. One issue with this is that the JavaFX libraries included no longer support rendering via X11. Since the ARM port of JavaFX is aimed at embedded devices like parking meters and point-of-sale devices, we don't expect these to use an X-based desktop underneath. Now that rendering is only supported directly to the framebuffer (which gives us significantly better performance), projecting the JavaFX applications back to the attendee's laptop via VNC will no longer work. Although there is a package called fbvnc, this will not work as the rendering on the Pi does not use areas of memory that are accessible this way. Here is a step-by-step guide:

1. Install the Raspbian distro on an SD card. I use 4GB SanDisk class 4 cards, which provide enough space, work with the Pi and are cheap. When you need to replicate a significant number of cards, smaller is quicker. To install the distro either use DiskImager (on Windows) or a simple dd command on Linux or Mac (detailed instructions can be found on the Raspberry Pi web site).

2. Put this in a Pi and boot. I do this with a monitor and USB keyboard connected to make life simpler. When the Pi has finished booting you will be presented with a screen as shown:

3. Move down to expand_rootfs and select this by pressing RETURN. This will expand the filesystem to fill the available space on the SD card.

4. Select overclock and accept the notice about potentially reducing the lifetime of your Pi. Remember: live fast, die young. Seriously, though, given the cost of the Pi and the fact that the manufacturers will honour the warranty for anything up to a 1GHz clock rate, I think this is pretty safe. I go for the medium setting of 900MHz. This has not given me any issues, although you may want to go higher or lower as preferred.

5. Select SSH. I think this is enabled by default, but just to make sure, select it.

6. Lastly on this screen select update. This will update any packages necessary in the Linux distribution. Obviously for this you will need your Pi connected to a network where it can find the internet settings via DHCP, etc.

7. Tab to 'Finish', hit RETURN and you will be dropped into a shell.

8. Being an old-school UNIX hacker I really don't like sudo, so the first thing I do is sudo bash and then set a password for root so I can su whenever I need to.

9. There is a user account, pi, that is created by default. For our labs I create a separate account for attendees to log in as. Use something like useradd -u 1024 -d /home/lab -m -s /bin/bash lab. Remember to set the user's password: passwd lab.

10. Since we want things to be as simple as possible we set up a DHCP server on the Pi. Before we do that we need the Pi to use a static IP address. Edit the /etc/network/interfaces file and change

iface eth0 inet dhcp

to

iface eth0 inet static
address 10.0.0.8
gateway 10.0.0.254
netmask 255.255.255.0

In these settings I've used a class A private network, which is the same as the one I use in my office. This makes things easy, as I can also configure the gateway so that the Pi can access the internet, which will be required for the next stages. If you are using a class C private network (like 192.168.0.X) you will need to change this accordingly. At this point I reboot the machine with the monitor and keyboard disconnected and switch to doing everything over SSH.

11. Login over the network using SSH (use either the lab account or the pre-installed pi one) and su to root.

12. Install the DHCP server package: apt-get install isc-dhcp-server

13. Configure DHCP by editing the /etc/dhcp/dhcpd.conf file. Under the comment line # This is a very basic subnet declaration. add

subnet 10.0.0.0 netmask 255.255.255.0 {
    range 10.0.0.164 10.0.0.170;
}

This will provide an IP address in the range from 164 to 170. Having seven available addresses is a bit of overkill, but gives us some flexibility (change your IP addresses as necessary). In addition you must comment out these two lines at the start of the file:

option domain-name "example.org";
option domain-name-servers ns1.example.org, ns2.example.org;

This was one of the things that changed between the soft-float Wheezy distro and the hard-float Raspbian distro. It took me ages to figure out why the DHCP server would not work properly on Raspbian. When I used the soft-float distro all I needed to do was add the subnet and range definition. On Raspbian the DHCP server refused to serve IP addresses even though the log messages seemed to indicate that it was fine. After I did a diff on the dhcpd.conf files from both I noticed the two lines that had been uncommented. I commented them out again and everything worked fine.

For our lab the attendees wrote the code on their laptops using the NetBeans IDE and then transferred the project across to the Pi to run. To make life as easy as possible, the Pi is configured to support multiple ways of getting files onto it: FTP, NFS and Samba.

1. Install the FTP server package: apt-get install proftpd-basic. Although it would seem logical to want to run this from inetd, choose the standalone option as this actually works better and gets started, quite happily, at boot time.

2. Configure the FTP server by editing the /etc/proftpd/proftpd.conf file. This is not strictly necessary, but if you want to be able to use anonymous FTP then uncomment the sizeable section that starts with the comment # A basic anonymous configuration, no upload directories.

3. Install the necessary packages for NFS server support: apt-get install nfs-kernel-server nfs-common

4. Edit the /etc/exports file to add the user home directory:

/home/lab 10.0.0.*(rw,sync,no_subtree_check)

At this point you would think, like I did, that rebooting the machine would give you a functioning NFS server. In fact, on the soft-float Wheezy distro this is exactly what happened. As with DHCP there is some weirdness in terms of changes that were made between the soft-float Wheezy distro and the Raspbian one. With Raspbian, if you use the showmount -e command, either locally or remotely, you get the somewhat cryptic error message: clnt_create: RPC: Port mapper failure - RPC: Unable to receive. I'm sure with hindsight I should have been able to solve this quicker, but having had it working fine on Wheezy I just couldn't figure out why the same thing didn't work on Raspbian. Eventually, after much Googling and head scratching, I determined that it was down to the RPC bind daemon not being started at boot time. Some kind and thoughtful person decided that RPC didn't need to run at boot time. Rather than leaving the package out, so that when it's needed it gets installed and correctly configured, they just renamed the links in /etc/rc2.d and /etc/rc3.d from S (for start) to K (for kill), so it doesn't start.

5. Make the RPC bind daemon start at boot time by running update-rc.d rpcbind enable (as root).

6. Install the Samba packages with apt-get install libcups2 samba samba-common

7. Configure Samba. Edit the /etc/samba/smb.conf file and add the following at the end of the file:

[lab]
comment = Raspberry Pi Java Lab
path = /home/lab
writable = yes
guest ok = yes

If you want the attendees to be able to project the desktop of the Pi to their laptops then you will need VNC.

1. Install the VNC server: apt-get install tightvncserver

2. As the lab user, set a password for the VNC server with tightvncpasswd. When doing this you can set different passwords for a fully interactive session and a view-only one.

3. Run tightvncserver :1 to generate all the necessary configuration files. You will now be able to access the Raspberry Pi desktop remotely using a VNC client (I use [the bizarrely named] Chicken of the VNC on the Mac, RealVNC on Windows and xtightvncviewer on Linux).

4. In order for the VNC server to start up whenever the system boots, a script is required in the /etc/init.d directory. I call it tightvncserver, for which the code is:

#!/bin/sh -e
#
# Start/stop VNC server
### BEGIN INIT INFO
# Provides:          tightvncserver
# Required-Start:    $network $local_fs
# Required-Stop:     $network $local_fs
# Default-Start:     2 3 4 5
# Default-Stop:      0 1 6
# Short-Description: tightvncserver remote X session projection
# Description:       tightvncserver allows VNC clients to connect to
#                    this machine and project the X desktop to the
#                    remote machine.
### END INIT INFO

. /lib/lsb/init-functions

# Carry out specific functions when asked to by the system
case "$1" in
start)
echo Starting tightVNC server
su lab -c 'tightvncserver :1 > /tmp/vnclog 2>&1'
;;
stop)
echo Stopping tightVNC server
su lab -c 'tightvncserver -kill :1'
;;
restart)
echo Restarting vncserver
$0 stop
$0 start
;;
*)
echo "Usage: /etc/init.d/tightvncserver {start|stop|restart}"
exit 1
;;
esac

exit 0

Make sure that this script has execute permission.  To create the necessary links into the /etc/rc*.d directories run update-rc.d tightvncserver defaults.  Note that this provides the desktop of the 'lab' user.  If you want to support a different user change the name.  More users can be supported by creating additional servers running on screens other than :1.
To avoid having to provide printed instructions for the lab or distribute files on a CD or memory stick I also configure Apache on the Pi so that once the Pi is connected to the attendee's laptop they can simply open a web page and have whatever instructions and software available from there.
1. Install Apache, apt-get install apache2
2. Create your HTML content and put it in /var/www
3. Finally install the Java runtime.  I put it in /opt and set the PATH environment variable in the user's .bashrc file.

Tuesday Oct 16, 2012

Mind Reading with the Raspberry Pi

At JavaOne in San Francisco I did a session entitled "Do You Like Coffee with Your Dessert? Java and the Raspberry Pi".  As part of this I showed some demonstrations of things I'd done using Java on the Raspberry Pi.  This is the first part of a series of blog entries that will cover all the different aspects of these demonstrations.

A while ago I had bought a MindWave headset from Neurosky.  I was particularly interested to see how this worked as I had had the opportunity to visit Neurosky several years ago when they were still developing this technology.  At that time the 'headset' consisted of a headband (very much in the Bjorn Borg style) with a sensor attached and some wiring that clearly wasn't quite production ready.  The commercial version is very simple and easy to use: there are two sensors, one which rests on the skin of your forehead, the other is a small clip that attaches to your earlobe.

Typical hospital EEG setups require lots of sensors, all of which need copious amounts of conductive gel to ensure the electrical signals are picked up.  Part of Neurosky's innovation is the development of this simple dry-sensor technology.  Having put on the sensor and turned it on (it is powered by a single AAA battery) it collects data and transmits it to a USB dongle plugged into a PC, or in my case a Raspberry Pi.

From a hacking perspective the USB dongle is ideal because it does not require any special drivers for any complex, low level USB communication.  Instead it appears as a simple serial device, which on the Raspberry Pi is accessed as /dev/ttyUSB0.  Neurosky have published details of the command protocol.  In addition, the MindSet protocol document, including sample code for parsing the data from the headset, can be found here.
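Skipping ahead slightly, the packet framing that protocol document describes is simple enough to sketch in Java.  The layout below (two 0xAA sync bytes, a payload length, the payload, then an inverted-sum checksum) and the single-byte data codes (0x04 for attention, 0x05 for meditation) are based on my reading of the ThinkGear documentation; treat the details as assumptions, not the code from my library.

```java
import java.util.HashMap;
import java.util.Map;

/** Minimal parser for one ThinkGear-style packet (a sketch, not the full protocol). */
public class ThinkGearPacket {

    /**
     * Parse a packet laid out as: 0xAA 0xAA <plength> <payload...> <checksum>.
     * Returns a map of single-byte data codes (e.g. 0x04 attention,
     * 0x05 meditation) to their values, or null if the packet is invalid.
     */
    public static Map<Integer, Integer> parse(byte[] packet) {
        if (packet.length < 4 ||
            (packet[0] & 0xFF) != 0xAA || (packet[1] & 0xFF) != 0xAA)
            return null;

        int plength = packet[2] & 0xFF;
        if (packet.length != plength + 4)
            return null;

        /* Checksum is the inverted low byte of the payload sum */
        int sum = 0;
        for (int i = 3; i < 3 + plength; i++)
            sum += packet[i] & 0xFF;
        if ((~sum & 0xFF) != (packet[3 + plength] & 0xFF))
            return null;

        /* Walk the payload: codes below 0x80 carry a single value byte;
         * codes of 0x80 and above are followed by a length byte and that
         * many bytes (skipped in this sketch) */
        Map<Integer, Integer> values = new HashMap<>();
        int i = 3;
        while (i < 3 + plength) {
            int code = packet[i++] & 0xFF;
            if (code < 0x80) {
                values.put(code, packet[i++] & 0xFF);
            } else {
                int len = packet[i++] & 0xFF;
                i += len;
            }
        }
        return values;
    }

    public static void main(String[] args) {
        /* Synthetic packet: attention (0x04) = 53, meditation (0x05) = 61 */
        int sum = 0x04 + 53 + 0x05 + 61;
        byte[] packet = { (byte)0xAA, (byte)0xAA, 4,
                          0x04, 53, 0x05, 61, (byte)(~sum & 0xFF) };
        Map<Integer, Integer> v = parse(packet);
        System.out.println("attention=" + v.get(4) + " meditation=" + v.get(5));
    }
}
```

Reading from the serial device then becomes a matter of scanning the byte stream for the two sync bytes and handing each complete packet to a parser like this.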

To get everything working on the Raspberry Pi using Java the first thing was to get serial communications going.  Back in the dim distant past there was the Java Comm API.  Sadly this has grown a bit dusty over the years, but there is a more modern open source project that provides compatible and enhanced functionality, RXTXComm.  This can be installed easily on the Pi using sudo apt-get install librxtx-java

Next I wrote a library that would send commands to the MindWave headset via the serial port dongle and read back data being sent from the headset.  The design is pretty simple: I used an event-based system so that code using the library could register listeners for different types of events from the headset.  You can download a complete NetBeans project for this here.  This includes javadoc API documentation that should make it obvious how to use it (incidentally, this will work on platforms other than Linux.  I've tested it on Windows without any issues, just by changing the device name to something like COM4).
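The listener pattern behind that event-based design can be sketched as follows.  The interface and method names here are hypothetical illustrations, not the actual API of the downloadable project:

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch of an event-based headset library (hypothetical names). */
public class MindWaveEvents {

    /** Listener interface clients implement to receive headset values. */
    public interface MindWaveListener {
        void attentionReceived(int level);
        void meditationReceived(int level);
    }

    private final List<MindWaveListener> listeners = new ArrayList<>();

    public void addListener(MindWaveListener l) {
        listeners.add(l);
    }

    /* In a real library these would be called by the serial-port reader
     * thread as each value is decoded from the headset data stream */
    public void fireAttention(int level) {
        for (MindWaveListener l : listeners)
            l.attentionReceived(level);
    }

    public void fireMeditation(int level) {
        for (MindWaveListener l : listeners)
            l.meditationReceived(level);
    }

    public static void main(String[] args) {
        MindWaveEvents events = new MindWaveEvents();

        /* Client code just registers a listener and reacts to values */
        events.addListener(new MindWaveListener() {
            @Override public void attentionReceived(int level) {
                System.out.println("attention = " + level);
            }
            @Override public void meditationReceived(int level) {
                System.out.println("meditation = " + level);
            }
        });

        events.fireAttention(70);   // prints "attention = 70"
    }
}
```

The advantage of this design is that the serial-port plumbing stays hidden inside the library; client code only ever sees decoded values arriving on its listeners.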

To test this I wrote a simple application that would connect to the headset and then print the attention and meditation values as they were received from the headset.  Again, you can download the NetBeans project for that here.

Oracle recently released a developer preview of JavaFX on ARM which will run on the Raspberry Pi.  I thought it would be cool to write a graphical front end for the MindWave data that could take advantage of JavaFX's built-in charts.  Yet another NetBeans project is available here.  Screen shots of the app, which uses a very nice dial from the JFxtras project, are shown below.

I probably should add labels for the EEG data so the user knows which trace is low alpha, which is mid gamma, and so on.  Given that I'm not a neurologist, I suspect that it won't increase my understanding of what the (rather random looking) traces mean.
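Such labels could come from a simple enum of the eight EEG power bands the headset reports.  The band names follow the ThinkGear documentation; the ordering below is my assumption about how they arrive in the data stream:

```java
/** The eight EEG power bands reported by the headset (assumed order). */
public enum EEGBand {
    DELTA, THETA, LOW_ALPHA, HIGH_ALPHA,
    LOW_BETA, HIGH_BETA, LOW_GAMMA, MID_GAMMA;

    /** Human-readable label, suitable for a chart legend */
    public String label() {
        return name().toLowerCase().replace('_', ' ');
    }

    public static void main(String[] args) {
        for (EEGBand b : values())
            System.out.println(b.ordinal() + ": " + b.label());
    }
}
```

A chart series for band i could then be titled with EEGBand.values()[i].label() rather than left anonymous.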

In the next blog I'll explain how I connected a LEGO motor to the GPIO pins on the Raspberry Pi and then used my mind to control the motor!
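As a taste of what driving the GPIO pins from Java involves, one approach is the Linux sysfs interface, where pins are controlled by writing to files under /sys/class/gpio.  The sketch below is not the code from the next post: the pin number is just an example, and the base directory is injectable so the logic can be tried away from a Pi (on real hardware the kernel, not the program, creates the gpioN directory after the export).

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

/** Minimal sysfs-style GPIO writer; base path injectable for testing off the Pi. */
public class SysfsGpio {
    private final Path base;
    private final int pin;

    public SysfsGpio(Path base, int pin) throws IOException {
        this.base = base;
        this.pin = pin;

        /* Export the pin and configure it as an output */
        Files.write(base.resolve("export"), String.valueOf(pin).getBytes());
        Path dir = base.resolve("gpio" + pin);
        Files.createDirectories(dir);   // on a real Pi the kernel creates this
        Files.write(dir.resolve("direction"), "out".getBytes());
    }

    /** Drive the pin high (true) or low (false) */
    public void set(boolean high) throws IOException {
        Files.write(base.resolve("gpio" + pin).resolve("value"),
                    (high ? "1" : "0").getBytes());
    }

    public static void main(String[] args) throws IOException {
        /* A temporary directory stands in for /sys/class/gpio here */
        Path fake = Files.createTempDirectory("gpio");
        SysfsGpio motor = new SysfsGpio(fake, 17);
        motor.set(true);
        System.out.println(new String(
            Files.readAllBytes(fake.resolve("gpio17/value"))));  // prints 1
    }
}
```

On the Pi itself the base path would be /sys/class/gpio and the program would need to run with permission to write there.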