Friday Oct 25, 2013

Integrating NetBeans for Raspberry Pi Java Development

The Raspberry Pi is an incredible device for building embedded Java applications, but although you can run an IDE on the Pi itself, doing so really pushes the hardware to its limits.  It's much better to develop the code on a PC or laptop and then deploy and test on the Pi.  What I thought I'd do in this blog entry is run through the steps necessary to set up NetBeans on a PC for Java code development, with automatic deployment to the Raspberry Pi as part of the build process.

I will assume that your starting point is a Raspberry Pi with an SD card that has one of the latest Raspbian images on it.  This is good because these now include JDK 7 as part of the distro, so there is no need to download and install a separate JDK.  I will also assume that you have installed the JDK and NetBeans on your PC.  These can be downloaded here.

There are numerous approaches you can take to this, including mounting the file system from the Raspberry Pi remotely on your development machine.  I tried this and found that NetBeans got rather upset if the file system disappeared, either through a network interruption or the Raspberry Pi being turned off.  The following method uses copying over SSH, which fails more gracefully if the Pi is not responding.

Step 1: Enable SSH on the Raspberry Pi

To run the Java applications you create, you will need to start Java on the Raspberry Pi with the appropriate class name, classpath and parameters.  For non-JavaFX applications you can do this either from the Raspberry Pi desktop or, if you do not have a monitor connected, through a remote command line.  To use the remote command line you need to enable SSH (a secure shell login over the network) and connect using an application like PuTTY.

You can enable SSH when you first boot the Raspberry Pi, as the raspi-config program runs automatically.  You can also run it at any time afterwards by running the command:

sudo raspi-config

This will bring up a menu of options.  Select '8 Advanced Options' and on the next screen select 'A4 SSH'.  Select 'Enable' and the task is complete.

Step 2: Configure Raspberry Pi Networking

By default, the Raspbian distribution configures the ethernet connection to use DHCP rather than a static IP address.  You can continue to use DHCP if you want, but using a static IP address is simpler, as it avoids potentially having to change settings whenever you reboot the Pi.

To configure this on the Pi you need to edit the /etc/network/interfaces file.  You will need to do this as root using the sudo command, so something like sudo vi /etc/network/interfaces.  In this file you will see this line:

iface eth0 inet dhcp

This needs to be changed to the following:

iface eth0 inet static

You will then need to add address, netmask and gateway entries below this line, using an IP address appropriate for your network and the address of your gateway.
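For reference, a typical static stanza in /etc/network/interfaces looks something like the following — the addresses shown are placeholders and must be replaced with values appropriate for your own network:

```
iface eth0 inet static
    address
    netmask
    gateway
```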

Step 3: Create a Public-Private Key Pair On Your Development Machine

How you do this will depend on which Operating system you are using:

Mac OSX or Linux

Run the command:

ssh-keygen -t rsa

Press ENTER/RETURN to accept the default destination for saving the key.  We do not need a passphrase so simply press ENTER/RETURN for an empty one and once more to confirm.

The keys will be created in the .ssh directory in your home directory.  Display the contents of the public key file using the cat command:

cat ~/.ssh/

Open a terminal window, SSH to the Raspberry Pi and log in.  Change directory to .ssh and edit the authorized_keys file (don't worry if the file does not exist).  Copy and paste the contents of the public key file into the authorized_keys file and save it.


Windows

Since Windows is not a UNIX-derived operating system it does not include the necessary key-generation software by default.  To generate the key I used puttygen.exe, which is available from the same site that provides the PuTTY application, here.

Download this and run it on your Windows machine.  Follow the instructions to generate a key.  I removed the key comment, but you can leave that if you want.


Click "Save private key", confirm that you don't want to use a passphrase and select a filename and location for the key.

Copy the public key from the part of the window marked, "Public key for pasting into OpenSSH authorized_keys file".  Use PuTTY to connect to the Raspberry Pi and login.  Change directory to .ssh and edit the authorized_keys file (don't worry if this does not exist).  Paste the key information at the end of this file and save it.

Logout and then start PuTTY again.  This time we need to create a saved session using the private key.  Type in the IP address of the Raspberry Pi in the "Hostname (or IP address)" field and expand "SSH" under the "Connection" category.  Select "Auth" (see the screen shot below).


Click the "Browse" button under "Private key file for authentication" and select the file you saved from puttygen.

Go back to the "Session" category and enter a short name in the saved sessions field, as shown below.  Click "Save" to save the session.


Step 4: Test The Configuration

You should now have the ability to use scp (Mac/Linux) or pscp.exe (Windows) to copy files from your development machine to the Raspberry Pi without needing to authenticate by typing in a password (so we can automate the process in NetBeans).  It's a good idea to test this using something like:

scp /tmp/foo pi@

on Linux or Mac or

pscp.exe foo pi@raspi:/tmp

on Windows (Note that we use the saved configuration name instead of the IP address or hostname so the public key is picked up). pscp.exe is another tool available from the creators of PuTTY.

Step 5: Configure the NetBeans Build Script

Start NetBeans and create a new project (or open an existing one that you want to deploy automatically to the Raspberry Pi).

Select the Files tab in the explorer window and expand your project.  You will see a build.xml file.  Double click this to edit it.


This file will mostly be comments.  At the end (but before the closing </project> tag) add the XML for a <target name="-post-jar"> target, shown below.


Here's the code again in case you want to use cut-and-paste:

<target name="-post-jar">
  <echo level="info" message="Copying dist directory to remote Pi"/>
  <exec executable="scp" dir="${basedir}">
    <arg line="-r"/>
    <arg value="dist"/>
    <arg value="pi@"/>
  </exec>
</target>

For Windows it will be slightly different:

<target name="-post-jar">
  <echo level="info" message="Copying dist directory to remote Pi"/>
  <exec executable="C:\pi\putty\pscp.exe" dir="${basedir}">
    <arg line="-r"/>
    <arg value="dist"/>
    <arg value="pi@raspi:NetBeans/CopyTest"/>
  </exec>
</target>

You will also need to ensure that pscp.exe is in your PATH (or specify a fully qualified pathname).

From now on when you clean and build the project the dist directory will automatically be copied to the Raspberry Pi ready for testing.

Sunday Aug 04, 2013

The Raspberry Pi JavaFX In-Car System (Part 4)

It's been a while since my last blog entry about my in-car system, due to a number of other things taking priority.  The good news is I now have more progress to report.

The first thing is that I decided to extend the scope of my project in terms of integrating with my vehicle.  Originally, I had planned to add a 7" touch screen somewhere that was visible whilst driving.  Given the attention to detail that Audi's designers have taken over the interior this was not going to be simple.  The company I had originally ordered the touchscreen from ran into production problems and after several months admitted that delivery of the screen would not be for "some time".  Since I needed this for JavaOne in September I cancelled the order and started looking for a replacement.  eBay is a great place to find items like this and I found a screen being marketed for the Raspberry Pi which was a "double DIN" fitting (which actually means it is twice the height of the ISO 7736 standard).  Some more searching on eBay turned up a bezel that would enable me to replace the existing navigation/entertainment system in my car with my new, Raspberry Pi powered one (Given how much functionality the existing system has I don't see this as a long term replacement, more for experimentation).

Having received my screen I decided that for development and testing it would be better if I did not need to keep changing the centre console, so I set about making the screen/Pi combination easier to use standalone.  Unfortunately, I couldn't find the perfect sized box at RS, but got one that could be adapted to my needs (the problem was it was too shallow, so I added some longer bolts and spacers).  First up was to fit the screen into the top of the box, as shown in the pictures



I was happy that my project already required the use of some wood, as I believe all great software projects should involve some woodwork.

To mount the Raspberry Pi I used the two vacant mounting points on the screen and attached a small perspex sheet to act as a platform for the Pi.

Pi mounting

Getting the holes in the right position took three attempts, as the positioning of the external cables was a bit tricky given the available space.

The Raspberry Pi was then mounted using the bolts shown above with some plastic spacers

Raspberry Pi mounted

The USB cables provided connections for a USB port and SD card reader which are part of the screen bezel.  In the end I removed these as I did not plan to use them and they were taking up too much space.

Fitting the HDMI cable was a bit of a challenge.  The distance between the HDMI port on the Pi and the one on the screen is about 3cm.  The shortest cable I had was 1m!  Using some cable ties and a sharp knife I was able to come up with a workable solution (not exactly pretty, but it works and won't be seen in the finished 'product').

HDMI cabling

Since I wanted to include an accelerometer I mounted that on the bottom of the box so it wouldn't move around during development.  The final internals are shown below.  I added a short ethernet extension lead to simplify cabled network access, so the Wi-Pi dongle could be left in place, and I ran a USB extension lead from the Pi to simplify switching between the touch screen and an external keyboard.


When assembled I had a pretty nifty looking Raspberry Pi computer

pi computer

In the next instalment I'll cover how I started on the JavaFX part to deliver real-time data on the screen.

Friday Jun 28, 2013

The Raspberry Pi JavaFX In-Car System (Part 3)

Having established communication between a laptop and the ELM327, it's now time to bring in the Raspberry Pi.

One of the nice things about the Raspberry Pi is the simplicity of its power supply.  All we need is 5V at about 700mA, which in a car is as simple as using a USB cigarette lighter adapter (handily rated at 1A).  My car has two cigarette lighter sockets (despite being specified with the non-smoking package and therefore no actual cigarette lighter): one in the centre console and one in the rear load area.  This was convenient, as my idea is to mount the Raspberry Pi in the back to minimise the disruption to the very clean design of the Audi interior.

The first task was to get the Raspberry Pi to communicate with the ELM327 using Wi-Fi.  Initially I tried a cheap Wi-Fi dongle from Amazon, but I could not get this working with my home Wi-Fi network since it just would not handle the WPA security no matter what I did.  I upgraded to a Wi-Pi from Farnell and this works very well.

The ELM327 uses ad-hoc networking, which is point-to-point communication.  Rather than using a wireless router, each connecting device has its own assigned IP address (which needs to be on the same subnet) and uses the same ESSID.  The settings of the ELM327 are fixed: it has a preset IP address and uses the ESSID "Wifi327".  To configure Raspbian Linux to use these settings we need to modify the /etc/network/interfaces file.  After some searching of the web and a few false starts, here are the settings I came up with:

auto lo eth0 wlan0

iface lo inet loopback

iface eth0 inet static

iface wlan0 inet static
    wireless-essid Wifi327
    wireless-mode ad-hoc

After rebooting, iwconfig wlan0 reported that the Wi-Fi settings were correct.  However, ifconfig showed no assigned IP address.  If I configured the IP address manually using ifconfig (specifying the address and netmask for wlan0) then everything was fine and I was able to happily ping the IP address of the ELM327.  I tried numerous variations on the interfaces file, but nothing I did would get me an IP address on wlan0 when the machine booted.  Eventually I decided that this was a pointless thing to spend more time on, so I put a script in /etc/init.d and registered it with update-rc.d.  All the script does (currently) is execute the ifconfig line, and now, having installed the telnet package, I am able to telnet to the ELM327 via the Raspberry Pi.  Not nice, but it works.

Here's a picture of the Raspberry Pi in the car for testing

In Car

In the next part we'll look at running the Java code on the Raspberry Pi to collect data from the car systems.

Friday Jun 14, 2013

Java and the Raspberry Pi Camera (Part 1)

I've always liked the idea of computer vision, and on the very long list of things I'd like to spend more time exploring is the OpenCV library, which has a handy set of Java bindings.  In the past I've experimented with, and used, some of the other frameworks that are available for image capture in Java, specifically the Java Media Framework (JMF) and Freedom for Media in Java (FMJ), mostly around the idea of integrating images from a webcam into an application like a security monitoring system.  Sadly, JMF has grown a little dusty over time, with the last release being way back in 2002 (you have to be amused when you see that the hardware requirements for this are a 166MHz Pentium processor and 32MB of RAM).  FMJ is a little more modern, but was last updated in 2007.

The Raspberry Pi Foundation recently announced the launch of a camera that plugs into one of the two ribbon cable connectors on the board (as shown below):

Raspberry Pi Camera

I thought it would be an interesting idea to see how easy it would be to get this working with a Java or JavaFX application.

There are three utilities that are available for testing the camera: raspistill, raspiyuv and raspivid.  These allow you to grab a frame or video from the camera and store it in a file.  This seemed to be a good starting point for figuring out how to use the camera and get the frame data into a Java application, ideally as a BufferedImage (I decided to start with simple image capture and look at video streams later).

I downloaded the code from GitHub and started looking at what it does and how it works.  Initially I thought it would make sense to use a toolchain to cross-compile the code on my quad-core Linux box.  However, having spent a day working on this and failed to get the code to compile cleanly (even using the download of the Raspberry Pi Foundation's toolchain), I decided to compile on the Raspberry Pi itself: slower, but at least it worked.

I also found a useful post from Tasanakorn Phaipool who had created a couple of sample applications that made use of the camera and linked to the OpenCV libraries.  This provided a good starting point as it simplified things compared to the raspistill application and enabled me to figure out a relatively simple build environment (I don't have time right now to climb the learning curve required for cmake).

Getting the code to compile and run was really quite challenging.  I will confess it's been a while since I've done any C coding, but most of the issues I experienced were to do with getting the build process to work correctly.  I used an iterative approach to creating a Makefile, simply resolving issues as I found them, gradually adding header file references and libraries until the code compiled cleanly.  To use the camera we need the multi-media abstraction layer (MMAL) API.  Broadcom have very kindly made this available as source, but documentation-wise you pretty much have to dig through the source code (there is a big comment at the top of the mmal.h file, which is the best documentation I've found so far).  Once I'd got the code to compile and link it still would not run, which puzzled me for quite some time until, by comparing the raspistill executable to the one I'd built, I found that I needed to include an additional library in the list of libraries to link.  (This really does confuse me, because this library is not required to resolve any function references, so the code compiles and links correctly; but without it the necessary camera configuration is not registered and the call to mmal_component_create() will fail.)

At this point I have some code that will talk to the camera and display the preview image on the video output (HDMI).  Next I need to modify this so it can be used with JNI and integrate this with a new subclass of ImageInputStream which can then be used to create a BufferedImage in a Java application.
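To give an idea of the Java side of that plan: once frame data crosses the JNI boundary as raw bytes, turning it into a BufferedImage is straightforward.  This is just a sketch — the native capture call and the packed 24-bit RGB layout are assumptions about how the MMAL code might be wrapped, not the actual implementation:

```java
import java.awt.image.BufferedImage;

public class FrameConverter {
    // Convert packed 24-bit RGB data (as a hypothetical JNI capture call
    // might return it) into a BufferedImage, one pixel at a time.
    public static BufferedImage toImage(byte[] rgb, int width, int height) {
        BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
        int i = 0;
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int r = rgb[i++] & 0xFF;
                int g = rgb[i++] & 0xFF;
                int b = rgb[i++] & 0xFF;
                image.setRGB(x, y, (r << 16) | (g << 8) | b);
            }
        }
        return image;
    }
}
```

For real frame sizes you would write into the image's raster in bulk rather than pixel by pixel, but the per-pixel version makes the byte layout explicit.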

One other thing that is interesting is that when I run the simple test program the preview is displayed, and very shortly afterwards the network stops working (all the LEDs on the Pi except the power light go out).  I assume that is a bug somewhere.  Fortunately, I have a serial console connected so I can still access the Pi via PuTTY.

I will update my blog as I make more progress on this.

Thursday Apr 25, 2013

The Raspberry Pi JavaFX In-Car System (Part 1)

As part of my work on embedded Java I'm always on the lookout for new ideas for demos to build that show developers how easy it is to use and how powerful it is.  In some of my recent web surfing I came across an interesting device on eBay that I thought had real potential.  It's called an ELM327 OBD-II CAN bus diagnostic interface scanner.  It is a small box that plugs in to the service port of a modern car and provides an interface that allows software to talk to the Electronic Control Units (ECUs) fitted in your car.  The one I bought provides a Wi-Fi link and also includes a USB socket for wired connectivity.  Similar products are available that provide a Bluetooth interface, but the various opinions I read indicated that these were not as easy to use.  Considering it cost a little over £30, I thought it was well worth it for some experimentation.

Here's a picture of the device:


And here it is plugged into the service port located near the pedals on my car. 


The only downside is that the orientation of the socket means that you can't see the status lights when it's plugged in (at least not without a mirror).

My initial thoughts were to look at what kind of data could be extracted from the car and then write some software that would provide realtime display of things that aren't shown through the existing instrumentation.  I thought it would also be fun to record journey data that could be post-analysed in much the way Formula 1 uses masses of telemetry to let the drivers know where they could do better.
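To give a flavour of what that data looks like: OBD-II mode 01 responses come back from the ELM327 as hex strings, and the formulas in the standard turn the data bytes into engineering values.  A minimal decoding sketch (the PID formulas are from the OBD-II standard; the surrounding ELM327 I/O is omitted):

```java
public class ObdDecoder {
    // Decode a mode 01, PID 0C (engine RPM) response such as "41 0C 1A F8".
    // Standard formula: RPM = ((A * 256) + B) / 4, where A and B are the
    // two data bytes following the "41 0C" echo of the request.
    public static double rpm(String response) {
        String[] parts = response.trim().split("\\s+");
        int a = Integer.parseInt(parts[2], 16);
        int b = Integer.parseInt(parts[3], 16);
        return ((a * 256) + b) / 4.0;
    }

    // Decode a mode 01, PID 0D (vehicle speed) response such as "41 0D 4B".
    // Standard formula: speed in km/h is simply the single data byte A.
    public static int speedKmh(String response) {
        String[] parts = response.trim().split("\\s+");
        return Integer.parseInt(parts[2], 16);
    }
}
```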

Since I wanted to use embedded Java the obvious choice of processing unit was the Raspberry Pi.  It's cheap, I have a whole bunch of them and it's got plenty of computing power for what I have in mind.  It also has some other advantages:
  • Low power consumption (easy to run off the 12V cigarette lighter supply)
  • Support for JavaFX through some nice touch screens from Chalkboard Electronics (so I can go wild with the interface)
  • Easily accessible GPIO pins
The last point got me thinking about what other possibilities there were for my in-car system.  Recently my friend and colleague Angela Caicedo did a session at Devoxx UK entitled, "Beyond Beauty: JavaFX, Parallax, Touch, Gyroscopes and Much More".  Part of this involved connecting a motion sensor to the Raspberry Pi using the I2C interface that is also available.  The particular sensor she used is from Sparkfun and uses a very cool single chip solution from InvenSense, the MPC-6150.  This provides 9-axis motion data, which means acceleration and rate of rotation for the X, Y and Z axes as well as a compass sensor that works regardless of the orientation of the sensor.

Having studied physics at university (a long time ago, in a galaxy far, far away) I vaguely remember that if I combine acceleration data with the mass of the car and things like engine speed I can calculate the horse power of the engine as well as the torque being generated.  Throw that into the mix and this could make a really fun project.
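The mechanics behind that calculation are simple enough to sketch.  The figures below are illustrative placeholders, and the model deliberately ignores drag, rolling resistance and drivetrain losses, so it is a rough estimate at best:

```java
public class PowerCalc {
    // Instantaneous power in watts: P = F * v = m * a * v.
    public static double powerWatts(double massKg, double accelMs2, double speedMs) {
        return massKg * accelMs2 * speedMs;
    }

    // Convert watts to mechanical horsepower (1 hp = 745.7 W).
    public static double wattsToHp(double watts) {
        return watts / 745.7;
    }

    // Torque from power and engine speed: tau = P / omega,
    // where omega = rpm * 2 * pi / 60 (rad/s).
    public static double torqueNm(double watts, double engineRpm) {
        return watts / (engineRpm * 2.0 * Math.PI / 60.0);
    }
}
```

For example, a 1500kg car accelerating at 3 m/s² while travelling at 20 m/s is producing 90kW at the wheels, which at 3000 rpm corresponds to roughly 286 N·m of torque.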

As further inspiration I came across this video recently:

There's also an interesting one from Tesla, who use a 17" touch display as their centre console.

In the follow up parts to this blog entry I'll detail how the project evolves.

Tuesday Oct 16, 2012

Mind Reading with the Raspberry Pi

At JavaOne in San Francisco I did a session entitled "Do You Like Coffee with Your Dessert? Java and the Raspberry Pi".  As part of this I showed some demonstrations of things I'd done using Java on the Raspberry Pi.  This is the first part of a series of blog entries that will cover all the different aspects of these demonstrations.

A while ago I had bought a MindWave headset from Neurosky.  I was particularly interested to see how this worked as I had had the opportunity to visit Neurosky several years ago when they were still developing this technology.  At that time the 'headset' consisted of a headband (very much in the Bjorn Borg style) with a sensor attached and some wiring that clearly wasn't quite production ready.  The commercial version is very simple and easy to use: there are two sensors, one which rests on the skin of your forehead, the other is a small clip that attaches to your earlobe.

Neurosky product image 1 Neurosky product image 2

Typical EEG sensors used in hospitals require lots of electrodes, all of which need copious amounts of conductive gel to ensure the electrical signals are picked up.  Part of Neurosky's innovation is the development of this simple dry-sensor technology.  Having put on the sensor and turned it on (it is powered by a single AAA battery), it collects data and transmits it to a USB dongle plugged into a PC, or in my case a Raspberry Pi.

From a hacking perspective the USB dongle is ideal because it does not require any special drivers for any complex, low level USB communication.  Instead it appears as a simple serial device, which on the Raspberry Pi is accessed as /dev/ttyUSB0.  Neurosky have published details of the command protocol.  In addition, the MindSet protocol document, including sample code for parsing the data from the headset, can be found here.
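To give an idea of what parsing that stream involves: based on my reading of the published protocol document, ThinkGear packets start with two 0xAA sync bytes, followed by a payload length, the payload itself, and a one-byte checksum that is the inverted low byte of the payload sum.  A sketch of the validation step:

```java
public class ThinkGearPacket {
    // Checksum for a ThinkGear payload: sum the payload bytes, take the
    // lowest 8 bits of the sum and invert them.
    public static int checksum(byte[] payload) {
        int sum = 0;
        for (byte b : payload) {
            sum += b & 0xFF;
        }
        return (~sum) & 0xFF;
    }

    // Validate a complete packet: 0xAA 0xAA <plen> <payload...> <checksum>.
    public static boolean isValid(byte[] packet) {
        if (packet.length < 4) return false;
        if ((packet[0] & 0xFF) != 0xAA || (packet[1] & 0xFF) != 0xAA) return false;
        int plen = packet[2] & 0xFF;
        if (packet.length != plen + 4) return false;
        byte[] payload = new byte[plen];
        System.arraycopy(packet, 3, payload, 0, plen);
        return checksum(payload) == (packet[packet.length - 1] & 0xFF);
    }
}
```

Once a packet validates, the payload is a sequence of code/value rows (attention, meditation, the raw EEG bands and so on) that can be dispatched to listeners.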

To get everything working on the Raspberry Pi using Java, the first thing was to get serial communications going.  Back in the dim distant past there was the Java Comm API.  Sadly this has grown a bit dusty over the years, but there is a more modern open source project that provides compatible and enhanced functionality, RXTXComm.  This can be installed easily on the Pi using sudo apt-get install librxtx-java.

Next I wrote a library that would send commands to the MindWave headset via the serial port dongle and read back data being sent from the headset.  The design is pretty simple, I used an event based system so that code using the library could register listeners for different types of events from the headset.  You can download a complete NetBeans project for this here.  This includes javadoc API documentation that should make it obvious how to use it (incidentally, this will work on platforms other than Linux.  I've tested it on Windows without any issues, just by changing the device name to something like COM4).

To test this I wrote a simple application that would connect to the headset and then print the attention and meditation values as they were received from the headset.  Again, you can download the NetBeans project for that here.

Oracle recently released a developer preview of JavaFX on ARM which will run on the Raspberry Pi.  I thought it would be cool to write a graphical front end for the MindWave data that could take advantage of the built in charts of JavaFX.  Yet another NetBeans project is available here.  Screen shots of the app, which uses a very nice dial from the JFxtras project, are shown below.

JavaFX Mind Reader

JavaFX Mind Reader

I probably should add labels for the EEG data so the user knows which trace is the low alpha, which the mid gamma, and so on.  Given that I'm not a neurologist, I suspect that it won't increase my understanding of what the (rather random looking) traces mean.

In the next blog I'll explain how I connected a LEGO motor to the GPIO pins on the Raspberry Pi and then used my mind to control the motor!

Monday Jul 02, 2012

The Power to Control Power

I'm currently working on a number of projects using embedded Java on the Raspberry Pi and Beagle Board.  These are nice and small, so don't take up much room on my desk as you can see in this picture.

Desktop embedded systems

As you can also see, I have power and network connections emerging from under my desk.  One of the (admittedly very minor) drawbacks of these systems is that they have no on/off switch.  Instead you insert or remove the power connector (USB for the RasPi, a barrel connector for the Beagle).  For the Beagle Board this can potentially be an issue: with the micro-SD card located right next to the connector, it has been known for people to eject the card when trying to power off the board, which can be quite serious for the hardware.  The alternative is obviously to leave the boards plugged in and then disconnect the power at the outlet.  Simple enough, but a picture of underneath my desk shows that this is not the ideal situation either.

Under desk wiring

This made me think that it would be great if I could have some way of controlling a mains voltage outlet using a remote switch or, even better, from software via a USB connector.  A search revealed not much that fit my requirements, and anything that was close seemed very expensive.  Obviously the only way to solve this was to build my own.

Here's my solution.  I decided my system would support both control mechanisms (remote physical switch and USB computer control) and be modular in its design for optimum flexibility.  I did a bit of searching and found a company in Hong Kong that was offering solid state relays for 99p plus shipping (£2.99, which still made the total price very reasonable).  These would handle up to 380V AC on the output side, so they are more than capable of coping with the UK 240V supply.  The other great thing was that, being solid state, the input would work with a range of 3-32V and required a very low current of 7.5mA at 12V.  For the USB control an Arduino board seemed the obvious low-cost and simple choice.  Given the current requirements of the relay, the Arduino would not need an additional power supply and could be powered just from the USB.

Having secured the relays I popped down to Homebase for a couple of 13A sockets, RS for a box and an Arduino and Maplin for a toggle switch.  The circuit is pretty straightforward, as shown in the diagram (only one output is shown to make it as simple as possible).  Originally I used a 2 pole toggle switch to select the remote switch or USB control by switching the negative connections of the low voltage side.  Unfortunately, the resistance between the digital pins of the Arduino board was not high enough, so when using one of the remote switches it would turn on both of the outlets.  I changed to a 4 pole switch and isolated both positive and negative connections.

Power switch circuit

IMPORTANT NOTE: If you want to follow my design, please be aware that it requires working with mains voltages.  If you are at all concerned with your ability to do this please consult a qualified electrician to help you.

It was a tight fit, especially getting the Arduino in, but in the end it all worked.  The completed box is shown in the photos.

Controller inside

Connected with USB

The remote switch was pretty simple just requiring the squeezing of two rocker switches and a 9V battery into the small RS supplied box.  I repurposed a standard stereo cable with phono plugs to connect the switch box to the mains outlets.  I chopped off one set of plugs and wired it to the rocker switches.  The photo shows the RasPi and the Beagle board now controllable from the switch box on the desk.

Remote control

I've tested the Arduino side of things and this works fine.  Next I need to write some software to provide an interface for control of the outlets.  I'm thinking a JavaFX GUI would be in keeping with the total overkill style of this project.


A blog covering aspects of Java SE, JavaFX and embedded Java that I find fun and interesting and want to share with other developers. As part of the Developer Outreach team at Oracle I write a lot of demo code and I use this blog to highlight useful tips and techniques I learn along the way.

