Mind Reading with the Raspberry Pi
By speakjava on Oct 16, 2012
A while ago I bought a MindWave headset from Neurosky. I was particularly interested to see how it worked, as I had the opportunity to visit Neurosky several years ago when they were still developing the technology. At that time the 'headset' consisted of a headband (very much in the Bjorn Borg style) with a sensor attached, and some wiring that clearly wasn't quite production ready. The commercial version is very simple and easy to use: there are two sensors, one that rests on the skin of your forehead and a small clip that attaches to your earlobe.
Typical EEG rigs used in hospitals require lots of sensors, all of which need copious amounts of conductive gel to ensure the electrical signals are picked up. Part of Neurosky's innovation is the development of this simple dry-sensor technology. Having put on the sensor and turned it on (it is powered by a single AAA battery), it collects data and transmits it to a USB dongle plugged into a PC, or in my case a Raspberry Pi.
From a hacking perspective the USB dongle is ideal because it does not require any special drivers or any complex, low-level USB communication. Instead it appears as a simple serial device, which on the Raspberry Pi is accessed as /dev/ttyUSB0. Neurosky have published details of the command protocol. In addition, the MindSet protocol document, including sample code for parsing the data from the headset, can be found here.
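To give a flavour of what the published protocol looks like, here is a minimal sketch of a parser for the packet format described in Neurosky's documentation: each packet is [0xAA][0xAA][length][payload][checksum], where the checksum is the bitwise inverse of the low 8 bits of the payload sum, and the payload is a series of rows of the form [code][value] (codes below 0x80 carry a single-byte value, such as 0x04 for attention and 0x05 for meditation; larger codes carry a length-prefixed value). This is an illustrative sketch, not the code from my library, and it omits details such as the 0x55 extended-code prefix.

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Minimal sketch of a parser for the ThinkGear serial packet format:
 * [0xAA][0xAA][length][payload...][checksum].
 */
public class ThinkGearParser {

    /** Parses one complete packet; returns code -> value bytes, or null if invalid. */
    public static Map<Integer, int[]> parsePacket(int[] p) {
        // Every packet starts with two 0xAA sync bytes followed by a payload length
        if (p.length < 4 || p[0] != 0xAA || p[1] != 0xAA) return null;
        int len = p[2];
        if (p.length != len + 4) return null;

        // Checksum: sum the payload bytes, keep the low 8 bits, then invert
        int sum = 0;
        for (int i = 3; i < 3 + len; i++) sum += p[i];
        if ((~sum & 0xFF) != p[3 + len]) return null;

        // Walk the payload: [code][value] for codes < 0x80,
        // [code][vlength][value...] for codes >= 0x80 (e.g. raw EEG power bands)
        Map<Integer, int[]> rows = new HashMap<>();
        int i = 3;
        int end = 3 + len;
        while (i < end) {
            int code = p[i++];
            int vlen = (code >= 0x80) ? p[i++] : 1;
            int[] value = new int[vlen];
            for (int j = 0; j < vlen; j++) value[j] = p[i++];
            rows.put(code, value);
        }
        return rows;
    }

    public static void main(String[] args) {
        // Hand-built example packet: poor-signal (0x02) = 0, attention (0x04) = 50
        int[] packet = {0xAA, 0xAA, 0x04, 0x02, 0x00, 0x04, 0x32, 0xC7};
        Map<Integer, int[]> rows = parsePacket(packet);
        System.out.println("attention = " + rows.get(0x04)[0]);
    }
}
```

A corrupted byte anywhere in the payload fails the checksum and the whole packet is discarded, which is the right behaviour for a lossy wireless link.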
To get everything working on the Raspberry Pi using Java, the first task was to get serial communications going. Back in the dim and distant past there was the Java Comm API. Sadly this has grown a bit dusty over the years, but there is a more modern open source project that provides compatible and enhanced functionality: RXTXComm. This can be installed easily on the Pi using sudo apt-get install librxtx-java.
Next I wrote a library that would send commands to the MindWave headset via the serial port dongle and read back the data being sent from the headset. The design is pretty simple: I used an event-based system so that code using the library could register listeners for different types of events from the headset. You can download a complete NetBeans project for this here. This includes javadoc API documentation that should make it obvious how to use it (incidentally, the library will work on platforms other than Linux; I've tested it on Windows without any issues, just by changing the device name to something like COM4).
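The listener pattern described above looks roughly like the following sketch. The interface and method names here are illustrative rather than the actual API of my library: the serial-reading code decodes packets and dispatches each value to every registered listener.

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

/**
 * Illustrative sketch of an event-based headset library design (names are
 * hypothetical): clients register listeners, and the serial-reading thread
 * fires callbacks as values are decoded from incoming packets.
 */
public class MindWaveEvents {

    /** Callback interface for values reported by the headset. */
    public interface MindWaveListener {
        void attentionReceived(int level);    // 0-100 attention value
        void meditationReceived(int level);   // 0-100 meditation value
    }

    // Thread-safe list: the reader thread iterates while the application
    // thread may still be adding or removing listeners
    private final List<MindWaveListener> listeners = new CopyOnWriteArrayList<>();

    public void addListener(MindWaveListener l) { listeners.add(l); }

    /** Called by the packet decoder when an attention value arrives. */
    void fireAttention(int level) {
        for (MindWaveListener l : listeners) l.attentionReceived(level);
    }

    /** Called by the packet decoder when a meditation value arrives. */
    void fireMeditation(int level) {
        for (MindWaveListener l : listeners) l.meditationReceived(level);
    }

    public static void main(String[] args) {
        MindWaveEvents events = new MindWaveEvents();
        events.addListener(new MindWaveListener() {
            public void attentionReceived(int level) {
                System.out.println("attention: " + level);
            }
            public void meditationReceived(int level) {
                System.out.println("meditation: " + level);
            }
        });
        // Simulate values arriving from the headset
        events.fireAttention(60);
        events.fireMeditation(40);
    }
}
```

The nice thing about this shape is that application code never touches the serial protocol: it just reacts to high-level events.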
To test this I wrote a simple application that would connect to the headset and then print the attention and meditation values as they were received from the headset. Again, you can download the NetBeans project for that here.
Oracle recently released a developer preview of JavaFX on ARM which runs on the Raspberry Pi. I thought it would be cool to write a graphical front end for the MindWave data that could take advantage of the built-in charts of JavaFX. Yet another NetBeans project is available here. Screenshots of the app, which uses a very nice dial from the JFxtras project, are shown below.
I should probably add labels for the EEG data so the user knows which trace is low alpha, which is mid gamma, and so on. That said, given that I'm not a neurologist, I suspect it wouldn't increase my understanding of what the (rather random-looking) traces mean.
In the next blog post I'll explain how I connected a LEGO motor to the GPIO pins on the Raspberry Pi and then used my mind to control the motor!