JDDAC at the Romberg Tiburon Center - SF Bay Estuary Monitoring

Well, how often does high-tech work in large-scale enterprise computing fall right into a fisherman's habit? I had a nice conference call with a bunch of university and corporate technologists yesterday afternoon about a project that needs a world-class software architecture to provide real-time or near real-time data about SF Bay water quality and monitoring. The idea is to get data every 1 - 10 minutes from a wireless sensor grid that monitors all the estuary waters, and even the middle and outside of the Bay, and provide that data accurately and quickly to all users. Such data would include real-time lookups of current conditions as well as historical records for scientists, researchers, and policy makers.

Sun and other companies started an initiative some time ago called Java Distributed Data Acquisition and Control (JDDAC), and one can immediately see a strong business case for it. The challenge today in manufacturing, for example, is the retooling cost of high-margin custom manufacturing. In other words, businesses can charge more for custom work, but retooling costs are prohibitive, so customers tend not to order custom work unless really necessary, and manufacturers rarely discount a custom job unless the order is very large. The purpose of JDDAC was to standardize remote sensing and control technology so that businesses could retool and customize their manufacturing lines to do more custom work at lower cost, thus meeting customer demand while lowering business costs.

But then recently, some SF State U. researchers at the Romberg Tiburon Center who do marine estuary research got wind of this initiative and quickly connected the dots. They've been funded by various gov't agencies to monitor Bay water quality. One of their major data consumers is NOAA, and some of their data is used in policy making for all sorts of things, from water diversion upstream to quality-standards evaluation of habitat for fish and wildlife. My details are only superficial and I'm learning more about this as I go along. But it sounds like a lot of fun.

I think I heard that they got a grant to design and build a new sensor grid architecture to monitor water quality. One of the design goals will be to standardize on software and hardware interfaces so a grid could be quickly and efficiently deployed anywhere (even exported to other sites around the world) to monitor many types of conditions - e.g. water clarity/turbidity, bio-fouling, concentrations of dissolved gases like CO2 and O2, minerals, pollutants, salinity, etc. The system would need to scale as well. Starting with just a dozen sensor stations with sub-grids of sensors, the system needs to support thousands or more and do this in real-time. With such a plethora of sensors, having a standard interface and software to do data acquisition and control is vital if the project is to succeed.
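To make that "standard interface" idea concrete, here is a minimal sketch in Java of what a common sensor abstraction buys you. This is purely illustrative - the interface and class names (`WaterQualitySensor`, `StubSensor`, `SensorGridDemo`) are my own inventions, not the actual JDDAC API - but it shows why one interface lets an acquisition loop stay unchanged no matter how many sensor types the grid grows to support.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical common interface, loosely in the spirit of a standardized
// data-acquisition layer like JDDAC. Names here are illustrative only.
interface WaterQualitySensor {
    String id();        // station/sensor identifier
    String quantity();  // e.g. "turbidity", "salinity", "dissolved-O2"
    double read();      // take one measurement in the sensor's native unit
}

// A fixed-value stand-in for a real transducer driver.
class StubSensor implements WaterQualitySensor {
    private final String id, quantity;
    private final double value;
    StubSensor(String id, String quantity, double value) {
        this.id = id; this.quantity = quantity; this.value = value;
    }
    public String id() { return id; }
    public String quantity() { return quantity; }
    public double read() { return value; }
}

public class SensorGridDemo {
    public static void main(String[] args) {
        // The grid is just a collection behind one interface; adding a new
        // sensor type never changes this acquisition loop.
        List<WaterQualitySensor> grid = new ArrayList<>();
        grid.add(new StubSensor("tiburon-01", "salinity", 28.4));
        grid.add(new StubSensor("tiburon-02", "turbidity", 3.1));
        for (WaterQualitySensor s : grid) {
            System.out.println(s.id() + " " + s.quantity() + " = " + s.read());
        }
    }
}
```

Scaling from a dozen stations to thousands then becomes a question of networking and storage, not of rewriting the acquisition code for each new transducer.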

Today, they have hardwired sensors mounted on some big concrete pilings along some shorelines. These are physically connected to computers, and every 1 - 5 minutes water quality data are gathered and stored on disk to be processed later. It is non-real-time, and requires considerable human attention. The new system they want to build would be automated, wireless, and use standard sensor interfaces like JDDAC to collect data and control sensors. For example, one of the problems with taking, say, turbidity measurements (the murkiness of the water) is that algae and crustaceans can foul the intake sensors. Their solution has been to put yet another remotely controlled, hardwired blower/pump around the sensor grid to clear the fouling prior to each measurement. More wires, more maintenance, more downtime. Ideally, with a wireless and standard control interface, one control can proxy commands for another (much like USB or FireWire devices... you can chain them together). There are obvious advantages to this technology for monitoring water quality, and when I brought up the concept of a fish census, these professors all understood the challenge of figuring out how many fish are actually in the water, data that DFG/FGC need to assess stocks.
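That blower-before-measurement workflow is a nice example of the proxying idea. A sketch, again with hypothetical names of my own (`Actuator`, `Sensor`, `SelfCleaningSensor`) rather than real JDDAC types: a wrapper speaks the same sensor interface outward but chains a cleaning command to the blower before every reading, so the acquisition loop never has to know the blower exists - much like daisy-chained bus devices.

```java
// All names here are illustrative, not actual JDDAC APIs.

interface Actuator {
    void actuate();              // e.g. run the anti-fouling blower
}

interface Sensor {
    double read();               // take one measurement
}

class Blower implements Actuator {
    int runs = 0;                // count activations, for demonstration
    public void actuate() { runs++; /* spin the pump to clear fouling */ }
}

class TurbiditySensor implements Sensor {
    public double read() { return 3.1; } // stand-in for a real NTU reading
}

// Wrapper: presents the same Sensor interface, but proxies a cleaning
// command to the actuator before each measurement.
class SelfCleaningSensor implements Sensor {
    private final Sensor inner;
    private final Actuator cleaner;
    SelfCleaningSensor(Sensor inner, Actuator cleaner) {
        this.inner = inner;
        this.cleaner = cleaner;
    }
    public double read() {
        cleaner.actuate();       // clear bio-fouling first
        return inner.read();     // then take the actual measurement
    }
}
```

The payoff is that the "more wires, more maintenance" problem shrinks to one wireless link: the sequencing logic lives in software, and swapping the blower for a wiper or a different pump changes one constructor argument, not the data-collection code.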

As an avid angler of Bay waters concerned about Bay water quality, I was delighted to see such initiatives for real data collection get funded and proceed at such a rapid pace. I was also delighted when a senior colleague of mine who leads the specification team invited me to become one of the participants working on the software and network architecture. When the SFSU professors spoke about pier pilings and tidal currents and bio-fouling, I could picture exactly what they were dealing with, having spent some years now fishing these areas. And my colleague grinned, because he knew that for this particular project I was perhaps as eager to support the initiative as I was qualified for it, on both the software and marine sides.

And interestingly, word has come down the Fish & Game pipeline that some folks down south in Monterey may be interested in a sensor station. Funding has come through, and I heard they are building a new pier near the Moss Landing Jetty. From what I heard, this pier may be used primarily for research vessel mooring and Monterey Bay science projects, but that's early news and more details need to be researched. I need to contact some folks at CSU Monterey Bay to see if this is one of their projects and to ask if they might want to collaborate. I attended a great little symposium at CSUMB back in July or August and saw a lot of research posters from collaborating universities all over the West Coast and US. The symposium was sponsored by NOAA (Nat'l Oceanic and Atmospheric Administration).

Who knew just fishing the Bay from pier and shore and along our estuary waters would have such beneficial and synergistic consequences? More to come...

