Markus Stocker


Back in June I wrote a short note on how sensor data could be served to applications through a platform- and language-independent interface. As described there, the main idea is to expose sensor data servers using the REST architectural style: HTTP is the interface, parameterization occurs via the URL, and the data is presented according to the type of calling agent, e.g. a person using a browser, a piece of software, or something in between (like you and your Linux shell).
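To make the URL parameterization concrete, here is a minimal sketch of how such a request URL might be assembled. The host and parameter names (`sensor`, `start`, `end`) are made up for illustration; they are not the actual Resense parameters.

```python
from urllib.parse import urlencode

def build_request_url(base, sensor, start, end):
    """Assemble a Resense-style request URL; parameter names are illustrative."""
    query = urlencode({"sensor": sensor, "start": start, "end": end})
    return f"{base}?{query}"

url = build_request_url("http://example.org/resense/timeseries",
                        "temperature-1", "2010-08-01", "2010-08-05")
print(url)
```

Since all parameters ride on the URL of a GET request, any HTTP client, from a browser to curl, can address the same resource.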

I have invested a few hours in prototyping the idea, and the result is worth an update here, I think. Given the nature of sensor data, I’m using only GET (at least for now). Thus, the main goal is to define a URL that carries all the required parameters (here for a proprietary system) and to implement some code for content negotiation. I have been using Restlet (thanks guys, the world would not be as happy without people writing such open source code!). What follows is an example of a URL requesting a time series of sensor data (the URL is stripped of parameters irrelevant to this discussion):


Resense, the RESTful sensor server described here, executes some code according to the HTTP request and returns output that depends on the calling agent. Currently, text/html returns a plot of the time series embedded in HTML (using Flot, yet another nice library); this is meant for people requesting sensor data through a browser. Accept: application/xml returns the same data in, well, XML. This can be tested with a simple shell command, e.g.
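The dispatch behind this amounts to mapping the request’s Accept header to a representation. A minimal sketch in plain Python, not the actual Restlet code; the renderer names are made up:

```python
# Map supported media types to renderers; a real server would also honor
# quality values (q=) when ranking alternatives, which this sketch ignores.
RENDERERS = {
    "text/html": "html_plot",      # Flot plot embedded in HTML
    "application/xml": "xml",      # the time series as XML
    "application/rdf+xml": "rdf",  # RDF/XML for semantic clients
}

def negotiate(accept_header, default="html_plot"):
    """Return the renderer for the first supported type in the Accept header."""
    for part in accept_header.split(","):
        media_type = part.split(";")[0].strip()  # drop any ;q=... parameters
        if media_type in RENDERERS:
            return RENDERERS[media_type]
    return default  # unknown types fall back to the browser-friendly plot
```

With Restlet, this selection is handled by the framework’s representation variants rather than hand-written dispatch, but the effect is the same: one URL, several representations.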

curl -H "Accept: application/xml" \

I also plan to serve an image representation of the time series, e.g. image/png, which could be used in HTML img tags. The most interesting case, however, is application/rdf+xml: what you get back, of course, is RDF/XML-encoded data. The following might not surprise the semantic-web-savvy reader, but perhaps others. The URL, i.e. the resource, discussed here can be used as an RDF sensor data server, so we can take the served RDF data and process it further, e.g. with SPARQL. For instance, assume you want to separate real sensor measurements from generated (missing data) values in a time series served by Resense. The SPARQL query might look like

PREFIX resense: <>

SELECT ?observation ?timestamp ?value
FROM <>
WHERE {
  ?observation resense:IsObservation "false" .
  ?observation resense:Timestamp ?timestamp .
  ?observation resense:Value ?value .
}
ORDER BY DESC(?timestamp)

Here we query Resense directly (FROM), which returns RDF data from which we select the generated observations together with their timestamp and measurement value. An excerpt of the result set may be

| observation           | timestamp    | value                   |
|-----------------------|--------------|-------------------------|
| resense:1281018990227 | "2010-08-05" | "0.0016412593963763967" |
| resense:1281018979727 | "2010-08-05" | "0.0016531616825412035" |
| resense:1281018945149 | "2010-08-05" | "0.0016480003663514493" |
| resense:1281018930133 | "2010-08-05" | "0.0016621107210564752" |
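Outside a SPARQL engine, the same selection is just a filter plus a descending sort over the observations. A sketch in plain Python over hypothetical parsed observations; the dict layout is an assumption that mirrors the resense:IsObservation, resense:Timestamp, and resense:Value properties:

```python
# Hypothetical observations as parsed from the served RDF data.
observations = [
    {"id": "resense:1281018990227", "is_observation": False,
     "timestamp": "2010-08-05", "value": 0.0016412593963763967},
    {"id": "resense:1281018930133", "is_observation": True,
     "timestamp": "2010-08-05", "value": 0.0016621107210564752},
]

def generated_values(obs):
    """Keep only generated (non-measured) entries, newest first,
    like the IsObservation "false" pattern in the SPARQL query."""
    rows = [o for o in obs if not o["is_observation"]]
    return sorted(rows, key=lambda o: o["timestamp"], reverse=True)

for row in generated_values(observations):
    print(row["id"], row["timestamp"], row["value"])
```

The advantage of the SPARQL route is that this logic lives in the query, not in client code, and works against any server that speaks RDF.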

There are still many things that can be improved, like proper data typing. More interesting would be to design a proper OWL ontology and classify sensors, e.g. temperature, pressure, humidity, seismic, etc. Data pushed by sensors to, or pulled by, a sensor proxy (data server) and managed, e.g., with a standard RDBMS could be served by Resense and be self-describing according to the ontology. A catalog service could then index Resense data servers, while agents (people and software) could formulate queries using the ontology. A SPARQL query could be built dynamically according to the agent’s query formulation, not unlike the example above. Of course, the ontology could also enable some interesting OWL inference, which might be extended with spatio-temporal reasoning and lead to a platform for conceptual-spatial-temporal reasoning and querying over sensor data.