How we made a home environment sensor from a Raspberry Pi and our e-paper device

Swizec Teller, March 6th 2014

What happens when you deploy Visionect's stack to a Raspberry Pi? The stack is a WebKit instance on a server that renders websites and sends the resulting images to simple e-paper devices over the air. The device is just the interface; the server does all the heavy lifting.

Can the Raspberry Pi handle it?

Visionect's frontend wizard, Matevz, decided to find out. He found Raspberry Pi boards inside Cubesensors, mixed two of the most interesting Slovenian hardware startups together, and barely broke a sweat doing it.

Cubesensors


Cubesensors won a Launch award last year because they are small, cute, and awesome. It's a tiny plastic cube with a Raspberry Pi board that you put on a table, and it helps you make the room a better place.

There's a bunch of sensors in there - humidity, noise, air pollution, vibration, everything you can think of. Normally these cubes create a mesh network using ZigBee and talk to an app on your phone to say "Hey, you know what, maybe you should open a window."

Matevz hacked one of them to run Visionect's server software, become a Wi-Fi access point, and report straight to the V tablet without any need for internet or extra devices.

V tablets are especially well suited for this purpose since they use the awesome e-paper display from E Ink (just like your ebook reader). This means you'll be able to mount the device on a sunny porch and run it on battery power for weeks, because e-paper displays offer awesome sunlight readability at super low power consumption.

Result: self-contained Visionect stack on a Raspberry Pi with a bunch of sensors for environmental data, all being displayed on the awesome e-paper based V Tablet.

Visionect+Cubesensors+node.js == win


Contrary to what you might expect, shoving a server that usually takes a beefy processor and lots of RAM onto the 700MHz Raspberry Pi was completely uneventful. It just worked.

  • install some dependencies
  • make deb packages for Visionect things
  • use gvm to update Go and get the admin panel working
  • write a log parser for sensor logs
  • make a simple static file server
  • write some code to display data
  • hook up the tablets

And that's it. That's all it took. Imagine my disappointment: I was expecting a juicy tale of trial and error, of heroic hacking and triumph. Everything just ... worked.

Matevz says the hardest part was creating those deb packages, but you don't have to do that anymore. You can just run sudo apt-get install koala.

If you ever need to make a deb package, Matevz liked this seven-step guide; he says it was the least painful.

Parsing log files

Because there's still no official API, Matevz had to do some inventive hacking - parsing ZigBee logs.

When Cubesensors talk to each other, they leave a trail in /var/log/ziggy.stdout.log. I'm not sure why, but it's useful when you want to read the sensors every ten seconds.

Logs look like this:

[20140306T09:49:08.034570] New node id=409D, eui64=000D6F0003053413
[20140306T09:49:09.533744] 000D6F0003C16E48: data = {"noise": 16.532, "temp": 24.5, "fw": 28, "battery": 2464, "light": 1708, "voc": 400, "humidity": 1576, "pressure": 986.0, "voc_resistance": 14590, "shake": true}
[20140306T09:49:09.540727] 000D6F0003C16E48: score = 66.297 (temp=23.5/1.0 humidity=20.0326797386/-0.348102820428 voc=450/1 noise=16.532/1)
[20140306T09:49:09.544915] 000D6F0003C16E48: reply = '\x11\x00B' (kwargs={"home": true, "score": 66}, queue={})
[20140306T09:49:09.606970] Sequences: '8F' added
[20140306T09:49:09.842009] Sequences: '8F' done (ACK), removing.
[20140306T09:49:10.004599] 000D6F0003053413: received ping
[20140306T09:49:10.010692] 000D6F0003053413: reply = '\x10\x00' (kwargs={"home": true}, queue={})

Parsing those into a useful form is a simple matter of using node's Tail package and applying a regex to every line.

var Tail = require("tail").Tail,
    tail = new Tail("/var/log/ziggy.stdout.log"),
    // no "g" flag: a sticky lastIndex would make exec() silently skip lines
    pat = /\[.*?\] (.*?): data = (.*?})/,
    sensors = {};

tail.on("line", function(line) {
  var data = pat.exec(line);
  if (data) {
    // keyed by the sensor's id, value is the parsed JSON payload
    sensors[data[1]] = JSON.parse(data[2]);
  }
});

Simple.
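The dashboard in the next section polls a data.json file, so the parsed readings have to end up there somehow. One way to bridge the two is to periodically dump sensors to disk where the static server can pick it up - a minimal sketch, assuming a ./public directory and a ten-second interval (the exact setup Matevz used may differ):

var fs = require("fs");

// every ten seconds, write the latest readings where the static server can see them
setInterval(function() {
    fs.writeFile("public/data.json", JSON.stringify(sensors), function(err) {
        if (err) console.error("could not write data.json:", err);
    });
}, 10000);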

Displaying the data


Once your data is in a friendly JSON format, you have to slap together a simple dashboard-like app for the V tablet.

You'll need a static server first, but you're using node.js to parse logs anyway, so a server is just a node-static away. Or you can do it the old-fashioned, manual way like Matevz did.

But he's a frontend guy so we forgive him :)
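If you do go the node-static route, the whole server fits in a handful of lines. A minimal sketch, assuming the dashboard files (and data.json) live in ./public and that 8888 is the port you point the admin panel at later:

var static = require("node-static"),
    http = require("http"),
    file = new static.Server("./public");

http.createServer(function(request, response) {
    request.addListener("end", function() {
        // serve index.html, data.json and friends straight from ./public
        file.serve(request, response);
    }).resume();
}).listen(8888);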

Most of the interface is just a bunch of logic for selecting which cube to look at and going back from a specific display. fetchData, which fetches data and displays it, is far more interesting.

It's called every ten seconds and looks like this:

fetchData = function() {
    $.getJSON('data.json', function(new_data) {
        data = new_data;
        $.each(data, function(id, cube) {
            cube.voc = Math.max(cube.voc - 900, 0)*0.4 + Math.min(cube.voc, 900)
            cube.humidity = Math.min(95, Math.max(10, (cube.humidity -330*5.1)/(600/1000*5.1) + 55))
            cube.light = 10/6.0*(1+(cube.light/1024.0)*4.787*Math.exp(-(Math.pow((cube.light-2048)/400.0+1, 2)/50.0))) * (102400.0/Math.max(15, cube.light) - 25);
        });
        if ($('#data').is(':visible')) {
            renderData($('#data').data('id'));
        }
    });
}

mmmm, maths. And magic numbers. To be honest, I'm not at all certain how those conversions work or what any of the numbers mean, but the end result is a human readable display of temperature, noise, humidity, light, air pressure, and air quality in a room.
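renderData itself isn't shown in the post, but a hypothetical sketch of what a function like it might do - the element ids and units here are made up for illustration - looks something like this:

renderData = function(id) {
    var cube = data[id];
    if (!cube) return;

    // write each converted reading into its slot on the dashboard
    // (ids and units are assumptions, not Matevz's actual markup)
    $('#temp').text(cube.temp.toFixed(1) + ' °C');
    $('#humidity').text(Math.round(cube.humidity) + ' %');
    $('#noise').text(Math.round(cube.noise) + ' dB');
    $('#light').text(Math.round(cube.light) + ' lux');
    $('#pressure').text(cube.pressure + ' hPa');
    $('#voc').text(Math.round(cube.voc) + ' ppm');
};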


You can get the full source code here.

Hooking up the tablets

Time to turn your Raspberry Pi into a WiFi hotspot for the tablets.

You'll need a Wi-Fi dongle, some more deb packages, and a bit of configuration. Matevz says he used the TP-Link TL-WN722N dongle, but I'm sure you can use whatever's lying around the house, especially if it's based on an Atheros chipset.

Then you have to add deb http://mirrordirector.raspbian.org/raspbian/ wheezy non-free to /etc/apt/sources.list and run apt-get install firmware-atheros isc-dhcp-server hostapd to install all the packages.

After that, some configuration:

# /etc/network/interfaces

auto wlan0
iface wlan0 inet static
        address 192.168.20.1
        netmask 255.255.255.0
        network 192.168.20.0
        broadcast 192.168.20.255

# /etc/dhcp/dhcpd.conf

subnet 192.168.20.0 netmask 255.255.255.0 {
    range 192.168.20.100 192.168.20.200;
    option routers 192.168.20.1;
    interface wlan0;
}

# /etc/hostapd/hostapd.conf

interface=wlan0
driver=nl80211
ssid=WIFI_NAME
hw_mode=g
channel=11
wpa=1
wpa_passphrase=WIFI_PASSWORD
wpa_key_mgmt=WPA-PSK
wpa_pairwise=TKIP CCMP
wpa_ptk_rekey=600
macaddr_acl=0

# /etc/default/hostapd

DAEMON_CONF=/etc/hostapd/hostapd.conf

Then tell the Raspberry Pi to turn into an access point every time it boots with two commands:

update-rc.d isc-dhcp-server defaults
update-rc.d hostapd defaults

And voila. A self contained system where a bunch of sensors inside a plastic box hooked up to a Raspberry Pi talk to a palm-sized e-paper tablet.

Well, you have to tell the tablet to connect to this access point as well. But that's trivial - go into the settings and input the right IP addresses. Then don't forget to visit the Visionect admin panel at http://192.168.20.1:8150 and tell it which URL to serve. For the Cubesensors experiment this was http://192.168.20.1:8888.

Do it yourself

Let's recap. To turn your Raspberry Pi into a Visionect server you have to:

  • add Visionect's deb packages to sources.list (packages.visionect.si)
  • get a wifi dongle
  • convince your Pi to act as an access point (previous section)
  • tell the tablet where to connect

And you're done. Matevz made a room sensor display thingy on a V Tablet with an E Ink e-paper display, what are you going to build?

Swizec Teller is a geek with a hat. He's helping us create an ongoing showcase of interesting projects from our community. You should follow him on Twitter here.
