Monitoring a Synology Diskstation with Munin

I have been using Munin to monitor the health of my Raspberry Pi for a while now. As more devices joined my network I was looking for a way to monitor them as well. Munin uses a client-server model, so you normally install a Munin node on every device to be monitored. Every five minutes the Munin server polls its clients for values and creates charts using RRDtool.

One of the devices is a network attached storage by Synology I use for backups as well as external storage for movies, music and other stuff. The Diskstation is basically a server that runs a customized Linux (DiskStation Manager). In various forums I found information about how to set up the Synology Diskstation as a Munin node. An easier way is to use the Simple Network Management Protocol (SNMP). Just activate the SNMP service in the control panel of the Diskstation, give it a community name and you are all set:

Now tell the Munin master the community name, so it can fetch data from the Diskstation. To do so, create a file inside the /etc/munin/plugin-conf.d/ folder with the name of the host (e.g. diskstation) containing two lines:

[snmp_diskstation_*]
env.community secret_community_name

The community string acts as a password here. One note: with the community-based SNMP version 2 (SNMPv2c) the data is sent in clear text (unencrypted) over the network. It should only be used inside LANs behind firewalls, not in WANs. If security is a concern you should consider using SNMP version 3 (SNMPv3).
The Munin master now knows the community name yet is unaware of what data to collect from the device. You can use the command munin-node-configure to discover the available plugins:

sudo -u munin munin-node-configure --snmp diskstation --snmpcommunity secret_community_name --shell

Munin now polls the device "diskstation" using the SNMP community "secret_community_name" and prints the corresponding symlinks which have to be added to /etc/munin/plugins. You can either copy and execute them manually or just pipe them to a shell using | bash. After creating the symlinks, add the following host entry to your /etc/munin/munin.conf to have the Diskstation appear as a separate host. (Note: the IP address has to be the one of the Munin master which does the polling, not the IP address of the Diskstation.)

[diskstation]
    address 127.0.0.1
    use_node_name no

Once added, the last step is restarting the Munin node:

sudo /etc/init.d/munin-node restart

After a couple of minutes the Munin server fetches the first values, stores them in its RRD files and generates the charts:

SolarPi – A Flask powered photovoltaic monitor

After collecting some photovoltaic data using PikoPy and some readings from the residential meter it was time to put everything together. The data is collected by a couple of scripts run from cron every 5 or 15 minutes:

$ crontab -l
*/5 * * * * python /home/solarpi/
*/5 * * * * python /home/solarpi/
*/15 * * * * python /home/solarpi/

The results are then written into a SQLite database. I chose SQLite over PostgreSQL for its lighter footprint on the Raspberry Pi. And since there will be no concurrent writes, SQLite should be capable of serving as the database backend.

The sole purpose of the web application is to fetch data from the database and render the results. I decided to use Flask, a microframework for Python based on Werkzeug and Jinja2. For the frontend I used the Bootstrap-based SB Admin 2 theme.

Initially I used SQLAlchemy as an object-relational mapper, defining my models in Python:

class PVData(SurrogatePK, Model):
    __tablename__ = 'pv_data'
    id = Column(db.Integer(), nullable=False, primary_key=True)
    created_at = Column(db.Text(), nullable=False, default=dt.datetime.utcnow)
    dc_1_u = Column(db.Integer(), nullable=True)
    dc_1_i = Column(db.Float(), nullable=True)
    ac_1_u = Column(db.Integer(), nullable=True)
    ac_1_p = Column(db.Integer(), nullable=True)
    dc_2_u = Column(db.Integer(), nullable=True)
    current_power = Column(db.Integer(), nullable=True)
    daily_energy = Column(db.Float(), nullable=True)
    total_energy = Column(db.Integer(), nullable=True)

While using an ORM is very helpful for create, update and delete operations, I decided to write my read queries in raw SQL instead.

The main view is a dashboard presenting all kinds of interesting KPIs like yielded, imported and exported energy, and how much money was received for the sold energy:

SolarPi Dashboard

Furthermore, there are detailed charts available for each day at a five-minute interval, as well as for the last 7 and the last 30 days:

SolarPi Daily Chart

The project is live at:

The code is on Github at

PikoPy: A python package for working with a Piko Inverter from Kostal

The first step of my plan of building a Raspberry Pi based photovoltaic monitoring solution is finished. I created a Python package that works with the Kostal Piko 5.5 inverter (and should, in theory, work with other Kostal inverters as well) and offers a clean interface for accessing the data:

from pikopy import Piko

# create a new Piko instance
p = Piko('host', 'username', 'password')

# get current power
print(p.get_current_power())

# get voltage from string 1
print(p.get_string1_voltage())

The next steps are collecting the data and processing it.

The library can be downloaded from
The code is on Github:

Determine your Fitbit stride length using a GPS watch

I have been carrying my Fitbit One with me for a little over two years now and it keeps tracking my daily steps. It also tracks the distance covered by multiplying those steps by your stride length, which you can either provide explicitly or implicitly by setting your height. In the winter of 2012 I bought my first GPS watch, a Garmin Forerunner 410 (since replaced by a Garmin Forerunner 920XT), to help me track my running (and other outdoor) activities. Since then I have worn it at every activity. Additionally, before every running session I started a Fitbit activity to measure my step count. Over the last 18 months I have recorded 69 runs using both the Fitbit and the watch. I put the data in a CSV file so I could load it into R. Then I performed a linear regression to determine my stride length.

Fitbit Stride Length Regression

In an ideal world, where the GPS watch would perform with 100% accuracy and the Fitbit would recognize every single step taken, the regression line would pass through the origin. Unfortunately neither of them is that accurate. Our model gives us an intercept of 398.46 m with a slope of 1.00421 and an adjusted R² of 0.97505. Based on that I have covered a whole lap of a competitive running track before taking a single step. 😉

I have noticed that the placement of the Fitbit has a huge impact on the number of steps recorded. Attached to the pocket of my running shorts it seems to miss certain steps. I get much better results attaching it directly to my waist. Overall I am quite satisfied with the accuracy provided by the Fitbit. If you do a lot of interval training or vary your stride from session to session, your results will be further off.

The code for the regression is on Github:

A Raspberry Pi photovoltaic monitoring solution

A friend of mine had a photovoltaic system (consisting of 14 solar panels) installed on his rooftop last year. As I was looking for another Raspberry Pi project, I convinced him to let me set up a reliable monitoring solution that would give him access to the data in real time. The current setup comes with an inverter made by Kostal.

Kostal Piko 5.5 Inverter

The Kostal Piko 5.5 runs an internal web server showing statistics like current power, daily energy and total energy, plus specific information for each string. However, the visual layout of the web page and the bundled software look like they were designed in the late 90s, and development appears to have been abandoned:

Kostal Piko 5.5 Web Interface

The power inverter itself logs data at either a 15-minute interval for about 100 days or at a one-hour interval for about 400 days. Currently the 15-minute interval is selected. Furthermore, my friend took notes from the built-in liquid-crystal display at irregular intervals.
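A quick back-of-the-envelope check shows that the two retention settings correspond to roughly the same number of stored samples, so the choice is purely a resolution-versus-history trade-off:

```python
# ~100 days of 15-minute samples vs. ~400 days of hourly samples
slots_15min = 100 * (24 * 60 // 15)  # 96 samples per day
slots_hourly = 400 * 24              # 24 samples per day
print(slots_15min, slots_hourly)     # both come out to 9600
```

In other words, the inverter appears to keep a fixed-size log of about 9,600 entries, and the interval setting only decides how far back they reach.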

The final plan is now set. A Raspberry Pi would fetch data from the inverter, store it in a database and a web application would render the results to a browser:

SolarPi SetUp

The project is live at

Analyzing Sleep with Sleep Cycle App and R

I have been tracking my sleep for almost two years now using my Fitbit. I started with the Fitbit Ultra and then moved on to the Fitbit One after it came out. In October 2013 I found out about the Sleep Cycle app for the iPhone. For weeks, Sleep Cycle was listed as the best-selling health app in Germany, where it currently (as of January 2014) sits in second place. The app promises to wake you up in the morning without feeling tired. This is indeed possible if the alarm goes off during light sleep and not during deep sleep. It also allows you to set some kind of sleep music (white noise) to help you fall asleep. After reading all the positive reviews on the App Store I decided to give it a try.

The app promises to wake you up in a time frame of up to 30 minutes prior to the alarm you set, if it detects your movement in the morning. Even more important to me than the actual smart alarm feature was the possibility to collect some data while sleeping. In the morning you are presented with a chart of your sleep pattern of the previous night:
Sleep Cycle Screenshot of last night

The app also allows you to export the database as a comma-separated file containing: the time you went to bed, the time you woke up, sleep quality in %, wake-up mood and user-defined sleep notes. This gives you the opportunity to do some more analysis. I decided to fire up R and create my own charts.
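As a sketch of where such an analysis starts, the nightly sleep duration can be computed straight from the exported file. Note that the column names, the semicolon delimiter and the sample rows below are assumptions for illustration, not the app's exact export format.

```python
import csv
import io
from datetime import datetime

# Made-up sample of an export; real files come straight from the app.
sample = """Start;End;Sleep quality;Wake up;Sleep Notes
2014-01-05 23:40:00;2014-01-06 07:10:00;86%;:);Drank coffee
2014-01-06 23:55:00;2014-01-07 06:30:00;72%;:|;Exercised
"""

def sleep_durations(text):
    """Return hours slept per night, computed from the Start/End columns."""
    rows = csv.DictReader(io.StringIO(text), delimiter=";")
    fmt = "%Y-%m-%d %H:%M:%S"
    return [
        (datetime.strptime(r["End"], fmt) - datetime.strptime(r["Start"], fmt)).total_seconds() / 3600
        for r in rows
    ]

print(sleep_durations(sample))  # hours slept per night
```

From there, joining the durations with the quality column and the sleep notes gives exactly the kind of scatter plots shown below.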

So far I have used the app to track 100 nights of sleep and decided to peek into the data. Let's take a look at how long I slept each night:
Sleep Duration over time

It looks like the longer I slept, the higher the sleep quality. A scatter plot of the data gives:
Sleep time vs. sleep quality

The chart also takes the sleep notes into consideration. You can clearly see that sleeping away from home results in lower sleep quality. The same applies to exercising (note: I tagged a night with "exercising" when I worked out late in the evening). On the contrary, taking melatonin (dosage: 3 mg) increased the sleep quality.

Averaging the sleep quality by month shows that January was worse than the previous months. One explanation is a vacation I took, during which I did not sleep well at all.
Average sleep quality by month

The R code for the data wrangling and the charts:

Berlin Marathon 2014 Participants

After the 2013 Berlin Marathon sold out in less than four hours, the organizers decided to alter the registration process for 2014. First there was a pre-registration phase, followed by a random selection from the pool of registrants to receive a spot. Those who were selected had to register by November 11th, 2013. Any spots that were not confirmed by the 11th would be offered to pre-registered candidates according to the order in which they were randomly selected.

The organizers' website provides a list of all registered participants of the event. Being curious how many runners are on the list I had two options: 1. going through the entire list and counting or 2. downloading the entire list and letting the computer do the math. I chose the latter. If you have checked the website already you saw that they only present parts of the list at a time and reload it asynchronously while you scroll down. I wrote a little Python script that queries the data and saves the JSON response to a CSV file.
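The scraping logic itself is simple. Here is a hedged sketch: the real endpoint, page parameters and JSON field names are not reproduced here, so fetch_page stands in for the actual HTTP request, and the field names in write_csv are made up.

```python
import csv

def collect_participants(fetch_page, page_size=100):
    """Page through the asynchronous endpoint until an empty batch is returned."""
    participants = []
    page = 0
    while True:
        batch = fetch_page(page, page_size)  # expected to return a list of dicts
        if not batch:
            break
        participants.extend(batch)
        page += 1
    return participants

def write_csv(path, participants, fields=("name", "nation", "year_of_birth")):
    """Dump the collected JSON records into a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(fields))
        writer.writeheader()
        writer.writerows(participants)
```

Keeping the pagination loop separate from the HTTP call also makes it trivial to test the script without hammering the registration site.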

Checking the file we see there are only 16,707 participants so far. Sure, there are spots sold to agencies or given to sponsors, but how many will be handed out in the second wave? Until they announce the results, let's look at the data:

The distribution of the year of birth shows a bimodal pattern. Most runners were born in the late 60s or early 70s, and then there is a second spike around 1980.
Looking at the top 10 participating nations it is no surprise that Germany stands out by far, followed by Great Britain and Denmark. The only non-European country in the top 10 is the United States of America:

The code for generating the images:

Update: The official registration period is over and 23,286 runners have signed up.

Installing Oracle Java 7 SDK on the Raspberry Pi

Two days ago the official hard-float Oracle Java 7 JDK was announced on the official Raspberry Pi blog. Prior to this there was only the OpenJDK implementation, which lacked performance.

Furthermore, the Raspberry Pi Foundation announced that future Raspbian images would ship with Oracle Java by default.

If you want to give it a spin you can install the JDK with:

sudo apt-get update && sudo apt-get install oracle-java7-jdk

Using htop to monitor system processes on the Raspberry Pi

If you work a lot on the command line you are probably familiar with the top utility to see what process is taking the most CPU or memory. There’s a similar utility called htop that is much more powerful. On top of the information that top provides, htop additionally shows your usage per CPU, as well as a meaningful text graph of your memory and swap usage right at the top.

htop running on the RaspberryPi

As if that wasn’t enough, you can just use your up/down arrow keys to select a process, and then you can kill it with the F9 key if you like, or you can change the priority by using the F7 and F8 keys.

To install htop simply use
sudo apt-get install htop

Once installed you can start it by simply typing htop.

Five years of Weight Tracking

After I moved back from New Jersey in June 2008, I started to track my body weight more seriously. My routine usually consists of getting up and, after finishing my morning bathroom routine, stepping on the scale. That way I try to ensure that the conditions for each weighing are as similar as possible. I recorded my weight on paper and would eventually put everything into a spreadsheet for further analysis.

In January 2011 I upgraded my bathroom scale to the Withings WiFi Body Scale. That way I could automate the process of tracking my weight by just stepping on it. No more writing on paper and eventually transferring everything into spreadsheets.

The people at Withings provide an API so external services can access your weight data. A nice way to get better charts is through TrendWeight: just link your Withings account to the web site and it will generate nice JavaScript charts of your weight, body fat and lean mass. Another great feature is the export functionality, which lets you download a comma-separated version of your data including trend values for total weight as well as fat %, fat mass and lean mass.
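As I understand it, the trend values in that export follow a "Hacker's Diet" style exponentially smoothed moving average; the 10% smoothing factor below is an assumption. A minimal sketch of that smoothing:

```python
def trend(weights, factor=0.1):
    """Exponentially smoothed trend: each new measurement moves the trend
    only a fraction of the way towards the measured weight."""
    smoothed = []
    current = weights[0]
    for w in weights:
        current += factor * (w - current)
        smoothed.append(current)
    return smoothed

print(trend([100.0, 99.5, 100.2, 99.0]))
```

The point of the trend line is that day-to-day water-weight noise barely moves it, while a genuine gain or loss shows up as a steady drift.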

Using the exported data we can fire up R to look at the data:

The generated chart shows my weight loss in 2009, when I started cycling again after years of absence. From my low in the fall of 2009 I gradually gained weight throughout 2010 and 2011 until 2012, when I hit the 100 kg mark around Christmas time:

Body Weight 2008 - 2013

That was enough, and I decided it was time to start working out again. So far I am on a good downward trend, and I have to keep up that momentum. My next goal is to get between 80 kg and 85 kg and then maintain my weight. The color of the dots reflects my body fat percentage, and there seems to be a strong correlation between body fat and my actual weight:

Body Weight vs. Body Fat Percentage

Doing a simple linear regression gives us an adjusted R-squared of 0.8419.