So you just got something like an Arduino or Raspberry Pi kit with a few sensors. Setting up temperature or motion sensors is easy enough. But what are you going to do with all that data? It's going to need storage, analysis, and summarization before it's actually useful to anyone. You need a dashboard!
But even before displaying the data, you’re going to need to store it somewhere, and that means a database. You could just send all of your data off into the cloud and hope that the company that provides you the service has a good business model behind it, but frankly the track records of even the companies with the deepest pockets and best intentions don’t look so good. And you won’t learn anything useful by taking the easiest way out anyway.
Instead, let’s take the second-easiest way out. Here’s a short tutorial to get you up and running with a database backend on a Raspberry Pi and a slick dashboard on your laptop or cellphone. We’ll be using scripts and Docker to automate as many things as possible. Even so, along the way you’ll learn a little bit about Python and Docker, but more importantly you’ll have a system of your own for expansion, customization, or simply experimenting with at home. After all, if the “cloud” won’t let you play around with their database, how much fun can it be, really?
InfluxDB and Grafana wrapped up in Docker
Let's get our parts together, starting with the database. InfluxDB is an open-source time series database: it's fast and purpose-built for storing data that varies over time, like your sensor readings. You query it using an SQL-like language. Writing SQL is nobody's idea of fun, but it's standard.
Grafana is an analytics platform that lets you visualize data and do things like generate alerts. It supports plugins, which makes it easy to integrate with other software. We'll use it to talk to our InfluxDB and then do some awesome stuff.
Docker is a containerization platform, which basically means it lets us package our application, its dependencies, libraries, and components into a box so that everything is easier to maintain and move around. Most importantly, it allows us to use containers that others have already prepared to make our lives easier.
Set Up Docker
Now I am doing things in Linux, because I intend to migrate this to a Raspberry Pi once I’m done experimenting. That said, typing a couple of sentences into the command line is as easy as navigating drop-down menus anyway. The first step is to install docker and docker-compose:
sudo apt-get update && sudo apt-get install docker docker-compose
Not much else to it at this point. Restart your machine just in case. The documentation for the community edition can help answer a number of questions that you might have.
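A quick sanity check before moving on never hurts:
docker --version
docker-compose --version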
We said already that Docker images are readily available for use, so we need one image for InfluxDB and another for Grafana. To get things up and running quickly, clone my repository from GitHub.
git clone https://github.com/inderpreet/py_docker_grafana_influxdb_dashboard.git
docker-compose up -d
chmod +x add_source.sh && ./add_source.sh
cd pyclient && chmod +x setup_env.sh && ./setup_env.sh
python test.py

To test if things work, do a docker ps -a in a new terminal. This should return a list of containers running on your system. Copy the container ID from the list and run docker exec -it 1772a6e6387c influx
replacing the container ID with the one for your machine. You should get an InfluxDB shell and can run commands such as:
show users
show databases
create database hackaday
We need to create a database to store our sensor readings, so go ahead and do that. I have used "hackaday" as the name, though it can be anything.
Quickly, let's test things:
use hackaday
insert dummySensor v=123
insert dummySensor v=567
insert dummySensor v=000
select * from dummySensor
After typing the above in manually, you should be able to see the data in a time series.
The time stamp is in nanoseconds, and there can be more than one data value for a particular series, maybe one from each sensor. Given some better names, you can look at any given sensor's data using the following SQL query: select livingRoom from temperatureSensors. Note: if you're populating a test database by hand, do not put spaces between the comma and the next value field.
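For example, a hand-typed point with two fields might look like this (note that there is no space after the comma); the measurement and field names here are just placeholders:
insert temperatureSensors livingRoom=21.5,kitchen=23.1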
The Python script I have included will generate sensor data that grows exponentially and will reset the value to 1.1 once it hits 100.0. This should allow us to plot a dummy curve in Grafana.
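If you are curious, the idea behind it is roughly the sketch below; the growth factor and delay are assumptions, and the real test.py in the repo also handles the write into InfluxDB:
import time

value = 1.1
while True:
    print(value)          # test.py pushes this into the database instead of printing
    value *= 1.1          # assumed growth factor: the value climbs exponentially
    if value > 100.0:
        value = 1.1       # reset once it passes 100.0
    time.sleep(1)         # assumed sampling interval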
Set Up Grafana
Grafana is already up and running thanks to Docker and can be accessed at http://localhost:3000. Once you enter the URL in the browser, Grafana should greet you with a login prompt. You can then log in with username/password admin/admin. (You might want to change this pretty soon!)
Next, check that your InfluxDB is recognized by Grafana, using “Add Sources” if not. Click Save and Test to confirm, and make sure the default checkbox at the top is ticked.
Next we should create a dashboard. From the left menu bar, select Dashboard > Home and then New Dashboard. This should spit out an empty canvas with a few icons.
Select Graph and then click on the little title called Panel Title or simply press “e” on your keyboard. Then set it up so that:
– DataSource is set to InfluxDB
– selectMeasurement is set to sensorData
– fill is set to fill(previous)
And at the top, set the refresh to 5 seconds and the display range to the last 15 minutes.
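For reference, these panel settings boil down to an InfluxQL query roughly like the one below. Grafana's query editor builds it for you; $timeFilter and $__interval are Grafana macros filled in from the dashboard's time range, and the field name v is an assumption based on the dummy data we inserted earlier.
SELECT mean("v") FROM "sensorData" WHERE $timeFilter GROUP BY time($__interval) fill(previous)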
Reading Temperature
The reason I have left the Python script outside of the Docker container is so that life will be easier when we need to modify it to read real sensor data. For now, let's take data in from a locally connected sensor attached to a microcontroller that can speak serial to your computer, read by a Python routine.
import serial

# replace with your serial port name
ser = serial.Serial('/dev/ttyACM0')
ser.flushInput()

while True:
    try:
        temp_str = ser.readline()   # one reading per line from the microcontroller
        temp = float(temp_str)      # readline() returns bytes; float() handles them
        print(temp)
    except ValueError:
        pass                        # ignore lines that don't parse as a number
    except KeyboardInterrupt:
        print("Keyboard Interrupt")
        break
As long as you can get your Arduino or equivalent to spit out numbers as a string, this script will receive them and translate them into floating point numbers. I leave the part about pushing the data into the InfluxDB as an exercise for the reader. (Hint: you can copy and paste code from/into the test.py I have provided.)
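If you would rather not reverse-engineer test.py, here is a minimal sketch of what that push could look like, assuming the influxdb Python client (pip install influxdb) is available, InfluxDB is listening on localhost:8086, and the hackaday database from earlier exists. The measurement and field names are only examples:
import serial
from influxdb import InfluxDBClient

ser = serial.Serial('/dev/ttyACM0')   # replace with your serial port name
ser.flushInput()

# assumes the InfluxDB container is reachable on localhost:8086
client = InfluxDBClient(host='localhost', port=8086, database='hackaday')

while True:
    try:
        temp = float(ser.readline())
        # write one point into an example "temperatureSensors" measurement
        client.write_points([{
            "measurement": "temperatureSensors",
            "fields": {"livingRoom": temp}
        }])
    except ValueError:
        pass          # skip lines that don't parse as a number
    except KeyboardInterrupt:
        break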
Raspberry Pi and the Sense Hat
The next stage would be moving to a Raspberry Pi. Got one for Christmas? You are in luck. Clone my repository from GitHub and do the same steps as before. Instead of test.py, I have a sense_hat_temperature.py that can be executed. With this, you should be able to add real sensor data, temperature and humidity values, to the database.
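For reference, the sensor-reading half of that script boils down to a few calls from the sense_hat library. This is just a sketch of the relevant API, not the exact sense_hat_temperature.py from the repo:
from sense_hat import SenseHat

sense = SenseHat()
temperature = sense.get_temperature()   # degrees Celsius, read via the humidity sensor
humidity = sense.get_humidity()         # relative humidity in percent
print(temperature, humidity)
Writing these into InfluxDB then works the same way as the serial example above.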
Customization And Conclusion
The Docker system is configured using the docker-compose.yml file. In particular, there is a volumes variable that creates a local folder, influxdb_data, which is where the data is written. You could configure it to write to, say, an external drive or even a network drive.
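The relevant part of the file looks roughly like this; the host-side folder is whatever you choose, and the container-side path assumes the official influxdb image:
volumes:
  - ./influxdb_data:/var/lib/influxdb
Point the left-hand side at a mounted external or network drive and the database files will land there instead.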
In the case of the Raspberry Pi, this can be extremely useful. I wanted a way to display sensor data easily while getting a few Python and Docker experiments going. This is just the beginning, though, since there is a lot more that can be done.
So far, our sensor device is an Arduino tied to the computer by a serial cable. If we want to take this wireless, we’ll need to think about getting the data from the sensor to the database. You could do worse than using MQTT and WiFi on an ESP8266. For power-sensitive applications, consider Bluetooth LE.
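As a taste of the receiving end, here is a hedged sketch of an MQTT-to-InfluxDB bridge using the paho-mqtt (1.x) client. The broker address, topic, measurement, and field names are all placeholders you would pick yourself:
import paho.mqtt.client as mqtt
from influxdb import InfluxDBClient

db = InfluxDBClient(host='localhost', port=8086, database='hackaday')

def on_message(client, userdata, msg):
    # each MQTT message is assumed to carry one numeric reading as its payload
    try:
        value = float(msg.payload)
    except ValueError:
        return
    db.write_points([{"measurement": "sensorData", "fields": {"v": value}}])

mqtt_client = mqtt.Client()
mqtt_client.on_message = on_message
mqtt_client.connect('localhost')                      # your MQTT broker's address
mqtt_client.subscribe('home/livingroom/temperature')  # placeholder topic
mqtt_client.loop_forever()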
For now, get this system up and running and play around with local data. That way, when you build out the rest of your home sensing and automation network, the back end will be ready for you.