Good afternoon folks!
I have a few Raspberry Pis on my home network now, serving various purposes such as media centers, home automation servers, and print servers. Now I have bought another one of these awesome little computers for use in my datashed.
I planned to turn it into an environmental monitoring solution, so I can see various temperatures, the humidity, and other glorious readings. I finally got around to ordering the parts and writing the code up for it.
I have only got the Pi logging temperature for now, but I aim to add lots of other sensors to make it much more functional, like I said earlier – humidity and smoke sensors etc.
Shopping list (for full project)
- Raspberry Pi Model B
- 4.7k resistor
- DS18B20 sensors
- DHT22 sensor
- piezo vibration sensor
- smoke sensor
- IR break beam sensor
- flame sensor
I only used the 4.7k resistor, the Pi and the DS18B20 sensors in this part of the project; all the other components will be used later on in the pipeline.
Step 1 – Setting up the Pi
# install updates
sudo apt-get update
sudo apt-get upgrade
# reboot to install updates
sudo reboot -n
# install the packages
sudo apt-get install vim screen iftop htop snmpd
Now that's out of the way, you need to install a couple of Python libraries:
sudo pip install w1thermsensor
sudo pip install influxdb
and then load the 1-Wire kernel modules so the Pi can read the serial data from the sensors (w1-therm is what actually exposes the temperature readings):

sudo modprobe w1-gpio
sudo modprobe w1-therm

Once the modules are loaded, you need to add a line so the 1-Wire overlay is enabled on startup:

sudo vim /boot/config.txt
# then add to the end:
dtoverlay=w1-gpio

Reboot afterwards for the overlay to take effect.
Connecting the sensors
Before you can get on with writing the Python code, you need to connect the sensors up.
You can wire up a single sensor, or multiple sensors, as per the picture below:
SSH to the Pi and do the following:

cd /sys/bus/w1/devices/
ls
Make a note of the device names within this directory – each DS18B20 shows up as a directory starting with 28-.
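Each of those 28- directories contains a w1_slave file holding the raw reading. If you're curious what the w1thermsensor library is doing under the hood, parsing that file yourself is only a few lines – here's a sketch with made-up sample data (not code from the repository):

```python
# Sketch: parse the raw w1_slave output from a DS18B20.
# The kernel exposes each reading as two lines: "YES" on the first line
# means the CRC check passed, and "t=" on the second line is the
# temperature in thousandths of a degree Celsius.

def parse_w1_slave(raw):
    lines = raw.strip().splitlines()
    if not lines[0].endswith("YES"):
        raise ValueError("CRC check failed")
    # temperature follows "t=", in milli-degrees C
    millidegrees = int(lines[1].split("t=")[1])
    return millidegrees / 1000.0

# hypothetical sample contents of a w1_slave file
sample = (
    "72 01 4b 46 7f ff 0e 10 57 : crc=57 YES\n"
    "72 01 4b 46 7f ff 0e 10 57 t=23125\n"
)
print(parse_w1_slave(sample))  # 23.125
```

In practice the w1thermsensor library does this (plus retries and error handling) for you, which is why we installed it earlier.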
Installing the python code / dependencies
You'll need to install the influxdb and w1thermsensor Python libraries (from step 1) before this will run.
Once installed, cd to your home directory and clone the repository containing temp.py:

cd
git clone https://github.com/ainsey11/SendPiTempToGrafana
cd SendPiTempToGrafana
vim temp.py
Then change the values for sensor_1 through sensor_5 to match the device names you got from the last section.
Save and exit the file; ignore the server details at the bottom for now.
Don't run anything else on the Pi for the moment.
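To give you a feel for what the script does, here's a hedged sketch of how readings could be shaped into the JSON body that the influxdb library's write_points() expects. The measurement name "temperature", the tag names and the sensor IDs below are my assumptions for illustration, not necessarily what temp.py uses:

```python
# Sketch: turn {sensor_id: degrees_c} readings into InfluxDB points.
# Measurement/tag names and sensor IDs are assumptions, not the repo's.

def build_points(readings, host="datashed-pi"):
    """Build the list-of-dicts body that influxdb-python's
    write_points() accepts: one point per sensor reading."""
    return [
        {
            "measurement": "temperature",
            "tags": {"host": host, "sensor": sensor_id},
            "fields": {"value": float(temp)},
        }
        for sensor_id, temp in sorted(readings.items())
    ]

points = build_points({"28-000006a432ab": 21.5, "28-000006a1f2cd": 19.875})
print(points[0]["tags"]["sensor"])  # 28-000006a1f2cd (sorts first)
```

The full script would then push these with something like InfluxDBClient("<serverip>", 8086, "user", "pass", "pi_temp").write_points(points) – write_points() is the real influxdb-python call, but the credentials here are placeholders.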
Installing the InfluxDB and Grafana software
You could do this on the Pi, but it may struggle, so I'll write this up as if it were a separate server.
In my case I installed Ubuntu Server 14 on a VM: 80GB disk, single NIC, 4GB RAM and 2 vCPUs.
Once installed, set a static IP for the server. Make note of what you set it to.
cd
sudo apt-get update
sudo apt-get upgrade
sudo apt-get install screen htop vim
sudo wget https://s3.amazonaws.com/influxdb/influxdb_0.11.0-1_amd64.deb
sudo dpkg -i influxdb_0.11.0-1_amd64.deb
At this point browse to http://&lt;serverip&gt;:8083
If the admin page loads up, click connect and create a new database called pi_temp.
Then create a user for the database; the username and password can be whatever you wish.
Back to the server...

wget https://grafanarel.s3.amazonaws.com/builds/grafana_2.6.0_amd64.deb
sudo apt-get install -y adduser libfontconfig
sudo dpkg -i grafana_2.6.0_amd64.deb
sudo service grafana-server start
You can set Grafana to start automatically on boot; see their manual for more info.
Again, browse to the server: http://&lt;serverip&gt;:3000
Log in with admin/admin.
from here, you can create a new dashboard with the graph / stat setup of your choice.
First, you'll need to configure the database source.
Go to: http://&lt;serverip&gt;:3000/datasources
Click Add new at the top and fill out the form as required. Hit Test to make sure it's working; if so, click Save.
Then go to: http://&lt;serverip&gt;:3000/import/dashboard
Download the environmental file from here and select it in the import page. Change the data source to the database source you just made.
You'll then see a dashboard appear with a big graph of the 5 sensors. You could make this into single-figure panes and all sorts – check out the Grafana documentation, it's great!
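For reference, the big graph in that dashboard boils down to an InfluxQL query along these lines ($timeFilter and $interval are Grafana's built-in template variables; the "temperature" measurement and "sensor" tag names are my assumptions – check what the script actually writes for the real names):

```sql
SELECT mean("value") FROM "temperature"
WHERE $timeFilter
GROUP BY time($interval), "sensor"
```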
I have a huge setup with Grafana: PowerShell modules that pull data from Exchange into InfluxDB, Python server ping checks, and more. Over time I'll write them all up for you to have a go with!