Table of Contents
- Part 1 - Introduction
- Part 2 - Installing and Configuring InfluxDB
- Part 3 - Deploying a Custom SmartThings SmartApp for Data Logging
- Part 4 - Installing and Configuring Grafana with an Initial Dashboard
What is InfluxDB?
If you're super interested in all the little details, you should absolutely go check out the InfluxDB website, but in a nutshell, InfluxDB is a time-series data platform. Unlike other database platforms that store data based on a primary key you define, InfluxDB uses a timestamp as the primary key for data points (I'm obviously over-simplifying to make this a little easier to understand). InfluxDB supports auto-expiration of data (think of this as the TTL on a DNS entry, if that helps), automatically purging data once it expires. The other big advantage to InfluxDB is that any past experience you may have with, say, SQL is applicable here. InfluxDB's query language and syntax have a lot in common with SQL, and you'll see that when we get to Part 4 and build our first dashboard.
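To make the timestamp-as-primary-key idea concrete, here's what a single data point looks like in InfluxDB's line protocol. The measurement, tag, and field names below are made up for illustration; the format is measurement name, optional tags, fields, and a nanosecond timestamp:

```
temperature,room=living_room value=21.5 1502027520000000000
```

A point written with the same measurement, tag set, and timestamp as an existing point overwrites it, which is what makes the timestamp behave like a primary key.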
Before getting started, make sure your OS is up to date. I'm using Ubuntu 16.04, so running
sudo apt-get update && sudo apt-get upgrade && sudo reboot
does the trick. If you're using a local device (such as a virtual machine, Raspberry Pi, or Pine64), you're also going to want to make sure you have a static IP set.
InfluxDB stores all of your data in UTC (coordinated universal time). To make things a little easier on ourselves, we'll first set our server to also use UTC.
sudo timedatectl set-timezone Etc/UTC
Next we want to download the install package for InfluxDB. The InfluxDB website contains a list of the packages available for different operating systems; simply pick the one that matches your OS and run the commands. In my case, for Ubuntu 16.04:
wget https://dl.influxdata.com/influxdb/releases/influxdb_1.3.2_amd64.deb
sudo dpkg -i influxdb_1.3.2_amd64.deb
Now it's just a matter of starting InfluxDB:
sudo service influxdb start
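Before moving on, it's worth a quick sanity check that the service actually came up. One way (assuming InfluxDB's default HTTP port of 8086) is to hit its /ping endpoint, which answers with a 204 No Content when the daemon is healthy:

```shell
curl -sL -I http://localhost:8086/ping
```

If you get no response at all, check the service status with sudo service influxdb status before continuing.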
Now that InfluxDB is installed, we just need to do a little configuration before we can start sending data to it. The first step is to connect to the InfluxDB shell by running
influx
Next we need to create a database that we'll send the data to from SmartThings by running
CREATE DATABASE SmartThings, and validate that the database was created by running
SHOW DATABASES
A Note About Retention Policies
When you create a new database, like we did with
SmartThings above, InfluxDB automatically creates a default retention policy named
autogen that has infinite retention. We can validate this by running
SHOW RETENTION POLICIES ON "SmartThings".
You can read up on how to manage retention policies in the InfluxDB documentation, but for our purposes we're just going to modify the default retention policy to hold our data for 55 weeks with a shard group duration of 1 week. This means we'll capture a rolling 55 weeks of data, dropping the oldest week each time data is purged. I chose 55 weeks instead of 52 purely so I can look back at a full calendar year's worth of data for up to 3 weeks afterwards; if I don't manage to review January 1 - December 31 on exactly the 31st of December, it's no big deal. The extra 3 weeks simply buys me a little time.
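The modification itself is a single InfluxQL statement run from the influx shell; this is a sketch assuming the database and policy names from above:

```sql
ALTER RETENTION POLICY "autogen" ON "SmartThings" DURATION 55w SHARD DURATION 1w
```

You can re-run SHOW RETENTION POLICIES ON "SmartThings" afterwards to confirm the new duration took effect.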
Note that we didn't set up any authentication as part of this InfluxDB install. This means InfluxDB will accept any request it receives, regardless of where that request came from. That's fine for me because I'm using a local virtual machine behind my own network's firewall. If you've chosen to host your InfluxDB on a publicly accessible server, such as at Digital Ocean, you'll absolutely want to harden your server and configure InfluxDB authentication.
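If you do need authentication, the broad strokes (sketched here with a placeholder username and password) are to create an admin user from the influx shell:

```sql
-- placeholder credentials, pick your own
CREATE USER admin WITH PASSWORD 'change-me' WITH ALL PRIVILEGES
```

and then enable authentication in the [http] section of /etc/influxdb/influxdb.conf:

```
[http]
  auth-enabled = true
```

Restart InfluxDB afterwards (sudo service influxdb restart) so the setting takes effect; from then on, every request must supply valid credentials.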
InfluxDB is now up and running on our server; we've created a database (SmartThings) and set a retention policy to keep data in that database for a rolling 55 weeks. In Part 3 we'll start sending data to InfluxDB from SmartThings.