Today I worked out an example of using InfluxDB from Django in Docker. Using Docker containers to run databases greatly reduces the amount of database configuration you need to worry about when you’re trying to work out a proof of concept.
InfluxDB is a great tool for storing timestamped data. Storing a timestamp and a set of measurements, one timestamp per row, in a Postgres database is possible, but inefficient. InfluxDB instead lets you store a set of field values and a set of indexed metadata tags with each point.
For example, if you’re collecting hourly production data from multiple wells, you can store the rates as field values and the well names as indexed tags. Looking up the production from a set of wells over some time period then becomes very efficient, because tags are indexed. Looking up wells by production rate, however, would be very inefficient, unless you stored the rate data as a tag and the well names as fields. The InfluxDB documentation has more on choosing between tags and fields.
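To make the tag/field distinction concrete, here is a minimal sketch of what one such point could look like when written with the Python influxdb client. The measurement, tag, and field names are made up for illustration, and the connection details are covered later in this post.
from influxdb import InfluxDBClient

# Hypothetical example: one hourly production point for one well.
client = InfluxDBClient('localhost', 8086, database='example')
point = {
    "measurement": "production",        # made-up measurement name
    "tags": {"well": "well-42"},        # indexed metadata: cheap to filter on
    "fields": {"oil_rate": 118.4},      # unindexed values: the actual data
    "time": "2018-09-12T10:00:00Z",
}
client.write_points([point])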
First, create a project and app.
mkdir django-influxdb-example
cd django-influxdb-example
django-admin startproject project
cd project
django-admin startapp app
cd ..
Create a virtual environment and install Django, psycopg2 (the Postgres driver), and the InfluxDB client.
python3 -m venv venv
source venv/bin/activate
pip install django psycopg2 influxdb
pip install --upgrade pip
pip freeze > requirements.txt
Right now, my requirements look like,
certifi==2018.8.24
chardet==3.0.4
Django==2.1.1
idna==2.7
influxdb==5.2.0
psycopg2==2.7.5
python-dateutil==2.7.3
pytz==2018.5
requests==2.19.1
six==1.11.0
urllib3==1.23
Create a Dockerfile,
FROM python:3.6-alpine3.7
ENV PYTHONUNBUFFERED 1
RUN apk update \
&& apk add libpq postgresql-dev \
&& apk add build-base
WORKDIR /usr/src/app
COPY requirements.txt ./
RUN pip install -r requirements.txt
COPY project ./
Create a docker-compose.yml file,
version: '3'
services:
  db:
    image: postgres
  web:
    build: .
    command: python3 manage.py runserver 0.0.0.0:8000
    volumes:
      - ./project:/usr/src/app
    ports:
      - "8000:8000"
    depends_on:
      - db
      - influxdb
    links:
      - influxdb
  influxdb:
    image: influxdb
    ports:
      - "8083:8083"
      - "8086:8086"
I added the following to my Django settings.py file; note that INFLUXDB_HOST is the name of the influxdb service in docker-compose.yml, which Docker's networking resolves to that container,
INFLUXDB_HOST = 'influxdb'
INFLUXDB_PORT = 8086
INFLUXDB_USERNAME = None
INFLUXDB_PASSWORD = None
INFLUXDB_DATABASE = 'example'
INFLUXDB_TIMEOUT = 10
At this point, you could try to build the whole thing and see if it comes up,
docker-compose build
docker-compose run web python3 manage.py makemigrations
docker-compose run web python3 manage.py migrate
docker-compose up
If you go to localhost:8000, you should see the default Django splash page. If you open an interactive Python session in the web container, you should be able to access InfluxDB through the Python influxdb module,
docker-compose run web python3
Starting django-influxdb-example_db_1 ... done
Starting django-influxdb-example_influxdb_1 ... done
Python 3.6.6 (default, Sep 12 2018, 02:19:14)
[GCC 6.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from influxdb import InfluxDBClient
>>> client = InfluxDBClient('influxdb', 8086)
>>> client.get_list_database()
[{'name': '_internal'}]
>>> client.create_database('example')
>>> client.get_list_database()
[{'name': '_internal'}, {'name': 'example'}]
Now that everything is connected and running, you can focus on writing code to create databases and add points. I looked through the code for django-influxdb-metrics and found the code below, which I put in a utils.py file in my app/ directory. It creates an InfluxDB client using the constants we defined earlier in the settings.py file.
from django.conf import settings
from influxdb import InfluxDBClient
import logging

logger = logging.getLogger(__name__)


def get_influxdb_client():
    """Returns an ``InfluxDBClient`` instance."""
    client = InfluxDBClient(
        settings.INFLUXDB_HOST,
        settings.INFLUXDB_PORT,
        settings.INFLUXDB_USERNAME,
        settings.INFLUXDB_PASSWORD,
        settings.INFLUXDB_DATABASE,
        timeout=getattr(settings, 'INFLUXDB_TIMEOUT', 10),
        ssl=getattr(settings, 'INFLUXDB_SSL', False),
        verify_ssl=getattr(settings, 'INFLUXDB_VERIFY_SSL', False),
    )
    return client
Finally, from here, I can create databases, add points, and perform queries through the client.
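For example, here is a rough sketch of what that might look like, reusing get_influxdb_client() from utils.py and the made-up well-production names from earlier (assuming the app is importable as app),
from app.utils import get_influxdb_client

client = get_influxdb_client()
client.create_database('example')    # harmless if the database already exists

# Write one hourly production point: the well name is a tag, the rate is a field.
client.write_points([{
    "measurement": "production",
    "tags": {"well": "well-42"},
    "fields": {"oil_rate": 118.4},
    "time": "2018-09-12T10:00:00Z",
}])

# Query the points back for that well; tags are indexed, so this filter is cheap.
result = client.query("SELECT * FROM production WHERE well = 'well-42'")
for point in result.get_points(measurement='production'):
    print(point)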