Logging Data

Sometimes you just want to gather data at all times, and sync to ConnectorDB periodically. The Logger allows you to do exactly that.

The logger caches datapoints to a local sqlite database, and synchronizes with ConnectorDB every 10 minutes by default.

Suppose you have a temperature sensor on a computer with an intermittent internet connection.

You can use the Logger to cache data until a sync can happen:

import time

from connectordb.logger import Logger

def getTemperature():
    # Your code to read the sensor goes here
    pass

def initlogger(l):
    # This function is called when first creating the Logger, to initialize the values

    # api key is needed to get access to ConnectorDB
    l.apikey = raw_input("apikey:")

    # If given a schema (as we have done here), addStream will create the stream if it doesn't exist
    l.addStream("temperature",{"type":"number"})

    # Sync with ConnectorDB once an hour (in seconds)
    l.syncperiod = 60*60

# Open the logger using a cache file name (where datapoints are cached before syncing)
l = Logger("cache.db", on_create=initlogger)

# Start running syncer in background (can manually run l.sync() instead)
l.start()

# While the syncer is running in the background, we are free to add data
# to the cache however we want - it will be saved first to the cache file
# so that you don't lose any data, and will be synced to the database once an hour
while True:
    time.sleep(60)
    l.insert("temperature",getTemperature())

The logger requires the python-apsw package to work. It is a thread-safe sqlite wrapper, which is used to safely store your data between synchronization attempts.

On Ubuntu, you can run apt-get install python-apsw. On Windows, you will need to download the extension package from http://www.lfd.uci.edu/~gohlke/pythonlibs/#apsw and install it using pip.

Logger

class connectordb.logger.Logger(database_file_path, on_create=None, apikey=None, onsync=None, onsyncfail=None, syncraise=False)[source]

Bases: object

Logger enables logging datapoints with periodic synchronization to a ConnectorDB database. The logged datapoints are cached in a sqlite database, along with the necessary connection data, so that no data is lost and settings don't need to be reloaded from the database after the initial connection.

addStream(streamname, schema=None, **kwargs)[source]

Adds the given stream to the logger. Requires an active connection to the ConnectorDB database.

If a schema is not specified, loads the stream from the database. If a schema is specified, and the stream does not exist, creates the stream. You can also add stream properties such as description or nickname to be added during creation.
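
As an illustration, the schema argument is an ordinary JSON Schema fragment expressed as a Python dict; the stream names and the description property below are hypothetical examples, not part of the API:

```python
# A minimal schema for a numeric stream, as in the example above
temperature_schema = {"type": "number"}

# A richer schema for an object stream with a required string field
event_schema = {
    "type": "object",
    "properties": {"msg": {"type": "string"}},
    "required": ["msg"],
}

# With a Logger instance l, stream creation would look something like:
#   l.addStream("temperature", temperature_schema, description="CPU sensor")
#   l.addStream("events", event_schema)
```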

addStream_force(streamname, schema=None)[source]

This function adds the given stream to the logger, but does not check with a ConnectorDB database to make sure that the stream exists. Use at your own risk.

apikey

The API key used to connect to ConnectorDB. This needs to be set before the logger can do anything! The apikey only needs to be set once, since it is stored in the Logger database.

Note that changing the api key is not supported during logger runtime (after start is called). The Logger must be recreated for a changed apikey to take effect.

cleardata()[source]

Deletes all cached data without syncing it to the server

close()[source]

Closes the database connections and stops all synchronization.

connectordb

Returns the ConnectorDB object that the logger uses. Raises an error if Logger isn’t able to connect

data

The data property allows the user to save settings/data in the logger database, so that no extra code is needed to manage settings.

Use this property to store anything that can be converted to JSON inside the logger database, so that you don't have to mess with configuration files or save settings some other way:

from connectordb.logger import Logger

l = Logger("log.db")

l.data = {"hi": 56}

# prints the data dictionary
print l.data

insert(streamname, value)[source]

Inserts the datapoint into the logger for the given stream name. The logger caches the datapoint and eventually synchronizes it with ConnectorDB.

insert_many(data_dict)[source]

Inserts data into the cache. The data must be a dict of the form {streamname: [{"t": timestamp, "d": data}, ...]}
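
For example, a valid argument to insert_many could be built like this (the stream names and values are illustrative):

```python
import time

now = time.time()

# Two streams' worth of cached datapoints, keyed by stream name.
# Each datapoint is a dict with a unix timestamp "t" and a data value "d".
data_dict = {
    "temperature": [
        {"t": now - 60, "d": 21.5},
        {"t": now, "d": 21.7},
    ],
    "humidity": [
        {"t": now, "d": 0.45},
    ],
}

# On a Logger instance l, all three datapoints would then be cached at once:
#   l.insert_many(data_dict)
```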

lastsynctime

The timestamp of the most recent successful synchronization with the server

name

Gets the name of the currently logged in device

ping()[source]

Attempts to ping the currently connected ConnectorDB database. Raises an error if it fails to connect

serverurl

The URL of the ConnectorDB server that the Logger is using. By default this is connectordb.com, but it can be set with this property. Note that the change only takes effect if made before start is called

start()[source]

Start the logger background synchronization service. This allows you to not need to worry about syncing with ConnectorDB - you just insert into the Logger, and the Logger will be synced every syncperiod.

stop()[source]

Stops the background synchronization thread

sync()[source]

Attempt to sync with the ConnectorDB server

syncperiod

Syncperiod is the time in seconds between attempts to synchronize with ConnectorDB. The Logger gathers all data in its sqlite database between syncs, and every syncperiod seconds it attempts to connect and write the cached data to ConnectorDB.
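
Since syncperiod is expressed in plain seconds, common intervals are easy to compute (the 10-minute default is stated in the introduction above):

```python
# The default sync period corresponds to 10 minutes
default_syncperiod = 10 * 60   # 600 seconds

# The example at the top of this page syncs once an hour instead
hourly_syncperiod = 60 * 60    # 3600 seconds

# On a Logger instance l, this would be set as:
#   l.syncperiod = hourly_syncperiod
```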