Three approaches to ensure the accuracy of the data, and to estimate its uncertainty as well, are:
The third step will be left for another time; otherwise this tutorial would get too long, with too much packed into one place.
By now, the expected output from the Serial Monitor, with deep sleep in place, looks like the screenshot below:
The screenshot is hard to read, and it is too late to recover a better one since I did not keep another copy. Basically, with deep sleep in place, the ESP prints some garbled characters right after it boots, then connects to the WiFi and the MQTT broker. Next, it takes readings from the sensors and publishes them to the broker. Finally, it takes a 5-minute nap. This is the expected behaviour on the client side (the ESP8266 with its sensors): everything seemed to be working, and the photo above shows the message being pushed to the local broker.
The message is published to a topic on the MQTT broker, so we need a script that acts as a client, listens to that topic, and stores the data in a database. My current choices are Python and SQLite.
I used the same approach for listening to MQTT messages and storing the data into a SQL database here, so I would rather not repeat myself entirely. The basic steps are:
sudo apt install sqlite3
pip3 install paho-mqtt numpy pandas --user
The --user flag is optional; add it if there is a problem with the user's permissions. NumPy is a fast library for calculating basic statistics, and Pandas handles larger data in tabular form for the next step.
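A quick sanity check that the three Python packages can actually be imported (assuming python3 is the interpreter the script will run under):

python3 -c "import paho.mqtt.client, numpy, pandas; print('imports OK')"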
CREATE TABLE node1 (time DATETIME, ldr_1 INT, ldr_2 INT, temt6000 INT, bh1750 INT,
                    tsl1_f INT, tsl1_lux INT, tsl2_f INT, tsl2_lux INT, ds_1 REAL, ds_2 REAL,
                    tdht REAL, tsht REAL, hdht REAL, hsht REAL, hic REAL);

CREATE TABLE node1_post (time DATETIME, LDR_AVG REAL, LDR_STD REAL, LUM_AVG REAL, LUM_STD REAL,
                         LUX_AVG REAL, LUX_STD REAL, TEM_AVG REAL, TEM_STD REAL,
                         HUM_AVG REAL, HUM_STD REAL, HIC_AVG REAL);
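These two statements can be fed straight into the sqlite3 shell. A minimal sketch, assuming the database file is called sensors.db and the statements are saved in schema.sql (the actual names used by the script may differ):

sqlite3 sensors.db < schema.sql
sqlite3 sensors.db ".tables"        # should list node1 and node1_post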
python3 capture_data.py
Add nohup before python3 to dump the output to a file, and an & at the end to keep the script running even after the terminal is closed.
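For example:

nohup python3 capture_data.py &      # output goes to nohup.out in the current directory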
Download the Python script on GitHub
What does this script do?
client = mqtt.Client()
client.username_pw_set(username=mqtt_username, password=mqtt_password)
client.connect(mqtt_broker_ip, 1883, 60)     # broker address, port, keep-alive in seconds
client.on_connect = on_connect               # hook up the callback functions
client.on_message = on_message
client.on_disconnect = on_disconnect
sleep(1)
client.loop_forever()                        # blocking network loop
The script starts by creating an instance of the mqtt.Client() class, setting up the authentication, and hooking up the callback functions. After a 1-second sleep, loop_forever() starts a blocking network loop that keeps the client connected and handles reconnections.
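The callback functions themselves live in the full script on GitHub; a minimal sketch of what on_connect and on_disconnect could look like (the topic name sensors/node1 is an assumption for illustration, not the script's actual topic):

def on_connect(client, userdata, flags, rc):
    # rc == 0 means the broker accepted the connection
    print("Connected to MQTT broker, result code:", rc)
    client.subscribe("sensors/node1")          # assumed topic name

def on_disconnect(client, userdata, rc):
    # loop_forever() keeps trying to reconnect after an unexpected drop
    print("Disconnected from MQTT broker, result code:", rc)

on_message, which does the real work, is covered next.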
if msg.retain == 0:                          # skip retained (stale) messages from the broker
    if payload['sensor'] == 'node1':         # only handle readings from this node
        sql_cmd_one = """INSERT INTO {1[sensor]} VALUES ('{0}', {1[ldr_1]}, {1[ldr_2]}, {1[t6000]}, {1[bh1750]},
            {1[tsl1_f]}, {1[tsl1_l]}, {1[tsl2_f]}, {1[tsl2_l]},
            {1[ds_1]:.2f}, {1[ds_2]:.2f}, {1[tdht]}, {1[tsht]:.2f},
            {1[hdht]:.2f}, {1[hsht]:.2f}, {1[hic]:.2f});""".format(time_, payload)
def writeToDB(sql_command):
    '''store captured data'''
    conn = sqlite3.connect(dbFile)
    cur = conn.cursor()
    cur.execute(sql_command)
    conn.commit()
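Once the script has been running for a while, a quick way to confirm that rows are actually landing in the table (again assuming sensors.db as the database file name):

sqlite3 sensors.db "SELECT * FROM node1 ORDER BY time DESC LIMIT 5;"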
For data pre-processing, I used the NumPy library. For only a few inputs this approach is overkill, but you can imagine it is a cleaner way once you have a few dozen readings of one parameter. NumPy and Pandas are the right tools for data organized in a tabular format; calculating the average and the standard deviation (σ) is a breeze. In the next step, I will show how the Pandas library supports SQL queries and loads external data into a DataFrame. Working with a DataFrame reduces the dimension of the problem, but it does require users to think in that abstraction so that the commands feel more intuitive.
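As a small taste of that next step, here is a sketch of both ideas: basic statistics with NumPy on a handful of readings, and loading the raw table back out of SQLite into a Pandas DataFrame (the readings and the sensors.db file name are made up for illustration):

import sqlite3
import numpy as np
import pandas as pd

# average and standard deviation of a few hypothetical LDR readings
ldr = np.array([512, 498, 505, 520])
print(ldr.mean(), ldr.std())

# pull the raw table into a DataFrame via a SQL query
conn = sqlite3.connect("sensors.db")               # assumed database file name
df = pd.read_sql_query("SELECT * FROM node1", conn)
conn.close()
print(df[["ldr_1", "ldr_2"]].describe())           # quick summary per column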
▣ ▣ ▣