
Raspberry Pi Environmental Sensor Part 2: Azure IoT Hub



Welcome back! We are continuing our three-part series on using a Raspberry Pi to set up an air quality sensor, send the data to Azure, and present the data on a website. In the last blog we explored how to set up the Pi with the Enviro+ sensors and capture data using the provided examples. In this blog we will explore how to set up Azure IoT Hub, use the Python SDK to send test data, and use Azure Stream Analytics to forward that traffic to a Cosmos DB and to Power BI. To get started you will need the following:

  • An Azure Subscription
  • A basic Cosmos DB or any Azure DB solution (optional)
  • A Power BI Free subscription (or a trial of Power BI Pro)
  • A laptop with Python 3.7+ installed (you can use your Pi for this)

Azure IoT Hub setup

In our Azure subscription we are going to set up an Azure IoT Hub as well as an IoT device. The IoT Hub acts as an API endpoint for IoT data, captured using the MQTT protocol, which has become a standard for IoT devices over the years. The IoT Hub provides an ingestion point allowing two-way communication between Azure and multiple IoT and edge devices, and it integrates with other Azure services so you can consume the IoT data downstream.

Let's get started: in the Azure Marketplace we are going to search for IoT Hub and set up a new instance:

For my use case we are going to go with the Free tier to start off with:

Click Next, Next and Create.


Once created, you will have an Azure IoT Hub sitting there, ready to use.

Next, we will create an IoT device:

Once created we will get the connection string and use it in our test script:

Take note of your connection string as we will use it for the next section below.
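If you ever need to retrieve the connection string again, the Azure CLI can also print it (this requires the azure-iot extension, covered later in this post); the hub and device names below are placeholders for your own:

az iot hub device-identity connection-string show --hub-name <<IoT Hub Name>> --device-id <<Device ID>>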

Testing Azure IoT Hub connection

Now for some Python coding 😊. We are going to set up a simple script which sends dummy data to Azure; we will use this data to test the upload and the full data streaming process in Azure.

Before we get started with our code, we first need to install the Python modules that provide the Azure IoT SDK:

sudo pip3 install azure-iot-device
sudo pip3 install azure-iot-hub

Once these are installed we can look at our simple code to test the Azure IoT Hub. First we need to import the required libraries into our Python script:

import asyncio
from azure.iot.device.aio import IoTHubDeviceClient

Next we are going to define our connection string and the message we are going to send. Although we can send any data we want to the Hub, to consume the data and store it in a database or a Power BI dataset we need to present it as a JSON string:

conn_str = "HostName=cloudhacks.azure-devices.net;DeviceId=test-win-device;SharedAccessKey=*****"
MSG_SND = '{{"messageId": 100,"deviceId": "Raspberry Pi Web Client","temperature": {temperature},"humidity": {humidity}}}'
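The double braces in MSG_SND are how str.format escapes literal braces, so the formatted result is valid JSON. If you prefer, you can build the same payload with Python's json module instead; a minimal sketch (the build_payload helper name is just for illustration):

import json

def build_payload(temperature, humidity):
    # Serialize a dict to a JSON string, avoiding brace-escaping in format strings
    return json.dumps({
        "messageId": 100,
        "deviceId": "Raspberry Pi Web Client",
        "temperature": temperature,
        "humidity": humidity,
    })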

Next, a function to set up the connection to Azure IoT Hub using the device client library; this will establish the connection using MQTT over TCP:

def iothub_client_init():
    client = IoTHubDeviceClient.create_from_connection_string(conn_str)
    return client

Finally, we are going to write a function which passes predefined values for temperature and humidity as a message to the IoT Hub, along with an entry point to run it:

async def iothub_client_telemetry_sample_run():
    client = iothub_client_init()
    print("Sending data to IoT Hub, press Ctrl-C to exit")
    while True:
        await client.connect()
        msg_txt_formatted = MSG_SND.format(temperature=35, humidity=30)
        print("Sending message: {}".format(msg_txt_formatted))
        await client.send_message(msg_txt_formatted)
        print("Message successfully sent")
        await asyncio.sleep(3)
        await client.disconnect()

if __name__ == '__main__':
    try:
        asyncio.run(iothub_client_telemetry_sample_run())
    except KeyboardInterrupt:
        print("IoT Hub client stopped")

In this function we open a connection to the IoT Hub, format the message with our predefined values, and send it as a JSON string.

Run this code to test it; in your Python terminal you should see a continuous stream of “Message successfully sent”:
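With the example values above, the terminal output should look roughly like this:

Sending data to IoT Hub, press Ctrl-C to exit
Sending message: {"messageId": 100,"deviceId": "Raspberry Pi Web Client","temperature": 35,"humidity": 30}
Message successfully sent
Sending message: {"messageId": 100,"deviceId": "Raspberry Pi Web Client","temperature": 35,"humidity": 30}
Message successfully sent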

To see the data being received on the Azure side, open up the Cloud Shell (or any device with the Azure CLI installed) and run the command:

az iot hub monitor-events --hub-name <<IoT Hub Name>>
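If the iot subcommand is not recognized, you may first need to install the Azure IoT extension for the CLI:

az extension add --name azure-iot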




And you should see the payload data being sent.

For a full copy of the code you can find it on my GitHub page here.

Consuming IoT Hub events using an Azure Stream Analytics job

We will now look at Stream Analytics, a service that takes input from a source, structures the data into a readable form, and passes it as output into data storage systems such as SQL databases. In our example we will capture the event data from the IoT Hub and stream it into a Cosmos DB for future use, and also into a Power BI dataset to present the data.

Let's start by creating a Stream Analytics job; we will begin with a single streaming unit:

The first thing you will notice is that the job is in a stopped state; this is because we have yet to define an input, so let's do that.

If everything is set up correctly, it should automatically pick up your IoT Hub; make sure your Event serialization format is JSON.

Once saved, it will start capturing and decoding your messages (be sure to still have messages coming through from the last section). To view the sample data, browse to the Query section; at the bottom it should show your data in a table format:

Next we are going to define our outputs. As stated, we are going to set up a Cosmos DB and a Power BI output (just to test our limits, but you are welcome to use any other DB output such as MySQL or MS SQL). For this we need to go to the Outputs tab:

We will start by defining a Cosmos DB as our first output:

NOTE: Because I already have a Cosmos DB in my subscription, it was picked up automatically.

We will rinse and repeat with Power BI:

Now we have our two outputs defined! Next we need to define our query statement, a simple SQL-like query that defines what data we capture and send to our output DBs. For this we will go back to the Query tab and define our query:

Here we have the default query, which we need to replace with our own. We will update it as below, replacing [YourInputAlias] with [Test-Data] as our input source and [YourOutputAlias] with our output aliases [cosmosdb] and [powerBi-myworkspace]:

SELECT
    *
INTO
    [cosmosdb]
FROM
    [Test-Data]
SELECT
    *
INTO
    [powerBi-myworkspace]
FROM
    [Test-Data]
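As a side note: this pass-through query forwards every message as-is. If you later want to down-sample the stream, Stream Analytics also supports windowed aggregates; a sketch (not needed for this walkthrough) that averages readings over 30-second tumbling windows:

SELECT
    deviceId,
    AVG(temperature) AS temperature,
    AVG(humidity) AS humidity
INTO
    [powerBi-myworkspace]
FROM
    [Test-Data]
GROUP BY
    deviceId, TumblingWindow(second, 30)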

We can test this query to validate the output, as shown below:

Save this as the query for the job, and we can now run it.

With the job started, we can now check our Cosmos DB to validate that we are receiving data:
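If you prefer to check from code rather than the portal, the azure-cosmos Python SDK can query the container directly; a minimal sketch where the endpoint, key, database, and container names are placeholders for your own:

from azure.cosmos import CosmosClient  # sudo pip3 install azure-cosmos

# Placeholder endpoint/key/database/container; substitute your own values
client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<primary key>")
container = client.get_database_client("<database>").get_container_client("<container>")

# Print a sample of the telemetry documents written by the Stream Analytics job
for item in container.query_items(
        query="SELECT TOP 10 c.deviceId, c.temperature, c.humidity FROM c",
        enable_cross_partition_query=True):
    print(item)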

And in Power BI:

Now that we have shown, as a proof of concept, that we can upload data into Azure IoT Hub and present it, in the next part we are going to combine the solutions from Part 1 and Part 2, developing a solution which presents our live data in an easy-to-use web app using Power BI and IoT Hub.

Stay tuned.

This project and blog were a collaboration between @Ratnanjali Sinha and @Shubham Sinha