Azure services and Internet of Things
I just finished several posts about the Galileo board for beginners. In those posts I showed how to create simple projects that collect data from sensors. But one question remains: what should we do with the collected data? That's why in this post I want to show several cloud services that will help you collect and analyze data from many devices at the same time, so that you are ready to present consolidated data to users.
Of course, I am going to talk about Azure, because Azure provides not just base services like VM hosting, Web Sites, SQL Azure etc. but also several specific services that are ready for solutions based on "micro" devices – solutions that might include many different boards and sensors around the world.
I will start with the simplest Storage service component, the Queue, which was introduced at the beginning of the Azure era and which is still important for some scenarios in the Internet of Things area.
The Queue supports a REST API and works fine in IoT projects that have just one event publisher or just one event consumer, and where you don't need any special algorithm for accessing the data in the queue.
For example, suppose you design a device that plays music in restaurants or bars. Everybody may order their favorite music using a smartphone, a tablet, a terminal at the bar, etc. But the device can play only one composition at a time. So, you will have a queue with the orders inside, and each element will contain an expiration time as well. In this case you have just one consumer device, and it doesn't require any special algorithm for selecting messages.
You can design one more device, which hands out the next number in line at a government office. This device increases an internal counter, generates a new number, prints a ticket with the number, and puts the number into the queue. Each official has a terminal at their desk – one more device – which connects to the queue and takes the next number from it. In this case we have just one publisher but many consumers (several officials with their own terminals).
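To make the music-ordering example above more concrete, here is a sketch of how a publisher could build a "Put Message" request for the Queue REST API. The account and queue names are hypothetical, and authentication headers and the actual HTTP call are left out; the point is the shape of the URL (including the messagettl parameter for expiring orders) and the Base64-encoded XML body the service expects.

```python
import base64

ACCOUNT = "myiotaccount"   # hypothetical storage account name
QUEUE = "music-orders"     # hypothetical queue name

def build_put_message_request(text, ttl_seconds):
    """Return the URL and XML body for a Put Message call.

    The Queue service expects the message text wrapped in a
    <QueueMessage> element; Base64 encoding keeps it XML-safe.
    The messagettl query parameter lets each order expire on its own.
    """
    url = ("https://{0}.queue.core.windows.net/{1}/messages"
           "?messagettl={2}").format(ACCOUNT, QUEUE, ttl_seconds)
    encoded = base64.b64encode(text.encode("utf-8")).decode("ascii")
    body = ("<QueueMessage><MessageText>{0}</MessageText>"
            "</QueueMessage>").format(encoded)
    return url, body

# An order that expires after 30 minutes if nobody plays it.
url, body = build_put_message_request("song:42", 1800)
```

The consumer side is symmetric: a GET on the same /messages resource returns the next visible message, and a DELETE removes it once it has been played.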
So, there are many scenarios where a queue is good enough. You can find the REST API documentation for queues using this link.
Pay special attention to the fact that there are strict naming requirements for queues. You should use lowercase characters only. I forget about this from time to time, and then it takes a while to find the mistake.
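Since this rule keeps biting me, here is a small sketch of a name validator based on the documented queue naming rules: 3–63 characters, lowercase letters, digits and dashes only, starting and ending with a letter or digit, and no two dashes in a row. Checking the name locally is cheaper than hunting for a cryptic error from the service.

```python
import re

def is_valid_queue_name(name):
    """Check a candidate queue name against the Azure Queue naming rules."""
    if not 3 <= len(name) <= 63:
        return False
    if "--" in name:            # consecutive dashes are not permitted
        return False
    # lowercase letters, digits, dashes; must start and end alphanumeric
    return re.match(r"^[a-z0-9][a-z0-9-]*[a-z0-9]$", name) is not None
```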
If you have many publishers and many consumers at the same time, a simple queue will not work well.
Imagine the cashiers at a food store. Usually each cashier has their own line, the number of cashiers may change in real time, etc. There is no way to handle this with a single queue. At the same time it's too hard and expensive to use many queues, because you would need to implement lots of code that creates queues dynamically and manages the elements in them. In other words, you would need a super queue.
It might once have been reasonable to build something like that super queue with a lot of your own code, but today we have Event Hubs.
Event Hubs is a special service which supports messaging scenarios with many event publishers and many event consumers at the same time.
Thanks to partitions, consumer groups and offsets, it's possible to implement almost any scenario there. Of course, you can also use Event Hubs in scenarios that would work with a Queue; in that case you get a single, universal approach.
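To show why partitions, consumer groups and offsets make many-to-many scenarios possible, here is a toy, purely in-memory illustration of the mental model (not the real service or SDK): events with the same partition key always land in the same partition, and each consumer group keeps its own offset per partition, so different groups can read the same stream independently. The group names and hash function are made up for the example.

```python
PARTITION_COUNT = 4

def partition_for(key):
    # The real service uses its own hash; any stable hash shows the idea.
    return sum(key.encode("utf-8")) % PARTITION_COUNT

partitions = [[] for _ in range(PARTITION_COUNT)]

def publish(key, event):
    """Any number of publishers can call this concurrently."""
    partitions[partition_for(key)].append(event)

# Each consumer group remembers how far it has read in each partition.
offsets = {"analytics": [0] * PARTITION_COUNT,
           "archiver": [0] * PARTITION_COUNT}

def consume(group, partition):
    """Read the next event for a group without disturbing other groups."""
    pos = offsets[group][partition]
    if pos >= len(partitions[partition]):
        return None                      # nothing new yet
    offsets[group][partition] = pos + 1
    return partitions[partition][pos]

publish("device-1", {"t": 21.5})
publish("device-1", {"t": 22.0})
p = partition_for("device-1")            # both events are in this partition
```

Because each group tracks its own offsets, "analytics" can race ahead while "archiver" replays the same events later – the key property a plain queue lacks.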
Of course, Event Hubs, like the Queue and other services, supports a REST API as well.
In some scenarios you might want to notify users about events using the standard push notification services. In this case, it's easy to use Azure Mobile Services. I already wrote a series about Azure Mobile Services, so if you are interested in any aspect of it, you can read that series.
Tables and Blobs
Of course, the previous services let you use events to communicate between publishers and consumers. But in many scenarios you need to store a lot of data from sensors. For example, you can create a device that collects data about pollution in an area. Your device is not going to send any events; instead, you are going to use this data to analyze the situation over many years. So, you need storage, and Queue and Event Hubs cannot work as storage.
Since devices usually send a lot of non-relational data, you will use NoSQL storage. In the case of Azure you can use Tables and Blobs.
Tables allow you to store data in a table format, and you can store hundreds of terabytes there. But Tables have two disadvantages. First of all, in order to store data from sensors, your device needs permanent access to the Internet. Of course, you can preserve the data in a local file, but in that case it's better to send the file itself to Azure. The second disadvantage is related to the format of the data. You can store text data there, but in the case of images, video, etc. you need to fall back to files. So, in IoT scenarios you will usually use Blobs instead of Tables.
Blobs allow you to store any files and support a REST API as well. So, usually you will save sensor data to a file and send it to Blob storage according to your own scenario.
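Here is a sketch of the "collect locally, upload as a file" approach for the pollution device: readings are buffered, serialized to CSV, and aimed at a block blob. The account and container names and the pm10 column are hypothetical, and the signed PUT request itself (sent with the "x-ms-blob-type: BlockBlob" header) is left out.

```python
import csv
import io

# Buffered sensor readings: (timestamp, hypothetical pollution metric).
readings = [("2014-12-01T10:00:00Z", 41.2),
            ("2014-12-01T10:05:00Z", 43.7)]

def to_csv(rows):
    """Serialize buffered readings into a CSV payload for one blob."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["timestamp", "pm10"])
    writer.writerows(rows)
    return buf.getvalue()

def blob_url(account, container, blob_name):
    """Target URL of a Put Blob request for this file."""
    return "https://{0}.blob.core.windows.net/{1}/{2}".format(
        account, container, blob_name)

payload = to_csv(readings)
url = blob_url("myiotaccount", "sensor-data", "2014-12-01.csv")
```

Naming blobs by date, as here, makes it easy to pick a time range later when you analyze years of data.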
So, we already have services that help with events and storage, but we still need a way to analyze the stored data. A new Azure service, called Stream Analytics, will help you implement this task as well.
Stream Analytics can use Event Hubs and Blobs as sources, so it is able to get all the needed data for your analysis. Stream Analytics then provides a query language that helps you implement your own algorithm to prepare the data. Finally, Stream Analytics can upload the results to new blobs, to Event Hubs, or to a SQL Azure database, which can then be used as a source for many analysis tools.
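To give a taste of that query language, here is a hedged example of the kind of SQL-like query you could write; "input" and "output" are alias names you define when you configure the job, and the DeviceId and Temperature fields are assumptions about the event payload. It averages temperature per device over five-minute tumbling windows:

```sql
SELECT
    DeviceId,
    AVG(Temperature) AS AvgTemperature
INTO
    output
FROM
    input
GROUP BY
    DeviceId,
    TumblingWindow(minute, 5)
```

Windowing constructs like TumblingWindow are what make the language a good fit for streams of sensor events rather than static tables.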
Stream Analytics is still in preview mode, but you can try it for free right now using a trial account or your regular Azure account.
Therefore, as you can see, Azure supports the full stack of services needed for the IoT world and can be used with any IoT devices.