Sibeesh Passion


An Introduction to Azure Stream Analytics Job

By Sibeesh Venu
December 11, 2018


Introduction

An Azure Stream Analytics job is capable of a lot, and in this post we are going to discuss a few of those capabilities. Azure Stream Analytics is basically an engine that processes events. These events come from the devices we have configured; it can be an Azure IoT Dev Kit (MXChip), a Raspberry Pi, and many more. A Stream Analytics job has two vital parts:

  • Input source
  • Output source

The input source is the source of your streaming data; in my case, it is my IoT Hub. The output source is the destination you configure; I configured the output to save the data to an Azure SQL database. Let's stop the introduction here and start creating our own Stream Analytics job.


Background

I recently got my MXChip (Azure IoT Dev Kit) and was surprised by the capabilities of the device. It has a lot of sensors built in, such as temperature, humidity, pressure, a magnetometer, and a security chip. So I thought it was time to play with it. The basic idea was to:

  1. Configure the device to send the data to the IoT Hub
  2. Select the IoT Hub as a stream input
  3. Send the output to a SQL Server database

In this article, we are going to concentrate on how to create a Stream Analytics job and how to configure it to save the stream data to a SQL Server database.

Prerequisites

To do wonderful things, we always need some prerequisites.

  1. Azure Subscription
  2. MXChip Azure IoT Dev Kit 
  3. An active IoT Hub

Creating the Azure Stream Analytics Job

Log in to your Azure portal, click Create a resource, and then search for "Stream Analytics job".

Once you click the Create button, it is time to specify the details of your job:

  1. Job Name
  2. Subscription
  3. Resource Group
  4. Location
  5. Hosting Environment

I would strongly recommend selecting the same resource group as your IoT Hub for the Stream Analytics job, so that you can easily delete the resources when they are no longer needed. Once the deployment is successful, you can go to the resource overview and see the details.

Configure Inputs

In the left menu, you can see a section called Job topology; that's where we are going to work. Basically, we will set up the Inputs and Outputs, and then write a query that takes the inputs and sends the values to the configured output. Click the Inputs label, click Add stream input, and then select IoT Hub.

Configure the Input in Stream Analytics

On the next screen, you will have the option to select an existing IoT Hub or create a new one. As I had already created an IoT Hub, I selected the existing one.

Please note that you are allowed to use special characters in the Input alias field, but if you use them, make sure to wrap the alias in square brackets ([]) in the query, which we will create later.

About the special characters in Input alias field
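For instance, if you gave the input an alias containing a hyphen (say, iot-hub-input, a hypothetical name), the query would need to reference it in brackets:

```sql
SELECT
    *
INTO
    streamoutputs
FROM
    [iot-hub-input]
```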

Once you have successfully configured the inputs, we can go ahead and configure the outputs.

Configure Outputs

Click Outputs in the Job topology section, click Add, and then select SQL Database.

Configure the Output in Stream Analytics

You can either create a new database or select one you had already created. I used an existing database and table.

Configure the Query

Once you click the Query label in the left pane, you will be given an editor where you can write your queries. I am using the query below.

SELECT
    messageId,
    deviceId,
    temperature,
    humidity,
    pressure,
    pointInfo,
    IoTHub,
    EventEnqueuedUtcTime,
    EventProcessedUtcTime,
    PartitionId
INTO
    streamoutputs
FROM
    streaminputs

As you can see, I am selecting only the fields I may need and saving them to our stream output. You can always select all the fields with a select * query, but the problem with that is that you will have to set up the table columns in the same order as the stream data. Otherwise, you may get an error like the one below.

Encountered error trying to write 1 event(s): Failed to locate column ‘IoTHub’ at position 6 in the output event

Stream analytics query error
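One way to avoid this error is to make sure the destination table's columns match the fields selected in the query. A hypothetical T-SQL sketch of such a table (the table name and column types are my assumptions; adjust them to your actual data):

```sql
-- Hypothetical destination table; name and column types are assumptions
CREATE TABLE dbo.StreamData (
    messageId NVARCHAR(100),
    deviceId NVARCHAR(100),
    temperature FLOAT,
    humidity FLOAT,
    pressure FLOAT,
    pointInfo NVARCHAR(MAX),
    IoTHub NVARCHAR(MAX),
    EventEnqueuedUtcTime DATETIME2,
    EventProcessedUtcTime DATETIME2,
    PartitionId BIGINT
);
```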

If there are any errors, you can see that in the Output details.
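The query language also supports windowed aggregations, which come in handy when you do not need every raw event. As a sketch using the same streaminputs and streamoutputs aliases from above, this query would write the average temperature per device for every 30-second tumbling window:

```sql
SELECT
    deviceId,
    AVG(temperature) AS avgTemperature,
    System.Timestamp() AS windowEnd  -- end time of the current window
INTO
    streamoutputs
FROM
    streaminputs
GROUP BY
    deviceId,
    TumblingWindow(second, 30)
```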

Run the Stream Analytics Job and See the Data in the Database

As we have already done the initial setup, we can now start our Stream Analytics job. Please make sure that the IoT Hub is running and the device is sending data to it. If everything is working as expected, you will be able to see the data in the SQL Server database. You can either connect your MXChip device to the network and test this, or use the custom simulator app.

If you are using the simulator console application, make sure that you provide the device ID, the key, and the IoT Hub URI correctly; otherwise, you will get an unauthorized error, as explained here.

Test the Stream Analytics Job Inside the Portal

You also have the option to test the functionality in the portal itself. The only thing you have to do is prepare the sample input data. I have prepared the sample JSON data as follows.

[
  {
    "deviceId": "test-device",
    "humidity": 77.699449415178719,
    "pointInfo": "This is a normal message.",
    "temperature": 32.506656929620846
  },
  {
    "deviceId": "test-device",
    "temperature": 52.506656929620846,
    "humidity": 17.699449415178719,
    "pointInfo": "This is a normal message."
  },
  {
    "deviceId": "test-device",
    "temperature": 42.506656929620846,
    "humidity": 57.699449415178719,
    "pointInfo": "This is a normal message."
  }
]

Now we can go to the Query section and upload the sample data file for our inputs. 

In the next window, you can select the JSON option and upload your JSON file. 

Click the Test button, and now you should be able to see the output as below.

Conclusion

Wow! Now we have learned:

  • What an Azure Stream Analytics job is
  • How to create an Azure Stream Analytics job
  • How to add inputs to an Azure Stream Analytics job
  • How to add outputs to an Azure Stream Analytics job
  • How to add a custom query in Azure Stream Analytics
  • How to test the Stream Analytics query with sample data

You can always read my IoT articles here.

Your turn. What do you think?

Thanks a lot for reading. Did I miss anything that you think is needed in this article? Did you find this post useful? Kindly do not forget to share your feedback.

Kindest Regards
Sibeesh Venu

Tags: Azure, Azure IoT, Azure SQL, Configure Stream Analytics with Input and Output, Data Lake, IoT, IoT Hub, MXChip, Stream Analytics Job

Sibeesh Venu

I am Sibeesh Venu, an engineer by profession and writer by passion. Microsoft MVP, Author, Speaker, Content Creator, Youtuber, Programmer.
