Recently I started working in the Azure Cloud, and I would like to share an example I worked on that helped me understand the possibilities of this cloud environment. The focus of this article is using Azure Functions with input and output bindings to Azure Queue Storage.
The business case for this example is a website where users can enter comments. Serverless Functions in the Azure Cloud receive those comments, process them and save them for later analysis.
This setup has a function that receives an HTTP request containing all the information of a new user comment. The message is validated (server-side, in addition to any validation the client performs) and pushed onto an Azure Queue. A second function retrieves messages from the queue and can send them on to different persistent storage destinations. The example uses a Node.js runtime.
First, create a new Function App to hold the functions. A Function App groups related functions together so they can be deployed as a unit.
If you don’t have a Resource Group yet, you can create one directly from this screen. Choose Node.js and your nearest region. A Function App always needs an Azure Storage account to hold its data. Based on the name of the Function App, a Storage account will be created automatically, but if you want to use an existing Storage account you can change this in the Hosting tab. Leave all other options at their defaults and create the Function App.
The second resource we need to create is the Azure Function that will receive the new comments from our customers and push them onto the queue. We don’t need to create the queue separately (although you can, of course).
For the receiving function, we use the simple template for an HTTP trigger. In this scenario, the website sends user comments directly to this function with a POST request. After creating the function, open the “Integrate” screen to edit the data bindings.
Here, you can add a new Output and select Azure Queue Storage. Choose a descriptive parameter name to keep your code as clear as possible. Since this queue will not be used for anything else, you can change the queue name to something more specific; if the queue doesn’t exist yet, it will be created automatically. In the input field for the Storage account connection you will see a value you might not recognize (probably AzureWebJobsStorage). This is a label that holds the connection string for your Storage account; it was created automatically when you linked a Storage account to your Function App. You can check and edit it in the configuration screen of the application.
You will see a list of application settings that define your configuration, including your Storage account connection. Everything on this screen is editable, but proceed with caution: a mistake can easily break your app.
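For reference, the AzureWebJobsStorage setting holds a standard Storage account connection string; the account name and key below are placeholders, not values from this example:

```
AzureWebJobsStorage = DefaultEndpointsProtocol=https;AccountName=<yourstorageaccount>;AccountKey=<yourkey>;EndpointSuffix=core.windows.net
```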
Back in the Integrate screen, you are ready to create the output binding to the Queue. Azure will save the new queue in the same storage account as the Function app, which makes the most sense.
After saving, you can see the new output binding in the list. You can also check the function.json file to see what has been added.
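The resulting function.json might look something like the sketch below. The binding name `outputQueueItem` and the queue name `commentsqueue` are assumptions here; yours will be whatever you entered in the Integrate screen:

```json
{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["post"]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    },
    {
      "type": "queue",
      "direction": "out",
      "name": "outputQueueItem",
      "queueName": "commentsqueue",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```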
You’re now ready to edit the function code to push the received messages onto the queue. The power of these bindings becomes clear when you see that you only need one line of code to add a new message to the queue. There is no need to create a connection or instantiate a client; you simply assign a value to your output binding, which is now available on the context object.
Of course, you should expand the validation and return more detailed responses in this function to guarantee clean, easy-to-process messages on your queue. If you pay per function execution, you want the function that processes the messages on your queue to succeed every time.
To test the function, open the test tab on the right side of the screen. Paste the right body for your request and run the test. When a 200 status comes back, the message should be on your newly created queue.
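What the “right body” is depends on the fields your function validates; for a user comment it might be as simple as this (the field names are assumptions):

```json
{
  "user": "ada",
  "text": "Great article, thanks!"
}
```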
An Azure Storage queue is not a separate resource but part of a Storage account. Go to your Storage account and you will find the Queues menu item under the Queue service header. Select your queue to get a list of all messages on it.
Now that the intake and (temporary) storage of the data is set up, we can create a second function to retrieve the messages from the queue and process them. The setup for this function is even easier than the first one: when selecting a template, choose Azure Queue Storage trigger. You supply a function name and the name of the queue. (Being able to choose from a list of existing queues would have been nice, though.) The storage connection is filled in automatically.
After creation, your function is ready for use. It will react to any message pushed to the queue: the message is removed from the queue, and its entire contents are available inside the function. The sample code will look like this:
If we test this function by adding a new message to the queue, we can see the result appear in the log tab.
This example is of course very simple, but you can imagine, for instance, this function sending messages to different destinations based on the type of message provided.
With this small scenario, I put together a basic setup that uses Azure Queue Storage to collect messages to be picked up by another function. I didn’t go into all the small details like which buttons to push, because I feel the Azure Cloud environment is clear enough to guide you through most of it. Azure helps you as much as possible, providing info labels that explain input parameters and references to extra documentation. It also helps a lot to simply explore the options and information that are provided; this often discloses configuration that was done without your knowledge, and thereby deepens your understanding of the platform.