In this lab, we'll practice deploying a serverless app in Azure.
Functions are hosted in an execution context called a function app.
You define function apps to logically group and structure your functions and to provide a compute resource for them in Azure.
In our escalator example, you would create a function app to host the escalator drive gear temperature service.
There are a few decisions that need to be made to create the function app; you need to choose a service plan and select a compatible storage account.
Choose a service plan
Function apps may use one of two types of service plans:
Consumption plan
Azure App Service plan
When using the Azure serverless application platform, choose the Consumption plan.
This plan provides automatic scaling and bills you only when your functions are running.
The Consumption plan comes with a configurable timeout period for executing a function.
By default, it’s five (5) minutes, but may be configured to have a timeout as long as 10 minutes.
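If you need a longer timeout than the default, you can raise it in the function app's host.json file. The following is a minimal sketch; the functionTimeout value shown assumes you want the 10-minute Consumption-plan maximum.

{
  "version": "2.0",
  "functionTimeout": "00:10:00"
}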
The Azure App Service plan enables you to avoid timeout periods by having your function run continuously on a VM that you define.
When using an App Service plan, you are responsible for managing the app resources the function runs on, so this is technically not a serverless plan.
However, it may be a better choice if your functions are used continuously, or if your functions require more processing power or longer execution time than the Consumption plan can provide.
Storage account requirements
When you create a function app, it must be linked to a storage account.
You can select an existing account or create a new one.
The function app uses this storage account for internal operations, such as logging function executions and managing execution triggers.
On the Consumption plan, this is also where the function code and configuration file are stored.
You have two options to complete this lab.
Login to Microsoft
Sign in to your Microsoft account.
Select Activate sandbox.
Open a new browser tab and go to https://portal.azure.com.
Sign in with the Microsoft account you used in the previous step.
If a message appears, select X to close it. Otherwise, go to the next step.
Let’s create a function app in the Azure portal.
Under Azure services, select Create a resource.
The Create a resource pane appears.
In the menu, select Compute, and then select Function App in the Popular products list. The Create Function App pane appears.
On the Basics tab, enter the following values for each setting.
Setting | Value |
---|---|
Project Details | |
Subscription | Concierge Subscription |
Resource Group | From the dropdown list, select |
Instance Details | |
Function App name | Enter a globally unique app name, which becomes part of the base URL of your service. For example, you can name it escalator-functions-xxx, where you can replace xxx with your initials and a number. Valid characters are a-z, 0-9 and - |
Publish | Code |
Runtime stack | Node.js (which is the language we use to implement the function examples in this exercise). |
Version | Accept default |
Region | Select a geographical location close to you. In a production system, you would want to select a location near your customers or consumers of the function. |
Select Review + create, and then select Create. Deployment will take a few minutes. You'll receive a notification when deployment is completed.
When deployment completes, select Go to resource.
The Function App pane for your escalator function appears.
In the Essentials section, select the URL link to open it in a browser.
A default Azure web page appears with a message that your Functions app is up and running.
Now that we’ve created a function app, let’s look at how to build, configure, and execute a function.
Triggers
Functions are event driven, which means they run in response to an event.
The type of event that starts a function is called a trigger.
Each function must be configured with exactly one trigger.
Azure supports triggers for the following services.
Service | Trigger description |
---|---|
Blob Storage | Starts a function when a new or updated blob is detected. |
Azure Cosmos DB | Starts a function when inserts and updates are detected. |
Event Grid | Starts a function when an event is received from Event Grid. |
HTTP | Starts a function with an HTTP request. |
Microsoft Graph Events | Starts a function in response to an incoming webhook from the Microsoft Graph. Each instance of this trigger can react to one Microsoft Graph resource type. |
Queue Storage | Starts a function when a new item is received on a queue. The queue message is provided as input to the function. |
Service Bus | Starts a function in response to messages from a Service Bus queue. |
Timer | Starts a function on a schedule. |
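For example, a Timer trigger's schedule is defined in the function's function.json file using an NCRONTAB expression. The following is a minimal sketch (the binding name and schedule are illustrative) that would run a function every five minutes:

{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 */5 * * * *"
    }
  ]
}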
Bindings
A binding is a declarative way to connect data and services to your function.
Bindings interact with various data sources, which means you don’t have to write the code in your function to connect to data sources and manage connections.
The platform takes care of that complexity for you as part of the binding code.
Each binding has a direction: your code reads data from input bindings and writes data to output bindings.
Each function can have zero or more bindings to manage the input and output data processed by the function.
A trigger is a type of input binding that has the ability to initiate execution of some code.
Azure provides a large number of bindings to connect to different storage and messaging services.
Define a sample binding
Let’s look at an example of configuring a function with an input binding (trigger) and an output binding.
Let’s say we want to write a new row to Azure Table storage whenever a new message appears in Azure Queue Storage.
This scenario can be implemented using an Azure Queue Storage trigger and an Azure Table storage output binding.
The following snippet is the function.json file for this scenario.
{
  "bindings": [
    {
      "name": "order",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "myqueue-items",
      "connection": "MY_STORAGE_ACCT_APP_SETTING"
    },
    {
      "name": "$return",
      "type": "table",
      "direction": "out",
      "tableName": "outTable",
      "connection": "MY_TABLE_STORAGE_ACCT_APP_SETTING"
    }
  ]
}
Our JSON configuration specifies that our function will be triggered when a message is added to a queue named myqueue-items.
The return value of our function is then written to outTable in Azure Table storage.
This example is a simple illustration of how we configure bindings for a function.
We could change the output to be an email using a SendGrid binding, or put an event onto a Service Bus to notify some other component in our architecture, or even have multiple output bindings to push data to various services.
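To make the scenario concrete, here's a minimal sketch of an index.js that could sit alongside the function.json above. It assumes each queue message is a JSON order with an id property (a hypothetical shape); the object returned through the $return binding becomes a row in outTable, so it must include PartitionKey and RowKey.

// Sketch only: assumes each queue message is a JSON order with an "id" property (hypothetical).
module.exports = async function (context, order) {
    context.log('New order received:', order.id);

    // The $return output binding writes the returned object as a row in outTable.
    // PartitionKey and RowKey are required by Table storage.
    return {
        PartitionKey: 'orders',
        RowKey: String(order.id),
        receivedAt: new Date().toISOString()
    };
};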
Create a function in the Azure portal
Azure provides several predefined function templates for common scenarios:
Quickstart
Custom functions
Function templates
When you create your first function in the Azure Create function pane, you can select a predefined trigger for your function.
Based on your selections, Azure generates default code and configuration information, such as creating an event log entry when input data is received.
Selecting a template from the Add function pane provides easy access to the most common development environments, triggers, and dependencies.
When you create a function in the Azure portal, you can choose from more than 20 templates. Once created you can further customize the code.
Navigate to your function and its files
When you create a function from a template, several files are created, including a configuration file, function.json, and a source code file, index.js.
You can create or edit functions for your function app by selecting Functions under the Functions category from the Function App menu.
When you select a function that you created in your function app, the Function pane opens. By selecting Code + Test from the Function menu, you have access to actions in the command bar to test and run the code, to save or discard changes you make, or to obtain the published URL.
By selecting Test/Run from the command bar, you can run test cases for requests that include query strings and values.
The function’s path above the code box displays the name of the file that is open. You can select a specific file from the dropdown to test or edit, for example, function.json.
In the Test/Run view, the pane on the right has Input and Output tabs. Selecting the Input tab enables you to build and test the function by adding query parameters and supplying values for your query string. The Output tab displays the results of the request.
Test your Azure function
After you’ve created a function, you’ll want to test it. There are two approaches:
Running it manually
Testing it from within the Azure portal itself
Run function manually
You can start a function by manually triggering the configured trigger.
For instance, if you’re using an HTTP trigger, you can use a tool, such as Postman or cURL, to initiate an HTTP request to your function endpoint URL, which is available from the function definition (Get function URL).
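For example, assuming an HTTP-triggered function with Function-level authorization, such a request might look like the following (the app name, function name, and key are placeholders you would replace with your own values):

# Placeholders: replace <your-function-app>, <your-function-name>, and <your-function-key> with your own values.
curl "https://<your-function-app>.azurewebsites.net/api/<your-function-name>?code=<your-function-key>"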
Test in the Azure portal
The portal also provides a convenient way to test your functions by using the Test/Run pane described earlier.
When you select Run in this pane, the results automatically appear in the Output tab, and the Logs pane opens to display the status.
The ability to monitor your functions is critical during development and in production.
The Azure portal provides a monitoring dashboard, which you turn on by enabling Application Insights integration.
In the Function App menu, under Settings, select Application Insights, select Turn on Application Insights, and then select Apply.
In the dialog box, select Yes.
The Application Insights dashboard provides a quick way to view the history of function operations by displaying the timestamp, result code, duration, and operation ID populated by Application Insights.
Streaming logs pane
After you've enabled Application Insights in the Azure portal, you can add logging statements to your function for debugging. The called methods for each language are passed a "logging" object, which can be used to add log information to the Logs pane in the Code + Test pane when running a test.
The following code snippets show how to create a log message:
In JavaScript, the context object is passed to the handler.
context.log('Enter your logging statement here');
In C#, use the log.LogInformation method; the log object is passed to the C# method processing the function.
log.LogInformation("Enter your logging statement here");
In PowerShell, use the Write-Host cmdlet to write to the log:
Write-Host "Enter your logging statement here"
Errors, failures, warnings, and anomalies
You can use Metrics, or options from the Investigate category in the Function menu, to monitor performance, diagnose failures, or configure dozens of predefined workbooks to manage your function app, covering everything from compilation errors and warnings in the code to usage statistics by role.
Let’s continue with our gear drive example, and add the logic for the temperature service. Specifically, we’re going to receive data from an HTTP request.
Function requirements
First, we need to define some requirements for our logic:
Temperature readings of 25 degrees or below are flagged OK.
Temperature readings above 25 and up to 50 degrees are flagged CAUTION.
Temperature readings above 50 degrees are flagged DANGER.
Add a function to your function app
As we described in the preceding unit, Azure provides templates that help you build functions.
In this unit, we’ll use the HttpTrigger template to implement the temperature service.
In the previous step, you deployed your function app and opened it.
If it isn’t already open, you can open it from the Home page by selecting All resources, and then selecting your function app, named something like escalator-functions-xxx.
In the Function App menu, under Functions, select Functions.
The Functions pane appears.
This lists any functions you defined for your function app.
In the command bar, select Create.
The Create function pane appears.
Under Select a template, select HTTP trigger.
Select Create. The HttpTrigger1 function is created and appears in the HttpTrigger1 Function pane.
In the Developer menu on the left, select Code + Test.
The code editor opens, displaying the contents of the index.js code file for your function.
The default code that the HTTP template generated appears in the following snippet.
module.exports = async function (context, req) {
    context.log('JavaScript HTTP trigger function processed a request.');

    const name = (req.query.name || (req.body && req.body.name));
    const responseMessage = name
        ? "Hello, " + name + ". This HTTP triggered function executed successfully."
        : "This HTTP triggered function executed successfully. Pass a name on the query string or in the request body for a personalized response.";

    context.res = {
        // status: 200, /* Defaults to 200 */
        body: responseMessage
    };
}
Your function expects a name to be passed in either through the HTTP request query string or as part of the request body.
The function responds by returning the message Hello, <name>. This HTTP triggered function executed successfully.
From the source file dropdown list, select function.json to view the configuration of the function, which should look like the following code.
{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}
This configuration file declares that the function runs when it receives an HTTP request. The output binding declares that the response will be sent as an HTTP response.
Test the function
To test the function, you can send an HTTP request to the function URL using cURL on the command line.
Expand the Logs frame at the bottom of the trigger function pane. The log frame should start accruing trace notifications every minute.
To find the endpoint URL of the function, from the command bar, select Get function URL.
Save this link by selecting the Copy to clipboard icon at the end of the URL. Store this link in Notepad or a similar app for later use.
Secure HTTP triggers
HTTP triggers let you use API keys to block unknown callers by requiring a key as part of the request.
When you create a function, you select the authorization level. By default, it’s set to Function, which requires a function-specific API key, but it can also be set to Admin to use a global “master” key, or Anonymous to indicate that no key is required.
You can also change the authorization level through the function properties after creation.
Because you specified Function when you created this function, you need to supply the key when you send the HTTP request.
You can send it as a query string parameter named code, or as an HTTP header (preferred) named x-functions-key.
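For example, with placeholder values for the app name and key, the two approaches look like this:

# Key passed as the "code" query string parameter
curl "https://<your-function-app>.azurewebsites.net/api/HttpTrigger1?name=Azure&code=<your-function-key>"

# Key passed in the "x-functions-key" header (preferred)
curl --header "x-functions-key: <your-function-key>" "https://<your-function-app>.azurewebsites.net/api/HttpTrigger1?name=Azure"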
To find the function and master keys, in the Function App menu, under Developer, select Function Keys.
The Function Keys pane for your function opens.
By default, the function key value is hidden. Select Hidden value to reveal the default function key. Copy the value to the clipboard, and then store this key in Notepad or a similar app for later use.
At the bottom of the screen, scroll to the left, and select your function.
At the top, under the Get Function Url section, copy your URL by selecting the Copy to clipboard icon at the end of the URL.
Store this link in Notepad or a similar app for later use.
Next, scroll to the left, and from the Function App menu, under Functions, select Functions, and then select HttpTrigger1 (or DriveGearTemperatureService for PowerShell). The HttpTrigger1 (or DriveGearTemperatureService for PowerShell) Function pane appears.
In the left menu pane, under Developer, select Code + Test.
The Code + Test pane appears for your HttpTrigger1 (or DriveGearTemperatureService for PowerShell) function.
On the command bar, select Test/Run.
A pane appears that shows the input parameters for running a test.
In the Body text box, overwrite the embedded code by replacing line 2 in the Body with the cURL command below, replacing <your-function-key> with the function key you saved earlier and <your-https-url> with your function's URL.
curl --header "Content-Type: application/json" --header "x-functions-key: <your-function-key>" --request POST --data "{\"name\": \"Azure Function\"}" <your-https-url>
Review the cURL command and verify that it includes your function URL and your function key.
Select Run.
The Code + Test pane should open a session displaying log file output.
The log file updates with the status of your request, which should look something like this for JavaScript:
2022-02-16T22:34:10.473 [Information] Executing 'Functions.HttpTrigger1' (Reason='This function was programmatically called via the host APIs.', Id=4f503b35-b944-455e-ba02-5205f9e8b47a)
2022-02-16T22:34:10.539 [Information] JavaScript HTTP trigger function processed a request.
2022-02-16T22:34:10.562 [Information] Executed 'Functions.HttpTrigger1' (Succeeded, Id=4f503b35-b944-455e-ba02-5205f9e8b47a, Duration=114ms)
A similar set of log entries appears for PowerShell.
Add business logic to the function
Let’s add the logic to the function, to check temperature readings that it receives, and set a status for each temperature reading.
Our function is expecting an array of temperature readings. The following JSON snippet is an example of the request body that we’ll send to our function.
Each reading entry has an ID, timestamp, and temperature.
{
  "readings": [
    {
      "driveGearId": 1,
      "timestamp": 1534263995,
      "temperature": 23
    },
    {
      "driveGearId": 3,
      "timestamp": 1534264048,
      "temperature": 45
    },
    {
      "driveGearId": 18,
      "timestamp": 1534264050,
      "temperature": 55
    }
  ]
}
Let’s replace the default code in our function with the following code, to implement our business logic.
In the HttpTrigger1 function pane, open the index.js file, and replace it with the following code. After making this change, on the command bar, select Save to save the updates to the file.
module.exports = function (context, req) {
    context.log('Drive Gear Temperature Service triggered');
    if (req.body && req.body.readings) {
        req.body.readings.forEach(function(reading) {
            if (reading.temperature <= 25) {
                reading.status = 'OK';
            } else if (reading.temperature <= 50) {
                reading.status = 'CAUTION';
            } else {
                reading.status = 'DANGER';
            }
            context.log('Reading is ' + reading.status);
        });

        context.res = {
            // status: 200, /* Defaults to 200 */
            body: {
                "readings": req.body.readings
            }
        };
    }
    else {
        context.res = {
            status: 400,
            body: "Please send an array of readings in the request body"
        };
    }
    context.done();
};
The logic we added is straightforward. We iterate through the array and set the status as OK, CAUTION, or DANGER based on the value of the temperature field. We then send back the array of readings with a status field added to each entry.
Notice the Log statements when you expand Logs at the bottom of the pane. When the function runs, these statements will add messages in the Logs window.
Test our business logic
We’re going to use the Test/Run feature in Developer > Code + Test to test our function.
In the Input tab, replace the contents of the Body text box with the following code to create our sample request.
{
  "readings": [
    {
      "driveGearId": 1,
      "timestamp": 1534263995,
      "temperature": 23
    },
    {
      "driveGearId": 3,
      "timestamp": 1534264048,
      "temperature": 45
    },
    {
      "driveGearId": 18,
      "timestamp": 1534264050,
      "temperature": 55
    }
  ]
}
Select Run. The Output tab displays the HTTP response code and content.
To see log messages, open the Logs tab in the bottom flyout of the pane (if it is not already open).
The example response appears in the output pane, and log messages appear in the Logs pane.
The Output tab shows that a status field has been correctly added to each of the readings.
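Based on the thresholds in our code, the response body for the sample request should look something like this:

{
  "readings": [
    {
      "driveGearId": 1,
      "timestamp": 1534263995,
      "temperature": 23,
      "status": "OK"
    },
    {
      "driveGearId": 3,
      "timestamp": 1534264048,
      "temperature": 45,
      "status": "CAUTION"
    },
    {
      "driveGearId": 18,
      "timestamp": 1534264050,
      "temperature": 55,
      "status": "DANGER"
    }
  ]
}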
In the Developer menu on the left, select Monitor to see that the request has been logged to Application Insights. The Monitor pane appears for your function.
Select Configure. The Application Insights pane appears for your trigger function.
Select Create new resource, and in the New resource name field, enter or select your function app name. In the Location field, select the region you initially associated with your function app.
Select OK.
Summary
You’ve learned how to use Azure Functions to host business logic services in the cloud.
It’s a great way to add hosted services to your solution that can scale and grow with your business.
You focus on the code using the language of your choice, and Azure manages the infrastructure.
Functions can integrate with other services, like Event Grid, GitHub, Twilio, Microsoft Graph, and Logic Apps to create complex and robust serverless workflows quickly and easily.
Clean up
The sandbox automatically cleans up your resources when you’re finished with this module.
When you’re working in your own subscription, it’s a good idea at the end of a project to identify whether you still need the resources you created.
Resources that you leave running can cost you money.
You can delete resources individually or delete the resource group to delete the entire set of resources.