When developing solutions, it is not uncommon to encounter situations where Azure Functions are needed to perform some tasks: for example, creating an API, or offloading complex processing from a Logic App. A common question is how to deploy everything together, while still allowing software developers to build and test their code outside of the deployment cycle.
A simple approach to this problem is a pipeline that publishes the code after the Azure Function is deployed. However, if the infrastructure is deployed as one solution and it contains Logic Apps that depend on the functions, the deployment will fail, since the function references will be missing when the Logic App is deployed. With Terraform driven by PowerShell or Bash, this can be solved by scripting around the problem: breaking the execution where needed and then resuming the deployment. There is, however, a different approach using ARM templates.
It is possible to instruct ARM templates to deploy, or rather set, the code for the Azure Functions. With the introduction of Bicep as a Domain Specific Language for ARM templates, this article makes use of Bicep.
Creating the Azure Function
To create an Azure Function, three resources are needed:
- A Storage Account
- An Application Service Plan
- An Azure Function
Without going into how the three resources are created, below is a sample of the code needed to create them.
```bicep
param LocationCode string

resource Storage 'Microsoft.Storage/storageAccounts@2021-04-01' = {
  name: 'stor${LocationCode}labfunction'
  location: resourceGroup().location
  kind: 'StorageV2'
  sku: {
    name: 'Standard_LRS'
  }
  properties: {
    encryption: {
      keySource: 'Microsoft.Storage'
      services: {
        blob: {
          enabled: true
        }
        file: {
          enabled: true
        }
        queue: {
          enabled: true
        }
        table: {
          enabled: true
        }
      }
      requireInfrastructureEncryption: true
    }
    allowBlobPublicAccess: false
  }
}

resource AppService 'Microsoft.Web/serverfarms@2021-01-15' = {
  name: 'asp-${LocationCode}-lab-function'
  location: resourceGroup().location
  sku: {
    tier: 'Dynamic'
    name: 'Y1'
  }
}

var FunctionName = 'func-${LocationCode}-lab-function'

resource Function 'Microsoft.Web/sites@2021-01-15' = {
  name: FunctionName
  kind: 'functionapp'
  location: resourceGroup().location
  properties: {
    serverFarmId: AppService.id
    siteConfig: {
      appSettings: [
        {
          name: 'FUNCTIONS_EXTENSION_VERSION'
          value: '~3'
        }
        {
          name: 'FUNCTIONS_WORKER_RUNTIME'
          value: 'node'
        }
        {
          name: 'WEBSITE_NODE_DEFAULT_VERSION'
          value: '~14'
        }
        {
          name: 'AzureWebJobsStorage'
          value: 'DefaultEndpointsProtocol=https;AccountName=${Storage.name};AccountKey=${listKeys(Storage.id, Storage.apiVersion).keys[0].value};EndpointSuffix=core.windows.net'
        }
        {
          name: 'WEBSITE_CONTENTAZUREFILECONNECTIONSTRING'
          value: 'DefaultEndpointsProtocol=https;AccountName=${Storage.name};AccountKey=${listKeys(Storage.id, Storage.apiVersion).keys[0].value};EndpointSuffix=core.windows.net'
        }
        {
          name: 'WEBSITE_CONTENTSHARE'
          value: '${toLower(FunctionName)}files'
        }
      ]
      use32BitWorkerProcess: false
      ftpsState: 'Disabled'
    }
  }
}
```
Building the Functions
Now that the resources have been created, it is time to create the functions. The example used in this article is a basic API that allows data to be written to an Azure Table and fetched back. The fetch is a simple one that returns all records in the table.
This example requires an Azure Table. To show how connections to storage are defined, a separate Storage Account is created to hold the table.
```bicep
resource StorageTable 'Microsoft.Storage/storageAccounts@2021-04-01' = {
  name: 'stor${LocationCode}labfunctable'
  location: resourceGroup().location
  kind: 'StorageV2'
  sku: {
    name: 'Standard_LRS'
  }
  properties: {
    encryption: {
      keySource: 'Microsoft.Storage'
      services: {
        blob: {
          enabled: true
        }
        file: {
          enabled: true
        }
        queue: {
          enabled: true
        }
        table: {
          enabled: true
        }
      }
      requireInfrastructureEncryption: true
    }
    allowBlobPublicAccess: false
  }
}
```
Next, the Azure Function needs a connection to the new Storage Account. This is done by adding a new entry to the `appSettings` section of the Azure Function. In the example, the connection setting is called `TableConnection`.
```bicep
        {
          name: 'WEBSITE_CONTENTSHARE'
          value: '${toLower(FunctionName)}files'
        }
        {
          name: 'TableConnection'
          value: 'DefaultEndpointsProtocol=https;AccountName=${StorageTable.name};AccountKey=${listKeys(StorageTable.id, StorageTable.apiVersion).keys[0].value};EndpointSuffix=core.windows.net'
        }
      ]
```
Before the API functions can be built within the Azure Function, the function code itself needs to be created. In the example, two files are created: `post.js` and `get.js`.
Important Note: The example uses the Node.js 14 LTS version. However, this method can be used with any of the supported languages.
```javascript
// post.js
module.exports = async function (context, req) {
  context.log('JavaScript HTTP trigger function processed a request.');
  let val = Math.random().toString(36).substring(7);
  context.bindings.outputTable = [];
  context.bindings.outputTable.push({
    PartitionKey: 'Test',
    RowKey: val,
    Value: req.body
  });
  context.res = {
    body: "Record Inserted - " + val
  };
}
```

```javascript
// get.js
module.exports = async function (context, req) {
  context.log('JavaScript HTTP trigger function processed a request.');
  context.res = {
    body: context.bindings.sampleTable
  };
}
```
With everything in place, the two APIs needed for Post and Get can be created. Although a single function with a condition on the HTTP method would have worked, two functions are created to showcase the functionality.
Create an instance of `Microsoft.Web/sites/functions`, and for the name pass the Azure Function app name followed by the function name to be exposed, for example `PostData`. Next, the bindings for the function are defined as part of the `config` section. This replaces the `function.json` file normally used to define the trigger, input and output bindings. Finally, the `files` attribute passes the files required by the function, with each filename as a key and the file contents as the value.
```bicep
resource FunctionPostData 'Microsoft.Web/sites/functions@2021-01-15' = {
  name: '${FunctionName}/PostData'
  dependsOn: [
    Function
  ]
  properties: {
    config: {
      bindings: [
        {
          authLevel: 'function'
          type: 'httpTrigger'
          direction: 'in'
          name: 'req'
          methods: [
            'post'
          ]
        }
        {
          type: 'http'
          direction: 'out'
          name: 'res'
        }
        {
          name: 'outputTable'
          direction: 'out'
          type: 'table'
          tableName: 'Sample'
          connection: 'TableConnection'
        }
      ]
    }
    files: {
      'index.js': loadTextContent('post.js')
    }
  }
}

resource FunctionGetData 'Microsoft.Web/sites/functions@2021-01-15' = {
  name: '${FunctionName}/GetData'
  dependsOn: [
    Function
  ]
  properties: {
    config: {
      bindings: [
        {
          authLevel: 'function'
          type: 'httpTrigger'
          direction: 'in'
          name: 'req'
          methods: [
            'get'
          ]
        }
        {
          name: 'sampleTable'
          direction: 'in'
          type: 'table'
          tableName: 'Sample'
          connection: 'TableConnection'
        }
        {
          type: 'http'
          direction: 'out'
          name: 'res'
        }
      ]
    }
    files: {
      'index.js': loadTextContent('get.js')
    }
  }
}
```
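Since the scenario that motivated this article is a Logic App depending on the functions, it can also help to surface the function app's host name from the template, so downstream deployments can build the API URLs from the deployment output. This `output` declaration is a possible addition, not part of the original example:

```bicep
// Expose the function app host name so other templates (for example the
// Logic App deployment calling PostData/GetData) can reference it.
output FunctionHostName string = Function.properties.defaultHostName
```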
Now the Bicep can be compiled, and the generated ARM template will contain the file contents to be created during the deployment.
With a proper structure segregating the code handled by the infrastructure DevOps team from the code written by the developers, it is possible to combine everything into ARM templates for deployment.
Full Code available here: https://gist.github.com/kdemanuele/a2573b7a5dbc18ea75d0ef6da9ae4eab