
3 posts tagged with "Azure DevOps"


· 11 min read

I love Netlify deploy previews. This post implements a pull request deployment preview mechanism for Azure Static Web Apps in the context of Azure DevOps, very much inspired by the Netlify offering.

title image reading "Azure Static Web App Deploy Previews with Azure DevOps" with Azure, Bicep and Azure DevOps logos

Having a build of your latest pull request which is deployed and clickable from the PR itself is a wonderful developer experience. It reduces friction for testing out changes by allowing you to see the impact from within the PR itself. No checking to see if an environment is free with the rest of the team, then manually running a pipeline and waiting whilst a deployment happens. No. It's all there without you having to lift a finger. I use Netlify deploy previews on my blog and have become accustomed to the delight that is this:

screenshot of a Netlify deploy preview on my latest blog post

I love this and I wanted to implement the "browse the preview" mechanism in Azure DevOps as well, using Azure Static Web Apps. This blog post contains two things:

  1. A pull request deployment environment mechanism using Azure and Azure Pipelines with Bicep.
  2. A mechanism for updating a pull request in Azure DevOps with a link to the deployment environment (the "browse the preview").

It's worth bearing in mind that SWAs now have a feature very similar to what we're going to build for item 1, called "staging environments"; at present it's only available for GitHub and not Azure DevOps:

screenshot of Anthony Chu at Microsoft saying "Unfortunately environments is not yet available for Azure DevOps."

It's possible that in future the deployment environment aspect of this blog post may be rendered redundant by staging environments landing in Azure DevOps. However, the second part, which updates a PR in ADO with a link, is probably generally useful. And it may be the case that the approach of provisioning an environment on demand and extracting a URL could be reworked to work with App Service and similar too.

I wrote about using SWAs with Azure DevOps earlier this year. This blog post will take the form of a pull request on the code written in that post.

Getting defaultHostName from Static Web Apps

The first thing we're going to do is take the Bicep from that post and tweak it to the following:

param appName string
param repositoryUrl string
param repositoryBranch string

param skuName string = 'Free'
param skuTier string = 'Free'

resource staticWebApp 'Microsoft.Web/staticSites@2021-01-15' = {
  name: repositoryBranch == 'main' ? appName : '${appName}-${repositoryBranch}'
  location: resourceGroup().location
  sku: {
    name: skuName
    tier: skuTier
  }
  properties: {
    // The provider, repositoryUrl and branch fields are required for successive deployments to succeed
    // for more details see: https://github.com/Azure/static-web-apps/issues/516
    provider: 'DevOps'
    repositoryUrl: repositoryUrl
    branch: repositoryBranch
    buildProperties: {
      skipGithubActionWorkflowGeneration: true
    }
  }
}

output staticWebAppDefaultHostName string = staticWebApp.properties.defaultHostname // eg gentle-bush-0db02ce03.azurestaticapps.net
output staticWebAppId string = staticWebApp.id
output staticWebAppName string = staticWebApp.name

There are some changes in here. First of all, we're using a newer version of the staticSites resource in Azure. You'll also see that we name the resource conditionally now. If we're on the main branch we name it as we did before with appName; if we aren't, we suffix the name with the repositoryBranch. It's worth knowing that there are restrictions and conventions for Azure resource naming. If you have a branch name that is just alphanumerics and hyphens you'll be fine.
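If your branch names contain other characters (dots or underscores, say), a small guard step can normalise the value before it reaches Bicep. This is a sketch of my own rather than part of the pipeline below, and safeBranchName is a made-up variable name:

# normalise the branch name before using it in an Azure resource name (sketch)
# BUILD_SOURCEBRANCHNAME is the environment variable form of Build.SourceBranchName
BRANCH="${BUILD_SOURCEBRANCHNAME}"
SAFE_BRANCH=$(echo "$BRANCH" | tr '[:upper:]' '[:lower:]' | sed 's/[^a-z0-9-]/-/g' | cut -c1-40)
echo "##vso[task.setvariable variable=safeBranchName]$SAFE_BRANCH"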

You'll see the output of the Bicep file has changed. Previously we were outputting the apiKey that we used for deployment. That isn't the most secure of approaches: by exposing the key as a deployment output, it can be read by anyone who has access to the deployment history in the Azure Portal. So we're going to use a different (and more secure) approach to acquire it in our pipeline later.
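To see why that matters: deployment outputs hang around in the resource group's deployment history, so anyone with read access can pull them back out. Something along these lines (using the resource group from the pipeline further down) lists them:

# deployment outputs are retrievable long after the pipeline has finished
az deployment group list \
  --resource-group johnnyreilly \
  --query "[].properties.outputs"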

More significantly, we are now outputting the staticWebAppDefaultHostName of our newly provisioned SWA. This is the location where people will be able to view the deployment preview; we want to pump it into our pull request description so people can click on the link. We're also pumping out the staticWebAppId and staticWebAppName. We'll use the staticWebAppName to acquire the apiKey in our pipeline.

Azure Pipelines tweaks

Now to the pipeline. After the deployment, our updated pipeline is going to acquire the apiKey for deployment like so:

- task: AzureCLI@2
  displayName: 'Acquire API key for deployment'
  inputs:
    azureSubscription: $(serviceConnection)
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      APIKEY=$(az staticwebapp secrets list --name $(staticWebAppName) | jq -r '.properties.apiKey')
      echo "##vso[task.setvariable variable=apiKey;issecret=true]$APIKEY"

The above uses the Azure CLI task to acquire the apiKey. It uses jq to pull out the required property from the JSON and writes it as a secret variable in the pipeline to be used in the deployment.
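If you'd like to see what that step is doing before wiring it into a pipeline, roughly the same thing can be run locally after an az login; the app name is the one we provision later:

# prints the deployment token / API key for the static web app (local sketch)
az staticwebapp secrets list --name our-static-web-app | jq -r '.properties.apiKey'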

At the end of the pipeline, if we're not on the main branch, the pipeline is going to run a custom script that will update the PR with the preview URL:

- task: Npm@1
  displayName: 'Pull request preview install'
  condition: and(succeeded(), ne(variables.isMain, 'true'))
  inputs:
    command: 'install'
    workingDir: pull-request-preview

- task: Npm@1
  displayName: 'Pull request preview'
  condition: and(succeeded(), ne(variables.isMain, 'true'))
  inputs:
    command: 'custom'
    customCommand: 'run pull-request-preview -- --sat "$(System.AccessToken)" --project "$(System.TeamProject)" --repository "$(Build.Repository.Name)" --systemCollectionUri "$(System.CollectionUri)" --pullRequestId $(System.PullRequest.PullRequestId) --previewUrl "https://$(staticWebAppDefaultHostName)"'
    workingDir: pull-request-preview

We haven't written that script yet; we will in a moment.

The complete azure-pipelines.yml is below. You'll notice we've moved all variables save for the subscriptionId into the azure-pipelines.yml itself, and we're using a westeurope location / resource group as, at present, staticSites is not available everywhere:

pool:
  vmImage: ubuntu-latest

variables:
  # subscriptionId is a variable defined on the pipeline itself
  - name: appName
    value: 'our-static-web-app'
  - name: location
    value: 'westeurope' # at time of writing static sites are available in limited locations such as westeurope
  - name: serviceConnection
    value: 'azureRMWestEurope'
  - name: azureResourceGroup # this resource group lives in westeurope
    value: 'johnnyreilly'
  - name: isMain # used to skip the pull request preview steps on the main branch
    value: $[eq(variables['Build.SourceBranch'], 'refs/heads/main')]

steps:
- checkout: self
  submodules: true

- bash: az bicep build --file infra/static-web-app/main.bicep
  displayName: 'Compile Bicep to ARM'

- task: AzureResourceManagerTemplateDeployment@3
  name: DeployStaticWebAppInfra
  displayName: Deploy Static Web App infra
  inputs:
    deploymentScope: Resource Group
    azureResourceManagerConnection: $(serviceConnection)
    subscriptionId: $(subscriptionId)
    action: Create Or Update Resource Group
    resourceGroupName: $(azureResourceGroup)
    location: $(location)
    templateLocation: Linked artifact
    csmFile: 'infra/static-web-app/main.json' # created by bash script
    overrideParameters: >-
      -repositoryUrl $(Build.Repository.Uri)
      -repositoryBranch $(Build.SourceBranchName)
      -appName $(appName)
    deploymentMode: Incremental
    deploymentOutputs: deploymentOutputs

- task: PowerShell@2
  name: 'SetDeploymentOutputVariables'
  displayName: 'Set Deployment Output Variables'
  inputs:
    targetType: inline
    script: |
      $armOutputObj = '$(deploymentOutputs)' | ConvertFrom-Json
      $armOutputObj.PSObject.Properties | ForEach-Object {
        $keyname = $_.Name
        $value = $_.Value.value

        # Creates a standard pipeline variable
        Write-Output "##vso[task.setvariable variable=$keyName;]$value"

        # Display keys and values in pipeline
        Write-Output "output variable: $keyName $value"
      }
    pwsh: true

- task: AzureCLI@2
  displayName: 'Acquire API key for deployment'
  inputs:
    azureSubscription: $(serviceConnection)
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      APIKEY=$(az staticwebapp secrets list --name $(staticWebAppName) | jq -r '.properties.apiKey')
      echo "##vso[task.setvariable variable=apiKey;issecret=true]$APIKEY"

- task: AzureStaticWebApp@0
  name: DeployStaticWebApp
  displayName: Deploy Static Web App
  inputs:
    app_location: 'static-web-app'
    # api_location: 'api'
    output_location: 'build'
    azure_static_web_apps_api_token: $(apiKey)

- task: Npm@1
  displayName: 'Pull request preview install'
  condition: and(succeeded(), ne(variables.isMain, 'true'))
  inputs:
    command: 'install'
    workingDir: pull-request-preview

- task: Npm@1
  displayName: 'Pull request preview'
  condition: and(succeeded(), ne(variables.isMain, 'true'))
  inputs:
    command: 'custom'
    customCommand: 'run pull-request-preview -- --sat "$(System.AccessToken)" --project "$(System.TeamProject)" --repository "$(Build.Repository.Name)" --systemCollectionUri "$(System.CollectionUri)" --pullRequestId $(System.PullRequest.PullRequestId) --previewUrl "https://$(staticWebAppDefaultHostName)"'
    workingDir: pull-request-preview

Updating the PR with a preview URL

We want to be able to update our pull request with our deploy URL. To make that happen, we're going to whiz up a little node app using TypeScript, ts-node and the azure-devops-node-api package.

Let's create our app:

mkdir pull-request-preview
cd pull-request-preview
npm init --yes
npm install @types/node @types/yargs ts-node typescript azure-devops-node-api yargs --save

We'll update our newly created package.json file with a pull-request-preview script which will be the entry point.

  "scripts": {
"pull-request-preview": "ts-node ./index.ts"
},

We'll add a tsconfig.json file that looks like this:

{
  "compilerOptions": {
    "target": "ES2015",
    "module": "CommonJS",
    "strict": true,
    "esModuleInterop": true,
    "moduleResolution": "node"
  }
}

Finally we'll add our script in a new index.ts file:

#!/usr/bin/env node
import yargs from 'yargs/yargs';
import * as nodeApi from 'azure-devops-node-api';
import { IGitApi } from 'azure-devops-node-api/GitApi';
import { PullRequestStatus } from 'azure-devops-node-api/interfaces/GitInterfaces';

const parser = yargs(process.argv.slice(2)).options({
  pat: { type: 'string', default: '' },
  sat: { type: 'string', default: '' },
  systemCollectionUri: { type: 'string', demandOption: true },
  project: { type: 'string', demandOption: true },
  repository: { type: 'string', demandOption: true },
  pullRequestId: { type: 'number' },
  previewUrl: { type: 'string', demandOption: true },
});

(async () => {
  await run(await parser.argv);
})();

async function run({
  pat,
  sat,
  project,
  repository,
  systemCollectionUri,
  pullRequestId,
  previewUrl,
}: {
  pat: string;
  sat: string;
  systemCollectionUri: string;
  project: string;
  repository: string;
  pullRequestId: number | undefined;
  previewUrl: string;
}) {
  const config: Config = { project, repository };
  const gitApi = await getGitApi({ pat, sat, systemCollectionUri });

  if (!pullRequestId)
    console.log(
      'No pull request id supplied, so will look up latest active PR'
    );

  const pullRequestIdToUpdate =
    pullRequestId || (await getActivePullRequestId({ gitApi, config }));
  if (!pullRequestIdToUpdate) {
    console.log('No pull request found');
    return;
  }

  console.log(
    `Updating ${systemCollectionUri}/${project}/_git/${repository}/pullrequest/${pullRequestIdToUpdate} with a preview URL of ${previewUrl}`
  );

  const pullRequest = await getPullRequest({
    gitApi,
    config,
    pullRequestId: pullRequestIdToUpdate,
  });

  await updatePullRequestDescription({
    gitApi,
    config,
    pullRequestId: pullRequestIdToUpdate,
    description: makePreviewDescriptionMarkdown(
      pullRequest.description!,
      previewUrl
    ),
  });

  console.log(
    `Updated pull request description with a preview URL of ${previewUrl}`
  );
}

interface Config {
  project: string;
  repository: string;
}

async function getGitApi({
  sat,
  pat,
  systemCollectionUri,
}: {
  pat: string;
  sat: string;
  systemCollectionUri: string;
}) {
  const authHandler = pat
    ? nodeApi.getPersonalAccessTokenHandler(
        pat,
        /** allowCrossOriginAuthentication */ true
      )
    : nodeApi.getHandlerFromToken(
        sat,
        /** allowCrossOriginAuthentication */ true
      );

  const webApi = new nodeApi.WebApi(systemCollectionUri, authHandler);
  const gitApi = await webApi.getGitApi();

  return gitApi;
}

async function getActivePullRequestId({
  gitApi,
  config,
}: {
  gitApi: IGitApi;
  config: Config;
}) {
  const topActivePullRequest = await gitApi.getPullRequests(
    config.repository, // repository.id!,
    { status: PullRequestStatus.Active },
    config.project,
    undefined,
    /** skip */ 0,
    /** top */ 1
  );

  return topActivePullRequest.length > 0
    ? topActivePullRequest[0].pullRequestId
    : undefined;
}

async function getPullRequest({
  gitApi,
  config,
  pullRequestId,
}: {
  gitApi: IGitApi;
  config: Config;
  pullRequestId: number;
}) {
  const pullRequest = await gitApi.getPullRequest(
    config.repository, // repository.id!,
    pullRequestId,
    config.project,
    undefined,
    /** skip */ 0,
    /** top */ 1,
    /** includeCommits */ false,
    /** includeWorkItemRefs */ false
  );
  return pullRequest;
}

async function updatePullRequestDescription({
  gitApi,
  config,
  pullRequestId,
  description,
}: {
  gitApi: IGitApi;
  config: Config;
  pullRequestId: number;
  description: string;
}) {
  // To do an update with the API you must provide a new object with only the properties you are updating
  const updatePullRequest = {
    description,
  };
  await gitApi.updatePullRequest(
    updatePullRequest,
    config.repository,
    pullRequestId,
    config.project
  );
}

function makePreviewDescriptionMarkdown(desc: string, previewUrl: string) {
  const previewRegex = /(> -*\n> # Preview:\n.*\n>.*\n> -*\n)/;

  const makePreview = (previewUrl: string) => `> ---
> # Preview:
> ${previewUrl}
>
> ---
`;

  const alreadyHasPreview = desc.match(previewRegex);
  return alreadyHasPreview
    ? desc.replace(previewRegex, makePreview(previewUrl))
    : makePreview(previewUrl) + desc;
}

The above code does two things:

  1. Looks up the pull request, using the details supplied from the pipeline. It's worth noting that the System.PullRequest.PullRequestId variable is initialized only if the build ran because of a Git PR affected by a branch policy. If you don't have that set up, the script falls back to using the latest active pull request. That fallback is handy when you're getting set up in the first place, but you won't want to rely on it long term.
  2. Updates the pull request description with a prefix piece of markdown that provides the link to the preview URL. This is our "browse the preview": screenshot of rendered markdown with the preview link

This script could be refactored into a dedicated Azure Pipelines custom task.
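Incidentally, the script can be exercised outside of a pipeline too by supplying a personal access token via --pat instead of the pipeline's access token. Something like the following, where the organisation, project, repository and token values are placeholders; leave off --pullRequestId and the script falls back to the latest active pull request:

cd pull-request-preview
npm run pull-request-preview -- \
  --pat "$AZURE_DEVOPS_PAT" \
  --systemCollectionUri "https://dev.azure.com/yourorg/" \
  --project "your-project" \
  --repository "your-repo" \
  --previewUrl "https://gentle-bush-0db02ce03.azurestaticapps.net"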

Permissions

The first time you run this you may encounter a permissions error of the form:

Error: TF401027: You need the Git 'PullRequestContribute' permission to perform this action.

To remedy this you need to give your build service the relevant permissions to update a pull request. You can do that by going to the security settings of your repo and setting "Contribute to pull requests" to "Allow" for your build service:

Screenshot of "Contribute to pull requests" permission in Azure DevOps Git security being set to "Allow"

Enjoy! (and keep Azure tidy)

When the pipeline is run you can see that a deployment preview link is added to the PR description:

Screenshot of deployment preview on PR

This will happen whenever a PR is raised, which is tremendous.

A thing to remember is that there's nothing in this post that tears down the temporary deployment after the pull request has been merged; it will hang around. We happen to be using free resources in this post, but if we weren't there would be cost implications. Either way, you'll want to clean up unused environments as a matter of course, and I'd advise automating that.
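As a starting point for that automation, something like the following would remove a preview environment once its pull request is done with; it assumes the naming convention from the Bicep above, and the branch name is made up:

# delete the static web app that was provisioned for a now-merged branch (sketch)
az staticwebapp delete \
  --name "our-static-web-app-my-feature-branch" \
  --resource-group johnnyreilly \
  --yes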

So be tidy and cost aware with this approach.

· 9 min read

How can we deploy resources to Azure, and then run an integration test through them in the context of an Azure Pipeline? This post will show how to do this by permissioning our Azure Pipeline to access these resources using Azure RBAC role assignments. It will also demonstrate a dotnet test that runs in the context of the pipeline and makes use of those role assignments.

title image reading "Permissioning Azure Pipelines with Bicep and Role Assignments" and some Azure logos

We're following this approach as an alternative to exporting connection strings, as these can be viewed in the Azure Portal, which may be a security issue if you have many people who are able to access the portal and view deployment outputs.

We're going to demonstrate this approach using Event Hubs. It's worth calling out that this is a generally useful approach which can be applied to any Azure resources that support Azure RBAC Role Assignments. So wherever in this post you read "Event Hubs", imagine substituting other Azure resources you're working with.

The post will do the following:

  • Add Event Hubs to our Azure subscription
  • Permission our service connection / service principal
  • Deploy to Azure with Bicep
  • Write an integration test
  • Write a pipeline to bring it all together

Add Event Hubs to your subscription

First of all, we may need to add Event Hubs to our Azure subscription.

Without this in place, we may encounter errors of the type:

##[error]MissingSubscriptionRegistration: The subscription is not registered to use namespace 'Microsoft.EventHub'. See https://aka.ms/rps-not-found for how to register subscriptions.

We do this by going to "Resource Providers" in the Azure Portal and registering the resources you need. Lots are registered by default, but not all.

Screenshot of the Azure Portal, subscriptions -> resource providers section, showing that Event Hubs have been registered
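If you'd rather not click through the portal, the same registration can be done with the Azure CLI; registration can take a few minutes to complete:

az provider register --namespace Microsoft.EventHub
az provider show --namespace Microsoft.EventHub --query registrationState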

Permission our service connection / service principal

In order to run pipelines that target Azure, we generally need an Azure Resource Manager service connection set up in Azure DevOps. Once that exists, we also need to give it a role assignment that allows it to create role assignments of its own when pipelines are running.

Without this in place, we may encounter errors of the type:

##[error]The template deployment failed with error: 'Authorization failed for template resource '{GUID-THE-FIRST}' of type 'Microsoft.Authorization/roleAssignments'. The client '{GUID-THE-SECOND}' with object id '{GUID-THE-SECOND}' does not have permission to perform action 'Microsoft.Authorization/roleAssignments/write' at scope '/subscriptions/***/resourceGroups/johnnyreilly/providers/Microsoft.EventHub/namespaces/evhns-demo/providers/Microsoft.Authorization/roleAssignments/{GUID-THE-FIRST}'.'.

Essentially, we want to be able to run pipelines that say "hey Azure, we want to give permissions to our service connection". We are doing this with the self same service connection, so (chicken and egg) we first need to give it permission to give those commands in future. This is a little confusing; but let's role with it. (Pun most definitely intended. 😉)

To grant that permission / add that role assignment, we go to the service connection in Azure DevOps:

Screenshot of the service connection in Azure DevOps

We can see there are two links here; first we'll click on "Manage Service Principal", which will take us to the service principal in the Azure Portal:

Screenshot of the service principal in the Azure Portal

Take note of the display name of the service principal; we'll need that as we click on the "Manage service connection roles" link, which will take us to the resource group's IAM page in the Azure Portal:

Screenshot of the resource groups IAM page in the Azure Portal

Here we can click on "Add role assignment" and select "Owner":

Screenshot of the add role assignment IAM page in the Azure Portal

Then when selecting members we should be able to look up the service principal to assign it:

Screenshot of the add role assignment select member IAM page in the Azure Portal

We now have a service connection which we should be able to use for granting permissions / role assignments, which is what we need.
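For the CLI-inclined, the same role assignment can be sketched like so; the display name is the one noted from the service principal page, and on newer Azure CLI versions the object id field is id rather than objectId:

SP_OBJECT_ID=$(az ad sp list --display-name "<service principal display name>" --query "[0].objectId" -o tsv)
az role assignment create \
  --assignee "$SP_OBJECT_ID" \
  --role Owner \
  --resource-group johnnyreilly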

Event Hub and Role Assignment with Bicep

Next we want a Bicep file that will, when run, provision an Event Hub and a role assignment which will allow our Azure Pipeline (via its service connection) to interact with it.

@description('Name of the eventhub namespace')
param eventHubNamespaceName string

@description('Name of the eventhub')
param eventHubName string

@description('The service principal')
param principalId string

// Create an Event Hub namespace
resource eventHubNamespace 'Microsoft.EventHub/namespaces@2021-01-01-preview' = {
  name: eventHubNamespaceName
  location: resourceGroup().location
  sku: {
    name: 'Standard'
    tier: 'Standard'
    capacity: 1
  }
  properties: {
    zoneRedundant: true
  }
}

// Create an Event Hub inside the namespace
resource eventHub 'Microsoft.EventHub/namespaces/eventhubs@2021-01-01-preview' = {
  parent: eventHubNamespace
  name: eventHubName
  properties: {
    messageRetentionInDays: 7
    partitionCount: 1
  }
}

// give Azure Pipelines Service Principal permissions against the Event Hub

var roleDefinitionAzureEventHubsDataOwner = subscriptionResourceId('Microsoft.Authorization/roleDefinitions', 'f526a384-b230-433a-b45c-95f59c4a2dec')

resource integrationTestEventHubReceiverNamespaceRoleAssignment 'Microsoft.Authorization/roleAssignments@2020-08-01-preview' = {
  name: guid(principalId, eventHub.id, roleDefinitionAzureEventHubsDataOwner)
  scope: eventHubNamespace
  properties: {
    roleDefinitionId: roleDefinitionAzureEventHubsDataOwner
    principalId: principalId
  }
}

Do note that our Bicep template takes the service principal id as a parameter. We're going to supply this later from our Azure Pipeline.
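If you want to test the template without a pipeline in the way, a recent Azure CLI will compile and deploy the Bicep directly; the principal id supplied here is the object id of whichever identity should receive the role assignment:

az deployment group create \
  --resource-group johnnyreilly \
  --template-file infra/main.bicep \
  --parameters eventHubNamespaceName=evhns-demo eventHubName=evh-demo principalId="$SP_OBJECT_ID"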

Our test

We're now going to write a dotnet integration test which will make use of the infrastructure deployed by our Bicep template. Let's create a new test project:

mkdir src
cd src
dotnet new xunit -o IntegrationTests
cd IntegrationTests
dotnet add package Azure.Identity
dotnet add package Azure.Messaging.EventHubs
dotnet add package FluentAssertions
dotnet add package Microsoft.Extensions.Configuration.EnvironmentVariables
dotnet add package Newtonsoft.Json

We'll create a test file called EventHubTest.cs with these contents:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Azure.Identity;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Consumer;
using Azure.Messaging.EventHubs.Producer;
using FluentAssertions;
using Microsoft.Extensions.Configuration;
using Newtonsoft.Json;
using Xunit;
using Xunit.Abstractions;

namespace IntegrationTests
{
    public record EchoMessage(string Id, string Message, DateTime Timestamp);

    public class EventHubTest
    {
        private readonly ITestOutputHelper _output;

        public EventHubTest(ITestOutputHelper output)
        {
            _output = output;
        }

        [Fact]
        public async Task Can_post_message_to_event_hub_and_read_it_back()
        {
            // ARRANGE
            var configuration = new ConfigurationBuilder()
                .AddEnvironmentVariables()
                .Build();

            // populated by variables specified in the Azure Pipeline
            var eventhubNamespaceName = configuration["EVENTHUBNAMESPACENAME"];
            eventhubNamespaceName.Should().NotBeNull();
            var eventhubName = configuration["EVENTHUBNAME"];
            eventhubName.Should().NotBeNull();
            var tenantId = configuration["TENANTID"];
            tenantId.Should().NotBeNull();

            // populated as a consequence of the addSpnToEnvironment in the azure-pipelines.yml
            var servicePrincipalId = configuration["SERVICEPRINCIPALID"];
            servicePrincipalId.Should().NotBeNull();
            var servicePrincipalKey = configuration["SERVICEPRINCIPALKEY"];
            servicePrincipalKey.Should().NotBeNull();

            var fullyQualifiedNamespace = $"{eventhubNamespaceName}.servicebus.windows.net";

            var clientCredential = new ClientSecretCredential(tenantId, servicePrincipalId, servicePrincipalKey);
            var eventHubClient = new EventHubProducerClient(
                fullyQualifiedNamespace: fullyQualifiedNamespace,
                eventHubName: eventhubName,
                credential: clientCredential
            );
            var ourGuid = Guid.NewGuid().ToString();
            var now = DateTime.UtcNow;
            var sentEchoMessage = new EchoMessage(Id: ourGuid, Message: $"Test message", Timestamp: now);
            var sentEventData = new EventData(
                Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(sentEchoMessage))
            );

            // ACT
            await eventHubClient.SendAsync(new List<EventData> { sentEventData }, CancellationToken.None);

            var eventHubConsumerClient = new EventHubConsumerClient(
                consumerGroup: EventHubConsumerClient.DefaultConsumerGroupName,
                fullyQualifiedNamespace: fullyQualifiedNamespace,
                eventHubName: eventhubName,
                credential: clientCredential
            );

            List<PartitionEvent> partitionEvents = new();
            await foreach (var partitionEvent in eventHubConsumerClient.ReadEventsAsync(new ReadEventOptions
            {
                MaximumWaitTime = TimeSpan.FromSeconds(10)
            }))
            {
                if (partitionEvent.Data == null) break;
                _output.WriteLine(Encoding.UTF8.GetString(partitionEvent.Data.EventBody.ToArray()));
                partitionEvents.Add(partitionEvent);
            }

            // ASSERT
            partitionEvents.Count.Should().BeGreaterOrEqualTo(1);
            var firstOne = partitionEvents.FirstOrDefault(evnt =>
                ExtractTypeFromEventBody<EchoMessage>(evnt, _output)?.Id == ourGuid
            );
            var receivedEchoMessage = ExtractTypeFromEventBody<EchoMessage>(firstOne, _output);
            receivedEchoMessage.Should().BeEquivalentTo(sentEchoMessage, because: "the event body should be the same one posted to the message queue");
        }

        private static T ExtractTypeFromEventBody<T>(PartitionEvent evnt, ITestOutputHelper _output)
        {
            try
            {
                return JsonConvert.DeserializeObject<T>(Encoding.UTF8.GetString(evnt.Data.EventBody.ToArray()));
            }
            catch (JsonException)
            {
                _output.WriteLine("[" + Encoding.UTF8.GetString(evnt.Data.EventBody.ToArray()) + "] is probably not JSON");
                return default(T);
            }
        }
    }
}

Let's talk through what happens in the test above:

  1. We read in Event Hub connection configuration for the test from environment variables. (These will be supplied by an Azure Pipeline that we will create shortly; there's a sketch for supplying them locally just after this list.)
  2. We post a message to the Event Hub.
  3. We read a message back from the Event Hub.
  4. We confirm that the message we read back matches the one we posted.
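As mentioned, the test can also be run locally before the pipeline exists, provided you export the configuration it expects and have a client secret for a service principal holding the role assignment. A sketch:

export EVENTHUBNAMESPACENAME=evhns-demo
export EVENTHUBNAME=evh-demo
export TENANTID="<your tenant id>"
export SERVICEPRINCIPALID="<service principal application (client) id>"
export SERVICEPRINCIPALKEY="<service principal client secret>"
cd src/IntegrationTests
dotnet test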

Now that we have our test, we want to be able to execute it. For that we need an Azure Pipeline!

Azure Pipeline

We're going to add an azure-pipelines.yml file which Azure DevOps can use to power a pipeline:

variables:
  - name: eventHubNamespaceName
    value: evhns-demo
  - name: eventHubName
    value: evh-demo

pool:
  vmImage: ubuntu-latest

steps:
- task: AzureCLI@2
  displayName: Get Service Principal Id
  inputs:
    azureSubscription: $(serviceConnection)
    scriptType: bash
    scriptLocation: inlineScript
    addSpnToEnvironment: true
    inlineScript: |
      PRINCIPAL_ID=$(az ad sp show --id $servicePrincipalId --query objectId -o tsv)
      echo "##vso[task.setvariable variable=PIPELINE_PRINCIPAL_ID;]$PRINCIPAL_ID"

- bash: az bicep build --file infra/main.bicep
  displayName: 'Compile Bicep to ARM'

- task: AzureResourceManagerTemplateDeployment@3
  name: DeployEventHubInfra
  displayName: Deploy Event Hub infra
  inputs:
    deploymentScope: Resource Group
    azureResourceManagerConnection: $(serviceConnection)
    subscriptionId: $(subscriptionId)
    action: Create Or Update Resource Group
    resourceGroupName: $(azureResourceGroup)
    location: $(location)
    templateLocation: Linked artifact
    csmFile: 'infra/main.json' # created by bash script
    overrideParameters: >-
      -eventHubNamespaceName $(eventHubNamespaceName)
      -eventHubName $(eventHubName)
      -principalId $(PIPELINE_PRINCIPAL_ID)
    deploymentMode: Incremental

- task: UseDotNet@2
  displayName: 'Install .NET SDK 5.0.x'
  inputs:
    packageType: 'sdk'
    version: 5.0.x

- task: AzureCLI@2
  displayName: dotnet integration test
  inputs:
    azureSubscription: $(serviceConnection)
    scriptType: pscore
    scriptLocation: inlineScript
    addSpnToEnvironment: true # allows access to service principal details in script
    inlineScript: |
      cd $(Build.SourcesDirectory)/src/IntegrationTests
      dotnet test

When the pipeline is run, it does the following:

  1. Gets the service principal id from the service connection.
  2. Compiles our Bicep into an ARM template
  3. Deploys the compiled ARM template to Azure
  4. Installs the dotnet SDK
  5. Uses the Azure CLI task (which allows us to access service principal details in the pipeline) to run our dotnet test.

We'll create a pipeline in Azure DevOps pointing to this file, and we'll also create the variables that it depends upon:

  • azureResourceGroup - the name of your resource group in Azure where the app will be deployed
  • location - where your app is deployed, eg northeurope
  • serviceConnection - the name of your AzureRM service connection in Azure DevOps
  • subscriptionId - your Azure subscription id from the Azure Portal
  • tenantId - the Azure tenant id from the Azure Portal (see the CLI sketch just after this list)
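If it saves a trip to the portal, the subscription and tenant ids can be looked up with the CLI:

az account show --query "{subscriptionId: id, tenantId: tenantId}" -o table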

Running the pipeline

Now we're ready to run our pipeline:

screenshot of pipeline running successfully

Here we can see that the pipeline runs and the test passes. That means we've successfully provisioned the Event Hub and permissioned our pipeline to be able to access it using Azure RBAC role assignments. We then wrote a test which used the pipeline credentials to interact with the Event Hub. To see the repo that demonstrates this, look here.

Just to reiterate: we've demonstrated this approach using Event Hubs. This is a generally useful approach which can be applied to any Azure resources that support Azure RBAC Role Assignments.

Thanks to Jamie McCrindle for helping out with permissioning the service connection / service principal. His post on rotating AZURE_CREDENTIALS in GitHub with Terraform provides useful background for those who would like to do similar permissioning using Terraform.

· 5 min read

This post demonstrates how to deploy Azure Static Web Apps using Bicep and Azure DevOps. It includes a few workarounds for the "Provider is invalid. Cannot change the Provider. Please detach your static site first if you wish to use to another deployment provider." issue.

title image reading "Publish Azure Static Web Apps with Bicep and Azure DevOps" and some Azure logos

Bicep template

The first thing we're going to do is create a folder where our Bicep file for deploying our Azure Static Web App will live:

mkdir infra/static-web-app -p

Then we'll create a main.bicep file:

param repositoryUrl string
param repositoryBranch string

param location string = 'westeurope'
param skuName string = 'Free'
param skuTier string = 'Free'

param appName string

resource staticWebApp 'Microsoft.Web/staticSites@2020-12-01' = {
  name: appName
  location: location
  sku: {
    name: skuName
    tier: skuTier
  }
  properties: {
    // The provider, repositoryUrl and branch fields are required for successive deployments to succeed
    // for more details see: https://github.com/Azure/static-web-apps/issues/516
    provider: 'DevOps'
    repositoryUrl: repositoryUrl
    branch: repositoryBranch
    buildProperties: {
      skipGithubActionWorkflowGeneration: true
    }
  }
}

output deployment_token string = listSecrets(staticWebApp.id, staticWebApp.apiVersion).properties.apiKey

There's some things to draw attention to in the code above:

  1. The provider, repositoryUrl and branch fields are required for successive deployments to succeed. In our case we're deploying via Azure DevOps and so our provider is 'DevOps'. For more details, look at this issue.
  2. We're creating a deployment_token which we'll need in order that we can deploy into the Azure Static Web App resource.

Static Web App

In order that we can test out Azure Static Web Apps, what we need is a static web app. You could use pretty much anything here; we're going to use Docusaurus. We'll execute this single command:

npx @docusaurus/init@latest init static-web-app classic

Which will scaffold a Docusaurus site in a folder named static-web-app. We don't need to change it any further; let's just see if we can deploy it.
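If you'd like a sanity check before deploying, the scaffolded site builds and serves locally; the build folder is what the pipeline's output_location points at later:

cd static-web-app
npm run build # static output lands in ./build
npm start # serves the site at http://localhost:3000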

Azure Pipeline

We're going to add an azure-pipelines.yml file which Azure DevOps can use to power a pipeline:

trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
- checkout: self
  submodules: true

- bash: az bicep build --file infra/static-web-app/main.bicep
  displayName: 'Compile Bicep to ARM'

- task: AzureResourceManagerTemplateDeployment@3
  name: DeployStaticWebAppInfra
  displayName: Deploy Static Web App infra
  inputs:
    deploymentScope: Resource Group
    azureResourceManagerConnection: $(serviceConnection)
    subscriptionId: $(subscriptionId)
    action: Create Or Update Resource Group
    resourceGroupName: $(azureResourceGroup)
    location: $(location)
    templateLocation: Linked artifact
    csmFile: 'infra/static-web-app/main.json' # created by bash script
    overrideParameters: >-
      -repositoryUrl $(repo)
      -repositoryBranch $(Build.SourceBranchName)
      -appName $(staticWebAppName)
    deploymentMode: Incremental
    deploymentOutputs: deploymentOutputs

- task: PowerShell@2
  name: 'SetDeploymentOutputVariables'
  displayName: 'Set Deployment Output Variables'
  inputs:
    targetType: inline
    script: |
      $armOutputObj = '$(deploymentOutputs)' | ConvertFrom-Json
      $armOutputObj.PSObject.Properties | ForEach-Object {
        $keyname = $_.Name
        $value = $_.Value.value

        # Creates a standard pipeline variable
        Write-Output "##vso[task.setvariable variable=$keyName;issecret=true]$value"

        # Display keys in pipeline
        Write-Output "output variable: $keyName"
      }
    pwsh: true

- task: AzureStaticWebApp@0
  name: DeployStaticWebApp
  displayName: Deploy Static Web App
  inputs:
    app_location: 'static-web-app'
    # api_location: 'api' # we don't have an API
    output_location: 'build'
    azure_static_web_apps_api_token: $(deployment_token) # captured from deploymentOutputs

When the pipeline is run, it does the following:

  1. Compiles our Bicep into an ARM template
  2. Deploys the compiled ARM template to Azure
  3. Captures the deployment outputs (essentially the deployment_token) and converts them into variables to use in the pipeline
  4. Deploys our Static Web App using the deployment_token

The pipeline depends upon a number of variables:

  • azureResourceGroup - the name of your resource group in Azure where the app will be deployed
  • location - where your app is deployed, eg northeurope
  • repo - the URL of your repository in Azure DevOps, eg https://dev.azure.com/johnnyreilly/_git/azure-static-web-apps
  • serviceConnection - the name of your AzureRM service connection in Azure DevOps
  • staticWebAppName - the name of your static web app, eg azure-static-web-apps-johnnyreilly
  • subscriptionId - your Azure subscription id from the Azure Portal

A successful pipeline looks something like this:

Screenshot of successfully running Azure Pipeline

What you might notice is that the AzureStaticWebApp task is itself installing and building our application. This is handled by Microsoft Oryx. The upshot of this is that we don't need to manually run npm install and npm run build ourselves; the AzureStaticWebApp task will take care of it for us.

Finally, let's see if we've deployed something successfully...

Screenshot of deployed Azure Static Web App

We have! It's worth noting that you'll likely want to give your Azure Static Web App a lovelier URL, and perhaps even put it behind Azure Front Door as well.
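Attaching that lovelier URL is a one-liner once your DNS points at the default hostname; the app name and domain below are placeholders:

az staticwebapp hostname set \
  --name azure-static-web-apps-johnnyreilly \
  --hostname www.example.com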

Provider is invalid workaround 2

Shane Neff was attempting to follow the instructions in this post and encountered issues. He shared his struggles with me as he encountered the "Provider is invalid. Cannot change the Provider. Please detach your static site first if you wish to use to another deployment provider." issue.

He was good enough to share his solution as well, which is inserting this task at the start of the pipeline (before the az bicep build step):

- task: AzureCLI@2
  inputs:
    azureSubscription: '<name of your service connection>'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: 'az staticwebapp disconnect -n <name of your app>'

I haven't had the problems that Shane has had myself, but I wanted to share his fix for the people out there who almost certainly are bumping into this.