
4 posts tagged with "azure pipelines"


Bicep meet Azure Pipelines 2

Last time I wrote about how to use the Azure CLI to run Bicep within the context of an Azure Pipeline. The solution was relatively straightforward, and involved using az deployment group create in a task. There's an easier way.

Bicep meet Azure Pipelines

The easier way#

The target reader of the previous post was someone who was already using AzureResourceManagerTemplateDeployment@3 in an Azure Pipeline to deploy an ARM template. Rather than replacing your existing AzureResourceManagerTemplateDeployment@3 tasks, all you need to do is insert a prior bash step that compiles the Bicep to ARM, which your existing task can then process. It looks like this:

- bash: az bicep build --file infra/app-service/azuredeploy.bicep
  displayName: "Compile Bicep to ARM"

This will take your Bicep template, azuredeploy.bicep, and transpile it into an ARM template named azuredeploy.json, which a subsequent AzureResourceManagerTemplateDeployment@3 task can process. Since this is just exercising the Azure CLI, using bash is not required; powershell etc would also be fine; all that's required is that the Azure CLI is available in the pipeline.

In fact this simple task could even be a one-liner if you didn't fancy using the displayName. (Though I say keep it; optimising for readability is generally a good shout.) A full pipeline could look like this:

- bash: az bicep build --file infra/app-service/azuredeploy.bicep
  displayName: "Compile Bicep to ARM"

- task: AzureResourceManagerTemplateDeployment@3
  displayName: "Deploy Hello Azure ARM"
  inputs:
    azureResourceManagerConnection: '$(azureSubscription)'
    action: Create Or Update Resource Group
    resourceGroupName: '$(resourceGroupName)'
    location: 'North Europe'
    templateLocation: Linked artifact
    csmFile: 'infra/app-service/azuredeploy.json' # created by bash script
    csmParametersFile: 'infra/app-service/azuredeploy.parameters.json'
    deploymentMode: Incremental
    deploymentOutputs: resourceGroupDeploymentOutputs
    overrideParameters: -applicationName $(Build.Repository.Name)

- pwsh: |
    $outputs = ConvertFrom-Json '$(resourceGroupDeploymentOutputs)'
    foreach ($output in $outputs.PSObject.Properties) {
      Write-Host "##vso[task.setvariable variable=RGDO_$($output.Name)]$($output.Value.value)"
    }
  displayName: "Turn ARM outputs into variables"

And when it's run, it may result in something along these lines:

Bicep in an Azure Pipeline

So if you want to get using Bicep right now with minimal effort, this is an on-ramp that could work for you! Props to Jamie McCrindle for suggesting this.

Bicep meet Azure Pipelines

Bicep is a terser and more readable alternative language to ARM templates. Running ARM templates in Azure Pipelines is straightforward. However, there isn't yet a first class experience for running Bicep in Azure Pipelines. This post demonstrates an approach that can be used until a Bicep task is available.

Bicep meet Azure Pipelines

Bicep: mostly ARMless#

If you've been working with Azure and infrastructure as code, you'll likely have encountered ARM templates. They're a domain specific language that lives inside JSON, used to define the infrastructure that is deployed to Azure: App Services, Key Vaults and the like.

ARM templates are quite verbose and not the easiest thing to read. This is a consequence of being effectively a language nestled inside another language. Bicep is an alternative language which is far more readable. Bicep transpiles down to ARM templates, in the same way that TypeScript transpiles down to JavaScript.

Bicep is quite new, but already it enjoys feature parity with ARM templates (as of v0.3) and ships as part of the Azure CLI. However, as Bicep is new, it doesn't yet have a dedicated Azure Pipelines task for deployment. This should exist in future, perhaps as soon as the v0.4 release. In the meantime there's an alternative way to achieve this which we'll go through.

App Service with Bicep#

Let's take a simple Bicep file, azuredeploy.bicep, which is designed to deploy an App Service resource to Azure. It looks like this:

@description('Tags that our resources need')
param tags object = {
  costCenter: 'todo: replace'
  environment: 'todo: replace'
  application: 'todo: replace with app name'
  description: 'todo: replace'
  managedBy: 'ARM'
}

@minLength(2)
@description('Base name of the resource such as web app name and app service plan')
param applicationName string

@description('Location for all resources.')
param location string = resourceGroup().location

@description('The SKU of App Service Plan')
param sku string

var appServicePlanName_var = 'plan-${applicationName}-${tags.environment}'
var linuxFxVersion = 'DOTNETCORE|5.0'
var fullApplicationName_var = 'app-${applicationName}-${uniqueString(applicationName)}'

resource appServicePlanName 'Microsoft.Web/serverfarms@2020-12-01' = {
  name: appServicePlanName_var
  location: location
  sku: {
    name: sku
  }
  kind: 'linux'
  tags: {
    CostCenter: tags.costCenter
    Environment: tags.environment
    Description: tags.description
    ManagedBy: tags.managedBy
  }
  properties: {
    reserved: true
  }
}

resource fullApplicationName 'Microsoft.Web/sites@2020-12-01' = {
  name: fullApplicationName_var
  location: location
  kind: 'app'
  tags: {
    CostCenter: tags.costCenter
    Environment: tags.environment
    Description: tags.description
    ManagedBy: tags.managedBy
  }
  properties: {
    serverFarmId: appServicePlanName.id
    clientAffinityEnabled: true
    siteConfig: {
      appSettings: []
      linuxFxVersion: linuxFxVersion
      alwaysOn: false
      ftpsState: 'Disabled'
      http20Enabled: true
      minTlsVersion: '1.2'
      remoteDebuggingEnabled: false
    }
    httpsOnly: true
  }
  identity: {
    type: 'SystemAssigned'
  }
}

output fullApplicationName string = fullApplicationName_var

When transpiled down to an ARM template, this Bicep file more than doubles in size:

  • azuredeploy.bicep - 1782 bytes
  • azuredeploy.json - 3863 bytes

This tells you something of the advantage of Bicep. The template comes with an associated azuredeploy.parameters.json file:

{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "tags": {
      "value": {
        "costCenter": "8888",
        "environment": "stg",
        "application": "hello-azure",
        "description": "App Service for hello-azure",
        "managedBy": "ARM"
      }
    },
    "sku": {
      "value": "B1"
    }
  }
}

It's worth remembering that you can use the same parameters files with Bicep that you can use with ARM templates. This is great for minimising friction when it comes to migrating.

Bicep in azure-pipelines.yml#

Now we have our Bicep file, we want to execute it from the context of an Azure Pipeline. If we were working directly with the ARM template we'd likely have something like this in place:

- task: AzureResourceManagerTemplateDeployment@3
  displayName: "Deploy Hello Azure ARM"
  inputs:
    azureResourceManagerConnection: '$(azureSubscription)'
    action: Create Or Update Resource Group
    resourceGroupName: '$(resourceGroupName)'
    location: 'North Europe'
    templateLocation: Linked artifact
    csmFile: 'infra/app-service/azuredeploy.json'
    csmParametersFile: 'infra/app-service/azuredeploy.parameters.json'
    deploymentMode: Incremental
    deploymentOutputs: resourceGroupDeploymentOutputs
    overrideParameters: -applicationName $(Build.Repository.Name)

- pwsh: |
    $outputs = ConvertFrom-Json '$(resourceGroupDeploymentOutputs)'
    foreach ($output in $outputs.PSObject.Properties) {
      Write-Host "##vso[task.setvariable variable=RGDO_$($output.Name)]$($output.Value.value)"
    }
  displayName: "Turn ARM outputs into variables"

There are two tasks above. The first is the native task for ARM deployments which takes our ARM template and our parameters and deploys them. The second task takes the output variables from the first task and converts them into Azure Pipeline variables such that they can be referenced later in the pipeline. In this case it variablifies our fullApplicationName output.

There is, as yet, no native Azure Pipelines task for deploying Bicep templates directly. Though it's coming. In the meantime, the marvellous Alex Frankel advised:

I'd recommend using the Azure CLI task to deploy. As long as that task is updated to Az CLI version 2.20 or later, it will automatically install the bicep CLI when calling az deployment group create -f main.bicep.

Let's give it a go!

- task: AzureCLI@2
  displayName: "Deploy Hello Azure Bicep"
  inputs:
    azureSubscription: '$(azureSubscription)'
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      az --version
      echo "az deployment group create --resource-group '$(resourceGroupName)' --name appservicedeploy"
      az deployment group create --resource-group '$(resourceGroupName)' --name appservicedeploy \
        --template-file infra/app-service/azuredeploy.bicep \
        --parameters infra/app-service/azuredeploy.parameters.json \
        --parameters applicationName='$(Build.Repository.Name)'
      echo "az deployment group show --resource-group '$(resourceGroupName)' --name appservicedeploy"
      deploymentoutputs=$(az deployment group show --resource-group '$(resourceGroupName)' --name appservicedeploy \
        --query properties.outputs)
      echo 'convert outputs to variables'
      echo $deploymentoutputs | jq -c '. | to_entries[] | [.key, .value.value]' |
        while IFS=$"\n" read -r c; do
          outputname=$(echo "$c" | jq -r '.[0]')
          outputvalue=$(echo "$c" | jq -r '.[1]')
          echo "setting variable RGDO_$outputname=$outputvalue"
          echo "##vso[task.setvariable variable=RGDO_$outputname]$outputvalue"
        done

The above is just a single Azure CLI task (as advised). It invokes az deployment group create passing the relevant parameters. It then acquires the output properties using az deployment group show. Finally it once again converts these outputs to Azure Pipeline variables with some jq smarts.
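If the jq incantation is hard to parse, the same outputs-to-variables logic can be sketched in TypeScript. The output shape below matches what `az deployment group show --query properties.outputs` returns; the names and values are illustrative:

```typescript
// Shape of `properties.outputs` as returned by `az deployment group show`
type DeploymentOutputs = Record<string, { type: string; value: string }>;

// Mirror of the jq/while loop above: emit one Azure Pipelines logging
// command per deployment output, prefixed with RGDO_ so later steps can
// reference the resulting variables
function toSetVariableCommands(outputs: DeploymentOutputs): string[] {
  return Object.entries(outputs).map(
    ([name, { value }]) => `##vso[task.setvariable variable=RGDO_${name}]${value}`
  );
}

// Example: the fullApplicationName output from our Bicep template
const commands = toSetVariableCommands({
  fullApplicationName: { type: 'String', value: 'app-hello-azure-abc123' },
});
console.log(commands[0]);
// → ##vso[task.setvariable variable=RGDO_fullApplicationName]app-hello-azure-abc123
```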

This works right now, and running it results in something like the output below. So if you're excited about Bicep and don't want to wait for 0.4 to start moving on this, then this can get you going. To track the progress of the custom task, keep an eye on this issue.

Bicep in an Azure Pipeline

Update: an even simpler alternative#

There is an even simpler way to do this, which I discovered subsequent to writing this post. Have a read.

Surfacing Azure Pipelines Build Info in a .NET React SPA

The title of this post is hugely specific, but the idea is simple. We want to answer the question: "what codebase is running in Production right now?" Many is the time when I've been pondering why something isn't working as expected and burned a disappointing amount of time before realising that I'm playing with an old version of an app. Wouldn't it be great to give our app a way to say: "Hey! I'm version 1.2.3.4 of your app; built from this commit hash, I was built on Wednesday, I was the ninth build that day and I was built from the main branch. And I'm an Aries." Or something like that.

This post was inspired by Scott Hanselman's similar post on the topic. Ultimately this ended up going in a fairly different direction and so seemed worthy of a post of its own.

A particular difference is that this is targeting SPAs. Famously, cache invalidation is hard. It's possible for the HTML/JS/CSS of your app to be stale due to aggressive caching. So we're going to make it possible to see build information for both when the SPA (or "client") is built, as well as when the .NET app (or "server") is built. We're using a specific type of SPA here; a React SPA built with TypeScript and Material UI, however the principles here are general; you could surface this up any which way you choose.

Putting build info into azure-pipelines.yml#

The first thing we're going to do is to inject our build details into two identical buildinfo.json files; one that sits in the server codebase and which will be used to drive the server build information, and one that sits in the client codebase to drive the client equivalent. They'll end up looking something like this:

{
  "buildNumber": "20210130.1",
  "buildId": "123456",
  "branchName": "main",
  "commitHash": "7089620222c30c1ad88e4b556c0a7908ddd34a8e"
}

We generate this by adding the following yml to the beginning of our azure-pipelines.yml (crucially before the client or server build take place):

- script: |
    echo -e -n "{\"buildNumber\":\"$(Build.BuildNumber)\",\"buildId\":\"$(Build.BuildId)\",\"branchName\":\"$(Build.SourceBranchName)\",\"commitHash\":\"$(Build.SourceVersion)\"}" > "$(Build.SourcesDirectory)/src/client-app/src/buildinfo.json"
    echo -e -n "{\"buildNumber\":\"$(Build.BuildNumber)\",\"buildId\":\"$(Build.BuildId)\",\"branchName\":\"$(Build.SourceBranchName)\",\"commitHash\":\"$(Build.SourceVersion)\"}" > "$(Build.SourcesDirectory)/src/server-app/Server/buildinfo.json"
  displayName: "emit build details as JSON"
  failOnStderr: true

As you can see, we're placing the following variables that are available at build time in Azure Pipelines, into the buildinfo.json:

  • BuildNumber - The name of the completed build; which usually takes the form of a date in the yyyyMMdd format, suffixed by .x where x is a number that increments representing the number of builds that have taken place on the given day.
  • BuildId - The ID of the record for the completed build.
  • SourceVersion - The commit hash of the source code in Git.
  • SourceBranchName - The name of the branch in Git.

There are many variables available in Azure Pipelines that can be used - we've picked out the ones most interesting to us.
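To make the shape concrete, here's a TypeScript sketch (illustrative only, not part of the pipeline) that builds the same JSON the script's echo emits, and validates the yyyyMMdd.x build number format described above:

```typescript
// The shape of buildinfo.json emitted by the pipeline script
interface BuildInfo {
  buildNumber: string;
  buildId: string;
  branchName: string;
  commitHash: string;
}

// Serialise in the same shape the pipeline script echoes to buildinfo.json
function toBuildInfoJson(info: BuildInfo): string {
  return JSON.stringify(info);
}

// BuildNumber usually takes the form of a yyyyMMdd date suffixed with .x
function isValidBuildNumber(buildNumber: string): boolean {
  return /^\d{8}\.\d+$/.test(buildNumber);
}

const info: BuildInfo = {
  buildNumber: '20210130.1',
  buildId: '123456',
  branchName: 'main',
  commitHash: '7089620222c30c1ad88e4b556c0a7908ddd34a8e',
};
console.log(isValidBuildNumber(info.buildNumber)); // true
```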

Surfacing the server build info#

Our pipeline is dropping the buildinfo.json over pre-existing stub buildinfo.json files in both our client and server codebases. The stub files look like this:

{
  "buildNumber": "yyyyMMdd.x",
  "buildId": "xxxxxx",
  "branchName": "",
  "commitHash": "LOCAL_BUILD"
}

In our .NET app, the buildinfo.json file has been dropped in the root of the app. And as luck would have it, all JSON files are automatically included in a .NET build and so it will be available at runtime. We want to surface this file through an API, and we also want to use it to stamp details into our logs.

So we need to parse the file, and for that we'll use this:

using System;
using System.IO;
using System.Text.Json;

namespace Server {
    public record BuildInfo(string BranchName, string BuildNumber, string BuildId, string CommitHash);

    public static class AppVersionInfo {
        private const string _buildFileName = "buildinfo.json";
        private static BuildInfo _fileBuildInfo = new(
            BranchName: "",
            BuildNumber: DateTime.UtcNow.ToString("yyyyMMdd") + ".0",
            BuildId: "xxxxxx",
            CommitHash: $"Not yet initialised - call {nameof(InitialiseBuildInfoGivenPath)}"
        );

        public static void InitialiseBuildInfoGivenPath(string path) {
            var buildFilePath = Path.Combine(path, _buildFileName);
            if (File.Exists(buildFilePath)) {
                try {
                    var buildInfoJson = File.ReadAllText(buildFilePath);
                    var buildInfo = JsonSerializer.Deserialize<BuildInfo>(buildInfoJson, new JsonSerializerOptions {
                        PropertyNamingPolicy = JsonNamingPolicy.CamelCase
                    });
                    if (buildInfo == null) throw new Exception($"Failed to deserialise {_buildFileName}");

                    _fileBuildInfo = buildInfo;
                } catch (Exception) {
                    _fileBuildInfo = new BuildInfo(
                        BranchName: "",
                        BuildNumber: DateTime.UtcNow.ToString("yyyyMMdd") + ".0",
                        BuildId: "xxxxxx",
                        CommitHash: "Failed to load build info from buildinfo.json"
                    );
                }
            }
        }

        public static BuildInfo GetBuildInfo() => _fileBuildInfo;
    }
}

The above code reads the buildinfo.json file and deserialises it into a BuildInfo record which is then surfaced up by the GetBuildInfo method. We initialise this at the start of our Program.cs like so:

public static int Main(string[] args) {
    AppVersionInfo.InitialiseBuildInfoGivenPath(Directory.GetCurrentDirectory());
    // Now we're free to call AppVersionInfo.GetBuildInfo()
    // ....
}

Now we need a controller to surface this information up. We'll add ourselves a BuildInfoController.cs:

using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;

namespace Server.Controllers {
    [ApiController]
    public class BuildInfoController : ControllerBase {
        [AllowAnonymous]
        [HttpGet("api/build")]
        public BuildInfo GetBuild() => AppVersionInfo.GetBuildInfo();
    }
}

This exposes an api/build endpoint in our .NET app that, when hit, will display the following JSON:
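The response is simply the BuildInfo record serialised as camelCase JSON by ASP.NET Core; for a real build it will look something like this (values illustrative, mirroring the buildinfo.json above):

```json
{
  "branchName": "main",
  "buildNumber": "20210130.1",
  "buildId": "123456",
  "commitHash": "7089620222c30c1ad88e4b556c0a7908ddd34a8e"
}
```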

Surfacing the client build info#

Our server now lets the world know which version it is running and this is tremendous. Now let's make our client do the same.

Very little is required to achieve this. Again we have a buildinfo.json sat in the root of our codebase. We're able to import it as a module in TypeScript because we've set the following property in our tsconfig.json:

"resolveJsonModule": true,

As a consequence, consumption is as simple as:

import clientBuildInfo from './buildinfo.json';

Which provides us with a clientBuildInfo which TypeScript automatically derives as this type:

type ClientBuildInfo = {
  buildNumber: string;
  buildId: string;
  branchName: string;
  commitHash: string;
}

How you choose to use that information is entirely your choice. We're going to add ourselves an "about" screen in our app, which displays both client info (loaded using the mechanism above) and server info (fetched from the /api/build endpoint).

import {
  Card,
  CardContent,
  CardHeader,
  createStyles,
  Grid,
  makeStyles,
  Theme,
  Typography,
  Zoom,
} from '@material-ui/core';
import React from 'react';

import clientBuildInfo from '../../buildinfo.json';
import { projectsPurple } from '../shared/colors';
import { Loading } from '../shared/Loading';
import { TransitionContainer } from '../shared/TransitionContainer';

const useStyles = (cardColor: string) =>
  makeStyles((theme: Theme) =>
    createStyles({
      card: {
        padding: theme.spacing(0),
        backgroundColor: cardColor,
        color: theme.palette.common.white,
        minHeight: theme.spacing(28),
      },
      avatar: {
        backgroundColor: theme.palette.getContrastText(cardColor),
        color: cardColor,
      },
      main: {
        padding: theme.spacing(2),
      },
    })
  )();

type Styles = ReturnType<typeof useStyles>;

const AboutPage: React.FC = () => {
  const [serverBuildInfo, setServerBuildInfo] = React.useState<typeof clientBuildInfo>();

  React.useEffect(() => {
    fetch('/api/build')
      .then((response) => response.json())
      .then(setServerBuildInfo);
  }, []);

  const classes = useStyles(projectsPurple);

  return (
    <TransitionContainer>
      <Grid container spacing={3}>
        <Grid item xs={12} sm={12} container alignItems="center">
          <Grid item>
            <Typography variant="h4" component="h1">
              About
            </Typography>
          </Grid>
        </Grid>
      </Grid>
      <Grid container spacing={1}>
        <BuildInfo classes={classes} title="Client Version" {...clientBuildInfo} />
      </Grid>
      <br />
      <Grid container spacing={1}>
        {serverBuildInfo ? (
          <BuildInfo classes={classes} title="Server Version" {...serverBuildInfo} />
        ) : (
          <Loading />
        )}
      </Grid>
    </TransitionContainer>
  );
};

interface Props {
  classes: Styles;
  title: string;
  branchName: string;
  buildNumber: string;
  buildId: string;
  commitHash: string;
}

const BuildInfo: React.FC<Props> = ({ classes, title, branchName, buildNumber, buildId, commitHash }) => (
  <Zoom mountOnEnter unmountOnExit in={true}>
    <Card className={classes.card}>
      <CardHeader title={title} />
      <CardContent className={classes.main}>
        <Typography variant="body1" component="p">
          <b>Build Number</b> {buildNumber}
        </Typography>
        <Typography variant="body1" component="p">
          <b>Build Id</b> {buildId}
        </Typography>
        <Typography variant="body1" component="p">
          <b>Branch Name</b> {branchName}
        </Typography>
        <Typography variant="body1" component="p">
          <b>Commit Hash</b> {commitHash}
        </Typography>
      </CardContent>
    </Card>
  </Zoom>
);

export default AboutPage;

When the above page is viewed it looks like this:

And that's it! Our app is clearly telling us what version is being run, both on the server and in the client. Thanks to Scott Hanselman for his work which inspired this.

Azure Pipelines meet Jest

This post explains how to integrate the tremendous test runner Jest with the continuous integration platform Azure Pipelines. Perhaps we're setting up a new project and we've created a new React app with Create React App. This ships with Jest support out of the box. How do we get that plugged into Pipelines such that:

  1. Tests run as part of our pipeline
  2. A failing test fails the build
  3. Test results are reported in Azure Pipelines UI?

Tests run as part of our pipeline#

First of all, let's get the tests running. Crack open your azure-pipelines.yml file and, in the appropriate place, add the following:

- task: Npm@1
  displayName: npm run test
  inputs:
    command: 'custom'
    workingDir: 'src/client-app'
    customCommand: 'run test'

The above will, when run, trigger an npm run test in the src/client-app folder of my project (it's here where my React app lives). You'd imagine this would just work™️ - but life is not that simple. This is because Jest, by default, runs in watch mode. This is blocking and so not appropriate for CI.

In our src/client-app/package.json let's create a new script that runs the tests but not in watch mode:

"test:ci": "npm run test -- --watchAll=false",

and switch our azure-pipelines.yml to use it:

- task: Npm@1
  displayName: npm run test
  inputs:
    command: 'custom'
    workingDir: 'src/client-app'
    customCommand: 'run test:ci'

Boom! We're now running tests as part of our pipeline. And also, failing tests will fail the build, because of Jest's default behaviour of exiting with status code 1 on failed tests.

Test results are reported in Azure Pipelines UI#

Pipelines has a really nice UI for reporting test results. If you're using something like .NET then you'll find that test results just magically show up there. We'd like that for our Jest tests as well. And we can have it.

The way we achieve this is by:

  1. Producing test results in a format that can be subsequently processed
  2. Using those test results to publish to Azure Pipelines

The way that you configure Jest test output is through the use of reporters. However, Create React App doesn't support these. Happily that's not an issue, as the marvellous Dan Abramov demonstrates here.

We need to install the jest-junit package to our client-app:

npm install jest-junit --save-dev

And we'll tweak our test:ci script to use the jest-junit reporter as well:

"test:ci": "npm run test -- --watchAll=false --reporters=default --reporters=jest-junit",

We also need to add some configuration to our package.json in the form of a jest-junit element:

"jest-junit": {
  "suiteNameTemplate": "{filepath}",
  "outputDirectory": ".",
  "outputName": "junit.xml"
}

The above configuration will use the name of the test file as the suite name in the results, which should speed up the tracking down of the failing test. The other values specify where the test results should be published to, in this case the root of our client-app with the filename junit.xml.
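To illustrate what suiteNameTemplate is doing, here's a simplified TypeScript sketch of that style of template substitution (illustrative only, not jest-junit's actual implementation):

```typescript
// Simplified sketch of jest-junit style template substitution:
// each {token} in the template is replaced with the matching value
function applySuiteNameTemplate(
  template: string,
  vars: Record<string, string>
): string {
  // Unknown tokens are left intact rather than removed
  return template.replace(/\{(\w+)\}/g, (match: string, token: string) => vars[token] ?? match);
}

// With "{filepath}" the suite name becomes the test file's path,
// which makes a failing test easy to track down in the results UI
console.log(applySuiteNameTemplate('{filepath}', { filepath: 'src/App.test.tsx' }));
// → src/App.test.tsx
```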

Now our CI is producing our test results, how do we get them into Pipelines? For that we need the Publish test results task and a new step in our azure-pipelines.yml after our npm run test step:

- task: Npm@1
  displayName: npm run test
  inputs:
    command: 'custom'
    workingDir: 'src/client-app'
    customCommand: 'run test:ci'

- task: PublishTestResults@2
  displayName: 'supply npm test results to pipelines'
  condition: succeededOrFailed() # because otherwise we won't know what tests failed
  inputs:
    testResultsFiles: 'src/client-app/junit.xml'

This will read the test results from our src/client-app/junit.xml file and pump them into Pipelines. Do note that we're always running this step; so if the previous step failed (as it would in the case of a failing test) we still pump out the details of what that failure was. Like so:

And that's it! Azure Pipelines and Jest integrated.