4 posts tagged with "powershell"

Setting Build Version Using AppVeyor and ASP.Net Core

AppVeyor has support for setting the version of a binary during a build. However - this deals with the classic ASP.Net world of AssemblyInfo. I didn't find any reference to support for doing the same with dot net core. Remember, dot net core relies upon a <Version> or a <VersionPrefix> setting in the .csproj file. Personally, <Version> is my jam.
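For reference, that setting lives in a property group in the .csproj. A minimal, illustrative sketch (your project file will contain plenty more besides):

<PropertyGroup>
  <!-- illustrative value; the script below expects a four part <Version> in this shape -->
  <Version>1.0.0.0</Version>
</PropertyGroup>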

However, coming up with your own bit of powershell that stamps the version during the build is a doddle; here we go:

Param($projectFile, $buildNum)

$content = [IO.File]::ReadAllText($projectFile)

$regex = new-object System.Text.RegularExpressions.Regex ('(<Version>)([\d]+\.[\d]+\.[\d]+)(\.[\d]+)(<\/Version>)',
    [System.Text.RegularExpressions.RegexOptions]::MultiLine)

$version = $null
$match = $regex.Match($content)
if ($match.Success) {
    # from "<Version>1.0.0.0</Version>" this will extract "1.0.0"
    $version = $match.groups[2].value
}

# suffix build number onto $version. eg "1.0.0.15"
$version = "$version.$buildNum"

# update "<Version>1.0.0.0</Version>" to "<Version>$version</Version>"
$content = $regex.Replace($content, '${1}' + $version + '${4}')

# update csproj file
[IO.File]::WriteAllText($projectFile, $content)

# update AppVeyor build
Update-AppveyorBuild -Version $version

You can invoke this script as part of the build process in AppVeyor by adding something like this to your appveyor.yml.

before_build:
  - ps: .\ModifyVersion.ps1 $env:APPVEYOR_BUILD_FOLDER\src\Proverb.Web\Proverb.Web.csproj $env:APPVEYOR_BUILD_NUMBER

It will keep the first 3 parts of the version in your .csproj (eg "1.0.0") and suffix on the build number supplied by AppVeyor.
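You can also run the script by hand to sanity check the substitution before wiring it into a build. For example (the build number here is invented for illustration):

.\ModifyVersion.ps1 .\src\Proverb.Web\Proverb.Web.csproj 15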

TFS 2012 meet PowerShell, Karma and BuildNumber

To my lasting regret, TFS 2012 has no direct support for PowerShell. Such a shame as PowerShell scripts can do a lot of heavy lifting in a build process. Well, here we're going to brute force TFS 2012 into running PowerShell scripts. And along the way we'll also get Karma test results publishing into TFS 2012 as an example usage. Nice huh? Let's go!

PowerShell via csproj

It's time to hack the csproj (or whatever project file you have) again. We're going to add an AfterBuild target to the end of the file. This target will be triggered after the build completes (as the name suggests):

<Target Name="AfterBuild">
  <Message Importance="High" Text="AfterBuild: PublishUrl = $(PublishUrl), BuildUri = $(BuildUri), Configuration = $(Configuration), ProjectDir = $(ProjectDir), TargetDir = $(TargetDir), TargetFileName = $(TargetFileName), BuildNumber = $(BuildNumber), BuildDefinitionName = $(BuildDefinitionName)" />
  <Exec Command="powershell.exe -NonInteractive -ExecutionPolicy RemoteSigned &quot;&amp; '$(ProjectDir)AfterBuild.ps1' '$(Configuration)' '$(ProjectDir)' '$(TargetDir)' '$(PublishUrl)' '$(BuildNumber)' '$(BuildDefinitionName)'&quot;" />
</Target>

There are two things happening in this target:

  1. A message is printed out during compilation which contains details of the various compile time variables. This is nothing more than a console.log statement really; it's useful for debugging and so I keep it around. You'll notice one of them is called $(BuildNumber); more on that later.
  2. A command is executed; PowerShell! This invokes PowerShell with the -NonInteractive and -ExecutionPolicy RemoteSigned flags. It passes a script to be executed called AfterBuild.ps1 that lives in the root of the project directory. To that script a number of parameters are supplied; compile time variables that we may use in the script.
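To make the second point concrete, once MSBuild has expanded those properties the Exec ends up running a command line along these lines (the values are invented for illustration; the empty argument is $(PublishUrl), which may well be blank):

powershell.exe -NonInteractive -ExecutionPolicy RemoteSigned "& 'C:\src\MyProject\AfterBuild.ps1' 'Release' 'C:\src\MyProject\' 'C:\src\MyProject\bin\Release\' '' 'MyBuildDefinition_20150629.1' 'MyBuildDefinition'"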

Where's my BuildNumber and BuildDefinitionName?

So you've checked in your changes and kicked off a build on the server. You're picking over the log messages and you're thinking: "Where's my BuildNumber?". Well, TFS 2012 does not have that set as a variable at compile time by default. This stumped me for a while but thankfully I wasn't the only person wondering... As ever, Stack Overflow had the answer. So, deep breath, it's time to hack the TFS build template file.

Check out the DefaultTemplate.11.1.xaml file from TFS and open it in your text editor of choice. It's find and replace time! (There are probably 2 instances that need replacement.) Perform a find for the following:

[String.Format(&quot;/p:SkipInvalidConfigurations=true {0}&quot;, MSBuildArguments)]

And replace it with this:

[String.Format("/p:SkipInvalidConfigurations=true /p:BuildNumber={1} /p:BuildDefinitionName={2} {0}", MSBuildArguments, BuildDetail.BuildNumber, BuildDetail.BuildDefinition.Name)]

Pretty long line, eh? Let's break it up to make it more readable (but remember that in the XAML it needs to stay on a single line):

[String.Format("/p:SkipInvalidConfigurations=true
/p:BuildNumber={1}
/p:BuildDefinitionName={2} {0}", MSBuildArguments, BuildDetail.BuildNumber, BuildDetail.BuildDefinition.Name)]

We're just adding two extra properties, BuildNumber and BuildDefinitionName, to the MSBuild invocation. Once we've checked this back in, BuildNumber and BuildDefinitionName will be available on future builds. Yay! Important: your build definition name must not contain spaces; there's probably a way to pass names with spaces through, but I'm not sure what it is.
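To make that concrete, the MSBuild invocation on the build server ends up including arguments something like the following (values invented for illustration; the exact build number format depends on your build definition):

/p:SkipInvalidConfigurations=true /p:BuildNumber=MyBuildDefinition_20150629.1 /p:BuildDefinitionName=MyBuildDefinition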

AfterBuild.ps1

You can use your AfterBuild.ps1 script to do any number of things. In my case I'm going to use MSTest to publish some test results which have been generated by Karma into TFS:

param ([string]$configuration, [string]$projectDir, [string]$targetDir, [string]$publishUrl, [string]$buildNumber, [string]$buildDefinitionName)

$ErrorActionPreference = 'Stop'

Clear

function PublishTestResults([string]$resultsFile) {
    Write-Host 'PublishTests'
    $mstest = 'C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\MSTest.exe'

    Write-Host "Using $mstest at $pwd"
    Write-Host "Publishing: $resultsFile"

    & $mstest /publishresultsfile:$resultsFile /publish:http://my-tfs-server:8080/tfs /teamproject:MyProject /publishbuild:$buildNumber /platform:'Any CPU' /flavor:Release
}

function FailBuildIfThereAreTestFailures([string]$resultsFile) {
    $results = [xml](GC $resultsFile)
    $outcome = $results.TestRun.ResultSummary.outcome
    $fgColor = if ($outcome -eq "Failed") { "Red" } else { "Green" }
    $total = $results.TestRun.ResultSummary.Counters.total
    $passed = $results.TestRun.ResultSummary.Counters.passed
    $failed = $results.TestRun.ResultSummary.Counters.failed
    $failedTests = $results.TestRun.Results.UnitTestResult | Where-Object { $_.outcome -eq "Failed" }

    Write-Host "Test Results: $outcome" -ForegroundColor $fgColor -BackgroundColor "Black"
    Write-Host "Total tests: $total"
    Write-Host "Passed: $passed"
    Write-Host "Failed: $failed"
    Write-Host

    $failedTests | % {
        Write-Host "Failed test: $($_.testName)"
        Write-Host $_.Output.ErrorInfo.Message
        Write-Host $_.Output.ErrorInfo.StackTrace
    }
    Write-Host

    if ($outcome -eq "Failed") {
        Write-Host "Failing build as there are broken tests"
        $host.SetShouldExit(1)
    }
}

function Run() {
    Write-Host "Running AfterBuild.ps1 using Configuration: $configuration, projectDir: $projectDir, targetDir: $targetDir, publishUrl: $publishUrl, buildNumber: $buildNumber, buildDefinitionName: $buildDefinitionName"

    if ($buildNumber) {
        $resultsFile = "$projectDir\test-results.trx"
        PublishTestResults $resultsFile
        FailBuildIfThereAreTestFailures $resultsFile
    }
}

# Off we go...
Run

Assuming we have a build number, this script kicks off the PublishTestResults function above. So we won't attempt to publish test results when compiling in Visual Studio on our dev machine. The script looks for MSTest.exe in a certain location on disk (the default VS 2015 installation location, in fact; it may be installed elsewhere on your build machine). MSTest is then invoked and passed a file called test-results.trx which is expected to live in the root of the project. All being well, the test results will be registered with the running build and will be visible when you look at test results in TFS.
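If your build machine has a different version of Visual Studio installed, that hard-coded path won't exist. One option (a sketch only; the paths below are just the default install locations for a few VS versions) is to probe for the first MSTest.exe that's actually present:

$candidates = @(
    'C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\MSTest.exe',
    'C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\MSTest.exe',
    'C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\MSTest.exe'
)
$mstest = $candidates | Where-Object { Test-Path $_ } | Select-Object -First 1
if (-not $mstest) { throw "MSTest.exe not found in any of the expected locations" }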

Finally in FailBuildIfThereAreTestFailures we parse the test-results.trx file (and for this I'm totally in the debt of David Robert's helpful Gist). We write out the results to the host so it'll show up in the MSBuild logs. Also, and this is crucial, if there are any failures we fail the build by exiting PowerShell with a code of 1. We are deliberately swallowing any error that Karma raises earlier when it detects failed tests. We do this because we want to publish the failed test results to TFS before we kill the build.

Bonus Karma: test-results.trx

If you've read a previous post of mine you'll be aware that it's possible to get MSBuild to kick off npm build tasks. Specifically I have MSBuild kicking off an npm run build. My package.json looks like this:

"scripts": {
"test": "karma start --reporters mocha,trx --single-run --browsers PhantomJS",
"clean": "gulp delete-dist-contents",
"watch": "gulp watch",
"build": "gulp build",
"postbuild": "npm run test"
},

You can see that the postbuild hook kicks off the test script in turn. And that test script kicks off a single run of karma. I'm not going to go over setting up Karma at all here; there are other posts out there that cover that admirably. But I wanted to share news of the karma trx reporter. This reporter is the thing that produces our test-results.trx file; trx being the format that TFS likes to deal with.
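For completeness: the reporter in question is (I believe) the karma-trx-reporter npm package, pulled in as a dev dependency like so:

npm install karma-trx-reporter --save-dev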

So now we've got a PowerShell hook into our build process (something very useful in itself) which we are using to publish Karma test results into TFS 2012. They said it couldn't be done. They were wrong. Huzzah!!!!!!!

Deploying from ASP.Net MVC to GitHub Pages using AppVeyor part 2

"Automation, automation, automation." Those were and are Tony Blair's priorities for keeping open source projects well maintained.

OK, that's not quite true... But what is certainly true is that maintaining an open source project takes time. And there's only so much free time that anyone has. For that reason, wherever you can it makes sense to AUTOMATE!

Last time we looked at how you can take an essentially static ASP.Net MVC site (in this case my jVUNDemo documentation site) and generate an entirely static version using Wget. This static site has been pushed to GitHub Pages and is serving as the documentation for jQuery Validation Unobtrusive Native (and for bonus points is costing me no money at all).

So what next? Well, automation clearly! If I make a change to jQuery Validation Unobtrusive Native then AppVeyor already bounds in and performs a continuous integration build for me. It picks up the latest source from GitHub, pulls in my dependencies, performs a build and runs my tests. Lovely.

So the obvious thing to do is to take this process and plug in the generation of my static site and the publication thereof to GitHub pages. The minute a change is made to my project the documentation should be updated without me having to break sweat. That's the goal.

GitHub Personal Access Token

In order to complete our chosen mission we're going to need a GitHub Personal Access Token. We're going to use it when we clone, update and push our GitHub Pages branch. To get one we biff over to Settings / Applications in GitHub and click the "Generate New Token" button.

The token I'm using for my project has just the scopes needed to push to the repo selected.

appveyor.yml

With our token in hand we turn our attention to AppVeyor build configuration. This is possible using a file called appveyor.yml stored in the root of your repo. You can also use the AppVeyor web UI to do this. However, for the purposes of ease of demonstration I'm using the file approach. The jQuery Validation Unobtrusive Native appveyor.yml looks like this:

#---------------------------------#
#      general configuration      #
#---------------------------------#

# version format
version: 1.0.{build}

#---------------------------------#
#    environment configuration    #
#---------------------------------#

# environment variables
environment:
  GithubEmail: [email protected]
  GithubUsername: johnnyreilly
  GithubPersonalAccessToken:
    secure: T4M/N+e/baksVoeWoYKPWIpfahOsiSFw/+Zc81VuThZmWEqmrRtgEHUyin0vCWhl

branches:
  only:
    - master

install:
  - ps: choco install wget

build:
  verbosity: minimal

after_test:
  - ps: ./makeStatic.ps1 $env:APPVEYOR_BUILD_FOLDER
  - ps: ./pushStatic.ps1 $env:APPVEYOR_BUILD_FOLDER $env:GithubEmail $env:GithubUsername $env:GithubPersonalAccessToken

There are a number of things you should notice from the yml file:

  • We create 3 environment variables: GithubEmail, GithubUsername and GithubPersonalAccessToken (more on this in a moment).
  • We only build the master branch.
  • We use Chocolatey to install Wget which is used by the makeStatic.ps1 Powershell script.
  • After the tests have completed we run 2 Powershell scripts. First makeStatic.ps1 (https://github.com/johnnyreilly/jQuery.Validation.Unobtrusive.Native/blob/master/makeStatic.ps1), which builds the static version of our site. This is the exact same script we discussed in the previous post - we're just passing it the build folder this time (one of AppVeyor's environment variables). Second, we run pushStatic.ps1 (https://github.com/johnnyreilly/jQuery.Validation.Unobtrusive.Native/blob/master/pushStatic.ps1), which publishes the static site to GitHub Pages.

We pass 4 arguments to pushStatic.ps1: the build folder, my email address, my username and my personal access token. For the sake of security the GithubPersonalAccessToken has been encrypted, as indicated by the secure keyword. This is a capability AppVeyor provides for sensitive configuration values.

This allows me to mask my personal access token rather than have it available as free text for anyone to grab.

pushStatic.ps1

Finally we can turn our attention to how our Powershell script pushStatic.ps1 goes about pushing our changes up to GitHub Pages:

param([string]$buildFolder, [string]$email, [string]$username, [string]$personalAccessToken)

Write-Host "- Set config settings...."
git config --global user.email $email
git config --global user.name $username
git config --global push.default matching

Write-Host "- Clone gh-pages branch...."
cd "$($buildFolder)\..\"
mkdir gh-pages
git clone --quiet --branch=gh-pages https://$($username):$($personalAccessToken)@github.com/johnnyreilly/jQuery.Validation.Unobtrusive.Native.git .\gh-pages\
cd gh-pages
git status

Write-Host "- Clean gh-pages folder...."
Get-ChildItem -Attributes !r | Remove-Item -Recurse -Force

Write-Host "- Copy contents of static-site folder into gh-pages folder...."
copy-item -path ..\static-site\* -Destination $pwd.Path -Recurse
git status

$thereAreChanges = git status | select-string -pattern "Changes not staged for commit:","Untracked files:" -simplematch
if ($thereAreChanges -ne $null) {
    Write-Host "- Committing changes to documentation..."
    git add --all
    git status
    git commit -m "skip ci - static site regeneration"
    git status

    Write-Host "- Push it...."
    git push --quiet
    Write-Host "- Pushed it good!"
}
else {
    Write-Host "- No changes to documentation to commit"
}

So what's happening here? Let's break it down:

  • Git is configured with the passed in username and email address.
  • A folder called "gh-pages" is created alongside the build folder.
  • We clone the "gh-pages" branch of jQuery Validation Unobtrusive Native into our "gh-pages" directory. You'll notice that we are using our GitHub personal access token in the clone URL itself.
  • We delete the contents of the "gh-pages" directory leaving it empty.
  • We copy across the contents of the "static-site" folder (created by makeStatic.ps1) into the "gh-pages" folder.
  • We use git status to check whether there are any changes. (This works, but it's a little crude to my mind - there are probably better approaches; one is sketched just after this list. Shout me in the comments if you have a suggestion.)
  • If we have no changes then we do nothing.
  • If we have changes then we stage them, commit them and push them to GitHub Pages. Then we sign off with an allusion to 80's East Coast hip-hop... 'Cos that's how we roll.
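For what it's worth, one slightly tidier check (a sketch I haven't wired into the script above) is git status --porcelain, which prints nothing at all when the working tree is clean:

$thereAreChanges = git status --porcelain
if ($thereAreChanges) {
    # stage, commit and push as before
}
else {
    Write-Host "- No changes to documentation to commit"
}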

With this in place, any changes to the docs will be automatically published out to our "gh-pages" branch. Our documentation will always be up to date thanks to the goodness of AppVeyor's Continuous Integration service.

Deploying from ASP.Net MVC to GitHub Pages using AppVeyor part 1

There's a small open source project I'm responsible for called jQuery Validation Unobtrusive Native. (A catchy name is a must for any good open source project. Alas I'm not quite meeting my own exacting standards on this particular point... I should have gone with my gut and called it "Livingstone" instead. Too late now...)

The project itself is fairly simple in purpose. It's essentially a bridge between ASP.Net MVC's inbuilt support for driving validation from data attributes and jQuery Validation's native support for the same. It is, in the end, a collection of ASP.Net MVC HTML helper extensions. It is not massively complicated.

I believe it was Tony Blair that said "documentation, documentation, documentation" were his priorities for open source projects. Or maybe it was someone else... Anyway, the point stands. Documentation is supremely important if you want your project to be in any way useful to anyone other than yourself. A project, no matter how fantastic, which lacks decent documentation is a missed opportunity.

Anyway I'm happy to say that jQuery Validation Unobtrusive Native has documentation! And pretty good documentation at that. The documentation takes the form of the jVUNDemo project which is part of the jQuery Validation Unobtrusive Native repo. jVUNDemo is an ASP.Net MVC web application which is built on top of the jQuery Validation Unobtrusive Native helpers. It demonstrates the helpers in action and documents how you might go about using them. It looks a bit like this:

When I first put jVUNDemo together I hosted it on Azure so the world could see it in all its finery. And that worked just fine. However, there's something you ought to know about me:

I'm quite cheap

No really, I am. My attention was grabbed by the essentially "free" nature of GitHub Pages. I was immediately seized by the desire to somehow deploy jVUNDemo to GitHub Pages and roll around joyfully in all that lovely free hosting.

"But", I hear you cry, "you can't deploy an ASP.Net MVC application to Git Hub Pages... It only hosts static sites!" Quite right. Sort of. This is where I get to pull my ace of spades: jVUNDemo is not really an "app" so much as a static site. Yup, once the HTML that makes up each page is generated there isn't any app like business to do. It's just a collection of text and 1 screen demo's really.

That being the case, there's no reason why I shouldn't be able to make use of GitHub Pages. So that's what I decided to do. Whilst I was at it I also wanted to automate the deployment process. When I tweak jVUNDemo I don't want to have to manually push out a new version of jVUNDemo to wherever it's being hosted. No, I'm a developer so I'll automate it.

I've broken this up into 2 posts. This first one will cover how you generate a static site from an ASP.Net MVC site. The second will be about how to use AppVeyor to ensure that on each build your documentation is getting republished.

You Wget me?

So, static site generation. There's a well known Unix utility called Wget which covers exactly that ground and so we're going to use it. It downloads and saves HTML, it recursively walks the links inside the site and grabs those pages too and it converts our routes so they are locally browsable ("Demo/Required" becomes "Demo/Required.html").
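Stripped of the paths and logging that the full script below adds, the heart of it is a single wget call along these lines (assuming wget.exe is on your PATH and the site is already being served locally):

wget.exe --recursive --convert-links -E --no-host-directories --directory-prefix=static-site http://localhost:57612/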

You can use Chocolatey to get a copy of Wget. We're also going to need IIS Express to host jVUNDemo whilst Wget converts it. Once jVUNDemo has been built successfully you should be able to kick off the process like so:

cd C:\projects\jquery-validation-unobtrusive-native
.\makeStatic.ps1 $pwd.path

This invokes the makeStatic Powershell script in the root of the jQuery Validation Unobtrusive Native repo:

param([string]$buildFolder)

$jVUNDemo = "$($buildFolder)\jVUNDemo"
$staticSiteParentPath = (get-item $buildFolder).Parent.FullName
$staticSite = "static-site"
$staticSitePath = "$($staticSiteParentPath)\$($staticSite)"
$wgetLogPath = "$($staticSiteParentPath)\wget.log"
$port = 57612
$servedAt = "http://localhost:$($port)/"

write-host "jVUNDemo location: $jVUNDemo"
write-host "static site parent location: $staticSiteParentPath"
write-host "static site location: $staticSitePath"
write-host "wget log path: $wgetLogPath"

write-host "Spin up jVUNDemo site at $($servedAt)"
$process = Start-Process 'C:\Program Files (x86)\IIS Express\iisexpress.exe' -NoNewWindow -ArgumentList "/path:$($jVUNDemo) /port:$($port)"

write-host "Wait a moment for IIS to startup"
Start-sleep -s 15

if (Test-Path $staticSitePath) {
    write-host "Removing $($staticSitePath)..."
    Remove-Item -path $staticSitePath -Recurse -Force
}

write-host "Create static version of demo site here: $($staticSitePath)"
Push-Location $staticSiteParentPath
# 2>&1 used to combine stderr and stdout
wget.exe --recursive --convert-links -E --directory-prefix=$staticSite --no-host-directories $servedAt > $wgetLogPath 2>&1
write-host "lastExitCode: $($lastExitCode)"
cat $wgetLogPath
Pop-Location

write-host "Shut down jVUNDemo site"
stop-process -Name iisexpress

if (Test-Path $staticSitePath) {
    write-host "Contents of $($staticSitePath)"
    ls $staticSitePath
}

The above Powershell does the following:

  1. Starts up IIS Express serving jVUNDemo on http://localhost:57612/
  2. Waits 15 seconds for IIS Express to get itself together (probably a shorter wait time would be sufficient)
  3. Points Wget at jVUNDemo and bellows "go!!!!"
  4. Wget downloads and saves the static version of jVUNDemo to a directory called "static-site"
  5. Stops IIS Express
  6. Prints out the contents of the "static-site" directory

When run, it pumps something like this out:

jVUNDemo location: C:\projects\jquery-validation-unobtrusive-native\jVUNDemo
static site parent location: C:\projects
static site location: C:\projects\static-site
wget log path: C:\projects\wget.log
Spin up jVUNDemo site at http://localhost:57612/
Wait a moment for IIS to startup
Create static version of demo site here: C:\projects\static-site
wget.exe : --2014-12-29 07:49:56--  http://localhost:57612/
Resolving localhost... 127.0.0.1
Connecting to localhost|127.0.0.1|:57612... connected.
HTTP request sent, awaiting response... 200 OK
..... lots of HTTP requests.....
Downloaded: 30 files, 1.0M in 0.09s (10.8 MB/s)
Converting static-site/Demo/CreditCard.html... 34-0
Converting static-site/Demo/Number.html... 34-0
Converting static-site/Demo/Range.html... 34-0
Converting static-site/Demo/Email.html... 34-0
Converting static-site/AdvancedDemo/CustomValidation.html... 35-0
Converting static-site/Demo/Date.html... 36-0
Converting static-site/Home/License.html... 27-0
Converting static-site/index.html... 29-0
Converting static-site/AdvancedDemo/Dynamic.html... 35-0
Converting static-site/Demo/MaxLengthMinLength.html... 34-0
Converting static-site/Demo/Required.html... 34-0
Converting static-site/AdvancedDemo.html... 32-0
Converting static-site/Demo/Remote.html... 35-0
Converting static-site/Demo/EqualTo.html... 34-0
Converting static-site/AdvancedDemo/Globalize.html... 38-0
Converting static-site/Demo/Url.html... 34-0
Converting static-site/Demo.html... 37-0
Converting static-site/Home/GettingStarted.html... 29-0
Converting static-site/Home/Download.html... 27-0
Converting static-site/AdvancedDemo/Tooltip.html... 34-0
Converted 20 files in 0.03 seconds.
Shut down jVUNDemo site
Contents of C:\projects\static-site
    Directory: C:\projects\static-site

Mode                LastWriteTime     Length Name
----                -------------     ------ ----
d----        12/29/2014   7:50 AM            AdvancedDemo
d----        12/29/2014   7:50 AM            Content
d----        12/29/2014   7:50 AM            Demo
d----        12/29/2014   7:50 AM            Home
d----        12/29/2014   7:50 AM            Scripts
-a---        12/29/2014   7:50 AM       5967 AdvancedDemo.html
-a---        12/29/2014   7:50 AM       6802 Demo.html
-a---        12/29/2014   7:47 AM      12862 favicon.ico
-a---        12/29/2014   7:50 AM       8069 index.html
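One possible refinement (a sketch only; the script above doesn't do this): rather than sleeping for a fixed 15 seconds you could poll until IIS Express actually responds:

# poll the site until it responds, or give up after 30 seconds
$deadline = (Get-Date).AddSeconds(30)
$up = $false
while (-not $up -and (Get-Date) -lt $deadline) {
    try {
        Invoke-WebRequest $servedAt -UseBasicParsing | Out-Null
        $up = $true
    }
    catch {
        Start-Sleep -Seconds 1
    }
}
if (-not $up) { throw "jVUNDemo did not start responding at $servedAt" }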

And that's it for part 1 my friends! You now have a static version of the ASP.Net MVC site to dazzle the world with. I should say for the purposes of full disclosure that there are 2 pages in the site which are not entirely "static" friendly. For these 2 pages I've put messages in that are displayed when the page is served up in a static format explaining the limitations. Their full glory can still be experienced by cloning the project and running locally.

Next time we'll take the mechanism detailed above and plug it into AppVeyor for some Continuous Integration happiness.