Building a Boardgame in Azure - Part 4 - Pipelines

One important factor in agile software development is the OPS part of DevOps. As a developer, you probably know how to write code. But do you know how to do OPS? What is this OPS thing? OK, obviously OPS is an abbreviation of Operations, but still… what does ‘operations’ actually mean?

Ages ago, we used to write software and create some sort of installer for it. Then we would document the system for weeks and write an installation procedure. Everything needed to be in place before you could hand the CD-ROM (or disk) over to one of the system/network administrators to have them install the system. Sometimes that process went well…

Now the OPS part of DevOps bridges the gap between the system administrator and the developer. Or, even better, it makes the system administrator part of the development team. This means that the team writing the software is now also responsible for the infrastructure the system runs on.

This all sounds really scary, but when you’re working in cloud environments it’s actually a relief (for every party involved).

The idea is to automate the entire process, from the moment the developer has finished coding all the way to deployment to the production environment. To do this in Azure DevOps, you use pipelines.

What is a pipeline?

Basically, a pipeline is a mechanism that lets you automate work based on a time schedule or, for example, a change in the codebase. The mechanism that starts a pipeline is called a trigger. There are many sophisticated ways to configure a trigger, but the most common one is a trigger on the main branch. This means that the pipeline fires as soon as the codebase in the main branch changes.
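
As a minimal sketch, a YAML pipeline that triggers on the main branch could look like this (the branch name, agent image and step are just placeholders):

# Run this pipeline whenever the main branch changes
trigger:
  branches:
    include:
      - main

pool:
  vmImage: "windows-latest"

steps:
  - script: echo Triggered by a change on main
    displayName: "Example step"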

It’s good practice to have branch policies in place on the main branch, so that no developer is allowed to push changes directly to it but has to use a pull request instead. This way you can enforce the four-eyes principle and make sure that at least one other developer has seen the code and agrees with the solution.

Pipeline capabilities

It’s impossible to cover all the things you can do from within a pipeline: build and compile code, run PowerShell scripts, do almost anything in Azure, communicate with external systems… you name it. But for this game, the pipeline looks something like this:

  • Compile the code
  • Run unit tests on the code
  • Publish the build artifact
  • Test infrastructure definition
  • Publish infrastructure definition
  • Provision test environment (incremental)
  • Apply database schema changes on the test environment
  • Deploy to test environment
  • Provision test environment (complete)
  • Provision prod environment (incremental)
  • Apply database schema changes on prod environment
  • Deploy to prod environment
  • Provision prod environment (complete)

In my post about ARM Templates I explain how to chain all these tasks into a multi-stage pipeline. The list above may not be entirely accurate, because the game may still require additional steps. To be honest, the prod environment doesn’t even exist at the time of writing this article.

A multi-stage pipeline means that the pipeline allows you to run multiple stages (yes, it’s in the name). So in this case, I have a Build stage, a deploy to test stage, and a deploy to prod stage. Each stage contains one or more jobs, and each job contains one or more steps.

You can organize your pipeline into jobs. Every pipeline has at least one job. A job is a series of steps that run sequentially as a unit. In other words, a job is the smallest unit of work that can be scheduled to run.

Steps are the actual tasks to perform within a pipeline: for example, uploading files, running a script, and so forth.
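
To make the stage/job/step hierarchy a bit more concrete, here is a stripped-down skeleton of a multi-stage pipeline. The stage and job names are illustrative only, not the exact ones used for the game:

pool:
  vmImage: "windows-latest"

stages:
  - stage: build
    displayName: Build
    jobs:
      - job: build_api
        steps:
          - script: echo Compile, test and publish artifacts here

  - stage: deploy_test
    displayName: Deploy to test
    dependsOn: build
    jobs:
      - deployment: deploy_webapi_test
        environment: "Test-Environment"
        strategy:
          runOnce:
            deploy:
              steps:
                - script: echo Provision infrastructure and deploy to test here

  - stage: deploy_prod
    displayName: Deploy to prod
    dependsOn: deploy_test
    jobs:
      - deployment: deploy_webapi_prod
        environment: "Prod-Environment"
        strategy:
          runOnce:
            deploy:
              steps:
                - script: echo Provision infrastructure and deploy to prod here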

Compiling your code

Most often, one of the first jobs in a pipeline is to compile the system. A build delivers an artifact. For a web API, for example, this is the set of DLL files and everything else required on a production machine to host the system. In order to use the artifact in later stages of the pipeline, the artifact must be published. In the case of my awesome game, I need to publish three artifacts:

  • The API Build artifact
  • A migration/seed project
  • ARM Templates

The first one is obvious: it’s the system that I want to deploy. The second one is a command-line tool that contains my database migration strategy and allows me to migrate my SQL Server database to a newer version and to seed data. More on this in a future post. And last, the ARM Templates. These are basically JSON representations of my desired infrastructure. I’m going to need these files in later stages of the pipeline to provision the infrastructure in Azure, just before I deploy the system itself.

Because I’m using linked templates in my ARM Template, I need to upload the linked templates just prior to template validation. Validation is an early check I built into the pipeline to catch obvious mistakes, and it runs before the ARM Templates pipeline artifact is published.
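
As a rough sketch, assuming an Azure file copy plus a validation-only deployment (the storage account, container, paths, variable names and resource group below are placeholders, and the exact task versions and inputs may differ from the ones in my real pipeline), these two steps could look like this:

# Upload the linked templates to a storage account so the main template can reference them
- task: AzureFileCopy@4
  displayName: "Upload linked ARM templates"
  inputs:
    SourcePath: "deployment/linked-templates/*"
    azureSubscription: "name-of-azure-service-connection"
    Destination: "AzureBlob"
    storage: "yourstorageaccount"
    ContainerName: "arm-templates"
    # These output variables can be passed on to the main template as parameters
    outputStorageUri: "templateBaseUrl"
    outputStorageContainerSasToken: "templateSasToken"

# Validate the main template early to catch obvious mistakes before anything is deployed
- task: AzureResourceManagerTemplateDeployment@3
  displayName: "Validate ARM template"
  inputs:
    deploymentScope: "Resource Group"
    azureResourceManagerConnection: "name-of-azure-service-connection"
    subscriptionId: "$(SubscriptionId)"
    resourceGroupName: "your-test-resource-group"
    location: "West Europe"
    csmFile: "deployment/azuredeploy.json"
    deploymentMode: "Validation"

The build job itself, which restores the NuGet packages, publishes the web API and publishes the resulting pipeline artifact, looks like this: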

stages:
  - stage: build
    displayName: Build
    jobs:
      - job: build_api
        displayName: Build the web API
        pool:
          name: "Azure Pipelines"
          vmImage: "windows-latest"
        steps:
          - task: DotNetCoreCLI@2
            displayName: "Restore Packages"
            inputs:
              command: "restore"
              projects: "**/*.csproj"
              feedsToUse: "select"

          - task: DotNetCoreCLI@2
            displayName: Publish
            inputs:
              command: publish
              publishWebProjects: True
              arguments: "--configuration $(BuildConfiguration) --output $(build.artifactstagingdirectory)/webapi-package"
              zipAfterPublish: True

          - task: PublishPipelineArtifact@0
            displayName: "Publish Artifact: webapi-package"
            inputs:
              artifactName: "webapi-package"
              targetPath: "$(Build.ArtifactStagingDirectory)/webapi-package"
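
The list earlier in this post also mentions running unit tests; that step is not shown above, but it would typically sit between the restore and publish tasks. A minimal sketch, assuming the test projects follow a *Tests.csproj naming convention:

# Run all unit test projects and fail the build when a test fails
- task: DotNetCoreCLI@2
  displayName: "Run unit tests"
  inputs:
    command: "test"
    projects: "**/*Tests.csproj"
    arguments: "--configuration $(BuildConfiguration)"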

Deploying the system

I already mentioned ARM Templates a couple of times. ARM Templates are basically JSON representations of your desired infrastructure. You present these templates to the Azure Resource Manager and it figures out which resources are already in place and what it needs to do to reach your desired state. Again, I blogged about this in detail right here. There are a couple of advantages to having infrastructure as code. The most important is that the definition of your infra lives right next to your code, in your source code repository. This allows you to apply version control to your infra, which is really awesome. There are several systems that let you write infrastructure as code, like Pulumi, Terraform and the Azure-native ARM Templates. Because I already blogged about this in detail, I’m skipping it for now.

This leaves me with the final step to actually deploy the system to an Azure Web App in the cloud:

- deployment: deploy_webapi
  displayName: "Deploy Web API"
  environment: "Test-Environment"
  dependsOn:
    - deploy_arm_templates_incremental
  pool:
    name: "Azure Pipelines"
    vmImage: "windows-2019"
  strategy:
    runOnce:
      deploy:
        steps:
          - task: DownloadPipelineArtifact@0
            displayName: "Download Artifact: webapi-package"
            inputs:
              artifactName: "webapi-package"
              targetPath: $(System.DefaultWorkingDirectory)/webapi-package

          - task: AzureRmWebAppDeployment@4
            displayName: "API App Deploy: Core API"
            inputs:
              azureSubscription: "name-of-azure-service-connection"
              appType: web
              webAppName: "your-azure-web-app-name"
              package: "$(System.DefaultWorkingDirectory)/webapi-package/*.zip"

Conclusion

I think it’s really important to get familiar with pipelines and get them under control, allowing you to deploy your system quickly and automatically. In these modern times, it’s important to get customer feedback as soon as possible so you can adjust and refine the software system you’re developing. Using pipelines dramatically shortens the feedback loop and takes away a lot of the pain of repetitive manual update procedures.