Automating Database Promotion With TeamCity, Azure & SQL

Mitchell Posada

This post was previously on the Pathfinder Software site. Pathfinder Software changed its name to Orthogonal in 2016.

A current medical software project here at Orthogonal involves a lot of Microsoft SQL database development. There was already a good automation process in place for pushing our code out to different environments using TeamCity, our Continuous Integration (CI) server. But we were manually moving new database files into our Quality Assurance (QA) and User Acceptance Testing (UAT) environments when there was a database schema change. We quickly saw an opportunity for automation, and sought out the means to script database promotion from one environment to another. Our CI, QA and UAT environments are hosted on Windows Azure, so we immediately turned to the Azure PowerShell tools for our solution. However, the information we needed to accomplish our task was spread out around the web. Our goal is to consolidate this information in one place for others who wish to accomplish the same task.

Things we needed to do

  • Make sure SQL Server version is 2012 SP1 CU4 or higher (build 11.00.3368)

  • Set up the Azure PowerShell tools in all environments (local and remote)

  • Add a storage container to our storage account on Azure

  • Set up remote PowerShell access on each server

  • Add remote certificates to trusted store on appropriate machines

  • Create a SQL credential on all SQL Server instances

  • Create scripts to perform the backup and restore operations for each server

  • Host the backup and restore scripts on the CI server

  • Create build steps in TeamCity to run each script

SQL Versions and PowerShell Secrets

First things first, the PowerShell cmdlets we need live in two places. The cmdlets that back up and restore databases are found in the SQL Server PowerShell (SQLPS) utility of SQL Server 2012 SP1 CU4 or higher. Check your SQL Server version to make sure you have the tools you need (run SELECT @@Version in SQL Management Studio to see your version number). If your version number is lower than 11.00.3368, you will need to update SQL Server to use this guide.

Second, we need to install the Azure PowerShell tools and map them to our subscription. We used the certificate method described in this guide to set up our PowerShell tools (there’s no need for us to reiterate the steps here).

Containing the problem

There are two ways you can add a container to your storage account on Azure.

If you prefer to use an online tool, simply log into the Azure Management Console and click the link on the left for Storage. If you do not already have a storage account, create a new one. Inside the storage account, click the Containers link at the top of the page. Use the “+ Add” button at the bottom of the page to create a new container.
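
If you prefer scripting, the classic Azure PowerShell cmdlets can create the container instead. Here is a minimal sketch, assuming your PowerShell tools are already mapped to your subscription (the names in angle brackets are placeholders):

# point the session at the right subscription and storage account
Set-AzureSubscription -SubscriptionName "<subscription name>" -CurrentStorageAccount "<storage account name>"
# create a private container to hold the database backups
New-AzureStorageContainer -Name "<storage container name>" -Permission Off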

If you’re having auth problems I feel bad for you son I got 99 problems and credentials ain’t one

Creating a SQL credential allows our PowerShell scripts to perform backup and restore operations.

Follow the steps in this article to create a SQL credential on each SQL Server instance. We modified our scripts so that we could run them on multiple machines without having to provide parameters. Ours looked like this (with critical details removed, of course):

Import-Module sqlps

# create variables
$storageAccount = "mystorageaccount"
$storageKey = "<storageaccesskeyvalue>"
$secureString = ConvertTo-SecureString $storageKey -AsPlainText -Force
$credentialName = "mybackuptoURL"
$computerName = $env:COMPUTERNAME

# cd to the computer level
cd sqlserver:\sql\$computerName
# get the list of instances
$instances = Get-ChildItem
# pipe the instances to the New-SqlCredential cmdlet to create the SQL credential on each one
$instances | New-SqlCredential -Name $credentialName -Identity $storageAccount -Secret $secureString
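
To confirm the credential landed where you expect, you can read it back from an instance. A quick sanity check, assuming the default instance:

# verify the credential exists on the default instance
cd sqlserver:\sql\$computerName\DEFAULT
Get-SqlCredential -Name $credentialName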

Once the credentials are in place, we can create PowerShell scripts for backing up and restoring our database. Again, we modified our scripts from the examples provided by Microsoft. It is important to note your SQL Credential name, as it is needed in the next steps.

Our backup script looks like this (placeholder values are indicated by angle brackets):

Import-Module sqlps

# create variables
$storageAccount = "<storage account name>"
$blobContainer = "<storage container name>"
$backupUrlContainer = "https://$storageAccount.blob.core.windows.net/$blobContainer/"
$credentialName = "<SQL credential name>"
$computerName = $env:COMPUTERNAME
$backupFile = $backupUrlContainer + "<file name>.bak"

Set-AzureSubscription -SubscriptionName "<subscription name>" -CurrentStorageAccount $storageAccount

# remove the previous backup of the same name from the container
Remove-AzureStorageBlob -Container $blobContainer -Blob "<file name>.bak"

# cd to the databases level of the default instance
cd sqlserver:\sql\$computerName\DEFAULT\databases

# back up the database to the storage container
Backup-SqlDatabase -Database "<database name>" -BackupFile $backupFile -SqlCredential $credentialName -CompressionOption On

One difference you will notice in our script is that we provide an explicit file name for the backup, whereas the Microsoft scripts set the file name dynamically based on the date and time. We chose to do this so that our CI server can pick up the appropriate database backup for a given environment from our Azure storage account without having to figure out which file is most current. Because the file name is fixed, the script first deletes any existing blob of the same name from our cloud storage container, which also saves us from cleaning up old backups manually. There might be a cleaner solution, but this method suits our needs just fine.

Here is our restore script (more angle bracket placeholder values here):

Import-Module sqlps

# create variables
$storageAccount = "<storage account name>"
$blobContainer = "<storage container name>"
$backupUrlContainer = "https://$storageAccount.blob.core.windows.net/$blobContainer/"
$credentialName = "<SQL credential name>"
$computerName = $env:COMPUTERNAME

# for the default instance, $srvPath resolves to "SQLSERVER:\SQL\<computer name>\DEFAULT\"
$srvPath = "SQLSERVER:\SQL\$computerName\DEFAULT\"

# navigate to the SQL Server instance
cd $srvPath

$backupFile = $backupUrlContainer + "<backup file name>.bak"

# restore the database, replacing the existing copy
Restore-SqlDatabase -Database "<database name>" -SqlCredential $credentialName -BackupFile $backupFile -Replace

The biggest change we made to this script was adding the "-Replace" flag to the Restore-SqlDatabase command. We use the Replace flag because there are occasional changes to our ETL (Extract, Transform, and Load) process that would otherwise cause restores to fail.

Assuming you have already created your SQL credential, you can run these scripts to get a taste of what follows. We were amazed at how quickly Azure stores files in the cloud. Having waited hours for files to transfer on other platforms, we were surprised to see backups and restores typically complete in a minute or less, even for large database files (~500 MB).
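
If you want to verify that a backup actually landed in the container, listing the blobs is a quick check (this assumes the same subscription and storage account context as the backup script):

# list the blobs currently in the backup container
Get-AzureStorageBlob -Container "<storage container name>"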

Setting up Remote PowerShell Access on Azure (We have no witty title for this)

There were some quick steps, and some involved steps, in getting our backup and restore scripts working. This was one of the more complicated ones.

If you don’t already have one, you will need a way to create an SSL certificate. We used makecert from the Windows SDK to create a self-signed cert. We’re running Windows 8 and Windows Server 2012; we used this version of the SDK.

After digging around the depths of MSDN, StackOverflow, and some various Microsoft blogs, we found this guide on the Azure forums to be the key to setting up our remote access needs.

Steps two and three are critical if you are using a self-signed certificate; if you already have an SSL cert, you can skip them. All you need to do is import the existing cert onto any machine that will initiate PowerShell remote sessions.

Following these steps was the key to enabling PowerShell remote sessions, which allow the scripts hosted on our CI server to run on the development, QA, and UAT servers. Don't forget that this process exposes port 5986 (the default WinRM HTTPS port) on your servers, so update your firewall policies accordingly if you have IP restrictions in place.
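
The heart of that setup is creating an HTTPS WinRM listener bound to your certificate. It typically comes down to a command like this, run in an elevated command prompt on each server (the host name and thumbprint are placeholders):

winrm create winrm/config/Listener?Address=*+Transport=HTTPS @{Hostname="<yourservice>.cloudapp.net";CertificateThumbprint="<certificate thumbprint>"}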

Note: Unless you are going to sign your PowerShell scripts, you will need to change your execution policy to allow unsigned remote scripts. Microsoft outlines the execution policies here.
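
On each machine that will run the scripts, that change is a one-liner from an elevated PowerShell prompt (RemoteSigned is shown here as one reasonable choice; pick whichever policy fits your security requirements):

# allow locally stored, unsigned scripts to run
Set-ExecutionPolicy RemoteSigned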

We built this (Team)City on scripts and code.

Okay, we’re nearing the end now. To run our scripts from TeamCity as build steps, the scripts need to be stored on our CI server. We put generic versions of the scripts into our git repository for version control; customized versions of the scripts (one per environment) are stored on the server. The coup de grâce is a script that allows TeamCity to open a PowerShell remote session and execute local scripts on another server. This is a little dirty, because it holds onto the password used to access the remote server. (This is part of the reason why customized scripts are stored only on the CI server.)

Here is the script we use to accomplish this task:

# read the remote credentials (stored only on the CI server)
$password = ConvertTo-SecureString '<password>' -AsPlainText -Force
$appCred = New-Object System.Management.Automation.PSCredential('<domain\username>', $password)

# open an SSL-secured remote session and run the local script on the remote server
$rs = New-PSSession -ComputerName '<machine address>' -Credential $appCred -UseSSL
Invoke-Command -Session $rs -FilePath '<script to run>'

# close the remote session when finished
Remove-PSSession $rs

Once you create the remote access scripts and map them to your backup / restore scripts appropriately, you can create TeamCity build configurations to run each script.

Remember: all of the PowerShell scripts you want to run will need to be stored on the CI server. The PowerShell script above runs a script from the local machine on the remote server.
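
A build step itself can be as simple as a command-line runner that invokes the wrapper for the target environment. Something along these lines, with a hypothetical script path:

powershell.exe -NoProfile -ExecutionPolicy RemoteSigned -File "C:\BuildScripts\promote-db-qa.ps1"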

Once you set up build configurations for your backup and restore scripts, you are ready to rock! Just run the desired build steps to backup and restore your databases.

Bonus: What to do if something goes wrong during the backup and locks the lease on your blob? 

Here is a TechNet article on how to break the leases on your blobs, should they get stuck.
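
If you prefer to script that recovery too, breaking the lease boils down to a method call on the blob object. A rough sketch using the classic storage cmdlets (treat this as a starting point, not gospel):

# fetch the stuck blob and break its lease
# (passing $null breaks an infinite lease immediately)
$blob = Get-AzureStorageBlob -Container "<storage container name>" -Blob "<file name>.bak"
$blob.ICloudBlob.BreakLease($null)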

