“Life’s under no obligation to give us what we expect.”
– Margaret Mitchell
Hello everyone and happy 2021! Wait, what? It’s FEBRUARY!?! Oops.
You may well have noticed I haven't been around for a couple of months. That's because I've been struggling. A lot. We've had some home worries, and with the state of the world playing on our minds constantly, it has never felt right to sit down and blog anything other than some of the recipes I'm working on (purely to get out of my own head!). This week, though, I was asked about rollbacks when working in the migrations-based approach with SQL Change Automation, and I couldn't resist writing something.
Before I get into this though, I want to be very clear about my preferences. I don't believe database changes SHOULD ever be rolled back. Always forward. If for any reason you need to undo a change, whether for a disaster or just because you want to, and other code is being rolled back, then this should be a task: create a rollback branch, make the respective changes in dev, and then use your fabulous, flexible and repeatable pipeline to merge, build, check, test and deploy them upstream to Prod. This rolls the database forwards, and does so in a way that is controlled, auditable and prevents data loss.

However, there MAY be some scenarios where a rollback is unavoidable: for example, if you are required to do so immediately, or you're not sure which of the 3 changes that were just deployed actually caused something to go bang, so you want to blanket-undo until you can investigate properly.
With this in mind, there are a couple of ways of rolling back in the migrations approach that can absolutely be part of your pipeline, one or two of which I want to try and flesh out in this post. This post would of course be much shorter if we were working with the state-based approach, as we would simply need to compare a previous artifact with our target and generate a down script, but as we know, the state-based method comes with its own challenges.
- Generate down scripts up front – an option that is described fairly well in the SQL Change Automation documentation, and may be preferable to some. However, it is limited to Visual Studio, definitely involves some additional work for developers, and potentially generates quite a lot of scripts that may have to be rationalized.
- Ad-hoc rollback with SQL Compare – again, this is described in the documentation, but the changes will then have to be recaptured in the development pipeline in some way. Plus, this method is ideal for programmable objects, but not so much for table changes.
But I think we're onto something with a combination of these 2 approaches. Is there a way to use SQL Compare and SQL Data Compare alongside SQL Change Automation to automatically generate down scripts as artifacts before each Production deployment? This is what I want to investigate.
Set Up
As usual, I set up an Azure DevOps repo (this time called RollBackPipe) and created a set of local databases that I've been using to record videos lately, so I now have the MaskMePlease database in Dev, Test, Staging and Production flavors. I'm doing this locally on my machine with an Azure DevOps agent I had hanging around anyway – it saved me firing up a bunch of Azure SQL DBs – and I just used SQL Clone.


Implementation
I got the pipeline running straight through. I used YAML for this pipeline as I didn't want to "faff" around with the Release stages etc., but this could also have been achieved with Classic Pipelines, of course.

Ignore the terrible stage names – essentially I was doing 3 things with SQL Change Automation:
- Database Build: Producing a Nuget Package to represent the successfully building files stored in my Git repo, to consume later in the pipeline
- Create Release Artifact: This stage looks at the target environments (Test, STG and Production) and creates a number of reports in my local DatabaseDeploymentResources folder for me to review
- Deploy From Release Artifact: Take the patch script that has been produced and run this against the target DB
So it was time to try and build in the SQL Compare command lines. Note that if I was doing this on an Azure Pipelines agent I might need to run this in a container or programmatically install the command lines instead, but fortunately they're already installed on my machine, so I could just add a command line stage.
There were actually a couple of ways to do this: I could pass in the details of the 2 databases completely via the command line (full documentation here), I could generate an XML config file to pass in and store that in version control, or I could pre-construct a SQL Compare / SQL Data Compare "project" and just pass that in. The latter seemed easier, so of course I did that!
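For completeness, the fully-explicit variant (no saved project) would look something like this. This is a hypothetical sketch based on the standard SQL Compare command line switches (/Server1, /Database1, /Server2, /Database2), using my local server and database names; the direction (Production as source, Staging as target) is what makes the output a down script rather than an up script:

```yaml
# Hypothetical sketch: passing both databases explicitly instead of a saved .scp project.
# Comparing Production (source) against Staging (target) generates a script that,
# run against the newly deployed database, reverts it to the previous state.
- task: CmdLine@2
  inputs:
    script: '"C:\Program Files (x86)\Red Gate\SQL Compare 14\sqlcompare.exe" /Server1:".\TOOLS" /Database1:"MaskMePlease_Production" /Server2:".\TOOLS" /Database2:"MaskMePlease_STG" /scriptfile:"C:\DatabaseDeploymentResources\MaskMePlease\%BuildId%\SchemaDowngradeScript.sql" /include:Identical'
  env:
    BuildId: $(Build.BuildId)
```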
I opened up SQL Compare and did a Database to Database comparison for Staging and Production, but in the reverse direction to ensure we create a down script and not an up script!

I then saved this as a project, and then, using the red SQL Compare logo in the top left, I was able to launch SQL Data Compare with the same project, so it loads the settings for you. The reason for doing this is that if we undo anything that was created as part of a migration and then try to redeploy it, SQL Change Automation won't actually recognize those objects as missing: the migration log table tracks all the migrations, and it will still say they have been deployed. So we also need to compare the migration log tables, to re-sync back to the previous state from before we ran the migrations.
However, I DID make sure that before comparing I included only the __MigrationLog table in the comparison, in Tables & Mappings:

I then had to change the Comparison Key being used for the migration log table to [Custom], because the default comparison key includes the date/time executed, so this is what I went for:

I saved this project down with the Compare script:

and then following the rather helpful documentation on the command lines, I was able to make the following calls as part of my pipeline (YAML included):
- task: CmdLine@2
  inputs:
    script: '"C:\Program Files (x86)\Red Gate\SQL Compare 14\sqlcompare.exe" /project:"C:\DatabaseDeploymentResources\MaskMePlease\_ComparisonResources\ProdToStagingDownScript.scp" /scriptfile:"C:\DatabaseDeploymentResources\MaskMePlease\%BuildId%\SchemaDowngradeScript.sql" /include:Identical'
  env:
    BuildId: $(Build.BuildId)
- task: CmdLine@2
  inputs:
    script: '"C:\Program Files (x86)\Red Gate\SQL Data Compare 14\sqldatacompare.exe" /project:"C:\DatabaseDeploymentResources\MaskMePlease\_ComparisonResources\ProdToStagingDataDownScript.sdc" /scriptfile:"C:\DatabaseDeploymentResources\MaskMePlease\%BuildId%\DataDowngradeScript.sql" /include:Identical'
  env:
    BuildId: $(Build.BuildId)
and sure enough, having this run after the deployment to Staging but before the deployment to Production, I was able to get the reports produced into the artifacts folder with the relevant changes documented:

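Since these scripts only land in a local folder on the agent, one optional refinement (not something I did in this pipeline) would be to also publish them as proper pipeline artifacts with the standard PublishBuildArtifacts@1 task, so they are attached to the run itself:

```yaml
# Hypothetical addition: attach the generated down scripts to the pipeline run
# so they survive even if the agent's local folder is cleaned up.
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: 'C:\DatabaseDeploymentResources\MaskMePlease\$(Build.BuildId)'
    ArtifactName: 'DownScripts'
```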
So it was time to try it out! I created a BadTable in Dev, generated the migration for it, and committed it to source control:


This then triggered the pipeline to run:

and produced the correct reports and .sql scripts:

Then I ran these scripts to roll back Production (of course I could have modified them to preserve any data I needed to retain – this actually works best with programmable objects like sprocs). Then I re-ran the deployment pipeline, and the BadTable lives once again!

So does this work with multiple changes too? I created a stored procedure called “deletemeplease” and modified our BadTable to add a “randomthing” column, to see if we could roll those back:


Production after deployment:


Production after rollback:


All I have to do to roll forwards again now is run my pipeline:

and they’re back:


Conclusions
Is it perfectly possible to have a version specific script output as an artifact as part of your Azure DevOps pipelines? Yes it absolutely is.
Is it perfectly possible to rollback changes that you’ve made when taking a Migrations-First approach to database development? Yes it absolutely is.
Would I recommend this approach? No. Absolutely not.
Just because this is possible doesn't mean it is good process. I've been asked about it enough times that I was more curious than anything – however, after actually IMPLEMENTING this process, it is clear that there are many aspects of it that can fall down. It makes it clearer now, more than ever, that the best approach is either to roll forwards from Dev, through the pipeline, in an automated, auditable and controllable way, or, if we just need to roll back temporarily because other code is being rolled back, to use SQL Compare ad-hoc. This pipeline was very simple, with very few changes, and when I made mistakes I could rectify them easily myself – but with a larger dev team, under pressure to fix things, and with many changes to consider, this approach can very quickly fall apart.
Long story short, (I believe) we should follow and trust the documentation.
YAML from Azure DevOps:
trigger:
- main
pool: 'default'
steps:
- task: RedgateSqlChangeAutomationBuild@4
  inputs:
    operation: 'Build SQL Change Automation Project'
    sqlProj: 'RollBackPipe\RollBackPipe.sqlproj'
    packageName: 'MaskMePlease.Database'
    tempServerTypeBuild: 'localDB'
    nugetPackageVersionSelector: 'Specific'
    nugetPackageVersion: '1.0'
    nugetPackageVersionUseBuildId: true
    requiredVersionOfDlma: 'latest'
- task: RedgateSqlChangeAutomationRelease@4
  inputs:
    Operation: 'Create'
    NuGetFile: 'MaskMePlease.Database.1.0.$(Build.BuildID).nupkg'
    ExportPath: 'C:\DatabaseDeploymentResources\MaskMePlease\$(Build.BuildID)\Test'
    AppendEnvironment: false
    TargetDatabaseConnectionType: 'explicit'
    TargetDatabaseServer: '.\TOOLS'
    TargetDatabaseName: 'MaskMePlease_Test'
    TargetAuthMethod: 'windowsAuth'
    TransactionIsolationLevel: 'serializable'
    AbortOnWarning: 'none'
    requiredVersionOfDlma: 'latest'
- task: RedgateSqlChangeAutomationRelease@4
  inputs:
    Operation: 'DeployFromResources'
    ImportPath: 'C:\DatabaseDeploymentResources\MaskMePlease\$(Build.BuildID)\Test'
    AppendEnvironment: false
    TargetDatabaseConnectionType: 'explicit'
    TargetDatabaseServer: '.\TOOLS'
    TargetDatabaseName: 'MaskMePlease_Test'
    TargetAuthMethod: 'windowsAuth'
    requiredVersionOfDlma: 'latest'
- task: RedgateSqlChangeAutomationRelease@4
  inputs:
    Operation: 'Create'
    NuGetFile: 'MaskMePlease.Database.1.0.$(Build.BuildID).nupkg'
    ExportPath: 'C:\DatabaseDeploymentResources\MaskMePlease\$(Build.BuildID)\Staging'
    AppendEnvironment: false
    TargetDatabaseConnectionType: 'explicit'
    TargetDatabaseServer: '.\TOOLS'
    TargetDatabaseName: 'MaskMePlease_STG'
    TargetAuthMethod: 'windowsAuth'
    TransactionIsolationLevel: 'serializable'
    AbortOnWarning: 'none'
    requiredVersionOfDlma: 'latest'
- task: RedgateSqlChangeAutomationRelease@4
  inputs:
    Operation: 'DeployFromResources'
    ImportPath: 'C:\DatabaseDeploymentResources\MaskMePlease\$(Build.BuildID)\Staging'
    AppendEnvironment: false
    TargetDatabaseConnectionType: 'explicit'
    TargetDatabaseServer: '.\TOOLS'
    TargetDatabaseName: 'MaskMePlease_STG'
    TargetAuthMethod: 'windowsAuth'
    requiredVersionOfDlma: 'latest'
- task: CmdLine@2
  inputs:
    script: '"C:\Program Files (x86)\Red Gate\SQL Compare 14\sqlcompare.exe" /project:"C:\DatabaseDeploymentResources\MaskMePlease\_ComparisonResources\ProdToStagingDownScript.scp" /scriptfile:"C:\DatabaseDeploymentResources\MaskMePlease\%BuildId%\SchemaDowngradeScript.sql" /include:Identical'
  env:
    BuildId: $(Build.BuildId)
- task: CmdLine@2
  inputs:
    script: '"C:\Program Files (x86)\Red Gate\SQL Data Compare 14\sqldatacompare.exe" /project:"C:\DatabaseDeploymentResources\MaskMePlease\_ComparisonResources\ProdToStagingDataDownScript.sdc" /scriptfile:"C:\DatabaseDeploymentResources\MaskMePlease\%BuildId%\DataDowngradeScript.sql" /include:Identical'
  env:
    BuildId: $(Build.BuildId)
- task: RedgateSqlChangeAutomationRelease@4
  inputs:
    Operation: 'Create'
    NuGetFile: 'MaskMePlease.Database.1.0.$(Build.BuildID).nupkg'
    ExportPath: 'C:\DatabaseDeploymentResources\MaskMePlease\$(Build.BuildID)\Production'
    AppendEnvironment: false
    TargetDatabaseConnectionType: 'explicit'
    TargetDatabaseServer: '.\TOOLS'
    TargetDatabaseName: 'MaskMePlease_Production'
    TargetAuthMethod: 'windowsAuth'
    TransactionIsolationLevel: 'serializable'
    AbortOnWarning: 'none'
    requiredVersionOfDlma: 'latest'
- task: RedgateSqlChangeAutomationRelease@4
  inputs:
    Operation: 'DeployFromResources'
    ImportPath: 'C:\DatabaseDeploymentResources\MaskMePlease\$(Build.BuildID)\Production'
    AppendEnvironment: false
    TargetDatabaseConnectionType: 'explicit'
    TargetDatabaseServer: '.\TOOLS'
    TargetDatabaseName: 'MaskMePlease_Production'
    TargetAuthMethod: 'windowsAuth'
    requiredVersionOfDlma: 'latest'