“There is always a way to go if you look for it.”
– Ernest A. Fitzgerald
As many of you know, I really enjoy talking about Data Masking. I fundamentally believe it is an absolutely ESSENTIAL part of Test Data Management, and specifically of provisioning Pre-Production environments. If you hold sensitive PII/PHI/PCI in your Production environments, you have no excuse for porting any of it back into Dev and Test.
I also stand firmly behind the belief (and it is just that, my belief) that masking is much safer than anything you can achieve with encryption or limiting access alone. Static masking, when done correctly, means that even if all other security measures are bypassed, or we accidentally expose data somehow, it doesn't matter: the PII/PHI has already been wiped, yet the data remains fundamentally useful for development and testing.
I’m also a huge proponent of using classification and data masking as part of a DevOps process, and often you’ll find me using Azure DevOps to actually kick off my masking process – but one of the most frequent questions I get is “do I have to install Data Masker somewhere?” and the answer I always have to give is… yes.
When we use SQL Provision to spin up our environments in non-Prod, generally we know they have to be in IaaS VMs or On-Prem, due to the nature of the technology (keep your eyes open on THIS, because big changes are coming soon *squeals in excitement*). But sometimes we're working entirely with PaaS DBs and VMs don't even come into our vocabulary – yet we still need the data masked for non-Prod use.
I have some data in an Azure SQL DB – a copy of DMDatabase (get it from my GitHub):
and I would like to get this masked before I create copies of it. I have written scripts in the past to make sure that Azure SQL DBs can be masked and then copied into non-Prod environments (you can get those scripts here), and you could easily wrap something like the below INTO a script like that. But much of the work I've done on this has always involved having Data Masker installed somewhere and invoked on that machine – a VM, my laptop, whatever.
So, how do we avoid having to install Data Masker each time?
Data Masker for SQL Server does not (at the time of writing) have a publicly available Docker container or a method for installing via Chocolatey or similar – once it does, trust me, I'll be blogging about it A LOT, as I will be very excited. But there is an install file available at https://download.red-gate.com/installers/DataMaskerforSQLServer/ and this might be enough.
Initially I created a Masking Set locally for the DMDatabase copy in Azure; it was nice and simple, just masking a few of the fields on the CUSTOMER table:
It relies on SQL auth for the connection, with the credentials remembered in the masking set, though these could be subbed in later using the PARFILE (documentation on that here).
Next I put this masking set into a newly created Azure DevOps Git repo, which I cloned down onto my local machine – and then committed and pushed my changes up into Azure DevOps:
Now that this was all in ADO, it was time to set up a pipeline for it – so let's jump into some YAML! Data Masker currently needs to run on a Windows machine, so we'll set the pool to windows-latest:
trigger:
- main

pool:
  vmImage: windows-latest
The next step is to grab the installer, which I know I can easily do with PowerShell. I'm sure you could be cleverer about this, but with limited time I chose the most recent version and hard-coded it into a PowerShell Invoke-WebRequest cmdlet:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $source = 'https://download.red-gate.com/installers/DataMaskerforSQLServer/2021-03-15/DataMaskerforSQLServer.exe'
      New-Item -ItemType directory -Path C:\download
      $destination = 'C:\download\DataMaskerforSQLServer.exe'
      Invoke-WebRequest -Uri $source -OutFile $destination
      New-Item -ItemType directory -Path C:\download\DMlogs
  displayName: 'Download Data Masker'
This PowerShell task grabs the most recent exe for Data Masker, pulls it down into C:\download on the hosted VM we're using for the pipeline, and also creates a directory for the Data Masker logs (if you want to extend the YAML at the end to wrap these logs up and publish them as a result of the pipeline masking, go for it – and tweet me to let me know!)
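If you wanted to futureproof the download rather than hard-coding a dated folder, something like the below might work – this is a rough sketch I haven't hardened, and it assumes the Redgate installers page stays a simple directory listing with dated folder names like 2021-03-15/:

```yaml
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Sketch: assumes the installers page is a plain directory listing
      # of dated folders like '2021-03-15/'; sort descending for the newest.
      $base = 'https://download.red-gate.com/installers/DataMaskerforSQLServer/'
      $listing = Invoke-WebRequest -Uri $base -UseBasicParsing
      $latest = $listing.Links.href |
        Where-Object { $_ -match '^\d{4}-\d{2}-\d{2}/$' } |
        Sort-Object -Descending | Select-Object -First 1
      New-Item -ItemType directory -Path C:\download
      Invoke-WebRequest -Uri ($base + $latest + 'DataMaskerforSQLServer.exe') `
        -OutFile 'C:\download\DataMaskerforSQLServer.exe'
      New-Item -ItemType directory -Path C:\download\DMlogs
  displayName: 'Download latest Data Masker'
```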
Next we have to extract and install Data Masker from that download, which is fairly easy to do with a cmd call:
- task: CmdLine@2
  inputs:
    script: |
      "C:\download\DataMaskerforSQLServer.exe" products "Data Masker for SQL Server" log c:\download temp c:\download /IAgreeToTheEula RG_LICESE=%RGLICENSE%
  displayName: 'Install DMS Headlessly'
Note I'm using the guidance from this page, making sure to accept the EULA, and I'm passing in my Redgate license as a variable that I have specified for the pipeline and kept secret. This will put Data Masker in the default location under C:\Program Files\… which means we will then be able to call it. I did, however, need to make sure that this actually works, so I saved my pipeline and ran it to see what happened:
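One cheap way to check that the headless install succeeded before going any further is a little guard step like this – just a sketch, and the path is simply the default install location used later in this post:

```yaml
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Fail the pipeline early if the headless install didn't land
      # in the default location we're going to call later.
      $exe = 'C:\Program Files\Red Gate\Data Masker for SQL Server 7\DataMaskerCmdLine.exe'
      if (-not (Test-Path $exe)) {
        throw "Data Masker not found at $exe - headless install failed?"
      }
  displayName: 'Verify Data Masker install'
```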
Fabulous, that all works nicely. Now to pass the DMSMaskSet file to Data Masker and get it to run – ah, but I forgot: I'm going to need a PARFILE, as per the command-line documentation, that specifies where the files for the run are located. So I created my PARFILE.txt as such:
MASKINGSET=C:\download\DMDB_MaskingTime.DMSMaskSet
LOGFILEDIR=C:\download\DMlogs
DATASETSDIR=C:\Program Files\Red Gate\Data Masker for SQL Server 7\DataSets
REPORTSDIR=C:\download\DMlogs
INTERIM_REPORTS=false
As you can see, very simple. But to make sure these files end up in the right folder – and because I didn't have time to explore how I could pass an environment variable into the .txt file itself – I'm going to add a quick file copy task to make sure that my masking set and PARFILE both make it into that location:
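That said, if you did want to avoid committing hard-coded paths, one option would be to generate the PARFILE at runtime instead of copying it from the repo – a sketch, using the same PARFILE keys as above and an inline PowerShell here-string:

```yaml
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Sketch: write the PARFILE on the agent at runtime, so paths could
      # come from pipeline variables instead of a committed .txt file.
      @"
      MASKINGSET=C:\download\DMDB_MaskingTime.DMSMaskSet
      LOGFILEDIR=C:\download\DMlogs
      DATASETSDIR=C:\Program Files\Red Gate\Data Masker for SQL Server 7\DataSets
      REPORTSDIR=C:\download\DMlogs
      INTERIM_REPORTS=false
      "@ | Set-Content -Path C:\download\PARFILE.txt
  displayName: 'Generate PARFILE'
```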
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.Repository.LocalPath)'
    Contents: '**'
    TargetFolder: 'C:\download'
  displayName: 'Copy files from working directory'
Of course now that we have Data Masker installed on the pipeline VM, the masking set AND the PARFILE… let’s get masking!
- task: CmdLine@2
  inputs:
    script: '"C:\Program Files\Red Gate\Data Masker for SQL Server 7\DataMaskerCmdLine.exe" PARFILE=C:\download\PARFILE.txt'
  displayName: 'Run Data Masker'
I agree the command itself is now less than impressive given all the prep work, but when you run all of this in its entirety…
Big success! But let’s check the data of course and make sure it is as we expect:
It all works: Lynne has been masked to Muna (and the rest has been masked too!), and I didn't need Data Masker installed on a VM in my environment with an Azure DevOps self-hosted agent to run it – I could just do it programmatically. #Winning.
I'll put the full YAML below, with some caveats:
- You cannot run this pipeline a LOT in a short space of time – I found that out. The Redgate downloads page is not designed for this kind of automated pulling, so run it sparingly; otherwise the server, by way of load balancing, will prevent you from pulling the file too often and the pipeline will time out.
- You will need to update the PowerShell step reasonably often if you want the most recent Data Masker installer, or make it more futureproof so it grabs the latest version automatically – I just didn't investigate that.
- Data Masker is dependent on the power of the machine it is running on in MANY ways, and Azure DevOps pipeline VMs are not the most powerful beasts in the world – so if you have a lot of masking to do, I would be wary of this method and might stick to an Application Server VM you've got hanging around, just in case.
- This is likely to change a lot in the future, so it may not be relevant when you're reading it a few months from now – before implementing anything like this, contact me or Redgate and confirm there isn't a better way of doing it, if I haven't already blogged it!
Thanks for stopping by and have a great week!
trigger:
- main

pool:
  vmImage: windows-latest

steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $source = 'https://download.red-gate.com/installers/DataMaskerforSQLServer/2021-03-15/DataMaskerforSQLServer.exe'
      New-Item -ItemType directory -Path C:\download
      $destination = 'C:\download\DataMaskerforSQLServer.exe'
      Invoke-WebRequest -Uri $source -OutFile $destination
      New-Item -ItemType directory -Path C:\download\DMlogs
  displayName: 'Download Data Masker'
- task: CmdLine@2
  inputs:
    script: |
      "C:\download\DataMaskerforSQLServer.exe" products "Data Masker for SQL Server" log c:\download temp c:\download /IAgreeToTheEula RG_LICESE=%RGLICENSE%
  displayName: 'Install DMS Headlessly'
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.Repository.LocalPath)'
    Contents: '**'
    TargetFolder: 'C:\download'
  displayName: 'Copy files from working directory'
- task: CmdLine@2
  inputs:
    script: '"C:\Program Files\Red Gate\Data Masker for SQL Server 7\DataMaskerCmdLine.exe" PARFILE=C:\download\PARFILE.txt'
  displayName: 'Run Data Masker'
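And if you did want to take up the earlier suggestion of publishing the Data Masker logs as a pipeline artifact, a stock publish task appended to the end of the steps would do it – a sketch, using the standard PublishBuildArtifacts@1 task and the DMlogs folder created above:

```yaml
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: 'C:\download\DMlogs'
    ArtifactName: 'data-masker-logs'
  # Publish the logs even if the masking step failed - that's
  # usually when you want them most.
  condition: always()
  displayName: 'Publish masking logs'
```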
4 thoughts on ““But I don’t wanna INSTALL it!”: Data Masker on the fly in Azure DevOps (with an Azure SQL DB)”
Hi, do you need to specify a command type when you run DataMaskerCmdLine? I see a set of commands like run, build etc. that need to be specified when running the command.
Can you let us know?
- task: CmdLine@2
  inputs:
    script: '"C:\Program Files\Red Gate\Data Masker for SQL Server 7\DataMaskerCmdLine.exe" PARFILE=C:\download\PARFILE.txt'
  displayName: 'Run Data Masker'
Hi Sam, thanks for your comment,
You can use the commands like run, build etc.; however, if you don't specify one and just provide the PARFILE, it will run by default. You could adapt this pipeline to dynamically build a masking set each time from SQL Data Catalog – I went the simpler route – but the command as above works just fine 🙂
Chris, I also saw your PowerShell script (https://github.com/ChrisUnwin/PowerShell/blob/master/Demos/Redgate%20Demos/CopyMaskCopyAzure.ps1) and I have a question about this line (https://github.com/ChrisUnwin/PowerShell/blob/master/Demos/Redgate%20Demos/CopyMaskCopyAzure.ps1#L21): can you share what this PARFILE looks like? I want to see the masking set in it, and to understand how we can mask the temp copy. If you could share those details that would be awesome.
Hi Sam, thank you for the comment! The PARFILE doesn’t look anything out of the ordinary – you can find a couple of examples here: https://documentation.red-gate.com/dms/data-masker-help/general-topics/about-command-line-automation
The PARFILE and masking set were just located on my machine, invoked from the Azure DevOps local agent, and ultimately pointing at the temp SQL DB copy! Let me know if this makes sense or if you have any extra questions – thanks very much!