Recently, I participated in a DR event to recover a data warehouse. The environment's configuration was not terribly complex, but it still took some time to find and adjust settings to get the ETL running again. What occurred to me is that, with an automated deployment process, DR would be a breeze. And if you are doing DR test runs, you probably deserve automated deployment. This post is the first of two in which I will describe an approach to dealing with database changes.
How do we detect differences in the database schema and create a change script?
There are plenty of tools out there that support schema comparison, but I prefer SSDT. It lets us perform a schema comparison and generate a SQLCMD file, which can be executed against the target database to implement the changes. Please note that the generated script is not idempotent: if a new table needs to be created, the script creates it without checking whether it already exists. While the output script cannot be executed twice in a row without error, the process of publishing or schema comparison can. So it is the combination of comparing schemas and scripting only the resulting differences that makes the overall process idempotent.
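To see the generated change script without applying it, the same comparison can be driven from the command line with the Script action. This is a minimal sketch; it assumes sqlpackage.exe is on the path, that the project has been built to Demo.dacpac, and the server, database, and paths are placeholders for your environment:

```powershell
# Generate (but do not run) a SQLCMD change script that would bring
# the target database in line with the dacpac's schema definition.
& sqlpackage.exe /Action:Script `
    /SourceFile:"C:\temp\Build\Demo.dacpac" `
    /TargetServerName:"localhost" `
    /TargetDatabaseName:"Demo" `
    /OutputPath:"C:\temp\Build\Demo.Changes.sql"
```

Running the Script action repeatedly is safe, which is exactly the idempotence described above: each run re-compares the schemas and emits only the outstanding differences.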
Getting set up for automation
We can manually publish changes to a database in SSDT by right-clicking on the project and selecting Publish. This is a good starting point, but instead of publishing, we will create a publish profile. Note that the dialog has a few profile-related options in the bottom left corner:
When you are done configuring the profile, click “Save As” rather than “Create Profile.” The difference between the two is that “Save As” lets you specify the name and location of the file; the latter just drops the file in the root of your project and names it automatically. We can use this feature to create environment-specific profiles. For demo purposes, I will create a second profile with a different database name.
Advanced Profile Settings
The advanced settings button can be used to modify the settings properties for the profile. This has a few important options to consider per environment:
- Block incremental deployment if data loss might occur – you would normally leave this turned on as a safeguard; however, there are times when you need to turn it off, such as when you need to drop a column from a table.
- Always re-create database – this makes sense for testing, when you need to baseline the database.
- Back up database before deployment – this might make sense if the database is small and the backup will complete relatively quickly, though I prefer using snapshots instead.
- Drop objects that are in the target but not in the project – do not use this setting. Because you cannot filter out users, you will wind up dropping users, which are environment specific. I’m not holding my breath waiting for that enhancement to be added.
- Ignore permissions – this might make sense if security changes have been made manually in different environments
- Ignore role membership – for the same reason as above
- Generate smart defaults – great for iterating in development environments. This allows you to add a non-nullable column to a table that already contains data by supplying a default value, such as 0 for an int or an empty string for a string.
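Most of these dialog options can also be overridden at deployment time via sqlpackage.exe property switches, rather than being baked into the profile. A sketch, assuming sqlpackage.exe is on the path; the server, database, and file names are placeholders, while the property names are the sqlpackage counterparts of the dialog settings above:

```powershell
# Override profile settings for a single run using /p: properties.
& sqlpackage.exe /Action:Publish `
    /SourceFile:"C:\temp\Build\Demo.dacpac" `
    /TargetServerName:"localhost" `
    /TargetDatabaseName:"Demo" `
    /p:BlockOnPossibleDataLoss=False `
    /p:DropObjectsNotInSource=False `
    /p:IgnorePermissions=True `
    /p:IgnoreRoleMembership=True `
    /p:GenerateSmartDefaults=True
```

Per-run overrides like these are handy for the occasional exception, such as temporarily allowing data loss when dropping a column, without editing the saved profile.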
The end result of saving the profile is an XML document that records these settings, including the values for SQLCMD variables, which will usually vary by environment:
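A trimmed-down example of what such a profile might look like; the database name, connection string, and SQLCMD variable here are illustrative placeholders, not values from a real project:

```xml
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <TargetDatabaseName>Demo</TargetDatabaseName>
    <TargetConnectionString>Data Source=localhost;Integrated Security=True</TargetConnectionString>
    <BlockOnPossibleDataLoss>True</BlockOnPossibleDataLoss>
    <DropObjectsNotInSource>False</DropObjectsNotInSource>
    <GenerateSmartDefaults>True</GenerateSmartDefaults>
  </PropertyGroup>
  <ItemGroup>
    <!-- SQLCMD variables typically hold the values that differ per environment -->
    <SqlCmdVariable Include="Environment">
      <Value>Dev</Value>
    </SqlCmdVariable>
  </ItemGroup>
</Project>
```

Because the profile is plain XML, it can be checked into source control alongside the project, one file per environment.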
Once we have the publish profiles created, we can script the deployment process in PowerShell using the sqlpackage.exe utility. Note that clicking Publish or double-clicking a profile builds the project. Building the project creates a dacpac file in the bin folder of the project; the dacpac stores the schema definition, which can be compared against a target database. I have copied the dacpac and publish profiles to a staging folder at c:\temp\build. The deployment script looks like this in PowerShell:
```powershell
$buildFolder = "C:\temp\Build"
$sourceFile = "$buildFolder\Demo.dacpac"
$sqlPackage = "${env:ProgramFiles(x86)}\Microsoft SQL Server\110\DAC\bin\sqlpackage.exe" # the location can vary by installation
& $sqlPackage /Profile:"$buildFolder\Demo.Publish.xml" /SourceFile:$sourceFile /Action:Publish
& $sqlPackage /Profile:"$buildFolder\DemoTest.Publish.xml" /SourceFile:$sourceFile /Action:Publish
```
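As the number of environments grows, the per-profile lines can be collapsed into a loop. A small sketch, assuming every environment's profile sits in the same build folder and follows the *.Publish.xml naming convention used above:

```powershell
# Publish the dacpac once for each profile found in the build folder.
Get-ChildItem -Path "$buildFolder\*.Publish.xml" | ForEach-Object {
    & $sqlPackage /Profile:$_.FullName /SourceFile:$sourceFile /Action:Publish
}
```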
In the next post, we will make the above script configurable using a JSON configuration file.