Database Version Control - Incremental Scripts

21.10.2011 01:20
Back in this post, I explained the three different methods for versioning databases that I have worked with: backups, incremental scripts, and Visual Studio Database Projects. I never had a chance to blog about the incremental scripts method, so I thought I'd do it now.

Setting things up

DbVersion table

First, you need to add a table to keep track of all the incremental scripts that were executed against your database. Something like this will do:
    CREATE TABLE [dbo].[DbVersion](
        [Id] [int] IDENTITY(1,1) NOT NULL,
        [FileName] [varchar](250) NOT NULL,
        [Description] [varchar](4000) NOT NULL,
        [Executed] [date] NOT NULL,
     CONSTRAINT [PK_DbVersion] PRIMARY KEY CLUSTERED
    (
        [Id] ASC
    )
    ) ON [PRIMARY]
You can then query this table to see the current version and, what's more, to check that the scripts were executed in the correct sequence.
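As a quick sketch of that ordering check (in Python for brevity; the function name and row shape are illustrative, not part of the actual tooling), given the rows that `SELECT FileName FROM DbVersion ORDER BY Id` would return:

```python
def scripts_ran_in_order(filenames):
    """Check that scripts were executed in filename order.

    `filenames` is the FileName column of DbVersion, in execution
    order (ORDER BY Id). Uses plain string comparison, which matches
    the version scheme as long as minor numbers stay below 10 or are
    zero-padded.
    """
    return list(filenames) == sorted(filenames)
```

If this returns False, someone ran an incremental script out of sequence and the database may not match what the scripts describe.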


Baseline

The baseline is a script that creates the entire database up to a given point in time, so the first script, which creates the database from scratch, is just one of many baselines we might have. To create a baseline in SQL Server, you can use the generate scripts option in Management Studio. This opens a wizard that guides you through the steps to script your database. In order to have a proper baseline, you can clean the database but leave all the data in the enumeration tables (like categories and other built-in data), and use the “Script data and schema” option.

Versioning Scheme

You should pick a versioning scheme that suits your development process. The minimum setup that I would use is the following: [project name].[major].[minor]. In this scenario the major is the current baseline, and the minor is the incremental script relative to that baseline. All the versions with minor “0” are the baselines themselves. Now that we've chosen our scheme, let's save the baseline with the following filename: demo.1.0.sql.
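The scheme above is easy to work with programmatically. Here's a minimal sketch (in Python; the helper names are mine, not part of any tool) that splits a filename into its version parts:

```python
def parse_version(filename):
    """Split '<project>.<major>.<minor>.sql' into its parts.

    Major and minor are returned as ints so that demo.2.10.sql
    sorts after demo.2.9.sql.
    """
    project, major, minor, ext = filename.rsplit(".", 3)
    if ext != "sql":
        raise ValueError("not a versioned script: %s" % filename)
    return project, int(major), int(minor)


def is_baseline(filename):
    # Baselines are the versions whose minor is 0.
    return parse_version(filename)[2] == 0
```
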


Scenario 1 – Clean setup

By clean setup, I mean that you don't have a database in your local environment. In that case you should do the following:

1. Pick the newest baseline (ie: demo.2.0.sql) and execute it.
2. Run all the incremental scripts of that baseline in ascending order (ie: demo.2.1.sql, demo.2.2.sql, etc).
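Those two steps amount to a simple selection over the script filenames. A sketch (again in Python, with illustrative names, and assuming the [project].[major].[minor].sql scheme):

```python
def clean_setup_plan(filenames):
    """Return the execution order for a clean setup: the newest
    baseline (minor == 0), then its incremental scripts ascending.
    """
    def version(f):
        parts = f.rsplit(".", 3)        # e.g. ['demo', '2', '1', 'sql']
        return int(parts[1]), int(parts[2])

    # Newest baseline by numeric (major, minor), not string order.
    baseline = max((f for f in filenames if version(f)[1] == 0), key=version)
    major = version(baseline)[0]
    # All incrementals belonging to that baseline, in ascending order.
    incrementals = sorted(
        (f for f in filenames if version(f)[0] == major and version(f)[1] > 0),
        key=version,
    )
    return [baseline] + incrementals
```
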

Scenario 2 – Making a change

When making a change to our database, we create a new incremental script, thus increasing the version of our database. Suppose that you update your SVN (or whichever version control tool) database folder and see that the latest script is demo.2.3.sql. At that point you'll create a new file called demo.2.4.sql for your change. The script itself can be made with a tool like Red Gate's SQL Compare and SQL Data Compare, or even manually. Whichever method you choose, it is usually good practice to enclose the changes inside a transaction. After the changes are made, the script should insert a record into the DbVersion table mentioned before. This ensures that DbVersion is always up to date. Here is a sample script generated with SQL Compare, from which I manually removed the GO statements and to which I added the insert into the DbVersion table:
    /*
    Run this script on:
            (local)\sqlexpress.RestBucks2    -  This database will be modified
    to synchronize it with:
            (local)\sqlexpress.Store
    You are recommended to back up your database before running this script
    Script created by SQL Compare version 9.1.0 from Red Gate Software Ltd at 10/6/2011 7:41:17 PM
    */
    SET NUMERIC_ROUNDABORT OFF
    SET ANSI_PADDING, ANSI_WARNINGS, CONCAT_NULL_YIELDS_NULL, ARITHABORT, QUOTED_IDENTIFIER, ANSI_NULLS ON
    IF EXISTS (SELECT * FROM tempdb..sysobjects WHERE id = OBJECT_ID('tempdb..#tmpErrors')) DROP TABLE #tmpErrors
    CREATE TABLE #tmpErrors (Error int)
    SET XACT_ABORT ON
    SET TRANSACTION ISOLATION LEVEL SERIALIZABLE
    BEGIN TRANSACTION
    PRINT N'Altering [dbo].[Product]'
    ALTER TABLE [dbo].[Product] DROP COLUMN [Version]
    IF @@ERROR <> 0 AND @@TRANCOUNT > 0 ROLLBACK TRANSACTION
    IF @@TRANCOUNT = 0 BEGIN INSERT INTO #tmpErrors (Error) SELECT 1 BEGIN TRANSACTION END
    IF EXISTS (SELECT * FROM #tmpErrors) ROLLBACK TRANSACTION
    IF @@TRANCOUNT > 0
    BEGIN
        INSERT INTO [dbo].[DbVersion]
                   ([FileName]
                   ,[Description]
                   ,[Executed])
             VALUES
                   ('demo.2.4.sql'
                   ,'Removing Version column from Product table.'
                   ,GETDATE())
        PRINT 'The database update succeeded'
        COMMIT TRANSACTION
    END
    ELSE PRINT 'The database update failed'
    DROP TABLE #tmpErrors
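The "find the latest script and bump the minor" step can also be sketched in a few lines (Python, illustrative names, assuming the [project].[major].[minor].sql scheme):

```python
def next_script_name(filenames):
    """Given the existing script filenames, compute the name of the
    next incremental script: newest version with minor + 1.
    """
    def version(f):
        parts = f.rsplit(".", 3)        # ['demo', '2', '3', 'sql']
        return int(parts[1]), int(parts[2])

    latest = max(filenames, key=version)
    project = latest.rsplit(".", 3)[0]
    major, minor = version(latest)
    return "%s.%d.%d.sql" % (project, major, minor + 1)
```
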

Creating a Baseline

After several changes are made to a database, more and more scripts have to be executed to get a clean setup. To avoid accumulating too many incremental scripts, you can create a new baseline as a shortcut past a bunch of them. Sometimes it is a good idea to create a baseline when a production deployment is done; sometimes it is a good idea to create one when you simply have more scripts than you would like. To create a baseline, the only thing you need to do is pick the previous baseline, run all the incremental scripts of that baseline, and then script the entire database like we did before.

Updating Your Local Database

The flow for updating an existing database is the following:

1. Query the DbVersion table in your database to find the current version.
2. Query your database scripts repository looking for scripts that are newer than yours.
3. Execute those scripts, except for the baselines (minor equals “0”).
4. Query the DbVersion table to make sure that all the scripts ran successfully and in order.

This can usually be automated quite easily. Here's the PowerShell script for updating your database (it's only a little bit over 30 lines):
    $SqlServer = '.\sqlexpress'
    $DB = 'RestBucks2'
    $conn_options = ("Data Source=$SqlServer; Initial Catalog=$DB;" +
        "Integrated Security=SSPI")
    $conn = New-Object System.Data.SqlClient.SqlConnection($conn_options)
    $conn.Open()
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = 'Select top 1 FileName from dbVersion order by FileName desc'
    $reader = $cmd.ExecuteReader()
    if ($reader.Read())
    {
        $columns = New-Object object[] 1
        $reader.GetValues($columns) > $null
        $current = $columns[0]
        $conn.Close()
    }
    else
    {
        $current = ""
    }
    $files = Get-ChildItem | Where { $_.Name -like "$DB.*.*.sql" }
    $tokens = $files | Select-Object @{Name="FileName";Expression={$_}}, @{Name="Tokens";Expression={$_.Name.Split('.')}}
    $scripts = $tokens | Select-Object @{Name="FileName";Expression={$_.FileName.Name}}, @{Name="IsBaseline";Expression={$_.Tokens[2] -eq 0}}
    $updates = $scripts | Where { $_.IsBaseline -eq $false -and $_.FileName.CompareTo($current) -gt 0 }
    foreach ($script in $updates)
    {
        $conn = New-Object System.Data.SqlClient.SqlConnection($conn_options)
        $conn.Open()
        $cmd = $conn.CreateCommand()
        $cmd.CommandText = Get-Content $script.FileName
        $cmd.ExecuteNonQuery()
        $conn.Close()
    }

Modifying an update

If you made a mistake in a script and you need to correct it, don't! Always create a new version rather than altering an update that's already in SVN, because those changes are hard to push to the rest of your team. However, if the mistake could result in the loss of data, then fix it, commit it to SVN, and let everyone know! :)

Stored Procedures

Stored procedures and views in SQL Server are a little special. Suppose you change a table to remove a column that a stored procedure or a view is using. SQL Server will perform the update and silently break the SP or the view. So, as part of the database update process, it's not a bad idea to drop all the stored procedures and all the views and recreate them. To do that, you have to keep the SPs and views in separate scripts, probably in different folders, with one file per SP or view. So sacrifice a little bit of development time to make your life much easier every time you migrate databases in every environment, including the most critical one: PRODUCTION!
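The "drop everything, then re-run the per-object scripts" step boils down to generating one idempotent DROP per object. A sketch (Python, illustrative helper; the generated T-SQL is the standard OBJECT_ID guard pattern):

```python
def drop_statements(names, kind="PROCEDURE"):
    """Build idempotent DROP statements for stored procedures or views
    kept one-per-file, so they can all be dropped and recreated from
    their scripts on every update. `kind` is 'PROCEDURE' or 'VIEW'.
    """
    template = "IF OBJECT_ID(N'[dbo].[{0}]') IS NOT NULL DROP {1} [dbo].[{0}];"
    return [template.format(name, kind) for name in names]
```

The object names would come from enumerating the SP and view script folders, one file per object.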


Pros

  • It’s blazing fast.
  • You never lose your data as part of a change, in any of your environments, so there's no need to update testing scripts, etc.
  • Deployments are dead simple. By the time you get to the production deployment, the scripts have already been executed on every developer’s machine and in the dev, QA, and staging environments.
  • Easy to automate deployments.
  • No need for complex heavy tools like Visual Studio Database Projects.
  • The update history resides in the database itself (see the DbVersion table).


Cons

  • May require SQL knowledge to craft migration scripts.
  • It’s hard to test the incremental scripts.
  • Takes a little more developer time in some cases.
So even though this method of versioning the database may not work on every project, I consider it a perfectly valid way to keep your database under version control.
