Database migrations are one of those things developers tend to overthink until they've been burned by a bad deployment. Then suddenly, version control for your database schema becomes non-negotiable. Entity Framework has migrations built in, Flyway is the Java world's darling, and there are dozens of other tools competing for mindshare. So why DbUp?
The answer is refreshingly simple. DbUp is a .NET library that runs SQL scripts in order, tracks what's been executed, and doesn't try to be clever about it. No DSL to learn, no code-first abstractions, no magic: just SQL files, version control, and a straightforward execution model.
By doing less, DbUp often does more. When Entity Framework migrations fail mid-deployment because of some edge case the migration generator didn't anticipate, you're left debugging auto-generated code. With DbUp, you're debugging the exact SQL you wrote. When you need to run a complex data migration that would be painful in LINQ, DbUp doesn't care—write whatever SQL you need.
This isn't to say DbUp is perfect for everyone. If you're building a greenfield application with simple entities and you love code-first development, EF migrations might serve you better. But if you're working with existing databases, need precise control over your SQL, or want migration scripts that are just SQL files in source control, DbUp is elegant in its simplicity.
For instance: A legacy database with complex stored procedures, triggers, and custom types? DbUp handles it effortlessly. A brand-new microservice with three tables and standard CRUD operations? EF migrations might be faster to get started.
DbUp is a .NET library for deploying changes to SQL databases. It tracks which SQL scripts have been run against a database and ensures that scripts are executed exactly once, in the correct order.
Key characteristics:
Simplicity: DbUp's entire job is to run SQL scripts in order and remember which ones have been executed. That's it. No code generation, no reverse engineering, no complex state management.
Plain SQL: You write standard SQL. Need to create indexes with specific options? Use advanced database features? Execute complex data migrations? Just write the SQL you need.
Works with existing databases: Unlike code-first migrations that assume they own your schema, DbUp works perfectly with legacy databases, DBA-managed schemas, and complex existing systems.
Reviewable: SQL scripts in source control are easy to review, merge, and understand. DBAs can review them. Version control handles conflicts naturally.
Flexible execution: Run migrations from console apps, web app startup, CI/CD pipelines, or dedicated migration tools. DbUp is just a library—you decide how and when to use it.
Battle-tested: DbUp is used by thousands of applications and maintained by the .NET community. It powers deployments from small apps to enterprise systems.
Traditional Manual Migrations:
1. Write SQL script
2. Connect to production database
3. Run script manually via SSMS/pgAdmin
4. Hope you remembered all environments
5. Hope you didn't run it twice
6. Lose track of what's been applied where
With DbUp:
1. Write SQL script as embedded resource
2. Add to project (numbered sequentially: 001_CreateUsers.sql, 002_AddEmailIndex.sql)
3. Run application/migration tool
4. DbUp checks its tracking table
5. DbUp runs only new scripts in order
6. DbUp records execution
7. Safe, repeatable, auditable
DbUp creates a table (default: SchemaVersions) to track executed scripts:
CREATE TABLE SchemaVersions (
SchemaVersionID INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
ScriptName NVARCHAR(255) NOT NULL,
Applied DATETIME NOT NULL
)
Before running any script, DbUp checks if its name exists in this table. If it does, it's skipped. If not, it's executed and recorded.
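Because the journal is just an ordinary table, you can query it directly to audit what has been applied. A minimal example against the default SQL Server journal shown above:
SELECT ScriptName, Applied
FROM SchemaVersions
ORDER BY Applied DESC;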
# Core package
dotnet add package dbup-core
# Database-specific packages
dotnet add package dbup-sqlserver # SQL Server
dotnet add package dbup-postgresql # PostgreSQL
dotnet add package dbup-mysql # MySQL
dotnet add package dbup-sqlite # SQLite
Create a console application for migrations:
using DbUp;
using System;
using System.Linq;
using System.Reflection;
namespace DatabaseMigrations
{
class Program
{
static int Main(string[] args)
{
var connectionString =
args.FirstOrDefault()
?? "Server=localhost;Database=MyApp;Trusted_Connection=True;";
var upgrader =
DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
.LogToConsole()
.Build();
var result = upgrader.PerformUpgrade();
if (!result.Successful)
{
Console.ForegroundColor = ConsoleColor.Red;
Console.WriteLine(result.Error);
Console.ResetColor();
return -1;
}
Console.ForegroundColor = ConsoleColor.Green;
Console.WriteLine("Success!");
Console.ResetColor();
return 0;
}
}
}
Add SQL files as embedded resources:
Project Structure:
DatabaseMigrations/
├── Program.cs
├── Scripts/
│ ├── 001_CreateUsersTable.sql
│ ├── 002_CreateOrdersTable.sql
│ ├── 003_AddEmailIndex.sql
│ └── 004_CreateStoredProcs.sql
└── DatabaseMigrations.csproj
In .csproj:
<ItemGroup>
<EmbeddedResource Include="Scripts\**\*.sql" />
</ItemGroup>
Example Migration Script (001_CreateUsersTable.sql):
CREATE TABLE Users (
Id INT IDENTITY(1,1) PRIMARY KEY,
Username NVARCHAR(100) NOT NULL UNIQUE,
Email NVARCHAR(255) NOT NULL UNIQUE,
PasswordHash NVARCHAR(255) NOT NULL,
CreatedDate DATETIME2 NOT NULL DEFAULT GETUTCDATE(),
UpdatedDate DATETIME2 NULL,
IsActive BIT NOT NULL DEFAULT 1
);
CREATE INDEX IX_Users_Email ON Users(Email);
CREATE INDEX IX_Users_Username ON Users(Username);
001_CreateInitialSchema.sql
002_AddUserRoles.sql
003_AlterUsersAddLastLogin.sql
004_CreateOrdersTable.sql
005_MigrateUserData.sql
Best Practices: use zero-padded sequential numbers (001_, 002_, ...) or sortable dates (20231215_CreateUsers.sql), keep names short but descriptive, and never rename a script once it has been applied (DbUp tracks scripts by name, so a renamed script would run again).
-- 001_CreateProductsTable.sql
CREATE TABLE Products (
Id INT IDENTITY(1,1) PRIMARY KEY,
Name NVARCHAR(200) NOT NULL,
Description NVARCHAR(MAX) NULL,
Price DECIMAL(18,2) NOT NULL,
StockQuantity INT NOT NULL DEFAULT 0,
CategoryId INT NULL,
CreatedDate DATETIME2 NOT NULL DEFAULT GETUTCDATE(),
UpdatedDate DATETIME2 NULL,
IsDeleted BIT NOT NULL DEFAULT 0,
CONSTRAINT CK_Products_Price CHECK (Price >= 0),
CONSTRAINT CK_Products_Stock CHECK (StockQuantity >= 0)
);
CREATE INDEX IX_Products_CategoryId ON Products(CategoryId);
CREATE INDEX IX_Products_IsDeleted ON Products(IsDeleted) WHERE IsDeleted = 0;
-- 005_AlterProductsAddSKU.sql
ALTER TABLE Products
ADD SKU NVARCHAR(50) NULL;
GO
-- Backfill SKUs for existing products
UPDATE Products
SET SKU = 'SKU' + RIGHT('00000' + CAST(Id AS NVARCHAR), 5)
WHERE SKU IS NULL;
GO
-- Make it required after backfill
ALTER TABLE Products
ALTER COLUMN SKU NVARCHAR(50) NOT NULL;
GO
CREATE UNIQUE INDEX IX_Products_SKU ON Products(SKU);
-- 008_MigrateUserRoles.sql
-- Create new roles table
CREATE TABLE Roles (
Id INT IDENTITY(1,1) PRIMARY KEY,
Name NVARCHAR(50) NOT NULL UNIQUE
);
CREATE TABLE UserRoles (
UserId INT NOT NULL,
RoleId INT NOT NULL,
AssignedDate DATETIME2 NOT NULL DEFAULT GETUTCDATE(),
PRIMARY KEY (UserId, RoleId),
FOREIGN KEY (UserId) REFERENCES Users(Id),
FOREIGN KEY (RoleId) REFERENCES Roles(Id)
);
GO
-- Insert default roles
INSERT INTO Roles (Name) VALUES ('Admin'), ('User'), ('Guest');
GO
-- Migrate existing users based on IsAdmin flag
INSERT INTO UserRoles (UserId, RoleId)
SELECT
u.Id,
r.Id
FROM Users u
CROSS JOIN Roles r
WHERE u.IsAdmin = 1 AND r.Name = 'Admin'
OR u.IsAdmin = 0 AND r.Name = 'User';
GO
-- Remove old column
ALTER TABLE Users DROP COLUMN IsAdmin;
-- 010_CreateUserStoredProcedures.sql
CREATE PROCEDURE usp_GetUserById
@UserId INT
AS
BEGIN
SET NOCOUNT ON;
SELECT
Id,
Username,
Email,
CreatedDate,
IsActive
FROM Users
WHERE Id = @UserId;
END;
GO
CREATE PROCEDURE usp_CreateUser
@Username NVARCHAR(100),
@Email NVARCHAR(255),
@PasswordHash NVARCHAR(255)
AS
BEGIN
SET NOCOUNT ON;
INSERT INTO Users (Username, Email, PasswordHash)
VALUES (@Username, @Email, @PasswordHash);
SELECT SCOPE_IDENTITY() AS UserId;
END;
GO
CREATE PROCEDURE usp_UpdateUserEmail
@UserId INT,
@NewEmail NVARCHAR(255)
AS
BEGIN
SET NOCOUNT ON;
UPDATE Users
SET Email = @NewEmail,
UpdatedDate = GETUTCDATE()
WHERE Id = @UserId;
SELECT @@ROWCOUNT AS RowsAffected;
END;
GO
-- 012_CreateUserActivityView.sql
CREATE VIEW vw_UserActivity
AS
SELECT
u.Id AS UserId,
u.Username,
u.Email,
COUNT(DISTINCT o.Id) AS OrderCount,
SUM(o.TotalAmount) AS TotalSpent,
MAX(o.OrderDate) AS LastOrderDate,
u.CreatedDate AS MemberSince
FROM Users u
LEFT JOIN Orders o ON u.Id = o.CustomerId
GROUP BY u.Id, u.Username, u.Email, u.CreatedDate;
GO
-- 015_CreatePerformanceIndexes.sql
-- Covering index for order queries
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId_OrderDate
ON Orders (CustomerId, OrderDate DESC)
INCLUDE (TotalAmount, Status);
-- Filtered index for active orders
CREATE NONCLUSTERED INDEX IX_Orders_Active
ON Orders (OrderDate DESC)
INCLUDE (CustomerId, TotalAmount)
WHERE Status IN ('Pending', 'Processing');
-- Full-text index for product search
CREATE FULLTEXT CATALOG ftCatalog AS DEFAULT;
GO
CREATE FULLTEXT INDEX ON Products(Name, Description)
KEY INDEX PK_Products
ON ftCatalog;
GO
// Embedded resources (default)
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
.Build();
// From file system
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsFromFileSystem("./Scripts")
.Build();
// Multiple sources
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(
Assembly.GetExecutingAssembly(),
script => script.StartsWith("MyApp.Migrations.Core"))
.WithScriptsEmbeddedInAssembly(
Assembly.GetExecutingAssembly(),
script => script.StartsWith("MyApp.Migrations.Data"))
.Build();
// Custom filtering
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(
Assembly.GetExecutingAssembly(),
script => script.EndsWith(".sql") && !script.Contains("Test"))
.Build();
// Transaction per script
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
.WithTransactionPerScript() // Each script in its own transaction
.Build();
// No transactions (DbUp's default behavior)
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
.WithoutTransaction() // Needed when a script contains statements that can't run inside a transaction (e.g. CREATE FULLTEXT INDEX)
.Build();
// Single transaction for all scripts
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
.WithTransaction() // All scripts in one transaction (all-or-nothing)
.Build();
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
.WithVariables(new Dictionary<string, string>
{
{ "DatabaseName", "MyAppDb" },
{ "SchemaName", "dbo" },
{ "AdminEmail", "[email protected]" }
})
.Build();
In SQL Script:
-- 020_CreateAdminUser.sql
USE [$DatabaseName$];
GO
INSERT INTO [$SchemaName$].Users (Username, Email, IsAdmin)
VALUES ('admin', '$AdminEmail$', 1);
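If a script contains literal $...$ text that must not be treated as a variable (for example inside a string constant), substitution can be switched off for that run. A one-line sketch, assuming the standard builder extension is available in your DbUp version:
.WithVariablesDisabled()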
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
.WithExecutionTimeout(TimeSpan.FromMinutes(5)) // Timeout for long-running scripts
.LogToConsole()
.LogScriptOutput() // Log PRINT statements from SQL
.Build();
static int Main(string[] args)
{
var connectionString = GetConnectionString();
// Ensure database exists
EnsureDatabase.For.SqlDatabase(connectionString);
// Run migrations
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
.LogToConsole()
.Build();
// Check if any scripts need to run
var scriptsToExecute = upgrader.GetScriptsToExecute();
if (!scriptsToExecute.Any())
{
Console.WriteLine("No new scripts to execute.");
return 0;
}
Console.WriteLine($"Found {scriptsToExecute.Count} scripts to execute:");
foreach (var script in scriptsToExecute)
{
Console.WriteLine($" - {script.Name}");
}
var result = upgrader.PerformUpgrade();
if (!result.Successful)
{
Console.ForegroundColor = ConsoleColor.Red;
Console.WriteLine(result.Error);
Console.ResetColor();
return -1;
}
Console.ForegroundColor = ConsoleColor.Green;
Console.WriteLine("Success!");
Console.ResetColor();
return 0;
}
public class Program
{
public static void Main(string[] args)
{
var host = CreateHostBuilder(args).Build();
// Run migrations before starting the application
RunDatabaseMigrations(host.Services);
host.Run();
}
private static void RunDatabaseMigrations(IServiceProvider services)
{
var configuration = services.GetRequiredService<IConfiguration>();
var logger = services.GetRequiredService<ILogger<Program>>();
var connectionString = configuration.GetConnectionString("DefaultConnection");
logger.LogInformation("Running database migrations...");
EnsureDatabase.For.SqlDatabase(connectionString);
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
.LogTo(new DbUpLogger(logger))
.Build();
var result = upgrader.PerformUpgrade();
if (!result.Successful)
{
logger.LogError(result.Error, "Database migration failed");
throw new Exception("Database migration failed", result.Error);
}
logger.LogInformation("Database migrations completed successfully");
}
public static IHostBuilder CreateHostBuilder(string[] args) =>
Host.CreateDefaultBuilder(args)
.ConfigureWebHostDefaults(webBuilder =>
{
webBuilder.UseStartup<Startup>();
});
}
// Custom logger adapter
public class DbUpLogger : IUpgradeLog
{
private readonly ILogger _logger;
public DbUpLogger(ILogger logger)
{
_logger = logger;
}
public void WriteInformation(string format, params object[] args)
{
_logger.LogInformation(format, args);
}
public void WriteError(string format, params object[] args)
{
_logger.LogError(format, args);
}
public void WriteWarning(string format, params object[] args)
{
_logger.LogWarning(format, args);
}
}
// MigrationTool/Program.cs
class Program
{
static int Main(string[] args)
{
var config = new ConfigurationBuilder()
.SetBasePath(Directory.GetCurrentDirectory())
.AddJsonFile("appsettings.json", optional: false)
.AddJsonFile($"appsettings.{GetEnvironment()}.json", optional: true)
.AddEnvironmentVariables()
.AddCommandLine(args)
.Build();
var connectionString = config.GetConnectionString("DefaultConnection");
if (string.IsNullOrEmpty(connectionString))
{
Console.WriteLine("Error: Connection string not found");
return -1;
}
Console.WriteLine($"Environment: {GetEnvironment()}");
Console.WriteLine($"Database: {GetDatabaseName(connectionString)}");
// Optional: Create database if it doesn't exist
if (config.GetValue<bool>("CreateDatabaseIfNotExists"))
{
EnsureDatabase.For.SqlDatabase(connectionString);
}
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(
Assembly.GetExecutingAssembly(),
script => script.EndsWith(".sql"))
.WithVariables(new Dictionary<string, string>
{
{ "Environment", GetEnvironment() }
})
.LogToConsole()
.WithTransactionPerScript()
.Build();
var result = upgrader.PerformUpgrade();
if (!result.Successful)
{
Console.ForegroundColor = ConsoleColor.Red;
Console.WriteLine(result.Error);
Console.ResetColor();
return -1;
}
Console.ForegroundColor = ConsoleColor.Green;
Console.WriteLine("Success!");
Console.ResetColor();
DisplaySuccessInformation(result);
return 0;
}
static void DisplaySuccessInformation(DatabaseUpgradeResult result)
{
if (result.Scripts.Any())
{
Console.WriteLine();
Console.WriteLine("Executed scripts:");
foreach (var script in result.Scripts)
{
Console.WriteLine($" ✓ {script.Name}");
}
}
else
{
Console.WriteLine("No new scripts to execute.");
}
}
static string GetEnvironment()
{
return Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT") ?? "Production";
}
static string GetDatabaseName(string connectionString)
{
var builder = new SqlConnectionStringBuilder(connectionString);
return builder.InitialCatalog;
}
}
# Dockerfile for migration tool
FROM mcr.microsoft.com/dotnet/runtime:8.0 AS base
WORKDIR /app
FROM mcr.microsoft.com/dotnet/sdk:8.0 AS build
WORKDIR /src
COPY ["MigrationTool/MigrationTool.csproj", "MigrationTool/"]
RUN dotnet restore "MigrationTool/MigrationTool.csproj"
COPY . .
WORKDIR "/src/MigrationTool"
RUN dotnet build "MigrationTool.csproj" -c Release -o /app/build
FROM build AS publish
RUN dotnet publish "MigrationTool.csproj" -c Release -o /app/publish
FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "MigrationTool.dll"]
docker-compose.yml:
version: '3.8'
services:
database:
image: mcr.microsoft.com/mssql/server:2022-latest
environment:
SA_PASSWORD: "YourStrong@Passw0rd"
ACCEPT_EULA: "Y"
ports:
- "1433:1433"
volumes:
- sqldata:/var/opt/mssql
migrations:
build:
context: .
dockerfile: Dockerfile
depends_on:
- database
environment:
ConnectionStrings__DefaultConnection: "Server=database;Database=MyAppDb;User Id=sa;Password=YourStrong@Passw0rd;TrustServerCertificate=True;"
command: ["./wait-for-it.sh", "database:1433", "--", "dotnet", "MigrationTool.dll"]
app:
build:
context: .
dockerfile: Dockerfile.web
depends_on:
- migrations
environment:
ConnectionStrings__DefaultConnection: "Server=database;Database=MyAppDb;User Id=sa;Password=YourStrong@Passw0rd;TrustServerCertificate=True;"
ports:
- "8080:80"
volumes:
sqldata:
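To apply migrations on demand, you can bring up just the database and the migration container (assuming Docker Compose v2):
docker compose up --build database migrations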
SQL Server:
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
.Build();
PostgreSQL:
var upgrader = DeployChanges.To
.PostgresqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
.Build();
PostgreSQL Script Example:
-- 001_CreateUsersTable.sql
CREATE TABLE users (
id SERIAL PRIMARY KEY,
username VARCHAR(100) NOT NULL UNIQUE,
email VARCHAR(255) NOT NULL UNIQUE,
password_hash VARCHAR(255) NOT NULL,
created_date TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
updated_date TIMESTAMP NULL,
is_active BOOLEAN NOT NULL DEFAULT TRUE
);
CREATE INDEX idx_users_email ON users(email);
CREATE INDEX idx_users_username ON users(username);
MySQL:
var upgrader = DeployChanges.To
.MySqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
.Build();
SQLite:
var upgrader = DeployChanges.To
.SQLiteDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
.Build();
DbUp doesn't have built-in rollback support (by design—SQL rollbacks are complex and error-prone). However, you can implement rollback patterns:
Scripts/
├── Up/
│ ├── 001_CreateUsersTable.sql
│ ├── 002_AddEmailColumn.sql
│ └── 003_CreateIndexes.sql
└── Down/
├── 003_DropIndexes.sql
├── 002_RemoveEmailColumn.sql
└── 001_DropUsersTable.sql
public class DatabaseMigrator
{
public void Upgrade(string connectionString)
{
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(
Assembly.GetExecutingAssembly(),
script => script.Contains(".Up."))
.Build();
upgrader.PerformUpgrade();
}
public void Downgrade(string connectionString, int targetVersion)
{
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(
Assembly.GetExecutingAssembly(),
script => script.Contains(".Down."))
.Build();
// Custom logic to run down scripts in reverse order
// until reaching target version
}
}
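The downgrade half is intentionally left open above. Here is a minimal sketch of one way to fill it in, assuming SQL Server with Microsoft.Data.SqlClient, the default SchemaVersions journal, and Down scripts that reuse the numeric prefix of the Up script they revert and contain no GO batches. The DownMigrator class and its naming convention are illustrative, not production-hardened:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Reflection;
using Microsoft.Data.SqlClient;
public class DownMigrator
{
    // Reverts applied ".Up." scripts, newest first, down to the given script number
    // (inclusive), by running the ".Down." script that shares the same numeric
    // prefix and then deleting the journal row so the Up script can run again.
    public void Downgrade(string connectionString, int revertBackToNumber)
    {
        var assembly = Assembly.GetExecutingAssembly();
        var downScripts = assembly.GetManifestResourceNames()
            .Where(n => n.Contains(".Down.") && n.EndsWith(".sql"))
            .ToList();
        using var connection = new SqlConnection(connectionString);
        connection.Open();
        // Up scripts recorded by DbUp, newest first
        var applied = new List<string>();
        using (var query = new SqlCommand(
            "SELECT ScriptName FROM SchemaVersions ORDER BY Applied DESC", connection))
        using (var reader = query.ExecuteReader())
        {
            while (reader.Read()) applied.Add(reader.GetString(0));
        }
        foreach (var upScript in applied.Where(s => s.Contains(".Up.")))
        {
            var number = ScriptNumber(upScript);
            if (number < revertBackToNumber) break; // reached the target version
            var downScript = downScripts.FirstOrDefault(d => ScriptNumber(d) == number)
                ?? throw new InvalidOperationException($"No Down script found for {upScript}");
            using var stream = assembly.GetManifestResourceStream(downScript)!;
            using var sqlReader = new StreamReader(stream);
            var sql = sqlReader.ReadToEnd();
            // Run the Down script and remove the journal entry in one transaction
            using var tx = connection.BeginTransaction();
            new SqlCommand(sql, connection, tx).ExecuteNonQuery();
            var forget = new SqlCommand(
                "DELETE FROM SchemaVersions WHERE ScriptName = @name", connection, tx);
            forget.Parameters.AddWithValue("@name", upScript);
            forget.ExecuteNonQuery();
            tx.Commit();
            Console.WriteLine($"Reverted {upScript} using {downScript}");
        }
    }
    // Extracts the leading number from names like "MyApp.Scripts.Up.003_CreateIndexes.sql"
    private static int ScriptNumber(string resourceName)
    {
        var fileName = resourceName.Split('.').Reverse().Skip(1).First(); // "003_CreateIndexes"
        return int.Parse(fileName.Split('_')[0]);
    }
}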
If a migration fails in production, create a new forward-only script to fix it:
-- 025_FixBrokenMigration.sql
-- Compensates for issues in 024_AlterUsersTable.sql
-- Restore accidentally dropped column
ALTER TABLE Users
ADD LegacyId INT NULL;
-- Restore data from backup table
UPDATE u
SET u.LegacyId = b.LegacyId
FROM Users u
INNER JOIN Users_Backup_20231215 b ON u.Id = b.Id;
# Test against local database
dotnet run --project MigrationTool -- --ConnectionStrings:DefaultConnection "Server=localhost;Database=MyApp_Test;..."
# Test against Docker container
docker-compose up -d database
dotnet run --project MigrationTool
[Fact]
public void AllMigrationsExecuteSuccessfully()
{
// Arrange
var connectionString = "Server=(localdb)\\mssqllocaldb;Database=DbUpTests;Integrated Security=true;";
// Clean up any existing test database
DropDatabase.For.SqlDatabase(connectionString);
EnsureDatabase.For.SqlDatabase(connectionString);
// Act
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(typeof(DatabaseMigrator).Assembly)
.LogToConsole()
.Build();
var result = upgrader.PerformUpgrade();
// Assert
Assert.True(result.Successful, result.Error?.ToString());
// Verify database state
using var connection = new SqlConnection(connectionString);
var tableCount = connection.ExecuteScalar<int>(
"SELECT COUNT(*) FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'");
Assert.True(tableCount > 0, "No tables were created");
}
[Fact]
public void MigrationsAreIdempotent()
{
var connectionString = "Server=(localdb)\\mssqllocaldb;Database=DbUpIdempotentTest;Integrated Security=true;";
DropDatabase.For.SqlDatabase(connectionString);
EnsureDatabase.For.SqlDatabase(connectionString);
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(typeof(DatabaseMigrator).Assembly)
.Build();
// First run
var result1 = upgrader.PerformUpgrade();
Assert.True(result1.Successful);
// Second run - should be no-op
var result2 = upgrader.PerformUpgrade();
Assert.True(result2.Successful);
Assert.Empty(upgrader.GetScriptsToExecute());
}
# Azure DevOps Pipeline
trigger:
- main
pool:
vmImage: 'ubuntu-latest'
variables:
buildConfiguration: 'Release'
steps:
- task: UseDotNet@2
inputs:
version: '8.x'
- task: DotNetCoreCLI@2
displayName: 'Restore packages'
inputs:
command: 'restore'
- task: DotNetCoreCLI@2
displayName: 'Build migration tool'
inputs:
command: 'build'
projects: '**/MigrationTool.csproj'
arguments: '--configuration $(buildConfiguration)'
- task: DotNetCoreCLI@2
displayName: 'Run migration tests'
inputs:
command: 'test'
projects: '**/MigrationTool.Tests.csproj'
- task: DotNetCoreCLI@2
displayName: 'Publish migration tool'
inputs:
command: 'publish'
publishWebProjects: false
projects: '**/MigrationTool.csproj'
arguments: '--configuration $(buildConfiguration) --output $(Build.ArtifactStagingDirectory)'
- task: PublishBuildArtifacts@1
displayName: 'Publish artifacts'
inputs:
pathToPublish: '$(Build.ArtifactStagingDirectory)'
artifactName: 'migrations'
GitHub Actions:
name: Database Migrations
on:
push:
branches: [ main ]
pull_request:
branches: [ main ]
jobs:
test-migrations:
runs-on: ubuntu-latest
services:
sqlserver:
image: mcr.microsoft.com/mssql/server:2022-latest
env:
SA_PASSWORD: ${{ secrets.SA_PASSWORD }}
ACCEPT_EULA: Y
ports:
- 1433:1433
options: >-
--health-cmd "/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P $SA_PASSWORD -Q 'SELECT 1'"
--health-interval 10s
--health-timeout 5s
--health-retries 5
steps:
- uses: actions/checkout@v3
- name: Setup .NET
uses: actions/setup-dotnet@v3
with:
dotnet-version: '8.0.x'
- name: Restore dependencies
run: dotnet restore
- name: Build
run: dotnet build --no-restore
- name: Test migrations
run: dotnet test --no-build --verbosity normal
env:
ConnectionStrings__DefaultConnection: Server=localhost;Database=TestDb;User Id=sa;Password=${{ secrets.SA_PASSWORD }};TrustServerCertificate=True;
- name: Run migrations
run: dotnet run --project MigrationTool/MigrationTool.csproj
env:
ConnectionStrings__DefaultConnection: Server=localhost;Database=TestDb;User Id=sa;Password=${{ secrets.SA_PASSWORD }};TrustServerCertificate=True;
✅ GOOD:
Scripts/
├── 001_InitialSchema/
│ ├── 001_CreateUsersTable.sql
│ ├── 002_CreateRolesTable.sql
│ └── 003_CreateUserRolesTable.sql
├── 002_AddAuditFields/
│ ├── 004_AlterUsersAddAudit.sql
│ └── 005_AlterOrdersAddAudit.sql
└── 003_Performance/
├── 006_CreateIndexes.sql
└── 007_CreateStoredProcs.sql
❌ BAD:
Scripts/
├── CreateUsers.sql
├── AddStuff.sql
├── Fix.sql
└── Updates.sql
✅ GOOD - One logical change per script:
-- 015_AddUserEmailIndex.sql
CREATE INDEX IX_Users_Email ON Users(Email);
❌ BAD - Multiple unrelated changes:
-- 015_MiscChanges.sql
CREATE INDEX IX_Users_Email ON Users(Email);
ALTER TABLE Products ADD Description NVARCHAR(MAX);
CREATE TABLE Categories (Id INT PRIMARY KEY);
Related objects that ship together can share one script; separate the batches with GO:
-- 020_CreateMultipleObjects.sql
CREATE TABLE Orders (
Id INT PRIMARY KEY,
CustomerId INT NOT NULL
);
GO
CREATE INDEX IX_Orders_CustomerId ON Orders(CustomerId);
GO
CREATE PROCEDURE usp_GetOrder
@OrderId INT
AS
BEGIN
SELECT * FROM Orders WHERE Id = @OrderId;
END;
GO
-- 025_CreateEmailIndex.sql
-- Check if index already exists
IF NOT EXISTS (
SELECT 1 FROM sys.indexes
WHERE name = 'IX_Users_Email'
AND object_id = OBJECT_ID('Users')
)
BEGIN
CREATE INDEX IX_Users_Email ON Users(Email);
END;
GO
-- 030_MigrateUserData.sql
-- Create backup
SELECT * INTO Users_Backup_20231215 FROM Users;
GO
-- Perform migration
UPDATE Users
SET NewColumn = CASE
WHEN OldColumn = 'A' THEN 1
WHEN OldColumn = 'B' THEN 2
ELSE 3
END;
GO
-- Verify migration
DECLARE @ErrorCount INT;
SELECT @ErrorCount = COUNT(*)
FROM Users
WHERE NewColumn IS NULL AND OldColumn IS NOT NULL;
IF @ErrorCount > 0
BEGIN
RAISERROR('Data migration failed: %d records with NULL NewColumn', 16, 1, @ErrorCount);
-- Rollback would happen automatically if in transaction
END;
GO
# Include migration tool in solution
Solution/
├── MyApp.Web/
├── MyApp.Core/
├── MyApp.Data/
└── MyApp.Migrations/
├── Scripts/
│ ├── 001_CreateSchema.sql
│ ├── 002_SeedData.sql
│ └── ...
├── Program.cs
└── MyApp.Migrations.csproj
# Run migrations as part of deployment
git pull origin main
dotnet build MyApp.Migrations
dotnet run --project MyApp.Migrations
dotnet run --project MyApp.Web
var environment = Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT");
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(
Assembly.GetExecutingAssembly(),
script => script.Contains(".Common.") ||
script.Contains($".{environment}."))
.Build();
Script organization:
Scripts/
├── Common/
│ ├── 001_CreateUsersTable.sql
│ └── 002_CreateOrdersTable.sql
├── Development/
│ ├── 100_SeedTestUsers.sql
│ └── 101_SeedTestOrders.sql
└── Production/
└── 200_CreateProductionIndexes.sql
public class DetailedUpgradeLog : IUpgradeLog
{
private readonly ILogger _logger;
private readonly List<string> _executedScripts = new();
public DetailedUpgradeLog(ILogger logger)
{
_logger = logger;
}
public void WriteInformation(string format, params object[] args)
{
var message = string.Format(format, args);
_logger.LogInformation(message);
if (message.StartsWith("Executing"))
{
_executedScripts.Add(message);
}
}
public void WriteError(string format, params object[] args)
{
_logger.LogError(string.Format(format, args));
}
public void WriteWarning(string format, params object[] args)
{
_logger.LogWarning(string.Format(format, args));
}
public IReadOnlyList<string> ExecutedScripts => _executedScripts.AsReadOnly();
}
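A possible way to wire this logger in and report what ran afterwards (a small usage sketch; logger is assumed to be an ILogger supplied by the host):
var upgradeLog = new DetailedUpgradeLog(logger);
var upgrader = DeployChanges.To
    .SqlDatabase(connectionString)
    .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
    .LogTo(upgradeLog) // route DbUp output through the custom logger
    .Build();
var result = upgrader.PerformUpgrade();
// The collected "Executing ..." lines give a simple audit trail of this run
foreach (var line in upgradeLog.ExecutedScripts)
{
    logger.LogInformation("DbUp: {Message}", line);
}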
-- Phase 1: 050_AddNewColumn.sql
ALTER TABLE Users
ADD NewEmailFormat NVARCHAR(255) NULL;
GO
-- Phase 2: 051_PopulateNewColumn.sql
UPDATE Users
SET NewEmailFormat = LOWER(Email)
WHERE NewEmailFormat IS NULL;
GO
-- Phase 3: Deploy application code that uses NewEmailFormat
-- Phase 4: 052_MakeColumnRequired.sql
ALTER TABLE Users
ALTER COLUMN NewEmailFormat NVARCHAR(255) NOT NULL;
GO
-- Phase 5: 053_DropOldColumn.sql
ALTER TABLE Users
DROP COLUMN Email;
GO
EXEC sp_rename 'Users.NewEmailFormat', 'Email', 'COLUMN';
GO
-- 060_MigrateLargeDataset.sql
-- Process in batches to avoid locking
DECLARE @BatchSize INT = 10000;
DECLARE @RowsAffected INT = @BatchSize;
WHILE @RowsAffected = @BatchSize
BEGIN
UPDATE TOP (@BatchSize) Products
SET NormalizedName = UPPER(LTRIM(RTRIM(Name)))
WHERE NormalizedName IS NULL;
SET @RowsAffected = @@ROWCOUNT;
-- Small delay to prevent blocking
WAITFOR DELAY '00:00:01';
END;
GO
-- 070_CreateUserProc_v2.sql
-- Drop old version
IF EXISTS (SELECT 1 FROM sys.objects WHERE name = 'usp_GetUser')
BEGIN
DROP PROCEDURE usp_GetUser;
END;
GO
-- Create new version with additional functionality
CREATE PROCEDURE usp_GetUser
@UserId INT,
@IncludeDeleted BIT = 0 -- New parameter
AS
BEGIN
SET NOCOUNT ON;
SELECT
Id,
Username,
Email,
IsActive,
IsDeleted -- New column
FROM Users
WHERE Id = @UserId
AND (@IncludeDeleted = 1 OR IsDeleted = 0);
END;
GO
Problem: Script marked as executed but needs to run again
Solution:
-- Manually remove from tracking table
DELETE FROM SchemaVersions
WHERE ScriptName LIKE '%015_ProblematicScript.sql'; -- the journal stores the full embedded resource name
// Or register specific scripts to always run (useful for reapplying procs/views).
// Assumes a DbUp version that supports SqlScriptOptions / ScriptType.RunAlways
// (using DbUp.Engine; using DbUp.Support;)
var upgrader = DeployChanges.To
    .SqlDatabase(connectionString)
    .WithScriptsEmbeddedInAssembly(
        Assembly.GetExecutingAssembly(),
        script => !script.EndsWith("AlwaysRun.sql"))
    .WithScriptsEmbeddedInAssembly(
        Assembly.GetExecutingAssembly(),
        script => script.EndsWith("AlwaysRun.sql"),
        new SqlScriptOptions { ScriptType = ScriptType.RunAlways })
    .Build();
Problem: Long-running migrations causing timeouts
Solution:
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
.WithTransactionPerScript() // Smaller transaction scope
.WithExecutionTimeout(TimeSpan.FromMinutes(10)) // Longer timeout
.Build();
Problem: Scripts running in wrong order
Solution:
❌ BAD naming:
Script1.sql
Script2.sql
Script10.sql // Sorts after Script1 but before Script2!
✅ GOOD naming:
001_Script.sql
002_Script.sql
010_Script.sql
Problem: Multiple statements failing
Solution:
-- DbUp handles GO statements automatically
CREATE TABLE Test (Id INT);
GO
INSERT INTO Test VALUES (1);
GO
-- Not needed:
-- EXEC('CREATE TABLE Test (Id INT)'); -- Don't do this!
Use DbUp when: you're working with existing or DBA-managed databases, need precise control over the SQL that runs, or want migrations that are plain SQL files in source control.
Use Entity Framework when: you're building a greenfield, code-first application with simple entities and want schema changes generated from your model.
Use Flyway when: your team lives in the JVM ecosystem and wants the same plain-SQL, script-based approach with Java-native tooling.
var upgrader = DeployChanges.To
.SqlDatabase(connectionString)
.WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
.LogToConsole()
.Build();
var result = upgrader.PerformUpgrade();
// Transaction per script
.WithTransactionPerScript()
// No transactions
.WithoutTransaction()
// Custom timeout
.WithExecutionTimeout(TimeSpan.FromMinutes(5))
// Variable substitution
.WithVariables(new Dictionary<string, string> { { "var", "value" } })
// Filter scripts
.WithScriptsEmbeddedInAssembly(assembly, s => s.EndsWith(".sql"))
// From file system
.WithScriptsFromFileSystem("./Scripts")
DeployChanges.To.SqlDatabase(connectionString) // SQL Server
DeployChanges.To.PostgresqlDatabase(connectionString) // PostgreSQL
DeployChanges.To.MySqlDatabase(connectionString) // MySQL
DeployChanges.To.SQLiteDatabase(connectionString) // SQLite
DbUp embodies the Unix philosophy: do one thing and do it well. It runs SQL scripts in order and tracks what's been executed. That's it. No clever abstractions, no code generation, no magic.
This simplicity is its strength. Your migrations are SQL files in source control—readable, reviewable, and debuggable. DBAs can contribute without learning a framework. Developers can use the full power of SQL without fighting an ORM's limitations.
For teams that value transparency, control, and simplicity in their database deployment process, DbUp is the pragmatic choice. It gets out of your way and lets you focus on what matters: writing good SQL and shipping reliable software.
DbUp was created by Paul Stovell and is maintained by the .NET open-source community. This guide draws upon community best practices, real-world deployment patterns, and years of production usage across thousands of applications.