Hi everyone, today in this blog we are going to talk about version-based database migration with Liquibase.
Creating the database for your application seems easy, as long as you don't need to support multiple versions or work in a huge team. You just generate an SQL script from your database model or, in the worst case, export the required statements from your test database.
You can then execute it manually or use JPA to do that automatically. This approach stops working as soon as you need to support multiple versions of your application, and extracting the required changes from the test database becomes a huge mess as the size of your team and the number of changes increase. You can avoid some of these problems by creating the database update scripts while working on the next release and storing them alongside your code in Git or whichever version control system you use. Whenever someone gets the latest code changes, they also get the required database changes. With the right tooling in place, you can even apply these changes automatically. One tool that can help you with that is Liquibase.
What is Liquibase?
Liquibase is an open source tool released under the Apache License 2.0. It provides a set of tools to implement an automated, version-based database migration for your application. Some of its benefits are the sequential execution of all required updates to get to the most recent database version, the automatic creation and execution of rollback operations for failed updates, and the database-independent definition of the update operations. In this blog we're going to show you how to use all of these benefits in your application.
Let's start with the creation of your database before we look at how to update it. The database changeLog is an XML, JSON, YAML, or SQL file that describes all changes that need to be performed to update the database.
In most cases, you want to create one file for each release. Each file consists of one or more change sets. A changeSet describes a set of changes that Liquibase executes within one transaction. You can add as many changes to a set as you like, but to avoid any problems during a rollback, you shouldn't define more than one logical change per set. Each changeSet gets identified by the name of the author and an id. Liquibase stores this information, together with the name of the changeLog file, in the DATABASECHANGELOG table to keep track of the executed change sets.
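In practice, the per-release files are usually tied together by a master changeLog that includes them in order. A minimal sketch could look like this (the file names and the schema version in the header are our own assumptions):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.5.xsd">

    <!-- one file per release, executed sequentially -->
    <include file="db/changelog/db.changelog-1.0.xml"/>
    <include file="db/changelog/db.changelog-1.1.xml"/>
</databaseChangeLog>
```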
Let's have a look at three changeSets that create a new database table, a primary key, and a foreign key constraint. Liquibase allows you to do a lot more than that. You can change existing tables, define execution requirements, provide rollback commands to handle errors, and load data. But for now, let's focus on the creation of a new database.
Here, you can see a changeSet that creates the table author with the columns id, firstname, lastname and version.
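A minimal sketch of such a changeSet could look like the following (the author name, id, and column types are our own assumptions):

```xml
<changeSet author="thorben" id="1">
    <createTable tableName="author">
        <column name="id" type="BIGINT">
            <constraints nullable="false"/>
        </column>
        <column name="firstname" type="VARCHAR(255)"/>
        <column name="lastname" type="VARCHAR(255)"/>
        <column name="version" type="INT">
            <constraints nullable="false"/>
        </column>
    </createTable>
</changeSet>
```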
The createTable element has to define the name of the table you want to create. On top of that, you can specify other attributes, like the name of the database schema or the tablespace. You also need to provide at least one column tag as a nested property.
In the above example, we used four of these tags to define the four database columns of the author table. The name and type attributes are required to define a new column. Liquibase also supports a set of additional attributes. You can use them to provide a default value, specify the encoding, or set a comment. You can also use a constraints tag to define a primary key, not null, unique, foreign key, or cascade constraint. We used it in this example to create not null constraints for the id and version columns.
As you can see, the definition of the changeSet is simple and doesn't require any SQL. As always, this can be a good and a bad thing. Liquibase will generate the SQL statements for the database you want to use. That allows you to create database-independent update scripts, but it also puts you at risk of executing unexpected statements. You should therefore always test the generated SQL statements.
We'll show you how to export and run the SQL statements at the end of this article. If you didn't define the primary key when you created the table, you can add the constraint with an addPrimaryKey tag. We use it here to add the primary key constraint to the id column of the book table. The tableName and columnNames attributes are mandatory.
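Such a changeSet could be sketched like this (the changeSet author and id are illustrative):

```xml
<changeSet author="thorben" id="2">
    <!-- only tableName and columnNames are mandatory;
         constraintName, schemaName, etc. are optional -->
    <addPrimaryKey tableName="book" columnNames="id"/>
</changeSet>
```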
You can also provide additional attributes to specify the name of the constraint, the schema, the catalog, or the tablespace. And you can add a foreign key constraint with an addForeignKeyConstraint tag. You then need to provide the name of the constraint, plus the baseColumnNames and baseTableName attributes, which define the column and table to which you want to add the constraint, and the referencedColumnNames and referencedTableName attributes, which define the column and table the foreign key points to.
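A sketch of such a changeSet might look like this (the book.author_id column referencing author.id is an assumed table model):

```xml
<changeSet author="thorben" id="3">
    <addForeignKeyConstraint constraintName="fk_book_author"
                             baseTableName="book" baseColumnNames="author_id"
                             referencedTableName="author" referencedColumnNames="id"/>
</changeSet>
```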
So, now you understand the basic tags you can use to create the initial database for your application. But don't worry, you don't need to write this file yourself if you already have an existing database. You might want to add Liquibase to an existing application, or you might use a tool to model your tables and export the model.
In these situations, it’s much easier to let Liquibase generate the required statements. You can do that by providing the connection information and the generateChangeLog command to the Liquibase command line client.
Here you can see an example for our local PostgreSQL database. We tell Liquibase to use PostgreSQL's JDBC driver to connect to the recipes schema of our local database, to generate a changeLog that creates an identical database, and to write it to the db.changelog-1.0.xml file in the myFiles directory.
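Such a call could look roughly like the following sketch (the driver jar path, credentials, and database name are assumptions about the local setup):

```
liquibase --driver=org.postgresql.Driver \
          --classpath=/path/to/postgresql.jar \
          --url="jdbc:postgresql://localhost:5432/recipes" \
          --username=postgres \
          --password=postgres \
          --changeLogFile=myFiles/db.changelog-1.0.xml \
          generateChangeLog
```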
You can see the generated changeLog here.
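A generated changeLog typically looks something like the following sketch (the concrete author name, ids, and column types depend on your user and database):

```xml
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.5.xsd">

    <changeSet author="thorben (generated)" id="1472036881462-1">
        <createTable tableName="author">
            <column name="id" type="BIGINT">
                <constraints nullable="false"/>
            </column>
            <column name="firstname" type="VARCHAR(255)"/>
            <column name="lastname" type="VARCHAR(255)"/>
            <column name="version" type="INT"/>
        </createTable>
    </changeSet>
</databaseChangeLog>
```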
As you can see, Liquibase used the current username as the author and added "(generated)" as a postfix, and it also generated a prefix for each id. If you already have a database, generating the changeLog is a lot easier and faster than writing it yourself. But as always, don't forget to review the generated changeSets so that you can be sure that they meet your expectations.
Before you execute the changeLog, you should always export and review the generated SQL statements. Some database administrators also need the script for their internal documentation; otherwise, they flatly refuse to perform any changes they didn't review themselves. In all these situations, you can use Liquibase's command line client. It will generate the required SQL statements and write them to a file.
Here you can see an example in which we tell Liquibase to connect to our local PostgreSQL database, determine the required updates, and generate the SQL statements. After you've created and checked the changeLog yourself, or used the command line client to create it, you can choose between multiple options to execute it. We're using a command line tool here, but you can also use a Maven plugin to create the database as part of your build or deployment process, or a Servlet, Spring, or CDI listener to automatically create or update the database at application startup.
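Assuming the same connection options as before (driver jar path and credentials are again illustrative), the SQL export could be sketched like this:

```
# write the SQL for all pending changeSets to a file for review
liquibase --driver=org.postgresql.Driver \
          --classpath=/path/to/postgresql.jar \
          --url="jdbc:postgresql://localhost:5432/recipes" \
          --username=postgres \
          --password=postgres \
          --changeLogFile=myFiles/db.changelog-1.0.xml \
          updateSQL > update-1.0.sql
```

Running the same command with update instead of updateSQL executes the pending changeSets directly against the database.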
Based on the changeLog, Liquibase creates the tables and sequences in the test_liquibase schema. Liquibase documents the execution of all changeSets in the DATABASECHANGELOG table. It will use this information during future runs of the update process to determine which changeSets still need to be executed.
Okay, that’s it for today. I hope this article was useful to you. Please feel free to drop any comments, questions or suggestions.