Hibernate integration testing strategies

Introduction

I like integration testing. As I explained in this article, it's a good way to check what SQL queries Hibernate generates behind the scenes. But integration tests require a running database server, and that's the first choice you have to make.

Using a production-like local database server for integration testing

For a production environment, I always prefer using incremental DDL scripts, since I can always tell which version is deployed on a given server and which scripts still need to be deployed. I've been relying on Flyway to manage the schema updates for me, and I'm very content with it.

On a small project, where the number of integration tests is rather small, you can employ a production-like local database server for testing as well. This is the safest option, since it guarantees you're testing against an environment that closely resembles the production setup.

Some people believe that using a production-like environment would slow down test execution, but that's not the case. Nowadays, you can use Docker with tmpfs to speed up your tests and run them almost as fast as with an in-memory database.
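As a sketch, a Docker Compose service that keeps the database data directory on tmpfs could look like this (the image, service name, and credentials are illustrative assumptions, not part of the original project):

```yaml
# docker-compose.yml (hypothetical) – a PostgreSQL test instance
# whose data directory lives in RAM via tmpfs
services:
  test-db:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: test
    tmpfs:
      - /var/lib/postgresql/data   # data is kept in memory, so I/O is very fast
    ports:
      - "5432:5432"
```

Because the data directory never touches disk, schema creation and teardown between tests become nearly as cheap as with an in-memory database.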

In-memory database integration testing

Another option is to choose an in-memory database for integration testing. There are many in-memory databases you could choose from: HSQLDB, H2, Apache Derby, to name a few.

I've been using two in-memory schema generation strategies, each with its own pros and cons, which I'll explain below.

Making use of hibernate.hbm2ddl.auto="update"

When using Hibernate, the database schema generation can be customized using the hibernate.hbm2ddl.auto configuration property.

The simplest way to deploy a schema is to use the update option. This is useful for testing purposes, but I wouldn't rely on it for a production environment, for which incremental DDL scripts are a much better approach.

Let's start with the JPA configuration, which you can find in the persistence.xml file:

<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.0"
             xmlns="http://java.sun.com/xml/ns/persistence"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
    <persistence-unit name="testPersistenceUnit" transaction-type="JTA">
        <provider>org.hibernate.jpa.HibernatePersistenceProvider</provider>
        <exclude-unlisted-classes>false</exclude-unlisted-classes>

        <properties>
            <property name="hibernate.archive.autodetection"
                      value="class, hbm"/>
            <property name="hibernate.transaction.jta.platform"
                      value="org.hibernate.service.jta.platform.internal.BitronixJtaPlatform" />
            <property name="hibernate.dialect"
                      value="org.hibernate.dialect.HSQLDialect"/>
            <property name="hibernate.hbm2ddl.auto"
                      value="update"/>
            <property name="hibernate.show_sql"
                      value="true"/>
        </properties>
    </persistence-unit>
</persistence>

And the dataSource configuration looks like:

<bean id="dataSource" 
      class="bitronix.tm.resource.jdbc.PoolingDataSource" 
      init-method="init"
      destroy-method="close">
    <property name="className" value="bitronix.tm.resource.jdbc.lrc.LrcXADataSource"/>
    <property name="uniqueName" value="testDataSource"/>
    <property name="minPoolSize" value="0"/>
    <property name="maxPoolSize" value="5"/>
    <property name="allowLocalTransactions" value="false"/>
    <property name="driverProperties">
        <props>
            <prop key="user">${jdbc.username}</prop>
            <prop key="password">${jdbc.password}</prop>
            <prop key="url">${jdbc.url}</prop>
            <prop key="driverClassName">${jdbc.driverClassName}</prop>
        </props>
    </property>
</bean>
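The ${jdbc.*} placeholders are resolved from a properties file. For an in-memory HSQLDB instance, it might look like this (the database name and credentials are assumptions, not values from the original project):

```properties
# jdbc.properties (illustrative)
jdbc.driverClassName=org.hsqldb.jdbc.JDBCDriver
jdbc.url=jdbc:hsqldb:mem:test
jdbc.username=sa
jdbc.password=
```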

Bitronix is a very reliable stand-alone JTA transaction manager. When you are developing Java EE applications, the transaction manager is supplied by the application server. For Spring-based projects, however, we must employ a stand-alone transaction manager if we want to use XA transactions.

When using JTA, it's not advisable to mix XA and local transactions, since not all XA data sources allow operating inside a local transaction. Unfortunately, as simple as this DDL generation method is, it has one flaw I'm not too fond of: I can't disable the allowLocalTransactions setting, because Hibernate creates and updates the DDL schema outside of an XA transaction.

Another drawback is that you have little control over what DDL script Hibernate deploys on your behalf, and in this particular context I don't like trading flexibility for convenience.

If you don't use JTA, and you don't need the flexibility of deciding which DDL schema gets deployed on your current database server, then hibernate.hbm2ddl.auto="update" is probably the right choice for you.

Flexible schema deploy

This method consists of two steps: first, have Hibernate generate the DDL scripts, and then deploy them in a customized fashion.

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-antrun-plugin</artifactId>
    <executions>
        <execution>
            <id>generate-test-sql-scripts</id>
            <phase>generate-test-resources</phase>
            <goals>
                <goal>run</goal>
            </goals>
            <configuration>
                <tasks>
                    <property name="maven_test_classpath" refid="maven.test.classpath"/>
                    <path id="hibernate_tools_path">
                        <pathelement path="${maven_test_classpath}"/>
                    </path>
                    <property name="hibernate_tools_classpath" refid="hibernate_tools_path"/>
                    <taskdef name="hibernatetool"
                             classname="org.hibernate.tool.ant.HibernateToolTask"/>
                    <mkdir dir="${project.build.directory}/test-classes/hsqldb"/>
                    <hibernatetool destdir="${project.build.directory}/test-classes/hsqldb">
                        <classpath refid="hibernate_tools_path"/>
                        <jpaconfiguration persistenceunit="testPersistenceUnit"
                                          propertyfile="src/test/resources/META-INF/spring/jdbc.properties"/>
                        <hbm2ddl drop="false" create="true" export="false"
                                 outputfilename="create_db.sql"
                                 delimiter=";" format="true"/>
                        <hbm2ddl drop="true" create="false" export="false"
                                 outputfilename="drop_db.sql"
                                 delimiter=";" format="true"/>
                    </hibernatetool>
                </tasks>
            </configuration>
        </execution>
    </executions>
</plugin>
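For illustration, assuming a simple Post entity (hypothetical, not part of the original project), the generated scripts would contain HSQLDB DDL along these lines:

```sql
-- create_db.sql (illustrative output)
create table post (
    id bigint not null,
    title varchar(255),
    primary key (id)
);

-- drop_db.sql (illustrative output)
drop table post if exists;
```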

With the create_db.sql and drop_db.sql DDL scripts in place, we now have to deploy them when the Spring context starts, which is done using the following custom utility class:

public class DatabaseScriptLifecycleHandler implements InitializingBean, DisposableBean {

    private static final org.slf4j.Logger log = LoggerFactory.getLogger(DatabaseScriptLifecycleHandler.class);

    private final Resource[] initScripts;
    private final Resource[] destroyScripts;

    private final JdbcTemplate jdbcTemplate;

    @Autowired
    private TransactionTemplate transactionTemplate;

    private String sqlScriptEncoding = "UTF-8";
    private String commentPrefix = "--";
    private boolean continueOnError;
    private boolean ignoreFailedDrops;
    private boolean transactional = true;

    public DatabaseScriptLifecycleHandler(DataSource dataSource,
                                          Resource[] initScripts,
                                          Resource[] destroyScripts) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
        this.initScripts = initScripts;
        this.destroyScripts = destroyScripts;
    }

    public Resource[] getInitScripts() {
        return initScripts;
    }

    public Resource[] getDestroyScripts() {
        return destroyScripts;
    }

    public String getCommentPrefix() {
        return commentPrefix;
    }

    public void setCommentPrefix(String commentPrefix) {
        this.commentPrefix = commentPrefix;
    }

    public boolean isContinueOnError() {
        return continueOnError;
    }

    public void setContinueOnError(boolean continueOnError) {
        this.continueOnError = continueOnError;
    }

    public boolean isIgnoreFailedDrops() {
        return ignoreFailedDrops;
    }

    public void setIgnoreFailedDrops(boolean ignoreFailedDrops) {
        this.ignoreFailedDrops = ignoreFailedDrops;
    }

    public String getSqlScriptEncoding() {
        return sqlScriptEncoding;
    }

    public void setSqlScriptEncoding(String sqlScriptEncoding) {
        this.sqlScriptEncoding = sqlScriptEncoding;
    }

    public boolean isTransactional() {
        return transactional;
    }

    public void setTransactional(boolean transactional) {
        this.transactional = transactional;
    }

    public void afterPropertiesSet() throws Exception {
        initDatabase();
    }

    public void destroy() throws Exception {
        destroyDatabase();
    }

    public void initDatabase() {
        if (transactional) {
            transactionTemplate.execute(new TransactionCallback<Void>() {
                @Override
                public Void doInTransaction(TransactionStatus status) {
                    runInitScripts();
                    return null;
                }
            });
        } else {
            runInitScripts();
        }
    }

    private void runInitScripts() {
        final ResourceDatabasePopulator resourceDatabasePopulator = createResourceDatabasePopulator();
        jdbcTemplate.execute(new ConnectionCallback<Void>() {
            @Override
            public Void doInConnection(Connection con) throws SQLException, DataAccessException {
                resourceDatabasePopulator.setScripts(getInitScripts());
                resourceDatabasePopulator.populate(con);
                return null;
            }
        });
    }

    public void destroyDatabase() {
        if (transactional) {
            transactionTemplate.execute(new TransactionCallback<Void>() {
                @Override
                public Void doInTransaction(TransactionStatus status) {
                    runDestroyScripts();
                    return null;
                }
            });
        } else {
            runDestroyScripts();
        }
    }

    private void runDestroyScripts() {
        final ResourceDatabasePopulator resourceDatabasePopulator = createResourceDatabasePopulator();
        jdbcTemplate.execute(new ConnectionCallback<Void>() {
            @Override
            public Void doInConnection(Connection con) throws SQLException, DataAccessException {
                resourceDatabasePopulator.setScripts(getDestroyScripts());
                resourceDatabasePopulator.populate(con);
                return null;
            }
        });
    }

    protected ResourceDatabasePopulator createResourceDatabasePopulator() {
        ResourceDatabasePopulator resourceDatabasePopulator = new ResourceDatabasePopulator();
        resourceDatabasePopulator.setCommentPrefix(getCommentPrefix());
        resourceDatabasePopulator.setContinueOnError(isContinueOnError());
        resourceDatabasePopulator.setIgnoreFailedDrops(isIgnoreFailedDrops());
        resourceDatabasePopulator.setSqlScriptEncoding(getSqlScriptEncoding());
        return resourceDatabasePopulator;
    }
}

which is configured as:

<bean id="databaseScriptLifecycleHandler" 
      class="com.vladmihalcea.util.DatabaseScriptLifecycleHandler"
      depends-on="transactionManager">
    <constructor-arg name="dataSource" ref="dataSource"/>
    <constructor-arg name="initScripts">
        <array>
            <bean class="org.springframework.core.io.ClassPathResource">
                <constructor-arg value="hsqldb/create_db.sql"/>
            </bean>
            <bean class="org.springframework.core.io.ClassPathResource">
                <constructor-arg value="hsqldb/init_functions.sql"/>
            </bean>
        </array>
    </constructor-arg>
    <constructor-arg name="destroyScripts">
        <array>
            <bean class="org.springframework.core.io.ClassPathResource">
                <constructor-arg value="hsqldb/drop_functions.sql"/>
            </bean>
            <bean class="org.springframework.core.io.ClassPathResource">
                <constructor-arg value="hsqldb/drop_db.sql"/>
            </bean>
        </array>
    </constructor-arg>
</bean>
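Under the hood, Spring's ResourceDatabasePopulator parses each script into individual statements using the configured comment prefix and statement delimiter. The following stdlib-only sketch illustrates that splitting logic in a simplified form (the SqlScriptSplitter class is a hypothetical helper for illustration, not Spring's actual implementation, which also handles quoted literals and block comments):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Pattern;

public class SqlScriptSplitter {

    // Splits a DDL/DML script into individual statements,
    // skipping blank lines and comment lines, roughly the way
    // ResourceDatabasePopulator applies commentPrefix and delimiter.
    public static List<String> splitSqlScript(
            String script, String commentPrefix, String delimiter) {
        StringBuilder merged = new StringBuilder();
        for (String line : script.split("\n")) {
            String trimmed = line.trim();
            if (trimmed.isEmpty() || trimmed.startsWith(commentPrefix)) {
                continue; // drop blank lines and "--" comments
            }
            merged.append(trimmed).append(' ');
        }
        List<String> statements = new ArrayList<>();
        for (String stmt : merged.toString().split(Pattern.quote(delimiter))) {
            String s = stmt.trim();
            if (!s.isEmpty()) {
                statements.add(s);
            }
        }
        return statements;
    }

    public static void main(String[] args) {
        String script = "-- schema\nCREATE TABLE post (id BIGINT);\nDROP TABLE post;";
        // prints [CREATE TABLE post (id BIGINT), DROP TABLE post]
        System.out.println(splitSqlScript(script, "--", ";"));
    }
}
```

This is why the commentPrefix and delimiter settings on the handler above matter: they must match what the hbm2ddl Ant task emitted (delimiter=";" and "--" comments in our case).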

This time, we can get rid of local transactions altogether, so we can safely set:

<property name="allowLocalTransactions" value="false" />
