How to customize the JDBC batch size for each Persistence Context with Hibernate

(Last Updated On: January 29, 2018)


JDBC batching has a significant impact on reducing transaction response time. As previously explained, you can enable batching for INSERT, UPDATE and DELETE statements with just one configuration property:

<property name="hibernate.jdbc.batch_size" value="5"/>

However, this setting affects every Persistence Context, therefore every business use case inherits the same JDBC batch size. Although the hibernate.jdbc.batch_size configuration property is extremely useful, it would be great if we could customize the JDBC batch size on a per Persistence Context basis. This article demonstrates how easily you can accomplish this task.

Time to upgrade

Hibernate 5.2 adds support for customizing the JDBC batch size at the Persistence Context level, as illustrated by the following example:

int entityCount = 20;

doInJPA(entityManager -> {
    entityManager
        .unwrap( Session.class )
        .setJdbcBatchSize( 10 );

    for ( long i = 0; i < entityCount; ++i ) {
        Post post = new Post( i,
            String.format( "Post nr %d", i )
        );
        entityManager.persist( post );
    }
});

In the test case above, the Hibernate Session is configured to use a JDBC batch size of 10.

When inserting 20 Post entities, Hibernate is going to generate the following SQL statements:

INSERT INTO post (name, id) 
VALUES 
    ('Post nr 0', 0), ('Post nr 1', 1), 
    ('Post nr 2', 2), ('Post nr 3', 3), 
    ('Post nr 4', 4), ('Post nr 5', 5), 
    ('Post nr 6', 6), ('Post nr 7', 7), 
    ('Post nr 8', 8), ('Post nr 9', 9)

INSERT INTO post (name, id) 
VALUES 
    ('Post nr 10', 10), ('Post nr 11', 11), 
    ('Post nr 12', 12), ('Post nr 13', 13), 
    ('Post nr 14', 14), ('Post nr 15', 15), 
    ('Post nr 16', 16), ('Post nr 17', 17), 
    ('Post nr 18', 18), ('Post nr 19', 19)

As you can see, the JDBC batch size allows us to execute only 2 database roundtrips instead of 20.
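The round-trip arithmetic generalizes: the number of database round trips is the entity count divided by the batch size, rounded up. Here is a tiny self-contained sketch (the `BatchMath` class is hypothetical, just for illustration; it is not part of Hibernate):

```java
public class BatchMath {

    // Number of JDBC round trips needed to execute `entityCount`
    // statements when they are grouped into batches of `batchSize`.
    static int roundTrips(int entityCount, int batchSize) {
        return (entityCount + batchSize - 1) / batchSize; // ceiling division
    }

    public static void main(String[] args) {
        // 20 inserts with a batch size of 10: 2 round trips
        System.out.println(roundTrips(20, 10));
        // 20 inserts with no batching (batch size 1): 20 round trips
        System.out.println(roundTrips(20, 1));
    }
}
```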



The Session-level JDBC batch size configuration is a very useful feature that Hibernate 5.2 has to offer, and you should definitely use it to tailor the JDBC batch size based on the underlying business use case requirements.



8 thoughts on “How to customize the JDBC batch size for each Persistence Context with Hibernate”

  1. I use WildFly, which comes with an older version of Hibernate: 5.0.10 in WildFly 10.
    Can I modify the batch size in that transaction by just setting the property on the entity manager:

    public class UtilService {

        @PersistenceContext(unitName = "myDB")
        private EntityManager entityManager;

        private void doSomething() {
            entityManager.setProperty("hibernate.jdbc.batch_size", 1000);
        }
    }

    or will that configure nothing or, even worse, set the configuration outside the transaction?

    Best regards,

    1. It will not work. The only way to make it work in Hibernate 5.0 is to set the hibernate.jdbc.batch_size property in the persistence.xml configuration file. Only since Hibernate 5.2 can you override the global SessionFactory batch size at the Session level.
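    For reference, the Hibernate 5.0 approach mentioned above might look like this in persistence.xml (the unit name and the value are illustrative, not prescribed):

    ```xml
    <persistence-unit name="myDB">
        <properties>
            <!-- Global JDBC batch size, inherited by every Persistence Context -->
            <property name="hibernate.jdbc.batch_size" value="100"/>
        </properties>
    </persistence-unit>
    ```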

  2. Totally appreciate your answer. When I used session.update, the update worked perfectly, as you said. Following that:

    The batch insert worked perfectly for me as well and was fast. I used enhanced-sequence, as follows:

    @GenericGenerator(
        name = "ID_GENERATOR_POOLED",
        strategy = "enhanced-sequence",
        parameters = {
            @Parameter(name = "sequence_name", value = "JPWH_SEQUENCE"),
            @Parameter(name = "increment_size", value = "500"),
            @Parameter(name = "optimizer", value = "pooled-lo")
        }
    )
    @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "ID_GENERATOR_POOLED")

    SQL server 2012
    Hibernate 5.2.10.Final

    I got a problem where Hibernate creates a table instead of a sequence. I first tried to let Hibernate create the schema for me (i.e. using “”), but it creates a table, not a sequence.

    I then created the sequence manually (i.e. “CREATE SEQUENCE …. AS BigInt …”) and removed the property “”, but it still did not work, because Hibernate executed this query and then got this exception:

    select next_val as id_val 
    from JPWH_SEQUENCE with (updlock, …)

    2017-07-01 19:49:51.365 [main] … – could not read a hi value

    It is clear that Hibernate tries to read from a table, not a sequence.

    From Hibernate documentation book: “enhanced-sequence—Uses a native database sequence when supported; otherwise falls back to an extra database table with a single column and row, emulating a sequence”

    SQL Server 2012 supports sequences, so why does Hibernate create a table here? What am I missing?
    The idea is that I don’t want a table, as you recommended, and I don’t want to have a table for each entity.
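    One possible explanation (an assumption, since the dialect configuration is not shown in the comment) is that Hibernate was configured with a pre-2012 SQL Server dialect, which reports no sequence support, so enhanced-sequence falls back to its table emulation. A sketch of the dialect setting that would enable native sequences:

    ```xml
    <!-- persistence.xml: a sequence-capable dialect; with the generic
         SQLServerDialect, enhanced-sequence falls back to a table -->
    <property name="hibernate.dialect"
              value="org.hibernate.dialect.SQLServer2012Dialect"/>
    ```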


  3. Hi Vlad,
    I tried to do a batch UPDATE via Hibernate, but it is really slow because it executes a SELECT before each UPDATE (it executes this SELECT to attach the entity). Because the update is so slow, I decided to use the nasty native JDBC solution (JdbcTemplate.batchUpdate) and the IDENTITY generation strategy. Maybe I am missing something. I would appreciate you looking into the following:

    Updating 200 simple entities took 41 seconds.

    I am using Hibernate 5.2 and SQL Server. I set both “hibernate.order_inserts” and “hibernate.order_updates” to true, and set the batch_size property as well.

    Here is my code

    Very simple entity

    @Entity
    public class BaselineTest {

        @Id
        @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "hilo_sequence_generator")
        @GenericGenerator(
                name = "hilo_sequence_generator",
                strategy = "",
                parameters = {
                        @Parameter(name = "sequence_name", value = "hilo_seqeunce"),
                        @Parameter(name = "initial_value", value = "1"),
                        @Parameter(name = "increment_size", value = "3"),
                        @Parameter(name = "optimizer", value = "hilo")
                }
        )
        private Long id;

        @Column(name = "UpdatedBy")
        private String updatedBy;
    }

    Repository code:

    @PersistenceContext(unitName = "entityManagerFactory")
    private EntityManager entityManager;

    public Collection<T> bulkSave(Collection<T> entities, int batchSize) {
        final List<T> savedEntities = new ArrayList<T>(entities.size());
        int i = 0;
        for (T t : entities) {
            savedEntities.add(persistOrMerge(t));
            i++;
            if (i % batchSize == 0) {
                entityManager.flush();
                entityManager.clear();
            }
        }
        return savedEntities;
    }

    private T persistOrMerge(T t) {
        if (t.getId() == null) {
            entityManager.persist(t);
            return t;
        } else {
            return entityManager.merge(t);
        }
    }
    Thanks so much for your time,
