StatelessSession does not flush when using jdbc batch_size > 1
Description
100% Done
Activity

Christian Beikov March 2, 2023 at 11:28 PM
Please create a new issue and attach a reproducer for this if that is still a problem in Hibernate 6

Felix König February 28, 2023 at 1:16 PM
Or, as a generic workaround snippet:
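The snippet itself is not preserved in this export. A generic approach along these lines (class and method names here are illustrative assumptions, not the commenter's actual code) is to open the StatelessSession on the same JDBC connection as the surrounding Session, so its work joins the enclosing transaction:

```java
// Hedged sketch of a generic workaround (assumed shape; the original snippet
// is not preserved here): run the StatelessSession on the connection of the
// surrounding Session so both share one transaction.
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.StatelessSession;

public final class StatelessSessionHelper {

    private StatelessSessionHelper() {
    }

    /**
     * Runs the given work in a StatelessSession that shares the JDBC
     * connection (and thus the transaction) of the outer Session.
     */
    public static void doStateless(Session outer, SessionFactory factory,
                                   java.util.function.Consumer<StatelessSession> work) {
        outer.doWork(connection -> {
            // openStatelessSession(Connection) binds the stateless session
            // to the already-open connection instead of acquiring a new one.
            try (StatelessSession stateless = factory.openStatelessSession(connection)) {
                work.accept(stateless);
            }
        });
    }
}
```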

Felix König February 28, 2023 at 1:10 PM
This is still a problem and a source of potential data loss in Hibernate 6.1.7. For my case, which is Spring Boot, I've implemented a workaround that allows you to perform some operations using a StatelessSession within a larger transaction.
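The actual workaround code is not preserved in this export. One common Spring-flavoured approach matching that description (the class here is a hypothetical sketch, not the commenter's code) is to obtain the connection bound to the current Spring-managed transaction and open the StatelessSession on it:

```java
// Hedged sketch (assumed; the original snippet is not preserved): obtain the
// connection bound to the current Spring-managed transaction and open the
// StatelessSession on it, so the stateless inserts commit or roll back
// together with the rest of the transaction.
import javax.sql.DataSource;

import org.hibernate.SessionFactory;
import org.hibernate.StatelessSession;
import org.springframework.jdbc.datasource.DataSourceUtils;

public class StatelessBatchWriter {

    private final SessionFactory sessionFactory;
    private final DataSource dataSource;

    public StatelessBatchWriter(SessionFactory sessionFactory, DataSource dataSource) {
        this.sessionFactory = sessionFactory;
        this.dataSource = dataSource;
    }

    public void insertAll(Iterable<?> entities) {
        // DataSourceUtils returns the connection participating in the
        // current transaction, if one is active.
        java.sql.Connection connection = DataSourceUtils.getConnection(dataSource);
        try (StatelessSession stateless = sessionFactory.openStatelessSession(connection)) {
            for (Object entity : entities) {
                stateless.insert(entity);
            }
        } finally {
            // Only actually closes the connection when no transaction owns it.
            DataSourceUtils.releaseConnection(connection, dataSource);
        }
    }
}
```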
Tobias Kremer June 27, 2013 at 9:02 AM
The workaround @sandermak posted does not work with Hibernate 4.2.2; I found a new one. I still do not understand why StatelessSession does not offer a method to flush the session.
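The workaround itself is not preserved in this export. A trick commonly cited for Hibernate 4.x (a hedged sketch only, using internal SPI types whose exact names varied between 4.x versions) was to force the pending JDBC batch to execute before commit:

```java
// Hedged sketch only: the comment's actual snippet is not preserved in this
// export. A workaround often cited for Hibernate 4.x was to force the queued
// JDBC batch to execute through Hibernate's internal SPI before committing.
// These are internal types and may differ between 4.x versions.
import org.hibernate.StatelessSession;
import org.hibernate.engine.transaction.spi.TransactionContext;

public final class StatelessFlush {

    private StatelessFlush() {
    }

    /**
     * Forces any queued batched statements of the stateless session to be
     * sent to the database (Hibernate 4.x internal API).
     */
    public static void executePendingBatch(StatelessSession session) {
        ((TransactionContext) session)
                .getTransactionCoordinator()
                .getJdbcCoordinator()
                .executeBatch();
    }
}
```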
Steve Ebersole March 21, 2011 at 7:08 PM
Bulk closing stale resolved issues
I'm using a StatelessSession to insert millions of rows: it works great without using much memory. But I've just seen that with a JDBC batch size of 50, for example (<property name="hibernate.jdbc.batch_size" value="50"/> in my persistence.xml), the last round of inserts isn't flushed to the database. For example, with 70 inserts, only the first 50 are sent to the database.
I've searched a lot about this issue, and on this thread (https://forum.hibernate.org/viewtopic.php?f=1&t=987882&start=0) the only solution found is to set the batch_size to 1, which is really a shame.
I've tried flushing the session, closing the JDBC connection, etc., with no luck.
I'd be fine with a way to set the batch_size to 1 only for this method, programmatically, but I've not found any way to do that.
If you don't pay attention, it's an easy way to lose data.
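The reported pattern can be sketched as follows (the entity class is hypothetical, added only to make the example self-contained):

```java
// Minimal sketch of the reported scenario: with hibernate.jdbc.batch_size=50,
// the trailing partial batch of a StatelessSession was not flushed in the
// affected versions, so 20 of the 70 rows below silently never reached the
// database. The entity is illustrative, not from the report.
import jakarta.persistence.Entity;
import jakarta.persistence.Id;

import org.hibernate.SessionFactory;
import org.hibernate.StatelessSession;
import org.hibernate.Transaction;

@Entity
class MyEntity {
    @Id
    Long id;

    MyEntity() {
    }

    MyEntity(long id) {
        this.id = id;
    }
}

public class BulkInsert {

    public static void insertRows(SessionFactory factory) {
        StatelessSession session = factory.openStatelessSession();
        Transaction tx = session.beginTransaction();
        for (int i = 0; i < 70; i++) {
            // Statements are queued into JDBC batches of 50; the bug was
            // that the trailing batch of 20 was never sent.
            session.insert(new MyEntity(i));
        }
        tx.commit(); // expected to flush the remaining 20 inserts
        session.close();
    }
}
```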