I like Hibernate very much, but I sometimes fall back on my own SQL generator, because batch operations can be very resource-hungry. Recently, however, I came across a post on the Hibernate blog that gave me a new understanding of Hibernate's batch processing.
To insert 100,000 records, the naive (bad) practice is:
Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
for (int i = 0; i < 100000; i++) {
    Customer customer = new Customer(.....);
    session.save(customer);
}
tx.commit();
session.close();
This will soon fail with an OutOfMemoryError, because Hibernate caches every newly saved object in the session-level cache. We generally know that flush() and clear() release that memory, but won't calling them too frequently hurt efficiency? That made me think of the JDBC batch approach.
First, set the JDBC batch size:
hibernate.jdbc.batch_size 20
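For reference, the same property can also be set programmatically. Below is a minimal sketch, assuming Hibernate 2.x (the net.sf.hibernate packages) and a hibernate.cfg.xml on the classpath; the class and method names around the setProperty() call are illustrative:

import net.sf.hibernate.HibernateException;
import net.sf.hibernate.SessionFactory;
import net.sf.hibernate.cfg.Configuration;

public class BatchSessionFactory {
    // Build a SessionFactory with JDBC batching enabled.
    public static SessionFactory build() throws HibernateException {
        Configuration cfg = new Configuration().configure();
        // Group up to 20 statements into one JDBC batch:
        cfg.setProperty("hibernate.jdbc.batch_size", "20");
        return cfg.buildSessionFactory();
    }
}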
Then modify the code slightly: just as with addBatch()/executeBatch() in plain JDBC, every time a batch of 20 is full we call flush() and clear():
Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
for (int i = 0; i < 100000; i++) {
    Customer customer = new Customer(.....);
    session.save(customer);
    if (i % 20 == 0) {
        // flush a batch of inserts and release memory:
        session.flush();
        session.clear();
    }
}
tx.commit();
session.close();
For bulk updates, the best way is to use the scroll() method, which is supported from Hibernate 2.1.6 on. The code is as follows:
Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
ScrollableResults customers = session.getNamedQuery("GetCustomers")
    .scroll(ScrollMode.FORWARD_ONLY);
int count = 0;
while (customers.next()) {
    Customer customer = (Customer) customers.get(0);
    customer.updateStuff(...);
    if (++count % 20 == 0) {
        // flush a batch of updates and release memory:
        session.flush();
        session.clear();
    }
}
tx.commit();
session.close();
One more small warning: in this case, don't cache the entity (keep it out of the second-level cache), otherwise you will run into memory issues.
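The post predates it, but for what it's worth, here is a minimal sketch of one way to enforce this in the later Hibernate 3.x API (org.hibernate; note this is not the 2.1.6 API used above): the session can be told to ignore the second-level cache entirely.

import org.hibernate.CacheMode;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;

public class BatchWithoutCache {
    public static void run(SessionFactory sessionFactory) {
        Session session = sessionFactory.openSession();
        // Neither read from nor write to the second-level cache in this session:
        session.setCacheMode(CacheMode.IGNORE);
        Transaction tx = session.beginTransaction();
        // ... batch inserts/updates with periodic flush()/clear(), as above ...
        tx.commit();
        session.close();
    }
}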
The original post is still worth reading :)
Original: http://blog.hibernate.org/cgi-bin/blosxom.cgi/gavin king/batch.html