My project uses Hibernate with the Spring transaction manager, and my database is Postgres (which might be irrelevant).
I'm reading big XML files, constructing objects from them (the objects aren't big, but there are a lot of them), and inserting them into the database.
If one of the objects happens to violate a database constraint, the whole process stops. How can I skip the ones that violate the constraint and, ideally, log their ids (or something similar) to a log file?
Question update:
I've been browsing through SO and found that a StatelessSession is the usual recommendation for batch inserts, but I still hit the same issue and the insert stops:
May 26, 2012 4:45:47 PM org.hibernate.util.JDBCExceptionReporter logExceptions
SEVERE: ERROR: duplicate key value violates unique constraint "UN_FK"
Detail: Key (fid)=(H1) already exists.
Here are the relevant parts of my code for parsing the XML and inserting into the DB; for simplicity, let's assume I'm inserting movies:
// class fields
@Autowired
private SessionFactory sessionFactory;
private Session session;

@Override
public void startDocument() throws SAXException {
    session = sessionFactory.getCurrentSession();
}

@Override
public void endElement(String uri, String localName, String qName) throws SAXException {
    if (qName.equalsIgnoreCase("FILM")) {
        movie.setCategory(category);
        movie.setAdded(new Date());
        session.save(movie); // Session has save(); insert() only exists on StatelessSession
    }
}
I also have the hibernate.jdbc.batch_size property set to 100 in the app-ctx. Is it really necessary to do a select before each insert in order to avoid this?
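To make the "defensive select" I'd like to avoid concrete, here is a self-contained sketch of that pattern. The `Set` stands in for the unique index on `fid`; in the real code, the `contains` check would be an extra SELECT round-trip per row, which is what I'm hoping to skip:

```java
import java.util.HashSet;
import java.util.Set;

public class DefensiveCheck {
    // Simulates one existence check per row before each insert.
    // Returns {inserted, skipped} counts.
    static int[] process(String[] incoming) {
        Set<String> existingFids = new HashSet<>();
        int inserted = 0, skipped = 0;
        for (String fid : incoming) {
            if (existingFids.contains(fid)) { // the per-row "defensive select"
                skipped++;
                continue;
            }
            existingFids.add(fid);            // the actual insert
            inserted++;
        }
        return new int[]{inserted, skipped};
    }

    public static void main(String[] args) {
        int[] r = process(new String[]{"H1", "H2", "H1", "H3"});
        System.out.println("inserted=" + r[0] + " skipped=" + r[1]);
    }
}
```

With the duplicate "H1" from the log above, this would insert three rows and skip one, but at the cost of doubling the statements sent to the database.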
Update 2:
If I use a StatelessSession instead of a Session, I get around 20 inserts and then the processing hangs indefinitely, without any exception or anything.
I assume the number 20 comes from the fact that I'm pooling connections with Tomcat and have maxActive="20".
Bounty update:
I'd really love to see someone offer a solution (without a defensive select, if possible), using either a StatelessSession or a plain Session.
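To be explicit about the behaviour I'm after, here is a self-contained sketch of the pattern: attempt every insert, catch the duplicate-key failure for that one row, log the offending id, and carry on. The in-memory "table" is only a stand-in for Postgres and its unique constraint; with Hibernate I assume the try/catch would have to wrap the per-row insert plus a flush so the violation surfaces for the right row:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class SkipOnViolation {
    // Stand-in for the movies table with a unique constraint on fid.
    private final Set<String> table = new HashSet<>();

    // Stand-in for the insert that can violate the unique constraint.
    void insert(String fid) {
        if (!table.add(fid)) {
            throw new IllegalStateException("duplicate key value: (fid)=(" + fid + ")");
        }
    }

    // Insert everything, collecting the ids that violated the constraint
    // instead of aborting the whole run.
    List<String> insertAll(String[] fids) {
        List<String> skipped = new ArrayList<>();
        for (String fid : fids) {
            try {
                insert(fid);
            } catch (IllegalStateException e) {
                skipped.add(fid); // in the real code: write fid to a log file
            }
        }
        return skipped;
    }

    public static void main(String[] args) {
        SkipOnViolation dao = new SkipOnViolation();
        List<String> skipped = dao.insertAll(new String[]{"H1", "H2", "H1"});
        System.out.println("skipped: " + skipped);
    }
}
```

With the input above, only the second "H1" ends up in the skipped list; the other rows are inserted normally.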