
I am trying to run a long-running process with multiple update operations on entities, but the transaction keeps timing out and getting rolled back.

EclipseLink with logging set to FINEST outputs the following:

    bind => [2 parameters bound]
Finest:   Execute query UpdateObjectQuery(entity.Service[ id=31 ])
Fine:   UPDATE service SET uid = ? WHERE (id = ?)
    bind => [2 parameters bound]
Finest:   Execute query UpdateObjectQuery(entity.Service[ id=32 ])
Fine:   UPDATE service SET uid = ? WHERE (id = ?)
    bind => [2 parameters bound]
Finest:   Execute query UpdateObjectQuery(entity.Service[ id=33 ])
Fine:   UPDATE service SET uid = ? WHERE (id = ?)
    bind => [2 parameters bound]
Finest:   Execute query UpdateObjectQuery(entity.Service[ id=34 ])
Fine:   UPDATE service SET uid = ? WHERE (id = ?)
    bind => [2 parameters bound]
Finest:   Execute query UpdateObjectQuery(entity.Service[ id=35 ])
Fine:   UPDATE service SET uid = ? WHERE (id = ?)
    bind => [2 parameters bound]
Finest:   Execute query UpdateObjectQuery(entity.Service[ id=36 ])
Fine:   UPDATE service SET uid = ? WHERE (id = ?)
    bind => [2 parameters bound]
Finest:   Execute query UpdateObjectQuery(entity.Service[ id=37 ])
Fine:   UPDATE service SET uid = ? WHERE (id = ?)
    bind => [2 parameters bound]
Finer:   end unit of work flush
Finer:   resume unit of work
Warning:   EJB5123:Rolling back timed out transaction [JavaEETransactionImpl: txId=6 nonXAResource=9 jtsTx=null localTxStatus=1 syncs=[com.sun.ejb.containers.ContainerSynchronization@70ec56d3, org.eclipse.persistence.internal.jpa.transaction.JTATransactionWrapper$1@2f836809, org.eclipse.persistence.transaction.JTASynchronizationListener@7fa7baae, com.sun.enterprise.resource.pool.PoolManagerImpl$SynchronizationListener@74f86c98]] for [SearchManager]
Finer:   TX afterCompletion callback, status=ROLLEDBACK
Finest:   Connection released to connection pool [default].
Finer:   release unit of work
Finer:   client released

The following three methods are called during the update operation:

public void recreateIndex() {
    int start = 0;
    while (true) {
        List<Product> products = em.createNamedQuery("Product.findAll")
                .setMaxResults(100).setFirstResult(start).getResultList();
        indexProducts(products);
        start = products.get(products.size() - 1).getId();
        if (products.size() == 100) {
            start++;
        } else {
            start = 0;
            break;
        }
    }
    start = 0;
    while (true) {
        List<Service> services = em.createNamedQuery("Service.findAll")
                .setMaxResults(100).setFirstResult(start).getResultList();
        indexServices(services);
        start = services.get(services.size() - 1).getId();
        if (services.size() == 100) {
            start++;
        } else {
            break;
        }
    }
}

@Transactional
private void indexProducts(List<Product> products) {
    Map<String, Object> properties = em.getProperties();
    em.setProperty("javax.persistence.query.timeout", -1);
    for (Product p : products) {
        p.setUid(RandomStringUtils.randomAlphanumeric(10));
        addProduct(p);
        em.persist(p);
    }
    em.flush();
}

@Transactional
private void indexServices(List<Service> services) {
    em.setProperty("javax.persistence.query.timeout", -1);
    for (Service s : services) {
        s.setUid(RandomStringUtils.randomAlphanumeric(10));
        addService(s);
        em.persist(s);
    }
    em.flush();
}

Currently the total number of rows in the two tables being updated is about 1,000, but I expect it to be more in production. How can I perform this operation without the transaction timing out?

Things I have tried include:

  1. Running the code in a new thread of execution (fire and forget); it still timed out. The first sketch after this list shows roughly how I wired that up.
  2. Annotating the class with @TransactionManagement(TransactionManagementType.CONTAINER).
  3. Running without a transaction, which gave a runtime error complaining that a transaction was required.
  4. Setting javax.persistence.query.timeout so queries never time out, although I am not sure -1 is the value that means "no timeout". This is where I am now; the second sketch after this list shows how I am applying the hint per query.
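
For attempt 1, this is roughly how I ran it (a reconstruction for illustration; the wrapper bean and method names here are made up, and SearchManager stands for the bean that holds recreateIndex() above):

import javax.ejb.Asynchronous;
import javax.ejb.EJB;
import javax.ejb.Stateless;

// Attempt 1 (illustrative sketch): let the container run the reindex on a
// separate thread via @Asynchronous so the caller returns immediately.
// The asynchronous call still runs inside a container-managed transaction,
// which I suspect is why it still timed out.
@Stateless
public class ReindexRunner {

    @EJB
    private SearchManager searchManager; // the bean containing recreateIndex()

    @Asynchronous
    public void recreateIndexAsync() {
        searchManager.recreateIndex();
    }
}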
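
For attempt 4, this is how I am currently trying to apply the hint per query instead of through em.setProperty() (same em, named query, and start variable as in recreateIndex() above; the 60-second value is just an example, and I do not know whether any value actually means "never time out"):

// Attempt 4 (sketch): standard JPA per-query timeout hint, value in
// milliseconds. As far as I can tell this limits individual statements,
// not the container's transaction timeout that shows up in the log above.
List<Product> products = em.createNamedQuery("Product.findAll", Product.class)
        .setHint("javax.persistence.query.timeout", 60000) // 60 seconds per query
        .setMaxResults(100)
        .setFirstResult(start)
        .getResultList();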

I am willing to post more code if needed. Thanks in advance.

qualebs
  • Maybe this doesn't answer your question directly, but why do this in Java? Consider this question: https://stackoverflow.com/questions/6280789/generate-guid-in-mysql-for-existing-data – Scary Wombat May 19 '17 at 02:26
  • @ScaryWombat It's because I am adding all of these entities to a Lucene index and I need a random unique string id to be able to update or delete single documents in the index. I thought, why not generate the ID as I do the indexing? Poor method names, but addProduct and addService are the two methods submitting the entity for indexing. – qualebs May 19 '17 at 02:36
  • do you have an index on `id` ? – Scary Wombat May 19 '17 at 02:40
  • @ScaryWombat Yes I do, but I can't use it in Lucene because entities from other tables will definitely have an id that matches this one. Furthermore, the transaction still times out even if I comment out the two method calls `addProduct` and `addService`. The question is really about being able to process many entities in a single transaction without the transaction timing out. – qualebs May 19 '17 at 02:50
  • What is your goal? You can set the transaction timeout higher (to the max you expect this to take in production), reduce the number of statements/transaction size, or find more optimal ways to access the database. It isn't clear why you would call em.persist(p); on an entity that has already been read in from the database. If it isn't in the persistence unit already, this would throw an error (or cause an insert). You might also look at batch writing https://wiki.eclipse.org/Using_EclipseLink_JPA_Extensions_(ELUG)#Using_EclipseLink_JPA_Extensions_for_JDBC – Chris May 19 '17 at 14:33

0 Answers