
I have to use Hibernate 4.1.7 in my project. Unfortunately, it's not up to me to move to a newer version.

In my context, I have a master-detail situation. So, I load the master object and show it on a web page, with all its details. I can change both the master and the details as I like, and even add new details. My saving code is as follows:

@Override
@Transactional
public void save(Entity entity) {
    Session session = entityManager.unwrap(Session.class);
    session.saveOrUpdate(entity);
    session.flush();
}

The problem comes when I try to delete a detail. The code below shows how it is done:

myMaster.getDetails().remove(myDetail);

I expect Hibernate to track the changes to the master object (it lost a detail instance from its list member), but when I call the save() method it throws java.lang.IllegalArgumentException: Removing a detached instance xxxx#yyyy.

I understand the issue with trying to remove a detached instance, but I don't understand why the just-removed instance is considered detached by Hibernate.

Any ideas?

TIA!

AlexSC
  • "I load the master object and show it on a web page, with all its details" - I'd assume that this is where they all get detached. – Thomas Apr 07 '17 at 14:01
  • Thanks, but why are they being detached? If that is really the case, why does updating the details still work? Aren't they also detached? Is it possible to update detached instances? – AlexSC Apr 07 '17 at 14:56
  • IIRC `saveOrUpdate()` is also able to update detached entities, so that's why the update still works. How they become detached depends on what you're actually doing, but I assume you're opening a Hibernate session (or JTA transaction), reading the entities and closing the session again (opening and closing might be done by some interceptor, so you might not be doing it directly). An entity is considered attached if it is present in the current session (aka first level cache), and if the session is closed all entities that have been loaded through it get detached. – Thomas Apr 07 '17 at 15:38
  • @Thomas: Ok, let's accept that idea, since JSF closes the connection after it finishes processing the request. The question now is how to make it work. It seems to me that such a web page is tremendously common, so this kind of problem must happen very often. What's the solution? – AlexSC Apr 07 '17 at 16:57
  • Well, you could try to reattach the entities before doing that or, depending on your model, just reattach the detail and remove it (if it is the owning side of the relation) - see the sketch after these comments - you could even do that with an update query instead of going through the entity manager. – Thomas Apr 10 '17 at 07:41
  • @Thomas, thanks for your suggestions, but one nice thing about Hibernate is that it is supposed to track the changes the object underwent and then perform all the database operations for me, giving me a much higher level of abstraction. If I have to take care of all that myself, then it seems I'm wasting all of it. The strangest thing is: if I try to merge the master object back into the Hibernate session, I get an exception telling me that there is already an object with the same id. It seems to me that the master object is not detached after all, so why is the detail? – AlexSC Apr 10 '17 at 11:03
  • try to follow this link http://stackoverflow.com/questions/2428706/jpa-thinks-im-deleting-a-detached-object. – Vivien SA'A Apr 10 '17 at 11:29
  • It's really hard to tell what Hibernate is doing without knowing much more of your code, and I have the feeling it might be way too much for SO anyway. I haven't used JSF in years, and even when I used it, entities were never made available to JSF itself (that was a design decision of ours), so I'd have to guess: during the restore view phase the master itself might get loaded and thus be attached to the session again while the details are still detached, or there might actually be two versions: the lazy proxy in the master and a loaded detail that's not connected to the master. – Thomas Apr 10 '17 at 11:39
  • 1
    About the things that Hibernate is supposed to track: you're correct in that Hibernate is doing that. The problem arises when entities get detached, which normally happens when the session is closed - and you want to keep session life times as short as possible to reduce other side effects (e.g. transactions and consistency matters etc.). There's still the [extended session pattern] (http://stackoverflow.com/questions/3946288/extended-session-for-transactions) you could look into (although some consider it an anti-pattern since it's easy to misuse). – Thomas Apr 10 '17 at 11:46
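A minimal sketch of the reattach-and-delete approach suggested in the comments above, following the same service shape as the save() method in the question (the method name deleteDetail and the Detail entity are assumptions, not from the original code):

@Transactional
public void deleteDetail(Detail detail) {
    Session session = entityManager.unwrap(Session.class);
    // Reattach the detached detail to the current session and remove it directly,
    // instead of relying on collection changes made on the detached master.
    Detail attached = (Detail) session.merge(detail);
    session.delete(attached);
}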

1 Answer


All you need to do to make it work the way you expect (to let Hibernate check the entire object graph) is this:

@Override
@Transactional
public Entity save(Entity entity) {
    return entityManager.merge(entity);
}

Of course, make sure that the MERGE operation is cascaded from master to details (CascadeType.MERGE or CascadeType.ALL) and that orphanRemoval is enabled on the details collection. If orphanRemoval is not suitable for your model (with it, Hibernate deletes children as soon as they stop being associated with their parent, which is not always desirable), then you have to remove the details explicitly.
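For illustration, a minimal sketch of such a mapping (Master, Detail and the field names are assumptions, not taken from the question; each entity would live in its own file):

import java.util.ArrayList;
import java.util.List;
import javax.persistence.*;

@Entity
public class Master {

    @Id
    @GeneratedValue
    private Long id;

    // MERGE is cascaded so that merging the master also merges its details;
    // orphanRemoval makes Hibernate delete details removed from this collection.
    @OneToMany(mappedBy = "master", cascade = CascadeType.ALL, orphanRemoval = true)
    private List<Detail> details = new ArrayList<Detail>();

    public List<Detail> getDetails() {
        return details;
    }
}

@Entity
public class Detail {

    @Id
    @GeneratedValue
    private Long id;

    // Owning side of the association, referenced by mappedBy = "master" above.
    @ManyToOne
    private Master master;

    public Long getId() {
        return id;
    }
}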

Also, make sure to use the result of the merge operation afterwards, as merge always returns a copy of the passed-in detached instance (the passed-in instance is not modified). This is especially important when persisting new instances, because generated ids are set only on the copy.
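For example, on the calling side it would look roughly like this (masterService and findById are assumed names for illustration):

// Sketch only: always keep working with the copy returned by save()/merge().
Master master = masterService.findById(masterId);   // detached once the loading transaction ends
master.getDetails().add(new Detail());

// merge() returns a copy; the passed-in graph (including the new Detail) is left untouched,
// so continue with the returned instance, where the generated ids are populated.
master = masterService.save(master);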

However, you pay the penalty of reloading the object graph from the database; but if you want everything done automatically, Hibernate has no other way to determine what has actually changed than to compare it with the current state in the database.

Now, the explanations for what you observed in your attempts:

  1. The master instance you save is detached. You load it in one transaction (persistence context/session), and save it in another.

  2. It works fine with saveOrUpdate when you update it:

    Either save(Object) or update(Object) the given instance, depending upon resolution of the unsaved-value checks (see the manual for discussion of unsaved-value checking).

    Parameters:

    object - a transient or detached instance containing new or updated state

    As stated in the doc, it is intended to work with detached and transient instances.

  3. You tried merging, but you got an exception telling you that there is already an object with the same id. The only explanation is that you tried something like this:

    @Override
    @Transactional
    public void save(Entity entity) {
        entity = entityManager.merge(entity);
        Session session = entityManager.unwrap(Session.class);
        session.saveOrUpdate(entity);
        session.flush();
    }
    

    Then an exception is thrown as described in the update javadoc:

    Update the persistent instance with the identifier of the given detached instance. If there is a persistent instance with the same identifier, an exception is thrown.

    So, you merged the instance into the current persistence context, and then tried to saveOrUpdate it (which delegated to update) and that resulted in the exception, as you must not update the detached instance while there is already a persistent one in the current persistence context (otherwise the persistent one would become stale).

    You don't have to do this; just merge, and at the end of the transaction Hibernate will dirty-check the objects and flush the changes to the database automatically, as in the sketch below.
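    With that in place, the delete scenario from the question reduces to something like this (masterService is an assumed name for the service containing the merge-based save() above):

    // Modify the detached graph and merge it back; with orphanRemoval on the details
    // collection, Hibernate issues the DELETE for the removed detail at commit time.
    myMaster.getDetails().remove(myDetail);
    myMaster = masterService.save(myMaster);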

Dragan Bozanovic
  • Excellent answer and even better explanations. Thank you for all that. We are close to perfection! It's now working for deletes and updates, but for inserts I now get duplicates. So, I merge one instance and get two rows in the DB. When I have a constraint, I receive a DB error, proving that there were indeed two attempts to insert. Any ideas? – AlexSC Apr 12 '17 at 12:50
  • @AlexSC I edited the answer. You probably continued to use the passed-in instance after merging. You have to use the result of the merge operation afterwards. – Dragan Bozanovic Apr 12 '17 at 15:20
  • Actually, I did so. When I say the insert is being duplicated, I mean the detail object. Like, I load the master and the details, add a new detail, and save the master. Saving the master means merging the master back into the session. After the `save()` method ends, I see two insert attempts in the Hibernate log and the error. In this scenario, I don't have the opportunity to misuse the merge-returned instance, do you agree? It's really strange, but it's happening. – AlexSC Apr 13 '17 at 13:43
  • @AlexSC You pass back the detached graph as you loaded it (with only actual new changes being different)? Does the unchanged detail have the id when being merged? If not, Hibernate considers it new and tries to insert it. – Dragan Bozanovic Apr 13 '17 at 22:53
  • @AlexSC Btw, I have created a [test](https://github.com/bdragan/jpa-tests/blob/master/src/test/java/emtest/em/EMTest.java) which shows that `merge` works as expected, you may run it and compare it with your actual code. – Dragan Bozanovic Apr 15 '17 at 19:22
  • It seems you don't see things exactly as I do. After merging the master I receive a new version of it and yes, I pass it back to be used in the next save. However, the problem happens the very *first time* I save the master with a newly added detail. Like, I load the master and some details come with it. I add a new detail and save the master for the *first time* after it was loaded. The new detail is saved twice in the DB (two INSERTs are issued against the server). If no constraints are violated, the save is successful and I return the new master. – AlexSC Apr 17 '17 at 11:53
  • @AlexSC That doesn’t happen in the example test I provided, only one insert is executed for the new detail. However, the Hibernate version you use may be affected by the bug [HHH-5855](https://hibernate.atlassian.net/browse/HHH-5855), you may try the workarounds proposed there if that is the case. – Dragan Bozanovic Apr 17 '17 at 12:28