
I have a Java web application using Spring, Hibernate and Wicket connecting to a MySQL database that I'd like to refactor and separate into several applications. I started by using Maven's multi-module system but in reality each of the applications would have its own release cycle, so I've ditched that effort now and I'm looking at creating individual projects for each of them. They will all continue to connect to the same database so I was going to move the model classes into a project of their own which can be used as a dependency.
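To make the setup concrete, here is the kind of mapped class I'd be moving into the shared model project (the entity and its fields are just an example, not my actual schema):

```java
// Hypothetical entity living in the shared model project.
// Each application would add that project as a normal Maven dependency.
package com.example.model;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.Table;

@Entity
@Table(name = "customer")
public class Customer {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    @Column(nullable = false)
    private String name;

    public Long getId() { return id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}
```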

I have a few questions regarding this setup:

  1. Is moving the model classes to their own project a typical solution to the multiple apps/single database problem, or is there another way?

  2. Is there a nice way of ensuring all the applications are using the same version of the model dependency?

  3. Should I also include any base DAOs and services in this core project that each application could use or extend, or should I just include my GenericHibernateDao (see the first sketch after this list) and let each application create its own DAOs and services? Obviously I want to avoid changing this project as much as possible, since any change would require a new release of every application that depends on it.

  4. Is there any Hibernate-related config I would need to change, such as connection pooling? Does it matter if each app has its own pool, or should they share one (see the second sketch after this list)? I'm not using caching at the moment, but I understand that if I wanted to I would need a distributed cache?

  5. How would I share application config such as DB params, email host, SMS gateway, etc. between applications? Is there any way of defining them once somewhere to ensure they are all pointed at the same DB? (The second sketch after this list touches on this too.)

  6. Are there any other gotchas I may encounter further down the road with this setup, either with Maven or during deployment? Any tips or best practices I should follow?
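
For question 3, this is roughly the shape of the generic DAO I mean; a simplified sketch rather than my exact code, with illustrative package names:

```java
// Simplified sketch of a generic Hibernate DAO that could live in the shared
// core project; each application would extend it with entity-specific DAOs.
package com.example.core.dao;

import java.io.Serializable;
import java.util.List;

import org.hibernate.Session;
import org.hibernate.SessionFactory;

public abstract class GenericHibernateDao<T, ID extends Serializable> {

    private final Class<T> entityClass;
    private SessionFactory sessionFactory;

    protected GenericHibernateDao(Class<T> entityClass) {
        this.entityClass = entityClass;
    }

    public void setSessionFactory(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    protected Session currentSession() {
        return sessionFactory.getCurrentSession();
    }

    @SuppressWarnings("unchecked")
    public T findById(ID id) {
        return (T) currentSession().get(entityClass, id);
    }

    @SuppressWarnings("unchecked")
    public List<T> findAll() {
        return currentSession().createCriteria(entityClass).list();
    }

    public void saveOrUpdate(T entity) {
        currentSession().saveOrUpdate(entity);
    }

    public void delete(T entity) {
        currentSession().delete(entity);
    }
}
```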

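For questions 4 and 5, this is the sort of arrangement I'm imagining: each application builds its own connection pool, but the connection parameters come from a single shared properties file. The bean, property keys, file path and pool implementation (commons-dbcp) are placeholders, not an existing setup:

```java
// Sketch of per-application DataSource configuration that reads connection
// details from a shared properties file, so every application points at the
// same database while still owning its own connection pool.
package com.example.core.config;

import javax.sql.DataSource;

import org.apache.commons.dbcp.BasicDataSource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.PropertySource;
import org.springframework.core.env.Environment;

@Configuration
// Hypothetical location: every application reads the same file on the server.
@PropertySource("file:/etc/myapp/shared.properties")
public class SharedDataSourceConfig {

    @Autowired
    private Environment env;

    @Bean(destroyMethod = "close")
    public DataSource dataSource() {
        BasicDataSource ds = new BasicDataSource();
        ds.setDriverClassName(env.getProperty("db.driver", "com.mysql.jdbc.Driver"));
        ds.setUrl(env.getProperty("db.url"));
        ds.setUsername(env.getProperty("db.username"));
        ds.setPassword(env.getProperty("db.password"));
        // Pool sizing can still differ per application if needed.
        ds.setMaxActive(Integer.parseInt(env.getProperty("db.pool.maxActive", "10")));
        return ds;
    }
}
```
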
Jack
    Based on #1 and #2 I think it would be a great opportunity to create a microservice for the database layer, which includes service methods to do all of the processing. This would ensure that there are never different database models on different systems. It would also create a nice separation between the application and the underlying data. – mnd Jan 22 '15 at 17:05
  • Since you aren't asking a specific question, and getting people's opinions... this may be closed out. However, I *do* think this is a good set of questions. Perhaps you should ask this in the Community Wiki instead? – Ascalonian Jan 22 '15 at 17:06
  • @mnd could you please explain a bit more how this would work? Wouldn't it mean a change to the services would require a new release of all the applications that depend on them? – Jack Jan 22 '15 at 17:21
  • @Ascalonian Yeah, I wasn't sure whether to post it here or to post it to programmers.stackexchange.com – Jack Jan 22 '15 at 17:24
  • @Jack, yes, you would have to update the applications if the microservice changed - I was thinking that #2 indicated that you don't want to accidentally leave one application with an old model - and rather the application would just stop working if the service changed. On the flip side, if you were willing to support multiple versions, you could have versioned REST APIs, where part of the URL includes `/v1/` or `/v2/` to indicate which version of the API you are using. This is nice if you're able to support multiple models simultaneously. – mnd Jan 22 '15 at 17:57

1 Answer


This has been a common scenario for me. What I have usually done is:

  • Put the DAOs, connection pool management and failover-related code into a separate module (jar).
  • Use that module as a dependency in each of the components, as you have mentioned.

With this approach, each of your components will have its own connection pool.
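
As a rough sketch of what I mean (reusing the example names from the question, so purely illustrative), each application would depend on the shared jar and only add thin, entity-specific DAOs:

```java
// In one of the applications: an entity-specific DAO that extends the base
// DAO shipped in the shared module, so the application code stays thin.
package com.example.billing.dao;

import com.example.core.dao.GenericHibernateDao;
import com.example.model.Customer;

public class CustomerDao extends GenericHibernateDao<Customer, Long> {

    public CustomerDao() {
        super(Customer.class);
    }

    // Application-specific query methods go here.
}
```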

codinnvrends