
I have been reading about many architectures, such as N-Layered, Onion, and so on. I'm designing a large system that works with several huge databases and provides a lot of services to other applications/clients.

On the other hand, our system has to be designed to be very extensible. The parts of our system, called modules or subsystems, have their own models, business logic, and possibly their own UIs or services. Some modules have no UI and expose no services outside the system; they simply extend it.

I'm a member of the design team, and I'm considering the architecture below for our system:

[Diagram: architecture of our system]

It is an onion architecture, but the database entities are defined in each module. Each module will be developed by a separate team. There are no dependencies between the modules, but the database is shared!

My concerns are the following:

  • How can the modules depend on each other?
  • Is this the right architecture for us?
  • If yes, what other concerns should we address?
  • If no, what architecture(s) would you suggest?
  • I didn't quite get your first concern; can you explain more? Also, how independent do you / your organization want the modules you are creating to be? – Guanxi Mar 03 '15 at 12:20
  • We're trying to avoid accepting many changes in our core, so we have designed module scopes, where the business logic related to each module's models lives. The business logic is the exclusive controller and manager of its models, because it knows what the data means, what should be done with it, how to validate it, and so on. So we have communication between the business logic of the various modules via DI, to keep the logical roles atomic. My first concern is these communications. – Ali Adlavaran Mar 03 '15 at 13:13
  • (... continues): Actually, connecting modules together via dependency injection requires some changes in the core. For example, when a new feature is developed by a module and we want to let the other modules use it, we define a new interface in the core and register it with the new feature's implementation (a minimal sketch of this follows these comments). The problem is that we don't want to change the core frequently, for every little change to our system! – Ali Adlavaran Mar 03 '15 at 13:22
  • You are correct to focus on the communication. Since all modules are fully independent, efficient communication will be where the design shines or gets problematic. You should keep many channels open for that communication; maybe a web-app module can talk with the core by means of a web service, while a data-pump module just sends files over FTP to a specific folder, etc. If you can, there's nothing wrong with keeping a monolithic core and forcing every module to adapt to it. – jean Mar 03 '15 at 13:22
  • It depends a lot on your requirements, but for this scenario DI is not my first option. Keep your core as a black box and let the module teams develop the communication; just keep the messages/protocols they can use to send/get data from the core clear and sound. – jean Mar 03 '15 at 13:26
  • @jean, OK, but we have an issue: our database implementation and management. In general, the database is not managed by the core; rather, it is managed by a module whose responsibility that is. Furthermore, every other module exclusively manages a specific part of the database, so there may be communication between them there, and I don't believe messages/protocols are a good way to do it. Think about implementing simple IRepositories in the modules. What do you think about this? – Ali Adlavaran Mar 03 '15 at 13:39
  • I think each module can get its own DB (or not; a web module can use NoSQL, for instance) and keep a core DB for the core app. Example: imagine all permissions are managed by the core; you would keep those tables in the core DB and just expose the relevant methods/messages (auth, new user, recover password, etc.) to the modules. A module keeping track of a user's sales would keep its own sales tables, which would not be visible to the others (considering the data is irrelevant to the other modules). – jean Mar 03 '15 at 14:03
  • DI can be great for plugins and libraries, but for a really big system where you want to keep the core away from maintenance issues it is not the way to go, and you have already found that out for yourself. – jean Mar 03 '15 at 14:07
  • @jean, you're right, but consider that we have thousands of hits per second on the core! Is it still right to use SOA? Think about the performance. – Ali Adlavaran Mar 03 '15 at 14:35
  • SOA is a good shot; @Rob Conklin's answer covers it very well. A well-implemented core plus a good architecture and good servers can handle dozens of thousands of hits per second with ease. If it scales up further, you can start to think outside the box and use scalable distributed architectures. – jean Mar 03 '15 at 14:39
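
To make the DI concern from the comments concrete, here is a minimal sketch, assuming a hypothetical ReportingService feature and a hand-rolled ServiceRegistry standing in for a real DI container (all names are illustrative, not part of the actual system). It shows why every new cross-module feature forces a change in the core: the shared interface has to live there so that other modules can depend on it without referencing the providing module.

```java
// --- Core (must change every time a new cross-module feature appears) ---
public interface ReportingService {                 // hypothetical feature contract
    String buildReport(long customerId);
}

public final class ServiceRegistry {                // stand-in for a real DI container
    private static final java.util.Map<Class<?>, Object> SERVICES = new java.util.HashMap<>();

    public static <T> void register(Class<T> contract, T implementation) {
        SERVICES.put(contract, implementation);
    }

    @SuppressWarnings("unchecked")
    public static <T> T resolve(Class<T> contract) {
        return (T) SERVICES.get(contract);
    }
}

// --- Reporting module (provides the feature) ---
public class SqlReportingService implements ReportingService {
    @Override
    public String buildReport(long customerId) {
        // Would query the part of the shared database this module owns.
        return "report for customer " + customerId;
    }
}

// --- Billing module (consumes the feature without referencing the reporting module) ---
public class BillingFacade {
    public void printMonthlyReport(long customerId) {
        ReportingService reports = ServiceRegistry.resolve(ReportingService.class);
        System.out.println(reports.buildReport(customerId));
    }
}
```

At startup the reporting module would call `ServiceRegistry.register(ReportingService.class, new SqlReportingService())`; that wiring step, plus the new interface in the core, is exactly the kind of frequent core change the comments are trying to avoid, and it motivates the service-based boundaries discussed in the answer below.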

1 Answer


When you say "large", I'm assuming you are talking at least a million lines of code. Assuming that is correct, you should really look at a SOA architecture to separate your "modules". Depending on the language you are using, there are lots of nice RESTful architectures that are ideal for exposing database layer services.

Don't create code-level dependencies, and don't allow dependencies to creep in at the database layer. Strongly decouple the modules and make them talk to each other over a network layer. This keeps them strongly independent and limits the scope of any one change.
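
As a minimal sketch of what "talk to each other over a network layer" can look like from the consuming side, assuming a hypothetical orders endpoint exposed by another module (the URL and JSON handling are illustrative, and this uses the standard Java 11+ HTTP client rather than any specific framework):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OrderClient {
    private final HttpClient http = HttpClient.newHttpClient();

    /** Fetches one order as raw JSON from the (hypothetical) orders module. */
    public String fetchOrder(String orderId) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://orders.internal/api/orders/" + orderId))
                .header("Accept", "application/json")
                .GET()
                .build();
        HttpResponse<String> response = http.send(request, HttpResponse.BodyHandlers.ofString());
        return response.body();   // the caller parses and validates the agreed JSON contract
    }
}
```

The consuming module only depends on the URL and the message contract, never on the other module's code or tables, which keeps the coupling limited to the network boundary.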

It also allows you to have multiple versions in production at the same time, allowing an application ecosystem to evolve independently.

This is especially true if you have independent UIs. Requiring all applications that leverage your core application to upgrade at the same time is a logistical nightmare.

We've been using this methodology for quite a few years, using Restlet to expose core database-level services and having other services and applications consume them. It has been very effective, and it allows applications to evolve and deploy on their own schedules.
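
For illustration, here is a minimal Restlet 2.x-style sketch of exposing such a database-level service; the resource, route, and port are assumptions, persistence is stubbed out, and details may differ between Restlet versions:

```java
import org.restlet.Application;
import org.restlet.Component;
import org.restlet.Restlet;
import org.restlet.data.Protocol;
import org.restlet.resource.Get;
import org.restlet.resource.ServerResource;
import org.restlet.routing.Router;

// A resource backed by the customer tables this module owns (lookup stubbed out here).
public class CustomerResource extends ServerResource {
    @Get("json")
    public String represent() {
        String id = getAttribute("id");           // value of the {id} template variable
        return "{\"id\": \"" + id + "\", \"name\": \"...\"}";
    }
}

class CustomerServiceApplication extends Application {
    @Override
    public Restlet createInboundRoot() {
        Router router = new Router(getContext());
        router.attach("/customers/{id}", CustomerResource.class);
        return router;
    }

    public static void main(String[] args) throws Exception {
        Component component = new Component();
        component.getServers().add(Protocol.HTTP, 8182);   // port is arbitrary
        component.getDefaultHost().attach("/api", new CustomerServiceApplication());
        component.start();
    }
}
```

Other applications then consume `GET /api/customers/{id}` over HTTP, so the owning module can refactor its tables freely as long as the JSON contract stays stable.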

It also allows individual database "modules" to be refactored independently.

  • Thank you for the answer; it seems reasonable. We are separating our sub-systems as far as possible to achieve high independence, but the communication language between them, even with SOA (for example, web APIs), is our challenge now! I'll explain two approaches that MIGHT solve the problem. APPROACH 1: we build a central language dictionary or messaging provider that knows all of the sub-systems' communication languages and provides lots of utilities to build or read messages for them. OK, messaging is normalized, BUT then we LOSE scalability, as you can see. – Ali Adlavaran Mar 17 '15 at 06:11
  • APPROACH 2: maybe we use some standard or general data languages, for example XML, JSON, and so on. Then we don't have any central provider, and every sub-system parses the received messages and communicates with the others (a sketch of this follows at the end of this thread), but with this approach a lot of unnecessary work may be needed for the messaging between them. And more importantly, I DON'T like this method. – Ali Adlavaran Mar 17 '15 at 06:21
  • Please tell me some general approaches and how you would deal with the messaging issue in your suggested solution. Thank you. – Ali Adlavaran Mar 17 '15 at 06:32
  • "in this aspect very a lots of unnecessary jobs" I'm confused by this. What are the "unnecessary jobs" you are concerned about? Are you referring to the service layer? – Rob Conklin Mar 17 '15 at 14:06
  • Your central language dictionary is a remote facade pattern: http://martinfowler.com/eaaCatalog/remoteFacade.html. It is effective, but you lose the reusability of a micro-services architecture. It also, as you said, creates a single point of failure, reduces scalability, etc. And it has a real risk of becoming a "god object": http://en.wikipedia.org/wiki/God_object – Rob Conklin Mar 17 '15 at 14:10
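
Regarding "APPROACH 2" above, here is a minimal sketch of a shared message contract, assuming the sub-systems agree on a plain JSON shape and use a mapper such as Jackson; the OrderMessage fields are purely illustrative:

```java
import com.fasterxml.jackson.databind.ObjectMapper;

// Shared, versioned message contract agreed between two sub-systems (fields are illustrative).
public class OrderMessage {
    public String orderId;
    public long customerId;
    public double total;
}

class MessagingExample {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    public static void main(String[] args) throws Exception {
        // Producer side: serialize the contract before sending it over whatever channel is used.
        OrderMessage out = new OrderMessage();
        out.orderId = "A-42";
        out.customerId = 7;
        out.total = 99.5;
        String json = MAPPER.writeValueAsString(out);

        // Consumer side: parse against the same contract; no central dictionary is needed.
        OrderMessage in = MAPPER.readValue(json, OrderMessage.class);
        System.out.println(in.orderId + " / " + in.total);
    }
}
```

With this approach, the "unnecessary work" shrinks to maintaining a small contract class per integration, which is usually cheaper than maintaining a central dictionary that knows every sub-system's language.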