I know that this question has been discussed a lot already, but I would like to describe my situation.
As far as I know, there are techniques and best practices for dealing with a shared database in a microservices architecture (event sourcing, CQRS, ...), but all of that seems too complex for my case. Let me explain.
I built a REST API using Node.js. This API allows you to fetch, using a GET request, the data stored in a MySQL database. Now I need to import a lot of data into the same database (creating a new table each time). The first solution could be to add a new endpoint (POST request) to the existing microservice that creates the new table and inserts the new data.
But I was thinking about creating a separate Node.js microservice (an import service), because the import feature could be very CPU-intensive and Node.js is single-threaded; I don't want one user to have to wait to fetch data because another one is importing new data.
The problem with that solution is that I would have to share the same database between the two microservices. Using the typical approaches (event sourcing, CQRS) might be the best solution, but it complicates the architecture too much (for this project I don't need to address the data consistency problem).
There are two other solutions I could use:
- Create a common library to access the DB and use it in both microservices.
- Instead of accessing the database directly, the "import microservice" could use the other service's REST API to post the new data as soon as it is ready to be imported.
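For the second option, the import service would do something roughly like this (the `/tables/:name/rows` endpoint is purely hypothetical; the existing API would need such a write endpoint added, and this assumes Node 18+ where `fetch` is a global):

```javascript
// Sketch: the import service posts batches of rows through the existing
// REST API instead of opening its own MySQL connection.
async function postBatch(baseUrl, table, rows) {
  const res = await fetch(`${baseUrl}/tables/${encodeURIComponent(table)}/rows`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(rows),
  });
  if (!res.ok) throw new Error(`import failed with HTTP ${res.status}`);
  return res.json();
}
```

The upside is that only one service ever talks to MySQL; the downside is the extra HTTP hop for every batch.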
What is the best solution? Do you know of other possible ways to address this problem?
Thank you very much