
I have a very large monolithic legacy application that I am tasked with breaking into many bounded-context applications on a different architecture. My management is pushing for the old and new applications to work in tandem until all of the legacy functionality has been migrated to the current architecture.

Unfortunately, as is the case with many monolithic applications, this one maintains a very large set of state data for each user interaction and it must be maintained as the user progresses through the functionality.

My question is: what are some ways I can responsibly implement a hybrid legacy/non-legacy architecture so that, in the future state, the new individual applications are not hopelessly dependent on this shared state model?

My initial thought is to write the state data to a cache of some sort that is accessible to both the legacy application and the new applications so that they may work in harmony until the new applications have the infrastructure necessary to operate independently. I'm very skeptical about this approach so I'd love some feedback or new ways of looking at the problem.
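To make the shared-cache idea concrete, here is a minimal sketch of what such a session store could look like. It is backed by an in-process dict purely for illustration; in practice it would be an external cache (e.g. Redis or Memcached) reachable from both the legacy application and the new services. All names here are hypothetical.

```python
import json
import time

class SharedSessionStore:
    """Session state shared between the legacy app and new services.

    Dict-backed for illustration only; a real deployment would use an
    external cache both architectures can reach.
    """

    def __init__(self, ttl_seconds=1800):
        self._store = {}
        self._ttl = ttl_seconds

    def put(self, session_id, state):
        # Serialize to a neutral format (JSON) so neither side depends
        # on the other's in-memory object model.
        self._store[session_id] = (json.dumps(state), time.time() + self._ttl)

    def get(self, session_id):
        entry = self._store.get(session_id)
        if entry is None:
            return None
        payload, expires_at = entry
        if time.time() > expires_at:
            # Expire stale sessions so neither side reads dead state.
            del self._store[session_id]
            return None
        return json.loads(payload)

# The legacy app writes the user's state...
store = SharedSessionStore()
store.put("sess-42", {"step": "checkout", "cart_items": 3})
# ...and a new bounded-context service reads the same state.
assert store.get("sess-42") == {"step": "checkout", "cart_items": 3}
```

The key design point is serializing to a neutral format at the boundary, so the new applications never import the legacy object model directly.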

Rubén
Will Evers

1 Answer

Whenever I've dealt with this situation, I've taken the dual-writes approach to the data, as it is mostly a data migration problem. As you split out each piece of functionality, you are effectively going to have two data models until the legacy model is completely deprecated. The basic steps for this are:

  1. Once you split out a component start writing the data to both the old and new database.
  2. Backfill the new database with anything you need from the old.
  3. Verify both have the same data.
  4. Change everything that relies on this part of the data to read from the new component/database.
  5. Change everything that relies on this part of the data to write to the new component/database.
  6. Deprecate that data in the old database, i.e. back it up then remove it. This confirms that you've migrated that chunk.
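The first three steps above can be sketched as follows. Dict-backed stores stand in for the real legacy and new databases, and all names are hypothetical; the point is the shape of the dual-write, backfill, and verify phases, not any particular database client.

```python
class DualWriteRepository:
    """Step 1: write user state to both the legacy and the new store."""

    def __init__(self, legacy_db, new_db):
        self.legacy_db = legacy_db
        self.new_db = new_db

    def save(self, key, record):
        # Legacy remains the source of truth until reads are cut over,
        # so write it first; a failure here aborts the whole save.
        self.legacy_db[key] = record
        try:
            self.new_db[key] = record
        except Exception:
            # A failed secondary write must not break the user flow;
            # the backfill/verify steps will reconcile it later.
            pass

def backfill(legacy_db, new_db):
    """Step 2: copy anything the new store is missing from the old one."""
    for key, record in legacy_db.items():
        new_db.setdefault(key, record)

def verify(legacy_db, new_db):
    """Step 3: confirm both stores hold the same data."""
    return all(new_db.get(k) == v for k, v in legacy_db.items())

# Dict-backed stand-ins for the two databases.
legacy, new = {"u1": {"name": "a"}}, {}
repo = DualWriteRepository(legacy, new)
repo.save("u2", {"name": "b"})
backfill(legacy, new)
assert verify(legacy, new)
```

Steps 4-6 are then configuration changes (cut reads over, cut writes over, back up and drop the old data) rather than new code.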

The advantage is that there should be no data loss or loss of functionality, and you have time to test each data model you've chosen for a component to see if it works with the application flow. Slicing up a monolith can be tricky: deciding where your bounded contexts lie is critical, and there's no perfect science to it. Always keep in mind where you need your application to scale and which pieces are required to perform.

CheeseFerret
    Yeah, that's sort of what I am thinking, although I am not retiring any legacy databases since they are used throughout the company (unfortunately); I'm just retiring the application. Since it's short-lived session data, it will be a single write to a new DB, but I will follow the rest of your steps for migrating. – Will Evers Jan 24 '19 at 15:09