
I've been working a lot with microservices recently, and the common pattern is that every service is responsible for its own data. Thus service "A" cannot access service "B"'s data directly without talking to service "B" via some HTTP API or message queue.

Now I've started to pick up some work with Azure Functions for the first time. I've looked at a fair few examples, and they all seem to have any old function dabbling with data in a shared data store (which seems like going back to the old style of having a massive monolithic database).

I was just wondering if there is a common pattern to follow for data storage when using Functions as a Service? And where do the responsibilities lie?

Micro-Services is an application design paradigm. Azure Functions is a Serverless compute framework. You can use Azure Functions to host Micro-Services, but it's also used for event-based programming and general purpose Serverless job hosting (aka Cloud Duct Tape). So I wouldn't read too much in the way of Application Design best-practices into the samples. – David Browne - Microsoft Aug 17 '17 at 22:56

1 Answer


The following screen snippet shows an example of an event-driven, distributed model of business processors in a cloud-based solution without a monolithic database. More details about this concept and technique can be found in my article Using Azure Lease Blob.

[Screen snippet: BusinessContextWithAF]

Note that each Business Context has its own Lease Blob for holding the state of the processing, with references to other resources such as metadata, config, data, results, etc. This concept allows you to create a matrix (multi-dimensional) business processing model, where each sub-nested process can have its own Lease Blob.
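The article covers the full pattern; as a rough illustration, here is a minimal sketch of the lease-per-context idea in Python with the azure-storage-blob v12 SDK. The connection string, container, and blob names are hypothetical placeholders, not taken from the article:

```python
# Minimal sketch: one state blob per business context, guarded by a lease
# so only one processor mutates that context's state at a time.
from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string(
    conn_str="<storage-connection-string>",  # hypothetical
    container_name="business-contexts",      # hypothetical
    blob_name="order-context/state.json",    # one state blob per context
)

# Acquire an exclusive lease on the context's state blob (15-60 s, or -1 for infinite).
lease = blob.acquire_lease(lease_duration=60)
try:
    # Read the current processing state for this business context.
    state = blob.download_blob(lease=lease).readall()

    # ... run the business step here, producing new_state ...
    new_state = state  # placeholder for the real processing result

    # Write the updated state back under the same lease.
    blob.upload_blob(new_state, overwrite=True, lease=lease)
finally:
    # Release the lease so the next processor (or sub-nested process) can take over.
    lease.release()
```

Because each sub-nested process can point at its own blob, the same sketch scales to the multi-dimensional model described above: the blob name is the context identity, and the lease is the ownership token for that context's state.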
