
I'm brainstorming the architecture for a new SaaS that will need to process events from one Salesforce instance per customer. I'll be using AWS AppFlow with Lambdas to process the events.

The basic flow is that the user signs up for the SaaS via a web app, at which point I'll need to create the AppFlow flow for that specific user. The user will sign in to Salesforce via OAuth during the sign-up process, which should give me the access token for their org.

What I'm confused by is how to make this process dynamic/automated - should I be calling AWS APIs (e.g. CreateFlow, CreateConnectorProfile)? I've only ever worked with static YAML resource files for AWS, since that was a one-time setup, not multi-tenant. Obviously, having to manually push code each time a user signs up would be a hassle.

Would passing in the access token be enough (from an AuthN/AuthZ point of view) to call the create APIs?
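For context, the API-driven provisioning I have in mind would look something like this (an untested sketch with boto3; the tenant ID, instance URL, and token plumbing are placeholders I'd fill in from my OAuth callback):

```python
def build_connector_profile_request(tenant_id, instance_url,
                                    access_token, refresh_token):
    """Shape the AppFlow CreateConnectorProfile request for one tenant.

    Key names follow the AppFlow API; everything else (naming scheme,
    sandbox flag) is a placeholder assumption.
    """
    return {
        "connectorProfileName": f"salesforce-{tenant_id}",
        "connectorType": "Salesforce",
        "connectionMode": "Public",
        "connectorProfileConfig": {
            "connectorProfileProperties": {
                "Salesforce": {
                    "instanceUrl": instance_url,
                    "isSandboxEnvironment": False,
                }
            },
            "connectorProfileCredentials": {
                "Salesforce": {
                    "accessToken": access_token,
                    "refreshToken": refresh_token,
                }
            },
        },
    }


def provision_tenant(tenant_id, instance_url, access_token, refresh_token):
    """Create the per-tenant connector profile (needs AWS credentials)."""
    import boto3  # deferred so the builder above stays dependency-free
    appflow = boto3.client("appflow")
    req = build_connector_profile_request(
        tenant_id, instance_url, access_token, refresh_token)
    return appflow.create_connector_profile(**req)
```

A CreateFlow call referencing the same `connectorProfileName` would follow the same pattern.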

user3073511

1 Answer


Nope, it can be a lot simpler than that, if I'm reading your idea correctly. I've personally had good results with Salesforce Platform Events for this.

https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-saas-salesforce.html

To summarize:

  1. You create a "listener" in EventBridge/AppFlow for Salesforce-sourced events.
  2. Funnel those events to an EventBridge event bus. Overflow/failed events can go to S3.
  3. A rule on that event bus then invokes a Lambda, where you can make your decisions.
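The steps above can be sketched with boto3 as follows (the partner event bus name, tenant ID, and Lambda ARN are placeholders; the event pattern is an assumption about how the partner source is named):

```python
import json


def build_rule_and_target(partner_bus, tenant_id, lambda_arn):
    """Shape the EventBridge PutRule/PutTargets requests that route a
    tenant's Salesforce-sourced events from the partner event bus to a
    Lambda. The rule name and target ID scheme are placeholders."""
    rule = {
        "Name": f"salesforce-events-{tenant_id}",
        "EventBusName": partner_bus,
        # Match anything delivered by the AppFlow partner source.
        "EventPattern": json.dumps(
            {"source": [{"prefix": "aws.partner/appflow"}]}),
        "State": "ENABLED",
    }
    targets = {
        "Rule": rule["Name"],
        "EventBusName": partner_bus,
        "Targets": [{"Id": f"{tenant_id}-lambda", "Arn": lambda_arn}],
    }
    return rule, targets


def wire_up(partner_bus, tenant_id, lambda_arn):
    """Apply the rule and target (needs AWS credentials)."""
    import boto3  # deferred so the builder above stays dependency-free
    events = boto3.client("events")
    rule, targets = build_rule_and_target(partner_bus, tenant_id, lambda_arn)
    events.put_rule(**rule)
    events.put_targets(**targets)
```

You'd also need to grant EventBridge permission to invoke the Lambda (`lambda:AddPermission`), which I've left out here.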

In the platform event, you'd define a few fields such as the access token, name, org ID, etc. You can define a simple Apex class that publishes the platform event once it's created. I generally have a Salesforce Flow call an invocable method on this Apex class to perform last-second cleanup and publishing.

https://developer.salesforce.com/docs/atlas.en-us.platform_events.meta/platform_events/platform_events_publish_apex.htm

List<Low_Ink__e> inkEvents = new List<Low_Ink__e>();
inkEvents.add(new Low_Ink__e(Printer_Model__c='XZO-5', Serial_Number__c='12345', 
              Ink_Percentage__c=0.2));
inkEvents.add(new Low_Ink__e(Printer_Model__c='MN-123', Serial_Number__c='10013', 
              Ink_Percentage__c=0.15));


// Call method to publish events
List<Database.SaveResult> results = EventBus.publish(inkEvents);

// Inspect publishing result for each event
for (Database.SaveResult sr : results) {
    if (sr.isSuccess()) {
        System.debug('Successfully published event.');
    } else {
        for(Database.Error err : sr.getErrors()) {
            System.debug('Error returned: ' +
                        err.getStatusCode() +
                        ' - ' +
                        err.getMessage());
        }
    }       
}

When the Lambda fires, it receives a single record with all the data you're looking for. It's easy to move into any storage layer like DynamoDB or S3, or to use immediately.
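A minimal handler sketch: the custom field names (`Org_Id__c`, etc.) are hypothetical stand-ins for whatever you define on the platform event, and the exact nesting of the EventBridge payload may vary, so treat this as a starting point:

```python
def handler(event, context):
    """Lambda entry point for one EventBridge-delivered platform event.

    Extracts the tenant fields and returns them; in practice you'd
    persist them instead (e.g. a boto3 DynamoDB put_item, sketched in
    the comment below).
    """
    detail = event.get("detail", {})
    # Salesforce partner events typically nest the fields under
    # "payload"; fall back to detail itself if the shape differs.
    payload = detail.get("payload", detail)
    record = {
        "org_id": payload.get("Org_Id__c"),
        "name": payload.get("Name__c"),
        "access_token": payload.get("Access_Token__c"),
    }
    # import boto3
    # boto3.resource("dynamodb").Table("tenants").put_item(Item=record)
    return record
```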

booky99
  • So, I know next to nothing about developing for Salesforce, but does this solution extend easily to multiple customers? It sounds like I'll have to edit the customer's Salesforce instance to define the Platform Event, which doesn't make the solution self-serve from the customer's point of view? – user3073511 Jul 04 '22 at 15:46
  • Yes, you're right: my solution requires modifying their org. I'm kind of confused about what you would need their access token for upon login to your app. Do you plan on modifying their org or something? – booky99 Jul 04 '22 at 22:39
  • 1
    AppFlow needs the access token to integrate with Salesforce. Step 2 of this tutorial https://aws.amazon.com/blogs/apn/building-secure-and-private-data-flows-between-aws-and-salesforce-using-amazon-appflow/ shows when the user would log in if the flow was created manually. However if I create the integration using the API (and not the AWS UI), I should be able to provide the access token as part of the ConnectorProfileCredentials object (https://docs.aws.amazon.com/appflow/1.0/APIReference/API_SalesforceConnectorProfileCredentials.html) – user3073511 Jul 05 '22 at 18:06