
We have a Ruby on Rails front end and distributed back-end services running batch jobs via Redis.

The front end will acquire the initial Xero OAuth2 token.

The access token expires after 30 minutes; the refresh token lasts 60 days (great), but can only be used once.

When multiple back-end processes try to access Xero after the token has expired, only one can succeed; all the others will fail because the refresh token is single use.

Database transaction boundaries make sharing the token via the database impractical.

I've considered the following solutions (I don't like any of them):

  • A singleton to handle all Xero interactions.
  • Wrap Xero calls in a retry process and check Redis for a token that may have been updated by another process.
  • Making the user go through the OAuth2 login repeatedly to obtain a token for each back-end process.
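The first two options can be combined: keep one token store behind a lock and have every worker re-check the expiry inside the lock before refreshing, so the single-use refresh token is only exchanged once. A minimal runnable sketch follows, with a plain Ruby `Mutex` and hash standing in for a Redis lock and key (the class and field names are illustrative, not a Xero or redis-rb API):

```ruby
require "securerandom"

# Stand-in for a shared token store such as Redis. In production the Mutex
# would be a Redis lock (e.g. SET key value NX PX) and the token set a
# Redis hash; plain Ruby objects keep this sketch runnable.
class SharedTokenStore
  attr_reader :refresh_count

  def initialize
    @mutex = Mutex.new
    @token_set = { access_token: "expired-token", expires_at: Time.now - 60 }
    @refresh_count = 0
  end

  # Returns a valid access token, refreshing at most once even when many
  # workers notice the expiry at the same time: each one re-checks the
  # expiry *inside* the lock before refreshing.
  def access_token
    @mutex.synchronize do
      refresh! if @token_set[:expires_at] <= Time.now
      @token_set[:access_token]
    end
  end

  private

  # Simulates exchanging the single-use refresh token with Xero.
  def refresh!
    @refresh_count += 1
    @token_set = { access_token: SecureRandom.hex(8),
                   expires_at: Time.now + 1800 }
  end
end

store = SharedTokenStore.new
# Four workers race on an expired token; only one refresh occurs and all
# four receive the same new token.
tokens = 4.times.map { Thread.new { store.access_token } }.map(&:value)
```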

Unfortunately I'm new to both Ruby and Redis, so specifics on implementation in this environment would be very useful.

Robert Sutton

2 Answers


So this is the approach I took.

When fetching the token from the database or refreshing it, do it on a new thread, which ensures Active Record uses a separate database connection. That way, open database transactions don't prevent the process from seeing the latest committed version of the token.
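A minimal sketch of that pattern: a helper that runs a block on its own thread and returns its result. With Active Record, a new thread checks out its own connection from the pool, so reads inside the block aren't scoped to any transaction open on the calling thread. The Active Record calls are shown only in a comment (with illustrative model names) so the sketch stays runnable without Rails:

```ruby
# Run the given block on a fresh thread and return its result.
# Thread#value waits for the thread and re-raises any exception it raised.
def outside_current_transaction(&block)
  Thread.new(&block).value
end

# In the Rails app the call would look something like this (XeroToken is a
# hypothetical model holding the token set):
#
#   outside_current_transaction do
#     ActiveRecord::Base.connection_pool.with_connection do
#       XeroToken.order(:updated_at).last   # latest committed token row
#     end
#   end

# Runnable demonstration that results propagate back to the caller:
result = outside_current_transaction { 1 + 1 }
```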

Robert Sutton

I would highly recommend that you don't enqueue the jobs with the actual token credentials as metadata passed to Redis.

Your access_token and refresh_token should be stored in just a single place. When you enqueue a job, pass it only a user_id; when the job is processed, it looks up user.token_set and passes user.token_set['access_token'] to the xero_client (or however you have it set up) with the xero-ruby SDK.

When you refresh the access_token, a new token_set is returned, including a new refresh_token. Rinse and repeat.
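That flow can be sketched end to end. The `USERS` hash stands in for the single source of truth (a users table or Redis hash), and `exchange_refresh_token` simulates posting the single-use refresh token to Xero's token endpoint; all of these names are illustrative, not the xero-ruby API:

```ruby
require "securerandom"

# Single source of truth for token sets, keyed by user_id. In production
# this would be a users table column or a Redis hash.
USERS = {
  42 => { token_set: { "access_token" => "a1", "refresh_token" => "r1" } }
}

# Simulates exchanging the single-use refresh token with Xero: a whole new
# token set comes back, including a new refresh_token.
def exchange_refresh_token(_refresh_token)
  { "access_token"  => SecureRandom.hex(8),
    "refresh_token" => SecureRandom.hex(8) }
end

# The job receives only a user_id; credentials never travel through Redis.
def perform(user_id)
  user = USERS.fetch(user_id)
  user[:token_set] = exchange_refresh_token(user[:token_set]["refresh_token"])
  user[:token_set]["access_token"]   # hand this to the xero_client
end

token = perform(42)
```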

"Database transaction boundaries make sharing the token via the Database impractical"

This is a bit confusing to me. What are the background jobs doing? Aren't they reading and writing data to the same database you would store that token_set in? Or are they strictly calling non-app-level databases/APIs?

SerKnight
  • If the token is stored in the DB and multiple transactioned jobs run at the same time, then when the token expires both need to refresh (only one can succeed), and because of the DB transactions neither can see the other's updates to the token stored in the DB. This results in one transaction failing and having to roll back. – Robert Sutton Jul 21 '20 at 03:31
  • "Each time you perform a token refresh, you should save the new refresh token returned in the response. If, for whatever reason, your app doesn't receive the response you can retry your existing token for a grace period of 30 minutes" (https://developer.xero.com/documentation/oauth2/auth-flow). That should help with any crazy unlucky timing. In addition, each background job should likely refresh and save the token set each time. Are you using Sidekiq? It should also be fairly straightforward to retry any jobs and refresh the token if you do run into any failures. – SerKnight Jul 21 '20 at 13:12
  • When the batches run in excess of 30 minutes it is guaranteed to happen, but even a one-minute batch has a 1/30 chance. Given the batches are long running (hours), rolling back is undesirable, expensive, and very likely if not a certainty. The key issue here is that long-running transactions (which make updates to the DB invisible) will result in one or more processes not seeing the updated token and having no way to acquire it without rolling back an hours-long transaction. – Robert Sutton Jul 22 '20 at 04:32
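The retry idea from the comments can be sketched as a small wrapper: on an auth failure, re-fetch the token (which another process may have refreshed in the meantime, within Xero's 30-minute grace period) and retry once. `FakeTokenStore` and `AuthError` are stand-ins for the real token store and the SDK's 401 error, so the sketch stays runnable:

```ruby
AuthError = Class.new(StandardError)

# Fake token store: the first read returns a stale token; a re-fetch
# returns the token another process has since refreshed.
class FakeTokenStore
  def initialize
    @tokens = ["stale", "fresh"]
  end

  def access_token
    @tokens.length > 1 ? @tokens.shift : @tokens.first
  end
end

# On an auth failure, re-fetch the current token and retry the block.
def with_token_retry(store, retries: 1)
  attempts = 0
  begin
    attempts += 1
    yield store.access_token
  rescue AuthError
    retry if attempts <= retries
    raise
  end
end

store = FakeTokenStore.new
result = with_token_retry(store) do |token|
  raise AuthError if token == "stale"   # simulated 401 from Xero
  "ok:#{token}"
end
```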