
At our company we are using RediSearch with a non-trivial amount of data in it (think 3 or 4 gigabytes). Filling it with our normal script takes a couple of hours. Now something has changed in the document structure and we have to update the running production instance.

I am wondering what the best approach is without (major) downtime, and how it can be done without a lot of manual intervention.

  • One option is to spin up a new instance, fill it with the new data, and switch the service to it once everything is loaded. Afterwards we can destroy the old instance.
  • We can use a similar approach with RIOT. Here we spin up a new instance, fill it up, and then keep it in sync using RIOT.
  • We can set up a .rdb file somewhere else and use it to import the data.
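For the RIOT variant, the sync step might look roughly like the command below. The hostnames are placeholders, and the exact flags vary between RIOT versions, so check `riot replicate --help` against the version you install:

```shell
# Copy all keys from the old production instance to the freshly
# filled instance; newer RIOT versions also support a live mode
# that keeps replicating changes until you cut over.
riot replicate redis://old-redis:6379 redis://new-redis:6379
```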

But we also see some downsides with these approaches:

  • The first option seems good; a small downside is that you need double the resources for a while.
  • With the RIOT approach, we assume that while the data is syncing the service can receive both the new data structure and the old one, which might be hard to code for and test before deployment.
  • Building a .rdb file locally becomes harder the more data you have, since most people don't have the required memory on their machines.
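The second downside (reading both schemas during the sync window) is usually handled with a small normalization layer in the service. A minimal sketch, assuming a hypothetical schema change where an old `name` field was split into `first_name` and `last_name` (adjust to your actual field changes):

```python
from typing import Dict


def read_document(raw: Dict[str, str]) -> Dict[str, str]:
    """Normalize a Redis hash to the new schema, whichever version it is.

    Illustrative only: the field names here are assumptions, not the
    actual production schema from the question.
    """
    if "first_name" in raw:  # already stored in the new schema
        return {
            "first_name": raw["first_name"],
            "last_name": raw.get("last_name", ""),
        }
    # Old schema: split the single 'name' field on the first space.
    first, _, last = raw.get("name", "").partition(" ")
    return {"first_name": first, "last_name": last}
```

Routing every read through one function like this keeps the dual-schema logic in a single, easily testable place, and the function can be deleted once the migration is done.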

Are there more solutions for this problem?

1 Answer


Redis Connect, a new product that we will launch soon, solves this problem by extracting data from database changes (or other sources), transforming the data, and storing it as Hash or JSON in the Redis Enterprise database where you run the RediSearch index.

This solution will insert new data, update existing data and delete removed entries. Will this work for you?
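The extract-transform-store flow described above can be sketched as a small function that maps a change event to a Redis key and hash. This is purely illustrative: the event shape (`op`, `table`, `id`, `after`) is an assumption for the example, not Redis Connect's actual API:

```python
from typing import Dict, Optional, Tuple


def transform(event: dict) -> Tuple[str, Optional[Dict[str, str]]]:
    """Map a hypothetical change-data-capture event to a Redis write.

    Returns (key, fields); fields is None for a delete, meaning the
    key should be removed instead of written.
    """
    key = f"{event['table']}:{event['id']}"
    if event["op"] == "delete":
        return key, None  # caller issues DEL
    # Inserts and updates both become a hash write (HSET).
    fields = {k: str(v) for k, v in event["after"].items()}
    return key, fields
```

A consumer loop would then apply each result with `HSET` or `DEL`, which is how inserts, updates, and deletes all end up reflected in the indexed data.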

Yaron