
Background

I'm creating a Notion DB that will contain data about the different analyzers my team uses (analyzer name, location, the last time the analyzer sent data, etc.). Since I'm working with live data, I need a way to quickly update the data of all analyzers in the Notion DB.

I'm currently using a Python script to fetch the analyzers' data and upload it to the Notion DB. I read each row, get its ID, and use that ID to update the row's data, but this is too slow: it takes more than 30 seconds to update 100 rows.
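
For reference, the update loop currently looks roughly like this (a simplified sketch using the notion-client package; the token, database ID, and the "Name"/"Last Seen" property names are placeholders):

```python
from notion_client import Client

notion = Client(auth="secret_...")  # integration token (placeholder)

# Latest readings keyed by analyzer name (placeholder data).
fresh_data = {"analyzer-1": "2022-09-06T08:00:00Z"}

# Read every row (page) of the DB, then update each one individually.
rows = notion.databases.query(database_id="<database_id>")["results"]
for row in rows:
    name = row["properties"]["Name"]["title"][0]["plain_text"]
    notion.pages.update(  # one HTTP round trip per row -- the slow part
        page_id=row["id"],
        properties={"Last Seen": {"date": {"start": fresh_data[name]}}},
    )
```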

The Question

I'd like to know if there's a way to quickly update the data of many rows at once (maybe in one big bulk operation). The goal is roughly 100 row updates per second, instead of the current 100 updates per 30+ seconds.

snatchysquid
  • 1,283
  • 9
  • 24

2 Answers

1

There are multiple things one could do here; sadly, none of them will improve the updates drastically. There is currently no way to update multiple rows (or, to be more precise, pages) in a single request. I am not sure what "read each row" refers to, but you can retrieve multiple pages of a database at once, up to 100 per request. If you are retrieving them one by one, this is worth changing; see the sketch below.
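
A minimal sketch of batched retrieval with the notion-client Python package (the token and database ID are placeholders):

```python
from notion_client import Client

notion = Client(auth="secret_...")  # integration token (placeholder)

def fetch_all_pages(database_id):
    """Retrieve every page of the database, 100 per request."""
    pages, cursor = [], None
    while True:
        kwargs = {"database_id": database_id, "page_size": 100}  # 100 = API max
        if cursor:
            kwargs["start_cursor"] = cursor  # continue where the last batch ended
        response = notion.databases.query(**kwargs)
        pages.extend(response["results"])
        if not response["has_more"]:
            return pages
        cursor = response["next_cursor"]
```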

Secondly, I'd like to know how often the analyzers change, and if they do, whether they will be altered by the Python script or edited directly in Notion. If changes are rare, you might be able to cache the page_ids instead of retrieving them on every update. Sadly, the last_edited_time of the database does not reflect the addition or removal of its children, so simply checking that timestamp is not an option.
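
A sketch of such a cache, assuming the analyzer name lives in a title property called "Name" (a placeholder) and reusing fetch_all_pages from the snippet above:

```python
import json
import os

CACHE_FILE = "notion_page_ids.json"  # placeholder path

def load_page_ids(database_id):
    """Map analyzer name -> page_id, hitting the API only on a cache miss."""
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE) as f:
            return json.load(f)
    mapping = {
        page["properties"]["Name"]["title"][0]["plain_text"]: page["id"]
        for page in fetch_all_pages(database_id)  # from the snippet above
    }
    with open(CACHE_FILE, "w") as f:
        json.dump(mapping, f)
    return mapping
```

If rows are added or removed, you would need to invalidate this file by hand (or on a schedule), since, as noted, the database's last_edited_time won't tell you.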

The third and last way to improve performance is multi-threading. You can send multiple requests at the same time, since the number of sequential round trips, rather than the payload size, is usually the bottleneck.
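
A sketch using concurrent.futures; the pool size of 3 is a guess chosen to stay near the rate limit described in the other answer, and page_ids/fresh_data stand in for your own name-to-page-ID cache and analyzer readings:

```python
from concurrent.futures import ThreadPoolExecutor

from notion_client import Client

notion = Client(auth="secret_...")  # integration token (placeholder)

def update_page(page_id, last_seen):
    # Still one request per page, but several of them in flight at once.
    notion.pages.update(
        page_id=page_id,
        properties={"Last Seen": {"date": {"start": last_seen}}},
    )

# page_ids: name -> page_id mapping (e.g. from the cache above);
# fresh_data: name -> latest reading from the analyzers.
with ThreadPoolExecutor(max_workers=3) as pool:
    for name, page_id in page_ids.items():
        pool.submit(update_page, page_id, fresh_data[name])
```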

I know none of these will fully solve your problem, but sadly no efficient method for updating multiple pages exists.

Simon
  • 126
  • 2
  • Yes, when I say rows of the DB I believe I mean pages. Retrieving them is not the problem; updating them is the slow part. Analyzers change all the time, but since updating them in Notion takes time, we run the script (using AWS Lambda) every few minutes. Caching IDs sounds like a good idea and I will try it, along with multi-threading. Will update :) – snatchysquid Sep 06 '22 at 08:18
0

There is also the rate limit of an average of 3 requests per second, which Notion enforces to ensure fair performance for all users. If you send more requests than that, you will start receiving responses with an HTTP 429 status code. Your integration should respect this response and avoid sending further requests until the number of seconds indicated in the Retry-After header has elapsed, as described in the Notion developer API guidelines on request limits.
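
A minimal retry sketch using the requests library directly (the token, page ID, and property payload are placeholders; Retry-After is the header the guidelines describe):

```python
import time

import requests

HEADERS = {
    "Authorization": "Bearer secret_...",  # integration token (placeholder)
    "Notion-Version": "2022-06-28",
    "Content-Type": "application/json",
}

def patch_page(page_id, properties):
    """Update one page, sleeping and retrying whenever Notion answers 429."""
    url = f"https://api.notion.com/v1/pages/{page_id}"
    while True:
        resp = requests.patch(url, headers=HEADERS, json={"properties": properties})
        if resp.status_code == 429:
            # Wait as long as Notion asks before trying again.
            time.sleep(float(resp.headers.get("Retry-After", "1")))
            continue
        resp.raise_for_status()
        return resp.json()
```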

Nel Prinsloo
  • 81
  • 2
  • 6