
I have a local installation of CockroachDB on my Windows PC, and when I run a particular SELECT query, I get the following error message:

7143455 bytes requested, 127403581 currently allocated, 134217728 bytes in budget.

I have read a blog post here but I haven't found a solution. I would appreciate help on how to increase this budget limit.

Yusuff Sodiq
  • What sort of queries are you running? At a guess, this might be a bug that was found in memory accounting for queries involving JSONB columns, which will be fixed in version 2.0.3 in a couple of weeks. – Ben Darnell May 28 '18 at 12:58
  • @BenDarnell you're right, the query involves a JsonB column. The query is: `SELECT ID,JsonData,PrimaryIDs,IsActive,IsDeleted FROM ( SELECT ID,JsonData,PrimaryIDs,IsActive,IsDeleted, ROW_NUMBER() OVER (ORDER BY ID) AS RowNum FROM "TableName" where LOWER(JsonData->> 'Name') like '%transaction%' ) AS TBL WHERE TBL.RowNum BETWEEN 1 AND 100 order by ID;` – Yusuff Sodiq May 28 '18 at 16:54

1 Answer


CockroachDB versions between 2.0 and 2.0.2 have a bug in memory accounting for JSONB columns, leading to this error. The bug will be fixed in version 2.0.3, due in mid-June.

As a workaround, you may be able to rewrite this query to be more efficient. That might reduce memory usage enough to avoid the bug; even if it doesn't, it will speed up the query once 2.0.3 is available. If I'm reading your query correctly, it is equivalent to:

SELECT ID, JsonData, PrimaryIDs, IsActive, IsDeleted FROM "TableName"
    WHERE LOWER(JsonData->>'Name') LIKE '%transaction%'
    ORDER BY ID OFFSET 0 FETCH NEXT 100 ROWS ONLY;

The subquery with ROW_NUMBER() was a common pagination pattern in older versions of SQL Server, but since SQL Server 2012 the OFFSET 0 FETCH NEXT N ROWS ONLY form has been available and is more efficient.

The OFFSET 0 FETCH NEXT N ROWS ONLY syntax comes from the SQL standard, so it should work with most databases. CockroachDB also supports the LIMIT keyword, which MySQL and PostgreSQL use for the same purpose.
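For example, here is a sketch of the same query written with LIMIT instead (assuming the same table and column names as above; untested against your schema):

    -- Equivalent pagination using the LIMIT keyword
    -- (supported by CockroachDB, PostgreSQL, and MySQL)
    SELECT ID, JsonData, PrimaryIDs, IsActive, IsDeleted FROM "TableName"
        WHERE LOWER(JsonData->>'Name') LIKE '%transaction%'
        ORDER BY ID
        LIMIT 100 OFFSET 0;

Either spelling lets the database stop after producing 100 rows instead of numbering every matching row in a subquery first.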

Ben Darnell
  • Thanks for the optimized query. I used it and the error has gone. Looking forward to the fix in version 2.0.3 anyway. – Yusuff Sodiq May 31 '18 at 08:54