I am using the jackc/pgx library to insert large objects into Postgres. It works fine when the large objects are small, but in one case the large object was almost 1.8 GB in size, and the write operation failed with an "out of memory (SQLSTATE 54000)" error.
Here is the code snippet showing how I am inserting blobs:
import (
    "fmt"
    "io/ioutil"

    "github.com/jackc/pgx/v4"
    "github.com/jackc/pgx/v4/pgxpool"
)

// Read the entire file to be imported as a large object into memory
b, err := ioutil.ReadFile(pathToLargeObjectFile)

// Large object operations must run inside a transaction
txWrite, err := dbPool.Begin(ctx)
loWrite := txWrite.LargeObjects()

fmt.Printf("Creating new blob with ID : %d", ID)
id, err := loWrite.Create(ctx, ID)

// Open the blob with the returned ID for reading and writing
obj, err := loWrite.Open(ctx, id, pgx.LargeObjectModeRead|pgx.LargeObjectModeWrite)

// Write the whole byte slice in a single call
n, err := obj.Write(b)
fmt.Printf("Written %d bytes to blob %d\n", n, id)
I get the error on this line:
n, err := obj.Write(b)
How do I prevent the error and successfully import the large object?
I read the post Inserting Large Object into Postgresql returns 53200 Out of Memory error, which writes the bytes in chunks instead of all at once.
Is something similar possible with jackc/pgx?
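For reference, this is roughly the chunked/streaming approach I have in mind. It is only a sketch: the function name importLargeObject and the 1 MiB buffer size are my own choices, and it relies on pgx's LargeObject being usable as an io.Writer so that io.CopyBuffer can stream the file instead of loading all 1.8 GB into memory.

import (
    "context"
    "io"
    "os"

    "github.com/jackc/pgx/v4"
)

// importLargeObject streams the file at path into the large object
// identified by oid, copying through a fixed-size buffer.
func importLargeObject(ctx context.Context, tx pgx.Tx, oid uint32, path string) error {
    f, err := os.Open(path)
    if err != nil {
        return err
    }
    defer f.Close()

    lo := tx.LargeObjects()
    obj, err := lo.Open(ctx, oid, pgx.LargeObjectModeWrite)
    if err != nil {
        return err
    }
    defer obj.Close()

    // Copy in 1 MiB chunks rather than one Write of the whole slice.
    buf := make([]byte, 1024*1024)
    _, err = io.CopyBuffer(obj, f, buf)
    return err
}

Would something along these lines avoid the out-of-memory error, or is there a recommended way to do chunked writes with pgx large objects?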