PostgreSQL 9.1, Npgsql 2.0.12
I have binary data I want to store in a PostgreSQL database. Most files load fine; however, one large (664 MB) binary file causes problems. When I try to load the file into PostgreSQL using large object support through Npgsql, the PostgreSQL server returns an 'out of memory' error.
I'm running this at present on a workstation with 4 GB of RAM, of which 2 GB is free with PostgreSQL running in an idle state.
This is the code I am using, adapted from the PG Foundry Npgsql User's Manual:
using (var transaction = connection.BeginTransaction())
{
    try
    {
        // Large object operations have to run inside a transaction
        var manager = new NpgsqlTypes.LargeObjectManager(connection);

        // Create a new large object and get its OID back
        var noid = manager.Create(NpgsqlTypes.LargeObjectManager.READWRITE);

        // Open the object and write the entire byte array in one call
        var lo = manager.Open(noid, NpgsqlTypes.LargeObjectManager.READWRITE);
        lo.Write(BinaryData);
        lo.Close();

        transaction.Commit();
        return noid;
    }
    catch
    {
        transaction.Rollback();
        throw;
    }
}
I've tried modifying PostgreSQL's memory settings from the defaults to all manner of values, adjusting:
- shared_buffers
- work_mem
- maintenance_work_mem
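For illustration, the kind of edits I was making looked like this (the values below are placeholders, not a record of the exact settings I tried):

# postgresql.conf excerpt -- illustrative values only
shared_buffers = 512MB
work_mem = 64MB
maintenance_work_mem = 256MB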
So far I've found PostgreSQL to be a great database system, but this is a showstopper at present: I can't get a file of this size into the database. I'd rather not have to manually chop the file into chunks and reassemble it client-side if I can help it (a sketch of what that would involve follows below).
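For concreteness, this is roughly the chunked write I'm hoping to avoid. It's only a sketch: the 1 MB chunk size is an arbitrary value I picked, and I'm assuming LargeObject exposes a Write(byte[] buf, int off, int len) overload like the JDBC large object API it was ported from, which I haven't verified against Npgsql 2.0.12.

// Sketch of a chunked upload, inside the same transaction as above
const int ChunkSize = 1024 * 1024; // 1 MB per server-side lowrite call
var lo = manager.Open(noid, NpgsqlTypes.LargeObjectManager.READWRITE);
for (int offset = 0; offset < BinaryData.Length; offset += ChunkSize)
{
    // Write one slice at a time so no single call sends the whole 664 MB
    int count = Math.Min(ChunkSize, BinaryData.Length - offset);
    lo.Write(BinaryData, offset, count);
}
lo.Close();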
Please help!?