
I can read 1800 rows from the database, but if I try to read more, the query fails with "An I/O error occurred while sending to the backend." I also cannot use a PreparedStatement to insert records into the database. This is my PostgreSQL wire configuration for QuestDB (a rough sketch of the JDBC usage in question follows the config).

pg.enabled=true
pg.net.active.connection.limit=1000
pg.net.bind.to=0.0.0.0:8812
pg.net.event.capacity=1024
pg.net.io.queue.capacity=1024
pg.net.idle.timeout=300000
pg.net.interest.queue.capacity=1024
pg.net.listen.backlog=50000
pg.net.recv.buf.size=-1
pg.net.send.buf.size=-1
pg.character.store.capacity=4096
pg.character.store.pool.capacity=64
pg.connection.pool.capacity=64
pg.password=quest
pg.user=admin
pg.factory.cache.column.count=16
pg.factory.cache.row.count=16
pg.idle.recv.count.before.giving.up=10000
pg.idle.send.count.before.giving.up=10000
pg.max.blob.size.on.query=5120k
pg.recv.buffer.size=10M
pg.send.buffer.size=10M
pg.date.locale=en
pg.timestamp.locale=en
pg.worker.count=2
#pg.worker.affinity=-1,-1;
pg.halt.on.error=false
pg.daemon.pool=true
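
For context, here is a minimal JDBC sketch of the kind of read and insert described above. It is not the exact code, just an illustration: the table trades and its columns ts and price are hypothetical placeholders, while the host, port, user, and password come from the config above.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Timestamp;
import java.util.Properties;

public class QuestDbPgExample {
    public static void main(String[] args) throws Exception {
        // Connection settings taken from the pg.* config above.
        Properties props = new Properties();
        props.setProperty("user", "admin");
        props.setProperty("password", "quest");

        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:8812/qdb", props)) {

            // Read rows; the failure reportedly appears once the result
            // set grows past roughly 1800 rows.
            try (PreparedStatement select = conn.prepareStatement(
                    "SELECT ts, price FROM trades");
                 ResultSet rs = select.executeQuery()) {
                while (rs.next()) {
                    Timestamp ts = rs.getTimestamp(1);
                    double price = rs.getDouble(2);
                    // process the row...
                }
            }

            // Insert via PreparedStatement, which also reportedly fails.
            try (PreparedStatement insert = conn.prepareStatement(
                    "INSERT INTO trades (ts, price) VALUES (?, ?)")) {
                insert.setTimestamp(1, new Timestamp(System.currentTimeMillis()));
                insert.setDouble(2, 42.0);
                insert.executeUpdate();
            }
        }
    }
}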

1 Answer


Without seeing QuestDB's logs or the error message from the JDBC driver, I can only suggest trying the new version released just recently.

docker pull questdb/questdb:5.0.5.4-linux-amd64

It fixes several PostgreSQL wire protocol bugs. One of them is an edge case with TCP fragmentation, where the TCP connection could be abruptly terminated by QuestDB and the JDBC driver would then report "An I/O error occurred while sending to the backend".