We are working on a big project (with a lot of metadata) using Jena TDB. About a month ago a memory problem suddenly appeared: the program had been working properly for months and no changes had been made, but now we are not able to upload any more data.

We have been working on this issue for several weeks, and we think the problem is caused by some of our .dat files being larger than 16 GB. We have read that the index system used by TDB employs 64 bits for each index: 8 bits for the type + 44 bits for disk allocation + 12 bits for the vnode. With 44 bits we can manage 16 GB, and we think this is where the memory problem appears.
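
For reference, this is a minimal sketch of how we check which .dat files in the database directory exceed 16 GB (the ./tdb path and the class name are just placeholders, not anything from Jena itself):

    import java.io.IOException;
    import java.nio.file.*;
    import java.util.stream.Stream;

    public class DatFileSizes {
        // 16 GiB threshold we suspect is the limit
        private static final long LIMIT = 16L * 1024 * 1024 * 1024;

        public static void main(String[] args) throws IOException {
            // Placeholder path; pass the real TDB directory as the first argument
            Path tdbDir = Paths.get(args.length > 0 ? args[0] : "./tdb");
            try (Stream<Path> files = Files.list(tdbDir)) {
                files.filter(p -> p.toString().endsWith(".dat"))
                     .forEach(p -> {
                         try {
                             long size = Files.size(p);
                             System.out.printf("%s\t%d bytes%s%n",
                                     p.getFileName(), size,
                                     size > LIMIT ? "  <-- over 16 GB" : "");
                         } catch (IOException e) {
                             System.err.println("Could not read size of " + p);
                         }
                     });
            }
        }
    }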

Could you please tell us if we are correct? If so, can you please tell us about the best solution?

