
I've been testing Hypertable recently, attracted by its advertised ability to handle huge amounts of time-series data. My version is 0.9.7.5, and I inserted about one million "rows", each with about 1000 columns labeled date_time:"date" holding values such as '100.0' (this is not yet huge).

However, at every Hypertable startup, and despite every memory tuning option I know of in the hypertable.cfg file, even when accounting for heap fragmentation, the Hypertable process uses all my RAM and starts to swap.

Would anyone have any idea about why this is happening and maybe how to stop this very annoying behaviour?

I really like Hypertable for its user-friendliness, power, and speed, but I may be tempted to switch to the more memory-controllable Cassandra if this problem has no solution, since I cannot keep throwing RAM at it forever.

Thanks a lot,

Nicolas

Nicko
  • Which processes exactly? Can you run "top" to see which consume most? Are you running all servers on the same machine? – cruppstahl May 07 '13 at 14:39
  • Hey cruppstahl! It's the rangeserver process. And I am in standalone mode. – Nicko May 07 '13 at 14:48
  • Still looking for an answer! Anyone? I understand there is memory usage at startup, but I would like to be able to estimate it beforehand and to reduce it for subsequent startups. – Nicko May 13 '13 at 14:00

1 Answer


You can control the RangeServer's memory consumption with the configuration option Hypertable.RangeServer.MemoryLimit.Percentage. The default is 60%. If all servers run on the same machine, then reducing it to 40% may solve the issue.
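As a minimal sketch (assuming the usual name=value syntax of hypertable.cfg; only the option name itself comes from the documentation, the 40 is just an illustrative value), the relevant line in your config file might look like this:

    # hypertable.cfg (excerpt) - cap the RangeServer at roughly 40% of physical RAM
    Hypertable.RangeServer.MemoryLimit.Percentage=40

The RangeServer should pick this up on its next restart; if you pass an explicit --config path when starting Hypertable, make sure it points at the file you edited.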

cruppstahl
  • I've already tried that and almost every other memory tuning option I've seen in the configuration properties... Setting MemoryLimit.Percentage even to 10% does not change anything. – Nicko May 08 '13 at 10:18