
I have set up an NDB Cluster at my office. There are two physical machines with 128 GB of RAM each. The database size is around 2 GB. We are an ISP and we keep our RADIUS database in the cluster.

What worries me at the moment is that on both systems the data node process is consuming 122 GB out of the 128 GB, which seems shocking. I am quite new to databases, so I am having trouble debugging the issue.


1 Answer


The memory used by NDB data nodes is defined by your cluster configuration, not by the size of the data. So even if the database is only 2 GB, if you have configured the data nodes to run with up to 64 GB of memory, that memory is preallocated to ensure it is available when it is needed.

So look into your config.ini file to see how you have configured the NDB data nodes.
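As an illustration (the values below are hypothetical; your actual config.ini will differ), the memory-related settings for the data nodes typically look something like this:

```ini
# Hypothetical config.ini fragment -- check your own file for the real values.
[ndbd default]
NoOfReplicas=2
# DataMemory is preallocated at data node startup, regardless of how much
# data is actually stored. A large value here explains a large resident size.
DataMemory=98G
# IndexMemory is also preallocated (in NDB 7.6 and later it is deprecated
# and folded into DataMemory).
IndexMemory=16G
```

Lowering `DataMemory` (followed by a rolling restart of the data nodes) will reduce the resident memory footprint. You can also compare configured versus actually used memory from the management client with `ndb_mgm -e "ALL REPORT MEMORYUSAGE"`.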