
I use a ConcurrentHashMap (key: String, value: custom object) as one of my data structures. The map contains about 300,000 objects and is updated frequently (about 20,000 puts and 20,000 removes per minute).

It seems that the whole program slows down after about an hour. Can somebody tell me whether ConcurrentHashMap is made for frequent modifications, or whether there is an alternative that is better for my requirements?

Thanks :)

Michael
    20,000 operations on the map per minute should be ok. Have you tried to profile your program to see where it spends time after an hour? Could it be a memory issue (do you make sure you don't keep a reference to the objects you remove)? – assylias Aug 17 '12 at 10:16
  • Thanks for the fast comment. Yes I already profiled the program and there is no memory overflow. If 40,000 operations are ok, then I will search for other problems. – Michael Aug 17 '12 at 10:23
  • Do you use any concurrencyLevel in the constructor? – Roman C Aug 17 '12 at 10:24
  • No I do not. Normally the map is only used by one thread. Some other threads will use it regularly every 30 sec. – Michael Aug 17 '12 at 10:26
  • 2
    I just ran a quick test: adding and removing 200,000 items from a map containing 300,000 takes about 0.5 seconds (a bit less with 1 thread, a bit more with 20 threads). This will obviously vary + you are probably doing other things in your program but it gives an idea of the magnitude. – assylias Aug 17 '12 at 10:34
  • 1
    Here is a nice explanation for your question: http://stackoverflow.com/questions/1378310/performance-concurrenthashmap-vs-hashmap – Vivek Aug 17 '12 at 10:35
Here is an even better explanation: http://stackoverflow.com/questions/1573901/concurrenthashmap-constructor-parameters – Roman C Aug 17 '12 at 10:52
  • Have you checked how many hash collisions you have? – Philippe Marschall Aug 17 '12 at 11:21
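assylias's timing claim above can be reproduced with a small single-threaded benchmark sketch (the class name, key format, and iteration counts below are illustrative; absolute timings will vary by machine and JVM):

```java
import java.util.concurrent.ConcurrentHashMap;

public class MapChurnBenchmark {
    public static void main(String[] args) {
        ConcurrentHashMap<String, Object> map = new ConcurrentHashMap<>();

        // Pre-populate with ~300,000 entries, matching the question's map size.
        for (int i = 0; i < 300_000; i++) {
            map.put("key-" + i, new Object());
        }

        long start = System.nanoTime();
        // Simulate heavy churn: 200,000 puts and 200,000 removes,
        // roughly what the question describes over several minutes.
        for (int i = 0; i < 200_000; i++) {
            map.put("churn-" + i, new Object());
            map.remove("churn-" + i);
        }
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;

        // The map ends up back at its original size.
        System.out.println("Churn took " + elapsedMs + " ms, size=" + map.size());
    }
}
```

On typical hardware this finishes well under a second, which supports the point that 40,000 map operations per minute is nowhere near ConcurrentHashMap's limits.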

1 Answer


I've used ConcurrentHashMap in production with many writers/readers at over 100,000 reads/writes per minute. We'd let the system run for days, and I never encountered any slowdown due to ConcurrentHashMap.

If your code is slowing down after an hour, the first thing I'd suspect is GC overhead caused by a memory leak. If I were you, I'd hook up jvisualvm to your app, monitor memory/CPU usage, and go from there.
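A common way such a leak arises, consistent with assylias's comment about keeping references to removed objects, is an auxiliary collection that pins values even after they are removed from the map. This is a hypothetical sketch (the `auditLog` list and method names are invented for illustration), not a claim about the asker's code:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ConcurrentHashMap;

public class LeakSketch {
    // Hypothetical side list that accidentally retains every value ever stored.
    static final List<byte[]> auditLog = new ArrayList<>();

    static final ConcurrentHashMap<String, byte[]> map = new ConcurrentHashMap<>();

    static void put(String key, byte[] value) {
        map.put(key, value);
        auditLog.add(value); // leak: this reference outlives map.remove()
    }

    static void remove(String key) {
        // The entry leaves the map, but the value is still reachable
        // via auditLog, so the GC can never reclaim it.
        map.remove(key);
    }

    public static void main(String[] args) {
        for (int i = 0; i < 10_000; i++) {
            put("k" + i, new byte[1024]);
            remove("k" + i);
        }
        // The map is empty, yet roughly 10 MB stays retained by auditLog.
        System.out.println("map size=" + map.size()
                + ", retained values=" + auditLog.size());
    }
}
```

In a heap dump or jvisualvm, this shows up as the map staying small while total heap usage climbs steadily, which matches the "slow after an hour" symptom via increasing GC pressure.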

Enno Shioji
So the JVM options -XX:-UseParallelOldGC and -XX:+UseGCOverheadLimit may be helpful. – Roman C Aug 17 '12 at 11:00
@RomanC: IMO if your app is noticeably slowing down due to GC overhead, either (1) there is a memory leak that needs to be fixed, or (2) the heap is too small for the app. I wouldn't recommend fiddling with the GC implementation. UseGCOverheadLimit should be enabled by default, and merely helps detect memory problems earlier. – Enno Shioji Aug 17 '12 at 11:08