HashMap: initialCapacity=1000; loadFactor=0.75;
As I understand it, this means the HashMap will resize around the 1000 * 0.75 = 750th entry, roughly doubling its capacity to 2000. Would rehashing take place at that point? If yes, how is performance affected? If not, when does it happen, at MAX_CAPACITY?
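To make the scenario concrete, here is a minimal sketch of how I picture the threshold, assuming OpenJDK's behavior of rounding the requested capacity up to the next power of two (so initialCapacity=1000 gives an internal table of 1024 and a threshold of 768); the rounding expression below is just my illustration, not the library's own code:

```java
import java.util.HashMap;
import java.util.Map;

public class ResizeThresholdSketch {
    public static void main(String[] args) {
        int initialCapacity = 1000;
        float loadFactor = 0.75f;

        // Assumption: OpenJDK's HashMap rounds the requested capacity up to a
        // power of two, so the real table size is 1024 rather than 1000.
        // (For initialCapacity = 1000 this simplified rounding gives 1024.)
        int tableSize = Integer.highestOneBit(initialCapacity - 1) << 1; // 1024
        int threshold = (int) (tableSize * loadFactor);                  // 768

        System.out.println("table size = " + tableSize + ", resize at entry " + threshold);

        Map<Integer, String> map = new HashMap<>(initialCapacity, loadFactor);
        for (int i = 0; i < 2000; i++) {
            // Around the 769th entry the table should double and all keys be rehashed.
            map.put(i, "value-" + i);
        }
    }
}
```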
TreeMap: No rehashing, but sorting. The documentation says that insertion/lookup/search is always O(log N). However, doesn't every new entry or deletion effectively re-sort/resize the entire TreeMap?
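To show what I mean by the TreeMap operations in question, a small illustrative sketch (not a benchmark); my understanding is that the red-black tree only re-balances along the insertion/deletion path, so each put/remove/get is O(log n) rather than a rewrite of the whole structure:

```java
import java.util.TreeMap;

public class TreeMapSketch {
    public static void main(String[] args) {
        TreeMap<Integer, String> tree = new TreeMap<>();

        // Each put finds its position in O(log n) and then does a bounded number
        // of rotations/recolorings to restore balance; there is no backing array
        // to resize and no global rehash.
        for (int i = 0; i < 1_000_000; i++) {
            tree.put(i, "value-" + i);
        }

        // Lookups and deletions are likewise O(log n) per operation.
        String v = tree.get(500_000);
        tree.remove(250_000);

        // Ordered access comes for free because the keys are kept sorted.
        System.out.println(tree.firstKey() + " .. " + tree.lastKey() + ", sample=" + v);
    }
}
```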
How do the two compare in terms of Big-O notation for the above scenarios and for overall performance?
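For the "overall performance" part, this is the kind of naive, single-run timing I have in mind when comparing the two (a hypothetical harness of my own, not a proper JMH benchmark):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.TreeMap;

public class MapTimingSketch {

    // Naive wall-clock timing; a real comparison should use JMH to avoid
    // JIT warm-up and dead-code-elimination artifacts.
    static long timePutsAndGets(Map<Integer, Integer> map, int n) {
        long start = System.nanoTime();
        for (int i = 0; i < n; i++) {
            map.put(i, i);
        }
        long sum = 0;
        for (int i = 0; i < n; i++) {
            sum += map.get(i);
        }
        long elapsed = System.nanoTime() - start;
        if (sum == 42) System.out.println(); // keep the loops from being optimized away
        return elapsed;
    }

    public static void main(String[] args) {
        int n = 1_000_000;
        // Expected asymptotics: HashMap put/get are amortized O(1), although an
        // individual put can be O(n) when a resize/rehash happens; TreeMap
        // put/get are O(log n) every time.
        System.out.println("HashMap: " + timePutsAndGets(new HashMap<>(), n) / 1_000_000 + " ms");
        System.out.println("TreeMap: " + timePutsAndGets(new TreeMap<>(), n) / 1_000_000 + " ms");
    }
}
```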
HashMap and ConcurrentHashMap are heavily used implementations, but TreeMap sees far less use in comparison. My expectation is that a TreeMap which mostly adds entries, seldom deletes them, and is searched heavily should be preferable to HashMap/Hashtable implementations.
Any comment is appreciated.
EDIT: In terms of amortized cost, which worst cases should be taken into account as a matter of best practice, e.g. rehashing of a hash-based Map and/or re-balancing of a tree-based Map or Set? There are certain trade-offs, but assume the data structure is constantly pressed for modification due to highly unpredictable throughput.
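For context on what I mean by planning around the worst case: if the expected size is known up front, pre-sizing the HashMap so that the entry count never crosses capacity * loadFactor should avoid rehashing entirely. A sketch of that idea, assuming the usual 0.75 default:

```java
import java.util.HashMap;
import java.util.Map;

public class PreSizedMapSketch {
    public static void main(String[] args) {
        int expectedEntries = 1000;
        float loadFactor = 0.75f;

        // Request enough capacity that expectedEntries stays below
        // capacity * loadFactor, so no rehash should happen during the fill.
        // 1000 / 0.75 = 1334, which the map rounds up internally to 2048.
        int capacity = (int) Math.ceil(expectedEntries / loadFactor);

        Map<Integer, String> map = new HashMap<>(capacity, loadFactor);
        for (int i = 0; i < expectedEntries; i++) {
            map.put(i, "value-" + i); // no resize expected during this loop
        }
        System.out.println("filled " + map.size() + " entries without rehashing (expected)");
    }
}
```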