
I have around 5,000 strings, mostly 50-80 characters long. Currently I create an `std::unordered_map`, push these keys into it, and during execution I look them up (using the map's `find` function) 10-100 million times. I did some profiling around this search, and it seems to be the runtime hog. I searched for better and faster lookup options but did not find anything substantial. Does anyone have an idea how to make it faster? I am open to a custom-made container as well. I did try `std::map`, but it did not help. Please share a link if you have one.

One more point to add: I also modify the values for some keys at runtime, but not that many times. Mostly it's search.
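For reference, a minimal sketch of the access pattern described above: ~5,000 string keys loaded once, then `find()` called in a hot loop. The function and variable names here are illustrative, not from the question.

```cpp
#include <string>
#include <unordered_map>
#include <vector>

// Hypothetical hot path: repeated find() calls against a prebuilt map.
// In the real workload this loop runs 10-100 million times.
int lookup_all(const std::unordered_map<std::string, int>& table,
               const std::vector<std::string>& queries) {
    int sum = 0;
    for (const auto& q : queries) {
        auto it = table.find(q);   // this find() is the profiled hot spot
        if (it != table.end()) sum += it->second;
    }
    return sum;
}
```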

Jans
  • `std::map` seems to make things worse, since your problem is search-intensive, which `std::unordered_map` usually handles better than `std::map` (see https://stackoverflow.com/questions/2196995/is-there-any-advantage-of-using-map-over-unordered-map-in-case-of-trivial-keys). IMO the hash function used by `std::unordered_map` could be the bottleneck. – duong_dajgja Dec 08 '18 at 07:29
  • And maybe you could try this: https://github.com/greg7mdp/sparsepp. – duong_dajgja Dec 08 '18 at 07:36
  • And a similar question: https://stackoverflow.com/questions/8372579/c-1m-look-ups-in-unordered-map-with-string-key-works-much-slower-than-net-c/8372757 – duong_dajgja Dec 08 '18 at 07:39

1 Answer


Having considered a question similar to yours, C++ ~ 1M look-ups in unordered_map with string key works much slower than .NET code, I would guess you have run into an issue caused by the hash function used by `std::unordered_map`. For strings 50-80 characters long, a weak hash can produce many collisions, which significantly degrades look-up performance.
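One way to test the collision theory above: `std::unordered_map` exposes its buckets, so you can check how uneven the key distribution is. If one bucket holds many keys, lookups in it degrade toward a linear scan. This helper is an illustrative sketch, not part of the original answer.

```cpp
#include <algorithm>
#include <cstddef>
#include <string>
#include <unordered_map>

// Returns the size of the most crowded bucket. A value much larger than the
// average load factor suggests the hash is clustering keys.
std::size_t max_bucket_size(const std::unordered_map<std::string, int>& m) {
    std::size_t worst = 0;
    for (std::size_t b = 0; b < m.bucket_count(); ++b)
        worst = std::max(worst, m.bucket_size(b));
    return worst;
}
```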

I would suggest using a custom hash function with `std::unordered_map`. Or you could give A fast, memory efficient hash map for C++ a try.
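As a sketch of the first suggestion, here is how a custom hash functor (FNV-1a, chosen here only as an example) plugs into `std::unordered_map` via its third template parameter. Whether it actually beats the implementation's default `std::hash<std::string>` must be measured on the real workload.

```cpp
#include <cstdint>
#include <string>
#include <unordered_map>

// Example custom hash: FNV-1a over the string's bytes.
struct Fnv1aHash {
    std::size_t operator()(const std::string& s) const noexcept {
        std::uint64_t h = 1469598103934665603ull;   // FNV-1a offset basis
        for (unsigned char c : s) {
            h ^= c;
            h *= 1099511628211ull;                  // FNV-1a prime
        }
        return static_cast<std::size_t>(h);
    }
};

// Drop-in replacement for std::unordered_map<std::string, int>
// that uses the custom hash instead of the default one.
using FastMap = std::unordered_map<std::string, int, Fnv1aHash>;
```

The rest of the code is unchanged: `FastMap` supports the same `find`, `insert`, and `operator[]` calls as before, so swapping hashes is cheap to try.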

duong_dajgja