Firstly, the complexity of equal_range is documented at the link you yourself provide as:
Average case: constant.
Worst case: linear in container size.
Secondly, the logical operation of "subtracting the resulting iterators" has to be implemented as linear iteration with complexity O(bucket_size(bucket(key))), because the container only provides forward iterators: you must step through a list or vector of hash-colliding values, checking for matches, so...
"2) equal_range and then subtracting the resulting iterator"..."is constant time"
...is not a well-founded assertion.
As for "1) count", its complexity is likewise documented, in this case:
Average case: linear in the number of elements counted.
Worst case: linear in container size.
Which again may differ from your "linear time in number of occurrences". The reason that's the average is that normally, with max_load_factor
at the default of 1.0 and a good hash function, distinct keys will only collide as for a random scattering, something around the 10-20% mark. So most of the time, the only keys hashing to a specific bucket will be those you're counting, and the average work is a constant multiple around 1.1x or 1.2x the number of matches, hence linear.