26

I have a dictionary whose keys are ints, and I would like to get the largest key. I don't keep track of the keys, so they might be consecutive (e.g. 1, 2, 3, 4, 5, 6) or might have gaps (1, 3, 4, 5), although I doubt that makes any difference.

Do I just use a binary search, or is there a built-in method? As far as I can see you can hardly beat binary search for such a simple task - maybe you can halve it.

Vadim Kotov
s5s

3 Answers

50

If you have LINQ available, you should be able to do:

myDictionary.Keys.Max();
Ry-
  • 2
    To anyone trying to use this, you'll need to ensure you include the linq extension methods `Imports System.Linq` & compile in .net 3.5+ - Works a treat then :) – HeavenCore Jan 29 '16 at 15:05
  • 1
    is this of order `O(n)` or of `O(1)`? – Pixel_95 Nov 04 '19 at 13:31
  • 1
    @Pixel_95: O(n). For O(1) you can keep track of the max as the dictionary changes, like in Simon Mourier’s answer. – Ry- Nov 04 '19 at 20:55
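To make the approach concrete, here is a minimal, self-contained sketch (the dictionary contents are made up for illustration; note that `Max()` scans every key, which is the O(n) behavior discussed in the comments above):

```csharp
using System;
using System.Collections.Generic;
using System.Linq; // required for the Max() extension method

class Program
{
    static void Main()
    {
        // Keys need not be consecutive; gaps make no difference to Max().
        var myDictionary = new Dictionary<int, string>
        {
            { 1, "a" }, { 3, "b" }, { 4, "c" }, { 5, "d" }
        };

        int largestKey = myDictionary.Keys.Max();
        Console.WriteLine(largestKey); // prints 5
    }
}
```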
10

A binary search would be the fastest, but it wouldn't work against a normal dictionary, since the keys aren't stored in any particular order. @Minitech's answer, using LINQ's Max(), is the easiest if you're using a normal dictionary.

If this is an operation you will have to perform many times, you might consider moving to a SortedDictionary<TKey, TValue>, which sorts its entries by key.

var dict = new SortedDictionary<int, int> {{3, 0}, {12, 0}, {32, 0}, 
                                           {2, 0}, {16, 0}, {20, 0}};
Console.WriteLine(dict.Keys.Last()); //prints 32

EDIT: This can be slower than a normal dictionary; I suppose it would have been good to mention that. It's because the entries are stored differently (in a red/black tree rather than hash buckets in a hash table).

There is a point at which a SortedDictionary becomes faster than a normal Dictionary for this query, but it's probably somewhere around 1 million items - that's just a guess. It turns out to be about 10 times faster at that size (though we're talking about hundredths of a second either way, so does it really matter?). The two are about equal on an x64 release build at 100,000 items. Considering the extra overhead of adding items to a SortedDictionary, it's probably not worth it. Also, I "cheated" a little by overriding the comparer so it sorts in reverse order, so I'm actually calling dict.Keys.First() instead of Last() to get the largest item.

A SortedDictionary is really meant for when you need to iterate over all of the key/value pairs in order. I think @SimonMourier's answer is probably the best: it's the fastest, with minimal overhead.
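For illustration, here is one way the "reversed comparer" trick described above could look (a sketch, not the exact benchmark code; `Comparer<int>.Create` is available in .NET 4.5+):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Program
{
    static void Main()
    {
        // Compare b to a (instead of a to b) so keys sort descending,
        // putting the largest key first.
        var dict = new SortedDictionary<int, int>(
            Comparer<int>.Create((a, b) => b.CompareTo(a)))
        {
            { 3, 0 }, { 12, 0 }, { 32, 0 }, { 2, 0 }, { 16, 0 }, { 20, 0 }
        };

        Console.WriteLine(dict.Keys.First()); // prints 32
    }
}
```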

Christopher Currens
2

If performance is really an issue, I would create a new class on top of an existing one, implementing the standard interfaces, like this:

    public class RememberMaxDictionary<K, V> : IDictionary<K, V> where K : IComparable<K>
    {
        private Dictionary<K, V> _inner;

        public RememberMaxDictionary()
        {
            _inner = new Dictionary<K, V>();
        }

        public K MaxKey { get; private set; }

        public void Add(K key, V value)
        {
            _inner.Add(key, value);

            if (key.CompareTo(MaxKey) > 0) // test for null if needed
            {
                MaxKey = key;
            }
        }

        // ... TODO implement the rest of IDictionary<K, V> ...
    }
Simon Mourier
  • 3
    This is misleading. You left out the part where MaxKey is decreased (assuming removing an entry is desired, which in most cases it is). 'Remove' method is non-trivial to implement in an efficient manner. – Dixtosa Dec 10 '20 at 13:17
  • @Dixtosa - depends on your context and needs. Write your own answer. – Simon Mourier Dec 10 '20 at 13:27
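Regarding the comment thread above: one plausible way to handle Remove is to rescan the remaining keys only when the current maximum is removed, which keeps Add at O(1) while making that one removal path O(n). A sketch along the lines of the answer's class (simplified: it does not implement the full IDictionary<K, V> interface):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Simplified sketch of the RememberMaxDictionary idea, with removal support.
public class RememberMaxDictionary<K, V> where K : IComparable<K>
{
    private readonly Dictionary<K, V> _inner = new Dictionary<K, V>();

    public K MaxKey { get; private set; }

    public void Add(K key, V value)
    {
        _inner.Add(key, value);
        if (_inner.Count == 1 || key.CompareTo(MaxKey) > 0)
        {
            MaxKey = key;
        }
    }

    public bool Remove(K key)
    {
        bool removed = _inner.Remove(key);
        if (removed && _inner.Count > 0 && key.CompareTo(MaxKey) == 0)
        {
            // The max was removed: rescan remaining keys, O(n) on this path only.
            MaxKey = _inner.Keys.Max();
        }
        return removed;
    }
}

class Program
{
    static void Main()
    {
        var d = new RememberMaxDictionary<int, string>();
        d.Add(1, "a"); d.Add(5, "b"); d.Add(3, "c");
        d.Remove(5);
        Console.WriteLine(d.MaxKey); // prints 3
    }
}
```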