Do you mind spending O(2n) memory? You could use a Queue&lt;T&gt; in combination with a Dictionary&lt;TKey, TValue&gt;. The queue handles the enqueue and dequeue operations, and the dictionary ensures unique entries. A simple wrapper class can combine the two, giving you amortized O(1) enqueue and dequeue times, since Dictionary&lt;TKey, TValue&gt; is hash-based.
Example:
public class SetQueue<T>
{
    private readonly Dictionary<T, bool> duplicates = new Dictionary<T, bool>();
    private readonly Queue<T> queue = new Queue<T>();

    public bool Enqueue(T item)
    {
        // Only accept items that are not already in the queue.
        if (!duplicates.ContainsKey(item))
        {
            duplicates[item] = true;
            queue.Enqueue(item);
            return true;
        }
        return false;
    }

    public T Dequeue()
    {
        if (queue.Count > 0)
        {
            var item = queue.Dequeue();
            if (!duplicates.ContainsKey(item))
                throw new InvalidOperationException("The dictionary should have contained the item.");
            duplicates.Remove(item);
            return item;
        }
        throw new InvalidOperationException("Can't dequeue on an empty queue.");
    }
}
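A quick usage sketch showing the deduplication behavior (the `SetQueue<T>` class is repeated here so the sample compiles on its own; the `Demo` class name is just for illustration):

```csharp
using System;
using System.Collections.Generic;

public class SetQueue<T>
{
    private readonly Dictionary<T, bool> duplicates = new Dictionary<T, bool>();
    private readonly Queue<T> queue = new Queue<T>();

    public bool Enqueue(T item)
    {
        if (!duplicates.ContainsKey(item))
        {
            duplicates[item] = true;
            queue.Enqueue(item);
            return true;
        }
        return false;
    }

    public T Dequeue()
    {
        if (queue.Count > 0)
        {
            var item = queue.Dequeue();
            duplicates.Remove(item);
            return item;
        }
        throw new InvalidOperationException("Can't dequeue on an empty queue.");
    }
}

public static class Demo
{
    public static void Main()
    {
        var q = new SetQueue<string>();
        Console.WriteLine(q.Enqueue("a")); // True:  fresh item is accepted
        Console.WriteLine(q.Enqueue("b")); // True
        Console.WriteLine(q.Enqueue("a")); // False: duplicate is rejected
        Console.WriteLine(q.Dequeue());    // a:     FIFO order is preserved
        Console.WriteLine(q.Enqueue("a")); // True:  "a" left the queue, so it may re-enter
    }
}
```

Note that once an item is dequeued it can be enqueued again, because the dictionary entry is removed alongside it.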
An insert into this custom data structure first checks whether the dictionary already contains the item. The ContainsKey call is an O(1) operation on average, because Dictionary&lt;TKey, TValue&gt; is a hash table. If the item is already present, the method exits. Otherwise the item is enqueued, which is an O(1) operation, and added to the dictionary, which is also amortized O(1) (it approaches constant time as long as the dictionary's count stays below its capacity, so no resize is triggered). The total enqueue time is therefore amortized O(1).
The same analysis applies to the Dequeue method.
This solution is essentially the same idea as the built-in OrderedDictionary. However, since this solution uses generics, there is no boxing/unboxing overhead in its operations, which makes it vastly faster, especially for value types.