I'm studying for some technical interviews coming up and was just going over lecture slides from a year or two ago about data structures.
I'm not clear on why the worst-case runtime of merge for a leftist heap is O(log n), whereas for a skew heap it is O(n), when a skew heap merges in essentially the same way as a leftist heap.
A leftist heap merges A and B by picking the tree with the smaller root and recursively merging its right subtree with the other tree. Then it compares the null path lengths of the two subtrees and swaps them if they violate the leftist structure property.
A skew heap does the same thing, but it blindly swaps its two subtrees at every level of the recursion instead of checking null path lengths.
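To make the comparison concrete, here is a minimal sketch of both merges. The `Node` class and helper names are my own for illustration; a real leftist heap would store the null path length in each node rather than recompute it as `npl` does here:

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def npl(node):
    # Null path length: distance to the nearest missing child.
    # (Recomputed here for brevity; production code caches it in the node.)
    if node is None:
        return -1
    return 1 + min(npl(node.left), npl(node.right))

def leftist_merge(a, b):
    if a is None:
        return b
    if b is None:
        return a
    if b.key < a.key:
        a, b = b, a                          # a keeps the smaller root
    a.right = leftist_merge(a.right, b)      # merge down the right spine
    if npl(a.left) < npl(a.right):           # swap only when the leftist
        a.left, a.right = a.right, a.left    # property is violated
    return a

def skew_merge(a, b):
    if a is None:
        return b
    if b is None:
        return a
    if b.key < a.key:
        a, b = b, a
    a.right = skew_merge(a.right, b)
    a.left, a.right = a.right, a.left        # swap unconditionally
    return a
```

The only difference is the swap condition: the leftist merge swaps to keep the right spine short (length O(log n)), while the skew merge always swaps and relies on amortized analysis instead.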
Why would the worst case of merge for a skew heap be O(n)? Is it because we can't guarantee a height bound on the right spine as it recursively merges (since it swaps sides every time)? Does this have anything to do with Floyd's algorithm, where the sum of the heights of all nodes in a tree grows as O(n)?