In a blog post on https://discourse.ubuntu.com, the performance weakness of Gnome-Shell is explained as being caused not by obviously detectable "hot spots", but by "cold spots":
"The thing is in the case of Gnome-Shell its biggest performance problems of late were not hot spots at all. They were better characterized as cold spots where it was idle instead of updating the screen smoothly. Such cold spots are only apparent when you look at the real time usage of a program, and not in the CPU or GPU time consumed."
Since I have never heard of such "cold spots", I now wonder whether there is a more detailed definition of "cold spots" in program performance analysis.
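If I understand the quote correctly, a cold spot would show up as wall-clock (real) time that far exceeds consumed CPU time, so a conventional CPU profiler would report no hot spot at all. Here is a minimal Python sketch of my understanding, where `time.sleep` stands in for whatever makes the program sit idle (a blocked lock, a missed timer, a round-trip to another process):

```python
import time

def render_frame():
    # Simulate a small amount of actual CPU work per frame.
    sum(range(10_000))

def frame_with_cold_spot():
    render_frame()
    # The "cold spot": the program idles instead of proceeding
    # to the next frame. This consumes wall-clock time but
    # almost no CPU time.
    time.sleep(0.05)

wall_start = time.perf_counter()
cpu_start = time.process_time()
for _ in range(10):
    frame_with_cold_spot()
wall = time.perf_counter() - wall_start
cpu = time.process_time() - cpu_start

print(f"wall-clock time: {wall:.3f}s")  # dominated by the idle waits
print(f"CPU time:        {cpu:.3f}s")   # tiny: a CPU profiler sees nothing hot
```

Is this the right mental model, i.e. that finding cold spots requires profiling real elapsed time rather than CPU (or GPU) time?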