Maybe Stack Overflow isn't the place for this, but it's related to a data science project I'm working on.
If I randomly generate a number between 1 and 10, do that 10 times, and then total all the numbers, what will the standard deviation of that total be?
I'm pretty sure the mean of the total will be 55 (10 × 5.5), but what about the average distance from that mean?
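For reference, here's a minimal simulation sketch of what I mean (assuming independent integer draws from 1 to 10, using numpy), which estimates both the mean and the standard deviation of the total empirically:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate: draw 10 integers uniformly from 1..10, sum them, repeat many times.
n_trials = 100_000
totals = rng.integers(1, 11, size=(n_trials, 10)).sum(axis=1)

print("empirical mean of total:", totals.mean())           # should be close to 55
print("empirical std dev of total:", totals.std(ddof=1))   # should be close to 9.08
```

If the draws are independent, the variance of the total is 10 times the variance of a single draw, which for a discrete uniform on {1, ..., 10} is (10² − 1) / 12 = 8.25, giving a standard deviation of √82.5 ≈ 9.08 for the total; the simulation above should roughly match that.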