
Maybe Stack Overflow isn't the place for this, but it's related to a data science project I'm working on.

If I randomly generate a number between 1 and 10 ten times and then total all the numbers, what will the standard deviation of that total be?

I'm pretty sure the mean of the total will be 55, but what about the average distance from that mean?

rikkitikkitumbo
    I'm voting to close this question as off-topic because it's not about programming (try http://stats.stackexchange.com). – Oliver Charlesworth Aug 10 '16 at 22:39
    I'm voting to close this question as off-topic because it is about statistics / [math.se] instead of programming or software development. – Pang Aug 11 '16 at 01:35

1 Answer


Well, from here one could get that the mean would indeed be 10*(10+1)/2 = 55 and the variance would be 10*(10-1)²/12 = 67.5: for a single draw from a continuous uniform distribution on [a, b] the mean is (a+b)/2 and the variance is (b-a)²/12, and both scale with the number of independent draws, 10 in this case.
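For reference, here is a tiny sketch (assuming, as above, a continuous uniform distribution on [1, 10]) that simply evaluates those formulas for the total of 10 draws:

import math

a, b, draws = 1.0, 10.0, 10
mean_total = draws * (a + b) / 2        # 10 * 5.5  = 55.0
var_total  = draws * (b - a) ** 2 / 12  # 10 * 6.75 = 67.5
print(mean_total, var_total, math.sqrt(var_total))  # 55.0 67.5 ~8.22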

Quick test in Python

import random

def sample(a, b):
    # Total of 10 independent draws from a continuous uniform on [a, b]
    s = 0.0
    for _ in range(10):
        s += random.uniform(a, b)
    return s

random.seed(12345)

a = 1.0
b = 10.0

n = 100000

# Accumulate the first and second moments of the sampled totals
q  = 0.0
q2 = 0.0
for _ in range(n):
    v = sample(a, b)
    q  += v
    q2 += v*v

q  /= float(n)
q2 /= float(n)

# Sample mean and sample variance, E[X^2] - E[X]^2
print(q, q2 - q*q)

Prints 55.005828775627684 67.69074422910626

The standard deviation would then be sqrt(67.5), which is about 8.22.
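As a further check (just a sketch; it samples independently of the loop above, so the exact digits will differ), the same estimate can be written more compactly with the statistics module:

import random
import statistics

random.seed(12345)
totals = [sum(random.uniform(1.0, 10.0) for _ in range(10)) for _ in range(100000)]
print(statistics.mean(totals))   # close to 55
print(statistics.stdev(totals))  # close to sqrt(67.5), i.e. about 8.22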

Severin Pappadeux