I am working on the following code:
import numpy as np

np.random.seed(6233)
u1 = np.random.normal(loc=0.0, scale=25.0, size=20)   # wider component (sd = 25)
u2 = np.random.normal(loc=0.0, scale=5.0, size=20)    # narrower component (sd = 5)
idu1 = np.random.choice(len(u1), 20)  # random indices into u1
idu2 = np.random.choice(len(u2), 20)  # random indices into u2
idu = [idu1, idu2]
I know it is not correct, but my goal is to create a variable u that behaves like this:
u = (np.random.normal(loc=0.0, scale=25.0, size=20))*0.3 + (np.random.normal(loc=0.0, scale=5.0, size=20)*0.7)
I tried to create two other variables, u1 and u2, and then convert each of them into a one-dimensional index array, because I thought I might need to use choice(a, size, replace=True, p), as follows:
probabilities = [0.3, 0.7]
u = np.random.choice(idu, size=20, replace=True, p=probabilities)  # raises an error: a must be 1-dimensional
My problem is that I do not know how to mix two normal distributions by assigning them probabilities 0.3 and 0.7 and create a single variable of size=20 that combines the two; the call above only gives me an error. Any help will be appreciated!
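To make the goal concrete, here is a minimal sketch of the kind of mixture I have in mind, assuming each of the 20 draws should come from the scale-25 component with probability 0.3 and from the scale-5 component with probability 0.7 (the variable names `component` and the use of `np.where` are just my own illustration):

```python
import numpy as np

rng = np.random.default_rng(6233)

# One candidate sample per slot from each component distribution.
u1 = rng.normal(loc=0.0, scale=25.0, size=20)   # wide component (sd = 25)
u2 = rng.normal(loc=0.0, scale=5.0, size=20)    # narrow component (sd = 5)

# For each of the 20 slots, pick which component supplies the value:
# 0 -> u1 with probability 0.3, 1 -> u2 with probability 0.7.
component = rng.choice(2, size=20, p=[0.3, 0.7])

# Select element-wise from u1 or u2 according to the chosen component.
u = np.where(component == 0, u1, u2)

print(u.shape)
```

This is different from the weighted sum `0.3*u1 + 0.7*u2` above: the sum blends every pair of values, while this sketch picks each value from exactly one of the two distributions.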