Suppose we have a random sample of size n = 8 from a lognormal distribution with parameters mu and sigma. Since it is a small sample from a non-normal population, I will be using the t confidence interval. I ran a simulation to determine the true (simulated) confidence level of a nominal 90% t-CI when mu = 1 and sigma = 1.5.
My problem is that my code below draws from a NORMAL distribution and it needs to be a lognormal distribution. I know that rnorm has to become rlnorm so that the random variables come from the lognormal distribution. But I also need to change what mu and sigma are: the mu and sigma of the underlying normal distribution are not the same as the mean and standard deviation of the lognormal distribution itself.
The mean of the lognormal distribution is exp(mu + sigma^2/2), and its variance is exp(2*(mu + sigma^2)) - exp(2*mu + sigma^2).
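In R notation, I believe those two quantities work out like this (true_mean and true_sd are just names I made up, and the comparison against a big rlnorm sample is only there as a sanity check):

mu <- 1
sigma <- 1.5
true_mean <- exp(mu + sigma^2/2)                          # about 8.37
true_sd <- sqrt((exp(sigma^2) - 1) * exp(2*mu + sigma^2)) # about 24.4
x <- rlnorm(1e6, meanlog = mu, sdlog = sigma)             # large sample for checking
c(true_mean, mean(x))                                     # should be close to each other
c(true_sd, sd(x))                                         # should be close to each other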
I'm just confused about how I can incorporate these two equations into my code.
BTW- if you didn't already guess, I am VERY new to R. Any help would be appreciated!
MC <- 10000 # Number of samples to simulate
result <- numeric(MC) # One coverage indicator per simulated sample
mu <- 1 # Mean on the log scale (rlnorm's meanlog)
sigma <- 1.5 # SD on the log scale (rlnorm's sdlog)
n <- 8 # Sample size
alpha <- 0.1 # the nominal confidence level is 100(1-alpha) percent
t_criticalValue <- qt(p=(1-alpha/2), df=(n-1))
for(i in 1:MC){
    # rlnorm is parameterized by meanlog and sdlog, not by the mean and sd of the data
    mySample <- rlnorm(n=n, meanlog=mu, sdlog=sigma)
    lowerCL <- mean(mySample) - t_criticalValue*sd(mySample)/sqrt(n)
    upperCL <- mean(mySample) + t_criticalValue*sd(mySample)/sqrt(n)
    result[i] <- ((lowerCL < mu) & (mu < upperCL)) # Did the interval cover mu?
}
SimulatedConfidenceLevel <- mean(result) # Proportion of intervals that covered the target
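Since result holds 0/1 (TRUE/FALSE) indicators, mean(result) is the fraction of the MC intervals that covered the target, so it has to land between 0 and 1. For example:

mean(c(TRUE, FALSE, TRUE, TRUE)) # 0.75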
EDIT: So I tried replacing mu and sd with their respective formulas
(mean = exp(mu + sigma^2/2), variance = exp(2*mu + sigma^2)*(exp(sigma^2) - 1))
and I got a SimulatedConfidenceLevel of 5000.
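A coverage estimate from mean(result) can't exceed 1, so getting 5000 makes me think the formulas ended up in the wrong place. My best guess at how the pieces fit together is below: mu and sigma stay as rlnorm's meanlog/sdlog arguments, and the mean formula is only used as the target the interval has to cover (true_mean is just a name I made up). Is this right?

MC <- 10000
n <- 8
mu <- 1
sigma <- 1.5
alpha <- 0.1
t_criticalValue <- qt(p=(1-alpha/2), df=(n-1))
true_mean <- exp(mu + sigma^2/2) # The lognormal's actual mean: the value the CI should cover
result <- logical(MC)
for(i in 1:MC){
    mySample <- rlnorm(n=n, meanlog=mu, sdlog=sigma)
    lowerCL <- mean(mySample) - t_criticalValue*sd(mySample)/sqrt(n)
    upperCL <- mean(mySample) + t_criticalValue*sd(mySample)/sqrt(n)
    result[i] <- (lowerCL < true_mean) & (true_mean < upperCL)
}
SimulatedConfidenceLevel <- mean(result) # A proportion between 0 and 1, not 5000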