I have this problem and I know it can be solved in several ways.
Assume that the returns of a security X are normally distributed with mean m = 0 and standard deviation s = 5.
- What is the value at risk at 1 percent (i.e., the threshold below which the return falls in 1 percent of cases)?
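For a normal distribution this quantile has a closed form, so the answer can also be checked by hand: VaR_0.01 = m + s * z_0.01, where z_0.01 ≈ -2.3263 is the 1% quantile of the standard normal, giving 0 + 5 * (-2.3263) ≈ -11.63.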
I solved it this way, but I would like to know if there are other ways:
qnorm(0.01, mean = 0, sd = 5)        # 1% quantile of the return distribution: -11.63174
pnorm(-11.63174, mean = 0, sd = 5)   # check: probability of falling below that value is ~0.01
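One alternative, as a rough sketch: estimate the same quantile by Monte Carlo simulation instead of the closed-form quantile function. The sample size (1e6) and seed below are arbitrary choices, not part of the original problem.

set.seed(1)                              # for reproducibility
returns <- rnorm(1e6, mean = 0, sd = 5)  # simulate a large sample of returns of X
quantile(returns, probs = 0.01)          # empirical 1% quantile, close to -11.63

With a sample this large the empirical quantile should agree with qnorm to roughly two decimal places; the simulation route mainly becomes useful once the return distribution is no longer normal.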