
I have the following question, in which the argument of \exsolution is {0010} but should be {1000}:

\begin{question}
What is the capital of Italy?

\begin{answerlist}
  \item Rome
  \item Paris
  \item Vienna
  \item Madrid
\end{answerlist}
\end{question}
\extype{schoice}
\exsolution{0010}
\exshuffle{4}
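For reference, since Rome is the first item in the unshuffled answer list, the corrected metadata reads:

\extype{schoice}
\exsolution{1000}
\exshuffle{4}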

I have corrected the error and re-run the process of creating and grading the exams from scratch using the same seed. Unfortunately, the order of the answers in this question changes (note \exshuffle{4}), so the grades assigned to this particular question are wrong. All other questions are OK.

pendermath

1 Answer


Due to the way that \exshuffle is implemented, it is not easy to simply change the {answerlist} and/or \exsolution and still get the right resulting exam.

Instead, I would recommend going through the meta-information and fixing it there. I presume that you are generating the exams with exams2nops() and have stored the RDS file with the meta-information, right? I will produce such a file via:

## generate n = 5 exams, each containing the 3 exercises in the given order
set.seed(1)
exams2nops(c("capitals.Rnw", "italy.Rnw", "switzerland.Rnw"), n = 5, dir = ".")

Thus, there are five exams with three exercises each, with your problematic exercise italy.Rnw in second place. The meta-information is stored in metainfo.rds, which we can read again via

x <- readRDS("metainfo.rds")
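To get oriented, we can take a quick look at the structure first (the output below assumes the setup above):

length(x)
## [1] 5
length(x[[1]])
## [1] 3
names(x[[1]][[2]])
## [1] "question"     "questionlist" "solution"     "solutionlist"
## [5] "metainfo"     "supplements"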

Now x is a list of 5 elements (the exams), each of which has 3 elements (the exercises), which in turn contain the elements question, questionlist, solution, solutionlist, metainfo, and supplements. Here, we need to inspect the questionlist in order to fix the metainfo$solution. Currently, Vienna is marked as correct:

x[[1]][[2]]$questionlist
## [1] "Madrid" "Vienna" "Rome"   "Paris" 
x[[1]][[2]]$metainfo$solution
## [1] FALSE  TRUE FALSE FALSE

However, it should be Rome:

x[[1]][[2]]$questionlist == "Rome"
## [1] FALSE FALSE  TRUE FALSE

So we can loop through all exams and save the result. Just to be safe, we also keep a copy of the original RDS file:

x <- readRDS("metainfo.rds")

## back up the original meta-information before overwriting it
file.copy("metainfo.rds", "metainfo-orig.rds")

## in every exam, flag the position of Rome in the second exercise as correct
for(i in seq_along(x)) {
  x[[i]][[2]]$metainfo$solution <- x[[i]][[2]]$questionlist == "Rome"
}
saveRDS(x, "metainfo.rds")
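To double-check, we can re-read the fixed file and confirm that Rome is now flagged as correct (shown for the first exam, matching the shuffle above):

x2 <- readRDS("metainfo.rds")
x2[[1]][[2]]$metainfo$solution
## [1] FALSE FALSE  TRUE FALSE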

Final remark: there is also an element metainfo$string that is used when extracting the meta-information with exams_metainfo(). If we wanted to use that, we would need to fix the $string as well. But for nops_eval() it is sufficient to fix the $solution.
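If you did want to fix the $string as well, a sketch could look as follows. This assumes that $string follows a "name: position" pattern for schoice exercises and that metainfo$name holds the exercise name - both are assumptions, so check against your actual strings first:

## assumption: $string has the form "<exercise name>: <position of correct answer>"
for(i in seq_along(x)) {
  m <- x[[i]][[2]]$metainfo
  x[[i]][[2]]$metainfo$string <- paste0(m$name, ": ", which(m$solution))
}
saveRDS(x, "metainfo.rds")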

Achim Zeileis
  • Nice! However, to make it more interesting, the question was randomly chosen from a pool of two questions, i.e., c("italy.Rnw", "germany.Rnw"). I wonder how the content of the loop changes. – pendermath Jan 26 '21 at 11:05
  • Inside the `for()` loop you need to add `if(x[[i]][[2]]$metainfo$file == "italy")` so that the replacement of `$metainfo$solution` is only carried out when `metainfo$file` is the problematic file - see the sketch after these comments. – Achim Zeileis Jan 26 '21 at 11:22
  • You're very welcome! Things like this actually were part of the reason why I put so much work into R/exams. Sometimes when using the tools provided by the university I would hit a wall at a certain point or would have to do a lot of manual checking. In R/exams I would at least write some R code to work around many problems... – Achim Zeileis Jan 26 '21 at 16:06
  • Speaking of which, I got two exams with different registration numbers that were graded against the SAME exam ID. It's kind of frustrating; I don't want to regret having decided to use R/exams... Have you experienced something like this before? – pendermath Jan 26 '21 at 16:10
  • I don't see what the problem is here. In any case, it seems to be unrelated to the previous question. So please provide a reproducible example and post a new question - either here on SO or in our discussion forum on R-Forge. (The latter is easier for discussions that require iterative comments rather than a clear single answer.) – Achim Zeileis Jan 26 '21 at 17:38
  • It seems the software wrongly read the IDs of two exams. Will do as you suggest. – pendermath Jan 26 '21 at 19:03
  • In many cases those problems come from suboptimal scanned images. Either students drew something where they shouldn't - or the scanner settings should be improved, e.g., a bit lighter or darker. In any case the software should warn you and ask for improvement (when run interactively). – Achim Zeileis Jan 26 '21 at 19:29
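Putting the comments above together, a sketch of the adapted loop for the pooled setup, where the solution is only replaced in exams that drew the problematic italy exercise into the second slot:

for(i in seq_along(x)) {
  ## only touch exams where the second exercise came from italy.Rnw
  if(x[[i]][[2]]$metainfo$file == "italy") {
    x[[i]][[2]]$metainfo$solution <- x[[i]][[2]]$questionlist == "Rome"
  }
}
saveRDS(x, "metainfo.rds")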