RE: How to grade exams/questions manually?
By: Achim Zeileis on 2020-11-02 23:06
[forum:48401]
OK, good.

For anyone else reading this:

x <- exams2xyz(...)

returns a list of lists. For example, you can extract the solution from the metainformation of the j-th exercise in the i-th random replication as:

x[[i]][[j]]$metainfo$solution

That is a lot of nested lists when typing this interactively, but using the information programmatically should hopefully be straightforward.
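
For instance, here is a small sketch that flattens these into answer patterns, assuming schoice/mchoice exercises whose $solution is a logical vector:

## collapse each logical solution into a "10100"-style answer pattern
patterns <- lapply(x, function(exam) {
  sapply(exam, function(ex) paste(as.integer(ex$metainfo$solution), collapse = ""))
})
patterns[[1]]  ## patterns for all exercises in the 1st replication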

Those exams2xyz() interfaces that save an .rds file do so via

saveRDS(x, ...)

Thus, it is easy to re-read the object "x" from the .rds file, which is what nops_eval() does.
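
For example (assuming the file was called "myexam.rds"):

x <- readRDS("myexam.rds")
x[[1]][[2]]$metainfo$solution  ## 2nd exercise in the 1st replication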

P.S. for Sebastian: Re cross-posting. For questions that are likely to have a clear answer, it is fine to use either StackOverflow or the R-Forge forum, but cross-posting increases the risk of duplicated effort. For questions that require discussion or sharing experiences/opinions/etc., it is better to use the R-Forge forum.

RE: How to grade exams/questions manually?
By: Sebastian Sauer on 2020-11-02 19:52
[forum:48400]
Hi Francisco,

Thank you very much for your R script. I actually learned quite a bit from it. Grading is not so easy for our use case; maybe we should come up with some convenience function.

RE: How to grade exams/questions manually?
By: Sebastian Sauer on 2020-11-02 19:50
[forum:48399]
Ah, I just learned that `exams2xyz` includes the metadata.

OK, consider the question answered. Thanks again.

RE: How to grade exams/questions manually?
By: Sebastian Sauer on 2020-11-02 19:44
[forum:48398]
Thanks for explaining. Now I understand better.

Where would you recommend drawing the correct answers from? From the RDS file with the metainfo?

(Sorry if you have documented that elsewhere; I've tried to find it.)


RE: How to grade exams/questions manually?
By: Achim Zeileis on 2020-11-02 16:27
[forum:48395]
The evaluation framework that exams2nops(), exams2moodle(), etc. use is implemented in exams_eval(). For example, you can use the following:

eval <- exams_eval(partial = TRUE, negative = FALSE, rule = "false2")

indicating that you want partial credits but the overall points per item must not become negative. A correctly ticked box then yields 1/#correct points and an incorrectly ticked box subtracts 1/#false points. The only exception is when there is only one false item (which would then cancel _all_ points); in that case 1/2 is used instead.

The resulting object "eval" is a list with the input parameters (partial, negative, rule) and three functions: checkanswer(), pointvec(), and pointsum().

Imagine that you have the correct answer pattern

cor <- "10100"

The associated points for correctly and incorrectly ticked boxes would be:

eval$pointvec(cor)
## pos neg
## 0.5000000 -0.3333333

Thus, for the following answer pattern you get:

ans <- "11100"
eval$checkanswer(cor, ans)
## [1] 1 -1 1 0 0
eval$pointsum(cor, ans)
## [1] 0.6666667

The latter would still need to be multiplied by the overall points assigned to that exercise.
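
For example, if the exercise is worth 3 points overall:

3 * eval$pointsum(cor, ans)
## [1] 2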

For numeric answers you can only get 100% or 0%:

eval$pointsum(1.23, 1.25, tolerance = 0.05)
## [1] 1
eval$pointsum(1.23, 1.25, tolerance = 0.01)
## [1] 0

Similarly, string answers are either correct or false:

eval$pointsum("foo", "foo")
## [1] 1
eval$pointsum("foo", "bar")
## [1] 0

This is as much general infrastructure as we have implemented. We thought that practitioners who "roll their own" forms could work with this while still having all the flexibility they want/need.
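
As an illustration, here is a minimal sketch of such a hand-rolled grading step (the answer data and the points are made up; the correct patterns could come from the metainformation as discussed above):

library("exams")
eval <- exams_eval(partial = TRUE, negative = FALSE, rule = "false2")

## hypothetical answers: one row per student, patterns as strings
answers <- data.frame(id = c("s1", "s2"), ex1 = c("11100", "10100"))
cor <- "10100"  ## e.g., from x[[i]][[j]]$metainfo$solution
pts <- 3        ## overall points for this exercise

answers$points_ex1 <- pts * sapply(answers$ex1, function(a) eval$pointsum(cor, a))
answers$points_ex1
## [1] 2 3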

RE: How to grade exams/questions manually?
By: Francisco Goerlich on 2020-11-02 16:11
[forum:48393]
Hi Sebastian,
You are right!
I vary the questions (within the same type, i.e., schoice with the same number of answer options, num with the same number of decimals, ...) but not their order. So a single fixed answer questionnaire does the job.
Regards

RE: How to grade exams/questions manually?
By: Francisco Goerlich on 2020-11-02 16:06
[forum:48392]

Attachment: _1GenerarPrueba5.R
Hi Sebastian,
I attach the script that generates the exam from the question bank.
In the process I store two files, one with the solutions for each student and another with the metadata of the questions.
I use these in the script that marks the tests.
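
Roughly, the idea is something like this (just a sketch; the object and file names are placeholders, not my actual script):

exm <- exams2pdf(myexam, n = nstudents)  ## generate the randomized exams
saveRDS(exm, file = "metadata.rds")      ## metadata of the questions
sols <- lapply(exm, function(exam) lapply(exam, function(ex) ex$metainfo$solution))
saveRDS(sols, file = "solutions.rds")    ## solutions for each student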

The code is very old, and I am sure that it can be done much more efficiently now, but since it works I have little incentive to improve it.

I don't mind sending you all the scripts, but the comments are in Spanish.

Best regards,

RE: How to grade exams/questions manually?
By: Sebastian Sauer on 2020-11-02 16:01
[forum:48391]
Hi Francisco,

One thought: I am planning to create many variants of the exam in order to avoid cheating, so the order of the questions will vary. In some instances the first question will be open-ended, in others it will be multiple-choice with 4 options, and so on. For that purpose, one fixed answer questionnaire may not be suitable.

(The Google forms approach works nicely for other purposes.)

RE: How to grade exams/questions manually?
By: Sebastian Sauer on 2020-11-02 15:57
[forum:48390]
Hi Francisco,

How do you do the auto-grading? That's what I need for the upcoming exams. Could you kindly explain it in some more detail? (Or point me to some code?)

Google Forms is even more practical than Excel files. I will double-check whether my university has any concerns about Google Forms. However, if no personal information is submitted, no privacy concerns should apply.

Regards,
Sebastian

RE: How to grade exams/questions manually?
By: Francisco Goerlich on 2020-11-02 15:45
[forum:48388]
Hi Sebastian,
Have you thought about using Google Forms instead of Excel files?
I do!
An ongoing test: https://forms.gle/hG5xB42WfP4QqqSr6
In Spanish!
The test is different for each student and is automatically graded.
If you fill in the form, you will receive a mail saying that your address is NOT registered in the course (students have to use their university mail, which is the unique ID for me).
Best regards,


How to grade exams/questions manually?
By: Sebastian Sauer on 2020-11-02 15:16
[forum:48386]
**What I'd like to do:**

I would like to use r-exams in the following procedure:

1. Provide electronic exams in PDF format to the students
2. Let the students upload an Excel file with their answers
3. Grade the answers



**My problem:**

I have not found a procedure/function to easily grade a PDF exam where the answers came in _electronically_. So I'm not using Moodle, and I'm not scanning paper sheets. I would like the students to enter their answers into an Excel sheet and upload it. So I'll have the answers electronically, and now I'd like to grade them. Let's call this approach "manual grading".

What's a good way to grade the questions manually?



**What I have tried:**

I'm aware of the `exams2nops()` function, and I know that it writes an .rds file where the correct answers are stored. Hence, I basically have what I need. However, I feel that some helper functions might be in order.

I've hastily written some functions to achieve this, but there are probably better ways -- and maybe there's a built-in way to do so in r-exams: https://gist.github.com/sebastiansauer/c942b2dded75620a67269cbc9aa66a14
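
For concreteness, the core of what I have in mind is roughly this (file and column names are made up):

library("readxl")

x <- readRDS("myexam.rds")                  ## metainfo stored by exams2nops()
ans <- read_excel("answers_student1.xlsx")  ## uploaded answers, e.g., a column "answer"

## compare the submitted answer patterns with the stored solutions
sol <- sapply(x[[1]], function(ex) paste(as.integer(ex$metainfo$solution), collapse = ""))
ans$correct <- ans$answer == sol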



**Please help me with:**

Any suggestions on how to manually grade the answers are welcome. Thank you!




(This is a cross-post from Stack Overflow; it appears that this question does not fit there: https://stackoverflow.com/questions/64646692/how-to-grade-exams-questions-manually?noredirect=1#comment114306529_64646692)
