Checking question validity

I want to check the validity of my test questions. Normally that isn't a problem: I can generate a report with the responses to each question and run the statistical analysis that gives me the information I want. The problem comes when the course owner requires the learner to score 100% on the test. In that case the data show that every learner got every question correct. If it took a learner 3 or 4 attempts to pass the test, how do I capture the responses to the questions they missed on the first 2 or 3 attempts?

I appreciate any response.

Discussion (4)

Every time the Process Test Results action is triggered, it should send the interaction data to the LMS, so any failed questions should get recorded.

Or are they repeating each question until they get it right, so that by the time the data is sent all the questions are correct?

I am guessing the latter. If moving to the first approach isn't an option, then I would suggest adding a variable to track question attempts. I do this for a number of modules that require 100%. Each time a question is processed (i.e. when Submit is clicked), add an action that appends the question number to a variable (e.g. 'AttemptsTracker'). I'd append a separator symbol too, so the variable can be split back apart later (and so Lectora doesn't perform a calculation instead of concatenating).

Within the Suspend_Data variable you would end up with a string looking something like 1-1-1-2-3-4-5-5-6-7-8. This tells us the learner needed 3 attempts to pass Q1, 1 attempt to pass Q2, and so on. You can use something like Excel to split the string using the symbol "-" as the separator, or use a formula to count each time "1-" or "2-" appears. If you need to know exactly what they answered for each question, you'd need more complex actions (to append e.g. 1a or 1b instead). Keep the tokens short, though, as there is a limit to how much data suspend_data can hold.
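If you'd rather not do the splitting in Excel, a short script can do the same counting. Here's a minimal sketch in Python, assuming the tracker string has already been pulled out of the suspend_data field and uses "-" as the separator (the function name is just illustrative):

```python
from collections import Counter

def count_attempts(tracker: str) -> Counter:
    """Count how many times each question number appears in an
    AttemptsTracker string such as '1-1-1-2-3-4-5-5-6-7-8'."""
    tokens = [t for t in tracker.split("-") if t]  # drop empty tokens
    return Counter(tokens)

attempts = count_attempts("1-1-1-2-3-4-5-5-6-7-8")
for question, n in sorted(attempts.items(), key=lambda kv: int(kv[0])):
    print(f"Q{question}: {n} attempt(s)")  # e.g. Q1: 3 attempt(s)
```

Splitting on the separator before counting also avoids a subtle miscount once you have two-digit question numbers: a raw text search for "1-" would also match "11-".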

You would obviously need to be able to access the suspend_data field from your LMS reports to get this information.

It's a bit of a clunky solution but it does work for us when we need this information.

Thank you, mallow76.

A couple of follow-up comments/questions: if I publish the course in SCORM 4, do I still need to access the suspend data? And yes, I do need to know which option was chosen for each question. There must be a way to use variables to do that as well.

You could potentially add the variable (or a variable for each question) to a form and submit it to Google Sheets as a workaround if you do not have access to suspend_data.
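For what it's worth, that form submission boils down to an HTTP POST from the browser. A minimal sketch in Python of the equivalent request, assuming a Google Apps Script web app set up to append the posted fields as a row in a sheet (the URL and field names are hypothetical):

```python
import requests

# Hypothetical Apps Script web-app URL; a Lectora form submission
# amounts to a browser-side POST much like this one.
ENDPOINT = "https://script.google.com/macros/s/EXAMPLE_DEPLOYMENT_ID/exec"

payload = {
    "learner_id": "12345",                       # illustrative field
    "AttemptsTracker": "1-1-1-2-3-4-5-5-6-7-8",  # the tracker variable
}

response = requests.post(ENDPOINT, data=payload, timeout=10)
response.raise_for_status()  # fail loudly if the submission didn't land
```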

Yes, you should be able to add the question variable too, so that you would see what answer has been given. The problem with that is that the full answer text is appended. I think SCORM 2004 (is that what you are using?) has a higher allowance for the suspend_data text, but it would still fill quite quickly if you have a lot of questions. You can replace the answer text in the questions with a more coded approach (i.e. Q1-A, Q1-B etc.) and hide those text boxes on the page, then add separate text boxes with the actual answer text. This will use less space and provide a more readable/analysable output. (If you randomise the order of your answers, though, this will not work.)
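To show how that coded output could be analysed afterwards, here's a minimal sketch in Python. It assumes the tracker holds tokens like Q1-A and that you keep your own lookup of codes to answer text; the separator here is "|" rather than "-", since "-" already appears inside each coded token (the names, token format, and answer map are assumptions, not anything Lectora produces by default):

```python
from collections import Counter, defaultdict

# Illustrative lookup of answer codes to their real text; in practice
# you would maintain this mapping alongside the course.
ANSWER_TEXT = {
    "Q1-A": "Paris", "Q1-B": "London", "Q1-C": "Rome",
    "Q2-A": "True",  "Q2-B": "False",
}

def tally_responses(tracker: str, sep: str = "|") -> dict:
    """Group coded responses like 'Q1-A|Q1-C|Q2-B' by question and
    count how often each option was chosen."""
    tally = defaultdict(Counter)
    for token in filter(None, tracker.split(sep)):
        question, option = token.split("-", 1)
        tally[question][option] += 1
    return tally

for question, options in tally_responses("Q1-A|Q1-C|Q1-A|Q2-B").items():
    for option, n in options.most_common():
        code = f"{question}-{option}"
        print(f"{code} ({ANSWER_TEXT.get(code, '?')}): chosen {n} time(s)")
```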

This approach would still require access to the suspend_data field from the LMS. I generally use SCORM 1.2, so I don't know offhand whether SCORM 2004 offers any better solution.
