Suppressing the Failed message in a test

There are several parts to this question, actually; I'll try to explain.

I'm working with a client that wants a "survey" where learners answer questions and are graded. The learners need to see their grade, BUT they don't need to see the "FAILED" message if their score is below the threshold. They simply need to see their score; it would be very politically incorrect to use the word "failed" for this survey. I'm assuming the grading script can't easily be changed to adjust that verbiage, so how do we let the learners see their score on the same page where we provide a "key" for interpreting it? Can we have a text box populate with the variable that contains their survey score? That field would obviously be on the page they are redirected to when the survey completes.

Here's another twist on the situation. The instructor for the course would like to capture the data from the first five questions in the LMS separately from the data for the rest of the survey. The first five questions are the most important, and they want a separate score for them in the LMS (Saba, in this case). I thought perhaps this could be accomplished with a test with multiple parts, but I'm not in a position to check whether the scores from separate parts of the same test are stored independently. Does anyone know off the top of their head? I'm assuming that Lectora just averages the scores and sends only that average to the LMS.

If that assumption is true, how do we build a two-part exam that shows the students the averaged score and then presents that average to the LMS for saving along with the individual part scores? (We basically need the average and the individual scores.)

Averaging the two parts brings up another level of complication: the first exam would have five questions and the second about 15, and these questions would need to be equally weighted.
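To make the "text box plus key" idea concrete, here is a minimal sketch of the kind of script that could run on the results page once the score is available in a JavaScript variable. The band labels, cut-off values, and element ids here are all my own placeholders, not Lectora built-ins or the client's actual key:

```javascript
// Map a raw percentage onto a neutral interpretation "key" instead of
// a pass/fail label. The bands and cut-offs below are hypothetical
// placeholders -- substitute whatever the client's key actually says.
function interpretScore(score) {
  if (score >= 80) return "Strong alignment";
  if (score >= 50) return "Moderate alignment";
  return "Emerging alignment";
}

// Write the score and its interpretation into two page elements,
// bypassing the grader's default Passed/Failed message. The element
// ids "scoreBox" and "keyBox" are assumptions about the results page.
function showScore(score, doc) {
  doc.getElementById("scoreBox").textContent = score + "%";
  doc.getElementById("keyBox").textContent = interpretScore(score);
}
```

The point is just that the learner sees a number plus a descriptive band, and the word "failed" never appears anywhere on the page.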
If not, missing questions in the 2nd exam would be less critical than missing questions in the 1st. Are we confused yet?
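The weighting concern can be made concrete with a little arithmetic. Averaging the two section percentages makes each question in the 15-question part worth a third of a question in the 5-question part, whereas pooling all answers into one raw score keeps every question equally weighted. A hedged sketch of the two schemes (the function names are mine, not anything Lectora exposes):

```javascript
// Pooled scoring: every question carries the same weight,
// regardless of which section it belongs to.
function pooledPercent(correctA, totalA, correctB, totalB) {
  return ((correctA + correctB) / (totalA + totalB)) * 100;
}

// Section-averaged scoring: each *section* carries the same weight,
// so a question in the 5-question part counts three times as much as
// a question in the 15-question part.
function sectionAveragePercent(correctA, totalA, correctB, totalB) {
  var a = (correctA / totalA) * 100;
  var b = (correctB / totalB) * 100;
  return (a + b) / 2;
}

// Example: a perfect 5-question part, one miss in the 15-question part.
// pooledPercent(5, 5, 14, 15)         => 95%
// sectionAveragePercent(5, 5, 14, 15) => ~96.7%
// Under section averaging the miss costs less -- exactly the
// "less critical" effect described above.
```

So whichever score Lectora actually reports, it helps to know which of these two schemes it is using before deciding how to combine the parts.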
