Lectora Test Reporting to LMS

Can someone help me understand how Lectora works, since I keep getting these results on a test? Here are the results I receive, and what I would like it to be able to do, if anyone has guidance.



Results are all from 1 test taken 3 different times:

Test 1: I made a 45%.

Reporting shows "completed" with a 45%, and it shows the questions as I answered them with a, b, c, d, etc.


Test 2: I made a 90%

Reporting shows "completed" with a 90%, and it shows the questions as I answered them, except it includes the 30 questions as answered in Test 1 first, then the second round's answers, for a total of 60 questions.


Test 3: I made a 14%

Reporting shows "completed" with a 90%, and it does not show the questions as I answered them.


Goals:

  • To show results individually.
  • For each test in the reporting section I need it to show the score that was received for THAT specific take, and only the questions answered for THAT take.
  • If I can only pick one option: showing the highest score received, plus at least the questions answered for each take individually, would be fine, because I can manually calculate the scores.



Does anyone understand how this works?

Discussion (2)

Hi, I ran this through ChatGPT and it actually has a lot of good information! Similar to how I would have responded. I'm pasting its reply here; I updated a few things for clarity.


What could be happening?


  1. The same SCORM attempt is being “resumed” rather than a truly new attempt being started each time. Lectora (or any SCORM course) typically sends question/interaction data to the LMS in a single “attempt” bucket unless the LMS is explicitly told to treat it as a brand‐new attempt. When learners reopen the same SCORM attempt, they often continue adding to the old attempt data instead of overwriting or clearing it.
  2. The LMS is ignoring the new data if the course is already marked Complete/Passed. Some LMSs stop accepting updated interaction details the moment a SCORM course reports completion (or a passing status). So if the second or third attempt is taken in the same attempt session (same SCORM registration), the LMS might not record the new scores or question details.
  3. Lectora may not be resetting interaction data for a retake. Depending on your publish settings for sending test interaction data, Lectora might be re-submitting all the old question responses plus any new ones. That is why you see “60 questions” (the 30 from the first attempt plus the 30 from the second attempt). It can happen if each question is reported again without first clearing out the old data. (See the sketch after this list for how this plays out at the SCORM API level.)
  4. SCORM 1.2 vs. SCORM 2004 limitations. Some LMSs (especially under SCORM 1.2) only store a limited amount of interaction data. If it hits the data limit or if the status is “Complete,” any new data might be truncated or ignored. (Not always the case—but it’s something to watch out for.)
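
To make point 3 concrete, here is a minimal TypeScript sketch of how a SCORM 1.2 course might write quiz answers to the LMS. The LMSGetValue/LMSSetValue/LMSCommit calls and the cmi.interactions.* element names are standard SCORM 1.2; the surrounding structure is illustrative, not Lectora's actual code.

```typescript
// Sketch only: the SCORM 1.2 element names are real, the structure is assumed.
interface Scorm12Api {
  LMSSetValue(element: string, value: string): string; // returns "true" | "false"
  LMSGetValue(element: string): string;
  LMSCommit(param: ""): string;
}

declare const API: Scorm12Api; // the runtime object the LMS exposes at launch

function reportAnswer(question: number, id: string, response: string, correct: boolean): void {
  // If the course offsets by cmi.interactions._count instead of starting at 0,
  // a retake APPENDS its 30 answers after the first attempt's 30 --
  // one way a single attempt bucket ends up showing 60 questions.
  const base = Number(API.LMSGetValue("cmi.interactions._count")); // e.g. 30 on a retake
  const n = base + question;
  API.LMSSetValue(`cmi.interactions.${n}.id`, id);
  API.LMSSetValue(`cmi.interactions.${n}.type`, "choice");
  API.LMSSetValue(`cmi.interactions.${n}.student_response`, response);
  API.LMSSetValue(`cmi.interactions.${n}.result`, correct ? "correct" : "wrong");
  API.LMSCommit("");
}
```

Note that SCORM 1.2 defines the interaction fields as write-only and allows LMSs to journal every write, so even a course that restarts at index 0 on a retake can still leave duplicate rows in a report.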


What might be the LMS’s responsibility?


  • The LMS decides how to handle repeated launches of a course that has already been completed, including:
      • Whether it allows additional attempts and multiple recorded scores.
      • Whether it overwrites the previous attempt or merely updates a single “attempt record” with new data.
      • Whether it stores question-level data for each attempt separately, or just lumps it all into one SCORM log.
  • If the LMS is programmed or configured to take “the highest score” or “the first completed score” and then not accept more data, it can cause the behaviors you’re seeing (showing 90% after the third attempt, or not showing new question responses at all). A sketch after this list illustrates that combination.
  • Some LMSs have a built-in concept of multiple attempts, but it often requires the learner to launch a new attempt (e.g., “Re-enroll” or “Start new attempt” button) so the LMS creates a fresh SCORM registration.
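
As a rough illustration of that rollup logic (hypothetical pseudo-LMS code, not any particular product's implementation), a registration configured to roll up the highest score and to stop accepting data once complete reproduces the symptoms above: 90% displayed after a 14% third attempt, and no new question detail recorded.

```typescript
// Hypothetical LMS-side model; every name here is an assumption for illustration.
type Attempt = { score: number; interactions: string[] };

interface Registration {
  status: "incomplete" | "completed";
  attempts: Attempt[];
  rollup: "highest" | "latest" | "average";
}

// The score the reporting screen displays, per the configured rollup rule.
function displayedScore(reg: Registration): number {
  const scores = reg.attempts.map(a => a.score);
  switch (reg.rollup) {
    case "highest": return Math.max(...scores);
    case "latest":  return scores[scores.length - 1];
    case "average": return scores.reduce((sum, s) => sum + s, 0) / scores.length;
  }
}

// A registration frozen after completion silently drops later attempts.
function recordAttempt(reg: Registration, attempt: Attempt): void {
  if (reg.status === "completed") return; // new score and interactions vanish
  reg.attempts.push(attempt);
  reg.status = "completed";
}
```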


How to move toward your goals (common approaches):


  1. Force a new SCORM attempt each time the user retakes the test. Depending on your LMS, there may be a “Start New Attempt” option when launching the course a second or third time. Each time the user clicks “Start New Attempt,” the LMS will record that attempt with fresh data and scores. (A sketch after this list shows how a course can check whether a fresh attempt was actually started.)
  2. Tell Lectora to ‘reset’ or ‘clear’ question data on retake. Ensure Lectora’s test/quiz settings are configured so that when a learner clicks “Retake,” it actually wipes out old responses and re‐submits fresh ones to the LMS. In Lectora’s Publish or Test/Survey settings, check the interaction reporting settings.
  3. Check how your LMS is set to handle scoring:
      • Some LMSs let you choose “highest,” “latest,” or “average” score if the learner attempts the course multiple times.
      • If your goal is to see each attempt’s data separately, confirm that your LMS can store multiple attempts under the same enrollment (some do, some don’t). If not, your only option might be a manual re-enrollment or a “new attempt” that the LMS tracks as a fresh registration.
  4. If you need all attempts in a single user record but separated out:
      • You’ll need an LMS that’s capable of capturing “attempt #1,” “attempt #2,” “attempt #3,” etc., with each attempt’s question breakdown. Not all LMSs display that natively for SCORM 1.2. SCORM 2004 or xAPI/Tin Can might handle it better, or you might need a custom solution or a specialized reporting feature in your LMS.
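
Related to item 1: SCORM 1.2 gives the course a read-only cmi.core.entry element that reports “ab-initio” for a fresh attempt and “resume” for a continued one, so a course can tell which situation it is in. A minimal sketch, assuming the standard SCORM 1.2 runtime (the logging is just illustrative):

```typescript
// cmi.core.entry is a real, read-only SCORM 1.2 element; the rest is a sketch.
declare const API: {
  LMSInitialize(param: ""): string;
  LMSGetValue(element: string): string;
};

function checkAttemptMode(): void {
  API.LMSInitialize("");
  const entry = API.LMSGetValue("cmi.core.entry");
  if (entry === "resume") {
    // The LMS reopened the same attempt bucket: old interaction data is still
    // in play, and no course-side setting will fully separate the results.
    console.log("Resumed existing SCORM attempt");
  } else {
    // "ab-initio" (or empty on some LMSs): a fresh attempt record was created.
    console.log("New SCORM attempt started");
  }
}
```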


In short:


  • Lectora does typically submit all question data each time you take the test—unless you specifically tell it not to or the LMS refuses updates after completion.
  • If you see older interactions lumped in, it’s usually a sign you’re still in the same SCORM attempt. Also look at your test interaction data publish setting; for example, you may not want timestamp info for interactions.
  • The LMS can decide to store those as separate attempts or treat them all as one big attempt.
  • If you want clean, individual attempts (each with its own score and question breakdown), you either need to trigger a new attempt or adjust the course settings so it resets all prior answers—and check that your LMS will accept those new results after a “complete” status.


Hello @JIvy. Looking at the support tickets you submitted, is this still a Lectora issue, or is it being worked on as an issue with the LMS, in particular Rockstar Learning Platform? If RLP, then Customer Solutions is assisting you. If it is still a Lectora issue, please open a separate support ticket, and we will assist you. Thank you!