I have an exam pulling from multiple question banks tied to specific topics. Each question has a unique interactionID assigned based on the bank it comes from. For example:
and so on
In my exam, I'm allowing multiple attempts. The issue I'm experiencing is that the interaction ID stays the same across all attempts, so I can't distinguish one attempt from the next. I've populated the "Interaction ID Prefix" field in settings, but when I look at my report showing the quiz data, I only see the interaction ID assigned from the question.
Has anyone experienced this issue and found a solution?
That is the design: if you remain in the same session, the same random questions will appear across multiple attempts. If you want another subset, the learner has to close and restart the quiz.
This is normal behaviour. A question's Interaction ID has to remain the same from one session to the next and from one user to the next; otherwise SCORM reporting would be all over the place.
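To illustrate why a constant ID doesn't prevent per-attempt data from existing at all: SCORM journals each answer under a new interaction *index*, while `cmi.interactions.n.id` carries the question's fixed ID. Here is a minimal sketch with a mocked LMS API object (the `lms` object and `recordAnswer` helper are made up for illustration; they are not the real Captivate runtime), showing two attempts at the same question landing at different indices with the same ID:

```javascript
// Mock of the LMS-side data store; a real SCORM 2004 API exposes
// SetValue("cmi.interactions.n.*", ...) calls that look like this.
const lms = {
  data: {},
  SetValue(key, value) { this.data[key] = value; return "true"; }
};

let nextIndex = 0;

// Hypothetical helper: journal one answer under the next free index.
function recordAnswer(interactionId, response, correct) {
  const n = nextIndex++;
  lms.SetValue(`cmi.interactions.${n}.id`, interactionId);
  lms.SetValue(`cmi.interactions.${n}.learner_response`, response);
  lms.SetValue(`cmi.interactions.${n}.result`, correct ? "correct" : "incorrect");
  lms.SetValue(`cmi.interactions.${n}.timestamp`, new Date().toISOString());
}

// Attempt 1 and attempt 2 answer the same question ("Bank1_Q3" is
// an invented example ID):
recordAnswer("Bank1_Q3", "b", false); // index 0, first attempt
recordAnswer("Bank1_Q3", "c", true);  // index 1, same ID, new journal entry

console.log(lms.data["cmi.interactions.0.id"]); // "Bank1_Q3"
console.log(lms.data["cmi.interactions.1.id"]); // "Bank1_Q3"
```

So the ID repeats by design; whether the two entries are reported separately (distinguished by index and timestamp) or collapsed into one row is down to how the LMS stores and surfaces the journal.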
If you want more advanced options for tracking scoring, you might need to look into other more 'cutting edge' LMS integration standards such as xAPI. But your LMS may or may not support that.
Just to confirm: let's say I have an exam set up with a passing score of 80 and unlimited attempts. If the learner fails the 1st attempt and immediately retries the exam from the results slide, you are saying there is no way to have the interaction ID reflect a 2nd or 3rd attempt unless we use xAPI?
Our goal is to somehow differentiate the interaction data between attempts while in the same session.
To my knowledge the interaction ID for a given quiz question never changes under the SCORM standard. If it DID change for each subsequent attempt by the user, then that would likely throw the LMS for a loop because the different attempts would not be automatically related in the LMS database. The interaction ID and the user ID need to remain constant to get meaningful reporting.
If you want to have some way to track each and every attempt on the question via your LMS reporting, then you first need to find out whether or not your LMS supports that functionality at all. There's nothing in the SCORM standard (that I am aware of) which would preclude the LMS from storing the assessment data separately for each and every attempt on a quiz. The limitation is more likely that the LMS builders never designed their databases to be that flexible or robust. You're asking for something that most clients would never think of wanting, because they usually only focus on whether or not the user passed the assessment.
I don't use xAPI for any of my client courses, so I don't really know whether changing the interaction ID is possible with xAPI. But since Captivate CAN publish to xAPI yet doesn't seem to provide any dynamic way to alter interaction IDs for either the SCORM or xAPI standard, I would assume that xAPI doesn't provide that functionality 'out of the box' either. That said, xAPI is designed to be a more flexible standard than SCORM, so the possibility should exist for you to build a custom solution...if you want to go to that level. You'd probably need the services of some talented programmers though.
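To make the "custom solution" idea concrete: xAPI statements are free-form JSON, so a custom wrapper could keep the interaction ID constant and carry the attempt number elsewhere, e.g. in `context.extensions`. This is a hypothetical sketch, NOT what Captivate emits out of the box, and the extension IRI below is made up (you would define your own):

```javascript
// Build an xAPI "answered" statement for one question attempt.
// buildStatement is an invented helper; the verb IRI is the standard
// ADL "answered" verb, everything under example.com is assumed.
function buildStatement(learnerEmail, interactionId, response, attempt) {
  return {
    actor: { mbox: `mailto:${learnerEmail}` },
    verb: {
      id: "http://adlnet.gov/expapi/verbs/answered",
      display: { "en-US": "answered" }
    },
    object: {
      id: `http://example.com/quiz/interactions/${interactionId}`,
      objectType: "Activity"
    },
    result: { response: response },
    context: {
      // The attempt counter rides along as a context extension,
      // so the activity ID itself never has to change.
      extensions: {
        "http://example.com/xapi/extensions/attempt": attempt
      }
    }
  };
}

// Second attempt at the same question: same activity ID, attempt = 2.
const stmt = buildStatement("learner@example.com", "Bank1_Q3", "c", 2);
console.log(stmt.object.id); // same ID every attempt
console.log(stmt.context.extensions["http://example.com/xapi/extensions/attempt"]); // 2
```

An LRS report could then group statements by activity ID and split them by the attempt extension, which is exactly the per-attempt differentiation asked about above, at the cost of custom authoring and reporting work.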