I have a client who wants to track knowledge check questions. They would like to track the number of times:
Is there a way to track this? They use an LMS.
Are you talking about Captivate's Knowledge Check questions that are NOT normal Quiz Question slides? These never report to the LMS.
Or are you referring to normal Quiz slides and you are just calling them "knowledge check slides" because that's what you see them as?
To add to Rod's questions: are you talking about analysis of one learner, or of ALL learners?
I apologize... I am referring to quiz question slides. The client wants all learners tracked on their LMS so they can see which questions students had the most difficulty with, while still giving each student 3 tries (1 for True/False). They would like to know how many times students answered each question correctly on the first attempt, to determine which questions students are having the most difficulty with, or the most ease with.
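For what it's worth, the analysis being asked for here boils down to a first-attempt pass rate per question. A minimal sketch, assuming the LMS can export per-interaction records (e.g. from `cmi.interactions` data) with learner, question, and result fields, and that records are in attempt order — the field names below are made up for illustration, real LMS exports vary:

```javascript
// Given interaction records (hypothetical field names), compute how often
// each question was answered correctly on the learner's FIRST attempt.
function firstAttemptStats(records) {
  const stats = {};        // questionId -> { firstAttempts, firstCorrect }
  const seen = new Set();  // "learner|question" pairs already counted
  for (const r of records) {
    const key = r.learnerId + "|" + r.questionId;
    if (seen.has(key)) continue; // only the first attempt counts
    seen.add(key);
    const s = stats[r.questionId] ||
      (stats[r.questionId] = { firstAttempts: 0, firstCorrect: 0 });
    s.firstAttempts += 1;
    if (r.correct) s.firstCorrect += 1;
  }
  return stats;
}

// Example: two learners; one gets Q1 right first time, the other needs two tries.
const log = [
  { learnerId: "a", questionId: "Q1", correct: true },
  { learnerId: "b", questionId: "Q1", correct: false },
  { learnerId: "b", questionId: "Q1", correct: true }, // second try, ignored
];
const stats = firstAttemptStats(log);
```

Whether you can get data at this granularity out of the LMS depends entirely on the LMS, which is the crux of this thread.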
All of this is done in the LMS, and the tests are created directly in the LMS, since passing the test must be recorded in the database. Tests created in other programs do not allow this, because those programs lack a mechanism for identifying users when they take tests.
@michail. I beg to differ with your statements.
Tests DO NOT need to be created in the LMS in order to have a pass or fail result recorded in the LMS database. (In fact if they are then the test will NOT be part of the course SCORM package.) Any SCORM-compliant LMS can record whether or not a test in the SCO was passed or failed.
The identity of the user is recorded by the LMS when they log into it. But this identity can also be picked up and displayed within the SCORM content if necessary, because the LMS exposes variables that Captivate can use.
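To illustrate the point about identity: the SCORM 1.2 runtime API (the same mechanism Captivate uses under the hood) lets content read the learner's name and ID from the LMS via `cmi.core.student_name` and `cmi.core.student_id`. A minimal sketch, with a tiny mock object standing in for the real LMS-provided API:

```javascript
// Mock of the SCORM 1.2 API object an LMS exposes to content in the browser.
// In a real course, content locates this object on a parent frame/window.
const API = {
  data: {
    "cmi.core.student_name": "Doe, Jane",
    "cmi.core.student_id": "jdoe",
  },
  LMSInitialize: function () { return "true"; },
  LMSGetValue: function (key) { return this.data[key] || ""; },
  LMSFinish: function () { return "true"; },
};

// Content-side calls: start the session, read the learner's identity, end.
API.LMSInitialize("");
const learnerName = API.LMSGetValue("cmi.core.student_name");
const learnerId = API.LMSGetValue("cmi.core.student_id");
API.LMSFinish("");
```

So the user-identification mechanism lives in the SCORM standard itself, not in the authoring tool.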
But in that case each test would have to be packaged as a separate SCORM package; then maybe yes.
If several tests are packed into one SCORM package, we will receive only an overall score, not the per-test results the client asked for.
From what you are saying I think perhaps you are not as familiar with all options in the SCORM standard and with how Captivate works with it.
According to the SCORM standard each SCO (SCORM zip package) can only report one result. But that SCO can have both content and quiz questions. A SCORM course can be made up of multiple SCO modules, each of which can report its own score to the LMS.
Captivate has an extra software tool bundled with the main application called the Multi-SCORM Packager (File > New Project > Multi-SCORM Packager). This tool allows you to bundle a number of SCO modules into a single SCORM course with a combined imsmanifest.xml file that lists all of the modules. The test results for each SCO can be accepted by the LMS and displayed separately in reports with a 'rolled up' overall score for the entire course.
Not all LMSs support multi-SCO courses, but that's a limitation of the particular LMS, not of the SCORM standard or of Captivate.
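Concretely, each SCO in such a course reports its own result through the SCORM 1.2 runtime calls. A minimal sketch of what one SCO's reporting looks like, again with a mock standing in for the LMS-provided API object:

```javascript
// Mock SCORM 1.2 API object; a real LMS records these values per SCO,
// which is why each module can show its own score in LMS reports.
const API = {
  data: {},
  LMSInitialize: function () { return "true"; },
  LMSSetValue: function (key, value) { this.data[key] = String(value); return "true"; },
  LMSCommit: function () { return "true"; },
  LMSFinish: function () { return "true"; },
};

// One SCO's quiz ends with a score of 80 against a passing score of 75.
const score = 80, passingScore = 75;
API.LMSInitialize("");
API.LMSSetValue("cmi.core.score.raw", score);
API.LMSSetValue("cmi.core.lesson_status", score >= passingScore ? "passed" : "failed");
API.LMSCommit("");
API.LMSFinish("");
```

Each SCO in the multi-SCO package runs this same conversation with the LMS independently, which is how the per-module results end up in the reports.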
That's right, but such a package only gives you a report for each module, not a detailed report for each test showing the attempts made and the answers to individual questions. That can only be done in the LMS itself.
Therefore, it is logical that in this case the tests be created in the LMS itself.