
"Correct" much slower than "try again"

New Here, Jul 03, 2019

I asked a similar question before but never really got a resolution. Now I have a new bit of data, so I'll try again.

I have a 5-question True/False/multiple-choice quiz at the end of the lesson with infinite attempts, so there are three feedback captions: "Try again", "You must answer this question to continue", and "Correct, click anywhere...". When I test this on my LMS, the "Try again" caption pops up near-instantaneously. The "Correct" caption, however, always takes 2-3 seconds to appear. That creates two problems:

1. It's a bit annoying to wait, and the delay creates doubt that the button click registered.

2. The most impatient learners click "Submit" again, which the system interprets as the second click ("click anywhere to continue") and moves them on to the next question, which in turn causes the "Correct" feedback to flash by barely visible. I suppose they can surmise the answer was correct because they advanced, but it's still not good practice.

I'm wondering why "Try again" and "Correct" behave differently (as my experience seems to indicate) and whether I can do something to make "Correct" behave like "Try again". Or is the difference about reporting versus not reporting the answer? As in: "Try again" doesn't require reporting, so it appears fast, but "Correct" needs to be reported, so it lags? Is there anything at all I can do about this?

I've read Lilybiri's blog quite carefully and understand the two-step process, but I'm not sure which specific setting could be the answer to my question. My issue doesn't seem to stem from clicking "Continue" taking a few seconds; it seems to be in the first part of the two-step process.

Question Question Slides in Captivate - Captivate blog

Correct answer by RodWard | Adobe Community Professional

It sounds like your LMS may be bogging down with requests and the lag you are experiencing is the time it takes for the LMS to acknowledge the data that has just been sent to it by the course module.  The module is probably waiting for the LMS to reply before it will move to the next question.
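The two feedback paths can be sketched as follows. This is a minimal, hypothetical mock, not Captivate's actual code: `mockLMS`, `submitAnswer`, and the 60 ms latency figure are all illustrative assumptions. The point it demonstrates is that a wrong answer with attempts remaining sends nothing, while a correct answer typically writes the interaction data and then blocks on the LMS commit before the caption can appear.

```javascript
// Hypothetical sketch of why "Correct" lags behind "Try again".
// mockLMS stands in for a SCORM 1.2 API; the busy-wait simulates
// the network round-trip to the server. Names are illustrative.

function busyWaitMs(ms) {
  // Stand-in for server latency; blocks like a synchronous call would.
  const end = Date.now() + ms;
  while (Date.now() < end) { /* spin */ }
}

const mockLMS = {
  latencyMs: 60, // pretend the LMS takes 60 ms to acknowledge
  data: {},
  LMSSetValue(key, value) { this.data[key] = value; return "true"; },
  LMSCommit() { busyWaitMs(this.latencyMs); return "true"; },
};

function submitAnswer(lms, isCorrect) {
  const start = Date.now();
  if (isCorrect) {
    // Correct answer: report the interaction, then wait for the commit.
    lms.LMSSetValue("cmi.interactions.0.result", "correct");
    lms.LMSCommit(); // the feedback caption waits for this to return
  }
  // Wrong answer with attempts left: nothing is sent, caption shows at once.
  return Date.now() - start; // ms before the caption can appear
}

const tryAgainDelay = submitAnswer(mockLMS, false);
const correctDelay = submitAnswer(mockLMS, true);
console.log(`try again: ${tryAgainDelay} ms, correct: ${correctDelay} ms`);
```

With a real LMS the commit latency is whatever the server round-trip costs, which is why the same course can feel fine on one LMS and sluggish on another.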

Take a look at these two blog posts from a while ago:

How LMS server latency can kill your e-learning | Infosemantics Pty Ltd

Set up Adobe Captivate e-learning to mimimize load on Learning Management Systems | Infosemantics Pty Ltd

The second post explains what you can do about this if it DOES turn out to be your issue.

If you want to test whether or not your LMS might be the issue rather than the Captivate module, just upload the same lesson to SCORM Cloud and test it from there.
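One way to check whether the LMS round-trip is the bottleneck is to time the SCORM calls yourself. The sketch below wraps the methods of a SCORM 1.2 API object so every call logs its duration; in a browser you would run `instrumentScormApi(window.API)` from the console of the LMS-hosted course (`window.API` is the standard SCORM 1.2 object name). The wrapper function and the mock API used here to make the sketch runnable are my own illustrative assumptions, not part of Captivate or any LMS.

```javascript
// Hypothetical diagnostic: wrap a SCORM 1.2 API object's methods so each
// call records and logs how long it took. A mock API stands in here so
// the sketch runs outside an LMS.
function instrumentScormApi(api, log = console.log) {
  const timings = [];
  for (const name of ["LMSSetValue", "LMSGetValue", "LMSCommit"]) {
    if (typeof api[name] !== "function") continue;
    const original = api[name].bind(api);
    api[name] = (...args) => {
      const start = Date.now();
      const result = original(...args);
      const elapsed = Date.now() - start;
      timings.push({ name, elapsed });
      log(`${name} took ${elapsed} ms`);
      return result;
    };
  }
  return timings;
}

// Mock API: SetValue is instant, Commit busy-waits 40 ms to mimic latency.
const fakeApi = {
  LMSSetValue: () => "true",
  LMSCommit: () => {
    const end = Date.now() + 40;
    while (Date.now() < end) { /* spin */ }
    return "true";
  },
};

const timings = instrumentScormApi(fakeApi);
fakeApi.LMSSetValue("cmi.core.lesson_status", "passed");
fakeApi.LMSCommit("");
```

If the logged `LMSCommit` times on your LMS match the 2-3 second lag you see, the server is the culprit; if the calls return quickly there, look at the module itself.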

Most Valuable Participant, Jul 03, 2019

That is a pretty old blog post. I have recently posted 11 blogs in the eLearning community about quizzes. Here are the links to the ones describing the default setup; the later ones are about tweaking:

Captivate's Quizzes (1): Terminology - eLearning

Captivate's Quizzes (2): Submit Process - eLearning

Captivate's Quizzes (3): Attempts and Scores - eLearning

Captivate's Quizzes (4): Preferences - eLearning

Captivate's Quizzes (5): Master Slides - eLearning

Can I see the timeline of a question? And can you tell me the exact version number you are using (Help > About Captivate)?

I find a scored question with infinite attempts a bit weird, since the learner can never proceed without giving the correct answer, which leads to a score of 100% every time. Why not use Knowledge Check slides in that case?

New Here, Jul 03, 2019

[Screenshot of the question slide's timeline]
New Here, Jul 03, 2019

Sorry, adding a picture and text together didn't work for me above for some reason, but that was the timeline.

I'm using Captivate 2019, version 11.5.0.

As for knowledge checks: it's a good question! We initially designed the questions to have only 2 attempts, but the customer didn't like that, so it seemed easiest to change the slides to infinite attempts. Do you think it makes sense to rebuild them as knowledge checks instead?

Most Valuable Participant, Jul 03, 2019

I don't see any reason why there would be such a difference, but I have never used quiz slides myself because of what I mentioned before.

Indeed, for a KC slide no data have to be prepared for sending to an LMS; that is the only reason I can think of for the issue you are reporting.

Make sure to upgrade the used theme to the present version (there were a lot of changes to themes in this most recent release). Never delete embedded objects, not even on KC slides. Maybe a tip: always keep your questions in a GIFT or a CSV file; that makes recreation much easier if you have to switch between quiz and KC slides.
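For reference, a GIFT file is plain text that Captivate can import via Quiz > Import GIFT Format File. A True/False and a multiple-choice question, with wording invented here purely as a sample, look like this:

```
// Sample questions in GIFT syntax (wording is illustrative).
// =  marks the correct choice, ~ marks a distractor.
::Q1:: Captivate question slides report interaction data to the LMS. {TRUE}

::Q2:: Which caption appears after a correct answer? {
  =Correct, click anywhere to continue
  ~Try again
  ~You must answer this question to continue
}
```

Keeping the questions in a file like this means rebuilding them as quiz slides or KC slides is a re-import rather than retyping.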

For that small number of questions, I would opt to recreate them as KC slides. Another possibility would be to tweak the quiz slides: skip the first step of the submit process and use custom feedback shapes or captions. Those are described in that old post, and in more detail in some of the Tweak posts I created.

natalyas
New Here, Jul 03, 2019

Thank you, testing this right now!
