Hello all, I've seen a few posts regarding the NaN error occurring in branching courses, but they don't seem to match the problem I'm having.
I've created a branching course that trains two different teams working with the same database but in different job roles. Each branch covers one team, and at the end of the training is a test specific to that team's training.
But I've found that my test fails me even when I enter the correct answers, and I can see "NaN" in several fields of the test results.
To troubleshoot, I created a blank project and set up the following slides:
Slide 1 - Intro
Slide 2 - Menu to select "Branch 1" or "Branch 2"
Slides 3 & 4 - "True or False" questions for branch 1
Slides 5 & 6 - "True or False" questions for branch 2
Slide 7 - Quiz results
When I take the course it works fine: I'm able to select whichever branch I choose, then take and pass the test (see below).
So I know that my branching test works, but because the Quiz Results screen is the last slide, clicking "Continue" takes me nowhere.
The solution is to go to Preferences and tell the course what to do on pass or fail, and this is where the problems start.
I want to return to Slide 2 (the branch selection menu) once the quiz is taken, so I go to Preferences to set this up.
By default the "Pass" and "Fail" actions are set to "Continue" so I set them to "Jump to Slide 2".
Now when I take the test I get the "NaN" error and can only fail the course.
I've seen posts about "NaN" in branching courses suggesting that the project must be corrupt, but I can't see that this is a corrupt-project issue. I've replicated the problem in three separate courses, two of which contain nothing beyond the bare minimum needed to test the branching exams. I've cleared Captivate's cache, and I've also asked a colleague to replicate what I've done with her copy of Captivate; she gets the same result. So it seems that Captivate has an issue with branching tests.
If I set the pass/fail actions back to "Continue", the branching tests work properly again. I thought I could work around the problem by adding another slide to the end of the project, after the Quiz Results screen, and using that to navigate back to the menu, but no such luck: even a blank slide after the quiz results causes the same problem.
Any ideas on this? Is it a known (or unknown) bug? Is there some way of working around it?
Thanks in advance
That is bizarre.
NaN (Not a Number) typically appears when a calculation is performed on a string. It seems that somehow something is being pulled into the calculation that shouldn't be there.
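To illustrate what that means in general terms (this is a generic JavaScript sketch of string-coercion behaviour, not Captivate's actual internal code), arithmetic on a non-numeric string yields NaN:

```javascript
// Generic JavaScript illustration (not Captivate's internals):
// a numeric string coerces cleanly, a non-numeric one produces NaN.
var goodScore = "8";             // numeric string: coerces to the number 8
var badScore = "undefined";      // non-numeric string, e.g. a variable that was never set

console.log(goodScore / 10);                 // 0.8
console.log(badScore / 10);                  // NaN
console.log(Number.isNaN(badScore / 10));    // true
```

So if a quiz variable somewhere ends up holding a non-numeric string when the results are calculated, every field derived from it (score, percentage, etc.) would show NaN.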
Does the same result happen if you wish to return to a slide but there is no branching involved?
Have you considered making your own custom results slide?
Good question. I tried a simple course of:
Slide 1 - Intro
Slide 2 - Begin test
Slides 3 & 4 - True or False questions
Slide 5 - Quiz results
It worked fine. I then changed the Preferences to move back to Slide 2 on pass or fail, and this worked fine as well. So the issue does seem to be narrowed down to branching.
I hadn't considered a custom results slide; I'm not sure I knew you could make one. Maybe I'll look into how to create one and see if that solves the problem. Thanks.
If you turn on Branch aware, you don't need a custom results slide at all. The quizzing system variables are dynamic and will take into account only the visited quiz slides.
Have a look at:
Thanks for your reply. My original problem did not involve custom quiz pages; that was just an option to try to work around the problem.
The problem is that I can set up a branching quiz and have it work, but the Quiz Results screen is set to "Continue" for the pass and fail actions, and since the Quiz Results slide is the last slide of the project there is nowhere to "Continue" to.
As soon as I change any settings to try and work around this, my branching quiz fails and I get the NaN error.
I've included screen shots in my original post if that will help.
It is not a good idea to have the Score slide as the last slide at all; I have mentioned that multiple times. The pass/fail actions are executed when the last frame of the Score slide is reached, not when the Continue button is clicked. What is wrong with having a short last slide? You can put the Exit button there to avoid the problem you created. Did you allow backward navigation in the Quiz settings?
Thanks for your reply.
No, I don't like having the Quiz Results as the last page either; it was just because of the problems I'm having. I'm working on a course covering training for a large database that is used by several teams doing different tasks. The idea was to set up branching so that:
Team A took branch 1
Team B took branch 2
Team C took both branches
But when I ran into problems with the Branch aware quiz, I created a fresh blank course to try to troubleshoot the issue, hence the very stripped-down example of just 7 slides I gave earlier.
Continuing the troubleshooting, I took my stripped-down 7-slide course and gave it the default values of "Continue" on pass or fail, so it works correctly. I then added a blank slide after the Quiz Results, and the course continued to work correctly. I then added a "Jump to Slide 2" action on exit of the last slide, and the course failed, displaying the "NaN" error.
I think my interpretation of branching quizzes must have been wrong. I'd expected someone to take branch 1 and its test, then, if they worked in a team that needed to do the additional work, go back to take branch 2 and its test. But I don't think Captivate will allow that; it seems to restrict you to taking just one branch, and allowing the user to circle back to the main menu to potentially take the second branch is what causes the error.
So it looks like I won't be able to use Branch aware quizzing in my course.
I have posted a link to my blog post explaining Branch aware before. Did you read it? Your interpretation is indeed wrong: learners need to follow one branch, without adding quiz slides on top of an already followed branch. That is perfectly possible if you structure the custom navigation accordingly, even in your use case.
Repeating the link:
Yes, I did read the link you sent, but I have a busy office with lots of interruptions, so I think I need to wait for a quieter moment and re-read it.
I also read a similar post you made before posting my question here, for some reason this forum won't let me enter links to my post, but it's your "Branch Aware Quiz" posting from your blog.
So I've re-read your article a couple of times, and I don't think it covers what I'm trying to achieve.
It seems to me from looking at your article that you are creating a course that provides the user with several branches of learning and a test for each branch, but the user will only ever take one test in that course. So if they select "English" they will take the test in English and then not be able to select a second (or third) test afterwards.
What I need to do is create a branching course that has a branch for each team and the work they do, so they can take that training and associated test. But if they then change teams and now need to be trained and tested on the second branch, they should be able to come back to the course to take that second branch and test.
But I think it looks like Captivate only allows one test per course, regardless of the number of branches built into that course. Or is that wrong?
I will try again to explain. It is a SCORM requirement to have only one score for each course; that is not a Captivate limitation. Normally, when starting a course with a quiz, the questions are fixed from the start, even when they are a subset of a question pool, and each quiz attempt will present the same questions again.
The beauty of Branch aware is that this is not the case: the quizzing system variables used to track the quiz become 'dynamic'. This means they adapt to the branch followed by the learner, and the maximum score, the percentage, etc. are calculated based on the visited quiz slides. The requirement for a second quiz attempt remains the same: the same questions need to be retaken.
What you want to achieve can be done, but it is beyond the scope of these forums.
If a learner switches to a new job, you'd better use the LMS to present the course as if it had not yet been taken (they will need a new profile anyway). Or you can split the course up into multiple courses.
I have published so many blog posts about quizzes that they would make a nice book. Understanding quizzes, and being able to tweak them, is a very broad topic. Too bad it is so often ignored in books and trainings.
I think perhaps it should be pointed out that if your learners happened to change their job description and end up in a different work team, they are unlikely to experience that right in the middle of completing an e-learning lesson. Would that not be true?
When you successfully complete a Captivate module and close down the session, the LMS has recorded the score for that session against that learner's profile, but the Captivate SCORM module itself does not 'remember' anything. The details are all parked in the LMS database.
So, if the learner did happen to belong to Team A and successfully completed the branch for that team, but then at some later date was reassigned to Team B, then they would be starting a completely new user session when they launch that same module again from the LMS to qualify for Team B. Since they now are no longer part of Team A, as long as they simply complete the relevant branch that applies to Team B, all should be well.
So, is it possible that you are encountering this NaN error because you are trying to do something that would NOT occur in the actual circumstances of your user? From your description, I think you are simply not using the Branch Aware function as it was designed to work.
Branch Aware is designed to allow the learner to choose a particular path through the test questions and then for subsequent attempts within the same user session they will be given only those same questions again. It's called Branch Aware because the module 'knows' the branch path taken by the user through the quiz. But this only applies to that specific user session. If they start a completely new user session then it's a blank slate again.
Did you find a solution? I am experiencing the exact same issue in Captivate 2019. Branch aware, two quizzes based on where the learner works (location). I keep getting a NaN error. I've done this before and never had this issue.
I tried adding one question at a time, and all went well. I finally had it working in the HTML preview, then published, loaded it into my LMS, and bam! NaN error again.
Same here. I encountered the same issue in Captivate 2019: Branch aware, with two quizzes based on the learner's choice. It worked at the beginning, but then NaN errors for no reason... very frustrating... >_<
I'm having to think back a couple of years now, but I think the problem was down to how SCORM expects a course to work: it stipulates that a user can only get one set of test results each time they take a course.
So my course had two branches of content, each with its own test. That worked fine initially, because the user takes one branch and one test in a single sitting of the course. However, I was trying to give users the option of taking both branches in a single sitting, so when I added a slide at the end of the test to let the user circle back to the start and take the second branch and test, the course contravened SCORM's rule of one set of test results per user per sitting. That is what generated my problem, because Captivate is trying to abide by SCORM's rules.
I hope that makes sense.
Sorry for the very delayed reply; for some reason I didn't get a notification of your question.
I did get a solution to this, or at least an answer rather than a solution. The answer is this (and I'm dredging up my memories here): it's to do with SCORM, which stipulates that a user can only ever have one set of test results from a single sitting of a course. What I was trying to do was circle around for a second test, which contravenes the SCORM rules. So it's not a bug in Adobe Captivate; it's Captivate abiding by the SCORM rules. It's possible for a person to sit a course more than once and take the test each time, but that's not what I was trying to achieve. I was trying for essentially two tests in the same course at the same time, and that broke the SCORM rules.
I hope that makes sense.