
Issues with "connection to server lost" in Halogen LMS

Contributor
Jan 13, 2017


Hi,

I have created a series of modules to be used in an LMS called Halogen (SCORM 2004, 3rd Edition).

All these modules have been created using the same template.

Unfortunately, some of the modules work fine after uploading the ZIP, while others run until the end screen and then, upon exiting, the LMS reports "A fatal error has occurred, communication with the server has been lost" and outputs a long log (see below). This only happens in 3 of the 5 modules.

I tried copying and pasting the slides of a non-functional module into one that works, but the outcome is the same.

I ran it in SCORM Cloud, and there it works as it should, reporting both "completion" and "success".

So what could cause this error at the last instant, when the module should log off? I am really at my wits' end here.

The content is also the same, and no HTML issues are reported.
Thanks a lot for any ideas!
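(One guess, based on the "invalid XML character (Unicode: 0x3)" postback errors in the log below: a control character has ended up in one of the values the module writes, and the server cannot serialize it to XML. A defensive workaround would be to strip XML-illegal characters before anything is handed to the LMS. This is only a sketch; `sanitizeForXml` and `safeSetValue` are made-up names, not part of any SCORM API.)

```javascript
// Strip characters that are illegal in XML 1.0. The allowed set is
// #x9, #xA, #xD, #x20-#xD7FF, #xE000-#xFFFD; everything else (including
// U+0003, which the postback errors below complain about) is removed.
function sanitizeForXml(value) {
  return String(value).replace(
    /[^\u0009\u000A\u000D\u0020-\uD7FF\uE000-\uFFFD]/g, ''
  );
}

// Hypothetical wrapper: pass the LMS-provided SCORM 2004 API object in,
// and sanitize every value before it reaches SetValue.
function safeSetValue(api, element, value) {
  return api.SetValue(element, sanitizeForXml(value));
}
```

Note this sketch also drops lone surrogate halves, which are equally invalid in XML; properly paired surrogates (emoji etc.) would need extra handling.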

SCORM Engine Log
Version Error reading version information from /META-INF/MANIFEST.MF (Error Message: no protocol: /META-INF/MANIFEST.MF)

Activity Data

+ Module 4 (Course_ID4_ORG)
+ Module 4 (SCO_ID4)

Possible Navigation Requests

+ Navigation Request: START
+ Navigation Request: RESUME ALL
+ Navigation Request: CONTINUE
+ Navigation Request: PREVIOUS
+ Navigation Request: EXIT
+ Navigation Request: EXIT ALL
+ Navigation Request: SUSPEND ALL
+ Navigation Request: ABANDON
+ Navigation Request: ABANDON ALL
+ Navigation Request: CHOICE - Course_ID4_ORG
+ Navigation Request: CHOICE - SCO_ID4

Global Objectives

SSP Buckets

Errors

- Data Postback Errors

  • `370`<?xml version='1.0'?>
<lmsresponse>
<error present='true'>
<description>An unexpected error occurred persisting the runtime data.org.apache.xerces.parsers.DOMParser.parse(Unknown Source)An invalid XML character (Unicode: 0x3) was found in the value of attribute 'Description' and element is 'ActivityRunTimeInteraction'.org.xml.sax.SAXParseException; lineNumber: 1; columnNumber: 6121; An invalid XML character (Unicode: 0x3) was found in the value of attribute 'Description' and element is 'ActivityRunTimeInteraction'.</description>
</error>
</lmsresponse>



  • `370`<?xml version='1.0'?>
<lmsresponse>
<error present='true'>
<description>An unexpected error occurred persisting the runtime data.org.apache.xerces.parsers.DOMParser.parse(Unknown Source)An invalid XML character (Unicode: 0x3) was found in the value of attribute 'Description' and element is 'ActivityRunTimeInteraction'.org.xml.sax.SAXParseException; lineNumber: 1; columnNumber: 6115; An invalid XML character (Unicode: 0x3) was found in the value of attribute 'Description' and element is 'ActivityRunTimeInteraction'.</description>
</error>
</lmsresponse>




Log

     [23:06:22.565] Trying to create XMLHttpRequest in JavaScript1.5

     [23:06:22.566] Creating object

     [23:06:25.296] Control GetExceptionText

     [23:06:26.297] Control GetExceptionText

     [23:06:26.734] Control Initialize

+ [23:06:26.736] Initializing Possible Navigation Requests

+ [23:06:26.737] Initializing Possible Navigation Request Absolutes

+ [23:06:26.738] Initial Selection and Randomization

+ [23:06:26.786] Overall Sequencing Process [OP.1] returned '' in 0.031 seconds

     [23:06:26.792] **************************************

     [23:06:26.792] Deliverying Activity - Module 4 (SCO_ID4)

     [23:06:26.792] **************************************

     [23:06:26.792] Control DeliverActivity - Module 4 (SCO_ID4)

+ [23:06:26.816] Control Update Display

+ [23:06:26.817] Control Evaluate Possible Navigation Requests

+ [23:06:26.818] Evaluate Possible Navigation Requests Process [EPNR] returned '' in 0.015 seconds

+ [23:06:26.833] Control Update Display

+ [23:06:27.300] Control GetExceptionText

+ [23:06:36.788] Control IsThereDirtyData

     [23:06:36.789] Control MarkDirtyDataPosted

+ [23:06:36.789] Control GetXmlForDirtyData

+ [23:06:37.553] Control MarkPostedDataClean

+ [23:06:40.534] Initialize('') returned 'true' in 0.001 seconds

+ [23:06:40.535] GetValue('cmi.success_status') returned 'unknown' in 0.001 seconds

     [23:06:40.536] GetLastError() returned '0' in 0 seconds

+ [23:06:40.537] GetValue('cmi.completion_status') returned 'unknown' in 0 seconds

     [23:06:40.537] GetLastError() returned '0' in 0 seconds

+ [23:06:40.537] SetValue('cmi.completion_status', 'incomplete') returned 'true' in 0.003 seconds

+ [23:06:40.540] SetValue('cmi.exit', 'suspend') returned 'true' in 0 seconds

+ [23:06:40.540] GetValue('cmi.mode') returned 'normal' in 0.001 seconds

     [23:06:40.541] GetLastError() returned '0' in 0 seconds

+ [23:06:40.541] GetValue('cmi.entry') returned 'ab-initio' in 0 seconds

     [23:06:40.541] GetLastError() returned '0' in 0 seconds

+ [23:06:40.541] GetValue('cmi.suspend_data') returned '' in 0.001 seconds

     [23:06:40.542] GetLastError() returned '403' in 0 seconds

     [23:06:40.542] GetLastError() returned '403' in 0 seconds

     [23:06:40.542] GetErrorString('403') returned 'Data Model Element Value Not Initialized' in 0 seconds

     [23:06:40.542] GetDiagnostic('') returned 'The Suspend Data field has not been set for this SCO.' in 0 seconds

+ [23:06:40.542] SetValue('cmi.location', '0') returned 'true' in 0 seconds

+ [23:06:40.542] SetValue('cmi.suspend_data', '0') returned 'true' in 0 seconds

+ [23:06:40.663] GetValue('cmi.learner_name') returned 'JCO, JCO' in 0.001 seconds

     [23:06:40.664] GetLastError() returned '0' in 0 seconds

+ [23:06:46.789] Control IsThereDirtyData

     [23:06:46.789] Control MarkDirtyDataPosted

+ [23:06:46.790] Control GetXmlForDirtyData

+ [23:06:48.164] SetValue('cmi.score.raw', '0') returned 'true' in 0.002 seconds

+ [23:06:48.166] SetValue('cmi.score.max', '40') returned 'true' in 0.001 seconds

+ [23:06:48.167] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:06:48.167] SetValue('cmi.score.scaled', '0') returned 'true' in 0.001 seconds

+ [23:06:48.168] SetValue('cmi.location', 'Title%20screen') returned 'true' in 0 seconds

+ [23:06:48.173] SetValue('cmi.suspend_data', 'A1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1BALAnswerq1319BAIAnswerq2BAIAnswerq3BAIAnswerq4BAIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2BAOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextA%24YC%3Cdiv%3E%3Cspan%20class%3D%22cp-actualText%22%20style%3D%22line-height%3A100%25%3Bcolor%3A%232a6c13%3Bfont-size%3A24px%3Bfont-family%3A%27Georgia%20regular%27%2CGeorgia%3B%22%3E%3Cbr%3E%3C/span%3E%3Col%3E%3C/ol%3E%3C/div%3EHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0 seconds

+ [23:06:48.365] Control MarkPostedDataClean

+ [23:06:56.791] Control IsThereDirtyData

     [23:06:56.792] Control MarkDirtyDataPosted

+ [23:06:56.793] Control GetXmlForDirtyData

+ [23:06:57.734] Control MarkPostedDataClean

+ [23:06:59.772] SetValue('cmi.score.raw', '0') returned 'true' in 0.002 seconds

+ [23:06:59.774] SetValue('cmi.score.max', '40') returned 'true' in 0.001 seconds

+ [23:06:59.775] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:06:59.775] SetValue('cmi.score.scaled', '0') returned 'true' in 0.001 seconds

+ [23:06:59.776] SetValue('cmi.location', 'Course%20contents') returned 'true' in 0 seconds

+ [23:06:59.780] SetValue('cmi.suspend_data', 'B1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1BALAnswerq1319BAIAnswerq2BAIAnswerq3BAIAnswerq4BAIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2BAOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0 seconds

+ [23:07:06.786] Control IsThereDirtyData

     [23:07:06.787] Control MarkDirtyDataPosted

+ [23:07:06.787] Control GetXmlForDirtyData

+ [23:07:07.665] Control MarkPostedDataClean

+ [23:07:16.785] Control IsThereDirtyData

+ [23:07:26.787] Control IsThereDirtyData

+ [23:07:36.790] Control IsThereDirtyData

+ [23:07:46.785] Control IsThereDirtyData

+ [23:07:47.285] SetValue('cmi.score.raw', '0') returned 'true' in 0.005 seconds

+ [23:07:47.290] SetValue('cmi.score.max', '40') returned 'true' in 0 seconds

+ [23:07:47.290] SetValue('cmi.score.min', '0') returned 'true' in 0.001 seconds

+ [23:07:47.291] SetValue('cmi.score.scaled', '0') returned 'true' in 0 seconds

+ [23:07:47.292] SetValue('cmi.location', 'Course%20objectives') returned 'true' in 0 seconds

+ [23:07:47.294] SetValue('cmi.suspend_data', 'C1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1BALAnswerq1319BAIAnswerq2BAIAnswerq3BAIAnswerq4BAIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2BAOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0.001 seconds

+ [23:07:56.786] Control IsThereDirtyData

     [23:07:56.786] Control MarkDirtyDataPosted

+ [23:07:56.786] Control GetXmlForDirtyData

+ [23:07:57.880] Control MarkPostedDataClean

+ [23:08:06.786] Control IsThereDirtyData

+ [23:08:13.773] SetValue('cmi.score.raw', '0') returned 'true' in 0 seconds

+ [23:08:13.773] SetValue('cmi.score.max', '40') returned 'true' in 0 seconds

+ [23:08:13.773] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:08:13.773] SetValue('cmi.score.scaled', '0') returned 'true' in 0.001 seconds

+ [23:08:13.774] SetValue('cmi.location', 'Meetings') returned 'true' in 0.002 seconds

+ [23:08:13.778] SetValue('cmi.suspend_data', 'D1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1BALAnswerq1319BAIAnswerq2BAIAnswerq3BAIAnswerq4BAIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2BAOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0 seconds

+ [23:08:16.786] Control IsThereDirtyData

     [23:08:16.786] Control MarkDirtyDataPosted

+ [23:08:16.786] Control GetXmlForDirtyData

+ [23:08:17.691] Control MarkPostedDataClean

+ [23:08:18.157] SetValue('cmi.score.raw', '0') returned 'true' in 0.001 seconds

+ [23:08:18.158] SetValue('cmi.score.max', '40') returned 'true' in 0 seconds

+ [23:08:18.158] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:08:18.158] SetValue('cmi.score.scaled', '0') returned 'true' in 0 seconds

+ [23:08:18.159] SetValue('cmi.location', 'YOUR%20OWN%20PREFERENCES%201') returned 'true' in 0 seconds

+ [23:08:18.161] SetValue('cmi.suspend_data', 'E1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1BALAnswerq1319BAIAnswerq2BAIAnswerq3BAIAnswerq4BAIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2BAOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0 seconds

+ [23:08:26.554] SetValue('cmi.score.raw', '0') returned 'true' in 0 seconds

+ [23:08:26.554] SetValue('cmi.score.max', '40') returned 'true' in 0.001 seconds

+ [23:08:26.555] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:08:26.555] SetValue('cmi.score.scaled', '0') returned 'true' in 0 seconds

+ [23:08:26.555] SetValue('cmi.location', 'YOUR%20OWN%20PREFERENCES%201') returned 'true' in 0.001 seconds

+ [23:08:26.559] SetValue('cmi.suspend_data', 'F1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1BALAnswerq1319BAIAnswerq2BAIAnswerq3BAIAnswerq4BAIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2BAOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0.001 seconds

+ [23:08:26.786] Control IsThereDirtyData

     [23:08:26.787] Control MarkDirtyDataPosted

+ [23:08:26.787] Control GetXmlForDirtyData

+ [23:08:29.557] Control MarkPostedDataClean

+ [23:08:36.796] Control IsThereDirtyData

+ [23:08:46.786] Control IsThereDirtyData

+ [23:08:56.785] Control IsThereDirtyData

+ [23:09:06.552] SetValue('cmi.score.raw', '0') returned 'true' in 0.001 seconds

+ [23:09:06.553] SetValue('cmi.score.max', '40') returned 'true' in 0.001 seconds

+ [23:09:06.554] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:09:06.554] SetValue('cmi.score.scaled', '0') returned 'true' in 0.002 seconds

+ [23:09:06.556] SetValue('cmi.location', 'YOUR%20OWN%20PREFERENCES%20II') returned 'true' in 0 seconds

+ [23:09:06.560] SetValue('cmi.suspend_data', 'G1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2BAIAnswerq3BAIAnswerq4BAIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2BAOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0 seconds

+ [23:09:06.785] Control IsThereDirtyData

     [23:09:06.785] Control MarkDirtyDataPosted

+ [23:09:06.785] Control GetXmlForDirtyData

+ [23:09:07.730] Control MarkPostedDataClean

+ [23:09:12.554] SetValue('cmi.score.raw', '0') returned 'true' in 0.001 seconds

+ [23:09:12.555] SetValue('cmi.score.max', '40') returned 'true' in 0.001 seconds

+ [23:09:12.556] SetValue('cmi.score.min', '0') returned 'true' in 0.001 seconds

+ [23:09:12.557] SetValue('cmi.score.scaled', '0') returned 'true' in 0.001 seconds

+ [23:09:12.559] SetValue('cmi.location', 'YOUR%20OWN%20PREFERENCES%20III') returned 'true' in 0 seconds

+ [23:09:12.563] SetValue('cmi.suspend_data', 'H1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3BAIAnswerq4BAIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2BAOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0 seconds

+ [23:09:16.794] Control IsThereDirtyData

     [23:09:16.794] Control MarkDirtyDataPosted

+ [23:09:16.795] Control GetXmlForDirtyData

+ [23:09:18.480] SetValue('cmi.score.raw', '0') returned 'true' in 0.001 seconds

+ [23:09:18.481] SetValue('cmi.score.max', '40') returned 'true' in 0 seconds

+ [23:09:18.481] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:09:18.481] SetValue('cmi.score.scaled', '0') returned 'true' in 0.001 seconds

+ [23:09:18.482] SetValue('cmi.location', 'YOUR%20OWN%20PREFERENCES%20IV') returned 'true' in 0 seconds

+ [23:09:18.486] SetValue('cmi.suspend_data', 'I1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4BAIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2BAOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0 seconds

+ [23:09:18.588] Control MarkPostedDataClean

+ [23:09:26.788] Control IsThereDirtyData

     [23:09:26.788] Control MarkDirtyDataPosted

+ [23:09:26.788] Control GetXmlForDirtyData

+ [23:09:27.144] SetValue('cmi.score.raw', '0') returned 'true' in 0.001 seconds

+ [23:09:27.145] SetValue('cmi.score.max', '40') returned 'true' in 0 seconds

+ [23:09:27.145] SetValue('cmi.score.min', '0') returned 'true' in 0.001 seconds

+ [23:09:27.146] SetValue('cmi.score.scaled', '0') returned 'true' in 0 seconds

+ [23:09:27.146] SetValue('cmi.location', 'Meetings%20') returned 'true' in 0 seconds

+ [23:09:27.150] SetValue('cmi.suspend_data', 'J1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2BAOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0 seconds

+ [23:09:28.699] Control MarkPostedDataClean

+ [23:09:36.788] Control IsThereDirtyData

     [23:09:36.788] Control MarkDirtyDataPosted

+ [23:09:36.789] Control GetXmlForDirtyData

+ [23:09:37.918] Control MarkPostedDataClean

+ [23:09:46.785] Control IsThereDirtyData

+ [23:09:56.785] Control IsThereDirtyData

+ [23:10:06.785] Control IsThereDirtyData

+ [23:10:16.790] Control IsThereDirtyData

+ [23:10:26.785] Control IsThereDirtyData

+ [23:10:36.786] Control IsThereDirtyData

+ [23:10:46.788] Control IsThereDirtyData

+ [23:10:56.786] Control IsThereDirtyData

+ [23:11:06.785] Control IsThereDirtyData

+ [23:11:16.786] Control IsThereDirtyData

+ [23:11:26.786] Control IsThereDirtyData

+ [23:11:36.786] Control IsThereDirtyData

+ [23:11:46.786] Control IsThereDirtyData

+ [23:11:56.785] Control IsThereDirtyData

+ [23:12:06.785] Control IsThereDirtyData

+ [23:12:16.785] Control IsThereDirtyData

+ [23:12:26.787] Control IsThereDirtyData

+ [23:12:36.793] Control IsThereDirtyData

+ [23:12:46.786] Control IsThereDirtyData

+ [23:12:56.789] Control IsThereDirtyData

+ [23:13:06.786] Control IsThereDirtyData

+ [23:13:16.785] Control IsThereDirtyData

+ [23:13:26.786] Control IsThereDirtyData

+ [23:13:36.796] Control IsThereDirtyData

+ [23:13:46.786] Control IsThereDirtyData

+ [23:13:56.805] Control IsThereDirtyData

+ [23:14:06.785] Control IsThereDirtyData

+ [23:14:16.785] Control IsThereDirtyData

+ [23:14:26.805] Control IsThereDirtyData

+ [23:14:27.617] SetValue('cmi.score.raw', '0') returned 'true' in 0 seconds

+ [23:14:27.617] SetValue('cmi.score.max', '40') returned 'true' in 0.001 seconds

+ [23:14:27.618] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:14:27.618] SetValue('cmi.score.scaled', '0') returned 'true' in 0.002 seconds

+ [23:14:27.620] SetValue('cmi.location', 'DECISION-MAKING%20PROCESS') returned 'true' in 0 seconds

+ [23:14:27.624] SetValue('cmi.suspend_data', 'K1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2BAOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0 seconds

+ [23:14:32.36] SetValue('cmi.score.raw', '0') returned 'true' in 0.001 seconds

+ [23:14:32.37] SetValue('cmi.score.max', '40') returned 'true' in 0.001 seconds

+ [23:14:32.38] SetValue('cmi.score.min', '0') returned 'true' in 0.001 seconds

+ [23:14:32.39] SetValue('cmi.score.scaled', '0') returned 'true' in 0 seconds

+ [23:14:32.39] SetValue('cmi.location', 'Decision-making%20I') returned 'true' in 0 seconds

+ [23:14:32.42] SetValue('cmi.suspend_data', 'L1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2BAOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0 seconds

+ [23:14:36.790] Control IsThereDirtyData

     [23:14:36.791] Control MarkDirtyDataPosted

+ [23:14:36.791] Control GetXmlForDirtyData

+ [23:14:43.344] Control MarkPostedDataClean

+ [23:14:46.786] Control IsThereDirtyData

+ [23:14:56.786] Control IsThereDirtyData

+ [23:15:06.786] Control IsThereDirtyData

+ [23:15:16.786] Control IsThereDirtyData

+ [23:15:26.785] Control IsThereDirtyData

+ [23:15:36.786] Control IsThereDirtyData

+ [23:15:46.789] Control IsThereDirtyData

+ [23:15:56.785] Control IsThereDirtyData

+ [23:16:06.785] Control IsThereDirtyData

+ [23:16:16.785] Control IsThereDirtyData

+ [23:16:26.785] Control IsThereDirtyData

+ [23:16:36.788] SetValue('cmi.score.raw', '0') returned 'true' in 0.001 seconds

+ [23:16:36.789] SetValue('cmi.score.max', '40') returned 'true' in 0.001 seconds

+ [23:16:36.790] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:16:36.790] SetValue('cmi.score.scaled', '0') returned 'true' in 0.002 seconds

+ [23:16:36.792] SetValue('cmi.location', 'DECISION-MAKING%20II') returned 'true' in 0 seconds

+ [23:16:36.794] SetValue('cmi.suspend_data', 'M1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2BAOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0.001 seconds

+ [23:16:36.911] Control IsThereDirtyData

     [23:16:36.911] Control MarkDirtyDataPosted

+ [23:16:36.911] Control GetXmlForDirtyData

+ [23:16:39.993] Control MarkPostedDataClean

+ [23:16:46.786] Control IsThereDirtyData

+ [23:16:56.788] Control IsThereDirtyData

+ [23:17:06.785] Control IsThereDirtyData

+ [23:17:16.785] Control IsThereDirtyData

+ [23:17:26.786] Control IsThereDirtyData

+ [23:17:36.786] Control IsThereDirtyData

+ [23:17:46.785] Control IsThereDirtyData

+ [23:17:56.786] Control IsThereDirtyData

+ [23:18:05.520] SetValue('cmi.score.raw', '0') returned 'true' in 0.001 seconds

+ [23:18:05.521] SetValue('cmi.score.max', '40') returned 'true' in 0.001 seconds

+ [23:18:05.522] SetValue('cmi.score.min', '0') returned 'true' in 0.001 seconds

+ [23:18:05.523] SetValue('cmi.score.scaled', '0') returned 'true' in 0.002 seconds

+ [23:18:05.526] SetValue('cmi.location', 'DECISION-MAKING%20III') returned 'true' in 0 seconds

+ [23:18:05.528] SetValue('cmi.suspend_data', 'N1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2BAOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0.001 seconds

+ [23:18:06.788] Control IsThereDirtyData

     [23:18:06.788] Control MarkDirtyDataPosted

+ [23:18:06.788] Control GetXmlForDirtyData

+ [23:18:08.942] Control MarkPostedDataClean

+ [23:18:16.786] Control IsThereDirtyData

+ [23:18:26.785] Control IsThereDirtyData

+ [23:18:36.785] Control IsThereDirtyData

+ [23:18:46.785] Control IsThereDirtyData

+ [23:18:56.786] Control IsThereDirtyData

+ [23:19:06.787] Control IsThereDirtyData

+ [23:19:16.785] Control IsThereDirtyData

+ [23:19:26.785] Control IsThereDirtyData

+ [23:19:36.785] Control IsThereDirtyData

+ [23:19:46.785] Control IsThereDirtyData

+ [23:19:56.786] Control IsThereDirtyData

+ [23:20:06.786] Control IsThereDirtyData

+ [23:20:16.791] Control IsThereDirtyData

+ [23:20:26.786] Control IsThereDirtyData

+ [23:20:36.785] Control IsThereDirtyData

+ [23:20:46.786] Control IsThereDirtyData

+ [23:20:56.786] Control IsThereDirtyData

+ [23:21:07.63] Control IsThereDirtyData

+ [23:21:15.102] SetValue('cmi.score.raw', '0') returned 'true' in 0.001 seconds

+ [23:21:15.103] SetValue('cmi.score.max', '40') returned 'true' in 0.001 seconds

+ [23:21:15.104] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:21:15.104] SetValue('cmi.score.scaled', '0') returned 'true' in 0.002 seconds

+ [23:21:15.106] SetValue('cmi.location', 'DECISION-MAKING%20IV') returned 'true' in 0 seconds

+ [23:21:15.108] SetValue('cmi.suspend_data', 'O1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2BAOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0.001 seconds

+ [23:21:16.785] Control IsThereDirtyData

     [23:21:16.785] Control MarkDirtyDataPosted

+ [23:21:16.786] Control GetXmlForDirtyData

+ [23:21:21.906] Control MarkPostedDataClean

+ [23:21:26.788] Control IsThereDirtyData

+ [23:21:36.785] Control IsThereDirtyData

+ [23:21:46.791] Control IsThereDirtyData

+ [23:21:56.786] Control IsThereDirtyData

+ [23:22:06.786] Control IsThereDirtyData

+ [23:22:14.149] SetValue('cmi.score.raw', '0') returned 'true' in 0.001 seconds

+ [23:22:14.150] SetValue('cmi.score.max', '40') returned 'true' in 0.001 seconds

+ [23:22:14.151] SetValue('cmi.score.min', '0') returned 'true' in 0.001 seconds

+ [23:22:14.152] SetValue('cmi.score.scaled', '0') returned 'true' in 0.002 seconds

+ [23:22:14.154] SetValue('cmi.location', 'DECISION-MAKING%20TIPS') returned 'true' in 0.001 seconds

+ [23:22:14.157] SetValue('cmi.suspend_data', 'P1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2BAOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0.001 seconds

+ [23:22:16.786] Control IsThereDirtyData

     [23:22:16.786] Control MarkDirtyDataPosted

+ [23:22:16.786] Control GetXmlForDirtyData

+ [23:22:19.702] Control MarkPostedDataClean

+ [23:22:26.785] Control IsThereDirtyData

+ [23:22:36.788] Control IsThereDirtyData

+ [23:22:46.786] Control IsThereDirtyData

+ [23:22:56.785] Control IsThereDirtyData

+ [23:23:02.205] SetValue('cmi.score.raw', '0') returned 'true' in 0.001 seconds

+ [23:23:02.206] SetValue('cmi.score.max', '40') returned 'true' in 0.001 seconds

+ [23:23:02.207] SetValue('cmi.score.min', '0') returned 'true' in 0.001 seconds

+ [23:23:02.208] SetValue('cmi.score.scaled', '0') returned 'true' in 0.002 seconds

+ [23:23:02.210] SetValue('cmi.location', 'REPORTING/INFORMATION%20SHARING') returned 'true' in 0.001 seconds

+ [23:23:02.213] SetValue('cmi.suspend_data', 'Q1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2BAOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0.001 seconds

+ [23:23:06.683] SetValue('cmi.score.raw', '0') returned 'true' in 0 seconds

+ [23:23:06.683] SetValue('cmi.score.max', '40') returned 'true' in 0.001 seconds

+ [23:23:06.684] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:23:06.684] SetValue('cmi.score.scaled', '0') returned 'true' in 0.001 seconds

+ [23:23:06.685] SetValue('cmi.location', 'REPORTING%20I') returned 'true' in 0.003 seconds

+ [23:23:06.690] SetValue('cmi.suspend_data', 'R1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2BAOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0 seconds

+ [23:23:06.802] Control IsThereDirtyData

     [23:23:06.802] Control MarkDirtyDataPosted

+ [23:23:06.802] Control GetXmlForDirtyData

+ [23:23:09.627] Control MarkPostedDataClean

+ [23:23:16.788] Control IsThereDirtyData

+ [23:23:26.786] Control IsThereDirtyData

+ [23:23:36.786] Control IsThereDirtyData

+ [23:23:46.786] Control IsThereDirtyData

+ [23:23:50.997] SetValue('cmi.score.raw', '0') returned 'true' in 0.001 seconds

+ [23:23:50.998] SetValue('cmi.score.max', '40') returned 'true' in 0.001 seconds

+ [23:23:50.999] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:23:50.999] SetValue('cmi.score.scaled', '0') returned 'true' in 0.002 seconds

+ [23:23:51.1] SetValue('cmi.location', 'REPORTING%20II') returned 'true' in 0 seconds

+ [23:23:51.3] SetValue('cmi.suspend_data', 'S1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0.001 seconds

+ [23:23:56.788] Control IsThereDirtyData

     [23:23:56.789] Control MarkDirtyDataPosted

+ [23:23:56.789] Control GetXmlForDirtyData

+ [23:23:58.483] Control MarkPostedDataClean

+ [23:24:06.785] Control IsThereDirtyData

+ [23:24:16.785] Control IsThereDirtyData

+ [23:24:26.785] Control IsThereDirtyData

+ [23:24:36.786] Control IsThereDirtyData

+ [23:24:46.785] Control IsThereDirtyData

+ [23:24:56.785] Control IsThereDirtyData

+ [23:25:06.788] Control IsThereDirtyData

+ [23:25:16.786] Control IsThereDirtyData

+ [23:25:26.786] Control IsThereDirtyData

+ [23:25:36.786] Control IsThereDirtyData

+ [23:25:46.785] Control IsThereDirtyData

+ [23:25:56.785] Control IsThereDirtyData

+ [23:26:06.785] Control IsThereDirtyData

+ [23:26:08.963] SetValue('cmi.score.raw', '0') returned 'true' in 0.001 seconds

+ [23:26:08.964] SetValue('cmi.score.max', '40') returned 'true' in 0 seconds

+ [23:26:08.964] SetValue('cmi.score.min', '0') returned 'true' in 0.001 seconds

+ [23:26:08.965] SetValue('cmi.score.scaled', '0') returned 'true' in 0.002 seconds

+ [23:26:08.967] SetValue('cmi.location', 'REPORTING%20III') returned 'true' in 0 seconds

+ [23:26:08.969] SetValue('cmi.suspend_data', 'T1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0.001 seconds

+ [23:26:16.789] Control IsThereDirtyData

     [23:26:16.790] Control MarkDirtyDataPosted

+ [23:26:16.790] Control GetXmlForDirtyData

+ [23:26:20.774] Control MarkPostedDataClean

+ [23:26:26.785] Control IsThereDirtyData

+ [23:26:36.786] Control IsThereDirtyData

+ [23:26:46.785] Control IsThereDirtyData

+ [23:26:56.785] Control IsThereDirtyData

+ [23:27:02.792] SetValue('cmi.score.raw', '0') returned 'true' in 0 seconds

+ [23:27:02.792] SetValue('cmi.score.max', '40') returned 'true' in 0.001 seconds

+ [23:27:02.793] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:27:02.793] SetValue('cmi.score.scaled', '0') returned 'true' in 0.002 seconds

+ [23:27:02.795] SetValue('cmi.location', 'OFFICIAL%20AND%20UNOFFICIAL%20INFORMATION%20I') returned 'true' in 0.001 seconds

+ [23:27:02.798] SetValue('cmi.suspend_data', 'U1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0.001 seconds

+ [23:27:06.785] Control IsThereDirtyData

     [23:27:06.785] Control MarkDirtyDataPosted

+ [23:27:06.786] Control GetXmlForDirtyData

+ [23:27:08.413] Control MarkPostedDataClean

+ [23:27:16.785] Control IsThereDirtyData

+ [23:27:26.789] Control IsThereDirtyData

+ [23:27:36.786] Control IsThereDirtyData

+ [23:27:46.785] Control IsThereDirtyData

+ [23:27:56.786] Control IsThereDirtyData

+ [23:28:04.833] SetValue('cmi.score.raw', '0') returned 'true' in 0.001 seconds

+ [23:28:04.834] SetValue('cmi.score.max', '40') returned 'true' in 0.001 seconds

+ [23:28:04.835] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:28:04.835] SetValue('cmi.score.scaled', '0') returned 'true' in 0.002 seconds

+ [23:28:04.837] SetValue('cmi.location', 'Reporting%20tips') returned 'true' in 0 seconds

+ [23:28:04.840] SetValue('cmi.suspend_data', 'V1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0 seconds

+ [23:28:06.785] Control IsThereDirtyData

     [23:28:06.785] Control MarkDirtyDataPosted

+ [23:28:06.785] Control GetXmlForDirtyData

+ [23:28:08.701] Control MarkPostedDataClean

+ [23:28:16.786] Control IsThereDirtyData

+ [23:28:26.786] Control IsThereDirtyData

+ [23:28:31.178] SetValue('cmi.score.raw', '0') returned 'true' in 0.002 seconds

+ [23:28:31.180] SetValue('cmi.score.max', '40') returned 'true' in 0.001 seconds

+ [23:28:31.181] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:28:31.181] SetValue('cmi.score.scaled', '0') returned 'true' in 0.002 seconds

+ [23:28:31.183] SetValue('cmi.location', 'REVIEW%20OF%20PERSONAL%20PREFERENCES') returned 'true' in 0.001 seconds

+ [23:28:31.186] SetValue('cmi.suspend_data', 'W1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0 seconds

+ [23:28:36.789] Control IsThereDirtyData

     [23:28:36.789] Control MarkDirtyDataPosted

+ [23:28:36.789] Control GetXmlForDirtyData

+ [23:28:38.542] Control MarkPostedDataClean

+ [23:28:46.785] Control IsThereDirtyData

+ [23:28:54.589] SetValue('cmi.score.raw', '0') returned 'true' in 0.001 seconds

+ [23:28:54.590] SetValue('cmi.score.max', '40') returned 'true' in 0 seconds

+ [23:28:54.590] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:28:54.590] SetValue('cmi.score.scaled', '0') returned 'true' in 0 seconds

+ [23:28:54.591] SetValue('cmi.location', 'Title%20screen%20end%20Quiz') returned 'true' in 0.002 seconds

+ [23:28:54.596] SetValue('cmi.suspend_data', 'X1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1A0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0 seconds

+ [23:28:56.786] Control IsThereDirtyData

     [23:28:56.786] Control MarkDirtyDataPosted

+ [23:28:56.786] Control GetXmlForDirtyData

+ [23:28:57.953] Control MarkPostedDataClean

+ [23:28:58.995] SetValue('cmi.score.raw', '0') returned 'true' in 0 seconds

+ [23:28:58.995] SetValue('cmi.score.max', '40') returned 'true' in 0.001 seconds

+ [23:28:58.996] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:28:58.996] SetValue('cmi.score.scaled', '0') returned 'true' in 0 seconds

+ [23:28:58.997] SetValue('cmi.location', 'Quiz%201') returned 'true' in 0 seconds

+ [23:28:59.0] SetValue('cmi.suspend_data', 'Z1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP000AA0000C-1B0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0 seconds

+ [23:28:59.6] SetValue('cmi.suspend_data', 'Z1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP0A%24nP001BA0001C-1BZA%7E%24mV*4-FzUA0001A00AAAA0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0.001 seconds

+ [23:29:06.786] Control IsThereDirtyData

     [23:29:06.786] Control MarkDirtyDataPosted

+ [23:29:06.786] Control GetXmlForDirtyData

+ [23:29:07.995] Control MarkPostedDataClean

+ [23:29:16.786] Control IsThereDirtyData

+ [23:29:17.763] SetValue('cmi.suspend_data', 'a1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP001BA0001C25CZA%7E%24mV*4-FzU%7E%24mV*lkKzU1000B00AHAN0%3B1%3B2%3B3%3B4%3B5%3B6BAB0B0BBB1B1BCB0B0BDB1B0BEB1B1BFB0B0BGB1B1aB%7E%24mV*4kKzUA0001A00AAAA0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0.001 seconds

+ [23:29:26.785] Control IsThereDirtyData

     [23:29:26.785] Control MarkDirtyDataPosted

+ [23:29:26.785] Control GetXmlForDirtyData

+ [23:29:27.628] GetValue('cmi.interactions._count') returned '0' in 0 seconds

     [23:29:27.629] GetLastError() returned '0' in 0 seconds

+ [23:29:27.629] SetValue('cmi.interactions.0.id', 'Interaction_24') returned 'true' in 0.003 seconds

+ [23:29:27.632] SetValue('cmi.interactions.0.type', 'choice') returned 'true' in 0 seconds

+ [23:29:27.632] SetValue('cmi.interactions.0.learner_response', '1') returned 'true' in 0 seconds

+ [23:29:27.632] SetValue('cmi.interactions.0.result', 'correct') returned 'true' in 0.001 seconds

+ [23:29:27.633] SetValue('cmi.interactions.0.correct_responses.0.pattern', '1') returned 'true' in 0 seconds

+ [23:29:27.633] SetValue('cmi.interactions.0.weighting', '10') returned 'true' in 0 seconds

+ [23:29:27.633] SetValue('cmi.interactions.0.latency', 'PT9.87S') returned 'true' in 0 seconds

+ [23:29:27.633] SetValue('cmi.interactions.0.objectives.0.id', 'Quiz_201653151048') returned 'true' in 0.001 seconds

+ [23:29:27.634] SetValue('cmi.interactions.0.timestamp', '2017-01-12T23:29:27.0+01') returned 'true' in 0.001 seconds

+ [23:29:27.877] SetValue('cmi.suspend_data', 'a1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP0A%24nP001BA0001C27CZA%7E%24mV*4-FzU%7E%24mV*lkKzU1000B00AHAN0%3B1%3B2%3B3%3B4%3B5%3B6BAB0B0BBB1B1BCB0B0BDB1B0BEB1B1BFB0B0BGB1B1aB%7E%24mV*4kKzU%7E%24mV*G-MzU1010B00KAA%241GFKImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943A0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0 seconds

+ [23:29:27.970] SetValue('cmi.suspend_data', 'b1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP0A%24nP001BA0001C27DZA%7E%24mV*4-FzU%7E%24mV*lkKzU1000B00AHAN0%3B1%3B2%3B3%3B4%3B5%3B6BAB0B0BBB1B1BCB0B0BDB1B0BEB1B1BFB0B0BGB1B1aB%7E%24mV*4kKzU%7E%24mV*G-MzU1010B00KAA%241GFKImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943AbC%7E%24mV*aENzUA0001A00AAAA0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0 seconds

+ [23:29:32.270] Control MarkPostedDataClean

+ [23:29:36.826] Control IsThereDirtyData

     [23:29:36.827] Control MarkDirtyDataPosted

+ [23:29:36.827] Control GetXmlForDirtyData

+ [23:29:38.843] Control MarkPostedDataClean

+ [23:29:39.639] SetValue('cmi.suspend_data', 'c1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP001BA0001C27EZA%7E%24mV*4-FzU%7E%24mV*lkKzU1000B00AHAN0%3B1%3B2%3B3%3B4%3B5%3B6BAB0B0BBB1B1BCB0B0BDB1B0BEB1B1BFB0B0BGB1B1aB%7E%24mV*4kKzU%7E%24mV*G-MzU1010B00KAA%241GFKImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943AbC%7E%24mV*aENzU%7E%24mV*O6PzU1010B00KFAJ0%3B1%3B2%3B3%3B4BAB0B0BBB0B0BCB1B1BDB1B1BEB1B1cD%7E%24mV*w6PzUA0001A00AAAA0vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBA$_#-#_$') returned 'true' in 0.001 seconds

+ [23:29:45.964] GetValue('cmi.interactions._count') returned '1' in 0.002 seconds

     [23:29:45.966] GetLastError() returned '0' in 0 seconds

+ [23:29:45.966] SetValue('cmi.interactions.1.id', '24810682') returned 'true' in 0.002 seconds

+ [23:29:45.968] SetValue('cmi.interactions.1.type', 'choice') returned 'true' in 0.001 seconds

+ [23:29:45.970] SetValue('cmi.interactions.1.learner_response', 'Organizing_pre-meetings_with_a_smaller_number_of_people_is_usually_very_useful[,]Participation_in_meetings_is_defined_by_how_much_you_speak__so_Japanese_will_always_try_to_say_something[,]It_is_important_not_to_interrupt_your_Japanese_colleagues_and_leave_them_enough_room_to_speak[,]If_you_want_more_information__talking_with_Japanese_colleagues_after_work_hours_will_certainly_help') returned 'true' in 0 seconds

+ [23:29:45.970] SetValue('cmi.interactions.1.result', 'incorrect') returned 'true' in 0.001 seconds

+ [23:29:45.971] SetValue('cmi.interactions.1.correct_responses.0.pattern', 'Organizing_pre-meetings_with_a_smaller_number_of_people_is_usually_very_useful[,]It_is_important_not_to_interrupt_your_Japanese_colleagues_and_leave_them_enough_room_to_speak[,]If_you_want_more_information__talking_with_Japanese_colleagues_after_work_hours_will_certainly_help') returned 'true' in 0 seconds

+ [23:29:45.971] SetValue('cmi.interactions.1.description', 'Please select all the correct statements.') returned 'true' in 0 seconds

+ [23:29:45.971] SetValue('cmi.interactions.1.weighting', '10') returned 'true' in 0 seconds

+ [23:29:45.971] SetValue('cmi.interactions.1.latency', 'PT18.73S') returned 'true' in 0.001 seconds

+ [23:29:45.972] SetValue('cmi.interactions.1.objectives.0.id', 'Quiz_201653151048') returned 'true' in 0.002 seconds

+ [23:29:45.974] SetValue('cmi.interactions.1.timestamp', '2017-01-12T23:29:45.0+01') returned 'true' in 0.001 seconds

+ [23:29:45.976] GetValue('cmi.interactions._count') returned '2' in 0 seconds

     [23:29:45.976] GetLastError() returned '0' in 0 seconds

+ [23:29:45.976] SetValue('cmi.interactions.2.id', 'Interaction_24') returned 'true' in 0.001 seconds

+ [23:29:45.977] SetValue('cmi.interactions.2.type', 'choice') returned 'true' in 0 seconds

+ [23:29:45.977] SetValue('cmi.interactions.2.learner_response', '1') returned 'true' in 0.001 seconds

+ [23:29:45.978] SetValue('cmi.interactions.2.result', 'correct') returned 'true' in 0 seconds

+ [23:29:45.978] SetValue('cmi.interactions.2.correct_responses.0.pattern', '1') returned 'true' in 0.001 seconds

+ [23:29:45.979] SetValue('cmi.interactions.2.weighting', '10') returned 'true' in 0 seconds

+ [23:29:45.979] SetValue('cmi.interactions.2.latency', 'PT9.87S') returned 'true' in 0 seconds

+ [23:29:45.979] SetValue('cmi.interactions.2.objectives.0.id', 'Quiz_201653151048') returned 'true' in 0 seconds

+ [23:29:45.979] SetValue('cmi.interactions.2.timestamp', '2017-01-12T23:29:45.0+01') returned 'true' in 0.001 seconds

+ [23:29:45.980] GetValue('cmi.interactions._count') returned '3' in 0 seconds

     [23:29:45.980] GetLastError() returned '0' in 0 seconds

+ [23:29:45.980] SetValue('cmi.interactions.3.id', '24811473') returned 'true' in 0.001 seconds

+ [23:29:45.981] SetValue('cmi.interactions.3.type', 'choice') returned 'true' in 0 seconds

+ [23:29:45.981] SetValue('cmi.interactions.3.learner_response', 'It_is_an_informal_discussion[,]It_takes_place_before_the_meeting[,]It_makes_it_possible_to_informally_discuss_and_consult_with_all_colleagues_involved_before_the_meeting') returned 'true' in 0 seconds

+ [23:29:45.981] SetValue('cmi.interactions.3.result', 'correct') returned 'true' in 0 seconds

+ [23:29:45.981] SetValue('cmi.interactions.3.correct_responses.0.pattern', 'It_is_an_informal_discussion[,]It_takes_place_before_the_meeting[,]It_makes_it_possible_to_informally_discuss_and_consult_with_all_colleagues_involved_before_the_meeting') returned 'true' in 0 seconds

+ [23:29:45.981] SetValue('cmi.interactions.3.description', 'What best describes nemawashi? Please select all the appropriate statements. ') returned 'true' in 0.001 seconds

+ [23:29:45.982] SetValue('cmi.interactions.3.weighting', '10') returned 'true' in 0 seconds

+ [23:29:45.982] SetValue('cmi.interactions.3.latency', 'PT11.63S') returned 'true' in 0 seconds

+ [23:29:45.982] SetValue('cmi.interactions.3.objectives.0.id', 'Quiz_201653151048') returned 'true' in 0 seconds

+ [23:29:45.982] SetValue('cmi.interactions.3.timestamp', '2017-01-12T23:29:45.0+01') returned 'true' in 0.001 seconds

+ [23:29:45.983] GetValue('cmi.interactions._count') returned '4' in 0.001 seconds

     [23:29:45.984] GetLastError() returned '0' in 0 seconds

+ [23:29:45.984] SetValue('cmi.interactions.4.id', '24811740') returned 'true' in 0 seconds

+ [23:29:45.984] SetValue('cmi.interactions.4.type', 'choice') returned 'true' in 0 seconds

+ [23:29:45.984] SetValue('cmi.interactions.4.learner_response', 'It_happens_automatically_and_regularly[,]It_is_a_bottom-up_process_(employees_reporting_to_their_boss)[,]It_is_a_contraction_of_three_Japanese_words:_hokoku_(to_report)__renraku_(to_communicate)__sodan_(to_ask_advice)') returned 'true' in 0.001 seconds

+ [23:29:45.985] SetValue('cmi.interactions.4.result', 'correct') returned 'true' in 0 seconds

+ [23:29:45.985] SetValue('cmi.interactions.4.correct_responses.0.pattern', 'It_happens_automatically_and_regularly[,]It_is_a_bottom-up_process_(employees_reporting_to_their_boss)[,]It_is_a_contraction_of_three_Japanese_words:_hokoku_(to_report)__renraku_(to_communicate)__sodan_(to_ask_advice)') returned 'true' in 0 seconds

+ [23:29:45.985] SetValue('cmi.interactions.4.description', 'What best describes horenso? Please select all the appropriate statements. ') returned 'true' in 0 seconds

+ [23:29:45.986] SetValue('cmi.interactions.4.weighting', '10') returned 'true' in 0 seconds

+ [23:29:45.986] SetValue('cmi.interactions.4.latency', 'PT5.29S') returned 'true' in 0 seconds

+ [23:29:45.986] SetValue('cmi.interactions.4.objectives.0.id', 'Quiz_201653151048') returned 'true' in 0 seconds

+ [23:29:45.986] SetValue('cmi.interactions.4.timestamp', '2017-01-12T23:29:45.0+01') returned 'true' in 0 seconds

+ [23:29:45.988] SetValue('cmi.completion_status', 'completed') returned 'true' in 0 seconds

+ [23:29:45.988] SetValue('cmi.success_status', 'passed') returned 'true' in 0.001 seconds

+ [23:29:45.989] SetValue('cmi.completion_status', 'completed') returned 'true' in 0 seconds

+ [23:29:45.989] SetValue('cmi.score.raw', '30') returned 'true' in 0 seconds

+ [23:29:45.989] SetValue('cmi.score.max', '40') returned 'true' in 0.001 seconds

+ [23:29:45.990] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:29:45.990] SetValue('cmi.score.scaled', '0.75') returned 'true' in 0.001 seconds

+ [23:29:45.991] SetValue('cmi.location', 'Quiz%204') returned 'true' in 0 seconds

+ [23:29:45.994] SetValue('cmi.suspend_data', 'c1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP0A%24nP001BA0101C28EZA%7E%24mV*4-FzU%7E%24mV*lkKzU1000B00AHAN0%3B1%3B2%3B3%3B4%3B5%3B6BAB0B0BBB1B1BCB0B0BDB1B0BEB1B1BFB0B0BGB1B1aB%7E%24mV*4kKzU%7E%24mV*G-MzU1010B00KAA%241GFKImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943AbC%7E%24mV*aENzU%7E%24mV*O6PzU1010B00KFAJ0%3B1%3B2%3B3%3B4BAB0B0BBB0B0BCB1B1BDB1B1BEB1B1cD%7E%24mV*w6PzU%7E%24mV*bNRzU1010B00KFAJ0%3B1%3B2%3B3%3B4BAB0B0BBB1B1BCB0B0BDB1B1BEB1B10vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268
301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBB$_#-#_$') returned 'true' in 0 seconds

+ [23:29:46.83] SetValue('cmi.success_status', 'passed') returned 'true' in 0 seconds

+ [23:29:46.83] SetValue('cmi.completion_status', 'completed') returned 'true' in 0 seconds

+ [23:29:46.83] SetValue('cmi.score.raw', '30') returned 'true' in 0.001 seconds

+ [23:29:46.84] SetValue('cmi.score.max', '40') returned 'true' in 0 seconds

+ [23:29:46.84] SetValue('cmi.score.min', '0') returned 'true' in 0.001 seconds

+ [23:29:46.85] SetValue('cmi.score.scaled', '0.75') returned 'true' in 0 seconds

+ [23:29:46.85] SetValue('cmi.location', 'Result%20slide') returned 'true' in 0 seconds

+ [23:29:46.88] SetValue('cmi.suspend_data', 'd1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP001BA1100C28EZA%7E%24mV*4-FzU%7E%24mV*lkKzU1000B00AHAN0%3B1%3B2%3B3%3B4%3B5%3B6BAB0B0BBB1B1BCB0B0BDB1B0BEB1B1BFB0B0BGB1B1aB%7E%24mV*4kKzU%7E%24mV*G-MzU1010B00KAA%241GFKImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943AbC%7E%24mV*aENzU%7E%24mV*O6PzU1010B00KFAJ0%3B1%3B2%3B3%3B4BAB0B0BBB0B0BCB1B1BDB1B1BEB1B1cD%7E%24mV*w6PzU%7E%24mV*BfRzU1010B00KFAJ0%3B1%3B2%3B3%3B4BAB0B0BBB1B1BCB0B0BDB1B1BEB1B10vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib2683
01BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBB$_#-#_$') returned 'true' in 0 seconds

+ [23:29:46.785] Control IsThereDirtyData

     [23:29:46.785] Control MarkDirtyDataPosted

+ [23:29:46.786] Control GetXmlForDirtyData

+ [23:29:52.682] SetValue('cmi.success_status', 'passed') returned 'true' in 0 seconds

+ [23:29:52.682] SetValue('cmi.completion_status', 'completed') returned 'true' in 0 seconds

+ [23:29:52.682] SetValue('cmi.score.raw', '30') returned 'true' in 0.001 seconds

+ [23:29:52.683] SetValue('cmi.score.max', '40') returned 'true' in 0 seconds

+ [23:29:52.683] SetValue('cmi.score.min', '0') returned 'true' in 0.001 seconds

+ [23:29:52.684] SetValue('cmi.score.scaled', '0.75') returned 'true' in 0 seconds

+ [23:29:52.685] SetValue('cmi.location', 'Quiz%201') returned 'true' in 0 seconds

+ [23:29:52.688] SetValue('cmi.suspend_data', 'Z1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP100BB1100C28EZA%7E%24mV*4-FzU%7E%24mV*lkKzU1000B00AHAN0%3B1%3B2%3B3%3B4%3B5%3B6BAB0B0BBB1B1BCB0B0BDB1B0BEB1B1BFB0B0BGB1B1aB%7E%24mV*4kKzU%7E%24mV*G-MzU1010B00KAA%241GFKImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943AbC%7E%24mV*aENzU%7E%24mV*O6PzU1010B00KFAJ0%3B1%3B2%3B3%3B4BAB0B0BBB0B0BCB1B1BDB1B1BEB1B1cD%7E%24mV*w6PzU%7E%24mV*BfRzU1010B00KFAJ0%3B1%3B2%3B3%3B4BAB0B0BBB1B1BCB0B0BDB1B1BEB1B10vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268
301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBB$_#-#_$') returned 'true' in 0 seconds

+ [23:29:53.94] Control ERROR

+ [23:29:53.94] Control MarkPostedDataDirty

+ [23:29:53.94] Control IsThereDirtyData

     [23:29:53.94] Control MarkDirtyDataPosted

+ [23:29:53.94] Control GetXmlForDirtyData

+ [23:29:55.5] Control ERROR

+ [23:29:55.5] Control MarkPostedDataDirty

     [23:29:55.6] Control DisplayError - A fatal error has occurred, communication with the server has been lost.

+ [23:29:57.127] SetValue('cmi.success_status', 'passed') returned 'true' in 0 seconds

+ [23:29:57.127] SetValue('cmi.completion_status', 'completed') returned 'true' in 0 seconds

+ [23:29:57.128] SetValue('cmi.score.raw', '30') returned 'true' in 0 seconds

+ [23:29:57.128] SetValue('cmi.score.max', '40') returned 'true' in 0 seconds

+ [23:29:57.128] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:29:57.128] SetValue('cmi.score.scaled', '0.75') returned 'true' in 0 seconds

+ [23:29:57.128] SetValue('cmi.location', 'Quiz%202') returned 'true' in 0 seconds

+ [23:29:57.133] SetValue('cmi.suspend_data', 'a1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP100BB1100C25EZA%7E%24mV*4-FzU%7E%24mV*oLUzU1000B00AHAN0%3B1%3B2%3B3%3B4%3B5%3B6BAB0B0BBB1B1BCB0B0BDB1B0BEB1B1BFB0B0BGB1B1aB%7E%24mV*4kKzU%7E%24mV*G-MzU1010B00KAA%241GFKImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943AbC%7E%24mV*aENzU%7E%24mV*O6PzU1010B00KFAJ0%3B1%3B2%3B3%3B4BAB0B0BBB0B0BCB1B1BDB1B1BEB1B1cD%7E%24mV*w6PzU%7E%24mV*BfRzU1010B00KFAJ0%3B1%3B2%3B3%3B4BAB0B0BBB1B1BCB0B0BDB1B1BEB1B10vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268
301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBB$_#-#_$') returned 'true' in 0.001 seconds

+ [23:29:58.789] SetValue('cmi.success_status', 'passed') returned 'true' in 0 seconds

+ [23:29:58.789] SetValue('cmi.completion_status', 'completed') returned 'true' in 0 seconds

+ [23:29:58.789] SetValue('cmi.score.raw', '30') returned 'true' in 0 seconds

+ [23:29:58.789] SetValue('cmi.score.max', '40') returned 'true' in 0 seconds

+ [23:29:58.789] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:29:58.789] SetValue('cmi.score.scaled', '0.75') returned 'true' in 0 seconds

+ [23:29:58.790] SetValue('cmi.location', 'Quiz%203') returned 'true' in 0 seconds

+ [23:29:58.793] SetValue('cmi.suspend_data', 'b1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP100BB1100C25EZA%7E%24mV*4-FzU%7E%24mV*oLUzU1000B00AHAN0%3B1%3B2%3B3%3B4%3B5%3B6BAB0B0BBB1B1BCB0B0BDB1B0BEB1B1BFB0B0BGB1B1aB%7E%24mV*4kKzU%7E%24mV*G-MzU1010B00KAA%241GFKImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943AbC%7E%24mV*aENzU%7E%24mV*O6PzU1010B00KFAJ0%3B1%3B2%3B3%3B4BAB0B0BBB0B0BCB1B1BDB1B1BEB1B1cD%7E%24mV*w6PzU%7E%24mV*BfRzU1010B00KFAJ0%3B1%3B2%3B3%3B4BAB0B0BBB1B1BCB0B0BDB1B1BEB1B10vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268
301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBB$_#-#_$') returned 'true' in 0 seconds

+ [23:29:59.881] SetValue('cmi.success_status', 'passed') returned 'true' in 0 seconds

+ [23:29:59.881] SetValue('cmi.completion_status', 'completed') returned 'true' in 0 seconds

+ [23:29:59.881] SetValue('cmi.score.raw', '30') returned 'true' in 0 seconds

+ [23:29:59.881] SetValue('cmi.score.max', '40') returned 'true' in 0 seconds

+ [23:29:59.881] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:29:59.881] SetValue('cmi.score.scaled', '0.75') returned 'true' in 0 seconds

+ [23:29:59.881] SetValue('cmi.location', 'Quiz%204') returned 'true' in 0.001 seconds

+ [23:29:59.884] SetValue('cmi.suspend_data', 'c1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP100BB1100C27EZA%7E%24mV*4-FzU%7E%24mV*oLUzU1000B00AHAN0%3B1%3B2%3B3%3B4%3B5%3B6BAB0B0BBB1B1BCB0B0BDB1B0BEB1B1BFB0B0BGB1B1aB%7E%24mV*4kKzU%7E%24mV*G-MzU1010B00KAA%241GFKImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943AbC%7E%24mV*aENzU%7E%24mV*12UzU1010B00KFAJ0%3B1%3B2%3B3%3B4BAB0B0BBB0B0BCB1B1BDB1B1BEB1B1cD%7E%24mV*w6PzU%7E%24mV*BfRzU1010B00KFAJ0%3B1%3B2%3B3%3B4BAB0B0BBB1B1BCB0B0BDB1B1BEB1B10vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268
301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBB$_#-#_$') returned 'true' in 0 seconds

+ [23:30:00.748] SetValue('cmi.success_status', 'passed') returned 'true' in 0 seconds

+ [23:30:00.748] SetValue('cmi.completion_status', 'completed') returned 'true' in 0 seconds

+ [23:30:00.749] SetValue('cmi.score.raw', '30') returned 'true' in 0 seconds

+ [23:30:00.749] SetValue('cmi.score.max', '40') returned 'true' in 0 seconds

+ [23:30:00.749] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:30:00.749] SetValue('cmi.score.scaled', '0.75') returned 'true' in 0.004 seconds

+ [23:30:00.753] SetValue('cmi.location', 'Result%20slide') returned 'true' in 0.001 seconds

+ [23:30:00.756] SetValue('cmi.suspend_data', 'd1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP100BB1100C28EZA%7E%24mV*4-FzU%7E%24mV*oLUzU1000B00AHAN0%3B1%3B2%3B3%3B4%3B5%3B6BAB0B0BBB1B1BCB0B0BDB1B0BEB1B1BFB0B0BGB1B1aB%7E%24mV*4kKzU%7E%24mV*G-MzU1010B00KAA%241GFKImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943AbC%7E%24mV*aENzU%7E%24mV*12UzU1010B00KFAJ0%3B1%3B2%3B3%3B4BAB0B0BBB0B0BCB1B1BDB1B1BEB1B1cD%7E%24mV*w6PzU%7E%24mV*XEVzU1010B00KFAJ0%3B1%3B2%3B3%3B4BAB0B0BBB1B1BCB0B0BDB1B1BEB1B10vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268
301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBB$_#-#_$') returned 'true' in 0 seconds

+ [23:30:03.7] SetValue('cmi.success_status', 'passed') returned 'true' in 0 seconds

+ [23:30:03.7] SetValue('cmi.completion_status', 'completed') returned 'true' in 0 seconds

+ [23:30:03.8] SetValue('cmi.score.raw', '30') returned 'true' in 0 seconds

+ [23:30:03.8] SetValue('cmi.score.max', '40') returned 'true' in 0 seconds

+ [23:30:03.8] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:30:03.8] SetValue('cmi.score.scaled', '0.75') returned 'true' in 0 seconds

+ [23:30:03.8] SetValue('cmi.location', 'End%20slide') returned 'true' in 0 seconds

+ [23:30:03.11] SetValue('cmi.suspend_data', 'e1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP100BB1100C28EZA%7E%24mV*4-FzU%7E%24mV*oLUzU1000B00AHAN0%3B1%3B2%3B3%3B4%3B5%3B6BAB0B0BBB1B1BCB0B0BDB1B0BEB1B1BFB0B0BGB1B1aB%7E%24mV*4kKzU%7E%24mV*G-MzU1010B00KAA%241GFKImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943AbC%7E%24mV*aENzU%7E%24mV*12UzU1010B00KFAJ0%3B1%3B2%3B3%3B4BAB0B0BBB0B0BCB1B1BDB1B1BEB1B1cD%7E%24mV*w6PzU%7E%24mV*XEVzU1010B00KFAJ0%3B1%3B2%3B3%3B4BAB0B0BBB1B1BCB0B0BDB1B1BEB1B10vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib2683
01BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBB$_#-#_$') returned 'true' in 0 seconds

+ [23:30:05.804] SetValue('cmi.success_status', 'passed') returned 'true' in 0 seconds

+ [23:30:05.804] SetValue('cmi.completion_status', 'completed') returned 'true' in 0 seconds

+ [23:30:05.805] SetValue('cmi.score.raw', '30') returned 'true' in 0 seconds

+ [23:30:05.805] SetValue('cmi.score.max', '40') returned 'true' in 0 seconds

+ [23:30:05.805] SetValue('cmi.score.min', '0') returned 'true' in 0 seconds

+ [23:30:05.805] SetValue('cmi.score.scaled', '0.75') returned 'true' in 0 seconds

+ [23:30:05.805] SetValue('cmi.location', 'End%20slide') returned 'true' in 0 seconds

+ [23:30:05.807] SetValue('cmi.suspend_data', 'e1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP0A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP1A%24nP100BB1100C28EZA%7E%24mV*4-FzU%7E%24mV*oLUzU1000B00AHAN0%3B1%3B2%3B3%3B4%3B5%3B6BAB0B0BBB1B1BCB0B0BDB1B0BEB1B1BFB0B0BGB1B1aB%7E%24mV*4kKzU%7E%24mV*G-MzU1010B00KAA%241GFKImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943KImage_1673FOSmartShape_941OSmartShape_936OSmartShape_937OSmartShape_942OSmartShape_943AbC%7E%24mV*aENzU%7E%24mV*12UzU1010B00KFAJ0%3B1%3B2%3B3%3B4BAB0B0BBB0B0BCB1B1BDB1B1BEB1B1cD%7E%24mV*w6PzU%7E%24mV*XEVzU1010B00KFAJ0%3B1%3B2%3B3%3B4BAB0B0BBB1B1BCB0B0BDB1B1BEB1B10vIAnswerq1Avto%20brainstorm%20as%20well%20as%20to%20discuss%20and%20decide.LAnswerq1319BAIAnswerq2ARnot%20so%20important%20IAnswerq3AuNeither%20of%20the%20above.%20It%u2019s%20always%20case%20by%20caseIAnswerq4ASI%20completely%20agreeIAnswerq5BARText_Entry_Box_11BARText_Entry_Box_21BAQText_Entry_Box_3BAQText_Entry_Box_4BAQText_Entry_Box_5BAQText_Entry_Box_6BAQText_Entry_Box_7BAQText_Entry_Box_8BALcase_study1BALcase_study2A%24gBHe%20probably%20did%20not%20ask%20specifically%20for%20reports%20and%20so%20his%20team%20did%20not%20know%20they%20had%20to%20reportOcase_study2348BALcase_study3BALcase_study4BALcase_study5BATcpQuizInfoStudentIDBAVcpQuizInfoStudentNameAIJCO%2C%20JCOXnumberofstudentsinclassBAEtextBAHtext112BAHv_agreeBAIv_debateBAHv_examsBANv_interactionBAGv_nullBAJv_null320BAJv_null349BAFv_oneBAFv_twoBAMv_understandBAHv_visibBAKv_visib190BANv_visib190201BAQv_visib190201209BAKv_visib191BAKv_visib200BAKv_visib208BAKv_visib268BANv_visib268
301BANv_visib268313BANv_visib268316BAQcpQuizHandledAllBB$_#-#_$') returned 'true' in 0 seconds

+ [23:30:05.807] SetValue('cmi.session_time', 'PT23M25.26S') returned 'true' in 0 seconds

+ [23:30:05.809] Commit('') returned 'true' in 0.002 seconds

+ [23:30:05.811] GetValue('cmi.success_status') returned 'passed' in 0 seconds

     [23:30:05.811] GetLastError() returned '0' in 0 seconds

+ [23:30:05.811] GetValue('cmi.completion_status') returned 'completed' in 0 seconds

     [23:30:05.811] GetLastError() returned '0' in 0 seconds

+ [23:30:05.811] SetValue('cmi.session_time', 'PT23M25.27S') returned 'true' in 0.001 seconds

+ [23:30:05.812] SetValue('cmi.exit', 'suspend') returned 'true' in 0 seconds

+ [23:30:05.812] Commit('') returned 'true' in 0 seconds

+ [23:30:05.812] Terminate('') returned 'true' in 0.001 seconds

+ [23:30:18.846] Control Evaluate Possible Navigation Requests

+ [23:30:18.849] Evaluate Possible Navigation Requests Process [EPNR] returned '' in 0.017 seconds

+ [23:30:18.866] Control Update Display

+ [23:30:18.976] In ScoHasTerminatedSoUnload
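The tail of the log corresponds to Captivate's normal SCORM 2004 shutdown handshake. A minimal sketch of that call sequence against a stubbed runtime API (the real object is injected by the LMS as window.API_1484_11; the method names come from the SCORM 2004 run-time spec, but this stub only caches values locally):

```javascript
// Stub of the SCORM 2004 runtime API that an LMS normally injects as
// window.API_1484_11. A real implementation persists data server-side;
// "communication with the server lost" errors surface at Commit/Terminate.
const API_1484_11 = {
  data: {},
  SetValue(key, value) { this.data[key] = value; return 'true'; },
  GetValue(key) { return this.data[key] || ''; },
  Commit() { return 'true'; },    // flush cached data to the server
  Terminate() { return 'true'; }, // end the session
  GetLastError() { return '0'; },
};

// The same order of calls as in the log above:
API_1484_11.SetValue('cmi.session_time', 'PT23M25.27S');
API_1484_11.SetValue('cmi.exit', 'suspend');
API_1484_11.Commit('');
const ok = API_1484_11.Terminate('');
console.log(ok); // 'true' when the LMS accepted the whole exchange
```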

Views
1.3K
Translate
Report
Community guidelines
Be kind and respectful, give credit to the original source of content, and search for duplicates before posting. Learn more
community guidelines

1 Correct answer: Contributor, Jan 17, 2017 (quoted in full below)
Community Expert, Jan 13, 2017

I am very sorry about your issues, but unless another Captivate user happens to be on the same LMS, I doubt you'll get appropriate answers in this Captivate forum. Since everything works fine when testing in the SCORM Cloud, the problems are clearly due to the LMS, and you should contact the people responsible for that LMS.

The only suggestions I have: be sure that the score slide is not the last slide; have at least one slide after it.

I have also heard that SCORM 2004 support is sometimes not totally stable. Is it possible to try SCORM 1.2 as well? I know it has less functionality, but maybe it could be a temporary solution?

Contributor, Jan 13, 2017

As always, thank you.

The result slide is the second-to-last slide, and the learner has to click an "Exit" button to quit.

I am confused, as it only happens in some of the modules, not all...

Community Expert, Jan 13, 2017

Maybe have a look at this thread:

Re: Problems with IE9 and C9 closing entire browser

TLCMediaDesign is a scripting guru; you could try his script to override Captivate's normal Exit function and see if it helps with your problem.

Contributor, Jan 13, 2017

That is a good idea. We are using Firefox and Edge and it doesn't change anything, but I will try the script.

Where exactly would I put it in?

After the script? Actually, I also asked him in that thread, as I need it for all browsers.

<!DOCTYPE html>

<html lang="en">

<head>

<meta name='viewport' content='initial-scale = 1, minimum-scale = 1, maximum-scale = 1'/>

<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />

<meta http-equiv="x-ua-compatible" content="IE=10">

<title>Module 3</title>

<style type="text/css">#initialLoading{background:url(assets/htmlimages/loader.gif) no-repeat center center;background-color:#ffffff;position:absolute;margin:auto;top:0;left:0;right:0;bottom:0;z-index:10010;}</style>

<script>

var deviceReady = false;

var initCalled = false ;

var initialized = false;
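For what it's worth, scripts like that usually work by wrapping the player's exit handler so extra cleanup runs before the window closes. A generic sketch only — the handler name DoCPExit is an assumption based on older Captivate HTML5 output, and TLCMediaDesign's actual script may differ, so verify the name against your own published files:

```javascript
// Record of calls so the wrapping behavior is observable; in real published
// output the original handler would close the window / notify the LMS.
const calls = [];

// Hypothetical stand-in for Captivate's built-in exit handler (the name
// DoCPExit is an assumption; check your published index.html / player JS).
let DoCPExit = function () {
  calls.push('original'); // original behavior: exit the course
};

// Wrap the original so custom cleanup runs first.
const originalExit = DoCPExit;
DoCPExit = function () {
  calls.push('cleanup'); // e.g. force a final Commit to the LMS here
  originalExit();
};

DoCPExit(); // cleanup runs, then the original exit behavior
```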

Contributor, Jan 13, 2017

I just noticed that the communication loss happens not when I click "Exit" but already when I click "Submit" on the last quiz slide to go to the results slide.

That is strange...

Community Expert, Jan 13, 2017

Test your SCORM in SCORM Cloud and see if you get the same result.

This looks to be an issue with your LMS to me.

Contributor, Jan 13, 2017

Thanks Rod.

As I said, I tested the exact same ZIP in SCORM Cloud and it worked. But the error is reproducible and only happens in the same 3 of 5 modules, all based on the same template. It is very strange. I just replaced all the quiz slides in one defective module hoping that they were the reason, but they weren't.

Community Expert, Jan 14, 2017

Then this means the error is in your LMS, and you have a good case to go back to them and ask for their technical people to get involved in finding the solution.

Contributor, Jan 14, 2017

I hope you are right. I will do that.

Contributor, Jan 17, 2017

Okay, here is the solution.

Some LMSs don't strip control characters out of slide text that may have been copy-pasted in from Word, etc.

In this case, two questions in the quiz contained an invisible line-break control character.

As Captivate uses the actual question text to report the questions to the LMS, the LMS thought it was being asked to do something it didn't understand and quit out.

Removing the control characters fixed the issue. And it only took me a week!
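For anyone hitting the same thing: a quick way to find and strip such invisible characters from question text before pasting it back. A generic sketch, not Captivate- or Halogen-specific; ordinary \n, \r and \t are kept on the assumption that intentional whitespace should survive:

```javascript
// Match C0/C1 control characters plus the Unicode line/paragraph separators
// (U+2028/U+2029), which Word pastes can smuggle in invisibly; \t, \n and \r
// are deliberately excluded so normal whitespace survives.
const CONTROL_CHARS = /[\u0000-\u0008\u000B\u000C\u000E-\u001F\u007F-\u009F\u2028\u2029]/g;

// List each hidden character with its position and code point.
function findControlChars(text) {
  return [...text.matchAll(CONTROL_CHARS)].map(m => ({
    index: m.index,
    codePoint: 'U+' + m[0].codePointAt(0).toString(16).toUpperCase().padStart(4, '0'),
  }));
}

// Replace hidden characters with a space, then collapse doubled spaces.
function stripControlChars(text) {
  return text.replace(CONTROL_CHARS, ' ').replace(/ {2,}/g, ' ').trim();
}

// A question title with a hidden line separator, as in this thread:
const q = 'How important is\u2028 team feedback?';
console.log(findControlChars(q));  // [ { index: 16, codePoint: 'U+2028' } ]
console.log(stripControlChars(q)); // 'How important is team feedback?'
```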

Community Beginner, Apr 22, 2024

Hahaha, you saved my life 7 years later!!! Thank you!!! I was having the same issue in the new Captivate. After I read your post, I found out that one of the questions I was copying from Word to Captivate had been created as a "Formula". It looks like this format is not supported in Captivate, and for that reason the software was closing automatically and giving this "connection to server lost" notification. THANK YOU!
