Tracking Interactions

I am trying to configure a scored simulation so that if the
learner performs an incorrect action, they do not receive credit
for the point(s) assigned to the associated interaction.
I am working with click boxes, and the issue I'm having is
that if I go through and perform incorrect actions, the SCORM
transcript and the results page in the simulation both report that
I performed all the actions correctly.
I thought that deselecting the Allow User Infinite Attempts
checkbox in the click box properties dialog might be the solution.
If it is, I can't figure out what selection to make in the After
Last Attempt drop-down list so that the learner can still move
forward by performing the required action.
Another possibility is that I have configured the Quiz
Reporting options incorrectly. Currently, I have selected the
following options:
Choose Report Data: Slide Views Only
Reporting Level: Interactions and Score
Any recommendations for what setting(s) to modify so that
incorrect interactions are counted as incorrect in the results
would be greatly appreciated.
Thanks,

Thank you for your response.
I think it's something other than the Reporting Level / LMS,
because the transcript on the LMS mirrors the results that are
displayed on the Captivate quiz results screen that I embedded at
the end of the simulation.
It lists each interaction as "correct" even when I deliberately
click away from the required click box first and the failure
feedback is displayed. Only then do I click the required click box
to move forward in the simulation.
If you have any other suggestions for settings I should
modify, that would be great.
Thanks
Captivader
Scottsdale, AZ

Similar Messages

  • Interaction ID

    Can someone tell me what the Interaction ID number and
    Objective ID are for when it comes to an LMS?
    Does it matter if my IDs for different SCORMs are the same?
    The problem is I went and made 40 SCORM packages from the same
    base template file. (I didn't want to keep changing all my
    settings every time I made a new SCORM.) I just realized that this
    might have a weird effect on my LMS... does anyone know anything
    about it?

    If you want to track interactions, this is how you'd do it.
    Tracking interactions generally means writing the user's response
    and the expected/correct response to the LMS. By assigning a
    unique ID to each interaction, you can tell which response was
    made to which interaction.
    Interaction IDs have to be unique within a specific SCO but do
    not need to be unique across SCOs (I think). So when you run a
    report for a given SCO in your LMS, you should be able to see
    interaction results (ID, user response, correct response).
    So you should be fine with what you've done. However, it
    definitely depends on whether your LMS has any problem with
    repeated IDs. I don't think it should, but LMS
    implementations/specs vary widely...
    Erik
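The mechanics Erik describes can be sketched against the SCORM 1.2 runtime API. This is a hedged sketch: `api` is a stub standing in for the LMS-provided adapter, and the IDs and responses are made-up examples, not Captivate's actual output.

```javascript
// Minimal stand-in for the LMS-provided SCORM 1.2 API adapter; a real SCO
// would locate the adapter via the standard window/parent search instead.
const store = {};
const api = {
  LMSSetValue: (key, value) => { store[key] = String(value); return "true"; },
  LMSCommit: () => "true",
};

let interactionCount = 0;

// Record one interaction: the ID must be unique within this SCO, and both
// the learner's response and the expected response are written to the LMS.
function reportInteraction(id, studentResponse, correctResponse) {
  const n = interactionCount++;
  api.LMSSetValue(`cmi.interactions.${n}.id`, id);
  api.LMSSetValue(`cmi.interactions.${n}.type`, "performance");
  api.LMSSetValue(`cmi.interactions.${n}.student_response`, studentResponse);
  api.LMSSetValue(`cmi.interactions.${n}.correct_responses.0.pattern`, correctResponse);
  api.LMSSetValue(`cmi.interactions.${n}.result`,
    studentResponse === correctResponse ? "correct" : "wrong");
  api.LMSCommit("");
}

reportInteraction("Slide03_ClickBox1", "clicked_target", "clicked_target");
reportInteraction("Slide05_ClickBox1", "clicked_elsewhere", "clicked_target");
```

An LMS report for the SCO would then show each ID alongside the learner's response and the correct response, which is why IDs only need to be unambiguous within one SCO.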

  • Captivate3 score transmission to LMS not working in Firefox

    We are experiencing a troubling problem with Captivate 3 quiz scores not transmitting to the LMS when using Firefox. The exact same functionality works perfectly in IE. We have narrowed it down to a TIMING problem for how quickly data is passed to the LMS, which differs by browser, and we would like to know if anyone has a solution (other than telling users only to use IE --- not an option).
    Using Firefox:
    When you finish the course quiz, the score is calculated and reported correctly within Captivate. When you exit the course immediately after that, a PARTIAL score is reported to the LMS, it is almost always incorrect, and it varies from session to session. For example, if you scored 100% (and Captivate told you that), the completion status and score will show up in the LMS as 20%, 30%, or some small percentage that is less than a passing grade and definitely less than 100%.
    The critical thing we've noticed is that Captivate will take a long time (almost 3 minutes!) while it sits and processes data at the end of the quiz, on the Results screen. You can see the cursor reacting, which is a visual cue that processing is still happening. If you wait for that processing to finish and THEN check your score in the LMS, *amazingly* it is correct (100%). But if you exit the course normally, without waiting 3 minutes to return to the LMS, your score will not be correctly reported. So... why is there this lag, and why is it reporting ANY score at all while processing is still occurring, as opposed to a zero score? What is it processing that takes so long, and when is it sending what to the LMS? This gets even more interesting when you see how it works in a different browser...
    Using IE:
    There is no processing lag and never a problem. All scores are reported correctly and immediately when you exit the course and go to the LMS.
    We have been trying to isolate the problem and figure out why Firefox handles this differently from IE and what we can do to fix it.
    I have stripped the lesson down to only 10 interactions (clickables) that are included in the Quiz score. All other objects are set to 0, and not included in the quiz. Advanced Interaction map verifies a clean slate.
    Quiz Reporting is set to:
    Enable reporting (checked)
    SCORM version 1.2
    80% = passing grade
    Report Pass or Fail = Status as defined by report data
    Report Data = Quiz Results Only
    Report Score to LMS = Percent
    Reporting Level = Report Score (I'm not reporting or tracking interactions)
    Our LMS is Pathlore.
    Does anyone have any ideas what is going on with Captivate vs. Firefox and what we can do to make score reporting behave properly?
    Thanks!

    Hi Ginnycr,
    I'm experiencing the same issue with Captivate 4 and it looks like this problem has been around for quite some time now. Have a look at this link:
    http://www.koma-medien.de/elearning/forum/archive/index.php/t-1140.html
    at the bottom of the thread someone posts this:
    "The slow issue is resolved when I force the g_intAPIType to 0 (into the
    <project>.htm file), so it force Captivate to communicate as if it was into IE."
    Try it out and see if it works? I'll test on my end too. Hopefully one of us will come up with a solution! (or one of the nice Captivate experts on this forum will jump in and rescue us)
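A hedged sketch of what the quoted workaround changes. The meaning of the values is inferred from the workaround itself, not from Captivate documentation, so treat it as an assumption:

```javascript
// In the published <project>.htm, g_intAPIType appears to select how the SWF
// talks to the SCORM API. Inferred meaning: 0 = the direct/synchronous path
// IE uses; 1 = the alternative path other browsers get by default.
function chooseApiType(browserIsIE, forceDirect) {
  if (forceDirect) return 0;   // the workaround: always use the direct "IE" path
  return browserIsIE ? 0 : 1;  // default: non-IE browsers get the other path
}
```

Hard-coding `g_intAPIType = 0` in the HTM is the equivalent of always taking the `forceDirect` branch, which is why Firefox then behaves like IE.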

  • Captivate and LMS JavaScript Problem

    We use Captivate and a Plateau LMS. We had an issue with
    bookmarking within our course. The first lesson worked just fine,
    but later lessons were not bookmarking. We added javascript alerts
    to the code to see when the Captivate content was setting LMS
    variables.
    We feared that bandwidth between the client and the LMS was
    so limited that the script attempting to set the LMS variables
    was failing, but this is not the case. The script sometimes
    appears not to run at all, at random intervals. When it does run,
    it succeeds every time. The first lesson, which works reliably,
    is much smaller than the other lessons; maybe the size of the SWF
    affects the script's ability to run properly?
    Has anyone else run across this issue? Does anyone have
    advice on troubleshooting an issue like this more deeply?
    Any input is appreciated.
    Thanks!
    Chris

    I think CP3 sends its data back on a time basis, not a slide
    basis... like every 7 seconds or something?
    We've had customers tracking interaction data in CP quizzes
    with 40 or so questions. If they go through too quickly, the
    timed 'save' to the LMS falls behind... so by the time the user
    is done, only two-thirds of the suspend data has been saved...
    And I imagine the bookmark could be left out entirely as well.
    So see if it's a timing thing. Go through slowly and see if
    posting happens on a timed basis... maybe that will help
    determine where the issue is...
    I do wonder how CP4 sends its data back...
    Erik
    Erik Lord
    http://www.capemedia.net
    Adobe Community Expert - eLearning
    http://www.adobe.com/communities/experts/
    http://www.awaretips.net -
    Authorware Tips!
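Erik's timing theory can be illustrated with a toy model. The 7-second interval and the one-slide's-worth-of-data-per-flush rate are illustrative assumptions, not Captivate's actual behavior:

```javascript
// Toy model of a timed LMS save falling behind a fast user: data is flushed
// on a fixed interval, and each flush saves roughly one slide's worth of
// suspend data. Returns how many slides' data made it to the LMS by exit.
function slidesSavedAtExit(totalSlides, secondsPerSlide, saveIntervalSeconds) {
  const sessionSeconds = totalSlides * secondsPerSlide;
  const flushes = Math.floor(sessionSeconds / saveIntervalSeconds);
  return Math.min(totalSlides, flushes);
}
```

With 40 questions at 5 seconds each and a 7-second flush, only 28 of 40 slides' data (about two-thirds) is saved by exit; at a slower pace the queue drains completely, which matches the "go through slowly" test Erik suggests.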

  • SCORM 1.2 package errors

    When I create SCORM 1.2 packages using Breeze Presenter,
    the generated scorm_support.js file consistently has the same
    error:
    var g_nAPI = 0; // type of API to start searching for;
    allowable values: 0 - SCORM 2004; 1 - SCORM 1.2 (or 1.1)
    This variable should be set to 1...
    I'm getting tired of unzipping, editing, and rezipping
    packages for this basic error.
    Am I alone in seeing this?
    Andrew
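For reference, the hand-edit Andrew describes is a one-line change in the published scorm_support.js:

```javascript
// As published by Breeze Presenter (wrong for a SCORM 1.2 package):
//   var g_nAPI = 0; // 0 - SCORM 2004; 1 - SCORM 1.2 (or 1.1)
// Corrected for a SCORM 1.2 package:
var g_nAPI = 1; // start searching for the SCORM 1.2 (or 1.1) API
```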

    Captivate 6 shouldn't require any scored assessments to track properly.
    Just set up your SCORM 1.2 lesson with Incomplete > Complete,
    set 'User Access' or 'Slide Views - 1 slide',
    and don't check the Quiz option.
    The Oracle LMS can be troublesome - when developing content for a customer a couple of years ago, tracking interaction responses longer than 1 character would crash the system...
    BUT simple lessons that just launched and marked themselves completed didn't pose any issues.
    As long as you have the SCORM options checked in the preferences and publish to a .zip file, that .zip file should have everything your LMS needs.
    Good luck
    -E

  • Captivate and LMS Force 10

    Does anyone know if Captivate works with the LMS system Force
    10?
    Thanks,
    Jeff


  • Visible/invisible subform based on choices from 2 dropdowns

    Does anyone have an example of a form with subforms being visible/invisible based on the answers of 2 dropdowns.
    thanks for sharing

    Hi Jagan,
    It is probably just how you set up the script.
    For starters, for Yes/No choices I think you would be better off using radio buttons instead of checkbox objects. So, for example, you would have four different radio button exclusion groups, each with a Yes and a No radio button. Then, in the Object > Binding palette, you would set the Yes value to 1 and the No value to 0.
    This way you can sum the four radio button exclusion groups, and if the total value is >= 1, change the presence of the text field and highlighting to suit.
    There are some examples here that might help:
    Scoring options: http://assure.ly/uiKS6f.
    Tracking interaction (checkboxes): http://assure.ly/xDbtbx.
    Hope that helps,
    Niall
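Niall's sum-the-groups idea can be sketched like this. The function name is hypothetical; in a LiveCycle form this logic would live in a calculate or change event, with the group values (1 for Yes, 0 for No) coming from the Object > Binding settings he describes:

```javascript
// Sum the Yes/No exclusion groups (each bound to 1 for Yes, 0 for No) and
// decide the presence of the follow-up field: any "Yes" answer reveals it.
function updatePresence(groupValues) {
  const total = groupValues.reduce((sum, v) => sum + v, 0);
  return total >= 1 ? "visible" : "hidden";
}
```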

  • Cmi.suspend_data element size

    Our LMS - SkillPort - has asked us to change the
    cmi.suspend_data element for one of our courses (mainly a quiz)
    so it fits within 4096 characters. They say it's over 5800
    characters.
    I'm not familiar with the cmi.suspend_data setting. How do
    you access it? How do you change it? My colleague suggests it's
    probably generated automatically when we work through the
    "SCORM 1.2 RTE" step of the SkillPort Compatibility Toolkit.
    Have you run into any issues regarding cmi.suspend_data?
    What can you recommend?
    Thank you!

    I don't know if the error can be ignored without issue. If
    their LMS is reporting that the amount of data the lesson is
    trying to send is too large, it may not save any of it. Or even
    if it does, it may only save the first 4096 bytes... so when that
    data is sent back to the lesson on resume, it will be incomplete
    (and cause problems with the lesson's resume/playback?).
    If the LMS is just reporting that the field is too long but
    saves it all anyway then, yes, I'd suspect you could ignore the
    error without ill effect (?).
    I'm not sure what aspects of the CP Quiz Preferences come into
    play here. As Rick suggests, checking "Never Send Resume Data"
    would make sense in this case, as the suspend_data field *should
    be* used specifically for the lesson to load back in any data it
    collected on a previous launch.
    There are other SCORM fields/variables for the lesson to send
    quiz data and results to the LMS. The status and score should NOT
    be part of that suspend_data tracking. In fact, the LMS wouldn't
    even be able to report those results if the CP lesson were
    sending back status and score in that field.
    SO my guess is you can indeed check that "Never Send Resume
    Data" option and still get the results you need, especially if
    bookmarking is not something you want to happen - so every time
    users launch this lesson, it will be as if they're launching it
    for the first time (as far as the lesson itself appears to the
    user).
    IF that doesn't work and for some reason checking that option
    does affect the tracking you need then, yes, the next best option
    would be to reduce the number of questions so the amount of info
    in that suspend_data field is reduced.
    Finally, tracking interactions will significantly add to the
    suspend_data field, I suspect, as I think it means the lesson
    sends the question text and the selected answer back to the LMS
    (so making the questions shorter may help, but it's less likely?)
    HTH!
    Erik
    Janatwork wrote:
    > Since the purpose of "suspend_data" is to track things
    BETWEEN SESSIONS, could I and SkillPort simply ignore the
    5800 vs. 4096 size conflict, since we are requiring the learners
    to complete the quiz in ONE session (i.e., they cannot bookmark
    it and come back later)?
    > Do you mean "reduce the number of quiz questions"? (My
    course has 36 questions.) Or do you mean "make the questions
    shorter"?
    Erik Lord
    http://www.capemedia.net
    Adobe Community Expert - eLearning
    http://www.adobe.com/communities/experts/
    http://www.awaretips.net -
    Authorware Tips!
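The size conflict Erik discusses can be checked mechanically. A hedged sketch of what a hard-truncating LMS would do to an oversized payload; the 4096-character limit comes from the SCORM 1.2 data model's definition of cmi.suspend_data:

```javascript
// Report whether a suspend_data string fits the SCORM 1.2 limit, and what
// would survive if the LMS hard-truncates instead of rejecting it.
function checkSuspendData(data, limit = 4096) {
  return {
    length: data.length,
    fits: data.length <= limit,
    truncated: data.slice(0, limit), // what survives a hard truncation
  };
}
```

A 5800-character payload like the one SkillPort reports would lose its last ~1700 characters, which is exactly the incomplete-resume risk Erik flags.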

  • Oracle (web) and MAPI

    Howdy!
    So something we would like to do is have Oracle receive emails (from MS Exchange), look up the email address, and insert a record based on that email (basically, we want to use this to track interactions with our clients).
    Has anyone done something like this with web deployed forms (is it even possible)?
    Is there code out there I can use?
    thanks!
    -Joel

    Joel,
    what about using Java Mail through the Forms Java Importer? (http://java.sun.com/products/javamail/)
    The SUN discussion forum also provides some information on how to access MSExchange (http://forum.java.sun.com/)
    Frank

  • PeopleSoft Analytic Calculation Engine - Anyone using it?

    We are planning to use PeopleSoft Analytic Calculation Engine in HRMS 8.9. We have simple needs such as showing headcount/turnover rate metrics.
    Has anyone used it? How is your experience?
    Thanks

    Having our own LMS, we do not use the PeopleSoft one, but we have a client that I'm pretty sure does. They've referred to it as the PeopleSoft LMS and the Oracle LMS... but I'm pretty sure it's one and the same (as Oracle bought PeopleSoft months ago, right?)
    The biggest issue we've heard about is the lack of interaction-level tracking support. I don't know what version they have, but not being able to track interaction data was a big disappointment to them. They hired a contractor to customize it and add that functionality, and it mostly works but has some flaws... and, of course, if they ever update their LMS, that customization will likely be blown away.
    Otherwise, the only other complaint I heard was the cost and time of integration with their other systems (whatever those may be - HR and such, I imagine). It wasn't cheap or quick.
    I'd like to balance this out but I've not heard any positive comments on it, other than it does integrate with their HR system.
    Erik

  • Anyone using Peoplesoft Enterprise Learning (LMS)?

    Hi everyone,
    My organization is currently considering Peoplesoft Enterprise Learning Management (aka Peoplesoft LMS). Could anyone who has used/currently uses it please share their experiences with me?
    Any feedback on PS LMS would be appreciated - and it would be great to get a demo if possible.
    Thanks,
    Eric Benson
    e-Learning Business Analyst


  • Need clarification for Captivate assessment settings

    Greetings,
    I have questions about how an LMS handles unique identifiers
    that are assigned to questions or interactions in a 3rd party
    software such as Flash or Captivate.
    First of all, I must say that the more I work with our LMS
    (Integrity eLearning's WBT) the more I like it. It appears to
    be so feature rich that I have to ask questions such as the ones
    below. I would certainly recommend this product. But I am not the
    LMS admin, so these questions are necessary.
    My intent is to make an LMS Report (such as the Question
    Response Summary Report) more readable, and therefore better
    understood. To do this I need to understand naming limitations when
    a Captivate generated SCO is integrated with our LMS.
    For example: (for those experienced with Captivate)
    In Captivate:
    I have an assessment based on how a user interacts with a
    series of slides. The slides emulate the use of a piece of
    software. There are transparent click boxes on specific slides.
    On specific slides, for a specific click box:
    In Click Box Properties, Reporting tab, I have 5 variables
    available. 3 fields and 2 check boxes. I understand the uses of the
    2 check boxes (Include In Quiz and Report Answers) and the
    Weighting field. My questions relate to 2 fields:
    Objective ID:
    Interaction ID:
    The Objective ID:
    I believe this variable needs to be a consistent identifier;
    the label is the same for each different interaction within the
    one assessment (SCO). So, if I have 5 slides, each with one click
    box, this property is the same for each of the 5 click boxes.
    Does this relate to a separate Objective dictated by the LMS,
    or is it related strictly to that SCO?
    Can I use the same identifier in a different assessment, for
    a different SCO?
    In other words, can I re-use this exact label and not have
    data problems when I generate reports?
    The Interaction ID:
    I believe this variable must have a unique identifier (within
    the single lesson) for each question in the assessment (SCO). So,
    if I have 5 slides, each with one click box, this property is
    different and unique for each of the 5 click boxes within the
    SCO.
    Is this correct?
    If correct:
    Can I use the same identifier in a different assessment, for
    a different SCO?
    Can the value have spaces or will spaces cause problems?
    Can I re-use the labels in different SCOs?
    If I have no particular Objective, but I do have Interactions
    I want to track, do I have to have a value for the Objective ID
    field?
    If not, is it good practice to have one anyway? Does it
    matter?
    When I put identifiers in these fields, and if I can re-use
    them in different SCOs, can I have spaces, special characters,
    etc., or would this cause problems when data is retrieved from
    the back-end database?
    Thank you,
    TPK

    TPK,
    Objective ID fields are supposed to be based on Objectives
    set in conjunction with an LMS. You can tie various questions to
    an "Objective" and then confirm the learner has met their
    objectives (or determine where they need help). Captivate doesn't
    support much beyond sending Objective IDs in interaction data
    (e.g. it doesn't send the objective's score, etc.).
    Interaction IDs are supposed to be unique across all content
    in an LMS. That typically isn't realistic, and I don't know any
    LMS that requires or expects them to be unique. For your reports,
    the Interaction IDs should be unique within a single Captivate
    file. If you want to track interactions but don't care about
    objectives, you can leave the Objective ID blank or put in a "0".
    It won't cause problems in your reporting with WBT Manager (a
    recommended LMS).
    Hope that helps
    Regards,
    Andrew
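A hedged sketch of a pre-publish sanity check following Andrew's advice: Interaction IDs unique within one SCO, and kept to conservative characters to avoid back-end surprises. The "safe characters" set here is my cautious assumption, not a SCORM requirement:

```javascript
// Flag interaction IDs that repeat within a SCO or use characters that a
// back-end report query might mishandle (assumed safe set: letters, digits,
// underscore). Returns a list of human-readable problems; empty means OK.
function validateInteractionIds(ids) {
  const safe = /^[A-Za-z0-9_]+$/;
  const seen = new Set();
  const problems = [];
  for (const id of ids) {
    if (!safe.test(id)) problems.push(`unsafe characters: "${id}"`);
    if (seen.has(id)) problems.push(`duplicate within SCO: "${id}"`);
    seen.add(id);
  }
  return problems;
}
```

Running this over the five click-box IDs before publishing would catch both of TPK's worries (spaces/special characters, and accidental reuse within the SCO) at once.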

  • FMS(Interactive) - Adobe Media Player Tracking Service.

    Hi Folks...
    I understand that Flash Media Interactive Server has a feature called 'Adobe Media Player Tracking Service' which the free version doesn't have. Any idea what the 'Adobe Media Player Tracking Service' is and what it does?
    regards

    Hi JavaIsLife,
    Flash Media Interactive Server and Adobe Media Player are two different products which can participate in a video streaming workflow.
    Please check out this article http://www.adobe.com/devnet/mediaplayer/articles/amp_cdk_pt06_print.html .
    Also curious to know where you read about the 'Adobe Media Player Tracking Service' feature in Flash Media Interactive Server.
    Regards
    Mamata

  • LMS not tracking quiz completion or interaction data

    I have created two identical assessments (apart from the questions). When users exit the POST assessment on the Quiz Reporting slide, the LMS tracks their completion and interaction data. However, when users exit the PRE assessment on the Quiz Reporting slide, the LMS does NOT track their completion or interaction data. I've configured all Quiz Preferences identically. Is there something else I'm missing?

    I'm not sure of this and defer to any contrary/additional replies that may follow... but...
    Past versions of CP have required some sort of scored interaction in order for the lesson to report properly via SCORM to the LMS.
    If you have a CP lesson without any scored interactions (no 'reporting'), that may be why completion is not being recorded to the LMS.
    SO try setting up one scored interaction as you suggest.
    For CP lessons that do have quizzes, you won't need that separate scored interaction.
    You'll also likely want to change your settings to just 'Slide Views Only' if you just want completion reported based on viewing the whole project, and I suspect Slide View Completion should be 100, not 10...
    Let us know how that goes...
    Erik

  • P-Track Application: Joint view of Interactive Reports

    Is there a way to create a customized report in the P-Track Application that joins the fields from the various individual interactive reports? For example, if I want certain columns from Projects tab and also other columns from the Status Updates tab, how can I achieve that?
    Thanks,
    Dorothy

    Dorothy, you'll need to create another report or modify an existing one with a query that does this type of join. It most certainly can be done -- but -- you'll have to unlock the application so that you can edit it. I augmented the interactive report on page 1 (the Projects interactive report) to give users the "latest status update" by project.
    This is my "extra column" in that report:
    ,(select status_update
        from eba_proj_status_updates
       where update_date = (select max(update_date)
                              from eba_proj_status_updates u
                             where u.project_id = s.id)) latest_status_update
