LMS Reporting & AICC

We have created a few modules on some new automated HR forms. Each form has a demo module and a scenario-based quiz module. I do not score either one. That said, I want to be able to award credit to those who have completed either module. Can I do this without scoring them? Ideally, as I said, they finish and they get credit. I have no experience with the Quiz Manager function of Captivate. Any and all help is appreciated.
Does anybody know how Larry (CatBandit) is doing? I had heard through some posts that he was ill.
Bill

You can score interactions without making them actual quiz questions - just track them and assign a weight. You do, however, have to supply some sort of scored interaction to pass a completion value to the LMS (at least from the SCORM perspective; I'm not sure about AICC).
A common approach is to put a tracked interaction at the beginning of the piece ('click here to start') or at the end ('click here to see your score').
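For what it's worth, here is a minimal sketch of what that single tracked, scored interaction boils down to in SCORM 1.2 terms. This is not Captivate's actual runtime code; the reportCompletion name and the 1-point weighting are illustrative assumptions only.

// Minimal sketch (assumption, not Captivate's runtime): the SCORM 1.2 calls
// implied by one tracked, scored interaction. "API" is the object the LMS
// exposes to the content frame.
declare const API: {
  LMSInitialize(arg: string): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(arg: string): string;
  LMSFinish(arg: string): string;
};

function reportCompletion(): void {
  API.LMSInitialize("");
  // One interaction worth 1 point is enough to give the LMS a score...
  API.LMSSetValue("cmi.core.score.raw", "1");
  API.LMSSetValue("cmi.core.score.max", "1");
  // ...and a completion status it can award credit against.
  API.LMSSetValue("cmi.core.lesson_status", "completed");
  API.LMSCommit("");
  API.LMSFinish("");
}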
Erik
Erik Lord
http://www.capemedia.net
Adobe Community Expert - Authorware
http://www.adobe.com/communities/experts/
http://www.awaretips.net - samples, tips, products, faqs, and links!
*Search the A'ware newsgroup archives*
http://groups.google.com/groups?q=macromedia.authorware
*The Blankenship Caveat: Note that direct linking to http content through any Authorware icon will likely fail if a proxy server is present!*

Similar Messages

  • Problems reporting AICC to LMS

    I made a project and published it to AICC. When putting it into our LMS (we made it ourselves), it is not reporting. With other projects (other software) we didn't have any problems.
    What am I doing wrong?

    Hot Fix #1 here:
    http://blogs.adobe.com/captivate/2011/03/captivate-5-0-hotfix-for-the-issue-with-launching-aicc-enabled-projects-from-lms.html
    Hot Fix #2 here:
    This is with reference to the issue you are facing with AICC courses on your LMS.
    Please follow the steps below and let me know if this solves the issue for you.
    1. Close Captivate if it is already launched.
    2. Navigate to the Captivate installation folder (e.g., C:\Program Files\Adobe\Adobe Captivate 5).
    3. Navigate to the folder "Templates".
    4. Open the folder "Publish".
    5. Open the folder "AICC".
    6. Rename the HTM file "default.htm" to "default_old.htm".
    7. Download the zip file from https://acrobat.com/#d=rZv8-z9C0z4PPMpqjbkZIQ, unzip the contents, and copy the HTM file "default.htm".
    8. Paste it inside the "AICC" folder.
    9. Launch Captivate and open the project.
    10. Re-publish the project and upload the content to the LMS.
    Let me know if this solves the issue.
    Thanks,
    Ashwin Bharghav B
    Adobe Captivate Team
    Hope it works!

  • What could be causing my multiple response survey question to show a non-response in LMS report?

    I have a 10-question survey that has been running fine and providing accurate responses in reports generated from my LMS. I made a property change to a multiple-response type question that is now causing the reports to show a date/time (2003-01-01 18:00:00) response instead of the actual multiple responses the user made. The change I made to the question was to change the "Numbers" property from [ a), b), c) ] to [ 1), 2), 3) ]. This property change did not affect the reported responses for the single-response questions, only the multi-response. This problem occurred using Captivate 4. I then tried using the same properties in Captivate 5.5, thinking it was a bug in CP4, but got the same results. Has anyone else run into this issue? I'm going to bring this to the attention of my LMS provider, but want to make sure CP is clean first.
    Thanks,
    Dave

    First things first. I am not an LMS expert, so this is not about troubleshooting.
    However, to ensure that your course works fine on a SCORM-compliant LMS, you can upload it to cloud.scorm.com and test it out. If the course works fine there, rest assured that it will work fine on any SCORM-compliant LMS. If your LMS team comes back with an issue after this, you can counter them with this evidence that the problem is with the LMS and not your course.
    Sreekanth

  • Help needed with LMS Reporting Settings

    I am new to Captivate and am working with 5.5 to create a number of large courses.
    The first course I have split over 5 Captivate files, as there are a large number of slides in each file. I have reviewed the Help PDF on Learning Management Systems, but am struggling to find the correct settings in the Reporting Preferences to suit my needs.
    Course structure:
    File 1     -     less than 100 slides, includes question slides not set as reporting, non score-reporting objects
    File 2     -     less than 100 slides, includes question slides not set as reporting, non score-reporting objects
    File 3     -     more than 100 slides, includes question slides not set as reporting, non score-reporting objects
    File 4     -     more than 100 slides, includes question slides not set as reporting, non score-reporting objects
    File 5     -     less than 100 slides, includes question slides as score-reporting objects and other non score-reporting objects (final quiz to be reported and scored in LMS)
    I envisaged using Slide Views Only in the first 4 files, and then Quiz Results and Slide Views in the last file.
    However, these two choices only seem to allow a maximum of 100 slides - is this correct? Is there no way around this?
    There is another course in our series with many more slides than this first one, so having to split into files of 100 could mean up to 12 Captivate files in one course.
    I don't know if I now need to re-think the structure of the courses. Maybe I have completely misunderstood Quiz Reporting and Reporting Scores.
    I need to be able to have the Learner click their way through the course, making it highly interactive for them but at the same time not scoring everything they click, until they reach the final assessment, which is 10 questions and scored in %.
    Another thing: after numerous tests, it seems that if I untick 'Never Send Resume Data' then the 2nd file does not open. Leaving it ticked will open the 2nd file - but then the progress is not saved if the Learner exits the course before finishing. Has anyone experienced this before?
    I also would like clarification on the Manifest Settings, specifically the 'Identifier' - should I be giving each file a unique number, or does it not really matter? When we purchased 5.5, we did not know that we would need the Multi-SCO Packager, so we have tried an alternative called Reload Editor for the time being, which we have been using for our testing with collating the files - has anyone used this before? I have tried to download the free trial of eLearning Suite 2.5 but I am having problems getting it done.
    Any help would be very much appreciated.
    Thanks

    The number 100 shown in Slide View Completion is the % of the total number of slides, not the number of slides from 1-100.
    Your strategy of breaking the course up into smaller modules like this is a good one.
    But if you have only one of the 5 modules that has scoring, you might be wise to add at least one scored object to each of the other modules (perhaps a button on the first or second slide with a nominal score of 1 point) to give the project at least some kind of a score. I find that in some LMSs, if one module in a course is scored, they expect ALL the others to have a score as well. So providing some surreptitious method of giving the user a guaranteed score keeps everything working on an even keel.
    Set Report Status as Pass/Fail and Report to LMS as Percent.  These are the settings I find most reliable.
    When using multi-SCO SCORM packages created with Captivate, whether or not the second file in the sequence of modules opens automatically is something you are supposed to set up in the LMS, not in Captivate.  Look for an option called Auto Continue or similar.  This is supposed to mean that when the user finishes playing each module, the next one will open automatically.
    There are two Identifiers you need to specify in each module.  One is the SCORM course identifier and all modules that belong to the same course should share this same ID.  The other is the SCO ID for each module, and these should all be unique.

  • How do I identify Presenter Questions in an LMS report?

    Hi,
    We've put together 25 questions in Presenter and set it up to report from our LMS (e2train).
    We need to know what users answer for each of the questions and can pull off a report that shows the results ok.
    The problem is that we have no identifier for the 25 questions.
    For each user the report shows the following columns:
    Description (blank for all 25 questions);
    Type (question type);
    Learner Response (what they selected);
    Result (correct or incorrect).
    How do I get Presenter to identify each question?

    Hi,
    Can you display what code you used for this? I followed note 172045.1 and everything compiles properly but when I run the report, I just get a box with a red X in it, like it is not finding the picture. I ran the procedure that downloads the image and it works fine from there. Thanks.

  • LMS report building with graph?

    Dear All
    We installed LMS 3.2 for our customer with the HUM module.
    When the customer used HUM to generate a report, he found that the pre-defined report, which shows the last 24 hours of data, has a graph with it.
    But when he tried to generate another report for a shorter time frame, e.g. for the last 8 hours, the report didn't have a graph with it.
    http://www.cisco.com/en/US/docs/net_mgmt/ciscoworks_health_and_utilization_monitor/1.2/user/guide_12/HUMrprts.html#wp1134146
    Our question is: how do we generate a custom report with a graph? Is it possible to do so with LMS 3.2?
    thanks
    Yuke

    Hi,
    Yes, you can create custom reports with a graph; they are actually created with a graph by default. The only thing you need to do after launching the custom report is click on the device, which is a hyperlink to the graph for the OID for which the custom report was created.
    Hope this is what you were looking for.
    Thanks,
    Gaganjeet

  • A report about reachable devices in LMS report tools

    Hi all,
    Is it possible to display a report about reachable devices in the DCR database? Only a report about unreachable devices is available in the reports menu in LMS 4.2.4 (Reports > Inventory > Management Status > Unreachable Devices).
    I tried to use the Report Designer, but there is no way to create a rule for this report. :-( Similarly, I tried to create a device group, but I couldn't find any criteria for this type of group. Thanks.
    Roman

    Hi Roman,
    As far as I know, there is no such report available in LMS; however, I would suggest you create a POLLER for DEVICE AVAILABILITY to achieve your goal.
    Go to Monitor > Performance Settings > Setup > Pollers >> click Create Poller.
    Check the link below for more information:
    http://www.cisco.com/en/US/docs/net_mgmt/ciscoworks_lan_management_solution/4.2/user/guide/lms_monitor/mnt-perf.html#wp1542387
    Thanks-
    Afroz
    [Do rate the useful post]

  • LMS report problem

    Dears,
    When I run a report on a device (for example, Device Availability) for more than 3 days, it says that the data is not available for this period. When I run the same job for 1 day, it gives the result for 1 day. I can't retrieve results for more than 1 day. Any ideas?
    Regards,

    Hi Tolib,
    Did you try Firefox? I had a problem that looked like this a long time back with LMS, and then I switched over to Firefox and it worked.
    But it is most probably your installed Java version. Try installing a different version of Java (the latest one or an older one - not sure; sometimes it works fine with an older one).
    Furthermore, are you running a 32-bit or 64-bit OS? Make sure about these requirements too:
    Browser: Internet Explorer 8.0 or Firefox 3.6 for Windows (Note: support for 32-bit browsers only)
    Java Plug-in: Java Plug-in version 1.6.0_19 or later (1.6 updates only)
    Here are more details about it:
    http://www.cisco.com/en/US/prod/collateral/netmgtsw/ps6504/ps6528/ps11200/ps11202/data_sheet_c78-610760.html

  • LMS Reporting Complete when communication is lost

    The LMS we're using, eLogic, reports a SCORM course as Complete if the communication is lost. This was happening on a regular basis as associates began to use online training for the first time. So we requested that it leave the course as Incomplete so the user could re-open more easily. We've been told that this does not conform to SCORM standards that say if communication is lost that the course be marked as Complete. That makes no logical sense to me. Can anyone explain? Or help me to understand?
    Thanks!

    I think this really has more to do with how your lesson is sending the status. It's the lesson that, ultimately, should decide if the lesson is complete or not. However, that status is not 'definite' until the LMS.commit function is called. Once that's called, then any data that's been sent back to the LMS is official.
    So ideally, your CP lesson should not send a Completed status until the user has finished whatever tasks are required to gain that status (all pages viewed, certain interactions finished, etc)...then when the user exits the lesson, CP sends the Commit and Finish calls and the LMS finalizes that lesson.
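    For reference, here is a rough sketch of that exit sequence in SCORM 1.2 terms. The onLessonExit name and the lessonIsComplete flag are placeholders for illustration, not anything Captivate or eLogic actually exposes.

    // Hypothetical sketch of the exit sequence described above (SCORM 1.2).
    declare const API: {
      LMSSetValue(element: string, value: string): string;
      LMSCommit(arg: string): string;
      LMSFinish(arg: string): string;
    };

    function onLessonExit(lessonIsComplete: boolean): void {
      // Only report Completed once the required viewing/interactions are done;
      // reporting it as soon as the lesson opens is the common mistake.
      API.LMSSetValue(
        "cmi.core.lesson_status",
        lessonIsComplete ? "completed" : "incomplete"
      );
      // Nothing is final until the commit; the finish call closes the session.
      API.LMSCommit("");
      API.LMSFinish("");
    }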
    It sounds like this LMS is doing some sort of default action where, perhaps, if communication is lost with the lesson or a timeout is reached, it automatically commits the status (which CP should have sent). So first:
    1. Is your CP lesson sending the status properly? Is it sending Completed as soon as it opens? That's a common issue... If so, the LMS could be receiving the status correctly, but acting on it incorrectly by committing that status when it should not be.
    2. Is the LMS taking a default action, like setting a status to Completed, when the API communication is lost? Which, I agree, sounds like a bad assumption to make and is doubtfully a SCORM requirement.
    HTH
    Erik

  • LMS reporting errors with Captivate

    I have been trying to publish my Captivate simulations to the LMS, but no matter what settings I use, the end user can still close the project down anywhere and it will report as 100%... any ideas?

    You'll need to provide more information than this.
    What LMS are you using?
    Show screenshots of your settings?
    Tell us more about your course.

  • LMS reporting using SendTrackingDataAtEnd

    We are using Captivate 7 to report percentage scores to our LMS (CertPoint/inFor), with intermittent results. Sometimes the scores will pass, other times they will not, which is upsetting a large client base. There are 14 courses in the series.
    Originally the courses were published using the Default template. I've republished them using SendTrackingDataAtEnd. However, we are experiencing the same results. The questions are pulling from a question pool of 10 questions. We are showing the results slide at the end.
    I've also tested to make sure the question slides are all 3 seconds and are paused at around 1 second. I hate to keep republishing these courses since many are long and the user can't move forward until the end of each recorded slide (more like a slide by slide presentation) and when I republish it, of course that breaks the bookmarking. Any poor user without their score passing to the LMS now has to go through the entire thing to try and resubmit.
    It would be great if we could find a solution to this issue as this is our first time attaching Captivate courses using SCORM and I'd like to be able to do all our courses in Captivate if we can get it working reliably.
    I'm attaching images of the Quiz Properties. Is there something I am missing/doing wrong? I'm wondering if it could have something to do with the Identifiers. I didn't update those at all.
    Thanks!

    Make sure you have at least one slide BEYOND the Quiz Results slide, and make sure the learners ALL reach that final slide before closing their browser.  If your learners are closing their window BEFORE allowing the course to correctly report their score then that would explain your issues.
    I usually rename the Continue button on the Quiz Results slide something like Register Your Score! and make it big and bold so that they don't miss the inference that UNLESS they take this action to register their score then they won't benefit from all their hard work.
    Then on the final slide I place a 10 second dummy animation that makes it LOOK as though something is being communicated to the LMS with an outro audio music file playing as well.  Allowing this extra time on the last slide also helps ensure the data is transmitted to the LMS just in case the LMS latency happens to be high at that moment.
    Bottom line is that most of the time this issue can be greatly alleviated by how you design your course and how you educate (manipulate) your learner behaviours.

  • LMS reporting

    Hi folks - we are just setting up a new LMS (bespoke) and I am trying to identify exactly what output Captivate 3 can be asked for....
    In Choose Report Data, if we select Quiz Results + Slide Views, I get a percentage figure for slide completion.
    If we use branching, what exactly does this record? It would be perfect if it could identify the branch logic, so that if the user follows a single branch of 4 slides, it is recorded as 100%, even though there are slides in other branches that have not been viewed. Is this the case?
    Hope you can help!
    Ying

    Never mind. The OP corrected the forum issue. Please disregard my mindless chatter.

  • LMS Report Question for RME.

    Can someone tell me when the patch for the EoS/EoL report is going to be released?
    Thanks
    Chris

    Reportedly "today": https://supportforums.cisco.com/message/3029081#3029081

  • LMS report for down ports

    I am using LMS 3.2 and I do not know how to customize a report or syslog to be sent to me as a notification or e-mail when a port or the switch is down, or how to search for a specified host using the IP or the MAC address.

    You can configure DFM's Notification Services to send an email or syslog message when an OperationallyDown or Unresponsive event occurs.  This is done under DFM > Notification Services.  Essentially, you will first define a Notification Group which matches the devices and events of which you are interested, then you will define a syslog or email notification subscription to send the actual notification.
    To search for end hosts by MAC or IP, go to Campus Manager > User Tracking > Reports.

  • LMS reporting success status = Pass/Fail, where it should give Unknown, while running on SABA 5.3

    I am trying to run my content on SABA 5.3. After exiting the content, the LMS displays the success status as Pass or Fail, where it should show Unknown. My content does not contain any quiz, so on what basis is the LMS calculating the success status?

    Yes I know you said your content had no quiz, but please try to remember that we answer hundreds of these enquiries and it's not uncommon for people to tell us one thing when their settings say another.  So I have the habit of asking to see the actual screenshot.
    Each LMS interprets the SCORM standard a little differently.  In SCORM 2004 a course module can have both a status for Complete or Incomplete, as well as a separate status for Pass or Fail. (Under SCORM 1.2 the overall status was tracked by a single element - cmi.core.lesson_status - allowing for only a single status value that would boil down to Complete vs Incomplete or Pass vs Fail values.)
    Under SCORM 2004 there are two separate elements tracking completion and success.
    The SCORM 2004 element cmi.completion_status is designed to indicate whether the learner has completed the SCO and can take the following status values:
    completed,
    incomplete,
    not attempted,
    unknown
    Your Pass/Fail status in SCORM 2004 falls under the element called cmi.success_status. This one indicates whether the learner has mastered the objective and can take the following values:
    passed,
    failed,
    unknown
    So while it is possible for a value of "unknown" to be set for either cmi.completion_status or cmi.success_status, I don't think this would be very commonly done for a Captivate module that actually has interaction data being sent to the LMS.  If the user is interacting with the content at all it would seem more likely that you'd be getting a 'completed' or 'incomplete' status value in the completion element and a 'pass' or 'fail' value in the success_status element.
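    To make the two elements concrete, here is a minimal sketch of how a SCORM 2004 module sets them independently. This is illustrative only - the values shown are assumptions for a module with no quiz, not necessarily what Captivate sends to SABA.

    // Hypothetical SCORM 2004 sketch: completion and success are separate elements.
    declare const API_1484_11: {
      Initialize(arg: string): string;
      SetValue(element: string, value: string): string;
      Commit(arg: string): string;
      Terminate(arg: string): string;
    };

    API_1484_11.Initialize("");
    // Completion: has the learner finished the SCO?
    API_1484_11.SetValue("cmi.completion_status", "completed");
    // Success: has the learner mastered the objective? With no quiz, a module
    // could reasonably leave this as "unknown" rather than passed/failed.
    API_1484_11.SetValue("cmi.success_status", "unknown");
    API_1484_11.Commit("");
    API_1484_11.Terminate("");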
    The bit I don't understand is why are you particularly keen to see a value of 'unknown' showing up here?
    Check this page for information about getting SABA to work with Captivate: http://www.infosemantics.com.au/adobe-captivate-learning-managment-system-lms-guide/saba
    Perhaps one of the threads it links to will provide more help for you.
