Pre-test not branching

I followed the Adobe eLearning channel video and set the pre-test action to ......PointsScored is greater than 40 (there are only 4 questions worth 10 points each). The action is to jump to the advanced section, or else jump to the beginner section. Yet on taking the pre-test, pass or fail, I arrive at the beginner section. What am I missing?
Thanks Jo

Hi Joanne,
Thanks for reaching out to the Adobe Community. Please accept my apologies for the inconvenience.
I tested the same setup at my end and it works. The reason it is not working at your end is that the Else branch of your Advanced Action is executing: with only 4 questions worth 10 points each, the maximum score is 40, so your If condition "is greater than 40" can never be true. Either add a fifth question or change the condition to "is equal to 40" (or "is greater than or equal to 40"). In addition, please make sure you are using the variable cpQuizInfoPretestPointsScored, since you have a Pretest quiz.
Hope this helps!
Thanks!
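For readers who want the logic spelled out: the decision the Advanced Action makes can be sketched as a small JavaScript helper. This is a sketch only; in Captivate the check lives in the conditional action itself, and reading the system variable through window.cpAPIInterface (shown in the comment) assumes an HTML5 output.

```javascript
// Decision helper mirroring the pre-test Advanced Action.
// With 4 questions worth 10 points each, the maximum score is 40,
// so the condition must be ">= 40" (or "== 40"); "> 40" can never be true.
function chooseBranch(pretestPointsScored) {
  return pretestPointsScored >= 40 ? "advanced" : "beginner";
}

// Hypothetical usage inside a Captivate HTML5 output:
// var pts = window.cpAPIInterface.getVariableValue("cpQuizInfoPretestPointsScored");
// chooseBranch(pts);  // then jump to the matching section
```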

Similar Messages

  • How can I skip/pass a lesson when the pre-test is passed?

    I'm creating a course with an overview and 3 lessons. Each is its own CP file (Captivate 6).
    This course will be hosted in our LMS, where I will require each lesson to be taken/passed.
    Lesson 1 is introductory/background material about my topic, and my users may or may not have had exposure to this information already. If they know the info, there's no need for them to take this lesson.
    I have added a pre-test to the lesson, that, if failed, forces the user to go through the lesson content and a graded quiz at the end. The score is reported to the LMS and all is well.
    If the user passes the pre-test, s/he is jumped to a congrats slide and instructed to exit the lesson. (No need to take the graded quiz - the questions are the same, and they've already "passed.")
    My question: How do I report a PASS for the lesson if the user doesn't take the graded quiz? I don't need to report any particular score.
    Thanks for any help you can offer!

    Lieve, I'd LOVE to see your blog post about this topic - especially since you always include examples. =)
    Request: I need help with reporting.
    Here's my project's Branch View:
    I've done as you suggested as far as I know. The navigation works perfectly, but when it comes to reporting, I'm not seeing what I expect/want.
    What I want:
    For the user to be able to skip the pre-test (the branching image shows this)
    If the user passes the pre-test, he or she can exit the course with a Complete/Pass
    For the user to have X number of chances at the test. After the user a) passes or b) uses up X chances and still fails, I want to report back to my LMS (I'm testing in SCORM Cloud right now, but will be using HealthStream) both completion AND pass or fail.
    What I've got:
    The user CAN skip the pre-test (the branching works, anyway)
    I can get Captivate to report Complete/Unknown or Complete/Pass (sometimes, even when the user fails the graded quiz), or Incomplete/Unknown.
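One way to reason about the reporting question above: in SCORM 1.2 the status the LMS records comes from cmi.core.lesson_status, so the pass-without-quiz case is just a second path that writes "passed". A minimal sketch of that decision follows; the API object and whether your LMS accepts a script-set status are assumptions, and SCORM Cloud and HealthStream may each behave differently.

```javascript
// Decide what to report for cmi.core.lesson_status (SCORM 1.2).
// Passing the pre-test counts as a pass even though the graded quiz
// was never taken ("testing out").
function lessonStatus(pretestPassed, gradedQuizPassed) {
  if (pretestPassed) return "passed";
  return gradedQuizPassed ? "passed" : "failed";
}

// Hypothetical usage against a discovered SCORM 1.2 API object:
// API.LMSSetValue("cmi.core.lesson_status", lessonStatus(true, false));
// API.LMSCommit("");
```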

  • AppContainers_test fails with error during pre-test configuration checks

    Hello,
    AppContainers_test fails with error during pre-test configuration checks:
    "[FAIL] The machine is not properly configured for testing: the test directory "C:\Windows\SYSTEM32\" is not ACL'd to allow execute permission for "APPLICATION PACKAGE AUTHORITY\ALL APPLICATION PACKAGES"
    But that kind of permission for APPLICATION PACKAGE AUTHORITY\ALL APPLICATION PACKAGES was already allowed; I also tried granting full access, but the error was the same.
    In the test WindowsFilteringPlatform_Tests, the same requirement "REQ - Wfp-based products must not block App Container apps operating within their declared network intentions by default" was checked successfully.
    How can I get AppContainers_test to pass successfully?
    I installed the latest versions of the QFE and filters on my server, but it didn't help.

    Hi M'Alerter,
    I have the same issue, have you found a solution to your problem ? 
    Cheers,
    Ghalem

  • Can we send pre-test score (Pre-test question slide) and post test score (Question slide) both in lms from captivate?


    The pretest score is normally not sent to the LMS. However, it is stored in a system variable, so you could use JavaScript to send it.
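A rough sketch of what "use JS to send it" could look like under SCORM 1.2. The API-finding loop follows the usual SCORM 1.2 discovery pattern; writing the score into cmi.comments is only one pragmatic option, since SCORM 1.2 has no dedicated pre-test field, and your LMS may offer a better target.

```javascript
// Walk up the frame hierarchy to locate the SCORM 1.2 API object.
function findAPI(win) {
  var tries = 0;
  while (win && !win.API && tries < 10) {
    if (win.parent && win.parent !== win) { win = win.parent; tries++; }
    else break;
  }
  return (win && win.API) || null;
}

// Stash the pre-test score in cmi.comments (a writable SCORM 1.2
// free-text element) so it at least reaches the LMS.
function sendPretestScore(api, points) {
  if (!api) return false;
  api.LMSSetValue("cmi.comments", "pretest_score=" + points);
  return api.LMSCommit("") === "true";
}

// Hypothetical usage in the published output:
// sendPretestScore(findAPI(window),
//   window.cpAPIInterface.getVariableValue("cpQuizInfoPretestPointsScored"));
```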

  • Can I have multiple pre tests and a final test?

    Hi,
    I want to create a unit with five small pre-tests to check knowledge; learners must answer correctly to continue. Then I want to create a final test at the end of the project.
    Is this possible, or do I need to have different projects?
    Many thanks
    FS

    I think trying to get feedback to work with Survey Questions using Advanced Actions would be a lot more trouble than it's worth. 
    Also, using Pretest questions is not the answer because they're very limiting in other ways. They MUST all be at the beginning of the project BEFORE the main quiz, and they disable the playbar for the entire module. (This is by design.)  The way Pretests currently work is not usually what authors are looking for.
    If setting ordinary quiz question slides to zero points and deselecting the option to Include in Quiz is NOT enough for you, then I suggest you resort to creating your own 'question' slides via other methods such as building them from default interactive objects available in Captivate.  You can use widgets or buttons or Smart Shapes or Text Entry boxes to replicate most of the other question types.

  • Setting a Conditional Action for Pre-Test questions: Skipping Content Slides

    OK boys and girls, let's see if this is even possible within one course SCO, or if this a multi-SCO solution (which means I'm toast, because we don't have the multi-SCO aggregator)?
    I want to set up a prescriptive pre-test. In my ideal world, answers to each individual question would either enable or disable the content slides connected to the question. If the user passes the pre-test, they could opt to skip the prescriptive stuff and complete the course. If the user fails the pretest, they would have to view the enabled content.
    Is this even possible in a single SCO world?
    Justenuf

    Jim belongs in that pantheon of good guys as well, so many here have pointed me in the right direction and helped me move forward with the capabilities of this software. And, in that spirit - here's my work around for the pretest/post test in a SCORM 1.2 environment:
    Created an entrance slide that allows opting to take the pretest or not, but only allows you to take it once.
    Created one test (removed the Question x of x widget), split it, and slid the content in between. In this case, 10 questions with 5 in each set.
    Titled the first set of questions as a Pretest and the second set as a Post Test.
    Set my passing score at 4 correct. (The user only needs to get 4 out of 5 on either set correct to pass, and I'm setting up an action that only lets the user see the questions they missed if they fail the pretest.)
    Created variables for each "pretest" question and set them to 0. Created On Enter advanced actions on each slide following a question that toggled the variable to 1 if the question was answered correctly: if "cpQuizInfoLastSlidePointScored is equal to 1" then "Assign Question<x> with 1"
    Created an On Enter advanced action for each content area slide "Skip #" that allowed prescription:
    "If Question # is equal to 1 or cpQuizInfoPassFail is equal to 1, then Go to Next Slide"
    Only the content connected to the missed questions in the pretest is seen.
    Copied the same On Enter advanced action to the Post test questions and renamed it "SkipPost #", which allowed prescription:
    "If Question # is equal to 1 or cpQuizInfoPassFail is equal to 1, then Go to Next Slide"
    Only the questions related to the reviewed content are seen
    The sum total of these activities is the following -
    The user can choose to take the pretest and complete the course by getting 4 or 5 questions correct and jumping to the exit page.
         If the user takes the pretest and scores lower than 80%, the content for the missed questions is viewed and the user must answer the questions regarding that content. The user passes the course when he/she answers the fourth question correctly.
    The user can choose to view the content, take the post test and pass the course by answering 4 of 5 correctly.
    It's running right in preview, in web browser preview, and in my LMS previewer, with one navigation error.
          The error: after failing the pretest, taking the prescribed content, and answering the questions correctly, the course occasionally jumps to the results page and displays a failure message, then freezes in place. It does NOT happen each time, which is why I am currently scurrying back to my happy place to dwell on this mystery awhile.
    Justenuf
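The two per-slide actions in the workaround above reduce to very small pieces of logic. Expressed as plain JavaScript for clarity (a sketch of the same conditions, not Captivate's own action syntax; the flags array stands in for the Question<x> user variables):

```javascript
// On Enter of the slide after each question: flag the question as
// correct when cpQuizInfoLastSlidePointScored came back as 1.
function markQuestion(flags, questionIndex, lastSlidePointScored) {
  if (lastSlidePointScored === 1) flags[questionIndex] = 1;
  return flags;
}

// The "Skip #" / "SkipPost #" On Enter actions: skip this content or
// post-test slide if its question was answered correctly, or if the
// pre-test as a whole was passed (cpQuizInfoPassFail equal to 1).
function shouldSkipSlide(questionFlag, quizPassFail) {
  return questionFlag === 1 || quizPassFail === 1;
}
```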

  • Running a pre-test script to setup my test data

    I am trying to run a script from e-Manager Enterprise to set up or stage my data prior to the playback of my test branch. I think I could do it with VBA in a dummy e-Tester script or via a third-party test. Has anyone done this, and what is the best method for this type of activity?

    The 3rd Party test case type in e-ME is well suited for this purpose. You can specify any .bat file executable to run with your 3rd Party test case. Although one usage for this is to launch automated scripts in external or homegrown test tools - another usage is to launch a pre-test or other executable to setup your test environment. You can then either launch this manually in the e-ME test tree or schedule it to run ahead of your other automated scripts in the e-ME Web scheduler.

  • How to Add Likert Scale Pre-Test w/Customized Feedback?

    I need to have two quizzes in my course:
    One is a Self-Assessment with a 5-point Likert scale. There are 6 questions in this set. I need three possible responses depending on how they score (0-4 is one response; 5-8 is another; 9-12 is another).
    A final quiz at the end with only 3 questions, again giving feedback based on how they respond, this time percentage-wise (if they score less than 80% they get one feedback message; over 80%, another).
    I am a very basic Captivate 6 user and have no clue how to go about doing this. I've read many forums about pre-tests (which don't seem to include Likert Scales), variables, advanced actions. There HAS to be a simpler way...
    Thanks,
    Christina

    Hi Lilybiri,
    Thanks for the quick response. To answer your question...
    I'm using Captivate 6. I saw the pretest option, but Likert Scale is not an option there.
    I can use a different type of question, but wasn't able to figure out how to have different responses score differently. When I look at the Quiz Properties for MCQ, I only see one overall score value. Here is an example of the type of question I'm using: I proactively develop relationships with those I work with. Options are Never = 1 pt; Sometimes = 2 pt; Often = 3 pt; Always = 4 pt; I Don't Know = 0 pt
    If they receive a total of 0-4 pts, their feedback should be: "You can considerably improve your influencing skills by applying the principles in this course"; If they receive a total of 5-8 pts, their feedback should be: "You are already applying some of the principles. Plan to enhance your usage after completing this course." If they receive 9+ pts, the feedback should be: "You are using your influencing skills regularly. Focus on the principles you tend to use less often and coach others on how to become an effective influencer."
    Yes, all I need for the final quiz is the Pass/Fail. I already have that set up and it's working fine (and currently not including the Likert Scale questions - I figured out how to do that based on this link: http://goo.gl/hhjJe)
    Thanks,
    Christina
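The three feedback bands Christina describes are easy to express as one small lookup. A sketch in plain JavaScript (in Captivate this would live in a conditional Advanced Action or an Execute JavaScript action; the function and object names here are made up for illustration):

```javascript
// Feedback messages per score band, as described above.
var FEEDBACK = {
  low:  "You can considerably improve your influencing skills by applying the principles in this course.",
  mid:  "You are already applying some of the principles. Plan to enhance your usage after completing this course.",
  high: "You are using your influencing skills regularly. Focus on the principles you tend to use less often and coach others on how to become an effective influencer."
};

// Map the summed points onto the bands 0-4, 5-8, and 9 and up.
function likertBand(totalPoints) {
  if (totalPoints <= 4) return "low";   // 0-4 points
  if (totalPoints <= 8) return "mid";   // 5-8 points
  return "high";                        // 9 points and up
}

// Usage: FEEDBACK[likertBand(total)] gives the message to display.
```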

  • Pre Tests in Captivate

    In one of the Developer Forum articles it mentions the
    ability to use pre-tests in Captivate.
    There is no documentation in HELP or in Knowledgebase on how
    to assign questions to a pre-test as opposed to a post-test, to
    give full credit for & ability to by-pass content w/ all
    correct answers in a pre-test.
    And... if any incorrect pre-test answers, to send learner
    through content & post-test, but in this case, only score user
    on answers in post-test, not all questions in course... Any
    suggestions? Thanks....

    You may be thinking that pre-testing is an actual feature of
    the product, which is not the case.
    Captivate only knows tests... its up to you to design them to
    be useful as a pre-test or a post-test. How you do this is
    determined by the delivery environment your content will be
    accessed from and an understanding of Captivate's reporting
    abilities.
    For example, you mentioned that you want to "give full
    credit", which implies that you're doing centralized reporting in
    some way. The type of reporting you're doing will determine how you
    create your course.
    If you're using an LMS via the SCORM standard, all navigation
    and pass/fail logic must reside in the LMS, so you just build your
    pre-test and make the LMS use the score from that determine whether
    the user passes the overall course. If they don't pass, make them
    take everything, if they do pass, give them credit for the whole
    thing.
    If your particular LMS doesn't let you do that and you can't
    find a way to trick the LMS into doing what you want (say, by
    assigning much higher score values to the questions in the pre-test
    to artificially give the user a score high enough to pass the
    entire course), then you may be out of luck. That's the nature of
    SCORM, and has nothing to do with Captivate.
    Bottom line, there's no documentation because Captivate is
    not intended to fit into any particular system or scheme. It's
    merely a tool that lets you create building blocks that you can
    (hopefully) slot into a separate solution using a "standards-based"
    interface, such as AICC/SCORM, or one of the other options listed
    in the Quiz Manager.
    The reality is that each deployment environment, especially
    the "standards-based" ones, have their own quirks and deployment
    requirements, so how you make Captivate work in your particular
    environment is unfortunately left up to you. That's just the nature
    of the game.

  • Pre-Test in Captivate with rating scale questions

    What I want to do with Captivate:
    - First, a pre-test with 35 questions the learner has to answer, using a scale from 1 to 6.
    - These questions belong to 13 groups of questions.
    - The results of the questions are to be summed up in grouping variables.
    - The scores in the variables determine which of the eLearning modules should be highlighted in the TOC.
    - Finally, the TOC should be displayed.
    - I want to use a separate project for each of the following modules (to reduce the size of every project).
    - I have to import all of this into the LMS ILIAS (SCORM 1.2 and SCORM 2004).
    - The learner should be able to interrupt the application at any time and resume again.
    I have seen the thread "how to make a Pre Test in Captivate 4", and I worked through the script example in the Captivate Help and the ebook from Rick Stone.
    What is the recommendation
    1. to realize the slides with the questions (they should be self-created, but how)?
    2. to realize the code in the advanced actions?
    3. to make the project with the pre-test and TOC work together with all the following modules?
    Perhaps, someone has an example for me.
    Best regards
    Winfried

    Hello Winfried,
    I spent a couple of hours yesterday working out a proposition. I must confess that I realized it in CP5, mostly to save some time with the Advanced Actions, but it could be done in CP4 too. Beware: I did not bother much about layout or graphics; I just tried to find a workflow that achieves your goal using only Captivate (no widgets, no Flash SWFs to be written) within a reasonable amount of time. Quite a lot of copy/paste is possible, with repetitive tasks that are a bit monotonous, sorry about that, and some discipline is necessary.
    To start with: I made some choices to simplify.
    You showed me a screenshot with buttons to click (a little like the Likert question slide), but I have preferred to have the user enter a number from 1 to 6, which means you'll need a legend as explanation on each page. If you really want the six buttons later on, you should look for a widget or a Flash SWF that returns the value of the clicked button (1 to 6). To achieve this in Cp alone would be a tremendous amount of work: you would have to take into account that a user can change his mind and click on another button. With my system (typing a number) this is no issue.
    I chose to put 13 questions on each slide, one for each question group. This makes it possible to prepare one slide and copy it twice (to have 39 possible questions). Some minor changes will have to be made on the copies, but most of the work can be taken over. If you prefer fewer questions per slide, that would also be possible, but then you'll have different copies (e.g. a slide with 7 and a slide with 6 questions).
    I chose to have a Submit button for each question that triggers an advanced action. I mostly need this to check, after completion of a slide, whether all questions have been answered. If you do not like so many Submit buttons, in CP5 this could be easily replaced; in CP4 it will take some more working time.
    First screenshot of a question slide:
    In this screenshot you'll see 13 questions with 13 Text Entry Boxes. For the TEB I provided the Submit button and only the Failure Caption (red one: put all of them in the same spot). The value entered in the TEB is stored in a user variable, so I needed 13 user variables for this slide, labeled them:
    v_1  v_2   .... v_13.
    Same user variables will be used on the next question slide (if you have less questions/slide you'll need less user vars of this kind).
    TEB is validated for the values 1 up till 6. If another number is entered the Failure Caption shows up.
    The blue Text Caption (Please, answer all...) is initially hidden when entering the slide (action for On Enter slide).
    Each Submit button triggers an Advanced Action (the first is labeled AddGroup1) on success (slightly different for each; you'll need 13 of them, but copying can limit the work), with variables:
    created one variable 'v_counter' to store the number of correct answers on the slide, will be used to check if all questions have been answered; this counter is set to the value 0 on entering the slide
    created 13 group variables  v_G1  v_G2.... v_G13 to store the total score per group
    labeled actions like AddGroup1:
    increments counter with 1
    adds value stored in v_1 to the variable v_G1
    because the playhead shouldn't move on the Timeline, set the system var rdcmndGotoFrame = rdInfoCurrentFrame
    in the 12 copies of this action, only second action has to be changed
    Next button allows the user to go to the next slide, but triggers another advanced action, that I labeled TestCompleteSlide1. You'll need a slightly different action for each Question slide, but again copy does most of the work. This action:
    checks if the variable counter is different from 13
    if it is different:
    the blue Text Caption is showed to insist on answering all the questions (this is what had to be changed on copy, because each text caption has its proper ID, blue Text Caption on second question slide has a different ID from that on first)
    rdcmndGotoFrame = rdInfoCurrentFrame - 2 to put back the playhead
    if it is 13, jump to next slide
    Last slide is TOC: The modules to be taken will be marked with a green circle in this screenshot and get a click box over the name and the green circle. Those not to be taken will be marked with a red triangle and not have a click box. You'll have to configure the click boxes to open the proper module. Initially all click boxes are there but invisible. Same for all green circles and red triangles. They will get visible based on checking the value of the grouping variables v_G1  v_G2.... v_G13. Just a screenshot here, if you confirm that you want to try my proposition, will explain further. This was rather easy in CP5, will not be in CP4, that is why I didn't yet work it out in CP4:
    Lilybiri
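To make Lilybiri's bookkeeping concrete, here is the state the AddGroupN and TestCompleteSlideN actions maintain, sketched as plain JavaScript. This is a model of the logic only; in Captivate these are user variables (v_counter and v_G1..v_G13) manipulated by Advanced Actions, not script.

```javascript
// State mirroring the user variables: v_counter plus v_G1..v_G13.
function makePretestState(groupCount) {
  return { counter: 0, groups: new Array(groupCount).fill(0) };
}

// What each AddGroupN action does on Submit: validate the entry
// (1 to 6, as the Text Entry Box does), bump the per-slide counter,
// and add the value to that question's group total.
function addGroup(state, groupIndex, value) {
  if (value < 1 || value > 6) return false; // Failure Caption case
  state.counter += 1;
  state.groups[groupIndex] += value;
  return true;
}

// What TestCompleteSlideN checks on Next: were all questions on the
// slide answered? (If not, the blue caption is shown instead.)
function slideComplete(state, questionsPerSlide) {
  return state.counter === questionsPerSlide;
}
```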

  • Pre-test Questions used as Knowledge Builders

    I need questions at the beginning of a course that will not be included when the learner clicks on the Retake and Review Quiz buttons. Pre-test questions work but they don't allow the use of text captions for incorrect and correct feedback. They use smart shapes. Are there any other limitations of Pre-test Questions?

    There are a lot of limitations. You are expected to have the pretest questions in sequence at the start of the course. All navigation will be cancelled: by playbar and by TOC.
    But what you say about the feedback being only smart shapes as text containers is not true at all. They follow the same styling as normal question slides. If you turned on Use Shapes for SFH (Success/Failure/Hint) in Preferences > Defaults, they will be shapes. You can always change this even after having inserted the Quiz/Pretest slides by going to the corresponding master slide, right-clicking the shape, and choosing 'Revert to caption'. Although to me it is a mystery why you would prefer captions; just my personal opinion, because shapes are so much easier to format.
    If you use normal question slides, and do not add its score to the Total score, most system quizzing variables will not include those 'Knowledge' slides. But indeed, as you point out, Review and Retake will go to the first Question slide, which may be a knowledge slide. With a shared/advanced action it is pretty easy to circumvent this. I almost never use Pretest slides for its limitations. Whenever possible I use custom question slides instead of the default ones for Knowledge slides.

  • Pre-test,post-test in project

    Hi,
    I am trying to create CAI with Captivate 3. The project contains
       1. a 10-item pre-test
       2. the lesson
       3. a 10-item post-test
       My problem is that I cannot separate the post-test from the pre-test. That is, when I insert 10 multiple-choice post-test items after the lesson, the Quiz Results slide automatically appears after the post-test slide and counts all 20 items.
       I want to separate the pre-test's quiz results slide from the post-test's.
       How can I do that?
                  Thanks a lot

    Hi Lilybiri,
    I'm Thai. I use Cp4. I'm not sure whether you understand my poor English or not.
    Anyway, I followed your advice to make 3 files as follows:
          First --> pre-test + Lesson
         Second --> post-test
         Third --> Main menu
         For the main menu, I made a button to link to the pre-test and post-test by choosing "Open URL or file". I can preview this on my computer, but I can't preview the links (pre-test and post-test) on another computer.
         Could you provide me with a software simulation, step by step?
                                     Thank you

  • Calendar on the Pre cannot import calendar events from OL or iCal

    You know what would be really cool? If I could forward calendar events from OL to iCal and vice versa. The calendar on either machine will recognize the event and import it. I send it through my Gmail account. The calendar on the Pre does not have the ability to open it and place it in the calendar. It does not recognize it. I wish it did. That would be cool.
    Post relates to: Pre p100eww (Sprint)

    Well, I found the answer to my problem. I can add events to my OL. OL will sync with Google Calendar. Google Calendar will sync with my phone automatically. This works great!

  • Test Note (Documentation Type) when creating Test Plan

    Hi Gurus.
    After we upgraded our SolMan 7.0 system from Support Pack (ST component) level 19 to level 25, I noticed this new field "Test Note" (Documentation Type) when creating a Test Plan in transaction STWB_2.
    Initially, when I haven't chosen any project yet, this field has a list of values (i.e. Test Case Note, iTutor Document), but when I choose a project, the list of values only shows "ST". I also tried choosing another project and there is no value in the list.
    Does anyone know where this list is coming from or how I can configure this?
    It looks project-related, but I couldn't find any settings for this.
    Can this field be turned off so it will not show in the screen?
    Thanks,
    Tess

    Hi Tess,
    The documentation type ST is the standard documentation type for test notes.
    Test notes are needed by testers to document results in detail during/after testing a test case.
    In SOLAR_PROJECT_ADMIN there is the menu path 'Goto - Project Template - Implementation Projects',
    and then the tab 'Documentation Types', where you can make the default settings for 'ST' for all implementation projects.
    Otherwise you'll have to change settings in the administration for each project!
    There you can assign your own documentation template for ST in the central settings.
    If you are using other project types for testing you'll have to do the same for each project type.
    Regards,
    Adelbert

  • Way to Capture Pre-Test Score?

    Hi all -
    We're using OLM 11.5.10j and want to have a pre-test that, if passed, exempts you from needing to complete the course with which it's associated, and would also mark the overall course as complete. If the learner doesn't pass the pre-test, they would have to take the course + post-test. We also want to capture the pre-test and post-test score separately, so that we know what the score on the pre-test was and, if applicable, the score on the post-test.
    We thought about using two separate "courses" - pre-test, and course+post-test - as part of a Learning Path. That way, we could set the Learning Path to have "at least one component mandatory" and set the pre-test as a mandatory pre-req on the course+post-test. However, that's a little clunky and really isn't a Learning Path.
    We've looked into other AICC-spec fields that are available in OLM to which the learning object would post additional values to show pre-test/post-test score, then report on those spare fields to determine the respective scores. However, we ran into issues understanding what fields were available in OLM for the LO to post to.
    Any suggestions on how to implement this? Thanks!

    Hello Lee
    You can use DFF attributes in the OTA_DELEGATE_BOOKINGS table to store the Pre-Test Score and Post-Test Score. Build a class just like any other class.
    Class has 3 Main sections
    1. Pre-Test Section
    2. Class
    3. Test
    If the employee passes the Pre-Test section, the employee can either continue on to take the class, or else be given credit for the entire class. (Storing the pre-test score in one of the attributes is similar to the way we change the date_status_changed field in the OTA_DELEGATE_BOOKINGS table.)
    If the employee doesn't clear the pre-test section, then save the pre-test score using a package/PLSQL API to store it in the DFF attributes of the OTA_DELEGATE_BOOKINGS table and continue; the rest is the normal process, as in any other class.
    Hope this helps.
    I am also expecting someone to give a nice solution.
    Thanks,
    Srujan
