Pre-test, post-test in project

Hi,
I am trying to create CAI with Captivate 3. The project contains:
   1. a 10-item pre-test
   2. the lesson
   3. a 10-item post-test
   My problem is that I cannot separate the post-test from the pre-test. That is, when I insert the 10 multiple-choice post-test items after the lesson, the quiz result slide automatically follows the post-test slides and counts all 20 items together.
   I want to separate the pre-test's quiz result slide from the post-test's.
   How can I do this?
              Thanks a lot

Hi Lilybiri,
I'm Thai. I use Cp4. I'm not sure whether you understand my poor English or not.
Anyway, I followed your advice and made 3 files as follows:
      First --> pre-test + lesson
      Second --> post-test
      Third --> main menu
      For the main menu, I made a button to link to the pre-test and post-test by choosing "Open URL or file". I can preview this on my computer, but I can't open the links (pre-test and post-test) on another computer.
      Could you provide me with a software simulation, step by step?
                                 Thank you

Similar Messages

  • Pre and Post test Questions

    I have a project that requires a pre-test, a post-test and knowledge reviews throughout the course. I have created 10 question slides (I did not use question pools). I am having two issues within the course. First, I do not want to track or score the pre-test questions or the questions embedded within the course, which I will refer to as knowledge reviews.
    I do want the user to go through and receive feedback on each question answered incorrectly, but I do not want a score to be collected, tracked or tallied as part of the sum total for the course. The post-test questions need to be the only tracked/scored questions, with that value being submitted to the LMS.
    Currently all the questions are being tallied in the course (i.e. test results show 30 questions with a score when there should only be 10). What is the best way to include multiple questions within my course (as pre-test and knowledge reviews) but not have a score tracked until the user takes a post-test?
    To correct this, should I change the pre-test and knowledge reviews to survey-type questions? Or should I leave the questions as graded but set the point value to 0? I initially left the pre-test and knowledge reviews as graded questions, but I deselected Report Answers under the Reporting tab, thinking this would prevent the passing of a score, which is not the case.
    If anyone can offer some advice or point me to some documentation that discusses this, I would greatly appreciate the help.
    The second issue I have is with the progress indicator within the course. Is there a way to remove this from the question pages? My client does not want to see this indicator. The current workaround I am using is to reduce the font size of the indicator to 5pt, set the font color to match my background, and position the indicator in a set location on the page. While this workaround works, I would like to remove it entirely if possible.
    Thanks in advance,

    The first part sounds very weird. I just made a project which included knowledge checks in the middle of the course, and they are not tracked in the final test. I used graded questions in my course and initially set the score per question to 0 points. After I found out that they were still counted in the question total, I removed the reporting for each of the questions. Now the points are not being counted, and the questions are not counted in the final assessment either - I have just the 6 questions out of 6 in the post-test, even though I have 4 knowledge checks inside the course itself.
    Try pressing F9 to open the Advanced Interaction panel and make sure that the answers are not being reported. I know it's obvious, but sometimes Captivate does some weird things.
    As for your second question regarding the progress indicator: you can remove that by going to Quiz > Quiz Preferences > Settings and unchecking the box "Show Progress".
    Best regards,
    Michael

  • Can we send both the pre-test score (pre-test question slides) and the post-test score (question slides) to the LMS from Captivate?

    Can we send both the pre-test score (pre-test question slides) and the post-test score (question slides) to the LMS from Captivate?

    The pretest score is normally not sent to the LMS. However, it is stored in a system variable, so you could use JavaScript to send it.
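    A minimal sketch of that JavaScript route, run from an Execute JavaScript action on a slide right after the pretest. It assumes the Captivate system variable cpQuizInfoPretestScorePercentage, the cpAPIInterface JavaScript interface of newer Captivate versions, and a SCORM 1.2 LMS; adjust the names to your environment:

        // Read the pretest score from Captivate (assumed system variable).
        var pretestScore = cpAPIInterface.getVariableValue("cpQuizInfoPretestScorePercentage");

        // Walk up the frame hierarchy to find the SCORM 1.2 API the LMS exposes.
        function findAPI(win) {
            while (!win.API && win.parent && win.parent !== win) {
                win = win.parent;
            }
            return win.API || null;
        }

        var api = findAPI(window);
        if (api) {
            // Store the pretest score in a comment field so it does not
            // overwrite the real (post-test) score in cmi.core.score.raw.
            api.LMSSetValue("cmi.comments", "Pretest score: " + pretestScore);
            api.LMSCommit("");
        }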

  • How can I skip/pass a lesson when the pre-test is passed?

    I'm creating a course with an overview and 3 lessons. Each is its own CP file (Captivate 6).
    This course will be hosted in our LMS, where I will require each lesson to be taken/passed.
    Lesson 1 is introductory/background material about my topic, and my users may or may not have had exposure to this information already. If they know the info, there's no need for them to take this lesson.
    I have added a pre-test to the lesson that, if failed, forces the user to go through the lesson content and a graded quiz at the end. The score is reported to the LMS and all is well.
    If the user passes the pre-test, s/he is jumped to a congrats slide and instructed to exit the lesson. (No need to take the graded quiz - the questions are the same, and they've already "passed.")
    My question: How do I report a PASS for the lesson if the user doesn't take the graded quiz? I don't need to report any particular score.
    Thanks for any help you can offer!

    Lieve, I'd LOVE to see your blog post about this topic - especially since you always include examples. =)
    Request: I need help with reporting.
    Here's my project's Branch View:
    I've done as you suggested as far as I know. The navigation works perfectly, but when it comes to reporting, I'm not seeing what I expect/want.
    What I want:
    - The user is able to skip the pre-test (the branching image shows this).
    - If the user passes the pre-test, s/he can exit the course with a Complete/Pass.
    - The user has X number of chances at the test. After the user a) passes or b) uses up X chances and still fails, I want to report back to my LMS (I'm testing in SCORM Cloud right now, but will be using HealthStream) both completion AND pass or fail.
    What I've got:
    - The user CAN skip the pre-test (the branching works, anyway).
    - I can get Captivate to report Complete/Unknown or Complete/Pass (sometimes even when the user fails the graded quiz), or Incomplete/Unknown.

  • Way to Capture Pre-Test Score?

    Hi all -
    We're using OLM 11.5.10j and want to have a pre-test that, if passed, exempts you from needing to complete the course with which it's associated, and would also mark the overall course as complete. If the learner doesn't pass the pre-test, they would have to take the course + post-test. We also want to capture the pre-test and post-test score separately, so that we know what the score on the pre-test was and, if applicable, the score on the post-test.
    We thought about using two separate "courses" - pre-test, and course+post-test - as part of a Learning Path. That way, we could set the Learning Path to have "at least one component mandatory" and set the pre-test as a mandatory pre-req on the course+post-test. However, that's a little clunky and really isn't a Learning Path.
    We've looked into other AICC-spec fields that are available in OLM to which the learning object would post additional values to show pre-test/post-test score, then report on those spare fields to determine the respective scores. However, we ran into issues understanding what fields were available in OLM for the LO to post to.
    Any suggestions on how to implement this? Thanks!

    Hello Lee
    You can use DFF attributes in the OTA_DELEGATE_BOOKINGS table to store the pre-test score and the post-test score. Build a class just like any other class.
    The class has 3 main sections:
    1. Pre-Test Section
    2. Class
    3. Test
    If the employee passes the Pre-Test Section, the employee can either continue to take the class, or you give the employee credit for the entire class. (Storing the pre-test score in one of the attributes is similar to the way we change the date_status_changed field in the OTA_DELEGATE_BOOKINGS table.)
    If the employee doesn't clear the pre-test section, save the pre-test score using a package/PLSQL API in the DFF attributes of the OTA_DELEGATE_BOOKINGS table and continue; the rest is the normal process, as in any other class.
    Hope this helps.
    I am also expecting someone to offer a nicer solution.
    Thanks,
    Srujan

  • Can I have multiple pre tests and a final test?

    Hi,
    I want to create a unit whereby I have 5 small pre-tests to check knowledge; the learner must get the answer correct to continue. Then I want to create a final test at the end of the project.
    Is this possible, or do I need to have different projects?
    Many thanks
    FS

    I think trying to get feedback to work with Survey Questions using Advanced Actions would be a lot more trouble than it's worth. 
    Also, using Pretest questions is not the answer because they're very limiting in other ways. They MUST all be at the beginning of the project BEFORE the main quiz, and they disable the playbar for the entire module. (This is by design.)  The way Pretests currently work is not usually what authors are looking for.
    If setting ordinary quiz question slides to zero points and deselecting the option to Include in Quiz is NOT enough for you, then I suggest you resort to creating your own 'question' slides via other methods such as building them from default interactive objects available in Captivate.  You can use widgets or buttons or Smart Shapes or Text Entry boxes to replicate most of the other question types.

  • Setting a Conditional Action for Pre-Test questions: Skipping Content Slides

    OK boys and girls, let's see if this is even possible within one course SCO, or if this is a multi-SCO solution (which means I'm toast, because we don't have the multi-SCO aggregator).
    I want to set up a prescriptive pre-test. In my ideal world, answers to each individual question would either enable or disable the content slides connected to the question. If the user passes the pre-test, they could opt to skip the prescriptive stuff and complete the course. If the user fails the pretest, they would have to view the enabled content.
    Is this even possible in a single SCO world?
    Justenuf

    Jim belongs in that pantheon of good guys as well; so many here have pointed me in the right direction and helped me move forward with the capabilities of this software. And, in that spirit, here's my workaround for the pretest/post-test in a SCORM 1.2 environment:
    Created an entrance slide that allows opting to take the pretest or not - but only allows you to take it once.
    Created one test (take off the question x of x widget), split it, and slid the content in between. In this case, 10 questions with 5 in each set.
    Titled the first set of questions as a Pre Test and the second set as a Post Test.
    Set my passing score at 4 correct. (The user only needs to get 4 out of 5 on either set correct to pass - and I'm setting up an action that only allows the user to see the questions they missed if they fail the pretest.)
    Created variables for each "pretest" question and set them to 0. Created On Enter advanced actions on each slide following a question that toggle the variable to 1 if the question was answered correctly: if "cpQuizInfoLastSlidePointScored is equal to 1" then "Assign Question<x> with 1".
    Created an On Enter advanced action for each content-area slide, "Skip #", that allowed prescription:
    "If Question # is equal to 1 or cpQuizInfoPassFail is equal to 1, then Go to Next Slide"
    Only the content connected to the missed questions in the pretest is seen.
    Copied the same On Enter advanced action to the post-test questions and renamed it "SkipPost #", allowing the same prescription:
    "If Question # is equal to 1 or cpQuizInfoPassFail is equal to 1, then Go to Next Slide"
    Only the questions related to the reviewed content are seen.
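    For readers who think in code: here is the same "Skip #" logic restated as a small JavaScript sketch. The post implements it as a conditional Advanced Action; the variable names Question1 and cpQuizInfoPassFail follow the post, and cpAPIInterface (with getVariableValue and next) is the JavaScript interface of newer Captivate versions, so treat this as an illustration rather than the author's actual setup:

        // On Enter of a content slide: skip it when its pretest question
        // was answered correctly, or when the pretest as a whole was passed.
        var answeredRight = Number(cpAPIInterface.getVariableValue("Question1"));
        var pretestPassed = Number(cpAPIInterface.getVariableValue("cpQuizInfoPassFail"));
        if (answeredRight === 1 || pretestPassed === 1) {
            cpAPIInterface.next();   // mirrors "Go to Next Slide"
        }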
    The sum total of these activities is the following:
    The user can choose to take the pretest, complete the course by getting 4 or 5 questions correct, and jump to the exit page.
         If the user takes the pretest and scores lower than 80%, the content for the missed questions is viewed and the user must answer the questions regarding that content. The user passes the course when he/she answers the fourth question correctly.
    The user can choose to view the content, take the post-test and pass the course by answering 4 of 5 correctly.
    It's running right in preview, in web browser preview and on my LMS previewer, with one navigation error.
          The error: after failing the pretest, taking the prescribed content and answering the questions correctly, the course occasionally jumps to the results page and displays a failure message, then freezes in place. It does NOT happen each time, which is why I am currently scurrying back to my happy place to dwell on this mystery awhile.
    Justenuf

  • Pre Tests in Captivate

    In one of the Developer Forum articles it mentions the ability to use pre-tests in Captivate.
    There is no documentation in Help or in the Knowledgebase on how to assign questions to a pre-test as opposed to a post-test, or on how to give full credit for, and the ability to bypass, the content when all answers in the pre-test are correct.
    And if there are any incorrect pre-test answers, how do you send the learner through the content and the post-test, but in that case only score the user on answers in the post-test, not on all questions in the course? Any suggestions? Thanks.

    You may be thinking that pre-testing is an actual feature of the product, which is not the case.
    Captivate only knows tests... it's up to you to design them to be useful as a pre-test or a post-test. How you do this is determined by the delivery environment your content will be accessed from and an understanding of Captivate's reporting abilities.
    For example, you mentioned that you want to "give full credit", which implies that you're doing centralized reporting in some way. The type of reporting you're doing will determine how you create your course.
    If you're using an LMS via the SCORM standard, all navigation and pass/fail logic must reside in the LMS, so you just build your pre-test and make the LMS use the score from it to determine whether the user passes the overall course. If they don't pass, make them take everything; if they do pass, give them credit for the whole thing.
    If your particular LMS doesn't let you do that and you can't find a way to trick the LMS into doing what you want (say, by assigning much higher score values to the questions in the pre-test to artificially give the user a score high enough to pass the entire course), then you may be out of luck. That's the nature of SCORM, and has nothing to do with Captivate.
    Bottom line: there's no documentation because Captivate is not intended to fit into any particular system or scheme. It's merely a tool that lets you create building blocks that you can (hopefully) slot into a separate solution using a "standards-based" interface, such as AICC/SCORM, or one of the other options listed in the Quiz Manager.
    The reality is that each deployment environment, especially the "standards-based" ones, has its own quirks and deployment requirements, so how you make Captivate work in your particular environment is unfortunately left up to you. That's just the nature of the game.

  • Pre-Test in Captivate with rating scale questions

    What I want to do with Captivate:
    - First, a pre-test with 35 questions the learner has to answer. I have to use a scale from 1 to 6.
    - These questions belong to 13 groups of questions.
    - The results of the questions are to be summed up in grouping variables.
    - The scores in the variables determine which of the eLearning modules should be highlighted in the TOC.
    - At the end, the TOC should be displayed.
    - I want to use a separate project for each of the following modules (to reduce the size of every project).
    - I have to import all of this into the LMS ILIAS (SCORM 1.2 and SCORM 2004).
    - The learner should be able to interrupt the application at any time and resume again.
    I have seen the thread "how to make a Pre Test in Captivate 4", and I worked through the script example inside the Captivate Help and the ebook from Rick Stone.
    What is the recommendation
    1. for realizing the slides with the questions (they should be self-created, but how)?
    2. for realizing the code in the advanced actions?
    3. for making the project with the pre-test and the TOC work together with all the following modules?
    Perhaps someone has an example for me.
    Best regards
    Winfried

    Hello Winfried,
    Spent a couple of hours yesterday working out a proposition. I must confess that I realized it in CP5, mostly to save some time with the Advanced Actions, but it could be done in CP4 too. Beware: I did not bother much about the layout and graphics; I just tried to find a workflow that will achieve your goal using only Captivate (no widgets, no to-be-written Flash SWFs) within a reasonable amount of time. There is quite a lot of copy/paste possible - repetitive tasks that are a bit monotonous, sorry about that - and some discipline is necessary.
    To start with, I made some choices to simplify:
    You showed me a screenshot with buttons to click on (a little like the Likert question slide), but I have preferred to have the user enter a number from 1 to 6, which means you'll need a legend on each page to explain this. If you really want the six buttons later on, you should look for a widget or a Flash SWF that returns a value = clicked button (1 to 6). To achieve this in Captivate alone would be a tremendous amount of work: you would have to take into account that a user can change his mind and click on another button. With my system (typing a number) this is no issue.
    I chose to put 13 questions on each slide, one for each question group. This makes it possible to prepare one slide and copy it twice (to have 39 possible questions). Some minor changes will have to be made on the copies, but most of the work can be taken over. If you prefer to have fewer questions per slide, that would also be possible, but then you'll have different copies (e.g. a slide with 7 and a slide with 6 questions).
    I chose to have a Submit button for each question that will trigger an advanced action. I mostly need this to check, after completion of a slide, whether all questions have been answered. If you do not like those many Submit buttons, in CP5 this could easily be replaced; in CP4 it will take some more working time.
    First screenshot of a question slide:
    In this screenshot you'll see 13 questions with 13 Text Entry Boxes. For each TEB I provided the Submit button and only the Failure Caption (the red one; put all of them in the same spot). The value entered in the TEB is stored in a user variable, so I needed 13 user variables for this slide, labeled v_1, v_2, ..., v_13.
    The same user variables will be used on the next question slide (if you have fewer questions per slide you'll need fewer user variables of this kind).
    Each TEB is validated for the values 1 up to 6. If another number is entered, the Failure Caption shows up.
    The blue Text Caption ("Please, answer all...") is initially hidden when entering the slide (action for On Enter slide).
    Each Submit button triggers an advanced action on success (the first is labeled AddGroup1; the 13 of them are slightly different, but copying can limit the work). The variables involved:
    - one variable v_counter to store the number of answered questions on the slide; it is used to check whether all questions have been answered, and is set to 0 on entering the slide
    - 13 group variables v_G1, v_G2, ..., v_G13 to store the total score per group
    An action like AddGroup1:
    - increments the counter by 1
    - adds the value stored in v_1 to the variable v_G1
    - because the playhead shouldn't move on the timeline, sets the system variable rdcmndGotoFrame = rdInfoCurrentFrame
    In the 12 copies of this action, only the second statement has to be changed.
    The Next button allows the user to go to the next slide, but triggers another advanced action, which I labeled TestCompleteSlide1. You'll need a slightly different action for each question slide, but again copying does most of the work. This action:
    - checks whether the variable v_counter is different from 13
    - if it is different:
      - the blue Text Caption is shown to insist on answering all the questions (this is what had to be changed in the copies, because each Text Caption has its proper ID; the blue Text Caption on the second question slide has a different ID from that on the first)
      - rdcmndGotoFrame = rdInfoCurrentFrame - 2, to put the playhead back
    - if it is 13, jump to the next slide
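    To make that logic easier to follow, here is a compact JavaScript restatement of AddGroup1 and TestCompleteSlide1. This is only a sketch of the control flow: the post builds these as Captivate Advanced Actions, and the names (v_1, v_counter, v_G1, rdcmndGotoFrame) follow the description above:

        // State mirrored from the Captivate user variables described above.
        var v_counter = 0;                 // reset to 0 On Enter of each question slide
        var v_G1 = 0;                      // group totals v_G1 ... v_G13

        // Success action of the Submit button for question 1 (AddGroup1).
        // The 12 copies differ only in which v_x feeds which v_Gx.
        function addGroup1(v_1) {
            v_counter += 1;                // one more question answered on this slide
            v_G1 += Number(v_1);           // add the typed 1-6 value to group 1
            // rdcmndGotoFrame = rdInfoCurrentFrame  (keep the playhead in place)
        }

        // Action triggered by the Next button (TestCompleteSlide1).
        function testCompleteSlide1() {
            if (v_counter !== 13) {
                // show the blue "Please, answer all..." caption and rewind:
                // rdcmndGotoFrame = rdInfoCurrentFrame - 2
                return;
            }
            // all 13 answered: jump to the next slide
        }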
    The last slide is the TOC. The modules to be taken will be marked with a green circle in this screenshot and get a click box over the name and the green circle. Those not to be taken will be marked with a red triangle and have no click box. You'll have to configure the click boxes to open the proper module. Initially all click boxes are there but invisible; the same goes for all green circles and red triangles. They become visible based on checking the values of the grouping variables v_G1, v_G2, ..., v_G13. Just a screenshot here; if you confirm that you want to try my proposition, I will explain further. This was rather easy in CP5 and will not be in CP4, which is why I didn't work it out in CP4 yet:
    Lilybiri

  • AppContainers_test fails with error during pre-test configuration checks

    Hello,
    AppContainers_test fails with an error during the pre-test configuration checks:
    "[FAIL] The machine is not properly configured for testing: the test directory "C:\Windows\SYSTEM32\" is not ACL'd to allow execute permission for "APPLICATION PACKAGE AUTHORITY\ALL APPLICATION PACKAGES""
    But that kind of permission for APPLICATION PACKAGE AUTHORITY\ALL APPLICATION PACKAGES was already allowed; I also tried to allow full access, but the error was the same.
    In the test WindowsFilteringPlatform_Tests, the same requirement, "REQ - Wfp-based products must not block App Container apps operating within their declared network intentions by default", was checked successfully.
    How can I pass AppContainers_test successfully?
    I installed the latest versions of the QFE and filters on my server, but it didn't help.

    Hi M'Alerter,
    I have the same issue. Have you found a solution to your problem?
    Cheers,
    Ghalem

  • How do you make the FLA file the default document to test the project?

    My default document is the SWF file. Help me please.

    You create your Flash application in Flash Professional by creating a new FLA file and editing it. The output (the Flash application) of this process is a SWF file, which is then played by Adobe Flash Player.
    When you test your project, the SWF file is exported and then opened by Flash Player.
    The FLA is your source file, and you keep it on your computer for later editing.
    The SWF is your actual application, and you deploy (copy) it to the server or wherever you intend to deploy your application.
    If what you're trying to say is that you wish to debug your application, then test it not by pressing Ctrl+Enter, but Shift+Ctrl+Enter.

  • Test/Trial project on MDM and XI integration.

    Test/trial project on MDM and XI integration.
    I want to try out a test-case scenario wherein I would be importing and exporting data to XI (using XML files, I guess).
    I am a newbie in MDM integration.
    Please help me out with my trial project.
    What I already have:
    I have an MDM system and repository on which I have been learning basic MDM, designing a (test) repository of my own.
    What information I need:
    - What system requirements (XI system configuration and so on) need to be established? (What should I be asking the XI and system admin guys for, to get the platform ready for me?)
    - The further procedure for the actual integration, for importing/exporting data as in a real-time project scenario.

    Hi Gaurav,
    Since you want to try a test scenario including MDM and XI, I would suggest you try the
    ECC -> XI -> MDM scenario,
    wherein you can take the data from ECC in the form of IDocs and convert them into XML.
    You can then import the XML into an MDM repository. The MDM repository can be a Customer, Material, Vendor etc. repository; hence extract similar data from ECC.
    Kindly follow the steps below:
    1) First you will have to extract the look-up table data from R3 using the MDMGX transaction.
    Please follow this link to learn about the same:
    How to Configure Master Data Management Reference Data Extraction -
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f0740b31-a934-2b10-2e95-af2252991baa
    https://websmp203.sap-ag.de/~sapidb/012006153200000394732006E/MDM554_GenericExtactor.pdf
    2) For loading the main table data into MDM, kindly follow the links below, which will guide you through the configuration/process required for the data to be uploaded from R3 into MDM.
    2.1) R/3-XI-MDM (outbound scenario):
    You will have to extract the data from R3 in the form of IDocs and then, using XI, convert them into XML format, as MDM understands that.
    2.2) For the process of sending data from MDM back to R3, kindly refer to this link:
    MDM-XI-R/3 Integration
    2.3) For general info about this scenario, kindly refer to this link:
    MDM Mass data Integration with R/3 made easy
    Please follow these links for additional info:
    1) SAP Network Blog: XI Configuration for MDM Integration - Sample Scenario
    2) The blog below can be used to understand the XI configuration done when data is sent back from MDM via XI to R3:
    MDM->XI->R3
    MDM-XI-R/3 Integration
    3) MDM 5.5 data distribution using SAP XI
    Hope this information helps.
    Thanks and Regards
    Nitin jain

  • How to test an XI project?

    1. How do you test XI projects?
    2. And how do you test our scenario? Is there any testing tool available to test the scenario?
    3. What is CCMS? How is it useful in XI?
    Thanks in advance
    Rakesh

    Hi Rakesh,
    "1. How do you test XI projects? 2. How do you test our scenario? Is there any testing tool available to test the scenario?"
    XI is about messaging, so the way to test an XI implementation is to test the messaging. Two options for doing that:
    1. Test the implementation without the application systems. The plain J2SE adapter really comes in handy here. You can use it to send an XML file to the Integration Server, check whether it is getting through the pipeline as intended (which means the routing, the mappings, etc.), and let it pop out to a file somewhere. You can check all the steps the message has gone through in the XML monitor (sxmb_moni).
    2. Test the actual implementation with the application systems. For this you would use the testing tools of the applications themselves.
    I do not know of any automated tools for testing XI configuration, or any way of testing it without actually sending a message across.
    "What is CCMS? How is it useful in XI?"
    I can give you some info regarding CCMS; the rest I will try to find out and let you know soon. Meanwhile, please go through this info on CCMS.
    When CCMS is not configured, you can only do individual monitoring.
    To configure it, you must decide which system is to be the central system for the SCR.
    To learn what the SCR is and to know more about it, please go through this link:
    https://help.sap.com/saphelp_nw04s/helpdata/en/1e/ec9e38f0d92936e10000009b38f8cf/content.htm
    After doing that, please follow these steps:
    Log on to the system that is to contain the central repository.
    - Choose CCMS -> Configuration -> Alert Monitor, or call transaction RZ21.
    - Choose Technical Infrastructure -> System Repository -> Set Repository Role.
    - The system displays the Set Repository Role in Central System Administration screen. Choose Central Repository.
    Also go through the following links. I guess this will help you solve your problem and answer your question:
    https://help.sap.com/saphelp_nw04s/helpdata/en/e5/15a0380d134f6fe10000009b38f8cf/content.htm
    /people/sap.india5/blog/2005/12/06/xi-ccms-alert-monitoring-overview-and-features
    /people/sap.user72/blog/2005/11/24/xi-configuring-ccms-monitoring-for-xi-part-i
    Regards,
    Abhy

  • I designed a project in Adobe Director that works with high quality on Windows, but a number of problems appeared when I tested it on Apple Macintosh

    Please, I am in a very big problem.
    I designed the project in Adobe Director and tested the software on Windows computers, where it works with high quality.
    When I tested the project on the Apple Macintosh operating system, a number of problems appeared, and I resolved a large part of them.
    But I still have one problem that is too big: the project works on some devices, while others show me error messages, apparently because of a different operating system version on the Apple Macintosh. These are the messages:
    ""biosegnatures" can't be opened because it is from an unidentified developer"
    and also:
    Shockwave Player error
    Please, what can I do?

    Welcome to the Support Communities.
    m.raslan wrote:
    I tested the project on Apple Macintosh operating system
    Click on the Apple menu at the top left of your screen, choose About This Mac and, in the panel that appears, note the Version number in the form 10.n.n. Then update your product list so we can see what version of OS X you're running on the Mac - see this tutorial:
    Update your product list
    m.raslan wrote:
    "biosegnatures" can't be opened because it is from an unidentified developer
    This Apple doc may help:
    OS X: About Gatekeeper
    m.raslan wrote:
    Shockwave Player error
    Is Shockwave Player installed?
    http://www.adobe.com/shockwave/welcome/
    If not, get it here:
    http://get.adobe.com/shockwave/

  • Running a pre-test script to set up my test data

    I am trying to run a script from e-Manager Enterprise to set up or stage my data prior to the playback of my test branch. I think I could do it with VBA in a dummy e-Tester script or via a third-party test. Has anyone ever done this, and what is the best method for this type of activity?

    The 3rd Party test case type in e-ME is well suited for this purpose. You can specify any .bat file or executable to run with your 3rd Party test case. Although one usage for this is to launch automated scripts in external or homegrown test tools, another usage is to launch a pre-test or other executable to set up your test environment. You can then either launch this manually in the e-ME test tree or schedule it to run ahead of your other automated scripts in the e-ME Web scheduler.
