SCORM - Simple Quiz - Pass/Fail Only

Flash CS4
AS2
Good Afternoon,
I've created a simple multiple choice quiz in Flash.
I was wondering if anyone could point me to some resources for writing a simple SCORM wrapper, and the accompanying AS2 code.  This is only going to report a pass/fail to the LMS - that's all - nothing more at this point.  I'm hoping there's a simple tutorial on this; as I said, it's pretty straightforward.
Thank you!
Actus

I have a tutorial here:
http://pipwerks.com/2008/04/27/how-to-add-basic-scorm-code-to-a-flash-movie/
It uses a custom (and free) SCORM ActionScript class and JavaScript wrapper so you don't need to write your own.
The tutorial uses SCORM 2004, but the JavaScript wrapper and ActionScript classes are version-neutral (works for SCORM 2004 and 1.2). There are two versions of the ActionScript class available: one for AS2 and one for AS3.
You can download the wrapper files here:
http://pipwerks.com/downloads/
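For context, here is a minimal sketch of what such a JavaScript wrapper does under the hood for a SCORM 1.2 pass/fail-only course: locate the LMS-provided API object in the frame hierarchy, then set and commit `cmi.core.lesson_status`. This is not the pipwerks code, just an illustration of the standard SCORM 1.2 calls (`LMSSetValue`, `LMSCommit`):

```javascript
// Minimal SCORM 1.2 pass/fail sketch (not the pipwerks wrapper).
// Find the LMS-provided API object by walking up the frame hierarchy;
// SCORM 1.2 exposes it as a property named "API" on some ancestor window.
function findAPI(win) {
  var tries = 0;
  while (win && !win.API && win.parent && win.parent !== win && tries < 7) {
    win = win.parent;
    tries++;
  }
  return win ? win.API || null : null;
}

// Report a simple pass/fail result and persist it to the LMS.
function reportPassFail(api, passed) {
  if (!api) { return false; }
  api.LMSSetValue("cmi.core.lesson_status", passed ? "passed" : "failed");
  return api.LMSCommit("") === "true";
}
```

In a real course you would call `findAPI(window)` once at startup (after `LMSInitialize`), and `reportPassFail` when the quiz is scored; the Flash movie reaches these functions via `ExternalInterface` (or `getURL` in older AS2 setups).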
- philip

Similar Messages

  • Why does captivate 5 scorm 2004 change the pass/fail status when using complete/incomplete

SUMMARY: I am trying to use SCORM 2004 reporting, but I am finding that Adobe does not differentiate between the completed/incomplete and passed/failed variables in SCORM 2004.
RESULT: I cannot accurately get the passing/failing status from the course.
QUESTION: Is there anything I can do, or has Adobe released a patch that will accurately report the passed/failed status as well as the completed/incomplete status?
OTHER RESEARCH: I even attempted to post these "proofs" on scorm.com, and they have something called "satisfaction tracked" to accommodate content creation systems (Captivate 5) that do not follow the SCORM 2004 standard.
BACKGROUND: The report status options "Incomplete/Complete" and "Pass/Fail" under the publishing options for SCORM 2004 essentially mean the same thing when the Captivate course sends setvar commands to the SCORM API. In other words, the SCORM API receives exactly the same commands when publishing in both modes. This is wrong.
Why is this wrong? If reporting incomplete/complete, the Captivate 5 course should not change the pass/fail status when attempting to report complete/incomplete with 100% slide views.
DEMO #3 shows the problem.
About DEMO #1 and DEMO #2: these are other reporting options I attempted, with other problems arising, so these images just support my original problem, which is shown in DEMO #3.
DEMO 1 (report status has no effect on reporting): SETTINGS:
    PASS: (PROBLEM)
    Anything above Passing Score is not recorded to LMS (once passing is reached...no more bookmarks are recorded)
    FAIL: (CORRECT)
    Failure gets sent to the database on the last screen with "completed"
    DEMO 2 (Report status  [Complete/Incomplete  Pass/Fail]   has no impact on commands sent to scorm api):
    SETTINGS:
    PASS: (CORRECT)
    Very Last page gets sent to the Database
    FAIL: (PROBLEM)
    NOTHING GETS SENT TO DATABASE (stuck in final score screen forever)...because it says "INCOMPLETE"
    DEMO 3:
    Report status [Complete/Incomplete Pass/Fail] has no impact on commands sent to scorm api
    SETTINGS:
    PASS: (CORRECT)
    Very Last page gets sent to the Database with a 90% passing with "completed"
    FAIL: (PROBLEM)
The last slide sends "completed", but it gets sent as a passing score even when it is a failure.
(In other words, Adobe changes the pass/fail status based on slide views.
This is wrong, really wrong, because I am trying to report
"COMPLETE/INCOMPLETE" for 100% slide views.)
    PASS/FAIL OPTIONS:
    OPTION I CAN PURSUE?
So what can I do? Yes, I can tell the LMS to compare the score received during a "complete" status and then change the grade in the LMS based on that score, but that would mean the LMS could report one grade while the course reports a different grade, and that would not be good.
    ALL OF OUR COURSES SHOW THE STUDENT THE FINAL SCORE WITHIN EACH COURSE..
    QUESTION TO ADOBE?
So, Adobe, when are you going to differentiate these two reporting options so that the pass/fail status is accurately reported when using complete/incomplete?
    MORE INFORMATION
Here is the scorm.com "satisfaction" adjustment (very last line) to accommodate inconsistent reporting in Captivate 5.
    Thanks,
    DOUG LUBEY OF LOUISIANA
    www.douglubey.com
    OTHER WEBSITES:
    http://forums.adobe.com/message/3332035
    http://forums.adobe.com/message/3122315
    http://forums.adobe.com/thread/712502?start=50&tstart=0
    http://forums.adobe.com/thread/800570
    http://forums.adobe.com/thread/720121
    SEARCH ENGINE REFERENCE:
    CAPTIVATE SCORM SUCCESS_STATUS
    captivate 5 scorm 2004 pass/fail is dependent on complete/incomplete
    scorm 2004 captivate cs5 success_status incorrect
    scorm 2004 captivate cs5 reports success_status incorrectly
    scorm 2004 captivate incorrectly reports pass/fail
    scorm 2004 captivate 5 incorrectly reports pass/fail
    scorm 2004 "captivate 5" incorrectly reports pass/fail
    captivate 5 sends a pass even when a failure is issued by the course using scorm 2004.
    captivate does not work properly when working with completed status.
    CAPTIVATE SCORM SUCCESS_STATUS COMPLETE STATUS SHOULD BE INDEPENDENT
CAPTIVATE 5 cmi.completion_status
and cmi.success_status do not work independently with SCORM 2004; they both share the same definition. These two fields were designed to differentiate between a passing or failing score and whether the course has actually been completed. With Captivate 5 they do not differentiate the two: COMPLETE means PASS and INCOMPLETE means FAIL.
I was trying to use "completion_status" as a way to determine when all slides were shown, so that I could send the final score (PASS/FAIL) to the LMS after all slides were viewed and this was the absolute score the student received.
    2011 CAPTIVATE CHANGES cmi.success_status TO PASSING WHEN COURSE REPORTS FAILURE
    2010 CAPTIVATE CHANGES SUCCESS TO PASSED WHEN COURSE REPORTS FAILURE
    CAPTIVATE 5 CHANGES cmi.success_status TO PASSING
    CAPTIVATE 5 CHANGES cmi.success_status TO PASSING WHEN COURSE REPORTS FAILURE
    CPTX CAPTIVATE CHANGES cmi.success_status TO PASSING
    CP5 WHY DOES CAPTIVATE SCORM SUCCESS_STATUS
    CPTX WHY DOES CAPTIVATE CHANGE PASSING STATUS
    CPTX CAPTIVATE SCORM CHANGES FAILING SCORE TO PASSING SCORE
    CAPTIVATE SCORM SUCCESS_STATUS COMPLETE STATUS MEANS THE SAME THING

    Hi,
    Here's something I found in a similar situation... hope this helps!
After publishing your CP5 project as a SCORM 2004 zip file, unzip the file so you can get into the separate files.  Open the imsmanifest.xml file with a text editor; I like using Notepad++.  Search for the line starting with <imsss:minNormalizedMeasure>.  The complete line will probably look like <imsss:minNormalizedMeasure></imsss:minNormalizedMeasure>; this is the line that is supposed to contain your passing score (at least for our LMS).  So, if your passing score is supposed to be 80%, edit this line to be:  <imsss:minNormalizedMeasure>0.80</imsss:minNormalizedMeasure> and save the file.  Then select all the files and re-zip them; I use 7-Zip for that.
    I found one other weird thing that doesn't happen 100% of the time, so test it.  In the example above, try to get a score of exactly 80% (or whatever passing grade you are using) and see if you get the correct status reported.  I found I sometimes have to re-edit that line in the manifest file to be 0.79 so it correctly includes an exact 80% score.  No idea why and you have to watch the number of questions in your quiz to make sure that works for you mathematically.
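The manual edit above can also be scripted. A minimal sketch (assuming the manifest contains an empty <imsss:minNormalizedMeasure> pair exactly as described; reading and re-zipping the files is left out):

```javascript
// Insert a passing score into an empty <imsss:minNormalizedMeasure>
// element in an imsmanifest.xml string, as described above.
// SCORM 2004 expects a normalized value between 0 and 1 (0.80 = 80%).
function setMinNormalizedMeasure(manifestXml, score) {
  var value = score.toFixed(2);
  return manifestXml.replace(
    /<imsss:minNormalizedMeasure>\s*<\/imsss:minNormalizedMeasure>/,
    "<imsss:minNormalizedMeasure>" + value + "</imsss:minNormalizedMeasure>"
  );
}
```

You would load the file with `fs.readFileSync`, run it through this function, write it back, and re-zip, the same workflow as the Notepad++ route.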
    If this works or not, I'd recommend posting your results so others can get some ideas.  Good Luck!!

  • Captivate 5 Quiz pass and fail issues

    Hi all,
    Still new and working with Captivate 5 (Win7 OS). I have a project with five quiz questions at the end. The goal is for the learner to have to pass the quiz to continue and the passing score required is 80%. I am allowing three tries at the quiz before failure. The problem I am running into is when the learner has failed on the final attempt, the quiz results screen is shown and will not advance to the next slide. Here are the settings I have so far:
    Preferences>Project>Start and End>Project End options-Action-Close project
    Preferences>Quiz >Reporting>Quiz: Enable reporting for this project-checked
    Preferences>Quiz >Settings>Quiz> Required-Pass required
    Allow backward movement-unchecked
    Show score at the end of the quiz-checked
    Preferences>Quiz >Pass or Fail> Pass/Fail options-80%
    If passing grade-Go to the next slide
    If failing grade- Allow 3 attempts. Show Retake button. Action-Go to the next slide
All quiz questions are set to one attempt.
I have a “Retake” button directly on top of the “Continue” button on the quiz results page. The Retake button appears correctly on each of the attempts, and the “Continue” button appears if the learner gets a passing score. If the learner fails at the end of the third attempt, the Continue button appears at this point but is not active (due to the failing score). I can only get the project to advance to the final slide if the learner passes the quiz.
    The learner needs to be able to resume the project if they stop and exit. I also want the project to close the browser window after the last slide has played. I do have a playback controls (clients requirement).
    Any help is greatly appreciated!!

    Change your Quiz > Required > Pass Required to Optional instead.
    You've told the quiz that it should force the user to pass before allowing them past the Quiz Results slide.  That's probably why the Continue button or playbar won't allow them to move further.  By making pass optional it should work.
    Getting the course window to close automatically at the end is by no means guaranteed to happen.  That depends on many factors that may be outside your control (e.g. browser settings, LMS player, security, etc)
    For 300 more troubleshooting tips like this, read the book!
    http://www.infosemantics.com.au/troubleshoot-adobe-captivate

  • Pass/Fail issue with Quiz and Click Buttons

    Got a complicated one for ya!
I have a project that has about 100 quiz questions.  I want to add slides with an image for the background, a voice-over, and a "Click Button" to be used like a simulation.  If the "Click Button" is not clicked, I want the slide to act like a quiz question and be marked as wrong.
At the end of the quiz I want the Pass/Fail function to be: if Pass, "Continue"; if Fail, "Jump to Slide". The failed path would jump to the last slide, the end of the project, with a "Thank you".  If passed, the project continues to display more information on other slides.
In the tests I have done, the Pass/Fail action does not seem to be working.  If failed, the project continues just as if passed, even though the "Pass/Fail" options are set correctly.
    What do you think?

    Also check the options under Quiz > Settings > Required.  It could be that you have it set to something other than Optional which might be restricting things.

  • I added Edit Preferences Quiz Pass or Fail If Passing Grade Go to the next slide

    Unfortunately, it doesn't work. The playback frame stays on the quiz reporting page. Please, can anybody tell me what I could be doing wrong?

    Rod -
    At the end of the quiz I have my "PASS" and "FAIL" screens set up, one after another. In the Captivate Preferences, under "Quiz Pass and Fail" - I have that if they Pass, they jump to the "PASS" slide, and if they Fail they of course go to the "FAIL" slide.
    When I play the course and pass, the pass screen that I built shows up after it shows my results, but then immediately goes to the fail screen that I built. How can I stop that from happening? Do I need to hide the playbar when the user gets to that page? I'd rather not as I want the user to be able to restart the course if they want to without having to close then reopen.
    Many thanks -
    Kevin

  • Captivate 5.5 Quiz Preferences Pass/Fail Actions

    Is there a good explanation anywhere of the Pass/Fail actions in Captivate, when one would want to use an action and which combinations of actions can or should be used?
No matter what I choose for actions, for both passing and failing grade, Disable, Enable, Increment, and Decrement are grayed out. If they are always grayed out, what are they used for?
    I looked this up in the Adobe Help menu, but as often happens, the information is too vague and high level in the Help menu for any real understanding.

    Hi Jay,
I always execute an advanced action for the Pass/Fail actions; that will give you all possible statements (except Increment/Decrement, because they are never available in advanced actions, one of the feature requests I have already entered several times). In this blog post I tried to explain the functionality of question slides and the score slide, and those Pass/Fail actions are included:
    Question Question Slides in Captivate
    Lilybiri

  • Lms reporting success status=pass/fail, where it should give unknown while running on saba 5.3

I am trying to run my content on SABA 5.3. After exiting the content, the LMS displays the success status as Pass or Fail where it should show Unknown. My content does not contain any quiz, so on what basis is the LMS calculating the success status?

    Yes I know you said your content had no quiz, but please try to remember that we answer hundreds of these enquiries and it's not uncommon for people to tell us one thing when their settings say another.  So I have the habit of asking to see the actual screenshot.
Each LMS interprets the SCORM standard a little differently.  In SCORM 2004 a course module can have both a status for Complete or Incomplete, as well as a separate status for Pass or Fail. (Under SCORM 1.2 the overall status was tracked by a single element, cmi.core.lesson_status, allowing for only a single status value that would boil down to Complete vs Incomplete or Pass vs Fail values.)
    Under SCORM 2004 there are two separate elements tracking completion and success.
    The SCORM 2004 element cmi.completion_status is designed to indicate whether the learner has completed the SCO and can take the following status values:
    completed,
    incomplete,
    not attempted,
    unknown
    Your Pass/Fail status in SCORM 2004 falls under the element called cmi.success_status. This one indicates whether the learner has mastered the objective and can take the following values:
    passed,
    failed,
    unknown
    So while it is possible for a value of "unknown" to be set for either cmi.completion_status or cmi.success_status, I don't think this would be very commonly done for a Captivate module that actually has interaction data being sent to the LMS.  If the user is interacting with the content at all it would seem more likely that you'd be getting a 'completed' or 'incomplete' status value in the completion element and a 'pass' or 'fail' value in the success_status element.
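The independence of the two elements can be shown with a small sketch against a mock SCORM 2004 runtime (the `SetValue`/`Commit` names are the standard SCORM 2004 API calls; the mock store is an illustration, not any particular LMS):

```javascript
// SCORM 2004 tracks completion and success in two independent elements.
// A SCO can legitimately be completed yet failed, or incomplete yet passed.
function reportStatus(api, completed, passed) {
  api.SetValue("cmi.completion_status", completed ? "completed" : "incomplete");
  // success_status is set separately: "passed", "failed", or "unknown".
  api.SetValue("cmi.success_status",
               passed === null ? "unknown" : (passed ? "passed" : "failed"));
  return api.Commit("");
}
```

A module with no quiz would typically set `completed` from slide views and pass `null` (unknown) for the success side, which is exactly the separation Captivate is being criticized for not honoring in the thread above.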
    The bit I don't understand is why are you particularly keen to see a value of 'unknown' showing up here?
    Check this page for information about getting SABA to work with Captivate: http://www.infosemantics.com.au/adobe-captivate-learning-managment-system-lms-guide/saba
    Perhaps one of the threads it links to will provide more help for you.

  • Tracking Pass/Fail on LMS

    I am trying to track pass/fail on a Captivate project and I
    am having difficulties with the LMS. It is tracking as completed
    after just the first slide, before even taking the quiz. I have my
    quiz settings set to report "Quiz results only", Pass/Fail, Report
    to LMS as Percent, Reporting level is "Report Score", the Pass/Fail
    options are set to 75%.
    NOTE: I have allowed the user 2 attempts to pass. Could this
    be what's giving me trouble?

Look at the Project > Advanced Interaction panel and see if you have score values set up throughout your lesson as you should.
It sounds like perhaps you have a low number of scored interactions and that 75% is being met as soon as they answer the first question correctly? Compare the Advanced Interaction results to your Quiz Preferences > Pass/Fail options and see what shakes out...
    Erik
    Erik Lord
    http://www.capemedia.net
    Adobe Community Expert - eLearning
    http://www.adobe.com/communities/experts/
    http://www.awaretips.net -
    Authorware Tips!

  • Filter Simple List to show only 1 record per employee

    All,
I'm looking for a very simple OOTB way to filter a list.
    I've used InfoPath form to create a simple test form which is published to form library on submission.
    My list columns are:
    Employee Name, Pass/Fail,Test Score, Test Attempt (Count).
    The list is grouped by Pass/Fail
I need to show all passed tests, and only those failed tests where the user hasn't subsequently taken the test again and passed it.
I was hoping I could do this with a filter web part, since I can't seem to do it with the OOTB column filter options. Or just show the highest attempt count per employee.
    Any help would be greatly appreciated.
    Thanks
    W

    Thanks, yet that doesn't work for me.
    Maybe it's my test attempt count.
    Each form is a new form, and each form will have a test attempt (TA=) number against it.
    i.e.
    Form1 TA=1 Fail
    Form2 TA=2 Pass
    With the above, both seem to appear with the above filter.
No form has TA=0.
    I can't guarantee that users will only fail once.
    Forms ending in 130 and 310 are completed by the same Employee Name
    Thanks
    W
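The "highest attempt count per employee" idea can be expressed as a simple group-and-max, sketched here outside SharePoint (the row shape is hypothetical, matching the list columns described above):

```javascript
// Keep only each employee's highest-numbered attempt - the record that
// reflects their final pass/fail state - from rows shaped like:
// { employee: string, ta: number, pass: boolean }
function latestAttemptPerEmployee(rows) {
  var latest = {};
  rows.forEach(function (row) {
    var cur = latest[row.employee];
    if (!cur || row.ta > cur.ta) {
      latest[row.employee] = row;
    }
  });
  return Object.keys(latest).map(function (k) { return latest[k]; });
}
```

With the Form1/Form2 example above, only the TA=2 Pass row would survive for that employee, which is the desired outcome; in SharePoint itself this usually needs a calculated column or a custom view rather than the stock filter web part.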

  • Pass/Fail Assessments

    I am using Captivate 3 and am looking for resources that can guide me through creating an assessment in a pass/fail mode.  I have created assessments, and have created quizzes, but need to bring the two together to record a demonstration of a software package we use, then launch an assessment that requires the user to get XX percent of mouse clicks, etc. correct in order to pass the assessment.  I know that I can use failure captions and set the number of tries a user is allowed, but have not been able to determine a way to set it up as pass/fail.
    Thanks for your assistance.

    Hello,
    No problem with CP. First you will have to determine the mastery score, what is needed to Pass. This has to be done in the Quiz, Preferences, Pass or Fail, Pass options. You can combine Question slides with assessment, where the user has to use interactive objects to be scored. An overview of scored Question slides and interactive objects can be seen in the view 'Advanced Interaction' (menu Project). Then you will have to set the Reporting (also in Quiz Preferences), depending on the SCORM version (do not know Plateau myself). It is possible to report Pass/fail as you will see in those settings.
    Hope this can be a start.
    Lilybiri

  • How do i configure a FOR loop to have the behavior of the step Loop Type: Pass/Fail count?

    Hello,
I'm using the Pass/Fail count set to one Pass to capture an event generated by my DUT.  I originally used a numerical compare step with the looping type of Pass/Fail count to accomplish this.  Unfortunately the implementation changed, and now I need to execute a few steps that cannot be combined within one code module as before. Nor can these steps be put into a subroutine.  One of the steps executes a .NET assembly, and I haven't figured out how to pass the reference to the subroutine.  When the subroutine is entered the reference is lost and the method does not execute correctly.
I have an evaluation function that exits the loop when the expected conditions are met. Everything works except for the overall Pass/Fail result of the For loop.  If the loop exits due to the first numerical compare test passing, I want the loop's overall execution to report as "Passed".  If the loop reaches its predetermined number of iterations, the overall result needs to report as "Failed".  It would also be nice to have the radio-button functionality of "Record Result of Each Iteration".  Some conditions require a wait of over a minute for the event to occur, and I don't want to generate needless data for the report.
Currently I get the pass/fail status for each For loop iteration, but only "Done" for the loop overall.  I don't want the initial few failures to cause the test to fail.
    Does anyone know how to do this? Suggestions?
    Thanks,

I have 2 steps in the loop that cannot be combined into one step. This forces me to implement the behavior in some form of loop.  A While Loop could be an option.  I would need to pass the Step.Result.Status from the Multiple Numeric Limit step to the condition of the While Loop.  I tried to use the Step.TS.ID without success.  Some form of reference to the numeric compare step would be needed. I'm using a For Loop as I do want to limit the number of iterations of the loop.  In the case where the loop iterations are reached, the event I am trying to detect did not occur at the correct time and a failure needs to be reported.
    I came up with something based on my comments in the second post:
    1) To start with I configured a Locals.ForLoop_5 variable.  This is used to set the limit on the loop iterations and for comparison after the loop has finished executing. More on that later.
2) The first step inside the loop invokes a method within a .NET assembly that has been packed in a DLL. This method gets the required data from the DUT and stores it in a text file.
    3) The next step is the Multiple Numeric Limit step.  This step invokes a VI that extracts the data from the text file.  This step has been customized in several ways.
      i)  In Run Options, the Results Recording Option was disabled.  This prevents recording of "Failed" while the loop executes when waiting for the event to happen.
      ii) In Run Options, the Step Failure Causes Sequence Failure is unchecked.  Same reasoning as i)  These steps are not true failures.
  iii) A Post Action is configured to go to the next step after the For Loop End step On Condition True, with the logic Step.Result.Status == "Passed".  This causes the loop to exit when the first "Passed" is encountered, which correlates with the event I'm trying to detect. On Condition False remains set to default.
4)  The step after the For Loop End is an expression step with everything set to default except for the Status Expression logic, set to: Locals.Loopindex < Locals.ForLoop_5 ? (Step.Result.Status = "Passed") : (Step.Result.Status = "Failed"). This step performs the overall Pass/Fail reporting for the For Loop.  If the number of loop iterations is less than the maximum, it could only have gotten there via the previous logic triggered by the numerical compare passing, therefore "Passed".  If the loop index has reached the limit, then the event was not detected, therefore "Failed".
I have tested this workaround with success; it's just a pain to now have to implement this on my 40-some-odd For Loops.
    If there is a better way, I'd still like to hear it.
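The workaround's control flow (exit early on the first pass, report an overall fail only when the iteration limit is reached, don't count interim misses as failures) can be sketched outside TestStand like this, purely as an illustration:

```javascript
// Retry an event check up to maxIterations times. Overall result is
// "Passed" if the check succeeds before the limit, "Failed" otherwise;
// intermediate unsuccessful checks are not recorded as real failures.
function waitForEvent(checkFn, maxIterations) {
  for (var i = 0; i < maxIterations; i++) {
    if (checkFn(i)) {
      return { status: "Passed", iterations: i + 1 };
    }
  }
  return { status: "Failed", iterations: maxIterations };
}
```

This is the same logic the expression step above encodes with Locals.Loopindex < Locals.ForLoop_5.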

  • Sending a return value without pass/fail message

    Hi,
    I'm using the COM adapter to talk TestComplete application.
    I have a string value test step to evaluate a return value and determine if it's pass or fail (e.g. return 0 = pass, return 1 = fail).  This works perfectly.
    However, I have added another return value (e.g. return 2), the code executed fine, but it keeps displaying that step as "Failed", which is annoying.  Is there a way to suppress this failed message?
    Basically, here's what I need to do, return 0= display pass, return 1 = display fail, return 2 = display no message or just display done.
    Thanks so much!
    Solved!
    Go to Solution.

    The String Value Test step type only evaluates to Passed or Failed.  It does not do a tristate.  In fact any of the step types in the Tests (Pass/Fail Test, Numeric Limit Test, Multiple Numeric Limit Test, String Value Test) will always be pass or fail.  They never evaluate to just done.
    The Status Expression is what makes a step Pass or Fail.  Unfortunately with the default steps you cannot change the Status Expression.  You have to create a custom step type to do that.
    What are you setting your Step.Result.String to?  Or did you change the Data Source?
    Here are 3 options:
1- Create a custom step based on the String Value Test step and change the Status Expression such that you get your desired behavior.
2- Use an Action step and change the Status Expression such that you get your desired behavior.
3- Continue using the step you are using, but in the Post-Expression place some logic that turns off Step.StepFailCausesSequenceFail if you are in the return 2 state.
    http://zone.ni.com/devzone/cda/tut/p/id/4302
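The tristate mapping the custom Status Expression in options 1 and 2 would implement can be sketched as plain logic (the function name is just for illustration):

```javascript
// Tristate status mapping for a custom step's Status Expression:
// 0 -> Passed, 1 -> Failed, anything else (e.g. 2) -> Done,
// so no pass/fail verdict is shown for the third state.
function statusForReturnCode(code) {
  if (code === 0) { return "Passed"; }
  if (code === 1) { return "Failed"; }
  return "Done";
}
```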
    Hope this helps
    jigg
    CTA, CLA
    teststandhelp.com
    ~Will work for kudos and/or BBQ~

  • Count the Pass/ Fail in a DAQ reading testing ?

I want to write a while loop to read the DAQ signal. Each time, the DAQ input signal is compared with the spec limit. A signal that is within the limit will be recorded as "Pass", while a signal out of spec will be recorded as "Fail". At the end of the testing (I break the program), the test results will be displayed on the test panel.
I believe my question is similar to programming "A=A+1". Can anybody help me resolve the problem?

    Tian,
    Are you looking for just the number of passed and number failed measurements or a record of each individual test. If you are only concerned with the number of passed/failed tests then use a shift register. Right-click on the border of the loop and choose Add Shift Register. Then use a case structure. If it passed the test (true case) then increment the value. If it did not pass the test (false case) then just wire the value straight through the case structure. To get the number that failed simply subtract the number that passed from the iteration count.

  • Combine "Pass/Fail" and "Multiple Numeric Limit" tests

    Hi,
how is it possible to combine a Pass/Fail and a Multiple Numeric Limit test?
I know that you can create your own step types, so I have already combined these two steps. But now I can only observe the numeric limits. Regardless of the pass/fail source ("Edit Pass/Fail Source"), TestStand only looks at the numeric limits. If they are OK the test is passed; if not, it is failed. It does not matter whether "Step.Result.PassFail" is True or False.
    What is the problem, or do i make any mistakes?
    Thanks for your help
    Daniel

    Hi,
You will need to include Step.Result.PassFail in the Status Expression. Probably at the moment you only have the Numeric Limit Test status expression in your combined step type. You will need to change the Status Expression to match your requirement, which will give you an overall Pass/Fail for your step type.
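The combined status expression amounts to ANDing the Boolean pass/fail source with the numeric-limit result; as a sketch of the logic (not TestStand expression syntax):

```javascript
// Overall status for a combined Pass/Fail + Multiple Numeric Limit step:
// both the Boolean source and every numeric limit must pass.
function combinedStatus(passFail, measurements) {
  var limitsOk = measurements.every(function (m) {
    return m.value >= m.low && m.value <= m.high;
  });
  return (passFail && limitsOk) ? "Passed" : "Failed";
}
```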
    Regards
    Ray Farmer

  • [LV - TS] How to execute a SEQ file, get the result (PASS/FAIL) and the report file ?

    Hello,
I'm looking for the most obvious and simple way, from LabVIEW, to:
    1- Start the Teststand engine : DONE
    2- Get a list of SEQ files to execute : DONE
    3- Loop through the list to execute each SEQ file : <- found some examples but I have some trouble with the UIMessage stuff
    http://zone.ni.com/reference/en-XX/help/370052N-01/tsexamples/infotopics/executing_sequences_using_api_labview/
    4- Get the result of the SEQ execution : <- unsure
    5- Get the XML path+name of the report file generated : <- obviously not easy
    http://digital.ni.com/public.nsf/allkb/3D3B2358DF9B786B8625789500549A3B
    http://forums.ni.com/t5/NI-TestStand/Getting-Report-File-Path-location-at-execution-in-TestStand-2012/td-p/3041789
    Here is my current implementation :
    David Koch

    Hello,
    I tried to work from this example :
    http://forums.ni.com/t5/NI-TestStand/Execution-of-TestStand-Sequence-in-LabVIEW-via-TS-API-Different/td-p/2939616/page/2#M46686
However, it remains unclear how to get a simple answer from just running a sequence file outside of any model so far (Simple Pass). Like I told you, all I need, once the TS engine is started, is a VI with the following interface:
    INPUT
    1- TS engine reference
    2- SEQ file path (path or string)
    3- Sequence name to execute
    4- UUT string (optional, but yeah)
    5- error in
    OUTPUT
    1- Result boolean (PASS/FAIL)
    2- Path to XML report
    3- error out
It would be so cool to just have this.
About the surrounding UI, I'm working on it. The LV code is not really important at this point; it's running a SEQ file that matters to me.
    David Koch
