Pass error cluster from LabVIEW DLL to TestStand

Hello,
I want to pass an error cluster from a DLL compiled in LabVIEW 7.1 to TestStand 3.1, but I never receive the contents of the LV error cluster in TestStand. I compiled my function with the standard calling convention option, so this should work. In my VI I simply generate an error and pass it to the error output.
LV Settings:
 TS settings:
regards
MB

MB,
please follow the info in this KB:
http://digital.ni.com/public.nsf/allkb/22BF02003B4588808625717F003ECD67?OpenDocument
Please note that using "By Value" will never return any values to TestStand!
You cannot use the default error-container in TestStand to receive data from the LV error cluster if you compile the VI into a LV DLL.
So either follow the KB, or split up the error cluster in your LabVIEW VIs and return error.occurred (Boolean), error.code (numeric I32), and error.msg (LabVIEW string) as separate parameters.
hope this helps,
Norbert
CEO: What exactly is stopping us from doing this?
Expert: Geometry
Marketing Manager: Just ignore it.

Similar Messages

  • Stub out Error Cluster from Error Code.vi?

When I profile my application, I find that the biggest consumer of CPU time is the Error Cluster from Error Code.vi, called from a number of locked NI libraries. How can I stub out this VI, replacing it with basically a pass-through? I tried creating a project-specific VI with the same name, and when I open my project all the project and vi.lib VIs that use it link to it, but they all come up broken and have to be re-linked, which I can't do in the locked VIs (I matched the ins and outs and the connector pattern). I can't edit the Error Cluster...vi directly: I get a message about the VI being in use by another application, even with a clean start of LV and going directly to that VI in the library (and I would prefer not to mess with the vi.lib version anyway). Any suggestions? Thanks. LV 2011.
    Matt

If you have the LLB manager open, it reserves your VI in a different application instance, so it is locked and cannot be edited. Simply close the LLB manager and you can whack away at that VI.
    I doubt that the shared clone setting plays much of a role here, there are some deeper issues.   A rather amusing VI in a few ways:
In order to show the VI title instead of the name, you open a VI reference, which sticks you into the root loop. 10 reentrant clones simply means 10 copies waiting in line for the root loop. A non-reentrant version would simply have 10 copies waiting to run. Minus the root-loop issue, reentrancy is the way to go, and on a desktop machine shared clones are typically quite effective.
It can't be a slow VI: look, Trim Whitespace was inlined manually to avoid a performance hit! In a loop maybe, but really, shaving off the overhead of a single subVI? That's optimization.
But wait, all of that and much, much more is given right back by the use of Delete From Array to remove the first element of the array. Bad on so many levels. Delete From Array is a data-copy machine, and here you are doing one needlessly. Array Subset is your friend: it tells the compiler you are only reading, nothing to get excited about. Pull it outside the case structure, too; you are ditching the first element twice in the two branches.
Concatenating strings in a loop is also a recipe for fun. Often there are ways to leverage Array To Spreadsheet String for better performance, or to build a string array and concatenate at the end. Probably not a big deal here (how long are call chains, really?), but if you are inlining subVIs by hand then you should really be frying the bigger fish.
    I find myself in your shoes fairly often.  Some clunker in vi.lib causes a bug or sluggish performance.  You either ditch the NI code and roll your own, or make it work on your machine but not others, or chalk it up to the cost of doing business.  I have tried on numerous occasions to suggest that all patches to vi.lib be made readily and freely available to all previous versions of LV that are compatible.  Let's say this VI got tweaked so it was a bit more performant for you, at least enough to be viable.  Then you could happily use the patched version in LV11, and if you went to a different machine you would simply make sure vi.lib was up-to-date.  These types of patches could roll out incrementally as needed, less need to cross your fingers that you won the CAR lottery with each new version of LV.  (I guess most lottery winners don't shell out $$$ to collect their "winnings" though).
    After all that what would I do here?  I would wrap that sucker inside a disable structure (provide minimal functionality, like pass through caller name, code and error instead).  Then I would check the performance again.  Now you have a data point as to whether or not it is worth it to proceed.

  • Pass 2D array from LabVIEW to C# problem

Hi, I have made a DLL using a .NET assembly in order to use my LabVIEW code in C#. I can easily pass strings, numerics, and 1D arrays between LabVIEW and C#, but when I want to read back a 2D U16 array from LabVIEW in C#, I get the following exception, although both sides have exactly the same element type (ushort):
Cannot widen from source type to target type either because the source type is a not a primitive type or the conversion cannot be accomplished.
Could you please let me know how I should pass a 2D array from LabVIEW to C#?
    Thanks

    If all else fails, you could pass n 1D arrays across and then put the original 2D array back together. That's probably how the computer would do it behind the scenes, anyway.
    Cameron
    To err is human, but to really foul it up requires a computer.
    The optimist believes we are in the best of all possible worlds - the pessimist fears this is true.
    Profanity is the one language all programmers know best.
    An expert is someone who has made all the possible mistakes.
    To learn something about LabVIEW at no extra cost, work the online LabVIEW tutorial(s):
    LabVIEW Unit 1 - Getting Started
    Learn to Use LabVIEW with MyDAQ

  • Passing values from CVI dll to TestStand gives wrong values

    Initially I wanted to pass an array of booleans from CVI to TestStand.  I read that this was not possible because you cannot set Boolean as a parameter.
So I declared it as unsigned int in CVI.
Strangely enough, TestStand does receive it as an array of Booleans. However, the values are still wrong. I wonder if it actually is possible to pass an array of Booleans.
    Using TestStand 3.1 and CVI ver 7.1 (no impact)
    Here is how I declare the parameters in CVI:
void __declspec(dllexport) __stdcall MemTest(
    CAObjHandle    thisContext,
    unsigned char *apbResult,
    int            alComport,
    int            alSocketNum,
    UINT32         aulHostAddr,
    BOOL          *apbTestOutcome,
    UINT32         testSize,
    UINT32         startAddress);
    The other functions within CVI use an array of booleans.
    Here is how I declare the parameters in TestStand:
    Here are the results I get.  They should match...
    The obvious question is:
    How do I get the results in TestStand to match those from CVI?
The printout >> no. 1 Passed << etc. shows the values of the array apbResult in CVI; bSippResuls should match.
    Message Edited by Ray.R on 10-16-2009 10:41 AM
    Attachments:
    TS_prms.PNG ‏29 KB
    TSwrongValues.PNG ‏12 KB

    Ray,
    Try this:
void __declspec(dllexport) GetResults(unsigned char arg1[16])
{
    /* Insert function body here. */
    arg1[0]  = (unsigned char)0;
    arg1[1]  = (unsigned char)1;
    arg1[2]  = (unsigned char)0;
    arg1[3]  = (unsigned char)1;
    arg1[4]  = (unsigned char)1;
    arg1[5]  = (unsigned char)0;
    arg1[6]  = (unsigned char)1;
    arg1[7]  = (unsigned char)1;
    arg1[8]  = (unsigned char)1;
    arg1[9]  = (unsigned char)0;
    arg1[10] = (unsigned char)1;
    arg1[11] = (unsigned char)1;
    arg1[12] = (unsigned char)1;
    arg1[13] = (unsigned char)1;
    arg1[14] = (unsigned char)0;
    arg1[15] = (unsigned char)1;
}
Regards
Ray Farmer

Error while calling LabVIEW DLL from VB2005

    Dear all expert,
I'm a student and very new to LabVIEW. I have designed an application in LabVIEW 8.2 and need to convert it to a DLL so that I can call it from VB2005.
I'm encountering an error: my computer hangs when calling the LabVIEW DLL. I suspect this may be due to an improper way of calling the LabVIEW DLL from my VB program, so I really hope someone can give me suggestions and corrections. I have attached my VI and VB program.
For your information, I have already installed LabVIEW Run-Time Engine 8.2.1 and unchecked "Loader Lock detected".
Below is the function prototype created when I used the Application Builder to build the DLL.
    void Bodeplot(unsigned short Operation, double Beta, double Frequency, double Kc, double Fcz, double Fcp, double Wz, double Wp, double Wzrhp, double k, double *Output)
    Thank you very much.
    Attachments:
    MyProgram.zip ‏723 KB

    Hi cckoh,
    A couple suggestions include:
1) Make sure you are using the correct .NET data types that map to the data types LabVIEW uses. For example, the Operation parameter of your DLL is an unsigned short, which is equivalent to a System.UInt16 in .NET.
2) You should be using ByRef instead of ByVal for your output parameters. For example, your function prototype has double *Output, so when you import your unmanaged DLL, the prototype you should use would be:
Declare Auto Function Bodeplot Lib "........."(ByVal Operation as System.UInt16, ........ ByRef Output as Double)
    I tested this out and it loads the DLL, calls the function, and returns without problems.
    Hope this helps!!
    Best Regards,
    Jonathan N.
    National Instruments

Problems using a LabVIEW DLL with TestStand!

    Hi,
I tried to put the VIs that create a TCP/IP connection, read/write data over it, and close it inside a LabVIEW DLL and use these functions from TestStand.
    The problem is to get access to the ConnectionID generated as TCP Network Refnum in LabVIEW.
    I don't know how to specify the prototype entry for this Refnum in LabVIEW and how to read it with TestStand.
Another attempt, passing an ActiveX reference of the SequenceContext and using the SetValIDispatch method to set a local variable (type: ActiveXReference) to the connection ID returned by TCPOpen.VI, wasn't successful either.
    It seems to me that the connectionID isn't a normal ActiveXReference.
    Regards,
    Sunny

    Hi Sunny -
You should treat this parameter as a pointer to an int when calling the DLL from TestStand (or from any language like C or C++). Note that you can't do anything with the value outside of LabVIEW, since it only has meaning inside LabVIEW; you can only pass it around for use in other VIs you call from TestStand.
    Hope this helps....
    Kyle G.
    National Instruments

Using the PostLVUserEvent function from labview.dll

I am trying to use the PostLVUserEvent function from labview.dll and the corresponding extcode.h file. I have made my own DLL in CVI that uses this function, and I then use LabVIEW to call my DLL. My DLL compiles fine: no issues, no errors. LabVIEW can call my DLL function containing PostLVUserEvent with no errors. However, the event refnum LabVIEW passes my DLL is not the same one I receive, so no LabVIEW event is triggered.
Has anyone had this issue?
We are trying to solve it from the LabVIEW side and the CVI side but cannot seem to get it to work. The examples I have found here were compiled using C++, and those seem to work. I do not know why, when I compile my program in C, it creates the DLL but does not work. If I had to guess, I think it's an issue with pointers vs. non-pointers. When the LabVIEW program selects my function from the DLL, it shows the PostLVUserEvent argument as a pointer, while in my DLL it is not a pointer.
    Any ideas?
    Thanks in advance. 

    Hello Blue
Take a look at this one; it was created in C, and I think the .c and .h files will give you a good idea of how the function should be implemented. It is important, when calling the DLL in LabVIEW, to use the stdcall calling convention and to run it in the UI thread.
    Regards
    Mart G
    Attachments:
    LabView EventTest.zip ‏1041 KB

  • How do I return error code from a DLL?

I have a C++ DLL that can throw an exception. In the C function interface between LabVIEW and the C++ class, I catch exceptions so they don't propagate up to LabVIEW. But I want to get the error to the calling VI. Is there a way to get an error code into the error cluster that comes out of the Call Library Function Node, so that I can feed it to the simple error dialog box? The exception has a string that describes the error and where it came from. Or should I just make every DLL call return an error string (empty string if no error)?
    Thanks,
    Keith.

Normally the convention is to use an integer return code, with zero meaning no error. You can then create an error cluster based on this return value. The best thing to do is to have an "Error Code Check" subVI that you plop down each time. The error cluster on the Call Library Function node itself only reports problems with the LabVIEW <--> DLL interface; in fact, earlier versions of LabVIEW did not have error in/error out terminals on that node at all. I'm not aware of any way to automatically set the error out cluster from the Call Library Function node. With the .NET interface it's different: if the method throws an exception, you get the error out cluster populated, with error code 1172 I think.

  • Not found error message from LabVIEW runtime when called from VC++ with statically linked MFC

I have a DLL built using LabVIEW 6i. I am calling the DLL from a VC++ application. When the application loads I get an error pop-up: 'cannot find LabVIEW runtime'. If I change the VC++ code to dynamically load MFC (using the LoadLibrary function) or to dynamically load the LabVIEW DLL I created, the problem goes away. Only when both are loaded statically do I get the error message.
The target machine is running Win2K Pro and has the LabVIEW runtime installed.
I do not want to use dynamic loading, as I need to call the DLL from a legacy application. Are there any options to change the way LV links to MFC, or to force the DLL to find the LV runtime?
    Alan Widmer

    Ben,
I have attached a ZIP of the files required to test a DLL from an MFC application. Or you can build the same application by following the instructions below. I used the numtest DLL that you previously sent to me, so there is no 'magic' in the LabVIEW code or the DLL.
    In VC++ select NEW. Pick the MFC AppWizard (MFC) to build an MFC application. In the wizard select
    Project Name: MFCNumtest
    Single Document
    No database support
    No Compound document support
    Support for ActiveX
    Default settings on the Features page
    MFC Standard
    MFC as a shared DLL
    From the ResourceView of the project explorer, add an item Test to the IDR_MAINFRAME menu then add an item Go as a submenu item (with an ID of ID_TEST_GO)
Right-click the new menu item and run the Class Wizard. Click the Add Function... button to make a function OnTestGo().
Open MainFrm.cpp and scroll to the end of the file to see the skeleton for the OnTestGo function. In this function, add a call to the LabVIEW DLL. You will also need to add a reference to the .lib file and the usual extcode.h.
    When you run the app you will see a window with the Test menu item. Select it and click Go to run the call to LabVIEW.
This all works great. If you now go to Project | Settings, change the Microsoft Foundation Classes item on the General tab from 'Use MFC in a shared DLL' to 'Use MFC in a static library', rebuild ALL files in the project, and run the app, you get the error message:
    System error 998 while loading the LabVIEW Run-Time Engine (C:\Program Files\National Instruments\shared\LabVIEW Run-Time\6.0\\lvrt.dll).
    numtest requires a version 6.0 (or compatible) LabVIEW Run-Time Engine. Please contact the vendor of numtest to correct this problem.
I apologize for my slow response; I got distracted by some other issues.
Thanks for your help,
Alan Widmer.
    Attachments:
    mfcnumtest.zip ‏44 KB

Multiple numeric limit test with a cluster from LabVIEW?

Is it possible to make a multiple numeric limit test with the values from a cluster returned by a LabVIEW VI?
    Solved!
    Go to Solution.

    This can be done with a little work.
    The first step is to get the cluster's data back in to TestStand's memory space as you would do normally.
Then you either use post-expressions (since you're at a point in time between the test code module completing and the status evaluation) to move the pertinent parts into the data source for the step (usually Step.NumericArray), or...
    http://zone.ni.com/reference/en-XX/help/370052K-01/tsref/infotopics/test_step_types_mnl_test/
    And
    http://zone.ni.com/reference/en-XX/help/370052K-01/tsref/infotopics/pane_mnl_datasource/
Use the 'specify data source for each measurement' option.
I'm not in front of a computer with TestStand right now to knock up an example, but it should be pretty straightforward.
Either method should allow you to work with a mixed cluster of data types and evaluate just the numeric ones.
    Thanks
    Sacha
    // it takes almost no time to rate an answer

Error message from LabVIEW when trying to set up the DAQ Assistant

I receive a message from LabVIEW when I try to set up the DAQ Assistant and select a channel to use. The error states: "An exception occurred within the external code called by a Call Library Node. This might have corrupted LabVIEW's memory. Save any work to a new location and restart LabVIEW." LabVIEW then freezes. I have reset the DAQ device to try to solve this, but I still get the same message. How do I solve this? Thanks - mars2006

    Hi Mars-
    It sounds like your NI-DAQ installation may have become corrupted. I would suggest uninstalling and reinstalling the DAQmx 7.4 driver to correct this problem and ensure that you're up to date. This download is available here: NI-DAQ 7.4
    If the problem persists you may want to uninstall and reinstall LabVIEW and then NI-DAQ in that order. The error message will usually give an indication as to which VI the error occurred in. Please let us know which VI is failing if you're unable to avoid the error with these suggestions.
    Have a good day-
    Tom W
    National Instruments

  • Receive a cluster from a dll function

    Hi,
I have a function in a DLL which returns a C struct (by pointer). I want to know how I can receive it as a cluster from a CLFN.
struct info:
struct typedata {
    int a;
    int b;
    const char *c;
    int d;
    int e;
} data;
function call:
const typedata *hostinfo(int param);
    Solved!
    Go to Solution.

    LVCoder wrote:
Thanks a lot, both of you, for letting me know about my dumb assumption that two different processes can use the same memory space.
Anyway, I figured out the problem with my code that was reading the double value.
    the DLL function returns the address of the struct which has 5 elements.
struct test {
    int a;
    int b;
    int c;
    double d;
    double e;
};
Suppose the address I receive is 10:
address of test->a = 10
address of test->b = 14
address of test->c = 18
address of test->d = 26 // I was referring to it as address 22, but I forgot that I was on a 64-bit machine and that the whole slot gets allocated for the double instead of using the 4 bytes left over in the previous one.
    thanks!
That has nothing to do with the bitness of your CPU or the OS. It depends only on the compiler alignment setting used when you compile your DLL. Microsoft compilers use a default alignment of 8 bytes, which means every variable or element inside a structure is aligned to the smaller of its own size and the default alignment.
You can change the compiler alignment either with a compiler switch or with an explicit #pragma in your source code where you declare the variables or structure typedefs.
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions

  • Passing error message from login module to login page

    Hello,
    we have a custom login module to authenticate user in ldap and to grant application roles stored in db.
Is it possible to pass an error caught in the login module to the user (i.e. display the error message on the login screen)? We think it is helpful to see the actual reason why the user couldn't be logged in.
    Notes:
JDev version is 10.1.3.1. The custom login module was written following Frank Nimphius' guidelines and examples.
    Rado

    Hi,
if you followed this example, then it is configured for container-managed authentication, in which case the error message cannot be propagated to the view.
There was a similar discussion on the J2EE forum, and the answer was that the OC4J team will put this on a list of enhancements they track. The technical reason appears to be that the J2EE spec does not provide for telling users why authentication fails - which is clearly a limitation of the spec.
    Frank

What is the best way to open, close, and pass instrument handles from LabVIEW in the TestStand parallel model?

I have a number of test systems that use the parallel model with LabVIEW, and a good number of (PXI) instruments.
What is the preferred method for opening, closing, and passing instrument handles in TestStand when using LabVIEW?
    Solved!
    Go to Solution.

    Hi,
No. Below is an excerpt from the Session Manager help:
    Currently, Session Manager supports the following instrument session types:
    IVI Sessions—Use an IVI session to obtain the C-based instance handle for an IVI logical or virtual instrument name. NI Session Manager does not support IVI-COM drivers at this time. When IVI-COM drivers are available, you can use an IVI session to obtain an ActiveX interface reference to an IVI-COM driver.
    VXIplug&play Sessions—Use a VXIplug&play session to obtain a C-based instance handle for a VXIplug&play logical or virtual instrument name. Configure VXIplug&play names in the <VXIplug&play directory>\<Platform directory>\NISessionMgr.ini file.
    VISA Sessions—Use a VISA instrument session to obtain a C-based viSession handle to a VISA resource or logical name. Configure VISA logical names in the <VXIplug&play directory>\<Platform directory>\NISessionMgr.ini file.
    Custom Sessions—Use a custom session to create a data container object that shares ActiveX objects you create or other data between software components you write. Use the Attach and Get methods to attach data to and retrieve data from a session. A custom session does not initialize, close, or own an instrument handle. The data you share with a custom session does not have to be instrumentation related. You can create a custom session with any name you request.
    Regards
    Ray Farmer

Passing parameters from a LabVIEW VI to TestStand

Hello,
I'm struggling to retrieve the value of a VI OUT parameter of type ASCII string from basic_serial_write_and_read.vi (http://zone.ni.com/devzone/cda/epd/p/id/2669).
I've created a Local Variable of type string and assigned it to the VI OUT parameter "read string".
When running the sequence, however, the Local Variable value is not updated. (?)
I can see the ASCII string being returned from the UUT in the NI I/O Trace tool.
The basic_serial_write_and_read.vi works fine in LabVIEW.
    I'm just getting started but can't figure this out. I'd be appreciative of any hints.
    Cheers
    Kech
    Solved!
    Go to Solution.

Attached are the basic_serial_write_and_read.vi and my play.seq TestStand sequence.
When using the ASCII char "@" as the 'string to write' value, I get no reply value in the Locals.Local_read variable, although I can see in the NI I/O Trace tool that the UUT replies with a 22-byte message.
How do I convert the VI so that the 'string to write' and 'read string' parameters can be used with hex data, i.e. 0x40?
    Attachments:
    basic_serial_write_and_read.vi ‏23 KB
    play.seq ‏9 KB
