Exporting ENUM definition to DLL header file

I have a VI that takes an ENUM as an input to select a command to be performed. This VI is part of a DLL to be called from C.
I would like to see the ENUM definition in the header file for the DLL so that the C program can use the same symbolic names for the commands that I
am using in my case selector within the VI.
Is there a good way to do this?
When I generate the DLL the parameter is shown simply as a uint32_t (I have the ENUM defined as 32 bit in LV rather than the default 16 bit).
I would have expected that all ENUMs that are parameters or return values of any function going into a DLL would automatically be defined in the DLL's header...
They really should be typedef'd enum definitions, with the name of the LabVIEW control given to the typedef by default.
Any help is greatly appreciated.
Thanks,
Klaus

You can export all items from an Enum Control using the "Strings[]" property. The values for enums always begin with zero and increment by one.
Does this help?
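If you want the header to carry the names as well, one common approach is to maintain a matching typedef by hand in the generated header or a small companion header. Below is a minimal sketch, with hypothetical command names standing in for whatever the enum control actually contains:

    /* Hypothetical hand-maintained mirror of the LabVIEW enum control.
       LabVIEW enum values start at 0 and increment by one, so the order
       here must match the order of the items in the control. */
    typedef enum
    {
        CMD_RESET = 0,
        CMD_START,
        CMD_STOP,
        CMD_READ_STATUS
    } CommandSelector;

    /* the exported function still takes a uint32_t; callers can pass
       (uint32_t)CMD_START and stay in sync with the VI's case selector */

The "Strings[]" property mentioned above can be read at edit time to pull the item names out of the control, so the two lists are less likely to drift apart.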

Similar Messages

  • Import c struct from dll header file into TestStand?

    Hi,
    I'm writing a TestStand sequence to call some CVI functions I have compiled into a dll. One header file in the CVI code is the interface to the dll and declares some functions and some C structs which are to be passed into the functions as pointers. These structs are populated with test results as the CVI code executes. I want to import these structs into TestStand.
    TestStand seems to have properly imported my function calls because I can select them from a drop-down list. But it doesn't seem to recognize my struct definitions, because it just says "unknown __struct 617*" -- it knows it's a C struct pointer but doesn't know what fields exist in the struct.
    How do I get TestStand to recognize the fields that are in my structs?
    The C functions and structs are declared with DLLEXPORT in the header file. I compile the CVI code into a dll, with the target settings set to export marked header files and symbols. The only header file marked in the target settings dialog is the header file declaring these functions and structs. So I think they're being exported properly into the dll.
    Any ideas to help?
    Thanks!

    I'm using typedef struct {...} MyStruct; to make life easier in CVI world, so rather than changing that throughout my CVI code I think I'll just live with the fact that TestStand doesn't know what struct type it is.
    But typedef or not, is it ever possible for TestStand to import the struct definitions (ie so that it knows what fields and types exist within the struct)? Or do I have to make containers in TestStand that match the format of the structs I'm using in CVI, and pass those container pointers into my CVI function calls?
    Thanks!
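    For reference, a minimal sketch of the DLLEXPORT + typedef struct pattern being described above; the struct name, fields, and the DLLEXPORT fallback are placeholders, not the actual CVI code:

    /* Hypothetical sketch of the exported interface header. */
    #ifndef DLLEXPORT
    #define DLLEXPORT __declspec(dllexport)   /* normally supplied by CVI's headers */
    #endif

    typedef struct
    {
        int    testNumber;
        double measuredValue;
        int    passed;
    } MyStruct;                                /* populated as the CVI code executes */

    DLLEXPORT int RunTest(MyStruct *results);  /* pointer passed in from TestStand */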

  • Where can i find the User32.dll header files?

    Hi
    Is there any way to get the header file and library file of User32.dll or If I want to use a function from a User32.dll where can i get the parameters and arguments for that function which is required to call.
    Thanks & Regards
    Samuel J
    System Engineer
    Captronic Systems
    [email protected]

    Hi,
    In general you don't need "user32.h" because the functions are declared in Winuser.h (you should include Windows.h).
    Otherwise, if you need user32.h only, then you can get this file together with Debugging Tools 
    After install you will get this file at C:\Program Files\Debugging Tools for Windows\winext\manifest\user32.h
    regards,
    Andrey. 
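    As a minimal illustration of the point above: MessageBoxA is declared in winuser.h, which is pulled in through Windows.h, and is exported by user32.dll (the linker resolves it through user32.lib from the SDK), so no user32.h is needed.

    #include <windows.h>

    int main(void)
    {
        /* declaration comes from winuser.h via windows.h; code lives in user32.dll */
        MessageBoxA(NULL, "Hello from user32.dll", "Demo", MB_OK);
        return 0;
    }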

  • Shared library: function is not found and recognized in header file

    Hello,
    I am trying to use Java methods in LabVIEW. I am doing so by creating a Java Invocation Interface, using which I can call Java methods from C++ and then create a shared library that can be called from LabVIEW.
    When I import my shared library into LabVIEW, I get the following error messages:
    The shared library contains 3 function(s). But no function is found and recognized in the header file. The following function(s) cannot be wrapped. If you want to import these functions, please review the warning messages next to the functions below. You will need to fix the problems before you can continue with the wizard.
    jclass invokeJavaClass(JNIEnv* jenv, string className);
    The following symbols are not defined:
    jclass;
    Undefined symbols can prevent the wizard from recognizing functions and parameters. To correct this problem, check the header file to determine if you must add preprocessor definitions. Click the Back button to return to the previous page of the wizard to add preprocessor definitions (for example, "NIAPI_stdcall = __stdcall" or "NIAPIDefined = 1").
    The following header file was not found in the specified header file or one of the referenced header files:
    -  string
    -  iostream
    -  cstring
    -  jni.h
    To fix, click the Back button to go to the previous page and add the header file path to the Include Paths list.
     Please advise.
    Regards,
    H
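    For context, a rough sketch of the Invocation Interface pattern described above; the std::string parameter from the prototype is simplified to const char*, and the class path, class name, and error handling are placeholders:

    #include <jni.h>

    /* look up a class by its JNI name, e.g. "java/lang/String" */
    jclass invokeJavaClass(JNIEnv *env, const char *className)
    {
        return env->FindClass(className);
    }

    int main()
    {
        JavaVM *jvm = NULL;
        JNIEnv *env = NULL;
        JavaVMOption options[1];
        options[0].optionString = (char *)"-Djava.class.path=.";

        JavaVMInitArgs vmArgs;
        vmArgs.version = JNI_VERSION_1_6;
        vmArgs.nOptions = 1;
        vmArgs.options = options;
        vmArgs.ignoreUnrecognized = JNI_FALSE;

        /* create the JVM inside the C++ process */
        if (JNI_CreateJavaVM(&jvm, (void **)&env, &vmArgs) != JNI_OK)
            return -1;

        jclass cls = invokeJavaClass(env, "java/lang/String");
        (void)cls;

        jvm->DestroyJavaVM();
        return 0;
    }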

    Hello Vivek,
    The LabVIEW DLL that I am trying to import does not involve any third-party device; all my code is fully based on LabVIEW. Maybe this helps you guess what is happening: once I've parsed the DLL's header, an error like this one appears:
    void __cdecl Zdmt(LVBoolean *stop, double P, char channelName[],
        TD1 *errorIn, TD14 *FFTOptions, TD12 *Calibration, char FileName[],
        int32_t minRecordLength, TD26 *InstrumentHandler, LVRefNum sessionRefArray[],
        LVRefNum *queueIN, TD1 *errorOut, LVBoolean *averagingDone,
        HWAVES LastRecordFetched, TD24 *Impedance, TD17 *ColeColeCluster,
        TD18 *FFTcluster, TD5 *InstrumentHandleOutputCluster, LVRefNum *queueOut,
        int32_t *Acquired, TD6 *FreqTimeInfoCluster, double *averagesCompleted,
        int32_t len);
    The following symbols are not defined:
    LVBoolean; int32_t; LVRefNum;
    Undefined symbols can prevent the wizard from recognizing functions and parameters. To correct this problem, check the header file to determine if you must add predefined symbols. Click the Back button to return to the previous page of the wizard to add preprocessor definitions (for example, "NIAPI_stdcall = __stdcall" or "NIAPIDefined = 1").
    The following header file was not found in the specified header file or one of the referenced header files:
    -  extcode.h
    To fix, click the Back button to go to the previous page and add the header file path to the Include Paths list.
    I have replaced the first line of the DLL header file, #include "extcode.h", with #include "C:\Program Files\National Instruments\LabVIEW 8.6\cintools\extcode.h", which is the full path to where the header file is located. However, new headers now seem to be missing:
    -  stdint.h
    -  MacTypes.h
    As far as I know, MacTypes.h contains basic Mac OS data types and has no relation to stdint.h...
    I have created both of them and stored them in the same folder as extcode.h, but then other headers are missing!!!
    Do you know if it would be possible to create the .dll so that all the header files for its data structures are generated along with it???
    And if that is not feasible, what do you suggest? I hope I don't have to keep creating header files until the errors stop!
    Thanks for your time,
    ben

  • How to export user-defined properties to a separate file

    Hello,
    I want to export user-defined properties to a separate file using OMB+ commands. The documentation on this topic doesn't work.
    The OMB Code I use is the following:
    OMBEXPORT TO MDL_FILE '${Pfad}/006_${Project}.mdl' \
    FROM PROJECT '${Project}' \
    ALL_CLASS_DEFINITONS \
    CONTROL_FILE '$SANDBOX_HOME/Project.ctl' \
    INCLUDE_USER_DEFINITIONS \
    OUTPUT LOG TO '${Pfad}/006_${Project}_exp.log'
    The control file defines the path and filename for the definitions in this way:
    DEFINITIONFILE=C:\Arbeitsverzeichnis\udp.mdd
    But the file is not created and I get no error message.
    When using this command the UDP definitions are exported, but into the general project export file. What is the correct way to export the definitions to a separate file?

    - Create a user-defined OTD with one String field.
    - Create an XSD-based OTD with repeating name and value fields of type String
    - Create a "New Web Service" Java Collaboration whose input will be the first OTD, whose output will be the second OTD and whose operation will be named getProperties, or some such
    - Implement, in Java, the code necessary to read your properties, given the properties file name or path (given in the input OTD) and populate the name/value pairs on the output OTD
    - drag the operation of the collaboration onto the Business Process Editor canvas as an Activity
    - Configure a Business Rule to set the input of the getProperties activity to the name/path of the properties file
    - Configure a Business Rule(s) to use the name/value pairs returned from the invocation of the getProperties service
    If you wish to take this one step further you could consider writing the java collaboration
    a) generically so it can be reused from different business processes
    b) to cache the properties on first read so each invocation after the first one simply returns the in-memory values instead of re-reading the properties from disk
    c) generically so it can read, cache and return different property sets depending on the properties file/path provided as input.
    Bear in mind that I/O from a Java collaboration by means other than an eWay violates the EJB spec. It works all the same.
    Message was edited by:
    mczapski

  • Missing header files when importing a shared library with labview 8.6?

    Hi all,
    I want to import a .dll into my .vi program but I am not able to do it...
    I have created the dll  following the ni website tutorial
    http://zone.ni.com/devzone/cda/tut/p/id/3303#toc2
    Once the dll has been created, I have tried to import it with
    Tools-Import-Shared Library(dll)
    After parsing the header file, an error like this one appears:
    void __cdecl Zdmt(LVBoolean *stop, double P, char channelName[],
        TD1 *errorIn, TD14 *FFTOptions, TD12 *Calibration, char FileName[],
        int32_t minRecordLength, TD26 *InstrumentHandler, LVRefNum sessionRefArray[],
        LVRefNum *queueIN, TD1 *errorOut, LVBoolean *averagingDone,
        HWAVES LastRecordFetched, TD24 *Impedance, TD17 *ColeColeCluster,
        TD18 *FFTcluster, TD5 *InstrumentHandleOutputCluster, LVRefNum *queueOut,
        int32_t *Acquired, TD6 *FreqTimeInfoCluster, double *averagesCompleted,
        int32_t len);
    The following symbols are not defined:
    LVBoolean; int32_t; LVRefNum;
    Undefined symbols can prevent the wizard from recognizing functions and parameters. To correct this problem, check the header file to determine if you must add predefined symbols. Click the Back button to return to the previous page of the wizard to add preprocessor definitions (for example, "NIAPI_stdcall = __stdcall" or "NIAPIDefined = 1").
    The following header file was not found in the specified header file or one of the referenced header files:
    -  extcode.h
    To fix, click the Back button to go to the previous page and add the header file path to the Include Paths list.
    I have replaced the first line of the DLL header file, #include "extcode.h", with #include "C:\Program Files\National Instruments\LabVIEW 8.6\cintools\extcode.h", which is the full path to where the header file is located. However, new headers now seem to be missing:
    -  stdint.h
    -  MacTypes.h
    Does anybody know what I have to do??
    Any help will be really appreciated,
    Regards,
    Benjamin

    If you use any of the LabVIEW cintools headers, they reference other headers too. The import wizard is written in a way that it simply skips parsing datatypes that cannot be resolved due to missing header files. If the functions you want to import reference such datatypes, you get a corresponding error about any include files the wizard could not load; otherwise you don't. The wizard cannot know which of the missing header files is the problem, since it obviously doesn't know what would be in those header files.
    The LabVIEW cintools headers are multiplatform, meaning they evaluate various compiler-predefined defines to determine which platform they are being included on. The import library wizard does not define any specific defines, since it is not really a compiler, so you have to define them yourself. They also get adapted with each new LabVIEW version to support new compilers and compiler versions, so the defines described in the link in the first post are not necessarily correct for the cintools headers in newer LabVIEW versions.
    All in all, DLLs that interface to the LabVIEW cintools headers should not be written first and then imported with the wizard. Instead, write the VI and create the Call Library Node, let LabVIEW create a template C file from the context menu of the Call Library Node, copy that into your C sources, and fill in the functions from there.
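    Purely as an illustration of that workflow, such a generated template roughly has this shape; the function name and parameters below are placeholders borrowed from the prototypes quoted earlier, not LabVIEW's actual output:

    /* hypothetical outline of a Call Library Node wrapper; the real file is
       generated by LabVIEW from the Call Library Node and will look different */
    #include "extcode.h"

    int32_t MyDllFunction(LVBoolean *stop, double setpoint, int32_t len)
    {
        /* fill in the implementation here */
        (void)stop; (void)setpoint; (void)len;
        return 0;
    }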
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions

  • How can I use a dll if I dont have a header file

    I'm not sure if I'm even trying the possible here as I have searched and not been able to find much at all.  However I figured it was worth asking here.
    I have access to several dll's used by a program. I need to open a file using the program (for some reason it is completely non-responsive unless you open it "within" the program itself) and so decided to browse the .dll files included. I've found a few functions which may carry out the task I need. Is there a way of figuring out the inputs/outputs if I don't have documentation or a header file?
    This is the next stage in a huge project I am working on at the moment and I've been banging my head against the wall all day trying to figure this out.
    Thanks in advance for any help
    Rik
    That glass?
    Thats glass is neither half full or half empty....
    Its twice the size it needs to be

    Yes, that makes sense. It also means that what you are trying to do is not likely to work. You have no way of knowing what the program does when opening the file, so guessing at using the DLLs is purely a shot in the dark without even knowing where the dark is. Even if you could find the function (assuming it's just one) that loads a file, how is the program supposed to use it now? That function has to be called from within the program. When you call it from LabVIEW you are not sitting inside the program's memory space, so it has no way of knowing about the file.
    I would suggest, instead, to see if the program accepts command-line parameters. For example, does it accept the name of a file to open as part of launching it from the command line? If not, then you may need to resort to trying to control it via automation. If it has no built-in automation then you need to resort to using the OS to pretend you're clicking buttons and typing text. This has come up many times before, and there have been numerous posts on this, so please do a search on controlling an external program from LabVIEW within this forum. You can call the Windows API functions to move the mouse to a specific location and click the button as well as type text, or you can use third-party automation tools. One that I have used successfully is AutoIt. The search I indicated will yield other suggestions.

  • How to use Dlls and Header files in my java Code?

    Hi All,
    I want to make use of the dll and header files of DataStage from my Java interface.
    I am building a product where I need to contact the DataStage server from my Java code alone. For this they provided dlls and header files.
    Now I want to make use of them and need to perform DataStage operations from my Java interface.
    Can anyone help me out regarding this?

    Try the Java Platform SDK for native function calls without writing JNI code, like it is done in .NET languages.
    For MS Windows:
    http://www.simtel.net/product.php[id]100916[SiteID]simtel.net
    For Linux/Unix:
    http://www.simtel.net/product.php[id]117719[SiteID]simtel.net

  • How to use preprocessor directives (#define) in C++ header file with LabVIEW 8.2

    I have a C++ header file that contains around 2000 preprocessor directives:
    #define MEM_1   0xC
    #define MEM_2   0xD
    #define MEM_3   0x18
    I want to be able to "access" these memory offsets by identifier name (MEM_1) in my LabVIEW program like I would in a C++ program.  I do not want the overhead of parsing through the header file and storing all the offsets into an array or similar structure. 
    I've written a simple Win32 console program to return the memory offset given the identifier (see code below), and created a DLL to use with my LabVIEW program.  In the console program, you notice that I can call a function and pass in the identifer name, and get the offset back correctly:
    getOffset(MEM_1);
    In LabVIEW, I was hoping to be able to pass in the identifier (MEM_1) but was unsure what datatype to use.  In my C++ code, I defined the parameter as an int.  But in LabVIEW, I can't enter in MEM_1 as an int.   Can someone advise on how to do this?  Or if there is an alternate way to use #define's from external code inside LabVIEW?
    #include "stdafx.h"
    #include "scrmem.h"   /* defines MEM_1, MEM_2, MEM_3, ... */
    #include "stdio.h"

    void getOffset(int var);

    int _tmain(int argc, _TCHAR* argv[])
    {
        getOffset(MEM_1);
        scanf("%d");      /* pause so the console output stays visible */
        return 0;
    }

    void getOffset(int var)
    {
        printf("The address of MEM_1 is %x", var);
    }

    kaycea114 wrote:
    Hi,
    Where do you think I should use the string? 
    The way that getOffset is currently defined in the DLL, I have to connect an integer input into the LabVIEW function.  This prevents me from entering in: MEM_1 as the input to the LabVIEW function.
    Are you suggesting that I change getOffset to receive a String parameter ("MEM_1")?  Does that mean I need to do a string compare (line by line) through the header file to get the offset?  It seems like doing this search through the header file would degrade performance, but if that's the only work around, then I'll do it.
    Please advise.
    Well, what you want to do is indeed to enter a string and get back the assigned integer. That is what the C preprocessor does too, although there it happens only once, at the preprocessor stage, and not at runtime. But LabVIEW is not a C preprocessor.
    What you did so far seems to be to define getOffset() that accepts an enum that needs to be created from the C source code to then return the assigned constant. That's of course not very helpful.
    And writing a VI that could parse the C header file and create a name/constant array is really a lot easier than doing the same in C. You don't even need to parse the file each time again, but can instead cache them in an uninitialized shift register (LV2 style global).
    Even easier would be to create from that data a ring control using property nodes and save it as a custom control; voila, you have the most direct lookup you can get in LabVIEW, and it works just as comfortably as using the define in C code. It would mean that you may need to separate your header file into several different files so that only related constants end up in the same ring control, but that is easily done.
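    For illustration, the name/constant table being described would look something like this if you did build it in C after all; the DefineEntry type, buffer sizes, and function name are made up:

    #include <stdio.h>

    /* hypothetical lookup-table entry for one "#define NAME value" line */
    typedef struct { char name[64]; long value; } DefineEntry;

    /* fill 'table' from lines like "#define MEM_1   0xC"; returns the count */
    int loadDefines(const char *headerPath, DefineEntry *table, int maxEntries)
    {
        FILE *fp = fopen(headerPath, "r");
        char line[256];
        int count = 0;

        if (fp == NULL)
            return -1;
        while (count < maxEntries && fgets(line, sizeof line, fp) != NULL)
        {
            /* %li understands both decimal and 0x-prefixed hex values */
            if (sscanf(line, "#define %63s %li",
                       table[count].name, &table[count].value) == 2)
                count++;
        }
        fclose(fp);
        return count;
    }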
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions

  • Compiler appears to include .cpp source instead of/in addition to .h header file

    Numerous source files in our project depend upon unix/dcfPosixLocks.h and must be linked with the dcfPosixLocks.o that results from compiling unix/dcfPosixLocks.cpp. While all of these numerous source files either directly or indirectly #include "unix/dcfPosixLocks.h", none of them ever #include "unix/dcfPosixLocks.cpp" (or any other non-.h file for that matter). Yet the C++ compiler incorrectly appears to include unix/dcfPosixLocks.cpp instead of (or in addition to) unix/dcfPosixLocks.h, for example when compiling dcfConstants.cpp (actual invocation and error below). Renaming unix/dcfPosixLocks.cpp to unix/dcfPosixLocks.c (or other names with C/C++ related file extensions) results in similar behavior.
    We first discovered this problem when we moved from a partially patched Studio 11 on Solaris 9 SPARC to an unpatched Studio 11 on Solaris 10 SPARC, i.e. it works fine on the old system. The problem remains after fully patching Studio 11 on Solaris 10 SPARC and even after fully patching Studio 12 on Solaris 10 SPARC. The implication is that one of the Studio 11 patches broke the C++ preprocessor and has of course been carried over into Studio 12. Since we use gmake to compile multiple sources concurrently, we ruled gmake out as a source of the problem by eliminating concurrent compilation (e.g. -j1 not -j2) and by tracing it to ensure that it was not deciding to compile unix/dcfPosixLocks.cpp (e.g. due to some implicit rule or something like that).
    Workaround #1, since the dcfConstants.cpp file contains no templates, is to separately preprocess and then compile, i.e. CC -E dcfConstants.cpp 1> .../dcfConstants.i and then CC -c dcfConstants.i -o .../dcfConstants.o. Of course, workaround #1 is dubious in general and completely unworkable where templates are involved. Workaround #2 is to move the unix/dcfPosixLocks.cpp file out of the include path (many of the directories in our include path contain both source and header files). Unfortunately, workaround #2 is problematic from a source control point of view... it implies lots of source files being moved unnecessarily from one location to another to artificially distinguish them from header files, breaking historical chains in the process (granted, this depends on the abilities of your source control software).
    Any thoughts or comments? We are hoping that somebody might be able to help us discover the errors in our ways, contribute additional workarounds, and/or confirm that this is in fact a compiler bug. I can try to provide more information as needed. Thank you in advance. -R
    /opt/studio11/SUNWspro/bin/CC -G -KPIC -w -mt
    -D_POSIX_PTHREAD_SEMANTICS -DCXX_SUNPRO -DDCF_NO_STD_MIN -g
    -D_DEBUG -DDCF_ASSERTIONS_ABORT -DDCF_DEADLOCK_ABORTS -DDC F_EXCEPTIONS_ABORT -I. -I/export/home/buildsys/src/build/tmp.3/dcf1/inc
    -I/export/home/buildsys/src/build/tmp.3/dcf1/platform/unix/solaris/inc
    -I/export/home/buildsys/src/build/tmp.3/dcf1/platform/unix/solaris/inc
    -I/export/home/buildsys/src/build/tmp.3/dcf1/platform/unix/inc
    -Iunix/solaris -Iunix
    -I/export/home/buildsys/src/build/tmp.3/dcf1/interfaces
    -I/export/home/buildsys/src/build/tmp.3/dcf1/platform/unix/solaris/interfaces.overlay
    -I/export/home/buildsys/src/build/tmp.3/dcf1/interfaces.overlay
    -c dcfConstants.cpp -o Debug/unix/solaris/dcfConstants.o
    "unix/dcfPosixLocks.cpp", line 72: Error:
    dcfLocks::oAtomicAccessToLong is initialized twice.
    "unix/dcfPosixLocks.cpp", line 103: Error:
    dcfMutexAttributes::pSingleton is initialized twice.
    2 Error(s) detected.
    gmake[2]: *** [Debug/unix/solaris/dcfConstants.o] Error 2
    gmake[1]: *** [all-r] Error 1
    gmake: *** [build] Error 2

    I'm having trouble understanding what change in Sun compilers or in Solaris could have resulted in the change in behavior you report.
    For template code, Sun compilers have always used the automatic inclusion model described in the C++ Users Guide section 5.2, "Template Definitions".
    Specifically, if you include a header foo.h that declares one or more templates, the compiler will automatically include foo.cpp (or foo.cc, foo.C, etc) if it needs a definition of one of those templates. That is, it assumes that template definitions missing from foo.h will be in foo.cpp.
    Does that answer your question?
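    A small sketch of that inclusion model, with illustrative file names (not from the poster's project):

    // foo.h -- declares a template but does not define it
    template <typename T> T twice(T value);

    // foo.cc -- found and included automatically because it shares foo.h's base
    // name, whenever the compiler needs the definition of twice<T>
    template <typename T> T twice(T value) { return value + value; }

    // main.cpp -- only includes foo.h; instantiating twice(21) is what triggers
    // the automatic inclusion of foo.cc described above
    #include "foo.h"
    int main() { return twice(21); }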

  • How can I export an edited image to a file?

    I can't seem to find a way in iPhoto 09 do this. Can I export an edited image to a file, keeping the edits intact, or do I have to use a 3rd party editing app when I edit from inside iPhoto 09?
    I would prefer to use iPhoto's editing tools so should I import previously cropped images into iPhoto? If yes, what file should I use? I was wondering if anyone does this on a regular basis and if there issues doing this.
    I like the editing tools in iPhoto very much. In other apps I have always cropped, resized, corrected levels and a bit of color. I am not a power user by any means and I am close to doing this all in iPhoto if I could only export the edited image to a file.
    The reason I want to do this is because the printing portion of my work flow is to upload files to my merchant who prints them and are ready for nearby pickup within an hour. This is all done at a very reasonable price and the color accuracy is very close.
    Any advice would be greatly appreciated.
    Regards,
    Jim

    You can edit from iPhoto by dragging to the desktop or exporting using the File -> Export command, setting the Kind to anything except original.
    But you probably don’t need to:
    There are many, many ways to access your files in iPhoto:
    *For Users of 10.5 Only*
    You can use any Open / Attach / Browse dialogue. On the left there's a Media heading, your pics can be accessed there. Apple-Click for selecting multiple pics.
    You can access the Library from the New Message Window in Mail:
    *For users of 10.4 and 10.5* ...
    Many internet sites such as Flickr and SmugMug have plug-ins for accessing the iPhoto Library. If the site you want to use doesn’t then some, one or any of these will also work:
    To upload to a site that does not have an iPhoto Export Plug-in the recommended way is to Select the Pic in the iPhoto Window and go File -> Export and export the pic to the desktop, then upload from there. After the upload you can trash the pic on the desktop. It's only a copy and your original is safe in iPhoto.
    This is also true for emailing with Web-based services. However, if you're using Gmail you can use iPhoto2GMail
    If you use Apple's Mail, Entourage, AOL or Eudora you can email from within iPhoto.
    If you use a Cocoa-based Browser such as Safari, you can drag the pics from the iPhoto Window to the Attach window in the browser.
    *If you want to access the files with iPhoto not running*:
    Create a Media Browser using Automator (takes about 10 seconds) or use this free utility Karelia iMedia Browser
    Other options include:
    1. *Drag and Drop*: Drag a photo from the iPhoto Window to the desktop, there iPhoto will make a full-sized copy of the pic.
    2. *File -> Export*: Select the files in the iPhoto Window and go File -> Export. The dialogue will give you various options, including altering the format, naming the files and changing the size. Again, producing a copy.
    3. *Show File*: Right- (or Control-) Click on a pic and in the resulting dialogue choose 'Show File'. A Finder window will pop open with the file already selected.
    Regards
    TD

  • Error in running Extract Definition Upload from Data File concurrent.

    Hi all,
    I am trying to upload the 834 Extract Layout from the data file by running the concurrent program Extract Definition Upload from Data File.
    After running this concurrent program I get the Extract Layout definition with the Record layouts and Data elements within it.
    But some record layouts have changes at their repeating level.
    Please suggest how I can get the same repeating levels for the record layouts when I move the 834 benefit extract layout definition
    from one instance to another instance.

    Hi,
    We have exactly the same error in IBolt. This error happens sometimes and we can't find the reason.
    IBolt has been upgraded to Version 3.1 SP1 and a fix has been developed by the support team. After installation, we worked hard for 2 days testing all the flows with the customer without seeing this error.
    IBolt runs as a service and when this error occurs, the service stops and must be restarted.
    In our flows, we have put a delete of the observer.dll file just before the rest of the flow, but it doesn't solve the problem.
    Deleting the directory can temporarily be a solution, but the error comes back another day.
    Each time the SBO client or DTW is started, the %temp%\SMS_OBJ_DLL directory is created...
    When you just delete the observer.dll file and leave the rest of the directory as it is, the file is re-created when you start and log in to SBO. I would say that this file is a copy of another file, observer_800178.dll (800178 depending on the version of SBO you have). 800178 corresponds to SBO 2007 A PL 42.
    Do you have more information since your post was sent?
    Thanks in advance for your help.
    Best regards.

  • Can I use Bridge to export image data into a .txt file?

    I have a folder of images and I would like to export the File Name, Resolution, Dimensions and Color Mode for each file into one text file. Can I use Bridge to export image data into a .txt file?

    Hello
    You may try the following AppleScript script. It will ask you to choose a root folder where to start searching for *.map files and then create a CSV file named "out.csv" on desktop which you may import to Excel.
    set f to (choose folder with prompt "Choose the root folder to start searching")'s POSIX path
    if f ends with "/" then set f to f's text 1 thru -2
    do shell script "/usr/bin/perl -CSDA -w <<'EOF' - " & f's quoted form & " > ~/Desktop/out.csv
    use strict;
    use open IN => ':crlf';
    chdir $ARGV[0] or die qq($!);
    local $/ = qq(\\0);
    my @ff = map {chomp; $_} qx(find . -type f -iname '*.map' -print0);
    local $/ = qq(\\n);
    #     CSV spec
    #     - record separator is CRLF
    #     - field separator is comma
    #     - every field is quoted
    #     - text encoding is UTF-8
    local $\\ = qq(\\015\\012);    # CRLF
    local $, = qq(,);            # COMMA
    # print column header row
    my @dd = ('column 1', 'column 2', 'column 3', 'column 4', 'column 5', 'column 6');
    print map { s/\"/\"\"/og; qq(\").$_.qq(\"); } @dd;
    # print data row per each file
    while (@ff) {
        my $f = shift @ff;    # file path
        if ( ! open(IN, '<', $f) ) {
            warn qq(Failed to open $f: $!);
            next;
        }
        $f =~ s%^.*/%%og;    # file name
        @dd = ('', $f, '', '', '', '');
        while (<IN>) {
            chomp;
            $dd[0] = \"$2/$1/$3\" if m%Link Time\\s+=\\s+([0-9]{2})/([0-9]{2})/([0-9]{4})%o;
            ($dd[2] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of CODE\\s/o;
            ($dd[3] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of DATA\\s/o;
            ($dd[4] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of XDATA\\s/o;
            ($dd[5] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of FARCODE\\s/o;
            last unless grep { /^$/ } @dd;
        }
        close IN;
        print map { s/\"/\"\"/og; qq(\").$_.qq(\"); } @dd;
    }
    EOF"
    Hope this may help,
    H

  • Database toolkit add on - using in a dll/lib file

    When using the database functions (for example, DBError) in a dll file called from MS Visual Studio, I get linking errors because the functions are not found.  I copied the header file for the functions into the project that is compiled to a dll, but this did not help.
    What options do I need to set in the project properties?  Or how do I fix this?
    Thanks!

    Hi coolGal,
    Please try the following:
    Open Visual Studio and create a New Project
    In the New Project Window, select Visual C++ » LabWindows/CVI » LabWindows/CVI Project, then click Next
    In the Options window, leave the defaults selected (EXE, Create a starter C++ file, and Full run-time engine), then click Finish
    Open the cpp file that was created and add #include <cvi_db.h> underneath the existing #include <cvirte.h>
    In the WinMain function, add char *myMessage = DBErrorMessage(); just above the call to CloseCVIRTE();
    Select Build » Build Solution
    For me, this builds and links successfully with CVI 2009 and Visual Studio 2008. What versions of CVI, Visual Studio, and the SQL Toolkit are you using? Also, what operating system are you on?
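    In code form, the file described by those steps ends up looking roughly like this (a sketch only; the wizard-generated parts are abbreviated and the CVI include/lib paths are assumed to be set by the project template):

    #include <windows.h>
    #include <cvirte.h>
    #include <cvi_db.h>     /* SQL Toolkit header that declares DBErrorMessage() */

    int WINAPI WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance,
                       LPSTR lpszCmdLine, int nCmdShow)
    {
        char *myMessage;

        if (InitCVIRTE(hInstance, 0, 0) == 0)   /* wizard-generated run-time init */
            return -1;

        myMessage = DBErrorMessage();           /* the call added in the steps above */

        CloseCVIRTE();                          /* wizard-generated cleanup */
        return 0;
    }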
    Best,
    John M
    National Instruments
    Applications Engineer

  • How to include header files with different extensions

    Hi,
    When I include a header file with the extension .ch (myincludefile.ch), the compiler gives error messages, but when I change the extension to .h, the problem disappears.
    Can anyone help me get rid of this problem?
    For example, for the line below I get a warning such as "attempt to redefine MY_CONST without #undef". Remember, when I change the extension to .h, the problem disappears.
    #define MY_CONST 500 /* Constant */
    Thank you very much

    I don't see how the name of the file could cause or prevent error messages, except when template declarations are involved. So let's assume for now that the file has a template declaration.
    The Templates chapter of the C++ Users Guide explains about including or separating template declarations and their definitions.
    If you have only a template declaration in a file and the compiler needs the definition, it will look for another file with the same base name and include it automatically. For example, if you have files foo.h and foo.cc, and foo.h has a template declaration, the compiler will include foo.cc automatically, even if you didn't intend for that to happen. You can wind up with multiple delcaration errors that way.
    When looking for a file containing template definitions, the compiler will not include a .h file, so as not to create recursive inclusion. If changing the file name to .h causes your problem to disappear, it seems like an unwanted automatic inclusion is the problem.
    You can try two things to find out:
    1. Compile with the -H option. The compiler will output an indented list of all included files. See if you are getting a file you didn't intend, or the same file twice.
    2. Compile with option -template=no%extdef. It disables the automatic search for template definitions.
    If you find an unintended included file this way, you will probably have to change the names or organization of some of the files. Our implementation of the C++ standard library depends on NOT using the -template=no%extdef option, which might mean you can't use that option.
