Profile performance of a VI using DLL

Hi,
I'm looking for some suggestions. We are working on a VI that calls an ActiveX DLL, and we want to check the performance of that VI, so we are using the Profile Performance and Memory tool available in LabVIEW 8.6. However, the results seem to be incorrect: the time reported is much less than what we measure ourselves.
Please let me know whether this is expected behaviour.
Thanks in Advance,
Vivek

I recommend taking the simple approach: put a millisecond timer function before and after the call to the DLL and subtract. I suspect that Call Library Function Nodes (CLFNs) do not appear in the profiler because LabVIEW hands execution off to the DLL and cannot monitor what the DLL is doing internally. LabVIEW has no way of knowing how much memory the DLL allocates or how much processor time it uses.
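To spell out the same pattern in text form (LabVIEW itself is graphical, so this is only an analogous C++ sketch; DoWork is a hypothetical stand-in for the ActiveX/DLL call, not anything from the original post): take a timestamp before the call, another after it, and subtract. In LabVIEW the equivalent is two Tick Count (ms) functions wired around the call.

#include <chrono>
#include <iostream>
#include <thread>

// Hypothetical stand-in for the ActiveX/DLL call being measured;
// in the real VI this is the Call Library Function or Invoke Node.
static void DoWork()
{
    std::this_thread::sleep_for(std::chrono::milliseconds(50));
}

int main()
{
    using clock = std::chrono::steady_clock;

    const auto start = clock::now();  // "Tick Count" before the call
    DoWork();                         // the call the profiler cannot see into
    const auto stop = clock::now();   // "Tick Count" after the call

    const auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(stop - start).count();
    std::cout << "DLL call took about " << ms << " ms\n";
    return 0;
}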

Similar Messages

  • Can't use profiler (Performance Analyzer) in Sun Studio 12, please help!!

    Hi,
    I have Mandriva 2008 (Linux), and I've installed Sun Studio 12.
    At the first start of Sun Studio a message appears:
    Warning - could not install some modules:
         ATD Sun Studio Core - The module named com.sun.tools.swdev.toolscommon was needed and not found.
         dbx Debugger UI - The module named com.sun.tools.swdev.toolscommon was needed and not found.
         ATD Performance Analyzer Actions - The module named com.sun.forte.st.mpmt/1 was needed and not found.
         ATD Performance Analyzer Actions - The module named com.sun.tools.swdev.toolscommon was needed and not found.
    All other modules work correctly, but unfortunately I need the profiler (Performance Analyzer) because I want to speed up my C++ code. What can I do?
    Please help!
    NOTE:
    I've added these lines to my .bash_profile:
    PATH=/opt/sun/sunstudio12/bin:$PATH
    export PATH
    PATH=/opt/sun/sunstudio12/man/:$PATH
    export PATH
    LD_LIBRARY_PATH=/opt/sun/sunstudio12/prod/lib
    export LD_LIBRARY_PATH
    but unfortunately this doesn't solve the problem...
    Edited by: MLX82 on Feb 1, 2008 11:24 PM

    If I type `uname -p` this message appears:
    [mlx@localhost ~]$ `uname -p`
    bash: Intel(R): command not found
    If I use --userdir it won't work either, as you can see:
    sunstudio --userdir /home/mlx/.sunstudio/12.0-Linux-Intel\(R\)\ Pentium\(R\)\ M\ processor\ 1.73GHz/
    I get a 426-line error (even though this is the correct location, according to the sunstudio man page), while if I type for example:
    sunstudio --userdir /home/mlx/
    the 426-line error disappears, but I still get the error about the modules:
    Warning - could not install some modules:
         ATD Performance Analyzer Actions - The module named com.sun.forte.st.mpmt/1 was needed and not found.
         ATD Performance Analyzer Actions - The module named com.sun.tools.swdev.toolscommon was needed and not found.
         dbx Debugger UI - The module named com.sun.tools.swdev.toolscommon was needed and not found.
         ATD Sun Studio Core - The module named com.sun.tools.swdev.toolscommon was needed and not found.
    On the other hand, I've searched for the "id" executable and it is in /bin:
    [mlx@localhost ~]$ id
    uid=500(mlx) gid=500(mlx) gruppi=500(mlx)
    so I've created a symlink:
    cd /usr/bin
    ln -s /bin/id ./id
    but when I start Sun Studio 12 I get the module error again.
    So I've tried to reinstall everything (yes, the OS too), but that only (partially) solved the batch_installer problem. In fact I can now use batch_installer, but at the end of the installation it says:
    [root@localhost tmp]# ./batch_installer --accept-sla
    Installation failed: cleanup successful.
    Anyway, Sun Studio 12 still works, but the main problem with the Performance Analyzer module is still there...
    How can I solve this? Please help!
    Edited by: MLX82 on Feb 4, 2008 3:19 PM

  • Improve the performance in stored procedure using sql server 2008 - esp where clause in very big table - Urgent

    Hi,
    I am looking for input on tuning a stored procedure in SQL Server 2008. I am new to performance tuning in SQL, PL/SQL and Oracle. I am currently facing an issue in a stored procedure: I need to improve performance by optimizing the code and filtering records with a WHERE clause on a very large table. The stored procedure generates an audit report that is accessed by approximately 10 admin users, typically 2-3 times a day each.
    It has a CTE (common table expression) that is referenced twice within the SP. This CTE is very big and fetches records from several tables without a WHERE clause, so a large number of records are pulled from the database and then processed. The stored procedure runs on a pre-prod server (a virtual server with 6 GB of memory); the same procedure runs well on the prod server, a physical server with 64 GB of RAM (about 40 seconds). In pre-prod the execution time is 1 minute 9 seconds, which needs to be reduced to roughly 10 seconds. The execution time also varies from run to run: sometimes 50 seconds, sometimes 1 minute 9 seconds.
    Please advise on the best option/practice for using a WHERE clause to filter the records, and which tool to use to tune the procedure (execution plan, SQL Profiler?). I am using Toad for SQL Server 5.7; I can see an execution plan tab while running the SP, but when I run it, it throws an error. Please help and provide input.
    Thanks,
    Viji

    You've asked a SQL Server question in an Oracle forum.  I'm expecting that this will get locked momentarily when a moderator drops by.
    Microsoft has its own forums for SQL Server; you'll have more luck over there. When you do go there, however, you'll almost certainly get more help if you can pare down the problem (or at least better explain what your code is doing). Very few people want to read hundreds of lines of code, guess what it's supposed to do, guess what is slow, and then guess at how to improve things. Posting query plans, the results of profiling, cutting out any code that is unnecessary to the performance problem, etc. will get you much better answers.
    Justin

  • Profile Performance in LabVIEWvs Performance meter in Vision Assistant: Doesn't match

    Hi everyone,
    I faced a strange problem about performance timing between these two measurements.
    Here is my test
    -I used the built-in example provided by LabVIEW in Vision Assistant (the Bracket example), which uses two pattern matches, one edge detection algorithm and two calipers (one for calculating a midpoint and the other for finding the angle between three points).
    -When I ran the script provided by NI for this in Vision Assistant, it took an average inspection time of 12.45 ms (this varies between 12 and 13 ms; my guess is that the small variation is due to my CPU/processing load).
    -Then I converted the script to a VI and used Profile Performance in LabVIEW, and surprisingly it shows far more than expected, almost ~300 ms (at first I thought it was because of the rotated search etc., but none of that makes sense to me here).
    Now my questions are:
    -Are the algorithms used in both tools the same? (I thought they were.)
    -IMAQ Read Image and Vision Info takes more than 100 ms in LabVIEW, which is not counted in Vision Assistant. Why? (I thought the template image might already be loaded into cache; am I right?)
    -What about IMAQ Read File? (It is not counted in Vision Assistant? In LabVIEW it takes around 15 ms.)
    -Similarly, the pattern match in Vision Assistant takes around 3 ms (also not consistent), while in LabVIEW it takes almost 3 times as long (around 15 ms).
    -Is this a bug, am I missing something, or is this expected?
    Please find attachments below.
    -Vision Assistant-v12-Build 20120605072143
    -Labview-12.0f3
    Thanks
    uday,
    Please mark the solution as accepted if your problem is solved and help the author by clicking on kudos
    Certified LabVIEW Associate Developer (CLAD) Using LV13
    Attachments:
    Performance_test.zip ‏546 KB

    Hmm, Bruce, thanks again for the reply.
    -When I first read your reply I was OK with it, but after reading it multiple times I realized that you didn't check my code and explanation first.
    -I have added the code and screenshots of the profile in both VA and LabVIEW.
    In both Vision Assistant and LabVIEW:
    -I am loading the image only once.
    It is accounted for in LabVIEW but not in VA because the image is already in cache. But what about the time to put the image into the cache?
    I do understand that when we capture the image live from a camera, things are completely different.
    -Loading the template image multiple times?
    This is where I was very confused, because I didn't even think of it. I am well aware of that task.
    -Run Setup Match Pattern once?
    Sorry, so far I haven't seen any example that does pattern matching on multiple images and runs Setup Match Pattern every time. But the time is negligible, so I wouldn't mind.
    -Loading images for processing and loading a different template for each image?
    You are completely mistaken here, and I don't see how it is related to my specific question.
    Briefly explaining again:
    -I open an image both in LabVIEW and VA.
    -I create two pattern match steps and the calipers (negligible).
    -The pattern match step in VA shows a longest time of 4.65 ms, whereas IMAQ Match Pattern showed me 15.6 ms.
    -I am convinced about the IMAQ Read and Vision Info timing, because it only counts in the initial phase when running a multiple-image inspection.
    But I am running it only once, so Vision Assistant should show that time as well, shouldn't it?
    -I do understand that LabVIEW has a lot more features for parallel execution and many other things than Vision Assistant.
    -Yeah, regarding that time of about 100 ms vs 10 ms, I completely agree. I take the Vision Assistant profile timings as the ideal values (correct me if I am wrong).
    -I especially like the last line: you cannot compare the speeds of the two methods.
    Please let me know whether I am thinking about this completely the wrong way, or am at least on the right path.
    Thanks
    uday,
    Please mark the solution as accepted if your problem is solved and help the author by clicking on kudos
    Certified LabVIEW Associate Developer (CLAD) Using LV13

  • How to profile performance on device?

    How is this done? Is there a way to show a border around the areas that are being redrawn, on Android and/or iOS?
    I am using Flash CS5.5.

  • How To Profile Performance Before & After Installation Of A Solution

    Hi.
    We have third party and custom solutions that we want to install into our SharePoint 2010 Farm.
    We want to find out if there are any performance bottlenecks after the installation so we would like to profile our SharePoint 2010 Farm performance and SQL Server 2008 R2 performance before and after.
    How could we achieve this profile for our Farm overall and both SharePoint 2010 and SQL Server 2008 R2?

    Profiling comes in many flavors...
    - Load Capacity => load test... there are some (slightly outdated) load test templates that you can use to slam your SP environment and determine its load capacity
    - Average usage => performance monitors... use the built in performance monitors, or some dedicated systems (like MS System Center) include application-specific capabilities (System Center has Management Packs)
    - Solution Profiling => performance profilers... a bit harder, since it requires isolating the custom components and comparing data... use the SP dev tools, SQL profiles, logs, etc.
    Scott Brickey
    MCTS, MCPD, MCITP
    www.sbrickey.com
    Strategic Data Systems - for all your SharePoint needs

  • How To Perform Lot Split Transactions Using Transaction Open Interface (MTI)

    Can anyone give me some guidance on how to perform lot split transaction using MTI?
    I am using the following code:
    DECLARE
    l_transaction_type_id NUMBER := 83;
    l_transaction_action_id NUMBER := 41;
    l_transaction_source_type_id NUMBER := 13;
    l_org_id NUMBER := 1884;
    l_txn_header_id NUMBER;
    l_txn_if_id1 NUMBER;
    l_txn_if_id2 NUMBER;
    l_txn_if_id3 NUMBER;
    l_parent_id NUMBER;
    l_sysdate DATE;
    l_item_id NUMBER :=287996;
    l_user_id NUMBER;
    l_distribution_account_id NUMBER;
    l_exp_date DATE;
    BEGIN
    --For Lot Merge, there should be only one resultant lot.
    --The transaction_quantity populated in MTI/MTLI should be the entire
    --quantity that is available to transact for the org/sub/item/locator/LPN in
    --that particular lot number.
    --Get transaction_header_id for all the MTIs
    SELECT APPS.mtl_material_transactions_s.NEXTVAL
    INTO l_txn_header_id
    FROM sys.dual;
    --Get transaction_interface_id of resultant record
    SELECT APPS.mtl_material_transactions_s.NEXTVAL
    INTO l_txn_if_id1
    FROM sys.dual;
    l_parent_id := l_txn_if_id1;
    l_sysdate := SYSDATE;
    l_user_id := -1; --substitute with a valid user_id
    l_distribution_account_id := NULL; --needed for lot translate
    l_exp_date := NULL; --set if required
    --Populate the MTI record for resultant record
    INSERT INTO MTL_TRANSACTIONS_INTERFACE (
    transaction_interface_id,
    transaction_header_id,
    Source_Code,
    Source_Line_Id,
    Source_Header_Id,
    Process_flag,
    Transaction_Mode,
    Lock_Flag,
    Inventory_Item_Id,
    revision,
    Organization_id,
    Subinventory_Code,
    Locator_Id,
    Transaction_Type_Id,
    Transaction_Source_Type_Id,
    Transaction_Action_Id,
    Transaction_Quantity,
    Transaction_UOM,
    Primary_Quantity,
    Transaction_Date,
    Last_Update_Date,
    Last_Updated_By,
    Creation_Date,
    Created_By,
    distribution_account_id,
    parent_id,
    transaction_batch_id,
    transaction_batch_seq,
    lpn_id,
    transfer_lpn_id
    )
    VALUES
    (
    l_txn_if_id1, --transaction_interface_id
    l_txn_header_id, --transaction_header_id
    'INV', --source_code
    -1, --source_header_id
    -1, --source_line_id
    1, --process_flag
    3, --transaction_mode
    2, --lock_flag
    l_item_id, --inventory_item_id
    null, --revision
    l_org_id, --organization_id
    'EACH', --subinventory_code
    1198, --locator_id
    l_transaction_type_id, --transaction_type_id
    l_transaction_source_type_id, --transaction_source_type_id
    l_transaction_action_Id, --l_transaction_action_id
    100000, --transaction_quantity
    'EA', --transaction_uom
    100000, --primary_quantity
    l_sysdate, --Transaction_Date
    l_sysdate, --Last_Update_Date
    l_user_id, --Last_Updated_by
    l_sysdate, --Creation_Date
    l_user_id, --Created_by
    l_distribution_account_id, --distribution_account_id
    l_parent_id, --parent_id
    l_txn_header_id, --transaction_batch_id
    2, --transaction_batch_seq
    NULL, --lpn_id (for source MTI)
    NULL --transfer_lpn_id (for resultant MTIs)
    );
    --Insert MTLI corresponding to the resultant MTI record
    INSERT INTO MTL_TRANSACTION_LOTS_INTERFACE(
    transaction_interface_id
    , Source_Code
    , Source_Line_Id
    , Process_Flag
    , Last_Update_Date
    , Last_Updated_By
    , Creation_Date
    , Created_By
    , Lot_Number
    , lot_expiration_date
    , Transaction_Quantity
    , Primary_Quantity
    )
    VALUES (
    l_txn_if_id1 --transaction_interface_id
    , 'INV' --Source_Code
    , -1 --Source_Line_Id
    , 'Y' --Process_Flag
    , l_sysdate --Last_Update_Date
    , l_user_id --Last_Updated_by
    , l_sysdate --Creation_date
    , l_user_id --Created_By
    , 'Q0000.1' --Lot_Number
    , l_exp_date --Lot_Expiration_Date
    , 100000 --transaction_quantity
    , 100000 --primary_quantity
    );
    INSERT INTO MTL_TRANSACTIONS_INTERFACE (
    transaction_interface_id,
    transaction_header_id,
    Source_Code,
    Source_Line_Id,
    Source_Header_Id,
    Process_flag,
    Transaction_Mode,
    Lock_Flag,
    Inventory_Item_Id,
    revision,
    Organization_id,
    Subinventory_Code,
    Locator_Id,
    Transaction_Type_Id,
    Transaction_Source_Type_Id,
    Transaction_Action_Id,
    Transaction_Quantity,
    Transaction_UOM,
    Primary_Quantity,
    Transaction_Date,
    Last_Update_Date,
    Last_Updated_By,
    Creation_Date,
    Created_By,
    distribution_account_id,
    parent_id,
    transaction_batch_id,
    transaction_batch_seq,
    lpn_id,
    transfer_lpn_id
    )
    VALUES
    (
    l_txn_if_id1, --transaction_interface_id
    l_txn_header_id, --transaction_header_id
    'INV', --source_code
    -1, --source_header_id
    -1, --source_line_id
    1, --process_flag
    3, --transaction_mode
    2, --lock_flag
    l_item_id, --inventory_item_id
    null, --revision
    l_org_id, --organization_id
    'EACH', --subinventory_code
    1198, --locator_id
    l_transaction_type_id, --transaction_type_id
    l_transaction_source_type_id, --transaction_source_type_id
    l_transaction_action_Id, --l_transaction_action_id
    100000, --transaction_quantity
    'EA', --transaction_uom
    100000, --primary_quantity
    l_sysdate, --Transaction_Date
    l_sysdate, --Last_Update_Date
    l_user_id, --Last_Updated_by
    l_sysdate, --Creation_Date
    l_user_id, --Created_by
    l_distribution_account_id, --distribution_account_id
    l_parent_id, --parent_id
    l_txn_header_id, --transaction_batch_id
    3, --transaction_batch_seq
    NULL, --lpn_id (for source MTI)
    NULL --transfer_lpn_id (for resultant MTIs)
    );
    --Insert MTLI corresponding to the resultant MTI record
    INSERT INTO MTL_TRANSACTION_LOTS_INTERFACE(
    transaction_interface_id
    , Source_Code
    , Source_Line_Id
    , Process_Flag
    , Last_Update_Date
    , Last_Updated_By
    , Creation_Date
    , Created_By
    , Lot_Number
    , lot_expiration_date
    , Transaction_Quantity
    , Primary_Quantity
    )
    VALUES (
    l_txn_if_id1 --transaction_interface_id
    , 'INV' --Source_Code
    , -1 --Source_Line_Id
    , 'Y' --Process_Flag
    , l_sysdate --Last_Update_Date
    , l_user_id --Last_Updated_by
    , l_sysdate --Creation_date
    , l_user_id --Created_By
    , 'Q0000.1' --Lot_Number
    , l_exp_date --Lot_Expiration_Date
    , 100000 --transaction_quantity
    , 100000 --primary_quantity
    );
    --Get transaction_interface_id of Source record-1
    SELECT APPS.mtl_material_transactions_s.NEXTVAL
    INTO l_txn_if_id2
    FROM sys.dual;
    --Populate the MTI record for Source record-1
    INSERT INTO MTL_TRANSACTIONS_INTERFACE (
    transaction_interface_id,
    transaction_header_id,
    Source_Code,
    Source_Line_Id,
    Source_Header_Id,
    Process_flag,
    Transaction_Mode,
    Lock_Flag,
    Inventory_Item_Id,
    revision,
    Organization_id,
    Subinventory_Code,
    Locator_Id,
    Transaction_Type_Id,
    Transaction_Source_Type_Id,
    Transaction_Action_Id,
    Transaction_Quantity,
    Transaction_UOM,
    Primary_Quantity,
    Transaction_Date,
    Last_Update_Date,
    Last_Updated_By,
    Creation_Date,
    Created_By,
    distribution_account_id,
    parent_id,
    transaction_batch_id,
    transaction_batch_seq,
    lpn_id,
    transfer_lpn_id
    )
    VALUES
    (
    l_txn_if_id2, --transaction_interface_id
    l_txn_header_id, --transaction_header_id
    'INV', --source_code
    -1, --source_header_id
    -1, --source_line_id
    1, --process_flag
    3, --transaction_mode
    2, --lock_flag
    l_item_id, --inventory_item_id
    null, --revision
    l_org_id, --organization_id
    'EACH', --subinventory_code
    1198, --locator_id
    l_transaction_type_id, --transaction_type_id
    l_transaction_source_type_id, --transaction_source_type_id
    l_transaction_action_Id, --transaction_action_id
    -200000, --transaction_quantity
    'EA', --transaction_uom
    -200000, --primary_quantity
    l_sysdate, --Transaction_Date
    l_sysdate, --Last_Update_Date
    l_user_id, --Last_Updated_by
    l_sysdate, --Creation_Date
    l_user_id, --Created_by
    l_distribution_account_id, --distribution_account_id
    l_parent_id, --parent_id
    l_txn_header_id, --transaction_batch_id
    1, --transaction_batch_seq
    NULL, --lpn_id (for source MTI)
    NULL --transfer_lpn_id (for source MTI)
    );
    --Insert MTLI corresponding to the Source record-1
    INSERT INTO MTL_TRANSACTION_LOTS_INTERFACE(
    transaction_interface_id
    , Source_Code
    , Source_Line_Id
    , Process_Flag
    , Last_Update_Date
    , Last_Updated_By
    , Creation_Date
    , Created_By
    , Lot_Number
    , lot_expiration_date
    , Transaction_Quantity
    , Primary_Quantity
    )
    VALUES (
    l_txn_if_id2 --transaction_interface_id
    , 'INV' --Source_Code
    , -1 --Source_Line_Id
    , 'Y' --Process_Flag
    , l_sysdate --Last_Update_Date
    , l_user_id --Last_Updated_by
    , l_sysdate --Creation_date
    , l_user_id --Created_By
    , 'Q0000' --Lot_Number
    , l_exp_date --Lot_Expiration_Date
    , -200000 --transaction_quantity
    , -200000 --primary_quantity
    );
    END;

    The first MTI record should be the source record, i.e. it should have a negative transaction quantity.
    The new set of MTI records should have positive transaction quantities.
    Also ensure that the sum of the transaction quantities for the set is 0.
    What is the error that you are getting?
    Thanks,
    Hrishi.

  • How can i use dlls for running rtc3?

    How can I use the DLL libraries with the rtc3 software?

    amolchoudhary wrote:
    can anyone please help me???
    How? You provide almost no useful information.
    What is rtc3?
    Why do you think you have to use DLLs?
    What did you try so far that hasn't worked?
    What is it you want to accomplish?
    You see, what you ask here is about the same as if someone asked you to tell them which screw they need to fix their car.
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions

  • On my Macbook pro 15 2011, System Profiler is suggesting that it uses SATA III for the hard drive and SATA II for the Optical Drive.  Is that true?

    On my Macbook pro 15, 2011, System Profiler is suggesting that it uses SATA III for the hard drive and SATA II for the Optical Drive.  Is that true?

    That is correct. The tech specs indicate:
    Hard Drive Interface
    6.0 Gbps Serial ATA (SATA)
    Optical Drive Interface
    3.0 or 6.0 Gbps Serial ATA (SATA)

  • How to use dll function in JDeveloper?

    I want to use a function in a DLL to develop my application, but I can't find how to do that in the JDeveloper documentation.

    You have to use the Java Native Interface (JNI) to use DLLs. The JNI
    tutorial at http://java.sun.com/docs/books/tutorial/native1.1/
    walks you through the basics and provides a sample too.
    Hope this helps.
    Regards,
    yinjun (guest) wrote:
    : I want to use a function in a DLL to develop my application, but
    : I can't find how to do that in the JDeveloper documentation. Who can
    : give me an answer? Thanks a lot in advance.

  • How to use dll in Labview ?

    I compiled this code to a DLL with VC++ 2010; the file name is test_dll.dll.
    #include "stdafx.h"
    #include <iostream>
    #include <Windows.h>
    using namespace std;
    int main(int a){
        cout << "Test dll...............\n";
        return a;
    }
    After that, I put a Call Library Function Node on the diagram and double-clicked it. I browsed to test_dll.dll in "Library name or path" and set the function prototype to int32_t main(int32_t a); but it shows the error: Call Library Function Node 'test_dll.dll:main': function not found in library. How do I use a DLL in LabVIEW? And I have one more question: what is the difference between Tools -> Import -> Shared Library (.dll) and using a Call Library Function Node?
    Solved!
    Go to Solution.

    The issue you are having is that LabVIEW is not capable of using C++ DLLs directly; it only handles C DLLs. This does not mean that you cannot use a DLL that was compiled with the C++ compiler as opposed to the C compiler. Rather, it means that you must take extra steps in order to use it from LabVIEW. The primary issue is that of name mangling or adornment, which is discussed here: http://zone.ni.com/devzone/cda/tut/p/id/4877. Basically you need to prepend extern "C" in front of your prototypes in your header files. I would also suggest reviewing this article: https://decibel.ni.com/content/docs/DOC-14564.
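    As a minimal sketch of that advice (the function name add_numbers is made up for illustration, not taken from the original post): build the DLL with the C++ compiler, but declare the exported function extern "C" so its name is not mangled and the Call Library Function Node can find it by name.

    // test_dll.cpp -- built as a DLL with VC++.
    // extern "C" disables C++ name mangling so the symbol is exported under its
    // plain name; __declspec(dllexport) makes it visible to callers such as LabVIEW.
    extern "C" __declspec(dllexport) int add_numbers(int a, int b)
    {
        return a + b;
    }

    In the Call Library Function Node the prototype would then be configured as something like int32_t add_numbers(int32_t a, int32_t b);.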

  • How to use .dll compiled in Delphi in Java?

    Can anyone provide me with information on how to use a .dll compiled in Delphi from Java? The Delphi .dll may be a non-OOP program.

    Hi
    If you want to call anything written in Pascal, simply write JNI code in C++ and call the functions from the Delphi DLL, but remember that you may need to change the order of parameters etc. when you call it. I had exactly the same problem; it is possible, and we succeeded.
    You will need an exact description of the input and output parameters required by the Pascal functions. Then construct objects in C++ that look exactly the same in memory as they would if you were using Pascal. The Pascal function will then interpret that area of memory as a known structure and will run.
    Maciek
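    As a rough sketch of that approach (all names here -- legacy.dll, AddValues, LegacyBridge -- are hypothetical, not from the original post): the C++ JNI layer loads the Delphi DLL, resolves the exported function, and forwards the call; the typedef must match the Delphi declaration's calling convention and parameter order exactly.

    // legacy_bridge.cpp -- hypothetical JNI bridge forwarding a call to a Delphi DLL.
    // Implements the native method `int addValues(int, int)` of a Java class LegacyBridge.
    #include <jni.h>
    #include <windows.h>

    // Assumed to match a Delphi export declared as:
    //   function AddValues(a, b: Integer): Integer; stdcall;
    typedef int(__stdcall *AddValuesFn)(int, int);

    extern "C" JNIEXPORT jint JNICALL
    Java_LegacyBridge_addValues(JNIEnv *env, jclass, jint a, jint b)
    {
        static HMODULE dll = LoadLibraryA("legacy.dll");
        static AddValuesFn fn =
            dll ? reinterpret_cast<AddValuesFn>(GetProcAddress(dll, "AddValues")) : nullptr;
        if (!fn)
            return 0;  // a real bridge should throw a Java exception here instead
        return fn(a, b);  // parameter order and layout must match the Delphi side
    }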

  • How to Use DLL Function in Java Applet

    Hi all,
    I have been assigned a task to develop a Java applet. The problem is, I need to use DLL functions in my applet to read records. Could anyone please guide me on how to interface a DLL with an applet to read records, with sample code?
    I'm using JDK 1.5.0_06 on Windows XP. Thanks.
    best rgds,
    jpdbay

    You will need to use the Java Native Interface (JNI). There are many posts on the subject, please search; this is the documentation:
    http://java.sun.com/j2se/1.5.0/docs/guide/jni/spec/jniTOC.html

  • Using DLL functions created with LabVIEW 6i in CVI 4.0.1

    I want to use SQL functions from LabVIEW 6i in LabWindows/CVI 4.0.1. Building a DLL with these LabVIEW functions was successful, but after using the DLL in a CVI project and running it, this error message was generated: FATAL ERROR: LABVIEW.LIB was not called from a LabVIEW process. Can you help me? The attachment contains the VIs and the build script for this DLL.
    Attachments:
    DLL.ZIP ‏24 KB

    The SQL Toolkit appears to be an ActiveX program. Why not call the objects with ActiveX in CVI directly?
    There are several hits for labview.lib on NI's site. Go to http://search.ni.com/?col=alldocs&layout=TechResources&ql=a
    and search for labview.lib.

  • Spooling out profile that are not being used

    hi guys,
    is there any way to spool out the profiles that are not in use or assigned to any username or role?

    flaskvacuum wrote:
    hi guys,
    is there any way to spool out the profiles that are not in use or assigned to any username or role?
    SQL> CREATE PROFILE test LIMIT
      2     FAILED_LOGIN_ATTEMPTS 5
      3     PASSWORD_LIFE_TIME 60
      4     PASSWORD_REUSE_TIME 60
      5     PASSWORD_REUSE_MAX 5
      6     PASSWORD_VERIFY_FUNCTION null
      7     PASSWORD_LOCK_TIME 1/24
      8     PASSWORD_GRACE_TIME 10;
    Profile created.
    -- Created but not assigned --
    SQL> select distinct profile from dba_profiles where profile not in (select profile from dba_users);
    PROFILE
    TEST
    SQL>

Maybe you are looking for

  • Getting error while trying to refresh EUL using adrfseul.sh

    I am getting the following error while trying to refresh EUL. I am using Discoverer 10.2.0.2 with Oracle Applications 11.5.10.2 and following note id 313418.1 of metalink. You are running adrfseul, version 115.0 Start of adrfseul session Date/time is

  • Audit Log Report generating an "Out of Memory" error message.

    Greetings. We are a new IDM customer. We are running IDM 6.0 with an Oracle database. We are now getting the following error message when we run the IDM Audit Log Report for Today's Activities: "java.lang.OutOfMemoryError". How do we increase the mem

  • IPOD Help~(cannot view anything except a white blank screen)

    Alright...I never expected to post up something in this discussion thread this soon...since I got my 30gig iPOD just three days ago...and curious as I am to things...I think I broke it, hah~ Anyways...I need some help. Here's the story. I was updatin

  • Oracle 10g - Data Pump: Export / Import of Sequences ?

    Hello, I'm new to this forum and also to Oracle (Version 10g). Since I could not find an answer to my question, I open this post in hoping to get some help from the experienced users. My question concerns the Data Pump Utility and what happens to seq

  • Emergent!! Two alternative material number for one same real material???

    Hi Experts In our company,for some specail requirement,we want to have two material numbers for one same real material,that means this two materials are substitute for each other. In every transaction, we can use this both two material numbers. For e