Parallel execution of MATLAB scripts in LabVIEW

Hi All
I have an application where I am using a MATLAB program to simulate a 3D image for vibration data. In it we use two MATLAB script calls in two instances which are supposed to run in parallel.
The issue observed is that the MATLAB scripts I run in parallel do not work simultaneously; only one of them runs at any time. When I call the MATLAB script a second time, the script that is already executing stops, and the second instance executes instead.
What do I have to do to execute two MATLAB scripts simultaneously in LabVIEW?
Regards
Devender T

I'm not a MATLAB expert, but as far as I know MATLAB doesn't allow you to run scripts in parallel, so this is a MATLAB limitation, not a LabVIEW one. You could look into using the Parallel Computing Toolbox for MATLAB, though I think it is intended to target multiple cores rather than multiple threads.
You may need to pursue this in the MATLAB forums.

Similar Messages

  • Parallel execution of multiple scripts

    Hi All,
    I understand that by using the parallel hint we can achieve parallel execution of queries (insert/update/delete).
    For today's question I would like to know whether parallelism can be achieved for the following scenario.
    I have a script with an insert-select statement, multiple merge statements and a few update statements, all on the same table.
    I have to run this script 12 times on the same table, once for each month of the year.
    Currently we run it for Jan (where record_date = '201001'), commit, then run it for Feb, and so on.
    Can all 12 months be run in parallel? One way I can think of is to create 12 different scripts and kick them off by opening 12 different SQL*Plus sessions (all on the same table).
    Is there a better way of doing this?
    Note: each month's data will not affect the other months' data.
    Regards,
    Aj

    Creating 12 different scripts would be a sub-optimal solution and a maintenance nightmare. Creating one parametrized script and calling it 12 times with different parameters would be slightly better.
    Creating one stored procedure with parameters and running that procedure from the scheduler 12 times would be better still, as everything then runs on the server side only.
    Your description is deliberately vague, so the possibility of deadlock cannot be excluded.
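    For example, a minimal sketch (the procedure name and parameter are made up; the real statements go inside it):
    CREATE OR REPLACE PROCEDURE load_month (p_record_date IN VARCHAR2) AS
    BEGIN
      -- the insert-select, merge and update statements go here,
      -- filtered on p_record_date instead of a hardcoded '201001'
      NULL;
    END;
    /
    BEGIN
      FOR i IN 1 .. 12 LOOP
        DBMS_SCHEDULER.CREATE_JOB(
          job_name   => 'LOAD_MONTH_' || TO_CHAR(i, 'FM00'),
          job_type   => 'PLSQL_BLOCK',
          job_action => 'BEGIN load_month(''2010' || TO_CHAR(i, 'FM00') || '''); END;',
          start_date => SYSTIMESTAMP,
          enabled    => TRUE);
      END LOOP;
    END;
    /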
    Sybrand Bakker
    Senior Oracle DBA

  • How to open a .MAT file in LabVIEW

    Hey everyone,
    I currently have a .MAT image file which I would like to open in LabVIEW to apply some image filters. However, I am unsure how to open the .MAT file in LabVIEW. Ideally I would like to be able to open the .MAT file without using MATLAB, and it seems the only way to accomplish this is through the MathScript node. I have attached an example of the image I'm trying to open, as well as the LabVIEW program which I would like to use on the image. Thanks for the feedback! =]
    Attachments:
    MAT_Image_Adjust.zip ‏1779 KB

    Hi Boiler,
    1) Do you have a choice in the format you export your data from MATLAB?
    "ASCII Format
    Complete the following steps if you want to import or export data between LabVIEW and the MATLAB® environment, the process is straightforward as long as you are using ASCII format.
    From the MATLAB® environment to LabVIEW
    To save a vector or a matrix Xin ASCII format with tab delimiter, enter the following in the command window or m-script file in the MATLAB® environment:   
    >>SAVE filename X -ascii -double -tabs
        This creates a file whose name is filename, and it contains the data X in ASCII format with a tab delimiter.
    Import the file into LabVIEW using the Read From Spreadsheet File VI located on the Programming»File I/O palette.
    2) Have you tried using the mathscript node? Did you get any errors?
    "Binary Format
    Complete the following steps if you want to import or export data between LabVIEW and MATLAB®.
    From the MATLAB® environment to LabVIEW
    To read a .mat file in LabVIEW would require a VI to parse the file. This may be easier if each variable is saved to a separate file.
    " -- this was done here, no ideas if it still works,
    I want to read a Matlab MAT file into labview
    Hope this helps, James
    Kind Regards
    James Hillman
    Applications Engineer 2008 to 2009 National Instruments UK & Ireland
    Loughborough University UK - 2006 to 2011
    Remember Kudos those who help!

  • How to run multiple CodedUI Ordered Tests over multiple Test Agents for parallel execution using Test Controller

    We are using VS 2013, and I need to run multiple Coded UI ordered tests in parallel on different agents.
    My requirement:
    Example: I have 40 Coded UI test scripts in a single solution/project. I want to run them in different OS environments (for example, 5 OSes). I have created 5 ordered tests with the same 40 test cases.
    I have one controller machine and 5 test agent machines. Now I want my tests to be distributed in such a way that every agent gets one ordered test to execute.
    Machine_C = Controller (Controls Machine_1,2,3,4,5)
    Machine_1 = Test Agent 1 (Should execute Ordered Test 1 (ex: OS - WIN 7))
    Machine_2 = Test Agent 2 (Should execute Ordered Test 2 (ex: OS - WIN 8))
    Machine_3 = Test Agent 3 (Should execute Ordered Test 3 (ex: OS - WIN 2008 server))
    Machine_4 = Test Agent 4 (Should execute Ordered Test 4 (ex: OS - WIN 2012 server))
    Machine_5 = Test Agent 5 (Should execute Ordered Test 5 (ex: OS - WIN 2003 server))
    I have changed the “MinimumTestsPerAgent” app setting value to '1' in the controller’s configuration file (QTController.exe.config).
    When I run the ordered tests from Test Explorer, all test agents show each ordered test with status 'running', but only 2 of the 5 agents actually execute test cases; the other 3 agents do not execute anything, yet their status remains 'running' for a long time (more than 3 hours), after which they stop responding.
    I need to know how I can configure my controller, or how I can tell it to execute these tests in parallel on different test agents. This will help me reduce the script execution time.
    I am not sure what steps I am missing.
    It would be of great help if someone could guide me on how this can be achieved.
    --> One more thing: can I run one Coded UI ordered test on one specific test agent?
    ex: I need to run Ordered Test 1 on the Win 7 OS (Test Agent 1) only.
    Thanks in advance.

    Hi Divakar,
    Thank you for posting in MSDN forum.
    As far as I know, we cannot specify which test agent a Coded UI ordered test runs on; it is mainly the test controller that determines which Coded UI ordered test is assigned to which test agent.
    Generally, if we want to run multiple Coded UI ordered tests over multiple test agents in parallel using a test controller,
    we will need to change the MinimumTestsPerAgent property to 1 in the test controller configuration file (QTController.exe.config), as you said.
    And then we will need to set the bucketSize to the number of tests / number of machines in the test settings.
    For more information about how to set this bucketSize value, please refer the following blog.
    http://blogs.msdn.com/b/aseemb/archive/2010/08/11/how-to-run-automated-tests-on-different-machines-in-parallel.aspx
    You can refer to Jack's suggestion to run your Coded UI ordered tests in a lab environment or a load test:
    https://social.msdn.microsoft.com/Forums/vstudio/en-US/661e73da-5a08-4c9b-8e5a-fc08c5962783/run-different-codedui-tests-simultaneously-on-different-test-agents-from-a-single-test-controller?forum=vstest
    Best Regards,

  • Parallel Execution against Normal Execution

    Hi,
    Can someone explain why in this case serial execution is faster even though parallel DML is enabled at session level? What are the possibilities for improving the speed of parallel execution? I need to populate data for 139 stores, with around half a million rows for each store, and I will have to run a cursor over the distinct store values to achieve this. Just to test, I got the following results. Can anyone tell me where I am going wrong?
    This is the Script
    set timing on
    insert into r8_win_store
    select * from win_store@rms8 where store=99;
    commit;
    alter session enable parallel dml;
    insert /*+ Parallel(t,8) */ into r8_win_store t
    select /*+ parallel(e,8)*/ * from win_store@rms8 e where e.store=99;
    commit;
    alter session disable parallel dml;
    Output
    SQL> @test1.sql
    299666 rows created.
    Elapsed: 00:03:48.12
    Commit complete.
    Elapsed: 00:00:00.01
    Session altered.
    Elapsed: 00:00:00.00
    299666 rows created.
    Elapsed: 00:08:02.81
    Commit complete.
    Elapsed: 00:00:01.31
    Session altered.
    Elapsed: 00:00:00.00

    Parallel processing in Oracle is intended to reduce I/O latency. When you tell the kernel to do an I/O, you need to wait for that call to complete - disk platters to spin, disk controller heads to move, etc.
    If you need to make a bunch of I/O calls after one another, you will spend up to 90% of the elapsed time waiting for that I/O to actually put the data into your buffer to process.
    Now imagine doing a million I/Os, with 90% of the time spent waiting for the I/O to complete. A big performance knock, and a frustrating one, as you cannot make the actual I/O any faster. It is simply slow. It is the slowest operation you can do on a computer, and you're doing a million of them.
    Parallel processing in Oracle addresses this problem. There can be sufficient I/O bandwidth to make 100 I/O calls per second. But a single process that does an I/O call, waits, processes, then does the next I/O call can only reach a speed of, say, 10 I/O calls per second.
    Which means 90% of the I/O pipe is free to do I/O in parallel. So instead of a single process doing that million I/Os (using 10% of the I/O bandwidth/throughput), Oracle spawns 10 processes each doing 100K I/Os, thus making better use of the I/O throughput.
    So Oracle PQ is useless if you scan small volumes of data; it is intended for large volumes of data. There are overheads in PQ, as the parallel processes have to be coordinated in order to work together. When each only needs to do 10 I/Os, the time spent on coordination alone can be more than the time it would have taken to simply do 100 I/Os with a single process.
    It is also a fallacy that the number of CPUs determines how many parallel processes you can start. The real determining factor is the load you can put on your I/O subsystem.
    Bottom line is that PQ is nothing "special" or "magic". It is simply a method to reduce I/O latency by performing parallel I/O. And it is only sensible to use it when the amount of I/O to be done warrants parallel processing.
    Oh yeah - and the CBO is very capable of deciding when to use PQ and when not. So rather than forcing PQ down the CBO's throat using the PARALLEL hint and hardcoding the degrees and instances, you should let it make those decisions (as informed decisions) itself. Which means using the PARALLEL clause on tables and not as SQL hints. The DBA can easily tune that by altering the PARALLEL clause of a table. The DBA cannot by any means do the same thing when dealing with SQL that insists on a specific number of PQ processes on a specific number of instances.
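    As a rough sketch using the table from your own test (and assuming parallel DML stays enabled at session level, as in your script), that looks like this:
    alter table r8_win_store parallel 8;
    alter session enable parallel dml;
    -- no PARALLEL hints needed any more; the CBO decides based on the table's degree
    insert into r8_win_store
    select * from win_store@rms8 where store = 99;
    commit;
    -- and the DBA can tune or remove the degree at any time
    alter table r8_win_store noparallel;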

  • ORA-12842: Cursor invalidated during parallel execution

    Hi,
    Database version: 9.2.0.6.0
    OS : Red Hat AS 3.
    I encountered this problem lately in one of our scripts.
    The error message shows:
    BEGIN SP_RPT77B_V2(SYSDATE -1); END;
    ERROR at line 1:
    ORA-12842: Cursor invalidated during parallel execution
    ORA-06512: at "REPORTADMIN.SP_RPT77B_V2", line 273
    ORA-06512: at line 1
    Elapsed: 00:02:49.60
    Does anyone have any clue what this error message means? Is there a way to rectify the problem?
    Any advice is appreciated, thanks.

    Hi!
    Check the error description --
    ORA-12842 schema modified during parallel execution
    Cause: Schema modified during the parse phase of parallel processing.
    Action: No action required.
    And the other error is --
    ORA-06512 at string line string
    Cause: Backtrace message as the stack is unwound by unhandled exceptions.
    Action: Fix the problem causing the exception or write an exception handler for this condition. Or you may need to contact your application administrator or database administrator.
    Regards.
    Satyaki De.

  • View parallel executions from custom OI

    Hello
    TestStand 3.5
    LabView 8.2
    The machine I'm controlling can conduct two tests at the same time, and I'm therefore using the parallel model.
    I've customized the full-featured OI that ships with TestStand so that I have an execution view for each of my test sockets.
    Here is the problem: I can't seem to initialize the execution view managers properly. The execution is not shown in the OI.
    I have modified "Configure ExecutionView Manager.vi", "Configure Event Callbacks.vi" and "DisplayExecution Event Callback.vi" to handle another execution view manager according to the guidelines found elsewhere on this forum.
    If I'm using the sequential model the steps are shown perfectly (only one of the exec managers can show the sequence at a time, but I can switch between them). Using the parallel model nothing is shown.
    Even if I use the simple OI I can't see the sequence for either of my two parallel executions.
    What am I missing?

    Okay... I'm not quite sure what was wrong but it's working perfectly now.

  • Very slow parallel execution

    Hello every smart heads.
    I’m using LabView 7.1, traditional NI-DAQ, Windows XP
    I have two different VIs. Both of them work in a while loop. The first one acquires data from an analogue input, and the other one measures the period of an input TTL signal. When I run the analogue acquisition alone, it runs very fast; the processor is loaded up to 80%.
    But when I run the period measurement in parallel with the analogue measurement, execution of the analogue acquisition is very slow, yet the processor is loaded only up to about 10%. I would expect that running another program would load the processor more!?
    I tried to change the delay times in those loops, but it made no difference. I've tried to change the priority and execution settings in the VI properties, but still without success.
    Can somebody help me optimize the parallel execution?

    Ok, I've found the reason.
    I think (I'm almost sure) that when I read the count from the buffer and the time limit of Counter Read Buffer.vi has not been exceeded yet, acquisition from the analogue inputs is not executed; the other VI waits until Counter Read Buffer.vi finishes.
    I thought that execution should be parallel. Why is it like this? I can solve the problem if I make the time limit very small. However, I need the timeout indication (code 10800).
    Any Ideas ?
    Attachments:
    Measure Buffered Period (DAQ-STC).llb ‏296 KB
    measurePmode.llb ‏31 KB

  • 11g Parallel Execution on AIX 6 - SMT Enabled or Disabled?

    Greetings,
    I've had no luck searching for an answer to this question and I'm hoping someone can answer it:
    Can Oracle 11g parallel execution spread the "granules" of parallelism across the threads (logical CPUs) in an AIX SMT-enabled environment, or should SMT be disabled and the "granules" be spread across the processors (virtual CPUs)? The application is a data warehouse in a non-RAC configuration on a p570 server. From what I've read, the server must be SMP for Oracle's parallel execution capabilities to be maximized, but I think all AIX servers are SMP (I'm not sure whether the server needs to be ordered as an SMP server). I'm mostly concerned with the data load processing at this point, and not so much with queries right now. I believe AIX 6.1 can enable/disable SMT dynamically. So would it make sense to disable it during data loads, and enable it for DSS queries?
    Hope the question makes sense. Thanks for any help in advance!

    SMT will determine Oracle's cpu_count parameter.
    However, this is a static parameter.
    So it won't work, and it might even be dangerous to change it on the fly.
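    To see what the instance has actually derived from the hardware, a quick check (just a sketch) is:
    show parameter cpu_count
    show parameter parallel_max_servers
    select name, value, issys_modifiable
    from   v$parameter
    where  name in ('cpu_count', 'parallel_max_servers');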
    Sybrand Bakker
    Senior Oracle DBA
    Experts: those who did read documentation.

  • How can I make the execution of my script faster

    Hi everyone
    How can I make the execution of my script faster? It takes a lot of time to execute. The following is my script:
    DECLARE
    CURSOR C1 IS
    SELECT A.ITEM_CODE,A.STORE_CODE,ST_UNIT,SA_UNIT,CART_QTY,QUANTITY_ON_HAND
    FROM PROJ.IM_LOCATION A
    WHERE A.ITEM_CODE BETWEEN :ITEM_FRM AND :ITEM_TO
    AND A.STORE_CODE = :FRM_STORE
    ORDER BY
    A.STORE_CODE ;
    CURSOR C2 IS
    SELECT A.ITEM_CODE,A.STORE_CODE,ST_UNIT,SA_UNIT,CART_QTY,QUANTITY_ON_HAND
    FROM PROJ.IM_LOCATION A
    WHERE A.ITEM_CODE BETWEEN :ITEM_FRM AND :ITEM_TO
    AND A.STORE_CODE = :FRM_STORE
    ORDER BY
    A.STORE_CODE ;
    big_syb_qty_issue number(12,3);
    small_syb_qty_issue number(12,3);
    big_issue number(12,3);
    small_issue number(12,3);
    item_syb_code varchar2(30);
    big_syb_qty_rec number(12,3);
    small_syb_qty_rec number(12,3);
    BI_SUPP_REC number(12,3);
    SM_SUPP_REC number(12,3);
    big_syb_qty_ADJ number(12,3);
    small_syb_qty_ADJ number(12,3);
    big_ADJ number(12,3);
    small_ADJ number(12,3);
    big_syb_qty_rec_iner number(12,3);
    small_syb_qty_rec_iner number(12,3);
    BI_INTER number(12,3);
    SM_INTER number(12,3);
    cl_big_qty number(12,3);
    cl_small_qty number(12,3);
    cl_big_qty1 number(12,3);
    cl_small_qty1 number(12,3);
    BIG_QTY_OPEN number(12,3);
    SMALL_QTY_OPEN number(12,3);
    BEGIN
         IF ((:FRM_STORE IS NULL) OR (:ITEM_FRM IS NULL) OR (:ITEM_TO IS NULL) OR (:DATE_FRM IS NULL)) THEN
              SHOW_MESSAGE('You Should Enter the Parameters Values Correctly Please try again !!!! ');
              RAISE FORM_TRIGGER_FAILURE;
              GO_FIELD('DATE_FRM');
         END IF;     
    DELETE FROM STOCK_AT_DATE_REP2;
    COMMIT;
    cl_big_qty := 0;
    cl_small_qty := 0;
    -- MESSAGE('Please Wait The System Calculating The Transactions !!!');
    SET_APPLICATION_PROPERTY(CURSOR_STYLE,'BUSY');
    FOR R IN C1
    LOOP
    cl_big_qty := R.CART_QTY ;
    cl_small_qty := R.QUANTITY_ON_HAND;
    -- Transerfer Data 1
    BEGIN
    SELECT B.ITEM_CODE,SUM(NVL(CART_QTY,0)) ,SUM(NVL(ITEM_QUANTITY,0))
    INTO item_syb_code,big_syb_qty_issue,small_syb_qty_issue
    FROM IM_TRANS_ISSUE_HEADER A,IM_TRANS_ISSUE_DETAILS B
    WHERE A.DOC_CODE = B.DOC_CODE
    AND B.DEL_STORE = R.STORE_CODE
    AND ITEM_CODE = R.ITEM_CODE
    AND DOC_DATE > :DATE_TO
    GROUP BY
    B.ITEM_CODE;
    -- SHOW_MESSAGE('ISSUED BIG'||' '||big_syb_qty_issue);
    exception
    when no_data_found then big_syb_qty_issue := 0;
    small_syb_qty_issue := 0;
    when others then MESSAGE(10,sqlerrm);
    END ;
    -- Goods Received Data From Supplier 1
    BEGIN
    SELECT B.ITEM_CODE,SUM(B.CART_QTY),SUM(ITEM_QUANTITY)
    INTO item_syb_code,big_syb_qty_rec,small_syb_qty_rec
    FROM IM_GOODS_RECIEVE_HEADER A,IM_GOODS_RECIEVE_DETAILS B
    WHERE A.DOC_CODE = B.DOC_CODE
    AND STORE_CODE = R.STORE_CODE
    AND ITEM_CODE = R.ITEM_CODE
    AND DOC_DATE > :DATE_TO
    GROUP BY
    B.ITEM_CODE;
    -- SHOW_MESSAGE('RECEIVED FROM SUPPLIER BIG'||' '||big_syb_qty_rec);
    exception
    when no_data_found then big_syb_qty_rec := 0;
    small_syb_qty_rec := 0;
    when others then message(10,sqlerrm);
    END ;
    -- Adjustement Data 1
    BEGIN
    SELECT B.ITEM_CODE ,SUM(NVL(CART_QTY,0)) ADJUST_QTY,SUM(NVL(ITEM_QUANTITY,0))
    INTO item_syb_code,big_syb_qty_ADJ,small_syb_qty_ADJ
    FROM IM_ADJUST_HEADER A,IM_ADJUST_DETAILS B
    WHERE A.DOC_CODE = B.DOC_CODE
    AND B.STORE_CODE = R.STORE_CODE
    AND ITEM_CODE = R.ITEM_CODE
    AND A.DOC_DATE > :DATE_TO
    GROUP BY
    B.ITEM_CODE;
    -- SHOW_MESSAGE('Adjust BIG'||' '||big_syb_qty_ADJ);
    exception
    when no_data_found then big_syb_qty_ADJ := 0;
    small_syb_qty_ADJ := 0;
    when others then message(10,sqlerrm);
    END ;
    -- Goods Received Data From Stores 1
    BEGIN
    SELECT B.ITEM_CODE,SUM(B.CART_QTY),SUM(ITEM_QUANTITY)
    INTO item_syb_code,big_syb_qty_rec_iner,small_syb_qty_rec_iner
    FROM IM_TRANS_REC_HEADER A,IM_TRANS_REC_DETAILS B
    WHERE A.DOC_CODE = B.DOC_CODE
    AND REC_STORE = R.STORE_CODE
    AND ITEM_CODE = R.ITEM_CODE
    AND DOC_DATE > :DATE_TO
    GROUP BY
    B.ITEM_CODE;
    -- show_message('here');
    -- SHOW_MESSAGE('Received From Stores BIG'||' '||big_syb_qty_rec_iner);
    exception
    when no_data_found then
    big_syb_qty_rec_iner := 0;
    small_syb_qty_rec_iner := 0;
    when others then message(10,sqlerrm);
    END ;
    cl_big_qty := (NVL(cl_big_qty,0) + NVL(big_syb_qty_issue,0));
    cl_big_qty := (NVL(cl_big_qty,0) - NVL(big_syb_qty_rec,0));
    cl_big_qty := (NVL(cl_big_qty,0) - NVL(big_syb_qty_rec_iner,0));
    big_syb_qty_ADJ := -1 * NVL(big_syb_qty_ADJ,0);
    cl_big_qty := (NVL(cl_big_qty,0) + NVL(big_syb_qty_ADJ,0));
    -- srw.message(2000,'cl_small_qty'||cl_small_qty);
    cl_small_qty := (NVL(cl_small_qty,0) + NVL(small_syb_qty_issue,0));
    cl_small_qty := (NVL(cl_small_qty,0) - NVL(small_syb_qty_rec,0));
    cl_small_qty := (NVL(cl_small_qty,0) - NVL(small_syb_qty_rec_iner,0));
    small_syb_qty_ADJ := -1 * NVL(small_syb_qty_ADJ,0);
    cl_small_qty := (NVL(cl_small_qty,0) + NVL(small_syb_qty_ADJ,0));
    -- srw.message(2000,'cl_small_qty'||cl_small_qty); srw.message(2000,'cl_small_qty'||cl_small_qty);
    INSERT INTO STOCK_AT_DATE_REP2
    VALUES(R.STORE_CODE,R.ITEM_CODE,cl_big_qty,cl_small_qty,R.ST_UNIT,R.SA_UNIT,:DATE_FRM,
    :DATE_TO,0,0,0,0,0,0,0,0,cl_big_qty,cl_small_qty);
    cl_big_qty := 0;
    cl_small_qty := 0;
    END LOOP;
    COMMIT;
    FOR R IN C2
    LOOP
    -- Transerfer Data 2
    BEGIN
    SELECT B.ITEM_CODE,SUM(NVL(CART_QTY,0)) ,SUM(NVL(ITEM_QUANTITY,0))
    INTO item_syb_code,big_issue,small_issue
    FROM IM_TRANS_ISSUE_HEADER A,IM_TRANS_ISSUE_DETAILS B
    WHERE A.DOC_CODE = B.DOC_CODE
    AND B.DEL_STORE = R.STORE_CODE
    AND ITEM_CODE = R.ITEM_CODE
    AND DOC_DATE BETWEEN :DATE_FRM AND :DATE_TO
    GROUP BY
    B.ITEM_CODE;
    -- SHOW_MESSAGE('ISSUED BIG'||' '||big_syb_qty_issue);
    exception
    when no_data_found then
    big_issue := 0;
    small_issue := 0;
    when others then MESSAGE(10,sqlerrm);
    END ;
    -- Goods Received Data From Supplier 2
    BEGIN
    SELECT B.ITEM_CODE,SUM(NVL(B.CART_QTY,0)),SUM(NVL(ITEM_QUANTITY,0))
    INTO item_syb_code,BI_SUPP_REC,SM_SUPP_REC
    FROM IM_GOODS_RECIEVE_HEADER A,IM_GOODS_RECIEVE_DETAILS B
    WHERE A.DOC_CODE = B.DOC_CODE
    AND STORE_CODE = R.STORE_CODE
    AND ITEM_CODE = R.ITEM_CODE
    AND DOC_DATE BETWEEN :DATE_FRM AND :DATE_TO
    GROUP BY
    B.ITEM_CODE;
    -- SHOW_MESSAGE('1- SM_SUPP_REC '||' '||SM_SUPP_REC );
    -- SHOW_MESSAGE('RECEIVED FROM SUPPLIER BIG'||' '||big_syb_qty_rec);
    exception
    when no_data_found then
    BI_SUPP_REC := 0;
    SM_SUPP_REC := 0;
    when others then message(10,sqlerrm);
    END ;
    -- Adjustement Data 2
    BEGIN
    SELECT B.ITEM_CODE ,SUM(NVL(CART_QTY,0)) ADJUST_QTY,SUM(NVL(ITEM_QUANTITY,0))
    INTO item_syb_code,big_ADJ,small_ADJ
    FROM IM_ADJUST_HEADER A,IM_ADJUST_DETAILS B
    WHERE A.DOC_CODE = B.DOC_CODE
    AND B.STORE_CODE = R.STORE_CODE
    AND ITEM_CODE = R.ITEM_CODE
    AND A.DOC_DATE BETWEEN :DATE_FRM AND :DATE_TO
    GROUP BY
    B.ITEM_CODE;
    -- SHOW_MESSAGE('Adjust BIG'||' '||big_syb_qty_ADJ);
    exception
    when no_data_found then
    big_ADJ := 0;
    small_ADJ := 0;
    when others then message(10,sqlerrm);
    END ;
    -- Goods Received Data From Stores 2
    BEGIN
    SELECT B.ITEM_CODE,SUM(NVL(B.CART_QTY,0)),SUM(NVL(ITEM_QUANTITY,0))
    INTO item_syb_code,BI_INTER,SM_INTER
    FROM IM_TRANS_REC_HEADER A,IM_TRANS_REC_DETAILS B
    WHERE A.DOC_CODE = B.DOC_CODE
    AND REC_STORE = R.STORE_CODE
    AND ITEM_CODE = R.ITEM_CODE
    AND DOC_DATE BETWEEN :DATE_FRM AND :DATE_TO
    GROUP BY
    B.ITEM_CODE;
    -- show_message('here');
    -- SHOW_MESSAGE('Received From Stores BIG'||' '||big_syb_qty_rec_iner);
    exception
    when no_data_found then
    BI_INTER := 0;
    SM_INTER := 0;
    when others then message(10,sqlerrm);
    END ;
    BEGIN
         BIG_QTY_OPEN := 0;
    SMALL_QTY_OPEN := 0;
    BEGIN
    SELECT NVL(S_BIG_QTY_OPEN,0) ,NVL(S_SMALL_QTY_OPEN,0)
    INTO
    BIG_QTY_OPEN,SMALL_QTY_OPEN
    FROM STOCK_AT_DATE_REP2
    WHERE S_STORE_CODE = R.STORE_CODE
    AND S_ITEM_CODE = R.ITEM_CODE;
    END;
    BIG_QTY_OPEN := ((BIG_QTY_OPEN) + NVL(big_issue,0));
    BIG_QTY_OPEN := ((BIG_QTY_OPEN) - NVL(BI_SUPP_REC,0));
    big_adj := -1 * NVL(big_adj,0);
    BIG_QTY_OPEN := ((BIG_QTY_OPEN) + NVL(big_adj,0));
    BIG_QTY_OPEN := ((BIG_QTY_OPEN) - NVL(BI_INTER,0));
    SMALL_QTY_OPEN := ((SMALL_QTY_OPEN) + NVL(SMALL_issue,0));
    SMALL_QTY_OPEN := ((SMALL_QTY_OPEN) - NVL(SM_SUPP_REC ,0));
    SMALL_adj := -1 * NVL(SMALL_adj,0);
    SMALL_QTY_OPEN := ((SMALL_QTY_OPEN) + NVL(SMALL_adj,0));
    SMALL_QTY_OPEN := ((SMALL_QTY_OPEN) - NVL(SM_INTER,0));
    END;
    BEGIN
    UPDATE STOCK_AT_DATE_REP2
    SET BIG_SUP_REC = BI_SUPP_REC,
    SMALL_SUP_REC = SM_SUPP_REC,
    BIG_ISSUE_TRAN = big_issue,
    SMALL_ISSUE_TRAN = SMALL_issue,
    BIG_ADJUST = big_adj,
    SMALL_ADJUST = SMALL_adj,
    BIG_INTER_REC = BI_INTER,
    SMALL_INTER_REC = SM_INTER,
    S_BIG_QTY_OPEN = BIG_QTY_OPEN,
    S_SMALL_QTY_OPEN = SMALL_QTY_OPEN
    WHERE S_STORE_CODE = R.STORE_CODE
    AND S_ITEM_CODE = R.ITEM_CODE;
    exception
    when no_data_found then
    NULL;
    when others then
    message(10,sqlerrm);
    END;
    END LOOP;
    COMMIT;
    SET_APPLICATION_PROPERTY(CURSOR_STYLE,'default');
    SYNCHRONIZE;
    -- SHOW_MESSAGE('The Data Have Been Calculated !!!');
    END;
    declare
         pl_id ParamList;
    APPLICATION_ID VARCHAR2(20):='PRD';
              COMMAND_LINE VARCHAR2(100) :='STOCK_LEDGER';
    BEGIN
    pl_id := Get_Parameter_List('tmpdata');
    IF NOT Id_Null(pl_id) THEN
    Destroy_Parameter_List( pl_id );
    END IF;
    pl_id := Create_Parameter_List('tmpdata');
    Add_Parameter(pl_id,'DATE_FRM',TEXT_PARAMETER,:DATE_FRM);
    Add_Parameter(pl_id,'DATE_TO',TEXT_PARAMETER,:DATE_TO);
    Add_Parameter(pl_id,'ITEM_FRM',TEXT_PARAMETER,:ITEM_FRM);
    Add_Parameter(pl_id,'ITEM_TO',TEXT_PARAMETER,:ITEM_TO);
    Add_Parameter(pl_id,'FRM_STORE',TEXT_PARAMETER,:FRM_STORE);
    IF :REPORT_TYPE = 1 THEN
    Run_Product(REPORTS,'C:\INV\RDF\STOCK_LEDGER.rdf',SYNCHRONOUS,RUNTIME,
    FILESYSTEM, pl_id,NULL);
    ELSIF :REPORT_TYPE = 2 THEN
    Run_Product(REPORTS,'C:\INV\RDF\STOCK_LEDGER_2.rdf',SYNCHRONOUS,RUNTIME,
    FILESYSTEM, pl_id,NULL);
    END IF;
    END;
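    One thing I was wondering about: would a single set-based statement be faster than the row-by-row loop? A rough, untested sketch for just the transfer part (column names taken from my script above; the other components would join in the same way):
    INSERT INTO STOCK_AT_DATE_REP2 (S_STORE_CODE, S_ITEM_CODE, S_BIG_QTY_OPEN, S_SMALL_QTY_OPEN)
    SELECT L.STORE_CODE,
           L.ITEM_CODE,
           NVL(L.CART_QTY,0)         + NVL(I.BIG_QTY,0),
           NVL(L.QUANTITY_ON_HAND,0) + NVL(I.SMALL_QTY,0)
    FROM   PROJ.IM_LOCATION L,
           (SELECT B.DEL_STORE, B.ITEM_CODE,
                   SUM(NVL(B.CART_QTY,0))      BIG_QTY,
                   SUM(NVL(B.ITEM_QUANTITY,0)) SMALL_QTY
            FROM   IM_TRANS_ISSUE_HEADER A, IM_TRANS_ISSUE_DETAILS B
            WHERE  A.DOC_CODE = B.DOC_CODE
            AND    A.DOC_DATE > :DATE_TO
            GROUP BY B.DEL_STORE, B.ITEM_CODE) I
    WHERE  I.DEL_STORE (+) = L.STORE_CODE
    AND    I.ITEM_CODE (+) = L.ITEM_CODE
    AND    L.ITEM_CODE BETWEEN :ITEM_FRM AND :ITEM_TO
    AND    L.STORE_CODE = :FRM_STORE;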
    Waiting for your valuable answer
    Best Regards
    Jamil Alshaibani

    Make a matte in Photoshop.
    From the Photoshop menu bar: File > New > Film & Video Presets. Choose one that suits your FCP project pixel dimensions. Make the background black. Place a white rectangle with rounded corners on top (the white will determine how much of your picture is visible). Flatten Image. Save as TIFF. Import into FCP.
    Move your video clips up to V2. Place the imported TIFF on V1. Highlight all clips on V2.
    Go to Modify > Composite Mode > Travel Matte Luma. Done.
    If you want soft edges, no need for a matte. Double click your clip to place it in the Viewer. Open up the Motion tab > Crop > Edge Feather. Nice and quick. Copy and Paste attributes for the other clips.

  • Same code gives different results in Matlab Script in Labview and Matlab

    I am implementing MATLAB code in a LabVIEW application using a MATLAB script node. When I import exactly the same code into the MATLAB script node in LabVIEW, it gives a different result than it does in MATLAB. The code is a simulation involving Bessel functions of the first kind. I am using LabVIEW 7.1 and MATLAB R14 Service Pack 3.

    Labview 8.5
    Matlab R2009b
    Attached are the graphs produced by the MATLAB script in LabVIEW and in MATLAB.
    The minimum of the graph produced by the MATLAB code is below 1, while in LabVIEW it is above 1.
    Thanks a lot for your reply.
    Sorry, I haven't quantified the "sometimes" yet. 
    Attachments:
    matlabsResult.jpg ‏29 KB
    LV.png ‏84 KB

  • Parallel Execution questions

    Hi everyone, I have a few questions about parallel execution.
    1- The query coordinator scans the table and splits it up among the processes, but each process takes a different number of rows. For example, in his Oracle 10g database architecture book, Thomas Kyte gives an example:
    "SELECT COUNT(*) FROM BIG_TABLE"
    It is distributed to 4 processes. One process takes 1000 rows, another process 1200, etc. Why is this not equal? How is the number of rows determined by the QC?
    2- If we do not specify the degree of parallelism ( /*+ PARALLEL(B) */ instead of /*+ PARALLEL(B,16) */ ), how does the CBO determine the degree of parallelism?
    3- In the explain plan I see lots of things I do not know. For example:
    :TQ10000
    what is that?
    -> TQ, IN-OUT, PQ Distrib columns?
    Thanks for the responses.

    Hi
    "Why is this not equal? How is the number of rows determined by the QC?" - The number of rows is not used to split the table. Data is split into granules and distributed to the slave processes based on two methods:
    - partition granules: a whole partition is given to a specific slave process
    - block range granules: ranges of blocks are given to each slave process
    So, depending on the size of the granules, one slave process might process much more data than another.
    "If we do not determine the degree of parallelism, how does the CBO determine the degree of parallelism?" - It depends on the configuration. In fact, there is a default at session level, at system level and at table level. Depending on which ones are set, a DOP is chosen. In addition, the DOP might be decreased at runtime (before the execution starts, not dynamically during it).
    "For example: :TQ10000 - what is that?" - Producers send data to consumers through so-called "table queues" (TQ). The number, in your case 10000, is just an identifier generated by the database engine.
    TQ: the table queue, e.g. :TQ10000 -> Q1,00
    IN-OUT: the relationship between parallel operations. E.g. P->P means that a parallel operation sends data to another parallel operation.
    PQ Distrib: the method used to distribute rows. E.g. RANGE means that the producers send specific ranges of rows to different consumers.
    For a more detailed discussion of these topics have a look at the documentation:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14223/usingpe.htm#i1009828
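    By the way, if you want to see how the rows were actually distributed over the slaves, you can query the table queue statistics right after the parallel statement, in the same session (a quick sketch):
    SELECT dfo_number, tq_id, server_type, process, num_rows, bytes
    FROM   v$pq_tqstat
    ORDER  BY dfo_number, tq_id, server_type, process;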
    HTH
    Chris Antognini
    Author of Troubleshooting Oracle Performance, Apress 2008 (http://top.antognini.ch)

  • Tracking the order of execution of sql scripts in SQL*Plus

    In our production environment we sometimes have to run some .sql scripts in a particular order. Since the order of execution is important, I have created another .sql file called caller.sql (shown at the bottom) which will call all the scripts in the right order.
    I thought of putting an exec DBMS_LOCK.SLEEP (5); after every script execution so that I can see the 'Ending script1' message.
    The spooling within the caller script (execute_stack.log) has become meaningless because each script has a spool <filename.log> and spool off within it. These spool logs (one for every script) are important for tracking purposes, as each script belongs to a different development team and I have to send them the spooled log file after the execution.
    I don't want to see the entire scripts scrolling by on my screen. Since these scripts have their own spooling, I can later check the logs to see whether the scripts executed properly.
    So i need two things.
    1. I just need to see the following, and nothing else, on the screen.
    Ending script1
    Ending script2
    Ending script3
    2. I need to log the order of execution, i.e. execute_stack.log should look like the above. Since there is a spool off within each script, this wouldn't be possible, right?
    Ending script1
    Ending script2
    Ending script3
    The caller.sql script which calls all the scripts in the right order:
    alter session set nls_date_format = 'DD-MON-YYYY hh24:MI:SS';
    set serveroutput on
    set echo on;
    set feedback on;
    spool execute_stack.log
    @script1.sql
    exec dbms_output.put_line ('Ending script1');
    exec dbms_output.put_line(chr(10)||chr(10)||'.'||chr(10)||'.'||chr(10));
    exec DBMS_LOCK.SLEEP (5);
    @script2.sql
    exec dbms_output.put_line ('Ending script2');
    exec dbms_output.put_line(chr(10)||chr(10)||'.'||chr(10)||'.'||chr(10));
    exec DBMS_LOCK.SLEEP (5);
    @script3.sql
    exec dbms_output.put_line ('Ending script3');
    exec dbms_output.put_line(chr(10)||chr(10)||'.'||chr(10)||'.'||chr(10));
    exec DBMS_LOCK.SLEEP (5);
    @script4.sql
    exec dbms_output.put_line ('Ending script4');
    exec dbms_output.put_line(chr(10)||chr(10)||'.'||chr(10)||'.'||chr(10));
    commit;
    spool off;
    Is this a professional way of tracking the execution of .sql scripts?
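    For the screen-output part, something like the following is what I had in mind (untested, and it assumes SQL*Plus 10g or later for SPOOL ... APPEND):
    set termout off                 -- scripts run silently; their own spool logs are still written
    @script1.sql
    exec DBMS_LOCK.SLEEP (5);
    set termout on
    spool execute_stack.log append  -- re-open the caller log that the script's own spool off closed
    prompt Ending script1
    spool off
    set termout off
    @script2.sql
    exec DBMS_LOCK.SLEEP (5);
    set termout on
    spool execute_stack.log append
    prompt Ending script2
    spool off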

    Pete_Sg1 wrote:
    "Is this a professional way of tracking the execution of .sql scripts?" No. There is very little that is professional about using .sql scripts on a production system, when stored procedures are safer, more robust, easier to manage, control and secure, and when a log table can be used to properly log the runtimes (and other stats) of each processing step.
    Let's just take a look at the number of moving parts you need to schedule and run a .sql script. A cron job needs to be configured with the proper environment settings. It needs to run a shell script. That shell script needs to load SQL*Plus. SQL*Plus needs to connect to the database (most likely starting a dedicated server process). SQL*Plus then needs to read a .sql file, parse those commands and either execute them locally (SQL*Plus commands) or remotely (PL/SQL and SQL commands).
    How can this be considered professional when the very same can be achieved with a
    - stored procedure
    - using DBMS_JOB to schedule the procedure for execution
    There are so many things that can go wrong with the first method. And so few things that could go wrong with the last one. No contest as to which method is not only better, but also professional.
    PS. I see that you use Windows to run these scripts. That is even worse, as it introduces another hardware and software layer, making the scenario even more insecure and unsafe, with more moving parts that can go wrong or simply fail.
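    As a rough sketch of that approach (all names made up), each scriptN.sql becomes a procedure that writes to a log table and is scheduled inside the database:
    create table process_log (
      step_name   varchar2(100),
      started_at  timestamp,
      finished_at timestamp,
      status      varchar2(4000)
    );
    create or replace procedure run_step1 as
      v_start timestamp := systimestamp;
    begin
      -- the work that used to live in script1.sql goes here
      insert into process_log values ('script1', v_start, systimestamp, 'OK');
      commit;
    exception
      when others then
        insert into process_log values ('script1', v_start, systimestamp, sqlerrm);
        commit;
        raise;
    end;
    /
    -- scheduled entirely inside the database: no cron, shell or SQL*Plus involved
    declare
      v_job binary_integer;
    begin
      dbms_job.submit(v_job, 'run_step1;', sysdate);
      commit;
    end;
    /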

  • CJS-00030  Assertion failed: Execution of SQL script reports an unexpected error.System Copy

    Experts,
    I am performing a homogeneous system copy using the backup/restore method. I have taken a backup of the QA system (i.e. dump files) and am trying to build a sandbox system with it for testing purposes. Everything went fine and I was able to load the dump, but I am stuck at one point with the following error:
    ERROR      [cinstallercallbackimpl.cpp:228]
    CJS-00030  Assertion failed: Execution of SQL script reports an unexpected error.
    When I checked the sapinst_dev log I found the error below, but I am not able to fix it and am not sure how to proceed further:
    1> use SID
    1> EXEC sp_changedbowner sapsa
    Msg 17362, Level 16, State 1:
    Server 'SID', Procedure 'sp_changedbowner', Line 181:
    The proposed new db owner already is a user in the database or owns the database.
    (return status = 1)
    I am sure only small changes are needed to change the db owner, but I'm not exactly sure how to perform them.
    Note: I am using a Windows environment with a Sybase ASE database and the SWPM tool.
    I would appreciate your suggestions.
    Thanks

    I fixed the issue by changing the owner to sapsa. Bingo!!!!

  • Scripting in LabVIEW

    Hi, could you please let me know what is the easiest scripting method that can be used in LabVIEW? We have engineers who know nothing about LabVIEW, and they want a scripting language to do simple things like assignments, for loops, while loops and basic math functions such as mean. Is MathScript (MATLAB-like syntax) the best choice?
    The problem I have with MathScript is that I think it is good for complicated mathematics, but can it be used as a command script for LabVIEW?
    For example, if I have this line in the scripting window
    vpp = 10
    then I would like to pass that value to a VI, and if I don't have this line then there is no need to have that VI.
    Is there any LabVIEW scripting language that would be good for this purpose?

    Thanks jcarmody and Yamaeda for your suggestions.
    Before working on your idea about using Python, I would like to show you one example, and if you think the toolbox can handle it I will continue.
    This is one of the m-files (MATLAB) provided by them.
    As you can see below, they can call functions, use basic math functions and use basic commands such as if, for and while.
    They would like to have an environment like this, but when they call a function, for example, I don't want to run MATLAB or Python code. I want to run a SubVI which is specified for that command, and then go to the next line and continue. So everything should be implemented in LabVIEW, but I want to give them the capability to write scripts (since they don't know LabVIEW), with everything executed in LabVIEW. Can the Python toolbox give me this capability?
    If not, do you have any suggestions?
    vTest = 0.8; %default voltage value for current measurements is 0.8V
    end
    atpMode = 2;
    atpAddress = 3;
    tpCfg = 'ATP';
    tpAnaBuff = 'OFF';
    configATP(dutNum,ATPmode,ATPaddress,tpCfg,tpAnaBuff);
    %This function sets up the DUT test mode and test point
    vdd = [];
    vpp = [];
    configTIBVolt(vdd,vpp,vTest);
    % This funciton configures the analog voltages on the test interface board
    % Variables that are empty would retain their previous values
    bufferOn = 0;
    configDUTRelay(bufferOn);
    % This function configures the relay that selects the analog buffer on the
    % DUT board. For current measurements, the buffer should be bypassed
    k = 1;
    trimCode = 0;
    while(trimCode(k) < 8 && trimCode(k) > -9) %iBias Trim is 2's complement 4-bit
    numBits = 4;
    trimCode_twosComp = dec2twosComp(trimCode(k),numBits);
    % this function converts the decimal trimCode into its 4-bit two's
    % complement form since the iBias trim codes are in two's complement
    regData = ['xxxx',trimCode_twosComp];
    regAddr = 13;
    regReadModifyWrite(dutNum,regAddr,regData)
    % This function reads the register value, modifies only the selected bits,
    % (i.e. those that aren't 'x'), then writes back to that register.
    % This is to avoid overwriting unrelated bits in the same byte
    dutOut = 'CURR_MEAS';
    pxiAnaIn = 'CURR_MEAS';
    dutVpp = [];
    dutVdd = [];
    configTIBMux(dutNum, dutVpp, dutVdd, dutOut, pxiAnaIn);
    % This function configures the test interface board analog switches for
    % VDD, VPP, DUTout, and PXIanaIn
    % Variables that are empty would retain their previous values
    pauseTime = 1e-3;
    pause(pauseTime);
    % Wait for everything to settle.
    if( k>2 && direction(k)~=direction(k-1) )
    % If the sign of the error changed, then the best trim code is
    % either this one or the one just before it
    [Y, minIndx] = min(abs(iBias-targetIBias));
    % Finds the index with the min error
    iBiasTrimCode = trimCode(minIndx);
    iBiasMeas = iBias(minIndx);
    % assigns values to output variables
    trimCode_twosComp = dec2twosComp(iBiasTrimCode,numBits);
    regData = ['xxxx',trimCode_twosComp];
    regReadModifyWrite(dutNum,regAddr,regData)
    % write the best trim code t0 the DUT
    return
    % Exit the function
    end
    trimCode(k+1) = trimCode(k) + direction;
    k=k+1;
    end
    % If we've reached this portion of the code, then we failed to find an optimal
    % trim code. We should just return the last values, since these are as close
    % as we can get to the target (i.e. we're at the edge of the range)
    iBiasTrimCode = trimCode(end-1);
    iBiasMeas = iBias(end);
