Logging data after a trigger with Lookout Direct

Hi,
I would like to log data with Lookout Direct after a trigger input (not periodically or continuously). I am using Lookout Direct version 4.5.1 build 19 and a Direct Logic 250 PLC.
Does anyone have suggestions on how to do this? I would prefer that this data be logged into Excel. I have 24 sensors connected to the PLC, and each time a sensor transitions from high to low I would like to log the time of the transition.
Currently I have individual SpreadSheet Storage objects for each sensor as well as individual LatchGates to indicate when logging has been completed. In the PLC code there is a state machine for each individual sensor as well, and one of the states waits for an acknowledgement from Lookout that data logging has been completed before moving on to the next state (I will have to dig a bit deeper to remember exactly why I needed to do that).
I am hoping there is a more traditional approach that is easier than what I am doing. One of the problems I have been facing is Lookout Direct crashing every few days, and I suspect it is because the sensors are often sensed within milliseconds of each other and opening/closing so many files is causing problems. I worked through a list of possible reasons that Lookout may be crashing (provided by tech support) and I am nearly convinced I am just asking too much of the program...
Any help will be greatly appreciated
Thank you,
David

In case someone can help with this, here is a bit more information about my application and the PLC/Lookout code I have developed:
Actuators have two positions, nominal and folded. Prox sensors are used to monitor the position of the actuators; 12 actuators can be monitored simultaneously. The time at which a prox sensor is sensed high is recorded so that actuator speed and actuation success are logged.
The PLC code consists of 12 separate state machines, each with the following 8 states:
State 1
  System reset or nominal timer reset
  Wait for nominal prox release
State 2
  Nominal prox released
  Wait for folded prox sense
State 3
  Folded prox sensed
  Wait for folded ack from Lookout (acknowledges that timer value has been logged)
State 4
  Folded ack from Lookout received
  Wait for folded timer reset (this state is active for one scan only)
State 5
  Fold-in timer reset
  Wait for folded prox release
State 6
  Folded prox released
  Wait for nominal prox sense
State 7
  Nominal prox sensed
  Wait for nominal ack from Lookout (acknowledges that timer value has been logged)
State 8
  Nominal ack received from Lookout
  Wait for nominal timer reset (this state is active for one scan only)
Lookout acknowledges that timer values have been read and saved with LatchGates.  Lookout uses a SpreadSheet Storage Object to save the PLC timers when the PLC enters states 3 or 7 (prox sensor triggered).  The logged member of the SpreadSheet Storage Object is used to change the state of the LatchGates which, in turn, signal the PLC to proceed to the next state.  When the folded file is saved, the nominal LatchGate is turned off and the folded LatchGate is turned on and vice-versa for the nominal file.
The LatchGate/SpreadSheet Storage combination is what I am hoping to improve upon.  I believe Lookout is crashing when 12 SpreadSheet Storage Objects log to 12 different files during the same 1 second period of time.
If anyone has suggestions for a way to log this data in the PLC memory, or for a software package better suited to this application, please let me know! I believe this would be simple with LabVIEW; unfortunately, obtaining the additional hardware and software that I would need hasn't been easy!

Similar Messages

  • After update trigger with condition

    Dear members,
    I have a table with multiple columns, and on each update of these columns I have to update the corresponding columns of another table.
    For one column the following trigger is working fine, but there are more than 10 columns that I have to check for.
    I want to create only one AFTER UPDATE trigger on this table, check which column was updated, and update the same column in the second table.
    here is the trigger:
    CREATE OR REPLACE TRIGGER ABC_RATE_UPD
    AFTER UPDATE
       OF RATE
       ON ABC_PCD
       REFERENCING NEW AS New OLD AS Old
       FOR EACH ROW
    BEGIN
      UPDATE RM_TRAN_IN
         SET RATE = :NEW.RATE
       WHERE RM_PCD_ID = :NEW.RM_PCD_ID;
    EXCEPTION
      WHEN OTHERS THEN
        -- Consider logging the error and then re-raise
        RAISE;
    END ABC_RATE_UPD;
    Here is my IF condition (edited):
    if :NEW.RATE != :OLD.RATE then
            UPDATE ABC_TRAN_IN
            SET RATE = :NEW.RATE
            WHERE ABC_PCD_ID = :NEW.ABC_PCD_ID;
        ELSIF :NEW.PACK_UNIT_ID != :OLD.PACK_UNIT_ID then
            UPDATE ABC_TRAN_IN
            SET PACK_UNIT_ID = :NEW.PACK_UNIT_ID
            WHERE ABC_PCD_ID = :NEW.ABC_PCD_ID;
    END IF;
    It's working well. Is this a good approach to the solution?
    regards:

    In your trigger, if you update more than one column at a time, will it update all the updated columns in the other table properly?
    Can't you code something like this?
    CREATE OR REPLACE TRIGGER ABC_RATE_UPD
    AFTER UPDATE
       OF col1, col2, col3
       ON ABC_PCD
       REFERENCING NEW AS New OLD AS Old
       FOR EACH ROW
    BEGIN
      IF UPDATING('col1') THEN
        UPDATE RM_TRAN_IN
           SET col1 = :NEW.col1
         WHERE id = :NEW.id;
      END IF;
      IF UPDATING('col2') THEN
        UPDATE RM_TRAN_IN
           SET col2 = :NEW.col2
         WHERE id = :NEW.id;
      END IF;
      IF UPDATING('col3') THEN
        UPDATE RM_TRAN_IN
           SET col3 = :NEW.col3
         WHERE id = :NEW.id;
      END IF;
    EXCEPTION
      WHEN OTHERS THEN
        -- Consider logging the error and then re-raise
        RAISE;
    END ABC_RATE_UPD;
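    If RM_TRAN_IN mirrors all of the columns anyway, a more compact variant is possible, because :NEW.<column> still carries the old value whenever that column was not part of the UPDATE, so re-assigning it is harmless. A sketch using the table and column names from this thread (illustrative only, not tested):
    CREATE OR REPLACE TRIGGER ABC_RATE_UPD
    AFTER UPDATE OF rate, pack_unit_id ON ABC_PCD
    FOR EACH ROW
    BEGIN
      -- :NEW.rate equals :OLD.rate when RATE was not in the SET list,
      -- so this simply rewrites the existing value; same for pack_unit_id.
      UPDATE RM_TRAN_IN
         SET rate         = :NEW.rate,
             pack_unit_id = :NEW.pack_unit_id
       WHERE rm_pcd_id = :NEW.rm_pcd_id;
    END ABC_RATE_UPD;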

  • After Insert Trigger with DML on the subject table?

    I am trying to set up e-mail notifications, so I created a procedure which accepts an id as its argument. The procedure queries the table and sends out mail based on the result set (in the trigger, I pass in the value of :NEW.id; the procedure then queries the table for that row id).
    First, for the AFTER UPDATE trigger I was getting the error:
    ORA-04091: table is mutating, trigger/function may not see it (ORA-06512)
    So, I was advised to put pragma autonomous_transaction in the declare block; I did that, and that solved the problem there. So I did the same in the AFTER INSERT trigger, but then I get the error:
    ORA-00060: deadlock detected while waiting for resource (ORA-06512)
    I asked our DBA and he said you are not able to query the table that the trigger is defined on. I thought it would be possible since it's AFTER insert or update?
    Can anyone offer any suggestions ? :)
    Thanks,
    Trent

    Any reason why it would work in an after update trigger then?
    "So, I was advised to do in the declare block: pragma autonomous_transaction; I did that, and that solved the problem there."
    PRAGMA AUTONOMOUS_TRANSACTION "bends" the restriction against SQL on the base table.
    It is the equivalent to tap dancing across a mine field.
    You might get the desired results most of the time or you might get a tasty surprise when you least expect it.
    What happens to your application in the future, if/when the UPDATE has a ROLLBACK issued & PRAGMA AUTONOMOUS_TRANSACTION has successfully completed?
    Inquiring minds would like to know the answer.
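    A common way to avoid both the mutating-table error and the autonomous-transaction workaround is to hand the :NEW column values straight to the notification procedure instead of re-querying the table from inside it. A minimal sketch with hypothetical table, column, and procedure names:
    CREATE OR REPLACE TRIGGER trg_notify_ai
    AFTER INSERT ON orders   -- hypothetical table
    FOR EACH ROW
    BEGIN
      -- No SELECT against the mutating table and no PRAGMA AUTONOMOUS_TRANSACTION needed:
      -- the values the mail routine requires are passed in directly.
      send_notification(p_id    => :NEW.id,
                        p_email => :NEW.notify_email);
    END;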

  • After insert trigger with :NEW.ROWID

    Hi All,
    I am using an AFTER INSERT trigger to generate a history record, as follows:
    CREATE TABLE TB_TEST (classID number(3), classNm varchar2(12));
    CREATE TABLE TB_HIST_TEST (hist_dttm timestamp(6), classID number(3), classNm varchar2(12));
    CREATE OR REPLACE TRIGGER air_test AFTER INSERT ON tb_test FOR EACH ROW
    BEGIN
      PK_SRVC.CRT_NewRec('TB_TEST', 'TB_HIST_TEST', :new.ROWID);
    END;
    In the PK_SRVC package, I use the following statement to create the new record in TB_HIST_TEST:
         INSERT INTO tb_hist_test (hist_dttm, classID, classNm)
         SELECT systimestamp, classID, classNm FROM tb_test WHERE rowid = p_RowID;
    The trigger DOES fire when a new row is inserted into TB_TEST. However, there is no record inserted into TB_HIST_TEST. Any suggestions?
    Thanks,

    PK_SRVC.CRT_NewRec is a generic service routine that can be shared by many tables. It uses dynamic SQL to get all the columns for different tables based on USER_TAB_COLUMNS. The following is the code:
    PROCEDURE CRT_NewRec ( p_TableName IN VARCHAR2, p_AudTableName IN VARCHAR2, p_RowID IN VARCHAR2 )
    IS
           TYPE TabCol_RecTyp IS RECORD (COLUMN_NAME VARCHAR2(30), COLUMN_ID NUMBER);
           TYPE TabCol_CurTyp IS REF CURSOR;
           c_TabCol      TabCol_CurTyp;
           rc_TabCol     TabCol_RecTyp;
           v_Sql_TabCol  VARCHAR2(1000);
           v_ColNames    VARCHAR2(1000);
           v_Sql         VARCHAR2(1000);
           PRAGMA AUTONOMOUS_TRANSACTION;
    BEGIN
           v_Sql_TabCol := ' SELECT column_name, column_id FROM USER_TAB_COLUMNS'
                        || ' WHERE table_name = ' || CHR(39) || p_TableName || CHR(39)
                        || ' ORDER BY column_id';
           v_ColNames := NULL;
           OPEN c_TabCol FOR v_Sql_TabCol;
           LOOP
                  FETCH c_TabCol INTO rc_TabCol;
                  EXIT WHEN c_TabCol%NOTFOUND;
                  v_ColNames := v_ColNames || ',' || rc_TabCol.COLUMN_NAME;
           END LOOP;
           CLOSE c_TabCol;
           v_Sql := 'INSERT INTO ' || p_AudTableName || ' (HIST_DTTM' || v_ColNames || ')'
                 || ' SELECT systimestamp' || v_ColNames || ' FROM ' || p_TableName
                 || ' WHERE ROWID = chartorowid(' || CHR(39) || p_RowId || CHR(39) || ')';
           EXECUTE IMMEDIATE v_Sql;
           COMMIT;
    END;
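    One likely cause, though it is not spelled out in the thread: PRAGMA AUTONOMOUS_TRANSACTION makes CRT_NewRec run in a separate transaction, which cannot see the row just inserted into TB_TEST (it is not committed yet), so the dynamic INSERT ... SELECT copies zero rows. A non-generic sketch that sidesteps this for the TB_TEST/TB_HIST_TEST pair by using the :NEW values directly:
    CREATE OR REPLACE TRIGGER air_test
    AFTER INSERT ON tb_test
    FOR EACH ROW
    BEGIN
      -- No re-select of the uncommitted row and no autonomous transaction required.
      INSERT INTO tb_hist_test (hist_dttm, classID, classNm)
      VALUES (SYSTIMESTAMP, :new.classID, :new.classNm);
    END;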

  • Convert "1D array of dynamic data" to "Dynamic data" after auto-indexing with loop

    Hi -
    I have a question about how exactly auto-indexing works with the Dynamic Data type when exiting a While Loop. I'd like to use the DAQ Assistant to record data through an analog input (on the myDAQ) and then replay that data through an analog output. This works fine if I use the DAQ Assistant to record data for a set amount of time:
    However, I run into trouble when I try to use a While loop with a stop button to record data for an arbitrary amount of time. I set the dynamic data tunnel out of the While loop to "Indexing", but then when it exits the loop, it becomes "1-D Array of Dynamic Data" instead of just "Dynamic Data", and I get a broken wire when I try to connect it to the input of the DAQ Assistant. I've also tried converting the dynamic data to Array and Waveform data types inside the loop, but I have the same issue (they become 2D Array and 2D Array of Waveform respectively when leaving the loop). Try as I might using blocks like Convert to/from Dynamic Data Type, or Array Subset, I'm unable to get them back down to just Dynamic Data or Waveform, which will be accepted by the DAQ Assistant:
    One other small note - I'm not sure if using the "Build Array" function with a shift register accomplishes exactly the same thing as just auto-indexing an array out of the loop - so I've tried both, but still have the broken wire issue.
    I'm assuming this is just a simple issue of using the right block to convert "1D Array of [Something]" to just plain [Something]. Any help appreciated.
    Attachments:
    PWM_FlatSequence.vi ‏101 KB
    PWM_FlatSequence_While.vi ‏129 KB

    I have been facing a problem with getting all temperature values when placing the Write to Measurement File outside the loop.
    The error message says "Source: 1D array of dynamic data and Sink: Dynamic data".
    Is there any means of converting a 1D array of dynamic data to dynamic data?
    I would highly appreciate any help.
    Attachments:
    Temperature Logger.vi ‏110 KB

  • DSC 8.6.1 wrong timestamps for logged data with Intel dual core

    Problem Description :
    Our LV/DSC 8.6.1 application uses shared variables to log data to Citadel. It is running on many similar computers at many companies just fine, but on one particular Intel dual-core computer, the data in the Citadel db has strange shifting timestamps. Changing the BIOS to start up using a single CPU fixes the problem. We could possibly set only certain NI process(es) to single-CPU instead (but which?). The old DSCEngine.exe in LV/DSC 7 had to be run single-CPU... hadn't these kinds of issues been fixed by LV 8.6.1 yet? What about LV 2009, anybody know? Or is it a problem in the OS or hardware, below the NI line?
    This seems similar to an old issue with time synch server problems for AMD processors (Knowledge Base Document ID 4BFBEIQA):
    http://digital.ni.com/public.nsf/allkb/1EFFBED34FFE66C2862573D30073C329 
    Computer info:
    - Dell desktop
    - Win XP Pro sp3
    - 2 G RAM
    - 1.58 GHz Core 2 Duo
    - LV/DSC 8.6.1 (Pro dev)
    - DAQmx, standard instrument control device drivers, serial i/o
    (Nothing else installed; OS and LV/DSC were re-installed to try to fix the problem, no luck)
    Details: 
    A test logged data at 1 Hz, with these results: for 10-30 seconds or so, the timestamps were correct. Then, the timestamps were compressed/shifted, with multiple points each second. At perfectly regular 1-minute intervals, the timestamps would be correct again. This pattern repeats, and when the data is graphed, it looks like regular 1-sec interval points, then more dense points, then no points until the next minute (not ON the minute, e.g.12:35:00, but after a minute, e.g.12:35:24, 12:36:24, 12:37:24...). Occasionally (but rarely), restarting the PC would produce accurate timestamps for several minutes running, but then the pattern would reappear in the middle of logging, no changes made. 
    Test info: 
    - shared variable configured with logging enabled
    - data changing by much more than the deadband
    - new value written by Datasocket Write at a steady 1 Hz
    - historic data retrieved by Read Traces
    - Distributed System Manager shows correct and changing values continuously as they are written

    Meg K. B. , 
    It sounds like you are experiencing Time Stamp Counter (TSC) drift, as mentioned in the KBs for the AMD multi-core processors. However, according to the Wikipedia article on TSCs, the Intel Core 2 Duo's "time-stamp counter increments at a constant rate... Constant TSC behavior ensures that the duration of each clock tick is uniform and supports the use of the TSC as a wall clock timer even if the processor core changes frequency." This seems to suggest that you are not seeing the issue mentioned in the KBs.
    Can you provide the exact model of the Core 2 Duo processor that you are using?
    Ben Sisney
    FlexRIO V&V Engineer
    National Instruments

  • Problems with After Report trigger Updating using User-defined functions

    Hi,
    I have a report which uses SQL to select data that uses a user-defined stored function in the WHERE clause.
    I use the same SQL in the After Report trigger with an UPDATE statement to flag all selected records as having been run.
    Data is selected by the report with no problem, but the records are not being updated. In a test, if I remove the conditions that use the user functions, the records update as expected. In live conditions I must have these conditions in the script.
    I originally tried putting the UPDATE in a formula column, but that would not fire on records whose page was not paged through (or paged to the end) in the Runtime Previewer.
    Can anyone advise?

    In case anyone is interested.
    The issue was that the stored functions have roles assigned for security.
    PL/SQL for After Report doesn't seem to recognise the roles having been assigned for the report, so the implicit cursor update/select I had wouldn't work.
    I changed the SELECT into an explicit CURSOR and introduced a FOR LOOP, keeping the UPDATE as an implicit statement.
    I now see this was more of a PL/SQL issue than a report one, but such is life. So if anyone feels the urge to move it to the PL/SQL forum, then feel free!
    Have a problem free afternoon. :-)
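    For anyone finding this later, a rough sketch of the fix described above (an explicit cursor drives a FOR loop, and the UPDATE stays as an ordinary implicit statement inside it). The table, column, and function names here are placeholders, not the poster's actual objects:
    function AfterReport return boolean is
      cursor c_sel is
        select rowid as rid
          from my_table
         where my_user_function(my_col) = 'Y';  -- same user-function conditions as the report query
    begin
      for r in c_sel loop
        update my_table
           set processed_flag = 'Y'
         where rowid = r.rid;
      end loop;
      return (TRUE);
    end;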

  • Monitor multiple channels for analog trigger with DAQmx drivers

    Hello! I would like to start a data acquisition of multiple analog channels (16) from an analog trigger. I would like the trigger to monitor four of the (same) channels and trigger when any one of them reaches a certain voltage. I found an example that would work with the Traditional DAQ drivers (using occurrences), but I can't figure out how to do something similar in DAQmx.
    Time is also an issue, as I would like to collect the first 80 milliseconds of data after the trigger (at a rate of 500,000 Hz).
    I'm using LabView 7.0 and collecting data off of two PXI-6133 cards.
    Thanks for your help!

    Hi Denise-
    After some research, I have found that it is not possible to use the functionality of DAQ Occurrences in DAQmx. Ironically, the reason that this functionality is available in Traditional and not DAQmx is due to the exploitation of an inherent limitation of Traditional that was upgraded in DAQmx. The multi-thread capability of DAQmx is a major advantage for most applications, but in this case it prevents the use of occurrences as they existed in Traditional DAQ.
    In short, this means that you can't directly use this functionality in DAQmx. You can however emulate this functionality with minimal software analysis of the incoming signal. I have attached a modified example VI that logs data to a chart only when the analog level of one of the channels being measured has exceeded a user-defined reference value. Basically, the task is running continuously in the background but the data is not actually logged until the signal is above a predetermined "trigger" level.
    Please let me know if the attached example is helpful for your application. You will see the input channels listed in the format "DevX/ai0:y" where X is the device number and y is the highest channel number of interest.
    Regards,
    Tom W
    National Instruments
    Attachments:
    Cont Acq&Graph Voltage-Int Clk Analog SW Trigger.vi ‏83 KB

  • After Insert Trigger - IF-THEN-ELSE

    Is it possible to put an if-then-else statement in an after insert trigger with an update to a second table?
    CREATE OR REPLACE TRIGGER bev_trg
    AFTER INSERT ON bev
    FOR EACH ROW
    DECLARE
      MEINS VARCHAR2(3);
      SPART VARCHAR2(2);
      FKIMG NUMBER(22);
      SHKZG CHAR(1);
    BEGIN
      INSERT INTO bev2 (vbeln, posnr)
      VALUES (:new.vbeln, :new.posnr);
      IF FKIMG = '10' AND MEINS = 'EA' THEN
        UPDATE bev2 SET bwsie = (fkimg * '12.17');
      ELSIF FKIMG = '12' AND MEINS = 'EA' THEN
        UPDATE bev2 SET bwsie = (fkimg * '19.02');
      ELSIF SHKZG = 'X' THEN
        UPDATE bev2 SET bwsie = (fkimg * '-1');
      ELSE
        UPDATE bev2 SET bwsie = '0';
      END IF;
      UPDATE bev2 SET bwfkimg = (fkimg * '-1') WHERE SHKZG = 'X';
    END;
    This does not do the updates.
    Thanks
    Bev

    My original problem is this: I need to create a table derived from data in another table and change it according to the conditions in the update statements below.
    create table bev(
      VBELN VARCHAR2(10),
      POSNR VARCHAR2(6),
      MEINS VARCHAR2(3),
      SPART VARCHAR2(2),
      FKIMG NUMBER(22),
      SHKZG CHAR(1),
      BWSIE NUMBER(9,2),
      BWFKIMG NUMBER(22)
    );
    update bev set bwsie = (fkimg * '12.17') where SPART = '10' and meins = 'EA' ;
    update bev set bwsie = (fkimg * '19.02') where SPART = '12' and meins = 'EA';
    update bev set bwsie = (fkimg * '27.39') where SPART = '15' and meins = 'EA';
    update bev set bwsie = (fkimg * '48.69') where SPART = '20' and meins = 'EA';
    update bev set bwsie = (fkimg * '109.56') where SPART = '30' and meins = 'EA';
    update bev set bwsie = '0' where meins = 'EA' and SPART not in('10','12','15','20','30');
    update bev set bwsie = (fkimg * '-1') where SHKZG = 'X';
    update bev set bwfkimg = (fkimg * '-1') where SHKZG = 'X';
    This did not produce the required results, so I thought I would try creating a second table and using a trigger with if-then-else to update it. I can't use the trigger on the original table because it mutates.
    create table bev2(
      VBELN VARCHAR2(10),
      POSNR VARCHAR2(6),
      BWSIE NUMBER(9,2),
      BWFKIMG NUMBER(22)
    );
    insert into bev select vbeln, posnr, meins, spart, fkimg, shkzg, '', ''
    from [email protected]
    where vbeln = '0090043834' and posnr = '000011';
    insert into bev select vbeln, posnr, meins, spart, fkimg, shkzg, '', ''
    from [email protected]
    where vbeln = '0090043833' and posnr = '000011';
    CREATE OR REPLACE TRIGGER bev_trg
    AFTER INSERT ON bev
    FOR EACH ROW
    DECLARE
      MEINS VARCHAR2(3);
      SPART VARCHAR2(2);
      FKIMG NUMBER(22);
    BEGIN
      INSERT INTO bev2 (vbeln, posnr, fkimg)
      VALUES (:new.vbeln, :new.posnr, :new.fkimg);
      IF :new.FKIMG = '10' AND :new.MEINS = 'EA' THEN
        UPDATE bev2 SET bwsie = (:new.fkimg * '12.17');
      ELSIF :new.FKIMG = '12' AND :new.MEINS = 'EA' THEN
        UPDATE bev2 SET bwsie = (:new.fkimg * '19.02');
      ELSIF :new.FKIMG = '15' AND :new.MEINS = 'EA' THEN
        UPDATE bev2 SET bwsie = (:new.fkimg * '27.39');
      ELSIF :new.FKIMG = '20' AND :new.MEINS = 'EA' THEN
        UPDATE bev2 SET bwsie = (:new.fkimg * '48.69');
      ELSIF :new.FKIMG = '30' AND :new.MEINS = 'EA' THEN
        UPDATE bev2 SET bwsie = (:new.fkimg * '109.56');
      ELSIF :new.SHKZG = 'X' THEN
        UPDATE bev2 SET bwsie = (:new.fkimg * '-1');
      ELSE
        UPDATE bev2 SET bwsie = '0';
      END IF;
      UPDATE bev2 SET bwfkimg = (:new.fkimg * '-1') WHERE :new.SHKZG = 'X';
    END;
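    An alternative sketch, for comparison: the derived value can be computed in the INSERT itself with a CASE expression, so no follow-up UPDATE (which, lacking a WHERE clause restricting it to the new row, would touch every row of BEV2) is needed. The rates and conditions below mirror the UPDATE statements from the original problem; treat this as illustrative only:
    CREATE OR REPLACE TRIGGER bev_trg
    AFTER INSERT ON bev
    FOR EACH ROW
    BEGIN
      -- SHKZG = 'X' is checked first so it overrides the SPART-based rates,
      -- matching the order in which the original UPDATE statements were applied.
      INSERT INTO bev2 (vbeln, posnr, bwsie, bwfkimg)
      VALUES (:new.vbeln,
              :new.posnr,
              CASE
                WHEN :new.shkzg = 'X'                        THEN :new.fkimg * -1
                WHEN :new.spart = '10' AND :new.meins = 'EA' THEN :new.fkimg * 12.17
                WHEN :new.spart = '12' AND :new.meins = 'EA' THEN :new.fkimg * 19.02
                WHEN :new.spart = '15' AND :new.meins = 'EA' THEN :new.fkimg * 27.39
                WHEN :new.spart = '20' AND :new.meins = 'EA' THEN :new.fkimg * 48.69
                WHEN :new.spart = '30' AND :new.meins = 'EA' THEN :new.fkimg * 109.56
                ELSE 0
              END,
              CASE WHEN :new.shkzg = 'X' THEN :new.fkimg * -1 END);
    END;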

  • Update by after insert trigger

    I need to update each newly inserted row of data by checking whether the same values already exist in the table. So I tried an AFTER INSERT trigger with the code below:
    CREATE OR REPLACE TRIGGER approve_checker
    after insert on TBL_BILL
    for each row
    declare
      counter integer;
    begin
      SELECT COUNT(*) INTO counter FROM TBL_BILL
       WHERE issue_date = :NEW.issue_date AND wagon_no = :NEW.wagon_no;
      IF (counter > 1) THEN
        UPDATE TBL_BILL SET approved = 2 WHERE id = :NEW.id;
      END IF;
    end approve_checker;
    But these errors occurred:
    ORA-04091: table RAIL.BILL is mutating, trigger/function may not see it
    ORA-06512: at "RAIL.APPROVE_CHECKER", line 4
    ORA-04088: error during execution of trigger 'RAIL.APPROVE_CHECKER'
    Any help appreciated.

    You cannot SELECT from the table on which the row-level trigger is defined, nor can you UPDATE a table on which the row-level trigger is defined. If you are inserting multiple rows in a single INSERT statement, Oracle can't be sure what rows to update and/or select.
    Is there a reason that you wouldn't just create a unique constraint to enforce this condition?
    Justin
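    If marking duplicates (rather than rejecting them with a unique constraint) really is required, one way around ORA-04091 is a compound trigger (Oracle 11g or later): collect the new ids at row level and do the duplicate check after the statement, when the table is no longer mutating. A sketch using the names from the post, untested and illustrative only:
    CREATE OR REPLACE TRIGGER approve_checker
    FOR INSERT ON tbl_bill
    COMPOUND TRIGGER
      TYPE t_id_tab IS TABLE OF tbl_bill.id%TYPE;
      g_ids t_id_tab := t_id_tab();

      AFTER EACH ROW IS
      BEGIN
        -- Remember which rows this statement inserted.
        g_ids.EXTEND;
        g_ids(g_ids.COUNT) := :NEW.id;
      END AFTER EACH ROW;

      AFTER STATEMENT IS
      BEGIN
        -- The table is no longer mutating here, so it can be queried and updated.
        FOR i IN 1 .. g_ids.COUNT LOOP
          UPDATE tbl_bill b
             SET b.approved = 2
           WHERE b.id = g_ids(i)
             AND EXISTS (SELECT NULL
                           FROM tbl_bill x
                          WHERE x.issue_date = b.issue_date
                            AND x.wagon_no   = b.wagon_no
                            AND x.id        <> b.id);
        END LOOP;
      END AFTER STATEMENT;
    END approve_checker;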

  • Issue with list saving data after sites upgrade from sharepoint 2010 to sharepoint 2013

    NewForm.aspx of the list:
    The custom list sometimes does not save data in the new form after 15 minutes; only a blank record gets created without the data being saved, even though some columns are mandatory fields.

    Hello dcakumar,
    Sounds like a strange issue. If you can reproduce this, can you see any errors in the ULS logs?
    - Dennis | Netherlands

  • EBS: Email XML publisher output, from After Report Trigger in Data Template

    Here is what I'm trying to do:
    -- In EBS (11.5.10 CU2), I'm using XML publisher (5.6.2) data template and layout template to generate Output files (PDF, EXCEL etc)
    --In the Data Template's AfterReport Trigger, I'm using the Concurrent Request Id to locate the Output file name and trying to Email that output file.
    Problem:
    -- When the AfterReport trigger code is executed, the code is NOT seeing the output file and hence the file is NOT emailed.
    Observations/Questions:
    -- From what I observe, the Output Post Processor ( that generates the Excel / PDF files) is running AFTER the code in AfterReport trigger... and hence the AfterReport trigger is Not quite seeing / able to access the output file.
    So, the sequence of execution seems to be:
    -- Before Report Trigger
    -- Data Query (SQL statement)
    -- After Report Trigger
    -- Output Post Processor
    Because the AfterReport Trigger is running before the Output Post Processor, it is Not able to see the output file. Is that a True statement?
    If Yes, how else can the DataTemplate access the Output file?
    If No, what could cause the AfterReport trigger to not see the output file?

    Because the AfterReport Trigger is running before the Output Post Processor, it is Not able to see the output file. Is that a True statement?
    I believe so, as the OPP works on the output of the Report after the Report has completed execution.
    You could use the same approach as we do for bursting the report to different users. Write a Java Concurrent program based on "oracle.apps.xdo.oa.cp.XMLPReportBurst" with delivery channel Email to send the email output. You would need to add code to launch the Concurrent child request in your AfterReport Trigger:
    function AfterReport return boolean is
      jreq_id number;
    begin
      srw.message (100, 'DEBUG: AfterReport_Trigger +');
      jreq_id := FND_REQUEST.SUBMIT_REQUEST ('XDO', 'XDOBURST', '', '', FALSE, :P_CONC_REQUEST_ID, 'Y', chr(0));
      if (jreq_id = 0) then
        srw.message (100, 'Request id is zero');
      end if;
      return (TRUE);
    end;

  • Table - Issue with Logging Data Changes

    Hi Experts,
    My client's requirement is to log data changes made to a table which is critical. I've selected the 'Log data changes' check box in the technical settings of the table. Now I'm able to see a log for all the changes I made after that. But I'm facing the following issues; kindly help me fix them.
    1. The table has two fields with the same data element. In the change log (SCU3), the field label from the data element is displayed under the field name. As the data element is the same for the two fields, it's difficult to find out which particular field was changed. Is there any way to display the table field name instead of the field label from the data element?
    2. Sometimes, even though there is a change in the data record, I see a log record in the change log with the caption 'Data record unchanged'. What might be the problem and how can it be resolved?
    Thanks in Advance,
    Siva Sankar.

    Hi Moha Nan,
    You just use ABAP to change the P table? In this way I think the X table is not changed.
    You can upload master data to add navigation; it is a good way to build the SIDs of the P table and X table.
    Hope this helps!

  • Mutating table exception on trigger with After Insert but not with before

    Hi
    I need to maintain a constraint on the table so that it has only one row for a given set of column values, but not via a primary key constraint.
    With a primary key the insert would fail and the rest of the operation would be discontinued; I cannot change the component that inserts the row, so I have to prevent this on the table itself.
    I created a BEFORE INSERT trigger on the table which checks whether any row exists with the same column values as the one being inserted. If found, I delete those rows and let the insert happen (without raising any error); if no such rows exist, the insert continues.
    I read somewhere that modifying the same table in the trigger body should raise a mutating table exception, but I do not get the exception when the trigger fires.
    Only when I change the trigger to an AFTER INSERT trigger is the mutating table exception thrown.
    Is this the correct behavior, i.e. the BEFORE INSERT trigger does not raise the exception and only the AFTER INSERT trigger does? I could not find an example of BEFORE INSERT triggers throwing such an exception, so I think it is better to confirm before finalizing the implementation.
    Thanks
    Sapan

    sapan wrote:
    Hi Tubby
    I cannot use a unique constraint because that would raise an exception upon violation and the third-party component that is inserting into the table would fail.
    That component does some other tasks as well after this insert, and if an exception is raised then those tasks would not be performed.
    Also I cannot change the component to ignore this exception.
    Well then, you're in a bit of a pickle.
    I'm guessing the trigger you have been working on isn't "safe". By that i mean that it doesn't account for multi-user scenarios. You'll need to serialize access to the data elements in question and implement some sort of locking mechanism to ensure that only 1 session can work with those values.
    After you work out how to do that it sounds as though you would be better served using an INSTEAD OF trigger (you'd need to implement this on a view which is made off of your base table).
    Here's one way you can work on serializing access to your table on a relatively fine grained level (as opposed to locking the entire table).
    Re: possible to lock stored procedure so only one session may run it at a time?
    Cheers,

  • How to log data directly to an ODBC datasource(MS-Access).

    Greetings,
    I have been logging the data (an array) obtained from a transducer directly into a spreadsheet using a LabVIEW VI.
    Now I need to put this data directly into an Access 2000 database.
    Any suggestions?

    There are a couple of options for logging data to an ODBC database: the LabVIEW Datalogging and Supervisory Control (DSC) module or the LabVIEW Enterprise Connectivity Toolkit.
    With LabVIEW DSC comes the ODBC Citadel Database. This allows you to easily log data, and you could retrieve this data into MS Access with SQL statements. More information about the DSC module can be found at:
    http://sine.ni.com/apps/we/nioc.vp?lang=US&pc=mn&cid=1010
    The Enterprise Connectivity Toolkit gives you tools for database connectivity, statistical process control, and internet-enabling technologies. More information about this toolkit can be found at:
    http://sine.ni.com/apps/we/nioc.vp?lang=US&pc=mn&cid=2452
