Polling for new data in tables with SQL functions

Hi,
in my table I have two columns:
event_state : integer (0 = unread, 1 = read)
min_process_time : date
In the DB Adapter wizard I can configure my event_state field to be used as the logical delete field. The generated SQL query then contains
... WHERE event_state = 0
But I only want to query all new entries in this table where
min_process_time > SYSDATE.
How can I configure my TopLink mapping so that SYSDATE can be used in the polling query? If I use plain SQL instead of the expression from the wizard (last page), the SQL statement is ignored, and in an expression I can only use a constant value.
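In plain SQL, the polling query I am after would look roughly like this (EVENTS is only a placeholder for the real table name; the wizard itself only generates the event_state predicate):

SELECT *
FROM events                          -- placeholder table name
WHERE event_state = 0                -- logical delete condition from the wizard
  AND min_process_time > SYSDATE     -- the extra condition I cannot configure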
Any idea?
Thanks
Gregor

Hi,
I am creating a database polling (logical delete OPEN to CLOSED) adapter that polls a table for records which have a "SCHEDULED" date field.
The polling adapter should pick up those records where the status is OPEN and SCHEDULED <= SYSDATE.
The DB Adapter wizard does not allow this WHERE clause (SCHEDULED <= SYSDATE) to be set, so I tried modifying the TopLink SQL with custom SQL, but that does not work either. Please suggest a workaround.
Please help.
Thanks
Debashis
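One workaround that is sometimes suggested for this kind of requirement (not confirmed in this thread, so treat it as a sketch) is to push the date predicate into a database view and point the adapter at the view, keeping the normal logical-delete handling on the status column. The table and view names below are placeholders:

CREATE OR REPLACE VIEW scheduled_work_due AS
SELECT *
FROM scheduled_work              -- placeholder for the real table
WHERE scheduled <= SYSDATE;      -- the predicate the wizard cannot express

The adapter would then poll SCHEDULED_WORK_DUE with STATUS as the logical delete column; since this is a simple single-table view, the logical-delete UPDATE issued by the adapter should go through to the base table, but that is worth verifying in the target environment.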

Similar Messages

  • How can I load data into table with SQL*LOADER

    How can I load data into a table with SQL*Loader
    when the column data length is more than 255 bytes?
    When a column exceeds 255 bytes, the data cannot be inserted into the table by SQL*Loader.
    CREATE TABLE A (
    A VARCHAR2 ( 10 ) ,
    B VARCHAR2 ( 10 ) ,
    C VARCHAR2 ( 10 ) ,
    E VARCHAR2 ( 2000 ) );
    control file:
    load data
    append into table A
    fields terminated by X'09'
    (A , B , C , E )
    SQL*LOADER command:
    sqlldr test/test control=A_ctl.txt data=A.xls log=b.log
    data file (column E is more than 255 bytes):
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)

    Check this out.
    http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch06.htm#1006961
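    In short (a sketch based on that chapter, reusing the poster's control file): SQL*Loader assumes a default maximum of 255 bytes for character fields, so the long column has to be declared with an explicit length in the control file:

    load data
    append into table A
    fields terminated by X'09'
    (A, B, C,
     E CHAR(2000))   -- explicit length; without it SQL*Loader caps the field at 255 bytes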

  • Sql*loader - load data in table with multiple condition

    Hi,
    I have Oracle 9i on Sun Solaris and I need to load data into one of my Oracle tables using SQL*Loader with conditional column data.
    My table is like:
    Load_table
    col1 varchar2(10),
    col2 varchar2(10),
    col3 varchar2(10)
    Now I have to load the data like this:
    If col2 = US1 then col3 = 'AA'
    If col2 = US2 then col3 = 'BB'
    If col2 = US3 then col3 = 'CC'
    How can I load this data into the table using SQL*Loader?
    Thanks,
    Pora

    Hi,
    it is only a half-solution.
    You have to:
    1. open the file
    2. take a line
    3. split the line into values (using SUBSTR)
    4. check the condition ('01' or '02')
    5. do the proper insert
    Good Luck,
    Przemek
    DECLARE
      v_dir     VARCHAR2(50) := 'd:/tmp/';   -- directory where the file is placed
      v_file    VARCHAR2(50) := 'test.txt';  -- file name
      v_fhandle UTL_FILE.FILE_TYPE;          -- file handle
      v_fline   VARCHAR2(906);               -- file line
      v_check   VARCHAR2(50);
    BEGIN
      v_fhandle := UTL_FILE.FOPEN(v_dir, v_file, 'R');  -- open the file for read only
      LOOP
        UTL_FILE.GET_LINE(v_fhandle, v_fline);          -- read the file line by line
        IF SUBSTR(v_fline, 17, 2) = '01' THEN           -- check the value
          INSERT INTO ... -- Time_in
        ELSE
          INSERT INTO ... -- Time_out
        END IF;
      END LOOP;
    EXCEPTION
      WHEN NO_DATA_FOUND THEN                           -- GET_LINE raises this at end of file
        UTL_FILE.FCLOSE(v_fhandle);
    END;
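    For what it's worth, SQL*Loader itself can also derive a column from a SQL expression in the control file, which would avoid the PL/SQL loop entirely. A sketch, assuming the data file is named load_table.dat, is comma-delimited, and contains only col1 and col2:

    load data
    infile 'load_table.dat'
    append into table load_table
    fields terminated by ','
    (
      col1,
      col2,
      -- col3 is not read from the file; it is computed from col2
      col3 EXPRESSION "DECODE(:col2, 'US1', 'AA', 'US2', 'BB', 'US3', 'CC')"
    )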

  • Create data base table with EXEC SQL

    Hello,
    I need to create a database table with EXEC SQL in an ABAP program.
    My code is:
    TRY.
       EXEC SQL.
          CREATE table zt_hello ( mandt char(4) NOT NULL,
                                  kunnr char(10) NOT NULL,
                                  PRIMARY KEY (mandt, kunnr) )
        ENDEXEC.
      CATCH cx_sy_native_sql_error INTO exc_ref.
        error_text = exc_ref->get_text( ).
    ENDTRY.
    IF sy-subrc = 0.
      COMMIT WORK.
    ENDIF.
    But it is still not working.
    Can you help me, please?
    Thanks.

    Please refer to this code:
    REPORT z_struct_create .
    DATA: my_row(500) TYPE c,
    my_file_1 LIKE my_row OCCURS 0 WITH HEADER LINE.
    DATA: dd02v TYPE dd02v.
    DATA: my_file_tab1 LIKE dd03p OCCURS 0 WITH HEADER LINE.
    SELECTION-SCREEN BEGIN OF BLOCK blk WITH FRAME TITLE text
    NO INTERVALS.
    PARAMETERS:
    name TYPE ddobjname,
    testo TYPE text40,
    file_1 LIKE rlgrap-filename.
    SELECTION-SCREEN SKIP.
    SELECTION-SCREEN END OF BLOCK blk.
    AT SELECTION-SCREEN ON VALUE-REQUEST FOR file_1.
    PERFORM file_selection USING file_1.
    INITIALIZATION.
    text = text-001.
    START-OF-SELECTION.
    IF file_1 IS INITIAL.
    MESSAGE ID 'Z0017_BDI' TYPE 'I' NUMBER 001.
    EXIT.
    ENDIF.
    CALL FUNCTION 'WS_UPLOAD'
    EXPORTING
    filename = file_1
    filetype = 'ASC'
    TABLES
    data_tab = my_file_1.
    IF sy-subrc <> 0.
    MESSAGE ID 'Z0017_BDI' TYPE 'I' NUMBER 002.
    EXIT.
    ENDIF.
    LOOP AT my_file_1.
    IF sy-tabix > 1.
    CLEAR my_file_tab1.
    SPLIT my_file_1 AT ';' INTO
    my_file_tab1-fieldname
    my_file_tab1-datatype
    my_file_tab1-leng
    my_file_tab1-decimals
    my_file_tab1-ddtext.
    my_file_tab1-inttype = 'C'.
    my_file_tab1-INTLEN = my_file_tab1-leng.
    my_file_tab1-tabname = name.
    my_file_tab1-position = sy-tabix - 1.
    my_file_tab1-ddlanguage = sy-langu.
    my_file_tab1-OUTPUTLEN = my_file_tab1-leng.
    APPEND my_file_tab1.
    ENDIF.
    ENDLOOP.
    dd02v-tabname = name.
    dd02v-ddlanguage = sy-langu.
    dd02v-tabclass = 'INTTAB'.
    dd02v-DDTEXT = testo.
    dd02v-MASTERLANG = sy-langu.
    IF NOT my_file_tab1[] IS INITIAL.
    CALL FUNCTION 'DDIF_TABL_PUT'
    EXPORTING
    name = name
    dd02v_wa = dd02v
    TABLES
    dd03p_tab = my_file_tab1
    EXCEPTIONS
    tabl_not_found = 1
    name_inconsistent = 2
    tabl_inconsistent = 3
    put_failure = 4
    put_refused = 5
    OTHERS = 6.
    IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
    WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    ELSE.
    MESSAGE ID 'Z0017_BDI' TYPE 'I' NUMBER 003.
    EXIT.
    ENDIF.
    *&      Form  file_selection
    *      -->P_FILE_1  text
    FORM file_selection USING p_file.
      DATA rcode TYPE i. " return code for WS_FILENAME_GET
    CALL FUNCTION 'WS_FILENAME_GET'
    EXPORTING
    def_filename = ''
    def_path = 'c:\'
    mask = ',.,..'
    mode = '0'
    title = 'Selezione file'
    IMPORTING
    filename = p_file
    RC = RCODE
    EXCEPTIONS
    inv_winsys = 1
    no_batch = 2
    selection_cancel = 3
    selection_error = 4
    OTHERS = 5.
    ENDFORM. " file_selection
    File Template:
    Fieldname;Data Type;Length;Dec.;Descr.
    FIELD1;CHAR;000020;000000;my field 1
    FIELD2;CHAR;000008;000000;my field 2
    FIELD3;CHAR;000007;000000;my field 3
    FIELD4;CHAR;000006;000000;my field 4

  • Sample report for filling the database table with test data .

    Hi ,
    Can anyone provide me a sample report for filling a database table with test data?
    Thanks ,
    Abhi.

    Hi,
    here is the code:
    data : itab type table of Z6731_DEPTDETAIL,
           wa type Z6731_DEPTDETAIL.
    wa-DEPT_ID = 'z897hkjh'.
    wa-DESCRIPTION = 'computer'.
    append wa to itab.
    wa-DEPT_ID = 'z897hkjhd'.
    wa-DESCRIPTION = 'computer'.
    append wa to itab.
    loop at itab into wa.
    insert z6731_DEPTDETAIL from wa.
    endloop.
    Reward points if helpful.

  • Problems with retrieving data from tables with 240 and more records

    Hi,
    I've been connecting to Oracle 11g Server (not sure exact version) using Oracle 10.1.0 Client and O10 Oracle 10g driver. Everything was ok.
    I installed Oracle 11.2.0 Client and I started to have problems with retrieving data from tables.
    First I used the same connection string, driver and so on (O10 Oracle 10g) then I tried ORA Oracle but with no luck. The result is like this:
    I'm able to connect to the database. I'm able to retrieve data, but only from small tables (e.g. with 110 records it works perfectly using both the O10 and ORA drivers). When I try to retrieve data from tables with around 240 or more records, the retrieval simply hangs (nothing happens at all - no error, no timeout). The application seems to hang forever.
    I'm using Powerbuilder to connect to Database (either PB10.5 using O10 driver or PB12 using ORA driver). I used DBTrace, so I see that query hangs on the first FETCH.
    So for the retrievals that hang I have something like:
    (3260008): BIND SELECT OUTPUT BUFFER (DataWindow):(DBI_SELBIND) (0.186 MS / 18978.709 MS)
    (3260008): ,len=160,type=DECIMAL,pbt=4,dbt=0,ct=0,prec=0,scale=0
    (3260008): ,len=160,type=DECIMAL,pbt=4,dbt=0,ct=0,prec=0,scale=1
    (3260008): ,len=160,type=DECIMAL,pbt=4,dbt=0,ct=0,prec=0,scale=0
    (3260008): EXECUTE:(DBI_DW_EXECUTE) (192.982 MS / 19171.691 MS)
    (3260008): FETCH NEXT:(DBI_FETCHNEXT)
    and this is the last line,
    while for retrievals that end, I have FETCH producing time, data in buffer and moving to the next Fetch until all data is retrieved
    On the side note, I have no problems with retrieving data either by SQL Developer or DbVisualizer.
    Problems started when I installed the 11.2.0 client. Even if I try to use the 10.1.0 client, the same problem occurs. So I guess something from 11.2.0 overrides the 10.1.0 settings.
    I will appreciate any comments/hints/help.
    Thank you very much.

    pgoel wrote:
    I've been connecting to Oracle 11g Server (not sure exact version) using Oracle 10.1.0 Client and O10 Oracle 10g driver. Everything was ok.
    Earlier (before installing the new stuff), did you ever try retrieving data from big tables (like 240 or more records)? If yes, was it working?
    Yes, with the Oracle 10g client (before installing 11g) I was able to retrieve any data, whether it was 10k+ records or 100 records. Installing the 11g client changed something, so that even the old 10g client (which I still have installed) fails to work. The same problem occurs whether I use the 10g or the 11g client now. PowerBuilder hangs on retrieving tables with more than about 240 records.
    Thanks.

  • DB Adapter: Polling For New Records returns the First record multiple Times

    I am polling for new or changed records against DB2 on iSeries. The DB Adapter returns the first record from the table multiple times: if the table has 5 records, it displays the first record 5 times. I am using BPEL 10.1.3.1. Can anyone help me with this?

    Hi there,
    please check out the DBAdapter trouble-shooting guide:
    http://download-east.oracle.com/docs/cd/B31017_01/integrate.1013/b28994/app_trblshoot.htm#CIHFEHFA
    I am copying an entry from there into here:
    A.1.21 Some Queried Rows Appear Twice or Not at All in the Query Result
    Problem
    When you execute a query, you may get the correct number of rows, but some rows appear multiple times and others do not appear at all.
    This behavior is typically because the primary key is configured incorrectly. If the database adapter reads two different rows that it thinks are the same (for example, the same primary key), then it writes both rows into the same instance and the first row's values are overwritten by the second row's values.
    Solution
    Open Application Sources > TopLink > TopLink Mappings. In the Structure window, double-click PHONES. On the first page, you should see Primary Keys. Make sure that the correct columns are selected to make a unique constraint.
    Save and then edit the database partner link.
    Click Next to the end, and then click Finish and Close.
    Open your toplink_mappings.xml file. For the PHONES descriptor, you should see something like this:
    <primary-key-fields>
    <field>PHONES.ID1</field>
    <field>PHONES.ID2</field>
    </primary-key-fields>
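    As a quick sanity check (not from the troubleshooting guide), a query like the one below shows whether the columns configured as the primary key really identify each row uniquely; PHONES, ID1 and ID2 are taken from the example above, so substitute your own table and key columns:

    SELECT id1, id2, COUNT(*)
    FROM phones
    GROUP BY id1, id2
    HAVING COUNT(*) > 1;

    If this returns any rows, the adapter will keep collapsing those rows into a single instance until the key definition is corrected.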
    Thanks
    Steve

  • Data Source for BSEG Data base table

    Hi Every one,
    I need to create a DataSource for the database table BSEG (function module extraction).
    I have followed the steps below.
    I created an extract structure which contains all the fields of the BSEG database table. If I save the DataSource, I get an error.
    If I remove all the currency fields from the extract structure, I am able to activate the DataSource.
    So could you please explain how to extract the currency fields?

    Hi,
    Firstly, can you explain why you need a generic extractor for BSEG? There are already two DataSources for BSEG:
    0FI_GL_4
    http://help.sap.com/saphelp_nw70/helpdata/en/0c/b4973c115a6f3ae10000000a114084/frameset.htm
    0FI_GL_14
    http://help.sap.com/saphelp_nw70/helpdata/en/49/5700570223413085021a8b4ef1087a/frameset.htm
    To use 0FI_GL_14, you need to be using the new GL on the ECC side.
    As for your problem, it may occur because you did not specify the reference unit field for your currency field in the extraction structure. When you double-click the currency field, you will see the reference field; enter the reference unit field in this area.
    Regards.

  • ERR:10003 Unexpected data store file exists for new data store

    Our TimesTen application crashed and now it cannot connect to the TimesTen data store; when we use ttIsql we get the error "10003 Unexpected data store file exists for new data store". So we have to rebuild the data store.
    I guess the application damaged the data store because we use "direct-linked" mode. Is that true?
    Should I use "client-server" mode if our data is very important?
    thx!

    Your question raises several important discussion points:
    It is possible (though very unlikely in practice) for a C or C++ program operating in direct mode to damage the contents of the datastore e.g. by writing through an invalid memory pointer. In the 11+ years that TimesTen has existed as a commercial product we have so far never seen any support case where this was diagnosed as the cause of a problem. However, it is definitely a theoretical possibility and rigorous program testing and use of tools such as Purify is strongly recommended when developing in C or C++ in direct mode. Java programs running in direct mode are completely 'safe' unless they invoke non-Java code via JNI when a similar risk is present.
    The reality is that most customers who use TimesTen in very high performance mission critical applications use mainly direct mode...
    Note also that an application crashing should not cause any damage or corruption to a datastore, even if it is using direct mode, as TimesTen contains explicit mechanisms to guard against this.
    Your specific problem (error 10003) is nothing to do with the datastore being damaged. This error reflects a discrepancy between the instance main daemon's metadata about all the datastores that it is managing and the reality. This error occurs when the main daemon does not know about a datastore and yet, when it comes to connect to (and hence create) the datastore, it finds that checkpoint or log files already exist. The main daemon's metadata is managed solely by the main daemon and is completely separate from the datastore and datastore files (the default location is <tt_instance_install_directory>/info, though you can change this at install time). The usual cause of this is that someone has been manually manipulating files within that directory (which of course you should never do) and has removed or renamed the .DBI file corresponding to the datastore.
    This error should never arise under normal circumstances and certainly not just because some application has crashed.
    Rather than simply switching to the (much slower) client/server mode I think we should try and understand why this error is occurring. Could you please post the following:
    1. Output of ttVersion command
    and then we can take it from there.
    Thanks, Chris

  • FDMEE Import error "No periods were identified for loading data into table 'AIF_EBS_GL_BALANCES_STG'

    Hi,
    We are having trouble while importing one ledger 'GERMANY EUR GGAAP'. It works for Dec 2014 but while trying to import data for 2015 it gives an error.
    Import error shows " RuntimeError: No periods were identified for loading data into table 'AIF_EBS_GL_BALANCES_STG'."
    I tried all the knowledge documents from Oracle Support but had no luck. Please help us resolve this issue, as it is occurring in our production system.
    I also checked all period settings under Data Management> Setup> Integration Setup > Global Mapping and Source Mapping and they all look correct.
    Also, it is only happening to one ledger; all the other ledgers are working fine without any issues.
    Thanks

    Hi,
    there are some support documents related to this issue.
    I would suggest you have a look at them.
    Regards

  • Period is locked for new data

    Good Morning,
    Anyone get this error before?
    -4013:Period is locked for new data
       at NetPoint.SynchSBO.SBOObjects.SBOOrder.NetPointToSBOOrder(NPOrder order)
       at NetPoint.SynchSBO.SBOObjects.SBOOrder.NetPointToSBO(NPQueueObject qData)
       at NetPoint.SynchSBO.SynchObjectBase.Synch()
    Haven't received it before today, of course, the day we go live.  The period we are posting to is not locked for new data and it seems to be limited to one customer only...  Any immediate thoughts?
    Thanks so much,
    Kristen

    OK, I have it working for the moment, but I am very curious about this. I went back in and selected the Install Plugin button two more times just to be sure, restarted everything, and the orders went through fine. Could it be that the plugin had only installed partially before? It didn't seem so, because other orders were flowing both ways and just certain ones seemed to get stuck... Then more gradually started failing until none were passing through. This had been working seamlessly in this environment for weeks until, of course, go-live today, so I am at a loss. Probably just that one piece of excitement for go-live that we were missing. All is well. James, thanks for responding.

  • Period is locked for new data [Message 131-107]

    Hi all,
    One of my clients faced the problem "Period is locked for new data [Message 131-107]" when doing the Period-End Closing for year 2008 in SAP Business One 2007A Patch 42.
    Can anyone help me?
    Thank you.
    Best regards,
    danny

    Hi Danny,
    Check the link
    Period End Closing
    *Close the thread if issue solved
    Regards
    Jambulingam.P

  • Period locked for new data on PO an SO

    Since last week, when we try to update a PO or SO we get this message:
    PERIOD LOCKED FOR NEW DATA.
    For example, we change only the remark field and we still get this message.
    Since POs and SOs do not impact any GL entry, this is not normal. On the demo DB I can update an SO or PO when the period is closed without any problem.

    Hi Roy,
    First you have to check whether your posting period is closed.
    If it is closed, choose your posting period, then click the triangle sign on the left side of the window.
    After that, change its status, and you will be able to make changes in the previous posting period.
    This may help you...
    Thanks.
    JRAJPUT

  • Process chains for loading data to target is not functioning

    Hi SAPians,
    Recently, on Saturday 06/12/2008, we upgraded the firmware on our IBM P590 with the help of IBM (the firmware of the P5-590 was SF235_209 and was upgraded to SF240_XXX), and since then the process chains for loading data to targets have not been functioning properly. We stopped all the SAP, database, and OS services from our end, and the services were rebooted after the firmware upgrade.
    However, the problem with the process chains that load transaction data and hierarchies is solved but the chains that load master data are still not working as scheduled.
    We tried to load the master data manually by running a DTP to load data to an object analysis code (attributes), but the request remained YELLOW. After refreshing it several times, the request turned RED. The error message was PROCESSING TERMINATED (screenshot 1 attached).
    To mitigate this we tried deleting the bad request, and it prompted with the message below:
    "Cannot lock request DTPR_4C37TCZGDUDX1PXL38LVD9CLJ of table ZANLYSIS for deletion (Message no. RSMPC178)" (screenshot 2 attached)
    Please advise us how the bad request should be deleted?
    Regards,
    Soujanya

    Hi Sowjanya,
    Follow the procedure below to turn the yellow request RED:
    Use SE37 to execute the function module RSBM_GUI_CHANGE_USTATE.
    On the next screen, enter that request ID for I_REQUID and execute.
    On the next screen, select the 'Status Erroneous' radio button and continue.
    This function module changes the status of the request from green/yellow to RED.
    Once it is RED, you can manually delete that request.
    Releasing the lock:
    Go to transaction code SM12 and release the lock.
    Hope it helps you.
    Regards,
    Nagaraju.V

  • Send a table with a function.

    Hi all!!
    I want to send a table with a function module, but I don't know how to do it.
    In my code I have this:
    FUNCTION ZBORRAR.
    ""Interfase local
    *"    TABLES
    TABLES: T751F.
    *"    VARIABLES
    DATA: BEGIN OF S_CLA_MED_CAND.
              DATA: MASSN  LIKE T751F-MASSN,
                    MNTXT  LIKE T751F-MNTXT.
    DATA: END OF S_CLA_MED_CAND.
    DATA: T_CLA_MED_CAND LIKE S_CLA_MED_CAND OCCURS 0 WITH HEADER LINE.
    *"    FILL THE I. TABLE
    SELECT MASSN MNTXT
    FROM T751F
    INTO TABLE T_CLA_MED_CAND
    WHERE SPRSL = 'S'.
    ENDFUNCTION.
    I don't have anything in the import, export, changing, or tables tabs.
    Thanks and Regards.

    I made it work.
    I followed this:
    Define a table in the TABLES tab.
    Table name: T_CLA_MED_CAND
    Type: LIKE
    Associated Type: T751F
    Table T_CLA_MED_CAND will be sent through this FM.
    AND MY CODE IS:
    FUNCTION ZBORRAR.
    ""Interfase local
    *"  TABLES
    *"      X_CLA_MED_CAND STRUCTURE  T751F
    *"    TABLES
    *"    VARIABLES
    TYPES: BEGIN OF S_CLA_MED_CAND,
              MASSN  LIKE T751F-MASSN,
              MNTXT  LIKE T751F-MNTXT,
    END OF S_CLA_MED_CAND.
    DATA: T_CLA_MED_CAND TYPE TABLE OF S_CLA_MED_CAND WITH HEADER LINE.
    *"    GET DATA
    SELECT MASSN MNTXT
    FROM T751F
    INTO TABLE T_CLA_MED_CAND
    WHERE SPRSL = 'S'.
    LOOP AT T_CLA_MED_CAND.
      MOVE: T_CLA_MED_CAND-MASSN TO X_CLA_MED_CAND-MASSN,
            T_CLA_MED_CAND-MNTXT TO X_CLA_MED_CAND-MNTXT.
      APPEND X_CLA_MED_CAND.
    ENDLOOP.
    ENDFUNCTION.
    In the results the table has the data I want, but there are two more fields, MANDT and SPRSL. Is this because I'm using SPRSL to filter the data and MANDT because of the client I'm working in?
    How can I have just the two fields?
