Why does an INSERT from a view sometimes load only some of the records?

The job consists of a simple INSERT statement where the data is inserted from a view. The issue is that, when the statement is scheduled as a batch job in the EM console, sometimes only the first record retrieved by the SELECT statement is inserted into the table and the rest are ignored. Other times it inserts correctly.
For example:
INSERT INTO Table_COLL(COL_A, COL_B, COL_C)
SELECT COL_A, COL_B, COL_C
FROM COLL_VIEW
WHERE DATE=TRUNC(SYSDATE);
COMMIT;
The SELECT statement retrieves 200 records, and the INSERT works perfectly as expected when executed in SQL*Plus.
However, when the same query is scheduled as a batch job in the EM console, or when a procedure is created for the query and scheduled in the EM console the same way, sometimes only one record is inserted, and sometimes the EM console run inserts correctly.
Can anyone explain why this is happening? (The DB is Oracle 10g and its size is 100 GB.)

I doubt that there is a bug. I suspect the scheduled run fires at a point in time when TRUNC(SYSDATE) matches fewer rows...
You could store the COUNT(*) and the execution time in a LOG table before you run the INSERT. Any transaction that depends on two different times (the job's run time and SYSDATE) can lead to unexpected results.
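For example (INSERT_LOG is a hypothetical table, purely for illustration):
INSERT INTO INSERT_LOG (RUN_TIME, ROWS_SEEN)
SELECT SYSDATE, COUNT(*)   -- log when the job actually ran and what it saw
FROM COLL_VIEW
WHERE DATE = TRUNC(SYSDATE);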
What about something like this? The outer join with b.COL_A IS NULL acts as an anti-join, so re-running the job picks up rows from yesterday onwards without inserting duplicates:
INSERT INTO Table_COLL(COL_A, COL_B, COL_C)
SELECT COL_A, COL_B, COL_C
FROM COLL_VIEW a
LEFT JOIN Table_COLL b
ON (      a.COL_A = b.COL_A
      AND a.COL_B = b.COL_B
      AND a.COL_C = b.COL_C)
WHERE a.DATE >= TRUNC(SYSDATE) - 1
AND b.COL_A IS NULL;
COMMIT;

Similar Messages

  • Insertion of records from view into a table

    Hello friends,
    I have a view LGR built on 12 tables; the view has over 1 million records. Reports based on this view run very slowly, so to speed them up I created a table LGRT and based all the reports on it. The problem is that after any change to the data behind the LGR view, I have to delete all records from the LGRT table and re-insert all records from the LGR view:
    delete from lgrt;
    insert into lgrt
    select *
    from lgr;
    Friends, it's a very long procedure. What can I do to minimize the manual work?
    I mean, it should be done automatically.

    Hello,
    It sounds like you could use a materialised view here to speed things up; it will also save you the hassle of deleting records from the intermediate table etc., as materialised views offer quite a few synchronisation options. Have a look at the documentation here:
    http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96540/statements_63a.htm#SQLRF01302
    There are quite a few examples toward the bottom of the page.
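    For instance, a minimal sketch (the name and refresh options are placeholders; a fast refresh would additionally need materialised view logs on the 12 base tables):
    CREATE MATERIALIZED VIEW LGRT_MV
      BUILD IMMEDIATE
      REFRESH COMPLETE ON DEMAND
      AS SELECT * FROM LGR;
    -- then, whenever the underlying data changes:
    EXEC DBMS_MVIEW.REFRESH('LGRT_MV', 'C');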
    HTH

  • Insert multiple records from web dynpro

    Hi,
    How do I insert multiple records from a Web Dynpro application into the SAP backend system?
    Thanks,
    sowmya

    Hi Sowmya,
    If you want to save multiple selected rows to the SAP backend: the UI table is bound to a value node (Table), the backend input node is Table_Bapi_Input, and the backend internal table line type is Tableback.
    int size = wdContext.nodeTable().size();
    int lead = wdContext.nodeTable().getLeadSelection();
    Table_Bapi_Input in = new Table_Bapi_Input();
    wdContext.nodeTable_Bapi_Input().bind(in);
    // collect every selected row (lead selection or multi-selection)
    for (int i = size - 1; i >= 0; i--) {
        if (lead == i || wdContext.nodeTable().isMultiSelected(i)) {
            Tableback set = new Tableback();
            set.setName(wdContext.nodeTable().getTableElementAt(i).getName());
            in.addZc_input(set);
        }
    }
    // execute the BAPI once, after all selected rows have been added
    wdContext.currentTable_Bapi_InputElement().modelObject().execute();
    wdContext.nodeOutput().invalidate();
    thanks
    jati

  • Hi Apple! I am having a few problems with my 6 month old iPad mini (iOS 7.1.1). I am using my iPad and then some time later it just restarts! Don't know why and it just keeps doing it! Please fix, or if I restore my iPad will it do any good? Thank you


    Plug your iPad into your computer, open iTunes, and click Restore. This will reset your iPad to factory settings, so make sure everything important is stored safely. Once the restore is complete, do NOT restore from your backup.
    If you still have problems, take the iPad to the Apple Store or another AASP (Apple Authorised Service Provider). If you're still within warranty, any service will be at no cost. Check your serial number here or find your receipt.
    Hope I've helped!
    PS: This is a peer-to-peer forum. Apple generally doesn't pay attention here.

  • Time out while reading single record from CRMD_ORDERADM_H table on OBJECTID

    Hi,
    This is the problem I am facing with CRMD_ORDERADM_H.
    If I search for a single record in the CRMD_ORDERADM_H table using SE11 on the OBJECT_ID field, it gives me a timeout error.
    The CRMD_ORDERADM_H table size is > 1 billion records.
    It has a standard secondary index on OBJECT_ID.
    If I search for a single record with OBJECT_ID and PROCESS_TYPE, I get the result within seconds.
    But if I use a range on OBJECT_ID and a single value for PROCESS_TYPE, I get a timeout error.
    We have a custom index on the OBJECT_ID and PROCESS_TYPE combination.
    What could be the cause?
    Thanks in advance,
    -Kishore

    Hello,
    There is a special table for reading records from orders: CRMD_ORDER_INDEX.
    Regards, R

  • Insert Matching Records from Lookup Table to Main Table

    First off, I want to say many thanks for all the help that I've been provided on here with my other posts. I really feel as though my SQL knowledge is much better than it was even a few short weeks ago, in large part thanks to this forum.
    I ran into a snag, which I'm hoping someone can provide me some guidance on. I have two tables, an import table and a lookup table. Whenever there are matches between the "Types" in the two tables, I need a single instance of the "Type" and all corresponding fields from the lookup table appended to the import table. There will only be a single instance of each type in the "Lookup" table. Below is an example of how the data might look and the results that I would need appended.
    tblLookup
    Type  Name    Address       City
    A     Dummy1  DummyAddress  No City
    B     Dummy2  DummyAddress  No City
    C     Dummy3  DummyAddress  No City
    tblImport
    Type  Name   Address  City
    A     John   Maple    Miami
    A     Mary   Main     Chicago
    A     Ben    Pacific  Eugene
    B     Frank  Dove     Boston
    Data that would be appended to tblImport
    Type  Name    Address       City
    A     Dummy1  DummyAddress  No City
    B     Dummy2  DummyAddress  No City
    As you can see, only a single instance will be inserted even though there may be multiple instances in the import table. This is the part that I'm struggling with. Any assistance would be appreciated.

    I'm not really sure how else to explain it. With my example, the join would be on "Type". As you can see, there are 2 matching records between the tables (A and B). I would need a single instance of A and B to be inserted into the import table.
    Below is a SQL statement, which I guess is what you're asking for, but it will not do what I need it to do. With the example below, it would insert multiple instances of type "A" into the import table.
    INSERT INTO tblImport (Type, Name, Address, City)
    SELECT tblLookup.Type, tblLookup.Name,
           tblLookup.Address, tblLookup.City
    FROM tblLookup
    JOIN tblImport ON tblLookup.Type = tblImport.Type
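    For what it's worth, since the lookup table has only one row per Type, an EXISTS test against the import table would return each matching lookup row exactly once (a sketch, assuming Type alone identifies a lookup row):
    INSERT INTO tblImport (Type, Name, Address, City)
    SELECT L.Type, L.Name, L.Address, L.City
    FROM tblLookup L
    WHERE EXISTS (SELECT 1 FROM tblImport I
                  WHERE I.Type = L.Type);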

  • Time out error while fetching records from table BKPF

    Hi,
    I am fetching records from table BKPF using BUKRS & AWKEY in the WHERE clause. The query is as follows:
        SELECT BELNR XBLNR AWKEY
          FROM BKPF
          INTO TABLE L_I_BKPF_TEMP
          PACKAGE SIZE 500
          WHERE BUKRS LIKE L_C_EG
            AND AWKEY IN L_R_AWKEY.
          APPEND LINES OF L_I_BKPF_TEMP TO I_BKPF.
        ENDSELECT.
    The program is giving a timeout error. There are 25628 records in the range L_R_AWKEY, and I am fetching 500 records at a time using PACKAGE SIZE, but execution still stops on this query.
    Please suggest something to overcome this problem.

    Hi
    Rui is right: if you need to get the data by operation parameters, you have to use the fields AWTYP and AWKEY.
    In that selection you can omit the company code.
    SELECT BELNR XBLNR AWKEY
      FROM BKPF
      INTO TABLE L_I_BKPF_TEMP
      PACKAGE SIZE 500
      WHERE AWTYP = <......>        "<------------
        AND AWKEY IN L_R_AWKEY.
      APPEND LINES OF L_I_BKPF_TEMP TO I_BKPF.
    ENDSELECT.
    Max

  • How to check at what time the extractor has picked records from r3 tables

    Hi experts ,
    How can we find out exactly at what time the extractor picked up the records from the R/3 tables, i.e. the timestamp at which the extractor picked the records after the R/3 entries were created?
    Regards ,
    Subash Balakrishnan

    Hi,
    The following are a few function modules which will give you the information you need, based upon the area you are working in:
    SD Billing: LOG_CONTENT_BILLING
    Delivery: LOG_CONTENT_DELIVERY
    Purchasing: LOG_CONTENT_PURCHASING etc...
    See if the above FMs help you in any way...

  • Need to insert selected records from PROD into TEST

    I am trying to insert all records (for a specific region only) from production to test.
    I'm a little confused about what the PL/SQL would look like for this.
    Note that as I insert into the table in TEST, I also have a key field that is auto-incremented by 1 each time.
    The problem that I am having is I need to link two tables in PROD together to determine the region:
    So in test, I want to do something like:
    INSERT INTO ACCOUNT_PRICE
    (select * from ACCOUNT_PRICE@PROD, MARKETER_ACCOUNT@PROD
    where substr(MARKETER_ACCT_NO,1,1) = '3'
    and MARKETER_ACCOUNT_NO = PRICE_ACCOUNT_NO);
    However, I'm not sure if this is correct or whether I should be using a BULK insert.
    Note that I cannot just load the whole table as I need to restrict it to only one region of data.
    Any help would be appreciated.
    Sean

    Direct load (BULK) is irrelevant to what you are asking. I would strongly suggest that you read the docs about this feature before considering it for any purpose.
    As to your question: what you are asking is unclear and, for reasons known only to you, you did not include a version number.
    So, given that I get to invent your version number, I have decided you have 10g and therefore DataPump, so my recommendation is to use DataPump to extract and export the records you wish to move.
    http://www.morganslibrary.org/reference/dbms_datapump.html
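    For example, a sketch of an expdp parameter file run against PROD (the directory object, file names and filter are placeholders):
    tables=ACCOUNT_PRICE
    directory=DP_DIR
    dumpfile=region3.dmp
    query=ACCOUNT_PRICE:"WHERE price_account_no IN (SELECT marketer_account_no FROM marketer_account WHERE SUBSTR(marketer_acct_no,1,1) = '3')"
    Run it with expdp <user> parfile=region3.par on PROD, then load the resulting dump into TEST with impdp.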

  • Inserting nested records from XML to DB

    Hi,
    I am facing a problem with inserting nested records from XML into the DB. For example, I have this XML:
    <?xml version="1.0" encoding="utf-8" ?>
    <ns0:CutLOTUpdate xmlns:ns0="http://LayoutTracking/v1.0">
    <C>1</C>
    <COMMENTS>Main1</COMMENTS>
    <CUT_DATA>
    <CUT>
    <D>1</D>
    <COMMENTS>2Main1</COMMENTS>
    <IT>
    <E>11</E>
    <COMMENTS>3Det1</COMMENTS>
    </IT>
    <IT>
    <E>12</E>
    <COMMENTS>3Det2</COMMENTS>
    </IT>
    </CUT>
    <CUT>
    <D>2</D>
    <COMMENTS>2Main2</COMMENTS>
    <IT>
    <E>21</E>
    <COMMENTS>3Det1</COMMENTS>
    </IT>
    <IT>
    <E>22</E>
    <COMMENTS>3Det2</COMMENTS>
    </IT>
    </CUT>
    </CUT_DATA>
    </ns0:CutLOTUpdate>
    I would like to insert this data into the following table in a denormalized form:
    CREATE TABLE A (
    C NUMBER,
    D NUMBER,
    E NUMBER,
    C_COMMENTS VARCHAR2(50),
    D_COMMENTS VARCHAR2(50),
    E_COMMENTS VARCHAR2(50))
    I have tried using this procedure:
    CREATE OR REPLACE PROCEDURE insc (Cut_Clob CLOB) AS
    Cut XMLType;
    BEGIN
    /*Converts Cut_Clob parameter into XML */
    Cut := sys.xmltype.createXML(Cut_Clob);
    /*Inserts data from XML to table*/
    INSERT INTO a
      (c, c_comments,
       d, d_comments,
       e, e_comments)
    SELECT DISTINCT
      ExtractValue(Cut, '/ns0:CutLOTUpdate/C', 'xmlns:ns0="http://LayoutTracking/v1.0"') c,
      ExtractValue(Cut, '/ns0:CutLOTUpdate/COMMENTS', 'xmlns:ns0="http://LayoutTracking/v1.0"') c_comments,
      ExtractValue(value(ct), '/CUT/D') d,
      ExtractValue(value(ct), '/CUT/COMMENTS') d_comments,
      ExtractValue(value(it), '/IT/E') e,
      ExtractValue(value(it), '/IT/COMMENTS') e_comments
    FROM TABLE (xmlsequence(extract(Cut, '/ns0:CutLOTUpdate/CUT_DATA/CUT', 'xmlns:ns0="http://LayoutTracking/v1.0"'))) ct,
         TABLE (xmlsequence(extract(Cut, '/ns0:CutLOTUpdate/CUT_DATA/CUT/IT', 'xmlns:ns0="http://LayoutTracking/v1.0"'))) it;
    COMMIT;
    END;
    However, this resulted in a Cartesian product.
    Is it possible for me to insert this XML into such table? If yes, can anyone show me how?
    I apologize if this seems trivial to you and I appreciate your time for helping me.
    Thank you,
    Kaye

    Hi,
    I have tried:
    FROM TABLE (xmlsequence(extract(Cut, '/ns0:CutLOTUpdate/CUT_DATA/CUT', 'xmlns:ns0="http://LayoutTracking/v1.0"'))) ct,
         TABLE (xmlsequence(extract(Cut, '/ns0:CutLOTUpdate/CUT_DATA/CUT/IT', 'xmlns:ns0="http://LayoutTracking/v1.0"'))) it;
    This did not work, still resulting in a Cartesian product.
    I am working with Oracle 10g DB 10.2.0.1.
    If it's not too much, I am hoping that someone could show me a script to parse this XML and actually place it in a denormalized form.
    If you think this is not possible, can you please just point me to an example where the same XML (with nested information) is inserted into 3 different relational tables?
    I have tried searching different sources, to no avail. I am a beginner at this... I apologize for any inconvenience caused.
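    For what it's worth, the Cartesian product usually comes from the two TABLE(XMLSEQUENCE(...)) sources both being driven from the document root, so every CUT gets paired with every IT. Correlating the inner extraction with the parent row should fix it; a sketch (untested, using the same namespace string as above):
    FROM TABLE (xmlsequence(extract(Cut, '/ns0:CutLOTUpdate/CUT_DATA/CUT', 'xmlns:ns0="http://LayoutTracking/v1.0"'))) ct,
         TABLE (xmlsequence(extract(value(ct), '/CUT/IT'))) it
    This way each IT row is produced only from its own parent CUT, and the ExtractValue calls on value(ct) and value(it) stay as they are.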

  • HT4522 I would like to delete some time capsule files that I erased from my mac. Please and thanks

    Please help me delete some files that are on my Time Capsule which I have erased from my computer.
    thank you

    Since you deleted the files on your Mac, Time Machine will eventually delete them automatically from the Time Capsule, so you really need do nothing.
    If you really do want to delete the files right now, open Time Machine, wait for it to load and use the timeline on the right side of the window to go back to a date when the files were still on your Mac.
    Use the Finder-like interface to navigate to the specific file that you want to delete and click on it to highlight it.
    Click the gear icon at the top of the Time Machine window and click "Delete all backups of  xxxxxxxx".
    Do the same for the other files that you want to delete.
    Click Cancel at the lower left of the window to return to your normal desktop.

  • Insert statement does not insert all records from a partitioned table

    Hi
    I need to insert records into a table from a partitioned table. I set up a job and, to my surprise, found that the INSERT statement is not inserting all the records from the partitioned table.
    For example, when I run the SELECT statement against the partitioned table it returns 400 records, but the INSERT loads only 100.
    Can anyone help in this matter?

    INSERT INTO TABLENAME (COLUMNS)
    SELECT *
    FROM SCHEMA1.TABLENAME1
    JOIN SCHEMA2.TABLENAME2 a
      ON CONDITION
    JOIN SCHEMA2.TABLENAME2 b
      ON CONDITION AND CONDITION
    WHERE CONDITION
      AND CONDITION
      AND CONDITION
      AND CONDITION
      AND CONDITION
    GROUP BY COLUMNS
    HAVING SUM(COLUMN) > 0
    -- the HAVING filter runs after grouping, so this query can legitimately
    -- return fewer rows than a plain SELECT against the partitioned table

  • Insert the record from XML file to Tables.

    Hi guys,
    I will be getting the XML file from the front end. It will be stored in one permanent location (directory). I need to insert the data into the corresponding table. If anybody has suggestions, please share them.
    Regards....
    GKM

    Using the Oracle XML DB WebDAV area is one method, as it acts like a file system but essentially gets the documents directly into the database, where they can be queried through RESOURCE_VIEW.
    Other methods involve reading the file as if it were a CLOB and then using the XMLTYPE constructor to turn that CLOB into an XMLTYPE, which can then be stored as that datatype in the database or processed as you need.
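    For instance, a minimal sketch of the file route (the staging table, directory object and file name are made up for illustration):
    INSERT INTO xml_stage (doc)
    VALUES (XMLTYPE(BFILENAME('XML_DIR', 'input.xml'),
                    NLS_CHARSET_ID('AL32UTF8')));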
    The best place to look is over in the XML DB forum, which has its own FAQ detailing various best practices for all sorts of XML work, including reading XML files and shredding them into relational tables etc.
    {thread:id=410714}
    Edited by: BluShadow on 18-Jan-2012 08:53
    corrected link

  • Mapping is inserting multiple records from a single source to Dimension.

    Hi All,
    I am very new to OWB. Please help me out. I created a dimension with the help of the wizard and then a mapping which consists of a single source and a single dimension. The mapping is populating nearly 500 times the actual number of records. Some details to give you a better understanding of the mapping: I created a dimension with four levels and two hierarchies. The levels are L1, L2, L3 and L4, and the hierarchies are H1 -> L1, L2, L4 and H2 -> L3, L4. L4 is the lowest level of each hierarchy; L1 and L3 are the parent levels in the respective hierarchies. I assigned an attribute of each level as the business identifier, which means the business identifier attribute is different in each level. In the mapping I mapped the parent natural key (the key for the parent level in a hierarchy) to the value that was mapped for the parent level. The result is 500 times the records that exist in the source table. I have even tried a single common business identifier for each level, but then the result is 5 times the records. Please let me know the solution.
    Thanks in advance.
    Amit

    Hi,
    You may not actually have multiple records in your dimension.
    To understand the insertion better, try a snowflake version of the dimension and see how the records are inserted into the respective tables as per the levels.
    Thanks

  • Inserting multiple records from a QofQ?

    I'm doing (still!) an app for parents to sign up for information from their child's school, giving them the option to choose more than one grade using checkboxes. I'm passing the grades as a string, then parsing them into individual searchable grades, and then querying the db to see if that email/school/grade subscription already exists. If it doesn't exist, I want to add it to the db. Using CFDUMP, I've verified that I'm extracting the correct records to add.
    I've attached the query that selects the records to be added, and then my current INSERT query (which chokes on "INSERT"). I've tried putting brackets around INSERT per the CFWACK, but that didn't work either. The error is "Query of Queries syntax error; Encountered INSERT".

    I can see a couple of things wrong with your code -
    1) You shouldn't need to apply the following restrictions to your subsToAdd query, as you already apply them in your Ignatz query:
    (Email = '#Form.Email#') AND (LocationCode = #Form.LocationCode#)
    2) For your insert statement, you want to surround the entire SQL statement with the <cfloop> block:
    <cfquery name="saveSubs" dbtype="query">
    <cfloop query="subsToAdd">
    INSERT INTO ...
    </cfloop>
    </cfquery>
    3) You're probably also receiving an error because you aren't qualifying your text fields (like Email) with single quotes. A better solution would be to use the <cfqueryparam> tag with your query - plus it will make your SQL run faster as well!
    INSERT INTO Subscriptions (Email, LocationCode, GradeID)
    VALUES (<cfqueryparam cfsqltype="CF_SQL_VARCHAR" value="#Email#" maxlength="50">,
            <cfqueryparam cfsqltype="CF_SQL_VARCHAR" value="#locationCode#" maxlength="30">,
            <cfqueryparam cfsqltype="CF_SQL_INTEGER" value="#gradeID#">)
    (I took a guess as to the datatypes of your fields; you'll have to adjust them accordingly.)
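    One more caveat: Query of Queries only supports SELECT statements, which is why the parser chokes on INSERT no matter how you quote it; the INSERT has to run against the datasource itself. Putting the pieces together (the datasource name is a placeholder):
    <cfloop query="subsToAdd">
      <cfquery datasource="myDSN">
        INSERT INTO Subscriptions (Email, LocationCode, GradeID)
        VALUES (<cfqueryparam cfsqltype="CF_SQL_VARCHAR" value="#subsToAdd.Email#" maxlength="50">,
                <cfqueryparam cfsqltype="CF_SQL_VARCHAR" value="#subsToAdd.LocationCode#" maxlength="30">,
                <cfqueryparam cfsqltype="CF_SQL_INTEGER" value="#subsToAdd.GradeID#">)
      </cfquery>
    </cfloop>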
