Sample Records, need ideas, Fill Table SQL samples

Say I need to select the 500 best-quality records for store number 77, ranked by some spend column, say SPEND_12_MONTHS.
I picked, say, 45 records/rows from one table, and I need to fill the other 455 on the spend column from a second table.
How would I do that?
I'm just looking for general ideas, so I have no sample code to show, but the rows must be distinct, with no duplicates
on Customer_id (the primary key).
I need to do this for a handful of stores, not just one, but I'm trying to get the general idea first. For example:
Store | Primary Table 1 | Fill Table 2
  1   |        45       |     455
  2   |        23       |     477
  3   |        11       |     489
The reason I need to pull from the first table is that it has priority #1, and Table 2 has priority #2. Table 1 doesn't hold that many records, which is why I need to fill the rest from Table 2.
Thank You!
I am on Oracle 10g R2.

I am not sure if I understand you correctly.
You have:
table1 - top-priority customers (small number)
table2 - all customers.
Now you want to query all 500 customers, preferably from table1 if they exist there, otherwise from table2. Customer_ID is unique in each table and consistent across both tables.
Right?
There are many ways...
SELECT t1.*
  FROM table1 t1
UNION ALL
SELECT t2.*
  FROM table2 t2
  LEFT JOIN table1 t1
    ON t1.customer_id = t2.customer_id
 WHERE t1.customer_id IS NULL;
You can also use NOT EXISTS, but this LEFT JOIN trick works quite fast in all Oracle versions - and for 500 rows, who cares.
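
If you also need to cap the combined result at 500 rows per store - table1 rows first, then the best spenders from table2 - one way on 10g is an analytic ROW_NUMBER over the union. This is only a rough sketch: the column names (store_id, customer_id, spend_12_months) are assumptions based on the post, so adjust them to your real tables.
SELECT *
  FROM (SELECT u.*,
               ROW_NUMBER() OVER (PARTITION BY u.store_id
                                  ORDER BY u.prio, u.spend_12_months DESC) AS rn
          FROM (SELECT t1.store_id, t1.customer_id, t1.spend_12_months, 1 AS prio
                  FROM table1 t1
                UNION ALL
                SELECT t2.store_id, t2.customer_id, t2.spend_12_months, 2 AS prio
                  FROM table2 t2
                 WHERE NOT EXISTS (SELECT NULL
                                     FROM table1 t1
                                    WHERE t1.customer_id = t2.customer_id)) u)
 WHERE rn <= 500;
-- prio 1 (table1) always sorts ahead of prio 2 (table2), so table2 rows only
-- top up whatever table1 could not supply for that store.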

Similar Messages

  • Need idea about table index

    Hi all,
    Could someone please advise me of a tool for working out which index is needed? In other words, I have a query doing a "table access full" because one of its tables doesn't have any indexes, so I need to know which index would be best to create. I'm not saying that "table access full" is always evil, but I suspect an index could improve performance.

    The best indexes include the most frequently used columns from the ON (join) and WHERE clauses. With no index, the optimizer can only do a full table scan.
    Create an index in a test environment, then:
    set autotrace traceonly explain
    select ... from table where <indexed col> ...
    set autotrace off  -- to turn it off
    Repeat this sequence until you have the plan you expect.
    With TRACEONLY EXPLAIN, autotrace does not actually execute the query (no rows are retrieved). Review the output, looking at the ACCESS method vs. the FILTER method.
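
    For example, a hypothetical SQL*Plus session (the table, column and index names here are made up, not from the original question) might look like this:
    CREATE INDEX emp_deptno_ix ON emp (deptno);
    SET AUTOTRACE TRACEONLY EXPLAIN
    SELECT ename, sal
      FROM emp
     WHERE deptno = 20;   -- the plan should now show INDEX RANGE SCAN on EMP_DEPTNO_IX
    SET AUTOTRACE OFF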

  • A sample code to check records of a system table?

    hi ABAP4 experts,
    We are pretty new to ABAP/4. We would appreciate it if you could provide sample code that counts the records and calculates a total amount for a specific field, e.g. DMBTR, in a system table, e.g. BSEG. Note: there is no selection on this table; we just want the total record count and the total amount of the DMBTR field for the whole of BSEG.
    Do we have to use an internal table and transfer all the records of BSEG into it to get the result?
    We will give you reward points!

    Hi Kevin,
    Using SUM directly in SQL will NOT work for table BSEG, because BSEG is a pooled table; you will get an ABAP error:
    "Aggregate functions and the addition DISTINCT are not supported in field lists for pooled and cluster tables".
    You need an internal table to transfer the data from BSEG and perform the count and sum yourself.
    With performance in mind, you could code something like this:
    REPORT ZZFLTEST NO STANDARD PAGE HEADING.
    TABLES: BSEG.
    DATA: CURS          TYPE CURSOR,
          PACKAGE_SIZE  LIKE RMCS4-MC_CM_PSIZE VALUE '10000'.                                                                               
    DATA: BEGIN OF I_BSEG OCCURS 0,
            BELNR TYPE BSEG-BELNR,
            BUKRS TYPE BSEG-BUKRS,
            GJAHR TYPE BSEG-GJAHR,       
            BUZEI TYPE BSEG-BUZEI,       
            DMBTR TYPE BSEG-DMBTR,
            SHKZG TYPE BSEG-SHKZG.
    DATA: END OF I_BSEG.
    DATA: TOT_DMBTR TYPE BSEG-DMBTR,
          TOT_REC   TYPE I.
    SELECTION-SCREEN BEGIN OF BLOCK B01 WITH FRAME TITLE TEXT-001.
    SELECTION-SCREEN SKIP.
    PARAMETERS: P_SIZE LIKE RMCS4-MC_CM_PSIZE DEFAULT '10000'.
    SELECTION-SCREEN SKIP.
    SELECTION-SCREEN END OF BLOCK B01.
    START-OF-SELECTION.
      PACKAGE_SIZE = P_SIZE.
      OPEN CURSOR WITH HOLD CURS FOR
      SELECT BELNR BUKRS GJAHR BUZEI DMBTR SHKZG
      FROM BSEG
      WHERE BELNR <> SPACE
        AND BUKRS <> SPACE
        AND GJAHR <> SPACE
        AND BUZEI <> SPACE.
    *Fetch into internal table I_BSEG in packages of 10000 records and
    *accumulate the totals before the next FETCH overwrites the table.
      DO.
        FETCH NEXT CURSOR CURS
        INTO TABLE I_BSEG PACKAGE SIZE PACKAGE_SIZE.
        IF SY-SUBRC <> 0.
          EXIT.
        ENDIF.
        LOOP AT I_BSEG.
          TOT_REC = TOT_REC + 1.
          IF I_BSEG-SHKZG = 'S'.
            TOT_DMBTR = TOT_DMBTR + I_BSEG-DMBTR * -1.
          ELSE.
            TOT_DMBTR = TOT_DMBTR + I_BSEG-DMBTR.
          ENDIF.
        ENDLOOP.
      ENDDO.
      CLOSE CURSOR CURS.
      WRITE: / 'TOTAL BSEG-DMBTR:', TOT_DMBTR,
             / 'TOTAL RECORD:    ', TOT_REC.
    END-OF-SELECTION.
    Hope this will help.
    Regards,
    Ferry Lianto

  • How to process each records in the derived table which i created using cte table using sql server

    I want to process each row from the CTE-based derived table I created. How can I traverse from the first row to the second row and so on?

    Ideally you would do set-based processing rather than traversing row by row, as that is more efficient. To answer specifically for your scenario we may need more info. Can you explain your exact requirement with some sample data?
    Visakh

  • SQL*Loader- Records Rejected - Error on table ORA-01722: invalid number

    I am getting the following errors.
    Please tell me where I am going wrong.
    Attached are the log file and snippets of the datafile, along with the control file.
    Also, please tell me how I can upload all 4900 records in one go.
    SQL*Loader: Release 11.1.0.7.0 - Production on Fri Oct 14 03:06:06 2011
    Copyright (c) 1982, 2007, Oracle. All rights reserved.
    Control File: sample.ctl
    Data File: Cities.csv
    Bad File: Cities.bad
    Discard File: none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array: 64 rows, maximum of 256000 bytes
    Continuation: none specified
    Path used: Conventional
    Table CITY, loaded from every logical record.
    Insert option in effect for this table: INSERT
    Column Name Position Len Term Encl Datatype
    ID FIRST * , CHARACTER
    NAME NEXT 35 , ' CHARACTER
    COUNTRYCODE NEXT 3 , ' CHARACTER
    POPULATION NEXT * WHT CHARACTER
    Record 1: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 2: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 3: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 4: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 5: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 6: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 7: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 8: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 9: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 10: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 11: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 12: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 13: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 14: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 15: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 16: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 17: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 18: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 19: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 20: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 21: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 22: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 23: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 24: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 25: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 26: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 27: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 28: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 29: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 30: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 31: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 32: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 33: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 34: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 35: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 36: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 37: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 38: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 39: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 40: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 41: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 42: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 43: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 44: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 45: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 46: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 47: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 48: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 49: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 50: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    Record 51: Rejected - Error on table CITY, column POPULATION.
    ORA-01722: invalid number
    MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
    Table CITY:
    0 Rows successfully loaded.
    51 Rows not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Space allocated for bind array: 35840 bytes(64 rows)
    Read buffer bytes: 1048576
    Total logical records skipped: 0
    Total logical records read: 64
    Total logical records rejected: 51
    Total logical records discarded: 0
    Run began on Fri Oct 14 03:06:06 2011
    Run ended on Fri Oct 14 03:06:12 2011
    Elapsed time was: 00:00:06.18
    CPU time was: 00:00:00.03
    my control file (sample.ctl):
    load data infile 'Cities.csv'
    into table city
    fields terminated by ','
    (id integer external,
    name char(35) enclosed by "'",
    countrycode char(3) enclosed by "'",
    population integer external terminated by '\n')
    my datafile (Cities.csv) (it contains 4900 records, but I am showing here just 4 records for ease)
    3830,'Virginia Beach','USA',425257
    3831,'Atlanta','USA',416474
    3832,'Sacramento','USA',407018
    3833,'Oakland','USA',399484
    Thanks in advance!!

    Look what happens when I change your datafile a little, as follows:
    1,'Kabul','AFG',1780000
    2,'Qandahar','AFG','237500'
    3,'Herat','AFG','186800'
    I get the same error (the last 2 rows are rejected with the same invalid-number error):
    mhouri > select * from cities;
            ID NAME                                COU POPULATION
             1 Kabul                               AFG    1780000
    SQL*Loader: Release 10.2.0.3.0 - Production on Fri Oct 14 10:38:06 2011
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Control File:   cities.ctl
    Data File:      cities.dat
      Bad File:     cities.bad
      Discard File:  none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array:     64 rows, maximum of 256000 bytes
    Continuation:    none specified
    Path used:      Conventional
    Table CITIES, loaded from every logical record.
    Insert option in effect for this table: INSERT
       Column Name                  Position   Len  Term Encl Datatype
    ID                                  FIRST     *   ,       CHARACTER           
    NAME                                 NEXT    35   ,    '  CHARACTER           
    COUNTRYCODE                          NEXT     3   ,    '  CHARACTER           
    POPULATION                           NEXT     *  WHT      CHARACTER           
    Record 4: Rejected - Error on table CITIES, column ID.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 2: Rejected - Error on table CITIES, column POPULATION.
    ORA-01722: invalid number
    Record 3: Rejected - Error on table CITIES, column POPULATION.
    ORA-01722: invalid number
    Table CITIES:
      1 Row successfully loaded.
      3 Rows not loaded due to data errors.
      0 Rows not loaded because all WHEN clauses were failed.
      0 Rows not loaded because all fields were null.
    Space allocated for bind array:                  35840 bytes(64 rows)
    Read   buffer bytes: 1048576
    Total logical records skipped:          0
    Total logical records read:             4
    Total logical records rejected:         3
    Total logical records discarded:        0
    Run began on Fri Oct 14 10:38:06 2011
    Run ended on Fri Oct 14 10:38:06 2011
    Elapsed time was:     00:00:00.23
    CPU time was:         00:00:00.09
    The population value within the data file should be a number.
    Best regards
    Mohamed Houri
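
    If the rejected rows turn out to have the number wrapped in quotes (as in the modified datafile above), one option is to let SQL*Loader strip the enclosure instead of rejecting the rows. Here is a sketch of such a control file, reusing the CITY table from the original post - treat it as an illustration, not a drop-in fix:
    LOAD DATA
    INFILE 'Cities.csv'
    INTO TABLE city
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY "'"
    TRAILING NULLCOLS
    (id          INTEGER EXTERNAL,
     name        CHAR(35),
     countrycode CHAR(3),
     population  INTEGER EXTERNAL)
    With OPTIONALLY ENCLOSED BY, both 425257 and '237500' load as numbers, and TRAILING NULLCOLS also covers short records like the one rejected with "Column not found before end of logical record".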

  • Do we need to fill setup table if we change extract structure of D/S

    Hello Experts,
    We need to add a new field to existing delta-enabled DataSources.
    Do we need to fill the setup table if we change the extract structure of a DataSource that is already loading delta data to the BW system?
    Case 1) We are planning to add the new field directly via maintain structure in TCode LBWE.
    Case 2) Adding the new field to the extract structure and writing the relevant code in CMOD.
    Thanks in advance
    Regards
    Ravi

    Hello Ravi,
    If you are adding a new field to a structure then, even though you do not need historical data in the new field, you must replicate the DataSource and then re-initialize it - in both of the cases you mentioned. The reason is that RSA7 contains your existing structure; since you are changing it, a new replica needs to be created in RSA7. You have to run an INIT without data transfer, and you do not need to refill the setup tables for an INIT without data transfer.
    After you activate the new fields and populate them using CMOD: first transfer all the records in SMQ1 to RSA7 using a V3 update, then run a delta in BW - this way you load all the existing records from the source system to BW. Then replicate the DataSource and create the required InfoObjects and mapping in BW (please note that without deleting data you won't be able to change the DataTarget structure, unless you are using remodelling in BI 7.0). Delete the DataSource entry in RSA7 and run an INIT without data transfer.
    Please let me know if you need any more information.
    Regards,
    Pankaj

  • Help needed to get unique record from an internal table

    Hi Everybody,
    I have to get a unique record from an internal table. I know we can use the READ statement with a key condition.
    But the problem is that I have to use relational operators like GE or LE, e.g.:
    read table itab into wa with key width GE itab-widthfrom
                                     width LE itab-widthto
                                     machno EQ itab-machno.
    It gives me an error when I use the operators GE and LE (I think because it can't find a unique record with those relational
    operators in the WITH KEY clause).
    Is there any other way to get a unique record from an internal table without using a loop?
    Thanks,
    Sunny

    Using the read statement you will need some kind of loop. For example.
    DO.
      READ TABLE ITAB INTO WA
        WITH KEY ......
                 READ = SPACE.
      IF SY-SUBRC EQ 0.
        WA-READ = 'X'.
        MODIFY ITAB FROM WA INDEX SY-TABIX.
        ADD 1 TO W_FOUND.
      ELSE.
        EXIT.
      ENDIF.
    ENDDO.
    IF W_FOUND EQ 1.
    * ...record is unique.
    ENDIF.

  • Need help for record deletion from custom table

    Hi friends
    I have to write a custom program, generic enough to delete records from any custom table that has a date field.
    The program needs to be generic (it should be able to delete records from any custom table), with these mandatory selection-screen parameters:
    table name, and number of days prior to which records are deleted.
    Program flow:
    1. From the number of days, calculate the date before which records are deleted (current date - no. of days = past date).
    2. The custom table has a date field; delete records prior to that date.
    3. The program may be scheduled as a background job, so put default values in both fields. The number of days should not be less than 60.
    4. Classical report output with the number of records deleted.
    My question is: how will I know the name of the date field, so that I can write the DELETE query?
    If I use 'DDIF_FIELDINFO_GET' it gives me all the field names, but how do I filter out the date field?
    with regards
    samikhya

    Hi
    I have added a field on the selection screen as p_fieldname, with F4 help, so that the user can supply the field name at runtime for the given table name.
    Now I am facing a problem while writing the DELETE query.
    I wrote:
    DELETE (fp_tab)
      WHERE (fp_fieldname) LE date.
    It is not working. I also tried concatenating fp_fieldname, LE and the date into l_string and writing:
    DELETE (fp_tab)
      WHERE (l_string).
    sy-subrc comes back as 4 and no records get deleted.
    I do not understand where the dynamic DELETE is failing.
    with regards
    Samikhya

  • Deleting/Updating records from an object table in PL/SQL

    Hello All,
    VER:
    Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    PL/SQL Release 11.2.0.3.0 - Production
    CORE     11.2.0.3.0     Production
    TNS for Linux: Version 11.2.0.3.0 - Production
    NLSRTL Version 11.2.0.3.0 - Production
    I have created an object type and inserted records into a collection of it. Is there any way we can delete/update records in it? I do not want to delete by iterating over the collection (collection.DELETE) - I would like to know if we can delete directly from it, like DELETE FROM TABLE...
    CREATE OR REPLACE TYPE test_type AS OBJECT
    (
      col1 number,
      col2 varchar2(100)
    );
    CREATE OR REPLACE TYPE tab_type is table of test_type;
    DECLARE
    test_tab tab_type;
    l_cnt NUMBER;
    BEGIN
    select test_type(col1,col2) bulk collect
    into test_tab from (select 1 as col1,'test1' as col2 from dual
                        union all
                        select 2,'test2' from dual);
    IF test_tab.count>0
    THEN
    DELETE FROM TABLE(CAST(test_tab as tab_type)) a
    where a.col1=1;
    END IF;
    l_cnt := test_tab.count;
    END;
    Thx
    Shank.

    SB,
    I have a scenario where I insert a few records into a collection. I then filter some records from that collection based on a condition.
    I want to delete the records that didn't match the filter. Right now I am inserting the records into a physical table and deleting from there; I do not want to use a physical table and am trying to avoid it.
    I would like to delete from the collection itself.
    DELETE FROM TABLE(CAST(lv_attr_filter_tab as EDMS_CSPP_DISC_REQ_TAB_TYPE))
                                 WHERE NOT EXISTS (SELECT 1
                                       FROM edms_disc_lines_stg edls
                                       WHERE edls.req_id = edrg.request_id
                                          AND edls.disc_line_id = edrg.discount_id
                                          AND UPPER(edls.disc_status) IN ('ACTIVE');
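
    SQL DELETE cannot target a PL/SQL collection variable, so one workaround is to rebuild the collection with only the rows you want to keep. A minimal sketch using the test_type/tab_type objects from the post (the filter predicate here is purely illustrative):
    DECLARE
      test_tab tab_type;
      kept_tab tab_type;
    BEGIN
      SELECT test_type(col1, col2)
        BULK COLLECT INTO test_tab
        FROM (SELECT 1 AS col1, 'test1' AS col2 FROM dual
              UNION ALL
              SELECT 2, 'test2' FROM dual);
      -- "Delete" by selecting the survivors into a second collection
      -- and assigning it back, instead of deleting in place.
      SELECT test_type(t.col1, t.col2)
        BULK COLLECT INTO kept_tab
        FROM TABLE(test_tab) t
       WHERE t.col1 <> 1;
      test_tab := kept_tab;
      DBMS_OUTPUT.put_line('Rows kept: ' || test_tab.COUNT);
    END;
    /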

  • Need Help : Urgent : Making one of the record Bold in a Table

    Hi Frds,
    I am new to OAF.......
    I am facing an issue where I have to make one of the records bold in a table.
    Using a query, I am trying to display the payslip.
    It contains the list of earnings, deductions and the net-pay amount.
    This is part of the query:
    select payment_date,element_name,arabic_name,val,balance from
      select '0' flag,assignment_id,null payment_date,'Payslip for the Month'  element_name,to_char(payment_date,'Month-YYYY')arabic_name,
       null val, null balance from xx_payroll_info
    where 1 =1
    and payment_date = last_day(:2) 
    and assignment_id = (select assignment_id from xx_people_reporting_info
        where person_id = :1)
    union all
    select '1'flag,assignment_id,payment_date,element_name,arabic_name,
       value val,null,balance
      from xx_payslip_details_mv
    where 1 = 1
    and payment_date = last_day(:2)
    and earn_deduct = ('E')
    and assignment_id in (select assignment_id from xx_people_reporting_info)
    union all
    select '2' flag,assignment_id,payment_date,'Earnings-Total',null,sum(value) val,null
      from xx_payslip_detail_mv
    where 1 =1
    and payment_date = last_day(:2)
    and earn_deduct = 'E'
    and assignment_id in (select assignment_id from xx_people_reporting_info
           where 1 =1
                and person_id = :1 )
    group by assignment_id,payment_date
    My requirement is: I have to make the 'Payslip for the Month', Date and Earnings-Total rows bold. How can I do this? Please help me out with this.
    Thanks &Regards,
    Jaya

    Hi Jaya,
    Set the CSS Class property to OraDataText for the respective column.
    OR
    // In the controller's processRequest:
    import oracle.cabo.style.CSSStyle;
    CSSStyle customCss = new CSSStyle();
    customCss.setProperty("text-transform", "uppercase");
    customCss.setProperty("font", "bold 16px \"Trebuchet MS\", Verdana, sans-serif");
    OAMessageStyledTextBean styledTextBean = (OAMessageStyledTextBean) webBean.findIndexedChildRecursive("POCommentsItem");
    if (styledTextBean != null)
      styledTextBean.setInlineStyle(customCss);
    Thanks,
    Dilip

  • Comparing Two tables with 300k records and update one table

    Could you let me know how to compare two tables having 300k records and update one table? Below is the scenario.
    Table Tabl_1 has columns A, B and Tabl_2 has columns B, new_column.
    Column B has the same data in both tables.
    I need to update Tabl_2.new_column with the data from Tabl_1.A by comparing column B in both tables.
    I am trying to do this using PL/SQL tables.
    Any suggestion?
    Thanks.

    Hi,
    Whenever you have a problem, please post a little sample data (CREATE TABLE and INSERT statements, relevant columns only) from all tables involved, so that the people who want to help you can re-create the problem and test their ideas.
    Also post the results you want from that data, and an explanation of how you get those results from that data, with specific examples.
    If you're asking about a DML statement, such as UPDATE, the CREATE TABLE and INSERT statements should re-create the tables as they are before the DML, and the results  will be the contents of the changed table(s) when everything is finished.
    Always say which version of Oracle you're using (for example, 11.2.0.2.0).
    See the forum FAQ: https://forums.oracle.com/message/9362002
    ef2019c7-080c-4475-9cf4-2cf1b1057a41 wrote:
    Could you let me know how to compare two tables having 300k records and update one table? Below is the scenario.
    Table Tabl_1 has columns A, B and Tabl_2 has columns B, new_column.
    Column B has the same data in both tables.
    I need to update Tabl_2.new_column with the data from Tabl_1.A by comparing column B in both tables.
    I am trying to do this using PL/SQL tables.
    Any suggestion?
    Thanks.
    Why are you trying to use PL/SQL tables?  If tabl_1 and tabl_2 are regular database tables, it will be much simpler and faster just to use them.
    Depending on your requirements, you can do an UPDATE or MERGE, either in SQL or in PL/SQL.
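
    For example, assuming B really is the matching key (and is unique in Tabl_1) and A is the value to copy - column names taken from the post - a single MERGE could look roughly like this:
    MERGE INTO tabl_2 t2
    USING tabl_1 t1
       ON (t2.b = t1.b)
     WHEN MATCHED THEN
          UPDATE SET t2.new_column = t1.a;
    One set-based statement like this will normally beat row-by-row PL/SQL for 300k rows.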

  • Only one round trip to database from BizTalk per message irrespective of number of records in message per table.

    I am creating a BizTalk application to store data in SQL Server,
    and my client says the following, which I do not understand:
    "Only one round trip to database from BizTalk per message irrespective of number of records in message per table."
    Can anyone help me understand this line?
    Thanks,

    One more option:
    Create a stored procedure to perform a batch insert; you can insert into any number of tables you want.
    --sample SP code...just the steps you need to know to extract the XML and perform the insert
    CREATE PROCEDURE <SPName>
    @YourXML XML
    AS
    BEGIN
        DECLARE @idoc INT, @j INT, @recordCount INT, @xpath VARCHAR(200)
        EXEC sp_xml_preparedocument @idoc OUTPUT, @YourXML
        -- set @recordCount to the number of Record nodes in the message
        SET @j = 1
        WHILE @j <= @recordCount
        BEGIN
            SET @xpath = '//ns1:RootNode/Record[' + CAST(@j AS VARCHAR(11)) + ']'
            ;WITH XMLNAMESPACES('record namespace' as ns0, 'rootnode namespace' as ns1)
            INSERT INTO YourTable
                  (field1, field2....field20)
            SELECT
                  field1, field2....field20
              FROM OPENXML(@idoc, @xpath, 2)
                   WITH(field1 varchar(2), field2 varchar(20)........field20 varchar(100))
            SET @j = @j + 1
        END
        EXEC sp_xml_removedocument @idoc
    END
    On the BizTalk side it is quite simple: generate a schema for your stored procedure, and in your map transform your XML to the stored-procedure schema using CDATA.
    Hope it helps!!

  • New To Oracle.. Needs Help:: Conversion from SQL Server to Oracle 11g

    I am new to Oracle 11g and badly need to convert some SQL Server functions to Oracle. The pasted sample code below is not working; it ends with an error. Please help.
    Create Table TempT (ID1 Varchar (10),
    ID2 Varchar (10));
    CREATE OR REPLACE PACKAGE GLOBALPKG
    AS
        TYPE RCT1 IS REF CURSOR;
        TRANCOUNT INTEGER := 0;
        IDENTITY INTEGER;
    END;
    CREATE OR REPLACE FUNCTION fTempT
    (
        i IN VARCHAR2 DEFAULT NULL
    )
    RETURN GLOBALPKG.RCT1
    IS
        REFCURSOR GLOBALPKG.RCT1;
    BEGIN
        OPEN REFCURSOR FOR
            SELECT TT.*
            FROM TempT TT
            WHERE (fTempT.i = ''
               OR TT.ID1 = fTempT.i);
        RETURN REFCURSOR;
    END;
    CREATE OR REPLACE FUNCTION fTempTF
    (
        i IN VARCHAR2 DEFAULT NULL
    )
    RETURN GLOBALPKG.RCT1
    IS
        REFCURSOR GLOBALPKG.RCT1;
    BEGIN
        OPEN REFCURSOR FOR
            SELECT *
            FROM TABLE(fTempT(i));
        RETURN REFCURSOR;
    END;
    The last function ends with this error:
    Error(13,7): PL/SQL: ORA-22905: cannot access rows from a non-nested table item

    2. The major purpose is to get the simplest way to create a parameterized function that can return table-like output. The 2nd function has no real use; I was just testing the result of the first function like this.
    If you just want to select from a select, you should use a view, not a function.
    1. Which program is more helpful for writing and executing queries? After using Microsoft's Query Analyzer it seems difficult to work in SQL Developer.
    SQL*Plus? If you are having difficulty learning a new tool because of an old one you used, it is probably best to forget the old one and concentrate on learning the new one, because it will be different. This goes for the database itself as well.
    2. Can DML be used within a function?
    Yes, you just can't execute the function in another SQL statement if it modifies data, and this is a good thing.
    3. Can temporary tables be used within a function?
    Unfortunately yes, but they shouldn't be, unless you are in a slowest-application competition.
    5. Each function which is a table function must be accompanied by type definitions? It's a bit longer way of doing things than in SQL Server.
    That is why it is better to use views instead. Is there any reason you want a select that you can select from inside a function?
    I have used SQL Server for the last 9 years; that's why I keep referring to that tool.
    That is not in itself a problem, but if you try to do in Oracle exactly what you did in SQL Server, that will be a problem.
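
    To make the "use a view" suggestion concrete, here is a minimal sketch against the TempT table from the post (the bind variable :i is just illustrative):
    CREATE OR REPLACE VIEW v_tempt AS
        SELECT * FROM TempT;
    -- filter at query time instead of inside a function
    SELECT *
      FROM v_tempt
     WHERE id1 = :i
        OR :i IS NULL;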

  • Please help! Having trouble finding records when searching two tables.

    Hi, I've been trying to solve this problem for several weeks now, and I have exhausted all of my knowledge and experience. I need help, and I hope someone here can give me some direction.
    I am using VB 2008, and the CR that comes bundled with VS 2008 Pro. My database is a SQL Server 2005 CE v3.5 (a *.sdf file). I am connecting to the database through a dataset, and I am displaying the report in a CrystalReportViewer.
    My dataset consistes of two tables:
    1) tblCustomers which has a primary key "CustID", and contains only customer contact and personal information.
    2) tblDateVisited which has a primary key of "VisitID", but it also has a column titled "CustID". Basically, every time a customer visits the business, details of that visit are recorded in tblDateVisited, and that record is associated with the customer by their CustID.
    Here's what I'm trying to accomplish: I want to be able to display only Customer records when the customer has visited and that visit matches certain criteria. Right now, I am trying to match visits from the "tblVisitDate.PlayerType" column. If the customer has ever had a visit where they matched a particular player type, I want to see those customer records.
    I don't know what I'm doing wrong, though. I can search a dataset if I am only querying one table and pulling records from that table. However, whenever I try to add a second table and perform queries on that table to get records from the first table, I can't return any records.
    If it helps, I am trying to use one CrystalReportViewer to display multiple reports (user choice) and here's how I'm loading the report into the viewer:
    Me.tblCustomersTableAdapter.Fill(Me.dsPlayerTypeReports.tblCustomers)
    Me.tblDateVisitedTableAdapter.Fill(Me.dsPlayerTypeReports.tblDateVisited)
    Me.ReportFile.SetDataSource(dsPlayerTypeReports.Tables(1))
    I am suspicious that my problem is in the Tables(1) method. It confuses me that I can only assign one table as a datasource when I obviously need access to two tables to make this selection work.
    Whatever the case, I'm at the end of my rope with this one. I'm not prone to giving up, but I'm at a dead end currently.
    Any attempt to assist me with this will be greatly appreciated, successful or not.
    Thanks in advance!
    -Will

    Hi
    I finally figured this out (or at least a way of doing it that seems to work!).
    In VS, create a new, empty data source with no database objects in it. You'll get a message asking if that's really what you want to do.
    Add a TableAdapter to the data source using the 2-table SQL statement with a join but no WHERE clause, in my case:
    "SELECT tblMedia.*, tblTitles.* FROM tblMedia LEFT JOIN tblTitles ON tblMedia.ODGMediaID = tblTitles.ODGMediaID "
    Make sure Fill a DataTable and Return a DataTable are checked.
    Rename the TableAdapter and DataTable to something sensible.
    Design the report using the new Dataset.
    To display the report in VB with user generated WHERE clause:
    strWhere = "WHERE ID=..." etc
    strSQL = "SELECT tblMedia.*, tblTitles.* FROM tblMedia LEFT JOIN tblTitles " & _
                            "ON tblMedia.ODGMediaID = tblTitles.ODGMediaID " & strWhere
    Dim conn As OleDbConnection = New OleDbConnection(strConn)
    Dim da As OleDbDataAdapter = New OleDbDataAdapter(strSQL, conn)
    Dim dsReport As New DataSet
    da.Fill(dsReport, "DataTableName")
    Dim rpt As New ReportDocument
    rpt.Load(PathToReportFile)
    rpt.SetDataSource(dsReport)
    crReportViewer.ReportSource = rpt
    Jonathan Attree

  • Process records in a transaction table in real time

    We are currently designing a new system which basically needs to process records from a transaction table. We envisage approximately 100,000 records per hour. There is no need to process each transaction independently, as the external process will process all records residing in the table which have not been processed yet.
    We are basically looking at various options:
    1) have the external process run continuously, select all records in the table, process them and delete them and then start the process again
    2) have the external process run continuously, select all records in the table, process them, update a status flag and then start the process again processing only those records with their status not yet updated
    3) fire a trigger for each record launching the external process (if it is not running yet)
    4) have a separate table containing a timestamp which is updated via trigger for every transaction that is inserted in the transaction table. Have the external process run continuously and only process those records which have exceeded the previous timestamp.
    Would appreciate any ideas you may have how to tune this process and your views regarding the options mentioned above(or others you might have)
    Thanks a lot.

    user9511474 wrote:
    We are currently designing a new system which basically needs to process records from a transaction table. We envisage approximately 100,000 records per hour. There is no need to process each transaction independently, as the external process will process all records residing in the table which have not been processed yet.
    My busiest table collects up to 50 million rows per hour (peak periods in the day) that also need to be processed immediately (as a batch) after the hour. I use partitioning. It is very flexible and there are very few (if any) performance knocks.
    The entire data set has to be processed. With a partition, that means a full scan of the table partition - and the ability to do it using parallel query. No additional predicates are needed, except one that lets the CBO apply partition pruning. In other words, the predicate enables the CBO to narrow the SQL down to the previous hour's partition only.
    No additional predicates are needed, such as a STATUS flag to differentiate between processed and unprocessed rows, as the entire data set in the partition is unprocessed at that time (such a flag approach would not be very scalable in any case).
    Also, I do not use external processes. It is too expensive, performance-wise, to ship data all the way from the Oracle buffer cache to some external process. And parallel query is also not an option with an external process, as the OCI does not provide the external process with a threading interface to hook into each of the data output streams provided by the parallel query clients.
    I stay inside PL/SQL to perform the data processing. PL/SQL is even more capable than Pro*C/C++, Java and .Net in this regard.
    The execution interface that drives the scheduling of processing is DBMS_JOB. Straightforward and simple to use.
    The basic principle of processing large data volumes in Oracle is to use I/O effectively. Why use indexes when an entire data set needs to be processed? Why perform updates (e.g. updating a status flag) when the data model and physical implementation can eliminate them?
    I/O is the most expensive operation. And when dealing with a large volume, you need to make sure that every single I/O is actually required to achieve the end result. There is no room to waste I/O, as the performance penalties are hefty.
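
    A minimal sketch of the partition-per-hour idea described above (all names are illustrative, not from the original post; it assumes 11g interval partitioning - on earlier releases you would pre-create hourly range partitions instead):
    CREATE TABLE txn_log (
      txn_id   NUMBER,
      txn_time DATE,
      payload  VARCHAR2(4000)
    )
    PARTITION BY RANGE (txn_time)
    INTERVAL (NUMTODSINTERVAL(1, 'HOUR'))
    (PARTITION p0 VALUES LESS THAN (DATE '2024-01-01'));
    -- The hourly job only touches the previous hour's rows; this predicate lets the
    -- CBO prune to that single partition, with no status flag and no index needed.
    SELECT /*+ PARALLEL(t) */ *
      FROM txn_log t
     WHERE t.txn_time >= TRUNC(SYSDATE, 'HH24') - 1/24
       AND t.txn_time <  TRUNC(SYSDATE, 'HH24');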
