Can I use one interface to load data into 2 different tables

Hi Folks,
Can I use one interface to load data into 2 different tables (same schema or different schemas) from one source table with the same structure?
Please give me some advice.
Thanks
Raj
Edited by: user11410176 on Oct 21, 2009 9:55 AM

Hi Lucky,
Thanks for your reply,
What I am trying to do is load the data from three legacy tables into Oracle staging tables. But I need to load the same source data into two staging tables (these staging tables are in two different schemas).
Can I load this source data into two staging tables using a single standard interface (there is some business logic involved)?
If I can, please give me some suggestions on how to do that.
Thanks in advance
Raj

Similar Messages

  • How to use one form to submit data to 2 tables on mysql

    Can someone please help me on this,
    I am developing a JSP website and I want to use one form to submit data to 4 tables in a MySQL database, and the tables are related by one foreign key.
    Can someone bail me out of this ....I've hit a hard brick wall!!!!...

    kwesij wrote:
    Can someone please help me on this,
    I am developing a JSP website and I want to use one form to submit data to 4 tables in a MySQL database, and the tables are related by one foreign key.
    Can someone bail me out of this ....I've hit a hard brick wall!!!!...
    What's the problem? What does a brick wall look like?
    Connect to the database and execute four SQL INSERT/UPDATE statements as a single unit of work. The fact that you have one form shouldn't be an issue.
    I'll bet you're having trouble because you haven't layered the problem either in code or in your mind.
    I'd recommend that you write a POJO to take in some objects and execute the SQL. Once you have that running successfully you can worry about the form. Decouple the two.
    Computer science is all about decomposing large problems into smaller ones.
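    For illustration, here is a minimal sketch of that "single unit of work" in plain SQL, using made-up table and column names (customer, address, phone, prefs) and MySQL's LAST_INSERT_ID() to reuse the generated key as the foreign key. In JDBC you get the same effect by calling setAutoCommit(false) on the Connection and commit() after the last statement.
    START TRANSACTION;
    INSERT INTO customer (name, email) VALUES ('Ann', 'ann@example.com');
    SET @cust_id = LAST_INSERT_ID();
    -- reuse the generated key in the related tables
    INSERT INTO address (customer_id, street)     VALUES (@cust_id, 'Main St 1');
    INSERT INTO phone   (customer_id, number)     VALUES (@cust_id, '555-0100');
    INSERT INTO prefs   (customer_id, newsletter) VALUES (@cust_id, 1);
    COMMIT;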

  • Can I use Bridge to export image data into a .txt file?

    I have a folder of images and I would like to export the File Name, Resolution, Dimensions and Color Mode for each file into one text file. Can I use Bridge to export image data into a .txt file?

    Hello
    You may try the following AppleScript script. It will ask you to choose a root folder in which to start searching for *.map files and then create a CSV file named "out.csv" on the desktop, which you can then import into Excel.
    set f to (choose folder with prompt "Choose the root folder to start searching")'s POSIX path
    if f ends with "/" then set f to f's text 1 thru -2
    do shell script "/usr/bin/perl -CSDA -w <<'EOF' - " & f's quoted form & " > ~/Desktop/out.csv
    use strict;
    use open IN => ':crlf';
    chdir $ARGV[0] or die qq($!);
    local $/ = qq(\\0);
    my @ff = map {chomp; $_} qx(find . -type f -iname '*.map' -print0);
    local $/ = qq(\\n);
    #     CSV spec
    #     - record separator is CRLF
    #     - field separator is comma
    #     - every field is quoted
    #     - text encoding is UTF-8
    local $\\ = qq(\\015\\012);    # CRLF
    local $, = qq(,);            # COMMA
    # print column header row
    my @dd = ('column 1', 'column 2', 'column 3', 'column 4', 'column 5', 'column 6');
    print map { s/\"/\"\"/og; qq(\").$_.qq(\"); } @dd;
    # print data row per each file
    while (@ff) {
        my $f = shift @ff;    # file path
        if ( ! open(IN, '<', $f) ) {
            warn qq(Failed to open $f: $!);
            next;
        }
        $f =~ s%^.*/%%og;    # file name
        @dd = ('', $f, '', '', '', '');
        while (<IN>) {
            chomp;
            $dd[0] = \"$2/$1/$3\" if m%Link Time\\s+=\\s+([0-9]{2})/([0-9]{2})/([0-9]{4})%o;
            ($dd[2] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of CODE\\s/o;
            ($dd[3] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of DATA\\s/o;
            ($dd[4] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of XDATA\\s/o;
            ($dd[5] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of FARCODE\\s/o;
            last unless grep { /^$/ } @dd;
        }
        close IN;
        print map { s/\"/\"\"/og; qq(\").$_.qq(\"); } @dd;
    }
    EOF"
    Hope this may help,
    H

  • Can I use one Apple TV device on two different HD televisions to view the same info from a mini Mac?

    Can I use one Apple TV device on two different HD televisions to view the same info from a mini Mac?

    Welcome to the Apple Community Lschaef5318.
    You need one Apple TV for each TV.

  • Loading data into a table

    I am loading data into a table I created which includes a column "Description" with a data type of VARCHAR2(1000). When I go to load the data, which is less than 1000 characters, I receive the following error message:
    Record 38: Rejected - Error on table SSW_INPUTS, column DESCRIPTION.
    Field in data file exceeds maximum length
    I have increased the size of the column but that does not seem to fix the error. Does anyone know what this error means? Another thought is that I have created the "Description" column too large... which can't be true, because I would have received an error when I created the table. Plus, I have already loaded data into a similar table with similar data and had no problems!
    Someone please help...
    Thank you,
    April.

    Note that I'm assuming Oracle8(i) behavior. Oracle9 may treat Unicode differently.
    Are you inserting Unicode data into the table? Declaring a variable as varchar2(1000) indicates that Oracle should reserve 1000 bytes for data. If you're inserting UTF-8 encoded data, each character may take up to 3 bytes to store. Thus, 334 characters of data could theoretically overflow a varchar2(1000) variable.
    Note that UTF-8 is designed so that the most commonly used characters are stored in 1 byte, less commonly used characters are stored in 2 bytes, and the remainder is stored in 3 bytes. On average, this will require less space than the more familiar UCS-2 encoding which stores every character as 2 bytes of data.
    Justin
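    If multi-byte expansion is indeed the cause, here is a small illustration of the two usual ways around it, using the SSW_INPUTS table name from the error message (the sizes are only examples, and character-length semantics require Oracle9i or later):
    -- Reserve 1000 characters rather than 1000 bytes (Oracle9i+):
    ALTER TABLE ssw_inputs MODIFY (description VARCHAR2(1000 CHAR));
    -- Or simply reserve enough bytes for worst-case UTF-8 expansion:
    ALTER TABLE ssw_inputs MODIFY (description VARCHAR2(3000));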

  • How to load data into user tables using DI APIs?

    Hi,
    I have created a user table using the UserTablesMD object.
    But I don't know how to load data into this user table. I guess I have to use the UserTable object for that, but I still don't know how to put data into a particular column.
    Can somebody please help me with this?
    I would appreciate it if somebody could share their code in this regard.
    Thank you,
    Sudha

    You can try this code:
    Dim lRetCode As Long
    Dim userTable As SAPbobsCOM.UserTable
    userTable = pCompany.UserTables.Item("My_Table")
    'First row in the @My_Table table
    userTable.Code = "A1"
    userTable.Name = "A.1"
    userTable.UserFields.Fields.Item("U_1stF").Value = "First row value"
    lRetCode = userTable.Add()   'Add returns 0 on success
    'Second row in the @My_Table table
    userTable.Code = "A2"
    userTable.Name = "A.2"
    userTable.UserFields.Fields.Item("U_1stF").Value = "Second row value"
    lRetCode = userTable.Add()   'Add returns 0 on success
    This way I have added two rows to my table.
    Hope it helps
    Trinidad.

  • Using Import Manager to load Data into Multi Value Fields in a Qualified Table

    Hi there,
    When using the Import Manager, I cannot use the "append" option to load data into my multi-value field, which is contained within my qualified table.
    Manually it works fine in Data Manager, so the field has been set up correctly. The only problem is appending the data during the Import Manager load.
    Any reason why I do not have this option available during field mapping in Import Manager? The selection options are shown but grayed out.
    Would appreciate any suggestions.
    Chris Huggett

    Thanks Sowseel,
    It's a good document but it doesn't address my problem; maybe my problem isn't clear.
    The structure (part of it) that I currently have is as follows:
    Main Table - Material
        QFTable - MNF PN
            LUField - MNF Name (Qualifier, Single Value)
            LUField - BU ID (Non-Qualifier, Multi Value)
            TField  - P/N (Non-Qualifier)
    I know how to load data into the main and qualified tables, but what I cannot do, using Import Manager, is update the "LUField - BU ID (Non-Qualifier, Multi Value)" using the append functionality.
    Thanks
    Chris Huggett

  • Error while using Rule file in loading data into Essbase through ODI

    Hi Experts,
    I am facing a problem while loading data into Essbase. I am able to load data into Essbase successfully, but when I use a rule file to add values to existing values, I get an error.
    test is my rule file.
    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Cannot put olap file object. Essbase Error(1053025): Object [test] already exists and is not locked by user [admin@Native Directory]
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2458)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
         at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:540)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:83)
         at java.lang.Thread.run(Thread.java:662)
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    from java.lang import Class
    from java.lang import Boolean
    from java.sql import *
    from java.util import HashMap
    # Get the select statement on the staging area:
    sql= """select C1_HSP_RATES "HSP_Rates",C2_ACCOUNT "Account",C3_PERIOD "Period",C4_YEAR "Year",C5_SCENARIO "Scenario",C6_VERSION "Version",C7_CURRENCY "Currency",C8_ENTITY "Entity",C9_VERTICAL "Vertical",C10_HORIZONTAL "Horizontal",C11_SALES_HIERARICHY "Sales Hierarchy",C12_DATA "Data" from PLANAPP."C$_0HexaApp_PLData" where      (1=1) """
    srcCx = odiRef.getJDBCConnection("SRC")
    stmt = srcCx.createStatement()
    srcFetchSize=30
    #stmt.setFetchSize(srcFetchSize)
    stmt.setFetchSize(1)
    print "executing query"
    rs = stmt.executeQuery(sql)
    print "done executing query"
    #load the data
    print "loading data"
    stats = pWriter.loadData(rs)
    print "done loading data"
    #close the database result set, connection
    rs.close()
    stmt.close()
    Please help me on this...
    Thanks & Regards,
    Chinnu

    Hi Priya,
    Thanks for the reply. I have already checked that there are no locks on the rule file. I don't know what the problem is. It works fine without the rule file, but throws the error only when the rule file is used.
    Please help on this.
    Thanks,
    Chinnu

  • Loading data into existing table

    Hi, I have tried to load data into a large table from a CSV file but am not having any success. I have this control file:
    LOAD DATA
    INFILE 'Book1.xls'
    BADFILE 'p_sum_bad.txt'
    DISCARDFILE 'p_sum_dis.txt'
    APPEND
    INTO TABLE p_sum
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    SUMMARY_LEVEL ,
    PERIOD_START_TIME ,
    BUSY_HOUR ,
    OMC ,
    INT_ID ,
    BTS_ID ,
    BTS_INT_ID ,
    CELL_GROUP ,
    HO_PERIOD_DURATION ,
    POWER_PERIOD_DURATION ,
    MSC_I_SUCC_HO ,
    MSC_I_TCH_TCH ,
    MSC_I_SDCCH_TCH ,
    MSC_I_SDCCH ,
    MSC_I_TCH_TCH_AT ,
    MSC_I_SDCCH_TCH_AT ,
    MSC_I_SDCCH_AT ,
    MSC_I_FAIL_LACK ,
    MSC_I_FAIL_CONN ,
    MSC_I_FAIL_BSS ,
    MSC_I_END_OF_HO ,
    MSC_O_SUCC_HO ,
    The data is:
    2     3-Nov-06               1000033     9     8092220          1440     1440     5411     5374     7     30     5941
    2     3-Nov-06               1000033     10     1392190          1440     1440     0     0     0     0     0
    2     3-Nov-06               2000413     3     2127446          1440     1440     80     80     0     0     83
    2     3-Nov-06               2000413     4     2021248          1140     1440     0     0     0     0     0
    2     3-Nov-06               2000413     5     2021252          1080     1440     1     1     0     0     1
    2     3-Nov-06               2000413     6     2130163          1440     1440     2200     2193     2     5     2224
    2     3-Nov-06               2000413     7     6205155          1020     1440     0     0     0     0     0
    2     3-Nov-06               2000413     8     6200768          900     1440     30     30     0     0     31
    2     3-Nov-06               2000413     10     2111877          1440     1440     0     0     0     0     0
    2     3-Nov-06               1000033     18     1076419          1440     1440     75     73     0     2     79
    2     3-Nov-06               1000033     19     8089060          1440     1440     0     0     0     0     0
    but when I try to load the data, I get:
    Column Name Position Len Term Encl Datatype
    SUMMARY_LEVEL FIRST * , O(") CHARACTER
    PERIOD_START_TIME NEXT * , O(") CHARACTER
    Record 51: Rejected - Error on table OMC.P_SUM_BTS_HO_POWER, column SUMMARY_LEVEL.
    ORA-01722: invalid number
    I believe the data being loaded has to be NUMBER. Can anyone advise what I need to change to load the data? Thanks.

    Justin,
    Tried that, no luck:
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array: 64 rows, maximum of 256000 bytes
    Continuation: none specified
    Path used: Conventional
    Table P_SUM, loaded from every logical record.
    Insert option in effect for this table: APPEND
    TRAILING NULLCOLS option in effect
    Column Name Position Len Term Encl Datatype
    SUMMARY_LEVEL FIRST * WHT O(") CHARACTER
    PERIOD_START_TIME NEXT * WHT O(") CHARACTER
    BUSY_HOUR NEXT * WHT O(") CHARACTER
    OMC NEXT * WHT O(") CHARACTER
    INT_ID NEXT * WHT O(") CHARACTER
    BTS_ID NEXT * WHT O(") CHARACTER
    BTS_INT_ID NEXT * WHT O(") CHARACTER
    CELL_GROUP NEXT * WHT O(") CHARACTER
    Record 51: Rejected - Error on table OMC.P_SUM_BTS_HO_POWER, column SUMMARY_LEVEL.
    ORA-01722: invalid number
    Any other suggestions?

  • Error while loading data into External table from flat files

    Hi,
    We have a data load in our project which feeds Oracle external tables with data from flat files (.bcp files) on Unix.
    While loading the data, we are encountering the following error.
    Error occured (Error Code : -29913 and Error Message : ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04063: un) while loading data into table_ext
    Please let us know what needs to be done in this case to solve this problem.
    Thanks,
    Kartheek

    Kartheek,
    I used Google (mine still works)... please check these links:
    http://oraclequirks.blogspot.com/2008/07/ora-29400-data-cartridge-error.html
    http://jonathanlewis.wordpress.com/2011/02/15/ora-29913/
    HTH,
    Thierry

  • Can I use one InfoSource to update data to a Cube and an ODS?

    Hi all,
    Can anyone tell me if I can load data to an ODS and a cube from one and the same InfoSource? As far as I know, I have to have "0RECORDMODE" (update mode) in the communication structure for the ODS but not for the cube. So how can the ODS and the cube use the same InfoSource to update data?
    Thank you

    John,
    Depending on the volume of data and the type of InfoSource, sometimes you update the ODS first and then subsequently update the cube, and sometimes you update both the cube and the ODS in parallel.
    For example, if you use the AR or AP line-item DataSources, since you are extracting from both the open line items and closed line items tables, there could be multiple records for the same thing coming through: first when it appears as an open line item, and again, once it is paid, as a paid line item. So if you update directly to the cube, you can end up with multiples of the real value. Thus in this instance you update from the InfoSource to the ODS and then subsequently use a delta update from the ODS to the cube. CCA and CO-PA in delta mode are examples where there is no overlap, so you can update both the ODS and the cube at the same time.
    Hope this helps,
    Mary

  • Loading data into multiple tables using sqlloader

    Hi,
    I am using SQL*Loader to load the data from a flat file into the database.
    My file structure is as below:
    ====================
    101,john,mobile@@fax@@home@@office@@email,1234@@3425@@1232@@2345@@[email protected],1234.40
    102,smith,mobile@@fax@@home,1234@@345@@234,123.40
    103,adams,fax@@mobile@@office@@others,1234@@1233@@1234@@3456,2345.40
    In the file, the columns are empno, ename, comm_mode (multiple values terminated by '@@'), comm_no_txt (multiple values terminated by '@@'), and sal.
    The comm_mode and comm_no_txt values need to be inserted into a separate table (emp_comm) as below:
    emp
    empno ename sal
    101 john 1234.40
    102 smith 123.40
    103 adams 2345.40
    emp_comm
    empno comm_mode comm_no_text
    101 mobile 1234
    101 fax 3425
    101 home 1232
    101 office 2345
    101 email [email protected]
    102 mobile 1234
    102 fax 345
    102 home 234
    103 fax 1234
    The data needs to be inserted like this using SQL*Loader.
    My table structures:
    ===============
    emp
    empno number(5)
    ename varchar2(15)
    sal number(10,2)
    emp_comm
    empno number(5) references the empno of the emp table
    comm_mode varchar2(10)
    Comm_no_text varchar2(35)
    Now I want to insert the file data into the specified structures.
    Please help me out to achieve this using SQL*Loader.
    (We are not using external tables for this.)
    Thanks & Regards.
    Bala Sake
    Edited by: 954925 on Aug 25, 2012 12:24 AM

    Please post OS and database details.
    You will need to split up the data file in order to load it into separate tables. The process is documented here:
    http://docs.oracle.com/cd/E11882_01/server.112/e22490/ldr_control_file.htm#autoId72
    HTH
    Srini
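    If splitting the data file is awkward, one hedged alternative (not the approach from the link above, and still avoiding external tables) is to load the file as-is into a single staging table with SQL*Loader and then split the '@@'-delimited columns in SQL. The staging table emp_stg is assumed to hold one row per file record (empno, ename, comm_mode, comm_no_txt, sal), and REGEXP_COUNT needs 11g or later:
    INSERT INTO emp (empno, ename, sal)
    SELECT empno, ename, sal FROM emp_stg;
    -- turn each '@@'-delimited list into one emp_comm row per value
    INSERT INTO emp_comm (empno, comm_mode, comm_no_text)
    SELECT s.empno,
           REGEXP_SUBSTR(s.comm_mode,   '[^@]+', 1, n.pos),
           REGEXP_SUBSTR(s.comm_no_txt, '[^@]+', 1, n.pos)
    FROM   emp_stg s
    JOIN   (SELECT LEVEL AS pos FROM dual CONNECT BY LEVEL <= 20) n
      ON   n.pos <= REGEXP_COUNT(s.comm_mode, '[^@]+');
    COMMIT;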

  • Interface for loading data into customer products

    I am trying to find out if there is an interface for loading customer products into Install Base, apart from entering them manually. I understand that this exists in Oracle Apps version 11.5.7, but I need it for version 11.5.4.

    Hi,
    In 11.5.4, you have to write the loader yourself, using the standard Oracle APIs.
    I've done it, and it works fine.
    Hugues

  • Loading data into multiple tables - Bulk collect or regular Fetch

    I have a procedure to load data from one source table into eight different destination tables. The 8 tables have some of the columns of the source table with a common key.
    I have run into a couple of problems and have a few questions where I would like to seek advice:
    1.) The procedure with and without the BULK COLLECT clause took the same time for 100,000 records. I thought I would see an improvement in performance when I included BULK COLLECT with LIMIT.
    2.) Updating the Load_Flag in source_table happens only for a few records and not all of them. I had expected all records to be updated.
    3.) Are there other suggestions to improve the performance? or could you provide links to other posts or articles on the web that will help me improve the code?
    Notes:
    1.) 8 Destination tables have at least 2 Million records each, have multiple indexes and are accessed by application in Production
    2.) There is an initial load of 1 Million rows with a subsequent daily load of 10,000 rows. Daily load will have updates for existing rows (not shown in code structure below)
    The structure of the procedure is as follows
    Declare
        dest_type is table of source_table%ROWTYPE;
        dest_tab dest_type ;
        iCount NUMBER;
        cursor source_cur is select * from source_table FOR UPDATE OF load_flag;
    BEGIN
        OPEN source_cur;
        LOOP
            FETCH source_cur -- BULK COLLECT
            INTO dest_tab -- LIMIT 1000
            EXIT WHEN source_cur%NOTFOUND;
            FOR i in dest_tab.FIRST .. dest_tab.LAST LOOP
                <Insert into app_tab1 values key, col12, col23, col34 ;>
                <Insert into app_tab2 values key, col15, col29, col31 ;>
                <Insert into app_tab3 values key, col52, col93, col56 ;>
                UPDATE source_table SET load_flag = 'Y' WHERE CURRENT OF source_cur ;
                iCount := iCount + 1 ;
                IF iCount = 1000 THEN
                    COMMIT ;
                    iCount := 0 ;
                END IF;
            END LOOP;
        END LOOP ;
        COMMIT ;
    END ;
    Edited by: user11368240 on Jul 14, 2009 11:08 AM

    Assuming you are on 10g or later, the PL/SQL compiler generates the bulk fetch for you automatically, so your code is the same as (untested):
    DECLARE
        iCount NUMBER;
        CURSOR source_cur is select * from source_table FOR UPDATE OF load_flag;
    BEGIN
        OPEN source_cur;
        FOR r IN source_cur
        LOOP
            <Insert into app_tab1 values key, col12, col23, col34 ;>
            <Insert into app_tab2 values key, col15, col29, col31 ;>
            <Insert into app_tab3 values key, col52, col93, col56 ;>
            UPDATE source_table SET load_flag = 'Y' WHERE CURRENT OF source_cur ;
            iCount := iCount + 1 ;
            IF iCount = 1000 THEN
                COMMIT ;
                iCount := 0 ;
            END IF;
        END LOOP;
        COMMIT ;
    END;
    However most of the benefit of bulk fetching would come from using the array with a FORALL expression, which the PL/SQL compiler can't automate for you.
    If you are fetching 1000 rows at a time, purely from a code simplification point of view you could lose iCount and the IF...COMMIT...END IF and just commit each time after looping through the 1000-row array.
    However I'm not sure how committing every 1000 rows helps restartability, even if your real code has a WHERE clause in the cursor so that it only selects rows with load_flag = 'N' or whatever. If you are worried that it will roll back all your hard work on failure, why not just commit in your exception handler?

  • Loading data into multiple tables from an excel

    Can we load data into multiple tables at a time from an Excel file through Utilities? If yes, how? Please help me.
    Regards,
    Pallavi

    I would imagine that the utilities allow you to insert data from a spreadsheet into one and only one table.
    You may have to write your own custom data upload using external tables and a PL/SQL procedure to insert data from one spreadsheet into more than one table (see the sketch after this reply).
    If you need any guidance on doing this let me know and I will happily point you in the right direction.
    Regards
    Duncan
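    A rough sketch of what Duncan describes, assuming the spreadsheet has been saved as a CSV file, a directory object (here called DATA_DIR) already exists, and all table and column names are made up:
    CREATE TABLE customer_upload_ext (
        cust_id    NUMBER,
        cust_name  VARCHAR2(100),
        order_no   NUMBER,
        order_amt  NUMBER
    )
    ORGANIZATION EXTERNAL (
        TYPE ORACLE_LOADER
        DEFAULT DIRECTORY data_dir
        ACCESS PARAMETERS (
            RECORDS DELIMITED BY NEWLINE
            FIELDS TERMINATED BY ','
            MISSING FIELD VALUES ARE NULL
        )
        LOCATION ('customers.csv')
    );
    -- one pass distributes the spreadsheet columns across two target tables
    BEGIN
        INSERT INTO customers (cust_id, cust_name)
            SELECT DISTINCT cust_id, cust_name FROM customer_upload_ext;
        INSERT INTO orders (order_no, cust_id, order_amt)
            SELECT order_no, cust_id, order_amt FROM customer_upload_ext;
        COMMIT;
    END;
    /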
