How to load 2 fields out of 3 from flat file to Oracle in one interface?

Hi,
Is there any way to selectively load a flat file into Oracle within one interface? For example, if my flat file has this structure:
test_cid
test_ind
test_data
and my table is:
test_code (corresponds to test_cid above)
test_data (corresponds to test_data above)
That can be done with two interfaces (the first to load the file into an Oracle staging table and the second to load the target Oracle table from the staging table). Is there a way to do it in one interface?
Edited by: vzaslav on Oct 21, 2009 1:26 PM

That works for small tables. For big tables you need to accommodate extra space, as you will need at least twice as much space for the working/integration tables, and that is what we are trying to avoid.
In fact I was able to modify the LKM to load the file into Oracle; however, my integration step fails and I cannot make it work. To make the LKM work, one needs to change getColList to getSrcColList in the "create working table" and "generate ctl file" steps. That will create the working table based on the source file structure rather than on the target table structure.
That doesn't work for the IKM, however. Any ideas would be appreciated.
Thank You.
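
For reference, the external-table route is what makes a single-interface load possible: the external table exposes every field in the file, and the INSERT simply selects only the columns that are mapped. The following is a minimal sketch of that idea, not ODI-generated code; all object names are hypothetical and dat_dir is an assumed directory object.

-- Hypothetical sketch: expose all three file fields, load only two of them.
CREATE TABLE ext_test_file (
  test_cid  VARCHAR2(20),
  test_ind  VARCHAR2(10),
  test_data VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY dat_dir            -- assumed directory object
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('test_file.txt')
)
REJECT LIMIT UNLIMITED;

-- Only the two mapped fields reach the target; test_ind is simply not selected.
INSERT INTO target_table (test_code, test_data)
SELECT test_cid, test_data
FROM   ext_test_file;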

Similar Messages

  • Error while loading data from flat file to Oracle DB

    I am new to ODI. I am trying to load data from a flat file to Oracle DB using the LKM File to Oracle (External Table) and using the IKM Oracle Incremental Update. I am getting the following exception during the "Create External Table" stage:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
      File "<string>", line 44, in <module>
      at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:462)
      at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:405)
      at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:931)
      at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:481)
      at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:205)
      at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:548)
      at oracle.jdbc.driver.T4CStatement.doOall8(T4CStatement.java:202)
      at oracle.jdbc.driver.T4CStatement.executeForRows(T4CStatement.java:1110)
      at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1488)
      at oracle.jdbc.driver.OracleStatement.executeInternal(OracleStatement.java:2251)
      at oracle.jdbc.driver.OracleStatement.execute(OracleStatement.java:2192)
      at oracle.jdbc.driver.OracleStatementWrapper.execute(OracleStatementWrapper.java:347)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
      at java.lang.reflect.Method.invoke(Method.java:597)
    java.sql.SQLException: java.sql.SQLException: ORA-30088: datetime/interval precision is out of range
      at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
      at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)
      at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
      at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2473)
      at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
      at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
      at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
      at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
      at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
      at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:561)
      at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)
      at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
      at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
      at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
      at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
      at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
      at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
      at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
      at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:83)
      at java.lang.Thread.run(Thread.java:662)
    Caused by: Traceback (most recent call last):
      File "<string>", line 44, in <module>
      at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:462)
      at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:405)
      at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:931)
      at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:481)
      at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:205)
      at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:548)
      at oracle.jdbc.driver.T4CStatement.doOall8(T4CStatement.java:202)
      at oracle.jdbc.driver.T4CStatement.executeForRows(T4CStatement.java:1110)
      at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1488)
      at oracle.jdbc.driver.OracleStatement.executeInternal(OracleStatement.java:2251)
      at oracle.jdbc.driver.OracleStatement.execute(OracleStatement.java:2192)
      at oracle.jdbc.driver.OracleStatementWrapper.execute(OracleStatementWrapper.java:347)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
      at java.lang.reflect.Method.invoke(Method.java:597)
    java.sql.SQLException: java.sql.SQLException: ORA-30088: datetime/interval precision is out of range
      at org.python.core.PyException.fillInStackTrace(PyException.java:70)
      at java.lang.Throwable.<init>(Throwable.java:181)
      at java.lang.Exception.<init>(Exception.java:29)
      at java.lang.RuntimeException.<init>(RuntimeException.java:32)
      at org.python.core.PyException.<init>(PyException.java:46)
      at org.python.core.PyException.<init>(PyException.java:43)
      at org.python.core.Py.JavaError(Py.java:455)
      at org.python.core.Py.JavaError(Py.java:448)
      at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:177)
      at org.python.core.PyObject.__call__(PyObject.java:355)
      at org.python.core.PyMethod.__call__(PyMethod.java:215)
      at org.python.core.PyMethod.instancemethod___call__(PyMethod.java:221)
      at org.python.core.PyMethod.__call__(PyMethod.java:206)
      at org.python.core.PyObject.__call__(PyObject.java:397)
      at org.python.core.PyObject.__call__(PyObject.java:401)
      at org.python.pycode._pyx1.f$0(<string>:50)
      at org.python.pycode._pyx1.call_function(<string>)
      at org.python.core.PyTableCode.call(PyTableCode.java:165)
      at org.python.core.PyCode.call(PyCode.java:18)
      at org.python.core.Py.runCode(Py.java:1204)
      at org.python.core.Py.exec(Py.java:1248)
      at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:172)
      at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
      at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)
      at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
      at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)
      at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
      at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
      at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
      at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
      at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
      at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:558)
      at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)
      at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
      at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
      at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
      at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
      at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
      at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
      at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
      at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
      ... 1 more
    Caused by: java.sql.SQLException: ORA-30088: datetime/interval precision is out of range
      at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:462)
      at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:405)
      at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:931)
      at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:481)
      at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:205)
      at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:548)
      at oracle.jdbc.driver.T4CStatement.doOall8(T4CStatement.java:202)
      at oracle.jdbc.driver.T4CStatement.executeForRows(T4CStatement.java:1110)
      at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1488)
      at oracle.jdbc.driver.OracleStatement.executeInternal(OracleStatement.java:2251)
      at oracle.jdbc.driver.OracleStatement.execute(OracleStatement.java:2192)
      at oracle.jdbc.driver.OracleStatementWrapper.execute(OracleStatementWrapper.java:347)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
      at java.lang.reflect.Method.invoke(Method.java:597)
      at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:175)
      ... 33 more
    Could anyone please provide any pointers?
    Thanks,
    Srini.

    The code that is executed is as follows:
    createTblCmd = r"""
    create table ODITEMP.C$_0PARTNER
    (
      C1_C1 NUMBER(14),
      C2_C2 VARCHAR2(12),
      C3_C3 VARCHAR2(18),
      C4_C4 NUMBER(7),
      C5_C5 TIMESTAMP(11)
    )
    ORGANIZATION EXTERNAL
    (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY dat_dir
      ACCESS PARAMETERS
      (
        RECORDS DELIMITED BY 0x'0D0A'
        CHARACTERSET 'WE8ISO8859P1'
        STRING SIZES ARE IN CHARACTERS
        BADFILE 'partners.txt_%a.bad'
        LOGFILE 'partners.txt_%a.log'
        DISCARDFILE 'partners.txt_%a.dsc'
        SKIP 0
        FIELDS
        MISSING FIELD VALUES ARE NULL
        (
          C1_C1 POSITION(1:14),
          C2_C2 POSITION(15:26),
          C3_C3 POSITION(27:44),
          C4_C4 POSITION(45:51),
          C5_C5 POSITION(52:62)
        )
      )
      LOCATION ('partners.txt')
    )
    PARALLEL
    REJECT LIMIT UNLIMITED
    """
    # Create the statement
    myStmt = myCon.createStatement()
    # Execute the external table creation
    myStmt.execute(createTblCmd)
    myStmt.close()
    myStmt = None
    # Commit, just in case
    myCon.commit()
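    A side note on the DDL above: Oracle only accepts a fractional-seconds precision of 0 to 9 for TIMESTAMP, so TIMESTAMP(11) is exactly what raises ORA-30088; the length defined for that column in the file datastore presumably needs to be reduced. A minimal illustration (the table name is made up):

    -- Fractional-seconds precision must be between 0 and 9; 9 is the maximum.
    CREATE TABLE ts_precision_demo (
      ok_col TIMESTAMP(9)
    );
    -- CREATE TABLE ts_bad (bad_col TIMESTAMP(11));  -- would raise ORA-30088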

  • ODI error while loading data from Flat File to oracle

    Hi Gurus,
    I am getting the following error while loading a flat file to an Oracle table:
    java.lang.NumberFormatException: 554020
         at java.math.BigInteger.parseInt(Unknown Source)
         at java.math.BigInteger.<init>(Unknown Source)
         at java.math.BigDecimal.<init>(Unknown Source)
         at com.sunopsis.sql.SnpsQuery.updateExecStatement(SnpsQuery.java)
         at com.sunopsis.sql.SnpsQuery.addBatch(SnpsQuery.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execCollOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlC.treatTaskTrt(SnpSessTaskSqlC.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
    The connection between source and target is fine.
    Kindly suggest on this error.
    Thanks
    Shridhar

    Hi John,
    The source is a CSV file. By default all columns are of string type. The integration is failing at step 3 (Load Data).
    I am loading the data from the CSV file directly into the staging table (Oracle). It's a one-to-one mapping.
    Shridhar

  • Loading Date fields from flat file to Oracle tables

    Hi,
    I have a flat file with a few date columns. I have given the format for the date field while creating the data store for the flat file. The format I used is 'YYYY-MM-DD'.
    But I get an error when I execute it after associating the module with the LKM & IKM.
    The error message is as follows:
    org.apache.bsf.BSFException: exception from Jython: Traceback (innermost last):
    File "<string>", line 3, in ?
    OS command has signalled errors
    Can anyone help me out?
    Thanks

    At the time of DataStore creation, define that date field as a string.
    After this, in the interface, when you map this date field to the table field, use CONVERT(<field name>,DATE) in the expression editor.
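    If the staging area is an Oracle schema rather than the memory engine, the Oracle-side equivalent of that expression would be TO_DATE with the same format mask the datastore uses; a quick sanity check (the literal is just an illustrative value):

    -- Converts a 'YYYY-MM-DD' string, as it appears in the file, into a DATE.
    SELECT TO_DATE('2009-10-21', 'YYYY-MM-DD') AS converted_date
    FROM   dual;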

  • Can we load data in variable screen from flat file?

    Hello all,
    One of my users once asked me how she could run a report for some employees; I said she could just type them in the employee variable input box.
    She said she needs to run the report for 120 employees, and she has the employee numbers in an Excel sheet.
    What is the best way of doing this, if any?
    Thanks

    This can be done from a flat file that has the list of all values.
    If you are using BEx 3.5, there is an icon at the bottom right of the variable screen to upload values, where you can specify the path of the file.
    Naveen.a

  • Sending data from flat file or oracle table view to a IBM MQ

    Hi,
    We need help in developing a solution.
    We have a scenario where data (multiple records) in a source (table/file) needs to be populated into a queue (JMS IBM MQ) as a single message. To achieve this we are trying a two-step process.
    1) We created a temp table to store multiple records from the file; this temp table has one column of CLOB data type, which will store the data from the file in a single row. We are facing an issue while loading data from the file to the staging area using LKM File to Oracle (SQLLDR).
    When we execute the interface, it errors out while creating the C$ table. Instead of taking CLOB (the target write data type) it takes VARCHAR with a size of 5000, whereas VARCHAR supports only 4000:
    create table SNPM.C$_0SINGLERECORD
    (
      C1_C1 VARCHAR2(5000) NULL
    )
    NOLOGGING
    Error message:
    910 : 42000 : java.sql.SQLException: ORA-00910: specified length too long for its datatype
    java.sql.SQLException: ORA-00910: specified length too long for its datatype
    Need help in sending data into destination table.
    2) After populating this data into the destination (temp table) as a single record, we need to send each row as a single message to JMS MQ. Currently we are using LKM File to SQL to load into the staging area and IKM SQL to JMS Append to put the message in JMS MQ. We are succeeding in putting messages of length < 4000 as these KMs convert the data using VARCHAR2, but we have data of larger size (>4K).
    Pointer/ solution will be of great help.
    Please let me know in case you need more description.

    The error message that shows while using CLOB as the datatype (which we created as a user data type in the data types section of the respective technology) is:
    java.lang.NumberFormatException: For input string: "4294967295"
         at java.lang.NumberFormatException.forInputString(Unknown Source)
         at java.lang.Integer.parseInt(Unknown Source)
    While using VARCHAR2 in creating the C$ table, the following is the error:
    910 : 42000 : java.sql.SQLException: ORA-00910: specified length too long for its datatype
    java.sql.SQLException: ORA-00910: specified length too long for its datatype
         at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:125)
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:316)
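    As the post notes, a VARCHAR2 column cannot exceed 4000 bytes here, which is why the generated C$ table with VARCHAR2(5000) fails with ORA-00910; a CLOB staging column is the usual way around the limit. A minimal sketch (the table and column names are hypothetical, not the ODI-generated ones):

    -- CLOB avoids the 4000-byte VARCHAR2 column limit that VARCHAR2(5000) violates.
    CREATE TABLE stage_single_record (
      c1_payload CLOB
    );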

  • Loading the 18 digit keyfigure from Flat File

    HI Gurus,
    We are loading data using the flat file interface, so our flat file format is being generated using some TE and we can't change that format.
    It has some key figure values of 17 digits prefixed with + or - signs, but in BW I think the key figure accepts only 17 digits, so the last digit of the first key figure is going into the next key figure.
    I have 3 key figures in that file format,
    like below:
    00000000005803.6000000000003385.44+00000000000483.63
    I am not able to load because the first key figure is trying to get updated as 00000000005803.6, the second KF as 000000000003385., and the third KF as 44+00000000000483.
    In this way the + is coming in between, so it's not able to load.
    So, briefly, what I am trying to ask is:
    Is there any way we can load an 18-digit value into a key figure?
    If anybody has faced this problem, please help me in this regard if you found a solution.
    Regards,
    BRB

    I would think you should be able to load the KF as it is, since it is essentially 17 digits (sign not counted).
    One option would be to define three dummy characteristic fields in your DataSource structure (of size 18 each) and receive the complete values (sign plus 17 digits) of the KFs from the file in these. You can then move these values into the KFs using a transfer rule (ABAP) from the corresponding characteristics. A simple MOVE should do, as the sign is not included in the digit count.
    cheers,

  • Loading data from flat file to infocube

    Hi, can anyone give me the steps involved in loading data from a flat file to an InfoCube in an elegant way?

    Hi,
    Data loading from a flat file in BI 7.0 works like this:
    First create a Cube or DSO with the same structure that you have in the flat file and activate it. Then go to the DataSource tab and create a DataSource; here you need to select the type of data, for example Transactional data. Mention your flat file name and file type in the Extraction tab, enter your InfoObject names in the Fields tab, preview the data, and activate it.
    Now select your DataSource, create an InfoPackage, and schedule it; your data will be loaded to the PSA level. Then go to the InfoProvider, select your cube, right-click it, create transformations and activate them, then create a DTP, activate it, and execute it.
    1) Create a DataSource. Here you can set/check the Source System fields.
    2) Create a Transformation for that DataSource (no more update rules/transfer rules).
    2.1) While creating the transformation for the DataSource it will ask you for the data target name, so just assign where you want to update your data.
    DataSource -> Transformation -> (DTP) -> Data Target
    Now, if you want to load data into the data target from the Source System DataSource:
    1) Create an InfoPackage for that DataSource. If you are creating an InfoPackage for new DataSources, it will only allow you to update up to the PSA; all other options you can see as disabled.
    2) Now create a DTP (Data Transfer Process) for that DataSource.
    3) Now schedule the InfoPackage; once the data is loaded to the PSA, you can execute your DTP, which will load the data to the data target.
    The Data Transfer Process (DTP) is now used to load data using the dataflow created by the Transformation.
    Here's how the DTP data load works:
    1) Load the InfoPackage.
    2) Data gets loaded into the PSA (hence why PSA only is selected).
    3) The DTP gets executed.
    4) Data gets loaded from the PSA into the data target once the DTP has executed.
    If helpful, provide points.
    Regards
    Harikrishna N

  • Help Required regarding: Validation on Data Loading from Flat File

    Hi Experts,
    I need your help with the following issue.
    I need to validate the transactional data loading to the GL Cube from a flat file:
    1) The transactional data is to be loaded into the Cube only if a master data record exists for the "0GL_ACCOUNT" InfoObject.
    2) If the master data record does not exist, then the record needs to be skipped from the loading, and after the loading the system should throw a message saying how many records have been skipped (if there are any skipped records).
    I would really appreciate your help and suggestions on solving this issue.
    Regds
    Hari

    Hi, write a start routine in the transfer rules like this:
      DATA: l_s_datapak_line TYPE transfer_structure,
            l_s_errorlog     TYPE rssm_s_errorlog_int,
            l_s_glaccount    TYPE /bi0/pglaccount,
            new_datapak      TYPE tab_transtru.
      REFRESH new_datapak.
      LOOP AT datapak INTO l_s_datapak_line.
        SELECT SINGLE * FROM /bi0/pglaccount INTO l_s_glaccount
          WHERE chrt_accts EQ l_s_datapak_line-<field name in transfer structure/DataSource for CHRT_ACCTS>
            AND gl_account EQ l_s_datapak_line-<field name in transfer structure/DataSource for GL_ACCOUNT>
            AND objvers EQ 'A'.
        IF sy-subrc EQ 0.
          APPEND l_s_datapak_line TO new_datapak.
        ENDIF.
      ENDLOOP.
      datapak = new_datapak.
      IF datapak[] IS INITIAL.
        " abort <> 0 means skip the whole data package
        abort = 4.
      ELSE.
        abort = 0.
      ENDIF.
    I have already made some modifications, but you can slightly change it to suit your need.
    regards
    Emil

  • Loading data into infocube in bi 7.0 from flat file.

    Hello All,
    I need the complete procedure for loading data into an InfoCube from a flat file in BI 7.0, i.e. using transformations, DTPs, etc.
    Please help me with some good documents.

    Hi Pratighya,
    Step by step procedure for loading data from flat file.
    1. Create the infoobjects you might need in BI
    2. Create your target infoprovider
    3. Create a source system
    4. Create a datasource
    5. Create and configure an Infopackage that will bring your records to the PSA
    6. Create a transformation from the datasource to the Infoprovider
    7. Create a Data Transfer Process (DTP) from the datasource to the Infoprovider
    8. Schedule the infopackage
    9. Once successful, run the DTP.
    10. This will fill your target.
    Hope this helps
    Regards
    Karthik

  • Need a script to import the data from flat file

    Hi Friends,
    Does anyone have any scripts to import data from flat files into an Oracle database (Linux OS)? I have to automate the script to run every 30 minutes, check for any flat files in the Incoming directory, and process them without user interaction.
    Thanks.
    Srini

    Here is my init.ora file
    # $Header: init.ora 06-aug-98.10:24:40 atsukerm Exp $
    # Copyright (c) 1991, 1997, 1998 by Oracle Corporation
    # NAME
    # init.ora
    # FUNCTION
    # NOTES
    # MODIFIED
    # atsukerm 08/06/98 - fix for 8.1.
    # hpiao 06/05/97 - fix for 803
    # glavash 05/12/97 - add oracle_trace_enable comment
    # hpiao 04/22/97 - remove ifile=, events=, etc.
    # alingelb 09/19/94 - remove vms-specific stuff
    # dpawson 07/07/93 - add more comments regarded archive start
    # maporter 10/29/92 - Add vms_sga_use_gblpagfile=TRUE
    # jloaiza 03/07/92 - change ALPHA to BETA
    # danderso 02/26/92 - change db_block_cache_protect to dbblock_cache_p
    # ghallmar 02/03/92 - db_directory -> db_domain
    # maporter 01/12/92 - merge changes from branch 1.8.308.1
    # maporter 12/21/91 - bug 76493: Add control_files parameter
    # wbridge 12/03/91 - use of %c in archive format is discouraged
    # ghallmar 12/02/91 - add global_names=true, db_directory=us.acme.com
    # thayes 11/27/91 - Change default for cache_clone
    # jloaiza 08/13/91 - merge changes from branch 1.7.100.1
    # jloaiza 07/31/91 - add debug stuff
    # rlim 04/29/91 - removal of char_is_varchar2
    # Bridge 03/12/91 - log_allocation no longer exists
    # Wijaya 02/05/91 - remove obsolete parameters
    # Example INIT.ORA file
    # This file is provided by Oracle Corporation to help you customize
    # your RDBMS installation for your site. Important system parameters
    # are discussed, and example settings given.
    # Some parameter settings are generic to any size installation.
    # For parameters that require different values in different size
    # installations, three scenarios have been provided: SMALL, MEDIUM
    # and LARGE. Any parameter that needs to be tuned according to
    # installation size will have three settings, each one commented
    # according to installation size.
    # Use the following table to approximate the SGA size needed for the
    # three scenarious provided in this file:
    # -------Installation/Database Size------
    # SMALL MEDIUM LARGE
    # Block 2K 4500K 6800K 17000K
    # Size 4K 5500K 8800K 21000K
    # To set up a database that multiple instances will be using, place
    # all instance-specific parameters in one file, and then have all
    # of these files point to a master file using the IFILE command.
    # This way, when you change a public
    # parameter, it will automatically change on all instances. This is
    # necessary, since all instances must run with the same value for many
    # parameters. For example, if you choose to use private rollback segments,
    # these must be specified in different files, but since all gc_*
    # parameters must be the same on all instances, they should be in one file.
    # INSTRUCTIONS: Edit this file and the other INIT files it calls for
    # your site, either by using the values provided here or by providing
    # your own. Then place an IFILE= line into each instance-specific
    # INIT file that points at this file.
    # NOTE: Parameter values suggested in this file are based on conservative
    # estimates for computer memory availability. You should adjust values upward
    # for modern machines.
    # You may also consider using Database Configuration Assistant tool (DBCA)
    # to create INIT file and to size your initial set of tablespaces based
    # on the user input.
    # replace DEFAULT with your database name
    db_name=DEFAULT
    db_files = 80 # SMALL
    # db_files = 400 # MEDIUM
    # db_files = 1500 # LARGE
    db_file_multiblock_read_count = 8 # SMALL
    # db_file_multiblock_read_count = 16 # MEDIUM
    # db_file_multiblock_read_count = 32 # LARGE
    db_block_buffers = 100 # SMALL
    # db_block_buffers = 550 # MEDIUM
    # db_block_buffers = 3200 # LARGE
    shared_pool_size = 3500000 # SMALL
    # shared_pool_size = 5000000 # MEDIUM
    # shared_pool_size = 9000000 # LARGE
    log_checkpoint_interval = 10000
    processes = 50 # SMALL
    # processes = 100 # MEDIUM
    # processes = 200 # LARGE
    parallel_max_servers = 5 # SMALL
    # parallel_max_servers = 4 x (number of CPUs) # MEDIUM
    # parallel_max_servers = 4 x (number of CPUs) # LARGE
    log_buffer = 32768 # SMALL
    # log_buffer = 32768 # MEDIUM
    # log_buffer = 163840 # LARGE
    # audit_trail = true # if you want auditing
    # timed_statistics = true # if you want timed statistics
    max_dump_file_size = 10240 # limit trace file size to 5 Meg each
    # Uncommenting the line below will cause automatic archiving if archiving has
    # been enabled using ALTER DATABASE ARCHIVELOG.
    # log_archive_start = true
    # log_archive_dest = disk$rdbms:[oracle.archive]
    # log_archive_format = "T%TS%S.ARC"
    # If using private rollback segments, place lines of the following
    # form in each of your instance-specific init.ora files:
    # rollback_segments = (name1, name2)
    # If using public rollback segments, define how many
    # rollback segments each instance will pick up, using the formula
    # # of rollback segments = transactions / transactions_per_rollback_segment
    # In this example each instance will grab 40/5 = 8:
    # transactions = 40
    # transactions_per_rollback_segment = 5
    # Global Naming -- enforce that a dblink has same name as the db it connects to
    global_names = TRUE
    # Edit and uncomment the following line to provide the suffix that will be
    # appended to the db_name parameter (separated with a dot) and stored as the
    # global database name when a database is created. If your site uses
    # Internet Domain names for e-mail, then the part of your e-mail address after
    # the '@' is a good candidate for this parameter value.
    # db_domain = us.acme.com      # global database name is db_name.db_domain
    # FOR DEVELOPMENT ONLY, ALWAYS TRY TO USE SYSTEM BACKING STORE
    # vms_sga_use_gblpagfil = TRUE
    # FOR BETA RELEASE ONLY. Enable debugging modes. Note that these can
    # adversely affect performance. On some non-VMS ports the db_block_cache_*
    # debugging modes have a severe effect on performance.
    #_db_block_cache_protect = true # memory protect buffers
    #event = "10210 trace name context forever, level 2" # data block checking
    #event = "10211 trace name context forever, level 2" # index block checking
    #event = "10235 trace name context forever, level 1" # memory heap checking
    #event = "10049 trace name context forever, level 2" # memory protect cursors
    # define parallel server (multi-instance) parameters
    #ifile = ora_system:initps.ora
    # define two control files by default
    control_files = (ora_control1, ora_control2)
    # Uncomment the following line if you wish to enable the Oracle Trace product
    # to trace server activity. This enables scheduling of server collections
    # from the Oracle Enterprise Manager Console.
    # Also, if the oracle_trace_collection_name parameter is non-null,
    # every session will write to the named collection, as well as enabling you
    # to schedule future collections from the console.
    # oracle_trace_enable = TRUE
    # Uncomment the following line, if you want to use some of the new 8.1
    # features. Please remember that using them may require some downgrade
    # actions if you later decide to move back to 8.0.
    #compatible = 8.1.0
    Thanks.
    Srini

  • Moving data from flat file

    Hi,
    I am moving data from a flat file to an Oracle table. While populating the Oracle table, if I get any errors from the flat file, those errors should be populated into the ODI error table.
    Is this possible? If yes, could you please let me know the setup in ODI?

    The CKM is dedicated to checking constraints during the transformation. The constraints include PK, FK, conditions, etc.
    There are two types/ways of checking the constraints:
    Flow Control: the CKM is triggered after data is loaded into the I$ table.
    Static Control: the CKM is triggered after data is loaded into the target table.
    If you opt for either of the above, ODI will create the E$ table and SNP_CHECK_TAB (a summary table for logging the errors) and load the error records into them.
    ODI also provides an option of loading the corrected error records via the RECYCLE ERRORS feature. In this phase/run ODI will load ONLY the error records into the target table (assuming they have been corrected/cleaned).
    How do I set up flow control? Could you please provide the steps?
    Appreciate your help.

  • How to load Material from Flat File and convert to SAP Format

    Hi
    I am loading 0MATERIAL values from a flat file for mapping purposes. The format of the material in the flat file is "7704132". Within the system, I need to compare the value with the 0MATERIAL values from the incoming data and update the corresponding 0MATERIAL in the records. For this purpose, I created dummy materials taking 0MATERIAL as a template and am trying to load the data. I am getting the error: Version '7704132' is not valid (RSDMD no. 194). Can anyone please let me know how to overcome this issue? Should I include any routine at the data source or rules level? I am on BI 7.0.
    Thanks.

    Hi,
    Use the function module CONVERSION_EXIT_ALPHA_INPUT to convert the value into the internal format. Use this FM in the transformation (field mapping).
    Search the forum for CONVERSION_EXIT_ALPHA_INPUT for more information on this.
    Regards,
    Anil Kumar Sharma .P

  • How to load data to a cube from multiple infosources ?

    Hi friends,
    How do I load data to a cube from multiple InfoSources? Could you please answer this question?
    Thanks in advance.

    Hi,
    Say, for example, you need to load data to one cube from three InfoSources:
    1) You need to create three sets of update rules for the cube.
    2) Each time you create the update rules, mention the name of the InfoSource and create the update rules correspondingly.
    Regards
    satish
    Message was edited by:
            satish murthy

  • How to load date and time from text file to oracle table through sqlloader

    Hi friends,
    I need you to show me what I am missing to load date and time from a text file into an Oracle table through SQL*Loader.
    This is my data in this path (c:\external\my_data.txt):
    7369,SMITH,17-NOV-81,09:14:04,CLERK,20
    7499,ALLEN,01-MAY-81,17:06:08,SALESMAN,30
    7521,WARD,09-JUN-81,17:06:30,SALESMAN,30
    7566,JONES,02-APR-81,09:24:10,MANAGER,20
    7654,MARTIN,28-SEP-81,17:24:10,SALESMAN,30
    My table in the database is emp2:
    create table emp2 (empno number,
                      ename varchar2(20),
                      hiredate date,
                      etime date,
                      ejob varchar2(20),
                      deptno number);
    The control file code is in this path (c:\external\ctrl.ctl):
    load data
    infile 'C:\external\my_data.txt'
    into table emp2
    fields terminated by ','
    (empno, ename, hiredate, etime, ejob, deptno)
    This is the error:
    C:\>sqlldr scott/tiger control=C:\external\ctrl.ctl
    SQL*Loader: Release 10.2.0.1.0 - Production on Mon May 31 09:45:10 2010
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Commit point reached - logical record count 5
    C:\>
    Any help is greatly appreciated.
    thanks
    Edited by: user10947262 on May 31, 2010 9:47 AM

    load data
    infile 'C:\external\my_data.txt'
    into table emp2
    fields terminated by ','
    (empno, ename, hiredate, etime, ejob, deptno)
    Try:
    load data
    infile 'C:\external\my_data.txt'
    into table emp2
    fields terminated by ','
    (empno, ename, hiredate, etime "to_date(:etime,'hh24:mi:ss')", ejob, deptno)
    This is the error:
    C:\>sqlldr scott/tiger control=C:\external\ctrl.ctl
    SQL*Loader: Release 10.2.0.1.0 - Production on Mon May 31 09:45:10 2010
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Commit point reached - logical record count 5
    C:\>
    That's not an error; you can see errors in the log and bad files.
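    For completeness, here is a hedged alternative sketch that avoids the control file altogether: an external table makes the date and time conversion plain SQL. The directory object ext_dir and its mapping to c:\external are assumptions, and everything else mirrors the emp2 definition above.

    -- External table over my_data.txt; all columns read as text first.
    CREATE TABLE emp2_ext (
      empno    VARCHAR2(10),
      ename    VARCHAR2(20),
      hiredate VARCHAR2(20),
      etime    VARCHAR2(20),
      ejob     VARCHAR2(20),
      deptno   VARCHAR2(10)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ext_dir          -- assumed to point at c:\external
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('my_data.txt')
    )
    REJECT LIMIT UNLIMITED;

    -- Convert the date and time columns explicitly while loading emp2.
    INSERT INTO emp2 (empno, ename, hiredate, etime, ejob, deptno)
    SELECT TO_NUMBER(empno),
           ename,
           TO_DATE(hiredate, 'DD-MON-RR'),
           TO_DATE(etime, 'HH24:MI:SS'),
           ejob,
           TO_NUMBER(deptno)
    FROM   emp2_ext;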

Maybe you are looking for