Load data through control file

Hi,
This is my first time loading data.
My table definition:
<pre>
SQL> desc t;
 Name                    Null?    Type
 ----------------------- -------- ------------
 ID                               NUMBER
 NAME                             VARCHAR2(35)
And my csv file data:
id     name
1     5.0315
2     3.2645
3     1.4975
4     -0.2695
5     -2.0365
And my control file:
load data
infile "c:\lddata.csv"
into table t truncate
fields terminated by ','
(id,
name
When I run it in SQL*Plus I get this error:
SQL> @c:\lddata.ctl
SP2-0042: unknown command "load data" - rest of line ignored.
SP2-0734: unknown command beginning "infile "c:..." - rest of line ignored.
SP2-0734: unknown command beginning "into table..." - rest of line ignored.
SP2-0734: unknown command beginning "fields ter..." - rest of line ignored.
SP2-0044: For a list of known commands enter HELP
and to leave enter EXIT.
4
</pre>
-- also, how do I use this pre tag, please?
How do I overcome this error?
Thanks,

SQL*Loader is an operating-system utility, not a SQL*Plus command, so a control file cannot be run with @. Run it from the OS command prompt as

sqlldr username/password control=<control_file>...

Read more here.

As for formatting posts: wrap code in [pre] and [/pre] tags.
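
For reference, a minimal sketch of the fixed setup (scott/tiger is a placeholder login): the posted control file is also missing its closing parenthesis, and since the csv file starts with a header row ("id name"), an OPTIONS (SKIP=1) clause is probably wanted as well.

options (skip=1)
load data
infile "c:\lddata.csv"
into table t truncate
fields terminated by ','
(id,
name)

Then, from the OS command prompt rather than SQL*Plus:

sqlldr scott/tiger control=c:\lddata.ctl log=c:\lddata.log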

Similar Messages

  • Error while Loading data through .csv file

    Hi,
    I am getting the date error below when loading data into OLAP tables through a .csv file.
    The value stored in the .csv is 20071113121100.
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TE_7007 Transformation Evaluation Error [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
    ... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')]
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11132 Transformation [Exp_FILE_CHNL_TYPE] had an error evaluating output column [CREATED_ON_DT_OUT]. Error message is [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
    ... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')].
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11019 There is an error in the port [CREATED_ON_DT_OUT]: The default value for the port is set to: ERROR(<<Expression Error>> [ERROR]: transformation error
    ... nl:ERROR(u:'transformation error')).
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11021 An error occurred moving data from the transformation Exp_FILE_CHNL_TYPE: to the transformation W_CHNL_TYPE_DS.
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> CMN_1086 Exp_FILE_CHNL_TYPE: Number of errors exceeded threshold [1].
    Any help is greatly appreciated.
    Thanks,
    Poojak

    1) Wrong format; you won't get much support loading OLAP cubes in here, I think.
    2) Has your CSV file been anywhere near Excel, by chance? The conversion of 20071113121100 to 2.00711E+13 looks very much like what I see when Excel has an invalid number mask / precision specified for a cell.
    *** We are using EBS. The file was already generated by consultants through a SQL query. We are getting the error while loading through Informatica. The target column is set up with a date datatype and the source is String(19).
    The expression in Informatica is set up as below:
    IIF(ISNULL(CREATED_ON_DT), NULL, TO_DATE(CREATED_ON_DT, 'YYYYMMDDHH24MISS'))
    Thanks,
    Poojak
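
    Not from the thread, but a minimal SQL sketch of the guard this implies (W_CHNL_TYPE_FS and the column names are assumed): validate that the string really is a 14-digit timestamp before converting, so Excel-mangled values like '2.00711E+13' are flagged instead of failing the conversion:

    SELECT created_on_dt,
           CASE
             WHEN REGEXP_LIKE(created_on_dt, '^[0-9]{14}$')
               THEN TO_DATE(created_on_dt, 'YYYYMMDDHH24MISS')
             ELSE NULL   -- Excel artifacts such as '2.00711E+13' land here
           END AS created_on_dt_out
    FROM   w_chnl_type_fs;   -- hypothetical staging table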

  • Error while loading data through flat files

    Hi,
    I am loading data through flat files.
    I loaded the data in development and then transported everything to the Quality system for integration testing; the data load in the Quality system was successful.
    But we do regression testing in one more Quality system, where the objects get transported from the Quality system. Here, when I try to load the data into any of the cubes in this system, I get the following error:
    Error 'The argument 'US' cannot be interpreted as a number' on assignment field /BIC/ZIFBCOCTR record 1 value US
    Generally we get this error when the data of one field is going into another field.
    But one thing I am unable to understand: it was working fine in the Quality system, and all the objects were transported from Quality to the regression-testing system.
    Could anyone help me out?
    Thanks
    Maya

    Hi,
    Can you please cross-check the file structure against the transfer structure?
    Cheers,
    Malli

  • Loading data through rule files in Essbase

    Hi everyone,
    I really need help, because something I don't understand is happening when I load data through a rule file I created in Essbase. In the "Field Properties" of that rule file, under "Global Properties", I added a few accounts that I want replaced by others when I perform the load, since I had trouble with the hierarchy of my accounts. My problem is that there are 3 accounts I cannot map this way when loading my data, and I don't understand why. For example, my data extract file contains the account 601820SN60005, which doesn't exist in my Dimension Library, so I changed my rule file and added the property "Replace 601820SN60005 with 601820"; the system should then put the data in that account instead. But that's not what's happening: when I perform the load, the log tells me that the member 601820SN60005 does not exist in the database, which is true, but it seems the property I added is not being applied, even though the load should then work properly. Is it possible that my rule file is corrupted? Or is it something else?
    Please give me a clue about what's happening here, because I really don't understand!
    Thanks a lot!

    I will just pipe in with the mostly worthless comment that doing any kind of ETL within a load rule is really not the world's best idea. Could you do your transformations in the source? There is this fantastic language called "SQL" that allows all kinds of cool data manipulations. Oracle even sells this product called ODI that I hear is just dandy for ETL work. :)
    A slightly more useful suggestion -- it's really tough to view all of the various transformations in a load rule, as you have found to your sorrow. Did you know you can print the load rule and get all of the transformations? I use it all the time when a client hasn't listened to my whine about not doing ETL in a load rule.
    Regards,
    Cameron Lackpour
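
    Not from the thread, but to make the "do it in SQL" suggestion concrete, a minimal sketch (GL_EXTRACT and its columns are assumed names) of doing the account remapping in the source query instead of the load rule:

    SELECT CASE account
             WHEN '601820SN60005' THEN '601820'
             ELSE account
           END AS account,
           amount
    FROM   gl_extract;   -- hypothetical source of the data extract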

  • How to load data using Control File in BW 7

    Hi All,
    I have a requirement to load data into BW using a control file. The development is done in BW 7.0. In BW 3.5, in the InfoPackage, there is an option FILE IS (Control File or Data File). Please suggest how to simulate the same in BW 7.0.
    Regards,
    Vikram

    Any suggestions?

How do I skip footer records in data file through control file of SQL*Loader

    Hi,
    I am using SQL*Loader to load data from a data file, and I have written the control file for it. How do I skip the last 5 records of the data file (the footer records)?
    For skipping the first 5 records we can use "SKIP", but how do I achieve the same for the last 5 records?
    2) Can I mention two data files in one control file? If so, what is the syntax? (Where we give the path of the data file with INFILE, can I mention two data files in the same control file?)
    3) If I have a data file with variable-length records (i.e. the 1st record with 200 characters, the 2nd with 150 characters and the 3rd with 180 characters), how do I load that data into the table? I mean, what is the syntax for it in the control file?
    4) If I want to insert SYSDATE into the table through the control file, how do I do it?
    5) If I have variable-length records in the data file with a first name, then white space, then a last name, how do I insert this value, first name plus last name, into a single column of the table? (I mean, how do you handle the white space between first name and last name in the data file?)
    Thanks in advance
    ram

    You should read the documentation about SQL*Loader.
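
    To make a couple of those answers concrete, a minimal control-file sketch (file and table names are made up) covering questions 2 and 4: multiple INFILE clauses are allowed in one control file, and SYSDATE can be assigned to a column directly. (As far as I know there is no built-in "skip footer" option; trailing records are usually stripped in a preprocessing step or filtered out with a WHEN clause.)

    LOAD DATA
    INFILE 'c:\data1.dat'
    INFILE 'c:\data2.dat'
    APPEND INTO TABLE emp
    FIELDS TERMINATED BY ','
    ( first_name  CHAR,
      last_name   CHAR,
      load_date   SYSDATE   -- question 4: column filled with the load timestamp
    )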

  • Loading of master data through flat files

    Hi
    Can anybody tell me how to load master data through flat files? As far as I know, we load characteristic and attribute values first, then text, then the hierarchy.
    Is that right, or is there a procedure to load all values at once?
    sai

    Hi,
    Condition 1: the sequence of columns in the transfer structure must correspond to the sequence of columns in your flat file.
    Check this help.sap.com link:
    http://help.sap.com/saphelp_nw04s/helpdata/en/c8/e92637c2cbf357e10000009b38f936/frameset.htm
    Hope this helps you!
    cheers,
    Swapna.G

  • Loading data from .csv file into existing table

    Hi,
    I have taken a look at several threads which talk about loading data from a .csv file into an existing or new table, and I have also checked out Vikas's application regarding the same. Let me explain my requirement with an example.
    I have a .csv file and I want the data to be loaded into an existing table. The timesheet table columns are:
    timesheet_entry_id, time_worked, timesheet_date, project_key
    The csv columns are:
    project, utilization, project_key, timesheet_category, employee, timesheet_date, hours_worked etc.
    What I need to know is: before the csv data is loaded into the timesheet table, is there any way of validating the project_key (which is the primary key of the projects table) against the projects table? I need to perform similar validations on other columns, such as customer_id against the customers table. Basically, the load should happen only after validating that the data exists in the parent table. Has anyone done this kind of loading through the APEX data-load utility? Or is there another method of accomplishing the same?
    Does Vikas's application do what the utility does? (I am assuming that, the code being from 2005, the utility was not incorporated in APEX at that time.) Any helpful advice is greatly appreciated.
    Thanks,
    Anjali

    Hi Anjali,
    Take a look at these threads, which outline different ways to do it:
    File Browse, File Upload
    Loading CSV file using external table
    Loading a CSV file into a table
    You can also create hidden items on the page to validate previous records before inserting data.
    Hope this helps,
    M Tajuddin
    http://tajuddin.whitepagesbd.com
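
    Not from the thread, but a minimal sketch of the parent-key validation in plain SQL (TIMESHEET_STAGE is an assumed staging table the csv is loaded into first): only rows whose project_key exists in PROJECTS are moved across, and the rejects can be reported with the inverse NOT EXISTS query.

    INSERT INTO timesheet (timesheet_entry_id, time_worked, timesheet_date, project_key)
    SELECT s.timesheet_entry_id,
           s.hours_worked,          -- csv column mapped to time_worked
           s.timesheet_date,
           s.project_key
    FROM   timesheet_stage s
    WHERE  EXISTS (SELECT 1 FROM projects p WHERE p.project_key = s.project_key);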

Load data from XML file into Oracle database

    Hi all,
    I'd like to know if it is possible to load data from an XML file into tables of an Oracle database through SQL*Loader, loading into normal columns only, not an XMLType column. For example, I have this XML file:
    <person name="kate" surname="fari" city="new york" >
    <son name="faus" age="18"/>
    <son name="doly" age="10"/>
    </person>
    and I want to load it into the tables:
    table PERSON:
    name  surname  city
    kate  fari     new york
    table SON:
    name  age
    doly  10
    faus  18
    Thank you for your reply!
    Ninova

    Hi
    This may be found at:
    SQL Loader to upload XML file
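
    Not from the thread, but a minimal sketch of an alternative inside the database itself: XMLTABLE can shred the sample document into relational rows without SQL*Loader (the document is inlined here; in practice it would come from a staged XMLTYPE column or bind variable). The same pattern on '/person' fills the PERSON table.

    SELECT x.name, x.age
    FROM   XMLTABLE('/person/son'
             PASSING XMLTYPE('<person name="kate" surname="fari" city="new york">
                                <son name="faus" age="18"/>
                                <son name="doly" age="10"/>
                              </person>')
             COLUMNS name VARCHAR2(30) PATH '@name',
                     age  NUMBER       PATH '@age') x;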

Loading hierarchy through flat file but facing problem

    I am loading a hierarchy through a flat file but facing a problem: when creating the InfoPackage the error still comes up. Please suggest, in detail, what steps should be followed.

    Hi,
    Follow these steps for a hierarchy through a flat file:
    • First create a flat file and save it in .CSV format, then close the file.
    • Create an InfoArea; in the InfoObject catalog create the InfoObject. On the InfoObject screen give the data type and length, then on the Hierarchy tab simply select the "with hierarchies" check box.
    • On the Attributes tab add name, age, address, phone, then activate. Back on the main screen, right-click Characteristics, choose InfoObject, create Sales Region, and activate it.
    • Return to the main screen, right-click Characteristics again, choose InfoObject, create Sales Office, and activate it.
    • Return to the main screen, double-click the ID characteristic, select the Hierarchy tab, then "External chars in hierarchies". Use Find, give the Sales Region InfoObject name, and continue; it appears on the other screen. Find Sales Office the same way, then continue and activate.
    • Back on the main screen, go to the InfoSource option, create an application component, and continue.
    • Create an InfoSource for master data, select the direct update option, give the name of the InfoObject (e.g. YID_AS2), and continue.
    • YID_AS2 can now be seen on the main screen; right-click it and choose "Assign DataSource". On the next screen double-click the empty space, choose PC file, and continue through the pop-up screens.
    • On the next screen activate the attribute DataSource, then select Hierarchy and activate it. On the left side, under data structure/hierarchy, click the hierarchy maintenance button.
    • Give the hierarchy a name (your choice), continue, confirm, save, and go back to the main screen.
    • Maintain the flat file with the columns: node ID, InfoObject, node name, link name, parent node.
    • Create the InfoPackage: right-click the flat-file source, select the attribute DataSource, and continue.
    • On the next screen select External Data, .CSV, and click Preview to check the data; then Schedule > Start, and use the monitor to check the master data. Go back to the main screen.
    • Right-click the flat-file source again, this time select Hierarchy, and continue. On the Hierarchy Selection tab you can see the file name; select the hierarchy button.
    • Select External Data and the file name in CSV; on the Processing tab choose only the PSA option with "Update subsequently in data targets", then Schedule > Start and check the monitor. Go back to the main screen.
    • Go to the InfoObject, select the ID you created, double-click it, choose hierarchy or text, display the contents, and execute.
    • Create a transaction table and cube in the normal way. In the report, give the hierarchy name in the properties of the rows where you drag the characteristic ID; after executing the report you can see the records you entered manually.
    regards
    ashwin

Error when executing interface which loads data from csv file which has 320 columns

    Hi,
    Can someone provide a resolution for the error below?
    I have created an interface which loads data from a csv file that has 320 columns into a synonym which has 320 columns,
    using LKM File to SQL and IKM SQL Control Append.
    I am getting the error below when executing the interface:
    com.sunopsis.tools.core.exception.SnpsSimpleMessageException: ODI-17517: Error during task interpretation. Task: 6 java.lang.Exception: BeanShell script error: Sourced file: inline evaluation of: ``out.print("The application script threw an exception: java.lang.StringIndexOutOf . . . '' Token Parsing Error: Lexical error at line 2, column 42. Encountered: "\\" (92), after : "": <at unknown location> BSF info: Create external table at line: 0 column: columnNo
    at com.sunopsis.dwg.codeinterpretor.SnpCodeInterpretor.transform(SnpCodeInterpretor.java:485)
         at com.sunopsis.dwg.dbobj.SnpSessStep.createTaskLogs(SnpSessStep.java:711)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:461)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
         at java.lang.Thread.run(Thread.java:662)
    Caused by: org.apache.bsf.BSFException: BeanShell script error: Sourced file: inline evaluation of: ``out.print("The application script threw an exception: java.lang.StringIndexOutOf . . . '' Token Parsing Error: Lexical error at line 2, column 42. Encountered: "\\" (92), after : "": <at unknown location>
    BSF info: Create external table at line: 0 column: columnNo
         at bsh.util.BeanShellBSFEngine.eval(Unknown Source)
         at bsh.util.BeanShellBSFEngine.exec(Unknown Source)
         at com.sunopsis.dwg.codeinterpretor.SnpCodeInterpretor.transform(SnpCodeInterpretor.java:471)
         ... 11 more
    Text: The application script threw an exception: java.lang.StringIndexOutOfBoundsException: String index out of range: 2 BSF info: Create external table at line: 0 column: columnNo
    out.print("createTblCmd = r\"\"\"\ncreate table ") ;
    out.print(odiRef.getTable("L", "COLL_NAME", "W")) ;
    out.print("<?=(extTabColFormat.getUseView())?\"_ET\":\"\"?>\n(\n\t") ;
    out.print(odiRef.getColList("", "[CX_COL_NAME]\\t"+
              "<?=extTabColFormat.getExtTabDataType(\\u0022[CX_COL_NAME]\\u0022,\\u0022[SOURCE_DT]\\u0022, \\u0022[DEST_WRI_DT]\\u0022, \\u0022[COL_FORMAT]\\u0022, \\u0022[BYTES]\\u0022, \\u0022[LONGC]\\u0022, \\u0022[SCALE]\\u0022)?>"
         , ",\\n\\t", "","")) ;
    out.print("\n)\nORGANIZATION EXTERNAL\n(\n\tTYPE ORACLE_LOADER\n\tDEFAULT DIRECTORY dat_dir\n\tACCESS PARAMETERS\n\t(\n\t\tRECORDS DELIMITED BY 0x'") ;
    out.print(odiRef.getSrcTablesList("[XFILE_SEP_ROW]","")) ;
    out.print("'\n\t\t") ;
    out.print(odiRef.getUserExit("EXT_CHARACTERSET")) ;
    out.print("\n\t\t") ;
    out.print(odiRef.getUserExit("EXT_STRING_SIZE")) ;
    out.print("\n\t\tBADFILE\t\t'") ;
    out.print(odiRef.getSrcTablesList("", "[RES_NAME]", "", "")) ;
    out.print("_%a.bad'\n\t\tLOGFILE\t\t'") ;
    out.print(odiRef.getSrcTablesList("", "[RES_NAME]", "", "")) ;
    out.print("_%a.log'\n\t\tDISCARDFILE\t'") ;
    out.print(odiRef.getSrcTablesList("", "[RES_NAME]", "", "")) ;
    out.print("_%a.dsc'\n\t\tSKIP \t\t") ;
    out.print(odiRef.getSrcTablesList("", "[FILE_FIRST_ROW]", "", "")) ;
    out.print("\n") ;
    if (odiRef.getSrcTablesList("", "[FILE_FORMAT]", "", "").equals("F")) {out.print("\n\t\tFIELDS\n\t\t") ;
    out.print(odiRef.getUserExit("EXT_MISSING_FIELD")) ;
    out.print("\n\t\t(\n\t\t\t") ;
    out.print(odiRef.getColList("", "[CX_COL_NAME]\\tPOSITION([FILE_POS]:[FILE_END_POS])\\t"+
                        "<?=extTabColFormat.getExtTabFormat(\\u0022[CX_COL_NAME]\\u0022,\\u0022[SOURCE_DT]\\u0022, \\u0022DEST_WRI_DT\\u0022, \\u0022[COL_FORMAT]\\u0022, \\u0022[BYTES]\\u0022, \\u0022[LONGC]\\u0022, \\u0022[SCALE]\\u0022)?>"
                        , ",\\n\\t\\t\\t", "","")) ;
    out.print("\t\t\n\t\t)\n\t)\n") ;
    } else {out.print("\n\t\tFIELDS TERMINATED BY x'") ;
    out.print(odiRef.getSrcTablesList("", "[XFILE_SEP_FIELD]", "", "")) ;
    out.print("'\n\t\t") ;
    if(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").equals("")){out.print("\n\t\t") ;
    } else {out.print("OPTIONALLY ENCLOSED BY '") ;
    out.print(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(0,1)) ;
    out.print("' AND '") ;
    out.print(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(1,2)) ;
    out.print("' ") ;
    }out.print("\n\t\t") ;
    out.print(odiRef.getUserExit("EXT_MISSING_FIELD")) ;
    out.print("\n\t\t(\n\t\t\t") ;
    out.print(odiRef.getColList("", "[CX_COL_NAME]\\t"+
                        "<?=extTabColFormat.getExtTabFormat(\\u0022[CX_COL_NAME]\\u0022,\\u0022[SOURCE_DT]\\u0022, \\u0022DEST_WRI_DT\\u0022, \\u0022[COL_FORMAT]\\u0022, \\u0022[BYTES]\\u0022, \\u0022[LONGC]\\u0022, \\u0022[SCALE]\\u0022)?>"
                        , ",\\n\\t\\t\\t", "","")) ;
    out.print("\t\t\n\t\t)\n\t)\n") ;
    }out.print("\tLOCATION (") ;
    out.print(odiRef.getSrcTablesList("", "'[RES_NAME]'", "", "")) ;
    out.print(")\n)\n") ;
    out.print(odiRef.getUserExit("EXT_PARALLEL")) ;
    out.print("\nREJECT LIMIT ") ;
    out.print(odiRef.getUserExit("EXT_REJECT_LIMIT")) ;
    out.print("\n\"\"\"\n \n# Create the statement\nmyStmt = myCon.createStatement()\n \n# Execute the trigger creation\nmyStmt.execute(createTblCmd)\n \nmyStmt.close()\nmyStmt = None\n \n# Commit, just in case\nmyCon.commit()") ;
    ****** ORIGINAL TEXT ******
    createTblCmd = r"""
    create table <%=odiRef.getTable("L", "COLL_NAME", "W")%><?=(extTabColFormat.getUseView())?"_ET":""?>
         <%=odiRef.getColList("", "[CX_COL_NAME]\t"+
              "<?=extTabColFormat.getExtTabDataType(\u0022[CX_COL_NAME]\u0022,\u0022[SOURCE_DT]\u0022, \u0022[DEST_WRI_DT]\u0022, \u0022[COL_FORMAT]\u0022, \u0022[BYTES]\u0022, \u0022[LONGC]\u0022, \u0022[SCALE]\u0022)?>"
         , ",\n\t", "","")%>
    ORGANIZATION EXTERNAL
         TYPE ORACLE_LOADER
         DEFAULT DIRECTORY dat_dir
         ACCESS PARAMETERS
              RECORDS DELIMITED BY 0x'<%=odiRef.getSrcTablesList("[XFILE_SEP_ROW]","")%>'
              <%=odiRef.getUserExit("EXT_CHARACTERSET")%>
              <%=odiRef.getUserExit("EXT_STRING_SIZE")%>
              BADFILE          '<%=odiRef.getSrcTablesList("", "[RES_NAME]", "", "")%>_%a.bad'
              LOGFILE          '<%=odiRef.getSrcTablesList("", "[RES_NAME]", "", "")%>_%a.log'
              DISCARDFILE     '<%=odiRef.getSrcTablesList("", "[RES_NAME]", "", "")%>_%a.dsc'
              SKIP           <%=odiRef.getSrcTablesList("", "[FILE_FIRST_ROW]", "", "")%>
    <% if (odiRef.getSrcTablesList("", "[FILE_FORMAT]", "", "").equals("F")) {%>
              FIELDS
              <%=odiRef.getUserExit("EXT_MISSING_FIELD")%>
                   <%=odiRef.getColList("", "[CX_COL_NAME]\tPOSITION([FILE_POS]:[FILE_END_POS])\t"+
                        "<?=extTabColFormat.getExtTabFormat(\u0022[CX_COL_NAME]\u0022,\u0022[SOURCE_DT]\u0022, \u0022DEST_WRI_DT\u0022, \u0022[COL_FORMAT]\u0022, \u0022[BYTES]\u0022, \u0022[LONGC]\u0022, \u0022[SCALE]\u0022)?>"
                        , ",\n\t\t\t", "","")%>          
    <%} else {%>
              FIELDS TERMINATED BY x'<%=odiRef.getSrcTablesList("", "[XFILE_SEP_FIELD]", "", "")%>'
              <% if(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").equals("")){%>
              <%} else {%>OPTIONALLY ENCLOSED BY '<%=odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(0,1)%>' AND '<%=odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(1,2)%>' <%}%>
              <%=odiRef.getUserExit("EXT_MISSING_FIELD")%>
                   <%=odiRef.getColList("", "[CX_COL_NAME]\t"+
                        "<?=extTabColFormat.getExtTabFormat(\u0022[CX_COL_NAME]\u0022,\u0022[SOURCE_DT]\u0022, \u0022DEST_WRI_DT\u0022, \u0022[COL_FORMAT]\u0022, \u0022[BYTES]\u0022, \u0022[LONGC]\u0022, \u0022[SCALE]\u0022)?>"
                        , ",\n\t\t\t", "","")%>          
    <%}%>     LOCATION (<%=odiRef.getSrcTablesList("", "'[RES_NAME]'", "", "")%>)
    <%=odiRef.getUserExit("EXT_PARALLEL")%>
    REJECT LIMIT <%=odiRef.getUserExit("EXT_REJECT_LIMIT")%>
    # Create the statement
    myStmt = myCon.createStatement()
    # Execute the trigger creation
    myStmt.execute(createTblCmd)
    myStmt.close()
    myStmt = None
    # Commit, just in case
    myCon.commit().
         at com.sunopsis.dwg.dbobj.SnpSessStep.createTaskLogs(SnpSessStep.java:738)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:461)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
         at java.lang.Thread.run(Thread.java:662)

    The issue is encountered because the text delimiter defined for the source file did not consist of a pair of delimiters: the generated KM code above calls substring(0,1) and substring(1,2) on the [FILE_ENC_FIELD] value, so a single-character delimiter raises the StringIndexOutOfBoundsException. Define the text delimiter as a pair of characters (for example "") in the file datastore.
    Please see support Note [ID 1469977.1] for details.

  • Load data from XML files into BI

    Hi All,
    Can anyone guide me through the steps involved in loading data from XML files into BI?
    Thanks
    Santosh

    Hi James,
    I have followed up to step No. 12.
    Could you please let me know how to link the XML file on my local drive to BI?
    I am not able to figure out where to specify the path of the XML file to be loaded into BI.
    Thanks in advance.
    Regards,
    San
    1. Create an InfoSource (ZIS) with a transfer structure in accordance with the structure in the XML file. (You can open the XML file with MS Excel; this way you can get to know the structure.)
    2. Activate the transfer rules.
    3. After activation, from the menu bar select Extras > Create BW DataSource with SOAP connection.
    4. Now activate the InfoSource again; this creates an XML-based DataSource (6AXXX...).
    5. These steps create two sub-nodes under the InfoSource (ZIS).
    6. Select the Myself system, create a data package, and execute it in Init mode (without data transfer).
    7. This creates an entry in RSA7 (BW delta queue).
    8. Create another delta InfoPackage under it and run it. Now the DataSource (6AXXXXXX..) turns green in RSA7.
    9. In Function Builder (SE37) select your FM (do a search, F4, on the DataSource 6AXXX...).
    10. Inside this RFC-based FM, from the menu bar select Utilities > More Utilities > Create Web Services > From Function Module.
    11. A wizard will guide you through the next steps.
    12. Once this is done, a web service is enabled in WSADMIN. Select your FM and execute it.
    13. From here you can upload the data from the XML file into the BW delta queue.

  • Loading data from multiple files to multiple tables

    How should I approach creating an SSIS package to load data from multiple files into multiple tables? Also, the files will have data which might overlap, so I might have to create a stored procedure for it. E.g. the 1st day's file has data from Aug 1 to Aug 10, and the 2nd day's file might have data from Aug 5 to Aug 15. So I might have to look for the max and min dates and truncate the table within that date range.

    That's OK. A ForEach Loop will be able to iterate through the files. Declare a variable inside the loop to capture the filenames, and choose "fully qualified" as the option in the loop.
    Then inside the loop:
    1. Add an Execute SQL Task to delete the overlapping data from the table. One question here: where will you get the date from? Does it come inside the filename?
    2. Add a Data Flow Task with a file source pointing to the file. For this, add a suitable connection manager (Excel/flat file etc.) and map its connection string property to the filename variable using expressions.
    3. Add an OLE DB destination pointing to the table. Use the "table or view name from variable - fast load" option and map it to a variable to make the table name dynamic; just set the corresponding value of the variable to get the correct table name.
    Please mark this as answer if it helps to solve the issue. Visakh
    http://visakhm.blogspot.com/
    https://www.facebook.com/VmBlogs
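
    Not from the thread, but a minimal sketch of the statement behind step 1 (dbo.SalesStage and TxnDate are made-up names; the ? placeholders would be mapped to the min/max date variables derived from the incoming file):

    DELETE FROM dbo.SalesStage
    WHERE  TxnDate BETWEEN ? AND ?;   -- remove the overlapping date range before the load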

  • Loading data through pl/sql

    Hi friends,
    I want to load a .mdb table into Oracle 9i through a PL/SQL procedure.
    At the moment I am doing this through SQL*Loader; my script is:
    LOAD DATA INFILE '//empwrpub1/Data_Wrangler/Owens/Original Data/037 Refresh 10-06/SAP/SRM_INVOICE.txt'
    INTO TABLE SRM_INVOICE_200610
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES
    Now I want to do this through a PL/SQL procedure.
    Thanks in advance

    Hi,
    Two different questions have arisen in this thread:
    1. Loading data through SQL*Loader from PL/SQL.
    2. Using DDL and DML statements as part of the data-loading process.
    I will start with the second question: this is a great question, and I have faced this problem myself: first create temporary tables, load the data, update spaces with NULL, apply date formatting, and so on.
    This is easily achieved when you load the data using external tables. You write a script defining your data file, its location, the columns, the line termination, etc. In that same script you can run your DDL statements first, then the data-loading statements, and then any DML; the whole thing can look like a PL/SQL block.
    So the first question is also answered, with the condition that you use external tables and then load the data from PL/SQL. I am not sure about calling SQL*Loader commands from PL/SQL, as I have never tried it, so it looks impossible. If you really must drive the same activity from a PL/SQL block, write the SQL*Loader script and CTL file, wrap them in a shell program, and call that from the PL/SQL block.
    Bye
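
    Not from the thread, but a minimal sketch of the external-table approach described above (the directory object, file layout, and column names are assumed): once the external table exists, the file can be loaded from an ordinary PL/SQL procedure with a plain INSERT ... SELECT.

    CREATE TABLE srm_invoice_ext (
      invoice_id   NUMBER,
      invoice_date VARCHAR2(20),
      amount       NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir          -- assumed Oracle directory object
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        SKIP 1                            -- ignore the header line
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
      )
      LOCATION ('SRM_INVOICE.txt')
    )
    REJECT LIMIT UNLIMITED;

    -- Then, inside a PL/SQL procedure:
    -- INSERT INTO srm_invoice_200610 SELECT * FROM srm_invoice_ext;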

  • Can't load data through smart view (ad hoc analysis)

    Hi,
    There is an EPM application where I want to give planners the ability to load data through Smart View (ad hoc analysis). In Shared Services there are four options on
    EssbaseCluster-1: Administrator, Create/Delete Application, Server Access, and Provisioning Manager. Only Administrator can submit data in Smart View (ad hoc analysis), but I don't want to grant Essbase Administrator to planners; I just want to give them the ability to load data through ad hoc analysis. Please suggest!

    I take it that you refreshed the Planning security; if not, refresh the security of those users (see "Managing Security Filters").
    Check in EAS whether those filters are created with "Write" permissions.
    Regards
    Celvin
    http://www.orahyplabs.com
