Error loading transaction data from csv file

Hello all
I am getting the following error when loading from a .csv file:
Error 'The argument '-65.025,60' cannot be interpreted as a number' on assignment field BALANCE record 1 value -65.025,60
The BALANCE field is defined as : 0D_BALANCE (from the SAP Demo Cubes).
Any help on this is appreciated.
TIA

Thank you all for your help
I made the changes, but I was still getting the error until I changed one more item.
Under the FIELDS tab, I set the "FORMAT" to "External".
After doing this, all worked well.
Thanks again!
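
For what it's worth, '-65.025,60' is a valid number in the European "external" representation (point as thousands separator, comma as decimal separator); it simply cannot be read as an internal number. As a quick illustration outside of BW, in plain Oracle SQL (only to show how the string is interpreted, not part of the BW fix):

SELECT TO_NUMBER('-65.025,60',
                 'S999G999D99',
                 'NLS_NUMERIC_CHARACTERS='',.''') AS balance
FROM   dual;   -- returns -65025.6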

Similar Messages

  • How to load the data from .csv file to oracle table???

    Hi,
    I am using Oracle 10g and PL/SQL Developer. Can anyone help me with how to load the data from a .csv file into an Oracle table? The table is already created with the required columns. The .csv file has about 10 lakh (1,000,000) records. Is it possible to load 10 lakh records? Can anyone please tell me how to proceed?
    Thanks in advance

    981145 wrote:
    Can you tell me more about SQL*Loader? How do I know whether that utility is available to me or not? I am using an Oracle 10g database and PL/SQL Developer.
    SQL*Loader is part of the Oracle client. If you have a developer installation you should normally have it on your client.
    The command is sqlldr. Type it and see if you have it installed.
    Have a look also at the FAQ link posted by Marwin.
    There are plenty of examples also on the web.
    Regards.
    Al
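
    For reference, SQL*Loader reads a small control file and is driven from the command line. A minimal sketch (file, table and column names here are placeholders, not from this thread):

        -- load_csv.ctl
        LOAD DATA
        INFILE 'mydata.csv'
        APPEND
        INTO TABLE my_table
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
        TRAILING NULLCOLS
        (
          col1,
          col2,
          col3
        )

    Then run it from the operating system prompt, e.g. sqlldr userid=scott/tiger control=load_csv.ctl log=load_csv.log. Ten lakh (1,000,000) rows is well within what SQL*Loader handles; the direct=true option speeds up large loads.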

  • Error in loading transaction data from flat file

    hi everybody
    While I try to load data from a flat file, at the time of scheduling it shows the error "check load from InfoSource", and while monitoring the data it shows the error "the corresponding data packets were not updated using PSA". What can I do to rectify the problem?
    thanks in advance

    Hi,
    Please check whether the data is in the correct format, using the "Preview" option in the InfoPackage.
    Check whether all the related AWB objects are active.
    Regards,
    K.Manikandan.

  • Loading transaction data from flat file to SNP order series objects

    Hi,
    I am a BW developer and I need to provide data to my SNP team.
    Can you please tell me more about loading transaction data (sales orders, purchase orders, etc.) from external systems into order-based SNP objects/structures? There is a 3rd-party tool called WebConnect that gets data from external systems and can deliver it as flat files or in database tables, in whatever format we require.
    I know we can use BAPIs, but I don't know how. Can you please send any sample ABAP program code that calls a BAPI to read a flat file and write to SNP order series objects?
    Please let me know how to get data from a flat file into SNP order-based objects, along with the available options; I will be very grateful.
    thanks in advance
    Rahul

    Hi,
    Please go through the following links:
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4180456
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4040057
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=3832922
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4067999
    Hope this helps...
    Regards,
    Habeeb
    Assign points if helpful..:)

  • Error while loading the data from CSV with CTL file

    Hi TOM,
    When I try to load data from a CSV file into this table, the CTL file content is:
    load data
    into table XXXX append
         Y_aca position char (3),
         x_date position date 'yyyy/mm/dd'
    NULLIF (x_date = ' '),
    X_aca position (* + 3) char (6)
    "case when :Y_aca = 'ABCDDD' and :XM_dt is null then
    decode(:X_aca,'AB','BA','CD',
    'DC','EF','FE','GH','HG',:X_aca)
    else :X_aca
    end as X_aca",
    Z_cdd position char (2),
         XM_dt position date 'yyyy/mm/dd'
    NULLIF XM_dt = ' ',
    When I try the above CTL file, I get the following error:
    SQL*Loader-281: Warning: ROWS parameter ignored in parallel mode.
    SQL*Loader-951: Error calling once/load initialization
    ORA-02373: Error parsing insert statement for table "XYZ"."XXXX".
    ORA-00917: missing comma

    Possible Solutions
    Make sure that the data source is valid.
    Is a member from each dimension specified correctly in the data source or rules file?
    Is the numeric data field at the end of the record? If not, move the numeric data field in the data source or move the numeric data field in the rules file.
    Are all members that might contain numbers (such as "100") enclosed in quotation marks in the data source?
    If you are using a header, is the header set up correctly? Remember that you can add missing dimension names to the header.
    Does the data source contain extra spaces or tabs?
    Has the updated outline been saved?
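
    For comparison, a generic control-file skeleton (not necessarily the fix for the file above; the positions and the SQL expression are only illustrative): SQL*Loader expects the whole field list inside a single pair of parentheses, with a comma between fields and any SQL expression quoted as one string, roughly:

        load data
        infile 'input.dat'
        append
        into table XXXX
        (
          y_aca   position(1:3)    char,
          x_date  position(4:13)   date 'yyyy/mm/dd' nullif x_date = blanks,
          x_aca   position(17:22)  char "decode(:x_aca, 'AB', 'BA', :x_aca)",
          z_cdd   position(23:24)  char
        )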

  • Error when executing interface which loads data from a csv file which has 320 columns

    Hi,
    Can someone provide a resolution for the error below?
    I have created an interface which loads data from a csv file with 320 columns into a synonym which also has 320 columns,
    using LKM File to SQL and IKM SQL Control Append.
    I am getting the error below when executing the interface:
    com.sunopsis.tools.core.exception.SnpsSimpleMessageException: ODI-17517: Error during task interpretation. Task: 6 java.lang.Exception: BeanShell script error: Sourced file: inline evaluation of: ``out.print("The application script threw an exception: java.lang.StringIndexOutOf . . . '' Token Parsing Error: Lexical error at line 2, column 42. Encountered: "\\" (92), after : "": <at unknown location> BSF info: Create external table at line: 0 column: columnNo
    at com.sunopsis.dwg.codeinterpretor.SnpCodeInterpretor.transform(SnpCodeInterpretor.java:485)
         at com.sunopsis.dwg.dbobj.SnpSessStep.createTaskLogs(SnpSessStep.java:711)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:461)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
         at java.lang.Thread.run(Thread.java:662)
    Caused by: org.apache.bsf.BSFException: BeanShell script error: Sourced file: inline evaluation of: ``out.print("The application script threw an exception: java.lang.StringIndexOutOf . . . '' Token Parsing Error: Lexical error at line 2, column 42. Encountered: "\\" (92), after : "": <at unknown location>
    BSF info: Create external table at line: 0 column: columnNo
         at bsh.util.BeanShellBSFEngine.eval(Unknown Source)
         at bsh.util.BeanShellBSFEngine.exec(Unknown Source)
         at com.sunopsis.dwg.codeinterpretor.SnpCodeInterpretor.transform(SnpCodeInterpretor.java:471)
         ... 11 more
    Text: The application script threw an exception: java.lang.StringIndexOutOfBoundsException: String index out of range: 2 BSF info: Create external table at line: 0 column: columnNo
    out.print("createTblCmd = r\"\"\"\ncreate table ") ;
    out.print(odiRef.getTable("L", "COLL_NAME", "W")) ;
    out.print("<?=(extTabColFormat.getUseView())?\"_ET\":\"\"?>\n(\n\t") ;
    out.print(odiRef.getColList("", "[CX_COL_NAME]\\t"+
              "<?=extTabColFormat.getExtTabDataType(\\u0022[CX_COL_NAME]\\u0022,\\u0022[SOURCE_DT]\\u0022, \\u0022[DEST_WRI_DT]\\u0022, \\u0022[COL_FORMAT]\\u0022, \\u0022[BYTES]\\u0022, \\u0022[LONGC]\\u0022, \\u0022[SCALE]\\u0022)?>"
         , ",\\n\\t", "","")) ;
    out.print("\n)\nORGANIZATION EXTERNAL\n(\n\tTYPE ORACLE_LOADER\n\tDEFAULT DIRECTORY dat_dir\n\tACCESS PARAMETERS\n\t(\n\t\tRECORDS DELIMITED BY 0x'") ;
    out.print(odiRef.getSrcTablesList("[XFILE_SEP_ROW]","")) ;
    out.print("'\n\t\t") ;
    out.print(odiRef.getUserExit("EXT_CHARACTERSET")) ;
    out.print("\n\t\t") ;
    out.print(odiRef.getUserExit("EXT_STRING_SIZE")) ;
    out.print("\n\t\tBADFILE\t\t'") ;
    out.print(odiRef.getSrcTablesList("", "[RES_NAME]", "", "")) ;
    out.print("_%a.bad'\n\t\tLOGFILE\t\t'") ;
    out.print(odiRef.getSrcTablesList("", "[RES_NAME]", "", "")) ;
    out.print("_%a.log'\n\t\tDISCARDFILE\t'") ;
    out.print(odiRef.getSrcTablesList("", "[RES_NAME]", "", "")) ;
    out.print("_%a.dsc'\n\t\tSKIP \t\t") ;
    out.print(odiRef.getSrcTablesList("", "[FILE_FIRST_ROW]", "", "")) ;
    out.print("\n") ;
    if (odiRef.getSrcTablesList("", "[FILE_FORMAT]", "", "").equals("F")) {out.print("\n\t\tFIELDS\n\t\t") ;
    out.print(odiRef.getUserExit("EXT_MISSING_FIELD")) ;
    out.print("\n\t\t(\n\t\t\t") ;
    out.print(odiRef.getColList("", "[CX_COL_NAME]\\tPOSITION([FILE_POS]:[FILE_END_POS])\\t"+
                        "<?=extTabColFormat.getExtTabFormat(\\u0022[CX_COL_NAME]\\u0022,\\u0022[SOURCE_DT]\\u0022, \\u0022DEST_WRI_DT\\u0022, \\u0022[COL_FORMAT]\\u0022, \\u0022[BYTES]\\u0022, \\u0022[LONGC]\\u0022, \\u0022[SCALE]\\u0022)?>"
                        , ",\\n\\t\\t\\t", "","")) ;
    out.print("\t\t\n\t\t)\n\t)\n") ;
    } else {out.print("\n\t\tFIELDS TERMINATED BY x'") ;
    out.print(odiRef.getSrcTablesList("", "[XFILE_SEP_FIELD]", "", "")) ;
    out.print("'\n\t\t") ;
    if(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").equals("")){out.print("\n\t\t") ;
    } else {out.print("OPTIONALLY ENCLOSED BY '") ;
    out.print(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(0,1)) ;
    out.print("' AND '") ;
    out.print(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(1,2)) ;
    out.print("' ") ;
    }out.print("\n\t\t") ;
    out.print(odiRef.getUserExit("EXT_MISSING_FIELD")) ;
    out.print("\n\t\t(\n\t\t\t") ;
    out.print(odiRef.getColList("", "[CX_COL_NAME]\\t"+
                        "<?=extTabColFormat.getExtTabFormat(\\u0022[CX_COL_NAME]\\u0022,\\u0022[SOURCE_DT]\\u0022, \\u0022DEST_WRI_DT\\u0022, \\u0022[COL_FORMAT]\\u0022, \\u0022[BYTES]\\u0022, \\u0022[LONGC]\\u0022, \\u0022[SCALE]\\u0022)?>"
                        , ",\\n\\t\\t\\t", "","")) ;
    out.print("\t\t\n\t\t)\n\t)\n") ;
    }out.print("\tLOCATION (") ;
    out.print(odiRef.getSrcTablesList("", "'[RES_NAME]'", "", "")) ;
    out.print(")\n)\n") ;
    out.print(odiRef.getUserExit("EXT_PARALLEL")) ;
    out.print("\nREJECT LIMIT ") ;
    out.print(odiRef.getUserExit("EXT_REJECT_LIMIT")) ;
    out.print("\n\"\"\"\n \n# Create the statement\nmyStmt = myCon.createStatement()\n \n# Execute the trigger creation\nmyStmt.execute(createTblCmd)\n \nmyStmt.close()\nmyStmt = None\n \n# Commit, just in case\nmyCon.commit()") ;
    ****** ORIGINAL TEXT ******
    createTblCmd = r"""
    create table <%=odiRef.getTable("L", "COLL_NAME", "W")%><?=(extTabColFormat.getUseView())?"_ET":""?>
         <%=odiRef.getColList("", "[CX_COL_NAME]\t"+
              "<?=extTabColFormat.getExtTabDataType(\u0022[CX_COL_NAME]\u0022,\u0022[SOURCE_DT]\u0022, \u0022[DEST_WRI_DT]\u0022, \u0022[COL_FORMAT]\u0022, \u0022[BYTES]\u0022, \u0022[LONGC]\u0022, \u0022[SCALE]\u0022)?>"
         , ",\n\t", "","")%>
    ORGANIZATION EXTERNAL
         TYPE ORACLE_LOADER
         DEFAULT DIRECTORY dat_dir
         ACCESS PARAMETERS
              RECORDS DELIMITED BY 0x'<%=odiRef.getSrcTablesList("[XFILE_SEP_ROW]","")%>'
              <%=odiRef.getUserExit("EXT_CHARACTERSET")%>
              <%=odiRef.getUserExit("EXT_STRING_SIZE")%>
              BADFILE          '<%=odiRef.getSrcTablesList("", "[RES_NAME]", "", "")%>_%a.bad'
              LOGFILE          '<%=odiRef.getSrcTablesList("", "[RES_NAME]", "", "")%>_%a.log'
              DISCARDFILE     '<%=odiRef.getSrcTablesList("", "[RES_NAME]", "", "")%>_%a.dsc'
              SKIP           <%=odiRef.getSrcTablesList("", "[FILE_FIRST_ROW]", "", "")%>
    <% if (odiRef.getSrcTablesList("", "[FILE_FORMAT]", "", "").equals("F")) {%>
              FIELDS
              <%=odiRef.getUserExit("EXT_MISSING_FIELD")%>
                   <%=odiRef.getColList("", "[CX_COL_NAME]\tPOSITION([FILE_POS]:[FILE_END_POS])\t"+
                        "<?=extTabColFormat.getExtTabFormat(\u0022[CX_COL_NAME]\u0022,\u0022[SOURCE_DT]\u0022, \u0022DEST_WRI_DT\u0022, \u0022[COL_FORMAT]\u0022, \u0022[BYTES]\u0022, \u0022[LONGC]\u0022, \u0022[SCALE]\u0022)?>"
                        , ",\n\t\t\t", "","")%>          
    <%} else {%>
              FIELDS TERMINATED BY x'<%=odiRef.getSrcTablesList("", "[XFILE_SEP_FIELD]", "", "")%>'
              <% if(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").equals("")){%>
              <%} else {%>OPTIONALLY ENCLOSED BY '<%=odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(0,1)%>' AND '<%=odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(1,2)%>' <%}%>
              <%=odiRef.getUserExit("EXT_MISSING_FIELD")%>
                   <%=odiRef.getColList("", "[CX_COL_NAME]\t"+
                        "<?=extTabColFormat.getExtTabFormat(\u0022[CX_COL_NAME]\u0022,\u0022[SOURCE_DT]\u0022, \u0022DEST_WRI_DT\u0022, \u0022[COL_FORMAT]\u0022, \u0022[BYTES]\u0022, \u0022[LONGC]\u0022, \u0022[SCALE]\u0022)?>"
                        , ",\n\t\t\t", "","")%>          
    <%}%>     LOCATION (<%=odiRef.getSrcTablesList("", "'[RES_NAME]'", "", "")%>)
    <%=odiRef.getUserExit("EXT_PARALLEL")%>
    REJECT LIMIT <%=odiRef.getUserExit("EXT_REJECT_LIMIT")%>
    # Create the statement
    myStmt = myCon.createStatement()
    # Execute the trigger creation
    myStmt.execute(createTblCmd)
    myStmt.close()
    myStmt = None
    # Commit, just in case
    myCon.commit().
         at com.sunopsis.dwg.dbobj.SnpSessStep.createTaskLogs(SnpSessStep.java:738)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:461)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
         at java.lang.Thread.run(Thread.java:662)

    The issue is encountered because the text delimiter defined for the source file does not consist of a pair of delimiters; the KM code above calls substring(0,1) and substring(1,2) on the [FILE_ENC_FIELD] value, which would raise the StringIndexOutOfBoundsException seen in the log when only a single delimiter character is supplied.
    Please see support Note [ID 1469977.1] for details.

  • Error while loading Transactional data from NW BW Infoprovider

    Hi,
    I am trying to load the transactional data using the delivered "Transactional data from NW BW InfoProvider" package and I am getting the error message "Error occurs when loading transaction data from other cube".
    Below is the transformation file content
    *OPTIONS
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER = TAB
    AMOUNTDECIMALPOINT = .
    SKIP = 0
    SKIPIF =
    VALIDATERECORDS=NO
    CREDITPOSITIVE=YES
    MAXREJECTCOUNT= 9999999999999
    ROUNDAMOUNT=
    *MAPPING
    ACCOUNT = 0ACCOUNT
    BUSINESSTYPE = *NEWCOL(NOBTYPE)
    BWSRC = *NEWCOL(R3)
    COMPANYCODE = 0COMP_CODE
    DATASRC = *NEWCOL(R3)
    FUNCTIONALAREA = *IF(0FUNC_AREA = *STR() then *STR(NOFA);0FUNC_AREA)
    GAME = *NEWCOL(NOGAME)
    INPUTCURRENCY = *IF(0CURRENCY = *STR() then *STR(NOCURR) ; 0CURRENCY)
    PROBABILITY = *NEWCOL(NOPROB)
    PRODUCT = *NEWCOL(NOPROD)
    PROFITCENTER = 0PROFIT_CTR
    PROJECT = *NEWCOL(NOPROJECT)
    TIME = 0FISCPER
    VERSION = *NEWCOL(REV0)
    WEEKS = *NEWCOL(NOWEEK)
    SIGNEDDATA= 0AMOUNT
    *CONVERSION
    Below is the Error Log
    /CPMB/MODIFY completed in 0 seconds
    /CPMB/INFOPROVIDER_CONVERT completed in 0 seconds
    /CPMB/CLEAR completed in 0 seconds
    [Selection]
    InforProvide=ZPCA_C01
    TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\R3_TD_P&L.xls
    CLEARDATA= Yes
    RUNLOGIC= No
    CHECKLCK= Yes
    [Messages]
    Task name CONVERT:
    No 1 Round:
    Error occurs when loading transaction data from other cube
    Application: ACTUALREPORTING Package status: ERROR
    Has anyone had this issue before, or is there something that needs to be looked at in the BW-delivered task?
    Any input will be appreciated.
    Sanjay

    Hello Gurus,
    Currently I am getting the error below while loading cost center master data from BW to BPC.
    Task name MASTER DATA SOURCE:
    Record count: 189
    Task name TEXT SOURCE:
    Record count: 189
    Task name CONVERT:
    No 1 Round:
    Info provider  is not available
    Application: ZRB_SALES_CMB Package status: ERROR
    Can anybody tell me if I have missed anything?
    Regards,
    BI NEW
    Edited by: BI  NEW on Feb 23, 2011 12:25 PM

  • "Error occurs when loading transaction data from other model" - BW loading into BPC

    Hi Experts,
    I'm having a problem with my data load from BW, using the standard Load InfoProvider Selections data manager package.
    If I run it for a period without data it succeeds (with a warning), but if there is data to be extracted I get the following error:
    Task name CONVERT:
    No 1 Round:
    Error occurs when loading transaction data from other model
    model: AIACONS. Package status: ERROR
    As it runs OK when there isn't data, it appears something is preventing the movement of data out of the cube itself, rather than a validation issue.
    Has anyone encountered anything similar, or have any ideas as to the problem?
    Best,
    Chris

    Hi Vadim,
    It's not specific to the transformation file, as I have tried with others for the same BW cube and get the same result.
    We get a warning when we try to validate the transformation file:
    "Error occurs when loading transaction data from other model".
    This only appears in the validation pop-up and doesn't throw up any warnings about the transformation file itself. The validation log says:
    Validate  and Process Transformation File Log
    Log creation time
    3/7/2014 16:09
    The result of validation of the conversion file: SUCCESS
    The result of validation of the conversion file with the data file: FAIL
    Validation Result
    Validation Option: ValidateRecords = NO
    Message: Error occurs when loading transaction data from other model
    Reject List
    I can't find any errors anywhere else.
    Best,
    Chris

  • Error occurs when loading transaction data from other model

    Hello Experts, I am trying to validate my transformation file and I can see peculiar behaviour: whether the transformation file is incomplete or complete with all the mappings, I am getting the same error as above.
    I can see the options, mapping and conversion sections validating successfully, and it is still throwing the above error.
    Incomplete Transformation File
    *OPTIONS
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER = ,
    AMOUNTDECIMALPOINT = .
    SKIP = 0
    SKIPIF =
    VALIDATERECORDS=YES
    CREDITPOSITIVE=YES
    MAXREJECTCOUNT= 10
    ROUNDAMOUNT=
    *MAPPING
    CUSTOMER = *NEWCOL (NO_CUST)
    Validating the transformation files
    Validating options...
    Validation of options was successful.
    Validating mappings...
    Validation of mappings was successful.
    Validating conversions...
    Validation of the conversion was successful
    Creating the transformation xml file. Please wait...
    Transformation xml file has been saved successfully.
    Begin validate transformation file with data file...
    [Start test transformation file]
    Validate has successfully completed
    ValidateRecords = YES
    Error occurs when loading transaction data from other model
    Validation with data file failed
    I am getting the same error with the complete transformation file as well. Please let me know where I am making a mistake, or whether it is a system error.
    Thanking you
    Praveen

    Hi,
    By
    *MAPPING
    CUSTOMER = *NEWCOL (NO_CUST)
    you want CUSTOMER to receive a fixed string "NO_CUST"?
    If so, use:
    *MAPPING
    CUSTOMER = *STR (NO_CUST)

  • Error occurs when loading transaction data from other cube

    Hi Gurus,
    I'm currently working on a transformation file for loading transactional data from BW to BPC, but the error message "Error occurs when loading transaction data from other cube" is displayed. I have already checked permissions for my user, double-checked my transformation file and the dimensions, and made all the conversion files needed, but the message has not changed.
    Can anybody help me solve this problem?
    Thanks a lot & Best Regards,
    HH

    Hi,
    Here are the transformation file and the conversion file. I have already tested both with a different InfoCube and they work, but not for the one needed.
    *OPTIONS
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER = ,
    AMOUNTDECIMALPOINT = .
    SKIP = 0
    SKIPIF =
    VALIDATERECORDS=YES
    CREDITPOSITIVE=YES
    MAXREJECTCOUNT=
    ROUNDAMOUNT=
    *MAPPING
    Category=*NEWCOL(ACTUAL)
    P_0BASE_UOM=0BASE_UOM
    P_0BUS_AREA=0BUS_AREA
    P_0COSTCENTER=0COSTCENTER
    P_0FUNDS_CTR=*NEWCOL(null)
    P_0GL_ACCOUNT=0ACCOUNT
    P_0LOC_CURRCY=*NEWCOL(MXN)
    P_0MATL_TYPE=*NEWCOL(null)
    P_0VENDOR=0VENDOR
    P_DataSrc=*NEWCOL(UPLOAD)
    P_ZMATERIAL1=0MATERIAL
    P_ZMATERIAL2=*NEWCOL(null)
    P_ZMATL_CLASS=*NEWCOL(null)
    P_ZMATL_TESP=*NEWCOL(null)
    P_ZRATIO=*NEWCOL(KF_inpmdInt)
    Time=0CALMONTH
    SIGNEDDATA=0TOTALSTCK
    *CONVERSION
    Time=Time_conv.xls
    EXTERNAL     INTERNAL
    201101          2011.JAN
    201102          2011.FEB
    201103          2011.MAR
    201104          2011.APR
    201105          2011.MAY
    201106          2011.JUN
    201107          2011.JUL
    201108          2011.AUG
    201109          2011.SEP
    201110          2011.OCT
    201111          2011.NOV
    201112          2011.DEC
    Thank you for taking a glance at the files.
    Best Regards,
    HH

  • Loading data from .csv file into Oracle Table

    Hi,
    I have a requirement where I need to populate data from a .csv file into an Oracle table.
    Is there any mechanism I can follow to do this?
    Any help will be fruitful.
    Thanks and regards

    You can use SQL*Loader or external tables for your requirement.
    Missed Karthick's post ... already there :)
    Edited by: Rajneesh Kumar on Dec 4, 2008 10:54 AM
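
    As a sketch of the external-table route (directory path, file, table and column names are placeholders, and SKIP 1 assumes a header line; the directory object must point to a server directory that contains the file):

        CREATE OR REPLACE DIRECTORY data_dir AS '/path/to/csv/files';

        CREATE TABLE my_csv_ext
        (
          col1  VARCHAR2(50),
          col2  VARCHAR2(50),
          col3  VARCHAR2(50)
        )
        ORGANIZATION EXTERNAL
        (
          TYPE ORACLE_LOADER
          DEFAULT DIRECTORY data_dir
          ACCESS PARAMETERS
          (
            RECORDS DELIMITED BY NEWLINE
            SKIP 1
            FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
            MISSING FIELD VALUES ARE NULL
          )
          LOCATION ('mydata.csv')
        )
        REJECT LIMIT UNLIMITED;

        -- then simply query it, or insert from it into the target table
        INSERT INTO my_table SELECT col1, col2, col3 FROM my_csv_ext;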

  • Loading data from .csv file into existing table

    Hi,
    I have taken a look at several threads which talk about loading data from a .csv file into an existing/new table, and also checked out Vikas's application regarding the same. I am trying to explain my requirement with an example.
    I have a .csv file and I want the data to be loaded into an existing table. The timesheet table columns are:
    timesheet_entry_id, time_worked, timesheet_date, project_key.
    The csv columns are:
    project, utilization, project_key, timesheet_category, employee, timesheet_date, hours_worked, etc.
    What I need to know is whether, before the csv data is loaded into the timesheet table, there is any way of validating the project_key (which is the primary key of the projects table) against the projects table. I need to perform similar validations with other columns, like customer_id from the customers table. Basically, the loading should be done only after validating that the data exists in the parent table. Has anyone done this kind of loading through the APEX data load utility, or is there another method of accomplishing the same?
    Does Vikas's application do what the utility does (I am assuming that, the code being from 2005, the utility was not incorporated in APEX at that time)? Any helpful advice is greatly appreciated.
    Thanks,
    Anjali

    Hi Anjali,
    Take a look at these threads which might outline different ways to do it -
    File Browse, File Upload
    Loading CSV file using external table
    Loading a CSV file into a table
    You can also create hidden items on the page to validate records before inserting the data.
    Hope this helps,
    M Tajuddin
    http://tajuddin.whitepagesbd.com
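
    One common pattern, as a sketch only (csv_stage and timesheet_seq are assumed names for a staging table and a sequence, and the column mapping is illustrative, not taken from this thread): load the csv into a staging table first, then insert only the rows whose keys exist in the parent tables and report the rest back to the user:

        INSERT INTO timesheet (timesheet_entry_id, time_worked, timesheet_date, project_key)
        SELECT timesheet_seq.NEXTVAL, s.hours_worked, s.timesheet_date, s.project_key
        FROM   csv_stage s
        WHERE  EXISTS (SELECT 1 FROM projects p WHERE p.project_key = s.project_key);

        -- rows that fail the parent-key check, to be shown to the user for correction
        SELECT s.*
        FROM   csv_stage s
        WHERE  NOT EXISTS (SELECT 1 FROM projects p WHERE p.project_key = s.project_key);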

  • How to load the data from excel file (extension .CSV) into a temp table

    Hi
    How can I load the data from an Excel file (extension .CSV) into a temporary Oracle table in Forms 11g?
    My Forms Version is - Forms [64 Bit] Version 11.1.2.0.0 (Production)
    Kindly Suggest the Solution.
    Regards,
    Sachin

    Hello Sachin,
    You can use the following Metalink note: How to Read Data from an EXCEL Spreadsheet into a Form Using Webutil Client_OLE2 (Doc ID 813535.1), and modify it a little bit.
    Instead of copying values into Forms items, you can save them in your temporary table.
    Kind regards,
    Alex
    If someone's reply is helpful or correct, please mark it accordingly.

  • How to load the data from excel file into temporary table in Forms 11g?

    Hi
    How can I load the data from an Excel file (extension .CSV) into a temporary Oracle table in Forms 11g?
    My Forms Version is - Forms [64 Bit] Version 11.1.2.0.0 (Production)
    Kindly Suggest the Solution.
    Regards,
    Sachin

    Declare
        v_full_filename         varchar2(500);
        v_server_path           varchar2(2000);
        v_separator             VARCHAR2(1);
        v_filename              VARCHAR2(400);
        filename                VARCHAR2 (100);
        v_stop_load             varchar2 (2000);
        v_rec_error_log         varchar2(4000);
        v_error_log             varchar2(4000);
        ctr                     NUMBER (12);
        cols                    NUMBER (2);
        btn                     number;
        RES                     BOOLEAN;   
        application             ole2.obj_type;
        workbooks               ole2.obj_type;
        workbook                ole2.obj_type;
        worksheets              ole2.obj_type;
        worksheet               ole2.obj_type;
        cell                    ole2.obj_type;
        cellType                ole2.OBJ_TYPE;
        args                    ole2.obj_type;
        PROCEDURE olearg
        IS
        args   ole2.obj_type;
        BEGIN
        args := ole2.create_arglist;
        ole2.add_arg (args, ctr);                                
        ole2.add_arg (args, cols);                                   
        cell := ole2.get_obj_property (worksheet, 'Cells', args);
        ole2.destroy_arglist (args);
        END;
    BEGIN
    v_full_filename := client_get_file_name(directory_name => null
                                     ,file_name      => null
                                     ,file_filter    => 'Excel  files (*.xls)|*.xls|'  
                                                                            ||'Excel  files (*.xlsx)|*.xlsx|'                                                                 
                                     ,message        => 'Choose Excel file'
                                     ,dialog_type    => null
                                     ,select_file    => null);
    If v_full_filename is not null Then
    v_separator := WEBUTIL_CLIENTINFO.Get_file_Separator ;
    v_filename := v_separator||v_full_filename ;
    :LOAD_FILE_NAME := substr(v_filename,instr(v_filename,v_separator,-1) + 1);                                
    -- v_server_path should hold the target directory on the application server
    RES := Webutil_File_Transfer.Client_To_AS(v_full_filename, v_server_path||substr(v_filename,instr(v_filename,v_separator,-1) + 1));
    --Begin load data from EXCEL
    BEGIN
        filename := v_server_path||substr(v_filename,instr(v_filename,v_separator,-1) + 1); -- to pick the file
        application := ole2.create_obj ('Excel.Application');
        ole2.set_property (application, 'Visible', 'false');
        workbooks := ole2.get_obj_property (application, 'Workbooks');
        args := ole2.create_arglist;
        ole2.add_arg (args, filename); -- file path and name
        workbook := ole2.get_obj_property(workbooks,'Open',args);
        ole2.destroy_arglist (args);
        args := ole2.create_arglist;
        ole2.add_arg (args, 'Sheet1');
        worksheet := ole2.get_obj_property (workbook, 'Worksheets', args);
        ole2.destroy_arglist (args);
        ctr := 2;                                                     --row number
        cols := 1;                                                -- column number
        go_block('xxx');
        FIRST_RECORD;  
        LOOP
                --Column 1 VALUE --------------------------------------------------------------------
            olearg;
            v_stop_load := ole2.get_char_property (cell, 'Text'); --cell value of the argument
            EXIT WHEN v_stop_load IS NULL;                        --stop at the first empty row
            :item1 := v_stop_load;
            cols := cols + 1;
              --Column 2 VALUE --------------------------------------------------------------------
            olearg;
            :item2 := ole2.get_char_property (cell, 'Text'); --cell value of the argument
            cols := cols + 1;
            --<and so on for the remaining columns>
            --move to the next spreadsheet row and the next block record
            ctr  := ctr + 1;
            cols := 1;
            NEXT_RECORD;
        END LOOP;
        ole2.invoke (application, 'Quit');
        ole2.RELEASE_OBJ (cell);
        ole2.RELEASE_OBJ (worksheet);
        ole2.RELEASE_OBJ (worksheets);
        ole2.RELEASE_OBJ (workbook);
        ole2.RELEASE_OBJ (workbooks);
        ole2.RELEASE_OBJ (application);
    END;
    --End load data from EXCEL
    End If;
    END;
    Please mark it as answered if it helped.

  • SQL*Loader: loading specific columns from CSV file to the table

    Dear All,
    I am loading specific columns from a .CSV file into an Oracle table.
    Could you please help with how I can load only those columns into the table?
    E.g. the CSV file has id, Frst_name, Last_name, Address, Phone, Insurance, etc.
    Out of these I want to load only the Frst_name and Last_name columns, into Oracle table columns named, say, fname and lname.
    Thanks in Adv.
    Junu

    Lily,
    I made some changes to your table def but you will get the idea
    -- Table EMPLOYEE
    CREATE TABLE EMPLOYEE
    (
      EMPID        NUMBER                           NOT NULL,
      EMPNICKNAME  VARCHAR2(10 BYTE)                    NULL,
      FNAME        VARCHAR2(20 BYTE)                NOT NULL,
      MI           VARCHAR2(20 BYTE)                    NULL,
      LNAME        VARCHAR2(20 BYTE)                NOT NULL,
      FULLNAME     VARCHAR2(20 BYTE)                NOT NULL,
      HIREDATE     DATE          DEFAULT SYSDATE    NOT NULL
    );
    --  data file employee.dat
    1,amy,b,amy b
    2,cindy,d,cindy d
    3,eric,f,eric f
    4,gary,h,gary
    -- Control file: employee.ctl (you can use TRUNCATE, REPLACE or APPEND; see the sqlldr documentation for more options)
    load data
    truncate
    into table employee
    fields terminated by ","
    optionally enclosed by '"'
    TRAILING NULLCOLS
    (
      empId    INTEGER EXTERNAL,
      FName    char(20),
      LName    char(20),
      FullName char(30)
    )
    Now, to load, run the following (or you can specify INFILE in the control file):
    sqlldr username/password control=employee.ctl data=employee.dat log=employee.log
    Hope this helps.
    Regards
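
    On the original question of loading only some of the csv columns: SQL*Loader can mark the unwanted fields as FILLER, so they are read from the file but not loaded. A sketch following the layout described in the question (table and column names are placeholders):

        load data
        infile 'people.csv'
        append
        into table person_names
        fields terminated by ',' optionally enclosed by '"'
        trailing nullcols
        (
          id         FILLER,
          fname      char(20),
          lname      char(20),
          address    FILLER,
          phone      FILLER,
          insurance  FILLER
        )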
