Error while uploading data through CSV File

Dear All,
While performing the following steps I have encountered an error in BW 3.5.
Step 1. Right-click the source system (demo: flat file), and then select Create InfoPackage....
Step 2. Select the DataSource Material number (Master data) and enter a description for the InfoPackage.
Step 3. Click the External data tab. Select options as shown on the screen. Enter a file name with a path (CSV file).
When I check the preview I can see the values inside the CSV file,
but when I start the scheduler to upload this data it displays the following error:
Syntax error in template RSTMPLIR, row 0 (-> Long text)
Checking the Performance Assistant shows the following details:
Diagnosis
Field "C_R_D" is unknown. It is neither in one of the specified tables nor defined by a "DATA" ...
System response
The program generation was terminated.
Procedure
Correct the template
Can you guide me on how to eliminate this error?
Points will be rewarded for your contribution.
Regards,
Purav

Flat File was in error.

Similar Messages

  • Error while Loading data through .csv file

    Hi,
I am getting the date error below when loading data into OLAP tables through a .csv file.
The data stored in the .csv is 20071113121100.
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TE_7007 Transformation Evaluation Error [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
    ... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')]
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11132 Transformation [Exp_FILE_CHNL_TYPE] had an error evaluating output column [CREATED_ON_DT_OUT]. Error message is [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
    ... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')].
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11019 There is an error in the port [CREATED_ON_DT_OUT]: The default value for the port is set to: ERROR(<<Expression Error>> [ERROR]: transformation error
    ... nl:ERROR(u:'transformation error')).
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11021 An error occurred moving data from the transformation Exp_FILE_CHNL_TYPE: to the transformation W_CHNL_TYPE_DS.
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> CMN_1086 Exp_FILE_CHNL_TYPE: Number of errors exceeded threshold [1].
    Any help is greatly appreciated.
    Thanks,
    Poojak

1) Wrong format; you won't get much support for loading OLAP cubes in here, I think.
2) Has your CSV file been anywhere near Excel, by chance? The conversion of 20071113121100 to 2.00711E+13 looks very much like what I see when Excel has an invalid number mask / precision specified for a cell.
*** We are using EBS. The file was already generated by consultants through a SQL query. We are getting the error while loading through Informatica. The target table is set up with a date datatype and the source is String(19).
The expression in Informatica is set up as below:
    IIF(ISNULL(CREATED_ON_DT), NULL, TO_DATE(CREATED_ON_DT, 'YYYYMMDDHH24MISS'))
    Thanks,
    Poojak

  • Error while loading data through flat files

    Hi,
I am loading data through flat files.
I loaded the data in Development and then transported it to the Quality system for integration testing; I loaded the data in the Quality system and the data loading was successful.
But we do regression testing in one more Quality system, where the objects get transported from the Quality system. Here, when I try to load the data to any of the cubes in this system, I get the following error:
    Error 'The argument 'US' cannot be interpreted as a number' on assignment field /BIC/ZIFBCOCTR record 1 value US
Generally we get this error when the data of one field is going into another field.
But one thing I am unable to understand: it was working fine in the Quality system, and all the objects got transported from Quality to the regression testing system.
Could anyone help me out?
    Thanks
    Maya

    Hi,
Can you please cross-check the file structure against the transfer structure?
    Cheers,
    Malli....

  • SQlldr Error while uploading "excel" or "csv" file.

    Hello to community,
We are using Oracle AS (Application Server) 10g as the web server and database server.
The database server has an NFS partition, which is mounted onto the web server.
So if any client tries to upload an Excel or CSV file, the web server redirects that file's data onto the database server through the NFS partition.
On clicking the "upload" button from the client end, they get a strange error. Checking the "ias_console" log file, I found the latest log entries below, which relate to the error. Kindly let me know where the problem is coming from.
The database server's shared NFS partition is named "web_upload", and we have the same "web_upload" partition on the web server. The command for mounting the NFS partition is given below.
On the AIX web server: mount <ip address>:/file_data/web_upload/ /web_upload/
Syntax: <ip address of db server>:/<mount point> <web server mount point>
    The error is as given below.
    09/06/09 15:27:47 NumIdle: 2
    09 Jun 2009 15:27:47,764 [DEBUG] - [ com.vat.website.service.DatabaseService ] [ private Connection getDBConnection() ] Exited
    09 Jun 2009 15:27:47,764 [DEBUG] - [ com.vat.website.service.DatabaseService ] [ private Connection getDBConnection() ] Exited
    09 Jun 2009 15:27:47,766 [DEBUG] - [ com.vat.website.service.DatabaseService ] [ private void initialize() ] After getDBConnection called,
      connection object is: org.apache.commons.dbcp.PoolableConnection@1048a893
    09 Jun 2009 15:27:47,767 [DEBUG] - [ com.vat.website.service.ExcelUploadService ] [ public boolean insertFileDetails(ExcelDataBean databean,String
      strFname) ] strUniqueKey:select  web_uploaded_file_history_seq.nextval from dual
    09 Jun 2009 15:27:47,767 [DEBUG] - [ com.vat.website.service.DatabaseService ] [ public ResultSet executeQuery(String strQuery) ] Entered
    09 Jun 2009 15:27:47,773 [DEBUG] - [ com.vat.website.service.DatabaseService ] [ public ResultSet executeQuery(String strQuery) ] Exited
    09 Jun 2009 15:27:47,774 [DEBUG] - [ com.vat.website.service.ExcelUploadService ] [ public boolean insertFileDetails(ExcelDataBean databean,String
      strFname) ] insertQuery:INSERT INTO WEB_UPLOADED_FILE_HISTORY(WUF_SERIAL_NUMBER,WUF_DEALER_ID,WUF_PERIOD_FROM,WUF_PERIOD_TO,WUF_FILE_PATH,WUF_FORM_NO,WUF_SERVER_IP,WUF_UPLOAD_YN,WUF_CREATED_DATE,WUF_CREATED_BY,WUF_ORIGINAL_REVISED,WUF_SHEETS_NUMBER,wuf_reco
      d_key)VALUES('2658271','T00100001000306',to_date('01/01/2009','dd/mm/yyyy'),to_date('31/01/2009','dd/mm/yyyy'),'/web_upload/090609/VAT
      Returns/000000/0000000000/Form201/0000000000_0T201BO0109_009062009152747.csv',replace(decode('T201B','T201M', 'T201','T201B'),'T','VAT-Form'),(select
      RPAD(sys_context('USERENV','IP_ADDRESS'),15,' ') AS client_ipaddress from dual),'Y',SYSDATE,'WEB','O','0','3317063')
    09 Jun 2009 15:27:47,791 [DEBUG] - [ com.vat.website.service.DatabaseService ] [ public void closeDBConnection() ] Entered
    09 Jun 2009 15:27:47,791 [DEBUG] - finalDate:01-JAN-2009
    09 Jun 2009 15:27:47,792 [DEBUG] - finalDate:31-JAN-2009
    09 Jun 2009 15:27:47,806 [DEBUG] - [ com.vat.website.utils.PropertyCache ] [ static Object getValue(String propertyName, String propertyFileName)
      ] Entered
    09 Jun 2009 15:27:47,806 [DEBUG] - [ com.vat.website.utils.PropertyCache ] [ static Object getValue(String propertyName, String propertyFileName)
      ] Entered
    09 Jun 2009 15:27:47,806 [DEBUG] - [ com.vat.website.utils.PropertyCache ] [ static Object getValue(String propertyName, String propertyFileName)
      ] Entered
    09 Jun 2009 15:27:47,806 [DEBUG] - [ com.vat.website.action.UploadAction ] [ public ActionForward submit(ActionMapping mapping, ActionForm
      form,HttpServletRequest request, HttpServletResponse response) ] Executing: sqlldr parfile=parafile.par silent=feedback direct=Y at location:
      /web_upload/090609/VAT Returns/000000/0000000000/Form201/SQLLdr
It seems that this problem is due to "sqlldr"; how can I troubleshoot it?
Waiting for your favorable response.
Thanks in advance,
    Nishith Vyas.


  • While uploading data from local file

    Hi,
While uploading data from a local file that has a header text line into an internal table using the GUI_UPLOAD function module, there is a problem in the upload because of the header text in the file. Do we have any option in this FM itself to skip the header text and upload from the second line onwards? As per our requirement we cannot delete the header text from the file.

    Hi Manish,
Do you have the problem while uploading? Is it giving any error?
If not, and your upload into the internal table is successful, then delete the first row, which is the header, from the internal table using INDEX 1.
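For example, a minimal sketch of that approach (the file name and the raw-line table here are only placeholders):
DATA: lt_raw  TYPE TABLE OF string,
      lv_file TYPE string VALUE 'C:\temp\upload.txt'. "placeholder path
CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    filename        = lv_file
    filetype        = 'ASC'
  TABLES
    data_tab        = lt_raw
  EXCEPTIONS
    file_open_error = 1
    file_read_error = 2
    OTHERS          = 3.
IF sy-subrc = 0.
* drop the header line so that only the data records remain
  DELETE lt_raw INDEX 1.
ENDIF.
Afterwards you can split or move lt_raw into your typed internal table as usual.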
    Regards,
    Chandra Sekhar

  • DB Connect Load - "Unknow error while uploading data from the DB Table"

    Hi Experts,
We have our BI 7 system connected to an Oracle DB-based third-party tool. The loads are performing quite well in the DEV environment.
I would like to know how we transport DB Connect DataSources to Quality systems. Is there any different process to be followed for DB Connect DataSources?
At present the connections between BI Quality and the third-party quality system are established. We transported the DataSource from the BI DEV system to the BI Quality system, but on triggering an InfoPackage we are not able to perform loads. It prompts - "Unknow error while uploading data from the DB Table".
Also, on comparing the DataSources in the DEV system and the Quality system, there are no fields in the "Proposal" tab of the DataSource in the Quality system. Also, I cannot change or activate the DataSource in the Quality system as we don't have change access in Quality.
Please advise.
    Thanks,
    Abhijit

    Hi,
    Sorry for bumping an old thread ....
Did this issue ever get resolved?
    I am facing the same one. The loads work successfully in Dev. The transport for DBConnect DS also moved in successfully.
One strange thing is that the DB user for Dev did not automatically change to the DB user from Quality when I transported the DB Connect DataSource. The DBCon DS still shows me the DB user from Dev in the Quality system.
    I get "Unknown Error" whenever I trigger the data package.
    Advait

  • Error  while uploading data in table t_499s through BDC Prog

    Hi
I am facing a problem while uploading data into table T499S through a BDC program. If there are more than 15 records in the file it does not allow the upload. Kindly suggest what to do.
    Thanx
    Mukesh s

    Hi,
If you want to update only a single table which has user maintenance allowed, use the MODIFY statement.
Example:
TABLES: t499s.

LOOP AT itab INTO wa_tab.
  MOVE-CORRESPONDING wa_tab TO t499s.
  MODIFY t499s.
  CLEAR t499s.
ENDLOOP.
It will update the table; to check it, go to SM30 and look at view V_T499S.
    Rgds
    Aeda

  • Error while uploading data to ztable from excel file

    Hi,
I have a requirement where I have to upload data from an Excel file to a Z table. I have used the FM 'ALSM_EXCEL_TO_INTERNAL_TABLE' for reading the Excel file. After reading the Excel file I have used INSERT zrb_hdr FROM TABLE t_zrb_hdr to update the Z table with the data.
Here it gives an error saying that the database table zrb_hdr and the internal table t_zrb_hdr should be declared of the same type.
I got this error because I changed the date and time fields in the t_zrb_hdr table to character type, so the structures of zrb_hdr and t_zrb_hdr are no longer the same. If I don't change the date and time fields, I don't get proper date and time formats in the output.
Now how can I upload the data into the Z table?

    Hi,
Try this:
DATA: itab    TYPE STANDARD TABLE OF ztable,  "ztable = your database table, e.g. zrb_hdr
      wa_itab TYPE ztable.

LOOP AT t_zrb_hdr INTO wa_t_zrb_hdr.
  wa_itab-date = wa_t_zrb_hdr-date.   "convert the character date back to the internal format here
  wa_itab-time = wa_t_zrb_hdr-time.   "convert the character time back to the internal format here
* ... move all the remaining fields to wa_itab in the same way
  APPEND wa_itab TO itab.
ENDLOOP.
Now insert the records from itab into the database table ztable.
    Thanks,
    Muthu.

  • Error handling while uploading data through Batch Input Session

I want to upload data from a file into two different infotypes at the same time: a user-defined infotype 9010 and the standard infotype 2010. Infotype 9010 has the start and end time, whereas 2010 doesn't. While uploading, I have put in validations to check for duplicate entries in infotype 9010 based on the start time and end time. The duplicate entry is rejected in 9010, but the same entry still gets uploaded into 2010. In 2010 I can't put in any validations because there is no start time and end time. I want to prevent the upload of the entry into infotype 2010 if it is not uploaded into 9010. Please help.

Using EXPORT/IMPORT to ABAP memory will act like a global variable shared between the two separate programs.
Here is an example:
In the first part, export the value to a memory ID once your condition is satisfied and you want to set a flag, like this:
    FREE MEMORY ID 'ZFLAG_VAR'.  "recommended to clear that memory ID
                                 "before the start; you don't need to declare it
    DATA: l_flag TYPE c.
    IF my_condition = 'true'.
      l_flag = 'X'.
      EXPORT l_flag TO MEMORY ID 'ZFLAG_VAR'.
    ENDIF.
In the second part, you can do:
    IMPORT l_flag FROM MEMORY ID 'ZFLAG_VAR'.
    IF l_flag = 'X'.
      WRITE: 'hmmmm ... the flag was marked, so my condition was true in the last part'.
    ENDIF.

  • Error while reading data through External Table!!!

    CREATE TABLE "COGNOS"."EXT_COGNOS_TBS9_TEST"
    (     "ITEM_DESC" VARCHAR2(200 BYTE),
    "EXT_CODE" VARCHAR2(20 BYTE),
    "RC_DATE" DATE,
    "RES_KD_AMNT" NUMBER(18,3),
    "RES_FC_AMNT" NUMBER(18,3),
    "NRES_KD_AMNT" NUMBER(18,3),
    "NRES_FC_AMNT" NUMBER(18,3),
    "TOTAL" NUMBER(18,3),
    "OF_WHICH_OVR1" NUMBER(18,3)
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
    DEFAULT DIRECTORY "EXTDATADIR"
    ACCESS PARAMETERS
    ( RECORDS
    DELIMITED BY NEWLINE LOAD WHEN *({color:#ff0000}EXT_CODE LIKE 'TBS9%'{color})* FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL )
    LOCATION
    ( 'TBS9_TEST.CSV'
The external table creation went through successfully, but I am getting an error while reading the data. I am quite sure the error is because of the LOAD WHEN clause marked above. Could you please help me in transforming the logic?
    Thanks in Advance,
    AP

    Let's start with the basics...
    1) You state that you are getting an error. What error do you get? Is this an Oracle error (i.e. ORA-xxxxx)? If so, please include the error number and the error message as well as the triggering statement. Or is the problem that rows are getting written to the reject file and errors are being written to the log file? If so, what record(s) are being rejected and what are the reasons given in the log file? Or perhaps the problem is something else?
2) You state that you are quite sure that the problem relates to the highlighted code. What makes you quite sure of this?
    Justin

  • Error while uploading an edited excel file into an internal table

    Hi Experts,
I am getting an error while uploading an Excel file that has been edited.
I am using GUI_UPLOAD for uploading the file into an internal table.
In my program I first have to download a file; if I use the same file without editing it, I am able to read the file.
When I try to edit it and then upload it, the upload fails, but editing the file is part of my requirement.
Please suggest.
    Regards
    Kishore

TYPE-POOLS: truxs.

PARAMETERS: lv_full_path TYPE rlgrap-filename.

DATA: lt_conv_data   TYPE truxs_t_text_data,
      lt_roles_excel TYPE TABLE OF zyour_structure. "replace with your own structure

START-OF-SELECTION.

  CALL FUNCTION 'SAP_CONVERT_TO_XLS_FORMAT'
    EXPORTING
      i_line_header        = 'X'
      i_filename           = lv_full_path
    TABLES
      i_tab_sap_data       = lt_roles_excel
    CHANGING
      i_tab_converted_data = lt_conv_data
    EXCEPTIONS
      conversion_failed    = 1
      OTHERS               = 2.

In the FM, i_line_header = 'X' means it will neglect the first line, so you can keep the heading in the Excel file.
Hope this might help you.
    With Regards,
    Sumodh.P

  • Error while loading data from a file on application server

    Hi all,
I am facing an error while loading data from a flat file.
    Error 'The argument '##yyyymmdd;@##' cannot be interpreted as a number ' while assigning character.
    I changed the format of date fields (tried with number,general,date(International))in the xls. But i still get the same error.Did check all the data types in Data source all the fields are dats.
    Can you please tell me what could be the problem?
    Thank you all,
    Praveen

    Hi all,
As far as my first question goes, I got through it, but I have one more field in my flat file which is actually a timestamp. In my flat file I have data in this format:
10/21/2006  5:11:48 AM, which I need to change to 10/21/2006.
One more note: some of the rows have NULL in this field.
    Last Updated Date
    10/21/2006  5:11:48 AM
    10/21/2006  5:11:48 AM
    NULL
    NULL
    10/21/2006  5:11:48 AM
    NULL
I want to display the values as 10/21/2006 and keep NULL as it is.
Please let me know if we have a conversion routine in the DataSource which can solve my problem.
    Regards,
    Praveen
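One way to handle this is a small routine in the transfer rules that keeps only the part of the value before the first blank. This is only a rough sketch written for a BW 3.x-style transfer-rule routine (RESULT and TRAN_STRUCTURE are the routine's standard parameters there), and /BIC/ZLASTUPD is just a placeholder for your timestamp field:
DATA: lv_value(30) TYPE c,
      lv_date(10)  TYPE c,
      lv_rest(30)  TYPE c.

lv_value = TRAN_STRUCTURE-/BIC/ZLASTUPD.      "placeholder source field
IF lv_value = 'NULL' OR lv_value IS INITIAL.
  CLEAR RESULT.                               "leave NULL / empty values as they are
ELSE.
  SPLIT lv_value AT space INTO lv_date lv_rest.
  RESULT = lv_date.                           "'10/21/2006  5:11:48 AM' -> '10/21/2006'
ENDIF.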

  • ODI error while loading data from Flat File to oracle

    Hi Gurus,
I am getting the following error while loading a flat file into an Oracle table:
java.lang.NumberFormatException: 554020
         at java.math.BigInteger.parseInt(Unknown Source)
         at java.math.BigInteger.<init>(Unknown Source)
         at java.math.BigDecimal.<init>(Unknown Source)
         at com.sunopsis.sql.SnpsQuery.updateExecStatement(SnpsQuery.java)
         at com.sunopsis.sql.SnpsQuery.addBatch(SnpsQuery.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execCollOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlC.treatTaskTrt(SnpSessTaskSqlC.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
The connection between the source and the target is fine.
Kindly suggest what to do about this error.
    Thanks
    Shridhar

    Hi John,
The source is a CSV file. By default all columns are of string type. The integration is failing at step 3 (Load Data).
I am loading the data from the CSV file directly into the staging table (Oracle). It's a one-to-one mapping.
    Shridhar

  • Error while importing data from XML file to a Oracle database

    I am getting the following error while importing data
    *** atg.xml.XMLFileException: No XML files were found for "/Dynamusic/config/dynamusic/songs-data.xml". 
    The files must be located  under the name "/Dynamusic/config/dyna
    *** java.io.FileNotFoundException: D:\MyApp\ATG\ATG10.0.3 (Access is denied)
    java.io.FileNotFoundException: D:\MyApp\ATG\ATG10.0.3 (Access is denied)
            at java.io.FileInputStream.open(Native Method)
            at java.io.FileInputStream.<init>(FileInputStream.java:120)
            at atg.adapter.gsa.xml.TemplateParser.importFiles(TemplateParser.java:6675)
            at atg.adapter.gsa.xml.TemplateParser.runParser(TemplateParser.java:5795)
            at atg.adapter.gsa.xml.TemplateParser.main(TemplateParser.java:5241)
I have placed the FakeXADataSource and JTDataSource properties files as required. In fact, I have been able to import data on all the other machines; just this one machine is giving problems. I have also checked the access rights. Can someone help me please?

    Confirm that you have access to D:\MyApp\ATG\ATG10.0.3 and existence of Dynamusic/config/dynamusic/songs-data.xm

  • Change Value While Uploading data From Excel File

    Dear Expert,
Please guide me: is it possible? If yes, then how?
We have one BDC uploader. After uploading data from the Excel file, I want to check one field like this:
if it_f65data-newbs contains any alphabet,
then the user should be able to change the value in the table at the same time, while executing the program.
Please tell me how I can change a value in a running program.
Regards,
    Shelly Malik

    Hi,
What you can probably do is perform a consistency check on your data in the internal table and segregate all the rows that have inconsistent data (in your case, the field NEWBS containing alphabets) into another temporary internal table. Display the temporary internal table in an editable ALV grid, saying that these rows have not been inserted into the database because they have inconsistent data in the field NEWBS. Allow the user to edit and then save the data, which will consequently refresh the internal table contents.
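A rough sketch of that split step, assuming the data sits in the internal table it_f65data from the question (the work area and error table names below are placeholders):
DATA: wa_f65    LIKE LINE OF it_f65data,
      lt_errors LIKE TABLE OF wa_f65.   "rows whose NEWBS contains letters

LOOP AT it_f65data INTO wa_f65.
  IF wa_f65-newbs CA sy-abcde.          "NEWBS contains at least one (upper-case) letter
    APPEND wa_f65 TO lt_errors.
    DELETE it_f65data.                  "keep only consistent rows for the BDC run
  ENDIF.
ENDLOOP.
Then display lt_errors in an editable ALV grid (for example REUSE_ALV_GRID_DISPLAY with an edit-enabled field catalog), let the user correct the NEWBS values, and append the corrected rows back to it_f65data before posting.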
