Data file load

Hi,
If the data cells use "," as a separator, e.g. 34,6788,666.7, will the data load fine, or should we remove the separator first?
Some cells have "-" where the data is missing. Should I change it to #Missing or 0.00 in my data file?
Thanks!

Are you sure a "-" gets loaded as a zero in Essbase through a data load rule?
I know that a "-" will get sent to Essbase from Excel if the formatting in Excel turns 0's to -'s, but that's because there are real zeros behind the -'s.
I have to say I never tried to load a "-" through a data load rule as I've always specified that missing data be #Missing.
If the data file can't be changed at the source, you can use the data load rule file to replace the "-" with #Missing. I prefer to do as few manipulations within the rule file as possible, as it is a pain to maintain.
Regards,
Cameron Lackpour

Similar Messages

  • Best way to do an Excel data file load

    Hi
    I need to load an Excel file's data into an Oracle table (on a frequent basis). While loading it, I need to validate each column's contents (using PL/SQL code). Are there any packages/procs/APIs provided by Oracle to do this kind of activity? What would be the best way to do an Excel file load within Oracle? FYI, I have Visual Basic code that reads data from the Excel file and loads it into a temporary Oracle table, and then I validate the data in this temporary table using PL/SQL code in a stored procedure. I am trying to avoid the "front end" process of this effort in VB and want to do the whole thing within Oracle itself. Please let me know if you have any ideas.
    Your help is greatly appreciated!!
    Thanks in advance,
    Ram

    If you are running on Windows, you could try COM Automation which means moving your VB process into a stored procedure. I've never tried this myself, having been quite satisfied with Heterogeneous Connectivity.
    Tak
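    Another route, not mentioned above but entirely inside the database: save the sheet as CSV and read it through an external table, then validate in PL/SQL. A minimal sketch, where the directory path, file name, staging columns, and the emp_target table are all placeholders:

    CREATE OR REPLACE DIRECTORY data_dir AS '/u01/loads';

    CREATE TABLE stg_excel_ext (
      emp_id   NUMBER,
      emp_name VARCHAR2(100)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('excel_export.csv')
    );

    -- validate row by row in PL/SQL, then move into the real table (emp_target is hypothetical)
    BEGIN
      FOR r IN (SELECT * FROM stg_excel_ext) LOOP
        IF r.emp_id IS NULL THEN
          RAISE_APPLICATION_ERROR(-20001, 'emp_id missing for ' || r.emp_name);
        END IF;
        INSERT INTO emp_target (emp_id, emp_name) VALUES (r.emp_id, r.emp_name);
      END LOOP;
    END;
    /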

  • Data file load to Planning using FDMEE

    Hi All,
    Hyperion version: 11.1.2.3.0.26
    We have a currency planning application and the dimensions are Account, Business, Entity, Currency, Version, Scenario, Period and Year.
    My data file contains:
    Account;business;entity;version;data
    AC_1001;International;US_Region;working;10000
    AC_1002;International;US_Region;working;10000
    When I load data to this application using FDMEE I get three gold fish, so I thought the load was successful, but when I retrieved the data from Smart View I found the data was not loaded.
    POV: Jan 15, Actual
    In Smart View from Essbase, the retrieval comes back as a two-column grid (Local and USD) with every cell #Missing:
                     HSP_InputValue   HSP_InputValue
                     Jan              Jan
                     FY15             FY15
                     Actual           Actual
                     Working          Working
                     Local            USD
                     International    International
                     US_Region        US_Region
    AC_1001          #Missing         #Missing
    AC_1002          #Missing         #Missing
    Smart View from Planning: the ad hoc grid cannot be opened as there are no valid rows of data.
    Not sure why this is happening. Could you please help me with this? Thanks in advance!
    Regards,
    Keny Alex

    And this is the log:
    2015-01-29 02:33:35,503 INFO  [AIF]: FDMEE Process Start, Process ID: 621
    2015-01-29 02:33:35,503 INFO  [AIF]: FDMEE Logging Level: 4
    2015-01-29 02:33:35,504 INFO  [AIF]: FDMEE Log File: D:\demos\FDMEE\outbox\logs\RPDPLN_621.log
    2015-01-29 02:33:35,504 INFO  [AIF]: User:admin
    2015-01-29 02:33:35,505 INFO  [AIF]: Location:RPDLOC (Partitionkey:53)
    2015-01-29 02:33:35,505 INFO  [AIF]: Period Name:Jan 15 (Period Key:1/1/15 12:00 AM)
    2015-01-29 02:33:35,506 INFO  [AIF]: Category Name:Actual (Category key:1)
    2015-01-29 02:33:35,506 INFO  [AIF]: Rule Name:RPD (Rule ID:78)
    2015-01-29 02:33:37,162 INFO  [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
    [Oracle JRockit(R) (Oracle Corporation)]
    2015-01-29 02:33:37,162 INFO  [AIF]: Java Platform: java1.6.0_37
    2015-01-29 02:33:39,399 INFO  [AIF]: -------START IMPORT STEP-------
    2015-01-29 02:33:44,360 INFO  [AIF]: File Name: Datafile.txt
    2015-01-29 02:33:44,736 INFO  [AIF]: ERPI-105011:EPMERPI- Log File Name :D:\demos\FDMEE\outbox\logs\RPDPLN_621.log
    2015-01-29 02:33:44,738 INFO  [AIF]: ERPI-105011:EPMERPI- LOADID:PARTKEY:CATKEY:RULEID:CURRENCYKEY:FILEPATH::621;53:1:78:Local:D:\demos\FDMEE/
    2015-01-29 02:33:44,738 INFO  [AIF]: ERPI-105011:EPMERPI- ImportTextData - Start
    2015-01-29 02:33:44,920 INFO  [AIF]: ERPI-105011:EPMERPI- Log File Name :D:\demos\FDMEE\outbox\logs\RPDPLN_621.log
    2015-01-29 02:33:44,924 INFO  [AIF]: ERPI-105011:EPMERPI- File Name Datafile.txt
    periodKey2015-01-01
    2015-01-29 02:33:44,927 INFO  [AIF]: ERPI-105011:EPMERPI-  PROCESS ID: 621
    PARTITIONKEY: 53
    IMPORT GROUP: RPDVersion11
    FILE TYPE: DELIMITED
    DELIMITER: ;
    SOURCE FILE: Datafile.txt
    PROCESSING CODES:
    BLANK............. Line is blank or empty.
    NN................ Non-Numeric, Amount field contains non numeric characters.
    TC................ Type Conversion, Amount field could not be converted to a number.
    ZP................ Zero Suppress, Amount field contains a 0 value and zero suppress is ON.
    SKIP FIELD.............. SKIP field value was found
    NULL ACCOUNT VALUE.............. Account Field is null
    SKIP FROM SCRIPT.............. Skipped through Script
    Rows Loaded: 2
    Rows Rejected: 0
    2015-01-29 02:33:44,929 INFO  [AIF]: ERPI-105011:EPMERPI- ARCHIVE MODE: null
    2015-01-29 02:33:44,930 INFO  [AIF]: ERPI-105011:EPMERPI- Start archiving file:
    2015-01-29 02:33:44,930 INFO  [AIF]: ERPI-105011:EPMERPI- Archive file name: 62120150101.txt
    2015-01-29 02:33:44,931 INFO  [AIF]: ERPI-105011:EPMERPI- Deleting the source file: Datafile.txt
    2015-01-29 02:33:44,931 INFO  [AIF]: ERPI-105011:EPMERPI- File not deleted: D:\demos\FDMEE\Datafile.txt
    2015-01-29 02:33:44,938 INFO  [AIF]: ERPI-105011:EPMERPI- ImportTextData - End
    2015-01-29 02:33:44,938 INFO  [AIF]: ERPI-105011:EPMERPI- Total time taken for the import in ms = 200
    2015-01-29 02:33:45,069 INFO  [AIF]:
    Import Data from Source for Period 'Jan 15'
    2015-01-29 02:33:45,085 INFO  [AIF]: Generic Data Rows Imported from Source: 2
    2015-01-29 02:33:45,089 INFO  [AIF]: Total Data Rows Imported from Source: 2
    2015-01-29 02:33:45,783 INFO  [AIF]:
    Map Data for Period 'Jan 15'
    2015-01-29 02:33:45,794 INFO  [AIF]:
    Processing Mappings for Column 'ACCOUNT'
    2015-01-29 02:33:45,796 INFO  [AIF]: Data Rows Updated by Rule Mapping '121' (LIKE): 2
    2015-01-29 02:33:45,796 INFO  [AIF]:
    Processing Mappings for Column 'ENTITY'
    2015-01-29 02:33:45,797 INFO  [AIF]: Data Rows Updated by Rule Mapping '121' (LIKE): 2
    2015-01-29 02:33:45,797 INFO  [AIF]:
    Processing Mappings for Column 'UD1'
    2015-01-29 02:33:45,798 INFO  [AIF]: Data Rows Updated by Rule Mapping '121' (LIKE): 2
    2015-01-29 02:33:45,798 INFO  [AIF]:
    Processing Mappings for Column 'UD2'
    2015-01-29 02:33:45,799 INFO  [AIF]: Data Rows Updated by Rule Mapping '121' (LIKE): 2
    2015-01-29 02:33:45,836 INFO  [AIF]:
    Stage Data for Period 'Jan 15'
    2015-01-29 02:33:45,838 INFO  [AIF]: Number of Rows deleted from TDATAMAPSEG: 4
    2015-01-29 02:33:45,848 INFO  [AIF]: Number of Rows inserted into TDATAMAPSEG: 4
    2015-01-29 02:33:45,850 INFO  [AIF]: Number of Rows deleted from TDATAMAP_T: 4
    2015-01-29 02:33:45,851 INFO  [AIF]: Number of Rows deleted from TDATASEG: 2
    2015-01-29 02:33:45,859 INFO  [AIF]: Number of Rows inserted into TDATASEG: 2
    2015-01-29 02:33:45,860 INFO  [AIF]: Number of Rows deleted from TDATASEG_T: 2
    2015-01-29 02:33:45,919 INFO  [AIF]: -------END IMPORT STEP-------
    2015-01-29 02:33:45,946 INFO  [AIF]: -------START VALIDATE STEP-------
    2015-01-29 02:33:45,993 INFO  [AIF]:
    Validate Data Mappings for Period 'Jan 15'
    2015-01-29 02:33:46,001 INFO  [AIF]: Total Data Rows available for Export to Target: 2
    2015-01-29 02:33:46,001 INFO  [AIF]:
    Validate Data Members for Period 'Jan 15'
    2015-01-29 02:33:46,002 INFO  [AIF]: Total Data Rows available for Export to Target: 2
    2015-01-29 02:33:46,026 INFO  [AIF]: -------END VALIDATE STEP-------
    2015-01-29 02:33:46,089 INFO  [AIF]: -------START EXPORT STEP-------
    2015-01-29 02:33:49,084 INFO  [AIF]: [HPLService] Info: Cube Name: RPDFN
    2015-01-29 02:33:49,084 INFO  [AIF]: [HPLService] Info: Export Mode: STORE_DATA
    2015-01-29 02:33:49,084 INFO  [AIF]: [HPLService] Info: updateMultiCurrencyProperties - BEGIN
    2015-01-29 02:33:49,532 INFO  [AIF]: [HPLService] Info: Currency Properties Exist for Planning Application: RPDPLN
    2015-01-29 02:33:49,534 INFO  [AIF]: [HPLService] Info: Number of existing multi-currency property rows deleted: 7
    2015-01-29 02:33:49,537 INFO  [AIF]: [HPLService] Info: Base Currency for Application 'RPDPLN': USD
    2015-01-29 02:33:49,542 INFO  [AIF]: [HPLService] Info: Number of multi-currency property rows inserted: 7
    2015-01-29 02:33:49,542 INFO  [AIF]: [HPLService] Info: updateMultiCurrencyProperties - END
    2015-01-29 02:33:49,543 INFO  [AIF]: Updated Multi-Curency Information for application:RPDPLN
    2015-01-29 02:33:49,543 INFO  [AIF]: Connecting to essbase using service user:admin
    2015-01-29 02:33:49,572 INFO  [AIF]: Obtained connection to essbase provider:Embedded
    2015-01-29 02:33:49,576 INFO  [AIF]: Obtained connection to essbase cube RPDFN
    2015-01-29 02:33:49,593 INFO  [AIF]: Locking rules file AIF0078
    2015-01-29 02:33:49,595 INFO  [AIF]: Successfully locked rules file AIF0078
    2015-01-29 02:33:49,595 INFO  [AIF]: Copying rules file AIF0078 for data load as AIF0078
    2015-01-29 02:33:49,609 INFO  [AIF]: Unlocking rules file AIF0078
    2015-01-29 02:33:49,611 INFO  [AIF]: Successfully unlocked rules file AIF0078
    2015-01-29 02:33:49,611 INFO  [AIF]: The data rules file has been created successfully.
    2015-01-29 02:33:49,617 INFO  [AIF]: Locking rules file AIF0078
    2015-01-29 02:33:49,619 INFO  [AIF]: Successfully locked rules file AIF0078
    2015-01-29 02:33:49,625 INFO  [AIF]: Load data into the cube by launching rules file...
    2015-01-29 02:33:50,526 INFO  [AIF]: The data has been loaded by the rules file.
    2015-01-29 02:33:50,530 INFO  [AIF]: Unlocking rules file AIF0078
    2015-01-29 02:33:50,532 INFO  [AIF]: Successfully unlocked rules file AIF0078
    2015-01-29 02:33:50,532 INFO  [AIF]: Executed rule file
    2015-01-29 02:33:50,572 INFO  [AIF]: [HPLService] Info: Creating Drill Through Region for Process Id: 621
    2015-01-29 02:33:51,075 INFO  [AIF]: [HPLService] Info: Drill Through Region created for Process Id: 621
    2015-01-29 02:33:51,076 INFO  [AIF]: [HPLService] Info: [loadData:621] END (true)
    2015-01-29 02:33:51,117 INFO  [AIF]: -------END EXPORT STEP-------
    2015-01-29 02:33:51,214 INFO  [AIF]: [HPLService] Info: [consolidateData:621,Jan 15] END (true)
    2015-01-29 02:33:51,264 INFO  [AIF]: -------START CHECK STEP-------
    2015-01-29 02:33:51,316 INFO  [AIF]: -------END CHECK STEP-------
    2015-01-29 02:33:51,413 INFO  [AIF]: FDMEE Process End, Process ID: 621

  • How to load Unicode data files with fixed record lengths?

    Hi!
    To load Unicode data files with fixed record lengths (in terms of characters, not bytes!) using SQL*Loader manually, I found two ways:
    Alternative 1: one record per row
    SQL*Loader control file example (without POSITION, since POSITION always refers to bytes!):
    LOAD DATA
    CHARACTERSET UTF8
    LENGTH SEMANTICS CHAR
    INFILE unicode.dat
    INTO TABLE STG_UNICODE
    TRUNCATE
    ( A CHAR(2) ,
      B CHAR(6) ,
      C CHAR(2) ,
      D CHAR(1) ,
      E CHAR(4)
    )
    Datafile:
    001111112234444
    01NormalDExZWEI
    02ÄÜÖßêÊûÛxöööö
    03ÄÜÖßêÊûÛxöööö
    04üüüüüüÖÄxµôÔµ
    Alternative 2: variable length records
    LOAD DATA
    CHARACTERSET UTF8
    LENGTH SEMANTICS CHAR
    INFILE unicode_var.dat "VAR 4"
    INTO TABLE STG_UNICODE
    TRUNCATE
    ( A CHAR(2) ,
      B CHAR(6) ,
      C CHAR(2) ,
      D CHAR(1) ,
      E CHAR(4)
    )
    Datafile:
    001501NormalDExZWEI002702ÄÜÖßêÊûÛxöööö002604üuüüüüÖÄxµôÔµ
    Problems
    Implementing these two alternatives in OWB, I encounter the following problems:
    * How to specify LENGTH SEMANTICS CHAR?
    * How to suppress the POSITION definition?
    * How to define a flat file with variable length and how to specify the number of bytes containing the length definition?
    Or is there another way that can be implemented using OWB?
    Any help is appreciated!
    Thanks,
    Carsten.

    Hi Carsten
    If you need to support the LENGTH SEMANTICS CHAR clause in an external table then one option is to use the unbound external table and capture the access parameters manually. To create an unbound external table you can skip the selection of a base file in the external table wizard. Then when the external table is edited you will get an Access Parameters tab where you can define the parameters. In 11gR2 the File to Oracle external table can also add this clause via an option.
    Cheers
    David
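    For what David describes, here is a rough, untested sketch of an unbound external table's access parameters, mirroring the STG_UNICODE layout above (the directory object and the exact clause placement are my assumptions; note that in ORACLE_LOADER access parameters the character-length semantics are spelled STRING SIZES ARE IN CHARACTERS):

    CREATE TABLE stg_unicode_ext (
      a CHAR(2 CHAR),
      b CHAR(6 CHAR),
      c CHAR(2 CHAR),
      d CHAR(1 CHAR),
      e CHAR(4 CHAR)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir              -- assumed directory object
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        CHARACTERSET UTF8
        STRING SIZES ARE IN CHARACTERS        -- field lengths counted in characters, not bytes
        FIELDS (
          a CHAR(2),                          -- each field starts where the previous one ended
          b CHAR(6),
          c CHAR(2),
          d CHAR(1),
          e CHAR(4)
        )
      )
      LOCATION ('unicode.dat')
    );

    For the variable-length alternative, RECORDS VARIABLE 4 would replace the RECORDS DELIMITED BY NEWLINE line.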

  • How do I skip footer records in a data file through the SQL*Loader control file

    hi,
    I am using SQL*Loader to load data from a data file, and I have written a control file for it. How do I skip the last 5 records of the data file (the footer records)?
    For the first 5 records we can use "SKIP", but how do I achieve it for the last 5 records?
    2) Can I mention two data files in one control file? If so, what is the syntax? (Like we give INFILE with the path of the data file: can I mention two data files in the same control file?)
    3) If I have a data file with variable-length records (i.e. the 1st record with 200 characters, the 2nd with 150 characters and the 3rd with 180 characters), how do I load the data into the table? I mean, what is the syntax for it in the control file?
    4) If I want to insert SYSDATE into the table through the control file, how do I do it?
    5) If I have variable-length records in the data file, with a first name, then whitespace, then a last name, how do I insert this value (first name and last name) into a single column of the table? (I mean, how do you handle the whitespace between first name and last name in the data file?)
    Thanks in advance
    ram

    You should read the documentation about SQL*Loader.
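    For what it's worth, a hedged control-file sketch covering points 2, 4 and 5 (the table and field names are made up). As far as I know SQL*Loader has no option to skip trailing records, so point 1 usually means preprocessing the file, or discarding footers with a WHEN clause if they carry a recognizable marker; point 3 is handled naturally by delimited fields:

    LOAD DATA
    INFILE 'file1.dat'
    INFILE 'file2.dat'                                 -- (2) two data files, one control file
    APPEND INTO TABLE emp_stage
    FIELDS TERMINATED BY WHITESPACE
    ( fname     BOUNDFILLER,                           -- read but not loaded directly
      lname     BOUNDFILLER,
      full_name EXPRESSION ":fname || ' ' || :lname",  -- (5) both names into one column
      load_date SYSDATE                                -- (4) system date at load time
    )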

  • Flat file loading: Initialize without Data Transfer is disabled in BI 7.0

    Hi experts,
    When loading through a flat file in BI 7.0, at the InfoPackage level "Initialization Delta Process with Data Transfer" comes by default, but "Initialization Delta Process without Data Transfer", which I want to select, is disabled. (In the creation of the DataSource (flat file), on the Extraction tab, the Delta Process is changed to FIL1 Delta Data (Delta Images).)
    Please provide me a solution.
    Regards,
    Subba Reddy.

    Hi Shubha,
    For the flat file load, please go through the following link:
    http://help.sap.com/saphelp_nw70/helpdata/EN/43/03450525ee517be10000000a1553f6/frameset.htm
    This will help.
    Regards,
    Mahesh

  • Date format error while loading a *.dat file to staging

    Hi All,
    Loading data from a *.dat file into staging: the dat file contains date values in 'MM-DD-YYYY' format as strings, and when it is loaded by the control file it produces the error that the date format is not supported. I think it requires DD-MM-YYYY format. Is there any property to change the date format, or do we need to write a function for converting the date format?
    I am using OWB 10g Release 1, with the same DB.
    Please guide me.
    Thanks in advance.
    Regards,

    Hi,
    Before loading into staging, you can use the Expression operator to convert the CHAR into a DATE with something like this:
    TO_DATE('01-02-2005','MM-DD-YYYY')
    Hope this helps.
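    If you would rather do the conversion at load time, SQL*Loader also accepts a date mask per field in the control file. A minimal sketch (the file, table and column names are made up):

    LOAD DATA
    INFILE 'staging.dat'
    APPEND INTO TABLE stg_orders
    FIELDS TERMINATED BY ','
    ( order_id,
      order_date DATE "MM-DD-YYYY"   -- parses strings like 10-25-2006 at load time
    )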

  • Deleting master data after loading transactional data using flat file

    Dear All,
    I have loaded transaction data into an InfoCube using a flat file. While loading via the DTP I checked the option to load transactional data without master data existing, so transactional data is loaded even if no master data is there in BW.
    While loading the flat file, I made a mistake with the DIVISION characteristic: the original master data value is '04000', but I loaded the transactional data with value '4000'. I realized it later after seeing the data in the InfoCube, and deleted the request. Then I reloaded the data with value '04000'. Till now everything is fine.
    But when I look at the master data for DIVISION, I can see a new entry with value '4000'.
    My question is how to delete the entry '4000' from DIVISION. I tried deleting this entry manually from 'maintain master data', but it is not allowing me to do so.
    I have also checked whether any transactional data exists for the value '4000'; as I said earlier, I have deleted the transactional data with that value. I even tried to delete the entries from the master data table, but I do not see an option to delete entries there.
    Please suggest me on this.
    Regards,
    Veera

    Hi,
    Go to RSA1, right-click on the InfoObject and select Delete Master Data. This will delete the unused master data existing in the table.
    If this master data is not used anywhere else, just delete the master data completely with the SID option.
    If even this doesn't work, you can delete the complete table contents in SE14. But this will wipe out the entire table, so be sure you want to do this.
    Hope this helps
    Akhan.

  • Problem on loading DAT file when using 3G network modem

    Hello,
    I'm having a strange problem when I try to load my game at the point where the DAT file with the Map Object is read, using a 3G network modem. This issue started when I migrated my applet from one host to another. Everything loads well until the user authentication. On user authentication I receive the name of the map which I should load and show to the user. I receive the message where the map name is mentioned, but afterwards, when the process of loading the map begins (when I need to read the DAT file, which is only 15KB), the application blocks (the browser and JVM also block).
    I thought it could be because of slow network communication, but that's not the reason, since I have tested (using bandwidth limiter software) with 5 KB/second and it loads well and the application runs fine.
    Any clue why this can happen? If code is needed I'll post some pieces related to the process of map creation.
    URL of the application: http://mimosa.dei.uc.pt/serhiy/demo/hoonline.html
    Accounts: test01/test01 ... test0n/test0n ... test05/test05 (n is a number from 1 to 5)
    Thanks in advance!

    You have to upload it with FileReference.upload() to a PHP (or other server-side) script which saves it to a folder on the server. When the DataEvent.UPLOAD_COMPLETE_DATA event has been dispatched you can then use the FileReference.name to load from the file on the server just like any other image.

  • Create SQL*Loader data file dynamically

    Hi,
    I want a sample program/approach to create a SQL*Loader data file.
    The program will read a table name as input and will use a SELECT statement with the column list derived from USER_TAB_COLUMNS in the data dictionary, assuming multiple CLOB columns in the column list.
    Thanks
    Manoj

    I'm writing CLOB and other columns to a SQL*Loader dat file.
    The sample code below for writing the CLOB column is giving a file write error.
    How can I write multiple CLOBs to the dat file so that the control file will handle it correctly?
    DECLARE
      l_file_handle UTL_FILE.FILE_TYPE;
      offset        NUMBER := 1;
      chunk         VARCHAR2(32000);
      chunk_size    NUMBER := 32000;
    BEGIN
      -- open with the maximum line size; the default (1024 bytes) raises a
      -- write error as soon as a chunk larger than that is buffered
      l_file_handle := UTL_FILE.FOPEN('DATA_DIR', 'loader.dat', 'w', 32767);
      WHILE offset < DBMS_LOB.GETLENGTH(l_rec_type.narrative)  -- l_rec_type comes from the surrounding code
      LOOP
        chunk := DBMS_LOB.SUBSTR(l_rec_type.narrative, chunk_size, offset);
        UTL_FILE.PUT(l_file_handle, chunk);
        UTL_FILE.FFLUSH(l_file_handle);  -- flush between PUTs so the buffer stays under the line-size limit
        offset := offset + chunk_size;
      END LOOP;
      UTL_FILE.NEW_LINE(l_file_handle);
      UTL_FILE.FCLOSE(l_file_handle);
    END;
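    For the first half of the question (deriving the select list from USER_TAB_COLUMNS), a minimal sketch on 11gR2 or later, assuming a pipe-delimited output file and a placeholder table name:

    -- builds: SELECT COL1 || '|' || COL2 || '|' || ... FROM my_table
    SELECT 'SELECT ' ||
           LISTAGG(column_name, q'[ || '|' || ]') WITHIN GROUP (ORDER BY column_id) ||
           ' FROM my_table' AS gen_sql
    FROM   user_tab_columns
    WHERE  table_name = 'MY_TABLE';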

  • SQL Loader and foreign characters in the data file problem

    Hello,
    I have run into an issue which I can't find an answer for. When I run SQL Loader, one of my control files is used to get file content (LOBFILE) and one of the fields in the data file has a path to that file. The control file looks like:
    LOAD DATA
    INFILE 'PLACE_HOLDER.dat'
    INTO TABLE iceberg.rpt_document_core APPEND
    FIELDS TERMINATED BY ','
    ( doc_core_id "iceberg.seq_rpt_document_core.nextval",
      -- created_date POSITION(1) date "yyyy-mm-dd:hh24:mi:ss",
      created_date date "yyyy-mm-dd:hh24:mi:ss",
      document_size,
      hash,
      body_format,
      is_generic_doc,
      is_legacy_doc,
      external_filename FILLER char(275) ENCLOSED by '"',
      body LOBFILE(external_filename) terminated by EOF
    )
    A sample data file looks like:
    0,2012-10-22:10:09:35,21,BB51344DD2127002118E286A197ECD4A,text,N,N,"E:\tmp\misc_files\index_testers\foreign\شیمیایی.txt"
    0,2012-10-22:10:09:35,17,CF85BE76B1E20704180534E19D363CF8,text,N,N,"E:\tmp\misc_files\index_testers\foreign\ลอบวางระเบิด.txt"
    0,2012-10-22:10:09:35,23552,47DB382558D69F170227AA18179FD0F0,binary,N,N,"E:\tmp\misc_files\index_testers\foreign\leesburgis_á_ñ_é_í_ó_ú_¿_¡_ü_99.doc"
    0,2012-10-22:10:09:35,17,83FCA0377445B60CE422DE8994900A79,binary,N,N,"E:\tmp\misc_files\index_testers\foreign\làm thế nào bạn làm ngày hôm nay"
    The problem is that when I run this, SQL*Loader throws an error that it can't find the file. It appears that it can't interpret the foreign characters in a way that allows it to find that path. I have tried adding a CHARACTERSET (using AL32UTF8 or UTF8) value in the control file, but that only has some success with Western languages, not the ones listed above. Also, there is no fixed set of languages that could be found in the data file; it essentially could be any language.
    Does anyone know if there is a way to somehow get SQL*Loader to "understand" the file system paths when a folder and/or file name could be in some other language?
    Thanks for any thoughts - Peter

    Thanks for the reply Harry. If I try to open the file in various text editors like Wordpad, Notepad, GVIM, and Textpad, they all display the foreign characters differently. Only Notepad comes close to displaying the characters properly. I have a C# app that will read the file and display the contents, and it renders it fine. If you look at the directory of files in Windows Explorer, they are all displayed properly. So it seems things like .Net and Windows have some mechanism to understand the characters in order to render them properly. Other applications, again like Wordpad, do not know how to render them properly. It would seem that whatever SQL*Loader is using to "read" the data files also is not rendering the characters properly, which prevents it from finding the directory path to the file. If I add "CHARACTERSET AL32UTF8" in the control file, all is fine when dealing with Western languages (e.g. German, Spanish) but not for the Eastern languages (e.g. Thai, Chinese). So... telling SQL*Loader to use a characterset seems to work, but not in all cases. AL32UTF8 is the characterset that the Oracle database was created with. I have not had any luck trying to set the CHARACTERSET to whatever the Thai character set is, for example. The problem there, though, is that even if that did work, I can't target specific languages because the data could come from anywhere. It's like I need some sort of global "superset" characterset to use. It seems like CHARACTERSET is the right track to follow, but I am not sure, and even if it is, is there a way to handle all languages?
    Thanks - Peter

  • Format for Excel file data to load into an InfoCube / planning area

    Hi gurus,
    I need the format for the Excel file data that is loaded into an InfoCube and on to a planning area.
    Can you send me what I should maintain in the header?
    What I have so far is:
    plant,location,customer,product,history qty,calander
    100,delhi,suresh,nokia,250,2011211
    Can you explain whether this is right or wrong, and send me the Excel file format?
    babu

    Hi Babu,
    The file format should be the same as what you want to upload, and the sequence of the columns should match the communication structure.
    Like:
    initial columns with characteristics (ex: plant, location, customer, product),
    then the date column (check the date format) (ex: calander),
    and the last columns with key figures (ex: history qty).
    Hope this helps.
    Regards,
    Nawanit

  • On load, getting error:  Field in data file exceeds maximum length

    Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    PL/SQL Release 11.2.0.3.0 - Production
    CORE    11.2.0.3.0    Production
    TNS for Solaris: Version 11.2.0.3.0 - Production
    NLSRTL Version 11.2.0.3.0 - Production
    I'm trying to load a table, small in size (110 rows, 6 columns). One of the columns, called NOTES, errors when I run the load. It is saying that the column size exceeds the max limit. As you can see here, the table column is set to 4000 bytes:
    CREATE TABLE NRIS.NRN_REPORT_NOTES
    ( NOTES_CN      VARCHAR2(40 BYTE)               DEFAULT sys_guid()            NOT NULL,
      REPORT_GROUP  VARCHAR2(100 BYTE)              NOT NULL,
      AREACODE      VARCHAR2(50 BYTE)               NOT NULL,
      ROUND         NUMBER(3)                       NOT NULL,
      NOTES         VARCHAR2(4000 BYTE),
      LAST_UPDATE   TIMESTAMP(6) WITH TIME ZONE     DEFAULT systimestamp          NOT NULL
    )
    TABLESPACE USERS
    RESULT_CACHE (MODE DEFAULT)
    PCTUSED    0
    PCTFREE    10
    INITRANS   1
    MAXTRANS   255
    STORAGE    (
                INITIAL          80K
                NEXT             1M
                MINEXTENTS       1
                MAXEXTENTS       UNLIMITED
                PCTINCREASE      0
                BUFFER_POOL      DEFAULT
                FLASH_CACHE      DEFAULT
                CELL_FLASH_CACHE DEFAULT
               )
    LOGGING
    NOCOMPRESS
    NOCACHE
    NOPARALLEL
    MONITORING;
    I did a little investigating, and it doesn't add up.
    When I run
    select max(lengthb(notes)) from NRIS.NRN_REPORT_NOTES
    I get a return of
    643
    That tells me that the largest size instance of that column is only 643 bytes.  But EVERY insert is failing.
    Here is the loader file header, and first couple of inserts:
    LOAD DATA
    INFILE *
    BADFILE './NRIS.NRN_REPORT_NOTES.BAD'
    DISCARDFILE './NRIS.NRN_REPORT_NOTES.DSC'
    APPEND INTO TABLE NRIS.NRN_REPORT_NOTES
    Fields terminated by ";" Optionally enclosed by '|'
    ( NOTES_CN,
      REPORT_GROUP,
      AREACODE,
      ROUND NULLIF (ROUND="NULL"),
      NOTES,
      LAST_UPDATE TIMESTAMP WITH TIME ZONE "MM/DD/YYYY HH24:MI:SS.FF9 TZR" NULLIF (LAST_UPDATE="NULL")
    )
    BEGINDATA
    |E2ACF256F01F46A7E0440003BA0F14C2|;|DEMOGRAPHICS|;|A01003|;3;|Demographic results show that 46 percent of visits are made by females.  Among racial and ethnic minorities, the most commonly encountered are Native American (4%) and Hispanic / Latino (2%).  The age distribution shows that the Bitterroot has a relatively small proportion of children under age 16 (14%) in the visiting population.  People over the age of 60 account for about 22% of visits.   Most of the visitation is from the local area.  More than 85% of visits come from people who live within 50 miles.|;07/29/2013 16:09:27.000000000 -06:00
    |E2ACF256F02046A7E0440003BA0F14C2|;|VISIT DESCRIPTION|;|A01003|;3;|Most visits to the Bitterroot are fairly short.  Over half of the visits last less than 3 hours.  The median length of visit to overnight sites is about 43 hours, or about 2 days.  The average Wilderness visit lasts only about 6 hours, although more than half of those visits are shorter than 3 hours long.   Most visits come from people who are fairly frequent visitors.  Over thirty percent are made by people who visit between 40 and 100 times per year.  Another 8 percent of visits are from people who report visiting more than 100 times per year.|;07/29/2013 16:09:27.000000000 -06:00
    |E2ACF256F02146A7E0440003BA0F14C2|;|ACTIVITIES|;|A01003|;3;|The most frequently reported primary activity is hiking/walking (42%), followed by downhill skiing (12%), and hunting (8%).  Over half of the visits report participating in relaxing and viewing scenery.|;07/29/2013 16:09:27.000000000 -06:00
    Here is the full beginning of the loader log, ending after the first row return.  (They ALL say the same error)
    SQL*Loader: Release 10.2.0.4.0 - Production on Thu Aug 22 12:09:07 2013
    Copyright (c) 1982, 2007, Oracle.  All rights reserved.
    Control File:   NRIS.NRN_REPORT_NOTES.ctl
    Data File:      NRIS.NRN_REPORT_NOTES.ctl
      Bad File:     ./NRIS.NRN_REPORT_NOTES.BAD
      Discard File: ./NRIS.NRN_REPORT_NOTES.DSC
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array:     64 rows, maximum of 256000 bytes
    Continuation:    none specified
    Path used:      Conventional
    Table NRIS.NRN_REPORT_NOTES, loaded from every logical record.
    Insert option in effect for this table: APPEND
       Column Name                  Position   Len  Term Encl Datatype
    NOTES_CN                            FIRST     *   ;  O(|) CHARACTER
    REPORT_GROUP                         NEXT     *   ;  O(|) CHARACTER
    AREACODE                             NEXT     *   ;  O(|) CHARACTER
    ROUND                                NEXT     *   ;  O(|) CHARACTER
        NULL if ROUND = 0X4e554c4c(character 'NULL')
    NOTES                                NEXT     *   ;  O(|) CHARACTER
    LAST_UPDATE                          NEXT     *   ;  O(|) DATETIME MM/DD/YYYY HH24:MI:SS.FF9 TZR
        NULL if LAST_UPDATE = 0X4e554c4c(character 'NULL')
    Record 1: Rejected - Error on table NRIS.NRN_REPORT_NOTES, column NOTES.
    Field in data file exceeds maximum length...
    I am not seeing why this would be failing.

    Hi,
    the problem is that delimited data defaults to CHAR(255)... very helpful, I know...
    What you need to do is tell sqlldr that the data is longer than this:
    change NOTES to NOTES CHAR(4000) in your control file and it should work.
    cheers,
    harry
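    If I've read Harry's fix right, the field list in the control file above would become (only the NOTES line changes):

    ( NOTES_CN,
      REPORT_GROUP,
      AREACODE,
      ROUND NULLIF (ROUND="NULL"),
      NOTES CHAR(4000),
      LAST_UPDATE TIMESTAMP WITH TIME ZONE "MM/DD/YYYY HH24:MI:SS.FF9 TZR" NULLIF (LAST_UPDATE="NULL")
    )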

  • SQL Loader - Field in data file exceeds maximum length

    Dear All,
    I have a file which has more than 4000 characters in a field, and I wish to load the data into a table with field length = 4000, but I receive the error:
    Field in data file exceeds maximum length
    Below are the scripts and the ctl file.
    Table creation script:
    CREATE TABLE "TEST_TAB"
        "STR"  VARCHAR2(4000 BYTE),
        "STR2" VARCHAR2(4000 BYTE),
        "STR3" VARCHAR2(4000 BYTE)
      );Control file:
    LOAD DATA
    INFILE 'C:\table_export.txt'
    APPEND INTO TABLE TEST_TAB
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS
    ( STR CHAR(4000) "SUBSTR(:STR,1,4000)" ,
    STR2 CHAR(4000) "SUBSTR(:STR2,1,4000)" ,
    STR3 CHAR(4000) "SUBSTR(:STR3,1,4000)"
    )
    Log:
    SQL*Loader: Release 10.2.0.1.0 - Production on Mon Jul 26 16:06:25 2010
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Control File:   C:\TEST_TAB.CTL
    Data File:      C:\table_export.txt
      Bad File:     C:\TEST_TAB.BAD
      Discard File:  none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 0
    Bind array:     64 rows, maximum of 256000 bytes
    Continuation:    none specified
    Path used:      Conventional
    Table TEST_TAB, loaded from every logical record.
    Insert option in effect for this table: APPEND
    TRAILING NULLCOLS option in effect
       Column Name                  Position   Len  Term Encl Datatype
    STR                                 FIRST  4000   |       CHARACTER           
        SQL string for column : "SUBSTR(:STR,1,4000)"
    STR2                                 NEXT  4000   |       CHARACTER           
        SQL string for column : "SUBSTR(:STR2,1,4000)"
    STR3                                 NEXT  4000   |       CHARACTER           
        SQL string for column : "SUBSTR(:STR3,1,4000)"
    value used for ROWS parameter changed from 64 to 21
    Record 1: Rejected - Error on table TEST_TAB, column STR.
    Field in data file exceeds maximum length
    MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
    Table TEST_TAB:
      0 Rows successfully loaded.
      1 Row not loaded due to data errors.
      0 Rows not loaded because all WHEN clauses were failed.
      0 Rows not loaded because all fields were null.
    Space allocated for bind array:                 252126 bytes(21 rows)
    Read   buffer bytes: 1048576
    Total logical records skipped:          0
    Total logical records read:             1
    Total logical records rejected:         1
    Total logical records discarded:        0
    Run began on Mon Jul 26 16:06:25 2010
    Run ended on Mon Jul 26 16:06:25 2010
    Elapsed time was:     00:00:00.22
    CPU time was:         00:00:00.15
    Please suggest a way to get it done.
    Thanks for reading the post!
    *009*

    Hi Toni,
    Thanks for the reply.
    Do you mean this?
    CREATE TABLE "TEST"."TEST_TAB"
        "STR"  VARCHAR2(4001),
        "STR2" VARCHAR2(4001),
        "STR3" VARCHAR2(4001)
      );However this does not work as the error would be:
    Error at Command Line:8 Column:20
    Error report:
    SQL Error: ORA-00910: specified length too long for its datatype
    00910. 00000 -  "specified length too long for its datatype"
    *Cause:    for datatypes CHAR and RAW, the length specified was > 2000;
               otherwise, the length specified was > 4000.
    *Action:   use a shorter length or switch to a datatype permitting a
               longer length such as a VARCHAR2, LONG CHAR, or LONG RAW
    *009*
    Edited by: 009 on Jul 28, 2010 6:15 AM
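    Not from this thread, but the usual fix for this error: the control-file field must be declared larger than the longest value actually present in the file (not merely as large as the column), and the SUBSTR then trims it to fit. A hedged sketch, where 10000 is an assumed upper bound for the incoming data (if the data is multibyte, SUBSTRB may be needed since the columns are declared in bytes):

    LOAD DATA
    INFILE 'C:\table_export.txt'
    APPEND INTO TABLE TEST_TAB
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS
    ( STR  CHAR(10000) "SUBSTR(:STR,1,4000)",
      STR2 CHAR(10000) "SUBSTR(:STR2,1,4000)",
      STR3 CHAR(10000) "SUBSTR(:STR3,1,4000)"
    )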

  • Loader - Field in data file exceeds maximum length

    Hi,
    I am getting an error while loading the data, even though the data size of this column is less than 4000 and I defined the column as OBJ_ADDN_INFO CLOB.
    Please help
    ==================
    Record 1: Rejected - Error on table APPS.CG_COMPARATIVE_MATRIX_TAB, column OBJ_ADDN_INFO.
    Field in data file exceeds maximum length
    LOAD DATA
    infile *
    REPLACE
    INTO TABLE APPS.CG_COMPARATIVE_MATRIX_TAB
    FIELDS TERMINATED BY ","
    OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
    ( APPS_VERSION,
    MODULE_SHORT_NAME,
    CATEGORY,
    MODULE,
    OBJECT_NAME,
    OBJECT_TYPE,
    OBJECT_STATUS,
    FUNCTION_NAME,
    OBJ_ADDN_INFO
    )
    begindata
    "12",DBI,Oracle Daily Business Intelligence,DBI for Depot Repair,ISC_DEPOT_RO_INIT,PROGRAM,Changed,"Initial Load - Update Depot Repair Order Base Summary","The ISC_DR_REPAIR_ORDERS_F fact has a new column FLOW_SATUS_ID. The FLOW_STATUS_ID contains a user-defined Status for a Repair Order. The STATUS Column will continue to store the Status, now called State of the Repair Order i.e. O , C , D , H . The Initial Load incorporates the additional column FLOW_STATUS_ID. The Incremental Load s merge statement is modified to collect or update the additional column FLOW_STATUS_ID also. ","DBI for Depot Repair"
    "12",DBI,Oracle Daily Business Intelligence,DBI for Depot Repair,ISC_DEPOT_RO_INCR,PROGRAM,Changed,"Update Depot Repair Orders Base Summary","The ISC_DR_REPAIR_ORDERS_F fact has a new column FLOW_SATUS_ID. The FLOW_STATUS_ID contains a user-defined Status for a Repair Order. The STATUS Column will continue to store the Status, now called State of the Repair Order i.e. O , C , D , H . The Initial Load incorporates the additional column FLOW_STATUS_ID. The Incremental Load s merge statement is modified to collect or update the additional column FLOW_STATUS_ID also. ","DBI for Depot Repair"

    If you don't specify a data type for a data field in the SQL Loader control file, SQL Loader assumes the data type is CHAR(255). If you have data that is larger than that, then you can't rely on the default. Try changing the control file to
    LOAD DATA
    infile *
    REPLACE
    INTO TABLE APPS.CG_COMPARATIVE_MATRIX_TAB
    FIELDS TERMINATED BY ","
    OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
    ( APPS_VERSION,
    MODULE_SHORT_NAME,
    CATEGORY,
    MODULE,
    OBJECT_NAME,
    OBJECT_TYPE,
    OBJECT_STATUS,
    FUNCTION_NAME,
    OBJ_ADDN_INFO char(4000)
    )
