Governor limit exceeded in cube generation (Maximum data records exceeded.)

There are similar posts, but they didn't help in my situation.
I get the error: Governor limit exceeded in cube generation (Maximum data records exceeded.). The query returns about 64000 rows.
I've changed the instance config with greatly exaggerated values, and then also the registry, but I still get the error: Governor limit exceeded in cube generation (Maximum data records exceeded.)
instanceconfig.xml:
<?xml version="1.0"?>
<WebConfig>
  <ServerInstance>
    <CredentialStore>
      <CredentialStorage type="file" path="C:\OracleBIData\web\config\credentialstore.xml"/>
    </CredentialStore>
    <CubeMaxRecords>5000000</CubeMaxRecords>
    <CubeMaxPopulatedCells>10000000</CubeMaxPopulatedCells>
    <ResultRowLimit>5000000</ResultRowLimit>
    <PivotView>
      <MaxVisibleRows>5000000</MaxVisibleRows>
      <MaxVisibleColumns>1024</MaxVisibleColumns>
      <MaxVisiblePages>1024</MaxVisiblePages>
      <MaxVisibleSections>1024</MaxVisibleSections>
    </PivotView>
  </ServerInstance>
</WebConfig>
I also added
<CubeMaxRecords>5000000</CubeMaxRecords>
<CubeMaxPopulatedCells>10000000</CubeMaxPopulatedCells>
to the registry.
But I still get the error :(
Thanks for your help

I suggest disabling the cache. Setting the max rows to a very high number and disabling the cache is the way to go when you are querying an Oracle database :)
(for those who haven't already done this)
In NQSConfig.INI
# Query Result Cache Section
[ CACHE ]
ENABLE     =     NO;
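For context, ENABLE sits alongside the other cache parameters in the same [ CACHE ] stanza of NQSConfig.INI. A minimal sketch of the neighbouring settings, assuming a typical 10g install (the path and size values below are illustrative, not recommendations):
DATA_STORAGE_PATHS = "C:\OracleBIData\cache" 500 MB;
MAX_ROWS_PER_CACHE_ENTRY = 100000;
MAX_CACHE_ENTRY_SIZE = 1 MB;
MAX_CACHE_ENTRIES = 1000;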

Similar Messages

  • Error: Governor limit exceeded in cube generation (Maximum data records exceeded.)

    Hi
    I have created a report that throws this error.
    Governor limit exceeded in cube generation (Maximum data records exceeded.)
    Error Details
    Error Codes: QBVC92JY
    After going through various blogs I found that I need to change the max value of the pivot table view in the instanceconfig.xml file. I set it to 500000.
    But it still throws the same error.
    Ashok

    Hi Ashok,
    There are a number of settings that work in parallel. Have a look here:
    http://obiee101.blogspot.com/2008/02/obiee-controling-pivot-view-behavior.html
    regards
    John
    http://obiee101.blogspot.com

  • Error: QBVC92JY. Maximum data records exceeded

    Hi all,
    I've got a big problem with this error: Governor limit exceeded in cube generation (Maximum data records exceeded.).
    I've changed the configuration in "instanceconfig.xml" as follows:
    <PivotView>
    <MaxCells>1000000</MaxCells>
    <CubeMaxRecords>1000000</CubeMaxRecords>
    <CubeMaxPopulatedCells>1000000</CubeMaxPopulatedCells>
    </PivotView>
    but I still have the same problem.
    Can someone help me?
    Steve

    stesappo:
    Put <CubeMaxRecords> and <CubeMaxPopulatedCells> outside of <PivotView>.
    Try this in instanceconfig.xml:
    <PivotView>
      <MaxCells>4000000</MaxCells>
      <MaxVisibleColumns>5000</MaxVisibleColumns>
      <MaxVisiblePages>2500</MaxVisiblePages>
      <MaxVisibleRows>50000</MaxVisibleRows>
      <MaxVisibleSections>3000</MaxVisibleSections>
      <ResultRowLimit>20000</ResultRowLimit>
    </PivotView>
    <CubeMaxRecords>1000000</CubeMaxRecords>
    <CubeMaxPopulatedCells>1000000</CubeMaxPopulatedCells>
    Gabriel.
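    To make the placement concrete, here is a minimal instanceconfig.xml skeleton (a sketch only; the values are taken from the snippet above) showing both governors as direct children of <ServerInstance>:
    <?xml version="1.0"?>
    <WebConfig>
      <ServerInstance>
        <!-- governors go directly under ServerInstance, not under PivotView -->
        <CubeMaxRecords>1000000</CubeMaxRecords>
        <CubeMaxPopulatedCells>1000000</CubeMaxPopulatedCells>
        <PivotView>
          <!-- display limits for the pivot view itself -->
          <MaxCells>4000000</MaxCells>
          <MaxVisibleRows>50000</MaxVisibleRows>
        </PivotView>
      </ServerInstance>
    </WebConfig>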

  • Governor limit exceeded in cube generation error in OBIEE 10g

    Hi Gurus,
    one of my OBIEE reports is throwing the error "Governor limit exceeded in cube generation (Maximum data records exceeded.)"
    I am using OBIEE 10g. Could you please advise? The report also runs for a long time: to get one day of data (about 9000 rows) it runs for 30 minutes.
    Regards,
    SK

    Hello,
    Try altering the values in instanceconfig.xml (under \OracleBIData\web\config):
    <CubeMaxRecords>200000</CubeMaxRecords>
    <CubeMaxPopulatedCells>200000</CubeMaxPopulatedCells>
    Also refer to: OBIEE 10g: Error: "Governor Limit Exceeded In Cube Generate. Error Codes: QBVC92JY" or "Maximum number of allowed pages in Pivot Table exceeded...Error Detail" When Displaying a Pivot View or Chart [ID 1092854.1] if that does not solve the issue.
    Hope this helps. Pls mark if it does.
    Thanks,
    SVS

  • Pivot view error "Governor Limit exceeded in cube generation"

    Hi Gurus,
    My Instanceconfig.xml contains this section.
    <PivotView>
    <MaxCells>600000</MaxCells>
    <CubeMaxRecords>600000</CubeMaxRecords>
    <CubeMaxPopulatedCells>600000</CubeMaxPopulatedCells>
    </PivotView>
    I restarted all the services but I am still getting
    "Governor limit exceeded in cube generation (Maximum data records exceeded.)"
    If the data is below 40K rows it works fine.
    I saw the previous posts on this error and have already tried all the suggestions.
    Did I miss anything?
    Please help me.
    Thanks

    I ran into this issue and didn't see the resolution here or on any other public forums, so I'll add a bit of detail that I found in Metalink, in Doc ID 494163.1.
    1. The XML nodes you care about are CubeMaxRecords and CubeMaxPopulatedCells, at least as of OBI 10.1.3.4.
    2. These are both child nodes of ServerInstance. Their defaults are 40,000 and 150,000, respectively. That is, your baseline instanceconfig.xml file should look like:
    <WebConfig>
      <ServerInstance>
        ...
        <CubeMaxRecords>40000</CubeMaxRecords>
        <CubeMaxPopulatedCells>150000</CubeMaxPopulatedCells>
      </ServerInstance>
    </WebConfig>
    3. You only want to tune these a little at a time, in 1000/10000 increments at most. They act as governors on how the presentation server uses memory, and just slapping an extra zero on the end of each number will probably cause your presentation server to crash. This is much worse than the QBVC92JY error, which at least comes with a nice little graphic.
    4. If you can't find a tuning on these limits that eliminates the error without causing your server to crash, the report author should be re-thinking what he/she is trying to do. It's likely that the report is returning far more data than you really want, and that it needs to be filtered in some way.
    Good luck,
    -Eric
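    Following Eric's point 3, a first increment from the defaults might look like this (one 1000/10000 step up; the values are illustrative, not a recommendation):
    <CubeMaxRecords>41000</CubeMaxRecords>
    <CubeMaxPopulatedCells>160000</CubeMaxPopulatedCells>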

  • Maximum Data records in a Segment?

    Hi Folks,
    I wanted to know if there is a limit on the number of data records in an IDoc segment. The requirement is to generate one segment for each material number, but the problem is that whenever a material number contains more than 1000 records, it is split into two segments.
    The IDoc is an extension of PROACT01 IDoc.
    Thanks in Advance.
    -Abhishek

    Hi Abhishek,
    The maximum number of records that can be created in one segment is 9999999999.
    It all depends on the attributes of the segment.
    Regards,
    Vineesh B    

  • On load, getting error:  Field in data file exceeds maximum length

    Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    PL/SQL Release 11.2.0.3.0 - Production
    CORE    11.2.0.3.0    Production
    TNS for Solaris: Version 11.2.0.3.0 - Production
    NLSRTL Version 11.2.0.3.0 - Production
    I'm trying to load a table, small in size (110 rows, 6 columns).  One of the columns, called NOTES, is erroring when I run the load.  It is saying that the column size exceeds the max limit.  As you can see here, the table column is set to 4000 bytes:
    CREATE TABLE NRIS.NRN_REPORT_NOTES
    (
      NOTES_CN      VARCHAR2(40 BYTE)               DEFAULT sys_guid()            NOT NULL,
      REPORT_GROUP  VARCHAR2(100 BYTE)              NOT NULL,
      AREACODE      VARCHAR2(50 BYTE)               NOT NULL,
      ROUND         NUMBER(3)                       NOT NULL,
      NOTES         VARCHAR2(4000 BYTE),
      LAST_UPDATE   TIMESTAMP(6) WITH TIME ZONE     DEFAULT systimestamp          NOT NULL
    )
    TABLESPACE USERS
    RESULT_CACHE (MODE DEFAULT)
    PCTUSED    0
    PCTFREE    10
    INITRANS   1
    MAXTRANS   255
    STORAGE    (
                INITIAL          80K
                NEXT             1M
                MINEXTENTS       1
                MAXEXTENTS       UNLIMITED
                PCTINCREASE      0
                BUFFER_POOL      DEFAULT
                FLASH_CACHE      DEFAULT
                CELL_FLASH_CACHE DEFAULT
               )
    LOGGING
    NOCOMPRESS
    NOCACHE
    NOPARALLEL
    MONITORING;
    I did a little investigating, and it doesn't add up.
    When I run
    select max(lengthb(notes)) from NRIS.NRN_REPORT_NOTES;
    I get a return of
    643
    That tells me that the largest instance of that column is only 643 bytes.  But EVERY insert is failing.
    Here is the loader file header, and first couple of inserts:
    LOAD DATA
    INFILE *
    BADFILE './NRIS.NRN_REPORT_NOTES.BAD'
    DISCARDFILE './NRIS.NRN_REPORT_NOTES.DSC'
    APPEND INTO TABLE NRIS.NRN_REPORT_NOTES
    Fields terminated by ";" Optionally enclosed by '|'
    (
      NOTES_CN,
      REPORT_GROUP,
      AREACODE,
      ROUND NULLIF (ROUND="NULL"),
      NOTES,
      LAST_UPDATE TIMESTAMP WITH TIME ZONE "MM/DD/YYYY HH24:MI:SS.FF9 TZR" NULLIF (LAST_UPDATE="NULL")
    )
    BEGINDATA
    |E2ACF256F01F46A7E0440003BA0F14C2|;|DEMOGRAPHICS|;|A01003|;3;|Demographic results show that 46 percent of visits are made by females.  Among racial and ethnic minorities, the most commonly encountered are Native American (4%) and Hispanic / Latino (2%).  The age distribution shows that the Bitterroot has a relatively small proportion of children under age 16 (14%) in the visiting population.  People over the age of 60 account for about 22% of visits.   Most of the visitation is from the local area.  More than 85% of visits come from people who live within 50 miles.|;07/29/2013 16:09:27.000000000 -06:00
    |E2ACF256F02046A7E0440003BA0F14C2|;|VISIT DESCRIPTION|;|A01003|;3;|Most visits to the Bitterroot are fairly short.  Over half of the visits last less than 3 hours.  The median length of visit to overnight sites is about 43 hours, or about 2 days.  The average Wilderness visit lasts only about 6 hours, although more than half of those visits are shorter than 3 hours long.   Most visits come from people who are fairly frequent visitors.  Over thirty percent are made by people who visit between 40 and 100 times per year.  Another 8 percent of visits are from people who report visiting more than 100 times per year.|;07/29/2013 16:09:27.000000000 -06:00
    |E2ACF256F02146A7E0440003BA0F14C2|;|ACTIVITIES|;|A01003|;3;|The most frequently reported primary activity is hiking/walking (42%), followed by downhill skiing (12%), and hunting (8%).  Over half of the visits report participating in relaxing and viewing scenery.|;07/29/2013 16:09:27.000000000 -06:00
    Here is the full beginning of the loader log, ending after the first row return.  (They ALL say the same error)
    SQL*Loader: Release 10.2.0.4.0 - Production on Thu Aug 22 12:09:07 2013
    Copyright (c) 1982, 2007, Oracle.  All rights reserved.
    Control File:   NRIS.NRN_REPORT_NOTES.ctl
    Data File:      NRIS.NRN_REPORT_NOTES.ctl
      Bad File:     ./NRIS.NRN_REPORT_NOTES.BAD
      Discard File: ./NRIS.NRN_REPORT_NOTES.DSC
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array:     64 rows, maximum of 256000 bytes
    Continuation:    none specified
    Path used:      Conventional
    Table NRIS.NRN_REPORT_NOTES, loaded from every logical record.
    Insert option in effect for this table: APPEND
       Column Name                  Position   Len  Term Encl Datatype
    NOTES_CN                            FIRST     *   ;  O(|) CHARACTER
    REPORT_GROUP                         NEXT     *   ;  O(|) CHARACTER
    AREACODE                             NEXT     *   ;  O(|) CHARACTER
    ROUND                                NEXT     *   ;  O(|) CHARACTER
        NULL if ROUND = 0X4e554c4c(character 'NULL')
    NOTES                                NEXT     *   ;  O(|) CHARACTER
    LAST_UPDATE                          NEXT     *   ;  O(|) DATETIME MM/DD/YYYY HH24:MI:SS.FF9 TZR
        NULL if LAST_UPDATE = 0X4e554c4c(character 'NULL')
    Record 1: Rejected - Error on table NRIS.NRN_REPORT_NOTES, column NOTES.
    Field in data file exceeds maximum length...
    I am not seeing why this would be failing.

    Hi,
    the problem is that delimited data defaults to CHAR(255)..... very helpful, I know.....
    What you need to do is tell sqlldr that the data is longer than this,
    so change NOTES to NOTES CHAR(4000) in your control file and it should work.
    cheers,
    harry
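    To make harry's fix concrete, the field list from the control file above would become something like this (a sketch; only the NOTES line changes):
    Fields terminated by ";" Optionally enclosed by '|'
    (
      NOTES_CN,
      REPORT_GROUP,
      AREACODE,
      ROUND NULLIF (ROUND="NULL"),
      NOTES CHAR(4000),
      LAST_UPDATE TIMESTAMP WITH TIME ZONE "MM/DD/YYYY HH24:MI:SS.FF9 TZR" NULLIF (LAST_UPDATE="NULL")
    )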

  • Forward to gmail/hotmail Event ID 3030 552 5.7.0 Number of Received: DATA headers exceeds maximum permitted

    A user wants me to forward his Exchange 2003 recipient’s email to his Gmail account. 
    In Active Directory Users and Computers I created a Contact and then in the recipient “Delivery Options” Forward to: I put that contact.
    All forwarded email that was created internally reaches his Gmail account, but only some email that comes from external domains makes it to Gmail.
    Most (but not all) external email, when forwarded to his Gmail account by Exchange 2003, creates an NDR (non-delivery report) saying "DATA headers exceeds maximum permitted" (shown below). I'm pretty sure this message is generated by Exchange 2003 because it is the same whether the external, forwarded-to account is Gmail or Hotmail.
    Any ideas how to solve this?
    -------Error Msg in Outlook----------
    John Smith on 12/12/2014 5:26 PM
                The recipient could not be processed because it would violate the security policy in force
               <ourdomain.com #5.7.0 smtp;552 5.7.0 Number of 'Received:' DATA headers exceeds maximum permitted>
    “ourdomain.com”, above, is the name of our email domain.
    -------- Event Viewer Error-----------
    Event Type: Error
    Event Source: MSExchangeTransport
    Event Category: NDR 
    Event ID: 3030
    Date: 12/12/2014
    Time: 5:08:58 PM
    User: N/A
    Computer: WIN2K3
    Description:
    A non-delivery report with a status code of 5.7.0 was generated for recipient rfc822;[email protected] (Message-ID  <001301d01671$54abb8c0$fe032a40$@com>).

    Hi,
    Based on the error "Number of 'Received:' DATA headers exceeds maximum permitted", the message header size could exceed the message header size limits.
    Message header size limits: these limits apply to the total size of all message header fields that are present in a message. The size of the message body or attachments isn't considered. Because the header fields are plain text, the size of the header is determined by the number of characters in each header field and by the total number of header fields. Each character of text consumes 1 byte.
    So, please check the message header size limit setting on the receive connector with the following cmdlet:
    Get-ReceiveConnector "Connector name" | FL MaxHeaderSize
    Then check the problematic message header and compare it against the limit. If the message header exceeds the message header size limit, we can use the following cmdlet to change the maximum header size:
    Set-ReceiveConnector "Connector Name" -MaxHeaderSize "value"
    The MaxHeaderSize parameter specifies in bytes the maximum size of the SMTP message header that the Receive connector accepts before it closes the connection. The default value is 65536 bytes. When you enter a value, qualify the value with one of
    the following units:
    B (bytes)
    KB (kilobytes)
    MB (megabytes)
    GB (gigabytes)
    Unqualified values are treated as bytes. The valid input range for this parameter is from 1 through 2147483647 bytes.
    Note: Some third-party firewalls or proxy servers apply their own message header size limits. These third-party firewalls or proxy servers may have difficulty processing messages that contain attachment file names that are greater than 50 characters
    or attachment file names that contain non-US-ASCII characters.
    Best Regards.
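    For example (the connector name below is hypothetical, and note that these cmdlets exist in Exchange 2007 and later, not in Exchange 2003 itself):
    Get-ReceiveConnector "Default WIN2K3" | FL MaxHeaderSize
    Set-ReceiveConnector "Default WIN2K3" -MaxHeaderSize 128KB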

  • Field in data file exceeds maximum length

    Hi,
    I am trying to run the following SQL*Loader control job on my Oracle 11gR2 database. Running the SQL*Loader control job results in the 'Field in data file exceeds maximum length' error message. Below, I am listing the control file. Please suggest. Thanks.
    It's giving me an error when I run SQL Loader on it,
    Record 61: Rejected - Error on table RMS_TABLE, column GEOM.SDO_POINT.X.
    Field in data file exceeds maximum length.
    Here is my SQL Loader Control file,
    LOAD DATA
    INFILE *
    TRUNCATE
    CONTINUEIF NEXT(1:1) = '#'
    INTO TABLE RMS_TABLE
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS (
       Status NULLIF Status = BLANKS,
       Score,
       Match_type NULLIF Match_type = BLANKS,
       Match_addr NULLIF Match_addr = BLANKS,
       Side NULLIF Side = BLANKS,
       User_fld NULLIF User_fld = BLANKS,
       Addr_type NULLIF Addr_type = BLANKS,
       ARC_Street NULLIF ARC_Street = BLANKS,
       ARC_City NULLIF ARC_City = BLANKS,
       ARC_State NULLIF ARC_State = BLANKS,
       ARC_ZIP NULLIF ARC_ZIP = BLANKS,
       INCIDENT_N NULLIF INCIDENT_N = BLANKS,
       CDATE NULLIF CDATE = BLANKS,
       CTIME NULLIF CTIME = BLANKS,
       DISTRICT NULLIF DISTRICT = BLANKS,
       LOCATION NULLIF LOCATION = BLANKS,
       MAPLOCATIO NULLIF MAPLOCATIO = BLANKS,
       LOCATION_T NULLIF LOCATION_T = BLANKS,
       DAYCODE NULLIF DAYCODE = BLANKS,
       CAUSE NULLIF CAUSE = BLANKS,
       GEOM COLUMN OBJECT
         (SDO_GTYPE       INTEGER EXTERNAL,
          SDO_POINT COLUMN OBJECT
            (X            FLOAT EXTERNAL,
             Y            FLOAT EXTERNAL)
         )
    )
    And here is the table it loads into:
    CREATE TABLE RMS_TABLE (
      Status VARCHAR2(1),
      Score NUMBER,
      Match_type VARCHAR2(2),
      Match_addr VARCHAR2(120),
      Side VARCHAR2(1),
      User_fld VARCHAR2(120),
      Addr_type VARCHAR2(20),
      ARC_Street VARCHAR2(100),
      ARC_City VARCHAR2(40),
      ARC_State VARCHAR2(20),
      ARC_ZIP VARCHAR2(10),
      INCIDENT_N VARCHAR2(9),
      CDATE VARCHAR2(10),
      CTIME VARCHAR2(8),
      DISTRICT VARCHAR2(4),
      LOCATION VARCHAR2(128),
      MAPLOCATIO VARCHAR2(100),
      LOCATION_T VARCHAR2(42),
      DAYCODE VARCHAR2(1),
      CAUSE VARCHAR2(17),
      GEOM MDSYS.SDO_GEOMETRY);

    Hi,
    Looks like you have a problem with record 61 in your data file. Can you please post it in a reply?
    Regards
    Ivan

  • SDO_ORDINATES.X.Field in data file exceeds maximum length

    Hi All,
    While loading data from a .SHP file into Oracle Spatial through the SHP2SDO tool, the following error message appears:
    Error message:
    Record 54284: Rejected - Error on table GEO_PARCEL_CENTROID, column CENTROID_GEOM.SDO_ORDINATES.X.
    Field in data file exceeds maximum length.
    I read somewhere that this is because SQL*Loader defaults a column to 255 characters. But I am confused about how to change the column size in the control file because it is an object data type. I am not sure whether this is correct.
    The control file show as below:
    LOAD DATA
    INFILE geo_parcel_centroid.dat
    TRUNCATE
    CONTINUEIF NEXT(1:1) = '#'
    INTO TABLE GEO_PARCEL_CENTROID
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS (
    CENTROID_ID INTEGER EXTERNAL,
    APN_NUMBER      NULLIF APN_NUMBER = BLANKS,
    PROPERTY_A      NULLIF PROPERTY_A = BLANKS,
    PROPERTY_C      NULLIF PROPERTY_C = BLANKS,
    OWNER_NAME      NULLIF OWNER_NAME = BLANKS,
    THOMAS_GRI      NULLIF THOMAS_GRI = BLANKS,
    MAIL_ADDRE      NULLIF MAIL_ADDRE = BLANKS,
    MAIL_CITY_      NULLIF MAIL_CITY_ = BLANKS,
    MSLINK,
    MAPID,
    GMRotation,
    CENTROID_GEOM COLUMN OBJECT
      (SDO_GTYPE INTEGER EXTERNAL,
       SDO_ELEM_INFO VARRAY TERMINATED BY '|/'
         (X FLOAT EXTERNAL),
       SDO_ORDINATES VARRAY TERMINATED BY '|/'
         (X FLOAT EXTERNAL)
      )
    )
    Any help on this would be appreciated.
    Thanks,
    [email protected]


  • SQL Loader - Field in data file exceeds maximum length

    Dear All,
    I have a file which has more than 4000 characters in a field, and I wish to load the data into a table with field length = 4000, but I receive the error:
    Field in data file exceeds maximum length
    The scripts and ctl file are given below.
    Table creation script:
    CREATE TABLE "TEST_TAB"
        "STR"  VARCHAR2(4000 BYTE),
        "STR2" VARCHAR2(4000 BYTE),
        "STR3" VARCHAR2(4000 BYTE)
      );Control file:
    LOAD DATA
    INFILE 'C:\table_export.txt'
    APPEND INTO TABLE TEST_TAB
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS
    ( STR CHAR(4000) "SUBSTR(:STR,1,4000)" ,
    STR2 CHAR(4000) "SUBSTR(:STR2,1,4000)" ,
    STR3 CHAR(4000) "SUBSTR(:STR3,1,4000)"
    )
    Log:
    SQL*Loader: Release 10.2.0.1.0 - Production on Mon Jul 26 16:06:25 2010
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Control File:   C:\TEST_TAB.CTL
    Data File:      C:\table_export.txt
      Bad File:     C:\TEST_TAB.BAD
      Discard File:  none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 0
    Bind array:     64 rows, maximum of 256000 bytes
    Continuation:    none specified
    Path used:      Conventional
    Table TEST_TAB, loaded from every logical record.
    Insert option in effect for this table: APPEND
    TRAILING NULLCOLS option in effect
       Column Name                  Position   Len  Term Encl Datatype
    STR                                 FIRST  4000   |       CHARACTER           
        SQL string for column : "SUBSTR(:STR,1,4000)"
    STR2                                 NEXT  4000   |       CHARACTER           
        SQL string for column : "SUBSTR(:STR2,1,4000)"
    STR3                                 NEXT  4000   |       CHARACTER           
        SQL string for column : "SUBSTR(:STR3,1,4000)"
    value used for ROWS parameter changed from 64 to 21
    Record 1: Rejected - Error on table TEST_TAB, column STR.
    Field in data file exceeds maximum length
    MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
    Table TEST_TAB:
      0 Rows successfully loaded.
      1 Row not loaded due to data errors.
      0 Rows not loaded because all WHEN clauses were failed.
      0 Rows not loaded because all fields were null.
    Space allocated for bind array:                 252126 bytes(21 rows)
    Read   buffer bytes: 1048576
    Total logical records skipped:          0
    Total logical records read:             1
    Total logical records rejected:         1
    Total logical records discarded:        0
    Run began on Mon Jul 26 16:06:25 2010
    Run ended on Mon Jul 26 16:06:25 2010
    Elapsed time was:     00:00:00.22
    CPU time was:         00:00:00.15
    Please suggest a way to get it done.
    Thanks for reading the post!
    *009*

    Hi Toni,
    Thanks for the reply.
    Do you mean this?
    CREATE TABLE "TEST"."TEST_TAB"
        "STR"  VARCHAR2(4001),
        "STR2" VARCHAR2(4001),
        "STR3" VARCHAR2(4001)
      );However this does not work as the error would be:
    Error at Command Line:8 Column:20
    Error report:
    SQL Error: ORA-00910: specified length too long for its datatype
    00910. 00000 -  "specified length too long for its datatype"
    *Cause:    for datatypes CHAR and RAW, the length specified was > 2000;
               otherwise, the length specified was > 4000.
    *Action:   use a shorter length or switch to a datatype permitting a
               longer length such as a VARCHAR2, LONG CHAR, or LONG RAW
    *009*
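    For what it's worth, the usual fix here is to enlarge SQL*Loader's read buffer for each field beyond the longest value in the data file, while still truncating to 4000 on insert. A sketch against the control file above (20000 is an assumed upper bound for the source data, adjust to fit yours):
    LOAD DATA
    INFILE 'C:\table_export.txt'
    APPEND INTO TABLE TEST_TAB
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS
    ( STR CHAR(20000) "SUBSTR(:STR,1,4000)" ,
    STR2 CHAR(20000) "SUBSTR(:STR2,1,4000)" ,
    STR3 CHAR(20000) "SUBSTR(:STR3,1,4000)"
    )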

  • Loader- Field in data file exceeds maximum length

    Hi,
    I am getting an error while loading the data, even though the data size of this column is less than 4000 and I defined the column as OBJ_ADDN_INFO CLOB.
    Please help
    ==================
    Record 1: Rejected - Error on table APPS.CG_COMPARATIVE_MATRIX_TAB, column OBJ_ADDN_INFO.
    Field in data file exceeds maximum length
    LOAD DATA
    infile *
    REPLACE
    INTO TABLE APPS.CG_COMPARATIVE_MATRIX_TAB
    FIELDS TERMINATED BY ","
    OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
    ( APPS_VERSION,
    MODULE_SHORT_NAME,
    CATEGORY,
    MODULE,
    OBJECT_NAME,
    OBJECT_TYPE,
    OBJECT_STATUS,
    FUNCTION_NAME,
    OBJ_ADDN_INFO
    )
    begindata
    "12",DBI,Oracle Daily Business Intelligence,DBI for Depot Repair,ISC_DEPOT_RO_INIT,PROGRAM,Changed,"Initial Load - Update Depot Repair Order Base Summary","The ISC_DR_REPAIR_ORDERS_F fact has a new column FLOW_SATUS_ID. The FLOW_STATUS_ID contains a user-defined Status for a Repair Order. The STATUS Column will continue to store the Status, now called State of the Repair Order i.e. O , C , D , H . The Initial Load incorporates the additional column FLOW_STATUS_ID. The Incremental Load s merge statement is modified to collect or update the additional column FLOW_STATUS_ID also. ","DBI for Depot Repair"
    "12",DBI,Oracle Daily Business Intelligence,DBI for Depot Repair,ISC_DEPOT_RO_INCR,PROGRAM,Changed,"Update Depot Repair Orders Base Summary","The ISC_DR_REPAIR_ORDERS_F fact has a new column FLOW_SATUS_ID. The FLOW_STATUS_ID contains a user-defined Status for a Repair Order. The STATUS Column will continue to store the Status, now called State of the Repair Order i.e. O , C , D , H . The Initial Load incorporates the additional column FLOW_STATUS_ID. The Incremental Load s merge statement is modified to collect or update the additional column FLOW_STATUS_ID also. ","DBI for Depot Repair"

    If you don't specify a data type for a data field in the SQL Loader control file, SQL Loader assumes the data type is CHAR(255). If you have data that is larger than that, then you can't rely on the default. Try changing the control file to
    LOAD DATA
    infile *
    REPLACE
    INTO TABLE APPS.CG_COMPARATIVE_MATRIX_TAB
    FIELDS TERMINATED BY ","
    OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
    ( APPS_VERSION,
    MODULE_SHORT_NAME,
    CATEGORY,
    MODULE,
    OBJECT_NAME,
    OBJECT_TYPE,
    OBJECT_STATUS,
    FUNCTION_NAME,
    OBJ_ADDN_INFO char(4000)
    )
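    A hypothetical invocation of the loader with this control file (the username, password, and file names are placeholders):
    sqlldr userid=apps/apps_password control=cg_matrix.ctl log=cg_matrix.log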

  • Field in data file exceeds maximum length - CTL file error

    Hi,
    I am loading data into a new system using a CTL file, but I am getting the error 'Field in data file exceeds maximum length' for a few records; the other records are processed successfully. I have checked the length of the error record in the extract file, and it is less than the length of the target column, VARCHAR2(2000 BYTE). Below is an example of the error data,
    Hi Rebecca~I have just spoken to our finance department and they have agreed that the ABCs payments made can be allocated to the overdue invoices, can you send any future invoices direct to me so that I can get them paid on time.~Hope this is ok ~Thanks~Terry~
    Is this error caused by the special characters in the string?
    Below is the ctl file I am using,
    OPTIONS (SKIP=2)
    LOAD DATA
    CHARACTERSET WE8ISO8859P1
    INFILE  '$FILE'
    APPEND
    INTO TABLE "XXDM_DM_17_ONACCOUNT_REC_SRC"
    WHEN (1)!= 'FOOTER='
    FIELDS TERMINATED BY '|'
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS (
                                  <Column_name>,
                                  <Column_name>,
                                  COMMENTS,
                                  <Column_name>,
                                  <Column_name>
    )
    Thanks in advance,
    Aditya

    Hi,
    I suspect this is because of the built-in default length of character datatypes in sqlldr - it defaults to CHAR(255), taking no notice of what the actual table definition is.
    Try adding CHAR(2000) to your control file so you end up with something like this:
    OPTIONS (SKIP=2)
    LOAD DATA
    CHARACTERSET WE8ISO8859P1
    INFILE  '$FILE'
    APPEND
    INTO TABLE "XXDM_DM_17_ONACCOUNT_REC_SRC"
    WHEN (1)!= 'FOOTER='
    FIELDS TERMINATED BY '|'
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS (
                                  <Column_name>,
                                  <Column_name>,
                                  COMMENTS CHAR(2000),
                                  <Column_name>,
                                  <Column_name>
    )
    Cheers,
    Harry

  • SQL loader Field in data file exceeds maximum length for CLOB column

    Hi all
    I'm loading data from a text file separated by TABs and I got the error below for some lines.
    Even though the column is the CLOB data type - is there a limitation on the size of a CLOB data type?
    The error is:
    Record 74: Rejected - Error on table _TEMP, column DEST.
    Field in data file exceeds maximum length
    I'm using SQL*Loader and the database is Oracle 11g R2 on Linux Red Hat 5.
    Here are the lines causing the error from my data file, and my table description for the test:
    create table TEMP
    (
    CODE VARCHAR2(100),
    DESC VARCHAR2(500),
    RATE     FLOAT,
    INCREASE VARCHAR2(20),
    COUNTRY VARCHAR2(500),
    DEST     CLOB,
    WEEK     VARCHAR2(10),
    IS_SAT VARCHAR2(50),
    IS_SUN VARCHAR2(50)
    );
    CONTROL FILE:
    LOAD DATA
    INTO TABLE TEMP
    APPEND
    FIELDS TERMINATED BY X'9' TRAILING NULLCOLS
    (
    CODE,
    DESC,
    RATE,
    INCREASE,
    COUNTRY,
    DEST,
    WEEK,
    IS_SAT,
    IS_SUN
    )
    Data file:
    BHS Mobile     Bahamas - Mobile     0.1430          1     "242357, 242359, 242375, 242376, 242395, 242421, 242422, 242423, 242424, 242425, 242426, 242427, 242428, 242429, 242431, 242432, 242433, 242434, 242435, 242436, 242437, 242438, 242439, 242441, 242442, 242443, 242445, 242446, 242447, 242448, 242449, 242451, 242452, 242453, 242454, 242455, 242456, 242457, 242458, 242462, 242463, 242464, 242465, 242466, 242467, 242468, 24247, 242524, 242525, 242533, 242535, 242544, 242551, 242552, 242553, 242554, 242556, 242557, 242558, 242559, 242565, 242577, 242636, 242646, 242727"               
    BOL Mobile ENTEL     Bolivia - Mobile Entel     0.0865     Increase     591     "67, 68, 71, 72, 73, 740, 7410, 7411, 7412, 7413, 7414, 7415, 7420, 7421, 7422, 7423, 7424, 7425, 7430, 7431, 7432, 7433, 7434, 7435, 7436, 7437, 7440, 7441, 7442, 7443, 7444, 7445, 7450, 7451, 7452, 7453, 7454, 7455, 746, 7470, 7471, 7472, 7475, 7476, 7477, 7480, 7481, 7482, 7483, 7484, 7485, 7486, 7490, 7491, 7492, 7493, 7494, 7495, 7496"
    Thank you.

    Hi
    Thank you for your help. I found the solution; here is what I did in my control file: I added
    char(40000) OPTIONALLY ENCLOSED BY '"' to the DEST field.
    LOAD DATA
    INTO TABLE TEMP
    APPEND
    FIELDS TERMINATED BY X'9' TRAILING NULLCOLS
    (
    CODE,
    DESC,
    RATE,
    INCREASE,
    COUNTRY,
    DEST char(40000) OPTIONALLY ENCLOSED BY '"',
    WEEK,
    IS_SAT,
    IS_SUN
    )
    Thank you for your help.

  • Finding maximum Date in Cube

    Hi,
    How can I find the maximum date among all the dates of a date characteristic in the cube data?
    I want to restrict a selection in my query with this maximum date. So how can I fetch that date into a variable?

    Hi
    You can call the function module RSDRI_INFOPROV_READ in your exit for the variable, read the dates, and find the maximum. You have to pass the name of the InfoCube you want to read. However, writing this code is a slightly tedious job, so think of any other way you can meet your requirement.
    If you want to use the function module, tell me and I'll send you a piece of code which will give you an idea.
    Thanks
    Mansi

Maybe you are looking for

  • Select query gives result in abc order instead of creation order

    Hi, I used the following command in sql command window to insert some rows to my table: SQL> SQL> insert into regions (Region_ID, Region_Name) values (1 , 'Zafon'); 1 row inserted SQL> insert into regions (Region_ID, Region_Name) values (2 , 'Hasharo

  • Special characters in Discoverer

    Hi everybody, I would like to use special characters in reports such as arrow (U2192, Arial font.). It's fine in Discoverer, but after publishing to AS portal it doesn't work. Thanks for any ideas how to solve this issue. Tomas Greif

  • How to  create forign vendor

    plz tell me how to create forign vendor? urgent

  • Transfering 0CUSTOMER from R/3 and activating in Business Content

    Hi all,    New to BW. Trying to learn Data Transfer. Need help activating infoobject 0CUSTOMER in Business Content. I transferred 0CUSTOMER from R/3 without any problem. When i try to activate it in Business Content I get the following errors: 1) Cha

  • Nodemanager in WLS81sp5 calling custom script to start Managed Server.

    Hi, In WLS81sp5 on AIX, I have modified nodemanager.properties file to call a custom script. This custom script is called successfully when I start a managed server from admin console. This custom script is called with 1st argument as "-Xms512m -Xmx5