DRG-11112: length of CLOB query value exceeds maximum of 64000
Is there a CLOB length limitation when running an Oracle Text search? (v 11.1.0.7) I have checked the Reference Guide and Application Developer's Guide.
--create table
create table nk_1929(id number, vc_a clob);
--insert dummy data
declare
vc_clob clob;
begin
vc_clob := lpad(to_clob('a'), 222920, 'a');
insert into nk_1929 values(1, vc_clob);
end;
/
--create index
create index nk_1929_ndx on nk_1929(vc_a)
indextype is ctxsys.context parameters('
datastore ctxsys.default_datastore
stoplist ctxsys.empty_stoplist');
--run query with a search string longer than 64000
declare
str1 clob;
query_term clob;
begin
select vc_a into query_term from nk_1929 where id = 1;
str1 := 'select id from nk_1929 where contains(vc_a, :1) > 0';
execute immediate str1 using query_term;
end;
/
ORA-29902: error in executing ODCIIndexStart() routine
ORA-20000: Oracle Text error:
DRG-11112: length of CLOB Query Value exceeds maximum of 64000
Am I missing something here? Please let me know.
The same 64000-character CLOB query value limitation also appears with a plain SELECT:
--run query with a search string longer than 64000
declare
vn_id number;
query_term clob;
begin
select vc_a into query_term from nk_1929 where id = 1;
select max(id) into vn_id from nk_1929 where contains(vc_a, query_term) > 0;
end;
/
ORA-29902: error in executing ODCIIndexStart() routine
ORA-20000: Oracle Text error:
DRG-11112: length of CLOB Query Value exceeds maximum of 64000
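DRG-11112 appears to reflect a hard limit in Oracle Text: the query value passed to CONTAINS may not exceed 64000 characters, even when bound as a CLOB. If truncating the search text is acceptable, one hedged workaround is to cut the query term down before passing it to CONTAINS. A sketch against the nk_1929 table above (32000 is chosen to stay within both the Text limit and the 32767-character VARCHAR2 ceiling of DBMS_LOB.SUBSTR; real document text may additionally need Text query operators escaped):

```sql
declare
  vn_id      number;
  query_term clob;
begin
  select vc_a into query_term from nk_1929 where id = 1;
  -- dbms_lob.substr returns a VARCHAR2 of at most 32767 characters,
  -- comfortably under the 64000-character CONTAINS limit
  select max(id)
    into vn_id
    from nk_1929
   where contains(vc_a, dbms_lob.substr(query_term, 32000, 1)) > 0;
end;
/
```

This is only a sketch: it trades completeness of the match for staying under the limit, which may or may not be acceptable for the application.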
Similar Messages
-
XML data value exceeds maximum length - ORA-30951
Hello,
I am receiving ORA-30951: Element or attribute at Xpath /dataroot/Respondent[1]/Response[3]/Value exceeds maximum length error during the XML load.
I have registered the schema and it works fine when the Value is less than 64k but fails if its greater. I tried changing the type of Value to type="xsd:base64Binary" xdb:maintainDOM="false" xdb:SQLName="LOB_DATA" xdb:SQLType="BLOB" but then I get ORA-00932: inconsistent datatypes error.
Can someone please let me know what I am doing wrong, or is there a way I can load elements longer than 64k on 10g?
Thanks
Here is my schema.
var SCHEMAURL varchar2(256)
var XMLSCHEMA CLOB
set define off
begin
:SCHEMAURL := 'http://xmlns.example.com/Svy_Resp.xsd';
:XMLSCHEMA := '<?xml version="1.0" encoding="utf-16"?>
<xsd:schema attributeFormDefault="unqualified" elementFormDefault="qualified" version="1.0" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xdb="http://xmlns.oracle.com/xdb" xdb:storeVarrayAsTable="true">
<xsd:element name="dataroot" xdb:defaultTable="SVY_RESP_XML_SCHEMA" type="datarootType" />
<xsd:complexType name="datarootType" xdb:maintainDOM="false"
xdb:SQLType="Dataroot_T">
<xsd:sequence>
<xsd:element maxOccurs="unbounded" name="Respondent" type="RespondentType" />
</xsd:sequence>
<xsd:attribute name="generated" type="xsd:dateTime" />
</xsd:complexType>
<xsd:complexType name="RespondentType" xdb:maintainDOM="false" xdb:SQLType="Respondent_Type">
<xsd:sequence>
<xsd:element name="RespondentID" type="xsd:int" />
<xsd:element name="KsID" type="xsd:int" />
<xsd:element name="email" type="xsd:string" />
<xsd:element name="SyID" type="xsd:int" />
<xsd:element name="KSuID" type="xsd:int" />
<xsd:element name="Completed" type="xsd:int" />
<xsd:element name="SubmitDateTime" type="xsd:dateTime" />
<xsd:element maxOccurs="unbounded" name="Response" type="ResponseType" />
</xsd:sequence>
</xsd:complexType>
<xsd:complexType name="ResponseType" xdb:maintainDOM="false" xdb:SQLType="Response_Type">
<xsd:sequence>
<xsd:element name="ResponseID" type="xsd:int" />
<xsd:element name="RespondentID" type="xsd:int" />
<xsd:element name="CID" type="xsd:int" />
<xsd:element name="AID" type="xsd:int" />
<xsd:element name="Value" type="xsd:string"/>
<xsd:element name="QID" type="xsd:int" />
<xsd:element name="SID" type="xsd:int" />
</xsd:sequence>
</xsd:complexType>
</xsd:schema>';
end;
/
Thanks for the quick response. I am not able to modify the source file, but would it be possible to set the value to NULL when it exceeds the maximum length, instead of failing?
Thanks -
Hi,
I have a CLOB column in my table which stores XML values, and I need to get the size of the stored XML.
But when I use Oracle's LENGTH function, it raises a "character string buffer too small" error.
Can anyone suggest a way to find the length of a CLOB column?
My actual requirement is to create a view on the base table with all the existing fields plus one more field holding the length of the CLOB column value.
Regards,
Vikas
So, it's XML now?
select dbms_lob.getlength (extract (str, '/*').getClobVal())
from test
as in
SQL> with test
2 as
3 (
4 select xmltype ('<this>test</this>') str from dual
5 ) -- End of Test. Here's the query:
6 select dbms_lob.getlength (extract (str, '/*').getClobVal())
7 from test
8 /
DBMS_LOB.GETLENGTH(EXTRACT(STR,'/*').GETCLOBVAL())
17
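For the original requirement, a view that exposes the CLOB length next to the base columns, DBMS_LOB.GETLENGTH can be used directly. A sketch with made-up table and column names:

```sql
-- Illustrative names only; dbms_lob.getlength returns the length
-- of a CLOB in characters.
create or replace view base_with_len_vw as
select t.id,
       t.xml_clob,
       dbms_lob.getlength(t.xml_clob) as clob_length
from   base_table t;
```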
SQL> -
Ctxload error DRG-11530: token exceeds maximum length
I downloaded the 11g examples (formerly the companion cd) with the supplied knowledge base (thesauri), unzipped it, installed it, and confirmed that the droldUS.dat file is there. Then I tried to use ctxload to create a default thesaurus, using that file, as per the online documentation. It creates the default thesaurus, but does not load the data, due to the error "DRG-11530: token exceeds maximum length". Apparently one of the terms is too long. But what can I use to edit the file? I tried notepad, but it was too big. I tried wordpad, but it was unreadable. I was able to create a default thesaurus using the much smaller sample thesaurus dr0thsus.txt, so I confirmed that there is nothing wrong with the syntax or privileges. Please see the copy of the run below. Is there a way to edit the droldUS.dat file or a workaround or am I not loading it correctly? Does the .dat file need to be loaded differently than the .txt file?
CTXSYS@orcl_11g> select banner from v$version
2 /
BANNER
Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
PL/SQL Release 11.1.0.6.0 - Production
CORE 11.1.0.6.0 Production
TNS for 32-bit Windows: Version 11.1.0.6.0 - Production
NLSRTL Version 11.1.0.6.0 - Production
CTXSYS@orcl_11g> select count(*) from ctx_thesauri where ths_name = 'DEFAULT'
2 /
COUNT(*)
0
CTXSYS@orcl_11g> select count(*) from ctx_thes_phrases where thp_thesaurus = 'DE
FAULT'
2 /
COUNT(*)
0
CTXSYS@orcl_11g> host ctxload -thes -user ctxsys/ctxsys@orcl -name default -file
C:\app\Barbara\product\11.1.0\db_1\ctx\data\enlx\droldUS.dat
Connecting...
Creating thesaurus default...
Thesaurus default created...
Processing...
DRG-11530: token exceeds maximum length
Disconnected
CTXSYS@orcl_11g> connect ctxsys/ctxsys@orcl
Connected.
CTXSYS@orcl_11g>
CTXSYS@orcl_11g> select count(*) from ctx_thesauri where ths_name = 'DEFAULT'
2 /
COUNT(*)
1
CTXSYS@orcl_11g> select count(*) from ctx_thes_phrases where thp_thesaurus = 'DE
FAULT'
2 /
COUNT(*)
0
CTXSYS@orcl_11g> exec ctx_thes.drop_thesaurus ('default')
PL/SQL procedure successfully completed.
CTXSYS@orcl_11g> host ctxload -thes -user ctxsys/ctxsys@orcl -name default -file
C:\app\Barbara\product\11.1.0\db_1\ctx\sample\thes\dr0thsus.txt
Connecting...
Creating thesaurus default...
Thesaurus default created...
Processing...
1000 lines processed
2000 lines processed
3000 lines processed
4000 lines processed
5000 lines processed
6000 lines processed
7000 lines processed
8000 lines processed
9000 lines processed
10000 lines processed
11000 lines processed
12000 lines processed
13000 lines processed
14000 lines processed
15000 lines processed
16000 lines processed
17000 lines processed
18000 lines processed
19000 lines processed
20000 lines processed
21000 lines processed
21760 lines processed successfully
Beginning insert...21760 lines inserted successfully
Disconnected
CTXSYS@orcl_11g> select count(*) from ctx_thesauri where ths_name = 'DEFAULT'
2 /
COUNT(*)
1
CTXSYS@orcl_11g> select count(*) from ctx_thes_phrases where thp_thesaurus = 'DE
FAULT'
2 /
COUNT(*)
9582
CTXSYS@orcl_11g>
Hi Roger,
Thanks for the response. You are correct. I was confusing the terms thesaurus and knowledge base, which sometimes seem to be used interchangeably or synonymously, but are actually two different things. I read over the various sections of the documentation regarding the supplied knowledge base and supplied thesaurus more carefully and believe I understand now. Apparently, the dr0thsus.txt file that I did ultimately load using ctxload to create a default thesaurus is the supplied thesaurus that is intended to be used to create the default English thesaurus, which supports ctx_thes syn and such. The other droldUS.dat file that I mistakenly tried to load using ctxload is the supplied compiled knowledge base that supports ctx_doc themes and gist and such. In the past I have used ctx_thes.create_thesaurus to create a thesaurus, but using ctxload can also load a thesaurus from a text file with the data in a specified format. Once a thesaurus is loaded using ctxload, it can then be compiled using ctxkbtc to add it to the existing compiled knowledge base. So, the knowledge base is sort of a compilation of thesauri, which is what led to my confusion in terminology. I think I have it all straight in my mind now and hopefully this will help anybody else who searches for the same problem and finds this.
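Barbara's distinction can be summarized as two commands (connect strings and file paths are illustrative): ctxload loads a plain-text thesaurus, while ctxkbtc compiles loaded thesauri into the knowledge base; compiled .dat files such as droldUS.dat are ctxkbtc output, not ctxload input.

```
# Load the supplied thesaurus text file as the DEFAULT thesaurus
ctxload -thes -user ctxsys/ctxsys@orcl -name default -file dr0thsus.txt

# Compile a loaded thesaurus into the existing knowledge base
# (droldUS.dat is the supplied pre-compiled knowledge base, not ctxload input)
ctxkbtc -user ctxsys/ctxsys -name default
```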
Thanks,
Barbara -
SQL loader Field in data file exceeds maximum length for CLOB column
Hi all
I'm loading data from a text file separated by TABs, and I get the error below for some lines.
Even though the column is of CLOB data type, is there a limitation on the size of a CLOB data type?
The error is:
Record 74: Rejected - Error on table _TEMP, column DEST.
Field in data file exceeds maximum length
I'm using SQL Loader and the database is oracle 11g r2 on linux Red hat 5
Here are the line causing the error fronm my data file and my table description for test:
create table TEMP
(
CODE VARCHAR2(100),
DESC VARCHAR2(500),
RATE FLOAT,
INCREASE VARCHAR2(20),
COUNTRY VARCHAR2(500),
DEST CLOB,
WEEK VARCHAR2(10),
IS_SAT VARCHAR2(50),
IS_SUN VARCHAR2(50)
);
CONTROL FILE:
LOAD DATA
INTO TABLE TEMP
APPEND
FIELDS TERMINATED BY X'9' TRAILING NULLCOLS
(
CODE,
DESC,
RATE,
INCREASE,
COUNTRY,
DEST,
WEEK,
IS_SAT,
IS_SUN
)
Data file:
BHS Mobile Bahamas - Mobile 0.1430 1 "242357, 242359, 242375, 242376, 242395, 242421, 242422, 242423, 242424, 242425, 242426, 242427, 242428, 242429, 242431, 242432, 242433, 242434, 242435, 242436, 242437, 242438, 242439, 242441, 242442, 242443, 242445, 242446, 242447, 242448, 242449, 242451, 242452, 242453, 242454, 242455, 242456, 242457, 242458, 242462, 242463, 242464, 242465, 242466, 242467, 242468, 24247, 242524, 242525, 242533, 242535, 242544, 242551, 242552, 242553, 242554, 242556, 242557, 242558, 242559, 242565, 242577, 242636, 242646, 242727"
BOL Mobile ENTEL Bolivia - Mobile Entel 0.0865 Increase 591 "67, 68, 71, 72, 73, 740, 7410, 7411, 7412, 7413, 7414, 7415, 7420, 7421, 7422, 7423, 7424, 7425, 7430, 7431, 7432, 7433, 7434, 7435, 7436, 7437, 7440, 7441, 7442, 7443, 7444, 7445, 7450, 7451, 7452, 7453, 7454, 7455, 746, 7470, 7471, 7472, 7475, 7476, 7477, 7480, 7481, 7482, 7483, 7484, 7485, 7486, 7490, 7491, 7492, 7493, 7494, 7495, 7496"
Thank you.
Hi,
Thank you for your help. I found the solution: in my control file I added
CHAR(40000) OPTIONALLY ENCLOSED BY '"' to the DEST field.
LOAD DATA
INTO TABLE TEMP
APPEND
FIELDS TERMINATED BY X'9' TRAILING NULLCOLS
(
CODE,
DESC,
RATE,
INCREASE,
COUNTRY,
DEST CHAR(40000) OPTIONALLY ENCLOSED BY '"',
WEEK,
IS_SAT,
IS_SUN
)
Thank you for your help. -
How to Sort by the length of the returned value from a query.
Hi,
I was wondering if it is possible to sort by the length of the returned value from a query?
For example, if I want to get a list of people with the name 'Samuel', I would like to sort by the length of the whole name, shortest first.
Sort by length of the name in SQL
Samuel Syda
Samuel Indranaka
Samuel Johnsons
Samuel Longhenderson
Thank you.
Hi,
Sorting is done by an ORDER BY clause at the end of the main query.
In most cases, you can ORDER BY any expression, even if it is not in the SELECT clause. In this case, it sounds like you just need:
ORDER BY LENGTH (name_column)
I hope this answers your question.
If not, post a little sample data (CREATE TABLE and INSERT statements, relevant columns only) for all the tables involved, and the results you want from that data.
Post your query, using an ORDER BY clause like the one above, and point out where that query is producing the wrong results, and explain, using specific examples, how you get the right results from the given data in those places.
Always say what version of Oracle you're using (e.g. 11.2.0.2.0).
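A minimal worked example of the advice above (the table and rows are invented for illustration):

```sql
create table people_demo (full_name varchar2(100));
insert into people_demo values ('Samuel Longhenderson');
insert into people_demo values ('Samuel Syda');
insert into people_demo values ('Samuel Johnsons');
insert into people_demo values ('Samuel Indranaka');

-- shortest name first: Samuel Syda, Samuel Johnsons, Samuel Indranaka, ...
select full_name
from   people_demo
where  full_name like 'Samuel%'
order  by length(full_name);
```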
See the forum FAQ: https://forums.oracle.com/message/9362002 -
ORA-30951: ...exceeds maximum length
Oracle Database 10g Release 10.2.0.1.0 - Production
I am new to XML and having a problem importing data using the XML Repository. I have annotated the schema and validated it using XML Spy. I am able to register the schema without errors. I am now working through the issues as they occur during the insertion of XML documents. The section below is giving me an error (bottom) that the data exceeds the maximum length. The "DATA" in the XML doc is a PDF file that has been converted to characters by some method. The Size element has a data value of 5008, which seems to be too long for a VARCHAR2. I tried RAW, CLOB, and BLOB. I was pretty sure they wouldn't work, and they didn't: I get an error that the XML/XDB types are incompatible.
How can I modify the schema to get this element to load?
Is it possible to tell oracle to ignore an element so I can eliminate those that are not critical? This would be very helpful.
Thanks!
Schema -
<xs:element name="NpdbReportList" minOccurs="0">
<xs:complexType>
<xs:sequence maxOccurs="unbounded">
<xs:element name="NpdbReport" minOccurs="0">
<xs:complexType>
<xs:all minOccurs="0">
<xs:element name="DCN" minOccurs="0"/>
<xs:element name="DateReport" type="Date" minOccurs="0"/>
<xs:element name="ReportType" type="IdRef" minOccurs="0"/>
<xs:element ref="LOB" minOccurs="0"/>
</xs:all>
</xs:complexType>
</xs:element>
</xs:sequence>
</xs:complexType>
</xs:element>
<xs:complexType name="LOB" xdb:SQLType="LOB_T" xdb:maintainDOM="false">
<xs:all>
<xs:element name="Type" type="IdRef"/>
<xs:element name="Size"/>
<xs:element name="Data" xdb:maintainDOM="false" xdb:SQLName="LOB_DATA" xdb:SQLType="VARCHAR2"/>
</xs:all>
</xs:complexType>
ftp> mput *
mput Smyth_Steven_1386367.XML? y
227 Entering Passive Mode (127,0,0,1,83,221)
150 ASCII Data Connection
550- Error Response
ORA-30951: Element or attribute at Xpath /Provider/NpdbReportList/NpdbReport[1]/LOB/Data exceeds maximum length
550 End Error Response
28706 bytes sent in 0.014 seconds (1.9e+03 Kbytes/s)
ftp>
Thanks for your time, Marco.
Here is the header:
<?xml version="1.0" encoding="UTF-8"?>
<!-- edited with XMLSpy v2010 rel. 2 (http://www.altova.com) by Joe (DSI) -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:xdb="http://xmlns.oracle.com/xdb" elementFormDefault="qualified" attributeFormDefault="unqualified" xdb:storeVarrayAsTable="true" xdb:mapStringToNCHAR="true" xdb:mapUnboundedStringToLob="true">
I made the following change -
<xs:element name="Data" xdb:maintainDOM="false" xdb:SQLName="LOB_DATA" xdb:SQLType="CLOB"/>
I received this error -
ORA-31094: incompatible SQL type "CLOB" for attribute or element "Data"
ORA-06512: at "XML_TEST.XML_TEST_SCHEMA_REGISTER", line 48
I did a little more testing after the first post. I used the same type as an element that is defining image data.
<xs:element name="Data" type="xs:base64Binary" xdb:maintainDOM="false" xdb:SQLName="LOB_DATA" xdb:SQLType="BLOB"/>
While this did register and I was able to load a record, I am guessing that this will render the data for this element useless, but at least the record loads. I'll still need to resolve this issue, as the .pdf data is important.
Thanks
Joe -
ORA-22813: operand value exceeds system limits when generation XML
Hi All,
We are using Oracle 11GR2 database and I am trying to generate XML Files using SQL/XML Functions.
I am at the end of development, and while testing I am facing this issue: ORA-22813: operand value exceeds system limits.
SELECT XMLSERIALIZE(DOCUMENT DATA AS CLOB) AS DATA FROM (
SELECT
XMLELEMENT (
"Region_Data",
XMLAGG (
XMLFOREST (
R.region as "Region_Name",
R.first_name||R.last_name as "EmployeeFullName",
R.ntlogin as "EmployeeAlias",
R.job_title as "EmployeeRole",
R.sap_number as "SAPNumber",
R.sales_transaction_dt AS "Day",
R.region AS "RegionName",
R.postpaid_totalqty AS "PostpaidCount",
R.postpaid_totaldollars AS "PostpaidAmount",
R.postpaidfeature_totalqty AS "PostpaidFeatureCount",
R.postpaidfeature_totaldollar AS "PostpaidFeatureAmount",
R.prepaid_totalqty AS "PrepaidCount",
R.prepaid_totaldollars AS "PrepaidAmount" ,
R.prepaidfeature_totalqty AS "PrepaidFeatureCount",
R.prepaidfeature_totaldollars AS "PrepaidFeatureAmount",
R.accessory_totalqty AS "AccessoriesCount",
R.accessory_totaldollars AS "AccessoriesAmount",
R.handset_totalqty AS "HandsetsCount",
R.handset_totaldollars AS "HandsetsAmount",
(SELECT XMLAGG (
XMLELEMENT (
"Division",
XMLFOREST (
di.division AS "DivisonName",
di.postpaid_totalqty AS "PostpaidCount",
di.postpaid_totaldollars AS "PostpaidAmount",
di.postpaidfeature_totalqty AS "PostpaidFeatureCount",
di.postpaidfeature_totaldollar AS "PostpaidFeatureAmount",
di.prepaid_totalqty AS "PrepaidCount",
di.prepaid_totaldollars AS "PrepaidAmount" ,
di.prepaidfeature_totalqty AS "PrepaidFeatureCount",
di.prepaidfeature_totaldollars AS "PrepaidFeatureAmount",
di.accessory_totalqty AS "AccessoriesCount",
di.accessory_totaldollars AS "AccessoriesAmount",
di.handset_totalqty AS "HandsetsCount",
di.handset_totaldollars AS "HandsetsAmount",
(SELECT XMLAGG (
XMLELEMENT (
"District",
XMLFOREST (
dis.district AS "DistrictName",
dis.postpaid_totalqty AS "PostpaidCount",
dis.postpaid_totaldollars AS "PostpaidAmount",
dis.postpaidfeature_totalqty AS "PostpaidFeatureCount",
dis.postpaidfeature_totaldollar AS "PostpaidFeatureAmount",
dis.prepaid_totalqty AS "PrepaidCount",
dis.prepaid_totaldollars AS "PrepaidAmount" ,
dis.prepaidfeature_totalqty AS "PrepaidFeatureCount",
dis.prepaidfeature_totaldollars AS "PrepaidFeatureAmount",
dis.accessory_totalqty AS "AccessoriesCount",
dis.accessory_totaldollars AS "AccessoriesAmount",
dis.handset_totalqty AS "HandsetsCount",
dis.handset_totaldollars AS "HandsetsAmount",
(SELECT XMLAGG (
XMLELEMENT (
"Store",
XMLFOREST (
mst.store_id AS "StoreNumber",
mst.store_name AS "StoreLocation",
mst.postpaid_totaldollars AS "PostpaidAmount",
mst.postpaidfeature_totalqty AS "PostpaidFeatureCount",
mst.postpaidfeature_totaldollar AS "PostpaidFeatureAmount",
mst.prepaid_totalqty AS "PrepaidCount",
mst.prepaid_totaldollars AS "PrepaidAmount" ,
mst.prepaidfeature_totalqty AS "PrepaidFeatureCount",
mst.prepaidfeature_totaldollars AS "PrepaidFeatureAmount",
mst.accessory_totalqty AS "AccessoriesCount",
mst.accessory_totaldollars AS "AccessoriesAmount",
mst.handset_totalqty AS "HandsetsCount",
mst.handset_totaldollars AS "HandsetsAmount"
FROM stores_comm_mobility_info_vw mst
WHERE mst.district = dis.district
) "Store_Data")))
FROM diST_comm_mobility_info_vw dis
WHERE dis.division = di.division
) "District_Data")))
FROM div_comm_mobility_info_vw di
WHERE di.region = r.region
) AS "Division_Data"))) AS DATA
FROM reg_comm_mobility_info_vw R GROUP BY region)
This is working fine for conditions where there is less amount of data, but when there is more data this query is failing.
I do not know what to do now. Is there any way around this limit, or do I need some other mechanism to generate the XML files?
The challenge is we need to generate XML Files and send the XML Data to an Interface which will use this data to display in a cell phone.
I am really frustrated now, as I am getting this error when testing with a large amount of data.
Appreciate if anyone can help me out ASAP.
(The XML below is what I am trying to generate.)
<REGION>
<Region_Data>
<Region_Name>Southwest</Region_Name>
<EmployeeFullName>AllisonAndersen</EmployeeFullName>
<EmployeeAlias>AANDERS60</EmployeeAlias>
<EmployeeRole>District Manager, Retail Sales</EmployeeRole>
<SAPNumber>P12466658</SAPNumber>
<Day>JAN</Day>
<RegionName>Southwest</RegionName>
<PostpaidCount>52</PostpaidCount>
<PostpaidAmount>1579.58</PostpaidAmount>
<PostpaidFeatureCount>296</PostpaidFeatureCount>
<PostpaidFeatureAmount>4174.19</PostpaidFeatureAmount>
<AccessoriesCount>394</AccessoriesCount>
<AccessoriesAmount>45213.87</AccessoriesAmount>
<Division_Data>
<Division>
<DivisonName>Southern California</DivisonName>
<PostpaidCount>52</PostpaidCount>
<PostpaidAmount>1579.58</PostpaidAmount>
<PostpaidFeatureCount>296</PostpaidFeatureCount>
<PostpaidFeatureAmount>4174.19</PostpaidFeatureAmount>
<AccessoriesCount>394</AccessoriesCount>
<AccessoriesAmount>45213.87</AccessoriesAmount>
<District_Data>
<District>
<DistrictName>Orange County West</DistrictName>
<PostpaidCount>52</PostpaidCount>
<PostpaidAmount>1579.58</PostpaidAmount>
<PostpaidFeatureCount>296</PostpaidFeatureCount>
<PostpaidFeatureAmount>4174.19</PostpaidFeatureAmount>
<AccessoriesCount>394</AccessoriesCount>
<AccessoriesAmount>45213.87</AccessoriesAmount>
<Store_Data>
<Store>
<StoreNumber>9551</StoreNumber>
<StoreLocation>TM - BROOKHURST &amp; WARNER</StoreLocation>
<PostpaidAmount>10</PostpaidAmount>
<PostpaidFeatureCount>22</PostpaidFeatureCount>
<PostpaidFeatureAmount>319.89</PostpaidFeatureAmount>
<AccessoriesCount>27</AccessoriesCount>
<AccessoriesAmount>4330</AccessoriesAmount>
</Store>
</Store_Data>
</District>
</District_Data>
</Division>
</Division_Data>
</Region_Data>
</REGION>
Thanks,
Madhu K.
You didn't give any feedback in your previous thread.
Did you try the approach suggested here in {message:id=10998557}, instead of using nested inline subqueries ? -
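As a rough illustration of the pre-aggregation approach suggested for the ORA-22813 XMLAGG query above: aggregate each child level with GROUP BY in an inline view, then join it to the parent, instead of using correlated scalar subqueries. A two-level sketch only; the view and column names are taken from the post, but this is untested against the real schema:

```sql
SELECT XMLSERIALIZE(DOCUMENT
         XMLELEMENT("Region_Data",
           XMLAGG(XMLELEMENT("Region",
                    XMLFOREST(r.region AS "Region_Name"),
                    d.division_xml)))
         AS CLOB) AS data
FROM   reg_comm_mobility_info_vw r
LEFT JOIN (SELECT di.region,
                  XMLAGG(XMLELEMENT("Division",
                           XMLFOREST(di.division AS "DivisonName"))) AS division_xml
           FROM   div_comm_mobility_info_vw di
           GROUP  BY di.region) d
       ON d.region = r.region;
```

The same pattern extends to the District and Store levels by nesting further pre-aggregated inline views.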
Lax validation errors on schema import ('version' exceeds maximum length)
I have a schema as per below. I'm trying to import it into Oracle 10.2.0.2.0. However, I'm getting the following lax validation error:
Error loading ora_business_rule.xsd:ORA-30951: Element or attribute at Xpath /schema[@version] exceeds maximum length
I can fix it by modifying the attribute and shortening it, but I'd like to know why it's occurring. Insofar as I can tell, there is no imposed limit on the size of schema attributes according to the W3C standard. Which then makes me wonder: does Oracle impose limits on the length of all attributes, or is this specific to 'version'? If there is a limit, what is the upper bound (in bytes)? Where is this documented?
Cheers,
Daniel
<?xml version="1.0" encoding="UTF-8"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:br="http://foo.com/BusinessRule_PSG_V001" targetNamespace="http://foo.com/BusinessRule_PSG_V001" elementFormDefault="qualified" attributeFormDefault="unqualified" version="last committed on $LastChangedDate: 2006-05-19 11:00:52 +1000 (Fri, 19 May 2006) $">
<xs:element name="edit">
<xs:complexType>
<xs:sequence>
<xs:element name="edit_id" type="xs:string"/>
<xs:element ref="br:business_rule"/>
</xs:sequence>
</xs:complexType>
</xs:element>
<xs:element name="derivation">
<xs:complexType>
<xs:sequence>
<xs:element name="derivation_id" type="xs:string"/>
<xs:element ref="br:derivation_type"/>
<xs:element ref="br:business_rule"/>
</xs:sequence>
</xs:complexType>
</xs:element>
<xs:element name="derivation_type">
<xs:simpleType>
<xs:restriction base="xs:NMTOKENS">
<xs:enumeration value="complex"/>
<xs:enumeration value="format"/>
<xs:enumeration value="formula"/>
<xs:enumeration value="recode"/>
<xs:enumeration value="SAS code"/>
<xs:enumeration value="transfer"/>
<xs:enumeration value="count"/>
<xs:enumeration value="sum"/>
<xs:enumeration value="max"/>
<xs:enumeration value="min"/>
</xs:restriction>
</xs:simpleType>
</xs:element>
<xs:element name="business_rule"></xs:element>
</xs:schema>
Oops, sorry. It's a decision we took when looking at Version.
When we registered the Schema for Schemas during XDB bootstrap, the Version attribute was mapped to VARCHAR2(12).
SQL> desc xdb.xdb$schema_T
Name Null? Type
SCHEMA_URL VARCHAR2(700)
TARGET_NAMESPACE VARCHAR2(2000)
VERSION VARCHAR2(12)
NUM_PROPS NUMBER(38)
FINAL_DEFAULT XDB.XDB$DERIVATIONCHOICE
BLOCK_DEFAULT XDB.XDB$DERIVATIONCHOICE
ELEMENT_FORM_DFLT XDB.XDB$FORMCHOICE
ATTRIBUTE_FORM_DFLT XDB.XDB$FORMCHOICE
ELEMENTS XDB.XDB$XMLTYPE_REF_LIST_T
SIMPLE_TYPE XDB.XDB$XMLTYPE_REF_LIST_T
COMPLEX_TYPES XDB.XDB$XMLTYPE_REF_LIST_T
ATTRIBUTES XDB.XDB$XMLTYPE_REF_LIST_T
IMPORTS XDB.XDB$IMPORT_LIST_T
INCLUDES XDB.XDB$INCLUDE_LIST_T
FLAGS RAW(4)
SYS_XDBPD$ XDB.XDB$RAW_LIST_T
ANNOTATIONS XDB.XDB$ANNOTATION_LIST_T
MAP_TO_NCHAR RAW(1)
MAP_TO_LOB RAW(1)
GROUPS XDB.XDB$XMLTYPE_REF_LIST_T
ATTRGROUPS XDB.XDB$XMLTYPE_REF_LIST_T
ID VARCHAR2(256)
VARRAY_AS_TAB RAW(1)
SCHEMA_OWNER VARCHAR2(30)
NOTATIONS XDB.XDB$NOTATION_LIST_T
LANG VARCHAR2(4000)
SQL> -
SDO_ORDINATES.X.Field in data file exceeds maximum length
Hi All,
While loading data from a .SHP file into Oracle Spatial through the SHP2SDO tool, the following error message appears:
Error message:
Record 54284: Rejected - Error on table GEO_PARCEL_CENTROID, column CENTROID_GEOM.SDO_ORDINATES.X.
Field in data file exceeds maximum length.
I read somewhere that this is because SQL*Loader defaults a column to 255 characters, but I am not sure how to change the column size in the control file, because it is an object data type. I am not sure whether this is correct or not.
The control file show as below:
LOAD DATA
INFILE geo_parcel_centroid.dat
TRUNCATE
CONTINUEIF NEXT(1:1) = '#'
INTO TABLE GEO_PARCEL_CENTROID
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS (
CENTROID_ID INTEGER EXTERNAL,
APN_NUMBER NULLIF APN_NUMBER = BLANKS,
PROPERTY_A NULLIF PROPERTY_A = BLANKS,
PROPERTY_C NULLIF PROPERTY_C = BLANKS,
OWNER_NAME NULLIF OWNER_NAME = BLANKS,
THOMAS_GRI NULLIF THOMAS_GRI = BLANKS,
MAIL_ADDRE NULLIF MAIL_ADDRE = BLANKS,
MAIL_CITY_ NULLIF MAIL_CITY_ = BLANKS,
MSLINK,
MAPID,
GMRotation,
CENTROID_GEOM COLUMN OBJECT
(
SDO_GTYPE INTEGER EXTERNAL,
SDO_ELEM_INFO VARRAY TERMINATED BY '|/'
(X FLOAT EXTERNAL),
SDO_ORDINATES VARRAY TERMINATED BY '|/'
(X FLOAT EXTERNAL)
)
)
Any help on this would appreciate.
Thanks,
[email protected]
Hi,
Looks like you have a problem with record 61 in your data file. Can you please post it in a reply?
Regards
Ivan -
SQL Loader - Field in data file exceeds maximum length
Dear All,
I have a file which has more than 4000 characters in a field, and I wish to load the data into a table with field length = 4000, but I receive the error:
Field in data file exceeds maximum length
The scripts and control file are given below.
Table creation script:
CREATE TABLE "TEST_TAB"
(
"STR" VARCHAR2(4000 BYTE),
"STR2" VARCHAR2(4000 BYTE),
"STR3" VARCHAR2(4000 BYTE)
);
Control file:
LOAD DATA
INFILE 'C:\table_export.txt'
APPEND INTO TABLE TEST_TAB
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
( STR CHAR(4000) "SUBSTR(:STR,1,4000)" ,
STR2 CHAR(4000) "SUBSTR(:STR2,1,4000)" ,
STR3 CHAR(4000) "SUBSTR(:STR3,1,4000)"
)
Log:
SQL*Loader: Release 10.2.0.1.0 - Production on Mon Jul 26 16:06:25 2010
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: C:\TEST_TAB.CTL
Data File: C:\table_export.txt
Bad File: C:\TEST_TAB.BAD
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 0
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table TEST_TAB, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
STR FIRST 4000 | CHARACTER
SQL string for column : "SUBSTR(:STR,1,4000)"
STR2 NEXT 4000 | CHARACTER
SQL string for column : "SUBSTR(:STR2,1,4000)"
STR3 NEXT 4000 | CHARACTER
SQL string for column : "SUBSTR(:STR3,1,4000)"
value used for ROWS parameter changed from 64 to 21
Record 1: Rejected - Error on table TEST_TAB, column STR.
Field in data file exceeds maximum length
MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
Table TEST_TAB:
0 Rows successfully loaded.
1 Row not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 252126 bytes(21 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 1
Total logical records rejected: 1
Total logical records discarded: 0
Run began on Mon Jul 26 16:06:25 2010
Run ended on Mon Jul 26 16:06:25 2010
Elapsed time was: 00:00:00.22
CPU time was: 00:00:00.15
Please suggest a way to get this done.
Thanks for reading the post!
*009*
Hi Toni,
Thanks for the reply.
Do you mean this?
CREATE TABLE "TEST"."TEST_TAB"
(
"STR" VARCHAR2(4001),
"STR2" VARCHAR2(4001),
"STR3" VARCHAR2(4001)
);
However, this does not work; the error would be:
Error at Command Line:8 Column:20
Error report:
SQL Error: ORA-00910: specified length too long for its datatype
00910. 00000 - "specified length too long for its datatype"
*Cause: for datatypes CHAR and RAW, the length specified was > 2000;
otherwise, the length specified was > 4000.
*Action: use a shorter length or switch to a datatype permitting a
longer length such as a VARCHAR2, LONG CHAR, or LONG RAW
*009*
Edited by: 009 on Jul 28, 2010 6:15 AM -
Loader- Field in data file exceeds maximum length
Hi,
I am getting an error while loading the data, even though the data size of this column is less than 4000 and I defined the column as: OBJ_ADDN_INFO CLOB
Please help
==================
Record 1: Rejected - Error on table APPS.CG_COMPARATIVE_MATRIX_TAB, column OBJ_ADDN_INFO.
Field in data file exceeds maximum length
LOAD DATA
infile *
REPLACE
INTO TABLE APPS.CG_COMPARATIVE_MATRIX_TAB
FIELDS TERMINATED BY ","
OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
( APPS_VERSION,
MODULE_SHORT_NAME,
CATEGORY,
MODULE,
OBJECT_NAME,
OBJECT_TYPE,
OBJECT_STATUS,
FUNCTION_NAME,
OBJ_ADDN_INFO
begindata
"12",DBI,Oracle Daily Business Intelligence,DBI for Depot Repair,ISC_DEPOT_RO_INIT,PROGRAM,Changed,"Initial Load - Update Depot Repair Order Base Summary","The ISC_DR_REPAIR_ORDERS_F fact has a new column FLOW_SATUS_ID. The FLOW_STATUS_ID contains a user-defined Status for a Repair Order. The STATUS Column will continue to store the Status, now called State of the Repair Order i.e. O , C , D , H . The Initial Load incorporates the additional column FLOW_STATUS_ID. The Incremental Load s merge statement is modified to collect or update the additional column FLOW_STATUS_ID also. ","DBI for Depot Repair"
"12",DBI,Oracle Daily Business Intelligence,DBI for Depot Repair,ISC_DEPOT_RO_INCR,PROGRAM,Changed,"Update Depot Repair Orders Base Summary","The ISC_DR_REPAIR_ORDERS_F fact has a new column FLOW_SATUS_ID. The FLOW_STATUS_ID contains a user-defined Status for a Repair Order. The STATUS Column will continue to store the Status, now called State of the Repair Order i.e. O , C , D , H . The Initial Load incorporates the additional column FLOW_STATUS_ID. The Incremental Load s merge statement is modified to collect or update the additional column FLOW_STATUS_ID also. ","DBI for Depot Repair"
If you don't specify a data type for a data field in the SQL*Loader control file, SQL*Loader assumes the data type is CHAR(255). If you have data that is larger than that, then you can't rely on the default. Try changing the control file to:
LOAD DATA
infile *
REPLACE
INTO TABLE APPS.CG_COMPARATIVE_MATRIX_TAB
FIELDS TERMINATED BY ","
OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
( APPS_VERSION,
MODULE_SHORT_NAME,
CATEGORY,
MODULE,
OBJECT_NAME,
OBJECT_TYPE,
OBJECT_STATUS,
FUNCTION_NAME,
OBJ_ADDN_INFO char(4000)
) -
Operand value exceeds system limits in sdo_aggr_mbr
Hi. I'm trying to get the MBR of a fairly large geometry (1429 vertices) and have run into a strange problem:
when i:
select sdo_aggr_mbr(shape)
from FEEDER_LINES_SDO
where subname = 'OCEANO';
I get what I expect:
SDO_GEOMETRY(2003, 82212, NULL, SDO_ELEM_INFO_ARRAY(1,1003, 3),SDO_ORDINATE_ARRAY(712103.736,3876977.34, 733591.744, 3896557.18))
however when i try to get the subname in my query as well:
select subname ,sdo_aggr_mbr(shape)
from FEEDER_LINES_SDO
where subname = 'OCEANO'
group by subname;
i get
ERROR at line 1:
ORA-22813: operand value exceeds system limits
The query fails with "ORA-00937: not a single-group group function" when i leave out the group by clause
i can get around it with a kludge, but would like to know why the group by fails
the kludge:
select subname,min(t.x) minx, min(t.y) miny, max(t.x) maxx, max(t.y) maxy from
FEEDER_LINES_SDO c,
TABLE(SDO_UTIL.GETVERTICES(c.shape)) t
where subname = 'OCEANO'
group by subname;
SUBNAME MINX MINY MAXX MAXY
OCEANO 712103.736 3876977.34 733591.744 3896557.18
where minx(), miny() etc are variations on:
function minx (geom_in mdsys.sdo_geometry)
return number DETERMINISTIC IS
begin
return sdo_geom.sdo_min_mbr_ordinate(geom_in,1);
end;
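The miny, maxx, and maxy helpers referred to above ("etc are variations") are not shown in the post; presumably they just swap the min/max call and the ordinate position. A sketch of what they would look like (hypothetical, not from the original post):

```sql
-- Hypothetical companions to minx(); SDO_GEOM.SDO_MIN_MBR_ORDINATE and
-- SDO_GEOM.SDO_MAX_MBR_ORDINATE take the ordinate position (1 = x, 2 = y).
function miny (geom_in mdsys.sdo_geometry)
return number DETERMINISTIC IS
begin
return sdo_geom.sdo_min_mbr_ordinate(geom_in,2);
end;

function maxx (geom_in mdsys.sdo_geometry)
return number DETERMINISTIC IS
begin
return sdo_geom.sdo_max_mbr_ordinate(geom_in,1);
end;

function maxy (geom_in mdsys.sdo_geometry)
return number DETERMINISTIC IS
begin
return sdo_geom.sdo_max_mbr_ordinate(geom_in,2);
end;
```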
the group by expression seems to work fine on geometries with less than 1200 vertices. Is there a system parameter i can change?
elvis{44}% oerr ora 22813
22813, 00000, "operand value exceeds system limits"
// *Cause: Object or Collection value was too large. The size of the value
// might have exceeded 30k in a SORT context, or the size might be
// too big for available memory.
// *Action: Choose another value and retry the operation.
i am running oracle 9.2.0.1 on solaris8
any insight on this will be greatly appreciated
cheers
--kassim
Hi Kassim,
At KMS I recently ran into the same ORA-22813, when running this cursor SQL
CURSOR lcur_montage IS
select m.mont_id, m.sys_PK, m.krtp_id, m.mont_geom, m.til_dato_id , m.forloeb
from MTK_montage m
where m.fra_dato_id = in_dato_id
and m.krtp_id = 1
order by m.mont_id;
Omitting the order by clause makes it work fine. If I alternatively omit the SDO_geometry m.mont_geom as a select item, the query also works.
Our problem seems to arise when trying to sort selected rows, which contain large objects such as SDO_geometry.
Yesterday we played around with SORT_AREA_SIZE, but to no avail. It turns out to be a known bug.
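Until a patch is in place, one workaround is to keep the SDO_GEOMETRY out of the sort altogether: sort only a scalar key, then fetch the wide row per key. A sketch (assuming mont_id uniquely identifies a row, and in_dato_id bound as in the original cursor):

```sql
-- Sort only the key column, then fetch the geometry row by row,
-- so no SDO_GEOMETRY ever passes through the SORT step.
declare
cursor c_keys is
select m.mont_id
from MTK_montage m
where m.fra_dato_id = in_dato_id
and m.krtp_id = 1
order by m.mont_id;
l_row MTK_montage%rowtype;
begin
for r in c_keys loop
select * into l_row from MTK_montage m where m.mont_id = r.mont_id;
-- process l_row.mont_geom here
end loop;
end;
```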
When I today search for ORA-22813 in MetaLink, the first list item is
1.
9.2.0.X Patch Sets - List of Bug Fixes by Problem Type
Type: Note
Doc ID: 217194.1
Score: 63%
Modified Date: 18-FEB-2003
Status: PUBLISHED
Platform: Generic issue
Product: Oracle Server - Enterprise Edition
which unfortunately will not open and reveal its content.
On the other hand trying MetaLink -> Bugs -> search for 'ORA-22813' gives amongst others Bug 2656107, which looks a lot like my problem.
For Oracle eyes: - when will this bug be fixed? Does it solve the problem at hand?
- regards
Jens Ole Jensen
Kort & MatrikelStyrelsen (WWW: http://www.kms.dk)
Danmark
version: (32 bit) Oracle9i Enterprise Edition Release 9.2.0.2.0 - Production on Sun/SunOS 5.8 (64 bit) -
Variable length field exceeds maximum length
Hi All,
I am trying to load some signature ASCII data from a load file, so I wrote the code below in my control file to load it into the database. SIGN_IMAGE is defined in the Oracle DB as RAW(2000). The following worked fine when I tried it in a Windows and Oracle 8i environment.
SIG_TYPE POSITION(23:23) CHAR,
SIGN_IMAGE POSITION(24:1977) VARRAW(2000)
NULLIF SIGN_IMAGE=BLANKS,
SIGN_IMAGE1 POSITION(1978:3930) VARRAW(2000)
NULLIF SIGN_IMAGE1=BLANKS
But when I ported the same thing to a Solaris and Oracle 10g environment, the same code gives an error when SQL*Loader loads it.
The error is: Variable length field exceeds maximum length.
But here I am giving a length of only 1954, including the 2-byte length prefix of the string.
Could anyone tell me what exactly the problem is? I am not able to sort out the issue.
Thanks,
Shashi
Hi,
I am getting the error in an Oracle 10.2.0 and SunOS 5.10 environment. This case executed fine on Oracle 8.1 and Windows XP Professional. Please find the details below.
LOG file :
Column Name Position Len Term Encl Datatype
TRANS_NO 1:15 15 CHARACTER
TDR_ID 16:18 3 CHARACTER
DVCE_TYPE 19:20 2 CHARACTER
CAP_CD 21:22 2 CHARACTER
SIG_TYPE 23:23 1 CHARACTER
SIGN_IMAGE 24:1977 2002 VARRAW
NULL if SIGN_IMAGE = BLANKS
SIGN_IMAGE1 1978:3930 2002 VARRAW
NULL if SIGN_IMAGE1 = BLANKS
SIGN_IMAGE2 3931:5883 2002 VARRAW
NULL if SIGN_IMAGE2 = BLANKS
SIGN_IMAGE3 5884 2002 VARRAW
NULL if SIGN_IMAGE3 = BLANKS
value used for ROWS parameter changed from 64 to 31
Record 1: Rejected - Error on table SIGSCH.SIGNATURE, column SIGN_IMAGE.
Variable length field exceeds maximum length.
Record 2: Rejected - Error on table SIGSCH.SIGNATURE, column SIGN_IMAGE.
Variable length field exceeds maximum length.
Record 3: Rejected - Error on table SIGSCH.SIGNATURE, column SIGN_IMAGE.
Variable length field exceeds maximum length.
Record 4: Rejected - Error on table SIGSCH.SIGNATURE, column SIGN_IMAGE.
Variable length field exceeds maximum length.
Control file:
LOAD DATA
INFILE 'sigc.sig'
BADFILE 'load_7.bad'
DISCARDFILE 'load_7.dis'
APPEND
INTO TABLE sigsch.signature
( TRANS_NO POSITION(1:15) CHAR,
TDR_ID POSITION(16:18) INTEGER EXTERNAL,
DVCE_TYPE POSITION(19:20) CHAR,
CAP_CD POSITION(21:22) CHAR,
SIG_TYPE POSITION(23:23) CHAR,
SIGN_IMAGE POSITION(24:1977) VARRAW(2000)
NULLIF SIGN_IMAGE=BLANKS,
SIGN_IMAGE1 POSITION(1978:3930) VARRAW(2000)
NULLIF SIGN_IMAGE1=BLANKS,
SIGN_IMAGE2 POSITION(3931:5883) VARRAW(2000)
NULLIF SIGN_IMAGE2=BLANKS,
SIGN_IMAGE3 POSITION(5884) VARRAW(2000)
NULLIF SIGN_IMAGE3=BLANKS
)
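One thing worth checking when the same file loads on Windows but not on SPARC Solaris: VARRAW's 2-byte binary length prefix is read in the platform's native byte order, so a length written little-endian on x86 can be misread as a huge value on big-endian SPARC, tripping the "exceeds maximum length" check. If that is the cause, SQL*Loader's BYTEORDER clause (9i and later) can declare the file's byte order; a sketch, assuming the data file was produced on the Windows box:

```sql
LOAD DATA
BYTEORDER LITTLE ENDIAN
INFILE 'sigc.sig'
-- ... rest of the control file unchanged ...
```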
DATA line
(binary signature record follows; the raw VARRAW bytes are not reproducible as printable text here) -
Error : Summary String length exceeds maximum
I just got an error while trying to Update a User on my IDM instance.
com.waveset.util.InternalError : Summary String length (3003) exceeds maximum (2048)
I already checked this forum for similar issues, and I found a couple of threads. However, for some reason, I was unable to VIEW those threads (something about not having the rights/access to read those threads)
So, I am posting my issue here.
Any help will be greatly appreciated.
Roman,
if you are using NETBEANS, this is what you do :
(a) right-click on the name of your instance (in the Projects Tab)
(b) select IDM
(c) then select "Download Objects"
(d) From the list that appears, choose "Common Configuration Object Types"
(e) Next, choose "Repository Configuration"
Somewhere in this file, there is a clause which states: "Maximum Summary = '2048'"
Change this value to something higher (but NOT too high!)
This should solve your problem.