Document name exceeds allowed length
Hello all,
We are getting a "Document name exceeds allowed length" error when we try to approve a shopping cart. There are no attachments in the SC and no apparent reason for this error. We had already created the SC and the approval workflow had started; now, when we check the document, this error occurs.
Do you have any idea where to start checking?
Best regards
Hi
The error should not be raised when no attachment exists; it is normally raised when the attachment URL exceeds 255 characters.
However, perhaps you need to apply note [1269204|https://service.sap.com/sap/support/notes/1269204]. This may be the issue in your system.
Regards,
Jason
Edited by: Jason Boggans on Apr 19, 2010 10:15 AM
Similar Messages
-
Upgrade error "exceeded maximum allowed length (134217728 bytes)"
Hi:
I'm trying to add a patch to the OEM patch cache, and I'm getting an error:
Error: - Failed to Upload File.Uploaded file of length 545451768 bytes exceeded maximum allowed length (134217728 bytes)
The patch file is what it is (545451768 bytes). How do I install this patch? I'm trying to upgrade a 10.1.0.3 DB to 10.1.0.4
Thanks
This log is for external users.
Did you deploy a Lync Edge Server?
The Edge Server rejects the authentication request, and redirects the Lync 2010 client to the Lync Web Services (https://lyncexternal.contoso.com/CertProv/CertProvisioningService.svc).
It seems the redirect fails; please check the event viewer on the Lync Edge Server.
-
On load, getting error: Field in data file exceeds maximum length
Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
PL/SQL Release 11.2.0.3.0 - Production
CORE 11.2.0.3.0 Production
TNS for Solaris: Version 11.2.0.3.0 - Production
NLSRTL Version 11.2.0.3.0 - Production
I'm trying to load a table, small in size (110 rows, 6 columns). One of the columns, called NOTES, errors when I run the load. It says that the column size exceeds the maximum limit. As you can see here, the table column is set to 4000 bytes:
CREATE TABLE NRIS.NRN_REPORT_NOTES (
NOTES_CN VARCHAR2(40 BYTE) DEFAULT sys_guid() NOT NULL,
REPORT_GROUP VARCHAR2(100 BYTE) NOT NULL,
AREACODE VARCHAR2(50 BYTE) NOT NULL,
ROUND NUMBER(3) NOT NULL,
NOTES VARCHAR2(4000 BYTE),
LAST_UPDATE TIMESTAMP(6) WITH TIME ZONE DEFAULT systimestamp NOT NULL
)
TABLESPACE USERS
RESULT_CACHE (MODE DEFAULT)
PCTUSED 0
PCTFREE 10
INITRANS 1
MAXTRANS 255
STORAGE (
INITIAL 80K
NEXT 1M
MINEXTENTS 1
MAXEXTENTS UNLIMITED
PCTINCREASE 0
BUFFER_POOL DEFAULT
FLASH_CACHE DEFAULT
CELL_FLASH_CACHE DEFAULT
)
LOGGING
NOCOMPRESS
NOCACHE
NOPARALLEL
MONITORING;
I did a little investigating, and it doesn't add up.
When I run
select max(lengthb(notes)) from NRIS.NRN_REPORT_NOTES
I get a return of
643
That tells me that the largest instance of that column is only 643 bytes. But EVERY insert is failing.
Here is the loader file header, and first couple of inserts:
LOAD DATA
INFILE *
BADFILE './NRIS.NRN_REPORT_NOTES.BAD'
DISCARDFILE './NRIS.NRN_REPORT_NOTES.DSC'
APPEND INTO TABLE NRIS.NRN_REPORT_NOTES
Fields terminated by ";" Optionally enclosed by '|'
(
NOTES_CN,
REPORT_GROUP,
AREACODE,
ROUND NULLIF (ROUND="NULL"),
NOTES,
LAST_UPDATE TIMESTAMP WITH TIME ZONE "MM/DD/YYYY HH24:MI:SS.FF9 TZR" NULLIF (LAST_UPDATE="NULL")
)
BEGINDATA
|E2ACF256F01F46A7E0440003BA0F14C2|;|DEMOGRAPHICS|;|A01003|;3;|Demographic results show that 46 percent of visits are made by females. Among racial and ethnic minorities, the most commonly encountered are Native American (4%) and Hispanic / Latino (2%). The age distribution shows that the Bitterroot has a relatively small proportion of children under age 16 (14%) in the visiting population. People over the age of 60 account for about 22% of visits. Most of the visitation is from the local area. More than 85% of visits come from people who live within 50 miles.|;07/29/2013 16:09:27.000000000 -06:00
|E2ACF256F02046A7E0440003BA0F14C2|;|VISIT DESCRIPTION|;|A01003|;3;|Most visits to the Bitterroot are fairly short. Over half of the visits last less than 3 hours. The median length of visit to overnight sites is about 43 hours, or about 2 days. The average Wilderness visit lasts only about 6 hours, although more than half of those visits are shorter than 3 hours long. Most visits come from people who are fairly frequent visitors. Over thirty percent are made by people who visit between 40 and 100 times per year. Another 8 percent of visits are from people who report visiting more than 100 times per year.|;07/29/2013 16:09:27.000000000 -06:00
|E2ACF256F02146A7E0440003BA0F14C2|;|ACTIVITIES|;|A01003|;3;|The most frequently reported primary activity is hiking/walking (42%), followed by downhill skiing (12%), and hunting (8%). Over half of the visits report participating in relaxing and viewing scenery.|;07/29/2013 16:09:27.000000000 -06:00
Here is the full beginning of the loader log, ending after the first rejected row. (They ALL show the same error.)
SQL*Loader: Release 10.2.0.4.0 - Production on Thu Aug 22 12:09:07 2013
Copyright (c) 1982, 2007, Oracle. All rights reserved.
Control File: NRIS.NRN_REPORT_NOTES.ctl
Data File: NRIS.NRN_REPORT_NOTES.ctl
Bad File: ./NRIS.NRN_REPORT_NOTES.BAD
Discard File: ./NRIS.NRN_REPORT_NOTES.DSC
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table NRIS.NRN_REPORT_NOTES, loaded from every logical record.
Insert option in effect for this table: APPEND
Column Name Position Len Term Encl Datatype
NOTES_CN FIRST * ; O(|) CHARACTER
REPORT_GROUP NEXT * ; O(|) CHARACTER
AREACODE NEXT * ; O(|) CHARACTER
ROUND NEXT * ; O(|) CHARACTER
NULL if ROUND = 0X4e554c4c(character 'NULL')
NOTES NEXT * ; O(|) CHARACTER
LAST_UPDATE NEXT * ; O(|) DATETIME MM/DD/YYYY HH24:MI:SS.FF9 TZR
NULL if LAST_UPDATE = 0X4e554c4c(character 'NULL')
Record 1: Rejected - Error on table NRIS.NRN_REPORT_NOTES, column NOTES.
Field in data file exceeds maximum length...
I am not seeing why this would be failing.
Hi,
The problem is that delimited data defaults to CHAR(255)... very helpful, I know.
What you need to do is tell SQL*Loader that the data is longer than this.
So change NOTES to NOTES CHAR(4000) in your control file and it should work.
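As a sketch of that fix against the control file posted above (untested; only the NOTES field changes), the explicit length tells SQL*Loader to allocate a bigger field buffer than its 255-byte default:

```sql
LOAD DATA
INFILE *
BADFILE './NRIS.NRN_REPORT_NOTES.BAD'
DISCARDFILE './NRIS.NRN_REPORT_NOTES.DSC'
APPEND INTO TABLE NRIS.NRN_REPORT_NOTES
Fields terminated by ";" Optionally enclosed by '|'
(
NOTES_CN,
REPORT_GROUP,
AREACODE,
ROUND NULLIF (ROUND="NULL"),
NOTES CHAR(4000),  -- explicit length; without it SQL*Loader assumes CHAR(255)
LAST_UPDATE TIMESTAMP WITH TIME ZONE "MM/DD/YYYY HH24:MI:SS.FF9 TZR" NULLIF (LAST_UPDATE="NULL")
)
```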
cheers,
harry -
SQL Loader - Field in data file exceeds maximum length
Dear All,
I have a file which has more than 4000 characters in a field, and I wish to load the data into a table with field length = 4000, but I receive the error:
Field in data file exceeds maximum length
The scripts and ctl file are given below.
Table creation script:
CREATE TABLE "TEST_TAB" (
"STR" VARCHAR2(4000 BYTE),
"STR2" VARCHAR2(4000 BYTE),
"STR3" VARCHAR2(4000 BYTE)
);
Control file:
LOAD DATA
INFILE 'C:\table_export.txt'
APPEND INTO TABLE TEST_TAB
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
( STR CHAR(4000) "SUBSTR(:STR,1,4000)" ,
STR2 CHAR(4000) "SUBSTR(:STR2,1,4000)" ,
STR3 CHAR(4000) "SUBSTR(:STR3,1,4000)"
)
Log:
SQL*Loader: Release 10.2.0.1.0 - Production on Mon Jul 26 16:06:25 2010
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: C:\TEST_TAB.CTL
Data File: C:\table_export.txt
Bad File: C:\TEST_TAB.BAD
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 0
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table TEST_TAB, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
STR FIRST 4000 | CHARACTER
SQL string for column : "SUBSTR(:STR,1,4000)"
STR2 NEXT 4000 | CHARACTER
SQL string for column : "SUBSTR(:STR2,1,4000)"
STR3 NEXT 4000 | CHARACTER
SQL string for column : "SUBSTR(:STR3,1,4000)"
value used for ROWS parameter changed from 64 to 21
Record 1: Rejected - Error on table TEST_TAB, column STR.
Field in data file exceeds maximum length
MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
Table TEST_TAB:
0 Rows successfully loaded.
1 Row not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 252126 bytes(21 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 1
Total logical records rejected: 1
Total logical records discarded: 0
Run began on Mon Jul 26 16:06:25 2010
Run ended on Mon Jul 26 16:06:25 2010
Elapsed time was: 00:00:00.22
CPU time was: 00:00:00.15
Please suggest a way to get it done.
Thanks for reading the post!
*009*
Hi Toni,
Thanks for the reply.
Do you mean this?
CREATE TABLE "TEST"."TEST_TAB" (
"STR" VARCHAR2(4001),
"STR2" VARCHAR2(4001),
"STR3" VARCHAR2(4001)
);
However, this does not work, as the error would be:
Error at Command Line:8 Column:20
Error report:
SQL Error: ORA-00910: specified length too long for its datatype
00910. 00000 - "specified length too long for its datatype"
*Cause: for datatypes CHAR and RAW, the length specified was > 2000;
otherwise, the length specified was > 4000.
*Action: use a shorter length or switch to a datatype permitting a
longer length such as a VARCHAR2, LONG CHAR, or LONG RAW
*009*
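Since a VARCHAR2 column cannot go past 4000 bytes here, the usual SQL*Loader approach (a sketch, untested) is to widen only the field buffer in the control file so the oversized value can be read, and let the SUBSTR truncate it to what the column can hold:

```sql
LOAD DATA
INFILE 'C:\table_export.txt'
APPEND INTO TABLE TEST_TAB
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
( STR  CHAR(10000) "SUBSTR(:STR,1,4000)",   -- buffer larger than the longest raw field
  STR2 CHAR(10000) "SUBSTR(:STR2,1,4000)",
  STR3 CHAR(10000) "SUBSTR(:STR3,1,4000)"
)
```

The 10000 here is an assumed upper bound on the raw field length; it only needs to exceed the longest value in the data file.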
Edited by: 009 on Jul 28, 2010 6:15 AM -
SQL Loader - data exceeds maximum length
I am having an issue with SQL*Loader falsely reporting that a column is too long in a CSV upload file. The offending column, Notes, is defined in the staging table as VARCHAR2(1000). The text in the Notes column of the upload file for the record being rejected is only 237 characters long. I examined the raw data file with a hex editor and there are no special characters embedded in the column. The CSV upload was recreated, but the false error remains.
Any ideas what to check? Any suggestion appreciated.
Here are the pertinent files.
Control File:
{code}
LOAD DATA
INFILE 'Mfield_Upl.dat'
BADFILE 'Mfield_Upl.bad'
TRUNCATE
INTO TABLE Mfield_UPL_Staging
FIELDS TERMINATED BY ',' optionally enclosed by '"'
(
ControlNo CHAR,
PatientID CHAR,
CollectDate DATE "MM/DD/YYYY",
TestDate DATE "MM/DD/YYYY",
AnalyteDesc CHAR,
Results CHAR,
HiLoFlag CHAR,
LoRange CHAR,
HiRange CHAR,
UnitOfMeas CHAR,
Comments CHAR,
Notes CHAR,
ClinicalEvent CHAR,
OwnerLName CHAR,
OwnerFName CHAR,
PetName CHAR,
AssecNo CHAR,
SpecimenID CHAR
)
{code}
Staging Table:
{code}
CREATE TABLE Mfield_UPL_Staging
(
ControlNo VARCHAR2(20),
PatientID VARCHAR2(9),
CollectDate DATE,
TestDate DATE,
AnalyteDesc VARCHAR2(100),
Results VARCHAR2(100),
HiLoFlag CHAR(10),
LoRange VARCHAR2(15),
HIRange VARCHAR2(15),
UnitOfMeas VARCHAR2(25),
Comments VARCHAR2(100),
Notes VARCHAR2(1000),
ClinicalEvent VARCHAR2(20),
OwnerLName VARCHAR(50),
OwnerFName VARCHAR(50),
PetName VARCHAR(50),
AssecNo NUMBER(10),
SpecimenID NUMBER(10)
)
{code}
Error Log File:
{code}
SQL*Loader: Release 9.2.0.1.0 - Production on Wed Aug 11 08:22:58 2010
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Control File: Mfield_UPL_CSV.ctl
Data File: Mfield_UPL.dat
Bad File: Mfield_Upl.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table MFIELD_UPL_STAGING, loaded from every logical record.
Insert option in effect for this table: TRUNCATE
Column Name Position Len Term Encl Datatype
CONTROLNO FIRST * , O(") CHARACTER
PATIENTID NEXT * , O(") CHARACTER
COLLECTDATE NEXT * , O(") DATE MM/DD/YYYY
TESTDATE NEXT * , O(") DATE MM/DD/YYYY
ANALYTEDESC NEXT * , O(") CHARACTER
RESULTS NEXT * , O(") CHARACTER
HILOFLAG NEXT * , O(") CHARACTER
LORANGE NEXT * , O(") CHARACTER
HIRANGE NEXT * , O(") CHARACTER
UNITOFMEAS NEXT * , O(") CHARACTER
COMMENTS NEXT * , O(") CHARACTER
NOTES NEXT * , O(") CHARACTER
CLINICALEVENT NEXT * , O(") CHARACTER
OWNERLNAME NEXT * , O(") CHARACTER
OWNERFNAME NEXT * , O(") CHARACTER
PETNAME NEXT * , O(") CHARACTER
ASSECNO NEXT * , O(") CHARACTER
SPECIMENID NEXT * , O(") CHARACTER
Record 1042: Rejected - Error on table MFIELD_UPL_STAGING, column NOTES.
Field in data file exceeds maximum length
Table MFIELD_UPL_STAGING:
3777 Rows successfully loaded.
1 Row not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
{code}
Try:
-- Etc ...
Notes CHAR(1000),
-- Etc ...
SQL*Loader limits character fields to 255 bytes unless a longer length is specified.
:p -
ORA-30951: ...exceeds maximum length
Oracle Database 10g Release 10.2.0.1.0 - Production
I am new to XML and having a problem importing data using the XML repository. I have annotated the schema and validated it using XML Spy. I am able to register the schema without errors. I am now working through the issues as they occur during the insertion of XML documents. The section below gives me an error (bottom) that the data exceeds the maximum length. The "DATA" in the XML doc is a PDF file that has been converted to characters by some method. The Size element has a data value of 5008, which seems to be too long for a VARCHAR2. I tried RAW, CLOB, and BLOB; I was pretty sure they wouldn't work, and they didn't. I get an error that the XML/XDB types are incompatible.
How can I modify the schema to get this element to load?
Is it possible to tell Oracle to ignore an element so I can eliminate those that are not critical? That would be very helpful.
Thanks!
Schema -
<xs:element name="NpdbReportList" minOccurs="0">
<xs:complexType>
<xs:sequence maxOccurs="unbounded">
<xs:element name="NpdbReport" minOccurs="0">
<xs:complexType>
<xs:all minOccurs="0">
<xs:element name="DCN" minOccurs="0"/>
<xs:element name="DateReport" type="Date" minOccurs="0"/>
<xs:element name="ReportType" type="IdRef" minOccurs="0"/>
<xs:element ref="LOB" minOccurs="0"/>
</xs:all>
</xs:complexType>
</xs:element>
</xs:sequence>
</xs:complexType>
</xs:element>
<xs:complexType name="LOB" xdb:SQLType="LOB_T" xdb:maintainDOM="false">
<xs:all>
<xs:element name="Type" type="IdRef"/>
<xs:element name="Size"/>
<xs:element name="Data" xdb:maintainDOM="false" xdb:SQLName="LOB_DATA" xdb:SQLType="VARCHAR2"/>
</xs:all>
</xs:complexType>
ftp> mput *
mput Smyth_Steven_1386367.XML? y
227 Entering Passive Mode (127,0,0,1,83,221)
150 ASCII Data Connection
550- Error Response
ORA-30951: Element or attribute at Xpath /Provider/NpdbReportList/NpdbReport[1]/LOB/Data exceeds maximum length
550 End Error Response
28706 bytes sent in 0.014 seconds (1.9e+03 Kbytes/s)
ftp>
Thanks for your time, Marco.
Here is the header:
<?xml version="1.0" encoding="UTF-8"?>
<!-- edited with XMLSpy v2010 rel. 2 (http://www.altova.com) by Joe (DSI) -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:xdb="http://xmlns.oracle.com/xdb" elementFormDefault="qualified" attributeFormDefault="unqualified" xdb:storeVarrayAsTable="true" xdb:mapStringToNCHAR="true" xdb:mapUnboundedStringToLob="true">
I made the following change -
<xs:element name="Data" xdb:maintainDOM="false" xdb:SQLName="LOB_DATA" xdb:SQLType="CLOB"/>
I received this error -
ORA-31094: incompatible SQL type "CLOB" for attribute or element "Data"
ORA-06512: at "XML_TEST.XML_TEST_SCHEMA_REGISTER", line 48
I did a little more testing after the first post. I used the same type as an element that is defining image data.
<xs:element name="Data" type="xs:base64Binary" xdb:maintainDOM="false" xdb:SQLName="LOB_DATA" xdb:SQLType="BLOB"/>
While this did register and I was able to load a record, I am guessing that it will render the data for this element useless, but at least the record loads. I'll still need to resolve this issue, as the .pdf data is important.
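For character content above the VARCHAR2 limit, the annotation that usually registers cleanly (an assumption here, not confirmed in this thread) keeps the element as an XML string type and maps its storage to a CLOB; ORA-31094 tends to appear when xdb:SQLType="CLOB" is combined with a missing or non-string XML type:

```xml
<xs:element name="Data" type="xs:string"
            xdb:maintainDOM="false" xdb:SQLName="LOB_DATA" xdb:SQLType="CLOB"/>
```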
Thanks
Joe -
This problem began when I typed in an address (http, www, etc.), making one mistake in the address. I corrected it and then tried to copy the address because it was long (I wanted to be able to access it another time), and that's when the trouble started. I just copy documents to get a new one because it's quicker, but it is unorthodox, I know. Since then, whenever I copy a document I cannot open it, and it often puts a gobbledegook ending on the document name. I've started emailing texts to myself so as to keep them, but I MUST be able to use documents normally.
I have a MacBook Pro, using OS X Mountain Lion 10.8.5. I don't like it. It's fairly new, and has messed up my system of documents: the size is all wrong and the tabulations are worse than ever. I use TextEdit because I wanted to use Mac dictation, because I am handicapped (arthritis and fibromyalgia), but I have never been able to get it to work. Since I changed OS I cannot use TextEdit properly any more.
Please help me. I think it must be a bug that came through when I was putting this address in four days ago. I went to the site once I'd corrected the address and it was perfectly normal. I shall ring them today and ask them if this has happened to anyone else. I live in France. Thank you for your advice. Because I am handicapped and I do not drive, it is very difficult to get to a Mac store. I deleted iCloud documents because I don't want to use it and I thought it might solve the problem, but it didn't.
Back up all data.
This procedure will unlock all your user files (not system files) and reset their ownership and access-control lists to the default. If you've set special values for those attributes on any of your files, they will be reverted. In that case, either stop here, or be prepared to recreate the settings if necessary. Do so only after verifying that those settings didn't cause the problem. If none of this is meaningful to you, you don't need to worry about it.
I've tested these instructions only with the Safari web browser. If you use another browser, they may not work as described.
Step 1
If you have more than one user account, and the one in question is not an administrator account, then temporarily promote it to administrator status in the Users & Groups preference pane. To do that, unlock the preference pane using the credentials of an administrator, check the box marked Allow user to administer this computer, then reboot. You can demote the problem account back to standard status when this step has been completed.
Triple-click anywhere in the following line on this page to select it. Copy the selected text to the Clipboard (command-C):
{ sudo chflags -R nouchg,nouappnd ~ $TMPDIR.. ; sudo chown -R $UID:staff ~ $_ ; sudo chmod -R u+rwX ~ $_ ; chmod -R -N ~ $_ ; } 2> /dev/null
Launch the Terminal application in any of the following ways:
☞ Enter the first few letters of its name into a Spotlight search. Select it in the results (it should be at the top.)
☞ In the Finder, select Go ▹ Utilities from the menu bar, or press the key combination shift-command-U. The application is in the folder that opens.
☞ Open LaunchPad. Click Utilities, then Terminal in the icon grid.
Paste into the Terminal window (command-V). You'll be prompted for your login password. Nothing will be displayed when you type it. You may get a one-time warning to be careful. If you don’t have a login password, you’ll need to set one before you can run the command. If you see a message that your username "is not in the sudoers file," then you're not logged in as an administrator.
The command will take a noticeable amount of time to run. Wait for a new line ending in a dollar sign (“$”) to appear, then quit Terminal.
Step 2 (optional)
Take this step only if you have trouble with Step 1 or if it doesn't solve the problem.
Boot into Recovery. When the OS X Utilities screen appears, select
Utilities ▹ Terminal
from the menu bar. A Terminal window will open.
In the Terminal window, type this:
res
Press the tab key. The partial command you typed will automatically be completed to this:
resetpassword
Press return. A Reset Password window will open. You’re not going to reset a password.
Select your boot volume ("Macintosh HD," unless you gave it a different name) if not already selected.
Select your username from the menu labeled Select the user account if not already selected.
Under Reset Home Directory Permissions and ACLs, click the Reset button.
Select
▹ Restart
from the menu bar. -
Unable to rename document name on ItemAdded event programmatically
Hi,
I have developed an event receiver on the ItemAdded event, basically to update the document name after the document gets uploaded to a SharePoint library.
I have written the code below, but it fails at the item.Update() call. I referred to this
MSDN blog.
public override void ItemAdded(SPItemEventProperties properties)
{
    base.ItemAdded(properties);
    try
    {
        SPListItem item = properties.ListItem;
        string draftAuthor = item["Author"].ToString();
        int index = draftAuthor.IndexOf("#");
        string finalAuthor = draftAuthor.Substring(index + 1, draftAuthor.Length - index - 1);
        string updatedTitle = "My Dashboard- " + finalAuthor;
        item.File.CheckOut();
        item["Name"] = updatedTitle;
        item.Update(); // Failing here
        item.File.CheckIn("File has been renamed");
    }
    catch (Exception ex)
    {
        string error = "Error in event overriding : " + ex.ToString();
        myLog.Source = "My Dashboard Extension";
        myLog.WriteEntry(error, EventLogEntryType.Error, 4700);
    }
}
I am getting the error below:
Error in event overriding : Microsoft.SharePoint.SPException: Invalid data has been used to update the list item. The field you are trying to update may be read only. ---> System.Runtime.InteropServices.COMException (0x80020005): <nativehr>0x80020005</nativehr><nativestack></nativestack>Invalid
data has been used to update the list item. The field you are trying to update may be read only.
Please suggest how I can make this piece of code work.
Thanks !!!
Hi,
We should use the MoveTo method to rename the file. The following code snippet is for your reference.
base.ItemAdded(properties);
properties.ListItem["Title"] = properties.ListItem.File.Name;
properties.ListItem.Update();
SPFile file = properties.ListItem.File;
// Get the extension of the file. We need that later.
string spfileExt = new FileInfo(file.Name).Extension;
// Rename the file to the list item's ID and use the file extension to keep
// that part of it intact.
file.MoveTo(properties.ListItem.ParentList.RootFolder.Url +
"/" + properties.ListItem["ID"] + spfileExt);
// Commit the move.
file.Update();
Thanks & Regards,
Jason
Jason Guo
TechNet Community Support -
Hi All,
I am trying to load XML files into a table using SQL*Loader and I am getting the error:
Record 1: Rejected - Error on table COMMONASSETCATALOG.
ORA-30951: Element or attribute at Xpath /AC/T[1]/T[1]/T[1]/T[1]/T[1]/Doc[@] exceeds maximum length
The <Doc> element, which is a child of <T>, contains an XML schema inside it.
The Doc element is declared in the schema as:
<xs:complexType name="AsDocType">
<xs:annotation>
<xs:documentation>A (Doc)ument, a container for any type of file</xs:documentation>
</xs:annotation>
<xs:sequence minOccurs="0" maxOccurs="unbounded">
<xs:any namespace="##any" processContents="lax"/>
</xs:sequence>
<xs:attributeGroup ref="AsDocAtts"/>
</xs:complexType>
The size of the XML content in the <Doc> node is around 34 KB.
Could you please let me know how to resolve this?
Thanks
Sateesh
Lax validation errors on schema import ('version' exceeds maximum length)
I have a schema as per below. I'm trying to import it into Oracle 10.2.0.2.0. However, I'm getting the following lax validation error:
Error loading ora_business_rule.xsd:ORA-30951: Element or attribute at Xpath /schema[@version] exceeds maximum length
I can fix it by modifying the attribute and shortening it, but I'd like to know why it's occurring. As far as I can tell, the W3C standard imposes no limit on the size of schema attributes. Which then makes me wonder: does Oracle impose limits on the length of all attributes, or is this specific to 'version'? If there is a limit, what is the upper bound (in bytes)? Where is this documented?
Cheers,
Daniel
<?xml version="1.0" encoding="UTF-8"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:br="http://foo.com/BusinessRule_PSG_V001" targetNamespace="http://foo.com/BusinessRule_PSG_V001" elementFormDefault="qualified" attributeFormDefault="unqualified" version="last committed on $LastChangedDate: 2006-05-19 11:00:52 +1000 (Fri, 19 May 2006) $">
<xs:element name="edit">
<xs:complexType>
<xs:sequence>
<xs:element name="edit_id" type="xs:string"/>
<xs:element ref="br:business_rule"/>
</xs:sequence>
</xs:complexType>
</xs:element>
<xs:element name="derivation">
<xs:complexType>
<xs:sequence>
<xs:element name="derivation_id" type="xs:string"/>
<xs:element ref="br:derivation_type"/>
<xs:element ref="br:business_rule"/>
</xs:sequence>
</xs:complexType>
</xs:element>
<xs:element name="derivation_type">
<xs:simpleType>
<xs:restriction base="xs:NMTOKENS">
<xs:enumeration value="complex"/>
<xs:enumeration value="format"/>
<xs:enumeration value="formula"/>
<xs:enumeration value="recode"/>
<xs:enumeration value="SAS code"/>
<xs:enumeration value="transfer"/>
<xs:enumeration value="count"/>
<xs:enumeration value="sum"/>
<xs:enumeration value="max"/>
<xs:enumeration value="min"/>
</xs:restriction>
</xs:simpleType>
</xs:element>
<xs:element name="business_rule"></xs:element>
</xs:schema>
Oops, sorry, it's a decision we took when looking at Version.
When we registered the Schema for Schemas during XDB bootstrap, the Version attribute was mapped to VARCHAR2(12).
SQL> desc xdb.xdb$schema_T
Name Null? Type
SCHEMA_URL VARCHAR2(700)
TARGET_NAMESPACE VARCHAR2(2000)
VERSION VARCHAR2(12)
NUM_PROPS NUMBER(38)
FINAL_DEFAULT XDB.XDB$DERIVATIONCHOICE
BLOCK_DEFAULT XDB.XDB$DERIVATIONCHOICE
ELEMENT_FORM_DFLT XDB.XDB$FORMCHOICE
ATTRIBUTE_FORM_DFLT XDB.XDB$FORMCHOICE
ELEMENTS XDB.XDB$XMLTYPE_REF_LIST_T
SIMPLE_TYPE XDB.XDB$XMLTYPE_REF_LIST_T
COMPLEX_TYPES XDB.XDB$XMLTYPE_REF_LIST_T
ATTRIBUTES XDB.XDB$XMLTYPE_REF_LIST_T
IMPORTS XDB.XDB$IMPORT_LIST_T
INCLUDES XDB.XDB$INCLUDE_LIST_T
FLAGS RAW(4)
SYS_XDBPD$ XDB.XDB$RAW_LIST_T
ANNOTATIONS XDB.XDB$ANNOTATION_LIST_T
MAP_TO_NCHAR RAW(1)
MAP_TO_LOB RAW(1)
GROUPS XDB.XDB$XMLTYPE_REF_LIST_T
ATTRGROUPS XDB.XDB$XMLTYPE_REF_LIST_T
ID VARCHAR2(256)
VARRAY_AS_TAB RAW(1)
SCHEMA_OWNER VARCHAR2(30)
NOTATIONS XDB.XDB$NOTATION_LIST_T
LANG VARCHAR2(4000)
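The 12-byte cap shown in the DESC output can also be read from the data dictionary (a sketch; assumes SELECT access on ALL_TAB_COLUMNS):

```sql
SELECT column_name, data_type, data_length
  FROM all_tab_columns
 WHERE owner = 'XDB'
   AND table_name = 'XDB$SCHEMA_T'
   AND column_name = 'VERSION';
-- data_length should come back as 12, matching the VARCHAR2(12) mapping of @version
```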
SQL> -
Ctxload error DRG-11530: token exceeds maximum length
I downloaded the 11g examples (formerly the companion cd) with the supplied knowledge base (thesauri), unzipped it, installed it, and confirmed that the droldUS.dat file is there. Then I tried to use ctxload to create a default thesaurus, using that file, as per the online documentation. It creates the default thesaurus, but does not load the data, due to the error "DRG-11530: token exceeds maximum length". Apparently one of the terms is too long. But what can I use to edit the file? I tried notepad, but it was too big. I tried wordpad, but it was unreadable. I was able to create a default thesaurus using the much smaller sample thesaurus dr0thsus.txt, so I confirmed that there is nothing wrong with the syntax or privileges. Please see the copy of the run below. Is there a way to edit the droldUS.dat file or a workaround or am I not loading it correctly? Does the .dat file need to be loaded differently than the .txt file?
CTXSYS@orcl_11g> select banner from v$version
2 /
BANNER
Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
PL/SQL Release 11.1.0.6.0 - Production
CORE 11.1.0.6.0 Production
TNS for 32-bit Windows: Version 11.1.0.6.0 - Production
NLSRTL Version 11.1.0.6.0 - Production
CTXSYS@orcl_11g> select count(*) from ctx_thesauri where ths_name = 'DEFAULT'
2 /
COUNT(*)
0
CTXSYS@orcl_11g> select count(*) from ctx_thes_phrases where thp_thesaurus = 'DE
FAULT'
2 /
COUNT(*)
0
CTXSYS@orcl_11g> host ctxload -thes -user ctxsys/ctxsys@orcl -name default -file
C:\app\Barbara\product\11.1.0\db_1\ctx\data\enlx\droldUS.dat
Connecting...
Creating thesaurus default...
Thesaurus default created...
Processing...
DRG-11530: token exceeds maximum length
Disconnected
CTXSYS@orcl_11g> connect ctxsys/ctxsys@orcl
Connected.
CTXSYS@orcl_11g>
CTXSYS@orcl_11g> select count(*) from ctx_thesauri where ths_name = 'DEFAULT'
2 /
COUNT(*)
1
CTXSYS@orcl_11g> select count(*) from ctx_thes_phrases where thp_thesaurus = 'DE
FAULT'
2 /
COUNT(*)
0
CTXSYS@orcl_11g> exec ctx_thes.drop_thesaurus ('default')
PL/SQL procedure successfully completed.
CTXSYS@orcl_11g> host ctxload -thes -user ctxsys/ctxsys@orcl -name default -file
C:\app\Barbara\product\11.1.0\db_1\ctx\sample\thes\dr0thsus.txt
Connecting...
Creating thesaurus default...
Thesaurus default created...
Processing...
1000 lines processed
2000 lines processed
3000 lines processed
4000 lines processed
5000 lines processed
6000 lines processed
7000 lines processed
8000 lines processed
9000 lines processed
10000 lines processed
11000 lines processed
12000 lines processed
13000 lines processed
14000 lines processed
15000 lines processed
16000 lines processed
17000 lines processed
18000 lines processed
19000 lines processed
20000 lines processed
21000 lines processed
21760 lines processed successfully
Beginning insert...21760 lines inserted successfully
Disconnected
CTXSYS@orcl_11g> select count(*) from ctx_thesauri where ths_name = 'DEFAULT'
2 /
COUNT(*)
1
CTXSYS@orcl_11g> select count(*) from ctx_thes_phrases where thp_thesaurus = 'DE
FAULT'
2 /
COUNT(*)
9582
CTXSYS@orcl_11g>
Hi Roger,
Thanks for the response. You are correct. I was confusing the terms thesaurus and knowledge base, which sometimes seem to be used interchangeably or synonymously, but are actually two different things. I read over the various sections of the documentation regarding the supplied knowledge base and supplied thesaurus more carefully and believe I understand now. Apparently, the dr0thsus.txt file that I did ultimately load using ctxload to create a default thesaurus is the supplied thesaurus that is intended to be used to create the default English thesaurus, which supports ctx_thes syn and such. The other droldUS.dat file that I mistakenly tried to load using ctxload is the supplied compiled knowledge base that supports ctx_doc themes and gist and such. In the past I have used ctx_thes.create_thesaurus to create a thesaurus, but using ctxload can also load a thesaurus from a text file with the data in a specified format. Once a thesaurus is loaded using ctxload, it can then be compiled using ctxkbtc to add it to the existing compiled knowledge base. So, the knowledge base is sort of a compilation of thesauri, which is what led to my confusion in terminology. I think I have it all straight in my mind now and hopefully this will help anybody else who searches for the same problem and finds this.
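In command form, the two steps described above look roughly like this (a sketch; the thesaurus name, file name, and credentials are placeholders):

```shell
REM Load a thesaurus from a formatted text file
ctxload -thes -user ctxsys/ctxsys@orcl -name mythes -file mythes.txt

REM Compile the loaded thesaurus into the existing knowledge base
ctxkbtc -user ctxsys/ctxsys@orcl -name mythes
```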
Thanks,
Barbara -
Server Exception : The data exceeds the length
Hi All,
I am updating an MDM record through the API (modifyrecord). I am changing almost 50 fields in the record and then modifying it. I get an error: "the data exceeds the length". How can I know which particular field is causing the error? It just says the data exceeds the length, but the field is not shown in the error. How do I get the field name?
Hi Govind,
I assume you are using ABAP.
Check that the data type you have assigned for each of the fields is correct.
In particular, check the integer fields:
MDM_GDT_INTEGERVALUE only takes data up to 4 bytes; if a value exceeds this limit, you need to use MDM_GDT_DOUBLE.
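As a rough illustration of that 4-byte boundary, here is a small shell check; the function name and sample values are made up for this sketch:

```shell
# Sketch: check whether a value fits a signed 32-bit (4-byte) integer,
# the limit described above for MDM_GDT_INTEGERVALUE.
fits_int32() {
  [ "$1" -ge -2147483648 ] && [ "$1" -le 2147483647 ]
}

for v in 100 3000000000; do
  if fits_int32 "$v"; then
    echo "$v: MDM_GDT_INTEGERVALUE is enough"
  else
    echo "$v: use MDM_GDT_DOUBLE"
  fi
done
```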
Server exception mostly happens when some parameter is not entered correctly.
Regards,
Neethu Joy -
XML data value exceeds maximum length - ORA-30951
Hello,
I am receiving ORA-30951: Element or attribute at Xpath /dataroot/Respondent[1]/Response[3]/Value exceeds maximum length error during the XML load.
I have registered the schema and it works fine when the Value is less than 64K, but it fails when it is greater. I tried changing the type of Value to type="xsd:base64Binary" xdb:maintainDOM="false" xdb:SQLName="LOB_DATA" xdb:SQLType="BLOB", but then I get an ORA-00932: inconsistent datatypes error.
Can someone please let me know what I am doing wrong, or whether there is a way to load elements longer than 64K on 10g?
Thanks
Here is my schema.
var SCHEMAURL varchar2(256)
var XMLSCHEMA CLOB
set define off
begin
:SCHEMAURL := 'http://xmlns.example.com/Svy_Resp.xsd';
:XMLSCHEMA := '<?xml version="1.0" encoding="utf-16"?>
<xsd:schema attributeFormDefault="unqualified" elementFormDefault="qualified" version="1.0" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xdb="http://xmlns.oracle.com/xdb" xdb:storeVarrayAsTable="true">
<xsd:element name="dataroot" xdb:defaultTable="SVY_RESP_XML_SCHEMA" type="datarootType" />
<xsd:complexType name="datarootType" xdb:maintainDOM="false"
xdb:SQLType="Dataroot_T">
<xsd:sequence>
<xsd:element maxOccurs="unbounded" name="Respondent" type="RespondentType" />
</xsd:sequence>
<xsd:attribute name="generated" type="xsd:dateTime" />
</xsd:complexType>
<xsd:complexType name="RespondentType" xdb:maintainDOM="false" xdb:SQLType="Respondent_Type">
<xsd:sequence>
<xsd:element name="RespondentID" type="xsd:int" />
<xsd:element name="KsID" type="xsd:int" />
<xsd:element name="email" type="xsd:string" />
<xsd:element name="SyID" type="xsd:int" />
<xsd:element name="KSuID" type="xsd:int" />
<xsd:element name="Completed" type="xsd:int" />
<xsd:element name="SubmitDateTime" type="xsd:dateTime" />
<xsd:element maxOccurs="unbounded" name="Response" type="ResponseType" />
</xsd:sequence>
</xsd:complexType>
<xsd:complexType name="ResponseType" xdb:maintainDOM="false" xdb:SQLType="Response_Type">
<xsd:sequence>
<xsd:element name="ResponseID" type="xsd:int" />
<xsd:element name="RespondentID" type="xsd:int" />
<xsd:element name="CID" type="xsd:int" />
<xsd:element name="AID" type="xsd:int" />
<xsd:element name="Value" type="xsd:string"/>
<xsd:element name="QID" type="xsd:int" />
<xsd:element name="SID" type="xsd:int" />
</xsd:sequence>
</xsd:complexType>
</xsd:schema>';
end;
/
Thanks for the quick response. I am not able to modify the source file, but would it be possible to set the value to NULL when it exceeds the maximum length, instead of failing the load?
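Since the source file cannot be edited upstream, one workaround is to pre-process the XML before loading and empty out any oversized Value element, which then loads as an empty value rather than failing. A rough sketch, assuming at most one Value element per input line; the 5-character limit is only for demonstration, the real cap discussed above is 64K:

```shell
# Blank out any <Value> element whose text content exceeds a limit.
# Assumes at most one <Value>...</Value> pair per input line.
blank_long_values() {
  awk -v max="$1" '
    {
      if (match($0, /<Value>[^<]*<\/Value>/)) {
        body = substr($0, RSTART + 7, RLENGTH - 15)   # text between the tags
        if (length(body) > max)
          $0 = substr($0, 1, RSTART - 1) "<Value/>" substr($0, RSTART + RLENGTH)
      }
      print
    }'
}

# Example: with a limit of 5 characters the long value is emptied.
printf '%s\n' '<Value>this is far too long</Value>' | blank_long_values 5
```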
Thanks -
Sqlldr - filler columns exceeds maximum length
Hi All,
DB version : 10.2.0.1.0
We are getting a CSV file with 127 fields. We need to load the first 9 fields and the 127th field using SQLLDR. What is the best way to specify this in the control file?
Currently we specify it like this:
C10 filler,
C11 filler,
C127 filler,
column_name
1. Is there a better approach available?
2. We are getting issues when the filler columns exceed the maximum length. We tried:
c10 char(4000) filler,
But it gives a syntax error. What is the workaround for this?
Thanks in advance,
Jeneesh
Please note that using EXTERNAL TABLEs or other methods is not feasible for us.
Hi jeneesh,
Did you try,
c10 filler char(4000)
From the docs:
"The syntax for a filler field is same as that for a column-based field, except that a filler field's name is followed by FILLER."
Best regards
Peter -
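Putting the two posts together, a control file along these lines should parse. The table, file, and column names here are assumptions for illustration, not from the thread:

```
LOAD DATA
INFILE 'input.csv'
APPEND INTO TABLE target_table
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(
  col1, col2, col3, col4, col5, col6, col7, col8, col9,
  c10  FILLER CHAR(4000),
  c11  FILLER CHAR(4000),
  -- ... c12 through c126 declared the same way ...
  c126 FILLER CHAR(4000),
  column_name CHAR(4000)
)
```

The key point from the docs quote above is the ordering: the FILLER keyword follows the field name, and any datatype such as CHAR(4000) follows FILLER.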
I have created several PDF forms using Adobe Designer 7.0 for Windows XP and I want to compile them into a single PDF document using Adobe Acrobat Professional 7.0's "Create PDF from multiple files" option. But when I try, I get a message that "The file (name) is protected. It cannot be used for this command" for each file in turn. When I open the files individually and look at the security settings, I see "Document Assembly: Not Allowed". I was wondering how I could change the settings to allow document assembly. Acrobat Help just tells me to contact the original document author...but I *am* the original document author!
Once a form is made in Designer, it can't be edited in Acrobat. They are two totally different technologies.