Insert text on Column exceeding 4000 bytes

Hi,
I created a table with two columns:
CREATE TABLE A (B NUMBER, C VARCHAR2(4000));
I can't insert text into column C because it exceeds the maximum allowable size.
Please help.
Thanks.

VARCHAR2 columns in Oracle are limited to 4000 bytes. Anything longer needs to be stored as a CLOB. Unfortunately, you can't alter a column directly from VARCHAR2 to CLOB. You would have to do something like:
SQL> alter table a rename column c to d
  2  /
Table altered.
SQL> alter table a add c clob
  2  /
Table altered.
SQL> update a set c = d;
0 rows updated.
SQL> alter table a drop column d;
Table altered.
SQL> desc a
Name                                                                     Null?    Type
B                                                                                 NUMBER
C                                                                                 CLOB
SQL>
So now column C is a CLOB, it retains all the values of the old VARCHAR2 column C, and you can insert values > 4000 bytes:
SQL> insert into a values(1,to_clob(lpad('A',4000,'A')) || lpad('B',4000,'B'));
1 row created.
SQL> select length(c) from a
  2  /
LENGTH(C)
      8000
SQL>
SY.
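As an editorial aside (a minimal sketch, not part of SY's reply): a PL/SQL bind is not subject to the 4000-byte SQL literal limit, so you can also build the long value in a PL/SQL variable and insert it into the new CLOB column directly:
DECLARE
  l_text CLOB;
BEGIN
  -- build a value well past the 4000-byte SQL literal limit
  FOR i IN 1 .. 5 LOOP
    l_text := l_text || RPAD('X', 4000, 'X');
  END LOOP;
  INSERT INTO a (b, c) VALUES (2, l_text);
  COMMIT;
  DBMS_OUTPUT.PUT_LINE('Inserted ' || DBMS_LOB.GETLENGTH(l_text) || ' characters');
END;
/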

Similar Messages

  • How to perform Text length more than 4000 bytes

    I want to read each row from a txt file
    and use the utl_file package to concatenate the contents into a variable.
    The variable datatype is varchar2(4000), so the text length can't be more than 4000 bytes,
    but I want to get the full text from the txt file. How can I do this?

    Thanks! Detlev.
    I have this code:
    PROCEDURE read_file(path IN VARCHAR2, filename IN VARCHAR2, msg IN OUT VARCHAR2) AS
      data_line VARCHAR2(4000);
      ifile     UTL_FILE.FILE_TYPE;
    BEGIN
      ifile := UTL_FILE.FOPEN(path, filename, 'R');
      LOOP
        UTL_FILE.GET_LINE(ifile, data_line);
        msg := msg || RTRIM(data_line);   -- fixed: '||' (the original had '| |')
      END LOOP;
      UTL_FILE.FCLOSE(ifile);             -- not reached; GET_LINE raises NO_DATA_FOUND at end of file
    EXCEPTION
      WHEN NO_DATA_FOUND THEN
        UTL_FILE.FCLOSE(ifile);
    END;
    My questions are:
    1. The msg length can't be more than 4000 bytes.
    2. I use the utl_raw.cast_to_raw function, but it can't cast a varchar2 to raw if the varchar2 is longer than 2048 bytes.
    3. If I want to change the data type from varchar2 to raw,
    the utl_raw.concat function can't be used in a LOOP ... END LOOP clause,
    so I can only use utl_raw.concat(ra1, ra2, ra3, ra4, ra5, ... ra12).
    I want to concatenate all lines from the text file into a raw datatype. How can I do this?
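    A minimal sketch of one way around the limit (an editorial illustration, not from the original thread; the directory object MY_DIR and the file name are assumptions): accumulate the lines into a CLOB variable instead of a VARCHAR2(4000), since a CLOB has no 4000-byte ceiling.
    DECLARE
      l_file UTL_FILE.FILE_TYPE;
      l_line VARCHAR2(32767);
      l_text CLOB;
    BEGIN
      l_file := UTL_FILE.FOPEN('MY_DIR', 'my_file.txt', 'R', 32767);
      BEGIN
        LOOP
          UTL_FILE.GET_LINE(l_file, l_line);
          l_text := l_text || l_line || CHR(10);  -- append each line plus a newline
        END LOOP;
      EXCEPTION
        WHEN NO_DATA_FOUND THEN
          NULL;                                   -- end of file reached
      END;
      UTL_FILE.FCLOSE(l_file);
      DBMS_OUTPUT.PUT_LINE('Read ' || DBMS_LOB.GETLENGTH(l_text) || ' characters');
    END;
    /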

  • How to insert more than 4000 bytes in BLOB column

    Hi all,
    My oracle version is Oracle Database 11g Enterprise Edition Release 11.2.0.3.0.
    I have checked Google and this forum as well, but did not find the answer.
    When inserting less than 4000 bytes, it works without any issues. If I insert more than that, it throws
    an error like this: ORA-01460: unimplemented or unreasonable conversion requested.
    Can anybody guide me on how to do this, or point me to a link?
    Thanks in advance.
    Thanks,
    Pal

    user546710 wrote:
    Hi,
    Thank you very much for your reply.
    I have not worked with BLOB columns before, so I don't know much about using them. Currently, I am using a direct
    insert statement only. It's a normal stored procedure written in a package. I am calling this SP with the other columns and
    with this BLOB column. I am able to insert into the table if the inserted file is less than 4000 bytes. If it is more than that,
    I am getting that problem.
    Thanks,
    Venu
    SQL variables can only hold 4000 bytes.
    PL/SQL variables can hold up to 32767.
    "I am getting that problem" is 100% lacking in actionable detail.
    Post the code & the COMPLETE error message.
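    A minimal PL/SQL sketch of the usual pattern for loading more than 4000 bytes into a BLOB (editorial; T, ID and DOC are made-up names, not the poster's table): insert an empty LOB locator first, then append the data in chunks with DBMS_LOB, which avoids the 4000-byte SQL bind limit entirely.
    DECLARE
      l_blob  BLOB;
      l_chunk RAW(2000);
    BEGIN
      INSERT INTO t (id, doc) VALUES (1, EMPTY_BLOB())
      RETURNING doc INTO l_blob;
      -- append ten 2000-byte chunks = 20000 bytes, well past the 4000-byte limit
      FOR i IN 1 .. 10 LOOP
        l_chunk := UTL_RAW.CAST_TO_RAW(RPAD('A', 2000, 'A'));
        DBMS_LOB.WRITEAPPEND(l_blob, UTL_RAW.LENGTH(l_chunk), l_chunk);
      END LOOP;
      COMMIT;
    END;
    /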

  • ORA-01461 when trying to insert text > 4000 characters into a CLOB column

    I work on a Windows application that connects to databases using ODBC. I'm currently working on adding Oracle support to the application, but I'm running into a lot of problems inserting strings of more than 4000 characters into CLOB columns. First, I tried to do that directly using code similar to the following (setup, cleanup, and error checking snipped for clarity).
    SQLHDBC hDbc;
    SQLHSTMT hStmt;
    unsigned char query[200];
    SQLAllocStmt(hDbc, &hStmt);
    strcpy(query,"INSERT INTO test_table (col_clob, col_int) VALUES ('long string', 1)");
    SQLExecDirect(hStmt,query,strlen(query));
    Of course, "long string" was really a string of more than 4000 characters. When I did that, I got an error indicating that I needed to use bind parameters for strings that long. However, bind parameters don't seem to work either. I changed my code to something like the following:
    SQLHDBC hDbc;
    SQLHSTMT hStmt;
    unsigned char query[200];
    unsigned char *str = NULL;
    int str_size;
    SQLAllocStmt(hDbc, &hStmt);
    strcpy(query,"INSERT INTO test_table (col_clob, col_int) VALUES (:strvar, 1)");
    SQLPrepare(hStmt,query,SQL_NTS);
    str = (unsigned char*)malloc(6000 * sizeof(unsigned char));
    SQLBindParameter(hStmt, 1, SQL_PARAM_INPUT, SQL_C_CHAR, SQL_VARBINARY, 0, 0, str, sizeof(str), &str_size);
    strcpy(str,"long_string");
    str_size = SQL_NTS;
    SQLExecute(hStmt);
    This code works fine when 'long_string' is less than 4000 characters. When it's greater than that, I get the following error:
    [Oracle][ODBC][Ora]ORA-01461: can bind a LONG value only for insert into a LONG column
    I'm using Visual Studio.NET 2003 on Windows XP, and Oracle client 11.1.0.6.0. My server version is as follows:
    SQL> select * from v$version;
    BANNER
    Oracle Database 10g Express Edition Release 10.2.0.1.0 - Product
    PL/SQL Release 10.2.0.1.0 - Production
    CORE    10.2.0.1.0      Production
    TNS for Linux: Version 10.2.0.1.0 - Production
    NLSRTL Version 10.2.0.1.0 - Production
    I thought I read somewhere that this is a known bug with Oracle 10.2.0.1.0, but I can't seem to find the reference now. Can anyone confirm this or provide some assistance with this problem? Frankly, it seems kind of ridiculous that I can't do something as simple as inserting a reasonably sized amount of text into a CLOB column. I'm considering using BFILEs instead, but that would be kind of a pain to fit into our application, so I'd prefer not to do this if at all possible.
    Thanks in advance for any help.

    I found this link referring to a similar problem that was apparently fixed in version 10.2.0.4 of the server: "ORA-01461: can bind a LONG value only for insert into a LONG column". Should I try to upgrade the server and see if that fixes things?
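    One commonly suggested workaround (an editorial sketch under assumptions of my own, not a fix confirmed in this thread) is to do the insert in a stored procedure with a CLOB parameter and have the ODBC client bind the long string to that parameter rather than to an inline SQL statement; the table and column names below are taken from the poster's example.
    CREATE OR REPLACE PROCEDURE insert_test_row (p_text IN CLOB, p_id IN NUMBER) AS
    BEGIN
      INSERT INTO test_table (col_clob, col_int) VALUES (p_text, p_id);
    END insert_test_row;
    /
    The client would then prepare something like {call insert_test_row(?, ?)} and bind the long string to the first parameter; whether this clears ORA-01461 still depends on how the driver binds that parameter.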

  • Error inserting XML records 4000 bytes through Pro*C

    Hi,
    I am seeing the following error while trying to insert XML records > 4000 bytes (Records < 4000 bytes get inserted without any issues). Any help in resolving the issue would be highly appreciated.
    ORA return text: ORA-01461: can bind a LONG value only for insert into a LONG column.
    I am also able to insert records > 4000 bytes using the following query, but I want to insert the records through a C application (using Pro*C) that is not running on the database server.
    INSERT INTO MY_XML_TABLE
    VALUES (XMLType(bfilename('XML_DIR', 'MY_FILE.XML'),
    nls_charset_id('AL32UTF8')));
    Oracle Version
    ===============
    SQL> select * from v$version;
    BANNER
    Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
    PL/SQL Release 11.2.0.2.0 - Production
    CORE 11.2.0.2.0 Production
    TNS for Solaris: Version 11.2.0.2.0 - Production
    NLSRTL Version 11.2.0.2.0 - Production
    Pro*C/C++ version:
    ====================
    Pro*C/C++ RELEASE 11.2.0.0.0 - PRODUCTION
    Schema registration:
    ====================
    begin
    DBMS_XMLSCHEMA.registerSchema (
    SCHEMAURL => 'MY_XML_SCHEMA.xsd',
    SCHEMADOC => bfilename ('ENG_REPORTS', 'MY_XML_SCHEMA.xsd'),
    GENTYPES => FALSE,
    OPTIONS => DBMS_XMLSCHEMA.REGISTER_BINARYXML,
    CSID =>nls_charset_id ('AL32UTF8'));
    end;
    Table creation
    ===============
    CREATE TABLE MY_XML_TABLE (
    MY_XML_RECORD XmlType )
    XMLTYPE MY_XML_RECORD STORE AS BINARY XML
    XMLSCHEMA "MY_XML_SCHEMA.xsd" ELEMENT "MYXMLTAG" ;
    Record Insertion (Pro*C generated code):
    =========================================
    /* EXEC SQL FOR :l_sizeof_array_togo
    insert INTO MY_XML_TABLE
    (MY_XML_RECORD )
    VALUES( XMLTYPE(:l_XML_ptr INDICATOR :l_XML_indicators )); */
    struct sqlexd sqlstm;
    sqlstm.sqlvsn = 12;
    sqlstm.arrsiz = 1;
    sqlstm.sqladtp = &sqladt;
    sqlstm.sqltdsp = &sqltds;
    sqlstm.stmt = "insert into MY_XML_TABLE (MY_XML_RECORD) values (XMLTYPE(:s1\
    :s2 ))";
    sqlstm.iters = (unsigned int )l_sizeof_array_togo;
    sqlstm.offset = (unsigned int )20;
    sqlstm.cud = sqlcud0;
    sqlstm.sqlest = (unsigned char *)&sqlca;
    sqlstm.sqlety = (unsigned short)4352;
    sqlstm.occurs = (unsigned int )0;
    sqlstm.sqhstv[0] = (unsigned char *)&l_XML_ptr->xml_record;
    sqlstm.sqhstl[0] = (unsigned long )8002;
    sqlstm.sqhsts[0] = ( int )sizeof(struct xml_rec_definition);
    sqlstm.sqindv[0] = ( short *)&l_XML_indicators->XML_record_ind;
    sqlstm.sqinds[0] = ( int )sizeof(struct XML_indicator);
    sqlstm.sqharm[0] = (unsigned long )0;
    sqlstm.sqadto[0] = (unsigned short )0;
    sqlstm.sqtdso[0] = (unsigned short )0;
    sqlstm.sqphsv = sqlstm.sqhstv;
    sqlstm.sqphsl = sqlstm.sqhstl;
    sqlstm.sqphss = sqlstm.sqhsts;
    sqlstm.sqpind = sqlstm.sqindv;
    sqlstm.sqpins = sqlstm.sqinds;
    sqlstm.sqparm = sqlstm.sqharm;
    sqlstm.sqparc = sqlstm.sqharc;
    sqlstm.sqpadto = sqlstm.sqadto;
    sqlstm.sqptdso = sqlstm.sqtdso;
    sqlcxt((void **)0, &sqlctx, &sqlstm, &sqlfpn);
    }

    After selecting data from the xmltab table I received just the first line of the xmldata file, i.e.
    <?xml version="1.0" encoding="WINDOWS-12 52"?> <BAROutboundXM L xmlns="http://BARO
    That must be a display issue.
    What client tool are you using, and what version?
    If SQL*Plus, you won't see the whole content unless you set some options :
    {code}
    SET LONG <value>
    SET LONGCHUNKSIZE <value>
    {code}
    Could you try the following?
    {code}
    SET LONG 10000
    SELECT t.object_value.getclobval() FROM xmltab t;
    -- to force pretty-printing :
    SELECT extract(t.object_value, '/*').getclobval() FROM xmltab t;
    {code}
    Edited by: odie_63 on 16 Feb. 2011 08:58

  • Cannot insert varchar2 4000 bytes correctly in asp

    I tried to insert 4000 bytes into a field of varchar2 type
    in an ASP page using bind variables.
    After inserting, I cannot fetch the column correctly. The result was only 2000 bytes.
    In sqlplus I verified that the inserted data was 4000 bytes, but there was a wrong character at the 2001st byte.
    I use oracle 8.1.6.
    This problem is not occurring in JSP.
    Thanks...

    Have you solved your problem yet?
    I think you are experiencing the auto-magical characterset conversions in Windows applications.
    SQL*Plus is still using those non-Unicode Win APIs; on the contrary, your C# application is very probably using the Unicode Win APIs. That means, no matter how you input (Big5 ones, unicode ones, cut-and-paste, .... ), SQL*Plus receives the characters in Big5 form and your C# application receives them in Unicode form. Windows does all the conversions transparently.
    In the SQL*Plus case, the HKSCS characters are in their Big5 UDA/VDA codes. Since both your server and client have NLS_LANG set as ZHT16MSWIN950, these HKSCS Big5 codes are stored in your database intact w/o any conversion. When these characters are displayed, no matter in SQL*Plus or your C# application, they are retrieved in Big5 form and passed to Windows for display. Hence, there is no problem.
    However, when you input these HKSCS characters through your C# application, they are received in Unicode form. I believe that your application is storing these Unicode characters in the DB directly and, therefore, they must be converted to "MSWIN950" form by some Oracle components which lack the full conversion table of Windows. Naturally, the HKSCS characters become "??".
    I am not familiar with C# and Windows APIs so I dunno the exact way to solve the problem. Still, the rule of thumb is to let Windows do the character set conversions and ensure that the converted data are stored in Oracle w/o further conversions. I think your C# application should convert the text data from Unicode to Big5 (under Windows) and then save the Big5 data to the “ZHT16MSWIN950” database in Big5 context. This will probably solve your problem.

  • Problem on inserting text from file to a clob column

    Hi! I'm having a problem inserting text read from a text file into a clob column in my table.
    Here's my code:
    public void addSyllabusOutline(int syllid) {
        CLOB objclob1 = null;
        CLOB objclob2 = null;
        String query = "SELECT outline, projoutline FROM Syllabus WHERE syllabusid = '" + syllid + "' FOR UPDATE";
        try {
            System.out.print("Getting syllabus outline and projoutline clob locator...");
            DBConnection dbconnbean = new DBConnection();
            dbconnbean.openConnection();
            ResultSet rst = dbconnbean.executeQuery(query);
            if (rst.next()) {
                objclob1 = (oracle.sql.CLOB) rst.getClob("outline");
                objclob2 = (oracle.sql.CLOB) rst.getClob("projoutline");
                Writer clobwriter1 = ((oracle.sql.CLOB) objclob1).getCharacterOutputStream();
                String filename1 = "c:\\o1u2t3l4i5n6e7.txt";
                File outlinefile1 = new File(filename1);
                FileReader outlineFileReader1 = new FileReader(outlinefile1);
                char[] cbuffer1 = new char[10 * 1024];
                int nread1 = 0;
                while ((nread1 = outlineFileReader1.read(cbuffer1)) != -1) {
                    clobwriter1.write(cbuffer1, 0, nread1);
                }
                //clobwriter1.flush();
                clobwriter1.close();
                Writer clobwriter2 = ((oracle.sql.CLOB) objclob2).getCharacterOutputStream();
                String filename2 = "c:\\p1r2o3j4o5u6t7l8i9n0e.txt";
                File outlinefile2 = new File(filename2);
                FileReader outlineFileReader2 = new FileReader(outlinefile2);
                char[] cbuffer2 = new char[10 * 1024];
                int nread2 = 0;
                while ((nread2 = outlineFileReader2.read(cbuffer2)) != -1) {
                    clobwriter2.write(cbuffer2, 0, nread2);
                }
                //clobwriter2.flush();
                clobwriter2.close();
            }
            System.out.println("done");
            dbconnbean.closeConnection();
        } catch (Exception e) {
            System.out.println("Error: " + e);
        }
    }
    My error is java.sql.SQLException: fetch out of sequence. I don't have an idea what the problem is. Please help. Thank you very much. ^_^

    Hi,
    Print the whole stack trace. It will tell you the line which causes the error. How you use clob and blob in oracle is also version dependent. Try to google for
    oracle "your version" java clob example.
    /Kaj

  • How to display text in item 4000 bytes in size

    Hi,
    I have a table with a CLOB column, and I am fetching this data and displaying it in a page item. But the limitation of a page item is its size: it can't hold more than 4000 bytes. However, I have some rows in the table with more than 64K bytes.
    Alternatively, if I create a PL/SQL dynamic content region with htp.p, it can hold up to 32K bytes, as varchar2 only supports that limit.
    What will be the solution if I want to display that data on the page?
    Thnx
    -Smith

    Smith
    Try this
      CREATE OR REPLACE PROCEDURE buffer_print(p_text IN CLOB) IS
      BEGIN
        FOR i IN 1..CEIL((LENGTH(p_text))/10)
        LOOP
          htp.prn(SUBSTR(p_text,(i-1)*10+1,10));
        END LOOP;
      END;
    Cheers
    Ben
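    A possible usage sketch for the procedure above (editorial; MY_DOCS, DOC_ID, DOC_TEXT and the page item P1_DOC_ID are made-up names), e.g. as the source of a PL/SQL dynamic content region:
    DECLARE
      l_text CLOB;
    BEGIN
      SELECT doc_text INTO l_text
      FROM   my_docs
      WHERE  doc_id = :P1_DOC_ID;
      buffer_print(l_text);   -- streams the CLOB to the page in small chunks
    END;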

  • Query in 10g DB gives error when size of element exceeds 4000 characters?

    I used the query given in this thread: Need help in reading a _fmb.XML and writing the item properties to a table
    select x1.item_name
         , x1.item_type
         , x2.property
         -- to convert back entities to their character values :
         , utl_i18n.unescape_reference(x2.property_value) as property_value
         -- parent information :
         , x1.parent_item_name
         , x1.parent_item_type
    from xmltable(
           xmlnamespaces(default 'http://xmlns.oracle.com/Forms', 'http://xmlns.oracle.com/Forms' as "def")
         , 'for $i in /Module/descendant::*[@def:Name]
            return element item {
              attribute item_name {data($i/@def:Name)}
            , attribute item_type {local-name($i)}
            , attribute parent_item_name {data($i/parent::*/@def:Name)}
            , attribute parent_item_type {local-name($i/parent::*)}
          , $i
            }'
           passing xmltype(bfilename('TEST_DIR','length_test_fmb.xml'), nls_charset_id('AL32UTF8'))
           columns item_name         varchar2(50) path '@item_name'
                 , item_type         varchar2(50) path '@item_type'
                 , parent_item_name  varchar2(50) path '@parent_item_name'
                 , parent_item_type  varchar2(50) path '@parent_item_type'
                 , item              xmltype      path '.'
         ) x1
       , xmltable(
           xmlnamespaces(default 'http://xmlns.oracle.com/Forms', 'http://xmlns.oracle.com/Forms' as "def")
         , 'for $i in /item/*/attribute::def:*
            let $propname := local-name($i)
            where $propname != "Name"
            return element p {
              element name {$propname}
            , element value {data($i)}
              }'
           passing x1.item
           columns property       varchar2(50)  path 'name'
                 , property_value varchar2(4000) path 'value'
        ) x2
    ;
    This works perfectly, but I tried a form which has a program unit that exceeds 4000 characters, i.e. I converted an fmb to xml and the PROPERTY VALUE exceeds 4000 characters. When you run the query you get the error "function returned value too large".
    I tried changing the VARCHAR2 to CLOB but then you get another error.
    We cannot migrate to 11g yet, so how can this be handled in 10g?
    Any help would be greatly appreciated.
    Edited by: Channa on Oct 25, 2011 6:40 AM

    Hi Channa,
    I should have mentioned it in the previous thread: a sound approach to the overall requirement would be to use object-relational storage for loading XML documents into the database.
    You can read more here : http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14259/xdb03usg.htm#g1055369
    However, that's not possible if we use the DUMP=ALL option to convert Forms files to XML because the generated files do not conform to the Forms XML schema.
    So unless you decide to use DUMP=OVERRIDDEN, you're stuck with the current situation.
    If I remember correctly, the ability to project large strings as CLOB with XMLTable was added in version 10.2.0.4.
    You're not out of options though.
    Here are two, one being "dirtier" than the other...
    1) Divide the property value into multiple chunks of 4000 characters (or less if the db uses a multi-byte character set), then rebuild the string as CLOB in the SELECT clause :
    select x1.item_name
         , x1.item_type
         , x2.property
         , to_clob(utl_i18n.unescape_reference(x2.property_value1)) ||
           to_clob(utl_i18n.unescape_reference(x2.property_value2)) as property_value
         , x1.parent_item_name
         , x1.parent_item_type
    from xmltable(
           xmlnamespaces(default 'http://xmlns.oracle.com/Forms', 'http://xmlns.oracle.com/Forms' as "def")
         , 'for $i in /Module/descendant::*[@def:Name]
            return element item {
              attribute item_name {data($i/@def:Name)}
            , attribute item_type {local-name($i)}
            , attribute parent_item_name {data($i/parent::*/@def:Name)}
            , attribute parent_item_type {local-name($i/parent::*)}
          , $i
            }'
           passing xmltype(bfilename('TEST_DIR','module2.xml'), nls_charset_id('AL32UTF8'))
           columns item_name         varchar2(50) path '@item_name'
                 , item_type         varchar2(50) path '@item_type'
                 , parent_item_name  varchar2(50) path '@parent_item_name'
                 , parent_item_type  varchar2(50) path '@parent_item_type'
                 , item              xmltype      path '.'
         ) x1
       , xmltable(
           xmlnamespaces(default 'http://xmlns.oracle.com/Forms', 'http://xmlns.oracle.com/Forms' as "def")
         , 'for $i in /item/*/attribute::def:*
            let $propname := local-name($i)
            let $propval := data($i)
            where $propname != "Name"
            return element p {
              element name {$propname}
            , element value1 {substring($propval,1,4000)}
             , element value2 {substring($propval,4001,4000)}
               }'
           passing x1.item
           columns property        varchar2(50)   path 'name'
                 , property_value1 varchar2(4000) path 'value1'
                 , property_value2 varchar2(4000) path 'value2'
        ) x2
    ;
    2) Output the property value as a text() node (XMLType datatype) and serialize as CLOB in the SELECT :
    select x1.item_name
         , x1.item_type
         , x2.property
         , dbms_xmlgen.convert(x2.property_value.getclobval(),1) as property_value
         , x1.parent_item_name
         , x1.parent_item_type
    from xmltable(
           xmlnamespaces(default 'http://xmlns.oracle.com/Forms', 'http://xmlns.oracle.com/Forms' as "def")
         , 'for $i in /Module/descendant::*[@def:Name]
            return element item {
              attribute item_name {data($i/@def:Name)}
            , attribute item_type {local-name($i)}
            , attribute parent_item_name {data($i/parent::*/@def:Name)}
            , attribute parent_item_type {local-name($i/parent::*)}
            , $i
           passing xmltype(bfilename('TEST_DIR','module2.xml'), nls_charset_id('AL32UTF8'))
           columns item_name         varchar2(50) path '@item_name'
                 , item_type         varchar2(50) path '@item_type'
                 , parent_item_name  varchar2(50) path '@parent_item_name'
                 , parent_item_type  varchar2(50) path '@parent_item_type'
                 , item              xmltype      path '.'
         ) x1
       , xmltable(
           xmlnamespaces(default 'http://xmlns.oracle.com/Forms', 'http://xmlns.oracle.com/Forms' as "def")
         , 'for $i in /item/*/attribute::def:*
            let $propname := local-name($i)
            where $propname != "Name"
            return element p {
              element name {$propname}
            , element value {data($i)}
              }'
           passing x1.item
           columns property        varchar2(50)   path 'name'
                 , property_value  xmltype        path 'value/text()'
        ) x2
    ;
    Edited by: odie_63 on 25 Oct. 2011 17:03
    Edited by: odie_63 on 25 Oct. 2011 17:04

  • Need help in parsing a VARCHAR2(4000 BYTES) field

    Hi Guys,
    Let me give the DB information first:
    SQL> select * from v$version;
    BANNER
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
    PL/SQL Release 10.2.0.4.0 - Production
    CORE 10.2.0.4.0 Production
    TNS for Solaris: Version 10.2.0.4.0 - Production
    NLSRTL Version 10.2.0.4.0 - Production
    My problem is: The frontend of the application sends a large string into the VARCHAR2 (4000 BYTES) field. The string it sends is somewhat like this -
    <strong>Crit:</strong> Some text. <br><strong>Distinguished</strong> (points 3):Some comment text1. <blockquote> </blockquote><strong>Crit:</strong> Some other text.<br><strong>Distinguished</strong> (points 3):Some comment text2. <blockquote> </blockquote><strong>Crit:</strong> Some more text. <br><strong>Distinguished</strong> (points 3):Some comment text3. <blockquote> </blockquote><strong>Overall comments:</strong><br> Final text!!
    I want to parse the text and put it into separate columns. The number of Crit: entries can be more than 3; it's 3 up there. But the basic structure is the same. What is the best possible way of parsing this in PL/SQL code? I want something like:
    Table 1
    Crit                       Points           Comment
    Some text                3        Some comment text1.
    Some other text        3        Some comment text2.
    Some more text        3        Some comment text3.
    Table 2
    Overall comments
    Final text!!
    Please let me know if you need further information.
    Thanks.
    Edited by: 794684 on Sep 14, 2010 4:15 AM
    Edited by: 794684 on Sep 14, 2010 4:38 AM
    Edited by: 794684 on Sep 14, 2010 4:53 AM
    Edited by: 794684 on Sep 14, 2010 6:42 AM

    Welcome to the forum.
    Looks like noformat tags are not working. Please use the {noformat}{noformat} tag if you want to post formatted examples/code.
    For example, when you type:
    {noformat}select *
    from dual;
    {noformat}
    it will appear as:
    select *
    from dual;
    when you post it on this forum...
    The FAQ will explain the other options you have: http://forums.oracle.com/forums/help.jspa
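    As an editorial aside, a hedged sketch of one way to split such a value with plain SQL (MY_TABLE, ID and BIG_TEXT are made-up names; only the Crit rows of "Table 1" are handled, and the "Overall comments" part could be cut out with a similar SUBSTR/INSTR on the original string; the non-greedy .*? needs 10gR2 or later, which matches the 10.2.0.4 version quoted above):
    SELECT TRIM(SUBSTR(blk, INSTR(blk, 'Crit:</strong>') + 14,
                            INSTR(blk, '<br>') - INSTR(blk, 'Crit:</strong>') - 14))  AS crit
         , TO_NUMBER(SUBSTR(blk, INSTR(blk, '(points ') + 8,
                                 INSTR(blk, '):') - INSTR(blk, '(points ') - 8))      AS points
         , TRIM(SUBSTR(blk, INSTR(blk, '):') + 2,
                            INSTR(blk, '<blockquote>') - INSTR(blk, '):') - 2))       AS comment_text
    FROM (
           -- one row per <strong>Crit:</strong> ... <blockquote> block of a single source row
           SELECT REGEXP_SUBSTR(s.big_text,
                                '<strong>Crit:</strong>.*?<blockquote>', 1, LEVEL) AS blk
           FROM  (SELECT big_text FROM my_table WHERE id = 1) s
           CONNECT BY LEVEL <=
                      (LENGTH(s.big_text) - LENGTH(REPLACE(s.big_text, '<strong>Crit:')))
                      / LENGTH('<strong>Crit:')
         );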

  • Size for XMLType only 4000 Bytes???

    Hello,
    it seems that the XMLType can only store up to 4000 bytes of
    data.
    When I enter a document larger than 4000 bytes the end is
    truncated and the "extract"-method doesn't work anymore.
    I have to store sizes of about 30-50 Kilobytes.
    So, is there a way to set the size of the XMLType?
    Has anyone else this problem?
    TIA
    Alex

    Hello,
    First, I want to thank you for helping me.
    So, I've tried to create my table without the schema and the problem is still the same.
    The index I create is a context index.
    I create my table with the following query:
    create table artefact(numArt number, art XMLType)
    xmltype column art XMLSCHEMA "http://www.oracle.com/artefact.xsd" element "exportList";
    CREATE SEQUENCE numArt START WITH 1 INCREMENT BY 1 NOMAXVALUE NOCYCLE CACHE 2;
    create index ind on artefact(art)
    indextype is ctxsys.context;
    My schema seems to be OK because it is validated by XMLSpy, and when I use "little" data, everything is OK.
    Here is the way I register my schema:
    begin
         dbms_xmlschema.deleteSchema('http://www.oracle.com/artefact.xsd',4);
    end;
    declare
    doc varchar2(10000) := '
    <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xdb="http://xmlns.oracle.com/xdb">
         <xsd:element name="exportList" type="TypeExportList"/>
         <xsd:complexType name = "TypeExportList">
              <xsd:sequence>
                   <xsd:element name = "request" type="TypeRequest"/>
              </xsd:sequence>
         </xsd:complexType>
         <xsd:complexType name="TypeRequest">
              <xsd:sequence>
                   <xsd:element name="artifact" type="TypeArtefact"/>
                   <xsd:element name="target" type="TypeTarget"/>
                   <xsd:element name="keyword" type="TypeKeyWord"/>
              </xsd:sequence>
              <xsd:attribute name = "targetTool" type="xsd:string" use="required"/>
              <xsd:attribute name = "targetVersion" type = "xsd:string" use="optional"/>
              <xsd:attribute name = "sourceTool" type = "xsd:string" use="optional"/>
              <xsd:attribute name = "cmpa" type = "xsd:string" use="required"/>
              <xsd:attribute name = "command" type = "xsd:string" use="required"/>
              <xsd:attribute name="readOnly" type = "xsd:string" use="required"/>
         </xsd:complexType>
         <xsd:complexType name="TypeArtefact">
              <xsd:sequence>
                   <xsd:element name= "object" type = "TypeObject"/>
              </xsd:sequence>
                   <xsd:attribute name = "projectName" type="xsd:string" use="required"/>
         </xsd:complexType>
         <xsd:complexType name="TypeObject">
                   <xsd:attribute name = "objectName" type = "xsd:string" use="required"/>
                   <xsd:attribute name = "objectType" type = "xsd:string" use="required"/>
         </xsd:complexType>
         <xsd:complexType name="TypeTarget">
                   <xsd:attribute name = "targetIP" type="xsd:string" use="required"/>
         </xsd:complexType>
         <xsd:complexType name="TypeKeyWord">
                   <xsd:attribute name = "word" xdb:SQLName="word" xdb:SQLType="CLOB"/>
         </xsd:complexType>
    </xsd:schema>';
    begin
    dbms_xmlschema.registerSchema('http://www.oracle.com/artefact.xsd',doc);
    end;
    Then, I insert the value as a clob (because the XML documents are more than 4000 characters).
    And then I search it with the query:
    select numArt from artefact a
    where contains(a.art,'test inpath(exportList/request/keyword/@word)')>0;
    And if the attribute word is more than 4000 characters, the word 'test' is not found...

  • Is there any way to store data larger than 4000 bytes?

    Hi,
    Does someone here know which data type to use to store more than 4000 bytes in a database column?
    VARCHAR2 has a limit of 4000 bytes in a database column. Is there a way to store data larger than 4000 bytes?
    Thank you very much!

    CLOB
    You use the CLOB datatype to store large blocks of character data in the database, in-line or out-of-line. Both fixed-width and variable-width character sets are supported. Every CLOB variable stores a locator, which points to a large block of character data. The size of a CLOB cannot exceed four gigabytes.
    CLOBs participate fully in transactions, are recoverable, and can be replicated. Changes made by package DBMS_LOB can be committed or rolled back. CLOB locators can span transactions (for reads only), but they cannot span sessions.
    Joel Pérez
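    A minimal sketch (editorial; DOCUMENTS, DOC_ID and DOC_TEXT are made-up names): declare the column as CLOB and insert values past 4000 bytes by concatenating CLOB pieces, as shown at the top of this page.
    CREATE TABLE documents (
      doc_id   NUMBER,
      doc_text CLOB    -- holds character data far beyond the 4000-byte VARCHAR2 limit
    );

    INSERT INTO documents (doc_id, doc_text)
    VALUES (1, TO_CLOB(RPAD('A', 4000, 'A')) || RPAD('B', 4000, 'B'));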

  • Run-Time error 1004 when VBA run Text-To-Column

    Expert,
      I got error message 1004, which said "Excel can convert only one column at a time. The range can be many rows tall but no more than one column wide. Try again by selecting cells in one column only." My VBA code is below. If I copy B5 only, the Text-To-Columns
    is working. Can you help?
    Sheets("Sheet1").Range("B5:E300").Copy
        Sheets("Sheet2").Select
        Range("A2:E300").PasteSpecial xlPasteAll
        Columns("B:M").Insert Shift:=xlToRight, CopyOrigin:=xlFormatFromLeftOrAbove
        Application.CutCopyMode = False
        Selection.TextToColumns Destination:=Range("A1"), DataType:=xlDelimited, _
            TextQualifier:=xlDoubleQuote, ConsecutiveDelimiter:=False, Tab:=False, _
            Semicolon:=False, Comma:=True, Space:=False, Other:=False, FieldInfo _
            :=Array(Array(1, 1), Array(2, 1), Array(3, 1), Array(4, 1), Array(5, 1), Array(6, 1), _
            Array(7, 1), Array(8, 1), Array(9, 1), Array(10, 1), Array(11, 1), Array(12, 1)), _
            TrailingMinusNumbers:=True
    Thanks
    James Liang

    Re:  text to columns error
    You have four columns selected when you call text to columns, so ...
    Replace the word "Selection" with Range("A1:A300") so it looks like...
      Range("A1:A300").TextToColumns Destination:=Range("A1"), DataType:=xlDelimited, _
    The above is my guess as to what you want to work with; I didn't experiment with the rest of the code.
    Jim Cone
    Portland, Oregon USA
    free & commercial excel programs (n/a xl2013)
    https://jumpshare.com/b/O5FC6LaBQ6U3UPXjOmX2

  • Sqlldr - filler columns exceeds maximum length

    Hi All,
    DB version : 10.2.0.1.0
    We are getting a CSV file with 127 fields. We need to load the first 9 fields and the 127th field using SQLLDR. What is the best way to specify this in the control file?
    Currently we are specifying it like this:
       C10 filler,
       C11 filler,
       C127 filler,
       column_name
    1. Is there any other, better approach available?
    2. We are getting issues when a filler column exceeds the maximum length. We tried specifying it like this:
                  c10 char(4000) filler  ,
    But it gives a syntax error. What is the workaround for this?
    Thanks in advance,
    Jeneesh
    Please note that using EXTERNAL TABLEs or other methods is not feasible for us.

    Hi jeneesh,
    Did you try:
    c10 filler char(4000)
    From the docs:
    The syntax for a filler field is the same as that for a column-based field, except that a filler field's name is followed by FILLER.
    Best regards
    Peter
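    A hedged control-file sketch putting Peter's suggestion together (editorial; the file name, table name and column names are assumptions, not from the thread):
    LOAD DATA
    INFILE 'data.csv'
    APPEND INTO TABLE my_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    ( col1, col2, col3, col4, col5, col6, col7, col8, col9,
      c10 FILLER CHAR(4000),
      -- ... c11 through c126 declared as FILLER CHAR(4000) in the same way ...
      col127
    )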

  • Is there a Numbers equivalent to Excel's "Text to Columns"?

    I am new to Numbers 09. In Excel I often need to take large amounts of ASCII text data, with hundreds or even thousands of lines, paste the data into Column A, then use the "Text to Columns" utility to parse the data into separate columns for further manipulation, calculation, or graphing. Is there a similar process available in Numbers?

    I found a way to accomplish this, assuming you also have iWork “Pages” available.
    Paste your delimited text into a blank "Pages" document. Do a Find / Replace on your delimiting character (comma or whatever) and replace it with a tab character.
    To do this, use Edit > Find > Find... and choose “Advanced” (at the top of the pop-up). Your delimiter goes in the Find text box. Then use the “Insert” pull-down menu to set the Replace text to “Tab.” It puts a little arrow symbol into the text box.
    Your original text probably has line breaks already. But if it has a separate delimiter instead, you can convert it into the required line breaks in the same way, using Find/Replace to "Paragraph Break". Hopefully the same delimiter is not used for both row and column breaks in your data.
    Now copy/paste that text from Pages into a single cell in Numbers and it should break it into columns and rows.
    MS Office for Mac isn’t that expensive.  After this ordeal I’ve ordered a copy.
