Data size limit on long column

I saw one of the Oracle JDBC FAQs saying the 8.0.x thin driver does not
support LONG columns with a data size > 2k using setString.
Is there any newer Oracle driver that supports LONG columns with a data
size > 2k?

//This sample creates the table and streams data into a LONG column.
//Just modify the connect string; it will work with both the thin and oci drivers.
import java.sql.*;
import java.io.*;

class ThinBug
{
  public static void main (String args [])
       throws SQLException, ClassNotFoundException, IOException
  {
    // Load the driver
    Class.forName ("oracle.jdbc.driver.OracleDriver");

    // Connect to the database
    // You can put a database name after the @ sign in the connection URL.
    Connection conn =
      //DriverManager.getConnection
      //  ("jdbc:oracle:thin:@myhost:1521:v815", "scott", "tiger");
      DriverManager.getConnection ("jdbc:oracle:oci8:@v81", "scott", "tiger");

    // Commit each statement automatically
    conn.setAutoCommit (true);

    // Create a Statement
    Statement stmt = conn.createStatement ();

    // Drop the example table if it already exists
    try
    {
      stmt.execute ("drop table mylong1");
    }
    catch (SQLException e)
    {
      // An exception is raised if the table does not exist; we just ignore it
    }

    // Create the table
    stmt.execute ("create table mylong1 (NAME varchar2 (256), DATA long)");

    // Insert the contents of long.doc into the LONG column as a stream
    File file = new File ("long.doc");
    InputStream is = new FileInputStream ("long.doc");
    PreparedStatement pstmt =
      conn.prepareStatement ("insert into mylong1 (data, name) values (?, ?)");
    pstmt.setAsciiStream (1, is, (int)file.length ());
    pstmt.setString (2, "test");
    System.out.println ("Before insert");
    pstmt.execute ();
    System.out.println ("After insert");

    // Do a query to get the ROWID of the row with NAME 'test'
    ResultSet rset =
      stmt.executeQuery ("select ROWID from mylong1 where NAME='test'");

    // Get the first row
    String szRowID = "";
    if (rset.next ())
      szRowID = rset.getString (1);
    System.out.println ("ROWID=" + szRowID);

    // Update the same row by ROWID, streaming the file again
    file = new File ("long.doc");
    is = new FileInputStream ("long.doc");
    pstmt =
      conn.prepareStatement ("update mylong1 set data=? where rowid=?");
    pstmt.setAsciiStream (1, is, (int)file.length ());
    pstmt.setString (2, szRowID);
    pstmt.execute ();

    // Do a query to get the row with NAME 'test'
    rset = stmt.executeQuery ("select DATA from mylong1 where NAME='test'");

    // Get the first row
    if (rset.next ())
    {
      // Get the data as a Stream from Oracle to the client
      InputStream data = rset.getAsciiStream (1);

      // Open a file to store the data
      FileOutputStream os = new FileOutputStream ("example2.out");

      // Loop, reading from the stream and writing to the file
      int c;
      while ((c = data.read ()) != -1)
        os.write (c);

      // Close the file
      os.close ();
    }
    conn.close ();
  }
}

Similar Messages

  • Can I increase the size of a Long column to 4000 characters?

    Can I increase the size of a Long column to 4000 characters?

    Maximum size of 4 GB - 1 byte? RTFM: see the Datatypes documentation.
    Extracted: LONG: Character data of variable length up to 2 gigabytes, or 2^31 - 1 bytes.
    Not 4 GB. Also, read Paul M's post.
    Regards,
    Yoann.

  • 10g Xe  4Gb data size limit

    Hi All,
    I have read that I can load only 4 GB of data on my 10g XE database.
    Is the 400 MB default allocation of the SYSTEM tablespace (data dictionary) included here?
    I mean, is 4 GB less 400 MB my new data size limit?
    Thanks a lot

    The user data storage limit is 4 GB. 5 GB includes system tablespaces. Here is the exact text that is printed on the web page under Administration >> Storage -
    +"Oracle Database Express Edition is designed to provide users with 4 GB of user data storage. Physical storage is limited to a database size of 5 GB of total overall size. This includes the system tablespace, but excludes temporary and rollback. You can compact storage by clicking Compact Storage in the Tasks list."+

  • How to insert long text data in oracle for LONG column type??

    Can anybody tell me the best way to store long text (more than 8k) in an Oracle table?
    I am using the LONG datatype for the column, but it still doesn't let me insert longer strings.
    Also, I am using ODP.NET.
    Anybody with a good suggestion???
    Thanks in advance

    Hi,
    Are you getting an error? If so, what?
    This works for me..
    Greg
    create table longtab(col1 varchar2(10), col2 long);

    using System;
    using System.Data;
    using Oracle.DataAccess.Client;
    using Oracle.DataAccess.Types;
    using System.Text;

    public class longwrite
    {
        public static void Main()
        {
            // make a long string
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < 55000; i++)
                sb.Append("a");
            sb.Append("Z");
            string indata = sb.ToString();
            Console.WriteLine("string length is {0}", indata.Length);

            // insert into database
            OracleConnection con = new OracleConnection("user id=scott;password=tiger;data source=orcl");
            con.Open();
            OracleCommand cmd = new OracleCommand();
            cmd.CommandText = "insert into longtab values(1,:longparam)";
            cmd.Connection = con;
            OracleParameter longparam = new OracleParameter("longparam", OracleDbType.Long, indata.Length);
            longparam.Direction = ParameterDirection.Input;
            longparam.Value = indata;
            cmd.Parameters.Add(longparam);
            cmd.ExecuteNonQuery();
            Console.WriteLine("insert complete");

            // now retrieve it
            cmd.CommandText = "select rowid,col2 from longtab where col1 = 1";
            OracleDataReader reader = cmd.ExecuteReader();
            reader.Read();
            string outdata = (string)reader.GetOracleString(1);
            Console.WriteLine("string length is {0}", outdata.Length);
            //Console.WriteLine("string is {0}", outdata);
            reader.Close();
            con.Close();
        }
    }

  • Row/data size limit when saving DESKI report to Excel?

    Users are getting an "(Error: INF  )" message when attempting to save a report to Excel. When the report is generated using a smaller date range it produces 12,104 rows and saves successfully to Excel. When a larger date range is specified the report generates 18,697 rows, raises the above error message, and does not produce an Excel file. The Excel file for the 12,104 rows is 18.5 MB; the 18,697-row file would be in the neighborhood of 25+ MB. The row limits in Excel are well above what we're generating, so we're thinking there's a brick wall in the process that creates the Excel file from the BObj report?
    We are running XIr2 patched to SP4 on Windows servers/XP desktops using MS SQL Server database software.

    Hi Sarbhjeet,
    Thank you for all your help.
    Further to our investigation we found that this is a known issue and a limitation of the product; it won't be fixed.
    The ADAPT for the same is ADAPT00743734.
    It is reproducible with XI 3.1 as well.
    FYI:
    1. The PDF engine gets the rendering information from busobj. If there is a character size difference between PDF and busobj, you may see bigger cells or reduced data in PDF.
    2. When the report uses "Fit to 1 page by 1 wide" in Page Setup -> Fit To Print, the page size is set to fit the complete report in busobj.
    3. When saving as PDF, this information (the increased page size, due to the settings in 2) is sent to the PDF engine. The engine tries to fit the report into the available standard PDF size (based on print settings and other combinations). If it fits the report (with its enhanced page size) into the available size, a lot of information is lost, depending on the number of columns and rows.
    4. Shrinking the report in busobj may not be feasible.
    When we change the page size in Fit to Print (either "Adjust to %" or "Fit to"), the reporter changes the page size instead of changing the rendering to fit the page.
    When the report is saved to PDF, it gets the information with the extended page size, so we see the enhanced page in PDF.
    Also, in order to implement shrinking (instead of changing the page size), each cell would have to be shrunk to the percentage of the reduced page (compared to the original page size). It becomes more complicated when charts come into the picture.
    I hope the above information helps.
    Thanks & Regards,
    Anisa

  • Long column size??

    Hi all,
    Is there any way of finding the exact/approximate size of a LONG column in a table?
    I know that user_segments and VSIZE cannot do it.
    Any ideas?
    Thanks in advance.

    Not really, no. This is an inherent limitation for this type. Best practice was to always store the size into another column when the LONG data was inserted or updated if you needed to know that later. LOB columns are more flexible than LONG.
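    If the size was never stored alongside the data, one client-side workaround is to stream the LONG and count the bytes. Below is a minimal JDBC sketch against the mylong1 table from the sample at the top of this thread; the connect string and credentials are placeholders, not part of the original post.

    import java.io.InputStream;
    import java.sql.*;

    public class LongColumnSize
    {
      public static void main (String[] args) throws Exception
      {
        Class.forName ("oracle.jdbc.driver.OracleDriver");
        Connection conn = DriverManager.getConnection (
          "jdbc:oracle:thin:@myhost:1521:orcl", "scott", "tiger");
        Statement stmt = conn.createStatement ();
        // Stream the LONG and count the bytes; VSIZE/user_segments cannot do this.
        ResultSet rset = stmt.executeQuery (
          "select DATA from mylong1 where NAME = 'test'");
        if (rset.next ())
        {
          InputStream in = rset.getAsciiStream (1);
          long size = 0;
          byte[] buf = new byte[8192];
          int n;
          while ((n = in.read (buf)) != -1)
            size += n;
          System.out.println ("LONG size in bytes: " + size);
        }
        conn.close ();
      }
    }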

  • IMP-00020: long column too large for column buffer size 22

    IMP error: long column too large for column buffer size
    IMP-00020: long column too large for column buffer size <22>
    imp hr/hr file=/home/oracle/hr.dmp fromuser=hr touser=hr buffer=10000 (I also tried 100000000)
    and still the same error. Please, can anybody help me with details?

    Providing more information/background is probably the wise thing to do.
    Versions (databases, exp, imp), commands and parameters used - copy&paste, relevant part of logs - copy&paste, describe table, etc.
    Some background, like what's the purpose, did this work before, what has changed, etc.
    Also you might check the suggested action for the error code per documentation:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14219/impus.htm#sthref10620
    Edited by: orafad on Dec 5, 2010 7:16 PM

  • Select LONG column into CLOB variable

    Hi all,
    I am trying to retrieve the data present in a LONG column into a CLOB variable.
    However, I am getting an error; please let me know how I can resolve it.
    DECLARE
    v_text CLOB;
    BEGIN
    SELECT TO_LOB(trigger_body)
    INTO v_text
    FROM
    user_triggers
    WHERE
    ROWNUM <= 1;
    END;
    ERROR at line 8:
    ORA-06550: line 8, column 20:
    PL/SQL: ORA-00932: inconsistent datatypes: expected NUMBER got LONG
    ORA-06550: line 8, column 5:
    PL/SQL: SQL Statement ignored
    Let me know if there is an alternative to this. I would like to get the data present in the LONG column into a variable.
    The reason why I am not retrieving the LONG column into a LONG variable is stated below (from the Oracle website):
    You can insert any LONG value into a LONG database column because the maximum width of a LONG column is 2**31 bytes.
    However, you cannot retrieve a value longer than 32760 bytes from a LONG column into a LONG variable.
    Thanks and Regards,
    Somu

    There are a couple of things I did (listed in order):
    1) Create a Global Temporary Table containing a CLOB column.
    2) Select the LONG column, convert it to CLOB using TO_LOB, and insert it into the Global Temporary Table.
    3) Select from this Global Temporary Table (which now contains the data as a CLOB) and assign it to a CLOB variable.
    This is done because you cannot directly use TO_LOB in a SELECT statement to assign the value to a CLOB variable.
    Stated below is an example:
    -- Create Temporary Table
    CREATE GLOBAL TEMPORARY TABLE glb_tmp_table_lob(
      time TIMESTAMP WITH LOCAL TIME ZONE,
      text CLOB
    ) ON COMMIT DELETE ROWS;
    -- PL/SQL Block to Execute
    DECLARE
      v_clob CLOB;
    BEGIN
      -- Insert into Temporary Table by converting LONG into CLOB
      INSERT INTO glb_tmp_table_lob (time, text)
      SELECT sysdate, TO_LOB(dv.text)
      FROM dba_views dv
      WHERE ROWNUM <= 1;
      -- Select from the Temporary Table into the variable
      SELECT gt.text
        INTO v_clob
        FROM glb_tmp_table_lob gt;
      COMMIT;
      -- Now you can use the CLOB variable as per your needs.
    END;
    /
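    For completeness: the 32,760-byte limit quoted above applies to PL/SQL LONG variables, not to client programs. A JDBC client can read the full LONG directly, as in this hedged sketch (connect string and credentials are placeholders; getString materializes the whole value, subject to any setMaxFieldSize in effect).

    import java.sql.*;

    public class ReadLongAsString
    {
      public static void main (String[] args) throws Exception
      {
        Class.forName ("oracle.jdbc.driver.OracleDriver");
        Connection conn = DriverManager.getConnection (
          "jdbc:oracle:thin:@myhost:1521:orcl", "scott", "tiger");
        Statement stmt = conn.createStatement ();
        ResultSet rset = stmt.executeQuery (
          "select trigger_body from user_triggers where rownum <= 1");
        if (rset.next ())
        {
          // getString materializes the whole LONG on the client side
          String body = rset.getString (1);
          System.out.println ("length = " + body.length ());
        }
        conn.close ();
      }
    }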

  • UpdateCharacterStream for LONG column gives SQLException: Data size bigger than max

    I have a LONG column, and when I call updateCharacterStream via the ResultSet I get the following error:
    SQLException: Data size bigger than max size for this type: 2391
    I am trying to update the column with a value that is 2391 characters long. The column is defined as a LONG, so it is plenty big enough.
    What gives?
    If I use a UTF-8 database it works OK. But if I use a database on the default character set I get the error. There are NO multibyte characters; I just happened to test it against my Unicode db and it worked.

    This only happens with the thin driver. I am using 8i. It also works OK if the target db is in UTF8!
    D:\myjava>java OracleLongTest
    Registering the ORACLE JDBC drivers.
    Connecting to database
    Connection = oracle.jdbc.driver.OracleConnection@1616c7
    Insert via prepared statement with reader length = 2200
    Insert with prepared statement was successfull
    Updating via prepared statement with reader length = 2200
    Update with prepared statement was successfull
    ===========================================================================
    ticketDesc len = 2200 mod date = 970510081398
    Updating with reader length = 2200
    Updating C1 ts = 970510081608
    Updating the row...
    Exception 2 = java.sql.SQLException: Data size bigger than max size for this type: 2200
    java.sql.SQLException: Data size bigger than max size for this type: 2200
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:114)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:156)
    at oracle.jdbc.dbaccess.DBError.check_error(DBError.java:775)
    at oracle.jdbc.ttc7.TTCItem.setArrayData(TTCItem.java:82)
    at oracle.jdbc.driver.OraclePreparedStatement.setItem(OraclePreparedStatement.java:761)
    at oracle.jdbc.driver.OraclePreparedStatement.setDatum(OraclePreparedStatement.java:1464)
    at oracle.jdbc.driver.OraclePreparedStatement.setCHAR(OraclePreparedStatement.java:1397)
    at oracle.jdbc.driver.OraclePreparedStatement.setObject(OraclePreparedStatement.java:1989)
    at oracle.jdbc.driver.OraclePreparedStatement.setObject(OraclePreparedStatement.java:2200)
    at oracle.jdbc.driver.OraclePreparedStatement.setOracleObject(OraclePreparedStatement.java:2216)
    at oracle.jdbc.driver.UpdatableResultSet.prepare_updateRow_binds(UpdatableResultSet.java:2070)
    at oracle.jdbc.driver.UpdatableResultSet.updateRow(UpdatableResultSet.java:1287)
    at OracleLongTest.run(OracleLongTest.java:146)
    at OracleLongTest.main(OracleLongTest.java:41)
    FINISHED!!
    StringBuffer sb = new StringBuffer("");
    for (int i = 0; i < 2200; i++)
        sb.append("X");

    Statement stmt = con.createStatement();
    try
    {
        String createSQL = "CREATE TABLE MYLONGTEST (C1 NUMBER(38), C2 LONG)";
        stmt.executeUpdate(createSQL);
    }
    catch (SQLException se) {}

    PreparedStatement insertPstmt = con.prepareStatement("INSERT INTO MYLONGTEST VALUES (?, ?)");
    PreparedStatement updatePstmt = con.prepareStatement("UPDATE MYLONGTEST SET C2 = ? WHERE C1 = ?");
    // 1004 = ResultSet.TYPE_SCROLL_INSENSITIVE, 1008 = ResultSet.CONCUR_UPDATABLE
    PreparedStatement selectPstmt = con.prepareStatement("SELECT C1,C2 FROM MYLONGTEST WHERE C1 = ?", 1004, 1008);

    long ts = System.currentTimeMillis();
    String value = sb.toString();
    Reader rdr = new StringReader(value);
    int len = value.length();

    try
    {
        System.out.println("Insert via prepared statement with reader length = " + len);
        insertPstmt.setLong(1, ts);
        insertPstmt.setCharacterStream(2, rdr, len);
        insertPstmt.execute();
        System.out.println("Insert with prepared statement was successfull");
    }
    catch (SQLException se)
    {
        System.out.println("Caught exception = " + se.toString());
    }

    try
    {
        System.out.println("Updating via prepared statement with reader length = " + len);
        rdr = new StringReader(value);
        updatePstmt.setCharacterStream(1, rdr, len);
        updatePstmt.setLong(2, ts);
        int rowcount = updatePstmt.executeUpdate();
        System.out.println("Update with prepared statement was successfull");
        updatePstmt.close();
    }
    catch (SQLException se)
    {
        System.out.println("Caught exception = " + se.toString());
    }

    System.out.println("===========================================================================");
    selectPstmt.setLong(1, ts);
    ResultSet rs = selectPstmt.executeQuery();
    rs.absolute(1);
    String c2 = rs.getString("C2");
    ts = rs.getLong("C1");
    System.out.println("c2 len = " + c2.length() + " mod date = " + ts);
    rs.absolute(1);
    ts = System.currentTimeMillis();
    System.out.println("Updating with reader length = " + len);
    rdr = new StringReader(value);
    rs.updateCharacterStream("C2", rdr, len);
    ts = System.currentTimeMillis();
    System.out.println("Updating C1 ts = " + ts);
    rs.updateLong("C1", ts);
    System.out.println("Updating the row...");
    rs.updateRow();

  • S1000 Data file size limit is reached in statement

    I am new to Java and was given the task of troubleshooting a Java application that was written a few years ago and is no longer supported. The application creates database files in the user's directory: diwdb.properties, diwdb.data, diwdb.lproperties, diwdb.script. The purpose of the application is to open a zip file and insert the files into a table in the database.
    The values that are populated in the diwdb.properties file are as follows:
    #HSQL Database Engine
    #Wed Jan 30 08:55:05 GMT 2013
    hsqldb.script_format=0
    runtime.gc_interval=0
    sql.enforce_strict_size=false
    hsqldb.cache_size_scale=8
    readonly=false
    hsqldb.nio_data_file=true
    hsqldb.cache_scale=14
    version=1.8.0
    hsqldb.default_table_type=memory
    hsqldb.cache_file_scale=1
    hsqldb.log_size=200
    modified=yes
    hsqldb.cache_version=1.7.0
    hsqldb.original_version=1.8.0
    hsqldb.compatible_version=1.8.0
    Once the database file gets to 2 GB it brings up the error message 'S1000 Data file size limit is reached in statement (Insert into <tablename>......'
    From searching on the internet it appeared that the parameter hsqldb.cache_file_scale needed to be increased, and 8 was a suggested value.
    I have the distribution files (.jar & .jnlp) that are used to run the application, and a source directory that contains the Java files. But I do not see any properties files to set parameters in. I was able to load both directories into NetBeans, but I really don't know if the files can be rebuilt for distribution, as I'm not clear on what I'm doing and NetBeans shows errors in some of the directories.
    I have also tried adding parameters to the startup URL: http://uknt117.uk.infores.com/DIW/DIW.jnlp?hsqldb.large_data=true?hsqldb.cache_file_scale=8 but that does not affect the application.
    I have been struggling with this for quite some time and would greatly appreciate any assistance to help resolve it.
    Thanks!

    Thanks! But where would I run the SQL statement? When anyone launches the application it creates the database files in their user directory. How would I connect to the database after that to execute the statement?
    I see the CREATE TABLE statements in the files I have pulled into NetBeans, in both the source folder and the distribution folder. Could I add the statement there, before the table is created, in the jar in the distribution folder and then recompile it for distribution? Or would I need to add it to the file in the source directory and recompile those to create a new distribution?
    Thanks!
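    One way to run the statement, sketched below under stated assumptions: connect to the existing database files with HSQLDB's file-mode JDBC URL while the application is not running, and execute SET PROPERTY there. The path is a placeholder, sa with an empty password is only the HSQLDB default login, and HSQLDB 1.8 accepts a cache_file_scale change only while CACHED tables hold no data, so this works best against a freshly created database.

    import java.sql.*;

    public class RaiseHsqldbLimit
    {
      public static void main (String[] args) throws Exception
      {
        Class.forName ("org.hsqldb.jdbcDriver");
        // "diwdb" matches the file prefix from the question; the path is a
        // placeholder, and sa/"" is only the HSQLDB default login.
        Connection conn = DriverManager.getConnection (
          "jdbc:hsqldb:file:/path/to/userdir/diwdb", "sa", "");
        Statement stmt = conn.createStatement ();
        // Raises the .data file limit from 2 GB to 16 GB in HSQLDB 1.8;
        // only allowed while CACHED tables hold no data.
        stmt.execute ("SET PROPERTY \"hsqldb.cache_file_scale\" 8");
        stmt.execute ("SHUTDOWN");
        conn.close ();
      }
    }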

  • Streaming data to LONG columns in Oracle 7.3.2.3.0

    I am trying to stream data to a LONG column. I'm using Oracle Server 7.3.2.3.0 on AIX and JDBC driver 8.0.4 on Windows NT 4 SP5.
    I include sample tables/programs at the end, but here's the summary of what's happening:
    I'm creating a byte array of length 2500. If I use setAsciiStream I get the following exception when I execute the prepared statement:
    java.sql.SQLException: Data size bigger than max size for this type
    at oracle.jdbc.dbaccess.DBError.check_error(DBError.java)
    at oracle.jdbc.ttc7.TTCItem.setArrayData(TTCItem.java)
    at oracle.jdbc.driver.OraclePreparedStatement.setItem(OraclePreparedStatement.java)
    at oracle.jdbc.driver.OraclePreparedStatement.setAsciiStream(OraclePreparedStatement.java)
    at TestOracle.main(TestOracle.java:26)
    If I use setBinaryStream I get this exception:
    java.sql.SQLException: ORA-01461: can bind a LONG value only for insert into a LONG column
    at oracle.jdbc.ttc7.TTIoer.processError(TTIoer.java)
    at oracle.jdbc.ttc7.Oall7.receive(Oall7.java)
    at oracle.jdbc.ttc7.TTC7Protocol.doOall7(TTC7Protocol.java)
    at oracle.jdbc.ttc7.TTC7Protocol.parseExecuteFetch(TTC7Protocol.java)
    at oracle.jdbc.driver.OracleStatement.doExecuteOther(OracleStatement.java)
    at oracle.jdbc.driver.OracleStatement.doExecuteWithBatch(OracleStatement.java)
    at oracle.jdbc.driver.OracleStatement.doExecute(OracleStatement.java)
    at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java)
    at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java)
    at oracle.jdbc.driver.OraclePreparedStatement.execute(OraclePreparedStatement.java)
    at TestOracle.main(TestOracle.java:27)
    My Oracle7 manual states that LONG columns can store 2 GB of text.
    I tried the above with LONG RAW columns and it worked fine.
    Can anyone explain why I get this error? I've tried it with different sizes, and when the data is < 2000 bytes it works fine for LONG columns.
    My table is simple:
    create table TestLongs (key INTEGER PRIMARY KEY, data LONG);
    My Java code is also very simple:
    import java.io.ByteArrayInputStream;
    import java.sql.*;

    public class TestOracle
    {
      public static void main(String[] args)
      {
        Connection con = null;
        PreparedStatement pstmt = null;
        try
        {
          Class.forName("oracle.jdbc.driver.OracleDriver");
          con = DriverManager.getConnection(
            "jdbc:oracle:thin:@itchy:1526:test", "System", "<OMITTED>");
          byte[] data = new byte[2500];
          for (int i = 0; i < 2500; i++)
            data[i] = 53;
          String sql = "INSERT INTO TestLongs (key, data) VALUES(1, ?)";
          pstmt = con.prepareStatement(sql);
          ByteArrayInputStream bis = new ByteArrayInputStream(data);
          pstmt.setAsciiStream(1, bis, data.length);
          pstmt.execute();
        }
        catch (SQLException e)
        {
          System.err.println("An error occurred with the database: " + e);
          e.printStackTrace();
        }
        catch (Exception e)
        {
          System.err.println("Oracle JDBC driver not found." + e);
          e.printStackTrace();
        }
        finally
        {
          try
          {
            if (pstmt != null)
              pstmt.close();
            if (con != null)
              con.close();
          }
          catch (SQLException e)
          {
            System.err.println("Unable to close statement/connection.");
          }
        }
      }
    }

    Robert Greig (guest) wrote:
    : I am trying to stream data to a LONG column. I'm using Oracle
    : Server 7.3.2.3.0 on AIX and JDBC driver 8.0.4 on Windows NT 4
    : SP5.
    I tried it with the old 7.3.x JDBC driver and it works fine. I also noticed after further testing that it sometimes worked with the 8.0.4 driver. Looks like a bug in the 8.0.4 driver or some wacky incompatibility.

  • Imp-00020 long column too large for column buffer size (22)

    Hi friends,
    I have exported (through the conventional path) a complete schema from Oracle 7 (SCO Unix platform).
    Then I transferred the export file from the Unix server to a laptop (Windows platform),
    and tried to import it into Oracle 10.2 on Windows XP.
    (Database configuration of Oracle 10g:
    User tablespace 2 GB
    Temp tablespace 30 MB
    Rollback segments of 15 MB each
    Undo tablespace of 200 MB
    SGA 160 MB
    PGA 16 MB)
    All the tables imported successfully except 3 tables, each having around 1 million rows.
    The error messages during import for these 3 tables are:
    imp-00020 long column too large for column buffer size (22)
    imp-00020 long column too large for column buffer size (7)
    The main point here is that in all 3 tables there is no LONG/timestamp column (only varchar/number columns).
    To solve the problem I tried the following options:
    1. Increased the buffer size up to 20480000/30720000.
    2. Commit=Y Indexes=N (in this case it does not import the complete tables).
    3. First exported table structures only, and then the data.
    4. Created the tables manually and tried to import them.
    But all efforts failed; I am still getting the same errors.
    Can someone help me with this issue?
    I will be grateful to all of you.
    Regards,
    Harvinder Singh
    [email protected]
    Edited by: user462250 on Oct 14, 2009 1:57 AM

    Thanks, but this note is for older releases, 7.3 to 8.0...
    In my case both the export and the import were done on an 11.2 database.
    I didn't use Data Pump because we use the same processes for different releases of Oracle, and some of them do not support Data Pump. By the way, shouldn't EXP / IMP work anyway?

  • Size limit of column content

    Can anyone tell me if there is a limit on the size of a data item that you can retrieve into a report column? I have in the back of my mind that there is a limit of either 30k or 32k, but I can't find it documented anywhere.
    Many thanks.
    Fintan.

    Hi Everyone;
    I am running Advanced Analysis reports on HANA as the datasource, and I am getting the message "Size Limit of Result Set Exceeded" whenever I have a larger output. I have noticed in the admin guide and in other SCN posts this setting in the Windows Registry:
    [HKEY_LOCAL_MACHINE\Software\SAP\AdvancedAnalysis\Settings\DataSource]
    "ShowBicsSample"="True"
    "ResultSetSizeLimit"="-1"
    But I am having a hard time finding this setting in my Windows Registry; can someone help me understand why I don't have it in my registry, and how to find it and change this setting?
    FYI, I am using Office 2010.

  • Maxl Error during data load - file size limit?

    <p>Does anyone know if there is a file size limit while importingdata into an ASO cube via Maxl. I have tried to execute:</p><p> </p><p>Import Database TST_ASO.J_ASO_DB data</p><p>using server test data file '/XX/xXX/XXX.txt'</p><p>using server rules_file '/XXX/XXX/XXX.rul'</p><p>to load_buffer with buffer_id 1</p><p>on error write to '/XXX.log';</p><p> </p><p>It errors out after about 10 minutes and gives "unexpectedEssbase error 1130610' The file is about 1.5 gigs of data. The filelocation is right. I have tried the same code with a smaller fileand it works. Do I need to increase my cache or anything? I alsogot "DATAERRORLIMIT' reached and I can not find the log filefor this...? Thanks!</p>

    Have you looked in the data error log to see what kind of errors you are getting? The odds are high that you are trying to load data into calculated members (or upper-level members), resulting in errors. It is most likely the former.
    You specify the error file with the
    on error write to '/XXX.log';
    statement. Have you looked for this file to find out why you are getting errors? Do yourself a favor: load the smaller file and look at the error file to see what kind of error you are getting. It is possible that your error file is larger than your load file, since multiple errors on a single load item may result in a restatement of the entire load line for each error.
    This is a starting point for your exploration into the problem.
    DATAERRORLIMIT is set in the config file, default 1000, max 65000.
    NOMSGLOGGINGONDATAERRORLIMIT, if set to true, just stops logging and continues the load when the data error limit is reached. I'd advise using this only in a test environment since it doesn't solve the initial problem of data errors.
    Probably what you'll have to do is ignore some of the columns in the data load that load into calculated fields. If you have some upper-level members, you could put them in a skip-loading condition.
    Let us know what works for you.

  • Data type size for the view column

    I am creating a view from a base table using substr() for some fields. The data type size of the column in the view becomes twice the size of the corresponding column in the base table. Example:
    SQL> create table temp_sat(name varchar2(20));
    Table created.
    SQL> create view temp_view as select substr(name,1,10) name from temp_sat;
    View created.
    SQL> desc temp_view
    Name Null? Type
    NAME VARCHAR2(40)
    Can i specify the size of the column while creating view

    satish asnani wrote:
    I am creating a view from a base table using substr() for some fields. The data type size of the column in the view becomes twice the size of the corresponding column in the base table. Example:
    SQL> create table temp_sat(name varchar2(20));
    Table created.
    SQL> create view temp_view as select substr(name,1,10) name from temp_sat;
    View created.
    SQL> desc temp_view
    Name                                      Null?    Type
    NAME                                               VARCHAR2(40)
    Can I specify the size of the column while creating the view?
    Hi,
    Which platform and Oracle DB version?
    I tried to replicate, but I can't:
    SQL> create table t_v
      2  (id varchar2(20));
    Tabla creada.
    SQL> desc t_v
    Nombre                                                            ¿Nulo?   Tipo
    ID                                                                         VARCHAR2(20)
    SQL> r
      1  create view v_t_v
      2  as
      3  select substr(id,1,10) id
      4* from t_v
    Vista creada.
    SQL> desc v_t_v
    Nombre                                                            ¿Nulo?   Tipo
    ID                                                                         VARCHAR2(10)
    John
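    A likely cause: in a multibyte database character set (e.g. AL32UTF8) with byte length semantics, SUBSTR's result is sized at 10 characters times the maximum bytes per character, giving VARCHAR2(40). One fix is to pin the declared size with CAST, as in this hedged JDBC sketch (connect string and credentials are placeholders):

    import java.sql.*;

    public class CastViewColumn
    {
      public static void main (String[] args) throws Exception
      {
        Class.forName ("oracle.jdbc.driver.OracleDriver");
        Connection conn = DriverManager.getConnection (
          "jdbc:oracle:thin:@myhost:1521:orcl", "scott", "tiger");
        Statement stmt = conn.createStatement ();
        // CAST pins the view column to VARCHAR2(10) regardless of how many
        // bytes per character the database character set allows.
        stmt.execute ("create or replace view temp_view as "
          + "select cast(substr(name, 1, 10) as varchar2(10)) name from temp_sat");
        conn.close ();
      }
    }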
