Large data entry issue

My form in a JSP inserts into a CLOB column in my Oracle 9i database using a SQL INSERT via JDBC. The insert always works if the form entry is under 4,000 characters. The database is Oracle 9i, accessed with the ojdbc14 driver.
Due to restrictions I can't update the ojdbc14 driver at this time.
Please advise: is there anything I can do in my INSERT statement to handle entries larger than 4,000 characters going into the CLOB column?

Evergrean wrote:
My form entry uses a CLOB data type. The insert always works using a Java PreparedStatement if the form entry is under 4,000 characters. The database is Oracle 9i using the ojdbc14 driver in my Tomcat web container.
Due to restrictions I can't update the ojdbc14 driver at this time.
The issue is that when the form entry is over 4,000 characters it won't enter any data into the database. The error message I get is: SQL Exception: Data size bigger than max size for the type: 12463
Please advise how I can make it work?

Please clarify what the data type specifically is. Obviously, as noted, if you have a VARCHAR2(4000) then you can't put more than 4,000 characters into it.
And that has nothing at all to do with the driver.
You did not say, however, whether you can change the database itself.
If you can, then change the column to a LOB type (a CLOB, for text), create an alter script to update the database, take care to assess any other impacts on code (Java or Oracle), and make the appropriate changes.
If you can't, then I suggest you do one of the following:
1. Disallow the user from entering more data.
2. Write a bug against the requirements (the ones that insist the users must be able to put in more data but do not allow you to provide any more storage space).
Note that the compression solution is somewhat problematic, because compression invariably produces binary data, and putting binary data into a text field requires encoding it back to text. That last step increases storage requirements again, so with small data sets you might see little or no improvement. That does, however, depend on the data.
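For reference, the workaround most often suggested when the column really is a CLOB and the driver cannot be upgraded is to bind the value as a character stream rather than a plain string, so the size limit hit by setString() never comes into play. Below is a minimal sketch, not a confirmed fix for this exact environment; the table MYTABLE and its columns ID and BODY are illustrative names, not taken from the original post, and conn is assumed to be an open JDBC Connection:

import java.io.StringReader;
import java.sql.Connection;
import java.sql.PreparedStatement;

public class ClobInsert {
    // Streams 'text' into a CLOB column; streaming avoids the 4,000-character
    // limit that setString() hits on older Oracle drivers such as ojdbc14.
    public static void insertLargeText(Connection conn, long id, String text) throws Exception {
        PreparedStatement ps = conn.prepareStatement(
            "INSERT INTO MYTABLE (ID, BODY) VALUES (?, ?)"); // MYTABLE/ID/BODY are hypothetical
        try {
            ps.setLong(1, id);
            // Bind the CLOB value as a stream so the driver sends it in chunks.
            ps.setCharacterStream(2, new StringReader(text), text.length());
            ps.executeUpdate();
        } finally {
            ps.close();
        }
    }
}

If a particular driver build still rejects the streamed bind, the classic fallback is to insert EMPTY_CLOB(), re-select the row FOR UPDATE, and write into the returned LOB locator.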

Similar Messages

  • Data entry issue in CATS

    Dear All,
    When I enter time (hours) in the data entry sheet, the system says "Enter relevant working time attributes". I really don't know where I have made a mistake.
    I am using the fields Pernr, name, controlling area, task level, task type, task component and attendance/absence type in the data entry sheet.
    When I enter time with only the controlling area and attendance type, the time data is accepted, but as soon as I assign task level, task type and task component, the system throws the error message "Enter relevant working time attributes".
    I have checked all the assignments for task type, task level and task component; they are all correct, but I don't know why this message keeps coming.
    Can anyone help me?
    thanks,
    Manjula.

    Hi Manju,
    I think there is a problem with the task type component. Check whether you have activated the 'component task type' checkbox. If you have activated it, then maintain the absence/attendance type or wage type as well, and also check the default value in the profile settings.
    Try it and let me know.
    Dinesh.

  • Need data entry help: the " character causing issues in data entry

    What's the syntax for telling an insert/update query to add information to the database so that " and ' don't affect the import? Meaning, if I enter 30.6" long in diameter, it only pulls out 30.6. Is this a data entry or retrieval error, and how can I fix it?

    When you insert/update the data in your db, use the <cfqueryparam> tag - it will automatically escape all necessary characters.
    On output, I assume you are displaying the data in a text input form field. The problem is not with CF or your data, but with the basic HTML:
    <input type="text" value="#some-cf-variable-with-double-quotes#">
    When the value of the CF variable contains double quotation marks, the rendered HTML will look like:
    <input type="text" value="0.36" in diameter">
    See where the problem is? The value displayed in the field will be just 0.36, and the browser will ignore the rest of the code.
    CF has built-in functions to work around this, like xmlformat() or htmleditformat().
    Azadi Saryev
    Sabai-dee.com
    http://www.sabai-dee.com/
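    The same principle applies outside ColdFusion: whatever code emits the HTML has to escape quote characters in attribute values. As a rough illustration in Java (a hand-rolled escaper for the sketch only; a real JSP would more likely use an existing utility such as JSTL's <c:out>):

    public class HtmlAttr {
        // Escapes the characters that can break out of a double-quoted HTML attribute.
        public static String escapeAttribute(String s) {
            StringBuilder out = new StringBuilder(s.length());
            for (int i = 0; i < s.length(); i++) {
                char c = s.charAt(i);
                switch (c) {
                    case '&':  out.append("&amp;");  break;
                    case '<':  out.append("&lt;");   break;
                    case '>':  out.append("&gt;");   break;
                    case '"':  out.append("&quot;"); break;
                    case '\'': out.append("&#39;");  break;
                    default:   out.append(c);
                }
            }
            return out.toString();
        }

        public static void main(String[] args) {
            // Prints value="30.6&quot; long in diameter" - the quote survives as text.
            System.out.println("value=\"" + escapeAttribute("30.6\" long in diameter") + "\"");
        }
    }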

  • Issues with dates not storing correctly on a data entry screen

    We are using Application Express 3.0.1.00.08.
    I have a table for which I have written a data entry screen. The user enters most of the fields on the entry screen; however, there are some fields that I would like to populate when a button is pressed, e.g. last_updated_date and creation_date.
    I have an On Submit - After Computations and Validations process with the following code to set the last updated date:
    begin
    :P1_LAST_UPDATED := to_date(sysdate,'DD-MON-RRRR HH24:MI');
    end;
    I am setting the creation date at the same time as getting the primary key as follows:
    declare
       function get_pk return varchar2
       is
       begin
          for c1 in (select XX_SVS_SEQ.nextval next_val from dual)
          loop
             return c1.next_val;
          end loop;
       end;
    begin
       :P1_ID := get_pk;
       :P1_CREATION_DATE := sysdate;
       :P1_CREATED_BY := :F380_OSS_USER_ID;
    end;
    However, the dates are being written to the database incorrectly: they are stored with the year 0009 instead of 2009.
    Does anyone know how I can get hidden dates on a data entry form to be written correctly to the database?
    Regards
    Kay

    It could be done as a database trigger, but surely APEX should be capable of saving the correct date format of an item.
    Since my original post I have managed to get the dates stored with the correct year in the hidden fields by doing the following in my process:
    :P1_LAST_UPDATED := to_date(sysdate,'DD-MON-RRRR HH24:MI');
    However, while the date is now stored with the correct year (2009), the time element is not being stored. Any ideas? (A likely explanation: applying to_date() to sysdate, which is already a DATE, forces an implicit to_char() using the session's default NLS_DATE_FORMAT first, and that default format typically carries no time component, so the time is lost before to_date() ever runs.)

  • Query Error Information: Result set is too large; data retrieval ......

    Hi Experts,
    I have a problem with my query results. When I execute my report and drill down in the navigation panel, instead of a table with values the message "Result set is too large; data retrieval restricted by configuration" appears. I have already applied Note 1127156 ("Safety belt: Result set is too large"): I imported Support Package 13 for SAP NetWeaver 7.0 BI Java (BIIBC13_0.SCA / BIBASES13_0.SCA / BIWEBAPP13_0.SCA) and executed the program SAP_RSADMIN_MAINTAIN (in transaction SE38) with the object and the value the note specifies, but the problem still appears.
    What could I be missing? How can I fix this issue?
    Thank you very much for helping me out. (Any help would be rewarded.)
    David Corté

    You may ask your Basis administrator to increase the ESM buffer (rsdb/esm/buffersize_kb). Did you check the system's memory?
    Did you try checking the error dump using ST22 (runtime error analysis)?

  • Error while exporting large data from ReportViewer on an Azure-hosted website

    Hi,
    I have a website hosted on Azure. I use the SSRS ReportViewer control to display my reports, and in doing so I have hit an issue.
    Whenever I export a large amount of data as Excel/PDF/Word/TIFF, it abruptly throws the following error:
    Error: Microsoft.Reporting.WebForms.ReportServerException: The remote server returned an error: (401) Unauthorized. ---> System.Net.WebException: The remote server returned an error: (401) Unauthorized.
    at System.Net.HttpWebRequest.EndGetResponse(IAsyncResult asyncResult)
    at Microsoft.Reporting.WebForms.SoapReportExecutionService.ServerUrlRequest(AbortState abortState, String url, Stream outputStream, String& mimeType, String& fileNameExtension)
    --- End of inner exception stack trace ---
    at Microsoft.Reporting.WebForms.SoapReportExecutionService.ServerUrlRequest(AbortState abortState, String url, Stream outputStream, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.SoapReportExecutionService.Render(AbortState abortState, String reportPath, String executionId, String historyId, String format, XmlNodeList deviceInfo, NameValueCollection urlAccessParameters, Stream reportStream, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.ServerReport.InternalRender(Boolean isAbortable, String format, String deviceInfo, NameValueCollection urlAccessParameters, Stream reportStream, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.ServerReport.Render(String format, String deviceInfo, NameValueCollection urlAccessParameters, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.ServerModeSession.RenderReport(String format, Boolean allowInternalRenderers, String deviceInfo, NameValueCollection additionalParams, Boolean cacheSecondaryStreamsForHtml, String& mimeType, String& fileExtension)
    at Microsoft.Reporting.WebForms.ExportOperation.PerformOperation(NameValueCollection urlQuery, HttpResponse response)
    at Microsoft.Reporting.WebForms.HttpHandler.ProcessRequest(HttpContext context)
    at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
    at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)
    It works locally (on the developer machine) and with smaller data sets, but it does not work with large data when published on Azure.
    Any help will be appreciated.
    Thanks.

    Sorry, let me clarify my questions, as they were ambiguous:
    For a given set of inputs, does the request always take the same amount of time to fail? How long does it take?
    When it works (e.g. on the local machine using the same input), how big is the output file that gets downloaded?
    Also, if you can share your site name (directly or indirectly) and the UTC time at which you made an attempt, we may be able to get more info on our side.

  • Creation Of Custom Transaction For RF enabled devices "mobile data entry"

    Hello Everyone,
    I need some information, material, or examples for creating custom transactions that invoke SAP's standard goods issue transaction (MB1A) from an RF-enabled handheld device. There are some LM transactions which do goods issue from such a device, but in my case I want to do the goods issue using a reservation as my input. Can anybody give me a reference that will help me create such a custom transaction, or send me some reference material matching my problem? I would really appreciate it.
    thanks in advance...
    sapient2006

    Hi Hussaini,
    Here are the steps.
    1> Complete the customization in SPRO: Logistics Execution -> Mobile Data Entry section. Also compare the entries with LM01 for understanding.
    2> In 'Define menu management' you can create dynamic menus. For the menu or transaction type, entering "1" produces a menu and entering "2" a transaction. Custom transactions can be created and assigned here.
    Once that is done, when you execute LM01 you will see your custom transactions come up. Make sure they are created with the RF device's screen size and limitations in mind.
    Regards,
    Anand.

  • ORA-01403 No Data Found Issue

    Hi,
    I'm very new to Streams and have a doubt about an ORA-01403 issue occurring during replication. I need your kind help in this regard. Thanks in advance.
    Oracle version: 10.0.3.0
    1. Suppose there are 10 LCRs in a transaction and one of the LCRs causes ORA-01403, so none of the LCRs get executed.
    We can read the data of this LCR and manually update the record in the destination database.
    Even though this is done, on re-executing the transaction I get the same ORA-01403 on the same LCR.
    What could be the possible reason?
    Since this is a large-scale system with thousands of transactions, it is not feasible to handle each no-data-found issue in the system by hand.
    I have written a PL/SQL block which generates UPDATE statements from the old data available in the LCR, so that I can re-execute the transaction.
    The PL/SQL block is given below. Could you please check whether there are any issues in how it generates the UPDATE statements? Thank you.
    /* Formatted on 2008/10/23 14:46 (Formatter Plus v4.8.7) */
    --Script for generating the UPDATE statements for the message which caused the 'NO DATA FOUND' error.
    DECLARE
       RES          NUMBER;                -- No. of errors to be resolved
       RET          NUMBER;                -- Return value from GetObject
       I            NUMBER;                -- Loop index
       J            NUMBER;                -- Loop index
       K            NUMBER;                -- Loop index
       PK_COUNT     NUMBER;                -- No. of PK columns for a table
       LCR          ANYDATA;               -- The Logical Change Record
       TYP          VARCHAR2 (61);         -- Type of a column
       ROWLCR       SYS.LCR$_ROW_RECORD;   -- The LCR that caused the error in the txn
       OLDLIST      SYS.LCR$_ROW_LIST;     -- Old data of the record that was updated/deleted
       NEWLIST      SYS.LCR$_ROW_LIST;
       UPD_QRY      VARCHAR2 (5000);
       DEL_QRY      VARCHAR2 (5000);       -- declared here; was referenced but undeclared in the original
       INS_QRY      VARCHAR2 (5000);       -- declared here; was referenced but undeclared in the original
       EQUALS       VARCHAR2 (5) := ' = ';
       DATA1        VARCHAR2 (2000);
       NUM1         NUMBER;
       DATE1        TIMESTAMP (0);
       TIMESTAMP1   TIMESTAMP (3);
       ISCOMMA      BOOLEAN;
       TYPE TAB_LCR IS TABLE OF ANYDATA INDEX BY BINARY_INTEGER;
       TYPE PK_COLS IS TABLE OF VARCHAR2 (50) INDEX BY BINARY_INTEGER;
       LCR_TABLE    TAB_LCR;
       PK_TABLE     PK_COLS;
    BEGIN
       I := 1;
       SELECT COUNT (1) INTO RES FROM DBA_APPLY_ERROR;
       FOR TXN_ID IN (SELECT MESSAGE_NUMBER, LOCAL_TRANSACTION_ID
                        FROM DBA_APPLY_ERROR
                       WHERE LOCAL_TRANSACTION_ID = '2.85.42516'
                       ORDER BY ERROR_CREATION_TIME)
       LOOP
          SELECT DBMS_APPLY_ADM.GET_ERROR_MESSAGE (TXN_ID.MESSAGE_NUMBER,
                                                   TXN_ID.LOCAL_TRANSACTION_ID)
            INTO LCR
            FROM DUAL;
          LCR_TABLE (I) := LCR;
          I := I + 1;
       END LOOP;
       I := 0;
       K := 0;
       DBMS_OUTPUT.PUT_LINE ('size >' || LCR_TABLE.COUNT);
       FOR K IN 1 .. RES
       LOOP
          ROWLCR := NULL;
          RET := LCR_TABLE (K).GETOBJECT (ROWLCR);
          --DBMS_OUTPUT.PUT_LINE (ROWLCR.GET_OBJECT_NAME);
          PK_COUNT := 0;
          --Finding the PK columns of the table
          SELECT COUNT (1)
            INTO PK_COUNT
            FROM ALL_CONS_COLUMNS COL, ALL_CONSTRAINTS CON
           WHERE COL.TABLE_NAME = CON.TABLE_NAME
             AND COL.CONSTRAINT_NAME = CON.CONSTRAINT_NAME
             AND CON.CONSTRAINT_TYPE = 'P'
             AND CON.TABLE_NAME = ROWLCR.GET_OBJECT_NAME;
          DBMS_OUTPUT.PUT_LINE ('Count of PK Columns >' || PK_COUNT);
          DEL_QRY := 'DELETE FROM ' || ROWLCR.GET_OBJECT_NAME || ' WHERE ';
          INS_QRY := 'INSERT INTO ' || ROWLCR.GET_OBJECT_NAME || ' ( ';
          UPD_QRY := 'UPDATE ' || ROWLCR.GET_OBJECT_NAME || ' SET ';
          OLDLIST := ROWLCR.GET_VALUES ('old');
          -- Generate the UPDATE query from the 'old' values
          NEWLIST := ROWLCR.GET_VALUES ('old');
          ISCOMMA := FALSE;
          FOR J IN 1 .. NEWLIST.COUNT
          LOOP
             IF NEWLIST (J) IS NOT NULL
             THEN
                IF J < NEWLIST.COUNT AND ISCOMMA = TRUE
                THEN
                   UPD_QRY := UPD_QRY || ',';
                END IF;
                ISCOMMA := FALSE;
                TYP := NEWLIST (J).DATA.GETTYPENAME;
                IF (TYP = 'SYS.VARCHAR2')
                THEN
                   RET := NEWLIST (J).DATA.GETVARCHAR2 (DATA1);
                   IF DATA1 IS NOT NULL
                   THEN
                      UPD_QRY := UPD_QRY || NEWLIST (J).COLUMN_NAME || EQUALS
                                 || ' ' || '''' || SUBSTR (DATA1, 0, 253) || '''';
                      ISCOMMA := TRUE;
                   END IF;
                ELSIF (TYP = 'SYS.NUMBER')
                THEN
                   RET := NEWLIST (J).DATA.GETNUMBER (NUM1);
                   IF NUM1 IS NOT NULL
                   THEN
                      UPD_QRY := UPD_QRY || NEWLIST (J).COLUMN_NAME || EQUALS
                                 || ' ' || NUM1;
                      ISCOMMA := TRUE;
                   END IF;
                ELSIF (TYP = 'SYS.DATE')
                THEN
                   RET := NEWLIST (J).DATA.GETDATE (DATE1);
                   IF DATE1 IS NOT NULL
                   THEN
                      UPD_QRY := UPD_QRY || NEWLIST (J).COLUMN_NAME || EQUALS
                                 || ' ' || 'TO_Date( ' || '''' || DATE1 || ''''
                                 || ', ''' || 'DD/MON/YYYY HH:MI:SS AM'')';
                      ISCOMMA := TRUE;
                   END IF;
                ELSIF (TYP = 'SYS.TIMESTAMP')
                THEN
                   RET := NEWLIST (J).DATA.GETTIMESTAMP (TIMESTAMP1);
                   IF TIMESTAMP1 IS NOT NULL
                   THEN
                      UPD_QRY := UPD_QRY || ' ' || '''' || TIMESTAMP1 || '''';
                      ISCOMMA := TRUE;
                   END IF;
                END IF;
             END IF;
          END LOOP;
          --Setting the WHERE condition from the PK columns
          UPD_QRY := UPD_QRY || ' WHERE ';
          FOR I IN 1 .. PK_COUNT
          LOOP
             SELECT COLUMN_NAME
               INTO PK_TABLE (I)
               FROM ALL_CONS_COLUMNS COL, ALL_CONSTRAINTS CON
              WHERE COL.TABLE_NAME = CON.TABLE_NAME
                AND COL.CONSTRAINT_NAME = CON.CONSTRAINT_NAME
                AND CON.CONSTRAINT_TYPE = 'P'
                AND POSITION = I
                AND CON.TABLE_NAME = ROWLCR.GET_OBJECT_NAME;
             FOR J IN 1 .. NEWLIST.COUNT
             LOOP
                IF NEWLIST (J) IS NOT NULL
                THEN
                   IF NEWLIST (J).COLUMN_NAME = PK_TABLE (I)
                   THEN
                      UPD_QRY := UPD_QRY || ' ' || NEWLIST (J).COLUMN_NAME
                                 || ' ' || EQUALS;
                      TYP := NEWLIST (J).DATA.GETTYPENAME;
                      IF (TYP = 'SYS.VARCHAR2')
                      THEN
                         RET := NEWLIST (J).DATA.GETVARCHAR2 (DATA1);
                         UPD_QRY := UPD_QRY || ' ' || '''' || SUBSTR (DATA1, 0, 253) || '''';
                      ELSIF (TYP = 'SYS.NUMBER')
                      THEN
                         RET := NEWLIST (J).DATA.GETNUMBER (NUM1);
                         UPD_QRY := UPD_QRY || ' ' || NUM1;
                      END IF;
                      IF I < PK_COUNT
                      THEN
                         UPD_QRY := UPD_QRY || ' AND ';
                      END IF;
                   END IF;
                END IF;
             END LOOP;
          END LOOP;
          UPD_QRY := UPD_QRY || ';';
          DBMS_OUTPUT.PUT_LINE (UPD_QRY);
          --Generate Update Query - End
       END LOOP;
    END;

    Thanks for your replies, HTH and Dipali.
    I would like to clarify some points from my side regarding the issue I raised.
    1. The no-data-found error is happening on a table for which supplemental logging is enabled.
    2. As I understand it, the apply process compares the existing data in the destination database with the "old" data in the LCR, and once there is a mismatch between the two, ORA-01403 is thrown. (Please tell me whether my understanding is correct or not.)
    3. This mismatch can be on a date field, or even on the millisecond of a timestamp.
    Now, the point I'm really wondering about:
    Somehow a mismatch got generated in the destination database (I'm not sure of the reason) and ORA-01403 is thrown.
    If we could update the destination database with the "old" data from the LCR, this mismatch should be resolved, shouldn't it?
    Reply to you, Dipali:
    If nothing else works out, I'm planning to put a conflict handler on all tables with the "OVERWRITE" option, using the following script:
    --Generate a script that applies a conflict handler to the tables for which supplemental logging is enabled
    declare
       count1   number;
       query    varchar2 (500) := null;
    begin
       for tables in (
          select table_name
            from user_tables
           where table_name in ("NAMES OF TABLES FOR WHICH SUPPLEMENTAL LOGGING IS ENABLED"))
       loop
          count1 := 0;
          dbms_output.put_line ('DECLARE');
          dbms_output.put_line ('cols DBMS_UTILITY.NAME_ARRAY;');
          dbms_output.put_line ('BEGIN');
          select max (position)
            into count1
            from all_cons_columns col, all_constraints con
           where col.table_name = con.table_name
             and col.constraint_name = con.constraint_name
             and con.constraint_type = 'P'
             and con.table_name = tables.table_name;
          for i in 1 .. count1
          loop
             query := null;
             select 'cols(' || position || ')' || ' := ' || '''' || column_name || ''';'
               into query
               from all_cons_columns col, all_constraints con
              where col.table_name = con.table_name
                and col.constraint_name = con.constraint_name
                and con.constraint_type = 'P'
                and con.table_name = tables.table_name
                and position = i;
             dbms_output.put_line (query);
          end loop;
          dbms_output.put_line ('DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(');
          dbms_output.put_line ('object_name => ''ICOOWR.' || tables.table_name || ''',');
          dbms_output.put_line ('method_name => ''OVERWRITE'',');
          dbms_output.put_line ('resolution_column => ''COLM_NAME'',');
          dbms_output.put_line ('column_list => cols);');
          dbms_output.put_line ('END;');
          dbms_output.put_line ('/');
          dbms_output.put_line ('');
       end loop;
    end;
    Reply to you, HTH:
    Our destination database is a replica of the source, and no triggers are running on any of these tables.
    This is not the first time I'm facing this issue. Earlier we had to take long outages, clear the replica database and apply a fresh dump from the source...
    Now I can't contemplate that situation again.

  • Two step confirmation with RF device (Mobile Data Entry) - 1st step only

    Hello
    I want to implement the following scenario:
    - WM transfer orders are created, relevant for two step confirmation;
    - a user working with RF device confirms the 1st step - picking;
    - then the materials are transported physically to another building (the same warehouse number);
    - another user working with RF device confirms the 2nd step - putaway.
    For confirming the 1st step I tried to use standard transaction LM07. Confirmation of the 1st step is working fine but right after that a screen for confirming the 2nd step is displayed to the user. I would like this transaction to finish just after confirming the 1st step and not to proceed with the 2nd step immediately - because the confirmation of the 2nd step should be done by another user (with some other transaction).
    What is more, if you exit the transaction manually after confirming the 1st step, the system saves the TO number (in table LRF_WKQU) and in some circumstances triggers the screen for confirming the 2nd step even if you do not run LM07 - this is the so-called "recovery" function, which is a nuisance in my case.
    So my question is: is there a way to customize LM07 so that it finishes processing automatically right after confirming the 1st step of a two-step confirmation TO (and without storing data for the "recovery" function)? Or should some other mobile data entry transaction be used for that?
    regards

    Hi Vlad,
    One more piece of information which I would like to share on this issue.
    We have raised an OSS message to SAP about it. They replied that on screen 1302 we have some custom code which is interfering with the system behaviour.
    We have analyzed this at our end and found that the enhancement on screen 1302 is restricted to its own functionality; in theory, this enhancement should not affect the system behaviour at all.
    SAP has suggested testing the scenario without any enhancement on screen 1302.
    We are still in discussion with SAP, trying to find a resolution.
    Can you also check in your system whether you have any enhancement on screen 1302? And if yes, does it interfere with the standard SAP code?
    Thanks & Regards,
    Raj.

  • Conditional format with large data fails and shows the error "Selection is too large" in Excel 2007

    I am facing an issue with a paste special operation using conditional formats on large data in Excel 2007.
    I have uploaded a file at the location given below:
    http://sdrv.ms/1fYC9qE
    The file contains two sheets: sheet "Data" contains the data to which the formats are to be applied, and sheet "FormatTables" contains the format tables that carry the conditional formatting.
    There are two tables in the "FormatTables" sheet. Both have some conditional formats applied to them.
    Case 1: 
    1. Select the table range of Table1, i.e. $A$2:$AV$2
    2. Copy it
    3. Go to sheet "Data"
    4. Select the data area, i.e. $A$1:$AV$20664
    5. Perform a paste special operation on the full range, selecting the "Formats" option.
    Result:
    It throws the error "Selection is too large".
    Case 2:
    1. Select the table range of Table2, i.e. $A$5:$AV$5
    2. Copy it
    3. Go to sheet "Data"
    4. Select the data area, i.e. $A$1:$AV$20664
    5. Perform a paste special operation on the full range, selecting the "Formats" option.
    Result:
    Formats get applied successfully.
    These are identical format tables with the same number of columns, applied to the same data range ($A$1:$AV$20664), yet one case works and the other fails.
    The only difference is that Table1's appliesTo range ($A$2:$T$2) covers only part of its total table range ($A$2:$AV$2), whereas Table2's appliesTo range ($A$5:$AV$5) equals its total table range ($A$5:$AV$5).
    NOTE: This issue occurs only in Excel 2007.

    Excel 2007 does not support picking up the formatting from another table when the source table has to cover more than 16,000 rows. If you want to apply it to more rows than that, you have to insert one more row into your format table so that it has three rows, e.g. A1:AV3, and then copy that formatting and apply it.
    Solution for Case 1:
    1. Take the table range of Table1 (i.e. row 2, at AV2) and drag it down by one row.
    2. Select the table range of Table1, i.e. $A$2:$AV$3
    3. Copy it
    4. Go to sheet "Data"
    5. Select the data area, i.e. $A$1:$AV$20664
    6. Perform a paste special operation on the full range, selecting the "Formats" option.

  • Reading large data files from the client machine (urgent)

    Hi,
    I want to know the best way to read a large data file, about 75 MB, from a client machine and insert its contents into the database.
    Can anybody provide sample code for it?
    Loading the file should be done on the client machine, and inserting into the database should be done on the server side.
    How should I load the file?
    How should I transfer this file or its data to the server?
    How should I insert it into the database?
    Thanks in advance.
    regards
    Kalyan

    Like I said before, you should be using your application server to serve files from the server off the filesystem. The database should not store files this big and should instead just hold a reference to the file.

    I think you have not understood the problem correctly. I will make it clear.
    The requirement is as follows.
    This is a J2EE-based application.
    The application server is Oracle Application Server.
    The database is Oracle 9i.
    It is a thick client (Swing-based application).
    The user enters a data source like c:\turkey.data.
    This turkey.data file contains data such as:
    1@1@20050131@1@4306286000113@D00@32000002005511069941@@P@10@0@1@0@0@0@DK@70059420@4330654016574@1@51881100@51881100@@99@D@40235@0@0@1@430441800000000@@11@D@42389@20050201@28483@15@@@[email protected]@@20050208@20050307@0@@@@@@@@@0@@0@0@0@430443400000800@0@0@@0@@@29@0@@@EUR
    Likewise we may have more than 3 lakh (300,000) rows in it.
    We need to read this file and transfer its contents to the application server (EJBs). There, each row of the file becomes one row in a database table, so we need to insert 3 lakh records into the database.
    We can use JDBC to insert the data, which is not a problem. The only problem is how to transfer this data to the server.
    I can do it one way (this is only an example): read all the data into a StringBuffer and pass it to the server, then on the server read the data back out of the StringBuffer and insert it into the database using JDBC.
    Done this way, it is a performance problem and takes a long time to insert into the database. It may even throw an OutOfMemoryError.
    I am just looking for a better way of doing this, one with good performance.
    Hope you have understood the problem.
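    A commonly used pattern for this kind of load (a sketch under stated assumptions, not this thread's confirmed solution): read the file line by line instead of buffering it whole, and insert with JDBC batching, so neither the client nor the server ever holds all 300,000 rows in memory. The table TURKEY_DATA, its three columns and the batch size of 500 below are illustrative assumptions:

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.sql.Connection;
    import java.sql.PreparedStatement;

    public class BulkLoader {
        // Reads the '@'-delimited file one line at a time and inserts in batches,
        // keeping memory use roughly constant regardless of file size.
        public static void load(Connection conn, String path) throws Exception {
            PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO TURKEY_DATA (COL1, COL2, COL3) VALUES (?, ?, ?)"); // hypothetical table/columns
            BufferedReader in = new BufferedReader(new FileReader(path));
            try {
                conn.setAutoCommit(false);
                String line;
                int pending = 0;
                while ((line = in.readLine()) != null) {
                    String[] f = line.split("@", -1); // -1 keeps empty fields
                    ps.setString(1, f[0]);
                    ps.setString(2, f[1]);
                    ps.setString(3, f[2]);
                    ps.addBatch();
                    if (++pending == 500) { // flush every 500 rows
                        ps.executeBatch();
                        pending = 0;
                    }
                }
                if (pending > 0) {
                    ps.executeBatch();
                }
                conn.commit();
            } finally {
                in.close();
                ps.close();
            }
        }
    }

    In the thick-client/EJB setup described above, the same idea applies across the wire: stream the file to the server in fixed-size chunks (for example via an HTTP upload to the application server) and let the server-side code batch the inserts, rather than shipping one giant StringBuffer.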

  • Problem with large data report

    I tried to run a template I got from Release 12 using data from the release we are using (11i). The XML file is about 13,500 KB when I run it from my desktop.
    I get the following error (mostly no output is generated; sometimes it is generated after a long time):
    Font Dir: C:\Program Files\Oracle\BI Publisher\BI Publisher Desktop\Template Builder for Word\fonts
    Run XDO Start
    RTFProcessor setLocale: en-us
    FOProcessor setData: C:\Documents and Settings\skiran\Desktop\working\2648119.xml
    FOProcessor setLocale: en-us
    I assumed there might be compatibility issues between 12i and 11i, so I tried writing my own template, and I ran into the same issue when I added the third nested loop.
    I also noticed javaws.exe running in the background, hogging a lot of memory. I am using BI Publisher version 5.6.3.
    I tried to run the template through the template viewer; the process never completes.
    The log file shows:
    [010109_121009828][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.setData(InputStream) is called.
    [010109_121014796][][STATEMENT] Logger.init(): *** DEBUG MODE IS OFF. ***
    [010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.setTemplate(InputStream)is called.
    [010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.setOutput(OutputStream)is called.
    [010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.setOutputFormat(byte)is called with ID=1.
    [010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.setLocale is called with 'en-US'.
    [010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.process() is called.
    [010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.generate() called.
    [010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] createFO(Object, Object) is called.
    [010109_121318828][oracle.apps.xdo.common.xml.XSLT10gR1][STATEMENT] oracle.xdo Developers Kit 10.1.0.5.0 - Production
    [010109_121318828][oracle.apps.xdo.common.xml.XSLT10gR1][STATEMENT] Scalable Feature Disabled
    End of Process.
    Time: 436.906 sec.
    FO Formatting failed.
    I can't figure out whether this is a looping issue, a large data issue, or a BI version issue. Please advise.
    Thank you

    The report will probably fail in a production environment if you don't have enough heap. 13 MB is a big XML file for the parsers to handle; it will probably crush the OPP. The whole document has to be loaded into memory, and preserving the relationships within the document is probably what is killing your performance. The OPP / FOProcessor does not use the SAX parser the way the bursting engine does. I would suggest setting a maximum on the number of documents that can be created per run and submitting them in a set of batches. That will reduce your XML file size, and performance will increase.
    An alternative to the previous approach would be to write a concurrent program that merges the PDFs using the document merger API. This would let you burst the document into a temp directory and then reassemble the pieces into one document. One disadvantage of this approach is that the resulting PDF is going to be huge. Also, if you have to send that thing to the printer you're going to have problems too: when you convert PDF to PS the files become massive because of the loss of compression, and it gets even worse if the PDF contains images... Then you'll have further problems with disk space on the server and/or running out of memory on PS printers.
    Everything I have discussed here I have done in some fashion. Speaking from experience, a 13 MB XML file is just a really bad idea. I would go with option one.
    Ike Wiggins
    http://bipublisher.blogspot.com

  • SEM-BCS Data-Entry-Layouts

    Hi all
    We are currently planning a migration from EC-CS to SEM-BCS. One goal is to replace the existing Excel VBA data entry forms, which are difficult to maintain.
    Before we evaluate a solution outside of SEM-BCS, we would like to check whether the data entry layouts would be appropriate.
    I was able to set up several forms (using data-driven or fixed structures). Data input works fine, and the data is available in the cube after saving.
    The problem: whenever I open a data entry layout again, the data already existing in the cube is not displayed. I can continue to enter data, which is also saved in the cube, but the existing data is never shown. This is not really user friendly.
    - Does anybody have any idea what could be wrong with my forms?
    - What are your experiences with the data entry layouts? Do you use them, or do you post data to SEM-BCS in a different way? What would you recommend?
    Thank you
    Best regards
    Markus

    The problem was solved by applying all existing Support Packages. Evidently there was an issue with the data entry forms that was fixed in one of them.
    Regards
    Markus

  • How to improve performance (insert, delete and search) of a table with large data

    Hi,
    I have a table which is used for maintaining history. It holds a large amount of data that keeps increasing or decreasing according to the business rules.
    I am getting performance issues with this table when searching for records and when inserting new data. I have already created an index on the table, but I am still facing a lot of performance problems.
    Also, we routinely insert bulk data into this table.
    Is there any way to improve this? Any suggestions are greatly appreciated.
    Thanks in Advance!

    Please do not duplicate your posts across forums. It's considered bad practice and rude, as people will not know what answers you've already received and may end up duplicating the effort.
    Locking this thread - please answer on the other thread.

  • Find out the time sheet data entry fields

    hi
    this is Satish,
    I inserted the timesheet values in the timesheet data entry view. Now my problem is: in which tables are these field values stored?
    thanks for all

    Hi:
    I need to transfer CATS data to FICO. I found out about transaction CAT7.
    At the moment, I have 2 issues:
    1) My CATSDB data has status "30"; however, there are no corresponding entries in CATSCO. Could you tell me what I need to do to ensure that CATSCO has values?
    2) My FICO system is running on 4.6C whereas my HR system is running on ECC 6.0. Does CAT7 still work?
    I appreciate your response.
    Thanks,
    Ash
