Caching Large Data.

Hi,
I am working on an application that fires an SQL query and returns its data in a data structure.
The data structure is as follows:
LinkedList rows;
each 'rows' node references a String[] array (representing the fields of one row).
While creating the data structure, the program crashes with an OutOfMemoryError.
I know this is probably an obvious problem, but are there any solutions?
-Vallabh

Search the forum for increasing the amount of memory given to Java.
It's something like "java -Xmx2048m ..." (-Xmx raises the maximum heap size; -Xms sets the initial size).
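If raising the heap only postpones the crash, the usual fix is to avoid materializing every row at once. Below is a minimal JDBC sketch of that idea; the connection URL, query, and process() handler are hypothetical stand-ins, since the original post names none of them.

    import java.sql.*;

    public class StreamRows {
        public static void main(String[] args) throws SQLException {
            // Connection URL, credentials and query are placeholders.
            try (Connection con = DriverManager.getConnection("jdbc:yourdb://host/db", "user", "pass");
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery("SELECT * FROM some_table")) {
                int cols = rs.getMetaData().getColumnCount();
                while (rs.next()) {
                    // Build one row's String[] and hand it off immediately,
                    // instead of accumulating every row in a LinkedList.
                    String[] fields = new String[cols];
                    for (int i = 0; i < cols; i++) {
                        fields[i] = rs.getString(i + 1);
                    }
                    process(fields);
                }
            }
        }

        private static void process(String[] fields) {
            System.out.println(String.join(", ", fields));
        }
    }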

Similar Messages

  • How to improve large data loads?

    Hello Gurus,
    Large data loads at my client take long hours. I have tried the recommendations from various blogs and SAP sites for control parameters for DTPs and InfoPackages. I need some viewpoints on which parameters can be checked on the Oracle and Unix systems. I would also need some insight on:
    1) How to clear log files
    2) How to clear any cached up memory in SAP BW.
    3) Control parameters in Oracle and Unix for any improvements.
    Thanks in advance.

    Hi
    I think that work should be performed by the BASIS guys.
    2) You can delete the cache memory using transaction code RSRT: select the cache monitor and then delete.
    Thanks & Regards,
    RaviChandra

  • INPUT: KT4/V vs CRC error in large data transfer/CD burning HERE!

    This issue can be solved with a BIOS update. KT4V & KT4 Ultra users who are having this problem can request the TEST BIOS to test on your system. You may either pm/email me or Bas, or get it at ftp://ftp.heppen.be/MSI/
    Please report back whether the test BIOS would really fix the problem, or cause any new problem, or any performance hit.
    ** this sounds like a Christmas Gift to KT4V users AND New Year Gift to KT4 Ultra users!!!  :D  **
    To all KT4 Ultra and KT4V users: whether or not you have data corruption or CRC errors in large data transfers and CD burning, your inputs are needed.
    Please list your system specs in as much detail as possible. Below is a guideline; you may copy it (CTRL-C) and fill in your specs in your post.
    1. System specs:
    CPU:
    Motherboard:
    RAM Slot-1: (exact brand and model)
    RAM Slot-2:
    RAM Slot-3:
    display card:  [no overclock]
    IDE-1M:  (exact HDD brand and model pls)
    IDE-1S:
    IDE-2M:
    IDE-2S:
    IDE-3:
    SER-1:
    SER-2:
    PCI-1:
    PCI-2:
    PCI-3:
    PCI-4:
    PCI-5:
    PCI-6:
    PSU: (brand, model, total power, (estimated) combined power)
    BIOS revision:
    Operating System:
    VIA 4-in-1 drivers : (if you installed it, tell us the version)
    other drivers, services or applications that might affect data transfer, such as: PCI Latency patch, WPCREDIT modifications, VCool, CoolerXP...
    2. CRC ERROR?
    PASS or FAIL
    If PASS, let us know your BIOS settings.
    If FAIL, proceed as below:
    3. Please use these BIOS settings:
    1. Load BIOS Setup Default
    2. NO OVERCLOCK ON FSB! Set it according to your CPU
    3. Set RAM to
    a) SPD; if that fails, try
    b) user defined with the slowest RAM timings, i.e. 266, 2.5, 3, 6, 3, disable interleave, 4, disable 1T, normal
    If PASS, go for more extreme BIOS settings as you usually use:
    1. High Performance Default
    2. set RAM to the extreme timings
    3. DO NOT overclock yet until both 1. and 2. are PASS
    4. Try these suggestions:
    1. Microsoft IDE drivers (uninstall VIA 4-in-1's)
    2. VIA 4-in-1 different version's IDE filter driver?
    3. VIA IDE Miniport driver?
    4. use IDE-3 RAID channel for one HDD data transfer
    5. same HDD transfer, ie C:\dir1\*.* -> C:\dir2\*.*
    6. burn CD at 1x speed
    7. Set the HDD and/or CD to PIO mode, or slower UDMA mode.
    8. If and only if you know how to update BIOS correctly and willing to take some risks, try the BETA BIOS KT4 (1.25), KT4V (1.64) too.
    Please report back your tests and experiments with these suggestions.
    If you have any workaround for this issue other than setting the FSB back to 100 MHz, please tell us too.
    Thanks for your inputs!

    My system, just got it 2 days ago:
    CPU: Athlon XP 2000+
    Motherboard: MSI KT4V (MS-6712)
    RAM Slot-1:
    RAM Slot-2: 512 Mb Kingston DDR 333 CAS 2.5
    RAM Slot-3:
    display card: Abit Siluro GF3 Ti200
    IDE-1M: Western Digital WD800JB (8 Mb Cache) - 80 GB
    IDE-1S: Seagate U-Series ST360020A - 60 GB
    IDE-2M: Sony DVD-ROM 16x (DDU1621)
    IDE-2S: Creative CDRW121032
    IDE-3:
    SER-1:
    SER-2:
    PCI-1:
    PCI-2:
    PCI-3: Accton 1207F 10/100 Fast Ethernet Card
    PCI-4: SBLIVE 5.1 Platinum with Live!Drive II
    PCI-5:
    PCI-6:
    PSU: 400Watts (Generic)
    BIOS revision: 1.6
    Operating System: Windows 2000 with SP3
    VIA 4-in-1 drivers : Hyperion 4.45, only AGP and INF installed. IDE drivers are standard Win2k/SP3 ones.
    2. CRC ERROR?
    FAIL.
    When I got my system, I tried installing Windows ME, as I wanted to dual-boot together with Win2K. When I tried to install the NVIDIA Detonator drivers (ver 30.82), it proceeded normally and asked for a reboot, which I did; it then just hung before the start of Windows ME. I rebooted, selected "Normal" when Windows ME detected a failed startup, and was then able to enter Windows ME, but it reported that the NVIDIA Detonator drivers were invalid and of the wrong type for my display card.
    Later I tried to move my files from C:\ to D:\ and it reported that my destination file was invalid.
    When I changed my OS to Win2k/SP3 (no more dual-boot) and installed the same NVIDIA Detonator driver version, it worked. When I started to copy files again, it later BSODed with PAGE_FAULT_ERROR (something like that). When installing from CDs, it would report that my .CAB files were corrupt or that there was insufficient swap file space (I set mine manually at 1.2 GB). There were also times during reboots into Win2K when I found my keyboard and mouse (all PS/2) not working even though Windows loaded as usual.
    Later I changed my PCI latency setting from 32 to 96, and I managed to install from CD without much further issue.
    Reading the posts here, I hadn't realized that the MSI KT4 series and the KT400 chipset had so many issues! I have read countless sites like ExtremeTech and AnandTech, and none reported the particular errors I encountered during my first 2 days with this setup. (This is my first Athlon setup; I was previously an Intel person.)
    So far, my conclusions about my settings are:
    - 32-bit access in the BIOS settings for CD/DVD/CD-RW must be disabled; I concur with Shumway's recommendations.
    - DMA settings in Windows 2000 for CD/DVD/CD-RW must be set to PIO mode, otherwise copying from CD to HDD will produce read errors.
    - Installing the PCI latency fix (ver 1.9) really does wonders for my setup. Now I can copy files from all my drives without worrying so much about CRC errors. Thank you, George E. Breese.
    - I really want to know why I can't install the NVIDIA Detonator drivers in WinME, while in Win2K I can.
    I'll post again once I have done some more tests on my system, especially CD-R writes.
    Angel17

  • Query Error Information: Result set is too large; data retrieval ......

    Hi Experts,
    I have a problem with my query. When I execute my report and drill into my info in the navigation panel, instead of a table with values the message "Result set is too large; data retrieval restricted by configuration" appears. I already applied "Note 1127156 - Safety belt: Result set is too large": I imported Support Package 13 for SAP NetWeaver 7.0 BI Java (BIIBC13_0.SCA / BIBASES13_0.SCA / BIWEBAPP13_0.SCA) and executed the program SAP_RSADMIN_MAINTAIN (in transaction SE38) with the object and the value as Note 1127156 says... but the problem still appears.
    What could I be missing? How can I fix this issue?
    Thank you very much for helping me out. (Any help would be rewarded.)
    David Corté

    You may ask your BASIS guy to increase the ESM buffer (rsdb/esm/buffersize_kb). Did you check the system's memory?
    Did you try to check the error dump using ST22 - Runtime error analysis?

  • WAD : Result set is too large; data retrieval restricted by configuration

    Hi All,
    When trying to execute the web template with fewer restrictions, we get the error below:
    Result set is too large; data retrieval restricted by configuration
    Result set too large (758992 cells); data retrieval restricted by configuration (maximum = 500000 cells)
    But when we increase the number of restrictions it gives output. For example, if we give fiscal period, company code and brand, we are able to get output; but if we give fiscal period alone, it throws the above error.
    Note: We are on SP18.
    Do we need to change some setting in the configuration? If yes, where do we need to change it, or what else do we need to do to remove this error?
    Regards
    Karthik

    Hi Karthik,
    the standard setting for web templates is to display a maximum of 50,000 cells. The less you restrict your query, the more data will be displayed in the report. If you want to display more than 50,000 cells, the template will not be executed correctly.
    In general it is advisable to restrict the query as much as possible: the more data you display, the worse your performance will be. If you have to display more data and you execute the query from Query Designer, or if you use the standard template, you can individually set the maximum number of cells. This is described over [here|Re: Bex Web 7.0 cells overflow].
    However, I do not know if (and how) you can set a different maximum number of cells as a default setting for your own template. This should be possible somehow, I think; if you find a solution for this, please let us know.
    Brgds,
    Marcel

  • Result set is too large; data retrieval restricted by configuration

    Hi,
    While executing a query for a given period, the message 'Result set is too large; data retrieval restricted by configuration' is displayed. I searched SDN and referred to the following link:
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/d047e1a1-ad5d-2c10-5cb1-f4ff99fc63c4&overridelayout=true
    Steps followed:
    1) Transaction Code SE38
    2) In the program field, entered the report name SAP_RSADMIN_MAINTAIN and Executed.
    3) For OBJECT, entered the following parameter: BICS_DA_RESULT_SET_LIMIT_MAX
    4) For VALUE, entered the value for the size of the result set, and then executed the program:
    After the said steps, the below message is displayed:
    OLD SETTING:
    OBJECT =                                VALUE =
    UPDATE failed because there is no record
    OBJECT = BICS_DA_RESULT_SET_LIMIT_MAX
    A similar message is displayed for object BICS_DA_RESULT_SET_LIMIT_DEF.
    Please let me know how to proceed.
    Thanks in advance.

    Thanks for the reply!
    The objects are not available in the RSADMIN table.

  • How can I Cache the data I'm reading from a collection of text files in a directory using a TreeMap?

    How can I cache the data I'm reading from a collection of text files in a directory using a TreeMap? Currently my program reads the data from several text files in a directory and saves that information in a text file called output.txt. I would like to cache this data in order to use it later. How can I do this using the TreeMap class? The data I'd like to cache as the TreeMap's keys/values is (date from the file, time of the file, current time).
    import java.io.*;

    public class CacheData {
        public static void main(String[] args) throws IOException {
            String target_dir = "C:\\Files";
            String output = "C:\\Files\\output.txt"; // note the escaped backslash
            File dir = new File(target_dir);
            File[] files = dir.listFiles();
            // Open the PrintWriter once, before the loop.
            PrintWriter outputStream = new PrintWriter(output);
            for (File textfile : files) {
                if (textfile.isFile() && textfile.getName().endsWith(".txt")) {
                    BufferedReader inputStream = null;
                    try {
                        inputStream = new BufferedReader(new FileReader(textfile));
                        String line;
                        while ((line = inputStream.readLine()) != null) {
                            System.out.println(line);
                            // Write the line to output.txt as well.
                            outputStream.println(line);
                        }
                    } finally {
                        if (inputStream != null) {
                            inputStream.close();
                        }
                    }
                }
            }
            // Close the output stream after the loop, not inside it.
            outputStream.close();
        }
    }
    How can I cache the data I'm reading from a collection of text files in a directory using a TreeMap? Currently my program reads the data from several text files in a directory and saves that information in a text file called output.txt. I would like to cache this data in order to use it later. How can I do this using the TreeMap class?
    I don't understand your question.
    If you don't know how to use TreeMap why do you think a TreeMap is the correct solution for what you want to do?
    If you are just asking how to use TreeMap, there are PLENTY of tutorials on the internet, and the Java API documentation lists the methods that are available:
    TreeMap (Java Platform SE 7)
    Are you sure you want a map and not a tree instead?
    https://docs.oracle.com/javase/tutorial/uiswing/components/tree.html
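    If a TreeMap really is what you want, a minimal sketch might look like the code below. The key/value choice here (file name mapped to the file's last-modified timestamp plus the time it was read) is an assumption, since the original post does not pin the types down:

    import java.io.File;
    import java.time.Instant;
    import java.util.Map;
    import java.util.TreeMap;

    public class TreeMapCacheSketch {
        public static void main(String[] args) {
            // Sorted cache: file name -> "lastModified|timeRead" (assumed value format).
            Map<String, String> cache = new TreeMap<>();
            File dir = new File("C:\\Files"); // same directory as the original post
            File[] files = dir.listFiles();
            if (files != null) {
                for (File f : files) {
                    if (f.isFile() && f.getName().endsWith(".txt")) {
                        // Record the file's timestamp together with the current time.
                        String value = Instant.ofEpochMilli(f.lastModified()) + "|" + Instant.now();
                        cache.put(f.getName(), value);
                    }
                }
            }
            // Later lookups reuse the cached entries instead of re-reading the directory.
            for (Map.Entry<String, String> e : cache.entrySet()) {
                System.out.println(e.getKey() + " -> " + e.getValue());
            }
        }
    }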

  • Error while exporting large data from ReportViewer on an Azure-hosted website

    Hi,
    I have a website hosted on Azure. I used the SSRS ReportViewer control to display my reports, and while doing so I faced an issue.
    Whenever I export a large amount of data as Excel/PDF/Word/TIFF, it abruptly throws the following error:
    Error: Microsoft.Reporting.WebForms.ReportServerException: The remote server returned an error: (401) Unauthorized. ---> System.Net.WebException: The remote server returned an error: (401) Unauthorized.
    at System.Net.HttpWebRequest.EndGetResponse(IAsyncResult asyncResult)
    at Microsoft.Reporting.WebForms.SoapReportExecutionService.ServerUrlRequest(AbortState abortState, String url, Stream outputStream, String& mimeType, String& fileNameExtension)
    --- End of inner exception stack trace ---
    at Microsoft.Reporting.WebForms.SoapReportExecutionService.ServerUrlRequest(AbortState abortState, String url, Stream outputStream, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.SoapReportExecutionService.Render(AbortState abortState, String reportPath, String executionId, String historyId, String format, XmlNodeList deviceInfo, NameValueCollection urlAccessParameters, Stream reportStream, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.ServerReport.InternalRender(Boolean isAbortable, String format, String deviceInfo, NameValueCollection urlAccessParameters, Stream reportStream, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.ServerReport.Render(String format, String deviceInfo, NameValueCollection urlAccessParameters, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.ServerModeSession.RenderReport(String format, Boolean allowInternalRenderers, String deviceInfo, NameValueCollection additionalParams, Boolean cacheSecondaryStreamsForHtml, String& mimeType, String& fileExtension)
    at Microsoft.Reporting.WebForms.ExportOperation.PerformOperation(NameValueCollection urlQuery, HttpResponse response)
    at Microsoft.Reporting.WebForms.HttpHandler.ProcessRequest(HttpContext context)
    at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
    at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)
    It works locally (on the developer machine) or with less data, but it doesn't work with large data when published on Azure.
    Any help will be appreciated.
    Thanks.

    Sorry, let me clarify my questions as they were ambiguous:
    For a given set of input, does the request always take the same amount of time to fail? How long does it take?
    When it works (e.g. on the local machine using the same input), how big is the output file that gets downloaded?
    Also, if you can share your site name (directly or indirectly) and the UTC time when you made an attempt, we may be able to get more info on our side.

  • WCF service connection forcibly closed by the remote host for large data

    Hello,
    A WCF service is used to generate an Excel report. When the stored procedure returns large data (around 30,000 records), the service fails to return the data. Below is the error log:
    System.ServiceModel.CommunicationException: An error occurred while receiving the HTTP response to <service url>. This could be due to the service endpoint binding not using the HTTP protocol. This could also be due to an HTTP request context being aborted by the server (possibly due to the service shutting down). See server logs for more details.
    ---> System.Net.WebException: The underlying connection was closed: An unexpected error occurred on a receive.
    ---> System.IO.IOException: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.
    ---> System.Net.Sockets.SocketException: An existing connection was forcibly closed by the remote host
       at System.Net.Sockets.Socket.Receive(Byte[] buffer, Int32 offset, Int32 size, SocketFlags socketFlags)
       at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size)
       --- End of inner exception stack trace ---
       at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size)
       at System.Net.PooledStream.Read(Byte[] buffer, Int32 offset, Int32 size)
       at System.Net.Connection.SyncRead(HttpWebRequest request, Boolean userRetrievedStream, Boolean probeRead)
       --- End of inner exception stack trace ---
       at System.Net.HttpWebRequest.GetResponse()
       at System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan timeout).
       --- End of inner exception stack trace ---
    Server stack trace:
       at System.ServiceModel.Channels.HttpChannelUtilities.ProcessGetResponseWebException(WebException webException, HttpWebRequest request, HttpAbortReason abortReason)
       at System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan timeout)
       at System.ServiceModel.Channels.RequestChannel.Request(Message message, TimeSpan timeout)
       at System.ServiceModel.Dispatcher.RequestChannelBinder.Request(Message message, TimeSpan timeout)
       at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout)
       at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(IMethodCallMessage methodCall, ProxyOperationRuntime operation)
       at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage message)
    Exception rethrown at [0]:
       at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
       at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
       at IDataSetService.GetMastersData(Int32 tableID, String userID, String action, Int32 maxRecordLimit, Auditor& audit, DataSet& resultSet, Object[] FieldValues)
       at SPARC.UI.Web.Entities.Reports.Framework.Presenters.MasterPresenter.GetDataSet(Int32 masterID, Object[] procParams, Auditor& audit, Int32 maxRecordLimit).
    WEB CONFIG SETTINGS OF SERVICE
    <httpRuntime maxRequestLength="2147483647" executionTimeout="360"/>
    <binding name="BasicHttpBinding_Common" closeTimeout="10:00:00" openTimeout="10:00:00"
             receiveTimeout="10:00:00" sendTimeout="10:00:00" allowCookies="false"
             bypassProxyOnLocal="false" hostNameComparisonMode="StrongWildcard"
             maxBufferSize="2147483647" maxBufferPoolSize="0" maxReceivedMessageSize="2147483647"
             messageEncoding="Text" textEncoding="utf-8" transferMode="Buffered"
             useDefaultWebProxy="true">
      <readerQuotas maxDepth="2147483647" maxStringContentLength="2147483647"
                    maxArrayLength="2147483647" maxBytesPerRead="2147483647"
                    maxNameTableCharCount="2147483647" />
      <security mode="None" />
    </binding>
    WEB CONFIG SETTINGS OF CLIENT
    <httpRuntime maxRequestLength="2147483647" requestValidationMode="2.0"/>
    <binding name="BasicHttpBinding_Common" closeTimeout="10:00:00" openTimeout="10:00:00"
             receiveTimeout="10:00:00" sendTimeout="10:00:00" allowCookies="false"
             bypassProxyOnLocal="false" hostNameComparisonMode="StrongWildcard"
             maxBufferSize="2147483647" maxBufferPoolSize="2147483647"
             maxReceivedMessageSize="2147483647" messageEncoding="Text" textEncoding="utf-8"
             transferMode="Buffered" useDefaultWebProxy="true">
      <readerQuotas maxDepth="2147483647" maxStringContentLength="2147483647"
                    maxArrayLength="2147483647" maxBytesPerRead="2147483647"
                    maxNameTableCharCount="2147483647" />
    </binding>

    Binding configuration on the WCF service side to override the default settings is not done the same way it would be done in the client-side config file.
    A custom binding must be used in the WCF service-side config to override the default binding settings on the service side.
    http://robbincremers.me/2012/01/01/wcf-custom-binding-by-configuration-and-by-binding-standardbindingelement-and-standardbindingcollectionelement/
    The readerQuotas and everything else must be given in the custom binding to override any default settings on the WCF service side.
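    For illustration only, a service-side customBinding along the lines that article describes might look like the sketch below; the binding name and timeout values are placeholders, not taken from this thread:
    <bindings>
      <customBinding>
        <binding name="LargeDataCustomBinding" sendTimeout="00:10:00" receiveTimeout="00:10:00">
          <textMessageEncoding messageVersion="Soap11">
            <readerQuotas maxDepth="2147483647" maxStringContentLength="2147483647"
                          maxArrayLength="2147483647" maxBytesPerRead="2147483647"
                          maxNameTableCharCount="2147483647" />
          </textMessageEncoding>
          <httpTransport maxReceivedMessageSize="2147483647" maxBufferSize="2147483647" />
        </binding>
      </customBinding>
    </bindings>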
    Also, you are posting to the wrong forum.
    http://social.msdn.microsoft.com/Forums/vstudio/en-us/home?forum=wcf

  • Dynamic action - Cache server data

    APEX 4.2.2
    Is there a way to cache server-side data in a global (page/document level) JavaScript structure (a bunch of key-value pairs: an associative array, an array of objects, or some such)? That way the data can be used by subsequent dynamic actions' JavaScript code without querying the server over and over.
    I see that apex_util.json_from_sql is still undocumented after all these years. Are there any examples of using that API to do this sort of thing? Can someone please share?
    Thanks

    https://apex.oracle.com/pls/apex/f?p=57688:24
    I took a shot at this and it seems to work well using JavaScript global objects for data storage. Hope this helps someone.
    1. Page attributes - JavaScript global variables:
       var gData, gLookup = {};
    2. On-demand application process Get_Emps as:
       apex_util.json_from_sql('select empno,ename from emp');
    3. Page Load dynamic action to invoke the application process and cache the data:
       var get = new htmldb_Get(null, $v('pFlowId'), 'APPLICATION_PROCESS=Get_Emps', $v('pFlowStepId'));
       var retval = get.get();
       get = null;
       gData = apex.jQuery.parseJSON(retval);
       $.each(gData.row, function(i, obj) {
          gLookup[obj.EMPNO] = obj.ENAME;
       });
    4. EMPNO page item - standard OnChange dynamic action to SetValue of the ENAME page item using the cached data, with the JavaScript expression:
       gLookup[$v(this.triggeringElement)]
    "We document something when it is ready to be supported. This specific procedure isn't something we have invested in, and thus, per our discretion, we have elected not to document or support it at this time."
    Fair enough.

  • Large Data file problem in Oracle 8.1.7 and RedHat 6.2EE

    I've installed RedHat 6.2EE (Enterprise Edition Optimized for Oracle8i) and Oracle EE 8.1.7. I am able to create very large files (> 2 GB) using standard commands such as 'cat', 'dd', .... However, when I create a large data file in Oracle, I get the following error messages:
    create tablespace ts datafile '/data/u1/db1/data1.dbf' size 10000M autoextend off
    extent management local autoallocate;
    create tablespace ts datafile '/data/u1/db1/data1.dbf' size 10000M autoextend off
    ERROR at line 1:
    ORA-19502: write error on file "/data/u1/db1/data1.dbf", blockno 231425
    (blocksize=8192)
    ORA-27069: skgfdisp: attempt to do I/O beyond the range of the file
    Additional information: 231425
    Additional information: 64
    Additional information: 231425
    Do anyone know what's wrong?
    Thanks
    david

    I've finally solved it!
    I downloaded the following jre from blackdown:
    jre118_v3-glibc-2.1.3-DYNMOTIF.tar.bz2
    It's the only one that seems to work (and god, have I tried them all!)
    I've no idea what the DYNMOTIF means (apart from being something to do with Motif - but you don't have to be a linux guru to work that out ;)) - but, hell, it works.
    And after sitting in front of this machine for 3 days trying to deal with Oracle's frankly PATHETIC install, which is so full of holes and bugs, that's all I care about.
    The one bundled with Oracle 8.1.7 doesn't work with Linux RedHat 6.2EE.
    Doesn't Oracle test their software?
    Anyway I'm happy now, and I'm leaving this in case anybody else has the same problem.
    Thanks for everyone's help.

  • Best approach to return large data (> 4K) from a stored proc

    We have a stored proc (Oracle 8i) that:
    1) receives some parameters,
    2) performs computations which create a large block of data,
    3) returns this data to the caller.
    It must be compatible with both ASP (using MSDAORA.Oracle) and ColdFusion (using the Oracle ODBC driver). This procedure is critical in terms of performance.
    I have written this procedure as having an OUT param which is a REF CURSOR to a record containing a LONG. In order to make this work, at the end of the procedure I have to store the working buffer (an internal LONG variable) into a temp table, and then open the cursor as a SELECT from the temp table.
    I have tried to open the cursor as a SELECT of the working buffer (from dual) but I get the error "ORA-01460: unimplemented or unreasonable conversion requested"
    I suspect this is taking too much time; any tips on the best approach here? Is there a resource with REAL examples of returning large data?
    If I switch to a CLOB, will it speed up the process, be compatible with the callers, etc.? All the references to CLOB I saw use trivial examples.
    Thanks for any help,
    Yoram Ayalon

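    No reply survives in this thread, but to illustrate the CLOB route in code: if the procedure exposed a CLOB OUT parameter, a JDBC caller could stream it as sketched below. The procedure name, parameters and connection details are invented for the example; the original callers were ASP and ColdFusion, so this is only a rough analogue.

    import java.sql.*;

    public class ClobCaller {
        public static void main(String[] args) throws SQLException {
            // Connection details are placeholders.
            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "pass");
                 CallableStatement cs = con.prepareCall("{call get_large_data(?, ?)}")) {
                cs.setInt(1, 42);                       // hypothetical IN parameter
                cs.registerOutParameter(2, Types.CLOB); // CLOB OUT instead of LONG
                cs.execute();
                Clob clob = cs.getClob(2);
                // Stream the CLOB instead of materializing it all at once.
                try (java.io.Reader r = clob.getCharacterStream()) {
                    char[] buf = new char[8192];
                    int n;
                    while ((n = r.read(buf)) != -1) {
                        System.out.print(new String(buf, 0, n));
                    }
                } catch (java.io.IOException e) {
                    throw new SQLException(e);
                }
            }
        }
    }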

  • Error when opening large data forms

    Hi,
    We are working on a Workforce planning implementation. We have 2 large custom defined dimensions.
    When opening large data forms we get a standard "Error has occurred" error. If we reduce the member selection the data form opens fine.
    Is anyone aware of a setting that can be increased to open large data forms? I'm not referring to the "Warn if data form is larger than 3000 cells" setting.
    I'm pretty sure there is a parameter that can be increased, but I can't find it.
    Thanks for your help.
    Seb

    Hi Seb,
    If you do find the magic parameter then let us know because I would be interested to know.
    It is probably something like ALLOW_LARGE_FORMS = true :)
    In the Planning logs, is the error related to Planning or is it an Essbase-related error?
    Is it failing due to the number of rows, or because it is going beyond the max of 256 columns?
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Conditional format with large data fails with the error "Selection is too large" in Excel 2007

    I am facing an issue with a paste special operation using conditional formats for large data in Excel 2007.
    I have uploaded a file at the location given below:
    http://sdrv.ms/1fYC9qE
    The file contains two sheets: sheet "Data" contains the data to which the formats are to be applied, and sheet "FormatTables" contains the format tables with conditional formatting.
    There are two tables in the "FormatTables" sheet, both with conditional formats applied.
    Case 1: 
    1. Select the table range of Table1 i.e $A$2:$AV$2
    2. Copy it
    3. Goto Sheet "Data" 
    4. Select data area i.e $A$1:$AV$20664
    5. Perform a paste special operation on full range and select "Formats" option while performing paste special.
    Result:
    It throws the error "Selection is too large".
    Case 2:
    1. Select the table range of Table2 i.e $A$5:$AV$5
    2. Copy it
    3. Goto Sheet "Data" 
    4. Select data area i.e $A$1:$AV$20664
    5. Perform a paste special operation on full range and select "Formats" option while performing paste special.
    Result:
    Formats get applied successfully.
    Both are the same format tables with the same number of columns, applied to the same data range ($A$1:$AV$20664), yet one case works and the other fails.
    The only difference is that Table1's appliesTo range ($A$2:$T$2) covers only part of its total table range ($A$2:$AV$2), whereas Table2's appliesTo range ($A$5:$AV$5) is the same as its total table range ($A$5:$AV$5).
    NOTE: This issue occurs only in Excel 2007.

    Excel 2007 does not support taking formatting from another table when more than 16,000 rows are involved; if you want to apply it to more rows than that, you have to insert one more row in your format table so that it has 3 rows,
    like: A1:AV3
    then try to copy that formatting and apply it.
    Solution for Case 1:
    1. Select the table range of Table1 (i.e. down to AV2) and drag it down one row.
    2. Select the table range of Table1, i.e. $A$2:$AV$3
    3. Copy it
    4. Goto Sheet "Data" 
    5. Select data area i.e $A$1:$AV$20664
    6. Perform a paste special operation on full range and select "Formats" option while performing paste special

  • Does the client automatically cache the data it gets from the cache server?

    Hi expert,
    I have 2 questions about the client local cache. Would you please help to give me some suggestion?
    1. Will the client automatically cache locally the data it gets from the cache server the first time, and automatically update the data in the local cache when getting the same data from the cache server again? I went through the API reference but cannot find any API to query the data currently cached in the local cache.
    2. If the client does automatically cache the data it gets from the cache server, is there any way for a client to get the events that happen to its local cache, such as an entry created in the local cache, an entry deleted from the local cache, or an entry updated in the local cache? In my opinion, when getting an entry from the cache server the first time, the MapListener's entry create event should be triggered; when getting the same entry again, the entry update event should be triggered.
    However, I have tried a client with a replicated cache, a client with a partitioned cache, an extend client with a remote cache, and a client with a local cache (the front cache part of a near cache); the client (with the MapListener set on the NamedCache object) does not get any event notification after getting data from the cache server. By the way, my listener is OK, since when putting data the entry create and entry update events are triggered.
    Your suggestion is very appreciated. :)

    Hi
    If I were you I would read this http://download.oracle.com/docs/cd/E14526_01/coh.350/e14510/toc.htm
    and particularly the section about Near Caching here http://download.oracle.com/docs/cd/E14526_01/coh.350/e14510/nearcache.htm#CDEFEAJG
    which is what you are asking about in your question.
    Near Caching is how Coherence stores data locally, which is the answer to your first question. How Near Caching works is explained in the documentation.
    Events, which you ask about in your second question are explained here http://download.oracle.com/docs/cd/E14526_01/coh.350/e14510/delivereventsjava.htm#CBBIIEFA
    It might be that ContinuousQueryCache is closer to what you want. This is explained here http://download.oracle.com/docs/cd/E14526_01/coh.350/e14510/queryabledatafabric.htm#sthref38 A ContinuousQueryCache is like having a sub-set of the underlying cache on the local client which you can then listen to etc...
    JK
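    For what it's worth, a minimal sketch of the ContinuousQueryCache approach JK describes might look like the following; the cache name and key are assumptions, and the API calls are those described in the Coherence 3.5 documentation linked above:

    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;
    import com.tangosol.net.cache.ContinuousQueryCache;
    import com.tangosol.util.MapEvent;
    import com.tangosol.util.MultiplexingMapListener;
    import com.tangosol.util.filter.AlwaysFilter;

    public class CqcExample {
        public static void main(String[] args) {
            NamedCache remote = CacheFactory.getCache("example-cache"); // assumed cache name
            // Keep a local, continuously updated view of the remote cache.
            ContinuousQueryCache local =
                    new ContinuousQueryCache(remote, new AlwaysFilter());
            // Local events now fire as the server-side data changes.
            local.addMapListener(new MultiplexingMapListener() {
                @Override
                protected void onMapEvent(MapEvent evt) {
                    System.out.println("Event: " + evt);
                }
            });
            // Reads are served from the local view.
            System.out.println("someKey -> " + local.get("someKey"));
        }
    }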
