Streaming large data to client

Hi Friends,
Need some help.
I have to generate a PDF on the fly and stream it to the client over the internet, but the PDF might be as large as 100 MB, and streaming that much data over the network seems like a bad idea. Moreover, if the user is on dial-up it will take far too long to download. I cannot stream the data in any other format; it has to be a PDF.
Do you have any suggestions for handling this situation?
Any help is really appreciated.
Thanks,
BS

Hi, thanks for your reply.
"Then provide him with a link to a servlet that will read the file and send it back to him."
When the servlet reads that file back, if it is huge we still have the same network and download-speed issues; correct me if I am wrong.
How do you generally stream huge files to a client? I am sure this is not a new problem, but I have never encountered it before. Also, what are the options for generating read-only files?
My requirement is to generate read-only memos, and the user wants to generate several memos at once.
Please let me know if there is another way to handle this requirement.
Thanks.
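
A minimal sketch of the usual approach, streaming the generated PDF in fixed-size chunks so the servlet never holds the whole document in memory; the file location, buffer size, and class name here are illustrative assumptions, not something from the original thread:

    // Stream a generated PDF to the client in 8 KB chunks.
    import java.io.*;
    import javax.servlet.http.*;

    public class PdfStreamServlet extends HttpServlet {
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            File pdf = new File("/tmp/memo.pdf"); // hypothetical: PDF generated to a temp file first
            resp.setContentType("application/pdf");
            resp.setHeader("Content-Disposition", "attachment; filename=\"memo.pdf\"");
            resp.setContentLength((int) pdf.length());
            byte[] buf = new byte[8192]; // 8 KB chunks; the size is an arbitrary choice
            try (InputStream in = new BufferedInputStream(new FileInputStream(pdf));
                 OutputStream out = resp.getOutputStream()) {
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
            }
        }
    }

Note that compressing the response rarely helps here, since PDF content is already compressed internally; chunked streaming at least keeps server memory flat while a slow download proceeds.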

Similar Messages

  • Reading Large data files from client machine ( urgent)

    Hi,
    I want to know the best way to read a large data file, about 75 MB, from a client machine and insert its contents into the database.
    Can anybody provide sample code for it?
    Loading the file should be done on the client machine, and inserting into the database should be done on the server side.
    How should I load the file?
    How should I transfer the file or its data to the server?
    How should I insert it into the database?
    Thanks in advance.
    regards
    Kalyan

    Like I said before, you should be using your application server to serve files from the server off the filesystem. The database should not store files this big and should instead just hold a reference to the file.
    I think you have not understood the problem correctly.
    I will make it clear.
    The requirement is as follows.
    This is a j2ee based application.
    Application server is oracle application server.
    Database is oracle9i
    It is a thick client (a Swing-based application).
    User enters datasource like c:\turkey.data
    This turkey.data file contains data like:
    1@1@20050131@1@4306286000113@D00@32000002005511069941@@P@10@0@1@0@0@0@DK@70059420@4330654016574@1@51881100@51881100@@99@D@40235@0@0@1@430441800000000@@11@D@42389@20050201@28483@15@@@[email protected]@@20050208@20050307@0@@@@@@@@@0@@0@0@0@430443400000800@0@0@@0@@@29@0@@@EUR
    Likewise, the file may contain more than 3 lakh (300,000) rows.
    We need to read this file and transfer its contents to the application server, where EJBs read it; each row in the file becomes one row in a database table.
    So we need to insert about 3 lakh records into the database.
    We can use JDBC to insert the data, which is not a problem.
    The only problem is how to transfer the data to the server.
    I can do it one way, as an example:
    I can read all the data into a StringBuffer and pass it to the server.
    There I again get the data from the StringBuffer and insert it into the database using JDBC.
    Done this way, it is a performance problem, takes a long time to insert into the database, and may even throw an OutOfMemoryError.
    I am just looking for a better way of doing this that gives good performance.
    Hope you have understood the problem.
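
    A minimal sketch of one common approach, streaming the file over HTTP instead of building a giant StringBuffer; the URL, table name, and column handling are illustrative assumptions. The client streams the file body, and the server reads it line by line into a JDBC batch, so neither side ever holds the whole file in memory:

        import java.io.*;
        import java.net.HttpURLConnection;
        import java.net.URL;
        import java.sql.Connection;
        import java.sql.PreparedStatement;
        import java.sql.SQLException;

        public class BulkTransferSketch {

            // Client side (Swing app): stream the file to the server without loading it all.
            public static void upload(File data) throws IOException {
                HttpURLConnection con = (HttpURLConnection)
                        new URL("http://appserver/upload").openConnection(); // hypothetical URL
                con.setDoOutput(true);
                con.setRequestMethod("POST");
                con.setChunkedStreamingMode(8192); // stream the body instead of buffering it
                try (InputStream in = new FileInputStream(data);
                     OutputStream out = con.getOutputStream()) {
                    byte[] buf = new byte[8192];
                    int n;
                    while ((n = in.read(buf)) != -1) out.write(buf, 0, n);
                }
                System.out.println("Server replied: " + con.getResponseCode());
            }

            // Server side: read the request body line by line and batch-insert with JDBC.
            public static void insertAll(InputStream body, Connection db)
                    throws IOException, SQLException {
                BufferedReader reader = new BufferedReader(new InputStreamReader(body, "UTF-8"));
                try (PreparedStatement ps = db.prepareStatement(
                        "INSERT INTO turkey_data (raw_line) VALUES (?)")) { // hypothetical table
                    String line;
                    int count = 0;
                    while ((line = reader.readLine()) != null) {
                        ps.setString(1, line); // real code would split on '@' into typed columns
                        ps.addBatch();
                        if (++count % 1000 == 0) ps.executeBatch(); // flush every 1000 rows
                    }
                    ps.executeBatch(); // flush the remainder
                }
            }
        }

    Committing periodically (or handing the file to SQL*Loader) would likely speed up the Oracle side further; the point of the sketch is simply that the data is never materialized in memory on either side.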

  • Running out of memory while using cursored stream with large data

    We are following the suggestions/recommendations for the cursored stream:
    CursoredStream cursor = null;
    try {
        Session session = getTransaction();
        int batchSize = 50;
        ReadAllQuery raq = getQuery();
        raq.useCursoredStream(batchSize, batchSize);
        int num = 0;
        ArrayList<Request> limitRequests = null;
        int totalLimitRequest = 0;
        cursor = (CursoredStream) session.executeQuery(raq);
        while (!cursor.atEnd()) {
            Request request = (Request) cursor.read();
            if (num == 0) {
                limitRequests = new ArrayList<Request>(batchSize);
            }
            limitRequests.add(request);
            totalLimitRequest++;
            num++;
            if (num >= batchSize) {
                log.warn("Migrating batch of " + batchSize + " Requests.");
                updateLimitRequestFillPriceForBatch(limitRequests);
                num = 0;
                cursor.releasePrevious(); // release processed objects from the stream
            }
        }
        if (num > 0) {
            updateLimitRequestFillPriceForBatch(limitRequests); // flush the last partial batch
        }
    } finally {
        if (cursor != null) {
            cursor.close(); // always release the cursor, even on failure
        }
    }
    We commit every 50 records in the unit of work. If we set dontMaintainCache() on the ReadAllQuery we get PrimaryKeyExceptions intermittently, and we do not see much difference in the IdentityMap size.
    Any suggestions/ideas for dealing with large data sets? Thanks

    Hi,
    If I use read-only classes with CursoredStream and execute the query within a UnitOfWork, should that save any memory?
    I had to use a UOW because when I use a Session to execute the query I get
    6115: ISOLATED_QUERY_EXECUTED_ON_SERVER_SESSION
    Cause: An isolated query was executed on a server session: queries on isolated classes, or queries set to use exclusive connections, must not be executed on a ServerSession or in CMP outside of a transaction.
    I assume marking the descriptor as read-only will avoid registering in UOW, but I want to make sure that this is the case while using CursoredStream.
    We are running in OC4J(OAS10.1.3.4) with BeanManagedTransaction.
    Please suggest.
    Thanks
    -Raam
    Edited by: Raam on Apr 2, 2009 1:45 PM
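
    For what it's worth, a minimal sketch of the read-only variant being described, using the standard TopLink API (acquireUnitOfWork and addReadOnlyClass); whether this actually avoids UnitOfWork registration for cursored reads is exactly the question being asked, so treat it as an illustration rather than a confirmation:

        // Execute the cursored query through a UnitOfWork, but mark the class
        // read-only so its instances are not registered (and cloned) by the UOW.
        UnitOfWork uow = session.acquireUnitOfWork();
        uow.addReadOnlyClass(Request.class); // per-UOW read-only marking
        ReadAllQuery raq = new ReadAllQuery(Request.class);
        raq.useCursoredStream(50, 50);
        CursoredStream cursor = (CursoredStream) uow.executeQuery(raq);
        try {
            while (!cursor.atEnd()) {
                Request request = (Request) cursor.read();
                // ... process the object ...
                cursor.releasePrevious();
            }
        } finally {
            cursor.close();
            uow.release(); // read-only work: release instead of commit
        }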

  • Data streaming between server and client does not complete

    Using an ad-hoc app, data streaming between the server and client does not complete as it is supposed to.
    The process runs well on Solaris 5.8; however, under 5.9 we have found the character stream buffer length limit is around 900 to 950 characters (by default we use 3072 characters).
    Example:
    - When transferring an HTML file to be displayed in the app client with buffer=3072, the HTML is displayed/transferred only as xxxxxxxx characters, but with buffer=900 the HTML displays properly; in this case the only problem is that the file transfer takes longer than usual.
    - In another case we have to transfer information (data) as a stream to the client. A long data stream does not appear at all on the client.
    Any ideas why the change between 5.8 and 5.9 would cause problems?
    The current app driver that we are using was compiled under Solaris 5.6; if possible we would like to use a later version compiled under Solaris 5.9. Do you think this would solve our problem?
    Thanks
    Paul

    Does this have anything to do with Java RMI? or with Java come to think of it?

  • Error while Exporting large data from Reportviewer on azure hosted website.

    Hi,
    I have a website hosted on Azure. I used the SSRS ReportViewer control to display my reports, and while doing so I ran into an issue.
    Whenever I export a large amount of data as Excel/PDF/Word/TIFF, it abruptly throws the following error:
    Error: Microsoft.Reporting.WebForms.ReportServerException: The remote server returned an error: (401) Unauthorized. ---> System.Net.WebException: The remote server returned an error: (401) Unauthorized.
    at System.Net.HttpWebRequest.EndGetResponse(IAsyncResult asyncResult)
    at Microsoft.Reporting.WebForms.SoapReportExecutionService.ServerUrlRequest(AbortState abortState, String url, Stream outputStream, String& mimeType, String& fileNameExtension)
    --- End of inner exception stack trace ---
    at Microsoft.Reporting.WebForms.SoapReportExecutionService.ServerUrlRequest(AbortState abortState, String url, Stream outputStream, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.SoapReportExecutionService.Render(AbortState abortState, String reportPath, String executionId, String historyId, String format, XmlNodeList deviceInfo, NameValueCollection urlAccessParameters, Stream reportStream, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.ServerReport.InternalRender(Boolean isAbortable, String format, String deviceInfo, NameValueCollection urlAccessParameters, Stream reportStream, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.ServerReport.Render(String format, String deviceInfo, NameValueCollection urlAccessParameters, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.ServerModeSession.RenderReport(String format, Boolean allowInternalRenderers, String deviceInfo, NameValueCollection additionalParams, Boolean cacheSecondaryStreamsForHtml, String& mimeType, String& fileExtension)
    at Microsoft.Reporting.WebForms.ExportOperation.PerformOperation(NameValueCollection urlQuery, HttpResponse response)
    at Microsoft.Reporting.WebForms.HttpHandler.ProcessRequest(HttpContext context)
    at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
    at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)
    It works locally (on the developer machine) or with less data, but it does not work with large data once published on Azure.
    Any help will be appreciated.
    Thanks.

    Sorry, let me clarify my questions, as they were ambiguous:
    For a given set of input, does the request always take the same amount of time to fail? How long does it take?
    When it works (e.g. on the local machine using the same input), how big is the output file that gets downloaded?
    Also, if you can share your site name (directly or indirectly) and the UTC time when you made an attempt, we may be able to get more info on our side.

  • WCF service connection forcibly closed by the remote host for large data

    Hello,
    The WCF service is used to generate an Excel report. When the stored procedure returns large data, around 30,000 records, the service fails to return the data. Below is the error log:
    System.ServiceModel.CommunicationException: An error occurred while receiving the HTTP
    response to <service url> This could be due to the service
     endpoint binding not using the HTTP protocol. This could also be due to an HTTP request context being aborted by
    the server (possibly due to the service shutting down). See server logs for more details. ---> System.Net.WebException:
    The underlying connection was closed: An unexpected error occurred on a receive. ---> System.IO.IOException:
    Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.
    ---> System.Net.Sockets.SocketException: An existing connection was forcibly closed by the remote host
       at System.Net.Sockets.Socket.Receive(Byte[] buffer, Int32 offset, Int32 size, SocketFlags socketFlags)
       at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size)
       --- End of inner exception stack trace ---
       at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size)
       at System.Net.PooledStream.Read(Byte[] buffer, Int32 offset, Int32 size)
       at System.Net.Connection.SyncRead(HttpWebRequest request, Boolean userRetrievedStream, Boolean probeRead)
       --- End of inner exception stack trace ---
       at System.Net.HttpWebRequest.GetResponse()
       at System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan timeout).
       --- End of inner exception stack trace ---
    Server stack trace:
       at System.ServiceModel.Channels.HttpChannelUtilities.ProcessGetResponseWebException(WebException webException, HttpWebRequest request, HttpAbortReason abortReason)
       at System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan timeout)
       at System.ServiceModel.Channels.RequestChannel.Request(Message message, TimeSpan timeout)
       at System.ServiceModel.Dispatcher.RequestChannelBinder.Request(Message message, TimeSpan timeout)
       at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout)
       at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(IMethodCallMessage methodCall, ProxyOperationRuntime operation)
       at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage message)
    Exception rethrown at [0]:
       at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
       at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
       at IDataSetService.GetMastersData(Int32 tableID, String userID, String action, Int32 maxRecordLimit, Auditor& audit, DataSet& resultSet, Object[] FieldValues)
       at SPARC.UI.Web.Entities.Reports.Framework.Presenters.MasterPresenter.GetDataSet(Int32 masterID, Object[] procParams, Auditor& audit, Int32 maxRecordLimit).
    WEB CONFIG SETTINGS OF SERVICE
    <httpRuntime maxRequestLength="2147483647" executionTimeout="360"/>
    <binding name="BasicHttpBinding_Common"
             closeTimeout="10:00:00" openTimeout="10:00:00"
             receiveTimeout="10:00:00" sendTimeout="10:00:00"
             allowCookies="false" bypassProxyOnLocal="false"
             hostNameComparisonMode="StrongWildcard"
             maxBufferSize="2147483647" maxBufferPoolSize="0"
             maxReceivedMessageSize="2147483647"
             messageEncoding="Text" textEncoding="utf-8" transferMode="Buffered"
             useDefaultWebProxy="true">
      <readerQuotas maxDepth="2147483647"
                    maxStringContentLength="2147483647" maxArrayLength="2147483647"
                    maxBytesPerRead="2147483647" maxNameTableCharCount="2147483647" />
      <security mode="None">
    WEB CONFIG SETTINGS OF CLIENT
    <httpRuntime maxRequestLength="2147483647" requestValidationMode="2.0"/>
    <binding name="BasicHttpBinding_Common"
             closeTimeout="10:00:00" openTimeout="10:00:00"
             receiveTimeout="10:00:00" sendTimeout="10:00:00"
             allowCookies="false" bypassProxyOnLocal="false"
             hostNameComparisonMode="StrongWildcard"
             maxBufferSize="2147483647" maxBufferPoolSize="2147483647"
             maxReceivedMessageSize="2147483647"
             messageEncoding="Text" textEncoding="utf-8"
             transferMode="Buffered" useDefaultWebProxy="true">
      <readerQuotas maxDepth="2147483647"
                    maxStringContentLength="2147483647" maxArrayLength="2147483647"
                    maxBytesPerRead="2147483647" maxNameTableCharCount="2147483647" />

    Overriding the default binding settings in a WCF service's config file is not done the same way it is done in the client-side config file.
    A custom binding must be used in the WCF service-side config to override the default binding settings on the service side.
    http://robbincremers.me/2012/01/01/wcf-custom-binding-by-configuration-and-by-binding-standardbindingelement-and-standardbindingcollectionelement/
    The readerQuotas and everything else must be given in the custom binding to override any default settings on the WCF service side.
    Also, you are posting to the wrong forum.
    http://social.msdn.microsoft.com/Forums/vstudio/en-us/home?forum=wcf
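
    For illustration, a minimal sketch of the shape of the service-side custom binding that reply describes; the binding name is an arbitrary placeholder, the element names follow the standard WCF customBinding configuration schema, and the contract name is taken from the stack trace above:

        <!-- service-side: a customBinding carrying the quotas and size limits -->
        <customBinding>
          <binding name="LargeDataBinding"
                   receiveTimeout="10:00:00" sendTimeout="10:00:00">
            <!-- in a custom binding, reader quotas sit on the encoding element -->
            <textMessageEncoding>
              <readerQuotas maxDepth="2147483647"
                            maxStringContentLength="2147483647"
                            maxArrayLength="2147483647"
                            maxBytesPerRead="2147483647"
                            maxNameTableCharCount="2147483647" />
            </textMessageEncoding>
            <!-- transport-level size limits -->
            <httpTransport maxReceivedMessageSize="2147483647"
                           maxBufferSize="2147483647" />
          </binding>
        </customBinding>
        <!-- and the endpoint points at it -->
        <endpoint address="" binding="customBinding"
                  bindingConfiguration="LargeDataBinding"
                  contract="IDataSetService" />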

  • Dispatching large data from DB.

    Hi all.
    My web service will act as the middle tier between front-end web apps and a back-end Oracle DB. Some requests will return large data sets, say 30,000 - 40,000 records.
    How can I relay such large data from the back-end DB to the front-end web apps? Will a general SOAP string message suit this request? That approach produces a very large string stream.
    Thanks in advance.
    Tanin

    Hi,
    I haven't tested the specific code below, but here is how it would go.
    In a Page Load process, Before or After Header, write a PL/SQL block that creates the collection and loads the CLOBs into it:

    begin
      -- wipe out the collection if it already exists and create a new empty one
      apex_collection.create_or_truncate_collection('LOBCOLLECTION');
      -- load the CLOBs one by one from the cursor
      for r in ( select * from remote_tab@db_link ) loop
        apex_collection.add_member('LOBCOLLECTION', p_clob001 => r.clob_column);
      end loop;
    end;

    Once this is done you can query the collection using:

    select clob001
      from apex_collections
     where collection_name = 'LOBCOLLECTION'

    Regards,
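
    On the original question (relaying 30,000 - 40,000 records through a middle tier), a common alternative to returning one giant SOAP string is to page the result set, so each call carries a bounded slice; a minimal JDBC sketch, where the table, column, and page size are illustrative assumptions:

        // Page through a large result set instead of materializing it in one response.
        import java.sql.*;
        import java.util.ArrayList;
        import java.util.List;

        public class PagedFetch {
            // Returns one page of rows; the web service would expose pageNum/pageSize
            // as operation parameters so the client pulls the data in slices.
            public static List<String> fetchPage(Connection db, int pageNum, int pageSize)
                    throws SQLException {
                String sql =
                    "SELECT col FROM (" +
                    "  SELECT col, ROWNUM rn FROM (SELECT col FROM big_table ORDER BY col)" +
                    "  WHERE ROWNUM <= ?) " +
                    "WHERE rn > ?"; // classic Oracle ROWNUM pagination
                try (PreparedStatement ps = db.prepareStatement(sql)) {
                    ps.setInt(1, pageNum * pageSize);
                    ps.setInt(2, (pageNum - 1) * pageSize);
                    ps.setFetchSize(pageSize); // hint the driver to stream, not buffer
                    List<String> page = new ArrayList<String>();
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) page.add(rs.getString(1));
                    }
                    return page;
                }
            }
        }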

  • Can servlet know if data reaches client?

    Is there any way for a servlet to tell if the data it sent on the response output stream really reached the client? Presumably using raw sockets we could check whether the connection was closed cleanly or aborted, but the servlet API doesn't seem to have anything relating to this.
    Does the new fuss over web services provide something to help with this? It must be an issue for web services!
    Thanks,
    -Dominic

    Hi Mike,
    my experiments suggest otherwise:
    If my client makes a request and is then killed off, the server appears to notice, and the isCommitted() method on the response returns false. However, if I remove the network cable from the client machine after the request is made, the client connection times out and notices, but the server does not; the response just vanishes. There is no IOException in this case (even though I explicitly flush the stream).
    It seems incredible that the TCP/IP protocol knows the connection has gone but there's no way for the servlets to find out.
    Cheers,
    -Dominic

    "Mike Mormando" <[email protected]> had written:
    > When you flush the output stream, if the client isn't connected you get an IOException, I believe.
    > I think that's as close as it gets.
    > Mike
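
    A minimal sketch of the flush-and-catch approach Mike describes; as Dominic observes above, it only detects some disconnects (a client that closed the connection), not one that silently vanished:

        // Detect (some) client disconnects by flushing and catching IOException.
        import java.io.IOException;
        import java.io.OutputStream;
        import javax.servlet.http.*;

        public class DeliveryAwareServlet extends HttpServlet {
            protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                    throws IOException {
                resp.setContentType("application/octet-stream");
                OutputStream out = resp.getOutputStream();
                byte[] chunk = new byte[4096]; // placeholder payload
                try {
                    for (int i = 0; i < 100; i++) {
                        out.write(chunk);
                        out.flush(); // an aborted connection usually surfaces here...
                    }
                    log("response fully written and flushed");
                } catch (IOException e) {
                    // ...but, as noted above, a pulled network cable may never raise this
                    log("client connection lost: " + e.getMessage());
                }
            }
        }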

  • How to Improve large data loads ?

    Hello Gurus,
    Large data loads at my client take long hours. I have tried the recommendations from various blogs and SAP sites for the control parameters for DTPs and InfoPackages. I need some viewpoints on which parameters can be checked on the Oracle and Unix side. I would also appreciate some insight on:
    1) How to clear log files
    2) How to clear any cached-up memory in SAP BW
    3) Control parameters in Oracle and Unix for any improvements
    Thanks in advance.

    Hi,
    I think that work should be performed by the Basis team.
    2) You can delete the cache memory by using transaction RSRT: select the cache monitor and then delete.
    Thanks & Regards,
    RaviChandra

  • Large data-centric applications in flex?

    Has anyone built a large data-centric application in flex? Are there code examples online?
    I have read several articles about how to structure large applications in Flex, but I have not yet seen any code examples of how this is achieved in practice.
    Any help/links would be greatly appreciated, google hasn't helped me too much.
    M

    Tim, here is a bit of feedback about DCD.
    Data management only supporting int as the return type of createItem (the signature in the docs is createItem(item:myDatatype):int) seems a bit limiting. We should be able to return the PK's datatype, or possibly the bean itself. When we configure data management, the first thing we pick is the primary key, so the wizard should be able to validate that the return type of createItem matches the PK. updateItem is also problematic: we assume the bean is saved as-is, but if we want to update some additional fields server-side, we have to implement some kind of push so the client is updated. The update signature should be updateItem(item:myDatatype):myDatatype or updateItem(item:myDatatype):*.
    I really don't like the way the services are generated inline and with the endpoint hardcoded. In the past no endpoint was necessary, much less a fully hardcoded one. Since we point it at the services-config, why is there a need to write it in the code? Removing it from the generated services, we are still able to run successfully.
    Pagination should support a single count method for multiple paged queries, as I explained in another thread. An ER has been filed: https://bugs.adobe.com/jira/browse/FB-20809.
    Lazy-loading of related objects should be supported; supporting lazy-loading only for the first level (the object itself) is quite limiting, because complex objects (with many associations) will take a long time to load every single related object. For small projects that can be OK, but not for medium/large ones.
    Service invocation should be based on events like the rest of the frameworks, Flex included; that way we could easily replace the generated code with our own implementation.
    Form generation is quite useful; I just think it should be more advanced, e.g. enabling formatting and validation.

  • How to handle large data while acquisition? BNC 2110

    I want to acquire data using a BNC-2110; I am writing software in VB 6. We will use 3 channels. We are supposed to scan about 10,000 points before AcquiredData is triggered. In all we will need to scan 10,000 * 1000 * 1000 points before the data is put into a binary file. Can anybody let me know how to handle this large number of points?

    Hello Vjuno,
    In order to acquire 10,000,000,000 points you are going to have to stream the data to your hard drive as you go. To do this you'll need to write the data you read to a file on each loop iteration. In general it is good practice to make your "samples to read" at least 10% of your sample rate in seconds to avoid overflowing buffers; however, depending on your computer you may be able to go faster. I made an example program in LabVIEW and was able to read 10,000 points at a time from each of 3 analog inputs at 333 kHz and write the values to file without overflowing a buffer. However, even opening a web browser while the code was running was enough to delay the VI long enough for the buffer to overflow.
    You can use the DAQmx Configure Input Buffer call to increase the buffer size and account for spikes in CPU usage from other processes, and you should also monitor the "Available Samples Per Channel" property to make sure you aren't steadily gaining samples in your buffer. Since you want to acquire 10 billion samples at 1 MHz, this acquisition will take several hours; if you're not able to keep the buffer empty, it will become apparent before the end of your acquisition. By monitoring the samples in the buffer you can tell whether you're pulling the samples out fast enough; if you find the number steadily increasing, you should either reduce the sample rate or increase the number of samples to read on each DAQmx Read call.
    In my example program I used a write to TDMS (binary) file and a PCI-6251.
    I hope this helps, and have a good night.
    Cheers,
    Brooks
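
    The loop Brooks describes, read a chunk from the device and append it to the binary file on every iteration, looks roughly like this sketch (in Java for illustration; readChunk is a purely hypothetical stand-in for the DAQmx Read call):

        // Stream acquired samples to disk chunk by chunk so memory use stays flat.
        import java.io.*;

        public class AcquisitionLogger {
            public static void main(String[] args) throws IOException {
                long totalSamples = 10_000_000_000L; // 10 billion points overall
                int chunk = 100_000; // at least 10% of a 1 MS/s rate, per the guideline above
                double[] buf = new double[chunk];
                try (DataOutputStream out = new DataOutputStream(
                        new BufferedOutputStream(new FileOutputStream("acq.bin")))) {
                    for (long done = 0; done < totalSamples; done += chunk) {
                        readChunk(buf); // hypothetical: the hardware read goes here
                        for (double v : buf) {
                            out.writeDouble(v); // append this chunk to the binary file
                        }
                    }
                }
            }

            private static void readChunk(double[] buf) {
                // stand-in for the real acquisition call (e.g. DAQmx Read)
            }
        }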

  • View data in client B from client A in the same SID without a valid logon?

    Hi Folks
    We are planning to upgrade our 4.6C system to ERP 6.0, and are initially considering having two clients in the same sandbox SID. One would be for the developers to perform code remediation checks (client A), and one would contain a copy of production data for testing functionality over live data (client B).
    Would it be possible to view data in client B from client A in the same system without a valid logon to client B or an RFC connection to client B from client A? For example, via the use of an ABAP program to SQL the database?
    I know one can use transactions like SM30/SM31 to view, compare, and adjust data between clients, but this requires an RFC connection and a valid logon to the target client.
    Regards
    Kevin.

    Hi Kevin,
    Kevin McLatchie wrote:
    > Would it be possible to view data in client B from client A in the same system without a valid logon to client B or RFC connection to client B from client A? For example via the use of an ABAP program to...
    Short answer: yes.
    If someone has the right to write and execute ABAP reports on the system, he is able to access the data of all clients. So I don't think this setup is advisable: don't mix development and production data in one system.
    Best regards,
    Jan

  • UDI-00018: Data Pump client is incompatible with database version 11.2.0.1

    Hi
    I am trying to import data into Oracle 11g Release 2 (11.2.0.1) using the impdp utility and am getting the error below:
    UDI-00018: Data Pump client is incompatible with database version 11.2.0.1.0
    The export dump was taken on a database running Oracle 11g Release 1 (11.1.0.7.0) and I am trying to import into a higher version of the database. Is there any parameter I have to set to avoid this error?

    AUTHSTATE=compat
    A__z=! LOGNAME
    CLASSPATH=/app/oracle/11.2.0/jlib:.
    HOME=/home/oracle
    LANG=C
    LC__FASTMSG=true
    LD_LIBRARY_PATH=/app/oracle/11.2.0/lib:/app/oracle/11.2.0/network/lib:.
    LIBPATH=/app/oracle/11.2.0/JDK/JRE/BIN:/app/oracle/11.2.0/jdk/jre/bin/classic:/app/oracle/11.2.0/lib32
    LOCPATH=/usr/lib/nls/loc
    LOGIN=oracle
    LOGNAME=oracle
    MAIL=/usr/spool/mail/oracle
    MAILMSG=[YOU HAVE NEW MAIL]
    NLSPATH=/usr/lib/nls/msg/%L/%N:/usr/lib/nls/msg/%L/%N.cat
    NLS_DATE_FORMAT=DD-MON-RRRR HH24:MI:SS
    ODMDIR=/etc/objrepos
    ORACLE_BASE=/app/oracle
    ORACLE_HOME=/app/oracle/11.2.0
    ORACLE_SID=AMT6
    ORACLE_TERM=xterm
    ORA_NLS33=/app/oracle/11.2.0/nls/data
    PATH=/app/oracle/11.2.0/bin:.:/usr/bin:/etc:/usr/sbin:/usr/ucb:/home/oracle/bin:/usr/bin/X11:/sbin:.:/usr/local/bin:/usr/ccs/bin
    PS1=nbsud01[$PWD]:($ORACLE_SID)>
    PWD=/nbsiar/nbimp
    SHELL=/usr/bin/ksh
    SHLIB_PATH=/app/oracle/11.2.0/lib:/usr/lib
    TERM=xterm
    TZ=Europe/London
    USER=oracle
    _=/usr/bin/env

  • Query Error Information: Result set is too large; data retrieval ......

    Hi Experts,
    I have a problem with my query. When I execute my report and drill into the navigation panel, instead of a table with values the message "Result set is too large; data retrieval restricted by configuration" appears. I have already applied Note 1127156 ("Safety belt: Result set is too large"): I imported Support Package 13 for SAP NetWeaver 7.0 BI Java (BIIBC13_0.SCA / BIBASES13_0.SCA / BIWEBAPP13_0.SCA) and executed the program SAP_RSADMIN_MAINTAIN (in transaction SE38) with the object and value the note specifies, but the problem still appears.
    What could I be missing? How can I fix this issue?
    Thank you very much for helping me out (any help will be rewarded).
    David Corté

    You may ask your Basis team to increase the ESM buffer (rsdb/esm/buffersize_kb). Did you check the system's memory?
    Did you try to check the error dump using ST22 (runtime error analysis)?
    Edited by: ashok saha on Feb 27, 2008 10:27 PM

  • WAD : Result set is too large; data retrieval restricted by configuration

    Hi All,
    When trying to execute the web template with fewer restrictions, we get the error below:
    Result set is too large; data retrieval restricted by configuration
    Result set too large (758992 cells); data retrieval restricted by configuration (maximum = 500000 cells)
    When we add more restrictions, it produces output. For example, if we restrict by fiscal period, company code and brand we get output, but if we restrict by fiscal period alone it throws the above error.
    Note: we are on SP18.
    Do we need to change some configuration setting? If yes, where do we need to change it, or what else can we do to remove this error?
    Regards
    Karthik

    Hi Karthik,
    the standard setting for web templates is to display a maximum of 50,000 cells. The less you restrict your query, the more data will be displayed in the report; if more than 50,000 cells are needed the template will not execute correctly.
    In general it is advisable to restrict the query as much as possible: the more data you display, the worse your performance will be. If you have to display more data and you execute the query from Query Designer, or if you use the standard template, you can individually set the maximum number of cells. This is described over [here|Re: Bex Web 7.0 cells overflow].
    However, I do not know if (and how) you can set the maximum number of cells differently as a default setting for your template. This should be possible somehow, I think; if you find a solution for this please let us know.
    Brgds,
    Marcel
