Error when exporting large amounts of data to Excel from APEX 4

Hi,
I'm trying to export over 30,000 rows of data from a report in APEX 4 to an Excel spreadsheet (not via a CSV file).
It appears to work, but then I get 'No Response from Application Web Server'. The report exports fine with smaller amounts of data.
We have just upgraded the application from APEX 3, where it worked without any problem, to APEX 4.
Has anyone else had this problem? We were wondering if there is a parameter in APEX 4 that needs to be set.
We are using Application Express 4.1.1.00.23 on Oracle 11g.
Any help would be appreciated.
Thanks
Sue

Hi,

> I'm trying to export over 30,000 rows of data from a report in APEX 4 to an Excel spreadsheet (not via a CSV file).

How? Application Builder > Data Workshop? An APEX page process? A (packaged) procedure?

> It appears to work, but then I get 'No Response from Application Web Server'. The report exports fine with smaller amounts of data.
> We have just upgraded the application from APEX 3, where it worked without any problem, to APEX 4.

Have you changed your web server in the process? Say, moved from OHS to the APEX Listener?

> Has anyone else had this problem? We were wondering if there is a parameter in APEX 4 that needs to be set.
> We are using Application Express 4.1.1.00.23 on Oracle 11g.
> Any help would be appreciated.

Similar Messages

  • DSS problems when publishing large amounts of data fast

    Has anyone experienced problems when sending large amounts of data using the DSS? I have approximately 130 to 150 items that I send through the DSS to communicate between different parts of my application.
    There are several loops publishing data. One publishes approximately 50 items in a rate of 50ms, another about 40 items with 100ms publishing rate.
    I send a command to a subprogram (125 ms) that reads and publishes the answer on a DSS URL (approx. 125 ms), so one item is on the DSS for about 250 ms. But this data is not seen in my main GUI window that reads the DSS URL.
    My questions are
    1. Is there any limit in speed (frequency) for data publishing in DSS?
    2. Can the DSS become unstable if loaded too much?
    3. Can I lose/miss data in any situation?
    4. In the DSS Manager I have doubled the MaxItems and MaxConnections. How will this affect my system?
    5. When I run my full application I have experienced the following error: Fatal Internal Error: "memory.ccp", line 638. Could this be a result of my large application and the heavy load on the DSS? (see attached picture)
    Regards
    Idriz Zogaj
    Idriz "Minnet" Zogaj, M.Sc. Engineering Physics
    Memory Professional
    direct: +46 (0) - 734 32 00 10
    http://www.zogaj.se

    LuI wrote:
    >
    > Hi all,
    >
    > I am frustrated with VISA serial comms. It looks so neat, and it's
    > fantastic what it's supposed to do for a developer, but sometimes one
    > runs into very deep trouble.
    > I have an app where I have to read large amounts of data streamed by
    > 13 µCs at 230 kBaud. (They do not necessarily all need to stream at the
    > same time.)
    > I use either a Moxa multiport adapter C320 with 16 serial ports or -
    > for test purposes - a Keyspan serial-2-USB adapter with 4 serial
    > ports.
    Does it work better if you use the serial port(s) on your motherboard?
    If so, then get a better serial adapter. If not, look more closely at VISA.
    Some programs have issues on serial adapters but run fine on a regular serial port. We've had that problem recently.
    Best, Mark

  • Azure Cloud service fails when sent large amounts of data

    This is the error;
    Exception in AZURE Call: An error occurred while receiving the HTTP response to http://xxxx.cloudapp.net/Service1.svc. This could be due to the service endpoint binding not using the HTTP protocol. This could also be due to an HTTP request context being aborted by the server (possibly due to the service shutting down). See server logs for more details.
    Calls with smaller amounts of data work fine; large amounts of data cause this error. How can I fix this?

    Go to the web.config file, look for the <binding> that is being used for your service, and adjust the various parameters that limit the maximum length of the messages, such as maxReceivedMessageSize:
    http://msdn.microsoft.com/en-us/library/system.servicemodel.basichttpbinding.maxreceivedmessagesize(v=vs.100).aspx
    Make sure that you specify a size that is large enough to accommodate the amount of data that you are sending (the default is 64 KB).
    Note that even if you set a very large value here, you won't be able to go beyond the maximum request length that is configured in IIS. If I recall correctly, the default limit in IIS is 8 megabytes.
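    As a minimal sketch (the binding name and the exact sizes here are placeholders, not values from the original post), the relevant web.config entries might look like this:

        <system.serviceModel>
          <bindings>
            <basicHttpBinding>
              <!-- allow messages up to ~100 MB; raise the reader quotas to match -->
              <binding name="largeMessageBinding"
                       maxReceivedMessageSize="104857600"
                       maxBufferSize="104857600">
                <readerQuotas maxArrayLength="104857600" maxStringContentLength="104857600" />
              </binding>
            </basicHttpBinding>
          </bindings>
        </system.serviceModel>
        <system.web>
          <!-- the IIS/ASP.NET request limit is separate; this value is in kilobytes -->
          <httpRuntime maxRequestLength="102400" />
        </system.web>

    Remember to reference the binding from your service's <endpoint> via bindingConfiguration="largeMessageBinding", or the defaults will still apply.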

  • Freeze when writing large amount of data to iPod through USB

    I used to take backups of my PowerBook to my 60 GB iPod video. Backups are taken with tar in Terminal, directly to the mounted iPod volume.
    Now, every time I try to write a large amount of data to the iPod (from a MacBook Pro), the whole system freezes (the mouse cursor moves, but nothing else can be done). When the USB cable is pulled out, the system recovers and acts as it should. This happens every time a large amount of data is written to the iPod.
    The same iPod works perfectly (when backing up) on the PowerBook, and small amounts of data can easily be written to it (on the MacBook Pro) without problems.
    Does anyone else have the same problem? Any ideas why this is happening and how to resolve the issue?
    MacBook Pro, 2.0Ghz, 100GB 7200RPM, 1GB Ram   Mac OS X (10.4.5)   IPod Video 60G connected through USB

    Ex PC user...never had a problem.
    Got a MacBook Pro last week...having the same issues...and this is now with an exchanged machine!
    I've read elsewhere that it's something to do with the USB timing out, and that if you add a new USB port (one that's powered separately) and attach the iPod there, it should work. Kind of a bummer, but those folks who tried it say it works.
    Me, I can upload to the iPod piecemeal, manually...but even then, it sometimes freezes.
    The good news is that once the iPod is loaded, the problem shouldn't happen. It's the large amounts of data.
    Apple should DEFINITELY fix this, though. Unbelievable.
    MacBook Pro 2.0   Mac OS X (10.4.6)  

  • Power BI performance issue when loading large amounts of data from a database

    I need to load a data set from my database, which has a large amount of data; it takes a long time to initialize the data before I can build a report. Is there a good way to process large amounts of data for Power BI? Since many people analyze data based on Power BI, are there any suggestions for loading large amounts of data from a database?
    Thanks a lot for help

    Hi Ruixue,
    We have made significant performance improvements to Data Load in the February update for the Power BI Designer:
    http://blogs.msdn.com/b/powerbi/archive/2015/02/19/6-new-updates-for-the-power-bi-preview-february-2015.aspx
    Would you be able to try again and let us know if it's still slow? With the latest improvements, it should take between one half and one third of the time it used to.
    Thanks,
    M.

  • Airport Extreme Intermittent Network Interruption when Downloading Large Amounts of Data.

    I've had an AirPort Extreme Base Station for about 2.5 years and had no problems until the last 6 months. I have my iMac and a PC directly connected through Ethernet and another PC connected wirelessly. I occasionally need to download very large data files that max out my download connection speed at about 2.5 Mb/s. During these downloads, my entire network loses its connection to the internet intermittently, for between 2 and 8 seconds at a time, with around 20-30 seconds between connection losses. This includes the hard-wired machines. I've tested a download with a direct connection to my cable modem without incident; the base station is causing the problem. I've attempted to reset the base station, with good results after the reset, but then the problem simply returns after a while. I've updated the firmware to the latest version with no change.
    Can anyone help me with the cause of the connection loss and a method of preventing it? THIS IS NOT A WIRELESS PROBLEM. I believe it has to do with the massive amount of data being handled. Any help would be appreciated.

    Ok, did some more sniffing around and found this thread.
    https://discussions.apple.com/thread/2508959?start=0&tstart=0
    It seems that the AEBS has had a serious flaw for the last 6 years that Apple has been unable to address adequately.  Here is a portion of the log file.  It simply repeats the same log entries over and over.
    Mar 07 21:25:17  Severity:5  Associated with station 58:55:ca:c7:c2:ae
    Mar 07 21:25:17  Severity:5  Installed unicast CCMP key for supplicant 58:55:ca:c7:c2:ae
    Mar 07 21:26:17  Severity:5  Disassociated with station 58:55:ca:c7:c2:ae
    Mar 07 21:26:17  Severity:5  Rotated CCMP group key.
    Mar 07 21:30:43  Severity:5  Rotated CCMP group key.
    Mar 07 21:36:41  Severity:5  Clock synchronized to network time server time.apple.com (adjusted +0 seconds).
    Mar 07 21:55:08  Severity:5  Associated with station 58:55:ca:c7:c2:ae
    Mar 07 21:55:08  Severity:5  Installed unicast CCMP key for supplicant 58:55:ca:c7:c2:ae
    Mar 07 21:55:32  Severity:5  Disassociated with station 58:55:ca:c7:c2:ae
    Mar 07 21:55:33  Severity:5  Rotated CCMP group key.
    Mar 07 21:59:47  Severity:5  Rotated CCMP group key.
    Mar 07 22:24:53  Severity:5  Associated with station 58:55:ca:c7:c2:ae
    Mar 07 22:24:53  Severity:5  Installed unicast CCMP key for supplicant 58:55:ca:c7:c2:ae
    Mar 07 22:25:18  Severity:5  Disassociated with station 58:55:ca:c7:c2:ae
    Mar 07 22:25:18  Severity:5  Rotated CCMP group key.
    Mar 07 22:30:43  Severity:5  Rotated CCMP group key.
    Mar 07 22:36:42  Severity:5  Clock synchronized to network time server time.apple.com (adjusted -1 seconds).
    Mar 07 22:54:37  Severity:5  Associated with station 58:55:ca:c7:c2:ae
    Mar 07 22:54:37  Severity:5  Installed unicast CCMP key for supplicant 58:55:ca:c7:c2:ae
    Anyone have any ideas why this is happening?

  • Export large ASCP plan data to Excel

    Hi,
    We have a need to export plan results from an ASCP plan to Excel. The Export option takes too long. I would like to know if there is an alternative method in Oracle EBS R12.1 ASCP to get the plan results into Excel or a similar format.
    Thanks,
    Ash

    If you are exporting a substantial number of records to Excel, the best option is to use CSV format and then open the generated CSV file with Excel.
    Exporting directly to Excel format works well with small amounts of data, but with a large number of records system memory becomes an issue.
    If you feel the need to export directly to Excel format, you can try to increase the maximum memory allocated to SQL Developer by adding something like the following
        AddVMOption -Xmx1024M
    to the sqldeveloper.conf file, usually located in
        [SQLDeveloper_install_dir]\sqldeveloper\bin
    Adjust the number as you see fit, but bear in mind that there have been issues reported while meddling with this parameter.
    The default value for the parameter is stored in the ide.conf file, usually located in
        [SQLDeveloper_install_dir]\ide\bin

  • OS X Server crashes when copying large amounts of data

    OK. I have set up a Mac OS X Server on a G4 Dual 867, configured as a standalone server. The only services running are VPN, AFP, and DNS (I am pretty sure the DNS is set up correctly). I have about 3 FireWire drives and 2 USB 2.0 drives hooked up to it.
    When I try to copy roughly 230 GB from one drive to another, it either just stops in the middle or CRASHES the server! I can't see anything out of the ordinary in the logs, though I am a newbie.
    I am stumped. Could this be hardware related? I just did a complete fresh install of OS X Server!

    This could be most anything: a disk error, a non-compliant device, a FireWire error (I've had FireWire drivers tip over Mac OS X with a kernel panic; if the cable falls out at an inopportune moment when recording in GarageBand, toes-up it all goes), or a memory error. It could also be a software error, or a FireWire device (or devices) simply drawing too much power.
    Try different combinations of drives, and replace one or more of these drives with another; start a sequence of elimination targeting the drives.
    Here's what Apple lists about kernel panics as an intro; it's the details from the panic log that will most probably be interesting...
    http://docs.info.apple.com/article.html?artnum=106228
    With some idea of which code is failing, it might be feasible to find a related discussion.
    A recent study out of CERN found three hard disk errors per terabyte of storage, so a clean install is becoming more a game of moving the errors around than actually fixing anything. FWIW.

  • NMH305 dies when copying large amounts of data to it

    I have an NMH305 still set up with the single original 500GB drive.
    I have an old 10/100 3COM rackmount switch (the old white one) uplinked to my Netgear WGR614v7 wireless router. I had the NAS plugged into the 3COM switch and everything worked flawlessly. The only problem was that it was only running at 100 Mbps.
    I recently purchased a TRENDnet TEG-S80g 10/100/1000 'green' switch and basically replaced the 3COM with it. To test the 1 Gbps speeds, I tried a simple drag & drop of about 4 GB worth of pics to the NAS on a mapped drive. After about 2-3 seconds, the NAS dropped and Explorer said it was no longer accessible. I could ping it, but the Flash UI was stalled.
    If I waited several minutes, I could access it again. I logged into the Flash UI and upgraded to the latest firmware, but had the same problem.
    I plugged the NAS directly into the Netgear router and transferred files across the wireless without issue. I plugged it back into the green switch and it dropped after about 6-10 pics transferred.
    I totally bypassed the switch and plugged it directly into my computer. I verified I can ping and log in to the Flash UI, then tried to copy files and it died again.
    It seems to only happen when running at 1 Gbps link speeds. The max transfer rate I was able to get was about 10 Mbps, but I'm assuming that's limited by the drive write speeds & controllers.
    Has anyone run into this before?
    TIA!

    Hi cougar694u,
    You may check this review: "click here". It is a thorough review of the Media Hub's write and read throughput vs. file size on a 1000 Mbps LAN.
    Cheers

  • Exporting large amounts of data

    I need help.
    I have to export the results of a query (possibly many rows, up to a million records) into a CSV file.
    Is it preferable to create a temporary file on the server and then serve it to the client, or
    is it better to write the response directly while browsing the recordset?
    Thanks
    Marco
    p.s.: sorry for my poor English.

    Even I am looking for the same: I want to create a CSV file inside a servlet and send the CSV file to the client using a download option.
    Regards,
    Suresh Babu G
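    A minimal sketch of the streaming approach in a servlet (the query, column names, and connection details are invented for illustration): write CSV rows straight to the response while iterating the ResultSet, so neither a temporary server-side file nor a million buffered rows is needed:

        import java.io.IOException;
        import java.io.PrintWriter;
        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.SQLException;
        import java.sql.Statement;
        import javax.servlet.http.HttpServlet;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;

        public class CsvExportServlet extends HttpServlet {
            @Override
            protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
                resp.setContentType("text/csv");
                // this header is what triggers the browser's download dialog
                resp.setHeader("Content-Disposition", "attachment; filename=\"export.csv\"");
                PrintWriter out = resp.getWriter();
                try (Connection con = getConnection();
                     Statement st = con.createStatement();
                     ResultSet rs = st.executeQuery("SELECT id, name, amount FROM orders")) {
                    out.println("ID,NAME,AMOUNT");
                    while (rs.next()) {
                        // quote the free-text column and escape embedded quotes
                        String name = rs.getString("name").replace("\"", "\"\"");
                        out.println(rs.getLong("id") + ",\"" + name + "\"," + rs.getBigDecimal("amount"));
                    }
                } catch (SQLException e) {
                    throw new IOException("CSV export failed", e);
                }
            }

            private Connection getConnection() throws SQLException {
                // placeholder: a real application would obtain a pooled connection from a DataSource
                return DriverManager.getConnection("jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
            }
        }

    The same loop also answers Marco's original question: streaming the response directly keeps memory use flat regardless of row count, whereas a temporary file only pays off if the export needs to be retried or audited.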

  • How to Send and Receive Large Amounts of Data to and from a  Web Service

    Hi All,
    My requirement is: a .NET web service should receive a file, make some modifications to it, and return the file. I need to write a client (in Java) to invoke that web service. Please help with the Java client code for accessing that service, and suggest any changes required in the web service code as well.
    My .Net web Service web method code:
    [WebMethod]
    public byte[] ByteEcho(byte[] data) {
        // ... some modification code ...
        return data;
    }

    That will work fine for small files, but it will potentially cause OutOfMemoryErrors on the client and/or server for large files. If you want to send/receive large files, you need to stream them.
    Also, be aware that you cannot send a raw byte[] via XML; you will need to encode the data using some sort of binary-to-text encoding, such as Base64.
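    As a small illustration of the encoding point (the file path is just an example), java.util.Base64 turns raw bytes into text that is safe to embed in a SOAP/XML payload:

        import java.nio.file.Files;
        import java.nio.file.Paths;
        import java.util.Arrays;
        import java.util.Base64;

        public class Base64Payload {
            public static void main(String[] args) throws Exception {
                // read the file to send; note this buffers the whole file in memory,
                // which is exactly the OutOfMemoryError risk mentioned above
                byte[] raw = Files.readAllBytes(Paths.get("upload.bin"));

                // encode for embedding in an XML element; the receiver decodes it back
                String encoded = Base64.getEncoder().encodeToString(raw);
                byte[] decoded = Base64.getDecoder().decode(encoded);

                System.out.println("original=" + raw.length + " bytes, encoded=" + encoded.length + " chars");
                System.out.println("round-trip ok: " + Arrays.equals(raw, decoded));
            }
        }

    For truly large files, Base64.getEncoder().wrap(outputStream) lets you encode as a stream instead of holding everything in memory, which pairs naturally with the streaming advice above.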

  • Sending large amounts of data (250K+) from Oracle to a Flex client

    I have an Oracle database with more than 250K rows of data that need to be sent to the AdvancedDataGrid and Flex charts for runtime analysis. What would be an ideal solution to implement this?

    I would say paging would be the way to go; downloading 250K rows to a client is not sensible. Any analysis that may be required should be performed server-side. Dumping 250K data points into a chart is also unlikely to be performant or necessary.
    You get paging with LCDS, and the new FlashBuilder 4 code generation features also support paging.
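    A rough sketch of the server-side paging idea with JDBC (the table and column names are invented; the nested ROWNUM pattern is used because it also works on pre-12c Oracle databases like the 11g mentioned earlier in this thread):

        import java.sql.Connection;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;
        import java.sql.SQLException;
        import java.util.ArrayList;
        import java.util.List;

        public class PagedQuery {
            // fetch one page of pageSize rows starting at offset, in a stable order
            static List<String> fetchPage(Connection con, int offset, int pageSize) throws SQLException {
                String sql =
                    "SELECT name FROM (" +
                    "  SELECT t.name, ROWNUM rn FROM (" +
                    "    SELECT name FROM measurements ORDER BY id" +
                    "  ) t WHERE ROWNUM <= ?" +
                    ") WHERE rn > ?";
                try (PreparedStatement ps = con.prepareStatement(sql)) {
                    ps.setInt(1, offset + pageSize);
                    ps.setInt(2, offset);
                    List<String> page = new ArrayList<>();
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            page.add(rs.getString("name"));
                        }
                    }
                    return page;
                }
            }
        }

    The Flex client then requests pages on demand (LCDS handles this for you), and any aggregation the charts need is done in SQL rather than by shipping 250K rows over the wire.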

  • Getting an error while trying to upload data from Excel via an SSIS package through a SQL Agent job

    We are getting the errors below.
    Error:
    The Microsoft Jet database engine cannot open the file '\\serversdev\Documents\QC Files\Prod.xls'. It is already opened exclusively by another user, or you need permission to view its data.
    Please suggest ASAP
    Regards,
    Ramu
    Ramu Gade

    Hi Dikshan Gade,
    According to your description, you want to upload data from Excel to a database, and when you call the SSIS package through a SQL Server Agent job, you get the error message above.
    To troubleshoot the problem, please refer to the following steps:
    Validate that the account has permissions on the file and the folder.
    Verify that the file and the data source name (DSN) are not marked as Exclusive.
    Make sure SQL Server Agent Services service account has the permission to access the database.
    We can check SQL Server Agent's activity logs, Windows Event logs and SSIS logs to get more clues. Also, the tool Process Monitor is helpful for tracking down the cause of registry or file access related issues. For more information about the issue, please refer to the following KB article:
    http://support.microsoft.com/kb/306269
    If you have any more questions, please feel free to ask.
    Thanks,
    Wendy Fu
    TechNet Community Support

  • Error while exporting large data from ReportViewer on an Azure-hosted website

    Hi,
    I have a website hosted on Azure. I use the SSRS ReportViewer control to display my reports, and while doing so I faced an issue.
    Whenever I export a large amount of data as Excel/PDF/Word/TIFF, it abruptly throws the following error:
    Error: Microsoft.Reporting.WebForms.ReportServerException: The remote server returned an error: (401) Unauthorized. ---> System.Net.WebException: The remote server returned an error: (401) Unauthorized.
    at System.Net.HttpWebRequest.EndGetResponse(IAsyncResult asyncResult)
    at Microsoft.Reporting.WebForms.SoapReportExecutionService.ServerUrlRequest(AbortState abortState, String url, Stream outputStream, String& mimeType, String& fileNameExtension)
    --- End of inner exception stack trace ---
    at Microsoft.Reporting.WebForms.SoapReportExecutionService.ServerUrlRequest(AbortState abortState, String url, Stream outputStream, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.SoapReportExecutionService.Render(AbortState abortState, String reportPath, String executionId, String historyId, String format, XmlNodeList deviceInfo, NameValueCollection urlAccessParameters, Stream reportStream, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.ServerReport.InternalRender(Boolean isAbortable, String format, String deviceInfo, NameValueCollection urlAccessParameters, Stream reportStream, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.ServerReport.Render(String format, String deviceInfo, NameValueCollection urlAccessParameters, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.ServerModeSession.RenderReport(String format, Boolean allowInternalRenderers, String deviceInfo, NameValueCollection additionalParams, Boolean cacheSecondaryStreamsForHtml, String& mimeType, String& fileExtension)
    at Microsoft.Reporting.WebForms.ExportOperation.PerformOperation(NameValueCollection urlQuery, HttpResponse response)
    at Microsoft.Reporting.WebForms.HttpHandler.ProcessRequest(HttpContext context)
    at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
    at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)
    It works locally (on the developer machine) or with less data, but it doesn't work with large data when published on Azure.
    Any help will be appreciated.
    Thanks.

    Sorry, let me clarify my questions as they were ambiguous:
    For a given set of input, does the request always take the same amount of time to fail? How long does it take?
    When it works (e.g. on the local machine using the same input), how big is the output file that gets downloaded?
    Also, if you can share your site name (directly or indirectly), and the UTC time when you made an attempt, we may be able to get more info on our side.

  • Finder issues when copying large numbers of files to an external drive

    When copying a large amount of data over FireWire 800, Finder gives me an error that a file is in use and locks the drive up. I have to force-eject. When I reopen the drive, there are a bunch of 0 KB files sitting in the directory that did not get copied over. This happens on multiple drives. I've attached a screenshot of what things look like when I reopen the drive after forcing an eject. Sometimes I have to relaunch Finder to get back up and running correctly. I've repaired permissions, for what it's worth.
    10.6.8, by the way; 2.93 GHz 12-core, 48 GB of RAM, fully up to date. This has been happening for a long time; I'm just now trying to find a solution.

    Scott Oliphant wrote:
    Iomega, LaCie, 500 GB, 1 TB, etc.; it seems to be drive-independent. I've formatted and started over with several of the drives, with the same result. If I copy the files over in smaller chunks (say, 70 GB) as opposed to 600 GB, the problem does not happen. It's like Finder is holding on to some of the info when it puts its "ghost" on the destination drive before it's copied over, and keeping the file locked when it tries to write over it.
    This may be a stretch since I have no experience with Iomega and no recent experience with LaCie drives, but the different results for large versus small transfers may be a tip-off.
    I ran into something similar with Seagate GoFlex drives, and the problem was heat. Virtually none of these drives are ventilated properly (i.e., no fans and not much, if any, airflow), and with extended use they get really hot and start to generate errors. Seagate's solution is to shut the drive down when not actually in use, which doesn't always play nice with Macs. Your drives may use a different technique for temperature control, or maybe none at all. Relatively small data transfers will allow the drives to recover; very large transfers won't, and to make things worse, as the drive heats up, the transfer rate will often slow down because of the errors. That can be seen if you leave Activity Monitor open and watch the transfer rate over time (a method which Seagate tech support said was worthless because Activity Monitor was unreliable and GoFlex drives had no heat problem).
    If that's what's wrong, there really isn't any solution except using the smaller chunks of data which you've found works.

Maybe you are looking for

  • "Document type" in access sequence  not working in free goods

    Dear All, our requirement is to provide free goods with the main items in the portal. For this I defined access sequences 1. Customer/Material and 2. SalesOrg/Document Type/Material. For access sequence Customer/Material it's working fine, but when I use S

  • TS5376 Have new dell xps system and cannot install I tunes: "Apple Application Support was not found"...Error 2 (Windows error 2)....any ideas??

    I am having tremendous difficulty installing iTunes onto my new Dell XPS (Windows 8). First it downloaded, but my library wasn't there and it would not recognize my iPod. Then I uninstalled it, and now it won't reinstall, saying: Apple Application Supp

  • Can't install itune win 7

    Can't upgrade iTunes to 10.6. This message pops up during installation, saying I have a problem with the Windows Installer Package. Does anyone know what I should do? Is this a Windows problem or a problem with iTunes?

  • Help please: back up?

    What's the best way to back up my data? I have about 2 GB of music, some photos, and a couple of documents; nothing vital yet, but as time goes on it will fill up. No iPod or external hard drive. Can I back up to a DVD? Is there a "one click" way to set this up?

  • HTP package question

    I have a feeling I'm in the wrong place for this question, but maybe someone can help. I'm not so good with Oracle configuration and setup issues, but I have created a package with a few procedures that output info using the HTP package. At my old plac