CR 11.0 Large DataSet & Load Time

I'm creating a report on a table which has 250k+ rows and it takes a few minutes for the dataset to load.  The report has 3 parameters, two of which are dynamic and come from two fields in the table.
I'm trying to delay loading of the full dataset until after the parameters have been chosen during run-time.  I've tried populating the parameters using views which filter the data in the database, but the full datasets are still being loaded prior to the parameter forms.
Is it possible to get the dynamic parameters first, before pulling the rest of the report's data?

Please re-post to the Data Connectivity - Crystal Reports forum if this is still an issue, or purchase a support case and have a dedicated support engineer work with you directly.

Similar Messages

  • Optimization on Load Time (Large Images and Sounds)

    Hi,
    I have an all-Flash website that works by having each portion of it load and unload in the center of a frame based on the navigation chosen.  Load times on everything but one part of my site are OK.
    The dimensions of the part inside the frame are 968x674.  I know that's relatively large for a flash file, but that can't be changed at this point.
    Within the page there are objects that come up when you find an item on the screen.  It's a dialog box that shows text and an image, and it has a voice-over that reads what the text says.  After you're done looking at it, there is an x-button to dismiss the window.  There are 15 of these and they are exported to the ActionScript.  I add them in the ActionScript via addChild.  I figured this is a huge spot that could be reconfigured, but I'm not sure if it would make much of a difference.
    The other huge thing is there is a ton on the screen.  It starts you off in an environment, and when you click 3 different sections, it zooms into that portion allowing you to look for the items you're trying to find.
    There is also a man that talks and animates in the beginning and at the end.  We are planning on taking out the end part and putting a text box up.
    Here is the website.  It's a propane safety site for kids.  I know it's kind of a weird idea, but it works and people seem to like it.
    www.propanekids.com
    The two parts of the site we are trying to optimize are the main menu when you first arrive at the site, and the "Explore" section.  (Just click on the cloud on the front page.)
    If someone could take the time to give me some new ideas of how I can get these loading times down, I would greatly appreciate it!
    Thanks


  • Loading time into memory for a large datastore ?

    Are there any analyses/statistics on what the loading time for a TimesTen data store should be, according to the size of the data store?
    We have a problem with one of our clients where loading the datastore into memory takes a long time, but only in certain instances. The maximum size for the data store is set to 8GB (64-bit AIX with 45GB physical memory). Is it something to do with transactions which are not committed?
    Also, is it advisable to have multiple smaller datastores or one single large datastore?

    When a TimesTen datastore is loaded into memory it has to go through the following steps. If the datastore was shut down (unloaded from memory) cleanly, then the recovery steps essentially are no-ops; if not then they may take a considerable time:
    1. Allocate appropriately sized shared memory segment from the O/S (on some O/S this can take a significant time if the segment is large)
    2. Read the most recent checkpoint file into the shared memory segment from disk. The time for this step depends on the size of the checkpoint file and the sustained read performance of the storage subsystem; a large datastore, slow disks or a lot of I/O contention on the disks can all slow down this step.
    3. Replay all outstanding transaction log files from the point corresponding to the checkpoint until the end of the log stream is reached, then roll back any still-open transactions. If there is a very large amount of log data to replay then this can take quite some time. This step is skipped if the datastore was shut down cleanly.
    4. Any indices that would have been modified during the log replay are dropped and rebuilt. If there are many indices, on large tables, that need to be rebuilt then this step can also take some time. This phase can be done in parallel (see the RecoveryThreads DSN attribute).
    Once these 4 steps have been done the datastore is usable, but if recovery had to be done then we will immediately take a checkpoint which will happen in the background.
    As you can see from the above there are several variables and so it is hard to give general metrics. For a clean restart (no recovery) then the time should be very close to size of datastore divided by disk sustained read rate.
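    As a rough worked example of that rule of thumb (the 8GB maximum is from the question; the 150MB/s sustained read rate is an assumed figure for illustration only):

```java
public class RestartEstimate {
    public static void main(String[] args) {
        // Clean-restart estimate: datastore size / sustained disk read rate
        double datastoreMB = 8 * 1024;  // 8 GB maximum datastore size (from the question)
        double readMBps = 150.0;        // assumed sustained read rate of the storage
        System.out.printf("~%.0f seconds%n", datastoreMB / readMBps);
    }
}
```

    On such a disk, a clean load of a full 8GB store would take on the order of a minute; restarts taking much longer than that point at log replay or index rebuild rather than the checkpoint read.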
    The best ways to minimise restart times are to (a) ensure that checkpoints are occurring frequently enough and (b) ensure that the datastore(s) are always shutdown cleanly before e.g. stopping the TimesTen main daemon or rebooting the machine.
    As to whether it is better to have multiple smaller stores or one large one - that depends on several factors.
    - A single large datastore may be more convenient for the application (since all the data is in one place). If the data is split across multiple datastores then transactions cannot span the datastores, and if cross-datastore queries/joins are needed they must be coded in the application.
    - Smaller datastores can be loaded/unloaded/recovered faster than larger datastores but the increased number of datastores could make system management more complex and/or error prone.
    - For very intensive workloads (especially write workloads) on large SMP machines overall better throughput and scalability will be seen from multiple small datastores compared to a single large datastore.
    I hope that helps.
    Chris

  • Is anyone working with large datasets ( 200M) in LabVIEW?

    I am working with external bioinformatics databases and find the datasets to be quite large (2 files easily come out at 50M or more). Is anyone working with large datasets like these? What is your experience with performance?

    Colby, it all depends on how much memory you have in your system. You could be okay doing all that with 1GB of memory, but you still have to take care not to make copies of your data in your program. That said, I would not be surprised if your code could be written so that it would work on a machine with much less RAM by using efficient algorithms.
    I am not a statistician, but I know that averages & standard deviations can be calculated using a few bytes (even on arbitrary-length data sets). Can't the ANOVA be performed using the standard deviations and means (and other information like the degrees of freedom, etc.)? Potentially, you could calculate all the various bits that are necessary and do the F-test with that information, and never need to have the entire data set in memory at one time.
    The tricky part for your application may be getting the desired data at the necessary times from all those different sources. I am usually working with files on disk where I grab x samples at a time, perform the statistics, dump the samples and get the next set, repeat as necessary. I can calculate the average of an arbitrary-length data set easily by only loading one sample at a time from disk (it's still more efficient to work in small batches because the disk I/O overhead builds up).
    Let me use the calculation of the mean as an example (hopefully the notation makes sense): see the JPG. What this means in plain English is that the mean can be calculated solely as a function of the current data point, the previous mean, and the sample number. For instance, given the data set [1 2 3 4 5], sum it and divide by 5, and you get 3. Or take it a point at a time: the average of [1]=1, [2+1*1]/2=1.5, [3+1.5*2]/3=2, [4+2*3]/4=2.5, [5+2.5*4]/5=3. This second method requires far more multiplications and divisions, but it only ever requires remembering the previous mean and the sample number, in addition to the new data point. Using this technique, I can find the average of gigs of data without ever needing more than three doubles and an int32 in memory. A similar derivation can be done for the variance, but it's easier to look it up (I can provide it if you have trouble finding it). Also, I think this functionality is built into the LabVIEW point-by-point statistics functions.
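    That point-at-a-time recurrence can be sketched in a few lines (a generic illustration, not LabVIEW code):

```java
public class RunningMean {
    // mean_n = mean_(n-1) + (x_n - mean_(n-1)) / n : only the previous
    // mean and the sample count are kept, never the whole data set.
    public static void main(String[] args) {
        double[] data = {1, 2, 3, 4, 5};
        double mean = 0.0;
        int n = 0;
        for (double x : data) {
            n++;
            mean += (x - mean) / n;
        }
        System.out.println(mean); // 3.0, matching sum/5 for [1 2 3 4 5]
    }
}
```

    The same single-pass shape works when the samples come from disk or a database cursor instead of an in-memory array.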
    I think you can probably get the data you need from those db's through some carefully crafted queries, but it's hard to say more without knowing a lot more about your application.
    Hope this helps!
    Chris
    Attachments:
    Mean Derivation.JPG ‏20 KB

  • Can a BIG form be served up one page at a time to avoid long load time?

    Tricks I have read for optimizing the load time of large forms are not helping. Linearization causes the first page to render quickly, but you can't interact with the fields until the whole form finishes loading -- no help there. Is there a way to break the form into pages (without creating entirely separate forms) so the user can fill out a page, hit a Next Page button, fill out that page, etc.? Understood that this is an old school idea, but until Reader can download a 1+ MB form in less time than it takes an average user to get ticked off, old school might do the trick.
    Alternatively, is there a way to construct a form so you can start interacting with it without having to wait for it all to load? This question comes from the (uninformed) assumption that maybe there are forward references that can't be satisfied until all the bits have come over the wire. If that's right, can a multipage form be architected so as to avoid this problem?

    No that technology does not exist yet. There are form level events that need to have the entire document there before they can fire. Also you would have to keep track of where you are so that would mean some sort of session information for each user.

  • Dvrcast load times - is HDS a solution?

    We are streaming H.264 video to the dvrcast application using RTMP streaming.  Everything is working fine, though we are noticing some unacceptable load times when loading large videos for playback that have been recorded this way.  I found http://forums.adobe.com/message/3399973#3399973 which offers some suggestions.  I will be looking into post-processing the video to defragment it, but I'm also interested in HDS.  Is it possible to use HDS with dvrcast streams that were recorded to mp4 via RTMP streaming?  I tried the HDS URL convention - http://server/dvrcast_origin/stream_name.f4m but this isn't working.  Are there other suggestions for improving load times for long videos with the dvrcast application?

    Additionally, I can't seem to get the F4V post processor to work with these files.  It is throwing an error "cannot open file" and the -v flag adds no helpful output.

  • Comparing load times w/ and w/o BIA

    We are looking at the pros/cons of BIA for implementation.  Does anyone have data to show a comparison between loads, loads with compression, vs BIA Index time?

    Haven't seen numbers comparing load times.  Loads to your cubes and compression continue whether you have BIA or not.  Rollup time would be eliminated, as you would no longer need aggregates.  No aggregates should also reduce Change Run time, perhaps a lot or only a little, depending on whether you have large aggregates with Nav Attrs in them.  All of that is offset to some degree by the time to update the BIA.
    Make sure you understand all the licensing costs, not just SAP's but also the hardware vendor's per-blade licensing costs.  I talked to someone just the other day who was not expecting per-blade licensing; the list price of the license per blade was $75,000.

  • Long loading time if no internet connection available (VS 2005 Redist)

    Hello forum members!
    I've been searching for a solution for my problem for a while, without success. It isn't really a critical issue, but it is quite annoying.
    Here's the problem I'm dealing with: I've developed a .NET application using Visual Studio 2005 and the included Crystal Reports classes. When deploying my module everything works as expected, offering decent performance when loading the Crystal Reports assemblies provided by the redistributable package, if an internet connection is available. If the system doesn't offer such a connection (e.g. improperly configured proxy settings), the module takes a huge amount of time to load. An examination (using ProcMon) led to the conclusion that the Crystal Reports assemblies try to retrieve the system connection settings from the registry in order to establish a connection to the internet. This is done only once, when loading for the first time.
    Has anyone experienced the same problem? What is this connection used for? Is there a way to suppress this behaviour on systems without a valid internet connection in order to avoid this annoyingly long loading time?
    Thanks in advance for your help.
    Greetings,
    L.F.

    Nothing like a good vacation. And I got your Christmas present early :). While you were away we (actually one other customer) figured it out. I have written a note re: the resolution that was supposed to be published yesterday, but it's still not out there, so here it is as I submitted it:
    Symptom
    When a report is loaded it takes between 150 and 180 seconds to display
    This is only reproducible on computers with no internet connection
    There are no error messages and the report comes up eventually
    Reproducing the Issue
    Create a Windows application with the following function calls:
              new Report()
             report.SetDataSource(dataSet)
    The function SetDataSource takes 2 plus minutes to execute
    Cause
    On SetDataSource() an attempt is made to connect to crl.verisign.com
    SetDataSource() uses the framework and as per the Microsoft Knowledge Base specified under "Resolution", it is actually the framework that is attempting the connection
    Resolution
    The solution is described in this Microsoft Knowledge Base article:
    http://support.microsoft.com/kb/936707
    The following Microsoft article also discusses the issue:
    http://msdn.microsoft.com/en-us/library/bb629393.aspx
    An alternative workaround to the above is as follows:
    Open IE and go to the Tools menu and select Internet Options...
    Go to the Advanced tab
    Find the Security heading
    Under the Security heading find "Check for publisher's certificate revocation"
    Uncheck this option
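    If changing Internet Explorer settings on each machine is not practical, the MSDN article linked above describes an application-level equivalent: disabling publisher-evidence generation in the application's .config file. A sketch (verify against the article for your framework version; the file name is illustrative):

```xml
<!-- myapp.exe.config : skip the Authenticode publisher-evidence check
     that triggers the certificate-revocation lookup on startup -->
<configuration>
  <runtime>
    <generatePublisherEvidence enabled="false"/>
  </runtime>
</configuration>
```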
    Keywords
    long time load crl.verisign.com certificate revoked internet
    The note will be published here:
    https://www.sdn.sap.com/irj/sdn/businessobjects-notes
    and the number will be 1270414
    Ludek

  • Reducing a SWF file size / creating "instance names" for website loading time improvement

    I'm hoping someone clever can help...
    What I have done so far:
    1) Created a flipbook using InDesign CS5.5 that contains approx. 25 images (embedded PNGs). These need to be good quality as this is a portfolio, and I'd also like to keep the 'page curl' effect. This makes the generated SWF circa 25MB.
    2) Created a Flash website "wrapper" to hold the above flipbook and add features such as "bookmarks", links to other sites, and forward/backward page navigation buttons.
    The issue we have is that because the flipbook SWF is 25MB, the site load time is slow. What can be done to speed this up without compromising the image quality?
    1) Is there a way to make the InDesign SWF smaller?
    2) Is there a way in InDesign to properly give each page an "instance name" so that the pages can be controlled from Flash, and I can therefore load each page's image as and when needed?
    Any other ideas?
    (If you'd like to visit the website to view the loading issue: http://www.lizblackdesigns.co.uk)
    Thanks
    Liz

    these need to be good quality as this is a portfolio
    The approach to your portfolio website seems very outdated and will not help you in the near future. Flash is being used less and less with the increase in Mobile Devices being used to access the internet - a large chunk of which include Apple's iDevices which don't support Flash. This alone is a reason not to do it, however there are a few other big issues with the method you have chosen.
    There is no "content" on your website for any search engine to find. No text. No meta data. No image alt tags. No titles. Nothing. In other words, your website will not be found by the people you are trying to attract. I would imagine for a portfolio site, you may want to re-think this approach.
    The way the Flash file was created is the worst possible way to add Flash to the web as you have only embedded image files - and this is your entire site?! You could have included some text, which would have been viewable by search engines, and used color shapes to increase quality but lower filesize.
    The filesize meant it took me over 1 minute to get off your home page. The average internet user will leave your site if you do not keep them interested after 5 seconds. In other words, you should have content of interest viewable within 5 seconds of load time.
    If you are set on keeping the Flash setup for your website, re-build using as much text, color shapes and lines as you can (vector work in other words) and do not rasterize/export as image. This will drastically reduce load time and increase quality. You will also want to divide your file into sections and create more than 1 HTML page so that the browser only loads a bit at a time. You might also want to set a maximum size container for the Flash file as on my browser, it looks extremely pixelated and is very large, but unclear.
    I would suggest looking into HTML5 if you are keen to create a website that can be viewed by around 98% of the web surfing people and it will also mean you can use an image slider (or similar) to show good quality images of your work. This will aid your site in being found by search engines but will also allow you to update sections of the site without re-publishing the entire Flash file.
    Rik

  • Help improving server PDF loading time

    I have a large Adobe LiveCycle SE application (XDP 87,000 KB, or 3,500 KB as PDF); it consists of over 130 forms (and contains fragments as well).  My problem is that it is taking over 6 minutes to load into the server.  What can I do to speed up the rendering process?  Any suggestions on practices for more efficient loading?  Thanks


  • How to cache swf files to reduce load time?

    Hi,
    We are using the XML data connection for our swf files. Some of our swf files are large and may take a minute to load. Is it possible to cache the swf files so they load in a shorter amount of time? If so, how would I go about doing this? Thanks!

    Jim,
    There is no such option in Xcelsius.
    We used an SAP BI query as a source for our dashboard and had a similar issue; we then cached the SAP BI query to reduce the load time, and it worked very well.
    Try doing something similar in the source system.
    -Anil

  • Slow image load time in web browsers. My images load too slowly.

    Hi, my images are loading slowly on my site. They are PNGs and all under 600 KB. Should they be JPEGs (they are photos)? Can anyone advise?

    I would do a few tests. I think you might be fine with the High setting, but also try a little lower if you can, to conserve load time. It would be good to get your files to around 120 KB each.
    A .jpeg file is the way to go, as .png files tend to be 20-40% larger.
    Here is an image that is set to 730x486, saved as Medium, Quality 50, JPEG; final size 99 KB.

  • Slow scrolling after large image loaded

    Hi all,
    I'm trying to load a large JPEG image (11MB, 4096x4096) into a BufferedImage object and draw it on a scrollable panel. However, the load time, and the scrolling once it has loaded, are very slow. Here is the code I use to load the image and draw it:
    try {
        File f = new File(mapName_);
        map_ = ImageIO.read(f);
    } catch (IOException e) {
        e.printStackTrace(); // don't swallow load failures silently
    }

    public void paintComponent(Graphics g) {
        super.paintComponent(g);
        if (map_ != null) {
            g.drawImage(map_, 0, 0, this);
        }
    }
    Also, when I run the program I increase the heap size with the command "java -Xmx128M -jar imap.jar" to avoid out-of-memory errors.
    Any ideas on how to improve the loading time and scroll speed?
    Thanks

    I have a 9800pro, AMD XP3200 and 1gig pc3200.
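    A technique often suggested for this situation (a sketch, not something proposed in the thread; the half-size factor is an arbitrary choice) is to downscale the image once after loading, so every repaint during scrolling pushes far fewer pixels:

```java
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;

public class Downscale {
    // Return a half-size copy of the image; drawing this during scrolling
    // is roughly 4x cheaper than drawing the full-resolution original.
    static BufferedImage halfSize(BufferedImage full) {
        int w = full.getWidth() / 2, h = full.getHeight() / 2;
        BufferedImage scaled = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        Graphics2D g2 = scaled.createGraphics();
        g2.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                            RenderingHints.VALUE_INTERPOLATION_BILINEAR);
        g2.drawImage(full, 0, 0, w, h, null);
        g2.dispose();
        return scaled;
    }

    public static void main(String[] args) {
        BufferedImage big = new BufferedImage(4096, 4096, BufferedImage.TYPE_INT_RGB);
        System.out.println(halfSize(big).getWidth()); // 2048
    }
}
```

    The one-time cost of the scaling pass is usually far smaller than repeatedly blitting a 4096x4096 image on every scroll event.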

  • SQLLOADER: Large Data Loads: 255 CHAR limit?

    Issue: I'm trying to load a delimited text file into an Oracle table with data that exceeds 255 characters, and it's not working correctly.
    The table fields are set to VARCHAR2 2000 - 4000. No problem. Some of the data fields in the file have over 1000 characters. OK. When running a load, SQL*Loader seems to only want to handle 255 characters, based on its use of the CHAR datatype as a default for delimited-file text fields. OK. So I add VARCHAR(2000) in the .ctl file next to the fields that I want to take larger data. That does not seem to work.
    When I set a field in the control file to VARCHAR(2000), the data for that field will get into the table. That's fine, but the issue is that SQL*Loader does not put just that field's data into the table; it puts the remainder of the record into the VARCHAR(2000) field. SQL*Loader seems to fix the length of the field and forgets that I want delimiters to continue to work.
    Anyone know how to get SQL*Loader to handle multiple >255-character data fields in a delimited file load?
    jk
    Here is my control file:
    load data
    infile 'BOOK2.csv'
    append into table PARTNER_CONTENT_TEMP
    fields terminated by ',' optionally enclosed by '^' TRAILING NULLCOLS
    (ctlo_id,
    partners_id,
    content2_byline,
    content2 varchar(4000),
    content3 varchar(2000),
    content9 varchar(1000),
    submitted_by,
    pstr_id,
    csub_id)

    I have been successful using char instead of varchar, and having the optionally enclosed by value in the data has always solved the problem.
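    Putting that advice into practice, the control file above would become something like the following sketch. (In SQL*Loader, CHAR(n) raises the default 255-byte limit for delimited character fields, whereas VARCHAR expects a binary length-prefixed field, which would explain the behaviour seen.)

```
load data
infile 'BOOK2.csv'
append into table PARTNER_CONTENT_TEMP
fields terminated by ',' optionally enclosed by '^' TRAILING NULLCOLS
(ctlo_id,
 partners_id,
 content2_byline,
 content2 char(4000),
 content3 char(2000),
 content9 char(1000),
 submitted_by,
 pstr_id,
 csub_id)
```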

  • Issues with site load time. (bg images)

    Hey, guys.
    Having an issue with a site uploaded with Muse. None of the images are overly large, but the load time is 5 to 6 seconds where it should be near-instant.
    Looking at it, it appears as though all of the images are attempting to load simultaneously in one large chunk rather than in a line.  Is there any way to remedy this?

    This isn't related to background images. It's how Muse is handling the loading of the ~100 images in the slideshow. Thanks for pointing this out. Until we have altered the loading method for these images, the only workaround I have is to decrease the number of images in the slideshow.
