Break a (very) large table into pages interactively

Here's the problem: I have a number of reports which, depending on parameters and the nature of the underlying data, may produce a table (tablix) with tens of thousands of rows.
The non-interactive renderers deal with this kind of table reasonably - Word & PDF, for example, simply start a new page when the current page is full.
Sadly, interactive use of the report goes through the RPL renderer which, for some strange and incomprehensible reason, ignores the page size and always tries to render the whole table on a single page, without intervening page breaks.
Unfortunately, when you ask the Web or WinForms ReportViewer clients to render a 250,000-row table on a single page, they fail badly - the Web viewer just gives up after a few minutes, while the WinForms viewer tries like a champ but, even after 90 minutes of execution and 10+ GB of RAM consumed, is still stuck compute-bound trying to lay out that giant table.
Looking for ideas - I've been looking for a solution for 10 years now, but SSRS has progressed so little since 2005 that the time elapsed doesn't really mean much.
In case it matters - the underlying query is MDX.  SSRS 2012 (or 2008 R2 or 2014 - they all behave the same).
-cd

Setting the InteractiveSize property should resolve this issue.
Since you say you are getting no page breaks at all I am guessing you have the interactive height set to zero. 
Note that you can't see this property by right-clicking the report background and going to Report Properties - you need to use the Properties pane instead.
http://www.magnetismsolutions.com/blog/nathaneccles/2013/10/30/ssrs-using-page-size-and-interactive-size-to-manage-printing
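For reference, the property lives in the report definition (.rdl) itself. A minimal sketch of the relevant fragment - element placement varies by schema version, and in the 2008/2010 schemas InteractiveHeight/InteractiveWidth sit under the Page element, so verify against your report's schema:

```xml
<Report xmlns="http://schemas.microsoft.com/sqlserver/reporting/2010/01/reportdefinition">
  <Page>
    <PageHeight>29.7cm</PageHeight>
    <PageWidth>21cm</PageWidth>
    <!-- A value of 0 here is what disables interactive page breaks entirely -->
    <InteractiveHeight>29.7cm</InteractiveHeight>
    <InteractiveWidth>21cm</InteractiveWidth>
  </Page>
</Report>
```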
LucasF

Similar Messages

  • Split a large table into multiple packages - R3load/MIGMON

    Hello,
    We are in the process of reducing the export and import downtime for the UNICODE migration/Conversion.
    In this process, we identified a couple of large tables that were taking a long time to export and import with a single R3load process.
    Step 1:> We ran the System Copy --> Export Preparation
    Step 2:> System Copy --> Table Splitting Preparation
    We created a file with the large tables that we want to split into multiple packages, and were able to create a total of 3 WHR files for the following tables under the DATA directory of the main EXPORT directory.
    SplitTables.txt (Name of the file used in the SAPINST)
    CATF%2
    E071%2
    Which means, we would like each of the above large tables to be exported using 2 R3load processes.
    Step 3:> System Copy --> Database and Central Instance Export
    During the SAPinst process, at the 'Split STR Files' screen, we selected the option 'Split Predefined Tables' and selected the file that has the predefined tables.
    Filename: SplitTable.txt
    CATF
    E071
    When we started the export process, we didn't see the above tables being processed by multiple R3load processes.
    They were each exported by a single R3load process.
    In the order_by.txt file, we have found the following entries...
    order_by.txt----
    # generated by SAPinst at: Sat Feb 24 08:33:39 GMT-0700 (Mountain
    Standard Time) 2007
    default package order: by name
    CATF
    D010TAB
    DD03L
    DOKCLU
    E071
    GLOSSARY
    REPOSRC
    SAP0000
    SAPAPPL0_1
    SAPAPPL0_2
    We have selected a total of 20 parallel jobs.
    Here my questions are:
    a> what are we doing wrong here?
    b> Is there a different way to specify/define how a large table is split into multiple packages, so that it gets exported by multiple R3load processes?
    I really appreciate your response.
    Thank you,
    Nikee

    Hi Haleem,
    As for your queries are concerned -
    1. With R3ta , you will split large tables using WHERE clause. WHR files get generated. If you have mentioned CDCLS%2 in the input file for table splitting, then it generates 2~3 WHR files CDCLS-1, CDCLS-2 & CDCLS-3 (depending upon WHERE conditions)
    2. While using MIGMON (for sequential / parallel export-import processing), you have the choice of Package Order in the properties file.
      E.g : For Import - In the import_monitor_cmd.properties, specify
    Package order: name | size | file with package names
        orderBy=/upgexp/SOURCE/pkg_imp_order.txt
       And in pkg_imp_order.txt, I have specified the import package order as
      BSIS-7
      CDCLS-3
      SAPAPPL1_184
      SAPAPPL1_72
      CDCLS-2
      SAPAPPL2_2
      CDCLS-1
    Similarly, you can specify the export package order in the export properties file as well...
    I hope this clarifies your doubt
    Warm Regards,
    SANUP.V
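    Putting the two pieces from the reply above together, the setup looks roughly like this (paths and package names are the ones from this thread; treat the exact key names as something to verify against your MIGMON version's documentation):

    ```
    # import_monitor_cmd.properties
    orderBy=/upgexp/SOURCE/pkg_imp_order.txt

    # /upgexp/SOURCE/pkg_imp_order.txt - one package per line, processed top-down
    BSIS-7
    CDCLS-3
    SAPAPPL1_184
    SAPAPPL1_72
    CDCLS-2
    SAPAPPL2_2
    CDCLS-1
    ```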

  • When using Reader, the font gets very large on the last page and I can't change it

    When using Reader, the font gets very large on the last page and I can't change it.

    You cannot edit PDF files with Reader. You can change the View options to make what you see smaller. View -> Zoom

  • Handling very large diagrams in Pages?

    I am writing a book that requires sometimes the use of large diagrams. These are vector-based diagrams (PDF). Originally, I planned to use iBooks Author and widgets to let the user zoom/pan/scroll and use other nice interactive stuff, but after having tried everything I have decided to give up on iBooks Author and iBooks for now because of its dismal handling of images (pixels only, low resolution, limited size only, etc.).
    I am planning to move my project over to Pages. Not having the 'interactive widget' approach means I need some way to handle large images. I have been thinking about placing very large images multiple times on different pages with different masks. Any other possible tricks? Can I have documents with multiple page sizes? Do I need a trick like the one above, or can an ePub book be zoomed/panned/scrolled, maybe using something other than iBooks to read it?

    Peter, that was indeed what I expected. But it turns out that while iBooks Author can take PDF, iBooks cannot, and iBooks Author renders PDFs to low-resolution images (probably PNG) when compiling the .ibook from the .iba.
    Even if you use PNG in the first place, the export function of iBooks Author (either to PDF or to iBook) creates low-resolution renders.
    The iBooks format is more of a web-based format. The problem lies not in what iBooks Author can handle, but in how it compiles it to the iBooks format. It uses the same export function for PDF, making PDF export ugly and low-res as well.
    iBooks Author has more drawbacks: for instance, if you have a picture, you can't change the image inside it - you have to replace the entire picture, and that process breaks all the links to the picture.
    iBooks Author / iBooks is by far not mature.

  • Insert Numbers Table Into Pages

    Is there an equivalent to the "Paste Special" command found in MS Word/Excel? I want to link a Numbers table into a Pages document and have the Pages document change automatically when the Numbers table is changed.

    Hi, I think there might be some misleading information about this.
    You can highlight a table in Numbers, copy, then paste it into your Pages document. It moves and resizes like an image, and you can directly edit any cell in the table. You can add rows, columns, etc the way you can if you create a table through Pages. The functions all remain the same and change automatically as you change the information.
    While you may not be able to link so that changes in one program affect the other, I find I don't need it to change in both very often. If it ever does become an issue, I just copy/paste whichever is "complete" into the other and continue.
    Hope it helps!
    (Incidentally, you can do the same thing to take tables created in Pages and put them into Numbers)
    Message was edited by: FredoSays

  • Importing word 2004 docs with tables into Pages

    I work in video production, and my clients send me shooting scripts as Word documents formatted with tables. Needless to say, I was quick to discover that Pages doesn't really 'like' these documents. I can't keep asking my clients to reformat their docs as txt files for me - it's very unprofessional.
    What I've found is that all the text is there, but the tables don't 'translate' (one cell tries to fit on a whole page and the text gets 'lost' at the bottom). Is there a quick and easy way to remove the tables in Pages but keep the text?
    In Word, it's Alt+A, V, B, 0 (I've been formatting and reformatting scripts for 8 years!).
    Thanks for your help. I REALLY don't want to have to get Office - I've read so many horror stories about 2008 (especially the excel part).

    sknygrydg07 wrote:
    Is there a quick and easy way to remove the tables in Pages but keep the text?
    In Word, it's Alt+A, V, B, 0 (I've been formatting and reformatting scripts for 8 years!).
    Hello,
    Have you tried: Select table, then from the Pages Menu; Format > Table > Convert Table to Text?
    It might be of some value in this situation.
    Jerry

  • Deleting rows from very large table

    Hello,
    I need to delete rows from a large table, but not all of them, so I can't use truncate. The delete condition is based on one column, something like this:
    delete from very_large_table where col1=100;
    There's an index (valid, B-tree) on col1, but it still goes very slow. Is there any instruction which can help delete rows faster?
    Txh in adv.
    A.

    Your manager doesn't agree to your running an EXPLAIN PLAN? What is his objection? Sounds like the prototypical 'pointy-hair boss'.
    Take a look at these:
    -- do_explain.sql
    spool explain.txt
    -- run EXPLAIN PLAN on the target query with the current index definitions
    truncate table plan_table;
    set echo on
    explain plan for
    <insert query here>;
    set echo off
    @get_explain.sql

    -- get_explain.sql
    set linesize 120
    set pagesize 70
    column operation   format a25
    column options     format a15
    column object_name format a20
    column opt         format a6
    select lpad(' ', level) || operation "OPERATION",
           options "OPTIONS",
           decode(to_char(id), '0', 'COST = ' || nvl(to_char(position), 'n/a'), object_name) "OBJECT NAME",
           cardinality "ROWS",
           substr(optimizer, 1, 6) "OPT"
    from   plan_table
    start  with id = 0
    connect by prior id = parent_id;
    There are probably newer, better ways, but this should work with all living versions of Oracle and is something I've had in my back pocket for several years now. It's not actually executing the query or dml in question, just running an explain plan on it.
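    As the reply above notes, there are newer ways: from Oracle 9iR2 onward, DBMS_XPLAN does the tree formatting for you. A minimal sketch, using the original poster's delete as the explained statement:

    ```sql
    -- Explain the statement (nothing is executed or deleted)
    explain plan for
    delete from very_large_table where col1 = 100;

    -- Pretty-print the most recent plan from PLAN_TABLE
    select * from table(dbms_xplan.display);
    ```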

  • WHY CANT I POST TABLES INTO PAGES?

    I keep trying to paste a table from the internet application "StatCrunch 5.0" into Pages, but every time it only pastes the values and removes the borders of the table, making the data ridiculously confusing.

    Just a footnote to this thread.
    I hand-created a pure HTML, 4x6 table without any div containers, or CSS. Looks like this, unadorned in Safari.
    From the Pages v5.2.2 document toolbar, I added a table, and adjusted it down to a 1x2 example. I used a number automatic data format for the first cell. In the second cell, I created a custom, comma-delimited, number data format.
    I then used paste and match style into the first cell (A1) of the table, which then automatically expanded to 5x6 table, that applied the formatting that I assigned to the original Pages table cells. As cleanup, I selected the header row, chose to center align, and bold the column labels. Done. Looks like this.
    The above technique did not work well for a different HTML table with div containers, and heavy CSS styling.

  • Delete on very large table

    Hi all,
    one table in my database has grown to 20GB, this table holds log for applications since 2005 , so we decided to archive and delete all the log from 2005 and 2006.
    the table:
    CREATE TABLE WORKMG.PP_TRANSFAUX
    NID_TRANSF NUMBER(28) NOT NULL,
    VTRANSFTYPE VARCHAR2(200 BYTE) NOT NULL,
    VTRANSF VARCHAR2(4000 BYTE) NOT NULL,
    DTRANSFDATE DATE NOT NULL
    TABLESPACE TBS_TABWM_ALL
    The command:
    delete from workmg.pp_transfaux where dtransfdate < to_date('20070101 00:00','yyyymmdd hh24:mi');
    My question is: what are the "best practices" for this operation? Such a huge delete can flood the redo logs, and I can't avoid that with "alter table pp_transfaux nologging"...
    I could delete small chunks of data, say 6 months at a time... but then I'll get a big fragmented table.
    my environment:
    oracle 9.2.0.1 under windows 2000
    Best Regards
    Rui Madaleno

    Since this is log data I am assuming you don't need it all online at a given time, and that you don't have a partitioning license:
    <Online>
    0. Backup the database.
    1. Create an empty duplicate table 'A'.
    <maintenance window>
    2. Exchange A and the primary table.
    <Online>
    3. insert-as-select-compress-nologging the data to keep from the primary table to A.
    4. create-table-as-select-compress-nologging the data to archive from the primary table to B.
    5. Drop the primary table.
    6. Archive and drop B.
    7. Backup the database.
    <Future>
    8. Upgrade to 11g.
    9. License and utilize the automated partition generation feature to give each period its own partition.
    10. Periodically archive and drop partitions from the log table.
    If you have partitioning:
    <Online>
    1. Create an empty duplicate table, we'll call it A. It should be partitioned either by month or year.
    2. insert-as-select-compress-nologging the primary table into A.
    <maintenance window>
    3. exchange A and the primary table.
    <Online>
    4. Drop the primary table.
    5. archive and drop the partitions of A you no longer need.
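    A rough SQL sketch of steps 3-4 from the no-partitioning path above (the table names for A and B are invented, the cutoff date follows the thread, and compression via direct-path insert assumes A was created with COMPRESS):

    ```sql
    -- Step 3: keep the rows you still need (A was created empty, COMPRESS NOLOGGING)
    insert /*+ append */ into pp_transfaux_a
    select * from pp_transfaux
    where  dtransfdate >= to_date('20070101', 'yyyymmdd');

    -- Step 4: build the archive table B with the old rows
    create table pp_transfaux_b compress nologging as
    select * from pp_transfaux
    where  dtransfdate < to_date('20070101', 'yyyymmdd');
    ```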

  • How to break up a large document into multiple pages?

    Sorry for my newbie questions, but I am really new to Illustrator (CS6 on Windows)...
    I am creating a poster in A1 format.  My printer can only print up to A4, so I would like to break up my poster into 8 A4 pages to print a preview before sending the doc off to the printer.
    Is there a way to easily do this?
    Thanks in advance for your replies.

    Go to View > Show Print Tiling,
    then go to File > Print.
    From the Page Size menu, select A4.
    In the Print dialog, select Tile Full Pages from the Scaling options drop-down.
    You will see the tiling you need - 9 pages, by the way.
    One of the problems you can see right away is that the pages are butted together by the tiling, and your printer might need a border on all edges.
    So to compensate for this you need a little bit of an overlap. That changes the tiling, but when you trim the paper to the edge of the actual print area on each page and lay the pages on top of one another, you will be able to match them up.
    I chose .5 inch for the overlap. You will have to figure out the amount you need for your printer.
    So in the Print dialog, to the right of the Scale option, enter an overlap amount as well.
    You can do it with separate artboards, but you will have a hard time when it comes to the overlap.

  • How do I break up a large PDF into 8"x11" pieces and print?

    I downloaded a super large PDF map that probably has to be split into at least ten 8.5"x11" pages in order to print
    (and then i will have to splice/tape the printed pages together, to form the map).
    So my question is:
    How do I tell Photoshop (or just Acrobat??) to split the massive image up, so that it prints in pieces properly onto 8x11 pages?
    thank you for any advice!

    Tools - Pages - Split Document - Number of Pages: 2

  • Trouble pasting table into Pages.

    I have been unable to copy a table from Numbers and paste it into a Pages document. The data gets pasted, but not the formatting, and it is also not placed in a table. I am using Pages '09 and Numbers '08.
    What appears to happen is that the text or cell data from the table is pasted into the Pages document separated by a series of tabs, rather than being placed in a table.

    Numbers '08 was designed to be used with Pages '08, not with Pages '09 which was created 18 months later.
    As Fruhulda wrote, pasting a table from Numbers '08 in Pages '08 or pasting a table from Numbers '09 in Pages '09 behave well (as long as the size limits are respected).
    Pages '09 was not designed as a single app but as a component of a package including Numbers '09 and Keynote '09. Compatibility with the '08 versions wasn’t an item in the design workflow. They were distributed as separate apps only at least 18 months later.
    Yvan KOENIG (VALLAURIS, France) mardi 21 février 2012
    iMac 21”5, i7, 2.8 GHz, 12 Gbytes, 1 Tbytes, mac OS X 10.6.8 and 10.7.3
    My Box account  is : http://www.box.com/s/00qnssoyeq2xvc22ra4k

  • Importing Numbers table into pages

    Hi,
    If anyone can help me out really quickly with this I'd really appreciate it, deadline coming up!
    I want to get a table I've done in numbers into my assignment on Pages. The table is in landscape form and Pages in portrait. I basically just want to get the page turn it 90 degrees and slot it into my assignment.
    I've tried just copying it over but all the table just comes out in a mess.
    Thanks

    Hello
    Maybe I missed something but I don't understand your message.
    You wrote:
    I've tried just copying it over but all the table just comes out in a mess.
    Copying a table from Numbers to Pages works flawlessly as you may see on this screenshot:
    Given that, it's true that there is no tool available to rotate a table, but that has nothing to do with your last sentence.
    Yvan KOENIG (from FRANCE dimanche 13 avril 2008 20:28:26)

  • Creation of very large table

    Dear all, I need to create a table that will host about 40 GB of data with a very simple structure (varchar2(20), varchar2(10), VARCHAR2(4000)), thus there will be millions of records. The table should be indexed by the first two fields to allow online reporting.
    I am not a DBA and our DBA is on leave so we have a little trouble there (;C). Loading the table is not a problem, however I will need to partition it if I want it to deliver queries with a reasonable response time. The same for indexes.
    Is there a recommendation for creation of such table?
    The first field is a code that indicates the dossier number. The second field indicates the type of data that is contained in the third field. Typically, a query would be of the kind "show me the name of manufacturers (thus, second code=xx) for year 2006 (thus, second code=yy and data=2006). I have thought that partitioning by year would be a good idea and then maybe hash-sub partitioning in order to reduce the size of the yearly partitions.
    Regarding performance, what needs to be done in order to optimise the execution plan?
    Thanks in advance.

    Thanks for the reminder, but licensing is covered by my organisation. I am really concerned about the technical aspects of my question.
    Regarding Vit's post, I do not understand fully why "Well, partitioning on "data" seems to be a bad idea, if it contains other values than just dates (as you indicate)." I would partition on data, the problem is that the governing field is not a column per se, but rather contained in the data column together with other information. I'll make it more clear:
    column 1: fiche id
    column 2: data code
    column 3: data itself
    Thus, a "fiche" would consist of several records, each containing diverse information, for example:
    fiche  code  data
    1      10    01/01/2007     (i.e. the date I would use for partitioning)
    1      20    Acme           (e.g. name of manufacturer)
    1      30    1000000        (e.g. revenue in 2007)
    1      40    WaltDisney st. (e.g. address)
    and so on
    So, it is not so straight forward to use a column for partitioning, I actually need the combination of code and data to know what data to use. Otherwise I would use hash partitioning.
    Regarding the keys, the obvious columns are fiche and code, being the combination of both, unique. A typical query would want to search for data in the data column (i.e. year), for which I also need column code (I need to know what code is the type of data year before I can ask for it). Indexing of data would be necessary.
    Finally, the data would be accessed through an SQL editor. The end user would have freedom to build own queries. Just for information, a select count on the partially loaded table (bit more than 95 million records = ~9% of foreseen load) without indexes takes about 1 min 20 seconds to complete.
    One more thing I forgot to mention: the table will have inserts (not frequent, but maybe massive) and not updates.
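    To illustrate the yearly-range idea discussed above - purely a sketch, since pre-11g Oracle cannot partition on an expression, it assumes a real partition-key column (here load_year) populated by the loader:

    ```sql
    -- Illustrative only: names are invented, and load_year must be filled in
    -- at load time because the year is buried inside the data column.
    create table fiche_data (
      fiche_id   varchar2(20),
      data_code  varchar2(10),
      data       varchar2(4000),
      load_year  number(4)
    )
    partition by range (load_year) (
      partition p2005 values less than (2006),
      partition p2006 values less than (2007),
      partition p2007 values less than (2008),
      partition pmax  values less than (maxvalue)
    );

    -- A local index on the two lookup columns keeps index maintenance per-partition
    create index fiche_data_ix on fiche_data (fiche_id, data_code) local;
    ```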

  • Working with VERY LARGE tables - is it possible to bypass row counting?

    Hello!
    For working with large result sets, ADF provides the `Range Paging` mechanism for views, described in section 27.1.5 of the Developer's Guide For Forms/4GL Developers.
    It works well, but in its common mode it counts the total row count to allow paging. In some cases the query `select count(1) from (SELECT ...)...` can take a very, very long time.
    But if a view object doesn't know the row count (for example, if we override the getEstimatedRowCount() method), the paging controls don't appear in the user interface.
    Meanwhile, I suggest it should be possible to display just two paging links - Prev and Next - without knowing the row count. Is there a way to do it?
    Thank in advance,
    Ilya Rodionov.

    Hi Ilya,
    while you wait for Frank to dig up the right sample, you can read this thread:
    Re: ADF BC: Performance issue with getEstimatedRowCount (ER?)
    There we discuss the exact issue.
    Timo
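    For what it's worth, the Prev/Next-without-a-total-count idea is commonly implemented by fetching one row more than the page size: if pageSize + 1 rows come back, a "Next" page exists. Outside ADF, the classic ROWNUM pagination looks roughly like this (table and bind names are illustrative):

    ```sql
    -- Rows :low+1 .. :low+:pagesize, plus one probe row to detect a next page
    select *
    from (
      select q.*, rownum rn
      from   (select * from big_table order by id) q
      where  rownum <= :low + :pagesize + 1
    )
    where rn > :low;
    ```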
