DataGrid limit on number of records or data?

Hi.
I'm using Flex with a Hibernate backend.
I have tables in a database that I'd like to show using a DataGrid.
In my tests I have found that there is a limit to the number of records shown in the DataGrid.
Is this a limitation of the DataGrid? How do I get around it?
I am currently able to show around 5,000 records. When I have tables of, say, 10,000 or 100,000 records, how do I show them?
Thanks.
Kind regards,
Luke

Wow! That's a lot of records for one grid. You are going to want to start paging your data.
The short answer is to pass in only as many records as the user can reasonably view in the grid. Cache the rest on the server in a queue waiting for retrieval.
When the user scrolls down close to the last record (say, within 25 records of the end of the loaded records), you can remove a chunk of records from the top of the dataProvider and move them into a shared object or other storage for easy retrieval, while at the same time downloading more records from the queue cached on the server.
Now, the HOW portion of this depends greatly on the technology you use on the server, and unfortunately I have no experience with Hibernate. Sorry. Still, as long as you know how to cache a query on the server it should not be too hard. Think of it like an automated "Next N" interface for your grid.
You also get much faster load times, since you are never sending more than a small window of records to your grid at any one time.
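On the server side, the "Next N" query is usually just a paged SELECT. Here is a minimal sketch in MySQL/PostgreSQL-style SQL, assuming a hypothetical records table with an id key (Oracle and SQL Server spell paging differently, e.g. ROWNUM or OFFSET ... FETCH):
-- Fetch the next page of 500 rows, starting after the 5000 already loaded.
-- A stable ORDER BY is required so that pages do not overlap or skip rows.
SELECT *
FROM records
ORDER BY id
LIMIT 500 OFFSET 5000;
With Hibernate you would normally get the same effect through Query.setFirstResult(5000) and setMaxResults(500) and let it generate the dialect-specific SQL for you.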

Similar Messages

  • Limiting the number of records or pages that get generated by a report

    Post Author: jjurroz
    CA Forum: General
    Hey guys, I'm using CR 8.5.3. I have a quick question that hopefully one of you can help me out with. I have a report that works perfectly and uses a couple of different parameters for sorting etc. I'm using an SQL database, and for a particular report, after inputting the correct parameter values, the report would generate, for example, 12,000 records on 2,000 pages. What I'm trying to do is create a new parameter that would limit either the number of pages that get generated or the number of records. I think I'd rather limit the number of records to start off with, and I would like it to be from a list of predefined amounts, i.e. "ALL", "First 10", "First 50". Essentially I'm trying to create a report that gives me a subset or sample of the main report so that it can be printed to PDF and sent to a customer for review. Any help would be greatly appreciated!
    Regards, Jose

    Post Author: jjurroz
    CA Forum: General
    I can only get to the 'TopN/Sort Group Expert' menu option if I add a summary field; otherwise the option is greyed out. If I do add a summary field and go to 'TopN/Sort Group Expert', I have no formula button. I can select TopN from a drop-down, but then I have to hard-code the N value. I'm starting to wonder whether it's even possible to do this dynamically in CR 8.5.
    V361: Sorry jjurroz, here again I have CR XI, so I am not sure if version 8 will do this. Create a number parameter {?Top_N}, then go to the Group Sort Expert; you should be able to click on the formula button for Top N and put your parameter {?Top_N} in the formula.
    That should work.

  • Any ideas on how to control the amount of sustain pedal data that Logic Pro 9 records? Every time I press the sustain pedal down, it records 4 sustains. Same thing when I release the pedal. Thanks!

    This gets confusing: you have two Macs with different issues and two pedals with different issues? First let's sort out what's what. Do the issues come from the pedals or from the Mac software? To find out, simply switch the pedals and note their behaviour. No change: it's the software; change: it's the pedals (imho the most likely scenario).
    Second, what kind of pedals are they? Can you name the type of pedal too? The Yamaha what and the Mgear which?
    Third, with the 4-fold data problem, are the doubled events simultaneous, a few ticks apart, or more?
    Also, you have no MIDI keyboard? Are the pedals both "standalone"?
    I think you have to sort this problem at its source, which is the pedals. You could, if push really comes to shove, perhaps create some sort of ingenious/elaborate transformer setup in the Environment to filter the extra messages, although with the 4/4 sustain events that may not even be possible. With the "stepped" events you could set up a filter to just block all events with values 20-110 or so.
    But it would be much better to configure the pedal(s) correctly.

  • Can you limit the amount of data accessed per user on an AirPort Extreme?

    Your question was whether the AirPort Extreme is able to establish data limits per user. It is not; the AirPort Extreme has no control over this.
    If you add another router that has this type of capability, or install software on another router, then you will be able to establish data limits for each user.

  • Error: Governor limit exceeded in cube generation (Maximum data records exceeded.)

    Hi,
    I have created a report that throws this error:
    Governor limit exceeded in cube generation (Maximum data records exceeded.)
    Error Details
    Error Codes: QBVC92JY
    After going through various blogs I found that I need to change the max value for the pivot table view in the instanceconfig.xml file. I set it to 500000.
    But it still throws the same error.
    Ashok

    Hi Ashok,
    There are a number of settings that work in parallel. Have a look here:
    http://obiee101.blogspot.com/2008/02/obiee-controling-pivot-view-behavior.html
    regards
    John
    http://obiee101.blogspot.com

  • Governor limit exceeded in cube generation (Maximum data records exceeded.)

    There are similar posts, but they didn't help in my situation.
    I get the error: Governor limit exceeded in cube generation (Maximum data records exceeded.). The query returns about 64000 rows.
    I've changed the instanceconfig, raised the limits far beyond what I need, and then also updated the registry, but I still get the error: Governor limit exceeded in cube generation (Maximum data records exceeded.)
    instanceconfig:
    <?xml version="1.0"?>
    <WebConfig>
      <ServerInstance>
        <CredentialStore>
          <CredentialStorage type="file" path="C:\OracleBIData\web\config\credentialstore.xml"/>
        </CredentialStore>
        <CubeMaxRecords>5000000</CubeMaxRecords>
        <CubeMaxPopulatedCells>10000000</CubeMaxPopulatedCells>
        <ResultRowLimit>5000000</ResultRowLimit>
        <PivotView>
          <MaxVisibleRows>5000000</MaxVisibleRows>
          <MaxVisibleColumns>1024</MaxVisibleColumns>
          <MaxVisiblePages>1024</MaxVisiblePages>
          <MaxVisibleSections>1024</MaxVisibleSections>
        </PivotView>
      </ServerInstance>
    </WebConfig>
    I also added
    <CubeMaxRecords>5000000</CubeMaxRecords>
    <CubeMaxPopulatedCells>10000000</CubeMaxPopulatedCells>
    to the registry.
    But I still get the error. :(
    Thanks for your help.

    I suggest disabling the cache. Setting the max rows to a very high number and disabling the cache is the way to go when you are querying an Oracle database :)
    (for those who haven't already done this)
    In NQSConfig.INI:
    # Query Result Cache Section
    [ CACHE ]
    ENABLE = NO;

  • What is the best practice for deleting a large number of records?

    hi,
    I need your suggestions on the best practice for regularly deleting a large number of records from SQL Azure.
    Scenario:
    I have a SQL Azure database (P1) into which I insert data every day. To prevent the database size from growing too fast, I need a way to remove all records older than 3 days, every day.
    For on-premise SQL Server I could use a SQL Server Agent job, but since SQL Azure does not support SQL jobs yet, I have to use a web job scheduled to run every day to delete all old records.
    To prevent table locking when deleting too many records at once, in my web job code I limit the number of deleted records to 5000 per iteration and the batch delete count to 1000 each time I call the delete stored procedure:
    1. Get the total count of old records (older than 3 days).
    2. Get the total number of iterations: iterations = (total count / 5000).
    3. Call the SP in a loop:
    for (int i = 0; i < iterations; i++)
        Exec PurgeRecords @BatchCount=1000, @MaxCount=5000
    And the stored procedure is something like this:
    CREATE PROCEDURE PurgeRecords @BatchCount INT, @MaxCount INT
    AS
    BEGIN
      -- Collect the ids of up to @MaxCount expired rows
      DECLARE @table TABLE ([RecordId] INT PRIMARY KEY)
      INSERT INTO @table
      SELECT TOP (@MaxCount) [RecordId] FROM [MyTable]
      WHERE [CreateTime] < DATEADD(DAY, -3, GETDATE())
      -- Delete them in batches of @BatchCount rows
      DECLARE @RowsDeleted INT = 1
      WHILE (@RowsDeleted > 0)
      BEGIN
        WAITFOR DELAY '00:00:01'
        DELETE TOP (@BatchCount) FROM [MyTable]
        WHERE [RecordId] IN (SELECT [RecordId] FROM @table)
        SET @RowsDeleted = @@ROWCOUNT
      END
    END
    It basically works, but the performance is not good. For example, it took around 11 hours to delete around 1.7 million records, which is far too long.
    Following is the web job log for deleting around 1.7 million records:
    [01/12/2015 16:06:19 > 2f578e: INFO] Start getting the total counts which is older than 3 days
    [01/12/2015 16:06:25 > 2f578e: INFO] End getting the total counts to be deleted, total count: 1721586
    [01/12/2015 16:06:25 > 2f578e: INFO] Max delete count per iteration: 5000, Batch delete count 1000, Total iterations: 345
    [01/12/2015 16:06:25 > 2f578e: INFO] Start deleting in iteration 1
    [01/12/2015 16:09:50 > 2f578e: INFO] Successfully finished deleting in iteration 1. Elapsed time: 00:03:25.2410404
    [01/12/2015 16:09:50 > 2f578e: INFO] Start deleting in iteration 2
    [01/12/2015 16:13:07 > 2f578e: INFO] Successfully finished deleting in iteration 2. Elapsed time: 00:03:16.5033831
    [01/12/2015 16:13:07 > 2f578e: INFO] Start deleting in iteration 3
    [01/12/2015 16:16:41 > 2f578e: INFO] Successfully finished deleting in iteration 3. Elapsed time: 00:03:33.6439434
    Per the log, SQL Azure takes more than 3 minutes to delete 5000 records in each iteration, so the total time comes to around 11 hours.
    Any suggestions for improving the delete performance?

    This is one approach:
    Assume:
    1. There is an index on 'CreateTime'.
    2. Peak-time insert volume is N times the average. E.g., if the average per hour is 10,000 and peak time is 5 times more, that gives 50,000 per hour. This doesn't have to be precise.
    3. The desired maximum number of records deleted per batch is 5,000; this doesn't have to be exact either.
    Steps:
    1. Find the count of records more than 3 days old (TotalN), say 1,000,000.
    2. Dividing TotalN (1,000,000) by 5,000 gives the number of delete batches (200) if inserts are perfectly even. Since they are not, and peak inserts can be 5 times the average, set the number of delete batches to 200 * 5 = 1,000.
    3. Dividing 3 days (4,320 minutes) by 1,000 batches gives 4.32 minutes per batch.
    4. Create a delete statement and a loop that, on iteration I (from 1 to 1,000), deletes records with creation time earlier than (now - 6 days) + 4.32 * I minutes, so the cutoff advances in 4.32-minute steps until it reaches the 3-day threshold (see the sketch after this message).
    In this way the number of records deleted in each batch is not even and not known in advance, but it should mostly stay within 5,000; you run a lot more batches, but each batch is very fast.
    Frank
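    To make step 4 concrete, here is a minimal T-SQL sketch of the sliding-cutoff loop, assuming the same [MyTable]/[CreateTime] schema as above; the 6-day starting point and the 4.32-minute (roughly 259-second) step come from Frank's arithmetic and would need tuning to your actual data distribution:
    DECLARE @I INT = 1
    DECLARE @Cutoff DATETIME
    WHILE (@I <= 1000)
    BEGIN
        -- Advance the cutoff about 4.32 minutes per iteration, sweeping
        -- from roughly 6 days ago up to the 3-day retention threshold.
        SET @Cutoff = DATEADD(SECOND, 259 * @I, DATEADD(DAY, -6, GETDATE()))
        DELETE FROM [MyTable]
        WHERE [CreateTime] < @Cutoff
        SET @I = @I + 1
    END
    Because each pass only removes the rows that fell behind the advancing cutoff, no single DELETE touches much more than one batch worth of rows, which keeps locking and log growth per statement small.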

  • Is there any limit on how many records a cursor can hold?

    Hi Everyone,
    This is Amit here. I want to know whether there is any limit on how many records a cursor can hold.
    I have a program in which I am creating a cursor and passing it to another procedure as an input parameter. But the count of the cursor query is more than 15 lakh (1.5 million) rows, and the program runs forever.
    Just wanted to know whether the huge data volume is the problem.
    Thanks.
    Regards,
    Amit

    user13079404 wrote:
    Just wanted to know whether the huge data volume is the problem.
    What do you think? How long does your code typically need to wait for the data to leave the magnetic platter of the hard disk, travel across wires, and land in the memory buffer of your application - for a single row?
    Now multiply that I/O wait time by a million - for a million rows. Or by a billion, for a billion rows.
    Is "huge data" a problem? Not really - it simply needs more work to get that amount of data from disk. More work means slower performance. It is that simple.
    Which is why the row-by-row approach used by many developers is wrong. You do not pull a million rows from disk and process them in PL/SQL or Java or .Net. Heck, you do not even pull 10,000 rows like that.
    The correct approach is to think in data sets and use SQL to do that processing for you - and only return the bare minimum of data to the application layer. Maximize SQL. Minimize PL/SQL and Java and .Net.
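    As an illustration of "maximize SQL": an aggregation for which a row-by-row program would drag millions of rows to the client can usually collapse into one set-based statement. A minimal sketch, assuming a hypothetical orders table:
    -- Let the database do the work and return only the finished result set,
    -- instead of fetching every row through a cursor and summing in the client.
    SELECT customer_id,
           COUNT(*)    AS order_count,
           SUM(amount) AS total_amount
    FROM   orders
    WHERE  order_date >= DATE '2024-01-01'
    GROUP  BY customer_id;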

  • ABAP Query - Want to limit the number of records to 5

    I have a query based on a table-join (MKPF with MSEG) infoset. I want to limit the number of records to the first five.
    How do I deal with this?

    Select the data from the tables using a join and WHERE conditions into an internal table,
    then move the first 5 records to a second internal table and use them.
    Or else, in your SELECT statement, you can use:
    SELECT data FROM tables <join condition> UP TO 5 ROWS.

  • Limit the number of records to download to Excel from a report

    I am just wondering if there is a way to limit the number of rows downloaded to Excel from a report in APEX; right now when I click "download to Excel" it downloads all records. If I am displaying 15 records on the report page, I want the ability to download only those records.
    Any help with that would be appreciated.
    Kind Regards,
    Sofia.

    Sofia,
    The same report query runs when downloading the data into Excel, so I don't think you can limit the number of records in the built-in download.
    You can achieve it using custom code: for example, on click of "download to Excel" redirect to another page and then restrict the data as per your need, or use a custom procedure to download the limited data (see the sketch below).
    Denes' utility to download into Excel:
    http://htmldb.oracle.com/pls/otn/f?p=31517:108:1476564836494581:::RP,::
    Regards,
    Manish
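    For the "restrict the data" part, the custom download query only needs to cap the row count. A minimal sketch in Oracle SQL, assuming a hypothetical my_report_view and a 15-row page size:
    -- Return only the rows currently shown on the report page.
    -- ROWNUM is applied after the ORDER BY because the sorted query is wrapped.
    SELECT *
    FROM (SELECT col1, col2, col3
          FROM   my_report_view
          ORDER  BY col1)
    WHERE ROWNUM <= 15;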

  • Insert a large number of records into a table

    I was trying to find a fast way to optimize a script that inserts a large number of records into a table. The initial script was like:
    INSERT INTO table_xxxx
    SELECT a.camp1, a.camp2, a.camp3, a.camp4, b.camp1, b.camp2, b.camp3
    FROM table_a a, table_b b
    WHERE a.camp0 = b.camp0;
    The COMMIT statement was at the end of the insert script, so I came up with this solution:
    DECLARE
      TYPE cur_CURSOR IS REF CURSOR;
      TYPE Tab_Hist IS TABLE OF table_xxxx%ROWTYPE INDEX BY BINARY_INTEGER;
      g_tHist Tab_Hist;
      CURSOR c_Base IS
        SELECT a.camp1, a.camp2, a.camp3, a.camp4, b.camp1, b.camp2, b.camp3
        FROM table_a a, table_b b
        WHERE a.camp0 = b.camp0;
    BEGIN
      OPEN c_Base;
      LOOP
        FETCH c_Base BULK COLLECT INTO g_tHist LIMIT 1000;
        EXIT WHEN g_tHist.COUNT = 0;
        BEGIN
          FORALL i IN 1 .. g_tHist.COUNT SAVE EXCEPTIONS
            INSERT INTO prov_cobr_dud VALUES g_tHist(i);
          COMMIT;
        EXCEPTION
          WHEN OTHERS THEN  -- SAVE EXCEPTIONS reports failures as ORA-24381, not NO_DATA_FOUND
            NULL;
        END;
        g_tHist.DELETE;
        EXIT WHEN c_Base%NOTFOUND;
      END LOOP;
      CLOSE c_Base;
      COMMIT;
    END;
    If anyone could tell me another way to do the same thing I'd appreciate it a lot; I'm keen to learn more efficient ways to optimize scripts.
    PS: The initial insert was loading the table with around 120,000 records.

    Hello,
    Wrong forum. This is the Oracle Forms forum. You should post in the SQL-PL/SQL forum.
    Francois
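    For reference, since the source and target tables both live in the database, the usual alternative worth measuring here is to keep the original single INSERT ... SELECT and let the database do the bulk work, optionally as a direct-path load. A minimal sketch against the same tables (the APPEND hint is Oracle-specific and only helps for large inserts):
    -- One set-based statement; the APPEND hint requests a direct-path load,
    -- which bypasses the buffer cache and row-by-row processing entirely.
    INSERT /*+ APPEND */ INTO table_xxxx
    SELECT a.camp1, a.camp2, a.camp3, a.camp4, b.camp1, b.camp2, b.camp3
    FROM   table_a a, table_b b
    WHERE  a.camp0 = b.camp0;
    COMMIT;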

  • Limit the number of tasks that a user can claim

    I have a custom worklist and I need to implement a requirement that users must not claim more than one task at a time. Does the Worklist API have something to limit the number of tasks that a user can claim at a time?
    Thanks in advance.
    Neuquino

    Apparently the Family Base add-on (https://my.verizonwireless.com/vzw/nos/safeguards/SafeguardProductDetails.action?productName=familybase) from Verizon will allow you to set limits on how much data a particular device can consume. I think it is free for the first 3 months (not sure); after that it will cost $5 per month. I have heard rumors, though, that it does not work well with iPhones.

  • How can I limit the amount of space on a Time Capsule used by Time Machine?

    I know that you cannot partition a Time Capsule drive. However, I was wondering about other options for limiting the amount of space used by Time Machine so that I can use the rest of the space for other purposes.
    I heard something about creating a disk image on the Time Capsule to limit the amount of space used by Time Machine. If I create a disk image, would TM just use all the space that's not part of the disk image? How do I access the disk image part? Can I manually drag and drop files to it from my Mac like any other external drive? Could I set up the disk image part to work with a Windows computer?
    I've also heard things about modifying sparse bundles, which I don't know anything about. How does this work? What happens to the space that is not part of the TM sparse bundle? How do I access it? Can I manually drag and drop files to it from my Mac like any other external drive? Could I set up the extra space to work with a Windows computer?

    Some other things to make note of:
    Some disk image formats have a maximum size but only consume that much space once they have had to expand to actually hold that much data; sparse bundles are like this. Make sure you are not using this type of image if you want to guarantee the space is reserved. I haven't read up on the sparse bundle trick, but I assume it has something to do with setting a maximum size for one.
    When you open up the Time Capsule disk, you will first be looking at the normal storage space. When you create an image, there will be a file you click on that mounts another drive that uses space on the Time Capsule. This shows up as just one file unless you open it to see its contents.

  • I need a query that selects the number of records for each day in a table

    I need a query that selects the number of records for each day in a table.
    E.g. the result would be:
    date 1: 14
    date 2: 3
    etc.
    Any ideas?

    Sorted:
    SELECT COUNT([commentID]), CONVERT(varchar, dateAdded, 112)
    FROM COMMENTS
    GROUP BY CONVERT(varchar, dateAdded, 112)

  • Maximum number of records

    hi,
    I need to calculate the maximum number of records that I can store in my db. How can I calculate that?
    Thank you for any ideas.

    Hi
    The maximum number of records stored in a db is limited only by your storage space (hard disks or whatever media you store it on); conceptually there is no such limit.
    Raju
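    A rough way to put a number on it is to divide the available space by the average row size. A minimal sketch for Oracle, where user_tables exposes AVG_ROW_LEN (in bytes) once statistics have been gathered; the 100 GB free-space figure is just an assumed value:
    -- Estimate how many rows of each table would fit in ~100 GB of free space.
    SELECT table_name,
           avg_row_len,
           TRUNC(100 * 1024 * 1024 * 1024 / avg_row_len) AS approx_rows_in_100gb
    FROM   user_tables
    WHERE  avg_row_len > 0;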
