Record limit in REUSE_ALV_GRID_DISPLAY

I have an ALV report program that displays an internal table with about 5000 records. When the user selects 1..n lines on the ALV screen (or none at all) and then clicks a Details button, I fill another internal table with the details that I want to display on the next screen.
The problem I am running into is that the second internal table has 100K+ records, and when I call the REUSE_ALV_GRID_DISPLAY function module I get a program abend telling me that "No storage space available for extending table IT_84403".
I have to assume that ALV has a limitation on the number of records that can be displayed. I checked the ALV documentation in the R/3 Library but did not find any information on how much data (or how many records) can be displayed in an ALV.
Is the limitation just the number of records, or the total memory used (i.e. number of records * record length)? Knowing that, I could monitor the table size and maybe display the details table page by page (just a thought).
Thanks for any comments.
Gerd
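
For the "monitor the table size" idea, here is a minimal sketch of estimating an internal table's flat memory footprint before calling the ALV. The table name lt_details, its MARA row type, and the 100 MB threshold are illustrative assumptions only, not from the original post:
* Rough footprint = rows * flat row width. This ignores internal
* table overhead and any string components, so treat it as a
* lower-bound estimate.
data: lt_details type standard table of mara,
      ls_row     type mara,
      lv_lines   type i,
      lv_width   type i,
      lv_bytes   type p.

describe table lt_details lines lv_lines.
describe field ls_row length lv_width in byte mode.  " 6.10+ syntax
lv_bytes = lv_lines * lv_width.

if lv_bytes gt 100000000.  " ~100 MB - tune to your system
  write: / 'Detail table very large - consider paging.'.
endif.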

OK, I just tested it: I get the same error when filling up an internal table without ever going to the ALV grid.
Here is the code.
* Internal tables
data: itab  type table of mara with header line.
data: itab2 type table of mara.

* Load the selected material master records.
  select * into table itab
           from mara
           where matnr in s_matnr
             and mtart in s_mtart.

* Copy the data twenty times over to provoke the memory abend.
  do 20 times.
    loop at itab.
      append itab to itab2.   " appends the header line of itab
    endloop.
  enddo.
That said, the problem is definitely not that the ALV grid can't handle it; the system itself can't handle it.
Regards,
Rich Heilman
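
For the page-by-page idea from the original post, one option is to fetch the details in fixed-size blocks instead of building the full 100K+ row table in one go. A minimal sketch, assuming (purely for illustration) that the details also come from MARA:
  data: lt_block type standard table of mara.

  select * from mara
           into table lt_block
           package size 10000
           where matnr in s_matnr.
    " lt_block is replaced on every pass, so memory stays bounded
    " by the block size. Display or buffer the current 10,000-row
    " block here, e.g. one ALV page at a time.
  endselect.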

Similar Messages

  • BUG: Record Limit per Document doesn't work for PDF in CS4 - does it work in CS5?

    Hey all - I'm attempting to export 100 data-merged documents to PDF. I know I can use "Record Limit per Document" set to 1 to create 100 InDesign files, but that isn't what I want to do. When you select "Export to PDF" in the Data Merge window, the "Record Limit per Document" option exists, but no matter what, it always creates one giant PDF file - it will NOT separate into 100 different PDF files. This is a bug in CS4.
    I am wondering if the bug has been fixed in CS5, or if there is a workaround in CS4 to generate the PDFs.
    All I found is this ancient thread, in which people say the only workaround is to batch-convert the PDF files later, before it degenerates into unrelated discussion:
    http://forums.adobe.com/message/1110826

    g'day there
    has there been any follow-up to this, or any workarounds?
    I constantly have VDP jobs with tens of thousands of records, but the chaps printing them only want the PDFs in lots of 500 or so. Being able to do ONE merge that splits the output into bite-size PDFs for our printing section would be preferable to making them through the dialog box in the appropriate lots.
    colly

  • Error: Max processing time or Max records limit reached

    Hi All,
    When I run the report in InfoView, I get the error below:
    Unable to retrieve object:
    Max processing time or Max records limit reached
    Kindly advise.
    Thanks,
    Meena

    There is a default limit on the number of records returned and on the timeout of an 'idle' connection. These can be set in the CMC; however, first check the query for that report and see whether it is applying your record selection criteria at the database level (use the Show SQL option and check that all your selection criteria have been turned into WHERE clauses)
    - this will drastically reduce both the number of records returned to Crystal and the time it takes for...
    You can find the setting here:
    CMC > Servers > Page Server > Properties
    It is not recommended to set this to unlimited, because the Page Server is not a robust server; such heavy reports should be scheduled so that they use the Job Server, which is more robust.
    Regards,
    Parsa.

  • Two billion record limit for non-partitioned HANA tables?

    Is there a two billion record limit for non-partitioned HANA tables? I've seen discussion on SCN, but can't find any official SAP documentation.

    Hi John,
    Yes, there is a limit for non-partitioned tables in HANA. The first page of the document 'SAP HANA Database – Partitioning and Distribution of Large Tables' says:
    "A non-partitioned table cannot store more than 2 billion rows. By using partitioning, this limit may be overcome by distributing the rows to several partitions. Please note that each partition must not contain more than 2 billion rows."
    Cheers,
    Sarhan.

  • Max processing time or Max records limit reached (Crystal Reports Server)

    Although my report runs fine from within Crystal Reports (the designer), I get an error when I try to run it from the Crystal Reports Server portal:
    "Max processing time or Max records limit reached".
    How can I solve this problem?
    I'm accessing a DB2 database on an iSeries server through ODBC. I know that the report uses a lot of data, so I have set "allow query timeout: yes" within the ODBC data source. The data connection itself is not the problem; I have no problem running other reports on the same connection.
    PS. I do not know whether or not this is the right forum for my post. I first posted within "Java Development - BusinessObjects Enterprise, BusinessObjects Edge, Crystal Reports Server" but did not get any response.

    I found the answer to my problem:
    1) Log onto the CMC
    2) Go to "Servers" in the dropdown menu
    3) Expand "Service Categories"
    4) Select "Crystal Reports Services"
    5) The currently running services are listed in the right window. The two services that Dell mentioned are in there under "Description": CrystalReportsProcessingServer and CrystalReports2013ProcessingServer
    Hope this helps somebody else.

  • Maximum record limit for internal table

    Hello all,
    Can anyone tell me what the maximum size of an internal table is? I would like to load all records from BSEG into an internal table so I can improve processing time.
    Thanks,
    Raj

    hi,
    Before Release 4.0A, ABAP stored the content of internal tables in a combination of main memory and file space. This meant that the maximum size of all internal tables of all programs running on such an application server at one time was about 2 GB. With Release 4.0A or greater, this size decreases to about 500 MB. (Note that these values aren't fixed, but they are a good guide. The 500 MB figure is the lower bound of the real value, which varies among operating systems and even among releases of the same operating system.)
    It may sound strange that a newer release has a tighter restriction on capacity, but it is a consequence of the fact that the contents of internal tables moved from a reserved file into shared memory. When you process internal tables this way in Release 4.0A or greater, you pay for much better performance with a smaller potential size.
    Regards,
    Sourabh
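
    Rather than loading all of BSEG at once, which runs straight into this memory ceiling, the usual alternative is block-wise processing. A minimal sketch with a database cursor (the 50,000-row block size is an arbitrary illustrative choice):
        data: lv_cursor type cursor,
              lt_bseg   type standard table of bseg.

        open cursor with hold lv_cursor for
          select * from bseg.

        do.
          fetch next cursor lv_cursor
                into table lt_bseg
                package size 50000.
          if sy-subrc <> 0.
            exit.          " no more data
          endif.
          " Process the current 50,000-row block here; lt_bseg is
          " overwritten by the next fetch, so memory stays bounded.
        enddo.

        close cursor lv_cursor.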

  • Max record limit for Batch delete

    Hi,
    Is there a limit on the maximum number of records that can be deleted using the batch delete functionality?
    Suppose I select an Account list that has more than 200 records, so that it covers more than one page of the list view. When I select batch delete, does that delete all the records, or just the 100 records on the first page?
    Regards,

    The batch delete will delete all the records in the list. There is no upper limit on batch delete.
    Edited by: bobb on May 4, 2011 7:30 AM

  • Maximum Record limit in RDBMS adaptor

    Hi,
      I'm trying to find a way to limit the number of records per message in the JDBC adapter.
      This is necessary to read a set of records at a time from the sender JDBC adapter.
      If anyone knows how to do this, please let me know.
    Thanks
    Yasitha

    Hi Shankar,
       Thanks a lot for your guidance. This is working. But in this particular scenario I have another problem. What we have done is read the set of records with status "Yes" into XI and change the status to "IN" until the validations and other necessary steps are done.
    This DB receives data at a very high frequency, and I think that between the SELECT and UPDATE statements there can be changes to the TOP 500. So I tried to add an additional parameter "MaximumRows" to the JDBC adapter to limit the number of records.
    But that didn't work either. What do you think about this? Do you know where we can find information about the Additional Parameters of the adapters?
    Regards,
    Yasitha.

  • Record Limit for a book

    Hi experts,
    from online help, it mentions that
    "Any book can contain data, but for best performance, do the following:
    Limit the record count to a maximum of 20,000 to 30,000."
    Does that mean that if we have more than 30,000 records in a book, SOD performance will be bad?
    The problem is that we have more than 100,000 records for each book... but we also do not want to compromise performance.
    Is there any solution to that?
    Thanks,
    Sab

    Hi Bob,
    Yes, I am using the search function on the left. I am searching for Home phone number *4491773.
    But the system hung for about 5 minutes and then prompted this message:
    Error: originating at /OnDemand/user/AccountList
    This request was interrupted by another request by you. Please try your request again later. (RIP_WAIT_ERROR_CALCELLED).
    I only did a single search at a time, not sure why it mentioned "another request".
    Thanks,
    Sab

  • Maximum record limit in BW ?

    Hi,
          Is there any maximum limit on the number of records that can be scheduled in BW from the source system in one Initialization/Full Update? Any help would be appreciated.
    Thanks & Regards
    Hari

    Hi,
       I came across this information in the document 'Extraction Techniques':
    "With large data packages, the amount of memory required depends largely on the number of data records that are being transferred in each particular data package. You use these parameters to set the maximum number of data records that a data package can contain. The default setting is a maximum of 100,000 records per data package. The maximum main memory requirement per data package is approximately 2 × 'Max. Rows' × 1,000 bytes."
    So at the default of 100,000 rows, that works out to roughly 2 × 100,000 × 1,000 bytes ≈ 200 MB per data package.
    I would like to know more about these settings.
    Thanks & Regards
    Hari
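
    These control parameters are usually maintained on the source system via transaction SBIW ("Maintain Control Parameters for Data Transfer") and end up in table ROIDOCPRMS. A minimal sketch for inspecting the current values; the table name is standard but quoted from memory, so verify it on your release:
        data: lt_prms type standard table of roidocprms.

        " Current data-transfer control parameters (max package
        " size, max lines, number of processes, ...) per destination.
        select * from roidocprms into table lt_prms.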

  • 65,000 record limit in BEx - workaround?

    Hi Experts,
    In BEx there is a limit on the number of records that can be displayed, i.e. 65,000 records. If the result of a query execution exceeds this 65K limit, is there any way I can display the records above 65K in a new tab of the same workbook?
    Thanks
    Aravind

    Hi,
    You can use two queries with different selection conditions in the two worksheets of the workbook.
    The 65K limitation is imposed by the Microsoft product (the Excel worksheet row limit) and hence cannot be overcome directly.
    Regards,
    Nitin

  • Records limit 300000

    On one of my client's computers (Windows 2000, Oracle 8i)
    we cannot insert more than 300,000 records into a table.
    There is plenty of free disk space.
    What should we do to fix this problem?

    Hi Kamalesh,
    I don't see such a limit.
    SQL> select count(*) from yy;
    COUNT(*)
    4194304
    SQL> insert into yy select * from yy;
    4194304 rows created.
    SQL> select count(*) from yy;
    COUNT(*)
    8388608
    Vladimir, can you post the error message that you are getting ?
    regards

  • Child records limit

    Hi all,
    Is there any required (or recommended) limit on the number of child records a form should have?
    I'm especially concerned about reconciliation performance given the possibility of thousands of child records...
    Thanks,


  • Hard disk free space and 192 kHz/24-bit recording time limit...

    Hello, when I try to record long material at a high sampling rate (96 kHz or 192 kHz, 24-bit), the maximum time I get is 65 minutes with the metronome set at 20 bpm. The maximum recording time limit is not active, and the hard disk has 900 GB free, but I still have only 65 minutes of recording time. This is a big problem when I need to record classical material or a live performance.
    --- For 12 channels at 192 kHz/24-bit, one hour of recording needs approximately 3.9 GB of hard disk space... how is it possible that LP 7.2.1 limits my recording time??? If I go down to 48 kHz, I get 248 minutes of recording time. ---

    Hello, when I try to record long material at a high
    sampling rate... 96 kHz or 192 kHz, 24-bit, the max
    time that I have is 65 minutes with the metronome set
    at 20 bpm.
    Try setting the metronome higher, say 30 or 40.
    There may be a quirk in that the lowest metronome settings don't always give you the most time.
    Failing that, try another application. Do you by any chance
    have the OEM version of Cubase LE or Mackie's Tracktion?
    Both of those will get you more time at 96 kHz/24-bit.
    (Seems that both of those applications come as freebies with a lot of different hardware)
    You can record the tracks in one app then import them into Logic.
    Pancenter

  • Audio Recording Limit

    Is there any limit to audio recording in Flash? I created a simple wave recorder from the mic, but it stops at 30 minutes or less... I don't think I have any computer memory problems. Could it be the code, or is there a size limitation (the wave can reach GBs...)? I have searched the docs but have not found any related problem.

    Here's my source (just the recording part):
        import flash.desktop.NativeProcess;
        import flash.desktop.NativeProcessStartupInfo;
        import flash.events.MouseEvent;
        import flash.events.ProgressEvent;
        import flash.events.SampleDataEvent;
        import flash.filesystem.File;
        import flash.filesystem.FileMode;
        import flash.filesystem.FileStream;
        import flash.media.Microphone;
        import flash.system.Capabilities;
        import flash.utils.ByteArray;
        import mx.events.FlexEvent;

        [Bindable]
        public var out:String = new String();
        public var process:NativeProcess;
        public static const FILE_NAME:String = "recording.wav";
        public var soundBytes:ByteArray = new ByteArray();
        [Bindable]
        public var mic:Microphone;
        public var nativeProcessStartupInfo:NativeProcessStartupInfo;
        public var f:FileStream = new FileStream();
        public var file:File;

        public function record_clickHandler(event:MouseEvent):void
        {
            record.visible = false;
            stop.visible = true;
            initclock();
            event.currentTarget.enabled = false;
            stop.enabled = !event.currentTarget.enabled;
            mic = Microphone.getMicrophone();
            mic.setSilenceLevel(0, 4000);  // level 0 with a 4 s silence timeout
            mic.gain = 30;
            mic.rate = 44;                 // 44.1 kHz
            soundBytes.length = 0;
            mic.addEventListener(SampleDataEvent.SAMPLE_DATA, sampleHandler);
        }

        public function sampleHandler(event:SampleDataEvent):void
        {
            progressbar1_progressHandler();
            // Append the incoming 32-bit float samples to the in-memory buffer.
            while (event.data.bytesAvailable)
            {
                soundBytes.writeFloat(event.data.readFloat());
            }
        }

        public function stop_clickHandler(event:MouseEvent):void
        {
            event.currentTarget.enabled = false;
            record.enabled = !event.currentTarget.enabled;
            out = new String();
            mic.removeEventListener(SampleDataEvent.SAMPLE_DATA, sampleHandler);
            soundBytes.position = 0;
            // Encode the float samples as 16-bit mono WAV and write to disk.
            file = File.desktopDirectory.resolvePath(FILE_NAME);
            f.open(file, FileMode.WRITE);
            f.writeBytes(WaveEncoder.encode(soundBytes, 1));
            f.close();
        }
    and the encoder:
    package
    {
        import flash.utils.ByteArray;
        import flash.utils.Endian;

        public class WaveEncoder
        {
            static public function encode(samples:ByteArray, channels:int = 2, bits:int = 16, rate:int = 44100):ByteArray
            {
                var data:ByteArray = WaveEncoder.create(samples);
                var bytes:ByteArray = new ByteArray();
                bytes.endian = Endian.LITTLE_ENDIAN;
                // RIFF container header
                bytes.writeUTFBytes('RIFF');
                bytes.writeInt(uint(data.length + 44));
                bytes.writeUTFBytes('WAVE');
                // 'fmt ' chunk: PCM format description
                bytes.writeUTFBytes('fmt ');
                bytes.writeInt(uint(16));
                bytes.writeShort(uint(1));                           // format 1 = PCM
                bytes.writeShort(channels);
                bytes.writeInt(rate);
                bytes.writeInt(uint(rate * channels * (bits / 8)));  // byte rate
                bytes.writeShort(uint(channels * (bits / 8)));       // block align
                bytes.writeShort(bits);
                // 'data' chunk with the PCM samples
                bytes.writeUTFBytes('data');
                bytes.writeInt(data.length);
                bytes.writeBytes(data);
                bytes.position = 0;
                return bytes;
            }

            static private function create(bytes:ByteArray):ByteArray
            {
                // Convert 32-bit float samples (-1.0..1.0) to signed 16-bit PCM.
                var buffer:ByteArray = new ByteArray();
                buffer.endian = Endian.LITTLE_ENDIAN;
                bytes.position = 0;
                while (bytes.bytesAvailable)
                {
                    buffer.writeShort(bytes.readFloat() * 0x7fff);
                }
                return buffer;
            }
        }
    }
