65,000 record limit in BEx - workaround?

Hi Experts,
In BEx there is a limit to the number of records that can be displayed, i.e. 65,000 records. If the result of a query execution exceeds this 65K limit, is there any way I can display the records above 65K on a new tab in the same workbook?
Thanks
Aravind

Hi,
You can split the data by selection conditions: use two queries with different selection conditions in two worksheets of the workbook.
The 65K limitation is imposed by the Microsoft Excel file format (65,536 rows per worksheet in .xls) and hence cannot be overcome directly.
Regards,
Nitin
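
For readers generating the .xls file programmatically (outside the BEx Analyzer), the same split-across-tabs idea can be automated. A minimal sketch using Apache POI (an assumption for illustration; POI is not part of BEx), starting a new sheet whenever the 65,536-row cap of the .xls format is reached:

import java.io.FileOutputStream;
import java.io.IOException;
import java.util.List;

import org.apache.poi.hssf.usermodel.HSSFWorkbook;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;

public class SheetSplitter {

    // .xls (Excel 97-2003) allows at most 65,536 rows per sheet.
    private static final int MAX_ROWS_PER_SHEET = 65_536;

    /** Writes all records, starting a new sheet (tab) each time the row limit is hit. */
    public static void write(List<String[]> records, String path) throws IOException {
        Workbook wb = new HSSFWorkbook();
        Sheet sheet = null;
        int rowInSheet = 0;
        int sheetCount = 0;

        for (String[] record : records) {
            if (sheet == null || rowInSheet == MAX_ROWS_PER_SHEET) {
                sheet = wb.createSheet("Data_" + (++sheetCount)); // new tab
                rowInSheet = 0;
            }
            Row row = sheet.createRow(rowInSheet++);
            for (int col = 0; col < record.length; col++) {
                row.createCell(col).setCellValue(record[col]);
            }
        }
        try (FileOutputStream out = new FileOutputStream(path)) {
            wb.write(out);
        }
    }
}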

Similar Messages

  • Creating an Excel file from table with 40,000 records

    I need help with creating an Excel file for a client that has a table of 40,000 records. I have code that generates the Excel and has worked well in the past, but with this much data it is timing out.
    I've already informed the client that Excel has a limit of about 66,000 records per sheet. So it might be better to export to CSV, as the data in this table is going to keep growing.
    If you have time to work with me on this, contact me at
    alexagui [at] gmail [dot] com
    and I can send you more details so you can put together an accurate quote.
    Thanks,
    Alex

    asdren_one wrote:
    > I need help with creating an Excel file for a client that has a table of 40,000 records. I have code that generates the Excel and has worked well in the past, but with this much data it is timing out.
    > I've already informed the client that Excel has a limit of about 66,000 records per sheet. So it might be better to export to CSV, as the data in this table is going to keep growing.
    > If you have time to work with me on this, contact me at alexagui [at] gmail [dot] com and I can send you more details so you can put together an accurate quote.
    For so many records my guess is that the string concatenation takes the most time, so you'll need to use StringBuffer to build the string. Google for coldfusion and stringBuffer:
    http://www.stillnetstudios.com/2007/03/07/java-strings-in-coldfusion/
    Mack
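
    To make the StringBuffer advice concrete in plain Java (ColdFusion strings are Java strings underneath): repeated += concatenation re-copies the growing string on every append, which is quadratic in the number of records, while a single builder keeps it linear. A minimal sketch, assuming the records are already split into fields (StringBuilder is the unsynchronized, slightly faster sibling of StringBuffer):

    import java.util.List;

    public class CsvBuilder {

        /** Builds CSV text in one pass; avoids O(n^2) string concatenation. */
        public static String toCsv(List<String[]> records) {
            StringBuilder sb = new StringBuilder();
            for (String[] record : records) {
                sb.append(String.join(",", record)).append('\n');
            }
            return sb.toString();
        }
    }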

  • BUG: Record Limit per Document doesn't work for PDF in CS4 - does it work in CS5?

    Hey all - I'm attempting to export 100 data-merged documents to PDF. I know I can use "Record Limit per Document" set to 1 to create 100 InDesign files, but that isn't what I want to do. When you select "Export to PDF" in the Data Merge window, the "Record Limit per Document" option exists, but no matter what, it always creates one giant PDF file - it will NOT separate into 100 different PDF files. This is a bug in CS4.
    I am wondering if the bug has been fixed in CS5, or if there is a workaround in CS4 to generate the PDFs.
    All I found is this ancient thread, in which people say the only workaround is to batch-convert the PDF files later, and which then degenerates into unrelated discussion:
    http://forums.adobe.com/message/1110826

    G'day there,
    Has there been any follow-up to this, or any workarounds?
    I constantly have VDP jobs with tens of thousands of records, but the chaps printing them only want the PDFs in lots of 500 or so. Being able to do ONE merge which splits the output into bite-size PDFs for our printing section would be preferable to making them through the dialog box in the appropriate lots.
    colly

  • Record Limit for a book

    Hi experts,
    from online help, it mentions that
    "Any book can contain data, but for best performance, do the following:
    Limit the record count to a maximum of 20,000 to 30,000."
    Does that mean that if we have more than 30,000 records in a book, SOD performance will be bad?
    The problem is we have more than 100,000 records for each book... but we also do not want to compromise performance.
    Is there any solution to that?
    Thanks,
    Sab

    Hi Bob,
    Yes, I am using the search function on the left. I am searching for the home phone number *4491773.
    But the system hung for about 5 minutes and then prompted this message:
    Error: originating at /OnDemand/user/AccountList
    This request was interrupted by another request by you. Please try your request again later. (RIP_WAIT_ERROR_CALCELLED).
    I only did a single search at a time; I am not sure why it mentioned "another request".
    Thanks,
    Sab

  • Maximum record limit in BW ?

    Hi,
      Is there any maximum limit to the number of records that can be scheduled in BW from the source system in one Initialization/Full Update? Any help would be appreciated.
    Thanks & Regards
    Hari

    Hi,
       I came across this information in the document 'Extraction techniques':
    "With large data packages, the amount of memory required depends largely on the number of data records that are being transferred in each particular data package. You use these parameters to set the maximum number of data records that a data package can contain. The default setting is a maximum of 100,000 records per data package. The maximum main memory requirement per data package is approximately 2 x 'Max. Rows' x 1,000 bytes."
    At the default of 100,000 rows, that works out to about 2 x 100,000 x 1,000 bytes = 200 MB of main memory per data package.
    I would like to know more regarding these settings.
    Thanks & Regards
    Hari

  • Is it possible to increase/remove the 20,000 row limit on the AV reports?

    When running a report for Logins and Logoffs for the last month, the report returns 20,000 records and informs me that there are more than 20,000. Does anyone know of a way to either remove the limit or to increase it?
    regards,
    Iain Barr
    Ategrity Solutions Ltd.

    Hi Iain:
    At this time, the out-of-the-box reports in Audit Vault are limited to 20,000 rows in code. There is no configurable way to change this limit.
    However, the Admin Guide documents the schema extensively, so it should be possible to point an external reporting tool, such as Oracle BI Publisher, at the schema and query the data directly. Such reports will not be limited to 20,000 rows.
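
    For anyone taking the query-the-schema route with plain JDBC instead of BI Publisher, a minimal sketch of the idea follows. The connection string, table and column names are hypothetical placeholders; the real schema objects are the ones documented in the Admin Guide. Setting a fetch size streams the rows, so no 20,000-row cap applies:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class AuditExport {
        public static void main(String[] args) throws Exception {
            // Connection details and audit_event_log are placeholders; consult
            // the Audit Vault Admin Guide for the documented schema objects.
            try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@avserver:1521/av", "av_reader", "secret");
                 PreparedStatement ps = con.prepareStatement(
                     "SELECT event_time, username, action FROM audit_event_log " +
                     "WHERE event_time > SYSDATE - 30 ORDER BY event_time")) {
                ps.setFetchSize(1000); // stream rows instead of buffering them all
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.printf("%s %s %s%n",
                            rs.getTimestamp(1), rs.getString(2), rs.getString(3));
                    }
                }
            }
        }
    }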

  • Possible to send the 50,000 records at a time using proxy?

    Hi All,
    I am using a proxy to send data from SAP to PI and then send it to the receiver using JMS. Here I have a small issue... is it possible to send 50,000 records at a time using the proxy? If not, please suggest how I can send bulk records through the proxy.
    Thanks
    Karthik.

    > is it possible to send the 50,000 records at a time using proxy? If not please suggest me how can i send bulk of records through proxy?
    You can try this in steps; do not go for big-bang testing :). Check how much your XI system can handle at a time; then you may need to tune the system parameters to accommodate a larger message size. How to do this? Check the document below, section 3.9.2 (Special Cases):
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/2016a0b1-1780-2b10-97bd-be3ac62214c7
    Regards,
    Abhishek.

  • In BDC, I Have 10,000 Records Which Method do I Select? and Why?

    Hi all,
    In BDC, I have 10,000 records for the Material Master application, to be processed either via the Session method or the Call Transaction method. Which method do I select, and why?

    Hi..
    You should go for the Session method there, because:
    1. The Session method has automatic error handling: if there is an error in, say, the 100th-from-last record, it just sets that record aside and the remaining part will complete; the failed transactions can then be corrected and reprocessed from the session log.
    2. It is an offline (asynchronous) method: formatting the data and handing it to the SAP layer are done in two separate steps, so your 10,000 records can be updated in the expected time compared with the Call Transaction method.
    Get back to me if you are not satisfied with the above reasons.
    Thanks,
    Naveen.I

  • Need to post Full Load data (55,000 records) to the target system.

    Hi All,
    We are getting data from the SAP HR system and need to post this data to the partner system, so we configured a Proxy (SAP) to File (Partner) scenario. We need to append the data of each message to the target file. Since this is a very critical interface, we have used dedicated queues. The scenario is working fine in D. When the interface was transported to Q, it was tested with a full load, i.e. 55,000 messages. All messages were processed successfully in the Integration Engine, but processing in the Adapter Engine took nearly 37 hours. We need to post all 55,000 records within 2 hours.
    The design of this interface is simple. We use direct mapping and the size of each message is 1 KB, but we need to append all messages to one file on the target side. We are using the Advantco sFTP adapter as receiver and a proxy as sender.
    Could you please suggest a solution to process all 55,000 messages within 2 hours.
    Thanks,
    Soumya.

    Hi Soumya,
    I understand your scenario as: HR data has to be sent to a third-party system once a day. I guess they are synchronizing employee (55,000) data in the third-party system with SAP HR data daily.
    I would design this scenario as follows:
    Ask an ABAPer to write an ABAP program which runs at 12:00, picks up the 55,000 records from the SAP HR tables and places them in one file. That file is placed in the SAP HR file system (you can see it using AL11). At 12:30, a PI file channel picks up the file and transfers it to the third-party target system as-is, without any transformation: a file-to-file pass-through scenario (no ESR objects). Then ask the target system team to take the file and run their own program (they should have some SQL routines) to insert these records into the target tables.
    If the 55,000 records make a huge file on the SAP HR side, ask the ABAPer to split it into parts; PI will pick them up in sequence based on the file name (a rough sketch of such a split follows after this post).
    In this approach I would ask both the SAP HR (sender) and third-party (target) teams to be flexible. Otherwise I would say it is not technically possible with the current PI resources. In my opinion, PI is middleware, not a system in which huge computations can be done. If messages were coming from different systems, collecting them in the middleware would make sense; in your case, collecting a large number of messages from a single system at high frequency is not advisable.
    If the third-party people are not flexible, then go for a File-to-JDBC scenario. Ask the SAP HR ABAPer to split the input file into more files (10-15; your PI system should be able to handle that). At the receiver JDBC side, use native SQL; you need a Java mapping to construct the SQL statements in PI. Don't convert the flat file to the JDBC XML structure - in your case PI cannot handle such a huge XML payload.
    Note that a hardware upgrade is very difficult (you need a lot of approvals depending on your client's process) and very costly. In my experience a hardware upgrade takes 2-3 months.
    Regards,
    Raghu_Vamsee
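
    As a rough illustration of the file-splitting step suggested above (written here in Java purely as a sketch; in the actual scenario an ABAP program would do this on the HR side). The file names and chunk size are assumptions; the zero-padded suffix keeps the files in pickup order for PI:

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.PrintWriter;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class FileSplitter {
        public static void main(String[] args) throws IOException {
            final int recordsPerFile = 5_000;           // assumed chunk size
            Path input = Paths.get("hr_full_load.dat"); // assumed file name
            int fileNo = 0, lineNo = 0;
            PrintWriter out = null;
            try (BufferedReader in = Files.newBufferedReader(input)) {
                String line;
                while ((line = in.readLine()) != null) {
                    if (lineNo % recordsPerFile == 0) {
                        if (out != null) out.close();
                        // zero-padded suffix keeps the pickup order stable
                        out = new PrintWriter(Files.newBufferedWriter(
                            Paths.get(String.format("hr_load_part_%03d.dat", ++fileNo))));
                    }
                    out.println(line);
                    lineNo++;
                }
            } finally {
                if (out != null) out.close();
            }
        }
    }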

  • Performance in processing 80,000 records.

    Hi
    I am working on a module where I have to upload a file of 80,000 records, process them, and then upload them to a web service.
    I am uploading the file by simply parsing the request:
    items = upload.parseRequest(request);
    After this I traverse the entire file line by line, process the individual records with my logic, and then save them to a Vector.
    In a second servlet I fetch these records and upload them via the web service described by the WSDL file.
    This process takes some time.
    I am facing a few problems/questions here:
    Question 1:
    After 30 minutes or so the browser displays "This page cannot be displayed".
    While debugging this code with breakpoints, I noticed that the code is actually still executing when the browser displays the "This page cannot be displayed" message.
    Can I increase some browser setting so that it waits longer before displaying the above message, so that my Java code can complete its execution?
    Question 2:
    I am using a Vector to store all 80,000 records at one go. Will the use of ArrayList or some other collection type increase performance?
    Question 3:
    What if I break the Vector into parts?
    I.e., instead of keeping one single Vector of 80,000 records, I store 10,000 records each in different Vectors and then process them separately.
    Please comment.
    Thanks.

    money321 wrote:
    > Question 1:
    > After 30 minutes or so the browser displays "This page cannot be displayed". [...] Can I increase some browser setting so that it waits longer, so that my Java code can complete its execution?
    It is the request timeout, which is a web server setting, not a web browser setting. Even though the request times out, the code should still continue to execute until the process finishes; you just don't get the response in your browser.
    > Question 2:
    > I am using a Vector to store all 80,000 records at one go. Will the use of ArrayList or some other collection type increase performance?
    Probably yes, because a Vector is thread-safe (synchronized) while an ArrayList is not. It is a similar situation to StringBuffer/StringBuilder.
    > Question 3:
    > What if I break the Vector into parts, i.e. 10,000 records each in different Vectors, processed separately?
    Wouldn't make much of a difference, I'd say. The biggest performance hit is the web service call, so try to save as much time as you can there. By the way, are you doing one web service call, or 80,000? (A sketch of batching the calls follows below.)
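
    To make the batching point concrete, a minimal sketch: collect records into an ArrayList and send them in fixed-size chunks instead of one call per record. The WebServiceClient type and its send method are hypothetical stand-ins for whatever stub your WSDL generates:

    import java.util.ArrayList;
    import java.util.List;

    public class BatchUploader {

        /** Hypothetical placeholder for the generated web service stub. */
        interface WebServiceClient {
            void send(List<String> batch);
        }

        /** Sends records in fixed-size chunks instead of 80,000 individual calls. */
        public static void uploadInBatches(List<String> records,
                                           WebServiceClient client,
                                           int batchSize) {
            List<String> batch = new ArrayList<>(batchSize);
            for (String record : records) {
                batch.add(record);
                if (batch.size() == batchSize) {
                    client.send(batch);
                    batch = new ArrayList<>(batchSize); // start the next chunk
                }
            }
            if (!batch.isEmpty()) {
                client.send(batch);                     // trailing partial batch
            }
        }
    }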

  • Not able to update more than 10,000 records in CT04 for a characteristic

    Hi all,
    We are not able to update more than 10,000 records in CT04 for a certain characteristic.
    Is there any possible way to do this?
    Please advise... it's a production issue.
    Thanks.

    Hello,
    Please consider using a check table for the characteristic involved if you are working with a large number of assigned values.
    With a check table you can work with a huge number of values, and performance should improve as well.
    Please refer to the link
    http://help.sap.com/saphelp_erp60_sp/helpdata/en/ec/62ae27416a11d1896d0000e8322d00/frameset.htm
    Section - Entering a Check Table
    Hopefully the information helps
    Thanks
    Enda.

  • Dynamic Parameter Record Limit

    I am using Crystal Reports 2008 with universes as a data source with BOE XI 3.0 (Edge) InfoView, and I need to create parameter fields that return more than 1,000 records. I have seen suggestions about modifying the local workstation registry, and also found a reference for an earlier version of BOE (SAP Note 1199759) about modifying the LOV batch size for custom sorting. The latter refers to a Web Intelligence setting, so I'm not sure whether it affects parameters in Crystal Reports.
    Which is the correct method to resolve this issue for my versions of Crystal and BOE?

    Hi
    You need to make changes on the Crystal Reports side so that it fetches more than 1,000 records.
    Please refer to SAP Note 1218588 - 'How to increase the number of values in a dynamic parameter list' - and make the registry changes it describes.
    Hope this helps!
    Regards
    Sourashree

  • Script logic record more than 300,000 record

    Hi Expert,
    When I run my logic I get this error in my formula log:
    (More than 300,000 records. Details are not being logged)
    Ignoring Status
    Posting ok
    I checked my script; it pulls out 422,076 records in total.
    Does this mean I cannot post more than 300,000 records?
    Is there anywhere I can set the MAX records a single script run can generate?
    Thanks..

    You should use
    *XDIM_MAXMEMBERS dimension = numberOfMembers to be processed at a time
    For example
    *XDIM_MAXMEMBERS Entity = 50
    Figure out which dimension has the most members and use that one; this sections your script logic into smaller chunks.
    I hope that helps
    Leandro Brasil

  • Unable to fetch 50,000 records from SQL using Orchestrator

    Hi Team,
    I have a table in MS SQL that has more than 50,000 records. I am trying to fetch 50,000 records using Orchestrator, but I am unable to fetch them.
    There is no problem with the SQL query, because I am able to get 40,000 records.
    I am using SCORCH DEV - SQL Integration Pack.
    Regards,
    Soundarajan.

    Hi,
    Thanks for your query.
    I have also used the timeout parameter but it is not working. As you said, I also tried the out-of-the-box Query Database activity.
    Now I am able to fetch more than 80,000 records, but the output I am getting is not in the format we are looking for (output attached).
    How do I edit this?
    I tried to write the output to Excel, but all the data sits in the first column itself.
    Regards,
    Soundarajan.

  • How to extract 250,000 Records into excel

    Hi friends, in SQL Server my table has 250,000 records. How can I import these records into Excel?
    If that is not possible, which file type can I use instead? Thanks in advance, friends.

    There are many solutions; here are two:
    1. Use the export wizard: right-click the database in SSMS and choose Tasks > Export Data (the original post illustrated this with screenshots).
    2. Use "Get External Data" on the Data tab in Excel.
    Note that 250,000 rows require the .xlsx format; the older .xls format caps out at 65,536 rows per sheet.
    More info: http://office.microsoft.com/en-001/excel-help/connect-to-import-sql-server-data-HA010217956.aspx
    sqldevelop.wordpress.com
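
    If you would rather script the export, CSV is a safe target (Excel opens it directly, and the file itself has no row limit). A minimal JDBC sketch; the connection string and table name are placeholder assumptions, and the comma escaping is deliberately naive:

    import java.io.PrintWriter;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.ResultSetMetaData;
    import java.sql.Statement;

    public class TableToCsv {
        public static void main(String[] args) throws Exception {
            // Connection string and table name are placeholders.
            String url = "jdbc:sqlserver://localhost:1433;databaseName=mydb;user=sa;password=secret";
            try (Connection con = DriverManager.getConnection(url);
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery("SELECT * FROM my_table");
                 PrintWriter out = new PrintWriter(
                     Files.newBufferedWriter(Paths.get("export.csv")))) {
                ResultSetMetaData md = rs.getMetaData();
                int cols = md.getColumnCount();
                // header row
                for (int i = 1; i <= cols; i++) {
                    out.print(md.getColumnLabel(i));
                    out.print(i < cols ? "," : "\n");
                }
                // data rows, streamed so 250,000 records never sit in memory at once
                while (rs.next()) {
                    for (int i = 1; i <= cols; i++) {
                        String v = rs.getString(i);
                        out.print(v == null ? "" : v.replace(",", " ")); // naive escaping
                        out.print(i < cols ? "," : "\n");
                    }
                }
            }
        }
    }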
