Slow record selection in tableView component with large number of records

Hi experts,
we have a Business Server Page (flow logic) with several htmlb:inputField elements. As known from the SAP standard, we would like to offer a value help (F4) to users to ease record selection.
We use the onValueHelp() method of the inputField to open an extra browser window through JavaScript. The popup calls another HTML page containing a tableView component with all available records. We use the SINGLESELECT mode for the table view.
Everything works perfectly and efficiently unless the tableView contains too many entries. If the number of possible entries is large, the whole component performs very slowly. For example, selecting a record can take more than a minute. Navigating between pages through the buttons at the bottom of the component also takes a long time. It seems that the tableView component cannot handle that many entries.
We tried switching between stateful and stateless mode, without success. Is there a way to perform the tableView selection without a server round trip? Any ideas and comments will be appreciated.
Best regards,
Sebastian

Hi Raja,
thank you for your hint. I took a look at sbspext_table/TableViewClient.bsp but did not really understand how the JavaScript coding works. Where is the JavaScript code in that example? Which file contains it?
Meanwhile I implemented another way to avoid the server round trip:
- Switch the page mode of the popup window to "Stateful"
- Use the OnInitialization event handler like OnCreate (as shown in [using OnInitialization like OnCreate])
- Limit the results of the SELECT statement with UP TO 1000 ROWS
Best regards,
Sebastian

Similar Messages

  • TableView performance with large number of columns

    I notice that it takes a while for table views to populate when they have a large number of columns (> 100 or so, subjectively).
    Running VisualVM with CPU sampling, I see that the largest amount of time is spent here:
    javafx.scene.control.TableView.getVisibleLeafIndex() 35.3% 8,113 ms
    next is:
    javafx.scene.Parent$1.onProposedChange() 9.5% 2,193 ms
    followed by:
    javafx.scene.control.Control.loadSkinClass() 5.2% 1,193 ms
    I am using JavaFX 2.1 co-bundled with Java 7u4. Is this to be expected, or are there some performance tuning hints I should know?
    Thanks,
    - Pat

    We're actually doing some TableView performance work right now; I wonder if you could file an issue with a simple reproducible test case? I haven't seen the same data you have here in our profiles (nearly all time is spent on reapplying CSS), so I would be interested in your exact test, to be able to profile it and see what is going on.
    Thanks
    Richard
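
    For reference, a minimal sketch of the kind of wide-table test case being asked for, written against the JavaFX 2.x TableView API; the column and row counts below are made-up placeholders, not figures from this thread:

    import javafx.application.Application;
    import javafx.beans.property.ReadOnlyStringWrapper;
    import javafx.beans.value.ObservableValue;
    import javafx.collections.FXCollections;
    import javafx.collections.ObservableList;
    import javafx.scene.Scene;
    import javafx.scene.control.TableColumn;
    import javafx.scene.control.TableView;
    import javafx.stage.Stage;
    import javafx.util.Callback;

    public class WideTableTest extends Application {
        private static final int COLS = 120;   // hypothetical: "> 100 or so" columns
        private static final int ROWS = 1000;  // hypothetical row count

        @Override
        public void start(Stage stage) {
            TableView<ObservableList<String>> table = new TableView<ObservableList<String>>();
            // One plain String column per index; each row is just a list of strings.
            for (int c = 0; c < COLS; c++) {
                final int colIndex = c;
                TableColumn<ObservableList<String>, String> col =
                        new TableColumn<ObservableList<String>, String>("Col " + c);
                col.setCellValueFactory(new Callback<TableColumn.CellDataFeatures<ObservableList<String>, String>, ObservableValue<String>>() {
                    public ObservableValue<String> call(TableColumn.CellDataFeatures<ObservableList<String>, String> cd) {
                        return new ReadOnlyStringWrapper(cd.getValue().get(colIndex));
                    }
                });
                table.getColumns().add(col);
            }
            // Fill the table with dummy data.
            ObservableList<ObservableList<String>> data = FXCollections.observableArrayList();
            for (int r = 0; r < ROWS; r++) {
                ObservableList<String> row = FXCollections.observableArrayList();
                for (int c = 0; c < COLS; c++) {
                    row.add("r" + r + " c" + c);
                }
                data.add(row);
            }
            table.setItems(data);
            stage.setScene(new Scene(table, 800, 600));
            stage.show();
        }

        public static void main(String[] args) {
            launch(args);
        }
    }

    Attaching something like this to the issue, together with the VisualVM snapshot, should make the hotspot reproducible.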

  • Lookups with large number of records do not return the page

    Hi,
    I am developing an application using Oracle JHeadstart 10.1.3 Preview Version 10.1.3.0.78
    In my application I created a lookup under Domains and used that lookup for an attribute (Display Type for this attribute is dropDownList) in a group to get the translation for this attribute. The group has around 14,800 records and the lookup has around 7,400 records.
    When I try to open this group (tab), the progress indicator shows that it is working, but the page does not open even after a long time.
    If I change the Display Type for the attribute from dropDownList to textInput, then it works fine.
    I have other lookups with a lower number of records. Those lookups work fine with the dropDownList Display Type.
    I only have this kind of problem when a lookup has a large number of records.
    Is there any limitation on the number of records for lookups under Domains?
    How can I solve this?
    I need to translate the attribute (get the description from another table using the code).
    Your help would be appreciated.
    Thanks
    Syed

    We have also faced a similar issue, but for us it was happening when we were using the dropDownList in a table, while the same dropDownList was working in table format. In our case the JVM just used to crash, and after googling it here in the forums we found that it might be related to a JVM issue on Windows XP machines without Service Pack 2.
    Anyway... the workaround we took to get around the issue is to use an LOV instead of a dropDownList in your JHeadstart application.
    Hope this helps...
    Hope this helps...
    - rutwik

  • How to Capture a Table with large number of Rows in Web UI Test?

    HI,
    Is there any possibility to capture a DOM table with a large number of rows (say more than 100) in a Web UI Test?
    Or is there any bug?

    Hi,
    You can try the following code to capture the table values.
    To store the table values in a CSV file:
    web.table( xpath_of_table ).exportToCSVFile("D:\\exporttable.csv", true);
    To store the table values in a string:
    String tblValues = web.table( xpath_of_table ).exportToCSVString();
    info(tblValues);
    Thanks
    -POPS

  • Barcode CODE 128 with large number (being rounded?) (BI / XML Publisher 5.6.3)

    After applying Patch 9440398 as per Oracle's Doc ID 1072226.1, I have successfully created a CODE 128 barcode.
    But I am having an issue when creating a barcode whose value is a large number. Specifically, a number larger than around 16 or so digits.
    Here's my situation...
    In my RTF template I am encoding a barcode for the number 420917229102808239800004365998 as follows:
    <?format-barcode:420917229102808239800004365998;'code128c'?>
    I then run the report and a PDF is generated with the barcode. Everything looks great so far.
    But when I scan the barcode, this is the value I am reading (tried it with several different scanner types):
    420917229102808300000000000000
    So:
         Value I was expecting:     420917229102808239800004365998
         Value I actually got:         420917229102808300000000000000
    It seems as if the number is getting rounded at the 16th digit (or so; it varies depending on the value I use).
    I have tried several examples and all seem to do the same. But anything with 15 digits or less seems to work perfectly.
    Any ideas?
    Manny
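
    The cutoff around 15 to 16 digits is what you would expect if the barcode value is being run through an IEEE-754 double somewhere in the encoding chain, since a double only carries about 15-16 significant decimal digits; that is an assumption about the cause, not something confirmed in this thread. A small Java illustration using the value from the post above:

    import java.math.BigDecimal;

    public class BarcodePrecisionDemo {
        public static void main(String[] args) {
            // The 30-digit value from the report above.
            String input = "420917229102808239800004365998";

            // Exact handling: BigDecimal keeps all 30 digits.
            System.out.println("exact : " + new BigDecimal(input).toPlainString());

            // Lossy handling: parsing into a double keeps only ~15-16 significant
            // digits, so the low-order digits are gone when it is printed back.
            double d = Double.parseDouble(input);
            System.out.println("double: " + d);
        }
    }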

    Yes, I have.
    But I have found the cause now.
    When working with parameters coming in from the concurrent manager, all the parameters defined in the concurrent program in EBS need to be in the same case (upper or lower) as they are defined in the data template.
    Once I changed them all to the same case, it worked.
    Thanks for the effort.
    regards
    Ronny

  • FR Layout issue with large number of columns

    Hi!
    I'm developing a report in FR 11.1.1.3 with over 30 columns.
    The issue is that when I run the report in web preview, the page dimension dropdown goes to the far right and disappears from the display.
    If I reduce the number of columns I don't have this problem.
    I've already tried maximizing the workspace, without any result.
    Can anyone help me deal with reports with large numbers of columns?
    Regards,
    Luís

    IE8 could be the reason. According to the supported platform matrix (http://www.oracle.com/technetwork/middleware/bi-foundation/oracle-hyperion-epm-system-certific-2-128342.xls), tab "EPM System Basic Platform", row 70: for IE8 to work, FR and Workspace need to be patched.
    FR Patch number: 9657652
    Workspace Patch number: 9314073
    Patches can be found on My Oracle Support. Just search for the patch number.
    Cheers,
    Mehmet

  • SSO with large number of users

    Hi,
    We want to implement SSO using user mapping because we have different user IDs from system to system.
    We have a large number of users in our system; how can we implement user mapping?
    Is there any way to write a program that takes care of the user mapping? If yes, can you please give an overview so that I can dig into it?
    Thanks,
    Damodhar.

    Hi Damodhar
    User mapping can be done at the programming level. The User Management Engine in EP 6.0 provides two interfaces to access the user mapping data, namely:
    1. IUserMappingService
    2. IUserMappingData
    You can work with these two interfaces to enable user mapping. Please refer to the following link for further details.
    http://help.sap.com/saphelp_nw04/helpdata/en/69/3482ee0d70492fa63ffe519f5758f5/content.htm
    Hope that was helpful.
    Best Regards
    Priya

  • iPhoto 9.5.1 face recognition slow with large number of photos (>20,000)

    I have a new, fast iMac with the new iPhoto 9.5.1. I have >20,000 photos. When I use face recognition with people who have >50 photos associated with them, the spinning ball continues for 20-40 seconds. I didn't have this with the same number of photos on my old 2009 iMac with iPhoto 9.4.3 (iPhoto '11).
    Thanks.

    That is probably still the initial setup of face recognition for your new iPhoto version. Once the "Faces" database has been rebuilt, the performance should get better. Give it a day or two. If the problem persists, back up your iPhoto library and rebuild it.
    Hold down the key combination option-command (⌥⌘) firmly and double-click the iPhoto library to launch the First Aid tools. Keep holding down the keys until you see the First Aid panel. Select the entry "Rebuild Database" from the panel and click "Rebuild".

  • Can't Empty Trash With Large Number of Files

    Running OS X 10.8.3
    I have a very large external drive that had a Time Machine backup on the main partition. At some point, I created a second partition, then started doing backups on the new partition. On Wed, I finally got around to doing some "housecleaning" tasks I'd been putting off. As part of that, I decided to clean up my external drive. So... I moved the old, unused and unwanted Backups.backupdb that used to be the Time Machine backup, and dragged it to the Trash.
    Bad idea.
    Now I've spent the last 3-4 days trying various strategies to actually empty the trash and reclaim the gig or so of space on my external drive.  Initially I just tried to "Empty Trash", but that took about four hours to count up the files just to "prepare to delete" them. After the file counter stopped counting up, and finally started counting down... "Deleting 482,832 files..." "Deleting 482,831 files..." etc, etc...  I decided I was on the path to success, so left the machine alone for 12-14 hours.
    When I came back, the results were not what I expected. "Deleting -582,032 files..."  What the...?
    So after leaving that to run for another few hours with no results, I stopped that process.  Tried a few other tools like Onyx, TrashIt, etc...  No luck.
    So finally decided to say the **** with the window manager, pulled up a terminal, and cd'ed to the .Trash directory for my UID on the USB volume and did a rm -rfv Backups.backupdb
    While it seemed to run okay for a while, I started getting errors saying "File not found..." and "Invalid file name..." and various other weird things.  So now I'm doing a combination of rm -rfing individual directories, and using the finder to rename/cleanup individual Folders when OSX refuses to delete them.
    Has anyone else had this weird overflow issue with deleting large numbers of files in 10.8.x? Doesn't seem like things should be this hard...
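
    One general-purpose alternative (an aside, not from this thread) is to script the bottom-up delete outside the Finder, removing files first and each directory only once it is empty. A minimal Java NIO sketch of that pattern, with a hypothetical path; whether it fares any better than rm on a Backups.backupdb tree with its ACLs is another question:

    import java.io.IOException;
    import java.nio.file.FileVisitResult;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.SimpleFileVisitor;
    import java.nio.file.attribute.BasicFileAttributes;

    public class DeleteTree {
        public static void main(String[] args) throws IOException {
            // Hypothetical path; point it at the directory you actually want gone.
            Path root = Paths.get("/Volumes/External/.Trashes/501/Backups.backupdb");

            Files.walkFileTree(root, new SimpleFileVisitor<Path>() {
                @Override
                public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) throws IOException {
                    Files.delete(file);                 // delete files as we reach them
                    return FileVisitResult.CONTINUE;
                }

                @Override
                public FileVisitResult postVisitDirectory(Path dir, IOException exc) throws IOException {
                    if (exc != null) throw exc;         // surface errors instead of hiding them
                    Files.delete(dir);                  // delete a directory once it is empty
                    return FileVisitResult.CONTINUE;
                }
            });
        }
    }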

    I'm not sure I understand this bit:
    If you're on Leopard 10.5.x, be sure you have the "action" or "gear" icon in your Finder's toolbar (Finder > View > Customize Toolbar).  If there's no toolbar, click the lozenge at the upper-right of the Finder window's title bar.  If the "gear" icon isn't in the toolbar, select View > Customize Toolbar from the menu bar.
    Then use the Time Machine "Star Wars" display:  Enter Time Machine by clicking the Time Machine icon in your Dock or select the TM icon in your Menubar.
    And this seems to defeat the whole purpose:
    If you delete an entire backup, it will disappear from the Timeline and the "cascade" of Finder windows, but it will not actually delete the backup copy of any item that was present at the time of any remaining backup. Thus you may not gain much space. This is usually fairly quick
    I'm trying to reclaim space on a volume that had a time machine backup, but that isn't needed anymore. I'm deleting it so I can get that 1GB+ of space back. Is there some "official" way you're supposed to delete these things where you get your hard drive space back?

  • Bogging down with large number of photos

    I have over 115,000 (give or take a few 1000) photos in iPhoto at present. I primarily use it to organize photos so I can find them, make slideshows, and hope to eventually make photo books - they're our family history for the past 7 years or so. My iPhoto library is about 155 GB.
    I'm running this on the iMac - bought a couple of months ago. 4GB RAM with a 1T HD.
    My main grumble is that whenever iPhoto is open, my iMac bogs down. I frequently see the spinning wheel, beach ball, or whatever it's called, particularly when iPhoto is in the foreground and just starting up... (plus other times).
    A local Mac person thought that if I moved to Aperture, it would handle this number of photos much better... I've never really looked at Aperture before, so I'm not sure... About 2+ years ago I had PCs and used Adobe Photoshop Elements - I loved the organizer, and the level of editing power was just about right... These days I have next to no time for any editing except the occasional red-eye fix, but I can enjoy it on occasion (mostly swapping out backgrounds, making collages, etc.).
    What could I do to improve things???

    iPhoto is rated for 250,000 images, and while you've a way to go just yet, that is still a very big library.
    With libraries of that size I would expect some slowness loading or quitting, though not in normal use - viewing, organizing and so on.
    You may be able to speed up launching by turning off all automatic syncing - looking for shared libraries, checking with MobileMe, etc. - in the Preferences.
    Aperture is indeed a professional application, but I would caution that there is less value in it if you're not shooting Raw, and that there is a learning curve in using it. You can download a free trial of Aperture [here|http://www.apple.com/aperture/trial].
    You can have multiple libraries in iPhoto; the downside is that you can only have one open at a time.
    If you opt to go that way, then to create a library: hold down the option (or alt) key and launch iPhoto. From the resulting menu select 'Create Library', and use the same keystroke to choose between libraries.
    Managing multiple libraries - including moving pics/albums/rolls and metadata between them - is greatly facilitated by using iPhoto Library Manager.
    Regards
    TD

  • ALV performance with large number of columns

    Dear friends,
    I have created an ALV grid which has approximately 225 fields in it, using classes and not the REUSE function modules.
    After the ALV grid is first displayed, if the user scrolls down to the next page it takes significant time to display the data on the next page. The documentation says that the ALV grid only caches the data upon display (after the first display of a page, and before the ALV grid data is refreshed, it works fine).
    Is there any mechanism for caching the entire ALV grid data before/after the method SET_TABLE_FOR_FIRST_DISPLAY is called?
    Helpful answers will be appropriately rewarded.
    Cheers
    Nitesh


  • Working with large number of search results in struts/jsp application

    I'm developing a struts/jsp web application that returns upwards of 30,000 results obtained from a very complex schema and multiple queries to the database.
    We want the user to be able to paginate through the results, sort by several fields, add/drop columns, and download in several formats. We don't expect heavy traffic but we do want it to be scalable.
    Up to now, I have been storing the results in a session ArrayList, which can eat up a lot of memory. The queries take long to perform, so I do not want to repeat them with every request.
    I would like to store the results in a single MySQL temp table, but I have read that when the connection ends, the table is dropped. I understand that keeping a database Connection as a session variable is not good practice.
    This is not a unique problem - I was hoping someone else has some insight into the best way to proceed.
    Thanks

    glwinsor wrote:
    > I'm developing a struts/jsp web application that returns upwards of 30,000 results obtained from a very complex schema and multiple queries to the database.
    Who will want to deal with 30,000 rows? Not me. When Google returns a result to me, I get it in chunks of 10 at a time.
    > We want the user to be able to paginate through the results, sort by several fields, add/drop columns, and download in several formats. We don't expect heavy traffic but we do want it to be scalable.
    That's a separate, complex issue.
    > Up to now, I have been storing the results in a session ArrayList, which can eat up a lot of memory. The queries take long to perform, so I do not want to repeat them with every request.
    Maybe your schema has issues. Do you have appropriate indexes? Has your DBA checked EXPLAIN PLAN to be sure that the queries are optimal?
    > I would like to store the results in a single MySQL temp table, but I have read that when the connection ends, the table is dropped. I understand that keeping a database Connection as a session variable is not good practice.
    Not scalable.
    > This is not a unique problem - I was hoping someone else has some insight into the best way to proceed.
    The paging part has been dealt with by Hibernate and other frameworks. Something like the Flex grid UI component might help. Caching and such can help you out. It's not easy, as you already know, and there isn't a canned solution that is one size fits all.
    %
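
    Following up on the paging suggestion above: a minimal sketch of database-side paging with plain JDBC and MySQL's LIMIT/OFFSET, so each request holds only one page of rows instead of the whole 30,000-row result. The connection settings, table and column names here are made up for illustration:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.util.ArrayList;
    import java.util.List;

    public class ResultPager {
        // Hypothetical connection settings; adjust to the real schema.
        private static final String URL = "jdbc:mysql://localhost:3306/mydb";

        public static List<String> fetchPage(int pageIndex, int pageSize) throws Exception {
            String sql = "SELECT name FROM search_result ORDER BY name LIMIT ? OFFSET ?";
            List<String> page = new ArrayList<String>();
            Connection con = DriverManager.getConnection(URL, "user", "password");
            try {
                PreparedStatement ps = con.prepareStatement(sql);
                ps.setInt(1, pageSize);                  // rows per page
                ps.setInt(2, pageIndex * pageSize);      // rows to skip
                ResultSet rs = ps.executeQuery();
                while (rs.next()) {
                    page.add(rs.getString("name"));
                }
                rs.close();
                ps.close();
            } finally {
                con.close();
            }
            return page;
        }
    }

    With Hibernate the equivalent is Query.setFirstResult(offset) and Query.setMaxResults(pageSize); either way only the current page lives in the session, and the chosen sort column simply becomes part of the ORDER BY.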

  • File Bundle with large number of files failed

    Hi!
    Well, I thought problems would appear. We have some apps that are distributed just by copying a large number of files (not large in size) to Windows (XP Pro, usually) machines. These are programs that run from a directory without any special need for installation - a happy situation for the admin, from one side. In ZfD 4.0.1 we installed such an app on one of the machines, took a snapshot via the special app (who remembers it), copied the files to a (NetWare) server share, gave rights to the device (~ workstation), associated it with the ws via eDir and ... voila, next restart or whatsoever and the app was there. Very nice, indeed, I miss this!
    So, I tried to make this happen on ZCM 10 (on SLES 11). Created the app, sorry, bundle, uploaded the files (the first time it got stuck, the second time it completed, around 7,500 files) and did the distribution/launch association to the ws (~ device). And ... got errors. Several entries from the log as examples below.
    Any ideas?
    More thanks, Alar.
    Error: [1/8/10 2:41:53 PM] BundleManager BUNDLE.UnknownExceptionOccurred An Unknown exception occurred trying to process task: Novell.Zenworks.AppModule.LaunchException: Exception of type 'Novell.Zenworks.AppModule.LaunchException' was thrown.
    at Novell.Zenworks.AppModule.AppActionItem.ProcessAct ion(APP_ACTION launchType, ActionContext context, ActionSetResult previousResults)
    Error: [1/8/10 2:41:54 PM] BundleManager ActionMan.FailureProcessingActionException Failed to process action: Information for id 51846d2388c028d8c471f1199b965859 has not been cached. Did you forget to call CacheContentInfo first?

    ZCM 10 is not efficient at handling that number of files in a single bundle when they are in the content repo.
    Suggestions include zipping the files, uploading the zip to the content repo, and then downloading and extracting the zip as part of the bundle.
    Or use the "Copy Directory" option to copy the files from a network source directly, like you did in ZDM.

  • How to code spark custom component with variable number of (skin)parts?

    Hello. I'm trying to code a complex Spark custom component that may have a variable number of parts. To help you understand the requirements, the component can be visualized as an HSlider with an unlimited number of thumbs (as opposed to one).
    How do I, in general, represent these thumbs in the host component as well as the skin? If I had a fixed number of thumbs, say 5, I could easily represent them as 5 button SkinParts declaratively. However, it's not immediately clear to me how to deal with a variable number of them.
    I've studied the HSlider implementation as well as other components and can't find an example that fits this pattern. The closest thing that I can think of is to represent the thumbs as a DataGroup and provide a custom item renderer to render them. Coupled with the general HSlider behaviors that I need to preserve, such as the fairly involved local/global coordinate translations, I don't know whether that approach will work.
    Any better ideas? Thanks.

    "#2 sounds utterly strange to me. How would I utilize the phase id?"
    The code below shows my idea, though I have never validated it in any real project:
    import javax.faces.context.FacesContext;
    import javax.faces.event.PhaseEvent;
    import javax.faces.event.PhaseId;
    import javax.faces.event.PhaseListener;
    public class MyPhaseListener implements PhaseListener {
        private static final String IDKEY = "PHASEID";
        public static PhaseId getCurrentPhaseId() {
            return (PhaseId) FacesContext.getCurrentInstance().getExternalContext().getRequestMap().get(IDKEY);
        }
        public void beforePhase(PhaseEvent event) {
            event.getFacesContext().getExternalContext().getRequestMap().put(IDKEY, event.getPhaseId());
        }
        public void afterPhase(PhaseEvent event) { /* nothing to do after the phase */ }
        public PhaseId getPhaseId() {
            return PhaseId.ANY_PHASE;
        }
    }
    You can then write your component's constructor like this:
    if (MyPhaseListener.getCurrentPhaseId().equals(PhaseId.RENDER_RESPONSE)) {
        /* create children, because this is the first time the component is created */
    }

  • BPC Performance with large number of dimensions's members

    Hi,
    I would like to know if there is a limitation on the number of members in one dimension. This dimension, named PROJET, is often used in an expansion on our input schedule reports (to retrieve the projects which belong to the entity entered in the current view).
    With approximately 2,500 members in this dimension, the report takes about 4 minutes to expand (or even to open).
    We have 8 dimensions with a few to 300 members each. The PROJET dimension is the biggest in terms of number of members.
    Thank you in advance for your feedback!
    Helene

    Hi Helene,
    With 3,000 members you should not be experiencing these problems, if your report is designed properly.
    I'm running BPC 5.1 SP8 on SQL 2005, with a dimension containing 22,000 members. Client PC's are typical (XP, Excel 2007, 1 or 2 gig RAM).
    Using EVDRE and this dimension expanding on the rows, most reports & input schedules can expand & refresh in the range of 10 to 30 seconds.
    The faster times are when I use a row expansion memberset using dimension properties, such as Active="X". The slower times are when the memberset is hierarchy-based, such as BAS.
    If you're using a dynamic template (one using EVEXP for the expansion) then you should start over using EVDRE. It will be much faster, particularly if you optimize your row expansion.
    It's often a good idea to add dimension properties specifically for the purpose of optimizing the report expansion, if the dimension has thousands of members. I sometimes go as far as to add properties which mimic the hierarchy (MyLevel2, MyLevel3, etc) just for this purpose.
