Using cache - best practice

We created a banner using TopLink. The buttons (images) are all stored in the database. The banner appears on 300+ pages, and the banners change depending on some business rules. I want the best performance available, yet I want to make sure that once a record is updated, so is the banner. What is the best cache option to use? Also, where should I set the cache? If I click TopLink Mapping (I'm using JDeveloper 10.1.2), should I double-click on the specific mapping? I only see three options:
- default (checked)
- always refresh
- only refresh if newer version
Is there some type of "best practices" guide for using TopLink's cache?
Thanks,
Marcelo

Hello Marcelo,
Can't be sure exactly, but are you modifying the database outside of TopLink? That would explain why the cached data is stale. If so, what is needed is either a strategy for refreshing the cache once changes are known to have been made outside TopLink, or a revised caching strategy. This could be as easy as calling session.refreshObject(staleObject), or configuring specific class descriptors to always refresh when queried.
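For illustration, a minimal sketch of both options against the TopLink 10g API (this requires the TopLink library on the classpath; the method and class names here are only a sketch of the usual conventions, and the stale object is assumed to be your Banner instance):

```java
import oracle.toplink.descriptors.ClassDescriptor;
import oracle.toplink.sessions.Session;

public class BannerCacheExample {

    // Option 1: explicitly refresh one object you know is stale
    // because the row was changed outside TopLink.
    public static void refreshStaleBanner(Session session, Object staleBanner) {
        session.refreshObject(staleBanner);
    }

    // Option 2: a descriptor amendment method (registered on the
    // descriptor in the Mapping Workbench) so every query for this
    // class goes to the database and refreshes the cached copy.
    public static void addToDescriptor(ClassDescriptor descriptor) {
        descriptor.alwaysRefreshCache();
        descriptor.disableCacheHits();
    }
}
```

Note that always-refresh only applies to queries that actually reach the database; disableCacheHits is usually paired with it so that primary-key reads go to the database as well, at an obvious performance cost.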
Since this topic is rather large and application dependent, I'd recommend looking over the 10.1.3 docs:
http://download-west.oracle.com/docs/cd/B25221_01/web.1013/b13593/cachun.htm#CHEJAEBH
There are also a few other threads that have good discussions on how to avoid stale data, such as:
Re: Only Refresh If Newer Version and ReadObjectQuery
Best Regards,
Chris

Similar Messages

  • Collection caching - best practices

    Hi all
    I have a list of rows in a db that are ultimately added to a html <select>
    field
    This select field is on every page of my site
    My client calls a method in a SLSB which
    - calls a finder method
    - is returned a collection of read only EJB local interfaces
    - copies the contents of the local interfaces into plain java beans
    - returns a collection of plain java beans to the client
    Other than making my EJB read-only, what best practices should I consider so
    that I can minimise the amount of work involved each time I want to
    build this <select> field?
    Specifically, I was wondering what best practices people are implementing to
    cache collections of information in an EJB environment. I would like to
    minimize the amount of hand-rolled caching code I have, if possible.
    Thanks
    Matt

    thanks
    i just wanted to make sure
    "Cameron Purdy" <[email protected]> wrote in message
    news:[email protected]:
    For read-only lists, use a singleton pattern to cache them.
    For caching data in a cluster, consider Coherence:
    http://www.tangosol.com/coherence.jsp
    Peace,
    Cameron Purdy
    Tangosol, Inc.
    http://www.tangosol.com/coherence.jsp
    Tangosol Coherence: Clustered Replicated Cache for Weblogic
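    A minimal sketch of the singleton approach suggested above, in plain Java (the class name is made up, and loadFromDatabase stands in for the SLSB finder call):

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

/** Hypothetical singleton holder for the read-only <select> options. */
public final class SelectOptionsCache {
    private static volatile List<String> options;

    private SelectOptionsCache() {}

    public static List<String> getOptions() {
        List<String> local = options;
        if (local == null) {
            synchronized (SelectOptionsCache.class) {
                local = options;
                if (local == null) {
                    // Loaded once; every later call returns the cached list.
                    local = Collections.unmodifiableList(loadFromDatabase());
                    options = local;
                }
            }
        }
        return local;
    }

    // Stand-in for the EJB finder that loads the rows.
    private static List<String> loadFromDatabase() {
        return Arrays.asList("Option A", "Option B");
    }
}
```

    The list is built on first use and returned unmodifiable, so every page can render the <select> without another finder call.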

  • SLES OES Patching using ZCM - Best Practice?

    Having recently updated an underused ZLM 7.3 to ZCM 11, I have started to utilise it to perform patching of SLES 10 SP3 and OES 2.
    At present I have a Novell subscription that downloads from Novell NU as Patches, with
    "Create Category based Bundle Groups" on.
    I hadn't at this point got my head around the sandbox, but now want to utilise it, as I don't want to implement patching willy-nilly and have patches deployed and installed without testing first.
    So my first question is whether there is a best practice document for using ZCM 11 for patching SLES 10 and SLES/OES boxes that describes how best to use the sandbox.
    My next question is whether the sandbox applies to the patches or only to monolithic bundles.
    At the end of the day it would be great to have a best practice document on how to utilise ZCM, with change control, for keeping SLES 10/OES boxes patched up to date.

    Hello,
    updates via ZCM are also a problem for me.
    At least I can give a little feedback from testing.
    As stated in the OES 2 documentation, patches have to be used for updates. At that point I had the problem that the subscription only created a bundle group which included the patches.
    The monolithic bundle seems to be an ordinary update (the ZCM docs state rpm -U ...) and so should not be used for OES 2 servers.
    I found out that it is OK to rename the patch bundle group created by the subscription to fix the patch list. I use bundle names which include the date to keep updates from SLES and OES in sync.
    So you should be able to do updates with these bundle groups without problems, but it is not very nice, because each patch bundle has its own log report and entry in the bundle list.
    So keeping a global view is not very easy.
    The other problem is that there is no official documentation, on either the OES 2 or the ZCM site, on how to correctly update/patch a SLES or OES system with ZCM (without Patchlink).
    I hope this will be included soon, as all Linux Satellite Servers will have this problem: it is the only way to get updates without a direct connection to the internet - at least as far as I can see.

  • How to use KNOA - best practices

    I am looking for best practices in implementing KNOA. How is the tool being used? How do we make sure it yields value? Your input is appreciated!

    Hi Shishir,
    https://websmp104.sap-ag.de/instguides
    Also:
    help.sap.com\Sap Best Practices\Getting Started R/3
    Patricio.

  • REACh Implementation using SAP Best Practices for Chemicals

    Hello,
    We are working on implementation of SAP REACh Compliance in to Best Practices for Chemicals. Currently we are extending the Best Practices scenario for "SVT" to  cover REACh Compliance.
    Do we have a preconfigured scenario available with list of building blocks to be activated for REACh Compliance to work?  Are there any plans to release the same in future?
    Thanks
    Jayakumar

    Chemicals Best Practice support for REACh is currently limited to the SVT scenario you mentioned in your question. Having Best Practice support for SAP REACh Compliance is a good idea, of course, but the BP community needs to decide to do so and take action. This has not been done so far, and it would currently be difficult to realize, since there is simply no REACh best practice yet... All companies are more or less still in learning mode about how to find the best way to implement the REACh legislation, not to forget that REACh as such (its guidance documents) is still changing too.
    Kind rgds
    Marko

  • "Source used" item best practice

    Hello,
    How can I configure an item with source used "Always, replacing any..." and source type "Database Column" and STILL not lose the entered value every time the page comes back? (Items are by default repopulated with DB column values, which makes me lose all entered values.)
    The reason why the page comes back before a submit is that I have some "Select List with Redirect" items which dynamically populate other select lists (dynamic LOVs).
    How can I solve or work around this issue? Remember, I want the entered values NOT to be reset by the DB column values.
    Thanks for your help!
    Sam

    Sam,
    I've begun to use AJAX for this type of select list. Solves the caching issues nicely and doesn't cause an entire page refresh as the page isn't submitted until all the data is entered. Carl Backstrom has an example in his samples application - http://apex.oracle.com/pls/otn/f?p=11933:37
    Make that Denes Kubicek's - http://deneskubicek.blogspot.com/2008/04/cascading-select-list-in-tabular-form.html
    Dave

  • CS3 Bridge Cache Best Practices

    I am using CS3 Bridge Version 2.1.1.9 and have a question(s) about the cache.
    (System specs: Vista 64bit SP1, CPU: Quad9550, Ram: 4GB, Video: 9800GT)
    When and why should I purge the cache?
    Whenever I re-enter Bridge and go back to a recent folder of images, the spinning "pizza wheel" reappears for a few seconds and seems to refresh the thumbnails.  Why could it not do this once and keep the refreshed thumbnails?
    By the way, I just recently switched the Preference from High Quality to Quick Thumbnails and that change gave me a huge boost in speed.  The High Quality thumbnails were incredibly SLOW which once again begs the question just above, why does it not do it once?
    Thanks
    Jim Calvert

    As I mentioned in post #4, my thumbnails stay for several weeks (I use quick thumbs).
    I have a dedicated scratch partition of 40 GB (shared with Photoshop). My Bridge cache was 14 GB this morning, and I elected to compact thumbnails; that process reduced the cache down to 9 GB. Compacting eliminates cached items that are no longer linked to anything. Folders that had not been visited for at least a week still had their thumbnails, so no rebuilding. I use only the central cache (I do not export to folders).
    The point of this is: how big is your cache? Does it have to be overwritten to make space for other activity? The only other thing I can think of is that if you are using a lot of raw files, there is a separate cache for them. I really do not understand this, or what purpose it serves. But under Edit > Camera Raw Preferences there is a box to set the location and size of the Camera Raw cache. The default is 1 GB, and some have recommended that this be increased to 4 to 10 GB, depending on your use of ACR and your file sizes.
    My thumbs load instantly (unless they have to be rebuilt). I have between 400-800 pictures in a folder.
    I am using CS3.

  • Best practice for moving from a G5 to a new Mac with SL

    I am receiving my new iMac today (27") and am very excited.
    However, I want to move over using best practices, to ensure that I remain excited and not frustrated.
    My initial thoughts are to boot it up and do the initial setup, move my iPhoto library over, and use Migration Assistant to move the rest of my data files.
    Then to install all of the extra software that I can find the packages for from the original installation disks.
    And then finally to use Migration Assistant again to move over any software that I can not find original disks for (I've moved from Mac to Mac to Mac over and over, and some of the software goes back to OS 9 - and won't run anymore, I guess).
    Is this a good way?
    OR
    will I mess up doing it this way?
    OR
    am I spending far too much time worrying about moving old problems over, and would I be better off just turning MA loose and letting it do its thing from the beginning?
    BTW - Mail crashes a lot on my existing system - pretty much everything else seems OK - except iPhoto is slow - hoping that the new Intel dual core will help that.
    LN

    Migration Assistant is not a general file moving tool. MA will migrate your Applications and Home folders transferring only your third-party applications. MA will transfer any application support folders required by your applications, your preferences, and network setup. You do not have a choice of what will be migrated other than the above. MA cannot determine whether anything transferred is compatible with Snow Leopard. I recommend you look at the following:
    A Basic Guide for Migrating to Intel-Macs
    If you are migrating a PowerPC system (G3, G4, or G5) to an Intel-Mac be careful what you migrate. Keep in mind that some items that may get transferred will not work on Intel machines and may end up causing your computer's operating system to malfunction.
    Rosetta supports "software that runs on the PowerPC G3, G4, or G5 processor that are built for Mac OS X". This excludes the items that are not universal binaries or simply will not work in Rosetta:
    Classic Environment, and subsequently any Mac OS 9 or earlier applications
    Screensavers written for the PowerPC
    System Preference add-ons
    All Unsanity Haxies
    Browser and other plug-ins
    Contextual Menu Items
    Applications which specifically require the PowerPC G5
    Kernel extensions
    Java applications with JNI (PowerPC) libraries
    See also What Can Be Translated by Rosetta.
    In addition to the above you could also have problems with migrated cache files and/or cache files containing code that is incompatible.
    If you migrate a user folder that contains any of these items, you may find that your Intel-Mac is malfunctioning. It would be wise to take care when migrating your systems from a PowerPC platform to an Intel-Mac platform to assure that you do not migrate these incompatible items.
    If you have problems with applications not working, then completely uninstall said application and reinstall it from scratch. Take great care with Java applications and Java-based Peer-to-Peer applications. Many Java apps will not work on Intel-Macs as they are currently compiled. As of this time Limewire, Cabos, and Acquisition are available as universal binaries. Do not install browser plug-ins such as Flash or Shockwave from downloaded installers unless they are universal binaries. The version of OS X installed on your Intel-Mac comes with special compatible versions of Flash and Shockwave plug-ins for use with your browser.
    The same problem will exist for any hardware drivers such as mouse software unless the drivers have been compiled as universal binaries. For third-party mice the current choices are USB Overdrive or SteerMouse. Contact the developer or manufacturer of your third-party mouse software to find out when a universal binary version will be available.
    Also be careful with some backup utilities and third-party disk repair utilities. Disk Warrior 4.1, TechTool Pro 4.6.1, SuperDuper 2.5, and Drive Genius 2.0.2 work properly on Intel-Macs with Leopard. The same caution may apply to the many "maintenance" utilities that have not yet been converted to universal binaries. Leopard Cache Cleaner, Onyx, TinkerTool System, and Cocktail are now compatible with Leopard.
    Before migrating or installing software on your Intel-Mac check MacFixit's Rosetta Compatibility Index.
    Additional links that will be helpful to new Intel-Mac users:
    Intel In Macs
    Apple Guide to Universal Applications
    MacInTouch List of Compatible Universal Binaries
    MacInTouch List of Rosetta Compatible Applications
    MacUpdate List of Intel-Compatible Software
    Transferring data with Setup Assistant - Migration Assistant FAQ
    Because Migration Assistant isn't the ideal way to migrate from PowerPC to Intel Macs, using Target Disk Mode, copying the critical contents to CD or DVD or an external hard drive, or networking
    will work better when moving from PowerPC to Intel Macs. The initial section below discusses Target Disk Mode; it is followed by a section which discusses networking with Macs that lack Firewire.
    If both computers support the use of Firewire then you can use the following instructions:
    1. Repair the hard drive and permissions using Disk Utility.
    2. Backup your data. This is vitally important in case you make a mistake or there's some other problem.
    3. Connect a Firewire cable between your old Mac and your new Intel Mac.
    4. Startup your old Mac in Target Disk Mode.
    5. Startup your new Mac for the first time, go through the setup and registration screens, but do NOT migrate data over. Get to your desktop on the new Mac without migrating any new data over.
    If you are not able to use a Firewire connection (for example, you have a Late 2008 MacBook that only supports USB):
    1. Set up a local home network: Creating a small Ethernet Network.
    2. If you have a MacBook Air or Late 2008 MacBook see the following:
    MacBook (13-inch, Aluminum, Late 2008) and MacBook Pro (15-inch, Late 2008)- Migration Tips and Tricks;
    MacBook (13-inch, Aluminum, Late 2008) and MacBook Pro (15-inch, Late 2008)- What to do if migration is unsuccessful;
    MacBook Air- Migration Tips and Tricks;
    MacBook Air- Remote Disc, Migration, or Remote Install Mac OS X and wireless 802.11n networks.
    Copy the following items from your old Mac to the new Mac:
    In your /Home/ folder: Documents, Movies, Music, Pictures, and Sites folders.
    In your /Home/Library/ folder:
    /Home/Library/Application Support/AddressBook (copy the whole folder)
    /Home/Library/Application Support/iCal (copy the whole folder)
    Also in /Home/Library/Application Support (copy whatever else you need including folders for any third-party applications)
    /Home/Library/Keychains (copy the whole folder)
    /Home/Library/Mail (copy the whole folder)
    /Home/Library/Preferences/ (copy the whole folder)
    /Home/Library/Calendars (copy the whole folder)
    /Home/Library/iTunes (copy the whole folder)
    /Home/Library/Safari (copy the whole folder)
    If you want cookies:
    /Home/Library/Cookies/Cookies.plist
    /Home/Library/Application Support/WebFoundation/HTTPCookies.plist
    For Entourage users:
    Entourage is in /Home/Documents/Microsoft User Data
    Also in /Home/Library/Preferences/Microsoft
    Credit goes to Macjack for this information.
    If you need to transfer data for other applications please ask the vendor or ask in the Discussions where specific applications store their data.
    5. Once you have transferred what you need restart the new Mac and test to make sure the contents are there for each of the applications.
    Written by Kappy with additional contributions from a brody.
    Revised 1/6/2009
    In general you are better off reinstalling any third-party software that is PPC-only. Otherwise update your software so it's compatible with Snow Leopard.
    Do not transfer any OS 9 software because it's unsupported. You can transfer documents you want to keep.
    Buy an external hard drive to use for backup.

  • JSP Best Practices and Oracle Report

    Hello,
    I am writing an application that obtains information from the user via a JSP/HTML form, which is then submitted to a database. The JSP page is set up following JSP best practices, with the SQL statements, database connectivity information, and most of the Java source code in a Java bean/Java class. I want to use Oracle Reports to call this bean and generate a JSP page displaying the information the user requested from the database. Would you please offer me guidance for setting this up?
    Thank you,
    Michelle

    JSP Best Practices.
    More JSP Best Practices
    But the most important Best Practice has already been given in this thread: use JSP pages for presentation only.
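    As a small illustration of "presentation only": the servlet or controller fills a plain bean from the query results, and the JSP merely reads its properties (for example via EL or <jsp:getProperty>). The bean and its properties here are hypothetical:

```java
// Plain JavaBean that carries query results to the JSP layer.
// All SQL and JDBC work stays in the data-access class; the JSP
// only reads these properties for display.
public class ReportBean {
    private String title;
    private int rowCount;

    public String getTitle() { return title; }
    public void setTitle(String title) { this.title = title; }

    public int getRowCount() { return rowCount; }
    public void setRowCount(int rowCount) { this.rowCount = rowCount; }
}
```

    The JSP then displays ${report.title} and ${report.rowCount} without containing any database code.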

  • Upscale / Upsize / Resize - best practice in Lightroom

    Hi, I'm using LR 2 and CS4.
    Before I had Lightroom I would open a file in Bridge and in ACR I would choose the biggest size that it would interpolate to before doing an image re-size in CS2 using Bicubic interpolation to the size that I wanted.
    Today I've gone to do an image size increase but since I did the last one I have purchased OnOne Perfect Resize 7.0.
    As I have been doing re-sizing before I got the Perfect Resize I didn't think about it too much.
    Whilst the re-size ran it struck me that I may not be doing this the best way.
    Follow this logic if you will.
    Before:
    ACR > select biggest size > image re-size bicubic interpolation.
    Then with LR2
    Ctrl+E to open in PS (not using ACR to make it the biggest it can be) > image re-size bicubic interpolation.
    Now with LR2 and OnOne Perfect Resize
    Ctrl+E to open in PS > Perfect Resize.
    I feel like I might be "missing" the step of using the RAW engine to make the file as big as possible before I use OnOne.
    When I Ctrl+E I get the native image size (for the 5D MkII that is 4368x2912 px, or 14.56x9.707 inches).
    I am making a canvas of 24x20".
    If instead I open from LR as a Smart Object in PS and then double-click the smart icon, I can click the link at the bottom and choose the size 6144 by 4096, but when I go back to the main document it is the same size... but maybe if I saved that, then opened the saved TIFF and ran OnOne, I would end up with a "better" resized resulting document.
    I hope that makes sense!?
    Anyway, I was wondering, with the combo of software I am using, what "best practice" for large-scale re-sizing is. I remember that stepwise re-sizing fell out of favour a while ago, but I'm wondering what is now considered the best way to do it if you have access to the software that was derived from Genuine Fractals.

    I am indeed. LR3 is a nice to have. What I use does the job I need but I can see the benefits of LR3 - just no cash for it right now.

  • Help: Best Practice Baseline Package experience

    All,
    Can I ask for some feedback from folks who've used the Best Practice Baseline package before - I'm in a project implementing ECC 6.0 and we're planning to use this package to quickly set up and demo the system to the business.  I've limited experience in config and system setup so your advice is much appreciated
    Specifically my questions:
    1.  Has anyone used the package for a similar purpose (quick setup to demo to business)?  Does it save a lot of time or should our team just configure the system manually?  Are there pitfalls I should be aware of?
    2.  I understand a limited amount of config / master data customisation can be done, and there's even a wizard. Is it effective and error-free - i.e., will I be better off installing the base package and making changes later manually?
    3.  Assuming after the demo we have firmed-up business requirements, which approach would be faster and easier: configuring from scratch vs. using the existing BP baseline and then doing delta configuration?
    4.  Does the documentation tell me exactly what was configured and the settings used?
    Appreciate any help and insight you can offer, thanks!!

    Dear Jin,
    The following help link should answer your queries:
    http://help.sap.com/bp_grcv152/GRC_US/Documentation/SAP_APJ_BP_Study_2006_CS.ppt#418,1,Slide  of SAP Best Practice
    Regards,
    Naveen.

  • Best practices about JTables.

    Hi,
    I have been programming in Java for five months. Now I'm developing an application that uses tables to present information from a database. This is my first time handling tables in Java. I've read Sun's Swing tutorial on JTable, and various information on other websites, but they are limited to the table's syntax rather than best practices.
    So I decided on what I think is a proper way to handle data from a table, but I'm not sure it is the best way. Let me tell you the general steps I'm going through:
    1) I query employee data from Java DB (using EclipseLink JPA), and load it in an ArrayList.
    2) I use this list to create the JTable, prior transformation to an Object[][] and feeding this into a custom TableModel.
    3) From now on, if I need to search an object on the table, I search it on the list and then with the resulting index, I get it from the table. This is possible because I keep the same row order on the table and on the list.
    4) If I need to insert an item on the table, I do it also on the list, and so forth if I'd need to remove or modify an element.
    Is the technique I'm using a best practice? I'm not sure that having to keep synchronized the table with the list is the better way to handle this, but I don't know how I'd deal just with the table, for instance to efficiently search an item or to sort the table, without doing that first on a list.
    Are there any best practices in dealing with tables?
    Thank you!
    Francisco.

    Hi Joachim,
    What I'm doing now is extending DefaultTableModel instead of implementing AbstractTableModel. This is to save implementing methods I don't need and because I inherit methods like addRow from DefaultTableModel. Let me paste the private class:
    protected class MyTableModel extends DefaultTableModel {
            private Object[][] datos;

            public MyTableModel(Object[][] datos, Object[] nombreColumnas) {
                super(datos, nombreColumnas);
                this.datos = datos;
            }

            @Override
            public boolean isCellEditable(int fila, int columna) {
                return false;
            }

            @Override
            public Class getColumnClass(int col) {
                return getValueAt(0, col).getClass();
            }
    }
    What you are suggesting, if I understood correctly, is to register MyTableModel as a ListSelectionListener, so changes on the list will be observed by the table? In that case, if I add, change or remove an element from the list, I could add, change or remove that element from the table.
    Another question: is it possible to only use the list to create the table, but then managing everything just with the table, without using a list?
    Thanks.
    Francisco.
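    A common alternative to keeping a parallel list in sync is to let the table model own the list, so searches, inserts and removals go through one place. A sketch under that assumption (the Employee type and the two-column layout are invented for illustration):

```java
import java.util.ArrayList;
import java.util.List;
import javax.swing.table.AbstractTableModel;

// Hypothetical row type.
class Employee {
    final int id;
    final String name;
    Employee(int id, String name) { this.id = id; this.name = name; }
}

// The model owns its List; there is no second copy to synchronize.
class EmployeeTableModel extends AbstractTableModel {
    private static final String[] COLUMNS = {"Id", "Name"};
    private final List<Employee> rows = new ArrayList<>();

    public void add(Employee e) {
        rows.add(e);
        fireTableRowsInserted(rows.size() - 1, rows.size() - 1);
    }

    public void remove(int row) {
        rows.remove(row);
        fireTableRowsDeleted(row, row);
    }

    public Employee get(int row) { return rows.get(row); }

    @Override public int getRowCount() { return rows.size(); }
    @Override public int getColumnCount() { return COLUMNS.length; }
    @Override public String getColumnName(int col) { return COLUMNS[col]; }
    @Override public Object getValueAt(int row, int col) {
        Employee e = rows.get(row);
        return col == 0 ? e.id : e.name;
    }
}
```

    Searching or sorting then operates on the internal list, and the fireTable... calls keep the JTable view updated automatically.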

  • SAP Best Practices for Chemicals DE roadmap

    Hi All,
    we are going to start a project using SAP Best Practices for Chemicals. We have decided to use the DE version because the landscape is European-based. I have seen on the SAP Portal that the DE version is based on EhP3: does anybody know if an EhP5 version is planned, and when?
    Thanks a lot.
    regards,
    Sergio.

    To give more info:
    I somehow figured out that note 1648098 is probably the required note.
    However, note 1648098 has the status "The requested SAP Note is either in reworking or is released internally only".
    Can someone verify whether this is the correct note and tell me when this note will be available?
    Regards.

  • Best Practice - HCM service

    Hi,
    The latest ESS package was initially uploaded into the portal. This package consists mostly of Web Dynpro iViews.
    Later it was decided to use the Best Practice, so it was uploaded. Now all the services provided in the Best Practice are available in one common folder, except for ESS. The iViews available in this Best Practice are mostly transaction iViews.
    My query is: why are the HCM (ESS) services not there under the Best Practice folder? Is it due to the already uploaded ESS package? What should I do to avail of the HCM services of the Best Practice?
    It's urgent, please help. All useful answers will be rewarded.

    Thanks Bharathwaj for the reply. Here is the link.
    https://websmp104.sap-ag.de/swdc
    In the website "SAP Software Distribution Center" select the category "Download" -> "Installations & Upgrades" -> "Entry by Application Group" then select "SAP Best Practices" -> "SAP BP PORTALS".
    In this the EP V2.60 version was downloaded.

  • Best Practices for E-Commerce

    Hi Experts,
    I was wondering if anybody has experience in installing and configuring SAP Best Practices for E-Commerce. I have downloaded and installed the ADD-ONs but when trying to configure IPC and webshop via XCM http://<server>:<port>/isauseradm/admin/xcm/init.do I am getting a 404 error "URL or resource not found".
    Who knows what component I am missing and can put me in the right direction!
    Thanks in advance!
    Wolfgang

    I am not sure I understand your question.
    If you are using E-Commerce with CRM 2007 (CRM 6), not ERP, then use the [Best Practices Building Blocks|http://help.sap.com/bp%5Fcrmv12007/CRM%5FDE/html/] -> Technical Information -> Building Block Library.
    Note, however, that there is still no dedicated Best Practice for E-Commerce with CRM.

Maybe you are looking for

  • Age Gate restricted Dailymotion preview images are not displayed - why?

    No Dailymotion video preview image, which requires the Age Gate filter to be turned off, is displayed anymore in my Firefox 32.0.1 and 32.0.2, they are all black. It used to work a couple of weeks ago with an older version, and still works with IE. C...

  • Bug? using MAX() function on char(1) column returns something larger

    details: -- we have a complex nested query, where we are essentially returning the max() value into a variable -- the max() function is being used on a char(1) column -- where MAX() is part of an inner select, we have started getting ORA-06502: PL/SQ...

  • Cost Center planning Error.

    Dear All, I am getting problem wih Fund in doing Cost center planning (kp06). In client level there are 2 controlling areas one is 4000 and another is 6000. the compny code which assigned to controllinga area 6000 they activated fund. but the company...

  • How to use single processor in weblogic

    Hi, In multi core processors, weblogic server should be used specific processor/processors in Linux environment. Is there any configuration available Out of the the box from the Oracle weblogic...? Thanks, Ravindher

  • ARD 3.1 Bug ? (task not ended)

    Hello, After updating to ARD 3.1 (from 3.0), I have some problems using unix task. with 3.0 when I send "softwareupdate -a -i" (using root user) to a group of computer, all update were made in parallel and when each task finised, I can see a green ic...