Hibernate's dual-layer cache and TopLink's caching strategy

Dear members,
I understand that caching is implemented (or utilized) differently in Hibernate and TopLink. Hibernate seems to have 'dual-layer caching' (which presumably means it has two layers of cache), whereas TopLink has a session cache and a shared cache. The way I see it, they seem to be aiming for the same thing. What are the differences between those two caching architectures (obviously there are some, I just do not know them), and how different are they?
Howard

Yes, there are differences :) For details, check out
TopLink vs Hibernate... revisited... again :)
and
Indirection - how are references resolved after session has been closed?
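
In short: Hibernate's 'dual-layer' naming refers to the per-Session first-level cache plus the shared, SessionFactory-scoped second-level cache, while TopLink's UnitOfWork cache and shared session cache play roughly analogous roles. Below is a minimal Hibernate sketch of the two layers (the Account entity and class names are made up for illustration; it assumes a second-level cache provider such as Ehcache is configured and the entity is mapped with a read-write cache):

    import org.hibernate.Session;
    import org.hibernate.SessionFactory;
    import org.hibernate.cfg.Configuration;

    // Minimal sketch of Hibernate's two cache layers. Assumes hibernate.cfg.xml is on
    // the classpath, a second-level cache provider (e.g. Ehcache) is enabled, and
    // Account is a hypothetical mapped entity declared with <cache usage="read-write"/>.
    public class DualLayerCacheDemo {
        public static void main(String[] args) {
            SessionFactory factory = new Configuration().configure().buildSessionFactory();

            // Layer 1: the Session (first-level) cache, scoped to one unit of work.
            Session s1 = factory.openSession();
            Account a1 = (Account) s1.get(Account.class, Long.valueOf(42)); // hits the database
            Account a2 = (Account) s1.get(Account.class, Long.valueOf(42)); // served from the Session cache, same instance
            s1.close();

            // Layer 2: the shared (second-level) cache, scoped to the SessionFactory.
            // A brand-new Session may be populated from it without another round trip.
            Session s2 = factory.openSession();
            Account a3 = (Account) s2.get(Account.class, Long.valueOf(42)); // may come from the shared cache
            s2.close();

            factory.close();
        }
    }

Whether that last get() actually avoids the database depends on the cache concurrency strategy and provider settings, which is where the two products' architectures start to differ.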

Similar Messages

  • DVD SP Dual Layer burning and DVD@ccess difficulties

    I'm making my first dual layer project and I can't format from DVD SP 3 because I get this error message (see below). I burned the project in Toast 6 and the project works fine on Macs, except the DVD@ccess URL link doesn't connect (it works on my computer). The DVD's 2nd layer doesn't work in some PCs. I'm using Verbatim +R DL media. Can anyone give me some suggestions?
    Can anyone explain this error message to me:
    Formatting was not successful. Layer 1 cannot exceed Layer 0 for a OTP configured disc. Please choose a suitable marker location that will support this condition.

    Hey James
    This particular error means that layer 1 is larger (in size) than layer 0.
    Since we don't have enough details of your project, I can't tell you exactly what to do.
    I will ask you, however: where did you set your layer break? Tell us how much larger layer 1 is than layer 0. What will you be burning your image to: DLT, disc, hard drive? Did you check the format properties to see where a suitable layer break could go? Did you set the layer break to automatic?
    Answering these should get you a fast answer.
    Mikey M.

  • Query performance problem - events 2505-read cache and 2510-write cache

    Hi,
    I am experiencing severe performance problems with a query, specifically with events 2505 (Read Cache) and 2510 (Write Cache), which went up to 11000 seconds on some executions. Data Manager (400 s), OLAP data selection (90 s) and OLAP user exit (250 s) are the other events with noticeable times. All other events are very quick.
    The query settings (RSRT) are
    persistent cache across each app server -> cluster table,
    update cache in delta process is checked -> group on InfoProvider type,
    use cache despite virtual characteristics/key figures is checked (one InfoCube has 1 virtual key figure which should have a static result for a day).
    => Do you know how I can get more details than what's in 0TCT_C02 to break down the time spent in the read and write cache events, or do you have any recommendations?
    I have checked and no data loads were in progress on the InfoProviders and no master data loads (change run). Overall system performance was acceptable for other queries.
    Thanks

    Hi,
    Looks like you're using BDB, not BDB JE, and this is the BDB JE forum. Could you please repost here?:
    Berkeley DB
    Thanks,
    mark

  • LOCAL OLAP CACHE AND GLOBAL OLAP CACHE

    What is local OLAP cache and what is global OLAP cache?
    What is the difference between them? Can you explain a scenario please?
    Will reward with points.
    thanks in advance

    Hello GURU
    Local cache is specific to a user. Before BW 3.0 only the local cache was available: if a user runs a query, the data comes into the cache from the InfoProvider, and the next time the same query is run it will not go to the database but will fetch the data from cache memory. This cache is used only for that particular user; if another user tries the same query, it will not pick up data from the cache.
    From BW 3.0 onward we have the global cache, which means several users can access the same cache for the same query or for related data in the cache.
    Thanks
    Tripple k

  • Difference between Presentation Server Cache and BI Server Cache

    Hello Experts,
    What is the difference between Presentation Server cache and BI Server cache?
    Thanks,
    S Gouda

    Hello,
    Okay, what do you want to do about caching at the BI Server and the Presentation Server?
    An nQSXXXX.tmp file is a temporary cache file maintained by the BI Server for a user's analysis request and is, in effect, data shared between the OBI Server and the OBI Presentation Server. This is referred to as the 'Cursor Cache', which can be managed by going through Administration > Manage Sessions > Clear Cursor Cache. These .tmp files are not related to the BI Server cache.
    By default, the BI Server cache is stored as NQSxxxxx.tbl files in [middleware_home]/instances/instance1/bifoundation/OracleBIServerComponent/coreapplication_obis1/cache.
    Caching occurs by default at the subrequest level, which results in multiple cache entries for some SQL statements. Caching subrequests improves performance and the cache hit ratio, especially for queries that combine real-time and historical data.
    Below are some useful links for cache management in OBIEE 11g.
    http://oraclebisolutions.blogspot.com/2013/02/obiee-11g-obi-server-and-presentation.html
    http://drazda.blogspot.com/2012/10/obiee-11g-cache-management.html
    http://allaboutobiee.blogspot.in/2012/03/cache-management-purging-cache.html
    Please mark if this helps. Otherwise, post the exact questions you have about this post.
    Thanks,
    SVS

  • Dual layer DVD and FCP movie

    Hi
    I am going to try to use a dual layer DVD for a long two-part movie.
    Is it possible to use a "session mode"?
    For example, could I burn one layer, in such a way that people can see how it looks when they play it on a DVD player,
    and then burn the second part... or even burn the first part and a little of the second one before burning the rest of it?
    Feasible ?
    I can use Toast 7.0.2 with my computer.
    Thanks
    Bernard

    No, I don't think you can. It's all or nothing.

  • Dual-Layer Bluray and PE7 or PE8

    Surfing over at Muvipix someone had posted that PE7 does not support Dual-Layer Bluray (BD-R DL 50 gig) on Export.
    PE7 supports only standard Bluray discs at 25 gig for Burning.
    Anyone know if the latest version, PE8, supports BD-R DL 50 gig on Export?
    Thanx..

    Nothing is cast in stone here, I picked up the BD-R DL 50 gig info from a one-liner that was posted over at Muvipix.
    Here's the link to the thread.
    http://muvipix.com/phpBB3/viewtopic.php?f=57&t=5980
    If you look at the original first post, the first paragraph, 2nd line, he states: "When I discovered, that Premiere Elements 7 isn't able to burn BD-R DL (50 GB), I separated my project into two parts."
    So I'm not sure if it's true or not, but nobody over at Muvipix questioned that statement (and there are some pretty bright folks that support that Forum).
    I was just curious if someone could confirm the PE7 BD-R DL thing, and if true, does it also apply to PE8.
    I don't have a blank Bluray RW-DL, and I don't want to spend $30+ bucks to find out (the pricing on these things is ridiculous, especially the BD-RW DL).
    I'm strictly SD right now, but thinking of moving to HD and BD-R, I figure it's almost time for me to join the HD party.
    (Why should they be having all the fun)...lol...

  • Idvd dual layer discs and mkv files

    Hi, iDVD will not encode a film if it is over a certain length. The only way round this is to use a dual layer disc, but these are expensive. Is there a way round this, or can anyone suggest different software to use which is as compatible with DVD players as iDVD is? Thanks
    Also, which software will convert MKV or MP4 to DVD, as I cannot get iDVD to do this? I am using the latest iDVD.
    Many thanks.

    Toast
    Video Monkey
    VisualHub
    Google search for others.

  • Dual layer DVD and Powermac G5

    I have a Powermac G5 2.7 GHz Dual. I use the installed DVD recorder, which is a PIONEER DVD-RW DVR-109, Revision: A912. For the first time in my life I tried to burn a DL DVD-R, but I cannot. I couldn't find the old technical specification for my Mac, but is it correct that it supports dual layer DVD burning?
    What's wrong?

    Take a look at System Profiler - Burning:
    Does your drive support DVD-R DL, or does it support only DVD+R DL?

  • Write-Behind Caching and Limited Internal Cache Size

    Let's say I have a write-behind cache and configure its internal cache to be of a fixed limited size, e.g. 10000 units. What would happen if more than 10000 units are added to the write-behind cache within the write-delay period? Would my CacheStore's storeAll() get all of the added values or would some of the values be missed because of the internal cache size limitation?

    Hi Denis,
    > If an entry is removed while it is still in the write-behind queue, it will be removed from the queue and CacheStore.store(oKey, oValue) will be invoked immediately.
    >
    > Regards,
    > Dimitri

    Dimitri,
    Just to confirm that I understand it right: if there is a queued update to a key which is then remove()-ed from the cache, the following happens:
    First, CacheStore.store(key, queuedUpdateValue) is invoked.
    Afterwards, CacheStore.erase(key) is invoked.
    Both synchronously with the remove() call.
    I expected only erase would be invoked.
    BR,
    Robert
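
    For reference, here is a bare-bones sketch of the CacheStore callbacks this thread is discussing (the class name and stubbed bodies are made up, not from the thread). With write-behind, ripe queued entries are flushed in batches through storeAll(); per the exchange above, an entry that is remove()-d while still queued gets store() and then erase() called synchronously. The write-delay itself is set in the cache configuration (the read-write-backing-map-scheme), not in this class:

        import java.util.Collection;
        import java.util.HashMap;
        import java.util.Iterator;
        import java.util.Map;

        import com.tangosol.net.cache.CacheStore;

        // Bare-bones CacheStore sketch (hypothetical class, backing store stubbed out),
        // just to show which callbacks a write-behind cache drives.
        public class AccountCacheStore implements CacheStore {

            // Read-through: called when a key is missing from the cache.
            public Object load(Object key) {
                return null; // stub: fetch the value from the backing store here
            }

            public Map loadAll(Collection keys) {
                Map result = new HashMap();
                for (Iterator it = keys.iterator(); it.hasNext();) {
                    Object key = it.next();
                    Object value = load(key);
                    if (value != null) {
                        result.put(key, value);
                    }
                }
                return result;
            }

            // Called for a single entry, e.g. when a queued update is written on its own.
            public void store(Object key, Object value) {
                // stub: write one key/value pair to the backing store
            }

            // The write-behind thread batches all "ripe" queued entries into one call.
            public void storeAll(Map entries) {
                for (Iterator it = entries.entrySet().iterator(); it.hasNext();) {
                    Map.Entry e = (Map.Entry) it.next();
                    store(e.getKey(), e.getValue());
                }
            }

            public void erase(Object key) {
                // stub: delete the key from the backing store
            }

            public void eraseAll(Collection keys) {
                for (Iterator it = keys.iterator(); it.hasNext();) {
                    erase(it.next());
                }
            }
        }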

  • Dual Layer DVD and files??

    Thanks SDMacuser for your help last week.
    Will a .img file burn a DL DVD successfully (assuming Verbatim DL+R media, Toast Lite and Lacie DL burner)?
    Would a .cdr file burn give better results?
    Or should I burn directly from iDVD?
    Any ideas on where to get Verbatim DVD-R DL Inkjet printable discs?

    Check out Sam's Club, which is where I buy mine (or Fry's Electronics). But then again my drive prefers minus media, and so do I. Newer burners / S-Drives work well on plus media DL; older drives sometimes choke on plus media, and DL is out of the question unless you own a newer burner.
    I nearly always burn from a disc image at 4x or slower using Roxio Toast (works better than Apple's Disk Utility, IMO).
    or see ebay Item number: 260124945356
    Good luck!
    Message was edited by: SDMacuser

  • Exchange 2010: all internal mail clients (cached and non-cached mode) having connection status issues and login prompts -- external is fine either way

    Hello,
    I'm seeing an issue that started this past Monday with no recent change to our environment. 
    External Outlook Anywhere users and Mobile users, OWA users are unaffected.
    Internal Users are affected when using Outlook of any version, 2010 or 2013.
    -Internal Users normally log into their workstation with their ad credentials, domain joined machines. Outlook opens without credentials prompting ideally.
    1 . Using Cached Mode:
    a. Login prompts, slowness - Since Monday, users are getting prompted to log in. The prompt goes away after logging in at startup. It is also causing high CPU on the workstations.
    b. Free/Busy and Out of Office don't work. I can, however, complete auto-setup for a new user, so Autodiscover is not completely down for internal users.
    c. Checking Connection Status shows connecting status on highlighted entry below without ever establishing connection (just goes away).
    The outlook icon in the system tray says "Outlook is requesting data from the server":
    2. Using Non-cached mode 
    a. No Login Prompt at startup, business as usual
    b. Free/Busy, Out of Office works fine. Autodiscover is fine.
    c. Checking Connection Status shows normal, except it has the CAS array URL with a status of "referral"; it seems to flicker constantly and then go away intermittently.
    d. The main issue in non-cached mode is that sometimes a user will log into Windows, open Outlook, and it will not open, saying "server is unavailable. Retry, work offline, or cancel". I can try re-opening, same message.
    Only after logging out of Windows and logging back in can I get back into Outlook. This is not every time, or consistent with specific users, but random.
    3. Lync pops up for credentials often, even after entering the password.
    URLs are correct. Autodiscover, EWS, etc. 
    Already tried bypassing our LoadMaster load balancer with hosts files on the clients; same issue regardless of CAS array node.
    Not sure what is going on.
    Josh

    You have a hybrid configuration with Office 365, right?  You didn't provide this valuable piece of information.
    In Exchange Online PowerShell run this command:
    Get-OrganizationConfig | FL PublicFoldersEnabled
    If the value is "Remote" then users with Exchange Online mailboxes are looking to on-premises public folders. Be sure that you have legacy public folder interoperability properly configured.
    http://technet.microsoft.com/en-us/library/dn249373(v=exchg.150).aspx
    Ed Crowley MVP "There are seldom good technological solutions to behavioral problems."

  • Oracle cache and File System cache

    At a checkpoint, the Oracle buffer cache is written to disk. But if an Oracle database sits on file system datafiles, it is likely that the data still remains in the file system cache. I don't know how Oracle can keep the data consistent in that case.

    Thanks for your feedback. I am almost clear about this issue now, except for one point that needs to be confirmed: do you mean that on Linux or Unix we can, if required, set "direct to disk" at the OS level, whereas on Windows "direct to disk" is the default and we do not need to set it manually?
    And I have a further question. If a database is stored on a SAN disk, say a volume from a disk array, and the disk array can take a block-level snapshot of a volume, we need to implement an online backup of the database. The steps are: alter tablespace begin backup, alter system suspend, take a snapshot of the volume which stores all database files (datafiles, redo logs, archived redo logs, control file, server parameter file, network parameter files, password file). Do you think this backup has integrity or not? Please note that we do not flush the file system cache before these steps; let's assume the SAN cache is flushed automatically. Can I assume it has integrity because the redo writes are synchronous?
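
    For illustration only, the sequence described above written as a rough JDBC sketch (connection details and the tablespace name are placeholders; the block-level array snapshot happens outside the database, between SUSPEND and RESUME):

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.Statement;

        // Rough sketch of the hot-backup sequence from the post above. Connection
        // string, credentials and tablespace name are placeholders; repeat BEGIN/END
        // BACKUP per tablespace (or use ALTER DATABASE BEGIN BACKUP on 10g and later).
        public class SnapshotBackup {
            public static void main(String[] args) throws Exception {
                Connection con = DriverManager.getConnection(
                        "jdbc:oracle:thin:@dbhost:1521:ORCL", "system", "password");
                Statement st = con.createStatement();
                try {
                    st.execute("ALTER TABLESPACE USERS BEGIN BACKUP"); // datafiles into backup mode
                    st.execute("ALTER SYSTEM SUSPEND");                // quiesce database I/O

                    // ... trigger the block-level array snapshot of the volume here ...

                    st.execute("ALTER SYSTEM RESUME");
                    st.execute("ALTER TABLESPACE USERS END BACKUP");
                } finally {
                    st.close();
                    con.close();
                }
            }
        }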

  • Backup to DVD doesn't span properly with dual layer media.

    I attempted to use the backup feature in Itunes 7.0.1
    After choosing backup to disc, it provided a warning that the backup would require more than one CD - and that's OK by me, so I went ahead and put media in the drive. It started and, after filling the first disc, it stopped and said it was aborting because of an 'unspecified error'.
    My DVD drive is capable of dual layer burning, and I use good quality media. It did successfully burn the DVD, but only part of the library was backed up onto the dual layer DVD I was using. I have used both this drive and this type/brand of media successfully on numerous other projects.
    It looks like the backup routine fails when using dual layer media and backing up more than will fit on one disc.
    I couldn't access the bug report page so am posting it here.
    I don't know if it's repeatable, because I don't want to waste any more dual layer DVDs.
    Windows XP Pro, unremarkable system otherwise: P4 2400, 1 GB memory, no weird startup programs, no antivirus running. Pretty much plain vanilla.
    Just thought others might have the same issue.

    Henry, would it be possible to get a copy of your CD/DVD diagnostics to Apple via the form at the bottom of the following document?
    Optical drives diagnostics and troubleshooting

  • Player compatibility problem with dual layer media

    I've created a dual layer DVD and there is a problem with compatibility from player to player.
    It plays fine in one of my DVD players but will not load in the other.
    The duplicators tested my master on 2 players and it loaded on both, but would lock up if a chapter was selected from the submenu.
    It has 4 tracks of content, 3 menus, and a bunch of chapter markers. The original build had stories (replaced with standard chapter markers), but I removed them hoping it would solve the problem.
    The imported video tracks were created with Compressor. MPEG-2 2 pass VBR max bit rate 7.5. AC3 audio tracks. 7.7 GB on disc.
    I'm already way over my deadline and have no idea what to do next, other than maybe recompressing so the disc is not so close to being maxed out.

    That's just how compatible that media is... +R DL (double layer) & -R DL are not that compatible. Recordable media is less compatible than replicated discs, but double and dual layer media can be 50% compatible at best (depending on your target market...).
    Good quality media, Verbatim +R DL for example, burnt at slow speeds helps improve things.
    -Jake
