Need Clarification on using cache along with JPA

Hi,
I am using Oracle Coherence 3.5 and JDeveloper 11g.
I have written a sample program that implements CacheStore and performs the following operations:
Insert, Update, Delete
I configured the cache-config.xml file, in which the cachestore-scheme element is used to invoke the corresponding class and specify the table name.
When I run the cache using JPA-Cache-server.bat I get the values, but every time the data is fetched from the database. I am using a distributed-scheme in the configuration file. How do I configure the file so that the values come from the DB the first time and from the cache on subsequent reads?
I used the same cache name in another program and tried to retrieve the values from the cache, but I got only null values. :(
I will be very thankful if anyone can help me with this.
Waiting for your response.
Thanks,
Jagadeesh
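
For context, a CacheStore of the kind described above usually follows roughly this shape. This is only a minimal sketch; the class name, table handling and JDBC details are placeholders, not the actual com.oracle.coherence.handson.cachest code.

import com.tangosol.net.cache.CacheStore;
import java.util.Collection;
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;

// Skeleton only: a real implementation would hold a JDBC connection (or a JPA
// EntityManager) and map cache keys/values to rows of the table whose name is
// passed in through the <init-param> in the cache configuration.
public class SampleCacheStore implements CacheStore {
    private final String tableName;

    public SampleCacheStore(String tableName) {
        this.tableName = tableName; // e.g. "filest"
    }

    public Object load(Object key) {
        // SELECT the row for this key from tableName; return null if not found
        return null;
    }

    public Map loadAll(Collection keys) {
        Map result = new HashMap();
        for (Iterator it = keys.iterator(); it.hasNext();) {
            Object key = it.next();
            Object value = load(key);
            if (value != null) {
                result.put(key, value);
            }
        }
        return result;
    }

    public void store(Object key, Object value) {
        // INSERT or UPDATE the row for this key
    }

    public void storeAll(Map entries) {
        for (Iterator it = entries.entrySet().iterator(); it.hasNext();) {
            Map.Entry entry = (Map.Entry) it.next();
            store(entry.getKey(), entry.getValue());
        }
    }

    public void erase(Object key) {
        // DELETE the row for this key
    }

    public void eraseAll(Collection keys) {
        for (Iterator it = keys.iterator(); it.hasNext();) {
            erase(it.next());
        }
    }
}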

Initially I had not changed any data in the DB. I then tried changing data in the DB and tested again, and I got the updated content.
Below is my configuration file:
<cache-config>
  <caching-scheme-mapping>
    <cache-mapping>
      <cache-name>Sample*</cache-name>
      <scheme-name>distributed-db-backed</scheme-name>
    </cache-mapping>
  </caching-scheme-mapping>

  <caching-schemes>
    <distributed-scheme>
      <scheme-name>distributed-db-backed</scheme-name>
      <service-name>DistributedCache</service-name>
      <backing-map-scheme>
        <read-write-backing-map-scheme>
          <internal-cache-scheme>
            <class-scheme>
              <class-name>com.tangosol.util.ObservableHashMap</class-name>
            </class-scheme>
          </internal-cache-scheme>
          <cachestore-scheme>
            <class-scheme>
              <!-- Class where the CacheStore is implemented -->
              <class-name>com.oracle.coherence.handson.cachest</class-name>
              <init-params>
                <init-param>
                  <param-type>java.lang.String</param-type>
                  <!-- Table name -->
                  <param-value>filest</param-value>
                </init-param>
              </init-params>
            </class-scheme>
          </cachestore-scheme>
          <read-only>false</read-only>
          <!-- To make this a write-through cache, change the value below to 0 (zero) -->
          <write-delay-seconds>0</write-delay-seconds>
        </read-write-backing-map-scheme>
      </backing-map-scheme>
      <listener/>
      <partition-count>100M</partition-count>
      <autostart>true</autostart>
    </distributed-scheme>
  </caching-schemes>
</cache-config>
Thanks,
Jagadeesh
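
On the null-values problem: a second program only sees the same data if it joins the same cluster and runs with the same cache configuration file; otherwise CacheFactory.getCache() ends up talking to a different (typically empty) cache and every get() returns null. A minimal sketch of such a reader, assuming the configuration above; the class name and key are made up for illustration.

import com.tangosol.net.CacheFactory;
import com.tangosol.net.NamedCache;

public class CacheReader {
    public static void main(String[] args) {
        // Run with -Dtangosol.coherence.cacheconfig=cache-config.xml so this JVM
        // uses the same configuration (and joins the same cluster) as the cache server.
        NamedCache cache = CacheFactory.getCache("SampleCache"); // matches the Sample* mapping

        // With a read-write backing map, get() is read-through: a miss calls
        // CacheStore.load() once, after which the entry is served from the cache
        // until it expires or is evicted.
        Object value = cache.get("someKey");
        System.out.println("value = " + value);

        CacheFactory.shutdown();
    }
}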

Similar Messages

  • How to use Oracle partitioning with JPA @OneToOne reference?

    Hi!
    A little bit late in the project we have realized that we need to use Oracle partitioning, both for performance and for admin of the data. (Partitioning by range (month); after a year we will move the oldest month of data to an archive DB.)
    We have an object model with a main/root entity "Trans" that has @OneToMany and @OneToOne relationships.
    How do we use Oracle partitioning on the @OneToOne relationships?
    (We'd rather not change the model, as we already have millions of rows in the DB.)
    On the main entity "Trans" we use: partition by range (month) on a date column.
    And on all @OneToMany relationships we use: partition by reference (as they have a primary-foreign key relationship).
    But for the @OneToOne case, the key for the referenced object is placed in the main/source object, as in the example below:
    @Entity
    public class Employee {
        @Id
        @Column(name="EMP_ID")
        private long id;

        @OneToOne(fetch=FetchType.LAZY)
        @JoinColumn(name="ADDRESS_ID")
        private Address address;
        // ...
    }

    EMPLOYEE (table)
    EMP_ID  FIRSTNAME  LASTNAME  SALARY  ADDRESS_ID
    1       Bob        Way       50000   6
    2       Sarah      Smith     60000   7

    ADDRESS (table)
    ADDRESS_ID  STREET      CITY     PROVINCE  COUNTRY  P_CODE
    6           17 Bank St  Ottawa   ON        Canada   K2H7Z5
    7           22 Main St  Toronto  ON        Canada   L5H2D5
    From the Oracle documentation: "Reference partitioning allows the partitioning of two tables related to one another by referential constraints. The partitioning key is resolved through an existing parent-child relationship, enforced by enabled and active primary key and foreign key constraints."
    How can we use "partition by reference" on @OneToOne relationships, or are there other solutions?
    Thanks for any advice.
    /Mats

    Crosspost: How to use Oracle partitioning with JPA @OneToOne reference?
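
    One possible rework (it does involve the model change you were hoping to avoid, so treat it purely as a sketch): move the foreign key to the ADDRESS side, so that ADDRESS carries the primary-foreign key relationship to EMPLOYEE and can itself be created with "partition by reference". The entity and column names below follow the example above; the two classes would live in separate source files.

    import javax.persistence.*;

    @Entity
    public class Employee {
        @Id
        @Column(name = "EMP_ID")
        private long id;

        // inverse side: EMPLOYEE no longer holds ADDRESS_ID
        @OneToOne(mappedBy = "employee", fetch = FetchType.LAZY)
        private Address address;
    }

    @Entity
    public class Address {
        @Id
        @Column(name = "ADDRESS_ID")
        private long id;

        // owning side: ADDRESS carries EMP_ID, on which the enabled primary/foreign
        // key constraints required by reference partitioning can be defined
        @OneToOne(fetch = FetchType.LAZY)
        @JoinColumn(name = "EMP_ID", nullable = false)
        private Employee employee;
    }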

  • Need help in installing JBOSS along with Jdk version

    Hi,
    I am using Linux 64 bit Machine.
    I have Jdk1.5.0_18 and jboss-4.2.3.GA installed in my machine.
    But the JBoss server fails to boot.
    What version of the JBoss server is needed for 64-bit along with JDK 1.5.0_18?
    Any help would be much appreciated.
    Thanks and Regards

    The name of this forum is "Database - General" not JBOSS, JDK, or "anything goes."
    Please delete this post and post your question in the appropriate forum.
    Thank you.

  • 2nd Try: Please Help : UIX : Not able to use messageFileUpload along with messageChoice

    UIX : Not able to use messageFileUpload along with messageChoice
    I used the messageFileUpload UIX element and associated the upload button with a file-upload servlet. The servlet just captures the file and puts it in the desired location. This works just fine.
    My client wanted me to add another choice field on the UI. I used messageChoice and named the field fileType. All the options have name and value populated, but when I submit the form the servlet reads fileType as NULL.
    I am not sure what's happening. Please let me know how I should send the file handle as well as the fileType.
    Thanks,
    Linda

    What exactly are you expecting this to do for you?
    FORALL is designed to process batches of DML to prevent context switching from PL/SQL to SQL.
    It has no relevance to what you are trying to do.
    If you don't want to call the procedure multiple times, you need to rewrite it to pass the cursor to the function, then have the function process all the records.
    Carl
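
    On the original messageFileUpload/messageChoice question: once a form is submitted as multipart/form-data, ordinary form fields are no longer visible through request.getParameter(), so the servlet has to parse the multipart body itself to see both the file and fileType. A minimal sketch using Apache Commons FileUpload (the use of that library, the target directory and the helper class name are assumptions, not part of the original setup):

    import java.io.File;
    import java.util.Iterator;
    import java.util.List;
    import javax.servlet.http.HttpServletRequest;
    import org.apache.commons.fileupload.FileItem;
    import org.apache.commons.fileupload.disk.DiskFileItemFactory;
    import org.apache.commons.fileupload.servlet.ServletFileUpload;

    public class UploadHelper {
        public void handle(HttpServletRequest request) throws Exception {
            ServletFileUpload upload = new ServletFileUpload(new DiskFileItemFactory());
            List items = upload.parseRequest(request);   // parses the multipart body
            String fileType = null;
            for (Iterator it = items.iterator(); it.hasNext();) {
                FileItem item = (FileItem) it.next();
                if (item.isFormField()) {
                    if ("fileType".equals(item.getFieldName())) {
                        fileType = item.getString();     // value of the messageChoice field
                    }
                } else {
                    // the uploaded file from messageFileUpload
                    item.write(new File("/tmp/" + item.getName()));
                }
            }
            System.out.println("fileType = " + fileType);
        }
    }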

  • AUDIO PROBLEM WHILE USING OR ALONG WITH WEBCAM

    Hi friends,
    SUBJECT : AUDIO PROBLEM WHILE USING OR ALONG WITH WEBCAM
    I'm not getting any sound while using the webcam.
    I tried recording my voice and playing it back; I can see my video, but there is no audio.
    Are there any settings I need to cross-check to make sure everything is turned on?
    My laptop details:-
    HP Pavilion dv6 laptop
    Operating system Installed - Windows 7
    Can someone help me with this?
    Thanks
    kiran

    Hi,
    not sure if it is a coincidence or the same question, but the same question got asked internally. The seeded option is required to get the OJSP filter installed. Here's the internal response:
    ADF View integration with MDS not configured
    In web.xml, you need to have MDS-OJSP integration enabled to load the JSPX base plus customizations from MDS. In your web project in JDev, go to the ADFv project properties and select the “Seeded customization” property. It will enable the MDS-OJSP engine configuration in web.xml. If you want to use the user personalization feature at runtime, you would need to select the “User customization” property as well, which enables the ADFv change persistence configuration in web.xml.
    Warning: Some of the metadata under ... is packaged as part of both WAR and MAR. This metadata cannot be accessed from WAR using MDS.
    For these two packages, you are including the files in both the WAR and the MAR. This warning conveys that since these base files are being put in the MAR, at runtime they will be read from the MDS repository and not from the WAR.
    Frank

  • Can we use assumetargetdef along with sourcedef

    Can we use assumetargetdef along with sourcedef

    FISH2 wrote:
    Yes you can, if you observe the constraints below.
    {SOURCEDEFS <full_pathname>} | ASSUMETARGETDEFS
    Use SOURCEDEFS if the source and target tables have different definitions. Specify the source data-definitions file generated by DEFGEN.
    Use ASSUMETARGETDEFS if the source and target tables have the same definitions.
    For Oracle databases that use multi-byte character sets, you must use SOURCEDEFS (with a DEFGEN-generated definitions file) if the source semantics setting is in bytes and the target is in characters. This is required even when the source and target data definitions are identical.
    In other words, after taking into consideration the constraints noted in the documentation, the answer is "No, you can't."
    Can we use assumetargetdef along with sourcedef
    You should state what it is, exactly, that you are trying to achieve.
    "Source definitions" are for the trail being read. There is only one "definition" for the trail. ("Definition" in this sense means the DDL that was used to generate the tables on the source database.) The data captured from the source database is "applied" via SQL to the "target" database. The data is "mapped" from the source schema to the target schema.
    If you don't know what the source schema was, you can assume the target and source schemas are the same, and use the "AssumeTargetDefs" parameter. However, if the source and target table definitions are different, then you'd use "SourceDefs {definitions file}" together with a database connection to the target database for the "target definitions".
    (There are specific instances where you can have a "sourcedefs {file}" together in the same parameter file with a "targetdefs {file}", but that doesn't usually apply. It allows doing mapping upstream, before accessing the target database, to create a trail with a different definition (schema) than the original source database.)

  • HT5590 Use caching server with multiple public Addresses?

    According to the Apple documentation, to use the caching server all clients need to share the same public address via NAT. On my network with many Macs this would appear to make the caching service useless, as we have multiple public addresses to which our clients are NATed (a full class C, to be exact). Is there any way around this restriction, or am I simply going to be unable to use what looks like it would be a highly useful service?

    Yes, the multiple internal/private subnets mapping to a single public IP is very common in the education/enterprise arena. It is the basic hub-spoke topology:
    where all spokes connect to needed resources at the hub, and only the hub is connected to the Internet. In the case of K-12 education, we need to run a content filter (by Federal rules) on student Internet connectivity. The most efficient way to do that is to locate the filter (along with other servers and resources) at the hub and then route all Internet traffic through the hub. Each spoke (and the hub) is a different internal/private network subnet ... 10.65.x.x, 10.66.x.x, etc. In my case I have 3M from each spoke to the hub, and then 45M from the hub to the Internet.
    In the "old" days ... pre 10.8 ... we had (and still have for some of our oler 10.4 computers) a software update server at each spoke, and computers at each spoke were configured (with the Apple software update script) to get their updates from the update server at their spoke ... iApps as well as OS apps. This worked perfectly!
    Now that Apple, in their Orwellian attempt to monitor and control iApps, has introduced this "either-or" attitude about using a local update server OR caching server  (but not giving you the option to get iApps from the local update server) they have really hurt schools like mine. Without being able to serve all updates locally on each spoke, updating becomes impossible when you are tryiing to udpate a lab full of computers, and the iApp alone is 1.2G for EACH computer ...and now it must come from the Internet since the caching server is 'broken.'
    I currently have case open with Apple Enterprise Support, and will now also get my K-12 Apple Support Tech invloved. I will share this info with them. Perhaps there is some solution that I do not know about, or perhaps there will be a solution created by Apple for situations like mine. I can't see being the only one with this problem, I just think that I may be one of the first to notice it due to my limiited bandwith situation.
    Thanks for your insight. Your original post got me thinking and enabled me to identify what *I* feel is the problem. I will keep this thread updated.
    M:>

  • Using NonCatalogLogger along with the LogMBean

    Hi anybody
    According to the API, the NonCatalogLogger class provides application services for logging error messages to the WebLogic server log. The name, location and other properties of the logfile are determined by the LogMBean for the server.
    Now, I have an instance of the LogMBean running in the server, and using this instance I am able to configure our WebLogic server's logging configuration from any client machine. But I am not able to log any message from a client machine into the server's log file. That is, I am not able to use the LogMBean object along with the NonCatalogLogger object.
    Do you have a suggestion?
    Regards
    Zakaria Chowdhury

    http://edocs.bea.com/wls/docs60/javadocs/weblogic/management/configuration/LogMBean.html
    "Rajan Annadurai" <[email protected]> wrote in message
    news:3ced8eb9$[email protected]..
    >
    hi Sanjeev,
    "in addition to FileName you can specify any LogMBean prop for a client inthe same
    manner"
    Can you please list down the LogMbean property to set rotation size. I amnot able
    to find it any where in the documentation.
    thank you,
    Rajan
    "Sanjeev Chopra" <[email protected]> wrote:
    Clients cannot log to the servers logfile. If you use NonCatalogLogger on
    the client, it creates its own file. By default however, the file is
    turned
    off. You need to turn it on by specifying the FileName prop of theLogMBean
    for that client.
    Since client config is not done thru MBeans, the way you define this is
    with
    system props i.e. -Dweblogic.log.FileName=....
    (in addition to FileName you can specify any LogMBean prop for a client
    in
    the same manner)
    "Zakaria Chowdhury" <[email protected]> wrote in message
    news:[email protected]..
    Hi anybody
    According to the API, the NonCatalogLogger class provides applicationservices
    for logging error messages to the weblogic server log. The name,
    location
    and
    other properties of the logfile are determined by the LogMBean for theserver.
    Now, I have the instance of the LogMBean running in the server and
    using
    this
    instance I am able to configure our weblogic server's loggingconfiguration from
    any client machine. But I am not able to log any message from a clientmachine
    into the server's log file. That is I am not able to use the LogMBeanobject along
    with the NonCatalogLogging object.
    Do you have a suggestion?
    Regards
    Zakaria Chowdhury
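
    To illustrate the reply above, a client-side sketch using NonCatalogLogger (the subsystem name and log file name are just examples). As Sanjeev notes, the client does not write to the server's log file; it writes to its own file, which is enabled with the weblogic.log.FileName system property:

    import weblogic.logging.NonCatalogLogger;

    public class LoggingClient {
        public static void main(String[] args) {
            // Start the client with, for example:
            //   java -Dweblogic.log.FileName=client.log ... LoggingClient
            NonCatalogLogger log = new NonCatalogLogger("MyClientSubsystem");
            log.info("client started");
            log.warning("this message goes to the client's own log file, not the server's");
        }
    }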

  • (Sales & Operations Planning) - Can we use RMCPSOP along with IDoc method

    Hi,
    While transferring data to demand management (SOP) using the standard mass processing, the job sometimes fails with the error "No period unit maintained in material master". After analysing the code that triggers this error, we found that it is raised whenever there is no S076E entry for the corresponding planning entry.
    We are using the standard mass processing job with the IDoc method to update S076 (Sales & Operations Planning), and we found that sometimes the S076E table was not getting updated with the material/plant combination. As suggested in SAP Note 500354, program RMCPSOP can be used to synchronize the S076 and S076E tables. According to the technical description, the program creates S076 and S076E entries based on the PGMI and PGZU tables.
    My questions are:
    1) Can we use RMCPSOP along with the IDoc method? Does it overwrite the S076 entries that were updated with the IDoc method?
    2) At what step should RMCPSOP be executed if we use the IDoc method?
    Regards
    Bala Krishna

    > I am planning to use WMMBID02 for it.
    > Although I found that in standard SAP it is available for inbound only, I feel we can generate this IDoc using the user exit available at the time of "material document posting".
    >
    If you find nearly all the fields asked for by your partner in that IDoc, that's fine.
    > My second query is: in this IDoc, along with other information, we also have to send information related to "REASON CODE" and "To Stock Status" (for example, when material is transferred from blocked to unrestricted stock, the To Stock Status will be Unrestricted), but these fields are not available in the IDoc definition. What should I do?
    >
    You can extend the IDoc if you find that the standard IDoc does not already have enough useful fields.
    Reddy

  • I need help with motion control. I am programming in Visual Basic. I will need help with what parts I need to purchase from NI, along with help on the code.

    I am using a Papst servo motor and I need to know where to start and what to purchase to get this motor to spin. I am using Visual Basic, and in my program I calculate the direction and RPM needed from the motor. It will spin anywhere from 1 to 10000 RPM. It seems rather easy, but I have no idea how to spin the motor at a specific RPM and stop it with a stop command in the program. Please help.

    We really should know a little more about your intended uses for this system, but assuming you want to do relatively simple (or even not so simple!) motion, you'll need a few components...
    A motion controller, such as the PCI-7342, can take your VB commands and turn them into the commands needed to "run" the motor. Next you'll need a drive, such as the MID-7342. This includes the servo amplifier that actually powers the motor. It also has connections to "pass through" the encoder signals from the motor back to the motion controller.
    The above-named pieces assume one or two axes of motion. You'll also need a cable to connect the two (can't remember the model right now). You can use MAX to configure the motion controller, and there are just a few VB calls you'll need to make using NI-Motion functions to define the motion and get it going.
    Hope this helps!

  • Need to Copy a subform along with content from one page to another page

    Hi All,
    I am new to Adobe LiveCycle.
    I am facing a particular problem in one scenario.
    I have a growing list of items, i.e. the number of items is uncertain. I have put all these items in a subform.
    Now I need a copy of this subform from the first page on the second page.
    Basically, I want to copy a subform along with its content from one page to another.
    Can anybody please help me.

    In the source project open the Tempo List (the one that is a list editor). Select all tempo changes and copy them (Command+C).
    Close the project.
    Open the destination project, open the Tempo List, delete all information and paste (Command+V). Remember that Logic should be stopped at the exact position where the first tempo event happens. This is usually 1.1.1.1, but check it in the source before closing it.
    hope this helps.
    regards

  • JAVA to XML on runtime a,then using thatstream along with XSL genertateHTML

    hi everybody,
    Can you give me an example of a Java file which generates an XML file at runtime and passes it as a stream to the XSLT processor, along with an XSL file, to produce an HTML output?
    I would appreciate this because I know how to use a physical file with the XSLT processor, but I am not able to pass an XML document generated at runtime as a stream to the transformer class as a source.

    Hi Sumit
    See the links below for examples on doing this:
    http://xml.apache.org/xalan-j/
    http://xml.apache.org/xalan-j/samples.html
    Good Luck!
    Eshwar Rao
    Developer Technical Support
    Sun Microsystems Inc.
    http://www.sun.com/developers/support
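
    For reference, a minimal sketch of the pattern asked about: XML built in memory at runtime is wrapped in a StreamSource and handed to the JAXP transformer together with an XSL file. The stylesheet file name and the XML content here are placeholders.

    import java.io.StringReader;
    import java.io.StringWriter;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class RuntimeXmlToHtml {
        public static void main(String[] args) throws Exception {
            // XML generated at runtime (a DOMSource built with DOM APIs works as well)
            String xml = "<catalog><item name='example'/></catalog>";

            Transformer transformer = TransformerFactory.newInstance()
                    .newTransformer(new StreamSource("catalog-to-html.xsl"));

            StringWriter html = new StringWriter();
            transformer.transform(new StreamSource(new StringReader(xml)),
                                  new StreamResult(html));
            System.out.println(html.toString());
        }
    }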

  • Why we need live cache along with RDB in APO

    Hello Expert,
    I recently moved into the APO module, so I want to know about liveCache: why do we really need it compared to an RDB, and what can liveCache do that an RDB cannot?

    SRS,
    Functionalities are pretty much the same.  LiveCache is a relational Database, built on MaxDB.
    Advantage of LC is that most of the data is stored in memory rather than on disk storage.  Data access and retrieval is substantially faster from memory than from disk storage.
    Best Regards,
    DB49

  • How can i use oracle coherence with JPA/ejb  in web service?

    Hi
    I want to make a web service using JPA which calls Oracle XE via Oracle Coherence, and I want to use JAX-WS. I searched and found that you can build and deploy it using WebLogic, but is there any other way I can build it and deploy it in Tomcat? I want to use Oracle Coherence + Oracle XE + JAX-WS. If it is possible, how can I do it, and otherwise, what other ways are there?
    If anyone knows, please reply; it would help me a lot.
    Thanks in advance,
    Edited by: 913837 on Feb 22, 2012 3:51 PM

    If you want data cached in Coherence to find its way into an Oracle database for persistence, then look at the "CacheStore" section of the Coherence Developer Guide. This also works the other way round, in that you can get data read into a Coherence cache via a database read. Again, look in the Coherence Developer Guide.
    If you want your application's "entry point" into a piece of code to be a web service, then Tomcat+CXF will work just fine. Once you are in the service, just use the Coherence API to put the data in a cache.
    But also look at the HTTP access offered in later versions of Coherence in the form of REST. This may save you the Tomcat+CXF install, depending upon your needs. See the Coherence Client Guide.
    Still, what exactly are you trying to achieve here? It's not clear from your post why a web service using JPA for persistence needs to go via Coherence at all. More info needed.
    Cheers,
    Steve
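
    A rough sketch of the "entry point is a web service, Coherence behind it" idea from the reply above. The service class and cache name are made up; persistence to Oracle XE would be handled by a CacheStore/JPA store configured for the cache, which is not shown here.

    import javax.jws.WebMethod;
    import javax.jws.WebService;
    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;

    @WebService
    public class CustomerService {

        @WebMethod
        public String getCustomerName(String customerId) {
            NamedCache cache = CacheFactory.getCache("customers");
            // read-through: a cache miss triggers the configured store's load()
            Object name = cache.get(customerId);
            return name == null ? null : name.toString();
        }

        @WebMethod
        public void putCustomerName(String customerId, String name) {
            NamedCache cache = CacheFactory.getCache("customers");
            // write-through/write-behind: the configured store pushes this to the database
            cache.put(customerId, name);
        }
    }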

  • Using Caching Server with Only Two Macs?

    I'm currently running OS X Mavericks (10.9.4) on both of my Macs (2008 MB and 2010 MBP). We have a limited Internet download allotment, so I am trying to find ways to reduce the number of OS X system update downloads. Someone suggested that I take a look at OS X Server, because it has a Caching Server, which would download an update once and install it from a local location when other Macs update.
    Given that I only have two (2) Macs, and one of those would have OS X Server, would that work for me?
    Or would the fact that one of my Macs would be running OS X Server make the updates to it incompatible with my other Mac? If that were a problem, would installing OS X Server on both Macs, but only running the Caching Server on one of them, resolve the issue?
    I don't need OS X Server for anything else.
    Thanks!
    Bev in TX

    We have a NetGear router/firewall between the ISP's satellite modem and our computers (my DH's two MS Windows PCs and my two Macs). All of our computers are connected to the network via Ethernet cable (wireless network disabled).
    My Macs are both network configured with:
        Configure IPv4: Using DHCP
        Subnet Mask: 255.255.255.0
        Router: 192.168.1.1
        DNS Server: 192.168.1.1
    MB IP address: 192.168.1.6
    MBP IP address: 192.168.1.2
    But I don't know whether those are static or not. I tried switching my Macs' cable ports on the router, but that did not change the IP addresses. I realize that didn't prove anything, though it would have if the IP addresses had changed.
    I used to be able to download updates from an Apple support web page, but things are not so simple anymore. I had a discussion about this a while back in the Safari Community.
    https://discussions.apple.com/message/26456050?ac_cid=op123456#26456050
    Aside from the suggestion to install OS X Server, a couple of suggestions were made for manually downloading or copying a package:
    Download a log from an arcane URL; try to find the appropriate update in that log; download the update using the URL in the log file. I can readily see a couple of downfalls in this scenario:
    The arcane URL could be changed, so that I would no longer know its location.
    I could easily select the wrong update from that log file.
    Do the update on one Mac, watching the folder /Library/Updates to copy the downloaded package. However, that package would automatically be deleted by the system, so I would have to copy it prior to its removal. I'm not clear how this would work, given that the download and installation are performed automatically, and I have no indication of when the former ends and the latter starts.
    In either case, no suitability checks would be made on the downloaded packages, so I could easily mess up a system (though that would probably not be likely if I keep my Macs' OS versions synchronized). Still, a trashed system can be a headache even with good backups, due to the time it would take to restore. That's why I considered going the OS X Server route, but I know about zilch when it comes to networking.
    Bev in TX
