Accessing same Content Management Repository from Multiple Domains

I'm attempting to point many different domains at a single content management repository to avoid duplicating data.
All the domains are able to read content correctly, but when trying to create content (whether programmatically or through the admin tool), most of the domains get this error:
java.sql.SQLException: ORA-00001: unique constraint (SCHEMA_NAME.PK_CM_NODE) violated
Since no sequence is generated in the database, it appears that the primary key is defined by something in the portal code. This seems to imply that you can't have multiple domains pointing to the same repository. Can someone confirm that they've seen this work? And if you've gotten it to work, what am I doing wrong?
Thanks for any help,
Dan Turkenkopf

This particular issue is inherent to the OOTB repository in WLP. The repository uses a sequencer to generate IDs for created content, and the way these IDs are generated makes it possible for two applications to attempt to create items with the same ID. This is because the WLP repository manages the IDs used when creating content and doesn't currently use a GUID strategy for content IDs.
You could configure another repository that does ID generation in a different fashion and avoid the persistence problem I've described.
Beyond that, the VCR layer that provides access to the OOTB repository has a caching issue: each application caches content independently, so caches can go stale. This can be worked around with custom event listener implementations that manage stale caches; these would need to be aware of the other applications sharing the underlying content repository.
-Ryan
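
The GUID strategy Ryan mentions can be sketched in a few lines. This is an illustration only - ContentIdGenerator is a hypothetical class, not a WLP API - but it shows why globally unique IDs sidestep the sequencer collision: no two domains can ever mint the same key.

```java
import java.util.UUID;

// Hypothetical sketch of a GUID-based content ID generator.
// Unlike a shared numeric sequencer, random UUIDs are unique
// across JVMs, so multiple domains can create content at the
// same time without violating a PK_CM_NODE-style constraint.
public class ContentIdGenerator {
    public static String nextId() {
        // Random (type 4) UUID, 36 characters with hyphens.
        return UUID.randomUUID().toString();
    }
}
```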

Similar Messages

  • Accessing the same stateful session bean from multiple clients in a clustered environment

    I am trying to access the same stateful session bean from multiple clients. I also want this bean to have failover support, so we want to deploy it in a cluster. The following description is how we have tried to solve this problem, but it does not seem to be working. Any insight would be greatly appreciated!
    I have set up a cluster of three servers and deployed a stateful session bean with in-memory replication across the cluster. A client obtains a reference to an instance of one of these beans to handle a request. Subsequent requests have to use the same bean and could come from various clients. So after using the bean, the first client stores the handle to the bean (actually the replica-aware stub) to be used by other clients to obtain the bean. When another client retrieves the handle, gets the replica-aware stub, and makes a call to the bean, the request seems to go unpredictably to any of the three servers rather than to the primary server hosting that bean.
    If the call goes to the primary server, everything seems to work fine: the session data is available and it gets backed up on the secondary server. If it happens to go to the secondary server, a bean that has the correct session data services the request but gives the error <Failed to update the secondary copy of a stateful session bean from home:ejb20-statefulSession-TraderHome>. Then any subsequent requests to the primary server will not reflect changes made on the secondary, and vice versa. If the request happens to go to the third server, which is not hosting an instance of that bean, the client receives an error that the bean was not available. From my understanding, I thought the replica-aware stub would know which server is the primary host for that bean and send the request there.
    Thanks in advance,
    Justin
              

              If 'allow-concurrent-call' does exactly what you need, then you don't have a problem,
              do you?
              Except of course if you switch ejb containers. Oh well.
              Mike
    "FBenvadi" <[email protected]> wrote:
    >I've got the same problem.
    >I understand from you that concurrent access to a stateful session bean is not allowed, but there is a token in weblogic-ejb-jar.xml called 'allow-concurrent-call' that does exactly what I need.
    >What do you mean by 'you'll get a surprise when you go to production'?
    >I need to understand, because I can still change the design.
    >Thanks, Francesco
    >[email protected]
    >
    >"Mike Reiche" <[email protected]> wrote in message news:[email protected]...
    >>
    >> Get the fix immediately from BEA and test it. It would be a shame to wait until December only to get a fix - that doesn't work.
    >>
    >> As for stateful session bean use - just remember that concurrent access to a stateful session bean is not allowed. Things will work fine until you go to production and encounter some real load - then you will get a surprise.
    >>
    >> Mike
    >>
    >> [email protected] (Justin Meyer) wrote:
    >> >I just heard back from WebLogic Tech Support and they have confirmed that this is a bug. Here is their reply:
    >> >
    >> >There is some problem in failover of stateful session beans when it's run from a Java client. However, it is fixed now.
    >> >
    >> >The fix will be in SP2, which will be out by December.
    >> >
    >> >Mike,
    >> >Thanks for your reply. I do in fact believe we are correctly using a stateful session bean; however, my description of the problem may have been misleading. We are not accessing the bean concurrently from two different clients. The second client will only come into play if the first client fails. In this case we want to be able to reacquire the handle to our stateful session bean and call it from the secondary client.
    >> >
    >> >Justin
    >> >
    >> >"Mike Reiche" <[email protected]> wrote in message news:<[email protected]>...
    >> >> You should be using an entity bean, not a stateful session bean, for this application.
    >> >>
    >> >> A stateful session bean is intended to keep state (stateful) for the duration of a client's session (session).
    >> >>
    >> >> It is not meant to be shared by different clients - in fact, if you attempt to access the same stateful session bean concurrently, it will throw an exception.
    >> >>
    >> >> We did your little trick (storing/retrieving the handle) with a stateful session bean on WLS 5.1 - and it did work properly, not as you describe. Our SFSBs were not replicated as yours are.
    >> >>
    >> >> Mike
    >> >>
    >> >> [email protected] (Justin Meyer) wrote:
    >> >> >[original question, quoted in full above]
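
For reference, the 'allow-concurrent-call' token discussed above lives in weblogic-ejb-jar.xml. A sketch of its placement follows - the bean name is made up, and the exact element name (allow-concurrent-calls, plural, in later releases) should be checked against the DTD for your WebLogic version:

```xml
<weblogic-ejb-jar>
  <weblogic-enterprise-bean>
    <ejb-name>TraderBean</ejb-name>
    <stateful-session-descriptor>
      <!-- Permit concurrent calls instead of throwing an exception. -->
      <allow-concurrent-calls>true</allow-concurrent-calls>
    </stateful-session-descriptor>
  </weblogic-enterprise-bean>
</weblogic-ejb-jar>
```

As Mike warns above, this only suppresses the exception; it does not make a stateful session bean safe to share under real load.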
              

  • Content Management / Repository Managers

    Hi,
    I must write an iView for selecting Entry Points (Knowledge Management -> Content Management -> Repository Managers).
    Logically, I first need all the entry points. Which class gives me the whole list of the Entry Points in the portal?
    Regards,
    Raissa

    Hi,
    Have you seen this link?
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/kmc/knowledge management and collaboration developers guide.html
    Patricio.

  • I need to Mirror the same content of iPad on multiple Apple TV simultaneously, can it be done

    I need to mirror the same content of an iPad on multiple Apple TVs simultaneously.
    I have an iPad and three Apple TVs in three different rooms, and I would like to play the same video or mirror simultaneously on all the Apple TVs in my three rooms. Is that possible?
    I would also like to know whether it is possible to have two iPads mirrored onto a single Apple TV, so that I can view both iPad screens on the same Apple TV simultaneously.
    Any suggestions?

    I doubt it - most local area networks would die having to stream that amount of data.

  • All SYFY online on demand has error "Content not available from this domain"

    I keep getting an error on any SYFY online on demand shows which says "Content not available from this domain." This applies to any show from SYFY that I click on to test.  Other channels don't have that error.

    Hi,
    Thanks for bringing this issue. Yes, the message should be SEVERE. This will be fixed.
    You may also want to subscribe to [email protected] and bring up any issues, enhancements.
    Regards,
    Deepak

  • Regarding : CMR: CONTENT MANAGEMENT REPOSITORY

    Hi gurus,
    I want to learn the concepts of CMR. What is CMR, and why do we use it?
    Please provide some study material and links that would be useful for me.
    Thanks in advance.

    Hi,
    A CONTENT MANAGEMENT REPOSITORY is an internal repository in KM which can save its data in 3 modes:
    DB mode
    DBFS mode
    FSDB mode
    So depending on your requirements, you choose the mode that suits your needs.
    Check this for more:
    http://help.sap.com/saphelp_nw04/helpdata/en/62/468698a8e611d5993600508b6b8b11/frameset.htm
    So mostly, when you want to save data in KM, you choose an internal repository like CMR.
    Regards,
    Praveen Gudapati

  • Pulling Data in Portal Content Management DB from Documentum

    Is there a way to get data into the portal content management database directly from Documentum? We have Documentum as one of multiple content management sources in our environment.
    What we want is that whenever anything is published in Documentum, the WebLogic Portal content management database gets that same data through a pull or push service.
    Can this be configured in portal content management, or do we need to write our own Java-based service that polls Documentum on a regular basis, gets the data from it, and enters it into the BEA content management database?
    The data migration from Documentum to the WebLogic Portal DB has to be transparent to the user, and no user interaction should be required to accomplish it.
    Thanks
    Michelle
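
If you do end up writing your own Java polling service, the scheduling half is plain JDK code. In this sketch, DocumentumClient and PortalContentWriter are hypothetical interfaces standing in for the real Documentum (DFC) and WLP content APIs; only the polling loop itself is meant literally.

```java
import java.util.List;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Hypothetical stand-in for the Documentum side: returns documents
// published since the given timestamp.
interface DocumentumClient {
    List<String> fetchPublishedSince(long timestampMillis);
}

// Hypothetical stand-in for the WLP content management side.
interface PortalContentWriter {
    void store(String document);
}

public class DocumentumPoller {
    private final DocumentumClient source;
    private final PortalContentWriter target;
    private volatile long lastPoll = 0L;

    public DocumentumPoller(DocumentumClient source, PortalContentWriter target) {
        this.source = source;
        this.target = target;
    }

    // Copy anything published since the previous poll into the portal DB;
    // returns the number of documents transferred.
    public int pollOnce() {
        long now = System.currentTimeMillis();
        List<String> docs = source.fetchPublishedSince(lastPoll);
        for (String doc : docs) {
            target.store(doc);
        }
        lastPoll = now;
        return docs.size();
    }

    // Run the poll on a fixed schedule with no user interaction.
    public ScheduledExecutorService start(long periodSeconds) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(this::pollOnce, 0, periodSeconds, TimeUnit.SECONDS);
        return scheduler;
    }
}
```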

    Hi,
    To my knowledge there is no tool to propagate content from DCTM to BEA. What is available is the CM SPI integration, which links the DCTM system to the Virtual Content Repository in the portal. It works with the DCTM Content Server as well as the caching server. The integration works very well, and we have several joint customers using it.
    If you feel like exploring options, you may want to see if there is a way to export the data from DCTM. If so, you might be able to use the Bulkloader to get data into the portal.
    Regards,
    --alex

  • KM Content Management Repository

    Dear All,
    We have installed EP 7.0 SP10. I would like to store documents - can we use the KM internal repository? If so, do we need to do some configuration? Please provide more details on this.
    Regards,
    Murali

    Hi Murali,
    After completing the installation for usage type EP, you need to perform a number of configuration tasks to enable the basic use of Knowledge Management (KM). These tasks are essential for minimal use of KM features. To enable full use of all features, you need to install the Search and Classification engine (TREX) and configure the Enterprise Knowledge Management scenario.
    - Specification of the portal address so KM can generate URLs for services ("Specifying the Portal URL") - System administrator
    - Configuration of communication channels to enable subscriptions and notifications ("Configuring Channels") - System administrator
    - Configuration of local editing so users can check out documents and edit them with applications of their choice ("Enabling Local Editing") - System administrator
    - Assignment of the content manager role to a user responsible for organizing content creation, access, and publication ("Assigning the Content Manager Role") - User administrator
    - Assignment of tasks to instances of KM running in an SAP J2EE cluster; only necessary for a cluster installation ("Cluster Only: Assigning Tasks to Nodes") - System administrator
    - Removal of the cluster setting; only necessary if one instance of KM is running on a single machine, for example a test or demo system ("Single Instance Only: Removing the Cluster Setting") - System administrator
    - Optional removal of the Search field and Advanced Search option in the portal; if your installation does not include TREX, which enables search within KM, you can hide the search feature ("Installation Without TREX: Hiding the Search Field") - Content administrator
    - Assignment of permissions to folders ("Setting Permissions") - Content administrator
    Permissions define which users have which access rights for items in KM repositories (folders, documents, and links). You need read permission for an item to display its permissions, and you must be entered in its list of permission owners to change them.
    There are two types of permission in Knowledge Management:
    - Permissions (for the item itself): define permission to read, write, and delete items. You can set these for all items (folders, documents, links).
    - Service permissions: define permissions for functions provided by KM services, such as subscriptions. You can only set service permissions for folders.
    In the folder hierarchy of a KM repository, permissions are inherited by subordinate folders from superordinate folders. When you create a new item in a KM folder, the item initially has no access control list; it inherits the permissions of the superordinate folder. When you change and save the permissions, the system creates a separate ACL for the item, and inheritance no longer takes place: from that point on, changes to the permissions of the superordinate folder are no longer inherited by the subordinate object.
    Thanks
    Rajnikanth Dumpala

  • WLP Content Management Repository

    We are using the WLP content management system (file-based repository). We have some configuration in content-config.xml where we set a default value of "c:\portal\content" for "cm_fileSystem_path". So far that is OK, but once deployed in production, this value overwrites the production value that was configured using the WLP admin console. My question is: how do we set this to a particular value at deployment time, rather than always putting it in content-config.xml and then using the WLP admin console to update it with the production value?
    Here is what we would like to do:
    If no config parameter is provided, the deployment should take the value from content-config.xml; but if we do provide the value at deployment time, it should overwrite the default supplied in content-config.xml.
    Thanks
    Lalit Barik

    "How do I create a plan.xml for every environment?" - I'm not sure what you mean; use vi/Notepad if you want! See the section 'Using an Existing Deployment Plan to Configure an Application' in the link. You simply need to create this plan once with the value you want, and then you can use the same one with a different path value (the one you want to change) for each environment. Whether you want to create a template plan.xml whose value is replaced by an Ant build is up to you.
    A plan.xml is similar to a WLST script; you'd only need to parameterize the actual value you want to use as the path (and you'd have to run both for every deployment). I'm not sure this value is accessible over WLST, though.
    regards
    deepak
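
The deployment-plan approach described above might look roughly like this. Everything here is illustrative - the application name, variable name, and xpath are made up and must match your actual EAR and content-config.xml, and the schema namespace should be checked against your WebLogic release:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sketch: override cm_fileSystem_path at deployment time. -->
<deployment-plan xmlns="http://www.bea.com/ns/weblogic/deployment-plan">
  <application-name>myPortalApp</application-name>
  <variable-definition>
    <variable>
      <name>ContentFileSystemPath</name>
      <value>/opt/portal/content</value>
    </variable>
  </variable-definition>
  <module-override>
    <module-name>myPortalApp.ear</module-name>
    <module-type>ear</module-type>
    <module-descriptor external="false">
      <root-element>content-config</root-element>
      <uri>META-INF/content-config.xml</uri>
      <variable-assignment>
        <name>ContentFileSystemPath</name>
        <!-- The xpath must point at the cm_fileSystem_path entry
             in your actual content-config.xml. -->
        <xpath>/content-config/property[name="cm_fileSystem_path"]/value</xpath>
      </variable-assignment>
    </module-descriptor>
  </module-override>
</deployment-plan>
```

You would then supply a per-environment plan with `weblogic.Deployer -plan plan.xml`, leaving content-config.xml's default in place for environments that don't provide one.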

  • Structure of folders of the content management repository

    Hello,
    is there a possibility to get an overview of the folder structure of the CM repository?
    we have a lot of folders and we need an overview!
    best regards, christian

    Hi Christian,
    you can use the <a href="http://help.sap.com/saphelp_nw04/helpdata/en/2a/1d95a6ff8e5b49a51960210374ba5b/content.htm">Resource Statistics Report</a> for that. Just navigate to Content Management -> Reports -> Resource Statistics -> Start
    Add the location "/", under parameters enter as Access Path Pattern "/**", check only "Include Folders" and "List Matching Items" and click on start.
    Hope this helps,
    Robert

  • Accessing secured content area view from JPDK

    Is it possible to access the secured content area views from JPDK?
    For example, if I am logged on as user USER1 in Portal, is it then possible to access WWSBR_ALL_ITEMS as USER1?

    hi,
    You can access the Content Area APIs from any user using JDBC calls, but you may have to grant EXECUTE privileges on those procedures (and SELECT privilege if it's a DB object like a table or view).
    If you are using PL/SQL procedures in your application, you can access them directly through PL/SQL calls; otherwise you have to use JDBC.
    --Sriram

  • Managing Photos from multiple sources to iPhoto '11 Photo Stream on Mac.

    I combined photos from multiple sources (camera, desktop, iPhone) into iPhoto '11 on my Mac. I determined the keeper photos and put them in an Event; I also rated and flagged them. I want to delete the rest of the photos from Photo Stream, but I can't see the ratings/flags there, so I can't determine which ones to delete.
    This is basic file management, but I'm not finding a solution. Suggestions?

    The Photo Stream is designed to push your images into the cloud the moment they are taken. It does not (yet) show the ratings and most of the tags.
    If you have an event with all the photos you want to keep, the easiest way would be to delete all photos from the Photo Stream (on all your iOS devices) and then add only your selected "keeper" images to the Photo Stream again.
    Regards
    Léonie
    Added: It is not safe to rely on the Photo Stream for storage, since it only keeps a snapshot of currently added photos - the last month's photos or the last 1000 photos. You have no control over the sorting of photos in the stream or how long they will be kept. The Photo Stream is a means of sharing, not of storing, images. I prefer to use "Shared Photo Streams" for more permanent photo selections. These show exactly the selection of images I add to them, and they are not automatically erased after a month.
    iCloud: Using and troubleshooting Shared Photo Streams

  • Content management repository integration issues

    We are facing some major limitations with our content management repositories (we are currently using Documentum, Lotus, and a few others), especially in integrating these various repositories and sharing the information throughout the enterprise.
    We are looking into Livelink from OpenText as a possible solution, and we are attending the upcoming webinar on the topic:
    http://www.opentext.com/events/event.asp?id=34947#register
    Has anyone used the product, especially in conjunction with the Oracle Portal product?
    thanks,
    Joe

    I have a solution for you. Switch to the Oracle Content Management SDK! :-)

  • How to manage photos from multiple accounts

    Hi,
    I am trying to solve one problem. I have one iMac with two user profiles connected to different Apple IDs. Next to it there are two iPhones and two iPads. I am trying to find out how to manage photos from all these devices using one account on the iMac. My second question is whether there is any function in iPhoto that can publish images to a NAS server. The second alternative for me is to buy iCloud drive space and use Family Sharing for publishing photos. A schema is attached below; it is only my idea, and I don't know if it could work like this. Did anyone solve this configuration somehow? Thanks

    I can see the screenshot if I double click on it.  However, here is it again.  Being Halloween maybe it'll show.

  • Accessing a single itunes library from multiple macs using a NAS???

    hi, at the moment I have an iMac G5 holding all my iTunes library files on its hard drive. I'm about to buy another Mac to locate downstairs, and I also have iTunes on my Windows laptop.
    I am thinking of moving the iTunes library to a network-attached storage device, something like the Buffalo TeraStation. Does anyone have experience of this? Does it work well over a wireless network (802.11g)?
    How easy is this to set up, and how do I make each Mac point to the library on the NAS device? Are there any problems with ripping CDs to the NAS device, or with multiple Macs accessing different songs at the same time?
    What would I need to do on each Mac to make it see the library if I moved it to the NAS?
    Sorry for all the questions, but I know what I want to do, just not quite how to do it!
    thanks, andy

    hi,
    Sorry, I haven't tried this. I've now resorted to buying a Mac mini, attaching an external FireWire hard drive, and using iTunes sharing. It still means I have to have the Mac mini on all the time, so not ideal. I decided not to buy the NAS for the time being. I'm still interested in hearing whether it works on other people's systems, though!
    There are others who have done something similar: they created a shared folder on one computer and opened access to it from all the other computers. The result was that only one computer could listen to the library at a time. I would assume the same would be true when using a NAS, due to the file locking iTunes imposes. Anybody want to take a ride to the Apple Store and see if one of their reps has done it?
