Content server migration - loadercli is driving me nuts...

We have an old content server database of 150 GB on MaxDB 7.3.0.52 and we are trying to migrate it to Linux x86_64. I have tried the following so far:
- installing/running 7.3 on SuSE SLES 11 SP1 - fails (new kernel threading)
- restoring the 7.3 backup on 7.5 - fails (host pages too old)
- using 'loadercli' from 7.5 (Note 962019) - fails (ASCII --> Unicode)
- now trying 7.6.06.20, and I am stuck
I want to use pipes as transport and use loadercli on the target system (7.6).
I created two files (according to note 962019)
EXPORT USER
catalog outstream pipe '/home/sqdcos/trans/COS.CAT'
data outstream pipe '/home/sqdcos/trans/COS.DATA' PAGES
package outstream file '/home/sqdcos/trans/COS.export'
and
IMPORT USER
catalog instream pipe '/home/sqdcos/trans/COS.CAT'
data instream pipe '/home/sqdcos/trans/COS.DATA' PAGES
package outstream '/home/sqdcos/trans/COS.import'
The pipes do not exist beforehand.
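One thing that bites with pipe transport: each end of a named pipe blocks until the other end attaches, so the export and the import have to run at the same time. A minimal shell sketch of the pattern (generic stand-ins, not loadercli itself; the actual loadercli invocations appear only as comments):

```shell
# Stand-in for the loadercli pipe transport: a writer and a reader must both
# be attached to the same FIFO, otherwise each side blocks forever.
DIR=$(mktemp -d)
mkfifo "$DIR/COS.CAT"                       # loadercli creates these itself
# writer side -- stands in for: loadercli ... -b COS_EXPORT.sql
printf 'catalog-stream\n' > "$DIR/COS.CAT" &
# reader side -- stands in for: loadercli ... -b COS_IMPORT.sql
read -r line < "$DIR/COS.CAT"
echo "$line"                                # prints: catalog-stream
```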
I start the export which seems to work, the COS.CAT pipe is created.
As soon as I start the import, I get the following error message:
IMPORT USER
catalog instream pipe '/home/sqdcos/trans/COS.CAT'
data instream pipe '/home/sqdcos/trans/COS.DATA' PAGES
package outstream '/home/sqdcos/trans/COS.import'
// M    Execute   PACKAGE  to transform  CATALOG
// M    Import    PACKAGE x'01000000A296EB4EB45800009B16031EC842BF0100000000'
// M    Number of TABLES   transformed : 3
// M    Processed command is a PAGES format based copy from database
// with ASCII catalog to database with UNICODE catalog
// M    Execute   PACKAGE  to transform  DATA
// M    Number of TABLES   to transform: 0
// E -25329:    The given data file '/home/sqdcos/trans/COS.DATA' was not
// generated using EXPORT in PAGE Format (missing table description).
In addition, an extra pipe "COS.DATA0000" gets created.
What am I missing here? I have been fiddling with this for hours and I can't figure out what I'm doing wrong.
Markus

what is the source platform (just to be able to test here)?
Source platform is SLES 9 32bit
Target is SLES 11 SP1 64bit
> 50% less data volume in the target sounds strange.
it is, data is missing.
> What does the loader.log say - source and target - anything suspicious?
Not really, it looks "good":
loadercli -d COS -n xx.xx.xx.xx -u SAPR3,SAP -b COS_EXPORT.sql
Loader protocol: '/home/sqdcos/sdb/connd266/loader/log/loader.log'
Loader packages: '/home/sqdcos/sdb/connd266/loader/packages'
User SAPR3 connected to database COS schema SAPR3 on 191.1.1.29.
EXPORT USER
catalog outstream pipe '/home/sqdcos/trans/COS.CAT'
data outstream pipe '/home/sqdcos/trans/COS.DATA' RECORDS
package outstream file '/home/sqdcos/trans/COS.export'
Successfully executed:
Total number of tables (definition) exported: 3
Total number of tables (data)       exported: 3 (excluded: 0, failed: 0)
loadercli -d COS -u SAPR3,SAP -b COS_IMPORT.sql
Loader protocol: '/home/sqdcos/sdb/connd266/loader/log/loader_2011121600202813.log'
Loader packages: '/home/sqdcos/sdb/connd266/loader/packages'
User SAPR3 connected to database COS schema SAPR3 on local host.
IMPORT USER
catalog instream pipe '/home/sqdcos/trans/COS.CAT'
data instream pipe '/home/sqdcos/trans/COS.DATA' RECORDS
package outstream '/home/sqdcos/trans/COS.import'
Successfully executed:
Total number of tables (definition) imported: 3
Total number of tables (data)       imported: 3 (excluded: 0, failed: 0)
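Given the "data is missing" observation above, one sanity check (my suggestion, not from the thread) is to compare per-table row counts between source and target before trusting the copy; the counts could be pulled on each side with MaxDB's sqlcli using SELECT COUNT(*) per table. The comparison itself is trivial shell; the table names and numbers below are made up:

```shell
# Hypothetical per-table row counts, as dumped from source and target:
cat > /tmp/src.cnt <<'EOF'
TABLE_A 120000
TABLE_B 98000
EOF
cat > /tmp/tgt.cnt <<'EOF'
TABLE_A 120000
TABLE_B 45000
EOF
# any diff output means a table did not arrive complete on the target
diff /tmp/src.cnt /tmp/tgt.cnt || echo "MISMATCH"
```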
Could/should we use a higher version on the target system?
Markus

Similar Messages

  • Content server migration

    At the moment our production system (SAP 6.0) has two clients:
    client 400, where the FI/CO/MM applications are running, and client 300
    for HCM applications.
    We would like to have only one client in our production system.
    Because of this, we would like to move the HCM processes and data from
    client 300 to client 400. At the end of the migration, client 300 will
    be deleted.
    Among other applications, SAP E-Recruiting is also running in client
    300. Is it possible to move E-Recruiting data and attachments from one
    client to another? I'm worried especially about attachments stored in
    the content server and about business partner data.
    Thanks a lot!
    Christian

    Hi,
    I have not done a migration yet, but I'm introducing ArchiveLink on a larger scale right now, and migration is also a topic for later tasks. What I have found is OSS note 1043676, which lets you copy documents from one repository to another. This is how I would try it:
    1. Create a new repository pointing at your new CS (e.g. Z2)
    2. Copy all docs from Z1 (your old CS) to Z2 using OSS note 1043676
    3. Change the customizing so that all your relations point to the new repository.
    Since I assume the repository is transparent to the application, a switch to a new repository should be possible.
    I have not tried this, but this is the way I would propose to go.
    Hope I could help you anyway.
    P.S.: If you find the right way, please post it.
    Kind regards, Matthew
    Edited by: matthew c. on May 25, 2010 11:15 AM

  • SAP content server - migration as well as upgrade

    Hello All,
    I am planning to upgrade my content server from 630 to 650. The challenges: our current server runs on Windows 2000 (IA-32) with MaxDB 7.3.0.35, and I want to bring it onto Windows 2008 R2 (64-bit) with MaxDB 7.8.
    My plan is to do an in-place upgrade of the current server from Windows 2000 to Windows 2003, and to perform in-place upgrades of MaxDB 7.3.0.35 to 7.5 and of the content server from 630 to 640 in one stage.
    The next stage is to install content server 650 with MaxDB 7.8 on Windows 2008 R2 (64-bit) and do a system copy from the source.
    My questions: is the first stage feasible for me? Will I face any issues in the second stage? Will the content server accept the change from 32-bit to 64-bit?
    Will I face any challenges in the MaxDB upgrade?
    Regards,
    Prem

    Hi Prem,
    Migration from 32-bit to 64-bit should be possible.
    You may refer to the SAP notes below:
    962019 - System Copy of SAP MaxDB Content Server Database
    129352 - Homogeneous system copy with SAP MaxDB
    389366 - Relocation of documents
    Hope this helps.
    Regards,
    Deepak Kori

  • Fetch XML from Content server to Portlet and display as HTML

    Hi,
    I want to fetch XML from the content server and display that XML in the WebLogic portlet as HTML.
    The main reason is that data will be contributed by the users in UCM, and then I have to get that data from the content server and display it in the portlet.
    Please help.
    Thanks,
    Vinod

    Vinod:
    Probably what you're looking for is to bring the content over using the UCM SPI adapter for WLP's VCR:
    [http://download.oracle.com/docs/cd/E13155_01/wlp/docs103/ucm_adapter/intro.html]
    And then to display the content using a custom template using Content Presenter:
    [http://download.oracle.com/docs/cd/E13155_01/wlp/docs103/cm/displaytemplatesCM.html]
    [http://download.oracle.com/docs/cd/E13155_01/wlp/docs103/portlets/development.html]
    When you say you want to fetch xml, is that because it's a contributor data file? You can either read the property values directly from the VCR, or use the info you get from the node (such as dDocName) to then use a UCM/WCM web service to retrieve formatted content and put it in an iframe. There are a lot of ways to go in formatting the data.
    You can even add edit capabilities to your Presenter template by calling the 'edit data file' web service of WCM.

  • Using a single Content server for DMS and also for Archive link documents

    Hi,
    We have planned a single content server for managing the documents in DMS. In parallel, since standard SAP DMS objects are not available for all SAP transactions, I have proposed making use of the SAP ArchiveLink functionality to maintain the documentation where standard SAP DMS does not provide a solution.
    So here the questions are:
    Can a single content server be used as a content repository for both DMS and ArchiveLink?
    Can anyone who has done this please state whether it is possible to create different or many content repositories for DMS and ArchiveLink on the same server?
    Is this a virtual creation, or can we assign each repository a specific storage space which it should not exceed?
    Thanks and regards
    Sathish

    Hi Sathish,
    -- Through the ArchiveLink document management interface, an SAP system can use various content servers as storage media. Similarly, one content server can be used by multiple SAP systems.
    -- A content server always has a single database assigned to it. A database can therefore be used by only one content server.
    -- A database is split into as many repositories as necessary. The design of the repositories is mapped in the SAP system. A repository contains documents.
    Depending on your requirements (if you have a limited number of documents or low requirements), you can use the same content server for both DMS and ArchiveLink.
    The number of documents stored on the content server is limited only by the size of the database.
    It is better to have an external content server for archiving purposes if your database size keeps growing, since the archiving system's storage does not depend on the database.
    You can also refer to the thread below:
    How to Archive SAP DMS Data?
    Hope this helps...
    regards
    kavitha

  • File Server Migration Source and Target Data Validation

    Does anyone know of a PowerShell script, CLI command, or some other way to verify source and target data after a file server migration? I want to make sure that the shares migrated from the source to the target are an exact match. Thank you.

    Hi,
    An example is provided in this article:
    http://blogs.technet.com/b/heyscriptingguy/archive/2011/10/08/easily-compare-two-folders-by-using-powershell.aspx
    $fso = Get-ChildItem -Recurse -Path C:\fso
    $fsoBU = Get-ChildItem -Recurse -Path C:\fso_BackUp
    Compare-Object -ReferenceObject $fso -DifferenceObject $fsoBU -Property Name, Length
    And Robocopy can also do this job, with the /L and /LOG:file parameters.
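One caveat with the snippet above: Compare-Object over two Get-ChildItem listings compares file metadata, not file contents. For a content-level check you need checksums; on a Unix-like system the same idea could be sketched with coreutils (paths and sample data below are purely illustrative):

```shell
# Build two tiny sample trees standing in for the migrated shares.
SRC=$(mktemp -d); DST=$(mktemp -d)
echo "hello" > "$SRC/a.txt"
cp "$SRC/a.txt" "$DST/a.txt"
# Checksum every file, sort for a stable order, then diff the manifests.
( cd "$SRC" && find . -type f -exec md5sum {} + | sort ) > /tmp/src.md5
( cd "$DST" && find . -type f -exec md5sum {} + | sort ) > /tmp/dst.md5
diff /tmp/src.md5 /tmp/dst.md5 && echo "IDENTICAL"
```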

  • Content server migration 7.6 to 7.8

    Hi Experts,
    we have two SAP content servers:
    - An old one Win2k3 (32bit) with MaxDB 7.6.00.18 and nearly 200GB of data
    - A new Win2008R2 server (64bit) with SAP MaxDB 7.8.02.29 and 150GB of data
    We'd like to move all data from the old 7.6 DB onto the new server.
    Does anyone have a "best practice" for that?
    On both CS we do a daily full backup into the filesystem.
    Is there a possibility to recover the 7.6 backup on the "new" server into the 7.8 environment?
    Any further suggestions? Export/import?
    Thanks for any help.
    Martin Schneider

    Hello Deepak Kori,
    thanks a lot for your reply!
    I read this SAP note and the note with additional information (129352).
    As I understand it, we can do a homogeneous system copy as follows:
    - set up the new server (2008 x64) with a MaxDB instance (7.6.03.09)
    - create a backup (for migration) from the source system (7.6.00.18)
    - recover this backup with initialization on the new server
    - upgrade the DB instance to SAP MaxDB 7.8.02.29
    If this is correct, I think we have a good strategy!
    Thanks
    Martin

  • Content Server Migration to Windows 2008 Server

    Hi,
    All our systems are on Windows 2008 Server. Since CS was not supported on Windows 2008 at the time of implementation, we had to install it on separate hardware (Windows 2003). Now that it is supported, we need to migrate it from this separate hardware to our SolMan server.
    I need some advice on how to go ahead with this activity.
    I believe the steps are as below:
    1. Upgrade MaxDB in our current CS from version 7.6.03.10 to 7.6.5.12 or above, as per SAP Note 1399009.
    2. Install SAP CS on the SolMan server.
    3. Do a homogeneous system copy - restore a backup of the source MaxDB (notes 962019 & 129352).
    4. Copy the ContentServer.INI file and the security directory to the new server, as these are not backed up in the DB backup.
    Is this process correct? Are there any guides for this? Please help.

    Hi,
    I have not done a migration yet, but I'm introducing ArchiveLink on a larger scale right now, and migration is also a topic for later tasks. What I have found is OSS note 1043676, which lets you copy documents from one repository to another. This is how I would try it:
    1. Create a new repository pointing at your new CS (e.g. Z2)
    2. Copy all docs from Z1 (your old CS) to Z2 using OSS note 1043676
    3. Change the customizing so that all your relations point to the new repository.
    Since I assume the repository is transparent to the application, a switch to a new repository should be possible.
    I have not tried this, but this is the way I would propose to go.
    Hope I could help you anyway.
    P.S.: If you find the right way, please post it.
    Kind regards, Matthew
    Edited by: matthew c. on May 25, 2010 11:15 AM

  • Content Server Migration in SAP CRM 4.0

    Hi,
    we have the following problem:
    We want to migrate old documents from an old content repository to a
    new archive so that we can see them in CRM. How do we do this?
    We found note 1043676 for migrating them, but that note is only for SAP R/3. How can we migrate the old documents to the new archive?
    Regards
    Jochen

    Hi
    Please locate the boconfig.cfg file e.g. /home/bi4/BI4/setup/boconfig.cfg
    If the file has only a few entries, it is most likely corrupted. Check whether boconfig.cfg_old has more lines; if yes, make it boconfig.cfg.
    If that still fails:
    - Restore the boconfig.cfg from a backup
    - Copy the boconfig.cfg file from a working BI4 system
    Then restart BI4
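The line-count sanity check described above can be scripted; a sketch with stand-in files (the real paths from the post would be /home/bi4/BI4/setup/boconfig.cfg and boconfig.cfg_old):

```shell
# Stand-in files; in reality compare boconfig.cfg against boconfig.cfg_old.
D=$(mktemp -d)
printf 'a\n' > "$D/boconfig.cfg"           # suspiciously short -> likely corrupt
printf 'a\nb\nc\n' > "$D/boconfig.cfg_old" # the older copy has more lines
cur=$(wc -l < "$D/boconfig.cfg")
old=$(wc -l < "$D/boconfig.cfg_old")
# If the _old copy has more lines, promote it to be the live file.
if [ "$old" -gt "$cur" ]; then
  cp "$D/boconfig.cfg_old" "$D/boconfig.cfg"
fi
```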
    Regards
    Roland

  • Migration of documents from an HTTP content server into SAP KPro and SAP DVS

    hello,
    I want to migrate documents from an HTTP 4.5 content server into the SAP Knowledge Provider (KPro) and into SAP DVS with an ABAP program.
    I know I have to create a PHIO and a LOIO and write them into the table DMS_PH_CD1 and (only the LOIO) into DMS_DOC2LOIO.
    Where do I have to write the URL for accessing the document on the content server? In which table?
    Which function modules do I need to create the PHIOs and LOIOs?
    Does anyone have ideas or hints (like web links) on integrating documents from a content server into KPro and SAP DVS?

    Hello,
    the private key that the hash is signed with is stored
    in your AppServer directory $DIR_INSTANCE/sec and is
    called SAPSYS.PSE. The PSE is a SECUDE (www.secude.de) specific format which contains the private and the public key.
    But I guess you won't get the private key, because it's private, unless you are the administrator.
    Signing is then done via the normal industry standards (http://www.rsasecurity.com/).
    regards,
    mumba.

  • FMX2 - earmarked funds  and SAP Content server

    Hello all,
    Need a help to find solution for the following issue.
    The customer would like to have earmarked funds attachments saved on the SAP Content Server, "out of database".
    In detail: the customer runs FMX2 (change document) and from the menu chooses Environment > Object list > Create attachment.
    The attachment is saved in the DB, so the question is how to change this so that attachments are saved on the SAP Content Server. Also, what needs to be set up in transaction OAC3, i.e. which object type and document type apply to attachments from FMX2?
    The SAP Content Server is up and running and we have repositories - that is not the issue. The issue is only to fill in the data in OAC3 with the correct object type and document type.
    Thanks for any help
    Peter

    Hi, it's better to ask your Basis team. A table maintains the link between the document type and the 'place' where you want to save your documents; it's not FM customizing.

  • SAP Content Server Upgrade from 6.3 to 6.4 & MaxDB 7.3 to 7.8

    Dear Experts,
    We are planning to upgrade SAP Content Server from 6.3 to 6.4 and MaxDB from 7.3 build 35 to 7.8.
    For the content server upgrade, do we need to install Content Server 6.4, install the latest MaxDB 7.8, and export and import the data?
    We are on Windows 2003 32-bit and are migrating to Windows 2008 R2 64-bit.
    Is it possible to directly copy the database and restore it in MaxDB 7.8?
    Or do we need to use loadercli for migrating the data?
    As per the SDN Link http://forums.sdn.sap.com/thread.jspa?threadID=1649661
    the content server upgrade is done by replacing the DLL files in the respective directories with the latest versions.
    Please let me know your views and any known issues.
    Thank You,
    Mahesh

    Hello Mahesh,
    In note [735598|http://service.sap.com/sap/support/notes/735598] you can see:
    If your source version is 7.3.00 or 7.4.03, read the upgrade guide "Upgrade to MaxDB Release 7.5" or "Upgrade to MaxDB Release 7.6" on SAP Service Marketplace:
    http://service.sap.com/instguides > Other Documentation > Database Upgrades > MySQL MaxDB
    Regards,
    Eduardo Rezende

  • Content Server not working correctly

    Hello,
    I am having problems with my Tandberg Content Server. For several years, we have used the Tandberg Content Server (TCS) in combination with our Tandberg Management Server (TMS) to schedule and record video conferences. I schedule 60 video conferences a week to roughly 20 rooms, using the TMS. Of these, about 14 video conferences per week are recorded.
    Originally, I only had one recording alias on the TCS set up, and it worked fine. But I recently added a second recording alias, due to the growing number of requests for recordings of the video conferences. It seemed to go okay.
    Now, however, the Content Server isn't working quite right. It will record two different video conferences at the same time, but once it has finished encoding them, the link that you get to send to people is messed up: often (but not always) the link merges two separate video conferences together in one video. Two one-hour video conferences will be merged into one two-hour recorded conference.
    The problem only occurs with video conferences that take place at the same time. Strangely, if I search the Content Server's F drive, under Data/Media, the recordings are all there, and they are separate. They are only joined together in the Content Server link.
    I have made sure that we are indeed recording on separate recording aliases.
    Because the videos in the Data/Media bin are always labeled something like O1110292903-0593923929.wmv, I can use guesswork to reconstruct a valid URL to get a working link, but it's a lot of extra work: I want to be able to log in to the Content Server admin interface and send someone a link in a few minutes.
    Any ideas on what's going wrong? I created a trouble ticket with Cisco earlier in the week, but have not heard back from Cisco tech support, so I thought I'd open it to the Cisco community.

    Hi Ron, good morning. Since our WebEx the last time, I believe we can use participant templates in the meantime until we find out what is going on when using TMS this way. Using the recording drop-down while not all the systems are registered to the gatekeeper seems to be the trigger here, but to get you running, this is what I propose we try.
    You may have to purge the Content Server out of TMS so TMS doesn't "know" about the IP address of the Content Server.
    You will have to examine your routing and the preferred MCU in routing here, and make two changes under Administrative Tools > Configuration > Conference Settings, all the way at the bottom.
    I had to change the setting there to Always, and the preferred MCU type in routing to MPS. This is logical, since this is the only MCU you have and you need the MCU in each booking (from my understanding of what you said during the WebEx).
    So:
    1) Jot down all the conferences that have the Content Server involved, and remove the recording from each conference.
    2) Purge the Content Server from TMS.
    3) Add a participant template with just the IP address of the content server, under Booking > Participant Templates. The number field should be the address of the content server, and the IP zone should be the IP zone for your system.
    Go ahead and save the template.
    Make a booking with this template and another endpoint (don't add the MPS, since it should be added automatically). Check the connection settings: the MPS should be calling the IP of the TCS and the other endpoint.
    Click Save Conference.
    If the conference saves OK, build another with a different room and the same template, and click Save Conference again.
    The caveat is that when calling the IP of the TCS, the TCS will use what you have configured as the default recording template under Configuration > Site Settings, so just choose the one you want to use all the time if you can.

  • Oracle Content Server Security with Web Center Suite

    Hi all
    I've configured Oracle Content Server (UCM) to integrate with WebCenter Spaces (WCS). The document service works on a user's personal page but fails on group spaces and when I troll the logs I find the following error:
    <WCS-07006> <run-time error obtaining content repository oracle.webcenter.doclib.internal.view.DoclibJCRException: repository error
    Caused By: javax.jcr.ItemNotFoundException: Unable to get folder info for dCollectionID = 288463355527000401
    Caused By: oracle.stellent.ridc.protocol.ServiceException: Unable to display virtual folder information. Cannot read folder.
    I am using a socket connection, not web or SSL.
    UCM is on an external server and not on the same machine as WCS.
    UCM is 11g and WCS 11g patchset 2. Yes I know UCM 11g is not "officially" supported to integrate with WCS until patchset 3, but others have gotten it to work, so I'd like to do the same.
    On the UCM machine, OHS (from the Web Tier utilities) is installed and configured.
    httpd.conf edited as per requirement.
    config.cfg and intradoc.cfg edited as per requirement.
    Admin user on UCM has admin and sysmanager rights.
    I think this may be because UCM uses the internal default LDAP first (i.e. it is first on the list of providers), because it does not have an admin user in the external LDAP. So even though I specify the UCM admin in the service connection, I think it must also be able to find this user in the external LDAP on the WCS side.
    I am currently having trouble figuring out how to give an external LDAP user the same permissions on UCM as those of the default internal LDAP admin user, so that I can use that external LDAP user in the services connection setup instead.
    With WCS, under the Fusion Middleware EM, I simply added an external LDAP user to the domain for the WebCenter section, but for UCM I'm not sure where the appropriate place is to add such a user.
    I am also not 100% sure if this is actually the issue.
    Any advice will be greatly appreciated!
    Thanks!
    If it helps:
    run-time error obtaining content repository
    oracle.webcenter.doclib.internal.view.DoclibJCRException: Repository error
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
    at oracle.adf.view.page.editor.webapp.WebCenterComposerFilter.doFilter(WebCenterComposerFilter.java:106)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
    at oracle.adf.share.http.ServletADFFilter.doFilter(ServletADFFilter.java:62)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
    at oracle.adfinternal.view.faces.webapp.rich.RegistrationFilter.doFilter(RegistrationFilter.java:97)
    at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:420)
    at oracle.adfinternal.view.faces.activedata.AdsFilter.doFilter(AdsFilter.java:60)
    at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:420)
    at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl._doFilterImpl(TrinidadFilterImpl.java:247)
    at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl.doFilter(TrinidadFilterImpl.java:157)
    at org.apache.myfaces.trinidad.webapp.TrinidadFilter.doFilter(TrinidadFilter.java:92)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
    at oracle.adf.library.webapp.LibraryFilter.doFilter(LibraryFilter.java:159)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
    at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:94)
    at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:313)
    at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:414)
    at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:138)
    at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
    at oracle.webcenter.webcenterapp.internal.view.webapp.WebCenterLocaleWrapperFilter.processFilters(WebCenterLocaleWrapperFilter.java:288)
    at oracle.webcenter.webcenterapp.internal.view.webapp.WebCenterLocaleWrapperFilter.doFilter(WebCenterLocaleWrapperFilter.java:177)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
    at oracle.dms.wls.DMSServletFilter.doFilter(DMSServletFilter.java:330)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
    at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.doIt(WebAppServletContext.java:3684)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3650)
    at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
    at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:121)
    at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2268)
    at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2174)
    at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1446)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
    Caused by: javax.jcr.ItemNotFoundException: Unable to get folder info for dCollectionID = 288463355527000403
    at oracle.jcr.impl.ExceptionFactory.itemNotFound(ExceptionFactory.java:587)
    at oracle.stellent.jcr.IdcPersistenceManager.getResourceByUUID(IdcPersistenceManager.java:433)
    at oracle.jcr.impl.TransientLayer.getResourceByUUID(TransientLayer.java:323)
    at oracle.jcr.impl.OracleSessionImpl.getNodeByUUID(OracleSessionImpl.java:279)
    at oracle.webcenter.doclib.internal.view.JCRRepositoryLogic.getNode(JCRRepositoryLogic.java:178)
    at oracle.webcenter.doclib.internal.view.JCRRepositoryLogic.getItem(JCRRepositoryLogic.java:849)
    ... 90 more
    Caused by: oracle.stellent.ridc.protocol.ServiceException: Unable to display virtual folder information. Cannot read folder.
    at oracle.stellent.ridc.protocol.ServiceResponse.getResponseAsBinder(ServiceResponse.java:116)
    at oracle.stellent.ridc.protocol.ServiceResponse.getResponseAsBinder(ServiceResponse.java:92)
    at oracle.stellent.jcr.IdcPersistenceManager.getResourceByUUID(IdcPersistenceManager.java:421)
    ... 94 more

    Hi, thanks for the response.
    1. I have created a default WebCenter project application.
    2. Created a content repository connection using File System.
    3. Created a new page using the default template.
    4. Dragged and dropped the Document List Viewer task flow and set up the content server connection.
    5. After running the application, I can see the list of document links on the page.
    6. But when I click on any link, the browser automatically closes.
    7. On the server console I get the above-mentioned error.
    I am using JDev 11.1.1.5 and running on the internal WebLogic server.
    Thanks
    Naresh

  • Archive Repository - Content Server or Root File System?

    Hi All,
    We are in the process of evaluating a storage solution for archiving and I would like to hear your experiences and recommendations. I've ruled out 3rd-party solutions such as IXOS as overkill for our requirement. That leaves us with the i5/OS root file system or the SAP Content Server in either a Linux partition or on a Windows server. Has anyone done archiving with a similar setup? What issues did you face? I don't plan to replicate archive objects via MIMIX.
    Is anyone running the SAP Content Server in a Linux partition?  I'd like to know your experience with this even if you don't use the Content Server for archiving.  We use the Content Server (currently on Windows) for attaching files to SAP documents (e.g., Sales Documents) via Generic Object Services (GOS).  While I lean towards running separate instances of the Content Server for Archiving and GOS, I would like to run them both in the same Linux LPAR.
    TIA,
    Stan

    Hi Stanley,
    If you choose to store your data archive files at the file system level, is that a secure enough environment?  A third party certified storage solution provides a secure system where the archive files cannot be altered and also provides a way to manage the files over the years until they have met their retention limit.
    Another thing to consider: even if the end users may not need access to the archived data, your company might need to be able to access the data easily due to an audit or lawsuit.
    I am an SAP customer whose job function is the technical lead for my company's SAP data archiving projects, not a 3rd-party storage solution provider, and I highly recommend a certified storage solution for compliance reasons.
    Also, here is some information from the SAP Data Archiving web pages concerning using SAP Content Server for data archive files:
    10. Is the SAP Content Server suitable for data archiving?
    Up to and including SAP Content Server 6.20 the SAP CS is not designed to handle large files, which are common in data archiving. The new SAP CS 6.30 is designed to also handle large files and can therefore technically be used to store archive files. SAP CS does not support optical media. It is especially important to regularly run backups on the existing data!
    Recommendations for using SAP CS for data archiving:
    - Store the files on SAP CS in a decompressed format (make the settings at the repository)
    - Install SAP CS and SAP DB on one server
    - Use SAP CS for Unix (runtime tests to see how SAP CS for Windows behaves with large files still have to be carried out)
    Best Regards,
    Karin Tillotson
