Performance outlook - triple document storage

We have Oracle 8.1.6 on both Solaris and Windows NT, both multiprocessor
machines. Our document storage is about to triple. We want to get the
documents into Oracle itself, not just keep pointers with ColdFusion
Verity. We have done limited testing with a small number of documents
stored in BLOBs, and text-search performance was reasonable.
How well will searches perform against records with BLOBs (PDF, etc.)
at 100 GB+ of documents, i.e. comparably to ColdFusion Verity? I am
wondering if anyone is at this size or greater and not using BFILEs?
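(A sketch of the usual approach, not from the original post: with 8i interMedia/Oracle Text, searches inside PDF BLOBs go through a CONTEXT index with the INSO filter rather than scanning the LOBs directly. All table and column names below are illustrative assumptions.)

```sql
-- Illustrative schema; names are not from the original post.
CREATE TABLE docs (
  doc_id   NUMBER PRIMARY KEY,
  filename VARCHAR2(256),
  content  BLOB
);

-- interMedia/Oracle Text CONTEXT index; the INSO filter extracts text
-- from binary formats such as PDF before indexing.
CREATE INDEX docs_content_idx ON docs (content)
  INDEXTYPE IS CTXSYS.CONTEXT
  PARAMETERS ('FILTER CTXSYS.INSO_FILTER');

-- A relevance-ranked text search, roughly comparable to a Verity query:
SELECT doc_id, filename, SCORE(1) AS relevance
  FROM docs
 WHERE CONTAINS(content, 'storage AND performance', 1) > 0
 ORDER BY SCORE(1) DESC;
```

With this setup, query time depends mainly on the size of the text index and how often it is synced, not on the raw 100 GB of LOB data.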


Similar Messages

  • Solution Manager Document Storage

    Hi Team,
    I am aware that documents in Solution Manager are stored in the Knowledge Warehouse (KW) through a content server installation. Can anyone briefly explain how document storage in KW impacts / consumes the overall storage capacity of the Solution Manager database, and how it impacts overall Solution Manager system performance?
    In which scenario do we recommend installing a KPro server in the Solution Manager system for document storage?
    Guidance required.
    Thanks
    PSK

    Hi,
    For the implementation area, the following rule applies to the technical names of the folders
    used for storing documents in Knowledge Warehouse:
    For projects:
    With the exception of template projects, the system stores all documents
    in a folder with the technical project name when you create them.
    In template projects, the system stores all documents, with the
    exception of project documents, in folder SOLAR00; project documents
    are stored in a folder with the technical project name.
    For solutions:
    The system stores all documents in a folder with the technical name of
    the solution when you create them.
    Check SAP Note 913175 for reference.
    Kind Regards,
    Fabricius

  • Indexing documents stored in Documentum Content Server

    Hello experts,
    I have a question about how to approach the following project.
    Our development team wants to develop, in our SAP ECC 6.0 system, an application that lets them work with the documents stored in our Documentum Content Server; afterwards they must be able to search over those documents' contents and retrieve them from the content server.
    Basically, we need to index the documents in the Content Server for use in the SAP system.
    Is it possible to do this with Documentum Content Server alone, or do we need SAP TREX?
    Thanks
    Regards.

    Hi Sonia,
    Refer to the Documentum-related link below for the required information:
    http://docs.oracle.com/cd/E10502_01/doc/search.1018/e10418/cmsources.htm
    Hope this is useful.
    Regards,
    Deepak Kori

  • Documents, storage size and number of members (in Workspaces)

    Greetings!
    Is there any limit on the number of documents, storage size, or number of members?
    If so, how can I expand it?
    Thanks in advance.

    There is no limit to the number of documents or the number of members.
    At this time, we are providing 5 GB of storage, but there is no way to purchase additional space.

  • Is the Global Document Storage (GDS) used in Rights Management?

    I am setting up an Adobe LiveCycle ES3 server exclusively for Rights Management, and I am wondering whether the Global Document Storage (GDS) folder will contain any files that need to be backed up for system recovery.
    TS

    Aayush,
    Thank you for your response. 
    In the documentation:
    "LiveCycle specific data:
    Application data exists in the database and in Global Document Storage (GDS), and must be backed up in real time. GDS is a directory that is used to store long-lived files that are used within a process. These files may include PDFs, policies, or form templates. "
    However, during my testing I don't see any file activity in the GDS folder, such as file creation or deletion. Under which Rights Management operations or scenarios would files or policies be stored in the GDS?
    Maybe I am not testing under the proper conditions?
    TS

  • WebDAV support for document storage

    I am trialing CRM On Demand and am wondering whether the API supports some sort of document storage interface, such as WebDAV?

    Hi, if you are on R16, it provides a WS (web services) API to store and retrieve attachments from On Demand.
    -- Venky CRMIT

  • Where to find the text for document storage for HR objects in E-Recruitment

    Hi,
    In E-Recruitment
    1. In table HRP5121 there are some fields that store numeric values but point to a text in the document storage for HR objects. Where can we find the description for that text?
    2. Similarly, for resume attachments, is there any way to find out where the resume or other documents are stored?
    For point 1, the requirement is to create a report.

    If I am not wrong, the attached resume would be stored in tables TOAHR and SDOKCONT1, and the attached document can be accessed through PB30 --> Extras.

  • EXTREMELY SLOW XQUERY PERFORMANCE AND SLOW DOCUMENT INSERTS

    Resolution History
    12-JUN-07 15:01:17 GMT
    ### Complete Problem Description ###
    A test file is being used to do inserts into a schemaless XML DB. The
    file is inserted, and then links are made to 4 different collection
    folders under /public. The inserts are pretty slow (about 15 per
    second, and the file is small), but the XQuery doesn't even complete
    when there are 500 documents to query against.
    The same XQuery has been tested on a competitor's system and has
    lightning-fast performance there. I know it should likewise be fast on
    Oracle, but I haven't been able to figure out what is going on, except
    that I suspect the query on Oracle somehow results in a cartesian
    product.
    ### SQLXML, XQUERY, PL/SQL syntax used ###
    Here is the key PL/SQL code that calls the DBMS_XDB procedures:
    CREATE OR REPLACE TYPE "XDB"."RESOURCEARRAY" AS VARRAY(500) OF VARCHAR2(256);
    PROCEDURE AddOrReplaceResource(
      resourceUri        VARCHAR2,
      resourceContents   SYS.XMLTYPE,
      public_collections IN ResourceArray
    ) AS
      b                  BOOLEAN;
      privateResourceUri path_view.path%TYPE;
      resource_exists    EXCEPTION;
      PRAGMA EXCEPTION_INIT(resource_exists, -31003);
    BEGIN
      /* Store the document in a private folder */
      privateResourceUri := GetPrivateResourceUri(resourceUri);
      BEGIN
        b := dbms_xdb.createResource(privateResourceUri, resourceContents);
      EXCEPTION
        WHEN resource_exists THEN
          DELETE FROM resource_view WHERE equals_path(res, privateResourceUri) = 1;
          b := dbms_xdb.createResource(privateResourceUri, resourceContents);
      END;
      /* Add a link in /public/<collection-name> for each collection passed in */
      FOR i IN 1 .. public_collections.count LOOP
        BEGIN
          dbms_xdb.link(privateResourceUri, public_collections(i), resourceUri);
        EXCEPTION
          WHEN resource_exists THEN
            dbms_xdb.deleteResource(concat(concat(public_collections(i), '/'), resourceUri));
            dbms_xdb.link(privateResourceUri, public_collections(i), resourceUri);
        END;
      END LOOP;
      COMMIT;
    END;
    FUNCTION GetPrivateResourceUri(
      resourceUri VARCHAR2
    ) RETURN VARCHAR2 AS
    BEGIN
      /* Keep everything after the last '/' as the file name */
      RETURN concat('/ems/docs/', REGEXP_SUBSTR(resourceUri, '[a-zA-Z0-9.-]*$'));
    END;
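    As a quick sanity check of what GetPrivateResourceUri extracts (the sample path below is made up, not from the original post), the REGEXP_SUBSTR pattern keeps the trailing run of file-name characters, i.e. everything after the last '/':

    ```sql
    -- The pattern matches the longest suffix consisting only of
    -- letters, digits, '.' and '-', i.e. the file-name part of the URI:
    SELECT REGEXP_SUBSTR('/public/geo/event1.xml', '[a-zA-Z0-9.-]*$') AS fname
      FROM DUAL;
    -- FNAME
    -- ----------
    -- event1.xml
    ```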
    ### Info for XML Querying ###
    Here is the XQuery; a sample of the output follows:
    declare namespace c2ns="urn:xmlns:NCC-C2IEDM";
    for $cotEvent in collection("/public")/event
    return
      <cotEntity>
        {$cotEvent}
        {for $d in collection("/public")/c2ns:OpContextMembership
           [c2ns:Entity/c2ns:EntityIdentifier/c2ns:EntityId
            = xs:string($cotEvent/@uid)]
         return $d}
      </cotEntity>
    Sample output:
    <cotEntity><event how="m-r" opex="o-" version="2" uid="XXX541113454" type="a-h-G-" stale="2007-03-05T15:36:26.000Z" start="2007-03-05T15:36:26.000Z" time="2007-03-05T15:36:26.000Z"><point ce="" le="" lat="5.19098483230079" lon="-5.333597827082126" hae="0.0"/><detail><track course="26.0" speed="9.26"/></detail></event></cotEntity>

    19-JUN-07 04:34:27 GMT
    UPDATE
    =======
    Hi Arnold,
    you wrote: "Please use Sun JDK 1.5 java to perform the test case." Right now I have:
    $ which java
    /usr/bin/java
    $ java -version
    java version "1.4.2"
    gcj (GCC) 3.4.6 20060404 (Red Hat 3.4.6-3)
    Sorry, as I told you before, I am not very knowledgeable in Java. Can you tell me what settings I need to change to make use of Sun JDK 1.5? Please note I am testing on Linux. Do I need to test this on a Sun box, or can it be modified to run on Linux?
    Thanks,
    Rakesh
    STATUS
    =======
    @CUS -- Waiting for requested information
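    A hedged way to test the cartesian-product suspicion (assuming SQL*Plus access to the same instance; this is a diagnostic sketch, not a confirmed fix) is to run the same XQuery through SQL/XML and read the optimizer plan; two independent scans of the /public resources joined by NESTED LOOPS with no join filter would support the suspicion:

    ```sql
    EXPLAIN PLAN FOR
    SELECT XMLQUERY(
      'declare namespace c2ns="urn:xmlns:NCC-C2IEDM";
       for $cotEvent in collection("/public")/event
       return
         <cotEntity>
           {$cotEvent}
           {for $d in collection("/public")/c2ns:OpContextMembership
              [c2ns:Entity/c2ns:EntityIdentifier/c2ns:EntityId
               = xs:string($cotEvent/@uid)]
            return $d}
         </cotEntity>'
      RETURNING CONTENT)
    FROM DUAL;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
    ```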

  • Document Storage in Sharepoint without using EP

    Hi,
    We are implementing SAP Solution Manager 4.0 without using EP. It uses CRM 5.0.
    The issue we are facing is that the Content Management system of CRM 5.0 uses the SAP database for document storage. Obviously, this can be expected to be a heavy drag on DB performance.
    We instead want to use an alternative means of storing CM documents, for example a file-server area on the application server, or MS SharePoint Portal 2003. Any pointers on how to achieve this?
    -Regards,
    Pradeep

    Hi Pradeep,
   Look at <a href="http://help.sap.com/saphelp_crm50/helpdata/en/e3/923227b24e11d5993800508b6b8b11/frameset.htm">External Repositories</a>; it may help you.
    Regards.
    Manuel

  • T520 - 42435gg / Sound stutter and slow Graphic performance with Intel Rapid Storage AHCI Driver

    Hi everybody,
    I have serious problems with my 42435gg.
    Any time I install the Intel Rapid Storage AHCI driver (I've tried plenty of different versions), which is suggested by System Update, I experience horrible sound stutter and slow graphics performance in Windows 7 64-bit.
    The funny thing in this case: if the external eSATA port is connected, the problems do not occur. As soon as the port is unused again, the stutter begins immediately.
    The only thing I can do is use the Windows internal storage driver, with which I am not able to use my DVD recorder, for example.
    The device was sent to Lenovo for hardware testing with no result; it was sent back without any repair.
    Does anybody have experience with this?
    Kind regards,
    Daniel

    Did you try the 11.5 RST beta? Load up DPClat and see if DPC conditions are favorable.
    What are you using to check graphics performance?
    W520: i7-2720QM, Q2000M at 1080/688/1376, 21GB RAM, 500GB + 750GB HDD, FHD screen
    X61T: L7500, 3GB RAM, 500GB HDD, XGA screen, Ultrabase
    Y3P: 5Y70, 8GB RAM, 256GB SSD, QHD+ screen

  • Best practices for performance on I/O and storage

    I'm building / buying a new server and was planning on going virtual.
    Dual Xeon 2620 v3 with 64 GB RAM; we have about 15 local users and 14 remote users.
    Main server: 2008 / 2012 SQL
    2nd server: 2008 / 2012 file storage
    3rd server: Terminal Services / Citrix (may not be needed; still evaluating)
    Here is my concern.
    The Hyper-V server would be installed on a mirrored (RAID 1) 120 GB SAS drive. I've been informed that this is unnecessary, as Hyper-V doesn't require much space, and that having it on an SSD would only improve the boot-up speed of the actual Hyper-V server (the hypervisor); therefore, even if I put this on a slow 5400 RPM drive, it would only affect the initial boot of Hyper-V, which I don't plan on rebooting often. Is this true? Would the page file be an issue?
    I was then planning on having four 600 GB 15K SAS drives in RAID 10 and putting the datastores of the three servers on those drives.
    I've been informed that the I/O on these drives will affect performance and that each server should be on its own separate physical drives (RAID volume).
    Is this common? Should I be using separate HDs for each virtual machine?
    nambi

    Do not create "silos" or "islands" of storage: it is a) hell to manage and b) an effective way to steal IOPS from your config. OBR10 (One Big RAID 10) is the way to go. See:
    http://community.spiceworks.com/topic/262196-one-big-raid-10-the-new-standard-in-server-storage
    Good luck :)
    Hyper-V Shared Nothing Cluster. Only two Hyper-V hosts needed.

  • DMS Document Storage in External Content Server

    Dear All,
    We are working on a DMS scenario where we need to store the documents in an external content server, not in the SAP DB. We are evaluating the solutions around the content server, and we see that SAP provides an HTTP interface to SAP Content Server; this interface can be configured through OAC0 and OACT and managed through CSADMIN. Now, a set of questions:
    1. If we intend to use SAP Content Server, do we need to purchase an additional license for it? Our understanding is that the software is delivered with the SAP installation DVD and, as such, no licensing charges apply.
    2. If we do NOT intend to use SAP Content Server, and rather use an external content server (possibly using the file system at the OS level as a repository), what will the configuration of OAC0 and OACT look like in such a case? We definitely cannot use HTTP Content Server as the storage type; how will the entries be organized in this case?
    Has anyone worked on a scenario like this? It would be really nice if you could share your experience and expertise in this regard.
    Awaiting replies.
    Thanks and Sincere Regards,
    Sid

    Hi Ravindra,
    Thanks for the clarification. Do I understand correctly that for an external content server (without an installation of SAP Content Server) we also need to specify the storage type as HTTP Content Server? But in that case, how will the work processes be handled? As I understand it, the SAP Content Server engine handles incoming/outgoing requests through a web server, so don't we need a similar arrangement for an external content server as well? In that case, will the connection parameters be those of an HTTP Content Server, or of an RFC archive, where we can specify an RFC destination of type G and connect through it?
    Thanks and Regards,
    Sid

  • Document storage advice required

    Good day,
    Our developers are doing a project that involves processing purchasing invoices that have been scanned and converted to PDF. They are using ArchiveLink, and at the moment they are storing these documents directly in the database. The developers were under the impression they were simply linking the documents from the file system, but I don't think that is correct. It is still under development.
    Anyway, to cut a long story short, I would just like to know whether there are any other implications of storing approximately 2000 x 130 KB PDF documents per month, on average, in an ERP database. I know it will eventually affect our backup and restore times, but as far as MaxDB is concerned, is the database usually configured in a specific way to handle this kind of thing, as on a Content Management Server?
    I would like them to move the documents to a Content Management Server or a 3rd-party solution like ImageNow, and just need some advice.
    Many thanks.
    Regards,
    Nelis

    After doing some more research I came across Note 653774, which states: "...file repositories have been eliminated as of Basis Release 6.10. If a file repository is used in a live system, the data can no longer be addressed after a release upgrade to Basis Release 6.10 (or a later version)." So that will prove helpful.
    Also, Note 595663 gives information on using ArchiveLink with database storage and mentions that it should only be used with smaller datasets; for production, SAP Content Server should be used.
    This more or less gives me the info I require.
    I'm still curious to know whether MaxDB is set up differently on Content Server specifically to handle large volumes of binary documents.
    Nelis

  • Help: Bad performance in marketing documents!

    Hello,
    When creating an AR delivery note with about 10 lines, we have noticed that the creation of lines becomes slower and slower. This especially happens when tabbing out of the system field "Quantity": instead of moving quickly to the next field, the cursor stays in the Quantity field for about 5 seconds!
    The number of formatted searches in the AR delivery note is only 5, and only one is automatic. The number of user fields is about 5.
    We have heard about poor performance as the number of lines increases in documents that have formatted searches, but it is odd for this to happen with only about 10 lines in the document.
    We are using PL16, and this issue seems to have already been solved in PL10.
    Could you throw some light on this?
    Thanks in advance,

    It is solved now.
    It had to do with the automatic formatted searches in 2 header fields.
    If the automatic search is removed, the performance is OK.
    Hope it helps you,

  • Performance of extracting document elements

    I have a container with about 100,000 documents, in which several elements are indexed. When I run a query like this:
    for $lead in doc('dbxml:leads.dbxml/31308')/als:auto-lead-service
    return <lead attempts="{$lead/@processing-attempts}"/>
    the query returns in about 260ms.
    If I change the query to include many more document elements, like this:
    for $lead in doc('dbxml:leads.dbxml/31308')/als:auto-lead-service
    return
    <lead attempts="{$lead/@processing-attempts}"
         lead-date="{$lead/@created}"
         first-name="{$lead/star:SalesLead[1]/star:Header[1]/star:IndividualProspect[1]/star:PersonName[1]/star:GivenName[1]}"
         last-name="{$lead/star:SalesLead[1]/star:Header[1]/star:IndividualProspect[1]/star:PersonName[1]/star:FamilyName[1]}"
         address="{$lead/star:SalesLead[1]/star:Header[1]/star:IndividualProspect[1]/star:Address[1]/star:AddressLine[1]}"
         lead-city="{$lead/star:SalesLead[1]/star:Header[1]/star:IndividualProspect[1]/star:Address[1]/star:City[1]}"
         lead-state="{$lead/star:SalesLead[1]/star:Header[1]/star:IndividualProspect[1]/star:Address[1]/star:StateOrProvince[1]}"
         lead-zip="{$lead/star:SalesLead[1]/star:Header[1]/star:IndividualProspect[1]/star:Address[1]/star:PostalCode[1]}"
         lead-email="{$lead/star:SalesLead[1]/star:Header[1]/star:IndividualProspect[1]/star:Contact[1]/star:EMailAddress[1]}"
         lead-phone="{$lead/star:SalesLead[1]/star:Header[1]/star:IndividualProspect[1]/star:Contact[1]/star:Telephone[1]}"
         lead-source="{$lead/als:id[@source eq 'autoleadservice.com:source']}"
         lead-type="{$lead/als:meta[@key eq 'com.autoleadservice.LeadType']}"
         lead-subtype="{$lead/als:meta[@key eq 'com.autoleadservice.LeadSubType']}"/>
    the query takes about 8 seconds to run.
    Some of these elements are indexed. If I change the query to return only elements that are not indexed, like this:
    for $lead in doc('dbxml:leads.dbxml/31308')/als:auto-lead-service
    return <lead
         first-name="{$lead/star:SalesLead[1]/star:Header[1]/star:IndividualProspect[1]/star:PersonName[1]/star:GivenName[1]}"
         last-name="{$lead/star:SalesLead[1]/star:Header[1]/star:IndividualProspect[1]/star:PersonName[1]/star:FamilyName[1]}"
         address="{$lead/star:SalesLead[1]/star:Header[1]/star:IndividualProspect[1]/star:Address[1]/star:AddressLine[1]}"
         lead-city="{$lead/star:SalesLead[1]/star:Header[1]/star:IndividualProspect[1]/star:Address[1]/star:City[1]}"
         lead-state="{$lead/star:SalesLead[1]/star:Header[1]/star:IndividualProspect[1]/star:Address[1]/star:StateOrProvince[1]}"
         lead-zip="{$lead/star:SalesLead[1]/star:Header[1]/star:IndividualProspect[1]/star:Address[1]/star:PostalCode[1]}"
         lead-email="{$lead/star:SalesLead[1]/star:Header[1]/star:IndividualProspect[1]/star:Contact[1]/star:EMailAddress[1]}"
         lead-phone="{$lead/star:SalesLead[1]/star:Header[1]/star:IndividualProspect[1]/star:Contact[1]/star:Telephone[1]}"
         />
    The query returns in about 260ms again.
    From the query plan when indexed elements are included, it looks like BDB is performing a full index lookup for each indexed element, does that sound correct? Is there a way to force BDB to not use the index?

    Hi Raghu,
    This issue is being handled through Metalink.
    Regards,
    Andrei Costache
