Reg: Knowledge Provider

Hi Experts
I'm new to DMS.
Please explain:
1) Knowledge Provider
2) Content Server
3) Cache Server
4) Repository settings
and the process flow of DMS.
Thanks & Regards
kumar

Hi,
1. Knowledge Provider :- The service that provides the link between the SAP server and the content server. Through this service you can also link the content server with a TREX server.
2. Content Server :- The server where the original files (Word, Excel, ...) are stored. It uses the MaxDB database.
3. Cache Server :- A server that stores recently accessed documents (similar to the cache memory of your computer).
4. Repository :- The logical storage identification under which the original documents are stored.
    Repository Settings :- First create the storage system in transaction OAC0. Then create the storage category in transaction OACT and assign the storage system to the category.
When you check in a document through transaction CV01N, select the storage system.
Process flow of documents means, for example: in one department several people are working; one person creates the documents, a senior checks them, and the HOD of the department releases them. You can use the document status functionality to build this kind of document flow.
Regards,
Sunny

Similar Messages

  • FileDownload from Content-Server (Knowledge Provider)

    Hi,
    I am trying to download a file via the FileDownload UI from the Content Server. The files are of type Knowledge Provider (KPro). I get the following InputStream (characters on lines 01-12 + binary on lines 13-17); I need only the binary part, lines 13-17:
    01 --ejjeeffe0
    02 Content-Type: image/gif
    03 Content-Length: 3692
    04 X-compId: marketing.gif
    05 X-Content-Length: 3692
    06 X-compDateC: 2007-04-12
    07 X-compTimeC: 16:05:24
    08 X-compDateM: 2007-04-12
    09 X-compTimeM: 16:05:24
    10 X-compStatus: online
    11 X-pVersion: 0045
    12
    13 GIF89a@ @ ÷ÿ ÎÎΫ««œœœÕÕÕ<;;ƒƒƒzzz
    14 LKLaaauuuää䤤¤ÉÉɹ¹º
    15 ¾¾¾jjjôôôèèèîîîØØØðððøøø••–êêêòòòìì썍ööö
    16 ìêíæçèôùöõòñâÞÝÝÙÞûúüðíîôðõò
    17 £££ÜÛßÝÚÚàáâ™
    18 ejjeeffe0
    How can I separate the two parts when I need only lines 13-17? I cannot get the right InputStream (lines 13-17) back. My main problem is reading the stream and converting it back to an InputStream.
    regards,
    Sharam

    Hello,
    try the LineNumberReader:
    http://java.sun.com/j2se/1.4.2/docs/api/java/io/LineNumberReader.html
    It keeps track of line numbers for you, so you could start reading at line 13.
    Jan
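    If the GIF bytes must not go through a character Reader (which can corrupt binary content), another option is to scan the raw InputStream for the empty line that terminates the header block and hand back only the bytes that follow it. A minimal sketch in Java, assuming the header/body layout shown above (the class and method names are made up for illustration):

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.InputStream;

    public class KproResponseSplitter {

        // Returns only the binary body of the KPro/Content Server response,
        // i.e. everything after the first empty line that ends the header block.
        // The headers are plain ASCII, so scanning byte by byte is safe; the
        // body bytes are returned untouched.
        public static InputStream extractBody(InputStream raw) throws IOException {
            ByteArrayOutputStream buffer = new ByteArrayOutputStream();
            byte[] chunk = new byte[8192];
            for (int n; (n = raw.read(chunk)) != -1; ) {
                buffer.write(chunk, 0, n);
            }
            byte[] all = buffer.toByteArray();

            // Find the empty line (CR LF CR LF, or LF LF) separating headers from content.
            int bodyStart = indexOf(all, new byte[] {'\r', '\n', '\r', '\n'});
            int skip = 4;
            if (bodyStart < 0) {
                bodyStart = indexOf(all, new byte[] {'\n', '\n'});
                skip = 2;
            }
            if (bodyStart < 0) {
                throw new IOException("No header/body separator found");
            }
            // Note: the returned body may still end with the closing boundary line;
            // the Content-Length header value could be parsed to trim it exactly.
            return new ByteArrayInputStream(all, bodyStart + skip, all.length - bodyStart - skip);
        }

        private static int indexOf(byte[] data, byte[] pattern) {
            outer:
            for (int i = 0; i <= data.length - pattern.length; i++) {
                for (int j = 0; j < pattern.length; j++) {
                    if (data[i + j] != pattern[j]) {
                        continue outer;
                    }
                }
                return i;
            }
            return -1;
        }
    }

    Parsing the Content-Length (or X-Content-Length) header from the header block would be the more robust way to determine exactly how many body bytes to return.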

  • Knowledge provider(Kpros) and sap content server.

    Hi Gurus,
    I need some details about the Knowledge Provider (KPro) and the SAP Content Server.
    Please provide the links.

    Hi Udaya,
    check these help links:
    http://help.sap.com/saphelp_nw04/helpdata/en/d0/590c421c7f11d5991d00508b6b8b11/frameset.htm
    KPro:
    http://help.sap.com/saphelp_nw04/helpdata/en/f2/a1a93769928b7fe10000009b38f8cf/frameset.htm
    http://help.sap.com/saphelp_nw04/helpdata/en/87/069f3815e5ef60e10000009b38f8cf/frameset.htm
    http://help.sap.com/saphelp_nw04s/helpdata/en/7c/6abd646ab811d3aece0000e82deb58/frameset.htm
    http://help.sap.com/saphelp_nw04/helpdata/en/7c/6abd5b6ab811d3aece0000e82deb58/frameset.htm
    Content server operation - see the PDF below:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/02621aac-0d01-0010-2c83-c7f62605733a
    Koti Reddy

  • Knowledge Provider (KPro) content-length of document

    hello,
    I need to know where the Knowledge Provider (KPro) logs or saves the information about the content length of a document; in detail: in which table is the length information of a document saved (perhaps in bytes)?
    thanks for your reply.
    thorsten


  • Reg. Provider Url set in weblogic-ejb-jar.xml

    I have created an MDB which listens to an IBM MQ queue. I currently
    specify the path of the bindings file by setting the provider-url in weblogic-ejb-jar.xml.
    An example is as follows:
    <destination-jndi-name>Sample.Q</destination-jndi-name>
    <initial-context-factory>com.sun.jndi.fscontext.RefFSContextFactory</initial-context-factory>
    <provider-url>file:/apps/test/</provider-url>
    <connection-factory-jndi-name>Sample.QCF</connection-factory-jndi-name>
    Is there any way to set this parameter dynamically from the MDB itself?

    AFAIK you can't dynamically reconfigure MDBs; what's in the XML is what you get.
    One alternative is to use Spring and Message Driven POJOs, which lets you use Spring to configure the provider and activation spec. That way you can configure properties dynamically at runtime however you wish, or even hot-deploy MDPs within a running application.
    http://jencks.org/Message+Driven+POJOs
    Depending on how complex your runtime configuration is, you could create your own Spring factory beans to do wacky stuff (query a database or whatever to figure out the provider URLs, etc.) or use a Spring post-processor to inject values from some other source (JNDI, LDAP, etc.).
    James
    http://logicblaze.com/
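    To make the Spring/MDP idea concrete, here is a rough sketch of what resolving the provider URL at runtime could look like. It is only an illustration under the assumption that a Spring DefaultMessageListenerContainer is acceptable in place of the MDB; the JNDI names Sample.Q and Sample.QCF are taken from the descriptor above, and the class and method names are made up:

    import java.util.Hashtable;
    import javax.jms.ConnectionFactory;
    import javax.jms.Destination;
    import javax.jms.MessageListener;
    import javax.naming.Context;
    import javax.naming.InitialContext;
    import javax.naming.NamingException;
    import org.springframework.jms.listener.DefaultMessageListenerContainer;

    public class DynamicMdp {

        // Builds the JNDI environment at runtime instead of hard-coding it in
        // weblogic-ejb-jar.xml, then starts a message-driven POJO on the queue.
        public static DefaultMessageListenerContainer start(String providerUrl) throws NamingException {
            Hashtable<String, String> env = new Hashtable<>();
            env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.fscontext.RefFSContextFactory");
            env.put(Context.PROVIDER_URL, providerUrl); // e.g. "file:/apps/test/"
            Context ctx = new InitialContext(env);

            ConnectionFactory cf = (ConnectionFactory) ctx.lookup("Sample.QCF");
            Destination queue = (Destination) ctx.lookup("Sample.Q");

            DefaultMessageListenerContainer container = new DefaultMessageListenerContainer();
            container.setConnectionFactory(cf);
            container.setDestination(queue);
            MessageListener listener = message -> System.out.println("received: " + message);
            container.setMessageListener(listener);
            container.afterPropertiesSet();
            container.start();
            return container;
        }
    }

    The provider URL can then come from a database, a property file, or any other runtime source, which is essentially what the factory-bean / post-processor suggestion above boils down to.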

  • Scms for file manipulation (knowledge provider) documentation

    Hi,
    Do you have any code sample or documentation on the SCMS calls/API to manipulate files on the content server?
    Sincerely,
    Olivier Matt

    Hi Olivier,
    what exactly do you need?
    Best regards
    Torsten

  • Knowledge Provider KPRO architecture integration with Documentum

    Hi All,
    As part of the KPro architecture, it is understood that the SAP Content Server at the bottom layer can be replaced by any third-party content server; in our case we wish to use the Documentum content server in place of the SAP Content Server.
    On digging further, it is understood that the integration point is an HTTP script.
    Currently the HTTP script has a default value pointing to the SAP Content Server as ContentServer/ContentServer.dll.
    Now we need to point the HTTP script to Documentum.
    Our environment is on Solaris, and hence instead of a .dll it should be a shared library (.so) file.
    Has anyone tried that?
    If so, how can we achieve this? How can we make the HTTP script point to Documentum in KPro?

    Hi Guys,
    Can anyone give a clue on this? We are stuck at this phase.

  • Content Management System (CMS) vs Knowledge Management (KM)

    Hi,
    What is the difference between CMS and KM?  Is CMS part of KM?  How do you install CMS or KM?  I need to be able to install CMS.
    Thanks,

    Hi,
    "CMS" would in the first place only be a generic term that you can call any product that manages content.
    At SAP I am aware of the following similar terms:
    <b>KM -</b> Java-based CMS, integration framework for 3rd party CMS's and more of SAP NetWeaver, runs in the Portal
    <b>CM -</b> The architectural part of KM that deals with basic content management services. With additional services and TREX this combines to KM.
    <b>DMS -</b> "Document management system", R/3-based system for management of engineering documents, bases on "KPro" in Web AS ABAP, part of mySAP PLM, integrates into KM via repository manager
    <b>KW -</b> KPro-based system for managing SAP-related documentation and training
    <b>Content Server - </b>Object storage location for KPro-managed documents (see DMS)
    <b>KPro -</b> "Knowledge Provider", deep level very generic set of ABAP methods to manage document attachment objects in an R/3 environment
    Hope that clarifies...
    Regards, Karsten

  • KPro vs Knowledge Management What is the difference ?

    Hi All, I have read up on ECM on this website, yet I don't really understand the difference between Knowledge Management and Knowledge Provider. Are they two different ways of implementing the same thing?
    If anyone can help with an example would be greatly appreciated. Thanks.

    Hi,
    Please find the required differences below.
    SAP Knowledge Provider (KPro)
    =============================
    The SAP Knowledge Provider (KPro) is a cross-application and media-neutral information technology infrastructure within the R/3 Basis system. The modular structure and openness on which KPro is based are reflected in its modular services and clearly defined interfaces. Its extensive flexibility means that KPro can be used to process the widest variety of information types within and relating to documents and document-like objects, for example administration and index data as well as pure content.
    Applications that use the SAP Knowledge Provider involve various end users, who in turn have different requirements. There is therefore no universal interface for accessing KPro services regarding the following points:
      Specific knowledge management functions for end users
      Specific terminology for describing the document-like objects within the context of the individual application
      Specific design of work process flows
      Specific design of user interfaces
    SAP Knowledge Management
    =========================
    With its Knowledge Management capabilities, SAP NetWeaver provides a central, role-specific point of entry to unstructured information from various data sources in the portal. This unstructured information can exist in different formats such as text documents, presentations, or HTML files. Workers in an organization can access information from different sources such as file servers, their intranet, or the World Wide Web. A generic framework integrates these data sources and provides access to the information contained in them through the portal.
    Knowledge Management functions support you in structuring information and making it available to the correct target audience. These functions include search, classification, and subscriptions (see below). You can use these functions on all content of integrated data sources, as long as the technical conditions are met.
    Knowledge Management is a part of SAP Enterprise Portal. The entire functional scope and configuration of Knowledge Management are available in portal iViews.
    Functions such as discussions, feedback, and sending items from KM folders by e-mail are integrated into Knowledge Management and enable you to work across role and department borders. In addition, Knowledge Management functions are used in Collaboration, for example, to store documents in virtual rooms.
    Knowledge Management supports the SAP NetWeaver scenario Information Broadcasting. You can use the BEx Broadcaster and Knowledge Management to make business information from SAP NetWeaver Business Intelligence available to a wide spectrum of users in the portal. You can store the following BI objects in KM folders:
    ·        Documents with precalculated reports.
    ·        Links to BEx Web applications and queries with live data.
    Various Knowledge Management functions, such as subscriptions for documents, are available for these items. There is a specially modified user interface for displaying BI items in KM folders. You can also change or extend this interface to meet your requirements.
    Hope this helps.
    Regards,
    Deepak Kori

  • Knowledge Link IT 1062

    Hi Experts
    Has anyone used the Knowledge Link infotype 1062 for connecting a course type to a Knowledge Warehouse from within LSO before?
    I understand this can be done, but I am looking for specific details on the setup of KM / DMS:
    what is needed, what isn't needed, and whether any configuration needs to be done to link this to infotype 1062.
    Thanks in Advance
    Anton Kruse

    Hi Everyone,
    I've done some research and I believe it's better to follow the standard way of managing content for LSO.
    Based on my findings, it's better to use KPro (Knowledge Provider) for this purpose. DMS and CMS services are both KPro services, where DMS is used (in the standard way) for documents related to MM and CMS is recommended for LSO.
    CMS can link to KM (Knowledge Management), and using the Authoring Environment you can also manage all content related to LSO.
    AE allows you to use external authoring tools in order to provide eLearning; indeed, you can use CMS repositories to store LSO docs.
    Since you may decide to give users access to the learning material, it is a good idea to upload documents in a way that makes them easily accessible from the web.
    Integration between CMS and DMS is possible (refer to the link below: Integration of CMS & DMS).
    Authoring Environment:
    http://help.sap.com/saphelp_ls200/helpdata/en/e6/c9e81df5ed11d5997000508b6b8b11/frameset.htm
    Integration of CMS and DMS:
    http://help.sap.com/saphelp_erp60_sp/helpdata/en/15/aea9375d79fb7de10000009b38f8cf/frameset.htm
    Any idea/recommendation/suggestion from you experts warmly appreciated.
    Regards,

  • Document Attachment at GR

    Hi
    Is it possible to attach the delivery challan *** invoice document at the time of GR?
    If so, where will it be stored? The document should be stored on a separate server.
    Thanks
    Rahul

    Hi Rahul,
    If the Document Management System is active in your system, you can find the Services for Object option at the top of the MIGO screen.
    To attach any kind of document to a business object such as the GR document, or during invoicing, the Document Management System (DMS) needs to be active and the settings for the KPro (Knowledge Provider) server need to be in place. Please check with your BASIS team.
    I am working on your issue. Please revert if it is not solved.
    Reg,
    Ashok
    Assign Points if useful

  • How many ADFS farms can you have in a single forest/single domain?

    Hi
    I may have some terminology incorrect...please let me know if I do. :)
    My question is, how many ADFS farms can you have in a single forest/single domain? If you want to know why I am asking...please read on.
    We have one ADFS farm and we are looking at adding services to it. However, not every cloud vendor provides an "Identity Broker" with their services.
    We have a consultant that is advising that we need to enable a SAML-based IdP-initiated single sign-on (SSO) ie using "IdpInitiatedSignOnPage"
    However, to do this we need to modify the ADFS website to have a drop-down list so the user can select the "Relying Party" and then authenticate with them.
    This means we are exposing a list of every company/party we have federated with. The exposure of this information is deemed a security concern by our company... which I agree with.
    So the consultant advises that we need a separate ADFS farm. I have searched online, but haven't found any information that confirms multiple ADFS farms can be implemented in a single forest/single domain.
    Thanks for reading and if you have any other suggestions...I'd appreciate it.
    Nyobi

    This is not exactly a FIM-related question - there is an ADFS forum available on TechNet. However, technically there is no limit on the number of ADFS farms in a forest \ domain. It is just a service which uses AD and is not altering it in any way or storing forest-wide information like Exchange does. So you can set up two ADFS services in a single forest - no problem.
    Is it the best solution to your problem? I can't say with that limited information, but maybe just customizing the pages on the ADFS side would be enough?
    Tomek Onyszko, memberOf Predica FIM Team (http://www.predica.pl), IdAM knowledge provider @ http://blog.predica.pl

  • How to configure sync rules involving a CSV file and portal self service

    Hello,
    I need to configure some FIM sync rules for the following scenario:
    User account details are entered from an HR CSV file and exported to AD. Users have the ability to modify their own AD attributes in the FIM portal (there is no requirement for them to view their HR CSV data in the portal). The FIM portal modifications will be exported to AD as expected.
    My setup is as follows:
    CSV file - name, last name, employee ID, address.
    CSV MA - has direct attribute flows configured in the MA between the data source and the MV.
    Portal self service attributes - users can edit mobile, display name and photo.
    I've also set the CSV MA as precedent for these attributes.
    FIM MA - attribute flows defined from the MV to the data source as usual (i.e. firstname to firstname, accountname to accountname, etc.).
    AD MA - no attribute flows defined, as inbound and outbound sync rules have been configured in the portal using the Set/MPR/Triple.
    I'm thinking of using the following run profiles:
    1. CSV MA - full import and delta sync (imports HR data)
    2. FIM MA - export and delta import (imports portal changes)
    3. FIM MA - delta sync (syncs any portal changes)
    4. AD MA - export and delta import
    If my understanding is correct this should sync HR data from CSV to AD, as well as user attribute self service updates from the portal to AD.
    If I wanted to just do an HR CSV sync, could I get away with just steps 1 & 4? (Presumably not, as my rules are in the FIM portal?)
    If I wanted to do just a portal sync, could I get away with steps 2-4?
    Any advice on how to improve my setup is much appreciated - cheers
    IT Support/Everything

    The truth is that your design should be done in such a way that it doesn't matter which profiles you execute in which order. In the end, if you run all import, sync and export profiles on each data source, you should get the same result. This is the beauty of the sync engine here.
    Your steps 1-4 will sync data to your data sources and in the end will give you the expected result - not because of the order you are executing them in, but because of correct attribute flows. If flows from the CSV file and from the FIM portal are defined for the same attributes, you also need to think about attribute precedence.
    Tomek Onyszko, memberOf Predica FIM Team (http://www.predica.pl), IdAM knowledge provider @ http://blog.predica.pl

  • Approval workflow error when creating a new custom entity in FIM 2010 R2

    Hello,
    I'm hoping somebody here can help me; I've been struggling with this for some time now. On a fresh FIM installation I create a custom entity named "Role" and add a few custom attributes.
    I then create an approval workflow and MPR for normal users to create entities of type Role, but another user must approve the request. The other user has a working mailbox - I've tried firing an action workflow that sends a mail notification when someone creates a new role, and it works fine. But when I enable my approval workflow on the MPR (the only field I changed from the default is the approver), the workflow always fails with the message:
    Error processing your request: The operation was rejected because of access control policies.
    Reason: The server workflow rejected the operation.
    Attributes:
    Correlation Id: 750a558a-d3e4-4216-b16a-e76d79f011ec
    Request Id: feaabbc9-dea4-49a3-8b29-65b77de6f8fd
    Details: The Workflow Instance '04202cc0-14a3-410c-a3fc-2d6e5d25ebe6' encountered an internal error during processing. Contact your system administrator for more information.
    I enabled tracing and this is what I found:
    Microsoft.ResourceManagement Verbose: 0 : Creating WorkflowServiceHost for XOML Definition:\n<ns0:SequentialWorkflow ActorId="00000000-0000-0000-0000-000000000000" RequestId="00000000-0000-0000-0000-000000000000" x:Name="SequentialWorkflow"
    TargetId="00000000-0000-0000-0000-000000000000" WorkflowDefinitionId="00000000-0000-0000-0000-000000000000" xmlns="http://schemas.microsoft.com/winfx/2006/xaml/workflow"
    xmlns:ns1="clr-namespace:System.Workflow.Activities;Assembly=System.WorkflowServices, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856
        ThreadId=8
        DateTime=2013-09-04T15:17:10.0496188Z
    Microsoft.ResourceManagement Information: 1 : 1 :  : Invalid Element 'ReceiveActivity.WorkflowServiceAttributes' found while deserializing an object of type 'Microsoft.ResourceManagement.Workflow.Activities.ApprovalActivity'.
        ThreadId=8
        DateTime=2013-09-04T15:17:10.1277486Z
    Microsoft.ResourceManagement Information: 1 : 1 :  : Invalid data found while deserializing an object of type 'Microsoft.ResourceManagement.Workflow.Activities.ApprovalActivity'.
        ThreadId=8
        DateTime=2013-09-04T15:17:10.1277486Z
    Microsoft.ResourceManagement Verbose: 0 : A WorkflowRuntime is not available for this WorkflowDefinitionVersionKey '20'.
        ThreadId=8
        DateTime=2013-09-04T15:17:10.1277486Z
    Microsoft.ResourceManagement Error: 3 : Workflow host activation failed for workflow definition id : 231457c6-d044-4cc7-839f-98e5cf88f514, version key: 20. Exception: Object reference not set to an instance of an object.   at Microsoft.ResourceManagement.Workflow.Hosting.HostActivator.ActivateHost(ResourceManagementWorkflowDefinition
    workflowDefinition, Boolean suspendWorkflowStartupAndTimerOperations)
       at Microsoft.ResourceManagement.Workflow.Hosting.HostActivator.RetrieveWorkflowDataForHostActivator()
        ThreadId=8
        DateTime=2013-09-04T15:17:10.1277486Z
    Microsoft.ResourceManagement Information: 1 : The service has updated the list of active hosted workflow definitions to sequence number '1'.
    This happened on two separate FIM deployments, but both of them were set up in the same way. What am I missing here?
    Thank you,
    Martin

    (...) What am I missing here? (...) - SharePoint 2013 and a probable bug in FIM related to it. Check this thread for the workaround and resolution:
    http://social.technet.microsoft.com/Forums/en-US/1b76672d-1276-4c71-b9fc-5bb1fcb36877/event-id-3-with-approval-activity?forum=ilm2
    Tomek Onyszko, memberOf Predica FIM Team (http://www.predica.pl), IdAM knowledge provider @ http://blog.predica.pl

  • Error while checking Document in DMS_C1_ST

    Hi All,
    I am using my company's sandbox. For a DMS document type, I have activated the 'Use KPRO' option in the document type attributes.
    But when I check in an original, let's say a simple Word file, it says:
    An error occurred while creating the original attribute for WRD.
    An error occurred while creating the original in the Knowledge Provider.
    Contact the system administrator and check the log file (transaction Evaluate Application Log, object SDOK).
    When I use the SAP IDES demo system with the same configuration, I don't see this error.
    Can someone kindly help? What's different in my company's box that needs to be corrected?
    Thanks,
    Sachin

    Please go to transaction SE16 and enter 'SDOKPROP' as the table. Then enter 'DMS*' in the field PROP_NAME and press F8. All DMS-related entries should now be displayed. Please compare them with the following entries and maintain any that are missing:
    DMS_ACTIVE_VERSION 00 Active Version
    DMS_APPLICATION 00 Logical application
    DMS_CHECKOUT_USER 00 Checkout User
    DMS_CREATE_AUDIT 00 Generate new version of the original
    DMS_DEFAULT_LANGUAGE 00 Default Language
    DMS_DELETEABLE 00 PHIO deletable
    DMS_DOC_KEY 02 IWB_DMS01 DMS_DOCKEY Document info record
    DMS_DOC_VERSION 00 Version of relevant document
    DMS_DRAW_APPNR 00 Application number from DRAW
    DMS_DRAW_DTTRG 00 Data carrier from DRAW
    DMS_DRAW_FILEP 00 File name from DRAW
    DMS_FILE1 00 File name(firstpart)
    DMS_FILE2 00 File name(secondpart)
    DMS_FILE3 00 File name(thirdpart)
    DMS_FILE_ID 00 GUID that points to table DMS_PHIO2FILE
    DMS_FRMTXT 00 Format description, ID of additional file
    DMS_MUP_CAT 00 Markup category
    DMS_MUP_FLAG 00 Markup flag
    DMS_ORDER 00 Sequence of additional files
    DMS_STATUS 00 Document status
    DMS_STATUSNR 00 Number in status protocol
    DMS_STATUSNR_X 00 References to Status Log
    Please make sure that all entries are typed correctly and that there are no spelling errors. If all these values are maintained correctly in table SDOKPROP, the dump and the error message should no longer appear.
    For further information you can also see the corresponding SDN WIKI page under
    https://www.sdn.sap.com/irj/sdn/wiki?path=/display/plm/error26296in+CV01N
