Data manager navigation performance

Hi Gurus,
I have an issue with slow Data Manager navigation. The packages themselves run perfectly fast, but things such as opening the package list or validating a transformation file take a huge amount of time. Does anyone have experience dealing with this, and are there any ways of improving it?
Initially I thought it was my laptop, but the slowness is being experienced by everyone.
Cheers,
Chris

Hi Ethan,
Take a look at one of the allocation scripts: http://pastebin.com/TA16xCd3
We are testing RUNLOGIC, but we are facing problems in two situations:
- passing the DM package variable to the RUNLOGIC script
- using a passed variable in the called script
The DM prompts for 3 selections: ENTITY, TIME and CATEGORY.
The RUNLOGIC script:
*SELECT(%DIVISIONS%,"[ID]",DIVISION,"[LEVEL]='DIV' AND [STORECOMMON]<>'Y'")
*SELECT(%BRANCHES%,"[ID]",BRANCH,"[BRANCHTYPE]='STORE'")
*START_BADI RUNLOGIC
     QUERY=OFF
     WRITE=ON
     LOGIC=ALLOC_DIV_ACTUAL_S.LGF
     DIMENSION ENTITY=C1000
     DIMENSION TIME=FY10.MAY
     DIMENSION CATEGORY=ACTUAL
     DIMENSION DIVISION=%DIVISIONS%
     DIMENSION BRANCH=%BRANCHES%
     CHANGED=ENTITY,TIME,CATEGORY,DIVISION,BRANCH
     DEBUG=ON
*END_BADI
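For the first problem, we expected to replace the hardcoded members above with the package prompt variables, along these lines (a sketch only, assuming the prompts populate the usual %ENTITY_SET%, %TIME_SET% and %CATEGORY_SET% variables):
     DIMENSION ENTITY=%ENTITY_SET%
     DIMENSION TIME=%TIME_SET%
     DIMENSION CATEGORY=%CATEGORY_SET%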
In ALLOC_DIV_ACTUAL_S.LGF, we are using a %DIVISION_SET% variable. When validating, we get the message "Member "" does not exist".
When we run the package, it fails with the same error message:
An exception with the type CX_UJK_VALIDATION_EXCEPTION occurred, but was neither handled locally, nor declared in a RAISING clause
Member "" not exist
Thanks
Regis

Similar Messages

  • The current user has insufficient permissions to perform this operation when trying to add Term store managed navigation.

    Hi,
    I am getting the error "The current user has insufficient permissions to perform this operation." when trying to add the Term store managed navigation, as in the following screenshot. I am the farm administrator as well as the managed services account. I also noticed that I cannot delete the service application; it says I don't have enough permission to delete the DB. But using this account I was able to do everything before in my environment. Has anyone already faced this kind of error, and what would be the way to resolve it?
    Appreciated!

    Even though it is a farm admin account, it still needs to be given access to the MMS. Please find the link below for more details and the solution for the issue.
    1. Go to SharePoint Central Administration Site -> Application Management -> Manage service applications.
    2. Highlight the Managed Metadata Service that your web application is associated with. (Do not click on the link; just click somewhere else on that row to highlight it.)
    3. Click the Permissions button in the ribbon area.
    4. Add the application pool account used by your web application and give it 'Full Access to Term Store'.
    5. Click OK.
    http://expertsharepoint.blogspot.de/2014/08/managed-metadata-service-or-connection.html
    Anil Avula[MCP,MCSE,MCSA,MCTS,MCITP,MCSM] See Me At: http://expertsharepoint.blogspot.de/

  • MDM Data Manager: Unable to perform checkout/check-in operations on records

    Hi forum,
    When I try to check out records in Data Manager, I receive the error "The requested error was not found" (version 5.5 SP06). The same error appears when we execute check-in and rollback on any record in the repository.
    Please advise on resolution steps for this.
    Regards,
    Vinay M.S

    Hi Vinay M.S
    Please check the functions of the role you are using (SAP MDM Console -> Admin -> Roles).
    Maybe the role doesn't have permission to check records out (or in)?
    Regards
    Kanstantsin

  • MDM Data Manager - Freezes at loading Local Searches

    Hi all,
    While loading, the MDM Data Manager (DM) freezes for ~5 minutes at "Loading Local Searches" before it finishes loading.
    Does anyone know how to solve this problem, or at least where the local searches are stored on the drive?
    Kind Regards,
    Christian

    Hi Christian,
    As per my knowledge, there may be several reasons for this problem:
    1. There might be a version mismatch between the MDM server and the MDM Data Manager GUI. You can check the Data Manager version only once it opens, but you can check the versions of the other GUIs and compare them with the MDM server version (even the build versions should match).
    2. There might be a huge number of records saved in your repository.
    3. You might have changed the repository structure, and the change has not been updated throughout.
    There might be some other problem as well.
    Possible solutions:
    1. Unload the repository and load it again with Update Indices. This might solve the problem at the root itself.
    2. Check your GUI version and upgrade it if it does not match the MDM server version.
    3. If it is still not working, you can try uninstalling the DM client and re-installing it.
    Please perform these two steps before attempting steps 2 and 3:
    - Check the repository for errors and, if any are found, verify it.
    - Take a backup of the repository from the Console (use Archive Repository).
    Hope it helps.
    Thanks and Regards
    Nitin jain

  • PDF - Importing through Data Manager

    I am performing the following steps: in Data Manager, I import a PDF file and then link it to the main table.
    While importing, I use "Link to Original File Only".
    The PDF file is present in a shared folder, and the path of the shared folder does not contain spaces. I get the following error message:
    "There was an error opening this document. This file cannot be found".
    The file is actually available, the path is correct, and I have access to the shared folder.
    I tried the same thing from a local drive on my desktop and encountered the same message.
    When I look at the PDFs table in the repository, I can see a record inserted with the correct path, but I am not able to open the PDF using Object -> View PDF.
    I am using MDM 7.1 SP05.
    Please share your experience.

    Please let me know what version of MDM you are using... Can you give it a try by uploading the PDF from your local system? This will make sure that it has nothing to do with the shared drive.
    Please revert with the results.
    Please also check this document, just to make sure you are not missing any step:
    [How to Import Pdf Document in MDM|http://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/c01f54a2-99f1-2a10-5aa5-dcc50870e7f6]
    Best Regards,
    Shiv

  • Using own SQLite DB in combination with Data Management

    Hi,
    We are currently using LCDS 3.1 in combination with an AIR 2.5 client on a huge project.
    I've been reading "Using Adobe LiveCycle Data Services ES2 version 3.1" and I have a question. In the chapter "Building an offline-enabled application" it says on the very first line:
    "You can use an offline adapter with an AIR SQLite database to perform offline fills when a desktop client is disconnected from the LiveCycle Data Services server. An offline adapter contains the SQL queries for AIR SQLite for retrieving cached items like an assembler on the server retrieves items from the data source."
    However, in my experience that AIR SQLite database is not just any DB, but one that Data Management designs and generates itself, based on the DTOs the Data Management destination is managing. The offline adapter doesn't work like an assembler at all, because the documentation says you can only override the methods pertaining to constructing the WHERE and ORDER BY parts of the queries, not the SELECT, CREATE, FROM, ... parts.
    In our case, we have a database on the server, constructed according to a very specific ERD, and we have a SQLite database on the client, also constructed according to a very specific ERD. What we want to do is execute every fill, create, update, and delete against the offline cache, and only synchronize with the backend when we want it to synchronize (technically possible by playing with the autoMerge, autoSaveCache, autoConnect, ... properties). So which part of Data Management can we customize to use our DB instead of a generated one?
    Thx in advance!

    You are correct in noting that Data Management does not allow you to use your own database to store offline data.  This data is exclusively managed by the LCDS library for the developer.  The intent is that the local cache is a reflection of the server data, not an independent copy.
    If you have an existing database in AIR, then you will have much more direct control over the querying and updating of that data by using the SQLite APIs directly.
    That being said, you can in essence replicate the data stored on the server, managed by Data Management, in the offline cache.  In an upcoming release (winter 2011) we will have a few features ('briefcases' and a 'changes-only' fill) that will make this story even more compelling for your use cases.  But even with the 3.1 functionality, you can do something like the following:
    1. Perform a fill() to collect the data you want to have available on the client, and save this in the offline cache.
    2. Construct an offline adapter ActionScript class that implements the fills you want to perform on the local data.
    3. Use the DataService.localFill() API to perform all of the client application fills, and turn off autoCommit.
    4. When the client is online, call commit() to store client changes and call fill() to refresh the cached data.
    This should give you some ideas on how you could go about constructing your app to leverage the offline features of Data Services.
    Tom

  • MDM Data Manager step-by-step process with an example

    Hi All,
    I recently started reading about MDM. I am clear on the MDM Console concepts, but I have problems with MDM Data Manager. I am confused by data entries in qualified tables, workflows (using the Microsoft Visio stencils), matching mode rules (high threshold, low threshold), etc. I have read the material posted in the forums, but I am still not clear.
    I need a step-by-step process for data entries, with examples: a good example which covers all the Data Manager concepts. PDFs with screenshots, videos, anything would help.
    Please help me... it would be of great help.
    Thanks in Advance.
    Suchir

    Hi Suchir,
    MDM 5.5 has 4 MDM clients to perform the 4 main functions on MDM master records:
    - Console: this is where all the administrative work is performed.
    - Data Manager: this is where the actual maintenance work is done on the master data, like consolidation, deduplication, etc.
    - Import Manager: this is from where the source master records are taken into the MDM repository.
    - Syndicator: this is from where the consolidated master data is sent back to the target systems.
    MDM Data Manager is the heart of MDM.
    The activities that can be performed in the MDM Data Manager are:
    - Data Consolidation
    - Data Validation
    - Data Deduplication
    - Data Assignment
    - Data Governance
    etc.
    Qualified tables:
    - Qualified tables are used to store relational records where one field's value determines the value of another field or fields.
    - Qualified tables are maintained in Console and can be viewed in Data Manager.
    - Qualified tables are viewed under a separate section in Data Manager, on the right-hand side.
    - A qualified table has non-qualifiers and qualifiers.
    - Non-qualifiers can be seen in the qualified table and in the linking qualified field.
    - Qualifiers can be seen separately under the qualified section in Data Manager.
    Kindly refer the below link to know more on this:
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/00a15239-684e-2b10-b8ae-b936b7d1c1fe
    Workflows
    - Workflows are designed in MDM for governance.
    - You can create an MDM workflow using the MS Visio 2003 stencils in MDM Data Manager, in record mode, under the workflow table.
    - The saved workflow in DM can then be executed either manually or automatically, based on the trigger actions.
    - The trigger actions for an MDM workflow are record Add, Update, Import, and Manual.
    Kindly refer the below link to know more on this:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/60559952-ff62-2910-49a5-b4fb8e94f167  (MDM Workflows Overview)
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/90990743-91c0-2a10-fd8f-fad371c7ee40  (Demo of MDM Workflows)
    Matching Mode
    - This mode is exclusively for master data deduplication.
    - You need to create rules, strategies, and transformations in MDM Data Manager in record mode.
    - Then you run these strategies on the selected records.
    - Based on your score and the threshold set in the strategy, MDM will identify duplicate records.
    - The score is shown in matching mode.
    - Colour coding is used along with the scores to identify duplicates:
    - Green: 100% duplicate; Blue: 50% duplicate; Red: no duplicate.
    Kindly refer the below link to know more on this:
    /people/michael.reil/blog/2006/05/18/mdm-matching-strategies-for-master-data-consolidation (matching merging in mdm)
    Hope It Helped
    Thanks & Regards
    Simona Pinto

  • Governance and Data Management

    Hello,
    We need to know about governance and data management. As of this week I am working as master data coordinator at Empresas Polar (mass consumer products industry), where the role of this function is to assure the creation and maintenance of the master data. As initial activities, we are gathering statistics to identify the most important master data in all SAP modules. The idea is, through our own developments in R/3, Portal, and Workflow, and tools that exist on the market, to simplify the updating of the information on the part of the user. We know that in SAP the information is created in different views or transactions, which makes the timely updating of the data very difficult.
    I am interested to know about MDM or any tool/application which would help us.
    I appreciate your help,
    Thanks

    Hi Torres,
    The MDM system for master data management and single-source cross-media catalog publishing does more than just database management, asset management, or even database publishing. It uses an industry-standard SQL-based DBMS as a "container" for the repository database, but then it goes much further, combining many different technologies, concepts, and minute details into a comprehensive solution to an age-old problem. Many of the features incorporated into the MDM system are unique.
    ADVANTAGES
    • End-to-end solution. The MDM system provides an end-to-end solution that automates the entire process of managing master data from start to finish, including bulk data import, centralized master data management, and published output to a variety of media.
    • Database-driven system. MDM layers a thick shell of functionality on top of a powerful SQL-based DBMS, so that the MDM system is fully scalable and the master data is fully accessible to other SQL-based applications and tools.
    • Large capacity. The MDM system efficiently manages master data repositories containing up to millions of records.
    • Superior performance. MDM breaks through SQL performance bottlenecks to deliver blazingly fast performance that is measured in milliseconds rather than seconds and is literally 100-1000 times that of a SQL DBMS alone. No other system on the market today delivers comparable performance.
    • Powerful search and retrieval. All of the MDM modules include powerful search and retrieval capabilities, so that an entire repository of thousands or millions of items can be searched easily and any item or group of items located in a matter of seconds.
    • Cross-media publishing. The MDM system is the only one on the market today that includes tightly integrated functionality for both electronic and printed output: Web, CD-ROM, and paper output from a single MDM repository.

  • Detailed Query Runtime Statistics - Substeps of Data manager Event

    Dear All,
    I am using the report for 0TCT_MC02_Q0200 to analyse the Detailed Query Runtime Statistics.
    The report shows that the Event 9000 (Data manager Event) took a time of 109 seconds.
    I would like to know what the Data Manager event (9000) that takes 109 seconds does for me.
    What is it doing for those 109 seconds?
    Can I have more details about the substeps of the Data Manager event and the time taken by each? Any table or report would suffice for my requirement.
    Thanks
    Regards
    Shalabh

    Hi Jain,
    Execute the query in RSRT or in RRMX to check how much time it takes to execute. To get this information, go to SE11, enter the table name RSDDSTAT_DM (BI 7.0) or RSDDSTAT (BW 3.x), choose Display -> Contents, enter today's date as the from/to date values, your user name, and the query name, and execute.
    You will get a list with fields like Object name (report name), Time read, InfoProvider name (MultiProvider), PartProvider name (cube), Aggregate name, etc. If the time read is less than 100,000,000 (100 sec), it is acceptable; if it is more than 100 sec, it is recommended to create aggregates for that query to improve performance.
    In this table you will find the detailed query runtime statistics.
    Hope this helps.
    Regards,
    Ramki.

  • LiveCycle 2.6.1 Data Management with The ColdFusion 8.0 DataManagement Event Gateway Issue

    Hello all,
    I've recently been developing a project that involves sending out events from ColdFusion to LiveCycle 2.6.1, using the Data Management event gateway, to Flex 4.0 clients (LiveCycle and ColdFusion are on different instances, but the same server). To begin with, I used ColdFusion assemblers, DAOs, and models, and everything worked fine locally. After deploying this setup to a beta site, I decided that it would be very troublesome in terms of configuring clustered instances across multiple servers. I then decided to convert my assemblers, DAOs, and models to Java. The conversion went well, and the Flex clients see the exact same data as they did with the ColdFusion adapter.
    Once I tried to send an update from my ColdFusion application to a Flex client, I got an error stating:
    "Unable to find the Flex adapter for destination My_Dest in the RMI registry on localhost:1099.The Flex adapter may not be running or the destination may be incorrect."
    After seeing this error, I downloaded a Java-based RMI inspector to see what was going on. To get a good idea of what was happening when the ColdFusion adapter was being used, I switched my data-management-config file back to the CF adapter. I noticed that the RMI entries were as follows:
    localhost:1099/cfdataserviceadapter/My_Dest
    localhost:1099/cfassembler/my_cf_instance
    Once I gathered this data as the baseline, I converted back to the Java adapter in my data-management-config file, restarted the servers, and ran the RMI inspector again. Only "localhost:1099/cfassembler/my_cf_instance" was showing. (This one shows because I have "Enable Remote Adobe LiveCycle Data Management Access" checked in my CF instance's CF Admin -> Flex Integration.) Since I don't need this checked anymore, I unchecked it and re-ran the RMI inspector. As it should, "localhost:1099/cfassembler/my_cf_instance" went away. Since no destination shows up, it means that the Flex adapter isn't registering my "My_Dest" destination with RMI. Since it isn't registered, I can't see it when I try to send a message through the CF Data Management event gateway.
    Can anyone help me out here?  I certainly may be missing something when it comes to RMI (I don't work with Java very often).  Any advice would be greatly appreciated!
    Thank you,
    Dustin Blomquist

    Dustin,
    Without the ColdFusion based data management destination defined on the LCDS server, the destination will not show up in the RMI registry.  It is only the CF adapter code that does this.  The 'stock' LCDS adapter does not support invoking via RMI the way the CF version does.
    I would recommend you run the LCDS MessageBrokerServlet inside the ColdFusion web application.  This will give you two things:
    1. You will not have the overhead of RMI between CF and LCDS as they will share the same VM (better performance!).
    2. You will be able to use the CF Data Management Gateway to pass messages to Java-based destinations.  The APIs the gateway uses should work fine with either CF or Java based Data Management destinations.
    The CF/LCDS integration doesn't support what you are trying to do when you run two separate instances.

  • Error while connecting to Repository through Data Manager

    Hi Experts,
    I am able to access the MDM server through MDM Console. Here I get two main errors.
    Error 1: I am not able to create a new repository through Console. When I click the "Create Repository" option in Console, it tries to connect to the DBMS server; I have given the database credentials, but I get an error like "Unable to connect to DBMS server".
    Error 2: I have some repositories on my MDM server, and one repository is already running. When I try to access the running repository in Data Manager, as well as in Import Manager and Syndicator, I cannot connect to the repository; I get an error like "Repository not loaded".
    Note: all the necessary ports are open as per the SAP standard.
    Please help me with this if anybody has an idea about it.
    Thanks in advance,
    Ravi.

    Hi Ravi,
       Steps to load a repository:
       1. Load the MDM server.
       2. Start the MDM server.
       3. There you will find the list of repositories.
       4. If required, log in to the repository.
       5. Right-click the repository and select 'Load Repository', then 'Immediate'. This is the main step you have to perform to solve your second problem.
       This will load the repository, and you can then work on it from anywhere: IM, DM, or Syndicator.
       For the first error, check whether you have given the proper path for your DBMS server; if you give it correctly, it won't give you any error. If the server is on the local system, it will be in the list by default; otherwise you have to provide the name of the system where the server is.
       In case of any other issues, please let me know.
    Regards,
    CHARAN

  • NEW TO SAP BPC - Advice on Data Management Packages

    Hi,
    I am an experienced Microsoft BI User, but am new to SAP BPC.
    I am working on a project that has required me to create custom SSIS packages for my client. To date, my packages transform, cleanse, and format the source into a clean CSV file that requires no further intervention from the user, other than running the standard SAP BPC import routine with a simple transformation file.
    In order to fully automate the process, I would like my Custom package to automatically invoke the Standard import package without any user based intervention.
    Can anyone advise on this?
    I have considered writing further logic in my custom package that would completely bypass SAP BPC Data Manager and insert the information directly into the fact tables, taking into consideration any members that don't exist in the dimensions. However, I would prefer not to go down this route unless absolutely necessary.
    Can anyone point me in the right direction, so that I can call a clear routine from within my custom package that invokes an SAP BPC package/task to clear certain data from the fact table, and then call another package/task that updates the fact table based on a fully prepared CSV file?
    Many thanks

    You could modify your custom package to perform all the steps through BPC import and have the user initiate the package from the BPC Data Manager. You would just need to add a MODIFYSCRIPT variable (see the standard IMPORT package as an example) and add the BPC tasks required to convert/load the data your package already prepares.
    I have had prior success with a custom package, initiated from within the BPC Data Manager, that extracted data from an external data source (i.e. Oracle), used runtime input provided by the user to customize the data extraction parameters, manipulated the extracted data, and imported it into BPC.
    Unfortunately, I've never attempted to initiate a BPC task without the package being initiated from within BPC. But I would surmise that if you provide the proper values for the required properties (i.e. AppSet, App, User, ...) of the individual BPC tasks (i.e. Convert and Load), I cannot think why it would not work.
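    For illustration, the task portion of such a dynamic script might look roughly like the following. This is a minimal sketch modeled on the standard IMPORT package; the PROMPT/TASK statement shapes are standard, but the exact task and property names vary by BPC version, so treat every name below as a placeholder to be copied from your own IMPORT package's Modify Script view rather than as exact syntax.
    PROMPT(INFILES,,"Import file:",)
    PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
    TASK(CONVERT Task,APPSET,%APPSET%)
    TASK(CONVERT Task,FILE,%FILE%)
    TASK(CONVERT Task,TRANSFORMATION,%TRANSFORMATION%)
    TASK(LOAD Task,APPSET,%APPSET%)
    TASK(LOAD Task,APP,%APP%)
    Your custom SSIS package would carry the same Convert/Load task sequence after its own cleansing steps, so the user runs a single package from Data Manager and gets transform, convert, and load in one pass.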

  • Data creation in data manager vs. ERP transaction

    Hi experts,
    I have a general question on working with MDM: when I use MDM as the central data management tool, I create and change all data within MDM Data Manager, or maybe in MDM iViews in EP. Doing it that way, I lose a lot of the positive aspects I formerly had when doing this in ERP, even when using business content.
    For example:
    - no process logic exists (which I had in ERP dynpros in the code): no preconfigured assignments, validations, relationships, etc.
    - I have to know all relevant tables where I have to do my entries
    - no standard searches for typical problems
    - all roles and authorizations have to be created on my own (except standard roles like Admin, Data Expert, etc.), there is no possibility to transfer authorizations from ERP to MDM (authorizations can be very complex)
    - you need more (expert) know-how to maintain data (which is also a job of business departments)
    - etc.
    So a change to central master data management means a step backwards, from a dynpro-controlled input GUI to a database frontend with only rudimentary features, losing a lot of functionality that was built up over many years.
    Is this right, or did I miss something important? I think losing all this functionality can be a cause of resistance against the introduction of MDM central data management in companies.
    Thanks for your answers. Helpful answers will be rewarded.
    BR, bd

    Hi BD,
    CMDM is one of the later stages in using MDM effectively. Before moving to a CMDM scenario, it is necessary that the groundwork is in place.
    - One goes for a CMDM scenario once all the existing (one-time) data has already been cleansed. So first you need to extract all the potentially high-risk duplicate data from the ERP system into MDM, run the matching and merging strategies on it, and then maintain the consistent data in MDM. Only once this step is completed does it make sense to go for the central creation of new data through MDM.
    - Just as in ECC, where a user creating a new material, for instance, has to go through all the views and fill in all the required and optional field values, the same scenario can be replicated in MDM by creating a repository with all the fields one needs to enter while creating a new material, and by using MDM validations we can simulate the same mandatory-field behaviour as in ECC.
    - If you go for a business content repository for Material, Vendor, Product, etc., you will have all the ready-to-use roles with authorizations, so it is not much rework.
    - Standard searches may not be available in MDM, but one can always design one's own searches based on the customer requirement, and MDM searches are very easy to use and also very dynamic.
    I do agree that SAP MDM may not be a fully grown tool at this point in time, but with the SAP MDM 7.1 version most of these drawbacks will be addressed. Besides, although ECC can do most of the master data related work, it is not a dedicated system for that purpose, and this has an influence on performance and the time involved.
    Hope It Helped,
    Kindly Reward Points if found useful
    Thanks & Regards
    Simona Pinto

  • Prevent Data Manager packages from being started at certain times of the day

    Hi
    We currently have an issue as described below.
    During month-end closing, SAP BW data is updated to BPC 3 times each day. This is done using a Data Manager package link performing 8 steps in total. This DP link works as expected. The problem is that while it is running, users start other Data Manager packages that are based on the data from the package link. This is an issue because the data is not correctly allocated before all 8 steps of the DP link have completed, which leads to very odd / incomplete data, and we would like to prevent this.
    The question is therefore: can we implement some kind of logic that prevents users from starting other Data Manager packages while the DP link is running?
    /Chris

    Hi Chris,
    If you want to do the check for a DM package which runs, e.g., a script via the default formula process chain, you can create a copy of the process chain and build in an extra step, just after the start, that runs a custom program.
    You can set up the custom program, e.g., to check the table RSPCLOGCHAIN for dependent running process chains, or to validate whether the DM package may run based on parameters set in a custom table, e.g. the time range during which you run the package link.
    You don't need much logic in the custom program to do the check.
    I don't see another, or even a standard, way to do this check in BPC.
    Br
    Rasmus Larsen

  • Syndication process - tags in Data Manager for selecting the map in syndication

    Hello Experts,
    I am supporting a repository in MDM 7.1, and there is a part of the syndication process that I don't know how it works. This is the process I mean:
    The repository works with 3 different maps; the map is selected in MDM Data Manager via a 'syndicatable' tag. Two of the maps are selected inside qualified tables.
    What I need to know is where I can see the configuration for that automatic mapping (it is performed via MDSS). Could anybody help me?
    Thanks in advance,
    Mariano Pinkava.

    Hi Mariano,
    In addition, please go through page 223 (mds.ini) and page 272 of the Console Reference Guide.
    Syndication file location:
    When MDSS completes a syndication to a port, it places the syndication file in the port's Ready folder. The Ready folder is part of the following fixed directory structure, located beneath the MDM Server's distribution root directory:
    root/DBMSinstance_DBMStype/RepositoryName/Outbound/RemoteSystem/PortName/Ready
    where:
    • root is the distribution root directory (set in mds.ini).
    • DBMSinstance is the network identifier used to specify the DBMS instance name, and DBMStype is the four-character identifier for the DBMS type (i.e. MSQL, ORCL, IDB2).
    • RepositoryName, RemoteSystem, and PortName are the values entered in the Code property for each item in MDM Console.
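    As a concrete illustration with hypothetical names (DBMS instance MDMSRV on MS SQL Server, repository PRODUCTS, remote system ERP, port OUT_MATERIAL), a completed syndication would therefore land in:
    root/MDMSRV_MSQL/PRODUCTS/Outbound/ERP/OUT_MATERIAL/Ready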
    Thanks,
    Priti
