Using the Dependency Manager for forms captured in Designer

I am trying to use the Dependency Manager for a form (already captured in Designer) and I am getting the error below. Any help will be appreciated.
Message
CDR-06011: Internal error.
Cause
An unexpected internal error has occurred.
Action
Switch on diagnostics, and repeat the steps that caused the error.
Message
BME-99008: The Java runtime cannot dynamically link a class.
Cause
A class has been loaded that has changed in some way or has become out of step with some library class. The current activity may fail and the system may have been left in an unstable state. The following is a stack trace.
java.lang.NoClassDefFoundError: oracle/forms/jdapi/Jdapi
     at oracle.des.ia.parser.structured.GeneralModule.getFormsBuiltIns(GeneralModule.java:182)
     at oracle.des.ia.parser.structured.GeneralModule.setupSqlParserParameter(GeneralModule.java:226)
     at oracle.des.ia.parser.structured.GeneralModule.analyze(GeneralModule.java:145)
     at oracle.des.ia.parser.structured.Structured.parse(Structured.java:126)
     at oracle.des.ia.AnalysisTool.executeParser(AnalysisTool.java:1810)
     at oracle.des.ia.AnalysisTool.process(AnalysisTool.java:1290)
     at oracle.des.ia.AnalysisTool.process(AnalysisTool.java:1010)
     at oracle.repos.tools.dependency.manager.framework.DependencyAnalyzer.parseObject(DependencyAnalyzer.java:558)
     at oracle.repos.tools.dependency.manager.framework.DependencyAnalyzer.analyzeUsing(DependencyAnalyzer.java:383)
     at oracle.repos.tools.dependency.manager.framework.DependencyAnalyzer.analyzeUsing(DependencyAnalyzer.java:306)
     at oracle.repos.tools.dependency.manager.framework.DependencyAnalyzer.analyze(DependencyAnalyzer.java:278)
     at oracle.repos.tools.dependency.manager.dialog.AnalyzeDialog$4.runImpl(AnalyzeDialog.java:650)
     at oracle.repos.tools.dependency.manager.adapter.RunnableErrAdapter.run(RunnableErrAdapter.java:17)
     at java.lang.Thread.run(Unknown Source)
Action
If further errors occur, you should restart the application.
Also, contact Oracle support, giving the information in this message.
It might be possible to upgrade the internal virtual machine.

Hi Avinash,
There is a note on Metalink about your problem. You are missing "f60jpapi.jar"
in %ORACLE_HOME%/forms60/java. Check Metalink for "BME-99008".
My problem is close to yours: I am getting BME-99007, which means
that I am missing some Java class for the forms parser.
Any ideas? Did you ever have a problem like this?
By the way, did you succeed in creating forms and reports "dependencies"?
Thank you
Michael
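
(A minimal diagnostic sketch, not from the Metalink note: assuming you want to verify that the JDAPI classes named in the stack trace are actually visible to the JVM after placing the jar, a plain class-loading check like this would confirm it. The class name comes from the NoClassDefFoundError above.)

    public class JdapiCheck {
        public static void main(String[] args) {
            try {
                // Succeeds only if the Forms JDAPI jar (e.g. the one in
                // %ORACLE_HOME%/forms60/java) is on the classpath.
                Class.forName("oracle.forms.jdapi.Jdapi");
                System.out.println("JDAPI class found");
            } catch (ClassNotFoundException e) {
                System.out.println("JDAPI class missing: " + e);
            }
        }
    }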

Similar Messages

  • Using Identity Management for Securing Web Services

    My goal is to associate my services with an Oracle Internet Directory. I made some attempts to set up SAML authentication for the web services, but they didn't have the right outcome.
    (My identity management server and OID are up and running, and I have successfully made authentication modules for other web applications.)
    Here is what I did:
    1. I wrote a simple Java file and used JDeveloper tools to create and deploy it as a web service to OC4J. I associated an identity management server with this service through the OC4J web tools as the security provider.
    2. I made a data control for the web service and put it in an ADF application (the client).
    3. I deployed the client project (2) to OC4J.
    I could use the web service through the page.
    Then I secured the web service to expect SAML for authentication.
    Surprisingly, the client could still communicate with the web service. Why? Shouldn't it have rejected the request because of the problem with the SAML token? (The proxy and the data control were not secured, and didn't provide any SAML tokens.)
    4. I added a login page to my client project (through the ADF Security wizard). It used identity management for authentication successfully: the login process completes and the web service data control is displayed.
    5. I want the authentication information to be propagated through the page so that the web service receives the data and uses Identity Management.
    I know I should add <property name="oracle.security.wss.propagate.identity" value="true"/>
    to one of the configuration files, but I don't know where exactly.
    Best Regards,
    Farbod
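
    (A minimal sketch, assuming the property can also be supplied programmatically on the JAX-RPC client stub — an assumption, since the post only says it belongs in a configuration file. The myPort proxy and the helper are hypothetical.)

        import javax.xml.rpc.Stub;

        public class PropagateIdentity {
            // myPort is the generated JAX-RPC proxy for the secured service.
            public static void enable(Object myPort) {
                Stub stub = (Stub) myPort;
                // ASSUMPTION: the runtime honors this as a stub property; the
                // documented route is an entry in a configuration file.
                stub._setProperty("oracle.security.wss.propagate.identity", "true");
            }
        }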

    It doesn't matter whether the service is invoked as part of your larger process or not; if it is performing any business-critical operation then it should be secured.
    The idea of SOA and designing services is to have the services available so that they can be orchestrated as part of any other business process.
    Today you may have secured your parent services, and tomorrow you could come up with a new service which may use one of the existing lower-level services.
    If all the services are in one application server, you can make the configuration/development environment a lot easier by securing them using the Gateway.
    A typical problem with any gateway architecture is that the service is available without any security enforcement when accessed directly.
    You can enforce rules at your network layer to allow access to the app server only from the Gateway.
    When you have the liberty to use OWSM or any other WS-Security products, I would stay away from any extensions. Two things to consider:
    - The next BPEL developer in your project may not be aware of the security extensions.
    - Centralizing security enforcement keeps your development and security operations loosely coupled and addresses scalability.
    Thanks
    Ram

  • Image not displayed in pdf generated using Java API for Forms service

    Hi,
    I am creating a pdf document using Java API for Forms Service.
    I am able to generate the pdf but the images are not visible in the generated pdf.
    The image's relative path comes in the XML as shown below. The images are stored dynamically in the LiveCycle repository, each with a unique name, every time a request is fired, before the XML is generated.
    <imageURI xfa:contentType="image/png" href="../Images/logo.png"></imageURI>
    I am not sure if I need to specify specific URI values to render a form with an image.
    The same thing works when I generate a PDF document using the Java API for the Output service.
    As I need to generate an interactive form, I have to use the Forms service to generate the PDFs.
    Any help will be highly appreciated.
    Thanks.

    Below is the code snippet:
        //Create a FormsServiceClient object
        FormsServiceClient formsClient = new FormsServiceClient(myFactory);

        //Specify URI values that are required to render a form.
        //The content root URI holds the full repository path to the form template.
        URLSpec uriValues = new URLSpec();
        uriValues.setContentRootURI(templateLocation);

        //The base URL is where form resources such as images and scripts are
        //located; the full image path is passed in the base URL in HTTP format.
        String baseLocation = repositoryPath.concat(serviceName).concat(imagesPath);
        uriValues.setBaseURL(baseLocation);

        //Set run-time options using a PDFFormRenderSpec instance
        PDFFormRenderSpec pdfFormRenderSpec = new PDFFormRenderSpec();
        pdfFormRenderSpec.setCacheEnabled(Boolean.TRUE);
        pdfFormRenderSpec.setAcrobatVersion(
                com.adobe.livecycle.formsservice.client.AcrobatVersion.Acrobat_8);

        //Invoke the renderPDFForm method and write the results to a client web browser
        String tempTemplateName = templateName;
        FormsResult formOut = formsClient.renderPDFForm(tempTemplateName,
                inXMDataTransformed, pdfFormRenderSpec, uriValues, null);

        //Create a Document object that stores form data
        Document outputDocument = formOut.getOutputContent();
        InputStream inputStream = outputDocument.getInputStream();

  • Is it a must to use Solution manager for ECC 6.0 upgrade

    Is it a must to use Solution Manager for an ECC 6.0 (FICO & HR) upgrade? I am told Solution Manager is required only if the system has both ABAP and Java stacks, or involves Enhancement Packages, and that it could be optional if it is only ECC 6.0 with the ABAP stack.

    Hi,
    I'm having a difficult time finding guidance for setting up our R/3 Enterprise system in SolMan properly, so that I can create the upgrade XML file required to go directly to ERP 6.0 EHP4.
    I've set up the R/3 system fine, but when I create a maintenance transaction I cannot for the life of me figure out how to tell it I want to go from R/3 to ERP 6.0 EHP4.
    Any docs out there for this?
    Thanks!
    I figured it out... I had to add the R/3 Enterprise logical components to my system landscape. After that I was able to select R/3 as the initial data in the maintenance transaction, then the system I wanted to upgrade, and then the option to upgrade.
    Never mind!
    Edited by: Tom Janes on Sep 9, 2009 12:52 AM

  • Use State Management for Savepoints

    Hello,
    I read a topic in the Oracle ADF 11.1.2 Developer Guide about using state management for savepoints; the topic is 43.9.1, "How to Use State Management for Savepoints".
    Has anyone tried a sample based on it?
    Regards, Mohammad J. B. Yaseen

    Hi,
    have a look at page 6 of http://www.oracle.com/technetwork/developer-tools/adf/learnmore/007-cancelform-savepoint-169126.pdf
    "Explicit ADFm Savepoints in ADF Task Flow"
    Frank
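
    (A minimal sketch of an explicit savepoint call from a managed bean, assuming the ADF Controller savepoint API that the chapter and the PDF describe; the bean is hypothetical, and savepoints must first be enabled for the application in adf-config.xml.)

        import oracle.adf.controller.ControllerContext;

        public class SavePointBean {
            private String savePointId;

            // Capture the current task-flow state as an explicit savepoint.
            public void createSavePoint() {
                ControllerContext cc = ControllerContext.getInstance();
                // The returned id can later be passed to a savePointRestore
                // activity to resume the task flow from this point.
                savePointId = cc.getSavePointManager().createSavePoint();
            }

            public String getSavePointId() {
                return savePointId;
            }
        }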

  • ALE: how do I use dependent distribution for customer and vendor addresses?

    Hello SDN'ers
    Hopefully someone can help me with an ALE question.
    We have two ERP systems connected with ALE: Retail and AFS.
    Customers and Vendors may be created in Retail, and are distributed to AFS using DEBMAS, CREMAS and ADRMAS messages.
    Not all customers and vendors are required in AFS, so we have various filters set up in the Retail distribution model (BD64).
    However with the filters in place, addresses were still being distributed for customers and vendors that had been filtered out.
    To try and avoid this for vendors, we created a filter group for AddressOrg.SaveReplica in receiver determination. In the filter group we selected Dependent Distribution for CREMAS (Address owner object).
    This works perfectly for vendors - vendor addresses are now only distributed if the vendor itself is distributed.
    However, I've now discovered it is preventing any customer addresses from being sent! I was surprised by this: I expected the CREMAS dependency to only affect vendor addresses. Can someone tell me:
    - is this how dependent distribution is designed to work? In which case, can anyone offer any suggestions to achieve the filtering some other way?
    - or are we doing something wrong to be getting this result?
    Thanks for any help!
    regards, Roger

    Hi all,
    as Robert suggested, I've written an exit to get round this, although I'm surprised I should need to!
    What I did was to code an enhancement in function group BDBF, at the start of subroutine COMPUTE_LOGIC_EXPRESSION.
    * In the case of addresses, dependencies should only be applied for the appropriate type of address
      data: lwa_tbd14 type tbd14.
      if P_MODEL_DATA-FTYPE = C_FTYPE_DEPENDENCY_FROM_MESTYP.
    *   Dependency from Message Type.
    *   Read the Address Object type
        READ TABLE P_FILTER_VALUES
                         WITH KEY OBJTYPE  = 'AD_OBJTYPE'
                         BINARY SEARCH.
        if sy-subrc = 0.
    *     Check if the Message Type is linked to the Address Owner Object Type
    *     (Table is buffered on DB)
          select single *
            from TBD14
            into lwa_tbd14
           where MESTYP = P_MODEL_DATA-FMESTYP
             and OBJTYP = P_FILTER_VALUES-OBJVALUE.
          if sy-subrc <> 0.
    *       Wrong type of address for this dependency - stop here
            P_REPLICATE = c_true.
            exit.
          else.
    *       Correct address type. Standard processing continues, checking any
    *       filter values set up against the Address Owner Object Type.
          endif.
        else.
    *     It isn't an address - continue with standard processing
        endif.
      endif.
    Let me know if this helps you, or if you know of a better way to achieve the same result.
    Roger

  • Using Desktop Manager for managing Evolution Mail-Accounts

    hi there,
    where can I find documentation that describes how I can manage
    Evolution accounts within Desktop Manager? I can only add an account name,
    but I cannot define the account-specific configuration data, such as the
    type of email server (in our case: MS Exchange and Sun JES Server),
    and depending on the type of email server I must add more or less configuration
    information.
    Second question: can I use variables for those configuration data, such as
    $USERNAME, to configure the Evolution email account with the correct username?
    The password has to be entered by the user when he/she runs Evolution for the first time.
    thanks in advance
    cheers
    joerg

    Hi Joerg,
    The documentation for the Desktop Manager focuses on the common parts of the operation of the product. The details of the meaning of each setting are more related to the application to which they belong, which is why they don't feature in the general administration manual. You can, however, find more contextual information about the various pages of the Desktop Manager by clicking on the "More" button present on each of these pages. In the case of the Evolution accounts, for instance, this would have told you that the Accounts field is a list of strings containing the XML account description (though I'll grant you that in this particular case, it's a bit on the terse side).
    As for the placeholders, you need to update your Sun Ray server with patch 120454 (or 120455, depending on your platform) revision 02, so that the GConf adapter which integrates Desktop Manager data into the Gnome configuration handles the placeholders properly. Then you have the possibility of using the special strings "[apoc.<attribute>]" in the configuration settings, which will be replaced when evaluated with the value of <attribute> in the LDAP user entry.
    As an example (which you should adapt to the contents of your LDAP server and the actual definition you want to use for your Evolution accounts), you could for instance use the following XML blob for an Evolution account (I've formatted it nicely for display purposes; you will want to paste it as a single line into that Accounts list):
    <?xml version="1.0"?>
    <account name="[apoc.mail]" uid="[apoc.uid]" enabled="true">
        <identity>
            <name>[apoc.cn]</name>
            <addr-spec>[apoc.mail]</addr-spec>
            <reply-to></reply-to>
            <organization></organization>
            <signature auto="false" default="-1"/>
        </identity>
        <source save-passwd="false" keep-on-server="false" auto-check="true" auto-check-timeout="10"/>
        <transport save-passwd="false">
            <url>smtp://[apoc.givenname].[apoc.sn]@[apoc.mailhost]</url>
        </transport>
        <drafts-folder>file:///home/[apoc.uid]/evolution/local/Drafts</drafts-folder>
        <sent-folder>file:///home/[apoc.uid]/evolution/local/Sent</sent-folder>
        <auto-cc always="false">
            <recipients></recipients>
        </auto-cc>
        <auto-bcc always="false">
            <recipients></recipients>
        </auto-bcc>
        <pgp encrypt-to-self="false" always-trust="false" always-sign="false" no-imip-sign="false">
            <key-id></key-id>
        </pgp>
        <smime encrypt-to-self="false" always-sign="false"/>
    </account>
    Now to go back to your problem with getting the profiles you've assigned with the Desktop Manager visible on the user's desktop, I can first offer a few remarks on your configuration:
    - You've specified a DN and password for authentication to your LDAP server. I would like to point out that the agent (as opposed to the Desktop Manager itself) will only ever perform read accesses to your LDAP server (to find the user entry, figure out its position in the hierarchy and retrieve the applicable profiles), so you don't need to provide a user with write access, let alone administrative privileges, as you seem to have done. If your LDAP server setup supports anonymous read access, I would advise you to leave these fields blank in the agent configuration.
    - You've set the ChangeDetectInterval, DaemonChangeDetectInterval, TimeToLive and GarbageCollectionInterval to 1 (minute), which is unnecessary and, for some of them, probably detrimental to the performance of the agent operation. I can expand on the meaning of each of these if you wish (though I think that's described in the manuals), but I would advise you to put back the default values for all of these settings, except maybe the ChangeDetectInterval while you're testing the product. This is the setting used to determine how often the user data is refreshed, i.e. how often the changes from the Desktop Manager are propagated to the desktop applications; one minute is fine for testing purposes, but you should put it back to its sixty-minute default once the profiles are fairly stable, in order to minimise the activity of the agent.
    Please first try and make the above changes to the agent configuration, and check whether the problem you're having is still present. If so, we can look into other potential causes for the problem.
    Regards,
    Cyrille

  • Use configuration management for multi site application

    Hi,
    My client develops a multi-site application with Designer 6i, and we want to use its configuration management.
    The application will be deployed at 10 sites, but not necessarily with the same version of the application at each site.
    We want to control that with the configuration management of Designer 6i.
    Does anyone have experience with that?
    Can I have some references or details of your experience?
    Thanks a lot
    chris

    I asked this sort of question for creating a live and a test release of an application. The answer I got was to create a configuration based on my application, and then to create an application based on the versions of the configuration.
    Then, I could check the configuration against the main workarea to tell me what was out of date.
    I could also use the configuration wizard to create a new version (of the configuration), selecting the versions of the modules I wanted (i.e. newer ones). Then I created a new version of the release application using the new configuration.
    Finally, in the command tool I can select the release workarea, which will extract the correct module versions (I have the .fmbs), and I can compile a release.
    I guess your solution here is to have a main workarea with all the source, then a configuration of that, and then one release application for every site. Each release application could use a different version of the configuration, depending on how up to date each site is.
    HTH
    Steve

  • Using Sync Manager for LabVIEW OI exe & TestStand

    We would like to have our LabVIEW Operator Interface executable send data via a TestStand queue to a logging sequence. The SyncManager API sounds like a slick way to do this because it provides access to TestStand synchronization objects from processes outside of TestStand. I created an experimental sequence to try out the concept. The sequence creates a queue with a TestStand Queue step, enqueues an item with another TestStand Queue step, then uses ActiveX calls into the SyncManager API to add another item to the queue. That all works.
    When I added a LabVIEW VI to enqueue another item via similar ActiveX calls into the SyncManager API, the VI does not get a valid reference to the queue, and so it can't enqueue the item.
    I read on the forum that the TestStand queue name needs to start with * in order to be accessed by separate processes on the same machine. When I prepend a * to my queue name, the TestStand Queue steps continue to work, but the step that uses an ActiveX call into the SyncManager API to get a reference to the queue returns a null ref, so the following ActiveX enqueue step returns an error. Plus, the LabVIEW VI still doesn't get a valid reference to the queue either.
    Ideas and suggestions welcome! The sequence and LabVIEW VI are attached. Using LabVIEW and TestStand 2012; I got similar results in 2013 SP1.
    Thanks!
    Hans
    Attachments:
    Sync Manager Experiment.zip (193 KB)

    Thank you very much dug9000. I unchecked the Create Object box in the sequence step adapter settings and disabled the Automation Open node in the VI. Now the sequence ActiveX steps can get a valid queue ref for a queue named "*TestQueue" and can enqueue an item. Yay! The VI now gets a valid reference to the queue and can get the number of items on the queue. Yay! But I have not been able to get the VI to successfully enqueue an item. I tried both passing the sequence context and a null sequence context to the IQueue Enqueue method, but no luck either way. The updated zip file is attached. More help would be greatly appreciated!
    Hans
    Attachments:
    Sync Manager Experiment II.zip (49 KB)

  • Using work manager for third party connections

    Folks, we are trying to implement work managers to put a governor on calls to third parties.
    Although WL is self-tuning, and you could "increase the internal socket pool", you have to fully understand the implications of this:
    - It increases the number of threads, which ultimately degrades performance in general, due to the amount of context switching that would have to occur (this is at the application layer).
    - It will require additional memory, which we'd need to account for; otherwise it can run the WL instance out of memory if we're in a fault condition.
    Let me explain this "fault" condition a little further...
    There's a BIG difference between a third party being down, and a third party being able to respond to our request but unable to process it in a timely fashion (i.e. they're overloaded, or the TCP connection is established properly but the application response from the third party isn't arriving in a timely fashion). We've seen this condition on a number of occasions, and we need to be able to have a max limit or governor on each third party. My understanding is that WL work managers allow for this (I have to note that I am not experienced with these, and my knowledge is only from what I've had time to read). Ultimately, I'm trying to prevent the condition I've seen in a number of serious "slow downs" or hang-ups within the application... we should NOT allow third-party calls to hamper the performance or availability of our application.

    Note that you can reduce the context switching by 'pinning' certain CPUs to handle, for example, the network interrupts.
    You can use (on Linux): cat /proc/interrupts (shows information related to hard interrupts).
    From http://middlewaremagic.com/weblogic/?p=8133
    "In a multi-processor environment, interrupts are handled by each processor. Two principles have proven to be most efficient when it comes to interrupt handling:
    Bind processes that cause a significant amount of interrupts to a CPU. CPU affinity enables the system administrator to bind interrupts to a group or a single physical processor (of course, this does not apply on a single CPU system). To change the affinity of any given IRQ, go into /proc/irq/<number of respective irq>/ and change the CPU mask stored in the file smp_affinity. To set the affinity of IRQ 19 (which in the above example is the network interface card) to the first CPU, we can use echo 2 > /proc/irq/19/smp_affinity."
    To configure a work manager properly, you have to throttle the maximum threads and the capacity. The maximum threads constraint is the number of threads that are allowed to run simultaneously. Usually, you set this equal to some resource (such as the number of connections in a data source) to prevent contention for that resource (otherwise a thread would be waiting for the resource and consuming CPU cycles by polling whether it can proceed or not). When the maximum is reached, no other threads are spawned to run; a request that comes in is put into a piece of work and stays that way until an execute thread becomes available to process it. The number of work instances you allow on your heap can be set by the capacity constraint (which thus throttles memory consumption). When the capacity is reached, new requests are rejected.
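
    (A hedged sketch of the dispatch side: application code submitting work to a named work manager through the CommonJ API. The JNDI name wm/ThirdPartyWM and the work body are hypothetical; the max-threads and capacity constraints themselves are configured in the server's descriptors, not in code.)

        import javax.naming.InitialContext;
        import commonj.work.Work;
        import commonj.work.WorkManager;

        public class ThirdPartyDispatcher {
            public void dispatch() throws Exception {
                // Look up a work manager declared in the deployment descriptor;
                // its max-threads-constraint caps concurrent third-party calls.
                InitialContext ic = new InitialContext();
                WorkManager wm = (WorkManager) ic.lookup("java:comp/env/wm/ThirdPartyWM");

                // Schedule the call; when the constraint is saturated, the work
                // waits in the queue instead of spawning more threads.
                wm.schedule(new Work() {
                    public void run() { /* call the third-party service here */ }
                    public boolean isDaemon() { return false; }
                    public void release() { /* asked to stop early; nothing to do */ }
                });
            }
        }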

  • Create key mapping using import manager for lookup table FROM EXCEL file

    Hello,
    I would like to create key mapping while importing values via an Excel file.
    The source file contains the key, but how do I map it to the lookup table?
    The properties of the table have key mapping creation enabled, but during the mapping in Import Manager I can't find any way to map the key mapping.
    E.g.
    the lookup table contains:
    Material Group
    Code
    the Excel file contains:
    MatGroup1  Code   System
    Thanks!
    Shanti

    Hi Shanti,
    Assuming you have already covered the points listed below:
    1) Key Mapping set to "Yes" for your lookup table in the MDM Console
    2) a new remote system created in the MDM Console
    3) proper rights for your account to update the remote key values in Data Manager through Import Manager.
    Your sample file can contain Material Group and Code alone, which can be exported from Data Manager via File -> Export To -> Excel if you already have data in Data Manager.
    Open your sample file through Import Manager, selecting the remote system for which you want to import the key mapping (do not select MDM as the remote system, as it does not allow you to maintain key mapping values) and Excel as the file type.
    Now select your source and destination tables; under the destination fields you will see a new field called [Remote Key].
    Map your source and destination fields correspondingly, and clone your source field Code by right-clicking on Code in the source hierarchy; map the clone to [Remote Key] if you want the code to be in the remote key values.
    In the matching criteria, select the destination field Code as a matching field, and change the default import action to "Update NULL fields" or "Update mapped fields" as required.
    After a successful import you can check the remote key values in Data Manager.
    Hope this helps
    Thanks
    Sowseel

  • Using Task Manager for custom approvals

    If someone knows a better way to do custom approvals using a web service, please let me know.
    Issue: I am trying to use the Task Manager provided in the BPEL Control. I am trying to invoke the web service and I am getting the error below. I am not sure if I am using it right or what might be happening. Please let me know how I should be using this service, or if there is an easier way.
    Cheers Mike.
    <remoteFault xmlns="http://schemas.oracle.com/bpel/extension"><part name="summary"><summary>exception on JaxRpc invoke: HTTP transport error: javax.xml.soap.SOAPException: java.security.PrivilegedActionException: javax.xml.soap.SOAPException: Message send failed: set.by.caller: set.by.caller</summary>
    </part></remoteFault>

    It looks to me like your invoke activity is invoking the "onResult" operation instead of the "initiate" operation. Go to your partner link and check what you set as the partner role in the "Partner Role" drop-down list. The partner role should be set as the provider, not the requester.
    Cheers!
    Zoran

  • Oracle forms 6i Dependency manager - Loading

    Hi,
    I am trying to use the Dependency Manager for Oracle Forms 6i, and I need to load the forms into the repository. How do I load the forms into the repository?
    thanks,
    Ray

    How can I do that?
    Use the Oracle Net8 Easy Config utility (Windows menu).
    I try to connect with the user I created with Enterprise Manager.
    When you connect to the DB you use a username, a password, and a connection string. The last one (a TNS alias) is necessary because you're working in a different Oracle Home; it's the same as if you were on a different machine.

  • Having response time issues using Studio to manage 3000+ forms

    We are currently using Documaker Studio to create and maintain our forms, of which we have thousands. Once we create the form we export it to a very old version of Documerge where it is then used in our policy production. 
    The problem is that because we have so many forms/sections, every time we click on "SECTIONS" in Studio it takes a significant amount of time to load the screen that lists all of the sections. Many of these forms/sections are old and will never change, but we still want to have access to them in the future.
    What is the best way to "back up" all these forms somewhere where they are still accessible? Ideally I think I would like to have one workspace (let's call it "PRODUCTION") that has all 3000+ forms, and delete the older resources from our existing workspace (called "FORMS") so that it has just the forms that we are currently working on. This way the response time in the "FORMS" workspace would be much better. A couple of questions:
    1. How would I copy my existing workspace "FORMS" (and all the resources in it) to a new workspace called "PRODUCTION"?
    2. How would I delete from the "FORMS" workspace all of the older resources?
    3. Once I am satisfied with a new form/section in my "FORMS" workspace how would I move it to "PRODUCTION"?
    4. How could I move a form/section from "PRODUCTION" back into "FORMS" in order to make corrections, or use it as a base for a new form down the road?
    5. Most importantly....Is there a better way to do this?
    Again, we are only using this workspace for forms creation and not using it to generate output... we will be doing that in the future once we upgrade from the very old Documerge on the mainframe to Documaker Studio.
    Many thanks to any of you who can help me with this!

    However, I am a little confused about the difference between extracting and promoting. Am I correct in assuming that I would go into my PROD workspace and EXTRACT the resources that I want to continue to work on, and would then go into my new, and empty, DEV workspace and IMPORT FILES (or IMPORT LIBRARY?) using the file(s) that I created with the EXTRACT? In effect, I would have two totally separate workspaces, one called DEV and one called PROD?
    Extraction is writing a copy of a resource from the library out to disk. Promotion is copying a resource from one library to another, with the option of modifying the metadata values of the source and target resources. You would use extract in a case where you don't have access to both libraries to do a promote.
    An example promotion scenario would go something like this. You have resources in the source (DEV) that you want to promote to the target (PROD). Items to be promoted are tagged with the MODE = "To Promote". When you perform the promotion, you can select the items that you want to promote with the filter MODE="To Promote". When you perform the promotion, you can also configure Studio to set the MODE of the resource(s) in the source to be MODE="To Delete", and set the MODE of the resource(s) in the target to be MODE="" (empty). Then you can go back and delete the resources from the source (DEV) where MODE=DELETE.
    Once you have the libraries configured you could bypass the whole extract/import bit and just use promote. The source would be PROD, and the target would be DEV. During promotion, set the target MODE = "To Do", and source MODE = "In Development". In this fashion you will see which resources in PROD are currently being edited in DEV (because in PROD the MODE = "In Development"). When development is completed, change the MODE in DEV to "To Promote", then proceed with the promotion scenario described above.
    I am a bit confused on the PROMOTE function and the libraries that have the  _DEV _TEST _PROD suffixes. This looks like it duplicates the entire workspace to new libraries _PROD but it is all part of the same workspace, not two separate workspaces?  Any clarification here would be helpful.
    Those suffixes are just attached by default; these suffixes don't mean anything to Documaker. You could name your library PROD and use it for DEV. It might be confusing though ;-) The usual best practice is to name the library and subsequent tablespaces/schemas according to their use. It's possible to have multiple libraries within a single tablespace or schema (but not recommended to mix PROD and non-PROD libraries).
    Getting there, I think!
    -A

  • Project Management using Solution Manager

    Hi,
    We are trying to use solution Manager for managing our IT projects - specifically for SAP projects.
    We browsed through the existing functionalities that are available and are not able to get info on the following:
    Project Admin tab for Roadmaps: In the transaction RMMAIN, we are able to define the scope of the activities at each phase, assign team members, and track the status of each activity/task.
    However, there is no tab for tracking the planned start/end dates and the actual start/end dates; the same goes for efforts.
    So I am not sure how to track these.
    Integration of Roadmaps and Blueprints: Once we have defined the project and assigned the Roadmap, we have all the activities that need to be tracked. In addition, in transactions SOLAR01/SOLAR02 we can define the scope and configurations, which are themselves a kind of sub-project embedded within the Blueprint and Realization phases, respectively, of the main project plan.
    How do we link the Blueprint-specific plan and the Realization-related plan to the main project plan created from the chosen Roadmap, and track them as one single project?
    Synchronization with MS Project: I see only the option to download the plan as an .mpp file. Is there an option to upload an .mpp as well, so that the project can be tracked using MS Project and synchronized with Solution Manager at frequent intervals?
    Your answers would definitely be helping us a lot and would be greatly appreciated.
    thanks in advance.
    regards,
    suresh velan

    Hi Suresh,
    Solution Manager is a project management tool, but it is not for project planning. If you want to do project planning also from the SolMan system, then you have to enable the cProjects component of SolMan.
    Synchronization between MS Project and cProjects is possible, but synchronization between SolMan's Roadmap/Blueprint and MS Project is not.
    In the Roadmap, you can capture and track the status of all phase-wise activities; capturing date details is not possible here.
    In Blueprint/Configuration you can capture and keep track of each individual object's blueprinting and configuration activities; here you can capture the planned/actual start/end dates for each object.
    As I said earlier, Solution Manager provides a lot of project management features, but project planning is not possible; for that you have to start using the cProjects component in SolMan.
    Hope it helps.
    best regds,
    Alagammai.
