Logical target not mapped to physical system

Hi Experts,
We are getting an issue when accessing web services; the trace is below. I am not able to figure out the cause, because I have already tested in NetWeaver Administrator (Configuration → Destinations): every web service destination we created there tests successfully. Kindly shed some light on this matter.
com.sap.tc.webdynpro.model.webservice.exception.WSModelRuntimeException: Exception on creation of service metadata for WS metadata destination 'WS_METADATA_DEST_CONF' and WS interface 'PRConfig_1Vi_Document'. One possible reason is that the metadata destination 'WS_METADATA_DEST_CONF' has not been properly configured; check configuration.
     at com.sap.tc.webdynpro.model.webservice.metadata.WSModelInfo.getOrCreateWsrService(WSModelInfo.java:440)
     at com.sap.tc.webdynpro.model.webservice.metadata.WSModelInfo.readOperationsFromWSDL(WSModelInfo.java:372)
     at com.sap.tc.webdynpro.model.webservice.metadata.WSModelInfo.importMetadataInternal(WSModelInfo.java:342)
     at com.sap.tc.webdynpro.model.webservice.metadata.WSModelInfo.importMetadata(WSModelInfo.java:326)
     at com.sap.tc.webdynpro.model.webservice.metadata.WSModelInfo$Cache.getModelInfo(WSModelInfo.java:199)
     at com.sap.tc.webdynpro.model.webservice.metadata.WSModelInfo.getModelInfoFromCacheOrCreate(WSModelInfo.java:1035)
     at com.sap.tc.webdynpro.model.webservice.metadata.WSModelInfo.getModelInfoFromCacheOrCreate(WSModelInfo.java:248)
     at com.sap.tc.webdynpro.model.webservice.gci.WSTypedModel.<init>(WSTypedModel.java:41)
     at com.sap.srm.mdm.admin.pr.PRConfiguration.<init>(PRConfiguration.java:53)
     at com.sap.srm.mdm.pr.PRConfiguration.wdDoInit(PRConfiguration.java:121)
     at com.sap.srm.mdm.pr.wdp.InternalPRConfiguration.wdDoInit(InternalPRConfiguration.java:313)
     at com.sap.tc.webdynpro.progmodel.generation.DelegatingComponent.doInit(DelegatingComponent.java:108)
     at com.sap.tc.webdynpro.progmodel.controller.Controller.initController(Controller.java:215)
     at com.sap.tc.webdynpro.progmodel.controller.Controller.init(Controller.java:200)
     at com.sap.tc.webdynpro.clientserver.cal.ClientComponent.init(ClientComponent.java:430)
     at com.sap.tc.webdynpro.clientserver.cal.ClientApplication.init(ClientApplication.java:362)
     at com.sap.tc.webdynpro.clientserver.session.ApplicationSession.initApplication(ApplicationSession.java:783)
     at com.sap.tc.webdynpro.clientserver.session.ApplicationSession.doProcessing(ApplicationSession.java:303)
     at com.sap.tc.webdynpro.clientserver.session.ClientSession.doApplicationProcessingStandalone(ClientSession.java:741)
     at com.sap.tc.webdynpro.clientserver.session.ClientSession.doApplicationProcessing(ClientSession.java:694)
     at com.sap.tc.webdynpro.clientserver.session.ClientSession.doProcessing(ClientSession.java:253)
     at com.sap.tc.webdynpro.clientserver.session.RequestManager.doProcessing(RequestManager.java:149)
     at com.sap.tc.webdynpro.serverimpl.defaultimpl.DispatcherServlet.doContent(DispatcherServlet.java:62)
     at com.sap.tc.webdynpro.serverimpl.defaultimpl.DispatcherServlet.doGet(DispatcherServlet.java:46)
     at javax.servlet.http.HttpServlet.service(HttpServlet.java:740)
     at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
     at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.runServlet(HttpHandlerImpl.java:401)
     at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.handleRequest(HttpHandlerImpl.java:266)
     at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:386)
     at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:364)
     at com.sap.engine.services.httpserver.server.RequestAnalizer.invokeWebContainer(RequestAnalizer.java:1039)
     at com.sap.engine.services.httpserver.server.RequestAnalizer.handle(RequestAnalizer.java:265)
     at com.sap.engine.services.httpserver.server.Client.handle(Client.java:95)
     at com.sap.engine.services.httpserver.server.Processor.request(Processor.java:175)
     at com.sap.engine.core.service630.context.cluster.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java:33)
     at com.sap.engine.core.cluster.impl6.session.MessageRunner.run(MessageRunner.java:41)
     at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
     at java.security.AccessController.doPrivileged(AccessController.java:219)
     at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:104)
     at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:176)
Caused by: com.sap.engine.services.webservices.espbase.discovery.TargetConfigurationException: Logical Target sap.com/DynamicWSProxies/WS_METADATA_DEST_CONF not mapped to a physical system.
     at com.sap.engine.services.webservices.server.management.discovery.ServiceDiscoveryImpl.getWSDLUrl(ServiceDiscoveryImpl.java:73)
     at com.sap.engine.services.webservices.espbase.client.dynamic.GenericServiceFactory.createService(GenericServiceFactory.java:128)
     at com.sap.tc.webdynpro.model.webservice.metadata.WSModelInfo.getOrCreateWsrService(WSModelInfo.java:429)
     ... 39 more
Caused by: com.sap.engine.services.webservices.espbase.query.exceptions.TechnicalException: Logical Target sap.com/DynamicWSProxies/WS_METADATA_DEST_CONF not mapped to a physical system.
     at com.sap.engine.services.webservices.espbase.query.EngineWSFactoryImpl.getWSQuery(EngineWSFactoryImpl.java:115)
     at com.sap.engine.services.webservices.server.management.discovery.ServiceDiscoveryImpl.getWSDLUrl(ServiceDiscoveryImpl.java:64)
     ... 41 more
Caused by: com.sap.engine.services.webservices.espbase.discovery.TargetNotMappedException: Logical Target sap.com/DynamicWSProxies/WS_METADATA_DEST_CONF not mapped to a physical system.
     at com.sap.engine.services.webservices.server.management.discovery.DestinationsResolver.<init>(DestinationsResolver.java:100)
     at com.sap.engine.services.webservices.server.management.discovery.ServiceDiscoveryImpl.getEntityResolverForTarget(ServiceDiscoveryImpl.java:99)
     at com.sap.engine.services.webservices.espbase.query.EngineWSFactoryImpl.getWSQuery(EngineWSFactoryImpl.java:113)
     ... 42 more
Thanks,
sudhanshu

For your reference:
This is the only code so far in the implementation:
public void wdDoInit()
{
    //@@begin wdDoInit()
    //$$begin Service Controller(-608326537)
    Bp03 model = new Bp03();
    // wdContext.nodeRequest_BusinessPartnerBasicDataByIDQueryResponse_In().bind(new Request_BusinessPartnerBasicDataByIDQueryResponse_In(model));
    Request_BusinessPartnerBasicDataByIDQueryResponse_In req = new Request_BusinessPartnerBasicDataByIDQueryResponse_In(model);
    wdContext.currentIDElement().setBP_ID("0000000009");
    //$$end
    //@@end
}
Am I implementing the correct code?
Thanks.

Similar Messages

  • LSMW - Logical File Path not pointing to Physical Path

    Hi All,
I'm trying to upload some MIGO goods movement transactions through LSMW. I'm at step 7, specifying files, and am getting the error "No logical file path has been specified." I saw another thread which helped me, but I am still stuck after reading it.
    Re: Error in Creation of Purchase Requisition through LSMW
I have gone into transaction FILE and created the logical file name MM_GOODS_MOVEMENT, and set the physical file to the .lsmw.conv converted-data file as the above thread says. I also tried using the logical path LOCAL_TEMPORARY_FILES for the converted data, but I get an error telling me that the path does not point to a physical directory. I then tried creating my own file path, setting the path to my desktop where the source file is, and the same error occurred.
Does anyone know why it is not recognizing the physical path maintained in the logical path?
    Thanks,
    J

    Thanks for both replies,
I have full authorization across all transactions in the sandbox system I am working in, so that is not an issue. I filled the logical path field in the MM_GOODS_MOVEMENT logical file with LOCAL_TEMPORARY_FILES, as well as trying my own created file path, and I received the same error for each. Here is how the config currently stands:
    Logical File Path Definition:  MM_GOODS_MOVEMENT
    Syntax Group: WINDOWS NT
    Physical path: C:\Documents and Settings\jchanski\Desktop\<FILENAME>
    or,
    Logical File Path Definition:  LOCAL_TEMPORARY_FILES
    Syntax Group: WINDOWS NT
    Physical path: C:\temp\<FILENAME>
    Logical File Name Definition: MM_GOODS_MOVEMENT
    Physical file: MM_MM_MIGO_INVUPLOAD.lsmw.conv
    Data format: ASC
    Applicat.area: MM
    Logical path: LOCAL_TEMPORARY_FILES or MM_GOODS_MOVEMENT
Do you see any error with this, Brajvir? Thanks!
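Not SAP code, but a minimal plain-Java sketch of what the FILE configuration above is doing: the logical path is a template, and the reserved <FILENAME> placeholder is replaced by the physical file name at runtime. The directory part of the resolved path then has to exist on the machine that reads it, or you get exactly the "does not point to a physical directory" error. The class and method names here are made up for illustration.

```java
// Illustration only (not SAP code): how a logical file path template
// such as C:\temp\<FILENAME> resolves once the physical file name from
// the logical file definition is substituted in.
public class LogicalPathDemo {

    // Substitute the <FILENAME> placeholder in the path template.
    static String resolve(String pathTemplate, String physicalFile) {
        return pathTemplate.replace("<FILENAME>", physicalFile);
    }

    public static void main(String[] args) {
        String template = "C:\\temp\\<FILENAME>";
        String resolved = resolve(template, "MM_MM_MIGO_INVUPLOAD.lsmw.conv");
        // prints C:\temp\MM_MM_MIGO_INVUPLOAD.lsmw.conv
        System.out.println(resolved);
        // If C:\temp does not exist on the relevant host, the "path does
        // not point to a physical directory" check fails.
    }
}
```

Note the resolution happens on the application server for server-side files, which is why a path valid on your desktop PC can still fail the check.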

  • AC 5.3  Logical systems and multiple Physical systems

Hi all:
We have set up a logical system and then hooked up 2 physical systems to it. Based on the documentation from SAP, the rule set (and the way we set it up) is at the logical system level.
Physical system #1 is reporting the correct results; however, Physical system #2's results are not complete. My understanding is that both physical systems should be drawing from the rule set based in the logical system.
Results: Physical system #1 correctly identifies 10 risks; Physical system #2 incorrectly identifies only 5 of the expected 10 risks.
Role assignments to the user are exactly the same, and the authorizations are exactly the same. The User/Role/Profile Full Sync has been repeated as part of troubleshooting.
    Has anyone experienced this?  I'm looking for ideas/options on where to investigate why the results are different.
    Thanks
    Margaret

Hi:
Thanks for answering! I'm having trouble understanding the answers, sorry.
When you have a logical system, you upload the rules once to that system and generate them. Then when you set up a physical system and hook it up to the logical system (whether the 1st, 2nd, or 3rd physical system), they all use the same rule set. The whole concept, as I understand it, of using logical and physical systems and putting the rule set on the logical system is that you don't have to worry about one rule set taking precedence over another (i.e. we don't have multiple rule sets).
Where should I be checking that the risks are defined properly? The rule set is the one delivered by SAP. The functions and risks match what we had (plus more) in the CC 4.0 system.
The first physical system we set up (example: Development) works fine, and all regular SoD risks come up as expected. When we added another system (example: Test), the results of the SoD analysis were incomplete.
Does that help clarify? Perhaps I'm just missing something?
    Margaret
    Edited by: Margaret Sokolov on Apr 28, 2009 4:34 PM

  • Destination R/3 System not mapped error

    Hi,
I am getting the error "Destination R/3 System not mapped". I get this error when I try to extract data from a DataSource in transaction RSA3. I re-saved the DataSource (RSA6) and activated the extract structure (RSA2), but I still get the same error. Any help will be much appreciated.
    -Ravi.

    Hi Dennis,
Thanks for your response. No, I was not specifying a target system. Per your suggestion, I picked one of the systems from the list and still got the same message. I was thinking along the same lines, that an RFC may be missing, but I have this problem with only one of the DataSources. I am able to extract data from the other DataSources without issues, and I would think all the DataSources would fail if an RFC were missing.
Also, this system was back-copied from another system recently. I suspect something went wrong with this DataSource during the back-copy, but I am unable to pinpoint what that could be. Your help is much appreciated.
    Thanks,
    Ravi.

  • Error: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. --- System.NullReferenceException: Object reference not set to an instance of an object

    Hi,
(1) I am trying to get the data from a database table and export it to a text file.
(2) If the record set has data, it runs OK.
(3) When the record set does not have any data, it gives the error below:
Error: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.NullReferenceException: Object reference not set to an instance of an object
Could you please let me know why this is happening?
Any help would be appreciated.
    Thanks,
    SIV
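The usual fix in cases like this is to guard against an empty or null result set before dereferencing it. As an illustration only, in Java rather than the poster's .NET job, and with made-up method names, the pattern is:

```java
import java.util.List;

// Sketch (not the original .NET code): check for a null/empty result set
// before using it, so the no-rows case does not dereference a null
// object (the Java analogue of a NullReferenceException).
public class ExportDemo {

    // Join the rows for the text file; return "" when there is no data
    // instead of crashing on a null or empty result set.
    static String export(List<String> rows) {
        if (rows == null || rows.isEmpty()) {
            return ""; // nothing to write; caller can skip creating the file
        }
        return String.join("\n", rows);
    }

    public static void main(String[] args) {
        System.out.println(export(null).isEmpty());          // no crash on missing data
        System.out.println(export(List.of("row1", "row2"))); // normal case
    }
}
```

The same guard placed before the file-writing step in the original job should turn the exception into a clean "no data" path.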

    You might ask them over here.
    http://social.msdn.microsoft.com/Forums/vstudio/en-US/home?forum=vbgeneral%2Ccsharpgeneral%2Cvcgeneral&filter=alltypes&sort=lastpostdesc
    Regards, Dave Patrick ....
    Microsoft Certified Professional
    Microsoft MVP [Windows]
    Disclaimer: This posting is provided "AS IS" with no warranties or guarantees , and confers no rights.

  • Can one physical system have multiple logical systems? If yes, why?

Can one physical system have multiple logical systems? If yes, why?
    Regards
    sriman

A logical system is an ALIAS name for a physical location, so we may have more than one logical system name for a physical location.
Even though this is allowed, normally we don't do that.
You can do this in transaction FILE.
    Regards
    srikanth
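To illustrate srikanth's point, here is a toy Java sketch of the alias idea: several logical system names resolving to one physical location. The system names below are invented for the example.

```java
import java.util.HashMap;
import java.util.Map;

// Toy illustration of the alias concept: two logical system names
// (hypothetical) mapping to the same physical location.
public class AliasDemo {
    static final Map<String, String> LOGICAL_TO_PHYSICAL = new HashMap<>();
    static {
        LOGICAL_TO_PHYSICAL.put("DEVCLNT100", "hostA/sapdev"); // one physical system...
        LOGICAL_TO_PHYSICAL.put("DEVCLNT200", "hostA/sapdev"); // ...two logical names
    }

    static String physicalFor(String logicalSystem) {
        return LOGICAL_TO_PHYSICAL.get(logicalSystem);
    }

    public static void main(String[] args) {
        // Both aliases resolve to the same physical location.
        System.out.println(physicalFor("DEVCLNT100").equals(physicalFor("DEVCLNT200")));
    }
}
```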

  • I recently upgraded to a G5 and Logic Pro 8. My previous system was a Power Mac 7200 with Logic Platinum 3.5.3. I can not open the old song files unless they are first saved in Logic 6 or 7. Where do I get a dongle for Logic 7?

    I recently upgraded to a G5 and Logic Pro 8. My previous system was a Power Mac 7200 with Logic Platinum 3.5.3. I can not open the old song files unless they are first saved in Logic 6 or 7. Where do I get a dongle for Logic 7?

    eBay or Craigslist would be the only choice.

  • The source or target cluster ID is not present on the system!

    Hi,
During a system copy, we are facing an issue in the "Run Java Migration Toolkit" phase of SAPinst.
I found the exact error message sequence, which may help in understanding the problem:
==========================================
#Start preparing switch.
#The switch.properties could not be read. Reason: switch.properties (No such file or directory)
The source or target cluster ID is not present on the system! The current (source) cluster ID is 787075 and the new (target) cluster ID is 3137675
==========================
The last error message might be quite familiar to you, as it is a very common one; however, I just wanted to mention that SAP Note 966752 ("Java system copy problems with the Java Migration Toolkit") did not help me.
All the possible entries have been updated and retried, but with no success.
Please advise if you have faced a similar kind of problem.
    Thanks.
    Cheers !!!
    Ashish

    Hi,
The cluster ID is just a combination of the parameters below.
In our case, my source system (ABC) was refreshed from another system (XYZ) recently, so while installing the target system (DEF), I changed the source system details from ABC to XYZ in the file below and retried the SAPinst screen. The system copy then completed successfully.
Open the file <installation directory>/jmt/cluster_id_switch.properties and edit the lines
src.ci.sid=
src.ci.instance.number=
src.ci.instance.name=
src.ci.host=
If in your case the source system was not refreshed recently, you may try the functional host name or OS host name etc. for the above parameters.
If this does not work, check the details of SAP Note 966752 ("Java system copy problems with the Java Migration Toolkit"), which says almost the same thing, but I could not follow it, as the statements related to the box number are a bit confusing and contradictory.
    Cheers !!!
    Ashish
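As an illustration only (the values are placeholders, and this is not part of the SAPinst toolkit), the manual edit Ashish describes amounts to setting four keys in a standard Java properties file, which can be sketched with java.util.Properties:

```java
import java.io.IOException;
import java.io.StringWriter;
import java.util.Properties;

// Sketch of the cluster_id_switch.properties edit described above, done
// with java.util.Properties. SID/instance/host values are placeholders;
// in practice you edit <installation directory>/jmt/cluster_id_switch.properties.
public class SwitchPropsDemo {

    // Set the four source-system keys named in the reply.
    static Properties withSource(Properties p, String sid, String instNo,
                                 String instName, String host) {
        p.setProperty("src.ci.sid", sid);
        p.setProperty("src.ci.instance.number", instNo);
        p.setProperty("src.ci.instance.name", instName);
        p.setProperty("src.ci.host", host);
        return p;
    }

    public static void main(String[] args) throws IOException {
        Properties p = withSource(new Properties(), "XYZ", "00", "DVEBMGS00", "xyzhost");
        StringWriter w = new StringWriter();
        p.store(w, "cluster_id_switch.properties"); // key=value lines, like the real file
        System.out.println(w);
    }
}
```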

  • LSMW Logical file 'CS_BI_BOM_CREATE' is not assigned to physical file

    Hi all,
I am creating an LSMW tool for BOMs using the standard batch input program.
On the "Specify Files" push button I am getting this error:
Logical file 'CS_BI_BOM_CREATE' is not assigned to physical file 'LSMW_C_MDM.119_MAT_BOM_001.lsmw.conv'
Message no. /SAPDMC/LSMW_OBJ_060045
Does anybody know how to proceed?
If I go to transaction FILE and set this up, I need a change request.
Is changing it in transaction FILE the right way to proceed?
Please tell me all possible options.

    Hi,
double-click on the row "Converted data" with file 'LSMW_C_MDM.119_MAT_BOM_001.lsmw.conv',
enter in the field Logical Path: LSMW_C_MDM.119_MAT_BOM_001.lsmw.conv,
enter in the field Logical File: LSMW_C_MDM.119_MAT_BOM_001.lsmw.conv,
and SAVE.
The second option is to go into transaction FILE and create the missing 'LSMW_C_MDM.119_MAT_BOM_001.lsmw.conv'.
    Regards Vassko!

  • Transport does not map business systems

    Hello,
    we have a problem when transporting XI configuration objects (via normal file export/import, not CMS). In the SLD, we have configured business systems
    T01CLNT010, belongs to group dev  and has transport target PROCLNT010
    T01CLNT020, belongs to group dev  and has transport target PROCLNT020
The strange thing is that the first business system is not converted correctly to the production business system after transporting the XI configuration from XI dev to XI prod. For the second business system, that works fine. I compared the entries in the SLD; both look fine. But one does not work, while the other does.
For the one that does not work, the transport runs fine and does not show any error, but it just does not convert the name (which of course leads to an error when trying to activate the objects, because T01CLNT010 does not exist in XI prod). So it seems it cannot find the SLD transport target for that object.
Does anyone know how exactly XI finds out how to convert the business system while transporting it to production? How does it find the connection between the XI business system object and the corresponding SLD entry? Just by name? Or is some kind of internal GUID used which may have been broken? Can I see that anywhere?
Is it safe to re-associate the SLD business system to the XI business system in the XI Integration Builder? Or are dependent objects (e.g. receiver determinations) then lost?
    CSY

    Does anyone know how XI exactly finds out how to convert the business system while transporting it to production ?
    The only thing that matters here is properly defining the Transport Targets.
    How does it find the connection between the XI business system object and the corresponding SLD entry?
Not sure about this, but I think the name in the transport target would suffice for this requirement. Also, make sure that the groups are properly defined and that the target SLD has all business systems properly maintained.
    Is it safe to re-associate the SLD business system to the XI business system in the XI integration builder ?
How will you be doing this re-association? If you are thinking of changing the business system name in the Integration Directory manually, then it would mean almost complete redevelopment in the ID.
    Regards,
    Prateek

  • HTTP URL parameters passed to the target side: how to handle them with a UDF when the source is not mapped to the target

Hi Gurus,
I have a requirement: HTTP (POST) → Proxy.
The POST sends the data in the URL. We have written a UDF to receive that data; we are not mapping the source structure to the target structure, and the URL data is displayed in the SOAP header only. How can the SOAP header data be stored in ECC via the proxy receiver?
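For reference, a UDF that pulls values out of URL parameters usually boils down to simple query-string splitting. This is a generic Java sketch with made-up parameter names, not the actual PI UDF signature (a real UDF would read the query string from the adapter's dynamic configuration / SOAP header rather than from an argument):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Generic sketch of UDF-style URL parameter handling: split a query
// string like "matnr=100&plant=1000" into name/value pairs. Parameter
// names are hypothetical; the string itself would come from the
// message's dynamic configuration in a real PI mapping.
public class QueryParamDemo {

    static Map<String, String> parse(String query) {
        Map<String, String> params = new LinkedHashMap<>();
        for (String pair : query.split("&")) {
            int eq = pair.indexOf('=');
            if (eq > 0) { // skip malformed pairs with no '=' or empty name
                params.put(pair.substring(0, eq), pair.substring(eq + 1));
            }
        }
        return params;
    }

    public static void main(String[] args) {
        Map<String, String> p = parse("matnr=100&plant=1000");
        System.out.println(p); // each value can then be mapped to a target field
    }
}
```

Once parsed, each value can be assigned to a field of the proxy request structure so it reaches ECC even though the source payload itself is not mapped.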

Thank you shahcsanjay for your reply,
but I think that web.showDocument cannot be used by an ordinary Java web application.
I think it can be used only with "Oracle Forms". Am I right?
If not, can you please tell me where I can find a useful document about how to use web.showDocument?
Thanks again,
Saleem

  • DG Observer triggering SIGSEGV Address not mapped to object errors in alert log

    Hi,
I've got a Data Guard configuration using two 11.2.0.3 single-instance databases. The configuration has been set up for automatic failover, and I have an observer running on a separate box.
This fast-start failover configuration has been in place for about a month, and in the last week numerous SIGSEGV (address not mapped to object) errors have been reported in the alert log. This is happening quite frequently (every 4-5 minutes or so).
    The corresponding trace files show the process triggering the error coming from the observer.
Has anyone experienced this problem? I'm at my wits' end trying to figure out how to fix the configuration to eliminate this error.
    I must also note that even though this error is occurring a lot, it doesn't seem to be affecting any of the database functionality.
    Help?
    Thanks in advance.
    Beth

Hi. The following is the alert log message, the trace file generated, and the current values of the Data Guard configuration. In addition, as part of my research, I attempted to apply patch 12615660, which did not take care of the issue. I also set the inbound_connection_timeout parameter to 0, and that didn't help either. I'm still researching, but any pointer in the right direction is very much appreciated.
    Error in Alert Log
    Thu Apr 09 10:28:59 2015
    Exception [type: SIGSEGV, Address not mapped to object] [ADDR:0x9] [PC:0x85CE503, nstimexp()+71] [flags: 0x0, count: 1]
    Errors in file /u01/app/oracle/diag/rdbms/<db_unq_name>/<SID>/trace/<SID>_ora_29902.trc  (incident=69298):
    ORA-07445: exception encountered: core dump [nstimexp()+71] [SIGSEGV] [ADDR:0x9] [PC:0x85CE503] [Address not mapped to object] []
    Use ADRCI or Support Workbench to package the incident.
    See Note 411.1 at My Oracle Support for error and packaging details.
    Thu Apr 09 10:29:02 2015
    Sweep [inc][69298]: completed
    Trace file:
    Trace file /u01/app/oracle/diag/rdbms/<db_unq_name>/<SID>/trace/<SID>_ora_29902.trc
    Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    With the Partitioning and Oracle Label Security options
    ORACLE_HOME = /u01/app/oracle/product/11.2.0.3/dbhome_1
    System name:    Linux
    Node name:      <host name>
    Release:        2.6.32-431.17.1.el6.x86_64
    Version:        #1 SMP Wed May 7 14:14:17 CDT 2014
    Machine:        x86_64
    Instance name: <SID>
    Redo thread mounted by this instance: 1
    Oracle process number: 19
    Unix process pid: 29902, image: oracle@<host name>
    *** 2015-04-09 10:28:59.966
    *** SESSION ID:(416.127) 2015-04-09 10:28:59.966
    *** CLIENT ID:() 2015-04-09 10:28:59.966
    *** SERVICE NAME:(<db_unq_name>) 2015-04-09 10:28:59.966
    *** MODULE NAME:(dgmgrl@<observer host> (TNS V1-V3)) 2015-04-09 10:28:59.966
    *** ACTION NAME:() 2015-04-09 10:28:59.966
    Exception [type: SIGSEGV, Address not mapped to object] [ADDR:0x9] [PC:0x85CE503, nstimexp()+71] [flags: 0x0, count: 1]
    DDE: Problem Key 'ORA 7445 [nstimexp()+71]' was flood controlled (0x6) (incident: 69298)
    ORA-07445: exception encountered: core dump [nstimexp()+71] [SIGSEGV] [ADDR:0x9] [PC:0x85CE503] [Address not mapped to object] []
    ssexhd: crashing the process...
    Shadow_Core_Dump = PARTIAL
    ksdbgcra: writing core file to directory '/u01/app/oracle/diag/rdbms/<db_unq_name>/<SID>/cdump'
    Data Guard Configuration
    DGMGRL> show configuration verbose;
    Configuration - dg_config
      Protection Mode: MaxPerformance
      Databases:
        dbprim - Primary database
        dbstby - (*) Physical standby database
      (*) Fast-Start Failover target
      Properties:
        FastStartFailoverThreshold      = '30'
        OperationTimeout                = '30'
        FastStartFailoverLagLimit       = '180'
        CommunicationTimeout            = '180'
        FastStartFailoverAutoReinstate  = 'TRUE'
        FastStartFailoverPmyShutdown    = 'TRUE'
        BystandersFollowRoleChange      = 'ALL'
    Fast-Start Failover: ENABLED
      Threshold:        30 seconds
      Target:           dbstby
      Observer:         observer_host
      Lag Limit:        180 seconds
      Shutdown Primary: TRUE
      Auto-reinstate:   TRUE
    Configuration Status:
    SUCCESS
    DGMGRL> show database verbose dbprim
    Database - dbprim
      Role:            PRIMARY
      Intended State:  TRANSPORT-ON
      Instance(s):
        DG_CONFIG
      Properties:
        DGConnectIdentifier             = 'dbprim'
        ObserverConnectIdentifier       = ''
        LogXptMode                      = 'ASYNC'
        DelayMins                       = '0'
        Binding                         = 'optional'
        MaxFailure                      = '0'
        MaxConnections                  = '1'
        ReopenSecs                      = '300'
        NetTimeout                      = '30'
        RedoCompression                 = 'DISABLE'
        LogShipping                     = 'ON'
        PreferredApplyInstance          = ''
        ApplyInstanceTimeout            = '0'
        ApplyParallel                   = 'AUTO'
        StandbyFileManagement           = 'MANUAL'
        ArchiveLagTarget                = '0'
        LogArchiveMaxProcesses          = '4'
        LogArchiveMinSucceedDest        = '1'
        DbFileNameConvert               = ''
        LogFileNameConvert              = ''
        FastStartFailoverTarget         = 'dbstby'
        InconsistentProperties          = '(monitor)'
        InconsistentLogXptProps         = '(monitor)'
        SendQEntries                    = '(monitor)'
        LogXptStatus                    = '(monitor)'
        RecvQEntries                    = '(monitor)'
    SidName                         = '<sid>'
        StaticConnectIdentifier         = '(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=<db host name>)(PORT=1521))(CONNECT_DATA=(SERVICE_NAME=<service_name>)(INSTANCE_NAME=<sid>)(SERVER=DEDICATED)))'
        StandbyArchiveLocation          = 'USE_DB_RECOVERY_FILE_DEST'
        AlternateLocation               = ''
        LogArchiveTrace                 = '0'
        LogArchiveFormat                = '%t_%s_%r.dbf'
        TopWaitEvents                   = '(monitor)'
    Database Status:
    SUCCESS
    DGMGRL> show database verbose dbstby
    Database - dbstby
      Role:            PHYSICAL STANDBY
      Intended State:  APPLY-ON
      Transport Lag:   0 seconds
      Apply Lag:       0 seconds
      Real Time Query: ON
      Instance(s):
        DG_CONFIG
      Properties:
        DGConnectIdentifier             = 'dbstby'
        ObserverConnectIdentifier       = ''
        LogXptMode                      = 'ASYNC'
        DelayMins                       = '0'
        Binding                         = 'optional'
        MaxFailure                      = '0'
        MaxConnections                  = '1'
        ReopenSecs                      = '300'
        NetTimeout                      = '30'
        RedoCompression                 = 'DISABLE'
        LogShipping                     = 'ON'
        PreferredApplyInstance          = ''
        ApplyInstanceTimeout            = '0'
        ApplyParallel                   = 'AUTO'
        StandbyFileManagement           = 'AUTO'
        ArchiveLagTarget                = '0'
        LogArchiveMaxProcesses          = '4'
        LogArchiveMinSucceedDest        = '1'
        DbFileNameConvert               = ''
        LogFileNameConvert              = ''
        FastStartFailoverTarget         = 'dbprim'
        InconsistentProperties          = '(monitor)'
        InconsistentLogXptProps         = '(monitor)'
        SendQEntries                    = '(monitor)'
        LogXptStatus                    = '(monitor)'
        RecvQEntries                    = '(monitor)'
    SidName                         = '<sid>'
        StaticConnectIdentifier         = '(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=<db host name>)(PORT=1521))(CONNECT_DATA=(SERVICE_NAME=<service_name>)(INSTANCE_NAME=<sid>)(SERVER=DEDICATED)))'
        StandbyArchiveLocation          = 'USE_DB_RECOVERY_FILE_DEST'
        AlternateLocation               = ''
        LogArchiveTrace                 = '0'
        LogArchiveFormat                = '%t_%s_%r.dbf'
        TopWaitEvents                   = '(monitor)'
    Database Status:
    SUCCESS

  • Use BDLS in BI for re-mapping of source system

    Hello,
Is it possible to use transaction code BDLS in BI in order to automatically change the source system assignment in source-system-related BI objects (such as InfoSources, InfoPackages, ...)?
We have created a new ECC QA client and would like to load data from it into BI (the same BI system as before), and wish to avoid the work of manually re-mapping all source-system-related objects.
In the thread below, the conclusion was that it is possible to use BDLS, but when executing it in test mode and looking at the affected tables, only the partner profiles and EDI-related tables seem to be modified with a new LSN (logical system name). However, I could not find a change in tables containing e.g. InfoSources.
Fresh source system clients in production BW system
Is there any way of doing this without manually re-mapping the BI objects?
    Thanks,
    F C
    Message was edited by:
            F C

    Thanks!
When running an LSN conversion in BI from source OLTP to target OLTP in test mode and then checking some BI object tables, I see that the logical system name is changed, so it seems to me as if the InfoSources are changed (or?).
However, perhaps a transport in "system copy" mode is preferable(?). In step 2, do you mean that I should first collect a system copy in a transport request and then collect "everything before" in the same transport request?
    Any comments on BDLS, Patrick?
    Regards,
    F C
    Message was edited by:
            F C

  • Mapping of Quality System (Source System) after the Transport

    Hi,
We have two quality source systems, 300 and 400, in sync with development, to extract the data. I am transporting the transfer rule mapping from development. After the transport to quality, the changes are reflected in 300, but I want to change it from 300 to 400, i.e. the transfer rule mapping should be reflected in 400. What transaction code is used to change this mapping in the BW system?
    Thanks for your time.
    Thanks & Regards,
    Raja

    Raja,
When you transport, you have a transport mapping in your dev system that tells BI which system connects to which system in the QA landscape. For instance,
FLAT_FILE in DEV could be Z_FLATFILE in QA, and the DataSources are mapped accordingly. Also, the transfer rules are source-system specific, which means that if you change the source system, the transfer rules will have to be recreated.
Instead, what I am asking you is to maintain the mappings for client 400 in DEV, then maintain the mappings as 400-400 in the QA system, and then transport the same. If your dev system is linked to QA300 and your mapping says QA400, the same will hold good. However, if you have QA300 in dev to be sent to QA as QA300 and then change the source system, it will not work, since this will be seen as a source system change, and you cannot change that. You could try BDLS and try changing the logical client name (I am not sure if it is BDLS), or change the RFC connection parameters in SM59, but I am not sure if that would work.

  • Converting note mapping for addictive drums to use midi kit

My MIDI kit is wrongly assigned for Addictive Drums. When I hit the hi-hat I get a crash, and so on.
    the manual says:
    If you want to use GM midi files, or connect an e-drum kit to AD, you need to convert the note mapping. In many hosts you can use a “drum map” to remap the notes from GM to AD. There are several drum maps available for free download in our User Area!
    Eh?
    Please help me.
    Timo

    Hi
    You have two possible solutions:
1) In the drum kit controller, remap the transmitted MIDI notes to suit.
2) In Logic, create a Mapped Instrument object in the Environment, and sort out the MIDI input/output note mapping to suit. Then either:
connect this object between the Physical Input and the Sequencer Input objects on the Clicks and Ports layer when you wish to record MIDI from your drum controller with AD (disconnect the Mapped Instrument when you have finished with the drums),
or
cable the Mapped Instrument object directly to the Instrument Channel Strip (Mixer layer) which has the AD plug-in. With the correct Arrange track selected, click on the Mapped Instrument object using the MIDI Thru tool to re-assign the Arrange track to the Mapped Instrument object.
    CCT
