Performance In Simple Scenarios

I have done some performance testing to see whether asynchronous triggers perform any better than synchronous triggers in a simple audit scenario -- capturing record snapshots on insert, update, and delete events into a separate database within the same instance of SQL Server.
Synchronous triggers performed 50% better than asynchronous triggers. This was with conversation reuse and with receive-queue activation turned off, so the poor performance was just in the act of forming and sending the message, not receiving and processing it.  This was not necessarily surprising to me, and yet I have to wonder under what conditions we would see real performance benefits for audit scenarios.
I am interested whether anyone has done similar testing, and whether they got similar or different results.  If anyone has found conditions under which asynchronous triggers pulled ahead for audit scenarios, I would really like to hear about them.  I welcome any comments or suggestions for better performance.
The asynchronous trigger:
ALTER TRIGGER TR_CUSTOMER_INSERT ON DBO.CUSTOMER
FOR INSERT AS
BEGIN
  DECLARE
    @CONVERSATION UNIQUEIDENTIFIER ,
    @MESSAGE XML ,
    @LOG_OPERATION CHAR(1) ,
    @LOG_USER VARCHAR(35) ,
    @LOG_DATE DATETIME;
  -- Reuse an already-open dialog.  With no WHERE or ORDER BY this picks an
  -- arbitrary endpoint, so it assumes a single pre-established conversation.
  SELECT TOP(1)
    @CONVERSATION = CONVERSATION_HANDLE ,
    @LOG_OPERATION = 'I' ,
    @LOG_USER = USER_NAME() ,  -- USER() is not valid T-SQL; use USER or USER_NAME()
    @LOG_DATE = GETDATE()
  FROM SYS.CONVERSATION_ENDPOINTS;
  -- One XML message carries every row of the (possibly multi-row) insert.
  SET @MESSAGE =
  ( SELECT
      CUST_ID = NEW.CUST_ID ,
      CUST_DESCR = NEW.CUST_DESCR ,
      CUST_ADDRESS = NEW.CUST_ADDRESS ,
      LOG_OPERATION = @LOG_OPERATION ,
      LOG_USER = @LOG_USER ,
      LOG_DATE = @LOG_DATE
    FROM INSERTED NEW
    FOR XML AUTO );
  -- A zero-row insert still fires the trigger; skip the SEND in that case.
  IF @MESSAGE IS NOT NULL
    SEND ON CONVERSATION @CONVERSATION
      MESSAGE TYPE CUSTOMER_LOG_MESSAGE ( @MESSAGE );
END;
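For context, the trigger above assumes a dialog has already been opened so it can be reused on every fire. A minimal one-time setup might look like the following sketch; apart from CUSTOMER_LOG_MESSAGE, all object names are assumptions, not taken from the original post.

```sql
-- One-time Service Broker setup assumed by the trigger above
-- (queue/service/contract names are illustrative).
CREATE MESSAGE TYPE CUSTOMER_LOG_MESSAGE VALIDATION = WELL_FORMED_XML;
CREATE CONTRACT CUSTOMER_LOG_CONTRACT
  ( CUSTOMER_LOG_MESSAGE SENT BY INITIATOR );

CREATE QUEUE CUSTOMER_LOG_INITIATOR_QUEUE;
CREATE QUEUE CUSTOMER_LOG_TARGET_QUEUE;

CREATE SERVICE CUSTOMER_LOG_SEND_SERVICE
  ON QUEUE CUSTOMER_LOG_INITIATOR_QUEUE;
CREATE SERVICE CUSTOMER_LOG_AUDIT_SERVICE
  ON QUEUE CUSTOMER_LOG_TARGET_QUEUE ( CUSTOMER_LOG_CONTRACT );

-- Open one long-lived dialog up front; the trigger then reuses it
-- via its SELECT TOP(1) against sys.conversation_endpoints.
DECLARE @handle UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @handle
  FROM SERVICE CUSTOMER_LOG_SEND_SERVICE
  TO SERVICE 'CUSTOMER_LOG_AUDIT_SERVICE'
  ON CONTRACT CUSTOMER_LOG_CONTRACT
  WITH ENCRYPTION = OFF;
```

Reusing one dialog avoids the cost of BEGIN DIALOG/END CONVERSATION on every trigger invocation, which is why the test isolates the SEND itself.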
The synchronous trigger:
ALTER TRIGGER TR_CUSTOMER_INSERT ON DBO.CUSTOMER
FOR INSERT AS
BEGIN
  DECLARE
    @LOG_OPERATION CHAR(1) ,
    @LOG_USER VARCHAR(35) ,  -- widened from VARCHAR(15) to match the async version
    @LOG_DATE DATETIME;
  SELECT
    @LOG_OPERATION = 'I' ,
    @LOG_USER = USER_NAME() ,  -- USER() is not valid T-SQL; use USER or USER_NAME()
    @LOG_DATE = GETDATE();
  -- Single write: copy the inserted rows straight into the audit database.
  INSERT INTO SALES_LOG.DBO.CUSTOMER
  SELECT
    CUST_ID = NEW.CUST_ID ,
    CUST_DESCR = NEW.CUST_DESCR ,
    CUST_ADDRESS = NEW.CUST_ADDRESS ,
    LOG_OPERATION = @LOG_OPERATION ,
    LOG_USER = @LOG_USER ,
    LOG_DATE = @LOG_DATE
  FROM INSERTED NEW;
END;
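For completeness, the receiving side (deliberately disabled during the test) would be an activated procedure that RECEIVEs each message and performs the audit insert. A sketch under assumed object names and column types; the XPath matches the attribute-centric `<NEW .../>` elements that FOR XML AUTO produces from the `INSERTED NEW` alias:

```sql
-- Sketch of the target-side processing (names and column types assumed).
CREATE PROCEDURE DBO.PROCESS_CUSTOMER_LOG AS
BEGIN
  DECLARE @MESSAGE XML;
  WHILE (1 = 1)
  BEGIN
    -- Each RECEIVE is the "delete" write counted in the reply below.
    WAITFOR
      ( RECEIVE TOP(1) @MESSAGE = CAST(MESSAGE_BODY AS XML)
        FROM CUSTOMER_LOG_TARGET_QUEUE ), TIMEOUT 1000;
    IF @@ROWCOUNT = 0 BREAK;
    -- Shred the XML back into rows and do the audit insert.
    INSERT INTO SALES_LOG.DBO.CUSTOMER
      ( CUST_ID, CUST_DESCR, CUST_ADDRESS, LOG_OPERATION, LOG_USER, LOG_DATE )
    SELECT
      NEW.value('@CUST_ID'      , 'INT')          ,
      NEW.value('@CUST_DESCR'   , 'VARCHAR(100)') ,
      NEW.value('@CUST_ADDRESS' , 'VARCHAR(100)') ,
      NEW.value('@LOG_OPERATION', 'CHAR(1)')      ,
      NEW.value('@LOG_USER'     , 'VARCHAR(35)')  ,
      NEW.value('@LOG_DATE'     , 'DATETIME')
    FROM @MESSAGE.nodes('/NEW') AS T(NEW);
  END
END;
GO
-- Re-enable activation once the procedure exists.
ALTER QUEUE CUSTOMER_LOG_TARGET_QUEUE
  WITH ACTIVATION
  ( STATUS = ON ,
    PROCEDURE_NAME = DBO.PROCESS_CUSTOMER_LOG ,
    MAX_QUEUE_READERS = 2 ,
    EXECUTE AS OWNER );
```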

Synchronous audit has to do one database write (one insert). Asynchronous audit has to do at least an insert and an update (the SEND), plus a delete (the RECEIVE) and an insert (the audit itself), so that is four database writes. If the destination audit service is remote, then the sys.transmission_queue operations have to be added (one insert and one delete). So clearly there is no way asynchronous audit can be on par with synchronous audit; there are at least three more writes to complete. And that neglects all the reads (like looking up the conversation handle) and all the marshaling/unmarshaling of the message (usually some fairly expensive XML processing).
Within one database, the asynchronous pattern is appealing when the trigger processing is expensive (so that the extra cost of going async is negligible) and reducing the original call's response time is important. It can also help if the audit operations create high contention and deferring the audit reduces it. More esoteric reasons arise when asynchronous processing is desired for architectural reasons, such as the possibility of adding a workflow triggered by the original operation and changing that workflow on the fly without impact or downtime (e.g. more consumers of the async message are added, or the message is shredded and dispatched to more processing apps, triggering more messages downstream).
If the audit spans different databases, even within the same instance, then the problem of availability arises (the audit table/database may be down for intervals, blocking the original operations/application).
If the audit is remote (different SQL Server instances), then using Service Broker solves the most difficult problem (communication) in addition to asynchronicity and availability; in that case the synchronous pattern (e.g. using a linked server) is really a bad choice.
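To make the remote case concrete, the initiator instance needs a Service Broker endpoint and a route to the remote audit service before messages leave sys.transmission_queue. A hypothetical sketch; the server name, port, service name, and authentication choice are all assumptions:

```sql
-- On the initiator instance: open a Service Broker endpoint
-- (port and authentication mode are illustrative).
CREATE ENDPOINT BrokerEndpoint
  STATE = STARTED
  AS TCP ( LISTENER_PORT = 4022 )
  FOR SERVICE_BROKER ( AUTHENTICATION = WINDOWS );

-- Tell the local broker where the remote audit service lives.
CREATE ROUTE CUSTOMER_LOG_ROUTE
  WITH SERVICE_NAME = 'CUSTOMER_LOG_AUDIT_SERVICE' ,
       ADDRESS = 'TCP://audit-server:4022';
```

A matching endpoint and a return route are needed on the remote instance as well, which is exactly the communication plumbing a linked-server approach would force you to handle synchronously.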
