Data Domain - Replication Acceleration vs Application Acceleration?

http://www.datadomain.com/pdf/DataDomain-Cisco-SolutionBrief.pdf
I recently read a Cisco solution brief describing the WAAS appliance's capability to apply additional deduplication to Data Domain replication traffic forwarded across the WAN.  After reading the article I spoke with my SE, who recommended a WAE-674 in Application Acceleration mode over a WAE-7341 in Replication Acceleration mode.  The article also states that Application Acceleration mode was used.
Why is Application Acceleration mode recommended over Replication Acceleration mode for this traffic?
I have a project requiring 15-20 GB/day of incremental backups to be replicated between 3 sites.  The sites are 90-110 ms apart and each site has roughly a 20 Mbps link.  Would a WAE-674 in Application Acceleration mode really make a difference for the Data Domain replication?

Is there any possibility you could dig up the configuration that was used on the WAAS Application Accelerator during the Data Domain testing?  I see the Data Domain replication service runs over TCP port 4126.  My SE recommends disabling the WAE-674's DRE function for the Data Domain traffic and simply relying on LZ and TFO.  How do you disable DRE, but still use LZ and TFO?
I see 3 common settings:
action optimize full                     -> LZ+TFO+DRE
action optimize DRE no compression none  -> TFO only
action pass-through                      -> bypass
Can you do LZ+TFO only?  None of the applications in the link below show this type of action setting.  This leads me to believe my SE was really suggesting turning off DRE completely on the WAAS.
This WAAS needs to optimize traffic for;
Lotus-Notes, HTTP, CIFS, Directory Services, RDP, and Data Domain
Can all of the applications above, plus Data Domain, be optimized by a pair of WAE-674s in Application Acceleration mode?
http://cisco.biz/en/US/docs/app_ntwk_services/waas/waas/v407/configuration/guide/apx_apps.html
policy-engine application
   name Data-Domain
   classifier Data-Domain
      match dst port eq 4126
   exit
   map basic
      name Data-Domain classifier Data-Domain action optimize DRE no compression LZ
exit

Similar Messages

  • Dc replication acceleration mode

    Hi all,
    I have a setup with 2 x WAE-7371 devices; my application is SnapMirror.
    I have 2 questions:
    1. Can I use Application Acceleration mode for that application, or do I need to use Replication mode? (I understand that I can use both, but which is more recommended?)
    2. For Replication mode, what software version is recommended? I know that 4.0.19 supported that feature but I cannot download it. Can I use 4.0.27? (I cannot find a configuration guide for that release.)
    Appreciate your help.

    Avi,
    1.  What are the WAN characteristics (bandwidth and RTT latency) in your environment?
    2.  The Replication Accelerator mode is supported in the 4.0.x versions of WAAS.  You can download and use the latest version from CCO.
    Regards,
    Zach

  • How to roll up data in bi accelerator

    Hi,
    Can anybody give me the steps to roll up data in the BI accelerator?
    Thanks
    Asim

    Hi,
    Aggregates are a collection of the most-used data. You can have a cube with 10 characteristics and key figures and define 5 aggregates on it that hold the most-used data sets, so the system can read them more efficiently while running a report.
    Updating the aggregates is called rolling up the aggregates.
    Refer to the link below.
    [BW agrgegates|http://help.sap.com/saphelp_bw30b/helpdata/en/7d/eb683cc5e8ca68e10000000a114084/frameset.htm]

  • Unable to create data-domain in Endeca 3.0v

    Hi,
    I have installed Endeca Server 3.0 in non-SSL mode. The verification URL http://localhost:7001/endeca-server/ws/manage?wsdl opens. I could log in to the WebLogic Administration Console and see that the oracle.endecaserver web application has a State of "Active" in the Administration Console. But when I try to create a data-domain from the command prompt I get the error: 'java' is not recognized as an internal or external command, operable program or batch file.
    May I know what is preventing me from creating the data-domain?
    Thanks,
    Anusha.R

    It sounds like you are running endeca-cmd[.sh] create-dd as a user who does not have the Java JDK on its PATH.
    To set this temporarily for the life of one shell you can follow the instructions in the JRockit Installation and Upgrade Guide, http://docs.oracle.com/cd/E15289_01/doc.40/e15065/post_install.htm#i1066387 . To make this permanent, for Linux you'll want to add this to your shell startup files (.bash_profile) or on Windows follow the instructions to add an environment variable.
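    On Linux, a minimal sketch of the temporary fix looks like this (the JAVA_HOME path below is an assumption; point it at wherever your JDK is actually installed):

    ```shell
    # Put the JDK's bin directory on PATH for the current shell only,
    # so endeca-cmd.sh can find the java binary.
    # JAVA_HOME below is an assumed location -- substitute your real JDK path.
    export JAVA_HOME=/usr/java/latest
    export PATH="$JAVA_HOME/bin:$PATH"
    ```

    Adding the same two lines to ~/.bash_profile makes the change permanent; on Windows the equivalent is adding the JDK's bin directory to the PATH environment variable in System Properties.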

  • Upgrading Endeca Server data domains from 3.0 to 3.1

    Please explain the instructions mentioned below (link for reference where it is mentioned):
    http://docs.oracle.com/cd/E40518_01/studio.310/studio_migration/toc.htm#Application%20data%20structure%20now%20uses%20data%20sets
    When you upgrade Endeca Server data domains that were ingested using Integrator ETL, you will need to ingest the records into a single data set (referred to as a collection in Endeca Server and Integrator ETL) that has its key set to "Base". The display name can then be something to represent the actual content of the records.
    Setting the data set key to "Base" allows any existing components that were tied to the Base view in 3.0 to be able to display the correct data.
    Which "key" are we referring to here, the collection key or the view key? Also, which Integrator ETL version are we referring to, is it 3.0?
    I tried exporting the data domain from Endeca 3.0 and importing into Endeca 3.1 with default settings, it did not work.
    Thanks in advance!

    Correct.
    Now, keep in mind that while this will get your data up to 3.1, it is very likely (assuming you have a production app) that you'll want to consider data model changes or other enhancements to take advantage of the new functionality.
    Also, there are no more child data sources, so if your application used them, you'll want to consider separate datasets, not just a base dataset.

  • Is it possible to access Data Domain to retrieve records?

    Hello,
    I'm using OEID 3.1.
    Today I was asked whether it is possible to access the records stored in the Data Domain to retrieve them.
    Actually the question was whether there is a way to access the "database" of the Data Domain and retrieve the data from there.
    Is this possible? Is there a way to retrieve the data stored in the Data Domain?
    I can only think it would be possible to retrieve the data without the information about the columns, a simple retrieval by placing a results table and doing an export, but maybe there is a way to retrieve the data like a SQL export, including information about the metadata, etc.
    Regards and thanks for the help.

    Frederico,
    Are you trying to do it in Clover/Endeca Integrator, a portlet in Studio or your own application?
    The Endeca Server uses Web Services to serve requests for data, it's not as simple as say a REST service or writing a SQL Query (though you could make it that easy by writing a quick front-end).
    I think this question is not being answered because it's unclear what your end goal is and why you are trying to reach it.
    If you're just looking to "play around", download SOAP UI and play around with that.  It's a great tool, regardless of what you are trying to do as it will automatically interrogate a WSDL file (the definition of a Web Service) and generate sample requests for you.
    If you're looking to export records in Clover....you are (almost certainly) using web services already in your data loading graphs.  Check out how something like InitDataDomain uses the Manage web service as an example (or LoadAttributes uses sconfig) and then drop your own web service component on a graph, switch it over to use the Conversation service and try some things.
    If you have a custom portlet or front-end looking to use Endeca Server data, Web Services support lots of (nearly all?) coding languages, platforms, etc.  It's probably outside the scope of this forum to talk about consuming Web Services in your application, since that's something that thousands of applications do and is more of a Java/C#/your language of choice question.  For example:
    How to create and consume a simple Web Service using JAX WS » the Open Tutorials
    http://wiki.servicenow.com/index.php?title=Web_Services_C_Sharp_.NET_End_to_End_Tutorial
    Hope that helps.
    Patrick Rafferty
    http://branchbird.com

  • How to get data from a MySQL database being accessed by a PHP application, process the data locally in an Adobe AIR application, and finally commit the changes back to the MySQL database through the PHP application

    How do I get data from a MySQL database that is being accessed by a PHP application, process the data locally in an Adobe AIR application, and finally commit the changes back to the MySQL database through the PHP application?

    If the data is on a remote server (for example, PHP running on a web server, talking to a MySQL server) then you do this in an AIR application the same way you would do it with any Flex application (or ajax application, if you're building your AIR app in HTML/JS).
    That's a broad answer, but in fact there are lots of ways to communicate between Flex and PHP. The most common and best in most cases is to use AMFPHP (http://amfphp.org/) or the new ZEND AMF support in the Zend Framework.
    This page is a good starting point for learning about Flex and PHP communication:
    http://www.adobe.com/devnet/flex/flex_php.html
    Also, in Flash Builder 4 they've added a lot of remote-data-connection functionality, including a lot that's designed for PHP. Take a look at the Flash Builder 4 public beta for more on that: http://labs.adobe.com/technologies/flashbuilder4/

  • I want to upgrade my storage plan in icloud. Before that i want to know whether synchronization of data in my PC and data in one of the applications of ipad is possible through icloud or not

    I want to upgrade my iCloud storage plan. Before that, I want to know whether iCloud can synchronize the data on my PC with the data in one of the applications on my iPad, like "Phone Drive".

    The Photos app doesn't currently support subfolders, it only has the one level of folder/album. You will either need to change your folder structure on your computer to be just one level, see if there is a third-party photo app in the store that copes with subfolders, or just make do. You can try leaving feedback for Apple : http://www.apple.com/feedback/ipad.html

  • How to delete multiple data domains with single step ?

    How do you delete multiple data domains in a single step?

    You can go to your Endeca Server domain home, e.g. $WEBLOGIC-HOME$/user_projects/domains/endeca_server_domain/EndecaServer/bin, and run:
    [HOST]$ ./endeca-cmd.sh list-dd
    default is enabled.
    GettingStarted is enabled.
    endeca is enabled.
    BikeStoreTest is enabled.
    Create a new file from that output containing just the domains you want to delete, and then run a loop:
    [HOST]$ vi delete-dd.list
    default
    GettingStarted
    endeca
    BikeStoreTest
    [HOST]$ for i in $(cat delete-dd.list); do ./endeca-cmd.sh delete-dd "$i"; done
    Remember that this can not be undone, unless you have a backup.

  • Retrieve data from a non-peoplesoft application using HTTP Get

    I need to retrieve data from a non-PeopleSoft application. They want us to submit an HTTP GET request to their URL with a series of parameters. I am thinking about using the HTTP Target connector to accomplish this. Does anyone have sample PeopleCode?
    Currently we are on Tools 8.51.10.
    If there is a better way, please let me know.

    I have used HTTP GET to fetch an XML file from a government sanction list by hitting the URL http://www.treasury.gov/ofac/downloads/sdn.xml
    There is a delivered PS program that does this for vendor sanctions. I had to get the online setup correct by creating a new custom node with the HTTP Target Connector. The program name is BSP_IMPORT. The code below is responsible for calling the node and retrieving the data. Play around with it and see if you can get it to meet your needs.
    BSP_IMPORT_AET.BANKNODE.Value is just the custom external node that I created.
    PMT_FLAT_FILE_INBOUND is just a nonrowset-based message used for the web service call.
    Local TR:FileUtilities:FTP &oFTPUtil = create TR:FileUtilities:FTP();
    /* HTTP */
    /*******************************************************************************/
    Local Message &msgHTTP;
    Local Message &msgResult;
    &msgHTTP = CreateMessage(Message.PMT_FLAT_FILE_INBOUND);
    &oFTPUtil.PopulateFTPGetIBInfo(&msgHTTP, BSP_IMPORT_AET.BANKNODE.Value);
    &msgResult = %IntBroker.ConnectorRequest(&msgHTTP);
    /* check to see if the file is wrapped */
    &strAllLines = &msgResult.GenXMLString();
    &strAllLines = Substitute(&strAllLines, Char(26), " "); /* remove invalid characters */
    /*******************************************************************************/
    Edited by: Maher on Mar 20, 2012 3:28 PM

  • How to send data from a web dypro application using workflow

    Hi All,
    I am working on a Web Dynpro application where the user enters the header and item details for an FI document to be posted. Once the user enters the data, the workflow should start and send the data to the approver for approval. To start the workflow I am using the function module 'SAP_WAPI_START_WORKFLOW'; it works fine and generates a unique work item ID. My main concern now is how to send the data from the Web Dynpro application through the workflow. I have my data in three internal tables: 1. header, 2. G/L, and 3. currency, all captured from the Web Dynpro screen as entered by the user. Right now I have the following code in my application.
    METHOD execute_bapi_acc_document_post .
      DATA: return TYPE TABLE OF bapiret2.
      DATA: wa_return LIKE LINE OF return.
      DATA lo_bapi_acc_document_po TYPE REF TO if_wd_context_node.
      DATA lo_changing TYPE REF TO if_wd_context_node.
      DATA lo_accountgl TYPE REF TO if_wd_context_node.
      DATA lo_currencyamount TYPE REF TO if_wd_context_node.
      DATA lo_importing TYPE REF TO if_wd_context_node.
      DATA lo_documentheader TYPE REF TO if_wd_context_node.
      DATA lo_element TYPE REF TO if_wd_context_element.
      DATA lt_elements TYPE wdr_context_element_set.
      DATA ls_c_documentheader TYPE if_componentcontroller=>element_documentheader.
      DATA lt_c_accountgl TYPE if_componentcontroller=>elements_accountgl.
      DATA ls_c_accountgl LIKE LINE OF lt_c_accountgl.
      DATA lt_c_accountgl_cp TYPE if_componentcontroller=>elements_accountgl.
      DATA lt_c_currencyamount TYPE if_componentcontroller=>elements_currencyamount.
      DATA ls_c_currencyamount LIKE LINE OF lt_c_currencyamount.
      DATA lt_c_currencyamount_cp TYPE if_componentcontroller=>elements_currencyamount.
      DATA wa_c_currencyamount type bapiaccr09.
      CALL FUNCTION 'SAP_WAPI_START_WORKFLOW'
        EXPORTING
          task          = 'TSXXXXXXXXXX'
          user          = sy-uname
        IMPORTING
          return_code   = l_return_code
          workitem_id   = lv_wiid
        TABLES
*         input_container = lt_input_container
          message_lines = lt_message_lines
          agents        = ls_agents.
      lo_bapi_acc_document_po = wd_context->get_child_node( wd_this->wdctx_bapi_acc_document_po ).
      lo_changing = lo_bapi_acc_document_po->get_child_node( wd_this->wdctx_changing ).
      lo_accountgl = lo_changing->get_child_node( wd_this->wdctx_accountgl ).
      lo_currencyamount = lo_changing->get_child_node( wd_this->wdctx_currencyamount ).
      lo_importing = lo_bapi_acc_document_po->get_child_node( wd_this->wdctx_importing ).
      lo_documentheader = lo_importing->get_child_node( wd_this->wdctx_documentheader ).
      lo_element = lo_documentheader->get_element( ).
      lo_element->get_static_attributes(
        IMPORTING static_attributes = ls_c_documentheader ).
      lt_elements = lo_accountgl->get_elements( ).
      LOOP AT lt_elements[] INTO lo_element.
        lo_element->get_static_attributes( IMPORTING static_attributes = ls_c_accountgl ).
        INSERT ls_c_accountgl INTO TABLE lt_c_accountgl[].
      ENDLOOP.
      lt_c_accountgl_cp = lt_c_accountgl[].
      lt_elements = lo_currencyamount->get_elements( ).
      LOOP AT lt_elements[] INTO lo_element.
        lo_element->get_static_attributes( IMPORTING static_attributes = ls_c_currencyamount ).
        INSERT ls_c_currencyamount INTO TABLE lt_c_currencyamount[].
      ENDLOOP.
      lt_c_currencyamount_cp = lt_c_currencyamount[].
      READ TABLE lt_c_currencyamount INTO ls_c_currencyamount INDEX 2.
      ls_c_currencyamount-amt_doccur = ls_c_currencyamount-amt_doccur * '-1.0000'.
      MODIFY lt_c_currencyamount FROM ls_c_currencyamount INDEX 2.
      CALL FUNCTION 'BAPI_ACC_DOCUMENT_POST'
        EXPORTING
          documentheader = ls_c_documentheader
        TABLES
          accountgl      = lt_c_accountgl
          currencyamount = lt_c_currencyamount
          return         = return.
    ENDMETHOD.
    Please suggest.
    Thanks,
    Rajat
    I am not sure whether this falls under the Web Dynpro or the workflow threads, so I am posting it here as well.
    Edited by: rajatg on Jun 23, 2010 9:28 PM

    Dear Colleague,
    You have different method to send parameters to Workflow.
    Method 1: set a container element
    DEFINE SWC_SET_ELEMENT.
      CALL FUNCTION 'SWC_ELEMENT_SET'
        EXPORTING
          ELEMENT   = &2
          FIELD     = &3
        TABLES
          CONTAINER = &1
        EXCEPTIONS
          OTHERS    = 1.
    END-OF-DEFINITION.
    Set the data into Workflow container
        SWC_SET_ELEMENT IT_CONTAINER 'parameter1' lv_parameter1.
    Start the Workflow
        CALL FUNCTION 'EWW_WORKFLOW_START'
          EXPORTING
            X_TASK          = 'WS90000001'   " your wf
          IMPORTING
            Y_WORKFLOW_ID   = WF_ID " your workitem id
          TABLES
            X_CONTAINER     = IT_CONTAINER
          EXCEPTIONS
            INVALID_TASK    = 1
            NO_ACTIVE_PLVAR = 2
            START_FAILED    = 3
            GENERAL_ERROR   = 4
            OTHERS          = 5.
    Method 2: you can also add your parameters directly to a container:
      DATA: lt_simple_container TYPE TABLE OF swr_cont,
            ls_simple_container TYPE swr_cont.
      ls_simple_container-element = 'parameter1'.
      ls_simple_container-value = lv_parameter1.
      APPEND ls_simple_container TO lt_simple_container.
      CALL FUNCTION 'SAP_WAPI_WRITE_CONTAINER'
        EXPORTING
          workitem_id      = WF_ID " your workitem id
          do_commit        = 'X'
        TABLES
          simple_container = lt_simple_container.
    Bulent.

  • How to access internal table data from webdynpro to Flex application.

    Hi connoisseurs,
    The data transfer from ABAP Web Dynpro to a Flex island works well. I followed the example from Thomas Jung (by the way, as always, great work) and Karthikeyan Venkatesan (Infosys), but that example covers simple types only.
    There is no example with complex types like ArrayCollection that handles the transfer of data from Flex to Web Dynpro.
    I tried to pass an internal table value to a Flex datagrid, but it doesn't work.
    I would like to know:
    1. How to access internal table data from Web Dynpro in a Flex application.
    2. How to pass the internal table to a Flex datagrid.
    3. How to pass it dynamically in Adobe Flex.
    4. How does Flex receive the WD context data?
    5. How can we update the WD context with Flex data?
    Please give me a sample example and a step-by-step procedure.
    Regards,
    laxmikanth

    Hi Laxmikanth,
    Please refer this...
    Flash island: update complex type from flex
    Cheers..
    kris.

  • I have 3 macs. 2 are Lion, 1 I left Snow leopard so I can access data from my old Quicken Application that doesn't work with Lion. If I move to iCloud, will I no longer be able to access my MobileMe email on the Snow Leopard mac?

    I have 3 Macs. 2 run Lion, but 1 I left on Snow Leopard so I can access data from my old Quicken application that doesn't work with Lion. If I move to iCloud, will I no longer be able to access my MobileMe email on the Snow Leopard Mac?

    What version of Word do you have? The TS3938 alert sounds like it's from a PowerPC app, written for an old architecture that is no longer supported in Lion. If this is the case, your files are fine; you just need a newer version of Word that will run in Lion in order to open them. The newest version (2011) should be readily available anywhere, and has worked fine for me ever since I switched to Lion on release day.

  • How to pull data from a c/c++ Application

    I am working on a project to instrument a product which runs on Java as well as C/C++.
    Is there a way to pull data from a C/C++ application into an MBean?
    I am able to push the data through a custom TCP adapter, but I could not find much help on a pull model.
    I would also like some pointers on instrumentation techniques and patterns.
    Regards
    Chandramohan

    I'm not aware of any JMX out-of-the-box support for managing C++ applications with JMX or MBeans, but you can find some ideas on how this might be done in [this thread|http://forum.java.sun.com/thread.jspa?threadID=5240363] from [this very forum|http://forum.java.sun.com/forum.jspa?forumID=537&start=0].

  • Realtime data integration with Third party application

    A customer wants to send data to a third-party application when a user modifies some information on a business object (e.g. PO amount, employee name, etc.). How can you trap this change as close to real time as possible within SAP?

    In addition to Senthil's reply, it may also be possible to hook into a workflow event, provided such an event is raised.  Whilst I prefer the change pointer approach outlined by Senthil, it does enforce a delay between the "application event" itself and change pointer processing.
    To see if a workflow event is raised, in a non-production system, use transaction SWU8 to activate the workflow trace.  Next perform the update that you wish to trap.  Finally, use transaction SWU9 to display the workflow log.  Hopefully you'll see a workflow event raised for your update (for example a CHANGED event).
    This will allow you to perform either synchronous or asynchronous processing immediately (the norm is for workflow processing to be triggered immediately but asynchronously).
    It would be great if all SAP objects supported a common event model, with BADI's for standardised events for create, change, etc.  Maybe one day, but not today...
    Cheers,
    Scott
