Usage of CAF in a real-world scenario

Hi All,
I am trying to understand SAP NetWeaver CAF (Core and GP) from a technical perspective. All the marketing information that SAP provides does not give me enough detail to understand how it can solve real-world problems. I would appreciate it if any of you could explain in detail how NetWeaver CAF solves such problems, and what your solution would have been if NetWeaver CAF were not in place.
Please explain with a simple scenario.
Thanks & Regards,
Kavitha

Hi Kavitha,
Here you can find some materials concerning CAF and its technical side:
SAP Composite Application Framework - CAF Tutorial Center [original link is broken]
It is actually the CAF Tutorial Center and contains some documents as well, but I hope it helps you get to know better what CAF is about.
Kind Regards
Maria

Similar Messages

  • HR Interfaces - Real World Scenarios - Please help !

    Hello Folks,
    I would appreciate it if someone could give examples of real-world scenarios in which XI has been used for HCM (HR) interfaces, both from SAP to external systems and to legacy systems. I am not looking so much for a technical explanation but more for a high-level description of the systems involved and the types of data sent and received (payroll master data, etc.). I am really looking for real-world examples.
    For example: payroll master data from SAP HCM to a legacy system using such-and-such an IDoc, etc.
    If you all gurus can help, I would really appreciate it.
    Thanks.

    Hi,
    Check out this thread
    HR Implementation in XI landscape
    Thanks,
    Prakash

  • PI Implementation Examples - Real World Scenarios

    Hello friends,
    In the near future, I'm going to give a presentation to our customers on SAP PI. To convince them to use this product, I need examples from the real world that have already been implemented successfully.
    I have done a basic search but still don't have enough material on the topic; I don't know where to look, actually. Could you post any examples you have at hand? Thanks a lot.
    Regards,
    Gökhan

    Hi,
    Please find the links below:
    SAP NetWeaver Exchange Infrastructure Business to Business and Industry Standards Support (2004)
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/90052f25-bc11-2a10-ad97-8f73c999068e
    SAP Exchange Infrastructure 3.0: Simple Use Cases
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/20800429-d311-2a10-0da2-d1ee9f5ffd4f
    Exchange Infrastructure - Integrating Heterogeneous Systems with Ease
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/1ebea490-0201-0010-faad-a32dd753d009
    SAP Network Blog: Re-Usable frame work in XI
    /people/sravya.talanki2/blog/2006/01/10/re-usable-frame-work-in-xi
    SAP NetWeaver in the Real World, Part 1 - Overview
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/20456b29-bb11-2a10-b481-d283a0fce2d7
    SAP NetWeaver in the Real World, Part 3 - SAP Exchange Infrastructure
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3172d290-0201-0010-2b80-c59c8292dcc9
    SAP NetWeaver in the Real World, Part 3 - SAP Exchange Infrastructure
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/9ae9d490-0201-0010-108b-d20a71998852
    SAP NetWeaver in the Real World, Part 4 - SAP Business Intelligence
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/1f42d490-0201-0010-6d98-b18a00b57551
    Real World Composites: Working With Enterprise Services
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/d0988960-b1c1-2a10-18bd-dafad1412a10
    Thanks
    Swarup

  • Real world scenario JDBC-PROXY

    Hi All,
    I have created an interface between JDBC and proxy. I select some rows from a table and do some processing in the proxy (say, create a material). The scenario works fine. But when selecting from the table, the JDBC adapter updates a status field to prevent the same row from being processed again and again.
    Now suppose my R/3 server is down: the data read from the table is lost. How can I track this data?
    Thanks & Regards,
    Jai Shankar.

    Hi Jai,
    You are correct: a JDBC sender adapter cannot be configured synchronously.
    If you want to write an update back to the database only after successful execution, a BPM is the only way.
    Create the receiver JDBC adapter format as suggested in this link:
    http://help.sap.com/saphelp_nw04/helpdata/en/2e/96fd3f2d14e869e10000000a155106/content.htm
    One thing: don't put the UPDATE statement in the sender JDBC adapter; use it in the receiver JDBC adapter instead.
    The steps in the BPM would be:
    1. Receive -- from the sender JDBC adapter
    2. Send Synchronous -- the synchronous RFC call
    3. Send -- using the receiver JDBC adapter to insert/update the DB
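    To illustrate step 3: the receiver JDBC adapter expects a structured message that it converts into an SQL statement, in the document format described in the help link above. A minimal sketch of such an UPDATE message might look like the following (the namespace, table, and field names here are hypothetical placeholders, not from the original thread):

    ```xml
    <!-- Sketch of a receiver JDBC adapter message (MY_TABLE, STATUS, ROW_ID are placeholders) -->
    <ns:StatementMessage xmlns:ns="urn:example:jdbc">
      <Statement>
        <dbTableName action="UPDATE">
          <table>MY_TABLE</table>
          <access>
            <!-- columns to set -->
            <STATUS>PROCESSED</STATUS>
          </access>
          <key>
            <!-- WHERE condition -->
            <ROW_ID>4711</ROW_ID>
          </key>
        </dbTableName>
      </Statement>
    </ns:StatementMessage>
    ```

    This way the status field is only updated after the synchronous RFC call in step 2 has succeeded, so rows are not lost if R/3 is down.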
    Regards,
    Sridhar

  • Discuss SAP NetWeaver in the Real World Series

    Hello SAP NetWeaver Platform forum denizens! I, Jennifer Lankheim, am a senior editor of SDN.
    Our recently launched series, "SAP NetWeaver in the Real World" has generated some heated Letters to the Editor, as it were, so the SDN editorial team thought it would be a great idea to open a forum where SDN community members could discuss the series. For example, do you think there was a better technological approach for Iridium Motors? Do you agree or disagree with any statements made in the articles?
    We, the editors, would also love to hear what you think about the series in general - has it captured your interest? Would you like to see more SDN content like this in the future? Would you like to see REAL case studies, detailing REAL implementations? Let us know!
    Cheers,
    Jennifer Lankheim

    Sorry I've got it:
    1. NetWeaver in the Real World Overview:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/documents/a1-8-4/sap netweaver in the real world_overview.pdf
    2. SAP NetWeaver in the Real World Part II
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/documents/a1-8-4/sap netweaver in the real world  part ii.pdf
    3. SAP NetWeaver in the Real World, Part III SAP XI
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/documents/a1-8-4/sap netweaver in the real world part iii sap exchange infrastructure.pdf
    4.  NetWeaver in the Real World Part IV
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/documents/a1-8-4/sap netweaver in the real world part iv.pdf
    5. SAP NetWeaver in the Real World, Part V SAP Enterprise Portal
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/documents/a1-8-4/sap netweaver in the real world,%20Part%20V%20SAP%20Enterprise%20Portal.pdf
    6. Real World Scenarios of SAP XI
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/events/asug-tech-forum-04/real world scenarios of sap xi.pdf
    Here is the whole page with the links:
    https://www.sdn.sap.com/sdn/bestof2004.sdn?page=realworld.htm

  • RAISE EVENT usage in real world

    Hi,
    I wonder what the advantage of the RAISE EVENT statement is in contrast to a normal call to a class method.
    Given the following example code:
    METHOD pai_0100.
      CASE ok_code.
        WHEN 'START'.
          RAISE EVENT read_sflight_data_event EXPORTING i_carrid = i_carrid.
      ENDCASE.
    ENDMETHOD.
    This will raise the event read_sflight_data_event, which in turn calls the method read_sflight_data that was set as its handler in the class constructor:
    METHOD constructor.
      SET HANDLER me->read_sflight_data FOR me.
    ENDMETHOD.
    Reading data from database...
    METHOD read_sflight_data.
      IF NOT i_carrid IS INITIAL.
        SELECT * FROM sflight INTO TABLE gt_sflight
          WHERE carrid = i_carrid.
      ELSE.
        SELECT * FROM sflight INTO TABLE gt_sflight.
      ENDIF.
    ENDMETHOD.
    Of course I could have written, too:
    METHOD pai_0100.
      CASE ok_code.
        WHEN 'START'.
          me->read_sflight_data( i_carrid = i_carrid ).
      ENDCASE.
    ENDMETHOD.
    This yields exactly the same result, of course; so what is the advantage of events in the real world? When is it useful to raise an event instead of calling the handler method directly?
    Thanks for shedding some light on this topic.

    You don't have to look far to get real world SAP examples for event handling:
    - Workflow (e.g. trigger an event upon creation/change of a document)
    - ALV (e.g. react to user input like a double click)
    - Job control (e.g. fire a job upon a specific event)
    When you look at the examples it's obvious that the event producer doesn't necessarily know anything about any possible future event consumers: You can ignore events, have one or multiple listeners and the listeners are usually separate from the coding unit that raised the event. Note that it depends on the event handling framework how events are processed (e.g. asynchronous versus synchronous, sequential versus parallel). For class based events the event handler processing is synchronous and sequential; for multiple event handlers their execution sequence is based on their registration order (see [raise event|http://help.sap.com/abapdocu_70/en/ABAPRAISE_EVENT.htm]).
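    The decoupling described above can be sketched in a few lines. Here is a minimal, hypothetical ABAP example (all class and event names are made up for illustration) where a producer raises one event and two independent consumers react to it, without the producer knowing either of them:

    ```abap
    * Producer: raises an event and knows nothing about its listeners.
    CLASS lcl_producer DEFINITION.
      PUBLIC SECTION.
        EVENTS data_requested EXPORTING VALUE(i_carrid) TYPE s_carr_id.
        METHODS request IMPORTING i_carrid TYPE s_carr_id.
    ENDCLASS.

    CLASS lcl_producer IMPLEMENTATION.
      METHOD request.
        RAISE EVENT data_requested EXPORTING i_carrid = i_carrid.
      ENDMETHOD.
    ENDCLASS.

    * Two independent consumers; adding a second one needs no change to the producer.
    CLASS lcl_reader DEFINITION.
      PUBLIC SECTION.
        METHODS on_request FOR EVENT data_requested OF lcl_producer
          IMPORTING i_carrid.
    ENDCLASS.

    CLASS lcl_reader IMPLEMENTATION.
      METHOD on_request.
        " e.g. read sflight rows for i_carrid here
      ENDMETHOD.
    ENDCLASS.

    CLASS lcl_logger DEFINITION.
      PUBLIC SECTION.
        METHODS on_request FOR EVENT data_requested OF lcl_producer
          IMPORTING i_carrid.
    ENDCLASS.

    CLASS lcl_logger IMPLEMENTATION.
      METHOD on_request.
        " e.g. write an audit entry for i_carrid here
      ENDMETHOD.
    ENDCLASS.

    START-OF-SELECTION.
      DATA(lo_producer) = NEW lcl_producer( ).
      DATA(lo_reader)   = NEW lcl_reader( ).
      DATA(lo_logger)   = NEW lcl_logger( ).
      SET HANDLER lo_reader->on_request FOR lo_producer.
      SET HANDLER lo_logger->on_request FOR lo_producer.
      lo_producer->request( i_carrid = 'LH' ).  " both handlers run, in registration order
    ```

    In the direct-call version you would have to change lcl_producer every time a new consumer appears; with the event, consumers register themselves via SET HANDLER and the producer stays untouched.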
    In a simple example like the one you gave I'd say the event handling approach is possibly questionable (at least as long as all the event listeners are within the same class and there is clearly no need for other objects to listen to this event). Anyhow, the example is probably designed to show the gist of event handling, but sometimes examples from SAP look overly complicated because they seem to prefer the [SoC|http://en.wikipedia.org/wiki/Separation_of_concerns] design principle over [KISS|http://en.wikipedia.org/wiki/KISS_principle] (couldn't resist this little rant).
    Cheers, harald

  • Windows 10 Pro and Enterprise real world usage (for 7 and 8 too)

    Hi Gang, I've been given a small assignment for a potential Windows 10 deployment (sadly). They want a comparison of Windows 10 Pro against Enterprise. Obviously I can read tables and see that a feature such as DirectAccess is only available in Enterprise and not in Pro, and mention several other features, pricing, and so on. But what does this really mean in the real world? Do many companies actually make use of the features available in Windows 10 (not even just 10, really 7 onward), or is it simply "we can afford it, so we get it"? Of course, some features have obvious benefits (or are a necessity) from the Pro editions onward, simply for domain joining, BitLocker and so on; but have those on 7 or 8 stopped using certain features? Looking to deploy VMs to test several of these features. Thanks for reading my topic in advance, ...
    This topic first appeared in the Spiceworks Community

    This is from a blog, and I checked it out; it is in fact true. It may be old news, but I thought I would share it.
    http://lauren.vortex.com/archive/001116.html
    A couple of days ago I discussed a number of privacy and other concerns with Microsoft's new Windows 10, made available as a free upgrade for many existing MS users:
    Windows 10: A Potential Privacy Mess, and Worse:
    http://lauren.vortex.com/archive/001115.html
    The situation has only been getting worse since then. For example, it's been noted that the Win10 setup sequence is rigged to try to fool users into switching to an MS browser, irrespective of their browser settings before they started the upgrade:
    Mozilla isn't happy with Microsoft for changing how users change the default web browser in Windows 10: ...

  • Character Styles in the Real World

    Rick:
    Thanks for your efforts, and let me add my Amen to both
    subjects (on file locations and on Character styles).
    My real-world use of Character styles is a combination usage
    of Paragraph and Character styles for Notes: I have a Paragraph
    style called Note, which simply adds margins of .15in Left, 10pt
    Top, and 8pt Bottom. Within this paragraph style, multiple labels
    announce the type of Note with the use of Character styles
    NoteLabel (Navy), RecommendLabel (Teal), CAUTIONLabel (Purple), and
    WARNINGLabel (Red).
    This way, you can change the color of one or more labels
    without worrying about the paragraph settings (or vice versa).
    Also, when placing a Note inside a table cell (which might
    have limited horizontal space, especially with three or four
    columns), we still use the "Label" character styles but
    without the Notes paragraph style. This still sets off the
    text visually, without adding unnecessary extra vertical space.
    Thanks again, Rick!
    Leon

    I can tell you about two sites.
    1. A system which allocates and dispatches crews, trucks, backpack hoses, spare socks, etc to bushfires (wildfires to you). It operates between two Government departments here in Australia. Each of those despatchable items is a remote object and there have been up to 50,000 active in the system at a time during the hot summer months. This is a large and life-critical system.
    2. A monitoring system for cable TV channels. A piece of hardware produces a data stream representing things like channel utilization, error rates, delay, etc and this is multiplexed via RMI to a large number of operator consoles. Again this is a major and business-critical system.
    And of course every J2EE system in existence uses RMI internally, albeit almost entirely RMI/IIOP.

  • Real World Item Level Permission Performance?

    I am considering implementing item-level permissions on a list we use. I've seen all the articles online cautioning not to do this with lists of more than 1000 items, but they seem to have little detailed information about the actual impact and what causes the performance issues. Additionally, they seem to refer to document libraries more than lists. I'd like some feedback about what might occur if we were to use item-level security in our situation.
    Our situation is this: a SharePoint list of currently ~700 items, expected to grow by around 700 items per year. The list has about 75 fields. We have 8 Active Directory groups that have access to the list, based on company department. Each item in the list can apply to one or more departments. The groups represent around 100-150 unique users.
    We would like item-level security to be set via workflow, enabling particular groups to access an item based on their group membership. For example, if the list item is for the HR department, then the HR group has access. If the item is for IT, then the IT group has access (and HR doesn't).
    That's it. There would be no nesting of items with multiple permission levels, no user-level ACLs on the items, etc.
    Any thoughts about this configuration and the expected performance issues? Thanks for any feedback!

    Just an update for anyone who finds this thread:
    I converted our data into a test SharePoint list with 1500 rows. I then enabled full item-level security, with restrictions to hide data not created by the person.
    I then set individual permissions for each item that included 2-3 AD groups with different permissions--contribute, full ownership, etc, and 2-3 individuals with varying permissions. The individuals represented around 50 total people.
    After the permissions were set, I did a comparison of loading individual views and the full data set in Standard and Datasheet views, both as an administrator with full list access and as several of the individuals who only had access to their designated items (typically 75-100 items of the total list).
    The result was that I found no discernible difference in system performance at the user-interface level while loading list views after item-level security was configured in this way. I understand this will vary based upon hardware configuration and the exact permission configuration, but in our situation the impact of item-level security on a list of 1500 items had very little, if any, negative performance impact. Note that I didn't check performance at the database-server level, but I'm assuming the impact there was minimal since the front-end user experience was unaffected.
    I expect we'll put this solution into place and if we do I'll update this post when we have additional real-world usage information.

  • Real time scenarios where Web Dynpro is used

    Hi Experts,
    I am a beginner with Web Dynpro (ABAP); I have developed a couple of simple Web Dynpro applications for practice. I would like to know the following to get a better idea of the technology:
    1. Can anyone who has worked on Web Dynpro share the real-life applications you worked on? (I am not expecting any technical details; I want to understand the scenarios in the real world where this technology is used.)
    2. If someone can relate Web Dynpro to CRM/SRM, that would help me even more. (Once again, I need the business scenario.)
    Thanks in advance.
    Regards,
    Sudheer V

    Hi Sharath,
    Thanks for your response; it was really a useful link. But once again they all talk about technical details, whereas I was looking for business scenarios where Web Dynpro can be used effectively.
    While searching I found one such scenario, where Web Dynpro gets the quotation details from the CRM system and creates a sales order with reference to the quotation in SAP R/3.
    I was actually looking for scenarios like the above, with some technical details for implementing them.
    Please let me know if you know of other scenarios like that one. Thanks.
    Regards,
    Sudheer V

  • The real world battery life of a 2013 MacbookAir ?

    Here is my situation. I use my Air a lot in the field (photography, and a few other things). It is the model before the 2013.
    My question is, is the battery life of the 2013 model as good as it is said to be? I read reviews which say 7 hours or more of battery life, in real world conditions. This is a lot of time, a game changer.
    I am definitely considering buying the new model, but only if the battery life is significantly better (my current Air is only a year old). Can anyone who has used one of these fine machines give me their thoughts? Is the battery as good as claimed, and is its longevity OK? (I have read that the new batteries have a shorter life span.)
    Any advice appreciated.
    Ian

    Ian999
    is the battery life of the 2013 model as good as it is said to be?
    I get 12+ hours out of mine; what you are running is the main factor.
    Compared to my 2012 model, which I sold, the improvement is a bit stunning.
    Ian999
    I read reviews which say 7 hours or more of battery life
    Again, that totally depends on WHAT you are running; with videos, a good bit less.
    I've owned 3 Airs; the battery life on the new Haswell is incredible, far better than the last 2012 model, as the apple.com chart will indicate.
    You're not going to find another super-slim notebook/laptop with this power that does it all with this kind of battery life.
    Keep it plugged in when near a socket to keep the charge-cycle count down on your LiPo (lithium polymer) cells/battery, but don't leave it plugged in all the time. When it won't be used for several hours, turn it off.
    The best "tip": plug it in whenever you are near a socket (especially at home), since cycle count on the battery is like the miles that wear out a set of tires; again, though, don't leave it plugged in all or most of the time.
    http://www.apple.com/batteries/notebooks.html
    "Apple does not recommend leaving your portable plugged in all the time."
    While cycle count is commonly seen as the "miles" on the lithium-ion cells in your MacBook, which it is, the distinction is not a fine line at all, and it is a big misconception to simply "count charge cycles".
    *A person who has, for example, 300 charge cycles on their battery and recharges at, say, 50-60% remaining of a full charge has better battery usage and care than another person with 300 charge cycles who recharges at, say, 15% remaining.
    DoD (depth of discharge) matters far more to the wear and tear on your MacBook battery than any mere charge-cycle count. *There is no set amount of "mileage" or wear from a charge cycle, in general or in specific. As such, contrary to popular conception, counting cycles is not conclusive whatsoever; what matters is the amount of deep DoD averaged over the battery's use and charging conditions.
    (A very rough analogy would be 20,000 hard miles put on a car vs. 80,000 good miles.)
    *Contrary to some myths out there, there is protection circuitry in your Macbook and therefore you cannot overcharge it when plugged in and already fully charged
    *However if you don’t plan on using it for a few hours, turn it OFF (plugged in or otherwise) ..*You don’t want your Macbook both always plugged in AND in sleep mode       (When portable devices are charging and in the on or sleep position, the current that is drawn through the device is called the parasitic load and will alter the dynamics of charge cycle. Battery manufacturers advise against parasitic loading because it induces mini-cycles.)
    Keeping batteries connected to a charger ensures that periodic "top-ups" do very minor but continuous damage to individual cells, hence Apple's recommendation above: "Apple does not recommend leaving your portable plugged in all the time." This is because Li-ion degrades fastest at a high state of charge; it is also the reason new Apple notebooks ship with a 50% charge and not 100%.
    LiPo (lithium polymer, same as in your Macbook) batteries do not need conditioning. However...
    A lot of battery experts call the use of Lithium cells the "80% Rule" ...meaning use 80% of the charge or so, then recharge them for longer overall life.
    Never let your Macbook go into shutdown and safe mode from loss of power, you can corrupt files that way, and the batteries do not like it.
    The only quantified abuse seen in lithium cells is when the cells are often repeatedly drained very low; the key word being "often".
    The good news is that your MacBook has a safety circuit in place to ensure the battery doesn't drain too low before your MacBook auto powers off. The bad news: if you let that protection circuitry shut down your notebook at its bottom, and you refrain from charging it for a couple of days, the battery will SELF-DRAIN to zero (depending on climate and humidity), and nothing is worse for a low-discharged lithium battery than self-draining down to and sitting at 0.
    Contrary to what some might say, Lithium batteries have an "ideal" break in period. First ten cycles or so, don't discharge down past 40% of the battery's capacity. Same way you don’t take a new car out and speed and rev the engine hard first 100 or so miles.
    Proper treatment is still important. Just because LiPo batteries don't need conditioning in general does NOT mean they don't have an ideal use/recharge environment. Anything can be abused, even if it doesn't need conditioning.
    From Apple on batteries:
    http://support.apple.com/kb/HT1446
    Storing your MacBook
    If you are going to store your MacBook away for an extended period of time, keep it in a cool location (room temperature roughly 22° C or about 72° F). Make certain you have at least a 50% charge on the internal battery of your Macbook if you plan on storing it away for a few months; recharge your battery to 50% or so every six months roughly if being stored away. If you live in a humid environment, keep your Macbook stored in its zippered case to prevent infiltration of humidity on the internals of your Macbook which could lead to corrosion.

  • How do I send Firefox information about how it behaves in the real world? (The Telemetry feature)

    It's helpful for Mozilla's engineers to be able to measure how Firefox behaves in the real world. The Telemetry feature provides this capability by sending performance and usage info to us. As you use Firefox, Telemetry measures and collects non-personal information, such as memory consumption, responsiveness timing and feature usage. It then sends this information to Mozilla on a daily basis and we use it to make Firefox better for you.
    Telemetry is an opt-in feature. How do I turn it on?

    Hi Terrancecallins,
    I understand that you would like to opt-in for the Telemetry feature in your Firefox browser. Thank you for your interest in helping to make Firefox better!
    Here is the help article explaining how to turn on this feature:
    * [[Send performance data to Mozilla to help improve Firefox]]
    Please let us know if you have any other questions.
    Thanks,
    - Ralph

  • Real World X99 5960X 4.4ghz, 36MP files & Lightroom 5 - ~1.09sec JPG Export

    Hi All,
    I've been trawling the net for the longest time since the X99 was launched. There weren't really any real-world LR performance results I could find, so I am sharing this info with the community here.
    I am a professional photographer, formerly trained as a systems analyst, so I am keen on PC/Mac hardware. I was torn between a new Mac Pro and setting up a new PC. I did some research and ended up with a PC. Here are the specs for the new PC: Intel 5960X 4.4 GHz (8-core), Rampage V, MSI 980, Corsair H110, Corsair AX1200, Corsair Carbide 540, running older Intel SSDs. The old PC was an i7 960 at 3.9 GHz (4-core).
    I kept my old SSDs (they were still working) and transplanted them over to the new system. Hence, SSD speeds were the constant in the experiment.
    36MP file export to JPG using 3 process stacking method: ~1.09sec (about 100-160% faster than the 3.9ghz 4core)
    1:1 preview generation: ~1-2sec (maybe about 10-20% faster)
    Browsing 1:1 previews: <0.2sec (about 100% faster)
    Other stuff like sliders etc: About the same speed as the old machine 3.9ghz.
    The CPU was maxed out during the file export; all CPU threads were firing. However, 1:1 preview generation was, as expected, likely not optimised for multithreading. Browsing was surprisingly fast, likely due to the X99 architecture. That said, I was not very disappointed with the results, as I had read that LR isn't heavily multithreaded (I saw the CPU % usage on my old PC), but I found it interesting that most operations weren't affected much.
    I hope this short review will be helpful to folks in their search for 4/6/8 core machines.
    Best
    Wes

    Yep, useful and informative, Wes - thanks.

  • Real-world experience with Exchange 2010 SP3 RU5+ and Powershell 4?

    The supportability matrix for Exchange (http://technet.microsoft.com/en-us/library/ff728623(v=exchg.150).aspx) says Exchange 2010 SP3 RU5+ and PowerShell 4 are compatible. But there is very little actual discussion about how well that works.
    I use PowerShell extensively for mission-critical and somewhat complex processes, with Exchange 2010 on 2008 R2 and AD access/reads/updates.
    Can I get a summary of the caveats and benefits from someone who has actually done this in a real-world/production scenario (more than one server, managing from a separate non-Exchange server), and who has scripting experience with this configuration?
    Also, how has this affected EMC operations?
    As always, thank you in advance!

    I believe the matrix states that it's supported to install Exchange into an environment where __ version of WMF is present. Exchange 2010, launched from a Win 2012 server, reports version 2.0 when you call $host. For example, calling the ActiveDirectory module from the EMS on a Win 2012 server (PS 3.0) fails.
    I'll double check the extent of this scenario and get back to you.
    Mike Crowley | MVP
    My Blog --
    Planet Technologies

  • Making Effective Use of the Hybrid Cloud: Real-World Examples

    May 2015
    The Buzz from Microsoft Ignite 2015
    NetApp was in full force at the recent Microsoft Ignite show in Chicago, and it was clear that NetApp's approach to hybrid cloud and Data Fabric resonated with the crowd. NetApp solutions such as NetApp Private Storage for Cloud are solving real customer problems.
    Hot topics at the NetApp booth included:
    OnCommand® Shift. A revolutionary technology that allows you to move virtual machines back and forth between VMware and Hyper-V environments in minutes.
    Azure Site Recovery to NetApp Private Storage. Replicate on-premises SAN-based applications to NPS for disaster recovery in the Azure cloud.
    Check out the following blogs for more perspectives:
    Microsoft Ignite Sparks More Innovation from NetApp
    ASR Now Supports NetApp Private Storage for Microsoft Azure
    Four Ways Disaster Recovery is Simplified with Storage Management Standards
    Introducing OnCommand Shift
    SHIFT VMs between Hypervisors
    Infront Consulting + NetApp = Success
    Richard Treadway
    Senior Director of Cloud Marketing, NetApp
    Tom Shields
    Senior Manager, Cloud Service Provider Solution Marketing, NetApp
    Enterprises are increasingly turning to cloud to drive agility and closely align IT resources to business needs. New or short-term projects and unexpected spikes in demand can be satisfied quickly and elastically with cloud resources, spurring more creativity and productivity while reducing the waste associated with over- or under-provisioning.
    Figure 1) Cloud lets you closely align resources to demand.
    Source: NetApp, 2015
    While the benefits are attractive for many workloads, customer input suggests that even more can be achieved by moving beyond cloud silos and better managing data across cloud and on-premises infrastructure, with the ability to move data between clouds as needs and prices change. Hybrid cloud models are emerging where data can flow fluidly to the right location at the right time to optimize business outcomes while providing enhanced control and stewardship.
    These models fall into two general categories based on data location. In the first, data moves as needed between on-premises data centers and the cloud. In the second, data is located strategically near, but not in, the cloud.
    Let's look at what some customers are doing with hybrid cloud in the real world, their goals, and the outcomes.
    Data in the Cloud
    At NetApp, we see a variety of hybrid cloud deployments sharing data between on-premises data centers and the cloud, providing greater control and flexibility. These deployments utilize both cloud service providers (CSPs) and hyperscale public clouds such as Amazon Web Services (AWS).
    Use Case 1: Financial Services Company Partners with Verizon for Software-as-a-Service Colocation and Integrated Disaster Recovery in the Cloud
    For financial services company BlackLine, availability, security, and compliance with financial standards are paramount. But with the company growing at 50% per year, and periodic throughput and capacity bursts of up to 20 times baseline, the company knew it couldn't sustain its business model with on-premises IT alone.
    Stringent requirements often lead to innovation. BlackLine deployed its private cloud infrastructure at a Verizon colocation facility. The Verizon location gives them a data center that is purpose-built for security and compliance. It enables the company to retain full control over sensitive data while delivering the network speed and reliability it needs. The colocation facility gives Blackline access to Verizon cloud services with maximum bandwidth and minimum latency. The company currently uses Verizon Cloud for disaster recovery and backup. Verizon cloud services are built on NetApp® technology, so they work seamlessly with BlackLine's existing NetApp storage.
    To learn more about BlackLine's hybrid cloud deployment, read the executive summary and technical case study, or watch this customer video.
    Use Case 2: Private, Nonprofit University Eliminates Tape with Cloud-Integrated Storage
    A private university was just beginning its cloud initiative and wanted to eliminate tape—and offsite tape storage. The university had been using Data Domain as a backup target in its environment, but capacity and expense had become a significant issue, and it didn't provide a backup-to-cloud option.
    The director of Backup turned to a NetApp SteelStore cloud-integrated storage appliance to address the university's needs. A proof of concept showed that SteelStore™ was a perfect fit. The on-site appliance has built-in disk capacity to store the most recent backups, so the majority of restores still happen locally. Data is also replicated to AWS, providing cheap and deep storage for long-term retention. SteelStore features deduplication, compression, and encryption, so it makes efficient use of both storage capacity (in the appliance and in the cloud) and network bandwidth. Encryption keys are managed on-premises, ensuring that data in the cloud is secure.
    The university is already adding a second SteelStore appliance to support another location, and—recognizing which way the wind is blowing—the director of Backup has become the director of Backup and Cloud.
    Use Case 3: Consumer Finance Company Chooses Cloud ONTAP to Move Data Back On-Premises
    A leading provider of online payment services needed a way to move data generated by customer applications running in AWS to its on-premises data warehouse. NetApp Cloud ONTAP® running in AWS proved to be the least expensive way to accomplish this.
    Cloud ONTAP provides the full suite of NetApp enterprise data management tools for use with Amazon Elastic Block Storage, including storage efficiency, replication, and integrated data protection. Cloud ONTAP makes it simple to efficiently replicate the data from AWS to NetApp FAS storage in the company's own data centers. The company can now use existing extract, transform and load (ETL) tools for its data warehouse and run analytics on data generated in AWS.
    Regular replication not only facilitates analytics, it also ensures that a copy of important data is stored on-premises, protecting data from possible cloud outages. Read the success story to learn more.
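    The replication described above is typically driven by ONTAP's SnapMirror feature. As a rough sketch only (the SVM and volume names are hypothetical, and exact options vary by Data ONTAP release), a relationship from a Cloud ONTAP volume to an on-premises FAS volume might be set up like this:

    ```
    # Hypothetical names: cloud_svm:app_data is the Cloud ONTAP source volume,
    # dc_svm:app_data_dr is the destination volume on the on-premises FAS cluster.
    # Run on the destination cluster once cluster and SVM peering are in place.
    snapmirror create -source-path cloud_svm:app_data \
        -destination-path dc_svm:app_data_dr -type DP -schedule hourly

    # Perform the baseline transfer; subsequent scheduled updates are incremental.
    snapmirror initialize -destination-path dc_svm:app_data_dr
    ```

    Because the destination is a regular ONTAP volume, the on-premises ETL tools mentioned above can read the replicated copy directly once the transfer completes.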
    Data Near the Cloud
    For many organizations, deploying data near the hyperscale public cloud is a great choice because they can retain physical control of their data while taking advantage of elastic cloud compute resources on an as-needed basis. This hybrid cloud architecture can deliver better IOPS performance than native public cloud storage services, enterprise-class data management, and flexible access to multiple public cloud providers without moving data. Read the recent white paper from the Enterprise Strategy Group, “NetApp Multi-cloud Private Storage: Take Charge of Your Cloud Data,” to learn more about this approach.
    Use Case 1: Municipality Opts for Hybrid Cloud with NetApp Private Storage for AWS
    The IT budgets of many local governments are stretched tight, making it difficult to keep up with the growing expectations of citizens. One small municipality found itself in this exact situation, with aging infrastructure and a data center that not only was nearing capacity, but was also located in a flood plain.
    Rather than continue to invest in its own data center infrastructure, the municipality chose a hybrid cloud using NetApp Private Storage (NPS) for AWS. Because NPS stores personally identifiable information and data that's subject to strict privacy laws, the municipality needed to retain control of its data. NPS does just that, while opening the door to better citizen services, improving availability and data protection, and saving $250,000 in taxpayer dollars. Read the success story to find out more.
    Use Case 2: IT Consulting Firm Expands Business Model with NetApp Private Storage for Azure
    A Japanese IT consulting firm specializing in SAP recognized the hybrid cloud as a way to expand its service offerings and grow revenue. By choosing NetApp Private Storage for Microsoft Azure, the firm can now offer a cloud service with greater flexibility and control over data versus services that store data in the cloud.
    The new service is being rolled out first to support the development work of the firm's internal systems integration engineering teams, and will later provide SAP development and testing, and disaster recovery services for mid-market customers in financial services, retail, and pharmaceutical industries.
    Use Case 3: Financial Services Leader Partners with NetApp for Major Cloud Initiative
    In the heavily regulated financial services industry, the journey to cloud must be orchestrated to address security, data privacy, and compliance. A leading Australian company recognized that cloud would enable new business opportunities and convert capital expenditures to monthly operating costs. However, with nine million customers, the company must know exactly where its data is stored. Using native cloud storage is not an option for certain data, and regulations require that the company maintain a tertiary copy of data and retain the ability to restore data under any circumstances. The company also needed to vacate one of its disaster-recovery data centers by the end of 2014.
    To address these requirements, the company opted for NetApp Private Storage for Cloud. The firm placed NetApp storage systems in two separate locations: an Equinix cloud access facility and a Global Switch colocation facility both located in Sydney. This satisfies the requirement for three copies of critical data and allows them to take advantage of AWS EC2 compute instances as needed, with the option to use Microsoft Azure or IBM SoftLayer as an alternative to AWS without migrating data. For performance, the company extended its corporate network to the two facilities.
    The firm vacated the data center on schedule, a multimillion-dollar cost avoidance. Cloud services are being rolled out in three phases. In the first phase, NPS will provide disaster recovery for the company's 12,000 virtual desktops. In phase two, NPS will provide disaster recovery for enterprise-wide applications. In the final phase, the company will move all enterprise applications to NPS and AWS. NPS gives the company a proven methodology for moving production workloads to the cloud, enabling it to offer new services faster. Because the on-premises storage is the same as the cloud storage, making application architecture changes will also be faster and easier than it would be with other options. Read the success story to learn more.
    NetApp on NetApp: nCloud
    When NetApp IT needed to provide cloud services to its internal customers, the team naturally turned to NetApp hybrid cloud solutions, with a Data Fabric joining the pieces. The result is nCloud, a self-service portal that gives NetApp employees fast access to hybrid cloud resources. nCloud is architected using NetApp Private Storage for AWS, FlexPod®, clustered Data ONTAP, and other NetApp technologies. NetApp IT has documented details of its efforts to help other companies on the path to hybrid cloud. Check out the following links to learn more:
    Hybrid Cloud: Changing How We Deliver IT Services [blog and video]
    NetApp IT Approach to NetApp Private Storage and Amazon Web Services in Enterprise IT Environment [white paper]
    NetApp Reaches New Heights with Cloud [infographic]
    Cloud Decision Framework [slideshare]
    Hybrid Cloud Decision Framework [infographic]
    See other NetApp on NetApp resources.
    Data Fabric: NetApp Services for Hybrid Cloud
    As the examples in this article demonstrate, NetApp is developing solutions to help organizations of all sizes move beyond cloud silos and unlock the power of hybrid cloud. A Data Fabric enabled by NetApp helps you more easily move and manage data in and near the cloud; it's the common thread that makes the use cases in this article possible. Read Realize the Full Potential of Cloud with the Data Fabric to learn more about the Data Fabric and the NetApp technologies that make it possible.
    Richard Treadway is responsible for NetApp Hybrid Cloud solutions including SteelStore, Cloud ONTAP, NetApp Private Storage, StorageGRID Webscale, and OnCommand Insight. He has held executive roles in marketing and engineering at KnowNow, AvantGo, and BEA Systems, where he led efforts in developing the BEA WebLogic Portal.
    Tom Shields leads the Cloud Service Provider Solution Marketing group at NetApp, working with alliance partners and open source communities to design integrated solution stacks for CSPs. Tom designed and launched the marketing elements of the storage industry's first Cloud Service Provider Partner Program—growing it to 275 partners with a portfolio of more than 400 NetApp-based services.

    Dave:
    "David Scarani" <[email protected]> wrote in message
    news:3ecfc046$[email protected]..
    > I was looking for some real world "Best Practices" of deploying J2EE
    > applications into a Production Weblogic Environment.
    > We are new at deploying applications to J2EE application servers and are
    > currently debating 2 methods.
    > 1) Store all configuration (application as well as Domain configuration)
    > in properties files and use Ant to rebuild the domain every time the
    > application is deployed.
    I am just a WLS engineer, not a customer, so my opinions have in some
    regards little relative weight. However, I think you'll get more mileage
    out of creating your config.xml once, checking it into source control, and
    versioning it. I would imagine that application changes are more frequent
    than server/domain configuration, so it seems a little heavyweight to
    regenerate the entire configuration every time an application is
    deployed/redeployed. Either way, you should check out the wlconfig Ant task.
    Cheers
    mbg
    > 2) Have a production domain built one time, configured as required and
    > always up and available, then use Ant to deploy only the J2EE application
    > into the existing, running production domain.
    > I would be interested in hearing how people are doing this in their
    > production environments and any pros and cons of one way over the other.
    > Thanks.
    > Dave Scarani
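    For what it's worth, the deploy-only approach in option 2 maps naturally onto WebLogic's wldeploy Ant task. A minimal sketch, assuming weblogic.jar is on Ant's classpath and using placeholder server, credential, and archive names:

    ```xml
    <!-- build.xml fragment (illustrative): deploy an EAR into a running
         production domain without regenerating the domain configuration. -->
    <taskdef name="wldeploy"
             classname="weblogic.ant.taskdefs.management.WLDeploy"/>

    <target name="deploy.app">
      <wldeploy action="deploy"
                name="myApp"
                source="dist/myApp.ear"
                user="weblogic"
                password="${wls.password}"
                adminurl="t3://adminhost:7001"
                targets="myServer"/>
    </target>
    ```

    The same task supports undeploy and redeploy actions, so the domain is built once and only the application artifact changes per release.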
