Best Practices for highly dynamic features like Search

For a project I need to implement a "Search" component, which will most probably use the Lucene search that is built into CQ5. Since most of the other content on the site is cacheable and served from the dispatcher cache, my concern is the load such a dynamic feature will create on the publish instance.
What are the best practices to minimize the load on the publish instance in such a scenario?

One option is to have your search results loaded via AJAX rather than with a full page request. That way most of the page is served from the dispatcher cache, and only the AJAX request that returns the search results is dynamic.
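For example, a minimal client-side sketch of that approach could look like the following (the /bin/search.json endpoint, its query parameter, and the JSON shape are assumptions for illustration only, not a CQ5-defined API):

// Sketch only: the surrounding page stays cached in the dispatcher and just
// this request reaches the publish instance. Endpoint path, parameter name
// and result shape are hypothetical.
interface SearchHit {
  title: string;
  path: string;
}

async function runSearch(query: string): Promise<void> {
  const response = await fetch(`/bin/search.json?q=${encodeURIComponent(query)}`);
  if (!response.ok) {
    throw new Error(`Search request failed: ${response.status}`);
  }
  const hits: SearchHit[] = await response.json();
  const list = document.getElementById("search-results");
  if (!list) return;
  // Render the hits into the placeholder element on the cached page.
  list.innerHTML = hits
    .map((hit) => `<li><a href="${hit.path}.html">${hit.title}</a></li>`)
    .join("");
}

Depending on the dispatcher configuration, frequently repeated queries can also be made cacheable (for example by encoding the query in a selector or suffix rather than a query string), since the dispatcher does not cache URLs with query parameters by default.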

Similar Messages

  • Suggest the Best Practice for Procurement of Commodities like crude, copper

    Dear Gurus,
    Please suggest the best practice for the following business process.
    My client has a procurement need for a commodity whose price fluctuates, say crude oil. The prices change every day. The client would like to pay the vendor based on the price on the day of goods receipt, but it is possible that the price on the GR day is much higher. How can this kind of procurement be controlled? Presently I have activated the price variance through invoice posting, but this is not working.
    Can you suggest a best practice for the procurement of commodities?
    Thank you for your consistent support.
    Regards
    Vinod Kakade

    Hi Vinod,
    Would you know the price by the time the GR is to happen? In that case you can ask the vendor to send the confirmations just prior to the GR with the correct price.
    Please refer to this thread, though it is marked unanswered:
    Change in PO Price after goods receipts and goods issue
    Regards
    Shailesh

  • Best practice for highly available management / publishing servers

    I am testing a highly available App-V 5.0 environment, which will deploy App-V packages to a XenApp farm. I have two SQL 2012 servers configured as an availability group for the back end, and two publishing/management servers for the front end.
    What is the best practice for configuring the publishing/management servers for high availability? Should I configure them as an NLB cluster, which I have tested and does seem to work, or should I just use a GPO to configure the clients to use both publishing servers, which I have also tested and appears to work?
    Thanks,
    Patrick Sullivan

    In App-V 5.0 the Management and Publishing Servers are hosted in IIS, so use the same approach for HA as you would any web application.
    If NLB is all that's available to you, then use that; otherwise I would recommend a proper load balancing solution such as Citrix NetScaler or KEMP LoadManager.

  • Best Practice for saving all fields and searches in capital letters

    I want to save all fields on all my pages in CAPS, and also to search in CAPS: e.g., if the user enters search criteria in lowercase letters, it should automatically be converted to caps. What is the best practice for doing that?

    Hi,
    There are already many discussions on this in the forum; some of the links are:
    Uppercase
    How to convert user input in the page to upper case?
    Sireesha
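    As a generic illustration of the idea (a sketch in TypeScript, not tied to any particular UI framework, and with field names that are purely hypothetical), the conversion simply has to be applied both when a value is saved and when the search criteria are compared:

    // Sketch: normalize both stored values and search criteria to uppercase so
    // matching becomes effectively case-insensitive. Names are illustrative only.
    function toCaps(value: string): string {
      return value.trim().toUpperCase();
    }

    function matches(storedValue: string, criteria: string): boolean {
      // storedValue is assumed to have been saved via toCaps() already
      return storedValue.includes(toCaps(criteria));
    }

    const saved = toCaps("john smith");   // "JOHN SMITH"
    console.log(matches(saved, "John"));  // true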

  • Best practice for High availability design, HSRP

    Hi,
    I am planning to create High Availability for LAN to WAN connectivity.
    I would like to know your opinion about the best way to do this. I googled for a solution, but in my opinion I did not find the right answer.
    The situation:
    I have 2 3945E Routers and 2 3560 switches. The design that I am planning to implement is below.
    The main goal is to have a redundant connection no matter which one of the devices fails. For example, if R1 fails, R2 should become active; if SW1 fails, SW2 will take care of reachability, and vice versa. R1 should always be preferred, as long as its link to the ISP isn't down, because of its greater bandwidth. That is why I have drawn 2 connections to 2 separate switches: if SW1 fails, I will still have a connection to the WAN using the R1 router.
    The router interfaces should be configured with subinterfaces (preferred over secondary IP addresses on the interface), because more than 10 subnets will be assigned to the LAN segment. The routers have 4 Gi ports.
    HSRP must be enabled on the LAN side, because the PCs on the LAN must have a redundant default gateway.
    So, the question is - what is the best and preferred way to do this?
    In my opinion, I should use BVI to combine R1's 2 interfaces into one logical interface, and do the same for R2.
    Next, turn the router into an L3 switch using IRB and then configure HSRP.
    What would be your preferred way to do this?

    Hi Audrius,
    I would suggest you go with HSRP; GLBP is used where you want load balancing.
    I think the connectivity between your routers (3945) and switches (3560) is a Gigabit connection, which is high speed. So keep one physical link from each switch to each router and run HSRP on those router physical interfaces.
    In this way you will have high availability: if R1 fails, R2 will take over.
    Regarding the config, see below what I have for one of my customer DCs.
    ACTIVE:
    ! Track the WAN-facing interface; if its line protocol goes down, the
    ! decrements below lower this router's HSRP priority so the standby
    ! router can take over.
    track 1 interface GigabitEthernet0/0 line-protocol
    track 2 interface GigabitEthernet0/0 line-protocol
    interface GigabitEthernet0/1
     ip address 10.10.10.12 255.255.255.0
     ip nat inside
     ip virtual-reassembly
     duplex full
     speed 100
     standby use-bia scope interface
     ! Shared virtual IP that the LAN PCs use as their default gateway
     standby 0 ip 10.10.10.10
     ! Higher priority plus preempt makes this router the preferred gateway
     standby 0 priority 110
     standby 0 preempt
     standby 0 authentication peter2mo
     ! Each failed tracked object lowers the priority by 30 so the standby wins
     standby 0 track 1 decrement 30
     standby 0 track 2 decrement 30
    STANDBY:
    track 1 interface GigabitEthernet0/0 line-protocol
    interface GigabitEthernet0/1
     ip address 10.10.10.11 255.255.255.0
     ip nat inside
     ip virtual-reassembly
     duplex full
     speed 100
     standby use-bia scope interface
     standby 0 ip 10.10.10.10
     ! Lower priority: this router stays standby while the active one is healthy
     standby 0 priority 90
     standby 0 authentication peter2mo
     standby 0 track 1 decrement 30
    Regards,
    Naidu.

  • Best practice for a site with a lot of images?

    I am working on a site that will have over a hundred images, and I wanted to see what the best practice is for designing a site like this. Should I go with XML (please give examples or an explanation), a text file, or just loadMovie("image1project1.jpg", "bottomsec") with named external images that will stay the same? Any help with keeping this kind of site up to date is appreciated.
    Thanks,
    Randy

    OK, I am new, please be nice. I think I want to set it up like this:
    <project1>
    <section>Architecture</section>
    <name>New Building for CREATiVENESS</name>
    <comment>The major challenge to designing this new
    tower was the site constraints: a small 3-acre urban corner site.
    It is located adjacent to a community center to facilitate extended
    use in the evenings and weekends for the entire community.
    </comment>
    <thumb>thumbs/project1.jpg</thumb>
    <img1>images/project1img1.jpg</img1>
    <img2>images/project1img2.jpg</img2>
    <img3>images/project1img3.jpg</img3>
    <img4>images/project1img4.jpg</img4>
    </project1>
    <project2>
    <section>Interiors</section>
    <name>New Building for Me</name>
    <comment>The major challenge to designing this new
    tower was the site constraints: a small 3-acre urban corner site.
    It is located adjacent to a community center to facilitate extended
    use in the evenings and weekends for the entire community.
    </comment>
    <thumb>thumbs/project2.jpg</thumb>
    <img1>images/project2img1.jpg</img1>
    <img2>images/project2img2.jpg</img2>
    <img3>images/project2img3.jpg</img3>
    <img4>images/project2img4.jpg</img4>
    </project2>
    <project3>
    <section>Architecture</section>
    <name>New Building for You</name>
    <comment>The major challenge to designing this new
    tower was the site constraints: a small 3-acre urban corner site.
    It is located adjacent to a community center to facilitate extended
    use in the evenings and weekends for the entire community.
    </comment>
    <thumb>thumbs/project3.jpg</thumb>
    <img1>images/project3img1.jpg</img1>
    <img2>images/project3img2.jpg</img2>
    <img3>images/project3img3.jpg</img3>
    <img4>images/project3img4.jpg</img4>
    </project3>
    <project4>
    <section>Interiors</section>
    <name>New Building for that guy</name>
    <comment>The major challenge to designing this new
    tower was the site constraints: a small 3-acre urban corner site.
    It is located adjacent to a community center to facilitate extended
    use in the evenings and weekends for the entire community.
    </comment>
    <thumb>thumbs/project4.jpg</thumb>
    <img1>images/project4img1.jpg</img1>
    <img2>images/project4img2.jpg</img2>
    <img3>images/project4img3.jpg</img3>
    <img4>images/project4img4.jpg</img4>
    </project4>
    but I am not sure of the best way to run through it, to find whether an entry is in a given section so it can be put in the menu, and then to call the images and text once they are in a project area. I don't know if
    this.firstChild.nextSibling.childNodes[0].childNodes[2]
    is the best way to address things in the file. Any help is appreciated. Please let me know the best practices and the easiest way to work with a large XML file.
    Thanks,
    Randy
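    One way to approach the traversal described above, sketched here in TypeScript with DOMParser rather than ActionScript's XML object, is to wrap the projects in a single root element (for example <projects>, which the draft above still needs in order to be well-formed XML), read each project's fields by tag name instead of by child index, and group projects by their <section> value for the menu:

    // Sketch only: TypeScript/DOM instead of ActionScript, assuming a
    // hypothetical <projects> root wrapped around the <project*> nodes above.
    interface Project {
      section: string;
      name: string;
      comment: string;
      thumb: string;
      images: string[];
    }

    function parseProjects(xmlText: string): Project[] {
      const doc = new DOMParser().parseFromString(xmlText, "application/xml");
      const projects: Project[] = [];
      // Every direct child of the root is one <project*> node.
      for (const node of Array.from(doc.documentElement.children)) {
        const text = (tag: string) =>
          node.getElementsByTagName(tag)[0]?.textContent ?? "";
        projects.push({
          section: text("section"),
          name: text("name"),
          comment: text("comment").trim(),
          thumb: text("thumb"),
          // Collect img1..imgN by tag prefix rather than by fixed child index.
          images: Array.from(node.children)
            .filter((child) => child.tagName.startsWith("img"))
            .map((child) => child.textContent ?? ""),
        });
      }
      return projects;
    }

    // Build a section menu, then look up a project's images on demand.
    const inSection = (projects: Project[], section: string) =>
      projects.filter((p) => p.section === section);

    Reading nodes by tag name keeps the code working even if the order of children changes, which is the main weakness of index chains like this.firstChild.nextSibling.childNodes[0].childNodes[2].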

  • CRM 5.0 Best Practices for the CRM Service module urgently required

    Dear all,
    I have been searching the Internet for CRM 5.0 best practices for quite a long time, but could not find them anywhere.
    Currently SAP provides best practices only for the SAP CRM 2007 version.
    Since most of the configuration differs because of the WebClient interface, could you point me to a source from which I can get the CRM 5.0 best practices for the Service module?
    Your suggestions and help will be highly appreciated.
    Best regards
    Raghu ram


  • Best practice for an efficient search facility

    Good Morning,
    While in training, our trainer said that the most efficient setup for SharePoint Search would be to install the search facility on a separate (hardware) server.
    I am not sure how to go about this.
    Your advice and recommendation would be greatly appreciated.
    thanks a mil.
    NRH

    Hi,
    You can create a dedicated search server that hosts all search components (the query, index, and crawl roles) on one physical server.
    Here are some articles for your reference:
    Best practices for search in SharePoint Server 2010:
    http://technet.microsoft.com/en-us//library/cc850696(v=office.14).aspx
    Estimate performance and capacity requirements for SharePoint Server 2010 Search:
    http://technet.microsoft.com/en-us/library/gg750251(v=office.14).aspx
    Below is a similar post for your reference:
    http://social.technet.microsoft.com/Forums/en-US/be5fcccd-d4a3-449e-a945-542d6d917517/setting-up-dedicated-search-and-crawl-servers?forum=sharepointgeneralprevious
    Best regards
    Wendy Li
    TechNet Community Support

  • Best practices for the search service in a SharePoint farm

    Hi
    In a SharePoint web application, many BI dashboards are deployed, and we also plan to configure enterprise search for this application.
    In our SharePoint 2010 farm we have:
    2 application servers
    2 WFE servers
    One application server runs Central Administration plus the Web Analytics service and is itself a domain controller; the second application server runs only the Secure Store Service and the PerformancePoint Service.
    1 - If we run the Search Server service on the second application server, can it cause any issues for BI performance?
    2 - Is it best practice to run the PerformancePoint Service and the search service on one server?
    3 - Also, is it best practice to run the search service on an application server where other services are already running, given that we have only one SharePoint web application that needs to be crawled and indexed with the crawl schedule below?
    We only run a full crawl once per week and an incremental crawl at midnight daily.
    adil

    Hi adil,                      
    Based on your description, you want to know the best practices for search service in a SharePoint farm.
    Different farms have different search topologies; for the best search performance, I recommend that you follow the guidance for small, medium, and large farms.
    The first article below covers the guidance for the different farm sizes.
    The search service can run together with other services on the same server; if conditions permit and you want better performance for the search service and the other services, including BI performance, you can deploy the search service on a dedicated server.
    If conditions permit, I recommend combining a query component with a front-end web server, to avoid putting crawl components and query components on the same server.
    In your SharePoint farm, you can deploy the query components in a WFE server and the crawl components in an application server.
    The articles below describe the best practices for enterprise search.
    https://technet.microsoft.com/en-us/library/cc850696(v=office.14).aspx
    https://technet.microsoft.com/en-us/library/cc560988(v=office.14).aspx
    Best regards      
    Sara Fan
    TechNet Community Support

  • Best Practices for Handling queries for searching XML content

    Experts: we have a requirement to get a count, out of 4 million rows, of records where a specific XML tag's value begins with a given prefix. I have a text index created, but the query is extremely slow when I use the CONTAINS operator.
    select count(1)
    from employee
    where contains(doc, 'scott% INPATH (/root/element1/element2/element3/element4/element5)') > 0
    What is Oracle's best-practice recommendation for querying and indexing such searches?
    Thanks

    Can you provide a test case that shows the structure of the data and how you've generated the index? Otherwise, the generic advice is going to be "use prefix indexing".

  • Best Practices for Household iOS Devices/Apple IDs

    Greetings:
    I've been searching support for best practices for sharing primarily apps, music and video among multiple iOS devices/Apple IDs.  If there is a specific article, please point me to it.
    Here is my situation: 
    We currently have 3 iPads (2 kids', 1 dad's) in the household and one iTunes account on a Windows computer.  I previously had all iPads on a single Apple ID/credit card and controlled the kids' downloads through the Apple ID password, which I kept secret.  As the kids have grown older, I found myself constantly entering my password as they increased their interest in music/apps/video.  I liked this approach because all content was shared; I disliked it because I was constantly asked to enter the password for all downloads.
    So I recently set up individual accounts for them with the iTunes allowance feature, which allows them to download content on their own (I set restrictions on their iPads).  Now I have 3 Apple IDs under one household.
    My questions:
    With the 3 Apple IDs, what is the best way to share apps, music, and videos among myself and the kids?  Is it multiple accounts on the computer and some sort of sharing?
    Thanks in advance...

    Hi Bonesaw1962,
    We've had our staff and students run iOS updates OTA via Settings -> Software Update. In the past, we put a DNS block on Apple's update servers to prevent users from updating iOS (like last fall when iOS 7 was first released). By blocking mesu.apple.com, the iPads weren't able to check for or install any iOS software updates. We waited until iOS 7.0.3 was released before we removed the block on mesu.apple.com, at which point we told users that if they wanted to update to iOS 7 they could do so OTA. We used our MDM to run reports periodically to see how many people updated to iOS 7 and how many stayed on iOS 6. As time went on, just about everyone updated on their own.
    If you go this route (depending on the number of devices you have), you may want to take a look at Caching Server 2 to help with the network load https://www.apple.com/osx/server/features/#caching-server . From Apple's website, "When a user on your network downloads new software from Apple, a copy is automatically stored on your server. So the next time other users on your network update or download that same software, they actually access it from inside the network."
    I wish there was a way for MDMs to manage iOS updates, but unfortunately Apple hasn't made this feature available to MDM providers. I've given this feedback to our Apple SE, but haven't heard if it is being considered or not. Keeping fingers crossed.
    Hope this helps. Let us know what you decide on and keep us posted on the progress. Good luck!!
    ~Joe

  • Best Practices for Using Photoshop (and Computing in General)

    I've been seeing some threads that lead me to realize that not everyone knows the best practices for doing Photoshop work on a computer, or for conscientious computing in general.  I thought it might be a good idea for those of us with some experience to contribute and discuss best practices for making the Photoshop and computing experience more reliable and enjoyable.
    It'd be great if everyone would contribute their ideas, and especially their personal experience.
    Here are some of my thoughts on data integrity (this shouldn't be the only subject of this thread):
    Consider paying more for good hardware. Computers have almost become commodities, and price shopping abounds, but there are some areas where spending a few dollars more can be beneficial.  For example, the difference in price between a top-of-the-line high performance enterprise class hard drive and the cheapest model around with, say, a 1 TB capacity is less than a hundred bucks!  Disk drives do fail!  They're not all created equal.  What would it cost you in aggravation and time to lose your data?  Imagine it happening at the worst possible time, because that's exactly when failures occur.
    Use an Uninterruptable Power Supply (UPS).  Unexpected power outages are TERRIBLE for both computer software and hardware.  Lost files and burned out hardware are a possibility.  A UPS that will power the computer and monitor can be found at the local high tech store and doesn't cost much.  The modern ones will even communicate with the computer via USB to perform an orderly shutdown if the power failure goes on too long for the batteries to keep going.  Again, how much is it worth to you to have a computer outage and loss of data?
    Work locally, copy files elsewhere.  Photoshop likes to be run on files on the local hard drive(s).  If you are working in an environment where you have networking, rather than opening a file right off the network, then saving it back there, consider copying the file to your local hard drive then working on it there.  This way an unexpected network outage or error won't cause you to lose work.
    Never save over your original files.  You may have a library of original images you have captured with your camera or created.  Sometimes these are in formats that can be re-saved.  If you're going to work on one of those files (e.g., to prepare it for some use, such as printing), and it's a file type that can be overwritten (e.g., JPEG), as soon as you open the file save the document in another location, e.g., in Photoshop .psd format.
    Save your master files in several places.  While you are working in Photoshop, especially if you've done a lot of work on one document, remember to save your work regularly, and you may want to save it in several different places (or copy the file after you have saved it to a backup folder, or save it in a version management system).  Things can go wrong and it's nice to be able to go back to a prior saved version without losing too much work.
    Make Backups.  Back up your computer files, including your Photoshop work, ideally to external media.  Windows now ships with a quite good backup system, and external USB drives with surprisingly high capacity (e.g., Western Digital MyBook) are very inexpensive.  The external drives aren't that fast, but a backup you've set up to run late at night can finish by morning, and it will be there if/when you have a failure or loss of data.  And if you're really concerned with backup integrity, you can unplug an external drive and take it to another location.
    This stuff is kind of "motherhood and apple pie" but it's worth getting the word out I think.
    Your ideas?
    -Noel

    APC Back-UPS XS 1300.  $169.99 at Best Buy.
    Our power outages here are usually only a few seconds; this should give my server about 20 or 25 minutes run-time.
    I'm setting up the PowerChute software now to shut down the computer when 5 minutes of power is left.  The load with the monitor sleeping is 171 watts.
    This has surge protection and other nice features as well.
    -Noel

  • Best practice for a deployment (EAR containing WAR/EJB) in a production environment

    Hi there,
    I'm looking for some hints regarding best-practice deployment in a production environment (currently we are not using a WLS cluster).
    We are using Ant for building, packaging and (dynamic) deployment (via weblogic.Deployer) in the development environment, and in the meantime this works fine.
    From my point of view, I would prefer this kind of deployment not only for development but also for the production system.
    But I found some hints in some books, and these authors prefer static deployment for the production system.
    My question now:
    Could anybody provide me with some links to whitepapers regarding best practices for deployment into a production system?
    What is your experience with the new two-phase deployment introduced with WLS 7.0?
    Is it really a good idea to use static deployment? What is the advantage of this kind of deployment?
    Thanks in advance
    -Martin

    Hi Siva,
    What best practice are you looking for? If you can be more specific in your question, we can provide an appropriate response.
    From my Basis experience, here are some of the best practices:
    1) The production landscape should be highly available to the business. For this you may set up DR, HA, or both.
    2) It should have backups configured, for which restores have already been tested.
    3) It should have all monitoring set up, viz. application, OS and DB.
    4) The production client should not be modifiable.
    5) Users in the production landscape should have appropriate authorizations based on SoD. There should not be any SoD conflicts.
    6) Transports to production should be highly controlled. Any transport to production should be moved only with the appropriate Change Board approvals.
    7) Relevant database and OS security parameters should be tested before go-live and enabled.
    8) Pre-go-live and post-go-live checks should have been performed on the production system.
    9) EWA should be configured at least for the production system.
    10) Production system availability using DR should have been tested.
    Hope this helps.
    Regards,
    Deepak Kori

  • BI Best Practice for Chemical Industry

    Hello,
    I would like to know if anyone is aware of SAP BI Best Practices for Chemicals, and if so, can anyone please post a link as well.
    Thanks

    Hi Naser,
    The information below gives a detailed explanation regarding the chemical industry package....
    SAP Best Practices packages support best business practices that quickly turn your SAP ERP application into a valuable tool used by the entire business. You can evaluate and implement specific business processes quickly – without extensive customization of your SAP software. As a result, you realize the benefits with less effort and at a lower cost than ever before. This helps you improve operational efficiency while providing the flexibility you need to be successful in highly demanding markets. SAP Best Practices packages can benefit companies of all sizes, including global enterprises creating a corporate template for their subsidiaries.
    Extending beyond the boundaries of conventional corporate divisions and functions, the SAP Best Practices for Chemicals package is based on SAP ERP; the SAP Environment, Health & Safety (SAP EH&S) application; and the SAP Recipe Management application. The business processes supported by SAP Best Practices for Chemicals encompass a wide range of activities typically found in chemical industry practice:
    • Sales and marketing
    – Sales order processing
    – Presales and contracts
    – Sales and distribution (including returns, returnables, and rebates, with quality management)
    – Inter- and intracompany processes
    – Cross-company sales
    – Third-party processing
    – Samples processing
    – Foreign trade
    – Active-ingredient processing
    – Totes handling
    – Tank-trailer processing
    – Vendor-managed inventory
    – Consignment processing
    – Outbound logistics
    • Supply chain planning and execution
    – Supply and demand planning
    • Manufacturing planning and execution
    – Manufacturing execution (including quality management)
    – Subcontracting
    – Blending
    – Repackaging
    – Relabeling
    – Samples processing
    • Quality management and compliance
    – EH&S dangerous goods management
    – EH&S product safety
    – EH&S business compliance services
    – EH&S industrial hygiene and safety
    – EH&S waste management
    • Research and development
    – Transformation of general recipes
    • Supplier collaboration
    – Procurement of materials and services (including quality management)
    – Storage tank management
    – E-commerce (Chemical Industry Data Exchange)
    • Enterprise management and support
    – Plant maintenance
    – Investment management
    – Integration of the SAP NetWeaver Portal component
    • Profitability analysis
    More Details
    This section details the most common business scenarios – those that benefit most from the application of best practices.
    Sales and Marketing
    SAP Best Practices for Chemicals supports the following sales and marketing-related business processes:
    Sales order processing – In this scenario, SAP Best Practices for Chemicals supports order entry, delivery, and billing. Chemical industry functions include the following:
    • Triggering an available-to-promise (ATP) inventory check on bulk orders after sales order entry and automatically creating a filling order (note: an ATP check is triggered for packaged material)
    • Selecting batches according to customer requirements
    • Processing internal sales activities that involve different organizational units
    Third-party and additional internal processing – In this area, the SAP Best Practices for Chemicals package provides an additional batch production step that can be applied to products previously produced by either continuous or batch processing. The following example is based on further internal processing of plastic granules:
    • Purchase order creation, staging, execution, and completion
    • In-process and post-process control
    • Batch assignment from bulk to finished materials
    • Repackaging of bulk material
    SAP Best Practices for Chemicals features several tools that help you take advantage of chemical industry best practices. For example, it provides a fully documented and reusable prototype that you can turn into a productive solution quickly. It also provides a variety of tools, descriptions of business scenarios, and proven configuration of SAP software based on more than 35 years of working with the chemical industry.
    The package can also be used to support external toll processing such as that required for additional treatment or repackaging.
    Tank-trailer processing – In this scenario, SAP Best Practices for Chemicals helps handle the selling of bulk material, liquid or granular. It covers the process that automatically adjusts the differences between the original order quantities and the actual quantities filled in the truck. To determine the quantity actually filled, the tank trailer is weighed before and after loading. The delta weight – or quantity filled – is transmitted to the SAP software via an order confirmation. When the delivery for the sales order is created, the software automatically adjusts the order quantity with the confirmed filling quantity. The customer is invoiced for the precise quantity filled and delivered.
    Supply Chain Planning and Execution
    SAP Best Practices for Chemicals supports supply chain planning as well as supply chain execution processes:
    Supply and demand planning – Via the SAP Best Practices for Chemicals package, SAP enables complete support for commercial and supply-chain processes in the chemical industry, including support for integrated sales and operations planning, planning strategies for bulk material, and a variety of filling processes with corresponding packaging units. The package maps the entire supply chain – from sales planning to material requirements planning to transportation procurement.
    Supplier Collaboration
    In the procurement arena, best practices are most important in the following scenario:
    Procurement of materials and services:
    In this scenario, SAP Best Practices for Chemicals describes a range of purchasing processes, including the following:
    • Selection of delivery schedules by vendor
    • Interplant stock transfer orders
    • Quality inspections for raw materials, including sampling requests triggered by goods receipt
    Manufacturing Scenarios
    SAP Best Practices for Chemicals supports the following sales- and manufacturing-related business processes:
    Continuous production – In a continuous production scenario, SAP Best Practices for Chemicals typifies the practice used by basic or commodity chemical producers. For example, in the continuous production of plastic granules, production order processing is based on run-schedule headers. This best-practice package also describes batch and quality management in continuous production. Other processes it supports include handling of byproducts, co-products, and the blending process.
    Batch production – For batch production, SAP Best Practices for Chemicals typifies the best practice used by specialty chemical producers. The following example demonstrates batch production of paint, which includes the following business processes:
    • Process order creation, execution, and completion
    • In-process and post-process control
    • Paperless manufacturing using XML-based process integration sheets
    • Alerts and events
    • Batch derivation from bulk to finished materials
    Enterprise Management and Support
    SAP Best Practices for Chemicals also supports a range of scenarios in this area:
    Plant maintenance – SAP Best Practices for Chemicals allows for management of your technical systems. Once the assets are set up in the system, it focuses on preventive and emergency maintenance. Tools and information support the setup of a production plant with assets and buildings.
    Revenue and cost controlling – The package supports the functions that help you meet product-costing requirements in the industry. It describes how cost centers can be defined, attached to activity types, and then linked to logistics. It also supports costing and settlement of production orders for batch and continuous production. And it includes information and tools that help you analyze sales and actual costs in a margin contribution report.
    The SAP Best Practices for Chemicals package supports numerous integrated business processes typical of the chemical industry, including the following:
    • Quality management – Supports integration of quality management concepts across the entire supply chain (procurement, production, and sales), including batch recall and complaint handling
    • Batch management – Helps generate batches based on deliveries from vendors or because of company production or filling, with information and tools for total management of batch production and associated processes, including batch derivation, the batch information cockpit, and a batch where-used list
    • Warehouse management – Enables you to identify locations where materials or batch lots are stored, recording details such as bin location and other storage information on dangerous goods, to help capture all information needed to show compliance with legal requirements
    Regards
    Sudheer

  • Best Practices for zVM/SLES10/zDB2 environment for dialog instances.

    Hi,  I am a zSeries system programmer who has just completed an IBM led Proof of Concept which demonstrated the viability of running SAP instances on SUSE SLES10 Linux booted in zVM guests and accessing zDB2 data via hipersockets. Before we build a Linux infrastructure using the 62 IFLs we just procured, we are wondering if any best practices for this environment have been developed as an OSS note or something else by SAP.    Below you will find an email which was sent and responded to by IBM and Novell on these topics...
    "As you may know, Home Depot has embarked on an IBM led proof of concept using SUSE SLES10 running in zVM guests on IBM zSeries hardware to host SAP server instances.  The Home Depot IT organization is currently in the midst of a large scale push to modernize our merchandising and people systems on SAP platforms.  The zVM/SUSE/SAP POC is part of that effort, as is a parallel POC of an Intel Blade/Red Hat/SAP platform.  For our production financial systems we now use a pSeries/AIX/SAP platform.
          So far in the zVM/SUSE/SAP POC, we have been able to create four zVM LPARS on IBM z9 hardware, create twelve zVM guests on those LPARS, boot SLES10 in those guests, install and run SAP instances in those guests using hipersockets for access to our DB2 SAP databases running on zOS, and direct user workloads to the SAP instances with good results.  We have also successfully developed cloning scripts that have made it possible to create new SLES10 instances, configured and ready for SAP installs, in about 10 seconds using FLASHCOPY and IBM DASD.
          I am writing in the hope that you can direct us to technical resources at IBM/Novell/SAP who may be able to field a few questions that have arisen.  In our discussions about optimization of the zVM/SUSE/SAP platform, we wondered if any wisdom about the appropriateness of and support for using zVM capabilities to virtualize SAP has ever been developed or any best practices drafted.  Attached you will find an IBM Redbook and a PowerPoint presentation which describes the use of the zVM discontiguous shared segments and the zVM named saved system features for the sharing of reentrant code and other  elements of Linux and its applications, thereby conserving storage and disk resources allocated to guest machines.   The specific question of the hour is, can any SAP code be handled similarly?  Have specific SAP elements eligible for this treatment been identified? 
          I've searched the SUSE Knowledgebase for articles on this topic to no avail.  Any similar techniques that might help us reduce the total cost of ownership of a zVM/SUSE/SAP platform as we compare it to Intel Blade/Red Hat/SAP and pSeries/AIX/SAP platforms are of great interest as we approach the end of our POC.  Can you help?
          Greg McKelvey is a Client I/T Architect at IBM.  He found the attached IBM documents and could give a fuller account of our POC.  Pat Downs, IBM zSeries IT Architect, has also worked to guide our POC. Akshay Rao, IBM Systems IT Specialist - Linux | Virtualization | SOA, is acting as project manager for the POC.  Jim Hawkins is the Home Depot Architect directing the POC.  I've CC:ed their email addresses.  I am sure they would be pleased to hear from you if there are the likely questions about what the heck I am asking about here.  And while writing, I thought of yet another question that I hoping somebody at SAP might weigh in on; are there any performance or operational benefits to using Linux LVM to apportion disk to filesystems vs. using zVM to create appropriately sized minidisks for filesystems without LVM getting involved?"
    As you can see, implementation questions need to be resolved.  We have heard from Novell that the SLES10 Kernel and other SUSE artifacts can reside in memory and be shared by multiple operating system images.  Does SAP support this configuration?  Also, has SAP identified SAP components which are eligible for similar treatment?  We would like to make sure that any decisions we make about the SAP platforms we are building will be supportable.  Any help you can provide will be greatly appreciated.  I will supply the documents referenced above if they are not known to any answerer.  Thanks,  Al Brasher 770-433-8211 x11895 [email protected]

    Hello Al,
    First, let me welcome you on board; I am sure you won't be disappointed with your choice to run SAP on z/OS.
    As for your questions, it wasn't easy to find them in this long post, so I suggest you take the time to write a short summary that contains a very short list of questions.
    As for answers, here are a few useful sources of information:
    1. The SAP on DB2 for z/OS SDN page:
    SAP on DB2 for z/OS
    In it you can find 2 relevant docs:
    a. Best practices for ...
    b. Database administration for DB2 UDB for z/OS.
    This second publication is excellent; apart from DB2-specific info, it contains information on all the components of SAP on DB2 for z/OS, like zLinux, z/VM and so on.
    2. I can see that you are already familiar with the IBM Redbooks, but it seems that you haven't taken the time to get the most out of that resource. From your post it is clear that you have found one useful publication, but I know there are several more.
    3. A few months ago I wrote a short post on a similar subject. I'm sure it's not exactly what you are looking for at this moment, but it's a good start, and with some patience you may be able to get some answers. Here's the link:
    http://blogs.ittoolbox.com/sap/db2/archives/index-of-free-documentation-on-sap-db2-administration-14245
    Good luck,
    Omer Brandis
