TestStand best practices for distribution

We're looking for best practices for distributing TestStand
systems.  I've found the TestStand Style Guide, but it's a little
sparse on how to set up distributed systems.  We're looking for
guidelines on where to put configuration data, where to put sequence
files, how to manage users, and similar.
We'll be distributing systems to various contract manufacturers in
China as well as using the systems in multiple locations in-house.
What have you done with distributions, and what problems have you seen?
Right now we're planning to separate the Deployment Engine from our
sequence files and to put all our configuration into the distribution
kit for the Deployment Engine.

The TestStand Reference Manual provides good information on system
distribution.  Chapter 14, Deploying TestStand Systems, covers the
necessary information for distributing your TestStand
application.  I do have a few suggestions and caveats:
(1)  Make sure you use a workspace when distributing your
files.  A workspace makes it easy to package all of your files
and dependencies.  Moreover, the distribution wizard provides a
feature that displays all the files that will be included in the
installation package in an easy-to-use tree view.
(2)  TestStand currently does not look for nested DLL
dependencies.  So, if your code calls a DLL module that in turn calls
another DLL module, be sure to include that nested dependency DLL in
your workspace.
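A quick way to catch those nested dependencies before you build the distribution is to walk each code module with dumpbin (the /dependents switch ships with the Visual Studio tools). A minimal sketch in Python, assuming dumpbin.exe is on the PATH:

    import subprocess
    import sys

    def direct_deps(dll_path):
        """List the DLLs a given DLL links against, via 'dumpbin /dependents'."""
        out = subprocess.run(["dumpbin", "/dependents", dll_path],
                             capture_output=True, text=True, check=True).stdout
        # Dependency names are printed one per line, ending in ".dll".
        return [line.strip() for line in out.splitlines()
                if line.strip().lower().endswith(".dll") and " " not in line.strip()]

    if __name__ == "__main__":
        for dep in direct_deps(sys.argv[1]):
            print(dep)

Anything this prints that is not already in your workspace (or guaranteed to exist on the target machine) should be added.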
(3)  StationGlobals and custom data types are often missed during
installation.  Be sure to include your StationGlobals.ini and
MyTypes.ini files from the Cfg directory in your installation
workspace.
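As a final pre-ship sanity check, it is easy to script a test that those configuration files actually made it into the deployment image. A small sketch; the image location is illustrative, so adjust IMAGE_ROOT to wherever your distribution places the Cfg directory:

    import os

    IMAGE_ROOT = r"C:\Deployments\MyTester"   # hypothetical image location
    REQUIRED_CFG = ["StationGlobals.ini", "MyTypes.ini"]

    missing = [name for name in REQUIRED_CFG
               if not os.path.isfile(os.path.join(IMAGE_ROOT, "Cfg", name))]
    if missing:
        raise SystemExit("Deployment image is missing Cfg files: %s" % missing)
    print("All required Cfg files present.")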
If you have more specific questions, please feel free to post them here!
Good Luck!
Tyler Tigue
Applications Engineer
National Instruments

Similar Messages

  • Best/Easiest Practice for Distribution of Webhelp files for Fat(ish) Client Application

So, another in my long line of questions in trying to change the implementation of the help systems at my new employer.
The pressure is on to convince the existing guard to toss out the old method and go with a new one... One of the issues that has come up concerns CSH WebHelp (via RH10)...
Apparently, in the old method, every help topic was its own CHM (no, I'm not kidding). They kept it that way so that any time a help file was updated, they could just send out that one CHM to update that one topic.
Now they are worried that if we switch to WebHelp, with all topics in one single big project, we can't just send out a single updated file any more...
So I'm trying to get ammunition as to how updates to the help can be distributed without it being a big hassle...
We do have small updates that go out about every week, and they are afraid that going to the WebHelp method (one project with lots of topics) will require more time and effort for the techs, because they will have to install the update of a huge help system every time.
One solution we have suggested is that we only update help files on major releases, add any changes to help procedures in the Release Notes or a separate PDF, and include a note that the help files "will be updated to reflect this change in the XX/2015 XYZ Release."
Does ANYONE have any other ideas for how distribution could be done efficiently, so that we don't have to continue this "project per topic" fiasco?
    HELP!!! PLEASE!!!!

Amebr - you hit the nail on the head... I want to apologize to everyone for all this mass hysteria... but this has been a freaking rollercoaster... One day they are happy with the plan; the next day they decide they want to complicate it more... So now my job is to find out how viable, how tricky, and perhaps how dangerous it will be to send out JUST the files that have changed.  Especially for CSH!
Anyone have any experience with this or have any advice?  Peter?  I suspect you might know how this might work... Are there any pitfalls to watch out for? I just have this fear of sending out a few files from an entire project... Although, it appears that when I publish, ONLY the files that have been edited in some way show a new date - and that includes non-topic files - so I'm assuming sending all of those will keep the TOC, Index, and Search features working properly, as well as incoming and outgoing links from an edited topic? (One way to double-check which output files really changed is sketched below.)
Another question would be how to handle new images in a project... As the image folder tends to be the largest, I don't want to have to send the entire image folder... Thoughts...????  Advice?  Jeff? Peter? Amebr? Bueller?  Bueller?  Bueller....?
THANKS!
OH... and Jeff... my apologies for the misnomer of using the term "Fat-ish" client in reference to this... I totally misspoke and should have said Mobile App... but in my mind, any time you have to download something to use it, it's a fat client... sorry!  :-/
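For what it's worth, date stamps can be taken out of the equation entirely by hashing the published output and diffing it against the previous release. A rough sketch in Python (the folder names are hypothetical):

    import hashlib
    from pathlib import Path

    def snapshot(root):
        """Map each file's path (relative to root) to a hash of its content."""
        return {str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
                for p in Path(root).rglob("*") if p.is_file()}

    old = snapshot("publish_previous")   # output shipped with the last release
    new = snapshot("publish_current")    # freshly published output

    changed = sorted(f for f, h in new.items() if old.get(f) != h)
    removed = sorted(f for f in old if f not in new)
    print("files to ship:", changed)
    print("files deleted since last release:", removed)

Because it compares content rather than dates, this skips regenerated-but-identical files, and it catches the changed non-topic files (TOC, index, search) that keep cross-references working.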

  • Best Practice for the Service Distribution on multiple servers

    Hi,
Could you please suggest best practices for the above?
Requirements: we will use all features in SharePoint (PowerPivot, Search, Reporting Services, BCS, Excel, Workflow Manager, App Management, etc.)
Capacity: we have 12 servers, excluding the SQL Server.
Please do not just refer us to a URL; suggest as per the requirements.
    Thanks 
    srabon

    How about a link to the MS guidance!
    http://go.microsoft.com/fwlink/p/?LinkId=286957

  • Best practice for distributing/releasing J2EE applications.

    Hi All,
    We are developing a J2EE application and would like some information on the best
    practices to be followed for distributing/releasing J2EE applications, in general.
    In particular, the dilemma we have is centered around the generation of stub, skeleton
    and additional classes for the application.
Most app servers can generate the required classes while deploying the EJBs in the
application, i.e. at install time. Some (BEA WebLogic and IBM WebSphere are
two that we are aware of) allow these classes to be generated before installation
time, so that the .ear file containing the additional classes is the one that is uploaded.
For instance, say we have assembled the application "myapp.ear". There are two ways
in which the classes can be generated. The first is using 'ejbc' (assume we are
using BEA WebLogic), which generates the stub, skeleton, and additional classes for
the application and returns a file, say "Deployable_myapp.ear", containing all
the necessary classes and files. This file is the one that is then installed. The
other option is to install the file "myapp.ear" and let the WebLogic app server
itself generate the required classes at installation time.
If the first way, of 'pre-generating' the stubs, is followed, does it require us to
separately generate the stubs for each version of the app server that we support?
That is, if we generate a deployable file containing the required classes using the 'ejbc'
of WebLogic 5.1, can the same file be installed on WebLogic 6.1, or do we
have to generate a separate file?
If the second method, of 'install-time generation' of stubs, is used, what is the
nature/magnitude of the risk that we are taking in terms of the failure of the installation?
Any links to useful resources, as well as comments/suggestions, will be appreciated.
    TIA
    Regards,
    Aasif


  • Best Practice for Distributing Databases to Customers

I did a little searching and was surprised not to find a best practice document for how to distribute Microsoft SQL databases. With other database formats, it's common to distribute them as scripts. That feature seems rather limited with the built-in tools Microsoft provides; there appear to be limits on the length of the script. We're looking to distribute a database several GB in size. We could detach the database or provide a backup, but each has its own disadvantages by limiting which versions of SQL Server will accept the database.
    What do you recommend and can you point me to some documentation that handles this practice?
    Thank you.

It's much easier to distribute schema/data from an older version to a newer one than the other way around. Nearly all SQL Server deployment features support database version upgrade, and these include the "Copy Database" wizard, BACKUP/RESTORE, detach/attach, script generation, Microsoft Sync Framework, and a few others.
Even if you just want to distribute schemas, you may want to distribute the entire database, and then truncate the tables to purge data.
Backing up and restoring your database is by far the most RELIABLE method of distributing it, but it may not be practical in some cases because you'll need to generate a new backup every time a schema change occurs - though not if you already have an automated backup/maintenance routine in your environment.
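As a rough illustration of scripting that route, a backup can be kicked off from Python with pyodbc. The connection string, database name, and backup path below are assumptions to adapt; note that BACKUP cannot run inside a transaction, hence autocommit:

    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
        "DATABASE=master;Trusted_Connection=yes;",
        autocommit=True)   # BACKUP refuses to run inside a transaction
    cur = conn.cursor()
    cur.execute(r"BACKUP DATABASE MyAppDb TO DISK = 'C:\Backups\MyAppDb.bak' WITH INIT")
    while cur.nextset():   # drain progress messages so the backup completes
        pass
    conn.close()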
As an alternative, you can use the Copy Database functionality in SSMS, although it can prove unstable in some situations, especially if you are distributing across multiple subnets and/or domains. It will also require you to purge data if/when applicable.
Another option is to detach your database, copy its files, and then attach them on both the source and destination instances. This generates downtime for the detached databases, so there are better methods for distribution available.
And then there is the previously mentioned method of generating scripts for the schema and then using INSERT statements or the import data wizard available in SSMS (which is very practical and internally implements an SSIS package that can be saved for repeated executions). It works fine, and while not as practical as the other options, it is the best way to distribute a database when its version is being downgraded.
With all this said, there is no single "best practice" for this. There are multiple features, each offering their own advantages and drawbacks, which allows them to align to different business requirements.

  • Best Practice for disparately sized data

    2 questions in about 20 minutes!
We have a cache which holds approx 80K objects, which expire after 24 hours. It's a rolling population, so the number of objects is fairly static. We're on a 64-node cluster with high-units set, giving ample space. But... the data has a wide size range, from a few bytes to 30 MB, and everywhere in between. This causes some very hot nodes.
    Is there a best practice for handling a wide range of object size in a single cache, or can we do anything on input to spread the load more evenly?
    Or does none of this make any sense at all?
    Cheers
    A

Angel 1058 wrote:
2 questions in about 20 minutes!
We have a cache which holds approx 80K objects, which expire after 24 hours. It's a rolling population, so the number of objects is fairly static. We're on a 64-node cluster with high-units set, giving ample space. But... the data has a wide size range, from a few bytes to 30 MB, and everywhere in between. This causes some very hot nodes.
Is there a best practice for handling a wide range of object size in a single cache, or can we do anything on input to spread the load more evenly?
Or does none of this make any sense at all?
Cheers
A
Hi A,
It depends... If there is a relationship between keys and sizes - e.g., if this or that part of the key means that the size of the value will be big - then you can implement a key partitioning strategy, possibly together with key association on the key, in a way that spreads the large entries evenly across the partitions (and have enough partitions).
Unfortunately, you would likely not get a totally even distribution across nodes, because you have a fairly small number of entries compared to the square of the number of nodes (by the way, which version of Coherence are you using?)...
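To make that point concrete outside of the Coherence API, here is a toy simulation in Python: if the partition is derived only from the size-predicting part of the key (which is what plain key association gives you), the big entries clump onto a few partitions; deriving it from the full key spreads them. The key scheme and sizes are made up:

    import zlib

    PARTITIONS = 64

    # Toy population: keys are "family:id"; three families carry 30 MB values,
    # the rest a few KB - mirroring the wide size range in the question.
    entries = {"%d:%d" % (fam, i): (30_000_000 if fam < 3 else 4_000)
               for fam in range(20) for i in range(400)}

    def spread(part_of):
        load = [0] * PARTITIONS
        for key, size in entries.items():
            load[part_of(key)] += size
        return "max %d MB / min %d MB" % (max(load) // 10**6, min(load) // 10**6)

    by_family   = lambda k: zlib.crc32(k.split(":")[0].encode()) % PARTITIONS
    by_full_key = lambda k: zlib.crc32(k.encode()) % PARTITIONS

    print("partition by family (association):", spread(by_family))
    print("partition by full key:            ", spread(by_full_key))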
    Best regards,
    Robert

  • Best practice for test reports location -multiple installers

    Hi,
What is the recommended best practice for saving test reports with multiple installers of different applications?
For example, if I have 3 different TestStand installers - Installer1, Installer2, and Installer3 - and I want to save the test reports of each installer at:
1. C:\Reports\Installer1\TestReportfilename
2. C:\Reports\Installer2\TestReportfilename
3. C:\Reports\Installer3\TestReportfilename
How could I do this programmatically, so that all reports end up in the proper folder when the TestStand installers are deployed to a test PC?
    Thanks,
    Frank

There's no recommended best practice for what you're suggesting. The example here shows how to programmatically modify a report path, and this KnowledgeBase article describes how you can change a report's file path based on test results.
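Purely as an illustration of the folder scheme above (this is not the TestStand API; the real hook is the report path modification in the linked example, and the installer name here is a hypothetical parameter):

    import os

    def report_path(installer, filename, root=r"C:\Reports"):
        # Build <root>\<installer>\<filename>, creating the folder if needed.
        folder = os.path.join(root, installer)
        os.makedirs(folder, exist_ok=True)
        return os.path.join(folder, filename)

    print(report_path("Installer1", "TestReport_UUT42.xml"))
    # C:\Reports\Installer1\TestReport_UUT42.xml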
    -Mike 
    Applications Engineer
National Instruments

  • BI Best Practice for Chemical Industry

    Hello,
I would like to know if anyone is aware of SAP BI Best Practices for Chemicals, and if so, can anyone please post a link as well.
    Thanks

    Hi Naser,
The information below explains the Chemical industry offering in detail.
SAP Best Practices packages support best business practices that quickly turn your SAP ERP application into a valuable tool used by the entire business. You can evaluate and implement specific business processes quickly – without extensive customization of your SAP software. As a result, you realize the benefits with less effort and at a lower cost than ever before. This helps you improve operational efficiency while providing the flexibility you need to be successful in highly demanding markets. SAP Best Practices packages can benefit companies of all sizes, including global enterprises creating a corporate template for their subsidiaries.
Extending beyond the boundaries of conventional corporate divisions and functions, the SAP Best Practices for Chemicals package is based on SAP ERP; the SAP Environment, Health & Safety (SAP EH&S) application; and the SAP Recipe Management application. The business processes supported by SAP Best Practices for Chemicals encompass a wide range of activities typically found in chemical industry practice:
• Sales and marketing
– Sales order processing
– Presales and contracts
– Sales and distribution (including returns, returnables, and rebates, with quality management)
– Inter- and intracompany processes
– Cross-company sales
– Third-party processing
– Samples processing
– Foreign trade
– Active-ingredient processing
– Totes handling
– Tank-trailer processing
– Vendor-managed inventory
– Consignment processing
– Outbound logistics
• Supply chain planning and execution
– Supply and demand planning
• Manufacturing planning and execution
– Manufacturing execution (including quality management)
– Subcontracting
– Blending
– Repackaging
– Relabeling
– Samples processing
• Quality management and compliance
– EH&S dangerous goods management
– EH&S product safety
– EH&S business compliance services
– EH&S industrial hygiene and safety
– EH&S waste management
• Research and development
– Transformation of general recipes
• Supplier collaboration
– Procurement of materials and services (including quality management)
– Storage tank management
– E-commerce (Chemical Industry Data Exchange)
• Enterprise management and support
– Plant maintenance
– Investment management
– Integration of the SAP NetWeaver Portal component
• Profitability analysis
More details
This section details the most common business scenarios – those that benefit most from the application of best practices.
Sales and marketing
SAP Best Practices for Chemicals supports the following sales- and marketing-related business processes:
Sales order processing – In this scenario, SAP Best Practices for Chemicals supports order entry, delivery, and billing. Chemical industry functions include the following:
• Triggering an available-to-promise (ATP) inventory check on bulk orders after sales order entry and automatically creating a filling order (note: an ATP check is triggered for packaged material)
• Selecting batches according to customer requirements
• Processing internal sales activities that involve different organizational units
Third-party and additional internal processing – In this area, the SAP Best Practices for Chemicals package provides an additional batch production step that can be applied to products previously produced by either continuous or batch processing. The following example is based on further internal processing of plastic granules:
• Purchase order creation, staging, execution, and completion
• In-process and post-process control
• Batch assignment from bulk to finished materials
• Repackaging of bulk material
SAP Best Practices for Chemicals features several tools that help you take advantage of chemical industry best practices. For example, it provides a fully documented and reusable prototype that you can turn into a productive solution quickly. It also provides a variety of tools, descriptions of business scenarios, and proven configuration of SAP software based on more than 35 years of working with the chemical industry.
The package can also be used to support external toll processing such as that required for additional treatment or repackaging.
Tank-trailer processing – In this scenario, SAP Best Practices for Chemicals helps handle the selling of bulk material, liquid or granular. It covers the process that automatically adjusts the differences between the original order quantities and the actual quantities filled into the truck. To determine the quantity actually filled, the tank trailer is weighed before and after loading. The delta weight – or quantity filled – is transmitted to the SAP software via an order confirmation. When the delivery for the sales order is created, the software automatically adjusts the order quantity to the confirmed filling quantity. The customer is invoiced for the precise quantity filled and delivered.
Supply chain planning and execution
SAP Best Practices for Chemicals supports supply chain planning as well as supply chain execution processes:
Supply and demand planning – Via the SAP Best Practices for Chemicals package, SAP enables complete support for commercial and supply-chain processes in the chemical industry, including support for integrated sales and operations planning, planning strategies for bulk material, and a variety of filling processes with corresponding packaging units. The package maps the entire supply chain – from sales planning to material requirements planning to transportation procurement.
Supplier collaboration
In the procurement arena, best practices are most important in the following scenario:
Procurement of materials and services – In this scenario, SAP Best Practices for Chemicals describes a range of purchasing processes, including the following:
• Selection of delivery schedules by vendor
• Interplant stock transfer orders
• Quality inspections for raw materials, including sampling requests triggered by goods receipt
Manufacturing scenarios
SAP Best Practices for Chemicals supports the following manufacturing-related business processes:
Continuous production – In a continuous production scenario, SAP Best Practices for Chemicals typifies the practice used by basic or commodity chemical producers. For example, in the continuous production of plastic granules, production order processing is based on run-schedule headers. This best-practice package also describes batch and quality management in continuous production. Other processes it supports include handling of byproducts, co-products, and the blending process.
Batch production – For batch production, SAP Best Practices for Chemicals typifies the best practice used by specialty chemical producers. The following example demonstrates batch production of paint, which includes the following business processes:
• Process order creation, execution, and completion
• In-process and post-process control
• Paperless manufacturing using XML-based process integration sheets
• Alerts and events
• Batch derivation from bulk to finished materials
Enterprise management and support
SAP Best Practices for Chemicals also supports a range of scenarios in this area:
Plant maintenance – SAP Best Practices for Chemicals allows for management of your technical systems. Once the assets are set up in the system, it focuses on preventive and emergency maintenance. Tools and information support the setup of a production plant with assets and buildings.
Revenue and cost controlling – The package supports the functions that help you meet product-costing requirements in the industry. It describes how cost centers can be defined, attached to activity types, and then linked to logistics. It also supports costing and settlement of production orders for batch and continuous production. And it includes information and tools that help you analyze sales and actual costs in a margin contribution report.
The SAP Best Practices for Chemicals package supports numerous integrated business processes typical of the chemical industry, including the following:
• Quality management – Supports integration of quality management concepts across the entire supply chain (procurement, production, and sales), including batch recall and complaint handling
• Batch management – Helps generate batches based on deliveries from vendors or from company production or filling, with information and tools for total management of batch production and associated processes, including batch derivation, the batch information cockpit, and a batch where-used list
• Warehouse management – Enables you to identify locations where materials or batch lots are stored, recording details such as bin location and other storage information on dangerous goods to help capture all information needed to show compliance with legal requirements
    Regards
    Sudheer

  • Best practice for intervlan routing?

Are there some best practices for intervlan routing?
I've been reading a lot, and I have seen these scenarios:
router on a stick
intervlan at the core layer
intervlan at the distribution layer
Or is intervlan routing needed at all if the switches will do the routing?
I've done all of the above, but I just want to know what's current.

    The simple answer is it depends because there is no one right solution for everyone. 
    So there are no specific best practices. For example in a small setup where you may only need a couple of vlans you could use a L2 switch connected to a router or firewall using subinterfaces to route between the vlans.
But that is not a scalable solution. The most common approach in any network where there are multiple vlans is to use L3 switches to do this. This could be a pair of switches interconnected and using HSRP/GLBP/VRRP for the vlans, or it could be stacked switches/VSS etc. You would then dual-connect your access layer switches to them.
In terms of core/distro/access layers, in general, if you have separate switches performing each function, you would have the inter-vlan routing done on the distribution switches for all the vlans on the access layer switches. The core switches would be used to route between the distribution switches and other devices, e.g. WAN routers, firewalls, maybe other distribution switch pairs.
    Again, generally speaking, you may well not need vlans on the core switches at all ie. you can simply use routed links between the core switches and everything else. 
    The above is quite a common setup but there are variations eg. -
    1) a collapsed core design where the core and distribution switches are the same pair. For a single building with maybe a WAN connection plus internet this is quite a common design because having a completely separate core is usually quite hard to justify in terms of cost etc.
2) a routed access layer. Here the access layer switches are L3 and the vlans are routed at the access layer. In this instance you may not even need vlans on the distribution switches, although often, to save cost, servers are deployed onto those switches, so you may.
    So a lot of it comes down to the size of the network and the budget involved as to which solution you go with.
    All of the above is really concerned with non DC environments.
    In the DC the traditional core/distro or aggregation/access layer was also used and still is widely deployed but in relatively recent times new designs and technologies are changing the environment which could have a big impact on vlans.
    It's mainly to do with network virtualisation, where the vlans are defined and where they are not only routed but where the network services such as firewalling, load balancing etc. are performed.
It's quite a big subject, so I didn't want to confuse the general answer by going into it, but feel free to ask if you want more details.
    Jon

  • Best practices for setting up users on a small office network?

    Hello,
    I am setting up a small office and am wondering what the best practices/steps are to setup/manage the admin, user logins and sharing privileges for the below setup:
    Users: 5 users on new iMacs (x3) and upgraded G4s (x2)
    Video Editing Suite: Want to connect a new iMac and a Mac Pro, on an open login (multiple users)
    All machines are to be able to connect to the network, peripherals and external hard drive. Also, I would like to setup drop boxes as well to easily share files between the computers (I was thinking of using the external harddrive for this).
    Thank you,

    Hi,
    Thanks for your posting.
    When you install AD DS in the hub or staging site, disconnect the installed domain controller, and then ship the computer to the remote site, you are disconnecting a viable domain controller from the replication topology.
    For more and detail information, please refer to:
    Best Practices for Adding Domain Controllers in Remote Sites
    http://technet.microsoft.com/en-us/library/cc794962(v=ws.10).aspx
    Regards.
    Vivian Wang

  • Best-practice for Catalog Views ? :|

    Hello community,
    A best practice question:
The situation: I have several product categories (110), several items in those categories (4000), and 300 end users.  I would like to know the best practice for segmenting the catalog.  I mean, some users should only see categories 10, 20 & 30; other users only category 80, etc.  The problem is: how can I implement this?
    My first idea is:
    1. Create 110 Procurement Catalogs (1 for every prod.category).   Each catalog should contain only its product category.
    2. Assign in my Org Model, in a user-level all the "catalogs" that the user should access.
    Do you have any idea in order to improve this ?
    Saludos desde Mexico,
    Diego

    Hi,
Your way of doing it will work, but you'll get maintenance issues (too many catalogs, and catalog links to maintain for each user).
The other way is to build your views in CCM, and assign these views to the users, either on the roles (PFCG) or on the user (SU01). The problem is that with CCM 1.0 this is limited, because you'll have to assign the items to each view one by one (no dynamic or mass processes); it has been enhanced in CCM 2.0.
My advice:
- Challenge your customer about views, and try to limit the number of views, with for example strategic and non-strategic
- With CCM 1.0, stick to the procurement catalogs, or implement BAdIs to assign items to the views (I experienced it, it works, but it is quite difficult), but with a limited number of views
    Good luck.
    Vadim

  • Best Practice for Securing Web Services in the BPEL Workflow

    What is the best practice for securing web services which are part of a larger service (a business process) and are defined through BPEL?
    They are all deployed on the same oracle application server.
    Defining agent for each?
    Gateway for all?
    BPEL security extension?
The top-level service that is defined as the business process is itself secured through OWSM with usernames and passwords, but what is the best practice for establishing security for each of the lower-level services?
    Regards
    Farbod

It doesn't matter whether the service is invoked as part of your larger process or not; if it is performing any business-critical operation, then it should be secured.
The idea of SOA / designing services is to have the services available so that they can be orchestrated as part of any other business process.
Today you may have secured your parent services, and tomorrow you could come up with a new service which may use one of the existing lower-level services.
If all the services are in one application server, you can make the configuration/development environment a lot easier by securing them using the Gateway.
The typical problem with any gateway architecture is that the service is available without any security enforcement when accessed directly.
You can enforce rules at your network layer to allow access to the app server only from the Gateway.
When you have the liberty to use OWSM or any other WS-Security products, I would stay away from any extensions. Two things to consider:
- The next BPEL developer in your project may not be aware of the security extensions
- Centralizing security enforcement keeps your development and security operations loosely coupled and addresses scalability
    Thanks
    Ram

  • Best practice for multi-language content in common areas

I've got a site with some text in the header/footer/nav that needs to be translated between an English and a Spanish site, which use the same design. My intention was to set up all the text as content to facilitate translation. However, if I use a standard dialog with the component's path set to a child of the current page node, I would need to re-enter the text on every page. If I use a design dialog, or a standard dialog with the component's path set absolutely, the English and Spanish sites will share the same text. If I use a standard dialog with the component's path set relatively (e.g. path="../../jcr:content/myPath"), the pages using the component would all need to be at the same level of the hierarchy.
It appears that the Geometrixx demo doesn't address this situation and leaves the copy in English. Is there a best practice for this scenario?

    I'm finding that something to the effect of <cq:include path="<%= strCommonContentPath + "codeEntry" %>" resourceType ...
    works fine for most components, but not for parsys, or a component containing a parsys. When I attempt that, I get a JS error that says "design.path is null or not an object". Is there a way around this?

  • Best Practice for utility in Sol Man 4.0

We have software component ST-ICO of release 150_700 with patch level 5.
We want a template selection for the ‘Utility’ industry. I checked in
the Service Marketplace and found that 'Baseline Package United
Kingdom V1.50, Template: BP_BLKU150' is available in the above software
component.
But we are not getting any templates other than 'BP_UTUS147 - Best Practices for Water Utility' in the 'SOLAR_PROJECT_ADMIN'
transaction.
Kindly suggest whether any patch needs to be applied or some configuration needs to be done.
    Regards
    Mani

    Hi Mani,
Could you please give me the link to where you found the template BP_BLKU150?
It will be helpful for me.
    Thanks
    Senthil

  • Best Practices for SRM Installation !!

    Hi
Can someone share the best practices for SRM installation?
What is the typical timeframe to install SRM on the development server as well as on the production server?
Appreciate the responses.
    Thanks,
    Arvind

    Hi
    I don't know whether this will help you.
    See these links as well.
http://help.sap.com/bp_epv170/EP_US/HTML/Portals_intro.htm
http://help.sap.com/bp_scmv150/index.htm
http://help.sap.com/bp_biv170/index.htm
http://help.sap.com/bp_crmv250/CRM_DE/index.htm
    Hope this will help.
    Please reward suitable points.
    Regards
    - Atul

Maybe you are looking for

  • 3rd Party Sales::Create Automatic PO in Item Cat vs. in SL Catgy

    Hi All, I've a query regd the 3rd party sales order processing. There is an indicator at item category level, we've an indicator 'Create PO Automatic'. Also, at Schedule Line category level, for CS, we've indicators for Order Type as NB and Item cate

  • Problems upgrading to iTunes 8.2 from 8.0.2.20

For the last few weeks I am prompted occasionally to update iTunes, and I dutifully follow the process. Downloading Vers 8.2.0 Size 83.13MB 1st chunk - 14.6MB 2nd chunk - 26.7MB 3rd chunk - 0.1MB 4th chunk - 41.7MB Starts to install, gets within 4 blocks

  • Structure of our Active Directory

    Hi All, We have following active directory structure in our organization. I am not sure, if this is a flat or deep hierarchy. We have domain(forest)as xyz.com. we have a group created SAP under OU=Applications,DC=xyz,DC=com. All our users accessing P

  • I take pictures and when i try to bring them up big it show up blank

    i have an iphone 4s and when i take pictures it shows up in the camera roll, but when i try to bring up the pictures individually it is completely blank.

  • Apple Store tell me its "unfixable" but it seems quite simple...??

    I have the same symptoms as described in this thread: http://discussions.apple.com/thread.jspa?messageID=6436296&#6436296 I posted a question on the end of that thread but didnt get any response. Seems like a simple problem somewhere between the disc