Best practices for data push in BlazeDS?

Hello,
Currently I'm working on data push using BlazeDS. I'm wondering about the best practices that need to be followed to reduce the hits to the server and increase performance when using data push. Could you please guide me or direct me to good resources on this? Also, I'm currently using the default AMF channel for the data push; please help me choose the right channel for this process.

Hi,
According to this documentation, “You must configure a new name in Domain Name Services (DNS) to host the apps. To help improve security, the domain name should not be a subdomain of the domain that hosts the SharePoint sites. For example, if the SharePoint sites are at Contoso.com, consider ContosoApps.com instead of App.Contoso.com as the domain name”.
More information:
http://technet.microsoft.com/en-us/library/fp161237(v=office.15)
For production hosting scenarios, you would still have to create a DNS routing strategy within your intranet and optionally configure your firewall.
The link below will show how to create and configure a production environment for apps for SharePoint:
http://technet.microsoft.com/en-us/library/fp161232(v=office.15)
Thanks
Patrick Liang
TechNet Community Support

Similar Messages

  • Best Practice for the Service Distribution on multiple servers

    Hi,
    Could you please suggest the best practice for the above?
    Requirements: we will use all the features in SharePoint (PowerPivot, Search, Reporting Services, BCS, Excel, Workflow Manager, App Management, etc.).
    Capacity: we have 12 servers, excluding the SQL Server.
    Please do not just refer me to a URL; suggest something based on the requirements.
    Thanks 
    srabon

    How about a link to the MS guidance!
    http://go.microsoft.com/fwlink/p/?LinkId=286957

  • Best practice for the test environment & DBA plan activities documents

    Dear all,
    In our company, we have done the hardware sizing.
    We have three environments (Test/Development, Training, Production).
    However, the test environment has fewer servers than the production environment.
    My question is:
    What is the best practice for setting up the test environment?
    (Are there any recommendations from Oracle related to this, or any PDF files that could help me?)
    Also, could I have a detailed document regarding the DBA plan activities?
    I appreciate your help and advice.
    Thanks
    Edited by: user4520487 on Mar 3, 2009 11:08 PM

    Follow your build document for the same steps you used to build production.
    You should know where all your code is. You can use the deployment manager to export your configurations. Export customized files from MDS. Just follow the process again, and you will have a clean instance not containing production data.
    It only takes a lot of time if your client is lacking documentation or if you're not familiar with all the parts of the environment. What's 2-3 hours compared to all the issues you will run into if you copy databases or import/export schemas?
    -Kevin

  • Best Practice for the Vendor Consignment Process

    Hi,
    Does anybody have the best practices for the vendor consignment process?
    Please send me the document.
    Please explain the consignment process in SAP.
    Thanks & Regards,
    Kumar Rayudu

    Hi Kumar,
    In order to have consignment in SAP you need master data such as the material master, the vendor master, and a purchasing info record of the consignment type. You have to enter item category K when you enter the PO. The goods receipt posted to vendor consignment stock will be non-valuated.
    1. The initial step is raising a purchase order for the consignment item.
    2. The vendor receives the purchase order.
    3. GR happens for the consignment material.
    4. Stocks are received and placed in consignment stock.
    5. Whenever we issue to production, or transfer post (using movement type 411) from consignment to own stock, a liability occurs.
    6. Finally comes the settlement using MRKO. You settle the amount for the goods consumed during a specific period.
    Regards,
    Anand.C

  • Req:SAP Best practice for the Funds Management

    Dear all,
    Please let me know where I can get the SAP Best Practices for Funds Management. Waiting for your valuable reply.
    Regards
    Manohar

    Hello Manohar,
    You can find documentation in links below:
    Industry Solution Master Guide - SAP for Public Sector:
    https://websmp105.sap-ag.de/~form/sapnet?_SHORTKEY=00200797470000065911
    SAP Best Practices for Public Sector:
    http://help.sap.com/ -> SAP Best Practices -> Industry Packages -> Public Sector
    Online Library for Funds Management:
    http://help.sap.com/saphelp_erp2005vp/helpdata/en/41/c62c6d6d84104ab938aa7eae51db06/frameset.htm
    I hope it helps
    Best Regards,
    Vanessa Barth.

  • Best Practice for the database owner of an SAP database.

    We recently had a user account removed from our SAP system when this person left the agency.  The account was associated with the SAP database (he created the database a couple of years ago). 
    I'd like to change the owner of the database to <domain>\<sid>adm  (ex: XYZ\dv1adm)  as this is the system admin account used on the host server and is a login for the sql server.  I don't want to associate the database with another admin user as that will change over time.
    What is the best practice for the database owner of an SAP database?
    Thanks
    Laurie McGinley

    Hi Laurie,
    I'm not sure if this is best practice or not, but I've always had the SA user as the owner of the database. It just makes it easier for restores to other systems, etc.
    Ken
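    For what it's worth, here is a minimal sketch of how the owner can be set on SQL Server 2005 or later; the database name PRD is made up for this example, so substitute your own SID database and login:
    -- PRD is a placeholder database name for this sketch
    -- Change the database owner to the built-in sa login
    ALTER AUTHORIZATION ON DATABASE::PRD TO [sa];
    -- Or, for the <domain>\<sid>adm approach mentioned in the question
    -- ALTER AUTHORIZATION ON DATABASE::PRD TO [XYZ\dv1adm];
    -- Verify the current owner
    SELECT name, SUSER_SNAME(owner_sid) AS owner FROM sys.databases WHERE name = 'PRD';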

  • Best practice for the Update of SAP GRC CC Rule Set

    Hi GRC experts,
    We have a SoD matrix in a CC production system that we would like to modify extensively, basically by activating many permissions.
    What is the best practice for accomplishing our goal?
    Many thanks in advance. Best regards,
      Imanol

    Hi Simon and Amir
    My name is Connie and I work in the Accenture GRC practice (and I am a colleague of Imanol's). I have been reading this thread and I would like to ask you a question related to this topic. We have a case with a Global Rule Set “Logic System”, and we may also need to create a Specific Rule Set. Is there a document (from SAP or from best practices) that indicates the potential impact (regarding risk analysis, system performance, process execution time, etc.) caused by implementing both types of rule sets in a production environment? Are there any special considerations to be aware of? Have you ever implemented this type of scenario?
    I would really appreciate your help and if you could point me to specific documentation could be of great assistance. Thanks in advance and best regards,
    Connie

  • What are the best practices for the RCU's schemas

    Hi,
    I was wondering if there are some best practices for the RCU schemas created with BIEE.
    I already have Discoverer (and Application Server), so I have a metadata repository for the Application Server. I will upgrade Discoverer 10g to 11g, so I will create new schemas with RCU in the metadata repository (MR) of the Application Server. I'm wondering if I can put the BIEE RCU schemas in the same database.
    Basically,
    1. Is there a standard for the PREFIX?
    2. If I have multiple Fusion Middleware components in the same database, will I have multiple PREFIX_MDS schemas? Can they share the same PREFIX, or do they all need a different prefix?
    For example: DISCO_MDS and BIEE_MDS, or can I have DEV_MDS and have that schema be valid for both Discoverer and BIEE?
    Thank you!
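    (As a quick way to see which prefixes are already in use: RCU normally records each prefixed schema it creates in a registry table in the target database. This is only a sketch under that assumption; the table is usually SCHEMA_VERSION_REGISTRY, but details can vary by release.)
    -- Assumes the SCHEMA_VERSION_REGISTRY table that RCU maintains in the target database
    -- Lists one row per prefixed schema (e.g. DEV_MDS)
    SELECT owner, version, status
      FROM schema_version_registry
     ORDER BY owner;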

    What are the best practices for exception handling in n-tier applications?
    The application is a fat client based on the MVVM pattern with the .NET Framework.
    That would be to catch all exceptions at a single point in the n-tier solution, log them, and create user-friendly messages to display to the user.

  • Best practice for the replacement of the Reports cluster

    Hi,
    I've read the note reports_guide_to_changed_functionality on OTN.
    On page 5 it is stated that Reports clustering is deprecated.
    Snippet:
    Oracle Application Server High Availability provides the industry’s most
    reliable, resilient, and fault-tolerant application server platform. Oracle
    Reports’ integration with OracleAS High Availability makes sure that your
    enterprise-reporting environment is extremely reliable and fault-tolerant.
    Since using OracleAS High Availability provides a centralized clustering
    mechanism and several cutting-edge features, Oracle Reports clustering is now
    deprecated.
    Can anyone please tell me what the best practice is for replacing the Reports cluster?
    It's really annoying that the clustering technology changes in every version of Reports!
    martin

    Hello,
    In reality, Reports Server "clusters" was more a load-balancing solution than clustering (no shared queue or cache). Since it is desirable to have one load-balancing/HA approach for the application server, Reports Server clustering is deprecated in 10gR2.
    We understand that this frequent change can cause some level of frustration, but it is our strong belief that unifying the HA "attack plan" for all of the app server components will ultimately benefit customers by simplifying their topologies.
    The current best practice is to deploy LBRs (load-balancing routers) with sticky-routing capabilities to distribute requests across mid-tier nodes in an app-server cluster.
    Several customers in high-end environments have already used this kind of configuration to ensure optimal HA for their systems.
    Thanks,
    Philipp

  • Best practice for the use of reserved words

    Hi,
    What is the best practice to observe when using reserved words as column names?
    For example, if I insisted on using the word COMMENT for a column name by doing the following:
    CREATE TABLE ...
    "COMMENT" VARCHAR2(4000),
    What impact down the track could I expect and what problems should I be aware of when doing something like this?
    Thank You
    Ben

    Hi, Ben,
    Benton wrote:
    "What is the best practice to observe when using reserved words as column names?"
    Sybrand is right (as usual): the best practice is not to use them.
    "What impact down the track could I expect and what problems should I be aware of when doing something like this?"
    Using reserved words as identifiers is asking for trouble. You can expect to get what you ask for.
    Whatever benefits you may get from naming the column COMMENT rather than, say, CMNT or EMP_COMMENT (if the table is called EMP) will be insignificant compared to the extra debugging you will certainly need.
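    To illustrate the kind of friction involved, here is a minimal sketch (the table name EMP_NOTE is made up for this example):
    -- EMP_NOTE is a throwaway example table
    CREATE TABLE emp_note (
        emp_id    NUMBER,
        "COMMENT" VARCHAR2(4000)   -- quoted identifier: a reserved word used as a column name
    );
    -- Every later reference has to repeat the double quotes and the exact case:
    SELECT n."COMMENT" FROM emp_note n;
    -- An unquoted reference fails, because COMMENT is a reserved word:
    -- SELECT comment FROM emp_note;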

  • What is the best practice for the WLI?

    Hello everyone,
    I am a newbie to WebLogic Integration Server and I have read the WLI docs from BEA, such as "Programming BPM Client Applications", but I found them heavy going. I am wondering whether there are documents and samples for WebLogic Integration Server at the entry level. I'd appreciate it if anyone could give some comments.
    Thanks.
    Leon

    I was wondering if anybody knew of any BEA documents that outline best practices in terms of naming standards for business operations, event keys, tasks, etc.
    "Tinnapat Chaipanich" <[email protected]> wrote:
    Hi
    Why not try the BPM tutorial :)
    http://edocs.bea.com/wli/docs70/bpmtutor/index.htm
    Regards,
    Tinnapat.
    "Leon" <[email protected]> wrote in message
    news:3ea7ffbd$[email protected]..
    Hello to everyone,
    I am a newbie to the Weblogic Integration Server and I had read thedocs
    of WLI
    from the BEA, like "Programming BPM Client Applications". But I foundit
    is bitter
    and astringent. I am wondering there is documents and samples for WeblogicIntergration
    Server for the entry level. I 'd appreciate it if anyone could givesome
    comments.
    Thanks.
    Leon

  • Any best practices for the iPad mini????

    I am in the beginning stages of designing my magazine for the iPad. Now the iPad mini seems to be all the hype, and the latest news states that the mini may outsell the larger one.
    So... I know the dimensions scale 1:1 when bumping down to the smaller screen, but what about font sizes? What about the experience? Is anyone already ahead of the game?
    I have my own answers to these questions, but does anyone out there have some best-practice advice or links to articles they find informative?

    I think 18-pt body text is fine for the iPad 2 but too small for the iPad mini. Obviously, it depends on your design and which font you're using, but it seems like a good idea to bump up the font size a couple points to account for the smaller screen.
    For the same reason, be careful with small buttons and small tap areas.
    I've also noticed that for whatever reason, MSOs and scrollable frames in PNG/JPG articles look great on the iPad 2 but look slightly more pixelated on the iPad Mini. It might just be my imagination.
    Make sure that you test your design on the Mini.

  • Defragmenting Mac (Leopard) best practice for the safest disk optimisation

    What is the best way to defragment?
    It's always dangerous to perform disk-related operations; some of these apps can lead to file corruption, and I know of a lot of cases where third-party software has caused issues.
    I have DiskWarrior, TechTool Pro, and Drive Genius; which one is safest? I know most of them require booting from other media to work on the boot drive.
    Recently I upgraded to a 320 GB 7200 rpm drive, and since I had about 190 GB of data, I thought Disk Utility would be faster. It did copy everything in 1.5 hours, but it didn't defragment my hard drive since it does a block-level copy. My computer is IN FACT RUNNING SLOWER.
    I'm thinking of using Carbon Copy Cloner to an external hard drive (a file-level copy), and then booting from the 10.5 restore DVD to perform a restore with Disk Utility, which is a lot faster since it does a block-level copy.

    I use iDefrag.
    http://www.coriolis-systems.com/iDefrag-faq.php
    from iDefrag help:
    Why Defragment?
    It has often been asserted that defragmentation (or disk optimization) is not a good idea on
    systems using Apple’s HFS+ filesystem. The main reasons given for this historically have been:
    HFS+ is very much better at keeping files defragmented than many other commodity filesystems.
    Advanced features in recent versions of HFS+ can easily be disrupted by a defragmentation tool
    that does not support them, resulting in decreased performance.
    There is a risk associated with defragmentation.
    Whilst these arguments are certainly valid, they are not the whole story. For one thing, iDefrag,
    unlike most other disk defragmentation tools, fully supports the most recent features of HFS+,
    namely the metadata zone (or “hot band”) and the adaptive hot file clustering support added in
    Mac OS X 10.3. Not only does it avoid disrupting them, but it is capable of fixing disruption caused
    by other software by moving files into or out of the metadata zone as appropriate.
    Sensible arguments for occasional optimization of your disk include:
    HFS+ is not very good at keeping free space contiguous, which can, in turn, lead to large files
    becoming very fragmented, and can also cause problems for the virtual memory subsystem on
    Mac OS X.
    Older versions of the Mac OS are not themselves aware of the metadata zone policy, and may
    disrupt its performance.
    HFS+ uses B-Tree index files to hold information about the filesystem. If a large number of files
    are placed on a disk, the filesystem may have to enlarge these B-Tree structures; however, there
    is no built-in mechanism to shrink them again once the files are deleted, so the space taken up
    by these files has been lost.
    Whilst HFS+ is good at keeping individual files defragmented, mechanisms like Software Update
    may result in files that are components of the same piece of software being scattered across the
    disk, leading to increased start-up times, both for Mac OS X itself and for applications software.
    This is a form of fragmentation that is typically overlooked.
    Defragmenting disk images can be helpful, particularly if they are to be placed onto a CD/DVD, as
    seeks on CD/DVD discs are particularly expensive.
    Some specific usage patterns may cause fragmentation despite the features of HFS+ that are
    designed to avoid it.
    We do not recommend very frequent optimization of your disk; optimizing a disk can take a
    substantial amount of time, particularly with larger disks, far outweighing the benefits that are
    likely to be obtained by (say) a weekly optimization regime.
    Optimization may make more sense, however, following large software updates, or on an
    occasional basis if you notice decreased performance and lots of hard disk seeking on system
    start-up or when starting an application.
    Kj

  • Are there any "Best Practices" for the setting of the variables in the magnus.conf file when configuring iWS4.1 ?

     

    The default values written to magnus.conf are suitable for most installations.
    If you are interested in tuning your web server for performance, the "Performance Tuning, Sizing, and Scaling Guide" at http://docs.iplanet.com/docs/manuals/enterprise.html
    has some suggestions for magnus.conf values.

  • SAP Best Practice for Document Type / Item Category / Account Assignment Category

    What is the best practice for the document type and item category?
    I want to use NB with item categories B and K (blanket PO), D (service), and T (text).
    Does SAP recommend using FO only for blanket purchase orders?
    We want to use service contracts (with/without service entry sheets) for all our services.
    We want to buy assets for our office equipment.
    Which is the best one to use, NB or FO?
    Please give me any OSS notes or references for this.
    Thanks
    Nick

    Thank you very much for your response. 
    I hope I can provide some clarity on how the accounting needs to be handled per FERC regulations. The G/L balance on the utility that is selling the assets will be in the following accounts (standard accounts across all FERC-regulated utilities):
    101 - Acquisition Value for the assets
    108 - Accumulated Depreciation Value for the assets
    For example, there is a debit of $60,000,000 in FERC Account 101 and a credit of $30,000,000 in FERC Account 108. When the purchase occurs, the net book value for the assets ($30,000,000 in this example) will be on our G/L in FERC Account 102. Once we have FERC approval to acquire the plant assets, we will need to enter the acquisition value and the associated accumulated depreciation onto our G/L, to FERC Account 101 and FERC Account 108 respectively, with an offset to FERC Account 102.
    The method that I came up with is to purchase the NBV of the assets to a clearing account. I then set up account assignments that will track the acquisition value and the respective accumulated depreciation for each asset being purchased. I load the respective asset values using t-code AS91 and then make an entry to the two respective accounts, with the offset against the clearing account, using t-code OASV. Once my company receives FERC approval, I will transfer the assets to new assets that have the account assignments for FERC Account 101 and FERC Account 108, using t-code ABUMN or FB01.
