Best Practice to Manage Subscriptions for an EA (Enterprise Agreement) Customer

Hi Team,
As a company we have an EA subscription for our future product releases, so we want to understand how we should manage subscriptions for our different projects, each with several environments.
I have visited a few relevant links, such as:
http://blog.thavo.com/2014/05/where-to-start-with-microsoft-azure.html
http://blog.kloud.com.au/2013/07/30/good-practices-for-managing-windows-azure-subscriptions/
We use most of the available services: SQL Azure, Azure Virtual Network, Azure Websites, Cloud Services, and Traffic Manager.
We would also like to use slots and swapping in Azure Websites, and the staging/production swap in the case of Cloud Services.
Based on the links above, I understood that we should have a separate subscription for each environment. For example, if my project is named ToDoApp, then I should have two subscriptions:
1) ToDoApp Staging    2) ToDoApp Production.
My question here is: if we follow that practice, how can we use Azure's swap feature, since it works only within a single subscription? If I deploy the application to my ToDoApp Staging subscription, then after testing I would probably have to re-deploy it to Production, because a swap cannot cross environment-specific subscriptions.
Please suggest an approach that minimizes deployments while still giving us per-subscription resource-utilization reports and the swap functionality.
Regards, Brijesh Shah

Hi Jambor & all,
You are correct: with separate subscriptions we can easily get all billing and resource-utilization information. At the same time, my concern is that we would lose a core Azure feature like swapping.
An Azure Website (or Cloud Service) lets us test a deployment before moving it to production: we deploy to a staging slot and, once we are done, we can swap it in.
But this means all of those instances, whether for testing/staging or for production, must live in a single subscription.
Moreover, within that single subscription the swap works only within one resource (one Cloud Service or one Azure Website).
So, don't you think that is a hard limitation?
Regards, Brijesh Shah
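For what it's worth, a slot swap in the ARM REST API is scoped to one subscription, resource group, and site, which is why it cannot cross subscriptions. Here is a minimal sketch of building such a request (the subscription ID, resource group, and site names are hypothetical, and the api-version may differ in your environment):

```python
def slot_swap_request(subscription_id: str, resource_group: str,
                      site: str, source_slot: str,
                      target_slot: str = "production"):
    """Build the URL and body for an ARM 'swap slot' POST.

    Note how the subscription ID is baked into the resource path:
    both slots live under the same subscription and the same site.
    """
    url = (
        f"https://management.azure.com/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}/providers/Microsoft.Web"
        f"/sites/{site}/slots/{source_slot}/slotsswap"
        "?api-version=2022-03-01"
    )
    body = {"targetSlot": target_slot, "preserveVnet": True}
    return url, body

# Hypothetical IDs for illustration only:
url, body = slot_swap_request("00000000-0000-0000-0000-000000000000",
                              "todoapp-rg", "todoapp", "staging")
```

Because both the source and target slots are addressed under a single `/subscriptions/{id}/...` path, a staging slot in one subscription simply cannot be swapped into a site in another.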

Similar Messages

  • What's the best practice to manage the page file?

           
We have one Hyper-V server running Windows 2012 R2 with 128 GB RAM and two drives (C and D). It is set to "Automatically manage paging file size for all drives." What's the best practice for managing the page file?
Bob Lin, MCSE & CNE. Networking, Internet, Routing, VPN troubleshooting on http://www.ChicagoTech.net; How to Install and Configure Windows, VMware, Virtualization and Cisco on http://www.HowToNetworking.com

    For Hyper-V systems, my general recommendation is to set the page file to 1-4 GB. This allows for a mini-dump should something happen. 99.99% of the time, Microsoft will be able to figure out the cause of the problem from the mini-dump. It does not make
    sense on a Hyper-V system to set aside enough space to capture all the memory on the system because only a very small portion of that memory is used by the parent partition. Most of the memory is under control of the individual VMs.
Yes, one of the Hyper-V product group told me that I should let Windows manage it. A couple of times I saw space on my system disk disappear because the algorithm decided it wanted all the space for the page file, which made it so I couldn't patch my systems. I went back in, set the page file to 1-4 GB, and have not had any issues since.
    . : | : . : | : . tim

  • General Discussion - Best practice to manage Process order

    Hi Experts,
Which is the best practice to manage process orders?
1. Quantity change - I can make a quantity adjustment in R/3 and in APO.
2. Source change - I can make a version change from the order header; I can also make a source change in APO by selecting a different PPM. Which is the best option?
3. Re-read master data - Is the best practice to read master data from R/3 or from APO?
I feel that in all the above scenarios process orders should always be managed in R/3, but I am still wondering why we have the same flexibility in APO too.
    Can

    Hello,
We are just migrating from 4.6c to ECC 6.0 and I have a couple of workflows to adapt.
For background steps I defined, in the corresponding BOR methods, an exception to be fired when no result is available (e.g. no mail address available). Normally I defined them as temporary errors.
I activated the line for this exception in the WI outcome section, so the workflow processed this branch when the exception appeared. It worked fine.
Now, in ECC 6.0, the same workflow gets stuck in the WI. The exception is fired (I can see it in the log as "Error message"), but the WI is still in status "in process"; it doesn't continue with the error outcome branch.
Is this new logic in ECC 6.0? Do you have any idea what to do? I have used this logic a few dozen times in different methods and workflows, and it will give me a headache if I have to change everything...
    Thank you!
    Best regards,
    Thomas

  • Best practice to define length for varchar field of table in sql server

What is the best practice for defining the length of a varchar field in a table?
For a field such as "Remarks By Person", should it be varchar(max) or varchar(4000)?
Could it affect optimization in the future?
Experts, please reply.
    Dilip Patil..

    Hi Dilip,
Varchar(n|max) is variable-length, non-Unicode character data. N defines the string length and can be a value from 1 through 8,000; max indicates that the maximum storage size is 2^31-1 bytes (2 GB). The storage size is the actual length of the data entered + 2 bytes. We use varchar when the sizes of the column data entries vary considerably, and if the field's data size might exceed 8,000 bytes, we should use varchar(max).
So the conclusion is, just as Uri said: whether to use varchar(max) or varchar(4000) depends on how many characters we are going to store.
    The following document about varchar in SQL Server is for your reference:
    http://technet.microsoft.com/en-us/library/ms176089.aspx
    Thanks,
    Katherine Xiong
    TechNet Community Support
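The sizing rule Katherine describes can be sketched in a few lines (a Python illustration, not T-SQL; it assumes single-byte characters):

```python
VARCHAR_N_LIMIT = 8000  # varchar(n): n can be a value from 1 through 8000

def varchar_storage_bytes(value: str) -> int:
    """Storage size = actual length of the data entered + 2 bytes."""
    return len(value) + 2

def pick_varchar_type(max_expected_chars: int) -> str:
    """Choose varchar(n) when the data fits, varchar(max) otherwise."""
    if max_expected_chars <= VARCHAR_N_LIMIT:
        return f"varchar({max_expected_chars})"
    return "varchar(max)"  # up to 2^31-1 bytes (2 GB)

print(varchar_storage_bytes("remarks"))  # 7 characters + 2 bytes overhead
print(pick_varchar_type(4000))
print(pick_varchar_type(20000))
```

The point of the sketch: the declared length only caps the data; storage is always paid per actual value, so an oversized varchar(n) mostly costs you at the 8,000-byte boundary decision, not per row.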

  • Best practice on Oracle VM for Sparc System

    Dear All,
I want to test Oracle VM Server for SPARC, but I don't have a new-model server to test it on. What is the best practice for Oracle VM Server for SPARC?
    I have a Dell laptop which has spec as below:
- Intel® Core™ i7-2640M (2.8 GHz, 4 MB cache)
- RAM: 8 GB DDR3
- HDD: 750 GB
- 1 GB AMD Radeon graphics
I want to install Oracle VM VirtualBox on my laptop and then install Oracle VM Server for SPARC inside VirtualBox. Is that possible?
    Please kindly give advice,
    Thanks and regards,
    Heng

Heng Horn wrote:
> How about a desktop or workstation whose latest-version CPU supports Oracle VM for SPARC?
Nope. The only place you find SPARC T4 processors is in Sun servers (and some Fujitsu servers, I think).

  • Best practices to reduce downtime for Database releases(rolling changes)

    Hi,
What are the best practices for reducing downtime during database releases on 10.2.0.3? Which DB changes can be made in a rolling fashion, and which can't?
    Thanks in advance.
    Regards,
    RJiv.

    I would be very dubious about any sort of universal "best practices" here. Realistically, your practices need to be tailored to the application and the environment.
    You can invest a lot of time, energy, and resources into minimizing downtime if that is the only goal. But you'll generally pay for that goal in terms of developer and admin time and effort, environmental complexity, etc. And you generally need to architect your application with rolling upgrades in mind, which necessitates potentially large amounts of redesign to existing applications. It may be perfectly acceptable to go full-bore into minimizing downtime if you are running Amazon.com and any downtime is unacceptable. Most organizations, however, need to balance downtime against other needs.
For example, you could radically minimize downtime by having a second active database, configuring Streams to replicate changes between the two master databases, and configuring the middle tier environment so that you can point different middle tier servers at one or the other database. When you want to upgrade, you point all the middle tier servers at database A, other than one that lives on a special URL. You upgrade database B (making sure to deal with the Streams replication environment properly, depending on requirements) and do the smoke test against the special URL. When you determine that everything works, you point all the app servers at B (with the Streams replication process configured to replicate changes from the old data model to the new data model), upgrade A, repeat the smoke test, and then return the middle tier environment to the normal state of balancing between databases.
This lets you upgrade with zero downtime. But you've got to license another primary database. And configure Streams. And write the replication code to propagate the changes made on the live database while you're smoke testing the other. And you need the middle tier infrastructure in place. And you're obviously going to involve more admins than you would for a simpler deploy where you take things down, reboot, and bring things up. The test plan becomes more complicated as well, since you need to practice this sort of thing in lower environments.
    Justin
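The dual-database flow Justin describes can be reduced to a toy sketch (the names are hypothetical, and the real Streams configuration and middle-tier wiring are omitted):

```python
class BlueGreenRouter:
    """Toy model of pointing the middle tier at one of two databases."""

    def __init__(self, dsn_a: str, dsn_b: str):
        self.dsns = {"A": dsn_a, "B": dsn_b}
        self.active = "A"  # all middle-tier servers currently point here

    def standby(self) -> str:
        return "B" if self.active == "A" else "A"

    def upgrade_and_flip(self, smoke_test) -> str:
        """Upgrade the standby, smoke-test it, and flip traffic to it
        only if the test passes; otherwise keep serving from the
        currently active database."""
        candidate = self.standby()
        # (real life: run the DDL on self.dsns[candidate], keep it in
        #  sync via Streams, and smoke-test through the special URL)
        if smoke_test(self.dsns[candidate]):
            self.active = candidate
        return self.dsns[self.active]

router = BlueGreenRouter("db-primary", "db-secondary")
router.upgrade_and_flip(lambda dsn: True)  # flips traffic to the standby
```

The sketch makes Justin's trade-off concrete: the flip itself is trivial, and all the cost sits in the commented-out parts (second license, Streams, smoke tests, middle-tier plumbing).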

  • Best Practice setting up NICs for Hyper V 2008 r2

I am looking for suggestions on best practice for setting up a Hyper-V 2008 R2 host at a remote location with 5 NICs: one for the management VLAN and the other four on the data VLAN. This server will host two virtual machines: one is a DC and the other is a member server acting as the local DHCP server. The server is set up now with one NIC on the management VLAN and the other NICs set to get their IPs from the local DHCP server on the host. We have the virtual networks in Hyper-V set to point to each of the NICs using an "external connection". The virtual servers (DHCP and AD) have their own IPs set within them. The issue we are seeing: when the site loses external connectivity for a while, clients can no longer get IP addresses from the local DHCP server.
    1. NIC on management Vlan -- IP Static -- Physical host
    2. NIC on the Data network Vlan -- DHCP linked as a connection "external" in Hyper V  -- virtual server DHCP
    3. NIC on the Data network Vlan -- DHCP linked as a connection "external" in Hyper V -- Virtual server domain controller
    4. NIC on the Data network Vlan -- DHCP linked as a connection "external" in Hyper V -- extra
    5. NIC on the Data network Vlan -- DHCP linked as a connection "external" in Hyper V -- extra
    Thanks in advance

It looks like you may be overcomplicating things here. More and more of the recommendations from Microsoft at this point would be to create a Logical Switch and then layer on Logical Networks for your management layers, but here is what I would do for your simple remote office.
Management NIC: looks good. (Teaming would be better, but only if you had two different switches to protect against link failures at the switch level; that doesn't seem relevant in this case.)
NIC for the data network VLAN: I would use one NIC in your case if you have the ability to trunk multiple VLANs at the switch level to that NIC. That way you set, on each VM's NIC, the VLAN you want it to access, and your virtual switch configuration stays very simple. On this virtual switch, however, I would uncheck IPv4 and IPv6: there is no need to give this NIC an address, as you are just passing traffic through it from the VMs that are marked with VLAN tags. Again, if you have multiple physical switches in the building, teaming could be an option, but it probably adds more complexity than is necessary for a small office.
Even if you keep your virtual switches linked to separate NICs, unchecking IPv4 and IPv6 makes sense.
Disable all the other NICs.
Beyond that, check your routing. Can you ping between all hosts when there is no interruption? Which DHCP server are they normally getting their addresses from? Where are your name resolution servers (DNS, WINS)?
No silver bullet here, but maybe a step in the right direction.
    Rob McShinsky (VirtuallyAware.com)
    VirtuallyAware - Experiences in a Virtual World (Microsoft MVP - Virtual Machine)
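Rob's single-trunked-NIC design can be summarized in a tiny sketch: one virtual switch on the trunked NIC, with the VLAN tag set per VM (the switch name and VLAN IDs here are hypothetical):

```python
def vlan_plan(vm_vlans: dict) -> dict:
    """Map each VM to the one trunked virtual switch plus its own VLAN tag.

    The host-side NIC gets no IP address (IPv4/IPv6 unchecked); it only
    passes tagged traffic through for the VMs.
    """
    return {vm: {"switch": "vSwitch-Trunk", "vlan": tag}
            for vm, tag in vm_vlans.items()}

# Both guest VMs ride the same trunk, each with its own tag assignment:
plan = vlan_plan({"DHCP-VM": 20, "DC-VM": 20})
```

The design choice this captures: VLAN membership is decided per VM NIC, so adding a VM on a new VLAN changes the plan, not the physical wiring.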

  • Best practice to extract data from Hyperion Enterprise 5.5

We are looking into extracting high-level data from our Hyperion Enterprise 5.5 installation, and I am in the process of researching best practices for doing that. I am reading the docs for the APIs that I can call from VB6. I am also interested in whether there are Java APIs available. Thanks in advance, and Happy Holidays to everyone! Angelito [email protected]

The easiest way is using HAL (Hyperion Application Link). I have used HAL to extract data, organizations, accounts, subs, entities, etc.

  • What is the best practice to manage versions in XI?

    Hi!
Is there any good "best practice" way to manage versions in XI?
We have a challenging scenario with many legacy systems and many interfaces per legacy system.
Should I put all the different interfaces for one legacy system under one namespace in the Integration Repository, or should I create a separate namespace for each interface?
Or are there other approaches I should consider to get an environment that is easily maintained?
    br.samuli

    Hi,
    In our project we have defined our own naming conventions/namespaces.
    For instance, we have agreed that we will group all interfaces (from all offices worldwide and different projects) into one single product version.
    This global custom product will in turn be divided into different SWC (Software Components) and SWCV (Software Component versions).
    So for each business scenario we will create separate SWC's and when necessary create new versions of these SWC's.
    This means that each SWC should contain all the required objects to support a complete integration scenario i.e. inbound/outbound-interfaces, data types, msg types, business scenario's etc...
    Regards,
    Rob.

  • Best practices to manage Materials + Vendors in an SRM-MDM Repository?

    Hi Gurus,
I have a functional question about how to manage the master data for "Materials" and "Vendors" in an SRM-MDM Catalog (Repository) scenario: MDM 7.1, SRM 7.0, SRM-MDM Catalog (Repository) 7.0.
My concern is that this kind of repository has approximately 32 fields, and the majority of them reference material information, with only a few fields for vendors.
The big question is how to load or model the information in the repository.
Which of these is the best practice?
a) Manage the materials in the main table of the repository, and then add another main table to maintain the vendor data?
b) Manage the materials and the vendors in different repositories?
c) Manage the materials and vendors in the same main table in one repository?
I know that part of the solution depends on the SRM team's requirements, but I would like to know the best practices from the MDM side.
Thanks in advance.
    JP

    Hey JP,
A couple of questions for you:
Do you have the Material and Vendor masters in SRM, in ECC, or in both?
What will be the scenario: Consolidation, Catalogue Management, or CMDM?
What will be the POC for master data?
    Cheers,
    Rajesh

  • Best Practices or Project Template for Rep/Version

I have installed Repository 6i (3) and created the users successfully, even though it has taken a lot of effort to make sure each step is correct.
However, in setting up the workareas and importing the project files, I have been going back and forth trying to figure out where things go and who has what access.
Is there anything like a best practice or a project template for setting up a basic repository/version control system that provides:
    1. the repository structure,
    2. corresponding file system structure (for different developers, build manager, etc)
    3. access grants, and
    4. work scenarios, etc.
The Technet demos and white papers are either too high-level (basic) or too oriented toward individual functions. I can't get a clear picture of the whole thing, since there are so many concepts and elements that don't easily go together.
Considering that I am a decent DBA and developer, it has taken me two weeks, and I am still not ready to sign up other developers to use this thing. How do you expect any small development team to ever use it? It's one thing to design it to be scalable and all-possible; it's another to make it easily usable. I have been advised to use MS VSS. The only reason I am still trying Ora-Rep is its promise to directly support Designer and Oracle objects.

    Andy,
    I have worked extensively with the Repository over the last year and a half. I have collected some of my experiences and the derived guidelines on using the Repository in real life in a number of papers that I will be presenting at ODTUG 2001, next week in San Diego. If you happen to be there (see www.odtug.com), come and see me and we could talk through your specific situation. If you are not and you are interested in those papers, drop me an Email and I could send them to you (they probably will also become available on OTN right after the ODTUG conference).
    best regards,
    Lucas

  • SAP Best Practices on assigning roles for Auditors

    Dear Gurus,
We need to set up SAP roles for auditors in our system for SRM, ECC, and BI.
Could you please suggest which roles should be granted to the auditors as best practice?
I will really appreciate your help.
    Best Regards,
    Valentino

    Hi Martin,
Thanks for your interest. I would be very happy to work with folks like you to slowly improve such roles as we find improvement possibilities for them, so we all benefit from the joint knowledge and cool features that go into them. I have been filing away at a set of them for years now; they are not evil but still useful, and I give them to an auditor without being concerned, as long as they can tell me approximately what they have been tasked to look into.
I then also show them the corresponding user menu of my role for these tasks, and then leave them alone for a while...
Anyway... SAP told me that if we host the content on SDN for the collaboration and documentation of changes to the files, then version management of the files can be hosted externally for downloading (actually, SAP does not have an option, because their software does not support it...).
I will rather host them on my own site and add the link in the SDN wiki, plus a sticky forum post linking to it, than use a generic download service, at least to start with. Via change management in the wiki, we can easily map this to version management of the files on a monthly periodic update cycle once there are enough changes to the wiki.
How about "Update Tuesday" as a maintenance cycle --> config updates each second Tuesday of the month... to remove authorizations that access backdoors which are more than "just display"...
    Cheers,
    Julius

  • Best Practices on managing two icloud accounts

Due to living in different countries, I require multiple iCloud accounts, because some apps are available in the App Store of one country but not the other. However, I find this also makes managing multiple devices more cumbersome. I was wondering if there are others in a similar situation who could share some best practices they've developed.
    For example:
    backing up to icloud
    iphoto streaming
    itunes
    purchases in appstore
    Thanks

    You're welcome.
Can I import the same music into both iTunes instances?
Yes, as long as you authorize the computer for each account that the music was purchased under. E.g., if the music was purchased under your account and you want the music in your wife's library, you would use her computer login, launch iTunes, sign her out of iTunes, sign in using your iTunes account details, choose Store > Authorize This Computer, sign out, then sign her back in.
Do I need to just add one new user?
Yes, and I assume you want full administrative powers, so you should set up your account as an administrator.
Once you've set things up this way, transferring purchased content (apps and music) is really easy. Simply launch iTunes, sign out of whatever account you want to transfer the content to, sign in using the account that has the content on the phone, choose File > Transfer Purchases, sign out after doing so, and then sign the other party back in.
You also want to make sure that you disable auto-sync when an iPhone is connected, under Preferences in the Edit menu. Do this for each login. You DO NOT want to sync to each other's account, just File > Transfer Purchases. That way you can share apps and music, which is permitted under the EULA for family members in the same household.

  • Best practice file management

    Hello everyone
I'm hoping to receive some advice on the best way to store a large collection of music, pictures and video, whilst keeping my computer as empty as possible to maximise its processing power for professional video editing in the future. The computer is for both personal and business use, and I share it with my partner, who is a music fiend but not computer savvy.
I have recently purchased a new 27" iMac with the standard specs (3.1 GHz quad-core Intel Core i5, 4 GB memory, 1 TB hard drive). It is currently running Snow Leopard (v10.6.6), and I plan to update to Lion when I purchase a new broadband account.
    I also have:
    One external 1TB Western Digital drive, currently at 85% capacity, with music, videos and pictures
    One external 1TB Western Digital drive, currently empty
    A new 2TB time capsule which is not yet set up
    Apple TV, not yet set up
    A mobile USB modem and basic account, soon to be replaced by a fairly high speed broadband modem with a fairly large download cap
    So far, I have:
Created three user profiles: one administrator and two users
Set up iTunes so each user shares a single library and iTunes media folder
    We would like to also digitise a large collection of CDs and records.
I was thinking about using the Time Capsule not only for storage/backup, but also to create a wireless network, allowing it to be kept with my printer while the hard drives are stored in a different room from the computer, out of sight. The computer is so very gorgeous on its own, after all... I've also been advised to use it as a wireless router when we purchase our broadband account, so I'm assuming the modem should also be connected to the Time Capsule in the other room.
Rather than droning on about what I think I should do, I wondered if one of you experts could advise me on the best way to set everything up? I'm not sure it's the best idea to set things up so the computer is always having to fetch files wirelessly from the Time Capsule and connected drives... Wouldn't that be slow?
    The advice I've read in the various forums has been rather confusing, so your advice would be really appreciated!!
    Cheers
    Fiona

    Hi Fiona,
I like your question; I'm in the same boat, new to the iMac altogether, and I want to set up a backup and sharing strategy via best practice right up front. Did you get any response, or run across any good best practices in your research that you can share with me? Thanks.

  • Best Practice Advice - Using ARD for Inventorying System Resources Info

    Hello All,
    I hope this is the place I can post a question like this. If not please direct me if there is another location for a topic of this nature.
    We are in the process of utilizing ARD reporting for all the Macs in our district (3500 +/- a few here and there). I am looking for advice and would like some best practices ideas for a project like this. ANY and ALL advice is welcome. Scheduling reports, utilizing a task server as opposed to the Admin workstation, etc. I figured I could always learn from those with experience rather than trying to reinvent the wheel. Thanks for your time.

Hey, I am also interested in any tips. We are gearing up to use ARD for all of our Macs, current and future.
I am having a hard time entering the user/pass for each machine; is there an easier way to do so? We don't have nearly as many Macs running as you do, but it's still a pain to do each one over and over. Any hints? Or am I doing it wrong?
    thanks
    -wilt
