Best approach for a multi-team/multi-project scenario.

Hi,
I'm looking for the best approach to handle a multi-team/multi-project scenario. We have 20 development groups and over 300 products, each product on its own schedule.
Product X can be assigned to Group A, but at some point it can be reassigned to Group B.
We are currently using TFS 2012, but will be upgrading to 2013 soon.
Based on a lot of reading, we are thinking of creating only one Team Project to ease management.
In it, we will create a team for each development group, but we will not create an associated area path named after the team:
- Group A
- Group B
- Group C
Then, we will create an area for each product:
- Product X
- Product Y
- Product Z
And we will create multiple levels of iterations to match each schedule:
- Product X
   - Release 1
      - Sprint 1
      - Sprint 2
- Product Y
   - Release 1
      - Sprint 1
   - Release 2
      - Sprint 1
The main issue we have with this approach is that we can't use the backlog or the task board effectively, as there is no way to filter by area and iteration.
From reading "How do I change the underlying query for the task board (and backlog board) on TFS Preview", this doesn't seem to be possible in TFS 2012.
In TFS 2013, "Agile Portfolio Management: Using TFS to support backlogs across multiple teams" was introduced. Will this help solve the problem?
We would create a management team for each development group.
We would create an agile team with an associated area for each product.
The only thing that I couldn't find in the documentation is how to reassign an agile team to another management team. Is this possible?
Also, can each agile team have its own specific iterations, and if so, will they roll up properly to the management team?
Regards
SYSOTI
PS: Sorry, I couldn't post the links for the quoted text because I get the message "Body text cannot contain images or links until we are able to verify your account." ;-(

Hi SYSOTI,
Based on your description, it seems the area path is not configured properly, hence you can't use the backlog or the task board effectively.
According to Agile Portfolio Management: Using TFS to support backlogs across multiple teams, the area path is set per agile team, which is a group of team members, not a product name. For your scenario, you can set the area path name to your product name to identify the associated product for work items. The groups you mentioned for the products in the team project are then sub-groups of contributors.
There seems to be no need to create a management team for each development group, since a management team sits at a higher level to view the progress of all work across the agile teams. Certainly, you can create multiple management teams, but each management team will be able to view the work of all agile teams.
If you have multiple teams and products, you can create a team project for each product if the products are not closely related. However, it is also fine to manage multiple products within the same team project, and working within a single team project has its own benefits; you can check this blog for more information.
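If you want to double-check how work items scope to a product area and iteration outside the boards, here is a minimal sketch using the TFS client object model (the collection URL, team project, area, and iteration names below are placeholders, not taken from your environment):
using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;
class AreaIterationQuerySample
{
    static void Main()
    {
        // Connect to the team project collection (placeholder URL).
        var collection = new TfsTeamProjectCollection(
            new Uri("http://yourtfsserver:8080/tfs/DefaultCollection"));
        var store = collection.GetService<WorkItemStore>();
        // WIQL scoped to one product area and one release iteration; this is
        // effectively how the boards filter once a team's default area path
        // and iteration path are configured.
        string wiql =
            "SELECT [System.Id], [System.Title], [System.State] " +
            "FROM WorkItems " +
            "WHERE [System.TeamProject] = 'SingleTeamProject' " +
            "AND [System.AreaPath] UNDER 'SingleTeamProject\\Product X' " +
            "AND [System.IterationPath] UNDER 'SingleTeamProject\\Product X\\Release 1'";
        foreach (WorkItem workItem in store.Query(wiql))
        {
            Console.WriteLine("{0}: {1}", workItem.Id, workItem.Title);
        }
    }
}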
Best regards,

Similar Messages

  • Best Practices for Defining NDS Java Projects...

    We are doing a Proof of Concept on using NDS to develop non-SAP Java applications.  We are attempting to determine if we can replace our current Java development tools with NDS/WAS.
    We are struggling with SAP's terminology and "plumbing" for setting up/defining Java projects.  For example, what is and when do you define Tracks, Software Components, Development Components, etc.  All of these terms are totally foreign to us and do not relate to our current Java environment (at least not that we can see).  We are also struggling with how the DTR and activities tie in to those components.
    If any one has defined best practices for setting up Java projects or has struggled with and overcome these same issues, please provide us with some guidance.  This is a very frustrating and time-consuming issue for us.
    Thank you!!

    Hi Peggy,
    In the Component Model, we divide software projects into small components. Components can use other components in a well-defined manner.
    A development object is a part of a component that can be changed or developed in some way; it provides the component with a certain part of its functionality. A development object may be a Java class, a Web Dynpro view, a table definition, a JSP page, and so on. Development objects are always stored as “sources” in a repository.
    A development component can be defined as a frame shared by a number of objects which are part of the software.
    Software components combine development components (DCs) into larger units for delivery and deployment.
    A track comprises the configurations and runtime systems required for developing software component versions. It ensures stable states of the deliverables used by subsequent tracks.
    The Design Time Repository (DTR) provides versioned source code management, distributed development of software in teams, and transport and replication of sources.
    You can also find lot of support in SDN for the above concepts with tutorials.
    Refer to this link for an overview of the Java Development Infrastructure (JDI):
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/webas/java/java development infrastructure jdi overview.pdf
    To understand further
    Working with Net Weaver Development Infrastructure :
    http://help.sap.com/saphelp_nw04/helpdata/en/03/f6bc3d42f46c33e10000000a11405a/content.htm
    In the above link you can find all the concepts clearly explained. You can also find the required tutorials for development.
    Regards,
    Vijith

  • Best settings for mixed codec doco project

    Hi
    Please forgive me for asking what I am sure has been asked before, but having done a few searches I was still unsure as to what is the best approach for this project.
    I have just come over to FCP from Avid (hence knowing very little about FCP), and have a documentary project that I want to cut in FCP.
    It was mostly shot on XDcam EX 1080/25p 35mbps, but there is a fair amount of B camera footage that is DVCProHD 1080/25p 100mbps
    The general consensus seems to be to open a project in the format that is most prevalent, i.e. XDCAM EX.
    As this project is only going to be offlined in this system, and then conformed elsewhere, and as I would like to be able to have all the media on one drive so I can cut on the workstation and on my laptop where necessary, it seems sensible to use a more compressed offline res as there is well over 100 hours of footage.
    Any guidance would be greatly appreciated as I don't want to cut in a codec/format that is just going to give me headaches when it comes to doing the conform, or during the edit for that matter.
    Cheers
    Adam
    Specs:
    Workstation - 2 x 2.4GHZ quad core Xeon, 16GB ram, 1TB system, 2TB primary scratch, 4TB external media drive (shared with laptop)
    Laptop - 2.8ghz core i7, 8GB ram, 500GB internal drive, 4TB external media drive (shared with workstation)
    FCP 7.0.3 running on systems

    >it seems sensible to use a more compressed offline res as there is well over 100 hours of footage.
    Sorry Avid-man...going to have to let go of your "offline/online" workflow ideals.  There is no offline in FCP. Well, it is best if there isn't.  You will...SHOULD...be working with XDCAM and DVCPRO HD at full resolution.  So your cut will then just be color corrected.  100 hours of XDCAM is 1.7TB.  100 hours of DVCPRO HD is well...5.4 TB...but you will have a mixture of the two.  I had 80 hours of DVCPRO HD and all that fit on 2TB...but then again, that was at 720p.  Still...unless you REALLY know what you are doing...unless you have an FCP expert on staff...do not offline/online. 
    ANYWAY...I don't know who this "General Consensus" is, but they are WRONG.  They couldn't be wronger if they wanted to be wrong about being wrong.  Well, they are wrong.  NEVER edit ANYTHING that is not XDCAM in an XDCAM sequence...not unless you like crashing a lot...and then blaming FCP for sucking and not being as good as an Avid....because you will do that if you follow this General person's advice. 
    No...Use a ProRes sequence setting, and edit both on that.  XDCAM is a GOP format...no real frames...just a group of pictures.  So if you throw a real frame format onto that...crash city.  But use ProRes...and you will be fine. 
    So use both formats natively...don't try to go offline/online.  And edit both in a ProRes sequence.

  • What is the best approach for combining events?

    When I work on a wedding my current workflow involves creating a compound clip for each section of the video (e.g. reception, ceremony, dancing etc). Then I add the compound clip 'sequences' into a single project to add the chapter markers and export to a single master file.
    I like the idea of managing each section in a project rather than a compound clip now that projects are part of the library in 10.1, but is there a good way to combine multiple projects (for each section) into a single master project, or would I still need to copy the contents of each project and paste in the master project?
    Maybe I am best to continue with my current workflow.

    Just saw the discussion title - should have said "What is the best approach for combining projects"?

  • What's the best approach for handling about 1300 connections in Oracle?

    What's the best approach for handling about 1300 connections in Oracle 9i/10g through a Java application?
    1. Using separate schemas for the various types of users. (We can store only the relevant data in a particular schema, and the number of records per table can be reduced by replicating tables, but we would then have to maintain all data in another schema as well. We would need to update two schemas in a given session, because we maintain a separate schema for one user and another schema for all data, which may lead to update problems.)
    OR
    2. Using a single schema for all users.
    Note: All users may access the same tables, and there may be far more records than in the previous case.
    Which is the best case?
    Please give your valuable ideas.

    That is true, but I want a solution from you all. I want you to tell me how to fix my friend's car.

  • Best approach for IDOC - JDBC scenario

    Hi,
    In my scenario I am creating a sales order (ORDERS04) in the R/3 system, which needs to be replicated to a SQL Server system. I am sending the order to XI as an IDoc and want to use JDBC for sending the data to SQL Server. I need to insert data into two tables (header & details). Is it possible without BPM? Or what is the best approach for this?
    Thanks,
    Sri.

    Yes, this is possible without BPM.
    Just create the corresponding data type for the insertion.
    If the records to be inserted are different, then there will be 2 different data types (one for header and one for detail).
    Do a multi-mapping, where your source is mapped into the header and detail data types, and then send the data using the JDBC receiver adapter.
    For the structure of your data type for the insertion, just check this link:
    http://help.sap.com/saphelp_nw04/helpdata/en/7e/5df96381ec72468a00815dd80f8b63/content.htm
    To access any Database from XI, you will have to install the corresponding Driver on your XI server.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3867a582-0401-0010-6cbf-9644e49f1a10
    Regards,
    Bhavesh

  • What are the best approaches for mapping re-start in OWB?

    What are the best approaches for mapping re-start in OWB?
    We are using OWB repository 10.2.0.1.0 and OWB client 10.2.0.1.31. The Oracle version is 10 G (10.2.0.3.0). OWB is installed on Linux.
    We have number of mappings. We built process flows for mappings as well.
    I would like to know the best approaches to incorporate restart options in our process, i.e. on a failure of a mapping in a process flow.
    How do we recycle failed rows?
    Are there any built-in features/best approaches in OWB to implement the above?
    Do the runtime audit tables help us build a restart process?
    If not, do we need to maintain our own (custom) tables to hold such data?
    How did other forum members handle the above situations?
    Any idea ?
    Thanks in advance.
    RI

    Hi RI,
    >How many mappings (range) do you have in a process flow?
    Several hundred (100-300 mappings).
    >If we have three mappings (e.g. m1, m2, m3) in a process flow, what will happen if m2 fails?
    Suppose the mappings are connected sequentially (m1 -> m2 -> m3). When m2 fails, the process flow is suspended (the transition to m3 will not be performed). You should remove the cause of the error (modify the mapping and redeploy, correct the data, etc.) and then repeat the m2 mapping execution from the Workflow monitor - open the diagram with the process flow, select mapping m2, click the Expedite button and choose the option Repeat.
    >On restart, will it run m1 again and then m2 and so on, or will it restart at row 1 of m2?
    You can specify the restart point. "At row 1 of m2" - I don't understand what you mean (all mappings run in set-based mode, so in case of an error all table updates are rolled back; there are several exceptions - for example multiple target tables in a mapping without correlated commit, or an error in a post-mapping - where you must carefully analyze the results of the error).
    >What will happen if m3 fails?
    The process is suspended and you can restart execution from m3.
    >By running without failover and with max. number of errors = 0, you achieve recycling failed rows to zero (0).
    These settings guarantee only two possible return results of a mapping - SUCCESS or ERROR.
    >What is the impact if we have a large volume of data?
    In my opinion, for large volumes set-based mode is the preferred processing mode.
    With this mode you have the full range of enterprise features of the Oracle database - parallel query, parallel DML, nologging, etc.
    Oleg

  • Best approach for RFC call from Adapter module

    What is the best approach for making an RFC call from a receiver file adapter module?
    1. JCo
    2. Is it possible to make use of the MappingLookupAPI classes to achieve this, or do those run only in the mapping runtime environment?
    3. Any other way?
    Has anybody ever tried this? Any pointers????
    Regards,
    Amol

    Hi ,
    The JCo lookup is internally the same as a JCo call; the only difference is that you are not hardcoding the system-related data in the code, so it is easier to maintain during transports.
    The JCo lookup code is also more readable.
    Regards
    Vijaya

  • Best Approach for Reporting on SAP HANA Views

    Hi,
    Kindly provide information regarding the best approach for reporting on HANA views for the architecture displayed below:
    We are looking for information mainly around the following points.
    There are two reporting options known to us:
    1. Reporting on HANA views through SAP BW (View > VirtualProvider > BEx > BI 4.1)
    2. Reporting on HANA views in ECC using BI 4.1 tools
    Which is the best option for reporting (please provide supporting reasons, i.e. advantages and limitations)? In case a better approach exists, please let us know.
    Also, what is the best approach for reporting on a mixed scenario in which data from BW and HANA views is to be used together?

    Hi Alston,
    To be honest, I did not understand the architecture that you described in your message.
    As far as I understood, you have one HANA instance, with ERP and BW running on HANA. Or there might be two HANA instances, with ERP and BW running independently.
    Anyway, if you have HANA you have many options to present data using analytic views. You also have BW on HANA as the EDW, so for both you can use BO, as well as Lumira, for presenting data.
    Check this document as well: http://scn.sap.com/docs/DOC-34403

  • What's Best Approach for Multitrack Classical Music?

    Can someone suggest the best approach for recording classical musicians onto
    four tracks? In this scenario, they play until they make a mistake on, say,
    measure 24, stop, then (take 2) go back to measure 20 and play until the next
    rough spot, and so on. Ultimately there may be 15 takes that all need to be
    trimmed and stitched together.
    In the old (tape) days, this was pretty basic editing. I would use a blade and block
    to cut out all the bad stuff on the multitrack tape, then I could mix. But how do I
    do this in Audition? (I use version 1.5.)
    I can't do the cuts in edit view because the tracks would get out of sync.
    Assuming all the takes are in one session, in multitrack view, this most basic of functions seems to elude me. What am I missing?

    Al the Drifter wrote:
    If you follow Steve's advice, and after doing the edits you discover
    that one instrument should come up 1db, you are screwed.
    I could be wrong about this in the classical music environment,
    where things are not close-mic'ed but if I am, I am confident Steve
    will correct me.  Ha.
    You always run the risk of small changes between takes - and that's where Audition 3 and the new improved crossfades score rather heavily. You won't notice 1dB on a single instrument across a fade though - it's hard to spot this as a jump, even, unless it's on pure tone. No, I very rarely close-mic stuff at all, although I did with a clavichord recently - it's seriously too quiet to mic any other way.
    jaypea500 wrote:
     when recording classical music, any engineer worth anything has the mix down pat as it's being recorded. 
    That's the way they used to work, certainly - but not nowadays, especially if it's done on location, which most classical recording is. What's more likely to happen is that you'd use decent mic preamps feeding straight into a multitrack, or even some software on a laptop. I generally record like that - but I also feed the multitrack outputs to a Yamaha mixer via ADAT, do a mix on that and record it back to a spare multitrack pair. I don't actually need to do that - but having a mix available from the multitrack that's pretty much there is good as far as being able to play back takes to conductors is concerned.
    Of course, one of the other reasons that classical sessions recorded on location aren't mixed on the spot is that the monitoring conditions are invariably far from ideal, and I'd have it that no engineer worth anything would ever risk a final mix done on location.
    But I only get paid to do all of this on a regular basis, so what would I know? Must be something though - my customers come back for more...

  • Best approach for building dialogs based on Java Beans

    I have a large amount of Java Beans with several properties each. These represent all the "data" in our system. We will now build a new GUI for the system and I intend to reuse the beans as far as possible. My idea is to automatically generate the configuration dialogs for each bean using the java.beans package.
    What is the best approach for achieving this? Should I use PropertyEditors or should I make my own dialog-generator using the Introspetor class or are there any other suitable solutions?
    All suggestions and tips are very welcome.
    Thanks!
    Erik

    Definitely, it is better for you to use JTable. Why not try it?

  • Help Please!!  Best Practices for building an NDS Project...

    We are doing a Proof of Concept on using NDS to develop non-SAP Java applications. We are attempting to determine if we can replace our current Java development tools with NDS/WAS.
    We are struggling with SAP's terminology and "plumbing" for setting up/defining Java projects. For example, what is and when do you define Tracks, Software Components, Development Components, etc. All of these terms are totally foreign to us and do not relate to our current Java environment (at least not that we can see). We are also struggling with how the DTR and activities tie in to those components.
    If any one has defined best practices for setting up Java projects or has struggled with and overcome these same issues, please provide us with some guidance. This is a very frustrating and time-consuming issue for us.
    Thank you!!

    Hello Peggy,
    this is my first post but I hope it helps you anyway.
    To learn the SAP "language" I additionally used an SAP presentation regarding the SAP JDI:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/documents/a1-8-4/java development infrastructure real world use webinar.pdf
    I think this one is quite useful as an add-on to the other links and information you already got. Your name also indicates that your mother tongue is German. If so, the German version of the book (Java Programming with the SAP WAS) is already available for purchase and really useful. You can also use the information provided by the University of Potsdam; they have an introduction on how to set up a track in the SLD and then how to set up SCs:
    http://epic.hpi.uni-potsdam.de/nwlab/SC+Track.html
    Hope this helps...

  • Best approach for publishing a paid version and an ad supported free version of the same app

    Hi,
    One of my Windows 8 Store apps is almost ready for store submission.
    What is the best approach for publishing a paid version and an ad-supported free version of the same app?
    Can I do the following?
    1. Submit the app with an unlimited free trial to the store
    2. During the free trial, ads will be displayed
    3. If the user purchases the app, then the ads will not be displayed
    Any advice is greatly appreciated.
    Best Regards

    Although the in-app purchase option is good, for ad-based apps my approach is different.
    I would suggest putting 2 different apps in the store, one free with ads and one without. The reason is that you don't want the extra references and XAML ad controls, however many you have, in the paid version of the app. I would keep my apps as light and clean as possible, especially when it is a paid app.
    I currently manage both the free and the paid app through one solution and reuse most of the code except for the views.
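    That said, if you prefer the single trial-enabled app described in the question, a minimal sketch of gating the ads on the trial license could look like this (the helper class and method names are hypothetical):
    using Windows.ApplicationModel.Store;
    public static class AdGate
    {
        // Show ads only while the app is running as an unlimited free trial.
        // Once the user purchases the app, LicenseInformation.IsTrial becomes
        // false and the ad control can be hidden or removed.
        public static bool ShouldShowAds()
        {
            LicenseInformation license = CurrentApp.LicenseInformation;
            return license.IsActive && license.IsTrial;
        }
    }
    During development, CurrentAppSimulator can be used in place of CurrentApp to simulate the trial and purchased states.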
    Binoj Daniel www.CodeRewind.com

  • Best approach for uploading document using custom web part-Client OM or REST API

    Hi,
    I am using my custom upload Visual Web Part for uploading documents into my document library with a lot of metadata.
    These columns include single line of text, drop-down list, lookup columns, and managed metadata (taxonomy) columns.
    So I would like to know which is the best approach for uploading.
    Currently I am trying to use the traditional SSOM (server object model). I would like to know which is the best approach for uploading files into document libraries.
    I have hundreds of sub-sites with 30+ document libraries within those sub-sites. Currently it takes a few minutes to upload the files in my dev environment. I am just wondering what would happen when the number of sub-sites reaches a hundred!
    I am looking at this from the performance perspective.
    My thought process is:
    1) Implement Client OM
    2) REST API
    Has anyone tried these approaches before, and which approach provides better performance?
    If anyone has sample source code or links, please provide them.
    Also, are there any restrictions on the size of the file uploaded?
    Any suggestions are appreciated!

    Try below:
    http://blogs.msdn.com/b/sridhara/archive/2010/03/12/uploading-files-using-client-object-model-in-sharepoint-2010.aspx
    http://stackoverflow.com/questions/9847935/upload-a-document-to-a-sharepoint-list-from-client-side-object-model
    http://www.codeproject.com/Articles/103503/How-to-upload-download-a-document-in-SharePoint
    using Microsoft.SharePoint.Client;
    public void UploadDocument(string siteURL, string documentListName,
        string documentListURL, string documentName,
        byte[] documentStream)
    {
        using (ClientContext clientContext = new ClientContext(siteURL))
        {
            // Get the document library by title
            List documentsList = clientContext.Web.Lists.GetByTitle(documentListName);
            var fileCreationInformation = new FileCreationInformation();
            // Assign the content byte[], i.e. documentStream
            fileCreationInformation.Content = documentStream;
            // Allow overwrite of the document
            fileCreationInformation.Overwrite = true;
            // Upload URL
            fileCreationInformation.Url = siteURL + documentListURL + documentName;
            Microsoft.SharePoint.Client.File uploadFile = documentsList.RootFolder.Files.Add(
                fileCreationInformation);
            // Update the metadata for a field named "DocType"
            uploadFile.ListItemAllFields["DocType"] = "Favourites";
            uploadFile.ListItemAllFields.Update();
            clientContext.ExecuteQuery();
        }
    }
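    For completeness, a call to the method above might look like this (the site URL, library name, and file path are hypothetical examples):
    byte[] content = System.IO.File.ReadAllBytes(@"C:\Temp\Spec.docx");
    UploadDocument("http://sp2013/sites/dev", "Documents", "/Shared Documents/", "Spec.docx", content);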
    If this helped you resolve your issue, please mark it Answered

  • Best approach for IDs mapping..

    Hello,
    I'd like to ask about your experience with a classic integration problem: the mapping of IDs (materials, partners, ...).
    What is the best approach for integration between SAP and other systems? Can you give me some hints?
    Thanx, Peter

    Hi Peter,
    you have 4 ways to do it:
    1. you can do it inside an integration process:
    RFC call for checking a table with ID -> ID mappings
    (not so good as you have to use an integration process)
    but very easy to build as this is standard
    2. a table in R/3 and changing the values in a user exit
    (you maintain the data in a table in R/3)
    the fastest way (no calls to other programs)
    but you have to create user exits and
    this is not why you (your client) bought the XI  
    3. you can use this new RFC API
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/uuid/801376c6-0501-0010-af8c-cb69aa29941c
    which seems to be the best approach
    as you don't need BPM for this and it's a standard
    4. value mapping tables in XI...
    Regards,
    michal
