Urgent: Effort estimation of migration work. Any standards?

Hi,
I wanted to know whether there is any "standard" for estimating the "work effort" (in person-months or other terms) to establish how much work actually has to be done. For example, is there any categorisation of migrations, e.g. complex, simple, etc.?
Could you please help!

It will help, but you might end up with errors if you have lots of customizations. So I would say the best approach is to migrate from SP 2007 via SP 2010 and then to SP 2013. Plan the migration application by application (database by database), perform a number of dry runs in the test environments, find out what the potential issues are, and make a note of them. With the direct approach you will not have issues in terms of data loss, customization, or metadata; the effort is higher, but the migration will be good and smooth. With Quest or any other third-party tool, I cannot say you will have a smooth migration, based on my past experience with them.
Thanks, Ram Ch
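There is no industry standard, but a simple sizing model along the lines the question suggests (categorising each site or application as simple, medium, or complex) is a common starting point. A minimal sketch in Python follows; every weight in it is a hypothetical assumption that must be calibrated against your own dry runs, not an official figure:

```python
# Hypothetical sizing model: there is no official standard, so these
# per-category weights (person-days per site collection) are assumptions
# to be calibrated against your own dry-run measurements.
EFFORT_PER_SITE = {"simple": 0.5, "medium": 2.0, "complex": 5.0}

def estimate_effort(inventory, overhead_factor=1.3):
    """Sum the per-site effort, then apply a contingency factor covering
    dry runs, issue fixing and validation (the 1.3 is also an assumption)."""
    base = sum(EFFORT_PER_SITE[kind] * count
               for kind, count in inventory.items())
    return base * overhead_factor

# Example inventory: 40 simple, 10 medium, 3 complex site collections.
print(estimate_effort({"simple": 40, "medium": 10, "complex": 3}))
```

The overhead factor is meant to absorb exactly the work described above: dry runs in test environments, noting potential issues, and fixing them.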

Similar Messages

  • Migration Effort estimation.

     
    Hi All,
    Nowadays we are getting a lot of enquiries from customers about migrating SharePoint 2007 to SharePoint 2013. The biggest challenge I am facing in migration is estimating its cost.
    We are using the content database upgrade approach for migration.
    Does anyone know of a free tool that can help me estimate the migration effort?
    Regards Restless Spirit


  • EP effort estimation

    Hi,
    Does SAP provide any guidelines or rules of thumb for effort estimation of EP development, implementation, etc.?
    Please let me know the details.
    regards,
    Sujesh

    Hi,
    Providing an effort estimate for EP is highly vague, as it depends entirely on the kind of development and implementation to be done. You could be more specific about the areas you would like information on, such as development using Web Dynpro, etc.
    The architecture and process workflow also play decisive roles in the estimate.
    As of now, <a href="https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/4c2b16e2-0a01-0010-0eab-9a90926d5947">here</a> is an estimation for a migration project.
    Additional <a href="https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/2eb43136-0301-0010-9b96-b00377384497">info</a> on the same project.
    Regards,
    Sujana

  • Best Practice for Portal Patches and effort estimation

    Hi ,
    One of our clients is applying the following patches:
    1. ECC 6.0 SP15 (currently SP14)
    2. ESS/MSS SP15 (currently SP14, with some functional customization)
    3. EP 7 SP18 (currently SP14)
    We would like to know the best practice for applying portal patches and the effort estimate for redoing the portal development on the new patch level.
    o   What is the overall level of effort in applying portal patches?
    o   How are all the changes to SAP objects handled? Do they have to be
         manually re-entered?
    o  What is the impact of having a single NWDI instance across the
        portal landscape during the patch process?
    Regards,
    Revathi Raju.

    Hi Revathi,
    o What is the overall level of effort in applying portal patches?
    The overall effort to apply the patch is approximately half a day to one day for a NW7 system. This excludes downloading the patch files, which depends on your download speed.
    o How are all the changes to SAP objects handled? Do they have to be manually re-entered?
    That depends on your customization. Normally it won't be affected if you created the customized application separately from the SAP standard application.
    o What is the impact of having a single NWDI instance across the portal landscape during the patch process?
    Any change that is related to NWDI might need to be re-deployed from NWDI itself.
    Thanks
    Regards,
    AZLY

  • Effort estimation for upgrading Sap Net weaver 7.0 to EHP2 SP6

    HI,
    We have BI 7.0 on SAP NetWeaver 7.0 and want to upgrade it to EHP2 SP6.
    The current patch level is 16, with an Oracle 10.2.2.0 database.
    Can anyone tell me how much time and effort is needed?
    My customer wants an effort estimate for the EHP2 upgrade.
    Please give me a brief idea about this activity.
    Regards,
    Maqsood

    > I would like detailed information on the time/effort estimation for the EHP2 upgrade process, for each of the phases below:
    >
    > 1. Initialization
    > 2. Extraction
    > 3. Configuration
    > 4. Checks
    > 5. Preprocessing
    > 6. Downtime (in general, how much downtime does the system take during this phase?)
    > 7. Postprocessing
    > 8. Finalization
    >
    > For each phase: (1) what does the system do in this phase, (2) how much time does it take, and (3) what is the impact on the server (available or not available)?
    Nobody can tell you any time values. This depends on the hardware used, the speed of the underlying I/O subsystem, the number of processes you use, the speed of the server, the number of languages, etc.
    Phases 1-5 are done online; nobody will notice, because all the work is done in a shadow instance. "Downtime" says it all, and phases 7 and 8 are post-processing work.
    To find out, make a copy of the production system on a sandbox, play with the values, and see for yourself how much time you need.
    Markus
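    In the spirit of this advice, the sandbox measurements can be recorded with any simple stopwatch harness. A hedged sketch in generic Python (this is not an SAP tool; the phase names and sleeps below are placeholders standing in for the real phase commands of the dry run):

    ```python
    import time
    from contextlib import contextmanager

    timings = {}

    @contextmanager
    def phase(name):
        """Record the wall-clock duration of one phase of a dry run."""
        start = time.perf_counter()
        try:
            yield
        finally:
            timings[name] = time.perf_counter() - start

    # Example dry run: replace the sleeps with the real phase commands.
    with phase("extraction"):
        time.sleep(0.1)
    with phase("preprocessing"):
        time.sleep(0.2)

    for name, seconds in timings.items():
        print(f"{name}: {seconds:.1f} s")
    ```

    Run this once per sandbox pass and the recorded durations give a defensible, hardware-specific basis for the downtime estimate the customer is asking for.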

  • Effort estimation report

    Hi,
    I need to develop an effort estimation report. If anyone has worked on this, could you please let me know how I can do it?
    Thanks in advance.

    Hi,
    Development time is highly variable, and there is no standard method for effort estimation reports. You can maintain a spreadsheet in which you record how much time is needed to create a fresh report from scratch, to make small changes, and so on.
    You can just have a look into this:
    Link: [http://yunus.hacettepe.edu.tr/~sencer/objectp.html]
    Hope this helps you...

  • How to migrate from a standard store setup to a split store (msg/idx) setup

    How can I migrate from a standard store setup to the split setup described in
    https://wikis.oracle.com/display/CommSuite/Best+Practices+for+Messaging+Server+and+ZFS ?
    Can a 'reconstruct' run do the migration, or do I have to do an
    imsbackup - imsrestore?

    If your new setup would use the same filesystem layout as the old one (i.e. directory paths to the files would be the same when your migration is complete), you can just copy the existing store into the new structure, rename the old store directory to some other name, and mount the new hierarchy in its place (zfs set mountpoint=...). The CommSuite Wiki also includes pages on more complex migrations, such as splitting the user populace into several stores (on different storage) and/or separate mailhosts. That generally requires that you lock the user in LDAP (perhaps deferring their incoming mail for later processing into the new location), migrate their mailbox, rewrite the pointers in LDAP, and re-enable the account. The devil is in the details, for both methods. For the latter, see the Wiki; for the former I'll elaborate a bit here:
    1) To avoid any surprises, you should stop the messaging services before making the filesystem switch, finalize the data migration (probably with the prepared data already mostly correct in the new hierarchy before you shut down the server, just resync'ing the recent changes into the new structure), make the switch and re-enable the server. If this is a lightly-used server which can tolerate some downtime, good for you. If it is a production server, you should schedule a time when it is not heavily used so you can shut it down, and try to be fast - so perhaps practice on a test system or a clone first.
    I'd strongly recommend taking this adventure in small reversible steps, using snapshots and backups, and renaming old files and directories instead of removing them - until you're sure it all works, at least.
    2) If your current setup already includes a message store on ZFS, and it is large enough for size to be a problem, you can save some time and space by tricks that lead to direct re-use of existing files as if they are the dataset with a prepopulated message store.
    * If this is a single dataset with lots of irrelevant data (i.e. one dataset for the messaging local zone root with everything in it, from OS to mailboxes) you can try zfs-cloning a snapshot of the existing filesystem and moving the message files to that clone's root (eradicating all irrelevant directories and files on the clone). Likewise, you'd remove the mailbox files on the original system (when the time is right, and after sync-ing).
    * If this is already a dedicated store dataset which contains directories like dbdata/, mboxlist/, partition/ and session/, and which you want to split further to store just some files (indices, databases) separately, you might find it easier to just make new filesystem datasets with proper recordsizes and relocate those files there, and move partition/primary to the remaining dataset's root, as above. In our setups the other directories only take up a few megabytes and are not worth the hassle of cloning - which you can also do for larger setups (i.e. make 4 clones and put different data at each one's root). Either way, when you're done, you can and should make sure that these datasets mount properly into the hierarchy, yielding the pathnames you need.
    3) You might also look into separating the various log-file directories into datasets, perhaps with gzip-9 compression. In fact, to reduce needed IOPS and disk space at expense of available CPU-time, you might use lightweight compression (lzjb) on all messaging data, and gzip on WORM data sets - local zone, but not global OS, roots; logs; etc. Structured databases might better be left without compression, especially if you use reduced record sizes - they might just not compress enough to make a difference, just burning CPU cycles. Though you could look into "zle" compression which would eliminate strings of null bytes only - there's lots of these in fresh database files.
    4) If you need to recompress the data as suggested in point (3), or if you migrate from some other storage to ZFS, rsync may be your friend (at least, if your systems don't rely on ZFS/NFSv4 ACLs - in that case you're limited to Solaris tar or cpio, or perhaps to very recent rsync versions which claim ACL support). Namely, I'd suggest "rsync -acvPHK --delete-after $SRC/ $DST/" with maybe some more flags added for your needs. This would retain the hardlink structure which Messaging server uses a lot, and with "-c" it verifies file contents to make sure you've copied everything over (i.e. if a file changes without touching the timestamp).
    Also, if you were busy preparing the new data hierarchy with a running server, you'd need to rsync old data to new while the services are down. Note that reading and comparing the two structures can take considerable time - translating to downtime for the services.
    Note that if you migrate from ZFS to ZFS (splitting as described in (2)), you might benefit from "zfs diff" if your ZFS version supports it - this *should* report all objects that changed since the named snapshot, and you can try to parse its output and feed it to rsync or some other migration tool.
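    After a copy such as the rsync above, it can be reassuring to verify the result independently of the copy tool. A small sketch in Python that maps each file to its SHA-256 checksum and hard-link count, so the old and new stores can be compared for both content and hardlink structure (the paths in the final comment are placeholders, not real store locations):

    ```python
    import hashlib
    import os

    def tree_digest(root):
        """Map each relative file path under root to a (sha256, hard-link
        count) pair, so two store copies can be compared."""
        result = {}
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                full = os.path.join(dirpath, name)
                rel = os.path.relpath(full, root)
                h = hashlib.sha256()
                with open(full, "rb") as f:
                    for chunk in iter(lambda: f.read(1 << 20), b""):
                        h.update(chunk)
                result[rel] = (h.hexdigest(), os.stat(full).st_nlink)
        return result

    # Hypothetical usage after migration, with services stopped:
    # assert tree_digest("/store.old") == tree_digest("/store.new")
    ```

    Reading both trees takes about as long as the rsync comparison itself, so budget this into the downtime window too.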
    Hope this helps and you don't nuke your system,
    //Jim Klimov

  • Flash Player does not work with Standard User, but will work when browser is "Run as Administrator"

    I don't understand why this is happening, but Flash Player will not work when a standard user is using the browser; it only works when the browser is run as an administrator. Is there any way to fix this? I've tried uninstalling and reinstalling the player, and it still wouldn't work.
    The user is on Windows 7, 64-bit, using IE 10.
    Flash not working for Standard User - YouTube

    Any update to provide here at all, guys? Again, in my situation it's very much rights-related: as a standard user, the Adobe Flash Version Detection website doesn't even report that Flash Player exists (despite it showing up in Control Panel and under Add/Remove Programs). I've already tried giving the C:\Windows\System32\Macromed folder and its files/subfolders the appropriate permissions for the standard user, and still nothing. If I either give the user in question full local admin rights or log on as the domain admin, then the Adobe Flash Version Detection website says Flash is installed and Flash works fine.
    Thoughts???

  • Mini displayport doesn't work any more

    My Mini DisplayPort doesn't work any more with VGA or DVI, just with the Apple Cinema Display. It started a week ago. I've tried many different monitors, but it seems it only works with the Apple Cinema Display; a Thunderbolt Ethernet adapter is also accepted, but nothing else. PRAM and SMC resets, booting from a hard disk with Mavericks (I'm on 10.10.2), Safe Mode: nothing works. I have an 11" MacBook Air from mid-2013, no warranty any more. Today I was at Bense, a German reseller (thank you guys for your effort); they ran the same tests and some more, but also found nothing. Is there anyone with the same problem and a solution for it?
    Cheers
    Ayhan

    Does it work with the other Command key? I typically use the one to the left of the space bar, and at one point it stopped working, but the Command key to the right of the keyboard still worked. Then I went back to the one that wasn't working, clicking slowly rather than quickly, and now it seems to work again. I don't know why, but that is what happened for me.

  • Is there any standard report for capacity of machines used/idle

    Hi,
    I am interested in knowing of any standard report which tells us how long my machine (work center) was idle and how many hours I used it during a period of, say, one month.
    Also, how can we use capacity planning and work scheduling in repetitive manufacturing? Please tell me the procedure to be followed.
    Thanks in advance
    regards
    madan mohan

    Hi,
    Thanks for your reply, but I am not able to get a report on the total capacity available in the last month and the capacity I utilized in that month, so that I can derive my work center's idle time during that past month. This would give the time I have kept the machine idle. Is there any standard report meeting this requirement?
    Please give me some solution.
    regards
    madan

  • Error in Starting Migration Work Bench

    I have installed Oracle Enterprise 8.1.7 with the Migration Workbench installed as well. When starting the Migration Workbench from the Start menu, I run into the following error. I would appreciate any insight regarding the problem.
    ** Oracle Migration Workbench
    ** Release 1.3.0.0.0 Production
    ** ( Build 18072000 )
    ** ORACLE_HOME: C:\oracle\ora81
    ** user language: en
    ** user region: US
    ** user timezone: CST
    ** file encoding: Cp1252
    ** java version: 1.1.7.30o
    ** java vendor: Oracle Corporation
    ** o.s. arch: x86
    ** o.s. name: Windows NT
    ** o.s. version: 5.0
    ** Classpath:
    C:\oracle\ora81\Omwb\olite\Oljdk11.jar;C:\oracle\ora81\Omwb\olite\Olite40.jar;C:\Program Files\Oracle\jre\1.1.7\lib\rt.jar;C:\Program Files\Oracle\jre\1.1.7\lib\i18n.jar;C:\oracle\ora81\Omwb\jlib;C:\oracle\ora81\Omwb\plugins\SQLServer6.jar;C:\oracle\ora81\Omwb\plugins\Sybase.jar;C:\oracle\ora81\Omwb\plugins\MSAccess.jar;C:\oracle\ora81\Omwb\plugins\SQLAnywhere.jar;C:\oracle\ora81\Omwb\plugins\SQLServer7.jar;C:\oracle\ora81\Omwb\jlib\omwb-1_3_0_0_0.jar;C:\oracle\ora81\jdbc\lib\classes111.zip;C:\oracle\ora81\lib\vbjorb.jar;C:\oracle\ora81\jlib\ewt-swingaccess-1_1_1.jar;C:\oracle\ora81\jlib\ewt-3_3_6.jar;C:\oracle\ora81\jlib\ewtcompat-opt-3_3_6.zip;C:\oracle\ora81\jlib\share-1_0_8.jar;C:\oracle\ora81\jlib\help-3_1_8.jar;C:\oracle\ora81\jlib\ice-4_06_6.jar;C:\oracle\ora81\jlib\kodiak-1_1_3.jar
    ** Started : Mon Jun 30 11:06:10 CDT 2003
    Exiting
    java.lang.NullPointerException: cannot add null item to LWChoice
         at oracle.ewt.lwAWT.LWChoice.addItem(Unknown Source)
         at oracle.mtg.migrationUI.LoginDialog._run8iLiteEnabledDialog(LoginDialog.java:509)
         at oracle.mtg.migrationUI.LoginDialog.run(LoginDialog.java:358)
         at oracle.mtg.migrationUI.MigrationApp.init(MigrationApp.java:272)
         at oracle.sysman.emSDK.client.appContainer.WebApplication.main(WebApplication.java:2876)

    Please post your question in the Migration Workbench forum. The URL is:
    Database and Application Migrations
    You may also be interested in OTN's Migration Center.
    The URL is:
    http://otn.oracle.com/tech/migration/

  • Effort Estimation for a New BI Implementation Project

    Dear Experts,
    I need an effort estimation model (an Excel sheet or any other document) for a new BI implementation project.
    I want to know how much time it would take for the following two scenarios:
    1. Activating the Business Content objects in BI 7 for all BC extractors in ECC.
    2. If the content is not suitable for the requirement, what is the time frame for creating the customized BI objects, DS, ES, and all in ECC?
    I know the same has been circulated to personal e-mail IDs earlier from this forum.
    <removed>
    Thanks,
    APC
    Edited by: Arun Varadarajan on Mar 24, 2009 5:20 PM

    Look into this link:
    http://www.slideshare.net/arun_bala1/asap-methodology

  • Problem on 10g Express using join (+), but works on Standard 10g

    Good Day all,
    I am having a problem with the following query:
    select srds.specific_date,l.DayCntr
    from (select add_months(sysdate,-1)+ rownum * 5 as Specific_date from all_objects where rownum < 35) srds,
    (select rownum as DayCntr from all_objects oc where rownum < 35) l
    where l.DayCntr < 32
    and l.DayCntr = (to_number(to_char(srds.specific_date(+),'DD')))
    and ((srds.specific_date between add_months(sysdate,-1) and sysdate)
    or (srds.specific_date is null))
    order by daycntr;
    On Express 10g, not all rows are returned as expected: I expect a minimum of 31 rows, but sometimes get 20.
    On 10g Standard it does work. Any ideas?
    Thanks

    For me it works fine in both Oracle 10.2.0.3 and Oracle XE, and returns the same result. Maybe it is not a query problem but an installation problem.
    10 G
    Connected.
    @10g> select srds.specific_date,l.DayCntr
      2  from (select add_months(sysdate,-1)+ rownum * 5 as Specific_date from all_objects where rownum < 35) srds,
      3  (select rownum as DayCntr from all_objects oc where rownum < 35) l
      4  where l.DayCntr < 32
      5  and l.DayCntr = (to_number(to_char(srds.specific_date(+),'DD')))
      6  and ((srds.specific_date between add_months(sysdate,-1) and sysdate)
      7  or (srds.specific_date is null))
      8  order by daycntr;
    SPECIFIC_DATE          DAYCNTR
                                 1
                                 2
    05/11/2008 15:20:10          5
                                 6
                                 7
    10/11/2008 15:20:10         10
                                11
                                12
    15/11/2008 15:20:10         15
                                16
                                17
    20/11/2008 15:20:10         20
                                21
                                22
    26/10/2008 15:20:10         26
                                27
    31/10/2008 15:20:10         31
    17 rows selected.
    XE
    Connected.
    @XE> select srds.specific_date,l.DayCntr
      2  from (select add_months(sysdate,-1)+ rownum * 5 as Specific_date from all_objects where rownum < 35) srds,
      3  (select rownum as DayCntr from all_objects oc where rownum < 35) l
      4  where l.DayCntr < 32
      5  and l.DayCntr = (to_number(to_char(srds.specific_date(+),'DD')))
      6  and ((srds.specific_date between add_months(sysdate,-1) and sysdate)
      7  or (srds.specific_date is null))
      8  order by daycntr;
    SPECIFIC_DATE          DAYCNTR
                                 1
                                 2
    05/11/2008 15:22:46          5
                                 6
                                 7
    10/11/2008 15:22:46         10
                                11
                                12
    15/11/2008 15:22:46         15
                                16
                                17
    20/11/2008 15:22:46         20
                                21
                                22
    26/10/2008 15:22:46         26
                                27
    31/10/2008 15:22:46         31
    17 rows selected.
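    To see why the asker expects 31 rows, the intent of the query can be sketched in plain Python: build the sparse date list, outer-join it onto a 1-31 day counter, and count the rows. (This is an illustrative model, not the Oracle execution; the datetime arithmetic approximates add_months with a fixed 30 days.)

    ```python
    from datetime import date, timedelta

    today = date.today()
    start = today - timedelta(days=30)   # stand-in for add_months(sysdate, -1)

    # srds: every 5th day after the start date (rownum * 5, rownum < 35)
    dates = [start + timedelta(days=5 * i) for i in range(1, 35)]

    # keep only dates inside the one-month window, keyed by day-of-month
    by_day = {d.day: d for d in dates if start <= d <= today}

    # outer join: one row per day counter 1..31; the date side may be None
    rows = [(day, by_day.get(day)) for day in range(1, 32)]

    print(len(rows))   # 31: one row per day counter, matched or not
    ```

    Since the day counter drives the result and the date side is merely optional, a correct outer join always yields 31 rows, which is exactly the behaviour the asker saw missing on Express.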

  • Effort estimation (Aufwandschätzung) for WDA

    Hi,
    is there any best practice for calculating an effort estimate for the development of a WDA application? An Excel tool, anything?
    I thought about calculating per screen (if it contains, for example, 3 tab strips, I would count it as 4 screens altogether) and adding some time for the component. But that is far too vague, isn't it?
    How do you usually do this?
    Thanks in advance.

    Hi Axel,
    the time needed to develop a Web Dynpro ABAP application mainly depends on the answers to the following questions:
    Are there good enough APIs that can be used to handle most of the required business logic?
    How flexible should the application be? Do you need to generate the UI, or is it sufficient to have fixed screens?
    Can components be reused?
    --> Ask your developers, not Excel
    ~Silke
