SAP EHP Update for Large Database

Dear Experts,
We are planning for the SAP EHP7 update for our system. Please find the system details below
Source system: SAP ERP6.0
OS: AIX
DB: Oracle 11.2.0.3
Target System: SAP ERP6.0 EHP7
OS: AIX
DB: 11.2.0.3
RAM: 32 GB
Our main concern here is the DB size, which is approximately 3 TB. I have already gone through forums and SAP Notes, where it is mentioned that the DB size does not have any impact on the SAP EHP update using SUM. However, I am still thinking it will have an impact in the downtime phase.
Please advise on this.
Regards,
Raja. G

Hi Raja,
Although a 3 TB DB size may not have a direct impact on the upgrade process itself, the downtime of the system can vary with a larger database.
Points to consider:
1) Take a DB backup before entering the downtime phase.
2) The number of programs and tables stored in the database: ICNV table conversions and XPRA execution depend on these, so it is worth identifying the largest tables up front (see the sketch below).
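A minimal sketch of how you could check which tables dominate those 3 TB, since conversions and XPRAs on very large tables are what usually stretch the downtime window. It assumes the ABAP schema owner is SAPSR3 (adjust to your system) and should be run as a DBA user on the Oracle 11.2 database:
-- Top 20 largest tables (including partitions) in the ABAP schema
SELECT *
FROM  (SELECT owner,
              segment_name,
              ROUND(SUM(bytes) / 1024 / 1024 / 1024, 1) AS size_gb
       FROM   dba_segments
       WHERE  owner = 'SAPSR3'
       AND    segment_type LIKE 'TABLE%'
       GROUP  BY owner, segment_name
       ORDER  BY SUM(bytes) DESC)
WHERE  ROWNUM <= 20;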
Hope this helps.
Regards,
Deepak Kori

Similar Messages

  • Ticket SAP Solution Manager 7.0 to SAP EHP 1 for SAP Solution Manager 7.0

    Hi,
    I want to migrate all tickets from SAP Solution Manager 7.0 to SAP EHP 1 for SAP Solution Manager 7.0. Please let me know the procedure for how to migrate them.
    DB Detail
    SAP Solution Manager 7.0 - MSSQL( 9.00.4035)
    SAP EHP 1 for SAP Solution Manager 7.0 - MSSQL(10.00.2531)
    Thanks,
    Kaleel

    Dear Mohamed,
    I was reading the post and the replies, and while I think your question has been answered, I just want to make sure I understand what you are trying to achieve. Is it a migration, where you have one Solution Manager system on SAP Solution Manager 7.0 and another
    SAP Solution Manager EHP1 system and you were looking to migrate the Service Desk messages?
    If you are "Upgrading" one system from Solution Manager 7.0 to  EHP1 then all the messages are preserved.
    If you are "Migrating" to a physically different system, then transfer of Service Desk Messages is not supported.
    Please see the document [SAP Solution Manager - Content Transfer|https://websmp204.sap-ag.de/~form/sapnet?_FRAME=CONTAINER&_OBJECT=011000358700000608232009E]
    Also Service Desk is a completely independent CRM Service, so the messages could always be archived, as has been suggested.
    Now, depending on your scenario: is the EHP1 system brand new, or is it already productive? If it is brand new, you might be able to do a homogeneous system copy and then upgrade the new system to EHP1. But if it is already productive, this would not be an option.
    Hope this helps to clarify.
    Regards,
    Paul

  • SAP EHP 1 for SAP Solution Manager 7.0 configuration check

    Hi All,
    I have configured my SAP EHP 1 for SAP Solution Manager 7.0 according to the guide. Now I would like to test my configuration. Please provide me with a checklist and test scenarios.

    SOLMAN_SETUP will guide you through setting up Solution Manager; after the setup, make sure that everything is green, or resolve the errors.
    In SMSY, check the system's RFC connections, and also run SLDCHECK.
    Refer to the OPERATIONS GUIDE under INSTGUIDES, which is very detailed.
    Thanks
    SM

  • RMAN Tips for Large Databases

    Hi Friends,
    I'm starting to administer a large 10.2.0.1.0 database on Windows Server.
    Do you have any tips or docs on best practices for large databases? I mean large as in 2 TB of data.
    I'm comfortable administering small and medium DBs, but some of them just keep getting bigger and bigger!
    Thanks a lot

    I would like to mention the links below:
    http://download.oracle.com/docs/cd/B28359_01/server.111/b28318/partconc.htm
    http://download.oracle.com/docs/cd/B28359_01/server.111/b32024/vldb_backup.htm
    For a couple of good pieces of advice and considerations on RMAN for a VLDB:
    http://sosdba.wordpress.com/2011/02/10/oracle-backup-and-recovery-for-a-vldb-very-large-database/
    Google "vldb AND RMAN in oracle"
    Regards
    Girish Sharma

  • SQL Server Migration Assistant (SSMA) for Oracle okay for large database migrations?

    All:
    We don't have much experience with the SSMA (Oracle) tool and need some advice from those of you familiar with it.  We must migrate an Oracle 11.2.0.3.0 database to SQL Server 2014.  The Oracle database consists of approximately 25,000 tables and 30,000
    views and related indices.  The database is approximately 2.3 TB in size.
    Is this do-able using the latest version of SSMA-Oracle?  If so, how much horsepower would you throw at this to get it done?
    Any other gotchas and advice appreciated.
    Kindest Regards,
    Bill
    Bill Davidson

    Hi Bill,
    SSMA supports migrating large Oracle databases. To migrate an Oracle database to SQL Server 2014, you could use the latest version,
    Microsoft SQL Server Migration Assistant v6.0 for Oracle. Before the migration, you should pay attention to the points below.
    1. The account that is used to connect to the Oracle database must have at least CONNECT permissions. This enables SSMA to obtain metadata from schemas owned by the connecting user. To obtain metadata for objects in other schemas and then convert objects
    in those schemas, the account must have the following permissions: CREATE ANY PROCEDURE, EXECUTE ANY PROCEDURE, SELECT ANY TABLE, SELECT ANY SEQUENCE, CREATE ANY TYPE, CREATE ANY TRIGGER, SELECT ANY DICTIONARY (see the sketch at the end of this reply).
    2.Metadata about the Oracle database is not automatically refreshed. The metadata in Oracle Metadata Explorer is a snapshot of the metadata when you first connected, or the last time that you manually refreshed metadata. You can manually update metadata
    for all schemas, a single schema, or individual database objects. For more information about the process, please refer to the similar article: 
    https://msdn.microsoft.com/en-us/library/hh313203(v=sql.110).
    3.The account that is used to connect to SQL Server requires different permissions depending on the actions that the account performs as the following:
     • To convert Oracle objects to Transact-SQL syntax, to update metadata from SQL Server, or to save converted syntax to scripts, the account must have permission to log on to the instance of SQL Server.
     • To load database objects into SQL Server, the account must be a member of the sysadmin server role. This is required to install CLR assemblies.
     • To migrate data to SQL Server, the account must be a member of the sysadmin server role. This is required to run the SQL Server Agent data migration packages.
     • To run the code that is generated by SSMA, the account must have Execute permissions for all user-defined functions in the ssma_oracle schema of the target database. These functions provide equivalent functionality of Oracle system functions, and
    are used by converted objects.
     • If the account that is used to connect to SQL Server is to perform all migration tasks, the account must be a member of the sysadmin server role.
    For more information about the process, please refer to the  similar article: 
    https://msdn.microsoft.com/en-us/library/hh313158(v=sql.110)
    4.Metadata about SQL Server databases is not automatically updated. The metadata in SQL Server Metadata Explorer is a snapshot of the metadata when you first connected to SQL Server, or the last time that you manually updated metadata. You can manually update
    metadata for all databases, or for any single database or database object.
    5.If the engine being used is Server Side Data Migration Engine, then, before you can migrate data, you must install the SSMA for Oracle Extension Pack and the Oracle providers on the computer that is running SSMA. The SQL Server Agent service must also
    be running. For more information about how to install the extension pack, see Installing Server Components (OracleToSQL). And when SQL Express edition is used as the target database, only client side data migration is allowed and server side data migration
    is not supported. For more information about the process, please refer to the  similar article: 
    https://msdn.microsoft.com/en-us/library/hh313202(v=sql.110)
    For how to migrate Oracle Databases to SQL Server, please refer to the  similar article: 
    https://msdn.microsoft.com/en-us/library/hh313159(v=sql.110).aspx
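    As a rough sketch of the permission setup from points 1 and 3 above (the account names ssma_conv and CONTOSO\ssma_svc are placeholders, not names SSMA requires):
    -- On the Oracle 11.2 source, run as a DBA, so SSMA can read metadata in
    -- other schemas and convert their objects:
    GRANT CONNECT TO ssma_conv;
    GRANT CREATE ANY PROCEDURE, EXECUTE ANY PROCEDURE,
          SELECT ANY TABLE, SELECT ANY SEQUENCE,
          CREATE ANY TYPE, CREATE ANY TRIGGER,
          SELECT ANY DICTIONARY TO ssma_conv;
    -- On the SQL Server 2014 target, so the account can load objects and run
    -- the SQL Server Agent data migration packages:
    ALTER SERVER ROLE sysadmin ADD MEMBER [CONTOSO\ssma_svc];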
    Regards,
    Michelle Li

  • Oracle Critical Patch Update for Oracle Database 10g Release 10.2.0.3.0

    Hi,
    The [Oracle Critical Patch Update for April 2012 was released on April 17th, 2012|http://www.oracle.com/technetwork/topics/security/cpuapr2012-366314.html], where it is stated that the versions listed below are affected.
    Oracle Database 10g Release 2, versions 10.2.0.3, 10.2.0.4, 10.2.0.5
    Currently I am using
    BANNER
    Oracle Database 10g Release 10.2.0.3.0 - 64bit Production
    Can anyone tell me the difference between
    "Oracle Database 10g Release 2, versions 10.2.0.3" and "Oracle Database 10g Release 10.2.0.3.0"?
    I am concerned about whether to apply the patch to 10.2.0.3.0, given that the affected version is listed as Oracle Database 10g Release 2, versions 10.2.0.3.
    Thanks,
    Sam

    user12983673 wrote:
    Can anyone tell me the difference between Oracle Database 10g Release 2, versions 10.2.0.3 and Oracle Database 10g Release 10.2.0.3.0?
    There is no difference -- you are using version 10.2.0.3, which is affected by the latest CPU.
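    If you want to confirm afterwards that the CPU really landed in this 10.2.0.3 home, one simple check (assuming the post-install catcpu/catbundle step was run, which records the patch in the registry history):
    -- Lists patch sets and CPU patches registered in the database
    SELECT action_time, action, version, comments
    FROM   dba_registry_history
    ORDER  BY action_time;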

  • DR strategy for large database

    Hi All,
    We have a 30 TB database for which we need to design a backup strategy (Oracle 11gR1 SE, 2-node RAC with ASM).
    The client needs a DR site for the database, and from the DR site we will be running tape backups.
    The main constraints we are facing here are the size of the DB, which will grow to 50 TB in the future, and the fact that we are running Oracle Standard Edition.
    Taking a full RMAN backup to a SAN box takes us around one week for a DB size of 30 TB.
    Options for us:
    1. Create a manual standby and apply archive logs (we can't use Data Guard as we are on Standard Edition).
    2. Storage-level replication (using HP Continuous Access).
    3. Use third-party tools such as SharePlex, GoldenGate, Dbvisit, etc.
    Which one would be the best option here with respect to cost and time, or is there a better option than these?
    We can't upgrade to Oracle EE as of now, since we need to meet the project deadline for the client. We are migrating legacy data to production now, and this would be interrupted if we went for an upgrade.
    Thanks in advance.
    Arun
    Edited by: user12107367 on Feb 26, 2011 7:47 AM
    Modified the heading from Backup to DR

    Arun,
    Yes, this limitation on BCT (block change tracking) is problematic in SE, but after all, if everything were included in SE, who would pay for the EE licence? :)
    The only good thing when BCT is not in use is that RMAN checks the whole database for corruption even when the backup is an incremental one. There is no miraculous "full Oracle" solution if your backups are that slow, but as you mentioned, a manual standby with delayed, periodic application of the archives is possible. It's up to you to evaluate whether it works in your case, though: how many archive log files will you generate daily, and how long will it take to apply them in your environment? A sketch of the apply cycle is below.
    (A note about GoldenGate: it is no longer a third-party tool; it is now an Oracle product, and it is clearly positioned as the recommended replacement for Streams.)
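    A minimal sketch of option 1 (manual standby), assuming the standby was created from a backup plus a standby controlfile and that your own scripts ship the archived logs to the standby host; run it in SQL*Plus on the standby instance:
    -- Mount as a standby and apply whatever archives have arrived
    STARTUP NOMOUNT
    ALTER DATABASE MOUNT STANDBY DATABASE;
    RECOVER AUTOMATIC STANDBY DATABASE;
    -- type CANCEL at the prompt once the shipped archives are exhausted,
    -- then repeat the copy/recover cycle on your chosen schedule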
    Best regards
    Phil

  • BRTOOLS with tape continuation for large database

    Hello,
    I have an R/3 database of 2 TB which needs to be copied onto a new staging server for an upgrade.
    The problem I am facing is that I don't have any space in the SAN (storage) for taking the backup to disk.
    So the only option I have is to take backups to tape.
    I even tried a backup with compress mode on tape, but ended up with a CPIO error for handling files larger than 2 GB (Note 20577).
    And since the database size is 2 TB and the tapes I have hold 800 GB, I would have to use multiple tapes.
    Since some of the files in the database, like *DATA_1, range between 6 GB and 10 GB, cpio cannot
    handle them, as it does not support files larger than 2 GB (as per Note 20577).
    So I had to change the parameter tape_copy_cmd = dd in init<sid>.sap.
    But dd will stop with an error message once the end of the tape is reached, thereby failing my backup.
    Please help me get out of this situation.
    Regards,
    Guru

    Hi,
    Please check the 'Sequential backup' section in the backup guide. If it's not possible to use a tape with a big enough capacity, you could use this method instead.
    You would need to add or modify the following parameters in init<SID>.sap (not init<SID>.ora, which is the Oracle profile):
    1. volume_backup
    2. tape_address
    3. tape_address_rew
    4. exec_parallel
    You'll find more info about these parameters in www.help.sap.com and in the backup guide itself; a short example is sketched below.
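    A minimal excerpt of what that could look like in init<SID>.sap; the volume labels and tape device names below are placeholders (non-rewinding vs. rewinding device) and must be adapted to your environment:
    backup_dev_type  = tape
    volume_backup    = (SIDB01, SIDB02, SIDB03)
    tape_address     = (/dev/rmt0.1)
    tape_address_rew = (/dev/rmt0)
    exec_parallel    = 0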
    Br,
    Javier

  • SAP EHP 1 for PI 7.1 Installer failed: running 2 upgrades in parallel

    Hi Friends
    I am doing an SAP EHPI (Enhancement Package Installer) upgrade on one host machine, from PI 7.1 to EHP1 for PI 7.1,
    while on the same host machine another upgrade is running, from 4.6C to ECC 6.0.
    When I try to open my EHPI installer, it opens the other upgrade with the command below:
    http://<host name>:4239
    Can you help me out: how can I run my EHPI installer without disturbing the other one?
    Regards
    Pooja

    So the EHPI folder is created and it picks up the port that I specify on the command line.
    Correct me if I understood you wrong:
    once I enter it, will that port entry automatically be reflected in
    D:\SP09\EHPI\sdt\htdoc\sdtdsu.jnlp?
    Do we have a note which says this?
    Currently it picks the default entry, as shown below:
    <?xml version="1.0"?>
    <jnlp href="/sdtdsu.jnlp" codebase="http://HOSTEST.bcone.com:4239/" spec="0.2 1.0">
      <information>
        <title>SDTGui for EHPInstaller</title>
        <vendor>SAP AG</vendor>
        <description kind="short">SDTGui for EHPInstaller</description>
        <icon href="/pics/sdtgui_big.gif"/>
      </information>
      <security>
        <all-permissions/>
      </security>
      <resources>
        <j2se version="1.5+"/>
        <jar download="eager" href="/lib/sdt_compatibility.jar"/>
        <jar download="eager" href="/lib/sdt_engine.jar"/>
        <jar download="eager" href="/lib/sdt_gui.jar"/>
        <jar download="eager" href="/lib/sdt_util.jar"/>
        <jar download="eager" href="/lib/sdt_trace.jar"/>
      </resources>
      <application-desc main-class="SDTGui">
        <argument>host=HOSTEST.bcone.com</argument>
        <argument>port=4241</argument>
        <argument>service=DSUService</argument>
      </application-desc>
    </jnlp>
    Regards
    Pooja

  • Oracle for large database + configuration

    Hi
    I have some historical data for stocks and options that I want to save into an Oracle database. Currently I have about 190 GB of data and expect it to grow by about 5 GB per month. I have not completely thought about how to organize the tables. It is possible that there might be just one table, which might be larger than the hard disk I have.
    I am planning to put this on a Dell box running Windows 2000. Here is the configuration:
    Intel Xeon 2.4 GHz, 2 GB SDRAM, with 3 x 146 GB SCSI hard drives on a PERC3 SCSI controller. This machine costs roughly $7,000.
    Is there any reason this won't work? Will Oracle be able to organize one database across multiple disks? How about tables? Can tables span multiple disks?
    All this data is going to be read-only.
    My other, cheaper choice is
    an Intel box running a P3, 2 GB RAM, and 2 x 200 GB IDE drives. My question for this is: will this configuration work?
    Also, for this kind of database, what kind of total disk space should I budget for?
    thanks
    Venkat

    Server Manager was deprecated in 9i. Instead of using it, you have to use SQL*Plus. Do you have another JRE installed?
    How to create a database manually in 9i:
    Administrator's Guide for Windows Contents / Search / Index / PDF
    http://download-east.oracle.com/docs/cd/B10501_01/win.920/a95491.pdf
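    On the multiple-disk question: yes, a tablespace can have datafiles on different drives, and a single table can then grow across them. A minimal sketch, with drive letters, names and sizes purely as placeholders:
    -- Datafiles for one tablespace placed on two different disks; a table
    -- created in this tablespace can allocate extents from both files.
    CREATE TABLESPACE stockdata
      DATAFILE 'D:\oradata\stockdata01.dbf' SIZE 20000M,
               'E:\oradata\stockdata02.dbf' SIZE 20000M;
    CREATE TABLE option_quotes (
      symbol     VARCHAR2(12),
      trade_date DATE,
      price      NUMBER
    ) TABLESPACE stockdata;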
    Joel Pérez

  • Backups for large databases

    I am wondering how people do restores of very large DBs. Ours is not that large yet, but it will grow to the point where exports and imports are not feasible. The data only changes periodically, and as it is a web application, cold backups are not really an option. We don't run in archived log mode because of the static nature of the data. Any suggestions?

    Put the read-only tables in a read-only tablespace, the slowly changing tables in another tablespace, and the most frequently changing ones in a third tablespace.
    Take a transportable tablespace export of the frequently changing tablespace daily, and of the slowly changing ones 2-3 times a week (depending on your site specifics). This involves nothing but a metadata export of the data dictionary information for the exported tablespaces and an OS-level copy of the datafiles of those tablespaces; a sketch is below.
    This is the best way for you to back up and recover; check the Oracle documentation or this website for transportable tablespaces.
    I guess it comes to a point where you have to make a trade-off between performance and recoverability. In my opinion, always take recoverability over performance.
    If the periodic change of data is nothing but a bulk data load, then take a backup of the database after the data load. Having multiple recovery scenarios is the best way to handle recovery.
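    A minimal sketch of one cycle for a slowly changing tablespace; the tablespace name APP_DATA is a placeholder, and the SQL should be run as a DBA:
    -- Check the tablespace is self-contained, then freeze it for the export
    EXEC DBMS_TTS.TRANSPORT_SET_CHECK('APP_DATA', TRUE);
    ALTER TABLESPACE app_data READ ONLY;
    -- Metadata-only export from the OS (Data Pump shown; on older releases
    -- use exp with TRANSPORT_TABLESPACE=y):
    --   expdp system DIRECTORY=dump_dir DUMPFILE=tts_app_data.dmp TRANSPORT_TABLESPACES=APP_DATA
    -- Copy the tablespace's datafiles at OS level to the backup area, then:
    ALTER TABLESPACE app_data READ WRITE;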

  • Error occurred while processing SAP SQL Tools for SQLServer (Database Copy)

    Hello,
    We are currently running ERP 6 EHP4, support stack 6. Two weeks ago we
    refreshed our sandbox from the QAS system and installed support stack 9 for
    testing. The kernel was also upgraded from SP69 to SP137 as part of the
    support package installation.
    This week, I was asked to do another refresh of the sandbox from the QAS system. As the
    kernel had been upgraded before to SP139, I copied the previous kernel
    files (SP69) back to the kernel folder and started the refresh process as usual.
    When I was running SAP Tools for MS SQL Server, it ran successfully through all the
    steps but failed in the final steps with the following error log:
    =======================================================================
    An error occurred while processing option SAP Toools for MS SQL Server
    > Database Copy Completion( Last error reported by the step: System
    call failed. Error 2 (The system cannot find the file specified. ) in
    execution of system call 'CreateProcessAsUser' with parameter ( ,
    NULL, Program Files/sapinst_instdir/MSS/CPY, &StartupInfo,
    &ProcessInfo), line (631) in file (synxcchapp.cpp), stack trace:
    iaxxejsctl.cpp: 272: EJS_ControllerImpl::executeScript()
    d:\depot\bas\710_rel\bc_710-3_rel\gen\optu\ntamd64
    \ins\sapinst\impl\src\ejs\iaxxejsbas.hpp: 450:
    EJS_Base::dispatchFunctionCall() iaxxejsexp.cpp: 178:
    EJS_Installer::invokeModuleCall() synxcchapp.cpp: 228:
    CSyChildApplicationImpl::start(false) synxcchapp.cpp: 252:
    CSyChildApplicationImpl::doStart() .). You can now:
    Choose Retry to repeat the current step.
    Choose View Log to get more information about the error.
    Stop the option and continue with it later.
    Log files are written to C:\Program Files/sapinst_instdir/MSS/CPY.

    I looked at the logs; the error is very similar to the one that I already posted:
    ERROR      2011-07-08 09:29:49.861
               CJSlibModule::writeError_impl()
    MUT-03025  Caught ESyException in Modulecall: At line 631 file synxcchapp.cpp
    Call stack:
    iaxxejsctl.cpp: 272: EJS_ControllerImpl::executeScript()
    d:\depot\bas\710_rel\bc_710-3_rel\gen\optu\ntamd64\ins\sapinst\impl\src\ejs\iaxxejsbas.hpp: 450: EJS_Base::dispatchFunctionCall()
    iaxxejsexp.cpp: 178: EJS_Installer::invokeModuleCall()
    synxcchapp.cpp: 228: CSyChildApplicationImpl::start(false)
    synxcchapp.cpp: 252: CSyChildApplicationImpl::doStart()
    System call failed. Error 2 (The system cannot find the file specified.
    ) in execution of system call 'CreateProcessAsUser' with parameter (<token>, NULL, <command line for executable saplicense.exe, NULL, NULL, TRUE, 0x420, NULL, C:/Program Files/sapinst_instdir/MSS/CPY, &StartupInfo, &ProcessInfo), line (631) in file (synxcchapp.cpp), stack trace: iaxxejsctl.cpp: 272: EJS_ControllerImpl::executeScript()
    d:\depot\bas\710_rel\bc_710-3_rel\gen\optu\ntamd64\ins\sapinst\impl\src\ejs\iaxxejsbas.hpp: 450: EJS_Base::dispatchFunctionCall()
    iaxxejsexp.cpp: 178: EJS_Installer::invokeModuleCall()
    synxcchapp.cpp: 228: CSyChildApplicationImpl::start(false)
    synxcchapp.cpp: 252: CSyChildApplicationImpl::doStart()
    TRACE      2011-07-08 09:29:49.861 [iaxxejsbas.hpp:488]
               EJS_Base::dispatchFunctionCall()
    JS Callback has thrown unknown exception. Rethrowing.
    ERROR      2011-07-08 09:29:49.875 [sixxcstepexecute.cpp:971]
    FCO-00011  The step doLI with step key |SAPMSSTOOLS|ind|ind|ind|ind|0|0|MssSysLI|ind|ind|ind|ind|7|0|doLI was executed with status ERROR ( Last error reported by the step: System call failed. Error 2 (The system cannot find the file specified.
    ) in execution of system call 'CreateProcessAsUser' with parameter (<token>, NULL, <command line for executable saplicense.exe, NULL, NULL, TRUE, 0x420, NULL, C:/Program Files/sapinst_instdir/MSS/CPY, &StartupInfo, &ProcessInfo), line (631) in file (synxcchapp.cpp), stack trace: iaxxejsctl.cpp: 272: EJS_ControllerImpl::executeScript()
    d:\depot\bas\710_rel\bc_710-3_rel\gen\optu\ntamd64\ins\sapinst\impl\src\ejs\iaxxejsbas.hpp: 450: EJS_Base::dispatchFunctionCall()
    iaxxejsexp.cpp: 178: EJS_Installer::invokeModuleCall()
    synxcchapp.cpp: 228: CSyChildApplicationImpl::start(false)
    synxcchapp.cpp: 252: CSyChildApplicationImpl::doStart()
    TRACE      2011-07-08 09:29:49.886
      Call block:CallBackInCaseOfAnErrorDuringStepExecution
        function:CallTheLogInquirer
    is validator: true

  • SAP Kernel update with HANA DATABASE

    Dear experts,
    I am planning to update the kernel of a system running on a HANA database. It is my first time applying a kernel update with HANA, and for that reason I want to be sure everything is properly planned. I have updated kernels with Oracle databases, and my biggest doubt is whether I have to do something in the database, and if so, what it would be.
    I have been looking for a note that would help me with a step-by-step procedure, but I couldn't find one, so I will ask you based on your own experiences.
    The current kernel is 742 PL28, and I am planning to update to PL101.
    I appreciate any comments or suggestions,
    Best regards,
    DanielO

    Dear Reagan,
    As you said, the kernel was applied and I had no issues. I appreciate your help.
    Best regards,
    DanielO

  • Selection Interface for large database

    I am looking for a working example of a CF selection field that fills or builds a name list as you type. The database has about 600,000 names, with 400 new people being added each day. I am looking for a smart tool that watches you type and brings down a name list as you go. In the list the user needs to see the name and other identifying information like DOB and phone number. The user clicks the row and the person's record is opened. I think I have a good understanding of the CFC side of this.
    If you type fast, the tool should wait for a second. "Sounds like" support would also be nice.
    Thanks for any ideas.

    You mean AutoSuggest? See this link:
    http://forta.com/blog/index.cfm/2007/5/31/ColdFusion-Ajax-Tutorial-1-AutoSuggest
    You might want to adjust the code to work on the official version, because the example was built for a beta release.

  • Brbackup for large database

    Dear All,
    On our PRD server the DB size is more than 800 GB. We are currently using BR*Tools for backups. The tape size is 400/800 GB (Ultrium 3). In the future the DB size may increase. How should we take the backup? Is there any way to split the backup across multiple tapes?
    Kindly Give the solution.
    Regards
    guna

    Hello,
    If your backup is too big for one tape, you may do one of the following:
    Use more than one tape drive. You may specify the drives in your init<SID>.sap file.
    Use one drive, and manually insert another tape as soon as the first is full. You probably don't want to, though...
    Use one drive and a loader that will automatically change tapes.
    So most probably you will have to pay for additional hardware.
    regards
