Best practices for data migration

Hi All,
This thread is for anyone who would like to share ideas and knowledge on making the best use of data migration with Business Objects Data Services.

Hans,
The current release of SAP Best Practices for Data Migration, v1.32, supports only MS SQL Server. The intent when developing the DM content was to quickly set up a temporary, standardized data migration environment, using tools that are available to everyone. SQL Server Express was chosen to host the repositories because it is easy to set up and can be downloaded for free. Some users have successfully deployed the content on Oracle XE, but as you have found, the MigrationServices web application works only with SQL Server.
The next release, including the web app, will support SQL Server and Oracle, but not DB2.
Paul

Similar Messages

  • Best practice for data migration install v1.40 - Error 2732 Directory manag

    Hi
    I'm attempting to install SAP Best Practices for Data Migration 1.40 on Windows Server 2008 R2 (64-bit).
    Prerequisite error
    Installation program stops with missing file error
    The following file was not found
    ... \migration\InstallationWizard\BusinessObjects Data Services\setup.exe
    The file is necessary for successful installation. Please connect to internet or refer to Quick Guide (available on SAP note 1527151) for information regarding the above file.
    Windows installer log displays
    Error 2732 Directory Manager not initialized
    SAP note 1527151 does not exist or is internal.
    Any help on the root cause of the error would be appreciated, as the file does not exist in that folder in the installation zip file.
    Other prerequisite of .NET 3.5.1 met already.
    The patch has been available since 20.11.2011, so I presume it is a good installation set.
    Thanks,
    Alan

    Hi Alan,
    There are details on the data migration v1.40 installation on the SAP website and Service Marketplace. The link below should guide you to the right place; it includes a PowerPoint presentation and other useful links as well.
    http://help.sap.com/saap/sap_bp/DMS_V140/DMS_US/html/index.htm
    Arun

  • SAP Best Practices for Data Migration: repositories only on MS SQL Server?

    Hi,
    I'm implementing the "SAP Best Practices for Data Migration" (see https://websmp109.sap-ag.de/bp-datamigration).
    As part of the installation you have to install MS SQL Server Express Edition. The installation guide contains detailed steps to do this. All repositories for Data Services should be running on SQL Server, according to the installation guide.
    The customer I'm working for now does not want to use SQL Server, but DB2, as company standard.
    So I use DB2 for the local and profiler repositories.
    I notice however that the web application http://localhost:8080/MigrationServices does not support DB2. The only database type you can select in the configuration area is MS SQL Server.
    Is this a limitation, or is it by design?


  • The best practice for data archiving

    Hi
    My client has been using OnDemand for almost 2 years, and there are around 2M records in the system (Activities), so I just want to know the best practice for data archiving; we don't care much about data older than 6 months.

    Hi Erik,
    Archival is nothing but deletion.
    Create a backup cube in BW. Copy the data from your planning cube to the backup cube, and then delete that data region from your planning cube.
    Archival will definitely improve the performance of your templates, scripts, etc., since the system will now search a smaller dataset.
    Hope this helps.

  • Best Practices for Data Access

    Good morning!
    I was wondering if someone might give me some advice on best practices for retrieving data from a SQL server in the cloud via a desktop application.
    I'm curious whether there is anything "wrong" with embedding the server address (IP, domain, or whatever) into my desktop application and letting users provide their own usernames and passwords when using the application, wherein my application collects the username and password from the user, connects to the server with that username and password, retrieves the data, and uses it in-app.
    I'm petrified of security issues and I would hate to start using a SQL database with this setup only to find out that anyone could download x, y or z and connect to the database and see everything.
    Assuming I secure all of the users with limited permissions, is there anything wrong with exposing a SQL server to the web for my application to use? If so, what and what would be a reasonable alternative?
    I really appreciate any help and feedback!

    There are two options, none of them very palatable:
    1) One is to create a domain, and add the VM and your local box to it.
    2) Stick to a workgroup, but have the same user name and password on both machines.
    In practice, a better option is to create an SQL login that is a member of sysadmin, or that has rights to impersonate an account that is a member of sysadmin. And for that matter, you could use the built-in sa account, but rename it to something else.
    The other day I was looking at the error log from a server that apparently had been exposed on the net. The log was full of failed login attempts for sa, with occasional attempts for names like usera and so on. The server is in Sweden; the IP addresses for the login attempts were in China.
    Just so you know what to expect.
    Erland Sommarskog, SQL Server MVP, [email protected]
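    To make the above concrete, here is a minimal T-SQL sketch of the least-privilege setup discussed: rename the built-in sa account and give the desktop application its own read-only login. All names (srv_admin_x, app_reader, SalesDb) are invented examples, not anything from this thread.
        -- Hedged sketch only: every name below is an invented example.
        -- Rename the well-known sa account so brute-force attempts target a dead name.
        ALTER LOGIN sa WITH NAME = [srv_admin_x];
        -- A dedicated login for the application, with password policy enforced.
        CREATE LOGIN app_reader WITH PASSWORD = 'Str0ng!Passw0rd#2024', CHECK_POLICY = ON;
        USE SalesDb;  -- hypothetical application database
        CREATE USER app_reader FOR LOGIN app_reader;
        -- Grant only what the application needs: read access to one schema.
        GRANT SELECT ON SCHEMA::dbo TO app_reader;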

  • Best practice for database migration in 11g

    Hello,
    Database migration is required due to an OS change. I have two database instances, say A and B, on the old server, where RDBMS_VERSION is 11.1.0.7.0. They need to be migrated to a new server where Oracle 11.2.0.2.0 has been installed.
    Since all data and objects need to be migrated to the new server, I want to know what the best practice is and how to do it. Thanks in advance for your guidance.
    Thanks and Regards,
    Prosenjit

    Hi Prosenjit,
    you have some options.
    1. RMAN Restore: you can restore your database via RMAN to the new host, and then upgrade it.
        Please follow the instructions in MOS Note: RMAN Restore of Backups as Part of a Database Upgrade (Doc ID 790559.1)
    2. Data Guard: check the MOS Note: Mixed Oracle Version support with Data Guard Redo Transport Services (Doc ID 785347.1)
    3. Full Export / Import (Data Pump); a minimal sketch follows below.
    Borys
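    As a rough illustration of option 3, here is a minimal PL/SQL sketch of a full Data Pump export using the DBMS_DATAPUMP API; the directory object and dump file name are assumptions. The matching import would run on the new 11.2.0.2.0 server (an 'IMPORT' job, or simply the impdp utility).
        -- Hedged sketch: DATA_PUMP_DIR and full_exp.dmp are example names.
        DECLARE
          h     NUMBER;
          state VARCHAR2(30);
        BEGIN
          -- Create a full-database export job and attach to it.
          h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'FULL');
          DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'full_exp.dmp',
                                 directory => 'DATA_PUMP_DIR');
          DBMS_DATAPUMP.START_JOB(h);
          -- Block until the job reports COMPLETED (or STOPPED on error).
          DBMS_DATAPUMP.WAIT_FOR_JOB(h, state);
          DBMS_OUTPUT.PUT_LINE('Export finished with state: ' || state);
        END;
        /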

  • ASM BEST PRACTICES FOR 'DATA' DISKGROUP(S)

    In our quest to reduce operating costs we are consolidating databases and eliminating RAC in favor of standalone servers. This is a business decision that is a certainty.  Our SAN has been upgraded, and the new database servers are newer, faster, etc.
    Our database version is 11.2.0.4 with Grid Infrastructure 12.1.0.1. Our data diskgroup is RAID-5 and our fra is RAID-1+0.  ASM has external redundancy.  All disks are of equal size with equal storage performance and availability.
    Previously our databases were on separate clusters by function: OLTP, REPORTING and ENTERPRISE CONTENT MANAGEMENT. Development/Acceptance shared a cluster, while production was separate.
    The new architecture combines different functions onto one server for dev/acc, and another for production.  This means they will all be using the same ASM instance.  Typically we followed Oracle’s recommendation to have two disk groups, one for data and the other for FRA.  That worked well when a single database was the only one using the data diskgroup.  Now that we are combining databases, is the best practice still to have one data diskgroup and one FRA diskgroup?  For example, production will house 3 databases.  OLTP is 500 GB, Reporting is 1.3 TB, and Enterprise Content Management is 6 TB and growing.
    My concern is that if all 3 databases access the same data diskgroup, the smaller OLTP database must traverse the 6 TB of content management data.  Or is this thinking flawed?
    Does this warrant separate diskgroups?  Are there pros and cons to this?
    Any insights are appreciated.
    Best Regards,
    Sherrie

    I have many issues to deal with in this 'consolidation', but budget reduction is happening in state and regional government.  Our SAN storage is for our enterprise infrastructure and not part of my money-savings directive.  We are also migrating to UCS blades for the infrastructure, also not part of my budget reduction contribution.  Oracle licensing is our biggest software cost, and this is where my directive lies.  We've always been conservative and done more with less; now we will do with less, but differently, because the storage and hardware are awesome.
    We've been consolidating databases onto RAC clusters and standalones since we started doing Oracle.  For the last 7 years we've supported ASM, 6 databases and 2 passive standby instances (with Data Guard) on a 2-node cluster totalling 64 GB of memory.  The new UCS blades have 256 GB of memory.  I get that each database must support its background processes.  If I add up the SGAs, allocated PGAs, and background processes, they take up about 130 GB of memory, but also consider that there is overhead to RAC.  In all the years we've had Oracle, most of our failures, outages or downtime were because of RAC.  On the plus side, the seamless failover saved us most times (not all times), but required administrative time for troubleshooting.
    I would love to go to Oracle 12c and use its multitenant architecture, but I have 3rd party applications that don't yet support it.  11.2 might be our last release unless I can reduce costs.  Consolidation is real and much needed, which I believe is why Oracle responded to the market with multitenancy.
    But back to my first question about how many diskgroups should service a group of databases.  What I'm hearing, and think I agree with, is that one data diskgroup will suffice, because the ASM instance knows where to retrieve the data, and both waste and management overhead will be reduced.
    I still need to do some ciphering and by no means have a final plan, but thank you all for your insights and contributions.
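    For reference, and in line with the single data diskgroup conclusion, here is a minimal sketch of pointing each consolidated database at the same shared diskgroups. The diskgroup names (+DATA, +FRA) and the FRA size are assumptions, not values from this thread.
        -- Hedged sketch: run in each database instance on the server.
        -- The FRA size limit must be set before (or together with) the destination.
        ALTER SYSTEM SET db_recovery_file_dest_size = 500G   SCOPE=BOTH;
        ALTER SYSTEM SET db_recovery_file_dest      = '+FRA' SCOPE=BOTH;
        -- New datafiles for every database go to the one shared DATA diskgroup.
        ALTER SYSTEM SET db_create_file_dest        = '+DATA' SCOPE=BOTH;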

  • Best practice for data persistence for monitoring without BAM

    Greetings,
    We are modeling a business process in a large organization using BPEL Process Manager. The key point is that business people need to monitor the execution of the business process at several key points, and they also need reporting information about the process.
    To model this in our project, we decided to create a new Oracle Database Schema that is going to hold the information about the business process execution (we decided that because for this initial offering the customer is not buying BAM). In this context, the BPEL process is going to be sending this key information to the repository so business people can then view real time information about the process execution as well as historical information in form of reports.
    The important issue here is whether there is a best practice for sending the information to the database schema. Could it be done just with database adapters? Or maybe with sensors sending the data over JMS topic connections?
    Any help will be highly appreciated.
    Thanks in advance.

    Hi, yes, this suggestion is nice. First configure the sensors (activity or variable), then configure the sensor action as a JMS topic, which will in turn insert the data into a DB. Alternatively, when you configure the sensor action as a DB, the data goes to the Oracle Reports schema. Is there any chance of altering this, i.e., changing the config files so that the data does not go to that Reports schema but to a custom schema created by a user? I don't know if it can be done. My problem is that when I configure the JMS topic for sensor actions, I see blank data coming through; for some reason the data is not getting posted. I have used an ESB, with a routing service based on the schema I am monitoring. Can anyone help?

  • The best practice for data mart to different BW System.

    Hi All,
    Could you suggest what I should do in this case?
    I have 2 SAP BW systems, say BW A and BW B.
    I want to transfer data from an InfoCube in BW A into an InfoCube in BW B.
    The things that I did :
    1. 'Generate Export Data Sources' for the InfoCube in BW A.
    2. Replicate the source system in BW B. BW B then shows the DataSource from the InfoCube in BW A.
    3. In BW B, I create an InfoPackage, and then I can fetch data from BW A.
    What I wanna ask are:
    1. Can I make this automatic? Every time I want to fetch data from the InfoCube in BW A, I must run the InfoPackage in BW B. What's the best practice?
    2. Could RDA make this automatic? Automatic in my case means that every time there is new or updated data in the InfoCube in BW A, I don't have to run the InfoPackage in BW B;
    BW B will automatically fetch the data from the InfoCube in BW A.
    If yes, could you please give me step-by-step instructions on how to use RDA for my case?
    I really need your guidance.
    Thanks,
    Best regards,
    Daniel N.

    Hi Daniel,
    You can create a process chain to load your cube in BW A. Similarly, create a process chain in your BW B system to load its cube.
    After the chain in BW A finishes, you can automatically run the process chain in BW B. You can use the Remote Chain option for this.
    This will trigger the chain automatically in the remote system.
    Regards,
    Mansi

  • What is the best practice for voicemail migration?

    Hello Tech Gurus,
    I am looking into a way to migrate a customer's voicemail, which is currently on an NME-CUE module. They want to migrate their voicemail configuration, licenses, and related data to an SRE module, and I would like to know what best practices or guidelines I can refer to.
    Thank you very much!
    Regards,
    Alex.

    Hi Alex,
    I was looking at the doc, which says:
    Cisco supports transfer of CUE licenses, with some restrictions. Transfer is supported for CUE devices that are of the same type, for an RMA or in cases in which a license was wrongly installed. This process is not intended for transferring licenses from one generation to another (for example, from NM-CUE to NME-CUE, or from NME-CUE to SRE devices). Transferring a license is accomplished using a process called rehosting. The rehosting process transfers a license from one UDI to another by revoking the license from the source device and installing in a new device
    http://www.cisco.com/c/en/us/td/docs/voice_ip_comm/unity_exp/rel7_1/Licensing/CUELicensing_book/csa_overview_CUE.html#wp1101175
    You can still speak to the licensing team, providing the 'show license udi' output from the SRE module along with the old license details from the NME-CUE, for rehosting.
    regds,
    aman

  • Best practices for data representation

    I'm curious about the best data representation for a constant or variable when there is an obvious choice of two.
    For example, take the Timeout terminal of the Event structure. This terminal takes a Long (I32) data type, but I'm wiring to it a constant value of 100 and therefore could use an Unsigned Byte (U8). Setting the constant to be I32 prevents an automatic conversion step from happening, but setting it to be U8 saves a little bit of unnecessary allocated space.
    Which is better?

    Practically speaking, it more than likely will not matter until the data sets get large; however, as "best practices" go, it is best to keep the data consistent and in the type that the control, property node, etc. expects. Directly from the NI user manual (LV 7.1):
    "Coercion dots appear on block diagram nodes to alert you that you wired two different numeric data types together. The dot means that LabVIEW converted the value passed into the node to a different representation. Coercion dots can cause a VI to use more memory and increase its run time. Try to keep data types consistent in VIs."
    Cheers,
    --Russ

  • Best practices for data entry online system

    Hi all
    I am (with a team of 4 members) going to build an online data entry system that may have approximately 30 screens. I am going to use Spring BlazeDS remoting to connect to the middleware.
    Could anyone please suggest some good practices to follow on the Flex side for such a "DATA ENTRY" application?
    The points below are some common best practices we need to follow while coding, but I am not sure how to achieve them on the Flex side.
    User experience (I can probably get a little info regarding this from my client)
    Code maintainability
    Code extensibility
    Memory and CPU optimization
    Ability to work with team members (multiple checkouts)
    Best framework
    So I am looking for valuable suggestions from great minds.


  • Table Owners and Users Best Practice for Data Marts

    2 Questions:
    (1) We are developing multiple data marts that share the same instance. We want to deny access to the users while tables are being updated. We have one generic user (BI_USER) with read access through one of the popular BI tools. For the current (first) data mart we denied access by revoking the privilege from BI_USER; however, going forward, the other data marts' tables will be updated on different schedules, and we do not want to deny access to all the data marts. What is the best approach?
    (2) What is the best methodology for table ownership when different data marts share tables across marts? Can we create one generic ETL_USER to update tables with different owners?
    Thanx,
    Jim Masterson

    If you have to go with generic logins, I would at least have separate generic logins for each data mart.
    Ideally, data loads should be transactional (or nearly transactional), so you don't have to revoke access ever. One of the easier tricks to accomplish this is to load data into a shadow table and then rename the existing table and the shadow table. If you can move the data from the shadow table to the real table in a single transaction, though, that's even better from an availability standpoint.
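    Here is a minimal sketch of that shadow-table swap in Oracle syntax; the table names are invented examples.
        -- Hedged sketch: sales_fact / sales_fact_shadow are example names.
        -- 1. Build and load the shadow copy while readers keep using the live table.
        CREATE TABLE sales_fact_shadow AS SELECT * FROM sales_fact WHERE 1 = 0;
        -- ... bulk-load sales_fact_shadow here ...
        -- 2. Swap the names; the exposure window is just the two renames.
        ALTER TABLE sales_fact        RENAME TO sales_fact_old;
        ALTER TABLE sales_fact_shadow RENAME TO sales_fact;
        -- 3. Once the new data is verified, drop the previous version.
        DROP TABLE sales_fact_old PURGE;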
    If you do have to revoke table access, you would generally want to revoke SELECT access to the particular object from a role while the object is being modified. If this role is then assigned to all the Oracle user accounts, everyone will be prevented from viewing the table. Of course, in this scenario, you would have to teach your users that "table not found" means that the table is being refreshed, which is why the zero downtime approach makes sense.
    You can have generic users that have UPDATE access on a large variety of tables. I would suggest, though, that you have individual user logins to the database and use roles to grant whatever ad-hoc privileges users need. I would then create one account per data mart (with perhaps one additional account for the truly generic tables) to own each data mart's objects. Those owner accounts would then grant different database privileges to different roles, and you would grant those roles to different users. That way, Sue in accounting can have SELECT access to portions of one data mart and UPDATE access to another data mart without being granted every privilege under the sun. My hunch is that most users should not be logging in to, let alone modifying, all the data marts, so their privileges should reflect that.
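    Here is a sketch of the role layout described above (all schema, role, and user names are invented examples), including the refresh-time revoke from the previous paragraph:
        -- Hedged sketch: every name below is invented.
        CREATE ROLE sales_mart_read;
        GRANT SELECT ON sales_owner.sales_fact TO sales_mart_read;
        CREATE ROLE finance_mart_write;
        GRANT SELECT, UPDATE ON finance_owner.gl_detail TO finance_mart_write;
        -- Sue in accounting: read one mart, update another, nothing more.
        GRANT sales_mart_read TO sue;
        GRANT finance_mart_write TO sue;
        -- During a refresh, revoke from the role and all grantees lose access at once.
        REVOKE SELECT ON sales_owner.sales_fact FROM sales_mart_read;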
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC

  • JPA - Best Practice For Data Transfer?

    I've been considering an alternative method for data transfer between applications: serializing or encoding JPA entities to file (either binary or XML).
    I know this procedure may have several drawbacks compared to traditional exported SQL queries or data manipulation statements; however, I want to know if anyone has considered or used this process for data transfer.
    The process would be to
    - query the database and load the JPA Entities
    - Serialize or Encode them to file
    - zip up the entire folder with the JPA entities
    - transfer the data to destination machine
    - extract the data to a temp directory
    - reload the JPA entities by de-serializing and persisting them to the database
    The reason I'm considering this process is basically that I have a desktop application (managing member records, names, dates, contributions, etc.) used by different organisations in different locations (which are not related except by purpose, i.e. clubs or churches), and I would like a simple way of transporting all data associated with a single profile (information about a member of the organisation) from one location to another, i.e. users interact only with the application, without needing any database management tool or such.
    I'm also considering this because it is not easy to generate an SQL Script file without using a dedicated Database Management Tool, which I do not want the application users to have to learn how to use.
    I would appreciate ANY suggestions and probable alternative solutions for this problem. FYI: I'm using a Java DB database.
    ICE

    jschell wrote:
    "In summary you are moving data from one database to another." True.
    "You only discussed flow one way." Well, the process is meant to be bi-directional. Basically, what I envision would be something like:
    - the user goes to File -> Export Profile...
    - then selects one or more profiles to export
    - the export process completes and the zip archive is created and transferred (copied, mailed, etc.) to the destination location
    Then, on the destination PC:
    - the user goes to File -> Import Profile
    - selects the profile package
    - the app extracts, processes and imports the data (JPA serialized, for example)
    "Flow both ways is significantly more complicated in general." Well, if done well, it shouldn't be.
    "And what does this have to do with users creating anything?" Well, as shown above, the user would be generating the zip archive (assuming that is the final format).
    Does this make the problem clearer?
    ICE

  • Wireless Design - Best Practices for Data, Voice, and LBS

    Hi,
    I am currently in the process of designing a WLAN for a new hospital, and I am getting some push back from my sales team.  The requirements of the WLAN are data, voice, and location-based services (RFID for medical equipment); it needs to be 2.4 GHz for guest access and some apps/clients but primarily 5 GHz for most of the clients, and lastly it needs to be 802.11n compatible for future use.
    So, I did a predictive design with 1252s on the perimeter with 2.4 and 5 GHz patch antennas and 1142s in the middle to fill gaps. I also scoped out two 5508s for redundancy. The total design, with -65 dBm at my edges, came to 169 APs. However, this is getting push back because of several cost issues:
    1. The bundle that Cisco offers of five 5508 WLCs with 100-AP licenses is cheaper than buying two WLCs with 250-AP licenses, which doesn't make any sense to me because I think 5 devices is overkill.
    2. The sales engineer is concerned about the power issues with the 1252s. The customer would rather not use power injectors, and although they would have 6500s at their core, they would only have basic switches in their IDFs, so I wasn't sure which PoE switches would be able to handle the 1252; cost was an issue there as well.
    To my understanding, when you are doing a WLAN design for LBS it's always best to have APs or antennas on the perimeter for better triangulation, and it makes more sense to me to do that with patch antennas instead of omnis; however, my sales engineer wants to use all 1142s. So my question is: what are the pros and cons of using all omnis versus a mix of patch and omni antennas?
    Furthermore, if anyone has documentation supporting why I would not use all omnis, that would be great, because all the articles I have read on LBS just state that AP placement is critical but don't give specifics on whether it's good practice to place them on the perimeter using a specific type of antenna.
    Thanks in advance for your help and any ideas about this design!

    1.  The 5508 is expensive because it's a lot faster than the 4400, plus there are some features exclusive to the 5508, such as OfficeExtend.  As the old network design adage goes:  your design can be done correctly, cheaply, or fast.  Choose two.
    2.  The 1250 requires 19.5 W of power to enable full MCS rates on both radios.  Only the 3560E, 3750E or the Sup720 is capable of supporting that.  Upgrading the IOS of the 1250 to 12.4(10b)JDA3 will allow the AP to operate both radios at 15.4 W, but at lower MCS rates.  Correct placement of the APs and the correct use of antennas will also help with signal distribution.
    3.  Patch antennas are mostly directional.  The 1140 is omni-directional, but its signal strength is not as powerful as the 1250 at full power.  The AIR-ANT2451NV is an omni-directional patch designed for the 1250.
    Cisco Aironet Antennas and Accessories Reference Guide
    http://www.cisco.com/en/US/prod/collateral/wireless/ps7183/ps469/product_data_sheet09186a008008883b.html
    Cisco Aironet 2.4 GHz and 5 GHz Antennas and Accessories
    http://www.cisco.com/en/US/prod/collateral/wireless/ps7183/ps469/product_data_sheet09186a008022b11b.html
    Some of the new antennas for the 1250:
    Cisco Aironet Dual Band MIMO Low Profile Ceiling Mount Antenna (AIR-ANT2451NV-R)
    http://www.cisco.com/en/US/prod/collateral/wireless/ps7183/ps469/data_sheet_ant2451nv.pdf
    Cisco Aironet Very Short 5-GHz Omnidirectional Antenna (AIR-ANT5135SDW-R)
    http://www.cisco.com/en/US/prod/collateral/wireless/ps7183/ps469/data_sheet_ant5135sdw.pdf
    Cisco Aironet Very Short 2.4-GHz Omnidirectional Antenna (AIR-ANT2422SDW-R)
    http://www.cisco.com/en/US/prod/collateral/wireless/ps7183/ps469/data_sheet_ant2422sdw.pdf
    Cisco Aironet 5-dBi Diversity Omnidirectional Antenna (AIR-ANT2452V-R)
    http://www.cisco.com/en/US/prod/collateral/wireless/ps7183/ps469/data_sheet_ant2452v.pdf
    Cisco Aironet 5-GHz MIMO Wall-Mounted Omnidirectional Antenna (AIR-ANT5140NV-R)
    http://www.cisco.com/en/US/prod/collateral/wireless/ps7183/ps469/data_sheet_ant5140nv.pdf
    Cisco Aironet 5-GHz MIMO 6-dBi Patch Antenna (AIR-ANT5160NP-R)
    http://www.cisco.com/en/US/prod/collateral/wireless/ps7183/ps469/data_sheet_ant5160np.pdf
    Cisco Aironet 2.4-GHz MIMO Wall-Mounted Omnidirectional Antenna (AIR-ANT2450NV-R)
    http://www.cisco.com/en/US/prod/collateral/wireless/ps7183/ps469/data_sheet_ant2450nv.pdf
    Cisco Aironet 2.4-GHz MIMO 6-dBi Patch Antenna (AIR-ANT2460NP-R)
    http://www.cisco.com/en/US/prod/collateral/wireless/ps7183/ps469/data_sheet_ant2460np.pdf
