Best practice for migrating data tables - please comment.

I have 5 new tables seeded with data that need to be promoted from a development to a production environment.
Instead of the DBAs simply using a tool to migrate the data, they insist that I save and provide scripts for every single commit, in proper order, necessary to both build the tables and insert the data from ground zero.
I am very unaccustomed to this kind of environment, and it seems much riskier to me to try to rebuild the objects from scratch when I already have a perfect, tested, ready model.
They also require extensive documentation in which every step is recorded, and they use that document for the deployment.
I believe their rationale is that they don't want to rely on backups but instead on a document that specifies each step needed to recreate everything.
Please comment on your view of this practice. Thanks!

> Please comment on your view of this practice. Thanks!
Sounds like the DBAs are using best practices to get the job done. Congratulations to them!
> I have 5 new tables seeded with data that need to be promoted from a development to a production environment.
> Instead of the DBAs simply using a tool to migrate the data, they insist that I save and provide scripts for every single commit, in proper order, necessary to both build the tables and insert the data from ground zero.
The process you describe is what I would expect, and require, in any well-run environment.
> I am very unaccustomed to this kind of environment, and it seems much riskier to me to try to rebuild the objects from scratch when I already have a perfect, tested, ready model.
Nobody cares if it is riskier for you. The production environment is sacred. Any and all risk to it must be reduced to a minimum at all costs. In my opinion a DBA should NEVER move ANYTHING from a development environment directly to a production environment. NEVER.
Development environments are sandboxes. They are often not backed up. You or anyone else could easily modify tables or data with no controls in place. Anything done in a DEV environment is assumed to be incomplete, insecure, disposable and unvetted.
If you are doing development and don't have scripts to rebuild your objects from scratch, then you are doing it wrong. You should ALWAYS have your own backup copies of the DDL in case anything happens (and it does) to the development environment. By 'have your own' I mean there should be copies in a version control system or central repository where your teammates can get their hands on them if you are not available.
As for data - I agree with what others have said. Further - ALL data in a dev environment is assumed to be dev data and not production data. In all environments I have worked in, ALL production data must be validated and approved by the business. That means every piece of data in lookup tables, fact tables, dimension tables, etc. Only computed data, such as data in a data warehouse generated by an ETL process, might be exempt; but the process that creates that data is not exempt - that process, and ultimately the data, must be signed off on by the business.
And the business generally has no access to, or control of, a development environment. That means using a TEST or QA environment for the business users to test and validate.
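To make the "scripts from ground zero" requirement concrete, here is a minimal sketch of what a rerunnable build-and-seed script pair might look like (T-SQL; the schema, table and column names are invented purely for illustration):

-- 001_create_ref_country.sql : build the table from scratch, safe to rerun
IF OBJECT_ID('dbo.RefCountry', 'U') IS NULL
BEGIN
    CREATE TABLE dbo.RefCountry
    (
        CountryCode CHAR(2)      NOT NULL CONSTRAINT PK_RefCountry PRIMARY KEY,
        CountryName VARCHAR(100) NOT NULL
    );
END
GO

-- 002_seed_ref_country.sql : seed data, safe to rerun without creating duplicates
MERGE dbo.RefCountry AS tgt
USING (VALUES ('FR', 'France'),
              ('IT', 'Italy')) AS src (CountryCode, CountryName)
   ON tgt.CountryCode = src.CountryCode
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CountryCode, CountryName)
    VALUES (src.CountryCode, src.CountryName);
GO

Numbered scripts like these, checked into version control, are exactly what the DBAs can review, replay in order, and keep as the record of how production got to its current state.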
> They also require extensive documentation in which every step is recorded, and they use that document for the deployment.
> I believe their rationale is that they don't want to rely on backups but instead on a document that specifies each step needed to recreate everything.
Absolutely! That's how professional deployments are performed. Deployment documents are prepared and submitted for sign-off by each of the affected groups. Those groups can include security, DBA, business users, IT and even legal. The deployment documents always include recovery steps, so that if something goes wrong or the deployment can't proceed there is a documented procedure for how to restore the system to a valid working state.
The deployments I participate in have representatives from each of those groups in the room or on a conference call as each step of the deployment is performed. Your 5 tables may be used by stored procedures, views or other code that has to be deployed as part of the same process. Each step of the deployment has to be performed in the correct order. If something goes wrong, the owning party is responsible for assisting in the retry or recovery of their component.
It is absolutely vital to have a known, secure, repeatable process for deployments. There are no shortcuts. I agree that for a simple scenario of 5 new tables and a small amount of data it may seem like overkill.
But despite what you say, it simply cannot be that easy, for one simple reason: adding 5 tables with data to a production system has no business impact or utility at all unless there is some code, process or application somewhere that accesses those tables and data. Your post didn't mention what changes are being made to actually USE what you are adding.
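To make the "recovery steps" idea concrete, a single deployment step might be scripted like the hypothetical T-SQL sketch below, so the documented rollback is simply the CATCH block (object names are invented for illustration):

-- Deployment step 3: create and seed a lookup table, with a rollback path
BEGIN TRY
    BEGIN TRANSACTION;

    CREATE TABLE dbo.RefRegion
    (
        RegionCode CHAR(3)      NOT NULL CONSTRAINT PK_RefRegion PRIMARY KEY,
        RegionName VARCHAR(100) NOT NULL
    );

    INSERT INTO dbo.RefRegion (RegionCode, RegionName)
    VALUES ('EMA', 'Europe, Middle East and Africa'),
           ('APJ', 'Asia Pacific and Japan');

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    -- Recovery step from the deployment document: undo this step and surface the error
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;
    DECLARE @err NVARCHAR(2048) = ERROR_MESSAGE();
    RAISERROR(@err, 16, 1);
END CATCH;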

Similar Messages

  • Best practice for migrating IDOCs?

    Subject: Best practice for migrating IDOC's? 
    Hi,
    I need to migrate some IDOC's to another system for 'historical reference'.
    However, I don't want to move them using the regular setup as I don't want the inbound processing to be triggered.
    The data that was created in the original system by the processed IDOC's will be migrated to the new system using migration workbench. I only need to migrate the IDOC's as-is due to legal requirements.
    What is the best way to do this? I can see three solutions:
    A) Download IDOC table contents to a local file and upload them in the new system. Quick and dirty approach, but it might also be a bit risky.
    B) Use LSMW. However, I'm not sure whether this is feasible for IDOC's.
    C) Using ALE and setting up a custom partner profile where inbound processing only writes the IDOC's to the database. Send the IDOC's from legacy to the new system. Using standard functionality in this way seems to me to be the best solution, but I need to make sure that the IDOC's, once migrated, will get the same status as they had in the old system.
    Any help/input will be appreciated
    Regards
    Karl Johan
    PS. For anyone interested in the business case: Within EU the utility market was deregulated a few years ago, so that any customer can buy electricity from any supplier. When a customer switches supplier this is handled via EDI, in SAP using ALE and IDOC's. I'm working on a merger between two utility companies and for legal reasons we need to move the IDOC's. Any other data is migrated using migration workbench for IS-U.

    Hi Daniele
    I am not entirely sure what you are asking. Could you please provide additional information?
    Are you looking for best practice recommendations for governance, for example: change transports between DEV, QA and PRD in BPC 7.0?
    What is the best method - Server Manager backup and restore, etc.?
    And
    Best Practice recommendations on how to upgrade to a different version of BPC, for example: Upgrading from BPC 7.0 to 7.5 or 10.0 ?
    Kind Regards
    Daniel

  • Best Practices for migrating .rpd

    Hi,
    Can someone please share with us the industry best practices for migrating the .rpd file from a source (QA) to a destination (Prod) environment? Our Oracle BI server runs on a Unix box, and currently what we do to migrate is save copies of the QA and prod .rpds on our local machine and do a Merge. Once merged, we FTP the resulting .rpd to the destination (prod) machine. This approach does not seem reliable, and we are looking for a better established approach for .rpd migration. Kindly point me to where I can find the documentation. Any help in this regard would be highly appreciated.

    I don't think there is an industry best practice as such. What applies for one customer doesn't necessarily apply at another site.
    Have a read of this from Mark Rittman, it should give you some good ideas:
    http://www.rittmanmead.com/2009/11/obiee-software-configuration-management-part-2-subsequent-deployments-from-dev-to-prod/
    Paul

  • Best practice for heirachical data

    First off, I have to say that JMX in Java 6 is terrific stuff. Bundling jconsole in with Java has made JMX adoption so much easier for us.
    Now, to my question. We have read-only hierarchical data (think a DOM tree) that we would like to publish via JMX. What is the best practice? We see two possibilities:
    1. Publish each node of the tree with its own object name and type. This will allow jconsole to display the information in the tree control.
    2. Publish just the root of the tree with an object name and type and then use CompositeType to describe the nodes of the tree. This means you look at the tree in the "Attribute Value" panel of jconsole.
    Are there any best practices for such data? We have implemented #2 and it works, but we are wondering if long term this might lead to unforeseen consequences.
    Thanks in advance.
    --Marty

    Path,
    I did go with #1 and it worked out great. Every node in our tree is an ObjectName node. Works very well for us.
    --Marty

  • Best practice for sharing data with modal window

    Hi team,
    what would be the best practice for sharing data with a modal
    window? I use a modal window to display record details from a
    record list, but I am not quite sure how to access the data from
    the components in the main application in the modal window.
    Any hints would be welcome
    Best
    Frank

    Pass a reference to the parent into the modal popup. Then you
    can reference anything in the parent scope.
    I haven't done this in 2.0 yet so I can't give you code. I'll
    post if I do.
    Oh, also, you can reference the parent using parentDocument.
    So in the popup you could do:
    parentDocument.myPublicVariable = "whatever";
    Tracy

  • Best practice for migration to new hardware

    Hi,
    We are commissioning new hardware for our Web Server. Our current webserver is version 6.1SP4, and for the new server we've decided to stay with 6.1 but install SP7.
    Is there a best practice for migrating content from one physical server to another?
    What configuration files should I watch out for?
    Hopefully the jump from SP4 to SP7 won't cause too many problems.
    Thanks,
    John

    Unfortunately, there is no quick solution for migrating from one server to the other. You will need to carefully reconstruct:
    - acl rules
    - server hostname configurations
    - any certificates that have been created on the old machine

  • Cisco Network 2009 - Best practices for migrating previous versions of cisco unified communications manager to cucm 7.1

    Does anybody have a copy of the above referenced presentation that you could send me?
    Thanks in advance.
    The presentation can be purchased at the following site:
    http://www.scribd.com/doc/33211957/BRKVVT-2011-Best-Practices-for-Migrating-Previous-Versions-of-Cisco-Unified-Communications#archive
    but I felt I'd ask one of my peeps first.
    Thanks in advance.
    Dennis

    Hi Dennis,
    Well..let's give this a try
    Cheers!
    Rob

  • Need Best Practice for Migrating from Solaris to Linux

    Hi Team,
    We are migrating our data center from Solaris to Linux, and our EBS 11i database (10g, 10.2.0.5) is 6 TB. Please let us know the best practice to migrate our EBS 11.5.10.2 from Solaris to Linux RHEL 5.
    We require Database 10g (10.2.0.5) on Linux x86-64 RHEL 5 and the EBS application tier on Linux x86 RHEL 5. Please let us know if any further details are needed.
    EBS version: 11.5.10.2
    DB version: 10.2.0.5
    We have checked the certifications in Oracle support.
    Oracle EBS 11.5.10.2 is not certified with Linux x86-64 RHEL 5. 
    Oracle EBS 11.5.10.2 is certified on Linux x86 RHEL 5.
    So we require Database 10g (10.2.0.5) on Linux x86-64 RHEL 5 and Application EBS on Linux x86 RHEL 5.
    Thank You.

    You can use transportable tablespaces for the database tier node.
    https://blogs.oracle.com/stevenChan/entry/10gr2_xtts_ebs11i
    https://blogs.oracle.com/stevenChan/entry/call_for_xtts_eap_participants
    For the application tier node, please see:
    https://blogs.oracle.com/stevenChan/entry/migrate_ebs_apptiers_linux
    https://blogs.oracle.com/stevenChan/entry/migrating_oracle_applications_to_new_platforms
    Thanks,
    Hussein
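    One caveat worth checking before committing to transportable tablespaces: if the source is Solaris SPARC (big-endian) and the target is Linux x86/x86-64 (little-endian), the datafiles will also need an RMAN CONVERT. The endian formats can be confirmed from the standard dictionary views, for example:
    -- Endian format of the current (source) platform
    SELECT d.platform_name, tp.endian_format
      FROM v$database d
      JOIN v$transportable_platform tp
        ON tp.platform_name = d.platform_name;
    -- Endian formats of candidate target platforms
    SELECT platform_name, endian_format
      FROM v$transportable_platform
     WHERE platform_name LIKE 'Linux%';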

  • Where to find best practices for tuning data warehouse ETL queries?

    Hi Everybody,
    Where can I find some good educational material on tuning ETL procedures for a data warehouse environment?  Everything I've found on the web regarding query tuning seems to be geared only toward OLTP systems.  (For example, most of our ETL queries don't use a WHERE clause, so the vast majority of reads are table scans and index scans, whereas most index tuning sites are striving for index seeks.)
    I have read Microsoft's "Best Practices for Data Warehousing with SQL Server 2008 R2," but I was only able to glean a few helpful hints that don't also apply to OLTP systems:
    - it is often better to recompile stored procedure query plans in order to eliminate variances introduced by parameter sniffing (i.e., better to use the right plan than to save a few seconds and use a cached plan SOMETIMES);
    - partition tables that are larger than 50 GB;
    - use minimal logging to load data precisely where you want it as fast as possible;
    - it is often better to disable non-clustered indexes before inserting a large number of rows and then rebuild them immediately afterward (sometimes even for clustered indexes, but test first);
    - rebuild statistics after every load of a table.
    But I still feel like I'm missing some very crucial concepts for performant ETL development.
    BTW, our office uses SSIS, but only as a glorified stored procedure execution manager, so I'm not looking for SSIS ETL best practices.  Except for a few packages that pull from source systems, the majority of our SSIS packages consist of numerous "Execute SQL" tasks.
    Thanks, and any best practices you could include here would be greatly appreciated.
    -Eric

    Online ETL solutions are among the most challenging to get right. To do that efficiently, you can read my blogs on online DWH solutions; they show how you can configure an online DWH solution for ETL using the MERGE command of SQL Server 2008, and also cover some important concepts that apply to any DWH solution, such as indexing, de-normalization, etc.
    http://www.sqlserver-performance-tuning.com/apps/blog/show/12927061-data-warehousing-workshop-1-4-
    http://www.sqlserver-performance-tuning.com/apps/blog/show/12927103-data-warehousing-workshop-2-4-
    http://www.sqlserver-performance-tuning.com/apps/blog/show/12927173-data-warehousing-workshop-3-4-
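    As a rough illustration of the MERGE-based loading, an upsert from a staging table into a (hypothetical) dimension table might look like this:
    MERGE dbo.DimCustomer AS tgt
    USING dbo.StgCustomer AS src
       ON tgt.CustomerID = src.CustomerID
    WHEN MATCHED AND tgt.CustomerName <> src.CustomerName THEN
        UPDATE SET tgt.CustomerName = src.CustomerName
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerID, CustomerName)
        VALUES (src.CustomerID, src.CustomerName);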
    Kindly let me know if any further help is needed
    Shehap (DB Consultant/DB Architect) Think More deeply of DB Stress Stabilities

  • Best Practice for Migration of BO  from one server to another

    Hi All,
    I would like to know what the best practice is for migration of BO from one server to another.
    I have installed BO XI R2 on my server.
    Thanks,
    Anendu Bothra
    Edited by: Anendu Bothra on Mar 5, 2009 10:24 AM

    You need to copy your input and output file stores from the old server to the new server. By default these are located in the <Business Objects install path>\FileStore directory.
    Then you need to stop the CMS, right-click the CMS, click the Configuration tab, and then click Specify.
    Choose Copy, then click OK.
    Choose the version information for the source CMS database.
    Select the database type for the source CMS database, and then specify its database information (including host name, user name, and password).
    Select the database type for the destination CMS database, and then specify its database information (including host name, user name, and password).
    When the CMS database has finished copying, click OK.
    Once this process has been completed, start the CMS and click on "Update Objects", located at the top of the CCM.
    I'd advise taking full backups beforehand.

  • Best practices for initial data loads to MDM

    Hi,
    We need to load more than 300,000 vendors from SAP into the MDM production repository. The import server might take days to load that much data even if no errors occur.
    Are there any best practices for initial loads to MDM available? What considerations must be made while doing the initial loads?
    Harsha

    Hello Harsh
    With SP05 Patch 1 there is a file aggregation functionality in the import port. It is supposed to optimize the import performance.
    BTW, give me your mail address and I will send you an idoc packaging paper for MDM.
    Regards,
    Goekhan

  • Best Practice for Designing Database Tables?

    Hi,
    I work at a company for tracking devices (GPS devices). Our SQL Server database is designed to have a table for each device we sell; currently there are 2,500 tables in our database, and they all have the same columns - they only differ in table name. Each device sends about 4K records per day.
    Currently each table holds from 10K to 300K records.
    What is the best practice to design a database in this situation?
    When accessing the database from a C# application, which is better to use: direct SQL commands or views?
    A detailed description of what is best to do in such a scenario would be great.
    Thanks in advance.
    Edit:
    Tables columns are:
    [MessageID]
    , [MessageUnit]
    , [MessageLong]
    , [MessageLat]
    , [MessageSpeed]
    , [MessageTime]
    , [MessageDate]
    , [MessageHeading]
    , [MessageSatNumber]
    , [MessageInput]
    , [MessageCreationDate]
    , [MessageInput2]
    , [MessageInput3]
    , [MessageIO]

    Hello Louis, thank you so much for your informative post. I'll describe in detail the situations I've come through in my 9 months of work at the company (working as a software engineer, but I am planning to take over database maintenance since no one is maintaining it right now and I cannot do anything else in the code to make it faster).
    At the end of every month our clients generate reports for the previous month for all their cars; some clients have 100+ cars, and some have a few. This is when the real issue starts: they are calling their data from our server over the internet while we have 2,000 units sending data to our server, and they keep getting read timeouts since SQL Server gives priority to the inserts and holds all select commands. I solved it temporarily in the code using "Read Uncommitted" when I initialize a connection through C#.
    The other issue is that generating reports for a month or two takes a lot of time when selecting 100+ units. That's what I want to solve. The problem is that the person who wrote the C# app used hard-coded SQL statements, AND the company is refusing to upgrade from SQL Server 2003 and Windows Server 2003.
    Now talking about reports: there are summary reports, stops reports, zone reports, etc.; most of them usually depend on at least MessageTime, MessageDate, MessageSpeed, MessageIO and MessageSatNumber.
    So from your post I conclude that for now I need to set up snapshots so that select statements don't get kicked out in favor of insert commands, but does SQL Server automatically select from the snapshots or do I have to tell it to do so?
    Other than proper indexing, what else do I need? Tom Phillips suggested table partitioning, but I don't think it is needed in my case since our database size is 78 GB.
    When I run code analysis on the app, Visual Studio tells me I'd better use stored procedures and views rather than hard-coded SELECT statements; what difference will this make in terms of performance?
    Thanks in advance.
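    Two sketches that may help here, both hypothetical and to be tested first: collapsing the per-device tables into one table keyed by device, and enabling read-committed snapshot isolation (available from SQL Server 2005 onwards) so reports read row versions instead of blocking behind the inserts. The table, database and column choices below are illustrative only:
    -- One consolidated table instead of thousands of identical per-device tables
    CREATE TABLE dbo.DeviceMessage
    (
        DeviceID         INT          NOT NULL,
        MessageID        BIGINT       NOT NULL,
        MessageTime      DATETIME     NOT NULL,  -- only a subset of the original columns shown
        MessageSpeed     DECIMAL(9,2) NULL,
        MessageLong      DECIMAL(9,6) NULL,
        MessageLat       DECIMAL(9,6) NULL,
        MessageSatNumber TINYINT      NULL,
        MessageIO        VARCHAR(50)  NULL,
        CONSTRAINT PK_DeviceMessage PRIMARY KEY (DeviceID, MessageTime, MessageID)
    );
    -- Reports stop blocking behind inserts once row versioning is enabled
    -- (requires exclusive access to the database while the option is switched)
    ALTER DATABASE GpsTracking SET READ_COMMITTED_SNAPSHOT ON;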

  • Obiee 11g : Best practice for filtering data allowed to user

    Hi gurus,
    I have a table of the allowed areas for each user.
    I want to show only the data facts associated with these allowed areas.
    For instance, my user scott can see France and Italy data.
    I made a session variable and put this session variable in a filter.
    It works OK, but only one value (the first one, I think) is taken into account (for instance, with my solution scott will see only France data).
    I need all the possible values.
    I tried the row-wise option of the session variable, but it doesn't work (OBIEE error).
    I've read things on the internet about using STRAGG or VALUELISTOF, but neither worked.
    What would be the best practice to achieve this goal of filtering data with per-user conditions stored in the database?
    Thanks in advance, Emmanuel

    Check this link
    http://oraclebizint.wordpress.com/2008/06/30/oracle-bi-ee-1013332-row-level-security-and-row-wise-intialized-session-variables/
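    In outline, the approach in that post is a row-wise initialized session variable, so one user can carry several allowed values. The initialization block SQL is something along these lines (table and column names invented):
    -- Row-wise initialization block: first column is the variable name,
    -- second column is one allowed value per row for the logged-in user
    SELECT 'ALLOWED_AREAS', area_code
      FROM security_user_area
     WHERE UPPER(user_login) = UPPER(':USER')
    The data filter in the repository then references VALUEOF(NQ_SESSION."ALLOWED_AREAS"); with a row-wise variable, OBIEE expands the comparison to cover all of the user's rows rather than just the first value.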

  • Best practice for extracting data to feed external DW

    We are having a healthy debate with our EDW team about extracting data from SAP.  They want to go directly against ECC tables using Informatica and my SAP team is saying this is not a best practice and could potentially be a performance drain.  We are recommending going against BW at the ODS level.  Does anyone have any recommendations or thoughts on this?

    Hi,
    As you asked for Best Practice, here it is in SAP landscape.
    1. Full load or delta load of data from SAP ECC to SAP BI (BW): SAP BI understands the data element structure of SAP ECC, and the delta mechanism is the continuous process of loading data from SAP ECC (the transaction system) to BI (the analytic system).
    2. You can store transaction data in DSOs (at a granular level) and in InfoCubes (at a summarized level) within SAP BI. You can have master data from SAP ECC coming into SAP BI separately.
    3. Within SAP BI, you SHOULD use the OpenHub service to provide SAP BI data to other external systems. You must not connect an external extractor to fetch data from DSOs and InfoCubes into the target system. The OpenHub service is the tool that facilitates data feeding to external systems. You can have Informatica take data from the OpenHubs of SAP BI.
    Hope I have explained this to your satisfaction.
    Thanks,
    S

  • Best Practice for Master Data Reporting

    Dear SAP-Experts,
    We face a challenge at the moment and we are still trying to find the right approach to it:
    Business requirement is to analyze SAP Material-related Master Data with the BEx Analyzer (Master Data Reporting)
    Questions they want to answer here are for example:
    - How many active Materials/SKUs do we have?
    - Which country/Sales Org has adopted certain Materials?
    - How many Series do we have?
    - How many SKUs belong to a specific season
    - How many SKUs are in a certain product lifecycle
    - etc.
    The challenge is that the master data is stored in tables with different keys in R/3.
    The keys in these tables are on various levels (a selection below):
    - Material
    - Material / Sales Org / Distribution Channel
    - Material / Grid Value
    - Material / Grid Value / Sales Org / Distribution Channel
    - Material / Grid Value / Sales Org / Distribution Channel / Season
    - Material / Plant
    - Material / Plant / Category
    - Material / Sales Org / Category
    etc.
    So even though the information is available at different levels of detail, the business requirement is to have one query/report that combines all the information. We are currently struggling a bit to decide what would be the best approach for this requirement. Did anyone face such a requirement before - and what would be the best practice? We already tried to find information online, but it seems master data reporting is not very well documented. Thanks a lot for your valuable contribution to this discussion.
    Best regards
    Lukas
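    Whichever BW modeling option is chosen (InfoSet, MultiProvider, or a dedicated master-data InfoProvider), the underlying shape of the problem is aggregating the lower-granularity keys up to the material level. A purely illustrative SQL sketch, with invented table names, of one of the questions above ("How many SKUs belong to a specific season?"):
    SELECT sls.season,
           COUNT(DISTINCT mat.material_id) AS sku_count
      FROM material mat                          -- key: Material
      LEFT JOIN material_sales_season sls        -- key: Material / Grid Value / Sales Org / Distr. Channel / Season
        ON sls.material_id = mat.material_id
     WHERE mat.status = 'ACTIVE'
     GROUP BY sls.season;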

