Is using DELETE CASCADE a best practice?

Hi,
My current requirement is to delete specific records from a parent table and also from its child tables.
So DELETE CASCADE would be helpful.
I need to know whether it is good practice to use it? I hope it is.
I also think the performance will be better than deleting manually.
Please throw some light on this.
Thank you.

> However if you allow CASCADE DELETE, data in both child1 and child2 will be wiped out when deleting the parent key which may not be the intended behavior. Since it could cause unintended data loss, I would not use it if I am designing the table.
We need to be realistic about CASCADE DELETE. How do you audit-trail all the rows deleted in related tables?
Assume you work for AdventureWorks Cycles and a manager tells you: delete that old mountain bike from the product table. As soon as you do, a month later another manager demands it back for some reporting.
The point is: don't ever DELETE data. Move it to a history/audit table so you can provide it if somebody asks for it again (guaranteed someone will).
And my point: CASCADE DELETE should be part of a bigger database maintenance procedure. Therefore I prefer a stored procedure implementation with a proper audit trail.
Audit trail example:
http://www.sqlusa.com/bestpractices2005/auditwithoutput/
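A minimal sketch of the archive-then-delete pattern described above, using Python's built-in sqlite3 (table and column names are invented for the example, not taken from the thread or the linked article):

```python
import sqlite3

# In-memory database with a product table and a history table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE product (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE product_history (id INTEGER, name TEXT, deleted_at TEXT);
    INSERT INTO product VALUES (1, 'old mountain bike'), (2, 'road bike');
""")

# One transaction: copy the doomed rows to the history table first, then delete.
with con:
    con.execute("""
        INSERT INTO product_history (id, name, deleted_at)
        SELECT id, name, datetime('now') FROM product WHERE id = 1
    """)
    con.execute("DELETE FROM product WHERE id = 1")

# The row is gone from product but preserved in product_history.
print(con.execute("SELECT name FROM product_history").fetchall())
```

In T-SQL the same idea is commonly implemented inside a stored procedure with DELETE ... OUTPUT INTO, which captures the deleted rows and the delete in a single statement.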
Kalman Toth Database & OLAP Architect
SELECT Video Tutorials 4 Hours
New Book / Kindle: Exam 70-461 Bootcamp: Querying Microsoft SQL Server 2012

Similar Messages

  • Design decision: good to use delete cascade in database?

    Hello,
    Suppose there are two tables: customer and address.
    These two tables are linked as one (customer) to many (address), and delete cascade. Every time when the application deletes a customer record, the linked address(es) will be deleted as well.
In my opinion, this is good because of the performance and less coding.
But it is bad for maintenance: since the address is implicitly deleted, a developer may not know this in advance if the system is not documented well.
    What is your opinion? What facts will affect your decision to use it or not?

    The design should say that when a customer record is deleted, the linked address records must also be deleted. Otherwise your database loses referential integrity.
    So for me, if the database supported cascading deletes, I would use them. Why would I do extra programming which the database is offering to do for me, especially since my version of the code is likely to be less robust than the database's version? The only exception would be if some other action needed to be taken besides just deleting the dependent records, and that action couldn't be handled within the database.
    As for your comments about maintenance, it's true that if the system is not documented then developers may have difficulty in maintaining it. However I don't exactly foresee designers saying to themselves "Oh, this system isn't well documented so I won't use Feature X".
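The cascading behavior discussed above can be sketched with Python's built-in sqlite3 (the customer/address tables come from the question; column names and data are illustrative):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when asked
con.executescript("""
    CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE address (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customer(id) ON DELETE CASCADE,
        city TEXT
    );
    INSERT INTO customer VALUES (1, 'Alice');
    INSERT INTO address VALUES (10, 1, 'Oslo'), (11, 1, 'Bergen');
""")

# Deleting the parent removes the dependent addresses automatically:
# no extra application code, and referential integrity is preserved.
con.execute("DELETE FROM customer WHERE id = 1")
print(con.execute("SELECT COUNT(*) FROM address").fetchone()[0])
```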

  • Data Warehouse using MSSQL - SSIS : Installation best practices

    Hi All,
    I am working on building a data warehouse based on MSSQL 2008 R2. The requirement is to read source data from files, put it in a stage database, perform data cleansing etc., and then move the data to the data warehouse db. Now the question is about the required number of physical servers, and which MSSQL component (database engine, SSIS) should be installed on which server, based on any best practices:
    Store source files --> Stage database --> Data warehouse db
    The data volume will be high (20-30k transactions per day). Please suggest.
    Thank you
    MSSQL.Arc

    Microsoft documentation: "Use a Reference Architecture to Build an Optimal Warehouse.
    Microsoft SQL Server 2012 Fast Track is a reference architecture data warehouse solution giving you a step-by-step guide to build a balanced hardware configuration and the exact software setup.
    Step-by-step instructions on what hardware to buy and how to put the server together.
    Setup instructions  on installing the software and all the specific settings to configure.
    Pre-certified by Microsoft and industry hardware partners for the most optimal hardware and software configuration."
    LINK:
    https://www.microsoft.com/en-us/sqlserver/solutions-technologies/data-warehousing/reference-architecture.aspx
    Kalman Toth Database & OLAP Architect
    IPAD SELECT Query Video Tutorial 3.5 Hours
    New Book / Kindle: Exam 70-461 Bootcamp: Querying Microsoft SQL Server 2012

  • Use of AC adapter - best practice?

    I just purchased a MacBook Pro a few days ago and the shop assistant told me it was best to fully charge the battery and then use it until it was completely dead and then fully recharge it at which point I should take out the AC adapter. He basically told me that I should never leave the AC adapter in when the computer is fully charged even when I am using it in an office for several hours at a time.
    I have read and executed the instructions regarding calibrating the battery and this makes sense to me, however I cannot find any articles supporting the statement that I should never leave the AC adapter in once the battery is fully charged.
    Could you please tell me if I should leave it in or out?

    Morten Twellmann wrote:
    I just purchased a MacBook Pro a few days ago and the shop assistant told me it was best to fully charge the battery and then use it until it was completely dead and then fully recharge it at which point I should take out the AC adapter. He basically told me that I should never leave the AC adapter in when the computer is fully charged even when I am using it in an office for several hours at a time.
    Complete and utter nonsense. In fact, except for when you do the battery calibration every few months, you are better off NOT deep cycling (ie, full charge/discharge) lithium ion batteries. By never using the computer with the AC adaptor in as you were told, you are forcing the computer to operate in a low energy consumption mode. The end result is lower processor/graphics performance. Using the computer on AC power, even when the battery is full, is not harmful to the battery.
    http://electronics.howstuffworks.com/lithium-ion-battery2.htm
    http://www.batteryuniversity.com/parttwo-34.htm
    Could you please tell me if I should leave it in or out?
    You can do whatever suits your needs/desires. The only thing to remember is that you should try to use the battery regularly: either run on battery power on a regular basis or, if this isn't always possible, perform the calibration as recommended by Apple.

  • Using XML with Flex - Best Practice Question

    Hi
    I am using an XML file as a dataProvider for my Flex application.
    My application is quite large and is being fed a lot of data; therefore the XML file that I am using is also quite large.
    I have read some tutorials and looked through some online examples and am just after a little advice. My application is working, but I am not sure if I have gone about setting up and using my data provider in the best possible (most efficient) way.
    I am basically after some advice as to whether the way I am using (accessing) my XML and populating my Flex application is the best / most efficient way.
    My application consists of the main application (MXML) file and also additional AS files / components.
    I am setting up the connection to my XML file within my main application file using HTTPService:
    <mx:HTTPService
        id="myResults"
        url="http://localhost/myFlexDataProvider.xml"
        resultFormat="e4x"
        result="myResultHandler(event)" />
    and handling the results with the following function:
    public function myResultHandler(event:ResultEvent):void
    {
        myDataFeed = event.result as XML;
    }
    Within my application I am setting my variable values by firstly declaring them:
    public var fName:String;
    public var lName:String;
    public var postCode:String;
    public var telNum:int;
    and then giving them a value by "drilling" into the XML, e.g.:
    fName = myDataFeed.employeeDetails.contactDetails.firstName;
    lName = myDataFeed.employeeDetails.contactDetails.lastName;
    postCode = myDataFeed.employeeDetails.contactDetails.address.postcode;
    telNum = myDataFeed.employeeDetails.contactDetails.telNum;
    etc.
    Therefore, for any of my external components (components in a different AS file), I am referencing their values using Application:
    import mx.core.Application;
    and setting the values / variables within the AS components as follows:
    public var fName:String;
    public var lName:String;
    fName = Application.application.myDataFeed.employeeDetails.contactDetails.firstName;
    lName = Application.application.myDataFeed.employeeDetails.contactDetails.lastName;
    As mentioned, this method seems to work; however, is it the best way to do it?
    - Connect to my XML file
    - Set up my application variables
    - Give my variables values from the XML file
    Bearing in mind that in this particular application there are many variables that need to be set, and therefore a lot of lines of code just setting up and assigning variable values from my XML file.
    Could someone please advise me on this one?
    Thanks a lot,
    Jon.

    I don't see any problem with that.
    Your alternative is to skip the instance variables and query the XML directly. If you use the values in a lot of places, then the variables will be easier to use and maintain.
    Also, instead of instance variables, you could put the values in an "associative array" (object/hashtable), or in a Dictionary.
    Tracy
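    Tracy's associative-array suggestion can be sketched like this (shown in Python rather than ActionScript; the XML shape and field names mirror Jon's example and are otherwise hypothetical):

    ```python
    import xml.etree.ElementTree as ET

    # A tiny stand-in for the feed Jon loads over HTTPService.
    feed = ET.fromstring("""
    <employeeDetails>
      <contactDetails>
        <firstName>Jon</firstName>
        <lastName>Smith</lastName>
        <address><postcode>AB1 2CD</postcode></address>
      </contactDetails>
    </employeeDetails>
    """)

    # One lookup table instead of dozens of individually declared variables.
    contact = {
        "fName": feed.findtext("contactDetails/firstName"),
        "lName": feed.findtext("contactDetails/lastName"),
        "postCode": feed.findtext("contactDetails/address/postcode"),
    }
    print(contact["fName"], contact["postCode"])
    ```

    If the same values are read in many places, named variables (or a typed value object) stay easier to maintain, as Tracy says; the dictionary mainly keeps the setup code short when there are many fields.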

  • Best practice to host websites on xserve with mac os x server leopard.

    Hi Guys,
    I'm trying to optimize the Xserve to host multiple Joomla sites...
    Can someone help me with "hidden manuals" or your own experience about best practices out there?
    It'd be great on your part...
    Cheers

    Erm, Joomla site hosting 'just works' with Leopard Server site virtualisation and the built in mysql.
    If you want the best practice try Mac OS X Server Essentials Second Edition which has chapters about setting up multiple web sites.

  • Best Practice for Deploying ADF application

    I am tasked with developing a best or preferred practice for deploying a large ADF application. Background: we are in the process of redeveloping a UI for a large system. We have broken the system down into subsystems. Each of these subsystems' UI will be an ADF application. This is a move from a MS .NET front end. The backend (batch processes etc.) is being developed in Java. So my question is: if I have several ADF projects for each subsystem, plus common components that they all will use, what is the best practice to compile, package and deploy? The deployment will be to a WebLogic server or servers (cluster).
    We have a team of at least 40-50 developers worldwide, so we are looking for an automated build and deploy and would like to follow Oracle best practice. So far I have read Deploying ADF Applications (http://download.oracle.com/docs/cd/E15523_01/web.1111/e15470/deploy.htm#BGBJHGFH) and have followed the links. I have also looked at the ADF evangelist blogs - lots of chatter about ojdeploy. My concern about ojdeploy is that dependent files are also being compiled at the same time. I expected that we would want shared dependent files compiled only once (is that a valid concern?).
    So then, when we build the source out of Subversion (ojdeploy? Ant?), what is the best practice to deploy to a WebLogic server (WLST, admin console)? Again, we want it to be automated.
    Thank you in advance for replies.
    RK

    Rule 1: Never use the "Automatically Expose UI Componentes in a New Managed Bean" option, create your bindings manually;
    Rule 2: Rule 1 is always right;
    Rule 3: In doubts, refer to rule 2.
    You may also want to check out :
    http://groups.google.com/group/adf-methodology
    And :
    http://www.oracle.com/technology/products/jdev/collateral/4gl/papers/Introduction_Best_Practices.pdf

  • Upcoming SAP Best Practices Data Migration Training - Chicago

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    SAP America, Downers Grove in Chicago, IL:
    November 3 – 5, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Services
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Services to load your customer's legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Services and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
    1. Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2. Data Services fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
    3. Installation and configuration of the SBOP Data Services – Installation and deployment of the Data Services and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
    4. Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
    5. Overview of Data Quality within the Data Migration process – A demonstration of the Data Quality functionality available to partners using the full Data Services toolset as an extension to the Data Services license.
    Logistics & How to Register
    Nov. 3 – 5: SAP America, Downers Grove, IL
                     Wednesday 10AM – 5PM
                     Thursday 9AM – 5PM
                     Friday 8AM – 3PM
                     Address:
                     SAP America – Buckingham Room
                     3010 Highland Parkway
                     Downers Grove, IL USA 60515
    Partner Requirements:  All participants must bring their own laptop to install SAP Business Objects Data Services on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    •         Data Migration consultants and IDoc experts involved in data migration and integration projects
    •         Functional experts that perform mapping activities for data migration
    •         ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil – SAP Business All-in-One Development
    Frank Densborn – SAP Business All-in-One Development
    To register please use the hyperlink below.
    http://service.sap.com/~sapidb/011000358700000917382010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald

  • Upcoming SAP Best Practices Data Migration Training - Berlin

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    Berlin, Germany: October 06 – 08, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Integrator
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Integrator to load your customer's legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Integrator and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
    1. Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2. Data Integrator fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
    3. Installation and configuration of the SBOP Data Integrator – Installation and deployment of the Data Integrator and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
    4. Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
    Logistics & How to Register
    October 06 – 08: Berlin, Germany
                     Wednesday 10AM – 5PM
                     Thursday 9AM – 5PM
                     Friday 9AM – 4PM
                     SAP Deutschland AG & Co. KG
                     Rosenthaler Strasse 30
                     D-10178 Berlin, Germany
                     Training room S5 (1st floor)
    Partner Requirements:  All participants must bring their own laptop to install SAP Business Objects Data Integrator on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    •         Data Migration consultants and IDoc experts involved in data migration and integration projects
    •         Functional experts that perform mapping activities for data migration
    •         ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil – SAP Business All-in-One Development
    Frank Densborn – SAP Business All-in-One Development
    To register please follow the hyperlink below
    http://intranet.sap.com/~sapidb/011000358700000940832010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald

  • Best Practices Pharma-MM Docs & process Over view

    Hi All,
            Could anybody provide an overview of Pharma industry processes, from a typical MM point of view?
    Also, any additional useful docs for implementing the Pharma Best Practices package in the industry would help.
    Thanks
    JP

    Hi Jyoti Prakash,
    The following link will guide you and take you through the SAP Best Practices for Pharmaceuticals in detail.
    http://help.sap.com/bp_pharmav1600/Pharma_US/index.htm
    This will be the best available source for pharma best practices in line with SAP.
    Hope this will be very useful to you. Please confirm.
    Regards
    R. Senthil Mareeswaran.

  • Infomation regarding Best Practices Required,

    Dear Friends,
        Happy New Year......
    I'm working as part of the BI Excellence team in a reputed company.
    I just want to recommend that a client install the BI Best Practices (scenario: SCM). In order to do that I need to present the advantages of, and the differences between, the Best Practices (specific to BI) and a general implementation.
    When I search help.sap.com, it generally speaks about the time consumption and guidelines of the overall SAP Best Practices.
    Can anyone help me, with respect to BI (from blueprint to go-live), with the timeline differences between a SAP BI Best Practices implementation and a general implementation?
    An example with a specific scenario like SCM, taking a cube for IM and describing the start-to-end implementation process and its timeline, would help. How does the same differ when we go with a SAP BI Best Practices installation?
    Please provide your valuable suggestions, as I don't have any implementation experience.
    Requesting your valuable guidance.
    Regards
    Santhosh kumar.N

    Hi,
    http://help.sap.com/saphelp_nw2004s/helpdata/en/f6/7a0c3c40787431e10000000a114084/frameset.htm
    http://help.sap.com/bp_biv370/html/Bw.htm
    Hope it helps........
    Thanks & Regards,
    SD

  • HCM Best Practice LOAD - Error in Copy Report variants Step

    Dear Friends,
    We are trying to use/load the HCM Best Practices data for testing purposes. We have applied the HCM Best Practices add-on and it went through successfully. When we try to execute the preparation step (in the Copy Variant phase) using Tcode /HRBPUS1/C004_K01_01, we get the following error:
    <b>E00555 Make an entry in all required fields</b>
    Request you to provide some solution for the same.
    Thanks and regards,
    Abhilasha

    Hi Sunny and others,
    The main error here was that sapinst couldn't find and read cntrlW01.dbf, because the file was not at that location (/oracle/W01/112_64/dbs).
    I already solved this issue... what I did was:
    1) As the ora user, I went into sqlplus as sysdba and ran the following script (the control.sql script that was generated at the beginning of the system copy process):
    SQL> @/oracle/CONTROL.SQL
    Connected.
    ORA-32004: obsolete or deprecated parameter(s) specified for RDBMS instance
    ORA-01081: cannot start already-running ORACLE - shut it down first
    Control file created.
    Database altered.
    Tablespace altered.
    2) This is very important: you need to know where the .CTL file created by the script is, so I checked the value of the control_files parameter at that moment:
    SQL> show parameter control_files;
    /oracle/W01/oraflash/W01/controlfile/o1_mf_6pqvl4jt_.ctl
    3) Next, logged in as ora, I copied this file to the location/path where sapinst needs to read it:
    cp /oracle/W01/oraflash/W01/controlfile/o1_mf_6pqvl4jt_.ctl /oracle/W01/112_64/dbs/cntrlW01.dbf
    4) I re-ran sapinst from the point where it had stopped.
    Thank you for your help.
    Kind regards,
    João Dimas - Portugal

  • SAP Best Practice HR: DX Toolbox

    Hi Suresh,Saquib
      This post is in continuation of my previous post. Can you provide some advantages/disadvantages of using the <b>ZBPHR_ZDTT - SAP Best Practice HR: DX Toolbox</b> tool to upload PA infotypes?

    Vijay,
    I haven't used it, but I have heard that SAP provides best practices for the common infotypes. Most of the time you won't find the country-specific infotypes. I also came to know in a discussion that you cannot upload the data from the application server.
    Hope this gives you some clue.
    Thanks
    Saquib
    Message was edited by: Saquib Khan

  • EFashion sample Universes and best practices?

    Hi experts,
    Do you all think that the eFashion sample Universe was developed based on the best practices of Universe design? Below is one of my questions/problems:
    A Universe is designed to hide technical details and answer all valid business questions (queries/reports); for nonsensical questions it will show 'incompatible' etc. In the eFashion sample, I tried to compose a query to answer "for a period of time, e.g. from 2008.5 to 2008.9, in each week for each product (article), its MSRP (sales price), sold price, margin, quantity sold and promotion flag". I grabbed Product.SKUnumber, week from Time period, Unit Price MSRP and Sold at (unit price) from Product, Promotions.promotion, and Margin and Quantity sold from Measures into the Query Panel. It gives me an 'incompatible' error message when I try to run it. I think the whole sample (from database data model to Universe schema structure/joins) is flawed.
    In the Product_promotion_facts table, it seems that if a promotion lasts for more than one week, the weekid will be the starting week and duration will indicate how long it lasts. In this design, answering "what promotions run in what weeks" is not easy, because you need to join Product_promotion_facts with the Time dimension using "time.weekid between p_prom.weekid and p_prom.weekid+duration" (assuming weekid is in sequence), instead of the simple "time.weekid=p_prom.weekid".
    The weekid joins between Shop_facts, Product_promotion_facts and Calendar_year_lookup are very confusing, because one is about "the week the sale happened" and the other "the week the promotion started". No tool is smart enough to resolve this ambiguity automatically. Then there is the shortcut join between Shop_facts and Product_promotion_facts: it is based on articleid alone. Obviously the two have to be joined on both article and time (using between/and, not a simple weekid=weekid in this design), otherwise the join doesn't make sense (a sale of one article on one day joins to all the promotions of that article for all time?).
    What do you think?
    thanks.
    Edward
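    The difference between the simple equi-join and the interval join Edward describes can be sketched with Python's built-in sqlite3 (table names follow the eFashion schema he quotes; the data and the promotion name are invented):

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE time_period (weekid INTEGER);
        CREATE TABLE product_promotion_facts (weekid INTEGER, duration INTEGER, promotion TEXT);
        INSERT INTO time_period VALUES (18), (19), (20), (21), (22);
        -- a promotion starting in week 19 with duration 2:
        -- active in weeks 19-21 under Edward's between formula
        INSERT INTO product_promotion_facts VALUES (19, 2, 'spring sale');
    """)

    # The simple equi-join only finds the week the promotion *started* ...
    equi = [r[0] for r in con.execute("""
        SELECT t.weekid FROM time_period t
        JOIN product_promotion_facts p ON t.weekid = p.weekid
        ORDER BY t.weekid""")]

    # ... while the BETWEEN join finds every week the promotion was active.
    interval = [r[0] for r in con.execute("""
        SELECT t.weekid FROM time_period t
        JOIN product_promotion_facts p
          ON t.weekid BETWEEN p.weekid AND p.weekid + p.duration
        ORDER BY t.weekid""")]

    print(equi, interval)
    ```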

    You seem to have the idea that finding out whether a project uses "best practices" is the same as finding out whether a car is blue. Or perhaps you think there is a standards board somewhere which reviews projects for the use of "best practices".
    Well, it isn't like that. The most cynical viewpoint is that "best practices" is simply an advertising slogan used by IT consultants to make them appear competent to their prospective clients. But basically it's a value judgement. For example using Hibernate may be a good thing to do in many projects, but there are projects where it would not be a good thing to do. So you can't just say that using Hibernate is a "best practice".
    However it's always a good idea to keep your source code in a repository (CVS, Subversion, git, etc.) so I think most people would call that a "best practice". And you could talk about software development techniques, but "best practice" for a team of three is very different from "best practice" for a team of 250.
    So you aren't going to get a one-paragraph description of what features you should stick in your project to qualify as "best practices". And you aren't going to get a checklist off the web whereby you can rate yourself for "best practices" either. Or if you do, you'll find that the "best practice" involves buying something from the people who provided the checklist.

  • Network Services Best Practices

    Hello
    I've been using the Network Services Best Practices document (27 Sep 2006) for some years now and I wonder if there has actually been an update to it. If not, would you suggest any newer network best practices document? Something that talks about virtualization, etc. would be great.

    Hi Scott,
    Thank you for posting your issue in the forum.
    I am trying to involve someone familiar with this topic to further look at this issue. There might be some time delay. Appreciate your patience.
    Thank you for your understanding and support.
    Best Regards,
    Justin Gu
