Correlated subquery for joining a summary table to the original data table

I want to list a set of fields from an inner-query that correspond to calculated fields on an outer-query. A set of date fields on the outer-query signify a range to correlate records on the inner-query. Records on the inner-query have a single date field which should fall within the outer-query date range. At this point, I cannot determine the correct syntax for this query, but am able to achieve the desired results using a left outer join and case-when arguments on what should be the inner-query records.
Here's my basic SQL right now:
SELECT DISTINCT a.service,
       a.start_range,   -- start of month
       a.end_range,     -- end of month
       a.availability,
       CASE WHEN YEAR(a.start) = YEAR(o.begin) AND MONTH(a.start) = MONTH(o.begin)
            THEN o.event_id ELSE NULL END AS event_id,
       CASE WHEN YEAR(a.start) = YEAR(o.begin) AND MONTH(a.start) = MONTH(o.begin)
            THEN o.minutes ELSE NULL END AS minutes
FROM   service_availability a
       LEFT OUTER JOIN service_outage o
           ON  a.service = o.service
           AND TIMESTAMPDIFF(YEAR, o.begin, NOW()) = 0
WHERE  TIMESTAMPDIFF(YEAR, a.start, NOW()) = 0
The limitation with this approach is that the CASE WHEN argument needs to be used for retrieving any fields on the service_outage table. I've tried several approaches including parameters in both the FROM and
WHERE clauses as well as subqueries, but cannot get any to return the data. Is there a better way to achieve this?
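One way to get the outage columns without wrapping each one in a CASE expression is to move the month test into the join condition itself, so non-matching outage rows simply fail the LEFT OUTER JOIN and come back as NULLs. A minimal sketch, assuming start_range and end_range bound the month that a.start falls in:
SELECT DISTINCT a.service,
       a.start_range,   -- start of month
       a.end_range,     -- end of month
       a.availability,
       o.event_id,      -- NULL when no outage falls in the range
       o.minutes
FROM   service_availability AS a
       LEFT OUTER JOIN service_outage AS o
           ON  o.service = a.service
           AND o.begin BETWEEN a.start_range AND a.end_range
WHERE  TIMESTAMPDIFF(YEAR, a.start, NOW()) = 0;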

Please follow basic Netiquette and post the DDL we need to answer this. Follow industry and ANSI/ISO standards in your data. You should follow ISO-11179 rules for naming data elements. You should follow ISO-8601 rules for displaying temporal data. We need to know the data types, keys and constraints on the table. Avoid dialect in favor of ANSI/ISO Standard SQL.
>> I want to list a set of fields [sic] from an inner-query that correspond to calculated fields [sic] on an outer-query. <<
Columns are not fields. Rows are not records. CASE is an expression, not an argument. Where did you get the terms "inner-query" and "outer-query"? I have never heard them before, and I have some experience with SQL. We have subqueries; is that what you are trying to say?
>> A set of date fields [sic] on the outer-query signify a range to correlate records [sic] on the inner-query. Records on the inner-query have a single date field [sic] which should fall within the outer-query date range. <<
You do not know the basic terms or how SQL works. You also posted some weird dialect of an SQL-like language. What is NOW()? Does that mean CURRENT_TIMESTAMP in your dialect? Did you know that BEGIN is a reserved word in SQL?
>> At this point, I cannot determine the correct syntax for this query, but am able to achieve the desired results using a LEFT OUTER JOIN and CASE-when arguments [sic] on what should be the inner-query [sic] records [sic]. Here's my basic SQL right now: <<
Why are you doing temporal math in SQL? Where are the calendar and report period tables?
Since SQL is a database language, we prefer to do lookups and not calculations. Lookups can be optimized, while temporal math messes up optimization. A useful idiom is a report period calendar that everyone uses, so there is no way to get disagreements in the DML.
The report period table gives a name to a range of dates that is common to the entire enterprise.
CREATE TABLE Month_Periods
(month_name CHAR(10) NOT NULL PRIMARY KEY
   CHECK (month_name LIKE '[12][0-9][0-9][0-9]-[01][0-9]-00'),
 month_start_date DATE NOT NULL,
 month_end_date DATE NOT NULL,
 CONSTRAINT date_ordering
   CHECK (month_start_date <= month_end_date)
 -- etc.
);
These report periods can overlap or have gaps. I like the MySQL convention of using double zeroes for months and years, that is, 'yyyy-mm-00' for a month within a year and 'yyyy-00-00' for the whole year. The advantages are that it will sort with the ISO-8601 date format required by Standard SQL and it is language independent. The patterns for validation are '[12][0-9][0-9][0-9]-00-00' and '[12][0-9][0-9][0-9]-[01][0-9]-00'.
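With such a calendar in place, the original query becomes a lookup instead of temporal math. A sketch only, using the poster's Service_Outage table with ISO-11179-style renames (outage_start standing in for the reserved word BEGIN, outage_minutes for minutes):
SELECT P.month_name, O.service_name, O.event_id, O.outage_minutes
  FROM Month_Periods AS P
       LEFT OUTER JOIN Service_Outage AS O
       ON O.outage_start BETWEEN P.month_start_date
                             AND P.month_end_date
 WHERE P.month_name = '2012-06-00';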
Why is an outage not an availability status or an event? 
>> The limitation with this approach is that the CASE WHEN argument [sic] needs to be used for retrieving [sic: querying is not retrieving] any fields [sic] on the Service_Outage table. <<
We have no DDL! Not even sample data. 
>> I've tried several approaches including parameters in both the FROM and WHERE clauses as well as subqueries, but cannot get any to return the data. <<
Parameters exist in procedure declarations; arguments are the actual values passed in them. Neither of them has anything to do with a query.
Want to try again? 
--CELKO-- Books in Celko Series for Morgan-Kaufmann Publishing: Analytics and OLAP in SQL / Data and Databases: Concepts in Practice / Data, Measurements and Standards in SQL / SQL for Smarties / SQL Programming Style / SQL Puzzles and Answers / Thinking in Sets / Trees and Hierarchies in SQL

Similar Messages

  • Look for 'Purchase Order' and 'Due date' Tables

    Hi Everyone,
    I am creating a form for an automatic payment and I am having some difficulty finding the tables for these fields:
    - Purchase Order (EBELN)
    - Due date (NETDT)
    Can you please help me?
    Regards.

    I hope you checked these tables.
    I found these with a where-used search on your NETDT field. Also, these tables have the field BELNR, which might interest you.
      Table   Table description                       Field   Field description
      BWPOS   Valuations for Open Items               FAEDT   Due Date for Net Payment
      DKKOP   Balance Audit Trail                     NETDT   Due Date for Net Payment
      DKOKP   Open Item Account Balance Audit Trail   NETDT   Due Date for Net Payment
      DSKOP   Balance Audit Trail                     NETDT   Due Date for Net Payment
      MHND    Dunning Data                            FAEDT   Due Date for Net Payment
    good luck,
    ags.

  • How to view master data & transaction data tables for the Purchasing cube

    Hi BW masters,
    How can I check which tables (master data tables and transaction data tables) are involved with the Business Content cube (Purchasing data, 0PUR_C01)? I would like to know this so I can go to those tables and browse their data. Where should we go to check this?
    I will reward the points for the good answers.:-)
    Thanks
    Syed.

    Hi Syed,
    to view a table's content you can use SE16.
    You can try the infocube name: the dimension tables of a cube are stored in D[infocube name]1 through D[infocube name]F, and the fact tables are F[infocube name] and E[infocube name] (the compressed fact table).
    Infocube data are stored in a star schema; you can use LISTCUBE to view the content (and LISTSCHEMA to see the structure/tables of the cube).
    Master data (not time-dependent) are stored in the text table T[infoobject name], attributes in P[infoobject name] and hierarchies in H[infoobject name].
    Take a look at BW multidimensional data modeling for details:
    http://help.sap.com/bp_biv235/BI_EN/documentation/Multi-dimensional_modeling_EN.doc
    hope this helps.

  • Why we cannot see the New Data table of a standard DSO in SE16

    Hi,
    It is said that a standard DSO has three tables (New Data, Active Data and Change Log), so why can we not see the New Data table of a standard DSO in SE16?
    Regards,
    Sushant

    Hi Sushant,
    It is possible to see the data of all 3 DSO tables through SE16. Maybe you do not have authorization to view data through SE16.
    Sankar Kumar

  • Filter - data table

    Can I use Xcelsius just to show, in an organised way, part of a table with filters?
    Ex.:
    Type     Team     Week     Game.Nr.     Date     Hour     Location     Home     Gast
    BvV     CadJL     34     80226     23/08/08     10:00     A     BBC Kangoeroes Willebr - A     Basket Tongeren
    BvV     JunJL     34     80124     23/08/08     15:00     WEG     BBC Helios Zottegem     BBC Kangoeroes Willebr - A
    Comp     JunJL     35     13108092     30/08/08     11:00     A     BBC Kangoeroes Willebr - A     BBC Schelle - A
    Comp     CadJL     35     13120092     30/08/08     13:00     A     BBC Kangoeroes Willebr     BBC Geel
    Comp     CadPM     35     6101125     30/08/08     10:45     WEG     Londerzeelse Dunkers - B     Dames Willebroek
    Comp     JunJL     36     13108103     6/09/08     11:00     A     BBC Kangoeroes Willebr - A     BBC Falco Gent - A
    With the filter, I want the user to select a team and/or week. The result should be a table matching one or both criteria.

    Hi Wim,
    As you have discovered, my experience also has been that the filter selector only selects a single row.  This is also consistent with the animated help for this component.
    The following "workaround" to the limitation of the Filter Selector component will meet your requirement:
    1) Place a Combo Box component on the canvas for each of the filter criteria you require;
    2) Link the Labels property of each combo box to a cell range that contains a list of the unique values;
    3) Set the Data Insertion Type of each Combo Box to "Label" and bind to a Destination cell to receive the selected item;
    4) Define a formula cell to act as the filter key, which is a concatenation of all of the values of the Destination cells from step 3;
    5) Insert a new column at the beginning of your data table to act as the filter key for each row in the table.  Each row in this column should contain a formula that concatenates each of the cells that correspond to the columns you want to filter by;
    6) Place an additional Combo Box on the canvas.  Set the Labels property to the range of cells in the filter key column defined in step 5.  Set the Data Insertion Type to "Filtered Rows".  Set the Source Data range to all of the cells in your table.  Set the Destination Data to an area in the spreadsheet that will receive the filtered subset of rows from your original data table.  Set the Selected Item property of this combo box to reference the filter key formula cell defined in step 4;
    7) Place a spreadsheet table component on the canvas and link the Display Data property to the Destination cell range defined in step 6.
    Once you have the layout as desired, hide the Combo Box defined in Step 6 behind the spreadsheet table defined in step 7 so that the end user cannot interact with it.  This Combo Box is used as an indirect method of dynamically filtering the rows from your data table.
    One other point to keep in mind:  Xcelsius does not handle large volumes of data very well.  Depending on the number of columns in your 500 row table, you may experience performance issues, or Xcelsius may stop responding altogether.
    The above approach may seem convoluted but it achieves the desired result.
    Let me know if you need any clarification of the above technique.
    Regards,
    Mustafa.

  • Splitting of catalogue and data tables in their core BO management database

    Hi All,
    Could you please give your thoughts on the recommendation below, suggested by our DBAs:
    It is recommended to split the catalogue tables from the data tables and other data objects (e.g. indexes) at the storage level.
    Is splitting the catalogue and data tables in the core BO management database a recommendable solution?
    Many Thanks,
    Madhu

    Dear Madhu,
    it is really safe to separate the BO management tables (the repository) from the tables containing the data.
    We went a step further and created a BO database (dedicated server) containing only the repository.
    Because the BO database may provide metadata for different universes pointing to different databases containing the data, we get easier administration and some performance advantages with this separation.
    bye
    yk

  • Can we implement a custom SQL query in CR for joining two tables

    Hi All,
    Is there any way to implement a custom SQL query in CR for joining two tables?
    My requirement here is that I need to write SQL logic for joining the two tables...
    Thanks,
    Gana

    In the Database Expert, expand the Create New Connection folder and browse the subfolders to locate your data source.
    Log on to your data source if necessary.
    Under your data source, double-click the Add Command node.
    In the Add Command to Report dialog box, enter an appropriate query/command for the data source you have opened.
    For example:
    SELECT
        Customer.`Customer ID`,
        Customer.`Customer Name`,
        Customer.`Last Year's Sales`,
        Customer.`Region`,
        Customer.`Country`,
        Orders.`Order Amount`,
        Orders.`Customer ID`,
        Orders.`Order Date`
    FROM
        Customer Customer INNER JOIN Orders Orders ON
            Customer.`Customer ID` = Orders.`Customer ID`
    WHERE
        (Customer.`Country` = 'USA' OR
        Customer.`Country` = 'Canada') AND
        Customer.`Last Year's Sales` < 10000
    ORDER BY
        Customer.`Country` ASC,
        Customer.`Region` ASC
    Note: The use of double or single quotes (and other SQL syntax) is determined by the database driver used by your report. You must, however, manually add the quotes and other elements of the syntax as you create the command.
    Optionally, you can create a parameter for your command by clicking Create and entering information in the Command Parameter dialog box.
    For more information about creating parameters, see To create a parameter for a command object.
    Click OK.
    You are returned to the Report Designer. In the Field Explorer, under Database Fields, a Command table appears listing the database fields you specified.
    Note:
    To construct the virtual table from your Command, the command must be executed once. If the command has parameters, you will be prompted to enter values for each one.
    By default, your command is called Command. You can change its alias by selecting it and pressing F2.
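    For reference, once created, a command parameter is referenced inside the command text itself with curly-brace syntax. A small sketch against the sample Customer query above, where {?CountryParam} is a hypothetical string parameter added via the Create button:
    SELECT
        Customer.`Customer ID`,
        Customer.`Customer Name`,
        Customer.`Last Year's Sales`
    FROM
        Customer Customer
    WHERE
        Customer.`Country` = '{?CountryParam}'
    Note that a string parameter needs its surrounding quotes typed manually, just like any other literal in the command.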

  • How can I join the header and item tables to fetch the data

    hi experts,
    I have a doubt about using an inner join versus FOR ALL ENTRIES when fetching data from the item table MSEG, taking the document number from MKPF. Please explain the difference: what happens if I use each of the two statements for fetching data?

    Hi,
    Both have the same functionality,
    but if you are using FAE, you have to check for
    ~ the initial condition of the source table,
    ~ duplicate entries, if any.
    An inner join will fetch the data from all the joined tables at once. FAE will fetch the data from one table first, then use that data to fetch data from the subsequent table.
    Say, in your case, if you are using FAE:
    1. select from mkpf.
    2. select from mseg FOR ALL ENTRIES IN i_mkpf.
    First try using JOIN; if it is taking lots of time, then try FAE.
    regards,
    madhu
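    For comparison, the single-pass join that the reply recommends trying first would look something like this in plain SQL; a sketch only, assuming the standard MBLNR/MJAHR header-item keys linking MKPF and MSEG:
    SELECT k.mblnr,    -- material document number
           k.mjahr,    -- material document year
           k.budat,    -- posting date from the header
           s.matnr,    -- material from the item
           s.menge     -- quantity from the item
    FROM   mkpf AS k
           INNER JOIN mseg AS s
               ON  s.mblnr = k.mblnr
               AND s.mjahr = k.mjahr
    WHERE  k.budat >= '2011-01-01';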

  • Data being dropped after joining Fiscal Time with Fact Table

    Hi,
    In my RPD, I have joined the fiscal calendar with my fact tables: Fiscal Day is joined to the Create Time column in the fact table.
    Fiscal Day data is: 6/6/2006 12:00:00
    Create Time data is: 6/6/2006 11:45:49
    6/6/2006 11:56:50
    When I pull the fiscal day column with the other columns from the fact table in the report, I only see a subset of my data. I believe this is due to the timestamp, and it is causing records to be dropped.
    Is there any way I can join these and not drop records due to the timestamp?
    Any suggestions will be appreciated.
    Thanks

    I assume that the fiscal day table is at the day level. I would suggest that you store the date in an integer format (for example 20060606, 'YYYYMMDD'); similarly, you may store the creation_date_key in the fact table in the same format, then do a direct join. This will provide good performance for your reporting, since you are joining integers and not date or char formats.
    If you want the time of the creation date also, you might want to store it in a separate column in, say, HHMMSS format; this, when joined with a time dimension, will also allow hierarchical reporting for time.
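    In plain SQL terms, the mismatch is that a timestamp such as 6/6/2006 11:45:49 never equals the day value 6/6/2006 12:00:00, so an equality join drops those rows. Truncating the timestamp to day grain before joining avoids this; a minimal sketch with hypothetical table and column names:
    -- Day-grain join: CAST drops the time-of-day part, so
    -- 2006-06-06 11:45:49 matches the fiscal day 2006-06-06.
    SELECT d.fiscal_day, f.create_time, f.measure
    FROM   fact_table AS f
           JOIN fiscal_day_dim AS d
             ON CAST(f.create_time AS DATE) = CAST(d.fiscal_day AS DATE);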

  • Join between fact table and master data table

    Is it possible to join a cube with a characteristic? This is exactly what I need:
    - In my cube I have a date (0CALDAY) and (among others) a characteristic (ZCHAR) and a key figure (ZKEYF).
    - I added a key figure (ZKFAT) as an attribute to ZCHAR. So the tables look something like this:
    Fact table:
    0CALDAY | ZCHAR | ZKEYF
    12.10.2006 | CHAR1 | 10
    12.10.2006 | CHAR2 | 20
    12.10.2006 | CHAR3 | 30
    Master data table for ZCHAR:
    ZCHAR | ZKFAT
    CHAR1 | 1000
    CHAR2 | 2000
    CHAR3 | 1500
    I need to make a query with a table that looks like this:
    0CALDAY | ZCHAR | ZKEYF | ZKFAT
    12.10.2006 | CHAR1 | 10 | 1000
    12.10.2006 | CHAR2 | 20 | 2000
    12.10.2006 | CHAR3 | 30 | 1500
    Finally, the query should result in something like this:
    0CALDAY | ZKEYF | ZKFAT
    12.10.2006 | 10 | 1000
    12.10.2006 | 20 | 2000
    12.10.2006 | 30 | 1500
    Adding ZKFAT to the fact table is not an option; I need to read this information directly from ZCHAR.
    I've tried using MultiProviders but didn't get the result I need.
    Is there any way to achieve this? Please advise.
    Thank you,

    Thank you Ram C., I've tried your solution and it may be it, but I'm having one problem:
    Since I'm reporting on the web, I used the second solution you offered and almost got the desired result. The problem is that some entries are correct but others are duplicated! Using the same example, my result table looks something like this:
    0CALDAY | ZKEYF | Calculated KF (from ZKFAT)
    12.10.2006 | 10 | 1000
    12.10.2006 | 20 | 4000
    12.10.2006 | 30 | 1500
    Second row should be 2000, but instead it shows 4000. I added ZCHAR's attribute ZKFAT as a display attribute in order to compare the results. I found that the display attribute is correct (2000) but the Calculated KF still showed duplicated data (4000).
    By the way, when I execute the query, I get this warning message:
    "Calculated key figure ZRT_C02_2_CKF004 is not defined correctly"
    Any ideas why this could be happening?
    Thank you for your help.

  • Find the key for a join between tables "crhd" and "equi" to use field "answt"

    I am writing a program to read data from table "crhd" to print a machine report,
    but I cannot find the key to join to table "equi" to print the field "answt" (acquisition value).
    Please help me find the key field for the join between tables "crhd" and "equi" so I can use the field "answt".
    Thank you very much...

    This is how the work center is linked to a particular equipment:
    Functional: in IE03 (view equipments) you see the work center of a particular equipment.
    Technical: go to view V_EQUI (a view of EQUI and EQUZ). Pass the equipment number along with V_EQUI-PM_OBJTY = 'A' (i.e. searching for the object type Work Center). In this way you will get V_EQUI-GEWRK; this is the work center ID.
    You can pass this work center ID to CRHD, and you will get the work center text:
    CRHD-OBJTY = 'A'
    CRHD-OBJID = V_EQUI-GEWRK
    and you will get CRHD-ARBPL; this is the work center.
    So you need to work backwards: along with your CRHD-OBJTY and CRHD-OBJID, you pass the same to V_EQUI and you get the list of equipment numbers along with your ANSWT (acquisition value).
    I guess this solves your problem.
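    Expressed as a single join, the lookup chain above might look like this; a sketch only, using the field names from the thread and not verified against a live system:
    SELECT e.equnr,    -- equipment number
           e.answt,    -- acquisition value
           c.arbpl     -- work center
    FROM   v_equi AS e
           JOIN crhd AS c
               ON  c.objty = 'A'        -- object type: work center
               AND c.objid = e.gewrk    -- work center ID from V_EQUI
    WHERE  e.pm_objty = 'A';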

  • Tablespace design for joined tables

    Hello everyone,
    Right now I'm designing the database structure for an application requiring high performance on the database side.
    While thinking about the tablespace structure, I came to the following question:
    When I am sure that I will join two (or more) tables very often, is it better to put them in one tablespace, or is it better to put them into different tablespaces?
    Thanks for your help!
    Daniel

    The tablespace that any particular object is in is irrelevant from a performance standpoint (assuming the tablespace parameters are identical, of course. An object in a dictionary managed tablespace may well have different performance characteristics than an object in a locally managed tablespace using ASSM). Now, if different tablespaces have data files on different physical devices, you may see some performance gains from putting different objects into different tablespaces simply because this would tend to distribute I/O over more physical devices. Of course, you could accomplish the same thing by creating data files on different devices in the same tablespace. And with newer disk subsystems, this sort of manual load balancing tends to be rather pointless.
    If you have locally attached storage and you want to distribute I/O across devices by putting objects in different tablespaces rather than assigning multiple data files to the same tablespace, it generally isn't going to matter much which objects are in which tablespace so long as the I/O is relatively evenly distributed (and assuming that there aren't any I/O patterns that need to be accounted for; i.e. if object A is used heavily during the day and object B is used heavily at night and those are the only two objects in the database, putting them in different tablespaces is pointless because you'd never be using the extra spindles simultaneously). You'd have just as much luck putting objects that start with the letters A-L in one tablespace and M-Z in another as putting tables and indexes in different tablespaces or putting related objects in the same tablespace or putting related objects in different tablespaces. So long as I/O gets distributed, it doesn't matter which objects are where.
    Personally, I'd never have separate tablespaces for performance. It is much easier to have a single tablespace with multiple data files on different devices. Multiple tablespaces should exist for manageability reasons, recoverability reasons, and because different objects require different tablespace settings.
    Justin
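    For illustration, Justin's preferred layout translates into DDL along these lines; a minimal Oracle sketch with hypothetical file paths, spreading one tablespace's data files across two devices:
    CREATE TABLESPACE app_data
        DATAFILE '/u01/oradata/orcl/app_data01.dbf' SIZE 8G,
                 '/u02/oradata/orcl/app_data02.dbf' SIZE 8G
        EXTENT MANAGEMENT LOCAL
        SEGMENT SPACE MANAGEMENT AUTO;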

  • Rules for joining tables in SQVI

    Hi all,
    I'm new to this community and this is my first time posting on this forum. I hope I didn't do anything too bad.
    I just want to know how I can tell which tables I should join. I'm trying to create a report that shows me all the open items from a G/L account (Construction in Progress) and has, besides the AuC, the AuC description, order, PO, vendor, vendor name, plant and material (at least).
    I tried to join BSIS and ANLA for starters, but whatever I select it just gives me the same error, "No data was selected". Is there any rule, like only keys should be selected as selection fields? Anything you could advise would be a blessing.
    Thank you

    Hi,
    Welcome to SDN!
    Since you are new to SAP,
    you should ask the functional consultant to provide you with the names of the tables and fields.
    The transaction SQVI is used to create queries, and you can also use it to see the relations between tables for applying join conditions.
    For join conditions:
    1. Create a view.
    2. Choose the data source "table join".
    3. Insert the 2 tables to see the relation between them.
    Search Google/SDN for more information.

  • Migration of huge data from norm tables to denorm tables for performance

    We are planning to move the normalized (NORM) tables to denormalized (DENORM) tables in an Oracle DB for a client, for performance reasons. Any ideas on the design/approach we can use to migrate this huge volume of data (2 billion records / 5 TB of data) in a window of 5 to 10 hours (or less)?
    We have developed SQL that is one single query containing multiple instances of the same table and lots of joins. Will that be helpful?

    Jonathan Lewis wrote:
    > Lother wrote:
    > > We are planning to move the NORM tables to DENORM tables in Oracle DB for a client for performance reasons. Any idea on the design/approach we can use to migrate this HUGE data (2 billion records / 5 TB of data) in a window of 5 to 10 hrs (or less)? We have developed SQL that is one single query which contains multiple instances of the same table and lots of joins. Will that be helpful?
    > Unfortunately, the fact that you have to ask these questions of the forum tells us that you don't have the skill to determine whether or not the exercise is needed at all. How have you proved that denormalisation is necessary (or even sufficient) to solve the client's performance problems if you have no idea about how to develop a mechanism to restructure the data efficiently?
    > Regards
    > Jonathan Lewis
    Your brutal honesty is certainly correct. Another thing that concerns me is that it's possible that he's planning on denormalizing tables that are normalized for a reason. What good is a system that responds like a data warehouse but has questionable data integrity? I didn't even know where to begin with asking that question, though.

  • Best practice for migrating data tables - please comment

    I have 5 new tables seeded with data that need to be promoted from a development to a production environment.
    Instead of the DBAs just using a tool to migrate the data they are insistent that I save and provide scripts for every single commit, in proper order, necessary to both build the table and insert the data from ground zero.
    I am very unaccustomed to this kind of environment and it seems much riskier for me to try and rebuild the objects from scratch when I already have a perfect, tested, ready model.
    They also require extensive documentation where every step is recorded in a document and use that for the deployment.
    I believe their rationale is they don't want to rely on backups but instead want to rely on a document that specifies each step to recreate.
    Please comment on your view of this practice. Thanks!

    >
    Please comment on your view of this practice. Thanks!
    >
    Sounds like the DBAs are using best practices to get the job done. Congratulations to them!
    >
    I have 5 new tables seeded with data that need to be promoted from a development to a production environment.
    Instead of the DBAs just using a tool to migrate the data they are insistent that I save and provide scripts for every single commit, in proper order, necessary to both build the table and insert the data from ground zero.
    >
    The process you describe is what I would expect, and require, in any well-run environment.
    >
    I am very unaccustomed to this kind of environment and it seems much riskier for me to try and rebuild the objects from scratch when I already have a perfect, tested, ready model.
    >
    Nobody cares if it is riskier for you. The production environment is sacred. Any and all risk to it must be reduced to a minimum at all costs. In my opinion a DBA should NEVER move ANYTHING from a development environment directly to a production environment. NEVER.
    Development environments are sandboxes. They are often not backed up. You or anyone else could easily modify tables or data with no controls in place. Anything done in a DEV environment is assumed to be incomplete, insecure, disposable and unvetted.
    If you are doing development and don't have scripts to rebuild your objects from scratch then you are doing it wrong. You should ALWAYS have your own backup copies of DDL in case anything happens (and it does) to the development environment. By 'have your own' I mean there should be copies in a version control system or central repository where your teammates can get their hands on them if you are not available.
    As for data - I agree with what others have said. Further - ALL data in a dev environment is assumed to be dev data and not production data. In all environments I have worked in ALL production data must be validated and approved by the business. That means every piece of data in lookup tables, fact tables, dimension tables, etc. Only computed data, such as might be in a data warehouse system generated by an ETL process might be exempt; but the process that creates that data is not exempt - that process and ultimately the data - must be signed off on by the business.
    And the business generally has no access to, or control of, a development environment. That means using a TEST or QA environment for the business users to test and validate.
    >
    They also require extensive documentation where every step is recorded in a document and use that for the deployment.
    I believe their rationale is they don't want to rely on backups but instead want to rely on a document that specifies each step to recreate.
    >
    Absolutely! That's how professional deployments are performed. Deployment documents are prepared and submitted for sign-off by each of the affected groups. Those groups can include security, DBA, business users, IT and even legal. The deployment documents always include recovery steps so that if something goes wrong or the deployment can't proceed, there is a documented procedure for how to restore the system to a valid working state.
    The deployments themselves that I participate in have representatives from each of those groups in the room or on a conference call as each step of the deployment is performed. Your 5 tables may be used by stored procedures, views or other code that has to be deployed as part of the same process. Each step of the deployment has to be performed in the correct order. If something goes wrong, the responsible party is responsible for assisting in the retry or recovery of their component.
    It is absolutely vital to have a known, secure, repeatable process for deployments. There are no shortcuts. I agree that for a simple scenario of 5 new tables and a small amount of data it may seem like overkill.
    But, despite what you say, it simply cannot be that easy, for one simple reason: adding 5 tables with data to a production system has no business impact or utility at all unless there is some code, process or application somewhere that accesses those tables and data. Your post didn't mention the part about what changes are being made to actually USE what you are adding.
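    To make the scripted style concrete: each deployment step is a saved, reviewable script rather than a tool action. A hypothetical fragment for one of the five tables (all names invented for illustration):
    -- Deployment step 3: create a lookup table and seed business-approved data.
    CREATE TABLE order_status
    ( status_code CHAR(3)     NOT NULL PRIMARY KEY,
      status_name VARCHAR(30) NOT NULL );
    INSERT INTO order_status (status_code, status_name) VALUES ('NEW', 'New');
    INSERT INTO order_status (status_code, status_name) VALUES ('SHP', 'Shipped');
    COMMIT;
    -- Documented rollback for this step: DROP TABLE order_status;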
