SQL for joins

I have two tables, customer and order. I want to test some join strategies (block nested loop, sort-merge, and hash). Could anyone tell me the SQL code to perform these joins?

Hi Amanda,
Nested loop: this join is used when you have indexes defined on the join columns. You can use the USE_NL hint.
select /*+ ORDERED USE_NL(b) */ a.customer_no, a.order_no
from order a, customer b
where a.customer_no = b.customer_no;
Sort-merge: suppress the indexes and you should then be able to get a sort-merge join, or use the USE_MERGE hint, which should force a sort-merge join.
select /*+ ORDERED USE_MERGE(b) */ a.customer_no, a.order_no
from order a, customer b
where a.customer_no = b.customer_no;
Hash join: use the USE_HASH hint in the query to force a hash join. Make sure the hash_area_size parameter is set to a value like
700k, depending on the memory available on your machine. A hash join builds a hash table on customer_no from the customer table in memory and probes it with the matching records from the order table; it is used when you have more memory available on your machine.
select /*+ ORDERED USE_HASH(b) */ a.customer_no, a.order_no
from order a, customer b
where a.customer_no = b.customer_no;
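To see which join method the optimizer actually chose, you can explain the statement and display the plan (DBMS_XPLAN.DISPLAY is available from 9i onwards; on older releases query PLAN_TABLE directly). A minimal sketch:
explain plan for
select /*+ ORDERED USE_HASH(b) */ a.customer_no, a.order_no
from order a, customer b
where a.customer_no = b.customer_no;
select * from table(dbms_xplan.display);
The Operation column of the plan shows NESTED LOOPS, MERGE JOIN or HASH JOIN for the join step.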
Maybe you can grab some Oracle books on SQL tuning; they explain the hints and joins in great detail.
Mukundan.

Similar Messages

  • SQL for join table problem?

    hi, good morning,
    I still haven't found a solution to my problem; can you help me solve it?
    I have a form for registering a student for an exam. That part looks like this:
    Student Code:________ (table 3)
    Exam Code:_________ (table 3)
    Student Name:________ (table 1)
    Exam Name:_________ (table 2)
    I can write code that reads table 3, but when the three tables are joined together for this part, a problem comes up with Student Name and Exam Name: when I use the NEXTRECORD button to check a record, I cannot retrieve the Student Name and Exam Name.
    Thanks for the help. I use SQL code against table 3 to look up the details in table 1 and table 2, then retrieve that data.
    My code is like this:
    private void cmd_NexRec_RSEActionPerformed(java.awt.event.ActionEvent evt) {
        // TODO add your handling code here:
        int i = 0;
        try {
            String x1, x2;
            x1 = jTextField10.getText().trim();
            x2 = jTextField11.getText().trim();
            if (x1.equals("") && x2.equals("")) {
                counter.studentExam = "R";
                counter.registerStudentExam = "N";
                i++;
                jInternalFrame8.setVisible(false);
                Connection con = getConnection2();
                Statement s = con.createStatement(ResultSet.TYPE_SCROLL_INSENSITIVE,
                        ResultSet.CONCUR_UPDATABLE);
                ResultSet rs = getStudentExams();
                ResultSet rs1 = getCodes();
                ResultSet rs2 = getExams();
                String select = "select * from studentExamFile a"
                        + " LEFT JOIN studentData b ON a.student_code = b.student_code"
                        + " LEFT JOIN exam c ON c.exam_code = a.exam_code"
                        + " where a.student_code = '" + x1 + "' and a.exam_code = '" + x2 + "'";
                // String select = "select * from studentExamFile, exam, studentData where studentExamFile.student_code = studentData.student_code and exam.exam_code = studentExamFile.exam_code";
                // rs = s.executeQuery(select);
                rs.next();
                StudentExam c = getstudentcode(rs);
                Code c1 = getCode(rs1);
                Examcode c2 = getexamcode(rs2);
                jTextField12.setText(c.studentcode);
                jTextField13.setText(c.examcode);
                jTextField14.setText(c1.name);
                jTextField15.setText(c2.ename);
                jLabel16.setText("Record: Review");
                jInternalFrame10.setVisible(true);
                jTextField12.setEnabled(false);
                jTextField13.setEnabled(false);
                // New Line
                jTextField14.requestFocus();
            } else {
                Connection con = getConnection2();
                Statement s = con.createStatement(ResultSet.TYPE_SCROLL_INSENSITIVE, ResultSet.CONCUR_UPDATABLE);
                String w = jTextField10.getText().trim();
                String w2 = jTextField11.getText().trim();
                // String select = "select * from [studentExamFile] where [student_code] > '" + w + "'";
                // String select1 = "select * from [studentExamFile] where [exam_code] > '" + w2 + "'";
                String select = "select * from studentExamFile a"
                        + " LEFT JOIN studentData b ON a.student_code = b.student_code"
                        + " LEFT JOIN exam c ON c.exam_code = a.exam_code"
                        + " where a.student_code = '" + x1 + "' and a.exam_code = '" + x2 + "'";
                ResultSet rs;
                rs = s.executeQuery(select);
                // rs = s.executeQuery(select1);
                rs.next();
                if (rs.isFirst()) {
                    counter.registerStudentExam = "N";
                    jInternalFrame8.setVisible(false);
                    jTextField12.setText(rs.getString(1));
                    jTextField13.setText(rs.getString(2));
                    jLabel16.setText("Record: Review");
                    jInternalFrame10.setVisible(true);
                    jTextField12.setEnabled(false);
                    jTextField13.setEnabled(false);
                    // New Line
                    jTextField14.requestFocus();
                    counter.studentExam = "R";
                } else {
                    JOptionPane.showMessageDialog(this, "End of the file", "Information", JOptionPane.INFORMATION_MESSAGE);
                }
            }
        } catch (SQLException e) {
            System.out.println("Error");
            // jTextField3.requestFocus();
        }
    }

    thanks for the reply.
    I think my problem is in the connection to the database.
    String sql="select studentData.student_name, exam.exam_name from studentData,exam,studentExamFile where studentData.student_code=studentExamFile.student_code and exam.exam_code=studentExamFile.exam_code ";
                    // String sql="select studentData.student_name , exam.exam_name from studentExamFile  LEFT JOIN { studentData }  ON studentExamFile.student_code=studentData.student_code LEFT JOIN { exam }  ON exam.exam_code=studentExamFile.exam_code ";
                     Connection con = getConnection();
                     Statement s = con.createStatement(ResultSet.TYPE_SCROLL_INSENSITIVE,ResultSet.CONCUR_UPDATABLE);
                     Connection con2 = getConnection();
                     Statement s3 = con.createStatement(ResultSet.TYPE_SCROLL_INSENSITIVE,ResultSet.CONCUR_UPDATABLE);
                     System.out.println("error");//my problem inside here....help me
                     rs3=s.executeQuery(sql);
                     rs3=s3.executeQuery(sql);
                     rs3.next();
                     String s1,s2;
                     s1=rs3.getString(1);//maybe me dunno how to convert the sql data...
                     s2=rs3.getString(2);
                     rs.next();
                     rs2.next();
                    // rs3.next();
                     Code c=getCode(rs);
                     Examcode c2=getexamcode(rs2);
                     StudentExam c3=getstudentcode(rs3);
                     jTextField12.setText(c3.studentcode);
                     jTextField13.setText(c3.examcode);
                    // jTextField14.setText(rs.getString("c.name")); 
                     //jTextField15.setText(rs.getString("c2.ename"));
                     jTextField14.setText(s1); 
                     jTextField15.setText(s2);
                     jLabel16.setText("Record: Review");
                     jInternalFrame10.setVisible(true);
                     jTextField12.setEnabled(false);
                     //New Line
    jTextField14.requestFocus();
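    The curly braces around the table names in the SELECT string are not valid join syntax, which is most likely why the joined query fails; a plain ANSI join should work (just a sketch, using the column names from the post, with ? placeholders meant for a PreparedStatement):
    select sef.student_code, sef.exam_code, sd.student_name, e.exam_name
    from studentExamFile sef
    left join studentData sd on sd.student_code = sef.student_code
    left join exam e on e.exam_code = sef.exam_code
    where sef.student_code = ? and sef.exam_code = ?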

  • Can we implement the custom sql query in CR for joining the two tables

    Hi All,
    Is there any way to implement a custom SQL query in CR for joining the two tables?
    My requirement is that I need to write the SQL logic for joining the two tables...
    Thanks,
    Gana

    In the Database Expert, expand the Create New Connection folder and browse the subfolders to locate your data source.
    Log on to your data source if necessary.
    Under your data source, double-click the Add Command node.
    In the Add Command to Report dialog box, enter an appropriate query/command for the data source you have opened.
    For example:
    SELECT
        Customer.`Customer ID`,
        Customer.`Customer Name`,
        Customer.`Last Year's Sales`,
        Customer.`Region`,
        Customer.`Country`,
        Orders.`Order Amount`,
        Orders.`Customer ID`,
        Orders.`Order Date`
    FROM
        Customer Customer INNER JOIN Orders Orders ON
            Customer.`Customer ID` = Orders.`Customer ID`
    WHERE
        (Customer.`Country` = 'USA' OR
        Customer.`Country` = 'Canada') AND
        Customer.`Last Year's Sales` < 10000
    ORDER BY
        Customer.`Country` ASC,
        Customer.`Region` ASC
    Note: The use of double or single quotes (and other SQL syntax) is determined by the database driver used by your report. You must, however, manually add the quotes and other elements of the syntax as you create the command.
    Optionally, you can create a parameter for your command by clicking Create and entering information in the Command Parameter dialog box.
    For more information about creating parameters, see To create a parameter for a command object.
    Click OK.
    You are returned to the Report Designer. In the Field Explorer, under Database Fields, a Command table appears listing the database fields you specified.
    Note:
    To construct the virtual table from your Command, the command must be executed once. If the command has parameters, you will be prompted to enter values for each one.
    By default, your command is called Command. You can change its alias by selecting it and pressing F2.
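    For example, if you create a command parameter named Country (just an illustrative name), it can be referenced inside the command text with the {?ParameterName} syntax, roughly like this:
    SELECT
        Customer.`Customer ID`,
        Customer.`Customer Name`,
        Orders.`Order Amount`
    FROM
        Customer Customer INNER JOIN Orders Orders ON
            Customer.`Customer ID` = Orders.`Customer ID`
    WHERE
        Customer.`Country` = '{?Country}'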

  • Forming a generic SQL query for joining multiple SAP tables in ABAP

    Hi,
    I am new to the ABAP field and am facing an issue on an SAP JCo project. I have used the RFC_READ_TABLE FM and customized it, but I am stuck on how to write a generic Open SQL select statement for joining multiple tables using RFC_READ_TABLE. Kindly help with this issue.
    Thanks.

    something like this? If your tuples are not single columns, then you'll have to use dynamic sql to achieve the same result.
    with
    table_1 as
    (select '|Xyz|Abc|Def|' tuple from dual),
    table_2 as
    (select '|Data1|Data21|Data31|Data41|Data51|' tuple from dual union all
    select '|Data2|Data22|Data32|Data42|Data52|' tuple from dual union all
    select '|Data3|Data23|Data33|Data43|Data53|' tuple from dual union all
    select '|Data4|Data24|Data34|Data44|Data54|' tuple from dual union all
    select '|Data5|Data25|Data35|Data45|Data55|' tuple from dual)
    select case the_row when 1
                        then tuple
                        else '|---|---|' || substr(tuple,instr(tuple,'|',1,3) + 1)
           end tuple
      from (select substr(a.tuple,instr(a.tuple,'|',:one_one),instr(a.tuple,'|',:one_one + 1)) ||
                   substr(a.tuple,instr(a.tuple,'|',1,:one_two) + 1,instr(a.tuple,'|',1,:one_two + 1) - instr(a.tuple,'|',1,:one_two)) ||
                   substr(b.tuple,instr(b.tuple,'|',1,:two_one) + 1,instr(b.tuple,'|',1,:two_one + 1) - instr(b.tuple,'|',1,:two_one)) ||
                   substr(b.tuple,instr(b.tuple,'|',1,:two_two) + 1,instr(b.tuple,'|',1,:two_two + 1) - instr(b.tuple,'|',1,:two_two)) tuple,
                   rownum the_row
              from table_1 a,table_2 b)
     order by the_row
    Regards
    Etbin
    Message was edited by:Etbin
    user596003

  • ORA-01489 Received Generating SQL for Report Region

    I am new to Apex and I am running into an issue with an report region I am puzzled by. Just a foreword, I'm sure this hack solution will get a good share of facepalms and chuckles from those with far more experience. I welcome suggestions and criticism that are helpful and edifying!
    I am on Apex 4.0.2.00.07 running on 10g, I believe R2.
    A little background: my customer has asked that an Excel spreadsheet be converted into a database application. As part of the transition they would like an export from the database that is in the same format as the current spreadsheet. Because the column count in this export is dynamic, based on the number of records in a specific table, I decided to create a temporary table for the export. The column names in this temp table are based on a "name" column from the same data table, so I end up with columns named 'REC_NAME A', 'REC_NAME B', etc. (e.g. Alpha Record, Papa Record, Echo Record, X-Ray Record). The column count is currently ~350 for the spreadsheet version.
    Because the column count is so large and the column names are dynamic, I've run into a host of challenges and errors creating this export. I am a contractor in a corporate environment, so making changes to the Apex environment or installation is beyond my influence and really beyond what could be justified by this single requirement for this project. I have tried procedures and Apex plug-ins for generating the file, however the UTL_FILE package is not available to me. I am currently generating the SQL for the query in a function and returning it to the report region in a single column (the user will be doing a text-to-column conversion later). The data is successfully being generated; however, the SQL for the headers is where I am stumped.
    At first I thought it was because I returned both queries as one, joined with a 'union all'. However, after looking closer, the SQL being returned for the headers is about 10K characters long. The SQL being returned for the data is about 14K. As mentioned above, the data is being generated and exported, however when I generate the SQL for the headers I receive a report error with "ORA-01489: result of string concatenation is too long" in the file. I am puzzled why the shorter string is generating this message. I took the function from both pages and ran them at a SQL command prompt and both return their string values without errors.
    I'm hopeful that it's something obvious and noobish that I'm overlooking.
    here is the code:
    data SQL function:
    declare
      l_tbl varchar2(20);
      l_ret varchar2(32767);
      l_c number := 0;
      l_dlim varchar2(3) := '''|''';
    begin
      l_tbl := 'EXPORT_STEP';
      l_ret := 'select ';
      for rec in (select column_name from user_tab_columns where table_name = l_tbl order by column_id)
      loop
        if l_c = 1 then
            l_ret := l_ret || '||' || l_dlim || '|| to_char("'||rec.column_name||'")';
        else
            l_c := 1;
            l_ret := l_ret || ' to_char("' || rec.column_name || '")';
        end if;
      end loop;
        l_ret := l_ret || ' from ' || l_tbl;
      dbms_output.put_line(l_ret);
    end;
    header sql function:
    declare
      l_tbl varchar2(20);
      l_ret varchar2(32767);
      l_c number := 0;
      l_dlim varchar2(3) := '''|''';
    begin
      l_tbl := 'EXPORT_STEP';
      for rec in (select column_name from user_tab_columns where table_name = l_tbl order by column_id)
      loop
        if l_c = 1 then
            l_ret := l_ret || '||' || l_dlim || '||'''||rec.column_name||'''';
        else
            l_c := 1;
            l_ret := l_ret || '''' || rec.column_name || '''';
        end if;
      end loop;
        l_ret := l_ret || ' from dual';
      dbms_output.put_line(l_ret);
    end;
    -------
    EDIT: just a comment on the complexity of this export: each record in the back-end table adds 12 columns to my export table. Those 12 columns come from 5 different tables and are the product of a set of functions calculating or looking up their values. This export is really a pivot table based on the records in another table.
    Edited by: nimda xinu on Mar 8, 2013 1:28 PM

    Thank you, Denes, for looking into my issue. I appreciate your time!
    It is unfortunately a business requirement. My customer has required that the data we are migrating to this app from a spreadsheet be exported in the same format, albeit temporarily, so I still must meet the requirement. I'm working around the 350 columns by dumping everything into a single column, which is working for the data; however, the headers export is throwing the ORA-01489 error. I did run into the error you posted in your reply. I attempted to work around it with the CLOB type but ended up running into my string concatenation error again.
    I'm open to any suggestions at this point given that I have the data. I'm so close because the data is exporting, but because the columns are dynamic, the export does me little good without the headers to go along with it.
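    Just as an illustration (not tested against your report, and the function name is made up): one way around ORA-01489 is to do the concatenation in PL/SQL into a CLOB instead of inside the SQL statement, since the 4000-byte limit applies to SQL-level VARCHAR2 concatenation. Using the EXPORT_STEP table from your example:
    create or replace function export_step_headers
      return clob
    is
      l_ret   clob;
      l_first boolean := true;
    begin
      for rec in (select column_name
                  from   user_tab_columns
                  where  table_name = 'EXPORT_STEP'
                  order  by column_id)
      loop
        if l_first then
          l_first := false;
        else
          l_ret := l_ret || '|';           -- delimiter between header names
        end if;
        l_ret := l_ret || rec.column_name; -- CLOB concatenation happens in PL/SQL
      end loop;
      return l_ret;
    end;
    /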

  • What is an efficient SQL for this query?

    Hi,
    I am using a 9.2 database. Suppose I have the following 2 tables.
    Table P has only one column, 'a'. The values in this column are
    a1
    b1
    c1
    d1
    Table Q having three columns a, b, c
    a1, 1, 100
    a1, 2, 50
    b1, 1, 30
    b1, 2, 40
    d1, 2, 90
    Table Q can be joined only using column a.
    Table Q can have multiple records, or none, for each value of column a in table P. Based on the above sample data, I want the following output:
    a1, 100, 50
    b1, 30, 40
    c1
    d1, 90
    Kindly tell me how I can achieve this in the most efficient way!
    thanks & regards
    PJP

    I can only give you two pointers on how to do it.
    If you want all the columns from p with or without a match in q you have to use an outer join:
    11:35:58 SQL> l
    1 select p.*, q.*
    2 from p,q
    3* where p.a = q.a (+)
    11:37:27 SQL> /
    To turn the columns into rows, search for "columns for rows" in these forums; there are more examples like that.
    Anyway:
    with rt as (
    select 'a1' a, 1 b, 100 c from dual union
    select 'a1', 2, 50 from dual union
    select 'b1', 1, 30 from dual union
    select 'b1', 2, 40 from dual union
    select 'd1', 2, 90 from dual)
    --select * from rt
    select a, decode('1','0','0',rtrim(xmlagg(xmlelement(b, b || ',')).extract('//text()'),',')) b
    , decode('1','0','0',rtrim(xmlagg(xmlelement(c, c || ',')).extract('//text()'),','))
    from rt
    group by a
    Message was edited by:
    cth
    Other way:
    select a, ltrim(b,',') as b, ltrim(c,',') as c
    from (
    select row_number() over (partition by a order by length(b) desc) as rn, a, b, c
    from (select a, sys_connect_by_path(b, ',') as b,
              sys_connect_by_path(c, ',') as c
    from (
    select row_number() over (partition by a order by b) as rn, a, b, c
    from rt) y
    connect by rn = prior rn + 1 and prior a = a
    start with rn = 1)
    )
    where rn = 1
    Message was edited by:
    cth
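    An alternative that avoids the string-aggregation tricks, if q has at most one row per key for b = 1 and one for b = 2 (as in the sample data), is plain conditional aggregation over the outer join; just a sketch:
    select p.a,
           max(case when q.b = 1 then q.c end) as c1,
           max(case when q.b = 2 then q.c end) as c2
      from p, q
     where p.a = q.a (+)
     group by p.a
     order by p.a;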

  • Correlated Subquery for joining summary table original data table

    I want to list a set of fields from an inner query that correspond to calculated fields on an outer query. A set of date fields on the outer query signify a range used to correlate records on the inner query. Records on the inner query have a single date field
    which should fall within the outer query's date range. At this point, I cannot determine the correct syntax for this query, but I am able to achieve the desired results using a left outer join and CASE WHEN expressions on what should be the inner-query records.
    Here's my basic SQL right now:
    SELECT  DISTINCT a.service,
          a.start_range,   -- start of month
          a.end_range,    -- end of month
          a.availability,
          CASE WHEN YEAR(a.start)=YEAR(o.begin) AND MONTH(a.start)=MONTH(o.begin) THEN o.event_id ELSE NULL END AS 'event_id',
          CASE WHEN YEAR(a.start)=YEAR(o.begin) AND MONTH(a.start)=MONTH(o.begin) THEN o.minutes ELSE NULL END AS 'minutes'
    FROM   service_availability a LEFT OUTER JOIN 
               service_outage o ON 
    a.service=o.service AND
    TIMESTAMPDIFF(YEAR,o.begin,NOW())=0
    WHERE   TIMESTAMPDIFF(YEAR,a.start,NOW())=0
    The limitation with this approach is that the CASE WHEN argument needs to be used for retrieving any fields on the service_outage table. I've tried several approaches including parameters in both the FROM and
    WHERE clauses as well as subqueries, but cannot get any to return the data. Is there a better way to achieve this?

    Please follow basic Netiquette and post the DDL we need to answer this. Follow industry and ANSI/ISO standards in your data. You should follow ISO-11179 rules for naming data elements. You should follow ISO-8601 rules for displaying temporal data. We need
    to know the data types, keys and constraints on the table. Avoid dialect in favor of ANSI/ISO Standard SQL.
    >> I want to list a set of fields[sic] from an inner-query that correspond to calculated fields [sic] on an outer-query. <<
    Columns are not fields. Rows are not records. CASE is an expression, not an argument. Where did you get the terms “inner-query” and “outer-query”? I never heard them before and I have some experience with SQL. We have subqueries; is that what you are trying to say?
    >> A set of date fields [sic] on the outer-query signify a range to correlate records [sic] on the inner-query. Records on the inner-query have a single date field [sic] which should fall within the outer-query date range.<<
    You do not know the basic terms or how SQL works. You also posted some weird dialect of an SQL-like language. What is NOW()? Does that mean CURRENT_TIMESTAMP in your dialect? Did you know that BEGIN is a reserved word in SQL?
    >> At this point, I cannot determined the correct syntax for this query, but am able to achieve the desired results using a LEFT OUTER JOIN and CASE-when arguments [sic] on what should be the inner-query [sic] records [sic]. Here's my basic SQL right
    now: <<
    Why are you doing temporal math in SQL? Where is the calendar and report period tables?
    Since SQL is a database language, we prefer to do look ups and not calculations. They can be optimized while temporal math messes up optimization. A useful idiom is a report period calendar that everyone uses so there is no way to get disagreements in the DML.
    The report period table gives a name to a range of dates that is common to the entire enterprise. 
    CREATE TABLE Month_Periods
    (month_name CHAR(10) NOT NULL PRIMARY KEY
      CHECK (month_name LIKE '[12][0-9][0-9][0-9]-[01][0-9]-00'),
     month_start_date DATE NOT NULL,
     month_end_date DATE NOT NULL,
     CONSTRAINT date_ordering
      CHECK (month_start_date <= month_end_date),
    etc);
    These report periods can overlap or have gaps. I like the MySQL convention of using double zeroes for months and years, that is, 'yyyy-mm-00' for a month within a year and 'yyyy-00-00' for the whole year. The advantages are that it will sort with the ISO-8601
    date format required by Standard SQL and it is language independent. The patterns for validation are '[12][0-9][0-9][0-9]-00-00' and '[12][0-9][0-9][0-9]-[01][0-9]-00'.
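    For illustration only (no DDL was posted, so Service_Outage and its outage_date / outage_minutes columns are assumed names): with such a calendar in place, the outage rows can be picked up with a simple range join instead of date arithmetic in the query:
    SELECT mp.month_name,
           o.event_id,
           o.outage_minutes
      FROM Month_Periods AS mp
      LEFT OUTER JOIN Service_Outage AS o
        ON o.outage_date BETWEEN mp.month_start_date AND mp.month_end_date
     WHERE mp.month_name = '2013-03-00';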
    Why is an outage not an availability status or an event? 
    >> The limitation with this approach is that the CASE WHEN argument [sic] needs to be used for retrieving [sic: querying is not retrieving] any fields [sic] on the Service_Outage table<<
    We have no DDL! Not even sample data. 
    >> I've tried several approaches including parameters in both the FROM and WHERE clauses as well as subqueries, but cannot get any to return the data. <<
    Parameters exist in procedure declarations; arguments are the actual values passed in them. Neither of them has anything to do with a query.
    Want to try again? 
    --CELKO-- Books in Celko Series for Morgan-Kaufmann Publishing: Analytics and OLAP in SQL / Data and Databases: Concepts in Practice Data / Measurements and Standards in SQL SQL for Smarties / SQL Programming Style / SQL Puzzles and Answers / Thinking
    in Sets / Trees and Hierarchies in SQL

  • Sql to JOIN 3 tables

    I need some help with my SQL to join 3 tables.
    I am close, but my results come out to be a multiple of what I'm supposed to get.
    I am trying to retrieve a sum from table2 and a sum from table3.
    There are 4 records in table2 and 3 records in table3.
    The results SHOULD be 3 and 20,
    but I get 12/60,
    which is the number I should get multiplied by the number of records in the opposite table.
    I have moved my code around in different arrangements to see if that is the issue, but it doesn't seem to matter; I still get a multiple.
    Here's my code so far:
    SELECT sum(table3.field) AS sumtable3, sum(table2.field) AS sumtable2
    FROM table1
    LEFT JOIN table2 ON table1.id=table2.id
    LEFT JOIN table3 ON table2.id=table3.id
    GROUP BY table1.id
    ORDER BY table1.id
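    (For what it's worth, the multiplied totals are the usual join fan-out: each table2 row gets paired with every matching table3 row before SUM runs. A common fix is to aggregate each detail table first and then join the aggregates; a sketch, assuming table3 also joins back on the same id:)
    SELECT t1.id, t2.sumtable2, t3.sumtable3
    FROM table1 t1
    LEFT JOIN (SELECT id, SUM(field) AS sumtable2
               FROM table2 GROUP BY id) t2 ON t1.id = t2.id
    LEFT JOIN (SELECT id, SUM(field) AS sumtable3
               FROM table3 GROUP BY id) t3 ON t1.id = t3.id
    ORDER BY t1.id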

    Please provide sample data for all 3 tables.

  • Dynamic SQL - Inner Join

    I am just starting to learn and use dynamic SQL.
    I am trying to use an inner join query in dynamic SQL,
    for example: 'Select A.id as Itemid, A.name as Itemname from TableA A inner join TableB B on A.ID = B.ID where B.TypeID = 1'
    For some reason this is not working. I know for sure the query works, because when I run it directly it works fine; when I try to run it dynamically it gives me blank values.
    Please let me know if you need more information.
    Appreicate Your help!!
    Thanks,
    Chaitanya 

    Here is an example for dynamic SQL from BOL:
    DECLARE @IntVariable int;
    DECLARE @SQLString nvarchar(500);
    DECLARE @ParmDefinition nvarchar(500);
    /* Build the SQL string one time.*/
    SET @SQLString =
    N'SELECT BusinessEntityID, NationalIDNumber, JobTitle, LoginID
    FROM AdventureWorks2012.HumanResources.Employee
    WHERE BusinessEntityID = @BusinessEntityID';
    SET @ParmDefinition = N'@BusinessEntityID tinyint';
    /* Execute the string with the first parameter value. */
    SET @IntVariable = 197;
    EXECUTE sp_executesql @SQLString, @ParmDefinition,
    @BusinessEntityID = @IntVariable;
    /* Execute the same string with the second parameter value. */
    SET @IntVariable = 109;
    EXECUTE sp_executesql @SQLString, @ParmDefinition,
    @BusinessEntityID = @IntVariable;
    Basically any query can be run as dynamic SQL.  However, static SQL is the first choice. Use dynamic SQL if needed.
    Dynamic SQL examples:
    http://www.sqlusa.com/bestpractices/dynamicsql/
    Kalman Toth Database & OLAP Architect
    SQL Server 2014 Design & Programming
    New Book / Kindle: Exam 70-461 Bootcamp: Querying Microsoft SQL Server 2012
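    Applied to the join from the question, the same pattern would look roughly like this (TableA, TableB and TypeID are taken from the question; only a sketch):
    DECLARE @TypeID int;
    DECLARE @SQLString nvarchar(500);
    DECLARE @ParmDefinition nvarchar(500);
    SET @SQLString =
    N'SELECT A.id AS ItemId, A.name AS ItemName
      FROM TableA A INNER JOIN TableB B ON A.ID = B.ID
      WHERE B.TypeID = @TypeID';
    SET @ParmDefinition = N'@TypeID int';
    SET @TypeID = 1;
    EXECUTE sp_executesql @SQLString, @ParmDefinition, @TypeID = @TypeID;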

  • SQL for pulling description from product identification section off of summ

    I tried using the DB schema tool to generate SQL for pulling the Description field off of the Product Identification section on the Summary tab, and it came up with zero records even though I know we have a few 1,000 specs that use that field.
    Can you send me the SQL that I should be using, or the table to look in to find that text?
    thanks,
    David

    This should get you started,
    select
      ss.SpecNumber as "Spec Number"
      , ssn.name as "Spec Name"
      , sdftp.Description as "Description"
    from SpecDescriptionFreeTextProp sdftp
      inner join gsmProductIdentification gpi on sdftp.fkSpecID = gpi.pkid
      inner join SpecSummary ss on gpi.fkSpecID = ss.SpecID
      inner join SpecSummaryName ssn on ssn.fkSpecsummary = ss.PKID
    where
      ssn.langid = 0

  • SQL for summing weekly data

    Hi
    I have two tables, each with a date column, holding more than a month of data. I want to sum the data by week, starting on Monday, combining the two tables. Can anyone help me with the SQL for this?
    SQL> select * from ORCL_DATA_GROWTH;
    REPORT_DA DATAGROWTH                                                                                                             
    03-SEP-12       9.78                                                                                                             
    04-SEP-12       5.36                                                                                                             
    05-SEP-12       5.42                                                                                                             
    06-SEP-12      33.36                                                                                                             
    07-SEP-12       5.47                                                                                                             
    08-SEP-12        5.5                                                                                                             
    09-SEP-12        5.5                                                                                                             
    10-SEP-12       9.47                                                                                                             
    11-SEP-12       8.16                                                                                                             
    12-SEP-12      23.97                                                                                                             
    13-SEP-12      51.28                                                                                                             
    14-SEP-12      24.05                                                                                                             
    15-SEP-12      24.03                                                                                                             
    16-SEP-12      24.17                                                                                                             
    17-SEP-12      28.16                                                                                                             
    18-SEP-12      24.17                                                                                                             
    19-SEP-12      24.19                                                                                                             
    20-SEP-12      50.96                                                                                                             
    21-SEP-12      24.19   
    SQL> select * from ORCL_PURGING;
    REPORT_DA PURGING_SPACE FILE_SYSTEM                                                                                              
    01-OCT-12            18 /dborafiles/orac/ora_Test/oradata01                                                                     
    01-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    02-OCT-12            55 /dborafiles/orac/ora_Test/oradata01                                                                     
    02-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    03-OCT-12            21 /dborafiles/orac/ora_Test/oradata01                                                                     
    03-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    04-OCT-12             0 /dborafiles/orac/ora_Test/oradata01                                                                     
    04-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    05-OCT-12             0 /dborafiles/orac/ora_Test/oradata01                                                                     
    05-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    06-OCT-12            70 /dborafiles/orac/ora_Test/oradata01                                                                     
    06-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    07-OCT-12            21 /dborafiles/orac/ora_Test/oradata01                                                                     
    07-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    08-OCT-12            21 /dborafiles/orac/ora_Test/oradata01                                                                     
    08-OCT-12             0 /dborafiles/orac/ora_Test/oradata05
    Combining the two tables gives the daily output sample below, but I want the output summed by week.
    SQL> select a.report_date,sum(a.PURGING_SPACE) PURGING_SPACE,b.DATAGROWTH from ORCL_PURGING a ,ORCL_DATA_GROWTH b where a.report_date=b.report_date and a.report
    _date<=(sysdate) and a.report_date >=(sysdate-30) group by a.report_date,b.datagrowth order by a.report_date;
    REPORT_DA PURGING_SPACE DATAGROWTH
    19-SEP-12            77      24.19
    20-SEP-12             2      50.96
    21-SEP-12            47      24.19
    22-SEP-12            19      24.16
    23-SEP-12            22      24.05
    24-SEP-12            25      28.11
    25-SEP-12            43      24.08
    26-SEP-12            21      24.06
    27-SEP-12            22      50.86
    28-SEP-12            22      23.05
    29-SEP-12            22      23.27
    30-SEP-12            22      23.61
    01-OCT-12            18      28.67
    02-OCT-12            55      25.92
    03-OCT-12            21      23.38
    04-OCT-12             0      50.46
    05-OCT-12             0      23.62
    06-OCT-12            70      24.39
    07-OCT-12            21      24.53
    08-OCT-12            21      28.66
    09-OCT-12            51      24.41
    10-OCT-12            22      24.69
    11-OCT-12            23      50.72
    12-OCT-12            22      25.08
    13-OCT-12            25      25.57
    14-OCT-12            21      23.38
    15-OCT-12            22      27.77
    27 rows selected.
    Thanks in advance.
    Edited by: BluShadow on 18-Oct-2012 09:28
    added {noformat}{noformat} tags for readability.  Please read {message:id=9360002} and learn to do this yourself in future.

    Hi,
    user9256814 wrote:
    Hi
    This is my query:
    WITH purging_week AS
    (
         SELECT TRUNC (report_date, 'IW')     AS report_week
         ,     SUM (purging_space)           AS total_purging_space -- same as in main query
         FROM      orcl_purging
         GROUP BY TRUNC (report_date, 'IW')
    )
    ,     data_growth_week     AS
    (
         SELECT TRUNC (report_date, 'IW')     AS report_week
         ,     SUM (datagrowth)           AS total_datagrowth
         FROM      orcl_data_growth
         GROUP BY TRUNC (report_date, 'IW')
    )
    SELECT     COALESCE ( p.report_week
              , d.report_week
              )               AS report_week
    ,     p.total_purging_space
    ,     d.total_datagrowth
    FROM          purging_week     p
    FULL OUTER JOIN     data_growth_week d ON d.report_week = p.report_week
    ORDER BY report_week
    ;
    ;That's still hard to read.
    You may have noticed that this site normally compresses whitespace.
    Whenever you post formatted text (including, but not limited to, code) on this site, type these 6 characters:
    {code}
    (small letters only, inside curly brackets) before and after each section of formatted text, to preserve spacing.  Blushadow probably won't do it for you every time you post a message.
    This is just one of the many things mentioned in the forum FAQ {message:id=9360002} that can help you get better answers sooner.
    Are you still having a problem?  Have you tried using in-line views?  If so, wouldn't it make more sense to post the code you're using, rather than some code you're not using?

  • Looking For Free Good Tutorial To Learn SQL For 8i

    Hello all,
    I am looking for a free good tutorial to learn SQL for Oracle 8i.
    If anyone got any links or resources, kindly suggest.
    Thanks in advance.

    I am looking for a free good tutorial to learn SQL for Oracle 8i.
    8i has been unsupported for years. If you're just starting out, you might as well learn the latest.

  • BREAK not working in SQL+ for windows. Works in SQL+ in DOS

    I'm writing a simple query in SQL+
    SELECT table_name, column_name
    FROM user_tab_columns
    WHERE table_name like 'MYTAB%'
    I first set BREAK ON table_name
    then I run my query.
    I expect the table name to be shown once, per table, not once per column.
    If I run this query in SQL+ in a DOS window, that's exactly what happens.
    If I run the same thing in SQL+ for Windows (version 10 on an 11g db) the break command appears to be ignored.
    Any ideas ??
    Thanks folks.

    Not sure but maybe try the NODUPLICATES option.
    break on table_name noduplicates
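    Also note that BREAK suppresses a value only while it repeats on consecutive rows, so give the query an ORDER BY on the break column to get one heading per table; something like:
    BREAK ON table_name NODUPLICATES
    SELECT table_name, column_name
    FROM   user_tab_columns
    WHERE  table_name LIKE 'MYTAB%'
    ORDER  BY table_name, column_name;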

  • Need help with SQL for Pie Chart

    I am trying to create a pie chart which would have 3 slices.
    Following sql gives me the expected values when I run the sql command:
    select
    round(avg(CATEGORY_1 + CATEGORY_2 + CATEGORY_3 + CATEGORY_4 + CATEGORY_5),0) "OD Engagements",
    round(avg(CATEGORY_6 + CATEGORY_7 + CATEGORY_13),0) "Talent Engagements",
    round(avg(CATEGORY_8 + CATEGORY_9 + CATEGORY_10 + CATEGORY_11 + CATEGORY_12),0) "Other Engagements"
    from OTD_PROJECT
    where STATUS in ('Open','Hold')
    I get 3 columns labeled: OD Engagements, Talent Engagements and Other Engagements with the correct averages based on the the data.
    I have tried several ways to get this to work in the SQL for a pie chart, but I keep getting the invalid SQL message and it won't save. I also tried saving without validation, but then no data is shown on the chart at all.
    I want to have a pie, with 3 slices, one labeled OD Engagements with a value of 27, one labeled Talent Engagements with a value of 43 and one labeled Other Engagements with a value of 30. Then I want to be able to click on each pie slice to drill down to a secondary pie chart that shows the breakdown based on the categories included in that type.
    Since I am not grouping based on an existing field, I am unsure what the link and label values should be in the chart SQL.

    You'll need something like the below. I have no idea what the URL for the drilldown needs to be. It should create an appropriate link in your app for the particular slice. Mainly the code below breaks the SQL results into three rows rather than three columns. It may well have a syntax error since I can't test.
    select linkval  AS LINK,
           title    AS LABEL,
           calc_val AS VALUE
    FROM   (SELECT 'OD Engagements' AS title,
                   round(avg(CATEGORY_1 + CATEGORY_2 + CATEGORY_3 + CATEGORY_4 + CATEGORY_5),0) AS calc_val,
                   'f?p=???:???:' || v('APP_SESSION') || '::NO:?' AS LINKVAL
            from   OTD_PROJECT
            where  STATUS in ('Open','Hold')
            UNION ALL
            SELECT 'Talent Engagements' AS title,
                   round(avg(CATEGORY_6 + CATEGORY_7 + CATEGORY_13),0) AS calc_val,
                   'f?p=???:???:' || v('APP_SESSION') || '::NO:?' AS LINKVAL
            from   OTD_PROJECT
            where  STATUS in ('Open','Hold')
            UNION ALL
            SELECT 'Other Engagements' AS title,
                   round(avg(CATEGORY_8 + CATEGORY_9 + CATEGORY_10 + CATEGORY_11 + CATEGORY_12),0) AS calc_val,
                   'f?p=???:???:' || v('APP_SESSION') || '::NO:?' AS LINKVAL
            from   OTD_PROJECT
            where  STATUS in ('Open','Hold')
           );

  • Could we use embedded SQL for XML in .pc ?

    Hi,
    Can we use embedded SQL for XML in .pc ?
    <1> assume we have run SQL statements in Oracle9i:
    SQL>create table MY_XML_TABLE
    (Key1 NUMBER,
    Xml_Column SYS.XMLTYPE);
    SQL>insert into MY_XML_TABLE(key1, Xml_Column) values
    (1, SYS.XMLTYPE.CREATEXML
    ('<book>
    <chapter num="1">
    <text>This is my text</text>
    </chapter>
    </book>'));
    <2> Could we directly translate it in .pc as usual?
    (outlined, not exact)
    int emp_number = 1;
    XML_Data emprec; /* ?????? */
    EXEC SQL SELECT M.Xml_Column.GETCLOBVAL() as XML_Data
    INTO :emprec INDICATOR :emprec_ind
    FROM MY_XML_TABLE M
    WHERE Key1 = :emp_number;
    Thanks
    MJ

    reply by myself.
    No problem!!
    ===============================
    int emp_number = 1;
    struct {
    char feature[1280];
    } emprec;
    EXEC SQL SELECT M.Xml_Column.GETCLOBVAL() as XML_Data
    INTO :emprec INDICATOR :emprec_ind
    FROM MY_XML_TABLE M
    WHERE Key1 = :emp_number;
