Retrieving first 100 rows

hi,
I have a lot of rows in my table.
I want to retrieve the first 100 records inside the table, then the next 100.
How can I do it with a SQL statement?
Message was edited by:
Ricardinho

If the sort keys are unique, you can use any of these solutions.
--Using subquery
select SortKey,anyColumns
from YourTable a
where (select count(b.SortKey)+1 from YourTable b
        where b.SortKey > a.SortKey
          and RowNum <= EndRank) between StartRank and EndRank
  and RowNum <= (EndRank-StartRank)+1
order by SortKey desc;
--Sort in InlineView
select SortKey,anyColumns from(
    select SortKey,anyColumns,RowNum as Rank from(
        select SortKey,anyColumns from YourTable order by SortKey desc))
where Rank between StartRank and EndRank
  and RowNum <= (EndRank-StartRank)+1
order by SortKey desc;
--Using OLAP
select SortKey,anyColumns from(
    select SortKey,anyColumns,Row_Number() over(order by SortKey desc) as Rank
      from YourTable)
where Rank between StartRank and EndRank
  and RowNum <= (EndRank-StartRank)+1
order by SortKey desc;
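The Row_Number() (OLAP) variant is usually the clearest of the three. As a runnable sketch, here it is driven from Python against SQLite, whose window functions accept the same inner query shape; the table name, columns, and page bounds are all invented for illustration:

```python
import sqlite3

# Requires Python 3.7+ (SQLite 3.25+) for window functions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE YourTable (SortKey INTEGER, Payload TEXT)")
conn.executemany("INSERT INTO YourTable VALUES (?, ?)",
                 [(i, "row-%d" % i) for i in range(1, 301)])

def fetch_page(start_rank, end_rank):
    # Rank rows by SortKey descending, then keep ranks in [start, end].
    return conn.execute("""
        SELECT SortKey, Payload FROM (
            SELECT SortKey, Payload,
                   ROW_NUMBER() OVER (ORDER BY SortKey DESC) AS rn
              FROM YourTable)
         WHERE rn BETWEEN ? AND ?
         ORDER BY SortKey DESC""", (start_rank, end_rank)).fetchall()

page1 = fetch_page(1, 100)    # first 100 rows (SortKey 300 down to 201)
page2 = fetch_page(101, 200)  # next 100 (SortKey 200 down to 101)
```

Each call re-runs the ranking, so this is simple but re-reads the table per page; on large tables an index on SortKey keeps it cheap.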

Similar Messages

  • Query results window limited to 100 rows then 150 then 200

    Great product - I love this tool and don't mind helping to find the bugs in the prereleases like this.
    When I
    SELECT * FROM MY_TABLE
    in a sql worksheet, the results window initially only displays the first 100 rows.
    I can try to scroll down, but it stops at 100 rows.
    If I click on the Script Output window, the fetch count goes up to 150, and if I click back into the
    Results window, I can then scroll to row 150. Click back over to Script Output and come back, and now I'm up to 200. As long as I scroll to the bottom of the Results window and click back and forth, I can increase the size of the results window by 50 rows each time. If I want the result window to show 2000 rows, this will get real tedious real quick...
    Maybe a bug in dynamically resizing the result set?

    Hi gary,
    You may have already seen one or more threads like the following on the issue of increased memory overhead for the Excel formats:
    Re: Sql Developer 3.1 - Exporting a result set in xls generates an empty file
    Basically SQL Developer uses a third-party API to read and write these Excel formats. There are distinct readers and formatters for each of the xls and xlsx forms.
    There is a newer version of the API that supports streaming of xlsx Workbooks. Basically it achieves a much lower footprint by keeping in memory only rows that are within a sliding window, while the older, non-streaming version gives access to all rows in the document. The programmer may define the size of this window. I believe the newer API version was either not available or not stable during our 3.1 development cycle. Possibly a future SQL Developer version might use it.
    Regards,
    Gary
    SQL Developer Team

  • Retrieve a single row from Oracle function

    Hi!
    I have an Oracle function created as follows:
    CREATE OR REPLACE FUNCTION user_data(userId IN users.user_id%TYPE)
    RETURN users%ROWTYPE AS
    userData users%ROWTYPE;
    BEGIN
    SELECT * INTO userData
    FROM users
    WHERE user_id = userId;
    RETURN userData;
    END user_data;
    This function returns a single row from 'users' table.
    I tried to retrieve that single row by mean of:
    CallableStatement statement = connection
        .prepareCall("{ call ? := get_data(?) }");
    statement.registerOutParameter(1, OracleTypes.OTHER);
    statement.setInt(2, 103);
    statement.execute();
    ResultSet rs = (ResultSet) statement.getObject(1);
    String value = rs.getString(2);
    System.out.println(value);
    But this code doesn't work.
    I tried other OracleTypes, also. (I don't know what OracleType I receive when the Oracle function returns a ROWTYPE.)
    Can somebody tell me how to retrieve that single row?
    What is the type of the out parameter (registerOutParameter()) when the Oracle function returns a ROWTYPE?
    Notes: No cursor can be added. No database change is allowed.
    Thank you in advance.
    [Adrián E. Córdoba]
    Edited by: aecordoba on Mar 18, 2011 3:58 PM

    aecordoba wrote:
    I beg your pardon for my bad English. (It isn't my original language.)
    It's not a language problem. It's that you didn't provide any details about what went wrong.
    That just is my problem:
    I know the retrieved result is not a result set: It's a single row.
    Doesn't matter if it's a single row. You have a ResultSet object. You have to call ResultSet's next() method to get to the first row, even if it's the only row.
    1- Which is the Oracle type I receive in Java when the Oracle function returns a ROWTYPE?
    I don't know.
    2- How can I get each column value when I receive a single row in Java?
    The same way as when there are multiple rows: for each row, call ResultSet.next() to advance to the next row, then call the appropriate ResultSet.getXxx() methods to get each column's value. Again: if you have a ResultSet, you must call next() to get to the first row, even if there is only one row. "First row out of 1 total row" is no different than "first row out of 100 total rows."
    EDIT: Okay, I didn't notice before that you're using a CallableStatement with "out" parameters. I've never used one of those before, so I'm not familiar with the details. I really don't know if casting to a ResultSet is appropriate here. Do you actually have documentation that says you can do that, or are you just guessing and trying to find something that works?
    Edited by: jverd on Mar 18, 2011 11:26 AM

  • Last 100 rows of a database

    I have an application which needs to connect to a database, retrieve the 100 most recent rows, and display them. The problem is that this operation must happen every 3-5 seconds. How could I perform this operation? Do I have to use a scrollable result set and start counting from rs.last() upwards? How can I make it more efficient?
    Thnx

    The DSP card is connected on a PCI slot on the machine that holds the database hence no network bandwidth concerns there. This database is mirrored on another server where more HDD space is available. The machine that holds the DSP is only a 1U rack so not a lot I can do on it.
    My solution was to mirror the data on the second machine and delete the data from the first one every few days. The DSP produces about 3GB of data per day and the system will be monitored for the next 5 years approx. That's about 5TB of data. These data will be processed during this period of time, which is adequate enough, but there is still the need to have some sort of almost real-time graphical representation of the data as they come in.
    The main table of the database has 40 columns, each one holding data gathered from its corresponding sensor. All rows are timestamped with a resolution of 1 millisecond (it is possible to make it accurate to 100 nanoseconds), and this column is indexed, plus there is also a sequentially ordered index.
    The data coming in are just integer numbers with not more than 3-4 digits. This means that each row holds much less than 1 kilobyte.
    By the way the database is MySQL
    A better solution, if the database supports it, would probably be the second table approach. A scheduled stored procedure would move the data and clean the second table every 3 seconds. This would have only one hit on the main table. Then the clients hit the second table.
    How would I go about doing this? Any suggestions? How can I use scheduled stored procedures? Any links appreciated.
    You could probably also cache the data off the database entirely. That will impact the network bandwidth. But it should lessen the impact on the database resources (and I would not be sure which is best.)
    This sounds interesting as well; I would be grateful for any more info on how to cache data off the database.
    And now a different question...
    I've programmed some code to use to mirror the database. What I did is get the two connections to the databases and then use SELECT * FROM TABLE. While on the while loop on the result set I insert each row to the second database. At the end I delete the data from the first one.
    Of course this solution only works when having a small number of rows. When trying to mirror a whole day's worth of data (3GB) I get a nasty out-of-memory error, since my (stupid?) solution loads the entire result set in memory before even beginning to insert the rows.
    I was wondering if there is a method of doing this in parts of, let's say, 50,000 rows each time until all of it is mirrored. Or maybe another way without the whole result set being loaded in memory.
    I have heard of setFetchSize() but I do not know if the JDBC driver that comes with MySQL supports it, and even if it does I wouldn't have a clue on how to use it.
    I know there are other ways of mirroring a database but it would be really helpful if I could do it programmatically.
    As you probably guessed I am not a programmer, but I thought my little knowledge of Java was enough to be able to use it constructively on this scientific experiment.
    Any suggestions, comments or links that would help clarify or solve the above problems really appreciated. And thanks for all the feedback so far
    Best regards,
    Stefanos
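For the batched mirroring Stefanos asks about, the shape of the loop matters more than the driver: fetch a bounded chunk, insert it, commit, repeat. Below is a minimal sketch using Python's built-in sqlite3 purely to show that loop (in JDBC land, setFetchSize() plays roughly the role fetchmany() plays here); the table and column names are invented:

```python
import sqlite3

# Source and mirror databases (in-memory here; file paths or network
# connections in practice).
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for db in (src, dst):
    db.execute("CREATE TABLE readings (ts INTEGER, value INTEGER)")
src.executemany("INSERT INTO readings VALUES (?, ?)",
                [(i, i % 1000) for i in range(120_000)])
src.commit()

BATCH = 50_000
cur = src.execute("SELECT ts, value FROM readings")
copied = 0
while True:
    rows = cur.fetchmany(BATCH)        # only BATCH rows in memory at a time
    if not rows:
        break
    dst.executemany("INSERT INTO readings VALUES (?, ?)", rows)
    dst.commit()
    copied += len(rows)
src.execute("DELETE FROM readings")    # clear the source after mirroring
src.commit()
```

The key point is that nothing ever holds more than one batch in memory, so the same loop works whether the table holds a thousand rows or a day's 3GB.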

  • How to retrieve the multiple rows data on PDF form in a web service method using WSDL DataConnection

    I have multiple rows on a PDF form; each row has 4 text fields. I want to submit the data of the multiple rows to a method defined in the web service, but I am unable to retrieve it within the web-service method.

    Hi Paul,
    I'm now able to save the retrieved XML in a hidden text field and create a dynamic table, and I'm able to fill this table from the XML. The problem is that I could not find the correct way to loop over the XML: the table ends up with the right number of rows, but every row holds the data of the first record only. Can you tell me the right way to loop over the XML?
    this is my code
    TextField1.rawValue=xmlData.document.rawValue;
    xfa.datasets.data.loadXML(TextField1.rawValue, true, false);
    for(var i=0; i<count; i++)
    xfa.form.resolveNode("form1.P1.Table1.Row1["+i+"].Num").rawValue = xfa.datasets.data.record.num.value;
    xfa.form.resolveNode("form1.P1.Table1.Row1["+i+"].Name").rawValue = xfa.datasets.data.record.name.value;
    Table1.Row1.instanceManager.addInstance(true);
    Thanks
    Hussam

  • SSIS -SKIP First few ROWS

    Hi Experts ,
    I am working on SSIS 2008.
    I have Excel file as a Source .
    Scenario: the first 5 rows contain some headings which I don't need.
    My actual header starts from the 6th row, and then the data.
    How should I use my Excel source to take the 6th row as the header and then the data?
    NOTE ::
    I receive the file every month.
    The name of the file is the same every month.
    Number and name of columns are fixed.
    Number of records is not fixed every month.
    I tried using OpenRowset in the Excel source, i.e. Revenue$A6:D6
    A6 = my first column header
    D6 = my last column header
    The package executed, but no data was loaded into the destination SQL table.
    Any Help ??

    Hi Rihan
    I tried using OpenRowset in the Excel source, i.e. Revenue$A6:D6
    A6 = my first column header
    D6 = my last column header
    Here, change the D6 to D(last row in the Excel sheet); i.e. if your data ends at row 100, use A6:D100.
    Regards 
    bhimireddy

  • Delegated Admin web application only requests first 100 accounts?

    Hi,
    - Sun Java System Messenger Express 6.2
    - Delegated Administrator 6.3-0.09 built Sep 6, 2005
    Is it true that the Delegated Admin (DA) web application only requests the first 100 accounts?
    Once logged in to the DA web application, we only see "Retrieved Users (100)" when we want to see all users; but if we do a search on uid or username, all other users are retrieved.
    One of the admin gave us the following response:
    This is not a directory-related problem, but rather a matter of the design of the DA application you are using. The web-based Java app only requests the first 100 accounts from the directory (presented by default as 10 pages of 10 accounts each), since you're supposed to be using the search facility to find accounts when you need to modify or delete them.
    This is a deliberate design choice by the Sun programmers who wrote the thing, probably because the directory is capable of holding several thousand accounts and pulling them all would take quite a bit of time (not to mention memory space), so in the interest of response-time speed they limited the data pull.
    I cannot modify this application's functionality. If you need a list of all user accounts in your domain, I can supply an LDIF on request, with any attributes (mail, uid, cn, etc.) that you like.
    Please let us know if there is any way we can view all users (approx. 1000) from DA web application.
    Thank you for your time,
    GJ

    Yes, the terminal commands I gave are changing permissions.
    Properly written OS X apps should run under any user account, and should store any account-specific information in the each user's home folder. Some poorly written apps might only be executable by the administrator. Running the first command I have will make the app executable to all users.
    Some even more poorly written apps will write user data into the application itself rather than to the user's home folder. This is a particularly bad problem with game software, which for example might write high score info into the app itself. If this is the case for your misbehaving apps, the second command I gave will make the app writable by everybody and should solve the problem.

  • Selecting first 15 rows and displaying next 15

    I wish to select the first 15 rows from my database and then have an option to select a next page, whereby the next 15 rows are displayed. I'm not sure how to go about it; could someone enlighten me? For reference, it would be something like how the forums are done, whereby they display x amount of threads, and you would have to go to the next page.

    This can be done by retrieving the first 15 using something like rownum <= 15 in the where clause.
    Then, by remembering the ID of the last thread that was displayed, you select threads where id > lastID and rownum <= 15... that is, the first 15 rows that match your query starting immediately after the last thread shown.
    It's not unusual to retrieve and store a list of the Thread IDs completely as this allows a more stable feedback at the expense of heavier session information being stored.
    Hope that helps
    Talden
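Talden's keep-the-last-ID approach (keyset pagination) can be sketched as follows, using Python's sqlite3 with LIMIT standing in for Oracle's rownum; the threads table and its columns are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE threads (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany("INSERT INTO threads VALUES (?, ?)",
                 [(i, "thread %d" % i) for i in range(1, 41)])

def next_page(last_id, page_size=15):
    # Keyset pagination: resume strictly after the last ID already shown.
    return conn.execute(
        "SELECT id, title FROM threads WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, page_size)).fetchall()

page1 = next_page(0)               # ids 1..15
page2 = next_page(page1[-1][0])    # ids 16..30
```

Unlike rank-based paging, this stays stable when new threads are inserted while the user is browsing, which is why forums tend to use it.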

  • I NEED A FIRST 5 ROWS DATA

    HI EXPERTS,
    MY QUERY IS
    SELECT TOT.HTNO, TOT.RESULT_NAME, TOT.TOTALMARKS, TOT.TOTALRANK,
    TOT.RESEXAMSLNO, TOT.RESACADEMICSLNO, TOT.RESGROUPSLNO,
    TOT.RESCORPCOLLSLNO, TOT.DISTRICTSLNO
    FROM RESULT_STUDENTMARKS_TOTALS TOT
    WHERE TOT.RESEXAMSLNO = 1
    AND TOT.RESGROUPSLNO = 1
    AND TOT.RESACADEMICSLNO = 9
    AND TOT.RESCORPCOLLSLNO IN (99)
    AND TOT.DISTRICTSLNO IN (4)
    AND ROWNUM <5
    ORDER BY TOT.HTNO
    I WANT THE FIRST 4 ROWS OF DATA BASED ON ASCENDING ORDER. CAN YOU CHANGE THIS QUERY OR
    GIVE AN EXAMPLE OF THE FIRST 4 ROWS OF DATA ORDERED BY ONLY ONE FIELD IN THAT TABLE.
    THANX & REGARDS
    ASHOK
    Edited by: Ashok on Apr 28, 2011 11:01 PM

    See
    http://asktom.oracle.com/pls/apex/f?p=100:11:0::::P11_QUESTION_ID:62364503028143
    and
    http://asktom.oracle.com/pls/apex/f?p=100:11:0::::P11_QUESTION_ID:2853107469873
    You have to filter the ROWNUM after the ORDER BY.
    so :
    select * from
    (
    SELECT TOT.HTNO, TOT.RESULT_NAME, TOT.TOTALMARKS, TOT.TOTALRANK,
    TOT.RESEXAMSLNO, TOT.RESACADEMICSLNO, TOT.RESGROUPSLNO,
    TOT.RESCORPCOLLSLNO, TOT.DISTRICTSLNO
    FROM RESULT_STUDENTMARKS_TOTALS TOT
    WHERE TOT.RESEXAMSLNO = 1
    AND TOT.RESGROUPSLNO = 1
    AND TOT.RESACADEMICSLNO = 9
    AND TOT.RESCORPCOLLSLNO IN (99)
    AND TOT.DISTRICTSLNO IN (4)
    ORDER BY TOT.HTNO
    )
    where rownum < 6
    order by 1
    Hemant K Chitale
    Edited by: Hemant K Chitale on Apr 29, 2011 2:28 PM
    Changed "rownum < 5" to "rownum < 6" to get 5 rows.

  • Selecting the first n rows with Oracle

    Please help, I am very confused by this.
    For example,lets take a table called "items" which has
    ID    NAME    PRICE
    1    cup              1.2
    2    book         49.99
    3    mobile        89.99
    20    bike        1250
    19    egg            0.8
    18    bun           2.5
    17    color          2.22
    16    tv             310
    15    air            0
    14    laptop         999.5
    13    pack         21.53
    12    cd/r           1.2
    11    table         198
    10    apple         1.05
    9    carpet         122.4
    8    oracle         19999
    7    door           150
    6    dollar         1
    5    pencil        1.35
    Next,say, we want to retrieve the five cheapest items.
    select name, price
    from items
    where rownum < 6
    order by price;
    NAME                      PRICE
    coke                        .78
    cup                         1.2
    pencil                     1.35
    book                      49.99
    mobile                    89.99
    This is wrong. In the result set above, the item with id no. 19 (egg) is missing, which costs only 0.80 (currency units). We get this because
    Oracle first retrieves the first five rows and then orders them by price. This is because we didn't state explicitly enough what we meant by "first".
    This problem can be solved by using row_number, as below:
    select name, price
    from (
        select name, price, row_number() over (order by price) r
        from items)
    where r between 1 and 5;
    NAME                      PRICE
    air                           0
    coke                        .78
    egg                          .8
    dollar                        1
    apple                      1.05
    Question: in the above select, if I add the clause "WHERE rownum <= 6", the result set goes wrong again; I don't get the correct order as above.
    Please help. Can we have the order by done first and then take 5 records of them?
    select name, price
      from (
        select name, price, row_number() over (order by price) r
          from items WHERE rownum <= 6)
    where r between 1 and 5

    user_7000011 wrote:
    Thanks.
    But, what I wanted is: in the innermost select, where we select from item, I need to add a where qualification as below.
    select * from
    (select name, price, dense_rank() over (ORDER by price desc) rank from item where rownum <= 6
    ) a
    where a.rank < 6
    Well, the rank limitation is having the same effect in that query, so there's no need to try and put "rownum <= 6" in that case. What you need to understand is that "rownum" is a pseudo-column that is generated when the results of a query are already determined, but it is applied before any ordering takes place (as the where clause is evaluated before the order by clause).
    So if you try and say "where rownum <= 6 order by x", you are under the misunderstanding that it will order the data first and then take the first 6 rows. This isn't the case. It will take the first 6 rows (whatever they may be) and then order those.
    In order to order first and then take 6 rows you have to order your data without restriction and then wrap that in a query to restrict it.
    Hence...
    select x
    from tableX
    where rownum <= 6
    order by x
    becomes
    select x
    from (select x
          from   tableX
          order by x)
    where rownum <= 6
    This ensures that the data is selected and ordered first and then the first 6 rows are taken from that.
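The difference is easy to demonstrate. Here is a small Python/sqlite3 sketch (invented table and values) that emulates both orders of operations; note that SQLite's LIMIT without ORDER BY returns rows in an unspecified order, much like rownum over an unordered scan:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)",
                 [(v,) for v in [9, 3, 7, 1, 8, 2, 5]])

# Emulates "where rownum <= 6 order by x": grab 6 rows in whatever order the
# engine returns them, THEN sort only those 6.
first_then_sort = sorted(r[0] for r in conn.execute("SELECT x FROM t LIMIT 6"))

# Emulates the wrapped form: order ALL rows first, THEN keep 6 of them.
sort_then_first = [r[0] for r in
                   conn.execute("SELECT x FROM t ORDER BY x LIMIT 6")]

print(first_then_sort)  # six arbitrarily chosen values, sorted among themselves
print(sort_then_first)  # the six smallest values overall
```

Only the second form is guaranteed to contain the six smallest values; the first is sorted but drawn from an arbitrary six rows, which is exactly the rownum trap.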

  • Retrieve a single row from Multirow ResultsetObject

    I would like to know how I can retrieve a single row from a multi-row result object. I have a result object which retrieved multiple rows, and I would like to retrieve only the top row or a specific row. Can any one of you suggest how this can be done? Thank you for all the help in advance.

    Call next() on your result set; this goes to the first row, and you can retrieve the data with getString(), getInt(), etc.

  • First n rows

    Version: Oracle 10
    How can I get the first 'n' rows in a select query, but after applying order by?
    For eg:
    --Get first 4 rows?
    select * from abc order by a;
    As I understand it, rownum can be used in the "where" clause, which means it is applied before the order by.

    I have some examples on the employees table for top-N and bottom-N queries
    (just in case you want to extend your sql)
    TABLE employees
    Name             Null?    Type        
    EMPLOYEE_ID      NOT NULL NUMBER(6)   
    FIRST_NAME                VARCHAR2(20)
    LAST_NAME        NOT NULL VARCHAR2(25)
    EMAIL            NOT NULL VARCHAR2(25)
    PHONE_NUMBER              VARCHAR2(20)
    HIRE_DATE        NOT NULL DATE        
    JOB_ID           NOT NULL VARCHAR2(10)
    SALARY                    NUMBER(8,2) 
    COMMISSION_PCT            NUMBER(2,2) 
    MANAGER_ID                NUMBER(6)   
    DEPARTMENT_ID             NUMBER(4)  
    Task:
    Display top two salaries in each department.
    1. Using function ROW_NUMBER ():
    SELECT employee_id, department_id, salary
      FROM (SELECT employee_id, department_id, salary,
                   ROW_NUMBER () OVER (PARTITION BY department_id ORDER BY salary DESC)
                                                                               r_no
              FROM employees)
    WHERE r_no <= 2
    EMPLOYEE_ID DEPARTMENT_ID SALARY  
    200         10            4400    
    201         20            13000   
    202         20            6000    
    114         30            11000   
    115         30            3100    
    203         40            6500    
    100         90            24000   
    101         90            17000   
    2. Using function RANK()
    SELECT employee_id, department_id, salary, rn
      FROM (SELECT employee_id, department_id, salary,
                   RANK () OVER (PARTITION BY department_id ORDER BY salary DESC)
                                                                               rn
              FROM employees)
    WHERE rn <= 2
    EMPLOYEE_ID DEPARTMENT_ID SALARY   RN                                    
    200         10            4400     1                                     
    201         20            13000    1                                     
    202         20            6000     2                                     
    114         30            11000    1                                     
    115         30            3100     2                                     
    203         40            6500     1                                     
    100         90            24000    1                                     
    101         90            17000    2                                     
    102         90            17000    2                                     
    The salaries that are equal (department 90) receive the same rank.
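The PARTITION BY pattern above runs on any database with window functions. Here is a self-contained sketch using Python's sqlite3 and a few invented employee rows:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE employees
                (employee_id INTEGER, department_id INTEGER, salary NUMERIC)""")
conn.executemany("INSERT INTO employees VALUES (?, ?, ?)", [
    (200, 10, 4400),
    (201, 20, 13000), (202, 20, 6000), (205, 20, 5500),
    (100, 90, 24000), (101, 90, 17000), (102, 90, 17000),
])

# Top two salaries per department; ROW_NUMBER breaks the 17000 tie
# arbitrarily (use RANK instead to keep both tied rows).
top2 = conn.execute("""
    SELECT employee_id, department_id, salary FROM (
        SELECT employee_id, department_id, salary,
               ROW_NUMBER() OVER (PARTITION BY department_id
                                  ORDER BY salary DESC) AS r_no
          FROM employees)
     WHERE r_no <= 2
     ORDER BY department_id, salary DESC""").fetchall()
```

Which of the two 17000 rows survives is unspecified with ROW_NUMBER, matching the note about equal salaries above.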

  • Select First 10 rows of data based on dates

    Hi,
    Have a report like this
    "Object Name", "Object Type", "Complexity", "Task Name", "RCCL Close Date"
    ABC XXX MEDIUM Design 1-1-2008
    BBC XXX MEDIUM Design 2-1-2008
    ABC XXX MEDIUM Design 3-1-2008
    BBC XXX MEDIUM Design 4-1-2008
    ABC XXX MEDIUM Design 5-1-2008
    BBC XXX MEDIUM Design 6-1-2008
    ABC XXX MEDIUM Design 7-1-2008
    BBC XXX MEDIUM Design 8-1-2008
    ABC XXX MEDIUM Design 9-1-2008
    BBC XXX MEDIUM Design 10-1-2008
    ABC XXX MEDIUM Design 11-1-2008
    BBC XXX MEDIUM Design 12-1-2008
    ABC XXX MEDIUM Design 13-1-2008
    BBC XXX MEDIUM Design 14-1-2008
    ABC XXX MEDIUM Design 15-1-2008
    BBC XXX MEDIUM Design 16-1-2008
    How do I select the first 10 rows using a SQL query? Please suggest.
    Thanks
    Sudhir.

    I tried your code but it is returning only the 10th row's data, not all 10 rows. Please suggest how to modify the code.
    The code below is what I am using to execute. When I put a condition rno >= 10 it is showing me the 10th row's data
    only, a single row. Please suggest modifications to the code.
    select *
    from
    (
    select
    "Project Name",
    "Object Name",
    "Object Type",
    "Complexity",
    "Task Name",
    "Actual Effort",
    "Plan Effort",
    "Close Date",
    --count(*) over (partition by "Object Name" order by "Close Date" ) recs
    row_number() over(order by "Close Date") rno
    from
    (
    SELECT
    "Project Name",
    "Object Name",
    "Object Type",
    "Complexity",
    MAX("Task Name") "Task Name",
    SUM("Actual Effort") "Actual Effort",
    SUM("Plan Effort") "Plan Effort",
    MAX("Close Date") "Close Date"
    FROM
    (
    SELECT
    "Project Name",
    "Object Name",
    "Object Type",
    "Complexity",
    MAX("Task Name") "Task Name",
    SUM("Actual Effort") "Actual Effort",
    SUM("Plan Effort") "Plan Effort",
    SUM("EV") "EV",
    MAX("Close Date") "Close Date"
    FROM
    (
    SELECT
    INITCAP(pro.project_name) "Project Name",
    INITCAP(pobj.name) "Object Name",
    INITCAP(POBJ.PROGRAM_TYPE) "Object Type",
    INITCAP(POBJ.COMPLEXITY) "Complexity",
    INITCAP(tas.name) "Task Name",
    SUM( nvl(TCL.NUM_HOURS_DAY1,0) + nvl(TCL.NUM_HOURS_DAY2,0) +
    nvl(TCL.NUM_HOURS_DAY3,0) + nvl(TCL.NUM_HOURS_DAY4,0) +
    nvl(TCL.NUM_HOURS_DAY5,0) + nvl(TCL.NUM_HOURS_DAY6,0) +
    nvl(TCL.NUM_HOURS_DAY7,0) ) "Actual Effort",
    CASE
    WHEN upper(:P68_TASK) LIKE UPPER('Incomming-Doc Review') Then SR.INCOMMING_DOC
    WHEN upper(:P68_TASK) LIKE UPPER('Design') Then
    SR.DESIGN
    WHEN upper(:P68_TASK) LIKE UPPER('Design Review') Then
    SR.DESIGN_REVIEW
    WHEN upper(:P68_TASK) LIKE UPPER('Design Rework') Then
    SR.DESIGN_REWORK
    WHEN upper(:P68_TASK) LIKE UPPER('Build') Then
    SR.BUILD
    WHEN upper(:P68_TASK) LIKE UPPER('Build Review') Then
    SR.BUILD_REVIEW
    WHEN upper(:P68_TASK) LIKE UPPER('Build Rework') Then
    SR.BUILD_REWORK
    WHEN :P68_TASK in ('Test Plan Prep','Unit Test Plan') Then
    SR.TEST_CASE_PREP
    WHEN :P68_TASK in ('Test Plan Review','Unit Test Plan Review') Then
    SR.TEST_CASE_REVIEW
    WHEN :P68_TASK in ('Test Plan Rework','Unit Test Plan Rework') Then
    SR.TEST_CASE_REWORK
    WHEN :P68_TASK in ('Testing','Unit Testing') Then
    SR.UNIT_TESTING
    WHEN :P68_TASK in ('Test Result Review','Unit Test Result Review') Then
    SR.TEST_RESULT_REVIEW
    WHEN upper(:P68_TASK) LIKE UPPER('Installation Review') Then
    SR.INSTALLATION_SCRIPT
    WHEN upper(:P68_TASK) LIKE UPPER('Release-Review') Then
    SR.RELEASE_REVIEW
    WHEN upper(:P68_TASK) LIKE UPPER('FDD Review') Then
    SR.FDD_REVIEW
    End "Plan Effort",
    CASE
    WHEN upper(:P68_TASK) LIKE UPPER('Incomming-Doc Review') Then
    trunc((((
    SUM( nvl(TCL.NUM_HOURS_DAY1,0) + nvl(TCL.NUM_HOURS_DAY2,0) +
    nvl(TCL.NUM_HOURS_DAY3,0) + nvl(TCL.NUM_HOURS_DAY4,0) +
    nvl(TCL.NUM_HOURS_DAY5,0) + nvl(TCL.NUM_HOURS_DAY6,0) +
    nvl(TCL.NUM_HOURS_DAY7,0) )
    - nvl(SR.INCOMMING_DOC,0)
    ) / decode(SR.INCOMMING_DOC,0,null,SR.INCOMMING_DOC) ) * 100 ),2)
    WHEN upper(:P68_TASK) LIKE UPPER('Design') Then
    trunc((((
    SUM( nvl(TCL.NUM_HOURS_DAY1,0) + nvl(TCL.NUM_HOURS_DAY2,0) +
    nvl(TCL.NUM_HOURS_DAY3,0) + nvl(TCL.NUM_HOURS_DAY4,0) +
    nvl(TCL.NUM_HOURS_DAY5,0) + nvl(TCL.NUM_HOURS_DAY6,0) +
    nvl(TCL.NUM_HOURS_DAY7,0) )
    - nvl(SR.DESIGN,0)
    ) / decode(SR.DESIGN,0,null,SR.DESIGN) ) * 100 ),2)
    WHEN upper(:P68_TASK) LIKE UPPER('Design Review') Then
    trunc((((
    SUM( nvl(TCL.NUM_HOURS_DAY1,0) + nvl(TCL.NUM_HOURS_DAY2,0) +
    nvl(TCL.NUM_HOURS_DAY3,0) + nvl(TCL.NUM_HOURS_DAY4,0) +
    nvl(TCL.NUM_HOURS_DAY5,0) + nvl(TCL.NUM_HOURS_DAY6,0) +
    nvl(TCL.NUM_HOURS_DAY7,0) )
    - nvl(SR.DESIGN_REVIEW,0)
    ) / decode(SR.DESIGN_REVIEW,0,null,SR.DESIGN_REVIEW) ) * 100 ),2)
    WHEN upper(:P68_TASK) LIKE UPPER('Design Rework') Then
    trunc((((
    SUM( nvl(TCL.NUM_HOURS_DAY1,0) + nvl(TCL.NUM_HOURS_DAY2,0) +
    nvl(TCL.NUM_HOURS_DAY3,0) + nvl(TCL.NUM_HOURS_DAY4,0) +
    nvl(TCL.NUM_HOURS_DAY5,0) + nvl(TCL.NUM_HOURS_DAY6,0) +
    nvl(TCL.NUM_HOURS_DAY7,0) )
    - nvl(SR.DESIGN_REWORK,0)
    ) / decode(SR.DESIGN_REWORK,0,null,SR.DESIGN_REWORK) ) * 100 ),2)
    WHEN upper(:P68_TASK) LIKE UPPER('Build') Then
    trunc((((
    SUM( nvl(TCL.NUM_HOURS_DAY1,0) + nvl(TCL.NUM_HOURS_DAY2,0) +
    nvl(TCL.NUM_HOURS_DAY3,0) + nvl(TCL.NUM_HOURS_DAY4,0) +
    nvl(TCL.NUM_HOURS_DAY5,0) + nvl(TCL.NUM_HOURS_DAY6,0) +
    nvl(TCL.NUM_HOURS_DAY7,0) )
    - nvl(SR.BUILD,0)
    ) / decode(SR.BUILD,0,null,SR.BUILD) ) * 100 ),2)
    WHEN upper(:P68_TASK) LIKE UPPER('Build Review') Then
    trunc((((
    SUM( nvl(TCL.NUM_HOURS_DAY1,0) + nvl(TCL.NUM_HOURS_DAY2,0) +
    nvl(TCL.NUM_HOURS_DAY3,0) + nvl(TCL.NUM_HOURS_DAY4,0) +
    nvl(TCL.NUM_HOURS_DAY5,0) + nvl(TCL.NUM_HOURS_DAY6,0) +
    nvl(TCL.NUM_HOURS_DAY7,0) )
    - nvl(SR.BUILD_REVIEW,0)
    ) / decode(SR.BUILD_REVIEW,0,null,SR.BUILD_REVIEW) ) * 100 ),2)
    WHEN upper(:P68_TASK) LIKE UPPER('Build Rework') Then
    trunc((((
    SUM( nvl(TCL.NUM_HOURS_DAY1,0) + nvl(TCL.NUM_HOURS_DAY2,0) +
    nvl(TCL.NUM_HOURS_DAY3,0) + nvl(TCL.NUM_HOURS_DAY4,0) +
    nvl(TCL.NUM_HOURS_DAY5,0) + nvl(TCL.NUM_HOURS_DAY6,0) +
    nvl(TCL.NUM_HOURS_DAY7,0) )
    - nvl(SR.BUILD_REWORK,0)
    ) / decode(SR.BUILD_REWORK,0,null,SR.BUILD_REWORK) ) * 100 ),2)
    WHEN :P68_TASK in ('Test Plan Prep','Unit Test Plan') Then
    trunc((((
    SUM( nvl(TCL.NUM_HOURS_DAY1,0) + nvl(TCL.NUM_HOURS_DAY2,0) +
    nvl(TCL.NUM_HOURS_DAY3,0) + nvl(TCL.NUM_HOURS_DAY4,0) +
    nvl(TCL.NUM_HOURS_DAY5,0) + nvl(TCL.NUM_HOURS_DAY6,0) +
    nvl(TCL.NUM_HOURS_DAY7,0) )
    - nvl(SR.TEST_CASE_PREP,0)
    ) / decode(SR.TEST_CASE_PREP,0,null,SR.TEST_CASE_PREP) ) * 100 ),2)
    WHEN :P68_TASK in ('Test Plan Review','Unit Test Plan Review') Then
    trunc((((
    SUM( nvl(TCL.NUM_HOURS_DAY1,0) + nvl(TCL.NUM_HOURS_DAY2,0) +
    nvl(TCL.NUM_HOURS_DAY3,0) + nvl(TCL.NUM_HOURS_DAY4,0) +
    nvl(TCL.NUM_HOURS_DAY5,0) + nvl(TCL.NUM_HOURS_DAY6,0) +
    nvl(TCL.NUM_HOURS_DAY7,0) )
    - nvl(SR.TEST_CASE_REVIEW,0)
    ) / decode(SR.TEST_CASE_REVIEW,0,null,SR.TEST_CASE_REVIEW) ) * 100 ),2)
    WHEN :P68_TASK in ('Test Plan Rework','Unit Test Plan Rework') Then
    trunc((((
    SUM( nvl(TCL.NUM_HOURS_DAY1,0) + nvl(TCL.NUM_HOURS_DAY2,0) +
    nvl(TCL.NUM_HOURS_DAY3,0) + nvl(TCL.NUM_HOURS_DAY4,0) +
    nvl(TCL.NUM_HOURS_DAY5,0) + nvl(TCL.NUM_HOURS_DAY6,0) +
    nvl(TCL.NUM_HOURS_DAY7,0) )
    - nvl(SR.TEST_CASE_REWORK,0)
    ) / decode(SR.TEST_CASE_REWORK,0,null,SR.TEST_CASE_REWORK) ) * 100 ),2)
    WHEN :P68_TASK in ('Testing','Unit Testing') Then
    trunc((((
    SUM( nvl(TCL.NUM_HOURS_DAY1,0) + nvl(TCL.NUM_HOURS_DAY2,0) +
    nvl(TCL.NUM_HOURS_DAY3,0) + nvl(TCL.NUM_HOURS_DAY4,0) +
    nvl(TCL.NUM_HOURS_DAY5,0) + nvl(TCL.NUM_HOURS_DAY6,0) +
    nvl(TCL.NUM_HOURS_DAY7,0) )
    - nvl(SR.UNIT_TESTING,0)
    ) / decode(SR.UNIT_TESTING,0,null,SR.UNIT_TESTING) ) * 100 ),2)
    WHEN :P68_TASK in ('Test Result Review','Unit Test Result Review') Then
    trunc((((
    SUM( nvl(TCL.NUM_HOURS_DAY1,0) + nvl(TCL.NUM_HOURS_DAY2,0) +
    nvl(TCL.NUM_HOURS_DAY3,0) + nvl(TCL.NUM_HOURS_DAY4,0) +
    nvl(TCL.NUM_HOURS_DAY5,0) + nvl(TCL.NUM_HOURS_DAY6,0) +
    nvl(TCL.NUM_HOURS_DAY7,0) )
    - nvl(SR.TEST_RESULT_REVIEW,0)
    ) / decode(SR.TEST_RESULT_REVIEW,0,null,SR.TEST_RESULT_REVIEW) ) * 100 ),2)
    WHEN upper(:P68_TASK) LIKE UPPER('Installation Review') Then
    trunc((((
    SUM( nvl(TCL.NUM_HOURS_DAY1,0) + nvl(TCL.NUM_HOURS_DAY2,0) +
    nvl(TCL.NUM_HOURS_DAY3,0) + nvl(TCL.NUM_HOURS_DAY4,0) +
    nvl(TCL.NUM_HOURS_DAY5,0) + nvl(TCL.NUM_HOURS_DAY6,0) +
    nvl(TCL.NUM_HOURS_DAY7,0) )
    - nvl(SR.INSTALLATION_SCRIPT,0)
    ) / decode(SR.INSTALLATION_SCRIPT,0,null,SR.INSTALLATION_SCRIPT) ) * 100 ),2)
    WHEN upper(:P68_TASK) LIKE UPPER('Release-Review') Then
    trunc((((
    SUM( nvl(TCL.NUM_HOURS_DAY1,0) + nvl(TCL.NUM_HOURS_DAY2,0) +
    nvl(TCL.NUM_HOURS_DAY3,0) + nvl(TCL.NUM_HOURS_DAY4,0) +
    nvl(TCL.NUM_HOURS_DAY5,0) + nvl(TCL.NUM_HOURS_DAY6,0) +
    nvl(TCL.NUM_HOURS_DAY7,0) )
    - nvl(SR.RELEASE_REVIEW,0)
    ) / decode(SR.RELEASE_REVIEW,0,null,SR.RELEASE_REVIEW) ) * 100 ),2)
    WHEN upper(:P68_TASK) LIKE UPPER('FDD Review') Then
    trunc((((
    SUM( nvl(TCL.NUM_HOURS_DAY1,0) + nvl(TCL.NUM_HOURS_DAY2,0) +
    nvl(TCL.NUM_HOURS_DAY3,0) + nvl(TCL.NUM_HOURS_DAY4,0) +
    nvl(TCL.NUM_HOURS_DAY5,0) + nvl(TCL.NUM_HOURS_DAY6,0) +
    nvl(TCL.NUM_HOURS_DAY7,0) )
    - nvl(SR.FDD_REVIEW,0)
    ) / decode(SR.FDD_REVIEW,0,null,SR.FDD_REVIEW) ) * 100 ),2)
    End "EV",
    NULL "Close Date"
    FROM
    timecard_lines tcl,
    project_objects pobj,
    tasks tas,
    projects pro,
    timecard_headers thr,
    TIMECARD_PERIODS TP,
    Sub_Proc_Replan SR
    WHERE
    pro.id = :P1_PROJECTS AND
    TP.ID = THR.TPD_ID AND
    tcl.pobj_id = pobj.id AND
    tcl.tas_id = tas.id AND
    pro.id = pobj.pro_id AND
    thr.id = tcl.thr_id AND
    pobj.id = tcl.pobj_id and
    PRO.ID = SR.PRO_ID AND
    POBJ.ID = SR.POBJ_ID AND
    upper(tas.name) like upper(:P68_TASK)
    and tas.name not in ('Wait Time','Wait time','Wait-Time','Project Management',
    'Quality Management','Offshore-Onsite','Wait_Time')
    and pobj.id in
    (SELECT distinct pobj.id
    FROM
    task_status ts,projects pro,project_objects pobj,task tsk
    ,employees emp,employee_project_pairs epp,REVIEW_ITEMS RI
    WHERE
    pro.id = :p1_projects and
    pobj.pro_id = pro.id
    and pro.id = ts.pro_id
    and pobj.id = ts.pro_obj_id
    and tsk.id = ts.task_id
    and emp.id = epp.emp_id
    and epp.pro_id = pro.id
    and ts.id = ri.ts_id
    and tsk.name in
    ('Incomming-Doc Review',
    'Incoming Doc- Review',
    'FDD - Review',
    'Architecture - Review',
    'Design Review',
    'Design - Review',
    'TDD - Review',
    'Build Review',
    'Build - Review',
    'Code - Review',
    'Code',
    'Test Result Review',
    'Test Result - Review',
    'UTR - Review',
    'Test - Review',
    'Installation Review',
    'Installation - Review',
    'Release-Review',
    'Release - Verification',
    'Migration & Release - Review',
    'Release - Review')
    and ts.status in ('CLOSE','ACCEPTED','closed')
    and ( upper(:P68_TASK) like '%'||upper(substr(tsk.name,1,5))||'%'
    or upper(:P68_TASK_NAMES) like '%'||upper(substr(tsk.name,1,5))||'%' )
    )
    GROUP BY
    pro.project_name,pobj.name,tas.name,POBJ.PROGRAM_TYPE,POBJ.COMPLEXITY,SR."Effort",SR.INCOMMING_DOC,SR.DESIGN,SR.DESIGN_REVIEW,SR.DESIGN_REWORK,SR.BUILD,
    SR.BUILD_REVIEW,SR.BUILD_REWORK,SR.TEST_CASE_PREP,SR.TEST_CASE_REVIEW,
    SR.TEST_CASE_REWORK,SR.UNIT_TESTING,SR.TEST_RESULT_REVIEW,
    SR.INSTALLATION_SCRIPT,SR.RELEASE_REVIEW,SR.FDD_REVIEW
    UNION
    SELECT distinct
    INITCAP(pro.project_name) "Project Name",
    INITCAP(pobj.name) "Object Name",
    INITCAP(POBJ.PROGRAM_TYPE) "Object Type",
    INITCAP(POBJ.COMPLEXITY) "Complexity",
    NULL "Task Name",
    NULL "Actual Effort",
    NULL "Plan Effort",
    NULL "EV",
    decode(
    MAX(ts.close_timestamp),null,
    MAX(ts.REVIEWER_ACCEPTED_DATE),
    MAX(ts.close_timestamp) ) "Close Date"
    FROM
    task_status ts,projects pro,project_objects pobj,task tsk
    ,employees emp,employee_project_pairs epp,REVIEW_ITEMS RI
    WHERE
    pro.id = :p1_projects and
    pobj.pro_id = pro.id
    and pro.id = ts.pro_id
    and pobj.id = ts.pro_obj_id
    and tsk.id = ts.task_id
    and emp.id = epp.emp_id
    and epp.pro_id = pro.id
    and ts.id = ri.ts_id
    and tsk.name in
    ('Incomming-Doc Review',
    'Incoming Doc- Review',
    'FDD - Review',
    'Architecture - Review',
    'Design Review',
    'Design - Review',
    'TDD - Review',
    'Build Review',
    'Build - Review',
    'Code - Review',
    'Code',
    'Test Result Review',
    'Test Result - Review',
    'UTR - Review',
    'Test - Review',
    'Installation Review',
    'Installation - Review',
    'Release-Review',
    'Release - Verification',
    'Migration & Release - Review',
    'Release - Review')
    and ts.status in ('CLOSE','ACCEPTED','closed')
    and ( upper(:P68_TASK) like '%'||upper(substr(tsk.name,1,5))||'%'
    or upper(:P68_TASK_NAMES) like '%'||upper(substr(tsk.name,1,5))||'%' )
    group by pro.project_name,pobj.name,tsk.name,POBJ.PROGRAM_TYPE,pobj.complexity
    )
    GROUP BY "Project Name","Object Name","Object Type","Complexity"
    )
    WHERE "Task Name" IS NOT NULL
    GROUP BY "Project Name","Object Name","Object Type","Complexity"
    ORDER BY "Close Date"
    where rno >= 10
    and rownum <= 10
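    Each "EV" branch above computes the same thing: actual effort against the planned figure from Sub_Proc_Replan, as a percentage, truncated to 2 decimals, with decode turning a zero plan into a NULL divisor. A sketch of that arithmetic in Python (the subtraction is an assumption; the operator between the SUM and the nvl(...) appears to have been lost in the forum formatting):

    ```python
    import math

    def earned_value(actual, plan):
        """Percent variance of actual vs planned effort, truncated to 2 decimals.

        Mirrors trunc((((SUM(actual) - nvl(plan, 0)) /
                        decode(plan, 0, null, plan)) * 100), 2).
        """
        if plan == 0:
            return None  # decode(plan, 0, null, plan): NULL divisor, NULL result
        pct = (actual - plan) / plan * 100
        return math.trunc(pct * 100) / 100  # Oracle TRUNC(x, 2) truncates toward zero
    ```

    With plan = 0 the SQL expression yields NULL rather than raising a division-by-zero error, which is exactly what the decode guard is for.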
    Thanks
    Sudhir
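    The dangling `where rno >= 10 and rownum <= 10` at the end is reaching for the standard pagination wrapper discussed at the top of this thread. Of the three variants given there, the Row_Number() one is the easiest to adapt. A runnable sketch of the same idea using Python's sqlite3 module (SQLite 3.25+ supports window functions; the table and column names here are made up):

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE your_table (sort_key INTEGER, payload TEXT)")
    conn.executemany(
        "INSERT INTO your_table VALUES (?, ?)",
        [(i, f"row-{i}") for i in range(1, 251)],  # 250 rows
    )

    def fetch_page(conn, start_rank, end_rank):
        """Rows ranked start_rank..end_rank by sort_key descending."""
        return conn.execute(
            """
            SELECT sort_key, payload FROM (
                SELECT sort_key, payload,
                       ROW_NUMBER() OVER (ORDER BY sort_key DESC) AS rn
                FROM your_table
            )
            WHERE rn BETWEEN ? AND ?
            ORDER BY sort_key DESC
            """,
            (start_rank, end_rank),
        ).fetchall()

    first_page = fetch_page(conn, 1, 100)     # keys 250..151
    second_page = fetch_page(conn, 101, 200)  # keys 150..51
    ```

    In Oracle the equivalent is the OLAP variant quoted earlier: Row_Number() over(order by SortKey desc) in an inline view, filtered with BETWEEN StartRank AND EndRank.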

  • Update 100 Row Data In One Time against a Code

    Dear Expert,
    How can I update Master Data Row table records using a particular code? For example, I write a query: Update Table Set U_UDF = 'Value' Where Code = 2
    U_UDF is a field of the Master Data Row table. There are 100 rows against code 2, and I want to update all of them with the same value in one go.
    How can I do that?
    Please help me.
    Regards

    Hi,
    Try this:
    SAPbobsCOM.GeneralService oGeneralService = null;
    SAPbobsCOM.GeneralData oGeneralData = null;
    SAPbobsCOM.GeneralDataParams oGeneralParams = null;
    SAPbobsCOM.CompanyService sCmp = null;
    SAPbobsCOM.GeneralData oChild = null;
    SAPbobsCOM.GeneralDataCollection oChildren = null;
    sCmp = SBO_Company.GetCompanyService();
    oGeneralService = sCmp.GetGeneralService("UDO");
    // Get the UDO record
    oGeneralParams = ((SAPbobsCOM.GeneralDataParams)(oGeneralService.GetDataInterface(SAPbobsCOM.GeneralServiceDataInterfaces.gsGeneralDataParams)));
    oGeneralParams.SetProperty("Code", ContractCode);
    oGeneralData = oGeneralService.GetByParams(oGeneralParams);
    // Access the lines of the UDO child table
    oChildren = oGeneralData.Child("CONTRACTDETAIL");
    // Update an existing line
    oChild = oChildren.Item(LineId - 1);
    DateTime dt = DateTime.Now;
    oChild.SetProperty("U_STATS", "Terminated");
    oChild.SetProperty("U_Updated", dt);
    oChild.SetProperty("U_Remarks", "Service Terminated");
    // Update the UDO record
    oGeneralService.Update(oGeneralData);
    I have given you the sample. Just change it accordingly.
    Hope it helps.
    Thanks & Regards
    Ankit Chauhan
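    For the plain database effect the poster describes (every row with Code = 2 getting the same value), a single UPDATE with a WHERE clause already touches all 100 rows in one statement; no row-by-row loop is needed. A sketch against a made-up table, shown with SQLite since the exact schema isn't given; note that for SAP Business One UDO tables the supported route is the DI API, not direct SQL:

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE master_rows (Code INTEGER, U_UDF TEXT)")

    # 100 rows under code 2, plus a few under another code
    conn.executemany(
        "INSERT INTO master_rows VALUES (?, ?)",
        [(2, None)] * 100 + [(3, None)] * 5,
    )

    # One statement updates every matching row at once
    cur = conn.execute("UPDATE master_rows SET U_UDF = 'Value' WHERE Code = 2")
    updated = cur.rowcount  # number of rows affected
    ```

    The rows under other codes are left untouched, which is the whole point of the WHERE clause.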

  • "Powermaps needs more resources to complete this operation" error. 2 datasets 100 rows. one tour. two layers

    I'm working on some BI mashups to demo the Power BI offerings to potential clients, but I get close to 20+ crashes a day relating to memory issues.
    I'm using Power Query to import and filter data from 2 RSS feeds. Each is at most 100 rows. I then add these to a PowerPivot model and have two Power View reports over it.
    I've added a Power Map to the solution but find the reliability to be very poor. When it does manage to work, the colours of the bubbles and bars sometimes don't match the defined colours, or they're black (a telltale sign of memory problems in Windows). This
    usually breaks the whole Excel process and I can't save the workbook. I get "workbook not saved" errors, or "there was a problem saving with all the features added. Click continue and save as a different file".
    Either that or I get a hard crash, like I did while writing this post:
    Problem signature:
      Problem Event Name:    APPCRASH
      Application Name:    EXCEL.EXE
      Application Version:    15.0.4535.1507
      Application Timestamp:    52282875
      Fault Module Name:    KERNELBASE.dll
      Fault Module Version:    6.1.7601.18229
      Fault Module Timestamp:    51fb1116
      Exception Code:    e0434f4d
      Exception Offset:    0000c41f
      OS Version:    6.1.7601.2.1.0.256.1
      Locale ID:    3081
    Additional information about the problem:
      LCID:    1033
      skulcid:    1033
    Memory amount is not a problem according to Resource Monitor; the Excel process is taking ~500 MB with 70% of RAM free. I left memtest86+ running overnight and it found no errors in 8 passes, so it looks like my sticks are OK. I have SQL Server 2008 R2 & 2012
    along with all the bells (DB engine, SSIS, SSAS) installed, and that never gives me hard memory failures even when I drive it hard. Obviously if I try to process a data model that's too big it gives me out-of-memory errors, but that's expected and it doesn't crash
    SSDT or the SSAS Tabular instance. The services are off while I'm doing the Excel Power BI work.
    laptop specs are :
    OS Name    Microsoft Windows 7 Ultimate
    Version    6.1.7601 Service Pack 1 Build 7601
    Other OS Description     Not Available
    OS Manufacturer    Microsoft Corporation
    System Manufacturer    Dell Inc.
    System Model    XPS L521X
    System Type    x64-based PC
    Processor    Intel(R) Core(TM) i7-3612QM CPU @ 2.10GHz, 2101 Mhz, 4 Core(s), 8 Logical Processor(s)
    BIOS Version/Date    Dell Inc. A15, 1/08/2013
    SMBIOS Version    2.7
    Windows Directory    C:\Windows
    System Directory    C:\Windows\system32
    Boot Device    \Device\HarddiskVolume1
    Locale    Australia
    Hardware Abstraction Layer    Version = "6.1.7601.17514"
    Time Zone    Cen. Australia Daylight Time
    Installed Physical Memory (RAM)    8.00 GB
    Total Physical Memory    7.88 GB
    Available Physical Memory    5.67 GB
    Total Virtual Memory    15.9 GB
    Available Virtual Memory    13.6 GB
    Page File Space    8.00 GB
    Page File    C:\pagefile.sys
    I can upload the workbook too if desired.
    I don't solely blame Power Map; rendering the Power View reports is pretty hit and miss in all the workbooks I've developed before as well, but clicking the Maps button in Excel is certainly the biggest gamble.
    Jakub @ Adelaide, Australia

    Hi Jakub,
    I can think of two changes you can make that can help.
    1. You don't mention what architecture your Office version is. I would suggest you use 64-bit Excel. It has a higher memory threshold and performs much better when using a lot of memory.
    2. Install all of the latest CU updates for Office. There have been dozens of bug fixes for PowerPivot and Power View since the last release.
    You mention you can share the workbook. Feel free to send it to me at
    [email protected] and I would be willing to take a look and see if the crashes are known issues or already fixed.
    Brad Syputa, Microsoft Reporting Services This posting is provided "AS IS" with no warranties.
