Help with a query regarding time dependent display of Plan/Actual data

Hello,
Let me try to explain what my problem is:
I have a query that shows plan and actual sales figures on a timeline. Plan and actual data are identified by 0VERSION (P01 for Plan; P00 for Actual).
0CALMONTH and the Sales key figure are in the rows; 0VERSION is in the columns.
The output looks as follows (simplified):
CalYYYY/MM           P01   P00
2006.09   Sales       90   100
2006.10   Sales      100    95
2006.11   Sales       90     0
2006.12   Sales       95     0
So far so good! But users aren't satisfied with that. They want only one column with sales figures. For past months (< 2006.11) they want to see actual figures (version P00). For the current and future months (>= 2006.11) they want to see plan figures (version P01). And all of this in one single column.
So the output should look this way:
CalYYYY/MM        P00/P01
2006.09   Sales       100
2006.10   Sales        95
2006.11   Sales        90
2006.12   Sales        95
Is there any way or workaround to accomplish this?
Your help would be very much appreciated!
Regards,
Ulrich

Hi,
What you can try is this: keep the two restricted key figures (Plan and Actual) the way you have them right now, but hide them. Then create a third formula key figure in which you put Boolean logic:
(FV < 11) * Actual KF + (FV >= 11) * Plan KF
where FV is a formula variable that takes its value from the month (1 for Jan, 2 for Feb, etc.), so months before the current month pick up the actual key figure and the current and later months pick up the plan key figure.
Try something along these lines.
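For instance, here is a sketch of that idea in BEx formula notation (the variable names are illustrative, not from this thread); using the full calendar month instead of a hard-coded month number also keeps the logic correct across year boundaries:

'Combined Sales' = ( CALMONTH <  CURRMONTH ) * 'Actual Sales (P00)'
                 + ( CALMONTH >= CURRMONTH ) * 'Plan Sales (P01)'

Here CALMONTH would be a formula variable filled by replacement path from 0CALMONTH (value in the form YYYYMM) and CURRMONTH a customer-exit variable returning the current YYYYMM. The comparisons evaluate to 1 or 0, so exactly one of the two hidden restricted key figures contributes per row.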

Similar Messages

  • Hi guys, can someone help with a query regarding the 'podcast app'? Why do they not have all the episodes that relate to one show available, why only half or a selected amount?

    Hi guys, can someone help with a query regarding the 'podcast app'? Why do they not have all the episodes that relate to one show available, why only half or a selected amount?

    Thanks... but some days they have all the episodes right back to the very first show... I've downloaded a few, but they are only available every now and then, which makes no sense... why not have them available the whole time?

  • Age group to make it time dependent display as per transaction date

    Hi All,
    I have a doubt about displaying an attribute as time dependent. Below is my scenario.
    I have a cube in which I capture the transaction data of the customer. Currently, while loading the transaction data, I use the customer number to look up the customer master data, find the D.O.B., calculate the age group (bucket) and store it in the transaction cube itself. Note that the age group is calculated from the DOB and the transaction date. It was working fine.
    But an issue was raised today: a customer is registered and allowed to do some transactions immediately, and the personal details like DOB and address are captured later. Hence all the transactions posted before the DOB is maintained in the source system end up in the cube without a DOB.
    Therefore I do not want to keep it in the cube and want the age group to be fetched from an attribute or by some other means. But if I make the age group an attribute of the customer it will show only the current age, whereas I need to show the age group as of the transaction date.
    For example, today the customer's age is 25 and the day after tomorrow his age changes to 26. In this case, if I execute the report for the whole month, the transactions made while he was 25 should appear under age 25 and the later ones under age 26.
    How should I achieve this? Please help me out.
    Note: using a virtual key figure seems to create a performance issue because the data volume pulled is large.
    thanks in advance
    Prem

    Hi,
    In this scenario I think a virtual key figure is inevitable.
    Hope that helps.
    Regards
    Mr Kapadia
    (Assigning points is the way to say thanks.)
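    As a rough illustration of what that virtual key figure (or a transformation-time lookup) has to compute, here is a sketch in plain SQL; the table and column names are hypothetical, not from this thread. The point is that each transaction is joined to the customer master record that was valid on the transaction date:
    -- Hypothetical sketch: age as of the transaction date, read from a
    -- time-dependent customer master table with valid_from/valid_to columns.
    SELECT t.customer_id,
           t.transaction_date,
           FLOOR(MONTHS_BETWEEN(t.transaction_date, c.date_of_birth) / 12) AS age_at_transaction
    FROM   transactions t
           JOIN customer_master c
             ON  c.customer_id      = t.customer_id
             AND t.transaction_date BETWEEN c.valid_from AND c.valid_to;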

  • Help with SQL query involving time operations

    I have created 2 tables in my SQL. One is the user_info table which stores the time of login and timezone of login for each user. The other is the post_table which stores the postid, user who makes the post, time of post and timezone for each posts.
    CREATE TABLE user_info (
        user_id VARCHAR(20),
        login_date DATE,
        login_time_zone VARCHAR(20),
        PRIMARY KEY (user_id)
    );
    CREATE TABLE post_table (
        post_id VARCHAR(20),
        user_id VARCHAR(20),
        datepost DATE,
        time_zone VARCHAR(20),
        PRIMARY KEY (post_id),
        FOREIGN KEY (user_id) REFERENCES user_info(user_id) ON DELETE CASCADE
    );
    Some sample data for my tables is as below -
    INSERT INTO user_info VALUES( 'u1', to_date('9/17/2009 20:00','MM/DD/YYYY mi:ss'), -2 );
    INSERT INTO user_info VALUES( 'u2', to_date('9/17/2009 19:55','MM/DD/YYYY mi:ss'), -4 );
    INSERT INTO post_table VALUES( 'p1', 'u1', to_date('9/17/2009 20:50','MM/DD/YYYY mi:ss'), 6 );
    INSERT INTO post_table VALUES( 'p2', 'u2', to_date('9/17/2009 20:30','MM/DD/YYYY mi:ss'), -5 );
    INSERT INTO post_table VALUES( 'p3', 'u2', to_date('9/18/2009 6:00','MM/DD/YYYY mi:ss'), 2 );
    INSERT INTO post_table VALUES( 'p4', 'u1', to_date('9/17/2009 21:00','MM/DD/YYYY mi:ss'), -3 );
    I need to write an SQL query which finds the user(s) whose time difference between the login time and the latest time when he/she writes a post is the smallest. I need to consider the timezones here as well.
    I am unsure if time_zone should be of type VARCHAR or TIMESTAMP so have created it as VARCHAR in my tables.
    Someone please help me form this query.
    PS: How do I use <code> tags in this forum to write SQL statements?
    Edited by: user11994430 on Oct 9, 2009 5:59 PM

    I tried with the following test data
    INSERT INTO user_info VALUES( 'u1', to_date('9/17/2009 20:00','MM/DD/YYYY mi:ss'), 1 );
    INSERT INTO user_info VALUES( 'u2', to_date('9/16/2009 13:00','MM/DD/YYYY mi:ss'), 1 );
    INSERT INTO user_info VALUES( 'u3', to_date('9/18/2009 15:00','MM/DD/YYYY mi:ss'), 0 );
    INSERT INTO user_info VALUES( 'u4', to_date('9/20/2009 17:00','MM/DD/YYYY mi:ss'), 0 );
    INSERT INTO user_info VALUES( 'u5', to_date('9/14/2009 3:00','MM/DD/YYYY mi:ss'), -3 );
    INSERT INTO user_info VALUES( 'u6', to_date('9/15/2009 6:00','MM/DD/YYYY mi:ss'), -3 );
    INSERT INTO user_info VALUES( 'u7', to_date('9/16/2009 7:00','MM/DD/YYYY mi:ss'), 0 );
    INSERT INTO user_info VALUES( 'u8', to_date('9/17/2009 8:00','MM/DD/YYYY mi:ss'), -8 );
    INSERT INTO user_info VALUES( 'u9', to_date('9/18/2009 9:00','MM/DD/YYYY mi:ss'), 0 );
    INSERT INTO user_info VALUES( 'u10', to_date('9/19/2009 10:00','MM/DD/YYYY mi:ss'), 1 );
    INSERT INTO user_info VALUES( 'u11', to_date('9/20/2009 11:00','MM/DD/YYYY mi:ss'), -5 );
    INSERT INTO user_info VALUES( 'u12', to_date('9/21/2009 19:00','MM/DD/YYYY mi:ss'), -8 );
    INSERT INTO user_info VALUES( 'u13', to_date('9/1/2009 4:00','MM/DD/YYYY mi:ss'), -3 );
    INSERT INTO user_info VALUES( 'u14', to_date('9/22/2009 7:00','MM/DD/YYYY mi:ss'), 1 );
    INSERT INTO user_info VALUES( 'u15', to_date('9/24/2009 23:00','MM/DD/YYYY mi:ss'), 1 );
    INSERT INTO user_info VALUES( 'u16', to_date('9/25/2009 11:00','MM/DD/YYYY mi:ss'), 1 );
    INSERT INTO user_info VALUES( 'u17', to_date('9/26/2009 18:00','MM/DD/YYYY mi:ss'), -4 );
    INSERT INTO user_info VALUES( 'u18', to_date('9/27/2009 13:00','MM/DD/YYYY mi:ss'), -8 );
    INSERT INTO user_info VALUES( 'u19', to_date('9/17/2009 18:00','MM/DD/YYYY mi:ss'), -5 );
    INSERT INTO user_info VALUES( 'u20', to_date('9/29/2009 22:00','MM/DD/YYYY mi:ss'), -8 );
    INSERT INTO user_info VALUES( 'u21', to_date('9/30/2009 5:00','MM/DD/YYYY mi:ss'), -8 );
    INSERT INTO user_info VALUES( 'u22', to_date('9/15/2009 7:00','MM/DD/YYYY mi:ss'), -4 );
    INSERT INTO user_info VALUES( 'u23', to_date('9/16/2009 17:00','MM/DD/YYYY mi:ss'), -8 );
    INSERT INTO user_info VALUES( 'u24', to_date('9/17/2009 19:00','MM/DD/YYYY mi:ss'), 0 );
    INSERT INTO user_info VALUES( 'u25', to_date('9/18/2009 22:00','MM/DD/YYYY mi:ss'), -5 );
    INSERT INTO user_info VALUES( 'u26', to_date('9/19/2009 15:00','MM/DD/YYYY mi:ss'), 1 );
    INSERT INTO user_info VALUES( 'u27', to_date('9/20/2009 23:00','MM/DD/YYYY mi:ss'), 1 );
    INSERT INTO post_table VALUES('p1', 'u26', to_date('9/14/2009 18:00','MM/DD/YYYY mi:ss'), -5 ) ;
    INSERT INTO post_table VALUES('p2', 'u2',  to_date('7/1/2009 15:00','MM/DD/YYYY mi:ss'), 1 ) ;
    INSERT INTO post_table VALUES('p3',  'u2',  to_date('7/20/2009 20:00','MM/DD/YYYY mi:ss'), 1  );
    INSERT INTO post_table VALUES('p4', 'u5',  to_date('7/20/2009 22:00','MM/DD/YYYY mi:ss'), 1) ;
    INSERT INTO post_table VALUES( 'p5',  'u2', to_date('7/21/2009 10:00','MM/DD/YYYY mi:ss'), 1  );
    INSERT INTO post_table VALUES(  'p6',  'u8',  to_date('8/1/2009 20:00','MM/DD/YYYY mi:ss'), -8  );
    INSERT INTO post_table VALUES( 'p7',  'u10', to_date('5/3/2009 15:00','MM/DD/YYYY mi:ss'), -3 ) ;
    INSERT INTO post_table VALUES( 'p8',  'u25', to_date('9/15/2009 20:00','MM/DD/YYYY mi:ss'), -5 ) ;
    INSERT INTO post_table VALUES(  'p9',  'u6', to_date('9/7/2009 19:00','MM/DD/YYYY mi:ss'), -3 ) ;
    INSERT INTO post_table VALUES( 'p10',  'u10', to_date('7/22/2009 10:00','MM/DD/YYYY mi:ss'), 1 ) ;
    INSERT INTO post_table VALUES( 'p11',  'u9',  to_date('7/7/2009 13:00','MM/DD/YYYY mi:ss'), 0) ;
    INSERT INTO post_table VALUES(  'p12', 'u2',  to_date('7/30/2009 11:00','MM/DD/YYYY mi:ss'), 1  );
    INSERT INTO post_table VALUES(  'p13', 'u10',  to_date('7/22/2009 8:00','MM/DD/YYYY mi:ss'), 1  );
    INSERT INTO post_table VALUES(  'p14',  'u6', to_date('5/30/2009 23:00','MM/DD/YYYY mi:ss'), 1  );
    INSERT INTO post_table VALUES(  'p15', 'u3',  to_date('5/31/2009 2:00','MM/DD/YYYY mi:ss'), 0 ) ;
    INSERT INTO post_table VALUES( 'p16', 'u12',  to_date('6/20/2009 7:00','MM/DD/YYYY mi:ss'), -8 ) ;
    INSERT INTO post_table VALUES(  'p17', 'u20',  to_date('6/20/2009 9:00','MM/DD/YYYY mi:ss'), -8) ;
    INSERT INTO post_table VALUES(  'p18','u27',  to_date('9/15/2009 11:00','MM/DD/YYYY mi:ss'), -5 );
    INSERT INTO post_table VALUES(  'p19','u26', to_date('7/1/2009 20:00','MM/DD/YYYY mi:ss'), 0 ) ;
    INSERT INTO post_table VALUES(  'p20', 'u25',  to_date('7/2/2009 17:00','MM/DD/YYYY mi:ss'), -5 );
    INSERT INTO post_table VALUES(  'p21', 'u27',  to_date('7/3/2009 20:00','MM/DD/YYYY mi:ss'), 1) ;
    INSERT INTO post_table VALUES( 'p22',  'u2',  to_date('9/15/2009 13:00','MM/DD/YYYY mi:ss'), 1 ) ;
    INSERT INTO post_table VALUES( 'p23',  'u21',  to_date('5/30/2009 17:00','MM/DD/YYYY mi:ss'), -8  );
    INSERT INTO post_table VALUES( 'p24',  'u25', to_date('8/30/2009 20:00','MM/DD/YYYY mi:ss'), -5  );
    INSERT INTO post_table VALUES(  'p25',  'u18', to_date('9/13/2009 18:00','MM/DD/YYYY mi:ss'), -8  );
    INSERT INTO post_table VALUES(  'p26',  'u11',  to_date('9/9/2009 13:00','MM/DD/YYYY mi:ss'), -8  );
    INSERT INTO post_table VALUES( 'p27',  'u23',  to_date('9/10/2009 1:00','MM/DD/YYYY mi:ss'), -5  );
    INSERT INTO post_table VALUES( 'p28',  'u22', to_date('9/10/2009 14:00','MM/DD/YYYY mi:ss'), -4  );
    The output I get is
    USER_ID
    u25
    u9
    u20
    u5
    u27
    u8
    u21
    u23
    u22
    u26
    u10
    USER_ID
    u3
    u12
    u18
    u2
    u6
    u11
    17 rows selected.
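    The query itself was not preserved in this thread, only the test data and the resulting user list. Purely as a sketch (assuming, as the sample data suggests, that the time_zone columns hold numeric hour offsets), the smallest gap between login time and the latest post could be found like this:
    -- Sketch only: normalise both timestamps using the stored hour offsets,
    -- take each user's latest post, then rank the resulting gaps.
    SELECT user_id
    FROM  (SELECT u.user_id,
                  RANK() OVER (ORDER BY MAX(p.datepost - TO_NUMBER(p.time_zone) / 24)
                                        - (u.login_date - TO_NUMBER(u.login_time_zone) / 24)) AS rnk
           FROM   user_info u
                  JOIN post_table p ON p.user_id = u.user_id
           GROUP  BY u.user_id, u.login_date, u.login_time_zone)
    WHERE  rnk = 1;
    RANK keeps ties, so several users can come back if they share the same minimal difference.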

  • Need help with SQL Query with Inline View + Group by

    Hello Gurus,
    I would really appreciate your time and effort regarding this query. I have the following data set.
    Reference_No  Check_Number  Check_Date  Description                         Invoice_Number  Invoice_Type  Paid_Amount  Vendor_Number
    1234567       11223         7/5/2008    paid for cleaning                   44345563        I             20.00        19
    1234567       11223         7/5/2008    Adjustment for bad quality          44345563        A             10.00        19
    7654321       11223         7/5/2008    Adjustment from last billing cycle  23543556        A             50.00        19
    4653456       11223         7/5/2008    paid for cleaning                   35654765        I             30.00        19
    I am trying to write a query to aggregate paid_amount based on Reference_No, Check_Number, Payment_Date, Invoice_Number, Invoice_Type, Vendor_Number and display description with Invoice_type 'I' when there are multiple records with the same Reference_No, Check_Number, Payment_Date, Invoice_Number, Invoice_Type, Vendor_Number. When there are no multiple records I want to display the respective Description.
    The query should return the following data set
    Reference_No  Check_Number  Check_Date  Description                         Invoice_Number  Invoice_Type  Paid_Amount  Vendor_Number
    1234567       11223         7/5/2008    paid for cleaning                   44345563        I             10.00        19
    7654321       11223         7/5/2008    Adjustment from last billing cycle  23543556        A             50.00        19
    4653456       11223         7/5/2008    paid for cleaning                   35654765        I             30.00        19
    The following is my query. I am kind of lost.
    select B.Description, A.sequence_id,A.check_date, A.check_number, A.invoice_number, A.amount, A.vendor_number
    from (
    select sequence_id,check_date, check_number, invoice_number, sum(paid_amount) amount, vendor_number
    from INVOICE
    group by sequence_id,check_date, check_number, invoice_number, vendor_number
    ) A, INVOICE B
    where A.sequence_id = B.sequence_id
    Thanks,
    Nick

    It looks like it is a duplicate thread - correct me if I'm wrong in this case ->
    Need help with SQL Query with Inline View + Group by
    Regards.
    Satyaki De.
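    The worked answer lives in the other thread, so only a sketch of the usual technique is shown here (column names follow the posted data set and may differ from the real table; whether the adjustment amount should be added or subtracted is not clear from the sample output, so the SUM below may need a sign adjustment): KEEP (DENSE_RANK FIRST ...) pulls the description and type of the 'I' row whenever both types exist in a group.
    SELECT   reference_no,
             check_number,
             check_date,
             MIN(description)  KEEP (DENSE_RANK FIRST
                                     ORDER BY DECODE(invoice_type, 'I', 0, 1)) AS description,
             invoice_number,
             MIN(invoice_type) KEEP (DENSE_RANK FIRST
                                     ORDER BY DECODE(invoice_type, 'I', 0, 1)) AS invoice_type,
             SUM(paid_amount)                                                  AS paid_amount,
             vendor_number
    FROM     invoice
    GROUP BY reference_no, check_number, check_date, invoice_number, vendor_number;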

  • Please, need help with a query

    Hi !
    Please need help with this query:
    Needs to show (in cases of more than 1 loan offer) the latest create_date one time.
    Meaning, In cases the USER_ID, LOAN_ID, CREATE_DATE are the same need to show only the latest, Thanks!!!
    select distinct a.id,
    create_date,
    a.loanid,
    a.rate,
    a.pays,
    a.gracetime,
    a.emailtosend,
    d.first_name,
    d.last_name,
    a.user_id
    from CLAL_LOANCALC_DET a,
    loan_Calculator b,
    bv_user_profile c,
    bv_mr_user_profile d
    where b.loanid = a.loanid
    and c.NET_USER_NO = a.resp_id
    and d.user_id = c.user_id
    and a.is_partner is null
    and a.create_date between
    TO_DATE('6/3/2008 01:00:00', 'DD/MM/YY HH24:MI:SS') and
    TO_DATE('27/3/2008 23:59:00', 'DD/MM/YY HH24:MI:SS')
    order by a.create_date

    Take a look on the syntax :
    max(...) keep (dense_rank last order by ...)
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14200/functions056.htm#i1000901
    Nicolas.
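    As an illustrative sketch of that syntax applied to the posted query (only a few of the selected columns are shown; the remaining ones follow the same pattern), each non-grouping column gets its own KEEP (DENSE_RANK LAST ...) expression so that every value comes from the row with the latest create_date:
    SELECT   a.user_id,
             a.loanid,
             MAX(a.create_date)                                                AS create_date,
             MAX(a.rate)        KEEP (DENSE_RANK LAST ORDER BY a.create_date)  AS rate,
             MAX(a.emailtosend) KEEP (DENSE_RANK LAST ORDER BY a.create_date)  AS emailtosend
    FROM     CLAL_LOANCALC_DET a
    WHERE    a.is_partner IS NULL
    GROUP BY a.user_id, a.loanid;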

  • Please need help with this query

    Hi !
    Please need help with this query:
    Needs to show (in cases of more than 1 loan offer) the latest create_date one time.
    Meaning, In cases the USER_ID, LOAN_ID, CREATE_DATE are the same need to show only the latest, Thanks!!!
    select distinct a.id,
    create_date,
    a.loanid,
    a.rate,
    a.pays,
    a.gracetime,
    a.emailtosend,
    d.first_name,
    d.last_name,
    a.user_id
    from CLAL_LOANCALC_DET a,
    loan_Calculator b,
    bv_user_profile c,
    bv_mr_user_profile d
    where b.loanid = a.loanid
    and c.NET_USER_NO = a.resp_id
    and d.user_id = c.user_id
    and a.is_partner is null
    and a.create_date between
    TO_DATE('6/3/2008 01:00:00', 'DD/MM/YY HH24:MI:SS') and
    TO_DATE('27/3/2008 23:59:00', 'DD/MM/YY HH24:MI:SS')
    order by a.create_date

    Perhaps something like this...
    select id, create_date, loanid, rate, pays, gracetime, emailtosend, first_name, last_name, user_id
    from (
          select distinct a.id,
                          create_date,
                          a.loanid,
                          a.rate,
                          a.pays,
                          a.gracetime,
                          a.emailtosend,
                          d.first_name,
                          d.last_name,
                          a.user_id,
                          max(create_date) over (partition by a.user_id, a.loanid) as max_create_date
          from CLAL_LOANCALC_DET a,
               loan_Calculator b,
               bv_user_profile c,
               bv_mr_user_profile d
          where b.loanid = a.loanid
          and   c.NET_USER_NO = a.resp_id
          and   d.user_id = c.user_id
          and   a.is_partner is null
          and   a.create_date between
                TO_DATE('6/3/2008 01:00:00', 'DD/MM/YY HH24:MI:SS') and
                TO_DATE('27/3/2008 23:59:00', 'DD/MM/YY HH24:MI:SS')
    )
    where create_date = max_create_date
    order by create_date

  • HT1365 Hi, can anyone help with this issue regarding my wireless Magic Mouse? When I'm on Google Chrome and scrolling down the page I always have YouTube running in the background, but the audio cuts/spits/pops. Can anyone help me with this?

    Hi, can anyone help with this issue regarding my wireless Magic Mouse? When I'm on Google Chrome and scrolling down the page I always have YouTube running in the background, but the audio cuts/spits/pops. Can anyone help me with this?

    The figures you mention only make sense on your intranet. Are you still using the same wireless router? The Verizon one is somewhat limited as far as max wireless-n performance goes. For one thing, it only has a 2.4 GHz radio. Like many people who wanted wireless-n performance before they even added a wireless-n gigabit router, I have my own router handling my wireless-n network.

  • Please help with tricky query

    I need help with an SQL query (if it can be accomplished with a query at all).
    I'm going to create a table with a structure similar to:
    Article_Name varchar2(30), Author_Name varchar2(30), Position varchar2(2). The Position field is basically the position of an article author in the author list, e.g. if there is one author, his/her position is 0; if there are 2, then the first author is 0, the second is 1, etc.
    Article_Name Author_Name Position
    Outer Space Smith 0
    Outer Space Blake 1
    How can I automate creation of Position, based on number of authors on the fly? Let's say I have original table without Position, but I want to create a new table that will have this information.
    Regards

    If you have an existing table whose structure doesn't tell you what position the author is in, what's the algorithm you'd use to determine who was the first author, the second author, etc? If you issue a select query on a table without providing an "order by" clause, Oracle makes no guarantees about the order in which it retrieves rows.
    As an aside, why would you store position number in a varchar2 field? If it's a number, it ought to be stored as a number.
    Justin
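    If the base table does contain something that defines the author order (an author_seq column or an insert timestamp, both hypothetical here, since the described table has no such column), the position could be generated on the fly with ROW_NUMBER, as in this sketch:
    -- Sketch: derive Position from a hypothetical ordering column author_seq.
    CREATE TABLE article_author_pos AS
    SELECT article_name,
           author_name,
           ROW_NUMBER() OVER (PARTITION BY article_name
                              ORDER BY author_seq) - 1 AS position
    FROM   article_authors;
    Without such a column, as Justin points out, any assignment of positions would be arbitrary because Oracle does not guarantee retrieval order.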

  • Need urgent help with the query - Beginer

    Hello - I need help with a query to populate data in a table. Here is the scenario.
    Source1
    MnthID BranchCod CustID SegCode FXStatus ProfStatus Profit
    200712 B1 C1 20 Y Y 100
    Source2
    MnthID BranchCod CustID ProdCode ProdIndex
    200712 B1 C1 12 1
    200712 B1 C2 12 0
    Destination
    MnthID BranchCod SegCode ProdCode CountSegCust CountProdCust ProfitProdCust
    Condition and Calculations:
    1) Source1 customers are the base customers. If Source2 has a customer who is not in Source1, then that customer's record should not be fetched.
    2) SegCode, FXStatus and ProfStatus form one variable in the destination table. [ SegCode = SegCode + FXStatus (if FXStatus = Y) + ProfStatus (if FXStatus = Y) ]
    3)CountSegCust = CountCustID Groupby MnthID,BranchCod,SegCode Only.
    4)CountProdCust = CountCustID Groupby MnthID,BranchCod,SegCode,ProdCode (when ProdIndex = 1)
    5)ProfitProdCust = Sum of Profit of Customers Groupby MnthID,BranchCod,SegCode,ProdCode (when ProdIndex = 1)
    Apologies for bad formatting.
    Thanks in advance!!

    A total guess indeed.
    It's not clear whether some aggregation can be done (summing counts of grouped data might cause some customers to be counted more than once).
    insert into destination
    select mnthid,branchcod,segcode,prodcode,countsegcust,countprodcust,profitprodcust
      from (select s1.mnthid,
                   s1.branchcod,
                   s1.segcode || case s1.fxstatus when 'Y' then s1.fxstatus || s1.profstatus end segcode,
                   s2.prodcode,
                   count(s1.custid) over (partition by s1.mnthid,
                                                       s1.branchcod,
                                                       s1.segcode || case s1.fxstatus when 'Y' then s1.fxstatus || s1.profstatus end
                                              order by null
                                         ) countsegcust,
                   count(case s2.prodindex when 1
                                            then s1.custid
                         end
                        ) over (partition by s1.mnthid,
                                             s1.branchcod,
                                                 s1.segcode || case s1.fxstatus when 'Y' then s1.fxstatus || s1.profstatus end,
                                                 s2.prodcode
                                    order by null
                               ) countprodcust,
                   sum(case s2.prodindex when 1
                                          then s1.profit
                       end
                      ) over (partition by s1.mnthid,
                                           s1.branchcod,
                                               s1.segcode || case s1.fxstatus when 'Y' then s1.fxstatus || s1.profstatus end,
                                               s2.prodcode
                                  order by null
                             ) profitprodcust,
                   row_number() over (partition by s1.mnthid,
                                                   s1.branchcod,
                                                       s1.segcode || case s1.fxstatus when 'Y' then s1.fxstatus || s1.profstatus end,
                                                       s2.prodcode
                                          order by null
                                     ) the_row
              from source1 s1,source2 s2
             where s1.mnthid = s2.mnthid
               and s1.branchcod = s2.branchcod
               and s1.custid = s2.custid
           )
     where the_row = 1
    Regards
    Etbin

  • Help with slooow query

    I created a blogging tool for my students to use as I teach
    them internet safety and cyber citizenship. I am no CF master, but
    I dabble a little bit here and there. I need some help with this
    query. It is running extremely slow, which means I have probably
    created some horrendous loop in this query. If any one out there
    has a better solution for this query, I and my middle school
    students would be extremely grateful.
    Here's what I would like it to do. I have two tables, one for
    the blog messages and another for comments. The comments are linked
    to their respective blog messages through a common database field.
    When someone clicks on a link to read a student's blog, a query
    runs which pulls all of the blog messages for that user, the
    comments, and it also counts the number of comments entries for
    each message so that I can place a total # of comments under each
    blog message.

    Not sure why you have this like this: (Select
    count(commentid) from comments where comments.blogid = blog.blogid)
    or this twice: blog.blogusersid = #fname#
    You need to make sure that the comments.blogid and
    blog.blogid fields are indexed. Does this query work any faster?
    <cfquery Name="Myblog" datasource="blog">
    SELECT b.blogid, b.btitle, b.bcontent, b.bdate,
    b.blogusersid, b.fname, b.lname, b.blogpict, b.pictlocation,
    b.userid, c.commentid, c.blogid, b.lastupdated, COUNT(c.commentid)
    AS cc
    FROM blog AS b
    INNER JOIN comments AS c ON c.blogid = b.blogid
    WHERE b.blogusersid = #fname#
    GROUP BY b.blogid, b.btitle, b.bcontent, b.bdate,
    b.blogusersid, b.fname, b.lname, b.blogpict, b.pictlocation,
    b.userid, c.commentid, c.blogid, b.lastupdated
    ORDER BY b.bdate
    </cfquery>
    ..... but I'm not sure that you will be getting the comment
    count that you want with either query.
    Phil
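    Not from the thread, just a sketch of a common alternative: pre-aggregate the comments in a derived table and LEFT JOIN it, so that the query no longer groups on c.commentid (which forces COUNT(c.commentid) to 1 per row) and blog entries with zero comments still appear. Whether COALESCE is available depends on the database behind the datasource.
    <cfquery Name="Myblog" datasource="blog">
    SELECT b.blogid, b.btitle, b.bcontent, b.bdate,
           b.blogusersid, b.fname, b.lname, b.blogpict, b.pictlocation,
           b.userid, b.lastupdated,
           COALESCE(cc.comment_count, 0) AS cc
    FROM blog AS b
    LEFT JOIN (SELECT blogid, COUNT(*) AS comment_count
               FROM comments
               GROUP BY blogid) AS cc ON cc.blogid = b.blogid
    WHERE b.blogusersid = #fname#
    ORDER BY b.bdate
    </cfquery>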

  • Help with iTunes database and Time Machine after Lion clean install

    I did a clean install of OS X Lion and think I've given myself problems with my Snow Leopard Time Machine backup and with my iTunes Library database files.
    My iMac (2007) had become very sluggish so I opted for a clean install.  Before the install of Lion my Snow Leopard 10.6.8 internal 1TB HD was at about  600GBs.  I was using Time Machine to backup to an external 1TB drive.  I had my iTunes 10.?  library on the same drive.
    For the Lion install I partitioned my internal 1TB HD into two partitions - a 250GB Lion boot partition and the rest for Lion data files.
    Now I  have two problems - 1.  I can't  work out how to re-connect my Time Machine backups, since they were related to a much bigger original drive, and 2.  I can't seem to find my iTunes database files.
    I also don't want to restore any application files at this stage, as I'm determined to do a fresh install of only the applications I need as I go along, and hopefully  avoid the issues of sluggishness I had with Snow Leopard.
    I have all my iTunes files in their pre-Lion external HD folder, but that folder does not seem to  have the iTunes Library.XML or iTunes Library files.  I didn't delete them so I'm hoping they were in my iTunes folder on my SL boot drive.  But that drive was deleted during the install, so I'll be depending on Time Machine to restore the database.
    Can anyone suggest a way to deal with Time Machine for my Lion install, and a way to restore my iTunes database?  I'd like to keep as much of the Snow Leopard Time Machine data as I can, while continuing to do Time Machine backups from my Lion installation.
    Anyone know where iTunes stores those database files, if not in the external drive iTunes folder? Spotlight search doesn't find the files, but is there a way to search my old Time Machine backups without having the Time Machine backups folder re-connected in Lion?
    Thanks.

    My iTunes folder is organised like this, if it helps (screenshot not preserved here):
    And iTunes uses them from the Preferences like this (screenshot not preserved here):
    Regards,
    Colin R.

  • Help with my query

    Hello all,
    Total newbie to this PL/SQL stuff, so I desperately need help with my query.
    BOOKING_ID     BOOKING_STATUS     BOOKING_DATE     BOOKING_TIME     BOOKING_DATE_TIME
    1234567     CANCELLED     20090301     37252     5/1/2010 10:20
    1234567     CANCELLED 20090301     44229     5/1/2010 12:17
    1234567     BOOKED     20090301     39462     5/1/2010 10:57
    1234567     CANCELLED     20090301     43549     5/1/2010 12:05
    9671111     BOOKED     20090301     68124     5/1/2010 12:57
    9671111     CANCELLED     20090301     45001     5/1/2010 12:05
    How do I write my query such that I would get the following results:
    BOOKING_ID     BOOKING_STATUS     BOOKING_DATE     BOOKING_TIME     BOOKING_DATE_TIME
    9671111     BOOKED     20090301     68124     2/4/2010 12:17
    Basically, I am looking at the latest BOOKING_TIME and making sure the BOOKING_STATUS=BOOKED; if not, don't even bother bringing back the result. Hence you see that BOOKING_ID=1234567 is not required, since at the latest BOOKING_TIME=44229 the BOOKING_STATUS=CANCELLED.
    Any help is greatly appreciated.
    Thank you in advance for your help.
    Stanley Ho

    Hi, Stanley,
    Welcome to the forum!
    Whenever you have a question, please post your sample data in a form that people can actually use. CREATE TABLE and INSERT statements are perfect.
    For example:
    CREATE TABLE     booking
    (     booking_id          NUMBER (8)
    ,     booking_status          VARCHAR2 (10)
    ,     booking_date_time     DATE
    );
    INSERT INTO  booking (booking_id, booking_status, booking_date_time)
                  VALUES (1234567,        'CANCELLED',        TO_DATE ('5/1/2010 10:20', 'MM/DD/YYYY HH24:MI'));
    INSERT INTO  booking (booking_id, booking_status, booking_date_time)
                  VALUES (1234567,        'CANCELLED',        TO_DATE ('5/1/2010 12:17', 'MM/DD/YYYY HH24:MI'));
    INSERT INTO  booking (booking_id, booking_status, booking_date_time)
                  VALUES (1234567,        'BOOKED',        TO_DATE ('5/1/2010 10:57', 'MM/DD/YYYY HH24:MI'));
    INSERT INTO  booking (booking_id, booking_status, booking_date_time)
                  VALUES (1234567,        'CANCELLED',        TO_DATE ('5/1/2010 12:05', 'MM/DD/YYYY HH24:MI'));
    INSERT INTO  booking (booking_id, booking_status, booking_date_time)
                  VALUES (9671111,        'BOOKED',        TO_DATE ('5/1/2010 12:57', 'MM/DD/YYYY HH24:MI'));
    INSERT INTO  booking (booking_id, booking_status, booking_date_time)
                  VALUES (9671111,        'CANCELLED',        TO_DATE ('5/1/2010 12:05', 'MM/DD/YYYY HH24:MI'));
    What you want is called a Top-N Query.
    Here's one way to do it:
    WITH     got_rnum  AS
         SELECT     booking.*
         ,     ROW_NUMBER () OVER ( PARTITION BY  booking_id
                                    ORDER BY      booking_date_time     DESC
                                  ) AS rnum
         FROM    booking
    SELECT     booking_id
    ,     booking_status
    ,     TO_CHAR (booking_date_time, 'YYYYMMDD')               AS booking_date
    ,     TO_CHAR (booking_date_time, 'SSSSS')               AS booking_time
    ,     TO_CHAR (booking_date_time, 'MM/DD/YYYY HH24:MI')     AS booking_date_time
    FROM     got_rnum
    WHERE     rnum          = 1
    AND     booking_status     = 'BOOKED'
    ;
    Notice that you don't need PL/SQL to do this; plain old SQL is good enough.
    Of course, if you're using PL/SQL for other reasons, you can use a query like this within PL/SQL.
    Dates (including time of day) should always be stored in DATE columns.
    If you have a DATE column, like booking_date_time, then there's no need for redundant date and time columns.
    You can always display just the year-month-day, or just the time, in any format, as I did above.
    The output from the query above, with the data above, is:
    BOOKING_ID BOOKING_ST BOOKING_ BOOKI BOOKING_DATE_TIM
       9671111 BOOKED     20100501 46620 05/01/2010 12:57
    I realize the booking_date and booking_time columns aren't quite what you posted. If they are not derivable from booking_date_time, then you probably do need separate columns for them, and those columns can easily be added to the query above.
    Edited by: Frank Kulash on Feb 5, 2010 4:41 PM
    KEEP (DENSE_RANK ...), like Max used below, is a great tool to have in your kit. The problem with it is that you have to repeat a lot of stuff for every column, so the more columns you have in your output, the more tedious it gets. ROW_NUMBER scales much better, and is adaptable to more situations. I suggest you master ROW_NUMBER first, and look into KEEP (DENSE_RANK ...) later.

  • Query on Time Dependent Info object

    Hi ,
    I am trying to create a query on a time-dependent InfoObject. The InfoObject is 0EMPLOYEE, and since it is time dependent it automatically has the "date from" and "date to" fields in the InfoObject master data. However, these fields do not come up as characteristics when I create a query on this InfoObject.
    Can you please let me know why, or am I missing something? I know I can get them if I use the InfoObject in a cube or DSO, but I want to create the query from this InfoObject directly. Please help.
    Thanks ,
    Regards
    Ashwin G

    Hi,
    In the 0EMPLOYEE_ATTR datasource the start and end dates are mapped 1:1 with 0EMPLOYEE as the target, but on the attribute tab of 0EMPLOYEE you may not see any start and end date attributes.
    If you do have them as attributes, you should be able to assign them as "read from master data".
    Otherwise the routine would be
    SELECT STARTDATE from /BI0/MEMPLOYEE where employee = source_fields-employee.
    for the start date and
    SELECT ENDDATE from /BI0/MEMPLOYEE where employee = source_fields-employee.
    for the end date.
    Regards,
    Nagaraju.V

  • I need help with a query

    Hello everyone,
    First, some background information.  We have in place a table which records status changes on a work order.  The orders normally go through each status only once, however they do occasionally reuse status indicators.  For example, an order is placed on hold, released from hold, placed back on hold, released again and so on.  The sample data provided is an example of an order with data repeating itself.  I need some help with writing a query on this table.
    LOC_CODE  WO_NO  UPDATETIME           WO_STATUS_OLD  WO_STATUS_NEW
    xxx       12345  05-01-2013 10:24:00  WR             SP
    xxx       12345  05-01-2013 10:39:00  SP             PM
    xxx       12345  05-01-2013 11:52:00  PM             ES
    xxx       12345  05-01-2013 11:58:00  ES             MO
    xxx       12345  05-01-2013 12:03:00  MO             ES
    xxx       12345  05-01-2013 12:38:00  ES             AT
    xxx       12345  05-01-2013 12:48:00  AT             RS
    xxx       12345  05-01-2013 13:01:00  RS             RA
    xxx       12345  05-01-2013 13:26:00  RA             RS
    xxx       12345  05-01-2013 13:36:00  RS             RA
    xxx       12345  05-01-2013 15:35:00  RA             RS
    xxx       12345  05-01-2013 15:42:00  RS             RA
    xxx       12345  05-01-2013 16:04:00  RA             RS
    xxx       12345  05-01-2013 16:42:00  RS             RA
    xxx       12345  05-01-2013 19:28:00  RA             FD
    xxx       12345  05-01-2013 19:28:00  FD             SO
    The query (which will in turn be used for a view) will display the elapsed time between status updates (subtract UPDATETIME from that of the preceding record). Only the first record for each order at a location would have no elapsed time. The result should look like this:
    LOC_CODE  WO_NO  UPDATETIME           WO_STATUS_OLD  WO_STATUS_NEW  MINUTES_ELAPSED
    xxx       12345  05-01-2013 10:24:00  WR             SP             {null}
    xxx       12345  05-01-2013 10:39:00  SP             PM             15
    xxx       12345  05-01-2013 11:52:00  PM             ES             73
    xxx       12345  05-01-2013 11:58:00  ES             MO             6
    xxx       12345  05-01-2013 12:03:00  MO             ES             5
    xxx       12345  05-01-2013 12:38:00  ES             AT             35
    xxx       12345  05-01-2013 12:48:00  AT             RS             10
    xxx       12345  05-01-2013 13:01:00  RS             RA             13
    xxx       12345  05-01-2013 13:26:00  RA             RS             25
    xxx       12345  05-01-2013 13:36:00  RS             RA             10
    xxx       12345  05-01-2013 15:35:00  RA             RS             119
    xxx       12345  05-01-2013 15:42:00  RS             RA             7
    xxx       12345  05-01-2013 16:04:00  RA             RS             22
    xxx       12345  05-01-2013 16:42:00  RS             RA             38
    xxx       12345  05-01-2013 19:28:00  RA             FD             166
    xxx       12345  05-01-2013 19:28:00  FD             SO             0
    I have been trying various queries, but no luck as of yet.  I would appreciate your input.
    Thank you,
    Patrick

    Sorry about the late reply.  I had an unexpected meeting to attend.  Here is the requested information
    We are running Oracle Database 11g Release 11.2.0.3.0 - 64bit Production.
    --  DDL for Table WO_STATUS
      CREATE TABLE "WO_STATUS"
       ( "LOC_CODE" VARCHAR2(3 BYTE),
         "WO_NO" NUMBER,
         "UPDATE_DATETIME" DATE,
         "WO_STATUS_OLD" VARCHAR2(2 BYTE),
         "WO_STATUS_NEW" VARCHAR2(2 BYTE)
       );
    INSERT INTO WO_STATUS (LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 10:24:00'},'WR','SP');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 10:39:00'},'SP','PM');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 11:52:00'},'PM','ES');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 11:58:00'},'ES','MO');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 12:03:00'},'MO','ES');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 12:38:00'},'ES','AT');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 12:48:00'},'AT','RS');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 13:01:00'},'RS','RA');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 13:26:00'},'RA','RS');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 13:36:00'},'RS','RA');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 15:35:00'},'RA','RS');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 15:42:00'},'RS','RA');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 16:04:00'},'RA','RS');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 16:42:00'},'RS','RA');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 19:28:00'},'RA','FD');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 19:28:00'},'FD','SO');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12346,{ts '2013-06-18 09:35:00'},'PM','ES');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12346,{ts '2013-06-18 09:37:00'},'ES','AT');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12346,{ts '2013-06-18 09:45:00'},'AT','RS');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12346,{ts '2013-06-18 09:51:00'},'RS','RA');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12346,{ts '2013-06-18 10:01:00'},'RA','FD');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12346,{ts '2013-06-18 10:02:00'},'FD','SO');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12346,{ts '2013-06-18 10:23:00'},'SO','MP');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12347,{ts '2013-06-18 08:29:00'},'WR','SP');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12347,{ts '2013-06-18 09:07:00'},'SP','PM');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12347,{ts '2013-06-18 09:48:00'},'PM','ES');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12347,{ts '2013-06-18 09:51:00'},'ES','AT');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12347,{ts '2013-06-18 10:19:00'},'AT','FD');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12347,{ts '2013-06-18 10:20:00'},'FD','SO');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12347,{ts '2013-06-18 10:24:00'},'SO','PY');
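    No solution query appears in the thread as archived here, so purely as a sketch against the posted DDL: LAG over the update time, partitioned by location and work order, gives the elapsed minutes directly.
    -- Sketch: minutes elapsed since the previous status change of the same
    -- work order at the same location (NULL for the first record, as requested).
    SELECT loc_code,
           wo_no,
           update_datetime,
           wo_status_old,
           wo_status_new,
           ROUND((update_datetime
                  - LAG(update_datetime) OVER (PARTITION BY loc_code, wo_no
                                               ORDER BY update_datetime)) * 24 * 60) AS minutes_elapsed
    FROM   wo_status
    ORDER  BY loc_code, wo_no, update_datetime;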
