Query for monitor data upload

Hi, Experts
    Normally in the cube we only have the request ID, which is just a number and carries nothing else (request date, time, selection, type of data upload ...).
    Can I build a BEx query that shows the same information as the cube's Manage screen? We need to check whether a request with a duplicated selection was left undeleted, or whether a request is missing, since several DataSources load into one cube.
    I cannot find any useful query among the BW statistics queries.
    Thanks in advance.

I also cannot find enough information in table RSMONICDP.
In our case, cube 0COOM_C02 has many InfoSources; some are full uploads and some are delta uploads. All of the InfoPackages are scheduled in one process chain.
When I look at the log of this process chain, I see that errors occurred on some days, so the chain sometimes did not finish; that means cube 0COOM_C02 has missing requests and duplicated requests.
It is hard to find all of the problem requests through cube Manage because there are so many requests and the window is so small. So my question is: is there any BEx query or BW table that shows information similar to the Request tab of cube Manage,
so that I can analyze it in Excel? That would be quite easy for me.
Thank you all.

Similar Messages

  • Reg:Efficient solution for a data upload scenario

    Hi All,
            I have the following task.
    Data is required from a legacy system (which generates data only in the form of flat files) into SAP R/3 as FB01 journals, and the output file should be generated periodically (daily, weekly, fortnightly, etc.).
    Solution Approaches:
    1) Write a BDC program to extract the data.
    2) Write an ABAP program to populate an IDoc (if a standard IDoc is available) or generate an outbound proxy (if a standard IDoc is not available) to push the data into SAP XI.
    Could anyone tell me which would be the best and most efficient approach for this task? I need your recommendations.
    Thanks in Advance.
    B.Lavanya
    Edited by: Lavanya Balanandham on Mar 31, 2008 2:23 PM

    Hi Lavanya,
    Required data from a legacy system (flat files only) into SAP R/3 as FB01 journals - use BDC for this, because it handles large source files better.
    The output file should be generated periodically (daily, weekly, fortnightly, etc.) - if this output file contains an acknowledgment for the data uploaded by the above process, create an ABAP report for it and schedule it. But if this output contains some other IDoc data that you need to send as a file to a third-party system, go for SAP XI, provided the IDoc data is not too large; if the IDoc volume is huge, just create an ABAP report that writes the data to a file on the application server and FTP the file to the third-party system.
    Regards,
    Rajeev Gupta

  • Function module Vs BDC for master data upload

    Hi ,
    Please advise whether we should use the following function modules for master data upload or whether we should go for BDC.
    MP_RFC_SINGLE_CREATE
    MP_RFC_INACT_CHANGE
    MPLAN_CREATE
    MPLAN_CHANGE
    MPLAN_SET_DELETION_INDICATOR
    ASSET_MASTERRECORD_MAINTENANCE
    MPLAN_ITEM_CREATE
    MPLAN_ITEM_CHANGE
    GL_ACCT_MASTER_SAVE
    Actually, we have already used these function modules in our upload program, but we are not sure if these function modules will create any data inconsistency.
    Please let me know, if we should continue using the FMs, or there is any risk using the FMs and we should replace them by BDC.
    Thanks in advance.

    Hi Vikram,
    It is better to search for a BAPI for uploading the master data, because we have problems with both BDC and FMs.
    If you use FMs, they may not contain all the fields you want. If you go for BDC, it is not maintainable across future releases; if you upgrade, the screens may change.
    If you don't have any BAPI, then it is better to go for BDC.
    Thanks

  • LSMW used only for master data upload?

    Hi
    Can you please let me know if LSMW is used only for master data upload, or can we also use it for transaction data?

    Hi Christino.
    I have come across standard SDN threads which deal with uploading master data; please refer to them:
    SDN reference for uploading master data using LSMW: "how can we upload master data by using LSMW"
    SDN reference for which upload approach is preferred (master data or transaction data): "Which one is better for uploading data LSMW or ECATT ?"
    Good Luck & Regards.
    HARSH

  • Query for finding data during special holidays

    Hi, I have a table for special holidays that looks like this:
    PROFILE_DAY          VAR_DATE
    REGULAR_HOLIDAY      1/1/2013
    H_WEEK_THURSDAY      3/28/2013
    H_WEEK_FRIDAY        3/29/2013
    REGULAR_HOLIDAY      12/24/2013
    REGULAR_HOLIDAY      12/25/2013
    REGULAR_HOLIDAY      12/31/2013
    And another table (LOAD_PROFILE_TEST), which contains LOAD_PROF1 values from (TIME_EQ) 0 to 24 at intervals 0.25 for (PROFILE_DAY) MONDAY to SUNDAY including REGULAR_HOLIDAY, H_WEEK_THURSDAY and H_WEEK_FRIDAY.  All in all, this table contains 970 records (97 records between 0 to 24 with interval of 0.25 per PROFILE_DAY, with 10 distinct PROFILE_DAY).
    TIME_EQ   PROFILE_DAY       LOAD_PROF1   LOAD_PROF2
    0         REGULAR_HOLIDAY   11.47
    0.25      REGULAR_HOLIDAY   11.27
    0.5       REGULAR_HOLIDAY   11.3
    0.75      REGULAR_HOLIDAY   11.08
    0         MONDAY            11.27
    0.25      MONDAY            11.33
    0.5       MONDAY            11.18
    Now, I have this query to update the value of LOAD_PROF2 of said table whenever the parameters V_DATE_OUT and V_DATE_IN are entered:
    UPDATE LOAD_PROFILE_TEST
       SET LOAD_PROF2 = LOAD_PROF1 + :LOAD_DIFF
     WHERE UPPER(PROFILE_DAY) IN (select UPPER(to_char(:V_DATE_OUT + (level-1), 'fmDAY'))
                                    from dual
                                  connect by level <= :V_DATE_IN - :V_DATE_OUT + 1)
    where :LOAD_DIFF is a certain pre-determined value.
    This query works fine if I am trying to update regular days from MONDAY to SUNDAY.  What I would like to do is determine whether any of the dates between the two parameters, V_DATE_OUT and V_DATE_IN, fall on one of the holidays in the first table (the holidays table), and then update only the appropriate rows.  For example, V_DATE_OUT = 12/02/2013, Monday, and V_DATE_IN = 12/06/2013, Friday: the query above would update the values of LOAD_PROF2 for PROFILE_DAY MONDAY to FRIDAY, corresponding to the dates 12/02/2013 to 12/06/2013.  If however V_DATE_OUT = 12/23/2013, Monday, and V_DATE_IN = 12/27/2013, Friday, it should update the rows corresponding to PROFILE_DAY MONDAY for 12/23/2013, THURSDAY for 12/26/2013, FRIDAY for 12/27/2013, and REGULAR_HOLIDAY for the dates 12/24/2013 and 12/25/2013, since these two are included in the first table (the table of holidays).  The same applies whenever V_DATE_OUT and/or V_DATE_IN fall on 3/28/2013 or 3/29/2013.  All other dates not included in the table of special holidays are treated according to the day they fall on (Monday through Sunday).  I hope my point is clear.  Thank you in advance.

    Thanks for your reply.  First of all, I am using Forms [32 Bit] Version 10.1.2.0.2 (Production).  I will try to explain a little further, but I don't know if I can give the exact details of the tables since that would be voluminous. First, I have a table REG_HOLIDAYS, which contains the following:
    PROFILE_DAY          VAR_DATE
    REGULAR_HOLIDAY      1/1/2013
    H_WEEK_THURSDAY      3/28/2013
    H_WEEK_FRIDAY        3/29/2013
    REGULAR_HOLIDAY      12/24/2013
    REGULAR_HOLIDAY      12/31/2013
    REGULAR_HOLIDAY      12/25/2013
    Now I have another table, LOAD_PROFILE_TEST, which I will simplify just to show what I want to do:
    CREATE TABLE LOAD_PROFILE_TEST
    (
      TIME_EQ      NUMBER (5, 2),
      PROFILE_DAY  VARCHAR2 (15 BYTE),
      LOAD_PROF1   NUMBER (6, 2),
      LOAD_PROF2   NUMBER (6, 2)
    );
    Here are the sample values of this table:
    TIME_EQ   PROFILE_DAY       LOAD_PROF1   LOAD_PROF2
    0         Monday            2
    0         Tuesday           2.1
    0         Wednesday         2.3
    0         Thursday          2.5
    0         Friday            2.2
    0         Saturday          2.4
    0         Sunday            2.3
    0         Regular_holiday   1.9
    0.25      Monday            2.1
    0.25      Tuesday           2.1
    0.25      Wednesday         2.4
    0.25      Thursday          2.2
    0.25      Friday            2.5
    0.25      Saturday          2.3
    0.25      Sunday            2.3
    0.25      Regular_holiday   2.5
    However, in the actual table, TIME_EQ will start at 0 and go up to 24 at intervals of 0.25 (i.e. 0, 0.25, 0.5, 0.75, 1, 1.25, 1.5, ... 23.5, 23.75, 24).  So for a PROFILE_DAY of 'Monday' there will be 97 rows corresponding to TIME_EQ 0 to 24.  The same is true for Tuesday, Wednesday, through Sunday, including Regular_holiday.  All in all, this table would contain 776 rows.  LOAD_PROF1 values are random values initially entered along with PROFILE_DAY and TIME_EQ.
    The first goal is to UPDATE this table (LOAD_PROFILE_TEST) by updating the column LOAD_PROF2, adding a certain parameter value, :LOAD_DIFF, say 0.1. Now, I will show two scenarios to illustrate what I would like to happen.  First, I have two date parameters, :V_DATE_OUT and :V_DATE_IN.
    First Case, V_DATE_OUT = 12/15/2013 (Monday); V_DATE_IN = 12/22/2013 (Sunday)
    I will have this query to update said table:
    UPDATE LOAD_PROFILE_TEST
       SET LOAD_PROF2 = LOAD_PROF1 + :LOAD_DIFF
     WHERE UPPER(PROFILE_DAY) IN (select UPPER(to_char(:V_DATE_OUT + (level-1), 'fmDAY'))
                                    from dual
                                  connect by level <= :V_DATE_IN - :V_DATE_OUT + 1);
    The output of this query would be:
    TIME_EQ   PROFILE_DAY       LOAD_PROF1   LOAD_PROF2
    0         Monday            2            2.1
    0         Tuesday           2.1          2.2
    0         Wednesday         2.3          2.4
    0         Thursday          2.5          2.6
    0         Friday            2.2          2.3
    0         Saturday          2.4          2.5
    0         Sunday            2.3          2.4
    0         Regular_holiday   1.9
    0.25      Monday            2.1          2.2
    0.25      Tuesday           2.1          2.2
    0.25      Wednesday         2.4          2.5
    0.25      Thursday          2.2          2.3
    0.25      Friday            2.5          2.6
    0.25      Saturday          2.3          2.4
    0.25      Sunday            2.3          2.4
    0.25      Regular_holiday   2.5
    Since 12/15/2013 through 12/22/2013 runs from Monday to Sunday without any of those dates appearing in the first table, REG_HOLIDAYS, all the rows with Monday to Sunday are updated.
    Second Case, V_DATE_OUT = 12/23/2013 (Monday); V_DATE_IN = 12/29/2013 (Sunday)
    Take note that 12/24 and 12/25 are included in the first table, REG_HOLIDAYS, so I need a query that produces the following output:
    TIME_EQ   PROFILE_DAY       LOAD_PROF1   LOAD_PROF2
    0         Monday            2            2.1
    0         Tuesday           2.1
    0         Wednesday         2.3
    0         Thursday          2.5          2.6
    0         Friday            2.2          2.3
    0         Saturday          2.4          2.5
    0         Sunday            2.3          2.4
    0         Regular_holiday   1.9          2.0
    0.25      Monday            2.1          2.2
    0.25      Tuesday           2.1
    0.25      Wednesday         2.4
    0.25      Thursday          2.2          2.3
    0.25      Friday            2.5          2.6
    0.25      Saturday          2.3          2.4
    0.25      Sunday            2.3          2.4
    0.25      Regular_holiday   2.5          2.6
    As can be seen, since 12/23/2013 through 12/29/2013 runs from Monday to Sunday, the rows with Monday up to Sunday should be updated. HOWEVER, since 12/24 and 12/25, which fall on a Tuesday and a Wednesday, were also declared as holidays (included in the REG_HOLIDAYS table), instead of updating the rows with PROFILE_DAY of Tuesday and Wednesday, the query should update the rows with Regular_holiday as PROFILE_DAY and leave the Tuesday and Wednesday rows as they are.  Additional note: days declared in the REG_HOLIDAYS table with PROFILE_DAY of Regular_holiday are treated as distinct, as in the case of 12/24 and 12/25.
    The query you gave me returns the correct value if the days between V_DATE_OUT and V_DATE_IN are Monday to Sunday only; otherwise it always returns Regular_holiday alone, regardless of the other dates queried.  In the first case above it would give Monday to Sunday (correct), but for the second case it only gives Regular_holiday; the other days (Monday, Thursday, Friday, Saturday and Sunday) are not output.
    I hope this is clearer now, since I will still need this for another query that I will ask about after resolving this issue.  Thanks a lot for your help.
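    For what it's worth, here is a minimal sketch of how the calendar expansion could be joined to the holiday table, so that holiday dates map to their REG_HOLIDAYS profile and every other date maps to its day name. It assumes VAR_DATE is stored as a DATE with no time portion and that the database supports ANSI join syntax; adjust names as needed:
    UPDATE LOAD_PROFILE_TEST
       SET LOAD_PROF2 = LOAD_PROF1 + :LOAD_DIFF
     WHERE UPPER(PROFILE_DAY) IN (
             -- one row per calendar day between the two parameters;
             -- a holiday date contributes its REG_HOLIDAYS profile instead of its weekday name
             SELECT UPPER(NVL(h.PROFILE_DAY, TO_CHAR(d.day_date, 'fmDAY')))
               FROM (SELECT :V_DATE_OUT + (LEVEL - 1) AS day_date
                       FROM dual
                     CONNECT BY LEVEL <= :V_DATE_IN - :V_DATE_OUT + 1) d
                    LEFT JOIN REG_HOLIDAYS h
                           ON h.VAR_DATE = d.day_date);
    Note this only marks which profile days to touch; if each holiday date in the range has to contribute its own separate increment, the update would instead need to count the matches per profile day.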

  • Optimization for bulk data upload

    Hi everyone!
    I've got the following issue:
    I have to do a bulk data upload using JMS, deployed on GlassFish 2.1, to process and validate data against an Oracle 10g database before it is inserted.
    My web interface loads a file and then delegates the processing to a stateless session bean, which reads N lines and then sends a message to a JMS queue. The JMS consumer has to parse each line, validate it against the data already in the DB, and finally persist the new data.
    This process is very processing-intensive, and I need to improve the run time. I tried changing the GlassFish default JMS and JDBC pool sizes, but it did not make a huge difference.
    Do you have any advice that could help me?
    Thanks in advance!

    Hi! Thank you for your answer!
    The heavy processing is in the MDB.
    I am grouping every N read lines in the EJB and then sending the message to the JMS queue. The MDB then processes and persists each line into different related tables.
    Thanks again!

  • Trouble writing Query for Pivoting data from a table

    I am having a little trouble writing a query to pivot the table data below. Given a single valid report_week date as input, the query should return the data for that week plus two extra columns: one with last week's data for the same countries, and a second with the difference between the two columns (i.e. COUNT - COUNT_LAST_WEEK).
    REPORT_WEEK     DIVISION     COUNT
    9/26/2009     country1     81
    9/26/2009     country2     97
    9/26/2009     country3     12
    9/26/2009     country4     26
    9/26/2009     country5     101
    10/3/2009     country1     85
    10/3/2009     country2     98
    10/3/2009     country3     10
    10/3/2009     country4     24
    10/3/2009     country5     101
    10/10/2009     country1     84
    10/10/2009     country2     98
    10/10/2009     country3     10
    10/10/2009     country4     25
    10/10/2009     country5     102
    For example, if I give input as 10/10/2009, the output should be as given below.
    REPORT_WEEK     DIVISION     COUNT     COUNT_LAST_WEEK     DIFFERENCE
    10/10/2009     country1     84     85     -1
    10/10/2009     country2     98     98     0
    10/10/2009     country3     10     10     0
    10/10/2009     country4     25     24     1
    10/10/2009     country5     102     101     1
    For example, if I give input as 10/3/2009, the output should be as given below.
    REPORT_WEEK     DIVISION     COUNT     COUNT_LAST_WEEK     DIFFERENCE
    10/3/2009     country1     85     81     4
    10/3/2009     country2     98     97     1
    10/3/2009     country3     10     12     -2
    10/3/2009     country4     24     26     -2
    10/3/2009     country5     101     101     0
    Can anyone please shed some light on Query building for the above scenarios.
    Thank you
    SKP
    Edited by: user11343284 on Oct 10, 2009 7:53 AM
    Edited by: user11343284 on Oct 10, 2009 8:28 AM

    I assume there is no gap in report weeks. If so:
    SQL> variable report_week varchar2(10)
    SQL> exec :report_week := '10/10/2009'
    PL/SQL procedure successfully completed.
    with t as (
               select to_date('9/26/2009','mm/dd/yyyy') report_week,'country1' division,81 cnt from dual union all
               select to_date('9/26/2009','mm/dd/yyyy'),'country2',97 from dual union all
               select to_date('9/26/2009','mm/dd/yyyy'),'country3',12 from dual union all
               select to_date('9/26/2009','mm/dd/yyyy'),'country4',26 from dual union all
               select to_date('9/26/2009','mm/dd/yyyy'),'country5',101 from dual union all
               select to_date('10/3/2009','mm/dd/yyyy'),'country1',85 from dual union all
               select to_date('10/3/2009','mm/dd/yyyy'),'country2',98 from dual union all
               select to_date('10/3/2009','mm/dd/yyyy'),'country3',10 from dual union all
               select to_date('10/3/2009','mm/dd/yyyy'),'country4',24 from dual union all
               select to_date('10/3/2009','mm/dd/yyyy'),'country5',101 from dual union all
               select to_date('10/10/2009','mm/dd/yyyy'),'country1',84 from dual union all
               select to_date('10/10/2009','mm/dd/yyyy'),'country2',98 from dual union all
               select to_date('10/10/2009','mm/dd/yyyy'),'country3',10 from dual union all
               select to_date('10/10/2009','mm/dd/yyyy'),'country4',25 from dual union all
           select to_date('10/10/2009','mm/dd/yyyy'),'country5',102 from dual
              )
    select  max(report_week) report_week,
            division,
            max(cnt) keep(dense_rank last order by report_week) cnt_this_week,
            max(cnt) keep(dense_rank first order by report_week) cnt_last_week,
            max(cnt) keep(dense_rank last order by report_week) - max(cnt) keep(dense_rank first order by report_week) difference
      from  t
      where report_week in (to_date(:report_week,'mm/dd/yyyy'),to_date(:report_week,'mm/dd/yyyy') - 7)
      group by division
      order by division
    REPORT_WE DIVISION CNT_THIS_WEEK CNT_LAST_WEEK DIFFERENCE
    10-OCT-09 country1            84            85         -1
    10-OCT-09 country2            98            98          0
    10-OCT-09 country3            10            10          0
    10-OCT-09 country4            25            24          1
    10-OCT-09 country5           102           101          1
    SQL> exec :report_week := '10/3/2009'
    PL/SQL procedure successfully completed.
    SQL> /
    REPORT_WE DIVISION CNT_THIS_WEEK CNT_LAST_WEEK DIFFERENCE
    03-OCT-09 country1            85            81          4
    03-OCT-09 country2            98            97          1
    03-OCT-09 country3            10            12         -2
    03-OCT-09 country4            24            26         -2
    03-OCT-09 country5           101           101          0
    SY.

  • Tutorial for new Data Upload feature in 4.1?

    Greetings,
    Is there a tutorial available for using the new Data Upload feature in 4.1? We have not upgraded to 4.1 yet and I would like to try out the new feature on my APEX test workspace, so I can use the new feature once we are upgraded to 4.1.
    Thanks,
    John

    I installed the Product Portal Sample Database and went to page 21. Very nice looking. Is there any tutorial for this Sample application? In other words, is there a tutorial that uses this application as its basis?
    What I am really looking for (my apologies if I have not been clear) is a tutorial that steps you through the process of setting up the new feature of Data Upload in APEX 4.1. I would like to create a new application on my test workspace and learn how to set up the Data Upload page.
    Seeing the Data Load in action is very helpful though. Thanks for pointing me to this.
    Thanks,
    John

  • Query for due date

    Hi All,
    How do I write a query for an alert to be generated 5 days before the document due date?
    thanks
    SV Reddy

    Hi everybody, thanks for your valuable suggestions.
    When we execute this query in normal mode from the user query menu, it returns some records that are displayed on the screen in a separate window.  When you attach this query to an Alert, how will the Alert trigger?
    Is it that when the query returns one or more records, the name of the Alert is displayed in the inbox of the user?
    OR
    Is the result of the query displayed in the window?
    What will happen when this query returns 0 records?  Does that mean the Alert won't be triggered?
    What are the base criteria that an Alert takes into consideration when deciding to send a popup to the specified users?
    Please reply
    Thanks
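    For reference, a minimal sketch of the kind of query usually attached to such an alert, assuming an SAP Business One style A/R invoice table OINV with DocNum, CardCode, DocDueDate and DocStatus columns (swap in whatever document table the alert should actually watch):
    SELECT T0.DocNum, T0.CardCode, T0.DocDueDate
      FROM OINV T0
     WHERE T0.DocStatus = 'O'                              -- still-open documents
       AND DATEDIFF(dd, GETDATE(), T0.DocDueDate) = 5      -- due exactly 5 days from today
    As far as I know, an alert bound to a saved query fires on its schedule only when the query returns at least one row; if it returns no rows, nothing appears in the user's inbox.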

  • Help in Query for max date

    Hi, how can I get the MAX(GST_Date) record per GRD? If a GRD has more than one record, then I need the single MAX(GST_Date) record for every GRD. Thanks.
    create table #CRT (CRT numeric, GRD numeric, GST_Date datetime)
    insert into #CRT values (7 ,1900,'01-01-2000')
    insert into #CRT values (19,1900,'01-01-2002')
    insert into #CRT values (24,1900,'01-01-2013')
    insert into #CRT values (7 ,2100,'01-01-2010')
    insert into #CRT values (19,2100,'01-01-2012')
    insert into #CRT values (7 ,2200,'01-01-2012')
    insert into #CRT values (19,2200,'02-02-2012')
    I would like the following output from the query. Each of the following rows is the MAX(GST_Date) for its GRD.
    CRT   GRD   GST_Date
    24    1900  01-01-2013
    19    2100  01-01-2012
    19    2200  02-02-2012

    Please follow basic Netiquette and post the DDL we need to answer this. Follow industry and ANSI/ISO standards in your data. You should follow ISO-11179 rules for naming data elements. Everything you posted is wrong. You should follow ISO-8601 rules for
    displaying temporal data. In fact it is required by ANSI/ISO Standards. You failed again. We need to know the data types, keys and constraints on the table. Avoid dialect in favor of ANSI/ISO Standard SQL. You need to read and download the PDF for: 
    https://www.simple-talk.com/books/sql-books/119-sql-code-smells/
    Please, please learn why rows are not records. You do not even know what a key is or how to write a INTEGER NOT NULL(s, p) declaration. Allowing a table and column to have the same name is legal syntax and we regret not taking it out of the Standard. It is
    stupid! How can a set also be an attribute of itself? But your DDL gave no attribute  properties, as per ISO-11179. 
    Here is my attempt at repairs
    CREATE TABLE CRT 
    (crt_something INTEGER NOT NULL, 
     grd_something INTEGER NOT NULL, 
     PRIMARY KEY (crt_something, grd_something),
     gst_date DATE NOT NULL);
    Here is how we write an insertion statement; you are still using Sybase syntax! Why did you pick the most ambiguous, non-ANSI date display format? 
    INSERT INTO CRT 
    VALUES 
     ( 7, 1900, '2000-01-01'), 
     ( 7, 2100, '2010-01-01'), 
     ( 7, 2200, '2012-01-01'), 
     (19, 1900, '2002-01-01'), 
     (19, 2100, '2012-01-01'), 
     (19, 2200, '2012-02-02'),
     (24, 1900, '2013-01-01');
    >> How I can get the MAX(gst_date) record [sic] of grd_something. If record [sic] of grd_something has more than one record [sic] then I need the one MAX(gst_date) record [sic] of every grd_something. <<
    WITH X AS
    (SELECT crt_something, grd_something, gst_date,
            MAX(gst_date) 
              OVER (PARTITION BY grd_something) AS gst_date_max
      FROM CRT)
     SELECT *
       FROM X
      WHERE gst_date_max = gst_date;
    --CELKO-- Books in Celko Series for Morgan-Kaufmann Publishing: Analytics and OLAP in SQL / Data and Databases: Concepts in Practice Data / Measurements and Standards in SQL SQL for Smarties / SQL Programming Style / SQL Puzzles and Answers / Thinking
    in Sets / Trees and Hierarchies in SQL

  • Query for spatial data with a GeometryCollection fails

    There are exactly 538 CurvePolygons (only exterior rings in this
    sample). All of them are valid geometries, equal in dimension
    and so on. Now I combine them into a GeometryCollection and query
    for other related spatial data in some tables. It seems that
    using around (not exactly) 200 CurvePolygons in one
    GeometryCollection works fine, but adding more
    CurvePolygons results in an error from the spatial index (I can
    add the ORA- error numbers once I have some data in my test tables
    again in the next days).
    Is anybody else having trouble with this mysterious
    problem? Maybe there is a limit on the number of points in a
    GeometryCollection?
    (More details and program code can be provided.)
    (Working with Java 1.3.1, oracle.sdoapi.*, Oracle 8.1.7.)

    Hi Lutz,
    Could you provide more info or samples of what is going wrong?
    Also, could you try making sure the geometry you are passing in
    as the query window is valid (i.e. instead of passing it in as a
    query window, pass it into sdo_geom.validate_geometry).
    Thanks,
    Dan
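    A minimal sketch of the validation Dan suggests, assuming the problem collection is bound as :query_geom and the layer's dimension metadata is registered in USER_SDO_GEOM_METADATA (the table and column names below are placeholders):
    SELECT SDO_GEOM.VALIDATE_GEOMETRY(:query_geom, m.diminfo) AS is_valid
      FROM user_sdo_geom_metadata m
     WHERE m.table_name  = 'MY_SPATIAL_TABLE'
       AND m.column_name = 'GEOM';
    The function returns 'TRUE' for a valid geometry and otherwise an error indicator that should narrow down what the spatial index is complaining about.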

  • Query for inserting data into table and incrementing the PK.. pls help

    I have one table dd_prohibited_country. prohibit_country_key is the primary key column.
    I have to insert data into dd_prohibited_country based on records already present.
    The scenario I should follow is:
    For level_id 'EA' and prohibited_level_id 'EA' I should retrieve the
    max(prohibit_country_key), and starting from that maximum number I have to insert them
    into dd_prohibited_country. While inserting I have to increment the prohibit_country_key and
    replace the values of level_id and prohibited_level_id.
    (If 'EA' occurs, I have to replace with 'EUR')
    For Instance,
    If there are 15 records in dd_prohibited_country with Level_id 'EA' and prohibited_level_id 'EA', then
    I have to insert these 15 records starting with prohibit_country_key 16 (after 15 I should start inserting with number 16).
    I have written the following query for this:
    insert into dd_prohibited_country
    select     
         a.pkey,
         b.levelid,
         b.ieflag,
         b.plevelid
    from
         (select
              max(prohibit_country_key) pkey
         from
              dd_prohibited_country) a,
         (select
    prohibit_country_key pkey,
              replace(level_id,'EA','EUR') levelid,
              level_id_flg as ieflag,
              replace(prohibited_level_id,'EA','EUR') plevelid
         from
              dd_prohibited_country
         where
              level_id = 'EA' or prohibited_level_id = 'EA') b
    My problem here is that I am always getting a.pkey as 15, because I am not incrementing it.
    I tried incrementing it as well, but I am unable to achieve it.
    Can anyone please help me in writing this query?
    Thanks in advance
    Regards
    Raghu

    Because you are not incrementing your pkey. Try like this.
    insert
       into dd_prohibited_country
    select a.pkey+b.pkey,
         b.levelid,
         b.ieflag,
         b.plevelid
       from (select     max(prohibit_country_key) pkey
            from dd_prohibited_country) a,
         (select     row_number() over (order by prohibit_country_key)  pkey,
              replace(level_id,'EA','EUR') levelid,
              level_id_flg as ieflag,
              replace(prohibited_level_id,'EA','EUR') plevelid
            from     dd_prohibited_country
           where level_id = 'EA' or prohibited_level_id = 'EA') b
    Note: if you are in a multi-user environment you can get into trouble incrementing your PKey like this.
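    In a multi-user environment a sequence is the safer way to hand out the key. A hedged sketch, assuming a sequence (hypothetically named dd_prohibited_country_seq) has been created and seeded above the current max(prohibit_country_key):
    insert into dd_prohibited_country
    select dd_prohibited_country_seq.nextval,          -- sequence supplies the new key
           replace(level_id, 'EA', 'EUR'),
           level_id_flg,
           replace(prohibited_level_id, 'EA', 'EUR')
      from dd_prohibited_country
     where level_id = 'EA' or prohibited_level_id = 'EA';
    NEXTVAL is allowed in the top-level select list of an insert ... select, so each copied row gets its own key without racing other sessions.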

  • Master data time dependant attribute in query for several dates

    Hello all,
    I need to create a query to display prices for a material in different periods. 
    The user will enter a period interval.  For each period, they want to know the material price on the last day of that period.
    Material price is a time-dependent attribute of material, and I created a formula variable to display it. The problem is that using the key date of the query only works for a single date, not for several.
    Is there any other way to do it?
    If not, the only solution I can find is to store the prices I need in the InfoProvider, but I don't think that is good practice.
    Any suggestions?
    Thanks!

    I have just seen something: in fact the created lines are:
    Key1   Compound1   fromdate    todate    attr1      attr2
    0001      L      01.01.1000      01.01.2006          <empty
    So as you can see, the right interval is created but the attributes are not up to date.
    Cheers.
    Cyril.

  • Query for getting data for every quarter for financial year

    Hi,
    My problem is that I need to get the data for every quarter of the financial year, and I also need data for every week of the financial year.
    For example, for financial year 2012-13, Apr 2012 to Jun 2012 would be Q1, Jul 2012 to Sep 2012 would be Q2, and so on. In total 8 quarters should come up to Apr 2013.
    In the same way, 1st Apr 2012 to 7th Apr 2012 would be week 1, 8th Apr to 15th Apr would be week 2, and so on. How do I write a query for this scenario in Oracle? Can anybody help me on this? It is very urgent.
    Thanks in advance.

    lakmesri wrote:
    Hi,
    My problem is that I need to get the data for every quarter of the financial year, and I also need data for every week of the financial year.
    For example, for financial year 2012-13, Apr 2012 to Jun 2012 would be Q1, Jul 2012 to Sep 2012 would be Q2, and so on. In total 8 quarters should come up to Apr 2013.
    In the same way, 1st Apr 2012 to 7th Apr 2012 would be week 1, 8th Apr to 15th Apr would be week 2, and so on. How do I write a query for this scenario in Oracle? Can anybody help me on this? It is very urgent.
    Thanks in advance.
    How can you get 8 quarters within a year? I'd be concerned here.
    lakmesri wrote:
    Hi,
    In the same way, 1st Apr 2012 to 7th Apr 2012 would be week 1, 8th Apr to 15th Apr would be week 2, and so on. How do I write a query for this scenario in Oracle? Can anybody help me on this? It is very urgent.
    Thanks in advance.
    First, that question is really not clearly asked. Second, how could it be urgent? You did not even tell us your Oracle version, nor show any table descriptions, output samples, or any effort on your side to work on it.
    Nicolas.
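    Leaving aside the "8 quarters" point, a minimal sketch of deriving the fiscal quarter and fiscal week for an April-to-March financial year; SALES and TXN_DATE are placeholder names:
    SELECT txn_date,
           TO_CHAR(ADD_MONTHS(txn_date, -3), 'YYYY') AS fiscal_year_start,
           TO_CHAR(ADD_MONTHS(txn_date, -3), 'Q')    AS fiscal_quarter,
           TRUNC((txn_date
                  - ADD_MONTHS(TRUNC(ADD_MONTHS(txn_date, -3), 'YYYY'), 3)) / 7) + 1 AS fiscal_week
      FROM sales;
    Shifting each date back three months makes April behave like January, so the built-in 'YYYY' and 'Q' formats yield the fiscal year and quarter; the week is simply the number of days elapsed since 1st April divided by 7, plus 1.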

  • Query for FormSubmit data by ID via the SOAP API

    When I submit a form entry via the SOAP API request 'Create', I get an ID as well as the Form Type data in return:
    <CreateResponse xmlns="https://secure.eloqua.com/API/1.2">
      <CreateResult xmlns:i="http://www.w3.org/2001/XMLSchema-instance">
           <CreateResult>
                <EntityType>
                     <ID>21</ID>
                     <Name>aherghaewForm</Name>
                     <Type>Form</Type>
                </EntityType>
                <Errors/>
                <ID>3</ID>
           </CreateResult>
      </CreateResult>
    </CreateResponse>
    However, I can't seem to query for that ID. When using the request 'Query'
    <ns:Query>
      <ns:eloquaType>
           <ns:ID>21</ns:ID>
           <ns:Type>Form</ns:Type>
      </ns:eloquaType>
      <ns:searchQuery>ID='3'</ns:searchQuery>
      <ns:pageNumber>1</ns:pageNumber>
      <ns:pageSize>20</ns:pageSize>
    </ns:Query>
    I receive the following Error:
    The search query you provided is invalid: 'id' is invalid.
    As the field 'ID' is not part of the FieldValueCollection, this outcome is not unexpected, but unfortunate, as it would greatly simplify querying for data that was just submitted. Querying for a field that is part of the FieldValueCollection delivers correct results.
    Is there a way to do just that, or do I have to follow another approach to retrieve FormSubmit data?
    Thanks,
    Felix

    OK. I see column X logic now:
    SQL> select  id,
      2          name,
      3          case when instr(sys_connect_by_path(id,',') || ',',',4,') > 0 then 'X' else 'Y' end x
      4    from  worker
      5    start with id in (
      6                      select  id
      7                        from  worker
      8                        where boss_id is null
      9                        start with id = 4
    10                        connect by prior boss_id = id
    11                      )
    12    connect by prior id = boss_id
    13  /
            ID NAME                             X
             1 Mennan                           Y
             2 Ahmet                            Y
             3 Akin                             Y
             4 Ayse                             X
             5 Aylin                            X
             6 Selim                            Y
    6 rows selected.
    SY.
