Issue in multiprovider/query data

Hi all,
I'm seeing an issue with my data in multiprovider/query which I cannot understand.
I have a MultiProvider with a cube in it. I can see data in the cube with a certain restriction, but when I apply the same restriction in the MultiProvider I cannot see the data. I have a query with the same restriction as well, and there too the results are not showing.
I have checked the assignment in the MultiProvider and it is correctly assigned, i.e. to the cube from which I want the data.
When I try to display data in the MultiProvider without any key figures selected, the data shows up correctly. Why is this happening, and what can I do to fix it?
Thanks

Hi,
First, check the report data by adding the 0INFOPROV InfoObject of the MultiProvider and restricting it to your cube. Check whether you get the same data as in the cube, including the key figures.
Regarding "able to see data in the cube with a certain restriction": what type of restriction is applied? If possible, give details.
Thanks.
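
To illustrate why checking with 0INFOPROV helps: a MultiProvider behaves roughly like a union over its part-providers, and a restriction silently drops rows where the restricted characteristic is not delivered (it comes back as '#'). The following is only a plain-SQL analogy with made-up table and column names, not actual BW behaviour or generated SQL:

-- Plain-SQL analogy (hypothetical tables/columns) of how a MultiProvider reads its part-providers.
-- Each part-provider contributes its own rows, tagged here with an infoprov column.
SELECT infoprov, region, SUM(amount) AS amount
  FROM (
        SELECT 'CUBE_A' AS infoprov, region,         amount FROM cube_a   -- delivers the characteristic
        UNION ALL
        SELECT 'CUBE_B',             NULL AS region, amount FROM cube_b   -- characteristic not filled -> '#'
       )
 WHERE region = 'EMEA'            -- the restriction removes every CUBE_B row,
                                  -- so its key figures "disappear" from the result
 GROUP BY infoprov, region;       -- restricting on infoprov shows which provider actually delivers the rows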

Similar Messages

  • Query Data Block issues

We have a legacy Forms and Reports application (~750 items combined). I suspect the applications were initially developed under 6i, but they work quite well under 10gR2 and are testing well under 11gR2 (which we plan on migrating to soon). We ran into an issue the other day with Query Data Blocks, and wanted your expert opinion on it.
    Due to a change in business requirements, we recently had to increase the length of a database field (from 20 to 30). We went into Forms Developer, made the appropriate changes, and recompiled them all successfully. Everything *seems* to work fine, but we noticed we get failures on any item that tries to utilize that extra 10 characters. Apparently this is because the Query Data Block seems to use a cached/stale copy of the database field sizes, and still expects a size of 20! What we realized now is that we have to go into each of those forms, and REFRESH the Data Block (through the Data Block Wizard) for every one of them.
My question is this: is there any way to force the Form to refresh those Query Data Blocks short of going into each one and REFRESHING it? We've tried compiling through Forms Developer and through batch compiles, and nothing seems to get those cached copies to update. I'd really prefer to tackle this issue once and for all since we are prepping for migration testing. Frankly, I suspect the tables were tweaked a good bit over the years since development (before I ever got here), and if they didn't REFRESH those Data Blocks properly, a great many of the Forms will be suffering this same cached/stale view problem.
    Appreciate your thoughts,
    Dave

    Q_Stephenson,
That's an excellent suggestion.  I have seen vague hints and allegations of Forms manipulation using that method, but never felt comfortable enough with my overall knowledge(*) of Forms to look at it that deeply.  I think it's time to clear some space and dig into the topic for a better look.
    Thank you,
    Dave
    (*) The majority of my programming background is in Java; it's only been recently that I've had to really dig deep into Forms.

  • Multiprovider Query Problem

    Hi Gurus,
The scenario is that there are 2 InfoCubes, one with order data and the other with delivery data. The cube with order data has the requested delivery date and the delivery cube has the actual delivery date. I have a query on a MultiProvider (on top of the 2 cubes). When I output the query data by the sales order number, the result is fine, but when I drill down on either of the dates mentioned above (they are free characteristics in the query), the result splits up into 2 records. For example:
    Sales Order   Req Del Date   Act Del Date   Order Qty   Shipped Qty
    12345         03/03/08       -              5           -
                  -              06/03/08       -           5
    What can I do to get the result in one row?
    I will reward points for any help.
    Thanks

    This is the behavior of the MultiProvider: since the actual goods issue date is not part of the orders cube, it will create a second record. There are a couple of ways you could work around this:
    1. You could merge the data in one DSO before you actually load it to the data target. To do this, you could update the fields you need from the delivery DSO into the orders DSO.
    2. You could create an InfoSet between the two cubes if you are on 7.0; otherwise, you could create an InfoSet on the underlying DSOs and build a query on that InfoSet. Performance-wise this is not recommended.
    3. If you want to solve the issue at report level, there is what is called constant selection: you can make the actual goods issue date a constant selection and get one line.
    /people/prakash.darji/blog/2006/09/19/the-hidden-secret-of-constant-selection
    I would recommend the last option,
    thanks.
    Wond
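
    For what it's worth, here is a rough plain-SQL analogy of why the split happens (hypothetical table and column names, not real BW objects): the MultiProvider effectively unions the two cubes, and since neither date exists in both cubes, drilling down on either date can never merge the two contributions into one row.
    -- Hypothetical union analogy: the dates never line up, so each order shows two rows.
    SELECT sales_order,
           req_del_date,
           act_del_date,
           SUM(order_qty)   AS order_qty,
           SUM(shipped_qty) AS shipped_qty
      FROM (
            SELECT sales_order, req_del_date, NULL AS act_del_date, order_qty, 0 AS shipped_qty
              FROM orders_cube                          -- has only the requested delivery date
            UNION ALL
            SELECT sales_order, NULL, act_del_date, 0, shipped_qty
              FROM deliveries_cube                      -- has only the actual delivery date
           )
     GROUP BY sales_order, req_del_date, act_del_date;  -- grouping on the dates keeps two rows per order
    Roughly speaking, constant selection (option 3) makes the key figure ignore the drilldown on the date it does not carry, which is why the result collapses back to one line.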

  • Performance issue while generating Query

    Hi BI Gurus.
    I am facing a performance issue while running a query on 0IC_C03.
    It has a (from & to) variable for generating the report for a particular time period.
    If the (from & to) variable is filled, the query runs for a long time and then ends in a runtime error.
    If the query is executed without the variable (which is optional), the data is extracted from the beginning up to the current date; this takes less time to execute.
    After that, the period has to be selected manually with the 'Keep Filter Value' option. Please suggest how I can solve this error.
    Regards
    Ritika

    Hi Ritika,
    Welcome to SDN.
    You have to check the following runtime segments using transaction ST03N:
    High database runtime
    High OLAP runtime
    High frontend runtime
    If the database runtime is high:
    - Check the aggregates, or create aggregates on the cube; this will help you.
    If the OLAP runtime is high:
    - Check the user exits, if any.
    - Check whether hierarchies are used and are being read at a deep level.
    If the frontend runtime is high:
    - Check whether a very high number of cells and formattings are transferred to the frontend (use "All data" to get the value "No. of Cells"), which causes high network and frontend (processing) runtime.
    For the from and to date variables, try creating one more variable set and using that.
    Regards,
    VACHAN

  • Question on Dynamic Query Data Source and Form Folders in Oracle Forms 6i

    Hi there -
    I have one interesting requirement in Oracle Forms.
    This is what I wanted to do.
    1. Have a LOV and Go button on Form.
    2. From LOV, I will select a pre-defined custom table and click Go.
    3. Based on the selected custom table, I have to populate the Block (Tabular Format).
    4. User should be able to do update, delete, insert on this block.
    5. User should be able to use the Oracle Form folders functionality (select only necessary column and save views etc. Std folder functionality).
    6. If user selects a different custom table name in the LOV on top, I need to refresh the data from the block based on this new table. Remaining functionality should be as it is (steps 3 to 5).
    You can see here that I am going to have a dynamic query data source (table name as well as column mapping) on the block. I do not know beforehand how many columns the user-selected table has!
    This is what I have planned for this so far but I have some major questions before I can move on with this design:
    1. I am going to create a block structure with a fixed number of columns in the form (40 columns, assuming that the custom table will not have more than 40 columns). (Kind of a limitation, but it's okay as of now.)
    2. Dynamically populate the block based on the table name selected by the user from LOV. Dynamically change the table column names based on the table selected etc.
    3. Perform insert, update, delete using PL/SQL package.
    So far it looks okay.
    Now my real question is,
    Will the user still be able to use the "Folders" functionality here? I have never done this kind of development before, and I doubt that dynamic column naming and a dynamic column data source will really work with "Folders"!
    Also, I am not really sure whether the user will be able to save these "Folder" queries.
    Okay, so Forms experts, can you tell me whether this is really going to work? Are there any better ways to do this?
    Initially I tried to do this in OA Framework, but I got stuck because, as per the OAF developer guide, "I cannot use OAF personalization for dynamic items, regions etc".
    For more info on that thread see this link...
    Re: setUserCustomizable issue!
    Thanks in advance for the help.

    Any suggestions, anyone?

  • Issue with the query

    Hi Friends,
    I am having a problem with this query. The query is to fetch all the elements for the employees. The elements that need to be fetched are set up in the flex sets.
    In the table fnd_flex_values, the columns start_date_active and end_date_active are null. In the query below, when I use the condition
    and trunc(sysdate) between NVL(ffv.start_Date_active,TO_DATE('01-JAN-1951','DD-MON-YYYY')) and NVL(ffv.end_Date_active,TO_DATE('31-DEC-4712','DD-MON-YYYY'))
    the query fetches the results in 5 seconds,
    but when I replace the same query with the statement
    and to_date(to_char(ppa.date_earned,'DD-MON-YYYY'),'DD-MON-YYYY') between NVL(ffv.start_Date_active,TO_DATE('01-JAN-1951','DD-MON-YYYY')) and NVL(ffv.end_Date_active,TO_DATE('31-DEC-4712','DD-MON-YYYY'))
    the query takes a lot of time. In fact, it times out.
    The start_date_active and end_date_active columns are of DATE type. Can anyone say what the issue is with this query?
    Here is the query (with trunc(sysdate) in place of the ppa.date_earned condition):
    select papf.person_id, papf.full_name, papf.employee_number
    ,petf.element_name, pivf.name,
    prrv.result_value, ppa.date_earned,ppa.effective_date, to_char(ppa.date_earned,'DD-MON-YYYY') date_earned, paaf.assignment_id, petf.effective_start_date
    ,petf.effective_end_date, ffv.*
    from per_all_people_f papf
    ,per_all_assignments_f paaf
    ,pay_payroll_actions ppa
    ,pay_assignment_actions paa
    ,pay_element_types_f petf
    ,pay_input_values_f pivf
    ,pay_run_results prr
    ,pay_run_result_values prrv
    ,per_person_type_usages_f pptuf
    ,per_person_types ppt
    ,fnd_flex_values ffv
    ,fnd_flex_value_sets ffvs
    where 1=1
    and papf.person_id = paaf.person_id
    and trunc(ppa.date_earned) between trunc(papf.effective_start_date) and trunc(papf.effective_end_date)
    and trunc(ppa.date_earned) between trunc(paaf.effective_start_date) and trunc(paaf.effective_end_date)
    and paa.assignment_id = paaf.assignment_id
    and paa.action_status='C'
    and ppa.payroll_id = 61
    --and ppa.consolidation_set_id = 108
    and ppa.payroll_action_id = paa.payroll_action_id
    and prr.assignment_action_id = paa.assignment_action_id
    and prr.element_type_id = petf.element_type_id
    -- and trunc(ppa.date_earned) between trunc(petf.effective_start_date) and trunc(petf.effective_end_date)
    and trunc(sysdate) between petf.effective_start_date and petf.effective_end_date
    and prrv.run_result_id = prr.run_result_id
    and prrv.input_value_id = pivf.input_value_id
    and     prr.status in ('P','PA')
    and pivf.name ='Pay Value'
    and ppa.date_earned between pivf.effective_start_date and pivf.effective_end_date
    and ppa.time_period_id=145
    and paaf.person_id = pptuf.person_id
    and pptuf.person_type_id = ppt.person_type_id
    and ppt.user_person_type = nvl('Employee',ppt.user_person_type)
    and trunc(ppa.date_earned) between pptuf.effective_start_date and pptuf.effective_end_date
    and petf.element_name = ffv.flex_value
    and ffv.flex_value_set_id = ffvs.flex_value_set_id
    and ffv.enabled_flag ='Y'
    and ffvs.flex_value_set_name='GROUP_ELEMENTS'
    and trunc(sysdate) between NVL(ffv.start_Date_active,TO_DATE('01-JAN-1951','DD-MON-YYYY')) and NVL(ffv.end_Date_active,TO_DATE('31-DEC-4712','DD-MON-YYYY'))
    -- and trunc(ppa.effective_date) between nvl(ffv.start_Date_active,trunc(ppa.effective_date)) and NVL(ffv.end_Date_active,trunc(ppa.effective_date))
    -- and to_date(to_char(ppa.date_earned,'DD-MON-YYYY'),'DD-MON-YYYY') between NVL(ffv.start_Date_active,TO_DATE('01-JAN-1951','DD-MON-YYYY')) and
    order by ffv.parent_flex_value_low, papf.employee_number;
    Thanks
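
    One thing that may be worth checking (just a sketch, not verified against this data): since date_earned and the flex-value dates are all DATE columns, the to_date(to_char(...)) round trip can be written more simply, and a join-dependent predicate like that can change the join order the optimizer chooses compared with the constant trunc(sysdate) version. For example:
    -- Original (slow) form:
    --   AND to_date(to_char(ppa.date_earned,'DD-MON-YYYY'),'DD-MON-YYYY')
    --       BETWEEN NVL(ffv.start_Date_active, TO_DATE('01-JAN-1951','DD-MON-YYYY'))
    --           AND NVL(ffv.end_Date_active,   TO_DATE('31-DEC-4712','DD-MON-YYYY'))
    -- Simpler equivalent (the TO_DATE/TO_CHAR round trip only strips the time portion):
    AND TRUNC(ppa.date_earned)
        BETWEEN NVL(ffv.start_Date_active, TO_DATE('01-JAN-1951','DD-MON-YYYY'))
            AND NVL(ffv.end_Date_active,   TO_DATE('31-DEC-4712','DD-MON-YYYY'))
    If it is still slow, comparing the execution plans of the sysdate and date_earned versions (for example with EXPLAIN PLAN) should show whether the join order has changed.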

    ʃʃp wrote:
    /* Formatted on 2012/06/11 13:34 (Formatter Plus v4.8.8) */
    SELECT DISTINCT fr_ir_code, fr_fc_code1, product1, param_rep_prd, no_of_links_start_ir, no_of_supps_start_ir
               FROM comm_exst_rp_comit_aggr_irview
              WHERE fr_ir_code = 'AS01'
                AND srta_period = 2
                AND srta_year = 2011
                AND param_rep_prd = '2011-2'
                AND DECODE (:bind_variable, 'All', 1, 0) = DECODE (:bind_variable, 'All', 1, 1)
           GROUP BY fr_ir_code, fr_fc_code1, product1, param_rep_prd, no_of_links_start_ir, no_of_supps_start_ir
    UNION
    SELECT DISTINCT fr_ir_code, fr_fc_code1, product1, param_rep_prd, no_of_links_start_irda AS no_of_links_start_ir,
                    no_of_supps_start__irda AS no_of_supps_start_ir
               FROM comm_exst_rp_comit_aggr_irview
              WHERE fr_ir_code = 'AS01'
                AND srta_period = 2
                AND srta_year = 2011
                AND param_rep_prd = '2011-2'
                AND da_status = 'N'
                AND DECODE (:bind_variable, 'All', 1, 0) = DECODE (:bind_variable, 'All', 0, 0)
           GROUP BY fr_ir_code, fr_fc_code1, product1, param_rep_prd, no_of_links_start_irda, no_of_supps_start__irda
    Yes, Union is one of the best solutions for handling two queries.
    As per my understanding, the DECODE function usage is:
    DECODE (value,<if this value>,<return this value>,
    <if this value>,<return this value>,
    <otherwise this value>)
    But I am a little bit confused about the two lines below in the query:
    1)
    AND DECODE (:bind_variable, 'All', 1, 0) = DECODE (:bind_variable, 'All', 1, 1)
    2)
    AND DECODE (:bind_variable, 'All', 1, 0) = DECODE (:bind_variable, 'All', 0, 0)
    Can you please tell me how the comparison is done using the DECODE function?
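
    To illustrate how those two predicates behave, here is a small self-contained check against DUAL, with 'X' standing in for any value other than 'All':
    -- When :bind_variable = 'All':  lhs = 1, first branch rhs = 1 (1 = 1, TRUE),  second branch rhs = 0 (1 = 0, FALSE)
    -- When :bind_variable = 'X'  :  lhs = 0, first branch rhs = 1 (0 = 1, FALSE), second branch rhs = 0 (0 = 0, TRUE)
    SELECT 'All' AS bind_value,
           DECODE('All', 'All', 1, 0) AS lhs,
           DECODE('All', 'All', 1, 1) AS rhs_first_branch,
           DECODE('All', 'All', 0, 0) AS rhs_second_branch
      FROM dual
    UNION ALL
    SELECT 'X',
           DECODE('X', 'All', 1, 0),
           DECODE('X', 'All', 1, 1),
           DECODE('X', 'All', 0, 0)
      FROM dual;
    So the first SELECT of the original UNION only returns rows when the bind variable is 'All', and the second SELECT only when it is anything else; the DECODE pair is simply an "all or specific" switch between the two branches.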

  • Issues with Bex query structures and Crystal Reports/Webi

    Hi experts,
    I'm having an issue with Bex Query structures and nulls. I've built a Crystal Report against a Bex query that uses a Bex Query structure. The structure looks like the following
    Budget $
    Budget %
    Actual $
    Actual %
    Budget YTD
    etc
    if I drag the structure into the Crystal Report detail section with a key figure it displays like this
    Budget $     <null>
    Budget %     <null>
    Actual $     300
    Actual %     85
    Budget YTD     250
    the null values are displayed (and this is what is required). However if I filter using a Record selection or group on a profit centre then the nulls along with the associated structure component are not displayed.
    Actual $     300
    Actual %     85
    Budget YTD     250
    Webi is also behaving similarly. Can anyone explain why the above is happening and suggest a solution either on the Bex side of things or on the Crystal Reports side of things? I'm confused as to why nulls are displayed in the first example and not the second.
    Business Objects Edge 3.1 SP2
    SAP Int Kit SP2
    OS: Linux
    BW 701 Level 6
    Crystal Reports 2008 V1
    Thanks
    Keith

    Hi,
    Crystal Reports and Web Intelligence will only show data which is in the cube. You may get an actual 0 or null entry without grouping, but by changing the selection / grouping in the report, the data no longer includes such an entry.
    ingo

  • Export Query Data to Excel

    Hey guys, I have a cfm page that I am using to query data, and the  result set is displayed on the same page when a user clicks submit.
    My question is, I would like to create a clickable icon where, after a  user runs the query and the data table displays, I want the user to be  able to click a little Excel icon that will allow them to download the  data in Excel.
    So, a user clicks on a little icon somewhere on the page and IE or  Firefox or whatever pops up a little dialog box asking them if they want  to OPEN or SAVE the file results.xls.  How can I do this?
    Here is my current code, but where do I implement the cfoutput stuff to  export?  On the same page?
    <cfquery name="qActivity" datasource="khamp" result="resultInfo">
         SELECT KHAMELEON.GL_DETAIL.ACCOUNT, KHAMELEON.GL_ACCOUNT.DES1, KHAMELEON.GL_DETAIL.ENTITY,
        SUM (KHAMELEON.GL_DETAIL.AMOUNT) AS "TotalAmt"           
         FROM KHAMELEON.GL_ACCOUNT, KHAMELEON.GL_DETAIL
        WHERE 0=0
        <cfif Form.Entity IS NOT "">
              AND KHAMELEON.GL_DETAIL.ENTITY = '#Form.Entity#'
         </cfif>
        AND KHAMELEON.GL_DETAIL.ACCTG_DATE <= '#Form.asofday#-#Form.asofmonth#-#Form.asofyear#'
        <cfif Form.accountnum IS NOT "">
        AND KHAMELEON.GL_ACCOUNT.ACCOUNT = '#Form.accountnum#'
        </cfif>
        AND KHAMELEON.GL_ACCOUNT.ACCOUNT=KHAMELEON.GL_DETAIL.ACCOUNT
        GROUP BY
    KHAMELEON.GL_ACCOUNT.ACCOUNT,
    KHAMELEON.GL_DETAIL.ACCOUNT,
    KHAMELEON.GL_ACCOUNT.DES1,
    KHAMELEON.GL_DETAIL.ENTITY
         HAVING SUM(KHAMELEON.GL_DETAIL.AMOUNT)<>0
         ORDER BY KHAMELEON.GL_ACCOUNT.ACCOUNT ASC
         </cfquery>
      <cfif resultInfo.Recordcount eq 0>
        No Records Match the Search Criteria.
        <cfelse>
        <hr/>
        <br/>
        <table border="1" class="displaytable">
    <!--Headings Row-->  
            <tr>
               <th>Account</th>
               <th>Description</th>
               <th>Entity</th>
               <th>Book 1</th>
          </tr>
    <!--Result Rows-->   
          <cfoutput query="qActivity">
          <tr>
            <td>#qActivity.ACCOUNT#</td>
            <td>#qActivity.DES1#</td>
            <td>#qActivity.ENTITY#</td>
            <td style="text-align:right">#NumberFormat('#qActivity.TotalAmt#', "_(999,999,999.99)")#</td>
          </tr>
          </cfoutput>
    I got the following code off of a thread in the forum, but it tries to download the Excel file as soon as the query is run (the Excel download doesn't work though; it tries to download the actual cfm page instead):
    <cfheader name="Content-Disposition"
    value="inline; filename=tb.xls">
    <cfcontent type="application/vnd.ms-excel">
    <table border="2">
    <tr>
    <td> Account </td><td> Description </td><td> Entity </td><td> Book1 </td>
    </tr>
    <cfoutput query="qActivity">
    <tr>
    <td>#qActivity.ACCOUNT#</td><td>#qActivity.DES1#</td><td>#qActivity.ENTITY#</td><td>#NumberFormat('#qActivity.TotalAmt#', "_(999,999,999.99)")#</td>
    </tr>
    </cfoutput>
    </table>
    </cfcontent>
    Thanks guys

    To actually get the data into excel, google "cold fusion excel poi" and look for Ben Nadel's cfc.  Otherwise you might have issues with Office 2007.
    For the icon or whatever, make your query a session variable.  Then have the icon link to either a self closing popup or a very small iframe that exports the query to an excel file and then uses cfcontent to download it to the user.

  • How to query data from Oracle, MySQL, and MSSQL?

    For an environment consisting of Oracle 11g/12c enterprise edition, MySQL 5.7 community edition, and MSSQL 2008/2012 standard/enterprise edition, is there any major issue using DG4ODBC to query data from all 3 platforms?
    Is there other free alternatives?
    If the queried data is mostly contained in MySQL or MSSQL, will it be more efficient to query from MySQL or MSSQL?
    If yes, any suggestion of how to do it in those platforms? I know MSSQL can use linked server but it is quite slow.

    mkirtley-Oracle wrote:
    Hi Ed,
        It is semantics.  By multiple instances I mean you have the gateway installed in an ORACLE_HOME which has 1 listener. However, if you are connecting to different non-Oracle databases or different individual databases of that non-Oracle database then you need multiple gateway instances for each database being connected.  I did not mean that you need a gateway installed in a separate ORACLE_HOME for each non-Oracle database to which you are connecting.
    Each of these would have a separate instance file within that ORACLE_HOME/hs/admin directory with the connection details for the non-Oracle database to which that instance connects. So, you would have -
    initgtw1.ora - connects to MySQL
    initgtw2.ora - connects to the SQL*Server northwind database
    initgtw3.ora - connects to the SQL*Server test database
    etc
    etc
    Each of these instances would have a separate entry in the gateway listener.ora.
    In MOS have a look at this note -
    How To Add A New Database or Destination To An Existing Gateway Configuration (Doc ID 1304573.1)
    Regards,
    Mike
    Ah yes, we are in agreement, it was just semantics.  Thanks.
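
    In case it helps anyone reading later: the usual way to consume those gateway instances from the Oracle side is one database link per instance. A minimal sketch with made-up link names, credentials and remote tables (the initgtw* SIDs are the ones from Mike's example; everything else here is hypothetical):
    -- One database link per gateway instance / remote database.
    -- 'gtw1' and 'gtw2' are assumed tnsnames.ora aliases pointing at the gateway listener
    -- entries for initgtw1.ora (MySQL) and initgtw2.ora (SQL*Server northwind).
    CREATE DATABASE LINK mysql_link
      CONNECT TO "mysql_user" IDENTIFIED BY "mysql_pwd"
      USING 'gtw1';
    CREATE DATABASE LINK northwind_link
      CONNECT TO "sql_user" IDENTIFIED BY "sql_pwd"
      USING 'gtw2';
    -- Query both non-Oracle databases from Oracle; the quoted identifiers keep the
    -- case-sensitive names that DG4ODBC typically exposes.
    SELECT c."name", o."OrderDate"
      FROM "customers"@mysql_link  c,
           "Orders"@northwind_link o
     WHERE o."CustomerId" = c."id";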

  • About Query Data Source Columns property

    Hello everyone,
    I'm new to Oracle Forms version 10.1.0.2.
    When I create a data block based on a table using Data Block Wizard, the block's Query Data Source Columns property is automatically populated with column definition entries corresponding to the columns of the base table.
    I tried making changes to these entries, for example by changing the data types to wrong data types or even deleting them, and I found that those changes had no effect on the block at all. The form was still working as I wanted.
    Please explain what is exactly the role of the block's Query Data Source Columns property.
    Thank you very much.
    p.s: The F1 key help says "The Query Data Source Columns property is valid only when the Query Data Source Type property is set to Table, Sub-query, or Procedure". So, please explain in each context of Query Data Source Type.

    p.s: The F1 key help says "The Query Data Source Columns property is valid only when the Query Data Source Type property is set to Table, Sub-query, or Procedure". So, please explain in each context of Query Data Source Type.
    IMHO those properties are very self-explanatory: it is the data source of the block, or in other terms, how it is populated.
    Table means the data block is based on a table and subsequently will be populated by
    select col1, col2, col3 from your_table
    With Sub-query, the block will be populated with your subquery; Forms will issue
    select col1, col2, col3 from (
      -- this is your subquery
      select col1, col2, col3 from tab1, tab2 where [....]
    )
    With Procedure in short you'd have a stored procedure which returns a ref cursor and the block will be populated by the ref cursor.
    As for your question about the name: this actually should matter; the default is NULL, which means there needs to be a column with the exact same name as the item, so in the table example above the item associated with your_table.col1 should be named col1. If it isn't, this property should be set to the column name. If the property also doesn't reflect the column name, it shouldn't work, IMO.
    cheers
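
    For completeness, here is a minimal server-side sketch of the Procedure case (hypothetical names; on the Forms side the block's Query Data Source Name would simply point at the procedure and the block items map to the cursor's columns):
    -- Hypothetical package: a block with Query Data Source Type = Procedure could reference
    -- emp_block_pkg.query_emps, which hands the rows back through a REF CURSOR parameter.
    CREATE OR REPLACE PACKAGE emp_block_pkg AS
      TYPE emp_cur_t IS REF CURSOR RETURN emp%ROWTYPE;
      PROCEDURE query_emps (p_cur IN OUT emp_cur_t);
    END emp_block_pkg;
    /
    CREATE OR REPLACE PACKAGE BODY emp_block_pkg AS
      PROCEDURE query_emps (p_cur IN OUT emp_cur_t) IS
      BEGIN
        OPEN p_cur FOR SELECT * FROM emp;   -- whatever the block should display
      END query_emps;
    END emp_block_pkg;
    /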

  • Extra column in BW query data?

    I'm getting an extra blank column in my BW query data that does not show in data preview mode.  I'm expecting data, based on the preview to look like this:
    CCtr   Yr    Amt   Qty
    375   2010   10     10
    But getting instead:
    CCtr   Yr    Blank    Amt   Qty
    375   2010              10      10
    I'm sure there is probably a workaround for this, but it seems like a bug.  Anyone else experiencing this with the "direct" connection to BW query?
    I'm on Xcelsius 5.3.4.0, Build 12,3,4,1038.  BW 7.01 SP6.
    Thanks,
    Brian Aldrich

    Thanks all-
    Turns out that this was an issue caused by a scaling factor in one of the columns in the BW query.  I had the amount scaled to thousands, and the query results were putting a "*1000" every so many rows in the data...in the column that was (mostly) blank.  It placed that text every 6 or so rows.  This did not show in the query in BW or in the data preview feature in Xcelsius.  Since my initial data retrieval was 5 rows, I didn't see this, but when I brought in a larger number of rows, it showed up.
    I turned the scaling factor off in the query to resolve the issue.
    Cheers,
    Brian
    P.S.  Someone in another thread suggested adding a spreadsheet table to the canvas to help with data preview...this was a huge help in finding the issue.  Highly recommended!

Exceptional aggregation on Non-Cumulative KF - Aggregation issue in the Query

    Hi Gurus,
    Can anyone tell me a solution for the below scenario. I am using BW 3.5 front end.
    I have a non-cumulative KF coming from my Stock cube and a Pricing KF coming from my
    Pricing cube. (Both cubes are in a MultiProvider and my query is on top of it.)
    I want to multiply both KFs to get a WSL Value CKF, but my query is not at the material level;
    it is at the plant level.
    So it is behaving like this, for example (remember my Qty is a non-cumulative KF):
    Plant  Material   QTY   PRC
    P1     M1          10    50
    P1     M2           0    25
    P1     M3           5    20
    My WSL value should be 600, but it is giving me 15 * 95 = 1425, which is way too high.
    I have tried all the options of storing QTY and PRC in two separate CKFs, setting the aggregation
    to 'before aggregation' and then multiplying them, but it didn't work.
    I also tried to use exceptional aggregation, but in this BW 3.5 front end we don't have the 'TOTAL' option
    that the BI 7.0 front end has.
    So any other ideas guys. Any responses would be appreciated.
    Thanks
    Jay.

    I don't think you are able to solve this issue at the query level.
    This type of calculation should be done before aggregation, and that feature no longer exists in BI 7.0. Any kind of exceptional aggregation won't help here.
    It should be done either through a virtual KF (see below) or with a stock snapshot approach.
    The key figure QTY*PRC should be a virtual key figure. In that case you just need one cube (stock quantity) and pick up PRC at query runtime.
    QTY PRC
    P1 M1 10 50
    P1 M2 0 25
    P1 M3 5 20
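
    The arithmetic itself can be reproduced outside BW; the point is simply that the multiplication has to happen per material, before aggregating to plant level. A throwaway SQL illustration with the numbers from the post:
    -- Hypothetical data matching the example: P1/M1 10*50, P1/M2 0*25, P1/M3 5*20.
    WITH stock AS (
      SELECT 'P1' AS plant, 'M1' AS material, 10 AS qty, 50 AS prc FROM dual UNION ALL
      SELECT 'P1', 'M2',  0, 25 FROM dual UNION ALL
      SELECT 'P1', 'M3',  5, 20 FROM dual
    )
    SELECT plant,
           SUM(qty) * SUM(prc) AS multiplied_after_aggregation,   -- 15 * 95 = 1425 (what the query does at plant level)
           SUM(qty * prc)      AS multiplied_before_aggregation   -- 500 + 0 + 100 = 600 (the expected WSL value)
      FROM stock
     GROUP BY plant;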

  • APD with query data source

    Hi,
    I'm facing a problem. I created an APD that has a query as its data source. The query has many calculated key figures, and for these calculated key figures the Exception Aggregation field is set to "Use Standard Aggregation" and, on the Calculations tab, local calculation is "Nothing defined". So the BEx query result is different from the query column "SUM". All query fields are mapped in the APD, where I'm using the aggregation transformation with SUM and loading the data into a DSO.
    Now the individual sum of each query column is equal to the DSO field sum but different from the BEx query sum. Is there any possibility of getting the BEx query sum into the DSO?
    Please help me fix this issue.
    Thanks,

    Hi,
    I could not understand your issue correctly. Could you please give sample values?
    My suggestion would be to create the query without any result rows and load it in the APD with 1:1 transformation.
    Does that fulfill your requirement?
    Thanks.

  • Crystal Reports Templates usage - SAP BW MDX query data

    Hi Ingo,
    How are you?
    I am experiencing issues with crystal reports when trying to use crystal reports templates with SAP BW data.
    I have successfully created Crystal Report using SAP BW data with NodeID,  ParentNodeID hierarchy, Crystal Hierarchical Grouping.
    Also developed Xcelsius Dashboard using Live Office --> Crystal Report (SAP BW data) with refresh data on open.
    Now I am trying to create a template for new Crystal Reports to be designed…
    The following are observed by me:
    1.     When I create a crystal report using the Report Wizard and select templates from the list shown, the following error is displayed for SAP BW data:
    2.     When I try to apply a template using the Template Expert, selecting a report which was created using SAP BW data (and formatted), the following error is displayed:
    "Could not apply template to the document"
    I would like to know, whether there are known issues with Crystal Reports Templates using SAP BW data?  If so, could you suggest any solutions/work around to this issue?
    Thanks & Regards,
    S.Salai Manimudiyan

    I tried the following three methods:
    1. Applied all the 12 Crystal Templates
    2. Applied my own reports as Template
    3. Also applied a Crystal Report developed using Template Fields.
    None of the above is working.
    I created a report with BW query data and tried to apply all the above templates, but I get the same issue.
    If I create a report using the eFashion universe and apply all of the above, it works without any problem.
    I do agree with you that templates should not depend on the data source, as it is mainly for the common/standard report page layout (to reuse)...
    Could you please check at your end and let me know the results.
    Thanks & Regards,
    S.Salai Manimudiyan

  • MDX Query Failed to execute when universe created on MultiProvider Query

    Hi All,
    I am running the WebI report created on the universe which is built on Query.
    The data flow model in our system for this query is:
    DataStore Object -> InfoSet -> MultiProvider -> Query.
    When I drag only one Dimension object into the WebI report without any prompts, the report works fine.
    But when I add any other Dimension or any prompt to the WebI query, it gives me the following error:
    A database error occured. The database error text is: The MDX query SELECT  { [Measures].[4EGM09GXGU939C3RJH2G9DOQY] }  ON COLUMNS , NON EMPTY CROSSJOIN( [Z_WM_IS01___F1].[LEVEL01].MEMBERS, [Z_WM_IS01___F12].[LEVEL01].MEMBERS ) DIMENSION PROPERTIES [Z_WM_IS01___F12].[2Z_WM_IS01___F12], [Z_WM_IS01___F1].[2Z_WM_IS01___F1] ON ROWS FROM [ZWM_M02/Z_ZWM_M02_Q001]  failed to execute with the error Unknown error. (WIS 10901)
    But if i create the Universe on Query which is built on DSO then WebI is working fine.
    Is there any restrictions on using Query on multiprovider?
    Or do I have to make any configuration at BO server?
    How do I configure logging for WebI and Designer to check the Log records?
    Do I have to change anything related to any MDX Parser in BO Server?
    We are using the following versions:
    SAP BI 7.0 and BOXIR3.1
    Please help me to solve this problem.

    Thanks Ravi for your inputs.
    The report is working fine now.
    The problem was with the mandatory prompts defined in the SAP BI query on which the universe was developed.
    You should make mandatory in the SAP BI query only those prompts that are universe-level prompts; if some mandatory prompts are defined as class-level prompts in the universe, then the MDX query generated from the WebI report will not execute.
