Aggregate Creation

Hi,
I need to create an aggregate for a cube. When I drag and drop a characteristic from the left-hand side to the right-hand side, it prompts for a new aggregate name each time, whereas I would like to include all the characteristics under one aggregate name.
Am I missing something? Can anyone help me with this?
Points would be assigned.
Regards,
Vamsi

Hi,
Got it.
Thanks

Similar Messages

  • Problem: Creating aggregates for a repository by scheduling an aggregate creation job

    Hi dear OBIEE Gurus,
    I'm trying to create the aggregates for my repository by scheduling an aggregate creation job as shown in
    http://www.oracle.com/technology/obe/obe_bi/bi_ee_1013/aggpersist/aggpersist.htm
    env:
    1-     Oracle Business Intelligence Product Version 10.1.3.4.1 (Build 090414.1900)
    2-     Windows XP
    3-     Three separate repositories (OE, SH, & Sample Sales) running in parallel each on its own analytics deployment on one OC4J server
    +++++++++++++++++
    What is the problem:
    When I try to start the scheduled job, I get the following message:
    +++++++++++++++++
    Oracle BI Server
    Copyright (c) 1997-2009 Oracle Corporation, All rights reserved
    delete aggregates
    delete aggregates
    Statement execute succeeded
    create aggregates
    "ag_Salesfacts"
    for "SH"."Salesfacts"("Amount Sold")
    at levels ("SH"."ProductsDim"."Category", "SH"."TimesDim"."Times Detail", "SH"."CustomersDim"."State Province")
    using connection pool "orcl SH"."Connection Pool"
    in "orcl SH".."SH"
    create aggregates
    "ag_Salesfacts"
    for "SH"."Salesfacts"("Amount Sold")
    at levels ("SH"."ProductsDim"."Category", "SH"."TimesDim"."Times Detail", "SH"."CustomersDim"."State Province")
    using connection pool "orcl SH"."Connection Pool"
    in "orcl SH".."SH"
    Processed: 2 queries
    Encountered 1 errors
    [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred.
    [nQSError: 46036] Internal Assertion: Condition !m_factTable.IsNull(), file .\Src\SAAPAggrCube.cpp, line 46.
    Statement preparation failed
    ++++++++++++++++++++
    What have I done so far?
    ++++++++++++++++++++
    -     I went through all the steps in the aforementioned tutorial, up to the point where it says "6. Verify that the job completes successfully in the Job Manager."
    -     I set up the job in the scheduler
    -     I ran the job in the Job Manager, which is where I get the error.
    -     I tried different combinations of fields, aggregations and levels, all with the same result.
    -     Manually starting the job has the same result (see the Python sketch after this list): >D:\programs\oracle\obiee\server\Bin\NQCmd.exe -u Administrator -p Administrator -d AnalyticsWebSH -s D:\programs\oracle\obiee\server\Repository\create_aggregated_table_rk.SQL
    -     I tried to create aggregates in my other repositories, and found OTHER errors, but at least the aggregates were created there (to some extent)
    -     I looked in OTN for the error, found nothing
    -     Google does not know a thing about this
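    For reference, the manual nqcmd invocation above can also be driven from a script. This is a minimal Python sketch, reusing the exact paths, DSN and credentials quoted in this post; adjust them to your own setup:

        import subprocess

        # Rough sketch of invoking nqcmd the same way the scheduled job does
        # (paths, DSN and credentials are the ones quoted in this post).
        result = subprocess.run(
            [
                r"D:\programs\oracle\obiee\server\Bin\NQCmd.exe",
                "-u", "Administrator",   # repository user
                "-p", "Administrator",   # password
                "-d", "AnalyticsWebSH",  # ODBC DSN of the BI Server
                "-s", r"D:\programs\oracle\obiee\server\Repository\create_aggregated_table_rk.SQL",
            ],
            capture_output=True,
            text=True,
        )
        print(result.stdout)
        # A non-zero return code or an [nQSError: ...] line in the output
        # signals failure, as in the log above.
        print("exit code:", result.returncode)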
    Any ideas?

    Hi John,
    there is an OSS Note 453026, 'No aggregate proposals for MultiCubes after query', that seems to explain your situation, but I think you aren't on the old 2.0/2.1 versions...
    John, I'm not sure how else I could help you...
    If I find out more, I'll write to you at once!
    Bye,
    Roberto

  • Aggregate Creation in SAP BI

    Hi
    How can we identify on which InfoObjects we need to create an aggregate to improve query performance?
    Srujay

    Hi,
    Go to RSRT, click on "Execute + Debug", expand the "Aggs" option and select "Display Aggregates Found".
    Once you have done this, provide your query input and execute.
    On the next screen you will get a view called "Compare requested/found aggregates".
    Here you will see the combination of characteristics that is being hit at the initial execution of the query.
    This is the combination you should use to create your aggregates.
    You can come back to the same screen later to check whether the aggregates you've created are being used or not.
    To improve performance, you could also select "Display Statistics Data" in RSRT after clicking on "Execute + Debug". In the Duration tab, check which stage of the query is taking the most time.
    Regards,
    Joe
    Edited by: lazarus_joey on Mar 5, 2012 2:06 PM

  • AGGREGATES CREATION

    Hi Folks,
    I know how to create aggregates and what they are used for, but I want to know on what basis we decide that aggregates are needed in a given situation. Tell me exactly where, in which table or column, we have to look to see whether a particular query needs aggregates or not. I know about checking BW Statistics and also RSRT for query performance and all the fields. Please let me know clearly where we have to look to decide whether to create aggregates, and give me a step-by-step procedure for that.
    Helpful answers will be rewarded greatly.
    Thanks & Regards,
    ashok

    This depends on the size of the cube data, how frequently users run queries against that cube, and how much time they are losing. The exact tables and columns depend on which columns or fields the users work with. If a client uses, say, five particular fields/key figures frequently, how can you avoid those? You can create an aggregate, build the same report from the cube as well as from the aggregate, and see what the difference is and whether performance is better when you run the report from the aggregate.
    Regards,
    Ashok

  • SQL Error 12801 during aggregate creation

    Hi All,
    I created a new aggregate for my cube. When I started the job to fill it, the job finished, but the aggregate didn't get filled. Following is the job log from SM37:
    SQL Error: 12801
    Error in SQL Statement:
    Error in SQL Statement:
    New aggregation for aggregate 45186SONE7709PY9DIWK9TY4J(ZFIA_C12) for InfoCube  starte
    SQL Error: 12801
    Error in SQL Statement:
    Error in SQL Statement:
    Job finished
    I have gone through SAP Notes 159779 and 986631 but couldn't decide whether they are applicable, as much of their content relates to Basis. Has anyone solved the above problem using these notes or any others, or does anyone have a clue where I am going wrong?
    Any help would be appreciated.
    Ajay

    Hello Ajay,
    I'm having this very same problem these days, and I haven't been able to solve it yet. I tried to recreate the aggregate, but it is still not working. Did you manage to solve it?
    I would appreciate any kind of help.
    Thanks,
    Luis.

  • Query for functioning of Aggregate

    Hello experts,
    Could you please clarify one fundamental question for me on how an aggregate works?
    I have a cube with the following information:
    CHAR1        CHAR2     CHAR3          KF 1
    I have built an aggregate with
    CHAR1         CHAR 2         KF1
    My doubt is this:
    For example:
    If I create a query with CHAR1 and CHAR2 in rows and KF1 in columns, then the data is fetched from the aggregate instead of the cube, thereby reducing the time. Is this correct?
    OR
    If a query is created with CHAR1, CHAR2 and CHAR3 in rows and KF1 in columns,
    then the data is read from the cube and the aggregate never comes into the picture.
    Could you please clarify how exactly the aggregate works?

    Hi,
    Go to transaction RSRT > give the report name > Execute + Debug > a pop-up screen will appear with multiple checkboxes > select the "Display aggregates found" option. It will show all the aggregates the query is hitting.
    To propose aggregates, follow the procedure below to improve query performance:
    First, execute the query in RSRT for which you want to build aggregates. Check how much time it takes to execute, and whether it is necessary to build an aggregate for this query at all. To get this information, go to SE11 > give the table name RSDDSTAT_DM in BI 7.0 or RSDDSTAT in BW 3.x > Display > Contents > enter today's date as the from-date and to-date, your user name, and the query name
    --> execute.
    You will now get a list with fields like Object name (report name), Time read, InfoProvider name (MultiProvider), PartProvider name (cube), Aggregate name, etc. A time read of less than 100,000,000 microseconds (100 seconds) is acceptable; if the time read is more than 100 seconds, it is recommended to create aggregates for that query to increase performance. Keep this time read in mind.
    Again go to RSRT > give the query name > Execute + Debug -->
    A pop-up will come up; in it, select the checkbox "Display aggregates found" --> continue. If any aggregates already exist for that query, they are displayed first; when you press the continue button, it shows which fields are coming from which cube. Copy this list of objects, on which an aggregate can be created, into a text file.
    Then select that particular cube in RSA1 > context menu > Maintain Aggregates > Create by yourself > click the Create Aggregate button at the top left > give a description for the aggregate > continue > take the first object from the list and click the Find button in the aggregate creation screen > give the object name and search > drag and drop that object onto the aggregate name on the right-hand side (drag and drop all the fields into the aggregate like this).
    Activate the aggregate; it will take some time. Once the activation finishes, make sure the aggregate is switched on.
    Execute the query from RSRT again, find the new time read, and compare it with the first one. If it is less than the first time read, you can propose this aggregate to increase the performance of the query.
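    As to your original doubt: yes, that is exactly the idea. A query can be served from an aggregate only if every characteristic the query needs is contained in the aggregate; otherwise the OLAP processor falls back to the cube. A toy sketch of that matching rule (illustrative Python, not SAP's actual matcher):

        # Toy model of aggregate matching: an aggregate can answer a query only
        # if it contains every characteristic the query needs (illustrative).
        def can_use_aggregate(query_chars, aggregate_chars):
            return set(query_chars) <= set(aggregate_chars)

        aggregate = ["CHAR1", "CHAR2"]  # aggregate built on CHAR1, CHAR2, KF1

        print(can_use_aggregate(["CHAR1", "CHAR2"], aggregate))           # True  -> read from aggregate
        print(can_use_aggregate(["CHAR1", "CHAR2", "CHAR3"], aggregate))  # False -> read from cube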
    I hope this helps; go through the links below to understand aggregates more clearly.
    http://help.sap.com/saphelp_nw04s/helpdata/en/10/244538780fc80de10000009b38f842/frameset.htm
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3f66ba90-0201-0010-ac8d-b61d8fd9abe9
    Hope it helps you..
    Regards,
    Ramki.

  • Issue with Query OLAP time very high

    Hello Guyz
    I ran my query in RSRT and noticed that the QOLAPTIME was almost 432 seconds (the query had crossed the 65,535-record limit by then). The DBTIME was 70 seconds.
    1. Are the above times in seconds?
    2. What performance techniques can I implement to improve the OLAP time? I think aggregates, indexing and partitioning only improve the DB time.
    3. I already have the cache active for this query.
    Any suggestions?
    Please don't post the same question across the different forums
    Edited by: Moderator on Jul 8, 2009 11:46 AM

    Hello,
    One more thing: do any of the standard techniques of indexing, partitioning and aggregate creation help in decreasing the OLAP time?
    Those techniques help with the DB time but are of no use for the OLAP time. Restricted key figures don't cost extra OLAP time, but calculated key figures and cell calculations do.
    In your post you said there are more than 65,535 rows. In my experience, that is the main cause of high OLAP time. Why do users want so many rows? It's almost impossible to read. You can imagine how long it takes to transfer that much data from the BW server to the user (resulting in high OLAP time).
    Please reduce the number of rows with a filter or something similar. If you can't reduce the row count, I don't think the OLAP time will come down.
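    A back-of-the-envelope illustration of the point (the row width here is purely an assumption for the sake of the arithmetic):

        # Why tens of thousands of result rows hurt OLAP time: the result set
        # must be formatted and transferred to the front end.
        rows = 65535
        bytes_per_row = 200  # assumed average width of a result row

        mb = rows * bytes_per_row / 1_000_000
        print(f"~{mb:.0f} MB to format and transfer to the front end")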
    Regards,
    Frank

  • Performance of the query is poor

    Hi All,
    This is Prasad. I have a problem with a query: it is taking a long time to retrieve data from the cube. The query uses a variable of type customer exit. The cube is not compressed. I think the issue with the F fact table is due to the high number of table partitions (requests) it has to select from. If I compress the cube, will the performance of the query improve or not? Is there any alternative for improving the performance of the query? Somebody suggested a result set query; I am not aware of this technique, so if you know it, please let me know.
    Thanks in advance

    Hi Prasad,
    Query performance will depend on many factors like
    1. Aggregates
    2. Compression of requests
    3. Query read mode setting
    4. Cache memory setting
    5. BI Accelerator indexes on InfoCubes
    6. Indexes
    Proposing aggregates to improve query performance:
    First, execute the query in RSRT for which you want to build aggregates. Check how much time it takes to execute, and whether it is necessary to build an aggregate for this query at all. To get this information, go to SE11 > give the table name RSDDSTAT_DM in BI 7.0 or RSDDSTAT in BW 3.x > Display > Contents > enter today's date as the from-date and to-date, your user name, and the query name
    --> execute.
    You will now get a list with fields like Object name (report name), Time read, InfoProvider name (MultiProvider), PartProvider name (cube), Aggregate name, etc. A time read of less than 100,000,000 microseconds (100 seconds) is acceptable; if the time read is more than 100 seconds, it is recommended to create aggregates for that query to increase performance. Keep this time read in mind.
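    As a side note, if you export that statistics list (e.g. from SE16) you can script the 100-second check. A rough sketch; the column names "QUERY" and "TIMEREAD" are placeholders that you would need to match to the actual RSDDSTAT_DM field names in your release:

        import csv

        # Flag queries whose read time exceeds the 100-second rule of thumb,
        # from a CSV exported out of SE16 for RSDDSTAT_DM. Column names here
        # are placeholders; check the actual fields in your system first.
        THRESHOLD_US = 100_000_000  # 100 seconds, in microseconds

        with open("rsddstat_dm_export.csv", newline="") as f:
            for row in csv.DictReader(f):
                time_read = float(row["TIMEREAD"])
                if time_read > THRESHOLD_US:
                    print(f'{row["QUERY"]}: {time_read / 1_000_000:.1f}s, aggregate candidate')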
    Again go to RSRT > give the query name > Execute + Debug -->
    A pop-up will come up; in it, select the checkbox "Display aggregates found" --> continue. If any aggregates already exist for that query, they are displayed first; when you press the continue button, it shows which fields are coming from which cube. Copy this list of objects, on which an aggregate can be created, into a text file.
    Then select that particular cube in RSA1 > context menu > Maintain Aggregates > Create by yourself > click the Create Aggregate button at the top left > give a description for the aggregate > continue > take the first object from the list and click the Find button in the aggregate creation screen > give the object name and search > drag and drop that object onto the aggregate name on the right-hand side (drag and drop all the fields into the aggregate like this).
    Activate the aggregate; it will take some time. Once the activation finishes, make sure the aggregate is switched on.
    Execute the query from RSRT again, find the new time read, and compare it with the first one. If it is less than the first time read, you can propose this aggregate to increase the performance of the query.
    I hope this helps; go through the links below to understand aggregates more clearly.
    http://help.sap.com/saphelp_nw04s/helpdata/en/10/244538780fc80de10000009b38f842/frameset.htm
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3f66ba90-0201-0010-ac8d-b61d8fd9abe9
    Follow this thread for creation of BIA Indexes:
    Re: BIA Creation
    Hope this helps...
    Regards,
    Ramki.

  • Records read/transferred

    Hi BW Gurus,
    I have a simple question: in RSRT, when I run the query, my
    records read/transferred = 3886/66. Can anybody please advise what I should do?
    Any suggestions will be reciprocated with points.
    Nick

    Hi Nick,
    these are the records read from the database versus the records transferred to the front end/query.
    This ratio is normally used to judge whether aggregate creation will help improve query performance: an aggregate may help if the ratio is > 10 (and the share of DB time is > 30%). Check the details in 'query performance tuning with aggregate':
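    Applied to your numbers, the rule of thumb looks like this (an illustrative sketch, nothing more):

        # Rule of thumb from above, applied to the numbers in the question.
        records_read = 3886
        records_transferred = 66

        ratio = records_read / records_transferred
        print(f"aggregation ratio: {ratio:.1f}")  # ~58.9

        # ratio > 10 suggests an aggregate may help, provided DB time is also
        # a large share (> 30%) of the total query runtime.
        if ratio > 10:
            print("aggregate worth considering (also check the DB time share)")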
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3f66ba90-0201-0010-ac8d-b61d8fd9abe9
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/cbd2d390-0201-0010-8eab-a8a9269a23c2
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/media/uuid/d9fd84ad-0701-0010-d9a5-ba726caa585d
    hope this helps.

  • Calculating size of CUBE & ODS

    Hi Everybody,
    Could anyone suggest how we can calculate the exact size of a cube and an ODS?
    Thanks and Regards,
    Ramesh

    DB02 can provide some history of space consumption for a table or tablespace, but not row counts, e.g.
    for a specific table like this:
    Date          Size (KByte)
    10/01/2005       1,193,984
    09/01/2005       1,193,984
    08/01/2005       1,128,448
    07/01/2005       1,020,928
    06/01/2005         988,160
    05/01/2005         955,392
    04/01/2005         914,432
    03/01/2005         881,664
    02/01/2005         840,704
    01/01/2005         807,936
    12/01/2004         635,904
    11/01/2004         521,216
    But you don't really want to have to collect this for all the tables involved on a monthly basis.
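    If you do pull a few snapshots like this, the trend arithmetic is simple (an illustrative sketch using the numbers above):

        # Month-over-month growth from the DB02 snapshots above, oldest first.
        sizes_kb = [521216, 635904, 807936, 840704, 881664, 914432,
                    955392, 988160, 1020928, 1128448, 1193984, 1193984]

        for prev, cur in zip(sizes_kb, sizes_kb[1:]):
            print(f"{(cur - prev) / prev * 100:5.1f} % growth")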
    SAP_INFOCUBE_DESIGNS can provide row counts:
    0BWTC_C03  /BI0/D0BWTC_C031    rows:  6,005  ratio:  1 %
    0BWTC_C03  /BI0/D0BWTC_C03P    rows:     83  ratio:  0 %
    0BWTC_C03  /BI0/D0BWTC_C03T    rows:    880  ratio:  0 %
    0BWTC_C03  /BI0/E0BWTC_C03     rows:616,772  ratio: 84 %
    0BWTC_C03  /BI0/F0BWTC_C03     rows:116,037  ratio: 16 %
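    For reference, the ratio column appears to be each table's row count relative to the total fact-table rows (E + F). A small sketch of that arithmetic; this is my reading of the listing above, so verify against your own output:

        # Reproduce the ratio column from the SAP_INFOCUBE_DESIGNS listing.
        tables = {
            "/BI0/D0BWTC_C031": 6005,
            "/BI0/D0BWTC_C03P": 83,
            "/BI0/D0BWTC_C03T": 880,
            "/BI0/E0BWTC_C03": 616772,
            "/BI0/F0BWTC_C03": 116037,
        }
        fact_rows = tables["/BI0/E0BWTC_C03"] + tables["/BI0/F0BWTC_C03"]

        for name, rows in tables.items():
            print(f"{name}: rows {rows:>9,}  ratio {100 * rows / fact_rows:3.0f} %")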
    The RSRV test "Database information about InfoProvider" can provide this info about an InfoProvider:
    ZPU_C51 tables
    ZPU_C51 is a BasisInfoCube
    Cube ZPU_C51 has 9 dimensions; 0 dimensions are line item dimensions
      Table /BIC/DZPU_C511 has 3962 entries. Size corresponds to 3% of the InfoCube
      Table /BIC/DZPU_C512 has 2243 entries. Size corresponds to 2% of the InfoCube
      Table /BIC/DZPU_C513 has 4847 entries. Size corresponds to 3% of the InfoCube
      Table /BIC/DZPU_C514 has 5843 entries. Size corresponds to 4% of the InfoCube
      Table /BIC/DZPU_C515 has 44 entries. Size corresponds to 0% of the InfoCube
      Table /BIC/DZPU_C516 has 148 entries. Size corresponds to 0% of the InfoCube
      Table /BIC/DZPU_C51P has 174 entries. Size corresponds to 0% of the InfoCube
      Table /BIC/DZPU_C51T has 74 entries. Size corresponds to 0% of the InfoCube
      Table /BIC/DZPU_C51U has 5 entries. Size corresponds to 0% of the InfoCube
      Table /BIC/EZPU_C51 has 0 entries. Size corresponds to 0% of the InfoCube
      Table /BIC/FZPU_C51 has 142233 entries. Size corresponds to 100% of the InfoCube
    Keep in mind the row counts displayed by SAP_INFOCUBE_DESIGNS and RSRV are based on the DB statistics and are only as current as the last time statistics have been collected on the tables.
    SE16 on the other hand, actually goes out and runs a query on the table to provide a true current row count.
    From a storage management standpoint, I doubt your DBA is really that concerned about the number of rows you added; they really just want to know about disk storage consumption trends. Management, on the other hand, may very well be interested in row counts, as that is something they can relate to, and it can indicate what's going on in the organization.
    I am not aware of any BW functionality that provides current InfoProvider row count histories/trends, other than running some of the above tools monthly or writing your own program to do so.
    There are various tables that track load information and could provide counts of records added. You could probably create an InfoSet and do some reporting that way. The best way to get counts of records loaded is probably from the BW technical content cubes, where you could create a query to provide monthly record load counts by InfoProvider, e.g.
    Calendar day     Time      Records
    9/1/2005     38,950     5,789,428
    9/2/2005     77,581     19,185,224
    9/3/2005     1,138     248,247
    9/4/2005     1,066     241,527
    9/5/2005     260     528
    9/6/2005     21,352     5,192,199
    9/7/2005     28,954     5,212,939
    9/8/2005     25,770     5,837,173
    9/9/2005     34,271     6,341,234
    9/10/2005     834     13,580
    9/11/2005     1,193     242,951
    9/12/2005     19,426     4,408,503
    9/13/2005     20,756     3,504,421
    9/14/2005     24,353     3,975,332
    9/15/2005     30,432     5,984,105
    9/16/2005     31,633     6,077,860
    9/17/2005     799     15,682
    9/18/2005     1,152     243,567
    9/19/2005     16,288     4,038,754
    9/20/2005     15,313     3,504,187
    9/21/2005     22,744     3,955,414
    9/22/2005     29,216     5,348,904
    9/23/2005     28,798     5,616,426
    9/24/2005     696     13,151
    9/25/2005     2,010     242,341
    9/26/2005     15,279     3,666,828
    9/27/2005     22,411     4,095,785
    9/28/2005     23,465     4,547,366
    9/29/2005     30,920     5,826,502
    9/30/2005     32,039     6,294,938
    Overall Result     599,100     119,665,096
    Keep in mind that the records added to a cube for a month are not necessarily the same as the records that end up in the E fact table after compression.
    So I think you really need to define what it is you are looking for: statistical information for management and/or information for the DBA for storage estimates.
    For storage planning purposes, your DBA should be able to determine space use and trends; that's a basic function of their job. Your role is to keep them informed of changes they can't determine from the DB, e.g.
    - transaction volume expected to double for an InfoProvider
    - you are going to create several aggregates
    - creation of new InfoProviders
    - you are deleting large amounts of data
    BW disk storage grows. I don't think I have ever seen a data warehouse that has reached a point where it rolls off as much data as it loads. A new BW can easily have storage requirements increasing by multiples year over year for the first few years. It can waste a lot of people's time to get too fussy about tracking individual table changes.
    I haven't used the Quick Sizer, so I can't offer an opinion on it. It is probably an acceptable tool for estimating, just like any of the row-count methods others have mentioned.

  • Data load Tuning

    Hello All,
    What data load tuning options are there for loads from R/3 to BW? Please help.
    Thanks,
    Suman

    Hi,
    To improve data load performance:
    1. If they are full loads, see if you can make them delta loads.
    2. Check whether complex routines/transformations are being performed in any layer; if so, see whether you can optimize that code with the help of an ABAPer.
    3. Ensure that you are following the standard procedures in the chain, like deleting indexes/secondary indexes before loading, etc.
    For example:
    1) Create Index
    2) Delete Index
    3) Aggregate Creation on Info Cube
    4) Compressing Info Cube data
    5) Rollup Data to Aggregates
    6) Partitioning infoCube
    7) Load Master data before loading Transactional Data
    8) Adjusting Datapackage size
    https://forums.sdn.sap.com/click.jspa?searchID=10049032&messageID=4373697
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3a699d90-0201-0010-bc99-d5c0e3a2c87b
    4. Check whether the system processes are free when this load is running
    5. Try making the load as parallel as possible if the load is happening serially. Remove PSA if not needed.
    6. Go to Manage ODS > Activate > Activate in Parallel, and increase the number of processes there. For direct access, try transaction RSODSO_SETTINGS.
    7. Uncheck the BEx Reporting checkbox in the ODS if it is not required.
    8. When the load is not getting processed due to a huge volume of data or too many records per data packet, try the following (see the sizing sketch after these steps):
    1) Reduce the IDoc size to 8000 and the number of data packets per IDoc to 10. This can be done in the InfoPackage settings.
    2) Run the load only to PSA.
    3) Once the load is successful, push the data to the targets.
    In this way you can overcome this issue.
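    For a feel of what those settings do, here is a rough sizing sketch. The record width and load size are assumptions, and the exact formula the source system uses depends on your release and the ROIDOCPRMS settings, so treat this as an approximation only:

        import math

        # Rough arithmetic behind the packet-size settings mentioned above.
        packet_size_kb = 8000        # packet size from the InfoPackage settings
        record_width_bytes = 1000    # width of the transfer structure (assumed)
        total_records = 2_000_000    # size of the load (assumed)

        records_per_packet = packet_size_kb * 1000 // record_width_bytes
        packets = math.ceil(total_records / records_per_packet)
        print(f"{records_per_packet} records per packet -> {packets} data packets")
        # Smaller packets mean more packets (and more IDocs), but each one
        # needs less memory to process, which is why shrinking them helps
        # huge loads get through.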
    Check the data packet sizing and also number range buffering, the PSA partition size, and the upload sequence, i.e. always load master data first, perform the change run, and then run the transaction data loads.
    Use InfoPackages with disjoint selection criteria to parallelize the data export.
    Complex database selections can be split into several less complex requests.
    Number Range Buffering Performance  
    /thread/754694
    Review OSS Notes 857998 and 130253. The first note tells you how to find the dimensions and InfoObjects that need number range buffering.
    Check this document on BW data load performance optimization:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/1955ba90-0201-0010-d3aa-8b2a4ef6bbb2
    BI Performance Tuning
    Business Intelligence Journal Improving Query Performance in Data Warehouses
    http://www.tdwi.org/Publications/BIJournal/display.aspx?ID=7891
    Achieving BI Query Performance Building Business Intelligence
    http://www.dmreview.com/issues/20051001/1038109-1.html
    SAP Business Intelligence Accelerator: A High-Performance Analytic Engine for SAP NetWeaver Business Intelligence
    http://www.sap.com/platform/netweaver/pdf/BWP_AR_IDC_BI_Accelerator.pdf
    BI Performance Audit
    http://www.xtivia.com/downloads/Xtivia_BIT_Performance%20Audit.pdf
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/10564d5c-cf00-2a10-7b87-c94e38267742
    https://websmp206.sap-ag.de/~form/sapnet?_SHORTKEY=01100035870000689436&
    Thanks,
    JituK

  • TSV errors and creation of aggregates

    Hi experts,
    For a few days now, I haven't been able to run a query any more. It produces various short dumps such as:
    TSV_TNEW_PAGE_ALLOC_FAILED
    TSV_TABH_POOL_NO_ROLL_MEMORY
    TSV_TNEW_OCCURS_NO_ROLL_MEMORY
    The query is based on an InfoCube which contains 3,584,086 records. Once a week, the entire data set is deleted and reloaded.
    There aren't any aggregates for this InfoCube.
    Do you know whether creating aggregates to optimize the performance of the query can resolve this kind of technical problem?
    The second part of my question is about aggregates:
    In my query, we want to display the master data Article (0MATERIAL) and some of its navigation attributes (Department, Sub-department, Family). We also want to display the plant and one key figure, stock data.
    In this query, we make a restriction on a characteristic XXX by excluding a value.
    My idea was to create an aggregate with Article, Department, Sub-department, Family, Plant, and the characteristic XXX with a value excluded.
    When I create the aggregate, I first drag and drop Article. But when I then want to drag and drop its navigation attributes, a warning is generated: "Article cannot be in aggregate because of characteristic Department".
    - Can't we put a master data characteristic together with some of its navigation attributes in an aggregate?
    - Is it possible in an aggregate to restrict a characteristic by excluding a value? Apparently, we can only add a constant...
    I can provide more details if necessary for better understanding.
    Thanks for your precious help,
    Tempka

    Hi Tempka,
    It seems that there are issues with your roll memory: the memory is not enough to execute your query.
    TSV_TNEW_PAGE_ALLOC_FAILED and TSV_TNEW_BLOCKS_NO_ROLL_MEMORY
    You will need to get the roll memory extended by contacting your Basis team / ERP team.
    Before you go to the Basis/ERP team, check whether the database indexes for the InfoCube on which the query has been developed are fine. You can check the DB indexes with the following steps:
    1) Right-click the cube and click on Manage.
    2) Click on the Performance tab.
    3) Check the DB indexes.
    4) If the check shows red, recreate the DB indexes by clicking on "Create DB Index (batch)". Once the indexing is complete, try running the report again and see whether the error still occurs.
    Thanks
    Pawan

  • Back end activities for Activation & Deactivation of Aggregates

    Hi ,
    Could anybody help me understand the back-end activities performed at the time of activation and deactivation of aggregates?
    Is filling an aggregate the same as roll-up?
    What is the difference between deactivation and deletion of an aggregate?
    Thanks.
    Santanu

    Hi Bose,
    Activation:
    In order to use an aggregate in the first place, it must be defined, activated and filled. When you activate it, the required tables are created in the database from the aggregate definition. Technically speaking, an aggregate is actually a separate BasicCube with its own fact table and dimension tables. Dimension tables that agree with those of the InfoCube are shared between the two. Upon creation, every aggregate is given a six-digit number that starts with the figure 1. The names of the tables that make up the logical object that is the aggregate are then derived in a similar manner to the table names of an InfoCube. For example, if the aggregate has the technical name 100001, the fact tables are called /BIC/E100001 and /BIC/F100001. Its dimensions, which are not the same as those in the InfoCube, have the table names /BIC/D100001P, /BIC/D100001T and so on.
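    A toy illustration of that naming scheme, just to make the derivation concrete (plain Python string formatting; the extra dimension suffixes are hypothetical examples):

        # Derive aggregate table names from the six-digit technical name
        # described above (e.g. 100001).
        def aggregate_tables(technical_name, dimensions=("P", "T", "1", "2")):
            tables = {
                "E fact table": f"/BIC/E{technical_name}",
                "F fact table": f"/BIC/F{technical_name}",
            }
            for dim in dimensions:
                tables[f"dimension {dim}"] = f"/BIC/D{technical_name}{dim}"
            return tables

        for role, name in aggregate_tables("100001").items():
            print(f"{role}: {name}")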
    Rollup:
    New data packets / requests that are loaded into the InfoCube cannot be used at first for reporting if there are aggregates that are already filled. The new packets must first be written to the aggregates by a so-called “roll-up”. In other words, data that has been recently loaded into an InfoCube is not visible for reporting, from the InfoCube or aggregates, until an aggregate roll-up takes place. During this process you can continue to report using the data that existed prior to the recent data load. The new data is only displayed by queries that are executed after a successful roll-up.
    See the link below for more information.
    http://sapbibw2010.blogspot.in/2010/10/aggregates.html
    Naresh

  • Problem with SLD and Creation of JCo

    Hello everyone!
    I have a problem with the SLD and the creation of a JCo destination on the Portal: when I try to create the JCo, it always tells me that the name is already used in the SLD, even though the installation is new.
    The person who performed the installation had set up the ECC 6 PRD system in the SLD. I deleted it to see whether I could then create a JCo, but the error came up again; and when recreating ECC 6 PRD in the SLD, it told me that it already exists and cannot be created.
    I found a forum post which recommended re-importing the cimsap zip and the CR_Content zip, which would clean everything up. I managed to import the cimsap zip, but when trying to load CR_Content I get the following message: "All incremental updates of the selected aggregate import for export SAP_CR line have already been loaded before. You require a newer import to update your current content SAP_CR version 4.11."
    It was also recommended that I redo the installation and reload cimsap.zip and CR_Content.zip from a homogeneous copy.
    Thank you very much in advance.
    Text of the error when creating the JCo destination:
    Failed to process request. Please contact your system administrator.
    [Hide]
    Error Summary
    While processing the current request, an exception occured which could not be handled by the application or the framework.
    If the information contained on this page doesn't help you to find and correct the cause of the problem, please contact your system administrator. To facilitate analysis of the problem, keep a copy of this error page. Hint: Most browsers allow to select all content, copy it and then paste it into an empty document (e.g. email or simple text file).
    Root Cause
    The initial exception that caused the request to fail, was:
       java.lang.NullPointerException
        at com.sap.tc.webdynpro.serverimpl.wdc.sl.SystemLandscapeFactory.checkStatus(SystemLandscapeFactory.java:991)
        at com.sap.tc.webdynpro.services.sal.sl.api.WDSystemLandscape.checkStatus(WDSystemLandscape.java:469)
        at com.sap.tc.webdynpro.tools.sld.ButtonBar.onActionGoForward(ButtonBar.java:505)
        at com.sap.tc.webdynpro.tools.sld.wdp.InternalButtonBar.wdInvokeEventHandler(InternalButtonBar.java:280)
        at com.sap.tc.webdynpro.progmodel.generation.DelegatingView.invokeEventHandler(DelegatingView.java:87)
        ... 28 more
    See full exception chain for details.
    System Environment
    Client
    Web Dynpro Client Type HTML Client
    User agent Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Trident/4.0; .NET CLR 1.1.4322; InfoPath.1)
    Version null
    DOM version null
    Client Type msie7
    Client Type Profile ie6
    ActiveX enabled
    Cookies enabled
    Frames enabled
    Java Applets enabled
    JavaScript enabled
    Tables enabled
    VB Script enabled
    Server
    Web Dynpro Runtime Vendor: SAP, build ID: 7.0017.20081210091801.0000 (release=NW04S_17_REL, buildtime=2009-01-12:09:43:27[UTC], changelist=52884, host=pwdfm114), build date: Tue Apr 28 19:28:39 CLT 2009
    J2EE Engine 7.00 patchlevel 48199.450
    Java VM IBM J9SE VM, version:2.2, vendor: IBM Corporation
    Operating system Linux, version: 2.6.18-92.el5, architecture: amd64
    Session & Other
    Session Locale en
    Time of Failure Fri Aug 28 11:43:53 CLT 2009 (Java Time: 1251474233677)
    Web Dynpro Code Generation Infos
    sap.com/tcwdtools
    No information available null
    sap.com/tcwddispwda
    No information available null
    sap.com/tcwdcorecomp
    No information available null
    Detailed Error Information
    Detailed Exception Chain
    java.lang.NullPointerException
         at com.sap.tc.webdynpro.serverimpl.wdc.sl.SystemLandscapeFactory.checkStatus(SystemLandscapeFactory.java:991)
         at com.sap.tc.webdynpro.services.sal.sl.api.WDSystemLandscape.checkStatus(WDSystemLandscape.java:469)
         at com.sap.tc.webdynpro.tools.sld.ButtonBar.onActionGoForward(ButtonBar.java:505)
         at com.sap.tc.webdynpro.tools.sld.wdp.InternalButtonBar.wdInvokeEventHandler(InternalButtonBar.java:280)
         at com.sap.tc.webdynpro.progmodel.generation.DelegatingView.invokeEventHandler(DelegatingView.java:87)
         at com.sap.tc.webdynpro.progmodel.controller.Action.fire(Action.java:67)
         at com.sap.tc.webdynpro.clientserver.window.WindowPhaseModel.doHandleActionEvent(WindowPhaseModel.java:420)
         at com.sap.tc.webdynpro.clientserver.window.WindowPhaseModel.processRequest(WindowPhaseModel.java:132)
         at com.sap.tc.webdynpro.clientserver.window.WebDynproWindow.processRequest(WebDynproWindow.java:335)
         at com.sap.tc.webdynpro.clientserver.cal.AbstractClient.executeTasks(AbstractClient.java:143)
         at com.sap.tc.webdynpro.clientserver.session.ApplicationSession.doProcessing(ApplicationSession.java:321)
         at com.sap.tc.webdynpro.clientserver.session.ClientSession.doApplicationProcessingStandalone(ClientSession.java:713)
         at com.sap.tc.webdynpro.clientserver.session.ClientSession.doApplicationProcessing(ClientSession.java:666)
         at com.sap.tc.webdynpro.clientserver.session.ClientSession.doProcessing(ClientSession.java:250)
         at com.sap.tc.webdynpro.clientserver.session.RequestManager.doProcessing(RequestManager.java:149)
         at com.sap.tc.webdynpro.serverimpl.defaultimpl.DispatcherServlet.doContent(DispatcherServlet.java:62)
         at com.sap.tc.webdynpro.serverimpl.defaultimpl.DispatcherServlet.doPost(DispatcherServlet.java:53)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:760)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.runServlet(HttpHandlerImpl.java:401)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.handleRequest(HttpHandlerImpl.java:266)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:386)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:364)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.invokeWebContainer(RequestAnalizer.java:1039)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.handle(RequestAnalizer.java:265)
         at com.sap.engine.services.httpserver.server.Client.handle(Client.java:95)
         at com.sap.engine.services.httpserver.server.Processor.request(Processor.java:175)
         at com.sap.engine.core.service630.context.cluster.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java:33)
         at com.sap.engine.core.cluster.impl6.session.MessageRunner.run(MessageRunner.java:41)
         at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
         at java.security.AccessController.doPrivileged(AccessController.java:207)
         at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:102)
         at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:172)

    Hello!!
    The "SLD" with which I am working up to Portal, the "ECC" installs itself as "ABAP.
    Who did the installation and initial configuration, do not leave any documentation about it.
    Now I need to create the systems and "JCo" necessary to enable them to implement the "ESS", but errors are presented above.
    Some days ago I managed to update the CR_Content, which allowed me to create the "JCo" but at the time of the test, says' com.sap.mw.jco.JCO $ Exception: (103) RFC_ERROR_LOGON_FAILURE "name or password incorrect .
    I managed to also create the portal systems, "System Administration -> System Configuration -> Content Portal" and to test it fails on "Test Connection with Connector", which as noted is associated with the connectors "JCo".
    For the creation of the system, I think I should create something in the Visual Administrator ", but within the documentation that I do not see anything about it.
    thanks for your help!

  • Querying on aggregates created on Virtual Cube

    Hello,
    I have implemented a virtual InfoProvider with services. When I create queries directly on the virtual InfoProvider, the query runs fine and I see the report.
    As per my requirement, I created an aggregate on the virtual InfoProvider. Then I defined a query on the aggregate. But when I execute this query, I get the following errors:
    Error reading the data of InfoProvider AG4
    An exception with the type CX_SY_REF_IS_INITIAL occurred, but was neither handled locally, nor declared in a RAISING clause
    Dereferencing of the NULL reference.
    Would appreciate any assistance on this topic.
    Thanks
    Priyadarshi

    Yes, it is possible to create aggregates on virtual cubes.
    I would be grateful if anybody who is aware of the aggregate creation method and who has faced similar issues could come forward and throw some light on what the error could be.
    Thanks
