About No aggregation in DTP

hi experts,
I have a question about the "No Aggregation" option on the Update tab of a DTP.
I have read the online help, but I still do not understand it.
Can anybody explain it to me?
Thanks a lot.
Jerry

hi,
Normally in BW data loads we get a few error records because of data quality issues, and these records cause load failures. The main purpose of the temporary storage is that you can park the error records there and load the remaining data into the data target, so that it is available to the business for reporting.
This temporary storage is called the error stack; in it, data is stored based on semantic groups.
Semantic groups specify how you want to build the data packages that are read from the source (DataSource or InfoProvider). To do this, you define key fields. Data records that have the same key are combined in a single data package.
This setting is only relevant for DataStore objects with data fields that are overwritten. It also defines the key fields for the error stack. By defining the key for the error stack, you ensure that the data can be updated in the target in the correct order once the incorrect data records have been corrected.
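To make the grouping idea concrete, here is a hypothetical Python sketch (not SAP code; the record fields "doc" and "item" are made up): records that share the same semantic key land in the same data package, and erroneous records would be parked in an error stack keyed the same way.
from collections import defaultdict

def build_packages(records, key_fields):
    # group records by their semantic key (the chosen key fields)
    packages = defaultdict(list)
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        packages[key].append(rec)
    return packages

records = [
    {"doc": "4711", "item": 10, "amount": 100},
    {"doc": "4711", "item": 10, "amount": -100},   # same key -> same package
    {"doc": "4712", "item": 20, "amount": 50},
]
for key, package in build_packages(records, ["doc", "item"]).items():
    print(key, package)
Because the same key also defines the error stack, corrected records can later be posted to the target in the right order relative to the records that already loaded.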
I hope this helps; if you need complete details, read the link below.
http://help.sap.com/saphelp_nw70/helpdata/en/42/fa1bcbcf2c1aa2e10000000a422035/frameset.htm
Regards
KSR

Similar Messages

  • Confused about Exception Aggregation

    Gurus-
    I'm rather confused about exception aggregation. I've reviewed Himanshu Mahajan's document on SDN (http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/f0b8ed5b-1025-2d10-b193-839cfdf7362a?QuickLink=index&overridelayout=true), and I'm still confused.
    Can someone offer some suggestions ?
    Specifically, here's what I've got (by example):
    Infocube Contents:
    Record 1 - Vendor A, Material B, Value $100
    Record 2- Vendor A, Material C, Value $200
    I need to return
    Vendor   Material   Value  Sum of All Vendor's Records
    My query results should look like this:
    Row 1, Vendor A, Material B, Value $100, Sum $300
    Row 2, Vendor A, Material C, Value $200, Sum $300
    How can I accomplish this in a Bex 7 Query ? I'm fiddling with Exception Aggregates (Total) on Vendor but don't get what I need. I've played with a Calculated Key Figure with an Exception Aggregate (Total) on Vendor, but again, no joy.
    Many thanks, o Gurus !

    Hi,
    As I can see from your report output, you need a separate column with the sum of values per vendor.
    If that is your requirement, first apply exception aggregation TOTAL with reference characteristic material. In the result you will then get the total sum.
    Since you need this in a separate column, create a new formula and use SUMCT(value).
    This will give you the desired output.
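    To illustrate what the SUMCT-style column produces, here is a hypothetical Python sketch (not BEx; the field names are made up): every row keeps its own value and additionally shows the overall total for its vendor.
    rows = [
        {"vendor": "A", "material": "B", "value": 100},
        {"vendor": "A", "material": "C", "value": 200},
    ]
    # total per vendor, i.e. the "sum of all the vendor's records" column
    totals = {}
    for r in rows:
        totals[r["vendor"]] = totals.get(r["vendor"], 0) + r["value"]
    for r in rows:
        print(r["vendor"], r["material"], r["value"], totals[r["vendor"]])
    # A B 100 300
    # A C 200 300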
    For more information, read the links below.
    http://help.sap.com/saphelp_nw70/helpdata/en/45/6148bab9ac1deee10000000a155369/content.htm
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/007ba7a0-bcea-2b10-7e89-cbcb9db98a28
    Hope it helps.
    Regards,
    AL

  • About dynamic aggregation and using model

    I want to use a model to realize dynamic aggregation.
    For example, I have a dimension named "product" that contains values like:
    "apple" "orange" "beer" "beef" "bullfrog" "coffee"....
    I also have a "wastage" measure based on this dimension.
    I sometimes want to see fruit wastage (fruit = apple + orange), sometimes breakfast food wastage (breakfast = apple + beef + coffee), and sometimes other aggregations.
    How can I solve this problem? Thanks for any suggestion.

    There are two answers depending on the scoping that
    you want the members to have. Both cases use the
    same model:
    DFN Foodmod MODEL
    MODEL
    DIMENSION product
    FRUIT = AGGREGATION('APPLE' 'ORANGE')
    BREAKFAST = AGGREGATION('APPLE' 'BEEF' 'COFFEE')
    END
    If you want the dynamic calculations to be persistent
    across sessions then you should make the model part
    of the definition of the aggmap.
    DFN Foodmap AGGMAP
    AGGMAP
    Relation Food.Food
    Model Foodmod PRECOMPUTE(na)
    END
    If you just want the positions to last for the session
    you can dynamically add a model to a pre-existing
    aggmap.
    DFN Foodmap AGGMAP
    AGGMAP
    Relation Food.Food
    END
    CONSIDER Foodmap
    AGGMAP ADD Foodmod

  • About DTP

    Hi everyone,
    Can anyone please tell me what the use of DTP is in BI? What is the exact functionality of DTP?
    Thanks
    Sankar

    Hi,
    DTP:
    The data flow in 7.0 is as follows: first you get data into the PSA with an InfoPackage, then you use a DTP to load from the PSA to the data provider (DSO), and another DTP to load to a further InfoProvider (cube).
    Is DTP like transferring data from one InfoObject or InfoCube to another InfoObject or InfoCube?
    No, a DTP schedules data from the PSA to one or more InfoProviders; you need an InfoPackage to load data from the source system into the PSA.
    Transformations are just like transfer/update rules in 3.x: they are the mapping you define between the source fields and the target fields.
    The DTP then loads the data into the InfoProvider according to that mapping.
    Purpose:
    You use the data transfer process (DTP) to transfer data within BI from
    one persistent object to another object, in accordance with certain transformations
    and filters. In this respect, it replaces the data mart interface and the InfoPackage.
    As of SAP NetWeaver 7.0, the InfoPackage only loads data to the entry layer of BI (PSA).
    The data transfer process makes the transfer processes in the data warehousing layer
    more transparent. Optimized parallel processing improves the performance of the transfer
    process (the data transfer process determines the processing mode). You can use the data
    transfer process to separate delta processes for different targets and you can use filter
    options between the persistent objects on various levels. For example, you can use
    filters between a DataStore object and an InfoCube.
    Data transfer processes are used for standard data transfer, for real-time data
    acquisition, and for accessing data directly.
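    Purely as a conceptual illustration (not SAP code; the function names below are made up), here is a small Python sketch of that flow: the InfoPackage only fills the entry layer (PSA), and the DTP moves data from the PSA to a target while applying the transformation and a filter.
    def infopackage_load(source_records):
        # InfoPackage: source system -> PSA (entry layer), stored as delivered
        return list(source_records)

    def dtp_load(psa, transformation, record_filter):
        # DTP: PSA -> target, applying the transformation and an optional filter
        return [transformation(r) for r in psa if record_filter(r)]

    psa = infopackage_load([{"plant": "1000", "qty": 5}, {"plant": "2000", "qty": 7}])
    dso = dtp_load(psa,
                   transformation=lambda r: dict(r, loaded=True),   # field mapping
                   record_filter=lambda r: r["plant"] == "1000")    # DTP filter
    print(dso)   # [{'plant': '1000', 'qty': 5, 'loaded': True}]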
    Find the links below for complete information on DTPs in BI 7.0:
    SAP NetWeaver 7.0 BI: Data Transfer Process (DTP) / Blog Series
    http://help.sap.com/saphelp_erp2005vp/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    Hope this helps.
    regards
    gaurav

  • Aggregation in Bex

    As a novice in BI, I have some questions about aggregation in BEx.
    1) As you already know, everything on the Calculation tab of the key figure properties is only used for displaying the result. I want to set the result value on this tab and then use that value to calculate other figures. Is there any solution?
    For example, I set 'Calculation Result As' = 'AVG' and I want to use the average value to calculate others.
    2) I really dislike the limited set of aggregations that SAP provides. I want to implement my own aggregation. Is there any user exit to serve my requirements?
    Thank you very much. These problems always disturb my life!!!

    Hi,
    1) You are correct... I have also faced plenty of problems regarding aggregation, but I think some of them may be solved in the BI 7.0 BEx Query Designer.
    As for older versions: you can write VB code/macros to get the desired results.
    2) I am not sure which scenario you are talking about... the aggregation would depend on a lot of factors.
    Thanks for any points.

  • Optimization Level never rises during aggregation design

    I've recently added a couple of new dimensions to an old SSAS database.   I decided since I added new dimensions I should re-design the aggregations.
    I use the aggregation design wizard, and it acts pretty normal for most of the measure groups.  
    But then I get to the only measure group that connects to my two new dimensions.   This happens to be a very large measure group and it uses 19 different cube dimensions.
    When I run the aggregation design wizard on this group, I choose the "until I click stop" option and let it go, and it starts designing aggregations.   The number of aggregations designed keeps going up, and the storage space allocated
    keeps going up, but the optimization level stays at 0% the whole time.   After a few minutes it gives up at about 200 aggregations and 6 gigs of space used, and still 0% optimization.
    Is there any possible scenario in which this might be expected and normal, or should I be worried?   I've never seen this happen before.
    -Tab Alleman

    Hi Tab,
    If you select the ‘I click Stop’ option and watch the design grow until the estimated size is ridiculously large (maybe over a couple of Gb) you can then get a feeling for how many small aggregations can be built; you can then stop it, reset the aggregations
    and then restart using either the ‘Performance Gain’ or ‘Storage Reaches’ option set to an appropriate level.
    I would suggest you refer to the following articles on best practices and on designing effective aggregations in SSAS:
    Designing Effective Aggregations in AS2005:
    http://cwebbbi.wordpress.com/2006/10/23/designing-effective-aggregations-in-as2005/
    Aggregation Design Best Practices:
    http://technet.microsoft.com/en-us/library/cc966399.aspx#EBAA
    Regards,
    Elvis Long
    TechNet Community Support

  • Exception aggregation doesn't work correctly

    Hi,
    Here we have a problem with exception aggregation. We created two queries: one is based on an
    InfoCube, the other is based on a DSO.
    Both queries have the same definition, and the data in the InfoCube comes from the DSO.
    In the query definition we created a formula, and in this formula we used an exception aggregation.
    For example, we used 'count all values <> 0' with reference characteristic city to count
    all cities whose sales value is above a certain value.
    We get different results from these two queries.
    The result from the DSO query is what we need.
    Can you tell me why we get different results?
    Our system information is:
    SAP NetWeaver 2004s
    SAP_ABA     700     0009     SAPKA70009     Cross-Application Component
    SAP_BASIS     700     0009     SAPKB70009     SAP Basis Component
    PI_BASIS     2005_1_700     0009     SAPKIPYJ79     PI_BASIS 2005_1_700
    SAP_BW     700     0009     SAPKW70009     SAP NetWeaver BI 7.0
    BI_CONT     703     0000          -     Business Intelligence Content
    Thanks

    Jin,
    Ideally one way to find out if exception aggregation is working fine is to drill down based on the exception aggregation characteristic and see if the formula is being calculated right...
    Arun
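    As a purely illustrative Python sketch (not BEx; the city names and values are made up) of what 'count all values <> 0' with reference characteristic city does: the value is first determined per city, and only the non-zero per-city results are counted.
    sales_by_city = {"Berlin": 500, "Hamburg": 0, "Munich": 300}
    count = sum(1 for v in sales_by_city.values() if v != 0)
    print(count)   # 2
    If the two providers hold the data at different granularity, the per-city intermediate values differ and so does the count, which is why the drill-down check above is useful.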

  • Aggregator not binding files

    Needing help again. This time I'm going back to my old problem with the Aggregator. I have three lessons that I want to bind together because I have quizzes in each of them. (I don't have the SCORM Packager, and the other binder program has not worked.) Up until today, using the Aggregator to bind my files was working fine: it would bind the files together with a TOC on the side and go from one slide to the next, no problem. Now it still does much the same thing, except that when I run the published Aggregator output, the TOC does not appear for the second and third lessons and the lessons continue in a very wonky sort of way.
    Then I inserted all of the necessary files into Moodle. Moodle lets me run the first lesson but kicks me out after the first lesson is over and takes me to the Moodle landing page. First thing this morning everything was working fine; it wasn't until I changed the names of my files that things broke. The only thing I did differently was rename my files, i.e. my three lessons, and make a new Aggregator file so that appropriate names would be displayed when the lessons were being played.
    I put the following files into Moodle
    Health Center Prog.agg
    Health Center Prog.htm
    Health Center Prog.swf
    Intro to HC,swf
    History of HC.swf
    Prog Req.swf
    Any idea what I could have done to make the lessons stop playing and now kick me out of the lesson and throw me to the Moodle landing page?
    I'm using Captivate 6 on a Mac.
    Please please please anything you can suggest would be helpful.
    Thanks in advance.
    Maree

    I don't honestly believe Storyline is going to solve this particular problem for you.  It doesn't have either an Aggregator app or a Multi-SCORM Packager app like Captivate offers.  The reason your content wasn't working before was due to your changing the names of the files and breaking the links.  That would happen in Storyline as well.
    By the way, you have shown the published file names in one of your original posts as having spaces in the names:
    Health Center Prog.agg
    Health Center Prog.htm
    Health Center Prog.swf
    Intro to HC,swf
    History of HC.swf
    Prog Req.swf
    Please be aware that when publishing content to a web server for delivery over an intranet or the internet you should NOT have spaces in the names as this can also break links.  Use hyphens or underscores instead of spaces.  In Captivate, this means you need to set the name in the Project Title field in the Publish dialog correctly to output the files without spaces.
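    If it helps, here is a tiny hypothetical Python sketch (the file names are just based on the ones quoted above) of the kind of renaming meant here, replacing spaces before upload; in practice you would set the Project Title in Captivate's Publish dialog instead of renaming afterwards.
    published = ["Health Center Prog.htm", "Health Center Prog.swf", "Intro to HC.swf"]
    for name in published:
        safe = name.replace(" ", "-")   # "Health Center Prog.htm" -> "Health-Center-Prog.htm"
        print(name, "->", safe)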
    Also, there's no point uploading the .agg file to your web server or Moodle as this is just the file that Captivate reads in order to create the Aggregated project output.  It's not required for the final content on the server.

  • Cost-based aggregation

    Hi,
    Is it possible to find out how the cube is aggregated when you use cost-based aggregation?
    Cost-based aggregation gives me reasonable load times, disk usage and query times. But I can't use it because one of my hierarchies changes rather often, causing the complete cube to be re-aggregated. If I use level-based aggregation I can overcome this problem, but I am having trouble finding the best configuration for which levels to aggregate on.
    Regards /Magnus

    Magnus,
    I think you are asking about dynamically aggregating over a hierarchy (or some parts of a hierarchy, like a level or a member).
    AWM does not expose that kind of functionality, but it is there in the OLAP engine.
    You can set the levels or even the parent members for which the cube data is pre-aggregated. For all the other levels or parent members, it will be dynamically aggregated. This is done through PRECOMPUTE.
    Here is some explanation. The example is about doing completely dynamic aggregation over a hierarchy. After that I mention other PRECOMPUTE conditions that you can use.
    Let's say you want the cube to be dynamically aggregated over a hierarchy at query time (instead of pre-aggregating over that hierarchy). You can set the PrecomputeCondition of the cube by selecting the dimension and setting PrecomputeCondition to NONE. If you describe the AGGMAP for this cube (in the OLAP Worksheet), you will then see PRECOMPUTE(NA) for that dimension. In the case of uncompressed cubes, the AGGMAP may still show PRECOMPUTE(<valueset>), but that valueset will be empty.
    You can also query ALL_CUBES view to see the PRECOMPUTE settings. For more PRECOMPUTE options look at RELATION statement documentation in AGGMAP at http://docs.oracle.com/cd/E11882_01/olap.112/e17122/dml_commands_1006.htm#i1017474
    EXAMPLE:
    begin
    dbms_cube.import_xml(q'!
    <Metadata Version="1.3">
      <Cube Name="BNSGL_ACTV" Owner="BAWOLAP">
        <Organization>
          <AWCubeOrganization PrecomputeCondition="BAWOLAP.PRODUCT NONE"/>
        </Organization>
      </Cube>
    </Metadata> !');
    end;
    In addition to NONE, the other options for PRECOMPUTE are
    (1). ALL
    (2). AUTO
    (3). n%
    (4). levels of dimensions to be precomputed
    (5). a list of one or more parent members to be precomputed. For the rest of the parent members, dynamic aggregation will be done at query time.
    (6). According to documentation, some conditional statements can be used also (although I have not tried it). For example:
    PRECOMPUTE (geography.levelrel 'L3')
    PRECOMPUTE (LIMIT(product complement 'TotalProd'))
    PRECOMPUTE (time NE '2001')
    Note that there may be a bug because of which the dimensions (over which the dynamic aggregation is desired) should be the last dimensions in the aggregation order.
    For your situation, you should look at (4), (5) or (6).

  • When to use aggregation?

    hi all,
    I have a question about using Aggregation relationship.
    if we have a class Cat and its attributes are name, age, height.
    I don't really have to draw other classes (Class Name, Class Age and Class Height); I just write them in the same class Cat, right?
    But if the attributes were Body, Leg, Head, should they be separate classes: Class Body, Class Leg and Class Head? Why is it that in the first situation I just add them to the Cat class, while in the other case they should have their own classes? I need to know the difference and the reason, so I don't get confused when drawing class diagrams.
    Thank you

    mshadows wrote:
    > if we have a class Cat and its attributes are name, age, height, I don't really have to draw other classes (Class Name, Class Age, Class Height); I just write them in the same class Cat, right?
    Right.
    > but if the attributes were Body, Leg, Head, they should be separate classes: Class Body, Class Leg and Class Head.
    Not necessarily. It depends on how you will use this information.
    > why is it that in the first situation I just add them to the Cat class and in the other case they should have their own classes?
    A Leg or a Head can have lots of attributes of its own (width, height, color, hairiness, and boolean attributes like isBleeding or whatever), whereas Age and Height are single attributes, i.e. there is one value each. Leg and Head are entities/objects; age and height are attributes that describe an entity/object.
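    As a rough illustration (a hypothetical Python sketch; the attribute names are invented), single-valued things stay plain attributes, while a part with structure of its own becomes a separate class that Cat aggregates:
    class Head:
        def __init__(self, width, color, is_bleeding=False):
            self.width = width
            self.color = color
            self.is_bleeding = is_bleeding

    class Cat:
        def __init__(self, name, age, height, head):
            # name/age/height are single values -> plain attributes
            self.name = name
            self.age = age
            self.height = height
            # a head has attributes of its own -> modelled as its own class and aggregated
            self.head = head

    felix = Cat("Felix", age=3, height=25, head=Head(width=8, color="black"))
    print(felix.head.color)   # black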

  • EYE 007 Aggregated Value for Analysis Authorisations

    Hi there,
    I'm attempting to unit test a new report in our development environment via RSECADMIN. Having created the role and assigned it to the test user, I get the error that the aggregated values for particular characteristics are empty. However, I've already added these characteristics to an analysis authorisation and used it for another report, where the characteristics are found.
    I'm stumped as to why this report doesn't find the same values. I've generated the role and run a user master compare, but it still fails. Any help is appreciated.
    Thanks.

    1. Take the InfoProvider on which you have created your query and find which characteristics are authorization-relevant for that MultiProvider/InfoProvider.
    2. Make sure all these characteristics are added to the analysis authorizations assigned to the user: detailed field values for the characteristic your report is about, the aggregated value for the others, and all the relevant 0TCA* content as well.
    The report should then work. However, in your case it seems you are assigning the characteristics using separate analysis authorizations; in that case, make sure the concerned InfoProvider is mentioned in each analysis authorization under 0TCAIPROV so that the analysis authorizations combine.

  • Aggregator Service

    Hi,
    I have a query related to the Aggregator Service.
    We recently made use of aggregator service. The Hourly Report from this service
    looks like this,
    Start Reporting      21 May 2007 02:00:00 GMT          
    End Reporting      22 May 2007 02:00:00 GMT          
    Reporting Interval      Hourly          
    Day      Date/Time                                      Named        Anonymous
    Mon      21. May 2007 02:00 (GMT)      18       27
    Mon      21. May 2007 03:00 (GMT)      21       41
    Mon      21. May 2007 04:00 (GMT)      19       42
    Mon      21. May 2007 05:00 (GMT)      6       20
    Mon      21. May 2007 06:00 (GMT)      4       9
    Mon      21. May 2007 07:00 (GMT)      4       25
    I would like to know if these numbers are cumulative.
    Meaning, if a user logged in between 2:00 and 3:00 and is active until 5:00, will the user also be counted in the next hour's count (3:00 - 4:00)?
    Thanks,
    Vikas

    I am not sure about the Aggregator Service in the Order to Bill PIP, but the aggregator programming model was introduced for a specific purpose.
    We had this use case:
    In the MDM Customer project, the Siebel application has create/update triggers defined at the database level (as opposed to in the UI frames), so any update/create action can potentially lead to multiple events being raised for integration. Therefore, there is a need to aggregate these events and process them in batches instead of processing each fine-grained event by itself.
    The events can be raised on the following business entities: Account, Contact and Address.
    It is also required to maintain the relationships between the above entities when doing the aggregation: an account can have one or more Contacts/Addresses attached to it. Similarly, a Contact can have one or more Addresses attached to it. Also, contacts and addresses can be shared across multiple accounts in Siebel.
    The Event Aggregation Programming Model provides a comprehensive methodology for business use cases where aggregation of events / entities / messages is needed.
    Event Aggregation is needed for the following reasons (a rough sketch of the idea follows the list):
    1. Multiple events are raised before the completion of a business message, and each incomplete message triggers an event, which causes a business event in the integration layer.
    2. To have a holistic view of an entity.
    3. To synchronize entities so there is a single view of the entity.
    4. To increase performance.
    5. Several fine-grained events need to be consolidated into a single coarse-grained event.
    6. To merge duplicates of the same event.
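    Here is that sketch, a hypothetical Python illustration (not the actual PIP/AIA implementation; the entity names are made up) of collapsing many fine-grained events into one batch per entity:
    from collections import defaultdict

    events = [
        {"entity": "Account:100", "action": "update", "field": "name"},
        {"entity": "Account:100", "action": "update", "field": "address"},
        {"entity": "Contact:7",   "action": "create", "field": None},
    ]
    batches = defaultdict(list)
    for e in events:
        batches[e["entity"]].append(e)
    for entity, batch in batches.items():
        # one coarse-grained event per entity instead of one per trigger firing
        print(entity, "->", len(batch), "fine-grained events processed together")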
    Thanks

  • Key figure summation

    Hi All
    I searched SDN forums and could not find a solution for the same.
    I have added a key figure as a display attribute in 0costcenter master data. The same is available as a display attribute in the headcount cube (0papa_c02) via 0mast_cctr. Now at reporting level, the values for this key figure are not summing up at the node level of the cost center hierarchy, which is the normal behaviour of a display attribute. It works fine for normal key figures.
    So as a solution, I added this key figure to the headcount cube and populated it via a routine from 0costcenter master data. The values are populated perfectly fine in the headcount cube. But the problem is that, as the headcount cube has multiple entries for the same cost center, the values for the key figure are getting aggregated at reporting level. I even tried using exception aggregation and non-cumulative key figures to get the non-aggregated value, but I am unable to solve this.
    Could you please guide me on how I can achieve this? I know I can achieve it if I use an ODS, but that would affect the whole design. Please suggest.
    Cheers
    Chanda

    Hi Jacob
    I created one calculated key figure which contains the real key figure from the cube. When I press OK, it gets saved. The only thing I can see is the properties of the calculated key figure / key figure, where I can see 'count all values <> 0' under 'Calculate Result As'.
    I could not find the setting below at the query level. Are you talking about the Aggregation tab in the InfoObject maintenance on the BW side?
    "on the next screen in bottom left corner, there is a button (I think it will say "Expand" if you log on in english). Click that one and now you get to set the aggregation behaviour. Set the exception aggregation to 'count all values 0' and the reference char to the costcenter (I think your mst. costcenter in this case)"
    Cheers
    Chanda

  • Error while activating the datasource during migration

    Hi Guys,
    When trying to migrate the DataSources from 3.x to BI 7.0, we are encountering an error with the activation of the DataSource. The DataSource itself has been migrated, but the activation log gives an error:
    Error in activating the DataSource (message RSO404)
    Has anyone encountered this issue? Your input is highly appreciated.
    Regards,
    Doniv

    Voodi,
    Thanks for the response but the note talks about deletion of the DTP causing the errors. Moreover we are already on SP 13. Any other ideas?
    Doniv

  • Appearance of result row in query

    Hi guys,
    I have the following query example (part of it):
    row1 100.000 credit amount
    row2 100.000 credit amount
    row3 100.000 credit amount
    result 300.000
    I only want the following to be displayed:
    row1 ...
    row2 ...
    row3 ...
    result 100.000
    I know how to handle the rows (1-3) so that they appear empty, but I don't know how to make the result appear as 100.000.
    I thought about exception aggregation, but the key figure I'm using in this case is not calculated (only uploaded), and as far as I know exception aggregation only works for calculated key figures.
    Any idea?

    Hi, there are two ways.
    1st: you can define 'Constant Selection'.
    2nd: if there are issues due to CKF restrictions, you can perform a simple average using a 'Formula' like SUM(X) / number of records, e.g. (100+100+100)/3 = 100. Try the option which keeps the value constant; you can also set 'No Display' for the rows, e.g. via 'Display Results As..' on the last tab of BEx titled 'Calculations'. You can get the number of records by using 0ROWCOUNT in case of a DSO, or by using data functions like COUNT(X).
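    Just to spell out the arithmetic of the second option (a trivial Python sketch; the values are the three example rows above):
    values = [100, 100, 100]              # the three uploaded rows
    print(sum(values) / len(values))      # 100.0, i.e. SUM(X) / number of records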
