Data accuracy on the aggregates operations report

Hi,
I am facing an issue; please help me in solving it.
A user was reviewing the data for accuracy on the Aggregates Operations Report (AOR)
and found some discrepancies. This is for plant 3001 (Carden) for March. The user gets
57,051 from the COIO transaction in SAP and from the Inventory Movement
report in BW under production receipts, but on the Aggregates Operations
Report the extracted figure is only 30,840. The surge material in question
is the amount that should show up in extracted tonnes on the AOR.
The user has tried running different reports but can't figure out where the
difference comes from. Also, the Net Processed Volume and Aggregates Processed figures
are incorrect. Net Processed Volume should be the total of materials
9050 (6,313 tonnes), 9061 (35,429 tonnes), 9094 (3,045 tonnes), and 9106
(10,628 tonnes), i.e. 55,415 tonnes, and Aggregates Processed should be the same total
excluding 9094, i.e. 52,370 tonnes.
Thanks,

Hi Lalitha,
From what I could understand, one SAP report and two BW reports are mentioned in your question.
The SAP transaction COIO and the Inventory Movement report in BW are showing the same value, while the AOR report is showing a different value for plant 3001.
Please check the following:
1) How are the key figures (such as Net Processed Volume and Aggregates Processed) computed?
2) Which fields from R/3 are used for this computation?
The reports within BW are showing a difference, which means their sources and data flows have been modeled differently. Analyze whether the back-end design (the modeling) was done based on functional specifications given by the business; if so, that can be your justification.
If this doesn't help, you might have to give some more details about your problem.
All the best!
Regards
Sudeepti

Similar Messages

  • Z1 data not being uploaded to the Aggregates Operations Report (AOR)

    Hi,
    I am facing an issue. The downtime entered in a Z1 notification should be uploaded into the AOR (Aggregates Operations Report). The downtimes entered in Z1's for the secondary plant have been transferred to the AOR downtime section, but the downtimes entered in Z1's for the Upper plant have not been transferred to the AOR downtime section.
    The report runs on a MultiProvider.
    Please help me out in this issue.
    Regards,
    Sindhu

    Hi,
    Thanks for your response. I have checked the transfer rules and update rules. Could you tell me how I can proceed further to find out why the downtime is not transferred to the report?
    Regards,
    sindhu

  • Is Data Guard a good option for operational reports?

    One of our DBAs has suggested using Data Guard to produce real-time or nearly real-time (15 min. delay) operational reports, so that we can remove the reporting load from the OLTP system. Is it good practice to use Data Guard for reporting purposes? What if the production OLTP system goes down (planned or unplanned maintenance), will the Data Guard standby act as the production OLTP database? Can anyone in this forum provide some help to understand the Data Guard concept?

    Whether this is a good option or not depends on a lot of factors, including which version of Oracle you are using and which operating system you have selected.
    In its simplest form, Data Guard is the controller for hot standby synchronization and database failover. But once you have a hot standby you may want to use it for other things. That additional use involves compromise; for example, while the standby is used for reporting, it is not applying the sync data [as fast] and there could be unacceptable delays.
    I encourage you to start with the Oracle portal for Data Guard and get familiar with some of the white papers. Then move on to the documentation and the forum dedicated to the technology.
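    As a rough illustration of that compromise (a minimal sketch, assuming a physical standby without the 11g Active Data Guard option, so the database has to be opened read-only and redo apply paused while reports run):
        -- On the physical standby: pause redo apply before opening for reporting
        ALTER DATABASE RECOVER MANAGED STANDBY DATABASE CANCEL;
        -- Open the standby read-only so reporting users can query it
        ALTER DATABASE OPEN READ ONLY;
        -- ... run the operational reports here; redo is NOT being applied ...
        -- Afterwards resume redo apply so the standby catches up (depending on
        -- the release you may first need to restart the instance to MOUNT)
        ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT FROM SESSION;
    While the standby is open read-only it falls behind the primary, which is exactly the delay trade-off described above.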

  • Sending Date parameter  from the form to Report

    Dear Friend
    Hi everybody, I am sending a date parameter to my report through a form, and my date format in the form and in the report is DD/MM/YYYY. But when I enter the date as 18/1/2004, the system passes this value as 18-JAN-04. I want the system to pass my date in the format 18/1/2004, not as 18-JAN-04. How can I do that?
    Best regards
    Jamil

    Ino,
    thanks for your answer.
    You're right about the index issue, so I've written my SQL as you said in your post. But I get an error: "ORA-... non-character found where character expected..." or something like that. I've checked nls_date_format on my client machine and on the server (DB 8.1.7.4), and also the regional settings on both machines, and all are set to 'DD-MON-YY'. What can be wrong?
    Thanks again for your time.
    Jaime.
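    One way to take the NLS settings out of the picture (a minimal sketch; the table, column and bind names here are made up) is to pass the value as text and convert it with an explicit format mask instead of relying on NLS_DATE_FORMAT:
        -- Accept the parameter as text in DD/MM/YYYY and convert it explicitly,
        -- so the client/server NLS_DATE_FORMAT settings no longer matter
        SELECT *
          FROM orders
         WHERE order_date = TO_DATE(:p_order_date, 'DD/MM/YYYY');

        -- Going the other way, format a date for display with an explicit mask
        SELECT TO_CHAR(SYSDATE, 'DD/MM/YYYY') AS today FROM dual;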

  • Event Log Error: Microsoft.ResourceManagement: System.Data: System.InvalidOperationException: The requested operation cannot be completed because the connection has been broken. at System.Data.SqlClient.SqlInternalConnectionTds.ExecuteTransaction

    Has anyone ever seen this? Any Clues?

    I had set up a clean environment some weeks ago, and just today I started configuring MAs. When I got to the FIM MA I wanted to do an import, sync and export, and when I got to the export it seemed to run endlessly. In the event log I got the same errors as you did.
    I didn't find any real hints in SQL or in the FIM request history... After doing a good old reboot it just started working...
    http://setspn.blogspot.com

  • How to delete aggregated data in a cube without deleting the aggregates?

    Hi Experts,
    How can I delete aggregated data in a cube without deleting the aggregates?
    Regards
    Alok Kashyap

    Hi,
    You can deactivate the aggregate. The data will be deleted but the structure will remain.
    If you switch off an aggregate, it won't be identified by the OLAP processor; the report will fetch the data directly from the cube. Switching off an aggregate won't delete any data, but temporarily the aggregate will not be available, as if it had not been built on the InfoCube. Queries will not use a switched-off aggregate. The definition of the aggregate is not deleted.
    You can temporarily switch off an aggregate to check whether you need it. An aggregate that is switched off is not used when a query is executed. It will still hold the data from the previous loads. A switched-off aggregate is not available for reporting, but data can still be rolled up into it.
    If you deactivate an aggregate, the data is deleted from the aggregate but the aggregate structure remains the same.
    The system deletes all the data and database tables of the aggregate; the definition of the aggregate is not deleted.
    Later, when you need the aggregate again, you have to build it up from scratch.
    Hope this helps.
    Thanks,
    JituK

  • How can I see the data in the aggregates

    How can I see the data available in the aggregates?
    Jay

    Hi Jay,
    It's quite simple:
    go to the Manage Aggregates screen and copy the technical name of the aggregate. The E fact table of the aggregate is /BIC/EXXX, where XXX is the aggregate's technical name, and the F fact table is /BIC/FXXX. Go to SE16, enter the table name, and your data is there.
    R
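    If you would rather query than browse in SE16, the same naming convention applies at the table level. A sketch, assuming a purely hypothetical aggregate with technical name 100023:
        -- E fact table of the aggregate (compressed data)
        SELECT * FROM "/BIC/E100023";
        -- F fact table of the aggregate (uncompressed requests)
        SELECT * FROM "/BIC/F100023";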

  • How to display an "All Day Event" date correctly in an integrated SSRS Report?

    I have two event items in a calendar list in SharePoint 2010. Both items have the same start time and end time. One of them, however, has the "All Day Event" checkbox checked. If I access them through a REST service, this is how the data is
    returned:
    <!-- item 1 -->
    <d:StartTime m:type="Edm.DateTime">2014-03-21T00:00:00</d:StartTime>
    <d:EndTime m:type="Edm.DateTime">2014-03-25T23:55:00</d:EndTime>
    <d:AllDayEvent m:type="Edm.Boolean">false</d:AllDayEvent>
    <!-- item 2 -->
    <d:StartTime m:type="Edm.DateTime">2014-03-21T00:00:00</d:StartTime>
    <d:EndTime m:type="Edm.DateTime">2014-03-25T23:59:00</d:EndTime>
    <d:AllDayEvent m:type="Edm.Boolean">true</d:AllDayEvent>
    I have a report in the same SharePoint 2010 site that uses SSRS in integrated mode. The data source is the calendar list mentioned above. The date fields are not formatted, just displayed as they come from the list/database.
    My locale is set to en-US. When I run the report, the start date for item 1 is displayed as "3/21/2014" ('all day' set to false) but the start date for item 2 is displayed as "3/20/2014" which is incorrect ('all day' set to true).
    I did some research online and found out that SharePoint stores all date fields as UTC except for 'All Day Events', which are stored in local time (our servers are in Central Time, but I'm running the report from Pacific Time, in the US).
    I couldn't find a solution to display the date correctly in the integrated SSRS report. Is there a way, maybe some straightforward formatting, to show All Day Event dates correctly? I tried adding hours, but that is inconsistent with daylight saving time changes.
    I would appreciate any help.
    C#, Sharepoint

    Hi SharpBud,
    The date for an all-day event is stored in SQL in GMT, so the start time for an all-day event is returned in GMT, which most likely is not your local time.
    This is a confirmed issue. As a workaround, I would suggest you use a calculated column for the event, with the following formula:
    IF(TEXT(([End Time]-[Start Time])-TRUNC(([End Time]-[Start Time]),0),"0.000000000")="0.999305556",IF([Start Time]=ROUND([Start Time],0),[Start Time],DATE(YEAR([Start Time]),MONTH([Start Time]),DAY([Start Time])+1)),[Start Time])
    Thanks,
    Qiao Wei
    TechNet Community Support

  • Can we dynamically select DataSets in Data Model of a BI Publisher Report?

    Hello,
    I have a requirement as below -
    For a report 'XX Report1' we have Data Model(concatenated SQL Data Source) with 3 Datasets -
    New DataSet1 - WebService Call (WSDL) (Say D1)
    New DataSet2 - SQL QueryA (Say D2)
    New DataSet3 - SQL QueryB (Say D3)
    Layout - 'XXRTFFile.RTF'
    And another report 'XX Report2' with Data Model(concatenated SQL Data Source) with 3 Datasets -
    New DataSet1 - SQL QueryC (Say D4)
    New DataSet2 - SQL QueryA (D2)
    New DataSet3 - SQL QueryB (D3)
    Layout - 'XXRTFFile.RTF'
    Note that DataSet2 and DataSet3 have the same queries in both the reports.
    'XX Report1' is called from Siebel frontend through a URL
    'XX Report2' is called from Batch Programming later.
    The requirement is to come up with a single Report for both of these calls.
    So, I thought of merging all the DataSets in to 1 Data Model as D1,D2,D3,D4.
    Is there a way for us to select the DataSets from Data Model using the Parameters when Report is run, so that the report picks up D1, D2, D3 if it is called from Siebel frontend and D4,D2,D3 if its run from Batch Programming?
    Please let me know. Thanks in advance.
    Thanks,
    Phani

    The web services API would probably be your best bet. http://download.oracle.com/docs/cd/E10415_01/doc/bi.1013/e10416/bip_webservice_101331.htm#CHDGIJHH
    There are some Java examples in the above. I'm guessing the C# code wouldn't be too different.
    Bryan

  • SCCM Get Collection Member Fails "Failed to get members of collection. The SMS Provider reported an error"

    Full Error is:  "Failed to get members of collection '{Collection Name from "Initialize Data"}'.". The SMS Provider reported an error. Details: Generic failure
    Orchestrator 2012 R2 7.2.84.0
    Using Integration Pack 7.2 for System Center 2012 Configuration Manager
    I have a Runbook that is setup to use Get Collection Members from SCCM.  The fields are:
    Collection:  {Collection Name from "Initialize Data"}
    Collection Value Type:  Name
    When I use the Runbook Tester (or when I try to run it myself) I get the error:
    "Failed to get members of collection '{Collection Name from "Initialize Data"}'.". The SMS Provider reported an error. Details: Generic failure
    I have tried using the Collection ID and I get the same error.  From the Collection Property in Get Collection Members I can browse and see all the collections in SCCM, so I know the connection is working.
    An example of a collection I am trying to use is this:  SUM - Patch and Reboot Server - 3rd Sat 1AM .  I have tried using it with quotes and single quotes as well and I get the same error.
    The log file for SCCM looks like this when it fails:
    CSspQueryForObject :: Execute...~  $$<SMS Provider><03-25-2014 11:34:52.904+300><thread=528 (0x210)>
    ~*~*~e:\nts_sccm_release\sms\siteserver\sdk_provider\smsprov\sspobjectquery.cpp(1782) : Failed to parse WQL string SELECT * FROM SMS_Collection WHERE Name = "{Collection Name from "Initialize Data"}"~*~*~  $$<SMS Provider><03-25-2014 11:34:52.904+300><thread=528 (0x210)>
    ~*~*~Failed to parse WQL string SELECT * FROM SMS_Collection WHERE Name = "{Collection Name from "Initialize Data"}" ~*~*~  $$<SMS Provider><03-25-2014 11:34:52.904+300><thread=528 (0x210)>
    Execute WQL  =SELECT * FROM SMS_Collection WHERE Name = "{Collection Name from "Initialize Data"}"~  $$<SMS Provider><03-25-2014 11:34:52.904+300><thread=528 (0x210)>
    Execute SQL =~  $$<SMS Provider><03-25-2014 11:34:52.904+300><thread=528 (0x210)>
    Removing Handle 1792008112 from async call map~  $$<SMS Provider><03-25-2014 11:34:52.904+300><thread=528 (0x210)>
    ExecQueryAsync: COMPLETE SELECT * FROM SMS_Collection WHERE Name = "{Collection Name from "Initialize Data"}"~  $$<SMS Provider><03-25-2014 11:34:52.904+300><thread=528 (0x210)>
    CExtUserContext::LeaveThread : Releasing IWbemContextPtr=1862502880~  $$<SMS Provider><03-25-2014 11:34:52.904+300><thread=528 (0x210)>
    My SCCM Administrator and I are out of ideas about what else to try.  Anyone have some ideas on what might be causing the issue? 
    MCITP | VCP4 | VCP5

    Hi,
    may I ask if you modified the log? Because of ...
    Execute WQL  =SELECT * FROM SMS_Collection WHERE Name = "{Collection Name from "Initialize Data"}"~  $$<SMS Provider><03-25-2014 11:34:52.904+300><thread=528 (0x210)>
    Have you typed the value, or subscribed the Published Data to the Collection field in the "Get Collection Member" activity?
    It must be subscribed: right-click -> Subscribe -> Published Data.
    Regards,
    Stefan
    www.sc-orchestrator.eu ,
    Blog sc-orchestrator.eu
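    For comparison, the log shows the placeholder text being passed literally (the nested quotes break the WQL parse):
        SELECT * FROM SMS_Collection WHERE Name = "{Collection Name from "Initialize Data"}"
    Once the Published Data subscription is in place, the provider should receive the real collection name instead (name taken from the post above, purely as an illustration):
        SELECT * FROM SMS_Collection WHERE Name = "SUM - Patch and Reboot Server - 3rd Sat 1AM"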

  • Can't get around this error after adding second dataset...A scope is required for all aggregates used outside of a data region unless the report contains exactly one dataset

    I added a dataset to an existing report and broke an aggregation.  In the old (i.e. single dataset) report, this expression below worked fine.  I wanted to get a distinct count of the vst_ext_id field when my educated field was like "VTE1*"
    = CountDistinct(IIF(Fields!educated.Value like "VTE1*", Fields!vst_ext_id.Value, Nothing))
    After adding a new dataset, this no longer works and I get the error " A scope is required for all aggregates used outside of a data region unless the report contains exactly one dataset".  Having done some research online, I found that I
    needed to specify my dataset explicitly and I thought this new expression might work, but still no success...
    = CountDistinct(IIF(Fields!educated.Value,"DataSet1" like "VTE12*", Fields!vst_ext_id.Value,"DataSet1", Nothing))
    Am I missing something?  Based on online responses, this explicit dataset naming convention seems to help most people, but it isn't working for me. 
    Thanks in advance!
    Brian

    I found the answer.  Apparently, my expression syntax was off.  This expression does the trick...
    = CountDistinct(IIF(Fields!educated.Value like "VTE12*", Fields!vst_ext_id.Value,Nothing),"DataSet1")
    I just happened upon this particular syntax searching online.  I was trying to specify the dataset name after each .value, but I never got that to work.   This is the only time I have found this particular syntax online. 

  • Data Error in the Query/Report after selective data deletion for infocube

    Hi Experts,
    Please advise what I am missing and what went wrong...
    I have a query (Forecast) on a MultiCube, which is based on 2 InfoCubes with aggregates.
    As I identified some data discrepancies, yesterday I performed a selective data deletion on one of the InfoCubes,
    executed the report the same day, and the results in the query were correct.
    When I executed the same report today, I got different results.
    When I compared the results of the report with the data in the cube, they did not match.
    The report is not displaying the data in the cube; for some rows it displays the data in the cube, but for other rows it just displays the same values as the row above.
    No data has been loaded into the InfoCube after the selective deletion.
    Do I need to perform request compression and fill the aggregates after the selective deletion?
    Please advise what went wrong.

    Hi Venkat,
    No, I haven't done anything to the aggregates before or after the selective deletion.
    As there was no data load after the selective deletion, according to the SAP manual we don't need to do anything to the aggregates, since selective data deletion on the cube deletes the data from the aggregates as well.
    Please advise how to identify the error.

  • Report does not show data , but data exists in the cube.

    Hi All,
    I have a situation where I cannot show the data in the report. When I load data from the extractor 0CO_OM_WBS_1 into a cube directly, I am able to show the data in my report. When I load the same extractor to a DSO and then load from the DSO into the cube, the data does not show up in the query. To check the data I used the same restriction and could see that the data resides in the cube (LISTCUBE). I compressed the requests; still it is not showing up in the query. No aggregates are created on the cube.
    It shows the data if I load directly from the extractor, but not when I load data thru DSO.
    Any ideas.
    Alex(Arthur Samson)

    Hi Alex,
    I am facing the same problem: I have data in the cube but I can't see it in the report...
    I have created a generic DataSource and loaded the data to a DSO and then to the cube.
    The data is loaded successfully and I can see it in the cube (Manage), but when I run the report I am not getting any data.
    I think you have solved this issue... please help me resolve it.
    Regards,
    SHAIK.

  • Can I store only the aggregate data in OLAP cube

    Hi All,
    I know that OLAP cubes store the leaf data and then build the aggregate data on top of it and store it within the cube. I have huge amounts of data (billions of rows in my fact table and 6-8 dimension tables). If I keep the leaf data along with the aggregate data within the cube, it would be too large to build.
    So I am thinking of storing only the aggregate data within the OLAP cube and, for the leaf data, still reading from the relational tables. Something like hybrid OLAP.
    What I mean is:
    1. Create the dimensions and cube in AWM on 11g.
    2. Also specify the levels for which I want the aggregate data to be calculated and stored in the OLAP cube.
    What I want is:
    1. To store only the aggregate data in the cube and keep the leaf data in the relational tables.
    2. When I read the cube and drill down to the leaf level, it should fetch the leaf-level data from there.
    Is this possible in Oracle OLAP? If yes, please provide some pointers.
    Thanks
    S

    First you should try storing and aggregating the data, to at least see whether the cube-loading time, query time and AW size are within acceptable limits. 11g OLAP, especially 11gR2 OLAP, is very efficient on all three fronts.
    Regarding specifying levels, you can either use cost-based aggregation and pick the percentage that should be pre-aggregated, or use level-based aggregation and pick the levels from each dimension that should be pre-aggregated.
    Try these out before venturing into anything else. You will be surprised by the results. There are other ways to store the data in smaller multiple cubes and join those cubes through formulas. Make sure you don't mistake an attribute for a dimension.
    I'm not sure what you mean by storing only aggregated data in OLAP cubes. You can just do a SUM .. GROUP BY .. of the relational data before loading it into the OLAP cube (a small SQL sketch follows this reply). For example, if your source data is at DAY level, you can SUM .. GROUP BY .. at MONTH level and then load the month-level data into an OLAP cube whose leaf level is the month level.
    The sql view (used by reporting tools) could then be a join between month-level "olap" data and "day-level" relational data. When you are drilling-down on the data, the sql view will pick up data from appropriate place.
    One more thing. Drill-Thru to detail data is a functionality of reporting tools also. If you are using OBIEE or Discoverer Plus OLAP, then you can design the reports in a way that after you reach the olap leaf-level, then it will take the user to a relational detail-report.
    These are all just quick suggestions (on a Friday evening). If possible, you should get Oracle OLAP Consulting Group help, who can come up with good design for all your reporting needs.
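    A minimal SQL sketch of that pre-aggregation idea (all table and column names here are made up): if the source data is at DAY level but the cube's leaf level is MONTH, roll the data up before the load:
        -- Roll day-level fact rows up to month level before loading the cube,
        -- so the cube stores only month-level data and the day-level detail
        -- stays in the relational tables
        SELECT t.month_id,
               f.product_id,
               SUM(f.sales_amount) AS sales_amount
          FROM sales_fact f
          JOIN time_dim   t ON t.day_id = f.day_id
         GROUP BY t.month_id, f.product_id;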

  • How to provide the data directly for the report using web service

    Hi all,
    I'm trying to execute a report from the web service API (using BIP version 10.1.3.4) and want to provide the report with a pre-fetched data set. According to the Dev Guide, I should use the element /ReportRequest/reportData for this, but I can't find a proper example illustrating this use case. The problem is that I don't know how to put my data set into the element: my data set is XML (text), while the data type of the reportData element is base64Binary.
    I've tried something like the following, but without any success:
    ...<reportData>
           <ns1:rowset>
               <ns1:row>
                   <ns1:emp>
                      <ns1:name>
                   </ns1:emp>
               </ns1:row>
           </ns1:rowset>
       </reportData>
    ...
    Any help is highly appreciated.
    Thanks in advance,
    H
    Edited by: Harm Verschuren on Nov 10, 2008 1:01 PM

    Hello,
    Thanks for your answer, the situation is a little different from what you describe.
    We make a call to BI Publisher via a Web Service (PublicReportService) via the ReportRequest operation.
    The report we call contains a query to a database.
    When we do not include XML in the reportData field then the report data is obtained from the Database and the report result is returned via the Web Service.
    When, however, we include XML data in the reportData field, then we see that ONLY the XML data in the reportData field is used, and NOT data from the Database.
    Is there a possibility to combine the two?
    Regards Léon
