Data reconciliation using OBIEE/Answers

Hi,
I am new to OBIEE. I am trying to use OBIEE and Answers to reconcile accounting entries with transaction data. This leads to two facts: transaction lines and accounting lines. I could also model the transaction lines as a dimension, but in that case can I still compare the transaction amounts with the accounting amounts, something like
Transaction Total – Accounting Total = Difference?
Thanks a lot in advance.

Hi,
The complete data model is not ready yet, but I can explain the outline. We have customer, transaction type, date and currency as dimensions. These dimensions are linked to the transaction fact, which holds the transaction amounts, such as the invoice amount and the open balance for each transaction. The transaction table also has a child table, the accounting distribution table. I can build two different stars, one with the transactions as the fact and another with the accounting distributions as the fact. In that case, can I join both facts in the presentation layer so that the transaction amount and the accounting amount can be compared? Another question: can both facts live in the same business model, or is there a better approach?
Thanks a lot in advance,
raj
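
For illustration, the kind of comparison being described could be expressed in Answers as a single request that pulls both measures through the conformed dimensions and derives the difference. The subject area and column names below are placeholders, not part of the actual model:

-- Sketch of a logical SQL request comparing the two facts (all names are assumptions)
SELECT
   "Customer"."Customer Name" saw_0,
   "Transaction Facts"."Transaction Amount" saw_1,
   "Accounting Facts"."Accounting Amount" saw_2,
   "Transaction Facts"."Transaction Amount" - "Accounting Facts"."Accounting Amount" saw_3
FROM "Reconciliation"
ORDER BY saw_0

The usual approach is to keep both facts in the same business model and relate them only through the shared (conformed) dimensions rather than joining the facts to each other directly.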

Similar Messages

  • Data reconciliation using MDM

    Hi, can anyone tell me whether MDM plays any role in data reconciliation across the various R/3 modules?

    Hi Gopesh,
    Maybe you want to read the thread right before this one... there I posted an answer which covers your topic.
    This part might be relevant to you:
    Concerning 2: Actually, with MDM 3.0 you can consolidate any data. The standard delivers master data objects such as Product or Business Partner, but if you want to harmonize e.g. FI/CO data, you can first create your own master data object type in MDM using the MD Framework. This framework allows you to create any data type you like (see the e-learning sessions on the MDM page in SDN - https://www.sdn.sap.com/sdn/developerareas/mdm.sdn?node=linkDnode8 ).
    Regards, Matthias

  • Column data formatting in OBIEE ANSWERS

    Hi,
    I have a requirement to show '-' whenever a column value is NULL. I am doing this in Answers by using the formula below:
    IFNULL(CAST("COLUMN NAME" AS CHAR), '-')
    It correctly replaces the data with '-' whenever it is NULL, but it also overrides the default data format: the Data Format tab in the column properties now shows Treat Text As (Custom Text Format). So if the column has data like 23.1 it is displayed as 23.123456667777.
    I want to override the custom text format with a number format so that it shows 23.1.
    Please help me on this.
    Thanks,
    SRK

    I have written up a solution for you:
    http://gerardnico.com/weblog/2009/04/09/how-to-mix-string-and-number-data-type-in-one-column-and-get-a-sum/
    Replace the condition in the conditional formatting with "your value is null" and you are done.
    Success
    Nico
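
    Another option, sketched here with the original placeholder column name, is to round the value before casting it, so the text shown keeps the intended precision even though the column stays in a text format:
    -- Sketch only: still text, but rounded to one decimal before the cast
    IFNULL(CAST(ROUND("COLUMN NAME", 1) AS CHAR), '-')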

  • How to count the occurrence of a data entry in OBIEE Answers

    Hello guys
    I have one report that contains a column called Call Type. In the report I am creating there are 200 rows in that column, but there are really just 4 call types: A, B, C and D.
    So in my report it looks like this (the Call Type column, alongside the other dimension and fact columns):
    A
    A
    A
    B
    B
    C
    D
    D
    D
    D
    Now I need to add one more column, Call Count, which counts the occurrences of each call type (Call Type, Count, other columns):
    A 3
    B 2
    C 1
    D 4
    So how would I be able to achieve this in Answers? I used the "count" function on the count column, but it counts all the rows without distinguishing between call types.
    Please let me know
    Thanks

    Hi,
    This looks to me like it should be a measure within the RPD rather than something you try to do at the presentation level. Any simple "# of <x>" measure will work the way you describe: it just does a simple count on something unique in the table and has an aggregation rule set. OBIEE will then automatically group by distinct call type in your report without you having to do anything complex.
    Regards,
    Matt
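
    If the measure cannot be added to the RPD, a column formula with a BY clause is one possible workaround to sketch the same idea in Answers; the fact and dimension column names below are placeholders, not names from the poster's repository:
    -- Hypothetical Answers column formula: count of rows per call type
    COUNT("Call Facts"."Call ID" BY "Call Dim"."Call Type")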

  • Request chaining/flow in OBIEE Answers?

    In OBIEE Answers, is it possible to save the value of a request and reuse this value for further
    calculations?
    In other words, I want to use the results of request(s) as the data source for another request.
    The use case is like this:
    1. Use OBIEE Answers to get the value of a metric. There are multiple such metrics. Create one request per metric and save all such requests.
    2. Use the requests saved in step 1 to derive a higher layer of metrics. Save these requests.
    3. Use the requests saved in step 2 to derive a higher layer of metrics. Save these requests.
    And so on.
    Then I want to show these metrics on the dashboard with drill down navigation.
    Appreciate your help.
    Thank you.

    Hi all,
    Thanks for the reply.
    But this doesn't seem to solve the problem.
    Let me put it in other words:
    Result-1:
    Metric_A X <-- single row resulting from a request in OBIEE Answers
    Result-2:
    Metric_B Y <-- again single row resulting from a request in OBIEE Answers
    Result-3 (to be derived from above two results)
    Metric_C a*X + b*Y
    where a, b are constants.
    and so on...
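    If the base metrics all come from the same subject area, the derived metric can often be written directly as one column formula instead of chaining saved requests; the subject area, measure names and weights below are invented purely for illustration:
    -- Hypothetical derived column: Metric_C = a*X + b*Y with a = 0.6 and b = 0.4
    0.6 * "Sales Facts"."Metric_A" + 0.4 * "Sales Facts"."Metric_B"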

  • Underscores appearing when the Source is OBIEE Answers report.

    Hi,
    When I use an OBIEE Answers request as my source, in the XML data I see the table names and column names in the format below, with an underscore at the beginning and end of each part.
    Is there any way to give them proper names? Because when I use this in an Analyzer template there is no way for me to rename the columns.
    <_Responsibility_._Booking_Channel_>ATO/Call Centre</_Responsibility_._Booking_Channel_>
    <_Origin_City_._Origin_Country_Name_>Jordan</_Origin_City_._Origin_Country_Name_>
    <_Origin_City_._Origin_City_Name_>AMMAN</_Origin_City_._Origin_City_Name_>
    <_Destination_City_._Destination_Country_Name_>Egypt</_Destination_City_._Destination_Country_Name_>
    <_Destination_City_._Destination_City_Name_>CAIRO</_Destination_City_._Destination_City_Name_>
    <_Flight_Segment_._No_of_Customer_Journeys_>5</_Flight_Segment_._No_of_Customer_Journeys_>
    <_Flight_Segment_._No_of_Flight_Segments_>5</_Flight_Segment_._No_of_Flight_Segments_>
    Regards,
    Bhavik

    Hey Bhavik,
    I'm having a similar problem with the formatting of the columns taken from OBIEE Answers. I'm trying to do the following, but it does not like the period in the middle of the name.
    <xsl:variable name="totalOH" select="sum(current-group()/_EQUIPMENT_._OH_)"/>
    I'll let you know if I find anything.
    -Luke

  • Using OBIEE Scheduler to export and save data in excel/pdf

    Hi
    Is it possible to use the OBIEE scheduler to run particular reports, export the data to Excel/PDF, and save the file at a given location on the server?
    Environment: OBIEE 11g on Linux.
    Thanks for any ideas.

    Hi,
    Yes, it is possible.
    To my knowledge there is no built-in option for this, but there is one way to do it.
    Using iBots (with a small JavaScript file) you can save a specific report to a specific location at a specific time.
    If you are going to implement it this way, send me your mail ID and I will send you the JS script for doing this.
    Please also refer to this thread: Re: Auto SAVE to excel.
    Create a .js file using the script below and add it in the Advanced tab of your iBot.
    // Copy the file produced by the iBot (Parameter(0)) to D:\IBOT under the name passed in Parameter(1)
    var FSO = new ActiveXObject("Scripting.FileSystemObject");
    var folderName = "D:\\IBOT";
    if (!FSO.FolderExists(folderName))
    {
        FSO.CreateFolder(folderName);
    }
    var fileName = folderName + "\\" + Parameter(1);
    var fooFile = FSO.CopyFile(Parameter(0), fileName, true);
    Thanks,
    Satya

  • Field is in DB table/alias but I get ORA904 error using it in OBIEE Answers

    Hello, I'm very new at developing with RPD and OBIEE tools, so I'm struggling with something that I thought I did correctly. I added a new field to a database table, and can query it in the database. I added it to the table in RPD Physical layer and I verified that it also now exists in an alias of that table. The physical alias table field is used as a source for a logical column in the presentation layer. Everything looks good in the RPD. But when I try to pull it up in a query in OBIEE Answers I get a 904 error telling me that the column does not exist. I've bounced the servers, dropped cookies and cache, but nothing helps - I still get the error.
    What step could I have missed?
    Thanks in advance!
    Rich

    Thank you all for the advice, but the problem still appears to be there. Here's what I tried and the results:
    1. Right-clicked on the new column to do a row count update. As predicted by Gaurav, this also failed with a 904 error.
    2. Verified exact spelling is correct.
    3. Checked the connection pool. This looks correct, but I will double-check today with one of my co-workers who has more experience with RPD.
    4. Tried querying the column in RPD - this also failed with 904 error.
    5. Checked the LTS within BMM layer and my new field does map back to the alias column
    6. Ran a consistency check and the only errors (warnings, actually) are for a different, unrelated table.
    7. Checked security and it also looks okay, but this is another one I'll double-check later with one of our experts here.
    8. Joins in physical layer and BMM look good
    9. Checked physical column and table in the presentation layer. These also point to the right columns/tables
    10. Selecting the column by itself in analysis fails with 904.
    If anyone can think of something else to consider based on this, please advise. One thing I did notice while performing the steps above (in the BMM, I think) is that the new column I added sits at the bottom of the table (last field) rather than in the middle as it does in the database. In my experience with other tools this hasn't mattered, but I'm going to see if I can change it and whether that has any effect. I'm guessing it won't, but I'm running out of ideas!
    Thanks again,
    Rich
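
    One more check that sometimes catches this kind of ORA-00904: confirm that the schema the connection pool actually logs into really contains the new column, since the physical layer may have been imported from a different schema or database than the one queried at run time. A quick way to verify against the Oracle data dictionary (the owner and table names here are placeholders):
    -- Hypothetical check: does the column exist in the schema the connection pool uses?
    SELECT column_name
    FROM   all_tab_columns
    WHERE  owner = 'MY_SCHEMA'
    AND    table_name = 'MY_TABLE'
    ORDER BY column_id;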

  • Using OBIEE for a custom Data Warehouse

    Hi Everyone,
    I am very new to OBIEE and I have a few questions about this product family.
    1. I have an existing custom-built data warehouse, and I would like to know: is it possible to build reports on this data warehouse?
    2. I understand that OBIEE comes with pre-built ETL jobs in Informatica. What kind of license is it? Is it possible to modify them, or even to build new jobs that load into a non-OBIEE data warehouse?
    Your answer will be greatly appreciated.
    Jeffrey

    It's the same Informatica and it can do everything a standalone Informatica installation can. Additionally, it has pre-built adapters for source systems like Siebel, APPL, PSFT, JDE and SAP, plus some universal adapters, so the license includes these as well, which is going to cost more than getting an Informatica license from Informatica Corp. Moreover, OBI Apps 7.9.6 comes with Informatica 8.6, which is a slightly older version of the tool; Informatica is going to release version 9 in a couple of weeks.
    I see that you already have a data warehouse, so why do you need an ETL tool again?
    OBIEE can report directly out of a data warehouse, and also out of transactional systems, as long as the metadata layer is built.
    PS: Am I clear?

  • Urgent: Error while displaying data in OBIEE answers

    Hi,
    I'm facing an issue in OBIEE Answers when trying to display values from my fact table. It shows something like "invalid column identifier" at the OCI call OCIStmtExecute...
    I have three dimensions, and that particular column exists in the three dimension tables; I changed the names and placed it in the fact table.
    When I view data in the physical layer, the data is displayed.
    Why am I facing this error, and how can I resolve it?
    It is a very urgent requirement and any help will be appreciated.
    Regards,
    Sindhu

    Hi,
    I changed the column names in the BMM layer only, not in the physical layer. In the physical layer I am able to view the data.
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 17001] Oracle Error code: 904, message: ORA-00904: "T95877"."OBSERVATION_DATETIME": invalid identifier at OCI call OCIStmtExecute. [nQSError: 17010] SQL statement preparation failed. (HY000)
    SQL Issued: SELECT "FACT - RISK SLOT Overtime"."In Gas Price Value ID" saw_0 FROM "AO Risk" ORDER BY saw_0
    I'm getting this error...
    Regards,
    Sindhu

  • Deliver Answers reports using the OBI Publisher scheduler vs. using OBIEE Delivers

    Hi Gurus,
    I am working on a POC where I need to give a comparative study of the delivery mechanisms of OBIEE Delivers and the OBI Publisher scheduler for sending out Answers/dashboard reports built in OBIEE.
    Business and IT want to use the OBI Publisher scheduler to deliver reports to end users. I have worked extensively with OBIEE Delivers for delivering alerts and with configuring iBots for various tasks.
    If somebody can tell me the pros and cons of using one tool or the other, it would be of great help.
    Scenario: we have some Cash Advance reports in OBIEE Answers/Dashboards, sourced from Oracle Applications (OTM) and built using OBIEE, and these reports need to be delivered using the OBI Publisher scheduler. Can I do this, and if I can, what are the advantages and disadvantages of doing it in OBI Publisher? Is it a good idea to use the OBI Publisher scheduler at all? I believe that to use the Publisher scheduler I have to put the Answers reports into OBI Publisher and use the Publisher scheduler for delivery. Please let me know if my understanding is correct.
    This is something for which I need a solution very soon. Please help.
    Thanks a lot in advance for all the help.
    Regards,
    Sankar

    I think Delivers is probably a better solution than the BIP Scheduler since it is better integrated with OBIEE. I don't think you can schedule a dashboard in the BIP Scheduler; as far as I know, BIP takes Answers queries only. Besides, a BIP report requires some setup before you can schedule it, unlike Answers requests and dashboards, which can be scheduled directly in Delivers.
    I guess it comes down to an architecture decision. If I were you I would try to use Delivers only. If you think Delivers is missing any BIP Scheduler functionality, I would post another thread in the forum, as someone might know another way to do it in Delivers that you don't.

  • Regarding REFRESHING of data in the data warehouse using the DAC incremental approach

    My client is planning to move from Discoverer to OBIA but before that we need some answers.
    1) My client needs the data to be refreshed every hour (incremental load using DAC) because they are using a lot of real-time data.
    We don't have much updated data (e.g. 10 invoices in an hour plus some other records). How much time does it usually take to refresh those tables in the data warehouse using DAC?
    2) While a table is being refreshed, can we use that table to generate a report? If yes, what is the state of the data: stale or incorrect (undefined)?
    3) How does the refresh of Financials Analytics work? Is it one module at a time, or does it treat all three modules (GL, AR and AP) as a single unit of refresh?
    I would really appreciate if I can get an answer for all the questions.
    Thank You,

    Here are some answers:
    1) It shouldn't be much of a problem for such a small amount of data. It all depends on your execution plan in DAC, which can always be created anew and customized to load only those tables (star schemas); approximately 15-20 minutes, as it does many things apart from loading the tables.
    2) Reports in OBIEE will show the previous data, as the cache will (or should) be turned on. You will get the new data in reports after the refresh is complete and the cache has been cleared using one of the various methods (event polling is preferred).
    3) For Financials Analytics, or any other module, you have out-of-the-box execution plans, but you can also create and execute your own plans. GL, AR and AP are provided separately as well.
    Hope this answers your questions. You will learn more by going through the Oracle docs, particularly those for DAC.
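
    As a sketch of the event-polling option mentioned in 2): after the DAC load completes, a row is inserted into the BI Server's event polling table so that stale cache entries for the refreshed tables are purged. The database name, schema and fact table below are placeholders; the table structure is the standard S_NQ_EPT event table:
    -- Hypothetical post-load step: tell the BI Server that W_GL_BALANCE_F has changed
    INSERT INTO s_nq_ept
        (update_type, update_ts, database_name, catalog_name, schema_name, table_name, other_reserved)
    VALUES
        (1, SYSDATE, 'Oracle Data Warehouse', NULL, 'DWH_OWNER', 'W_GL_BALANCE_F', NULL);
    COMMIT;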

  • Data Reconciliation Data Sources in Business Content

    Can you tell me where we can find these DataSources and explain how to use them? Do we need to define an InfoCube/ODS or anything like that to load the data, and a report to see the results?
    Please explain with one complete scenario.
    Thanks.

    Data reconciliation for DataSources allows you to ensure the consistency of the data that has been loaded into BI and is available and used productively there.
    It is based on a comparison of the data loaded into BI with the application data in the source system. You can access the data in the source system directly to perform this comparison.
    The term data reconciliation DataSource is used for DataSources that serve as a reference for accessing the application data in the source directly and therefore allow you to draw comparisons to the source data.
    It allows you to check the integrity of the loaded data by, for example, comparing the total of a key figure in the DataStore object with the corresponding total that a VirtualProvider accesses directly in the source system.
    Hope it helps you.

  • Data reconciliation

    BI Experts - I am using BI as my data repository to move data from R/3 to BPC. Any suggestions on what I need to look out for? Thanks.

    Purpose
    An important aspect in ensuring the quality of data in BI is the consistency of the data.  As a data warehouse, BI integrates and transforms data and stores it so that it is made available for analysis and interpretation. The consistency of the data between the various process steps has to be ensured. Data reconciliation for DataSources allows you to ensure the consistency of data that has been loaded into BI and is available and used productively there. You use the scenarios that are described below to validate the loaded data. Data reconciliation is based on a comparison of the data loaded into BI and the application data in the source system. You can access the data in the source system directly to perform this comparison.
    The term productive DataSource is used for DataSources that are used for data transfer in the productive operation of BI. The term data reconciliation DataSource is used for DataSources that are used as a reference for accessing the application data in the source directly and therefore allow you to draw comparisons to the source data. 
    You can use the process for transaction data. Limitations apply when you use the process for master data because, in this case, you cannot total key figures, for example.
    Model
    The data model for reconciling application data and loaded data (shown as a graphic in the original documentation) is as follows:
    The productive DataSource uses data transfer to deliver the data that is to be validated to BI. The transformation connects the DataSource fields with the InfoObject of a DataStore object that has been created for data reconciliation, by means of a direct assignment.  The data reconciliation DataSource allows a VirtualProvider direct access to the application data.  In a MultiProvider, the data from the DataStore object is combined with the data that has been read directly. In a query that is defined on the basis of a MultiProvider, the loaded data can be compared with the application data in the source system.
    In order to automate data reconciliation, we recommend that you define exceptions in the query that proactively signal that differences exist between the productive data in BI and the reconciliation data in the source. You can use information broadcasting to distribute the results of data reconciliation by email, for example.
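    Outside of BW, the same reconciliation idea can be sketched in plain SQL: compare totals in the loaded warehouse table with totals taken from the source and flag any difference. The table and column names below are invented purely for illustration:
    -- Hypothetical reconciliation check: loaded totals vs. source totals per posting date
    SELECT w.posting_date,
           w.loaded_amount,
           s.source_amount,
           w.loaded_amount - s.source_amount AS difference
    FROM   (SELECT posting_date, SUM(amount) AS loaded_amount
            FROM   dwh_accounting_lines
            GROUP BY posting_date) w
    JOIN   (SELECT posting_date, SUM(amount) AS source_amount
            FROM   source_accounting_lines
            GROUP BY posting_date) s
      ON   s.posting_date = w.posting_date
    WHERE  w.loaded_amount <> s.source_amount;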

  • Data reconciliation ... troubleshooting

    Hi All Experts,
    Can anyone provide me some assistance on the following plus provide me some solution.
    a) Data validation between R/3 and BW reports.
    b) If that is not possible, then use SE16, but explain how to determine the table and the fields to be used;
    check whether the definitions are similar between the reports;
    check whether there are any exclusions in the report and whether the data inconsistency is caused by master data loads (unassigned values etc.).
    c) Timing of the data loads: have there been postings since the data was loaded into BW?
    d) If the report definitions are the same between R/3 and BW, check the data in the cube. Use LISTCUBE, and explain how to use LISTCUBE.
    e) Determine whether the error is caused by a particular package or by an error in an update rule / transfer rule, and explain how to determine this (include the package ID and the possible errors for full and delta loads).
    Include pointers on how to fix the error; asking a BW consultant to fix the error is a good idea in this case, and they should not be fixing the error themselves (unless it is a query issue).
    Also... data reconciliation in general, how to fix the errors, and suggested corrections, improvements and fixes wherever possible.
    TQ
    BR
    Kumar

    a) Data validation between R/3 and BW reports can be done by getting the reports from the R/3 consultant (or, if you know how, producing them yourself) and then checking them against the reports in BW (the selections should be the same).
    b) Use SE16. For this you have to know from which tables the data is coming into BW. Take the material master as an example: the data for this master data comes from table MARA.
    Go to SE16 in R/3 and check the data in this table; do the same for all other DataSources.
    c) You can check this in the delta queue of the source system.
    d) LISTCUBE is a transaction in BW where you can check the data.
    You get a screen with selections; fill in the entries. By default, if there are too many selection fields the system will not allow it, so deselect all the characteristics and select only the few that are important, or those for which you want to check the data.
    e) To determine whether the error is caused by a particular package or something else, you can find this in the RSMO transaction: select the load which failed, and in the Details tab you will find all the information.
    To see whether it is caused in the update rules or transfer rules, you have to simulate the update, switch on debugging, and step through the update rules to find out.
    Data reconciliation is essentially your first question; as for fixing errors, you will need the actual error in hand, so if you have any error just post it in the forum and your friends here will help you.
    Hope this helps
    Regards
    Majeed
