Data Federator data lineage and impact analysis

Hi,
I am looking for information on using the data lineage and impact analysis features of Data Federator at the universe level.
Can I find out which objects use which fields by using the impact analysis feature in Data Federator?
Thanks in advance.

I realize this answer is a cop-out, but I'll suggest it anyway: upgrade.
I struggled with getting lineage and impact analysis working in 10.1 and had to do quite a bit of trial and error in the Apache configuration files, DADs, etc., because the lineage reports were web based. I never got them working in production due to other, much more pressing implementation issues at the time, and never went back to fix them.
In 10.2 and above the reports are much easier to get, with no real configuration needed.
Anyway, back to your question: I would suggest going back through the 10.1 documentation on getting those to work, and have someone a bit familiar with setting up Apache help you. I did get them working in our test environment, but it's been so long that I can't recall all the steps and tricks, sorry.
-gary

Similar Messages

  • Data Federator data source not available causes Universe Connection error

    I created a Data Federator project that connects to 20 servers across the US and Canada. All data sources are SQL Server 2005. The DF project maps 40 source objects into 4 target objects. I created a universe based on the DF project, and we have been quite pleased with the Webi query response. Today one of the source servers was taken offline, and this generated a connection error when trying to access the universe (without trying to access the data source that failed). We do not want the universe connection to error when one source server is unavailable – is that possible?
    If the answer is no, then I see us abandoning what appears to be a great solution for real-time distributed reporting and resorting to ETL and moving data.

    Hi Chapman,
    Could you elaborate a little on what you did to solve the issue?
    Thanks,
    Dayanand

  • Power designer's Data Architect, Enterprise Architect and Information Architect

    Does anyone have a comparison of the product differences between PowerDesigner's DataArchitect, EnterpriseArchitect and Information Architect?

    Dear Rahaman,
    Please refer to the PD 16.5 packaging options below and you can't miss the differences.
    DataArchitect\DM includes Shell/Core*, XML, PDM, LDM
    DataArchitect\RE  includes Shell/Core*, XML, PDM, LDM, Repository
    Information Architect includes Shell/Core*, XML, PDM, LDM, Repository, DMM, CDM, BPM
    EnterpriseArchitect includes Shell/Core*, XML, PDM, LDM, Repository, DMM, CDM, BPM, OOM, EAM, EA Frameworks
    * Shell & Core includes UI, Gap and Impact Analysis Tools, Reporting Tools, RQM, Free Model, and other common components.
    There are different modeling features you can use in each of the packages in PowerDesigner 16.
    If you want to use a repository for storing models and sharing them with other modelers, you need at least the DataArchitect/RE option.
    And if you need to use OOM, then the EnterpriseArchitect option is required.
    Best Regards
    Jerry

  • How to join data from SAP BW and Oracle database from Webi?

    Hi,
    I need to create a Web Intelligence report connecting to multiple data sources:
    - SAP BW contains the data (all units).
    - An Oracle DB contains the access-security data (the respective units for each user).
    Since a universe can only connect to a single data source, we have to create two universes connecting to SAP BW and the Oracle DB respectively, and then create a Webi report that points to the two universes to retrieve the equi-joined data on the report.
    May I know in which layer I can join the data? What kind of data can I join?
    Can I merge dimensions from both data sources? What about measures?
    How can I create a SQL query in Webi to simulate the join across the two data sources?
    Any related info is appreciated.
    thanks and regards
    nora

    Hi Nora,
    You can use Data Federator technology to federate multi-sources such as SAP BW and Oracle.
    All you have to do is create a universe on the Data Federator data source, and the universe sees the data source as a single source even though it is not.
    Then any client that consumes universes can take advantage of this solution.
    Regards,
    Didier
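    As a rough mental model of what the federation layer provides (a minimal sketch with made-up data and names, not Data Federator's actual implementation), an equi-join of BW fact rows against Oracle security rows on a shared dimension looks like this:

    ```python
    # Hypothetical illustration: source A holds SAP BW fact rows (all units);
    # source B holds Oracle security rows (units each user may see).
    # All table contents and names below are invented for this sketch.

    bw_rows = [
        {"unit": "U1", "revenue": 100},
        {"unit": "U2", "revenue": 250},
        {"unit": "U3", "revenue": 75},
    ]

    oracle_security = [
        {"user": "nora", "unit": "U1"},
        {"user": "nora", "unit": "U3"},
    ]

    def federated_equi_join(facts, security, user):
        """Return only the fact rows whose unit the given user may see."""
        allowed = {row["unit"] for row in security if row["user"] == user}
        return [row for row in facts if row["unit"] in allowed]

    print(federated_equi_join(bw_rows, oracle_security, "nora"))
    # [{'unit': 'U1', 'revenue': 100}, {'unit': 'U3', 'revenue': 75}]
    ```

    The point of federation is that this join is computed at query time across both sources; no data is physically copied into a warehouse first.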

  • Difference between Data Integrator, Data Services, Federator & Data Manage

    Hi,
    What is the difference between each of the following components of SAP Business Object?
    Data Integrator
    Data Services
    Data Federator
    Data Management
    Don't these all do data integration (ETL)? If so, why is Data Integrator a must for Rapid Marts rather than any of the others listed above?

    Thanks for the to-the-point response; it was really helpful. Isn't SAP BusinessObjects Data Federator also a kind of integrator? The following link mentions "SAP BusinessObjects Data Federator can help you rapidly integrate data in real-time":
    http://www.sap.com/solutions/sapbusinessobjects/information-management/data-integration/datafederator/index.epx
    Or if I put it differently: Data Federator is used when physical integration is not required, and provides the functionality to integrate data virtually in memory.
    Please correct me if I am wrong.

  • OSS note Impact analysis

    Hi Experts,
    I am an SAP ABAP developer. The functional consultants asked me to check an OSS note number. How do I check the risk and impact analysis for that OSS note? Please advise.

    Venkat,
    You should find all the info in the OSS note itself.
    Thanks
    Bala Duvvuri

  • Oracle Data Modeler - Impact Analysis option

    Hi
    I am using Oracle Data Modeler 3.1.0683 and reverse engineering my existing relational models into logical models. I have 3 relational models and am reverse engineering them into 1 logical model.
    In the logical model, under the entity's properties -> Impact Analysis, how do I add which relational table the logical entity depends on? For example, in the relational models I have the tables Class, Student and Teacher in 3 separate relational models. In the logical model I created the entity Person, which depends on the table Student from relational model 1 and Teacher from relational model 2; I want to view (add) these tables under "Impact Analysis".
    The help window says:
    "Impact Analysis
    Enables you to view and specify information to be used by Oracle Warehouse Builder for impact analysis."
    Though I couldn't figure out where to specify it.
    Thanks in advance.
    Regards
    Lahoria

    So, any suggestions on how I can bring those tables (as mentioned in the original post) to show up in Impact Analysis?

    If your entity is the result of reverse engineering from a relational model, you can find the related table under Mappings. The same applies if you engineer a logical model to a relational model.
    If you start from a column, you can see the related attribute in the logical model and its usage in data flow diagrams and dimensional models.
    Philip

  • Data Federator - Hierarchies and Text Descriptions

    Hi,
    I am trying to create Webi reports on top of SAP BW. The universe connects to SAP BW via Data Federator. I am facing issues with hierarchy creation: the hierarchies created in the InfoProvider (a MultiCube from a CRM data source) are not available in the universe. I can't create the hierarchies in the universe because not all levels of the hierarchy are available in the MultiCube; I need to get the hierarchy data from the CRM data source directly. But the only valid SAP BW sources for DF are cubes. Kindly let me know if there are any workarounds, and let me know if more details are needed.
    Thanks in advance.
    Regards.

    Hi
    Officially, SAP BW hierarchies are not supported by Data Federator (http://wiki.sdn.sap.com/wiki/display/BOBJ/LeveragingDataFederatorinSAP+integrations). What you can do is also use the SAP R/3 connector (SAP BusinessObjects Web Intelligence Reporting for SAP ERP [original link is broken]; note this is a prototype, not a product) and fetch your CRM data into the virtual tables of DF as well. You will still have to rebuild your hierarchies manually in the universe.
    Regards,
    Stratos

  • Data Federator XI 3.0 SP2 and Virtualization

    Does anyone know if Data Federator is supported in virtual environments such as VMware 3.x or 4.x?
    Thanks in advance.
    Regards,
    Enrico

    Hi Enrico,
    DF can be installed on VMware; we have used it on VM5 and VM6.
    If the VM runs Windows 2000 SP4 or 2003 R2, it should work for DF.
    Hope this helps
    Thanks,
    Dayanand

  • Sales and Purchase analysis, sort by date default

    Hello,
    Is there a way to make the sales and purchase analysis reports default to showing the most recent orders first instead of the oldest orders first in the list? I am referring to the reports that appear when you are in Item Master Data and click on the graphs in the Sales Data and Purchasing Data tabs.
    Thank you

    Hi,
    You may double click the column header to change the sort.
    Thanks,
    Gordon

  • Data Federator and Information Design tool

    Hi,
    First, can Data Federator be a single data source for the Information Design Tool?
    Second, from a performance point of view, which is better: using the Information Design Tool to build a multi-source universe, or building a universe on Data Federator?
    Thanks

    Hi,
    let me explain:
    In BI 4.0, the Data Federator Query Server engine is now part of the server-side Adaptive Processing Server.
    You only invoke this architecture when you select "Multi Source Enabled"; any other entry point does NOT use the Data Federation Service.
    So, if you select Single Source, that is not using DF.
    However, it is entirely possible to select "Multi Source Enabled" and then use only one connection in the MSU!
    Having said the above, your second question doesn't quite make sense.
    Unless you mean using the Data Federator middleware driver in BI 4.0 IDT to connect to an XI 3.1 Data Federator Query Server (i.e. consuming the 'old' source in the new environment)?
    This backwards compatibility was delivered to facilitate the transition of DF customers, especially since there is no standalone DF product in BI 4.0 (yet).
    I trust this is of use to you.
    Regards,
    H

  • Dropship PO schedule line date changes and impact in SO

    Hi,
    Please advise on the following scenario: a drop-ship PO is created, and a changed schedule delivery date on the purchase order should automatically change the schedule line in the sales order.
    SO - PR - PO
    Scenario 1:
    The vendor does not confirm the full requested delivery quantity on the date; he confirms a partial quantity and confirms the remaining quantity later. When we split the delivery schedule directly in the purchase order, it does not update the sales order; only the initially confirmed quantity appears at the SO schedule line level. How can I copy the partial split done on the PO back to the sales order schedule quantity?
    Scenario 2:
    When the SO is created and the PR and PO are generated, and later we add a new line item in the SO or directly add a PO line item, how can I keep the two documents in sync automatically?
    How can I link the SO schedule delivery date and the PO schedule delivery date so that the documents update back to back automatically when either one is changed?
    Thanks in advance
    JACK

    Hi Caetano,
    thanks for your suggestion
    Yes, we use the firm zone for a few of the vendors; there the system does not change the schedule lines.
    For the stock transfer PRs there is no firm zone and the lead time is one day; in this case they change every day after the MRP run.
    The stock transfer PRs lead to the creation of schedule lines in the source plant. Since these PRs change every day, the alerts coming out of MD07 change as well, and we really cannot tell whether an order is delayed or not. In the source plant we use the firm zone to avoid moving the schedule lines, but then the alerts are not always correct.
    Is there any setting that prevents the PR from moving every day without using "start in the past"?
    thanks
    Nagendra Kumar

  • Data Federator XI 3.1 - best practices

    We are planning to roll out a Data Federator setup in our company, and I'm looking for some best practices.
    The major question I have is: do we install the Data Federator server components on a dedicated server, or can/should we install the components on one of the machines of our BOE R3 cluster (4 nodes: 2 management and 2 processing)?
    Is there any document that contains a summary of the best practices for setting up a Data Federator environment?
    Kind regards
    Guy

    Hello,
    the advice is to have a dedicated machine for the DF server.
    DF can become memory- and CPU-intensive for large queries, so a dedicated machine improves DF performance and avoids negative impact on other services (e.g. BOE).
    A lot of calculation and temporary storage is done in memory, so the advice is to add as much RAM as needed for large queries. If the RAM is not large enough, you will get disk swapping and hence lower performance.
    Hope that it helps
    Regards
    PPaolo

  • Data Federator on Unix - Need to connect to Informix

    Hi,
    We are planning to Deploy Data Federator in Linux - SuSE 64 bit environment. We also have a need to connect to Informix and Teradata databases.
    According to the supported platforms document, only ODBC drivers are available to connect to Informix and Teradata databases.
    Is there a driver bridge available for these ODBC connectivity only databases?
    Update: I didn't notice there were Unix ODBC drivers available. I think we should be fine.
    Will it be supported if we use the Informix Type 4 JDBC driver (http://www-01.ibm.com/software/data/informix/tools/jdbc/) as a generic JDBC driver? Is there any performance impact?
    Appreciate the assistance.
    Thanks,
    Thiag.
    Edited by: Thiag Loganathan on Jul 21, 2010 5:43 PM
    Edited by: Thiag Loganathan on Jul 21, 2010 8:26 PM

    How will you access your third-party module on an NT box from UNIX? If it will be over TCP/IP, you may use the UTL_TCP package.
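    For the generic JDBC route asked about above, here is a sketch of the pieces such a connection typically needs. The host, port, database and server names are placeholders; verify the driver class and URL format against the IBM Informix JDBC documentation for your driver version before relying on them:

    ```python
    # Sketch only: parameters a generic JDBC data source usually needs for the
    # Informix Type 4 driver. All concrete values below are placeholders.
    informix_jdbc = {
        "driver_class": "com.informix.jdbc.IfxDriver",
        "url": "jdbc:informix-sqli://dbhost:1526/mydb:INFORMIXSERVER=myserver",
        "jar": "ifxjdbc.jar",  # driver jar to place on the DF classpath
    }

    def jdbc_url(host, port, database, server):
        """Build an Informix JDBC URL from its parts (illustrative helper)."""
        return f"jdbc:informix-sqli://{host}:{port}/{database}:INFORMIXSERVER={server}"

    print(jdbc_url("dbhost", 1526, "mydb", "myserver"))
    # jdbc:informix-sqli://dbhost:1526/mydb:INFORMIXSERVER=myserver
    ```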

  • Impact Analysis: How to trace which objects and tables used in a report?

    Impact Analysis: How to trace which Webi objects and tables used in a report?
    Currently, our company has been using BO Webi as our ad-hoc query and reporting tool for over a year. In the past several months we've been pushing our power users to develop their own reports, and we have started to notice that we lose track of which data (tables, columns, ... , BO objects) is being used where and by whom. The BI team now spends more time tracing through reports manually than designing universes.
    After consulting with our local SAP technical sales contact, they said the only solution is to buy BO's ETL (Data Integrator) and Metadata Management tools, with prices starting from $300K per CPU. I suppose that is NOT the right solution; however, we have not found one yet. Some executives believe Cognos (now owned by IBM) would provide a better BI solution as we scale.
    If anyone knows, please advise on (1) an impact analysis method: how to trace which Webi objects and tables are used in a report; and (2) whether Cognos provides a better impact analysis method without heavy spending.
    Thank you very much,
    Ed
    Edited by: EdPC-SCB on Sep 8, 2009 3:56 PM

    EdPC-SCB,
    have you tried enabling auditing?
    - Yes, the audit log only shows user activities, which isn't useful for us. Please let us know of any audit log that might be helpful.
    For most of the servers listed in the CMC there is an "Audit" tab. I'd say if you have the disk space available in your Auditor database, then if in doubt turn it on (at least for a while) to see whether it exposes what you are seeking; that would be the quickest way. The documentation (xir2_bip_auditor_en.pdf) doesn't offer much help in correlating an audit option ticked on a server with how it populates the Auditor DB; most of us just hunt and peck until we get what we want. Once you have the right options ticked on in each of the servers, you'll be able to track down which report receives which object. To help yourself out initially, you should run every report you can find so that Auditor gets seeded.
    thanks,
    John
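    Once auditing is populated, the impact analysis question above reduces to grouping audit events by object. The table and column names below are invented for the sketch (the real Auditor schema differs by product version), and SQLite stands in for the actual audit database:

    ```python
    # Illustrative only: "which reports use this universe object?" against a
    # hypothetical audit table. Schema and data are made up for this sketch.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE audit_event (report_name TEXT, object_name TEXT, user_name TEXT)"
    )
    conn.executemany(
        "INSERT INTO audit_event VALUES (?, ?, ?)",
        [
            ("Sales Summary", "Revenue", "ed"),
            ("Sales Summary", "Region", "ed"),
            ("HR Headcount", "Region", "ann"),
        ],
    )

    def reports_using(conn, object_name):
        """List the distinct reports that touch a given object (the impact question)."""
        rows = conn.execute(
            "SELECT DISTINCT report_name FROM audit_event "
            "WHERE object_name = ? ORDER BY report_name",
            (object_name,),
        )
        return [r[0] for r in rows]

    print(reports_using(conn, "Region"))
    # ['HR Headcount', 'Sales Summary']
    ```

    The same query shape answers lineage questions in reverse (which objects a given report uses) by swapping the filtered and selected columns.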
