Help for BI Content extraction from multiple datasources

Hello,
I am working on a new installation of BI, having just come out of the BW310 and BW350 classes, so please be patient with me. :~)  I only have a development instance running, and for my first project I'm working on the Cross-Application Time Sheet. I have the InfoCube set up for Time Sheet (0CATS_1), plus the other associated InfoCubes (0CATS_C01, 0CATS_C02, 0CATS_MC1), and I am getting data from the appropriate DataSources (0CA_TS, 0CA_TS_IS_1, 0CA_TS_IS_2).
The problem is that the transformations from the DataSource(s) leave out several fields that are found in the Employee DataSource. How would I go about combining the information from the Employee DataSource with the timesheet data in the Time Sheet InfoCubes?
Thanks. 
(Points will be awarded, per usual)

Similar Messages

  • When do we go for Business Content extraction, and when do we go for LO/CO-PA etc. extractions

    Hi Dear Friends,
    When do we go for Business Content extraction, and when do we go for LO/CO-PA etc. extractions?
    Thanks
    Learner

    Hi,
    It depends on the KPIs you are extracting for; if a Business Content extractor provides them, then there is no need for generic extractors.
    In CO-PA the KPIs for profitability differ from client to client. You have Business Content for CO-PA as well...
    Cheers

  • Business components from multiple datasources

    Hi All,
    I want to create business components from multiple datasources. Please tell me if this is possible in the ADF framework and JDeveloper 11.1.1.4.
    I opened Model.jpx and tried adding the two datasources, but it shows a dropdown list from which I can choose only one.
    My requirement is to have components from one datasource and read data from a second database to create an LOV.
    Please suggest how to go about it.
    Thanks

    Hi,
    There are many ways to achieve this.
    If your requirement is just to read data from a different database (and not to update it), then using a database link would be a better approach.
    You could create a database link in the primary database (the target database being the one you would like to query) and base the SQL query for the VO on the DB link.
    -Arun
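    For illustration, a minimal sketch of this approach; the link name, credentials, TNS alias, and table below are all hypothetical, and the VO's query would simply select through the link:

    -- Create the link in the primary database, pointing at the database
    -- that holds the LOV data (all names are placeholders):
    CREATE DATABASE LINK lov_db
      CONNECT TO lov_user IDENTIFIED BY lov_password
      USING 'LOVDB_TNS_ALIAS';

    -- Base the LOV view object's SQL on the link:
    SELECT code, description
      FROM ref_codes@lov_db;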

  • Data Upload from Multiple DataSources

    Hi All,
    How can I upload data from multiple DataSources to a single cube in BI 7?
    Kindly provide step by step procedure.
    Regards.

    Hi,
    In BI 7 you still have both data flows present: you can either use the old 3.5 dataflow or the new BI 7 dataflow.
    SAP still allows organizations to use the old 3.5 dataflow until they convert over to the newer flows, so that production isn't disturbed. It takes some effort to convert the 3.5 data flows to the 7.0 flows.
    So the way it's usually done is: a technical upgrade is done first, then existing modules are converted over to the 7.0 dataflows in a phased manner, and any newer projects that come along are carried out with the 7.0 functionality.
    Even if you install Business Content in a 7.0 system, you'll get 3.5 objects; you need to convert them over to the 7.0 versions.
    Cheers,
    Kedar

  • Data Extraction from Multiple data sources into a single Infoprovider

    Hi Experts,
    Can anyone send me links or examples on how to extract data from multiple data sources into one Cube/DSO?
    Example scenarios for extracting data from two data sources into a single Cube/DSO would also help.
    Thanks
    Kumar

    Hi Ashok,
    Check the following link from SAP Help. This is probably what you are looking for:
    [Multiple data sources into single InfoProvider|http://help.sap.com/saphelp_nw70/helpdata/en/e3/e60138fede083de10000009b38f8cf/frameset.htm]
    Data from multiple data sources that are logically related and technically have the same key can be combined into a single record.
    For example, take sales order item and sales order item status.
    Both have the same key (sales order and item) and are logically related.
    The sales order item DataSource provides information on the sales order item: material, price, quantity, plant, etc.
    The item status DataSource provides information on delivery status, billing status, rejection status, and so on.
    These are two different DataSources: 2LIS_11_VAITM and 2LIS_11_VASTI.
    In case you have a few master data attributes coming from different systems, i.e. two DataSources in different systems that together completely define your master data, you could use a DSO or InfoObject to combine the records.
    If you want to see aggregated data, you use a cube.
    Say you want to analyze customer revenue for a particular material and a particular project.
    The project details would come from Project System and the sales details from the sales flow.
    The data is then combined in the DSO with, say, customer and sales area as the key.
    It is then aggregated at cube level by not using the sales order in the cube model, so all sales orders of the same customer add up while loading the cube, giving direct customer spend values for your sales area at an aggregated level.
    Hope this helps,
    Best regards,
    Sunmit.
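    For illustration, a minimal SQL sketch of the merge-then-aggregate pattern Sunmit describes; in BW this is configured through transformations into a DSO and a cube rather than written as SQL, and the staging table names here are hypothetical stand-ins for 2LIS_11_VAITM and 2LIS_11_VASTI:

    -- Merge item data and item status on the common key (order, item):
    SELECT i.sales_order, i.item, i.material, i.quantity, i.net_value,
           s.delivery_status, s.billing_status
      FROM stage_vaitm i
      LEFT JOIN stage_vasti s
        ON s.sales_order = i.sales_order
       AND s.item        = i.item;

    -- Cube-level aggregation: drop the order number so orders of the same
    -- customer add up per sales area:
    SELECT customer, sales_area, SUM(net_value) AS revenue
      FROM dso_sales
     GROUP BY customer, sales_area;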

  • Fetching data from multiple datasources

    Hi,
    I want to fetch data from different datasources into a single cube.
    What would be the prerequisites for carrying out this activity?
    Awaiting your kind advice.
    Thanks
    Edited by: gskitu on Mar 7, 2012 6:51 PM

    Hi
    There are no prerequisites for your requirement. However, there are different data flows for it: 1. multiple DataSources through different InfoSources or transformations into a single cube; 2. multiple DataSources through a single InfoSource or transformation into a single InfoCube. Regarding data flow complexity, the second approach is the easier one.
    Thanks.
    Rajendra.

  • Extraction from different datasources

    hi,
    My problem is that we need some fields from one DataSource, some from another, and some from a third. How can I extract these fields and use them? Can anybody suggest how to achieve this, or is it a better option to extract the fields directly from the tables in R/3?

    Sekhar
    There are 3 options for doing this:
    Option 1: If the fields that you want from the different DataSources are related to one particular application area, say purchasing or sales, then you can enhance the base DataSource (the one containing the maximum number of fields you want) with the required fields from the other DataSources, and write an ABAP user exit to populate these fields.
    Option 2: You always have the option of creating a generic DataSource based on a view, an InfoSet query, or a function module extraction (see the sketch after this list).
    Option 3: If the required fields are from one particular application area, then extract data from these DataSources into respective ODS objects. You can then combine the data into an InfoCube/ODS on top of these base ODS objects if that suits your requirement.
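    As a sketch of Option 2: a generic DataSource on a view is defined as an ABAP Dictionary view in SE11 and exposed through RSO2; the SQL below only illustrates the kind of join such a view would implement (the view name is hypothetical, while VBAK/VBAP are the standard sales order header and item tables):

    -- Join sales order header (VBAK) to items (VBAP) so one extractor
    -- delivers fields from both tables:
    CREATE VIEW zbw_sales_view AS
    SELECT h.vbeln, h.kunnr, p.posnr, p.matnr, p.netwr
      FROM vbak h
      JOIN vbap p ON p.vbeln = h.vbeln;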

  • Problems with real-time extraction from generic datasource.

    Hello,
    I have a real-time extraction from a generic datasource with a numeric field as the delta pointer. I have created an initialization InfoPackage, a delta IP, a real-time IP, a real-time data transfer process, and a real-time daemon. The problem is that the real-time extraction does not see delta updates. The daemon creates requests in the PSA every minute, but they are all empty. When I delete the daemon and schedule a non-real-time delta InfoPackage, it loads the delta records correctly. BI version is 7.0 SP 18.
    The second question: I cannot stop the real-time daemon. I stop it in RSRDA, but when I try to schedule a standard IP, I get an error that the daemon is still running, so I have to delete and recreate it.
    And a third, theoretical question: what is real-time extraction necessary for? As I understand it, I could create a process chain scheduled every 1 (5, 10, ...) minutes. What are the advantages of real-time extraction over such a variant?
    Thanks.

    Hi,
    1.Note 1113610 - P17:SDL:PC:RDA:Daemon runs: Do not start delta/init IPackage
    Summary
    Symptom
    When a daemon runs or is scheduled, you may not start a 'normal' delta/init InfoPackage.
    Other terms
    RDA, real-time data acquisition, daemon, InfoPackage, scheduler
    Reason and Prerequisites
    This problem is caused by a program error.
    Solution
    SAP NetWeaver 7.0 BI
               Import Support Package 17 for SAP NetWeaver 7.0 BI (BI Patch 17 or SAPKW70017) into your BI system. The Support Package is available when Note 1106569 "SAPBINews BI7.0 Support Package 17", which describes this Support Package in more detail, is released for customers.
    Alternatively, you can apply the correction instructions provided in the note.
    2. I was referring to closing the request in the target, i.e. the DSO, not the PSA.
    3. Note 1000157 - RDA daemon does not get data from delta queue
    Summary
    Symptom
    You use Real-time Data Acquisition (RDA) to load data from an SAP source system. After several hours or days, the system no longer reads the data from the delta queue. In the RDA monitor, you can see that the InfoPackage has been executed, however the data transfer process (DTP) has not been executed for a long time. In transaction RSA7 in the source system, you can see that there is already data in the delta queue.
    Cheers,
    swapna.G

  • Archive Manager fails to extract from multiple archives

    Just as the title says: up until a few nights ago my Archive Manager extracted multi-part RAR files perfectly, but now whenever I try to extract one, it only extracts from the RAR I select.
    I've tried reinstalling, and looking in Archive Manager's settings, to no avail.
    Has anybody else ever had this happen?
    Or any ideas on what I could try to fix this?
    Thanks in advance

    Make sure they're all named correctly (.rar, .r00, .r01, etc.) and not corrupt.
    What might work is to extract each RAR individually and then use cat to join the extracted files (using a CLI app like unrar would be a lot faster, with a little bash magic).
    Does this occur with only one archive set? Have you tried using something simple like unrar? Find out whether it's the app or the archive set that's bad.
    Last edited by Ranguvar (2009-01-29 18:37:10)

  • Extracting data from multiple datasources through DAC

    We have to extract data from two Oracle datasources (GL datasource 1 and GL datasource 2) via the DAC client into one data warehouse. We have configured DAC for one datasource and it is working fine. How can we add another datasource to the same DAC client?
    Neeraj

    You will need to update the source DB connections in both the DAC and the Informatica Workflow before you load the DW with the other GL database. You will continuously have to switch these configuration settings, or stand up another DAC Server, Repository, and Client that point to the same data warehouse.

  • Please help about BI Content Extract!!!

    How can I get to know the names of the tables from which I want to extract data, given that I know the BI DataSource (for example, 2LIS_03_BF) or the extract structure (for example, MC03BF0)? And, furthermore, can I get to know the name of the corresponding extraction program?
    Thanks a lot.

    Dear Rao,
    Refer this help link
    http://help.sap.com/saphelp_nw04/helpdata/en/3d/5fb13cd0500255e10000000a114084/frameset.htm
    and
    Refer this blog
    /people/sap.user72/blog/2005/09/05/sap-bw-and-business-content-datasources-in-pursuit-of-the-origins
    Regards,
    Vamsi
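    As an illustrative sketch on top of those links: in the source system, the table ROOSOURCE records which extractor sits behind a DataSource. The field names below are quoted from memory of the standard table and are worth verifying in your release; EXMETHOD indicates the extraction method (e.g. view vs. function module):

    -- Look up the extractor behind DataSource 2LIS_03_BF:
    SELECT oltpsource, extractor, exmethod
      FROM roosource
     WHERE oltpsource = '2LIS_03_BF'
       AND objvers    = 'A';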

  • Need help for SRM Data Extraction into BI-7

    Hi Experts,
    I am looking for help regarding SRM DataSource extraction to BW. I am working on a BW project and need to extract data from the SRM DataSources 0BBP_TD_CONTR_2 and 0SRM_TD_PRO. I need to know about the extraction process from SRM. Is there a tool like the LO Cockpit available in SRM for extraction, and how can we manage the delta in it? Or can I use transaction RSA5 and follow the normal process? If I could get some documentation that gives me an idea of data extraction and delta management in SRM, that would be appreciated. Any kind of help is welcome.
    Thanks in Advance.

    SRM doesn't use an LO-style "cockpit" like the one in ECC.
    Overall picture:
    http://help.sap.com/saphelp_srm40/helpdata/en/b3/cb7b401c976d1de10000000a1550b0/content.htm
    If you set up the data flows according to the Business Content of SRM:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/3a/7aeb3cad744026e10000000a11405a/frameset.htm
    Then just perform an init load and schedule the deltas.

  • Need help for focusing the cursor from one textbox to another textbox

    Hi all,
    I have a problem in JavaScript.
    Note: the textboxes are generated dynamically, so their number is not fixed; there may be 3, 2, 4, etc.
    To move the cursor from one textbox to another, I took the length of the textboxes in the first column and used the onKeyDown event.
    In the function, I first check the condition like this:
    for (i = 0; i < form1.box.length; i++) {   // box is the name of the textboxes
        if (event.keyCode == 13) {
            form1.box[i+1].focus();
        }
    }
    return false;
    Using this, the cursor moves from the first textbox to the second textbox and stops.
    If I use event.returnValue = false; instead of return false, then the cursor automatically goes to the last textbox of the column.
    My problem is how to move the focus from one textbox to the next, one after the other, till the end.
    If anyone has a solution, please help me.
    If it can be done another way, please tell me that too.
    Thanks.

    Thanks, you helped me so much.
    I have to check another condition. See the code below:
    <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
    <HTML>
    <HEAD>
    <TITLE> New Document </TITLE>
    <META NAME="Generator" CONTENT="EditPlus">
    <META NAME="Author" CONTENT="">
    <META NAME="Keywords" CONTENT="">
    <META NAME="Description" CONTENT="">
    </HEAD>
    <SCRIPT language="Javascript">
    function fnTest(str) {
        if (event.keyCode == 13) {
            if (str == 4) {
                formHeader.box[0].focus();
            } else {
                formHeader.box[parseInt(str) + 1].focus();
            }
            return false;
        }
    }
    </SCRIPT>
    <BODY>
    <FORM name="formHeader">
    <CENTER>
    <INPUT TYPE="TEXT" name="box" value="" onKeyDown="javascript:fnTest('0');">
    <br>
    <INPUT TYPE="TEXT" name="box" value="" onKeyDown="javascript:fnTest('1');">
    <br>
    <INPUT TYPE="TEXT" name="box" value="0" disabled="false" onKeyDown="javascript:fnTest('2');">
    <br>
    <INPUT TYPE="TEXT" name="box" value="" onKeyDown="javascript:fnTest('3');">
    <br>
    <INPUT TYPE="TEXT" name="box" value="" onKeyDown="javascript:fnTest('4');">
    </CENTER>
    </FORM>
    </BODY>
    </HTML>
    Suppose some of the fields are disabled; then the focus must skip to the next one. I have set disabled on box2: if the cursor is in box1, it should move to box3, skipping box2.
    If you know how, please tell me.

  • Golden gate extract from multiple oracle db's on different servers

    Hello,
    I am new to GoldenGate and I want to know if it is possible to extract data from an Oracle database which is on a different server. Below is the server list:
    Linux server 1: has an Oracle database (11.2.0.4) (a1db) and GoldenGate installed (11.2.1.0.3).
    Linux server 2: has an Oracle database (11.2.0.4) (b1db).
    a1db and b1db are not clustered; they are 2 separate instances on 2 different servers.
    Is it possible to capture change data on b1db from the GoldenGate installation on Linux server 1? I am planning to use classic capture.
    The architecture is as described above; can it be done? If so, what option will I be using in the extract?
    Thanks,
    Arun

    Here is something from my personal notes; hope this helps:
    Standby or Off Host Environment
    GoldenGate extracts, data pumps, and replicats can all work with database environments accessed using TNS. When one of these processes needs to work with a database environment over TNS, then instead of the following USERID specification:
    setenv (ORACLE_SID = "GGDB")
    USERID ggsuser, PASSWORD encrypted_password_cipher_text
    the following USERID specification would be used:
    USERID ggsuser@GGDB, PASSWORD encrypted_password_cipher_text
    When this specification is used, the setenv line is not required since the process will connect over TNS.
    When a data pump or replicat is running in a standby or otherwise off-host environment, the USERID specification above is the only special requirement. It is recommended that the TNS entry contain the necessary failover and service name configuration so that if or when switchover or failover occurs, the process may continue once the environment is available again. If the data pump is using the PASSTHRU parameter, then a USERID specification is not required: when operating in PASSTHRU mode, the data pump does not need a database connection to evaluate the metadata.
    When a source extract is running in a standby or otherwise off-host environment, the USERID specification above is required, as well as Archive Log Only mode. Again, it is recommended that the TNS entry contain the necessary failover and service name configuration so that the process may continue after a switchover or failover once the environment is available again. The source extract requires a database connection in order to evaluate the metadata that occurs in the archived redo log. Since the source extract runs in an environment separate from the source database environment, it is unable to read the online redo logs, and therefore must be configured in Archive Log Only mode. If the environment the source extract runs in is a standby environment, it will evaluate the archived redo logs through switchover.
    The standby or off-host environment has minimal requirements: Oracle software availability and storage for archived redo logs. If the database environment where GoldenGate will be running is a standby database environment, GoldenGate can use the required shared libraries from that standby environment. However, if GoldenGate is being executed from a server environment that does not contain a database environment, a client installation is required at a minimum; this provides GoldenGate with the shared libraries needed to satisfy its dynamically linked library dependencies. The archived redo logs must also be available for GoldenGate to read. They can be made available using a shared storage solution or a dedicated storage solution. A standby database environment works well for this purpose, as it receives archived redo logs on a regular basis; GoldenGate can leverage these archived redo logs without imposing additional infrastructure requirements in order to evaluate and capture the data changes from the source database. For GoldenGate to be used with a standby database environment for archived redo log access, only a minimal standby database is required: specifically, the standby database needs to be mountable so that it can accept archived redo logs. Since GoldenGate will connect to the primary database to evaluate the metadata contained in the archived redo logs, a complete standby database is not required.
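    As a small sanity check for such an off-host setup (v$archived_log is a standard Oracle view, but treat this as a sketch): from the environment where the Archive Log Only extract will run, confirm that archived redo logs are actually arriving and registered.

    -- List archived logs registered in the last day:
    SELECT name, first_time, next_time
      FROM v$archived_log
     WHERE first_time > SYSDATE - 1
     ORDER BY first_time;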

  • Extraction from generic datasource

    Hello,
    We are planning to build a generic datasource on a custom table in ECC. I wanted to know if there will be any issues with the extraction if records are being inserted into the table while the extraction is going on?
    Any help would be appreciated.
    Thanks
    Viraj

    Hi,
    The lower safety interval actually helps avoid data loss, but it is not a 100% foolproof method; its correctness depends on how you set the safety interval.
    Not necessarily, but it may sometimes happen that a record falls beyond the safety interval and still fails to get extracted in the delta.
    You can refer to the following thread for more detailed information about timestamps:
    [Safety interval in generic extraction]
    Regards,
    Durgesh.
    Edited by: Durgesh Gandewar on Jul 20, 2011 3:02 PM
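    For illustration, a sketch of the selection window that a timestamp-based generic delta with safety intervals applies; the table, field, and variable names are hypothetical. Note that the overlapping lower interval can re-extract records, which is why such deltas are usually loaded into a target that can overwrite, e.g. a DSO:

    -- Each delta run re-reads a small window below the stored pointer so
    -- records committed late are not lost:
    SELECT *
      FROM ztimesheet_log
     WHERE last_changed >  :last_pointer   - :lower_safety_interval
       AND last_changed <= :current_run_ts - :upper_safety_interval;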
