DSO - Source Systems

We are using Bobj DS, and in line with our solution we have DSOs with data.
Previously, in phase 1, we used 7.0 datasources and the model was loading without problems. However, in this phase, when I fill the table opm_sources with these DSOs and set Source Type = DSO, I don't see this "new system" in the data management tool. I mean, I fill OPM_SOURCES in this way:
Source_ID: INTURSA##Z_INTPO
Source_Name : Z_INTPO_DSO
Upload_Type: PO
Source_Type: DSO
Then, I complete the table OPMDM_SYSTEM with
OPM_SYSTEM: INTURSA
However, I see this DSO (Z_INTPO) under XSA_PCFILE source system in the data management tool.
I debugged OPM_DM_FILE_CATEGORIZATION: if the source type is not "DS", then the source system is the default from RSXAFILES_GLOBAL-XASYSTEM, which is "XSA_PCFILE". So the OPM_SYSTEM for these DSOs is "XSA_PCFILE".
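For illustration, the fallback I saw while debugging can be sketched roughly like this in Python (the "##" prefix convention is my assumption based on the Source_ID format above; the real ABAP logic may differ):

```python
# Rough sketch (not the actual ABAP) of the categorization fallback:
# anything that is not a "DS" gets the default system from
# RSXAFILES_GLOBAL-XASYSTEM, which in our system is "XSA_PCFILE".
DEFAULT_XASYSTEM = "XSA_PCFILE"

def resolve_source_system(source_type: str, source_id: str) -> str:
    """Return the source system a source is categorized under."""
    if source_type == "DS":
        # For datasources the system prefix of the Source_ID is honored,
        # e.g. "INTURSA##Z_INTPO" -> "INTURSA" (assumed convention)
        return source_id.split("##", 1)[0]
    # DSOs (and everything else) fall back to the default file system
    return DEFAULT_XASYSTEM

print(resolve_source_system("DSO", "INTURSA##Z_INTPO"))  # XSA_PCFILE
```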
Is there a way to work with DSOs that belong to different source systems in order to use the data management tool? For example:
DSO_1 source System 1
DSO_2 source System 1
DSO_3 source system 2
DSO_4 source system 2
I don't want to use files.
Regards,
Gastón.

I think yes, you can compare data from different source systems:
you can build one universe based on one source system
and another universe based on the other source system.
In WebI you can create one report with two different queries, each one based on the universe for each source.
It's easy,
but you need to take care of the login credentials for the connection used by each universe.
Good luck

Similar Messages

  • Multiple rows in BI Infocube/or DSO for one row in source system

    Hi Gurus,
I have a typical scenario: I have a Z-table in the R/3 system and it contains some custom transaction data.
The business scenario wants it split into multiple records in BI (of course the keys for each record in BI would be unique). Either an InfoCube or a DSO can be used to store the data in BI.
Is it possible to do that? As far as my knowledge goes (I have limited BI exposure), one record (row) from the source system is transformed into one record (row) in BI in a simple transformation.
Please give me a rough idea of whether it is possible and how we can achieve it.
I know we can write an ABAP program and create a Z-table to do it and use that Z-table as the source for the InfoCube/DSO. But I want to use this only as a last option.
    Thanks
    Sumit

    Hi,
An ODS will be better.
Use a start routine:
DATA_PACKAGE --> package2
create your own package with your rules
(UPDATE / INSERT / MODIFY / APPEND ...)
package2 --> DATA_PACKAGE
and from 1 row you create n rows, as your rules require.
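A minimal sketch of that idea in Python (the split rule here, one output row per item, is purely illustrative; the real routine is ABAP):

```python
def start_routine(data_package):
    """Mimics a start routine that expands 1 source row into n rows.
    The split rule (one row per entry in 'items') is only an example;
    your own business rules go in its place."""
    package2 = []
    for row in data_package:
        for idx, item in enumerate(row["items"], start=1):
            new_row = dict(row)        # copy the incoming row
            del new_row["items"]
            new_row["item_no"] = idx   # make the BI key unique
            new_row["item"] = item
            package2.append(new_row)
    return package2                    # package2 --> DATA_PACKAGE

print(len(start_routine([{"doc": "4711", "items": ["A", "B", "C"]}])))  # 3
```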
    Regards,

  • Source system - DSO-1 - DSO-2 - Cube (Performance improvement)

    Hello ppl,
    If I have a data flow like written below:
    Source system --> DSO-1 --> DSO-2 --> Cube
    Situation is:
1. Data load from the source system to DSO-1 is full and cannot be changed to delta; it's a restriction.
2. The amount of data flowing into DSO-1 is in the millions daily (because it is full and growing).
3. Currently all DTPs between the source system and the cube are maintained as full.
The requirement is to improve performance. Is it possible to make the DTPs delta after DSO-1? Since I am deleting all records from DSO-1 and its PSA daily, I'm not sure if it will work. If yes, how?
Any suggestions…
    Regards,
    Akshay

DTP delta is request-based delta from a write-optimized DSO. The DTP should have the delta option selected.
DTP delta is request-based delta for a standard DSO as well.
You will find the initialization option in the DTP on the Execution tab (select type Delta in the Extraction tab).
Since you will be doing the init from a DSO, the system will also give you the option to select the DSO table for extraction, i.e. Active/Changelog.
Select Active if you are using a write-optimized DSO and Changelog if you are using a standard DSO for DSO-1.
Now coming back to your case: using a write-optimized DSO will definitely enhance performance, as you don't have to activate the request. But I recently had a very bad experience with write-optimized DSOs regarding delta,
so I don't trust a write-optimized DSO for pulling delta from it.
If you want to use it, test it as much as possible first. I would still go with standard (the beauty of the changelog table).
Regarding the DTP: once you load the data from the DSO, you can load the init information even after the load. Or load the first request from the DTP without data transfer for the init and then run deltas; both will work fine. It's similar to the InfoPackage option called "init without data transfer".
Refer to this thread, a really nice discussion on this:
BI7: "Delta Init. Extraction From..." under DTP Extraction tab?
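The table choice above boils down to this quick sketch (the type names are just labels for the example, not SAP constants):

```python
def dtp_extraction_table(dso_type: str) -> str:
    """Which DSO table a delta DTP reads from, per the advice above."""
    return {
        "write_optimized": "Active",    # no changelog exists
        "standard": "Changelog",        # keeps before/after images
    }[dso_type]

print(dtp_extraction_table("standard"))  # Changelog
```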

  • Doesn't load from source system to PSA and DSO

    Hi Gurus,
We have launched a delta load from the source system (POSDM) to a DSO. In the source system there are 5,000,000 rows but none were loaded into the DSO or PSA. The delta load finished OK, but no data was loaded.
If the delta load were launched again, would those 5,000,000 rows be loaded, or would they no longer be available to load?
Any idea about this issue? Any feedback will be really appreciated.
    Thanks in advance.

    Hi David,
Are you sure these 5 million records are delta records and should be pulled into BW as delta?
Did you count the number of records in the underlying tables?
Which datasource are you using?
Delta loads are supposed to bring only the new or changed records since the last data load, not all the records.
Since the request is green, as you said, and still shows 0 records, it is possible that nothing has changed since the last delta. Try to see the details of the data load; if it failed, maybe that's the reason you are not able to see any records.
If you schedule the delta again it will bring the records changed or created after the last delta.
If the delta was unsuccessful, manually turn the QM status and overall status of the request to red in the monitor, delete it from the PSA and from all the targets into which it was loaded, and schedule the delta again; it should then bring the delta.
    Thanks
    Ajeet
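The point about delta scope can be illustrated with a small sketch (hypothetical field names; a real delta queue works on extractor pointers/timestamps, not like this):

```python
def delta_records(all_records, last_delta_ts):
    """A delta brings only records created/changed after the last
    delta - so a green request with 0 records can be perfectly normal."""
    return [r for r in all_records if r["changed_at"] > last_delta_ts]

table = [{"id": 1, "changed_at": 10}, {"id": 2, "changed_at": 25}]
print(delta_records(table, last_delta_ts=30))  # [] - nothing changed
```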

  • Changing the source system in QA

    Hello All,
    I just wanted a quick opinion from your experience on the following issue:
We have an ECC Dev client 20 connected to BI Dev client 20.
Similarly, we have ECC QA 120 connected to BI QA 120.
But for some reason we now want to connect a new ECC QA client 150 to BI QA 120 and take out ECC QA 120 entirely (120 is wiped off).
I already have a lot of development transported to BI QA where the source system is ECC 120. Now if 120 dies and we pull the data from 150, what are the pitfalls to watch for?
All my master data objects, DSOs and cubes still point to ECC QA client 120, but now the "actual" source system is going to be ECC QA 150.
    Points,
    Gaurav

You need to replicate all the datasources in BI from QA 150 and change the source system assignment to 150 instead of 120.
Also, you need to reinitialize the delta for delta-enabled extractors.
Two things to watch out for:
1. If you don't have a source system identifier and you happen to get records with the same key, they will be overwritten. This applies to both master and transaction data.
2. For transaction data that is not delta-enabled, there is a possibility that records will get duplicated. So it is better to delete the old requests before reloading the data from QA 150.
    Ravi Thothadri
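Point 1 above is easy to see with a toy example (hypothetical field names; real master data tables key on the characteristic value):

```python
def load_master_data(target: dict, records, key="matnr"):
    """Without a source-system identifier in the key, a record coming
    from the new client (150) silently overwrites the one loaded
    earlier from client 120."""
    for rec in records:
        target[rec[key]] = rec  # same key -> overwrite, no warning
    return target

md = {"100": {"matnr": "100", "src": "QA120"}}
load_master_data(md, [{"matnr": "100", "src": "QA150"}])
print(md["100"]["src"])  # QA150 - the QA120 record is gone
```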

  • How to merge key field from external source system with SAP R/3 master

    Hi,
In our SAP BW 7.0 system, the scenario is: master data for 0GL_ACCOUNT is coming from SAP R/3 along with the transactional data records for the standard FI cubes. Then one more set of transaction data is coming from an external source system, a flat file, into another custom DSO (ZDSO_FI), which also has this GL Account field.
This flat file's GL account, GL_file, has to be mapped/merged with the standard 0GL_ACCOUNT field so that, at the time of loading the transactional data for the custom DSO ZDSO_FI (with transformation mapping GL_file -> 0GL_ACCOUNT), the system automatically refers to the 0GL_ACCOUNT master data to load the incoming transactional values from the external flat file. How can this be done?
    To illustrate the scenario, say I have 5 records in 0GL_ACCOUNT, loaded from SAP R/3 into SAP BW-
    0GL_ACCOUNT      Short Description     Source System
    100                                   D1                          R/3
    200                                   D2                          R/3
    300                                   D3                          R/3
    400                                   D4                          R/3
    500                                   D5                          R/3
    Now suppose if my flat file has following sample transactional data, to be uploaded in SAP BW  ZDSO_FI-
    GL_file      Key Figure1
    400          789
    200          567
    Then after uploading this transactional data in ZDSO_FI (with transformation mapping GL_file > 0GL_ACCOUNT), the 0GL_ACCOUNT data becomes as below-
    0GL_ACCOUNT      Short Description     Source System
    400
    200
    100                                   D1                          R/3
    200                                   D2                          R/3
    300                                   D3                          R/3
    400                                   D4                          R/3
    500                                   D5                          R/3
So note that the system did not match the incoming GLs from the flat file to the already available master data, although the field is mapped to 0GL_ACCOUNT in the transformation. Rather, it created 2 new data rows for the GL accounts coming from the external system. Because of this I am not able to perform calculations combining the standard FI cube and ZDSO_FI with GL account as the key field. I need to synchronise these data values based on GL Account to proceed with further calculations and am badly stuck.
Can anyone please throw some light on how to achieve this seemingly simple requirement?
    Thanks in advance.
    Nirmit

Better to post this thread in the Enterprise Data Warehousing forum.
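One likely explanation for the two extra rows, sketched in Python: if 0GL_ACCOUNT is compounded with a source-system characteristic (the "Source System" column above suggests this), the full master-data key includes that system, so "400" from the flat file is a different key than "400" from R/3. This is an assumption about the model, not something the post confirms:

```python
def md_key(gl: str, source_system: str) -> tuple:
    """Compounded master-data key: (source system, GL account)."""
    return (source_system, gl)

# Master data loaded from R/3
md = {md_key("400", "R/3"): "D4", md_key("200", "R/3"): "D2"}

# Flat-file rows arrive with a blank source system...
print(md_key("400", "") in md)  # False -> BW creates a new, empty row
```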

  • RSAR238 : IDoc type for source system T90CLNT090 is not available

    Hi all,
I am working on ECC 6.0. I have created an ODS object using RSA1OLD. While activating the ODS, I am getting this error:
    1. Error:
    R7I028
    Object could not be activated
    2. Error:
    RSAR238
    IDoc type for source system T90CLNT090 is not available
    Error when creating the export datasource and dependent objects
    Kindly help me with this error.
    Thanks and regards,
    Dhanapal

    Hi,
Check the same on SDN:
Object ZDUMMY could not be activated (Message no. R7I028)
    Re: DSO Activation problem
    http://sap.ittoolbox.com/groups/technical-functional/sap-bw/test-cases-for-module-human-capital-management-1425016
    http://sap.ittoolbox.com/groups/technical-functional/sap-bw/idoc-type-for-source-system-t90clnt090-is-not-available-message-no-rsar238-1431565
    Thanks
    Reddy

  • Transport Error - DS does not exist in Source system

    Dear All,
I am getting the error below while transporting a DSO from DEV to Quality:
DataSource 8ZM_ALT does not exist in source system DCQCLI041 of version A.
I have checked that the datasource is active and available in the Development system, but it is giving this error while importing.
As per the data flow, the data of this DSO is updated to a cube.
Has anyone solved this issue? I have checked on SDN but there are no suitable answers.
    Regards
    Sankar

    Hi,
You're right, I am transporting the ZM_ALT DSO. The following is the error log we are getting:
    DataSource 8ZM_ALT does not exist in source system DCQCLI041 of version A
    Mapping between data source 8ZM_ALT and source system DCQCLI041 is inconsistent
    DataSource 8ZM_ALT does not exist in source system DCQCLI041 of version A
    Field /BIC/ZM_AA will not be delivered from DataSource 8ZM_ALT in source system DCQCLI041
    Field /BIC/ZM_APPL will not be delivered from DataSource 8ZM_ALT in source system DCQCLI041
    Field /BIC/ZM_BEM will not be delivered from DataSource 8ZM_ALT in source system DCQCLI041
    Field /BIC/ZM_MOD will not be delivered from DataSource 8ZM_ALT in source system DCQCLI041
    Field /BIC/ZM_NCDS will not be delivered from DataSource 8ZM_ALT in source system DCQCLI041
    Field /BIC/ZM_NDESC will not be delivered from DataSource 8ZM_ALT in source system DCQCLI041
    Field /BIC/ZM_NES1 will not be delivered from DataSource 8ZM_ALT in source system DCQCLI041
    Field /BIC/ZM_NES2 will not be delivered from DataSource 8ZM_ALT in source system DCQCLI041
    Field /BIC/ZM_NIN will not be delivered from DataSource 8ZM_ALT in source system DCQCLI041
    Field /BIC/ZM_NPCS will not be delivered from DataSource 8ZM_ALT in source system DCQCLI041
    Field /BIC/ZM_NPN will not be delivered from DataSource 8ZM_ALT in source system DCQCLI041
    Regards
    Sankar

  • RFC connection to source system is damaged , no Metadata uploaded

    Hello Friends,
    I need your help to understand and rectify why my transport is failing again and again.
    RFC connection to source system BT1CLNT200 is damaged ==> no Metadata upload
    Environment - Production Support (Dev - Quality - Testing - Production)
Live scenario: we need to create two InfoObjects and incorporate the fields into data targets and reports.
What we have done:
1st request - we created a new request and captured the two InfoObjects in the request.
2nd request - we captured the replicated datasource; along with it, the sequence below is displayed by default in the same package:
    Communication structure
    DataSource Replica
    Transfer Rules
    InfoSource transaction data
    Transfer structure
    3rd Request - we have captured DSO and Cube only in this request.
    4th Request - Captured 2 update rules (ODS) Update Rule (Cube)
    The above 3rd request failed in the testing system (successful in Quality System) and ODS is inactive in the Testing system.
    Testing system Error Message:
    The creation of the export DataSource failed
RFC connection to source system BT1CLNT200 is damaged ==> no Metadata upload
    Error when creating the export DataSource and dependent Objects
    Error when activating ODS Object ZEOINV09
    Error/warning in dict. activator, detailed log    > Detail
    Structure change at field level (convert table /BIC/AZEOINV0900)
    Table /BIC/AZEOINV0900 could not be activated
    Return code..............: 8
    Following tables must be converted
    DDIC Object TABL /BIC/AZEOINV0900 has not been activated
    Error when resetting ODS Object ZEOINV09 to the active version
So we captured the data mart datasource, created a new (5th) request (transported only the 5th request), and transported it in the sequence below:
    Communication structure
    Datasource Replica
    Transfer Rules
    Infosource transaction data
    Transfer structure
Development to Quality: successful.
It failed in the testing system again.
How can we rectify this error? Please help us.
    Thanks,
    Nithi.

    Hello All,
Question 1
I have checked the connections and typed out below what I saw on the screen.
Steps:
R/3 connection - I double-clicked on BT1CLNTXXX and tested the connections.
Connection test results:
Logon: 0 KB, 10 KB, 20 KB, 30 KB
R/3 connection: 5 msec, 0 msec, 1 msec, 1 msec, 1 msec
Please let me know whether the RFC connection is OK or not.
Question 2
I want to know whether there is any option to check the sequence before transporting the TR from Development to Quality (a "preview check"), so that we can avoid transport failures because of TR sequence.
    Regards,
    Nithi.

  • Infosource 8ZISCMS10 is not defined in the source system

    Hi Gurus,
I got an issue when loading data within SAP BW (the InfoSource is an ODS). It threw an error message stating "Infosource 8ZISCMS10 is not defined in the source system".
Could anyone help me in this regard, please?
N.B.: I activated the transfer structure through RS_TRANSTRU_ACTIVATE_ALL and replicated the datasources before loading.

When you activate a DSO/ODS (say ZISCMS10), it creates a datasource (8ZISCMS10) and an InfoSource (8ZISCMS10).
You can check these under Modeling --> Datasources --> BI --> BI source system name. If it is missing, reactivate the DSO/ODS.
If the load is failing in QA, you should reactivate the DSO/ODS in DEV and transport it to QA. After the transport, make sure the datasource (8ZISCMS10) and InfoSource (8ZISCMS10) appear in QA (Modeling --> Datasources --> BI --> BI source system name).

  • Initialization option for source system in scheduler tab

    SAP BW Gurus:
    Please help me with this question.
This is a new system setup. I used the initialization option to do two years of data loads. I ran eight loads, each load for a quarter's data.
Then I found there were loading issues with one load. So I need to delete that load and reload it again.
To do that, I need to delete that particular load in the initialization request.
When I highlighted that request and deleted it, a window popped up saying: "There are active init selections in source system for this data source. Therefore, only ALL init selections can be deleted at once."
If I chose "no", I couldn't delete that particular package. Then, just for the sake of trying, I clicked "yes", and all 8 initialization requests were gone!
Now I have two questions:
1. If all the requests are gone, does it mean I have to reload everything again for initialization? (It will take days to do it!) I already have two years of data in my cube and DSO. This is version 7. But I do need to schedule the delta process later.
2. If I have eight requests, am I able to delete only one of them? What does the message I listed above mean?
    Thanks so much, guys!!

    Hi,
The system maintains only one delta queue per datasource, irrespective of the number of inits done from the source system to the target system, and you can see this in RSA7.
You can do an init with data for any number of selections, but the delta for all those selections will be combined and will sit in one queue.
If you delete one init request it will delete the whole delta queue, as all these requests share the same delta queue, and therefore all the init selections will go away.
You should be careful when you do this.
Now, since all the delta is lost: if you can find a way to load the lost delta, for example a full repair based on created-on or changed-on date, then you can proceed as suggested. But on the safer side I would suggest repeating the init process, or following the relevant OSS notes.
    Thanks
    Ajeet
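Ajeet's point can be modeled with a toy class (illustrative only; RSA7 internals are of course not Python):

```python
class DeltaQueue:
    """One delta queue per datasource, shared by ALL init selections."""
    def __init__(self):
        self.init_selections = []

    def add_init(self, selection):
        self.init_selections.append(selection)

    def delete_init(self, _selection):
        # Deleting ANY single init invalidates the shared queue,
        # so the system removes all selections at once.
        self.init_selections.clear()

q = DeltaQueue()
for quarter in ["Q1", "Q2", "Q3", "Q4"]:
    q.add_init(quarter)
q.delete_init("Q2")
print(q.init_selections)  # [] - every init is gone, not just Q2
```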

  • Slow extraction from Source System

    Experts,
    I'm on the basis team and we're trying to figure out what we can do to increase the performance of a BI extraction from an ECC source system.
Our BI team is doing a full extraction of datasource 0UC_SALES_STATS_02, which has about 24 million records. This has been running for about 2.5 days and has only extracted 12.5 million records according to RSMO.
I'm no expert on BI performance, but I'm trying to get there. One thing we noticed is that we have over 1000 data packages, each with about 11K records.
RSBIW "control params for data transfer" in the BI system is set like this:
LOGSYS     | Max KB | Max Lines | Freq | Max Proc | Target Sys | Max DPs
OURBWSYS   | 20000  | 0         | 10   | 3        | blank      | blank
OURECCSYS  | 20000  | 0         | 10   | 3        | blank      | blank
    Also we only see one background process running on the ECC system which is the Zprogram that was written to perform the extract.
    We also checked IMG - sap netweaver - BI - links to other source systems - maintain contrl params for data xfer
    Freq/IDOC = 10
    pkg size = 50,000
    partition size = 1,000,000
    We are on NWEHP1 for our BI system
    We are on ECC6.0 EHP3sp5 for our ECC system
    We use Oracle 10.2.0.4.0 with it fully tuned
    We also think we have all the correct tuning parameters in our ECC/BI systems.  There are a few tweaks we could make but I don't see any glaring problems.  All our memory parameters seem to be in line with what is recommended.
    We do not see any kind of Memory/CPU bottlenecks.
Do you think the large number of data packages with few records per package is a problem?
Any tips to increase performance would be helpful; I'm still looking.
I saw a wiki thread that mentions looking at tcode RSA3 in the ECC system to monitor long extracts, but I have not figured out what needs to go into the fields to run it correctly.
    Any help you can provide would be great.
    Thanks,
    NICK

    HI Nick,
This problem is due to the huge volume of data; the job has been running for a long time. Normally we kill a job if it exceeds 8 hours while extracting data.
Please suggest to your BI team to pull the data in smaller volumes by providing selections in the InfoPackage. This drastically reduces the extraction time for huge data volumes and also avoids consuming all the resources in the system.
Please pass the links below to your BI team, as this is completely related to a design issue:
http://help.sap.com/saphelp_nw70/helpdata/en/39/b520d2ba024cfcb542ef455c76b250/frameset.htm
The type of DSO should be write-optimized to increase load performance:
http://help.sap.com/saphelp_nw70/helpdata/en/34/1c74415cb1137de10000000a155106/frameset.htm
2. At your end, look at the tablespaces of the PSA tables; they may be exceeding their capacity. Make them extend automatically instead of manually assigning space at regular intervals.
    Hope this helps.
    Regards,
    Reddy
    Edited by: Reddybl on Apr 2, 2010 12:23 AM
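As a back-of-the-envelope check on the package count in the question above (the ~1.8 KB record width is a guess chosen to match the observed ~11K records per package; the real width comes from the transfer structure):

```python
def package_math(total_records: int, maxsize_kb: int, rec_bytes: int):
    """Records per package ~= MAXSIZE / record width; the number of
    packages then follows from the total record count."""
    per_package = (maxsize_kb * 1000) // rec_bytes
    n_packages = -(-total_records // per_package)  # ceiling division
    return per_package, n_packages

per_pkg, n = package_math(24_000_000, maxsize_kb=20_000, rec_bytes=1_800)
print(per_pkg, n)  # roughly 11K records/package and 2000+ packages
```

With MAXSIZE at 20,000 KB this yields the thousands of small packages observed; raising MAXSIZE (or the InfoPackage package size) reduces the package count and the per-package overhead.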

  • DataSource 80SD_O03 does not exist in source system of version A

    Hi,
While activating the standard DSOs I'm getting this error: "DataSource 80SD_O03 does not exist in source system of version A".
    Regards,
    Devi.

    Hi,
Hope you followed the same sequence in R/3.
Datasource: RSA5 -> RSA6
BI side:
RSA1 -> BI Content -> select datasource -> installation -> replicate/activate the DS
And the same procedure for the InfoProvider too.
Please analyze the procedure closely; somewhere you are missing something.
    Regards,
    Suman

  • Failure of infopackage step in process chain - Error in source system.

    Hello Experts,
    We are experiencing this during the execution of process chain.
All the steps for data loads within BW (DSO to DSO, DSO to cube) are failing with the errors:
- Error in source system
- Error during data selection
Please note that we are on the 3.5 data flow, so data loads within BW go through an emulated datasource and an InfoPackage. The steps running such InfoPackages for the emulated datasources are failing in the process chain with the errors above.
Surprisingly, when I run the InfoPackage manually outside the process chain, it runs successfully without any error. But when I run it through the process chain it fails.
Any clue or suggestion? We have already tried re-initialization for the InfoPackage.
    Appreciate your inputs.
    Regards,
    PK.

    Hi,
When we run data loads through a process chain, they run in the background; your request is processed according to the available background work processes on the application servers.
It may be that during your load time the server is under high load.
Please check your system's busy periods and schedule your process chains accordingly.
Check transaction SM50; if you have few application servers, ask Basis to create a few more work processes to handle the requests at the busiest times.
Also look at increasing parallel processing and the default data transfer settings:
go to the InfoPackage --> menu Scheduler --> Data settings for default data transfer.
    Thanks

  • Change in source system client number - how will it affect BI?

    Hello,
    I have the following landscape:
    R/3 DEV (client 100) connected as a source system to BI DEV
    R/3 QA (client 100) connected as a source system to BI QA.
    R/3 PROD (client 100) connected as a source system to BI PROD.
There is some problem with the R/3 QA server and we want to change it and have a different client (200) for it. As far as the developments in R/3 (datasource enhancement, activation, etc.) go, they will not be affected by this change. I'm worried about how this will affect BI.
Transports: currently we have client 100 as the target for transports to BI QA. If I change it to 200, would I have to re-transport all the development? I think I would, since even if I replicate the datasources from the new client, I will not get the data flow; for that I'll have to transport everything again. If I'm right, is there a workaround for this?
Data models: I have not put anything in BI PROD as yet. The old transports contain the metadata for the cubes, ODSs, etc. along with the data flow. I don't think I can move them into production, since I would get two transports with the same target. Won't that create some problems? How do I do this? Do I need to collect my cubes, ODSs, etc. again?
    I appreciate all the help.
    Thanks
    Sameer

    Thanks Koundindya.
I did all the steps as shown in the thread. I have a small problem.
The data flow (3.x and 7.0) for the cube is correct. It shows the new client for the datasources.
For DSOs and master data InfoObjects with 7.0 data flow and datasources, the DTP shows the old datasource and the system displays the message: source system XXXXX does not exist. In the data flow for the transformations, it shows the correct data flow and the correct datasource.
Is there any note I need to look at, or is this a bug in 7.0?
    Regards,
    Sameer
