Transformation issue while connecting multiple source systems of the same type

Hi,
I'm working on an APO-BW integration project. I want to connect both BW QA and BW PROD to APO QA. I've already connected BW QA to APO QA and it is fetching data correctly; the RFC and other Basis parts are already taken care of. I've created the export DataSources (for master data, cubes, etc.) in BW PROD and replicated them in APO QA. All BW PROD DataSources are imported as RSDS (7.x) DataSources in APO QA.
Now my question is: how do I make sure the transformations take BW PROD as the source system too? I want two DataSources (one from BW QA and one from BW PROD) to connect to the same InfoProvider in APO. I've transported the 7.x DataSources from APO DEV to APO QA along with the transformations. In the conversion of logical system names in APO QA I've maintained source system BW DEV with target system BW QA, so after import the transformations automatically point to the BW QA DataSource. I cannot also maintain source system BW DEV with target system BW PROD, as that conflicts with the previous setting.
Is my only option to maintain the transformation from the BW PROD DataSource manually, or is there a better alternative?
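To show the conflict I mean, here is a rough sketch (plain Python with made-up system names, not SAP code) of how I understand the conversion of logical system names: the mapping is keyed by the original source system, so one original system can point to only one target system per conversion setting.

conversion = {}

def map_logical_system(original, target):
    # the conversion table allows exactly one target per original logical system
    if original in conversion and conversion[original] != target:
        raise ValueError(f"{original} already maps to {conversion[original]}, cannot also map to {target}")
    conversion[original] = target

map_logical_system("BWDEV", "BWQA")        # existing entry: transformations point to BW QA after import
try:
    map_logical_system("BWDEV", "BWPROD")  # second entry conflicts with the previous setting
except ValueError as err:
    print(err)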

HI,
You are getting multiple IDocs into BPM, but as output you are getting only one message, right? If so, your N:1 mapping is not done correctly.
That is, your target message type should have an occurrence of 1..n, and so should the interface mapping. Closely observe the N:1 mapping done in the BpmPatternCollectMerge pattern.
Also go to SXMB_MONI -> PE and check the technical details to make sure that you are getting multiple IDocs and that you are executing the N:1 transformation step correctly.
Also, in the N:1 mapping, check the context you used and set it to root. Then test the mapping independently in the Integration Repository.
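Just to illustrate the N:1 idea, here is a rough sketch (plain Python with a hypothetical payload structure, not actual PI mapping code): all of the collected source messages must end up inside one repeating node of a single target message.

# three IDocs collected by the BPM collect step (hypothetical payloads)
collected_idocs = [
    {"order": "4711", "qty": 10},
    {"order": "4712", "qty": 5},
    {"order": "4713", "qty": 7},
]

# N:1 merge: one target message whose item node has occurrence 1..n
target_message = {"items": list(collected_idocs)}
print(len(target_message["items"]))  # 3 items, but only ONE output message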
Hope it helps,
Regards,
Moorthy

Similar Messages

  • SLT Replication for the same table from Multiple Source Systems

    Hello,
    With HANA 1.0 SP03, it is now possible to connect multiple source systems to the same SLT replication server and from there to the same schema in SAP HANA - does this mean the same table as well? Or will it be different tables?
    My doubt:
    Consider that I am replicating the information from KNA1 from two source systems - say SourceA and SourceB.
    If I have different records in SourceA.KNA1 and SourceB.KNA1, I believe the records will be added during the replication and, as a result, the final table will have two different records.
    Now, if the same record appears in the KNA1 tables of both sources, the final table should hold only one record.
    Also, if the same customer's record is posted in both systems with different values, both records should be added.
    How does HANA handle this situation?
    Please throw some light on this.
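    To illustrate my doubt, here is a rough sketch (plain Python with made-up data, not SLT code): if the replicated table is keyed only by the customer number, the record coming from the second source system simply overwrites the one from the first, unless the source system is somehow part of the key.

kna1_source_a = {"0000100001": {"NAME1": "ACME Ltd"}}
kna1_source_b = {"0000100001": {"NAME1": "ACME Limited"}}   # same customer key, different values

replicated = {}
replicated.update(kna1_source_a)
replicated.update(kna1_source_b)     # the second source silently wins
print(replicated)                    # only one version of the record survives

# only if the source system were part of the key could both records coexist
with_source_key = {("SourceA", "0000100001"): kna1_source_a["0000100001"],
                   ("SourceB", "0000100001"): kna1_source_b["0000100001"]}
print(len(with_source_key))          # 2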

    Hi Vishal,
    I suggest you take a look at the SAP HANA SPS03 Master Guide. There is a comparison table of the three available replication technologies (see page 25).
    For multi-system support, it gives these values:
    - Trigger-Based Replication (SLT Replication): multiple source systems to multiple SAP HANA instances (one source system can be connected to one SAP HANA schema only)
    So I think that in your case you should consider BO Data Services (losing the real-time analytics capabilities, of course).
    Regards
    Leopoldo Capasso

  • 10.9.2 Update Issue (VPN) - Eclipse Perl debugger issues while connected to VPN

    This post was initially added to this discussion: 10.9.2 Mavericks update issues
    I have yet another issue related to the 10.9.2 update: Eclipse Perl debugger issues while connected to VPN...
    One of the big changes introduced by the 10.9.2 update is a set of VPN changes (security fixes). Unfortunately, whatever these changes are, they "broke" the Eclipse (open-source IDE) debugger. I am not sure whether *all* programming languages (Eclipse plugins) are affected by this, but I know for sure that the EPIC (Perl plugin) debugger *stopped working* while the system is connected through VPN.
    Here is the error that pops up in Eclipse:
    Timed out while waiting for Perl debugger connection
    … and here is the exact exception stack that gets printed:
    Unable to connect to remote host: 130.10.210.74:5000
    Compilation failed in require.
    at /Users/valeriy/workspace/ROBO-PROD-RA-685/src/lib/test/Val_test.pm line 0.
              main::BEGIN() called at /Users/valeriy/workspace/.metadata/.plugins/org.epic.debug/perl5db.pl line 0
              eval {...} called at /Users/valeriy/workspace/.metadata/.plugins/org.epic.debug/perl5db.pl line 0
    BEGIN failed--compilation aborted.
    at /Users/valeriy/workspace/ROBO-PROD-RA-685/src/lib/test/Val_test.pm line 0.
    Can't use an undefined value as a symbol reference at /Users/valeriy/workspace/.metadata/.plugins/org.epic.debug/perl5db.pl line 7596.
    END failed--call queue aborted.
    at /Users/valeriy/workspace/ROBO-PROD-RA-685/src/lib/test/Val_test.pm line 0.
    (Of course the IP address changes dynamically for each VPN session.)
    I was able to confirm that this issue is related to the 10.9.2 update:
    The issue *does not* exist under 10.9.1 (I had to revert to 10.9.1 to get it working again)
    No other updates were performed around the time the 10.9.2 update occurred (I verified that using the Software Update log)
    No configuration changes were introduced around the same time
    Reverting to 10.9.1 using Time Machine (thank god I had a backup!!!) fixed the issue
    Steps to reproduce this issue:
    In Eclipse, use EPIC (Perl plugin) to debug any Perl script while *not* connected through VPN - the EPIC debugger works
    Connect to VPN
    Start Epic debugger to debug same script
    The debugger *does not* start, and the "Timed out while waiting for Perl debugger connection" error pop-up comes up after some time. At the same time, the exception stack (listed above) is printed in Eclipse's console.
    I am a programmer/software developer; I work remotely (telecommute) and thus have to rely on VPN to connect to the company's intranet. Perl is the primary language used by my team, and we use the Eclipse IDE with the EPIC plugin heavily. EPIC's debugger is a *very large* part of my work; I cannot work without it. So in essence, 10.9.2 has *entirely* disrupted my ability to work! It took me almost a week to get back to a normal work environment, and I cannot afford to let that happen again... I need Apple's development team to resolve this VPN-related issue as soon as possible! Because of this issue, I am *stuck* with 10.9.1 and cannot upgrade my laptop to any newer version. In fact, I had to disable system updates just so I do not run into this issue again... I contacted Apple's Tech Support on 02/28 with this issue (Ref: 582428110), asking them to raise a trouble ticket. Since then, I have tried to follow up on that issue, but I do not get any information. Please advise on the status:
    is there a trouble ticket to track this issue?
    is there any progress?
    what's the ETA for an update that fixes this problem?
    - Val

    Am I the only one experiencing this issue ???

  • Error in ODI while configuring ERPi source system

    Hi All,
    Hyperion Version : Hyperion 11.1.2.1
    OS: windows 2008
    I'm getting the following error in ODI while configuring the ERPi source system in Workspace.
    "942 : 42000 : java.sql.SQLException: ORA-00942: table or view does not exist
    java.sql.SQLException: ORA-00942: table or view does not exist "
    Can anyone guide me on how to resolve this? Thanks in advance.
    Regards,
    Ravi

    Hi All,
    Please find below the error log from the ERPi configurator [Workspace] and let me know the further steps to solve this issue.
    ERPI Process Start, Process ID: 71
    ERPI Logging Level: DEBUG (5)
    ERPI Log File: C:\Windows\TEMP\2\/aif_71.log
    Jython Version: 2.1
    Java Platform: java1.4.2_08
    COMM Create Process Detail - Create Process Detail - START
    COMM Create Process Detail - Create Process Detail - END
    Error in Start PS HCM Setup Source System
    java.lang.Exception: The scenario did not end properly.
    at com.sunopsis.dwg.dbobj.SnpScen.a(SnpScen.java)
    at com.sunopsis.dwg.dbobj.SnpScen.localExecuteSync(SnpScen.java)
    at com.sunopsis.dwg.tools.StartScen.actionExecute(StartScen.java)
    at com.sunopsis.dwg.function.SnpsFunctionBaseRepositoryConnected.execute(SnpsFunctionBaseRepositoryConnected.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execIntegratedFunction(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSqlS.treatTaskTrt(SnpSessTaskSqlS.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
    at com.sunopsis.dwg.cmd.DwgCommandScenario.treatCommand(DwgCommandScenario.java)
    at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
    at com.sunopsis.dwg.cmd.e.i(e.java)
    at com.sunopsis.dwg.cmd.h.y(h.java)
    at com.sunopsis.dwg.cmd.e.run(e.java)
    at java.lang.Thread.run(Unknown Source)
    ERPI Process End, Process ID: 71
    Thanks
    Ravi

  • Possibility of connecting multiple BW systems to single ECC system

    Hi all,
    I have a requirement in my company to connect multiple BW systems to a single ECC system. I would like to confirm whether this might not be supported by all BW extractors, in particular the delta mechanism.
    For the logistics extractors, RSA7 on the ECC side stores the BW system as the target, which shows they support multiple BW systems as targets. But the FI extractors, which keep their delta timestamps in table BWOM2_TIMEST, do not store the target system. I'm afraid that if we connect multiple BW systems to a single ECC system, some of the delta records might go to one BW system and others to the other BW system when we execute delta loads in both BW systems.
    Thank you very much in advance.
    regards,
    arie

    Hi,
    You can very well connect multiple BW systems to a single ECC system, and you will not miss any delta either. In the source system, a separate delta queue is maintained for each delta DataSource and each target system.
    For example, if I have two BW systems ABW and BBW connected to the AEC system, then for the 0FI_AP_4 DataSource we will have DQ1_ABW and DQ2_BBW as two different delta queues (I am not following any naming convention; the actual names will be based on the target systems' logical names).
    The settings maintained in table BWOM2_TIMEST are system-wide and define some global settings, e.g. up to what time the FI data should be extracted into BW. For example, if the time maintained is 2:00 AM, then even though you extract from multiple systems, the same data will be extracted until the next time interval occurs.
    In summary, you can very well connect multiple BW systems to a single ECC system.
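    A small sketch of that idea (plain Python, hypothetical names, not how RSA7 is actually implemented): because every target system has its own delta queue, a delta load from one BW system does not consume the records meant for the other.

from collections import defaultdict

delta_queues = defaultdict(list)               # one queue per (DataSource, target system)

def post_delta(datasource, record, targets):
    for target in targets:                     # every connected target gets its own copy
        delta_queues[(datasource, target)].append(record)

def read_delta(datasource, target):
    records = delta_queues[(datasource, target)]
    delta_queues[(datasource, target)] = []    # only this target's queue is emptied
    return records

post_delta("0FI_AP_4", {"docnr": "1900000001"}, targets=["ABW", "BBW"])
print(read_delta("0FI_AP_4", "ABW"))           # ABW receives the record ...
print(read_delta("0FI_AP_4", "BBW"))           # ... and BBW still receives it too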
    Regards,
    Durgesh.

  • Creating multiple source systems with one logical system

    Hi,
    The requirement is to create two source systems using the same logical system name but different RFC connections to ECC.
    We want to do some regular activities using one source system and some drill-down activities using a different source system.
    When I try to create the second source system, I get the error "source system already exists".
    Please advise.
    Thanks.

    Hi,
    I don't think it is possible to create multiple source systems for a single logical client. You have to create multiple clients, and then you can connect them to the BI system via different RFCs.
    Hope this helps.....
    Regards,
    Suman

  • Could not make connection to source system

    Hello Experts,
    I am trying to fetch data for last month via a selection in the InfoPackage, but when I try to enter the selection (pull-down button), I get the error message "Could not make connection to source system...".
    Any idea why this message appears?
    I have already checked the connection to ECC - it is fine.
    The DataSource is replicated.
    Also, if I run the InfoPackage without a selection, I get a yellow status and no data is fetched. RSA3 shows many rows, but the InfoPackage brings in no data (full extraction).
    Regards,
    SM

    Srini/Vikram,
    The user credentials are good. The SM59 connection is good. There is data for the selection, as RSA3 fetches rows in ECC.
    When I run any InfoPackage, I get a yellow status and no data, while RSA3 indicates data for a full update.
    Why is that?
    I am more of a design person, not much into hands-on work, so I am seeing these errors for the first time.
    Q1) A package is already scheduled, let us say through DataSource 2LIS_11_VAITM. Now I want to go back and retrieve last month's data, already extracted, once again. I should be able to pull this data in a new InfoPackage using a full update with the data selection option, right?
    Q2) RSA3 fetches data based on the setup tables, so any InfoPackage in full update mode should fetch data, independent of any other InfoPackage, right?
    Q3) We need to delete and refill the setup tables only if the extract structure is changed, right?
    regards, sm

  • Issue while connecting from Biztalk adapter to SAP ECC 6.0

    Hi Friends
    We have an issue connecting from Visual Studio 2005 to SAP ECC 6.0 through BizTalk Adapter 2.0: even though we pass the correct login details, the system throws an error about an incorrect user/password.
    Can anybody please tell me the solution for this?
    Regards
    Praveen

    HI,
    Is there a special format required for the user/password?
    Maybe a missing domain, or missing characters like "/" or "\"?
    Do you get this error in the BizTalk monitor (Visual Studio 2005)?

  • RFC connection to source system is damaged , no Metadata uploaded

    Hello Friends,
    I need your help to understand and rectify why my transport is failing again and again.
    RFC connection to source system BT1CLNT200 is damaged ==> no Metadata upload
    Environment - Production Support (Dev - Quality - Testing - Production)
    Live scenario - we need to create two InfoObjects and incorporate the fields into the data targets and reports.
    What have we done?
    1st request - we created a new request and captured the two InfoObjects in it.
    2nd request - we captured the replicated DataSource; along with it, the sequence below was included by default in the same request:
    Communication structure
    DataSource Replica
    Transfer Rules
    InfoSource transaction data
    Transfer structure
    3rd request - we captured only the DSO and the cube in this request.
    4th request - captured the update rules (ODS) and the update rules (cube).
    The 3rd request above failed in the testing system (it was successful in the quality system) and the ODS is inactive in the testing system.
    Testing system error message:
    The creation of the export DataSource failed
    RFC connection to source system BT1CLNT200 is damaged ==> no Metadata uploa
    RFC connection to source system BT1CLNT200 is damaged ==> no Metadata uploa
    Error when creating the export DataSource and dependent Objects
    Error when activating ODS Object ZEOINV09
    Error/warning in dict. activator, detailed log    > Detail
    Structure change at field level (convert table /BIC/AZEOINV0900)
    Table /BIC/AZEOINV0900 could not be activated
    Return code..............: 8
    Following tables must be converted
    DDIC Object TABL /BIC/AZEOINV0900 has not been activated
    Error when resetting ODS Object ZEOINV09 to the active version
    So we captured the data mart DataSource and created a new (5th) request (transported only the 5th request) with the sequence below:
    Communication structure
    Datasource Replica
    Transfer Rules
    Infosource transaction data
    Transfer structure
    Development to quality - successful.
    It failed in the testing system again.
    How can we rectify this error? Please help us.
    Thanks,
    Nithi.

    Hello All,
    *Question 1*
    I have checked the connections and typed out below what I saw on the screen.
    Steps:
    R/3 connection - I double-clicked on BT1CLNTXXX and tested the connection.
    Connection Type :
    Logon - 0KB , 10 KB , 20KB, 30 KB
    R/3 connection:
    5 msec, 0msec, 1msec,1msec,1msec
    Please let me know whether the RFC connection is OK or not.
    Question 2
    I want to know whether there is any option to check ("preview check") the sequence before transporting the TRs from development to quality, so that we can avoid transport failures caused by the TR sequence.
    Regards,
    Nithi.

  • MDP to HDMI adapter causing issues while connecting Macbook air late 2013 edition to Projector. Projector shows grains on the display.

    MDP to HDMI adapter causing issues while connecting Macbook air late 2013 edition to Projector. Projector shows grains on the display.

    Sounds like you may have been in extended display mode. If so, all you had to do was drag a window to the projector display. Or, if you wanted mirrored mode, check the mirror displays option in the Displays preferences.

  • Modeling the Data model when multiple source systems

    Hi gurus,
    I have a scenario where I am getting data from multiple source systems, all R/3 but from different locations.
    My doubt is about how to do reporting on the available data:
    1. Do I have to build separate InfoCubes for each source system, or separate data designs for each source system?
    2. How can I consolidate the data from the different source systems into one and report on it sensibly, without losing the source system identification?
    thanks and regards
    Neel

    Hi all,
    Thanks for your input. Yes, I am also thinking of a flexible solution where I can do it easily and without much complexity, but I wanted to try different options as well so that we can go for the best one.
    I thought of a MultiProvider at first when RafB suggested it, and I agree with you as well, Lilly, that data volume will be a problem. Keeping all this in view, I want to build a solution that is sensible and not confusing in the reports (I mean clear and readable reports).
    [email protected]
    Please kindly forward any documents which might be helpful to me in this scenario.
    thanks and regards
    neel

  • RFC connection to source system ABPCLNT310 is damaged

    Hello,
    I'm importing a transport in BW Prod (condition type data for billing and orders), but receive this error message in STMS:
    The creation of the export DataSource failed
    Reading the Metadata of 0SD_O05 ...
    Creating DataSource 80SD_O05 ...
    RFC connection to source system ABPCLNT310 is damaged ==> no Metadata upload
    RFC connection to source system ABPCLNT310 is damaged ==> no Metadata upload
    Error when creating the export DataSource and dependent Objects
    Error when activating ODS-objekt 0SD_O05
    The creation of the export DataSource failed
    The import worked fine in the BW QA system. Why do I get this error? It is possible to manually generate the export DataSource from the ODS object, but when I try to import again, it fails once more...
    Best regards,
    Fredrik

    hi Fredrik,
    Check whether OSS Notes 597326, 886102 and 184322 help:
    RFC connection to source system Damaged
    Error when transporting ODS Object
    597326 - Activating ODS: Error when creating the export DataSource
    Symptom
    When you activate or transport ODS objects, the following errors appear in the log:
    "The creation of the export DataSource failed". Message number RSBM 035
    "Error &1 in function module &2". Message number RSBM 006
    "Error when creating the export DataSource and dependent Objects". Message number RSDODSO 169
    "Error when activating the ODS object &1". Message number RSDODSO 168
    Other terms
    RSBM 035, RSBM 006, RSDODSO 169, RSDODSO 168
    Reason and Prerequisites
    A characteristic involved is active but it is using new or inactive attributes.
    Characteristics involved are all characteristics used in the ODS object.
    Solution
    Check all characteristics involved as described in the following:
    Check the Compounding and Attributes tabs in the characteristics maintenance.
    All characteristics here must be active. This is displayed by a green icon.
    If this is not the case for a characteristic involved (a gray icon is displayed on one of the two tabs), activate the characteristic involved. This also activates the inactive dependent characteristics.
    After you have checked and repaired all the characteristics involved as described above, the loading/maintenance of master data should work again

  • Tries to connect multiple times at the same time a...

    Tries to connect multiple times at the same time after trying to reinstall the software.

    Hello,
    Assuming that you use a Web version (this information MUST be provided in any post), you have 2 ways to run reports:
    1. the Run_Report_Object built-in
    2. the Web.Show_Document() built-in
    Francois

  • Multiple DataSources of the same type (e.g. Peoplesoft Financials) in DAC

    If I am defining multiple DataSources of the same type (e.g. PeopleSoft Financials) so as to extract data from multiple source ERP systems into a single data warehouse, do I have to define separate Data Source Numbers for each source database?
    I know the pre-seeded OBIA definitions provide different Data Source Numbers for Oracle E-Business Suite, PeopleSoft, etc. But if I have the same ERP application and version in multiple source databases, how do I separate them? Should I simply edit the Data Source Number and assign a new number to a new source? Or do I have to copy some templates?
    Referring to Section 3.1.6 of the Oracle Business Intelligence Applications Configuration Guide (7.9.6.1).
    Hemant K Chitale

    Hemant,
    The purpose of DATASOURCE_NUM_ID is to identify which source a record comes from.
    In the warehouse, the combination of INTEGRATION_ID and DATASOURCE_NUM_ID defines uniqueness.
    In your case the source system and version are the same, so it is not required to have different DATASOURCE_NUM_IDs,
    but if you want one for traceability, then you can define it.
    My advice would be to have one DATASOURCE_NUM_ID as defined for your version and source system.
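    To make the uniqueness rule concrete, here is a minimal sketch (plain Python with invented values, not OBIA/DAC code): the warehouse keeps rows apart by the pair INTEGRATION_ID plus DATASOURCE_NUM_ID, so records with the same business key from two sources only stay distinguishable if they carry different DATASOURCE_NUM_ID values.

warehouse = {}

def load_row(integration_id, datasource_num_id, payload):
    # uniqueness in the warehouse = (INTEGRATION_ID, DATASOURCE_NUM_ID)
    warehouse[(integration_id, datasource_num_id)] = payload

load_row("CUST~1001", 4, {"source": "PeopleSoft instance A"})
load_row("CUST~1001", 5, {"source": "PeopleSoft instance B"})
print(len(warehouse))   # 2 rows, kept apart only because the DATASOURCE_NUM_ID differs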

  • Multiple source system issue...

    Hi,
    We have two source systems for our BW development system, and one source system for BWQ and BWP.
    My query is: we have added two extra fields, 0SOURCESYSTEM and 0FISCVARNT, to the key fields to handle data from the two different systems.
    Does anyone have any other suggestions for handling such a scenario?
    Suggestions will be highly appreciated.
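    For context, this rough sketch (plain Python with made-up records, not BW code) shows the overwrite we are trying to avoid and why 0SOURCESYSTEM went into the key: records with identical key field values overwrite each other in the DSO, so the same document number arriving from both systems would collapse into one record unless the source system is part of the key.

def activate(records, key_fields):
    # DSO-style overwrite: later records with the same key replace earlier ones
    active = {}
    for rec in records:
        active[tuple(rec[f] for f in key_fields)] = rec
    return active

loads = [
    {"0SOURCESYSTEM": "DEV1", "0FISCVARNT": "K4", "DOCNR": "900001", "AMOUNT": 100},
    {"0SOURCESYSTEM": "DEV2", "0FISCVARNT": "K4", "DOCNR": "900001", "AMOUNT": 250},
]

print(len(activate(loads, ["0FISCVARNT", "DOCNR"])))                   # 1 - the second system overwrites the first
print(len(activate(loads, ["0SOURCESYSTEM", "0FISCVARNT", "DOCNR"])))  # 2 - both systems' records survive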

    Hi,
    In case you are loading data from two source systems, you will have two DataSources, each representing one source system; you can then assign these to an InfoSource with flexible update.
    Now you can create two InfoPackages, one for each source system, and load the data to the target.
    The steps may look something like this:
    Replicate DataSource 1
    Replicate DataSource 2
    Create the InfoSource
    Assign DataSource 1 and DataSource 2 to the InfoSource
    Maintain the transfer rules under the InfoSource
    Later, create an InfoPackage for source 1
    Create an InfoPackage for source 2
    Create the target
    Create the update rules
    Map the InfoSource to the update rules
    cheers,
    Swapna.G
