Collecting data from two R3 source systems

Dear All,
Scenario:
We are in the process of implementing SAP BI 7.0 in our organisation.
We have a single BI server (7.01) collecting data from two source
systems from two different divisions within the group.
The first system is that of Company 1, where we have IS-MILL (ECC 6.0).
BI has been implemented and is running successfully in this unit. Now we
have started implementing BI in the second unit, 'Company 2', where the
industry-specific solution IS-AFS (ECC 6.0) is the source system.
We are activating BI Standard Content for AFS. When we try to upload
master data, we face a problem.
The master data (0Division, 0Mat_Grp_3, etc.) is already loaded with
IS-MILL data (data from the first source system). Now, when data is
loaded from IS-AFS (the second source system), wherever the same key
already exists in the InfoObject, the existing data is overwritten by
the freshly loaded data, causing loss of the existing data.
A typical example is as follows:
BW data - source system 1 (before upload of data from source system 2):
0Division
99         EN          Stock Transfer Out
AFS data - source system 2:
0Division
99         EN          STO
BW data - source system 1 (after upload of data from source system 2):
After uploading the master data from AFS, the values are:
0Division
99         EN          STO
The original data is lost.
A similar problem occurs with other master data as well.
Please suggest the right methodology for uploading master data when the
data in BW is sourced from two systems. How can we retain the data in
the same InfoObject while still maintaining a distinction between the
data from the two systems?
We don't find any specific mention of such scenarios in the standard
documentation.
Will such a problem recur when we upload transactional data?
Regards,
Aslam Khan

Hi Aslam,
Please compound your master data object with an attribute such as the source system ID. When loading the master data from the different source systems, you should have two separate data flows filling the same master data object. In each flow you can fill the source system ID with a constant value, e.g. SS1 (source system 1) and SS2 (source system 2). This should solve your issue without your master data being overwritten.
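A constant value can be set directly in the transformation rule for the compounded characteristic, so no coding is strictly required. If you prefer a routine instead, a minimal field-routine body might look like the sketch below (assuming the compound characteristic is 0SOURSYSTEM and 'S1' is a placeholder for the 2-character source system ID maintained in the source system ID assignment):

  " Sketch of a field routine body for the compounded characteristic
  " 0SOURSYSTEM, used in the data flow from source system 1. Only the
  " routine body is shown; the enclosing method is generated by the
  " transformation framework.
  RESULT = 'S1'.   "use 'S2' in the data flow from source system 2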
Thanks
Kishore

Similar Messages

  • Differentiating Master Data from two different source systems

    Friends,
    I have used standard InfoObjects that provide master data for two InfoCubes which take data from two different source systems. Now some of the master data is identical in both source systems (for example, 10 stands for "Industrial" in one, whereas 10 stands for "Agricultural" in the other). What do I do so that the system (BW) differentiates the two?
    Thanks in advance for all the help.
    Mike

    I tried to include 0SOURSYSTEM in the compounding of this InfoObject (for master data), but it gives me a list of other objects that use this InfoObject as a reference characteristic, and it also says that this InfoObject is used in an ODS and that the data needs to be emptied from the ODS before activating this InfoObject. Please let me know if there is any way out of it.
    Thanks
    Mike

  • BI Content collecting objects from wrong/old source system??

    Hi Gurus! Lovely day here on the west coast of Sweden - springtime is coming!
    I have selected an InfoSource for installation in transaction RSOR. I also want to pick up some objects in the data flow before and after that specific InfoSource. I guess the system keeps trying to collect metadata from the different source systems that we have assigned.
    I don't know the complete history of the systems, other than that it's a bloody mess. Right now we have an ECC 6 system connected to BI, but before that we had an older version (I guess 4.x something) connected on a different server.
    Somehow the collection process picks up some data from the ECC 6 system, but sometimes it asks for a login to the old system as well.
    This is my problem. I don't have an issue with logging on to the old system, because I have all the necessary rights to do that. But I don't want BI trying to collect metadata from our old system. So somewhere there's a pointer of some sort to the old system that I need to delete...
    The old system is not visible in the "source system assignment" view, so I can't really find out where to change this setting...
    Any ideas??

    It probably is. Should I still carry out the instructions Andreas gave me for checking this? Or should I just delete everything and do it all over again, just to get the connections straightened out?
    I'm more or less the only one using the system at this point, as it is a kind of sandbox environment, so extreme measures are not a problem.

  • Error in Previewing data from a UD Source System

    Hi,
    I have created a data source from a UD Source System; however, when I preview data, I am prompted by this message:
    "Inbound processing of data package 000001 finished"
    There is no error, but there is also no data preview that is displayed.
    Can you help me with my problem?
    I look forward to your reply experts!
    Regards,
    Ramon

    Hi,
    sampling is done from a directory placed on the CLIENT. At runtime the directory must be placed on the SERVER.
    So you must have two locations: one for sampling (_local) and one for runtime (_remote), and before deployment you must switch from _local to _remote.
    The _remote location points to a directory on the SERVER; don't forget to grant read/write privileges on the file system to the oracle user.
    Regards,
    Detlef

  • Master data from Multiple SAP source systems

    Hi Gurus,
    In my present project we have an ECC 6 system and an R/3 system from which we need to extract master data.
    Example: vendor - the same vendor exists as different entities in the two systems.
    I am aware of the data inconsistency, as the data would be overwritten.
    We don't want to use MDM, compounding or the 'transfer duplicate records' option.
    MDM - not in use
    Compounding - expensive option
    Transfer duplicate records - not reliable
    Merging R/3 into ECC 6 is also under consideration, but that would only happen in late 2010.
    I am sure some of you have been through this scenario; your help is appreciated and will be rewarded.
    Thank you,
    Ram.

    We went with the compounding option.
    Can you justify 'expensive'?
    We compounded 0LOGSYS; in fact you can use 0SOURSYSTEM,
    so that 0SOURSYSTEM carries the source system details, i.e. your ECC 6 or R/3 system.
    or
    A crude solution would be to add a character to the beginning of your master data key, e.g. prefix values from ECC 6 with 'E' (a minimal sketch follows below).
    or
    Have you heard of the consolidated InfoObjects concept? You can go with that too.
    Hope this helps.
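    If you go the 'crude' prefix route instead of compounding, the derivation would typically sit in a field routine of the transformation. A minimal sketch of such a routine body, assuming a 7.x transformation where the source key is exposed as SOURCE_FIELDS-vendor (the field name is illustrative only):

        " Hypothetical routine body: prefix vendor numbers coming from
        " ECC 6 with 'E' so they cannot collide with the same numbers
        " coming from R/3. The target InfoObject must be long enough to
        " hold the extra character.
        CONCATENATE 'E' source_fields-vendor INTO result.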
    Edited by: Praveen G on Oct 1, 2008 9:04 AM

  • To Load Master Data From Two Source Systems

    Hi All,
    I have a small question :
    - Can we load master data from two different source systems, say from a flat file and R/3, or from any two different or similar source systems?
       If the answer is "yes", then how? If possible, step by step.
    Appreciate your valuable points.
    Thanks,
    Niraj Sharma

    Hi,
    I still have a problem. The R/3 transformation and DTP get activated, but when I execute the DTP for the flat file,
    I get the error below:
    Object DTP DTP_d55....... could not be found in version A.
    I have checked 'Master data locally for source system' in the Compounding tab of the key field.
    Please help.
    Thanks,
    Niraj

  • Master Data from two source systems

    Hi Gurus,
    I need to load master data from two different source systems. What is the best way I could do that?
    I know one approach is to add the system ID (0LOGSYS) in the Compounding tab and load it. But the problem is that the master data table will then have two different records, and the report will display 2 records that cannot be summarized. I need one record in the report. What is the best approach?
    Thanks
    Liza

    Hi,
    Create two DataSources, one for each source system.
    Create two separate flows from these two DataSources to the master data.
    Hope this helps!

  • Master Data cleansing and transformation from non-SAP source systems

    Hi all,
    Our client (media industry) wants to cleanse and transform master data from a non-SAP source system before it is uploaded into BW (no R/3 yet). If anybody has a document on this topic that I could use, I would appreciate it if you sent it to me.
    thanks.

    Hi,
    https://websmp203.sap-ag.de/~sapidb/011000358700001965262003
    https://websmp203.sap-ag.de/~sapidb/011000358700006591612001
    https://websmp203.sap-ag.de/~sapidb/011000358700001971392004
    https://websmp203.sap-ag.de/~form/sapnet?_SHORTKEY=01100035870000471477&_OBJECT=011000358700008927932002E
    /manfred

  • Master data from a combination of multiple source systems

    Hi All,
    We have a situation wherein we have to load master data from 2 different source systems for the same business areas (e.g. SD, FI, etc.), most probably with the same values. Since master data is overwritten, only the value from the last load will be available.
    Please help me out with the precautions / design changes needed in the existing system.
    Regards,
    N P Reddy

    Hi NPR,
    For this the recommended scenario is using Source System Compounding for the InfoObjects. See here for full info:
    http://help.sap.com/saphelp_nw04/helpdata/en/80/1a6399e07211d2acb80000e829fbfe/content.htm
    Hope this helps...

  • Cross-reference data from 2 different ECC systems

    Hi Sdners,
    I am working on a scenario where I have to get data from two different ECC systems, consolidate it, and send it back to the respective systems.
    But some reference data differs between the two systems, and when I merge data from the 2 systems I have to keep one of the reference values. The problem comes when I syndicate the data back to ECC: it cannot accept a new reference value.
    Please suggest how to proceed in such a case.
    It's urgent.
    Points will be rewarded for genuine answers.
    Thanks in advance,
    Regards,
    Neethu.

    Hi,
    First, enable the key mapping property (set it to Yes) for the table you want to import into and syndicate from.
    Create two remote systems of type inbound/outbound.
    Import the data from the first remote system and map the corresponding fields. Don't forget to map the remote key field on the destination side: make a clone of one of the display fields and map it to the remote key field.
    After importing, you can see which remote system each record was imported from using the Edit Key Mappings option in the Data Manager; it shows the remote system name and the corresponding remote key.
    Do the same for the second remote system.
    After merging the data in the Data Manager, you can see the merged record with both remote system names and both remote keys via Edit Key Mappings, so the merged record goes back to both remote systems when you syndicate the records.
    Syndicate the data for the first remote system by selecting the destination properties and setting the output remote system property under the Map Properties tab to your first remote system.
    Map the corresponding fields and don't forget to map the value field under Remote Key. MDM then generates remote keys only for records belonging to your first remote system; you can see this in the destination preview. It does not generate remote keys for the second remote system. Then check the option Suppress Records Without Key under the Map Properties tab and execute the syndication. Finally you see only the correct records.
    Do the same for the second remote system.
    Hope it helps
    Cheers
    Narendra

  • RICS0001:Internal Error,unable to process the collected data from the device.

    Hi all,
    I've got the following error in Inventory Collection: 'RICS0001:Internal Error,unable to process the collected data from the device.'
    System is CW LMS 2.6
    If I search the web I get the following Cisco document:
    http://www.cisco.com/en/US/products/sw/cscowork/ps2073/prod_troubleshooting_guide09186a008036dff2.html
    '...in the log directory look for IC_Server.log.'
    IC_Server.log:
    [ Di Nov 16  15:54:27 CET 2010 ],INFO ,[Thread-25],com.cisco.nm.rmeng.inventory.ics.core.ICSCore,173,Got Async Request, User Name :admin
    [ Di Nov 16  15:54:27 CET 2010 ],INFO ,[Thread-25],com.cisco.nm.rmeng.inventory.ics.core.ICSCore,179,Request ID is : 1289919235488
    [ Di Nov 16  15:54:27 CET 2010 ],INFO ,[Thread-14],com.cisco.nm.rmeng.inventory.ics.core.CollectionController,309,Started processing device ID: 3341
    [ Di Nov 16  15:54:27 CET 2010 ],INFO ,[Thread-14],com.cisco.nm.rmeng.util.logger.RMELogger,724,com.cisco.nm.rmeng.util.db.DatabaseConnectionPool,getConnection,59,Inside ICSDatabaseConnection, MAX_COUNT =20
    [ Di Nov 16  15:54:28 CET 2010 ],INFO ,[Thread-14],com.cisco.nm.rmeng.inventory.ics.core.CollectionController,387,Started processing device Name: 9.152.255.101
    [ Di Nov 16  15:54:28 CET 2010 ],INFO ,[Thread-14],com.cisco.nm.rmeng.util.logger.XDILogger,77,com.cisco.nm.xms.xdi.pkgs.SharedInventoryCatIOS.ContainmentAGI_ENTITY_Mib_Non_Modular,g$eval,103,com.cisco.nm.xms.xdi.pkgs.SharedInventoryRouter:ContainmentAGI_ENTITY_Mib_Non_Modular:g$eval:populating ContainmentAG attributes, begins...
    [ Di Nov 16  15:54:28 CET 2010 ],INFO ,[Thread-14],com.cisco.nm.rmeng.util.logger.XDILogger,77,com.cisco.nm.xms.xdi.pkgs.SharedInventoryCatIOS.ContainmentAGI_ENTITY_Mib_Non_Modular,populatingTheChassis,110,com.cisco.nm.xms.xdi.pkgs.SharedInventoryRouter:ContainmentAGI_ENTITY_Mib_Non_Modular:populatingTheChassis:ContainmentAG attributes,collection from the device begins...
    [ Di Nov 16  15:54:29 CET 2010 ],INFO ,[Thread-14],com.cisco.nm.rmeng.util.logger.XDILogger,77,com.cisco.nm.xms.xdi.pkgs.SharedInventoryCatIOS.ContainmentAGI_ENTITY_Mib_Non_Modular,populatingTheChassis,147,com.cisco.nm.xms.xdi.pkgs.SharedInventoryRouter:ContainmentAGI_ENTITY_Mib_Non_Modular:populatingTheChassis:ContainmentAG attributes,collection from the device successful...
    [ Di Nov 16  15:54:29 CET 2010 ],INFO ,[Thread-14],com.cisco.nm.rmeng.util.logger.XDILogger,77,com.cisco.nm.xms.xdi.pkgs.SharedInventoryCatIOS.ContainmentAGI_ENTITY_Mib_Non_Modular,populatingTheChassis,149,com.cisco.nm.xms.xdi.pkgs.SharedInventoryRouter:ContainmentAGI_ENTITY_Mib_Non_Modular:populatingTheChassis:ContainmentAG attributes,population begins...
    [ Di Nov 16  15:54:29 CET 2010 ],INFO ,[Thread-14],com.cisco.nm.rmeng.util.logger.XDILogger,77,com.cisco.nm.xms.xdi.pkgs.SharedInventoryCatIOS.ContainmentAGI_ENTITY_Mib_Non_Modular,populatingTheChassis,216,com.cisco.nm.xms.xdi.pkgs.SharedInventoryRouter:ContainmentAGI_ENTITY_Mib_Non_Modular:populatingTheChassis:Before method getSlotsConfiguredStatistics
    [ Di Nov 16  15:54:29 CET 2010 ],INFO ,[Thread-14],com.cisco.nm.rmeng.util.logger.XDILogger,77,com.cisco.nm.xms.xdi.pkgs.SharedInventoryCatIOS.ContainmentAGI_ENTITY_Mib_Non_Modular,populatingTheChassis,225,com.cisco.nm.xms.xdi.pkgs.SharedInventoryRouter:ContainmentAGI_ENTITY_Mib_Non_Modular:populatingTheChassis:After method getSlotsConfiguredStatistics
    [ Di Nov 16  15:54:29 CET 2010 ],INFO ,[Thread-14],com.cisco.nm.rmeng.util.logger.XDILogger,77,com.cisco.nm.xms.xdi.pkgs.SharedInventoryCatIOS.ContainmentAGI_ENTITY_Mib_Non_Modular,populatingTheChassis,245,com.cisco.nm.xms.xdi.pkgs.SharedInventoryRouter:ContainmentAGI_ENTITY_Mib_Non_Modular:populatingTheChassis:ContainmentAG attributes,population completed...
    [ Di Nov 16  15:54:29 CET 2010 ],INFO ,[Thread-14],com.cisco.nm.rmeng.util.logger.XDILogger,77,com.cisco.nm.xms.xdi.pkgs.SharedInventoryCatIOS.ContainmentAGI_ENTITY_Mib_Non_Modular,g$eval,105,com.cisco.nm.xms.xdi.pkgs.SharedInventoryRouter:ContainmentAGI_ENTITY_Mib_Non_Modular:g$eval:populating ContainmentAG attributes, ends...
    [ Di Nov 16  15:54:43 CET 2010 ],INFO ,[Thread-14],com.cisco.nm.rmeng.inventory.ics.core.CollectionController,477,DP time is 15 seconds for 9.152.255.101
    [ Di Nov 16  15:54:44 CET 2010 ],ERROR,[Thread-14],com.cisco.nm.rmeng.util.logger.RMELogger,770,com.cisco.nm.rmeng.inventory.ics.invchange.AddInvChange,effect,33,Unexpected error :com.sybase.jdbc2.jdbc.SybSQLException: ASA Fehler -193: Primärschlüssel für Tabelle 'PhysicalElement' ist nicht eindeutig
        at com.sybase.jdbc2.tds.Tds.processEed(Tds.java:2834)
        at com.sybase.jdbc2.tds.Tds.nextResult(Tds.java:2156)
        at com.sybase.jdbc2.jdbc.ResultGetter.nextResult(ResultGetter.java:69)
        at com.sybase.jdbc2.jdbc.SybStatement.nextResult(SybStatement.java:220)
        at com.sybase.jdbc2.jdbc.SybStatement.nextResult(SybStatement.java:203)
        at com.sybase.jdbc2.jdbc.SybStatement.executeLoop(SybStatement.java:1766)
        at com.sybase.jdbc2.jdbc.SybStatement.execute(SybStatement.java:1758)
        at com.sybase.jdbc2.jdbc.SybPreparedStatement.execute(SybPreparedStatement.java:619)
        at com.cisco.nm.rmeng.inventory.ics.dbrep.DBRecord.insert(DBRecord.java:50)
        at com.cisco.nm.rmeng.inventory.ics.util.ICSDatabaseConnection.insert(ICSDatabaseConnection.java:91)
        at com.cisco.nm.rmeng.inventory.ics.invchange.AddInvChange.effect(AddInvChange.java:29)
        at com.cisco.nm.rmeng.inventory.ics.server.InvDataProcessor.processInvData(InvDataProcessor.java:394)
        at com.cisco.nm.rmeng.inventory.ics.core.CollectionController.run(CollectionController.java:849)
        at java.lang.Thread.run(Thread.java:534)
    [ Di Nov 16  15:54:44 CET 2010 ],ERROR,[Thread-14],com.cisco.nm.rmeng.inventory.ics.server.InvDataProcessor,448,ASA Fehler -193: Primärschlüssel für Tabelle 'PhysicalElement' ist nicht eindeutig
    com.sybase.jdbc2.jdbc.SybSQLException: ASA Fehler -193: Primärschlüssel für Tabelle 'PhysicalElement' ist nicht eindeutig
        at com.sybase.jdbc2.tds.Tds.processEed(Tds.java:2834)
        at com.sybase.jdbc2.tds.Tds.nextResult(Tds.java:2156)
        at com.sybase.jdbc2.jdbc.ResultGetter.nextResult(ResultGetter.java:69)
        at com.sybase.jdbc2.jdbc.SybStatement.nextResult(SybStatement.java:220)
        at com.sybase.jdbc2.jdbc.SybStatement.nextResult(SybStatement.java:203)
        at com.sybase.jdbc2.jdbc.SybStatement.executeLoop(SybStatement.java:1766)
        at com.sybase.jdbc2.jdbc.SybStatement.execute(SybStatement.java:1758)
        at com.sybase.jdbc2.jdbc.SybPreparedStatement.execute(SybPreparedStatement.java:619)
        at com.cisco.nm.rmeng.inventory.ics.dbrep.DBRecord.insert(DBRecord.java:50)
        at com.cisco.nm.rmeng.inventory.ics.util.ICSDatabaseConnection.insert(ICSDatabaseConnection.java:91)
        at com.cisco.nm.rmeng.inventory.ics.invchange.AddInvChange.effect(AddInvChange.java:29)
        at com.cisco.nm.rmeng.inventory.ics.server.InvDataProcessor.processInvData(InvDataProcessor.java:394)
        at com.cisco.nm.rmeng.inventory.ics.core.CollectionController.run(CollectionController.java:849)
        at java.lang.Thread.run(Thread.java:534)
    [ Di Nov 16  15:54:44 CET 2010 ],ERROR,[Thread-14],com.cisco.nm.rmeng.inventory.ics.core.CollectionController,861, Exception occured in process method while processing: 9.152.255.101 ASA Fehler -193: Primärschlüssel für Tabelle 'PhysicalElement' ist nicht eindeutig
    ICSException :: ASA Fehler -193: Primärschlüssel für Tabelle 'PhysicalElement' ist nicht eindeutig
        at com.cisco.nm.rmeng.inventory.ics.server.InvDataProcessor.processInvData(InvDataProcessor.java:463)
        at com.cisco.nm.rmeng.inventory.ics.core.CollectionController.run(CollectionController.java:849)
        at java.lang.Thread.run(Thread.java:534)
    I don't know where the German error messages come from; the whole system is English. The translation is: "ASA error -193: primary key for table 'PhysicalElement' is not unique".
    Thanks for your help!
    Alex

    This is a well-known issue in LMS 2.6. Most likely you are hitting CSCsm97530. As a temporary solution you could remove the problematic device from CS and re-add it. To get a permanent fix you need a patch (provided by TAC).

  • Acquiring streaming data from two sources

    I'm trying to acquire data from two devices at the same time. I have written two sub VIs, where each one takes the data from one piece of equipment. The equipment is such that both are constantly outputting data. I have been successful in running both of the sub VIs separately at the same time. The trouble occurs when I try to put the sub VIs together in a larger VI. When the two sub VIs are part of a larger VI, both cannot run at the same time. One of the sub VIs tries to read from the serial port and is unable to get anything in response. Is there something I am missing as to why they cannot run at the same time?

    Hello,
    It is possible that you are seeing the consequences of how LabVIEW compiles code written in parallel. More specifically, if you have code in parallel (not connected by dataflow, but on the same block diagram), LabVIEW will split execution time between those parts of your code. Previously you were likely starting two separate programs manually, which inherently adds a delay between the starts of the programs, allowing the first program to get sufficiently far in its execution; we could be seeing the consequence of this. It would help if you could be more specific about the details of your setup and code (such as: 1. which instruments are connected to which ports? 2. are you writing a command to your instruments and then receiving data as a response? 3. do you receive any errors? 4. if you do receive errors, which errors do you see and where in your code do you first see them?).
    Repost with some more information (perhaps a screen shot or your code) and we can get a more definitive answer!
    Thank you,
    Regards,
    JLS
    Applications Engineer
    National Instruments
    Best,
    JLS
    Sixclear

  • How to connect to and extract data from an MS Access source (database) system

    Hi experts,
    I have to extract data from an MS Access database system using the JDBC adapter. Will it work? If yes, how?

    Hi Sushma,
    This is how to configure the sender JDBC adapter:
    Select adapter type JDBC.
    Transport Protocol: JDBC 2.0 (example)
    Message Protocol: JDBC
    Adapter Engine: Integration Server
    Processing parameters:
    Quality of Service: Exactly Once (example)
    Poll Interval: 10 (example)
    Query SQL Statement: select * from XXXXX (example)
    Document Name: .....
    Update SQL Statement: .....
    Thanks,
    Satya
    Reward points if it is useful...

  • Unable to extract data from an AS/400 system.

    Hello experts.
    We are trying to extract data from an AS/400 system but have not had any success so far.
    These are the steps we have followed until now:
    1.- Create a DB Connect connection between both systems
    2.- Create a source system for the AS/400 in the workbench under the DB Connect directory
    3.- Generate DataSources from the tables specified in the schema of the connection
    break point -
    At this point, we had a problem with some tables having at least one field name containing the character "Ñ".
    After asking SAP for possible solutions, they told us this is not supported, as the system can't have any object with the character "Ñ", so the transfer structure could not be activated with these fields in the DataSource.
    --- end of break point --
    4.- After those issues, we decided to implement, in another schema, views on those tables whose field names contained the character "Ñ", changing it to an "N".
    5.- We created another source system with that schema, and a user that can see that schema.
    6.- To be able to see those views in transaction RSDBC, we had to deactivate the two checkboxes in the first window (Choose tables and Choose views).
    7.- Right after that, we could generate the DataSources correctly from these logical tables.
    8.- We have designed the whole data flow for these DataSources and everything went right.
    9.- But when we try to execute the InfoPackage to extract data from those logical tables, we do not get any records. Actually, the load remains yellow after the job has finished.
    Please, I would appreciate any help you could give us on this problem.
    Thank you very much
    Regards
    Joaquin

    I'd like to add something to this thread, and maybe clarify the question a little.
    The only way the BW system recognizes those logical tables through transaction RSDBC is by unchecking the two boxes in this transaction, "Select Tables" and "Select Views".
    I don't know how these logical tables were created, but does this mean that they are neither tables nor views as BW understands them?
    Please, if someone knows anything about this, answer this thread.
    Thank you very much.
    Joaquin Sobrido

  • One SAP BI 7 system pulling data from 2 ECC 6 systems

    Hello experts,
    has anyone gone through the scenario of extracting data from 2 SAP ECC 6 systems into one BI 7 system? If so, what were the challenges in terms of configuration, performance and security? I appreciate your feedback.
    Thanks,
    Prasanthi Bellam

    Configuration, performance and security would not be an issue; the only issue was cleansing the data from the two systems.
    We actually went with compounding all the objects with a source system tag, which involved heavy manual activity. But before going that route, you can check whether there is master data overlap between the two systems; if there is none, you can do without compounding.
