XI for managing data conversions

My customer is considering using XI for managing their data conversions.  This is a very large implementation with many source data systems, and we will also have a very large volume of data to migrate.  For example, we'll have around 8 million materials.  For these materials, we're thinking we can extract the data, map it to an IDoc structure, and pass it into SAP via XI.  First, has anyone used XI for data conversions?  Second, have you dealt with this volume of transactions?  Can XI handle it?  Any thoughts or comments would be greatly appreciated.
Regards,
Chris

Hi Chris,
SAP XI is used for exactly those purposes.  Of course, as volume increases, your customer will have to increase memory and hardware.  They also have to consider whether the messages are synchronous; in a migration scenario they will most likely be asynchronous.  SAP has a customer who, before going live, performed massive load tests simulating an average "real" working day of 1,200,000 (one million two hundred thousand) transactions, and it ran smoothly and fast.  Your customer will most likely have to break up the 8 million materials into daily batches for now (the sketch below this reply illustrates the packet idea).
Enjoy!
John Ta
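
To illustrate the daily-batch idea on the ABAP side, here is a minimal sketch only, assuming the legacy material extract has been staged in an ABAP-accessible table and that an ALE partner profile for MATMAS already exists. The staging table ZMAT_STAGE, the packet size and the partner values are hypothetical placeholders, not part of this thread; it simply shows material records being packed into MATMAS IDocs in fixed-size packets instead of one giant load.

  REPORT zmat_idoc_batch.

  CONSTANTS gc_packet_size TYPE i VALUE 5000.        " materials per IDoc (placeholder)

  DATA: lt_stage   TYPE STANDARD TABLE OF zmat_stage, " hypothetical staging table
        ls_stage   TYPE zmat_stage,
        ls_control TYPE edidc,
        lt_comm    TYPE STANDARD TABLE OF edidc,
        lt_data    TYPE STANDARD TABLE OF edidd,
        ls_data    TYPE edidd,
        ls_maram   TYPE e1maram,
        lv_count   TYPE i.

  START-OF-SELECTION.
    SELECT * FROM zmat_stage INTO TABLE lt_stage.

    LOOP AT lt_stage INTO ls_stage.
      CLEAR: ls_data, ls_maram.
      ls_maram-matnr = ls_stage-matnr.               " map the remaining fields as needed
      ls_data-segnam = 'E1MARAM'.
      ls_data-sdata  = ls_maram.
      APPEND ls_data TO lt_data.
      ADD 1 TO lv_count.

      IF lv_count >= gc_packet_size.
        PERFORM send_packet.                         " one MATMAS IDoc per packet
      ENDIF.
    ENDLOOP.

    IF lv_count > 0.
      PERFORM send_packet.                           " flush the last, partially filled packet
    ENDIF.

  FORM send_packet.
    CLEAR ls_control.
    ls_control-mestyp = 'MATMAS'.
    ls_control-idoctp = 'MATMAS05'.
    ls_control-rcvprt = 'LS'.
    ls_control-rcvprn = 'TARGETSYS'.                 " placeholder logical system

    CALL FUNCTION 'MASTER_IDOC_DISTRIBUTE'
      EXPORTING
        master_idoc_control            = ls_control
      TABLES
        communication_idoc_control     = lt_comm
        master_idoc_data               = lt_data
      EXCEPTIONS
        error_in_idoc_control          = 1
        error_writing_idoc_status      = 2
        error_in_idoc_data             = 3
        sending_logical_system_unknown = 4
        OTHERS                         = 5.
    IF sy-subrc <> 0.
      " log the error for this packet (omitted in this sketch)
    ENDIF.

    COMMIT WORK.                                     " dispatches the IDoc packet
    CLEAR: lt_data, lt_comm, lv_count.
  ENDFORM.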

Similar Messages

  • XI for new data conversion?

    Hello experts,
    Your recommendations, please, on data conversion with XI versus using the LSMW tool?
    We are working on a new implementation and are planning to use XI for the interfaces anyway. Now the idea is to use XI for all data conversion from the legacy systems.
    Good idea, not suitable, better to use LSMW?
    Your thoughts please!
    Many thanks,
    Albert

    Hi again.
    There is no general answer to this question, because it depends on several aspects. I'll try to order them by importance.
    1 - What kind of files do you have for material and BOM creation? XML files?
    XI is a powerful tool for transferring that kind of file. With an ABAP proxy you can easily decompose an XML file into an ABAP structure, so importing files is easy. Sending is also easy: you fill a structure and XI converts it into an XML file and sends it to the respective receiver through its communication channel (a small sketch of the receiving proxy follows this reply). So if the files are XML, I would say YES, use XI. If not, XI can only copy the flat files, and that way it's not easy to get their data into a proxy. You can copy the files to a directory and import them with a program, but if so, why use XI for this purpose at all? Or you can build mappings to convert the flat files into XML files; the mapping can be easy or very difficult.
    2 - Experience in XI or LSMW? And urgency?
    This question is important. If you have experience in one of these tools, you should use it. As far as I can tell from your post, XI is a new tool for you, right? Mastering XI is a process that will take you some time ... not too much ... but consider this in your decision too. However, since you will start using XI for the interfaces anyway, this could be a good start!
    3 - Many other reasons can be found in the link that Murthy provided, but these two should be the most important (in my opinion).
    Hope this helps.
    Regards,
    Valter Oliveira.
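
    To make the proxy point a bit more concrete, here is a minimal sketch of a server (inbound) proxy implementation. The interface, its EXECUTE_ASYNCHRONOUS method signature and the INPUT structure are generated by transaction SPROXY from the XI message interface, so every name below (ZII_MATERIAL_IN, the item fields, the staging table ZMAT_STAGE) is a hypothetical placeholder rather than something defined in this thread.

      METHOD zii_material_in~execute_asynchronous.
        " XI has already deserialized the XML payload into the typed INPUT
        " parameter, so the data can be handled like any other ABAP structure.
        DATA: ls_item TYPE zmaterial_item,     " generated line type (placeholder)
              ls_rec  TYPE zmat_stage.         " custom staging table (placeholder)

        LOOP AT input-material_list-item INTO ls_item.
          CLEAR ls_rec.
          ls_rec-matnr = ls_item-material_number.
          ls_rec-maktx = ls_item-description.
          INSERT zmat_stage FROM ls_rec.       " stage for later BAPI/IDoc posting
        ENDLOOP.
      ENDMETHOD.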

  • What is the best app for managing data usage on an iPhone 4S?

    Which app are most of you using to manage your data usage on an iPhone 4S?

    Assuming that DonXX means 'monitor' rather than 'manage', my experience may be of relevance.
    I have a data usage problem with my iPhone 4. It started for me when I 'upgraded' to iOS 6. This may or may not be a coincidence. I went from an average monthly data usage of 150MB/month (a 2-year average) to 8.1GB of up/download in 28 days. To try and track down which app is The Ravenous Bugblatter Beast of Traal, I've just installed the Onavo Count app, which claims to monitor the data usage of each app. My sights at the moment are on Skype. I only use iCloud for calendar syncing.

  • Strategy for managing data over multiple drives

    I have been looking at extending my hard drives and considering the very same options as The Hatter suggested in recent posts - i.e. Raptor vs Caviar SE16 2 x 750 vs Caviar RAID 2 x 750.
    I received a deal on the initial HDD set with my Mac Pro, so I currently have 2 x 250GB HDDs, and I have just started to move my iTunes and iPhoto files plus other media files (incl. Photoshop data, movies, documents, etc.) onto the separate drive, to see what performance benefits I will get. This is hopefully a prelude to a more ruthless split of files along the lines of a formal strategy - hence my question on what strategy I should have.
    My dilemma is trying to find a clear explanation of where to start looking for the practical way to actually set up OS X and the apps on one boot disc/partition and the media/documents on another drive(s), and what to do with the other stuff that doesn't actually fit into either category - user/home/library/application support folders, presets and other application support files.
    I have read posts with people saying not to move user and application support data/library files from the boot drive - in which case there is a lot of file data still likely to reside on the boot drive even after removing documents and all media files?
    I am paranoid about not having a clear idea of the right strategy before starting the whole process.
    It's more a strategy question than a hardware one, but I have not really been able to get this answered from the posts that I have searched.
    cheers
    graham

    I personally wouldn't bother with Raptors. They are disproportionately expensive and do not perform any better than much larger drives in the 750GB/1TB sizes. With these large drives I cannot see any compelling reason to go with a Raptor. Compared to 500GB and smaller drives, sure… but not compared with larger drives.
    So presuming you're going with 2 x Western Digital RE2 750GB drives, you then need to decide if you're using RAID or not. If you are, then you'll have a 1.5TB volume to use, which requires no further effort.

  • Data conversion strategy for new SOB

    Dear Viewers
    on 11.5.10
    We are creating a new SOB with a change in currency from Feb-11, as this is the requirement.
    For the same, we need to do data conversion.
    I have some confusion about purchase orders and sales orders.
    Purchase Orders:
    Open purchase orders will be converted, meaning the unfulfilled POs, i.e. the ones not received and fully open.
    For the POs which have been received but not delivered, we have requested the users to clear the in-transit receipts.
    For the POs which are partially received, what should be done?
    If a PO is fully received and delivered, it will not be converted to the new SOB as it is not an open PO,
    but if the invoice comes after Feb-11, how will the matching be done?
    What if a return has to be made going forward in Feb-11 under the new SOB?
    Sales Orders:
    Open sales orders will be converted, that is, the ones that have been entered and not yet booked.
    Users have been requested to clear off the sales order lines which are already pick-confirmed but not yet shipped, hence they will be shipped and interfaced to AR.
    For the sales orders that have been booked, those lines that are not yet processed further will also be converted.
    Now, if a receipt comes after Feb-11, how do we handle this, as the sales order would not have been converted?
    Please give your advice on the data migration strategy for POs and SOs.
    Please add any point that may have been missed by me.
    Appreciate your help
    Thanks
    Emm

    Hi David,
    For master data conversion you can use LSMW and the RE-FX BAPIs (please refer to SAP Note [782947|https://service.sap.com/sap/support/notes/782947]).
    Regards, Franz

  • GL Legacy Data Conversion

    I have a question about the data conversion strategy.
    We are planning to store 2 years of detailed transactions and 4 years of balances in the Oracle system. For the data conversion process, we were going to take the same approach. It would, however, cause schedule and workload constraints. Actually, we have not yet discussed this from a realistic viewpoint. In other words, we do not insist on storing all of the data (the 2 years of transactions and 4 years of balances).
    Although I believe that we will be able to reduce the volume of the data to be converted from the old chart of accounts to the new one, I am a little bit concerned about a couple of things:
    1. The workload for the reporting processes
    Assuming that we will not use the Oracle standard reports very much, it would not be a big issue even if we do not convert all of the data I mentioned above. We will store historical report data somewhere and be able to generate certain reports using both Oracle data and the historical data in the repository. If not, it could cause us extra effort.
    2. Audit trails / examination trails
    If we give up converting the data to fit the new system, and that fiscal year has not been examined yet, how should we handle the non-converted data for the examination? I am just wondering whether we only have to prepare a conversion table that ties Oracle balances back to the ABC detailed transactions for their reference.
    If anyone is aware of anything you can advise me on, could you please provide some information or guidance? Thanks to all.

    Hi,
    When you talk about GL data, it is the trial balance to be loaded from legacy into Oracle Applications.
    You can use Web ADI to upload it. The checklist is:
    The balance for each account combination in the legacy system has to be mapped to an Oracle GL code combination balance.
    With respect to open AR and AP invoices: if the invoices are converted using a control account for migration in GL, then the balances transferred from AP and AR need not be reversed in GL from the source receivables and payables.
    If instead the same account combinations are used for migrating the balances from AP and AR to GL, then the balances transferred from AP and AR should be reversed, so that they do not affect the GL trial balance.
    Finally, ensure that the trial balance tallies with your legacy system, then upload it using Web ADI, import, review and post it.
    Hope these points help with your GL data conversion.
    Regards,
    Ramaa

  • Date Conversion Function Modules YYYYMMDD to MM/DD/YYYY

    Hi,
    I have a requirement to convert dates from the format YYYYMMDD into the format MM/DD/YYYY. Can you please suggest a suitable standard function module, if one exists? If not, please suggest a solution for this data conversion.
    Thanks,
    Rajan.SA

    Hi,
    You can use WRITE formatting.
    Here is some sample code. Note that the source must be a date field of type D (SY-DATUM already holds the date as YYYYMMDD internally, so you can first move your YYYYMMDD value into such a field), and the target must be a character field of at least 10 characters, which is why ENDDA is declared as TEXT10 here rather than as SY-DATUM:
    DATA: BEGDA TYPE TEXT10,
          ENDDA TYPE TEXT10.
      WRITE SY-DATUM TO BEGDA MM/DD/YYYY.
      WRITE SY-DATUM TO ENDDA MM/DD/YYYY.
    And of course you can use other formats, like:
    ... DD/MM/YY
    ... MM/DD/YY
    ... DD/MM/YYYY
    ... MM/DD/YYYY
    ... DDMMYY
    ... MMDDYY
    ... YYMMDD
    Or you can use a CONCATENATE statement:
        CONCATENATE SY-DATUM+4(2) SY-DATUM+6(2)
        SY-DATUM(4) INTO BEGDA SEPARATED BY '/'.
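    If you specifically want a standard function module, CONVERT_DATE_TO_EXTERNAL is one option, but be aware that it formats the date according to the user's date format setting in the user defaults, so it only produces MM/DD/YYYY when that is the configured format. A minimal sketch (the literal date is just an example):
      DATA: LV_DATE TYPE SY-DATUM VALUE '20090930',
            LV_EXT  TYPE TEXT10.

      CALL FUNCTION 'CONVERT_DATE_TO_EXTERNAL'
        EXPORTING
          DATE_INTERNAL            = LV_DATE
        IMPORTING
          DATE_EXTERNAL            = LV_EXT
        EXCEPTIONS
          DATE_INTERNAL_IS_INVALID = 1
          OTHERS                   = 2.
      " LV_EXT now holds the date in the user's display format,
      " e.g. 09/30/2009 if the user default is MM/DD/YYYY.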
    Regards,

  • VA41 data Conversion

    Hi All,
    Just a quick question: what's the best option for VA41 data conversion (single/multiple line items)? A custom BDC / a custom program with a BAPI / standard IDoc / LSMW / ...?
    Thanks in advance.
    Regards,
    Tim

    hi Tim,
    you will need to either write a program to upload the data into an internal table, or use LSMW to handle all of that. Either way, you should use BAPI_CONTRACT_CREATEFROMDATA (a rough calling sketch follows below this reply).
    Regards,
    Naveen
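
    The following is only a rough sketch of what the BAPI call could look like. The parameter and structure names follow the pattern of the sales order BAPIs and may differ slightly by release, so verify them in SE37/BAPI Explorer; the contract type, organisational data, material and partner values are placeholders.
      DATA: ls_header   TYPE bapisdhd1,
            ls_item     TYPE bapisditm,
            lt_items    TYPE STANDARD TABLE OF bapisditm,
            ls_partner  TYPE bapiparnr,
            lt_partners TYPE STANDARD TABLE OF bapiparnr,
            lt_return   TYPE STANDARD TABLE OF bapiret2,
            ls_return   TYPE bapiret2,
            lv_vbeln    TYPE bapivbeln-vbeln.

      ls_header-doc_type   = 'WK1'.          " contract type (placeholder)
      ls_header-sales_org  = '1000'.
      ls_header-distr_chan = '10'.
      ls_header-division   = '00'.

      ls_item-itm_number = '000010'.
      ls_item-material   = 'MAT-001'.
      ls_item-target_qty = 10.
      APPEND ls_item TO lt_items.

      ls_partner-partn_role = 'AG'.          " sold-to party
      ls_partner-partn_numb = '0000100001'.
      APPEND ls_partner TO lt_partners.

      CALL FUNCTION 'BAPI_CONTRACT_CREATEFROMDATA'
        EXPORTING
          contract_header_in = ls_header
        IMPORTING
          salesdocument      = lv_vbeln
        TABLES
          return             = lt_return
          contract_items_in  = lt_items
          contract_partners  = lt_partners.

      " commit only if no error message came back
      READ TABLE lt_return INTO ls_return WITH KEY type = 'E'.
      IF sy-subrc = 0.
        CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
      ELSE.
        CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
          EXPORTING
            wait = 'X'.
      ENDIF.
    For a mass conversion you would wrap this in a loop over the legacy file and collect the RETURN messages per record as an error log.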

  • Analog data to digital data conversion in labwindows

    Hi
    Is there any library function available in LabWindows for converting analog data to digital data?
    Thank You

    shoukat,
    could you please provide some more information about the background of this question? Is there any link to motion control (this is the Motion Control forum)?
    I don't understand exactly what you are looking for. All data that you use in a programming environment is digital, regardless of the data type. Are you looking for a data conversion from floating point to binary (e.g. a Boolean array), or are you looking for data acquisition hardware and software?
    Regards,
    Jochen Klier
    National Instruments

  • Management Data Warehouse (MDW) vs. Utility

    What's the road map for Management Data Warehouse (MDW) vs. SQL Server Utility?
    I've noticed that the database behind Utility is identical to the MDW database, so that makes me wonder if we're planning to merge the two technologies. Or will we leave MDW for more comprehensive data with lower-level controls and use Utility only for a few high-level dashboard issues?
    Are they managed by the same development team or do they have competing teams?
    John Lambert, Microsoft Senior SQL Server Premier Field Engineer

    Hi John,
    The management data warehouse is a relational database that contains the data collected from a server that is a data collection target. This data is used to generate the reports for the System Data collection sets, and it can also be used to create custom reports.
    SQL Server customers have a requirement to manage their SQL Server environment as a whole, addressed in this release through the concept of application and multiserver management in the SQL Server Utility. An enterprise can have multiple SQL Server Utilities, and each SQL Server Utility can manage many instances of SQL Server and data-tier applications.
    Data collected in the SQL Server Utility by managed instances of SQL Server is stored in the utility management data warehouse (UMDW). That is just one aspect, though; I think they have other, different usages.
    Management Data Warehouse:
    http://msdn.microsoft.com/en-us/library/bb677306.aspx
    SQL Server Utility Features and Tasks:
    http://msdn.microsoft.com/en-us/library/ee210548.aspx
    Thanks,
    Maggie

  • Data conversion rule manager & repository

    Hi All,
    We are running a fairly large legacy data conversion project from IMS and DB2 to Oracle, which will involve quite a wide range of conversion/mapping rules (from complex data manipulation to simple mapping). These requirements and rules involve both tech and business collaboration. Are there any data conversion rule manager tools out there that provide these types of capabilities:
    1. Data conversion rule repository and version control
    2. Collaboration features (e.g. collaborative editing, commenting, decision capture, approval, etc.)
    We have checked some of the tools out there, and they are either generic requirements-gathering tools, which are basically a few tabs with drop-downs and a big text box, or end-to-end data conversion automation solutions, which focus on the execution of the data conversion instead of managing and tracking conversion rules.
    Can anyone help? Please let me know if there is anything that comes close and is practical and useful. Thank you.

    Tubby, thank you for the info. Yes, we actually use ODI as a development tool for data conversion rule implementation and execution between our staging DB and target DB. However, ODI does not really have the capability to allow our tech and business people to collaborate and focus on getting all these rules defined correctly. Having said that, ODI does have many very useful side features for our developers to leverage and share information, such as knowledge modules and data mappings, etc.
    We would hope to have a practical data rule management tool and repository that allows tech and business to work together, gather all the rules and build up the knowledge base in one single place. DEV/QA can then take that away and focus on the implementation and validation.
    Please let me know. Thank you very much.

  • Data conversion for new sob

    emm wrote:
    Purchase Orders:
    Open purchase orders will be converted, meaning the unfulfilled POs, i.e. the ones not received and fully open.
    For the POs which have been received but not delivered, we have requested the users to clear the in-transit receipts.
    For the POs which are partially received, what should be done?
    If a PO is fully received and delivered, it will not be converted to the new SOB as it is not an open PO, but if the invoice comes after Feb-11, how will the matching be done?
    The business needs to decide whether they are fine with matching the POs manually (i.e. referring to the documents and verifying); in this case you may capture the PO information in a DFF on the invoice distribution. Otherwise, if the POs under this scenario have to be converted, you may consider converting them with the receipt close tolerance maintained as 100% and the matching type as 2-way (again, business approval is needed to handle audit issues) in order to avoid converting the receipts/deliveries etc.
    What if a return has to be made going forward in Feb-11 under the new SOB?
    Ideally, returns can be done using Miscellaneous/Account Alias Issues, specifying the appropriate transaction reasons to clarify the scenario.

  • Data conversion for New GL - Going live during fiscal year

    Hi Experts,
    My client is going live on Oct 1 (fiscal year Jan-Dec), so for data conversion we need to load the P&L and balance sheet as of Sept 30, 2009. I need your help to resolve a few open issues (we have activated document splitting with zero balance and segment).
    1) When we load the P&L balances, how can I assign profit centers to all these P&L items?
    2) What is the procedure (best practice) for assigning profit centers to all the balance sheet items, since we are going live during the fiscal year?
    3) When we process the open items (AR, AP) in the month of October, how will the profit center assignment work?
    (My client will be using profit/cost center terminology for the first time, so we need to map all the balance sheet and P&L items to profit centers.)
    4) What is the best way to map all the balance sheet GL accounts to profit centers?
    Sorry for asking too many questions. Any help would be very much appreciated.
    Thanks,
    Sam

    1) When we load the P&L balances, how can I assign profit centers to all these P&L items?
    Each P&L account balance has to be uploaded with a cost centre; the profit centre is then derived automatically (see the sketch after this reply).
    2) What is the procedure (best practice) for assigning profit centers to all the balance sheet items, since we are going live during the fiscal year?
    Let the client decide the profit centre for the balance sheet items, since they know which balance relates to which PC.
    3) When we process open items (AR, AP) in the month of October, how will the profit center assignment work?
    While uploading the AR/AP balances, each line item should be uploaded with a profit centre; the PC assignment will then work automatically in October.
    (My client will be using profit/cost center terminology for the first time, so we need to map all the balance sheet and P&L items to profit centers.)
    4) What is the best way to map all the balance sheet GL accounts to profit centers?
    Try to assign each balance sheet GL balance to its own profit centre; this will help in future reports. Otherwise, post them to a common PC.
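
    To make the point about carrying the cost centre / profit centre on every uploaded line a bit more concrete, here is a minimal sketch of a single upload line posted with BAPI_ACC_DOCUMENT_POST. The company code, accounts, cost centre and amounts are placeholders, and a real load program would of course build the offsetting lines, check RETURN and loop over the legacy file.
      DATA: ls_header TYPE bapiache09,
            lt_gl     TYPE STANDARD TABLE OF bapiacgl09,
            ls_gl     TYPE bapiacgl09,
            lt_curr   TYPE STANDARD TABLE OF bapiaccr09,
            ls_curr   TYPE bapiaccr09,
            lt_return TYPE STANDARD TABLE OF bapiret2.

      ls_header-comp_code  = '1000'.          " placeholders
      ls_header-doc_date   = '20090930'.
      ls_header-pstng_date = '20090930'.
      ls_header-doc_type   = 'SA'.
      ls_header-username   = sy-uname.

      " P&L line: the cost centre is supplied, so the profit centre is derived
      " from the cost centre master data; balance sheet lines would instead
      " carry PROFIT_CTR directly (or a common/default profit centre).
      ls_gl-itemno_acc = 1.
      ls_gl-gl_account = '0000400000'.
      ls_gl-costcenter = '0000001000'.
      APPEND ls_gl TO lt_gl.

      ls_curr-itemno_acc = 1.
      ls_curr-currency   = 'USD'.
      ls_curr-amt_doccur = '1000.00'.
      APPEND ls_curr TO lt_curr.

      " ... offsetting line(s) omitted; the document must balance to zero ...

      CALL FUNCTION 'BAPI_ACC_DOCUMENT_POST'
        EXPORTING
          documentheader = ls_header
        TABLES
          accountgl      = lt_gl
          currencyamount = lt_curr
          return         = lt_return.

      " check lt_return for messages of type 'E'/'A' before committing
      CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
        EXPORTING
          wait = 'X'.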

  • Can anyone Explain about Data conversion for Material master In SAP MM

    Can anyone explain data conversion for the material master and vendor master in SAP MM?
    Thanks

    Hi,
    Refer to the following link:
    [Data Migration Methodology|http://christian.bergeron.voila.net/DC_Guide/Data_Migration_Methodology_for_SAP_V01a.doc]
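
    In practice, material and vendor master conversions are usually done either with LSMW (using the standard material master object with direct input or the BAPI method, and the standard batch input programs for vendors) or with a custom load program around the standard BAPIs. As an illustration only, here is a minimal sketch of creating one material with BAPI_MATERIAL_SAVEDATA; the material number, type, views and values are placeholders, and a real conversion would loop over the legacy file and fill the view-specific structures it needs.
      DATA: ls_head   TYPE bapimathead,
            ls_mara   TYPE bapi_mara,
            ls_marax  TYPE bapi_marax,
            lt_makt   TYPE STANDARD TABLE OF bapi_makt,
            ls_makt   TYPE bapi_makt,
            ls_return TYPE bapiret2.

      ls_head-material   = 'ZCONVMAT1'.        " placeholders
      ls_head-ind_sector = 'M'.
      ls_head-matl_type  = 'FERT'.
      ls_head-basic_view = 'X'.

      ls_mara-matl_group  = '001'.
      ls_mara-base_uom    = 'EA'.
      ls_marax-matl_group = 'X'.               " field selection flags
      ls_marax-base_uom   = 'X'.

      ls_makt-langu     = 'E'.
      ls_makt-matl_desc = 'Converted material'.
      APPEND ls_makt TO lt_makt.

      CALL FUNCTION 'BAPI_MATERIAL_SAVEDATA'
        EXPORTING
          headdata            = ls_head
          clientdata          = ls_mara
          clientdatax         = ls_marax
        IMPORTING
          return              = ls_return
        TABLES
          materialdescription = lt_makt.

      IF ls_return-type = 'E' OR ls_return-type = 'A'.
        CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
      ELSE.
        CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
          EXPORTING
            wait = 'X'.
      ENDIF.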

  • Data Conversion Errors for the last week

    We've been running a simple Stream Analytics job for a little over a month now with a very light workload. The input is an Event Hub and the output is SQL Server. We noticed today that we haven't received anything into SQL Server since 2014-12-08 (we don't receive events every day, so we only know that everything still worked on the 8th of December), so we checked the job's logs. It seems that the job is failing to process all the messages: the value of "Data Conversion Errors" is high.
    I wonder what could have happened? We haven't touched the client since we started the job, so it's still sending the messages in the same format. And we haven't touched the job's query either.
    Has there been an update to either Stream Analytics or Event Hubs which could cause the issue we're seeing?

    I've followed the TollApp instructions word for word (except for the NamespaceType "Messaging" parameter that has been added to New-AzureSBNamespace).
    I have 0 lines in the output, and this is the service log:
    Correlation ID:
    e94f5b9e-d755-4160-b49e-c8225ceced0c
    Error:
    Message:
    After deserialization, 0 rows have been found. Possible reasons could be a missing header or malformed CSV input.
    Message Time:
    2015-01-21 10:35:15Z
    Microsoft.Resources/EventNameV2:
    sharedNode92F920DE-290E-4B4C-861A-F85A4EC01D82.entrystream_0_c76f7247_25b7_4ca6_a3b6_c7bf192ba44a#0.output
    Microsoft.Resources/Operation:
    Information
    Microsoft.Resources/ResourceUri:
    /subscriptions/eb880f80-0028-49db-b956-464f8439270f/resourceGroups/StreamAnalytics-Default-West-Europe/providers/Microsoft.StreamAnalytics/streamingjobs/TollData
    Type:
    CsvParserError
    Then I stopped the job, and connected to the event hub with a console app and received that:
    Message received. Partition: '11', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    85,21/01/2015 10:24:56,QBQ 1188,OR,Toyota,4x4,1,0,4,361203677
    Message received. Partition: '11', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    33,21/01/2015 10:25:42,BSE 3166,PA,Toyota,Rav4,1,0,6,603558073
    Message received. Partition: '11', Data: 'TollId,EntryTime,LiMessage received. Partition: '10', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    59,21/01/2015 10:23:59,AXD 1469,CA,Toyota,Camry,1,0,6,150568526
    Message received. Partition: '10', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    25,21/01/2015 10:24:17,OLW 6671,NJ,Honda,Civic,1,0,5,729503344
    Message received. Partition: '10', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    51,21/01/2015 10:24:23,LTV 6699,CA,Honda,CRV,1,0,5,169341662
    Note the bug in the 3rd message. In my opinion it's unrelated; it could be the WriteLine that can't keep up with the stream in the console application. At worst it's in the stream itself, but then I should see at least some lines in the output for the correctly formatted messages.
