Data Migration into Sourcing

Hi,
I am interested in learning how to migrate data into SAP Sourcing (7.0).
I have experience of data migration into SAP via LSMW, BDC and custom upload programs; however, I do not have the same experience with SAP Sourcing.
Does SAP run a course for this? Or can someone point me in the direction of help material/blogs where these details are covered?
I searched SDN and Google, but I am not finding the required information.
Thank you for your time.

Hi Barry
In E-Sourcing you can also follow the same three-environment scenario (development, quality and production, plus an additional cutover system if needed) based on the SAP project management methodology. This is very similar to the multiple-client scenario you have in SAP ECC. Here the concept is called multitenancy: multiple instances of E-Sourcing can be deployed in a single installation, each with a separate database cluster. I would suggest you look up multitenancy on the forum for reference.
I don't think there are separate tools for system rollbacks, but a database backup can always be taken. A database rollback would allow you to restore the data to a previous date. Apart from that, a server backup is always an option, though it is used very sparingly. Database backups have served me well in all my deployments so far.
PI is a good option for integration. Apart from that, a standard ECC integration package is provided by SAP, with proper checks maintained to avoid data corruption/data loss issues.
Hope this helps!
Regards
Mudit Saini
Edited by: Mudit_UCB on Aug 19, 2011 10:43 AM

Similar Messages

  • Data migration into cProjects

    Hello,
    We need to migrate some data from legacy non-SAP databases into cProjects. We will need to create a project definition (PD) and phases, and also populate some data into user-defined fields on the PD.
    Has anybody got experience of this?
    Options could be:
    LSMW
    Microsoft Project upload
    Thanks in advance,
    Steve

    Steve,
    you need to scroll down, and then the horizontal scroll bar will be visible so you can see my full message (it happened to me as well).
    Anyway, I am not a programmer myself, but here my understanding from what my developer told me:
    you will basically need one program to clean the data and consolidate the multiple XLS files into one upload file.
    The second program effectively uploads the data. There are many BAPIs available, depending on what you need to upload: structures, docs, roles etc. Some BAPIs create, some change. A (not comprehensive) list is in OSS note 882484. They normally start with BAPI_BUS21* (in SA38).
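    For the first program, here is a minimal sketch of the clean-and-consolidate step (assuming pandas is available; the folder, file pattern and column names are hypothetical):

    import glob
    import pandas as pd

    # Collect every legacy export in the drop folder (hypothetical path/pattern).
    frames = []
    for path in glob.glob("legacy_exports/*.xlsx"):
        df = pd.read_excel(path)
        df.columns = [c.strip().upper() for c in df.columns]  # normalize headers
        frames.append(df)

    merged = pd.concat(frames, ignore_index=True)

    # Basic cleaning: drop fully empty rows and exact duplicates, and trim
    # whitespace in the key fields (hypothetical column names).
    merged = merged.dropna(how="all").drop_duplicates()
    for col in ("PROJECT_ID", "PHASE_NAME"):
        if col in merged.columns:
            merged[col] = merged[col].astype(str).str.strip()

    # One consolidated file for the second (upload) program to feed the BAPIs.
    merged.to_csv("cprojects_upload.csv", index=False)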
    Cheers,
    Lacramioara

  • Mass data load into SAP R/3 - with XI?

    Hi guys!
    I have an issue: mass data migration into SAP R/3. Is XI a good solution? It will be about 60 GB of data. Or is there a better way to do this data load?
    Thanx a lot!
    Olian

    hi,
    SAP doesn't recommend using XI for mass data migration, and 60 GB is certainly too much.
    Use LSMW for that purpose.
    Regards,
    michal

  • SAP Legacy Data Migration Strategy/Process

    Hi Friends
    I would be grateful if anybody could let me know the process of legacy data migration into SAP. What steps do I have to follow while doing a legacy data migration? My questions are:
    1) How do I upload GL balances from legacy to SAP?
    2) How do I upload vendor and customer open items into SAP?
    3) How do I upload asset balances?
    4) What is the use of migration clearing accounts?
    Kindly provide any documents about legacy data migration into SAP.
    Thanks in advance
    Rao

    Dear Rao,
    Just check the link below; it explains how to upload the balances in an easily understandable way.
    http://www.saptechies.com/sap-pdf-books-download/SAP_Go_live_strategy11248141773.pdf
    Then follow the procedure below in the same way.
    First create 5 dummy GL offset codes. Each of these GL codes should have a zero balance at the end of the upload, because every offset account is debited and credited for the same total across the entries below.
    1) Dr  GL Offset Account
           Cr  Asset Offset Account
           Cr  AP Offset Account
           Cr  Sales
           Cr  Other GLs
    2) Dr  Cash
       Dr  Consumption
       Dr  Asset Offset Account
       Dr  AR Offset Account
       Dr  Material Offset Account
       Dr  Other GLs
           Cr  GL Offset Account
    3) Dr  AP Offset Account
           Cr  Vendor 1
           Cr  Vendor 2
           Cr  Vendor 3
    4) Dr  Customer 1
       Dr  Customer 2
       Dr  Customer 3
           Cr  AR Offset Account
    5) Dr  Fixed Asset 1
       Dr  Fixed Asset 2
           Cr  Asset Offset Account
           Cr  Acc. Depreciation Account
    6) Dr  Stock Inventory Account
           Cr  Material Offset Account
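    A quick way to sanity-check this pattern is to tally the debits and credits per offset account across the six entries and confirm each offset nets to zero. A small sketch with made-up amounts:

    # Each entry is a list of (account, debit, credit); the amounts are invented
    # but chosen so that every entry balances.
    entries = [
        [("GL Offset", 100, 0), ("Asset Offset", 0, 50), ("AP Offset", 0, 30),
         ("Sales", 0, 10), ("Other GLs", 0, 10)],
        [("Cash", 5, 0), ("Consumption", 3, 0), ("Asset Offset", 75, 0),
         ("AR Offset", 10, 0), ("Material Offset", 5, 0), ("Other GLs", 2, 0),
         ("GL Offset", 0, 100)],
        [("AP Offset", 30, 0), ("Vendor 1", 0, 20), ("Vendor 2", 0, 10)],
        [("Customer 1", 6, 0), ("Customer 2", 4, 0), ("AR Offset", 0, 10)],
        [("Fixed Asset 1", 40, 0), ("Asset Offset", 0, 25), ("Acc. Depreciation", 0, 15)],
        [("Stock Inventory", 5, 0), ("Material Offset", 0, 5)],
    ]

    balances = {}
    for entry in entries:
        # Each journal entry must balance: total debits equal total credits.
        assert sum(d for _, d, _ in entry) == sum(c for _, _, c in entry)
        for account, dr, cr in entry:
            balances[account] = balances.get(account, 0) + dr - cr

    # Every offset (clearing) account must come back to zero after the upload.
    for account in ("GL Offset", "Asset Offset", "AP Offset", "AR Offset", "Material Offset"):
        print(account, balances[account])
        assert balances[account] == 0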
    Regards,

  • Data Migration Tool

    Does Oracle have a tool for data migration? The source and destination are Oracle databases with different structures.
    George

    Yes, it is possible; post the structures of both databases. You can use export/import.

  • Data Migration (WM)

    Hi,
    Can anybody explain the steps involved in data migration, especially storage bin data migration into the SAP system from an Excel sheet? Any documentation or screenshots related to this topic would be very helpful.
    Also material master data migration...
    Waiting sincerely for your valuable response.
    Thanks & regards

    hi,
    kindly maintain the WM1 and WM2 views in transaction OMS2 for all the material types that are to be created for the warehouse.
    Create a BDC with the help of an ABAPer, maintaining the required fields in WM1 and WM2, such as the stock removal and stock placement storage sections and the capacity in WM1.
    In WM2 you maintain the LE quantity and the SUT (storage unit type).
    The ABAPer can create a recording for this if you give them the requirements; a sketch of preparing the input file follows.
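    Before the recording runs, the Excel sheet is typically flattened into a delimited text file that the BDC session reads. A minimal sketch (openpyxl assumed; the file name and column order are hypothetical):

    from openpyxl import load_workbook

    # Legacy storage-bin sheet (hypothetical file; columns assumed to be:
    # material, warehouse, storage type, storage section, LE quantity, SUT).
    wb = load_workbook("storage_bins.xlsx", read_only=True)
    ws = wb.active

    with open("bdc_input.txt", "w", encoding="utf-8") as out:
        for row in ws.iter_rows(min_row=2, values_only=True):  # skip header row
            # One tab-delimited line per record; the BDC recording maps each
            # field to the matching WM1/WM2 screen field.
            out.write("\t".join("" if v is None else str(v).strip() for v in row) + "\n")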
    Regards,
    velu

  • New GL Data Migration

    Dear All
    We are going to migrate from the classical general ledger to the new general ledger with scenario 3. We have done all the necessary configuration related to the new general ledger and also completed the testing. In this regard, I would ask whether anyone has experience of data migration from the classical to the new general ledger.
    Thanks and Best Regards
    Farhan Qaiser

    Dear Mark,
    We had faced the same issue, and our communication with OSS brought the following points to the fore:
    a) The decision to activate document splitting MUST be made before go-live.
    b) The migration tool required for data migration into the doc-split-related tables is planned by SAP but not yet available.
    c) Subsequent activation of document splitting causes serious problems, as per note 891144.
    Regards,
    Debojit Dey

  • SSIS 2012 is intermittently failing with "Invalid date format" while importing data from a source table into a destination table with the same exact schema

    We migrated packages from SSIS 2008 to 2012. The package works fine in all environments except one.
    SSIS 2012 is intermittently failing with the error below while importing data from a source table into a destination table with the same exact schema.
    Error: 2014-01-28 15:52:05.19
       Code: 0x80004005
       Source: xxxxxxxx SSIS.Pipeline
       Description: Unspecified error
    End Error
    Error: 2014-01-28 15:52:05.19
       Code: 0xC0202009
       Source: Process xxxxxx Load TableName [48]
       Description: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.
    An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 11.0"  Hresult: 0x80004005  Description: "Invalid date format".
    End Error
    Error: 2014-01-28 15:52:05.19
       Code: 0xC020901C
       Source: Process xxxxxxxx Load TableName [48]
       Description: There was an error with Load TableName.Inputs[OLE DB Destination Input].Columns[Updated] on Load TableName.Inputs[OLE DB Destination Input]. The column status returned was: "Conversion failed because the data value overflowed the specified type.".
    End Error
    But when we reorder the "Updated" column in the destination table, the package imports the data successfully.
    This looks like a bug to me. Any suggestions?

    Hi Mohideen,
    Based on my research, the issue might be related to one of the following factors:
    1. Memory pressure. Check whether there is memory pressure when the issue occurs. In addition, if the package runs in the 32-bit runtime on that specific server, use the 64-bit runtime instead.
    2. A known issue with SQL Server Native Client. As a workaround, use the .NET data provider instead of SNAC.
    Hope this helps.
    Regards,
    Mike Yin
    TechNet Community Support

  • How can I import data from multiple sources into a single RPD in OBIEE 11g?

    How can I import data from multiple sources into a single RPD in OBIEE 11g?

    Hi,
    To import from multiple data sources, first configure ODBC connections for the respective data sources; then you can import from each of them into the same RPD. When you import, a connection pool is created automatically for each source.
    Thanks

  • Data Migration for Open Purchase Order

    Hi, All,
    Does anyone know how to count the volume of open purchase orders? What's the normal strategy for the data migration and cut-over stage?
    My client wants to know how many open purchase orders there are in the legacy system and then decide between manual and automatic data migration. If manual, how is it done? If automatic, how is it done? All the material, vendor and plant numbers are different; how do we track them, and how do we match new against old?
    Thank you very much

    JC,
    Sounds a bit early to be making decisions about the realization phase.  It doesn't sound like you have finished the Blueprinting phase yet, much less the testing phase.
    Anyhow, in my experience I typically use LSMW (Legacy System Migration Workbench) to load MM master data (material masters), inventory (WIP, RM, FG, etc.), purchasing master data (vendors, purchase info records, source lists, quota arrangements), and purchasing transactional documents (POs, purchase requisitions, scheduling agreements, etc.). Depending on the complexity and volume of data, it may be necessary to write custom programs to load the data. You will find this out during your requirements gathering.
    It is uncommon but possible to load all of these data manually. I have never run across a client that wants to pay a consultant's hourly rate to sit at a terminal pecking away loading master data, so if the client intends to have his own users enter the data manually, the project manager should make provision that there will be qualified, TRAINED client employees available for this data entry. I did once help manually with a portion of a conversion, of sales credits, but there were only about 30 SD docs to load. I did this the evening before go-live day, while I was waiting for some of my LSMW projects to complete in the background.
    A good opportunity to 'practice' your data loads is right after you have completed your development and customization, and you have gotten the approval from the client to proceed from the pilot build to the full test environment.  Once you have moved your workbench and customization into the client's test environment, but before integration testing, you can mass load all, or a substantial portion of your conversion data into the qual system.  You can treat it like a dry run for go-live, and fine tune your processes, as well as your LSMW projects.
    Yes, it is good practice to generate comparisons between legacy and SAP even if the client doesn't ask for it. For Purchase orders on the SAP side, you could use any of the standard SAP Purchasing reports, such as ME2W, ME2M, ME2C, ME2L, ME2N.  If these reports do not meet the requirements of the client, you could write a query to display the loaded data, or have an ABAPer write a custom report.
    You didn't ask, but you should also do comparisons of ALL loaded data - including master data.
    It sounds like you are implying that the client wants YOU to extract the legacy data. For an SAP consultant, this is not very realistic (unless the legacy system is another SAP system). Most of us do not understand the workings of the myriad legacy systems. The client is usually expected to produce one or more legacy system technical experts for you to liaise with. You normally negotiate with the technical expert about every facet of the data migration. In addition, you will liaise with business users, who will help you and the implementation team to logically validate that the final solution (a turnkey SAP production system, fully loaded with data) will meet the client's business needs.
    Finally, you mentioned tracking the mapping of master data between legacy and SAP. There are many ways to do this. I normally try to get the legacy person to do the conversion on his end; e.g., when he gives you the load file, you would like the master data to have already been translated, with the SAP-relevant values inserted into the file. If this is not possible, I usually use MS Access databases to maintain a master map, and I perform the mapping on a PC. If your data package is small, you can probably get by with MS Excel or similar; a sketch of the idea follows.
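    As an illustration of such a master map, here is a minimal PC-side mapping sketch (pandas assumed; the CSV layouts and column names are hypothetical):

    import pandas as pd

    # Master map maintained on the PC: legacy vendor number -> SAP vendor number
    # (hypothetical file with columns LEGACY_VENDOR, SAP_VENDOR).
    vmap = pd.read_csv("vendor_map.csv", dtype=str)

    # Legacy open-PO extract from the client's technical expert (hypothetical layout).
    pos = pd.read_csv("open_pos_legacy.csv", dtype=str)

    # Translate legacy vendor numbers into SAP-relevant values before the load.
    pos = pos.merge(vmap, on="LEGACY_VENDOR", how="left")

    # Anything unmapped goes back to the legacy expert, not into the load file.
    pos[pos["SAP_VENDOR"].isna()].to_csv("unmapped_vendors.csv", index=False)
    pos.dropna(subset=["SAP_VENDOR"]).to_csv("po_load_file.csv", index=False)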
    Good Luck,
    DB49

  • Oracle Legacy System to SAP Data Migration

    Hi Experts,
    New to data migration:
    Can you guide me on how Oracle staging is useful for data migration?
    Here are my doubts:
    1. What is Oracle staging?
    2. How is Oracle staging useful for data migration?
    3. I see a few ETL tools for data migration, such as Informatica and Ascential DataStage, but our requirement is: how can we use Oracle staging for data migration?
    4. What are the benefits of using Oracle staging for data migration?
    Expecting your response to the above queries.
    Thanks,
    --Kishore

    1. What is Oracle staging? It is where ODI (Oracle Data Integrator) creates temporary tables. It does the transformation and, if required, cleans the data as well.
    2. How is Oracle staging useful for data migration? ODI loads source data into temporary (staging) tables, applying all the required mappings, staging filters, joins and constraints. The staging area is a separate area in the RDBMS (a user/database) where Oracle Data Integrator creates its temporary objects and executes some of the rules (mappings, joins, final filters, aggregations etc.). When performing the operations this way, Oracle Data Integrator behaves like an E-LT tool: it first extracts and loads into the temporary tables, then finishes the transformations in the target RDBMS.
    3./4. For how to use Oracle staging for data migration, and its benefits, you can refer to:
    https://blogs.oracle.com/dataintegration/entry/designing_and_loading_your_own
    http://docs.oracle.com/cd/E21764_01/integrate.1111/e12643/intro.htm#autoId10
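    To make the E-LT idea concrete, here is a minimal sketch of the staging pattern that ODI automates, using Python's sqlite3 as a stand-in DB-API driver and hypothetical table names:

    import sqlite3  # stand-in for any DB-API driver (e.g. cx_Oracle against Oracle)

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Staging and target live in the same database, so the final
    # transformation runs as set-based SQL inside the RDBMS (the "LT" in E-LT).
    cur.execute("CREATE TABLE stg_customers (src_id TEXT, name TEXT, country TEXT)")
    cur.execute("CREATE TABLE dim_customers (cust_id INTEGER PRIMARY KEY, name TEXT, country TEXT)")

    # Step 1 - Extract and Load: bulk-insert the raw source rows into staging.
    source_rows = [("A1", "  acme corp ", "US"), ("A2", "Globex", None)]
    cur.executemany("INSERT INTO stg_customers VALUES (?, ?, ?)", source_rows)

    # Step 2 - Transform inside the database: cleanse, filter and load the target.
    cur.execute("""
        INSERT INTO dim_customers (name, country)
        SELECT TRIM(name), country
        FROM stg_customers
        WHERE country IS NOT NULL
    """)
    conn.commit()
    print(cur.execute("SELECT * FROM dim_customers").fetchall())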

  • Significant slowness in data transfer from source DB to target DB

    Hi DB Wizards,
    My customer is noticing significant slowness in the data copy from the source DB to the target DB. The copy process itself uses PL/SQL code with cursors. The process copies about 7M records from the source DB to the target DB as part of a complicated data migration (this will be a one-time go-live process). I have also attached the AWR reports generated during the data migration. Are there any recommendations to help improve the performance of the data transfer?
    Thanks in advance,
    Nitin

    "Multiple COMMITs will take longer to complete the task than a single COMMIT at the end!" Let's check how much longer it is:
    create table T1 as
    select OWNER,TABLE_NAME,COLUMN_NAME,DATA_TYPE,DATA_TYPE_MOD,DATA_TYPE_OWNER,DATA_LENGTH,DATA_PRECISION,DATA_SCALE,NULLABLE,COLUMN_ID,DEFAULT_LENGTH,NUM_DISTINCT,LOW_VALUE,HIGH_VALUE,DENSITY,NUM_NULLS,NUM_BUCKETS,LAST_ANALYZED,SAMPLE_SIZE,CHARACTER_SET_NAME,CHAR_COL_DECL_LENGTH,GLOBAL_STATS,USER_STATS,AVG_COL_LEN,CHAR_LENGTH,CHAR_USED,V80_FMT_IMAGE,DATA_UPGRADED,HISTOGRAM
    from DBA_TAB_COLUMNS;
    insert /*+APPEND*/ into T1 select * from T1;
    commit;
    -- repeat until it is >7M rows
    select count(*) from T1;
    9233824
    create table T2 as select * from T1;
    set timing on
    set autotrace on
    truncate table t2;
    declare r number:=0;
    begin
    for t in (select * from t1) loop
    insert into t2 values ( t.OWNER,t.TABLE_NAME,t.COLUMN_NAME,t.DATA_TYPE,t.DATA_TYPE_MOD,t.DATA_TYPE_OWNER,t.DATA_LENGTH,t.DATA_PRECISION,t.DATA_SCALE,t.NULLABLE,t.COLUMN_ID,t.DEFAULT_LENGTH,t.NUM_DISTINCT,t.LOW_VALUE,t.HIGH_VALUE,t.DENSITY,t.NUM_NULLS,t.NUM_BUCKETS,t.LAST_ANALYZED,t.SAMPLE_SIZE,t.CHARACTER_SET_NAME,t.CHAR_COL_DECL_LENGTH,t.GLOBAL_STATS,t.USER_STATS,t.AVG_COL_LEN,t.CHAR_LENGTH,t.CHAR_USED,t.V80_FMT_IMAGE,t.DATA_UPGRADED,t.HISTOGRAM );
    r:=r+1;
    if mod(r,10000)=0 then commit; end if;
    end loop;
    commit;
    end;
    -- call that a couple of times, with and without the "if mod(r,10000)=0 then commit; end if;" line commented out.
    Results:
    One commit
    anonymous block completed
    Elapsed: 00:11:07.683
    Statistics
    18474603 recursive calls
    0 spare statistic 4
    0 ges messages sent
    0 db block gets direct
    0 calls to kcmgrs
    0 PX remote messages recv'd
    0 buffer is pinned count
    1737 buffer is not pinned count
    2 workarea executions - optimal
    0 workarea executions - onepass
    10000 rows commit
    anonymous block completed
    Elapsed: 00:10:54.789
    Statistics
    18475806 recursive calls
    0 spare statistic 4
    0 ges messages sent
    0 db block gets direct
    0 calls to kcmgrs
    0 PX remote messages recv'd
    0 buffer is pinned count
    1033 buffer is not pinned count
    2 workarea executions - optimal
    0 workarea executions - onepass
    one commit
    anonymous block completed
    Elapsed: 00:10:39.139
    Statistics
    18474228 recursive calls
    0 spare statistic 4
    0 ges messages sent
    0 db block gets direct
    0 calls to kcmgrs
    0 PX remote messages recv'd
    0 buffer is pinned count
    1123 buffer is not pinned count
    2 workarea executions - optimal
    0 workarea executions - onepass
    10000 rows commit
    anonymous block completed
    Elapsed: 00:11:46.259
    Statistics
    18475707 recursive calls
    0 spare statistic 4
    0 ges messages sent
    0 db block gets direct
    0 calls to kcmgrs
    0 PX remote messages recv'd
    0 buffer is pinned count
    1000 buffer is not pinned count
    2 workarea executions - optimal
    0 workarea executions - onepass
    What we've got:
    Single commit at the end, avg elapsed: 10:53.4
    Commit every 10000 rows (923 commits), avg elapsed: 11:20.5
    Difference: 00:27.1 (3.98%)
    Committing in batches is just ~4% slower, but it is safer with regard to the undo consumed.

  • SAP Accelerated Data Migration

    Hi All,
    Could someone kindly provide more info about SAP ADM (Accelerated Data Migration)? I am unable to get any info about it. I would like to understand how exactly the tool works. I have the four-page PDF that is posted on the site, but it is not clear on the actual tool. Could someone kindly provide me with a document, screenshots or any more info about it? My mail id is [email protected]. Could someone kindly reply at the earliest?
    Thanks
    Prahlad

    Hi Prahlad,
    Go through this; I hope it helps you understand.
    With SAP Accelerated Data Migration, you can reduce migration costs by as much as 50% and avoid interruption of business processes. Moreover, shutting down the source system after migration reduces system administration costs and total cost of operations. In short, you realize the following benefits:
    • Significantly reduced cost and time to complete migration projects
    • Accurate, cost-effective data transfer applicable for any kind of source system
    • Better data quality because of preconfigured business objects that ensure data consistency
    • Improved end-user productivity and acceptance thanks to migration of historical data
    • Effective migration that avoids interruption of business processes
    • Full support services to avoid risks and ensure the optimum performance of your new business applications
    • Faster return on investment
    In short, a smoother, more cost-effective migration to a new technology solution ultimately positions your organization to lower your total cost of ownership, maintain competitive advantage, and pursue new business opportunities.
    Expertise in Action
    SAP Accelerated Data Migration applies a business object-oriented, two-step approach that uses a neutral interface as a staging area and predefined migration content for the conversion and upload of data. The neutral interface enables the SAP tool to generate predefined migration content and prevents all potential legal issues regarding the intellectual property of any source-system vendor. The whole data migration process from the source to the target system consists of just two steps:
    1. Data is extracted from the source system into the standard interface as XML files.
    2. Data migrates from the interface into the mySAP Business Suite database. The migration is based on a new "migration workbench" engine developed by SAP on the SAP NetWeaver platform. All requirements for mapping structures and fields and for developing complex conversion rules are solved within this engine.
    Once the migration is complete, business-unit end users have access to all the legacy data in the new applications as if it had originated there. They can continue to work on the existing business process items in the new applications and benefit from improved functionality.
    Lifting the Limitations
    Much of the cost and effort involved in classical data migrations is generated by migration content development:
    • Identifying business objects for migration to properly support the business
    • Defining the structure and field mapping for the relevant business objects
    • Developing conversion rules for all necessary value mapping
    Readily available migration content can simplify this effort. SAP Accelerated Data Migration provides preconfigured business content, helping you migrate it to your new system more efficiently and rapidly. The tool allows the migration of all types of data, independent of its current state within a business process. This includes master and dynamic data, as well as partially processed and historical data, to minimize data loss. Business processes are uninterrupted and normal operating procedures can be retained.
    By providing a standard, neutral interface and reading data as XML files, SAP Accelerated Data Migration is applicable for any kind of source system. Preconfigured data migration objects built specifically for SAP applications significantly simplify the conversion of non-SAP software data into SAP software data objects, yielding far-reaching benefits. Besides reducing related IT costs, you can be certain of consistency across business-object boundaries. Through a direct insert into the database, you avoid the performance limitations of classical data migration.
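    As an illustration of step 1 above (this is not the real ADM interface; its actual XML schema is not described in the brochure, so the element names here are hypothetical), a minimal Python sketch of extracting source rows into an XML staging file:

    import sqlite3
    import xml.etree.ElementTree as ET

    # Hypothetical legacy source; in practice this is the non-SAP system's database.
    conn = sqlite3.connect("legacy.db")
    cur = conn.cursor()
    cur.execute("SELECT vendor_id, name, city FROM vendors")

    # One XML document per business-object type goes into the neutral interface.
    root = ET.Element("Vendors")  # hypothetical element names, not ADM's schema
    for vendor_id, name, city in cur.fetchall():
        v = ET.SubElement(root, "Vendor", id=str(vendor_id))
        ET.SubElement(v, "Name").text = name
        ET.SubElement(v, "City").text = city

    ET.ElementTree(root).write("vendors.xml", encoding="utf-8", xml_declaration=True)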
    Reward points if helpful.
    Thanks

  • Configuration Manager 2012 SP1 Prerequisite Checker returns "Migration active source hierarchy" error

    Hello, 
    I am getting "Migration active source hierarchy" error in the SCCM 2012 SP1 prerequisite checker on the CM 2012 primary site. The error says "There is an active hierarchy configuration for the migration. Please stop data gathering for each
    source site in the source hierarchy." However, actually there is no migration source hierarchy configured and there are no migration jobs displayed in CM 2012 console. We used to configure this feature to migrate CM 2007 to current CM 2012 without service
    pack two years ago and we have deleted these configurations. 
    In migmctrl.log, the following record is being generated every 60 minutes. 
    ======================================================
    Connection string = Data Source=xxxxx;Initial Catalog=xxxxx;Integrated Security=True;Persist Security Info=False;MultipleActiveResultSets=True;Encrypt=True;TrustServerCertificate=False;Application Name="Migration Manager".~~  $$<SMS_MIGRATION_MANAGER><04-07-2014 01:01:53.710+300><thread=6992 (0x1B50)>
    Created new sqlConnection to xxxxx~~  $$<SMS_MIGRATION_MANAGER><04-07-2014 01:01:53.741+300><thread=6992 (0x1B50)>
    [Worker]: Start two step scheduling for MIG_Job~~  $$<SMS_MIGRATION_MANAGER><04-07-2014 01:01:53.741+300><thread=6992 (0x1B50)>
    [Worker]: Step 1. Query the schedule items that was running or requested to start immediately ...~~  $$<SMS_MIGRATION_MANAGER><04-07-2014 01:01:53.741+300><thread=6992 (0x1B50)>
    [Worker]: Step 2. Query the first item in order of DateNextRun ...~~  $$<SMS_MIGRATION_MANAGER><04-07-2014 01:01:53.819+300><thread=6992 (0x1B50)>
    [Worker]: No item found. Sleep until the next event.~~  $$<SMS_MIGRATION_MANAGER><04-07-2014 01:01:53.882+300><thread=6992 (0x1B50)>
    [Worker]: End two step scheduling for MIG_Job~~  $$<SMS_MIGRATION_MANAGER><04-07-2014 01:01:53.882+300><thread=6992 (0x1B50)>
    [MigMCtrl]: the workitem queue is full!~  $$<SMS_MIGRATION_MANAGER><04-07-2014 01:01:53.882+300><thread=6992 (0x1B50)>
    [MigMCtrl]: WAIT 3 event(s) for 60 minute(s) and 0 second(s).~  $$<SMS_MIGRATION_MANAGER><04-07-2014 01:01:53.882+300><thread=6992 (0x1B50)>
    ======================================================
    I am not really sure where to look in this case. I would appreciate any advice.

    Hi,
    Could you please upload the full migmctrl.log?
    How about specifying a source hierarchy with the same name and password as the previous hierarchy in the console, and then clicking Clean Up Migration Data?
    Best Regards,
    Joyce Li

  • How to know that data has moved into GL

    Hi All,
    I have a doubt; could you clarify it? How can we know that data available in AP and AR has been moved into GL? Does GL_INTERFACE come into play here? How does GL_INTERFACE identify the data? Can you explain?
    Thanks & Regards,
    pallis

    Hi Pallis,
    The flow of data is like this:
    Step 1 - A transaction is recorded in AR or AP.
    System action - Transaction details are recorded in the base tables of the specific module, i.e. for Payables they are stored in AP-related tables and for Receivables in AR or RA tables.
    Step 2 - The transaction recorded in AR or AP is accounted.
    System action - When you run the Create Accounting concurrent process from AP or AR, the system creates accounting entries for the transactions; these are stored in XLA tables, not in the AP or AR tables.
    Step 3 - Entries are transferred to General Ledger.
    System action - One should always remember that ONLY ACCOUNTING ENTRIES are transferred from subledger modules such as AP or AR to GL, not the transaction-related information. Hence, the accounting entries that are generated and stored in the XLA tables are what is transferred to General Ledger by running the concurrent program Transfer Entries to General Ledger.
    When this operation is carried out, the system updates a column in the XLA tables to mark the entries that have been transferred to GL, in order to prevent duplication.
    So to answer your question ...
    1) So whenever data loads into the AP and AR tables, is the data simultaneously moved into the XLA tables?
    ==> NO; loading the transaction data does not update the XLA tables. These are generated by the system automatically and cannot be migrated manually; they are generated by running the Create Accounting concurrent program from the respective module.
    2) Is there a column to map to the AP and AR tables, so that record 1 is from AP and record 2 is from AR?
    ==> Most of the XLA tables have a column called APPLICATION_ID, which refers to the module from which the accounting entries were generated: for Payables the application id is 200 and for Receivables it is 222. These values are system-defined, hence the same for all environments across the globe.
    3) In the GL responsibility I find sources where we select from the list of sources and run the import; how can we import 3rd-party source journals in Journal Import?
    ==> If you are importing entries from a standard Oracle module such as Payables, Receivables or Assets, the source name is available in the list of values by default; however, if you are importing from a 3rd-party system, you have to define the 3rd-party software as a custom source. Only then will it appear at the Journal Import stage.
    The third-party tool should have created the accounting entries by itself and should send only the accounting data to Oracle General Ledger, not the transaction-related information. If Oracle is expected to create accounting entries for transactions recorded in 3rd-party systems, you have to configure FAH (Financials Accounting Hub) as well.
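    For example, to see which subledger entries have already been transferred, you can query the XLA headers by application id. A minimal sketch (cx_Oracle assumed; I believe XLA_AE_HEADERS carries APPLICATION_ID and GL_TRANSFER_STATUS_CODE in R12, but verify the column names in your instance):

    import cx_Oracle

    # Connection details are placeholders.
    conn = cx_Oracle.connect("apps", "apps_password", "dbhost:1521/EBSDB")
    cur = conn.cursor()

    # Count accounting entries by transfer status for Payables (200) and
    # Receivables (222); 'Y' should mean already transferred to GL.
    cur.execute("""
        SELECT application_id, gl_transfer_status_code, COUNT(*)
        FROM xla_ae_headers
        WHERE application_id IN (200, 222)
        GROUP BY application_id, gl_transfer_status_code
    """)
    for app_id, status, cnt in cur:
        print(app_id, status, cnt)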
    Regards,
    Ivruksha
