Regarding: Data extraction

Hi all,
I have a requirement to bring data from SAP into a non-SAP system. Please share with me which is the best method to employ.
This is for a bank project involving payroll accounts and other details.
Kindly help me with this.
Can I use SAP's predefined tools, or ALE/IDoc?

Yes, ALE/IDoc is one technique.
But it may take some time before you actually feel comfortable with it. I suggest trying native SQL:
use the EXEC SQL keyword to run native SQL queries directly against the database and read the data into an internal table (see the sketch below).
I remember one payroll application implemented in such a way that uploaded data to the G/L accounts every month.
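For illustration, here is a minimal ABAP sketch of that native SQL approach: read the payroll data into an internal table with EXEC SQL and write it out as a pipe-delimited file for the non-SAP system. The table ZPAY_POSTINGS, its fields, and the file path are assumptions for this example only; substitute your real payroll tables and target location.

```abap
*&---------------------------------------------------------------------*
*& Sketch: read payroll data with native SQL (EXEC SQL) into an
*& internal table and write it as a pipe-delimited file that the
*& non-SAP system can pick up. Table ZPAY_POSTINGS is assumed.
*&---------------------------------------------------------------------*
REPORT zpayroll_extract_sketch.

DATA: BEGIN OF gs_item,
        pernr TYPE n LENGTH 8,              " personnel number
        betrg TYPE p LENGTH 8 DECIMALS 2,   " amount
      END OF gs_item,
      gt_item   LIKE TABLE OF gs_item,
      gv_line   TYPE string,
      gv_amount TYPE c LENGTH 20,
      gv_file   TYPE string VALUE '/usr/sap/trans/payroll_extract.txt'.

START-OF-SELECTION.

* Native SQL against the current database connection; each fetched row
* is handed to the subroutine APPEND_ITEM via PERFORMING.
  EXEC SQL PERFORMING append_item.
    SELECT pernr, betrg
      FROM zpay_postings
      INTO :gs_item
  ENDEXEC.

* Write the internal table to an application-server file.
  OPEN DATASET gv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
  LOOP AT gt_item INTO gs_item.
    WRITE gs_item-betrg TO gv_amount.
    CONDENSE gv_amount.
    CONCATENATE gs_item-pernr gv_amount INTO gv_line SEPARATED BY '|'.
    TRANSFER gv_line TO gv_file.
  ENDLOOP.
  CLOSE DATASET gv_file.

FORM append_item.
  APPEND gs_item TO gt_item.
ENDFORM.
```

For a productive, recurring interface, ALE/IDoc or a standard extraction tool is usually the more robust choice; a file-based extract like this is mainly suited to one-off or periodic batch transfers.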

Similar Messages

  • Data Extract Design

    Hi All,
    I have a requirement where I have to extract data from various different tables - only particular columns.
    The requirements are the same for different databases, so I thought of having a single generic approach and reusing the same code to perform the extract and create an ASCII file.
    Below is the typical scenario I want to achieve, so I need your expert input to get started.
    a) Define the required columns - this should be configurable, i.e., columns can be added or removed in future.
    b) Extract the column names from the database for the columns defined in step a) above.
    c) Extract the data from the relevant tables/columns for various conditions, based on steps a) and b) above.
    d) Create an ASCII file for all the data extracted.
    I'm unsure whether there is anything wrong with this approach; please suggest the best approach.
    Regs,
    R

    user10177353 wrote:
    I'm unsure whether there is anything wrong with this approach; please suggest the best approach.
    The first thing to bear in mind is that developing a generic, dynamic solution is considerably more complicated than writing a set of extract statements. So you need to be sure that the effort you're about to expend will save you more time than writing a script and copying/editing it for subsequent re-use.
    You'll probably need three tables:
    1. Extracts - one record per extract definition (perhaps including info such as target file name)
    2. Extract tables - the tables for each extract
    3. Extract columns - the columns for each extracted table.
    I'm writing this as though you'll be extracting more than one table per run.
    The writing to file is the trickiest bit. Choose a good separator. Remember that although we call them CSV files, commas actually make a remarkably poor choice of separator, as way too much data contains them. Go for something really unlikely, ideally a multi-character separator like ||¬.
    Also remember that text files only take strings, so you need to convert your data to text. Use the data dictionary ALL_TAB_COLUMNS view to get the metadata for the extracted columns, and apply explicit masks to date and numeric columns. You may want to allow date columns to have masks which include or exclude the time element.
    Consider what you want to do with complex data types (LOBs, UDTs, etc).
    Finally, you need to address the problem of the extract file's location. PL/SQL has a lot of utilities to wrangle files but they only work on the server side. So if you want to write to a local drive you'll need to use SPOOL.
    One last thought: how will you import the data? It would probably be a good idea to use this mechanism to generate DDL for a matching external table.
    Cheers, APC

  • Regarding PS extraction settings

    Hi  All,
    I am new to data extraction from the PS (Project System) module. I have done the following steps:
    Installed the Business Content DataSources from RSA5 (DataSources from PS & CO).
    Then replicated them on the BW side.
    Assigned the DataSource and activated the transfer rules.
    Created the InfoPackage and scheduled it.
    When I check the monitor screen, I do not see any data.
    Then I checked RSA3, and I see 0 records for the corresponding DataSources.
    When I check with the functional guys, they show me data in the corresponding tables.
    My question is why the data is not reaching RSA3.
    Do we need to make any settings in SBIW, or something else?
    Please help me with this.
    Thanks & Regards,
    Bala

  • BODS 3.1: SAP R/3 data extraction - what is the difference between the 2 dataflows?

    Hi.
    Can anyone advise what the difference is between the following two ways of extracting data from SAP R/3?
    1) DF1 >> SAP R/3 (R/3 table - query transformation - dat file) >> query transformation >> target
    This ABAP flow generates an ABAP program and a dat file.
    We can also upload this program and run jobs with the 'execute preloaded' option on the datastore.
    This works fine.
    2) We can also pull the SAP R/3 table directly.
    DF2 >> SAP R/3 table (this has a red arrow, like an OHD) >> query transformation >> target
    This also works fine, and we are able to see the data directly in Oracle.
    This can also be scheduled as a job.
    But I am unable to understand the purpose of using the different types of data extraction flows:
    when to use which type of flow for data extraction,
    and the advantages/disadvantages of the two data flows.
    What we are not understanding is this:
    if we can directly pull data from the R/3 table through a query transformation into the target table,
    why use the flow of creating an R/3 data flow,
    then do a query transformation again,
    and then populate the target database?
    There might be some practical reasons for using these two different types of flows for data extraction, which I would like to understand. Can anyone advise, please?
    Many thanks
    indu

    Hi Jeff.
    Greetings. And many thanks for your response.
    Generally we pull the entire SAP R/3 table through a query transformation into Oracle.
    For this we use the R/3 data flow and the ABAP program, which we upload to the R/3 system
    so that we can use the 'execute preloaded' option and run the jobs.
    Since we have no control over our R/3 servers, nor do we have anyone for ABAP programming,
    we do not do anything at the SAP R/3 level.
    I was doing trial-and-error testing on our workflows for our new requirement:
    WF 1, which has some 15 R/3 tables.
    For each table we have created a separate dataflow.
    And in some of the dataflows, where the SAP tables had a lot of rows, I decided to pull the data directly,
    bypassing the ABAP flow.
    And still the entire workflow and data extraction happen OK.
    In fact I tried creating a new sample data flow and tested it,
    using direct download and also 'execute preloaded'.
    I did not see any major difference in the time taken for data extraction,
    because in any case we pull the entire table, then choose whatever we want to bring into Oracle through a view for our BO reporting, or aggregate it and then bring the data in as a table for Universe consumption.
    Actually, I was looking at other options to avoid the ABAP generation and the R/3 data flow because we are having problems in our dev and QA environments - delimiter errors - whereas in production it works fine. The production environment is an old setup of BODS 3.1; QA and dev are relatively new BODS environments, and they are the ones with the delimiter error.
    I did not understand how to resolve it as per this post: https://cw.sdn.sap.com/cw/ideas/2596
    While trying to resolve this problem, I ended up with the option of pulling the R/3 table directly, without the ABAP workflow, just by trial and error with each drag-and-drop option, because we urgently had to do a POC and deliver the data for the entire e-recruiting module of SAP.
    I don't know whether I can do this direct pulling of data for the new job which I have created,
    which has 2 workflows with 15 dataflows in each workflow,
    and push this job into production.
    I also don't know whether I can bypass the ABAP flow and pull R/3 data directly in all the dataflows in future, for any of our SAP R/3 data extraction requirements. The technical difference between the two flows is not clear to us, and being new to the whole of ETL, I just wanted to know the pros and cons of this particular kind of data extraction.
    As advised, I shall check the schedules for a week, and then we shall probably move it into production.
    Thanks again.
    Kind Regards
    Indu

  • Open data extraction orders - Applying Support Packs

    Dear All,
    I have done the IDES 4.6C SR2 installation.
    While updating the support packs, I get a message in the CHECK_REQUIREMENTS phase saying:
    Open data extraction orders
    There are still open data extraction orders in the system.
    Process these before the start of the object import, because changes to the ABAP Dictionary structures could lead to data extraction orders no longer being readable after the import and their processing terminating.
    For more details about this problem, see Note 328181.
    Go to the Customizing cockpit for data extraction and start the processing of all open extraction orders.
    I have checked the Note,
    but this is something I am facing for the first time.
    Any suggestions?
    Rgds,
    NK

    The exact message is :
    Phase CHECK_REQUIREMENTS: Explanation of the Errors
    Open Data Extraction Requests
    The system has found a number of open data extraction requests. These
    should be processed before starting the object import process, as
    changes to DDIC structures could prevent data extraction requests from
    being read after the import, thus causing them to terminate. You can
    find more information about this problem in SAP Note 328181.
    Call the Customizing Cockpit data extraction transaction and process all
    open extraction requests.

  • Bulk API V2.0 Data extract support for additional objects (Campaign, Email, Form, FormData, LandingPage)?

    allison.moore
    Are there any plans for adding the following objects under Bulk API V2.0 for data extraction from Eloqua? Extracting the data for these objects using the REST API makes it complicated.

    Thanks for the quick response. Extracting these objects using the REST API with depth=complete poses a lot of complications from the code perspective, since these objects contain multiple nested or embedded objects. Is there any guideline on how to extract these objects using REST so that we can get all the data required for analysis/reporting?

  • Data Extraction and ODS/Cube loading: New date key field added

    Good morning.
    Your expert advice is required with the following:
    1. A data extract was done previously from a source with a full upload to the ODS and cube. An event is triggered from the source when data is available, and then the process chain first clears all the data in the ODS and cube and then reloads, activates, etc.
    2. In the ODS, the 'forecast period' field has now been moved from the data fields to the key fields, as the users would like to report per period in future. The source will in future only provide the data for a specific period and not all the data as before.
    3. Data must be appended in future.
    4. The current InfoPackage for the ODS is a full upload.
    5. The 'old' data in the ODS and cube must not be deleted, as the source cannot provide it again. They will report on the data per forecast period key in future.
    I am not sure what to do in BW as far as the InfoPackages are concerned, loading the data and updating the cube.
    My questions are:
    Q1) How will I ensure that BW will append the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
    Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
    Your assistance will be highly appreciated. Thanks
    Cornelius Faurie

    Hi Cornelius,
    Q1) How will I ensure that BW will append the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
    --> Try loading the data into the ODS in overwrite mode with a full update as before (this adds new records and overwrites previous records with the latest values). Then push a delta from this ODS to the cube.
    If the existing ODS is loaded in addition mode, introduce one more ODS with the same granularity as the source, load it in overwrite mode (delta if possible, otherwise full), and push only the delta onward.
    Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
    --> Yes, that is correct. Otherwise you would lose the historic data.
    Hope it helps
    Srini

  • FI data extraction help

    Hi All,
    I have gone through the SDN link for FI extraction and found it very useful.
    Still, I have some doubts.
    For line item data extraction, do we need to extract data from 0FI_GL_4, 0FI_AP_4 and 0FI_AR_4 into ODS 0FIGL_O02 (General Ledger: Line Items)? If so, do we need to maintain transformations between the ODS and all three DataSources?
    Also, please educate me on the 0FI_AP_3 and 0FI_AP_4 DataSources.

    >
    Jacob Jansen wrote:
    > Hi Raj.
    >
    > Yes, you should run GL_4 first. If not, AP_4 and AR_4 will be lagging behind. You can see in R/3 in table BWOM_TIMEST how the deltas are "behaving" with respect to the date selection.
    >
    > br
    > jacob
    Not necessarily for systems above Plug-In 2002.
    As of Plug-In 2002.2, it is no longer necessary to have DataSources linked. This means that you can now load 0FI_GL_4, 0FI_AR_4, 0FI_AP_4, and 0FI_TX_4 in any order. You also have the option of using DataSources 0FI_AR_4, 0FI_AP_4 and 0FI_TX_4 separately without 0FI_GL_4. The DataSources can then be used independently of one another (see note 551044).
    Source: http://help.sap.com/saphelp_nw04s/helpdata/EN/af/16533bbb15b762e10000000a114084/frameset.htm

  • Generic Data Extraction From SAP R/3 to BI server using Function Module

    Hi,
    I want a step-by-step procedure for generic extraction from an SAP R/3 application to a BI application
    using a function module.
    If anybody has a document or a PPT, please reply and post it in the forum; I will give points for it.
    Thanks & Regards
    Subhasis Pan

    Please go through this link:
    [SAP BI Generic Extraction Using a Function Module|https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/s-u/sap%20bi%20generic%20extraction%20using%20a%20function%20module.pdf]
    [Generic Data Extraction Using Function Module|Re: Generic Data Extraction Using Function Module]
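    For orientation, here is a heavily trimmed sketch of what such an extractor function module typically looks like, modelled on the SAP template RSAX_BIW_GET_DATA_SIMPLE (copy the template in SE37, then assign the function module and its extract structure to a generic DataSource in RSO2). The source table ZMYTABLE and the simplified interface handling below are assumptions for illustration; the real template also buffers the selection criteria and converts them into ranges.

```abap
FUNCTION z_bw_get_data_sketch.
*"--------------------------------------------------------------------
*" Interface trimmed from template RSAX_BIW_GET_DATA_SIMPLE (SE37):
*"  IMPORTING  I_REQUNR, I_DSOURCE, I_MAXSIZE, I_INITFLAG
*"  TABLES     I_T_SELECT, I_T_FIELDS, E_T_DATA (extract structure)
*"  EXCEPTIONS NO_MORE_DATA, ERROR_PASSED_TO_MESS_HANDLER
*"--------------------------------------------------------------------
  STATICS: s_counter TYPE i,
           s_cursor  TYPE cursor.

  IF i_initflag = 'X'.
*   Initialization call: normally the selection criteria (I_T_SELECT)
*   and requested fields (I_T_FIELDS) are saved here for later calls.
    CLEAR s_counter.
  ELSE.
    IF s_counter = 0.
*     First data call: open a database cursor on the source table.
      OPEN CURSOR WITH HOLD s_cursor FOR
        SELECT * FROM zmytable.            "assumed source table
    ENDIF.

*   Fetch the next data package into the extract structure.
    FETCH NEXT CURSOR s_cursor
      APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
      PACKAGE SIZE i_maxsize.

    IF sy-subrc <> 0.
      CLOSE CURSOR s_cursor.
      RAISE no_more_data.                  "signals the last package
    ENDIF.

    s_counter = s_counter + 1.
  ENDIF.
ENDFUNCTION.
```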

  • Steps for Data extraction from SAP r/3

    Dear all,
    I am new to SAP BW.
    I have done data extraction from Excel into the SAP BW system,
    that is:
    Create InfoObjects > InfoArea > Catalog
        --> Characteristic catalog
        --> Key figure catalog
    Create InfoSource
    Upload data.
    Create InfoCube.
    I need similar steps for data extraction from SAP R/3:
    1. when the data is in Z-tables (using views/InfoSets/function modules, etc.)
    2. when the data is in standard SAP, using Business Content.
    Thanks and Regards,
    Gaurav Sood

    Hi,
    Check these links:
    Generic Extraction
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/84bf4d68-0601-0010-13b5-b062adbb3e33
    CO-PA
    http://help.sap.com/saphelp_46c/helpdata/en/7a/4c37ef4a0111d1894c0000e829fbbd/content.htm
    CO-PC
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/fb07ab90-0201-0010-c489-d527d39cc0c6
    Inventory
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f83be790-0201-0010-4fb0-98bd7c01e328
    Extractions in BI
    https://www.sdn.sap.com/irj/sdn/wiki
    LO Extraction:
    /people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
    /people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
    /people/sap.user72/blog/2005/02/14/logistic-cockpit--when-you-need-more--first-option-enhance-it
    /people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur
    /people/sap.user72/blog/2005/04/19/logistic-cockpit-a-new-deal-overshadowed-by-the-old-fashioned-lis
    Remya

  • How to add new fields to a data extract

    The following data extract program generates an output file which is displayed using a publisher template as a check format report.
    Oracle Payments Funds Disbursement Payment Instruction Extract 1.0
    How would I add new fields to the generated file? In other words what is the procedure to add new fields to the data extract?
    Thanks

    Can anyone please advise how to customize the payment extraction program? We also have a similar requirement to extract extra fields to format a payment file.

  • 'Error 8 occurred when starting the data extraction program'

    Hello Experts,
    I am trying to pull master data (full upload) for an attribute. I am getting an error on the BW side: 'The error occurred in Service API.'
    So I checked the source system and found that an IDoc processing failure had occurred. The failure shows 'Error 8 occurred when starting the data extraction program'.
    But when I check the extractor through RSA3, it looks fine.
    Can someone tell me what might be the reason for the IDoc processing failure and how it can be avoided in future? The same problem kept occurring later as well.
    Thanks
    Regards,
    KP

    Hi,
    Check in SM58 of the source system whether the IDocs are being processed fine into your target (BI) system.
    In SM58, enter * for all users and your BI system as the target destination, execute, and check whether any entries are pending there.
    Regards,
    Krishna Rao

  • Vendor Master Data extraction?

    Hi,
    I need to extract Vendor Master Data from SAP into a flat file.
    The format should be similar to file input required for the Vendor Master Upload program: RFBIDE00.
    Is there any program which can be used to extract the data in the required format?
    Any help is appreciated.
    Thanks and Regards,
    Varun

    Hi Varun,
    Check this link:
    [Vendor Master Extraction|http://sap.ittoolbox.com/groups/technical-functional/sap-r3-dev/vendor-master-data-extract-from-r3-to-flat-file-944432#]
    Don't forget to mark helpful answers.
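    If no standard extract program fits, a small custom report is a common fallback. Below is a minimal sketch that pulls a few LFA1 fields and downloads them as a tab-delimited file; the chosen fields, the separator and the file path are assumptions, and the field list would need to be extended (company-code data from LFB1, purchasing data from LFM1, address data, etc.) to match the layout your upload program expects.

```abap
REPORT zvendor_master_extract_sketch.

" Sketch: extract a few vendor master fields from LFA1 into a
" tab-delimited file on the presentation server. Extend the field
" list and formatting to match the upload layout you need.
TYPES: BEGIN OF ty_vendor,
         lifnr TYPE lfa1-lifnr,   " vendor number
         name1 TYPE lfa1-name1,   " name
         ort01 TYPE lfa1-ort01,   " city
         land1 TYPE lfa1-land1,   " country key
       END OF ty_vendor.

DATA: gt_vendor TYPE STANDARD TABLE OF ty_vendor,
      gs_vendor TYPE ty_vendor,
      gt_lines  TYPE TABLE OF string,
      gv_line   TYPE string.

START-OF-SELECTION.

  SELECT lifnr name1 ort01 land1
    FROM lfa1
    INTO CORRESPONDING FIELDS OF TABLE gt_vendor.

  LOOP AT gt_vendor INTO gs_vendor.
    CONCATENATE gs_vendor-lifnr gs_vendor-name1
                gs_vendor-ort01 gs_vendor-land1
           INTO gv_line
           SEPARATED BY cl_abap_char_utilities=>horizontal_tab.
    APPEND gv_line TO gt_lines.
  ENDLOOP.

" Download to the local PC; use OPEN DATASET instead for a server file.
  CALL METHOD cl_gui_frontend_services=>gui_download
    EXPORTING
      filename = 'C:\temp\vendor_master.txt'
    CHANGING
      data_tab = gt_lines.
```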

  • Data Extraction template

    Hi,
    In my current project there is a requirement for data migration from a legacy system. Can anyone please help me by providing a data extraction template for vendor and customer open line items, G/L balances and the bank directory?
    Your help in this regard is highly appreciated.
    Thanks
    Rajesh. R

    Hi Rajesh,
    When you extract the data from the legacy system, the points that you should keep in mind are:
    1. How the organisational structure in the legacy system is mapped to the SAP system, because the data upload into SAP should also happen in the way the reports are expected from SAP.
    There is no standard layout used for extraction; you need to make sure that you extract all the information from the legacy system that needs to be uploaded into SAP.
    I can give you an example of the fields that you may include in the extraction layout (see the sketch below):
    vendor account, document date, posting date, document type, company code, amount in document currency and in local currency, etc.
    2. Please keep in mind that there are some G/L accounts which will be maintained in SAP on an open-item basis. Therefore your extraction should, where possible, also happen transaction by transaction.
    3. There are certain G/L accounts which will be maintained in foreign currency, such as bank G/L accounts in foreign currency. In such cases you need to extract the balance in the foreign currency as well.
    My suggestion to you is to think the process through first and then go ahead with the extraction.
    Hope this helps
    Regards
    Paul
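    Purely as an illustration of such an extraction layout, here is a sketch of a vendor open item record built from the example fields above, written as an ABAP structure with one pipe-delimited output line; all field names, lengths, formats and the separator are assumptions to be adjusted to the migration template you agree on.

```abap
REPORT zlegacy_extract_layout_sketch.

" Illustrative extract layout for vendor open items from the legacy
" system; field names, lengths and formats are assumptions only.
TYPES: BEGIN OF ty_vendor_open_item,
         vendor_account TYPE c LENGTH 10,
         document_date  TYPE c LENGTH 8,   " YYYYMMDD
         posting_date   TYPE c LENGTH 8,   " YYYYMMDD
         document_type  TYPE c LENGTH 2,
         company_code   TYPE c LENGTH 4,
         amount_doc_cur TYPE c LENGTH 16,  " amount in document currency
         amount_loc_cur TYPE c LENGTH 16,  " amount in local currency
         currency       TYPE c LENGTH 5,
       END OF ty_vendor_open_item.

DATA: gs_item TYPE ty_vendor_open_item,
      gv_line TYPE string.

START-OF-SELECTION.

" Dummy record just to show the shape of one pipe-delimited line.
  gs_item-vendor_account = '0000100001'.
  gs_item-document_date  = '20240115'.
  gs_item-posting_date   = '20240131'.
  gs_item-document_type  = 'KR'.
  gs_item-company_code   = '1000'.
  gs_item-amount_doc_cur = '1500.00'.
  gs_item-amount_loc_cur = '1500.00'.
  gs_item-currency       = 'EUR'.

  CONCATENATE gs_item-vendor_account gs_item-document_date
              gs_item-posting_date   gs_item-document_type
              gs_item-company_code   gs_item-amount_doc_cur
              gs_item-amount_loc_cur gs_item-currency
         INTO gv_line SEPARATED BY '|'.

  WRITE: / gv_line.
```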

  • R/3 SP Import: Open Data Extraction Requests Error

    We are getting the below error while Basis is applying a support package upgrade in the R/3 source system.
    Open Data Extraction Requests
    The system has found a number of open data extraction requests. These
    should be processed before starting the object import process, as
    changes to DDIC structures could prevent data extraction requests from
    being read after the import, thus causing them to terminate. You can
    find more information about this problem in SAP Note 328181.
    Call the Customizing Cockpit data extraction transaction and process all
    open extraction requests.
    Initially we cleared the entries in LBWQ, RSA7 and SM13, but even after clearing these entries we are still getting the same error.
    For a support package upgrade in R/3, do we also need to delete the setup tables in the production environment?
    Is there any other way to upgrade the support package in the R/3 system without deleting the setup tables?
    Please help us with your inputs urgently.
    Thanks

    Thanks Siggi for the suggestion.
    We have already cleared the V3 updates by running the RMBWV* jobs manually. After clearing all the entries in LBWQ, RSA7 and SM13 we still got the same error.
    When we imported into the R/3 development system after deleting the setup tables, we did not get the above error. But we are concerned about deleting the setup tables in the production system,
    even though the deltas are already running fine.
    Is there any other way around this, or is deleting the setup tables the only option?
    Please help.
