ECC Data extraction

Hello All,
I have a situation in my department where I need to extract ECC data to a third-party database (MS SQL). Initially we were thinking of using SAP BW extractors, with SAP PI pushing the data out, but this does not seem to be feasible. Is this a fair assessment of the situation?
What other methods can we use for data extraction? We are evaluating ABAP programs, SAP Queries and BAPIs. Is there any other way we can extract this data set? The data is required for SD, FI-AR/AP, MM and PP.
We do not need the data on a real-time basis; a periodic, data-warehouse-style extraction along the lines of SAP BW is fine.
Appreciate your help.
Thanks

Hi
I recently figured this out, and here is what we do:
Not many people will want to let you extract data directly from SAP R/3, but we have been successful in extracting data for our in-house BI implementation at the organisation level. You need to understand which R/3 tables give you which information.
Once that information is gathered, extracting data with BODS is quite simple. It is a bit tricky and not entirely direct, but we do it as a process of several stages.
We use BODS 3.1
Step 1:
You need to create two SAP R/3 datastore connections:
1) Connection 1:
ABAP execution option: Generate and Execute
Data transfer method: Direct download
Working directory on SAP server: here give the local path on which the BODS server is running, e.g. d:\bods
Local directory: d:\bods
Generated ABAP directory: d:\bods
Execute in background: no
Create the data flow:
SAP R/3 data flow: (SAP R/3 table >> query transformation >> data transport (.dat file))
then: query transformation >> target database (we chose Oracle; you could create a datastore for SQL Server instead).
Import the table under this connection. The same goes for SAP R/3: import the table immediately after you create the SAP R/3 datastore.
In the data transport (.dat file) we use the 'replace file' option, so that the file is replaced each time you run the job.
Step 2:
Create a job for this workflow and run it.
It will create a .dat file and an ABAP file in the specified d:\bods folder.
Step 3:
Send the ABAP file to your SAP BW / SAP team, requesting that they create this ABAP program and make it available on your SAP R/3 system.
(We found this is easy for anyone to do: all that is needed is to copy, paste, activate and transport it to the SAP R/3 system, and it does not require any real ABAP programming time. A rough sketch of what such a generated program does follows after the note below.)
Note: the user name used for the SAP R/3 datastore needs access to all the required SAP R/3 tables, otherwise data access will fail. You can check the data flow at each stage; it will report 'not authorised' if you do not have access.
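For orientation only, here is a minimal, hedged sketch of what such a generated extraction program does conceptually: it reads the source table and dumps it to a delimited file in the working directory. The real program is generated by BODS and differs in detail; the table (MARA), file path, field list and delimiter below are placeholders, not the actual generated code.

* Hedged sketch only - NOT the BODS-generated code. Reads a source table
* and writes a delimited .dat file to an assumed working directory.
REPORT z_bods_extract_sketch.

DATA: lt_mara TYPE TABLE OF mara,    " placeholder source table
      ls_mara TYPE mara,
      lv_line TYPE string,
      lv_file TYPE string.

lv_file = '/usr/sap/trans/bi_share/mara.dat'.   " assumed path

SELECT * FROM mara INTO TABLE lt_mara.

OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
LOOP AT lt_mara INTO ls_mara.
  " Tab-delimited line with a few sample fields.
  CONCATENATE ls_mara-matnr ls_mara-mtart ls_mara-matkl
    INTO lv_line SEPARATED BY cl_abap_char_utilities=>horizontal_tab.
  TRANSFER lv_line TO lv_file.
ENDLOOP.
CLOSE DATASET lv_file.

The point is only that the program is a straightforward read-and-dump report, so activating and transporting it is a small request for the SAP team.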
Step 4:
Now create another SAP R/3 connection (connection 2):
ABAP execution option: Execute Preloaded
Execute in background: no
Data transfer method: Shared directory
Working directory on SAP server: d:\bi_share
(Create a shared folder BI_SHARE on the SAP R/3 server; the datastore user should have full read and write access to this shared folder.)
Application path to the shared directory: \\sapservername\bi_share
Step 5:
Once the ABAP program is available on your R/3 system, re-run the job using the new datastore with Execute Preloaded, which points to the shared directory on the SAP server itself.
This job can then be scheduled on the production BODS server.
======================================================
Wherever required, import the tables under the related datastore.
Although it might sound like a lengthy process, it actually takes very little time compared to asking someone to write ABAP or getting help from the BW team.
We built and tested everything on the DEV BODS system, then moved it all into production, and to date it has been successful for us.
We have taken help from ABAP programmers only when we explicitly needed some customised logic that was not directly available in the SAP R/3 system.
Otherwise, whenever we take a table from SAP R/3 we generally pull all its fields, so that they can be used at any time later even if they are not needed today.
So, with a little effort, even a person who is new to SAP R/3, BW or BODS, like me, is able to manage. It is not difficult, I suppose.
Good Luck.
We have used Oracle as the reporting database, and SQL Server should work the same way.
We use SQL Server only for metadata, such as the BO repository and the BODS repository.
Cheers
Indu

Similar Messages

  • Master data extraction from SAP ECC

         Hi All,
    I am a newbie here and teaching myself SAP BI. I have looked through all the forums regarding master data extraction from SAP ECC but could not find answers to my question. Please help me out.
    I want to extract customer attributes from SAP ECC. I have identified the standard DataSource 0CUSTOMER_ATTR and replicated it in SAP BI. I have created an InfoPackage for a full update. I checked the extractor checker (RSA3) and found 2 records for 0CUSTOMER_ATTR.
    When I run the InfoPackage, it remains in yellow status until it times out. Please let me know if I am missing anything, or if there is another process for master data extraction similar to transaction data extraction.

    Hi All,
    I did the below, and after clicking Execute in the simple job it takes me back to the Monitor InfoPackage screen.
    From your InfoPackage monitor, choose menu Environment --> Job Overview --> Job in the Source System. It prompts for the source system (ECC) user ID and password; enter them and you will find your job in SM37 (ECC). Select the job and click the log details icon; there you can see details about your error. Please share that error screenshot.
    Please find the screenshots.
    I did Environment --> Check Connection and found the below.

  • [sap-bw] SAP HR data extraction errors into BI 7.0 from ECC 6.0

    Hi Gurus,
    We are doing HR data extraction with the following details:
    Cube: 0PT_C01 - Time and Labor
    data source: 0HR_PT_1 - Times from personal work schedule
    Records in ECC: 15 million+.
    "Init load with data" is being loaded in packets of 40K records (approx. 400 packets).
    Each time we load (we tried reloading after deleting the data and re-initialising) it loads the same records, but gives the following error. Surprisingly, the error occurs on different packets in different loads, e.g. packet 6 the first time, packet 58 the second time, packet 122 the third time; it keeps changing.
    Processing end : Errors occurred
    Error when generating the update program
    Diagnosis
    An error occurred during program generation for InfoSource 0HR_PT_1 and
    InfoProvider 0PT_C01 . This may be extrapolated to incorrect update
    rules.
    Procedure
    Check and correct your update rules and then generate the update program
    again. You can find help on the error in the error log.
    We reactivated and verified the update rules, but to no avail.
    Appreciate any kind of help.
    thanks in advance.
    J.

    Oscar,
    Thanks for the help.
    We tried the following one by one.
    1. Deleted and recreated the update rules. To be frank, there is no coding or complication in there. It did not work.
    2. Tried loading only to the PSA. It loaded 15 million records perfectly. But when we tried updating the target cube it was the same story, with many more errors. Earlier we had 10 packages failing out of 387; this time almost every alternate package is marked red with the same error, 'error when generating the update program'.
    Hope somebody will come up with a solution.

  • Master data extraction ECC to BW

    Hi Friends,
    We are implementing BW 7.3 in the areas of SD and APO DP/SNP. This is a fresh implementation.
    What initial settings have to be made in the BW system?
    How do we extract master data and transaction data from ECC to BW?
    Thanks and Regards
    sri

    Hi,
    Please search the forum; you will find a lot of threads discussing master data and transaction data extraction.
    More or less, the extraction process remains the same across different BW versions.
    Regards,
    Durgesh.

  • Data Extraction (Line Item Level) FI/CO

    Hi expert,
    Example: the DataSource 0FI_GL_4 performs data extraction (line item level) for FI/CO.
    From the SAP help documentation I understand that delta records are transferred directly to BW and no record is written to the BW delta queue.
    But when I run RSA7 in ECC, the delta queue list contains 0FI_GL_4 and it has many records in it; when I then run the delta on the BW side, the records are cleared.
    I am confused: why does the delta queue have so many records in it?

    Hi James:
    Please refer to the presentation below:
    "Data Extraction (Line Item Level) FI/CO"
    https://websmp102.sap-ag.de/~sapidb/011000358700007881242002
    Regards,
    Francisco Milán.
    PS. Since the document is available on the SAP Marketplace you'll need an "S" account.

  • How to find all the Master data extract structures and Extractors

    I intend to create Master data dimensions closely similar to SAP BI in a 3rd party system. I would like to use SAP's standard extractors for populating the master data structures in SAP BI and then use a proprietary technology to create similar structures in a 3rd party database.
    Question: How to get a complete list of all Master data extract structures and corresponding extractors?
    Example: In ECC if I do SE80 and give 'Package' and 'MDX' and then press the 'display' spectacles, I get a list of structures and views under "dictionary objects" covering Material, Customer, Vendor, Plant Texts and attributes.
    How do I get the remainder of the Master data extract structures viz. Purchase Info Records, Address, Org Unit etc?
    Regards
    Sasanka

    Hi,
    try the table ROOSOURCE and search for DataSources whose names end in *ATTR for master data and in *TEXT for texts.
    This will give you the list of all master data DataSources in the system, and it contains columns that tell you the extract structure and the function module used by each.
    Thanks
    Ajeet
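    As a minimal sketch of such a lookup (assuming the standard ROOSOURCE columns OLTPSOURCE for the DataSource name, EXMETHOD for the extraction method and EXTRACTOR for the extractor function module or view; check the exact column names in SE11 on your release):

    * Hedged sketch: list active master data DataSources and their extractors.
    REPORT z_list_md_datasources.

    DATA: lt_src TYPE TABLE OF roosource,
          ls_src TYPE roosource.

    SELECT * FROM roosource INTO TABLE lt_src
      WHERE objvers = 'A'
        AND ( oltpsource LIKE '%ATTR' OR oltpsource LIKE '%TEXT' ).

    LOOP AT lt_src INTO ls_src.
      WRITE: / ls_src-oltpsource, ls_src-exmethod, ls_src-extractor.
    ENDLOOP.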

  • SPM data extraction question: invoice data

    The documentation on data extraction (Master Data for Spend Performance Management) specifies that invoice transactions are extracted from the table BSEG (accounting document line items). On the project I am currently working on, the SAP ERP team is quite worried about running queries on BSEG as it is massive.
    However, the extract files are called BSIK and BSAK, which seems to suggest that the invoices are in reality extracted from those accounts payable tables.
    Can someone clarify which tables are really used, and if it is the BSIK/BSAK tables, which fields are mapped?

    Hi Jan,
    A few additional mitigation thoughts that may help along the way, as the same concerns came up during our project.
    1) Sandbox stress testing
    If available, take advantage of an ECC sandbox environment for extractor prototyping and performance impact analysis. BSEG can be huge (it contains all financial line items), so BI folks typically do not fancy a re-init load, for the reasons outlined above. Ask Basis to copy a full year of all relevant transactional data (normally FI and PO data) onto the sandbox and then run the SPM extractors for a full-year extraction to get an idea of the extraction's impact on the source system.
    Even though system sizing and parameters may differ from your production box, you should still get a reasonable idea of and direction on the system impact.
    2) As a second step you may then consider breaking the data extraction (init/full mode for your project) down into 12 monthly extracts for the full year (this gives you 12 files from which to initialise your SPM system), with significantly less system impact and more control (e.g. it can be scheduled overnight).
    3) Business scenario
    You may consider using the vendor-related line items in BSAK/BSIK as the starting tables for the extraction process instead of the massive BSEG cluster table, and fetch/look up BSEG details only on a need basis (the index advantages were outlined above already).
    With this approach we managed to extract invoice data with reasonable source-system impact.
    Rgrds,
    Markus
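    As a rough illustration of point 3 above (field names and the date range are placeholders; the actual SPM extractor mapping may differ): start from the vendor line items and read BSEG only for those document keys, instead of scanning the cluster table directly.

    * Hedged sketch: vendor line items first, BSEG details only on demand.
    REPORT z_spm_invoice_sketch.

    DATA: lt_bsak TYPE TABLE OF bsak,
          lt_bseg TYPE TABLE OF bseg.

    " Cleared vendor items for one calendar year (placeholder date range;
    " in practice also restrict by company code and select only needed columns).
    SELECT * FROM bsak INTO TABLE lt_bsak
      WHERE budat BETWEEN '20110101' AND '20111231'.

    IF lt_bsak IS NOT INITIAL.
      " Matching accounting document lines, fetched by full document key.
      SELECT * FROM bseg INTO TABLE lt_bseg
        FOR ALL ENTRIES IN lt_bsak
        WHERE bukrs = lt_bsak-bukrs
          AND belnr = lt_bsak-belnr
          AND gjahr = lt_bsak-gjahr.
    ENDIF.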

  • Need help for SRM Data Extraction into BI-7

    Hi Experts,
    I am looking for help regarding SRM DataSource extraction to BW. I am working on a BW project and need to extract data from the SRM DataSources 0BBP_TD_CONTR_2 and 0SRM_TD_PRO, and I need to understand the extraction process from SRM. Is there a tool like the LO Cockpit available in SRM for extraction, and how is the delta managed? Or can I use transaction RSA5 and follow the normal process? Any documentation giving an idea of data extraction and delta management in SRM would be appreciated. Any kind of help is welcome.
    Thanks in Advance.

    SRM does not have an LO Cockpit-like tool as ECC does.
    Overall picture:
    http://help.sap.com/saphelp_srm40/helpdata/en/b3/cb7b401c976d1de10000000a1550b0/content.htm
    If you set up the data flows according to the SRM Business Content:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/3a/7aeb3cad744026e10000000a11405a/frameset.htm
    Then just perform an init load and schedule the deltas.

  • Master Data Extraction-Questions

    Hi guys,
    I was reading the SAP materials on master data extraction and came across something like this:
    "In master data DataSources, some support delta and some do not. Of those that support a delta mechanism, some use the delta queue. Some do not use the delta queue functionality, generally in the case of small volumes of data. Then there are other DataSources which use ALE change pointers for the delta mechanism."
    1. Can anyone explain how a delta is done for small volumes of data without using the delta queue functionality? Why go for this when we have the delta queue?
    2. How is a delta done using ALE change pointers? Why go for this when we have the delta queue functionality?
    Thanks in advance.
    Regards
    Schand

    Hi Des,
    I think you are explaining the difference between "delta update" and "delta queue". I am well aware of these two things.
    Delta queue: temporary storage for delta records in the R/3 system before they are successfully loaded into BI.
    Delta update: the type of update used to move delta records from the R/3 system to the BI system.
    My question is:
    Among both master data and transaction data DataSources, some support delta and some do not. Usually delta records are stored in the delta queue in ECC before being uploaded into BI. But some master data DataSources do not use the delta queue to store delta records in ECC before uploading them into BI, and they do this in the case of small volumes of data. Does anyone know how they do this if they are not using the delta queue?
    Secondly, the SAP materials mention that some other DataSources use ALE change pointers to determine the delta. In this case as well, they do not use the delta queue to store delta records before uploading into BI. What are ALE change pointers, and how do we make the settings for them?
    I hope I have explained it better now.
    Regards
    S
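    For the ALE change pointer part of the question, a brief note from general ALE knowledge (not from this thread): change pointers are switched on generally with transaction BD61 and per message type with BD50, and the recorded changes land in table BDCP2 (BDCP/BDCPS on older releases). A DataSource using this mechanism determines its delta from those entries rather than from the BW delta queue, which is why nothing shows up in RSA7 for it. A quick, hedged way to check whether change pointers are being written at all:

    * Hedged sketch: count recorded change pointers (table name assumed per release).
    REPORT z_check_change_pointers.

    DATA lv_count TYPE i.

    SELECT COUNT( * ) FROM bdcp2 INTO lv_count.
    WRITE: / 'Change pointer records in BDCP2:', lv_count.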

  • Master data Extraction in BI7.0

    Hello Gurus,
    I am new to BI 7.0. I would like to know the steps for master data extraction from ECC 6.0 to BI 7.0. I tried, but as the transformation is different from BW 3.5 I am not sure exactly what to do. I was trying to extract 0PROFIT_CTR and 0CO_AREA master data; any other master data example would be fine too. If anybody could explain the above process in simple steps it would be appreciated. Points will be awarded.
    Thanks
    Sujay

    Routines can be added in the transformation (step 6 below).
    1. Create the InfoObjects you might need in BI
    2. Create your target InfoProvider (InfoObject)
    3. Create a source system
    4. Create a DataSource or use a replicated R/3 DataSource
    5. Create and configure an InfoPackage that will bring your records to the PSA
    6. Create a transformation from the DataSource to the InfoProvider
    7. Create a Data Transfer Process (DTP) from the DataSource to the InfoProvider
    8. Schedule the InfoPackage
    9. Once successful, run the DTP to move data from the PSA to the InfoObject
    10. Check the InfoObject for data
    Loading an InfoObject using a flat file:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/01ed2fe3811a77e10000000a422035/content.htm
    Hope it Helps
    Chetan
    @CP..

  • BI data extraction setup when upgrading SAP version

    Hi,
    What steps need to be followed for the BI data extraction setup from SAP when upgrading from R/3 4.7 to ECC 6.0?
    Can anyone share any document that details the steps?
    Thanks,
    Sanjeev

    Hi,
    What is your BW system version?  I presume it is on a different box.
    BR/
    Mathew.

  • BODS 3.1 : SAP R/3 data extraction -What is the difference in 2 dataflows?

    Hi.
    Can anyone advise what the difference is between the following two flows for extracting data from SAP R/3?
    1) DF1: SAP R/3 data flow (R/3 table >> query transformation >> .dat file) >> query transformation >> target
    This ABAP flow generates an ABAP program and a .dat file.
    We can also upload this program and run jobs using the Execute Preloaded option on the datastore.
    This works fine.
    2) We can also pull the SAP R/3 table directly:
    DF2: SAP R/3 table (this has a red arrow, like an OHD) >> query transformation >> target
    This also works fine, and we are able to see the data directly in Oracle.
    This can also be scheduled as a job.
    But I am unable to understand the purpose of the different types of data extraction flows:
    When should which type of flow be used for data extraction?
    What are the advantages and disadvantages of the two data flows?
    What we do not understand is: if we can pull data from the R/3 table directly through a query transformation into the target table, why use the approach of creating an R/3 data flow, then doing a query transformation again, and then populating the target database?
    There might be practical reasons for using these two different types of flows for data extraction, which I would like to understand. Can anyone advise, please?
    Many thanks
    indu

    Hi Jeff.
    Greetings. And many thanks for your response.
    Generally we pull the entire SAP R/3 table through a query transformation into Oracle.
    For this we use the R/3 data flow and the ABAP program, which we upload to the R/3 system so that we can use the Execute Preloaded option and run the jobs.
    Since we do not have any control over our R/3 servers, nor anyone for ABAP programming, we do not do anything at the SAP R/3 level.
    I was doing trial-and-error testing on our workflows for our new requirement:
    WF 1, which has some 15 R/3 tables.
    For each table we have created a separate data flow.
    Finally, in some data flows where the SAP tables had a lot of rows, I decided to pull them directly, bypassing the ABAP flow.
    The entire workflow and data extraction still runs fine.
    In fact, I tried creating a new sample data flow and tested it using both direct download and Execute Preloaded.
    I did not see any major difference in the time taken for data extraction, because in any case we pull the entire table, then choose what we want to bring into Oracle through a view for our BO reporting, or aggregate the data and bring it in as a table for universe consumption.
    Actually, I was looking at other options to avoid this ABAP generation and the R/3 data flow because we are having problems on our dev and QA environments, which give delimiter errors, whereas in production it works fine. The production environment is an old setup of BODS 3.1; QA and dev are relatively new BODS environments, and these are the ones with the delimiter error.
    I did not understand how to resolve it as per this post: https://cw.sdn.sap.com/cw/ideas/2596
    While trying to resolve this problem, I ended up trying to pull the R/3 table directly, without using the ABAP workflow, just by trial and error with each drag-and-drop option, because we urgently had to do a POC and deliver the data for the entire SAP E-Recruiting module.
    I do not know whether I can use this direct pulling of data for the new job I have created, which has 2 workflows with 15 data flows in each workflow, and push this job into production.
    Nor whether I can bypass the ABAP flow and pull R/3 data directly in all data flows in the future, for any of our SAP R/3 data extraction requirements. The technical difference between the two flows is not clear to us, and being new to the whole ETL area, I just wanted to know the pros and cons of this particular way of doing the data extraction.
    As advised, I shall check the schedules for a week, and then we shall probably move it into production.
    Thanks again.
    Kind Regards
    Indu

  • Open data extraction orders -  Applying Support Packs

    Dear All,
    I have done the IDES 4.6C SR2 installation.
    While updating the support packages, I get a message in the CHECK_REQUIREMENTS phase:
    Open data extraction orders
    There are still open data extraction orders in the system
    process these before the start of the object import because changes to the ABAP Dictionary structures could lead to data extraction orders not being able to be read after the import and their processing terminating
    For more details about this problem, see Note 328181.
    Go to the Customizing cockpit for data extraction and start the processing of all open extraction orders.
    I have checked the note, but this is something I am facing for the first time. Any suggestions?
    Rgds,
    NK

    The exact message is :
    Phase CHECK_REQUIREMENTS: Explanation of the Errors
    Open Data Extraction Requests
    The system has found a number of open data extraction requests. These
    should be processed before starting the object import process, as
    changes to DDIC structures could prevent data extraction requests from
    being read after the import, thus causing them to terminate. You can
    find more information about this problem in SAP Note 328181.
    Call the Customizing Cockpit data extraction transaction and process all
    open extraction requests.

  • Bulk API V2.0 Data extract support for additional objects (Campaign,Email,Form,FormData,LandingPage)?

    allison.moore
    Are there any plans to add the following objects under Bulk API v2.0 for data extraction from Eloqua? Extracting the data for these objects using the REST API is complicated.

    Thanks for the quick response. Extracting these objects using the REST API with depth=Complete poses a lot of complications from the code perspective, since these objects contain multiple nested or embedded objects. Is there any guideline on how to extract these objects using REST so that we can get all the data required for analysis/reporting?

  • Crystal Report Using ECC Data has Auth error in Infoview

    We've created a Crystal report that successfully accesses ECC data from a function module in our Dev environment when the report is executed from within Crystal Designer.  When we try to execute the same report against the same development ECC system from within InfoView we get the following error:
    ERROR: The Database Logon Information for this report is either incomplete or incorrect. 
    The connection information is the same as it is in designer. We are able to successfully run Crystal Reports over BW data within InfoView. Any ideas about what might be wrong?
    Doug

    Ingo,
    The consumers of the report do have ECC access for order and inventory management so licensing should not be an issue.
    We were able to resolve the issue. The problem was with our version of SAP GUI on the BO Dev server.
    We had to update our "Services.dll" and our "SAPLogonTree.xml" file with an accurate version. Once that was done, the report executed as expected in InfoView using LDAP.
    Thanks,
    Doug
