Data extraction - urgent

Hi SAP Guru,
Can someone please tell me how I can extract old data (year 2007):
sales orders for sales org "wxyz", customer "abc", and material "xyz".
Points will be awarded.
Thank you in advance.
Best Regards

Hi Panki,
What type of data do you want to extract from the system?
1. If you want the list of sales orders for year 2007, based on customer, sales org, and material,
use transaction VA05, enter all the inputs, and you will get the list of sales orders.
2. If you want the header or item information for those sales orders,
use transaction SE16 with:
Table VBAK (sales document header information)
Table VBAP (sales document item information)
Enter whatever selections you need, put the 2007 date range in the "Created on" field, and execute (a quick report alternative is sketched below).
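If you also want the header and item fields together in one list, a small ABAP report along these lines would do it too. This is only a sketch: the sales org, customer, material, and date values are placeholders for your real "wxyz"/"abc"/"xyz" inputs, and customer/material numbers may need to be in internal format (leading zeros).

REPORT z_sales_orders_2007.
* Sketch: 2007 sales orders for one sales org / customer / material,
* joining header (VBAK) and item (VBAP) data. Selection values are placeholders.

TYPES: BEGIN OF ty_order,
         vbeln TYPE vbak-vbeln,   " Sales document
         erdat TYPE vbak-erdat,   " Created on
         kunnr TYPE vbak-kunnr,   " Sold-to party
         vkorg TYPE vbak-vkorg,   " Sales organization
         posnr TYPE vbap-posnr,   " Item number
         matnr TYPE vbap-matnr,   " Material
       END OF ty_order.

DATA: lt_orders TYPE STANDARD TABLE OF ty_order,
      ls_order  TYPE ty_order.

SELECT k~vbeln k~erdat k~kunnr k~vkorg p~posnr p~matnr
  INTO TABLE lt_orders
  FROM vbak AS k INNER JOIN vbap AS p ON k~vbeln = p~vbeln
  WHERE k~vkorg = 'WXYZ'                            " sales org (placeholder)
    AND k~kunnr = 'ABC'                             " customer (placeholder)
    AND p~matnr = 'XYZ'                             " material (placeholder)
    AND k~erdat BETWEEN '20070101' AND '20071231'.  " created on in 2007

LOOP AT lt_orders INTO ls_order.
  WRITE: / ls_order-vbeln, ls_order-posnr, ls_order-erdat,
           ls_order-kunnr, ls_order-matnr.
ENDLOOP.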
Hope this is clear.
Reward if helpful.
Thanks,
Praveen

Similar Messages

  • Error in Data Extraction - Urgent

    Hi,
    I am getting the following error while extracting master data for 0INFO_REC.
    Value '000000000000101034 ' for characteristic 0MATERIAL is in
    external format
    Thanks

    Hi Shiva,
Unfortunately I don't have access to my BW system today, so I do not know the exact conversion routine for 0MATERIAL.
You can instead try this: go to the transfer rules and, for the 0MATERIAL InfoObject, check the Conversion checkbox. Then retry the load.
Hopefully that will work.
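If the checkbox alone does not help, the same conversion can also be forced in a transfer routine. This is only a sketch, assuming 0MATERIAL uses the usual MATN1 conversion routine and that the source field is called MATNR; please verify what is actually assigned in your system before using it.

* Transfer-routine sketch: convert the incoming material number from
* external to internal format before it reaches 0MATERIAL.
* Assumption: 0MATERIAL uses conversion routine MATN1 - verify this in
* the InfoObject definition.
  DATA: l_matnr TYPE matnr.

  l_matnr = TRAN_STRUCTURE-matnr.   "field name depends on your transfer structure
  CONDENSE l_matnr.                 "the failing value carried a trailing space

  CALL FUNCTION 'CONVERSION_EXIT_MATN1_INPUT'
    EXPORTING
      input        = l_matnr
    IMPORTING
      output       = l_matnr
    EXCEPTIONS
      length_error = 1
      OTHERS       = 2.
  IF sy-subrc = 0.
    RESULT = l_matnr.
  ELSE.
    RETURNCODE = 4.                 "flag/skip the record
  ENDIF.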
    Bye
    Dinesh
    (Do not forget to assign points!!).

BODS 3.1: SAP R/3 data extraction - what is the difference between the 2 dataflows?

    Hi.
Can anyone advise what the difference is between the two data extraction flows for getting data out of SAP R/3?
1) DF1 >> SAP R/3 dataflow (R/3 table - query transformation - .dat file) >> query transformation >> target
This ABAP flow generates an ABAP program and a .dat file.
We can also upload this program and run jobs with the "Execute preloaded" option on the datastore.
This works fine.
2) We can also pull the SAP R/3 table directly:
DF2 >> SAP R/3 table (this has a red arrow, like in OHD) >> query transformation >> target
This also works fine, and we are able to see the data directly in Oracle.
This can also be scheduled as a job.
But I am unable to understand the purpose of the different types of data extraction flows:
when to use which type of flow for data extraction, and the
advantages/disadvantages of the two.
What we are not understanding is this:
if we can pull data from the R/3 table directly through a query transformation into the target table,
why use the flow of creating an R/3 dataflow,
then do a query transformation again,
and then populate the target database?
There might be practical reasons for using these two different types of flows for data extraction, which I would like to understand. Can anyone please advise?
    Many thanks
    indu
    Edited by: Indumathy Narayanan on Aug 22, 2011 3:25 PM

    Hi Jeff.
    Greetings. And many thanks for your response.
Generally we pull the entire SAP R/3 table through a query transformation into Oracle.
For this we use the R/3 dataflow and the ABAP program, which we upload to the R/3 system
so that we can use the "Execute preloaded" option and run the jobs.
Since we have no control over our R/3 servers, nor anyone for ABAP programming,
we do not do anything at the SAP R/3 level.
I was doing trial-and-error testing on our workflows for our new requirement.
WF 1 has some 15 R/3 tables.
For each table we have created a separate dataflow.
In some of the dataflows, for the SAP tables that had a lot of rows, I decided to pull the data directly,
bypassing the ABAP flow.
The entire workflow and data extraction still run fine.
In fact I created a new sample dataflow and tested it
using both direct download and execute preloaded.
I did not see any major difference in the time taken for the data extraction,
because we pull the entire table anyway, then choose whatever we want to bring into Oracle through a view for our BO reporting, or aggregate and bring the data in as a table for universe consumption.
Actually, I was looking at other options to avoid the ABAP generation and the R/3 dataflow because we are having problems in our dev and QA environments with delimiter errors, whereas in production it works fine. The production environment is an old setup of BODS 3.1; QA and dev are relatively new BODS environments, and they are the ones with the delimiter error.
I did not understand how to resolve it as per this post: https://cw.sdn.sap.com/cw/ideas/2596
While trying to resolve this problem, I ended up trying to pull the R/3 table directly, without the ABAP flow, just by trial and error with each drag-and-drop option, because we urgently had to do a POC and deliver the data for the entire SAP e-recruiting module.
I don't know whether I can do this direct pulling of data for the new job I have created,
which has 2 workflows with 15 dataflows in each workflow,
and push this job into production.
I also don't know whether I can bypass the ABAP flow and pull R/3 data directly in all dataflows in the future, for any of our SAP R/3 data extraction requirements. The technical difference between the two flows is not clear to us, and being new to ETL, I just wanted to understand the pros and cons of this particular kind of data extraction.
As advised, I shall check the schedules for a week, and then we shall probably move it into production.
    Thanks again.
    Kind Regards
    Indu
    Edited by: Indumathy Narayanan on Aug 22, 2011 7:02 PM

R/3 SP import: Open data extraction requests error

We are getting the error below when Basis is applying a support package upgrade in the R/3 source system.
    Open Data Extraction Requests
    The system has found a number of open data extraction requests. These
    should be processed before starting the object import process, as
    changes to DDIC structures could prevent data extraction requests from
    being read after the import, thus causing them to terminate. You can
    find more information about this problem in SAP Note 328181.
    Call the Customizing Cockpit data extraction transaction and process all
    open extraction requests.
Initially we cleared the entries in LBWQ, RSA7, and SM13, but after clearing these entries we still get the same error as above.
For a support package upgrade in R/3, do we also need to delete the setup tables in the production environment?
Is there any way to upgrade the support package in the R/3 system without deleting the setup tables?
Please help us with your inputs urgently.
    Thanks
    Message was edited by:
            Ganesh S

    Thanks Siggi for the suggestion.
We have already cleared the V3 updates by running the RMBWV* jobs manually. After clearing all the entries in LBWQ, RSA7, and SM13 we still got the same error as above.
When we imported into the R/3 development system after deleting the setup tables, we didn't get the error, but we are concerned about deleting the setup tables in the production system,
where the deltas are already running fine.
Is there any other way around this, or is deleting the setup tables the only option?
    Please help...

  • Data extraction from Oracle database

    Hello all,
I have to extract data from legacy database tables. I need to apply a lot of conditions to the extraction using SQL statements, to get only valid master data and transaction data, convert to the SAP date format, etc. Unfortunately I don't have the luxury of accessing the legacy system database tables to create table views and apply select statements.
Is there any other way I can filter the data on the source system side without getting into the legacy system, i.e., from the BW DataSource side?
I am supposed to test both UD Connect and DB Connect to see which works out better for the data extraction, but my question above applies equally to either interface.
This is very urgent as we are in the design phase.
    Points will be rewarded immediately.
    Thanks
    Message was edited by:
            Shail

Well, I and everyone else know that it can be done in BI.
I apologize for not mentioning that in my question.
I am looking for a very specific answer: is there any trick we can use on the source system side from BI, or anywhere we can insert SQL statements in the InfoPackage or DataSource?
    Thanks

Need data extract and load programs

Hi,
where can I find all the standard data extraction programs and standard load programs? Please, this is urgent.
Thank you

Thank you.
Can I get a list of the extraction programs and data load programs? Does SAP provide programs for extracting data for material master, customer, vendor, purchase order, and sales order? It also provides standard programs for data loads, right? Where can I find the list? I need to prepare a document with it.
    Heng

Open data extraction orders - applying support packs

    Dear All,
    I have done the IDES 4.6C SR2 installation.
While updating the support packs, I get the following message in the
CHECK_REQUIREMENTS phase:
    Open data extraction orders
    There are still open data extraction orders in the system
    process these before the start of the object import because changes to the ABAP Dictionary structures could lead to data extraction orders not being able to be read after the import and their processing terminating
    For more details about this problem, see Note 328181.
    Go to the Customizing cockpit for data extraction and start the processing of all open extraction orders.
I have checked the note,
but this is something I am facing for the first time.
Any suggestions?
    Rgds,
    NK

The exact message is:
    Phase CHECK_REQUIREMENTS: Explanation of the Errors
    Open Data Extraction Requests
    The system has found a number of open data extraction requests. These
    should be processed before starting the object import process, as
    changes to DDIC structures could prevent data extraction requests from
    being read after the import, thus causing them to terminate. You can
    find more information about this problem in SAP Note 328181.
    Call the Customizing Cockpit data extraction transaction and process all
    open extraction requests.

Bulk API V2.0 data extract support for additional objects (Campaign, Email, Form, FormData, LandingPage)?

    allison.moore
Are there any plans to add the following objects to Bulk API V2.0 for data extraction from Eloqua? Extracting the data for these objects using the REST API is complicated.

Thanks for the quick response. Extracting these objects using the REST API with depth=Complete poses a lot of complications from the code perspective, since these objects contain multiple nested or embedded objects. Is there any guideline on how to extract these objects using REST so that we can get all the data required for analysis/reporting?

  • Data Extraction and ODS/Cube loading: New date key field added

    Good morning.
Your expert advice is required on the following:
1) A data extract was previously done from a source with a full upload to the ODS and cube. An event is triggered from the source when data is available, and the process chain then first clears all the data in the ODS and cube and then reloads, activates, etc.
2) In the ODS, the 'forecast period' field has now been moved from the data fields to the key fields, as the user would like to report per period in future. In future the source will only provide the data for a specific period, not all the data as before.
3) Data must be appended in future.
4) The current InfoPackage for the ODS is a full upload.
5) The 'old' data in the ODS and cube must not be deleted, as the source cannot provide it again. They will report on the data per forecast period key in future.
    I am not sure what to do in BW as far as the InfoPackages are concerned, loading the data and updating the cube.
    My questions are:
    Q1) How will I ensure that BW will append the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
    Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
    Your assistance will be highly appreciated. Thanks
    Cornelius Faurie

    Hi Cornelius,
    Q1) How will I ensure that BW will append the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
-->> Load data into the ODS in overwrite mode with a full update, as before (this adds new records and overwrites previous records with the latest values). Then push the delta from this ODS to the cube.
If the existing ODS is loaded in addition mode, introduce one more ODS with the same granularity as the source, load it in overwrite mode (delta if possible, otherwise full), and push only the delta onward.
    Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
--> Yes, that is correct. Otherwise you would lose the historic data.
    Hope it Helps
    Srini

  • FI data extraction help

    HI All,
I have gone through the SDN link for FI extraction and found it very useful.
Still, I have some doubts.
For line item data extraction: do we need to extract data from 0FI_GL_4, 0FI_AP_4, and 0FI_AR_4 into ODS 0FIGL_O02 (General Ledger: Line Items)? If so, do we need to maintain transformations between the ODS and all three DataSources?
Also, please educate me on the 0FI_AP_3 and 0FI_AP_4 DataSources.

    >
    Jacob Jansen wrote:
    > Hi Raj.
    >
    > Yes, you should run GL_4 first. If not, AP_4 and AR_4 will be lagging behind. You can see in R/3 in table BWOM_TIMEST how the deltas are "behaving" with respect to the date selection.
    >
    > br
    > jacob
Not necessarily, for systems above Plug-In 2002.
    As of Plug-In 2002.2, it is no longer necessary to have DataSources linked. This means that you can now load 0FI_GL_4, 0FI_AR_4, 0FI_AP_4, and 0FI_TX_4 in any order. You also have the option of using DataSources 0FI_AR_4, 0FI_AP_4 and 0FI_TX_4 separately without 0FI_GL_4. The DataSources can then be used independently of one another (see note 551044).
Source: http://help.sap.com/saphelp_nw04s/helpdata/EN/af/16533bbb15b762e10000000a114084/frameset.htm

  • Generic Data Extraction From SAP R/3 to BI server using Function Module

    Hi,
I want a step-by-step procedure for generic extraction from an SAP R/3 system to a BI system
using a function module.
If anybody has a document or a PPT, please reply and post it in the forum; I will give points for it.
    Thanks & Regards
    Subhasis Pan

Please go through these links (a minimal function-module skeleton is also sketched below):
[SAP BI Generic Extraction Using a Function Module|https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/s-u/sap%20bi%20generic%20extraction%20using%20a%20function%20module.pdf]
[Generic Data Extraction Using Function Module|Re: Generic Data Extraction Using Function Module]
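For reference, here is a rough sketch of what such an extractor function module typically looks like, modeled on the SAP template RSAX_BIW_GET_DATA_SIMPLE (copy the template in SE37 and assign it to your DataSource in RSO2). The structure ZBW_ORDER_EXTRACT, the VBAK example, and the VBELN selection are placeholders; the documents linked above remain the authoritative reference.

FUNCTION z_biw_get_data_orders.
*"----------------------------------------------------------------------
*" Interface modeled on the template RSAX_BIW_GET_DATA_SIMPLE:
*"  IMPORTING
*"     VALUE(I_REQUNR)   TYPE SRSC_S_IF_SIMPLE-REQUNR
*"     VALUE(I_DSOURCE)  TYPE SRSC_S_IF_SIMPLE-DSOURCE  OPTIONAL
*"     VALUE(I_MAXSIZE)  TYPE SRSC_S_IF_SIMPLE-MAXSIZE  OPTIONAL
*"     VALUE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
*"  TABLES
*"     I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
*"     I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
*"     E_T_DATA   STRUCTURE ZBW_ORDER_EXTRACT    OPTIONAL
*"  EXCEPTIONS
*"     NO_MORE_DATA
*"     ERROR_PASSED_TO_MESS_HANDLER
*"----------------------------------------------------------------------
* ZBW_ORDER_EXTRACT is a placeholder for the extract structure of the
* DataSource defined in RSO2; VBAK is only an example source table.
  STATICS: s_cursor  TYPE cursor,
           s_counter TYPE i.

  DATA:    l_s_select TYPE srsc_s_select.
  RANGES:  r_vbeln FOR vbak-vbeln.

  IF i_initflag = 'X'.
*   Initialization call: nothing to extract yet.
    EXIT.
  ENDIF.

  IF s_counter = 0.
*   First data call: turn the BW selections into ranges and open a cursor.
    LOOP AT i_t_select INTO l_s_select WHERE fieldnm = 'VBELN'.
      MOVE-CORRESPONDING l_s_select TO r_vbeln.
      APPEND r_vbeln.
    ENDLOOP.
    OPEN CURSOR WITH HOLD s_cursor FOR
      SELECT vbeln erdat auart vkorg kunnr
        FROM vbak
        WHERE vbeln IN r_vbeln.
  ENDIF.

* Each call hands back at most I_MAXSIZE rows; when the cursor is
* exhausted, NO_MORE_DATA tells the service API to stop calling.
  FETCH NEXT CURSOR s_cursor
        APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
        PACKAGE SIZE i_maxsize.
  IF sy-subrc <> 0.
    CLOSE CURSOR s_cursor.
    RAISE no_more_data.
  ENDIF.
  s_counter = s_counter + 1.

ENDFUNCTION.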

Steps for data extraction from SAP R/3

    Dear all,
I am new to SAP BW.
I have done data extraction from Excel into the SAP BW system,
roughly like this:
Create InfoObjects > InfoArea > catalogs
                                            --> characteristic catalog
                                            --> key figure catalog
Create InfoSource
Upload data
Create InfoCube
I need similar steps for data extraction from SAP R/3:
1. when the data is in Z-tables (using views/InfoSets/function modules, etc.)
2. when the data is in standard SAP tables, using Business Content.
    Thanks and Regards,
    Gaurav Sood

Hi,
check these links:
    Generic Extraction
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/84bf4d68-0601-0010-13b5-b062adbb3e33
    CO-PA
    http://help.sap.com/saphelp_46c/helpdata/en/7a/4c37ef4a0111d1894c0000e829fbbd/content.htm
    CO-PC
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/fb07ab90-0201-0010-c489-d527d39cc0c6
Inventory
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f83be790-0201-0010-4fb0-98bd7c01e328
    Extractions in BI
    https://www.sdn.sap.com/irj/sdn/wiki
    LO Extraction:
    /people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
    /people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
    /people/sap.user72/blog/2005/02/14/logistic-cockpit--when-you-need-more--first-option-enhance-it
    /people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur
    /people/sap.user72/blog/2005/04/19/logistic-cockpit-a-new-deal-overshadowed-by-the-old-fashioned-lis
    Remya

  • How to add new fields to a data extract

The following data extract program generates an output file that is displayed using a Publisher template as a check format report:
Oracle Payments Funds Disbursement Payment Instruction Extract 1.0
How would I add new fields to the generated file? In other words, what is the procedure for adding new fields to the data extract?
    Thanks

Can anyone please advise how to customize the payment extraction program? We also have a similar requirement to extract extra fields to format a payment file.

  • 'Error 8 occurred when starting the data extraction program'

    Hello Experts,
I am trying to pull master data (full upload) for an attribute. I am getting an error on the BW side: 'The error occurred in Service API'.
So I checked the source system and found that an IDoc processing failure had occurred. The failure shows 'Error 8 occurred when starting the data extraction program'.
But when I check the extractor through RSA3, it looks fine.
Can someone tell me what might be the reason for the IDoc processing failure and how it can be avoided in future? The same problem kept occurring later as well.
    Thanks
    Regards,
    KP

    Hi,
Check whether the IDocs from SM58 of the source system are being processed fine into your target (BI) system.
In SM58, enter * for all users, give your BI system as the target destination, execute, and check whether any entries are pending there.
Regards,
    Edited by: Krishna Rao on May 6, 2009 3:22 PM

  • Vendor Master Data extraction???

    Hi,
    I need to extract Vendor Master Data from SAP into a flat file.
The format should be similar to the input file required by the vendor master upload program RFBIDE00.
    Is there any program which can be used to extract the data in the required format?
    Any help is appreciated.
    Thanks and Regards,
    Varun

Hi Varun,
check this link, and the extraction sketch below it:
    <a href="http://sap.ittoolbox.com/groups/technical-functional/sap-r3-dev/vendor-master-data-extract-from-r3-to-flat-file-944432#">VENDOR MASTER EXTRACTION</a>
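If no standard extraction program turns up, a small custom report is a common fallback. This is only a sketch, assuming general data from table LFA1 written tab-delimited to an application-server file; the path and field list are placeholders, and the output would still have to be mapped to the record layout your upload program expects.

REPORT z_vendor_extract.
* Sketch: dump basic vendor master data (LFA1) to a flat file on the
* application server. Path and field list are placeholders.

PARAMETERS p_file(128) TYPE c LOWER CASE DEFAULT '/tmp/vendor_extract.txt'.

TYPES: BEGIN OF ty_vendor,
         lifnr TYPE lfa1-lifnr,   " Vendor number
         name1 TYPE lfa1-name1,   " Name
         land1 TYPE lfa1-land1,   " Country
         ort01 TYPE lfa1-ort01,   " City
       END OF ty_vendor.

DATA: lt_vendors TYPE STANDARD TABLE OF ty_vendor,
      ls_vendor  TYPE ty_vendor,
      lv_line    TYPE string.

START-OF-SELECTION.
  SELECT lifnr name1 land1 ort01
    FROM lfa1
    INTO TABLE lt_vendors.

  OPEN DATASET p_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
  IF sy-subrc <> 0.
    MESSAGE 'Could not open the output file' TYPE 'E'.
  ENDIF.

  LOOP AT lt_vendors INTO ls_vendor.
*   Tab-delimited line; replace with the fixed-length layout your
*   upload program actually expects.
    CONCATENATE ls_vendor-lifnr ls_vendor-name1
                ls_vendor-land1 ls_vendor-ort01
           INTO lv_line SEPARATED BY cl_abap_char_utilities=>horizontal_tab.
    TRANSFER lv_line TO p_file.
  ENDLOOP.

  CLOSE DATASET p_file.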
<b>Don't forget to mark helpful answers.</b>
    Message was edited by: Ashok Kumar Prithiviraj
