Automatic Data Extract in ODI

Hi,
I want to set up automatic data extraction in ODI.
How do I achieve this?
Thanks.

Hi,
If I understand your requirement correctly, scheduling in ODI will let you run automatic data extraction at a specific time or interval.
Have a look:
http://www.oracle.com/technology/obe/fusion_middleware/odi/creating_scheduling_scenario/creating_scheduling_scenario.htm
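If you prefer to trigger the extraction from an external scheduler (cron or Windows Task Scheduler) rather than the built-in ODI scheduler, you can also launch the scenario with the startscen utility shipped with the agent. A minimal sketch in Python, assuming a scenario named LOAD_SALES, version 001, and context GLOBAL (all hypothetical names; the path to startscen varies by ODI release):

    # Minimal sketch: launch an ODI scenario from an external scheduler.
    # LOAD_SALES / 001 / GLOBAL are a hypothetical scenario name, version
    # and context; adjust the startscen path for your ODI release.
    import os
    import subprocess

    startscen = os.path.join(os.environ["ODI_HOME"], "bin", "startscen.sh")
    result = subprocess.run([startscen, "LOAD_SALES", "001", "GLOBAL"],
                            capture_output=True, text=True)
    print(result.stdout)
    if result.returncode != 0:
        raise RuntimeError("Scenario failed: " + result.stderr)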
Is that what you were looking for?
Thanks,
Guru

Similar Messages

  • How to automate data extraction from KSB1 and GR55 transaction codes

    Hi All,
    Can you please let me know if there is a way to automate data extraction from transaction codes KSB1 and GR55. I have to extract data from five different servers, i.e. a different server for each region, and within each region there are different controlling area codes. The details I use to extract the data follow. Extracting data from all these regions and controlling area codes with my parameters takes too long and is very time-consuming, so I want to automate the process. I am an end user, so I have no admin rights. Please let me know any workable solution as soon as possible.
    Production areas: PNA for Americas, PSI for Asia Pacific and Japan, PGY for Germany, PIT for Italy and PEU for Europe
    Controlling area codes in PNA: CAR for Argentina, CBR for Brazil, CMX for Mexico and CUS for USA. Similarly, there are other controlling area codes for the other production areas.
    Period: 1 to 12
    Fiscal year: 2009
    Cost centre group: G_6284
    Cost element group: 1742000000
    Please let me know in case you need more details.

    Hi,
    Here follows a translation from German:
    Enabling scripting in the SAP GUI (client) for Windows:
    Start SAP Logon and log on to the SAP server.
    Click the Customize Local Layout button on the toolbar.
    Click Options and then open the Scripting tab.
    Select the Enable Scripting checkbox.
    Clear the checkboxes "Notify when a script attaches to a running GUI" and "Notify when a script opens a connection".
    Save the settings and restart the SAP GUI.
    Enabling scripting on the SAP server:
    With the following procedure you can enable scripting on the SAP server temporarily; a value set this way is lost when the server restarts.
    Start SAP Logon and log on to the SAP server.
    Start transaction RZ11.
    Enter sapgui/user_scripting in the Maintain Profile Parameters window.
    Click Display.
    In the Display Profile Parameter Attributes window, click Change value.
    Enter TRUE in the New value field.
    Save the settings and log off from the SAP GUI.
    Quit SAP Logon.
    Note:
    If the server administrator has edited the application server profile of the SAP system to include sapgui/user_scripting = TRUE, scripting is enabled by default when the server restarts.
    SAP provides an option to change the network connection mode for any server. Two connection modes are available: high-speed connection (LAN) and low-speed connection. Although Functional Tester works in both modes, a script recorded in high-speed mode plays back only in that mode, and the same applies to the other mode: you must play back a SAP script in the same network connection mode in which it was recorded. The high-speed connection mode is recommended, as it offers a greater number of valid recognition properties.
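    Once scripting is enabled on both the client and the server, the extraction itself can be driven from any scripting host. Below is a minimal sketch in Python (via the pywin32 package) that attaches to a running, logged-on SAP GUI session and opens KSB1; the selection-screen field IDs and the export steps are omitted because they differ per screen, and you would record them once with the SAP GUI script recorder. The loop over controlling areas is illustrative.

        # Minimal sketch: drive a logged-on SAP GUI session from Python
        # (pywin32). Assumes SAP Logon is running, you are logged on, and
        # scripting is enabled on client and server as described above.
        import win32com.client

        sap_gui = win32com.client.GetObject("SAPGUI")    # running SAP GUI
        engine = sap_gui.GetScriptingEngine              # scripting engine
        session = engine.Children(0).Children(0)         # 1st connection, 1st session

        for area in ["CAR", "CBR", "CMX", "CUS"]:        # PNA controlling areas
            session.findById("wnd[0]/tbar[0]/okcd").text = "/nKSB1"
            session.findById("wnd[0]").sendVKey(0)       # Enter: open KSB1
            # ...fill the selection screen here (controlling area, period 1-12,
            # fiscal year 2009, cost centre group, cost element group); record
            # the field IDs once with the script recorder, then export the list.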
    Regards,
    ScriptMan
    Edited by: ScriptMan on Apr 13, 2010 12:32 PM

  • QUESTION: Essbase data extraction and installing the ODI Agent?

    For extracting data from Essbase cubes, ODI has the "LKM Hyperion Essbase DATA to SQL".
    We can use (1) a report script, (2) an MDX query, or (3) a calc script.
    For data extraction using a calc script, the ODI Agent must be running on the same server as the Essbase server.
    Does anyone know whether there is a need for an ODI Agent on the Essbase machine if we use the MDX query method for data extraction?
    We would like to avoid installing an ODI Agent for Essbase data extraction.
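    For reference, a minimal sketch of the kind of MDX statement the LKM would run against the cube when the MDX query option is chosen; the cube and member names are illustrative (Sample.Basic style), wrapped in a Python string here only so it can be parameterized per run:

        # Illustrative MDX extraction query for "LKM Hyperion Essbase DATA
        # to SQL" in MDX mode; cube and member names are placeholders.
        mdx_query = """
        SELECT
          {[Measures].[Sales]} ON COLUMNS,
          CROSSJOIN([Year].Children, [Market].Children) ON ROWS
        FROM [Sample].[Basic]
        """
        print(mdx_query)  # supply this text in the interface/KM option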

    Thanks John.
    One related question: to move data from one Essbase cube to another Essbase cube using an ODI interface, can we do it efficiently through an MDX query?
    We want to avoid replicated partitioning or calc scripts, if possible.
    BTW, your ODI/Hyperion blog is a bible for us.

  • Data extraction with PL/SQL

    Hi Expert
    My customer wants to use PL/SQL to extract SAP data directly from the Oracle database. He does not want to use ABAP code for this.
    In my opinion this is not the right approach, but I have no solid argument.
    Could anyone explain to me why it is not advisable to read the Oracle database directly for this data extraction?
    Best regards

    Hi,
    PL/SQL (Native SQL) statements bypass the R/3 database interface: there is no table logging and no synchronization with the database buffer on the application server. For this reason you should, wherever possible, use Open SQL (ABAP SQL) to access database tables declared in the ABAP Dictionary. In particular, tables declared in the ABAP Dictionary that contain long columns with the types LCHR or LRAW should only be addressed using Open SQL, since those columns carry extra, database-specific length information which Native SQL does not take into account and which can therefore produce incorrect results. Furthermore, Native SQL does not support automatic client handling; you must treat the client field (MANDT) like any other column.
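    To make the client-handling point concrete, here is a minimal sketch of what direct database access forces you to do, using the python-oracledb driver purely as an illustration (the connection details and schema name are hypothetical; KNA1 and MANDT are the standard SAP table and client field): every query must filter on the client itself, which Open SQL adds automatically.

        # Minimal sketch: direct Oracle access to an SAP table must handle
        # the client (MANDT) column manually; Open SQL does this for you.
        # Connection details and the schema name are hypothetical.
        import oracledb

        conn = oracledb.connect(user="extract", password="secret",
                                dsn="saphost:1521/PRD")
        with conn.cursor() as cur:
            # Without the MANDT predicate you would silently read rows
            # from every client in the system.
            cur.execute(
                "SELECT kunnr, name1 FROM sapsr3.kna1 WHERE mandt = :client",
                client="100")
            for kunnr, name1 in cur:
                print(kunnr, name1)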
    I think this will be useful for you.

  • How to create data stores in ODI ?

    Hi all,
    I am new to the ODI part. Can anyone please help me with how to create data stores in ODI?
    A prompt reply will be highly appreciated.
    Thanks
    Saurabh.

    What do you mean by "create datastores"?
    If you mean you want to reverse engineer existing tables from a database, then the phrase used in the ODI docs is "reverse engineering". If you mean to create new tables in a database, then:
    1) ODI is not meant to be a database design tool.
    2) Using the "diagrams" node under a data model, you can use the "Common Format Designer" (CFD) tool to design and create the structure. The CFD tool is a simple ER-diagram tool but, importantly, if you drag structures in from one model to another, it remembers where they came from, allowing automatic generation of interfaces, and it automatically translates the data types.

  • Data extraction or IDoc to flat file

    Hi,
    I have a project to create a flat file from SAP for an external legacy system. Which approach should I take: simple data extraction or IDoc to flat file?
    There are three requirements:
    1. On the first run, extract all data.
    2. On subsequent runs, extract only changed and new records.
    3. If a record is deleted in the SAP table, mark it as deleted in the flat file.
    What approach should I take if I use data extraction?
    Thanks.

    Reading your question, my first thought is to look at where the data is going: what are the data requirements of the legacy system?
    IDocs can speed up the development related to pushing the data out from SAP. Using ALE and change pointers, you can automatically pass out the delta with a limited amount of development.
    However, the receiving system then needs to parse the IDoc data. Depending on the IDoc you are working with, this can be a challenge, especially if the legacy developer is not familiar with IDocs.
    Sometimes it is easier to collect and write the data from SAP using "simple data extraction": the data is more readily organized into the format the receiving system is expecting. A sketch of the delta logic for this approach follows below.
    You can also pass the IDoc to a middleware mapping application, if one is available, and do the SAP-to-legacy mapping there.
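    For the "changed/new/deleted" requirements with simple extraction, the usual trick is to keep the previous full extract and diff it against the current one. A minimal sketch, assuming each record has a stable key column and the extracts are plain CSV files (all file names and column names are hypothetical):

        # Minimal sketch: diff two full extracts and flag records as
        # new (N), changed (C) or deleted (D) for the legacy flat file.
        # File names and the key/value column layout are hypothetical.
        import csv

        def load_snapshot(path):
            with open(path, newline="") as f:
                return {row["key"]: row for row in csv.DictReader(f)}

        previous = load_snapshot("extract_previous.csv")
        current = load_snapshot("extract_current.csv")

        with open("delta.csv", "w", newline="") as out:
            writer = csv.writer(out)
            for key, row in current.items():
                if key not in previous:
                    writer.writerow([key, row["value"], "N"])   # new
                elif row != previous[key]:
                    writer.writerow([key, row["value"], "C"])   # changed
            for key, row in previous.items():
                if key not in current:
                    writer.writerow([key, row["value"], "D"])   # deleted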
    Cheers

  • SAP Data Extraction requirements - mySAP SCM SPP/ ERP

    Hello All,
    Could you please explain to me how to start on this requirement?
    Business Summary:
    MM Planning needs a substantial range of master and transactional data extracted from the SAP system for reporting and analysis purposes. A list of the data requested for extraction for SCM follows.
    Functional Description:
    There is a general MM Planning requirement to have most master and transactional data extracted from SAP APO for reporting and analysis purposes, in order to support the following business tasks:
    • process evaluation and optimization
    • error analyses
    • ad-hoc reporting
    • planning processes out of scope for the SCM launch (e.g. the ATR process)
    The data needs to be extracted automatically on a regular basis (daily/weekly/monthly/yearly; this will be defined in the next steps for every single set of data) and made available in a structured environment to the SCM Planning Team for the above-mentioned processes.
    The requirement for data availability outside of SAP APO comprises both current data and historical data.
    FoE: today no "final" report is generated by the SCM Planning team directly within the current data warehouse environment. Most data are extracted from the current data warehouse to external tools. The current assumption is that this will remain unchanged.
    Some regular reports (MBB, Inventory Dashboard) may be developed directly in the Business Warehouse in future if this offers any improvement (flexibility, design, handling). This will be investigated by the SCM Planning Team after go-live.
    FUS: for the analyses and the planning processes that are out of scope, the data will need to be extracted so that it is available for use in analytical tools powerful enough to process the data and the appropriate calculations.
    Regular reports may be developed directly in the BW if it is determined that that is the most appropriate location and tool. Otherwise, the data will need to be integrated into reports generated using tools outside of SAP. That determination will need to be made based on the type of data being reported.
    Thanks & Regards
    PRadeep
    Message was edited by:
            Pradeep Reddy

    Hi,
    You can download all the master guides from the SAP Service Marketplace (http://service.sap.com/instguides).
    Cheers,
    Mike.

  • BODS 3.1: SAP R/3 data extraction - what is the difference between the two dataflows?

    Hi.
    Can anyone advise what the difference is between the two data extraction flows for extracting data from SAP R/3?
    1) DF1 >> SAP R/3 (R/3 table - query transformation - .dat file) >> query transformation >> target
    This ABAP dataflow generates an ABAP program and a .dat file.
    We can also upload this program and run jobs with the "execute preloaded" option on the datastore.
    This works fine.
    2) We can also pull the SAP R/3 table directly:
    DF2 >> SAP R/3 table (this has a red arrow, as in OHD) >> query transformation >> target
    This also works fine, and we are able to see the data directly in Oracle.
    This can also be scheduled as a job.
    But I am unable to understand the purpose of the different types of data extraction flows:
    when to use which type of flow for data extraction,
    and the advantages/disadvantages of the two dataflows.
    What we do not understand is this:
    if we can pull data directly from an R/3 table through a query transformation into the target table,
    why use the flow that creates an R/3 dataflow,
    does a query transformation again,
    and then populates the target database?
    There might be practical reasons for using these two different types of flows for data extraction, which I would like to understand. Can anyone advise, please?
    Many thanks
    indu
    Edited by: Indumathy Narayanan on Aug 22, 2011 3:25 PM

    Hi Jeff,
    Greetings, and many thanks for your response.
    Generally we pull the entire SAP R/3 table through a query transformation into Oracle.
    For this we use an R/3 dataflow and the ABAP program, which we upload to the R/3 system
    so that we can use the "execute preloaded" option and run the jobs.
    Since we have no control over our R/3 servers, and no one doing ABAP programming,
    we do not do anything at the SAP R/3 level.
    I was doing trial-and-error testing on our workflows for our new requirement.
    WF 1 has some 15 R/3 tables.
    For each table we have created a separate dataflow.
    In some of the dataflows, where the SAP tables had a lot of rows, I decided to pull the data directly,
    bypassing the ABAP flow,
    and the entire workflow and data extraction still runs fine.
    In fact, I created a new sample dataflow and tested
    both direct download and "execute preloaded".
    I did not see any major difference in the time taken for data extraction,
    because either way we pull the entire table, then choose whatever we want to bring into Oracle through a view for our BO reporting, or aggregate it and bring the data in as a table for universe consumption.
    Actually, I was looking at other options to avoid the ABAP generation and the R/3 dataflow because our dev and QA environments give delimiter errors, whereas production works fine. The production environment is an old setup of BODS 3.1; QA and dev are relatively new BODS environments, and they are the ones with the delimiter error.
    I did not understand how to resolve it from this post: https://cw.sdn.sap.com/cw/ideas/2596
    Trying to resolve this problem, I ended up pulling the R/3 table directly, without the ABAP workflow, just by trial and error with every drag-and-drop option, because we urgently had to do a POC and deliver the data for the entire e-recruiting module of SAP.
    I do not know whether I can use this direct pull of data for the new job I have created,
    which has 2 workflows with 15 dataflows in each workflow,
    and push this job into production;
    nor whether I can bypass the ABAP flow and pull R/3 data directly in all dataflows for any of our future SAP R/3 data extraction requirements. The technical difference between the two flows is not clear to us, and being new to the whole of ETL, I just wanted to know the pros and cons of this particular kind of data extraction.
    As advised, I shall check the schedules for a week, and then we shall probably move it into production.
    Thanks again.
    Kind Regards
    Indu
    Edited by: Indumathy Narayanan on Aug 22, 2011 7:02 PM

  • Open data extraction orders -  Applying Support Packs

    Dear All,
    I have done the IDES 4.6C SR2 installation.
    While applying the support packages, I get a message in the CHECK_REQUIREMENTS phase:
    Open data extraction orders
    There are still open data extraction orders in the system. Process these before the start of the object import, because changes to the ABAP Dictionary structures could lead to data extraction orders no longer being readable after the import and their processing terminating.
    For more details about this problem, see Note 328181.
    Go to the Customizing Cockpit for data extraction and start the processing of all open extraction orders.
    I have checked the note, but this is something I am facing for the first time.
    Any suggestions?
    Rgds,
    NK

    The exact message is:
    Phase CHECK_REQUIREMENTS: Explanation of the Errors
    Open Data Extraction Requests
    The system has found a number of open data extraction requests. These
    should be processed before starting the object import process, as
    changes to DDIC structures could prevent data extraction requests from
    being read after the import, thus causing them to terminate. You can
    find more information about this problem in SAP Note 328181.
    Call the Customizing Cockpit data extraction transaction and process all
    open extraction requests.

  • Bulk API v2.0 data extract support for additional objects (Campaign, Email, Form, FormData, LandingPage)?

    allison.moore
    Are there any plans to add the objects above to Bulk API v2.0 for data extraction from Eloqua? Extracting the data for these objects using the REST API is complicated.

    Thanks for the quick response. Extracting these objects using the REST API with depth=complete poses a lot of complications from the code perspective, since these objects contain multiple nested or embedded objects. Is there any guideline on how to extract these objects using REST so that we can get all the data required for analysis/reporting?
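    As a rough sketch of one REST approach: page through the asset list with a shallow depth, then fetch each item with depth=complete and flatten the nested objects you need. The endpoint shapes below follow the Eloqua Application REST API; the base URL, credentials and the choice of the email asset are placeholders:

        # Rough sketch: pull Eloqua email assets via the REST API, then
        # expand each with depth=complete. Base URL, credentials and the
        # asset type are placeholders; nested objects still need flattening.
        import requests

        BASE = "https://secure.p01.eloqua.com/api/REST/2.0"
        auth = ("SiteName\\user.name", "password")   # Eloqua basic-auth format

        page = 1
        while True:
            listing = requests.get(f"{BASE}/assets/emails", auth=auth,
                                   params={"depth": "minimal",
                                           "page": page, "count": 100})
            listing.raise_for_status()
            body = listing.json()
            for item in body["elements"]:
                detail = requests.get(f"{BASE}/assets/email/{item['id']}",
                                      params={"depth": "complete"},
                                      auth=auth).json()
                print(detail["id"], detail["name"])  # flatten nested parts here
            if page * 100 >= int(body["total"]):
                break
            page += 1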

  • Data Extraction and ODS/Cube loading: New date key field added

    Good morning.
    Your expert advise is required with the following:
    1. A data extract was previously done from a source with a full upload to the ODS and cube. An event is triggered from the source when data is available, and the process chain then first clears all the data in the ODS and cube and then reloads, activates, etc.
    2. In the ODS, the 'forecast period' field has now been moved from the data fields to the key fields, as the user would like to report per period in future. In future the source will only provide the data for a specific period, not all the data as before.
    3. Data must be appended in future.
    4. The current InfoPackage for the ODS is a full upload.
    5. The 'old' data in the ODS and cube must not be deleted, as the source cannot provide it again. They will report on the data per forecast period key in future.
    I am not sure what to do in BW as far as the InfoPackages are concerned, loading the data and updating the cube.
    My questions are:
    Q1) How will I ensure that BW will append the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
    Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
    Your assistance will be highly appreciated. Thanks
    Cornelius Faurie

    Hi Cornelius,
    Q1) How will I ensure that BW will append the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
    -->> Try to load the data into the ODS in overwrite mode with a full update, as before (this adds new records and overwrites previous records with the latest values). Push the delta from this ODS to the cube.
    If the existing ODS loads in addition mode, introduce one more ODS with the same granularity as the source, load it in overwrite mode (delta if possible, otherwise full), and push only the delta onward.
    Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
    --> Yes, that is correct. Otherwise you would lose the historic data.
    Hope it Helps
    Srini

  • FI data extraction help

    Hi All,
    I have gone through the SDN link for FI extraction and found it very useful.
    Still, I have some doubts...
    For line item data extraction, do we need to extract data from 0FI_GL_4, 0FI_AP_4 and 0FI_AR_4 into the ODS 0FIGL_O02 (General Ledger: Line Items)? If so, do we need to maintain a transformation between the ODS and all three DataSources?
    Also, please educate me on the 0FI_AP_3 and 0FI_AP_4 DataSources.

    >
    Jacob Jansen wrote:
    > Hi Raj.
    >
    > Yes, you should run GL_4 first. If not, AP_4 and AR_4 will be lagging behind. You can see in R/3 in table BWOM_TIMEST how the deltas are "behaving" with respect to the date selection.
    >
    > br
    > jacob
    Not necessarily for systems on Plug-In 2002 or later.
    As of Plug-In 2002.2, it is no longer necessary to have the DataSources linked. This means that you can now load 0FI_GL_4, 0FI_AR_4, 0FI_AP_4, and 0FI_TX_4 in any order. You also have the option of using DataSources 0FI_AR_4, 0FI_AP_4 and 0FI_TX_4 separately, without 0FI_GL_4. The DataSources can then be used independently of one another (see Note 551044).
    Source: http://help.sap.com/saphelp_nw04s/helpdata/EN/af/16533bbb15b762e10000000a114084/frameset.htm

  • SRM MDM: workflow does not launch during automatic data transfer

    Hi,
    We are trying to import some data from R/3 4.6C by configuring a remote system as ERP and creating a port based on the XML schema in the SRM-MDM Catalog. We have created a workflow to validate the accuracy of the imported data in the Data Manager. Everything goes well up to this point: the data is transferred into the Data Manager automatically. The only issue is with the workflow.
    The workflow set in the Data Manager does not launch automatically, even though the record arrives correctly via the automatic transfer. I have set the workflow name correctly in the Import Manager while saving the map file for automatic import.
    Please share your input on why the workflow does not launch on automatic data transfer from the specified port.
    Thanks/Pawan

    Hi Pawan,
    please check in the Data Manager that you have selected the right trigger actions for the workflow, such as Record Import, and that Autolaunch is also set correctly.
    Regards,
    Tamá

  • Generic Data Extraction From SAP R/3 to BI server using Function Module

    Hi,
    I want a step-by-step procedure for generic extraction from an SAP R/3 application to a BI application using a function module.
    If anybody has a document or a PPT, please reply and post it in the forum; I will give points for it.
    Thanks & Regards
    Subhasis Pan

    Please go through this link:
    [SAP BI Generic Extraction Using a Function Module|https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/s-u/sap%20bi%20generic%20extraction%20using%20a%20function%20module.pdf]

  • Steps for Data extraction from SAP r/3

    Dear all,
    I am new to SAP BW.
    I have done data extraction from Excel into the SAP BW system, along these lines:
    Create InfoObjects > InfoArea > catalogs
                                                --> characteristic catalog
                                                --> key figure catalog
    Create an InfoSource.
    Upload the data.
    Create an InfoCube.
    I need similar steps for data extraction from SAP R/3:
    1. when the data is in Z-tables (using views/InfoSets/function modules, etc.);
    2. when the data is in standard SAP tables, using Business Content.
    Thanks and Regards,
    Gaurav Sood

    Hi,
    check these links:
    Generic Extraction
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/84bf4d68-0601-0010-13b5-b062adbb3e33
    CO-PA
    http://help.sap.com/saphelp_46c/helpdata/en/7a/4c37ef4a0111d1894c0000e829fbbd/content.htm
    CO-PC
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/fb07ab90-0201-0010-c489-d527d39cc0c6
    Inventory
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f83be790-0201-0010-4fb0-98bd7c01e328
    Extractions in BI
    https://www.sdn.sap.com/irj/sdn/wiki
    LO Extraction:
    /people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
    /people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
    /people/sap.user72/blog/2005/02/14/logistic-cockpit--when-you-need-more--first-option-enhance-it
    /people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur
    /people/sap.user72/blog/2005/04/19/logistic-cockpit-a-new-deal-overshadowed-by-the-old-fashioned-lis
    Remya
