LO Cockpit Data Extraction (2LIS_11_VASTH)

Hi Experts,
Could you please advise how to proceed with LO Cockpit data extraction?
DataSource: 2LIS_11_VASTH
I activated the DataSource in R/3 and
activated the relevant BW Content.
I performed an initialization, which fetched 0 records to BW.
I performed a delta, which also fetched 0 records to BW.
I checked RSA3 in R/3: 2LIS_11_VASTH extracted 0 records.
I checked the setup table for 2LIS_11_VASTH, which also showed 0 records.
Please explain in detail how to delete and fill the setup tables, and
do I need to perform a delta in BW after filling the setup tables in R/3?
Thanks

Hi,
Filling the setup tables
Go to transaction SBIW --> Settings for Application-Specific DataSources --> Logistics --> Managing Extract Structures --> Initialization --> Filling in the Setup Table --> Application-Specific Setup of Statistical Data --> Perform Setup (relevant application).
In OLI*** (for example, OLI7BW for the statistical setup of old documents: orders), give the run a name and execute. All the available records from R/3 will then be loaded into the setup tables.
Deleting the setup tables
Use transaction LBWG (Delete Setup Data) and delete the data by entering the application.
Delta settings
Go to transaction LBWE and make sure the update mode for the corresponding DataSource is serialized V3 update.
Go to the BW system, create an InfoPackage, and under the Update tab select "Initialize delta process". Schedule the package. All the data available in the setup tables is then loaded into the data target.
For the delta records, go to LBWE in R/3 and change the update mode for the corresponding DataSource to direct or queued delta. Records then bypass SM13 and go directly to RSA7. In transaction RSA7 you should see a green status; as soon as new records are posted you can see them in RSA7.
Go to BW system and create a new infopackage for delta loads. Double click on new infopackage. Under update tab you can see the delta update radio button.
Now you can go to your data target and see the delta record.
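This is consistent with your init fetching 0 records: if the setup table is empty, the full/init load has nothing to read. Before scheduling the init InfoPackage you can quickly verify in the source system that the setup run (OLI7BW) actually wrote something. A minimal check sketch; the table name MC11VA0STHSETUP is an assumption (setup tables are normally the extract structure name, here MC11VA0STH, plus the suffix SETUP), so verify it in SE11 first:
  REPORT z_check_vasth_setup.

  DATA: lv_count TYPE i.

  " Count the entries the statistical setup run (OLI7BW) wrote for
  " 2LIS_11_VASTH. Table name is an assumption - verify in SE11.
  SELECT COUNT( * ) FROM mc11va0sthsetup INTO lv_count.

  IF lv_count = 0.
    WRITE: / 'Setup table is empty - run the setup (OLI7BW) before the init load.'.
  ELSE.
    WRITE: / 'Setup table contains', lv_count, 'records - schedule the init InfoPackage.'.
  ENDIF.
And to your last question: after filling the setup tables you run the init InfoPackage first; the subsequent delta InfoPackages then pick up new postings from the delta queue (RSA7).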
For delta-related information, go through this link:
SAP Network Blog: Gist of LO Cockpit Delta - /people/happy.tony/blog/2006/11/24/gist-of-lo-cockpit-delta
Regards
KP
Please go through the below link for more details
http://www.sap-img.com/business/lo-cockpit-step-by-step.htm

Similar Messages

  • Reasons for Posting Free Period in LO Cockpit Data Extraction

    Dear Experts,
    I am quite new to the concept of LO Cockpit, but I have learnt that we need to stop posting transactions in R/3 while the setup tables are being filled. I would really like to know the reasons for this, so I can convince my customers.
    1) Can anyone give me the exact reasons, with some example cases?
    Some sources mention that if we don't stop posting documents, there may be duplicate records (in the setup table and in the delta queue). Is this true?
    2) Can anyone give me the steps for extracting LO Cockpit data? When should the R/3 system be frozen, and when should it be unfrozen? Why do we have to use an initialization without data transfer? Please give me the steps in detail.
    Thank you so much

    Hi,
    If the users don't stop posting while you are filling the setup tables, you will lose data. For example, if you are filling the setup table for a sales order and a user is changing that order at the same time, the setup run may pick up the earlier version or miss the sales order entirely, and the changes are lost.
    You usually freeze the R/3 system when you are ready to fill the setup tables: users are locked out so no transaction posting takes place, and all batch jobs that update the tables are suspended.
    You usually do an init without data transfer so as not to hold up the system, since the setup tables might contain a lot of records. This reduces the R/3 downtime.
    Here's a thread that gives you detailed LO information :
    LO step by step procedure
    Cheers,
    Kedar

  • LO-COCKPIT  data extraction

    for DataSource= 2LIS_02_ITM
    I am running into some difficulties.
    After replication, when assigning the InfoSource to the DataSource, there are a few fields for which the transfer rules are not proposed automatically. I found them in the Metadata Repository, but the problem remains.
    The error says that the InfoObjects do not exist in the 2LIS_02_ITM DataSource.
    The info objects are,
    0INV_RE_IND,
    0CT_FLAG,
    0GN_R3_SSY.
    Please tell me if you know.
    Points will be rewarded.
    -Harshal

    If the InfoObjects exist, then the mapping in the transfer rules needs to be checked. If you need the fields for which the error is thrown, try to map them to the right InfoObjects; if you don't need them, you don't have to map them and can still activate the transfer rules.
    Even if the transfer structure is yellow it shouldn't be a problem; as long as it is active you should be good to go.
    Message was edited by:
            Manga

  • Open data extraction orders -  Applying Support Packs

    Dear All,
    I have done the IDES 4.6C SR2 installation.
    While updating the support packages, I get the following message in the
    CHECK_REQUIREMENTS phase:
    Open data extraction orders
    There are still open data extraction orders in the system
    process these before the start of the object import because changes to the ABAP Dictionary structures could lead to data extraction orders not being able to be read after the import and their processing terminating
    For more details about this problem, see Note 328181.
    Go to the Customizing cockpit for data extraction and start the processing of all open extraction orders.
    I have checked the Note.
    But this is something I am facing for the first time.
    Any suggestion!!!
    Rgds,
    NK

    The exact message is :
    Phase CHECK_REQUIREMENTS: Explanation of the Errors
    Open Data Extraction Requests
    The system has found a number of open data extraction requests. These
    should be processed before starting the object import process, as
    changes to DDIC structures could prevent data extraction requests from
    being read after the import, thus causing them to terminate. You can
    find more information about this problem in SAP Note 328181.
    Call the Customizing Cockpit data extraction transaction and process all
    open extraction requests.

  • R/3 SP Import:Open data Extraction requests Error

    We are getting the error below while Basis is applying a support package upgrade in the R/3 source system.
    Open Data Extraction Requests
    The system has found a number of open data extraction requests. These
    should be processed before starting the object import process, as
    changes to DDIC structures could prevent data extraction requests from
    being read after the import, thus causing them to terminate. You can
    find more information about this problem in SAP Note 328181.
    Call the Customizing Cockpit data extraction transaction and process all
    open extraction requests.
    Initially we cleared the entries in LBWQ, RSA7 and SM13, but even after clearing them we still get the same error.
    For a support package upgrade in R/3, do we also need to delete the setup tables in the production environment?
    Is there any way to upgrade the support package in the R/3 system without deleting the setup tables?
    Please help us with your inputs urgently.
    Thanks
    Message was edited by:
            Ganesh S

    Thanks Siggi for the suggestion.
    We have already cleared the V3 updates by running the RMBWV* job manually. After clearing all the entries in LBWQ, RSA7 and SM13 we still got the same error.
    When we imported into the R/3 development system after deleting the setup tables, we did not get the error. But we are concerned about deleting the setup tables in the production system,
    even though the deltas are already running fine.
    Is there any other way around this, or is deleting the setup tables the only option?
    Please help...
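    One way to double-check from ABAP what is still open before the import is a rough sketch like the following. The queue-name pattern MCEX% is an assumption (compare it with the names shown in LBWQ on your system); RMCEXCHK is the check program quoted in the long text of the error.
      REPORT z_check_open_extraction.

      DATA: lv_luws TYPE i.

      " Count LUWs still waiting in the LO extraction queues (what LBWQ shows).
      " The MCEX% pattern is an assumption - compare with LBWQ on your system.
      SELECT COUNT( * ) FROM trfcqout INTO lv_luws
        WHERE qname LIKE 'MCEX%'.
      WRITE: / 'Open extraction-queue LUWs:', lv_luws.

      " Re-running the check program after clearing LBWQ / RSA7 / SM13
      " shows which application is still blocking the import.
      SUBMIT rmcexchk AND RETURN.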

  • SPAM/SAINT - Open Data Extraction Requests

    Hello all, I am getting a warning while implementing patches for NetWeaver 7.01 SP3 and SP4.
    "  Phase CHECK_REQUIREMENTS: Explanation of Errors
    Open Data Extraction Requests
    The system has found a number of open data extraction requests. These
    should be processed before starting the object import process, as
    changes to DDIC structures could prevent data extraction requests from
    being read after the import, thus causing them to terminate. You can
    find more information about this problem in the SAP Notes 1081287,
    1083709 and 328181.
    Call the Customizing Cockpit data extraction transaction and process all "
    I checked the listed notes and implemented them, but LBWE is still showing something. I deleted all setup data for all applications using transaction LBWG, but it still shows up in LBWE.
    This is the error I am getting after running program RMCEXCHK:
    Struct. appl. 04 cannot be changed due to setup table -> Long text
    Message no. MCEX141
    Diagnosis
    Changing the extract structure  for application 04 is not permitted, because the restructure table MC04P_0ARBSETUP for the extractor still contains data in 800.
    You cannot change the structure in this status, because when you load an InfoPackage from BW, this leads to errors.
    Procedure
    Delete the entries for all restructure tables for application 04.
    Any reply will be highly appreciated.
    Mani

    Hi Mani
    I am also facing the same issue. Can you please tell me how you have resolved the issue.
    Thanks & Regards
    Venkat
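    A quick way to confirm whether the restructure table for application 04 still holds data (the table name MC04P_0ARBSETUP comes from the MCEX141 long text above) is a check like this; if the count is not zero, delete the setup data for application 04 via LBWG and run RMCEXCHK again. A minimal sketch:
      REPORT z_check_app04_setup.

      DATA: lv_count TYPE i.

      " MC04P_0ARBSETUP is the restructure (setup) table named in message MCEX141.
      " If it still holds records, delete the setup data for application 04 via LBWG.
      SELECT COUNT( * ) FROM mc04p_0arbsetup INTO lv_count.
      WRITE: / 'Records left in MC04P_0ARBSETUP:', lv_count.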

  • Steps for Data extraction from SAP r/3

    Dear all,
    I am new to SAP BW.
    I have done data extraction from Excel into the SAP BW system,
    that is:
    Create InfoObjects --> InfoArea --> catalogs (characteristic catalog, key figure catalog)
    Create InfoSource
    Upload data
    Create InfoCube
    I need similar steps for data extraction from SAP R/3:
    1. When the data is in Z-tables (using views/InfoSets/function modules, etc.)
    2. When the data is in standard SAP tables, using Business Content.
    Thanks and Regards,
    Gaurav Sood

    Hi,
    Check these links:
    Generic Extraction
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/84bf4d68-0601-0010-13b5-b062adbb3e33
    CO-PA
    http://help.sap.com/saphelp_46c/helpdata/en/7a/4c37ef4a0111d1894c0000e829fbbd/content.htm
    CO-PC
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/fb07ab90-0201-0010-c489-d527d39cc0c6
    iNVENTORY
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f83be790-0201-0010-4fb0-98bd7c01e328
    Extractions in BI
    https://www.sdn.sap.com/irj/sdn/wiki
    LO Extraction:
    /people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
    /people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
    /people/sap.user72/blog/2005/02/14/logistic-cockpit--when-you-need-more--first-option-enhance-it
    /people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur
    /people/sap.user72/blog/2005/04/19/logistic-cockpit-a-new-deal-overshadowed-by-the-old-fashioned-lis
    Remya

  • Need help for SRM-6 Data Extraction into Bi-7

    Hi Experts,
    I am looking for help regarding SRM DataSource extraction to BW. I am working on a BW project and need to extract data from the SRM DataSources 0BBP_TD_CONTR_2 and 0SRM_TD_PRO, so I need to know about the extraction process from SRM. Is there a tool like the LO Cockpit available in SRM for extraction, and how can we manage the delta with it? Or can I use transaction RSA5 and follow the normal process? If I could get some documentation giving me an idea of data extraction and delta management in SRM, that would be appreciated. Any kind of help is welcome.
    Thanks in Advance.

    Hi,
    These may help you:
    How to extract data from SRM into BW system.
    SRM Audit Management Data Extraction into BW
    Procedure for standard data source extraction from SRM to BW 3.5.
    Regards,
    Suman

  • Need help for SRM Data Extraction into BI-7

    Hi Experts,
    I am looking for help regarding SRM DataSource extraction to BW. I am working on a BW project and need to extract data from the SRM DataSources 0BBP_TD_CONTR_2 and 0SRM_TD_PRO, so I need to know about the extraction process from SRM. Is there a tool like the LO Cockpit available in SRM for extraction, and how can we manage the delta with it? Or can I use transaction RSA5 and follow the normal process? If I could get some documentation giving me an idea of data extraction and delta management in SRM, that would be appreciated. Any kind of help is welcome.
    Thanks in Advance.

    SRM doesn't have an LO Cockpit-like mechanism as ECC does.
    Overall picture:
    http://help.sap.com/saphelp_srm40/helpdata/en/b3/cb7b401c976d1de10000000a1550b0/content.htm
    If you set up the data flows according to the SRM Business Content:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/3a/7aeb3cad744026e10000000a11405a/frameset.htm
    Then just perform an init load and schedule the deltas.

  • How to identify whether the data extracted is direct, queued, unserialized

    hi,
    How can I identify whether the data extraction from R/3 uses direct, queued, or unserialized delta?
    Can anyone let me know about it?
    regds
    hari

    Hi,
    Direct Delta: With this update mode, the extraction data is transferred with each document posting directly into the BW delta queue. In doing so, each document posting with delta extraction is posted for exactly one LUW in the respective BW delta queues.
    This update method is recommended for the following general criteria:
    a) A maximum of 10,000 document changes (creating, changing or deleting documents) are accrued between two delta extractions for the application in question. A (considerably) larger number of LUWs in the BW delta queue can result in terminations during extraction.
    b) With a future delta initialization, you can ensure that no documents are posted from the start of the recompilation run in R/3 until all delta-init requests have been successfully posted. This applies particularly if, for example, you want to include more organizational units such as another plant or sales organization in the extraction. Stopping the posting of documents always applies to the entire client.
    Queued Delta: With this update mode, the extraction data for the affected application is collected in an extraction queue (instead of in the update data) and can be transferred, as usual with the V3 update, by means of an updating collective run into the BW delta queue. In doing so, up to 10,000 delta extractions of documents per LUW are compressed into the BW delta queue for each DataSource, depending on the application.
    This update method is recommended for the following general criteria:
    a) More than 10,000 document changes (creating, changing or deleting documents) are performed each day for the application in question.
    b) In future delta initializations, you must reduce the posting-free phase to executing the recompilation run in R/3. The document postings should be included again when the delta Init requests are posted in BW. Of course, the conditions described above for the update collective run must be taken into account.
    Non-serialized V3 Update: With this update mode, the extraction data for the application in question is written, as before, into the update tables with the help of a V3 update module, and is kept there until it is read and processed by an updating collective run. However, in contrast to the default setting (serialized V3 update), the data in the updating collective run is read from the update tables without regard to sequence and transferred to the BW delta queue.
    This update method is recommended for the following general criteria:
    a) Due to the design of the data targets in BW and for the particular application in question, it is irrelevant whether or not the extraction data is transferred to BW in exactly the same sequence in which the data was generated in R/3.
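    To make the rules of thumb above concrete, here is a purely illustrative sketch that applies them; the 10,000 figure comes straight from the criteria quoted above and is a guideline, not a hard limit:
      REPORT z_delta_mode_rule_of_thumb.

      PARAMETERS: p_chgday TYPE i DEFAULT 5000,   " document changes per day
                  p_seqrel(1) TYPE c DEFAULT 'X'. " 'X' = posting sequence matters in BW

      " Apply the criteria quoted above for choosing an update mode.
      IF p_seqrel <> 'X'.
        WRITE: / 'Unserialized V3 update is sufficient (sequence is irrelevant).'.
      ELSEIF p_chgday <= 10000.
        WRITE: / 'Direct delta is acceptable (up to ~10,000 changes between extractions).'.
      ELSE.
        WRITE: / 'Use queued delta and schedule the collective run regularly.'.
      ENDIF.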
    Take a look at Roberto's weblog series:
    /people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
    /people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur
    /people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
    /people/sap.user72/blog/2005/04/19/logistic-cockpit-a-new-deal-overshadowed-by-the-old-fashioned-lis
    https://weblogs.sdn.sap.com/pub/wlg/126 (original link is broken)
    doc
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/f83be790-0201-0010-4fb0-98bd7c01e328
    and OSS Note 505700.
    Re: delta methods
    Also go through this previous thread:
    Delta types
    hope it helps..

  • Data extraction for BW/BI

    Hi ,
    I am new to BW. Can anyone send me material on data extraction in BW? I mainly want material on LO extraction. If anyone could provide material on extractors like the LO Cockpit, generic DataSources, etc., I would be really thankful. Please send the material to "baljinder4u_gmail.com".
    Thanks in advance

    Hi Rakesh
    Step-by-step control flow for a successful data extraction with SAP BW:
       1.  An InfoPackage is scheduled for execution at a specific point of time or for a certain system- or user-defined event.
       2.  Once the defined point of time is reached, the SAP BW system starts a batch job that sends a request IDoc to the SAP source system.
       3.  The request IDoc arrives in the source system and is processed by the IDoc dispatcher, which calls the BI Service API to process the request.
       4.  The BI Service API checks the request for technical consistency. Possible error conditions include specification of DataSources unavailable in the source system and changes in the DataSource setup or the extraction process that have not yet been replicated to the SAP BW system.
       5.  The BI Service API calls the extractor in initialization mode to allow for extractor-specific initializations before actually starting the extraction process. The generic extractor, for example, opens an SQL cursor based on the specified DataSource and selection criteria.
       6.  The BI Service API calls the extractor in extraction mode. One data package per call is returned to the BI Service API, and customer exits are called for possible enhancements. The extractor takes care of splitting the complete result set into data packages according to the IDoc control parameters. The BI Service API continues to call the extractor until no more data can be fetched.
       7.  The BI Service API finally sends a final status IDoc notifying the target system that request processing has finished (successfully or with errors specified in the status IDoc).
    Note
    Control parameters specify the frequency of intermediate status IDocs, the maximum size (either in kilobytes or number of lines) of each individual data package, the maximum number of parallel processes for data transfer, and the name of the application server to run the extraction process on.
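    Steps 6 and the note above describe how the extractor splits the complete result set into data packages. A small illustrative sketch of that splitting loop (the numbers are made up; the real package size comes from the IDoc control parameters):
      REPORT z_package_split_sketch.

      CONSTANTS: c_maxlines TYPE i VALUE 20000.  " max lines per data package (control parameter)

      DATA: lv_total   TYPE i VALUE 125000,      " records selected by the extractor
            lv_sent    TYPE i VALUE 0,
            lv_package TYPE i VALUE 0.

      " Each pass stands for one extractor call in extraction mode (step 6):
      " one package is returned per call until the whole result set has been sent.
      WHILE lv_sent < lv_total.
        lv_package = lv_package + 1.
        lv_sent = lv_sent + c_maxlines.
        IF lv_sent > lv_total.
          lv_sent = lv_total.
        ENDIF.
        WRITE: / 'Package', lv_package, '- records transferred so far:', lv_sent.
      ENDWHILE.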
    Here is LO Cockpit step by step:
    LO EXTRACTION
    - Go to Transaction LBWE (LO Customizing Cockpit)
    1). Select Logistics Application
           SD Sales BW
                Extract Structures
    2). Select the desired Extract Structure and deactivate it first.
    3). Give the Transport Request number and continue
    4). Click on 'Maintenance' to maintain the extract structure
           Select the fields of your choice and continue
                 Maintain DataSource if needed
    5). Activate the extract structure
    6). Give the Transport Request number and continue
    - Next step is to Delete the setup tables
    7). Go to T-Code SBIW
    8). Select Business Information Warehouse
    i. Setting for Application-Specific Datasources
    ii. Logistics
    iii. Managing Extract Structures
    iv. Initialization
    v. Delete the content of Setup tables (T-Code LBWG)
    vi. Select the application (01 – Sales & Distribution) and execute
    - Now, Fill the Setup tables
    9). Select Business Information Warehouse
    i. Setting for Application-Specific Datasources
    ii. Logistics
    iii. Managing Extract Structures
    iv. Initialization
    v. Filling the Setup tables
    vi. Application-Specific Setup of statistical data
    vii. SD Sales Orders – Perform Setup (T-Code OLI7BW)
            Specify a run name and a termination date/time (set a future date)
                 Execute
    - Check the data in Setup tables at RSA3
    - Replicate the DataSource
    Use of setup tables:
    Fill the setup tables in the R/3 system (via SBIW) and extract the data to BW; after that you can do delta extraction by initializing the extractor.
    Full loads are always taken from the setup tables.
    please follow the link
    https://www.sdn.sap.com/irj/sdn/advancedsearch?query=dataEXTRACTIONIN+BW&cat=sdn_all
    Regards
    Tapashi
    Edited by: Tapashi Saha on Aug 18, 2008 11:03 AM

  • LO data extraction

    Hi SAP Gurus,
    Please help me with relevant step to step guide for LO data extraction in BW.
    Thanks in advance

    Hi Obily,
    To do basic LO extraction from SAP R/3 to BW:
    1. Go to transaction code RSA3 and see if any data is available related to your DataSource. If data is there in RSA3 then go to transaction code LBWG (Delete Setup data) and delete the data by entering the application name.
    2. Go to transaction SBIW --> Settings for Application Specific Datasource --> Logistics --> Managing extract structures --> Initialization --> Filling the Setup table --> Application specific setup of statistical data --> perform setup (relevant application)
    3. In OLI*** (for example OLI7BW for Statistical setup for old documents : Orders) give the name of the run and execute. Now all the available records from R/3 will be loaded to setup tables.
    4. Go to transaction RSA3 and check the data.
    5. Go to transaction LBWE and make sure the update mode for the corresponding DataSource is serialized V3 update.
    6. Go to the BW system, create an InfoPackage, and under the Update tab select "Initialize delta process". Schedule the package. All the data available in the setup tables is then loaded into the data target.
    7. For the delta records, go to LBWE in R/3 and change the update mode for the corresponding DataSource to direct or queued delta. Records then bypass SM13 and go directly to RSA7. In transaction RSA7 you should see a green status; as soon as new records are posted you can see them in RSA7.
    8. Go to the BW system and create a new InfoPackage for delta loads. Double-click the new InfoPackage; under the Update tab you can select the "Delta update" radio button.
    9. Now you can go to your data target and see the delta records.
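    For step 7 above, note that with queued delta the LUWs sit in LBWQ until the collective run moves them to the BW delta queue (RSA7). A minimal way to trigger that run manually is sketched below; RMBWV311 as the report name for application 11 is an assumption (verify it on your release), and normally you schedule it as a periodic job via the Job Control in LBWE instead.
      REPORT z_trigger_collective_run.

      " Move the LUWs collected in the extraction queue (LBWQ) for
      " application 11 into the BW delta queue (RSA7).
      SUBMIT rmbwv311 AND RETURN.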
    Find your scenario, determine which DataSources you need on the R/3 side, and ensure they are active as well:
    http://help.sap.com/saphelp_nw04/helpdata/en/37/5fb13cd0500255e10000000a114084/frameset.htm
    find your scenario -> data sources -> go to R3 -> sbiw and activate required data source
    replicate data sources in BW:
    RSA1 -> source systems -> right click on your source system -> replicate
    Then activate your Business Content.
    Check these blogs from Roberto for LO:
    /people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
    /people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur
    /people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
    /people/sap.user72/blog/2005/02/14/logistic-cockpit--when-you-need-more--first-option-enhance-it
    /people/sap.user72/blog/2005/04/19/logistic-cockpit-a-new-deal-overshadowed-by-the-old-fashioned-lis
    Also go through this link to know the procedure step by step
    http://www.sap-img.com/business/lo-cockpit-step-by-step.htm
    Also the How to docs:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/f83be790-0201-0010-4fb0-98bd7c01e328
    BI content:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/17/cd5e407aa4c44ce10000000a1550b0/frameset.htm
    assign pts if helpful....

  • Does any LO cockpit extractor extract VBKD table values

    Hello Friends
    Does any LO Cockpit extractor extract VBKD table values? I am looking for the BSTDK field.
    Thanks
    Tanya

    Hi Tanya,
    This table belongs to SD. The following DataSources extract data from table VBKD, but none of them extracts the BSTDK field:
    2LIS_11_VAITM >> SD Sales: Document Item
    2LIS_11_VASCL >> SD Sales: Document Schedule Line
    2LIS_11_VAKON >> SD Sales: Order Conditions 
    2LIS_11_V_ITM >> SD Sales: Document Item Billing
    2LIS_11_V_SCL >> SD Sales: Document Schedule Line Billing
    You have to enhance the DataSource. Check this:
    http://help.sap.com/saphelp_nw70/helpdata/EN/3c/7b88408bc0bb4de10000000a1550b0/frameset.htm
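    If you go the enhancement route, the usual pattern is to append a ZZ field to the extract structure (e.g. MC11VA0ITM for 2LIS_11_VAITM) and fill it in the user exit EXIT_SAPLRSAP_001 (enhancement RSAP0001, include ZXRSAU01). A sketch only, assuming a hypothetical append field ZZBSTDK and the commonly documented parameter names I_DATASOURCE / C_T_DATA (verify against the exit signature on your release):
      " Include ZXRSAU01 - user exit for transaction-data DataSources.
      " Assumption: MC11VA0ITM has an append field ZZBSTDK (type DATS).
      DATA: l_s_itm TYPE mc11va0itm.

      CASE i_datasource.
        WHEN '2LIS_11_VAITM'.
          LOOP AT c_t_data INTO l_s_itm.
            " Read the customer purchase order date from VBKD:
            " item-level entry first, otherwise the header entry (POSNR 000000).
            SELECT SINGLE bstdk FROM vbkd INTO l_s_itm-zzbstdk
                   WHERE vbeln = l_s_itm-vbeln
                     AND posnr = l_s_itm-posnr.
            IF sy-subrc <> 0.
              SELECT SINGLE bstdk FROM vbkd INTO l_s_itm-zzbstdk
                     WHERE vbeln = l_s_itm-vbeln
                       AND posnr = '000000'.
            ENDIF.
            MODIFY c_t_data FROM l_s_itm.
          ENDLOOP.
      ENDCASE.
    Remember to unhide the appended field in RSA6 and replicate the DataSource in BW afterwards.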
    Regards,
    Debjani......

  • BODS 3.1: SAP R/3 data extraction - What is the difference between the 2 dataflows?

    Hi.
    Can anyone advise what the difference is between the two data extraction flows for extracting data from SAP R/3?
    1) DF1 >> SAP R/3 (R/3 table - query transformation - .dat file) >> query transformation >> target
    This ABAP data flow generates an ABAP program and a .dat file.
    We can also upload this program and run jobs with the "Execute preloaded" option on the datastore.
    This works fine.
    2) We can also pull the SAP R/3 table directly:
    DF2 >> SAP R/3 table (this has a red arrow, like an OHD) >> query transformation >> target
    This also works fine, and we can see the data directly in Oracle.
    Which can also be scheduled on a job.
    But I am unable to understand the purpose of the different types of data extraction flows:
    when to use which type of flow for data extraction,
    and the advantages/disadvantages of the two data flows.
    What we do not understand is this:
    if we can pull data from an R/3 table directly through a query transformation into the target table,
    why use the approach of creating an R/3 data flow,
    then doing a query transformation again,
    and then populating the target database?
    There might be some practical reasons for using these two different types of flows for data extraction, which I would like to understand. Can anyone advise, please?
    Many thanks
    indu
    Edited by: Indumathy Narayanan on Aug 22, 2011 3:25 PM

    Hi Jeff.
    Greetings. And many thanks for your response.
    Generally we pull the entire SAP R/3 table through a query transformation into Oracle.
    For that we use an R/3 data flow and the ABAP program, which we upload to the R/3 system
    so that we can use the "Execute preloaded" option and run the jobs.
    Since we have no control over our R/3 servers and no one for ABAP programming,
    we do not do anything at the SAP R/3 level.
    I was doing trial-and-error testing on our workflows for our new requirement:
    WF 1, which has some 15 R/3 tables.
    For each table we have created a separate data flow.
    In some data flows, where the SAP tables had a lot of rows, I decided to pull them directly,
    bypassing the ABAP flow.
    The entire workflow and data extraction still runs fine.
    In fact, I tried creating a new sample data flow and tested it
    using direct download and also "Execute preloaded".
    I did not see any major difference in the time taken for data extraction,
    because in any case we pull the entire table, then choose whatever we want to bring into Oracle through a view for our BO reporting, or aggregate it and bring the data in as a table for universe consumption.
    Actually, I was looking at other options to avoid the ABAP generation and the R/3 data flow, because on our dev and QA environments they give delimiter errors, whereas in production it works fine. The production environment is an old setup of BODS 3.1; QA and dev are relatively new BODS environments, and those are the ones with the delimiter error.
    I did not understand how to resolve it as per this post: https://cw.sdn.sap.com/cw/ideas/2596
    While trying to resolve this problem, I ended up trying to pull the R/3 table directly, without the ABAP workflow, just by trial and error with every drag-and-drop option, because we urgently had to do a POC and deliver the data for the entire SAP E-Recruiting module.
    I don't know whether I can use this direct pulling of data for the new job I have created,
    which has 2 workflows with 15 data flows in each workflow,
    and push this job into production,
    or whether I can bypass the ABAP flow and pull R/3 data directly in all data flows in the future, for any of our SAP R/3 data extraction requirements. The technical difference between the two flows is not clear to us, and being new to ETL as a whole, I just wanted to know the pros and cons of this particular type of data extraction.
    As advised I shall check the schedules for a week, and then we shall move it probably into production.
    Thanks again.
    Kind Regards
    Indu
    Edited by: Indumathy Narayanan on Aug 22, 2011 7:02 PM

  • Bulk API V2.0 Data extract support for additional objects (Campaign,Email,Form,FormData,LandingPage)?

    allison.moore
    Are there any plans for adding the following objects (Campaign, Email, Form, FormData, LandingPage) to Bulk API V2.0 for data extraction from Eloqua? Extracting the data for these objects using the REST API makes it complicated.

    Thanks for the quick response. Extracting these objects using the REST API with depth=Complete poses a lot of complications from the code perspective, since these objects contain multiple nested or embedded objects. Is there any guideline on how to extract these objects using REST so that we can get all the data required for analysis/reporting?
