Problem in extracting PO data

Hi,
I am using the following code to extract PO data. I am trying to extract multiple POs, but I am not getting multiple items into the Excel sheet: if a PO has 4 items, only the last item shows up. If anybody has an idea, kindly let me know.
This is an urgent issue.
*& Report  ZTEST                                                       *
REPORT  ZTEST                                   .
*DATA: BEGIN OF IT_PO_HEADER OCCURS 20,
*        BSART LIKE EKKO-BSART,
*        EBELN LIKE EKKO-EBELN,
*        SUPERFIELD LIKE MEPO_TOPLINE-SUPERFIELD,
*        BEDAT LIKE EKKO-BEDAT,
*        ZTERM LIKE EKKO-ZTERM,
*        NETWR LIKE EKKO-NETWR,
*      END OF IT_PO_HEADER.
TABLES: EKKO.
*PARAMETERS : P_EBELN TYPE EBELN.
SELECT-OPTIONS: S_EBELN FOR EKKO-EBELN OBLIGATORY.
TYPES: BEGIN OF T_PO,
         EBELN TYPE EBELN,
       END OF T_PO.
DATA: IT_PO TYPE STANDARD TABLE OF T_PO,
      WA_PO TYPE T_PO.
DATA : IT_ITEMS TYPE TABLE OF BAPIEKPO WITH HEADER LINE.
DATA: PO_ITEMS_SHEDULE TYPE TABLE OF BAPIEKET WITH HEADER LINE.
DATA: PO_ITEM_TEXT TYPE TABLE OF BAPIEKPOTX WITH HEADER LINE.
DATA: PO_ITEM_ACCOUNT_ASSIGNMENT TYPE TABLE OF BAPIEKKN
WITH HEADER LINE.
DATA: PO_HEADER_TEXTS TYPE TABLE OF BAPIEKKOTX WITH HEADER LINE.
DATA: PO_HEADER TYPE BAPIEKKOL.
DATA: RETURN1 TYPE TABLE OF BAPIRETURN WITH HEADER LINE.
data : begin of itab1 occurs 0,
         line(750) type c,
       end of itab1.
*DATA PO_HEADER TYPE BAPIEKKOL .
SELECT EBELN INTO TABLE IT_PO FROM EKKO WHERE EBELN IN S_EBELN.
PERFORM DOWNLOAD_DATA: TABLES IT_ITEMS USING SPACE 'C:/poextract6.txt'.
LOOP AT IT_PO INTO WA_PO.
CLEAR: PO_HEADER_TEXTS[], IT_ITEMS[], PO_ITEM_ACCOUNT_ASSIGNMENT[],
       PO_ITEMS_SHEDULE[], PO_ITEM_TEXT[], RETURN1[].
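* Read header, item, schedule and text data for the current PO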
CALL FUNCTION 'BAPI_PO_GETDETAIL'
  EXPORTING
    PURCHASEORDER              = WA_PO-EBELN
    ITEMS                      = 'X'
    ACCOUNT_ASSIGNMENT         = 'X'
    SCHEDULES                  = 'X'
    HISTORY                    = 'X'
    ITEM_TEXTS                 = 'X'
    HEADER_TEXTS               = 'X'
  IMPORTING
    PO_HEADER                  = PO_HEADER
  TABLES
    PO_HEADER_TEXTS            = PO_HEADER_TEXTS
    PO_ITEMS                   = IT_ITEMS
    PO_ITEM_ACCOUNT_ASSIGNMENT = PO_ITEM_ACCOUNT_ASSIGNMENT
    PO_ITEM_SCHEDULES          = PO_ITEMS_SHEDULE
    PO_ITEM_TEXTS              = PO_ITEM_TEXT
    RETURN                     = RETURN1.
DATA PO_HEADER1 LIKE TABLE OF PO_HEADER.
APPEND PO_HEADER TO PO_HEADER1.
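* Download each returned table to its own file on the presentation server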
PERFORM DOWNLOAD_DATA: TABLES PO_HEADER_TEXTS
                              USING 'X' 'C:/poextract1.xls',
                       TABLES IT_ITEMS
                              USING 'X' 'C:/poextract2.xls',
                       TABLES PO_ITEM_ACCOUNT_ASSIGNMENT
                              USING 'X' 'C:/poextract3.xls',
                       TABLES PO_ITEMS_SHEDULE
                              USING 'X' 'C:/poextract4.xls',
                       TABLES PO_ITEM_TEXT
                              USING 'X' 'C:/poextract5.xls'.
PERFORM DOWNLOAD_DATA: TABLES PO_HEADER1
                              USING SPACE 'C:/poextract6.xls'.
*DATA: PO_HEADER1 TYPE BAPIEKKOC,
*PO_NUM TYPE BAPIMEPOHEADER-PO_NUMBER,
*BAPI_RET TYPE TABLE OF BAPIRET2 with header line,
*IT_ITEMS1 TYPE TABLE OF BAPIEKPOC WITH HEADER LINE.
*MOVE-CORRESPONDING PO_HEADER TO PO_HEADER1.
*PO_HEADER1-PO_NUMBER = ''.
*PO_HEADER1-PURCH_ORG = PO_HEADER-PURCH_ORG.
*MOVE-CORRESPONDING IT_ITEMS TO IT_ITEMS1.
*IT_ITEMS1-PO_NUMBER = ''.
*IT_ITEMS1-MATERIAL = IT_ITEMS-MATERIAL.
*APPEND IT_ITEMS1.
*CALL FUNCTION 'BAPI_PO_CREATE'
* EXPORTING
*    po_header                        = PO_HEADER1
*   PO_HEADER_ADD_DATA               =
*   HEADER_ADD_DATA_RELEVANT         =
*   PO_ADDRESS                       =
*   SKIP_ITEMS_WITH_ERROR            = 'X'
*   ITEM_ADD_DATA_RELEVANT           =
*   HEADER_TECH_FIELDS               =
* IMPORTING
*   PURCHASEORDER                    =
* tables
*    po_items                         = IT_ITEMS1
*   PO_ITEM_ADD_DATA                 =
*    po_item_schedules                = PO_ITEMS_SHEDULE
*   PO_ITEM_ACCOUNT_ASSIGNMENT       =
*   PO_ITEM_TEXT                     =
*   RETURN                           = BAPI_RET
*   PO_LIMITS                        =
*   PO_CONTRACT_LIMITS               =
*   PO_SERVICES                      =
*   PO_SRV_ACCASS_VALUES             =
*   PO_SERVICES_TEXT                 =
*   PO_BUSINESS_PARTNER              =
*   EXTENSIONIN                      =
*   POADDRDELIVERY                   =
*CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
* EXPORTING
*   WAIT          =
* IMPORTING
*   RETURN        =
*loop at bapi_ret.
*write: bapi_ret-MESSAGE.
*endloop.
ENDLOOP.
*&      Form  DOWNLOAD_DATA
FORM DOWNLOAD_DATA  TABLES   IT_TAB
                    USING    P_APP
                    FILENAME.
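* Write the passed internal table to a local file;
* P_APP = 'X' appends to an existing file, SPACE overwrites it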
CALL FUNCTION 'GUI_DOWNLOAD'
  EXPORTING
    FILENAME = FILENAME "'C:/poextract.xls'
   APPEND =  P_APP
    WRITE_FIELD_SEPARATOR = 'X'
  TABLES
    DATA_TAB = IT_TAB
*   FIELDNAMES = FIELDNAME   "FIELDNAME is not declared anywhere in this program
  EXCEPTIONS
    FILE_WRITE_ERROR        = 1
    NO_BATCH                = 2
    GUI_REFUSE_FILETRANSFER = 3
    INVALID_TYPE            = 4
    NO_AUTHORITY            = 5
    UNKNOWN_ERROR           = 6
    HEADER_NOT_ALLOWED      = 7
    SEPARATOR_NOT_ALLOWED   = 8
    FILESIZE_NOT_ALLOWED    = 9
    HEADER_TOO_LONG         = 10
    DP_ERROR_CREATE         = 11
    DP_ERROR_SEND           = 12
    DP_ERROR_WRITE          = 13
    UNKNOWN_DP_ERROR        = 14
    ACCESS_DENIED           = 15
    DP_OUT_OF_MEMORY        = 16
    DISK_FULL               = 17
    DP_TIMEOUT              = 18
    FILE_NOT_FOUND          = 19
    DATAPROVIDER_EXCEPTION  = 20
    CONTROL_FLUSH_ERROR     = 21
    OTHERS                  = 22.
IF SY-SUBRC <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
ENDFORM.                    " DOWNLOAD_DATA

Dear Rammohan,
After going through your code, I'm jotting down the following mistakes:
(1) Initially you're firing the query
     SELECT EBELN INTO TABLE IT_PO
       FROM EKKO
       WHERE EBELN IN S_EBELN.
     Since you're selecting only from EKKO, you only get the header details.
(2) Just after this SELECT statement you're calling the
     subroutine DOWNLOAD_DATA with table IT_ITEMS, which has not been
     populated yet at that point.
(3) I'm suggesting an easier way out:
     SELECT IT1~EBELN
            IT1~EBELP
            IT1~MATNR
       INTO TABLE IT_PO
       FROM EKPO AS IT1 INNER JOIN EKKO AS IT2
         ON IT1~EBELN = IT2~EBELN
       WHERE IT2~EBELN IN S_EBELN.
      By firing the above query you'll be able to get hold of the PO data
      along with the respective line-item data (note that IT_PO then needs
      EBELP and MATNR fields as well).
Regards,
Abir
Don't forget to award points.
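Building on Abir's point (3), here is a minimal, untested sketch of how the item extraction could be restructured so that the items of every selected PO end up in a single file. The cumulative table IT_ALL_ITEMS and the target file name are illustrative assumptions, and this keeps the original BAPI_PO_GETDETAIL approach rather than the EKPO/EKKO join:

* Collect the items of every selected PO, then download once after the loop
DATA IT_ALL_ITEMS TYPE STANDARD TABLE OF BAPIEKPO.   "illustrative name

LOOP AT IT_PO INTO WA_PO.
  CLEAR: IT_ITEMS[], RETURN1[].

  CALL FUNCTION 'BAPI_PO_GETDETAIL'
    EXPORTING
      PURCHASEORDER = WA_PO-EBELN
      ITEMS         = 'X'
    TABLES
      PO_ITEMS      = IT_ITEMS
      RETURN        = RETURN1.

* Append this PO's items to the cumulative table instead of
* downloading (and overwriting) the file inside the loop
  APPEND LINES OF IT_ITEMS[] TO IT_ALL_ITEMS.
ENDLOOP.

* A single download after the loop writes the items of all POs
PERFORM DOWNLOAD_DATA TABLES IT_ALL_ITEMS
                      USING  SPACE 'C:/poextract_items.xls'.

The key change is that GUI_DOWNLOAD is called once, after all items have been collected, instead of once per PO.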

Similar Messages

  • Request: Problem in Extraction of data from CRM to BW

    Hi Gurus
I have problems in extracting data from CRM to BW. I thought you are the ones who can solve them:
1) When I check the data for the datasource in RSA3 I can see all records and the complete data, but when I extract and check the data in the cube I can see all records while some of the data is missing.
2) Another problem: in the report I have two characteristics (DATS), start date and end date, and I am trying to get the number of days from start date to end date.
I appreciate you in advance. If you find any good documentation please send it to [email protected]
    With regards
    Nagamani.

Hi Krishna,
yes, we did an enhancement to this datasource. In PRODUCTION it is taking 27 hours to load the data.
For the extraction from the source system (CRM) it is taking nearly 24 hours, as per the job log:
    08/05/2010 11:53:08 Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 10,849 records
    08/05/2010 12:02:02 Result of customer enhancement: 10,849 records
    08/05/2010 12:02:03 PSA=0 USING & STARTING SAPI SCHEDULER
    08/05/2010 12:02:03 Asynchronous send of data package 1 in task 0002 (1 parallel tasks)
    08/05/2010 12:02:06 IDOC: Info IDoc 2, IDoc No. 1576298, Duration 00:00:01
    08/05/2010 12:02:06 IDoc: Start = 08/05/2010 10:26:37, End = 08/05/2010 10:26:38
    08/05/2010 13:02:38 Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 10,958 records
    08/05/2010 13:11:48 Result of customer enhancement: 10,958 records
    08/05/2010 13:11:52 Asynchronous send of data package 2 in task 0003 (1 parallel tasks)
    08/05/2010 13:11:53 tRFC: Data Package = 1, TID = 0AA00D0301984C5AEE8E44DB, Duration = 00:16:
    08/05/2010 13:11:53 tRFC: Start = 08/05/2010 12:02:19, End = 08/05/2010 12:18:27
    08/05/2010 14:30:13 Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 11,296 records
    08/05/2010 14:39:53 Result of customer enhancement: 11,296 records
Like this it is taking 24 hours in the extraction itself.

  • Delta Problem while extracting the data by using function module

    Hi Experts,
I have extracted the data by using a function module. For the delta loads I have set Calday and additive delta in the generic delta settings. On the BW side I have initialized the data, then selected the delta update option in the InfoPackage, and I also built a process chain. Every day the process chain runs successfully but it shows zero records, even though there is data in the datasource (RSA3), which means the delta is not working.
What is the problem?
Are there any other settings for the delta loads?
Please help me with this.
Helpful answers will be appreciated with points.
    Regards,
    Venkat

    Hi,
Try the delta type NEW STATUS FOR CHANGED RECORDS.
More information from Nagesh; this may help. There are two options:
1. New status for changed records
2. Additive delta
New status for changed records means:
New order
order no   quantity
1000       10
If the order is changed to quantity 8, then 2 records are posted to BW in the first case:
1000      -10
1000        8
With additive delta, for the same case it first sends the record
1000       10
and for the changed record it sends
1000       -2
This is the difference. New status for changed records should be used only in association with an ODS, not a cube, because the cube has no overwrite option.
Additive delta can be used for ODS and cubes with update mode Addition.
    Regards,
    Satya

  • Problem with extract huge data in WEBI (Errors:WIO 30280 and ERR_WIS_30270)

    Hi gurus, need your help.
When we run the query for the report we get an error in Web Intelligence.
The error can be of two types, and they occur in random rotation.
First error: There is no memory available. Please close document to free memory (WIO 30280)
Second error: An internal error occurred while calling the 'processDPCommands' API. (ERR_WIS_30270)
When we reopen Web Intelligence, or restart the servers or the machine, we get the error again.
I have tried changing different parameters in the universe designer for the connection and the universe, and in the Central Management Console for WEBI, but it does not solve the problem.
The query selects data from a huge table (4.7 million rows), and when I set a limit for maximum rows in WEBI the report works correctly.
Is there any way to solve these problems associated with large data volumes? What causes this problem when we try to retrieve the full data?
We use BusinessObjects Enterprise XI 3.1, version 12.4.0.966.
Our system: Windows Server 2008 R2 Enterprise SP1 (64-bit), Intel 2.67 GHz processor (2 processors), 8 GB RAM.
    Thanks.
                                                                                    Ruslan

    Hi Brad,
Here we are talking about XI 3.1, which has the limitations inherited from the operating system, as this blog post explains in plenty of detail.
In BI 4.0 you can leverage larger datasets without any problems; however, you need to properly size and configure the services.
    There are several documents out there:
    How to configure the APS  http://scn.sap.com/docs/DOC-31711
    Companion Guide
    https://service.sap.com/~sapdownload/011000358700000307202011E/SBO_BI_4_0_Companion_V4.pdf
    Web Intelligence Sizing Guide
    https://service.sap.com/~sapdownload/011000358700001403692011E/BI_4_0_WEBI_NEW.pdf
    Best regards,
    Simone

  • Problem in extraction  crm data sorce 0BP_ROLES_ATTR

Hi,
While extracting the data source 0BP_ROLES_ATTR I successfully uploaded the data into the PSA, but when I try to load the data into the ODS I get the error:
    The value '99991231235959 ' from field VALID_TO is not convertible into
    the DDIC data type DATS of the InfoObject in data record 217 . The field
    content could not be transferred into the communication structure
    format.
    System Response
        The data to be loaded has a data error or field VALID_TO in the transfer
        structure is not mapped to a suitable InfoObject.
        The conversion of the transfer structure to the communication structure
        was terminated. The processing of data records with errors was continued
        in accordance with the settings in error handling for the InfoPackage
        (tab page: Update Parameters).

    Hi Shahina,
    0VALID_TO has data element DATS which contains date in YYYYMMDD format. The data coming from the datasource contains YYYYMMDDHHMMSS. What you need to do is filter out the HHMMSS part. For this you can either write a transformation routine or use a formula in your transformation.
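A minimal sketch of such a routine, assuming the incoming value is a 14-character YYYYMMDDHHMMSS string; the names SOURCE_FIELDS-VALID_TO and RESULT follow the BW 7.x field-routine template and are illustrative (in a 3.x transfer rule the source structure is named differently):

* Keep only the first 8 characters (YYYYMMDD) of the incoming
* YYYYMMDDHHMMSS value so it fits the DATS InfoObject
  DATA: LV_STAMP(14) TYPE C.

  LV_STAMP = SOURCE_FIELDS-VALID_TO.   "assumed source field name
  RESULT   = LV_STAMP(8).              "YYYYMMDD part only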

  • Problem with extracting Xml data source fields

    Post Author: new_crystal
    CA Forum: Crystal Reports
    Hi,
I am creating a report which has an XML data source.
I have a field named fieldvalue in the XML. I need to present this fieldvalue in a column format based on the name of the column heading, which is another field in the database called fielddata.
Here is what I want:
fielddata1     fielddata2    fielddata3 .......
fieldvalue1    fieldvalue2   fieldvalue3......
I have dragged the fieldvalue three times into the report and applied a different formula to each of them.
Here is an example of the formula:
if ( = "A" ) then
but it is not giving me the correct values pulled from the data source. For the first column it pulls correctly, but for the rest of the columns it gives values of 0.
Can anyone help me out with this? It is pretty urgent!
Thanks

    Post Author: tel
    CA Forum: Crystal Reports
I have no idea how to format it in Crystal Reports (I'm new to it too), but if worse comes to worst, you can create an XSLT stylesheet to convert the XML into a format that is easier for Crystal to use.
In case you don't know, XSLT is a programming language designed to convert one XML format into another.

  • Error occured while extracting the data from BSIK Table

    Hi Experts,
I am trying to extract data from table BSIK using generic extraction by table, and I get the error "Invalid extraction structure template BSIK of DataSource XXXXXX". But when I try the same using a view, I am able to do it.
I get the same problem while extracting data from other tables such as BSIK, BSAK, and so on.
Can someone explain why it throws this error when I do a generic extraction using the table directly?
Thanks,

    Hi,
    Can you check below link:
    Invalid extract structure template ZBW_VW_EKES of DataSource ZBW_SCN_VENDOR
Hope this answers your question!
    Regards,
    Nilima

  • I have a problem to  extract data from oracle dump file (.dmp) using oracle database or any other tools.

There is the IMP utility, which presents the data in a graphical way, but I need to extract the data in a form that I can use in my application.
    thanks,
    mohan

    Hi,
    EdStevens
Actually I was using SQL Developer. Sorry for the wrongly phrased question. The real problem is:
I want to read or extract data into a txt file from an Oracle dump (.dmp) file without using Oracle SQL*Plus or the Oracle database.
Needless to say, I am able to extract the dump data using SQL*Plus, but for that there is the burden of installing the Oracle database or SQL Developer.
I have also heard a little about Oracle Loader. Is it a tool for extracting data from an Oracle dump file?
Thanks for your previous reply,
and now waiting for your suggestion.
    thanks,
    mohan

  • Problem during extraction using generic data sources

    Hi
While performing extraction (master data) using generic datasources I am facing a problem:
while selecting the application component, I am unable to find SD to select it. What should I do?
I selected CRM as the application component and did the extraction; it was successful.
But what if we want to select SD? I am guessing it is an installation problem... Can anyone please help me?
    Thank you
    Deepthi

    Hi Deepthi,
Looks like the RFC connection between the source system and the BW system is broken.
Contact your Basis team to check what went wrong with the connection.
Once the connection is re-established you can go ahead.
Try out this procedure:
Go to transaction SM59.
a) Open the RFC Destinations.
b) Find your source system and double-click on it.
c) The technical settings should contain information regarding your source system.
d) The logon information should contain the source system client and the name and password for your remote logon.
Suggestion: our remote user for our R/3 was ALEREMOTE. We use BWREMOTE so that we can monitor without confusion.
e) Execute 'Test Connection' and 'Remote Logon'. You should NOT be prompted for a password.
f) Repeat the above to verify the RFCs for the other systems.
    regards
    KP

  • Problem in extracting data

    Hi All
I am getting a problem while extracting data.
The XML is:
    <po:PurchaseOrder xmlns:po="http://www.oracle.com/PO.xsd"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.oracle.com/PO.xsd
    http://www.oracle.com/PO.xsd">
    <PONum>1005</PONum>
         <Company>Oracle</Company>
    <Item>
    <Part>9i Doc Set</Part>
    <Price>2550</Price>
    </Item>
    <Item>
    <Part>8i Doc Set</Part>
    <Price>350</Price>
    </Item>
    </po:PurchaseOrder>
and when I write the SELECT query against a child node as
select p.extract('/PurchaseOrder/Company').getStringVal() from po_tab2 p
I get blank rows:
    SQL> select p.extract('/PurchaseOrder/Company').getStringVal() from po_tab2 p;
    P.EXTRACT('/PURCHASEORDER/COMPANY').GETSTRINGVAL()
    6 rows selected.
But if I use only the root element in the XPath then it displays the result with the XML data:
    SQL> select p.extract('/PurchaseOrder').getStringVal() from po_tab2 p;
    P.EXTRACT('/PURCHASEORDER').GETSTRINGVAL()
    <po:PurchaseOrder xmlns:po="http://www.oracle.com/PO.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchem
    a-instance" xsi:schemaLocation="http://www.oracle.com/PO.xsd http://www.oracle.com/PO.xsd">
    <PONum>1001</PONum>
    <Company>Oracle</Company>
    <Item>
    <Part>9i Doc Set</Part>
    <Price>2550</Price>
    </Item>
    <Item>
    <Part>8i Doc Set</Part>
    <Price>350</Price>
How can I extract data from the child nodes?

Seems like a problem with the namespace.
Please add the second argument (the namespace map) to the extract function, e.g.:
    select p.extract('/PurchaseOrder/Company', 'target namespace').getStringVal() from po_tab2 p;

  • Problem extracting LIS data

    Hi,
We are having a problem extracting LIS data.
I have checked the delta queue (RSA7), and 2LIS_13_VDITM has a total of 0.
I have also checked transaction LBWQ and can see that MCEX13 has 150028 entries and MCEX11 has 12811 entries in the queue, with destination NONE and status NOSEND.
What can I do to transfer all these entries to the delta queue so that they can be loaded into BW?
    BR,
    Linda

    Hi Linda,
Search for RMBWV in the ABAP program input box (job name = *, user = *).
Go to SMQ1 in the source system; you'll see LUWs for the MCEX* queues.
In this transaction you also see your BW delta queue; do you see LUWs for your datasource as well?
You can schedule the update job via LBWE / job control; once it has executed, the LUWs for MCEX* should decrease and your BW LUWs should increase...
Let us know.
Hope this helps
    Olivier.

  • Performance problem whlile selecting(extracting the data)

I have one intermediate table.
I am inserting rows which are derived from a SELECT statement.
The SELECT statement has a WHERE clause which joins a view (created from 5 tables).
The problem is that the SELECT statement which gets the data is taking too much time.
I have identified the problems like this:
1) The view used in the SELECT statement is not indexed - is an index necessary on a view?
2) The tables which are used to create the view are already properly indexed.
3) While extracting the data it takes more time.
The query below extracts the data and inserts it into the intermediate table:
    SELECT 1414 report_time,
    2 dt_q,
    1 hirearchy_no_q,
    p.unique_security_c,
    p.source_code_c,
    p.customer_specific_security_c user_security_c,
    p.par_value par_value, exchange_code_c,
    (CASE WHEN p.ASK_PRICE_L IS NOT NULL THEN 1
    WHEN p.BID_PRICE_L IS NOT NULL THEN 1
    WHEN p.STRIKE_PRICE_L IS NOT NULL THEN 1
    WHEN p.VALUATION_PRICE_L IS NOT NULL THEN 1 ELSE 0 END) bill_status,
    p.CLASS_C AS CLASS,
    p.SUBCLASS_C AS SUBCLASS,
    p.AGENT_ADDRESS_LINE1_T AS AGENTADDRESSLINE1,
    p.AGENT_ADDRESS_LINE2_T AS AGENTADDRESSLINE2,
    p.AGENT_CODE1_T AS AGENTCODE1,
    p.AGENT_CODE2_T AS AGENTCODE2,
    p.AGENT_NAME_LINE1_T AS AGENTNAMELINE1,
    p.AGENT_NAME_LINE2_T AS AGENTNAMELINE2,
    p.ASK_PRICE_L AS ASKPRICE,
    p.ASK_PRICE_DATE_D AS ASKPRICEDATE,
    p.ASSET_CLASS_T AS ASSETCLASS
    FROM (SELECT
    DISTINCT x.*,m.customer_specific_security_c,m.par_value
    FROM
    HOLDING_M m JOIN ED_DVTKQS_V x ON
    m.unique_security_c = x.unique_security_c AND
    m.customer_c = 'CONF100005' AND
    m.portfolio_c = 24 AND
    m.status_c = 1
    WHERE exists
         (SELECT 1 FROM ED_DVTKQS_V y
              WHERE x.unique_security_c = y.unique_security_c
                   GROUP BY y.unique_security_c
                   HAVING MAX(y.trading_volume_l) = x.trading_volume_l)) p
Can anyone please give me valuable suggestions on the performance?

Thanks for the update.
In the SELECT query we used some functions like MAX:
    (SELECT 1 FROM ED_DVTKQS_V y
    WHERE x.unique_security_c = y.unique_security_c
    GROUP BY y.unique_security_c
    HAVING MAX(y.trading_volume_l) = x.trading_volume_l)) p
Will these types of functions cause the performance problem?

  • Problems to extract data from 0HR_PY_1

    Hi Experts,
    I'm using BW v.7 to extract data from standard data source 0HR_PY_1 of SAP-ECC, with delta update.
    When I try to extract the data job finishes with error (red status) and all data-packages have error message ABORT in customer routine 9998.
    Although I'm using BW v.7 I have activated 3.x extraction (DataSouce -> InfoSource -> Update Rule).
    Any help will be welcome!
    Alex

I've found the problem myself.

  • Problems While Extracting Hours From Date Field

    Hi Guys,
    Hope you are doing well.
I am facing some problems while extracting hours from a date field. Below is an example of my orders table:
    select * from orders;
    Order_NO     Arrival Time               Product Name
    1          20-NOV-10 10:10:00 AM          Desktop
    2          21-NOV-10 17:26:34 PM          Laptop
    3          22-JAN-11 08:10:00 AM          Printer
Earlier there was a requirement to report how many orders take place daily in the orders table. For that I used to write a query with:
arrival_time >= trunc((sysdate-1),'DD')
and arrival_time < trunc((sysdate),'DD')
The above query gives me how many orders were placed yesterday.
Now I have a new requirement to generate a report every 4 hours on how many orders took place. For example, if the current time is 8:00 AM IST then the query should fetch how many orders were placed from 4:00 AM till 8:00 AM. The report will run next at 12:00 PM IST, which will give me the orders placed from 8:00 AM till 12:00 PM.
The report will run every 4 hours a day and report the orders placed in the last 4 hours. I have a scheduler which will run this query, but how do I make the query fetch the order details which arrived in the last 4 hours? I am not able to achieve this using trunc.
Can you please assist me with how to make this happen? I have also checked EXTRACT but I am not satisfied.
    Please help.
    Thanks In Advance
    Arijit

You may try something like:
with testdata as (
  select sysdate - level/24 t from dual
  connect by level < 11
)
select
  to_char(sysdate, 'DD-MM-YYYY HH24:MI:SS') s
, to_char(t, 'DD-MM-YYYY HH24:MI:SS') t
from testdata
where
t >= trunc(sysdate, 'HH') - numtodsinterval(4, 'HOUR')

S                    T
19-06-2012 16:08:21  19-06-2012 15:08:21
19-06-2012 16:08:21  19-06-2012 14:08:21
19-06-2012 16:08:21  19-06-2012 13:08:21
19-06-2012 16:08:21  19-06-2012 12:08:21

trunc( ,'HH') truncates the minutes and seconds from the date.
EXTRACT hour works only on timestamps.
    regards

  • Problem in extracting data from source system

    HI
I am comparing the source system and the data in BW for a particular datasource. In this process, in source system transaction RSA3, when I try to extract data it does not show the exact count. I also could not understand the setting options 'Data records / calls' and 'Display extr. calls' in transaction RSA3. Can someone assist with this?

Vara Prasad,
I do know about the popup, but the popup is displayed only AFTER the extraction completes. Assuming that you have about 10 million records, I cannot find the number of records without actually extracting them, and my RSA3 will fail because the volume is too large for RSA3. I was asking whether there is a method by which I could find the number of records alone, without extracting the data / running RSA3.
Also, if your extractor has 20,000 records and you set 10 data packets of 1,000 each, the popup will display 10,000, which is incorrect, and you would have to increase the data packet / package size to finally arrive at the right number...
