Extractor Checker Shows Fewer Records

Hi,
I'm using RSA3 to check a particular extractor. It only shows two months' worth of records, but more records are maintained in the source system.
My settings are: Data Records / Calls = 10000
                 Display Extr. Calls = 10
                 Update Mode = F
I have turned on debug mode, but I still can't figure it out.
May I know how to rectify this problem? I'd appreciate any guidance.
Thanks.
regards,
zl

Hello,
What is the extractor type? Is it generic?
Are you getting the full 100,000 records for your selection? With your settings, RSA3 displays at most Data Records / Calls x Display Extr. Calls = 10,000 x 10 = 100,000 records.
Check by increasing Display Extr. Calls to 100.
Also, sometimes the RSA3 extractor check does not return the required records. Try scheduling a load to BI through an InfoPackage and check the number of records that come in.
Regards,
Dhanya

Similar Messages

  • Extractor Checker shows 0 Records

    Hi,
    I am working on the Transport Management System (TMS) side of Supply Chain Management. In RSA3 on the TMS system I checked the DataSources 0TMS_SOR and 0TMS_SRQ, and they showed 0 records despite data being there. Since SCM is based on the Business Object Processing Framework (BOPF), I am not able to check the data at table level, and I don't know how to go about verifying that the data exists. The TMS developers say there is data in the TMS system, but before integrating with the BI system I wanted to check these DataSources in the extractor checker, and it shows 0 records. What could the problem be, and how do I go about this?
    Kindly help me with this!
    Thanks in advance,
    Regards,
    Hudson

    Fanie, we are working on the same issue.
    For SRQ, there are special 'scheduling' conditions used in the PPF framework which state that only SRQs with a lifecycle status of 'Complete' should be extracted. This logic is embedded in the PPF scheduling conditions (see "/SCMTMS/CL_IM_SCHD_EXTR_SRQ").
    Bottom line: talk to the developers and ask whether there are any SRQs that are 'Complete'. If not, that is why you are not getting any data from your extractor.
    Justin

  • Extractor checker - too few records

    Hi experts,
    I used the extractor checker (RSA3) for a full extraction of 2LIS_11_VAHDR and 2LIS_11_VAITM and got only 2 records in the results.
    Isn't something wrong? VBAP has tons of records, and so does VBAK.

    RSA3 shows the number of records present in the setup tables.
    One of your colleagues probably filled the setup tables for only one or two document numbers; that is why it shows 2.
    Delete the setup table data with transaction LBWG for your application area, then refill the setup tables without any selection. You will then see the entire data set.
    Regards,
    Nagesh Ganisetti.
    Assign points if it helps.

  • Data mis match in extractor checker

    Hi Gurus,
    Please help me solve this issue.
    There is a mismatch in the record count when I check a DataSource through the extractor checker: the extractor shows 1,476 records from the R/3 table, but when I check the table in transaction SE11 it has only 506 records. Can anyone explain the possible reasons and how to analyse and solve this issue?

    If the generic DataSource is based on a function module: check how the function module is designed to extract the data (a rough sketch of such a function module follows below).
    If the generic DataSource is based on a table: check the table's data in SE16 (data browser) with the same restrictions you use in RSA3.
    If the generic DataSource is based on a view: check the view's data in SE16 with the same restrictions you use in RSA3.
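    For the function-module case, what RSA3 shows is simply whatever the function module's own SELECT logic returns, package by package, so a join or FOR ALL ENTRIES lookup can legitimately produce more rows than any single table holds. As a rough, hedged sketch (not your actual extractor), this is the usual shape of a table-based generic extractor modelled on the standard template RSAX_BIW_GET_DATA_SIMPLE; the function name Z_BIW_GET_DATA_SKETCH and the source table ZSOURCE_TAB are made-up placeholders, and the handling of I_T_SELECT is left out for brevity:
    FUNCTION z_biw_get_data_sketch.
    * Interface as generated from the template RSAX_BIW_GET_DATA_SIMPLE:
    * IMPORTING i_requnr, i_dsource, i_maxsize, i_initflag ...
    * TABLES    i_t_select, i_t_fields, e_t_data
    * EXCEPTIONS no_more_data, error_passed_to_mess_handler
      STATICS: s_cursor TYPE cursor,
               s_opened TYPE c LENGTH 1.
      IF i_initflag = 'X'.
    *   Initialization call: selections could be saved here; no data yet.
        CLEAR s_opened.
        RETURN.
      ENDIF.
      IF s_opened IS INITIAL.
        s_opened = 'X'.
    *   This SELECT (its joins and WHERE clause) decides how many records
    *   RSA3 reports - not the raw row count of any single table.
        OPEN CURSOR WITH HOLD s_cursor FOR
          SELECT * FROM zsource_tab.          " placeholder source table
      ENDIF.
      FETCH NEXT CURSOR s_cursor
        INTO CORRESPONDING FIELDS OF TABLE e_t_data
        PACKAGE SIZE i_maxsize.
      IF sy-subrc <> 0.
        CLOSE CURSOR s_cursor.
        RAISE no_more_data.                   " tells the S-API the extraction is done
      ENDIF.
    ENDFUNCTION.
    Comparing that SELECT logic with the plain table count is usually what explains a mismatch like 1,476 versus 506.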
    Hope this helps.

  • Re: 0 records in extractor checker

    Hi everyone,
    I created a generic DataSource in RSO2 on a table, without errors.
    But when I try to check the DataSource in RSA3, I can't find any records.
    It shows 0 records.
    Can you tell me what is happening?

    Hi,
    Please check whether there is any data in the table VBRP.
    This can be checked through transaction SE11.
    If there is no data, you can check with your R/3 functional consultant and ask them to create some data in R/3 in the relevant functional area for that table.
    Secondly, also check in RSA3 whether you are using the full-load simulation mode or delta.
    Regards,
    Nitin

  • Extractor Checker OK but actual Extract has no data!

    Hi all,
    Thanks for taking the time to read my post, and for your help in advance.
    I have a simple custom function module in ECC 6.0 which gets employee information from LDAP. This information is then tagged onto a few other fields and sent to BW.
    I created a DataSource based on this function module and I'm trying to send the data to BW. The extractor checker works fine and shows the right data. However, when I run the InfoPackage from BW I receive no data!
    I checked the IDocs as well and noticed that the info IDocs are sent back to BW with 0 data records, which tells me that the extractor is not sending back any data, despite showing the data in the extractor checker (RSA3).
    Any thoughts?
    Thanks,
    Mamdouh

    Thanks everyone,
    The partner profile is OK, and when I test in RSA3 I see the data I need, so running the extraction in debug mode is fine; running it in background mode, however, gives me 0 data records.
    I don't think it is an authorization error either!
    Any thoughts?
    Thanks again
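    One way to narrow down a dialog-versus-background gap like this is to let the extractor function module leave a trace of every call. The snippet below is only an illustration, not the poster's code: ZBW_EXTR_LOG and its fields are hypothetical (a small transparent table would have to be created first), and the lines would sit at the end of the custom FM, just before E_T_DATA is returned. Comparing the entry written by an RSA3 test (dialog) with the one written by an InfoPackage load (background RFC from BW) shows whether the FM reaches its data-returning branch at all and which user it runs under, which often points to an authorization or LDAP-connectivity difference for the RFC/background user.
    * Hypothetical logging table ZBW_EXTR_LOG with fields UNAME, BATCH,
    * TCODE, RECCNT and TSTAMP (type TIMESTAMP).
      DATA ls_log TYPE zbw_extr_log.
      ls_log-uname  = sy-uname.               " user the extraction runs under
      ls_log-batch  = sy-batch.               " 'X' when called in background
      ls_log-tcode  = sy-tcode.               " 'RSA3' in a dialog test, empty via RFC
      ls_log-reccnt = lines( e_t_data[] ).    " rows about to be handed back
      GET TIME STAMP FIELD ls_log-tstamp.
      INSERT zbw_extr_log FROM ls_log.        " no explicit COMMIT inside the extractor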

  • Blanks and Zeros in Extractor Checker S-API

    Hi All,
    I am currently having a problem with the Extractor Checker (S-API).
    I have deleted and refilled the setup data for the extractor 2LIS_13_VDITM (Billing) in our source system. After doing that, when I use the extractor checker to view the data, it returns a number of records, but when I open the details to view the records, it shows rows with zero values and blanks.
    I looked at the billing transaction VF03 (Display Billing Document), and the data is there.
    This issue did not occur a few weeks ago. The only change in our source system has been to security; it seems there were some changes to the security of our user logon. Could this affect what data we can extract?
    PS: Even so, when I try to load it into BW, it shows a number of records, but nothing is updated to the DataStore because of the blanks and zeros coming from the extractor.
    Hope to hear some constructive replies. Thanks.

    Hi Robert,
    I have solved this issue. It was caused by code in EXIT_SAPLRSAP_001 added by a consultant, which wiped the displayed data. The code is below.
    After I commented it out, the extraction works again.
    * ---- Telesales billing requirement: begin of SSINGH code ----
    * Sales Order Created On  (VBAK-ERDAT) where VBAK-VBELN = VBRP-AUBEL
    * Sales Order Created By  (VBAK-ERNAM) where VBAK-VBELN = VBRP-AUBEL
    *   (same as 'login'/'name' in the functional spec)
    * Sales Document Type     (VBAK-AUART) where VBAK-VBELN = VBRP-AUBEL
    *   (same as 'order type' in the functional spec)
    * Actual PGI Date   (LIKP-WADAT_IST)   where LIKP-VBELN = VBRP-VGBEL
    *   (same as 'delivery date' in the functional spec)
      WHEN '2LIS_13_VDITM'.
        TYPES: BEGIN OF ty_kna1,
                 kunnr LIKE kna1-kunnr,
                 land1 LIKE kna1-land1,
                 name1 LIKE kna1-name1,
                 ort01 LIKE kna1-ort01,
                 pstlz LIKE kna1-pstlz,
                 regio LIKE kna1-regio,
                 sortl LIKE kna1-sortl,
                 stras LIKE kna1-stras,
                 telf1 LIKE kna1-telf1,
                 telfx LIKE kna1-telfx,
                 adrnr LIKE kna1-adrnr,
               END OF ty_kna1.
        DATA: t_kna1        TYPE STANDARD TABLE OF ty_kna1,
              wa_kna1       TYPE ty_kna1,
              wa_mc13vd0itm LIKE mc13vd0itm,
              c_mc13vd0itm  LIKE mc13vd0itm OCCURS 0 WITH HEADER LINE,
              l_tabix       LIKE sy-tabix.
    *   This SELECT runs BEFORE c_mc13vd0itm is filled from c_t_data; with an
    *   empty FOR ALL ENTRIES table the WHERE clause is ignored and all of
    *   KNA1 is read. t_kna1 is also never used afterwards.
        SELECT kunnr land1 name1 ort01 pstlz regio sortl stras
               FROM kna1 INTO TABLE t_kna1
               FOR ALL ENTRIES IN c_mc13vd0itm
               WHERE kunnr = c_mc13vd0itm-pkunwe.
        c_mc13vd0itm[] = c_t_data[].
        IF NOT c_mc13vd0itm[] IS INITIAL.
          LOOP AT c_t_data INTO c_mc13vd0itm.
            l_tabix = sy-tabix.
    *       The lookups below fill wa_mc13vd0itm, which never receives the
    *       current row of c_t_data ...
            SELECT SINGLE erdat auart ernam
              INTO (wa_mc13vd0itm-erdat, wa_mc13vd0itm-auart,
                    wa_mc13vd0itm-ernam)
              FROM vbak WHERE vbeln = wa_mc13vd0itm-aubel.
            SELECT SINGLE wadat_ist INTO wa_mc13vd0itm-wadat_ist
              FROM likp WHERE vbeln = wa_mc13vd0itm-vgbel.
    *       Get the ship-to address for the AU/AI requirement:
    *       get the name from KNA1 and then the address from the ADRD table
    *       (via PKUNWE).
    *       ... so this MODIFY replaces the extracted row with an almost
    *       empty work area - exactly the blanks and zeros seen in RSA3.
            MODIFY c_t_data FROM wa_mc13vd0itm INDEX l_tabix.
            CLEAR c_mc13vd0itm.
          ENDLOOP.
        ENDIF.
    * ---- End of SSINGH code ----
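    For comparison, here is a hedged sketch of how the same enrichment could be written without wiping the other fields: read the already-extracted row into the work area first, add only the lookup fields, and write that same work area back. Field names are taken from the snippet above and assume the corresponding appends exist on MC13VD0ITM; i_datasource stands for the DataSource name passed into EXIT_SAPLRSAP_001.
        DATA: ls_itm  LIKE mc13vd0itm,
              l_tabix LIKE sy-tabix.
        CASE i_datasource.
          WHEN '2LIS_13_VDITM'.
            LOOP AT c_t_data INTO ls_itm.        " start from the extracted row
              l_tabix = sy-tabix.
    *         Only the enrichment fields are touched; everything else keeps
    *         the values delivered by the extractor.
              SELECT SINGLE erdat auart ernam
                INTO (ls_itm-erdat, ls_itm-auart, ls_itm-ernam)
                FROM vbak WHERE vbeln = ls_itm-aubel.
              SELECT SINGLE wadat_ist INTO ls_itm-wadat_ist
                FROM likp WHERE vbeln = ls_itm-vgbel.
              MODIFY c_t_data FROM ls_itm INDEX l_tabix.
              CLEAR ls_itm.
            ENDLOOP.
        ENDCASE.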

  • How to show 'No Records Found' and 'Employee Name Unknown' in oracle report

    Hello,
    I'm using Reports 6i and building a report to show employees who have incorrectly input their time. I have an input parameter so a user can select a specific employee by emp_id, or leave it empty to show all; that part works. I also have date parameters that are required; that works too.
    However, I am having trouble displaying 'No Records Found' when the date parameters return no late or rejected employee time records. I currently have it as a text field arranged behind the emp_name field, which I filled white. It works, but I have a pretty good feeling there is a better way to do this.
    Also, I have some data that is null since I am using two tables: there are time stamps with no emp_name or emp_number. I still need to show these records, but I want them to show up as "Employee Name Unknown" so the user doesn't get confused and think the emp_name in the row above also applies to this row.
    select e.location "Clock Location",
           e.emp_no "Emp No",
           l.first_name || ' ' || l.last_name "Name",
           e.time_stamp "Time"
    from   emp_time e, master_all l
    where  e.emp_no (+) = l.emp_no
    and    e.status = 'rejected'
    --and  e.emp_no = nvl (:p_emp_no, emp_no)
    --and  e.time_stamp between :p_start_date and :p_end_date

    Hi,
    So, when the join between emp_time and master_all produces no rows, you still want one row of output, saying 'No Records Found'; is that right?
    If so, you can outer-join the result set to dual, with some join condition that accepts anything.
    Use CASE (or equivalents) to get special values (like 'No Records Found' or 'Employee name unknown') when certain columns are NULL.
    For example:
    SELECT  j.location      AS "Clock Location"
    ,       j.emp_no        AS "Emp No"
    ,       CASE
                WHEN  j.name IS NULL
                THEN  'No Records Found'
                ELSE  j.name
            END             AS "Name"
    ,       j.time_stamp    AS "Time"
    FROM    dual    d
    ,       (    -- Begin in-line view j, join of emp_time and master_all
                 SELECT  e.location
                 ,       e.emp_no
                 ,       e.time_stamp
                 ,       CASE
                             WHEN  l.first_name IS NULL
                             AND   l.last_name  IS NULL
                             THEN  'Employee name unknown'
                             ELSE  l.first_name || ' ' || l.last_name
                         END AS name
                 FROM    emp_time    e
                 ,       master_all  l
                 WHERE   e.emp_no (+)     = l.emp_no
                 AND     e.status (+)     = 'rejected'
    --           AND     e.emp_no (+)     = NVL (:p_emp_no, emp_no)
    --           AND     e.time_stamp (+) BETWEEN :p_start_date
    --                                        AND :p_end_date
            ) j  -- End in-line view j, join of emp_time and master_all
    WHERE   d.dummy != j.name (+)
    ;
    In an outer join, all conditions involving the optional table need a + sign; otherwise, the effect is the same as an inner join.
    The message 'No Records Found' is a string, so it has to go in a string column.
    I put it in the "Name" column, just because I knew that "Name" was a string.
    You can put it in any other column if you wish. If that column is not already a string, then use TO_CHAR to make it a string.
    You could also have a column just for this message.
    I hope this answers your question.
    If not, post a little sample data (CREATE TABLE and INSERT statements, relevant columns only) for all tables, and also post the results you want from that data.
    DOUBLE U wrote:
    I've tried NVL in the select statement, but since emp_name is a concatenation of first and last name it doesn't work. This is what I have tried:
    nvl(l.first_name|' '||l.last_name,'NO EMPLOYEE RECORD FOUND') "Employee",
    I assume you meant to have two | characters, not just one, after first_name.
    The first argument to NVL will never be NULL in that case; it will always contain at least a space, whether or not the other columns are NULL. You could say:
    NVL ( NULLIF ( l.first_name || ' ' || l.last_name
                 , ' '
                 )
        , 'NO EMPLOYEE RECORD FOUND'
        )  "Employee",
    but I find it less confusing to use CASE, as shown above.

  • Full load from a DSO to a cube processes less records than available in DSO

    We have a scenario where every Sunday I have to make a full load from a DSO with on-hand stock information to a cube, in which I register a counter on material and store level if there is stock available.
    The DTP has no filters at all and has a semantic group on 0MATERIAL and 0PLANT.
    The key in the DSO is:
    0MATERIAL
    0PLANT
    0STOCKTYPE
    0STOR_LOC
    0BOM
    of which only 0MATERIAL, 0PLANT and 0STOR_LOC are later used in the transformation.
    As we had a growing number of records, we decided to delete in the START routine all records where the inventory is not greater than zero, thus eliminating zero and negative inventory records (a minimal sketch of such a routine appears after this exchange).
    Now comes the funny part of the story:
    Prior to these changes I would [in a test system, just copied from PROD] read some 33 million records and write out the same amount. Of course, after the change we expected to write out less. To my total surprise, I was now reading 45 million records with the same unchanged DTP, and writing out the expected smaller number.
    When checking the number of records in the DSO I found the 45 million, but I cannot explain why the loads before retrieved only some 33 million from the same unchanged set of records.
    When checking in PROD - same result: we have some 45 million records in the DSO, but when we do the full load from the DSO to the cube, the DTP only processes some 33 million.
    What am I missing - is there a compression going on? Why would the number of records in a DSO differ from the number of records processed in the data packages when I am making a FULL load without any filter restrictions and only a semantic grouping in place for part of the DSO key?
    Any idea or thought is appreciated.

    Thanks Gaurav.
    I did check whether there were more/any loads done in between; there were none in the test system. As I mentioned, it was a new copy from PROD to TEST, and I compared the number of entries in the DSO: that seems to match between TEST and PROD (OK, a few more in PROD, but they can be accounted for). In TEST I loaded the day before the changes were imported to have a comparison, and between that load and the one after the changes were imported nothing in the DSO was changed.
    Both DTPs in TEST and PW2 load from the active DSO table [without archive]. The DTPs have not been changed in quite a while, so I ruled that out. Same with activation of data in the DSO: this DSO gets loaded and activated in PROD daily via process chain, and we load daily deltas into the cube in question. Only on Sundays, for the start of the new week/fiscal period, do we need to make a full load to capture all materials per site with inventory. The deltas loaded during the week are less than 1 million records, but the difference between the number of records in the DSO and the amount processed in the data packages is more than 10 million per full load, even in PROD.
    I really appreciate the knowledgeable answer; I just wish it had pointed out something that I missed.
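    For reference, the kind of START routine filter the question describes can be as small as the sketch below. It assumes a class-based transformation (the code goes into METHOD start_routine of the generated routine class) and a hypothetical stock key figure /BIC/ZONHAND in the source package; the real field name has to be taken from the DSO.
    * Drop every record whose stock is not greater than zero, so that only
    * positive on-hand quantities are passed on to the cube.
      DELETE source_package WHERE /bic/zonhand <= 0.
    Note that a filter like this only reduces the number of records written by the transformation; it does not change how many records the DTP reads from the DSO in the first place.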

  • EPM 2013 Report is displayed as cut off and shows less than half the width of the report in IE 10 and IE 9.

    An EPM 2013 report is displayed cut off and shows less than half the width of the report in IE 10 and IE 9. This issue occurs for some users only. I tried the same URL in Google Chrome and it works fine.
    Can you please help?
    Please find the attached screenshot.

    Hi BJ1986,
    According to your description, I think this issue is caused by the IE browser that runs the report. To troubleshoot it, I suggest clearing temporary internet files, cookies and other data in the IE browser, and also adding the site as a trusted site in IE, then checking the issue again.
    If the issue still exists, it can also occur due to the settings or third-party add-ons of Internet Explorer (IE). In this scenario, you can try running IE in compatibility mode and check the issue again, or follow the steps below to troubleshoot:
    Click Tools -> Internet Options.
    Switch to the Security tab, click Local intranet, and then select Default level.
    Switch to the Advanced tab, and click Restore advanced settings.
    Temporarily disable third-party add-ons. For detailed steps, please see the link below:
    http://windows.microsoft.com/en-IN/internet-explorer/manage-add-ons#ie
    Hope this helps.
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • Showing less data at retrieval time

    Hi Everyone,
    I have a data file to load into a cube. The data loads successfully without any error, but at retrieval time the data shown is less than in my original file.
    I then checked the data in the cube against the agents hierarchy and observed that data for some agents is being lost: those agents are parents of other members, so when we run the default calc the children's data overwrites the parent's data, even though the data file contains data for the parent itself.
    For example: A is the parent, and B and C are children of A. When we run the default calc, B and C's data overwrites A's data, but I also have data for A itself, so A's data is lost even though I need to maintain it. To handle this I created a child member 'A' under the parent and renamed the parent to A(Supervisor),
    so that A(Supervisor) is the parent and A, B, C are its children.
    Like this I get the correct data, but I am doing all these modifications manually, identifying the members by hand.
    I have around 23,000 agents with different hierarchies, so it is very difficult to identify and add them like this manually.
    Is there any script I can write to solve this problem if any data is being missed, or
    is there any other method to do this task faster?
    If anyone knows, please tell me how to proceed with this issue.
    Thanks in advance.
    Sravan

    You can create an adjustment account under A called A_adj, load data to A, B, and C, then before you consolidate, write a script that calculates the adjustment (A_adj) as the difference between A and B + C: "A_adj" = "A" - ("B" + "C");
    Getting all this to work dynamically will take some work, but using the @CHILDREN and @CONCAT functions along with @CURRMBR should get it done. As for performance, that remains to be seen.
    Is there any particular reason why your supporting detail doesn't support the totals you are getting? Perhaps you should go back to the source and get a better data feed that has all the correct numbers in it. I know you can't always do that, but before engineering a big workaround, I would take a pass at the source and see if you can get better data to begin with.

  • 2LIS_02_SCL extractor is giving 0 records

    Hi,
    I tried to load data into BI and I am getting 0 records.
    I checked the extractor in RSA3 and it gave me 0 records, but when I checked RSA7 in R/3 I saw lots of records.
    What could the problem be?
    thanks

    Hi Vaidya,
    Check in LBWE whether the job control is maintained.
    If you don't maintain the job control, no data gets through to the DataSource.
    I mean, you cannot get deltas and so on.
    Regards,
    Ravi Kanth

  • R/3 Extractor Checker doubt

    I created a custom master data and text DataSource, and when I run the RSA3 extractor checker:
    if I change "Data Records / Calls" and run the extraction, the number of records extracted changes.
    For example, if I enter:
    Data Records / Calls = 1000, I get 570 records.
    Data Records / Calls = 10000, I get 57 records.
    My question is: why is the number of records extracted not consistent?
    Similarly, if I change "Display Extr. Calls" to 1 or 10, the number of records changes.
    Any suggestions, please?
    Thanks,
    Shalini

    Hi Shalini,
    The idea behind these two fields in RSA3 is this:
    Data Records / Calls = 1000
    Display Extr. Calls  = 1
    means: display one call, where each call can contain a maximum of 1,000 records. If you change the second parameter to 10, it means: display 10 calls (packages) with a maximum of 1,000 records per package, i.e. up to 10,000 records in total.
    Unfortunately, this figure is not always correct. I do not know in which scenarios, but as you mentioned, it is not always consistent.
    Thanks and Regards
    Subray Hegde

  • Is it standard behavior for VL10A/table VEPVG to show two records when a Sales Document has been blocked?

    Hi Experts,
    Is it standard behavior for VL10A to show two records when a sales document has been blocked? Their only difference is the field Delivery Block: in VL10A, the first record has a blank delivery block, the second has 99.
    Here's how to replicate the issue:
    Create a sales order.
    When you check VL10A, the sales document is there.
    Change the sales order field (RSD) in VA02 and save.
    When you check VL10A, there are now two records: one has a blank delivery block, the other has 99.
    The expected result is that after the change in VA02 there would be only one record in VL10A, and it would have delivery block 99.
    Assumptions:
    1. We know that VL10A retrieves its records from VEPVG. The problem is that in VEPVG the delivery block is a key field. So I think that during VA02, when the delivery block 99 is assigned, a new record is created in VEPVG instead of the existing one being updated. Is this standard behavior, and are my assumptions correct?
    Thanks in advance, experts. I appreciate your prompt response,
    Jack

    Hello Jack,
    This is standard behavior. When I check in our system, I too see two entries, but with different goods issue dates and delivery dates; the block is specific to the goods issue date and delivery date.
    So there is no problem with it. Try entering a delivery date selection that covers both table entries' delivery dates and execute transaction VL10A; a quick check of the index table itself is sketched below.
    Regards,
    TP
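    To see this in your own system, a quick hedged check of the delivery due index is sketched below; the sales document number is a placeholder, and the exact key fields (including the delivery block field) should be verified in SE11.
    DATA lt_vepvg TYPE STANDARD TABLE OF vepvg.
    SELECT * FROM vepvg
      INTO TABLE lt_vepvg
      WHERE vbeln = '0000012345'.       " placeholder sales document number
    * After the block is set in VA02 there should be two index rows whose
    * keys differ only in the delivery block (and possibly the dates, as
    * noted above).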

  • RSA7 shows zero records for delta upload of 2LIS_03_BF Datasource

    Dear Professionals,
    I am having constant issues with the Inventory Management DataSource 2LIS_03_BF. I scheduled the delta upload in the R/3 production system using LBWE, keeping in mind that LBWE has no options for posting dates, etc. In RSA3 there are records, but when I check RSA7 it shows 0 records. What could the problem be? Do I need to do any activation anywhere in LBWE or RSA7 to see the records? Please advise.

    Hi,
    Normally there should be a collective run which transfers data from the application-specific extraction queue to the BW delta queue (RSA7), like this:
    MCEX03 queue ==> RMBWV03 (collective run job) ==> RSA7 delta queue ==> BW
    It is necessary to schedule this job control regularly - see point 3 of SAP Note 505700.
    Check also these SAP Notes:
    869628  Constant WAITUPDA in SMQ1 and performance optimization
    728687  Delta queued: No data in RSA7
    527481  tRFC or qRFC calls are not processed
    739863  Repairing data in BW
    Rgds,
    Colum
