Extractor checker - too few records

Hi experts,
I used the extractor checker (RSA3) for a Full update on 2LIS_11_VAHDR and 2LIS_11_VAITM and got only 2 records in the results.
Isn't something wrong? VBAP has tons of records, and so does VBAK.

RSA3 in Full mode shows the number of records that are in the setup tables.
One of your colleagues probably filled the setup tables for only one or two document numbers; that is why it shows 2.
Delete the setup table data with transaction LBWG for your application area (11 - SD Sales).
Then refill the setup tables without any selection (for sales documents, the statistical setup transaction OLI7BW) and you will see the entire data set.
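A quick way to confirm how many records the setup tables actually hold - a minimal sketch, assuming the standard setup-table names MC11VA0HDRSETUP and MC11VA0ITMSETUP for these two datasources (verify the names in SE11 first):

    REPORT zcheck_sd_setup_tables.

    DATA: lv_hdr TYPE i,
          lv_itm TYPE i.

    " Count the rows currently sitting in the setup tables that RSA3 (Full) reads
    SELECT COUNT( * ) FROM mc11va0hdrsetup INTO lv_hdr.
    SELECT COUNT( * ) FROM mc11va0itmsetup INTO lv_itm.

    WRITE: / '2LIS_11_VAHDR setup records:', lv_hdr,
           / '2LIS_11_VAITM setup records:', lv_itm.

If these counts are also 2, the setup tables were simply never filled completely.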
Regards,
Nagesh Ganisetti.
Assign points if it helps.

Similar Messages

  • Extractor Checker shows 0 Records

    Hi,
    I am working on the Transportation Management (TMS) part of Supply Chain Management. In RSA3 on the TMS system I checked the datasources 0TMS_SOR and 0TMS_SRQ, and they showed 0 records despite data being there. Since SCM is based on the Business Object Processing Framework, I am not able to check the data at table level, and I don't know how to go about verifying that the data exists. The TMS developers say there is data in the TMS system, but before integrating with the BI system I wanted to check the datasources in the extractor checker, and it shows 0 records. What could be the problem, and how should I go about this?
    Kindly help me on this !
    Thanks in Advance,
    Regards,
    Hudson

    Fanie, we are working on the same issue.
    For SRQ, there are special 'scheduling' conditions used in the PPF framework, stating that only SRQs with a lifecycle status of 'Complete' should be extracted. This logic is embedded in the PPF scheduling conditions (implementation /SCMTMS/CL_IM_SCHD_EXTR_SRQ).
    Bottom line: talk to the developers and ask whether any SRQs are 'Complete'. If there are none, that is why you are not getting any data from your extractor.
    Justin

  • Extractor Checker Shows Fewer Records

    Hi,
    I'm using RSA3 to check a particular extractor. It only shows two months' worth of records, but more records are maintained in the source system.
    My settings are:
    Data Records / Calls = 10000
    Display Extr. Calls = 10
    Update Mode = F
    I have turned on debug mode, but I still can't figure it out.
    How can I rectify this problem? I'd appreciate any guidance.
    thanks.
    regards,
    zl

    Hello,
    What is the extractor type? Is it generic?
    Are you getting the full 100,000 records (10,000 records per call x 10 calls) for the selection below? Check by increasing Display Extr. Calls to 100.
    Also, RSA3 sometimes does not return all the required records. Try scheduling a load to BI through an InfoPackage and check the number of records coming in.
    Regards,
    Dhanya

  • Report takes a long time for a few records

    hi frends,
    I am facing a problem with my web-based ERP application, which is developed in .NET. When I open a report from my application, a file named "rpt conmgr cache" is created in my temp folder.
    Because of this, the report takes a long time and opens very slowly, even for a few records. It happens only in some of the reports; the other reports work fine and do not create any file in the temp folder. Can you tell me what this file is and what the solution could be?
    Thanks
    Mithun

    Hi Sabhajit,
    I have already checked the SQL query; it takes less than a second.
    If there are any other steps you want me to check, please let me know.
    Thanks, Mithun

  • Extractor Checker OK but actual Extract has no data!

    Hi all,
    Thanks for taking the time to read my post and for your help in advance
    I have a simple custom function module in ECC 6.0 that gets employee information from LDAP. This information is then combined with a few other fields and sent to BW.
    I created a datasource based on this function module and I'm trying to send the data to BW. The extractor checker works fine and shows the right data. However, when I run the InfoPackage from BW, I receive no data!
    I checked the IDocs as well and noticed that the info IDocs are sent back to BW with 0 data records, which tells me that the extractor is not sending back any data, despite it showing the data in the extractor checker (RSA3).
    Any thoughts?
    Thanks,
    Mamdouh

    Thanks everyone,
    The partner profile is OK, and when I test in RSA3 I see the data that I need, so running the extract in debug mode is fine. However, running it in background mode gives me 0 data records.
    I don't think it is an authorization error either!
    Any thoughts?
    Thanks again

  • Blanks and Zeros in Extractor Checker S-API

    Hi All,
    I am currently having a problem with the Extractor Checker S-API.
    I have deleted and refilled the setup tables for extractor 2LIS_13_VDITM (Billing) in our source system. After doing that, when I use the Extractor Checker S-API to view the data, it returns a number of records, but when I open the details to view them, the rows contain only zeros and blanks.
    I checked the billing transaction VF03 (Display Billing Document), and the data is there.
    This issue did not occur a few weeks ago. The only change in our source system has been on the security side; there seem to have been some changes to our user logon authorizations. Can this actually affect the data we can extract?
    PS: When I load the data into BW, it shows a number of records, but nothing is updated in the DataStore because of the blanks and zeros coming from the extractor.
    I hope to hear a constructive answer on this. Thanks.

    Hi Robert,
    I have solved this issue. It was caused by code that a consultant had added in customer exit EXIT_SAPLRSAP_001, which blanks out the data. Once I commented the code out, the extraction worked. The code was as follows:
    *---- Telesales Billing Requirement
    * Begin of SSINGH code
    * Sales Order Created On date (VBAK-ERDAT) where VBAK-VBELN = VBRP-AUBEL.
    * Sales Order Created By (VBAK-ERNAM) where VBAK-VBELN = VBRP-AUBEL. This field is the same as 'login' in the functional spec; its description is the same as 'name' in the func spec.
    * Sales Document Type (VBAK-AUART) where VBAK-VBELN = VBRP-AUBEL. This field is the same as 'order type' in the functional spec.
    * Actual PGI Date (LIKP-WADAT_IST) where LIKP-VBELN = VBRP-VGBEL. This field is the same as 'delivery date' in the functional spec.
      when '2LIS_13_VDITM'.
        TYPES: BEGIN OF TY_KNA1,
                 KUNNR LIKE KNA1-KUNNR,
                 LAND1 LIKE KNA1-LAND1,
                 NAME1 LIKE KNA1-NAME1,
                 ORT01 LIKE KNA1-ORT01,
                 PSTLZ LIKE KNA1-PSTLZ,
                 REGIO LIKE KNA1-REGIO,
                 SORTL LIKE KNA1-SORTL,
                 STRAS LIKE KNA1-STRAS,
                 TELF1 LIKE KNA1-TELF1,
                 TELFX LIKE KNA1-TELFX,
                 ADRNR LIKE KNA1-ADRNR,
               END OF TY_KNA1.
        DATA: T_KNA1 TYPE STANDARD TABLE OF TY_KNA1.
        DATA: WA_KNA1 TYPE TY_KNA1.
        DATA: WA_MC13VD0ITM LIKE MC13VD0ITM.
        DATA: C_MC13VD0ITM LIKE MC13VD0ITM OCCURS 0 WITH HEADER LINE.
        DATA: L_TABIX LIKE SY-TABIX.   "added here for completeness
    *   Note: this SELECT runs before C_MC13VD0ITM is filled from C_T_DATA,
    *   so the FOR ALL ENTRIES driver table is empty and T_KNA1 stays empty.
        SELECT KUNNR LAND1 NAME1 ORT01 PSTLZ REGIO SORTL STRAS
               FROM KNA1 INTO TABLE T_KNA1
               FOR ALL ENTRIES IN C_MC13VD0ITM
               WHERE KUNNR = C_MC13VD0ITM-PKUNWE.
        C_MC13VD0ITM[] = C_T_DATA[].
        IF NOT C_MC13VD0ITM[] IS INITIAL.
          LOOP AT C_T_DATA INTO C_MC13VD0ITM.
            L_TABIX = SY-TABIX.
    *       WA_MC13VD0ITM is never filled from the current row, so AUBEL and
    *       VGBEL are initial here, both lookups find nothing, and the MODIFY
    *       below overwrites the row with an empty work area - which is what
    *       produces the blank and zero records in RSA3 and BW.
            SELECT SINGLE ERDAT AUART ERNAM INTO
                (WA_MC13VD0ITM-ERDAT, WA_MC13VD0ITM-AUART, WA_MC13VD0ITM-ERNAM)
                FROM VBAK WHERE VBELN = WA_MC13VD0ITM-AUBEL.
            SELECT SINGLE WADAT_IST INTO WA_MC13VD0ITM-WADAT_IST
                FROM LIKP WHERE VBELN = WA_MC13VD0ITM-VGBEL.
    *       Get shipping address for AU/AI requirement: get the name from KNA1
    *       (via PKUNWE) and then the address from the ADRD table - never
    *       implemented; T_KNA1 is not used.
            MODIFY C_T_DATA FROM WA_MC13VD0ITM INDEX L_TABIX.
            CLEAR C_MC13VD0ITM.
          ENDLOOP.
        ENDIF.
    * End of SSINGH code
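
    For reference, here is a sketch of how the same enrichment could have been written so that it does not wipe the rows - an untested reconstruction of my own, not the consultant's actual fix (the KNA1/address lookup and the C_MC13VD0ITM helper table are left out, since the posted code never uses their results):

      when '2LIS_13_VDITM'.
        if not c_t_data[] is initial.
          loop at c_t_data into wa_mc13vd0itm.      "read each extracted row
            l_tabix = sy-tabix.
            " Enrich the row from the order header (VBAK) and the delivery (LIKP).
            select single erdat auart ernam
              into (wa_mc13vd0itm-erdat, wa_mc13vd0itm-auart, wa_mc13vd0itm-ernam)
              from vbak
              where vbeln = wa_mc13vd0itm-aubel.
            select single wadat_ist
              into wa_mc13vd0itm-wadat_ist
              from likp
              where vbeln = wa_mc13vd0itm-vgbel.
            " Write the enriched row back instead of a blank work area.
            modify c_t_data from wa_mc13vd0itm index l_tabix.
            clear wa_mc13vd0itm.
          endloop.
        endif.

    Note that a SELECT SINGLE inside the loop is slow for large billing extractions; reading VBAK and LIKP once with FOR ALL ENTRIES into internal tables would scale better.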

  • Data mismatch in extractor checker

    Hi Gurus,
    Please help me to solve this issue.
    There is a mismatch of records when checking the datasource through the extractor checker: the extractor shows 1,476 records, but when I check the underlying table in transaction SE11 it has only 506 records. Can anyone tell me the possible reasons and how to analyze and solve this issue?

    If the generic datasource is based on a function module: check how the function module is designed to extract the data.
    If it is based on a table: check the table contents (e.g. in SE16) with the same restrictions you use in RSA3.
    If it is based on a view: check the view's data the same way, with the same restrictions you use in RSA3 (see the sketch below).
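    A minimal sketch of such a cross-check, assuming the datasource reads a plain table; the table name ZSD_DOC_FLOW and the date field ERDAT are hypothetical placeholders for whatever your datasource is built on:

        DATA lv_count TYPE i.

        " Count the rows the datasource should deliver, using the same
        " selection you enter on the RSA3 screen (hypothetical date range).
        SELECT COUNT( * ) FROM zsd_doc_flow
          INTO lv_count
          WHERE erdat BETWEEN '20100101' AND '20101231'.

        WRITE: / 'Rows in the source table for this selection:', lv_count.

    If this count and the RSA3 count still differ, the difference comes from the extraction logic itself (for example a join in the view or code in the function module) rather than from the table.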
    Hope this helps.

  • Material Description is not displayed for a few records in a report

    Dear Experts,
    The material description is not displayed for a few records in the BEx report, although it does appear for the other records, and even the column heading for the material description is missing in this report.
    Could you please suggest a good solution for this?
    Regards,
    Sai Phani.

    Hi Phani,
    For the missing material text, check whether a text is maintained for that material in the master data (see the sketch below). Also try switching the display to medium text or long text.
    Regarding the header: you will have the same heading for key and text; you may use 'Material ID / Description' as the heading if you want.
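    A minimal check on the source side, assuming the texts come from the standard material description table MAKT; the material number and language below are placeholders:

        DATA lv_maktx TYPE makt-maktx.

        " Does a description exist for this material in the report language?
        SELECT SINGLE maktx FROM makt
          INTO lv_maktx
          WHERE matnr = '000000000000100001'   "material showing a blank text
            AND spras = 'E'.                   "language used by the query

        IF sy-subrc <> 0.
          WRITE: / 'No description maintained - the report will show a blank.'.
        ELSE.
          WRITE: / 'Description found:', lv_maktx.
        ENDIF.

    If the text exists here but the BEx report still shows a blank, the material text data in BW probably needs to be reloaded and activated, and the query language should match the maintained one.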
    regards,
    Rk.

  • Disk is too slow (Record) (-10004) - help!

    I got this message the other day for the first time:
    CoreAudio:
    Disk is too slow. (Record)
    (-10004)
    I was recording my band with two room mics just for fun, and this happened about three times. I was recording straight to the hard drive of my MacBook Pro because I did not bring my external drive, using a FirePod connected via FireWire 400. I was recording only two channels of audio for the two room mics, with no plug-ins except maybe one EQ on a track. Does anyone know about this kind of error message and what it entails? If I knew how to post a picture I would put up a screenshot of my audio driver settings; maybe I am doing something wrong there. I was recording our drummer's drums the other day with 4 mics (4 tracks) and this did not happen, but I was not recording nearly as long. Does anyone have any advice? If I was too vague in my description, I will answer any questions.

    A couple of things -
    Recording to the internal drive is never a great choice, but you shouldn't have problems recording 2 tracks, or even 4. Is your disk getting full? I would make sure at least 10% of the drive is free (4 GB free on a 40 GB drive, for example).
    Check your maximum record time by pressing the "A" key and set it to the lowest usable value. 15 minutes is a good start, especially if you have limited drive space, as when recording to the internal HD.
    Enable the Larger Disk Buffer in the Audio Prefs. This always seems to work for me.
    Is all your software up to date? Are your FirePod drivers current?
    Check your RAM: Apple Menu -> About This Mac -> More Info. Bad RAM is a common, often overlooked problem.
    That's all I got...
    -John

  • Disk is too slow (Record)(-10004) error..so sick of this.

    Hello all,
    I can no longer record more than three tracks in logic without getting the error message "Disk is too slow (Record)(-10004)". When this happens, recording is stopped.
    At first I suspected my drive was faulty, maybe slowing down. So, being in the middle of a session that took me 2 hours to set up, I called for a break, rushed off and bought an external Seagate 7200 rpm FireWire 800 drive. I installed it and set it as the recording path for the project. There was no change; the same error occurred.
    I then switched the target drive to another internal one I use for Time Machine - same problem occurred.
    It seems to me that this problem has nothing to do with my drives. I am at a loss to explain it. I have looked for hours online for a solution, but while many have experienced this, there seem to be few answers out there.
    Unless I find a solution this will be my last project with Logic. I have tried and tried for the last 5 years to use this program, but things like this keep happening. It's glitchy with UAD cards, Duende, RME interfaces, MIDI controllers, hard drives, RAM and external clocks. I've had problems with them all over the years. I will most likely switch to Cubase, which I feel is inferior for editing and loops, but at least it seems to be stable.
    If anyone has any insight I'll try and fix it, but I just can't keep shelling out money for a program that just doesn't work.

    I am experiencing a similar problem and have been receiving the same messages, even while recording as little as one track, and playback has become an issue as well. However, THIS WAS NOT ALWAYS THE CASE. I have heard of people with this same problem, where they receive this message out of nowhere after Logic has been working perfectly for them.
    I would also like to note that I am running all settings in Logic optimized for recording and playback (audio and buffer settings, etc.).
    THIS IS NOT A HARDWARE ISSUE, at least in my situation, as I am running a fast internal HD and have ample memory. Please reach out if you feel you have a pragmatic solution to this issue.
    This may be a lead on a fix: I remember reading the post below from a user "soundsgood" in 2008 who was having a similar issue. I don't completely understand his solution, but if someone could enlighten me, I feel this might be the answer to our issue:
    +"Okay - forgive me 'cause I'm a newbie on this forum and if somebody else has already figgered this out, I'm sorry.... I've been having the same problem all of the sudden after many years of crash-free and error-free recording. I've read everything. I've pulled my hair out. I've done dozens of clean installs. I've repaired permissions so many times I can do it blindfolded. And sitting here tonight, it dawned on me.... there are TWO places Logic is sucking data from: wherever you've got your SONG files stashed, of course.... but it ALSO NEEDS TO ACCESS THE STARTUP DRIVE (or wherever else you might have your Apple Loops installed). I was watching my drives being accessed during a playback of a fairly complicated tune (most tracks were frozen of course), and both of the afore-mentioned drives were going berserck with accesses. We're all focusing on our dedicated audio drives, but forgetting about our boot drives (where Logic usually resides along with most or all of our loops). I carbon copy cloned my boot / operating system (including Logic) to a different (in my case, an external firewire) drive and the problem disappeared. Could've been because the cloning process de-fragged all the loops & stuff, or maybe my OS just likes snatching its sample/loop info from an external drive. Worked for me... so far....... let me know if it works for others....."+

  • 2LIS_02_SCL extractor is giving 0 records

    Hi,
    I tried to load data into BI and I am getting 0 records.
    I checked the extractor in RSA3, and it also gave me 0 records, but when I checked RSA7 in R/3 I saw lots of records.
    What could be the problem?
    thanks

    Hi Vaidya,
    Check in LBWE whether the job control is maintained.
    If the job control is not maintained, the data will not reach the datasource, i.e. you will not get the deltas.
    Regards,
    Ravi Kanth

  • R/3 Extractor Checker doubt

    I created a custom master data and text datasource, and when I run the RSA3 extractor checker:
    If I change "Data Records / Calls" and run the extractor, the number of records extracted changes. For example:
    Data Records / Calls = 1000 - I get 570 records
    Data Records / Calls = 10000 - I get 57 records
    My question is: why is the number of records extracted not consistent?
    Similarly, if I change "Display Extr. Calls" to 1 or 10, the number of records changes.
    Any suggestions, please?
    Thanks,
    Shalini

    Hi Shalini,
    The idea of these two fields in RSA3 is this:
    Data Records / Calls = 1000
    Display Extr. Calls = 1
    This means: display one call, and each call can contain a maximum of 1,000 records. If you change the second parameter to 10, it means: display 10 calls (packages) with a maximum of 1,000 records per package (see the example below).
    Unfortunately, the displayed figure is not always correct. I do not know the exact scenarios, but as you mentioned, it is not always accurate.
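    A rough worked example, assuming every package is completely filled (a simplification, since the last package is usually only partly full):
    Data Records / Calls = 1000,  Display Extr. Calls = 1  -> at most 1 x 1,000 = 1,000 records displayed
    Data Records / Calls = 1000,  Display Extr. Calls = 10 -> at most 10 x 1,000 = 10,000 records displayed
    Data Records / Calls = 10000, Display Extr. Calls = 10 -> at most 10 x 10,000 = 100,000 records displayed
    So the two fields together only set an upper bound on what RSA3 displays; they do not change how many records exist in the source.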
    Thanks and Regards
    Subray Hegde

  • How to make a few records editable in an Oracle ADF form

    Hi,
    I am working on a scenario where we send a few records to the user in an ADF task form, using a BPEL human task. The requirement is to put a checkbox in each row and enable the fields of that particular row for editing.
    Please respond if there is a way to put a condition on the checkbox to enable the row data for editing.
    thanks,
    rps

    Hi,
    Actually, to implement checkboxes in front of a table you need some sort of transient field that can keep its state. One way of achieving this is to wrap the BPEL service in a web service proxy client and create a POJO data control from it. That then allows you to add an additional field and implement a solution similar to this one in ADF BC:
    http://sameh-nassar.blogspot.com/2009/12/use-checkbox-for-selecting-multiple.html
    Because ADF Faces tables are stamped upon rendering, rows aren't created with instances of the cell renderer. For this reason you need to keep track of the selection state in the model, which you can do with a transient attribute; that makes sure the selection information is part of the row object. So, similar to what you do today, you would iterate over the available rows but, before changing the update state, check whether the user intended the update.
    Frank

  • A few records are missing while downloading from a report to a spreadsheet

    Dear Gurus,
    A few records are missing when downloading a Z report to a spreadsheet. There are around 300 records, of which 11 do not appear in the spreadsheet file after saving. The odd thing is that when I save in another format, such as HTML, or copy to the clipboard, all the records come through.
    When asked, the ABAPer said:
        "Your report output is correct. If the report were wrong, we could check the code in the Z report, but saving it to a spreadsheet is done by a standard program provided by SAP."
    He is also facing this problem for the first time.
    Can anybody help.
    Thanks in advance, and you will get points if this is useful.
    Regards

    Hi,
    A few days back we got this kind of error when I tried to download asset balances in Excel format.
    1) It turned out that one of the asset descriptions ended with a comma (","). Because of this, all the other details were squeezed into a single cell. Once we corrected the master data of that asset, the report downloaded properly.
    2) Another time, when we tried to download material master details, the description was not maintained for one of the materials. After maintaining the description, the problem was resolved.
    Hope this information is helpful to you.

  • Report timing out, "Too many records"

    I am using CR XI R2 with a universe that I created to run an inventory report. The report works fine in the Designer, but when I run it from the CR Server it times out (very quickly). If I add filters and pull only a subset of the data, the report works fine. The thing is, there should not be too many records. Where can I check to make sure no limitations are set, so that I can get all of the records? I have already set the Universe parameters to no limit on the number of records or the execution time. Is there anywhere else I can check, or any other ideas?
    Thanks

    What viewer do you use?
    If the viewer is ADHTML, you are using the RAS (Report Application Server). In that case, you need to set the number of (db) records to read in the RAS properties on the CMC.
    If the viewer is DHTML, Java or ActiveX, you are using the CR Page Server. You will need to set the number of db records to read in the Crystal Reports Page Server properties on the CMC.
