0FI_AR_4 delta problem

We have recently implemented the 0FI_AR_4 extract of AR customer line items into BW in conjunction with some of the Collections Management targets.  This appeared to be working well and was giving the data we expected.  However, after we started loading in Production, we realized that items cleared after the initialization were still reflected as open in the BW DataStore/cube.  Looking at the delta updates, we see the clearing document itself, but no records marking the old open items with a cleared status/date/etc.
After reinitializing the targets, the affected data looks correct again and the items are marked as cleared.  We get delta updates nightly, but they just don't appear to contain everything we expect.  I have looked at some old notes in this area discussing 0FI_GL_4 and possible dependencies, or the need to decouple the two.  We are on SAP ECC 6.0 - 7.0 SP14 and BW 7.0 SP21.  Could anyone elaborate on whether there would still be any need to also load 0FI_GL_4 to get 0FI_AR_4 deltas to work?  Or is there anything I should specifically look for in this area before opening a message with OSS?  For now, I was just going to reinitialize periodically to be safe.
Thanks in advance for any assistance you can provide on this...

Abby, we did some more research on exactly how the deltas worked with this extract in our environment.  Ultimately we didn't change anything -- it was more a matter of figuring out what timeframes were being pulled, so we could understand differences in balances between R3 and BW at a given point.  There are other tables involved beyond just BSID and BSAD, such as BWFI_AEDAT, which holds pointers to changed records in BSID/BSAD, since those tables don't always have a change date to drive deltas from.  There is also a table called BWOM2_TIMEST that holds the delta upper-limit timestamp for each 0FI_AR_4 load, and it behaves according to the settings in BWOM_SETTINGS.  It controls the time period covered by each daily load (in our case, 10 pm to 10 pm), so that was also part of the reason we were 'missing' records: we didn't realize that a load running at 5 am one morning was only pulling records up to 10 pm the preceding night.  Once we felt better about the completeness of the records coming into BW, we no longer pursued it.
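If you want to see the same thing on your own system, here is a minimal report sketch that lists the delta intervals logged for 0FI_AR_4 in BWOM2_TIMEST. The field names used below (OLTPSOURCE, AEDAT, AETIM, UPDMODE, TS_LOW, TS_HIGH) are my assumption of the standard table layout -- verify them in SE11 before relying on this.

REPORT ZCHECK_FI_AR4_TIMEST.

* Sketch only: list the delta intervals logged for 0FI_AR_4 so you can
* see which change window each load actually covered. Field names are
* assumed from the standard table layout -- verify in SE11.
DATA: LT_TIMEST TYPE STANDARD TABLE OF BWOM2_TIMEST,
      LS_TIMEST TYPE BWOM2_TIMEST.

SELECT * FROM BWOM2_TIMEST INTO TABLE LT_TIMEST
  WHERE OLTPSOURCE = '0FI_AR_4'.

SORT LT_TIMEST BY AEDAT DESCENDING AETIM DESCENDING.

LOOP AT LT_TIMEST INTO LS_TIMEST.
  WRITE: / LS_TIMEST-AEDAT,    "date the extraction ran
           LS_TIMEST-AETIM,    "time the extraction ran
           LS_TIMEST-UPDMODE,  "C = init, D = delta
           LS_TIMEST-TS_LOW,   "lower limit of the extracted interval
           LS_TIMEST-TS_HIGH.  "upper limit (our 10 pm cut-off)
ENDLOOP.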
This extract is by design a daily load, but there are some notes out there about additional configuration to set up 'minute-based' extraction and be able to run it multiple times a day.  Fortunately we didn't have to look further into that.  Also, the 'coupling' between 0FI_GL_4 and 0FI_AR_4 is no longer a requirement in our version, so apparently that was never part of the issue either.
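The mode your system runs in is controlled by parameters in BWOM_SETTINGS. A hedged sketch for listing the relevant ones, assuming the standard PARAM_NAME/PARAM_VALUE layout -- the parameter names (BWFIOVERLA for minute-based extraction, BWFITIMBOR for the time border, BWFINSAF for the safety interval) come from the notes on minute-based extraction and should be confirmed for your release:

REPORT ZCHECK_FI_SETTINGS.

* Sketch only: show the FI extraction control parameters. Parameter
* and field names are assumptions -- confirm them for your release.
DATA: LT_SET TYPE STANDARD TABLE OF BWOM_SETTINGS,
      LS_SET TYPE BWOM_SETTINGS.

SELECT * FROM BWOM_SETTINGS INTO TABLE LT_SET
  WHERE PARAM_NAME IN ('BWFIOVERLA', 'BWFITIMBOR', 'BWFINSAF').

LOOP AT LT_SET INTO LS_SET.
  WRITE: / LS_SET-PARAM_NAME,   "which switch
           LS_SET-PARAM_VALUE.  "e.g. 'X', '020000', '3600'
ENDLOOP.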
Hope that helps you.  Good luck...

Similar Messages

  • Copa delta problem

    Hi All,
    I enhanced the CO-PA DataSource by adding a new value field, then loaded the data.
    After an init load without data transfer, running the delta update gives ZERO records.
    When I check at the CO-PA table level and in the KE24 report on the R/3 side, the data is there.
    Is there a problem with my delta? If so, where do I check and how do I resolve it?
    Please help me ASAP.
    Thanks in advance.
    prasantha

    Hi Sathish,
    Based on your information I checked KEB2, and it gives the error below. Can you help with what this is and how to resolve it? (Based on the message, I also suspect the delta is only getting data at header level.)
    "Subsequent time stamp information only has statistical character. The time stamp relevant for reading data is found in the header information for the DataSource. For data extraction from several BW systems, the number of records read and sent that is displayed does not have to agree with the actual number (due to Service API logic). Delta init simulations are not listed in the overview because the CO-PA extractor is not called in this case and it consequently does not update any log entry."
    Thanks and regards,
    kumar.k

  • 0FI_AR_4 Delta Timestamp

    Hello,
    We are using 0FI_AR_4 to extract our AR line items. There seems to be a delay of 1 DAY before we get new/changed records. We are on SAP_APPL 603 level 0004. I went through the forum and also looked at SAP Note 991429, but that seems to be for a previous version of ECC. I looked at the note details; the BWOM_SETTINGS table seems to be fine in our case. I am not sure where the problem is. Can you please suggest?
    Thanks in advance.

    Hi Tibollo
    My assumption is that the 1-day safety delta depends on the entries maintained in BWOM_SETTINGS. If we apply OSS note 991429, these become minute-based extractors; we are maintaining the same entries as the note.
    I ran the process chain on 4/26/2011 at 5:45 PM and did not get the records created in SAP on 4/26/2011 into BI.
    I ran the process chain on 4/27/2011 at 9:35 AM; I did get the records for 4/26/2011, but not the records created in SAP on 4/27/2011 around 8:30 AM.
    Please advise.
    Thanks

  • AR Delta problem

    Hi Guys,
    I have a problem extracting delta records with 0FI_AR_4. I am extracting the data with delta extraction, but some records are not transferred from R/3, even though in the process chain AR is extracted after GL.

    These are the contents of the delta timestamp table from the initialization to the current date (mode C = init, D = delta; the last column appears to be the safety interval in seconds, 7.200 = 2 hours):
    Client  DataSource  Date        Time      Mode  Lower limit            Upper limit            Safety int.
    100     0FI_AR_4    11.02.2007  18:55:35  C         315.288.000.000    5.399.927.990.000      7.200
    100     0FI_AR_4    12.02.2007  06:31:36  D       5.399.928.000.000    5.400.791.990.000      7.200
    100     0FI_AR_4    20.02.2007  06:38:08  D       5.406.840.000.000    5.407.703.990.000      7.200
    It seems that the settings are OK.

  • K7N2 Delta problems

    Hey all... having a huge problem with my computer and no idea what's going on. When I try to reformat, I get blue-screen errors as follows: Page_Fault_In_Non_Paged_Area, followed by the stop error numbers
                    Driver_IRQL_Not_Less_Or_Equal
                    IRQL_Not_Less_Or_Equal
                    as well as a wait-or-yield-in-DPC-routine error
     Now, when I reset the BIOS, it will reformat and install Windows, but I never had to do this before. Also, when I am up and running, I get both crashes/reboots and lock-ups. Again, this never happened until a month or so ago. Does anyone have any ideas? I don't know if this is a PSU issue, a BIOS issue (I just updated to the latest BIOS version, but only after the problems started), a heat issue, or what. Here are my specs:
                   MSI K7N2 Delta
                   Athlon XP 3200+
                   2x512 MB DDR333 RAM
                   usually a Radeon 9700 Pro, but currently forced to use an FX 5200
                   WD 60 GB HDD with 8 MB cache
                   AOpen 300 W PSU
                   3.3 V - 20.0 A   5 V - 30.0 A   12 V - 13.0 A
    any help would be appreciated, gang... have a good one!!

    Flash the BIOS back to B30; it's much more stable than B40. B40 is really buggy, same as the beta one. Good luck.
    Tbred B 2400 (180 x 13.5) @ 2.10 V
    MSI K7N2 Delta2 LSR
    2 x 256 MB PC2700 Kingston ValueRAM
    Powercolor Radeon 9600XT, 600 core / 800 memory
    450 W Icute PSU.

  • PM Delta problem. Notifications coming into lbwq but Orders are not coming.

    Hi,
    I'm talking about two DataSources of the PM module; I have a problem with only one of them.
    1. 2LIS_17_I3HDR (I have the problem here)
    2. 2LIS_17_I0NOTIF (no problem here)
    When I create or modify a notification, I get the information immediately in LBWQ and can then take it into the BI system. But when I create or modify orders, they don't appear in LBWQ.
    My analysis:
    The differences between these two DataSources on our server are:
    1. I converted the 2LIS_17_I3HDR data flow to BI7, i.e. transformations and DTPs, and deleted the 3.x objects (update rules, InfoSources and transfer rules).
    2. I enhanced 2LIS_17_I3HDR and wrote some code in CMOD (the exit) to fill the enhanced fields.
    I've searched SDN thoroughly and couldn't find any solution. Please help.
    Regards,
    Sujit.

    Hi All,
    I've found the solution to my problem. After trying many methods, I found one that solved it: the function module PMEX_UPDATE_17_1_QRFC was not active. I activated it, and delta records started coming into LBWQ.
    One more question: do I need to activate this function module in the Production server as well when we go live? What is the exact procedure?
    Another doubt: is there any way to recover the delta records that didn't come into LBWQ before the function module was activated? I know I can delete the setup tables and fill them again, but is there any other way?
    Thanks,
    Sujit.
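    In case it helps anyone else debugging this: a quick way to see whether anything is waiting in the outbound extraction queue is to count the queued LUWs directly. A minimal sketch, assuming the PM queue name starts with MCEX17 (application 17) -- check the exact name shown in LBWQ on your system:

    REPORT ZCHECK_PM_QUEUE.

    * Sketch only: count queued LUWs for logistics application 17 (PM).
    * The queue name pattern is an assumption -- verify in LBWQ.
    DATA: LV_COUNT TYPE I.

    SELECT COUNT( * ) INTO LV_COUNT FROM TRFCQOUT
      WHERE QNAME LIKE 'MCEX17%'.

    IF LV_COUNT = 0.
      WRITE: / 'No queued LUWs for application 17 -',
               'check that the update function module is active.'.
    ELSE.
      WRITE: / LV_COUNT, 'LUWs waiting in the MCEX17 queue.'.
    ENDIF.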

  • 0FI_AR_4-Delta Time Settings based on Numeric Pointer Time Stamps

    Hi All,
    For the numeric pointer settings for 0FI_AR_4, I checked the delta settings table; the timestamps are different and it says GMT in BWOM2_TIMEST.
    So I am confused whether the FI delta is based on GMT timestamps or on a numeric pointer based on the documents entered.
    Say a user enters a clearing document on Monday afternoon and we have a batch update on Tuesday midnight. The document is not picked up by the Tuesday batch, so it still shows as not cleared in our report while it is cleared in SAP. The same document shows as cleared after the Wednesday midnight batch job. So there is a delay in picking up the document, even though the job runs every midnight.
    Can anyone clearly explain how it picks up documents, and why? I hope it is not based on GMT timing...
    Also, I checked another table that has a time limit of 2 AM set, and our Tuesday batch runs before 2 AM, whereas the Wednesday batch runs after 2 AM. I am not sure whether the issue above could be because of this.

    Hi Durgesh,
    I checked this link; that's why I raised the question of whether it really works based on the global FI setting described in the last part of the link.
    If it considers 2 AM, and the same setting exists in my system too, that could be the reason why my data doesn't populate on Tuesday, I am thinking... is my assumption right?
    I am still waiting to hear from our HQ team; I will see whether that works or not and then let you know.

  • 0FI_AR_4 Delta not working like I think it should be

    I have the extractor/DataSource 0FI_AR_4 installed and I am on a 7.0 technical upgrade.  I have not yet migrated anything to the new 7.0 data flow, so I am still running the 3.x data flow with this extractor.  My issue is that when I check RSA7 for this extractor I usually see 0 records.  Yesterday I saw 3 LUWs in RSA7 for this extractor, but they were not AR records - it looked more like metadata.  Next I went to RSA3, chose update mode 'D', and got zero records back.  I ran the delta InfoPackage 3 times yesterday and only got data once - the other two requests returned 0 records.
    I read that since PlugIn 2002.2 we no longer have to decouple GL from AR.  Is this true?  This was my first guess, but we don't even have 0FI_GL_4 installed.
    Somehow the data is getting into our 0FIAR_O03 DSO, but I can't identify how it is being collected.  It also looks like there is a lag between when the data gets entered into ECC and when it finally gets extracted.  Any ideas?

    There are conflicting statements on help.sap.com
    1: In delta mode, data requests with InfoSource 0FI_AR_4 and InfoSource 0FI_AP_4 do not provide any data if no new extraction has taken place with InfoSource 0FI_GL_4 since the last data transfer. This ensures that the data in BW for Accounts Receivable and Accounts Payable Accounting is exactly as up to date as the data for General Ledger Accounting.
    2: As the strict connection of DataSources has been abolished as of PlugIn 2002.2, you can now load 0FI_GL_4, 0FI_AR_4, 0FI_AP_4 and 0FI_TX_4 in any order you like. You can also use DataSources 0FI_AR_4, 0FI_AP_4 and 0FI_TX_4 separately without 0FI_GL_4. The DataSources can then be used independently of one another.
    Which is true?

  • 0FI_AR_4 DELTA  SO SLOWLY

    Hi,
    I use InfoPackage 0FI_AR_4_d to extract DataSource 0FI_AR_4 (Customers: Line Items) with delta. The runtime is 18h 59m 8s, but it only brings in 114 records. That is far too slow. The monitor shows:
    Processing is overdue.
    The maximum wait time for this request has not yet been exceeded.
    The background job has not yet finished in the source system.
    Current status: in the source system.

    Hi,
    Are many jobs running in the system? Check SM37 - maybe there is no free work process, due to which the background jobs cannot proceed.
    Also check ST04 for locks or deadlocks in the system. If you find locks or deadlocks that are not changing and seem to be permanent, get the Basis people to help.
    Check short dumps in ST22, and check the system log in SM21.
    Your jobs are progressing, but slowly; still, check tRFC in SM58 - maybe there is a stuck tRFC.
    Keep checking the job log in SM37 to see whether it is progressing. If you find 'Sysfail' in the log, it means it went to a short dump.
    Hope this helps.
    Regards,
    Debjani

  • Inconsistencies data between ODS and RSA3 in SAP system (delta problem??)

    Hi everyone,
    I have a problem in my SAP BW production system. There is a data inconsistency between the ODS (using tcode LISTCUBE) and the DataSource in the SAP system (tcode RSA3).
    I'm using Datasource: 0FI_GL_10 and ODS: 0FIGL_O10.
    In SAP system (using RSA3):
    GL Account: 1202012091
    Comp Code : 5400
    Bus Area: 5401
    Fiscal Year :2008
    Period: 07
    Accum Balance : $0
    Credit Total: $350
    Debit Total: $350
    Sales : $0 
    And in ODS (using ListCube):
    GL Account: 1202012091
    Comp Code : 5400
    Bus Area: 5401
    Fiscal Year :2008
    Period: 07
    0BALANCE : $350 (it should be $0, shouldn't it?)
    0CREDIT: $0 (it should be $350, shouldn't it?)
    0DEBIT: $350
    0SALES: $350 (it should be $0, shouldn't it?)
    I have tried these:
    1. Checked whether there is a start routine / routine in the transfer or update rules... and found nothing.
    2. Checked the data using tcode FBL3N, but the data is OK.
    Then I traced the process chain and found errors on the delta InfoPackage which pulled these data:
    07.30.2008: 231,141 records --> error (ODS activation) --> request deleted forever
    07.31.2008: 2,848 records --> error (ODS activation) --> request deleted forever
    08.01.2008: 135,679 records --> error (ODS activation) --> request deleted forever
    08.02.2008: 135,679 records --> successful repeat delta action
    From 30 July until 1 August there was an error on the delta InfoPackage because ODS activation failed (No SID found for value 'RIM' of characteristic 0UNIT). On 2 August someone deleted the red requests and repeated the delta successfully. Is it possible that this problem arose because of that? If so, how do I bring back the 2,848 and 231,141 records that were deleted?
    Thank you and Regards,
    -Satria-

    Hi everyone,
    I have solved my problem. I executed my InfoPackage again (selection condition on the GL accounts with the wrong amounts) with update mode 'Full update', chose Scheduler > Repair Full Request, and ticked the checkbox. Thank you.
    Regards,
    -Satria-

  • Urgent Help Reqd: Delta Problem....!!!

    Hi Experts,
    Urgent help required regarding the following scenario:
    I am getting delta into ODS1 (InfoSource: 2LIS_12_VCHDR) from R/3. This ODS1 then delivers delta to Cube1. I got one error record which fails when going to the cube (due to a master data check), and this record will be edited later.
    So I thought of reloading the cube and using error handling to split the request in the PSA and load the valid records into the cube. Later, when I have the correct info about the error, I would load the erroneous record manually via the PSA.
    I did the following steps:
    1) The failed request is not in the cube (the setting is PSA and then into target); it has also been deleted from the PSA.
    2) Removed the data mart status from source ODS1.
    3) Changed error handling in the InfoPackage to "Valid records update, reporting possible".
    4) Started loading the cube.
    But when I did this I got the following message:
    "Last delta update is not yet completed. Therefore, no new delta update is possible. You can start the request again if the last delta request is red or green in the monitor."
    But both my requests in the cube and the ODS are green... no request is yellow!
    Now the problems are:
    1) I have lost the ODS data mart status.
    2) How do I load the valid records to the cube, since the erroneous record can be edited only after 3-4 days?
    3) How do I avoid affecting tomorrow's delta load?
    Please guide and help me with this.
    Thanks in advance, and rest assured I will award points to say thanks to SDNers.
    Regards,
    Sorabh

    Hi Joseph,
    Thanks for replying.
    I apologize; I have modified the message above.
    I just checked the cube: there is no failed request for today, and none in the PSA either. The person who handed this over to me may have deleted it from the PSA, and why the request is not in a failed state in the cube I am not sure, though the setting is "PSA and then into data targets (package by package)".
    So how do I go on from here?
    Thanks
    Sorabh

  • Generic delta problem: in init load, the request is in yellow...

    Hi all,
    I created a table in R/3 and created a generic DataSource on this table. For deltas, I selected Numeric pointer and set the sales document number from the table as the delta-relevant field. Everything else (replication, creation of the InfoSource and InfoPackage) went well, and I also got the Initialization option in the InfoPackage. But when I schedule the load it stays yellow, says 0 records from 0 records, and gives an error message that there are no records in the source system. This is the case when I use the Init option in the InfoPackage.
    If I use the Full load option in the InfoPackage, everything is fine and I can extract the data from R/3 without any problem; the load succeeds.
    The only problem is with deltas. Is there anything else to do on the R/3 side to enable the delta mechanism? Or is a function module required to enable deltas (I read this in some posts)? Please give me some solution.
    Thanks
    Ganesh
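    (For context: a numeric-pointer delta on a generic DataSource is conceptually just a "greater than the last transferred value" selection; the Service API stores the last pointer per DataSource in table ROOSGENDLM and generates the restriction itself. A rough sketch of the idea, with ZTABLE and the VBELN field as placeholder names -- this is not code you write yourself:)

    * Conceptual sketch only -- the real selection is generated by the
    * Service API. ZTABLE and LV_LAST_POINTER are placeholders; the
    * stored pointer can be inspected in table ROOSGENDLM.
    DATA: LT_DATA         TYPE STANDARD TABLE OF ZTABLE,
          LV_LAST_POINTER TYPE VBELN.

    SELECT * FROM ZTABLE INTO TABLE LT_DATA
      WHERE VBELN > LV_LAST_POINTER.  "only documents added since last delta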

    Hi,
    There is no need to make any special settings for the delta; in general it should work just like your full load. Try to locate the problem area and revert.
    Also, check whether there are any previous inits and delete them; otherwise create a new InfoPackage and try again.
    Cheers,
    Pattan.

  • Delta problem in 2LIS_11_VAITM

    Hello,
    I'm having a problem with the delta update in DataSource 2LIS_11_VAITM.
    I have an enhancement in this DataSource. When I do a full update, the extractor picks up the data correctly; but when I do a delta update, the field from the enhancement does not get updated. I think it's because the field that controls the delta queue doesn't register updates to the enhanced field.
    Can someone here help me? Is it possible to update this data with a delta update, or only with a full update?
    Thanks.

    Hi Paulo
    Any Z field that you have enhanced and added to the DataSource will not have the capability to capture deltas. I.e., when a change happens in VBRP for the given field, it will not trigger a delta in your VAITM DataSource. That is how SAP has designed it: unless you stick to the SAP-defined fields, any addition is not delta-enabled. So a full update will pick up the modified field, but unless the relevant key fields for VAITM change, the delta will not flow through to BI.
    What you can do is this: whatever field you require from VBRP will be available as part of the Billing DataSource. Bring that into your BI system, and in a combined DSO or cube write a routine to look up the required field from your Billing DSO (see the sketch below).
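    A minimal end-routine sketch of that lookup, assuming a billing DSO ZBILL (active table /BIC/AZBILL00) and placeholder field names -- substitute your own objects. RESULT_PACKAGE and <RESULT_FIELDS> come from the generated routine frame:

    * End-routine sketch for the transformation into the combined
    * DSO/cube. All object and field names below are placeholders.
    LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
      " For real volumes, buffer the billing data with FOR ALL ENTRIES
      " instead of one SELECT SINGLE per row.
      SELECT SINGLE /BIC/ZMYFIELD FROM /BIC/AZBILL00
        INTO <RESULT_FIELDS>-/BIC/ZMYFIELD
        WHERE DOC_NUMBER = <RESULT_FIELDS>-DOC_NUMBER
          AND S_ORD_ITEM = <RESULT_FIELDS>-S_ORD_ITEM.
    ENDLOOP.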
    Hope this helps.
    Thanks.

  • 0FC_ACCNTBP_ATTR delta problems

    Hi!
    I have some problems with the 0FC_ACCNTBP_ATTR DataSource. When I try a delta extraction I get the error SAPSQL_INVALID_FIELDNAME:
    "The SELECT clause was specified in an internal table at runtime. It contains the field name "GUID_C", but this does not occur in any of the database tables listed in the FROM clause."
    This field is standard, not an extension, so I thought the issue would be solved by a note, but I have not found anything... Does somebody have the same problem?
    Thanks a lot

    Hi Rocio,
    We got a similar problem, but with an extension.
    Try to fill the field only in the exit known to your DataSource, and then start the delta again.
    Hope it helps.
    Ralf

  • BW GENERIC EXTRACTION DELTA PROBLEM

    I have a problem extracting a delta from R/3 to BW.
    Init data was loaded from R/3 to BW for the ZST_ORDER DataSource. A repair full request with a selection on dates August 1st 2007 to August 15th 2007 was also created.
    During the delta extraction, around 500,000 records get extracted from R/3 to BW.
    On the R/3 side there are only 5 entries in the AUFK table, so the delta should extract fewer than 35 entries.
    ZST_ORDER is a generic DataSource; the delta-relevant field is DATUM (new status changed record, value 2).
    Thanks in advance for your help. The code looks like this:
    FUNCTION ZST_ORDER.
    *"----------------------------------------------------------------------
    *"*"Local Interface:
    *"  IMPORTING
    *"     REFERENCE(I_REQUNR) TYPE SRSC_S_IF_SIMPLE-REQUNR
    *"     REFERENCE(I_DSOURCE) TYPE SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
    *"     REFERENCE(I_MAXSIZE) TYPE SRSC_S_IF_SIMPLE-MAXSIZE DEFAULT 1000
    *"     REFERENCE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
    *"     REFERENCE(I_READ_ONLY) TYPE SRSC_S_IF_SIMPLE-READONLY OPTIONAL
    *"  TABLES
    *"     I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
    *"     I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
    *"     E_T_DATA STRUCTURE ZST_ORDER OPTIONAL
    *"  EXCEPTIONS
    *"     NO_MORE_DATA
    *"     ERROR_PASSED_TO_MESS_HANDLER
    *"----------------------------------------------------------------------

      TABLES: AUFK,  "Order master data
              TJ02T, "System status texts
              TJ30T, "Texts for user status
              JSTO.  "Status object information

    * Data declaration
      DATA: L_DATE       TYPE DATS,
            L_STATUS     TYPE J_STATUS,
            L_LINES      TYPE SY-TABIX,
            L_CHANGED(1) TYPE C.

    * Auxiliary selection criteria structure
      DATA: L_S_SELECT TYPE SRSC_S_SELECT.

    * Service order data
      DATA: BEGIN OF LT_AUFK OCCURS 0,
              AUFNR   LIKE AUFK-AUFNR,
              AUART   LIKE AUFK-AUART,
              ERDAT   LIKE AUFK-ERDAT,
              AEDAT   LIKE AUFK-AEDAT,
              STDAT   LIKE AUFK-STDAT,
              AEZEIT  LIKE AUFK-AEZEIT,
              ERFZEIT LIKE AUFK-ERFZEIT,
              IDAT1   LIKE AUFK-IDAT1,
              IDAT2   LIKE AUFK-IDAT2,
              IDAT3   LIKE AUFK-IDAT3,
              LOEKZ   LIKE AUFK-LOEKZ,
              OBJNR   LIKE AUFK-OBJNR,
            END OF LT_AUFK.

    * Individual object status
      DATA: BEGIN OF LT_JEST OCCURS 0,
              OBJNR LIKE JEST-OBJNR,
              STAT  LIKE JEST-STAT,
              INACT LIKE JEST-INACT,
              CHGNR LIKE JEST-CHGNR,
            END OF LT_JEST.

    * Change documents for system/user statuses (table JEST)
      DATA: BEGIN OF LT_JCDS OCCURS 0,
              OBJNR LIKE JCDS-OBJNR,
              STAT  LIKE JCDS-STAT,
              CHGNR LIKE JCDS-CHGNR,
              USNAM LIKE JCDS-USNAM,
              UDATE LIKE JCDS-UDATE,
              UTIME LIKE JCDS-UTIME,
              INACT LIKE JCDS-INACT,
              CHIND LIKE JCDS-CHIND,
            END OF LT_JCDS.

    * Status object information (status schema per object)
      DATA: BEGIN OF LT_JSTO OCCURS 0,
              OBJNR LIKE JSTO-OBJNR,
              STSMA LIKE JSTO-STSMA,
            END OF LT_JSTO.

    * Static fields
      STATICS: S_S_IF TYPE SRSC_S_IF_SIMPLE,
    *          counter
               S_COUNTER_DATAPAKID LIKE SY-TABIX,
    *          cursor
               S_CURSOR TYPE CURSOR.

    * User-defined ranges
      RANGES: L_R_AUFNR FOR AUFK-AUFNR,
              L_R_ERDAT FOR AUFK-ERDAT.

    * Initialization mode (first call by SAPI) or data transfer mode
    * (following calls)?
      IF I_INITFLAG = SBIWA_C_FLAG_ON.

    *   Check DataSource validity
        CASE I_DSOURCE.
          WHEN 'ZST_ORDER'.
          WHEN OTHERS.
            RAISE ERROR_PASSED_TO_MESS_HANDLER.
        ENDCASE.

    *   Copy selection criteria for future extractor calls (fetch calls)
        APPEND LINES OF I_T_SELECT TO S_S_IF-T_SELECT.

    *   Fill parameter buffer for data extraction calls
        S_S_IF-REQUNR   = I_REQUNR.
        S_S_IF-DSOURCE  = I_DSOURCE.
        S_S_IF-MAXSIZE  = I_MAXSIZE.
        S_S_IF-INITFLAG = I_INITFLAG.
        APPEND LINES OF I_T_FIELDS TO S_S_IF-T_FIELDS.

      ELSE.

    *   First data package -> OPEN CURSOR
        IF S_COUNTER_DATAPAKID = 0.

          LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT
               WHERE FIELDNM = 'AUFNR'.
            MOVE-CORRESPONDING L_S_SELECT TO L_R_AUFNR.
            APPEND L_R_AUFNR.
          ENDLOOP.

          LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT
               WHERE FIELDNM = 'ERDAT'.
            MOVE-CORRESPONDING L_S_SELECT TO L_R_ERDAT.
            APPEND L_R_ERDAT.
          ENDLOOP.

    *     Lower limit of the delta-relevant field DATUM
          LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT
               WHERE FIELDNM = 'DATUM'.
            L_DATE = L_S_SELECT-LOW.
          ENDLOOP.

    *     Note: DATUM is not part of this WHERE clause; the date filter
    *     is only applied later, per status record, via L_DATE
          OPEN CURSOR WITH HOLD S_CURSOR FOR
            SELECT AUFNR AUART ERDAT
                   AEDAT OBJNR AEZEIT
                   STDAT ERFZEIT IDAT1
                   IDAT2 IDAT3 LOEKZ
              FROM AUFK
              WHERE AUFNR IN L_R_AUFNR
                AND ERDAT IN L_R_ERDAT.
        ENDIF.

    *   Fetch records into interface table LT_AUFK
        FETCH NEXT CURSOR S_CURSOR
          APPENDING CORRESPONDING FIELDS OF TABLE LT_AUFK
          PACKAGE SIZE S_S_IF-MAXSIZE.
        IF SY-SUBRC <> 0.
          CLOSE CURSOR S_CURSOR.
          RAISE NO_MORE_DATA.
        ENDIF.

    *   Determine the number of lines of table LT_AUFK
        L_LINES = LINES( LT_AUFK ).
        IF L_LINES IS NOT INITIAL.

    *     Sort the internal table LT_AUFK
          SORT LT_AUFK BY OBJNR ASCENDING.

    *     Select the records from JCDS for the entries in LT_AUFK
          SELECT OBJNR STAT CHGNR USNAM
                 UDATE UTIME INACT CHIND
            INTO TABLE LT_JCDS
            FROM JCDS
            FOR ALL ENTRIES IN LT_AUFK
            WHERE OBJNR EQ LT_AUFK-OBJNR
              AND ( CHGNR EQ '001'
                 OR UDATE >= L_DATE ).

    *     Sort the internal table LT_JCDS
          SORT LT_JCDS BY OBJNR STAT CHGNR ASCENDING.

    *     Select records from table JEST for the entries in LT_AUFK
          SELECT OBJNR STAT INACT CHGNR
            INTO TABLE LT_JEST
            FROM JEST
            FOR ALL ENTRIES IN LT_AUFK
            WHERE OBJNR = LT_AUFK-OBJNR.

    *     Select the status schema for the entries in LT_AUFK
          SELECT OBJNR STSMA
            INTO TABLE LT_JSTO
            FROM JSTO
            FOR ALL ENTRIES IN LT_AUFK
            WHERE OBJNR = LT_AUFK-OBJNR.
          SORT LT_JSTO BY OBJNR.
        ENDIF.

    *   One output record per status entry in JEST
        LOOP AT LT_JEST.
          CLEAR: LT_AUFK, L_CHANGED, LT_JSTO.

    *     Status schema of the object
          READ TABLE LT_JSTO WITH KEY OBJNR = LT_JEST-OBJNR
               BINARY SEARCH.
          IF SY-SUBRC EQ 0.
            E_T_DATA-STSMA = LT_JSTO-STSMA.
          ENDIF.

    *     Read the order data from LT_AUFK
          READ TABLE LT_AUFK WITH KEY OBJNR = LT_JEST-OBJNR
               BINARY SEARCH.
          E_T_DATA-AUFNR   = LT_AUFK-AUFNR.
          E_T_DATA-AUART   = LT_AUFK-AUART.
          E_T_DATA-ERDAT   = LT_AUFK-ERDAT.
          E_T_DATA-AEDAT   = LT_AUFK-AEDAT.
          E_T_DATA-AEZEIT  = LT_AUFK-AEZEIT.
          E_T_DATA-ERFZEIT = LT_AUFK-ERFZEIT.
          E_T_DATA-IDAT1   = LT_AUFK-IDAT1.
          E_T_DATA-IDAT2   = LT_AUFK-IDAT2.
          E_T_DATA-IDAT3   = LT_AUFK-IDAT3.
          E_T_DATA-LOEKZ   = LT_AUFK-LOEKZ.
          E_T_DATA-INACT   = LT_JEST-INACT.
          E_T_DATA-CHGNR   = LT_JCDS-CHGNR.
          E_T_DATA-DATUM   = LT_AUFK-ERDAT.

    *     First change document (CHGNR '001') for this status
          READ TABLE LT_JCDS WITH KEY OBJNR = LT_JEST-OBJNR
                                      STAT  = LT_JEST-STAT
                                      CHGNR = '001'
               BINARY SEARCH.
          IF SY-SUBRC EQ 0.
            E_T_DATA-UDATE  = LT_JCDS-UDATE.
            E_T_DATA-AEDAT  = LT_JCDS-UDATE.
            E_T_DATA-UTIME  = LT_JCDS-UTIME.
            E_T_DATA-AEZEIT = LT_JCDS-UTIME.
            E_T_DATA-CHIND  = LT_JCDS-CHIND.
            E_T_DATA-USNAM  = LT_JCDS-USNAM.
            E_T_DATA-CHGNR  = LT_JCDS-CHGNR.
            IF LT_JCDS-UDATE GE L_DATE
               AND L_DATE IS NOT INITIAL.
              L_CHANGED = 'X'.
            ENDIF.

    *       Latest change document for this status, if changed again
            IF LT_JEST-CHGNR NE '001'.
              CLEAR LT_JCDS.
              READ TABLE LT_JCDS WITH KEY OBJNR = LT_JEST-OBJNR
                                          STAT  = LT_JEST-STAT
                                          CHGNR = LT_JEST-CHGNR
                   BINARY SEARCH.
              IF SY-SUBRC EQ 0.
                L_CHANGED = 'X'.
                E_T_DATA-AEDAT  = LT_JCDS-UDATE.
                E_T_DATA-AEZEIT = LT_JCDS-UTIME.
                E_T_DATA-CHIND  = LT_JCDS-CHIND.
                E_T_DATA-USNAM  = LT_JCDS-USNAM.
                E_T_DATA-CHGNR  = LT_JCDS-CHGNR.
              ENDIF.
            ENDIF.

    *       Statuses starting with I are system statuses, E user statuses
            IF LT_JEST-STAT(1) EQ 'I'.
              E_T_DATA-ISTAT = LT_JEST-STAT.
            ELSEIF LT_JEST-STAT(1) EQ 'E'.
              E_T_DATA-ESTAT = LT_JEST-STAT.
            ENDIF.

    *       In delta mode (L_DATE filled) pass only changed records;
    *       in full mode (L_DATE initial) pass every status record
            IF L_CHANGED EQ 'X'
               AND L_DATE IS NOT INITIAL.
              APPEND E_T_DATA.
            ELSEIF L_DATE IS INITIAL.
              APPEND E_T_DATA.
            ENDIF.
          ENDIF.

          CLEAR: LT_AUFK, E_T_DATA.
        ENDLOOP.

        CLEAR LT_JEST.
        REFRESH LT_JEST.

    *   Next data package
        S_COUNTER_DATAPAKID = S_COUNTER_DATAPAKID + 1.

      ENDIF. "Initialization mode or data extraction?

    ENDFUNCTION.

    Hi,
    For that quantity, SALK3 is referring to a different table in the reference field of your table fields. Check that field and refer to the correct table for it in the table view.
    The table join has a problem there; I think you joined some other table on that field. Replace the field assignment or create a new view.
    Hope this helps you.
    Regards,
    Harikrishna N
