Copa delta problem

Hi All,
I enhanced the CO-PA DataSource by adding a new value field and then loaded the data:
I ran an init load without data transfer and then ran the delta update, but it returns ZERO records.
When I check the CO-PA tables and the KE24 report on the R/3 side, the data is there.
Is there a problem with my delta? If so, where do I check and how do I resolve it?
Please help me as soon as possible.
Thanks in advance.
prasantha

Hi Sathish,
Based on your information I checked in KEB2 and it gives the message below. Can you help me understand what it means
and how to resolve it?
The message below is from transaction KEB2 (based on it, I have a doubt whether the delta is picking up the data at the
header level).
Subsequent time stamp information only has statistical character.
The time stamp relevant for reading data is found in the header information for the
DataSource.
For data extraction from several BW systems, the number of records read and sent that is
displayed does not have to agree with the actual number (due to Service API logic).
Delta init simulations are not listed in the overview because the CO-PA extractor
is not called in this case and it consequently does not update any log entry.
Thanks and regards,
kumar.k
Edited by: kumar reddy on Jan 20, 2009 8:10 AM

Similar Messages

  • Error in COPA Delta

    Hi,
    While I was trying to change the timestamp of a CO-PA delta load back to a previous load in transaction KEB5, I got the following error message:
    "Master record for DataSource 1_CO_PA800IDEA_AC is defined for generic delta. Read timestamp could not be set."
    What might be the possible reason for such an error message? Could anybody help me in this regard?

    Please help me find a solution for this particular scenario.
    Suppose I created the DataSource about a year ago and have been running deltas successfully since then.
    Number the requests 1, 2, 3, ... and so on since 1st Jan 2009, and suppose each request loads
    about 1,000 records every day from ECC to BI.
    Now, on 10th August, a change was made in ECC (a change in the user exit) that caused a mismatch in the number of data records: only 950 records were loaded from the day the change was made. I identified the problem on 17th August with request number 160. Say the request numbers are 153 (10th), 154 (11th), 155 (12th), 156 (13th), 157 (14th), 158 (15th), 159 (16th) and 160 (17th). I fixed the program in ECC on the 17th, and from request 160 (17th) onwards the requests fetch the correct number of data records. Now I need to delete the data in BI for requests 153 to 159, since they do not reconcile between ECC and BI, and reload the deltas for those requests from ECC to BI. How do I reload those deltas, given that ECC only keeps track of the last delta, i.e. 17th August (request 160)? How do I load the deltas for requests 153 to 159 from ECC to BI?
    One proposed solution was setting the CO-PA timestamp back to 10th August, but it is not advisable to set the timestamp for CO-PA DataSources using transaction KEB5 in a production system.
    In this scenario, how do I load requests 153 to 159 from ECC to BI to fetch the correct data records for that period? Please propose a solution as soon as possible. You can also call me back whenever needed.
    Thanks in advance

  • Standard and generic COPA delta

    Hello
    I don't understand why many people speak of a "standard CO-PA delta" and a "generic CO-PA delta".
    CO-PA is a generated application, and its delta should therefore be generic.
    What do you mean when you talk (in many posts in this forum) about a standard CO-PA delta?
    Thanks

    Hi,
    CO-PA extraction is a generated application based on the operating concern configured in the R/3 system, and the same applies to the delta process.
    R/3 table TKEBWTS contains information about the CO-PA load requests that have been posted into BW. The timestamp entries in this table can be maintained to reinitialize the delta. This may be why some people call it a generic delta and others a standard delta.
    Dev

  • K7N2 Delta problems

    Hey all, I'm having a huge problem with my computer and have no idea what's going on. When I try to reformat, I get blue-screen errors, as follows: Page_Fault_In_Non_Paged_Area followed by the stop error numbers,
                    Driver_IRQL_Not_Less_Or_Equal
                    IRQL_Not_Less_Or_Equal
                    as well as a wait or yield error in a DPC routine.
     Now, when I reset the BIOS, it will reformat and install Windows, but I never had to do this before. Also, once I'm up and running, I get both crashes and reboots, as well as lock-ups. Again, this never happened until a month or so ago. Does anyone have any ideas? I don't know whether this is a PSU issue, a BIOS issue (I just updated to the latest BIOS version, but only after the problems started), a heat issue, or something else. Here are my specs:
                   MSI K7N2 Delta
                   Athlon XP 3200+
                   2x512 mb of DDR333 RAM
                   usually use Radeon 9700 Pro, but am currently forced to use FX 5200
                   WD 60 gb hdd with 8 mb cache
                   AOpen 300 watt psu
                   3.3v - 20.0A   5v - 30.0A   12v - 13.0A
    any help would be appreciated gang...have a good one !!

    Flash the BIOS back to B30. It is much more stable than B40; B40 is really buggy, same as the beta one. Good luck.
    Tbred B 2400 (180 x 13.5) 2.10 volt
    MSI K7N2 Delta2 LSR
    2 x 256 PC2700 Kingston Value RAM
    Powercolor Radeon 9600XT 600 core 800 memory
    450 Icute PSU.  

  • PM delta problem: notifications coming into LBWQ but orders are not

    Hi,
    I'm talking about two DataSources of the PM module. I have a problem with only one of them:
    1. 2LIS_17_I3HDR (I have the problem here)
    2. 2LIS_17_I0NOTIF (no problem here)
    When I create or modify a notification, the record appears immediately in LBWQ and I can then load it into the BI system. But when I create or modify orders, they do not appear in LBWQ.
    My analysis:
    The differences between these two DataSources on our server are:
    1. I converted the 2LIS_17_I3HDR data flow to BI 7, i.e. transformations and DTPs, and deleted the update rules, InfoSources and transfer rules (the 3.x objects).
    2. I enhanced 2LIS_17_I3HDR and wrote some code in the CMOD exit to fill the enhanced fields.
    I've searched SDN thoroughly and couldn't find a solution. Please help.
    Regards,
    Sujit.

    Hi All,
    I've found the solution to my problem. After trying many methods, I found one that worked: the function module PMEX_UPDATE_17_1_QRFC was not active. I activated it, and delta records started coming into LBWQ.
    One more question: do I need to activate this function module in the production server as well when we go live? What is the exact procedure?
    Another doubt: is there any way to recover the delta records that didn't reach LBWQ before the function module was activated? I know I can delete the setup tables and fill them again, but is there any other way?
    Thanks,
    Sujit.

  • Inconsistent data between ODS and RSA3 in the SAP system (delta problem??)

    Hi everyone,
    I have a problem in my SAP BW production system. There is inconsistent data between an ODS (checked with transaction LISTCUBE) and the DataSource in the SAP system (transaction RSA3).
    I'm using Datasource: 0FI_GL_10 and ODS: 0FIGL_O10.
    In SAP system (using RSA3):
    GL Account: 1202012091
    Comp Code : 5400
    Bus Area: 5401
    Fiscal Year :2008
    Period: 07
    Accum Balance : $0
    Credit Total: $350
    Debit Total: $350
    Sales : $0 
    And in ODS (using ListCube):
    GL Account: 1202012091
    Comp Code : 5400
    Bus Area: 5401
    Fiscal Year :2008
    Period: 07
    0BALANCE : $350 (it should be $0, shouldn't it?)
    0CREDIT : $0 (it should be $350, shouldn't it?)
    0DEBIT : $350
    0SALES : $350 (it should be $0, shouldn't it?)
    I have tried the following:
    1. Checked whether there is a start routine or other routine in the transfer or update rules, and found nothing.
    2. Checked the data using transaction FBL3N; the data there is OK.
    I then traced the process chain and found errors in the delta InfoPackage which pulls this data:
    Date
    07.30.2008 : 231,141 records --> error (ODS activation) --> request deleted, gone for good
    07.31.2008 : 2,848 records --> error (ODS activation) --> request deleted, gone for good
    08.01.2008 : 135,679 records --> error (ODS activation) --> request deleted, gone for good
    08.02.2008 : 135,679 records --> successful repeat delta
    From 30 July until 1 August the delta InfoPackage failed because the ODS activation failed (No SID found for value 'RIM' of characteristic 0UNIT). On 2 August someone deleted the red requests, repeated the delta, and it succeeded. Is it possible that this problem arose because of that? If so, how can I bring back the 2,848 and 231,141 records that were deleted?
    Thank you and Regards,
    -Satria-

    Hi everyone,
    I have solved my problem. I executed my InfoPackage again (with a selection on the GL accounts that had wrong amounts) with update mode 'Full update', chose Scheduler > Repair Full Request and ticked the checkbox. Thank you.
    Regards,
    -Satria-

  • Urgent Help Reqd: Delta Problem....!!!

    Hi Experts,
    Urgent help required regarding the following scenario:
    I am getting delta into ODS1 (InfoSource 2LIS_12_VCHDR) from R/3, and this ODS1 delivers delta to Cube1. I got one error record which fails when going to the cube (due to the master data check), and this record will be edited later.
    So I thought of reloading the cube and using error handling to split the request in the PSA and load the valid records into the cube. Later, when I have the correct information about the error, I would load the erroneous record manually from the PSA.
    I did the following steps:
    1) The failed request is not in the cube (the setting is PSA and then into target); it has also been deleted from the PSA.
    2) Removed the data mart status from the source ODS1.
    3) Changed the error handling in the InfoPackage to "Valid records update, reporting possible".
    4) Started loading the cube.
    But when I did this I got the following message:
    "Last delta update is not yet completed. Therefore, no new delta update is possible. You can start the request again if the last delta request is red or green in the monitor."
    But my requests in both the cube and the ODS are green; no request is yellow!
    Now the problems are:
    1) I have lost the ODS data mart status.
    2) How do I load the valid records into the cube, since the erroneous record can only be edited after 3-4 days?
    3) How do I avoid affecting tomorrow's delta load?
    Please guide and help me with this.
    Thanks in advance, and rest assured I will award points to thank the SDNers.
    Regards,
    Sorabh
    Message was edited by: sorabh arora

    Hi Joseph,
    Thanks for replying.
    I apologize; I have modified the message above.
    I just checked in the cube: there is no failed request for today, and none in the PSA either. The person who handed this over to me may have deleted it from the PSA. Why the request is not in a failed state in the cube, I am not sure, though the setting is "PSA and then into data package".
    So how do I proceed from here?
    Thanks
    Sorabh

  • IPhoto copies, Export problem, upload photos

    HELP!!! I definitely need your expertise.
    Here is the background:
    - I wanted to upload photos to Kodak Gallery through Ofoto Express. Since Ofoto can upload pictures from iPhoto, I went to organize my photos there.
    - I created a new folder (let's call it "F1") in iPhoto, put new pictures into "F1" from a photo CD, and then manually rearranged the photo order in "F1".
    Then here come all the troubles:
    1) Can I resize all the photos in this "F1" album all at once? If so, how do I do that?
    2) When I put pictures into "F1", I notice that the Library, the film roll and 2005 also have a copy of my photos, except they are all out of order. Why is that?
    3) Can I just keep one copy of "F1" without having so many duplicate copies elsewhere, such as under the year or keywords?
    4) Is there a good, faster way to upload all 300 full-size photos?
    5) From the album "F1" I created, I tried to Export all the photos too, but the photos that were exported were out of order and not in the right orientation. What can I do, and what is the function of this Export then?
    6) How come the Duplicate photo function in iPhoto only puts the copy right next to the original? Can I have a separate duplicates folder? When I made a duplicate, I found that the library also had the duplicate, and once I deleted one copy of the two, it deleted the original image in my "F1" album. I don't understand.
    7) How come I cannot just make a "copy" of the "F1" album on my desktop?
    PLEASE HELP, I am so lost!!! Am I the only one who has this problem?
    Thanks,
    MC

    Hi Michelle, I will try to answer some of your questions...
    Then here come all the troubles:
    1) Can I resize all the photos in this "F1" album all at once? If so, how do I do that?
    You resize the photos when they are exported, not while they are within iPhoto.
    2) When I put pictures into "F1", I notice that the Library, the film roll and 2005 also have a copy of my photos, except they are all out of order. Why is that?
    I don't know about the out-of-order part, but it could be that in the album you made, you manually moved the images where you wanted them. In the library, roll, and year views, the view is set globally by what you choose under the View menu.
    3) Can I just keep one copy of "F1" without having so many duplicate copies elsewhere, such as under the year or keywords?
    Those aren't duplicates; they are just like an alias or a placeholder for the real image.
    4) Is there a good, faster way to upload all 300 full-size photos?
    Not really; it all depends on your connection speed. That many images will take a long, long time.
    5) From the album "F1" I created, I tried to Export all the photos too, but the photos that were exported were out of order and not in the right orientation. What can I do, and what is the function of this Export then?
    This has been a problem for many people, including me. There have been workarounds listed here on the discussions by Old Toad. You must rename the images with numbers, either within iPhoto as a batch change or when you export them; use the album name as a naming scheme and the images will be numbered. This order may or may not be what you expected, though. Like I said, working with so many images is going to be very confusing. I will have to try and find Old Toad's solution for you.
    Are you uploading these images to put in an online album or for prints? You know you can also manage them in the online album, rearranging order, naming, etc. These online services also limit the number of images for each album.
    6) How come the Duplicate photo function in iPhoto only puts the copy right next to the original?
    So you can find it easily.
    Can I have a separate duplicates folder?
    Yes, make an album and put the duplicates in that album.
    When I made a duplicate, I found that the library also had the duplicate, and once I deleted one copy of the two, it deleted the original image in my "F1" album. I don't understand.
    If you made the duplicate in an album, it will be reflected in the library.
    If you delete an image from the library, it will be deleted from the album too, as the library is the place where all the images are kept; the albums only have placeholders. If you want to work on a group of images and want the edits, names, etc. to be different from the ones in the library, duplicate all the images in the library first. Make an album and move only the duplicates to that album. Now you can make your images B&W or sepia, edit, crop, etc., and all these edits will be reflected on the duplicates and not the originals in the library.
    7) How come I cannot just make a "copy" of the "F1" album on my desktop?
    You can, but it will take some work to get it exported in the order you want. I will have to do some experimenting to see.

  • AR Delta problem

    Hi Guys,
    I have a problem extracting delta records with 0FI_AR_4. I am extracting the data with delta extraction, but some records are not transferred from R/3, even though in the process chain the AR extraction runs after the GL.

    These are the contents of the table from the initialization to the current date:
    100    0FI_AR_4                       11.02.2007   18:55:35 C                315.288.000.000     5.399.927.990.000                       7.200
    100    0FI_AR_4                       12.02.2007   06:31:36 D              5.399.928.000.000     5.400.791.990.000                       7.200
    100    0FI_AR_4                       20.02.2007   06:38:08 D              5.406.840.000.000     5.407.703.990.000                       7.200
    The settings seem to be OK.

  • Generic delta problem : in init load, the request is in yellow.......

    Hi all,
    I have created a table in R/3 and created a generic DataSource on it. For the delta, I selected "Numeric pointer" and specified the sales document number from the table as the delta-relevant field. The other steps, replication and creation of the InfoSource and InfoPackage, went fine, and I also got the initialization option in the InfoPackage. But when I schedule the load, the request stays yellow, reports "0 records from 0 records", and gives the error message that there are no records in the source system. This happens when I use the init option in the InfoPackage.
    If I use the full-load option in the InfoPackage, everything is fine and I can extract the data from R/3 without any problem; the load succeeds.
    The only problem is with the deltas. Is there anything else to do on the R/3 side to enable the delta mechanism, or is a function module required to enable the deltas (I read this in some posts)? Please give me a solution.
    Thanks
    Ganesh

    Hi,
    There is no need to make any special delta settings; in general the delta should work just like your full load. Try to locate the problem area and report back.
    My advice: check whether there is a previous init and delete it; otherwise create a new
    InfoPackage and try again. A conceptual sketch of how the numeric-pointer selection works follows below.
    Cheers,
    Pattan.
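    To picture what the numeric-pointer delta does behind the scenes, here is a minimal conceptual sketch in ABAP. The table ZSALES_HDR, the field VBELN and the variable names are hypothetical placeholders; in reality the Service API builds the selection from the pointer it stores for the DataSource (visible via RSA7 and the generic delta administration), so this only illustrates the "greater than the last saved pointer" logic, and an init or delta correctly returns zero records when no rows lie above the saved pointer.
    * Conceptual sketch only - not the generated extractor code.
    * ZSALES_HDR, VBELN and LV_LAST_POINTER are hypothetical placeholders.
    DATA: LV_LAST_POINTER TYPE VBELN_VA,   "highest document number already loaded
          LT_DELTA        TYPE STANDARD TABLE OF ZSALES_HDR.
    * A numeric-pointer delta effectively selects only records whose
    * pointer field lies above the value saved after the previous load.
    SELECT * FROM ZSALES_HDR
      INTO TABLE LT_DELTA
      WHERE VBELN > LV_LAST_POINTER.
    * If no document number is greater than the saved pointer, the request
    * returns 0 records - which is what a delta (or init) with no new
    * postings is expected to do.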

  • Delta problem in 2LIS_11_VAITM

    Hello,
    I'm having a problem with the delta update for the DataSource 2LIS_11_VAITM.
    I have an enhancement in this DataSource. When I do a full update, the extractor picks up the data correctly, but with a delta update the enhanced field does not get updated. I think it's because the field that controls the delta queue does not register the change coming from the enhancement.
    Can someone here help me? Is it possible to update this data with a delta update, or only with a full update?
    Thanks.

    Hi Paulo
    Any Z field that you have enhanced and appended to the DataSource does not have the capability to trigger a delta. That is, when the given field changes in VBRP, it will not create a delta in your VAITM DataSource. That is how SAP has designed it: unless you stay with the SAP-defined fields, any addition is not delta-enabled. So a full update will pick up the modified field, but unless the key fields relevant to VAITM change, the delta will not flow through to BI.
    What you can do is this: whatever field you require from VBRP is available as part of the billing DataSource. Bring that into your BI system and, in a combined DSO or cube, write a routine to look up the required field from your billing DSO (see the sketch below).
    Hope this helps.
    Thanks.
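    To make the lookup suggestion concrete, here is a minimal end-routine sketch, assuming a BW 7.x transformation. Every name below - the billing DSO active table /BIC/AZBILL_O0100, the fields BILL_DOC and ZZVBRP_FLD, and the layout of RESULT_PACKAGE - is a hypothetical placeholder that you would replace with the objects generated in your own system.
    * Sketch of an end routine that fills an appended field from the
    * billing DSO. All object and field names are placeholders.
    TYPES: BEGIN OF TY_BILL,
             BILL_DOC       TYPE VBELN_VF,   "billing document (placeholder key)
             ZZVBRP_FLD(18) TYPE C,          "field originally taken from VBRP
           END OF TY_BILL.
    DATA: LT_BILL TYPE STANDARD TABLE OF TY_BILL,
          LS_BILL TYPE TY_BILL.
    FIELD-SYMBOLS: <RESULT> LIKE LINE OF RESULT_PACKAGE.
    IF RESULT_PACKAGE IS NOT INITIAL.
    * Buffer the lookup data once per data package from the billing
    * DSO's active table (placeholder name /BIC/AZBILL_O0100).
      SELECT BILL_DOC ZZVBRP_FLD
        FROM /BIC/AZBILL_O0100
        INTO TABLE LT_BILL
        FOR ALL ENTRIES IN RESULT_PACKAGE
        WHERE BILL_DOC = RESULT_PACKAGE-BILL_DOC.
      SORT LT_BILL BY BILL_DOC.
    * Fill the appended field in every record of the result package.
      LOOP AT RESULT_PACKAGE ASSIGNING <RESULT>.
        READ TABLE LT_BILL INTO LS_BILL
             WITH KEY BILL_DOC = <RESULT>-BILL_DOC
             BINARY SEARCH.
        IF SY-SUBRC = 0.
          <RESULT>-ZZVBRP_FLD = LS_BILL-ZZVBRP_FLD.
        ENDIF.
      ENDLOOP.
    ENDIF.
    The same lookup could also be done per record in a field routine, but buffering the billing data once per package, as above, avoids running one SELECT per row.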

  • 0FC_ACCNTBP_ATTR delta problems

    Hi!
    I have some problems with the 0FC_ACCNTBP_ATTR DataSource. When I run a delta extraction I get the error
    SAPSQL_INVALID_FIELDNAME
    "The SELECT clause was specified in an internal table at runtime.
    It contains the field name "GUID_C", but this does not occur in any of
    the database tables listed in the FROM clause."
    This field is standard, not an extension, so I think the issue should be solved by an SAP Note, but I have not found anything... Does anybody have the same problem?
    Thanks a lot

    Hi Rocio,
    We had a similar problem, but with an extension (an appended field).
    Try to fill the field only in the customer exit of your DataSource,
    and then start the delta again.
    Hope it helps.
    Ralf
    Message was edited by: ralf wagensommer

  • BW GENERIC EXTRACTION DELTA PROBLEM

    I have a problem extracting delta from R/3 to BW.
    Init data has been loaded from R/3 to the BW system for the ZST_ORDER DataSource.
    A repair full request with the selection parameter 1st August 2007 to 15th August 2007 was also created.
    During the delta extraction, a lot of records, around 500,000, are being extracted from R/3 to BW.
    On the R/3 side there are only 5 entries in the AUFK table,
    so the delta should extract fewer than 35 entries.
    ZST_ORDER is a generic DataSource. The delta-relevant field is DATUM (new status for changed records, value 2).
    Thanks in advance for your help. The code looks like this:
    FUNCTION ZST_ORDER.
    *"*"Local Interface:
    *"  IMPORTING
    *"     REFERENCE(I_REQUNR) TYPE SRSC_S_IF_SIMPLE-REQUNR
    *"     REFERENCE(I_DSOURCE) TYPE SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
    *"     REFERENCE(I_MAXSIZE) TYPE SRSC_S_IF_SIMPLE-MAXSIZE DEFAULT 1000
    *"     REFERENCE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
    *"     REFERENCE(I_READ_ONLY) TYPE SRSC_S_IF_SIMPLE-READONLY OPTIONAL
    *"  TABLES
    *"     I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
    *"     I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
    *"     E_T_DATA STRUCTURE ZST_ORDER OPTIONAL
    *"  EXCEPTIONS
    *"     NO_MORE_DATA
    *"     ERROR_PASSED_TO_MESS_HANDLER

      TABLES: AUFK,   "Order master data
              TJ02T,  "System status texts
              TJ30T,  "Texts for user status
              JSTO.   "Status object information

    * Data declaration
      DATA: L_DATE       TYPE DATS,
            L_STATUS     TYPE J_STATUS,
            L_LINES      TYPE SY-TABIX,
            L_CHANGED(1) TYPE C.

    * Auxiliary selection criteria structure
      DATA: L_S_SELECT TYPE SRSC_S_SELECT.

    * Service order data (AUFK)
      DATA: BEGIN OF LT_AUFK OCCURS 0,
              AUFNR   LIKE AUFK-AUFNR,
              AUART   LIKE AUFK-AUART,
              ERDAT   LIKE AUFK-ERDAT,
              AEDAT   LIKE AUFK-AEDAT,
              STDAT   LIKE AUFK-STDAT,
              AEZEIT  LIKE AUFK-AEZEIT,
              ERFZEIT LIKE AUFK-ERFZEIT,
              IDAT1   LIKE AUFK-IDAT1,
              IDAT2   LIKE AUFK-IDAT2,
              IDAT3   LIKE AUFK-IDAT3,
              LOEKZ   LIKE AUFK-LOEKZ,
              OBJNR   LIKE AUFK-OBJNR,
            END OF LT_AUFK.

    * Individual object status (JEST)
      DATA: BEGIN OF LT_JEST OCCURS 0,
              OBJNR LIKE JEST-OBJNR,
              STAT  LIKE JEST-STAT,
              INACT LIKE JEST-INACT,
              CHGNR LIKE JEST-CHGNR,
            END OF LT_JEST.

    * Change documents for system/user statuses (table JEST)
      DATA: BEGIN OF LT_JCDS OCCURS 0,
              OBJNR LIKE JCDS-OBJNR,
              STAT  LIKE JCDS-STAT,
              CHGNR LIKE JCDS-CHGNR,
              USNAM LIKE JCDS-USNAM,
              UDATE LIKE JCDS-UDATE,
              UTIME LIKE JCDS-UTIME,
              INACT LIKE JCDS-INACT,
              CHIND LIKE JCDS-CHIND,
            END OF LT_JCDS.

    * Status object information (JSTO)
      DATA: BEGIN OF LT_JSTO OCCURS 0,
              OBJNR LIKE JSTO-OBJNR,
              STSMA LIKE JSTO-STSMA,
            END OF LT_JSTO.

    * Static fields
      STATICS: S_S_IF TYPE SRSC_S_IF_SIMPLE,
    *          Counter
               S_COUNTER_DATAPAKID LIKE SY-TABIX,
    *          Cursor
               S_CURSOR TYPE CURSOR.

    * User-defined ranges
      RANGES: L_R_AUFNR FOR AUFK-AUFNR,
              L_R_ERDAT FOR AUFK-ERDAT.

    * Initialization mode (first call by SAPI) or data transfer mode
    * (following calls)?
      IF I_INITFLAG = SBIWA_C_FLAG_ON.

    *   Check DataSource validity
        CASE I_DSOURCE.
          WHEN 'ZST_ORDER'.
          WHEN OTHERS.
            RAISE ERROR_PASSED_TO_MESS_HANDLER.
        ENDCASE.

    *   Copy selection criteria for future extractor calls (fetch calls)
        APPEND LINES OF I_T_SELECT TO S_S_IF-T_SELECT.

    *   Fill parameter buffer for data extraction calls
        S_S_IF-REQUNR   = I_REQUNR.
        S_S_IF-DSOURCE  = I_DSOURCE.
        S_S_IF-MAXSIZE  = I_MAXSIZE.
        S_S_IF-INITFLAG = I_INITFLAG.
        APPEND LINES OF I_T_FIELDS TO S_S_IF-T_FIELDS.

      ELSE.

    *   First data package -> OPEN CURSOR
        IF S_COUNTER_DATAPAKID = 0.

          LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT
               WHERE FIELDNM = 'AUFNR'.
            MOVE-CORRESPONDING L_S_SELECT TO L_R_AUFNR.
            APPEND L_R_AUFNR.
          ENDLOOP.

          LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT
               WHERE FIELDNM = 'ERDAT'.
            MOVE-CORRESPONDING L_S_SELECT TO L_R_ERDAT.
            APPEND L_R_ERDAT.
          ENDLOOP.

    *     Lower limit of the delta-relevant field DATUM
          LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT
               WHERE FIELDNM = 'DATUM'.
            L_DATE = L_S_SELECT-LOW.
          ENDLOOP.

          OPEN CURSOR WITH HOLD S_CURSOR FOR
            SELECT AUFNR AUART ERDAT
                   AEDAT OBJNR AEZEIT
                   STDAT ERFZEIT IDAT1
                   IDAT2 IDAT3 LOEKZ
              FROM AUFK
              WHERE AUFNR IN L_R_AUFNR
                AND ERDAT IN L_R_ERDAT.
        ENDIF.

    *   Fetch records into interface table LT_AUFK
        FETCH NEXT CURSOR S_CURSOR
          APPENDING CORRESPONDING FIELDS OF TABLE LT_AUFK
          PACKAGE SIZE S_S_IF-MAXSIZE.
        IF SY-SUBRC <> 0.
          CLOSE CURSOR S_CURSOR.
          RAISE NO_MORE_DATA.
        ENDIF.

    *   Determine the number of lines of table LT_AUFK
        L_LINES = LINES( LT_AUFK ).

        IF L_LINES IS NOT INITIAL.

    *     Sort the internal table LT_AUFK
          SORT LT_AUFK BY OBJNR ASCENDING.

    *     Select the records from JCDS for the entries in LT_AUFK
          SELECT OBJNR STAT CHGNR USNAM
                 UDATE UTIME INACT CHIND
            INTO TABLE LT_JCDS
            FROM JCDS
            FOR ALL ENTRIES IN LT_AUFK
            WHERE OBJNR EQ LT_AUFK-OBJNR
              AND ( CHGNR EQ '001'
                 OR UDATE >= L_DATE ).

    *     Sort the internal table LT_JCDS
          SORT LT_JCDS BY OBJNR STAT CHGNR ASCENDING.

    *     Select records from table JEST for the entries in LT_AUFK
          SELECT OBJNR STAT INACT CHGNR
            INTO TABLE LT_JEST
            FROM JEST
            FOR ALL ENTRIES IN LT_AUFK
            WHERE OBJNR = LT_AUFK-OBJNR.

          SELECT OBJNR STSMA
            INTO TABLE LT_JSTO
            FROM JSTO
            FOR ALL ENTRIES IN LT_AUFK
            WHERE OBJNR = LT_AUFK-OBJNR.
          SORT LT_JSTO BY OBJNR.

        ENDIF.

    *   One output record is built per status entry in LT_JEST
        LOOP AT LT_JEST.
          CLEAR: LT_AUFK,
                 L_CHANGED.
          CLEAR LT_JSTO.

          READ TABLE LT_JSTO WITH KEY OBJNR = LT_JEST-OBJNR BINARY SEARCH.
          IF SY-SUBRC EQ 0.
            E_T_DATA-STSMA = LT_JSTO-STSMA.
          ENDIF.

    *     Read the order data from LT_AUFK
          READ TABLE LT_AUFK WITH KEY OBJNR = LT_JEST-OBJNR BINARY SEARCH.
          E_T_DATA-AUFNR   = LT_AUFK-AUFNR.
          E_T_DATA-AUART   = LT_AUFK-AUART.
          E_T_DATA-ERDAT   = LT_AUFK-ERDAT.
          E_T_DATA-AEDAT   = LT_AUFK-AEDAT.
          E_T_DATA-AEZEIT  = LT_AUFK-AEZEIT.
          E_T_DATA-ERFZEIT = LT_AUFK-ERFZEIT.
          E_T_DATA-IDAT1   = LT_AUFK-IDAT1.
          E_T_DATA-IDAT2   = LT_AUFK-IDAT2.
          E_T_DATA-IDAT3   = LT_AUFK-IDAT3.
          E_T_DATA-LOEKZ   = LT_AUFK-LOEKZ.
          E_T_DATA-INACT   = LT_JEST-INACT.
          E_T_DATA-CHGNR   = LT_JCDS-CHGNR.
          E_T_DATA-DATUM   = LT_AUFK-ERDAT.

    *     Creation entry (change number 001) of this status
          READ TABLE LT_JCDS WITH KEY OBJNR = LT_JEST-OBJNR
                                      STAT  = LT_JEST-STAT
                                      CHGNR = '001'
                             BINARY SEARCH.
          IF SY-SUBRC EQ 0.
            E_T_DATA-UDATE  = LT_JCDS-UDATE.
            E_T_DATA-AEDAT  = LT_JCDS-UDATE.
            E_T_DATA-UTIME  = LT_JCDS-UTIME.
            E_T_DATA-AEZEIT = LT_JCDS-UTIME.
            E_T_DATA-CHIND  = LT_JCDS-CHIND.
            E_T_DATA-USNAM  = LT_JCDS-USNAM.
            E_T_DATA-CHGNR  = LT_JCDS-CHGNR.

            IF LT_JCDS-UDATE GE L_DATE
               AND L_DATE IS NOT INITIAL.
              L_CHANGED = 'X'.
            ENDIF.

    *       Latest change entry of this status, if it is not the creation
            IF LT_JEST-CHGNR NE '001'.
              CLEAR LT_JCDS.
              READ TABLE LT_JCDS WITH KEY OBJNR = LT_JEST-OBJNR
                                          STAT  = LT_JEST-STAT
                                          CHGNR = LT_JEST-CHGNR
                                 BINARY SEARCH.
              IF SY-SUBRC EQ 0.
                L_CHANGED = 'X'.
                E_T_DATA-AEDAT  = LT_JCDS-UDATE.
                E_T_DATA-AEZEIT = LT_JCDS-UTIME.
                E_T_DATA-CHIND  = LT_JCDS-CHIND.
                E_T_DATA-USNAM  = LT_JCDS-USNAM.
                E_T_DATA-CHGNR  = LT_JCDS-CHGNR.
              ENDIF.
            ENDIF.

    *       Split the status into system status / user status
            IF LT_JEST-STAT(1) EQ 'I'.
              E_T_DATA-ISTAT = LT_JEST-STAT.
            ELSEIF LT_JEST-STAT(1) EQ 'E'.
              E_T_DATA-ESTAT = LT_JEST-STAT.
            ENDIF.

    *       Append only records changed on or after the delta date,
    *       or all records when no delta date was supplied
            IF L_CHANGED EQ 'X'
               AND L_DATE IS NOT INITIAL.
              APPEND E_T_DATA.
            ELSEIF L_DATE IS INITIAL.
              APPEND E_T_DATA.
            ENDIF.
          ENDIF.

          CLEAR: LT_AUFK,
                 E_T_DATA.
        ENDLOOP.

        CLEAR LT_JEST.
        REFRESH LT_JEST.

    *   Next data package
        S_COUNTER_DATAPAKID = S_COUNTER_DATAPAKID + 1.

      ENDIF.  "Initialization mode or data extraction?

    ENDFUNCTION.

    Hi,
    For that quantity, SALK3 is referring to a different table in the reference field of your table fields. Check that field and reference the correct table for it in the table view.
    The table join has a problem there; I think you joined some other table with that field. Replace the field assignment or create a new view.
    Hope this helps you.
    Regards
    Harikrishna N

  • Initialisation option - Delta problem

    Hi guys..
    I tried to load my delta request and it says that no active delta initialisation is available for the DataSource.
    I checked the InfoPackage -> Scheduler -> Initialization options for the source system, and there is no init request available there (I think it got deleted accidentally).
    In my target we have the init request as well as the delta requests for the past two months.
    So how do I recover this initialisation request?
    Can I do an initialisation without data transfer and then continue loading my deltas? Will that cause any problems?
    Thanks & Regards
    Ragu

    Hello,
    You can run the deltas without redoing the init, but it depends on the number of delta records since you last did the init (and populated the setup tables).
    If the deltas are huge, just running the V3 job again may overflow the delta queues!
    Because of this risk I suggested redoing the setup run.
    But if you have constraints, don't delete the setup tables; delete the delta queues for your DataSource from RSA7 and reschedule the V3 job.
    Then run the delta in BW again and check the reports against the R/3 table records (just to verify that the delta queues didn't overflow and all deltas were taken care of).
    Hope this helps.
    Thanks,

  • MSI KT6 Delta PROBLEM

    First, my hardware :
    Everything running @ default speed & settings.
    AMD XP 2700+ (running @ about 47C idle and about 55C in work)
    MSI KT6 Delta (v2) with latest bios
    512mb DDR (333) CL2.5 Elixir
    Seagate S-ATA ST380013 (80gb), with WinXP Pro (SP1) installed
    Samsung SP1203N (120gb), as primary master on IDE1
    Samsung 52/32/52x CD-RW, as secondary master on IDE2
    Connect3D Radeon 9600 128mb
    PSU Antec TruePower 380w:
    +5V 35A  
    -5V 0.5A  
    +12V 18A  
    -12V 1.0A  
    +3.3V 28.0A  
    +5VSB 2.0A  
    I keep getting a strange BSOD, no matter what I do with the machine. Windows locks up completely, and after about a minute the BSOD appears and the machine reboots.
    ( http://www.saniainen.com/bs.jpg )
    In normal usage it happens about 1-3 times a day.
    Mostly it happens when I copy files from one disk to the other,
    but it happens every time I try to decompress big (over 400 MB) ZIP files.
    Everything worked with my old board, an MSI KT4A. The problems began when I changed the motherboard and added the Seagate S-ATA HDD.
    I have ordered new memory, Kingston HyperX 512 MB (333). Could it be memory-related, or something with the S-ATA?
    I have tried everything: the latest BIOS, the latest VIA 4-in-1 drivers, memory checks, etc.

    I had the same problem on a different mobo, and all I did to fix it was switch the IDE cables around on the devices.
    Give it a try.
    It may be a jumper you forgot to change on the IDE devices.

Maybe you are looking for

  • Mail.app keeps saving downloads to Mail Downloads rather than ~/Downloads

    I've modified Mail.app preferences to save downloads in the main downloads folder, ~/Downloads. But every time I double click an attachment, it still downloads to the wrong folder, Mail Downloads, in a terribly nested directory, ~/Library/Containers/

  • Please help me center an SWF on the page.

    Hello, I have a rather simple problem I think, but I can't seem to find the solution. I am trying to get the same effect found here at: http://osc.template-help.com/wt_25248/index.html I want to center an SWF that is larger than the browser window, a

  • KT4AV with mouse problems

    Hello; for a week I have been having mouse trouble on my computer. I have tried many mice, PS/2 and, soon after, a newly bought USB optical mouse, but that did not solve the problem. The problem is that the mouse sometimes freezes or makes erratic movements. I scanned my computer with di

  • USPS Print initialization causes Reader launch - never did before

    Mac 10.6.8 Adobe Reader 10.1.0 Safari Browser 5.1 When using Click N Ship from USPS, at the print point, at least for the past number of years, when ok to print is clicked, the Adobe print initialization box popped open and you would select 'ok' to p

  • Update bar code table

    By using FM ALINK_BARCODE_GLOBAL we can update the barcode table for a trip number with object type BUS2089. Where is FM ALINK_BARCODE_GLOBAL called to update this?