Duplicate records in the delta load

BW experts,
I am getting the following errors when running a delta load from 2LIS_02_ITM
into 0BBP_CON, 0BBP_INV, 0BBP_PO, 0SRCT_DS1, 0SR_VEGUID.
The delta loads show a green light for all the targets except 0SR_VEGUID, where
the error messages are:
3 Duplicate record found. 2295 records used in /BI0/XSR_VEGUID
3 Duplicate record found. 2295 records used in /BI0/PSR_VEGUID
I tried to check the contents of both /BI0/XSR_VEGUID and /BI0/PSR_VEGUID,
but I get a message saying the tables are not active in the dictionary.
However, I can view the data in the InfoObject 0SR_VEGUID.
How can I resolve this issue?
Any help will be appreciated.
Many Thanks

Kiran,
Welcome to SDN,
P and X tables are created only if the InfoObject has 1. time-independent navigational attributes (X table) and
2. time-independent master data attributes (P table).
Otherwise the tables will not be created.
Arun
What table do you see in the header when you display the data for the InfoObject? On the screen, do you see something like
/BI0/X0SR_VEGUID, where the prefix letters could vary? Ideally the P table should be viewable...

Similar Messages

  • Issue in the Delta load using RDA

    Hi All,
    I am facing an issue while trying to load a delta using RDA from an R/3 source system.
    These are the steps I followed:
    1. Created a real-time generic DataSource with timestamp as the delta-specific field and replicated it to BI.
    2. First I created an InfoPackage (initialization with data transfer) and loaded up to the PSA.
    3. Created a standard DTP to load to the DSO and activated the data.
    4. Then created a real-time delta InfoPackage and assigned it to a daemon.
    5. Converted the standard DTP to a real-time DTP and assigned it to the same daemon.
    6. Started the daemon with an interval of 5 minutes.
    The first time, initialization with data transfer picks up the records correctly. But when I run the daemon to take the delta records, it takes all the records again, i.e. both the previously uploaded historical data and the delta records.
    Also, after the first delta run, the request status in the daemon monitor changes to red for both the InfoPackage and the DTP, and the daemon stops automatically.
    Can anyone please help me to solve these issues.
    Thanks & Regards,
    Salini.

    Salini S wrote:
    The first time, initialization with data transfer picks up the records correctly. But when I run the daemon to take the delta records, it takes all the records again, i.e. both the previously uploaded historical data and the delta records.
    If I understand you correctly, you initially did a full load. Yes? Well, next you need to do an initialisation and after that a delta.
    The reason is that if you select delta initialisation without data transfer, the delta queue is initialised and the next delta load will pick up only the changed records.
    If you select delta initialisation with data transfer, the delta queue is initialised and the existing records are picked up in the same load.
    As you know your targets will receive the changed records from the delta queue.
    Salini S wrote:
      Also, after the first delta run, the request status in the daemon monitor changes to red for both the InfoPackage and the DTP, and the daemon stops automatically.
    I take it the InfoPackage has run successfully? Did you check? If it has and the error is on the DTP, then I suggest the following.
    At runtime, erroneous data records are written to an error stack if error handling for the data transfer process is activated. You use the error stack to update the data to the target once the error is resolved.
    To resolve the error, in the monitor for the data transfer process you can navigate to PSA maintenance by choosing Error Stack in the toolbar, and display and edit the erroneous records in the error stack.
    I suggest you create an error DTP for the active data transfer process on the Update tab page. (If the key fields of the error stack for a DataStore object are in overwrite mode, define the key fields for the error stack on the Extraction tab page under Semantic Groups.) The error DTP uses full update mode to extract data from the error stack (in this case, the source of the DTP) and transfer it to the target that you have already defined in the data transfer process. Once the data records have been successfully updated, they are deleted from the error stack. If there are any erroneous data records, they are written to the error stack again in a new error DTP request.
    As I'm sure you know, when a DTP request is deleted, the corresponding data records are also deleted from the error stack.
    I hope the above helps you.

  • Initialize the delta load

    Hi BW Experts,
    I have question...
    1. There is an ODS01 which feeds many data targets.
    2. We created an ODS02 which will also be loaded from this ODS01. The update type would be DELTA.
    3. We created an INFOCUBE01 which will be loaded from ODS02. The update method would be FULL UPDATE.
    4. The first time (later on I didn't get this message), when I went to the InfoSource for <b>8ODS02</b> and tried to create an InfoPackage to run the full update of INFOCUBE01, I got a message saying "THERE IS NO ACTIVE DELTA INITIALIZATION FOR THIS DATASOURCE".
    The details of the msg is:
    Procedure
    You have to initialize the delta load for a DataSource before you are able to load the delta data from it.
    If you want to load deltas only for the purposes of testing, or you do not require any initialization data (because you have already uploaded it using a full upload, for example) use an initialization simulation.
    This way, only the init. entries you require are made in the control tables of the source system, and a 'dummy' entry appears in the Monitor, setting the load status to green.
    No data is in fact loaded, however, and the initialization takes only a few seconds to complete.
    Does this mean I have to initialize the delta load for ODS02 or for INFOCUBE01?
    How do I do that?
    <i>(I actually did a few full loads; then I was informed that it would be a delta load from ODS01 to ODS02.
    Within the InfoPackage for ODS02, I selected Delta update the next time and started immediately; the monitor then displayed no records.)</i>
    I'm a little confused with this initialization and in what cases I would do it.
    Would you please help...
    Thank you,
    Tia
    Message was edited by: Tia Ismail

    1. So, you are saying that for ODS02 I should do a full load first,
    2. then an "Initialize delta process",
    3. and then a "delta process".
    I had partially tried this: I did a full load first in one InfoPackage, then created another InfoPackage trying to do an "Initialize delta process". I get the following message:
    <i>Diagnosis
    Deltas have already been loaded for the init. selection of request REQU_3ZEHH34LV68LVHAWT75TJVIF8. A second initialization is therefore not possible with these selection conditions.
    Procedure
    If you want to carry out a new initialization, you have to choose selections that do not overlap.
    If you want to repeat the init. for the same selection, you have to delete the delta queue in the source system, and restart the delta process.
    Choose the menu entry 'Init. Selections in the Source System' from the Scheduler.</i>
    Any thoughts...
    Thank you,
    Tia

  • How should I reload the delta load?

    Dear all,
    An ODS holds sales invoice information with a daily delta load. The loading is fine, but one key figure somehow got different logic from April onwards: for the same order, this KF has the correct value after April but an incorrect value before April. So I am going to reload the data older than April, but all the old loads are fine and still exist, so how should I proceed?
    Thanks

    Hi,
    Thanks for your reply. There is no date available for selection. So I found some data in the ODS for a specific company code and bill type (I got 16 records), then I created an InfoPackage with the same company code and bill type in the selection, set it to full load, and ran it immediately, but 0 records were loaded. I am sure the corresponding data exists in R/3 because I checked it using VA03. So what's the problem?
    Thanks in advance

  • The delta load for a generic machine in CRM doesn't run.

    When the machine was dismantled from one functional location and installed at a different functional location, the changes didn't download to CRM. Even the request load fails for the same machine. The machine is a generic machine.
    In R/3, the queue is in STOP status and gives the error "Generic stop set on activation".
    The user had changed the functional location for a generic machine in R/3; the changes didn't upload to CRM, and even the request load failed to upload them.
    The IDoc failed on R/3 with the error message "Generic Stop Set".
    Thanks in advance

    Hello Mohammed,
    I have also met this issue in the past, and I resolved it by following the steps described in this link: http://scn.sap.com/docs/DOC-56972
    Hope it helps.
    Best regards,
    Jerry

  • Help with delta load

    Hi Gurus!
    I have a problem and I am not sure how to solve it.
    Every month we receive a flat file with two months of accounting data. This data goes into an ODS. The next month we receive a new file containing the new month and the previous month again, but with some new values for the previous month. My problem is that I don't want the full value to go from the ODS the second time, only the difference. Simplified example:
    File 1 (period, amount):
    200602  2000
    200601  1000
    File 2 (period, amount):
    200603  1500
    200602  2200
    I want the ODS to delta load to another ODS like this:
    After the first file:
    200601  1000
    200602  2000
    After the second file:
    200603  1500
    200602   200
    It does not sound like rocket science, but I am not sure how best to solve it. Grateful for any suggestions!
    Best Regards
    Pontus

    Hi Roberto!
    I totally agree that I won't lose my 200601 data. I just meant the records in the delta load (not the records in the ODS). Sorry if that was not clear.
    I now realize that I have to write some ABAP. I thought there was a way to solve it without that.
    Thanks for your answer (you will be rewarded).
    Regards
    Pontus
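    Since some ABAP turns out to be needed, here is a minimal sketch of what a 3.x-style key-figure update routine into the second ODS (update type Addition) could look like: read the amount the target already holds for the period and forward only the difference. All object names here are hypothetical placeholders, not your real objects: /BIC/AZODS0200 stands for the active table of the receiving ODS, ZAMOUNT for the key figure, and FISCPER for the period key.

    ```abap
    * Key-figure routine sketch (update type Addition): forward only the
    * difference between the incoming full value and what the receiving
    * ODS already holds for this period. Object names are placeholders.
    DATA: lv_old TYPE /bic/oizamount.      " amount already in the ODS

    SELECT SINGLE /bic/zamount FROM /bic/azods0200
      INTO lv_old
      WHERE fiscper = comm_structure-fiscper.
    IF sy-subrc <> 0.
      lv_old = 0.                          " period seen for the first time
    ENDIF.

    * Second file, period 200602: 2200 - 2000 = 200
    result = comm_structure-/bic/zamount - lv_old.
    returncode = 0.                        " keep the record
    ```

    With the example data, the second file's 200602 record would then add 2200 - 2000 = 200 to the second ODS, matching the desired result.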

  • Duplicate records in delta load - please help (will assign points)

    Hi all,
    I am extracting payroll data with DataSource 0HR_PY_1 for the 0PY_C02 cube.
    I ran a full load with selection criteria 01.2007 to 02.2007 in the InfoPackage and extracted 20,000 records, and then
    I ran an init of delta without data transfer, which extracted 0 records, as expected.
    Then I ran a delta with selection criteria 02.2007 to 01.2010 in the InfoPackage and extracted 4,500 records, in which the February records were extracted again.
    What could be the reason for duplicate records occurring in the delta load?
    I have seen the same records in the full load with selection criteria 01.2007 to 02.2007 as well as with selection criteria 02.2007 to 01.2010. How is that possible?
    Actually, the DataSource 0HR_PY_1 does not support delta. Apart from this, what other reasons are there for duplicate records to occur? Please help!
    Will assign points.

    Your selection criteria -
    01.2007 to 02.2007, as well as 02.2007 to 01.2010 -
    both include the month 02.2007,
    so all records for 02.2007 fall into both selections.
    Have you checked that?
    Regards,
    Naveen Natarajan

  • 0 records in master data delta loads.

    We have activated delta on 0CUSTOMER_ATTR and verified that the queue is active in RSA7. However, the delta load is not fetching any records; a full load, however, does. I have deleted and reactivated the delta init multiple times with no success.
    The InfoPackage error monitoring status shows the following messages:
    <i>No new data since the last delta update
    Diagnosis
    The data request was a delta update, meaning, the data basis in the
    source system has not changed since the last update.
    System response
    Info IDoc received with status 8.
    Procedure
    Check the data basis in the source system.</i>
    Thanks in advance.
    D

    Hi, ok, first check what Ajay said: whether change pointers are activated in R/3 (transaction BD61).
    After that, go to table ROOSGEN, enter the DataSource name 0CUSTOMER_ATTR, and select the line for the R/3 system from which you want to extract data.
    Copy the message type.
    Then, after you have made a change (update or creation), go to tables CDHDR and CDPOS to get the change document object, table name, and field name.
    After that, go to BD52, enter your message type, and add the entry - in your case, for the phone number on the communication tab:
    change doc object ADRESSE, table ADR2, field name TEL_NUMBER.
    Launch the delta; it should work.
    In fact, when you go to BD52 you will see the fields that are taken into consideration when an update is made (they are mainly based on the change document object DEBI), so if you need more fields, add them according to your needs and the information you get from CDHDR and CDPOS when you make changes. You can only do this manually, so take care to repeat these steps in QA and in production.
    Hope this could help you to solve your issue.
    regards
    Boujema

  • Delta load - BI Statistics 0 records

    Hi All,
    I have loaded the init successfully for all the InfoProviders related to BI Statistics, but the delta loads are bringing 0 records. I checked RSA7 and found no records there.
    Please help me resolve this issue.
    Regards,

    Hi Vikas,
    I will tell you how I did the settings for the statistics:
    1. Enabled statistics collection for all the InfoProviders (RSA1 - Tools - Settings for BI Statistics - changed the default setting to X for the selected InfoProviders).
    2. Ran the init load for all the statistics cubes.
    When running the delta, no records are updated.

  • How to fill setup tables without missing the delta records

    Hi,
    I would like to fill the setup tables in the production system for a logistics application.
    Can you please guide me on how to perform this?
    What points should be considered?
    For instance, if I start filling the setup tables at 10 AM and there are postings at 10:05, 10:06, and so on,
    how can I collect them? I.e., will I miss any records in the second delta run? What steps need to be taken care of?
    Thanks in advance
    Naresh.

    Hi.
    You can fill the setup tables during normal operating hours if you load the data into an ODS and the update mode is "Queued delta". Downtime is needed to avoid duplicates. But if you use "Direct delta", you will miss the delta documents; hence it is better to take the downtime approach in that case.
    Initially your delta records are stored in the extraction queue; then, when you run the collective job, the records are moved into the delta queue. You can run the collective job (LBWE) any time after the init run. If you need a daily delta, schedule this job before the delta load; you can schedule it either hourly or daily. This moves your records into the delta queue. At the time of delta loading, all the records in the delta queue are moved into BW.
    Thanks.

  • Delta loading of the material classification

    Hello,
    I use the 1CL_OMAT001 DataSource in order to extract the material classification from R/3 into BW.
    The load works in init/delta mode.
    If I change a characteristic value from A to ' ' (re-initialized) in R/3, the delta load does not take the modification into account, and therefore in BW I always keep the value first defined (A), whereas I would like to see ' ' for that characteristic.
    Fortunately, when I modify the characteristic value from A to B, the delta load does take the modification into account and B appears in BW.
    Did you notice the same problem? What did you do about it?
    Does anybody have a solution?
    Thanks for your answers.
    Vanessa.

    Hi,
    We are using the same DataSource and we planned to go for delta, but could not manage it. The data we are loading is around 300,000 records, so rather than going for delta we do a daily full load with deletion of the previous request. When we think the data is getting huge, we do a full load for the required period and then change the selection for the next one.
    E.g., in the InfoPackage selection for last year we had the fiscal year/period set to 001.2005 to 012.2005; once the year was over, we changed the selection to 001.2006 to 012.2006. Throughout the year we have a maximum of 1 million records, so this works fine for us. Check this against your requirement.
    Ravi.

  • Missing delta records for the Z extractor

    Hi,
    I have created a Z extractor for table BUT050 in CRM. I have created a function module and included the delta logic on CRDAT (created date), CRTIM (created time), CHDAT (change date), and CHTIM (change time). I initialized the delta and started running delta loads, but found that a few records are missing in the extractor when compared to the table. I'm attaching my code below. Can anyone please look into it and tell me what the issue is?
    Text removed by moderator
    Thanks
    FUNCTION Z_BI_CC_BUT050.
    *"*"Local Interface:
    *"  IMPORTING
    *"     VALUE(I_REQUNR) TYPE  SRSC_S_IF_SIMPLE-REQUNR
    *"     VALUE(I_DSOURCE) TYPE  SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
    *"     VALUE(I_MAXSIZE) TYPE  SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
    *"     VALUE(I_INITFLAG) TYPE  SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
    *"     VALUE(I_READ_ONLY) TYPE  SRSC_S_IF_SIMPLE-READONLY OPTIONAL
    *"     VALUE(I_REMOTE_CALL) TYPE  SBIWA_FLAG DEFAULT SBIWA_C_FLAG_OFF
    *"  TABLES
    *"      I_T_SELECT TYPE  SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
    *"      I_T_FIELDS TYPE  SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
    *"      E_T_DATA STRUCTURE  ZBI_BUT050 OPTIONAL
    *"  EXCEPTIONS
    *"      NO_MORE_DATA
    *"      ERROR_PASSED_TO_MESS_HANDLER
    * Auxiliary Selection criteria structure
       DATA: l_s_select TYPE srsc_s_select.
    * Maximum number of lines for DB table
       STATICS: s_s_if TYPE srsc_s_if_simple,
    * counter
               s_counter_datapakid LIKE sy-tabix,
    * cursor
               s_cursor TYPE cursor.
    * Select ranges
       RANGES: l_r_RELNR FOR ZBI_BUT050-RELNR,
               l_r_PARTNER1 FOR ZBI_BUT050-PARTNER1,
               l_r_PARTNER2 FOR ZBI_BUT050-PARTNER2,
               l_r_DATE_TO FOR ZBI_BUT050-DATE_TO,
               l_r_ZZTMSTMP FOR ZBI_BUT050-ZZTMSTMP.
       DATA : startdate LIKE sy-datum,
              starttime LIKE sy-uzeit,
              enddate LIKE sy-datum,
              endtime LIKE sy-uzeit,
              tstamp LIKE tzonref-tstamps,
              timezone type TZONREF-TZONE.
       RANGES: l_r_CRDAT FOR ZBI_BUT050-CRDAT,
               l_r_CRTIM FOR ZBI_BUT050-CRTIM.
    * Initialization mode (first call by SAPI) or data transfer mode
    * (following calls) ?
       IF i_initflag = sbiwa_c_flag_on.
    * Initialization: check input parameters
    *                 buffer input parameters
    *                 prepare data selection
    * Check DataSource validity
         CASE i_dsource.
           WHEN 'ZCC_MA_BUT050'.
           WHEN OTHERS.
             IF 1 = 2. MESSAGE e009(r3). ENDIF.
    * this is a typical log call. Please write every error message like this
             log_write 'E'                  "message type
                       'R3'                 "message class
                       '009'                "message number
                       i_dsource   "message variable 1
                       ' '.                 "message variable 2
             RAISE error_passed_to_mess_handler.
         ENDCASE.
         APPEND LINES OF i_t_select TO s_s_if-t_select.
    * Fill parameter buffer for data extraction calls
         s_s_if-requnr    = i_requnr.
         s_s_if-dsource = i_dsource.
         s_s_if-maxsize   = i_maxsize.
    * Fill field list table for an optimized select statement
    * (in case that there is no 1:1 relation between InfoSource fields
     * and database table fields this may be far from being trivial)
         APPEND LINES OF i_t_fields TO s_s_if-t_fields.
       ELSE.                 "Initialization mode or data extraction ?
    * Data transfer: First Call      OPEN CURSOR + FETCH
    *                Following Calls FETCH only
    * First data package -> OPEN CURSOR
         IF s_counter_datapakid = 0.
    * Fill range tables BW will only pass down simple selection criteria
    * of the type SIGN = 'I' and OPTION = 'EQ' or OPTION = 'BT'.
           LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = 'RELNR'.
             MOVE-CORRESPONDING l_s_select TO l_r_RELNR.
             APPEND l_r_RELNR.
           ENDLOOP.
          LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = 'PARTNER1'.
             MOVE-CORRESPONDING l_s_select TO l_r_PARTNER1.
             APPEND l_r_PARTNER1.
           ENDLOOP.
           LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = 'PARTNER2'.
             MOVE-CORRESPONDING l_s_select TO l_r_PARTNER2.
             APPEND l_r_PARTNER2.
           ENDLOOP.
           LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = 'DATE_TO'.
             MOVE-CORRESPONDING l_s_select TO l_r_DATE_TO.
             APPEND l_r_DATE_TO.
           ENDLOOP.
    * Timestamp is delivered as a selection criterion.
    * Split the timestamp into date and time
          LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = 'ZZTMSTMP'.
             tstamp = l_s_select-low.
             timezone = 'EST'.
             CONVERT TIME STAMP tstamp TIME ZONE timezone
              INTO DATE startdate TIME starttime.
             tstamp = l_s_select-high.
             CONVERT TIME STAMP tstamp TIME ZONE timezone
              INTO DATE enddate TIME endtime.
             l_r_CRDAT-low = startdate.
             l_r_CRDAT-sign = l_s_select-sign.
             l_r_CRDAT-option = l_s_select-option.
             l_r_CRDAT-high = enddate.
             APPEND l_r_CRDAT.
             l_r_CRTIM-low = starttime.
             l_r_CRTIM-sign = l_s_select-sign.
             l_r_CRTIM-option = l_s_select-option.
             l_r_CRTIM-high = endtime.
             APPEND l_r_CRTIM.
           ENDLOOP.
    * Determine number of database records to be read per FETCH statement
    * from input parameter I_MAXSIZE. If there is a one to one relation
    * between DataSource table lines and database entries, this is trivial.
    * In other cases, it may be impossible and some estimated value has to
    * be determined.
           OPEN CURSOR WITH HOLD s_cursor FOR
     * Use the derived start/end date and time for both the creation and
     * the change selections, so that both creations and changes in the
     * given time period are picked up.
           SELECT * FROM BUT050
                  WHERE RELNR IN l_r_RELNR
                   AND PARTNER1 IN l_r_PARTNER1
                   AND PARTNER2 IN l_r_PARTNER2
                   AND DATE_TO IN l_r_DATE_TO
                   AND ( CRDAT >= startdate AND ( CRTIM >= starttime OR ( CRDAT <= enddate AND CRTIM <= endtime ) ) )
                   OR ( CHDAT >= startdate AND (  CHTIM >= starttime OR ( CHDAT <= enddate AND CHTIM <= endtime ) ) ).
         ENDIF.
         "First data package ?
    * Fetch records into interface table.
    *   named E_T_'Name of extract structure'.
         FETCH NEXT CURSOR s_cursor
                    APPENDING CORRESPONDING FIELDS
                    OF TABLE e_t_data
                    PACKAGE SIZE s_s_if-maxsize.
         IF sy-subrc <> 0.
           CLOSE CURSOR s_cursor.
           RAISE no_more_data.
         ENDIF.
         s_counter_datapakid = s_counter_datapakid + 1.
       ENDIF.              "Initialization mode or data extraction ?
    ENDFUNCTION.
    Message was edited by: Matthew Billingham

    Hi,
    As far as I know, you want to load a particular period of data; try a repair full request - that may solve the issue.
    Regards
    Sivaraju
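    For reference, the missing records may come from the WHERE clause itself: in Open SQL, AND binds more tightly than OR, so the CHDAT branch is not restricted by the RELNR/PARTNER1/PARTNER2/DATE_TO ranges, and the date/time comparison couples the time checks to the wrong dates, which can drop valid records. A fully parenthesized interval check along these lines might behave better (a sketch over the same BUT050 fields and variables from the function module above; verify the boundary handling against your data):

    ```abap
    * Select rows created OR changed inside [start, end], comparing the
    * date first and the time only on the boundary days.
    OPEN CURSOR WITH HOLD s_cursor FOR
      SELECT * FROM but050
        WHERE relnr    IN l_r_relnr
          AND partner1 IN l_r_partner1
          AND partner2 IN l_r_partner2
          AND date_to  IN l_r_date_to
          AND ( ( ( crdat > startdate
                 OR ( crdat = startdate AND crtim >= starttime ) )
              AND ( crdat < enddate
                 OR ( crdat = enddate AND crtim <= endtime ) ) )
             OR ( ( chdat > startdate
                 OR ( chdat = startdate AND chtim >= starttime ) )
              AND ( chdat < enddate
                 OR ( chdat = enddate AND chtim <= endtime ) ) ) ).
    ```

    Note that records changed at exactly the boundary timestamp can still be picked up twice by consecutive runs; a safety overlap plus duplicate handling in the target is the usual mitigation.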

  • IDoc delta loads using the program RBDMIDOC

    Hi ALL,
    We are doing some delta loads for ALE IDocs using the program RBDMIDOC with message type HRMD_A. Under this message type we have configured two different distribution models with different infotype data.
    When we execute the program, we have only one input parameter, which takes the message type. Based on our setup, will the program send the deltas for both distribution models, since they use the same message type? That is what we do not want.
    How do we restrict the program to send the deltas only for the intended distribution model, even though both models use the same message type?
    Thanks
    Bala Duvvuri

    Hi Bala, 
    Assuming that based on the IDoc content there is some way to determine which system should be the recipient, you could do the following (requires ABAP development, as indicated by Mylène):
    1. Define a new filter object type, if required, via BD95.
    2. Assign the new or an existing filter object to E1PLOGI segment field FILTER1 or FILTER2 via BD59.
    3. Create a BAdI implementation for HRALE00OUTBOUND_IDOC and implement method FILTER_VALUES_SET to set the values for FILTER1 or FILTER2.
    Even without the BAdI implementation you would see the new filter value appearing in your distribution model for message type HRMD_A. However, you need the ABAP logic to fill the fields with actual values that you can utilize.
    Cheers, harald
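    To illustrate the BAdI step, a rough sketch of filling FILTER1 on the E1PLOGI segments might look as follows. This is only an illustration: the parameter name IDOC_DATA, its EDIDD row type, and the hard-coded filter value are assumptions to be checked against the BAdI definition in SE18 and your BD59 setup.

    ```abap
    * Sketch: set the filter field on every E1PLOGI segment of the
    * outbound HRMD_A IDoc. The derivation of the filter value from the
    * IDoc content is entirely application-specific (placeholder below).
    FIELD-SYMBOLS: <ls_edidd> TYPE edidd.
    DATA: ls_e1plogi TYPE e1plogi.

    LOOP AT idoc_data ASSIGNING <ls_edidd> WHERE segnam = 'E1PLOGI'.
      ls_e1plogi = <ls_edidd>-sdata.
      ls_e1plogi-filter1 = 'MODEL_A'.   " placeholder: derive per model
      <ls_edidd>-sdata = ls_e1plogi.
    ENDLOOP.
    ```

    The filter value written here is then matched against the filter maintained in the distribution model (BD64), so only the intended recipients get the IDoc.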

  • How to handle the error delta load?

    Hi Experts,
    We have an InfoCube fed by three InfoSources: 2LIS_11_VAITM, 2LIS_12_VCITM, 2LIS_13_VDITM.
    The delta load of 2LIS_12_VCITM failed on Oct 23, 2006. It blocked all of the subsequent data loads, even though 2LIS_11_VAITM and 2LIS_13_VDITM loaded their data successfully. 2LIS_12_VCITM has not performed any load step since Oct 23, 2006.
    What should I do to fix this problem and make all of the data go into the InfoCube correctly?
    Thanks,
    Lena

    Hi Jerome,
    Thanks so much for your help. The data now loads successfully.
    But I have another issue: because the failed request's status is red, it seems to block all data from going into the InfoCube (even though the delta load is successful). I cannot see the new data in the InfoCube, and in the InfoCube's Manage screen, the "Request available for Reporting" field for the successful request is empty.
    What should I do?
    Thanks,
    Lena

  • Delta Loads are not working: 0FI_GL_10 With enhancement : HR Data in source

    Hello friends. Thanks for your answers on the theory, which I know. Please help me with this situation. I read all the SDN responses to my message and have the below for your suggestions.
    QUES1: Why is 0RECORDMODE not coming into the standard DSO, and hence not present in the transformation between the InfoSource and the DSO? What do I need to do to include 0RECORDMODE in a standard DSO? What is the correct OSS note to implement on SP 10 to solve this?
    QUES2: As suggested by you, I run the delta load after waiting 1 hour after the init load: say the init load was done at 11 AM, I do the delta at 12:10 PM to the PSA via InfoPackage and then wait an hour to run the DTP from PSA to DSO. The delta records come in, but the load also brings into the DSO records that have not changed for about 7 hours in ECC, which makes our total balance incorrect.
    QUES3: We have concatenated keys in the DSO, since we exceeded the maximum number of key fields allowed in a DSO:
    Concatenation #1: GL account and chart of accounts
    Concatenation #2: segment and controlling area
    Concatenation #3: version, value type, valuation view, currency type
    <b>
    Since the business wants to include HR data in the keys (employee number, emp code, EMPSPIMCODE, payroll ID), I cannot remove these fields from the DSO key, as per the business.</b>
    Also, the above fields (GL account, chart of accounts, segment, controlling area)
    exist in the data fields of the DSO, so what are the steps to modify the design to fix this ASAP?
    Please email me at [email protected] if you have any docs to resolve the above
    Thanks
    Soniya

    Did the full load bring all records from the source system to the PSA?
    Was this an issue with the first delta you tested?
    Concatenation of multiple keys into one should not be an issue here, as that happens in the DSO, while the delta brings the incorrect records to the PSA itself.
    How are you validating the delta records: against FAGLFLEXT with the timestamp value as input, or via RSA3?
    <b>Check if this note helps:</b>
    Note 1002272 - NewGL-DataSource 0FI_GL_10, 3FI_GL_*: Missing record.
    Is the delta bringing incorrect records to the PSA, or does the DTP bring incorrect records from the PSA to the DSO?
    Maybe you should revisit the logic behind the enhancement of 0FI_GL_10.
    I am facing a similar issue as well, but here the delta does not bring even a single record (the ODS has 20 keys). I will update this thread if it resolves; please do the same.
    Message was edited by:
            Jr Roberto


    Hi Gurus, My order form shows two schedule lines even after maintaing sufficient quantities in the respective plant location.....another strange happening which I comprehended was if at all I create order with reference to quotation then order form c