Key Field missing in transformation

Hi
I am trying to use a BI 7.0 transformation (basically a 1:1 mapping) to send data from one DSO to another.
Source: DSO1
Key fields: Account, Itemid, Position number
Target: DSO2
Key fields: Account, Itemid, Position number
However, when I create the transformation I don't see "Itemid" on the source side of the transformation mapping. The other two key fields are present.
Please help.
-Anurag

Hi, Anurag.
  I just posted a question to the forum, because we are implementing the same architecture, and we are wondering how deltas will work with it. Are you using deltas?
The text of my recent post is below. 
We are loading from R/3 into DSO1, and then from DSO1 into DSO2. The R/3 extractor that loads DSO1 is delta-enabled. We are not sure, however, whether a delta mechanism governs the load from DSO1 into DSO2. The DTP says its extraction mode is delta, but does that mean that, if a row in DSO1 changes, it will negate the key figures on the original row and send a new row the way R/3 does?
For example, suppose the R/3 extractor sends us Row 1. 
Row 1 has a key figure with a value of $100.
Row 1 gets changed in R/3, and the new value is $125. 
The R/3 delta mechanism takes care of this by negating the key figure on the appropriate row and sending us a correcting row. For example, the R/3 extractor will send us:
Row 1  $100
Row 1 -$100
Row 1  $125
So... the net value is correct, i.e. 100 - 100 + 125 = 125. 
When we load from DSO1 into DSO2, however, do you know what rows will load into DSO2?  Is BW "smart" enough to do this
type of negation?
Thanks!
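For what it's worth, a standard DSO's change log is designed to produce exactly this kind of before/after image pair: on activation, a changed record writes a before image with negated key figures plus a new after image, and a delta DTP from DSO1 to DSO2 reads that change log. As a sketch of the expected rows (not a confirmed answer from this thread), the delta load to DSO2 would carry:
Row 1 -$100   (before image from the change log)
Row 1  $125   (after image from the change log)
so DSO2 nets to $125 whether its key figure is set to overwrite or summation.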

Similar Messages

  • ODS Key Fields allow blank values in Transformation

    Dear Experts,
    I have two layers:
    1. Datawarehouse layer (purpose is to represent exactly data at the backend)
    2. Consolidation Layer
    Scenario 1:
    In the Datawarehouse layer, the fiscal variant is a key field in the ODS, and I have two other key fields; the three combine to form a composite key.
    When I load the data from PSA to this ODS, I have a record which contains a blank value for the fiscal variant. It loads successfully into the Datawarehouse ODS.
    Scenario 2:
    In the Consolidation layer, I have 4 key fields, of which the fiscal variant is one. Now when I load the data from the Datawarehouse ODS to the consolidated ODS, it throws an error saying:
    Diagnosis
         An exception FISCVARNT_MISSING was raised while executing
         module RST_TOBJ_TO_DERIVED_TOBJ.
    System Response
         Processing the corresponding record has been terminated.
    The transformation mapping for the fiscal variant in both scenarios is a 1:1 mapping with the rule type 'Time Characteristic'.
    My questions are:
    1. Why the different behaviour in scenario 1 and scenario 2?
    2. What is the solution to the above issue?
    Thanks
    Jain

    Thanks J.S.
    Yes, the format selected for the field in the DataSource is 'Internal'.
    When loading the data from R/3 --> PSA --> ODSX (Datawarehouse layer), I have no problems even when the field is blank.
    When loading from ODSX --> ODSY (Consolidated layer), the empty (blank) field for those records throws an error.
    Any ideas?
    Jain
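    A common workaround, though not confirmed in this thread, is to default the blank fiscal year variant in the transformation rule with a small field routine; the 'K4' default and the field name below are only assumptions and would have to match your FI configuration:
    " Field routine sketch for 0FISCVARNT: default a blank source value.
    IF source_fields-fiscvarnt IS INITIAL.
      result = 'K4'.   " assumed default fiscal year variant
    ELSE.
      result = source_fields-fiscvarnt.
    ENDIF.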

  • Transformation - Fields Missing!!

    Hello Experts,
    I am facing a strange issue in the transformation between a DSO and a Cube.
    My DSO contains 148 fields, while in the transformation I can see only 131 fields.
    In the Development box the transformation appears fine, but in the Acceptance box the mentioned problem occurs. I tried regenerating the export DataSource and activating and transporting the DSO, but all in vain.
    PLease help.
    Thanks,
    Suyog

    Hi,
    Go to RSA1 --> DataSources, choose the source system BI, and search for the DataSource 8<DSO name>. Once you find the DataSource, select Change from the context menu, go to the Fields tab, and check the Transfer column (5th column) for all fields. If any field is unchecked, check it and activate the DataSource to see all fields in the Cube transformation.
    hope it helps...
    regards,
    Raju

  • Different key field names within a recordset

    Hi
    Can we have different key field names for records within a recordset?
    When I give different key field names for header and detail records in a recordset, I get the error below:
    message not processed: com.sap.aii.messaging.adapter.trans.TransformException: Transformer Conversion3.0.5227 Error initializing Class: java.lang.Exception: java.lang.Exception: java.lang.Exception: Error(s) in XML conversion parameters found (4408) Parameter 'xml.keyFieldName' is missing (4403) ; nested exception caused by: java.lang.Exception: java.lang.Exception: java.lang.Exception: Error(s) in XML conversion parameters found (4408) Parameter 'xml.keyFieldName' is missing (4403)
    Thanks

    Hello Pratihchi,
    Only one key field name is allowed, even though we have different record types per recordset, because the rows are differentiated by the key field value, and those values are what you specify in the fieldname for each record type. Check the link below, which contains an example of this.
    https://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/5393 [original link is broken] -> File Adapter -> Sender File Adapter for Content Conversion
    Regards.
    Prasanna.

  • Data fields and key fields in DSO

    Can anyone explain what exactly the difference is between data fields and key fields in a DSO, or the concept behind these two?
    And how do we know which fields should be assigned as data fields or key fields?
    with regards,
    musai

    Hi Musai,
    Let me take an example. Say you have 3 fields A, B & C. Let us assume 'A' is the key field, and 'B' & 'C' are data fields.
    Let's assume the PSA data is:
    A             B              C
    001      Musai        89.9  
    002      Musai        89.9
    003      Pavan       75.00
    So when you load the data to the DSO, since all the key field values are unique (001, 002, 003), all the records will get loaded to the DSO.
    But if the PSA data is:
    A             B              C
    001      Musai        89.9  
    001      Musai        85.7
    003      Pavan       75.00
    Only the 2nd and 3rd rows will remain in the DSO, since the 1st and 2nd rows have the same key field value (001). So the 1st row gets overwritten by the 2nd (or summed, depending on the setting in the transformation rule for C, C being a key figure) when you load to the DSO. The 3rd row doesn't have any problem, so it goes in as it is.
    Also please note that we cannot use key figures as key fields.
    Hope it is clear now!
    Regards,
    Pavan
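    As a rough illustration of that overwrite behaviour, here is a sketch in plain ABAP terms (this only models the semantics; it is not how the DSO is actually implemented):
    " Model: the key field decides between insert and overwrite.
    TYPES: BEGIN OF ty_rec,
             a TYPE c LENGTH 3,               " key field
             b TYPE c LENGTH 10,              " data field (characteristic)
             c TYPE p LENGTH 8 DECIMALS 2,    " data field (key figure)
           END OF ty_rec.
    DATA: lt_psa TYPE STANDARD TABLE OF ty_rec,   " incoming rows
          lt_dso TYPE SORTED TABLE OF ty_rec WITH UNIQUE KEY a,
          ls_rec TYPE ty_rec.
    LOOP AT lt_psa INTO ls_rec.
      READ TABLE lt_dso WITH TABLE KEY a = ls_rec-a TRANSPORTING NO FIELDS.
      IF sy-subrc = 0.
        MODIFY TABLE lt_dso FROM ls_rec.     " key already there: overwrite
      ELSE.
        INSERT ls_rec INTO TABLE lt_dso.     " new key: insert
      ENDIF.
    ENDLOOP.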

  • Adding fields into DSO in the Key Fields

    Hello Experts,
    I am using a Cube which loads data from a DSO. Now, for some requirement, I have to add one field to the DSO's key fields part. But the DSO already has 16 key fields, none of which I can remove.
    So how can I add one more key field to the DSO in this case?
    The cube I am using is 0FIGL_C10 and the DSO is 0FIGL_O10.
    Regards,
    Nirav Shah

    You can create an InfoObject that acts as a custom key: concatenate multiple fields into one and make that InfoObject a key field.
    15 fields + custom key (concatenation of 2 fields) = 16 key fields
    Procedure -> Example
    16th key -> RVERS - Version - CHAR 3 - 0VERSION
    17th key -> RMVCT - TransactnType - CHAR 3 - 0MOVE_TYPE
    1. Create one InfoObject ZKEY with length = 6 (add the lengths of the two objects).
    2. Make ZKEY a key field of the DSO.
    3. Make the 16th key and the 17th key data fields.
    4. Create a routine for ZKEY in the transformation or transfer rules to concatenate the values coming from the source into one.
    TRFN Routine
    CONCATENATE source_fields-rvers source_fields-rmvct INTO result.
    Or
    Transfer Rules Routine
    CONCATENATE comm_structure-rvers comm_structure-rmvct INTO result.
    This way no data will be lost or overwritten due to having too few key fields in the DSO.
    We have as many as 21 key fields for our GL Totals DSO this way.
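    For context, in a BI 7.x transformation the TRFN one-liner above sits inside a generated field-routine method; a minimal sketch of that frame follows (the method and parameter names are assumptions based on what the system typically generates, with ZKEY as the target):
    METHOD compute_zkey.
      " Build the concatenated custom key from the two source fields.
      " SOURCE_FIELDS and RESULT come from the generated signature.
      CONCATENATE source_fields-rvers source_fields-rmvct INTO result.
    ENDMETHOD.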

  • Key fields of DSO

    Hi Experts,
    I have an issue with a DSO. We extracted data from R/3 into BI and sent it on to a DSO. We are getting the right records from R/3 to PSA with the init load, but from PSA to the DSO we are getting duplicate records for 2 invoices, say XXXX01 and XXXX02.
    XXXX01 has 3 items (10, 20, 30) created in April 2008 with billing doc type A.
    XXXX02 has 2 items (10, 11) created in April 2008 with billing doc type B.
    But in the DSO we are getting an extra item record for each document,
    i.e. XXXX01 has 4 items (10, 20, 30, 10): 3 items (10, 20, 30) created in April 2008 with billing doc type A, and one item (10) created in June 2008 with billing doc type C (this is not in R/3, so it is the extra one).
    XXXX02 has 3 items (10, 11, 10): the first 2 items (10, 11) are the original ones created in April 2008 with billing doc type B, and one extra item (10) created in June 2008 with billing doc type C (this item is not in R/3).
    The key fields in the DSO are Sales Document, Material, Billing Item and Fiscal Variant. I didn't add Document Type, Document Category or Calendar Day to the key fields, but they are in the data fields.
    I am not sure how to overcome this issue: either add Document Type, Calendar Day and Document Category to the DSO key fields, or put some fields in a semantic group.
    Please advise me and tell me some possible solutions.
    We ran the setup tables with init, ran the InfoPackage, and also ran the delta InfoPackage in the testing box. If I change the design in the Dev box and transport it to the test box, I suppose I need to fill the setup tables again, right? Please advise me.
    Thanks,
    Ran.

    Hi,
    Use these key fields and it will surely solve it:
    Doc no. and Doc item for orders;
    Bill no. and Bill item for invoices;
    Delivery no. and Delivery item no. for deliveries.
    Moreover, check your transformations once.
    Cheers,
    shana

  • Defining key fields and data fields

    Hi all
    I have to define the key fields and data fields for a DSO (say DSO3).
    DSO1 is a combination of DSO2 and DSO3.
    DSO1 and DSO2 are SAP-defined DSOs.
    DSO3 is a user-defined DSO.
    My assumption:
       DSO3 has 30 fields.
       DSO2 has 16 fields.
       Can I create DSO3 with only the remaining 14 fields that are in DSO1,
       or do I have to create DSO3 with all 30 fields (all DSO1 fields)?
    Can anybody explain...
    Thanks & Regards
    krishna

    The problem is not how many fields you create where; the issue is where the data is going to come from.
    Since your data is going from DSO3 to DSO1, and if a field is already getting populated in DSO1 from DSO2, then you don't have to worry about populating those fields again, unless you want to overwrite them.
    Remember that any overlapping field which exists in both DSO3 and DSO2 will be overwritten by the latest update.
    Again, going back to your question, you should consider whether your DataSource is missing those 14 fields.
    thanks.
    Wond

  • How to merge key field from external source system with SAP R/3 master

    Hi,
    In our SAP BW 7.0 system, the scenario is that master data for 0GL_ACCOUNT comes from SAP R/3, along with the transactional data records for the standard FI cubes. Then one more set of transactional data comes from an external source system, a flat file, into a custom DSO (ZDSO_FI), which also has this GL Account field.
    This flat file's GL account field, GL_file, basically has to be mapped/merged with the standard 0GL_ACCOUNT field, so that when the transactional data for the custom DSO ZDSO_FI is loaded (with the transformation mapping GL_file -> 0GL_ACCOUNT), the system automatically refers to the 0GL_ACCOUNT master data for the incoming transactional values from the external flat file. How can this be done?
    To illustrate the scenario, say I have 5 records in 0GL_ACCOUNT, loaded from SAP R/3 into SAP BW-
    0GL_ACCOUNT      Short Description     Source System
    100                                   D1                          R/3
    200                                   D2                          R/3
    300                                   D3                          R/3
    400                                   D4                          R/3
    500                                   D5                          R/3
    Now suppose my flat file has the following sample transactional data, to be uploaded into ZDSO_FI in SAP BW:
    GL_file      Key Figure1
    400          789
    200          567
    Then after uploading this transactional data into ZDSO_FI (with the transformation mapping GL_file -> 0GL_ACCOUNT), the 0GL_ACCOUNT master data becomes as below:
    0GL_ACCOUNT      Short Description     Source System
    400
    200
    100                                   D1                          R/3
    200                                   D2                          R/3
    300                                   D3                          R/3
    400                                   D4                          R/3
    500                                   D5                          R/3
    So note that the system did not match the incoming GLs from the flat file to the already available master data, although the field is mapped to 0GL_ACCOUNT in the transformation; rather, it created 2 new master data rows for the GL accounts coming from the external system. Because of this I am not able to perform calculations across the standard FI cube and ZDSO_FI with GL account as the key field. I need to synchronise these data values based on GL account to proceed with further calculations, and am badly stuck.
    Could anyone please throw some light on how to achieve this seemingly simple requirement?
    Thanks in advance.
    Nirmit

    Better to post this thread in the Enterprise Data Warehousing forum.
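    Not a confirmed diagnosis, but one thing worth checking is whether the flat-file load supplies 0GL_ACCOUNT's compound key (0CHRT_ACCTS, the chart of accounts) the same way the R/3 master data load does; with a different or empty compound value, the system will create separate master data rows exactly as shown above. If the goal is instead to keep accounts unknown to master data out of ZDSO_FI, a start routine along these lines is one option (a sketch only: the source field name GL_FILE and the master data table /BI0/PGL_ACCOUNT are assumptions):
    " Start routine sketch (BI 7.x generated frame assumed).
    TYPES ty_acct TYPE c LENGTH 10.
    DATA lt_acct TYPE SORTED TABLE OF ty_acct WITH NON-UNIQUE KEY table_line.
    FIELD-SYMBOLS <src> LIKE LINE OF source_package.
    SELECT gl_account FROM /bi0/pgl_account
           INTO TABLE lt_acct
           WHERE objvers = 'A'.              " active master data only
    LOOP AT source_package ASSIGNING <src>.
      READ TABLE lt_acct WITH TABLE KEY table_line = <src>-gl_file
           TRANSPORTING NO FIELDS.
      IF sy-subrc <> 0.
        DELETE source_package.               " unknown account: drop the row
      ENDIF.
    ENDLOOP.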

  • Logic to populate the key field of a DSO

    Hi SDN community,
    I have a very unique requirement and need your help on this.
    I have created a generic DataSource with two fields, Material Number and Minimum Quantity. This is loaded into MINDSO01.
    MINDSO01 on the BW side will have two key fields: Material Number and Load Date. (Load Date is not a field coming from the DataSource; it is simply the date on which the load happened.)
    The reason to have Load Date as a key is that each time the DSO is loaded, the records have to be unique.
    Now my question is: how do I populate the Load Date field of the DSO, which is a KEY field for the DSO?
    If I load on 9/25/2007 then the load date should be 9/25/2007.
    MAT01    9/25/2007     5
    MAT02    9/25/2007     8
    MAT03    9/25/2007    13
    The first two columns (Material Number and Load Date) are the keys of the DSO.
    Any suggestions are really appreciated, and I will award points for sure.
    Please do help me out with this issue.

    Hi Snigdha,
    You can write a routine for the Load Date InfoObject that sets it equal to sy-datum, so that it takes the date on which the data is loaded.
    If you are using 3.x, write start/end routines in the update rules; if you are using BI7, write the routine in the transformation.
    Let me know details.
    Reg
    Pra
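    A minimal sketch of such a field routine in a BI7 transformation, assuming the generated frame and a Load Date InfoObject typed as a date (DATS):
    " Field routine sketch for the Load Date key field.
    result = sy-datum.   " the day the load runs becomes part of the key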

  • Which are the Key fields in the iTunes Music Database

    I assume that the iTunes Music folder is really a database for all the music. I further assume that this database has a key field (one or more) which is necessary for it to function.
    Which fields are these, and what are their names?
    I am still hunting for the songs that are missing in iTunes but are in the library.

    Ignore the question. I found the missing files. Had nothing to do with "key fields". Thanks

  • New Key field in DSO

    Hi BI Experts,
    I want to add a new key field to a DSO. This InfoObject is a data field now.
    The DSO contains data at the moment.
    Can the new key field be added without deleting data from the DSO?
    Thanks

    Hi,
    If you want to turn a data field into a key field, you first have to delete the data.
    Before deleting the data, use the datamart interface and upload it into a different data target.
    Afterwards you can change the InfoObject from the data fields to the key fields.
    But be aware that the transformation will also be disturbed by these changes.
    Regards,
    Praveena.

  • Dimension key 16 missing in dimension table /BIC/DZPP_CP1P

    Hi all,
    I have a problem with an InfoCube, ZPP_CP1. I am not able to delete or load any data. It was working fine until some time back.
    Below is the outcome of running the RSRV check on this cube. I tried to run the error correction in RSRV, but to no avail.
    Please help.
    Dimension key 16 missing in dimension table /BIC/DZPP_CP1P
    Message no. RSRV018
    Diagnosis
    The dimension key 16 that appears as field KEY_ZPP_CP1P in the fact table, does not appear as a value of the DIMID field in the dimensions table /BIC/DZPP_CP1P.
    There are 17580 fact records that use the dimension key 16.
    The facts belonging to dimension key 16 are therefore no longer connected to the master data of the characteristic in dimension.
    Note that errors that are reported for the package dimension are not serious (They are thus shown as warnings (yellow) and not errors (red). When deleting transaction data requests, it can arise that the associated entries in the package dimension have already been deleted. As a result, the system terminates when deleting what can be a very large number of fact records. At the moment, we are working on a correction which will delete such data which remains after deletion of the request. Under no circumstances must you do this manually. Also note that data for request 0 cannot generally be deleted.
    The test investigates whether all the facts are zero. If this is the case, the system is able to remove the inconsistency by deleting these fact records. If the error cannot be removed, the only way to re-establish a consistent status is to reconstruct the InfoCube. It may be possible for SAP to correct the inconsistency, for which you should create an error message.
    Procedure
    This inconsistency can occur if you use methods other than those found in BW to delete data from the SAP BW tables (for example, maintaining tables manually, using your own coding or database tools).

    Hi Ansel,
    There have been no changes in the cube. I am getting this problem in my QA server, so I retransported the cube from Dev to QA, but it did not help.
    Any other ideas??
    Regards,
    Adarsh

  • Problem with S001 - Key field Sequence in table

    Hi,
    We want to use the extractor 2LIS_01_S001 with delta, but delta is not working with this extractor. When I checked LBW0, I got the error "The LIS environment is set up incorrectly".
    According to note 115192, the key fields in tables S001, S001BIW1 and S001BIW2 have to be in the same sequence.
    But we are on 4.7 with following Support Packs:
    SAP_APPL 470 0025 SAPKH47025 Logistics and Accounting
    PI 2004_1_470 0007 SAPKIPZI57 R/3 Plug-In (PI) 2004.1 for R/3 Enterprise
    PI_BASIS 2004_1_620 0010 SAPKIPYI5A Basis Plug-In (PI_BASIS) 2004_1_620
    And according to this note, the solution is delivered with these Support Packs.
    Can you please tell me whether this note is mandatory for all releases?
    Did anyone encounter this issue before?
    Thanks & Regards,
    Samay Mehta

    I changed my entity: I unchecked the X column as primary key and added RowID as a primary key. Now it works.
    What's wrong with my CHAR(1) as a primary key?
    I also tried to add a Refresh button:
      <af:commandButton text="Refresh" id="cb3"/>
    and in the table added a partialTarget for the button. Now when I add a new row and press the Refresh button, it works.
    So it seems that the problem is that when I add a new row and enter data, the table is not refreshed and the row is missing its primary key.
    Any solutions?
    Edited by: a.gruev on Nov 26, 2009 4:18 PM

  • File Content conversion at the sender adapter without Key field

    Hi All,
    I have a requirement... We are using the MessageTransformBean at the sender adapter (we are not going for FCC since we are also doing a decryption); it is a flat file.
    We do not have a key field for one of the nodes: we have a total of 5 nodes under the parent node, and one of them does not have a key field.
    NOTE: I guess we have to do either a Java mapping or XSLT, but I am not finding proper weblogs with a program, and I am not well versed with coding. The few codes I got were either not fulfilling the requirement, or used StreamTransformation, which is deprecated in NWDS and does not compile properly.
    Can somebody get me Java code which uses AbstractTransformation and fulfills the below requirement?
    parent node
                  Node1
                       node 1a
                       node 1b
                  Node 2
                  Node 3
    The input flat file is in the below format, for example:
    Employee header flat file
    Employee detail flat file (this repeats numerous times, as many times as there are employees)
    Employee trailer flat file
    Loan header flat file
    Loan detail flat file (this repeats numerous times, as many times as there are employees)
    Loan trailer flat file
    Assistance much appreciated, thanks.
    Regards
    Kiran

    Hi Stefan,
    I will rephrase the question; I guess there is a communication gap...
    1) We have an existing interface in 3.0. We are not copying the same interface to 7.1, as it was not according to standards and they are phasing out that system.
    2) In 3.0 they read the flat files under one record; the structure is as follows:
    Parent Node
               Node
                   Row
    They read all the flat files row by row,
    and even on the receiver side they are read in a similar fashion.
    3) In the current structure, according to the FS, we have to create a structure defined by them, which is as below:
          Employee
              Header
              Details   node 1
                 Details  node 1a
                  Details  node 1b
              Trailer
              Loan Header
              Loan Detail
              Loan Trailer
    Receiver structure is
    Employee
          Node (this will have the receiver fields, about 30 in all)
    4) We have pulled a sample file from 3.0 for the existing interface, as we didn't get a sample file for the new one in the FS, and I am not sure when they are going to send it.
    5) In the existing interface payload we have everything matching, even the field lengths and positions, and even the key fields, which we got from the FS.
    6) The problem is that we do not have key fields for the Details nodes 1a and 1b, either in the FS or in the existing payload, and we got an update from the FS consultant that we will be receiving these fields but without key fields.
    See, I do not have any problem taking details 1a and 1b out of the details header and creating the structure, but the main issue is that without a key field we will not be able to generate the 1a and 1b nodes or read them in FCC or MTB (I have to use MTB as we are decrypting the file from the sender).
    Or is there any other method, apart from Java mapping or module development, to handle these files?
