Conversion of values in BW

I have a master data InfoObject that will have a 3-part key, with 4 attributes.  We are loading it now from an Access database, and the 4 attributes have their own number ranges and values.  We are starting to pull the same data from R/3, and parts of the data will be coming by way of the HR system, into R/3, and then into BW.  The problem is that for some of the attribute fields there will be different values.  For example, Sales Rep used to be 111, but from HR it is HR1.  They want to see 111 in BW.  What they want me to do is take the new values that originated in the HR system and convert them to the old values that came from the Access database.
Is this where a lookup table comes into play?  Any suggestions would be appreciated.  Thanks,  Keith J

Oscar,
The values will not be fixed.  For example, one attribute will be a Zone.  The current value we have is South Zone.  When we switch from the current feed to the SAP HR feed, the value might come in as SWZN1.  I want to change that SWZN1 back to South Zone.  Each record could have one of several possibilities for Zone and several other attributes.  I am trying to figure out how to use a lookup table to assist in this.  I know I can't put the complete logic in an update rule, or I would be writing code forever.  Just looking for a way to get started.
Thanks
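
A minimal sketch of the lookup-table idea in an update (or transfer) rule routine, since that is the direction the question is heading.  The mapping table ZHR_VALUE_MAP, its fields, and the InfoObject ZSALESREP are invented names for illustration only; the real table would hold one row per attribute and incoming HR value, with the corresponding legacy Access value:

* Sketch only: ZHR_VALUE_MAP is a hypothetical mapping table with key
* fields ATTR_NAME and NEW_VALUE (HR value) and a data field OLD_VALUE
* (legacy value from the Access load).
DATA: lv_old TYPE /bic/oizsalesrep.   " assumed generated data element of ZSALESREP

SELECT SINGLE old_value FROM zhr_value_map
  INTO lv_old
  WHERE attr_name = 'SALESREP'
    AND new_value = comm_structure-/bic/zsalesrep.

IF sy-subrc = 0.
  result = lv_old.                          " e.g. HR1 -> 111, SWZN1 -> South Zone
ELSE.
  result = comm_structure-/bic/zsalesrep.   " no mapping found: keep the HR value
ENDIF.

Because the mapping lives in a table, a new Zone or Sales Rep value is just another table entry rather than more routine code, which keeps the update rule small.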

Similar Messages

  • Content Conversion trims values

    Hi all - I'm using content conversion to bring in fixed-width fields from a flat MQ message - but my values are getting front & back spaces trimmed.  I have set up the length parameter on the attributes of all my data types, as well as "preserve" for the whitespace facet.  Any ideas?  I need to keep the spaces, as the values should be stored in SAP EXACTLY the way the customer sends them to us.

    Try out:
    xml.fieldContentFormatting=trim|nothing
    If you specify trim (the default setting), all blanks that precede and follow a found value are removed.
    If you specify nothing, the value remains unchanged.
    You can find a list of all parameters of the content conversion module in the online documentation for the Plain J2SE Adapter Engine -> Configuring Sender/Receiver File Adapter.
    Regards
    Stefan

  • Conversion of value 'blank' to # (Not assigned)

    Hi,
    We have extended the DataSource  0CA_TS_IS_1 with the field ZZAWART att/abs type. The component type is AWART and domain is AWART. This domain has no Conversion routine.
    We have created a filter on this field in the dtp from one DSO to another DSO. In this filter we would like to add the value # (not assigned). Our problem is that when this field has no value in the source system, it is not converted to #.
    I changed the conversion routine in the DataSource  in BW to ALPHA and also changed the format from internal to external but this did not solve the problem.
    How can we get the blank value converted to #?
    Thanks.
    Best regards,
    Linda

    Hi Linda,
    It is a bit strange that you don't have the possibility to filter on 'equal to blank'.
    Don't choose the value from the match code in the DTP; try the following:
    1.- Press on the arrow (Multiple selection).
    2.- In the "Select Single Value" tab you will have 2 fields: (a) Options pushbutton, (b) Single Value.
    3.- Click on the pushbutton for the first row; there you will have some options: (a) = Single Value; (b) >= Greater than or equal to; (c) <= Less than or equal to; (d) > Greater than; (e) < Less than; (f) <> Not equal to.
    4.- Choose (a) = Single Value, and leave the Select Value column blank.
    Regards, Federico

  • Conversion of values

    Hi
    If I get values like 0.5 (or something similar) as the result of a division,
    how can I display "1/2" to the users? (There may be other outputs like 3/4, 3/8, etc.)
    Thank you

    Have a cross reference of values versus strings.  One way would be to have two arrays, one with the values and one with the strings.  Search the values and display the string that shares the index of the value.
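
    A rough ABAP sketch of that cross-reference idea, keeping to the language used elsewhere in this thread; the value/string pairs below are just examples:

    TYPES: BEGIN OF ty_frac,
             value    TYPE p DECIMALS 4,
             text(10) TYPE c,
           END OF ty_frac.
    DATA: lt_frac TYPE STANDARD TABLE OF ty_frac,
          ls_frac TYPE ty_frac,
          lv_in   TYPE p DECIMALS 4 VALUE '0.5'.

    " Cross-reference table: numeric value -> display string
    ls_frac-value = '0.5'.   ls_frac-text = '1/2'. APPEND ls_frac TO lt_frac.
    ls_frac-value = '0.75'.  ls_frac-text = '3/4'. APPEND ls_frac TO lt_frac.
    ls_frac-value = '0.375'. ls_frac-text = '3/8'. APPEND ls_frac TO lt_frac.

    " Look up the string that belongs to the computed value
    READ TABLE lt_frac INTO ls_frac WITH KEY value = lv_in.
    IF sy-subrc = 0.
      WRITE: / ls_frac-text.   " displays 1/2
    ELSE.
      WRITE: / lv_in.          " no match - show the raw number
    ENDIF.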

  • Update rule problem for converting char values

    Hi all,
    I need to implement the following logic into the update rule:
    data: ch1(10) type c,
    ch(8) type c.
    ch = ch1.
    The value in ch1 can be only characters or only numbers. I am facing a problem converting this, as character values are left-aligned and numeric values are right-aligned.
    Please guide.
    Thanks

    Another example can be:
    data: ch1(10) type c,
    ch(8) type c,
    ln type n.
    ch1 = '0000000010'.
    condense ch1.
    ch = ch1.
    write:/ ch1, '-----', ch.
    In this case, we get only zeros in the ch field.
    Thanks,
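
    One possible way to handle this, sketched under the assumption that numeric content should keep its significant digits: left-align purely numeric values before the move, so nothing meaningful is cut off on the right.

    DATA: ch1(10) TYPE c VALUE '0000000010',
          ch(8)   TYPE c.

    IF ch1 CO ' 0123456789'.                 " content is purely numeric
      SHIFT ch1 LEFT DELETING LEADING '0'.   " '0000000010' -> '10'
      CONDENSE ch1 NO-GAPS.                  " also drop any leading blanks
      " note: a value consisting only of zeros ends up blank here
    ENDIF.

    ch = ch1.
    WRITE: / ch1, '-----', ch.               " now 10 instead of 00000000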

  • KF value mismatch due to conversion

    Hi All,
    One of the key figures in my DSO is showing a variance in values because of conversion.
    Case I: value displayed with the checkbox "Do not use any conversion" selected
    KF value displayed as below:
    Case II: "Do not use any conversion" unchecked
    KF value displayed:
    Although the correct value is 562500.00, the BEx report output is displaying 56250000.00, i.e. the value multiplied by 100.
    The currency has 0 decimals in the TCURX table, yet I am still facing the mismatch.
    Expert's suggestions are welcome.
    Regards,
    Vijay

    Vijay, I think it's a known behaviour, because all SAP currencies are stored with 2 decimal places. So if a currency is defined with 0 decimal places, the amount is multiplied by 100 when the result is displayed at query level. The same is happening in your case as well.
    There is a detailed document which covers this situation and also provides a solution for it.
    Please check this:
    Exceptional Currency Handling in BW
    Hope this resolves your issue.
    Regards,
    AL
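
    For reference, a small sketch of where the factor of 100 comes from. TCURX and its fields CURRKEY/CURRDEC are standard; the currency key and variable names are only examples, and this is not the query-level fix described in the document AL mentions:

    DATA: ls_tcurx   TYPE tcurx,
          lv_waers   TYPE waers VALUE 'JPY',   " example 0-decimal currency
          lv_stored  TYPE p DECIMALS 2 VALUE '562500.00',
          lv_factor  TYPE i VALUE 1,
          lv_display TYPE p DECIMALS 2.

    " Amounts are stored as if every currency had 2 decimals; for a currency
    " listed in TCURX the display logic shifts by 10 ** ( 2 - CURRDEC ).
    SELECT SINGLE * FROM tcurx INTO ls_tcurx WHERE currkey = lv_waers.
    IF sy-subrc = 0.
      lv_factor = 10 ** ( 2 - ls_tcurx-currdec ).
    ENDIF.

    lv_display = lv_stored * lv_factor.
    WRITE: / lv_waers, lv_stored, '->', lv_display.   " 562500.00 -> 56250000.00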

  • Mapping issue after Content Conversion in PI 7.1

    Hi Experts
    I am working on a file (fixed format) to proxy scenario. The data is converted by file content conversion and passed to the mapping as XML data, but when the mapping happens no values are returned on the target side.
    When I load the data (XML) file from SXMB_MONI on the sender side, all nodes show in red, although the file content conversion happened without issues.
    Please provide input on how to map the values.
    Thanks
    PR

    A couple of checkpoints for you:
    1. When you load the XML from SXMB_MONI in the test tab of the message mapping and it turns red, this means the constructed XML (from file content conversion) doesn't match the structure (XSD) defined in your ESR/IR. In this case you have to check thoroughly the file content conversion field values/field lengths in the sender communication channel.
    2. Once you rectify the error above then you can test the mapping in ESR message mapping.

  • Character integrity issue after data conversion in database/JDBC

    Hi
    I am using oracle 9i with the following NLS setting:
    NLS_LANGUAGE :AMERICANS
    NLS_CHARACTERSET : UTF8
    NLS_NCHAR_CHARACTERSET :AL16UTF16
    I am running on Linux with this as my environment Language:
    Lang: en_US.UTF8
    I am sending Hindi characters in an XML file (UTF-8 encoding) to my Java application to be stored in the database. In my XML file, I give this encoding (ignore the double quotes; I put them in so that the browser will not interpret the entities):
    "&#x928";"&#x92E";"&#x938";"&#x94D";"&#x924";"&#x947"
    But the characters appear unreadable in the database. When I use SELECT DUMP to check the character encoding:
    Typ=1 Len=12 CharacterSet=UTF8: 0,28,0,2e,0,38,0,4d,0,24,0,47
    When I retrieve data from the database via my application, weird characters appear.
    However, if I manually input the Hindi characters into the column of the table, then the Hindi characters appear correctly. When I do a DUMP to check, this is what I get:
    Typ=1 Len=12 CharacterSet=UTF8: 9,28,9,2e,9,38,9,4d,9,24,9,47
    When I check the Unicode chart at http://www.unicode.org/charts/PDF/U0900.pdf, the second DUMP result is correct. When I retrieve data from the database via my application, the correct Hindi string appears.
    I understand that in Java the encoding is UTF-16 and the Oracle JDBC driver converts from UTF-16 to UTF-8 before storing in my database, and vice versa. The thing that puzzles me is why the correct Hindi string appears on my web interface when the same conversion is used to extract the data from the database. At first I suspected a conversion problem in JDBC, with the UTF-16 characters getting truncated to UTF-8 when I try to store the data in the database. But when good data is stored in the database, the extraction seems to be correct, even though it goes through the same conversion.
    I read from several threads of this forum and also the Oracle Globalization Support article but I cannot find an answer to my question.
    Can anyone help? Thanks.


  • Data collection: Conversion exit and Input Conversion of Local Currency

    Hi,
    I'm collecting transaction data from BI where the local currency for one company has no decimal places.
    Therefore if I collect the data with no conversion, the value is multiplied by 100 in BCS .
    has anyone used the Conversion Exit or Input Conversion options?
    - any feedback, tips or useful documentation/links you can point me to?
    All help much appreciated.
    TheScotsman

    Hi TheScotsman,
    I'm almost sure that the reason for this behaviour is the special treatment of some specific currencies (see OSS note 153707). Just check if you have the currency in the TCURX table.
    If yes, then please go through the links:
    Re: Local Currency COP, KRW values wrong in report
    (I described symptoms)
    Re: Currency conversion cube to report
    (see the document in the link)

  • Error in ALPHA conversion while loading hierarchies

    Hi,
    I got the following error while loading hierarchies:
    Error in ALPHA conversion in NodeId 00000088 for InfoObject 0PROFIT_CTR
    Message no. RH224
    Diagnosis
    The technical node name  for node id 00000088 is not consistent for conversion exit ALPHA, which is stored with InfoObject 0PROFIT_CTR.
    Possible reasons for the error:
    The original consistent value returned by the DataSource was changed by a conversion routine to a non-consistent conversion exit value.
    The DataSource returns a value that is not consistent in the conversion exit.
    System response
    Loading the hierarchy was terminated.
    Procedure
    Check whether the correct conversion exit was entered for the InfoObject. If necessary, change the conversion exit, conversion routine or the data to be extracted.
    A further option is to activate automatic conversion in the transfer rules. Here, the system executes the conversion exit, making sure that the data is in the correct format.
    If this were not a hierarchy load, we would fix this error by prefixing the zeros (as we are working on Qbox, we won't change the transfer rules to select the conversion routine), but here there is no PSA; the data is coming via ALE. Given this, please suggest how to fix this error.
    Points will be given for any helpful suggestion

    Ganesh,
    Can you please check the ALPHA conversion routine and also the node ID for that InfoObject?
    There might be some inconsistencies in the code.
    Hope it helps
    Gattu
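
    If the node values really only lack their leading zeros, the "prefixing the zeros" fix mentioned in the question is exactly what the standard ALPHA exit does internally. A sketch for a single value (0PROFIT_CTR is CHAR10 in the standard content; the variable names are illustrative):

    DATA: lv_node_in(10)  TYPE c VALUE '88',
          lv_node_out(10) TYPE c.

    " Right-align numeric input and pad it with zeros to the field length,
    " the same way the ALPHA conversion exit stores the value internally.
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
      EXPORTING
        input  = lv_node_in
      IMPORTING
        output = lv_node_out.

    WRITE: / lv_node_in, '->', lv_node_out.   " 88 -> 0000000088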

  • Trouble with field output conversion when activating the Adobe Forms

    Hi everybody,
    I'm facing a very strange issue.
    I made an interactive form for a customer which is based on the ABAP Dictionary and sent by e-mail to the user. This form works fine and is in the Production system.
    Now I made a change to this form, and when I activate the form in SAP all the field conversions are lost, so when I get the form, date fields appear as 00000000 if they are empty.
    To solve this issue in the Development system, I deleted the form and recreated it, and that solved it.
    Now this issue also appears in the Quality system after transporting it, so I tried to solve it the same way: I created one transport order where the form is deleted and another one to recreate it, but the issue is still there.
    I searched OSS and didn't find any notes for this.
    This issue also appears on another form which is not an interactive one (purchase order printout).
    I work on ECC 6.0 with Adobe LiveCycle Designer ES2.
    Has anybody faced this issue and solved it?
    Regards

    OK, I will try to explain more clearly because I think there's a misunderstanding.
    When you define the interface of an Adobe form you can choose between a DDIC interface and an XML interface. When you choose the DDIC interface, all the necessary SAP conversion exits are applied when the values are written to the XML file generated from the form.
    For example, I have a field of type DATUM defined in my interface, which is based on DDIC.
    I place this field in the layout and set the edit mask in the layout to 'YYYY/MM/DD', for example.
    Now I execute the form and give the value '00000000' to the field; normally nothing should appear in the result, because it's an initial value.
    But in my case the value appears as '00000000': the edit mask in the form is not taken into account and the conversion of the value on the ABAP stack is not done.
    When you have a look inside the function module generated by the form, you will find a routine named %WORK, and inside it a routine %OUTPUT which is called for each field you want to print in the form.
    If my field is output correctly on the form, the data type is passed to this routine so it can format the value; up to that point everything looks fine, but my issue is that the data types of the fields are not passed to this routine, so it treats each field as simple TEXT, and that is wrong.
    Just below is the code generated for the routine %OUTPUT:
    form %output using p_name      type fpfield
                       p_value     type any
                       p_datatype  type datatype_d
                       p_abap_type type c
                       p_reffield  type fpunit
                       p_unit      type any
                       p_edit_mask type convexit.
      case p_abap_type.
        when cl_abap_typedescr=>typekind_struct1 or
             cl_abap_typedescr=>typekind_struct2 or
             cl_abap_typedescr=>typekind_table.
    *     ignore
        when others.
          call function 'FPCOMP_WRITE_DATA_FIELD'
            exporting
              i_field_name   = p_name
              i_value        = p_value
              i_ddic_type    = p_datatype
              i_data_type    = p_abap_type
              i_edit_mask    = p_edit_mask
              i_ref_name     = p_reffield
              i_ref_value    = p_unit
            exceptions
              usage_error    = 1
              system_error   = 2
              internal_error = 3
              others         = 4.
          %fpcomp_error.
      endcase.
    endform.                    "%output
    Is it clearer now?
    Regards

  • Leading zeros without ALPHA conversion

    Hi all,
    we had an InfoObject CHAR16 with ALPHA exit; after further analysis we realized that we had to save the field without the leading zeros, so what I did was modify the characteristic, deleting the ALPHA exit.
    I tested this change in DEV and the field was populated correctly in the DataStore object, for example:
    Before
    3455 -> 0000000000003455
    After
    3455 -> 3455
    Now I've transported this change to production but it doesn't seem to work: after reloading a chunk of data it's still applying the ALPHA conversion to the field. I've also checked the PSA and I've got the right value there.
    Any thoughts?
    Thanks
    Stefano

    Stefano, when you remove the ALPHA conversion from the InfoObject it can create inconsistencies while loading data and also when reading the values.
    Let's say you first loaded the data to the object with ALPHA conversion, so the value got stored like this:
    3455 -> 0000000000003455
    Now when you remove the ALPHA conversion, the value gets stored as
    3455 -> 3455
    The system will treat the above entries as two separate records and not a single record. Check this in the system as well.
    To display the data correctly you need to enter the values as they are, i.e. one with leading zeros and one without; then you should be able to see both records.
    Hope this gives an idea.
    Regards,
    AL

  • Value Mapping : Different Source - Same Target

    Hi all,
    Is it possible to map different source values to the same target value with XI value mapping?
    When I try to set the same target value for a second source value, the Directory deletes it for the first one.
    For example, I have:
    Value1
    Value2 --- newValue1
    Value3
    Value4 --- newValue2
    Value5 --- newValue3
    is it possible to build this somehow with ValueMapping?
    Regards,
    Robin

    Hi,
    no, you misunderstood.
    I am using the conversion function Value Mapping.
    The values are set in the Directory -> Tools -> Value Mapping.
    What I want to know is: if value 1 or value 2 comes in, the value mapping has to give back the same result for both, e.g. value 3.
    I'm only interested in the values, not in the structure or anything else.
    Regards,
    Robin

  • Storage unit Length conversion tcode OMNN

    ABAP Guru's
    I have a requirement to use different length conversions for the SU number in different warehouses.
    HU and SU management are both active. Some of the warehouses use an SU conversion length of 10 digits and others use 18 digits.
    As the OMNN settings are at client level, we may have to use exit MWMK0001 for our requirement.
    In OMNN I will set the SU conversion exit value to 3, and then through CMOD I will create and activate a project.
    I will be using the first two components of exit MWMK0001. I am struggling because I can't set a breakpoint in this exit. I am new to ABAP, so I need help. Has anyone ever had a similar requirement, and how was it solved?
    Thanks
    Ash

    Hello there,
    Please see the following path:
    IMG
      - Logistics Execution
       - Warehouse Management
        - Storage Units
         - Master Data
          - Define Number Ranges
           - Conversion exit for storage unit number
    You may find the following information.
    Conversion of storage unit numbers
    SU conversion exit   3    Customer-specific user exit
    Length of SU numbers 18
    The storage unit number is processed through a conversion exit.
    You can find these exits via the domain LENUM:
       CONVERSION_EXIT_LENUM_INPUT
       CONVERSION_EXIT_LENUM_OUTPUT
    You may also use user exit EXIT_SAPLLVSK_002 to reprogram the length conversion, for example to add the leading zeroes to the SU number. In the SAP standard, any value for the SU number has to be typed in with the leading zeroes as well.
    Regards
    Martin
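
    As a rough illustration of the warehouse-dependent padding such a user exit could perform. This is not the real interface of EXIT_SAPLLVSK_002; the warehouse rule, field names and lengths below are assumptions based on the requirement above:

    DATA: lv_lgnum     TYPE lgnum VALUE '100',   " warehouse number (example)
          lv_input(20) TYPE c VALUE '4711',      " SU number as typed by the user
          lv_su10(10)  TYPE n,                   " 10-digit target
          lv_su18(18)  TYPE n.                   " 18-digit target

    " Example rule: warehouse 100 works with 10-digit SU numbers, all others with 18.
    IF lv_lgnum = '100'.
      UNPACK lv_input TO lv_su10.                " '4711' -> '0000004711'
      WRITE: / lv_input, '->', lv_su10.
    ELSE.
      UNPACK lv_input TO lv_su18.                " '4711' -> '000000000000004711'
      WRITE: / lv_input, '->', lv_su18.
    ENDIF.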

  • Component material costing (Unit of Measure conversion) is not working at S

    Hi,
    Unit of Measure conversion factor (Value) is not working for component materials at Sales order level costing, but the same conversion factor is working fine at Standard cost estimation level.
    Example:
    Conversion Factor for Raw materials (Procurement in Tones, Consumption in Each)
    1 Ton (Rs.1000) = 10 Each (per Each – Rs.100)
    For manufacturing 1 pc of FG, 1 Each of the raw material is required.
    Issue: at sales order level, the raw material cost is calculated at the "Ton" price – Rs.1000 – instead of the "Each" price – Rs.100.
    Regards
    Nagesh

    Hi,
    Pricing (unit conversion) is done differently for the MM and SD processes.
    MM has decided to use the old process, which determines the values from T006. There you have the conversion factors which are used in the PO. If you create a sales order or billing document, the determination is different, as the material master is read.
    regards
    Waman

    I have the below data set - Order_id Item Qty 1 A1 3 2 A2 2 3 A3 3 My desired output (With only one SQL query) is mentioned below- Order_id Item Qty 1 A1 1 1 A1 1 1 A1 1 2 A2 1 2 A2 1 3 A3 1 3 A3 1 3 A3 1 How do I do this?