UOM Conversion Best Practices

Hello All-
Could someone provide some best practices for converting between different units of measure in MDM?
We have an English repository with several numeric attributes defined in English units (inches, horsepower, etc.). We want to add a second language that inherits these measurements but converts them to metric (cm, kW, etc.).
From what I can see, neither the MDM clients (except Publisher) nor the Java API supports this dynamic conversion. That leads me to believe we have to convert the units programmatically as we access the data via the Java API.
Is there built-in functionality to handle this?
thanks
Tim
MDM 5.5 SP06 Patch 1

Hello Tim,
By using the MDM UOM Manager you can maintain system dimensions and user-defined dimensions.
System dimensions:
You can modify system units.
You can add new user-defined units to a system dimension.
User-defined dimensions:
You can add new user-defined dimensions.
You can add new user-defined units.
After you unload and reload the MDM repository, MDS builds new indexes for the added/modified UOMs, and the MDM API treats those changes the same way it treats the system dimensions and units.
You can use the "new" MDM Java API to convert values from one unit of a dimension to another:
Package: com.sap.mdm.util
Class: MeasurementUtils
Method: public static double convertMeasurementValue(DimensionId dimensionId,
                                             UnitId unitIdFrom,
                                             UnitId unitIdTo,
                                             double value,
                                             DimensionsManager dimensionManager)
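For illustration, here is a minimal sketch of how that call could be wired up. It assumes you have already obtained the DimensionId, the two UnitIds and the DimensionsManager from your repository session; how you look those up, and their exact import packages, depends on your MDM API version, so only the MeasurementUtils import named above is shown.

    // Sketch only: convert a value between two units of the same dimension,
    // e.g. inches to centimetres for a Length dimension.
    import com.sap.mdm.util.MeasurementUtils;

    public class UomConversionExample {

        // The ID and manager objects come from your repository session;
        // their import packages are omitted because they vary by API version.
        public static double toMetric(DimensionId dimensionId,
                                      UnitId englishUnit,
                                      UnitId metricUnit,
                                      double englishValue,
                                      DimensionsManager dimensionsManager) {
            // Delegates to the conversion method described above
            return MeasurementUtils.convertMeasurementValue(
                    dimensionId, englishUnit, metricUnit,
                    englishValue, dimensionsManager);
        }
    }

You would then, for example, call this once per numeric attribute as you read records through the Java API, passing the relevant dimension and the from/to unit IDs (inch to centimetre, horsepower to kilowatt, and so on).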
Good luck, Nimrod.

Similar Messages

  • NLS data conversion – best practice

    Hello,
    I have several tables originating from a database with a single-byte character set. I want to load the data into a database with a multi-byte character set like UTF-8 and, in the future, be able to use the Unicode version of Oracle XE.
    When I'm using DDL scripts to create the tables on the new database, and after that trying to load the data, I receive a lot of error messages regarding the size of the VARCHAR2 fields (which, of course, makes sense).
    As I understand it, I can solve the problem by doubling the size of the VARCHAR2 fields: VARCHAR2(20) will become VARCHAR2(40), and so on. Another option is to use the NVARCHAR2 datatype and retain the correlation with the number of characters in the field.
    I have never used NVARCHAR2 before, so I don't know if there are any side effects on the pre-built APEX processes like Automatic DML, Automatic Row Fetch and the like, or on the APEX data import mechanism.
    What would be the best practice solution for APEX?
    I'd appreciate any comments on the subject,
    Arie.

    Hello,
    Thanks Maxim and Patrick for your replies.
    I started to answer Maxim when Patrick's post came in. It's interesting, as I had tried to change this nls_length_semantics parameter once before, but without any success. I even wrote an APEX procedure to run over all my VARCHAR2 columns and change them to something like VARCHAR2(20 CHAR). However, I wasn't satisfied with this solution, partially because of what Patrick said about developers forgetting the full syntax, and partially because I read that some of the internal procedures (mainly those involving LOBs) do not support character semantics and always work in byte mode.
    Changing the nls_length_semantics parameter seems like a very good solution, mainly because, as Patrick wrote, "The big advantage is that you don't have to change any scripts or PL/SQL code."
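    For reference, a minimal sketch of what the session-level variant of that looks like from JDBC when running DDL scripts (the connection URL, credentials and table below are placeholders; setting the instance-level parameter avoids even this step):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class CharSemanticsDemo {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details for an Oracle XE database
            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/XE", "scott", "tiger");
                 Statement st = con.createStatement()) {
                // Switch this session to character-length semantics so the
                // DDL below means 20 characters, not 20 bytes
                st.execute("ALTER SESSION SET NLS_LENGTH_SEMANTICS = CHAR");
                st.execute("CREATE TABLE demo_names (name VARCHAR2(20))");
            }
        }
    }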
    I'm just curious: what technique does APEX use to run on all the various single-byte and multi-byte character sets?
    Thanks,
    Arie.

  • For DB2 to Oracle conversion best practices

    My company is enhancing an existing application by adding a new J2EE web interface, with DB2 as the database. I am new to J2EE. If we want to migrate the database to Oracle in the future, what are the best things to do now?
    Which J2EE framework is good with respect to JDBC connectivity and a future migration from DB2 to Oracle (minimal changes at migration time)?
    It is a medium-sized application with 5,000 users. What other best practices should we follow during development, keeping the migration in mind? Thanks.

    Yes, you should login as system, create a user, appowner, or what ever you call it, and assign that user a default tablespace of 'USERS' or whatever tablespace you decide. Then, grant that user all the privileges to create objects, i.e., create table, create procedure, create synonym, etc, etc.
    Then, logout as system, login as appowner, and do all your object creation from there.
    A user is a set of credentials that allow you access to the system. It defines your identity and your privileges and authority to do various things. A schema is the set of objects owned by a particular user. As soon as a user owns at least one object, that implicitly defines his schema. It's not possible for a user to own or control multiple schemas. If you want multiple schemas, that's fine, but you'll need multiple users, and each user will manage his own schema.
    Hope that's clear,
    -Mark
    PS I strongly suggest you review the Concepts Guide, it really is quite good. It can be found here: http://download.oracle.com/docs/cd/E11882_01/server.112/e10713/toc.htm
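    For what it's worth, here is a minimal JDBC sketch of the steps Mark describes, run as SYSTEM (the user name, password, tablespace and privilege list are placeholders to adapt):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class CreateAppOwner {
        public static void main(String[] args) throws Exception {
            // Connect as SYSTEM; URL and password are placeholders
            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "system", "changeme");
                 Statement st = con.createStatement()) {
                // Create the application owner with a default tablespace
                st.execute("CREATE USER appowner IDENTIFIED BY secret "
                        + "DEFAULT TABLESPACE users QUOTA UNLIMITED ON users");
                // Grant the privileges needed to create objects
                st.execute("GRANT CREATE SESSION, CREATE TABLE, CREATE PROCEDURE, "
                        + "CREATE SYNONYM, CREATE VIEW, CREATE SEQUENCE TO appowner");
                // Log out of SYSTEM and do all object creation as appowner
            }
        }
    }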

  • Best Practice for Conversion Workflow

    Hello,
    I'm converting video files from our "home grown" virtual media reserve to iTunes U. Some of the files are in RM format, some are already compressed .mov's (not H.264) and some I have the original DV files for.
    Anyone out there have a best practice for converting these file types for posting to iTunes U? I have Final Cut Studio (Compressor), QT Pro and Squeeze available to me.
    Any experience you have with this would be helpful.
    Thanks,
    Jeana

    For converting old files to a podcast-compatible video, and based on the machine you have, consider the Elgato Turbo.264. It is a fairly priced "co-processor" for video conversion, consisting of an application and a small USB device with an encoder chip in it. In my experience, it is the fastest way to create podcast video files. The amount of time you will save will pay for the device quickly (about $100). Plus, it does batch conversion of any video that your system currently plays through QuickTime. It has all the necessary presets, and you can create your own. It has a few minor limitations, such as not supporting (at this moment) enhanced podcasting features like chapter markers and closed captions, but since you have old files for conversion, that won't matter.
    For creating new content, the workflow varies a lot. Since you mention MP3s, I guess you are also interested in audio files. I would stick with GarageBand, especially if you are a beginner; plus, it supports enhanced podcasts.
    In any case, the most important goal is to have the simplest and fastest way to go from recording to publication. The less editing the better. To attain that, the best methods require the largest investments. For example, for video production the best way is to produce the content live, so that when you finish recording it is only a matter of encoding and publishing. That requires a video switcher that can ingest at least one video camera and a computer output to properly capture presentation material. That's the minimum. There are several devices that can do this for you. Some are disguised PCs and some connect to a PC for tapeless recording. You can check the TriCaster, which I like but wish were a Mac and not a Windows XP PC. Other routes may include video mixers from manufacturers like Edirol, Panasonic or Sony connected to a VTR, or directly to a Mac for direct-to-disc capture. If you look at some of the content available in iTunes U, you will see what I describe here. This workflow requires preparation and sufficient live support, but you will have your material ready for delivery almost immediately after the recorded event. No editing required. Finally, the most intensive workflow is to record everything separately and edit it later, which is extremely time-consuming.

  • HR Master Data conversion-SAP Best Practices

    Hello there,
    We would like to use the SAP Best Practices for HR master data conversion.
    Now we want to leverage the SAP Best Practices to convert the master data. Could anyone explain in detail how to do this?
    How do we install the Best Practices only to the extent of the data conversion? We don't want to use the rest of the Best Practices.
    I know there are some notes out there.
    Any help on the above is highly appreciated.

    Hi,
    I am not very sure if you can install only the required component, but there would be some prerequisites for every installation.
    It will be clearly mentioned in the baseline.
    Also check whether it is available for the country you are currently working in.
    Use the eCATT Test Configuration & Test Scripts.
    Please revert in case you need further details.

  • Best Practices for Payroll Conversions

    Hi All
    We are currently trying to do data conversion at a client with 100,000+ employees. We are trying to load the tables
    T558B - Payroll Periods
    T558C - Payroll Account Transfer: Old Wage Types
    T5U8C - Transfer external payroll results (USA)
    for all the YTD taxes for the Employees.
    We have designed LSMWs using transaction code SM30 to load all the employees. When we load using SM30, it takes a very long time. Can anyone suggest whether this is the best practice, or if there is an alternative way to load this data into SAP without using SM30?
    Please let me know
    Thank you
    Deepthi
    Note: I have checked the SAP Best Practices, and that document uses SM30 transaction recording.

    Hi,
    In implementing any LO module, generally the following are the points to be taken in mind.
    1. At the base level, have an ODS to load data from R/3. This ensures that you have exactly the same data content as in R/3. This can be write-optimized.
    2. At the second level, have an ODS along with the transformation and modification of data based on the business and functional requirements and enhancements.
    3. Finally, have a cube to consolidate the data and make it available for reporting. Create all the reports based on the cube.
    Hope this gives an idea.
    Regards,
    akhan

  • Best practice UOM changes for one master data

    Hello ALL,
    What best practices must be followed when changing the UOM data of a material master?
    For example, I have two UOMs:
    base UOM = EA, and an alternative UOM, CAR, where 1 CAR = 10 EA.
    Now my master data team is deleting CAR via MM02.
    What follow-up actions should we take?
    (1) For example, I have a contract for this material in CAR.
    (2) I have many POs for which GR has not yet been done.
    Likewise, can you list the impacts in the system
    and how we can go ahead?
    What best practices have you followed for such UOM changes with your customers?
    br
    muthu

    Hi Ravi / tao
    Yes, I am deleting the alternative UOM CAR in MM02.
    I created a contract for that material in CAR.
    My contract is also referenced in the source list.
    Please share your views.
    What will happen to my open documents?
    The system may not allow me to do GR, right?
    br
    muthu

  • Best practice video conversion from download

    I am looking for best practice for video conversions.
    I am downloading adobe recordings via this method:
    http://server.adobeconnect.com/xyz/output/filename.zip?download=zip
    From here, I have been converting the FLVs using either Freemake Video Converter or FLV Converter. I have tried converting into AVI (Xvid), MOV, WMV, etc. (I need the file to be under 600 MB for an hour of recording, so it is going to need some type of compression).
    My goal is to import the video into Sony Vegas Pro 10 for further editing. I have found that whatever method I use, the video and audio do not sync properly about 50% of the time; the video time is usually longer than the audio time. Or there are other errors, such as the video just freezing halfway through.
    I have been using Connect for a few years now, but with each update (Connect 8, 9, etc.) I find that the problems are getting worse. At this point I am just wasting time converting into various formats with various codecs, trying to luck upon one where the video is at least without error.
    What methods are others using to convert the FLV to a workable editable format?

    Can't the FLV files be changed into many different formats through Apple's Compressor or Adobe Media Encoder? These formats can then be opened in standard video editing software for editing.
    Yes, I use this as a last resort, as the quality of capture this way is significantly lower, and it is a much more time-consuming process. I sometimes have over 50 parts of one hour of video. To use Camtasia, you need first to record it, then it must be saved in a Camtasia format, and then lastly rendered into AVI or WMV, so it does take a while.
    Really, I feel there shouldn't be so many errors in the conversion process, but I am finding the FLV recordings themselves have problems. With this last file I am looking at, even the recording playback on Adobe Connect has serious issues with audio and video sync. A problem that is all too common.

  • Conversion exit missing in BI - Best practice to install missing exits

    Hi,
      We are on BI 7.0 SP10.
    While doing mappings, I found two conversion exits, MATN2 & SCOPE, missing from the BI box. Hence SAP was throwing an error while trying to activate the 7.0 DataSource.
    On further analysis, I found that the function groups are missing altogether.
    What is the best practice to install these missing objects (function groups, function modules) in the system?
    Our ABAP person suggested creating these objects as "repair objects" in the system. Is that the right procedure?
    Please advise.
    Thanks
    Vishno

    I have been through the following steps:
    Entered this URL: http://help.sap.com/bp/initial/index.htm
    Clicked on 'Cross-industry Packages'
    Clicked on 'CRM'
    Clicked on 'English'
    Then the following page is displayed:
    http://help.sap.com/bp_crm70/CRM_DE/HTML/index.htm
    But now what? How do I get the Best Practices instructions for a CRM implementation?
    Jason

  • JSF CDI : Conversation scope bean[s] best practice

    Hello, I'm currently learning JSF 2.0, and I'm glad this conversation scope feature exists. It is very helpful for opening a new tab or window on the same page and having separate resources that don't override one another.
    But I'm curious how to implement this well: when to start the conversation and when to end it.
    In my case, I have one CDI bean per JSF page. Let's say I have a menu which, when clicked, leads to page A; from A you can go to B, B to C, and C to D, so all four pages are connected in one chain.
    Accessing A's bean properties from the B, C or D beans is possible, accessing B's properties is also possible from the C or D beans, and so forth.
    Now I'm quite confused about:
    * whether all of A, B, C, D should be in conversation scope, or perhaps just A? Because sometimes a page outside the A-B-C-D chain, like a page F, could navigate to page B, although I don't know how to supply the data to bean B yet
    * whether all of A, B, C, D should be combined into one bean
    * where and when to start the conversation. I'm thinking about the constructor, but I don't think that's a good idea, because I'd prefer to start the conversation when the page is first accessed, not the bean
    * where and when to stop the conversation, so that there won't be unused resources hanging around
    Regards,
    Albert Kam

    There are also some commandButtons on the view page, which perform actions on the Product object fetched by the passed ID.
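    For what it's worth, a minimal sketch of the begin/end pattern the question asks about, using the JSF 2.0 / CDI javax.* APIs (the bean, action methods and navigation outcomes are made up for the example). Starting the conversation from a page action rather than the constructor keeps the bean transient until page A is actually used, and ending it on the last page of the chain frees the resources:

    import java.io.Serializable;
    import javax.enterprise.context.Conversation;
    import javax.enterprise.context.ConversationScoped;
    import javax.inject.Inject;
    import javax.inject.Named;

    @Named
    @ConversationScoped
    public class PageABean implements Serializable {

        @Inject
        private Conversation conversation;

        // Called from page A's first action, not from the constructor,
        // so the long-running conversation starts when the page is used
        public String openFlow() {
            if (conversation.isTransient()) {
                conversation.begin();
            }
            return "pageB?faces-redirect=true"; // hypothetical outcome
        }

        // Called from the last page in the A-B-C-D chain to free resources
        public String finishFlow() {
            if (!conversation.isTransient()) {
                conversation.end();
            }
            return "done?faces-redirect=true"; // hypothetical outcome
        }
    }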

  • Best practices for managing Movies (iPhoto, iMovie) to IPhone

    I am looking for some basic recommendations and best practices for managing the syncing of movies to my iPhone. Most of my movies come either from a digital camcorder into iMovie or from a digital camera into iPhoto.
    Issues:
    1. If I do an export or a share from iPhoto, iMovie, or QuickTime, what formats should I select? I've seen 3gp and m4v.
    2. When I add a movie to iTunes, where is it stored? I've seen some folder locations like iMovie Sharing/iTunes. Can I copy movies directly there, or should I always add to the library in iTunes?
    3. If I want to get a DVD I own into a format for the iPhone, how might I do that?
    Any other recommendations on best practices are welcome.
    Thanks
    mek

    1. If you type "iphone" or "ipod" into the help feature in iMovie, it will tell you how.
    "If you want to download and view one of your iMovie projects to your iPod or iPhone, you first need to send it to iTunes. When you send your project to iTunes, iMovie allows you to create one or more movies of different sizes, depending on the size of the original media that’s in your project. The medium-sized movie is best for viewing on your iPod or iPhone."
    2. Mine appear under "Movies", which is where iMovie puts them automatically.
    3. If you mean movies purchased on DVD, then copying them is illegal and cannot be discussed here.
    From the terms of use of this forum:
    "Keep within the Law
    No material may be submitted that is intended to promote or commit an illegal act.
    Do not submit software or descriptions of processes that break or otherwise ‘work around’ digital rights management software or hardware. This includes conversations about ‘ripping’ DVDs or working around FairPlay software used on the iTunes Store."

  • CO-PA: product hierarchy predefined from best practice

    Hi guys,
    Unfortunately, I have as a basis a best practice system where a product hierarchy with 3 levels is already defined in CO-PA:
    PAPH1     ProdHier01-1     ProdH01-1     CHAR     5     MVKE     PAPH1
    PAPH2     ProdHier01-2     ProdH01-2     CHAR     10     MVKE     PAPH2
    PAPH3     ProdHier01-3     ProdH01-3     CHAR     18     MVKE     PAPH3                 
    PRODH     Prod.hierarchy     Prod.hier.     CHAR     18     MVKE     PRODH
    Those ones are already assigned to a best practice operating concern:
    10UK     Best Practices
    1. We have 7 levels in our product hierarchy.
    2. I need to read it from MARA and not MVKE.
    When trying to add product hierarchy as a characteristic in KEA5 in our new operating concern, I got the message below.
    This data element is used in tons of tables, and the whole procedure looks risky; after transport I might have issues again.
    Creating WW* characteristics for the product hierarchy is not an option, since then I would need to maintain the whole product hierarchy as characteristic values again (like before release 4.5), and this is far more than 1000 entries and means double maintenance after go-live.
    Deleting the best practice operating concern is also difficult, since there are several clients where customizing still sits for 10UK.
    Anybody experience? What did you do?
    regards
    Bjoern
    Here is the text from KEA5 when trying to add MARA fields (since the data element lengths are different due to more levels):
    Product hierarchy field PAPH1 cannot be adapted to the new structure
    Message no. KE691
    Diagnosis
    The definition of the product hierarchy has changed (structure PRODHS). The hierarchy field generated in CO-PA, PAPH1 cannot be adapted to the new definition, because it is already being used in structures or tables.
    System Response
    The field PAPH1 does not appear in the list of fields that can be selected.
    It can no longer be used as a characteristic.
    Procedure
    If you still want to define hierarchy field PAPH1 as a characteristic, proceed as follows:
    1. Display the where-used list for data element RKEG_PAPH1 in database fields using transaction SE12. For hierarchy field PAPH1 to be adapted, a data conversion must be carried out for all the tables listed!
    2. First change the number of characters in the domain RKEG_PAPH1 to match the new definition of the product hierarchy, following the example below. You can do this using transaction SE11.
    Structure PRODHS field   Length   Domain name   Number of characters
    PRODH1                   3        RKEG_PAPH1     3
    PRODH2                   2        RKEG_PAPH2     5
    PRODH3                   5        RKEG_PAPH3    10
    PRODH4                   8        RKEG_PAPH4    18

    Just as info,
    I needed to go for deletion of those characteristics - quite some work, like:
    1. Delete all where-used entries for the characteristics in ALL CLIENTS of this system (important: e.g. planning layouts - it is recommended to change them rather than simply delete them, because the change can be transported).
    2. When then trying to delete the characteristics (with "unlock" in KEA0 > value fields), tons of tables and structures popped up where this data element is still needed - follow note 353257.
    3. Regenerate the best practice operating concern (10UK in our case).
    4. Create PAPH1 etc. anew from table MARA (not MVKE in our case - it depends).
    All good - hopefully no issues after transport - we will see.

  • Best practice for Video over IP using ISDN WAN

    I am looking for the best practice to ensure that the WAN has sufficient active ISDN channels to support the video conference connection.
    Relying on the load threshold either:
    - takes too long for the ISDN calls to establish, causing problems for video setup,
    - or is too quick to place additional ISDN calls when only data is using the line.
    What I need is for the ISDN calls to be pre-established just prior to the video call. I have done this in the past with the "ppp multilink links minimum" command, but this manual intervention isn't the preferred option in this case.
    thanks

    This method is as secure as the password: an attacker can see
    the hashed value, and you must assume that they know what has been
    hashed, with what algorithm. Therefore, the challenge in attacking
    this system is simply to hash lots of passwords until you get one
    that gives the same value. Rainbow tables may make this easier than
    you assume.
    Why not use SSL to send the login request? That encrypts the
    entire conversation, making snooping pointless.
    You should still MD5 the password so you don't have to store
    it unencrypted on the server, but that's a side issue.
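    To illustrate just the hashing part, a minimal Java sketch (the class and method names are made up for the example; as noted above, plain MD5 is still vulnerable to rainbow tables, and the login exchange itself should go over SSL):

    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;

    public class PasswordHashSketch {

        // Returns the MD5 digest of the password as a hex string, so only the
        // hash (not the cleartext password) needs to be stored on the server
        public static String md5Hex(String password) throws Exception {
            MessageDigest md = MessageDigest.getInstance("MD5");
            byte[] digest = md.digest(password.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        }
    }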

  • Best Practice in V7.0 : Issues with Sales Planning and Reporting

    I am trying to install the SAP Best Practices for BPC 5.1 on SAP BPC 7.0 SP04. I have done this because I cannot find any Best Practices documents for version 7 as yet.
    I have managed to get through the Administration setup and most of the BPC Administration Configuration Guide; however, I am having a problem with 7.4 Running a Data Management Package - Import, on page 32 of 36. This step involves uploading a data file, Demo_Revenue_Data.txt, into BPC.
    The file says that it failed due to "Invalid dimension ACCOUNT in lookup".
    I believe this error may be driven by a previous step, 6.4 Creating Script Logic, where the logic for the BP_Sales application was required.
    My question is twofold, in that I need to determine:
    1. Has anyone else tried the Best Practices for BPC 5.0 in BPC 7.0?
    2. Does anyone know how to overcome the error when uploading the Demo Revenue into BPC?

    Hi,
    The BPC Best Practices document from version 5 also works fine for 7.0, because 7.0 is just an update of 5.x.
    Running Import involves logic only if you are running the package with the Run Default Logic option enabled.
    Your issue seems to be related to mapping, which means you have to check the Transformation and Conversion files.
    In any case, the Best Practices document will not provide you with information about how to build Transformation and Conversion files.
    You should follow an SAP BPC training; it will help you build your application more easily and quickly.
    Regards
    Sorin Radulescu

  • Best practice on extending the SIEBEL data model

    Can anyone point me to a reference document, or provide from their experience, a simple best practice for extending the Siebel data model for business-unique data? Basically I am looking for some simple rules - based on either use-case characteristics (need to sort and filter by, need to update frequently, ...) or data characteristics (transient, changes frequently, ...) - to tell me whether I should extend the tables, leverage the 'X' tables, or do something else.
    Preferably they would be prescriptive and tell me the limits of the different options from a use perspective.
    Thanks

    Accepting the given that Siebel's vanilla data model will always work best, here are some things to keep in mind if you need to add something to meet a process that the business is unwilling to adapt:
    1) Avoid re-using existing business component fields and table columns that you don't need for their original purpose. This is a dangerous practice that is likely to haunt you at upgrade time, or (worse yet) might be linked to some mysterious out-of-the-box automation that you don't know about because it is hidden in class-specific user properties.
    2) Be aware that X tables add a join to your queries, so if you are mapping one business component field to ATTRIB_01 and adding it to your list applets, you are potentially putting an unnecessary load on your database. X tables are best used for fields that are going to be displayed in only one or two places, so the join would not normally be included in your queries.
    3) Always use a prefix (usually X_ ) to denote extension columns when you do create them.
    4) Don't forget to map EIM extensions to the extension columns you create. You do not want to have to go through a schema change and release cycle just because the business wants you to import some data to your extension column.
    5) Consider whether you need a conversion to populate the new column in existing database records, especially if you are configuring a default value in your extension column.
    6) During upgrades, take the time to re-evaluate your need for the extension column, taking into account the inevitable enhancements to the vanilla data model. For example, you may find, as we did, that the new version of the S_ADDR_ORG table had an ADDR_LINE_3 column, and our X_ADDR_ADDR3 column was no longer necessary. (Of course, re-configuring all your business components to use the new vanilla column can also be quite an ordeal.)
    Good luck!
    Jim
