Different Data Manager look&feel interface

Hi all,
After migrating an appset to another environment, we are confused about the Data Manager part. The whole interface for executing a package is different (with icons, etc.), even though the versions of the components (BPC, SQL, etc.) are the same in both environments. Does anybody know what this new interface depends on?
Regards,
Rafa

It is a security setting. In the task profile, when you are an admin for Data Manager you get the view without icons; if you only have execute rights, you get the version with the icons.

Similar Messages

  • How to standardize the look & feel of different vendor JSF components?

    Hi,
    There are various JSF components created by different parties, for example Tomahawk from MyFaces, ADF Faces from Oracle, and the Sun components provided in Java Studio Creator. They all look different.
    Can we standardize the look and feel of these various components when we use them in the same project, so that the page does not look inconsistent when it is displayed to the end user?
    Please advise.
    Thank you.

    Hi,
    Can you be more specific about how to achieve that using CSS? Do you mean that the look and feel we see is not an image?
    Have you seen what the Oracle ADF Faces look & feel is like?
    How do we change its look to match what Sun offers in Java Studio Creator 2?
    Please advise.
    Thank you.

  • How to show different Look & Feel to different users?

    Hi,
    how to show different Look & Feel to different users?
    Thanks & Regards,
    Venu--

    If you want the user to select the LookAndFeel themselves, use Visitor Tools.
    If you want to use code and change it dynamically, see http://download.oracle.com/docs/cd/E13155_01/wlp/docs103/javadoc/index.html?com/bea/netuix/laf/PortalLookAndFeel.html
    If you have only a few combinations, you could even create different desktops and direct each user to the appropriate URL.

  • Run SSIS Package (SQL Database on Different Server) from Data Manager.

    Hi, how do I run an SSIS package from BPC Data Manager? The package connects to another SQL Server database and creates a text file. That text file is the source for the BPC custom tasks CONVERTTASK and DUMPLOAD, which load it into BPC.
    Here is the flow of the complete package:
    Data Flow (create the text file from a SQL database)
    CONVERT TASK (convert the file to BPC format)
    DUMPLOAD TASK (load the converted file into BPC)
    Any pointer will be a great help.

    Hello Pam,
    When you run an SSIS package with BPC DM, it runs on the application server. You don't really have to run a package on a different server in order to get data from a remote database and dump it to a file; that task can be done in your SSIS package using the various data sources/destinations, if that's what you are trying to do. The only thing is that your BPC admin user (the one you used to install BPC) has to have the appropriate privileges on the remote server.
    Hope that helps.
    Regards,
    Akim

  • Looking for Master Data Management in non-NetWeaver environments

    Hello
    Based on the SAP literature I have read on Master Data Management (MDM), it is a component of the NetWeaver product family.
    Currently we are running SAP R/3 Enterprise edition (4.7C) and wish to have some form of MDM to maintain a consistent view of master data across different landscapes. These landscapes span different geographic and administrative areas.
    Is there any product, either from SAP or third-party vendors, which can offer an MDM solution for environments such as ours that have not yet switched to NetWeaver?
    Kind Regards
    Mohammad

    Hi,
    SAP definitely offers an SAP MDM solution. It is part of the NetWeaver family, but you can buy and install it separately if you do not wish to switch to NetWeaver.
    It is a myth that SAP MDM is only for SAP products.
    There are multiple input options available, so it can work for you.
    Please let me know if you need more info.
    You can also post more questions in the SAP MDM forum.
    Hope this helps,
    + An

  • VI/data management

    Hi all.
    Two years ago I graduated, and one year ago I started developing a typical stimulus-response setup for behavioural neurophysiological experiments in LV, without knowing anything about LV (or programming in general). I'm a pretty quick learner, however, and today the setup (one stimulus PC, one control (LV) PC and a DSP, all connected through TCP/IP) is used intensively in our lab, although I'm still constantly debugging it and developing new features for it.
    I'm running into management problems, however. A few examples: the DSP sends everything it samples (currently 3 channels @ 20 kHz) to LV in blocks of 200 samples, for saving to disk and display. At the same time it sends/receives control commands to/from the stimulus PC and eventually from other PCs on the internal network.
    When the system has been running for a long time, the LV display can suffer from delays of up to 10 seconds; e.g. I stop execution on the DSP, but the LV display keeps displaying data for another 10 seconds.
    Also, every time the histograms get updated, the main VI's display freezes for half a second.
    Sometimes it takes way too long before LV (or is it Windows? I don't know how the TCP/IP is handled) reads the TCP/IP data, which results in the DSP crashing (I'm at the edge of CPU usage there). That is the worst thing that can happen, because the experiment then has to be stopped.
    After reading up a bit on data management etc., I quickly figured out that my VI is probably one big mess that would scare most programmers away (I cannot post it for internal reasons, but I posted the front panel of the VI running a test, so you can imagine there's a lot behind it). Since the VI is still growing every day, I don't have much time to refactor it completely, but I feel I have to do something or I will get lost in the future (we're planning to go to 16 channels, which all have to be saved to disk by the VI; later it will be 100 channels, all containing brain activity).
    So I have some questions I'd like to be answered by more experienced users so I can get some ideas for a new implementation.
    Currently, I have 4 while loops containing TCP readers. Three of them also display the data received and write it to file.
    The writing is buffered, however: first a shift register is filled with 10000 samples, then an event is posted and the data is passed through the event to another loop that writes it to disk. Would it be better to store the data in a local/global variable shared between the two loops, and just fire an event to warn the second one? Or is the queueing system better (I haven't looked at it yet)? The same question goes for getting the data from the main VI to the histograms; I do this with a global event.
    For the display, not every single point is shown (because the screen refresh rate is way slower than the data rate), but would it help to put the display in a separate loop and use events etc. to pass the data?
    Also, I use something like 100 local variables that have to be commonly accessible between the several while loops I have running. I've read that locals aren't that efficient, but I see no other solution for it?
    The squares on the eye/spike displays you see in the attachment can be positioned/resized with the mouse; I have separate while loops for that, recalculating the position every 100 ms depending on zoom etc. Every display on the front panel is a layer of several pictures/graphs, so the updating of the layers is never synchronous (e.g. the data is displayed every 20 ms, but the squares every 100 ms). Is this a drawback, or does LV automatically sync it since it can only display at the refresh rate anyway?
    Almost every user input (menu/buttons) is handled through events, so I have 3 while loops with an event structure, each handling 20 events, sometimes with lots of work in them. Is a system with event handling that then passes the work to a task handler or similar preferable, and why?
    And is every while loop in LV running in a separate thread?
    I have several more questions about things that look like fundamental LV stuff to me. Maybe I should follow a course? But I doubt there's a course focusing on all my problems. Or should I get a private teacher somewhere? :-]
    Thanks for reading this little book, and thanks in advance for any answers!
    Stijn
    (LV 7.0, very clean Win2k system on a dual Xeon 3 GHz, 1 GB RAM, dual-screen Dell; I get up to 60% CPU usage during experiments, most of it kernel time)
    PS: very important note for dual NVIDIA screen users: do *not* use dual display, only horizontal span, or Windows will eat all your kernel time while running e.g. VIs, slowing everything down really, really badly. Horizontal span is better since then Windows thinks it is one (huge) screen only. It took me weeks to figure out why my VI couldn't read/display all the data in time; well, it could, but it was just waiting for Windows to do freaky secret stuff with my display driver.
    Attachments:
    VI.zip 755 KB

    Hi Stijn,
    First of all, there is a specific course, called the LabVIEW Intermediate I course, that discusses these issues. This course was created for customers who want to write professional, large applications.
    There is a course scheduled in Belgium on the 29th of August, by the way.
    I have one big tip actually: separate your user interface handling from the data processing and put them into separate parallel loops.
    Use the Producer/Consumer Design Pattern with Events (template in LabVIEW). Use the Queue VIs to make sure that no event on the User interface is lost and use User events to communicate back to the User interface loop.
    Queue the data coming from the TCP/IP communication. Use separate loops to split up the acquisition of your data and the processing of your data. Master/Slave designs are ideal for this. Queueing will make sure that you don't lose data while processing. Make sure that none of the loops are blocked by shared resources; for example, make sure that the acquisition loop doesn't have to wait for the processing loop to finish because there is a shared resource between the loops. Queues don't have that problem.
    Locals can create race conditions. They also create a copy of the data, which is memory intensive. Alternatives for locals are the Queue VIs or Notifiers (check the design templates in LabVIEW). You could also use "functional globals": these are VIs that replace the normal global variable with a state machine that has a "get data" and a "set data" state and stores the value in a shift register.
    Multithreading ---> Check the link below:
    http://zone.ni.com/devzone/conceptd.nsf/webmain/D2E196C7416F373A862568690074C759
    Most of your questions are covered by the Intermediate I and Intermediate II courses.
    Regards.
    JorisV
    TL AE
    NIBE

  • Create records in Data Manager - System creates duplicates

    Hello MDM-Experts,
    I have a kind of strange question: has anyone ever had the following situation in MDM Data Manager?
    When creating one (1) record manually in Data Manager, the system duplicates the record, and suddenly we have 2 or even up to 4 identical records in the system. Only the auto ID is of course different.
    Looking at the create date, the time difference is only a few seconds (4 to 5 seconds).
    There is no entry in the log files, so I do not really know what the problem is.
    Has anyone else ever had that? Is there a keyboard combination that forces that?
    Thanks for all your help and answers!
    Best regards,
    Stefanie

    Hello Rajeev,
    I am assuming something like that. By now we even assume that the system cannot cope with merging and creating records at the same time. Two colleagues were doing that at the same time on the same repository, but not on the same records.
    When the one doing the merging stops, the records are created correctly.
    Will test your suggestion.
    Thanks,
    Stefanie

  • How to start with Master data management

    Hi All,
    I want to start with Master Data Management. Can anyone provide me with proper material to gain knowledge of Master Data Management integration? What is the relationship with XI, and what settings do I have to make to get started with MDM?
    It's really urgent, please help me out.
    Thanks,
    Seema Khandelwal
    Message was edited by:
            Seema khandelwal

    Hi ..
    Look at these documents:
    /people/andreas.seifried/blog/2007/08/30/how-to-use-the-test-environment-of-the-mdm-enrichment-controller
    /people/harrison.holland5/blog/2006/11/27/mdm-syndication
    /people/harrison.holland5/blog/2007/01/22/testing-and-monitoring-an-interface-between-mdm-xi
    see this Wiki to find all the documentation on MDM:
    https://wiki.sdn.sap.com/wiki/display/HOME/MasterDataManagement
    Here, for the components you need to install:
    https://wiki.sdn.sap.com/wiki/display/HOME/MasterDataManagement+Components
    Other useful links are here:
    - SDN e-learning:
    https://www.sdn.sap.com/irj/sdn/mdm-elearning
    http://service.sap.com/installMDM
    You can use the Expression Editor to do that; have a look at this guide on how dates can be calculated there. With your field it can be very similar:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/b025fab3-b3e9-2910-d999-a27b7a075a16
    For more details about the Expression Editor, see the Calculation Fields chapter:
    https://websmp108.sap-ag.de/~sapidb/011000358700006291622006E
    MDM 5.5 SP05 - ABAP API
    https://websmp101.sap-ag.de/~sapidb/011000358700000271902007E
    MDM 5.5 SP05 - ABAP API How To Guides (ZIP File)
    https://websmp101.sap-ag.de/~sapidb/011000358700000271912007E
    How to Identify Identical Master Data Records Using SAP MDM 5.5 ABAP APIs
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/e060251e-b58d-2910-02a2-c9a1d60d9116

  • Import using Data Manager

    Hi everyone!
    My client wants to use the Import package from Data Manager to upload his data.
    I'm looking for a how-to guide that explains how I can configure it.
    Can somebody help me?
    Thanks!!!

    Hi,
    I'm going to try this last one tomorrow.
    Now I have this little problem: if my data file has a header with the columns in a different order, I get a warning and the header line is rejected.
    I introduced a mapping of the columns. My transformation file is like this, but I have the same problem (an alternative mapping is sketched after this post).
    Any suggestions?
    *OPTIONS
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER = TAB
    SKIP = 0
    SKIPIF =
    VALIDATERECORDS=YES
    CREDITPOSITIVE=YES
    MAXREJECTCOUNT= -1
    ROUNDAMOUNT=
    *MAPPING
    CATEGORY=*COL(1)
    ACCOUNT=*COL(2)
    CECO=*COL(3)
    CLIENTES=*COL(4)
    CURRENCY=*COL(5)
    DATASRC=*COL(6)
    ENTITY=*COL(7)
    INTCO=*COL(8)
    ORGVENTAS=*COL(9)
    PRODUCTO=*COL(10)
    TIME=*COL(11)
    SIGNEDDATA=*COL(12)
    *CONVERSION
    Thanks!
    Regards
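    As a hedged illustration (not a tested fix): if your BPC version accepts mapping by header caption rather than by column position, and the header captions in the data file match the dimension names, a *MAPPING like the one below makes the physical column order irrelevant; verify both assumptions against your own file and version first.
    *OPTIONS
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER = TAB
    MAXREJECTCOUNT= -1
    *MAPPING
    CATEGORY=CATEGORY
    ACCOUNT=ACCOUNT
    ENTITY=ENTITY
    TIME=TIME
    SIGNEDDATA=SIGNEDDATA
    *CONVERSION
    (The remaining dimensions would be mapped the same way; the right-hand side here is the column's header caption, not a *COL(n) position.)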

  • Importing with DATA MANAGER

    While directly importing with the help of Data Manager, it gives me an error.
    It doesn't import data from each and every field; it gives me two kinds of errors:
    1) some fields don't exist
    2) another type of error is "unable to find a lookup value"
    Also, please suggest: given that Import Manager exists, when should I use this functionality?
    Regards,
    Eleana West

    First of all, this functionality can be used whenever you are sure that the data to be imported is free of any redundancies; only then is it feasible to use it.
    Otherwise you need to use Import Manager so that any kind of duplicate records can be eliminated.
    The errors you are getting are:
    1)  some fields don't exist
         This occurs because the column names on the source side do not match the names on the destination side, i.e. the field names in the repository structure.
    2)  unable to find a lookup value
         Even if the field names match, if the field on the destination side is a lookup field, i.e. it in turn looks up into some other table, then this kind of error arises.
    You can bypass this error while importing through Data Manager by clicking the option button to either skip the value or skip the record.
    Hope this clears your doubt.
    If you have any other queries, feel free to ask.
    Regards, Tejas

  • Best approach for combining cubes of different data

    Hi Guys,
    I have two different cubes with very different data in them and only a couple of shared characteristics.
    I'm looking to merge most of the data of cube 2 with the data of cube 1.
    A MultiProvider won't work (not many shared characteristics).
    I can't write a routine to pull the data, as I can't build a simple routine; it is difficult to extract from the dimensions.
    I have tried an InfoSet, but this does not appear to work.
    The scenario is that I have two fields in cube 1 that exist in cube 2; I want to look up these values in cube 2 and return some of the key figures that exist there.
    Is the easiest way around this to create a transformation from the cube to an ODS and extract the data via a lookup routine?

    Hello,
    Is the data being loaded directly into the cube, or is there an underlying DSO?
    If you want to look up on the cube based on two characteristics, it should not be a difficult lookup routine to code in the transformation; a sketch is shown below.
    If you feel that is not feasible, load the data into a DSO first and then into the cube, and use this DSO for your lookup. But note that you would then be consuming DB space again.
    Regards,
    Shashank
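    To make the suggestion concrete, here is a minimal, illustrative end-routine lookup sketch. All names are placeholders rather than objects from this thread: /BIC/AZLOOKUP00 stands for the active table of the lookup DSO, ZCHAR1/ZCHAR2 for the two shared characteristics, and ZKYF1 for the key figure to return; verify the real names and field list before using anything like this.
    * Illustrative sketch only; replace the placeholder names before use.
      TYPES: BEGIN OF ty_lookup,
               /bic/zchar1 TYPE /bic/oizchar1,
               /bic/zchar2 TYPE /bic/oizchar2,
               /bic/zkyf1  TYPE /bic/oizkyf1,
             END OF ty_lookup.
      DATA: lt_lookup TYPE STANDARD TABLE OF ty_lookup,
            ls_lookup TYPE ty_lookup.
      FIELD-SYMBOLS: <result_fields> LIKE LINE OF result_package.
      IF result_package IS NOT INITIAL.
    *   One SELECT per data package instead of one per record.
        SELECT /bic/zchar1 /bic/zchar2 /bic/zkyf1
          FROM /bic/azlookup00
          INTO TABLE lt_lookup
          FOR ALL ENTRIES IN result_package
          WHERE /bic/zchar1 = result_package-/bic/zchar1
            AND /bic/zchar2 = result_package-/bic/zchar2.
    *   Copy the looked-up key figure into each result record.
        LOOP AT result_package ASSIGNING <result_fields>.
          READ TABLE lt_lookup INTO ls_lookup
               WITH KEY /bic/zchar1 = <result_fields>-/bic/zchar1
                        /bic/zchar2 = <result_fields>-/bic/zchar2.
          IF sy-subrc = 0.
            <result_fields>-/bic/zkyf1 = ls_lookup-/bic/zkyf1.
          ENDIF.
        ENDLOOP.
      ENDIF.
    Reading the lookup DSO once per data package (FOR ALL ENTRIES) rather than once per record is the usual way to keep such a routine fast.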

  • BPC 7.5 - Data Manager filling tablespace

    We are installing a brand new BPC for NW 7.5 SP4 system. When trying to import data from .txt files via the Data Manager, the system generates /1CPMB/DM0000xxx tables (for example /1CPMB/DM0000001). The problem is that these tables are generated with a size category of 9, which defaults to a next extent of 320 MB. This is on an Oracle-based backend BI 701 system with dictionary-managed tablespaces.
    If you only upload 1-2 records, the data falls within the initial extent of 16 KB. But if you upload enough records to get past that initial extent, the table blows up and takes 320 MB of space, most of it wasted.
    We've engaged SAP Support, and thus far they have not been able to provide a means to have these tables generated with a lower size category, such as 1, or any other workaround for this issue.
    Initially we were trying to load 60-100 smaller data files, but seeing how much space BPC wastes in dealing with them, we are looking at combining them into a smaller number of files. If not, you could theoretically end up taking 32 TB of space.
    We can reorganize the tables after the fact, but that doesn't help with the newly generated tables created every time we upload a file using Data Manager.
    1) Log in to the BPC for NetWeaver website
    2) Go into the Excel portion
    3) Log in to the application set
    4) Click on eData at the top of the screen
    5) Select Upload Data File
    6) Enter source and target locations; the file is a .txt file (40 KB in size)
    7) Click OK to start the upload
    *error*
    Any thoughts?

    On further digging, the tables are hardcoded by SAP code to default to a size category of 9...
    class:
    CL_UJF_FILE_SERVICE_MGR
      METHOD upload_document_dm.
        TRY .
            lo_file_service_mgr->put_document_data_mgr(
                        EXPORTING i_docname     = lv_docname
                                  i_append      = lv_append
                                  i_doc_content = lt_content
                                  i_content_delivery = im_is_cd ).
          CATCH cx_ujf_file_service_error INTO lo_exception.
            imess = convert_ex_to_message( lo_exception ).
            IF imess IS NOT INITIAL.
              READ TABLE imess INTO xmess INDEX 1.
              MESSAGE e001(00) WITH xmess-message.
            ENDIF.
        ENDTRY.
    method:
    PUT_DOCUMENT_DATA_MGR
    * If document is found, that means that there is probably
    * a database table already created,
      CASE me->is_document_found( ).
        WHEN abap_true.
          lv_tabname = me->ds_document-doc_content_db.
        WHEN abap_false.
    * Check if this call is for content delivery, if so, then generate the table using
    * a different namespace, otherwise use the default naming for data manager files.
          CASE i_content_delivery.
            WHEN abap_false.
              lv_tabname = do_file_service_dao->generate_doc_content_table( i_namespace = 'DM' ).
            WHEN abap_true.
              lv_tabname = do_file_service_dao->generate_doc_content_table( i_namespace = 'CD' ).
          ENDCASE.
      ENDCASE.
    method:
    generate_doc_content_table
    * Add header and technical settings
      CLEAR ls_dd02v.
      ls_dd02v-tabname    = lv_tabname.
      ls_dd02v-ddtext     = lv_ddtext.
      ls_dd02v-ddlanguage = sy-langu.
      ls_dd02v-tabclass   = 'TRANSP'.
      ls_dd02v-contflag   = 'A'.
      ls_dd02v-exclass    = '1'.
      CLEAR ls_dd09v.
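      " size category hardcoded to '9' here; this is what produces the 320 MB next extent described above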
      ls_dd09v-tabkat     = '9'.
      ls_dd09v-tabart     = 'APPL1'.
      ls_dd09v-pufferung  = space.
      ls_dd09v-bufallow   = space.

  • BPC NW 10.0 - Data Manager Prompt changing from SELECTINPUT to SELECT cleared values

    Dear BPC Experts,
    We recently went from SP13 patch 4 to SP19 patch 1. When we made changes to the PROMPT values in Data Manager under Organize > Package > Modify Script > PROMPT, we saw different behavior when switching from SELECTINPUT to SELECT in our development system than in our production environment. In development, when we changed the value from SELECTINPUT to SELECT, the values entered for the Variable name, such as %SELECTION% in Property1, "Select the members to CLEAR" in Property2, and %DIMS%, remained. However, when we changed from SELECTINPUT to SELECT in the production system, the values for the variable names and properties were cleared out. Does anyone know why the values were kept in our development system but not in our production system during this type of activity? I would like to understand the two different behaviors and what controls them. We prefer that the values for the properties not be cleared.
    Thank you in advance for your assistance.
    Kind regards,
    Lisa
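    For reference, the Variable name and Property1/2/3 entries in that dialog correspond to a PROMPT line in the advanced DM script. Based on the values quoted above, the line would look roughly as follows before and after the change (a sketch only; verify the exact parameter order against your own package script):
    PROMPT(SELECTINPUT,%SELECTION%,,"Select the members to CLEAR",%DIMS%)
    PROMPT(SELECT,%SELECTION%,,"Select the members to CLEAR",%DIMS%)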

    Hi Vadim,
    Excellent point, I should have included images as that likely would have shown this odd behavior.
    When I made the change in our development system to a package, switching from SELECTINPUT to SELECT, the values outlined in the image below were retained for Variable Name, Property 2, and Property 3 after we applied SP19 patch 1.
    When I made the same change to a package in our production system after we applied SP19 patch 1, the values for Variable Name, Property 2, and Property 3 were cleared, per the image below. The odd thing is that initially it looked like the values stayed; it was only after you saved and went back in that you saw the values were gone.
    Any help in understanding this behavior change would be greatly appreciated.
    Thank you,
    Lisa

  • Setting Data Manager Packages Priority

    Hi Experts
    Is it possible to set the priority of Data Manager packages? For example, we have multiple scheduled Data Manager packages running on a server in different appsets. All of the packages are scheduled to run at different time intervals, but unfortunately, due to the nature of the processes, they overlap and cause contention issues on the servers.
    Would it be possible to set the priority level of packages so that, for example, package A has the highest priority and hence always takes priority, even if package B is running?
    Is this possible through SQL Server or Integration Services?
    The packages being run are standard BPC packages, for example Import and Admin_Optimize (Lite); certain packages are SQL based, meaning that they execute SQL stored procedures to export data to other systems (R3, BW, etc.).
    Kind Regards and Thanks
    Daniel

    Priority is always first come, first served: the first package requested is the first one satisfied.
    There is no other priority mechanism.
    The SG (send governor) and the tbldtslog table are used for this.
    If you look at the standard stored procedures from BPC, you will see that NOLOCK, TABLOCK or ROWLOCK hints are actually used.
    Your custom package, depending on what it was designed to do, must avoid concurrency conflicts with other packages.
    Lite Optimize is not a problem if you always select from the WB table with the condition Source = 0.
    It is not easy to explain the entire mechanism in a forum thread.
    Regards
    Sorin

  • What is master data management

    I want to know clearly about MDM in SAP.

    Hi Naresh Reddy,
    SAP NetWeaver MASTER DATA MANAGEMENT
    SAP NetWeaver Master Data Management (SAP NetWeaver MDM) is an enabling foundation for enterprise services and business process management -- providing a single version of the truth for customer, product, employee, supplier, or user-defined data objects.
    Working across heterogeneous systems at disparate locations, SAP NetWeaver Master Data Management ensures cross-system data consistency through interactive distribution. It integrates business processes across the extended value chain, delivering features and functions to enable:
    Master data consolidation -- Consolidate master data for companywide analysis and reporting. SAP NetWeaver Master Data Management consolidates and cleanses master data objects from disparate systems. After consolidation, it stores information from different systems in a centralized repository. Related rich content can be included to augment the data store.
    Synchronization and distribution of master data -- Enable consistent data maintenance and distribution to ensure permanent harmonization of master data. Using global attributes, you can ensure that all systems receive the same master data during distribution -- and enrich the distributed data objects with additional attribute values in the target systems. Distribution can be controlled, visible, and traceable, with active status management at each distribution step.
    Centralized management of master data -- Supports companywide quality standards by ensuring that central control of master data begins as soon as the data is created. Centrally created master data can subsequently be distributed to client systems as required using interactive distribution.
    Administration of master data -- Manage master data without custom code. A powerful interface supports administrative tasks such as data exception handling and assignment of role-based access to business processes and information. Data managers use the interface to configure data source merging, business rules, and distribution details to downstream applications.
    Management of internal content -- Collect and centralize all your content -- including parametric information and rich content such as images, paragraphs of text, PDF documents, and organizational intelligence about content -- in an enterprisewide repository.
    Catalog search -- Deploy intuitive interfaces that help you locate items internally, publish Web catalogs on e-commerce storefronts and in supplier enablement programs, and integrate easy-to-search catalogs into e-procurement solutions -- all from a centralized repository, and all at speeds surpassing normal SQL-based queries.
    Print catalog customization -- Disseminate product information directly from a centralized catalog repository to popular desktop publishing programs, and automatically generate fully formatted and populated page layouts.
    Multichannel syndication of product catalog content -- Publish restructured and reformatted extracts or incremental updates of your product catalog content -- and distribute them to trading partners in several delimited text and XML formats -- on an unscheduled or regular basis.
    Business process support -- Enable communication in a heterogeneous environment, and insert master data into other systems.
    Business analytics and reporting -- Leverage synchronized data for reliable analysis and accurate reporting.
    The following websites clearly explain what MDM is. They contain PDF & PPT presentations:
    SAP Netweaver MDM Overview
    http://www.asug.com/DesktopModules/Bring2mind/DMX/Download.aspx?TabId=66&DMXModule=370&Command=Core_Download&EntryId=3431&PortalId=0
    MDM
    http://www.asug.com/DesktopModules/Bring2mind/DMX/Download.aspx?TabId=66&DMXModule=370&Command=Core_Download&EntryId=1666&PortalId=0
    SAP Netweaver MDM Overview
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/b09b548d-7316-2a10-1fbb-894c838d8079
    SAP NETWEAVER MDM Leverage MDM in ERP Environments - An Evolutionary Approach -
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/4059f477-7316-2a10-5fa1-88417f98ca93
    Master Data Management architecture patterns
    http://www-128.ibm.com/developerworks/db2/library/techarticle/dm-0703sauter/
    MDM and Enterprise SOA
    http://www.saplounge.be/Files/media/pdf/Lagae---MDM-and-Enterprise-SOA2007.10.10.pdf
    Effective Hierarchy Management Using SAP NetWeaver MDM for Retail
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/70ee0c9e-29a8-2910-8d93-ad34ec8af09b
    MDM World
    http://mdm.sitacorp.com/
    MDM: Master Data for Global business
    http://www.sitacorp.com/mdm.html
    MDM Master Data Management Hub Architecture
    http://blogs.msdn.com/rogerwolterblog/archive/2007/01/02/mdm-master-data-management-hub-architecture.aspx
    Improve Efficiency and Data Governance with SAP NetWeaver MDM
    http://www.sapnetweavermagazine.com/archive/Volume_03_(2007)/Issue_02_(Spring)/v3i2a12.cfm?session=
    Data Modeling in MDM
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/5d4211fa-0301-0010-9fb1-ef1fd91719b6
    http://www.sap.info/public/INT/int/index/Category-28943c61b1e60d84b-int/0/articlesVersions-31279471c9758576df
    SRM-MDM Catalog
    http://help.sap.com/saphelp_srmmdm10/helpdata/en/44/ec6f42f6e341aae10000000a114a6b/frameset.htm
    Cheers!
    Gyanaraj
    Please reward points if you find this helpful.
