Source Data for Import Map

I inherited some MDM work from a consultant who rolled off our SRM project.  He did not leave any source input files to use when making changes to maps, and I read in the SAP MDM guides that you should always revert to your original source file when making map changes so you don't lose any mappings (for example, if a newer source file does not contain every segment the original did).
Am I off base?  Is there a safe way to make map changes without having the original source file?
Thank you in advance,
Keith

Hi Keith,
You are absolutely right. This is a common problem faced during subsequent loads into MDM. Generally the map is created in Import Manager, and then, if more values come in for value mapping or more fields are mapped due to a new business requirement, the map sometimes reports that it is out of date.
The solution we came up with was to create a value mapping template (you can also include the field mapping). This holds the complete list of fields and values in one map. If a new value gets added, first add it to the template and then map it in the original map.
In your case, you can either create a template or use the SAVE UPDATE option in Import Manager every time you face an exception via MDIS. SAVE UPDATE will apply the additional mapping onto the original map, so if a similar file comes in again it will be processed successfully via MDIS.
You can refer to my Article on this:
https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/80ad0cff-19ef-2b10-54b7-d4c7eb4390dd
Hope it helps.
Thanks and Regards
Nitin Jain

Similar Messages

  • There is no source data for this data record, Message FZ205

    Hi Experts,
    I am facing a problem with the DME file download. It started all of a sudden in our production system last month and never happened before. Our system landscape has not changed, although our Basis consultant has added two or three new application servers to the production client. We do not have this problem in our testing clients.
    Please note that we have been using output medium '1' from day one, so the system generates the DME in the file system; we download the file to the desktop and upload it to the bank online. After running the payment run, when we try to download the DME file, the system gives the error "There is no source data for this data record, Message FZ205".
    I have tried to fix this issue in many ways but have not been able to. Can you please let me know the reason for this error and how to fix it?
    With best regards,
    BABA

    Hi Shailesh,
    Please share how you solved this problem.
    Many Thanks,
    Lakshmi

  • Dates for import are all wrong

    Hiya - I have been trying to import clips made last week. The date is correct but the year shows as 2009, so they put themselves into a 2009 Event folder along with other projects. I have tried to create a 2010 Event folder and drag the project into it, but it just doesn't want to do it.
    Any ideas as to why iMovie does not see the clips as 2010, and how I can make them appear in a 2010 project?

    iMovie uses the date the camera recorded the video, not the date it was imported. You may wish to check the camera's settings to see if they are up to date. Here's some background, and a way to adjust a clip's date (don't do it on a project that you've already edited without taking precautions):
    http://imovie08.blogspot.com/2007/09/how-to-change-date-for-dv-event-footage.html
    Adjusting the date:
    http://imovie.maccreate.com/2009/12/01/adjust-date-and-time-of-clips-in-imovie-09/
    John

  • Creating multiple records from 1 record in the source file for Import DM

    Hi Experts,
    Today I am working on an interface/import where I want to get the following result:
    The source file contains records like:
    Account, Entity, DriverX
    Sales,EntityA,ZZ
    The BPC appset contains the two dimensions Account and Entity, next to a CostCenter dimension. The DriverX field is just additional information in the source file. However, based on this DriverX we need to determine which CostCenter to choose, and we also need the same record assigned to a second record in BPC.
    Following my example, based on the DriverX value I need to create 2 records:
    Account, Entity, CostCenter
    Sales,EntityA,CC1
    Sales,EntityA,CC2
    I have no problem assigning the record to one CostCenter based on the DriverX value, but I do have a problem creating the second record. Have any of you faced the same "challenge", and if so, would you like to share the solution?
    Best regards,
    Johan
    PS: I am working on SAP BPC, version 7.0 Microsoft version.

    Hi Greg,
    Many thanks for your answer. Yes, that would be a solution. However, I simplified my case: the decision to create a second record, and where to post it, depends on more than one field in the source.
    I will keep it in mind, because I can also opt to store data differently in the BPC fact tables, which would let me use script logic.
    If it is not possible to create multiple records from a single record with the standard functionality in the transformation and/or conversion file, I will have to create a custom DTSX package or change the way I store data.
    Does anyone else have an alternative idea like the one Greg came up with?
    Please let me know!
    Best regards,
    Johan
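     The fan-out Johan describes can be sketched outside BPC as a small pre-processing step. This is only an illustrative sketch: the driver-to-cost-center table below is invented, and a real solution would live in a custom DTSX package or similar, as discussed above.

```python
# Sketch: expand one source record into one output record per target cost
# center, based on the DriverX value. The lookup table is hypothetical.
DRIVER_TO_COSTCENTERS = {
    "ZZ": ["CC1", "CC2"],  # one source record fans out to two cost centers
    "YY": ["CC3"],
}

def expand_record(record):
    """Expand a (account, entity, driver) row into one row per cost center."""
    account, entity, driver = record
    return [(account, entity, cc) for cc in DRIVER_TO_COSTCENTERS.get(driver, [])]

rows = expand_record(("Sales", "EntityA", "ZZ"))
# rows -> [("Sales", "EntityA", "CC1"), ("Sales", "EntityA", "CC2")]
```

     In a real scenario the lookup could depend on several source fields, as Johan notes, but the expand-before-load shape stays the same.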

  • Source data for pipeline report

    Greetings,
    I want to create a pipeline report that shows:
    - Number of leads
    - Number of leads converted into opportunities
    - Number of opportunities
    - Number of won opportunities
    I used the funnel design key figure for leads, but it lacks a filter for how many opportunities have been won.
    Would anyone have a suggestion?
    Thanks,
    Julio Zarnitz

    Hi Julio,
    Since a single report doesn't cover all your field requirements, you can use a combined data source and map the desired key figures and characteristics to get the required results (e.g. try a combination of the Lead funnel and the Opportunity funnel, as between them they have all your required fields).
    Refer to the thread below for detailed steps:
    http://scn.sap.com/docs/DOC-63151
    Regards,
    Surjeet Bhati

  • Sample source code for fields mapping in expert routine

    Hi All
    I am writing an expert routine from a DSO to a cube. For example, I have two fields in the DSO, FLD1 and FLD2, and the same fields in the InfoCube. Can anybody provide sample ABAP code to map the source fields to the target fields in an expert routine? Your help will be highly appreciated; it's urgent.
    regards
    eliaz

    The basic form would be:
    RESULT_FIELDS-xxx = <SOURCE_FIELDS>-xxx.
    You have the source fields as the source, and the result fields as the target. In between you can check conditions, as in other transformation routines.
    BEGIN OF tys_SC_1 shows your source fields (in your case, the DSO characteristics and key figures).
    BEGIN OF tys_TG_1 shows your result fields (in your case, the cube characteristics).
    Hope this helps
    Derya
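     The copy pattern Derya describes (RESULT_FIELDS-xxx = <SOURCE_FIELDS>-xxx, with optional conditions in between) can be sketched in Python terms for readers unfamiliar with ABAP. This is only an analogy, not ABAP; the field names FLD1 and FLD2 come from the question.

```python
# Sketch of the expert-routine pattern: copy each source field to the
# matching result field; conditions could be checked inside the loop.
def expert_routine(source_fields):
    result_fields = {}
    for name in ("FLD1", "FLD2"):
        # analogue of: RESULT_FIELDS-xxx = <SOURCE_FIELDS>-xxx.
        result_fields[name] = source_fields[name]
    return result_fields

mapped = expert_routine({"FLD1": "A001", "FLD2": 42})
# mapped -> {"FLD1": "A001", "FLD2": 42}
```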

  • Source data for Legal and Management Consolidation

    Hi,
    I'm in ECC5, using BCS 4.0 and BW 3.5.
    Our current design requires two types of consolidation: company consolidation and profit center consolidation. Note that the profit center consolidation also requires balance sheet and profit/loss data.
    Now, I understand that the source data coming from R/3 basically originates in the special ledger table FAGLFLEXT. In this table, company and profit center share the same table in order to maintain data consistency.
    My questions are:
    1. Is my understanding of FAGLFLEXT correct?
    2. What are the prerequisite steps so that FAGLFLEXT contains the profit center data?
    Any advice please...
    regards,
    Halim

    Hi Halim,
    Yes, you are right.
    As a prerequisite, you need to activate new General Ledger Accounting in Customizing for Financial Accounting in the OLTP system:
    http://help.sap.com/saphelp_nw04/helpdata/en/be/928f40f5767d17e10000000a1550b0/frameset.htm
    http://help.sap.com/saphelp_erp2005/helpdata/en/b6/5f58405e21ef6fe10000000a1550b0/frameset.htm
    See here an example of configuration:
    http://help.sap.com/bp_bblibrary/500/documentation/N70_BB_ConfigGuide_EN_DE.doc
    here a presentation on GL in mySAP ERP:
    http://www.asug.com/client_files/DocUploads/search/doc1194.ppt
    and here a thread about dataflow from R/3 to BCS:
    http://eai.ittoolbox.com/groups/technical-functional/sap-r3-sem/dataflow-from-r3-to-sem-bcs-950671
    Best regards,
    Eugene

  • Source data for Record Group

    Hello,
    I am very new to Oracle Forms and have been tasked with pointing some forms we have to a new server and adding a couple of columns to some areas. Everything was going fine until I got to the point where I have to add a new column to an area on a form. The forms point to the new tables and the searches are working, or at least seem to be. How can I tell the data source for a record group? I checked the properties of the record group (RECORD_STATISTICS) that populates a certain area on the form: it has Query selected as the record group type, but no query is showing. I added the needed column to the Column Specifications list, but it does not show up when I run the form. There is a spot for it, because the extra hyphen is there.
    Here is the code that populates the fields on the form. The field I added is the AHS_SITE column. As mentioned earlier, I added that field to the RECORD_STATISTICS record group, as well as to all the procedures I can find, but I am missing something.
     DECLARE
       htree             ITEM;
       num_selected      NUMBER;
       current_node      FTREE.NODE;
       v_note_value      NUMBER;
       v_node_depth      NUMBER;
       total_rows        NUMBER;
       group_id          RecordGroup;
       v_selection_count NUMBER;
     BEGIN
       -- Find the tree itself.
       htree := Find_Item('BLOCK_STATISTICS_TREE.TREE_ITEM_STAT');
       v_selection_count := Ftree.Get_Tree_Property(htree, Ftree.SELECTION_COUNT);
       IF v_selection_count > 0 THEN
         v_note_value := Ftree.Get_Tree_Node_Property(htree, :SYSTEM.TRIGGER_NODE, Ftree.NODE_VALUE);
         IF v_note_value IS NOT NULL THEN
           group_id     := Find_Group('RECORD_STATISTICS');
           total_rows   := Get_Group_Row_Count(group_id);
           v_node_depth := TO_NUMBER(Get_Group_Number_Cell('RECORD_STATISTICS.NODE_DEPTH', v_note_value));
           -- :BLOCK_BUDGET_PARAMETER.DI_SELECTED2 := v_node_depth;
           GO_BLOCK('BLOCK_STATISTICS_DETAIL');
           CLEAR_BLOCK;
           FOR i IN v_note_value .. total_rows LOOP
             IF v_node_depth = 4 THEN
               :BLOCK_STATISTICS_DETAIL.DI_TEMPLATE_SEQ := Get_Group_Number_Cell('RECORD_STATISTICS.NODE_SEQ', v_note_value);
               :BLOCK_STATISTICS_DETAIL.DI_DESCRIPTION  := Get_Group_Number_Cell('RECORD_STATISTICS.SITE', v_note_value)
                 || ' - ' || Get_Group_Char_Cell('RECORD_STATISTICS.AHS_SITE', v_note_value)
                 || ' - ' || Get_Group_Char_Cell('RECORD_STATISTICS.PRIMARY_CD', v_note_value)
                 || ' - ' || Get_Group_Char_Cell('RECORD_STATISTICS.SECONDARY_CD', v_note_value)
                 || ' - ' || Get_Group_Char_Cell('RECORD_STATISTICS.SECONDARY_CD_DESC', v_note_value);
               :BLOCK_STATISTICS_DETAIL.DI_YR_AND_MNTH   := Get_Group_Number_Cell('RECORD_STATISTICS.YR_AND_MNTH', v_note_value);
               :BLOCK_STATISTICS_DETAIL.TI_QUANTITY_STAT := Get_Group_Number_Cell('RECORD_STATISTICS.QUANTITY', v_note_value);
             ELSE
               IF Get_Group_Char_Cell('RECORD_STATISTICS.LEAF_NODE', i) = 'Y'
                  AND v_node_depth < TO_NUMBER(Get_Group_Number_Cell('RECORD_STATISTICS.NODE_DEPTH', i)) THEN
                 :BLOCK_STATISTICS_DETAIL.DI_TEMPLATE_SEQ := Get_Group_Number_Cell('RECORD_STATISTICS.NODE_SEQ', i);
                 :BLOCK_STATISTICS_DETAIL.DI_DESCRIPTION  := Get_Group_Number_Cell('RECORD_STATISTICS.SITE', i)
                   || ' - ' || Get_Group_Char_Cell('RECORD_STATISTICS.AHS_SITE', i)
                   || ' - ' || Get_Group_Char_Cell('RECORD_STATISTICS.PRIMARY_CD', i)
                   || ' - ' || Get_Group_Char_Cell('RECORD_STATISTICS.SECONDARY_CD', i)
                   || ' - ' || Get_Group_Char_Cell('RECORD_STATISTICS.SECONDARY_CD_DESC', i);
                 :BLOCK_STATISTICS_DETAIL.DI_YR_AND_MNTH   := Get_Group_Number_Cell('RECORD_STATISTICS.YR_AND_MNTH', i);
                 :BLOCK_STATISTICS_DETAIL.TI_QUANTITY_STAT := Get_Group_Number_Cell('RECORD_STATISTICS.QUANTITY', i);
                 Next_Record;
               ELSIF v_note_value <> i
                     AND v_node_depth = TO_NUMBER(Get_Group_Number_Cell('RECORD_STATISTICS.NODE_DEPTH', i)) THEN
                 EXIT;
               END IF;
             END IF;
           END LOOP;
           First_Record;
         END IF;
       END IF;
     END;
     Hope that made sense. I do not yet understand how data flows through Forms, or how to phrase my question in understandable terms. I do have some screenshots I could send to anyone willing to help.
    Thank you.

    Adding a column to the Column Specifications does nothing by itself.
    First of all, check the record group query in the record group properties:
    1) In the Forms Builder object tree, find that record group, then right-click > Property Palette.
    2) Look for the property (I just can't remember its exact name) where the SELECT query is specified.
    3) Add the column you need to the query. The column specification will refresh automatically.
    There is one more way to specify the query for a record group: look for calls to the POPULATE_GROUP_WITH_QUERY procedure in the form code.
    Forms 6i: menu Program > Find and Replace PL/SQL; Forms 10: Edit > Find and Replace PL/SQL. In the search field, type POPULATE_GROUP_WITH_QUERY, then check the results to see where your record group RECORD_STATISTICS is being populated programmatically. If no calls are found, the only data source is in the record group's properties.

  • Filter at Source adapter for XI mapping?

    Hi all,
    Is there any way we can set a filter at the source adapter or outbound interface for an XI integration? The reason: the source file contains 1M records, but the target system actually requires only 100K. We could save a lot of network bandwidth and processing time by not transferring and processing the large amount of data that the target system does not require.
    Thank you very much for the help and regards,
    Srinivas

    >
    Prateek Raj Srivastava wrote:
    > If these are text files, then you may use Recordset Per Message option and divide the number of records into 10 parts.
    >
    > Regards,
    > Prateek
    this would still consume a lot of processing time.
    Actually, I suggest restricting the messages at the source system itself. If the data is not used by any other system, you can request the source system to send only the relevant data and not the whole lot. This is ideally the better design for the architecture, with fewer headaches.
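     The filter-at-source idea can be sketched as a trivial pre-processing step on the sending side, so only relevant rows ever leave the source system. This is an illustration only: the column position and the "EU" filter value are made-up examples, not part of the original scenario.

```python
# Sketch: keep only the rows the target actually needs, before transfer,
# instead of shipping 1M records and discarding 900K at the target.
def filter_records(lines, keep_value, column=2, sep=","):
    """Yield only the delimited rows whose given column equals keep_value."""
    for line in lines:
        fields = line.rstrip("\n").split(sep)
        if len(fields) > column and fields[column] == keep_value:
            yield line

data = ["1,a,EU\n", "2,b,US\n", "3,c,EU\n"]
kept = list(filter_records(data, "EU"))
# kept -> ["1,a,EU\n", "3,c,EU\n"]
```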

  • Restricting source data for GL Analytics

    Hi All,
    I need to run the GL Analytics ETL process, but I want my warehouse tables to be populated with recent data only (e.g. data from 2007 onward), while my source system has data from 2003. Is there any way to do this, or a parameter to set in DAC?
    Thank you

    Hello,
    If it is a matter of authorization, Atif's answer is right.
    If it is a matter of validation:
    To restrict G/L account(s) to profit center(s), you need to use a GGB0 validation in accounting documents.
    Then you need to activate it through this path:
    SAP Customizing Implementation Guide - Financial Accounting (New) - Financial Accounting Global Settings (New) - Tools - Validation/Substitution - Validation in Accounting Documents.
    Note that the event is very important; you can set it at line item level.
    Regards,
    Edited by: Tarek Elkiki on Dec 11, 2011 10:51 AM

  • Custom Map in flash using XML data for dynamic map and point of intrest loading...

    It's been some time since I have used Flash for anything...
    I'm working on a little project to dynamically build a map and set points of interest on it. At this time I have the (MySQL) data being queried and formatted with PHP, which pushes the data to Flash as XML.
    In starting the project I'm a bit lost... I have my data and a good XML format, but I'm lost on parsing the data in Flash and assigning its values to movie clips or other objects.
    I've looked at the Loader component and the XML Connector component, and find I can't get them to work at all...
    My second thought was to create a single movie clip on the stage, give it an instance name of "Background", and have it load the URL of an image given in the attached XML doc - the "a_zone" node and the value of its "image" attribute. But this brings me back to square one of not quite understanding Flash and XML parsing. With this second idea I would use ActionScript to create a movie clip, set its X & Y coordinates, and load an image into it based on the XML attributes listed in each "Gatherable" node (one clip per node).
    Any suggestions, examples or related info are welcome...
    Thanks in advance!

    Okay, that really wasn't what I was looking for... but I did go back and RTM :-)
    Here's what I have... 1st frame:
    2nd layer: a movie clip with the instance name "currentPicture".
    The image loads into "currentPicture" from the URL given in the XML "a_zone" node's "image" attribute just fine...
    But it seems I'm not able to grab the attributes of each "Gatherable" node... am I missing something, or just not pointing to the right node?
    I keep getting:
    I keep getting:
    undefined
    undefined
    undefined
    Error opening URL
    "file:///C|/apache2triad/htdocs/gatherer/flash/undefined"
    Error opening URL
    "file:///C|/apache2triad/htdocs/gatherer/flash/undefined"
    Error opening URL
    "file:///C|/apache2triad/htdocs/gatherer/flash/undefined"
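     The "undefined" errors suggest the attribute lookups are missing the intended nodes. For comparison, here is the same attribute walk in Python's ElementTree. The XML shape (an "a_zone" root with an "image" attribute and "Gatherable" children carrying "image", "x" and "y" attributes) is assumed from the question; the real document may differ.

```python
# Sketch: read the background image from the <a_zone> root, then collect
# (image, x, y) from each <Gatherable> child node.
import xml.etree.ElementTree as ET

doc = """<a_zone image="bg.jpg">
  <Gatherable image="tree.png" x="10" y="20"/>
  <Gatherable image="rock.png" x="30" y="40"/>
</a_zone>"""

root = ET.fromstring(doc)
background = root.get("image")  # attribute of the root node
points = [(g.get("image"), int(g.get("x")), int(g.get("y")))
          for g in root.findall("Gatherable")]
# points -> [("tree.png", 10, 20), ("rock.png", 30, 40)]
```

     In ActionScript the equivalent failure mode is asking a node for an attribute it doesn't have, which yields undefined, exactly the symptom above.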

  • Source data for Goods Reciept ledger transactions

    Hi,
    I've been tasked with writing a custom report similar to transaction KSB1 (display actual cost line items for cost centers), but with vendor number and name added.
    I can write the ABAP but need some help identifying the source tables.
    The fields on the report below look like they come from the accounts payable ledger, but I'm not sure which table that is.  Any help greatly appreciated.
    Cost Center
    Cost Element
    Period
    Cost Element Name
    Document Type
    Document No.
    User
    Purchase Order Text
    Purchasing Document
    Document Header Text
    Value in Report Currency

    Hi,
    You will find this information in the MSEG table: line item details for material documents (header information is in MKPF).
    I think this is a better strategy than using the GL line items for the vendor.
    Good luck,
    Paul

  • SCCM 2012 report/data for importing software metering rules..

    Hi, 
    I have a customer who would like to start using software metering. The plan is to create the rules manually, since I guess it would be overkill and/or have a huge performance impact to enable it for everything...? We might be talking about 100-300 applications.
    To do it manually, a PowerShell script can be used to create the rules based on a txt file containing the following information:
    Productname, FileName, OriginalFileName, FileVersion
    The customer has then asked for a report/CSV file showing this information for all .exe files, so they can pick which ones they want metered. The amount of data will surely be huge and unmanageable.
    Does anyone have a good approach for this challenge? :-)
    Thanks in advance.
    Best regards
    Thomas

    The approach is to say NO: what they are asking for is huge, in the range of 100,000 different exes, and there is no way anyone will go through that list. Instead, look at the built-in report for the count of installed software products and sort from the largest count
    to the smallest. Then create SWM rules for those applications that have a cost and that not everyone has. In other words, why would you create an SWM rule for Word or Excel? But it does make sense to create a rule for Visio or AutoCAD.
    Also, why create a rule for two different versions of Visio? Do you need it? Why? Are you sure?
    Garth Jones | My blogs: Enhansoft and
    Old Blog site | Twitter:
    @GarthMJ
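     For the shortlist Garth suggests, the four-column input file the PowerShell script expects could be produced like this. The inventory rows below are invented sample data, and the column names follow the question.

```python
# Sketch: write the Productname/FileName/OriginalFileName/FileVersion
# input file for the rule-creation script from a small, curated list.
import csv
import io

inventory = [
    {"Productname": "Visio", "FileName": "visio.exe",
     "OriginalFileName": "VISIO.EXE", "FileVersion": "15.0"},
    {"Productname": "AutoCAD", "FileName": "acad.exe",
     "OriginalFileName": "ACAD.EXE", "FileVersion": "19.1"},
]

buf = io.StringIO()
writer = csv.DictWriter(
    buf, fieldnames=["Productname", "FileName", "OriginalFileName", "FileVersion"])
writer.writeheader()
writer.writerows(inventory)
output = buf.getvalue()  # CSV text ready to save as the script's input file
```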

  • Open Source solution for Map Display on Web

    Hi
    Does anybody know of an open source solution for displaying maps on the Web? Something like an open source equivalent of MapXtreme.
    Regards,
    Néstor Boscán

    There are all sorts of solutions for displaying maps on the Web, starting from JPG files and going up from there. Would you reject JPG because it isn't open source? Or did you really mean to ask for no-payment solutions?

  • How do i update & test import maps and syndication maps?

    HI,
    Can anyone guide me / let me know the steps: how do I update and test import maps and syndication maps during an upgrade in the development system?
    Regards,
    Harmony

    Hello Harmony,
    Different service packs of SAP MDM 7.1 support the same syndication and import map format.
    To update an import map, use Import Manager; to update a syndication map, use the Syndicator.
    If your syndication and import maps have been saved into the repository, just archive the repository in your source landscape and unarchive it in the destination.
    If your syndication and import maps have not been saved in the repository, export your maps to XML (syndication maps from the Syndicator, import maps from Import Manager) before migration, and import them into the new repository.
    Regarding updating (modifying) the repository structure: again, use Import Manager for import maps and the Syndicator for syndication maps.
    Another (very exotic) way is to make changes directly in the exported XML maps.
    That's all.
    Regards,
    Kanstantsin Chernichenka
