Transformation Mapping

Hi BPC Gurus,
I have a planning application whose assigned dimensions are Category, P_Acct, P_Datasource, P_Activity, STATS, Time, and Entity. STATS is a user-defined dimension I created, with Qty, Rat and Amt as its members, and I have also created an input template. Now, when I use DM to run the transformation for 0FIGL_C10, the mapping STATS=*NEWCOL(AMT) gives me an error.
I created the STATS dimension for maintaining quantity sold.
What should I specify in the mapping so that the STATS dimension is skipped when I run DM for 0FIGL_C10, since 0FIGL_C10 has no quantity maintained in it?
Regards,
Kumar MG

Hi,
I am guessing the Amt member has a dimension formula assigned to it.
Every dimension must have a member in every transaction record, so you can't leave it blank even if the source has no data for it.
You can try mapping it to a dummy member of the STATS dimension.
Hope this helps.
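For illustration, here is a minimal sketch of the *MAPPING section of the Data Manager transformation file, assuming a base-level dummy member has been added to the STATS dimension (DUMMY is a hypothetical member name; the other dimensions keep their normal source mappings):

    *MAPPING
    STATS=*NEWCOL(DUMMY)

Every record loaded from 0FIGL_C10 is then posted to the DUMMY member of STATS, and the formula member Amt is never written to directly.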

Similar Messages

  • Transformation Mapping Problem.

    Dear BI Experts,
    In the 0FIAR_O03 to 0FIAR_C03 transformation, mapping 0DOC_DATE to 0CALMONTH gives the error below.
    Rule 58 (target field: 0CALMONTH, group: Standard Group): Info Object properties 0CALMONTH
    Message no. RSTRAN525
    Diagnosis
    The properties of the Info Object selected, 0CALMONTH, do not match the properties of the available source field.
    But with the 3.5 update rule we do not get any error. How can I overcome this transformation problem?
    In the 0FIAP_O03 to 0FIAP_C03 transformation, the same 0DOC_DATE to 0CALMONTH mapping was created successfully,
    and it is loading data as well.
    Why is there a problem only in the 0FIAR transformation?
    Please suggest a solution.
    Regards
    Ramu

    Hi
    Is there any routine involved in that transformation? It may be preventing you from seeing the actual data. Also, when you migrate from a 3.x flow to a BI 7.0 flow, you can sometimes face transformation problems. Ask your ABAP developer to debug the transformation so that you can find exactly where the problem lies.
    Regards
    Madan Mohan
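    One possible cause worth checking: 0DOC_DATE is of type DATS (length 8), while 0CALMONTH is NUMC length 6, and a property mismatch of that kind is exactly what message RSTRAN525 complains about. A field routine can do the conversion explicitly. Below is a minimal sketch of the routine body only (the surrounding method frame is generated by BW, and the source field name doc_date is an assumption):

    " 0DOC_DATE arrives as YYYYMMDD; 0CALMONTH expects YYYYMM,
    " so take the first six characters of the date.
    RESULT = SOURCE_FIELDS-doc_date(6).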

  • Dynamically provide Transformation Map to ESB

    Hi
    My requirement is to create a "black box" module which takes in
    1) custom mapping (XSL file)
    2) DB Key (bind value)
    3) DB Query
    Can we create a generic ESB service which takes all these values and returns the output XML after applying the transformation?
    The whole module acts as a web service to extract the data from DB in custom format.
    My queries are:
    ---> Can we build a routing service with the transformation map file value provided dynamically?
    ---> Can we provide the DB adapter query and bind value dynamically in the DB adapter?
    Are there any other alternatives to explore?
    Thanks in advance
    Pradeep

    I don't know the data needed at design time; I only know it at runtime. It's HR-related. Depending on customizing, one infotype can have several sub-infotypes. Today we have 5 sub-infotypes; next year maybe 8. I don't want to hardcode this.
    I know I can dynamically add nodes to the context, but I don't know whether dynamically added nodes can be mapped to the view controller context.
    As a workaround I could add those nodes to the view controller context only, but the data represented by the nodes is read in the component controller and might be used in other views too.

  • BPM: Message Merge - Transformation Mapping Problem

    Hi,
    I tried an example of time-bound message merging (or rather, adding the items in the messages).
    I am using a single data type/message type.
    I was able to test the message mapping/interface mapping by changing the source occurrence to 0..unbounded; the target has the same message type.
    The test was successful, so I went ahead and created the scenario. I used a file adapter for picking up a file, and the message monitor shows it is picked up and sent to the BPM.
    But the BPM part failed. I checked in the BPE monitor; it just says the mapping failed (transformation step). Prior to that there is a receive step and a container operation step which I use to append the message. These are inside a block, and this block has an infinite loop for collecting the messages. There is an exception thrower (control step) set for 2 minutes, which is handled by an exception handler. I guess my file was collected and sent to the transformation step after this, but it never seems to be appended, since the two files show up as two separate error messages in the transformation rather than as a single one...
    Can somebody tell me what the problem could be, and where to look for the file?
    Thanks

    I am getting more and more sure that the problem is at the block entry only, because I checked with direct entry to the loop, using a counter as the loop breaker: it entered the loop, added the lines to the message (producing a message with multiple lines), and once it hit the counter it came out, did the transformation successfully, and sent it to the target system.
    When I add the block, it fails right at the block entry for the first message; after that, all the messages show the green flag, and clicking on PE shows an empty queue...
    I guess the only step happening before the block is the correlation key creation; I'm not sure if this is causing the problems.
    Now for the BPM steps:
    1. I created the correlation key.
    2. I put in the block; on the block I set the correlation key and the exception name.
    3. I put in the exception branch and the deadline branch.
    4. On the exception branch I put the name of the exception to be handled.
    5. On the deadline branch I put a 2-minute duration.
    6. Within the deadline branch I put the control step which throws the exception.
    7. I added a loop to the block with the condition 1 = 1; in it there is a receive step.
    8. After that there is a container operation which adds the message to the list.
    9. Outside the block there are the transformation and send steps.
    10. The block is in default mode.
    I tried creating the scenario completely again and again, with different data types etc., to avoid the cache problem and the locked-workflow-item problem,
    but no luck.
    Thanks

  • OSB Transformation Mapping values missing

    Hi,
    I'm new to OSB. We have a requirement structured as a 3-layered architecture consisting of BPEL (1), OSB (2) and OSB (3).
    I created a business service from my client's WSDL in OSB (3), then a proxy service based on that business service with local transport, which in turn is called by an HTTP proxy service in OSB (2) whose type is the OSB (3) proxy service.
    My question is this: when I call the transformations (XSL) in the last layer, OSB (3), the mapping is done properly (values are mapped to the target). But as per the requirement we have to call the mapping and transformations in OSB (2), and when I call them there, only the parameters set with text values appear in the target; the rest of the mapped values (source-to-target mappings) come out NULL.
    Please help us resolve this issue soon, since it is a very urgent requirement for our project development.
    Thanks in advance.

    The system hardly proposes any rules after migration; you have to map them manually and migrate the routines, if any, basing them on ABAP OO.
    As it is a newer version, it may not be consistent for all objects.
    What SP are you running?

  • Seeking a query to display a report for transformation mappings

    Hi,
    There is a requirement to display a report of the mappings between ODSs and cubes in the following structure:
    Target field | Target field data type | Target field description | Source field
    This is not the final requirement (it does not have to be presented to the client), but it will lead me to the end requirement. I'm quite flexible with the input for executing the report, as I have the IDs of both the source and target ODS and cube, and also the transformation ID for that matter. Joining tables is involved, but as I'm a rookie in this technology, I have limited knowledge of the table names and of ABAP. Kindly help me build this report.
    Regards,
    learner

    Hi Veerendra,
    Thanks for responding. I could get the InfoProvider fields from the table RSTRANFIELD. The description of the corresponding InfoObject can be taken from the table RSDIOBJT, but the problem is with the data type: I could not find a table that gives the data type of an InfoObject. The only solution I see is reading the text table of an InfoObject, i.e. /BI0/Txxxxx where xxxxx is the InfoObject name. I have identified the tables, but I'm finding it difficult to join them.
    Regards,
    Learner
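    As a starting point for the join, here is a hedged sketch of a small report. The RSTRANFIELD column names (TRANID, FIELDNM) are assumptions, so verify the actual columns in SE11 before using this; RSDIOBJT follows the usual BW text-table pattern (IOBJNM/OBJVERS/LANGU/TXTLG). The data type itself may be held in the InfoObject directory table RSDIOBJ, which could be joined in the same way.

    REPORT z_tran_mapping_sketch.

    " Hedged sketch: the RSTRANFIELD column names are assumptions.
    TYPES: BEGIN OF ty_map,
             fieldnm(30) TYPE c,   " target field / InfoObject name
             txtlg(60)   TYPE c,   " description from RSDIOBJT
           END OF ty_map.

    DATA: lt_map TYPE STANDARD TABLE OF ty_map,
          ls_map TYPE ty_map.

    PARAMETERS p_tranid(25) TYPE c.   " transformation ID

    SELECT f~fieldnm t~txtlg
      FROM rstranfield AS f
      INNER JOIN rsdiobjt AS t ON t~iobjnm = f~fieldnm
      INTO TABLE lt_map
      WHERE f~tranid  = p_tranid
        AND t~objvers = 'A'
        AND t~langu   = sy-langu.

    LOOP AT lt_map INTO ls_map.
      WRITE: / ls_map-fieldnm, ls_map-txtlg.
    ENDLOOP.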

  • Strange problem with Transformations Mapping

    Hi All,
    I am facing a strange problem with data loading from a DataSource into an ODS.
    I have mapped a field, Number of Days, which is the difference between the posting date and the payment date (a field in ECC, length CHAR 5), to the characteristic zdate_1, length 5, without alpha conversion. (The date difference is calculated in ECC.)
    When I load data in production, it does not fetch the records from the PSA (values exist for this field in the PSA).
    I have 90 fields in the DataSource which bring data to the PSA.
    The same setup works perfectly in the quality and development servers.
    Where could the problem be?
    Please suggest a solution.
    Regards
    Joga

    Any ideas?

  • Documenting existing Transformations Mapping

    Hi All,
    Can anyone tell me how to copy and paste all existing transformations along with their transformation mappings?
    1. I could not find them anywhere in the metadata directory.
    2. I could only save them as JPG.
    Is there any other way to get this done?
    Thanks,
    Hari

    That's a tough job.
    I wonder whether SAP even provides a table or any other such feature, because they haven't delivered any standard content transformations in the first place - why would a table exist for them...

  • Key Field missing in transformation

    Hi
    I am trying to use a BI 7.0 transformation (a 1:1 mapping, basically) to send data from one DSO to another DSO.
    Source: DSO1
    Key fields: Account, Itemid, Position number
    Target: DSO2
    Key fields: Account, Itemid, Position number
    However, when I create the transformation I don't see "Itemid" on the source side of the transformation mapping. The other two key fields are present.
    Please help.
    -Anurag

    Hi, Anurag.
    I just posted a question to the forum, because we are implementing the same architecture, and we are wondering how deltas will work with it. Are you using deltas?
    The text of my recent post is below.
    We are loading from R/3 into DSO1, and then from DSO1 into DSO2. The R/3 extractor that loads DSO1 is delta-enabled. We are not sure, however, whether a delta mechanism governs the load from DSO1 into DSO2. The DTP says its extraction mode is delta, but does that mean that if a row in DSO1 changes, it will negate the key figures on the original row and send a new row the way R/3 does?
    For example, suppose the R/3 extractor sends us Row 1.
    Row 1 has a key figure with a value of $100.
    Row 1 gets changed in R/3, and the new value is $125.
    The R/3 delta mechanism takes care of this by negating the key figure on the appropriate row and sending us a correcting row. For example, the R/3 extractor will send us:
    Row 1  $100
    Row 1 -$100
    Row 1  $125
    So the net value is correct, i.e. 100 - 100 + 125 = 125.
    When we load from DSO1 into DSO2, however, do you know which rows will load into DSO2? Is BW "smart" enough to do this type of negation?
    Thanks!
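    For what it's worth, a standard DSO keeps a change log for exactly this purpose: when activation changes an active record, the change log stores a before image with negated key figures and an after image with the new values, and a delta DTP from DSO1 reads that change log. In the example above, the delta load into DSO2 should therefore carry something like:
    Row 1 (before image)  -$100
    Row 1 (after image)   +$125
    With additive key figures in DSO2, that adds -100 + 125 = +25 to the $100 already loaded, giving the correct $125, assuming the DTP extracts from the change log, which is the default for delta from a standard DSO.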

  • Transformation fields getting misplaced

    I am using a DSO with 66 data fields and 2 key fields. When I create the transformation I get all the fields, but when I save the transformation I lose some of them. What could the error be? When I look at the detail view, I see that the target fields whose transformation is missing are mapped to unwanted fields (i.e. one source field mapped to n target fields). I recreated the transformation but I still get the same error.
    Please help me.

    Hi,
    My problem has been solved: I upgraded my SP level from 17 to 19, and that resolved it.
    Be sure to upgrade to the latest SP patch if:
    -> your transformation maps one source field to many target fields, or
    -> activation keeps showing "in process" even after you activate.
    Otherwise, patch 17 is enough.

  • ODS Key Fields allows blank values in Transformation

    Dear Experts,
    I have two layers:
    1. Data warehouse layer (its purpose is to represent the backend data exactly)
    2. Consolidation layer
    Scenario 1:
    In the data warehouse layer, the fiscal year variant is a key field in the ODS, and there are two other key fields; together they form a composite key.
    When I load data from the PSA to this ODS, one record contains a blank value for the fiscal year variant. It loads successfully into the data warehouse ODS.
    Scenario 2:
    In the consolidation layer, I have 4 key fields, of which the fiscal year variant is one. Now when I load the data from the data warehouse ODS to the consolidated ODS, it throws an error saying:
    Diagnosis
         An exception fiscvarnt_missing was raised while executing
         module RST_TOBJ_TO_DERIVED_TOBJ.
    System Response
         Processing of the corresponding record has been terminated.
    The transformation mapping for the fiscal year variant in both scenarios is a 1:1 mapping with the rule type 'Time Characteristic'.
    My questions are:
    1. Why the different behaviour between scenario 1 and scenario 2?
    2. What is the solution to this issue?
    Thanks
    Jain

    Thanks J.S.
    Yes, the format selected for the field in the DataSource is 'Internal'.
    When loading data from R/3 -> PSA -> ODSX (data warehouse layer), I have no problems even when the field is blank.
    When loading from ODSX -> ODSY (consolidated layer), the empty (blank) field on those records throws the error.
    Any ideas?
    Jain
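    If the blank value is legitimate in the source, one common workaround is to change the fiscal year variant rule in the second transformation from rule type 'Time Characteristic' to a routine that supplies a default, so that the time derivation no longer raises fiscvarnt_missing. A minimal sketch of the routine body only (the method frame is generated by BW; the variant 'K4' and the source field name fiscvarnt are assumptions, so use whatever is configured in your system):

    " Default a blank fiscal year variant before time derivation.
    IF SOURCE_FIELDS-fiscvarnt IS INITIAL.
      RESULT = 'K4'.   " assumed default fiscal year variant
    ELSE.
      RESULT = SOURCE_FIELDS-fiscvarnt.
    ENDIF.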

  • Flat File mapping issue

    Hello All,
    I am trying to do an extract using the flat file method in BI 7.0. I have some 150 fields in my CSV file, but I only want about 10 of them, which are scattered around the CSV file. When I click Read Preview Data in the Preview tab I see incorrect data there, and all of it under one field, basically everything in one column, even though in the Extraction tab I set the data separator to ; (and also tried checking the HEX box) and the escape sign to " (tried HEX for this as well). For rows to ignore I have 1.
    One thing I would like to know is how the BI InfoObject will know the position of the flat file field in the CSV file and where it will be mapped. I know it can be mapped in the transformation, but that is from the flat file DataSource; I am asking about the CSV file itself.
    Please help me out and tell me what I am doing incorrectly.
    Thanks for your help. Points will be assigned.

    Hi,
    Use , and ; as the separator and escape signs.
    The system takes care of it when you specify the path and name of the file and the format as CSV.
    The system always checks the one-to-one mapping of the flat file fields with the InfoObjects in the DataSource.
    Your options:
    1. Arrange the necessary fields in the flat file so that they map exactly to the InfoObjects, then start loading.
    2. Keep the file as it is, load it with the scattered fields, and map only the required fields in the transformation.
    The second option consumes more memory unnecessarily.
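    To make the settings concrete, here is a minimal sketch of a CSV layout matching the settings described above (semicolon as data separator, double quote as escape sign, one header row to ignore); the field names are made up:

    DOCNR;PLANT;QUANTITY
    "4711";"1000";25
    "4712";"2000";30

    If the preview still shows everything in one column, the file most likely uses a different separator than the one entered in the Extraction tab.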
    For BI 7.0, the basic steps to load data from flat (CSV) files are the following step-by-step directions:
    Uploading of master data
    Log on to your SAP
    Transaction code RSA1 takes you to Modelling.
    1. Creation of Info Objects
    • In left panel select info object
    • Create info area
    • Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
    • Create new characteristics and key figures under respective catalogs according to the project requirement
    • Create required info objects and Activate.
    2. Creation of Data Source
    • In the left panel select data sources
    • Create application component(AC)
    • Right click AC and create datasource
    • Specify data source name, source system, and data type ( master data attributes, text, hierarchies)
    • In general tab give short, medium, and long description.
    • In the Extraction tab specify the file path, header rows to be ignored, data format (CSV) and data separator (,).
    • In proposal tab load example data and verify it.
    • In the Fields tab you can give the technical names of the InfoObjects in the template; then you do not have to map them during the transformation, since the server will map them automatically. If you do not map them in the Fields tab, you have to map them manually in the transformation in the InfoProviders.
    • Activate data source and read preview data under preview tab.
    • Create an InfoPackage by right-clicking the DataSource, and in the Schedule tab click Start to load data to the PSA. (Make sure to close the flat file during loading.)
    3. Creation of data targets
    • In left panel select info provider
    • Select created info area and right click to select Insert Characteristics as info provider
    • Select required info object ( Ex : Employee ID)
    • Under that info object select attributes
    • Right click on attributes and select create transformation.
    • In the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
    • Activate created transformation
    • Create Data transfer process (DTP) by right clicking the master data attributes
    • In extraction tab specify extraction mode ( full)
    • In update tab specify error handling ( request green)
    • Activate DTP and in execute tab click execute button to load data in data targets.
    4. Monitor
    Right-click the data targets and select Manage; in the Contents tab, select Contents to view the loaded data. Alternatively, the monitor icon can be used.
    BW 7.0
    Uploading of Transaction data
    Log on to your SAP
    Transaction code RSA1 takes you to Modelling.
    5. Creation of Info Objects
    • In left panel select info object
    • Create info area
    • Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
    • Create new characteristics and key figures under respective catalogs according to the project requirement
    • Create required info objects and Activate.
    6. Creation of Data Source
    • In the left panel select data sources
    • Create application component(AC)
    • Right click AC and create datasource
    • Specify data source name, source system, and data type ( Transaction data )
    • In general tab give short, medium, and long description.
    • In the Extraction tab specify the file path, header rows to be ignored, data format (CSV) and data separator (,).
    • In proposal tab load example data and verify it.
    • In the Fields tab you can give the technical names of the InfoObjects in the template; then you do not have to map them during the transformation, since the server will map them automatically. If you do not map them in the Fields tab, you have to map them manually in the transformation in the InfoProviders.
    • Activate data source and read preview data under preview tab.
    • Create an InfoPackage by right-clicking the DataSource, and in the Schedule tab click Start to load data to the PSA. (Make sure to close the flat file during loading.)
    7. Creation of data targets
    • In left panel select info provider
    • Select created info area and right click to create ODS( Data store object ) or Cube.
    • Specify a name for the ODS or cube and click Create.
    • From the template window select the required characteristics and key figures and drag and drop it into the DATA FIELD and KEY FIELDS
    • Click Activate.
    • Right click on ODS or Cube and select create transformation.
    • In the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
    • Activate created transformation
    • Create a data transfer process (DTP) by right-clicking the data target.
    • In extraction tab specify extraction mode ( full)
    • In update tab specify error handling ( request green)
    • Activate DTP and in execute tab click execute button to load data in data targets.
    8. Monitor
    Right-click the data targets and select Manage; in the Contents tab, select Contents to view the loaded data. There are two tables in an ODS, the new table and the active table; to move data from the new table to the active table, you have to activate the request after the load. Alternatively, the monitor icon can be used.
    Ramesh

  • Transformations in BI 7

    Hi guys,
    What is the difference between transformations in BI 7 and the transfer rules/update rules in BW 3.5?

    You would create an InfoPackage on the DataSource to load up to the PSA, and DTPs to load to any number of cubes/ODSs.
    The flow would be something like this:
    DataSource - InfoPackage - PSA - DTP (transformation = mapping) - ODS and/or cube
    What is the difference between an InfoPackage and a data mart in BI 7?
    An InfoPackage is no different from what it was in 3.x, except that we no longer use it to load data providers. And the data mart is obsolete as of 7.0, as you can do the same thing with DTPs.

  • Upload information about mappings from a BW/BI system

    Hello Gurus,
    How can I upload information about mappings from, say, a BW/BI system? I want to use this uploaded mapping information to update my mapping sheet document.
    Many thanks

    Hi,
    Double-click the transformation mapping -> go to Extras -> Tabular Overview -> right-click -> Export to Microsoft Excel.
    You can also save the transformation as a JPG file (click the button on the left -> Save as JPG) and attach that to the mapping document.
    Hope this help!

  • Mapping in InterConnect between different Business Objects

    I want to know how to do transformation and mapping between different business objects in InterConnect.
    We always have very complex SQL when we do integration with Oracle InterConnect. We use the DB adapter or JDBC adapter, but the complex SQL has to be executed in the source DB or the destination DB, which can put a lot of pressure on them. I wonder whether we can use different business objects and do the mappings in InterConnect, so that the pressure is on the InterConnect server instead, just like with ETL tools. But I have found that InterConnect can only do transformation and mapping within one business object. What can I do? Has anyone else met this problem? Thanks for the discussion.

    For me, Business Objects are logical groupings of business processes. For example, we have a Business Object called "Maintain_Employees". Under this we have 1 Procedure (Create_Employee) and 2 Events (Update_Employee and Delete_Employee).
    We have 1 Oracle system interfacing with 23 other legacy systems. Some of these legacy systems will be using this "Maintain_Employees" Business Object (Common View), and our main transformations will be between the Common View and the legacy Application Views.
    We are using a number of techniques to assist in "validating" data in the InterConnect. The main ones are using 'Cross Reference Tables (XREF)' and 'DatabaseOperation' transformations. By using 'Content Based Routing' we are able to send the right message to the right legacy system, and therefore do the right transformation/validation on the message payload. However, this is only a small part of a complex puzzle.
    I also have the "problem" of having "very complex SQL" on our Oracle system too. This is not unusual when using the InterConnect.
    To my mind, the InterConnect does 2 main operations. Firstly, it performs some message transformation (mapping), and secondly, it acts as a transportation engine (routing) using the adapters.
    The remainder of the effort required to create or consume the message resides with the applications themselves. Whether it is parsing an XML CLOB payload, inserting data into staging tables, writing to log files, pre-processing data, calling APIs or something else, your application-side programming and processing overhead can get large.
    The trade-off is to ask the question: do I want to be able to track and manage messages from start to finish in high detail? Or can I trust that all message payload data will be consumed with no additional processing on the application side?
    My experience has shown that the bottleneck is always at the Application side, and almost never in the InterConnect.
    The short answer to your first question is "You are right. Mappings can take place only between Application Views and Common Views only - not between Business Objects.".
    To answer your second question "Probably everyone reading this forum has this problem. The intelligence that is able to really interpret message data, validate it and process it is only found in the Application, not the InterConnect. You could, however, use the Workflow engine within OAI in order to provide additional pre-validation, human interaction and logic, but this too could be complex."
    At my current client, we are architecting an Application OAI Message handling schema. This will contain staging tables, pre-processing tables, "OAI" wrapper PL/SQL scripts, "APPS" wrapper PL/SQL scripts and Message Logging and Exception tables. Ours will be a complex set of PL/SQL processes too.
    I hope this helps, just in letting you know that you are not alone with this problem.
    I wonder if anyone else would like to share how they have architected their InterConnect and Application side mapping and transformation solutions.
