Performance FDM mapping
Hi Guys,
My FDM app has, in Map Options for a dimension, a rule whose type is "In", with these settings:
Rule Name: RNOME
Rule Desc:
Rule Definition:[None]
Target Custom 2:#SCRIPT
Script:
If RES.PlngCatKey = "14" Then ' average balances (Saldos medios) scenario
    If varValues(19) = "SMM" Then ' look at C1; if it is SMM, put SMM in C2
        Result = "SMM"
    Else
        Result = "[None]"
    End If
Else
    Result = "[None]"
End If
Can this cause issues or latency when I import or validate?
Thanks for any help!
Regards
Matteo
Yes, this dimension is one of those causing the issue. I'm testing your advice; meanwhile, the Import Performance Analysis shows this:
Process Description | Duration | I/O Source | Event Information | Date | Start Time | End Time
Import: Post Work Data | 0 | tDataMapSeg3 | Post Map Work Data to Main Map Segment [Records=4818] | 11/06/2012 | 9:31:08 AM | 9:31:08 AM
Import: Post Work Data | 7 | tDataSeg3 | Post Work Data to Main Data Segment [Records=21793] | 11/06/2012 | 9:32:33 AM | 9:32:40 AM
Import: Drop Work Table | 5 | tWbk0883870094684335 | Drop Work Table | 11/06/2012 | 9:34:01 AM | 9:34:06 AM
What is happening between "Import: Post Work Data" and "Import: Drop Work Table"? There are gaps in the timeline!
The first "Import: Post Work Data" ends at 9:31:08 AM and the second "Post Work Data" starts at 9:32:33 AM --> a gap of 1:25.
The second "Import: Post Work Data" ends at 9:32:40 AM and "Import: Drop Work Table" starts at 9:34:01 AM --> a gap of 1:21.
Many Thanks !!
Matteo
Similar Messages
-
Hi, all of you
I'm thinking about how to connect Oracle EBS 12.1.1 and Oracle Hyperion Financial Management, Fusion Edition 11.1.1.3.0 with ERP Integrator.
In the introduction section in ERPI admin guide, ERPI is mentioned as a module of FDM.
However, from what I read in the later sections of the guide, FDM is needed only if the user wants to use "FDM mapping."
Is my understanding correct that EBS and HFM can be connected with ERPI alone (without FDM)?
Also, if you know the advantages of using FDM mapping (I couldn't really understand them), please tell me.
Thanks in advance
Yoshiki
FDM is not required.
The same type of mappings can be done in ERPI and FDM. For example, three accounts in the G/L for coffee, soda, and snacks map to one account in HFM.
Using ERPI you will need more in the way of IT skill sets, whereas FDM is built more for the finance user to maintain.
Not that it is that difficult, but you would buy FDM mainly because the internal controls it provides would cost you a lot more to script, test, validate, and implement outside of this packaged solution. -
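The many-to-one mapping described above can be sketched as a simple lookup. This is an illustrative sketch only (Python, with hypothetical G/L account names); in ERPI/FDM the mapping lives in mapping tables, not code.

```python
# Illustrative sketch: several G/L source accounts mapping to a single
# HFM target account. The account names here are hypothetical examples.
gl_to_hfm = {
    "6001-Coffee": "OfficeSupplies",
    "6002-Soda": "OfficeSupplies",
    "6003-Snacks": "OfficeSupplies",
}

def map_account(gl_account: str) -> str:
    """Return the HFM target member for a G/L source account."""
    return gl_to_hfm[gl_account]

print(map_account("6002-Soda"))  # OfficeSupplies
```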
FDM Mapping tables within EPMA
Hi All
Can anybody tell me whether EPMA can hold the FDM mapping tables?
Also, if we create a new member in the EPMA library, can we set up a prompt that asks the user to update the mapping table with the new member?
Thanks
Hi
Thanks for the reply
Do you know if Kalido will be able to hold an FDM mapping table, and whether this could feed into FDM?
Thanks again -
FDM Mapping SPLIT/TRIM/JOIN fields
Hi All
I am using FDM to transform a flat file into a new data file. I am a newbie in FDM.
Can anybody tell me the syntax for trimming/splitting the source field into the target file in an FDM mapping?
For example:
In my source, the field 'Product' has the string "uranium FGIM (10010)".
But in the target file, I need only the product code '10010'. How do I trim this field to get the desired output file?
Kindly let me know the syntax of the mapping.
Thank you,
JK
You can either do this as part of the mapping process, or you can use an import script to parse out the desired value during the import.
If you do it in the mapping, simply use your original full product string as the source and set the trimmed value as the target, i.e. using explicit maps.
In the import script, use logic similar to your example to parse out the portion of the original product string you wish to pass as the source value. However, you must ensure this logic is generic and works consistently across all product strings.
From your criteria, I imagine the mapping approach would be the simplest and easiest-to-manage solution. -
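In FDM the import script would be written in VBScript, but the parsing logic can be sketched in Python. The function name and the "code is in the final parentheses" assumption are mine, not from the thread:

```python
import re

def extract_product_code(product: str) -> str:
    """Pull the numeric code out of strings like 'uranium FGIM (10010)'.

    Assumes the code is always the digits inside the final pair of
    parentheses; adjust the pattern if your product strings differ.
    """
    match = re.search(r"\((\d+)\)\s*$", product)
    if match is None:
        raise ValueError(f"no product code found in {product!r}")
    return match.group(1)

print(extract_product_code("uranium FGIM (10010)"))  # 10010
```

The same regex idea carries over to a VBScript import script, as long as the pattern holds for every product string in the file.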
FDM Mapping Maximum number of records per Dimension
All, I have a unit that currently has over 2800 records for their account dimension. I can download all the records associated with the maps, but I can't see all the records in the Map Options window. It seems to stop at Page 20.
Is there a limit to the number of records FDM can view in the Map Options window?
Is there a limit to how many records a dimension can have?
Thank you for your help.
Thanks for the response, Mehmet. I did as you recommended and confirmed the count in tDataMap, as well as the same count downloaded to Excel. It is literally that the FDM view will not show all the records. I have downloaded them all, cleaned any questionable characters, re-uploaded, and again checked tDataMap; the counts are good, I'm just not able to see the records in the FDM map window. Very strange...
All Thoughts are greatly appreciated. Thank you -
Hi,
We have a requirement wherein we need to export an FDM map to a .txt file.
We have explicit mappings as well as a like map, which is '*' mapped to 'None'.
The issue is that when we export maps for a location, we need the explicit maps as well as the source members that are mapped to 'None', instead of the '*' entry.
E.g. the original Product map will be
src tar
1001 100100
1002 100200
1003 100300
* None
suppose if my data file has 5 records
1001
1002
1003
1004
1005
As per defined mappings, my final product mapping would be
src tar
1001 100100
1002 100200
1003 100300
1004 None
1005 None
I need the above records to be loaded into a text file. (Basically, * in the original map should be replaced with the actual members, based on the data file.)
Is there any way I can get these records in FDM, or do I need to write a script to achieve this? If so, any suggestions?
This is a little urgent and any suggestions would really help me.
Thanks
Edited by: 995155 on May 8, 2013 1:02 PM
Edited by: 995155 on May 8, 2013 1:04 PM
The output will be completely dependent on the last source file imported to the location in question. You would have to source the data from the tDataSeg table associated with the location. You would then have to write a custom script that queries this table to retrieve the source and target values for the period and dimension in question and writes them to a text file.
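The core of such a script can be sketched as follows (Python, purely for illustration; in a real FDM script you would query tDataSeg via SQL rather than hard-code the members, and the member values here come from the thread's example):

```python
# Sketch (Python, illustrative data) of resolving the '*' like-map into
# explicit rows based on the source members actually in the data file,
# ready to be written out as a text file.
explicit_map = {"1001": "100100", "1002": "100200", "1003": "100300"}
wildcard_target = "None"  # the '* -> None' like rule

source_members = ["1001", "1002", "1003", "1004", "1005"]  # from the data file

# Each source member resolves to its explicit target if one exists,
# otherwise to the wildcard's target.
resolved = [(src, explicit_map.get(src, wildcard_target)) for src in source_members]

lines = [f"{src},{tar}" for src, tar in resolved]
print("\n".join(lines))
```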
-
Does anyone know of any configuration parameters that are really useful when attempting to tweak the performance of the maps in OBIEE 11.1.1.5? Although I'm the only user on a virtual box with 4GB of RAM and 2 CPUs, map rendering is painfully slow. (XML posted below, working against the "A - Sample Sales" subject area). All other OBIEE performance is quite snappy. Maps ... not so much.
Any ideas?
<saw:report xmlns:saw="com.siebel.analytics.web/report/v1.1" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlVersion="201008230" xmlns:sawx="com.siebel.analytics.web/expression/v1.1">
<saw:criteria xsi:type="saw:simpleCriteria" subjectArea="&quot;A - Sample Sales&quot;">
<saw:columns>
<saw:column xsi:type="saw:regularColumn" columnID="c3131dd22bf3cc8ee">
<saw:columnFormula>
<sawx:expr xsi:type="sawx:sqlExpression">"Time"."T05 Per Name Year"</sawx:expr></saw:columnFormula></saw:column>
<saw:column xsi:type="saw:regularColumn" columnID="cddcbd415b4c3bdfa">
<saw:columnFormula>
<sawx:expr xsi:type="sawx:sqlExpression">"Ship To Geo Codes"."R62 Geo Ctry State Name"</sawx:expr></saw:columnFormula></saw:column>
<saw:column xsi:type="saw:regularColumn" columnID="c371b413a6d22de6a">
<saw:columnFormula>
<sawx:expr xsi:type="sawx:sqlExpression">"Products"."P4 Brand"</sawx:expr></saw:columnFormula></saw:column>
<saw:column xsi:type="saw:regularColumn" columnID="c95ae97ecd00f14f7">
<saw:columnFormula>
<sawx:expr xsi:type="sawx:sqlExpression">"Base Facts"."1- Revenue"</sawx:expr></saw:columnFormula></saw:column>
<saw:column xsi:type="saw:regularColumn" columnID="c229fa547a7d1b4cd">
<saw:columnFormula>
<sawx:expr xsi:type="sawx:sqlExpression">"Base Facts"."2- Billed Quantity"</sawx:expr></saw:columnFormula></saw:column></saw:columns>
<saw:filter>
<sawx:expr xsi:type="sawx:comparison" op="equal">
<sawx:expr xsi:type="sawx:sqlExpression">"Ship To Geo Codes"."R61 Geo Country Code"</sawx:expr>
<sawx:expr xsi:type="xsd:string">USA</sawx:expr></sawx:expr></saw:filter></saw:criteria>
<saw:views currentView="3">
<saw:view xsi:type="saw:compoundView" name="compoundView!1">
<saw:cvTable>
<saw:cvRow>
<saw:cvCell viewName="titleView!1">
<saw:displayFormat>
<saw:formatSpec/></saw:displayFormat></saw:cvCell></saw:cvRow>
<saw:cvRow>
<saw:cvCell viewName="tableView!1">
<saw:displayFormat>
<saw:formatSpec/></saw:displayFormat></saw:cvCell></saw:cvRow></saw:cvTable></saw:view>
<saw:view xsi:type="saw:titleView" name="titleView!1"/>
<saw:view xsi:type="saw:tableView" name="tableView!1">
<saw:edges>
<saw:edge axis="page" showColumnHeader="true"/>
<saw:edge axis="section"/>
<saw:edge axis="row" showColumnHeader="true">
<saw:edgeLayers>
<saw:edgeLayer type="column" columnID="c3131dd22bf3cc8ee"/>
<saw:edgeLayer type="column" columnID="cddcbd415b4c3bdfa"/>
<saw:edgeLayer type="column" columnID="c371b413a6d22de6a"/>
<saw:edgeLayer type="column" columnID="c95ae97ecd00f14f7"/>
<saw:edgeLayer type="column" columnID="c229fa547a7d1b4cd"/></saw:edgeLayers></saw:edge>
<saw:edge axis="column"/></saw:edges></saw:view>
<saw:view xsi:type="saw:mapview" name="mapview!1">
<saw:mapLayout>
<saw:splitterLayout orientation="horizontal"/></saw:mapLayout>
<saw:basemap sid="__OBIEE__MAPVIEW__TILE__OBIEE_NAVTEQ_SAMPLE__OBIEE_WORLD_MAP__~v0"/>
<saw:mapWidgets>
<saw:mapToolBar>
<saw:panTools>
<saw:panHand display="true"/></saw:panTools>
<saw:zoomTools>
<saw:zoomIn display="true"/>
<saw:zoomOut display="true"/></saw:zoomTools>
<saw:selectionTools>
<saw:pointTool display="true"/></saw:selectionTools></saw:mapToolBar>
<saw:mapInformation>
<saw:scaleInfo display="true"/>
<saw:overview display="true" viewState="collapsed"/></saw:mapInformation>
<saw:mapOverlay>
<saw:panButtons display="true"/>
<saw:zoomSlider display="true"/></saw:mapOverlay></saw:mapWidgets>
<saw:infoLegend display="true" viewState="collapsed"/>
<saw:viewportInfo>
<saw:boundingLayer layerID="l0"/>
<saw:mapCenter x="-95.90625" y="34.3125" size="31.015625" srid="8307" zoomLevel="3" xUnitPixels="12.8" yUnitPixels="12.8"/>
<saw:boundingBox coords="-119.34375,18.8828125,-72.546875,49.8203125"/></saw:viewportInfo>
<saw:spatialLayers>
<saw:spatialLayer sid="__OBIEE__MAPVIEW__LAYER__OBIEE_NAVTEQ_Sample__OBIEE_STATE__~v0" class="omv_predefined_layer" layerID="l0">
<saw:layerLabelFormat display="true"/>
<saw:spatials>
<saw:columnRef columnID="cddcbd415b4c3bdfa"/></saw:spatials>
<saw:visuals>
<saw:visual visualID="v0" xsi:type="saw:colorScheme">
<saw:varyFillColor binType="percentile" numBins="4">
<saw:columnRef columnID="c95ae97ecd00f14f7"/>
<saw:rampStyle>
<saw:rampItem id="0">
<saw:g class="color" fill="#fffeff"/></saw:rampItem>
<saw:rampItem id="1">
<saw:g class="color" fill="#eae6ec"/></saw:rampItem>
<saw:rampItem id="2">
<saw:g class="color" fill="#d2cfd5"/></saw:rampItem>
<saw:rampItem id="3">
<saw:g class="color" fill="#bcb8be"/></saw:rampItem></saw:rampStyle></saw:varyFillColor>
<saw:g class="color"/></saw:visual></saw:visuals></saw:spatialLayer></saw:spatialLayers>
<saw:canvasFormat width="800" height="400"/>
<saw:mapInteraction autoCreateFormats="true"/>
<saw:formatPanel width="218" height="513"/></saw:view></saw:views></saw:report>
Here are a few tuning tips you might want to look at:
http://oraclemaps.blogspot.in/2010/01/mapviewer-performance-tuning-tips-part.html
http://oraclemaps.blogspot.in/2010/01/mapviewer-performance-tuning-tips-part_14.html
http://oraclemaps.blogspot.in/2010/02/mapviewer-performance-tuning-tips-part.html
http://oraclemaps.blogspot.in/2010/09/mapviewer-performance-tuning-tips-part.html -
Need help in doing FDM "Maps" (Not able to do a like mapping)
Hi
sub : Unable to do a like mapping in FDM
I am new to FDM; here is what I am trying to do.
I have an ODBC source table, and the table structure is something like this:
Accountcode1,Accountcode2,,,,YTD
I have a custom import script where I have done the column mapping as below:
Source >> Destination
Accountcode1 >> Accounts
Accountcode2 >> ICP
...>>.....
YTD>>Actual values
Destination is HFM
With the above scenario, here is what I am trying to do.
I am trying to apply the following logic:
If Accounts = "1050300" and ICP = "10909001", then the value should go to HFM account1;
else if Accounts = "1050300" and ICP = "10909002", then the value should go to HFM account2;
and so on.
The following are the questions I am not clear about while implementing the above logic:
Should I be using a like mapping?
If yes, what should be in the "Rule Definition" and the "Script"?
Since the source account values are the same, should I be doing this like mapping in the ICP dimension?
Or is it a "reverse" mapping?
Any help in resolving this issue will be highly appreciated.
Edited by: 845926 on Apr 3, 2011 8:11 AM
Edited by: 845926 on Apr 3, 2011 8:12 AM
Edited by: 845926 on Apr 3, 2011 8:13 AM
Hi,
There are two ways of solving this. The easy one, without scripting, is to combine account and custom1 in the import format for your account source. This gives you a longer mapping table but solves it without scripting.
Otherwise, use a like mapping where your target is #SCRIPT, and use varValues in your if/then/else statement.
Just look up varValues in the FDM admin guide for a detailed description. -
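Both approaches can be sketched in Python for illustration. The HFM target names are hypothetical; in FDM the first variant is an explicit map on the combined import-format key, and the second would be VBScript behind a #SCRIPT target using varValues:

```python
# Sketch in Python of the two approaches (HFM target names are
# hypothetical examples, not real members).

# Approach 1: combine Account and ICP into one source key in the import
# format, then map the combined key explicitly.
combined_map = {
    "1050300-10909001": "HFM_account1",
    "1050300-10909002": "HFM_account2",
}

def map_combined(account: str, icp: str) -> str:
    return combined_map[f"{account}-{icp}"]

# Approach 2: conditional script logic, one rule covering both cases.
def map_scripted(account: str, icp: str) -> str:
    if account == "1050300" and icp == "10909001":
        return "HFM_account1"
    if account == "1050300" and icp == "10909002":
        return "HFM_account2"
    return "[None]"

print(map_combined("1050300", "10909001"))  # HFM_account1
print(map_scripted("1050300", "10909002"))  # HFM_account2
```

Approach 1 trades a longer mapping table for zero scripting, which is usually easier for finance users to maintain.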
Performance of mapping with FILTER
LS,
How to tackle performance of OWB 10g-generated mappings in Oracle10g?
Database: 10.1.0.2.0
OWB repos: 10.1.0.1
OWB client: 10.1.0.2
Situation:
A StorageArea (1) with data of more than one day; each row has a column filled with the date to which the row "belongs". All tables have a PK on a dwkey, and a UK composed of the date column extended by a logical key (one or more columns).
A StorageArea (2) with the same kind of tables, only modified/enriched/unified etc. All tables have a PK on a dwkey, and a UK composed of a logical key (one or more columns) extended by the date column.
The mappings responsible for putting the data in SA (1) analyze the table (90%), but not the indexes. So I do that manually with dbms_stats.gather_index_stats after all mappings to SA (1) are done.
I gather the statistics because the mappings to SA (2) all have a FILTER containing "date-column = <Get_Current_Date>". Any other (outer) joins made by the mappings are all based on the UKs in SA (1). So I expect good performance from Oracle 10g, because it will (eventually) use the UK indexes.
BUT IT DOES NOT!!!
The mappings responsible for putting the data in SA (2) analyze the table (90%), but not the indexes. So I do that manually with dbms_stats.gather_index_stats after all mappings to SA (2) are done.
So, when a mapping towards SA (2) starts, each side (sources and target) is up to date on its table and index statistics.
First I thought this was something in Oracle 10g, but it's the OWB-generated source.
The filter in the source will look something like:
WHERE date-column = package.function-call
Now the indexes will NOT be used. Instead, in the MERGE statement, source and target tables are scanned FULLy.
They will be used if one asks:
WHERE date-column = sysdate
or:
WHERE date-column = TRUNC(sysdate)
But even TRUNC(sysdate)-30 is too much and will lead to FULL scans.
I changed my mapping by defining a constant "current_date", which is filled in the mapping specification by the package function call. Now the filter in the source looks like:
WHERE date-column = mappingname.constantname
When I ask for the execution plan of the SELECT statement, it STILL avoids the indexes.
- What can I do to turn on the turbo and make sure the customer (and I) are very pleased they chose Oracle and its tooling?
Mind you, the difference between no index (cost 7900) and the UK indexes (cost 6) is VERY BIG, according to the explain plan.
It takes 8 minutes to merge 270,000 rows. This can't be serious for Oracle 10g, the solution for very big databases and data warehousing in particular!!
Regards,
André Klück
Hi André,
What happens if you start using hints to "force" the index usage?
One thing I would try is at database level, take the OWB generated code as is (no hints, or just the defaults ones for parallel).
Run this in SQL*Plus and record the performance and the plan. Then force it to use the index using hints, and compare.
If that helps, revert to OWB and apply the hint in there (or do it all from there of course). Run with hints.
I'm curious to see if you get it to run on the indexes by forcing it...
Furthermore, 270,000 rows in 8 minutes: was that bad or good? The reason I'm asking is whether you have the hardware capacity to go faster. Are there any I/O issues preventing you from going faster?
In general it will be hard to solve this on the forum, so just try the SQL statement and hint it; if you get a good plan, try to get that out of OWB.
If it still does not work let me know...
Jean-Pierre -
Hi,
I have a mapping which joins about 10 tables and inserts into the target table.
When I take the SELECT query of the final join and execute it in SQL, it takes only 3 minutes. But when I execute the mapping through OWB, it takes 3 days.
Please let me know what settings I can change to improve performance.
Hi,
The following are ways you can find the performance bottleneck:
1. It is always a good idea to monitor the performance of the mapping with a tool like TOAD. If you have TOAD, log into the session browser, find your session, and look at the LONG OPS section. This will give you an idea of which operation is taking time.
2. Create a temporary table where you put the output of the final join. Use this temporary table in the mapping just to verify whether the rest of the operators in the mapping flow are taking time.
3. Get the intermediate results from the target table and run the SELECT query directly in any other SQL tool to see how long it takes. This SELECT query is the full query that appears as INSERT INTO ... SELECT.
4. Also examine the execution plan to see which operation is costly.
5. Try a loading hint like APPEND PARALLEL on the target table to see if it improves performance.
6. Check that all the source tables are properly analyzed and indexed.
I think I have listed all the methods I follow to get rid of this kind of performance problem. Hope it helps.
Regards
-AP -
Performance of mapping containing a function call
Hi,
I have:
SOURCE TABLE ------------------------------------------------- JOIN --------------- TARGET TABLE
CONSTANT ----------------------- FUNCTION --------------------^
The source table has an ID column, and the function returns a P_ID which is selected from a control table. The join includes the clause WHERE ID > P_ID; this restricts rows to later than a point in time, and P_ID is constant during the mapping's run.
When I deploy this, it performs badly and, on examination, it appears that the function is being called for each row in the source table.
Is it possible to set up the mapping so that P_ID is fetched once when the mapping starts and the value is re-used for the rest of the run? Would making the function deterministic help in this case?
Regards
Steve
OK, thanks for all your help, guys.
There seem to be two solutions, depending on the contents of the function:
1. When the function contains complex processing: modify the premapping to return the single value. In my case, the current premapping stage maps to a generic premapping function, so I will need to wrap it in a dedicated function that calls the generic function and then calls the function that returns the single value I want.
2. If the function is simple enough (in some of my cases it consists only of a SELECT statement): drop the function in favour of the SELECT itself and combine that into the mapping. The optimiser will then just pull in the value I want.
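The underlying idea of option 1 is to evaluate the lookup once and reuse the value for every row, rather than calling the function per row. A sketch of that pattern (Python, with hypothetical names standing in for the control-table lookup):

```python
# Sketch (Python, hypothetical names): evaluate the control-table lookup
# once, then reuse the value for every row, instead of calling the
# function once per row.

def get_p_id_from_control_table() -> int:
    """Stand-in for the function that reads P_ID from a control table."""
    return 1000

def filter_rows(rows: list) -> list:
    p_id = get_p_id_from_control_table()  # fetched once, before the loop
    return [row for row in rows if row["id"] > p_id]

rows = [{"id": 999}, {"id": 1000}, {"id": 1001}, {"id": 1500}]
print(filter_rows(rows))  # [{'id': 1001}, {'id': 1500}]
```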
Thanks for the input.
Steve -
Hi,
We have a situation in FDM. We are planning to have our map files in the format below:
Code,Source,Target
'Code' is a 3-digit number which defines the map type, e.g. 111 = Explicit, 112 = Between, 113 = In, etc.
Based on the code type, the file needs to be loaded into FDM.
I'm just wondering if this can be achieved using the 'BefImportMap' event script, which would pick up the value of column 1 and load the map for the specific map type. If yes, can someone shed some light on how to go about it?
Also, can you give me any info on whether we can load maps from a database directly into FDM instead of from external files?
Thanks in advance
Edited by: 844747 on Feb 11, 2013 6:14 AM
You can do this, but it would involve writing some custom web scripts which you could then link to an FDM Task Flow entry. I would seriously consider using the native out-of-the-box methods available: they are user friendly, don't require any custom code, and are supported by Oracle (any custom map import will not be). At the end of the day, the central repository for the mappings is the FDM database, and they can be maintained directly there.
-
How to handle source empty field in FDM maps
Dear all,
In our project, data is extracted from EBS to HFM through FDM using the ERPi adapters. Some source fields are empty after the data extraction from EBS. The problem is how to deal with the empty fields in the maps. For now, the workaround is using a wildcard (like * -> [None]). But there is a risk in it: it may end up masking records that are actually in error. Do you guys know of any special character that stands for null in the mapping?
Thanks!
We have had a workaround solution since moving to V11. This was not a problem in System 9.
The issue with blanks not defaulting to [None] is that some values escape the test, for example when forcing default values using a like rule (* = [None]). If a new code in your data is not in your map, the validation process is foiled, since [None] is valid and all blanks are now [None]. Only then do you either get an intersection violation during Export, or bad data in HFM.
I don't like this new "feature" in FDM 11. We will be submitting a request to have the default functionality added back, perhaps as an option?
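The safer pattern can be illustrated with a sketch (Python; the member names are hypothetical): map the blank source explicitly to [None] and let genuinely unmapped codes fail, rather than letting a '* -> [None]' wildcard swallow them.

```python
# Sketch (Python, hypothetical members): map the blank source field to
# [None] explicitly, so a new unmapped code still fails validation
# instead of being silently absorbed by a '* -> [None]' wildcard.
explicit_map = {
    "": "[None]",          # blank source field -> [None]
    "1100": "Cash",
    "1200": "Receivables",
}

def map_member(src: str) -> str:
    if src in explicit_map:
        return explicit_map[src]
    # Surfaces at validation time instead of loading bad data to HFM.
    raise KeyError(f"unmapped source member: {src!r}")

print(map_member(""))  # [None]
```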
Mike -
Hi,
I am new to HFM. Can anyone please clarify these for me:
Do we build metadata in HFM or FDM?
When new members need to be added, what do we do?
Do we always manage metadata through FDM?
Thanks in advance.
Do we build metadata in HFM or FDM?
We build metadata in HFM.
When new members need to be added, what do we do?
You add them in HFM.
Do we always manage metadata through FDM?
FDM is the tool used for mapping and conversion. Here is a link to all of the library files for your reading entertainment and the clarification of your questions:
http://download.oracle.com/docs/cd/E12825_01/index.htm -
I am migrating a HAL instance to FDM and am having some issues.
I have an issue where I need an account to always go to a certain ICP, but the source file identifies a different ICP than I want.
Simple Source File
Account, ICP, Value
A100, abc, 50.00
B100, abc, 90.00
C100,,100.00
ICP Map
abc = Jims Bait and Tackle
I want account A100 to always go to LuLu Diner
I want my HFM load file to look like
A100,LuLu Diner,50.00
B100, Jims Bait and Tackle,90.00
C100,ICPNone,100.00
Note: I need abc to translate normally unless the account has a fixed ICP.
I have thought of using like mappings to do something like:
Rule A = If varValues(13) = "A100" Then Result = "LuLu Diner" End If
Rule B = If varValues(17) = "abc" Then Result = "Jims Bait and Tackle" End If
This is easy if you have 2 or 3, but I have 250-300 ICPs to account for.
I am sure others have done fixed ICP or fixed customs
Any help is appreciated.
Please ignore if I have pestered you already with this.
If I understand you correctly:
You first have to combine your account A100 and ICP abc, with or without a comma.
You have to do this in Metadata > Import Format.
So your ICP import format should look like this (note: I am using fixed-width, but you can use delimited):
file example:
12345678
A100,abc
import example:
Field Name Start Length Expression
SourceICP 1 8
The result would be A100,abc or B100,abc or C100,
Once that is done, you have to define the ICP mapping in FDM, just like account or other mappings.
The ICP mapping would probably be explicit:
SourceICP TargetICP
A100,abc LuLu Diner
B100,abc Jims Bait and Tackle
C100, ICPNone
Hope this helps.
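For illustration, the combined-key approach can be sketched in Python (using the thread's example members; in FDM this would be an explicit map on the concatenated SourceICP, not code):

```python
# Sketch (Python) of the combined-key approach: Account and ICP are
# concatenated into one SourceICP value, which is then mapped explicitly.
icp_map = {
    "A100,abc": "LuLu Diner",            # fixed ICP override for A100
    "B100,abc": "Jims Bait and Tackle",  # normal translation of abc
    "C100,": "ICPNone",                  # blank source ICP
}

def map_icp(account: str, icp: str) -> str:
    return icp_map[f"{account},{icp}"]

print(map_icp("A100", "abc"))  # LuLu Diner
print(map_icp("C100", ""))     # ICPNone
```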