Q: OMB Plus, find TARGET TABLE
With OMBRETRIEVE and HAS CONNECTION you can find out whether a table is a target table; however, this statement doesn't work from a JOIN operator:
OMBRETRIEVE MAPPING 'COM_ODS_ODS_BGT' HAS CONNECTION \
FROM GROUP 'OUTGRP1' OF OPERATOR 'JOIN' \
TO GROUP 'INOUTGRP1' OF OPERATOR 'COM_ODS_BUDGET'
How can this be achieved? And does it work for the other operators?
Hi Dirk-Jan,
you can use the Tcl proc listed below. It returns a list of, respectively, mapping sources or targets of type TABLE or VIEW.
syntax: ::xos_omb_add-on::xOMBRETRIEVE [fco_type] [fco_name] [optionClause]
fco_type: MAPPING
fco_name: <MAPPING_NAME>
optionClause: SOURCES | TARGETS
Cheers!
Remco
namespace eval ::xos_omb_add-on {
# This procedure returns a list of respectively all sources or targets of a
# given mapping.
# @param fco_type in string The type of the component. Valid values:
# MAPPING
# @param fco_name in string The physical name of the component in
# single quotes.
# @param optionClause in string Specify option clause. Valid values:
# SOURCES, TARGETS
# @return out list List of mapping operators.
proc ::xos_omb_add-on::xOMBRETRIEVE {fco_type fco_name optionClause} {
    set result {}
    switch $fco_type {
        MAPPING {
            # Candidate operators: all TABLE and VIEW operators in the mapping.
            set candidates [lsort [concat \
                [OMBRETRIEVE MAPPING '$fco_name' GET TABLE OPERATORS] \
                [OMBRETRIEVE MAPPING '$fco_name' GET VIEW OPERATORS]]]
            set operators [lsort [OMBRETRIEVE MAPPING '$fco_name' GET OPERATORS]]
            switch $optionClause {
                SOURCES {
                    # A source operator has no incoming connection from any
                    # other operator in the mapping.
                    foreach sco $candidates {
                        set hasIncoming 0
                        foreach op $operators {
                            if {[OMBRETRIEVE MAPPING '$fco_name' HAS CONNECTION \
                                    FROM OPERATOR '$op' TO OPERATOR '$sco'] == 1} {
                                set hasIncoming 1
                                break
                            }
                        }
                        if {$hasIncoming == 0 && [lsearch $result $sco] == -1} {
                            lappend result $sco
                        }
                    }
                }
                TARGETS {
                    # A target operator has at least one incoming connection.
                    foreach sco $candidates {
                        foreach op $operators {
                            if {[OMBRETRIEVE MAPPING '$fco_name' HAS CONNECTION \
                                    FROM OPERATOR '$op' TO OPERATOR '$sco'] == 1} {
                                if {[lsearch $result $sco] == -1} {
                                    lappend result $sco
                                }
                                break
                            }
                        }
                    }
                }
                default {}
            }
        }
    }
    return $result
}
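A quick usage sketch (the mapping name is taken from the original question; this only runs inside an OMB Plus session connected to a repository):

```tcl
# Retrieve the target tables/views of the mapping, then its sources.
set targets [::xos_omb_add-on::xOMBRETRIEVE MAPPING COM_ODS_ODS_BGT TARGETS]
set sources [::xos_omb_add-on::xOMBRETRIEVE MAPPING COM_ODS_ODS_BGT SOURCES]
puts "Targets: $targets"
puts "Sources: $sources"
```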
Similar Messages
-
How to find Target tables in informatica
Hi Guys,
Could you please tell me how to find target tables in OBIEE from the OBIEE repository?
Please, can anyone reply?
Thanks,
Siva
Edited by: 912736 on Mar 20, 2012 7:12 PM
Hi,
Open Repository:
1- On the Presentation Layer: navigate to the subject area in which you want to trace the target table.
2- Right-click on the desired measure/object.
3- A list will appear; select 'Query for Physical Table' or 'Query for Physical Column'.
4- A window will appear showing the list of relevant physical tables.
5- Select one of the tables and press the Go button.
Your target tables are in the data warehouse, not Informatica; Informatica is responsible for designing and running your workflows and other activities.
However, to trace a target table from Informatica (Workflow):
If you want to trace a target table, in Informatica you have 2 options.
1- First you need to know the workflow: open the workflow, right-click, and in its properties you will see the query; from the query you will know the target table.
2- Open the workflow; from the workflow you will know the mapping; open the mapping and you will know the mapplet, and from the mapplet you will know the target table.
Open Client >> PowerCenter and open the mapplet; you will see the target.
Regards
Edited by: Sher Ullah Baig on Mar 22, 2012 5:21 PM -
How to find the target table of an interface which is in a package
Hi All,
I have a question?
I created interface INT1
I created variable VAR1
I placed INT1 in a package. I need to get INT1's target table into the variable.
Could you please help with writing the query to find the target table of the previous interface.
Appreciate your help.
Create a logical schema referring to the work repository schema. For VAR1 with this logical schema, in the refresh section, run the following query:
SELECT table_name FROM snp_pop WHERE pop_name = 'INT1';
I do not see any hope of deriving the interface name automatically unless you use scenarios and invoke them through variables. -
hello experts
I am interested in a query that will let me know which is the target document of another document.
for example, while executing the query for orders, in a new column i would like to include the docentry of the target document
Is there any way to meet this requirement?
Hi,
You can check the XXX1 table for the document which you are checking.
For example if orders, then in RDR1, check for columns BaseType, TrgetEntry, targetType etc.
Check if this is what you are looking for.
You can check this query example :
select distinct case when t5.trgetentry is not null then t2.Docnum Else NULL end as [AR Credit Memo] ,
CASE when t4.Baseref is not NULL then t4.Baseref Else NULL end as [AR Invoice Number],
t1.cardcode
from JDT1 T3 inner join OINV t1 on t3.transid = t1.transid and t3.shortname = t3.account
inner join ORIN t2 on t3.transid = t1.transid and t2.doctotal = t1.doctotal
inner join RIN1 t4 on t4.docentry =t2.docentry and t4.baseref = t1.docnum
inner join INV1 t5 on t5.docentry = t1.docentry and t5.trgetEntry = t2.docentry
order by cardcode
Kind Regards,
Jitin
SAP Business One Forum Team
Edited by: Jitin Chawla on Jan 10, 2012 2:43 PM -
OMB Plus script needed to change target table load hint
Greetings
Environment:
Metadata Repository - 10.1.0.4 currently but going to 10.2.0.3 soon running client on Windows XP Pro and repository on AIX 5.3
I need an OMB Plus script to find all the maps where the target table(s) have a Loading Type of either INSERT/UPDATE or UPDATE/INSERT and I want to remove the APPEND loading hint.
I have suffered with attempts at using OMB Plus long enough.
Also, is there a way in either 10.1 or 10.2 to set the default action of that attribute to NOT include the APPEND hint?
Thanks very much in advance.
-gary
Hi Gary,
you can get the loading type with the command
OMBRETRIEVE MAPPING 'MAP_NAME' OPERATOR 'TABLE_OP' GET PROPERTIES (LOADING_TYPE)
and the loading hint with the command
OMBRETRIEVE MAPPING 'MAP_NAME' OPERATOR 'TABLE_OP' GET PROPERTIES (LOADING_HINT)
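Building on those two commands, here is a hedged sketch of a cleanup loop. The OMBALTER form, the OMBLIST MAPPINGS call, and the LOADING_TYPE value strings (INSERT_UPDATE / UPDATE_INSERT) are assumptions: verify them with OMBHELP against your OWB version before running.

```tcl
# Sketch: blank the LOADING_HINT on table operators whose loading
# type is INSERT/UPDATE or UPDATE/INSERT, for every mapping in the
# current module (assumed context; OMBCC into the module first).
foreach map [OMBLIST MAPPINGS] {
    foreach op [OMBRETRIEVE MAPPING '$map' GET TABLE OPERATORS] {
        set ltype [OMBRETRIEVE MAPPING '$map' OPERATOR '$op' \
            GET PROPERTIES (LOADING_TYPE)]
        if {$ltype == "INSERT_UPDATE" || $ltype == "UPDATE_INSERT"} {
            # Assumed OMBALTER syntax; check OMBHELP OMBALTER MAPPING.
            OMBALTER MAPPING '$map' MODIFY OPERATOR '$op' \
                SET PROPERTIES (LOADING_HINT) VALUES ('')
        }
    }
}
OMBCOMMIT
```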
In this thread you can find a Tcl script for identifying target or source tables/views in a mapping:
Re: Q : OMBPLUS, find TARGET TABLE
Regards,
Oleg -
How to find the interface with the help of target table name
Hi friends,
With the help of the target table name, is it possible to find the name of the interface that loads this target table in ODI 11.1.1.7?
Thanks in advance.
Regards,
Saro
Highlight the datastore in the navigator and expand its subtree; there is a 'Populated by' node, under which it will list all datastores used as sources of the current datastore.
Then expand 'Used by' -> Interfaces; it will show all interfaces which include the current datastore (though the datastore may be used there as a source).
That's all you can get from the UI; otherwise I suppose you can use the SDK. -
ODI metadata query to find source and Target table for Interface
Hi Experts,
The client is migrating their source from EBS 11.5.10 to R12. They are on ODI BIApps version 7952. Since all their mappings are customized, they are not bothered about support from Oracle as far as BIApps is concerned.
Now we need to know how many ODI mappings will be impacted when the source EBS migrates from 11.5.10 to R12, so that we can target only those mappings accordingly.
So,please provide me with below inputs:
1) Any metadata query which will give me the source table and target table information for an interface, even if the main interface's source is another interface.
2) What other things do I need to look at from the point of view of mapping changes when the source is upgraded, e.g. is only a source table change enough, or do I need to look into other things? I feel it boils down to creating a separate source adapter for R12.
Regards,
Snehotosh
SELECT C.TABLE_NAME AS "Target Table Name",
A.COL_NAME AS "Target Field Name",
Wm_Concat(G.SOURCE_DT) AS "Target Data Type",
Wm_Concat(G.LONGC) AS "Target Data Length",
Wm_Concat(TXT) AS "Transformation Rule",
Wm_Concat(DISTINCT F.TABLE_NAME) AS "Source Table Name",
Wm_Concat(D.COL_NAME) AS "Source Field Name",
Wm_Concat(D.SOURCE_DT) AS "Source Data Type",
Wm_Concat(D.LONGC) AS "Source Data Length"
FROM
SNP_POP_COL A JOIN SNP_TXT_CROSSR B ON A.I_TXT_MAP=B.I_TXT
JOIN SNP_POP C ON A.I_POP=C.I_POP
JOIN SNP_TXT E ON A.I_TXT_MAP=E.I_TXT AND B.I_TXT=E.I_TXT
LEFT OUTER JOIN SNP_COL D ON B.I_COL=D.I_COL
LEFT OUTER JOIN SNP_TABLE F ON F.I_TABLE= D.I_TABLE
LEFT JOIN SNP_COL G ON A.I_COL=G.I_COL
WHERE POP_NAME = 'XXXXXXX'
GROUP BY C.TABLE_NAME,A.COL_NAME ORDER BY 1 -
How to get source table name according to target table
hi all
another question:
Once a map has been created and deployed, the corresponding information is stored in the repository and runtime repository. My question is how to find the source table name given the target table, and in which tables these records are recorded.
somebody help me plz!!
thanks a lot!
This is a query that will get you the operators in a mapping. To get sources and targets you will need some additional information, but this should get you started:
set pages 999
col PROJECT format a20
col MODULE format a20
col MAPPING format a25
col OPERATOR format a20
col OP_TYPE format a15
select mod.project_name PROJECT
, map.information_system_name MODULE
, map.map_name MAPPING
, cmp.map_component_name OPERATOR
, cmp.operator_type OP_TYPE
from all_iv_xform_maps map
, all_iv_modules mod
, all_iv_xform_map_components cmp
where mod.information_system_id = map.information_system_id
and map.map_id = cmp.map_id
and mod.project_name = '&Project'
order by 1,2,3
Jean-Pierre -
Lookup Table and Target Table are the same
Hi All,
I have a requirement in which I have to look up the target table and, based on the records in it, load a new record into the target table.
Being very specific,
Suppose I have a key column; when its value changes I want to generate a new id and then insert this new value.
The target table record structure looks like this
list_id list_key list_name
1 'A' 'NAME1'
1 'A' 'NAME2'
1 'A' 'NAME3'
2 'B' 'NAME4'
2 'B' 'NAME5'
As shown, the target table's list_id changes only when the list_key changes. I need to generate the list_id value from within the OWB mapping.
Can anyone throw some light on how this can be done in OWB?
regards
-AP
Hello, AP,
You underestimate the power of a single mapping :) If you can tolerate using an additional stage table (which is definitely recommended in case the table from your example will hold a lot of rows), you can accomplish all you need within one mapping and without using a PL/SQL function. This works because you can have several targets within one mapping.
Source ----------------------------------------------------- >| Join2 | ---- > Target 2
|------------------------ >|Join 1| --> Lookup table -->|
Target Dedup >|
Here Target is your target table. The Join 1 operator covers the operations needed to get the existing key mappings (from the dedup) and find new mappings. Results are stored in the Lookup Table target (loading type TRUNCATE/INSERT).
Join 2 is used to perform the final lookup and load into Target 2, which is the same table as Target.
The approach with a lookup table is fast and reliable and can run in set-based mode. Also, you can revisit the lookup table to find which key mappings were loaded during the last load operation.
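The key-generation part of this approach can also be sketched set-based. A hypothetical SQL sketch (the table name source_list and the DENSE_RANK approach are assumptions; column names come from the example above, and the ranking assumes list_id depends only on list_key):

```sql
-- Hypothetical: derive list_id from list_key so the id changes
-- exactly when the key changes.
SELECT DENSE_RANK() OVER (ORDER BY list_key) AS list_id,
       list_key,
       list_name
FROM   source_list;
```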
Serhit -
Loading accdb files using a for-each loop into a single target table
I have 3 accdb files, as below, in a single folder. In the control flow I have created a for-each loop to loop through all the files below. In the data flow task I have created an OLE DB source using a connection manager pointing to the first file, a1.accdb. I am trying
to load all the files into the single target table mentioned below; here it only loads the first file and does not loop through the other two files.
I have tried this with .txt and .csv files and it works perfectly, but when I similarly try to load Access DB file tables it is not working; can any one of you help me?
1. a1.accdb contains only one table a1tab ( col1 col2 col3 )
2. a2.accdb contains only one table a2tab ( col1 col2 col3 )
3. a3.accdb contains only one table a3tab ( col1 col2 col3 )
I have a target table with similar structure as source
Target table <TargetTable> Col1 Col2 Col3
aak
Thanks Arthur, please find my responses below:
How do you handle the different table names? Can't we handle it in a similar manner to how we handle .csv/.txt files with different names and a similar structure for the sources (multiple files) and target (single table)?
Is it good to drop the idea of the for-each loop and create a separate task for each accdb file to be loaded into the same target table (performance-wise etc.; any justification would be helpful)?
You must handle the connection string dynamically?
Can you help me with the logic for handling the connection string dynamically?
aak -
How do I write into multiple target tables in DIFFERENT schemas?
It is easy to have a mapping that writes its results into 2 or more tables. I now need all these tables to be in different schemas!
When I create a 2nd warehouse target with a 2nd location and configure this location to be a different schema in the database, validation tells me that everything is okay.
When I generate it, there are several warnings; when I execute it, it doesn't work :( It complains that it cannot find <something>.
I'm sorry, I don't have the error-message at hand :(
If you've got an idea how I could have different schemas for my tables, please let me know!
Art,
Could it be that the target schema into which you installed the runtime components does not have privileges on the tables in the other schemas? You need at least the right privileges (INSERT, UPDATE, DELETE) on the target tables in the other schemas for this to work. Beyond that there should be no problem, assuming your tables are in different modules related to different locations.
Thanks,
Mark. -
Flat file to Target Table Mapping.
Hi All,
OWB Client 9.0.2.56.0
OWB Repository 9.0.2.0
WIN - NT 196 MB RAM.
I created a mapping which involved a fixed-width text file
(the size of the text file is 5 MB, with around 28 columns), applied some filter criteria, and performed an insert operation into a target table.
All seems fine: I could validate the mapping without errors and generate it. I can save the newly generated control file and upload the data into the target table using sqlldr. However, even after I select the Close button in the Generate Result window nothing happens; the only way I can exit is by killing the session, and this results in auto-deletion of the newly created mapping (even though I committed the changes before selecting the Close button in the Result window).
Is this happening due to a memory constraint? (I am aware that the recommended RAM for OWB is around 256 MB.)
Just to add, I am experiencing this only for one specific mapping.
Thanks in Advance.
Regards,
Vidyanand
Hi,
The error which I was getting earlier is solved. I placed the schema as "target.<tablename>" in the table operator configuration while configuring the mapping.
But now I am getting the error
SQL*Loader-350: Syntax error at line 22.
Expecting "(", found ".".
INTO TABLE "target.custfile_target"."CUSTFILE_TARGET".
Please suggest how to proceed, as I am not able to find how and where I can correct such a syntax error in OWB.
Regards. -
Calculations in query taking long time and to load target table
Hi,
I am pulling approx 45 million records using the query below in an SSIS package, which pulls from one DB on one server and loads the results into a target table on another server. In the select query I have a calculation for 6 columns. The target
table is truncated and loaded every day. Also, most of the columns in the source which I use for the calculations contain 0, and it took approximately 1 hour 45 min to load the target table. Is there any way to reduce the load time? Also, can I do the calculations
after all the 47 M records are loaded during the query run, and then calculate for the non-zero records alone?
SELECT T1.Col1,
T1.Col2,
T1.Col3,
T2.Col1,
T2.Col2,
T3.Col1,
convert( numeric(8,5), (convert( numeric,T3.COl2) / 1000000)) AS Colu2,
convert( numeric(8,5), (convert( numeric,T3.COl3) / 1000000)) AS Colu3,
convert( numeric(8,5), (convert( numeric,T3.COl4) / 1000000)) AS Colu4,
convert( numeric(8,5),(convert( numeric, T3.COl5) / 1000000)) AS Colu5,
convert( numeric(8,5), (convert( numeric,T3.COl6) / 1000000)) AS Colu6,
convert( numeric(8,5), (convert( numeric,T3.COl7) / 1000000)) AS Colu7
FROM Tab1 T1
JOIN Tab2 T2
ON (T1.Col1 = T2.Col1)
JOIN Tab3 T3
ON (T2.Col9 = T3.Col9)
Anand
So 45 or 47? Nevertheless ...
This is hardly a heavy calculation; the savings will be dismal. Also, anything numeric is very easy on the CPU in general.
But
convert( numeric(8,5), (convert( numeric,T3.COl7) / 1000000))
is not optimal;
CONVERT( NUMERIC(8,5), 300 / 1000000.00000 )
is.
Now it boils down to how to make the load faster: do it in parallel. Find out how many sockets the machine has and split the table into as many chunks. Also profile to find out where it spends most of the time. I have sometimes seen the network not letting me through, so you
may want to play with buffers and packet sizes; for example, if OLE DB is used, double the packet size and see if it works faster, then x2 more, and so forth.
To help you further you need to tell us more, e.g. what the source and destination are and how you configured the load.
Please understand that there is no silver bullet or blanket solution anywhere, and you need to tell me your desired load time. E.g. if you tell me it needs to load in 5 min, I will give your ask a pass.
Arthur
MyBlog
Twitter -
Get Target table name and its count
Hi Experts,
I have created a procedure where I want to get the target table name and its insert count from the previous interface in a package, and insert these values into the audit table.
I am able to get the number of inserts in the previous step using the getPrevStepLog() API, but I also want the target table name and its total count from the previous interface. Is there a generic way to get this information without hardcoding the table name? Kindly help sort out this issue. Thanks!
Warm Regards,
VBV
Hi VBV,
Please follow the steps below to audit your execution.
Please note I coded it in such a way that you can use your existing procedure to capture the table name; there is no need to query the repository, which is not advisable anyway since Oracle may change its structure in the future.
I created an audit table with the structure below.
CREATE TABLE AUDIT_TABLE (
INSERT_COUNT VARCHAR2(100 BYTE),
TABLE_NAME VARCHAR2(4000 BYTE)
);
Step 1 :
This step needs to be added to your existing IKM after the Commit step; it uses the substitution API to capture the target table name.
Step Name: Get Table Name
Command On Source
Technology: Oracle
Schema: Any Oracle related Schema
Command:
SELECT '<%=odiRef.getTargetTable("RES_NAME")%>' AS TGT_NAME FROM DUAL
Command On Target
Technology: Jython
Command:
TargetTable='#TGT_NAME'
Step 2:
In your existing procedure add the step below, or add a new procedure.
Step Name: Audit Log
Command on Source
Technology : Oracle
Schema: Whichever schema holds the audit table
Command on Target
Technology : Jython
Command:
import java.sql as sql
import java.lang as lang
myCon = odiRef.getJDBCConnection("SRC")
myStmt = myCon.createStatement()
# Use executeUpdate (not executeQuery) for an INSERT statement
myStmt.executeUpdate("INSERT INTO ODITGT.AUDIT_TABLE (INSERT_COUNT, TABLE_NAME) VALUES (<%=odiRef.getPrevStepLog("INSERT_COUNT")%>,'" + TargetTable + "')")
That's it.
OTN doesn't allow me to put a + sign in front of a word (it assumes italics), so please add a + (plus) sign before and after TargetTable in the above script.
This way you can capture the table name and insert count into your audit table.
Please note you need to call this procedure after the successful completion of the interface in a package, and make sure you selected the right IKM which captures the table name as in Step 1.
I tested this locally and it works fine. Let me know if you find any difficulties in the implementation.
Thanks,
Guru -
Suggestion required in target table structure change
Hi all good morning,
I have one requirement. I have a target table T with some columns. Now I need to add one additional column to T, and there is a procedure to populate the data for the new column. This target table T is used by 50 mappings. My question is: if we change the structure of the table, do we need to re-import it (I think we must)? If we re-import, what would be the impact? Do we need to re-bind and validate all the mappings and then deploy?
The other possibility is that I create a replica of T as T1 with one additional column. I will create a new mapping which dumps data from T to T1, and I will replace T with T1 in my reporting environment.
Please suggest which is the good option; it is urgent.
thanks in advance.
Viki
Hello, if you don't need the new column in the 50 mappings, you can temporarily leave them alone (you say it's urgent) and just feed the new column.
When you have more time, you should reconcile the mappings with table T and re-deploy them. OMB Plus scripting is ideal for this task.
Hope this helps, Antonio