How to find the target table of an interface that is in a package

Hi All,
I have a question.
I created an interface, INT1, and a variable, VAR1.
I placed INT1 in a package, and I need to get the target table of INT1 into the variable.
Could you please help me write the query that finds the target table of the previous interface?
Appreciate your help.

Create a logical schema that points to the work repository schema. For VAR1, select this logical schema and, in the refreshing section, use the following query:
SELECT table_name FROM snp_pop WHERE pop_name = 'INT1';
I do not see any way to derive the interface name automatically unless you use scenarios and pass it in through variables.
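
A hedged generalization of that refresh query, in case the interface name should not be hard-coded: if the scenario is invoked with a variable that holds the interface name (the project code PROJ and variable V_INT_NAME below are hypothetical, not from this post), ODI can substitute it into VAR1's refreshing query:

SELECT table_name
  FROM snp_pop
 WHERE pop_name = '#PROJ.V_INT_NAME'

This assumes the ODI 10g/11g work repository layout, where SNP_POP.POP_NAME is the interface name and SNP_POP.TABLE_NAME is its target table; the column names may differ in other versions.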

Similar Messages

  • How to find Target tables in Informatica

    Hi Guys,
    Could you please tell me how to find target tables in OBIEE from the OBIEE Repository?
    Please, can anyone reply?
    Thanks,
    Siva
    Edited by: 912736 on Mar 20, 2012 7:12 PM

    Hi
    Open the Repository:
    1- On the Presentation Layer, navigate to the Subject Area in which you want to trace the target table.
    2- Right-click the desired measure/object.
    3- A list will appear; select 'Query for Physical Table' or 'Query for Physical Column'.
    4- A window will appear showing the list of relevant physical tables.
    5- Select one of the tables and press the Go button.
    Your target tables are in the data warehouse, not Informatica; Informatica is responsible for designing and running your workflows and other activities.
    However, to trace a target table from Informatica (a workflow), you have two options:
    1- First you need to know the workflow. Open the workflow, right-click it, and in its properties you will see the query; from the query you will know the target table.
    2- Open the workflow; from the workflow you will know the mapping, from the mapping you will know the mapplet, and from the mapplet you will know the target table.
    Open the client >> PowerCenter, open the mapplet, and you will see the target.
    Regards
    Edited by: Sher Ullah Baig on Mar 22, 2012 5:21 PM

  • ODI metadata query to find source and Target table for Interface

    Hi Experts,
    The client is migrating their source from EBS 11.5.10 to R12. They are on the ODI version of BI Apps (7952). Since all their mappings are customized, they are not concerned about Oracle support as far as BI Apps goes.
    Now we need to know how many ODI mappings will be impacted when the source EBS is migrated from 11.5.10 to R12, so that we can target only those mappings.
    So please provide me with the following inputs:
    1) Any metadata query that will give me the source table and target table information for an interface, even if the main interface's source is another interface.
    2) What else do I need to look at from a mapping-change point of view when the source is upgraded? For example, is a source table change enough, or do I need to look at other things? I feel it is boiling down to creating a separate source adapter for R12.
    Regards,
    Snehotosh

    SELECT C.TABLE_NAME AS "Target Table Name",
         A.COL_NAME AS "Target Field Name",
         Wm_Concat(G.SOURCE_DT) AS "Target Data Type",
         Wm_Concat(G.LONGC) AS "Target Data Length",
         Wm_Concat(TXT) AS "Transformation Rule",
         Wm_Concat(DISTINCT F.TABLE_NAME) AS "Source Table Name",
         Wm_Concat(D.COL_NAME) AS "Source Field Name",
         Wm_Concat(D.SOURCE_DT) AS "Source Data Type",
         Wm_Concat(D.LONGC) AS "Source Data Length"
    FROM
         SNP_POP_COL A JOIN SNP_TXT_CROSSR B ON A.I_TXT_MAP=B.I_TXT
         JOIN SNP_POP C ON A.I_POP=C.I_POP
         JOIN SNP_TXT E ON A.I_TXT_MAP=E.I_TXT AND B.I_TXT=E.I_TXT
         LEFT OUTER JOIN SNP_COL D ON B.I_COL=D.I_COL
         LEFT OUTER JOIN SNP_TABLE F ON F.I_TABLE= D.I_TABLE
         LEFT JOIN SNP_COL G ON A.I_COL=G.I_COL
    WHERE POP_NAME = 'XXXXXXX'
    GROUP BY C.TABLE_NAME,A.COL_NAME ORDER BY 1
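
    As a hedged starting point for request 1), a much simpler query can list each interface with its target table and its source tables. The repository tables below (SNP_POP, SNP_SOURCE_TAB) and their columns are from an ODI 10g/11g work repository and may differ in your version, so treat this as a sketch rather than a definitive answer:

    SELECT p.pop_name   AS interface_name,
           p.table_name AS target_table,
           s.table_name AS source_table
      FROM snp_pop p
      LEFT JOIN snp_source_tab s ON s.i_pop = p.i_pop
     ORDER BY p.pop_name;

    Interfaces whose source is another (temporary) interface will show that temporary datastore as the source table, so those may need to be resolved one more level by hand.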

  • Q : OMBPLUS, find TARGET TABLE

    With OMBRETRIEVE and HAS CONNECTION you can find out whether a table is a target table; however, this statement doesn't work from a JOIN operator:
    OMBRETRIEVE MAPPING 'COM_ODS_ODS_BGT' HAS CONNECTION \
    FROM GROUP 'OUTGRP1' OF OPERATOR 'JOIN' \
    TO GROUP 'INOUTGRP1' OF OPERATOR 'COM_ODS_BUDGET'
    How can this be achieved, and does it work for the other operators?

    Hi Dirk-Jan,
    you can use the Tcl proc listed below. It returns a list of the mapping's sources or targets of type TABLE or VIEW.
    syntax:       ::xos_omb_add-on::xOMBRETRIEVE [fco_type] [fco_name] [optionClause]
    fco_type:     MAPPING
    fco_name:     <MAPPING_NAME>
    optionClause: SOURCES | TARGETS
    Cheers!
    Remco
    namespace eval ::xos_omb_add-on {
        # This procedure returns a list of all sources or all targets (TABLE
        # and VIEW operators) of a given mapping.
        # @param  fco_type      in  string  The type of the component. Valid value: MAPPING
        # @param  fco_name      in  string  The physical name of the component.
        # @param  optionClause  in  string  Option clause. Valid values: SOURCES, TARGETS
        # @return                   list    List of mapping operators.
        proc ::xos_omb_add-on::xOMBRETRIEVE {fco_type fco_name optionClause} {
            set result {}
            switch $fco_type {
                MAPPING {
                    # All TABLE and VIEW operators of the mapping, and all operators.
                    set tabViewOps [lsort [concat \
                        [OMBRETRIEVE MAPPING '$fco_name' GET TABLE OPERATORS] \
                        [OMBRETRIEVE MAPPING '$fco_name' GET VIEW OPERATORS]]]
                    set allOps [lsort [OMBRETRIEVE MAPPING '$fco_name' GET OPERATORS]]
                    switch $optionClause {
                        SOURCES {
                            # A source has no incoming connection from any other operator.
                            foreach sco $tabViewOps {
                                set hasIncoming 0
                                foreach lindex $allOps {
                                    if {[OMBRETRIEVE MAPPING '$fco_name' HAS CONNECTION FROM OPERATOR '$lindex' TO OPERATOR '$sco'] == 1} {
                                        set hasIncoming 1
                                    }
                                }
                                if {$hasIncoming == 0 && [lsearch $result $sco] == -1} {
                                    lappend result $sco
                                }
                            }
                            return $result
                        }
                        TARGETS {
                            # A target has at least one incoming connection from another operator.
                            foreach sco $tabViewOps {
                                foreach lindex $allOps {
                                    if {[OMBRETRIEVE MAPPING '$fco_name' HAS CONNECTION FROM OPERATOR '$lindex' TO OPERATOR '$sco'] == 1} {
                                        if {[lsearch $result $sco] == -1} {
                                            lappend result $sco
                                        }
                                    }
                                }
                            }
                            return $result
                        }
                    }
                }
                default {}
            }
            return $result
        }
    }

  • How to delete rows in the target table using an interface

    hi guys,
    I have an interface with a source, src, and a target, tgt, both of which have a company_code column. In the interface, if a record with that company_code already exists in the target, I need to delete it and insert the new one from src; if it does not exist, I just need to insert it.
    Please tell me how to achieve this.
    Regards,
    sai.

    gatha wrote:
    For this do we need to apply CDC?
    I am not clear on how to delete rows under the target. Can you please share the steps to be followed?
    If you are able to track the deletes in your source data then you don't need CDC. If however you can't, then it might be an option.
    I'll give you an example from what I'm working on currently.
    We have an ODS with some 400+ tables. Some are needed 'real-time', so we are using CDC. Some are OK to be batch loaded overnight.
    CDC captures the deletes no problem, so the standard knowledge modules, with a little tweaking for performance, are doing the job fine; they handle deletes.
    The overnight batch process, however, cannot track a delete, as the row is physically gone by the time we run the scenarios. So we load all the inserts/updates using a last-modified date, then we pull all the PKs from the source into the collection (staging) table and delete the target rows that do not exist in it (a NOT EXISTS against the collection table). We had to write our own KM for that.
    All I'm saying to the OP is that whilst you have Insert/Update flags to set on the target datastore to influence the API code, there is nothing stopping you extending this logic with the UD flags if you wish and writing your own routines for what to do with the deletes. It all depends on how efficiently you can identify rows that have been deleted.
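
    To make the simple delete-then-insert case concrete (no CDC, just a natural key), here is a minimal SQL sketch of what a customized KM step could generate. The table and column names (src, tgt, company_code, col1) are illustrative placeholders, not from the post:

    -- remove target rows whose key is being re-delivered by the source
    DELETE FROM tgt
     WHERE company_code IN (SELECT company_code FROM src);

    -- then (re-)insert everything coming from the source
    INSERT INTO tgt (company_code, col1)
    SELECT company_code, col1
      FROM src;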

  • How to find which table an internal table's fields come from

    Hello Experts,
    I have to use a dynamic select inner join query.
    IF lv_string IS NOT INITIAL.   " assumed condition - the opening IF was not shown in the post
      SELECT (lv_string)
        FROM (from_tab)
        INTO CORRESPONDING FIELDS OF TABLE <fs_itab1>
        WHERE (where_tab).
    ELSEIF table_1 NE ''.
      SELECT *
        FROM (table_1)
        INTO CORRESPONDING FIELDS OF TABLE <fs_itab1>
        WHERE (where_tab).
    ENDIF.
    Here lv_string is built dynamically from a structure which is a combination of, say, VBAK and VBAP. I have to do:
    CONCATENATE 'VBAK~' wa_fields-fname INTO str_temp.
    CONCATENATE lv_string str_temp INTO lv_string SEPARATED BY space.
    CONCATENATE 'VBAP~' wa_fields-fname INTO str_temp.
    CONCATENATE lv_string str_temp INTO lv_string SEPARATED BY space.
    So each field needs its table identifier (VBAK~ or VBAP~). How can I find which table a given field belongs to?
    Regards,
    Ketan.

    Ketan,
    I believe you can find this only at the point where you are creating lv_string from your structure (or whatever it is).
    So please paste that code: what is your 'structure' type, and on what conditions do you want to fill lv_string from it?
    "in that LV_string is dynamicaly select. from a structure which is a combination of say VBAK and VBAP"
    If you have a custom or standard structure from which you need to fill lv_string, then you should not need the VBAK~ or VBAP~ prefixes at all, because field names are always unique within a structure and hence the ~ additions are not necessarily needed.
    The issue arises when you are picking a few fields each from the standard tables VBAK and VBAP in your lv_string and your (from_tab) is an inner join condition. I believe in that case you should be able to identify VBAK~ and VBAP~ while populating lv_string itself. You can also use the DB view MASSVBAP.
    Regards,
    Diwakar

  • How to find the table name on which integrity constraint not found

    Hi All
    How do I achieve this?
    I have a lot of tables with many primary key - foreign key relationships.
    In PL/SQL, when an insert happens in a child table and the corresponding row is not present in the parent table, we get an exception:
    ORA-02291: integrity constraint (user1.ppk) violated - parent key not found
    On this exception, in the exception block, I want to trap the name of the parent table in which the primary key for the particular child row was not found.
    Is it possible to retrieve the parent table in this way? I am looking for a generic PL/SQL code block which can help achieve this.
    Regards

    scott@ORA92> SET SERVEROUTPUT ON
    scott@ORA92> DECLARE
      e_no_parent_key EXCEPTION;
      PRAGMA          EXCEPTION_INIT (e_no_parent_key, -2291);
      v_fk_cons       VARCHAR2 (61);
      v_owner         VARCHAR2 (30);
      v_parent_table  VARCHAR2 (61);
      v_pk_cons       VARCHAR2 (30);
      v_parent_column VARCHAR2 (30);
    BEGIN
      INSERT INTO emp (empno, deptno) VALUES (99, 60);
    EXCEPTION
      WHEN e_no_parent_key THEN
        -- extract schema.constraint_name from sqlerrm:
        v_fk_cons := SUBSTR (SQLERRM,
                       INSTR (SQLERRM, '(') + 1,
                       INSTR (SQLERRM, ')') - (INSTR (SQLERRM, '(') + 1));
        DBMS_OUTPUT.PUT_LINE ('Foreign key constraint violated: ' || v_fk_cons);
        -- extract parent schema.table and parent key:
        SELECT owner, table_name, constraint_name
        INTO   v_owner, v_parent_table, v_pk_cons
        FROM   user_constraints
        WHERE  (owner, constraint_name) =
               (SELECT r_owner, r_constraint_name
                FROM   user_constraints
                WHERE  owner || '.' || constraint_name = v_fk_cons);
        DBMS_OUTPUT.PUT_LINE ('Parent table: ' || v_owner || '.' || v_parent_table);
        DBMS_OUTPUT.PUT_LINE ('Parent key: ' || v_owner || '.' || v_pk_cons);
        -- extract parent table columns:
        FOR rec IN
          (SELECT column_name
           FROM   user_cons_columns
           WHERE  owner = v_owner
           AND    table_name = v_parent_table
           AND    constraint_name = v_pk_cons)
        LOOP
          DBMS_OUTPUT.PUT_LINE
            ('Parent table column: ' || rec.column_name);
        END LOOP;
    END;
    /
    Foreign key constraint violated: SCOTT.FK_DEPTNO
    Parent table: SCOTT.DEPT
    Parent key: SCOTT.PK_DEPT
    Parent table column: DEPTNO
    PL/SQL procedure successfully completed.

  • How to find target table

    hello experts
    I am interested in a query that will tell me the target document of another document.
    For example, while executing a query on orders, in a new column I would like to include the DocEntry of the target document.
    Is there any way to meet this requirement?

    Hi,
    You can check the XXX1 line table of the document you are checking.
    For example, for orders, check the RDR1 table for the columns BaseType, TrgetEntry, TargetType, etc.
    Check if this is what you are looking for.
    You can check this query example:
    select distinct case when t5.trgetentry is not null then t2.Docnum Else NULL end as [AR Credit Memo] ,
    CASE when t4.Baseref is not NULL then t4.Baseref Else NULL end as [AR Invoice Number],
    t1.cardcode
    from JDT1 T3 inner join OINV t1 on t3.transid = t1.transid and t3.shortname <> t3.account
    inner join ORIN t2 on t3.transid = t1.transid and t2.doctotal = t1.doctotal
    inner join RIN1 t4 on t4.docentry =t2.docentry and t4.baseref = t1.docnum
    inner join INV1 t5 on t5.docentry = t1.docentry and t5.trgetEntry = t2.docentry
    order by cardcode
    Kind Regards,
    Jitin
    SAP Business One Forum Team
    Edited by: Jitin Chawla on Jan 10, 2012 2:43 PM
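
    For the original ask (orders and their target documents), a smaller hedged sketch may be easier to adapt. It assumes the standard SAP Business One tables ORDR/RDR1 and their TrgetEntry/TargetType columns; adjust the names if your version differs:

    SELECT T0.DocNum     AS OrderNum,
           T1.TrgetEntry AS TargetDocEntry,
           T1.TargetType AS TargetObjType
      FROM ORDR T0
     INNER JOIN RDR1 T1 ON T1.DocEntry = T0.DocEntry
     WHERE T1.TrgetEntry IS NOT NULL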

  • How to populate a target table when one of the target columns has a sequence on it

    Hi,
    I am trying to insert rows into an Oracle target; on the Oracle table, the column LIST_ID (PK) is populated by a sequence.
    I am not mapping LIST_ID in the target from the source, but I am getting an "invalid identifier" error.
    How do I solve this?

    At step 14 it gives that invalid identifier on LIST_ID, which is the PK column in the target.
    How do I solve this?

  • Find which objects use a custom table

    Hi, we have a custom table with only one column. This table is used for a value set of type 'TABLE'.
    The table was created by someone else before my time. Contract ID values are stored in it so that they appear in the value set, as everyone knows.
    Now the problem is that some of the values (contract IDs) are missing from the value set. So I need to find out how the insertion is being done, e.g. within a package or in a form.
    I have checked in TOAD (object name --> used by) but it shows only the synonym of this object.
    Can anyone suggest how to find out where the inserts into this table happen?
    Thanks

    "how the values are populated into this table like one time insertion or through any procedure/function/package."
    If you're looking for missing data you might find DELETE access to be just as fruitful a line of investigation.
    You can find out which PL/SQL programs use the table through the _DEPENDENCIES views, as Dan has demonstrated. That will not show you any dependencies on programs which are outside of the database (client or web server apps), nor will it show you access through ad hoc SQL statements. That's why you need proper auditing in place. Unfortunately you cannot do that retrospectively, but there is always Log Miner.
    Cheers, APC
    blog: http://radiofreetooting.blogpsot.com
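
    For the dependency check specifically, a minimal hedged sketch of that query (the owner and table name below are placeholders, not from the post):

    SELECT owner, name, type
      FROM all_dependencies
     WHERE referenced_type  = 'TABLE'
       AND referenced_owner = 'APPS'            -- assumption: owning schema
       AND referenced_name  = 'XX_CONTRACT_IDS' -- assumption: your custom table
     ORDER BY owner, name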

  • Need a BW table to find the ECC tables and fields which are being loaded into BW

    Hi all,
    Do we have any table to find out which ECC tables and fields the data is being loaded from into BW?
    Regards
    Kiran

    Chances are the field will have the same name in the extract structure. You could use SE11 and scan the extract structure for the field.
    You could also ask the user what the data element behind the field is, and then, also in SE11, do a where-used on the data element to see if it is in one of your extract structures.

  • How to get source table name according to target table

    hi all
    another question:
    Once a map has been created and deployed, the corresponding information is stored in the design repository and the runtime repository. My question is how to find the source table name for a given target table, and in which tables these records are recorded.
    Can somebody help me, please?
    Thanks a lot!

    This is a query that will get you the operators in a mapping. To get the sources and targets you will need some additional information, but this should get you started:
    set pages 999
    col PROJECT format a20
    col MODULE format a20
    col MAPPING format a25
    col OPERATOR format a20
    col OP_TYPE format a15
    select mod.project_name PROJECT
    , map.information_system_name MODULE
    , map.map_name MAPPING
    , cmp.map_component_name OPERATOR
    , cmp.operator_type OP_TYPE
    from all_iv_xform_maps map
    , all_iv_modules mod
    , all_iv_xform_map_components cmp
    where mod.information_system_id = map.information_system_id
    and map.map_id = cmp.map_id
    and mod.project_name = '&Project'
    order by 1,2,3;
    Jean-Pierre

  • How to load data dynamically into target tables using files as a source

    Hi ,
    My scenario needs a single interface to load the data of 5 different files into five target tables. All target tables have the same structure. It is possible to point to variable source files in ODI, but the same approach is not working with database tables: I am getting errors while trying to make my target/source table dynamic.
    Can anybody suggest anything? The last option would be writing a dynamic PL/SQL block in the KM. Any other suggestions, friends?
    Regards,
    Atish

    After creating a pair of identical source and target tables, I carried out the following steps (trying to keep just the target as a variable):
    a) created a one-to-one interface,
    b) tested that it runs,
    c) created a variable (type = text),
    d) used the variable as #v_name in the resource name of the target table datastore,
    e) in a package, used the variable in a Set Variable step (first step),
    f) used the interface as the second step,
    g) executed the package in my context.
    The <project_code>.variable_name is not getting substituted in the SQL code generated by the KM. My KM is IKM SQL Control Append, and the following is the code it generates in the "Insert into I$" step:
    /* DETECTION_STRATEGY = NOT_EXISTS */
    insert /*+ APPEND */ into HR.I$_JOBS_COPY1
    (
         JOB_ID,
         JOB_TITLE,
         MIN_SALARY,
         MAX_SALARY,
         IND_UPDATE
    )
    select
         JOBS.JOB_ID     JOB_ID,
         JOBS.JOB_TITLE     JOB_TITLE,
         JOBS.MIN_SALARY     MIN_SALARY,
         JOBS.MAX_SALARY     MAX_SALARY,
         'I' IND_UPDATE
    from     HR.JOBS JOBS
    where     (1=1)
    and not exists (
         select     'X'
         from     HR.#PLAYGROUND."v_tab_name" T
         where     T.JOB_ID     = JOBS.JOB_ID
              and     ((JOBS.JOB_TITLE = T.JOB_TITLE) or (JOBS.JOB_TITLE IS NULL and T.JOB_TITLE IS NULL))
              and     ((JOBS.MIN_SALARY = T.MIN_SALARY) or (JOBS.MIN_SALARY IS NULL and T.MIN_SALARY IS NULL))
              and     ((JOBS.MAX_SALARY = T.MAX_SALARY) or (JOBS.MAX_SALARY IS NULL and T.MAX_SALARY IS NULL))
         )

  • CREATE TARGET TABLE

    Hi All,
    Can we create a target table in ODI in the same way we create a target definition in Informatica? If yes, please let me know, as I'm new to ODI.
    Also, can you please give me a simple example of how a variable works?
    Thanks,
    VBV

    Hi VBV,
    How are you?
    Yes, it is possible to create the target table through an interface at run time.
    In every IKM (on the Flow tab) you can find the CREATE_TARG_TABLE option; if you set it to YES, it will create the table in the back end.
    For variables, please refer to Note 471564.1 on Metalink.
    Thanks,
    G

  • To find the table name that a field comes from

    Hi Friends...
    I know a field name; how can I find the table to which the field belongs?
    Points will surely be awarded...
    Thanks

    Hi,
    To find the master table for the field:
    Go to SE11, enter your field name in the Data type selection, and click Display.
    Double-click the domain and go to the Value range tab. At the bottom you can see the value table.
    With regards,
    Anil Kumar Sharma .P
