Lookup Flat Table and Lookup Qualifier Table

Hi,
Can anybody confirm the following: if I change a field's type from lookup-flat to lookup-qualifier, but the structure of the Lookup Flat table and the Lookup Qualifier table stays the same, will the Java code written to populate the lookup-flat field keep working without any changes against the new type (lookup-qualifier)?
Right now, when I run the code after making this change, no error is raised but no value is assigned to the field.
Another way to ask the question: can a simple Qualifier Lookup table (one without any qualifier fields) be used just like a Lookup Flat table?
I would appreciate it if anybody could help me with this ASAP.
Regards
Shahid Nadeem

Hi Alok,
I think importing qualifiers after the main table import is trickier than you estimate.
1) Map all the fields of the qualified table to source fields, whether they are qualifiers or non-qualifiers.
- We are only able to view the non-qualifier fields in the qualified table.
2) The main step: when you finish all field and value mapping, there will also be a target field on your qualified table carrying the values of the display field. This field must be mapped to a source field in order to import the qualified table; do the value mapping for it as well.
- This target field is the link-up field you are talking about, I feel. It is visible only in the main table and, as I have only one non-qualifier, it will not be a combination of display fields.
3) I am not able to see the non-qualifier in my main table.
So how do I proceed?
Kindly guide
Thanks
Simona

Similar Messages

  • Difference b/w normal tables and PL/SQL tables

    What is the difference b/w normal tables and PL/SQL tables?
    We already have tables, so what is the purpose of PL/SQL tables?
    What are the advantages of PL/SQL tables compared to normal tables?

    user10447332 wrote:
    what is the difference b/w normal tables and pl/sql tables.
    As BluShadow pointed out, "PL/SQL tables" are now called collections. They are memory structures that store data and go away when your program stops running - essentially arrays or "tables" in other programming languages. "Normal" ("heap") tables are stored on disk and are persistent - they store data permanently.
    already we have tables then what is the purpose of pl/sql tables.
    Collections are used to store data temporarily in memory, when it will be accessed shortly or repeatedly, for fast access. This works best with smaller data sets, because if you store too much data in a collection you can run out of memory.
    what are the advantages of pl/sql tables compared to normal tables.
    Collections give you faster access to data you need to read more than once during the same application. They work well for fast lookups and, for smaller data sets, can improve program performance slightly with the use of the BULK COLLECT and FORALL PL/SQL operators under the right conditions.
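    As a rough illustration of the collection idea (a sketch only; the EMP table and ENAME column are just placeholder names), a collection can be loaded once with BULK COLLECT and then read repeatedly from memory:
    DECLARE
        -- an in-memory "PL/SQL table" (nested table collection)
        TYPE t_name_tab IS TABLE OF VARCHAR2(100);
        l_names t_name_tab;
    BEGIN
        -- one read from the normal (heap) table on disk ...
        SELECT ename BULK COLLECT INTO l_names FROM emp;
        -- ... then any number of fast lookups against memory
        FOR i IN 1 .. l_names.COUNT LOOP
            DBMS_OUTPUT.PUT_LINE(l_names(i));
        END LOOP;
    END;
    /
    The collection vanishes when the block ends; the heap table keeps its rows permanently.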

  • SQL Server 2012 Management Studio: In the database, how do I print out or export 3 old dbo tables that were created manually and have a relationship of 1 parent table and 2 child tables? How do I handle this relationship when creating a new XML Schema?

    Hi all,
    Long time ago, I manually created a Database (APGriMMRP) and 3 Tables (dbo.Table_1_XYcoordinates, dbo.Table_2_Soil, and dbo.Table_3_Water) in my SQL Server 2012 Management Studio (SSMS2012). The dbo.Table_1_XYcoordinates has the following columns: file_id,
    Pt_ID, X, Y, Z, sample_id, Boring. The dbo.Table_2_Soil has the following columns: Boring, sample_date, sample_id, Unit, Arsenic, Chromium, Lead. The dbo.Table_3_Water has the following columns: Boring, sample_date, sample_id, Unit, Benzene, Ethylbenzene,
    Pyrene. The dbo.Table_1_XYcoordinates is a Parent Table. The dbo.Table_2_Soil and the dbo.Table_3_Water are 2 Child Tables. The sample_id is the key link for the relationship between the Parent Table and the Child Tables.
    Problem #1) How can I print out or export these 3 dbo Tables?
    Problem #2) If I right-click on the dbo Table, I see "Start PowerShell" and click on it. I get the following error messages: Warning: Failed to load the 'SQLAS' extension: An exception occurred in SMO while trying to manage a service. 
    --> Failed to retrieve data for this request. --> Invalid class.  Warning: Could not obtain SQL Server Service information. An attempt to connect to WMI on 'NAB-WK-02657306' failed with the following error: An exception occurred in SMO while trying
    to manage a service. --> Failed to retrieve data for this request. --> Invalid class.  .... PS SQLSERVER:\SQL\NAB-WK-02657306\SQLEXPRESS\Databases\APGriMMRP\Table_1_XYcoordinates>   What causes this set of error messages? How can
    I get this problem fixed in my PC that is an end user of the Windows 7 LAN System? Note: I don't have the regular version of Microsoft Visual Studio 2012 in my PC. I just have the Microsoft 2012 Shell (Integrated) program in my PC.
    Problem #3: I plan to create an XML Schema Collection in the "APGriMMRP" database for the Parent Table and the Child Tables. How can I handle the relationship between the Parent Table and the Child Table in the XML Schema Collection?
    Problem #4: I plan to extract some results/data from the Parent Table and the Child Table by using XQuery. What kind of JOIN (Left or Right JOIN) should I use in the XQuerying?
    Please kindly help, answer my questions, and advise me how to resolve these 4 problems.
    Thanks in advance,
    Scott Chang    

    In the future, I would recommend that you post your questions one by one, and to the appropriate forum. Of your questions it is really only #3 that fits into this forum. (And that is the one I will not answer, because I have worked very little with XSD.)
    1) Not sure what you mean by "print" or "export", but when you right-click a database, you can select Tasks from the context menu, and in this submenu you find "Export data".
    2) I don't know why you get that error, but any particular reason you want to run PowerShell?
    4) If you have tables, you query them with SQL, not XQuery. XQuery is when you query XML documents, but left and right joins are SQL things. There are no joins in XQuery.
    As for left/right join, notice that these two are equivalent:
    SELECT ...
    FROM   a LEFT JOIN b ON a.col = b.col
    SELECT ...
    FROM   b RIGHT JOIN a ON a.col = b.col
    But please never use RIGHT JOIN - it gives me a headache!
    There is nothing that says that you should use one or the other. In fact, if you are returning rows from parent and child, I would expect an inner join, unless you want to cater for parents without children.
    Here is an example where you can study the different join types and how they behave:
    CREATE TABLE apple (a int         NOT NULL PRIMARY KEY,
                        b varchar(23) NOT NULL)
    INSERT apple(a, b)
       VALUES(1, 'Granny Smith'),
             (2, 'Gloster'),
             (4, 'Ingrid-Marie'),
             (5, 'Milenga')
    CREATE TABLE orange(c int        NOT NULL PRIMARY KEY,
                        d varchar(23) NOT NULL)
    INSERT orange(c, d)
       VALUES(1, 'Agent'),
             (3, 'Netherlands'),
             (4, 'Revolution')
    -- cross join: every apple paired with every orange
    SELECT a, b, c, d
    FROM   apple
    CROSS  JOIN orange
    -- inner join: only rows that match on apple.a = orange.c
    SELECT a, b, c, d
    FROM   apple
    INNER  JOIN orange ON apple.a = orange.c
    -- left outer join: all apples, matching oranges where they exist
    SELECT a, b, c, d
    FROM   apple
    LEFT   OUTER JOIN orange ON apple.a = orange.c
    -- right outer join: all oranges, matching apples where they exist
    SELECT a, b, c, d
    FROM   apple
    RIGHT  OUTER JOIN orange ON apple.a = orange.c
    -- full outer join: all rows from both sides, matched where possible
    SELECT a, b, c, d
    FROM   apple
    FULL OUTER JOIN orange ON apple.a = orange.c
    go
    DROP TABLE apple, orange
    Erland Sommarskog, SQL Server MVP, [email protected]

  • Resolving loops in a star schema with 5 fact tables and 6 dimension tables

    Hello
    I have a star schema, i.e. 5 FACT tables and 7 dimension tables. All fact tables share the same dimension tables; some FACT tables share 3 dimensions, while others share 5 dimensions.
    I did adopt the best practices and, as recommended in the book, I tried to resolve the loops using contexts, as that is the option recommended over aliases in a star schema setting. The contexts are in place, but I still have loops. I also cleared the "Multiple SQL statements for each context" option, but no luck. I need to get this resolved ASAP.

    Hi Patil,
    It is not clear what exactly is the problem. As a starting point you could set the context up so that it only covers the joins from fact to dimension.
    Fact A, joins Dim 1, Dim 2, Dim 3, and Dim 4
    Fact B, joins Dim 1, Dim 2, Dim 3, Dim 4 and Dim 5
    Fact C, joins Dim 1, Dim 2, Dim 3, Dim 4 and Dim 6
    Fact D, joins Dim 1, Dim 2, Dim 3, Dim 4 and Dim 7
    Fact E, joins Dim 1, Dim 2, Dim 4 and Dim 6
    If each of these contexts is defined and covers just the joins from fact to dimension, then you should not get loops.
    If you could lay out your joins like above then it may be possible to specify the contexts/aliases that should work.
    Regards
    Alan

  • Multiple 'logical joins' between a fact table and one dimension table

    It appears that one cannot create multiple ‘logical joins’ between a fact table and one dimension table in OBIEE using the Oracle BI Administration Tool. For example, considering a Business Model with a dimension table TIMES and a fact table FACT containing START_TIME and END_TIME, we would like to create separate logical joins from FACT to TIMES for START_TIME and END_TIME. Obviously, the underlying foreign keys can be created, but as far as I can tell the Oracle BI Administration Tool doesn’t support this. The workaround would be to replicate the TIMES table, but that’s ugly.
    I seek an alternative approach.

    Try this. Create two aliases for the TIMES dimension (Start & End) in the Physical Layer and then remove the foreign key to the "parent" TIMES dimension. Create the foreign keys in the Physical Layer to the new aliases and then create the complex joins in the BMM Layer to the new aliases as well. This will allow you to present both dates within the same table in the Presentation Layer. Not the most elegant solution, but it works.
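    To see why the aliases work, this is roughly the physical SQL they let the BI Server generate (a sketch only; TIME_ID, CALENDAR_DATE and FACT_AMOUNT are made-up column names):
    SELECT f.fact_amount,
           t_start.calendar_date AS start_date,
           t_end.calendar_date   AS end_date
    FROM   fact  f
    JOIN   times t_start ON f.start_time = t_start.time_id   -- alias "Start"
    JOIN   times t_end   ON f.end_time   = t_end.time_id     -- alias "End"
    Each alias behaves as an independent copy of TIMES, so the table itself never has to be replicated.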

  • JOINING local table and DB Link table performance

    Hello everybody,
    can somebody tell me how it works in Oracle DB when you want to join a local table and a DB link table, for example:
    Select a.col1, b.col2
    from local_table a , remote_table@dblink b
    where a.key = b.key;
    And when I use the DRIVING_SITE hint and all the data from the remote table is transferred (pushed) into my DB (into some temporary space), will the indexes from the remote table be transferred as well?
    So when I use such joins, will they be fast enough? Will they use the remote table's indexes to query the data or not?
    Thank you for response.
    Zoffob

    user10708026 wrote:
    Is it as simple as:
    No. Use:
    UPDATE  habitationalsupplement live_hab
       SET  primaryheatingapparatuscode = (
                                            SELECT  hab_backup.primaryheatingapparatuscode
                                              FROM  habitationalsupplement@backupdb hab_backup
                                              WHERE hab_backup.itemobjectid = live_hab.itemobjectid
                                                AND hab_backup.itemversionnumber = live_hab.itemversionnumber
                                           )
      WHERE (itemobjectid,itemversionnumber) IN (
                                                  SELECT  hab_backup.itemobjectid,
                                                          hab_backup.itemversionnumber
                                                    FROM  habitationalsupplement@backupdb hab_backup
                                                 )
    /
    SY.
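    Coming back to the DRIVING_SITE question above: the hint names the table whose database should drive the join, so to push the join to the remote site the original query would look roughly like this (a sketch, reusing the names from the question):
    SELECT /*+ DRIVING_SITE(b) */ a.col1, b.col2
    FROM   local_table a, remote_table@dblink b
    WHERE  a.key = b.key;
    Without the hint, the local database drives the join and only the remote table's rows - not its indexes - are shipped across the link into temporary space.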

  • How to provide joins between oracle tables and sql server tables

    Hi,
    I have a requirement to generate a report from two different databases, i.e. Oracle and SQL Server.
    How can I provide joins between Oracle tables and SQL Server tables? Any help on this?
    Regards,
    Malli

    user10675696 wrote:
    I have a requirement to generate a report from two different databases, i.e. Oracle and SQL Server.
    Bad idea most times. Heterogeneous joins do not exactly scale, and performance can be severely degraded by network speed and bandwidth availability. And there is nothing you can do in the application and database layers to address a performance issue at the network level in this case - your code's performance is simply at the mercy of network performance. With a single glaring fact - network performance is continually degrading. All the time. Always. Until it is upgraded. When the performance degradation starts all over again.
    Unless the tables are small (a few thousand rows each) and the row volumes static, I would not consider doing a heterogeneous join. Instead I would rather go for a materialised view on the Oracle side, use a proper table and index structure, and do a local database join.
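    A minimal sketch of that approach, assuming a heterogeneous database link to the SQL Server instance (here called mssql_link) and made-up table names:
    -- take a local snapshot of the SQL Server table on the Oracle side
    CREATE MATERIALIZED VIEW mv_remote_orders
      REFRESH COMPLETE ON DEMAND
    AS
      SELECT * FROM orders@mssql_link;
    -- the report then runs a purely local join, with normal local indexing
    SELECT o.order_id, c.customer_name
    FROM   mv_remote_orders o
    JOIN   local_customers  c ON c.customer_id = o.customer_id;
    Refresh the materialised view on a schedule that matches how stale the report is allowed to be.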

  • How to find out master tables and the corresponding child tables

    Hi,
    My schema contains 219 tables. I got this result by using the query "select count(*) from user_tables".
    Now I need to identify the master tables and their corresponding child tables among these 219 tables.
    please guide me.
    Thanks and Regards,
    Venkat

    What about this one!!!
    select a.owner,a.table_name,a.column_name,
         '------------------>' as POINTS_TO,b.owner,b.table_name,b.column_name
    from dba_constraints c
         join dba_cons_columns a on ( c.constraint_name = a.constraint_name and c.owner = a.owner)
         join dba_cons_columns b on ( c.r_constraint_name = b.constraint_name and c.r_owner = b.owner)
    where  (a.table_name = '&table' and (a.owner='&owner'))  -- foreign key
    --     and (b.table_name = '&table' and (b.owner='&owner') )  -- source key
         and ( c.constraint_type='R' )
    order by a.table_name
    Comment and uncomment one of these two lines to choose the direction:
         (a.table_name = '&table' and (a.owner='&owner'))  -- foreign key
    --     and (b.table_name = '&table' and (b.owner='&owner') )  -- source key
    Bye, Alessandro
    Edited by: Alessandro Rossi on 22-ott-2008 10.40
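    If you want the whole parent/child picture for your 219 tables in one go, a schema-wide variant (a sketch, assuming the referenced tables live in the same schema) is:
    select c.table_name  as child_table,
           c.constraint_name,
           p.table_name  as parent_table
    from   user_constraints c
           join user_constraints p on p.constraint_name = c.r_constraint_name
    where  c.constraint_type = 'R'
    order  by p.table_name, c.table_name;
    Tables that appear only as parent_table and never as child_table are your top-level master tables.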

  • DB Adapter inserting data into regular tables and LOG / History tables

    I have a requirement to read an XML file using the File Adapter and insert the data into a database.
    I'm using File Adapter --> Mediator --> DB Adapter
    When I insert the data, I have to insert the same data into both the regular tables and the LOG / History tables.
    There is a foreign key relationship between the regular table and the History table. How do I insert into both tables (regular and LOG / History)?
    Thanks in advance

    While configuring the adapter you need to create the relationship between the two tables; when importing the tables, import both of them...
    http://docs.oracle.com/cd/E23943_01/integration.1111/e10231/adptr_db.htm#BDCGCCJB
    go to 9.2.7 Creating Relationships.
    HTH
    N
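    For context, the kind of relationship being imported looks something like this in the database (made-up table names; the LOG/history table carries a foreign key back to the regular table):
    CREATE TABLE orders (
        order_id   NUMBER PRIMARY KEY,
        order_data VARCHAR2(200)
    );
    CREATE TABLE orders_log (
        log_id     NUMBER PRIMARY KEY,
        order_id   NUMBER NOT NULL REFERENCES orders(order_id),
        order_data VARCHAR2(200),
        logged_at  DATE DEFAULT SYSDATE
    );
    Once both tables are imported into the DB Adapter and the relationship is defined (section 9.2.7 in the link above), a single insert operation can write the parent row and the related log row together.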

  • How to synchronize between DHCP binding table and DHCP snooping table?

    I cleared the DHCP snooping table with the command "clear ip dhcp snooping binding", and now the PC can't communicate with the others any more. So how do I synchronize the DHCP binding table and the DHCP snooping table?
    dhcp-test#sh ip dhcp bind
    IP address Client-ID/ Lease expiration Type
    Hardware address
    99.1.65.32 0100.1125.353c.25 Mar 02 1993 01:05 AM Automatic
    99.1.65.33 0100.1438.059f.85 Mar 02 1993 12:01 AM Automatic
    dhcp-test#sh ip dhcp snooping binding
    MacAddress IpAddress Lease(sec) Type VLAN Interface
    Total number of bindings: 0
    thanks!

    ip dhcp snooping binding mac-address vlan vlan-id ip-address interface interface-id expiry seconds
    Add binding entries to the DHCP snooping binding database. The vlan-id range is from 1 to 4094. The seconds range is from 1 to 4294967295.
    Enter the above command for each entry that you add
    To delete the database agent or binding file, use the no ip dhcp snooping database interface configuration command. To reset the timeout or delay values, use the ip dhcp snooping database timeout seconds or the ip dhcp snooping database write-delay seconds global configuration command. To renew the database, use the renew ip dhcp snooping database privileged EXEC command.

  • What's the link between Pur. Tables and KONH & KONP tables

    Hi!
    What's the link between the purchasing tables, i.e. EKKO, EKPO, etc., and the KONH & KONP tables?
    and
    the link between the KONH, KONP & EINA, EINE tables?
    and
    the link between the LFA1 table and the EINA, EINE tables?
    the link between the LFA1 table and the KONH, KONP tables?
    Please reply, guys.
    Thanks,
    Imran.

    Hi Imran,
    EKKO gives you the header data of the purchasing documents (vendor, dates, etc.)
    EKPO gives the item data (the materials ordered, quantity of each material, etc.)
    KONP & KONH store the condition records used for price calculation:
    KONH is the header data
    KONP is the item data
    EINA & EINE are the purchasing info records used in source-of-supply determination and price calculation; the condition record number for pricing is obtained from EINA & EINE.
    EINA holds the general (vendor/material level) data
    EINE holds the purchasing organization level data
    The link to LFA1 is the vendor master data (via the vendor number LIFNR); while picking up the price you can have a different price for each vendor.
    Hope you find it useful.
    Kindly reward helpful answers
    Best regards,
    gaurav

  • Inner join on buffered table and non-buffered table

    Hi guys,
       Can you suggest how I can write a SELECT query using an inner join on a buffered table and a non-buffered table? I have written it like this; is it right?
    SELECT a~vbeln                         
             a~vkorg   
             a~kunnr                         
             a~kunag                         
             a~wadat_ist 
             a~xblnr                         
             b~bukrs
       FROM likp AS a INNER JOIN tvko AS b
        ON a~vkorg = b~vkorg BYPASSING BUFFER
      INTO TABLE itab_likp
      WHERE a~vbeln = s_vbeln.
    Thanks.
    RAJ

    Hi Raj ,
    Please find my comments below:
    1. When you use the BYPASSING BUFFER clause in a SELECT statement, the data in the application server buffer is always ignored and the read is always performed against the physical database. This clause is advisable where you require real-time data (not generally recommended in reporting).
    2. It is advisable not to inner join a small table with a big table. In the query you have written there are 2 tables, LIKP and TVKO, of which TVKO is a master table containing far fewer records than LIKP (which contains a huge amount of delivery header data).
    So my recommendation would be:
    1. Eliminate the BYPASSING BUFFER clause from your query.
    2. Break the select query into 2 separate selects (removing the join) and populate the internal table itab_likp with some ABAP logic.
    This will be the optimal way to write the select query for your scenario. Also ensure that indexes are being used for better selectivity.
    Hope this helps.
    Regards,
    Nikhil

  • Need to get a list of PSA tables and change log tables existing in a PC

    Is there a standard table to look up all active DSOs and the change log tables associated with those DSOs?
    And also the DataSources and the PSA tables associated with each DataSource.
    I need to get a list of the PSA tables and change log tables used in a process chain (which deletes the data in them from time to time). How do I do this quickly?
    Thanks in advance

    Hi Ramya
    Check the RSTSODS table with a filter of User App = CHANGELOG for the change log tables.

  • Master data tables and Transaction data Tables

    Hello Gurus,
    Please let me know how to tell which tables hold master data and which tables hold transaction data, for the FICO module.
    Does anyone have specific material relating to master data tables and transaction data tables?
    Thanks
    Edited by: Manu Rathore on Jan 18, 2012 4:38 AM

    Hi Manu,
    Find attached the table relation diagram by Christopher Solomon. It is one of the most comprehensive charts on this topic.
    deleted
    Warm regards,
    Murukan Arunachalam

  • Difference between Lookup Error Output and Lookup No Match Output

    Hi All
    I'm new to SSIS and reading a book on it, and I am unable to understand the difference between the Lookup Error Output and the Lookup No Match Output in the Lookup transformation. Are these the same, or is there a difference between them? Please guide.
    Thanks
    Regards
    Muz

    Nope. The error output is more generic, i.e. any error happening inside the lookup can be redirected to the error output by default. This includes lookup match failures as well as data conversion errors, truncation errors, etc. Up to SSIS 2005 this was the only output available in addition to the match output inside the lookup, so if you wanted to capture unmatched rows you had to redirect them to the error output itself.
    But from SSIS 2008 onwards there is one more output called the no match output. This gives you the ability to redirect only unmatched rows through the no match output, so that you can distinguish between unmatched rows and other error rows.
    You can configure this via the "Specify how to handle rows with no matching entries" option in the Lookup Editor. You can choose Ignore failure, Redirect rows to no match output, or Redirect rows to error output.
    Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs

Maybe you are looking for

  • Adding user review comments to each item

    I am creating my site in Muse and then will be publishing to BC. My website will be have a least 7,000 items on it. I want to be able to have user review comments for EACH item. will I be able to do this using BC? I am guessing every item would have

  • How to start a selection from an exact time of the clip?

    Hello, I'm a new user of iMovie '08 and I'd like to know how to start the selection of a part of a clip from an exact point. Let me explain (I'm not english, so I hope everyone understands) : Let's say I have one clip that lasts 1h20' ok? I remember

  • Intel SSD 520 in My MBP 2007?

    I have have endless problems with Hard Disks in my Late 2007 MacBook Pro and am trying to decide whether or not to switch to a SSD for it.... Has anybody tried it before???

  • How to ration flatmates bandwidth

    An old problem ( that should have been fixed in the 1990s !)... I'm in a house with other net users. When one of their computers downloads a file, it hogs all the bandwidth it can, which jams up all the other connections, like any online games become

  • Continue / Retake Quiz buttons act differently when Publishing in SWF vs SWF/HTML5

    Hi, I use Captivate 7 and when I compare what I publish as SWF/HTML5 with SWF only it seems that there are 2 issues with the SWF/HTML5  site on Chrome (also on Safari) Issue 1 I have 1 quiz on a course and I have set the Quiz setting to require a Pas