Tables/Joins for logical tables being combined

I have two logical tables set up, each with multiple physical table sources (mapped to these tables). Some of the physical sources are the same in each logical table. When I make selections in Answers and look at the generated SQL, it picks physical tables from both logical tables (instead of just the one I used). It is also grabbing the content (the additional 'where' condition) from the other logical table as well. Why is it doing that?

Yes, this is something of a limitation in OBIEE, but there are some workarounds, which may deviate from best practices.
Sol 1. Snowflake the dimensions in the Business Model: rather than merging both of them into a single logical table, separate them into two different logical tables.
Sol 2. For the logical table, create two LTSs. The first LTS maps both physical tables, with all columns mapped to both tables. The second LTS maps only the main table, with only the columns belonging to that table. This way, when you select only columns related to the first source, the query will use the second LTS, which has only one table.
- Madan Thota
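
A minimal, hypothetical illustration of Sol 2 (the tables MAIN_T and EXTRA_T, the columns, and the filters are invented, not from the original post): with a single LTS that maps both physical tables, a request that only needs columns from MAIN_T can still be generated against both tables, including the other source's content filter; once a second LTS maps MAIN_T alone, the BI Server can generate the narrower query.

-- Physical SQL the single, two-table LTS tends to produce for a request
-- that only touches columns of MAIN_T (hypothetical names throughout):
SELECT m.col1
FROM   main_t  m,
       extra_t e
WHERE  m.id = e.main_id
  AND  m.source_flag = 'A'
  AND  e.source_flag = 'B';   -- content filter pulled in from the other source

-- Physical SQL the same request can produce once a second LTS maps only MAIN_T:
SELECT m.col1
FROM   main_t m
WHERE  m.source_flag = 'A';   -- only this source's own content filter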

Similar Messages

  • Outer join between logical tables

    Hello,
    This question has likely been asked many times, but I failed to find the proper thread in the forum.
    Assume there are 2 logical tables "Fact" and "Dim".
    "Fact" has 1 LTS which consists of physical tables F, FX1, FX2 which are inner joined.
    "Dim" has 1 LTS which consists of physical tables D, DX1, DX2 which are inner joined.
    F and D tables are also joined together on physical layer.
    I define left outer join between "Fact" and "Dim" on logical layer.
    I create a request in Answers, querying columns from "Fact" and "Dim" which map to physical tables F and D only.
    I expect OBIEE to build SQL query which uses F and D tables only, outer joined.
    Instead, all the physical tables used for the logical tables "Fact" and "Dim" are joined together, in a form like:
    SELECT F.col, D.col
    FROM (F inner join FX1 inner join FX2) left outer join (D inner join DX1 inner join DX2)
    Is there any way to avoid this behavior or build data model in different way?
    Thank you!

    Hi Alex,
    In BI Applications you never have a null in a fact's foreign key to the dimension. Instead, a record with a row WID of zero is always inserted, with 'Unspecified' in all the columns. When the fact table is populated in the ETL, if a fact record doesn't have a corresponding dimension record, the WID is populated with zero.
    This removes the problems with outer joins and helps considerably with performance.
    This is one practice you could take from BI Apps and apply to your own ETL and data model.
    Regards
    Robin
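
    A rough sketch of the ETL pattern Robin describes, with hypothetical table and column names (W_SALES_F, W_PRODUCT_D and STG_SALES are not from the post): any fact row whose dimension lookup fails gets the key 0, the row WID of the pre-seeded 'Unspecified' dimension record, so reports can use plain inner joins.

    -- Hypothetical fact load: unresolved dimension lookups default to ROW_WID = 0,
    -- the 'Unspecified' record seeded in the dimension beforehand.
    INSERT INTO w_sales_f (product_wid, sales_amt)
    SELECT NVL(d.row_wid, 0),
           s.sales_amt
    FROM   stg_sales s
           LEFT OUTER JOIN w_product_d d
             ON d.integration_id = s.product_id;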

  • Incorrectly defined logical table source (for fact table X

    Hi!
    Imagine the following Physical Diagram:
    - Dim A
    - Dim B
    - Fact A
    - Fact B
    Joins:
    - Dim A is parent of Dim B
    - Fact B has a FK to Dim B
    - Fact A has a FK to Dim A
    Business Layer:
    - Logical Table Dim A
    - Logical Table Dim B
    - Logical Table Fact A
    - Logical Table Fact B
    Joins:
    - the same joins (not FK joins) as in the Physical Layer
    When we build a report that only has one column of Dim A and one column of Dim B (A is parent of B), the following error appears:
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 15018] Incorrectly defined logical table source (for fact table "Fact A") does not contain mapping for [Dim A.col1, Dim B.col1]. (HY000)
    What is wrong?
    Help!
    Thanks.

    Hi,
    Joins between dimensions go via a fact table, and in your case there is no common fact table.
    You can solve this by dragging a field from the physical table Dim B onto the displayed logical table source of the logical table Dim A.
    The BI Server then knows that Dim A and Dim B have a physical relationship.
    Regards
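
    Once the physical table behind Dim B is mapped into Dim A's logical table source as described, a request with one column from each dimension can be served by a direct join between the two physical tables. An illustrative sketch, with invented physical table and column names:

    -- Illustrative physical SQL the BI Server can generate after the mapping
    -- (dim_a_phys, dim_b_phys and the join column are hypothetical):
    SELECT a.col1,
           b.col1
    FROM   dim_a_phys a
           JOIN dim_b_phys b
             ON b.dim_a_id = a.dim_a_id;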

  • Incorrectly defined logical table source (for fact table Facts) does not

    Hi,
    I have two dimensions, A and B. A is joined to B by a foreign key.
    The report works if I pull B.Column1, A.Column2.
    The report throws an error if I try to change the order of the columns, like this: A.Column2, B.Column1.
    error : Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
    File: odbcstatementimpl.cpp, Line: 186
    State: S1000. Code: 10058. [NQODBC] [SQL_STATE: S1000] [nQSError: 10058] A general error has occurred. [nQSError: 15018] Incorrectly defined logical table source (for fact table Facts) does not contain mapping for B.Column1
    I am not sure where it is going wrong.
    Thanks
    jagadeesh
    Edited by: Jagadeesh Kasu on Jun 16, 2009 4:22 PM

    Did you make the joins in the LTS or on the physical tables?
    Try making the joins in the LTS if they are not there.

  • [nQSError: 14044] Missing join between logical tables

    Hi All,
    I have three physical tables:
    A- Dimension (Contact) B- Helper (Con-Prod) C- Dimension (Product). 'A' joins to facts.
    Relationships are:
    A:B=1:M and B:C=N:1
    Currently a column of table B has been implemented as an MLOV. As a result, I now have one additional MLOV physical table 'D' that joins to 'B' (since it is B.MLOV_WID = D.MLOV_WID, it is not a foreign key join).
    The Logical Layer has three logical tables, A, B and C, as in the Physical Layer. Table B has one new LTS for 'D'.
    Now the Problem is when I take a column that is sourced from D and another from C, it generates an error in Answers: Missing join between logical tables B and C. I have verified that the Logical and Physical joins exist.
    I think this has something to do with the logical level setup, so here is some more information on the hierarchy setup. I have one hierarchy (Contact) for all those logical tables. I have set up a level for the MLOV column, but I don't have levels set up for most of the non-MLOV columns.
    Can you please share your thoughts? I would like to avoid the implicit join method.

    Hi,
    It seems like you are pulling a report from 2 tables with NO physical join.
    Please check the physical layer diagram and join the 2 tables.
    Thanks,
    Vineeth

  • Joins for INV tables  R12

    Hi,
    Could you provide the joins for the tables below:
    mtl_system_items_b
    mtl_kanban_cards
    mtl_onhand_quantities_detail
    MTL_SECONDARY_INVENTORIES
    MTL_ITEM_CATEGORIES
    MTL_CATEGORY_SETS_TL
    MTL_CATEGORIES_B
    Thanks
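
    As a sketch only (these join conditions are not from this thread; confirm the exact columns in eTRM for your R12 release before relying on them), the joins commonly used between these inventory tables look roughly like this:

    -- Sketch of commonly used joins between the listed MTL tables (verify in eTRM).
    SELECT msi.segment1                   item,
           sec.secondary_inventory_name   subinventory,
           cat.segment1                   category,
           moqd.transaction_quantity      onhand_qty
    FROM   mtl_system_items_b                msi
           JOIN mtl_onhand_quantities_detail moqd
             ON  moqd.inventory_item_id = msi.inventory_item_id
             AND moqd.organization_id   = msi.organization_id
           JOIN mtl_secondary_inventories    sec
             ON  sec.secondary_inventory_name = moqd.subinventory_code
             AND sec.organization_id          = moqd.organization_id
           JOIN mtl_item_categories          mic
             ON  mic.inventory_item_id = msi.inventory_item_id
             AND mic.organization_id   = msi.organization_id
           JOIN mtl_category_sets_tl         mcs
             ON  mcs.category_set_id = mic.category_set_id
             AND mcs.language        = USERENV('LANG')
           JOIN mtl_categories_b             cat
             ON  cat.category_id = mic.category_id
           LEFT JOIN mtl_kanban_cards        kc
             ON  kc.inventory_item_id = msi.inventory_item_id
             AND kc.organization_id   = msi.organization_id;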


  • Unicode Export - unable to retrieve nametab info for logic table BSEG

    Hi
    We are performing a Unicode export (CUUC from a 4.6C upgrade to ECC 6.0) and we have encountered this error.
            Without ORDER BY PRIMARY KEY the exported data may be unusable for some databases
    Our OS is HPUX11.31 & Database is 10.2.0.2
    myCluster (63.21.Exp): 1610: inconsistent settings for table position validity detected.
    myCluster (63.21.Exp): 1611: nametab says table positions are valid.
    myCluster (63.21.Exp): 1614: alternate nametab says table positions are not valid.
    myCluster (63.21.Exp): 1617: for field 310 of nametab displacement is 1877, yet dbtabpos shows 1885.
    myCluster (63.21.Exp): 1621: character length is 1 (in) resp. 2 (out).
    myCluster (63.21.Exp): 1257: unable to retrieve nametab info for logic table BSEG      .
    myCluster (63.21.Exp): 8358: unable to acquire nametab info for logic table BSEG      .
    myCluster (63.21.Exp): 2949: failed to convert cluster data of cluster item.
    myCluster: RFBLG      *400**AT10**0000100000**2004*
    myCluster (63.21.Exp): 322: error during conversion of cluster item.
    myCluster (63.21.Exp): 323: affected physical table is RFBLG.
    (CNV) ERROR: data conversion failed.  rc = 2
    (DB) INFO: disconnected from DB
    /usr/sap/SBX/SYS/exe/run/R3load: job finished with 1 error(s)
    /usr/sap/SBX/SYS/exe/run/R3load: END OF LOG: 20081102104452
    We checked Note 913783 as per the CUUC guide, but the correction is only for packages SAPKB70004 to 6, and we are on package SAPKB70011.
    We found two notes:
    1. Note 1238351 - Hom./Het.System Copy SAP NW 7.0 incl. Enhancement Package 1
    :Solution:
    There are two possible workarounds:
    1. Modify DDL<dbs>.TPL (<dbs> = ADA, DB2, DB4, DB6, IND, MSS, ORA) BEFORE the R3load TSK files are generated;
                  search for the keyword "negdat:" and add "CLU4" and "VER_CLUSTR" to this line.
    2. Modify the TSK file (most probably SAPCLUST.TSK) BEFORE the R3load import is (re-)started;
                  search for the lines starting with "D CLU4 I" and "D VER_CLUSTR I" and change the status (i.e. "err" or "xeq") to "ign" or remove the lines.
    I tried the first workaround by editing the DDL*.TPL file, but it just skips the table and marks it as completed. That is not a good solution, as we would be missing the data from the table RFBLG.
    2. Note 991401 - SYSCOPY EXPORT FAILS:SAPCLUST:ERROR: Code page conversion:
    Solution
    Activate the table.
    Then call the RADCUCNT report. Do not change the selected parameters, but ensure that 'Overwrite Entries' is selected.  Set the 'Unicode Length' to 2 and fill the last two fields 'Type' and 'Name' with TABL and TACOPAB respectively. Then select 'No Log' or specify a log name.
    Execute the RADCUCNT report and restart the export.
    We have not tried this solution because SAP is still down and the CDCLS job is still running.
    We would like to know whether you have faced any issues like the above one and what is your suggested approach and solution.
    Is it safe to start SAP now (when the CDCLS job runs) and then try to activate the table RFBLG?
    Regards
    Senthil
    Edited by: J. Senthil Murugan on Nov 3, 2008 1:41 AM
    Edited by: J. Senthil Murugan on Nov 3, 2008 3:36 AM

    Hi Senthil,
    If you completed your pre-conversion steps before the upgrade and the post-upgrade steps successfully, you should not see the errors below. However, changes to SPDD tables can sometimes also have an impact during conversion and cause nametab errors. The RADCUCNT program runs at the end of the upgrade to update the nametab tables with any changes that happened during the upgrade.
    You can run any number of exports until the jobs complete successfully, but while the export is running you should not bring SAP up.
    The tables you mentioned are all cluster tables, and since CDCLS is the biggest of them it will take hours to complete, depending on the size of your database.
    Do not play around with the .TSK file unless you are sure you want to re-execute it. Your first workaround is skipped because there may be multiple copies of the same .TSK file present locally (where you are running the distribution monitor or sapinst) and on the common directory. You may also want to look at the .TSK.bkp files, because R3load reads them and creates a new .TSK. This is not complicated, but it is tricky.
    The second possibility is to update the changed tables (e.g. RFBLG) as conversion tables. Follow the note, but make sure no R3load processes are running before you start SAP. If you don't want to wait long and you are sure you can restart the other running processes, you can kill them and start SAP. Specify only your error tables and follow the instructions given in the note. Once done, bring down the SAP application and restart the export process using sapinst or the distribution monitor.
    Regards,
    Vamshi.

  • Unable to retrieve nametab info for logic table BSEG during Database Export

    Hi,
    Our aim is to migrate to new hardware by doing a database export of the existing (Unicode) system and importing it on the new hardware.
    I am doing the database export on SAP 4.7 SR1, HP-UX, Oracle 9i (Unicode system), and during the "Post Load Processing" phase of the export I got the error shown in SAPCLUST.log:
    more SAPCLUST.log
    /sapmnt/BIA/exe/R3load: START OF LOG: 20090216174944
    /sapmnt/BIA/exe/R3load: sccsid @(#) $Id: //bas/640_REL/src/R3ld/R3load/R3ldmain.c#20
    $ SAP
    /sapmnt/BIA/exe/R3load: version R6.40/V1.4 [UNICODE]
    Compiled Aug 13 2007 16:20:31
    /sapmnt/BIA/exe/R3load -ctf E /nas/biaexp2/DATA/SAPCLUST.STR /nas/biaexp2/DB/DDLORA.T
    PL /SAPinst_DIR/SAPCLUST.TSK ORA -l /SAPinst_DIR/SAPCLUST.log
    /sapmnt/BIA/exe/R3load: job completed
    /sapmnt/BIA/exe/R3load: END OF LOG: 20090216174944
    /sapmnt/BIA/exe/R3load: START OF LOG: 20090216182102
    /sapmnt/BIA/exe/R3load: sccsid @(#) $Id: //bas/640_REL/src/R3ld/R3load/R3ldmain.c#20
    $ SAP
    /sapmnt/BIA/exe/R3load: version R6.40/V1.4 [UNICODE]
    Compiled Aug 13 2007 16:20:31
    /sapmnt/BIA/exe/R3load -datacodepage 1100 -e /SAPinst_DIR/SAPCLUST.cmd -l /SAPinst_DI
    R/SAPCLUST.log -stop_on_error
    (DB) INFO: connected to DB
    (DB) INFO: DbSlControl(DBSL_CMD_NLS_CHARACTERSET_GET): UTF8
    (GSI) INFO: dbname   = "BIA20071101021156                                                                               
    (GSI) INFO: vname    = "ORACLE                          "
    (GSI) INFO: hostname = "tinsp041                                                    
    (GSI) INFO: sysname  = "HP-UX"
    (GSI) INFO: nodename = "tinsp041"
    (GSI) INFO: release  = "B.11.11"
    (GSI) INFO: version  = "U"
    (GSI) INFO: machine  = "9000/800"
    (GSI) INFO: instno   = "0020293063"
    (EXP) TABLE: "AABLG"
    (EXP) TABLE: "CDCLS"
    (EXP) TABLE: "CLU4"
    (EXP) TABLE: "CLUTAB"
    (EXP) TABLE: "CVEP1"
    (EXP) TABLE: "CVEP2"
    (EXP) TABLE: "CVER1"
    (EXP) TABLE: "CVER2"
    (EXP) TABLE: "CVER3"
    (EXP) TABLE: "CVER4"
    (EXP) TABLE: "CVER5"
    (EXP) TABLE: "DOKCL"
    (EXP) TABLE: "DSYO1"
    (EXP) TABLE: "DSYO2"
    (EXP) TABLE: "DSYO3"
    (EXP) TABLE: "EDI30C"
    (EXP) TABLE: "EDI40"
    (EXP) TABLE: "EDIDOC"
    (EXP) TABLE: "EPIDXB"
    (EXP) TABLE: "EPIDXC"
    (EXP) TABLE: "GLS2CLUS"
    (EXP) TABLE: "IMPREDOC"
    (EXP) TABLE: "KOCLU"
    (EXP) TABLE: "PCDCLS"
    (EXP) TABLE: "REGUC"
    myCluster (55.16.Exp): 1557: inconsistent field count detected.
    myCluster (55.16.Exp): 1558: nametab says field count (TDESCR) is 305.
    myCluster (55.16.Exp): 1561: alternate nametab says field count (TDESCR) is 304.
    myCluster (55.16.Exp): 1250: unable to retrieve nametab info for logic table BSEG   
    myCluster (55.16.Exp): 8033: unable to retrieve nametab info for logic table BSEG   
    myCluster (55.16.Exp): 2624: failed to convert cluster data of cluster item.
    myCluster: RFBLG      *003**IN07**0001100000**2007*
    myCluster (55.16.Exp): 318: error during conversion of cluster item.
    myCluster (55.16.Exp): 319: affected physical table is RFBLG.
    (CNV) ERROR: data conversion failed.  rc = 2
    (RSCP) WARN: env I18N_NAMETAB_TIMESTAMPS = IGNORE
    (DB) INFO: disconnected from DB
    /sapmnt/BIA/exe/R3load: job finished with 1 error(s)
    /sapmnt/BIA/exe/R3load: END OF LOG: 20090216182145
    /sapmnt/BIA/exe/R3load: START OF LOG: 20090217115935
    /sapmnt/BIA/exe/R3load: sccsid @(#) $Id: //bas/640_REL/src/R3ld/R3load/R3ldmain.c#20
    $ SAP
    /sapmnt/BIA/exe/R3load: version R6.40/V1.4 [UNICODE]
    Compiled Aug 13 2007 16:20:31
    /sapmnt/BIA/exe/R3load -datacodepage 1100 -e /SAPinst_DIR/SAPCLUST.cmd -l /SAPinst_DI
    R/SAPCLUST.log -stop_on_error
    (DB) INFO: connected to DB
    (DB) INFO: DbSlControl(DBSL_CMD_NLS_CHARACTERSET_GET): UTF8
    (GSI) INFO: dbname   = "BIA20071101021156                                                                               
    (GSI) INFO: vname    = "ORACLE                          "
    (GSI) INFO: hostname = "tinsp041                                                    
    (GSI) INFO: sysname  = "HP-UX"
    (GSI) INFO: nodename = "tinsp041"
    (GSI) INFO: release  = "B.11.11"
    (GSI) INFO: version  = "U"
    (GSI) INFO: machine  = "9000/800"
    (GSI) INFO: instno   = "0020293063"
    myCluster (55.16.Exp): 1557: inconsistent field count detected.
    myCluster (55.16.Exp): 1558: nametab says field count (TDESCR) is 305.
    myCluster (55.16.Exp): 1561: alternate nametab says field count (TDESCR) is 304.
    myCluster (55.16.Exp): 1250: unable to retrieve nametab info for logic table BSEG   
    myCluster (55.16.Exp): 8033: unable to retrieve nametab info for logic table BSEG   
    myCluster (55.16.Exp): 2624: failed to convert cluster data of cluster item.
    myCluster: RFBLG      *003**IN07**0001100000**2007*
    myCluster (55.16.Exp): 318: error during conversion of cluster item.
    myCluster (55.16.Exp): 319: affected physical table is RFBLG.
    (CNV) ERROR: data conversion failed.  rc = 2
    (RSCP) WARN: env I18N_NAMETAB_TIMESTAMPS = IGNORE
    (DB) INFO: disconnected from DB
    /sapmnt/BIA/exe/R3load: job finished with 1 error(s)
    /sapmnt/BIA/exe/R3load: END OF LOG: 20090217115937
    SAPCLUST.log (97%)
    The main error is "unable to retrieve nametab info for logic table BSEG".
    Your reply to this issue is highly appreciated
    Thanks
    Sunil

    Hello,
    according to this output:
    /sapmnt/BIA/exe/R3load -datacodepage 1100 -e /SAPinst_DIR/SAPCLUST.cmd -l /SAPinst_DI
    R/SAPCLUST.log -stop_on_error
    you are doing the export with a non-unicode SAP codepage. The codepage has to be 4102/4103 (see note #552464 for details). There is a screen in the sapinst dialogues that allows the change of the codepage. 1100 is the default in some sapinst versions.
    Best Regards,
    Michael

  • Table join and Logical database in the same query

    Hi Experts,
    I have an existing query which was created using a table join. I now want to add a logical database to the same InfoSet.
    Is it possible to add a logical database to the same InfoSet?
    Regards,
    Prakhar

    Hi Prakhar,
    If you look at the screen for creating an InfoSet, you can choose as the data source either a table join using basis tables or a logical database.
    But if you need an LDB together with some additional table fields, you can do the data retrieval with a 'data retrieval by program' step, or use the LDB and add some code inside your InfoSet.
    Regards,
    Prasenjit

  • Joins for Quality Tables

    Hi,
    Could you give me the joins for the tables below (R12):
    qa_results
    qa_results_v
    cs_incidents_audit_b
    Thanks.

    You can get them from eTRM; refer to this link: http://etrm.oracle.com/pls/et1211d9/etrm_fndnav.show_object?n_tabid=55117&n_appid=250&c_type=TABLE or search at http://etrm.oracle.com/pls/et1211d9/etrm_search.search Thanks

  • I get this error : "content filter of a source for logical table" while I run the Global consistency check.

    ERRORS:
    Business Model DAC Measures:
    [nQSError: 14031] The content filter of a source for logical table: D_END_TIME references multiple dimensions.
    [nQSError: 15001] Could not load navigation space for subject area DAC Measures.
    Thank you!

    Yes! My Task hierarchy has 3 dimension tables that form a hierarchy: Execution Plan -> Tasks -> Detail.
    All 3 levels in the hierarchy are 3 different dimension tables.

  • Unable to retrieve nametab info for logic table BSEG

    (Same question, log, and notes as in "Unicode Export - unable to retrieve nametab info for logic table BSEG" above.)

    Dear Senthil,
    I faced this issue earlier.
    Table BSEG requires activity in the ACT phase, such as activation.
    If the ACT phase is done using the transports and manual activation of this table is not performed, this issue arises.
    Please share the relevant information; it seems some steps were missed or not carried out properly in the CU&UC phase.
    Otherwise, we applied the solution from Note 991401 - SYSCOPY EXPORT FAILS:SAPCLUST:ERROR: Code page conversion, and it worked well.
    But you need to be sure that this table was changed (activated, etc.) during the upgrade, up to the export phase.
    The issue is that the nametab info is created during the upgrade phase in CU&UC, and if this table is touched afterwards, that nametab info no longer matches the changed runtime object.
    With RADCUCNT the nametab info will be created again.
    All the best,
    Best Regards,
    Deepak Dhawan

  • Is it really another error about full table scans for small tables....?????

    Hi ,
    I have posted the following:
    Full Table Scans for small tables... in Oracle10g v.2
    and the first reply, from Mr. Chris Antognini, was this:
    "I'm sorry to say that the documentation is wrong! In fact when a full table scan is executed, and the blocks are not cached, at least 2 I/O are performed. The first one to get the header block (where the extent map is stored) and the second to access the first and, for a very small table, only extent."
    Is it really wrong?
    Thanks...
    Sim

    Fredrik,
    I am not saying in any way that the documentation is wrong on this point.
    In my first post, I inserted a link to a thread in another forum:
    Full Table Scans for small tables... in Oracle10g v.2
    Christian Antognini has written that the documentation is wrong....
    I'm sorry to say that the documentation is wrong!
    In fact when a full table scan is executed, and the
    blocks are not cached, at least 2 I/O are performed. The
    first one to get the header block (where the extent map
    is stored) and the second to access the first and, for a
    very small table, only extent.
    I'm just wondering whether he is right!
    Thanks..
    Sim
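
    One hedged way to check this claim yourself, assuming a small test table (SMALL_T here is a hypothetical name) and the privilege to flush the buffer cache, is to compare the session's 'physical reads' statistic before and after a forced full scan:

    -- Illustrative check only (SMALL_T is hypothetical; needs ALTER SYSTEM privilege).
    -- v$mystat values are cumulative for the session, so note the value before and
    -- after the scan and look at the difference.
    ALTER SYSTEM FLUSH BUFFER_CACHE;       -- make sure the table blocks are not cached

    SELECT n.name, s.value                 -- snapshot BEFORE the scan
    FROM   v$mystat s JOIN v$statname n ON n.statistic# = s.statistic#
    WHERE  n.name = 'physical reads';

    SELECT /*+ FULL(t) */ COUNT(*) FROM small_t t;   -- force the full table scan

    SELECT n.name, s.value                 -- snapshot AFTER the scan
    FROM   v$mystat s JOIN v$statname n ON n.statistic# = s.statistic#
    WHERE  n.name = 'physical reads';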

  • How to hide table header for empty table

    Hi,
    I want to hide the table header for all tables which don't contain any data in my Adobe form. How can I do this? Helpful answers will be rewarded.

    Hi Aliaksandr,
    You can use JavaScript to do this dynamically.
    For example, I used Adobe Designer 7.1 to add a table to a subform.
    Now, I have the object hierarchy as:
    Level 1 - form1
    Level 2 - form2
    Level 3 - Table1
               -->HeaderRow
                    --> Cell1
                    --> Cell2
               -->Row1
                    --> Cell1
                    --> Cell2
    Now, I select the Table1 element and write the JavaScript, which is executed on initialization, as:
    // hide the header row if the first data cell is empty (rawValue is null for empty cells)
    if (this.Row1.Cell1.rawValue == null || this.Row1.Cell1.rawValue == "") {
        this.HeaderRow.presence = "hidden";
    }
    This checks whether the first row is empty and, if so, hides the header in the layout.
    You can use something similar for your requirement.
    Hope this helps,
    Siddhartha Jain

  • About Table Maintenance for a Table

    Hi All
    I would like to create table maintenance for a table. Kindly explain how to create table maintenance for a table, and what is the actual purpose of table maintenance?
    Thanks in advance.

    Hi Joe,
    The Table Maintenance Generator is used to manually input values using transaction SM30.
    Follow these steps:
    1) Go to SE11 and check the table maintenance checkbox under the attributes tab.
    2) Go to Utilities -> Table Maintenance Generator. Create a function group and assign it in the function group input box; also assign the default authorization group &NC&.
    3) Select the standard recording routine radio button in the Table Maintenance Generator to move the table contents to quality and production by assigning them to a request.
    4) Select the maintenance type as single step.
    5) For the maintenance screen, use system-generated numbers; this dialog box appears when you click the Create button.
    6) Save and activate the table.
    http://help.sap.com/saphelp_nw04/helpdata/en/cf/21ed2d446011d189700000e8322d00/content.htm
    http://help.sap.com/saphelp_46c/helpdata/en/a7/5133ac407a11d1893b0000e8323c4f/frameset.htm
    /message/2831202#2831202 [original link is broken]
    One step vs. two step in the Table Maintenance Generator:
    Single step: only the overview screen is created, i.e. the table maintenance program will have only one screen where you can add, delete or edit records.
    Two step: two screens, namely the overview screen and the single screen, are created. The user can see the key fields in the first screen and can then go on to edit further details.
    Please reward if it is useful,
    Mahi.
