Different granularity fact tables (NOT aggregated tables)

Hi!
Imagine the following Physical Layer:
- Dimension Table "Year" (one-to-many FK relationship with "Month")
- Dimension Table "Month" (one-to-many FK relationship with "Day" and "Sales_Month")
- Dimension Table "Day" (one-to-many FK relationship with "PhoneCalls_Day")
- Fact Table "Sales_Month"
- Fact Table "PhoneCalls_Day"
Imagine the following BMM:
- Dimension Logical Table "Time" (which comprises the physical tables Year, Month and Day), with a one-to-many relationship to the two Fact Logical Tables below
- Fact Logical Table "Sales_Month"
- Fact Logical Table "PhoneCalls_Day"
Both Fact Logical Tables ("Sales_Month" and "PhoneCalls_Day") are related to the Dimension Logical Table "Time". But no aggregation content is defined in the LTS of either Fact Logical Table, since they are not aggregate tables: one holds Sales per Month and the other Phone Calls per Day, and Sales is obviously not an aggregation of PhoneCalls.
When I use Answers to simply report Sales per Month (Qty_Sales, Month), the generated SQL joins the "Sales_Month" physical table with the "Month" physical table, but it also joins the "Month" physical table with the "Day" physical table... Why?
What should I do to avoid this last join?
Thanks!!

Thanks a lot!!
It worked perfectly.
Conclusion: I should create a separate LTS for each level to which a fact table is directly joined (this applies to non-aggregate tables; aggregate tables should work fine with Aggregation Content...).
If I had a fact table per Year, another per Month and another per Day, I would need 3 LTSes.
Since I only have one fact table per Month and another per Day, I created only 2 LTSes: Month (which includes the join with Year) and Day.
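For reference, this is roughly the physical SQL one would expect for the Sales-per-Month report once the Month-level LTS is in place (table and column names are illustrative, following the model above, not taken from an actual query log):

select M.Month_Name as c1,
       sum(S.Qty_Sales) as c2
from Sales_Month S
join Month M on M.Month_Id = S.Month_Id
group by M.Month_Name
-- no join to Day: the Month-level LTS tells the BI server that
-- Sales_Month is sourced at the Month grain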
Thanks.

Similar Messages

  • DataSource attribute if table not valid

    Hi
    I have upgraded a Java Web Dynpro application from 6.40 to 7.
    When I run the application I get the following error in 7:
    DataSource attribute if table not valid  [Table table4 in view class com.sap.tc.webdynro.progmodel.generation.DelegatingView]
    Please help.
    Thanks

    jj

  • Fact tables with different granularity

    We currently have 3 dimensions (Site, Well, Date) and 2 fact tables (GasEngine, GasField), both having granularity of a day.
    GasEngine is linked to Site and Date
    GasField is linked to Site, Well and Date
    We now have a requirement to make the GasEngine fact table have a granularity of an hour but keep GasField at a day.
    We therefore must include a new Time dimension, which would only be linked to GasEngine.
    Is it ok to have a DW with these two fact tables having different granularity? 
    And would we therefore require two separate cubes for querying this data?

    Hi Rajendra and Visakh16,
    Based on your input to this thread, I would like to ask a question just to fine-tune my knowledge of data modelling. In Darren's case I guess his date dimension only stores records down to day-level granularity. Now the requirement is to make the "GasEngine" fact table hold data at hour granularity.
    Now based on Rajendra’s input
    “Yes, you can have. but why you need new time dimension, I recommend, make GasEngine fact to
    hour granularity.”
    How could Darren display data for each hour without having a time dimension attached to the GasEngine fact table? With the existing date dimension he can ONLY display aggregated data with a minimum granularity of day level.
    One could modify the date dimension to hold time records, but that would complicate the date dimension considerably. Instead, why can't Darren have a separate time dimension which holds ONLY time-related data, put a time key in the GasEngine fact table, and relate those tables using the time key (see the sketch after this message)? This way, doesn't Darren's data model become more readable and simplified? Since a time dimension just provides another way of slicing and dicing the data, I do not think it makes Darren's cube a complex STAR schema.
    I could be totally wrong, so for the sake of knowledge, for Darren and me, I am asking the question of both of you.
    Best regards…   
    Chandima Lakmal Fonseka
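    A minimal sketch of the separate time dimension Chandima describes; TIME_DIM, TIME_KEY and the column lists are illustrative assumptions, not names from the thread:

    -- Time-of-day dimension: 24 rows at hour grain, kept separate from the date dimension
    CREATE TABLE TIME_DIM (
        TIME_KEY    INT PRIMARY KEY,  -- e.g. 0..23
        HOUR_OF_DAY INT NOT NULL,
        AM_PM       CHAR(2) NOT NULL
    );

    -- GasEngine moves to hour grain by adding a time key; GasField stays at
    -- day grain and never references TIME_DIM
    CREATE TABLE GAS_ENGINE_FACT (
        SITE_KEY       INT NOT NULL,  -- FK to the existing Site dimension
        DATE_KEY       INT NOT NULL,  -- FK to the existing day-grain Date dimension
        TIME_KEY       INT NOT NULL,  -- FK to TIME_DIM above
        ENGINE_MEASURE DECIMAL(18,4)
    );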

  • Mix source data with different granularity into the same fact table?

    I have two transaction tables, "Incident" (157 columns) and "Unit" (70 columns). For every "Incident" that happens there can be one or more records in the "Unit" table.
    As part of my data mart design, I have merged both tables into a single fact, "Incident Fact" (227 columns), and inserted the records from both tables with a join condition between them [incident.IN_NUM = Unit.IN_NUM].
    My question is: is this correct, or am I mixing source data with different granularity in the same fact table? Appreciate your help.
    Best Regards
    Bees

    Bees,
    Are the measures from 'Incident' repeated for a given incident where it has more than one record in the Unit table? If so, then sum(incident.measure) will give an incorrect result, no? (See the sketch after this message.)
    What requirement is there to physically merge the tables together outside of OBIEE? Within OBIEE you could have one logical 'fact' table to present to report users, sourced from separate Incidents and Units tables, which would stop the incorrect aggregations occurring. A common modelling pattern along the same lines is Order Headers and Order Lines: it is quite common in OBIEE to have a logical 'Orders' fact containing both order-header measures and order-line measures, which translates directly to your Incidents -> Units relationship.
    To do what I've mentioned is relatively straightforward: create a 'Dim - Incident' with two levels, Incident and Unit, map the unique identifiers in as the level keys, and then use these levels to set the content levels correctly in your two logical table sources for the logical 'Fact', i.e. the Incidents LTS at incident level and the Units LTS at unit level.
    Hope this helps, let us know if you get stuck.
    Cheers
    Alastair
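    To make the double-counting risk concrete, here is a small sketch; incident_fact, in_num and incident_cost are hypothetical names for the merged 227-column table:

    -- One incident with cost 100 and two Unit rows becomes two rows in the
    -- merged table, so the header measure is repeated:
    SELECT SUM(incident_cost) AS wrong_total   -- returns 200, not 100
    FROM   incident_fact;

    -- The header measure must be de-duplicated back to incident grain first:
    SELECT SUM(incident_cost) AS right_total   -- returns 100
    FROM  (SELECT DISTINCT in_num, incident_cost FROM incident_fact) t;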

  • Consistency Warning - [39008] Dimension Table not joined to Fact Source

    I have a schema in which I have the following tables:
    A) Patient Transaction Fact Table (i.e. supplies used, procedures performed, etc.)
    B) Demographic Dimension table (houses info like patient location code)
    C) Location Dimension table (tells me what Hospital each unique Location maps to)
    So table A is the fact, and table B is a dimension table joined to table A based on Patient ID, so I can get general info on the patient. This would allow me to apply logic to just see patient transactions where the patient was FEMALE, or was in the Emergency Room, by applying conditions to these fields in table B.
    Table C is a simple lookup table joined to table B by Location Code, so I can identify which hospital's emergency room the patient was located in for instance.
    So the schema is: A<---B<---C, where B and C are both dimension tables.
    The query works as desired, but my consistency check gives me the following WARNING:
    *[39008] Logical dimension table LOCATION MASTER D has a source LOCATION MASTER D that does not join to any fact source.*
    How do I resolve this WARNING, or at least suppress it?

    Hi,
    What you need to do is add the (physical) location dimension table to the logical table source of the demographic dimension, for example by dragging it from the physical layer onto the logical table source of the demographic logical dimension table in the BMM layer.
    Regards,
    Stijn
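    The effect of Stijn's suggestion is that the demographic LTS spans both physical tables, so a query that needs hospital attributes generates a join along these lines (names are illustrative):

    SELECT d.patient_id,
           l.hospital_name
    FROM   demographic_d d
           JOIN location_d l ON l.location_code = d.location_code;
    -- Because location_d now lives inside the demographic LTS, it reaches the
    -- fact source through demographic_d and the 39008 warning goes away.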

  • FF logs in different location ( Not FF tables)

    Hi
    Can we have FF logs in a location other than the backend tables? My client wants to have the sensitive FI and HR data logs in, say, a restricted shared network drive/folder. Is this worth it?
    If anyone has done this, please share your experience and the best recommended approach.

    Hi Robert,
    To my knowledge, no.
    It is not possible: SPM in the frontend (Java stack) is used only for reporting purposes; the core functionality is still in the backend (R/3).
    You need to:
    - control access to all this data: the frontend Log Report, emails, backend tables, and so on
    - my suggestion would be not to use FF for HR purposes
    - do custom development
    regards,
    Surpreet

  • Incorrectly defined logical table source (for fact table Facts) does not

    Hi,
    I have two dimensions, A and B. A is joined to B by a foreign key.
    The report works if I pull B.Column1, A.Column2.
    The report throws an error if I try to change the order of the columns, like this: A.Column2, B.Column1.
    error : Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
    File: odbcstatementimpl.cpp, Line: 186
    State: S1000. Code: 10058. [NQODBC] [SQL_STATE: S1000] [nQSError: 10058] A general error has occurred. [nQSError: 15018] Incorrectly defined logical table source (for fact table Facts) does not contain mapping for B.Column1
    I am not sure where it is going wrong.
    Thanks
    jagadeesh
    Edited by: Jagadeesh Kasu on Jun 16, 2009 4:22 PM

    Did you make the joins in the LTS or on the physical table?
    Try making the joins in the LTS if they are not there.

  • 2 fact Tables not joining to all the same dimensions

    Hi there,
    I have 2 Fact tables
    1) Errors (count)
    2) Tasks (count)
    Both share the following dimension: Customer.
    Each also has a dimension that joins only to its own fact:
    1) Errors Type (joining to the Errors fact)
    2) Tasks Type (joining to the Tasks fact)
    In Answers I can see customer and errors count and task count together
    When I add Errors Type, the Tasks count returns 0 (it doesn't disappear).
    When I add Tasks Type, the Errors count returns 0 (it doesn't disappear).

    customer | errors count | tasks count
    xyz      | 2            | 8

    customer | errors count | tasks count | task type
    xyz      | 0            | 8           | abc

    customer | errors count | tasks count | errors type
    xyz      | 2            | 0           | 123

    I have seen some documentation on fact tables not joining all dimensions, but any help would be appreciated.
    Thanks

    The end game would be to be able to see the following:

    customer | errors type | errors count | task count | tasks type
    xyz      | 123         | 2            | 8          | abc

    Initially I would settle for either of:

    customer | errors type | errors count | task count
    xyz      | 123         | 2            | 8

    customer | errors count | task count | tasks type
    xyz      | 2            | 8          | abc

    I think there is something I need to do in the Repository?

  • Aggregation table - Different agg levels for base table and agg table

    Is it possible to have different aggregation levels for the base table and the aggregate table? Say, SUM on a column in the AGG table and COUNT on the same column in the fact table.
    Example:
    Region, Day, Product, Sales Person and Customer are dimensions, and Call is a fact measure.
    FACT_TABLE has columns Region, Day, Product, Sales person,Customer, Call
    AGG_TABLE has columns Region, Month,Product, call
    We already have a logical table definition for the fact table, say FACT_CALL.
    We have a logical column called No of Customers.
    For the FACT_TABLE data source, the formula for the column is "Customer" and the aggregation rule is COUNT DISTINCT.
    But the agg table already has a calculated column called TOT_CUSTOMERS, which is calculated and aggregated in the ETL.
    If we map this to the logical column, we have to set the formula to TOT_CUSTOMERS and define the aggregation type as SUM, since it is at Region, Month and Product level. But OBI does not allow us to do so.
    Is there a workaround for this? Can you please let us know.
    Regards
    Arun D

    The way the BI server picks the table that will satisfy a query is through column mappings and content levels. You have set the column mapping to TOT_CUSTOMERS, which is right. When it comes to aggregation, since the value is already precalculated in the ETL, you want to set the aggregation to SUM. That, I would say, is not correct: set the aggregation to COUNT DISTINCT, the same as for the detailed fact, but set the content levels to Month in the date dimension and to the appropriate levels in Region etc. Now the BI server will know how to aggregate the rows when it chooses the agg table.
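    For context, a sketch of the ETL that typically builds such an agg table; the names follow the thread, but the exact DDL and the MONTH derivation are assumptions:

    -- TOT_CUSTOMERS is a distinct count computed once, at Region/Month/Product grain
    INSERT INTO AGG_TABLE (REGION, MONTH, PRODUCT, TOT_CUSTOMERS, CALL)
    SELECT REGION,
           MONTH,          -- assumed derivable from DAY in the real ETL
           PRODUCT,
           COUNT(DISTINCT CUSTOMER),
           SUM(CALL)
    FROM   FACT_TABLE
    GROUP  BY REGION, MONTH, PRODUCT;
    -- Summing TOT_CUSTOMERS across such rows would double-count customers who
    -- appear under several products, which is why SUM is the wrong rule here.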

  • Fact Table and Dimension Tables

    Hi Experts, I'm creating custom InfoCubes for data coming from non-SAP source systems. I have two InfoCubes. The data comes from about 10 tables. I have created 10 DataSources for this, and the data will be consolidated in a standard DSO before it flows into the 2 InfoCubes.
    Now the client wants to know beforehand how much data will be in the fact table and dimension tables of both InfoCubes. I have the total size of all 10 source tables, given to me by the DBA. I am not sure how to convert that information into fact table and dimension table sizes, as I have not yet created these InfoCubes.
    Please help me with how I should address this.

    hi,
    The exact figure will be hard to give, but you can arrive at a rough one in your case.
    You are consolidating data from several tables, which means there are relations between them. Arrive at a rough figure based on those relations and on what you do while consolidating the data.
    For example, say we want to combine data for sales orders and deliveries in a DSO.
    Say Sales Order has 1,000 records and Delivery has 2,000 records, and the two tables share a common link (the sales order number). In the DSO you are combining the data, so the data will be at the most granular level, the delivery level; hence the maximum number of records the consolidated DSO can have is 2,000. (A quick SQL check of this bound is sketched after this message.)
    regards,
    Arvind.
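    If the source tables are reachable via SQL, Arvind's upper bound can be checked directly; table and key names here are illustrative:

    -- The consolidated DSO sits at delivery grain, so its row count is bounded
    -- by the join of the two tables (2,000 in the example above):
    SELECT COUNT(*) AS estimated_dso_rows
    FROM   delivery d
           JOIN sales_order s ON s.order_no = d.order_no;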

  • Reg: Fact table and Dimension table in Data Warehousing -

    Hi Experts,
    I'm not quite getting the criteria that decide what goes into a fact table versus a dimension table.
    This link http://stackoverflow.com/questions/9362854/database-fact-table-and-dimension-table states :
    A fact table contains data that can be aggregated.
    Measures are aggregated data expressions (e.g. sum of costs, count of calls, ...).
    A dimension contains data that is used to generate groups and filters.
    That's fine, but how does one decide which columns to consider for the fact table and which columns for the dimension table?
    Any help is much appreciated.
    Pardon me if this is not the correct place for this question. It is my first question in the new forum.
    Thanks and Regards,
    Ranit Biswas

    ranitB wrote:
    But my main doubt was - what is the criteria to differentiate between columns for Fact tables and Dimension tables? How can one decide upon the design?
    Columns of a fact table will often be 'scalar' attributes of the 'fact' data item, while a dimension table will often hold 'compound' attributes of a 'fact'.
    Consider employee information. The EMPLOYEE table can be a fact table. It might have scalar attribute columns such as: DATE_HIRED, STATUS, EMPLOYEE_ID, and so on.
    Other related information that can't be specified as a single attribute value would often be stored in a 'dimension' table: ADDRESS, PHONE_NUMBER.
    Each address requires several columns to define it: ADDRESS1, ADDRESS2, CITY, STATE, ZIP, COUNTRY. And an employee might have several addresses: WORK_ADDRESS, HOME_ADDRESS. That address info would be stored in a 'dimension' table and only the primary key value of the address record would be stored in the EMPLOYEE 'fact' table.
    Same with PHONE_NUMBER. Several columns are required to define a phone number and each employee might have several of them. The dimension tables are used to help 'normalize' the data in the employee 'fact' table.
    And that EMPLOYEE table might also be a DIMENSION table for other FACT tables. A DEVELOPER table might have an EMPLOYEE_ID column with a value that points to a 'dimension' row in the EMPLOYEE dimension table.
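    A compact sketch of the employee example above; the column lists are abbreviated and the names are illustrative:

    -- 'Compound' attributes move into dimension tables...
    CREATE TABLE ADDRESS_DIM (
        ADDRESS_KEY INT PRIMARY KEY,
        ADDRESS1    VARCHAR(100),
        ADDRESS2    VARCHAR(100),
        CITY        VARCHAR(50),
        STATE       VARCHAR(20),
        ZIP         VARCHAR(10),
        COUNTRY     VARCHAR(50)
    );

    -- ...while the 'fact' keeps scalar attributes plus keys into the dimensions
    CREATE TABLE EMPLOYEE (
        EMPLOYEE_ID      INT PRIMARY KEY,
        DATE_HIRED       DATE,
        STATUS           VARCHAR(10),
        WORK_ADDRESS_KEY INT,  -- FK to ADDRESS_DIM
        HOME_ADDRESS_KEY INT   -- FK to ADDRESS_DIM
    );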

  • "No text target or text model" when accessing notes inside table cells.

    I'm using the InDesign Automation Objects through Delphi to scan all the notes inside a document, and that works fine as long as the notes aren't inside tables.
    For notes inside tables, as soon as I try to get the instance of a note I get the error "No text target or text model". I have tried several ways of scanning the document to get to the notes, but it doesn't make a difference.
    Here's my code:
        // Walk every story, then every table in each story, then every note in each table.
        for I := 1 to FDoc.Stories.Count do
          for J := 1 to FDoc.Stories.Item[I].Tables.Count do
            for K := 1 to FDoc.Stories.Item[I].Tables.Item[J].Notes.Count do
              // Fetching the note instance is what raises "No text target or text model".
              Note := FDoc.Stories.Item[I].Tables.Item[J].Notes.Item[K];
    The last line throws the error.
    This seems like an InDesign bug; or is there maybe something wrong with the document, or a different way to get at the note content?
    Thanks.

    intolight wrote:
    > I'm bewildered by this behavior. I was building a site last night and
    > every time I put text in a table cell it appeared bolder. No other
    > attributes were changed by me. It didn't matter if I typed it new or
    > pasted text from outside the table. When put in a table it goes heavier
    > or bolder. I don't remember this ever happening before.
    >
    > As a matter of fact, as a workaround I was able to take a table from a
    > previous web page that has text in a table cell that doesn't go bold,
    > paste that into the new document, and type over that text. But I don't
    > want to have to do that all the time.
    >
    > Am I going crazy? OK, you don't have to answer that. But please help me
    > figure out what's going on here. Thanks!!!
    >
    > Page with text going bold in table cells:
    > http://intolight.biz/text/
    >
    > Page where text doesn't go bold in table cells:
    > http://www.intolight.biz/ai/frame_index.htm

    It's a table heading (th), and most browsers (all?) turn such a tag bold.

  • OBIEE generated SQL differs if it's a Physical Table or Select Table...

    Hi!
    I have some tables defined in the Physical Layer, some of which are Physical Tables and others OBIEE "views" (tables created with a Select clause).
    My problem is that the generated SQL for the same table differs (as expected) depending on whether it is a Physical Table or a "Select Table", and this difference causes problems in the returned data. When it is a Physical Table, the final report returns the correct data, but when it is a Select Table it returns incorrect/incomplete data. The report joins this table with a table from a different database (a join between Sybase IQ and SQL Server).
    This is the generated SQL in the log:
    -- Physical Table generated SQL
    select T182880."sbl_cust_acct_row_id" as c1,
           T182880."sbl_cust_acct_ext_key" as c2,
           T182880."sbl_cust_source_sys" as c3
    from "SGC_X_KEY_ACCOUNT" T182880
    order by c2, c3
    -- "Select Table" generated SQL
    select
         sbl_cust_acct_ext_key,
         ltrim(rtrim(sbl_cust_source_sys)) as sbl_cust_source_sys,
         sbl_cust_acct_row_id,
         sbl_cust_acct_camp_contact_row_id,
         ods_date,
         ods_batch_no,
         ods_timestamp
    from dbo.SGC_X_KEY_ACCOUNT
    As you may notice, the main difference is the use of aliases (which I think has no influence on the report result) and the use of "Order By" (which I am starting to think is the main reason the correct data is returned).
    Don't forget that the OBIEE server is joining the data from this table with data from a table in a different database; therefore, the join is made in memory (the OBIEE engine). Maybe in the OBIEE engine the Order By is essential to guarantee a correct join... but then again, I have some other tables in the Physical Layer that are defined as "Select" whose generated SQL uses both the aliases and the Order By clause...
    In order to solve my problem, I had to transform the "Select Table" into a "Physical Table". The reason it was defined as a "Select Table" was that it had a restriction in the Where clause (which I have already eliminated, although the performance will be worse).
    I'm confused. Help!
    Thanks.
    FPG

    Hi FPG,
    Not sure if this is a potential issue for you at all, but I know it caused me all kinds of headaches before I figured it out. It had to do with the "Features" tab values in the database object's settings in the Physical Layer:
    Different SQL generated for physical table query vs. view object query?
    Mine had to do with SQL from View objects not being submitted as I would expect; it sounds like yours has more to do with "Order By"? I believe I remember seeing some Order By and Group By settings in the "Features" list. You might make a copy of your RPD, experiment with setting some of those if they aren't already selected, and retest your queries with the new DB settings.
    Jeremy

  • Sql statement in a table not accepting variable

    I have the following problem on 10.1.0.3.0 with a variable in an EXECUTE IMMEDIATE statement.
    Here is the code that I am using:
    declare
      remote_data_link varchar2(25) := 'UDE_DATATRANSFER_LINK';
      FROM_SCHEMA VARCHAR2(40) := 'UDE_OLTP';
      l_last_process_date date := to_date(to_char(sysdate,'mm-dd-yyyy hh:mi:ss'),'mm-dd-yyyy hh:mi:ss') - 1;
      stmt varchar2(4000) := 'MERGE into applicant_adverseaction_info theTarget USING (select * from '||FROM_SCHEMA||'.applicant_adverseaction_info@'||remote_data_link||' where last_activity > :l_last_process_date ) theSource ON(theTarget.applicant_id = theSource.applicant_id) WHEN MATCHED THEN UPDATE SET theTarget.cb_used = theSource.cb_used, theTarget.cb_address = theSource.cb_address, theTarget.scoredmodel_id = theSource.scoredmodel_id, theTarget.last_activity = theSource.last_activity WHEN NOT MATCHED THEN INSERT(CB_USED, CB_ADDRESS, SCOREDMODEL_ID, APPLICANT_ID, LAST_ACTIVITY) values(theSource.cb_used, theSource.cb_address, theSource.scoredmodel_id, theSource.applicant_id, theSource.last_activity)';
      stmt2 varchar2(4000) := 'MERGE into edm_application theTarget USING (select * from '||from_schema||'.edm_application@'||remote_data_link||' where last_activity > :l_last_process_date) theSource ON (theTarget.edm_appl_id = theSource.edm_appl_id) WHEN MATCHED THEN UPDATE SET theTarget.APP_REF_KEY = theSource.APP_REF_KEY, theTarget.IMPORT_REF_KEY = theSource.IMPORT_REF_KEY, theTarget.LAST_ACTIVITY = theSource.LAST_ACTIVITY WHEN NOT MATCHED THEN INSERT (EDM_APPL_ID, APP_REF_KEY, IMPORT_REF_KEY, LAST_ACTIVITY) values(theSource.EDM_APPL_ID, theSource.APP_REF_KEY, theSource.IMPORT_REF_KEY, theSource.LAST_ACTIVITY)';
      v_error varchar2(4000);
      T_MERGE VARCHAR2(4000);
      stmt3 varchar2(4000);
    BEGIN
      select merge_sql
        INTO T_MERGE
        from transfertables
       where table_name = 'edm_application';
      remote_data_link := 'UDE_DATATRANSFER_LINK';
      FROM_SCHEMA := 'UDE_OLTP';
      --DBMS_OUTPUT.PUT_LINE(SUBSTR(stmt2,1,200));
      --STMT2 := T_MERGE;
      dbms_output.put_line(from_schema||' '||remote_data_link||' '||l_last_process_date);
      EXECUTE IMMEDIATE stmt2 using l_last_process_date;
      --execute immediate stmt3;
      dbms_output.put_line(from_schema||' '||remote_data_link||' '||l_last_process_date);
      dbms_output.put_line(substr(stmt2,1,200));
      commit;
    EXCEPTION
      WHEN OTHERS THEN
        V_ERROR := SQLCODE||' '||SQLERRM;
        v_ERROR := V_ERROR||' '||SUBSTR(stmt2,1,200);
        DBMS_OUTPUT.PUT_LINE(V_ERROR);
        --dbms_output.put_line(substr(stmt2,1,200));
    END;
    This works perfectly.
    But if I change it to fetch the same statement from a db table (the only difference from the block above is that STMT2 := T_MERGE; is now active):
    declare
      remote_data_link varchar2(25) := 'UDE_DATATRANSFER_LINK';
      FROM_SCHEMA VARCHAR2(40) := 'UDE_OLTP';
      l_last_process_date date := to_date(to_char(sysdate,'mm-dd-yyyy hh:mi:ss'),'mm-dd-yyyy hh:mi:ss') - 1;
      stmt varchar2(4000) := 'MERGE into applicant_adverseaction_info theTarget USING (select * from '||FROM_SCHEMA||'.applicant_adverseaction_info@'||remote_data_link||' where last_activity > :l_last_process_date ) theSource ON(theTarget.applicant_id = theSource.applicant_id) WHEN MATCHED THEN UPDATE SET theTarget.cb_used = theSource.cb_used, theTarget.cb_address = theSource.cb_address, theTarget.scoredmodel_id = theSource.scoredmodel_id, theTarget.last_activity = theSource.last_activity WHEN NOT MATCHED THEN INSERT(CB_USED, CB_ADDRESS, SCOREDMODEL_ID, APPLICANT_ID, LAST_ACTIVITY) values(theSource.cb_used, theSource.cb_address, theSource.scoredmodel_id, theSource.applicant_id, theSource.last_activity)';
      stmt2 varchar2(4000) := 'MERGE into edm_application theTarget USING (select * from '||from_schema||'.edm_application@'||remote_data_link||' where last_activity > :l_last_process_date) theSource ON (theTarget.edm_appl_id = theSource.edm_appl_id) WHEN MATCHED THEN UPDATE SET theTarget.APP_REF_KEY = theSource.APP_REF_KEY, theTarget.IMPORT_REF_KEY = theSource.IMPORT_REF_KEY, theTarget.LAST_ACTIVITY = theSource.LAST_ACTIVITY WHEN NOT MATCHED THEN INSERT (EDM_APPL_ID, APP_REF_KEY, IMPORT_REF_KEY, LAST_ACTIVITY) values(theSource.EDM_APPL_ID, theSource.APP_REF_KEY, theSource.IMPORT_REF_KEY, theSource.LAST_ACTIVITY)';
      v_error varchar2(4000);
      T_MERGE VARCHAR2(4000);
      stmt3 varchar2(4000);
    BEGIN
      select merge_sql
        INTO T_MERGE
        from transfertables
       where table_name = 'edm_application';
      remote_data_link := 'UDE_DATATRANSFER_LINK';
      FROM_SCHEMA := 'UDE_OLTP';
      --DBMS_OUTPUT.PUT_LINE(SUBSTR(stmt2,1,200));
      STMT2 := T_MERGE;
      dbms_output.put_line(from_schema||' '||remote_data_link||' '||l_last_process_date);
      EXECUTE IMMEDIATE stmt2 using l_last_process_date;
      --execute immediate stmt3;
      dbms_output.put_line(from_schema||' '||remote_data_link||' '||l_last_process_date);
      dbms_output.put_line(substr(stmt2,1,200));
      commit;
    EXCEPTION
      WHEN OTHERS THEN
        V_ERROR := SQLCODE||' '||SQLERRM;
        v_ERROR := V_ERROR||' '||SUBSTR(stmt2,1,200);
        DBMS_OUTPUT.PUT_LINE(V_ERROR);
        --dbms_output.put_line(substr(stmt2,1,200));
    END;
    I get ORA-00900: invalid SQL statement.
    Can somebody explain why this happens?
    Thanks

    I agree with Jan and Anthony: your post is too long and ill-formatted. However, here's my understanding of your problem (with examples, though slightly different ones).
    1- I have a function that returns the number of records in any given table.
      1  CREATE OR REPLACE FUNCTION get_count(p_table varchar2)
      2     RETURN NUMBER IS
      3     v_cnt number;
      4  BEGIN
      5    EXECUTE IMMEDIATE('SELECT count(*) FROM '||p_table) INTO v_cnt;
      6    RETURN v_cnt;
      7* END;
    SQL> /
    Function created.
    SQL> SELECT get_count('emp')
      2  FROM dual
      3  /
    GET_COUNT('EMP')
                  14
    2- I decide to move the statement to a database table and recreate my function.
    SQL> CREATE TABLE test
      2  (stmt varchar2(2000))
      3  /
    Table created.
    SQL> INSERT INTO test
      2  VALUES('SELECT count(*) FROM p_table');
    1 row created.
    SQL> CREATE OR REPLACE FUNCTION get_count(p_table varchar2)
      2     RETURN NUMBER IS
      3     v_cnt number;
      4     v_stmt varchar2(4000);
      5  BEGIN
      6     SELECT stmt INTO v_stmt
      7     FROM test;
      8     EXECUTE IMMEDIATE(v_stmt) INTO v_cnt;
      9     RETURN v_cnt;
    10  END;
    11  /
    Function created.
    SQL> SELECT get_count('emp')
      2  FROM dual
      3  /
    SELECT get_count('emp')
    ERROR at line 1:
    ORA-00942: table or view does not exist
    ORA-06512: at "SCOTT.GET_COUNT", line 8
    ORA-06512: at line 1
    p_table in the column is a string and has nothing to do with the p_table parameter in the function. And since there's no p_table table in my schema, the function returns an error on execution. I suppose this is what you mean by "sql statement in a table not accepting variable".
    3- I rectify the problem by recreating the function.
      1  CREATE OR REPLACE FUNCTION get_count(p_table varchar2)
      2     RETURN NUMBER IS
      3     v_cnt number;
      4     v_stmt varchar2(4000);
      5  BEGIN
      6     SELECT replace(stmt,'p_table',p_table) INTO v_stmt
      7     FROM test;
      8     EXECUTE IMMEDIATE(v_stmt) INTO v_cnt;
      9     RETURN v_cnt;
    10* END;
    SQL> /
    Function created.
    SQL> SELECT get_count('emp')
      2  FROM dual
      3  /
    GET_COUNT('EMP')
                  14
    Hope this gives you some idea.
    -----------------------
    Anwar

  • Table of Contents retains hyperlinks but Table of Figures/Tables does not

    I am beyond frustrated with this strange problem. I am converting a thesis from word to PDF, and I need to retain all bookmarks and hyperlinks in my Table of Contents, Table of Figures, and Table of Tables.
    When I convert over (with all accessibility options selected, as well as bookmarking options), I am able to get my bookmarks to convert over AND the hyperlinks for the Table of Contents.
    However, no matter what I do, the hyperlinks in the Table of Figures/Tables will not come over. What is different about those? What can I do to get them to link? It is a very long document, so doing the hyperlinks by hand would be incredibly tedious.
    Any advice/help?
    I'm using Acrobat Pro.

    You will need to script the Word object. Look in the repository for examples of using Word from PowerShell.
    Here too: http://blogs.technet.com/b/heyscriptingguy/archive/2010/06/24/hey-scripting-guy-how-can-i-use-windows-powershell-to-add-hyperlinks-to-a-word-document.aspx
    You will need to become proficient in PowerShell as well as in the MSWord object model.
    ¯\_(ツ)_/¯
