Dynamic Dimension Building using SQL Interface

I am attempting to use SQL Interface in Essbase 6.5 to dynamically build the account dimension from a PeopleSoft tree. The view works fine in SQL worksheet, but the last UNION statement does not work through the SQL interface. Here is the SQL that is generated:

SELECT * FROM ps_n_pre_acc_vw WHERE 1 = 1 order by 6,4,5,3

Here is the view:

SELECT B.TREE_NODE AS PARENT,
       A.TREE_NODE AS CHILD,
       (A.TREE_NODE||' : '||C.DESCR),
       b.tree_node_num AS num,
       a.tree_node_num AS c_tree_node_num,
       'A' AS ord
FROM   SYSADM.PSTREENODE A, SYSADM.PSTREENODE B, SYSADM.PS_TREE_NODE_TBL C
WHERE  A.TREE_NAME = 'PRE_CUBE'
AND    A.SETID = 'NW'
AND    A.EFFDT = (SELECT MAX(EFFDT) FROM PSTREELEAF WHERE TREE_NAME = 'PRE_CUBE')
AND    A.TREE_NAME = B.TREE_NAME
AND    A.SETID = B.SETID
AND    A.EFFDT = B.EFFDT
AND    A.PARENT_NODE_NUM = B.TREE_NODE_NUM
AND    A.TREE_NODE = C.TREE_NODE
AND    A.SETID = C.SETID
AND    C.EFF_STATUS = 'A'
AND    C.EFFDT = (SELECT MAX(EFFDT) FROM SYSADM.PS_TREE_NODE_TBL X
                  WHERE C.TREE_NODE = X.TREE_NODE AND C.SETID = X.SETID)
UNION
SELECT B.TREE_NODE AS PARENT,
       (A.RANGE_FROM||' : '||DESCR) AS child,
       A.RANGE_FROM,
       b.tree_node_num AS num,
       a.tree_node_num AS c_tree_node_num,
       'B' AS ord
FROM   SYSADM.PSTREELEAF A, SYSADM.PSTREENODE B, SYSADM.PS_GL_ACCOUNT_TBL C
WHERE  A.TREE_NAME = 'PRE_CUBE'
AND    A.SETID = 'NW'
AND    A.EFFDT = (SELECT MAX(EFFDT) FROM PSTREELEAF WHERE TREE_NAME = 'PRE_CUBE')
AND    A.TREE_NAME = B.TREE_NAME
AND    A.EFFDT = B.EFFDT
AND    A.SETID = B.SETID
AND    A.TREE_NODE_NUM = B.TREE_NODE_NUM
AND    C.SETID = A.SETID
AND    A.RANGE_FROM = C.ACCOUNT
AND    C.EFFDT = (SELECT MAX(EFFDT) FROM SYSADM.PS_GL_ACCOUNT_TBL X
                  WHERE C.ACCOUNT = X.ACCOUNT AND c.setid = x.setid)
UNION
SELECT B.TREE_NODE AS PARENT,
       (D.ACCOUNT||' : '||DESCR) AS child,
       D.ACCOUNT,
       b.tree_node_num AS num,
       a.tree_node_num AS c_tree_node_num,
       'B' AS ord
FROM   pstreeleaf a, pstreenode b, PS_GL_ACCOUNT_TBL D
WHERE  a.tree_name = 'PRE_CUBE'
AND    a.effdt = '01-JAN-2001'
AND    A.SETID = 'NW'
AND    A.RANGE_FROM <> A.RANGE_TO
AND    A.TREE_NAME = B.TREE_NAME
AND    A.SETID = B.SETID
AND    A.EFFDT = B.EFFDT
AND    A.TREE_NODE_NUM = B.TREE_NODE_NUM
AND    D.ACCOUNT BETWEEN A.RANGE_FROM AND A.RANGE_TO
AND    D.SETID = 'NW'
AND    D.EFFDT = (SELECT MAX(EFFDT) FROM PS_GL_ACCOUNT_TBL X
                  WHERE D.ACCOUNT = X.ACCOUNT AND D.SETID = X.SETID)

I created a view from the last UNION section of the SQL and Essbase returns a "zero rows found" error, although when I ran the view in SQL worksheet I was able to get rows returned. Any ideas on why the view does not return the expected results in Essbase?
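One possible culprit, offered here only as an assumption and not something confirmed in the thread: the last UNION branch filters on a.effdt = '01-JAN-2001', an implicit character-to-date comparison whose result depends on the session's NLS_DATE_FORMAT. The ODBC session opened by the Essbase SQL Interface may have different NLS settings from SQL worksheet, in which case the predicate silently matches nothing. A minimal sketch of making the conversion explicit, or avoiding the literal the way the other two branches do:

-- Sketch only: make the date conversion explicit instead of relying on NLS settings
AND a.effdt = TO_DATE('01-JAN-2001', 'DD-MON-YYYY')
-- or mirror the other UNION branches and drop the literal entirely:
AND a.effdt = (SELECT MAX(EFFDT) FROM PSTREELEAF WHERE TREE_NAME = 'PRE_CUBE')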

Before your query, do:
<cfsavecontent variable="emailmessagebody"><cfinclude template="#mancbPath#/mancb_body.cfm"></cfsavecontent>
Then in your query, instead of using '<cfinclude ...>', use '#emailmessagebody#'.
Azadi Saryev
Sabai-dee.com
http://www.sabai-dee.com/

Similar Messages

  • Dynamic header creation using SQL

    Hi Gurus,
    I need your help again. I have a query which uses a date parameter to populate a report. The report pulls out data from the user-entered date to minus eleven months. The report counts total calls registered each month. I have the query working fine but I need help in populating the header.
    For example, suppose I run the query on today's date (18-Jan-2012)
    The report header will be -
    Jan Feb Mar Apr ....Dec
    But I want the header to be populated in the following format -
    Jan 2012 Feb 2011 Mar 2011 Apr 2011....Dec 2011
    And for example, if I run the report for a future date, say 21-May-2012,
    The header will be in the following format -
    Jan 2012....May 2012 Jun 2011 Jul 2011 .......Dec 2011.
    Please let me know if I can populate the header using SQL. Any help is greatly appreciated.

    Hi Tenacious,
    You wrote:
    I want the header to be populated in the dynamic format with the year value concatenated to the Month column.
    My script does that; you can look one more time at the output in my first post, and also in my second post.
    And if you want another example, when we replace 18-Jan-2012 by 21-May-2012, we have the following:
    SQL> select to_char(col,'Monyyyy') col_date
         from (select add_months(to_date('21-May-2012','dd-Mon-yyyy'), -level + 1) col
               from dual
               connect by level <= extract(month from to_date('21-May-2012','dd-Mon-yyyy'))
               union
               select add_months(to_date('21-May-2012','dd-Mon-yyyy'), level - 12)
               from dual
               connect by level <= 12 - extract(month from to_date('21-May-2012','dd-Mon-yyyy')))
         order by extract(year from col) desc, extract(month from col);
    COL_DAT
    Jan2012
    Feb2012
    Mar2012
    Apr2012
    May2012
    Jun2011
    Jul2011
    Aug2011
    Sep2011
    Oct2011
    Nov2011
    Dec2011
    12 rows selected.
    SQL>

  • Dynamic frame building using the contents of an XML file

    Hello,
    My task is to parse an XML file, which contains various profiles of people. Based on what I read, I will then have to make up the GUI dynamically based on which person is selected. I have decided to use CardLayout and I was wondering if there was any way I could build a dynamic frame in NetBeans, using the Swing palette that is. Is there any way I could pass in the frame details (based on a person) and is there a way that NetBeans could build the frame for me?
    Could you guys let me know if this is feasible at all?
    Thanks
    Lexus

    > Could you guys let me know if this is feasible at all?
    It sure is. The SaverBeans settings dialog does just that, for the XScreenSaver-style XML used to configure screensavers. It was developed by Mark Roth using NetBeans.
    Some more information on that component can be found in the [Screensaver configuration files|https://screensavers.dev.java.net/config/] page.
    Edited by: AndrewThompson64 on Dec 27, 2008 12:43 PM

  • Dimension build using SQL table and process to fill the SQL table

    I have a dimension in a cube that is manually built by one of our power users. Now I have to get all the member information of that dimension into a SQL table (example: with columns level0, level0property, level1, level1property, etc.) to use that table in STUDIO for member load.
    Is there any easy process to do this? Currently I am building each and every row manually in that SQL table and there are 1100+ members in that manually built dimension. Please advise.

    Thank you so much Glenn!! That worked!! You make our lives so much easier!!

  • Dynamically replacing pattern using SQL only

    Hi ,
    Table File name
    CREATE TABLE file_name_table AS
    ( SELECT 'FILE_A_[PATTERN1]_[PETTERN2].txt' AS file_name FROM DUAL
    UNION
    SELECT 'FILE_B_[PATTERN3].txt' FROM DUAL
    UNION
    SELECT 'FILE_B_[PATTERN3]_[PATTERN4].txt' FROM DUAL)
    Pattern Table
    CREATE TABLE pattern_table AS
    ( SELECT '[PATTERN1]' AS pattern, 'P00191' AS pattern_value  FROM DUAL
    UNION
    SELECT '[PATTERN2]' AS pattern, 'P00293' AS pattern_value  FROM DUAL
    UNION
    SELECT '[PATTERN3]' AS pattern, 'p567' AS pattern_value  FROM DUAL
    UNION
    SELECT '[PATTERN4]' AS pattern, 'p879' AS pattern_value  FROM DUAL
    UNION
    SELECT '[PATTERN5]' AS pattern, 'p005' AS pattern_value  FROM DUAL)
    Now I need a view which will show the following output
    'FILE_A_P00191_P00293.txt'
    'FILE_B_p567.txt'
    'FILE_B_p567_p879.txt'
    Basically, in the output the pattern will be matched and the pattern in the file name will be replaced by the pattern value from the pattern table. Hope I am clear. Please help.
    Edited by: Mr Lonely on May 22, 2013 1:25 PM -- Fixed the output.

    Hi Jeneesh,
    This is working excellent.
    However I have a small problem.
    If the file name contains something in [], for example filename_[FIXED], and that pattern does not exist in the pattern table, then it is being replaced with null.
    For example.
    FILE_NAME                        NEW_FNAME
    FILE_B_[PATTERN3].txt            FILE_B_p567.txt
    TEST_[TEST_PATTERN].txt          TEST_.txt
    TEST_[TEST_PAT].txt              TEST_LOL.txt
    FILE_A_[PATTERN1]_[PETTERN2].txt FILE_A_P00191_.txt
    FILE_B_[PATTERN3]_[PATTERN4].txt FILE_B_p567_p879.txt
    Edited by: Mr Lonely on May 22, 2013 3:21 PM
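    For reference, a minimal sketch of one way to do the substitution so that only patterns actually present in pattern_table are replaced, leaving unknown tokens such as [FIXED] untouched. This is not jeneesh's original solution; it assumes Oracle 11gR2+ for recursive subquery factoring and uses the table names from the post:

    -- Repeatedly apply REPLACE for every pattern found in pattern_table;
    -- bracketed tokens with no matching row are simply left as-is.
    WITH repl (file_name, new_fname) AS (
      SELECT file_name, file_name
      FROM   file_name_table
      UNION ALL
      SELECT r.file_name,
             REPLACE(r.new_fname, p.pattern, p.pattern_value)
      FROM   repl r
      JOIN   pattern_table p
        ON   INSTR(r.new_fname, p.pattern) > 0
    )
    SELECT DISTINCT file_name, new_fname
    FROM   repl r
    WHERE  NOT EXISTS (SELECT 1
                       FROM   pattern_table p
                       WHERE  INSTR(r.new_fname, p.pattern) > 0);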

  • Having issues after installing Essbase V9.3.1,configuring the SQL interface

    I am having issues after installing Essbase V9.3.1 and configuring the SQL interface. We are using a UNIX/AIX box for our Essbase server and UDB DB2 9.1 SP6 as our SQL Interface data source.
    When I tried to run an ESSCMD script to perform a BUILDDIM operation on the Essbase application using the load rule (with the SQL interface), I got the error recorded in the application log file, attached at the bottom; the .odbc.ini file on the server looks as follows.
    Can somebody tell me whether .odbc.ini needs any correction, or what else I should do to correct the issue and be able to work through the SQL interface? Appreciate any help provided in this regard. Thanks,
    .odbc.ini
    [ODBC Data Sources]
    DOLU003=IBM DB2 ODBC DRIVER
    [DOLU003]
    Driver=/home/db2inst1/sqllib/lib/libdb2.a
    Database=DOLU003
    [ODBC]
    Trace=0
    TraceFile=odbctrace.out
    InstallDir=/home/db2inst1/sqllib/odbclib
    [Mon Apr 20 15:16:29 2009]Local/New_Bud/Budget/Olapadm/Info(1021020)
    Cannot read SQL driver name for [] from [home/hyperion/.odbc.ini]
    [Mon Apr 20 15:16:43 2009]Local/New_Bud/Budget/Olapadm/Info(1013091)
    Received Command [SQLRetrieve] from user [Olapadm]
    [Mon Apr 20 15:16:43 2009]Local/New_Bud/Budget/Olapadm/Info(1021020)
    Cannot read SQL driver name for [] from [home/hyperion/.odbc.ini]
    [Mon Apr 20 15:16:43 2009]Local/New_Bud/Budget/Olapadm/Info(1021004)
    Connection String is generated
    [Mon Apr 20 15:16:43 2009]Local/New_Bud/Budget/Olapadm/Info(1021041)
    Connection String is [DSN=DOLU003;UID=...;PWD=...]
    [Mon Apr 20 15:16:43 2009]Local/New_Bud/Budget/Olapadm/Info(1021006)
    SELECT Statement [SELECT * FROM DB2OLADM.BD_DIMENSION_DEF_PC WHERE DIMENSION_ID
    = 12 ORDER BY TREE_NODE_NUM, PARENT_NODE, CHILD_NODE] is generated
    [Mon Apr 20 15:16:44 2009]Local/New_Bud/Budget/Olapadm/Info(1021013)
    ODBC Layer Error: [7] ==> [[DataDirect][ODBC 20101 driver]6013]
    [Mon Apr 20 15:16:44 2009]Local/New_Bud/Budget/Olapadm/Info(1021014)
    ODBC Layer Error: Native Error code [0]
    [Mon Apr 20 15:16:44 2009]Local/New_Bud/Budget/Olapadm/Error(1021001)
    Failed to Establish Connection With SQL Database Server. See log for more information
    --------------------------------------------------------------------------------------------------------------------------------------------

    I had similar errors when first setting up the SQL interface. Are you on a 64 bit operating system on the essbase server? If so Essbase needs to use the 32 bit odbc driver, not the default 64 bit driver.
    The 32 bit driver is still available here SysWOW64\odbcad32.exe <-----This is the 32 bit (Use)
    The Administrative tools by default uses system32\odbcad32.exe <----This is the 64 bit (Don't Use)
    Once I opened the 32 bit driver interface directly and set it up the errors went away.

  • SQL Interface & Teradata

    Anyone have any experience using SQL Interface to pull data from a Teradata Data Warehouse? Assuming you can just use the standard ODBC driver for Teradata but would like confirmation. Thanks. Doug O'Keefe

    I did it a long time ago and didn't have any problem. Since Teradata databases can be pretty large, I created a table that held the data I wanted and pulled from it directly so I didn't have to do any joins. Glenn S.
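    Staging the data in a single table before pulling it into Essbase, as Glenn describes, might look roughly like the sketch below. Table and column names are made up for illustration, and the CREATE TABLE ... AS ... WITH DATA form is the assumed Teradata dialect:

    -- Illustrative only: pre-join and aggregate once so the load rule reads a single table
    CREATE TABLE stage_essbase_load AS
    ( SELECT acct, period, entity, SUM(amount) AS amount
      FROM   dw.gl_fact
      GROUP  BY acct, period, entity
    ) WITH DATA;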

  • Dynamically build web GUI interface

    Hi,
    I am planning to build a web GUI interface. Instead of designing it statically, I want to store GUI metadata in an XML file and build the web GUI from the metadata.
    I would like to know if there are any Java tools already available to build a web GUI from metadata.
    Thanks
    RR

    Hi,
    Well, if you use Swing with a JApplet then you are dynamically creating the user interface. The components are added at runtime, not when you compile the programs. All you need to do is drive which controls to add by some stored data. The same is true for HTML if you generate your pages at runtime (i.e. not just static HTML files).

  • How to send an email with attachment to dynamic email address using PL/SQL

    Hi,
    I want to send an automated email with an attachment every day to different people, so the number of people is not static.
    So is there any way to do this using PL/SQL?
    thanks for your support!

    I want to send an automated email with an attachment every day to different people, so the number of people is not static.
    Why? Explain it.
    You can create a table and store your email IDs through the front-end application day to day.
    The table should look like this ->
    satyaki>
    satyaki>select * from v$version;
    BANNER
    Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Prod
    PL/SQL Release 10.2.0.3.0 - Production
    CORE    10.2.0.3.0      Production
    TNS for 32-bit Windows: Version 10.2.0.3.0 - Production
    NLSRTL Version 10.2.0.3.0 - Production
    Elapsed: 00:00:00.04
    satyaki>
    satyaki>
    satyaki>create table email_master
      2    (
      3       email_grp_header         varchar2(30) not null,
      4       craete_time                  timestamp,
      5       constraints pk_header primary key(email_grp_header)
      6    );
    Table created.
    Elapsed: 00:00:02.12
    satyaki>
    satyaki>create table email_chld
      2    (
      3       email_grp_header          varchar2(30) not null,
      4       email_recepient             varchar2(100),
      5       craete_time                   timestamp,
      6       constraint fk_header foreign key(email_grp_header) references email_master(email_grp_header)
      7    );
    Table created.
    Elapsed: 00:00:00.09
    satyaki>
    satyaki>
    satyaki>insert into email_master values('GRP_INVENTORY',systimestamp);
    1 row created.
    Elapsed: 00:00:00.07
    satyaki>
    satyaki>
    satyaki>insert into email_master values('GRP_PURCHASE',systimestamp);
    1 row created.
    Elapsed: 00:00:00.03
    satyaki>
    satyaki>commit;
    Commit complete.
    Elapsed: 00:00:00.04
    satyaki>
    satyaki>select * from email_master;
    EMAIL_GRP_HEADER               CRAETE_TIME
    GRP_INVENTORY                  24-OCT-08 08.55.36.190000 PM
    GRP_PURCHASE                   24-OCT-08 08.55.54.481000 PM
    Elapsed: 00:00:00.18
    satyaki>
    satyaki>
    satyaki>insert into email_chld values('GRP_INVENTORY','[email protected]',systimestamp);
    1 row created.
    Elapsed: 00:00:00.07
    satyaki>
    satyaki>insert into email_chld values('GRP_INVENTORY','[email protected]',systimestamp);
    1 row created.
    Elapsed: 00:00:00.04
    satyaki>
    satyaki>insert into email_chld values('GRP_INVENTORY','[email protected]',systimestamp);
    1 row created.
    Elapsed: 00:00:00.03
    satyaki>
    satyaki>insert into email_chld values('GRP_PURCHASE','[email protected]',systimestamp);
    1 row created.
    Elapsed: 00:00:00.03
    satyaki>
    satyaki>insert into email_chld values('GRP_PURCHASE','[email protected]',systimestamp);
    1 row created.
    Elapsed: 00:00:00.11
    satyaki>commit;
    Commit complete.
    Elapsed: 00:00:00.05
    satyaki>
    satyaki>select * from email_chld;
    EMAIL_GRP_HEADER               EMAIL_RECEPIENT                                                                                      CRAETE_TIME
    GRP_INVENTORY                  [email protected]                                                                                      24-OCT-08 08.56.46.107000 PM
    GRP_INVENTORY                  [email protected]                                                                                         24-OCT-08 08.57.03.551000 PM
    GRP_INVENTORY                  [email protected]                                                                                    24-OCT-08 08.57.36.277000 PM
    GRP_PURCHASE                   [email protected]                                                                                      24-OCT-08 08.58.06.129000 PM
    GRP_PURCHASE                   [email protected]                                                                                    24-OCT-08 08.58.26.900000 PM
    Elapsed: 00:00:00.10
    satyaki>
    And then, based on the group header, you can get the list of recipients and use it dynamically inside your PL/SQL application.
    Regards.
    Satyaki De.
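    To actually send the message to whatever addresses the table holds on a given day, one option is a small PL/SQL block along these lines. This is a sketch only: it assumes UTL_MAIL is installed and SMTP_OUT_SERVER is configured, the attachment is just an inline VARCHAR2 payload for illustration, and the sender address is a masked placeholder like the ones above.

    DECLARE
      l_recipients VARCHAR2(4000);
    BEGIN
      -- Build a comma-separated recipient list from the child table
      FOR rec IN (SELECT email_recepient
                    FROM email_chld
                   WHERE email_grp_header = 'GRP_INVENTORY') LOOP
        l_recipients := l_recipients ||
                        CASE WHEN l_recipients IS NOT NULL THEN ',' END ||
                        rec.email_recepient;
      END LOOP;

      -- Send one mail with a small text attachment to all recipients
      UTL_MAIL.send_attach_varchar2(
        sender       => '[email protected]',
        recipients   => l_recipients,
        subject      => 'Daily report',
        message      => 'Please find the report attached.',
        attachment   => 'col1,col2' || CHR(10) || 'a,b',
        att_filename => 'report.csv');
    END;
    /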

  • Using Dynamic VIEW better pl/sql ?

    Hi,
    Would like to know your suggestion,
    Currently I need to process 5 tables and put the data into a new single table.
    The number of rows is large, 1,000,000, and could be even more.
    Now I am planning to create a view from the 5 tables, and then from the view I am planning to do the processing, after which I will put the processed data into the newly created table.
    Alternatively I was thinking of having a collection of objects (nested table) instead of a view; since the number of rows which need to be processed is large, I felt this would not be a better option.
    Can we go ahead creating the view dynamically, or is there a better design solution that you can think of?
    regards,
    Alex

    Hi,
    I have placed the sample structure here along with the data, and the expected output below.
    How can we build a SQL statement for the below:
    CREATE TABLE One_T (
    one_no NUMBER,
    one_message varchar2(20)
    );
    CREATE TABLE Two_T (
    two_no NUMBER,
    two_message varchar2(20)
    );
    CREATE TABLE Three_T (
    three_no NUMBER,
    three_message varchar2(20)
    );
    CREATE TABLE Four_T (
    four_no NUMBER,
    four_message varchar2(20)
    );
    CREATE TABLE Five_T (
    five_no NUMBER,
    five_message varchar2(20)
    );
    CREATE TABLE Six_T (
    six_col1 varchar2(20),
    six_col2 varchar2(20),
    six_col3 varchar2(20)
    );
    CREATE TABLE New_Table (
    New_Table_col1 varchar2(20),
    New_Table_col2 varchar2(20),
    New_Table_col3 varchar2(20),
    New_Table_col4 varchar2(20),
    New_Table_col5 varchar2(20),
    New_Table_col6 varchar2(20),
    New_Table_col7 varchar2(20)
    );
    INSERT ALL
    INTO One_T(one_no,one_message) VALUES(1,'Message11')
    INTO One_T(one_no,one_message) VALUES(2,'Message12')
    INTO One_T(one_no,one_message) VALUES(3,'Message13')
    INTO One_T(one_no,one_message) VALUES(4,'Message14')
    INTO One_T(one_no,one_message) VALUES(5,'Message15')
    INTO Two_T(two_no,two_message) VALUES(1,'Message21')
    INTO Two_T(two_no,two_message) VALUES(2,'Message22')
    INTO Two_T(two_no,two_message) VALUES(3,'Message23')
    INTO Two_T(two_no,two_message) VALUES(4,'Message24')
    INTO Two_T(two_no,two_message) VALUES(5,'Message25')
    INTO Three_T(three_no,three_message) VALUES(1,'Message31')
    INTO Three_T(three_no,three_message) VALUES(2,'Message32')
    INTO Three_T(three_no,three_message) VALUES(3,'Message33')
    INTO Three_T(three_no,three_message) VALUES(4,'Message34')
    INTO Three_T(three_no,three_message) VALUES(5,'Message35')
    INTO Four_T(four_no,four_message) VALUES(1,'Message41')
    INTO Four_T(four_no,four_message) VALUES(2,'Message42')
    INTO Four_T(four_no,four_message) VALUES(3,'Message43')
    INTO Four_T(four_no,four_message) VALUES(4,'Message44')
    INTO Four_T(four_no,four_message) VALUES(5,'Message45')
    INTO Five_T(five_no,five_message) VALUES(1,'Message51')
    INTO Five_T(five_no,five_message) VALUES(2,'Message52')
    INTO Five_T(five_no,five_message) VALUES(3,'Message53')
    INTO Five_T(five_no,five_message) VALUES(4,'Message54')
    INTO Five_T(five_no,five_message) VALUES(5,'Message55')
    INTO Six_T(six_col1,six_col2,six_col3) VALUES(1,'MessageCol111','MessageCol211')
    INTO Six_T(six_col1,six_col2,six_col3) VALUES(1,'MessageCol112','MessageCol212')
    INTO Six_T(six_col1,six_col2,six_col3) VALUES(2,'MessageCol211','MessageCol221')
    INTO Six_T(six_col1,six_col2,six_col3) VALUES(2,'MessageCol221','MessageCol222')
    INTO Six_T(six_col1,six_col2,six_col3) VALUES(2,'MessageCol222','MessageCol223')
    SELECT * FROM dual;
    OUTPUT :
    New_Table
    MessageCol111, MessageCol211,Message11,Message21,Message31,Message41,Message51
    MessageCol112, MessageCol212,Message11,Message21,Message31,Message41,Message51
    ...
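    One way to read the requirement, sketched below as an assumption rather than a confirmed answer: Six_T drives the result and the five lookup tables are joined on the shared numeric key in six_col1 (which is stored as VARCHAR2, hence the explicit conversion).

    -- Sketch only: assumes six_col1 is the common key across the five tables
    INSERT INTO New_Table (New_Table_col1, New_Table_col2, New_Table_col3,
                           New_Table_col4, New_Table_col5, New_Table_col6, New_Table_col7)
    SELECT s.six_col2, s.six_col3,
           o.one_message, t.two_message, th.three_message,
           f.four_message, fv.five_message
    FROM   Six_T   s
    JOIN   One_T   o  ON o.one_no    = TO_NUMBER(s.six_col1)
    JOIN   Two_T   t  ON t.two_no    = TO_NUMBER(s.six_col1)
    JOIN   Three_T th ON th.three_no = TO_NUMBER(s.six_col1)
    JOIN   Four_T  f  ON f.four_no   = TO_NUMBER(s.six_col1)
    JOIN   Five_T  fv ON fv.five_no  = TO_NUMBER(s.six_col1);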

  • Build essbase cube using Sql query.....

    Hi Team, can we build dimensions as well as load data into Essbase cubes using a SQL query generated from a Cognos cube? If so, please guide me through the procedure.

    I'm not sure about Cognos; if you can create an ODBC connection you can try using that in SQL load rules and see if that works.
    Opening an SQL Database
    Regards
    Celvin
    http://www.orahyplabs.com

  • Reg: Building Essbase dimension using SQL in rule file

    Hi,
    We are using Essbase 11.1.2.2. I am trying to build a dimension in a cube using a SQL query in the rule file, but I am not able to do it.
    I am not able to establish a connection to the database. Can anyone please give a step-by-step process to do this?

    If you really, really, really want to do this --- and by that I mean really get rid of everything in a dimension, create a dummy file with a single child.  That single child should have a name that could never exist in your system.  Something like "Forty-seven ginger headed sailors".  Then set the Dimension Build Settings Member Update to "Remove Unspecified".  Run the dim build -- it will clear out all of the dimension.
    Then take your other dim build rule file and use the same Remove Unspecified setting.  That will get rid of the silly member.  The source can be a file or SQL.
    Ta da, you have now cleared out the dim, added a single silly member, cleared out the dim again, and loaded the right members into it.
    There may be a better way to do the above, but I have done exactly that and it works.
    Regards,
    Cameron Lackpour

  • Can one build a data warehouse using SQL rather than Warehouse Builder?

    I would like to build a data warehouse purely using SQL statements. Where can I find the data warehouse extension of SQL statements?

    I am exploring the internal workings of Warehouse Builder.
    I have written a SQL script to generate sample data to be inserted into tables, and then SQL scripts to do the extraction, transformation and loading using MERGE, GROUP BY CUBE, DECODE, etc.
    If anyone has any experience of just using SQL to perform ETL, would you share your experience here? Thanks.
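    For what it's worth, a typical SQL-only ETL step is an upsert with MERGE; the sketch below is illustrative only, with made-up table and column names, and is not part of the original thread.

    -- Illustrative upsert: refresh a dimension table from a staging table
    MERGE INTO dim_customer d
    USING stg_customer s
    ON    (d.customer_id = s.customer_id)
    WHEN MATCHED THEN
      UPDATE SET d.customer_name = s.customer_name,
                 d.region        = s.region
    WHEN NOT MATCHED THEN
      INSERT (customer_id, customer_name, region)
      VALUES (s.customer_id, s.customer_name, s.region);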

  • How to Custom Report using sql server report builder for SCCM 2012 SP1

    Hi ,
    I am new to databases. If I want to create a manual report using SQL Server Report Builder for SCCM 2012 SP1, what steps should I take?
    I want to create a report in which computer name, total disk space, and physical disk serial no come together. I already added the class (physical disk serial no.) in hardware inventory classes. Refer to the snapshot.

    Hi,
    Here is a guide on how to create custom reports in Configuration Manager 2012; it is a great place to start. Change it to the data you want to display instead.
    http://sccmgeekdiary.wordpress.com/2012/10/29/sccm-2012-reporting-for-dummies-creating-your-own-ssrs-reports/
    Regards,
    Jörgen
    -- My System Center blog ccmexec.com -- Twitter
    @ccmexec
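    As a rough starting point, a report like that is usually a join across the hardware inventory views. The sketch below is illustrative only; the view and column names are assumptions (in particular, the serial number column exposed by the custom inventory class depends on how that class was added), so verify them against your site database before using the query in Report Builder.

    -- Illustrative only: verify view and column names in your SCCM database
    SELECT s.Name0          AS ComputerName,
           ld.Size0         AS TotalDiskSpaceMB,
           pd.SerialNumber0 AS PhysicalDiskSerialNo   -- assumed column from the custom class
    FROM   v_R_System s
    JOIN   v_GS_LOGICAL_DISK ld ON ld.ResourceID = s.ResourceID AND ld.DeviceID0 = 'C:'
    JOIN   v_GS_DISK         pd ON pd.ResourceID = s.ResourceID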

  • Is there a solution for dynamic reports and using Denes' Export to Excel?

    Oracle 10.2.0.4.0
    Application Express 3.2.1.00.10
    Hello all!
    I am using Denes Kubicek's Export_Excel_Pkg in my application and I'm having trouble exporting reports based on report regions created using a PL/SQL function body returning SQL query. I realize this is not an Oracle supported package, but was hoping someone here could shed some light on it. When I open up the Excel file, I get an error such as: Report Values Error: ORA-06550: line 22, column 5: PL/SQL: ORA-00907: missing right parenthesis.
    I've searched the forum and already have done as others suggested by modifying the REPLACE on the v_sql variable in Export_Excel_Pkg.Get_Usable_SQL, but it did not work. My assumption is that there is an issue with the value being passed to the wwv_flow_utilities.get_binds function. I could not find documentation on this function, but I'm thinking that it cannot extract the bind variables within a PL/SQL block. The report only works when I just use SQL with bind variables; it doesn't work for PL/SQL. Nor does it work for dynamic SQL reports that use a "lexical" parameter (e.g. using WHERE &p_and_condition.) to build the WHERE clause.
    Has anyone come up with a work-around to this? I somehow need to be able to extract reports based on dynamic SQL (or PL/SQL) to Excel.
    Help is appreciated!
    This is my example of a report based on PL/SQL function:
    DECLARE
      v_sql VARCHAR2(4000);
    BEGIN
      v_sql := q'[SELECT UPPER(t1.olo_name) agency_title,
           t1.class_code,
           UPPER(t1.class_title) class_title,
           t1.pay_plan,
           t1.pay_grade_code,
           COUNT(t1.appt_fte) total_employees,
           SUM(t1.appt_fte) filled_fte,
           AVG(DECODE(t2.pay_cycle_code,
                      'UB',((t1.wage_type1_amt_for_pay * 26)/t1.appt_fte),
                      'UM',((t1.wage_type1_amt_for_pay * 12)/t1.appt_fte),
                       0)) avg_annual_rate
       FROM my_schema.table1 t1,
                my_schema.table2 t2,
                my_schema.table3 pro
      WHERE t1.pos_wk = t2.pos_wk
        AND t2.pos_rate_active_flag = 'Y'
        AND t1.ops_ind = 'N'
        AND t1.employee_type IN ('1','2')
        AND pro.ROLE_CODE = :F101_DW_ROLE
        AND pro.pos_role_orgs_active_flag = 'Y']';
      IF :P_MULTI_OLO IS NOT NULL THEN
        v_sql := v_sql || q'[ AND INSTR(':'||']' || v('P_MULTI_OLO') || q'['||':', ':'||t1.olo_code||':') > 0]';     
      END IF;
      v_sql := v_sql || q'[GROUP BY UPPER(t1.olo_name), t1.class_code, UPPER(t1.class_title), t1.pay_plan, t1.pay_grade_code ORDER BY t1.class_code ASC, avg_annual_rate]';
      RETURN v_sql;
    END;
    This is my example using a SQL statement with a lexical parameter:
    SELECT UPPER(t1.olo_name) agency_title,
           t1.class_code,
           UPPER(t1.class_title) class_title,
           t1.pay_plan,
           t1.pay_grade_code,
           COUNT(t1.appt_fte) total_employees,
           SUM(t1.appt_fte) filled_fte,
           AVG(DECODE(t2.pay_cycle_code,
                      'UB',((t1.wage_type1_amt_for_pay * 26)/t1.appt_fte),
                      'UM',((t1.wage_type1_amt_for_pay * 12)/t1.appt_fte),
                       0)) avg_annual_rate
       FROM my_schema.table1 t1,
                my_schema.table2 t2,
                my_schema.table3 pro
      WHERE t1.pos_wk = t2.pos_wk
        AND t2.pos_rate_active_flag = 'Y'
        AND t1.ops_ind = 'N'
        AND t1.employee_type IN ('1','2')
        AND pro.ROLE_CODE = :F101_DW_ROLE
        AND pro.pos_role_orgs_active_flag = 'Y'
        &P63_AND_CONDITION.
      GROUP BY UPPER(t1.olo_name),
               t1.class_code,
               UPPER(t1.class_title),
               t1.pay_plan,
               t1.pay_grade_code   
    ORDER BY t1.class_code ASC, avg_annual_rate
    The *&P63_AND_CONDITION.* value is populated based on a "Before Header" computation under Page Rendering, using the logic below. It is then used by the SQL query defined in the reports region at run time.
    DECLARE
      v_sql VARCHAR2(4000) := NULL;
    BEGIN
      v_sql := ' ';
      IF :P_MULTI_OLO IS NOT NULL THEN
        v_sql := v_sql || q'[ AND INSTR(':'||']' || v('P_MULTI_OLO') || q'['||':', ':' || t1.olo_code || ':') > 0]';     
      END IF;
      RETURN v_sql;
    END;

    Did you get an answer for this?
