Step by step creation of multi dimension

Hi friends,
I need material on multi dimension (MultiProvider) creation.
Thanks in advance.
Regards,
Shafeeq Ahmed

Hi,
MultiCube is nothing but a collection of cubes (in previous versions); it is now called a MultiProvider. It is a combination of InfoObjects, InfoCubes, ODS objects, and InfoSets, but it does not contain any data physically. Only at the time of reporting does it fetch the data from those underlying objects, where the data itself remains.
A RemoteCube is another type of cube. It also does not contain data physically; it fetches the data directly from the source system, and only at the time of reporting, so it takes more time to execute.
http://infolab.usc.edu/csci599/Fall2002/paper/DML1_vassiliadis98modeling.pdf
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/2f5aa43f-0c01-0010-a990-9641d3d4eef7
http://help.sap.com/saphelp_nw04/helpdata/en/52/1ddc37a3f57a07e10000009b38f889/content.htm
Please reward points.

Similar Messages

  • Step by step procedure to use dimension operator

    Hi,
Is there any link where I can find a step-by-step procedure for using the dimension operator, that is, mapping data from the source table to the operator and from the operator to the table?
What attribute should we map to the dimension key column?
Please help me out with this.
    Thanks

    The dimension data object in the design tree is all about the semantics of the object - the description of hierarchies, levels, attributes PLUS how it is bound to some persistence.
    The dimension operator object in a mapping is all about how the semantic object is loaded with data. Source to Dimension operator (under the covers OWB knows the semantics of the dimension and how it is bound to some persistence so can generate the expanded mapping).
There is an OBE below, still pertinent to 11gR2, that explains some of this:
    http://www.oracle.com/webfolder/technetwork/tutorials/obe/db/10g/r2/owb/owb10gr2_obe_series/owb10g.htm
    You should only map business key columns (to identify a level), attributes for each level, and hierarchy relationship columns. The surrogate key columns are populated under the hood by OWB.
    Cheers
    David

  • What are the steps to rendering large dimensions in AI?

    Hi,
What is the ideal way to render a file as big as, say, 3500x7000 px in AI? This image, when exported, will be used for print.
I was thinking of opening a document that size and working on it, but AI gets real choppy and during export it takes forever.
What's the best way to export it out at 3500x7000?
Thanks
(NOTE: the print shop doesn't want to scale it; they want the final file to be the size you want it printed at)

Yes, of course. Both in preferences and in scaling. I can get an inner glow effect if scaled to 3500x7000, only it is just barely big enough to even see; at a smaller dimension the effects are more prominent.
I've tried increasing the dpi to 300 for a 350x800 px file and then scaling it to 3500x7000 px, and still nothing; worse, AI errors out. I increase it to more than 300 and it still errors out.
I've been reading a few things online about scaling strokes and effects, and people say that since it's a raster effect it doesn't scale well and has a limitation on scaling. Is this true? If so, what can I do to use these effects so I can scale? Someone mentioned using InDesign to scale it so it retains the effects and strokes even at large dimensions. I'm completely confused now. Do I need to use InDesign to get my effects scaled, or can Illustrator do this on its own?

XML Publisher CHARTS: multi-dimensions not plotted correctly

    Hi,
We're currently assessing the suitability of XML Publisher, and one of the deciding factors is creating a bar chart from an SQL query that retrieves multiple portfolios over a time frame. For example, three portfolios are selected over a period of about 3 years (but there does not have to be a performance figure for every month), and their performance is plotted against each month (i.e. 0-3 columns, depending on whether a result exists for the month). In the desktop publisher we struggled, but managed to change the Graph code until it could dynamically show the correct legend and the data rows; alas, the data points do not line up correctly with the X-axis. It seems that when the rows are grouped, each value is plotted against each column in turn, regardless of the tagged value used for the X-axis.
To put it another way: the first month had only 1 portfolio and the next month had all three, but on the graph the first month shows three columns; the records have slipped.
The code we used was adapted from a forum entry by Klaus Fabian, "Re: Stacked Bar Chart" (Servus Klaus ;-)), but we need to find out how to adapt it so that it plots the data correctly against the X-axis values.
    Help!!!!
    Cheers,
    Robin.
    Here's the code we used
    chart:
    <Graph>
    <Title text="" visible="true" horizontalAlignment="CENTER"/>
    <LocalGridData colCount="{count(xdoxslt:group(.//ROW, 'TODATE'))}" rowCount="{count(xdoxslt:group(.//ROW, 'CODE'))}">
    <RowLabels>
    <xsl:for-each-group select=".//ROW" group-by="CODE">
    <Label><xsl:value-of select="./CODE"/></Label>
    </xsl:for-each-group>
    </RowLabels>
    <ColLabels>
    <xsl:for-each-group select=".//ROW" group-by="TODATE">
    <Label><xsl:value-of select="./TODATE"/></Label>
    </xsl:for-each-group>
    </ColLabels>
    <DataValues>
    <xsl:for-each-group select=".//ROW" group-by="CODE">
    <RowData>
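<!-- This inner grouping emits a Cell only for the TODATE values present in the current CODE group, so months with no data for a portfolio are skipped and the remaining cells shift left (the misalignment described above); one Cell per ColLabel, empty when no row matches, would keep the columns aligned. -->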
    <xsl:for-each-group select="current-group()" group-by="TODATE">
    <Cell><xsl:value-of select="./PERFORMANCE"/></Cell>
    </xsl:for-each-group>
    </RowData>
    </xsl:for-each-group>
    </DataValues>
    </LocalGridData>
    </Graph>

    Hi,
I have finally found a solution to this issue...
    Two setups are required to display the charts in the concurrent program's PDF output:
    1. We need to edit the variables CLASSPATH and AF_CLASSPATH. These variables should have the complete path added for the xdoparser.zip file on the server.
    2. The DISPLAY variable should be correctly setup to direct the server output.
Also, as per my findings so far, BI (XML) Publisher version 11.1.1.3.0 (or any 11g version) does not work with EBS (at least for charts). We need to use the BI Publisher version that is XML Publisher 5.6.3 compatible for EBS, which is 10.1.3.2.1 (patch 12395372). This 10g version does not work on Windows 7, so you need to use Windows XP!
With this... finally... your charts should be displayed in the EBS output...
    Cheers!! :-)
    Archana

  • Very long Dimension build COMPILE step

    We're using AWM 11.2.0.1.0A and Oracle 11.2.0.1.0.
I have a time dimension called DAY with 7 levels. The standard hierarchy contains all 7 levels, and I added a new hierarchy for the top 6 levels, down to WEEK. The intention is that the two hierarchies are identical for the top 6 levels. There are 7420 dimension values (17 years) in total. When loading, the LOAD NO SYNCH step takes 5.2 seconds, but the following COMPILE takes 22785 seconds (6h20m).
    Can anyone shed any light on this?
    Many thanks.

    The COMPILE step of a TIME dimension does additional work over that of any other kind of dimension. Specifically it is precalculating a large number of value sets and self relations needed to support time calculations (year ago, moving average, etc.) This can be expensive when the time dimension gets large, but even so I am surprised at just how much time your case takes. (I should be clear that 7420 members counts as small for other kinds of dimensions -- I am only talking about time dimensions here.) I can only assume it is related to the depth of the hierarchies. If you don't need to use time based measures, then the simplest thing to do would be to redefine it as a user dimension. This may be worth trying as an experiment in any case since it would show whether or not it is time related.
    Assuming you need it to be a true time dimension, then it may be useful to gather some more information. If you execute the following alter session before the build, then there will be some additional records in the CUBE_BUILD_LOG that break the COMPILE step down.
    alter session set events='37377 trace name context forever, level 16384';
    exec dbms_cube.build(' "DAY" USING (COMPILE) ')
Specifically, you should see five new steps:
    COMPILE PARTITIONS
    COMPILE HIERARCHIES
    COMPILE ATTRIBUTES
    COMPILE TIME OBJECTS
    COMPILE GIDS
    My guess is that the bulk of the time will be in COMPILE TIME OBJECTS, but I don't know for sure. It may also be worth raising a Service Request on this since then you could supply a reproducible test case.
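Once those extra records are in the log, a query along these lines breaks the elapsed time down per step (a sketch; it assumes the standard 11.2 CUBE_BUILD_LOG columns such as BUILD_ID, BUILD_OBJECT, COMMAND, STATUS and TIME):
-- Show the steps of the most recent build, in order, so the slow one stands out.
SELECT build_object, command, status, time
FROM cube_build_log
WHERE build_id = (SELECT MAX(build_id) FROM cube_build_log)
ORDER BY time;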

  • Design Steps for logical modeling of Dimensional DataMart

    Hi
I am beginning to understand the process of building a data warehouse off an OLTP database, so I thought I would put down the steps involved (as I understand them) for clarity. Does this look right? Is there anything I'm missing?
Step 1 - Determine the dimensions
Create a business matrix that identifies the dimensions and their relations with each other.
Step 2 - Determine the measures
Set out the numeric values required for analysis.
Step 3 - Determine the dimension attributes
Identify elements that would be used to filter or group data for analysis.
Step 4 - Create a data map
Map out source and destination columns between the OLTP source and each dimension and fact table.
Step 5 - Include a date dimension
Almost every datamart has one, and it can be scripted (see the sketch below).
Any advice welcome.
Thanks
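For Step 5, a minimal sketch of such a script (assuming a SQL Server target, in keeping with the SSIS discussion below; the table and column names are illustrative):
-- Build a simple date dimension and populate it one day at a time.
CREATE TABLE DimDate (
    DateKey   INT PRIMARY KEY,   -- yyyymmdd surrogate key, e.g. 20120104
    FullDate  DATE NOT NULL,
    [Year]    INT NOT NULL,
    [Quarter] INT NOT NULL,
    [Month]   INT NOT NULL,
    [Day]     INT NOT NULL
);

DECLARE @d DATE = '2010-01-01';
WHILE @d <= '2020-12-31'
BEGIN
    INSERT INTO DimDate (DateKey, FullDate, [Year], [Quarter], [Month], [Day])
    VALUES (CONVERT(INT, CONVERT(CHAR(8), @d, 112)),  -- style 112 = yyyymmdd
            @d, YEAR(@d), DATEPART(QUARTER, @d), MONTH(@d), DAY(@d));
    SET @d = DATEADD(DAY, 1, @d);
END;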

    Hi Slihp!
Besides Leo N's answer, which is correct, I would modify step 4 a bit to add some general info on how the data will be cleansed, if needed, during the mapping process (usually done with the help of SSIS).
And since you have received three useful links, I will add another one which will be of help to you: Delivering Business Intelligence with Microsoft SQL Server 2012 - a very good beginner's book which walks you through all the steps of building a data warehouse.
    Regards and have fun :),
    Razvan
    Per aspera ad astra!
    journeyintobi.com

Limit dimension to a level? error! error!

    in OLAP worksheet
    step 1: show all members of channels_dim
    ->report channels_dim
    CHANNELS_DIM
    CHANNEL.2
    CHANNEL.3
    CHANNEL.4
    CHANNEL.5
    CHANNEL.9
    CHANNEL_CLASS.12
    CHANNEL_CLASS.13
    CHANNEL_CLASS.14
    CHANNEL_TOTAL.1
    step 2: show all levels of channels_dim
    ->report channels_dim_levellist
    CHANNELS_DIM_LEVELLIST
    CHANNEL
    CHANNEL_CLASS
    CHANNEL_TOTAL
    step 3:limit channels_dim dimension to 'CHANNEL' level
    ->limit channels_dim keep channels_dim_levellist 'CHANNEL'
    step 4:show status of channels_dim
    ->report channels_dim
    CHANNELS_DIM
    CHANNEL.2
Why does it have only one member in status?
Why not CHANNEL.2, CHANNEL.3, CHANNEL.4, CHANNEL.5, and CHANNEL.9?
Please give me a hand!

Use the level relation instead in the limit statement:
limit channels_dim keep channels_dim_levelrel 'CHANNEL'
This will work. The levellist is just a list of the level names, not a mapping from members to levels, so KEEP cannot use it to select members; the level relation records, for each member, the level it belongs to, which is exactly what KEEP needs here.
Suresh.
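With the level relation, the status should keep all five CHANNEL-level members listed in step 1 (expected transcript, based on the members shown above):
->limit channels_dim keep channels_dim_levelrel 'CHANNEL'
->report channels_dim
CHANNELS_DIM
CHANNEL.2
CHANNEL.3
CHANNEL.4
CHANNEL.5
CHANNEL.9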

CBO: OWB Dimension Performance Issue (DIMENSION_KEY=DIM_LEVEL_ID)

Hi,
In my opinion the OWB dimensions are very useful, but sometimes there are performance issues.
I work with OWB dimensions quite a lot, and with big dimensions (> 100,000 rows) I often get performance problems when OWB generates the code to load (the merge step) or look up these dimensions.
OWB dimensions have a PK on DIMENSION_KEY, and level surrogate IDs which are equal to the DIMENSION_KEY if the row is a member of that level (and not a parent hierarchy member).
I have hunted the problem down to the condition DIMENSION_KEY = (DETAIL_)LEVEL_SURROGATE_ID. OWB generates it to select only the rows with (detail-)level attributes.
But it seems that the CBO isn't able to predict the cardinality of that condition: it always assumes the result cardinality is 1 row. So I assume that condition is the reason for the "bad" execution plans, i.e. the NESTED LOOPS OUTER with the inline view at cardinality = 1.
Example:
SELECT COUNT(*) FROM DIM_KONTO_TAB WHERE DIMENSION_KEY = KONTO_ID;
-- 2506194
Explain plan for:
SELECT DIMENSION_KEY, KONTO_ID
FROM DIM_KONTO_TAB WHERE DIMENSION_KEY = KONTO_ID;
| Id | Operation                  | Name          | Rows | Bytes | Cost (%CPU)| Time     | Pstart| Pstop |
|  0 | SELECT STATEMENT           |               |    1 |    12 |  12568 (3) | 00:00:01 |       |       |
|  1 |  PARTITION HASH ALL        |               |    1 |    12 |  12568 (3) | 00:00:01 |     1 |     8 |
|* 2 |   TABLE ACCESS STORAGE FULL| DIM_KONTO_TAB |    1 |    12 |  12568 (3) | 00:00:01 |     1 |     8 |
Predicate Information (identified by operation id):
   2 - storage("DIMENSION_KEY"="KONTO_ID")
       filter("DIMENSION_KEY"="KONTO_ID")
Or, for loading an SCD2 dimension:
|* 12 |  FILTER                      |                      |      |       |         |          | Q1,01 | PCWC |
|  13 |   NESTED LOOPS OUTER         |                      | 328K | 3792M | 3968 (2)| 00:00:01 | Q1,01 | PCWP |
|  14 |    PX BLOCK ITERATOR         |                      |      |       |         |          | Q1,01 | PCWC |
|  15 |     TABLE ACCESS STORAGE FULL| OWB$KONTO_STG_D35414 | 328K | 2136M |   27 (4)| 00:00:01 | Q1,01 | PCWP |
|  16 |    VIEW                      |                      |    1 |  5294 |         |          | Q1,01 | PCWP |
|* 17 |     TABLE ACCESS STORAGE FULL| DIM_KONTO_TAB        |    1 |   247 |  489 (2)| 00:00:01 | Q1,01 | PCWP |
I have tried a lot:
- statistics are gathered often, with monitoring information and (frequency) histograms on the condition columns
- created extended statistics: DBMS_STATS.CREATE_EXTENDED_STATS(USER, 'DIM_KONTO_TAB', '(DIMENSION_KEY, KONTO_ID)')
- created a combined index on DIMENSION_KEY, LEVEL_SURROGATE_ID
- read a lot
- hinted the queries in OWB (but it seems the inline view is too complex to use a hash join)
Next step:
- tracing the CBO optimizer events.
Does someone have an idea how to help the CBO get the cardinality right?
If you need more information, please tell me.
Thanks a lot.
Moritz
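Two experiments that may be worth trying against the table from the post (a sketch, not OWB-generated code; the alias, sampling level, and cardinality value are illustrative). Dynamic sampling makes the optimizer probe the table at parse time, which copes with a correlated predicate like DIMENSION_KEY = KONTO_ID better than stored column statistics; a cardinality hint is a blunter workaround:
-- 1. Let the optimizer sample the table at parse time.
SELECT /*+ dynamic_sampling(d 4) */ COUNT(*)
FROM dim_konto_tab d
WHERE dimension_key = konto_id;

-- 2. Tell the optimizer the expected row count directly.
SELECT /*+ cardinality(d 2500000) */ COUNT(*)
FROM dim_konto_tab d
WHERE dimension_key = konto_id;
If the dynamic sampling estimate comes out near the real 2,506,194 rows, that confirms the column correlation is what the CBO is missing.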

    Hi Patrick,
For a relational dimension, these values must be unique within the level. It is not required to be a numeric ID (although a numeric ID best follows surrogate key practice).
If you use the same sequence for the whole dimension, you have ensured that each entry in the entire dimension is unique, which means you can move your data as-is into OLAP solutions. We will do this as well in the next major release.
    Hope that helps,
    Jean-Pierre

OBIEE dimension hierarchy?

What is the purpose of a dimension hierarchy in OBIEE?

Dimension hierarchy -
Introducing formal hierarchies into a business model establishes levels for data groupings and calculations and provides paths for drill-down.
Steps to create a dimension hierarchy:
    Create a dimension object.
    Add a parent-level object.
    Add child-level objects.
    Determine number of elements.
    Specify level columns.
    Create level keys.
    Create a level-based measure.
    Create additional level-based measures.
    Create share measures.
    Create rank measures.
    Add measures to Presentation layer.
    Test share and rank measures.
    kp

OLAP universe - dimension must always contain a value

Hi,
I need a dimension that always contains a constant value; it must always contain "ZTEST_RGB_23810". The value is not in the data, so I can't use a filter. Can it be done? We have an OLAP universe.
    Regards,
    Rikke

    Hi,
    If you have BO XI3.1 SP2, then you can try the following steps:
    1. create one dimension object in the universe
    2. In the select clause mention: <EXPRESSION>"ZTEST_RGB_23810"</EXPRESSION>
    Regards

  • Create an mp4 file from mov without changing dimensions

    I am working on a project for InDesign that needs to output to DPS.
    I take screen capture movies which are saved in the default mov format. These movies are 700 px x 400 px. That can't change.
    I need to convert the mov to an mp4.
    I open AME but I am confused as to how to set the output controls.
    I don't want to change the dimensions.
    What setting will convert the mov to mp4 without changing the dimensions?

    MPEG4 is one of the formats that has relationships among various settings called constraints. More specifically, frame dimensions and frame rate are constrained by Profile and Level. Here are the steps to set the dimensions as desired:
1. Open the Export Settings dialog for a job.
2. Set Format to MPEG4.
3. Click the Video tab.
4. Set Profile to Advanced Simple.
5. Set Level to Level 5.
6. To the right of Frame Width and Frame Height, click the chain link icon to decouple width and height.
7. Set Width to 700.
8. Set Height to 400.
9. (optional) Change any other settings, such as Frame Rate and Bitrate, as desired.
10. (optional, but highly recommended for future convenience) Click the Save Preset button to the right of the Preset control in the upper right corner (in CS5.5 and before, the button icon is a floppy disc; in CS6, it's an arrow pointing down toward a rectangle, which I presume represents a disc drive). Enter a name for the custom preset in the dialog that pops up, then click OK.
11. Back in the Export Settings dialog, click OK (if you opened this dialog from AME) or Queue (if you started from PPRO).
The preset that you saved in step 10 will be shown in the list of presets for MPEG4, so you can use it for future encoding jobs. If you have AME CS6, custom presets are also displayed in the section at the top of the new Preset Browser.

  • Create Cubes and Dimension Help

    Hi Experts,
I am using version 11.1.0.7.0 of the Oracle Database and the OWB client. I am trying to create dimensions and cubes using the OWB tool, but every time I am unsuccessful.
1. I tried using the existing dimensions and cubes in the SH schema as a base. However, when I deployed, it gave me an error.
2. I tried to use the examples from the Oracle library website:
http://www.oracle.com/webfolder/technetwork/tutorials/obe/db/11g/r1/owb/owb11g_update_extend_knowledge/less5_modeling_target/less5_modeling_target.htm
But trying that also produced errors.
So, the other way round, I have now installed AWM (Analytic Workspace Manager), which easily makes cubes and dimensions and also exports to the OBIEE repository. But for some reasons I have to switch back to OWB. Are there more documents which describe each and every step of preparing the dimensions?
Regards,
Ravi Remje

    Hi Ravi
    There are OBEs which include dimensional object creation (the 10gR2 one is applicable to 11gR1);
    http://www.oracle.com/webfolder/technetwork/tutorials/obe/db/10g/r2/owb/owb10gr2_obe_series/owb10g.htm
    Cheers
    David

  • Legal Consol: Received CSD Errors when running SPRUNCONSO

    Hi All,
I configured a Legal Consolidation application under BPC 7.0 MS and I get the error messages below when running SPRUNCONSO:
    Warning : Nothing Extract From Ownership for OPENING Period
    ERROR CSD-130 Problem Extracting data from the Fact
    ERROR CSD-135 Problem Extracting data : C_FINANCE
    ERROR CSD-140 Problem extracting Data : C_DATA
    ERROR CSD-150 Problem extracting Data : C_REPART
    ERROR CSD-160 Problem extracting Data : C_CONSO
    20091200 - 0 Rows Calculated
    20091200 - 0 Rows Updated
    Below are my configuration steps:
    1. Account Dimension:
        - Populate TYPELIM Property for account related to interco elimination.
    2. Datasrc Dimension:
        - Populate IS_CONSOL Property (value = Y) for source DataSrc (INPUT)
        - Populate DATASRC_TYPE Property (value = A) for destination DataSrc (AJ_ELIM)
    3. Entity Dimension:
        Property ELIM = Y for entity related to elimination
    4. Groups Dimension:
        Currency_Type = G for Elimination Group
5. Business Rule:
        - Automatic Adjustment:
           Adjustment ID = ELIM01
           Source Data Source = INPUT
           Destination Data Source = AJ_ELIM
           Adjustment Type = Generic
          Adjustment Level = 0
        - Automatic Adjustment Detail:
           Adjustment ID = ELIM01
           Source Account = IC_APAR
           Destination "ALL" Account = IC_APAR (same value under TYPELIM Property for account dimension member)
           Destination Group Account = IC_APAR_CL (same value under TYPELIM Property for account dimension member)
           RuleID = RULE040
6. Business Rule Library:
        - Consolidation Rules:
           RuleID = Rule040
           Rule Type = ALL
        - Consolidation Method: 100
        - Consolidation Rules Formula:
          RuleID: RULE040
          EntityMethod: 100
          IntcoMethod: 100
          "All" Formula: 1
          Group Formula: 1
7. Set Ownership data for 2008.Dec until 2009.Dec (data existed in tblfactownership)
8. Set Foreign Exchange Rate from 2008.Dec until 2009.Dec
9. Stored Procedure:
        *RUN_STORED_PROCEDURE = SPRUNCONVERSION([%APP%], [%CATEGORY_SET%], [], [%GLOBAL], [%SCOPETABLE%], [%LOGTABLE%])
        *RUN_STORED_PROCEDURE = SPRUNCONSO([%APP%], [%CATEGORY_SET%], [%SCOPETABLE%], [%LOGTABLE%])
        *COMMIT
I've been trying to crack it, but I still receive the same error message. Can anyone share what I might have missed?
    Thanks a lot for the advice,
    Liam

    Hi,
I think there is a syntax problem in your SPRUNCONSO call: the GROUPS_SET variable is missing. The syntax should be like this:
*RUN_STORED_PROCEDURE = SPRUNCONSO([%APP%],[%CATEGORY_SET%],[%GROUPS_SET%],[%SCOPETABLE%],[%LOGTABLE%])
In addition, not sure if it's a typo or not, but it seems that your call to SPRUNCONVERSION is also wrong; some variables are missing:
*RUN_STORED_PROCEDURE=SPRUNCONVERSION([%APP%],[%CATEGORY_SET%],[%CURRENCY_SET%],[GLOBAL],[%SCOPETABLE%],[%LOGTABLE%])
Then, for all the error codes, you can get rid of them by adding ON_ERROR_CONTINUE at the end of the line.
Normally, these kinds of error messages are generic, and inform the user that there are actually no records for the SPRUNCONSO stored procedure to use as a source.
    Hope this will help you.
    Kind Regards,
    Patrick

  • We have a situation where in an existing application after go live we need

We have a situation where, in an existing application, after go-live we need to add a new dimension. After adding the dimension, the existing data will not have any value against this dimension (it will have only null values). If so, will it create a problem in loading or reporting? How do we resolve it?

    My experience (on BPC 5.1 MS, and earlier versions) has been as follows:
    1.) create a new dimension, with at least 1 member. Pay particular attention to which member is the first base member in the hierarchy. (If you're planning to have multiple hierarchies in this dimension, wait for now on the ParentH2 etc. Start with just 1 hierarchy until you've completed these steps.) Process the dimension.
    2.) add that dimension to an existing application. When the application is processed, all of the existing data is assigned to that first base member of the new dimension.
    3.) If that's not sufficient, and you want to assign some data to another member of this new dimension, either use the "Move" package, or write custom SQL script logic, to get the data assigned to the correct members.
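A hypothetical sketch of the custom-SQL option on the MS platform (the fact table name follows the usual tblFact<AppName> pattern and the member IDs are purely illustrative, none of them from this thread); after the new dimension's column exists in the fact table, reassign the rows in question and reprocess the application:
UPDATE tblFactFinance
SET    NEWDIM = 'TARGET_MEMBER'    -- the member the data should belong to
WHERE  NEWDIM = 'DEFAULT_MEMBER'   -- the first base member it was assigned to
  AND  ENTITY = 'SOME_ENTITY';     -- restrict to the slice being moved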

  • How to make operation Average work on base level of a hierarchy?

I used the Oracle OLAP 11g sample schema OLAPTRAIN to make a test, and found something that really confused me.
Here were my steps:
I made a dimension "TIME" with a hierarchy "calendar": all years <- year <- quarter <- month.
Then I built two cubes:
Cube "AVG_TEST" using dimension "TIME" with operator Average.
Cube "SUM_TEST" using dimension "TIME" with operator Sum.
After maintaining both the dimension and the cubes, I queried the views AVG_TEST_VIEW and SUM_TEST_VIEW and noticed the cube values at month level were the same for both cubes. It was pretty clear that the operator Average didn't work at month level within cube "AVG_TEST"; Sum was used instead.
Could anyone tell me why the operator didn't work on the base level of the hierarchy? I have a requirement to compute average values at day level, but I can't add an hour level to the dimension because of the huge data quantity and the speed of maintaining the cube. Is there a way to make the operator Average work for the base level of a hierarchy?
Any help will be really appreciated!
Satine

Hey Brijesh,
Thank you for your help!
I am still confused. Even though month is the leaf level of the dimension, the fact data is at day-level precision, not month, so the value at month level should still be computed with my specified operator Average, not Sum. I mean, if it can be computed with Sum (the value at month level actually was the summed-up data from the fact table), then why not with Average at the leaf level?
My test data is from the "Oracle OLAP 11g Sample Schema OLAPTRAIN" and looks like this:
SQL> select * from sales_fact where to_char(day_key,'yyyy-mm')='2005-04' and rownum=1;
  QUANTITY      PRICE      SALES DAY_KEY         PRODUCT    CHANNEL   CUSTOMER
         1        125        125 13-APR-05          5298         21     284846
SQL> select day_key,month_id from times where rownum=1;
DAY_KEY      MONTH_ID
31-JAN-05    JAN2005
The cube name is AVG_TEST, and the fact data comes from sales_fact.sales. From the output below, it is pretty clear the value at month level is the summed-up data even though the Average operator was specified.
SQL> SELECT * FROM AVG_TEST_VIEW where time='APR2005';
       AVG TIME
7849372.89 APR2005
SQL> select sum(sales) from sales_fact where to_char(day_key,'yyyy-mm')='2005-04';
SUM(SALES)
7849372.89
Thanks
Satine
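One way around this, sketched against the OLAPTRAIN tables quoted above: since the cube's Average operator only aggregates from the leaf (month) upward, pre-average the day-level rows to the month leaf in the source the cube loads from, so the leaf already holds daily averages (the view name is illustrative):
-- Averages day-level fact rows up to the month leaf before the cube load.
CREATE OR REPLACE VIEW sales_fact_month_avg AS
SELECT t.month_id,
       f.product,
       f.channel,
       f.customer,
       AVG(f.sales) AS sales
FROM   sales_fact f
JOIN   times t ON t.day_key = f.day_key
GROUP  BY t.month_id, f.product, f.channel, f.customer;
Mapping the cube to this view instead of sales_fact loads month-level averages; note that higher levels then hold averages of the monthly averages, which is what the Average operator computes from the leaf upward.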
