What is summary level concept in data warehousing?
I am new to data warehousing.
Can anybody explain the summary-level and higher-level concepts in data warehousing?
If I understood your question:
Data warehouses are intended for analytic processing (OLAP). Basically, they are built for queries that would take too long in an OLTP (transactional processing) system. So you build a new database with a relatively denormalized model (you have probably heard about star-schema modeling, etc.).
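A minimal sketch of the idea, using hypothetical data (the table and field names are made up for illustration): a detailed fact table is joined to a dimension and rolled up to a summary level, which is exactly the kind of aggregate query a warehouse is built for.

```python
# A toy star schema: one fact table plus one dimension table (hypothetical data).
# An OLAP-style query joins them and rolls facts up to a summary level (region).
from collections import defaultdict

# Dimension table: customer_id -> region
dim_customer = {1: "North", 2: "North", 3: "South"}

# Fact table: one row per sale at the most detailed (transaction) level
fact_sales = [
    {"customer_id": 1, "amount": 100.0},
    {"customer_id": 2, "amount": 250.0},
    {"customer_id": 3, "amount": 75.0},
]

def sales_by_region(facts, dim):
    """Join facts to the dimension and aggregate to the summary level."""
    totals = defaultdict(float)
    for row in facts:
        totals[dim[row["customer_id"]]] += row["amount"]
    return dict(totals)

print(sales_by_region(fact_sales, dim_customer))
# {'North': 350.0, 'South': 75.0}
```

In a real warehouse this rollup would be a GROUP BY over the star schema, often precomputed in summary tables or materialized views; the sketch only shows the shape of the operation.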
Look for the data warehousing concept books by Bill Inmon (Building the Data Warehouse) and Ralph Kimball (The Data Warehouse Lifecycle Toolkit: Expert Methods for Designing, Developing, and Deploying Data Warehouses). Mr. Inmon and Mr. Kimball pretty much "created" the data warehousing concept. You'll also find a book from Oracle Press named "Oracle XX Data Warehousing Guide" (XX = DB release).
Also, you have plenty of documentation on OTN about how to create a data warehouse. See the thread below:
OLAP and cubes tutorials
Hope it's useful for you!
Regards,
Marcos
Similar Messages
-
What are the uses of different concepts in data warehousing
What are the uses of different concepts in data warehousing? Why?
naveen
Hi,
Your statement is correct. To be crisp, the portal offers a single point of access to SAP and non-SAP information sources, enterprise applications, information repositories, databases, and services both inside and outside your organization, all integrated into a single user experience. It provides you the tools to manage this knowledge, to analyze and interrelate it, and to share and collaborate on the basis of it.
With its role-based content and personalization features, the portal enables users, from employees and customers to partners and suppliers, to focus exclusively on data relevant to daily decision-making processes.
To read more visit,
http://help.sap.com/saphelp_nw04/helpdata/en/a9/76bd3b57743b09e10000000a11402f/frameset.htm
Regards
Srinivasan T -
What is Web Dashboard Design in Data Warehousing Workbench?
Hi Gurus!
I have been trying to find this out since last night. What is this Web Dashboard Design that is available through the Data Warehousing Workbench (RSA1)? Any ideas will be given points. Please, it's urgent!
Jaya, your answer is something I had already figured out for BEx, but my question was about the Workbench and not BEx. However, I have awarded 6 points for your attention to this question, and your link was very informative on the BEx side. Thanks a lot.
-
Ms project 2010 - grouping how to show the duration between dates at the summary level
Hi we have a ms project plan where we have used grouping to present the task breakdown in a different structure.
This works great: at the summary level we get the start and finish dates correctly, but the duration only gives us the maximum value of the durations of the tasks within the group. (Is it possible to change the maximum calculation?)
We would like to be able to determine the duration at the summary level. Of note: if we use functions such as ProjDateDiff(Start, Finish), this only gives the result at the detail level and does not show at the summary level, even if we check the 'Use formula' radio button for the calculation at the task and group levels.
Is this a bug?
If we use a numeric calculation such as Finish - Start, we get a figure that allows us to repeat the calculation at the group (summary) level, but we do not actually know what the numeric value represents. For example, (Finish - Start) for a duration of 0.5 hours gives 208.33 and 1 hr gives 416.67, while 1.5 hrs gives 625, but 8 hrs gives 10833.33, which is about 26 times the 1-hour figure.
Duration    Finish-Start    Start           Finish
0.5 hrs     208.33          Tue 31/01/12    Tue 31/01/12
1 d         10833.33        Tue 31/01/12    Wed 01/02/12
8 hrs       10833.33        Tue 31/01/12    Wed 01/02/12
0.19 d      625             Wed 01/02/12    Wed 01/02/12
1.5 hrs     625             Wed 01/02/12    Wed 01/02/12
0.13 d      416.67          Wed 01/02/12    Wed 01/02/12
1 hr        416.67          Wed 01/02/12    Wed 01/02/12
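The figures in that table are consistent with (Finish - Start) returning elapsed (calendar) days scaled by 10,000. That scale factor is an inference from the numbers alone, not documented Microsoft Project behaviour, so treat it as an assumption; under it, the 1 d and 8 hrs rows come to 26 elapsed hours, presumably because those tasks span a nonworking overnight gap.

```python
# Hypothesis: the raw Finish - Start figure is elapsed days * 10,000
# (an assumption inferred from the table, not documented MSP behaviour).
SCALE = 10_000  # assumed scale factor

def elapsed_hours(raw):
    """Convert the raw (Finish - Start) value back to elapsed hours."""
    return round(raw / SCALE * 24, 2)

for raw in (208.33, 416.67, 625.0, 10833.33):
    print(raw, "->", elapsed_hours(raw), "elapsed hours")
# 208.33 -> 0.5, 416.67 -> 1.0, 625.0 -> 1.5, 10833.33 -> 26.0
```

If the assumption holds, the value measures wall-clock time between the two dates rather than working duration, which is why it cannot be read directly as days of work.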
Anyhow, I am looking for suggestions on how to get the difference between two dates, as units (preferably days), into the group-level summary.
Many thanks, Mike
Hi John,
This is better. Now the Text1 column shows the correct duration. Is it possible to display the Text1 column instead of the Duration column? Is that an acceptable practice in Microsoft Project scheduling? This is an alternate solution... do you think there is a solution to the original problem?
Task Name                                      Text1   Duration   % Complete   Start        Finish
F10E (Level 10)                                273     40d        0%           10 Apr '12   10 May '13
Hospital Admin                                 40      40d        0%           10 Apr '12   05 Jun '12
Hospital Move - Vacate F10E                    40      40d        0%           10 Apr '12   05 Jun '12
Eastern/Kelson                                 238     40d        0%           30 May '12   10 May '13
Investigate Above Ceiling Conditions           20      20d        0%           30 May '12   26 Jun '12
Hoardings                                      5       5d         0%           06 Jun '12   12 Jun '12
Demolition (All)                               20      20d        0%           20 Jun '12   18 Jul '12
HVAC - Main Duct Work, Fire Dampers            40      40d        0%           19 Jul '12   14 Sep '12
Layout Partitions                              5       5d         0%           19 Jul '12   25 Jul '12
Steel Stud Framing & Hollow Metal Frames       30      30d        0%           17 Sep '12   29 Oct '12
Ceiling and Bulk Head Framing                  10      10d        0%           30 Oct '12   12 Nov '12
Mechanical Rough-In to Ceiling                 25      25d        0%           13 Nov '12   17 Dec '12
Taping, Drying & Sanding                       20      20d        0%           04 Dec '12   03 Jan '13
Ceiling and Bulk Head Drywall                  10      10d        0%           04 Dec '12   17 Dec '12
Painting                                       15      15d        0%           04 Jan '13   24 Jan '13
T-Bar and Acoustic Ceiling Tile With Devices   11      11d        0%           22 Jan '13   05 Feb '13
Head Walls System                              7       7d         0%           25 Jan '13   04 Feb '13
P-Lam Corridor Canopy Ceiling                  15      15d        0%           25 Jan '13   14 Feb '13
P-Lam Corridor and Room Canopy Ceiling         10      10d        0%           25 Jan '13   07 Feb '13
Steel Doors, Wood Doors, Hardware              15      15d        0%           08 Mar '13   28 Mar '13 -
Can I summarize data from a line item ODS into a higher summary level?
Hello friends,
1) Can I summarize data from a line-item ODS into a higher summary level and store the results in another ODS? If so, can you discuss the approach at a high level, i.e. whether or not a start routine is needed?
2) Once data is stored in an ODS, can a routine be written to update specific fields in the ODS after the data has already been loaded into the ODS. As a simple example, is it possible to write a routine that simply multiplies "Field A" by "Field B" and stores the result in the ODS as "Field C"? (Most of the time I would do this calculation in Bex, but in this one case, I have the need to calculate and store "Field C" in the ODS.)
Thank you.
1. Yes, you can summarize data into another ODS. Connect both ODSes and use addition for the key figures instead of overwriting.
2. You can do it at load time, in the update rules; it is not necessary to wait until the ODS has been loaded. -
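A sketch of both answers in Python rather than ABAP update rules (illustrative only; the field and ODS names are invented): the derived Field C is computed per record at load time, and the rollup to the summary ODS adds key figures instead of overwriting them.

```python
# Illustrative sketch, NOT ABAP: line-item records are enriched at "load time"
# and then summarized to a higher level with additive key figures.
from collections import defaultdict

line_items = [  # hypothetical line-item ODS records
    {"order": "4711", "item": 10, "field_a": 2.0, "field_b": 5.0},
    {"order": "4711", "item": 20, "field_a": 3.0, "field_b": 4.0},
    {"order": "4712", "item": 10, "field_a": 1.0, "field_b": 9.0},
]

# Answer 2: derive Field C = Field A * Field B per record, during the load
for rec in line_items:
    rec["field_c"] = rec["field_a"] * rec["field_b"]

# Answer 1: summarize to a higher level; key figures are ADDED, not overwritten
summary_ods = defaultdict(float)
for rec in line_items:
    summary_ods[rec["order"]] += rec["field_c"]

print(dict(summary_ods))
# {'4711': 22.0, '4712': 9.0}
```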
Hi All,
In my reports, we have to display both detail and summary level data.
Detail column - Director, Region, Employee, Application Id,product id, Decision date, Avg Qty
Summary Column - Director, Region , Employee, Avg Qty
Issue - The data at the summary level is incorrect.
Approach - At the summary level, when we pull in product id and decision date we get correct results, but the data splits for each product and decision date. How do we achieve this in the reports?
Request inputs
Thanks,
Nithya
Edited by: 973422 on Mar 24, 2013 8:30 PM
Hi Nithya,
Would you please provide more information
"In summary level, when we pull product id and decision date we get correct results but data will split for each product & decision date" - this is not clear.
Provide some sample data with Issue Explanation
Thanks
NK -
What is Header level data and item level data? Please elaborate.
What is Header level data and item level data? Please elaborate.
Details:
EKKO is the Purchasing Document Header; what is a purchasing document header? Who, what, and where can I look for this data?
EKPO is the Purchasing Document Item; what does a purchasing document item mean?
What does item-level data mean?
When you take the scenario of SRM and ECC, where is the purchase data maintained primarily - in SRM or in ECC?
Thanks in advance. I will assign the points only to the valuable information.
York.
Hi York,
You are right in stating EKKO as header and EKPO as line item data. They are maintained in ECC.
Now for the details about the data:
In SAP, every transaction is referred to as a document. In this case you are talking about a purchase document. Now what does a purchasing document contain? It will contain information like: Who is the vendor? When was the transaction done? What was bought? What is the quantity of each item bought?
Whatever is applicable across the whole document is called header data, and whatever is applicable to each item is called line-item data. In this case:
1. Header data would be the vendor, the date of the transaction, the purchasing organization, etc.
2. Line-item data would be the item details, the quantity of the item, the unit price of each item, etc.
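The split can be sketched as a small data structure (the vendor, materials, and prices here are invented examples, not real EKKO/EKPO content): one header object holding document-wide fields, with a list of item objects holding the per-item fields.

```python
# Hypothetical purchase document modelled as header data plus line-item data.
from dataclasses import dataclass, field

@dataclass
class Item:                # EKPO-like: one row per item bought
    material: str
    quantity: int
    unit_price: float

@dataclass
class PurchaseDocument:    # EKKO-like: fields that apply to the whole document
    vendor: str
    date: str
    purchasing_org: str
    items: list = field(default_factory=list)

    def total_value(self):
        # A header-level figure derived by aggregating the line items
        return sum(i.quantity * i.unit_price for i in self.items)

doc = PurchaseDocument("ACME Ltd", "2013-03-24", "1000")
doc.items.append(Item("Bolt M8", 100, 0.25))
doc.items.append(Item("Nut M8", 100, 0.50))
print(doc.total_value())  # 75.0
```

The header-to-item relation is one-to-many, which is why EKKO and EKPO are separate tables joined by the document number.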
Hope this helps.
Best regards,
Kazmi -
EDW: Data Warehousing Concept
Hi All,
I am new to this data warehousing project. I am not familiar with concepts like staging, ETL, etc.
I am looking for an eBook on EDW for beginners that describes the basic concepts, architecture, workflow, and so on.
Please share any document on data warehousing at [email protected]
Thanks!
Did you read the Data Warehousing Guide http://docs.oracle.com/cd/E11882_01/server.112/e25554/toc.htm ?
-
What are Header Level, Item Level and Schedule Level data?
Hi ,
Can anyone please explain what Header Level, Item Level, and Schedule Level data are in the R/3 system, i.e. what data structures they actually contain? If there is any document or link available, please do send it. It's urgent.
Thanks
Prashant Singhal
Hi Prashant,
check this link.
[Extractors;
Regards,
Harold. -
What is the difference between Data Warehousing and Big Data?
What is the difference between Big Data and
Data Warehousing? Are they the same? Similar? If not, when should each of them be used?
Any link to a paper that describes the difference?
Big Data is a term applied to data sets whose size is beyond the ability of commonly used tools to capture, manage, and process within a tolerable elapsed time. A data warehouse, on the other hand, is a collection of data marts representing historical data from the different operations in the company.
In other words, Big Data is a collection of very large data sets, while a data warehouse collects data from the different departments of an organization and requires efficient management techniques. Conceptually they are alike only in one respect: both involve large amounts of information.
For Big Data you can start researching HDInsight, and for data warehousing check MSBI.
Sandip Pani http://SQLCommitted.com -
What about data warehousing?
I have attended the 10g launch, perused the CD I got there and listened to a couple of online seminars. I can easily envision how 10g benefits OLTP applications, but data warehousing seems to be ignored in the presentations.
Does 10g offer anything over 9i for data warehousing, which has quite different resource demands? Is there any published information I could read?
DATA WAREHOUSING
Oracle Database 10g has also enhanced its data warehouse and business intelligence capabilities, which results in
further reduction of the total cost of ownership while enabling customers to derive more value from their data and
supporting real time data feeds.
Consolidation and integration of traditionally disparate business intelligence systems into a single integrated engine is
further enhanced in Oracle Database 10g. Database size limits have been raised to millions of terabytes. Business
Intelligence applications can be consolidated alongside transactional applications using Real Application Clusters
automatic service provisioning to manage resource allocation. This consolidation means analysis can be performed
directly against operational data and resource utilization can be maximized by reallocating servers to workloads as the
business needs change. The value of data is increased with the ability to perform even more diverse analytic
operations against core data with enhanced OLAP analytics, a data mining GUI, and a new SQL MODEL feature. The
SQL MODEL clause allows query results to be treated as sets of multidimensional arrays upon which sophisticated
interdependent formulas are built. These formulas can be used in complex number-crunching applications such as
budgets and forecasts without the need to extract the data to a spreadsheet or perform complex joins and unions.
Real Time Warehousing is enabled either by consolidating business intelligence with operational applications, or by
new change data capture capabilities based on Oracle Streams which produce low or zero latency trickle feeds with
integrated ETL processing.
Joel Pérez
What is the serialization concept in ALE/IDOC?
what is the serialization concept in ALE/IDOC?
Hi Srinu ,
IDoc Serialization means, sending/posting the idocs in sequence.
We serialize IDocs in the following cases:
· If you want the Integration Server to process the corresponding IDoc XML messages in the same sequence that it receives them from the IDoc adapter at the inbound channel.
· If you want the receiver to receive the IDocs in the same sequence that the IDoc adapter sends them at the Integration Server outbound channel.
The sequence at the Integration Server inbound or outbound channel can only be guaranteed if only IDocs are processed, and not if different protocols (for example, IDocs and proxies) are processed together.
Do not confuse IDoc serialization using the IDoc adapter with the ALE serialization of IDocs.
Prerequisites
· The quality of service EOIO (Exactly Once In Order) must be specified in the message header.
· The receiver system or the sender system must be based on SAP Web Application Server 6.40 or higher. If this is not the case, the quality of service is automatically changed to EO for compatibility reasons and the message is processed accordingly.
Procedure
If you want the Integration Server to process the IDoc XML messages created by the IDoc adapter in the same sequence that the IDocs are sent by your application, proceed as follows:
· Enter a queue name in your application. You can use 16 alphanumeric characters. The prefix SAP_ALE_ is then added.
The IDoc adapter checks the prefix and replaces it with the prefix of the corresponding Integration Server inbound queue (for example, XBQI0000).
If you want the receiver to receive the IDocs in the same sequence that they are sent by the Integration Server using the IDoc adapter, proceed as follows:
· In the communication channel, select the check box Queue processing for the receiver.
The IDoc adapter replaces the prefix of the outbound queue (XBQO) with the prefix SAP_ALE_.
You can display the individual messages in the qRFC monitor of the outbound queue. To do this, do one of the following:
· Use the queue ID in the list of displayed messages in the monitor for processed XML messages.
· Use the transaction ID in the list of displayed XML messages in the IDoc adapter.
· Call the transaction qRFC Monitor (Outbound Queue) (SMQ1).
To navigate directly to the display of messages in the IDoc adapter, double click the transaction ID of a message in the outbound queue.
To do this, you must have registered the display program IDX_SHOW_MESSAGE for the outbound queue in the qRFC administration (transaction SMQE) beforehand.
In both cases, the function module IDOC_INBOUND_IN_QUEUE is called, which enables EOIO processing of the messages. The processing sequence is determined by the sequence of the function module calls.
Unlike the other function modules (interface versions from the communication channel), with this function module you have to transfer segment types rather than segment names in the data records.
Serialization of Messages
Use
Serialization plays an important role in distributing interdependent objects, especially when master data is being distributed.
IDocs can be created, sent and posted in a specified order by distributing message types serially.
Errors can then be avoided when processing inbound IDocs.
Interdependent messages can be serially distributed in the following ways:
Serialization by Object Type
Serialization by Message Type
Serialization at IDoc Level
(not for IDocs from generated BAPI-ALE interfaces)
Serialization at IDoc Level
Use
Delays in transferring IDocs may result in an IDoc containing data belonging to a specific object arriving at its destination before an "older" IDoc that contains different data belonging to the same object. Applications can use the ALE Serialization API to specify the order in which IDocs of the same message type are processed, and to prevent old IDocs from being posted if processing is repeated.
SAP recommends that you regularly schedule program RBDSRCLR to clean up table BDSER (old time stamp).
Prerequisites
IDocs generated by BAPI interfaces cannot be serialized at IDoc level because the function module for inbound processing does not use the ALE Serialization API.
Features
ALE provides two function modules to serialize IDocs which the posting function module has to invoke:
· IDOC_SERIALIZATION_CHECK
checks the time stamps in the serialization field of the IDoc header.
· IDOC_SERIAL_POST
updates the serialization table.
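The division of labour between those two function modules can be sketched as follows. This is an illustrative Python sketch of the timestamp idea only, not the real IDOC_SERIALIZATION_CHECK / IDOC_SERIAL_POST interfaces; the object key and timestamps are invented.

```python
# Illustrative sketch of serialization at IDoc level (NOT the real SAP API):
# an IDoc is only posted if its timestamp is newer than the last one posted
# for the same object, so a delayed "old" IDoc cannot overwrite newer data.
last_posted = {}  # plays the role of table BDSER: object key -> timestamp

def serialization_check(obj_key, timestamp):
    """Analogue of IDOC_SERIALIZATION_CHECK: may this IDoc be posted?"""
    return timestamp > last_posted.get(obj_key, 0)

def serial_post(obj_key, timestamp):
    """Analogue of IDOC_SERIAL_POST: record the timestamp after posting."""
    last_posted[obj_key] = timestamp

# Arrival order 100, 300, 200: the 200 IDoc was delayed and is now outdated.
for ts in (100, 300, 200):
    if serialization_check("CUSTOMER_0815", ts):
        serial_post("CUSTOMER_0815", ts)
        print("posted", ts)
    else:
        print("skipped outdated", ts)
# posted 100 / posted 300 / skipped outdated 200
```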
Check the following link:
http://help.sap.com/saphelp_nw04s/helpdata/en/0b/2a66d6507d11d18ee90000e8366fc2/frameset.htm
http://help.sap.com/saphelp_nw04s/helpdata/en/78/2175a751ce11d189570000e829fbbd/frameset.htm
Ex: ADRMAS, DEBMAS(customer)
ADRMAS, CREMAS(Vendor)
In this case, Before posting Customer Data or Vendor Data it requires Address Data.
Rgds
Sree m -
Hello People,
Now I am engaged in a new data warehousing project. It is the first time I am building a data warehouse, so I have encountered a lot of difficulties. I hope I can get help from the accommodating people here.
The source data for this data warehouse comes from the log files of an SMS (cell phone short message) gateway. The log files record every short message, whether sent from a user's cell phone to a service-processing application (an MO message) or sent from a service-processing application to a user's cell phone (an MT message). Every record in the log files has many properties of the short message, including the sending time, the user's phone number, the service provider's code, the charge value, the id of the short message, and so on.
Now we hope to build a data warehouse over these data. Our plan was divided into two parts: first, to build the data warehouse and realize some simple statistics and OLAP functions on it; second, to do something with data mining technology. I am now facing the design work, but I am uncertain about a few things.
1. How to design the dimensions? We defined three dimensions (time, district, and service) using the OWB tool. Every dimension has some levels, and some levels form a hierarchy. In OWB, every level needs some attributes to be defined; I am uncertain which attributes are necessary and what the data type of every attribute should be.
2. How many fact tables are necessary? According to our source data, two kinds of data are available, the MO messages and the MT messages, so I created two fact tables. I don't know whether that is appropriate.
3. What columns should a fact table contain? The first phase of our plan is to realize some statistics functions: counting the flux of short messages within one hour, one day, or a certain period; counting the number of users who used a certain service within a period; and counting the profits, since every short-message record includes the charge value. So I created the fact tables with every field of the log files. I know that is not good, but I don't know what the good method is.
4. How to plan the materialized views? I know the materialized view is the key to improving query performance.
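(As an aside, the statistics in point 3 can be sketched over a few hypothetical log records; the field names "sent", "msisdn", and "charge" are invented stand-ins for the gateway's actual log fields.)

```python
# Sketch of the three statistics from point 3 over made-up MO-message
# log records: hourly message flux, distinct users, and total charges.
from collections import Counter
from datetime import datetime

log_records = [
    {"sent": "2002-05-01 09:12:33", "msisdn": "13900000001", "charge": 0.1},
    {"sent": "2002-05-01 09:47:02", "msisdn": "13900000002", "charge": 0.1},
    {"sent": "2002-05-01 10:05:19", "msisdn": "13900000001", "charge": 0.2},
]

# Flux: number of messages per hour bucket
flux_per_hour = Counter(
    datetime.strptime(r["sent"], "%Y-%m-%d %H:%M:%S").strftime("%Y-%m-%d %H:00")
    for r in log_records
)
# Users: distinct phone numbers in the period
distinct_users = len({r["msisdn"] for r in log_records})
# Profit: sum of the charge values
profit = round(sum(r["charge"] for r in log_records), 2)

print(flux_per_hour)   # hour -> message count
print(distinct_users)  # 2
print(profit)          # 0.4
```

In the warehouse itself these would be GROUP BY aggregates over the fact table, and good candidates for materialized views precomputed at the hour and day levels.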
Above are the difficulties I am encountering now. As you can see, I am definitely a tyro, and I eagerly need help. I look forward to your email. Thank you!
Hi there,
There is an active data warehousing community at www.datawarehousing.com, where you can send an email to the dw-list.
Cheers
A. Elliott -
Do you know , what is Double level domain in ABAP ?
1.
Do you know , what is Double level domain in ABAP ?
2. Have you heard of any concept called "loggi" or similar?
Hey Fren,
In the DDIC we have a concept called the dual-level domain.
Let us take an example to make it clear...
Imagine you need to create Transparent Table with the following fields....
01. Booking_ID
02. Name_of_Person
03. Arrival_Time
04. Arrival_Date
05. Departure_Date
06. Departure_Time
07. Country_From
08. City_From
09. Country_to
10. City_To
11. Price
Now, in this case you would need to create 11 domains if there were no dual-level concept.
But the dual-level concept gives us better maintainability.
In the above example,
Field             Data Element    Domain
Arrival_Time      ZTime_1         Z_Time
Departure_Time    ZTime_2         Z_Time
Arrival_Date      Zdate_1         Z_Date
Departure_Date    Zdate_2         Z_Date
Country_From      ZCountry_1      Z_Country
Country_To        ZCountry_2      Z_Country
City_From         ZCity_1         Z_City
City_To           ZCity_2         Z_City
Price             ZPrice          ZPrice
The fields having similar technical attributes are associated with the same domain, but because they carry different semantic meanings, the fields are associated with different data elements.
Thus it helps us maintain the technical attributes centrally, i.e. at the domain level.
Hence the dual-level domain concept.
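A rough analogy in Python, for readers outside ABAP (illustrative only; the DDIC has no direct Python equivalent): the domain is the shared underlying type, and the data elements are distinct semantic names layered over it, here mimicked with typing.NewType.

```python
# Loose analogy to the dual-level domain concept: one technical type
# (the "domain") shared by several semantically distinct names (the
# "data elements"). Names like Z_Date are borrowed from the example above.
from typing import NewType
import datetime

# "Domain": the shared technical type
Z_Date = datetime.date

# "Data elements": different semantic names over the same domain
ArrivalDate = NewType("ArrivalDate", Z_Date)
DepartureDate = NewType("DepartureDate", Z_Date)

arrival = ArrivalDate(datetime.date(2012, 1, 31))
departure = DepartureDate(datetime.date(2012, 2, 1))

# Both share the domain's technical attributes (same underlying type)...
print(type(arrival) is type(departure))  # True

# ...but carry distinct semantic meaning in signatures:
def nights(a: ArrivalDate, d: DepartureDate) -> int:
    return (d - a).days

print(nights(arrival, departure))  # 1
```

Changing the "domain" (the underlying type) would propagate to every data element built on it, which is the central-maintenance benefit the post describes.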
To have some more Information on Domains,
Please refer [http://help.sap.com/saphelp_nw70/helpdata/EN/cf/21ea0b446011d189700000e8322d00/frameset.htm]
Inspire if this helped,
Warm Regards,
Abhi... -
Data warehousing book - guidance
Can someone point out a good book on data warehousing with 10g, one that focuses on the concepts and details of the technology more than on how to use Oracle tools, at a beginner/intermediate level?
Thanks, and sorry if the post is in the wrong context; I just thought I'd get guidance from people who use the technology.
I would recommend looking at the following database documentation (these links are for 11g documentation, but the majority of concepts are applicable to all versions of the database):
2 Day + Data Warehousing Guide
http://www.oracle.com/pls/db111/to_toc?pathname=server.111/b28314/toc.htm
Data Warehousing Guide
http://www.oracle.com/pls/db111/to_toc?pathname=server.111/b28313/toc.htm
OLAP User's Guide
http://www.oracle.com/pls/db111/to_toc?pathname=olap.111/b28124/toc.htm
Data Mining Concepts
http://www.oracle.com/pls/db111/to_toc?pathname=datamine.111/b28129/toc.htm
Also look at the whitepapers published on the Data Warehouse Technology Center home page on OTN : http://www.oracle.com/technology/tech/bi/index.html
Hope this helps
Keith Laker
Oracle EMEA Consulting
BI Blog: http://oraclebi.blogspot.com/
DM Blog: http://oracledmt.blogspot.com/
BI on Oracle: http://www.oracle.com/bi/
BI on OTN: http://www.oracle.com/technology/products/bi/
BI Samples: http://www.oracle.com/technology/products/bi/samples/