Data in Time Series
Hello,
I want to check the data in the time series liveCache of my planning area. Which transaction can I use to check it?
Thank you
Steve
Hello Steve,
Here are my answers:
For Q1: No, I don't think it's because you are in the 10th month of the year. The package size (i.e. the number of rows in each package) and the number of packages depend on a few factors: a) how much data is in your planning area; b) whether you implemented BAdI /SAPAPO/SDP_EXTRACT; c) the parameters that you placed in the "data records/calls" and "display extr. calls" fields.
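As a rough sanity check on the arithmetic only (this simplified sketch ignores the BAdI and the two parameter fields described above): the number of extraction calls is approximately the total row count divided by the package size, rounded up.

```python
import math

def extraction_calls(total_rows: int, package_size: int) -> int:
    """Approximate number of packages a DataSource extraction produces:
    total rows divided by rows-per-package, rounded up."""
    return math.ceil(total_rows / package_size)

print(extraction_calls(100000, 10000))  # 10 full packages
print(extraction_calls(100001, 10000))  # 11: one extra, partial package
```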
For Q2: It is included because key figures with units or currencies (e.g. quantities and amounts) do need UOM/BUOM/currency information, which is why it is also part of the output. You can check which unit characteristic a given key figure uses in transaction RSD1.
For Q3: Yes, you can but you need to do more than what I mentioned before. Here are some ways to do that:
A) Generate an export DataSource. If you are on SCM < 5.0, connect it to an InfoSource and then to a cube; if you are on SCM 5.0, connect it to an InfoCube using a transformation rule. You can then load data from the planning area to the InfoCube. After that, use transaction /SAPAPO/TSCUBE to load data from the cube back to the planning area.
B) You can opt to create a custom ABAP program that reads data from the DataSource, performs some processing, and then writes the data to the target planning area using function module /SAPAPO/TS_DM_SET or the planning book BAPI.
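Option B is essentially an extract-transform-load loop over packages. All names in this sketch are hypothetical; an actual implementation would be an ABAP report calling the DataSource extractor and /SAPAPO/TS_DM_SET, but the package-wise flow looks like this:

```python
# Conceptual sketch of option B (hypothetical names; the real
# implementation would be ABAP calling /SAPAPO/TS_DM_SET per package).

def read_packages(rows, package_size):
    """Yield the source data in packages, as a DataSource extraction would."""
    for i in range(0, len(rows), package_size):
        yield rows[i:i + package_size]

def transform(package):
    """Example transformation: scale a forecast key figure by 10%."""
    return [{**row, "FORECAST": row["FORECAST"] * 1.1} for row in package]

def write_to_planning_area(package, target):
    """Stand-in for the write call to the target planning area."""
    target.extend(package)

source = [{"MATNR": f"M{i}", "FORECAST": 100.0} for i in range(5)]
target = []
for pkg in read_packages(source, package_size=2):
    write_to_planning_area(transform(pkg), target)

print(len(target))  # 5 rows written
```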
Hope this helps.
Similar Messages
-
How to restore data of time series objects
Dear All,
I was checking functionality of report for Anchors
/SAPAPO/DM_LC_ANCHORS_DELMANDT
This deactivated all the planning areas by deleting their time series objects, hence the data loss.
Can I get the data back?
Regards,
Vijay
You can if you have a backup copy of the time series in an InfoCube: use transaction /SAPAPO/TSCUBE to copy the data back from the InfoCube into your planning area.
Otherwise I am afraid you are stuck somewhat.
Having looked at the code, it seems that this report deletes ALL planning versions for the current client, which is why it deleted the time series objects. You will need to create another planning version (via transaction /SAPAPO/MVM) and then "Create Time Series Objects" for your planning area against this new planning version (via transaction /SAPAPO/MSDP_ADMIN). As I said, however, the time series data will have been lost if you have no backup; this will only reinitialise the planning area. -
Issue in new macro calculating values in time series in CVC
Hi friends.
I'm new in APO.
I have a problem with a new macro in the CVC which calls an FM to calculate and store data.
This new macro calculates the selected data in the time series (TS); e.g. when we select 3 days for the calculation, the first selected day in the TS is ignored.
We created this macro to perform the calculation when the user enters some manual values and wants them calculated at the delivery points in the CVC, via the TS.
This macro calls my Z function, which internally calls the standard FM '/SAPAPO/TS_DM_SET' (set the TS information).
Sometimes this FM raises error 6 (invalid_data_status = 6), but only when I use the 'fcode_check' routine as well.
After that, we call the FM '/SAPAPO/MSDP_GRID_REFRESH' in mode 1 and the 'fcode_check' routine in program /SAPAPO/SAPLMSDP_SDP.
At first I thought it could be dirty global variables in the standard FM, so I put it inside a program and called it with SUBMIT AND RETURN. But now I think it cannot be that kind of error, because that did not work; it even inverted the results, and now only the first line of the TS gets stored and changed in the CVC.
It's a crazy issue. Please friends. Guide me for a solution!
thanks.
Glauco
Hi friend. The issue still has no proper solution yet.
A friend changed the macro, adding another step that calls the same function, so the macro now has two calls in sequence to the same FM. Now it's working, but we can't understand why.
It seems like dirty memory in liveCache.
Nobody knows how to solve this!
Glauco. -
Fragmentation with time series calculation
Hi,
I have three tables:
Table 1 for Region 1, Frag Logic: Region = Region 1
Table 2 for Region 2, Frag Logic: Region = Region 2
Table 3 for Region 3, Frag Logic: Region = Region 3
We use fragmentation to query based on the region. Now I have duplicated each table to calculate the previous date for the time series:
Table 1 for Region 1, Frag Logic: Region = Region 1
Table 1 for Region 1 Prev ?
Table 2 for Region 2, Frag Logic: Region = Region 2
Table 2 for Region 2 Prev ?
Table 3 for Region 3, Frag Logic: Region = Region 3
Table 3 for Region 3 Prev ?
After adding the Prev tables I am almost lost. Do I need to add the fragmentation for the Prev tables as well, or is it unnecessary since the main and Prev tables are created as aliases of the base tables? Please share your thoughts.
Using time series functions on fragmented sources in OBIEE is identified as a bug:
http://obieetalk.com/time-series-and-fragmentation -
Creating Dynamic Time Series in Hyperion Profitability and Cost Management
I am creating dimensions in the Dimension Library for Hyperion Profitability and Cost Management. Please let me know if we can set up Dynamic Time Series for HPCM.
I can see the DTS Manager option for the Period dimension in Hyperion Planning, but this option is not available for the Period dimension in an HPCM application.
Is there any other way to drill down to week-level or day-level data in a time series?
I am also facing the same issue. Do we have any option to create Dynamic Time Series in HPCM? Please let me know.
-
Read optimization time-series data
I am using Berkeley DB JE to store fairly high frequency (10hz) time-series data collected from ~80 sensors. The idea is to import a large number of csv files with this data, and allow quick access to time ranges of data to plot with a web front end. I have created a "sample" entity to hold these sampled metrics, indexed by the time stamp. My entity looks like this.
@Entity
public class Sample {
    // Unix time; seconds since Unix epoch
    @PrimaryKey
    private double time;

    // one value per sensor, keyed by metric name (~70-80 entries)
    private Map<String, Double> metricMap = new LinkedHashMap<String, Double>();
}
as you can see, there is quite a large amount of data for each entity (~70 - 80 doubles), and I'm not sure storing them in this way is best. This is my first question.
I am accessing the db from a web front end. I am not too worried about insertion performance, as this doesn't happen that often, and generally all at one time in bulk. For smaller ranges (~1-2 hr worth of samples) the read performance is decent enough for web calls. For larger ranges, the read operations take quite a while. What would be the best approach for configuring this application?
Also, I want to define the granularity of samples. Basically, if the number of samples returned by a query is very large, I want to return only a fraction of them. Is there an easy way to count the number of entities that will be iterated over with a cursor without actually iterating over them?
Here are my current configuration params.
environmentConfig.setAllowCreateVoid(true);
environmentConfig.setTransactionalVoid(true);
environmentConfig.setTxnNoSyncVoid(true);
environmentConfig.setCacheModeVoid(CacheMode.EVICT_LN);
environmentConfig.setCacheSizeVoid(1000000000);
databaseConfig.setAllowCreateVoid(true);
databaseConfig.setTransactionalVoid(true);
databaseConfig.setCacheModeVoid(CacheMode.EVICT_LN);
Hi Ben, sorry for the slow response.
> as you can see, there is quite a large amount of data for each entity (~70 - 80 doubles), and I'm not sure storing them in this way is best. This is my first question.
That doesn't sound like a large record, so I don't see a problem. If the map keys are repeated in each record, that's wasted space that you might want to store differently.
> For larger ranges, the read operations take quite a while. What would be the best approach for configuring this application?
What isolation level do you require? Do you need the keys and the data? If the amount you're reading is a significant portion of the index, have you looked at using DiskOrderedCursor?
> Also, I want to define the granularity of samples. Basically, if the number of samples returned by a query is very large, I want to return only a fraction of them. Is there an easy way to count the number of entities that will be iterated over with a cursor without actually iterating over them?
Not currently. Using the DPL, reading with a key-only cursor is the best available option. If you want to drop down to the base API, you can use Cursor.skipNext and skipPrev, which are further optimized.
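As a follow-up to the granularity point: if an exact count is unavailable up front, a fixed-stride decimation over the fetched range is a cheap approximation. A minimal Python sketch of the stride logic only (the JE cursor calls are omitted; this is not Berkeley DB API code):

```python
def decimate(samples, max_points):
    """Return at most max_points samples, taken at a fixed stride,
    so oversized query results can still be plotted cheaply."""
    if len(samples) <= max_points:
        return list(samples)
    stride = -(-len(samples) // max_points)  # ceiling division
    return samples[::stride]

# 10 Hz for one hour = 36,000 (timestamp, value) samples
hour = [(t / 10.0, 0.0) for t in range(36000)]
plot = decimate(hour, 1000)
print(len(plot))  # 1000 points: every 36th sample
```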
> environmentConfig.setAllowCreateVoid(true);
Please use the method names without the Void suffix -- those are just for bean editors.
--mark -
Data Mining Blog: New post on Time Series Multi-step Forecasting
I've posted the third part of the Time Series Forecasting series. It covers:
- How to use the SQL MODEL clause for multi-step forecasting
- Many example queries
- Two applications: a classical time series dataset and an electric load forecast competition dataset
- Accuracy comparison with a large number of other techniques
http://oracledmt.blogspot.com/2006/05/time-series-forecasting-3-multi-step.html
--Marcos
Many biological databases can be queried directly via the Structured Query Language; SQL is at the heart of biological databases. Oracle Data Miner loads data from flat files into the database. We would create a numeric ID for the genes while loading the data and remove some columns and rows from the data; it is better to load the data directly with SQL*Loader.
http://www.genebyte.firm.in/
Edited by: 798168 on Sep 28, 2010 11:00 PM -
Time series functions are not working in OBIEE for ESSBASE data source
Hi All,
I am facing a problem in OBIEE: I am getting error messages for measure columns with time series functions (AGO, TODATE and PERIODROLLING) in both the RPD and Answers.
Error is "Target database does not support Ago operation".
But I am aware of OBIEE supports Time Series functions for Essbase data source.
We are using Hyperion 9.3.1 as the data source and OBIEE 11.1.1.5.0 as the reporting tool.
Appreciate your help.
Thanks,
Aravind
Hi,
It is because time series functions are not supported for fragmented content; see this note from Oracle Support:
The error occurs due to the fact that fragmented data sources are used with some time series measures. Time series measures (i.e. AGO) are not supported on fragmented data sources.
Confirmation is documented in the following guide - Creating and Administering the Business Model and Mapping Layer in an Oracle BI Repository > Process of Creating and Administering Dimensions
Ago or ToDate functionality is not supported on fragmented logical table sources. For more information, refer to “About Time Series Conversion Functions” on page 197.
Regards,
Gianluca -
Discoverer 4i - Time Series Data type support
Does Discoverer 4i support a time-series data type, i.e. the ability to store an entire string of numbers representing, for example, daily or weekly data points?
Thanks & Regards,
Deepti
Hi O G-M,
Each model must contain one numeric or date column that is used as the case series, which defines the time slices that the model will use. The data type for the key time column can be either a datetime data type or a numeric data type. However, the column must contain continuous values, and the values must be unique for each series. The case series for a time series model cannot be stored in two columns, such as a Year column and a Month column. For more information, please see:
http://msdn.microsoft.com/en-us/library/ms174923(v=sql.100).aspx
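The two requirements above (unique values, continuous values, in a single key column) are easy to validate before training. A minimal sketch with hypothetical data, using a constant step as a stricter stand-in for "continuous" (note a numeric YYYYMM key only has a constant step within one year):

```python
def valid_key_time_column(values):
    """Check a candidate key time column: values must be unique
    and evenly spaced (no duplicate or skipped time slices)."""
    if len(values) != len(set(values)):
        return False  # duplicate time slices
    steps = {b - a for a, b in zip(values, values[1:])}
    return len(steps) == 1  # constant step between slices

print(valid_key_time_column([201301, 201302, 201303]))  # True
print(valid_key_time_column([201301, 201301, 201302]))  # False (duplicate)
```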
Thanks,
Eileen
Eileen Zhao
TechNet Community Support -
Programmatically change the Dynamic Time Series
Is there a way to set a Dynamic Time Series programmatically through JavaScript? I have a Time dimension and I have defined a DTS as YTD. Now, based upon the month selected by the end user, I need to show YTD numbers in an IR.
Any idea how I can achieve that? I am not seeing any option to call DTS through JavaScript.
I do not believe that option is available.
9.3.1 introduced the new interface into OLAP Essbase Queries. My guess is that there will be more functionality added in future releases.
Wayne Van Sluys
TopDown Consulting -
Time Series Graph Show Inappropriate Data for Continuous Analysis
Hi All,
I have marked Month as the chronological key in my BMM layer, but I am still unable to view the data correctly in my time series graph: it shows "Inappropriate data for continuous analysis" when creating the graph. Can anybody help me out with this?
Thanks
What data type is your key? The chronological key is required for the time series formulas (AGO etc.).
The time series chart requires a date or datetime data type to work; perhaps a new column with the first of the month/period would help?
Regards,
Robert
What's the most efficient way to store time-series data in Oracle?
Thanks,
Jay
937054 wrote:
Hello,
1. Usually time-series data runs into many millions of rows, so time-series databases like FAME, kdb, or Sybase IQ are used. Does Oracle 11gR2 provide storage optimizations, compression, or a columnar store like FAME, kdb, or Sybase IQ?
The only methods of optimization are partitioning of the data by some date or, if the data set is narrow (few columns) enough, a partitioned IOT.
2. http://www.oracle.com/us/corporate/press/1515738
Link is about the R statistical language and data mining integration with Oracle Database 11gR2. Does this come by default with the installation, or with Big Data / Exadata? Or is this a separate license?
I am not sure about the licensing; you will need to ask your sales person, but it looks like it might be a part of ODM (Oracle Data Mining, a licensed product).
Take a read through this case study.
http://www.oracle.com/technetwork/database/options/advanced-analytics/odm/odmtelcowhitepaper-326595.pdf?ssSourceSiteId=ocomen
Thanks -
Time Series initialization dates with fiscal periods!
Dear Experts
Problem:
I cannot initialize planning area.
Period YYYY00X is invalid for periodicity P FYV
Configuration
Storage Buckets Profile: Week, Month, Quarter, Year, Posting Period (FYV), all checked.
Horizon
Start: 24.02.2013
End: 31.12.2104
No Time stream defined.
Fiscal Year Variant (Edate = period end day, FP = fiscal period):
2012
Month:  1   2   3   4   5   6   7   8   9  10  11  12
Edate: 28  25  31  28  26  30  28  25  29  27  24  31
FP:     1   2   3   4   5   6   7   8   9  10  11  12
2013
Month:  1   2   3   4   5   6   7   8   9  10  11  12
Edate: 26  23  30  27  25  29  27  24  28  26  23  31
FP:     1   2   3   4   5   6   7   8   9  10  11  12
2014
Month:  1   2   3   4   5   6   7   8   9  10  11  12
Edate: 25  22  29  26  24  28  26  23  27  25  22  31
FP:     1   2   3   4   5   6   7   8   9  10  11  12
2015
Month:  1   2   4   5   5   7   8   8  10  10  11  12
Edate: 31  28   4   2  30   4   1  29   3  31  28  31
FP:     1   2   3   4   5   6   7   8   9  10  11  12
Question
What dates should I enter as the planning area initialization start and end to initialize for the maximum duration, given the settings above?
I tried a few dozen combinations but none was accepted. For a start I tried the same dates as in the horizon of the storage bucket profile, but given the error text I cannot work out what dates I am expected to enter for time series creation.
Thanks
BS
Thanks Mitesh,
No its 2014
Here is what worked.
Storage Bucket Horizon
Start: 24.02.2013
End: 22.11.2014
Time Series Initialization
Start: 01.02.2013
End Date: 31.12.2014
The fiscal year variant is what I pasted above.
I thought time series could only be initialized for a subset of the periods in the storage bucket profile! This is my first experience where the initialization period is larger than the storage horizon. Is this expected? I always went by the book, and this was just a chance discovery.
I was aware, from SAP Notes, of the requirement to have one more year on either side in the fiscal year variant for it to be used for initialization, hence the range of dates I tried.
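That buffer-year rule can be written down as a quick check. This is a hypothetical helper, not an SAP API; it only encodes the "one extra year on either side" note from this thread against the years defined in the fiscal year variant:

```python
def valid_init_range(start_year, end_year, fyv_years, buffer=1):
    """Initialization years must lie inside the fiscal year variant's
    defined years, leaving `buffer` extra year(s) on either side."""
    lo = min(fyv_years) + buffer
    hi = max(fyv_years) - buffer
    return lo <= start_year and end_year <= hi

fyv_years = {2012, 2013, 2014, 2015}  # years maintained in the FYV above
print(valid_init_range(2013, 2014, fyv_years))  # True  (the range that worked)
print(valid_init_range(2013, 2104, fyv_years))  # False (the 2104 typo)
```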
Appreciate your comments on this.
All dates in dd/mm/yyyy.
Thanks
BS -
Time series chart in null date
Hi, everyone
Currently using
Crystal Reports 2008 Product: 12.7.0.1983
I make time series chart using a line graph.
№ X-axis Y-axis
1 2014/1/1 100
2 2014/1/2 20
3 2014/1/3 null
4 2014/1/4 500
The null becomes 0 when I chart the list above.
Can I skip the null date and still make a time series chart?
Regards,
Kano
Hi, Jamie Wiseman
Thank you for the answer.
In the case above, is it possible to draw the line directly from 2014/1/2 to 2014/1/4 (skipping the null)? -
I wanted to use a time series line in my analysis but couldn't. What kind of data is required for this graph? It shows "inappropriate data for continuous time axis". I had total amount on the vertical axis and time (year) on the horizontal axis.
Check this; if it is not helpful, follow the two links at the bottom:
OBIEE - Time Series Conversion Functions : AGO and TODATE | GerardNico.com (BI, OBIEE, OWB, DataWarehouse)
~ http://cool-bi.com