OLAPi report audit data...
Does the auditing database capture events regarding OLAPi reports?
We still have some OLAP reports in our XI R2 environment and would like to know who is still looking at these and how often. I am not seeing any data on these reports from the Activity universe but I'm wondering if I'm looking in the wrong place or don't have something turned on.
Thanks,
We can't find information regarding OLAP reports in Auditor; it has been raised as an enhancement request, which may be implemented in a future major version.
Thanks,
Ramsudhakar.
Similar Messages
-
Deleting Inplace discovery and hold report audit data
I need to remove the auditing data for the
discovery and hold report. Is this possible in Exchange Online?
This is how to delete a data source from a custom report in Analyzer 6.5:
- Open the custom report; on the main toolbar menu, go to Tools and select Design Report.
- On the Design Report page, highlight the data source (spreadsheet or chart), right-click and select Properties. The Data Object Properties box will pop up; highlight "ReportDataSrc1" and hit "Delete" on your keyboard (if this is the data source you want to delete). It will give you a Yes/No option. Click Yes; it will delete the data source from your custom report forever, and remember to save the report.
If you have any questions, feel free to email me: [email protected]
-
Sharepoint 2013 Reporting Services & OLAP Cubes for Data Modeling.
I've been using PowerPivot & PowerView in Excel 2013 Pro for some time now, so I am eager to get set up with SharePoint 2013 Reporting Services.
Before setting up Reporting Services, I have just one question to resolve.
What are the benefits/differences of using a normal flat table set up, compared to an OLAP cube?
Should I base my Data Model on an OLAP Cube or just Connect to tables in my SQL 2012 database?
I realize that OLAP Cubes aggregate data making it faster to return results, but am unclear if this is needed with Data Modeling for Sharepoint 2013.
Many thanks,
Mike
So yes, PV is an in-memory cube. When data is loaded from the data source, it's cached in memory and stored (compressed) in the Excel file. (The same concept applies to SSAS Tabular mode: it loads from the source and is cached in memory, but is also stored (compressed) in data files, in the event that the server reboots, or something similar.)
As far as performance: tabular uses memory but has a shorter load process (no ETL, no cube processing)... OLAP/MDX uses less memory by requiring ETL and cube processing... technically, tabular uses column compression, so memory consumption will depend on the type of data (numeric data is GREAT, text not as much)... but the decision to use OLAP (MDX) or Tabular (DAX) is just dependent on the type of load and your needs... both platforms CAN do realtime queries (ROLAP in multidimensional, or DirectQuery for tabular),
or can use their processed/in-memory cache (MOLAP in multidimensional, xVelocity for tabular) to process queries.
If you have a cube, there's no need to reinvent the wheel (especially since there's no way to convert/import the BIDS/SSDT project from MDX to DAX). If you have SSAS 2012 SP1 CU4 or later, you can connect PV (from Excel OR from within SP) directly to the MDX cube.
Generally, the benefit of PP is for the power users who can build models quickly and easily (without needing to talk to the BI dept)... SharePoint lets those people share the reports with a team... if it's worthy of including in an enterprise warehouse,
it gets handed off to the BI folks who vet the process and calculations... but by that time, the business has received value from the self-service (Excel) and team (SharePoint) analytics... and the BI team has less effort since the PP model includes data sources
and calculations - aside from verifying the sources and calculations, BI can just port the effort into the existing enterprise ETL / warehouse / cubes / reports... shorter dev cycle.
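The column-compression point above can be made concrete with a small, purely illustrative Python sketch. zlib is only a stand-in here (VertiPaq/xVelocity reportedly uses dictionary and run-length encoding, not zlib), so the absolute numbers mean nothing; only the numeric-vs-text tendency does.

```python
import zlib

def ratio(data: bytes) -> float:
    """Compressed size as a fraction of original size (lower = better)."""
    return len(zlib.compress(data)) / len(data)

# A "numeric column": 100k 4-byte values drawn from only ten distinct
# numbers -- low cardinality, highly repetitive.
numeric_col = b"".join((i % 10).to_bytes(4, "little") for i in range(100_000))

# A "text column": varied free-form strings -- high cardinality.
text_col = " ".join(
    f"comment-{i}-{i * 7919 % 104729}" for i in range(20_000)
).encode()

print(f"numeric column ratio: {ratio(numeric_col):.3f}")
print(f"text column ratio:    {ratio(text_col):.3f}")
# The numeric column compresses far harder than the text column.
```

This tendency is why the memory consumption of a tabular model depends so heavily on the data types in the model.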
I'll be speaking on this very topic (done so several times already) this weekend in Chicago at SharePoint Saturday!
http://www.spschicagosuburbs.com/Pages/Sessions.aspx
Scott Brickey
MCTS, MCPD, MCITP
www.sbrickey.com
Strategic Data Systems - for all your SharePoint needs -
How to show User Auditing data in dashboard/reports in MS CRM 2013 online?
Hi,
I have a requirement to show user auditing details, such as the user's last logged-in date and session time spent, in MS CRM 2013 Online.
I have not found any option to query user auditing data.
I found the Audit Summary view but don't know how to use it.
Could anyone suggest how to achieve this?
Thanks
Baji Rahaman
Please try this:
Public Function Decompress(ByVal arr As Byte()) As Byte()
    ' Decompresses a GZip-compressed byte array; if the input turns out
    ' not to be compressed, the original bytes are returned unchanged.
    Dim notCompressed As Boolean = False
    Dim MS As New System.IO.MemoryStream()
    MS.Write(arr, 0, arr.Length)
    MS.Position = 0
    Dim stream As New System.IO.Compression.GZipStream(MS, System.IO.Compression.CompressionMode.Decompress)
    Dim temp As New System.IO.MemoryStream()
    Dim buffer As Byte() = New Byte(4095) {}  ' 4 KB read buffer (VB bounds are inclusive)
    While True
        Try
            Dim read As Integer = stream.Read(buffer, 0, buffer.Length)
            If read <= 0 Then
                Exit While
            Else
                ' Write only the bytes actually read, not the whole buffer.
                temp.Write(buffer, 0, read)
            End If
        Catch ex As Exception
            ' Read failed: the input was not valid GZip data.
            notCompressed = True
            Exit While
        End Try
    End While
    stream.Close()
    If notCompressed Then
        ' Fall back to the original, uncompressed bytes.
        Return arr
    Else
        Return temp.ToArray()
    End If
End Function
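For reference, the same intent (decompress if possible, otherwise treat the bytes as uncompressed) is much shorter in Python's standard library. This is a generic sketch, not CRM-specific; any surrounding CRM plumbing is out of scope here.

```python
import gzip

def decompress(blob: bytes) -> bytes:
    """Return the gunzipped payload, or the input unchanged if it
    is not a valid gzip stream."""
    try:
        return gzip.decompress(blob)
    except OSError:
        # gzip.BadGzipFile is a subclass of OSError: input wasn't compressed.
        return blob

payload = b"audit record payload"
assert decompress(gzip.compress(payload)) == payload  # round-trip
assert decompress(payload) == payload                 # pass-through
```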
Thanks & Regards Manoj -
How do I get a Resource's Proposed Time to show in the OLAP reports?
I'm having an issue trying to gather/show a resource's Proposed time in the OLAP Portfolio Analyzer OOTB report. When looking at the data in Excel, the groupings show Committed, but part of that Committed data is actually resource information that I know is Proposed. In a nutshell, the line item appears as if it's time in the Committed bucket, when in fact it's supposed to be Proposed.
Is there a way to show Proposed in this same report, in addition to Committed? Basically showing a Resource's time grouped by Booking Type? Or is that level of detail not possible here?
I'm pulling out my hair here, so any help would be greatly appreciated.
Thanks! - M
Michael Mukalian | Jan 2010 - Dec 2010 MVP SharePoint Services | MCTS: MOSS 2007 Configuration | http://www.mukalian.com/blog
So, an interesting update I got from MS support...
I opened up a ticket w/them cause I just couldn't get the info to show, but they were unable to duplicate the behavior I was seeing in their environment. So, I packaged up my PS DBs (Archive, Draft, Published and Reporting) and sent them over to support.
Placing our DBs in their environment they were able to reproduce the issue we're seeing.
After further research, support was able to identify that, if you publish the project *through* the Project Pro client, then the numbers, both Proposed and Committed, show as they should (meaning separate line items) in the OLAP reports. We were able to verify that on our side: opened up Project Pro, published the project, rebuilt the cubes, built the report, and saw line items for Committed and Proposed.
However, not all folks that interface w/Project Server have a license for Project Pro, nor, as I understand it, is Project Pro a requirement to use Project Server. So, support is digging deeper into this now, seeing if using Project Pro is the only
way to get that information to surface (I hope not), or if there's some other issue there.
Figured I'd update this as information comes in...I'll do so when I get a response from support.
- M
Michael Mukalian | Jan 2010 - Dec 2010 MVP SharePoint Services | MCTS: MOSS 2007 Configuration | http://www.mukalian.com/blog -
Creating OLAP report with Crystal Reports and SQL Server Analysis Services 2005
Post Author: orahc_mao
CA Forum: Data Connectivity and SQL
Hi!
I am currently using the trial version of Crystal Reports XI and I wanted to do an OLAP report. The problem is I cannot select a cube using the "OLAP Connection Browser" (the popup window). I already selected Microsoft OLE DB Provider and entered the server name, but still can't connect.
I don't think the problem is with SQL Server Analysis Services (2005), since Microsoft Excel's Import Data can detect the server as well as the cube I have created.
I also tried the "Database Expert" of Crystal Reports, created an OLE DB (ADO) connection with the "OLE DB Provider for OLAP Services 8.0" driver, entered the server name, checked integrated security, and then I could readily select the database created in SQL Server Analysis Services. However, I still need the OLAP grid to create the report, which brings me back to my original problem... selecting a cube.
I hope somebody would help me with this issue.
Thanks a lot!
orahc_mao
Hello,
I don't recognize those variables as CR ones so it's likely something the original developer is setting in code.
You'll have to discuss with that person.
If you have SDK issues, then post your question to one of the .NET or Java SDK forums.
Thank you
Don -
Analysing Task Audit, Data Audit and Process Flow History
Hi,
The Internal Audit dept has requested a bunch of information that we need to compile from the Task Audit, Data Audit and Process Flow History logs. We do have all the info available, however not in a format that allows proper "reporting" of log information. What is the best way to handle HFM logs so that we can quickly filter and export the required audit information?
We do have housekeeping in place, so the logs are partial "live" db tables, and partial purged tables that were exported to Excel to archive the historical log info.
Many thanks.
I thought I posted this Friday, but I just noticed I never hit the 'Post Message' button, ha ha.
The info below will help you translate some of the information in the tables, etc. You could report on it from the audit tables directly, or move them to another appropriate data table for analysis later. The consensus, though I disagree, is that you will suffer performance issues if your audit tables get too big, so you want to move them periodically. You can do this using a scheduled task, a manual process, etc.
I personally just dump it to another table and report on it from there. As mentioned above, you'll need to translate some of the information as it is not 'human readable' in the database.
For instance, if I wanted to pull Metadata Load, Rules Load, and Member List Load, I could run a query like the one below. (NOTE: @strAppName should be set to the name of your application.)
The main tricks to know, at least for the task audit table, are figuring out how to convert the times and determining which activity code corresponds to which user-friendly name.
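Those two tricks can also be sketched outside SQL. The sketch below assumes that HFM stores StartTime/EndTime as OLE automation dates (fractional days since 1899-12-30); SQL Server's datetime epoch of 1900-01-01 is two days later, which would explain the `-2` in the query that follows. The sample row is hypothetical.

```python
from datetime import datetime, timedelta

# Assumption: HFM task-audit StartTime/EndTime are OLE automation dates,
# i.e. fractional days since 1899-12-30.
OLE_EPOCH = datetime(1899, 12, 30)

def ole_to_datetime(ole_date: float) -> datetime:
    """Convert an OLE automation date to a Python datetime."""
    return OLE_EPOCH + timedelta(days=ole_date)

# A few activity codes and their user-friendly names (a subset of the
# full code/name breakdown given in this post).
ACTIVITY_NAMES = {1: "Rules Load", 9: "Data Load",
                  21: "Metadata Load", 23: "Member List Load", 29: "Logon"}

# Hypothetical row, as it might come back from the <app>_task_audit table.
row = {"ActivityCode": 21, "StartTime": 41152.5}
print(ACTIVITY_NAMES[row["ActivityCode"]], "at",
      ole_to_datetime(row["StartTime"]))
```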
-- Declare working variables --
declare @dtStartDate as nvarchar(20)
declare @dtEndDate as nvarchar(20)
declare @strAppName as nvarchar(20)
declare @strSQL as nvarchar(4000)
-- Initialize working variables --
set @dtStartDate = '1/1/2012'
set @dtEndDate = '8/31/2012'
set @strAppName = 'YourAppNameHere'
--Get Rules Load, Metadata, Member List
set @strSQL = '
select sUserName as "User", ''Rules Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
from ' + @strAppName + '_task_audit ta, hsv_activity_users au
where au.lUserID = ta.ActivityUserID and activitycode in (1)
and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
union all
select sUserName as "User", ''Metadata Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
from ' + @strAppName + '_task_audit ta, hsv_activity_users au
where au.lUserID = ta.ActivityUserID and activitycode in (21)
and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
union all
select sUserName as "User", ''Memberlist Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
from ' + @strAppName + '_task_audit ta, hsv_activity_users au
where au.lUserID = ta.ActivityUserID and activitycode in (23)
and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + ''''
exec sp_executesql @strSQL
In regards to activity codes, here's a quick breakdown of those:
ActivityID ActivityName
0 Idle
1 Rules Load
2 Rules Scan
3 Rules Extract
4 Consolidation
5 Chart Logic
6 Translation
7 Custom Logic
8 Allocate
9 Data Load
10 Data Extract
11 Data Extract via HAL
12 Data Entry
13 Data Retrieval
14 Data Clear
15 Data Copy
16 Journal Entry
17 Journal Retrieval
18 Journal Posting
19 Journal Unposting
20 Journal Template Entry
21 Metadata Load
22 Metadata Extract
23 Member List Load
24 Member List Scan
25 Member List Extract
26 Security Load
27 Security Scan
28 Security Extract
29 Logon
30 Logon Failure
31 Logoff
32 External
33 Metadata Scan
34 Data Scan
35 Extended Analytics Export
36 Extended Analytics Schema Delete
37 Transactions Load
38 Transactions Extract
39 Document Attachments
40 Document Detachments
41 Create Transactions
42 Edit Transactions
43 Delete Transactions
44 Post Transactions
45 Unpost Transactions
46 Delete Invalid Records
47 Data Audit Purged
48 Task Audit Purged
49 Post All Transactions
50 Unpost All Transactions
51 Delete All Transactions
52 Unmatch All Transactions
53 Auto Match by ID
54 Auto Match by Account
55 Intercompany Matching Report by ID
56 Intercompany Matching Report by Acct
57 Intercompany Transaction Report
58 Manual Match
59 Unmatch Selected
60 Manage IC Periods
61 Lock/Unlock IC Entities
62 Manage IC Reason Codes
63 Null -
Oracle OLAP as OBIEE Data Source
I've got a couple of questions regarding the use of Oracle OLAP (Analytic Workspace/Cube) as an OBIEE data source.
First: As a general rule when creating a dimension, we create a total roll-up for the dimension i.e. "Total Product", "Total Geog", "Total Customer" etc... Generally, I don't create a total roll-up for time dimensions. When importing metadata from OLAP to OBIEE, OBIEE creates a "Total" level for all dimensions. Now, I understand why OBIEE does that; to support queries that might exclude one or more dimensions. My question is: what is the best method/procedure to deal with the extra "Total" level?
Second: I would appreciate it if someone could explain this error for me: [nQSError: 59137] Filter level YEAR is below the projected level Total on dimension CMP_TIME while an externally aggregated measure is present. (HY000). I understand the words, but have no clue what OBIEE is trying to tell me. This error pops up constantly and I see no rhyme or reason that would cause it. The specific case above occurred when I clicked on the sort icon for a measure included in a report.
Thanks,
Mark,
Thanks for the reply. However, I'm not sure I made myself clear. I have created a "Product" dimension in AWM (Analytic Workspace Manager) with the following structure: Product -> Product Line -> Total Product. Within the context of this hierarchy, "Total Product" is the "Grand Total" level. When this data is imported into OBIEE using "Oracle OLAP" as a data source, the Product hierarchy is created in the Physical Layer as an "Oracle OLAP Dimension". In the BMM Layer, the hierarchy is structured as: Product -> Product Line -> Total Product -> Total. There are now two "Total" levels. Naturally only one, the OBIEE-generated Total, is defined as "Grand Total". The only child of the Total level is Total Product. I have two hierarchy levels that are the same. So, do we need both? Should we keep both? Should a dimension defined within AWM for use in OBIEE NOT include a total level? It's not really a problem; it just doesn't seem to make any sense to have TWO total levels within a hierarchy.
On the second issue, I wish I could provide some detail, but I'm really not sure how I'd do that. That's why I asked for the meaning of the error: what is OBIEE telling me that I'm doing wrong? All I really did was import the metadata, drag it to the BMM Layer, delete some of the hierarchy level keys, rename some columns and drag the stuff over to the Presentation Layer. So, it's pretty much drag-and-drop.
Another example of the error: We have a Category Dimension (Sub Category -> Category -> Category Group -> Model -> All Categories -> Total) and I want to see the top 10 values of a measure by Category by Model. In an Analysis, adding the Model column works fine, just not the best visualization. Move the Model column to "Sections" and all works; move the Model column to Pivot Table Prompts and it errors. Obviously, I'm asking OBIEE to do something it doesn't want to do, so I'm looking for the root cause of the error.
Thanks, -
Collector goes down and audit data aren't in OAV Console
Hi,
I have OAV 10.2.3.2, a DBAUD collector, and the collection agent is on Windows. When I run this collector, it starts, but after a while it is stopped again. I can retrieve policies, create policies and provision them. Audit data are stored in AUD$, but I can't see them in the Audit Vault console.
In C:/oracle/product/10.2.3.2/av_agent_1/av/log/DBAUD_Collector_av_db2_2 is the following:
***** Started logging for 'AUD$ Audit Collector' *****
INFO @ '21/04/2010 12:04:45 02:00':
***** Collector Name = DBAUD_Collector
INFO @ '21/04/2010 12:04:45 02:00':
***** Source Name = av_db2
INFO @ '21/04/2010 12:04:45 02:00':
***** Av Name = AV
INFO @ '21/04/2010 12:04:45 02:00':
***** Initialization done OK
INFO @ '21/04/2010 12:04:45 02:00':
***** Starting CB
INFO @ '21/04/2010 12:04:46 02:00':
Getting parameter |AUDAUDIT_DELAY_TIME|, got |20|
INFO @ '21/04/2010 12:04:46 02:00':
Getting parameter |AUDAUDIT_SLEEP_TIME|, got |5000|
INFO @ '21/04/2010 12:04:46 02:00':
Getting parameter |AUDAUDIT_ACTIVE_SLEEP_TIME|, got |1000|
INFO @ '21/04/2010 12:04:46 02:00':
Getting parameter |AUDAUDIT_MAX_PROCESS_RECORDS|, got |1000|
INFO @ '21/04/2010 12:04:46 02:00':
***** CSDK inited OK + 1
INFO @ '21/04/2010 12:04:46 02:00':
***** Src alias = SRCDB2
INFO @ '21/04/2010 12:04:46 02:00':
***** SRC connected OK
INFO @ '21/04/2010 12:04:46 02:00':
***** SRC data retrieved OK
INFO @ '21/04/2010 12:04:46 02:00':
***** Recovery done OK
ERROR @ '21/04/2010 12:05:07 02:00':
On line 1287; OAV-46599: internal error ORA-1882 count(DWFACT_P20100420_TMP) =
ORA-06512: at "AVSYS.DBMS_AUDIT_VAULT", line 6
ORA-06512: at "AVSYS.AV$DW", line 1022
ORA-01882: timezone region not found
ORA-06512: at "AVSYS.AV$DW", line 1290
ORA-06512: at line 1
ERROR @ '21/04/2010 12:05:08 02:00':
Collecting thread died with status 46821
INFO @ '21/04/2010 12:06:01 02:00':
Could not call Listener, NS error 12541, 12560, 511, 2, 0
[the same "Could not call Listener, NS error 12541, 12560, 511, 2, 0" entry repeats every minute until 12:13:41]
INFO @ '21/04/2010 12:13:41 02:00':
***** Started logging for 'AUD$ Audit Collector' *****
INFO @ '21/04/2010 12:13:41 02:00':
***** Collector Name = DBAUD_Collector
INFO @ '21/04/2010 12:13:41 02:00':
***** Source Name = av_db2
INFO @ '21/04/2010 12:13:41 02:00':
***** Av Name = AV
INFO @ '21/04/2010 12:13:41 02:00':
***** Initialization done OK
INFO @ '21/04/2010 12:13:41 02:00':
***** Starting CB
INFO @ '21/04/2010 12:13:42 02:00':
Getting parameter |AUDAUDIT_DELAY_TIME|, got |20|
INFO @ '21/04/2010 12:13:42 02:00':
Getting parameter |AUDAUDIT_SLEEP_TIME|, got |5000|
INFO @ '21/04/2010 12:13:42 02:00':
Getting parameter |AUDAUDIT_ACTIVE_SLEEP_TIME|, got |1000|
INFO @ '21/04/2010 12:13:42 02:00':
Getting parameter |AUDAUDIT_MAX_PROCESS_RECORDS|, got |1000|
INFO @ '21/04/2010 12:13:42 02:00':
***** CSDK inited OK + 1
INFO @ '21/04/2010 12:13:42 02:00':
***** Src alias = SRCDB2
INFO @ '21/04/2010 12:13:42 02:00':
***** SRC connected OK
INFO @ '21/04/2010 12:13:42 02:00':
***** SRC data retrieved OK
INFO @ '21/04/2010 12:13:42 02:00':
***** Recovery done OK
ERROR @ '21/04/2010 12:14:03 02:00':
On line 1287; OAV-46599: internal error ORA-1882 count(DWFACT_P20100420_TMP) =
ORA-06512: at "AVSYS.DBMS_AUDIT_VAULT", line 6
ORA-06512: at "AVSYS.AV$DW", line 1022
ORA-01882: timezone region not found
ORA-06512: at "AVSYS.AV$DW", line 1290
ORA-06512: at line 1
ERROR @ '21/04/2010 12:14:05 02:00':
Error on get metric callback: 46821
ERROR @ '21/04/2010 12:14:05 02:00':
Collecting thread died with status 46821
ERROR @ '21/04/2010 12:14:06 02:00':
Receive error. NS error 12537
ERROR @ '21/04/2010 12:14:06 02:00':
Timeout for getting metrics reply!
INFO @ '21/04/2010 12:15:01 02:00':
Could not call Listener, NS error 12541, 12560, 511, 2, 0
INFO @ '21/04/2010 12:16:01 02:00':
Could not call Listener, NS error 12541, 12560, 511, 2, 0
INFO @ '21/04/2010 12:17:01 02:00':
Could not call Listener, NS error 12541, 12560, 511, 2, 0
I've already followed the help in the administrator guide for "Problem: Cannot start the DBAUD collector and the log file shows an error". It says that if I can run the command
$ sqlplus /@SRCDB1
successfully, my source database is set up correctly. I ran this command successfully, but my problem wasn't solved.
Any advice, please?
I reinstalled AVS and the AV collection agent, applied patchset 10.2.3.2 again, and again added the source database, agent, collectors,... but I still have the same problem: the collector starts successfully, but after a while it is stopped, and in the AV console there aren't any data for reports (I created some audit policies and alerts).
I tested the connection with sqlplus /@SRCDB1 and it is OK; the log for the collector is:
May 11, 2010 8:11:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
May 11, 2010 8:11:38 AM Thread-44 FINE: timer task interval calculated = 60000
May 11, 2010 8:11:38 AM Thread-44 FINE: Going to start collector, m_finalCommand=/bin/sh -c $ORACLE_HOME/bin/a
vaudcoll hostname="avtest.zcu.cz" sourcename="stag_db" collectorname="DBAUD_Collector" avname="AV" loglevel="I
NFO" command=START
May 11, 2010 8:11:46 AM Thread-44 FINE: collector started, exitval=0
May 11, 2010 8:11:49 AM Thread-44 FINE: return cached metric , name=IS_ALIVE value=true
May 11, 2010 8:11:49 AM Thread-44 FINE: return cached metric , name=BYTES_PER_SEC value=0.0000
May 11, 2010 8:11:49 AM Thread-44 FINE: return cached metric , name=RECORDS_PER_SEC value=0.0000
May 11, 2010 8:11:49 AM Thread-44 FINE: timer task going to be started...
May 11, 2010 8:12:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
May 11, 2010 8:12:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
May 11, 2010 8:12:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
[the same three IS_ALIVE=false / BYTES_PER_SEC=0.00 / RECORDS_PER_SEC=0.00 entries repeat every minute until 8:21:15]
May 11, 2010 8:21:15 AM Thread-43 FINE: Stop caching since it is NOT alive for 10 times of query
And in the table av$rads_flat the collected audit data are present...
Please, does anyone know where the problem could be? -
Hi all!
I've been searching for the simplest solution for tracking changes made to database data. I know there are some posts about this already, but I haven't found a suitable solution yet. I've been looking at Oracle's built-in options and came up with Workspace Manager and the AUDIT statement for keeping track of data changes.
I'm wondering, is there another option to do this, or is it best to make our own history tables and triggers on INSERT, UPDATE and DELETE? If these history tables have a similar structure to the original tables, what happens when a table is altered and a column is added or dropped?
All I need is to be able to make reports on data changes. Basically I need to know who changed what column, what the old value was (NULL on INSERTs) and what the new value is (NULL on DELETEs).
Does anyone have a simple and effective solution for this? Maybe some Oracle built-in feature I haven't found out about yet? (We're using 10gR2.)
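A minimal, self-contained sketch of the trigger-plus-history-table approach mentioned above (using SQLite via Python purely for portability; a real Oracle implementation would use PL/SQL triggers with :OLD/:NEW and would also record the session user, e.g. via SYS_CONTEXT('USERENV','SESSION_USER'), which this sketch omits):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE emp (id INTEGER PRIMARY KEY, salary INTEGER);
    -- History table: which column changed, old value, new value, when.
    CREATE TABLE emp_hist (
        id INTEGER, col TEXT, old_val TEXT, new_val TEXT,
        changed_at TEXT DEFAULT CURRENT_TIMESTAMP);
    CREATE TRIGGER emp_ins AFTER INSERT ON emp
    BEGIN
        INSERT INTO emp_hist (id, col, old_val, new_val)
        VALUES (NEW.id, 'salary', NULL, NEW.salary);  -- NULL old on INSERT
    END;
    CREATE TRIGGER emp_upd AFTER UPDATE OF salary ON emp
    BEGIN
        INSERT INTO emp_hist (id, col, old_val, new_val)
        VALUES (OLD.id, 'salary', OLD.salary, NEW.salary);
    END;
""")
conn.execute("INSERT INTO emp VALUES (1, 1000)")
conn.execute("UPDATE emp SET salary = 1200 WHERE id = 1")
for row in conn.execute("SELECT id, col, old_val, new_val FROM emp_hist"):
    print(row)
# One history row per change: the INSERT (old NULL) and the UPDATE.
```

Note the downside the question already hints at: when the base table is altered (a column added or dropped), the history table and triggers must be altered in step, which is exactly the maintenance burden that features like Workspace Manager or CDC try to avoid.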
Thank you in advance!
BB
Well, I came up with a solution :).
Since we have our own user and timestamp fields on almost every table, and triggers that fill these two columns, it wasn't that difficult after you suggested your solution.
I include my user column in the change table, and in JDeveloper, in the application module, I've overridden the prepareSession method with a custom method that sets the identifier on the database. It looked OK. But now I have a couple of questions about CDC and really hope for some answers, since I'm running a little out of time and would appreciate any help from someone who uses CDC or knows more about it than I do.
I've been reading the literature and found out that there is a job executing every 24 hours that clears all data changes that are not subscribed to by some subscriber. I thought I could use only the publisher for my history records, but it looks like I was wrong. So I've created a subscriber. My questions are:
1. Do I need a subscriber, or can I keep history without one and make reports on the change tables?
2. Even if I need a subscriber, how can I be sure that data from the change tables or subscriber views won't disappear eventually?
As I mentioned, I need some kind of option for recording a history of data changes, and went with CDC. I hope I made the right decision and would really like to find out the answers to my questions.
Thank you all in advance!
Take care.
BB. -
BPC auditing has been enabled, and since then a lot of audit data has been piling up; it is slowing down the reporting.
How do I delete the BPC audit data that has piled up over this period?
I want to completely delete the data.
Appreciate your inputs.
Hi,
there are 2 standard process chains for archiving audit data:
/CPMB/ARCHIVE_ACTIVITY
/CPMB/ARCHIVE_DATA
tables for audit:
activity tables: UJU_AUDACTHDR, UJU_AUDACTDET, UJU_AUDACTHDR_A, UJU_AUDACTDET_A
data audit: UJU_AUDDATAHDR, /1CPMB/KIOMBAD, UJU_AUDDATAHDR_A, /1CPMB/KIOMBAD_A
regards
D -
Audit data from Identity Mgr repository
Hi,
I want to get data like this: if I create an account for a user today (say the date is A) and assign him certain resources, these resources need to be approved; they are approved after two days (say the date is B). The resource is created at that time.
Any clue how I could get the date when the request was made?
Is there any way, or a workflow service op, that lets us get the audit data for a user from the repository?
Thanks,
Pandu
In the Audit Vault Server infrastructure, there are a number of objects that are used to store the audit data from source databases. The Agent/Collector continually extracts and sends the audit data from the source databases into the Audit Vault repository. One of the tables that stores the 'raw' audit data is av$rads_flat, which should never be externalized, changed, or manipulated by anyone.
Out of the box, a job runs every 24 hours to transform or normalize the raw audit data into the warehouse structure that the reports are run against. The warehouse tables are published in the Audit Vault Auditor's documentation in which you can run your own report tool or other plsql to analyze the audit data.
What do you want to do with the raw audit data?
Thanks, Tammy -
MB5B Report for Date Wise Stock and Value
Hi,
I am taking the MB5B report for date-wise stock and value.
But I have one doubt: are all stocks included in this report, like unrestricted stock, return stock, blocked stock, stock in transit, restricted stock, and quality inspection stock?
I have another doubt: in this report there are three stock type indicators, like these:
1.Storage Location / Batch Stock
2.valuated Stock
3.Special Stock.
But I have one doubt: what is the difference between these?
1. Storage Location / Batch Stock
2. Valuated Stock
Hi Prasad,
Yes, the MB5B report considers unrestricted, quality, blocked, transit and restricted stock. Not sure about return stock.
If you select the Storage Location/Batch Stock radio button, the system will display all the possible stock from the storage location and the corresponding batch as well.
If you select the Valuated Stock radio button, the system will show only the valuated stock, not the non-valuated stock. Because the non-valuated material type is available in SAP, the system will not show that stock if you select the Valuated Stock radio button.
Regards
Karthick -
How to generate a report from data entered by users in XML forms
hi,
I have a requirement to generate a report from XML forms which are created using XML Forms Builder and stored in KM folders. Management wants to see, in a single report, all the activities of all the users in the XML form, in such a way that one row in the report presents the data belonging to each user. I have no idea how to approach this.
Can anyone help me in detail?
Thanking you in advance,
sajeer
Hi Sajeer,
You have given very little information about what data should be collected / shown in detail. Anyhow, it sounds as if you want to provide data which isn't provided within the standard portal so far (see http://help.sap.com/saphelp_nw2004s/helpdata/en/07/dad131443b314988eeece94506f861/frameset.htm for a list of existing reports).
For your needs you probably have to develop a repository service which reacts on content / property changes and then you would have to log these changes somewhere, for example in a DB. On the other side, you then could implement a report iView which accesses this data and displays it as whished.
Hope it helps
Detlev
PS: Please consider rewarding points for helpful answers on SDN. Thanks in advance! -
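Detlev's suggestion boils down to two parts: a repository service that logs each user's form activity (for example to a DB table), and a report that collapses that log into one row per user. The KM repository-service API itself is out of scope here, but the aggregation step can be sketched in plain Java; the record shape and names below are hypothetical:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class ActivityReport {
    // One logged event: which user touched which XML form (hypothetical shape)
    record Event(String user, String form) {}

    // Collapse the raw event log into one row per user:
    // user -> number of form activities by that user
    static Map<String, Integer> rowsPerUser(List<Event> log) {
        Map<String, Integer> rows = new LinkedHashMap<>();
        for (Event e : log) {
            rows.merge(e.user(), 1, Integer::sum);
        }
        return rows;
    }

    public static void main(String[] args) {
        List<Event> log = List.of(
            new Event("sajeer", "leave_request"),
            new Event("detlev", "leave_request"),
            new Event("sajeer", "travel_form"));
        // One map entry per user, i.e. one report row per user
        System.out.println(rowsPerUser(log));
    }
}
```

The report iView would run essentially this grouping, only against the logged table instead of an in-memory list.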
How to create a report with data using the Crystal Reports for Java SDK
Hi,
How do I create a report with data that can be displayed via the Crystal Report for Java SDK and the Viewers API?
I am writing my own report designer, and would like to use the Crystal Runtime Engine to display my report in DHTML, PDF, and Excel formats. I can create my own report through the following code snippet:
ReportClientDocument boReportClientDocument = new ReportClientDocument();
boReportClientDocument.newDocument();
However, I cannot find a way to add data elements to the report without specifying an RPT file. Is this possible? It seems like it should be, since the Eclipse plug-in allows you to specify your database parameters when creating an RPT file.
is there a way to do this through these packages?
com.crystaldecisions.sdk.occa.report.data
com.crystaldecisions.sdk.occa.report.definition
Am I forced to create a RPT file for the different table and column structures I have?
Thank you in advance for any insights.
Ted Jenney
Hi Rameez,
After working through the example code some more, and doing some more research, I remain unable to populate a report with my own data and view the report in a browser. I realize this is a long post, but there are multiple errors I am receiving, and these are the seemingly essential ones that I am hitting.
Modeling the Sample code from Create_Report_From_Scratch.zip to add a database table, using the following code:
<%@ page import="com.crystaldecisions.sdk.occa.report.application.*"%>
<%@ page import="com.crystaldecisions.sdk.occa.report.data.*"%>
<%@ page import="com.crystaldecisions.sdk.occa.report.document.*"%>
<%@ page import="com.crystaldecisions.sdk.occa.report.definition.*"%>
<%@ page import="com.crystaldecisions.sdk.occa.report.lib.*" %>
<%@ page import = "com.crystaldecisions.report.web.viewer.*"%>
<%
try {
ReportClientDocument rcd = new ReportClientDocument();
rcd.newDocument();
// Setup the DB connection
String database_dll = "Sqlsrv32.dll";
String db = "qa_start_2012";
String dsn = "SQL Server";
String userName = "sa";
String pwd = "sa";
// Create the DB connection
ConnectionInfo oConnectionInfo = new ConnectionInfo();
PropertyBag oPropertyBag1 = oConnectionInfo.getAttributes();
// Set new table logon properties
PropertyBag oPropertyBag2 = new PropertyBag();
oPropertyBag2.put("DSN", dsn);
oPropertyBag2.put("Data Source", db);
// Set the connection info objects members
// 1. Pass the Logon Properties to the main PropertyBag
// 2. Set the Server Description to the new **System DSN**
oPropertyBag1.put(PropertyBagHelper.CONNINFO_CRQE_LOGONPROPERTIES, oPropertyBag2);
oPropertyBag1.put(PropertyBagHelper.CONNINFO_CRQE_SERVERDESCRIPTION, dsn);
oPropertyBag1.put("Database DLL", database_dll);
oConnectionInfo.setAttributes(oPropertyBag1);
oConnectionInfo.setUserName(userName);
oConnectionInfo.setPassword(pwd);
// The Kind of connectionInfos is CRQE (Crystal Reports Query Engine).
oConnectionInfo.setKind(ConnectionInfoKind.CRQE);
// Add a Database table
String tableName = "Building";
Table oTable = new Table();
oTable.setName(tableName);
oTable.setConnectionInfo(oConnectionInfo);
rcd.getDatabaseController().addTable(oTable, null);
} catch (ReportSDKException RsdkEx) {
out.println(RsdkEx);
} catch (Exception ex) {
out.println(ex);
}
%>
Throws the exception
com.crystaldecisions.sdk.occa.report.lib.ReportSDKException: java.lang.NullPointerException---- Error code:-2147467259 Error code name:failed
There was other sample code on SDN which suggested the following - adding the table after calling table.setDataFields() as in:
String tableName = "Building";
String fieldname = "Building_Name";
Table oTable = new Table();
oTable.setName(tableName);
oTable.setAlias(tableName);
oTable.setQualifiedName(tableName);
oTable.setDescription(tableName) ;
Fields fields = new Fields();
DBField field = new DBField();
field.setDescription(fieldname);
field.setHeadingText(fieldname);
field.setName(fieldname);
field.setType(FieldValueType.stringField);
field.setLength(40);
fields.add(field);
oTable.setDataFields(fields);
oTable.setConnectionInfo(oConnectionInfo);
rcd.getDatabaseController().addTable(oTable, null);
This code succeeds, but it is not clear how to add that database field to a section. If I attempt to call the following:
FieldObject oFieldObject = new FieldObject();
oFieldObject.setDataSourceName(field.getFormulaForm());
oFieldObject.setFieldValueType(field.getType());
// Now add it to the section
oFieldObject.setLeft(3120);
oFieldObject.setTop(120);
oFieldObject.setWidth(1911);
oFieldObject.setHeight(226);
rcd.getReportDefController().getReportObjectController().add(oFieldObject, rcd.getReportDefController().getReportDefinition().getDetailArea().getSections().getSection(0), -1);
Then I get an error (which is not unexpected)
com.crystaldecisions.sdk.occa.report.lib.ReportDefControllerException: The field was not found.---- Error code:-2147213283 Error code name:invalidFieldObject
How do I add one of the table.SetDataFields() to my report to be displayed?
Are there any other pointers or suggestions you may have?
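One likely cause of the invalidFieldObject error above: setDataSourceName must receive the field in its formula form, which for a database field is normally the brace-qualified {Table.Field} string, and that field must belong to a table the report's DatabaseController already knows about. Purely as an illustration of the expected string shape (this helper is hypothetical, not part of the SDK):

```java
public class FormulaForm {
    // Build the brace-qualified formula form a Crystal DB field normally uses,
    // e.g. table "Building" + field "Building_Name" -> "{Building.Building_Name}"
    static String formulaForm(String table, String field) {
        return "{" + table + "." + field + "}";
    }

    public static void main(String[] args) {
        // The string passed to FieldObject.setDataSourceName should match what
        // DBField.getFormulaForm() returns for the field registered via addTable
        System.out.println(formulaForm("Building", "Building_Name")); // {Building.Building_Name}
    }
}
```

If getFormulaForm() on your DBField returns an unqualified name, that would explain "The field was not found" when the report object controller tries to resolve it against the registered table.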
Thank you