Deleting In-Place Discovery and Hold report audit data

I need to remove the auditing data for the
In-Place Discovery and Hold report. Is this possible in Exchange Online?

This is how to delete a data source from a custom report in Analyzer 6.5:
- Open the custom report. On the main toolbar, go to Tools and select Design Report.
- On the Design Report page, highlight the data source (spreadsheet or chart), right-click, and select Properties. The Data Object Properties box will pop up.
- Highlight "ReportDataSrc1" (if this is the data source you want to delete) and press Delete on your keyboard. You will get a Yes/No prompt; click Yes, and the data source is deleted from your custom report permanently. Remember to save the report.
If you have any questions, feel free to email me: [email protected]

Similar Messages

  • How to use a URL to open a report and refresh its data, while hiding the toolbar's "Refresh Data" button

    Post Author: madbird
    CA Forum: WebIntelligence Reporting
    Hi,
    The scenario is:
    I want to use a URL to open the report and refresh the report's data, but I don't want the user to use the "Refresh Data" button on the toolbar to refresh the data. How can I do that?
    Regards

    Post Author: madbird
    CA Forum: WebIntelligence Reporting
    Thank you for your attention, amr_foci.
    I created a Web Intelligence report that will be used by more than 30 companies, and each company is only allowed to see its own data. I want to open the report via a URL from a system developed by my company and refresh the report's table data, but I do not want users to refresh the data with the toolbar's "Refresh Data" button.
    The questions of identity authentication and of opening the report by URL have been resolved, but I still cannot hide the report page's "Refresh Data" button.
    In addition, my BusinessObjects XI system is deployed in a Java environment.
    Regards,

  • List of receiving and shipping reports by date

    How can I get a list of receiving and shipping reports by date?
    Or is there a way to get the last 5 shipping/receiving reports in the year?

    That info is saved in the reports themselves. If you have a developer available, you can write a small app to get the info. Post your question to the Legacy Application Development SDKs forum or the .NET forum.

  • OLAPi report audit data...

    Does the auditing database capture events regarding OLAPi reports?
    We still have some OLAP reports in our XI R2 environment and would like to know who is still looking at these and how often. I am not seeing any data on these reports from the Activity universe but I'm wondering if I'm looking in the wrong place or don't have something turned on.
    Thanks,

    We can't find information regarding OLAP reports in Auditor; it has been raised as an enhancement request which may be implemented in the next major version.
    Thanks,
    Ramsudhakar.

  • Deleting websheet Primary and Alternative reports

    Hi,
    I saved the primary report and it actually created another primary report; I'm now up to three primary reports. For example, when I want to add a new section to a page that includes a chart, it asks what data grid and report to base it on. When I select the report, it lists three "Primary Reports", all slightly different. How do I delete two of them? The checkbox for deleting them does not appear when I go to Admin --> Manage Services --> Interactive Report Settings --> Saved Reports --> Web Application Reports. Is it intended behavior to be able to have multiple primary reports?
    Next, I created an alternative report and deleted it using the steps above, but the report still shows as an option on the data grid page: the select list for reports still includes the alternative report even though I deleted it. This is a real problem, as the alternative report is wrong and end users keep using it.
    Thanks,
    --Kevin

    OK I think I figured it out. Clearing out the user preference (in the post-login block in the authentication scheme) seems to make it pick the Primary report (APXWS_DEFAULT) every time a new session is created.
    Not sure I understand the syntax behind the various values I see in the preference so I didn't mess with setting the value but clearing it out seems to work fine.
    DECLARE
        l_ir_pref VARCHAR2(500);
    BEGIN
        -- Clear the preference so that the IR defaults to the Primary Report
        SELECT 'FSP_IR_' || application_id || '_P' || page_id || '_W' || interactive_report_id
          INTO l_ir_pref
          FROM apex_application_page_ir
         WHERE application_id = nv('APP_ID')
           AND page_id = 10 -- Change as needed
           AND ROWNUM = 1;
        apex_util.remove_preference(l_ir_pref);
    END;
    Is any of this documented?

  • SSM calling Xcelsius and BO Reports combine datas and Universes into SSM

    I would like to connect SAP SSM reports with Xcelsius, passing parameters to Xcelsius dashboards (with or without BO universes), and merge information from the PAS cube with data from other BO universes.
    Could someone help me?
    Best Regards
    Carlos Mardinotto

    Carlos,
    There are two different processes you use to get information into Xcelsius and into Universes. For Xcelsius you are using web services, since you are bringing in not only the data but the objective names, etc. to create the dashboards.
    For getting PAS data into Universes you create an ODBO connection to Voyager and from Voyager you create a Universe.
    In the Configuration Help guide for Strategy Management available from Service Marketplace (Installation & Upgrade Guides), Section 9.6 gives information on providing Strategy Management data for Xcelsius and Section 9.7 has instructions on providing Strategy Management Data for Voyager.
    Regards,
    Bob

  • I need help I want to analyze and create reports using data from my firefox history files .

    I am not quite sure where to post this. I had a program for analyzing my Firefox history which no longer works due to the change in how Firefox stores history data. I am not an SQLite user. Any help will be greatly appreciated.
    I downloaded the SQLite Manager add-on and it is now installed in my Firefox browser. I have tried it out and can make some sense of it. I would like to know how to decode the time/date storage system.

    It uses Unix time, which is the number of seconds that have elapsed since January 1, 1970 00:00 UTC/GMT.
    http://en.wikipedia.org/wiki/Unix_time
    Here is an online converter tool for Unix times:
    http://www.esqsoft.com/javascript_examples/date-to-epoch.htm
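As a concrete sketch of that decoding (Python, with an in-memory SQLite table standing in for Firefox's places.sqlite; note that Firefox's moz_places.last_visit_date is typically stored in microseconds since the Unix epoch, not seconds, so divide by 1,000,000 first):

```python
import sqlite3
from datetime import datetime, timezone

# Hypothetical in-memory stand-in for Firefox's places.sqlite.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE moz_places (url TEXT, last_visit_date INTEGER)")
# last_visit_date is microseconds since 1970-01-01 00:00 UTC (PRTime).
conn.execute("INSERT INTO moz_places VALUES (?, ?)",
             ("https://example.com/", 1_341_360_000_000_000))

for url, visit_us in conn.execute("SELECT url, last_visit_date FROM moz_places"):
    visited = datetime.fromtimestamp(visit_us / 1_000_000, tz=timezone.utc)
    print(url, visited.strftime("%Y-%m-%d %H:%M:%S"))  # → https://example.com/ 2012-07-04 00:00:00
```

The same division works inside SQLite Manager itself, e.g. `SELECT datetime(last_visit_date/1000000, 'unixepoch') FROM moz_places;`.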

  • Deleted bad delta and former good deltas from data mart to targets (5)?

    Hi experts,
    My process chain failed continuously for 5 days due to a DSO activation issue. On further analysis I found a bad request with status red on July 3rd, and there is no PSA to fix it.
    I deleted the bad request that was loaded on July 3rd with status red, and all the good requests with status green on July 4th, 5th, 6th, 7th and 8th.
    The data load goes from the data mart to 5 different targets (3 cubes and 2 DSOs), and I deleted the requests listed above from all 5 targets.
    Now when I try to run a new delta, it says 'new delta is not possible' and asks for the previous deltas.
    What should I do now? Should I delete the init, re-init without data transfer, and load my new delta? I am afraid I may lose data if I do this.
    Thanks in advance.
    Sharat.

    Murali,
    I had the failed delta in the targets (3 cubes and 2 ODSs). I deleted the bad delta and the former good ones after July 3rd from all five targets, verifying the technical names of the requests in the targets.
    How should I proceed now?
    Can I do a full repair request and a new delta? Or delete the init, re-init without data transfer, and then do a full repair and a new delta?
    Please let me know. Thank you for your help.
    Sharat.

  • How to manage separate OLTP and MIS/Reports Server

    Dear All,
    Due to a large amount of data and heavy MIS queries, response from our database server is very slow, and users face slow performance.
    We want to separate our OLTP and MIS/reporting workloads.
    What is the best scenario provided by Oracle to implement this strategy? We are using Oracle 11gR1 on a Linux server.
    One thing to remember: some reports require up-to-date information, that is, they have to show the latest data. However, some MIS reports can run from archived data, e.g. a procedure runs during off hours and inserts data into tables periodically, and those MIS reports select data from these tables.
    Your expert comments are required.
    Thanks, Imran

    I also require some procedures that run at scheduled times and generate records in different tables, so a read-only database instance won't help much. If you need the reporting database to be updatable, a physical standby is out. A logical standby is also most likely out unless you are just creating new objects: if you want to modify existing objects (inserting, updating, or deleting rows in existing tables), a logical standby would not be appropriate.
    Do you want the reporting database to have the same data model as the OLTP database? Reporting databases often perform better with a different data model, but moving to a star or snowflake schema expands the scope of the project and increases complexity substantially. It also requires that existing queries be modified to hit the new schema, which adds development time.
    Justin

  • How to delete a user without leaving the user's data?

    Does somebody know how I can delete a SAP user so that none of its traces remain?
    My problem is this:
    There is a user who has something different from the others, which causes him not to behave the same as the others.
    So I thought that deleting him via SU01 and then re-creating him as a copy of another user would fix it...
    ... but it didn't work. SAP remembers him and re-inserts some data that the user had before being deleted!
    Which tables must I check to eliminate these traces?
    Thanks in advance.

    Hi
    It's not possible to delete a SAP user together with all of the data regarding that user; this data is stored in change documents etc. across many application tables, and some of it is a crucial part of the audit trail. Furthermore, there are a number of tables, e.g. USERS_SSM, where you can customize system behavior based on the user ID.
    To fix this issue you need to investigate the differences in detail. Try having a look at the Personalization tab, and at the concrete application where you encounter the problem; there might be some personalization data in there as well.
    Regards
    Morten Nielsen

  • How do I delete an app and all of its components?

    Hi there,
    Just like the heading suggests, how can I do this? I've tried my best but can't seem to find a way to access the "place" where the apps are installed/stored.
    Many thanks for any help,
    Jacobo

    Just press and hold the app icon until the icons start to shake, then press the 'x' on any that you want to delete; press the Home button when you've finished to stop the shaking. That should delete the app and all its data/content. You can't delete any of the Apple built-in apps (iTunes, iPod, App Store, Game Centre etc.).

  • Audit data from Identity Mgr repository

    Hi,
    I want to get data like the following: I create an account for a user and assign him certain resources today (say date A). These resources need to be approved, and they are approved two days later (say date B); the resource is created at that time.
    Any clue how I could get the date when the request was made?
    Is there any way, or a workflow service op, that lets us get the audit data for a user from the repository?
    Thanks,
    Pandu

    In the Audit Vault Server infrastructure, there are a number of objects that are used to store the audit data from source databases. The Agent/Collector continually extracts and sends the audit data from the source databases into the Audit Vault repository. One of the tables that stores the 'raw' audit data is av$rads_flat which should never be externalized, changed, or manipulated by anyone.
    Out of the box, a job runs every 24 hours to transform or normalize the raw audit data into the warehouse structure that the reports are run against. The warehouse tables are published in the Audit Vault Auditor's documentation in which you can run your own report tool or other plsql to analyze the audit data.
    What do you want to do with the raw audit data?
    Thanks, Tammy

  • Difference between delete aged discovery data and delete inactive client discovery data maintenance tasks

    I'm setting up a new CM12 site and one issue with our current CM2007 site that I'm trying to minimise in this new site is the discovery and retention of dead clients.
    I think the new options as part of the system and group discoveries to only discover systems that have logged on to a domain/updated their password in a set period will resolve the majority of the problems we have here. I've set both of these to 30 days.
    Management are keen to be pretty aggressive with the retention period so we're looking to only keep computers in the database that have heartbeated within 30 days. I've set the heartbeat discovery to 8 hours and now I'm looking at the maintenance tasks.
    The task I think I need to configure is 'delete inactive client discovery data'. I've set this to 30 days and to run every Saturday. If I look at the 'delete aged discovery data' task it's set to 90 days every Saturday. How will that affect things?
    Do I need to make both tasks match?
    P.S. Thoughts on the retention period...is 30 days too aggressive? My original design was 60 days.

    I do have a question about this, more related to "Delete Aged Discovery Data". I've got the following configured in my environment for AD discovery (development environment):
    do not discover machines not logged on for 80 days | do not discover machines that have not changed their password for 80 days.
    If I run the query select * from v_r_system (nolock) where datediff("dd", last_logon_date0, getdate()) > 80
    it returns about 600 devices.
    The task "Delete Aged Discovery Data" is also configured to remove anything that has not been discovered in the last 80 days. When this task runs, I would expect the 600 devices to be removed from the DB; however, it only removed 1 device.
    Questions I have:
    Do you know which other criteria are taken into consideration for this to work? I'm currently using SQL Profiler to determine which stored procedure/function is used to remove these devices.
    This is my development environment; I only have 2-3 clients, and all the other devices are just discovered with no SCCM client installed.
    Do the devices have to be discovered as part of AD device discovery, or is it sufficient that they get discovered through AD group membership? In our environment AD device discovery will not work, as we don't have WINS/DDNS enabled.
    Thanks for any help.
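For what it's worth, the cutoff that query expresses can be sketched in plain Python (an illustration only; the device names and dates below are made up, and this is not how the SCCM maintenance task is implemented internally):

```python
from datetime import datetime, timedelta

def stale_devices(devices, cutoff_days=80, today=None):
    """Return names whose last logon is more than cutoff_days old,
    mirroring: datediff("dd", last_logon_date0, getdate()) > 80."""
    today = today or datetime.utcnow()
    cutoff = today - timedelta(days=cutoff_days)
    return [name for name, last_logon in devices if last_logon < cutoff]

# Made-up sample data:
today = datetime(2014, 6, 1)
devices = [
    ("PC-001", datetime(2014, 5, 20)),  # logged on recently
    ("PC-002", datetime(2014, 1, 15)),  # stale: > 80 days ago
]
print(stale_devices(devices, 80, today))  # → ['PC-002']
```

If the task deletes far fewer rows than this simple date filter matches, that is consistent with the poster's suspicion that it weighs additional criteria (e.g. client/heartbeat state) on top of the raw last-logon date.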

  • Analysing Task Audit, Data Audit and Process Flow History

    Hi,
    Internal Audit dept has requested a bunch of information, that we need to compile from Task Audit, Data Audit and Process Flow History logs. We do have all the info available, however not in a format that allows proper "reporting" of log information. What is the best way to handle HFM logs so that we can quickly filter and export required audit information?
    We do have housekeeping in place, so the logs are partial "live" db tables, and partial purged tables that were exported to Excel to archive the historical log info.
    Many Thanks.

    I thought I posted this Friday, but I just noticed I never hit the 'Post Message Button', ha ha.
    The info below will help you translate some of the information in the tables, etc. You could report on it from the audit tables directly, or move them to another appropriate data table for analysis later. The consensus, though I disagree, is that you will suffer performance issues if your audit tables get too big, so you want to move them periodically. You can do this using a scheduled task, a manual process, etc.
    I personally just dump it to another table and report on it from there. As mentioned above, you'll need to translate some of the information as it is not 'human readable' in the database.
    For instance, if I wanted to pull Metadata Load, Rules Load, Member List load, you could run a query like this. (NOTE: strAppName should be equal to the name of your application .... )
    The main tricks to know, at least for the task audit table, are figuring out how to convert the times and determining which activity code corresponds to which user-friendly name.
    -- Declare working variables --
    declare @dtStartDate as nvarchar(20)
    declare @dtEndDate as nvarchar(20)
    declare @strAppName as nvarchar(20)
    declare @strSQL as nvarchar(4000)
    -- Initialize working variables --
    set @dtStartDate = '1/1/2012'
    set @dtEndDate = '8/31/2012'
    set @strAppName = 'YourAppNameHere'
    --Get Rules Load, Metadata, Member List
    set @strSQL = '
    select sUserName as "User", ''Rules Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (1)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
    union all
    select sUserName as "User", ''Metadata Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (21)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
    union all
    select sUserName as "User", ''Memberlist Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (23)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + ''''
    exec sp_executesql @strSQL

    In regards to activity codes, here's a quick breakdown of those:
    ActivityID     ActivityName
    0     Idle
    1     Rules Load
    2     Rules Scan
    3     Rules Extract
    4     Consolidation
    5     Chart Logic
    6     Translation
    7     Custom Logic
    8     Allocate
    9     Data Load
    10     Data Extract
    11     Data Extract via HAL
    12     Data Entry
    13     Data Retrieval
    14     Data Clear
    15     Data Copy
    16     Journal Entry
    17     Journal Retrieval
    18     Journal Posting
    19     Journal Unposting
    20     Journal Template Entry
    21     Metadata Load
    22     Metadata Extract
    23     Member List Load
    24     Member List Scan
    25     Member List Extract
    26     Security Load
    27     Security Scan
    28     Security Extract
    29     Logon
    30     Logon Failure
    31     Logoff
    32     External
    33     Metadata Scan
    34     Data Scan
    35     Extended Analytics Export
    36     Extended Analytics Schema Delete
    37     Transactions Load
    38     Transactions Extract
    39     Document Attachments
    40     Document Detachments
    41     Create Transactions
    42     Edit Transactions
    43     Delete Transactions
    44     Post Transactions
    45     Unpost Transactions
    46     Delete Invalid Records
    47     Data Audit Purged
    48     Task Audit Purged
    49     Post All Transactions
    50     Unpost All Transactions
    51     Delete All Transactions
    52     Unmatch All Transactions
    53     Auto Match by ID
    54     Auto Match by Account
    55     Intercompany Matching Report by ID
    56     Intercompany Matching Report by Acct
    57     Intercompany Transaction Report
    58     Manual Match
    59     Unmatch Selected
    60     Manage IC Periods
    61     Lock/Unlock IC Entities
    62     Manage IC Reason Codes
    63     Null
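The two tricks named above (converting the task-audit times and mapping activity codes) can be sketched in Python. This is a hedged illustration: the `StartTime-2` in the T-SQL works because HFM appears to store times as OLE automation dates (day 0 = 1899-12-30), while SQL Server's `cast(... as smalldatetime)` uses a 1900-01-01 base, two days later. The sample row below is made up.

```python
from datetime import datetime, timedelta

# OLE automation dates: fractional days since 1899-12-30.
OLE_EPOCH = datetime(1899, 12, 30)

def ole_to_datetime(ole_days: float) -> datetime:
    """Convert an HFM task-audit StartTime/EndTime value to a datetime."""
    return OLE_EPOCH + timedelta(days=ole_days)

# Subset of the activity-code table above.
ACTIVITY_NAMES = {1: "Rules Load", 9: "Data Load", 21: "Metadata Load",
                  23: "Member List Load", 29: "Logon"}

# Made-up row from <app>_task_audit:
start_time, activity_code = 41122.5, 21
print(ole_to_datetime(start_time), ACTIVITY_NAMES[activity_code])
# → 2012-08-01 12:00:00 Metadata Load
```

Verify the epoch offset against a known event in your own environment before relying on it; the same mapping dictionary can be extended to all 63 codes listed above.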

  • UCCX 7.0.1SR5 to 8.0 upgrade while also adding LDAP integration for CUCM - what happens to agents and Historical Reporting data?

    Current State:
    •    I have a customer running CUCM 6.1 and UCCX 7.01SR5.  Currently their CUCM is *NOT* LDAP integrated and using local accounts only.  UCCX is AXL integrated to CUCM as usual and is pulling users from CUCM and using CUCM for login validation for CAD.
    •    The local user accounts in CUCM currently match the naming format in active directory (John Smith in CUCM is jsmith and John Smith is jsmith in AD)
    Goal:
    •    Upgrade software versions and migrate to new hardware for UCCX
    •    LDAP integrate the CUCM users
    Desired Future State and Proposed Upgrade Method
    Using the UCCX Pre Upgrade Tool (PUT), backup the current UCCX 7.01 server. 
    Then, during a weekend maintenance window...
    •    Upgrade the CUCM cluster from 6.1 to 8.0 in 2 step process
    •    Integrate the CUCM cluster to corporate active directory (LDAP) - sync the same users that were present before, associate with physical phones, select the same ACD/UCCX line under the users settings as before
    •    Then build UCCX 8.0 server on new hardware and stop at the initial setup stage
    •    Restore the data from the UCCX PUT tool
    •    Continue setup per documentation
    At this point does UCCX see these agents as the same as they were before?
    Is the historical reporting data the same with regards to agent John Smith (local CUCM user) from last week and agent John Smith (LDAP imported CUCM user) from this week ?
    I have the feeling that UCCX will see the agents as different almost as if there is a unique identifier that's used in addition to the simple user name.
    We can simplify this question along these lines
    Starting at the beginning with CUCM 6.1 (local users) and UCCX 7.01.  Let's say the customer decided to LDAP integrate the CUCM users and not upgrade any software. 
    If I follow the same steps with re-associating the users to devices and selecting the ACD/UCCX extension, what happens? 
    I would guess that UCCX would see all the users it knew about get deleted (making them inactive agents) and the see a whole group of new agents get created.
    What would historical reporting show in this case?  A set of old agents and a set of new agents treated differently?
    Has anyone run into this before?
    Is my goal possible while keeping the agent configuration and HR data as it was before?

    I was doing some more research looking at the DB schema for UCCX 8.
    Looking at the Resource table in UCCX, it looks like there is primary key that represents each user.
    My question, is this key replicated from CUCM or created locally when the user is imported into UCCX?
    How does UCCX determine if user account jsmith in CUCM, when it’s a local account, is different than user account jsmith in CUCM that is LDAP imported?
    Would it be possible (with TAC's help most likely) to edit this field back to the previous values so that AQM and historical reporting would think the user accounts are the same?
    Database table name: Resource
    The Unified CCX system creates a new record in the Resource table when the Unified CCX system retrieves agent information from the Unified CM.
    A Resource record contains information about the resource (agent). One such record exists for each active and inactive resource. When a resource is deleted, the old record is flagged as inactive; when a resource is updated, a new record is created and the old one is flagged as inactive.
