Cisco ANM - Exporting Historical Data (VA)

Hi Experts,
I am looking for a way to access historical data on a Cisco ANM 5.2.1 Virtual Appliance. The documentation says that raw data should be stored in
/var/lib/anm/export/historical-data/date-stamp.
The problem is that the VA is a locked-down environment with no access to its contents.
Has anyone found a way to use external scripts to gather historical data in CSV format?
BR
Marcin
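For what it's worth, if you do manage to get at the export directory (for example via a support-enabled account or a backup copy), merging the per-day CSV dumps with a script is straightforward. This is only a sketch under the assumption that the date-stamped subdirectories contain plain CSV files sharing a common header row:

```python
import csv
import os

def collect_historical_csvs(export_root):
    """Merge all CSV files found under date-stamped subdirectories of
    export_root into a single list of rows (header taken from the first file)."""
    merged = []
    header = None
    for date_dir in sorted(os.listdir(export_root)):
        dir_path = os.path.join(export_root, date_dir)
        if not os.path.isdir(dir_path):
            continue
        for name in sorted(os.listdir(dir_path)):
            if not name.endswith(".csv"):
                continue
            with open(os.path.join(dir_path, name), newline="") as f:
                rows = list(csv.reader(f))
            if not rows:
                continue
            if header is None:
                header = rows[0]
                merged.append(header)
            merged.extend(rows[1:])  # skip each file's own header row
    return merged
```

The directory layout and file naming here are assumptions, not something the ANM documentation guarantees.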

This exception is sometimes seen when the 4.0(5) server had been upgraded from 4.0(3) or a lower version originally.  The missing information in DCD does not cause any issues on 4.0, but it causes an issue for the DMT.
Attached is a document that is often helpful to replace this file and correct the problem.  The blob value that needs to be copied into the entry in DCD is contained within the doc.
Thanks,
Brendan

Similar Messages

  • Exporting Historical Data from Lookout Version 3.8 to Excel

    Can anyone please give step-by-step instructions on how to export historical data from Lookout Version 3.8 to an Excel Spreadsheet or CSV file?

    Hi koehler, in the following link you can find detailed step-by-step instructions:
    http://digital.ni.com/public.nsf/websearch/5915049F3BF93935862568C5005E9DF3?OpenDocument
    Benjamin C
    Senior Systems Engineer // CLA // CLED // CTD

  • Exporting historical data to text file with MAX misses columns of data?

    I am using LabVIEW 7.1 with the DSC module 7.1 and want to export data to either Excel format or .txt.
    I have tried this with the historical data export in MAX, and also programmatically with the "Write Traces to Spreadsheet File.vi" available in the DSC module. All the tags in my tag engine file (*.scf) are defined to log data and events. Both the update tag engine deadband and the update database deadband are set to 0%.
    My exported Excel or text file seems reasonable except that some columns of data are missing. I don't understand
    why data from these tags is not in the exported files, since they have the same setup in the .scf file as other tags which export okay.
    All defined tags can be seen using the NI HyperTrend or MAX, including the ones that are not correctly exported to file.
    Appreciate comments on this.
    Best regards,
    Ingvald Bardsen

    I am using LV and DSC 7.1 with a PCI-6251 and MAX 4.2. In fact, just one column of values does not make sense. The exported Excel file is attached. The last column, called ...V-002, is the problem. I put probes in to check the values and they show correctly, but when the file is exported the values come out wrong.
    I solved the problem of the missing column values by putting 0% in the deadband field.
    thank you for your help
    Attachments:
    qui, 2 de ago de 2007 - 132736.xls ‏21 KB
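    As a side note on why a non-zero deadband makes columns look empty: a value that changes by less than the deadband is simply never logged, so a slowly-moving tag produces far fewer samples than its neighbours. A small illustration of that logging rule (hypothetical values, not LabVIEW code):

```python
def log_with_deadband(samples, deadband_pct, span):
    """Return only the samples that would be logged: a sample is recorded
    when it differs from the last *logged* value by more than
    deadband_pct percent of the engineering span."""
    logged = []
    for value in samples:
        if not logged or abs(value - logged[-1]) > (deadband_pct / 100.0) * span:
            logged.append(value)
    return logged
```

    With a 0% deadband every (changing) sample is logged; with, say, 5% over a span of 100, all the small moves are silently dropped, which looks like missing column data in the export.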

  • Exporting LiveCache data

    Hi Experts,
    we have been exporting liveCache data to a backup cube without any issues.
    What I want to know is whether it is possible to export historical data into the cube.
    We've not been able to do this and have only managed to export the current period into the future, with no history. The dates in the InfoPackage are set to allow the historical periods to be exported.
    So we want to know if it is possible to export the past periods, and what else needs to be done to do this.
    thanks
    Paul
    Edited by: Paul_111 on May 16, 2011 4:01 PM

    Paul
    What kind of data are you exporting to the BI InfoCube (DP time series, SNP time series, etc.)?
    Also, I am confused - in the first line you say that "we have been exporting live cache data to a backup cube without any issues", and then you go on to say that you "have only managed to export the current period into the future, no history".
    Is this something that was working before, but is not working now?
    Rishi Menon

  • Verifying data coming from Cisco Unified CCX Historical Reports

    Good afternoon
    I (along with a number of other colleagues) am heavily involved in a project to take data from a wide variety of different sources and merge it all into one system so that we can report on it in a joined-up manner.
    The project comprises a number of different types of data source (such as Telephony or CRM). Within each data source type, we have various suppliers of those products. In the case of telephony data (which I'm looking into at the moment), the eventual aim is to make it possible to take data from any of the telephony platforms in use across our business (currently AVAYA, Alcatel and Cisco) and report on it in a uniform way, thus negating the need for an end-user to know what the Cisco definition of AHT is (for example).
    The switch I'm currently looking at is a managed switch, meaning that we don't have any sort of direct access to the back-end database(s). We could probably get it, but I suspect that the company that manages it for us would probably charge a small fortune for that. In view of this, I'm working with a number of the standard reports in the Cisco Unified CCX system. My plan (at the moment anyway) is to identify the reports that we can use that will best provide details of all calls into and out of our contact centres. I'd be looking to get the exact details of each individual call, which could then be rolled up into manageable intervals (such as 15-minute or 30-minute).
    Before I go much further, I'd like to be clear on something: I'm a database developer rather than a telecoms engineer so if I ask something that appears to be obvious then I apologise in advance. I've got quite a bit of experience of working with the CTI system that sits on top of our AVAYA platform, but it's proving to be a bit of a wrench effectively "un-learning" that system so that I can make room in my head for the Cisco solution.
    So, what I've learned (or have guessed) so far is this:
    When I run the Application Performance Analysis report, the Application Names that are returned are effectively the Call Routes that are set up in the system. Each Call Route can be fed by one or more Called Number (which I understand to essentially be a DDI);
    The Application Summary Analysis report shows the same Application Name information as is shown in the Application Performance Report. However this report also shows the Called Number, thus providing slightly more information about the individual DDI being answered;
    My next plan is to try and run an agent-level report so that I can see exactly which calls each agent handled. This is where I've run into problems: I ran the CSQ - Agent Summary report for the whole of 17th October. I then ran the Agent Detail report for the same period, and exported it to CSV so that I could "play" with the data. The CSQ - Agent Summary report shows that a particular agent on a particular CSQ Name (ID) handled a total of 29 calls. However, if I filter the Agent Detail report for that agent and CSQ, I get a total of 30 calls, and for the life of me I am unable to identify where the extra call is coming from. Initially I'd thought it might be because the CSQ in question has two separate DDIs, but as far as I can see this makes no difference.
    I NEED to be 100% sure that when I'm importing the data from the Cisco reports into our system, I am then able to mimic the types of reports that are coming from Cisco, with the same figures. Therefore, if anyone can help me, I'll be extremely grateful.
    TIA
    Ian Henderson

    The "Cisco Unified CCX Historical Reports Scheduler" sits in my startup folder but seems like it doesn't "run" at startup. So I ran a "test":
    - Manually right-clicked the "Cisco Unified CCX Historical Reports Scheduler" icon in the startup folder and chose "Run as Administrator"
    I didn't log off or reboot PC and the reports are running again. I checked the "properties" of the icon and I did make sure it was already set to "Run this program as an Administrator" under the "Compatibility" tab.
    Not sure why it's not working...
    Thank you for any help you can provide....

  • Historic data migration (forms 6i to forms 11g)

    Hello,
    We have done a migration from Forms 6i to Forms 11g. We are facing a problem with the historic data for a file download/upload
    utility. In Forms 6i the upload/download was done using the OLE Container, which has become obsolete, the new technology being WebUtil.
    We had converted the historic data from LONG RAW to BLOB (by export/import and by TO_LOB), but when opening the documents we either get an error
    message or they fail to open at all. This issue exists for all types of documents: .doc, .docx, .html, .pdf. We are unable
    to open the documents after downloading them to local client machines.
    One option which works is to manually download the documents (pdf, doc, etc.) from the older Forms 6i version (OLE) and
    upload them to Forms 11g (WebUtil). Is there any way this can be automated?
    Thanks
    Ram

    Are you colleagues?
    OLE Containers in Oracle Forms 6i

  • Clearing all historical data from the graphs

    Hi all,
    Now that I have upgraded to 4.1.1c I want to clear all historical data and start from scratch with the graphs/stats/reporting. Is there a way to do such a thing? I could not find anything in the docs other than "clear statistics all", which seems unrelated.
    Thanks
    Adrian

    http://www.cisco.com/en/US/docs/app_ntwk_services/waas/waas/v411/configuration/guide/monitor.html#wp1043484
    Enclosed is the report monitoring and configuration guide.
    The clear statistics command clears all statistical counters for the given parameters. Use this command to start monitoring fresh statistical data for some or all features without losing cached objects or configurations.

  • CSR Historical Data not capturing data?

    Hi All,
    Last week (on Thursday) our Historical Data stopped capturing any details.
    I have captured a screen shot of the report. You can see that the agent's logged-in time is 111:47:23 (at 3:47:23 pm EST). Any help fixing the issue would be greatly appreciated!
    Thanks in advance!
    Craig

    Hi Craig,
    Subscriber Goes Down
    When the subscriber goes down for more than the 2- or 4-day retention period, reinitialize the subscriber in CRS Administration (Datastore Control Center web page) and reinitialize the subscription for all the datastores.
    http://www.cisco.com/en/US/docs/voice_ip_comm/cust_contact/contact_center/crs/express_5_0/maintenance/admin/crs501ag.pdf
    Access the Datastore Control Center by selecting System > Datastore Control Center from the CRS Administration menu bar.
    Reinit Subscriber - Click this button to reinitialize the subscriber with a copy of data from the Publisher. (This causes the data on the subscriber to be overwritten by the data from the Publisher.)
    Note: Only use this button if you have determined that the Subscriber needs this data from the Publisher (i.e. the Subscriber and the Publisher are not synchronized).
    Hope this helps.
    Anand
    Please rate all helpful posts by clicking on the stars below the helpful posts !!

  • Sliding window for historical data purge in multiple related tables

    All,
    It is a well-known question how to efficiently back up and purge historical data based on a sliding window.
    I have a group of tables that all have to be backed up and purged based on a sliding time window. These tables have FKs relating them to each other, and these FKs are not necessarily the timestamp column. I am considering partitioning all of these tables on the timestamp column, so I can export the out-of-date partitions and then drop them. The price I pay for this design is that the timestamp column is duplicated many times among the parent table, child tables and grand-child tables, even though the value is the same, because I have to partition every table on this column.
    It's very much like the statspack tables: one stats$snapshot table and many child tables storing the actual statistic data. I am wondering how statspack.purge does this, since using DELETE statements is very inefficient and time-consuming. In the statspack tables, snap_time is stored only in stats$snapshot, not in every child table, and they are not partitioned, so I guess the procedure uses DELETE statements.
    Any thoughts on other good design options? Or how would you optimize backup and purge of the statspack tables' historical data? Thanks!
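    To make the sliding-window idea concrete, here is a small sketch (hypothetical partition naming, assuming one range partition per month named P_YYYYMM) that works out which partitions have aged out of the window and emits the corresponding drop DDL as strings, to be run after each partition has been exported:

```python
from datetime import date

def expired_partitions(partitions, today, window_months):
    """Given partition names like 'P_201105', return those older than
    window_months relative to today's month, oldest first."""
    cutoff = (today.year * 12 + today.month - 1) - window_months
    expired = []
    for name in sorted(partitions):
        year, month = int(name[2:6]), int(name[6:8])
        if (year * 12 + month - 1) < cutoff:
            expired.append(name)
    return expired

def drop_statements(table, partitions):
    # One exportable unit per partition; drop only after the export succeeds.
    return [f"ALTER TABLE {table} DROP PARTITION {p}" for p in partitions]
```

    Because every table in the group is partitioned on the same timestamp column, the same expired-partition list applies to the parent, child and grand-child tables alike, which is exactly what makes the drop cheap compared to cascading DELETEs.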

    hey oracle gurus, any thoughts?

  • ORACLE BAM - Historical data

    Hi,
    Environment : Oracle Webcenter 11.1.1.7 Jdeveloper 11.1.1.7
    Help : In BAM, if I want to see data from 6 months back, what do I have to do with the Oracle BAM server to achieve this requirement?
    Thanks in advance,
    Regards,
    Karthigeyan.v

    The aim of BAM is not "historical dashboards". BAM is Business Activity Monitoring = real-time monitoring.
    If you need the historical data from BAM you can do frequent exports.
    The steps to do the export:
    http://docs.oracle.com/cd/E14571_01/integration.1111/e10224/bam_app_icommand.htm#BABCCDFF
    Greetings,

  • Historical Data Maintenance

    Dear Members,
    This is my second post in the forum. Let me explain the scenario first,
    "We have 2 tools - Tool1 and Tool2 - which point to 2 different databases - db1 and db2 respectively. Currently, db1's performance is very poor due to the huge data volume. We want to keep only the latest 18 months of data in db1. Data older than 18 months should remain in db2 (in read-only mode), which Tool2 connects to. So whenever I need historical data, I'll use Tool2. At regular intervals the data from db1 should move to db2."
    My idea is to use partitioning and a logical standby. At the end of each month, the oldest month's data would be moved to db2. Please let me know whether this is feasible for the above concept. If so, how should I implement it, and if not, what would be the right solution?
    Regards,
    Mani
    TCS

    Partitioning is great on the source side (assuming you partition by date, of course).
    I am not sure how logical standby would help on the destination. The point of logical standby is to keep the standby database up to date with the primary, so the standby database would not be read only, it would be constantly applying transactions from the primary. And when you drop a partition on the primary, you would drop the partition on the standby, so the standby wouldn't maintain history.
    Instead of logical standby, you could use Streams to replicate transactions and configure Streams to ignore certain DDL operations like partition drops. That would allow you to retain history on db2 but wouldn't give you a read-only db2 database.
    You could potentially do partition exchange in db1 at a regular interval, moving the data you want to remove into a non-partitioned staging table, move that table to db2 (via export/import, transportable tablespaces, etc), and do a partition exchange to load the data into the partitioned table on db2. That gives you a read only db2 and lets you retain history, but requires some work to move the data around every month.
    Of course, if you decide to partition db1, assuming you did it correctly, I would tend to expect that the performance problems would go away (or at least that archiving the old data wouldn't affect performance any longer). One of the points of partitioning is that Oracle can then do partition elimination for your queries so that it only needs to look at the current partition if that's all tool1 is interested in. So perhaps all you need to do is partition db1 and you don't need db2 at all.
    Justin
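    Justin's partition-exchange approach can be sketched as a generated sequence of statements. The table, partition and staging-table names below are hypothetical, and the transport step between the two databases (export/import or transportable tablespaces) is deliberately left as a comment:

```python
def archive_month_statements(table, partition, staging):
    """Sketch of the monthly archival sequence: exchange the aged partition
    out of the partitioned table on db1 into a staging table, transport it,
    then exchange it into the history table on db2."""
    db1 = [
        f"ALTER TABLE {table} EXCHANGE PARTITION {partition} WITH TABLE {staging}",
        f"-- transport {staging} to db2 (export/import or transportable tablespace)",
        f"ALTER TABLE {table} DROP PARTITION {partition}",  # now-empty partition
    ]
    db2 = [
        f"ALTER TABLE {table}_hist EXCHANGE PARTITION {partition} WITH TABLE {staging}",
    ]
    return db1, db2
```

    The exchange itself is a data-dictionary operation, which is why this scales so much better than DELETE-based purging.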

  • Historical data from ACE20 load balancer modules

    Hello,
    I am trying to get some historical data from our ACE20 load balancer modules which are housed in a 6504, but cannot seem to find where to get this information from. I would like to get information such as memory used, CPU active connections, throughput, http requests etc over the past month, is there any way that I can do this? I am using ANM 2.0
    Thanks

    Hello,
    Below are the steps to load from DB Connect.
    Just go to RSA1 -> Modeling -> Source System and double-click on the source system. On the right-hand side, right-click on the topmost node. It will take you to a screen where you can give the table name (if you know it) or just execute, and it will display all the tables you have in Oracle. There, double-click on the selected table, select the fields required, and generate the DataSource. Then come back to RSA1 -> Source System and replicate the DataSource, assign an InfoSource as usual, create an InfoPackage and start the load.
    Note that this doesn't support the delta process.
    This document could help:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/2f0fea94-0501-0010-829c-d6b5c2ae5e40
    Regards,
    Dhanya

  • Hyperion Enterprise historic data access (tool)

    we are moving from Hyperion Enterprise to SAP SEM.
    As the historic data won't be migrated, we are looking for the best and easiest way to access that data in the future.
    Is there an option that doesn't require keeping Enterprise? Reading the data with another tool where no infrastructure is needed?
    Has anyone done something similar already?
    Thanks
    Jonas

    for anyone interested:
    I talked to Oracle and they said other customers were running a virtual server until it was no longer needed.
    So there is no other option than running the system as long as you need it, or exporting everything to Excel.
    Jonas

  • Cisco Unified CCX Historical Reports 8.5 - question

    Hi all,
    Have a question regarding UCCX Historical Reports. I ran daily "Agent Not Ready Reason Code" reports for a complete week and exported them to Excel. Excel shows a sum for the daily values that differs from what UCCX shows in the weekly report.
    Although it is a slight difference, I would like to know why UCCX and Excel display different values. Would it be due to the way UCCX handles decimal values?
    The image below shows the sum for the complete week and each of the daily values.
    This is what Excel displays as a sum for the same daily values:
    Thanks so much for the help you can provide
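    One common cause of this kind of discrepancy (an assumption here, since the report internals aren't documented in the thread): the weekly report may sum the underlying unrounded values and round once, while Excel sums the already-rounded daily display values, rounding many times. A quick illustration with made-up values:

```python
def round_then_sum(values, places=1):
    """Sum values that were already rounded for display (what Excel sees)."""
    return round(sum(round(v, places) for v in values), places)

def sum_then_round(values, places=1):
    """Round once after summing the raw values (what a weekly report may do)."""
    return round(sum(values), places)
```

    Five daily values of 1.24 display as 1.2 each, so Excel sums them to 6.0, while a single rounding of the true total 6.2 stays 6.2 - a small but visible mismatch.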


  • Copying historical data into planning area

    Hi,
            "For performance reasons, SAP recommends that you copy historical data from an infocube to a timeseries instead of reading it directly from the infocube."
    Please correct me if I am wrong:
    timeseries in the above context is the planning area.
    I create a generic export datasource from MSDP_ADMIN, giving it the name "9aplan", and it should generate an infosource in RSA1 > InfoSources > Unassigned Nodes as 9aplan.
    From here on, how can I load the history from the infocube (let's say, sales) to the infosource?
    Should I create a transactional (real-time) infocube and then load data into it from both SALES and 9aplan?
    Then how can the system access this cube when planning? Where is that specified?
    Thank you.

    Hi Vishu,
    A planning area is the structure - you need to initialise the timeseries against a planning version, which allows the data in the key figures to be stored.
    Loading data from the infocube to the planning area timeseries does not require any BW structure. Just use transaction /SAPAPO/TSCUBE to copy the sales history data from the cube to the required key figure in the planning area (for the given version - most often 000).
    Hope this answers your question.
    Thanks,
    Somnath
