Retrieve raw captured data after report generation crash

Hi everyone
Yesterday we were running load tests on a server, during which I created and scheduled a user-defined data collector set based on the System Performance collector set. The data capture ran for two hours and then proceeded to generate the report. During this process I observed that Tracerpt.exe was slowly but steadily consuming memory. Report generation was taking a while, so I came back to it 15 minutes later, only to discover that the process had ended but no report could be found.
I'm assuming the process crashed.
Is there a way to retrieve the raw captured data and attempt to regenerate the report? Where is the raw data temporarily stored?
Is there any way to retrieve the missing information?
Any help is appreciated.
Ernie Prescott
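If the raw capture files (.etl) survived the crash, it may be possible to re-run Tracerpt against them by hand; data collector sets normally write their raw output under the collector's root path (C:\PerfLogs by default) before report generation starts. A minimal sketch, assuming the files are still there -- the paths and output names below are placeholders, not anything the collector guarantees:

```python
import subprocess
from pathlib import Path

def build_tracerpt_cmd(etl_files, out_dir):
    """Build a tracerpt command line that regenerates a report from
    raw .etl capture files. Output file names are arbitrary choices."""
    out = Path(out_dir)
    return [
        "tracerpt", *map(str, etl_files),
        "-o", str(out / "dumpfile.xml"),      # raw event dump
        "-report", str(out / "report.html"),  # regenerated report
        "-f", "html",                         # report format
    ]

# Hypothetical example: raw files left under the default PerfLogs root.
root = Path(r"C:\PerfLogs\Admin")
etl_files = sorted(root.rglob("*.etl")) if root.exists() else []
cmd = build_tracerpt_cmd(etl_files, r"C:\PerfLogs\Recovered")
# subprocess.run(cmd, check=True)  # uncomment to actually run it
```

If Tracerpt crashes again on the same input, running it with only a subset of the .etl files can help narrow down which capture is the problem.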

Suppressing result rows can be done once in the Query Designer itself; you don't need to do it every time after the report is generated.
If you always expect some 10 free characteristics, I suggest inserting them in the Rows pane in the Query Designer rather than in the Analyzer. That way your report layout will already carry all the settings you would otherwise apply after the report is generated in Excel; you get the desired layout the first time you run the report.
Suppressing repeated key values can also be done once in the Query Designer.
By doing the above, your report execution will be less tedious and your effort will be minimized.
Overall, what I have understood is that you are applying all these settings in the Analyzer after retrieving a huge volume of data, which is what gives you the crash error.
With the steps above, you can run your report with all settings pre-defined.

Similar Messages

  • Download OBIEE report followed with date of report generation

    Hi All,
    We have the requirement that whenever the user generates the report and tries downloading, the report should be downloaded into the desired format with name as report name+date of report generation automatically.
May I know if this is feasible, rather than the user entering the name?
    Thanks Shravan

I know this post is very old, but my answer might help in the future.
    Check the following link:
    http://123obi.com/2011/10/obiee-11g-printable-pdf-in-landscape-format/
    Regards,
    Kalyan Chukkapalli
    http://123obi.com

  • How to correct close database connection after report generation

    Hello
I have a problem with a database connection that stays alive after report creation and closing. How do I properly close the connection to the database?
    Best regards
    Edited by: punkers84 on Jun 17, 2011 10:38 AM

That's what I am doing... after viewing the report, I call the close method in the window-closing event of the container window, but the connection is still open. I had a lot of other issues with my JDBC driver; after downgrading to an older version those issues were resolved. Only this one is still there! Is there any other way to close the connection (like using dbcontroller, etc.)?
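The thread is about JDBC, but the underlying pattern is language-agnostic: the code that opens the connection should also close it, in a finally block (or equivalent), rather than relying on the viewer or window events. A minimal sketch of that idea in Python DB-API terms, with sqlite3 standing in for whatever driver the report engine actually uses:

```python
import sqlite3

def run_report(db_path, query):
    """Open a connection, use it, and guarantee it is closed even if
    report generation fails. sqlite3 is a stand-in for the real driver."""
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(query).fetchall()
        return rows               # hand the results to the viewer
    finally:
        conn.close()              # always runs, crash or not

rows = run_report(":memory:", "SELECT 1 AS answer")
```

In Java the same shape is a try-with-resources block around the `Connection`; the point is that closing happens deterministically in the data-access code, not in a UI callback that may never fire.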

  • Date of report generation in PDF report

    How can I get the date the report was generated in the PDF output?
    I'm using oc4j.

    Darren,
    I'm using the following in BI Publisher reports.
    Report Date: <?xdoxslt:sysdate('DD-MON-YYYY')?>
    Regards,
    Todd

  • How to print long raw text data in report output in single line?

    Hi All,
I have a requirement where I need to print raw comma-separated text data in the report output, which the end user will open in Excel and sort as required. I cannot directly generate Excel output.
There is a huge set of data, and each row from the report query should be printed on a single line; it should not wrap onto the next line.
I tried extending the report to 240 characters, but some of the text data is still being printed on the next line.
    Please share your view if someone has any solution on this issue.
    Thanks in Advance.
    Arun

    Make the report even wider. By default a report layout can be 10 pages wide. If you need more, change the "Max. Horizontal Body Pages" property, and extend your layout too.
IMHO, I wouldn't even use Reports to create a CSV file. UTL_FILE or a SQL*Plus script that spools to a file are better options, I think.
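In the same spirit as the UTL_FILE / SQL*Plus spool suggestion, any small script that writes the rows itself avoids page-width limits entirely, because each record is emitted as exactly one line. A hedged sketch in Python (a swapped-in approach, not Oracle Reports; the file name and columns are placeholders):

```python
import csv

def write_report_csv(path, header, rows):
    """Write comma-separated rows directly to a file. Each row lands
    on exactly one line no matter how wide it is, so there is no
    'Max. Horizontal Body Pages' limit to fight."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)

# Hypothetical data standing in for the report query's result set.
write_report_csv("report.csv",
                 ["id", "description"],
                 [(1, "a very long comment " * 20), (2, "short")])
```

The csv module also takes care of quoting fields that contain commas or newlines, which a hand-rolled spool script would have to handle itself.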

  • How do I restore old emails from recovered data after hard drive crash?

Hi all - I used Data Rescue II to recover data from a crashed hard drive. It seems to have worked pretty well, as the file structure seems to be mostly intact. Now I'm trying to find the old emails and get them back into my Inbox/Sent mailboxes where they belong. Any help is surely appreciated - thanks

    Hi Tommybanjo.
    Mail stores the messages in .emlx files, but you cannot import individual messages directly, only entire mailboxes. In order to import the recovered .emlx files back into Mail, they must be organized as if they were in an .mbox mailbox, i.e. they must be in a Messages folder within an .mbox folder. You may create a folder called Recovered.mbox, then a Messages folder within this folder, then put the .emlx files within the Messages folder, then import that.
    In Mail, do File > Import Mailboxes, choose Mail for Mac OS X as the data format, and follow the instructions. Note that Mail wants you to select the folder that contains the mailboxes to be imported (i.e. the folder where the .mbox folders to be imported reside), not the .mbox folders themselves nor the Messages folders within them.
    Given the circumstances, however, there could be something wrong with one or more of the recovered messages and that might cause the import to fail. If that’s the case, you may try organizing the recovered messages in several smaller mailboxes and try to identify the messages that Mail has trouble with. Alternatively, you may try using emlx to mbox Converter or Emailchemy to convert the .emlx files to standard mbox format, then import that in Mail as Other.

  • SQL Loader deletes data, after reporting a file not found error

    I have several control files beginning:
    LOAD DATA
    INFILE <dataFile>
    REPLACE INTO TABLE <tableName>
    FIELDS TERMINATED BY '<separator>'
    When running SQL Loader, in the case of one particular control file, if the file referenced does not exist, SQL Loader first reports that the file could not be found, but then proceeds to delete all the data in the table in which the import was meant to take place. The corresponding log file reveals that the file could not be found, but also states that 0 records were loaded and 0 records were skipped.
In the case of all other control files, if the file is not found, the log files simply report this exception but do not show any statistics about the number of records loaded/skipped, nor does SQL Loader delete the data in any of the referenced tables. This is obviously the expected behaviour.
    Why is SQL Loader deleting the data referenced by one particular control file, even though this file does not exist and the corresponding log file has correctly picked up on this?
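One thing worth knowing about REPLACE: SQL*Loader implements it as a SQL DELETE of every row in the target table, issued at the start of the load, which would explain how the table can end up empty even though 0 records were loaded. Whether that delete fires before the missing-file check appears to vary, but if the existing rows must survive a bad input file, APPEND avoids the destructive delete altogether. A sketch of the same control file with placeholder names:

```sql
LOAD DATA
INFILE 'example.dat'
APPEND INTO TABLE example_table
FIELDS TERMINATED BY ','
(col1, col2)
```

With APPEND, duplicate handling then has to happen elsewhere (e.g. a staging table plus a merge), so this is a trade-off rather than a drop-in fix.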

In the resource name box of your file model, when you push the search button ("..."), do you see the file?
The problem can occur when you type the path directly instead of selecting the file with the assistant.
Try this.
I also suspect that you can't see the data by right-clicking and selecting View Data?
Let me know how it goes.

  • Retrieving raw XML data

    Hello,
    I'm working with the Oracle Content Server 10gR3 and Site Studio. I'm using region definitions and contributor data files to hold the content. For some cases I would like to include the raw content from these data files into the rendered web page. This would be similar to the site studio contributor, that lets you edit the content of the xml data files.
The problem is, when I use ssIncludeXml, all the idoc script commands are evaluated. But I would like to get the original data with the idoc commands intact. Is there any service I could call to get the raw data? Site Studio Contributor uses the "SS_GET_CONTRIBUTOR_CONFIG" service, but as this returns JSON-like data I can't use it in Site Studio either.
    Thanks,
    Andreas

    Hi Tim,
thanks for your answer. Using AJAX would be the way the contributor window does it, by calling the SS_GET_CONTRIBUTOR_CONFIG service. It's correct that I should then be able to use the JSON data to extract the information I want.
    But that is too late. I want to process that data with idoc when the content server renders the page. Like including placeholder data or a fragment snippet just without evaluating it.
    Cheers,
    Andreas

  • Crystal Reports "Unknown Source" Stack Trace after report generation

    Getting java.lang.NullPointerException with Crystal ReportViewer.W(UnKnown Source)  stack trace... 
    java.lang.NullPointerException
    at com.crystaldecisions.report.web.viewer.CrystalReportViewer.goto(Unknown Source)
    at com.crystaldecisions.report.web.ServerControl.a(Unknown Source)
    at com.crystaldecisions.report.web.ServerControl.processHttpRequest(Unknown Source)
    at jsp_servlet._crystalviewer.__viewer._jspService(__viewer.java:105)
We are running into this issue only after the report is generated, while performing a search or navigating to a different page...
Any help to resolve this will be greatly appreciated...
    Thanks
    ND

    The ODBC datasource must be configured as a System datasource.  The database was configured as a User datasource so it was only visible on the local computer.  That's why Business Views Manager could access the database.

  • Report Generation Toolkit files in incorrect location after installing deployment

    TestStand 2010 SP1 + LabVIEW 2011 + Report Generation Tookit
    After installing a deployment, the location of NI_Excel.lvclass is expected to be at ...\VIs for <project name>\SupportVIs\Excel\NI_Excel.lvclass (see attachment)
    but the path of the file is located at ...\VIs for <project name>\SupportVIs\data\NI_Excel.lvclass
    In the deployment image, under the target folder, the file is in ...\VIs for <project name>\SupportVIs\data\NI_Excel.lvclass
    and the lvproj finds the file.
    The lvproj in the installation finds the file in ...\VIs for <project name>\SupportVIs\data\NI_Excel.lvclass
    Renaming the folder from data to Excel fixes this problem.
    Has anyone else run into this problem?
    Attachments:
    Report Generation Toolkit Error.png ‏15 KB

I am facing the same issue when I create the TestStand deployment through the utility using TS2012. I thought of adding the file to the TestStand workspace and setting the destination to the SupportVIs directory, but even that didn't work, as it was copying all the Excel VIs into two locations.
Even adding the directory path of the "\data" folder to the TestStand search directories option does not resolve the issue, and the LabVIEW VIs are not able to locate it.

  • Report generation is saving data to a word template

I created a template for a program to write test result data to when prompted by the user. My problem is that after the template is written to and closed, it asks if the new data should be saved. I don't want to give the user that option, because it actually saves to the template, not to a document; when another test is done and a report is requested, it doesn't overwrite the data from the previous results, it adds to it. Is there some sort of Word security option, or a VI that might help prevent users from saving data permanently and overwriting previous data?

    Harene,
    The ActiveX call to Quit comes with a parameter to Save Changes (T or F). I would suggest making that the last call you make. This is how the Dispose Report.vi is made in the Report Generation Toolkit for Microsoft Office. (This was checked on my machine with Office 2000.)
    Randy Hoskin
    Applications Engineer
    National Instruments
    http://www.ni.com/ask
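The ActiveX call described above can be reproduced from any COM-capable language. A hedged Python sketch via pywin32 terms (the `word_app` object is assumed to be a `win32com.client.Dispatch("Word.Application")` created elsewhere; the constants are Word's documented WdSaveOptions values):

```python
def dispose_report(word_app, save_changes=False):
    """Quit Word through its COM automation interface, passing the
    SaveChanges flag explicitly so the user is never prompted and the
    template is left untouched. Returns the flag that was sent."""
    WD_SAVE_CHANGES = -1       # wdSaveChanges
    WD_DO_NOT_SAVE = 0         # wdDoNotSaveChanges
    flag = WD_SAVE_CHANGES if save_changes else WD_DO_NOT_SAVE
    word_app.Quit(SaveChanges=flag)
    return flag
```

Making this the last call, as Randy suggests, means the save prompt never appears regardless of what the user did in the document.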

  • Problem with writing continuous data to excel using using Report Generation vi's

    Hey Everyone,
I am trying to read data from the DAQ and write it to Excel continuously using the Report Generation VIs.
But when I run the VI, it writes only one iteration of the while loop (which gathers data from the DAQ continuously), and it doesn't append the data to the same file when I run it again after stopping the VI.
    I have attached the VI i created. Please let me know if you have any idea to solve this issue. 
    Thanks
    Attachments:
    sample 5.vi ‏35 KB

    There are two problems with your VI.  First, the basic logic of writing/appending to a file (Excel, TDMS, anything) should go something like this:  Open the file, position yourself at the end of the file, then, in the loop, acquire data and write it to the file until you are finished with data acquisition.  When you exit the acquire/write to file loop, then close the file.  In particular, the opening and the closing of the file should not be inside the loop.
    As others have pointed out, writing to Excel might not be optimal, particularly if you are acquiring data at a high rate (and would therefore be writing a lot of data). We actually use Excel in our data acquisition routine, mainly reading from a WorkSheet to get the parameters of a particular stimulus, but also writing the outcome of the response to the stimulus.  As it happens, our "acquisition rate" in this example is on the order of several samples per minute, so there's no problem using Excel (we also sample 16 channels of analog data at 1 KHz -- this gets written as a binary file).
    Second, if you really do want to use Excel, use the (existing) Excel file to which you want to append as the "template" argument of the New Report function.  Then use the Excel Get Last Row function to position yourself at "end of file", as noted above.
    Good Luck.
    Bob Schor

  • How do I put data into a template using the Labview report generation toolkit for Microsoft Office?

I am running Lookout 5.0 and have recently purchased the LabVIEW Report Generation Toolkit for Microsoft Office to create reports from my Lookout logged data. Since I have never used LabVIEW, I am having problems. I tried following the tutorials, but they do not seem to be examples of what I want to do.
I log rainfall totals (one spreadsheet) in Lookout from 40 different sites in 5-minute increments. I copy these totals and paste them into another spreadsheet, which sums them up to give me hourly totals, and then paste those totals into a spreadsheet for distribution.
In LabVIEW I create a new report and use the distribution sheet as my template, but how do I complete
the steps of loading the raw 5-minute data into LabVIEW, pasting it into the hourly total spreadsheet, and then transferring those totals into the distribution template?
    I have been trying to figure this out for over a week, and I am getting nowhere.
    Any response would be appreciated.
    Thanks
    Jason P
    Jason Phillips

Lookout saves the files in .csv form, which can be opened in Excel. I did make some progress by using the "Append Table to Report" VI, which allowed me to put values into an array; those values were then entered into my template by my report VI.
Where I am stuck now is that I want to be able to put values into my template from a .csv file, not from an array I have to manually type numbers into.
    Once those values are in my template I want to pull summed values from the template and place them into a final excel file for printing.
    I have attached examples of the files I am working with to help you better understand what I am trying to do.
    I hope that makes sense.
    Jason Phillips
    Attachments:
    HourlyTotalsTemplate.xls ‏120 KB
    eb_rain_gauge_ss.csv ‏23 KB
    EastBankHourlyRainReport.xls ‏28 KB
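The intermediate copy-and-paste step (5-minute site readings summed into hourly totals) is exactly the kind of thing a small script can do before the data ever reaches a template. A hedged Python sketch, assuming a hypothetical CSV layout of a timestamp column followed by one column per site (the real eb_rain_gauge_ss.csv layout may differ):

```python
import csv
from collections import defaultdict

def hourly_totals(csv_path):
    """Sum 5-minute rainfall readings into hourly totals per site.
    Assumed layout: header row, then rows whose first column is a
    timestamp like '2004-06-01 13:05' and whose remaining columns
    are one reading per site."""
    totals = defaultdict(lambda: defaultdict(float))
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        sites = next(reader)[1:]          # header: site names
        for row in reader:
            hour = row[0][:13]            # 'YYYY-MM-DD HH'
            for site, value in zip(sites, row[1:]):
                totals[hour][site] += float(value)
    return totals
```

The resulting per-hour dictionary can then be written out as a table (or fed to "Append Table to Report") instead of pasting between spreadsheets by hand.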

  • Retrieving data/Opening reports too slow

    Hi Experts,
We have a Finance application with 12 dimensions. Two of those dimensions hold a lot of members: Product has 16,000 members with 7 hierarchies, and the Customer dimension has 40,000 members with 3 hierarchies. During the first part of our development we didn't encounter any trouble opening reports, even for a template that expands to almost 9,000 rows of members. After uploading three months of data with almost 400,000 lines each, we are now having trouble opening reports or even retrieving data using ad hoc reports.
We've already tried various performance tuning and there is still no improvement. We also deleted the uploaded data to see if it was causing the low performance, but the performance is still the same. Any suggestion as to what's causing this?
By the way, we're currently using BPC 7.0 MS SP03 in a single-server setup. Server memory is 10GB and hard disk free space is 100GB.
    Thanks,
    Marvin

    Hi Sorin,
Sorry, I forgot to mention that I've already tried removing other hierarchies, leaving only two, to test whether there is an improvement in performance, but it's still the same. Other applications, which are far less complicated than the Finance application and do not share the Product and Customer dimensions, are also experiencing slow retrieval of data in reports. And yes, I built those hierarchies for reporting reasons; my first option really was to use properties, but the client's requirement is to be able to drill down each grouping, which is why we set them up as hierarchies.
    We've already tried creating partitions but still there's only a slight improvement in performance.
As for the reports, we created a simple EVDRE to test whether the design of the report was causing it, but even a simple EVDRE hangs.
    Thanks,
    Marvin

  • Errors found retrieving data after SAP Patching

    Hi Team,
I received the error "errors found retrieving data" when I click the 'Refresh' button to display data for my reports. Refer to the attached screen.
I'm receiving it when trying to refresh data in 'Reports' and 'Input Schedule' templates from the BPC Excel client. This happened after we did the SAP patching; earlier it was working fine.
    Appreciate your help in advance
    Thank you
    Regards
    Raghu

    Hi,
Try to process all dimensions.
If that doesn't help, try to find more details under EPM --> More --> Log.
Hope it helps.
    regards,
    Raju
