Large Data Logging Question
I am working on a documentary and have about 50 hours of footage recorded over MANY years. Does anyone have any tips as to what would be the most efficient way to log this material?
Thanks,
D
One clip at a time?
Seriously though... 50 hours is not insurmountable by any stretch. What you've got to do is get organized before you start logging.
Try to arrange the tapes in some sort of order: by subject, by date, whatever works for you. And before you even boot up the computer, give each of those tapes a number: write it on the box and on the tape inside the box. Make a big chart that cross-references tape numbers with events, people interviewed, etc.
I'd tend to log into bins set up by some of the same criteria you use in numbering: by subject, or location, etc.
Don't despair and don't forget that logging is part of the editorial process. The more time you spend now getting organized, the faster you'll go in two weeks when you start actually cutting.
Similar Messages
-
With journaling, I have found that my computer is saving a large amount of data, logs of all the changes I make to files; how can I clean up these logs?
For example, in Notes I have written three notes; however, if I click on 'All On My Mac' in the sidebar, I see about 10 different versions of each note I make. It saves a version every time I add or delete a sentence.
I also noticed that when I write an email, Mail saves about 10 or more draft versions before the final is sent.
I understand that all this journaling provides a level of security and prevents data loss; but I was wondering, is there a function to clean up journal logs once in a while?
Thanks
Roz
Are you using Microsoft Word? Microsoft thinks its users are idiots. They put up a lot of pointless messages that annoy and worry users. I have seen this message from Microsoft Word. It's annoying.
As BDaqua points out...
When you copy information via Edit > Copy (command + c) or Edit > Cut (command + x), you place the information on the clipboard. When you paste information, Edit > Paste (command + v), you copy information from the clipboard to your data file.
If you Edit > Cut (command + x) and you do not paste the information and you quit Word, you could be losing information. Microsoft is very worried about this. When you quit Word, Microsoft checks if there is information on the clipboard and, if so, Microsoft puts out this message.
You should be saving your work more than once a day. I'd save every 5 minutes; command + s does a save.
Robert -
Help please: DAQWare or data logging software (multichannel data capture) questions
I am writing regarding DAQWare, data logging software, and the PC-LPM-16
board, which a group of us is trying to come to grips with. This is rather
lengthy, I'm afraid...
We are a group of college students in a computer programmer/analyst
program. We have a course that involves doing a project to go through the
software development lifecycle (or as much of it as can be done in a term).
For our project we teamed up with a group in the industrial hygiene
program; they have a computer (standalone) with a PC-LPM-16 (PnP) DAQ board
and the NI-DAQ software (all of this has been around for a few years, but
has not been used much), and our goal was to capture the data they are
measuring (atmospheric pollutants) to a Microsoft Access database.
We are not familiar with DAQ systems, but were shown the software they had
installed. This included DAQWare (appears to be version 2.0). We thought
we could use DAQWare to log data to a text file, which we then would import
into the Access database. We came up with VBA code to parse the ASCII text
file created, but have run into one problem: DAQWare does not timestamp
each sample, but only puts a single timestamp in the header information.
Now we could extrapolate the time for each line based on the sample rate,
except that sometimes it seems to put in 8 values per sample, sometimes 12
values. For instance, in the sample of a text file below, the capture was
run for 1 minute at a sample rate of 60, and there were 480 measurements in
the file, or 8 lines of measurements per sample. (At a sample rate of 100,
we got 800 measurements in the file etc.) This was two weeks ago. Today,
we returned to capture more data, and DAQWare was logging 12 measurements
per sample (so at a sample rate of 100, we ended up with 1200 lines of
measurements). We do not know where this factor of 8 or 12 is coming from
or how it got changed in the two weeks between captures. We would like to
be able to have them just run DAQWare, for days on end if they wanted,
capture the data to file, then use our code to parse the text file into the
database, but if we cannot put timestamps on the readings, there seems no
point.
*** Example of a capture below... ***
DAQWare File
Version: 2.0
Number of Lines in Header: 10
Comment:
Date: 10-18-1999 Time: 10:01:39
Sample Rate: 60.0
Scan Rate: 0.0
File Format: 1
Samples per Channel: 480.0
Number of Channels: 5
O3 SO2 NOX NO2 NO
4 3 2 1 0
1 1 1 1 1
0.537 0.276 0.259 -0.017 0.015
0.537 0.273 0.261 -0.022 0.012
0.537 0.276 0.259 -0.037 0.012
0.537 0.276 0.264 -0.007 0.015
*** and so on... ***
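Since DAQWare stamps only the header, one workaround (a sketch, assuming the data rows are evenly spaced over the capture duration; the changing 8-vs-12 factor is exactly what puts that assumption at risk) is to spread the capture duration evenly across the rows:

```python
from datetime import datetime, timedelta

def row_timestamp(start, duration_s, total_rows, row_index):
    """Extrapolate a timestamp for one data row, assuming rows are
    spread evenly over the capture duration (start time and duration
    come from the DAQWare header)."""
    spacing = duration_s / total_rows
    return start + timedelta(seconds=row_index * spacing)

# Header above: Date 10-18-1999, Time 10:01:39, 1-minute capture, 480 rows.
start = datetime(1999, 10, 18, 10, 1, 39)
print(row_timestamp(start, 60.0, 480, 8))  # 8 rows in = 1 second after start
```

This sidesteps the unknown factor entirely by using only the total row count, which is always visible in the file.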
Regarding this attempt to use DAQWare:
1. Can you tell us what this factor (8 or 12, or however many more readings
than we would expect) means, and where it is read from or set?
2. Is there an updated version of DAQWare that we should download that
would be better?
3. Is there some other existing software that we can download to use to
perform basic logging of multichannel analog input to a text file (with
timestamp)?
I realise that the software comes with tools (DLL functions) that one can
use to create custom software to interface with the DAQ board, but in the
limited amount of time we have we are not sure we can learn enough about
these functions to write our own datalogging software.
However, to see if I could grasp the basics, I have looked through the
examples for Visual Basic and think that a program would run something like
this:
1. at the beginning, use LPM16_Calibrate to reset the board
2. configure acquisition with DAQ_Config
3. start acquisition with Lab_ISCAN_Start
4. loop to check if acquisition is done with Lab_ISCAN_Check
5. once sampling done, write data out to file
6. loop back up to 3 to get next lot of samples
Unfortunately, a sample app using Access and VBA I created came back with
an error telling me it could not find nidaq.dll (even though this is in the
Windows system directory); probably it is not a good idea to try to do this
in Access in the first place. I am going to have a go using Delphi, which
I know a bit better.
One thing I am unsure of is what the numbers will be coming out of
Lab_ISCAN; the reference file seems to indicate they will be binary
readings, not voltages, and I wonder how to correlate these with instrument
measurements (the instrumentation people were going to set it up so that
1 mV = 1 ppm).
Any help you can give to point us in the right direction would be
appreciated.
Sincerely, Marc Bell ([email protected]) -
Problem removing Diagnostic and Usage Data Log
I have had several iPod crashes and ironically that's not my problem. The problem is that the Diagnostic and Usage Data log is getting larger and larger. I have my iPod Touch 4th Gen. set to not send reports to Apple and I do not see any way to delete or edit it. Is there a way to delete the log or remove some of its lines of reports, or does it have to be sent to Apple to be removed?
I may be a little rusty on the rules here, but I didn't think hijacking a thread was allowed. If it were me, I would have started my own thread instead of taking the original question in a different direction. Just sayin'.
While I was looking for my answer I found yours.
http://support.apple.com/kb/HT2534 -
How can I perform data logging for a specific time??
hello everyone,
I am quite new to LabVIEW and I have a basic question regarding data logging. Currently I am using a cRIO-9074 and doing some data logging for my test. The data logging itself works OK so far.
But my problem is that I would like to write my data to a text file either for a specific time interval (e.g., 10 seconds) or for a specific amount of data (e.g., 500 samples). Can anyone give me some help regarding my problem? Attached you can find my RT.vi.
I would appreciate any help!
Regards
Yun
Attachments:
BP250 Encoder Position & Velocity (Host).vi 92 KB
Run your logging program for that time. When your program terminates, it will write all the data logged so far to the text file.
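Either limit can also be checked inside the loop itself; here is a Python sketch of the idea (read_sample is a hypothetical stand-in for the cRIO read; on the target the same check would sit inside the RT while loop):

```python
import time

def log_until_limit(read_sample, max_samples=500, max_seconds=10.0):
    """Collect samples until either limit is reached, whichever comes first.

    read_sample is a hypothetical stand-in for the DAQ read call."""
    samples = []
    deadline = time.monotonic() + max_seconds
    while len(samples) < max_samples and time.monotonic() < deadline:
        samples.append(read_sample())
    return samples

# With a fast fake source, the sample-count limit triggers first.
readings = log_until_limit(lambda: 0.0, max_samples=5, max_seconds=10.0)
print(len(readings))  # 5
```

Whichever condition fires first ends the acquisition, so the same loop covers both the 10-second and the 500-sample case.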
Kudos are always welcome if you got solution to some extent.
I need my difficulties because they are necessary to enjoy my success.
--Ranjeet -
How do I control a data log session with period and sample time?
I need a data logging system where the operator can select 2 logging parameters: Log Period and Sample Time. I also need a START and STOP button to control the logging session. For example, set the log period for 1 hour and the sampling time for 1 second. (I may be using the wrong jargon here.) In this case when the START button is clicked, the system starts logging for 1 second. An hour later, it logs data for another second, and so on until the operator clicks the STOP button. (I will also include a time limit so the logging session will automatically stop after a certain amount of time has elapsed.)
It’s important that when the STOP button is clicked, that the system promptly stops logging. I cannot have the operator wait for up to an hour.
Note that a logging session could last for several days. The application here involves a ship towing a barge at sea where they want to monitor and data log tow line tension. While the system is logging, I need the graph X-axis (autoscaled) to show the date and time. (I’m having trouble getting the graph to show the correct date and time.) For this application, I also need the system to promptly start data logging at a continuous high rate during alarm conditions.
Of course I need to archive the data and retrieve it later for analysis. I think this part I can handle.
Please make a recommendation for program control and provide sample code if you can. It's the program control concepts that I think I mostly need help with here. I also wish to use the Strip Chart Update Mode so the operator can easily view the entire logging session.
DAQ Hardware: Not Selected Yet
LabVIEW Version: 6.1 (Feel free to recommend a v7 solution because I need to soon get it anyway.)
Operating System: Win 2000
In summary:
How do I control a graphing (data log) session for both period and sample time?
How do I stop the session without having to wait for the period to end?
How do I automatically interrupt and control a session during alarm conditions?
Does it make a difference if there is more than one graph (or chart) involved where there are variable sample rates?
Thanks,
Dave
Hello Dave,
Sounds like you have quite the system to set up here. It doesn't look like you are doing anything terribly complicated. You should be able to modify different examples for the different parts of your application. Examples are always the best place to start.
For analog input, the "Cont Acq&Chart (buffered).vi" example is a great place to start. You can set the scan rate (scans/second) and how many different input channels you want to acquire. This example has its own stop button; it should be a simple matter to add a manual start button. To manually set how long the application runs, you could add a 100 ms delay to each iteration of the while loop (recommended anyway to allow the processor to multi-task) and add a control that sets the number
of iterations of the while loop.
For logging data, a great example is the "Cont Acq to File (binary).vi" example.
For different sample rates on different input lines, you could use two parallel loops, both running the first example mentioned above. The data would not be able to be displayed on the same graph, however.
If you have more specific questions about any of the different parts of your application, let me know and I'll be happy to look further into it.
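The prompt-stop requirement boils down to waiting on a stop signal rather than sleeping out the whole period. A minimal Python sketch of that control logic (names are illustrative; the LabVIEW equivalent would poll the STOP button inside the wait):

```python
import threading

def run_session(log_once, stop, period_s=3600.0):
    """Log one burst per period until `stop` is set.

    log_once is a hypothetical stand-in for one sample-time burst of
    logging; stop.wait() returns immediately when STOP is clicked, so
    the operator never waits out the remaining period.
    """
    while not stop.is_set():
        log_once()
        if stop.wait(timeout=period_s):
            break

# Demo: stop the session from within the third logging burst.
stop = threading.Event()
count = []
def burst():
    count.append(1)
    if len(count) == 3:
        stop.set()

run_session(burst, stop, period_s=0.01)
print(len(count))  # 3
```

An alarm condition can reuse the same event-driven wait: set a separate event from the alarm check and switch to the high-rate loop when it fires.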
Have a nice day!
Robert M
Applications Engineer
National Instruments
Robert Mortensen
Software Engineer
National Instruments -
Error while Exporting large data from Reportviewer on azure hosted website.
Hi,
I have a website hosted on Azure. I used the SSRS ReportViewer control to showcase my reports. While doing so I faced an issue.
Whenever I export a large amount of data as Excel/PDF/Word/TIFF, it abruptly throws the following error:
Error: Microsoft.Reporting.WebForms.ReportServerException: The remote server returned an error: (401) Unauthorized. ---> System.Net.WebException: The remote server returned an error: (401) Unauthorized.
at System.Net.HttpWebRequest.EndGetResponse(IAsyncResult asyncResult)
at Microsoft.Reporting.WebForms.SoapReportExecutionService.ServerUrlRequest(AbortState abortState, String url, Stream outputStream, String& mimeType, String& fileNameExtension)
--- End of inner exception stack trace ---
at Microsoft.Reporting.WebForms.SoapReportExecutionService.ServerUrlRequest(AbortState abortState, String url, Stream outputStream, String& mimeType, String& fileNameExtension)
at Microsoft.Reporting.WebForms.SoapReportExecutionService.Render(AbortState abortState, String reportPath, String executionId, String historyId, String format, XmlNodeList deviceInfo, NameValueCollection urlAccessParameters, Stream reportStream, String& mimeType, String& fileNameExtension)
at Microsoft.Reporting.WebForms.ServerReport.InternalRender(Boolean isAbortable, String format, String deviceInfo, NameValueCollection urlAccessParameters, Stream reportStream, String& mimeType, String& fileNameExtension)
at Microsoft.Reporting.WebForms.ServerReport.Render(String format, String deviceInfo, NameValueCollection urlAccessParameters, String& mimeType, String& fileNameExtension)
at Microsoft.Reporting.WebForms.ServerModeSession.RenderReport(String format, Boolean allowInternalRenderers, String deviceInfo, NameValueCollection additionalParams, Boolean cacheSecondaryStreamsForHtml, String& mimeType, String& fileExtension)
at Microsoft.Reporting.WebForms.ExportOperation.PerformOperation(NameValueCollection urlQuery, HttpResponse response)
at Microsoft.Reporting.WebForms.HttpHandler.ProcessRequest(HttpContext context)
at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)
It works locally (developer machine) or with less data, but it doesn't work with large data when published on Azure.
Any help will be appreciated.
Thanks.
Sorry, let me clarify my questions as they were ambiguous:
For a given set of input, does the request always take the same amount of time to fail? How long does it take?
When it works (e.g. on local machine using same input), how big is the output file that gets downloaded?
Also, if you can share your site name (directly or
indirectly), and the UTC time where you made an attempt, we may be able to get more info on our side. -
WCF service connection forcibly closed by the remote host for large data
Hello,
A WCF service is used to generate an Excel report. When the stored procedure returns large data, around 30,000 records, the service fails
to return the data. Below is the error log:
System.ServiceModel.CommunicationException: An error occurred while receiving the HTTP
response to <service url> This could be due to the service
endpoint binding not using the HTTP protocol. This could also be due to an HTTP request context being aborted by
the server (possibly due to the service shutting down). See server logs for more details. ---> System.Net.WebException:
The underlying connection was closed: An unexpected error occurred on a receive. ---> System.IO.IOException:
Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.
---> System.Net.Sockets.SocketException: An existing connection was forcibly closed by the remote host
at System.Net.Sockets.Socket.Receive(Byte[] buffer, Int32 offset, Int32 size, SocketFlags socketFlags)
at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size)
--- End of inner exception stack trace ---
at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size)
at System.Net.PooledStream.Read(Byte[] buffer, Int32 offset, Int32 size)
at System.Net.Connection.SyncRead(HttpWebRequest request, Boolean userRetrievedStream, Boolean probeRead)
--- End of inner exception stack trace ---
at System.Net.HttpWebRequest.GetResponse()
at System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan timeout).
--- End of inner exception stack trace ---
Server stack trace:
at System.ServiceModel.Channels.HttpChannelUtilities.ProcessGetResponseWebException(WebException webException, HttpWebRequest request, HttpAbortReason abortReason)
at System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan timeout)
at System.ServiceModel.Channels.RequestChannel.Request(Message message, TimeSpan timeout)
at System.ServiceModel.Dispatcher.RequestChannelBinder.Request(Message message, TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(IMethodCallMessage methodCall, ProxyOperationRuntime operation)
at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage message)
Exception rethrown at [0]:
at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
at IDataSetService.GetMastersData(Int32 tableID, String userID, String action, Int32 maxRecordLimit, Auditor& audit, DataSet& resultSet, Object[] FieldValues)
at SPARC.UI.Web.Entities.Reports.Framework.Presenters.MasterPresenter.GetDataSet(Int32 masterID, Object[] procParams, Auditor& audit, Int32 maxRecordLimit).
WEB CONFIG SETTINGS OF SERVICE
<httpRuntime maxRequestLength="2147483647" executionTimeout="360"/>
<binding name="BasicHttpBinding_Common" closeTimeout="10:00:00" openTimeout="10:00:00"
receiveTimeout="10:00:00" sendTimeout="10:00:00" allowCookies="false"
bypassProxyOnLocal="false" hostNameComparisonMode="StrongWildcard"
maxBufferSize="2147483647" maxBufferPoolSize="0" maxReceivedMessageSize="2147483647"
messageEncoding="Text" textEncoding="utf-8" transferMode="Buffered"
useDefaultWebProxy="true">
<readerQuotas maxDepth="2147483647"
maxStringContentLength="2147483647" maxArrayLength="2147483647"
maxBytesPerRead="2147483647" maxNameTableCharCount="2147483647" />
<security mode="None">
WEB CONFIG SETTINGS OF CLIENT
<httpRuntime maxRequestLength="2147483647" requestValidationMode="2.0"/>
<binding name="BasicHttpBinding_Common"
closeTimeout="10:00:00" openTimeout="10:00:00"
receiveTimeout="10:00:00" sendTimeout="10:00:00"
allowCookies="false" bypassProxyOnLocal="false"
hostNameComparisonMode="StrongWildcard" maxBufferSize="2147483647"
maxBufferPoolSize="2147483647" maxReceivedMessageSize="2147483647"
messageEncoding="Text" textEncoding="utf-8"
transferMode="Buffered" useDefaultWebProxy="true">
<readerQuotas
maxDepth="2147483647"
maxStringContentLength="2147483647"
maxArrayLength="2147483647"
maxBytesPerRead="2147483647"
maxNameTableCharCount="2147483647" />
Doing binding configuration on a WCF service to override the default settings is not done the same way it would be done in the client-side config file.
A custom binding must be used in the WCF service-side config to override the default binding settings on the WCF service side.
http://robbincremers.me/2012/01/01/wcf-custom-binding-by-configuration-and-by-binding-standardbindingelement-and-standardbindingcollectionelement/
The readerQuotas and everything else must be given in the custom binding to override any default settings on the WCF service side.
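For reference, a minimal sketch of such a service-side custom binding (the binding name and timeout values are illustrative, not taken from your config; see the linked post for the full pattern):

```xml
<bindings>
  <customBinding>
    <binding name="LargeDataCustomBinding"
             sendTimeout="00:10:00" receiveTimeout="00:10:00">
      <textMessageEncoding messageVersion="Soap11">
        <readerQuotas maxDepth="2147483647"
                      maxStringContentLength="2147483647"
                      maxArrayLength="2147483647"
                      maxBytesPerRead="2147483647"
                      maxNameTableCharCount="2147483647" />
      </textMessageEncoding>
      <httpTransport maxReceivedMessageSize="2147483647"
                     maxBufferSize="2147483647" />
    </binding>
  </customBinding>
</bindings>
```

Soap11 text encoding plus httpTransport matches what basicHttpBinding with security mode "None" gives you, so the client can keep its existing basicHttpBinding config.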
Also, you are posting to the wrong forum.
http://social.msdn.microsoft.com/Forums/vstudio/en-us/home?forum=wcf -
Error when opening large data forms
Hi,
We are working on a Workforce planning implementation. We have 2 large custom defined dimensions.
When opening large data forms we get a standard "Error has occurred" error. If we reduce the member selection the data form opens fine.
Is anyone aware of a setting that can be increased to open large data forms? I'm not referring to the "Warn if data form is larger than 3000 cells" setting.
I'm pretty sure there is a parameter that can be increased, but I can't find it.
Thanks for your help.
Seb
Hi Seb,
If you do find the magic parameter then let us know because I would be interested to know.
It is probably something like ALLOW_LARGE_FORMS = true :)
In the planning logs, is the error related to Planning or is it an Essbase-related error?
Is it failing due to the number of rows, or because it is going beyond the max of 256 columns?
Cheers
John
http://john-goodwin.blogspot.com/ -
Which is the best structure for start/stop data logging?
Hi everybody,
hope I can explain my problem good enough that anyone can help me:
I have a VI which continuously shows the voltage of an analog input of a DAQ device on a graph. Now I want to allow the user to start/stop data logging with a click on a button. That means a second loop writes the data to a spreadsheet file at a selected path every ms.
At the moment my VI works like this:
You run the VI and LabVIEW asks you once for the file path, then you can start and stop the data logging. But you can do it only once. If you want to log another file a certain time later, you have to close and open the whole VI again, which is not very professional...
My target is:
It is only necessary to start the VI once. Then you can select a new file path, log data, select another file path, log data again... and so on.
Which program structure is necessary? Can anyone help me as a LabVIEW beginner with that issue? I attached the VI if someone just wants to edit it...
Thank you for your help!! Markus
Attachments:
Logging voltage.vi 93 KB
screenshot.JPG 98 KB
@NaruF1 and GerdW
you are right, there is a mistake, both loop rates should be the same (10 ms) :-)
@NaruF1
yes, you understood correctly that the file dialog should appear every time the user wants to start writing a new log file. The voltage we measure is an analog signal, so there will be several interesting periods we want to save for later analysis in Excel.
To your 2nd point: if it works with an array, that will be fine. But it must be possible to log data for, let's say, 5 minutes, so won't the array be too large? (5 min at a 10 ms loop rate will be 30000 rows...)
attached is the VI, saved as LV2009..
Thanx a lot!
@GerdW
..you ask why I didn't create a structure like you recommended, with notifiers or queues? The simple answer would be that I am not familiar with all this... I just began to write my first LabVIEW program. I will have a look at the LV help for all that stuff... Thank you a lot
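For what it's worth, the queue-based producer/consumer structure GerdW mentions looks roughly like this (a Python sketch of the idea; in LabVIEW the two functions become two parallel while loops connected by a queue):

```python
import queue

def producer(q, read_voltage, n_samples):
    """Acquisition loop: push each reading, never blocking on file I/O."""
    for _ in range(n_samples):
        q.put(read_voltage())
    q.put(None)  # sentinel: tells the consumer the session is over

def consumer(q, write_line):
    """Logging loop: drain the queue to disk at its own pace."""
    while True:
        sample = q.get()
        if sample is None:
            break
        write_line(sample)

# Demo with a fake 3-sample acquisition and an in-memory "file".
q = queue.Queue()
lines = []
producer(q, lambda: 0.5, 3)
consumer(q, lines.append)
print(lines)  # [0.5, 0.5, 0.5]
```

The point of the split is that the acquisition loop keeps its 10 ms timing even while the logging loop is busy opening a new file, and no array needs to hold the whole 5-minute session in memory.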
Attachments:
Logging voltage.vi 80 KB -
Problem with large data report
I tried to run a template I got from Release 12 using data from the release we are using (11i). The XML file is about 13,500 KB when I run it from my desktop.
I get the following error (mostly no output is generated; sometimes it is generated after a long time).
Font Dir: C:\Program Files\Oracle\BI Publisher\BI Publisher Desktop\Template Builder for Word\fonts
Run XDO Start
RTFProcessor setLocale: en-us
FOProcessor setData: C:\Documents and Settings\skiran\Desktop\working\2648119.xml
FOProcessor setLocale: en-us
I assumed there may be compatibility issues between 12i and 11i, hence I tried to write my own template and ran into the same issue
when I added the third nested loop.
I also noticed javaws.exe runs in the background hogging a lot of memory. I am using BI version 5.6.3.
I tried to run the template through template viewer. The process never completes.
The log file is
[010109_121009828][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.setData(InputStream) is called.
[010109_121014796][][STATEMENT] Logger.init(): *** DEBUG MODE IS OFF. ***
[010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.setTemplate(InputStream)is called.
[010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.setOutput(OutputStream)is called.
[010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.setOutputFormat(byte)is called with ID=1.
[010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.setLocale is called with 'en-US'.
[010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.process() is called.
[010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.generate() called.
[010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] createFO(Object, Object) is called.
[010109_121318828][oracle.apps.xdo.common.xml.XSLT10gR1][STATEMENT] oracle.xdo Developers Kit 10.1.0.5.0 - Production
[010109_121318828][oracle.apps.xdo.common.xml.XSLT10gR1][STATEMENT] Scalable Feature Disabled
End of Process.
Time: 436.906 sec.
FO Formatting failed.
I can't seem to figure out whether this is a looping issue, a large data issue, or a BI version issue. Please advise.
Thank you
The report will probably fail in a production environment if you don't have enough heap. 13 MB is a big XML file for the parsers to handle; it will probably crush the OPP. The whole document has to be loaded into memory, and preserving the relationships in the document is probably what's killing your performance. The OPP or FOProcessor does not use the SAX parser like the bursting engine does. I would suggest setting a maximum on the number of documents that can be created and submitting them in a set of batches. That will reduce your XML file size and performance will increase.
An alternative to the previous approach would be to write a concurrent program that merges the PDFs using the document merger API. This would allow you to burst the document into a temp directory and then re-assemble it into one document. One disadvantage of this approach is that the PDF is going to be huge. Also, if you have to send it to the printer you're going to have some problems too. When you convert the PDF to PS the files are going to be massive because of the loss of compression; it gets even worse if the PDF has images... Then you'll have more problems with disk on the server and/or running out of memory on PS printers.
All of the things I have discussed I have done in some fashion. Speaking from experience, a 13 MB XML file is just a really bad idea. I would go with option one.
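The batching idea is just a fixed-size split of the driving rows; a hypothetical sketch (batch size and inputs are illustrative):

```python
def batches(items, max_per_batch):
    """Yield fixed-size slices so each report run sees a bounded XML file."""
    for i in range(0, len(items), max_per_batch):
        yield items[i:i + max_per_batch]

# 13 documents split into runs of at most 5 -> three submissions of 5, 5, 3.
print([len(b) for b in batches(list(range(13)), 5)])  # [5, 5, 3]
```

Each slice becomes one concurrent request, so no single run ever has to parse the full 13 MB document.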
Ike Wiggins
http://bipublisher.blogspot.com -
Running out of memory while using cursored stream with large data
We are following the suggestions/recommendations for the cursored stream:
CursoredStream cursor = null;
try {
    Session session = getTransaction();
    int batchSize = 50;
    ReadAllQuery raq = getQuery();
    raq.useCursoredStream(batchSize, batchSize);
    int num = 0;
    ArrayList<Request> limitRequests = null;
    int totalLimitRequest = 0;
    cursor = (CursoredStream) session.executeQuery(raq);
    while (!cursor.atEnd()) {
        Request request = (Request) cursor.read();
        if (num == 0) {
            limitRequests = new ArrayList<Request>(batchSize);
        }
        limitRequests.add(request);
        totalLimitRequest++;
        num++;
        if (num >= batchSize) {
            log.warn("Migrating batch of " + batchSize + " Requests.");
            updateLimitRequestFillPriceForBatch(limitRequests);
            num = 0;
            cursor.releasePrevious(); // free already-read objects from the stream
        }
    }
    if (num > 0) {
        updateLimitRequestFillPriceForBatch(limitRequests);
    }
} finally {
    if (cursor != null) {
        cursor.close(); // close the cursor even if a batch fails
    }
}
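TopLink aside, the pattern above is just windowed iteration: buffer a batch, flush it, then drop references so processed rows become garbage-collectable (which is what releasePrevious() buys you on the stream side). A minimal plain-Java sketch with no TopLink dependency — processBatch is a stand-in for updateLimitRequestFillPriceForBatch:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.function.Consumer;

public class BatchedCursor {
    // Consume any iterator in fixed-size windows. The buffer is copied
    // before flushing and cleared afterwards, so the consumer may keep
    // its batch while the loop's working set stays at batchSize rows.
    static <T> int processInBatches(Iterator<T> cursor, int batchSize,
                                    Consumer<List<T>> processBatch) {
        List<T> batch = new ArrayList<>(batchSize);
        int total = 0;
        while (cursor.hasNext()) {
            batch.add(cursor.next());
            total++;
            if (batch.size() >= batchSize) {
                processBatch.accept(new ArrayList<>(batch));
                batch.clear(); // analogous to cursor.releasePrevious()
            }
        }
        if (!batch.isEmpty()) {
            processBatch.accept(new ArrayList<>(batch)); // final partial batch
        }
        return total;
    }

    public static void main(String[] args) {
        List<Integer> flushes = new ArrayList<>();
        int n = processInBatches(List.of(1, 2, 3, 4, 5).iterator(), 2,
                                 b -> flushes.add(b.size()));
        System.out.println(n + " rows in flushes of " + flushes); // 5 rows in flushes of [2, 2, 1]
    }
}
```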
We are committing every 50 records in the unit of work. If we set DontMaintainCache on the ReadAllQuery we get PrimaryKeyExceptions intermittently, and we do not see much difference in the IdentityMap size.
Any suggestions/ideas for dealing with large data sets? Thanks

Hi,
If I use read-only classes with CursoredStream and execute the query within UOW, should I be saving any memory?
I had to use UOW because when I use Session to execute the query I get
6115: ISOLATED_QUERY_EXECUTED_ON_SERVER_SESSION
Cause: An isolated query was executed on a server session: queries on isolated classes, or queries set to use exclusive connections, must not be executed on a ServerSession or in CMP outside of a transaction.
I assume marking the descriptor as read-only will avoid registering in UOW, but I want to make sure that this is the case while using CursoredStream.
We are running in OC4J(OAS10.1.3.4) with BeanManagedTransaction.
Please suggest.
Thanks
-Raam
Edited by: Raam on Apr 2, 2009 1:45 PM -
Spooling large data using UTL_FILE
Hi Everybody!
While spooling data out to a file using the UTL_FILE package, I am unable to spool the data. The column data has a size of 2531 characters.
The column 'source_where_clause_text' holds very large data.
It's not giving any error, but the external table is not returning any data.
Following is the code.
CREATE OR REPLACE PROCEDURE transformation_utl_file AS
CURSOR c1 IS
select transformation_nme,source_where_clause_text
from utility.data_transformation where transformation_nme='product_closing';
v_fh UTL_FILE.file_type;
BEGIN
v_fh := UTL_FILE.fopen('UTLFILELOAD', 'transformation_data.dat', 'w', 32000); -- max_linesize (upper limit is 32767)
FOR ci IN c1
LOOP
UTL_FILE.put_line( v_fh, ci.transformation_nme ||'~'|| ci.source_where_clause_text);
-- UTL_FILE.put_line( v_fh, ci.system_id ||'~'||ci.system_nme ||'~'|| ci.system_desc ||'~'|| ci.date_stamp);
END LOOP;
UTL_FILE.fclose( v_fh );
exception
when utl_file.invalid_path then dbms_output.put_line('Invalid Path');
END;
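For what it's worth, a 2531-character column is well under the 32767-byte max_linesize that UTL_FILE allows, and a long tilde-delimited line round-trips fine through ordinary file I/O — so an empty external table usually points at the file contents or the access parameters rather than the line length. A hypothetical round trip in plain Java (the record contents are made up; this only demonstrates the file format, not Oracle behavior):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class DelimitedRoundTrip {
    public static void main(String[] args) throws IOException {
        // Build a record like "name~<2531-char where clause>"
        String longClause = "x".repeat(2531);
        String record = "product_closing~" + longClause;

        Path f = Files.createTempFile("transformation_data", ".dat");
        Files.writeString(f, record + System.lineSeparator(),
                          StandardCharsets.US_ASCII);

        // Read it back and split on the tilde, as the external table would
        String line = Files.readAllLines(f, StandardCharsets.US_ASCII).get(0);
        String[] fields = line.split("~", 2);
        System.out.println(fields[0]);           // product_closing
        System.out.println(fields[1].length()); // 2531
        Files.delete(f);
    }
}
```

If the equivalent check on the real transformation_data.dat shows the full line intact, the next place to look is the external table's BAD and LOG files.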
select length(
'(select to_char(b.system_id) || to_date(a.period_start_date,''dd-mon-yyyy'') view_key,
  b.system_id,
  to_date(a.period_start_date,''dd-mon-yyyy'') period_start_date,
  to_date(a.period_end_date,''dd-mon-yyyy'') period_end_date,
  to_date(a.closing_date,''dd-mon-yyyy'') closing_date
from ((select decode(certification_type_code, ''A'', ''IDESK_PRODUCTS_PIPELINE'', ''C'', ''IDESK_PRODUCTS_COMMITMENT_LINKAGE'') system_nme,
  to_char(to_date(''01'' || lpad(trim(to_char(certification_as_of_month_yr)),6,''0''),''ddmmyyyy''),''dd-mon-yyyy'') period_start_date,
  to_char(last_day(to_date(''12'' || lpad(trim(to_char(certification_as_of_month_yr)),6,''0''),''ddmmyyyy'')),''dd-mon-yyyy'') period_end_date,
  to_char(trunc(certification_datetime_stamp), ''dd-mon-yyyy'') closing_date
  from odsupload.prod_monthly_certification
  where certification_type_code in (''A'',''C'')
  minus
  select trim(system_nme), to_char(period_start_date, ''dd-mon-yyyy''), to_char(period_end_date, ''dd-mon-yyyy''), to_char(closing_date, ''dd-mon-yyyy'')
  from utility.system_closing_status_v
  where system_nme in (''IDESK_PRODUCTS_PIPELINE'', ''IDESK_PRODUCTS_COMMITMENT_LINKAGE''))
 union all
 (select ''BMS Commitment Link'' system_nme,
  to_char(to_date(''01'' || lpad(trim(to_char(certification_as_of_month_yr)),6,''0''),''ddmmyyyy''),''dd-mon-yyyy'') period_start_date,
  to_char(last_day(to_date(''12'' || lpad(trim(to_char(certification_as_of_month_yr)),6,''0''),''ddmmyyyy'')),''dd-mon-yyyy'') period_end_date,
  to_char(trunc(certification_datetime_stamp), ''dd-mon-yyyy'') closing_date
  from odsupload.prod_monthly_certification
  where certification_type_code = ''C''
  minus
  select trim(system_nme), to_char(period_start_date, ''dd-mon-yyyy''), to_char(period_end_date, ''dd-mon-yyyy''), to_char(closing_date, ''dd-mon-yyyy'')
  from utility.system_closing_status_v
  where system_nme = ''BMS Commitment Link'')
 union all
 (select ''BMS'' system_nme,
  to_char(to_date(''01'' || lpad(trim(to_char(certification_as_of_month_yr)),6,''0''),''ddmmyyyy''),''dd-mon-yyyy'') period_start_date,
  to_char(last_day(to_date(''12'' || lpad(trim(to_char(certification_as_of_month_yr)),6,''0''),''ddmmyyyy'')),''dd-mon-yyyy'') period_end_date,
  to_char(trunc(certification_datetime_stamp), ''dd-mon-yyyy'') closing_date
  from odsupload.prod_monthly_certification
  where certification_type_code = ''A''
  minus
  select trim(system_nme), to_char(period_start_date, ''dd-mon-yyyy''), to_char(period_end_date, ''dd-mon-yyyy''), to_char(closing_date, ''dd-mon-yyyy'')
  from utility.system_closing_status_v
  where system_nme = ''BMS'')) a, utility.system_v b
where a.system_nme = b.system_nme)') length1
from dual;
-- 2531 (length of the original single-line string)
begin
SSUBRAMANIAN.transformation_utl_file;
end;
create table transformation_utl
(
  TRANSFORMATION_NME       VARCHAR2(40),
  SOURCE_WHERE_CLAUSE_TEXT VARCHAR2(4000)
)
ORGANIZATION external
(
  type oracle_loader
  default directory UTLFILELOAD
  ACCESS PARAMETERS
  (
    records delimited by newline CHARACTERSET US7ASCII
    BADFILE UTLFILELOAD:'transformation.bad'
    LOGFILE UTLFILELOAD:'transformation.log'
    fields TERMINATED by "~"
  )
  LOCATION ('transformation_data.dat')
) REJECT LIMIT UNLIMITED
select * from transformation_utl

After running the procedure, did you verify that the file 'transformation_data.dat' has data? Open it and make sure it's correct; maybe it has no data, and that's why the external table doesn't show anything.
Also, check the LOG and BAD files after selecting from the external table. Maybe they have errors in them (or all the data is going to BAD because you defined something wrong). -
Help In keithley 2400 VI!!(Problem with the data logging and graph plotting)
Hi, need help badly =(
My program works fine when I run it, and I tested it out with a simple diode. The expected start current steps up nicely to the stop current. The only problem is when it ends: I cannot get the data log and the graph, though I've already written the code for them. Can someone help me see what's wrong with the code? I've attached the necessary file below, and I'm working with LabVIEW 7.1.
Thanks in advance!!!
Attachments:
24xx Swp-I Meas-V gpib.llb 687 KB

Good morning,
Without the instrument it might be hard for others to help troubleshoot the problem. Was there a specific LabVIEW programming question you had, are you having problems with the instrument communication, are there errors? I'd like to help, but could you provide some more specific information on what problems you are encountering, and maybe accompany that with a simple example which demonstrates the behavior? In general we will be unable to open specific code and debug, but I'd be happy to help with specific questions.
I did notice, though, that in your logging VI you have at least one section of code which appears to not do anything. It could be that a small section of code or a wire was removed and the data is not being updated correctly (see pic below). Is your file being opened properly? Is the data being passed to the file properly? What are some of the things you have examined so far?
Sorry I could not provide the 'fix', but I'm confident that we can help. Thanks for posting, and have a great day-
Message Edited by Travis M. on 07-11-2006 08:51 AM
Travis M
LabVIEW R&D
National Instruments
Attachments:
untitled.JPG 88 KB -
Data Model best Practices for Large Data Models
We are currently rolling out Hyperion IR 11.1.x and are trying to establish best practices for BQYs and how to present these models to our end users.
So far, we have created an OCE file that limits the selectable tables to only those that are within the model.
Then, we created a BQY that brings in the tables to a data model, created metatopics for the main tables and integrated the descriptions via lookups in the meta topics.
This seems to be OK; however, any time I try to add items to a query, as soon as I add columns from different tables the app freezes up, hogs a bunch of memory, and then closes itself.
Obviously this isn't acceptable to hand to our end users, so I'm asking for suggestions.
Are there settings I can change to get around this memory sucking issue? Do I need to use a smaller model?
And in general, how are you all deploying this tool to your users? Our users are accustomed to a pre-built data model so they can just click, add the fields they want, and hit submit. How do I get close to that ideal with this tool?
Thanks for any help/advice.

I answered my own question. In the case of the large data model, the tool by default was attempting to calculate every possible join path to get from Table A to Table B (even though there is a direct join between them).
In the data model options, I changed the join setting to use the join path with the least number of topics. This skipped the extraneous steps and allowed me to proceed as normal.
hope this helps anyone else who may bump into this issue.