Multi-task data logging with DAQmx
I was wondering: is it possible to use the 'DAQmx Configure Logging' VI and the 'DAQmx Start New File' VI for multiple tasks? I'm doing synchronized high-speed DAQ with NI PXI-6133 cards. Each card (there are 16) must have its own task. Although the DAQ is continuous, the user (via a software trigger) determines when data is saved to disk and for how long.
In my scenario the test length could be up to an hour, with various test events scattered throughout. The users want to display the data during the entire test, but they only want to write data to disk during an event. An event could last from 10 s to 1 min. That is why the users want to control when data is written to disk.
DAQmx Logging seems to work for a single task only, but I need to do multiple tasks.
I've attempted to implement your suggestion, but I still do not acquire data for all channels for all tasks. I've enclosed my VI.
Attachments:
TDMS Logging with Pause LoggingFSPR.vi 55 KB
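For context, the per-task bookkeeping described above (continuous acquisition, write to disk only during an event, one new file per event per task) can be modeled without hardware. This is a hypothetical Python sketch, not NI API: the class and file-naming scheme are made up, and the real work would be done by the DAQmx VIs named in the comments.

```python
import os

class EventLogger:
    """Models pause-style logging state for one DAQmx task.

    In a real program, each of the 16 tasks would call
    'DAQmx Configure Logging' (Log and Read mode) once at setup,
    and 'DAQmx Start New File' at the start of every event.
    """

    def __init__(self, task_name, base_dir="."):
        self.task_name = task_name
        self.base_dir = base_dir
        self.event_index = 0
        self.logging = False

    def start_event(self):
        # Equivalent to calling 'DAQmx Start New File' on this task.
        self.event_index += 1
        self.logging = True
        return self.current_file()

    def stop_event(self):
        # Equivalent to pausing logging while display continues.
        self.logging = False

    def current_file(self):
        return os.path.join(
            self.base_dir,
            "%s_event%03d.tdms" % (self.task_name, self.event_index))

# One logger per task; a single event trigger fans out to all 16.
loggers = [EventLogger("PXI6133_%02d" % i) for i in range(16)]
files = [lg.start_event() for lg in loggers]
```

The point of the sketch is that logging configuration is per-task state, so the event trigger must be applied to every task, not just one.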
Similar Messages
-
Open task sequence log with cmtrace on error
Hi,
Is there a way to make an SCCM 2012 R2 task sequence automatically open the smsts.log (with cmtrace) when an error occurs?
J.
Jan Hoedt
I would agree with Torstren. The difficult piece would be determining where the smsts.log is, based on where it is in the task sequence. Once you have that, it wouldn't be too difficult to place those steps in the
Gather Logs and StateStore on Failure group. The catch will be: even if CMTrace is open, the machine will still reboot after 15 minutes unless the cmd prompt is also open. -
Pulsewidth data logging with counter channel
Hello everyone.
I have a query about logging pulse-duration data, measured with a counter input, into a .lvm file.
The pulse signal has a frequency of 100 Hz, with the pulse duration varying from 2 to 8 ms.
I want to log the pulse duration to a .lvm file.
When running the VI (screenshot below) I find the data logged with duplicate instances of time.
I want to log the pulse-duration data on every rising or falling edge of the pulse.
So if I run the VI for 20 s, I should have 2000 readings of pulse duration in my .lvm file, with the appropriate time in the time column.
Attachments:
pulsewidth using counter.jpg 659 KB
Hello Karthik,
I can think of two ways I would approach this. One would use software timing, and the other would use hardware timing.
1. Software timing: This might be the easiest approach. The idea is to use your example much as it is written, then use a loop with a delay as the "Time between samples" to call your counting code over and over. In this case, I have a few suggestions for your code. First, instead of the loop, you could start the counter, use a delay in a sequence structure (use "Wait (ms)", not "Wait Until Next ms Multiple") for your "Sampling time", then read the counter and stop. After the stop you'll need another "Wait (ms)" in a sequence structure for your "Time between samples". Finally, wrap a loop around all of this to repeat it. One advantage of this approach is that the user can change the between time on the fly.
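The software-timed sequence (start the counter, wait the sampling time, read, stop, wait the between time, repeat) can be sketched textually. The counter here is simulated with a plain Python callable, not a DAQmx call; the timing values are illustrative.

```python
import time

def sample_counter(read_count, sampling_time_s, between_time_s, n_samples):
    """Software-timed gating: each reading is the count accumulated
    during sampling_time_s, with between_time_s idle between reads."""
    readings = []
    for _ in range(n_samples):
        start = read_count()           # snapshot at 'start counter'
        time.sleep(sampling_time_s)    # 'Wait (ms)' for the sampling time
        readings.append(read_count() - start)  # read, then 'stop'
        time.sleep(between_time_s)     # 'Wait (ms)' for the between time
    return readings

# Simulated 100 Hz pulse source standing in for the hardware counter.
t0 = time.time()
counts = sample_counter(lambda: int((time.time() - t0) * 100),
                        sampling_time_s=0.05, between_time_s=0.01,
                        n_samples=3)
```

As the post notes, because the waits are software timed, the between time can be changed on the fly, at the cost of timing jitter from the OS scheduler.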
2. Hardware timing: The idea behind this would be to use two counters. The first would count; the output of the second would be wired to the gate of the first to control when the first counts. The second counter would be programmed to output a pulse train where the positive portion of the pulse would be your "Sampling time" and the negative portion would be your "Time between samples". The creative part of this approach would be figuring out when to read the count. One way to do this would be to also connect the gate of the second counter to an analog input, then read the count whenever the input goes low (say, below 1 volt). This approach might be the more accurate of the two. However, you would always be getting a total count, so you would have to subtract the previous count each time. Also, you would not be able to change your sampling time or between time once you start.
Finally
I'd suggest looking at some examples - you can usually find code to help you.
Best Regards,
Doug Norman -
CPU runs 100% when logging with DAQmx using card 6229
Hi!
When I start logging using DAQmx Analog Input, the computer CPU runs at 100%. This causes the computer to lock up, and the subsequent steps in the program are left without CPU resources. After the logging is complete (1750 samples on 10 channels at 250 Hz), the program runs on, but the log file is by then useless.
I have installed LV 7.1.1 and am using DAQ 7.4.
Anyone that recognizes this problem?
Regards,
Idriz
Idriz "Minnet" Zogaj, M.Sc. Engineering Physics
Memory Professional
direct: +46 (0) - 734 32 00 10
http://www.zogaj.se
Hello Idriz!
I just wanted to respond here as well so you know that we have read your post on this very forum. Still waiting for an answer from the US and after that I will forward that information to you.
Regards,
Jimmie A.
Applications Engineer, National Instruments
Regards,
Jimmie Adolph
Systems Engineer Manager, National Instruments Northern Region
Bring Me The Horizon - Sempiternal -
Is there any way to read 2 tasks at once (with DAQmx)
Ok, I don't believe there is... but that is a problem I want to solve.
I use about 10 sensors of the same kind, but I only have 2 signal amplifiers, so I can read from 2 channels at a time. (Ok, I could create a task that reads from 2 channels.)
I need to read from the 2 channels, but I change the sensors often, so I would have to create a task for every possible combination of sensors, and I cannot do that (because the sensors should be calibrated separately in MAX).
I tried to use two DAQmx Reads in a flat sequence structure with 2 instances, but it didn't work... as you probably know.
How to do?
EDIT:
Maybe there is a way to combine two tasks into one before connecting it to DAQmx-read?
Message Edited by Johan.svensson on 09-13-2006 01:20 AM
See here
-
I have a lock-in amp controlled by LabVIEW via GPIB. My goal is to use the
"advanced data logger" supplied with LabVIEW to log data from a DAQ card
and the output from the lock-in amp. I have the lock-in VI working correctly
and have the output on the screen. I don't know how to get the pre-packaged
"advanced data logger" to read in data from the GPIB device. I can only
read in data from my DAQ card.
Any assistance in this matter would be appreciated.
1. An OR function should work just fine for you. I don't understand what you mean about the functionality changing. If the Reset button is true OR the Trip button is true, then a True gets put in the notifier and the Trip file functions are executed.
I don't understand your "Scale". Right now you have a zero if the button is true and 100 if it is false. You don't have anything else going on with it. I don't understand the stay-at-zero part. How would it ever get to the point where it isn't zero?
2. For the main log file, you could detect when 43200 iterations of the loop have passed and at that time, just reset the writing to the beginning of the file.
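The reset-to-the-beginning idea can be sketched as follows. File name and counts are illustrative (43200 iterations would be 12 hours at one iteration per second):

```python
def circular_log(path, lines, reset_every):
    """Write lines to path, seeking back to the start of the file
    every reset_every writes so the log never grows past one cycle."""
    with open(path, "w") as f:
        for i, line in enumerate(lines):
            if i and i % reset_every == 0:
                f.seek(0)        # rewind: overwrite from the top,
                f.truncate()     # like resetting the main log file
            f.write(line + "\n")

circular_log("main.log", ["rec%d" % i for i in range(10)], reset_every=4)
```

After the run, the file holds only the records written since the last reset, which is the bounded-size behavior the answer describes.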
See attached.
Attachments:
Save_Previous_10_SecondsMOD3.vi 27 KB -
Continuous Data Logging with NI 9236 an cRIO 9076 with FPGA
Hey all,
I'm a beginner in LabVIEW/FPGA. My goal is to
continuously acquire and log data. I have an NI 9236; CH0 is
connected to a strain gauge, and the module sits in a cRIO-9076.
I've written some code, and I see the incoming data in the FPGA VI.
In the Host VI, however, no data comes out of the FIFO.
There are no error messages, and no errors during compilation.
Do I have a timing problem? Where is the big mistake?
Thank you!
Attachments:
1.jpg 98 KB
2.jpg 71 KB
4.jpg 337 KB
In your first image, one problem is that you are starting the module on each iteration of the loop. I can't tell how your FIFO is configured, but take a look at the example "Hardware Input and Output->CompactRIO->Module Specific IO->Analog Input->NI 923x Continuous DMA.pvproj". I don't know which LabVIEW version you are using, but I found this example in 2012.
-
I need timed data logging with continuous graphing
I am graphing continuous temperature measurements, but I only want to log measurements to a spreadsheet once every hour. How do I do that?
Hello,
In order to log the measurements every hour, you will want to implement a "Wait (ms)" function. This function can be added to the block diagram by [right-clicking] and selecting the following:
[All Functions] >> [Time & Dialog] >> [Wait (ms)]
You will want to place this wait function inside your code and wire a constant to the wait function with a value of 3,600,000 (60min/hr x 60sec/min x 1000ms/s). To wire a constant to the wait function, [right-click] on the input terminal of the "wait (ms)" icon and select [Create]>>[Constant].
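The same pattern can be sketched textually, with the interval parameterized so the arithmetic is visible (the demo interval is shortened; a real hourly log would wait 3600 s per iteration):

```python
import time

HOUR_MS = 60 * 60 * 1000  # 60 min/hr x 60 s/min x 1000 ms/s = 3,600,000

def log_every(interval_s, n_points, measure):
    """Take one measurement, then wait interval_s between the rest --
    the textual analogue of putting 'Wait (ms)' inside the loop."""
    points = []
    for i in range(n_points):
        points.append(measure())        # log one measurement
        if i < n_points - 1:
            time.sleep(interval_s)      # wait out the logging interval
    return points

# Demo with a fast interval and a stand-in measurement function.
readings = log_every(0.01, 3, lambda: 25.0)
```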
I hope this helps. Please let me know if I can further assist you.
Kind Regards,
Joe Des Rosier
National Instruments -
I'm acquiring 64 channels at a sampling rate of about 1450 kHz continuously, and showing the data second by second in a CWGraph. Now I would like to add on-line averaging for all 64 channels and show it in another CWGraph. The averaging should occur with respect to an external trigger, and the sample should consist of about a 100 ms prestimulus and a 500 ms poststimulus period. Has anybody been handling this sort of thing? How is it solved with ComponentWorks and Visual Basic?
I'm using double Pentium 700 MHz processors and an NI-DAQ PCI-6033E card.
1. To get started with external triggering, check out the examples in the Triggering folder that install with CW under MeasurementStudio\VB\Samples\DAQ\Triggering.
2. Create a prestimulus buffer that always contains the latest 100 ms of data. Just modify the CWAI1_AcquiredData event to transfer ScaledData (the data acquired) to your prestimulus buffer. See how there will always be a prestimulus buffer of data regardless of trigger state?
3. On the trigger, you can use the same CWAI1_AcquiredData event to route the data into the poststimulus buffer and average the two.
I see something like this:
if no trigger
100 ms of data to prestimulus buffer
if trigger
500 ms of data to poststimulus buffer
averaging and display function
reset trigger and buffers
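The outline above maps to code fairly directly. A hardware-free Python sketch (data and trigger are simulated; in ComponentWorks this logic would live in the CWAI1_AcquiredData event handler):

```python
from collections import deque

def make_averager(pre_len, post_len):
    """Keep a rolling prestimulus buffer; on trigger, collect post_len
    samples, then emit the average of the combined pre+post record."""
    pre = deque(maxlen=pre_len)   # always holds the latest pre_len samples
    post = []
    state = {"triggered": False}

    def on_data(sample, trigger=False):
        if trigger:
            state["triggered"] = True
        if not state["triggered"]:
            pre.append(sample)        # no trigger: fill prestimulus buffer
            return None
        post.append(sample)           # trigger: fill poststimulus buffer
        if len(post) < post_len:
            return None
        record = list(pre) + post     # full pre+post record is ready
        state["triggered"] = False    # reset trigger and buffers
        post.clear()
        return sum(record) / len(record)

    return on_data

# Buffer lengths of 2 and 3 samples stand in for 100 ms and 500 ms.
feed = make_averager(pre_len=2, post_len=3)
results = [feed(x, trigger=(x == 10)) for x in [1, 3, 10, 20, 30]]
```

The deque with maxlen is what guarantees "there will always be a prestimulus buffer of data regardless of trigger state".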
good luck
ben schulte
application engineer
national instruments
www.ni.com/ask -
Multi-table data block with update
My data block is set up by joining 3 tables. Only 1 table is
set up as updatable. The columns from the other 2 tables are
set in property palette as not updatable. But when I try to
update the data block row, I get: FRM-40501 ORACLE error:
unable to reserve record for update or delete.
I have tried using an updatable view in the data block and even
wrote an INSTEAD TRIGGER for the update but I receive the same
message.
I appreciate your help.
Thanks.
You can try an ON-UPDATE trigger at block level to handle this
problem. -
Data Logging with Fluke 189 Multimeter
I am trying to get LabVIEW to read a signal coming off a Fluke 189 multimeter, and I keep encountering error code -1073807341. I installed the fl18x driver, and LabVIEW still will not talk to the multimeter. I have been able to use the Fluke 189 multimeter before, but on a different computer with a newer version of LabVIEW. I was able to get the Fluke to work with LabVIEW 7.1.0, but not with LabVIEW 7.0.0 Express. Should I be able to use the Fluke with LabVIEW 7.0.0 Express?
Hi,
Yes, you should be able to use LabVIEW 7.0 with the Fluke 189. The fl18x instrument drivers are available on the Instrument Driver Network, and there is a version for LabVIEW 7.0. You may need to repair or reinstall the NI-VISA and NI-VISA Run-time Engine. Make sure that you select support for LabVIEW 7.0 when installing the NI-VISA driver. Also make sure that you installed LabVIEW 7.0 before installing LabVIEW 7.1, and that you installed the drivers after installing LabVIEW.
Regards,
Rima
Rima H.
Web Product Manager -
How do I control a data log session with period and sample time?
I need a data logging system where the operator can select 2 logging parameters: Log Period and Sample Time. I also need a START and STOP button to control the logging session. For example, set the log period for 1 hour and the sampling time for 1 second. (I may be using the wrong jargon here.) In this case when the START button is clicked, the system starts logging for 1 second. An hour later, it logs data for another second, and so on until the operator clicks the STOP button. (I will also include a time limit so the logging session will automatically stop after a certain amount of time has elapsed.)
It’s important that when the STOP button is clicked, that the system promptly stops logging. I cannot have the operator wait for up to an hour.
Note that a logging session could last for several days. The application here involves a ship towing a barge at sea where they want to monitor and data log tow line tension. While the system is logging, I need the graph X-axis (autoscaled) to show the date and time. (I’m having trouble getting the graph to show the correct date and time.) For this application, I also need the system to promptly start data logging at a continuous high rate during alarm conditions.
Of course I need to archive the data and retrieve it later for analysis. I think this part I can handle.
Please make a recommendation for program control and provide sample code if you can. It’s the program control concepts that I think I mostly need help here. I also wish to use the Strip Chart Update Mode so the operator can easily view the entire logging session.
DAQ Hardware: Not Selected Yet
LabVIEW Version: 6.1 (Feel free to recommend a v7 solution because I need to soon get it anyway.)
Operating System: Win 2000
In summary:
How do I control a graphing (data log) session for both period and sample time?
How do I stop the session without having to wait for the period to end?
How do I automatically interrupt and control a session during alarm conditions?
Does it make a difference if there is more than one graph (or chart) involved where there are variable sample rates?
Thanks,
Dave
Hello Dave,
Sounds like you have quite the system to set up here. It doesn't look like you are doing anything terribly complicated. You should be able to modify different examples for the different parts of your application. Examples are always the best place to start.
For analog input, the "Cont Acq&Chart (buffered).vi" example is a great place to start. You can set the scan rate (scans/second) and how many different input channels you want to acquire. This example has its own stop button; it should be a simple matter to add a manual start button. To manually set how long the application runs, you could add a 100 ms delay to each iteration of the while loop (recommended anyway, to allow the processor to multitask) and add a control that sets the number
of iterations of the while loop.
For logging data, a great example is the "Cont Acq to File (binary).vi" example.
For different sample rates on different input lines, you could use two parallel loops, both running the first example mentioned above. The data would not be able to be displayed on the same graph, however.
If you have more specific questions about any of the different parts of your application, let me know and I'll be happy to look further into it.
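One way to see the control concept Dave asked about (check the STOP flag every short tick rather than sleeping a whole log period, so stopping is prompt) is this hedged Python sketch. All timings are shortened for illustration, and measure() stands in for the DAQ read:

```python
import time

def run_session(period_s, sample_time_s, should_stop, measure, tick_s=0.01):
    """Log for sample_time_s at the start of every period_s window,
    polling should_stop() each tick so STOP takes effect promptly."""
    log = []
    period_start = time.monotonic()
    while not should_stop():
        now = time.monotonic()
        if now - period_start >= period_s:
            period_start = now            # a new logging window begins
        if now - period_start < sample_time_s:
            log.append(measure())         # inside the sample window: log
        time.sleep(tick_s)                # short tick keeps STOP responsive
    return log

# Demo: a 0.1 s period with a 0.03 s sample window, stopping after 0.25 s.
stop_at = time.monotonic() + 0.25
data = run_session(period_s=0.1, sample_time_s=0.03,
                   should_stop=lambda: time.monotonic() >= stop_at,
                   measure=lambda: 42)
```

The alarm-condition requirement fits the same structure: an alarm check inside the loop would simply force the sample-window condition true (and raise the effective rate) until the alarm clears.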
Have a nice day!
Robert M
Applications Engineer
National Instruments
Robert Mortensen
Software Engineer
National Instruments -
How to efficiently log multiple data streams with TDMS
Ok, first off, I'll admit I am completely clueless when it comes to logging, TDMS in particular. That said, I'm trying to work out the best way to log some data from an existing LabVIEW-based control system, so that users can later access that data in the event of catastrophic failure or other situations where they might want to see exactly what happened during a particular run.
I've got a total of between 6 and 12 data points that need to be stored (depending on how many sensors are on the system). These are values being read from a cRIO control system. They can all be set to Single data type, if necessary - even the one Boolean value I'm tracking is already being put through the "convert to 0,1" for graph display purposes. The data is currently read at 100ms intervals for display, but I will be toying with the rate that I want to dump data to the disk - a little loss is OK, just need general trending for long term history. I need to keep file sizes manageable, but informative enough to be useful later.
So, I am looking for advice on the best way to set this up. It will need to be a file that can be concurrently read as it is being written, when necessary - one of the reasons I am looking at TDMS in the first place (it was recommended to me previously). I also need an accurate date/time stamp that can be used when displaying the data graphically on a chart, so they can sync up with the external camera recordings to correlate just what happened and when.
Are there specific pitfalls I should watch for? Should I bundle all of the data points into an array for each storage tick, then decimate the array on the other end when reading? I've dug through many of the examples, even found a few covering manual timestamp writing, but is there a preferred method that keeps file size minimized (or extraction simplified)?
I definitely appreciate any help... It's easy to get overwhelmed and confused in all of the various methods I am finding for handling TDMS files, and determining which method is right for me.
I need to bump this topic again... I'll be honest, the TDMS examples and available help are completely letting me down here.
As I stated, I have up to 12 data values that I need to stream into a log file, so TDMS was suggested to me. The fact that I can concurrently read a file being written to was a prime reason I chose this format. And, "it's super easy" as I was told...
Here's the problem. I have multiple data streams. Streams that are not waveform data, but actual realtime data feedback from a control system, that is being read from a cRIO control system into a host computer (which is where I want to log the data). I also need to log an accurate timestamp with this data. This data will be streamed to a log file in a loop that consistently writes a data set every 200ms (that may change, not exactly sure on the timing yet).
Every worthwhile example that I've found has assumed I'm just logging a single waveform, and the data formatting is totally different from what I need. I've been flailing around with the code, trying to find a correct structure to write my data (put it all in an array, write individual points, etc.), and it is, quite honestly, giving me a headache. And finding the correct way to apply a proper timestamp (the accurate date and time the data was collected) is so uncharacteristically obtuse and hard to track down... This isn't even counting how to read the data back out of the file to display for later evaluation and/or troubleshooting... Augh!
It's very disheartening when a colleague can throw everything I'm trying to do together in 12 minutes in the very limited SCADA user-interface program he uses to monitor his PLCs... Yet LabVIEW, the superior program I always brag about, is slowly driving me insane trying to do what seems like a relatively simple task like logging...
So, does anyone have any actual useful examples of logging multiple DIFFERENT data points (not waveforms) and timestamps into a TDMS file? Or real suggestions for how to accomplish it, other than "go look at the examples" which I have done (and redone). Unless, of course, you have an actual relevant example that won't bring up more questions than it answers for me, in which case I say "bring it on!"
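Not a TDMS answer as such, but the record layout being asked for (N independent values plus a timestamp per tick) can at least be shown with the standard library. In TDMS terms, each column below would be one channel in the same group, with time stored as its own channel; the channel names and values here are made up:

```python
import csv
import io
from datetime import datetime, timezone

CHANNELS = ["torque", "rpm", "pressure"]  # hypothetical data points

def write_rows(out, rows):
    """One row per 200 ms tick: an ISO timestamp plus one value per channel."""
    w = csv.writer(out)
    w.writerow(["timestamp"] + CHANNELS)
    for ts, values in rows:
        w.writerow([ts.isoformat()] + list(values))

# Two simulated ticks of data.
buf = io.StringIO()
t = datetime(2014, 1, 1, tzinfo=timezone.utc)
write_rows(buf, [(t, (1.0, 2.0, 3.0)), (t, (4.0, 5.0, 6.0))])
lines = buf.getvalue().splitlines()
```

The same per-tick, per-channel structure is what a TDMS write would carry; the file format changes, the bookkeeping does not.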
Thanks for any help... My poor overworked brain will be eternally grateful. -
Hello,
I am currently working on a Senior Design Project where I have to measure torque, RPM, pressure and temperature. I am using strain gauges, a Hall-effect sensor, a pressure sensor and thermocouples to obtain these readings. A myDAQ collects the RPM and pressure readings, while a cDAQ collects the torque and temperature readings with the NI 9237 and NI 9211 modules. I created LabVIEW VIs for each sensor and they work. The problem I am having is when I try to create a VI with DAQmx that reads all these sensor values simultaneously. The VI that I have attached randomly displays one of the measurements while all the other measurements remain blank. How should I edit my VI so I can display all my readings at once?
Solved!
Go to Solution.
Attachments:
SD.vi 87 KB
You can form all of your cDAQ readings into a single DAQmx task. Do the same for the myDAQ channels. On the read side, you just read multiple channels. You can then use Index Array to separate your channel data and do whatever calculations you need.
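The "read multiple channels, then Index Array" step looks like this in textual form. The 2-D read result is simulated here; a real DAQmx N-channel read returns one row per channel:

```python
# Simulated result of a single multi-channel read:
# one row per channel, one column per sample.
data = [
    [1.1, 1.2, 1.3],     # cDAQ ch0, e.g. torque (NI 9237)
    [21.0, 22.0, 23.0],  # cDAQ ch1, e.g. temperature (NI 9211)
]

# 'Index Array' by row to separate the channels.
torque, temperature = data[0], data[1]
mean_torque = sum(torque) / len(torque)
```

Because one task produces one read containing all channels, every indicator can be fed from the same loop iteration, which avoids the "only one measurement displays" symptom.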
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
Attachments:
Multichannel task.png 40 KB -
Unable to take online backup including logs with HP Data Protector - DB2 9.7 fixpack 8
Dear All,
Infrastructure : -
Database - IBM DB2 LUW 9.7 fixpack 8
OS - HP-UX 11.31
Backup solution - Data protector v7
The OEM of the tape library (HP) asked me to change DB parameters to the following values:
--> LOGRETAIN to RECOVERY
--> USEREXIT to ON
--> LOGARCHMETH1 to USEREXIT
These parameters are used to activate online backup of the database (DB2 LUW 9.7 fixpack 8).
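For reference, the parameter change described would be applied from the DB2 command line roughly like this. This is a hedged sketch: the database name ECP is assumed from the log path shown later in the cfg dump, and on 9.7 setting LOGARCHMETH1 drives the legacy LOGRETAIN/USEREXIT values shown there.

```shell
# Hypothetical sketch -- database name ECP assumed from the log path.
db2 update db cfg for ECP using LOGARCHMETH1 USEREXIT

# Enabling archive logging puts the database in backup-pending state,
# so a full offline backup is required before it can be used again:
db2 backup db ECP
```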
When the backup is initiated from DP with the option to exclude archive logs, it completes successfully, whereas when we execute the backup from DP (Data Protector) with the option to include archive logs, the backup terminates with an error:
SQL2428N The BACKUP did not complete because one or more of the requested log files could not be retrieved
Dear all, I need your expert help.
Thanks in advance.
Neeraj
Hi Deepak,
Thanks for your quick reply. I have checked, and the directories log_dir and log_archive have enough space available.
Please find the db2diag.log file.
Database Configuration for Database
Database configuration release level = 0x0d00
Database release level = 0x0d00
Database territory = en_US
Database code page = 1208
Database code set = UTF-8
Database country/region code = 1
Database collating sequence = IDENTITY_16BIT
Alternate collating sequence (ALT_COLLATE) =
Number compatibility = OFF
Varchar2 compatibility = OFF
Date compatibility = OFF
Database page size = 16384
Dynamic SQL Query management (DYN_QUERY_MGMT) = DISABLE
Statement concentrator (STMT_CONC) = OFF
Discovery support for this database (DISCOVER_DB) = ENABLE
Restrict access = NO
Default query optimization class (DFT_QUERYOPT) = 5
Degree of parallelism (DFT_DEGREE) = 1
Continue upon arithmetic exceptions (DFT_SQLMATHWARN) = NO
Default refresh age (DFT_REFRESH_AGE) = 0
Default maintained table types for opt (DFT_MTTB_TYPES) = SYSTEM
Number of frequent values retained (NUM_FREQVALUES) = 10
Number of quantiles retained (NUM_QUANTILES) = 20
Decimal floating point rounding mode (DECFLT_ROUNDING) = ROUND_HALF_EVEN
Backup pending = NO
All committed transactions have been written to disk = NO
Rollforward pending = NO
Restore pending = NO
Multi-page file allocation enabled = YES
Log retain for recovery status = RECOVERY
User exit for logging status = YES
Self tuning memory (SELF_TUNING_MEM) = ON
Size of database shared memory (4KB) (DATABASE_MEMORY) = AUTOMATIC(3455670)
Database memory threshold (DB_MEM_THRESH) = 10
Max storage for lock list (4KB) (LOCKLIST) = AUTOMATIC(20000)
Percent. of lock lists per application (MAXLOCKS) = AUTOMATIC(90)
Package cache size (4KB) (PCKCACHESZ) = AUTOMATIC(161633)
Sort heap thres for shared sorts (4KB) (SHEAPTHRES_SHR) = AUTOMATIC(480362)
Sort list heap (4KB) (SORTHEAP) = AUTOMATIC(50000)
Database heap (4KB) (DBHEAP) = AUTOMATIC(3102)
Catalog cache size (4KB) (CATALOGCACHE_SZ) = 2560
Log buffer size (4KB) (LOGBUFSZ) = 1024
Utilities heap size (4KB) (UTIL_HEAP_SZ) = 10000
Buffer pool size (pages) (BUFFPAGE) = 10000
SQL statement heap (4KB) (STMTHEAP) = AUTOMATIC(8192)
Default application heap (4KB) (APPLHEAPSZ) = AUTOMATIC(256)
Application Memory Size (4KB) (APPL_MEMORY) = AUTOMATIC(40000)
Statistics heap size (4KB) (STAT_HEAP_SZ) = AUTOMATIC(4384)
Interval for checking deadlock (ms) (DLCHKTIME) = 10000
Lock timeout (sec) (LOCKTIMEOUT) = 3600
Changed pages threshold (CHNGPGS_THRESH) = 20
Number of asynchronous page cleaners (NUM_IOCLEANERS) = AUTOMATIC(2)
Number of I/O servers (NUM_IOSERVERS) = AUTOMATIC(5)
Index sort flag (INDEXSORT) = YES
Sequential detect flag (SEQDETECT) = YES
Default prefetch size (pages) (DFT_PREFETCH_SZ) = AUTOMATIC
Track modified pages (TRACKMOD) = YES
Default number of containers = 1
Default tablespace extentsize (pages) (DFT_EXTENT_SZ) = 2
Max number of active applications (MAXAPPLS) = AUTOMATIC(125)
Average number of active applications (AVG_APPLS) = AUTOMATIC(3)
Max DB files open per application (MAXFILOP) = 61440
Log file size (4KB) (LOGFILSIZ) = 16380
Number of primary log files (LOGPRIMARY) = 60
Number of secondary log files (LOGSECOND) = 0
Changed path to log files (NEWLOGPATH) =
Path to log files = /db2/ECP/log_dir/NODE0000/
Overflow log path (OVERFLOWLOGPATH) =
Mirror log path (MIRRORLOGPATH) =
First active log file = S0000187.LOG
Block log on disk full (BLK_LOG_DSK_FUL) = YES
Block non logged operations (BLOCKNONLOGGED) = NO
Percent max primary log space by transaction (MAX_LOG) = 0
Num. of active log files for 1 active UOW(NUM_LOG_SPAN) = 0
Group commit count (MINCOMMIT) = 1
Percent log file reclaimed before soft chckpt (SOFTMAX) = 300
Log retain for recovery enabled (LOGRETAIN) = RECOVERY
User exit for logging enabled (USEREXIT) = ON
HADR database role = STANDARD
HADR local host name (HADR_LOCAL_HOST) =
HADR local service name (HADR_LOCAL_SVC) =
HADR remote host name (HADR_REMOTE_HOST) =
HADR remote service name (HADR_REMOTE_SVC) =
HADR instance name of remote server (HADR_REMOTE_INST) =
HADR timeout value (HADR_TIMEOUT) = 120
HADR log write synchronization mode (HADR_SYNCMODE) = NEARSYNC
HADR peer window duration (seconds) (HADR_PEER_WINDOW) = 0
First log archive method (LOGARCHMETH1) = USEREXIT
Options for logarchmeth1 (LOGARCHOPT1) =
Second log archive method (LOGARCHMETH2) = OFF
Options for logarchmeth2 (LOGARCHOPT2) =
Failover log archive path (FAILARCHPATH) =
Number of log archive retries on error (NUMARCHRETRY) = 5
Log archive retry Delay (secs) (ARCHRETRYDELAY) = 20
Vendor options (VENDOROPT) =
Auto restart enabled (AUTORESTART) = ON
Index re-creation time and redo index build (INDEXREC) = SYSTEM (RESTART)
Log pages during index build (LOGINDEXBUILD) = OFF
Default number of loadrec sessions (DFT_LOADREC_SES) = 1
Number of database backups to retain (NUM_DB_BACKUPS) = 12
Recovery history retention (days) (REC_HIS_RETENTN) = 60
Auto deletion of recovery objects (AUTO_DEL_REC_OBJ) = OFF
TSM management class (TSM_MGMTCLASS) =
TSM node name (TSM_NODENAME) =
TSM owner (TSM_OWNER) =
TSM password (TSM_PASSWORD) =
Automatic maintenance (AUTO_MAINT) = ON
Automatic database backup (AUTO_DB_BACKUP) = OFF
Automatic table maintenance (AUTO_TBL_MAINT) = ON
Automatic runstats (AUTO_RUNSTATS) = ON
Automatic statement statistics (AUTO_STMT_STATS) = ON
Automatic statistics profiling (AUTO_STATS_PROF) = ON
Automatic profile updates (AUTO_PROF_UPD) = ON
Automatic reorganization (AUTO_REORG) = ON
Auto-Revalidation (AUTO_REVAL) = DEFERRED
Currently Committed (CUR_COMMIT) = DISABLED
CHAR output with DECIMAL input (DEC_TO_CHAR_FMT) = NEW
Enable XML Character operations (ENABLE_XMLCHAR) = YES
WLM Collection Interval (minutes) (WLM_COLLECT_INT) = 0
Monitor Collect Settings
Request metrics (MON_REQ_METRICS) = BASE
Activity metrics (MON_ACT_METRICS) = BASE
Object metrics (MON_OBJ_METRICS) = BASE
Unit of work events (MON_UOW_DATA) = NONE
Lock timeout events (MON_LOCKTIMEOUT) = WITHOUT_HIST
Deadlock events (MON_DEADLOCK) = WITHOUT_HIST
Lock wait events (MON_LOCKWAIT) = NONE
Lock wait event threshold (MON_LW_THRESH) = 5000000
Number of package list entries (MON_PKGLIST_SZ) = 32
Lock event notification level (MON_LCK_MSG_LVL) = 1
SMTP Server (SMTP_SERVER) =
SQL conditional compilation flags (SQL_CCFLAGS) =
Section actuals setting (SECTION_ACTUALS) = NONE
Connect procedure (CONNECT_PROC) =
Regards
Neeraj