Data replacement in stmedit WFP sample.
Hi all!
I have a problem testing the stmedit WFP sample.
I have configured the following parameters:
EditInline = 0 (out-of-band)
StringToFind = "rainy"
StringToReplace = "sunny"
InspectionPort = 1004
InspectOutbound = 0
If <StringToFind> is the same length as <StringToReplace>, everything works fine.
But if <StringToFind> is longer than <StringToReplace>, TCP retransmissions are generated.
If the inbound stream is small, there are no retransmissions.
The larger the inbound stream, the more retransmissions are generated.
Eventually, the connection is reset.
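For what it's worth, a small sketch of the arithmetic behind the symptom (my own illustration, not the sample's code; the unequal-length pair below is hypothetical, since my actual strings are equal length):

```python
# Illustration: when StringToFind is longer than StringToReplace, each match
# removes bytes from the edited stream, so the data the inspector reinjects
# is shorter than what the sender's TCP stack accounted for. The deficit
# grows with the amount of inbound data, matching the observed behavior.

FIND, REPLACE = b"rainyday", b"sunny"  # hypothetical unequal-length pair

def byte_deficit(stream: bytes) -> int:
    """Bytes removed by the edit: original length minus edited length."""
    return len(stream) - len(stream.replace(FIND, REPLACE))

small_stream = b"hello rainyday " * 10      # small inbound stream
large_stream = b"hello rainyday " * 10000   # large inbound stream
```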
Here is the Wireshark log:
No. Time Source Destination Protocol Length Info
1 0.000000000 192.168.1.14 192.168.1.48 TCP 66 51802 > 1004 [SYN] Seq=0 Win=65535 Len=0 MSS=1460 WS=16 SACK_PERM=1
2 0.001022000 192.168.1.48 192.168.1.14 TCP 66 1004 > 51802 [SYN, ACK] Seq=0 Ack=1 Win=65535 Len=0 MSS=1460 WS=16 SACK_PERM=1
3 0.001109000 192.168.1.14 192.168.1.48 TCP 54 51802 > 1004 [ACK] Seq=1 Ack=1 Win=524288 Len=0
4 15.663162000 192.168.1.14 192.168.1.48 TCP 566 51802 > 1004 [PSH, ACK] Seq=1 Ack=1 Win=524288 Len=512
5 15.663213000 192.168.1.14 192.168.1.48 TCP 1590 51802 > 1004 [PSH, ACK] Seq=513 Ack=1 Win=524288 Len=1536
6 15.663250000 192.168.1.14 192.168.1.48 TCP 1590 51802 > 1004 [PSH, ACK] Seq=2049 Ack=1 Win=524288 Len=1536
7 15.663283000 192.168.1.14 192.168.1.48 TCP 1590 51802 > 1004 [PSH, ACK] Seq=3585 Ack=1 Win=524288 Len=1536
8 15.663318000 192.168.1.14 192.168.1.48 TCP 774 51802 > 1004 [PSH, ACK] Seq=5121 Ack=1 Win=524288 Len=720
9 15.664350000 192.168.1.48 192.168.1.14 TCP 60 1004 > 51802 [ACK] Seq=1 Ack=1973 Win=524288 Len=0
10 15.664377000 192.168.1.14 192.168.1.48 TCP 2974 51802 > 1004 [PSH, ACK] Seq=5841 Ack=1 Win=524288 Len=2920
11 15.664716000 192.168.1.48 192.168.1.14 TCP 60 1004 > 51802 [ACK] Seq=1 Ack=3509 Win=524288 Len=0
12 15.664732000 192.168.1.14 192.168.1.48 TCP 2974 51802 > 1004 [PSH, ACK] Seq=8761 Ack=1 Win=524288 Len=2920
13 15.665072000 192.168.1.48 192.168.1.14 TCP 60 1004 > 51802 [ACK] Seq=1 Ack=5045 Win=524288 Len=0
14 15.665072000 192.168.1.48 192.168.1.14 TCP 60 1004 > 51802 [ACK] Seq=1 Ack=5841 Win=523488 Len=0
15 15.665091000 192.168.1.14 192.168.1.48 TCP 5894 51802 > 1004 [PSH, ACK] Seq=11681 Ack=1 Win=524288 Len=5840
16 15.665988000 192.168.1.48 192.168.1.14 TCP 60 1004 > 51802 [ACK] Seq=1 Ack=8761 Win=524288 Len=0
17 15.665988000 192.168.1.48 192.168.1.14 TCP 60 1004 > 51802 [ACK] Seq=1 Ack=11681 Win=524288 Len=0
18 15.666009000 192.168.1.14 192.168.1.48 TCP 11734 51802 > 1004 [PSH, ACK] Seq=17521 Ack=1 Win=524288 Len=11680
19 15.666681000 192.168.1.48 192.168.1.14 TCP 60 1004 > 51802 [ACK] Seq=1 Ack=14601 Win=524288 Len=0
20 15.666682000 192.168.1.48 192.168.1.14 TCP 60 1004 > 51802 [ACK] Seq=1 Ack=17521 Win=524288 Len=0
21 15.666707000 192.168.1.14 192.168.1.48 TCP 11734 51802 > 1004 [PSH, ACK] Seq=29201 Ack=1 Win=524288 Len=11680
22 15.667013000 192.168.1.48 192.168.1.14 TCP 60 1004 > 51802 [ACK] Seq=1 Ack=20441 Win=524288 Len=0
23 15.667027000 192.168.1.14 192.168.1.48 TCP 5894 51802 > 1004 [PSH, ACK] Seq=40881 Ack=1 Win=524288 Len=5840
24 15.667346000 192.168.1.48 192.168.1.14 TCP 60 1004 > 51802 [ACK] Seq=1 Ack=23361 Win=524288 Len=0
25 15.667359000 192.168.1.14 192.168.1.48 TCP 5894 51802 > 1004 [PSH, ACK] Seq=46721 Ack=1 Win=524288 Len=5840
26 15.667679000 192.168.1.48 192.168.1.14 TCP 60 1004 > 51802 [ACK] Seq=1 Ack=26281 Win=524288 Len=0
27 15.667695000 192.168.1.14 192.168.1.48 TCP 5894 51802 > 1004 [PSH, ACK] Seq=52561 Ack=1 Win=524288 Len=5840
28 15.668012000 192.168.1.48 192.168.1.14 TCP 60 1004 > 51802 [ACK] Seq=1 Ack=29201 Win=524288 Len=0
29 15.668029000 192.168.1.14 192.168.1.48 TCP 5894 51802 > 1004 [PSH, ACK] Seq=58401 Ack=1 Win=524288 Len=5840
30 15.668344000 192.168.1.48 192.168.1.14 TCP 60 1004 > 51802 [ACK] Seq=1 Ack=32121 Win=524288 Len=0
31 15.668358000 192.168.1.14 192.168.1.48 TCP 5894 51802 > 1004 [PSH, ACK] Seq=64241 Ack=1 Win=524288 Len=5840
32 15.668678000 192.168.1.48 192.168.1.14 TCP 60 1004 > 51802 [ACK] Seq=1 Ack=35041 Win=524288 Len=0
33 15.668678000 192.168.1.48 192.168.1.14 TCP 60 1004 > 51802 [ACK] Seq=1 Ack=37961 Win=524288 Len=0
34 15.668698000 192.168.1.14 192.168.1.48 TCP 11734 51802 > 1004 [PSH, ACK] Seq=70081 Ack=1 Win=524288 Len=11680
35 15.671890000 192.168.1.48 192.168.1.14 TCP 60 1004 > 51802 [ACK] Seq=1 Ack=75921 Win=524288 Len=0
36 15.671890000 192.168.1.48 192.168.1.14 TCP 60 1004 > 51802 [ACK] Seq=1 Ack=78841 Win=524288 Len=0
37 15.671965000 192.168.1.14 192.168.1.48 TCP 49694 51802 > 1004 [PSH, ACK] Seq=81761 Ack=1 Win=524288 Len=49640
38 15.682716000 192.168.1.48 192.168.1.14 TCP 60 1004 > 51802 [ACK] Seq=1 Ack=131401 Win=524288 Len=0
39 15.682789000 192.168.1.14 192.168.1.48 TCP 14281 51802 > 1004 [PSH, ACK] Seq=131401 Ack=1 Win=524288 Len=14227
40 15.984308000 192.168.1.14 192.168.1.48 TCP 1514 [TCP Retransmission] 51802 > 1004 [PSH, ACK] Seq=131401 Ack=1 Win=524288 Len=1460
41 15.985275000 192.168.1.48 192.168.1.14 TCP 66 [TCP Dup ACK 38#1] 1004 > 51802 [ACK] Seq=1 Ack=131401 Win=524288 Len=0 SLE=131401 SRE=132861
42 16.584309000 192.168.1.14 192.168.1.48 TCP 1514 [TCP Retransmission] 51802 > 1004 [PSH, ACK] Seq=131401 Ack=1 Win=524288 Len=1460
43 16.585239000 192.168.1.48 192.168.1.14 TCP 66 [TCP Dup ACK 38#2] 1004 > 51802 [ACK] Seq=1 Ack=131401 Win=524288 Len=0 SLE=131401 SRE=132861
44 17.784336000 192.168.1.14 192.168.1.48 TCP 590 [TCP Retransmission] 51802 > 1004 [PSH, ACK] Seq=131401 Ack=1 Win=524288 Len=536
45 17.785042000 192.168.1.48 192.168.1.14 TCP 66 [TCP Dup ACK 38#3] 1004 > 51802 [ACK] Seq=1 Ack=131401 Win=524288 Len=0 SLE=131401 SRE=131937
46 18.984877000 192.168.1.14 192.168.1.48 TCP 590 [TCP Retransmission] 51802 > 1004 [PSH, ACK] Seq=131401 Ack=1 Win=524288 Len=536
47 18.985585000 192.168.1.48 192.168.1.14 TCP 66 [TCP Dup ACK 38#4] 1004 > 51802 [ACK] Seq=1 Ack=131401 Win=524288 Len=0 SLE=131401 SRE=131937
48 20.185723000 192.168.1.14 192.168.1.48 TCP 1514 [TCP Retransmission] 51802 > 1004 [PSH, ACK] Seq=131401 Ack=1 Win=524288 Len=1460
49 20.186649000 192.168.1.48 192.168.1.14 TCP 66 [TCP Dup ACK 38#5] 1004 > 51802 [ACK] Seq=1 Ack=131401 Win=524288 Len=0 SLE=131401 SRE=132861
50 22.585988000 192.168.1.14 192.168.1.48 TCP 1514 [TCP Retransmission] 51802 > 1004 [PSH, ACK] Seq=131401 Ack=1 Win=524288 Len=1460
51 22.586936000 192.168.1.48 192.168.1.14 TCP 66 [TCP Dup ACK 38#6] 1004 > 51802 [ACK] Seq=1 Ack=131401 Win=524288 Len=0 SLE=131401 SRE=132861
52 27.386826000 192.168.1.14 192.168.1.48 TCP 1514 [TCP Retransmission] 51802 > 1004 [PSH, ACK] Seq=131401 Ack=1 Win=524288 Len=1460
53 27.387759000 192.168.1.48 192.168.1.14 TCP 66 [TCP Dup ACK 38#7] 1004 > 51802 [ACK] Seq=1 Ack=131401 Win=524288 Len=0 SLE=131401 SRE=132861
54 36.987729000 192.168.1.14 192.168.1.48 TCP 54 51802 > 1004 [RST, ACK] Seq=132861 Ack=1 Win=0 Len=0
I want to know how I can increase or decrease the inbound stream.
Thanks in advance.
I don't know whether this procedure works in 10.6; I've only tested it in 10.8.
Turn Time Machine OFF temporarily in its preference pane. Leave the window open.
Navigate in the Finder to your backup disk, and then to the folder named "Backups.backupdb" at the top level of the volume. If you back up over a network, you'll first have to mount the disk image file containing your backups by double-clicking it. Descend into the folder until you see the snapshots, which are represented by folders with a name that begins with the date of the snapshot. Find the one you want to restore from. There's a link named "Latest" representing the most recent snapshot. Use that one, if possible. Otherwise, you'll have to remember the date of the snapshot you choose.
Inside the snapshot folder is a folder hierarchy like the one on the source disk. Find one of the items you can't restore and select it. Open the Info dialog for the selected item. In the Sharing & Permissions section, you may see an entry in the access list that shows "Fetching…" in the Name column. If so, click the lock icon in the lower right corner of the dialog and authenticate. Then delete the "Fetching…" item from the icon list. Click the gear icon below the list and select Apply to enclosed items from the popup menu.
Now you should be able either to copy the item in the Finder or to restore it in the time-travel view. If you use the time-travel view, be sure to select the snapshot you just modified. If successful, repeat the operation with the other items you were unable to restore. You can select multiple items in the Finder and open a single Info dialog for all of them by pressing the key combination option-command-I.
When you're done, turn TM back ON and close its preference pane.
Similar Messages
-
Revision: 14496
Author: [email protected]
Date: 2010-03-01 17:49:53 -0800 (Mon, 01 Mar 2010)
Log Message:
Remove MediaPlayerWrapper sample, it has been replaced by the MediaContainerUIComponent sample. See ExamplePlayer or DynamicStreamingSample for examples of how to use MediaContainerUIComponent. See HelloWorld for examples of how to use MediaContainer in a barebones app.
Removed Paths:
osmf/trunk/apps/samples/framework/MediaPlayerWrapper/
-
Hi rickford66,
I'm sorry to say that the Traditional DAQ driver is no longer fully supported, as it was replaced by the DAQmx driver years ago. I'll try to help out as much as possible, but my experience with the older driver is pretty limited. Here's a shot in the dark, in case it helps:
It sounds like your application starts the card at some specified sampling rate and then runs a loop to read from the buffer, based on the system timer; when the timer says time is up, it stops the loop. What you should actually do is set the card up for a finite acquisition of the specified duration, then read from the buffer periodically inside the loop (while monitoring the available samples per channel). When the available samples drop to zero, it means the clock on the HW has stopped and you have all the samples.
You might be running into performance issues getting data across the PCMCIA bus (via interrupts) and simply not reading the last batch of data, since your feedback on when to stop the loop is completely independent of the DAQ card's operation. If you don't want to change anything else, you might just break the loop on the timer and call AI Read once more with the number of samples to read set to "all available" (or the equivalent). This should flush the end of the buffer.
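Roughly, the read pattern above looks like this (illustrative Python only; `FakeDaqBuffer` stands in for the driver and is not a real NI API):

```python
# Sketch of a finite acquisition drained to completion: keep reading whatever
# the buffer reports as available, and stop only when availability hits zero,
# i.e. when the hardware clock has stopped and every sample has been read.

class FakeDaqBuffer:
    """Stand-in for the DAQ driver's buffer during a finite acquisition."""
    def __init__(self, total_samples):
        self.remaining = total_samples

    def available(self):
        return min(self.remaining, 1000)  # driver reports a chunk at a time

    def read(self, n):
        n = min(n, self.remaining)
        self.remaining -= n
        return [0.0] * n  # placeholder sample values

def drain_finite_acquisition(buf):
    """Read periodically until available samples drop to zero."""
    data = []
    while True:
        n = buf.available()
        if n == 0:
            break  # acquisition complete: buffer fully flushed
        data.extend(buf.read(n))
    return data

samples = drain_finite_acquisition(FakeDaqBuffer(2500))
```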
David Staab, CLA
Staff Systems Engineer
National Instruments -
Flash Charts - How to manipulate the #data# replacement string in XML
I have a problem with the XML file for a flash chart. I am trying to display a 2dColumn chart of an SQL query with the general form:
select null link, x as name, sum(y) as value from z group by (x);
this generates multiple rows, which are displayed as bars in the chart. So far so good. The problem is that each row is defined as a block in the chart, but only one name entry, "value", is created and displayed in the chart legend.
I can display the block names on the x-axis of the chart; however, I can't rotate them so that they don't overlap in the display, which I would be able to do with names. I assume the blocks are defined in the #data# replacement string of the XML file. I would like to change the XML replacement string generated from the SQL so that each selected row gets a different name, with only one block ("value").
What would be the easiest way to achieve this?
user587996,
There's no way to manipulate the #data# replacement directly, but you could generate your own XML (see Re: Apex changing nulls to zeroes when creating Flash Charts for one way to do it).
When you say "I can't rotate them, so that they don't overlap in the display" -- have you tried the Labels Rotation chart attribute, or is that not working?
- Marco -
Informatica - import data from line 6 in sample files in universal adapter
Hi ,
I am trying to extract data from R12 tables into the sample CSV files provided by Oracle in the Universal Adapter for CRM Analytics.
As per Oracle guidelines, we are supposed to import data from line 6, as the first five lines will be skipped during the ETL process.
While importing the target file in the Target Designer, I am entering 6 as the value in the "Start import at Row" box.
Still, data is loaded from the first line of the file and not the 6th line.
Please let me know more on how to achieve this requirement.
Thanks,
Chaitanya.
Hi,
Please let me know the solution for this; it is very high priority for me now.
I want to extract data into the sample files provided by Oracle starting from the 6th line.
At present I am able to load only from the first line of the .csv file.
Thanks,
Chaitanya -
I plugged my iPod into the computer and my brother's data replaced mine... how do I get mine back? My photos... my apps... etc.
Sync it to your computer.
-
Trouble running "Universal Windows app that consumes data from the Azure Marketplace" sample
I am trying to run the sample titled "Universal Windows app that consumes data from the Azure Marketplace". I am not sure how to enter the credentials for the Zillow APIs. I obtained a key from Zillow, but when I enter it per the instructions
in the sample, I still get an unauthorized (401) error.
The instructions say Insert that key into the SampleDataSource file of your project in place of the string "Your account key goes here".
So I replaced my key in this line of code:
request.Credentials = new NetworkCredential("<Your account key goes here>",
"<Your account key goes here>");
Am I missing something else?
Hi isgimaryann,
I would suggest you ask the question at:
https://code.msdn.microsoft.com/windowsapps/Universal-app-that-d7406c5b/view/Discussions#content to see if the owner of the sample will give you some suggestions.
--James
We are trying to better understand customer views on social support experience, so your participation in this interview project would be greatly appreciated if you have time. Thanks for helping make community forums a great place.
Click HERE to participate in the survey.
FTP not working properly in WFP sample
Hi,
I downloaded the WFP traffic inspection sample and made some changes to handle only TCP traffic by adding these filter conditions:
filterConditions[conditionIndex].fieldKey = FWPM_CONDITION_IP_PROTOCOL;
filterConditions[conditionIndex].conditionValue.type = FWP_UINT8;
filterConditions[conditionIndex].conditionValue.uint8 = IPPROTO_TCP;
filterConditions[conditionIndex].matchType = FWP_MATCH_EQUAL;
When I try to run the put command in FTP, I get this error:
C:\Users\Administrator>ftp xxx.xxx.xxx.xxx
Connected to xxx.xxx.xxx.xxx.
220 (vsFTPd 2.0.5)
User (xxx.xxx.xxx.xxx:(none)): root
331 Please specify the password.
Password:
230 Login successful.
ftp> put c:\abc.txt
200 PORT command successful. Consider using PASV.
150 Ok to send data.
Connection closed by remote host.
When I remove the transport-layer filter, FTP runs normally. Hence my question: do we have to do anything specific for FTP at the transport layer? I came across a link on MSDN describing TCP packet flow; it mentions that for data transmitted over a TCP connection, the transport layer as well as FWPM_LAYER_STREAM_V4 come into play.
Kindly apprise me what is amiss here.
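In case it is relevant to the data connection being dropped: in active mode, FTP's PORT command (RFC 959) negotiates a second TCP data connection, encoding the endpoint as six decimal bytes h1,h2,h3,h4,p1,p2, with the data port computed as p1*256 + p2. A quick sketch of the decoding (my own helper, not part of the WFP sample; the argument string is hypothetical):

```python
def parse_port_argument(arg: str):
    """Decode the FTP PORT argument 'h1,h2,h3,h4,p1,p2' (RFC 959)."""
    h1, h2, h3, h4, p1, p2 = (int(x) for x in arg.split(","))
    ip = f"{h1}.{h2}.{h3}.{h4}"
    port = p1 * 256 + p2  # high byte, low byte
    return ip, port

ip, port = parse_port_argument("192,168,1,14,202,90")  # hypothetical argument
```

A transport-layer filter that only matches the control connection will never see this freshly negotiated data port, which would be consistent with the transfer dying right after "150 Ok to send data."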
Debbrat
Try http://www.apple.com/downloads/macosx/apple/application_updates/serveradmintools 1057.html
-
Data replacement in the system.
Hi ,
I wrote an LSMW with recording and forgot to delete one field containing a default value, and that value got transferred to all the material numbers in the system. Without taking any Basis support, how can I restore the original data in that field?
Thanks,
Lawrence.
Obtain the old field values from the change documents (tables CDHDR and CDPOS)
and run another LSMW to update the data.
Database Management Copy Data - Replace Option
Can anyone provide info on how this works? We are using 9.2.1.
When using Database Management to copy data with the Replace option, even if we only pick a few accounts to copy, it erases all data in the target scenario. Does the Replace option clear all data by default before copying in the selected data?
We tried to use the Merge option, but apparently that doesn't always work for some reason. Could be user error; not sure.
Thanks
Wags
Hey Wags, this is expected functionality of the Replace method. It clears all custom/account data for the load entity/year/period combination prior to doing the load. For example, if there is P&L data in entity A in Jan 2010 and you load $500 to it in a Sales account, all the rest of the P&L data will be cleared and you'll just have $500 in the Sales account.
Here's stuff from the guide:
Several options are available for loading data into an application. You can merge, replace, or
accumulate the data into the application. You select one of these options for each load process.
Note:
You cannot load calculated data into an application.
Merge
Select this option to overwrite the data in the application with the data from the file.
For each unique point of view that exists in the data file and in the application, the value in the
data file overwrites the data in the application.
Note:
If you have multiple values in the file for the same point of view, the system loads only the value
for the last entry.
If you select the Accumulate Within File option in conjunction with the Merge option, the system
adds all values for the same point of view in the data file, and overwrites the data in the application
with the total.
For each unique point of view that is in the data file but does not have a value in the application,
the value from the data file is loaded into the application.
Replace
For each unique combination of scenario, year, period, entity, and value in the data file, the
system first clears all corresponding values from the application, then loads the value from the
data file.
Note:
If you have multiple values in the file for the same unique combination of dimensions, the system
loads only the value for the last entry.
If you select the Accumulate Within File option in conjunction with the Replace option, the
system clears the value from the application, adds all values for the same point of view in the
data file, and loads the total.
You can achieve the same result as using the Replace option with the Accumulate Within File
option by using the Database Management module. You can organize the data in the file into
groups based on Scenario, Year, Entity, Value, and Period.
Use the Clear Data tab in the Database Management module to clear each unique combination
of these dimensions, selecting all accounts including system accounts. Enter the data manually
in a data grid, or load or copy the data into the application using the Merge or Accumulate load
options. When you perform the Clear operation for a period in a subcube, data, cell text, and
line item detail are cleared, but cell attachments are not cleared.
Note:
You may need to create several small files to load a data file using the Replace mode, especially
if the data is very large or if the file contains ownership data. An error message is displayed if the
file is too large when you try to load it.
Replace by Security
This option enables you to perform a data load in Replace mode even if you do not have access
to all accounts. When you perform the Clear operation for a period in a subcube, only the cells
to which you have access are cleared. Data, cell text, and line item detail are cleared, but cell
attachments are not cleared.
Accumulate
For each unique point of view that exists in the data file and in the application, the value from
the load file is added to the value in the application.
Note:
Data for system accounts is not accumulated.
Accumulate Within File
You can use the Accumulate Within File option in conjunction with the Merge and Replace
options. This option accumulates the data in the file first and loads the totals into the application
based on the load option selected.
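A toy model of these load options (my own sketch, not Hyperion code; the application is reduced to a map from point of view to value, all rows assumed to share one scenario/year/period/entity combination):

```python
# Toy simulation of the Merge / Replace / Accumulate load options described
# above. 'app' maps a point of view (POV) string to a value; 'file_rows' is
# a list of (pov, value) rows, which may repeat a POV.

def accumulate_within_file(file_rows):
    """Accumulate Within File: sum all values for the same POV first."""
    totals = {}
    for pov, value in file_rows:
        totals[pov] = totals.get(pov, 0) + value
    return totals

def last_entry_wins(file_rows):
    """Default: for duplicate POVs, only the last entry is loaded."""
    return dict(file_rows)

def load(app, file_rows, mode, accumulate_in_file=False):
    rows = (accumulate_within_file(file_rows) if accumulate_in_file
            else last_entry_wins(file_rows))
    result = dict(app)
    if mode == "replace":
        result = {}  # Replace clears the whole combination before loading
    for pov, value in rows.items():
        if mode == "accumulate":
            result[pov] = result.get(pov, 0) + value
        else:  # merge, or replace after clearing: file value overwrites
            result[pov] = value
    return result

app = {"Sales": 100, "COGS": 40}
merged = load(app, [("Sales", 500)], "merge")
replaced = load(app, [("Sales", 500)], "replace")
accumulated = load(app, [("Sales", 500)], "accumulate")
```

Note how `replaced` loses COGS entirely, mirroring the "erases all data in the target scenario" behavior Wags observed.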
File Contains Ownership Data
If the file contains ownership data, you must indicate this by selecting File Contains Ownership
Data. If you do not select this option and the data file contains ownership or shares information,
an error occurs when you load the file. -
Activity milestone actual date replacing schedule date
Hi,
The last few years we have been using activity milestones. These milestones are offset from the early start date of the activity. We manually add the actual date and the schedule date would remain the same. We recently added some hot packs and now when the user puts in the actual date it also replaces the schedule date with the actual date. Reading throught some of the OSS notes it sounds like this is the way it works and the way it was working before the hot packs may have been wrong. I was just wondering if anyone really knows how the milestone schedule date and actual dates are suppose to work. We prefer that the schedule date remains the schedule date and not revert to the actual date. Any info would be appreciated.SAP stated that once the actual date is input then the schedule date would change to the actual date. They said this is how it is suppose to work but if we wanted to leave the sched date alone then there was a note we could put in the system. We did this and it works like we want.
-
I can only save 3 seconds of data at a 1 MHz sampling rate using a PCI-6110. Is it possible to continuously save data for a longer time (say 1 minute)?
Many thanks
If the binary file is big, the ASCII file is very, very big.
One measurement is coded as an I16 => 2 bytes (4 bits are unused; the 611X is a 12-bit card).
If you use the +/-1 V range, the quantum is (2 * 1 V) / 4096 = 0.0004883 V, or 0.4883 mV, so you need at least 8 digits.
And if your binary file is 1 MB, your ASCII file is 4 MB.
You can use this VI for inspiration; you need to add the I16-to-DBL translation to make a real value before saving.
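The conversion he describes, as a sketch (illustrative Python; the real work happens in the LabVIEW VI):

```python
# I16 -> volts conversion for the +/-1 V range of a 12-bit card:
# one code step is (2 * 1 V) / 4096 = 0.4883 mV.

QUANTUM_V = 2.0 / 4096.0  # volts per code on the +/-1 V range

def i16_to_volts(code: int) -> float:
    """Translate a raw I16 reading to a real voltage before saving."""
    return code * QUANTUM_V
```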
For help you can mail in the newsgroup.
I'm a French user and my English is not good, sorry.
Fabien Palencia
Ingenieur d'etude ICMCB CNRS
"StephenZhou" wrote in the news message:
506500000005000000A5AF0000-1031838699000@exchange.ni.com...
> Hi, Fabien Palencia
>
> The program is great.
Do you happen to know how to convert the big
> binary file to ascii file? Thanks
[Attachment recup data I16.vi, see below]
Attachments:
recup_data_I16.vi 57 KB -
How do I send measurement data to a file during sampling and not after?
Hi there, I am new to LabVIEW and in need of some help. I am acquiring temperature data which I send to an LVM file at the end of the program. Is there a way to have my data recorded after each measurement (or periodically during the measurements, if writing to a file every time would slow it down)? I just want to ensure that if the program runs for a long period of time, I will still have some data stored in case the computer crashes, and won't lose everything.
Any suggestions are appreciated!
Hi,
Of course. You can open a file before the measurement starts, keep pumping the data into the file, and close it after the measurement finishes. But you have to take care with things like the sampling rate and the number of samples you acquire each time, so that you don't lose program speed or overload the CPU.
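Something like this pattern (illustrative Python; the real acquisition is in LabVIEW, and the file name and reading below are made up):

```python
# Write each measurement to disk as it is taken, flushing every iteration so
# the data already on disk survives if the computer crashes mid-run.
import csv
import time

def acquire_temperature() -> float:
    return 21.5  # stand-in for the real temperature read

def log_measurements(path: str, n_samples: int, period_s: float = 0.0) -> None:
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "temperature_C"])
        for _ in range(n_samples):
            writer.writerow([time.time(), acquire_temperature()])
            f.flush()  # push buffered rows out on every iteration
            time.sleep(period_s)

log_measurements("temps.csv", 5)
```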
MRK (CLAD) -
Is it possible to read scaled data into a 2D waveform (DBL) at a 2 MHz sampling rate?
I am using PXI Real time for my acquisition.
NI PXIe-6368
Simultaneous X Series Data Acquisition.
It supports a 2 MHz sampling rate. When I read unscaled data, there is no problem.
But when I read scaled data, execution stops or I get a "Run Out of memory" error.
Can you please help me on this?
I am not able to share the code, right now. I will try to share in personal.
Thank you,
Yogesh Redemptor
Regards,
Yogesh Redemptor
Hi Allan,
If you provided some numbers, we may be able to nail this issue for you. Without them I will just guess.
The error codes you reported indicate that the app is not keeping up with the incoming data. If you are sampling 4 channels at 50 kS/s, then the hardware is transferring 200 kS/s into its onboard FIFO. Page 344 of the NI catalog indicates that the FIFO can hold 2048 samples before it overflows. The -10846 error indicates that these overflows are occurring. To prevent this, the app must completely empty the FIFO more frequently. Doing the math, this works out to one empty operation every 10 ms, maximum.
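The arithmetic, spelled out (illustrative Python):

```python
# 4 channels at 50 kS/s fill the 2048-sample FIFO, so the app must empty it
# at least once per drain deadline to avoid the -10846 overflow error.
channels = 4
rate_per_channel = 50_000  # samples/s
fifo_depth = 2_048         # samples

aggregate_rate = channels * rate_per_channel             # samples/s into FIFO
drain_deadline_ms = fifo_depth / aggregate_rate * 1_000  # ms until overflow
```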
This is a rather demanding rate, and it only addresses the -10846. The -10845 error is troubling; the hardware you are working with may not do the job.
Suggestions:
1) make sure your app is reading at least this often.
2) slow down your acq rate (I know, not an option)
3) DO NOT TRUST ME! Call NI and talk to their hardware support group and ask them:
a) "Can I do continuous acq from X channels at Y rate using a DAQCard 6024E?"
b) If yes, "Could you point me at an example?"
c) If no, "Can I get credit on this device toward a purchase of a device that can?"
I hope this gets you started.
Reply to this answer if you have further inquiries,
Ben
Ben Rayner
I am currently active on.. MainStream Preppers
Rayner's Ridge is under construction -
Will data from my home computer merge with or replace data on my laptop, phone etc?
I don't want to lose existing bookmarks on my lesser used devices etc when I sync. Will that happen?
Make sure that the Sync settings are set to merge data (that is the default).
You can click the "Sync Options" button to select what to sync.
* Bookmarks - Passwords - Preferences - History - Tabs
* Merge this computer's data with my Sync data
* Replace all data on this computer with my Sync data
Using analytical function to calculate concurrency between date range
Folks,
I'm trying to use analytical functions to come up with a query that gives me the
concurrency of jobs executing between a date range.
For example:
JOB100 - started at 9AM - stopped at 11AM
JOB200 - started at 10AM - stopped at 3PM
JOB300 - started at 12PM - stopped at 2PM
The query should tell me that JOB100 ran with a concurrency of 2, because JOB100 and JOB200 overlapped in time. JOB200 ran with a concurrency of 3, because all three jobs ran within its start and stop time. The output would look like this:
JOB START STOP CONCURRENCY
=== ==== ==== =========
100 9AM 11AM 2
200 10AM 3PM 3
300 12PM 2PM 2
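Before reaching for analytics, the intended definition can be sketched in plain Python (my own illustration; times simplified to hours):

```python
# A job's concurrency counts every job, itself included, whose [start, stop]
# interval overlaps its own: s <= stop and start <= e.

def concurrency(jobs):
    return {
        jid: sum(1 for _, s, e in jobs if s <= stop and start <= e)
        for jid, start, stop in jobs
    }

jobs = [(100, 9, 11), (200, 10, 15), (300, 12, 14)]  # 9-11AM, 10AM-3PM, 12-2PM
```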
I've been looking at this post, and this one if very similar...
Analytic functions using window date range
Here is the sample data..
CREATE TABLE TEST_JOB
( jobid NUMBER,
created_time DATE,
start_time DATE,
stop_time DATE
);
insert into TEST_JOB values (100, sysdate -1, to_date('05/04/08 09:00:00','MM/DD/YY hh24:mi:ss'), to_date('05/04/08 11:00:00','MM/DD/YY hh24:mi:ss'));
insert into TEST_JOB values (200, sysdate -1, to_date('05/04/08 10:00:00','MM/DD/YY hh24:mi:ss'), to_date('05/04/08 13:00:00','MM/DD/YY hh24:mi:ss'));
insert into TEST_JOB values (300, sysdate -1, to_date('05/04/08 12:00:00','MM/DD/YY hh24:mi:ss'), to_date('05/04/08 14:00:00','MM/DD/YY hh24:mi:ss'));
select * from test_job;
JOBID|CREATED_TIME |START_TIME |STOP_TIME
----------|--------------|--------------|--------------
100|05/04/08 09:28|05/04/08 09:00|05/04/08 11:00
200|05/04/08 09:28|05/04/08 10:00|05/04/08 13:00
300|05/04/08 09:28|05/04/08 12:00|05/04/08 14:00
Any help with this query would be greatly appreciated.
thanks.
-peter
After some checking, the model rule wasn't working exactly as expected. I believe it's working right now. I'm posting a self-contained example for completeness' sake. I use two functions to convert back and forth between epoch Unix timestamps, so I'll post them here as well.
Like I said I think this works okay, but any feedback is always appreciated.
-peter
CREATE OR REPLACE FUNCTION date_to_epoch(p_dateval IN DATE)
RETURN NUMBER
AS
BEGIN
return (p_dateval - to_date('01/01/1970','MM/DD/YYYY')) * (24 * 3600);
END;
CREATE OR REPLACE FUNCTION epoch_to_date (p_epochval IN NUMBER DEFAULT 0)
RETURN DATE
AS
BEGIN
return to_date('01/01/1970','MM/DD/YYYY') + (( p_epochval) / (24 * 3600));
END;
DROP TABLE TEST_MODEL3 purge;
CREATE TABLE TEST_MODEL3
( jobid NUMBER,
start_time NUMBER,
end_time NUMBER);
insert into TEST_MODEL3
VALUES (300,date_to_epoch(to_date('05/07/2008 10:00','MM/DD/YYYY hh24:mi')),
date_to_epoch(to_date('05/07/2008 19:00','MM/DD/YYYY hh24:mi')));
insert into TEST_MODEL3
VALUES (200,date_to_epoch(to_date('05/07/2008 09:00','MM/DD/YYYY hh24:mi')),
date_to_epoch(to_date('05/07/2008 12:00','MM/DD/YYYY hh24:mi')));
insert into TEST_MODEL3
VALUES (400,date_to_epoch(to_date('05/07/2008 10:00','MM/DD/YYYY hh24:mi')),
date_to_epoch(to_date('05/07/2008 14:00','MM/DD/YYYY hh24:mi')));
insert into TEST_MODEL3
VALUES (500,date_to_epoch(to_date('05/07/2008 11:00','MM/DD/YYYY hh24:mi')),
date_to_epoch(to_date('05/07/2008 16:00','MM/DD/YYYY hh24:mi')));
insert into TEST_MODEL3
VALUES (600,date_to_epoch(to_date('05/07/2008 15:00','MM/DD/YYYY hh24:mi')),
date_to_epoch(to_date('05/07/2008 22:00','MM/DD/YYYY hh24:mi')));
insert into TEST_MODEL3
VALUES (100,date_to_epoch(to_date('05/07/2008 09:00','MM/DD/YYYY hh24:mi')),
date_to_epoch(to_date('05/07/2008 23:00','MM/DD/YYYY hh24:mi')));
commit;
SELECT jobid,
epoch_to_date(start_time)start_time,
epoch_to_date(end_time)end_time,
n concurrency
FROM TEST_MODEL3
MODEL
DIMENSION BY (start_time,end_time)
MEASURES (jobid,0 n)
(n[any,any] =
count(*)[start_time <= cv(start_time), end_time >= cv(start_time)] +
count(*)[start_time > cv(start_time) and start_time <= cv(end_time), end_time >= cv(start_time)]
)
ORDER BY start_time;
The results look like this:
JOBID|START_TIME|END_TIME |CONCURRENCY
----------|---------------|--------------|-------------------
100|05/07/08 09:00|05/07/08 23:00| 6
200|05/07/08 09:00|05/07/08 12:00| 5
300|05/07/08 10:00|05/07/08 19:00| 6
400|05/07/08 10:00|05/07/08 14:00| 5
500|05/07/08 11:00|05/07/08 16:00| 6
600|05/07/08 15:00|05/07/08 22:00| 4