DSC 8 custom timestamp
Is there a way to write a custom timestamp to a published shared variable in DSC 8.0, as we could in older DSC versions with VI-based servers?
At least for the time being you cannot create shared variables with custom timestamps but...
In DSC 8.0, the ability was added to log to the Citadel database directly, bypassing the engine. Once you get the information from the acquiring computer, you could publish the data to a non-logging shared variable so it is available to others on the network, and write the data directly into the database (which allows you to supply the timestamp).
Obviously this is more cumbersome than just supplying a custom timestamp to the variable, but it will allow you to log with the resolution you require until custom timestamps are available with the shared variable.
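The two-path workaround Robert describes — publish live data on a non-logging shared variable while writing the same point, with its custom timestamp, straight into the database — can be sketched as follows. Python is used purely for illustration; `publish` and `write_trace` are hypothetical stand-ins for the shared-variable write and the DSC direct database write, not real NI APIs.

```python
def publish_and_log(value, timestamp, publish, write_trace):
    """Workaround pattern: live publish plus direct historical write.

    publish     -- stand-in for writing a non-logging shared variable
    write_trace -- stand-in for writing a point directly to Citadel,
                   which is what lets us supply a custom timestamp
    """
    publish(value)                  # network clients see the live value
    write_trace(timestamp, value)   # database gets the custom timestamp
```

Each acquired point goes through both paths, so network readers and the historical database stay consistent while only the database write carries the custom stamp.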
Regards,
Robert
Similar Messages
-
Hi,
We are currently building an application for electrical systems using cRIO and DSC HMI features. Our intention is to generate the alarm condition on the cRIO and display the alarms on the HMI PC using the DSC alarm API. We are looking for very precise timing on alarms, down to less than 10 milliseconds. Our current setup is configured as follows:
cRIO has a Boolean shared variable to trigger an alarm condition via calculations in cRIO.
HMI PC also has a Boolean shared variable aliased (PSP-URL) with the cRIO variable. We do this in order to enable Alarms on the PC variable and use the alarming API functionality.
From our understanding, the DSC alarm API displays the PC system timestamp whenever an alarm condition is triggered from the cRIO. Apparently, the Shared Variable Engine (SVE) on the cRIO communicates with the SVE on the PC and passes information about the alarm condition and timestamp. However, the SVE on the PC only passes the alarm-trigger information on to the DSC alarm API; it discards the cRIO timestamp and generates a new one from the system (i.e. HMI PC) time.
We were wondering if we could get the cRIO timestamp into the DSC alarms list "set time"? This would be the ideal situation.
Do we have to write the cRIO timestamp to the Citadel database, given that the DSC alarms API just reads alarms from Citadel? Once alarms are acknowledged, it writes the information back to the database. So something is editing the database in a way that is not transparent to us.
Any feedback on this to enlighten our understanding will be greatly appreciated.
Hi,
If I understand correctly that you want the timestamp data from the cRIO transferred to the PC, there is a setting on the cRIO shared variable (you'll need to set it in the Project Explorer where you create the variable) that makes the shared variable supply a timestamp: http://zone.ni.com/reference/en-XX/help/371361H-01/lvconcepts/sv_using_nodes/
Alternatively, you could configure the shared variable to transfer a cluster data type consisting of a timestamp and the data type that represents your alarm-trigger information: http://digital.ni.com/public.nsf/allkb/DDEB4D9BC34705C086257242000FF7DB
If you are using RT programming, you can place a timestamp-related function such as the Elapsed Time Express VI or Get Date/Time In Seconds (which you need to convert using ), bundle it with the alarm information using Bundle.vi or Bundle By Name.vi, and write it to the shared variable.
Or you could post a screenshot of the program and use it to explain what you are planning to do.
As for alarm acknowledgement: if your PC is always connected to your cRIO, you can build a simple handshake using one or two shared variables dedicated to acknowledgement, something like the two-way handshaking operations in a TCP/IP protocol.
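The two-variable handshake Lennard suggests can be sketched in a few lines. This is an illustrative Python model, not LabVIEW code; the two boolean attributes stand in for the dedicated acknowledgement shared variables.

```python
class AckHandshake:
    """Model of a two-flag acknowledgement handshake over two
    'shared variables' (plain attributes here): the PC sets
    ack_request, the cRIO clears the alarm and sets ack_done,
    then the PC clears ack_request to finish the cycle."""

    def __init__(self):
        self.ack_request = False   # shared variable written by the PC
        self.ack_done = False      # shared variable written by the cRIO
        self.alarm_active = True   # alarm state owned by the cRIO

    def pc_acknowledge(self):
        self.ack_request = True        # PC side: request acknowledgement

    def crio_service(self):
        if self.ack_request:
            self.alarm_active = False  # cRIO side: clear the alarm
            self.ack_done = True       # ... and confirm to the PC

    def pc_complete(self):
        if self.ack_done:
            self.ack_request = False   # PC side: handshake finished
```

In a real system each method would run in that device's loop, polling the shared variables each iteration.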
Hope that helps
Warmest regards,
Lennard.C
Learning new things everyday... -
Hello,
I'm working on a power grid and need to analyse the current and voltage values and display some graphs (power, RMS values, ...).
I acquire voltage and current signals with DAQ, which works perfectly, and log the data in a TDMS file. This file is read afterwards in another application that contains all the power analysis using the LabVIEW Electrical Power Suite.
I can plot all my outputs with the date and time at which the data was acquired (the file's timestamp), but the power values on the output of the "Power Values VI" are given without date or time, and it is very important for me to plot the power values at the right date and time.
I thought about creating a timestamp initialized with the first value of the TDMS file. At each iteration of the for loop, I would add 0.2 seconds (every iteration corresponds to 10 cycles of the voltage and current signals = 0.2 seconds on a 50 Hz grid; it is also the interval at which the "Power Values VI" calculates the power values) and pass the data along with shift registers.
My question: is this possible? How can I do it? (I don't know how to add time to a timestamp.) And is there a better way to accomplish this?
I've been looking for an answer for quite a long time! any help would be very much appreciated.
(My main VI is very ugly I know, I just started making it and testing things.)
Solved!
Go to Solution.
Attachments:
Projet.zip 89 KB
The easiest way would be to use Get Waveform Time Array. That will return an array of timestamps corresponding to when the samples in your waveform were taken.
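For readers who still want the incrementing-timestamp approach from the question, the arithmetic itself is simple. A Python sketch for illustration (the original is LabVIEW, where the equivalent is adding a number of seconds to a timestamp carried in a shift register):

```python
from datetime import datetime, timedelta

def power_timestamps(start, n_windows, window_s=0.2):
    """One timestamp per power-calculation window.

    Each window covers 10 cycles of a 50 Hz signal, i.e. 0.2 s,
    so window i starts at start + i * 0.2 s."""
    return [start + timedelta(seconds=i * window_s) for i in range(n_windows)]

# Assumed example start time standing in for the first TDMS timestamp.
t0 = datetime(2024, 1, 1, 12, 0, 0)
stamps = power_timestamps(t0, 5)
```

The first element equals the file's own timestamp, so the generated axis lines up with the acquired data.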
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines -
Hi,
I'd like to know if it is possible to customize a timestamp which has the format 00(H):00(M):00(S) into a format that can hold and print hundreds of hours: 000(H):00(M):00(S).
What should I do to get that ?
Thanks.
Olivier
Hi Olivier,
It appears that you cannot do this directly with a built-in function. What you may want to do is create a custom timestamp that keeps track of minutes and seconds, plus an hours counter that increments whenever the minutes reach 60, and concatenate the whole thing as a string. I hope this helps point you in the right direction.
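The counter-and-concatenate idea is easy to prototype outside LabVIEW; a Python sketch of the formatting logic (in LabVIEW the same math maps to Quotient & Remainder feeding Format Into String):

```python
def format_hhh_mm_ss(total_seconds):
    """Format elapsed seconds as unbounded hours:minutes:seconds.

    A plain hours field normally wraps at 24 h; here the hours just
    keep counting, so 360000 s renders as '100:00:00'."""
    hours, rem = divmod(int(total_seconds), 3600)
    minutes, seconds = divmod(rem, 60)
    return f"{hours:03d}:{minutes:02d}:{seconds:02d}"
```

The `:03d` width only pads; four-digit hour counts still print in full.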
Best regards,
Steven -
I'm trying to take spreadsheet data and write it to individual traces in a Citadel 5 database using DSC 2012. I keep getting error -1967386570, "Data has Back in time timestamp."
Searching the NI website, I found that back in 2006 there was a way to do this with a VI-based server.
http://www.ni.com/white-paper/3485/en
Is this still possible with the current DSC version?
From the 2012 DSC help file.
Writing a Value to a Citadel Trace (DSC Module)
You can use the Write Trace VI to append a data point to a Citadel trace. Complete the following steps to write a value:
Add the Write Trace VI on the block diagram.
Add the Open Trace VI on the block diagram.
Wire the trace reference output of the Open Trace VI to the trace reference input of the Write Trace VI.
Wire the value and timestamp inputs of the Write Trace VI. Leave the timestamp input unwired to use the current time. The Write Trace VI fails if the timestamp input is earlier than the timestamp of the last point written to the trace. You can determine the timestamp of the last point in the trace using the Get Trace Info VI.
So, is it no longer possible to write old data into Citadel traces?
I also saw some posts about a registry key for Citadel 5 about server timestamps, but I don't see a registry key where that note says it should be located.
Logging Back-in-Time
Most data logging systems generate ever-increasing time stamps. However, if you manually set the system clock back-in-time, or if an automatic time synchronization service resets the system clock during logging, a back-in-time data point might be logged. Citadel handles this case in two ways.
When a point is logged back-in-time, Citadel checks to see if the difference between the point time stamp and the last time stamp in the trace is less than the larger of the global back-in-time tolerance and the time precision of the subtrace. If the time is within the tolerance, Citadel ignores the difference and logs the point using the last time stamp in the trace. For example, the Shared Variable Engine in LabVIEW 8.0 and later uses a tolerance level of 10 seconds. Thus, if the system clock is set backwards up to ten seconds from the previous time stamp, a value is logged in the database on a data change, but the time stamp is set equal to the previous logged point. If the time is set backwards farther than 10 seconds, Citadel creates a new subtrace and begins logging from that time stamp.
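The tolerance rule just described can be modeled in a few lines. This is an illustrative Python sketch of the documented behavior, not NI code; a real subtrace split is simplified here to just accepting the earlier stamp.

```python
def log_point(trace, value, ts, tolerance_s=10.0):
    """Model of Citadel's back-in-time handling.

    trace is a list of (timestamp, value) pairs. If a new point's
    stamp is earlier than the last one by at most tolerance_s, the
    last stamp is reused; a larger jump back would start a new
    subtrace (modeled by simply accepting the earlier stamp)."""
    if trace:
        last_ts, _ = trace[-1]
        if ts < last_ts:
            if last_ts - ts <= tolerance_s:
                ts = last_ts   # within tolerance: reuse the last stamp
            # else: new subtrace begins at the earlier ts
    trace.append((ts, value))
    return trace
```

With the default 10 s tolerance this reproduces the Shared Variable Engine behavior described above.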
Beginning with LabVIEW DSC 8.0, you can define a global back-in-time tolerance in the system registry. Earlier versions of DSC or Lookout always log back-in-time points. Use the backInTimeToleranceMS key located in the HKLM\SOFTWARE\National Instruments\Citadel\5.0 directory. Specify this value in milliseconds. The default value is 0, which indicates no global tolerance.
This key doesn't exist on my system.
This link from July 2012 seems to mention that it is still possible to use custom timestamps.
http://www.ni.com/white-paper/6579/en
Citadel Writing API
The DSC Module 8.0 and later include an API for writing data directly to a Citadel trace. This API is useful to perform the following operations:
· Implement a data redundancy system for LabVIEW Real-Time targets.
· Record data in a Citadel trace faster than can be achieved with a shared variable.
· Write trace data using custom time stamps.
The Citadel writing API inserts trace data point-by-point with either user-specified or server-generated time stamps.
Is there some more documentation out there that explains this process a bit better?
Hi unclebump,
I have been trying to determine what the best course of action would be, and I think you need to move the data to a new trace. What I am thinking is for you to open a reference to the trace as it currently exists. Then you will need to read in all the data of that trace. While you read that trace, you should also be reading in the data from your file. Once you have both sets of data, you will need to iterate over all the data and merge the two sets based on their timestamps. The VIs to accomplish this should all exist in the DSC palette under Historical, or DSC >> Historical >> Database Writing. There is an example in the Example Finder called Database Direct Write Demo that would probably be worth looking at. The Write Trace help says, "This VI returns an error if you try to write a point with a timestamp that is earlier than the timestamp of the last point written to the trace," which means that if your data is merged and written in order, you should not get this error.
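The merge step described above boils down to combining two timestamp-sorted point lists so the result is written in non-decreasing order. A Python sketch, for illustration only (in LabVIEW this would be a loop over the DSC Historical VIs):

```python
import heapq

def merge_by_timestamp(existing, imported):
    """Merge two timestamp-sorted lists of (timestamp, value) points.

    Writing the merged result to a fresh trace in this order avoids
    the 'Back in time timestamp' error, because every point's stamp
    is >= the previous one."""
    return list(heapq.merge(existing, imported, key=lambda point: point[0]))
```

`heapq.merge` streams both inputs, so even large traces merge without loading a combined sorted copy up front.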
Hope this helps and let me know if you have any questions.
Patrick H | National Instruments | Software Engineer -
Range partitioning on virtual column based on binary xmltype column
Alright, our DBA finally got around to upgrading to 11.2.0.2. Now I'm running into another issue:
CREATE TABLE USER.DI_D2 (
    ID NUMBER(19, 0) NOT NULL,
    XML SYS.XMLTYPE,
    PRIMARY KEY ( ID )
)
XMLTYPE XML STORE AS SECUREFILE BINARY XML
VIRTUAL COLUMNS (
    ts AS (TO_TIMESTAMP(extractvalue(xml, '/d:d/c:dHeader/c:creationTime',
        'xmlns:d="http://www.example.com/m/d/schema/di"
         xmlns:c="http://www.example.com/m/schema/common"'), 'YYYY-MM-DD"T"HH24:MI:SS'))
)
PARTITION BY RANGE (ts) (
    PARTITION d_p2012_07 VALUES LESS THAN (TO_DATE('1-8-2012', 'DD-MM-YYYY')),
    PARTITION d_px VALUES LESS THAN (MAXVALUE)
);
On our old 11.2.0.1 install this command works fine (though due to other issues 11.2.0.1 doesn't work for our search queries).
On our 11.2.0.2 install, I get the following error:
Error at Command Line:10 Column:37
Error report:
SQL Error: ORA-14513: partitioning column may not be of object datatype
14513. 00000 - "partitioning column may not be of object datatype"
*Cause: Partitioning column specified by the user was an object datatype
(object, REF, nested table, array) which is illegal.
*Action: Ensure that no partitioning column is an object datatype.
Anyone know what's up with that? What changed between the two DB versions that could cause this to fail?
Alright, seems that's just a display issue then.
Looking in user_lobs as suggested above gives:
TABLE_NAME  COLUMN_NAME  SECUREFILE
DI_D        XMLDATA      YES
I was opening the table in SQL Developer and then looking at the SQL tab (12th tab); with an XMLType table it seems to always show Basicfile even if it's actually a Securefile.
I'd like to use the suggested xmlcast/xmlquery solution; however, it doesn't seem to play well with our custom timestamp format.
"CREATION_TIME" AS (XMLCAST(XMLQUERY('declare default element namespace "http://www.example.com/m/d/schema/i";declare namespace c="http://www.example.com/m/schema/common";/d/c:dHeader/c:creationTime' PASSING OBJECT_VALUE RETURNING CONTENT) AS TIMESTAMP))
Error at Command Line:1 Column:0
Error report:
SQL Error: ORA-54002: only pure functions can be specified in a virtual column expression
"CREATION_TIME" AS (TO_TIMESTAMP(XMLQUERY('declare default element namespace "http://www.example.com/m/d/schema/i";declare namespace c="http://www.example.com/m/schema/common";/d/c:dHeader/c:creationTime' PASSING OBJECT_VALUE RETURNING CONTENT),'YYYY-MM-DD"T"HH24:MI:SS'))
Error at Command Line:9 Column:45
Error report:
SQL Error: ORA-00932: inconsistent datatypes: - expected, - got
00932. 00000 - "inconsistent datatypes: expected %s got %s"
*Cause:
*Action:
"CREATION_TIME" AS (TO_TIMESTAMP(EXTRACTVALUE("OBJECT_VALUE",'/di:d/c:dHeader/c:creationTime','xmlns:di="http://www.example.com/m/d/schema/i" xmlns:c="http://www.example.com/m/schema/common"'),'YYYY-MM-DD"T"HH24:MI:SS'))
table "USER"."DI_D" created. -
I am trying to create a custom timestamp with the Date/Time To Seconds function. I am not sure why, but regardless of what input I feed it, I always get 00:00:00.000 PM MM/DD/YYYY. Why is that? See the attached code.
Kudos and Accepted as Solution are welcome!
Attachments:
timestamp.PNG 35 KB
My guess is the month part.
If you enter a month <1 or >12, 'Date/Time To Seconds' will return what you are seeing.
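Python's datetime makes the same class of failure loud rather than silent — an out-of-range month raises instead of producing a zeroed timestamp — which is a handy way to illustrate the range check (an illustrative sketch, not LabVIEW behavior):

```python
from datetime import datetime

def safe_timestamp(year, month, day, hour=0, minute=0, second=0):
    """Return a datetime, or None for out-of-range fields (e.g. month 13),
    mirroring how an invalid month can silently break a timestamp build
    in environments that don't raise."""
    try:
        return datetime(year, month, day, hour, minute, second)
    except ValueError:
        return None
```

Validating each field before the conversion is the fix in either environment.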
Now is the right time to use %^<%Y-%m-%dT%H:%M:%S%3uZ>T
If you don't hate time zones, you're not a real programmer.
"You are what you don't automate"
Inplaceness is synonymous with insidiousness -
DSC 8.6.1 wrong timestamps for logged data with Intel dual core
Problem Description :
Our LV/DSC 8.6.1 application uses shared variables to log data to Citadel. It is running on many similar computers at many companies just fine, but on one particular Intel dual-core computer, the data in the Citadel db has strange shifting timestamps. Changing the BIOS to start up using a single CPU fixes the problem. Could we possibly set only certain NI process(es) to single-CPU instead (but which?)? The old DSCEngine.exe in LV/DSC 7 had to be run single-CPU... hadn't these kinds of issues been fixed by LV 8.6.1 yet? What about LV 2009, anybody know? Or is it a problem in the OS or hardware, below the NI line?
This seems similar to an old issue with time synch server problems for AMD processors (Knowledge Base Document ID 4BFBEIQA):
http://digital.ni.com/public.nsf/allkb/1EFFBED34FFE66C2862573D30073C329
Computer info:
- Dell desktop
- Win XP Pro sp3
- 2 G RAM
- 1.58 GHz Core 2 Duo
- LV/DSC 8.6.1 (Pro dev)
- DAQmx, standard instrument control device drivers, serial i/o
(Nothing else installed; OS and LV/DSC were re-installed to try to fix the problem, no luck)
Details:
A test logged data at 1 Hz, with these results: for 10-30 seconds or so, the timestamps were correct. Then the timestamps were compressed/shifted, with multiple points each second. At perfectly regular 1-minute intervals, the timestamps would be correct again. This pattern repeats, and when the data is graphed, it looks like regular 1-second-interval points, then denser points, then no points until the next minute (not ON the minute, e.g. 12:35:00, but after a minute, e.g. 12:35:24, 12:36:24, 12:37:24...). Occasionally (but rarely), restarting the PC would produce accurate timestamps for several minutes running, but then the pattern would reappear in the middle of logging, with no changes made.
Test info:
- shared variable configured with logging enabled
- data changing by much more than the deadband
- new value written by Datasocket Write at a steady 1 Hz
- historic data retrieved by Read Traces
- Distributed System Manager shows correct and changing values continuously as they are written
Meg K. B.,
It sounds like you are experiencing Time Stamp Counter (TSC) drift, as mentioned in the KBs for the AMD multi-core processors. However, according to the Wikipedia article on TSCs, the Intel Core 2 Duo's "time-stamp counter increments at a constant rate. ... Constant TSC behavior ensures that the duration of each clock tick is uniform and supports the use of the TSC as a wall clock timer even if the processor core changes frequency." This seems to suggest that you are not seeing the issue mentioned in the KBs.
Can you provide the exact model of the Core 2 Duo processor that you are using?
Ben Sisney
FlexRIO V&V Engineer
National Instruments -
Crystal Reports XI R1: create a custom function to change the timestamp of a field
Post Author: palm
CA Forum: Crystal Reports
Hi,
I am trying to create a custom function to change the time zone of a field in the report depending on the time zone selected by the user in the prompt.
I wrote a SQL expression to do this, but I ended up creating a bunch of expressions, one for each time zone.
So I am thinking of a custom function, something like:
TimeZoneConvertor:
Function(TimeStampField, Timezone)
Returns: timestamp field with the new time zone as selected by the user
Hope you get this; please give me some ideas on how to achieve this.
Thanks in advance!
Hi,
Go to SE37... there you need to create a function group, then a function module. Inside it, assign the import/export parameters, assign tables/exceptions, and activate it. Now write your code within the function module.
http://help.sap.com/saphelp_nw04/helpdata/en/9f/db98fc35c111d1829f0000e829fbfe/content.htm
Look at the SAP Help links below; they will show you the way to create a function module:
http://help.sap.com/saphelp_nw04/helpdata/en/26/64f623fa8911d386e70000e82011b8/content.htm
http://help.sap.com/saphelp_nw04/helpdata/en/9f/db98fc35c111d1829f0000e829fbfe/content.htm -
Hi, Adobe LC Forum..
Can anyone explain what the lifecycle of a custom DSC is, and how it's implemented (e.g. as an EJB)?
I have a custom DSC that I'd like to have running from the time I start it (e.g. using "Start Component" in the Components view in Workbench) until I manually stop it, servicing requests that are invoked via calls to methods defined in the implementation class.
I'd think this is possible, as it appears the other, "out of the box" services like Forms and Output are implemented as components as well, so it seems logical that there'd be a defined lifecycle for DSCs that can be controlled programmatically.
When I use my custom DSC today, it appears to instantiate a separate instance of the object on every call to any of the methods defined in the class. As I'd like to persist data within the object across multiple method invocations, I don't want a separate instance of the object created on every call. (I plan to use the DSC multiple times within a single workflow. On the first call, I want to do some initialization, get a connection object from the JDBC pool, prepare an SQL statement, and hold on to it. On later calls, I just want to do setXX (e.g. setString) operations on the PreparedStatement and executeUpdate. That way, I can optimize my SQL interactions with the database and also trap SQLException and other exceptions that the "out of the box" JDBC services don't seem to catch.)
Hope that makes sense... thanks for any and all help!
- J
Configure your component/service to be a single process for all invocations, i.e. go to the Admin UI and configure it.
DB Adapter Custom Select SQL with Timestamps
Hi,
I am facing an issue with the DB Adapter custom select SQL. Here is my scenario:
I am trying to select diff records from the database between a user-input start time and sysdate. For this I am trying to use the Execute Pure SQL option of the database adapter.
I am framing the SQL as below:
select * from tablename1,tablename2 where tablename1.id = tablename1.id and column1 like #userinput1 and tablename1.request_timestamp between to_date(#Userinput_requestTime) and sysdate
My question is whether the condition
tablename1.request_timestamp between to_date(#Userinput_requestTime) and sysdate
is correct or not. I even tried to_date(#Userinput_requestTime,'DD-MON-YY HH24:mi:ss') and sysdate, but that did not work either.
I have created an OSB business service using the generated adapter; the input I am giving is '04-MAR-15 03.36.23.368179000 PM -06:00', and the input is of type xs:string.
I am getting errors like java.sql.SQLDataException: ORA-01830: date format picture ends before converting entire input string.
Is my custom SQL correct or not? Please help me resolve this issue.
THanks,
SV
If tablename1.request_timestamp is a TIMESTAMP, then you need to compare timestamp BETWEEN timestamp AND timestamp, so convert the DATE to a TIMESTAMP using CAST.
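ORA-01830 literally means the format mask was exhausted before the input string was, which is what happens when '04-MAR-15 03.36.23.368179000 PM -06:00' meets the short mask 'DD-MON-YY HH24:mi:ss'. The same failure mode can be reproduced with Python's strptime (illustrative only; the Oracle-side fix is CAST or a timestamp mask that covers the full input):

```python
from datetime import datetime

fmt = "%d-%b-%y %H:%M:%S"   # analogue of the short mask 'DD-MON-YY HH24:MI:SS'

# Input that exactly matches the mask parses fine.
ok = datetime.strptime("04-MAR-15 03:36:23", fmt)

try:
    # Extra fractional seconds the mask cannot consume -> parse error,
    # the same situation ORA-01830 reports.
    datetime.strptime("04-MAR-15 03:36:23.368179", fmt)
    leftover_error = False
except ValueError:
    leftover_error = True
```

The rule is the same in both systems: the mask must account for every character of the input, including fractional seconds and time zone.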
check How to convert DATE to TIMESTAMP -
DSC 8.0 Run-Time custom installation
Hello,
I have developed an application using DSC 8.0 features and it runs fine on my development computer. I have a couple of questions regarding deploying a built executable on a customer computer.
1. Can the application exe be located in a folder different from the DSC RTS installation?
2. Can DSC Run-Time 8.0 be installed in a location other than the default? I started the DSC RTS 8.0 installation process, but the Browse... button for selecting a custom destination is disabled.
Ideally, I would like the application executable (and all the folder hierarchy it needs) to be separate from the DSC RTS (as when building pure LV code). If this is not possible, then two options exist:
a: install the application structure to the default DSC RTS location, or
b: install the DSC RTS to a custom location of my application (preferred).
Please, can you explain what options I have?
I cannot find a document that describes a complete procedure for deploying a DSC application with all the possible cases; as we all know, DSC applications are usually pretty complicated and consist of many parts, so deployment can be very frustrating. This is a request for the NI folks to put together one complete document (a kind of "Deployment Bible") for version 8.0.
Regards,
ROMP
Matt:
Please try the following and let me know what you find.
Open your project from the project explorer
Right-click on your application and select Properties
Click on Advanced in the Category list
Disable the Enable Enhanced DSC Run-Time Support option
I am curious to know what you find.
Regards,
Rudi N. -
I know that Citadel has a timestamp resolution of 100 nanoseconds... but I have also heard that LV DSC has a resolution of only 1 millisecond.
I know I could figure this out with a little testing, but I was wondering if the following is true:
1. If you log data to citadel using LV DSC, the default resolution for all timestamps is only 1 ms. This means the LabVIEW-DSC-generated timestamps are of 1 ms accuracy.
2. However, if you build your own VI-based device server, you can specify your own timestamps. Can these timestamps be of up to 100 ns accuracy?
Solved!
Go to Solution.
You are correct. The time resolution for ODBC is 10 ms, for the DSC Tag Engine it is 1 ms, and for Citadel it is 100 ns. Of course, you will be limited by the lowest resolution: when retrieving data through ODBC you will only get a resolution of 10 ms, even though the data was logged with a resolution of 1 ms.
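The takeaway — the coarsest layer in the path wins — can be illustrated with integer-nanosecond truncation. A Python sketch for illustration only; the actual layers do not literally share one formula:

```python
def quantize_ns(t_ns, resolution_ns):
    """Truncate an integer nanosecond timestamp to a given resolution,
    modeling how each layer in the logging/retrieval path coarsens it."""
    return (t_ns // resolution_ns) * resolution_ns

t_ns = 12_345_678_912                        # ~12.345678912 s as nanoseconds
citadel    = quantize_ns(t_ns, 100)          # Citadel:    100 ns
tag_engine = quantize_ns(t_ns, 1_000_000)    # Tag Engine: 1 ms
odbc       = quantize_ns(t_ns, 10_000_000)   # ODBC:       10 ms
```

Reading the same point through each layer yields progressively coarser stamps, which is exactly the chain described in the answer.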
If you create a VI-based server, it will still use the Tag Engine and thus be limited to the 1 ms resolution. -
How to handle timestamp in custom scheduled task written for Target Recon
Hi,
I have written a custom scheduled task to reconcile users from the target system... but whenever I run the Target Recon scheduled task, it generates recon events for all users.
How do I handle this... how do I pass a timestamp value to the custom Target Recon scheduled task?
Thanks,
Pallavi
To add to Kevin's point -
You will have to have an attribute either in the IT Resource or the Scheduled Task. At the beginning of the reconciliation searches, get the timestamp of the target system and store it in a variable. Fetch the timestamp attribute value from the IT Resource/Scheduled Task and use that value as one of the search criteria for user profiles on the target.
Once all the user profiles from the target have been queried and reconciliation is complete, update either the IT Resource or Scheduled Task attribute using the OIM API.
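The pattern above — snapshot the target's clock before searching, filter by the stored stamp, then persist the snapshot — can be sketched generically. All the names here are hypothetical stand-ins, not OIM APIs:

```python
def run_incremental_recon(store, fetch_target_time, search_users_since, create_event):
    """Reconcile only users changed since the previous run.

    store              -- dict modeling the persisted IT Resource /
                          Scheduled Task attribute
    fetch_target_time  -- stand-in: read the target system's clock
    search_users_since -- stand-in: query user profiles modified after a stamp
    create_event       -- stand-in: generate a reconciliation event
    """
    run_started = fetch_target_time()        # snapshot BEFORE searching, so a
                                             # change during the run isn't missed
    last_run = store.get("last_recon_time")  # None on the very first run
    for user in search_users_since(last_run):
        create_event(user)
    store["last_recon_time"] = run_started   # persist for the next run
```

The first run (no stored stamp) processes everyone; every later run only sees users changed after the previous snapshot, which fixes the "events for all users" symptom.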
Hope this helps :) -
Documentation/Guide on Writing a Custom Component (a DSC)
Hello,
I'd like to learn more about writing a custom component (a DSC). Does anyone know if there is documentation, a guide, or a tutorial on how to get started? Please share your comments.
Thanks in advance,
Han Dao
Hello,
Thank you for all replies, they are very helpful for starting to learn more about DSC.
I do have another question: is it possible to rewrite an already built-in service to customize it in a different way, e.g. to have the service operation "Read Document" in Foundation > FileUtilsService accept a different type of input?
Thanks,
Han Dao