DSC - Write Trace - Bit Array or Logical

Hi,
we are trying to write digital signals to a Citadel database. The polymorphic "Write Trace.vi" function has the types Logical and Bit Array.
First, I created a new database and tried to write 10 digital signals of type Boolean to it. Unfortunately I get an error: "SPW_WriteBool.vi:2". I can only write the data into the Citadel database if I convert the Booleans to 0 and 1 values and write them via the numeric type, but this causes an overhead of a factor of 32.
Is there a sample VI that uses "Write Trace.vi" with the Logical or Bit Array type? I've noticed that the trace I created is an analog one of type double. In another database at our company I've found a discrete trace of type double. Is this the problem?
Kind Regards
Joachim

Many thanks for your help. MAX would be a nice and easy way to view the data; for now I am trying to view it with a Mixed Signal Graph. I am very new to LabVIEW and am currently struggling to read my data correctly. I have 32 digital channels, and each sample of all 32 channels is packed into one U32. If my example loop runs twice, each channel should end up with 2 values. But how can I program it so that the first bit of the U32 belongs to digital channel 1, the second bit to digital channel 2, and so on? I've read that I can only transpose 2D arrays. In the attached image I've programmed the visualization of 2 digital channels containing 32 data points each, but the result should be 32 channels with 2 data points each (a conceptual sketch of the unpacking follows after the attachment).
Kind Regards,
Joachim
Attachments:
read from citadel - bit array.png ‏8 KB
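
Not a LabVIEW snippet, but here is a minimal Java sketch of the unpacking described above: each U32 sample carries one bit per channel, and the goal is 32 channels with one Boolean per sample. In LabVIEW terms this corresponds roughly to "Number To Boolean Array" per sample followed by a 2D transpose, but treat that mapping, the bit-to-channel order, and the example data below as assumptions.

    /** Sketch: unpack an array of U32 samples into per-channel Boolean arrays.
     *  Assumes bit 0 of each sample belongs to channel 1, bit 1 to channel 2, and so on. */
    public class UnpackDigitalChannels {
        public static void main(String[] args) {
            int[] samples = {0xAAAAAAAA, 0x00000003};   // two example U32 samples
            int channels = 32;

            // byChannel[channel][sample]: the transposed layout, 32 channels with 2 points each
            boolean[][] byChannel = new boolean[channels][samples.length];
            for (int s = 0; s < samples.length; s++) {
                for (int c = 0; c < channels; c++) {
                    byChannel[c][s] = ((samples[s] >>> c) & 1) != 0;
                }
            }

            System.out.println("channel 1, sample 1 = " + byChannel[0][0]);
            System.out.println("channel 2, sample 2 = " + byChannel[1][1]);
        }
    }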

Similar Messages

  • Write trace to Citadel

    I'm trying to take spreadsheet data and write it to individual traces in a Citadel 5 database using DSC 2012. I keep getting error -1967386570, "Data has back-in-time timestamp."
    Searching the NI website, back in 2006 there was a way to do this with VI Server.
    http://www.ni.com/white-paper/3485/en
    Is this still possible with the current DSC version?
    From the 2012 DSC help file.
    Writing a Value to a Citadel Trace (DSC Module)
    You can use the Write Trace VI to append a data point to a Citadel trace. Complete the following steps to write a value:
    Add the Write Trace VI on the block diagram.
    Wire the trace reference output of the Open Trace VI to the trace reference input of the Write Trace VI.
    Wire the value and timestamp inputs of the Write Trace VI. Leave the timestamp input unwired to use the current time. The Write Trace VI fails if the timestamp input is earlier than the timestamp of the last point written to the trace. You can determine the timestamp of the last point in the trace using the Get Trace Info VI.
    So, is it no longer possible to write old data into Citadel traces?
    I also saw some posts about a registry key for Citadel 5 about server timestamps, but I don't see a registry key where that note says it should be located.
    Logging Back-in-Time
    Most data logging systems generate ever-increasing time stamps. However, if you manually set the system clock back-in-time, or if an automatic time synchronization service resets the system clock during logging, a back-in-time data point might be logged. Citadel handles this case in two ways.
    When a point is logged back-in-time, Citadel checks to see if the difference between the point time stamp and the last time stamp in the trace is less than the larger of the global back-in-time tolerance and the time precision of the subtrace. If the time is within the tolerance, Citadel ignores the difference and logs the point using the last time stamp in the trace. For example, the Shared Variable Engine in LabVIEW 8.0 and later uses a tolerance level of 10 seconds. Thus, if the system clock is set backwards up to ten seconds from the previous time stamp, a value is logged in the database on a data change, but the time stamp is set equal to the previous logged point. If the time is set backwards farther than 10 seconds, Citadel creates a new subtrace and begins logging from that time stamp.
    Beginning with LabVIEW DSC 8.0, you can define a global back-in-time tolerance in the system registry. Earlier versions of DSC or Lookout always log back-in-time points. Use the backInTimeToleranceMS key located in the HKLM\SOFTWARE\National Instruments\Citadel\5.0 directory. Specify this value in milliseconds. The default value is 0, which indicates no global tolerance.
    This key doesn't exist on my system.
    This link from July 2012 seems to mention that it is still possible to use custom timestamps.
    http://www.ni.com/white-paper/6579/en
    Citadel Writing API
    The DSC Module 8.0 and later include an API for writing data directly to a Citadel trace. This API is useful to perform the following operations:
    · Implement a data redundancy system for LabVIEW Real-Time targets.
    · Record data in a Citadel trace faster than can be achieved with a shared variable.
    · Write trace data using custom time stamps.
    The Citadel writing API inserts trace data point-by-point with either user-specified or server-generated time stamps.
    Is there some more documentation out there that explains this process a bit better?

    Hi unclebump,
    I have been trying to determine the best course of action, and I think you need to move the data to a new trace. What I have in mind is for you to open a reference to the trace as it currently exists, then read in all the data of that trace. While you read that trace you should also be reading in the data from your file. Once you have both sets of data, iterate over them and merge the two sets based on their timestamps (a small sketch of that merge follows after this reply). The VIs to accomplish this should all be in the DSC palette under Historical, or Historical >> Database Writing. There is also a writing example in the Example Finder called Database Direct Write Demo that would probably be worth looking at. The Write Trace help says, "This VI returns an error if you try to write a point with a timestamp that is earlier than the timestamp of the last point written to the trace," which means that if your data is merged and written in order you should not get this error.
    Hope this helps and let me know if you have any questions.
    Patrick H | National Instruments | Software Engineer
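
    To make the merge step concrete, here is a minimal Java sketch of interleaving two timestamp-sorted point lists so that every point can then be written in non-decreasing time order. The TimedPoint type is a hypothetical placeholder (Java 16+ record), not a DSC API name.

        import java.util.ArrayList;
        import java.util.List;

        /** Sketch: merge two timestamp-sorted point lists so they can be written in order. */
        public class MergeByTimestamp {
            record TimedPoint(double timestampSeconds, double value) {}   // hypothetical point type

            static List<TimedPoint> merge(List<TimedPoint> traceData, List<TimedPoint> fileData) {
                List<TimedPoint> merged = new ArrayList<>(traceData.size() + fileData.size());
                int i = 0, j = 0;
                // Both inputs are assumed to be sorted by timestamp already.
                while (i < traceData.size() && j < fileData.size()) {
                    if (traceData.get(i).timestampSeconds() <= fileData.get(j).timestampSeconds()) {
                        merged.add(traceData.get(i++));
                    } else {
                        merged.add(fileData.get(j++));
                    }
                }
                while (i < traceData.size()) merged.add(traceData.get(i++));
                while (j < fileData.size()) merged.add(fileData.get(j++));
                return merged;   // write these points to the new trace in this order
            }
        }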

  • Write traces to spreadsheet file mis-wiring

    I want to extract the data from the DSC database based on a user- or program-defined time range and write that data to a text file. I have the first stages of that set up here, but there is a broken wire (cluster type) and I don't understand why it is broken.
    According to the help for "Write Traces to Spreadsheet File", the trace information cluster shown in the VI should be correct. But it is not; it is marked as a broken connection. What is wrong?
    Attachments:
    extract_data_from_database_-_sent_to_NI.vi ‏52 KB

    Change the order of the objects inside your cluster. Disconnect the cluster wire from the Write Traces to Spreadsheet File VI, right-click the terminal, and select Create Constant. Turn on the Context Help window, move the cursor over both wires, and note the order of the items in each wire's datatype.

  • Byte into bit array?

    I want to implement a little compression program, and I have to convert the byte array into which I read a file into a bit array, because I have to be able to change single bits in that array.
    1. Is there a wrapper class?
    2. How many bits does the class Byte contain?
    If it is 8, then I can convert a single byte into 8 bits using a simple algorithm, right?
    The byte simply represents its value (0-255) in hex, right?
    thanks for your reply!
    tim

    There is a class named BitSet that seems to be close to what you are looking for (at least in SDK 1.3). But if you can manipulate bytes, it might be more efficient to write your own code.
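
    As a small sketch (assuming the file has already been read into a byte array): in newer JDKs, BitSet.valueOf(byte[]) wraps the bytes so single bits can be read and changed; on older JDKs you would fill the BitSet in a loop. Note that it indexes bits little-endian within each byte, and yes, a byte holds 8 bits (values 0-255 when viewed as unsigned).

        import java.util.BitSet;

        /** Sketch: turn a byte array into a BitSet so single bits can be inspected and changed.
         *  Bit i of byte k maps to BitSet index k*8 + i (little-endian within each byte). */
        public class ByteToBits {
            public static void main(String[] args) {
                byte[] data = {(byte) 0xA5, 0x0F};              // example input, e.g. bytes read from a file

                BitSet bits = BitSet.valueOf(data);             // wraps the bytes as a bit array (Java 7+)
                System.out.println("bit 0 = " + bits.get(0));   // least significant bit of data[0]

                bits.flip(3);                                   // change a single bit
                byte[] back = bits.toByteArray();               // convert back to bytes when done
                System.out.printf("first byte is now 0x%02X%n", back[0] & 0xFF);
            }
        }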

  • Write Text Data Array to text file

    Greetings all. I hope someone can help me, as I am really under the gun. The attached VI shows the basics of what I am trying to do. I have already written a VI that takes the Cal Data Array and prints it out in a nicely formatted report. My problem is that the powers that be also want the data saved to a generic text file that can be copied and printed anywhere they like. As such, I need to save the data to a generic text file in column format so that it will all fit on one page in landscape mode. There are a total of 12 columns of data. I have been trying to format each column to a specific length instead of them all being the same width, with no luck so far. Basically, I need columns 1, 2, 3, 8 and 12 to have a length of 5 and the rest a length of 9. I have tried to place the formatting in a for loop with a case structure, but it does not appear to work. I really need this quickly, so if anyone has any ideas, please help. As always, I really appreciate the assistance.
    Thanks,
    Frank
    Attachments:
    Write Cal Data to Text File.vi ‏21 KB

    pincpanter's is a good solution; beat me to it while I was away building an example. Mine is a similar approach using two for loops and a case structure. Here is my suggestion anyway (a text-based sketch of the same column widths follows after the attachment)....
    cheers
    David
    Attachments:
    Write Text Data Array to text file.vi ‏31 KB
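
    Not a LabVIEW diagram, but the same per-column width scheme described above (5 characters for columns 1, 2, 3, 8 and 12, and 9 for the rest) can be written out in text form. A minimal Java sketch with made-up data:

        /** Sketch: format one row of 12 fields with per-column widths
         *  (5 characters for columns 1, 2, 3, 8 and 12; 9 characters for the others). */
        public class FixedWidthRow {
            private static final int[] WIDTHS = {5, 5, 5, 9, 9, 9, 9, 5, 9, 9, 9, 5};

            static String formatRow(String[] fields) {
                StringBuilder line = new StringBuilder();
                for (int col = 0; col < fields.length; col++) {
                    // "%-5s" / "%-9s" left-justifies and pads each field to its column width
                    line.append(String.format("%-" + WIDTHS[col] + "s", fields[col]));
                }
                return line.toString();
            }

            public static void main(String[] args) {
                String[] row = {"1", "OK", "A", "1.23456", "2.34567", "3.45678",
                                "4.56789", "B", "5.67890", "6.78901", "7.89012", "P"};
                System.out.println(formatRow(row));   // one fixed-width line ready for a plain text file
            }
        }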

  • Leopard Server and NAS arrays and Logical Disks

    Can anyone confirm that Leopard Server now sees NAS arrays as logical disks? If so, what would be a good alternative to the now-gone Xserve RAID to connect to my new Xserve, preferably over a 4 Gb Fibre Channel connection? I'm trying to stay away from the $12K Promise RAID and go for something in the $3K to $5K range that is about 3 to 5 TB.
    Anyone have any suggestions?
    thanks,
    Dan

    It's not clear to me what you're asking.
    First, what do you mean by 'Leopard server now sees NAS arrays as logical disks'? They've always been logical disks - at least based on my understanding of the term.
    Secondly, you ask about NAS, but go on to talk about fiber channel. By definition they are two separate technologies. You're using NAS or you're using fiber channel in either a SAN or a direct-attach model. There is no 'fiber channel NAS' option, unless you mean a SAN?

  • How to write a 2D array to a database using the Database Connectivity Toolkit?

    Hey Gang.
    I am having trouble writing records efficiently to my database. I have a 2D array of elements that I am writing to the database. Currently, I am writing one row at a time using a FOR loop and "DB Tools Insert Data.vi". Unfortunately, this process is very time consuming. When I try writing the entire 2D array (as opposed to one row at a time) I get the error:
    "Number of query values and destination fields are not the same..."
    Is there any way to write a 2D array to a database in one shot (and take up the same number of records as if I wrote each row individually)?
    Any ideas would be much appreciated.
    Using LV 2010.
    Cheers

    Maybe I've explained the issue incorrectly. Please see the attached snapshot. The top portion of the code is what I currently have. It works, but it writes one database record at a time and the loop runs about 1000 times. (How I generate the data to write is not particularly important; how I write it to the database is currently my issue.)
    The second loop is what I would like to get working, as the database write function would then only run 4 times (as determined by the number of elements in the variable "Multiple Tables"). A conceptual sketch of batching the rows is shown below the attachment.
    Thanks to all for the input so far.
    Attachments:
    Database Write Test.zip ‏64 KB
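
    The Database Connectivity Toolkit VIs are LabVIEW-specific, but the underlying idea of sending many rows per round-trip can be illustrated outside LabVIEW. Here is a hedged Java/JDBC sketch using a parameterized batch insert; the connection URL, table and column names are made up for illustration, and a JDBC driver (sqlite-jdbc here) is assumed to be on the classpath:

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;

        /** Sketch: write a 2D array as many rows using one batched execution. */
        public class BatchInsert2D {
            public static void main(String[] args) throws Exception {
                double[][] data = new double[1000][3];           // stand-in for the 2D array to be written

                try (Connection conn = DriverManager.getConnection("jdbc:sqlite:test.db");
                     PreparedStatement ps = conn.prepareStatement(
                             "INSERT INTO measurements (ch1, ch2, ch3) VALUES (?, ?, ?)")) {
                    conn.setAutoCommit(false);                   // commit once at the end, not per row
                    for (double[] row : data) {
                        ps.setDouble(1, row[0]);
                        ps.setDouble(2, row[1]);
                        ps.setDouble(3, row[2]);
                        ps.addBatch();                           // queue the row instead of executing it
                    }
                    ps.executeBatch();                           // send all queued rows in one call
                    conn.commit();
                }
            }
        }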

  • How can I write a bit to DIO-96 card?

    I've got a diagram program that works with the DIO-6533 card. With it I can write and read a bit on the card. Now I have to change the card to a DIO-96 and the program doesn't work: it first writes a signal to one cable connector (with 50 pins) and then to the other cable connector (with 50 pins). What can I do to make the program work correctly?

    Hi,
    I don't know what it is (maybe I'm stupid??) but I find NI's digital functions a monumental pain in the t*ts to work with, for several reasons:
    1) For simple devices such as the PCI-6503 you can configure individual channels to READ individual bits with no problem (assuming you set the 'read' VI's port width to 1). Try to configure individual channels to WRITE individual bits and you have a problem: basically, all of the other bits on that port get set to zero when you write the bit, even when you write a value other than zero to the iteration terminal, and even with the port width set to 1!! In other words, 'read' seems to work fine on a bit-by-bit basis, but 'write' only seems to work on a port basis.
    2) Several of NI's cards do not give you the option to set the startup condition of the digital lines - they default to 'high', meaning you have to switch a 0 V input to them and then invert the value to get a high reading (when a switch makes, you normally expect a 'high' signal??).
    3) You can only have bits and ports configured as read or write - even 'low-end' boards such as Arcom's PCIB-40 allow you to set a bit value and then read it back!!!
    Errrmm... starting to run out of complaints here (well, it is nearly 1:00 AM on Saturday morning :-)
    If anyone can point out something that I'm doing very wrong here then I'd love to hear it; otherwise I think NI should give some consideration to the basic functionality of their low-end digital stuff.
    All the best
    Andy

  • 8 bit Array onto 10 bit Array

    Hi
       I am trying to convert an array of U8 into a 10-bit array (take the first 8 bits, then add the next 2 bits of the next byte, then the remaining 6 bits with the next 4 bits, and so on). The RAM we are using is 8-bit but the CCD image is actually 10-bit. I have attached a couple of JPEGs to show what I have done, and I was wondering whether there is an expert around who can say if this is the most efficient way of doing it. I have a large array to convert and would like to make it as quick as possible.
    Thanks for any help Gary
    Attachments:
    Front Panel.jpg ‏57 KB
    Block Diagram.jpg ‏56 KB

    Gary,
    Reshaping an array of bits is quite easy and (imho) the block diagram code is much easier to understand.
    For arrays of a different size you will just have to calculate the first dimension for the resize operation (a textual sketch of the same repacking follows after the attachment).
    Regards
    Anke 
    Attachments:
    8to10.png ‏4 KB
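
    As a textual counterpart to the attached diagrams, here is a minimal Java sketch of the repacking described above: bits are shifted into an accumulator byte by byte, and a 10-bit value is emitted whenever at least 10 bits are available, so every 5 input bytes yield 4 output values. Whether the camera packs the 10-bit words MSB-first is an assumption here.

        /** Sketch: repack a stream of 8-bit bytes into 10-bit values (5 bytes -> 4 values).
         *  Assumes the 10-bit words are packed MSB-first across byte boundaries. */
        public class Pack8To10 {
            static int[] repack(byte[] bytes) {
                int[] out = new int[(bytes.length * 8) / 10];
                int acc = 0, bitsInAcc = 0, outIdx = 0;
                for (byte b : bytes) {
                    acc = (acc << 8) | (b & 0xFF);                     // append 8 new bits on the right
                    bitsInAcc += 8;
                    while (bitsInAcc >= 10 && outIdx < out.length) {
                        bitsInAcc -= 10;
                        out[outIdx++] = (acc >>> bitsInAcc) & 0x3FF;   // take the next 10 bits
                    }
                }
                return out;
            }

            public static void main(String[] args) {
                byte[] raw = {(byte) 0xFF, (byte) 0xC0, 0x00, 0x3F, (byte) 0xF0};   // example frame data
                for (int v : repack(raw)) System.out.println(v);       // prints four 10-bit values
            }
        }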

  • How to run 32 bit plugins in Logic Pro X

    Hi, everybody.
    The following link is from the Macprovideo website. They have great tutorials on all things Logic and many other topics.
    This article describes how to run 32-bit plugins in Logic Pro X.
    Hope it helps all those who can't run their favourite 32-bit plug-ins.
    http://www.macprovideo.com/hub/logic-pro/how-to-run-32-bit-plug-ins-in-logic-pro-x
    Regards,
    Scorpii

    Hi, Q
    Go to the Macprovideo site.
    When the home page has loaded, scroll down until you see a section called MPV Hub on the left side of the page. (At present it's under a series of banners for the Logic X tutorials.)
    On the right-hand side is a heading "1697 Articles" with a blue link underneath saying "See All".
    Click the link and scroll down the first MPV Hub page; you'll see the article just after a review of Arturia's Spark LE.
    Hope this helps,
    regards,
    Scorpii

  • How do I set the trace interval for DSC Read Trace.vi ?

    Hello;
    I am using the DSC Module's "Read Traces.vi" to read historical traces and display them on an XY graph. Normally I update in real time every second or two and have a time interval of one hour or less, but sometimes I want to look at a day or more of data on the trend. Since all of my variables are logged at one-second intervals (although I do have a log deadband for most variables), this could potentially be a tremendously large array, and I want to minimize CPU usage and eliminate "out of memory" problems.
    Does updating the "interpolation interval" value in the VI's options cluster change the point-to-point interval for all data points in a trace, or only in areas that need to be interpolated, e.g. where there are gaps (because, for instance, a data point's value did not change by more than its log deadband for a period of time)? I am wondering whether setting this value to a larger number of seconds will solve my problem, or make no difference if all of my data points change continually.
    Thanks in advance for the help.
    Michael Hampson
    XL Automation, Inc.
    Michael Hampson
    President
    XL Automation, Inc.

    Hi Michael,
    I spent a good deal of time playing around with this, and it appears that the functionality of the Read Traces.vi's option input is to interpolate (draw a straight horizontal line in a graph) the data for all traces. This includes those that have updated within the interval (the updates are lost)... and having a larger interval would have more data loss.
    If you want to minimize disk usage, you'd want to increase the deadband so that fewer points are written to Citadel - I'm not sure what aspect of your program is concerning you with CPU usage. You mention that all of your variables are logged in one second intervals.
    Hopefully I have understood your concern and the workings of this VI. If I haven't, please post some screenshots or describe a little more clearly what you are looking to accomplish.
    For future reference, the LabVIEW DSC forum would probably be a better place to post questions like this, rather than the FieldPoint forum.
    Best regards,
    -Sam F, DAQ Marketing Manager
    Learn about measuring temperature
    Learn how to take voltage measurements
    Learn how to measure current

  • HT3989 How do you activate 64-bit mode in Logic Pro 9 if you purchased it directly from the App Store?

    Hey team,
    the instructions for the regular activation of 64-bit mode don't seem to work, as there is no Logic folder in Applications, only the icon to launch the program.
    thanks

    Hi
    Select the Logic Pro application within the Applications folder and use Command-I (or File > Get Info).
    Un-check the "Open in 32-bit mode" checkbox.
    CCT

  • Screen exit for CO11N and problems writing the PBO and PAI logic?

    Hi People,
    I am developing a screen exit for transaction CO11N. I have found the exit (CONFPP07), created a project, assigned the exit to it, and activated the project. I have created a field named SHIFT; when this field is clicked I have to offer three possible values (a, b, c) and store the selected value in some table. Now my problem is: in which include do I have to write the PBO and PAI logic? Should I write the PBO logic in the include provided in EXIT_SAPLCORU_S_100 and the PAI logic in EXIT_SAPLCORU_S_101? Can anyone tell me what I have to do to meet these requirements, and to which structure I have to add this field so that it gets stored in some table?
    Thanks in Advance.

    Hi,
    Use the includes in the program SAPLXCOF given in the CONFPP07 exit.
    You may use the include ZXCOFZZZ (for subroutines and modules)
    by creating it with a double-click.
    Regards,
    Wajid Hussain P.

  • Write 2D data array to spreadsheet with good alignment

    Using "write to spreadsheet file" function to write 2D array,   I always have the problem that headers cannot align well wiith the data. See attached snapshot please (There are 7 column headers.)
    WT
    LV2013sp1

    Plain text displays cannot handle tabs very well.
    You can take the spreadsheet file and open it in e.g. Excel and it will all look great. Spreadsheet files have no formatting information, just row and column delimiters.
    If you want to create nicely formatted report files in LabVIEW, use the report generation tools and pick an output file format that is suitable for what you want to display (e.g. HTML).
    If you want to display your data in a plain-text indicator, you need to do two things:
    use a fixed-width font (e.g. Courier)
    format all fields to a fixed width with padding characters (a small sketch follows after this reply).
    For a tabular front panel indicator, you should use a table. You can write the header strings into the table header.
    LabVIEW Champion. Do more with less code and in less time.
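
    As a tiny illustration of the fixed-width-font plus fixed-width-fields suggestion above, here is a Java sketch; the header texts and numbers are made up:

        /** Sketch: pad headers and data to the same fixed widths so columns line up
         *  in a plain-text indicator that uses a monospaced font. */
        public class AlignedColumns {
            public static void main(String[] args) {
                String[] headers = {"Time", "Ch A", "Ch B"};
                double[][] rows = {{0.0, 1.234, 5.678}, {1.0, 2.345, 6.789}};

                StringBuilder text = new StringBuilder();
                text.append(String.format("%-10s%-10s%-10s%n", (Object[]) headers));
                for (double[] r : rows) {
                    text.append(String.format("%-10.3f%-10.3f%-10.3f%n", r[0], r[1], r[2]));
                }
                System.out.print(text);   // columns align only when shown in a fixed-width font
            }
        }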

  • Save 16 bit array as image

    Hello,
    I am a new LabVIEW user. I am trying to save an image which is being correctly displayed on the front panel of my program as an intensity graph. I am able to get a 2D array of the pixel values (using the Variant To Data function; the camera I use has an ActiveX interface with LabVIEW). The camera allows 16-bit or 12-bit digitization options.
    I would like to save the image pixel values as a 16-bit BMP or TIFF file. However, from what I understand, the Flatten Pixmap option in LabVIEW does not support 16-bit images. How do I go about saving my images? Any feedback would be extremely useful.
    Thanks,
    Sanhita

    Hello,
    have a look at this example and tell me if it works as you expect (a textual sketch of saving 16-bit data follows after the attachment)
    hope this helps
    When my feet touch the ground each morning the devil thinks "bloody hell... He's up again!"
    Attachments:
    image.vi ‏106 KB
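
    Outside LabVIEW, one way to keep the full 16-bit depth is to write the 2D array into a 16-bit grayscale image file. Here is a Java sketch using BufferedImage.TYPE_USHORT_GRAY and the standard PNG writer (PNG supports 16-bit grayscale); the image size and file name are example values, and whether your downstream tools need TIFF instead is a separate question:

        import java.awt.image.BufferedImage;
        import java.io.File;
        import javax.imageio.ImageIO;

        /** Sketch: save a 2D array of 16-bit pixel values as a 16-bit grayscale PNG. */
        public class Save16BitImage {
            public static void main(String[] args) throws Exception {
                int width = 640, height = 480;
                int[][] pixels = new int[height][width];         // stand-in for the camera data (0..65535)

                BufferedImage img = new BufferedImage(width, height, BufferedImage.TYPE_USHORT_GRAY);
                for (int y = 0; y < height; y++) {
                    for (int x = 0; x < width; x++) {
                        // write each pixel into the 16-bit raster
                        img.getRaster().setSample(x, y, 0, pixels[y][x] & 0xFFFF);
                    }
                }
                ImageIO.write(img, "png", new File("frame16.png"));   // PNG preserves the 16-bit depth
            }
        }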
