Reading a high speed signal to fill an array

I have an extremely fast signal that I need to read and place into an array. The voltage values reset at a rate of 65,536 Hz, and I need to read the voltage immediately before each reset and place that element into an array. This would continue for the first 256 elements before stopping. I was thinking of doing this by placing a DAQ Assistant inside a For Loop, but that would require dealing with timing issues on each iteration of the loop. Is this possible, or is there a simpler way to perform this task?
Thanks,
Anthony

Hi Jeff, Anthony,
      Please forgive me if I misunderstand the scenario, here.  It sounds like Anthony needs to acquire 256 samples, at ~65KHz, with each sample representing voltage at the AtoD of the 6259 - immediately prior to the reset of an external DtoA.  I'm suggesting that Anthony use the same pulse he now generates for his DtoA as an external scan-clock - falling-edge-triggered.  Let the hardware accumulate the 256 samples - no need to worry about the OS, once the NI-DAQ acquisition is configured/armed.  ~65KHz seems to be well within the capability of the 6259.  NI-DAQ triggers are (typically) very flexible with respect to edge direction.
The question (as I see it) is whether the edge-triggered AtoDs will complete before the DtoA reset occurs.  If necessary, perhaps some delay can be incorporated in the DtoA device, or some delay circuitry could be inserted between the pulse and the DtoA, to make the DtoA wait a bit.
Again, sorry if this idea is way off-base!
P.S. LV7.1 shipped with an example called "Acquire N Scans - ExtScanClk.vi"
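Since LabVIEW itself is graphical, here is a hypothetical Python sketch of the scheme suggested above; it simulates a signal that resets 65,536 times per second and latches its value on the falling (reset) edge of the same pulse that drives the DtoA, so each sample is the voltage just before a reset. All names and numbers here are illustrative, not from the original post.

```python
# Hypothetical simulation (not LabVIEW): latch the signal value at each
# falling edge of the reset pulse, collecting 256 samples in hardware-like
# fashion rather than timing individual loop iterations in software.
SIM_STEPS_PER_RESET = 256    # simulated points per reset period
N_SAMPLES = 256              # samples to acquire

# Sawtooth in [0, 1): ramps up, then snaps back to 0 at each reset.
ramp = [(i % SIM_STEPS_PER_RESET) / SIM_STEPS_PER_RESET
        for i in range(SIM_STEPS_PER_RESET * (N_SAMPLES + 1))]

# A "falling edge" is where the next point is lower than the current one;
# the current point is the voltage immediately before the reset.
acquired = [ramp[i] for i in range(len(ramp) - 1) if ramp[i + 1] < ramp[i]]
acquired = acquired[:N_SAMPLES]

print(len(acquired))         # one latched value per reset
```

In real hardware the "falling edge" test is done by the board's sample-clock circuitry, which is why no per-iteration software timing is needed.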
Jeffrey P wrote:
the problem here does not lie within the capabilities of the board, but within the parameters of the system.  It will be extremely difficult to trigger the board at the specific moment when your DtoA goes low...
Message Edited by Dynamik on 12-28-2005 02:07 PM
When they give imbeciles handicap-parking, I won't have so far to walk!

Similar Messages

  • High speed event counting

    I am looking to count events, where an event is defined by one TTL pulse of 30 ns (a newer device allows for 3 V, 9 ns).  I previously used the 6602, but am looking to make a standalone Single-Board RIO application.  When talking to some engineers I was told that this was possible, but looking at the spec there needs to be a 100 ns minimum pulse width.  Are there any workarounds?  Is it possible to add some chip as a prescaler to the input (a divide-by-10 high-speed counter)?  I am stuck and am ready to make some hardware decisions.
    Paul Falkenstein
    Coleman Technologies Inc.
    CLA, CPI, AIA-Vision
    Labview 4.0- 2013, RT, Vision, FPGA

    falkpl,
    As I am not certain if a moderator can move a post, the best way would probably be to re-post the question on your desired board. Be sure to post a link here to the new discussion. You generally want to avoid double-posting, but as long as it is clear that this discussion was moved to another place, most people will forgive you. 
    Now, assuming you have not posted elsewhere, I will comment on your issue. The sbRIO's have IO pins that allow you to connect right into the FPGA. As you have found, the manual guarantees a 10 MHz rate and states you can get a bit faster if you are acquiring a single 3.3 V signal.
    To get even faster, you may have a few options.
    Use a deserializer to break up one high speed signal into multiple slower signals.
    A similar idea which will not require external hardware, but maybe a bit more programming is to read the same channel on several inputs with slightly staggered start times. This post discusses the idea a bit more.
    Good luck, let us know your thoughts and how it turns out.
    Peter Flores
    Applications Engineer
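    The staggered-sampling idea above can be illustrated with a small Python sketch (hypothetical values; the signal, rates, and channel count are illustrative, not from the sbRIO manual): read the same signal on two inputs whose sample clocks are offset by half a period, then interleave the streams to double the effective rate.

```python
# Hypothetical sketch: two channels at 10 MHz each, staggered by half a
# sample period, interleaved into one stream with 20 MHz effective rate.
def sample(signal, period, offset, n):
    """Sample signal(t) every `period` seconds starting at `offset`."""
    return [signal(offset + k * period) for k in range(n)]

signal = lambda t: 2.0 * t            # stand-in for the fast input signal
PERIOD = 100e-9                       # 10 MHz per-channel rate (100 ns)
N = 4

ch_a = sample(signal, PERIOD, 0.0, N)
ch_b = sample(signal, PERIOD, PERIOD / 2, N)   # staggered half a period

# Interleave: effective sample spacing is now 50 ns (20 MHz).
interleaved = [v for pair in zip(ch_a, ch_b) for v in pair]
print(interleaved)
```

    On an FPGA target the "stagger" would come from delaying one channel's start trigger; the interleaving step is the same.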

  • CONVERT HIGH SPEED TO WIFI

    Does anyone know how to convert a high-speed signal from a BB to WiFi? Or how to bridge a BB with a laptop and a digital phone that works with the internet?
    Thanks,
    Alejandro

    Cradlepoint makes a router you can plug your BB into to create a WiFi hotspot:
    http://www.cradlepoint.com/ctr350/ctr350.php

  • Mapping the High Speed Capture signal to RTSI

    Hello,
    Can I map the "High Speed Capture" signal to RTSI?
    When I use the motion RTSI example and make some changes (setting the source to "High Speed Capture" and the destination to RTSI_0), I get an error. Is this possible?
    I want to use this input to trigger an action on another PCI card.
    I'm using a PCI-7344 with a UMI-7774 and a PCIe-1430.
    Thanks
    Mor
    Message Edited by MotiM on 08-01-2009 08:47 AM

    Dear Jochen,
    Thank you for your reply.
    I tried to map the High Speed Capture to RTSI, but the HSC doesn't work.
    When I remove the RTSI mapping from my diagram, the HSC works and I can capture the encoder position.
    I'm using the UMI-7774, and the HSC is wired to the global connectors (the TRIGGER/BREAKPOINT connector).
    I also read all the relevant documentation about my hardware, and I noticed a comment about mapping the motion RTSI
    (from the Select Signal VI help): Note: You must route signals from the RTSI lines before you enable high-speed capture.
    I took that into account as well, but it still doesn't work.
    I would really appreciate it if you could take a look at the two versions of the VI I attached here.
    They both need to do the same thing.
    Each VI contains two parallel diagrams: one diagram is for the single-axis move (the move is to target position X),
    and the second diagram is for the vision, with a "trigger each line" from an RTSI line (the motion diagram includes mapping encoder phase A to RTSI for this purpose).
    I also want to use the High Speed Capture to trigger the start of IMAQ.
    The example
    HSC in motion activate the start of imaq in second loop.vi
    is a working example that runs well and captures the image,
    but it's "dirty" programming; I don't think it should work like this.
    It's actually an implementation of a busy-wait loop. (I don't like it, but it works...)
    The example
    HSC triger directly the start of imaq.vi
    is how I think it should be. (I know that I need to trigger the HSC within the timeout of waiting for the IMAQ start trigger to occur; that is why I put a big number there.)
    BUT, this VI doesn't work; the High Speed Capture doesn't happen at all.
    Can you take a look at these VIs and let me know what I'm doing wrong here?
    I really appreciate your help.
    Regards
    Mor
    Message Edited by M0Reng on 08-03-2009 02:14 PM
    Attachments:
    HSC in motion activate the start of imaq in second loop.vi ‏72 KB
    HSC triger directly the start of imaq.vi ‏71 KB

  • I have an NI5911 high speed digitizer and would like to acquire a video signal from a CCD using it. Does anyone know if this is possible without buying any more hardware please? If it is, how would I go about it?

    The only thing I succeeded in finding on the NI site was about using a different high-speed digitizer in tandem with an IMAQ card but I was hoping not to have to buy an IMAQ card as it means I may as well not have got the NI5911 (a while ago) in the first place. Any help would be greatly appreciated!

    You can certainly acquire video signals with the 5911. The only tricky part is that the 5911 does not have any video triggering options. That means that you will have to either find a way to provide your own trigger based on the video signal (that's where the IMAQ board comes into play) or you can just take a lot of data and keep only the data you need once it is in software. For instance, you could take one large record and then use software analysis to determine where the frame sync occurs. Hopefully this was helpful.
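    A minimal Python sketch of the software approach described above (the threshold, record values, and sync spacing are all illustrative assumptions, not 5911 specifics): take one long record, then locate syncs by finding where the signal crosses below the sync level.

```python
# Hypothetical sketch: find frame/line sync locations in a long record
# entirely in software, instead of using hardware video triggering.
SYNC_LEVEL = 0.1     # assumed sync-tip threshold in volts

# Toy record: "video" level 0.5 V with sync dips to 0.0 V every 10 samples.
record = [0.0 if i % 10 == 0 else 0.5 for i in range(50)]

# A sync starts where the signal falls below the threshold.
sync_starts = [i for i in range(1, len(record))
               if record[i] < SYNC_LEVEL <= record[i - 1]]
print(sync_starts)   # sample indices of detected syncs
```

    Once the sync positions are known, the samples between them can be reshaped into lines/frames and the rest of the record discarded.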

  • How can I modify the High Speed Data Reader VI to show correct time information in x-axis?

    I am just a beginner learning the LabVIEW programming currently.
    I have a PXI 6115 DAQ card and have to make a hardware timed acquisition VI for maximum performance. Thus I use the High Speed Data Logger VI for data acquisition.
    However, when I read my data by using the High Speed Data Reader VI, it doesn't show its correct time information in the graph.
    How can I modify the High Speed Data Reader VI to show correct time information in x-axis?
    I hope you can explain easily because I am a beginner.

    Hey Chunrok,
    I've modified the High Speed Data Reader VI slightly so that it now uses the scan rate of the data (as determined from the file) to set the scaling for the data points. If you wanted the start time to be a specific time you could use the start time obtained from your file to set the xscale offset as well.
    I hope this helps!
    Sarah Miracle
    National Instruments
    Attachments:
    Example.vi ‏281 KB
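    The scaling Sarah describes amounts to building the x-axis from the file's scan rate, with an optional start-time offset. A minimal sketch (the rate and point count are made-up values, not from the attached VI):

```python
# Hypothetical sketch: compute time-axis values from the scan rate read
# out of the data file, offset by an optional start time.
scan_rate = 1000.0           # scans/second, as read from the data file
start_time = 0.0             # offset; could be the file's start timestamp
n_points = 5

times = [start_time + i / scan_rate for i in range(n_points)]
print(times)                 # [0.0, 0.001, 0.002, 0.003, 0.004]
```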

  • How can I modify the High Speed Data Reader VI to show the time information in x-axis?

    I am just a beginner learning the LabVIEW programming currently.
    I have a PXI 6115 DAQ card and have to make a hardware timed acquisition VI for maximum performance. Thus I use the High Speed Data Logger VI for data acquisition.
    However, when I read my data by using the High Speed Data Reader VI, it doesn't show its time information in the graph.
    How can I modify the High Speed Data Reader VI to show the time information in x-axis?
    I hope you can explain easily because I am a beginner.

    Format the x-axis to either absolute or relative time.
    You can do this by right-clicking on the graph, selecting X axis, and then choosing formatting from the menu.
    Thanks,
    Naresh

  • How can I modify the High Speed Data Reader VI to show correct time information in x-axis?

    I am just a beginner learning the LabVIEW programming currently.
    I have a PXI 6115 DAQ card and have to make a hardware timed acquisition VI for maximum performance. Thus I use the High Speed Data Logger VI for data acquisition.
    However, when I read my data by using the High Speed Data Reader VI, it doesn't show its correct time information in the graph.
    How can I modify the High Speed Data Reader VI to show correct time information in x-axis?
    I hope you can explain easily because I am a beginner.

    First, I couldn't seem to find that example either on my computer or on the NI sites.
    The problem that we're running into is stated in the article that I pointed to: when real-time VIs are running, the OS stops updating the OS clock. To us it looks like the clock is losing time. There is a hardware clock on the PXI, but it is read only during boot-up to set the OS clock. Our discussions with NI have not led to a solution for this problem other than checking the time on start, then checking the tick count (which does not lose time) and calculating what the current time is. No access to the hardware clock is supplied.
    We're still working on a simpler way to get accurate time.
    Rob
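    Rob's workaround can be sketched in a few lines of Python (the names and the use of `time.monotonic` as a stand-in for the tick counter are assumptions, not the actual LabVIEW implementation): record the wall time and tick count once at startup, then derive the current time from the tick count, which keeps running even when the OS clock stalls.

```python
# Hypothetical sketch: compute current time from a monotonic tick source
# plus a one-time wall-clock reference taken at startup.
import time

start_wall = 1_000_000.0                 # assumed wall time at startup (s)
start_ticks = time.monotonic()           # stand-in for the tick counter

def now():
    """Wall time derived from elapsed ticks, immune to OS-clock stalls."""
    return start_wall + (time.monotonic() - start_ticks)

print(now() >= start_wall)
```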

  • I have one application that has requirement to do low and high speed acquisition. I want to change sample rate while running. BUT... I have E series Device

    I am writing control software for a process that is usually dull and requires only a 10 Hz acquisition rate.  At particular times during the sequence, however, we are interested in looking at a couple of channels at 1000 Hz.  My approach so far is to configure my buffered DAQ to run at the higher rate at all times.  When we are in the 'high-speed DAQ' mode, the program logs every point to disk.  In the 'low-speed' mode, I am picking off every nth (in this case, 10th) point to log to disk.
    At all times, I update my GUI indicators on the front panel at a maximum of 4 times per second (I find that anything faster results in an uncomfortable display), so I fill up a FIFO with data in my acquisition/logging loop, and read the FIFO in the display loop.  The data in my GUI display can be up to 250 milliseconds off, but I find this acceptable.  As a side note, I need buffered DAQ with hardware timing, as software timing results in lost data at 1000 Hz.
    This all works fine and dandy, but I am convinced that it is not the most elegant solution in the world.  Has anyone developed a buffered DAQ loop where the scan rate can be adjusted during operation?  I would like to change the rate of the E-series card rather than relying on down-sampling as I am now doing.
    The reason I have concerns is that at the moment I am simulating my AI using MAX, and when running the down-sampling routine I consistently miss a particular event in the simulated data, because the event in question always occurs at the same 'time'.  Granted, while it is unlikely that my measured signal and my acquisition would be perfectly synchronized in the real world, this particular situation points out the weakness in my approach.
    More than anything, I am looking for ideas from the community to see how other people have solved similar problems, and to have you guys either tear apart my approach or tell me it is 'ok'.  What do you think?
    Wes Ramm, Cyth UK
    CLD, CPLI

    Adding to Alan's answer:
    One of the problems that comes with these tricks for variable-rate acquisition is being able to match up sample data with the time that it was sampled. 
    If you weren't using either of E-series board's counters, there is a nifty solution to this!  You'll be using 1 of the counters to generate the variable-rate sampling clock.  You can then use the 2nd counter to perform a buffered period measurement on the output of the 1st counter.  This gives you a hw-timed measurement of every sampling interval.  You would need to keep track of a cumulative sum of these periods to generate a hw-accurate timestamp value for each sample.
    Note:  the very first buffered period measurement is the time from starting the 2nd counter until the first active edge from the 1st.  For your app, you should ignore it.
    -Kevin P.
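    Kevin's timestamping scheme can be sketched in Python (the period values are made up for illustration): the second counter returns a buffered period measurement for every sampling interval, and a cumulative sum of those periods gives a hardware-accurate timestamp for each sample, after discarding the bogus first measurement.

```python
# Hypothetical sketch: turn buffered period measurements into per-sample
# timestamps. Index 0 is the startup interval (counter start to first
# active edge) and is discarded, per the note above.
from itertools import accumulate

periods = [0.0123, 0.001, 0.001, 0.1, 0.1, 0.001]   # seconds, measured

timestamps = list(accumulate(periods[1:]))   # drop the first measurement
print(timestamps)
```

    Note how the variable sampling rate (1 kHz intervals mixed with 10 Hz intervals) is captured directly in the timestamps without any software timing.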

  • Counting TTL pulses at high speed

    Hi all,
    I am using PCI-6221 board with DAQmx to count the number of TTL pulses (which varies in its frequency between 0Hz to 10MHz) at a high speed (200,000 samples/sec.) and I am having a problem when the TTL pulse frequency drops below a certain level.
    I am using CTR0 to generate continuous pulse train at 200kHz frequency to feed to CTR1 Gate input. I verified that the pulse train is being generated fine.
    I am using CTR1 with buffered counting to collect the count for 200,000 samples at a time (a duration of 1 sec.). I got the example code (Cnt-Buf-Cont-ExtClk) and pretty much used it as is.
    CTR1 Gate is coming from CTR0 Out, which is 200kHz pulse train with 50% duty cycle, and CTR1 Source is the TTL signal that I am trying to count. At first, I thought that everything was working fine with the Source signal being at around 5MHz. Then, when I had the Source signal down below about 300kHz, I noticed that the program is taking longer than 1 sec. to collect the same 200k samples. Then, when I got the Source signal down to 0Hz, the program timed out.
    I am guessing that somehow the counter is not reading for the next sample when there has been no change to the count, but I cannot figure out why and how.
    Any information on this and a way to get around would be greatly appreciated.
    Kwang

    One thing you can try is setting the counter input CI.DupCounterPrevention property. This setting filters the input: it is possible that when CTR0 is slow, many of the intervals you are counting contain zero edges and are filtered out. Since they no longer produce points, the counter does not collect enough points before the timeout occurs, and the counter input read times out. I am not sure if this is your issue, but I found out the hard way that this behavior appeared when I switched to DAQmx, where this feature was added. Let me know if it worked,
    Paul
    Paul Falkenstein
    Coleman Technologies Inc.
    CLA, CPI, AIA-Vision
    Labview 4.0- 2013, RT, Vision, FPGA
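    The buffered-counting arrangement in this thread can be sketched in Python (the latched values are illustrative): the 200 kHz gate latches the running TTL count at each tick, and the pulses per interval are the differences between successive latched values. Intervals with zero new pulses are legitimate samples, which is why filtering them out starves the read at low input frequencies.

```python
# Hypothetical sketch: recover per-interval pulse counts from cumulative
# counter values latched on each 200 kHz sample-clock edge.
latched = [0, 25, 50, 50, 50, 75]   # cumulative counts at each gate edge

per_interval = [b - a for a, b in zip(latched, latched[1:])]
print(per_interval)                 # [25, 25, 0, 0, 25]
```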

  • LabVIEW 8.5 high-speed camera

    Hi,
    I'd like to take many high-speed images with a camera and save them to disk, or to my computer's hard drive. I configure the camera (it is a Pulnix TM-6740cl camera) with the Measurement and Automation Explorer (MAX); I take the pictures with LabVIEW. I've found an example program in LabVIEW that is close to what I want; it was written by two members of this forum, N_Holmes and reut.
    The only differences between that program and the program that I'd like to run are that I need to be able to change the shutter speed of the camera (although I believe that I can do this with MAX), I'd like to be able to change the target directory (for some reason, if I change the directory with the reut program, the camera only takes 1 "frame" and stops), and I'd like to be able to save a file for each frame taken by the camera (I can only get the reut program to save 1 file). I have LabVIEW version 8.5. If you could help me out in any way, that would be greatly appreciated. Thanks!
    Attachments:
    grab images and Save to FileReut-11.vi ‏67 KB

    Hi Bolin,
    I'm using an NI PCI 1426 with NI-IMAQ, 18MB. Thanks for your previous help. I've actually worked on the program, and I got the camera to take multiple pictures. I also incorporated a part in the program that decodes the raw bayer image that this camera outputs.
    I do have some other questions, though. First of all, I'm not entirely sure what you mean by "trigger." I use the computer to take the pictures; MAX configures the camera, and LabVIEW sends the signals to the camera to take the pictures, and then saves them on the computer. Also, I know this is probably a bad question, but I'm not 100% sure where in MAX I can look to find out what framerate I'm acquiring at. I'd like to acquire pictures at the camera's maximum framerate, which I believe is somewhere over 1000fps. I tried many different methods in order to capture at the maximum framerate, and the best option seemed to be at the settings: manual shutter, shutter setting 0, 4X4 binning. However, my pictures were of extremely poor quality, and they were very dark.
    Also, where can I go in MAX to change the shutter speed? I've tried changing the shutter control setting and the shutter setting number; I just want to verify that this is correct.
    Thanks again for your help! Once I can figure out how to make these pictures take at maximum speed, and at the best quality, then I can finally start on my project!

  • Verizon DSL - High "Speed (down/up)" Low connection rate

    Hello -
    I am doing this for my in-laws, who live in a semi-rural area.  They have Verizon DSL, and have been having very slow connection speeds for the past few months (they have the 1.5-3 Mbps plan, and are getting between 200-700 kbps).  Sometimes, for a few seconds, it can get as high as 1.1 Mbps, but it never gets to 1.5.
    I have tried lots of things.  We got a new modem, reset the D-Link router to factory settings, plugged the modem directly into the desktop, rebooted everything, then rebooted again, called Verizon and talked with their overseas people, who said they would have someone look at it.
    Everything, no dice. So I'm not sure what to do. 
    Here's one thing I noticed, however.  When I go into the Modem sign in screen, and look at it (It's the Red sign in, not the Blue, on a Westell 6000 DSL modem), they get a very high "Speed (down/up)" level.  Something like 1740 down, and 500-something up.
    But the actual downloads are nothing like that.
    Is that significant to anything? 
    My other situation is that I am not at their computer now (I am back in the city).  When I left their place, I did not know about these boards, and I can't remember any of the other information from the modem sign-in.  That one piece, however, seemed strange to me (a high "Speed (down/up)" level, but slow reality).
    As you think about it, could that lead to any clues or solutions I should look for the next time I go up there?
    Thanks for reading this.

    The closest proper sync rate that the modem would show for speed would be 1792/448Kbps, which would be the 1.5Mbps provisioning of the 1.5-3Mbps package. Verizon usually configures the speed lower if the line is not capable of holding a full 3Mbps or even an "optimized" flavor of it (which is ~2600kbps down, 640kbps up). Anyways, the next time you get the chance, I'm going to need the following information from their line:
    1: What do their modem Transceiver Statistics look like? If running a Westell modem, visit http://192.168.1.1/ , choose System Monitoring, Advanced Monitors and then click Transceiver Statistics. Post up what you see there. For ActionTec modems, check the Status pages of the ActionTec for DSL Stats. The address to ActionTecs are the same.
    2: Find out if the slow speeds are taking place all the time, or only during the evening hours
    3: Go to http://visualroute.visualware.com on their PC and choose the closest server to them. Let the Java applet load, and when it does it will show you a "Trace" box with your relative's IP address filled in. Press Trace and let it complete. When it completes, move the mouse over the second-last Circle (second from the right) and take down the name of it. If you see "ERX" in the name, please tell me.
    If you are prompted for a Username/Password while doing Step 1, try the following:
    admin/password
    admin/password1
    admin/admin
    admin/admin1
    ========
    The first to bring me 1Gbps Fiber for $30/m wins!

  • Connecting to High Speed Internet - I am not a Crook!

    Ok, first of all, you're going to have to hear me out. I think technically what I'm trying to do is illegal, but in the real world I think it's perfectly justified...
    I have cable internet at home that I pay for and enjoy. My computer is the only computer in the house and thus the only one using the service. Right now I am house sitting with my girlfriend and have brought my computer with me, thus leaving my cable internet hookup behind at home unused. I'm connecting to the internet here with a dialup connection account that the owners of the house have and the slowness is driving me crazy.
    I thought that if I brought my cable modem with me and hooked their cable into it, I would be able to get access to the high-speed internet, which I think I am justified in doing since I am paying for the service and it is going unused at home. The only problem is it doesn't seem to be working. The cable modem has a green light for "cable" but nothing for "data", and every time I try to go to a website it either asks me to dial in to the dialup service or work offline.
    At home I know that we have the same cable for our TV as we do for my internet because in my room where the computer normally is I have a small TV (for viewing with Premiere) that we also pay to have a cable connection into and the cable guy merely split the cable to go both into my computer and my TV.
    Is there something they would normally have to switch on outside to make the cable work for the internet as well? I'm not interested in messing with that; I'm just wondering why it's not working here.
    Darcy

    Oddly enough, I was in paying for my cable and asked (in a roundabout way) about the possibilities of hooking up at another house and what was the deal about splitting the line and getting basic cable without paying for it.
    The answers I got were basically...
    When you sign up for cable modem service, you are renting a port. That port can go anywhere; it's not stealing if you only use it from one place, and it can only get detected if it's on in two places.
    When you subscribe to cable modem, the basic cable is inherent in the signal. In my case (comcast), you can opt for a package that includes basic cable and the internet service for the same price as the internet port alone. They know people split the cable and hook it up to their TV, some installers go ahead and do that as a service. If the installer runs two cables into the house, it's simply split upline somewhere, it doesn't really accomplish anything.
    It was almost like talking to the limo driver in the Wayne's World movie. Too much information...

  • Route scan clock to high speed capture

    Hi, I want to have a continuous acquisition and sample into an array two encoders and my E-series channel at about 100 scans per second. I will be routing the board clock over RTSI, assumed to be the general-purpose clock, to do a high-speed capture from two encoders. Absolute positions and AI must be synchronized. Can I use the internal clock from the E-series, and what is it called? How do I get a periodic sample from AI to be stored with each high-speed capture?

    Matt,
    To synchronize your analog input and encoder measurements, you will need to route your analog input scan clock over RTSI. In LabVIEW, you will use Route Signal.vi with AI scan start as the signal source input and your chosen RTSI line as the signal name input. This RTSI line can then be used to latch your encoder readings into a buffer. Thus, the data in your analog input and encoder buffers will be synchronized.
    Good luck with your application.
    Spencer S.

  • Onboard Wait On High Speed Capture

    I would like for an onboard program to wait for a high speed capture signal from a trigger input. Unfortunately, I have not had success with the flex_wait_on_condition function; it has always timed out before detecting the event. However, calls to the function flex_read_hs_cap_status identify that the high speed capture line is indeed toggling faster than the 3 second timeout. I use the following sequence of functions to configure the high speed capture:
    flex_configure_hs_capture(m_BoardID, NIMC_AXIS2, NIMC_HS_LOW_TO_HIGH_EDGE, 0);
    flex_begin_store(m_BoardID, ProgramNumber);
    flex_enable_hs_capture(m_BoardID, NIMC_AXIS2, NIMC_TRUE);
    flex_wait_on_condition(m_BoardID, NIMC_AXIS2, NIMC_WAIT, NIMC_CONDITION_HIGH_SPEED_CAPTURE, 0, 0,
    NIMC_MATCH_ANY, 30, 0);
    flex_end_store(m_BoardID, ProgramNumber);
    Axis 2 is configured as an open-loop stepper axis with encoder resource 2 mapped to it.
    Any thoughts as to why this wouldn't work?
    Thanks!

    Thanks for the suggestion. It seems to work fairly well, although there is some delay between the trigger event and the execution of the critical section of code.
    Are you aware of a method to speed up execution of an on-board program? The critical section of code in the attached program fragment takes about 4 ms to execute. With the added delay of the polled high-speed capture line, I am limited to a ~150 Hz loop. I would like to roughly double the execution speed.
    Also, a command from the host computer seems to preempt the on-board program, causing it to take up to ten times as long to complete. Is there a way to set the priority of the on-board program task above host communication?
    Thanks for your assistance,
    Mike
    flex_insert_program_label(m_BoardID, LABEL_LOOP_START); // main program loop
    flex_read_hs_cap_status(m_BoardID, NIMC_AXIS3, DATA_HS_CAP_STATUS); // check if high speed capture triggered
    flex_and_vars(m_BoardID, DATA_HS_CAP_STATUS, DATA_HS_CAP_STATUS_MASK, DATA_HS_CAP_STATUS_MASKED); // AND high speed capture with trigger 3 mask
    flex_jump_label_on_condition(m_BoardID, NIMC_AXIS3, NIMC_CONDITION_EQUAL, NIMC_FALSE, NIMC_FALSE, NIMC_MATCH_ANY, LABEL_LOOP_START); // if trigger 3 not triggered, jump to main program loop
    // Critical Section Code >>>
    flex_set_breakpoint_momo(m_BoardID, NIMC_AXIS3, 0x08, 0x00, 0xFF); // set digital output high
    flex_enable_hs_capture(m_BoardID, NIMC_AXIS3, NIMC_TRUE); // re-enable the high-speed capture
    flex_read_adc(m_BoardID, NIMC_ADC1, DATA_ANALOG_INPUT_1); // read the analog input
    flex_write_buffer(m_BoardID, ANALOG_INPUT_BUFFER, 1, 0, &UselessLong, DATA_WRITE_TO_BUFFER_NUM_PTS); // write the analog input to the buffer
    flex_read_buffer(m_BoardID, VELOCITY_PROFILE_BUFFER, 1, DATA_VELOCITY_CMD); // read the next velocity profile point
    flex_load_velocity(m_BoardID, NIMC_AXIS3, UselessLong, DATA_VELOCITY_CMD); // set the axis velocity
    flex_start(m_BoardID, NIMC_AXIS3, 0); // update the velocity by calling start
    flex_set_breakpoint_momo(m_BoardID, NIMC_AXIS3, 0x00, 0x08, 0xFF); // set digital output low
    // <<< Critical Section Code
    flex_jump_label_on_condition(m_BoardID, NIMC_AXIS3, NIMC_CONDITION_TRUE, NIMC_FALSE, NIMC_FALSE, NIMC_MATCH_ANY, LABEL_LOOP_START); // jump to main program loop
    flex_end_store(m_BoardID, ProgramNumber); // stop program store
