LabVIEW 8 with 4350 DAQ board

I upgraded from LabVIEW 6 to LabVIEW 8. At the same time, I moved the old 4350 board into this new WinXP computer.
The board works great in MAX, but LabVIEW can't seem to find it, and I can't use any of my old data acquisition VIs.
According to MAX, I have both the DAQmx and Legacy versions of NI-DAQ.
What else do I need to check? Or is all of this an incompatible combination?

Hi Todd,
You probably have an older version of the Traditional NI-DAQ driver. To use that hardware in LabVIEW 8, you need the latest version of the Traditional NI-DAQ driver, 7.4.1, which you can download here.
I hope this helps.
Regards,
Natasa

Similar Messages

  • Not receiving any signal input with LabVIEW SE, USB DAQ board

    Hello all, good afternoon.
    I am having a little trouble with my DAQ device, I hope that someone will be able to help me.
    I'm plugging a BNC pH sensor into an NI USB-6251 M-series board to obtain the electrical response of that particular sensor, measuring the voltage output for each buffer solution (pH 4 and 7). I have the latest (9.2.3) DAQmx drivers.
    The pH sensor (floating) is connected through a female BNC plug to the circuit board. The signal goes through a signal-conditioning circuit composed of a voltage follower on each of the sensor wires (to lower the output impedance of the sensor) and two bias resistors from AI+ and AI- to AI GND, with the board case wired to the power supply ground, as seen in http://zone.ni.com/devzone/cda/tut/p/id/4494
    I'm measuring in differential mode; as I said, both wires of the sensor go through a voltage follower each and then into the DAQ board. The op-amps are powered by ±15 V, as seen, again, in http://zone.ni.com/devzone/cda/tut/p/id/4494
    I did take the time to read the manuals and tutorials I found, did quite a bit of investigation over the last few days, and wired the system carefully.
    In LabVIEW SignalExpress, I chose an analog voltage read and selected an input range of -500 mV to 500 mV, because the sensor should output at most -400 mV to 400 mV.
    Now, much to my dismay, LabVIEW registers absolutely nothing on that channel, only a little noise (~1 mV). I unplugged the sensor and realised that whether the sensor is connected or not makes no difference in the presented output; it's as if the sensor weren't there at all. And yes, I am measuring the correct channel. I also tried connecting the wires to different AI channel pairs, in case one was faulty, but with no success.
    If I connect a voltmeter to the wires going into the analog inputs (after conditioning), it measures about 50 mV, which is a lot more than the 1 mV the board is measuring, but a lot less than the ~180 mV I should be measuring with a pH 4 buffer solution.
    I know for a fact the problem is not the sensor, and the DAQ board seems to be fully operational.
    I think it might have to do with the signal conditioning I'm doing, or some sort of configuration with the hardware of the DAQ board.
    So, any idea what I might be missing?
    I'm a week behind schedule because of this, hopefully I will find my answer here.
    Thank you very much in advance!
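    (For reference, the read being described, one differential channel on a ±0.5 V range, looks roughly like this in the NI-DAQmx C API; the device name "Dev1" and channel ai0 are assumptions, and error checking is omitted:)

    #include "NIDAQmx.h"
    #include <stdio.h>

    int main(void)
    {
        TaskHandle task = 0;
        float64 volts;

        DAQmxCreateTask("", &task);
        /* differential input, +/-0.5 V range, matching the SignalExpress setup */
        DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Diff,
                                 -0.5, 0.5, DAQmx_Val_Volts, NULL);
        DAQmxStartTask(task);
        DAQmxReadAnalogScalarF64(task, 10.0, &volts, NULL);
        printf("Electrode voltage: %f V\n", volts);
        DAQmxClearTask(task);
        return 0;
    }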

    Good afternoon.
    Yes I have tried all of that. The board responds very well when a flat 5V DC voltage is fed into a channel.
    However, after much investigation I believe I found the problem yesterday: it is probably the operational amplifiers I was using. Their input bias current is on the order of nanoamps, while pH measurements need an input bias current of picoamps or less, because of the high impedance of the sensor. This was causing a drift of over 100 mV at my output, which is about 2 pH and thus unacceptable (even though I had the bias resistors from the inputs to ground, as suggested in the NI tutorials, they apparently were not enough to compensate for the high input bias current).
    All the op-amps available in the lab suffer from this, even instrumentation amplifiers like the AMP04 from Analog Devices. I still haven't had a chance to acquire new op-amps with the necessary input bias current, but hopefully that will do the trick. It makes sense: I checked the datasheets for every op-amp in the lab, and they all had input bias currents around 400 nanoamps (and none of them worked), while every op-amp in every pH-meter circuit I found has input bias currents of pico- or femtoamps.
    The sensor I'm using is from AquaMedic. I'm not sure what the model is, but it's a cheap plastic sensor that comes bundled with their pH Computers.
    Thank you very much for your input.

  • Interface labview with the test board to identify opamp pins automatically

    Is it possible to interface LabVIEW to a test board in such a way that the input and output pins of an op-amp (8-, 10-, 14-, and 16-pin op-amps) are identified automatically when the DUT is placed in the socket?

    pratheek wrote:
    Thanks a lot. The pin numbers are variable; I need to test the functional and test parameters for 8-, 10-, 14-, and 16-pin op-amps.
    The first thing you need to do is ensure that +VCC, -VCC, and GND are always connected to the correct pins.  I think the best way to do this is to have preset wiring options that you manually select with switches (this reduces the complexity of the LabVIEW-controlled wire switching).
    pratheek wrote:
    The desired output for my project is to check the functionality of the op-amps.
    I understand the end goal, but what you need to specify first is the test sequence.
    1. Identify the in/out pins and the number of op-amps.  What is the sequence for this?  What voltages do you apply to the various pins, and what is your desired outcome?  How do you identify each pin as input or output, and as belonging to OpAmp1, OpAmp2, etc.?  Since this sequence will inevitably apply voltage to the output pins, make sure your test sequence doesn't damage the chips.
    2. After identifying each op-amp, what is the test sequence for evaluating its "functionality"?  Do you wish to evaluate the gain for various resistor combinations in an external circuit?
    3. Now, with your test sequences outlined, you can finally begin getting LabVIEW to implement these functions.  I'd use the DAQ digital outputs to drive a decoder (3-to-8 or 4-to-16), and use the outputs of the decoder to control switches (these switches control the connections between a second DAQ device and the test pins); see the sketch after this post.  A third DAQ device may even be necessary to control the selection of the various test resistors.  Keep in mind that each of these switches adds resistance to the path, so your evaluation must be made by measuring the voltages at the op-amp pins (not by using the sourced voltage values).  Planning the full measurement sequence and requirements before you start will not only guide your program's development, it will also tell you which DAQ devices are useful and how many you will need.
    This is quite an undertaking, and we can't really provide any help on the LabVIEW end until you have a full write-up of what you want to happen and how (step-by-step details).  Once you have that, start thinking about the sequence of events your code will have to follow to make it happen, then come back for guidance/help.
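    As a starting point for item 3, driving a decoder's select lines from the DAQ digital outputs could look like the following NI-DAQmx C sketch; the device name "Dev1", the port0/line0:3 wiring, and the 4-to-16 decoder are assumptions:

    #include "NIDAQmx.h"

    TaskHandle doTask = 0;
    /* line0 is the decoder's least-significant select bit:
       {1,0,1,0} = binary 0101 = route test pin 5 to the measurement DAQ */
    uInt8 select[4] = {1, 0, 1, 0};
    int32 written;

    DAQmxCreateTask("", &doTask);
    DAQmxCreateDOChan(doTask, "Dev1/port0/line0:3", "",
                      DAQmx_Val_ChanForAllLines);
    DAQmxWriteDigitalLines(doTask, 1, 1, 10.0, DAQmx_Val_GroupByChannel,
                           select, &written, NULL);
    DAQmxClearTask(doTask);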

  • Will LabVIEW talk to a DAQ board made by Measurement Computing?

    I am a senior at UNC-CH, working on a research project for a professor who has a DAQ board made by Measurement Computing, which looks very similar to NI boards I have used in the past. Model: CIO-DAS16/330. The software we're using is LabVIEW 5.0, and I need to know if there is any way to make LabVIEW talk to these boards. I understand that the best solution would be to buy an NI board that LabVIEW is configured to work with, but we don't have the money for that!

    In article <50650000000500000017E00000-1042324653000@exchange.ni.com>,
    "alberto" wrote:
    > Hi Nicoletta,
    > I checked on their site and they provide a library of LV functions to
    > interface to their boards:
    > http://www.measurementcomputing.com/cbicatalog/cbiproduct.asp?dept%5Fid=184&pf%5Fid=623&mscssid=HUXACAQX49EF8G7NLAUUX95WUP156JX5
    >
    > Good luck,
    > Alberto
    Yeah, I've used these libraries and they work fine. The library package cost me $50 about a year ago.
    Most of their boards are supported under the Comedi drivers, so you can use them with Linux if you don't want to pay for the MC libraries.
    -Kevin
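    For reference, that $50 LabVIEW library wraps Measurement Computing's Universal Library; the same read at the C level looks roughly like this sketch (board number 0, channel 0, and the ±5 V range are assumptions that depend on your InstaCal configuration):

    #include "cbw.h"    /* Measurement Computing Universal Library */
    #include <stdio.h>

    int main(void)
    {
        unsigned short raw;
        float volts;

        /* single read of analog input channel 0 on board 0 */
        int err = cbAIn(0, 0, BIP5VOLTS, &raw);
        if (err == NOERRORS) {
            cbToEngUnits(0, BIP5VOLTS, raw, &volts);
            printf("Channel 0: %f V\n", volts);
        }
        return err;
    }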

  • Does anyone know of good resources on using LabVIEW with Omega DAQ (3000 series)?

    I'm fairly familiar with the LabVIEW environment and have used NI DAQ devices with ease, as it's all set up through the assistant in LabVIEW. I am now using an Omega DAQ system and have some trouble understanding the VIs supplied with the system. Any resources on this would be greatly appreciated. Thanks

    Darshwings, 
    Unfortunately, we do not have support for 3rd-party drivers, and I was unable to find any resources on our end for learning how to program with Omega's functions. Typically, instrument drivers come with some examples when installed; I would check the location of these drivers and see if there are any examples. Otherwise, you may want to contact Omega for any documentation that is available. 
    Huntington W
    National Instruments
    Applications Engineer
    ***Don't forget to give Kudos and Accepted as Solution where it is deserved***

  • DAQ board no longer showing up in MAX or elsewhere.

    I can no longer see my DAQ board in MAX or LabVIEW 7.1. It's a PCI-6023E, which was working fine last week.
    I tried reinstalling MAX through Control Panel >> Add/Remove Programs, but it is not there. In Add/Remove Programs, I click on "National Instruments Software", but that causes my computer to hang.
    I tried downloading the latest NI-DAQ, but it thinks it has nothing to install. I explicitly used it to remove LabVIEW 7.1 support, then re-installed it, but my DAQ board is still invisible.
    How can I communicate with my DAQ board again?

    Hello Bmihura,
    There is a utility called MSIBlast that can be used to remove software from your computer when Add/Remove Programs does not work.
    For future reference, it sounds like some portion of your National Instruments software became corrupted during an installation. It is important to strictly follow the installation procedures outlined at http://www.ni.com/support/install/. Also, if your device does not show up in MAX, it is probably the result of an incorrect driver installation, not a problem with the MAX program.
    The first thing you will want to do is remove the hardware from your computer. Then, to get the MSIBlast utility, copy and paste the following into an Internet Explorer window:
    ftp://ftp.ni.com/outgoing/
    Find MSIBlast.exe and drag and drop it onto your desktop. To run the utility, simply double-click the executable. In the dialog box, make sure 'Show NI Installers Only' is selected, and you should see a list of the National Instruments software installed on your computer. Go through the list, select the components you wish to remove, and click the 'Uninstall' button.
    Once you have uninstalled all of the software that you could not uninstall through Add/Remove Programs, restart your computer and verify in Add/Remove Programs that everything was uninstalled. At this point, if you chose to uninstall LabVIEW, that should be the first thing to reinstall. Otherwise, go ahead and reinstall DAQmx 8.0. I would recommend re-downloading it from our website here, just in case the installer became corrupted during the download process. After the driver is installed, power down your computer and physically install your hardware in a spare PCI slot. When you restart your computer, Windows should recognize the new hardware, and you should allow the wizard to install the driver software automatically.
    I hope this helps. If you have any further questions, you can either reply to this post or contact an Applications Engineer directly by calling 1-866-ASK-MYNI.
    Regards,
    Travis Gorkin
    Applications Engineering
    National Instruments
    www.ni.com/support

  • Trigger restarts on DAQ board

    Hi.
    I'm doing data acquisition with a DAQ board, the USB-6251 model.  I need something like the sample VI "Acq&Graph Voltage-Int Clk-HW Trig Restarts.vi", except I need the trigger to restart much more efficiently.  When I run that sample VI (and my own adaptation, which is attached), I can only get trigger rates of about 30 Hz, and any trigger that comes in within 30 ms of the previous one is simply ignored.  I need it to handle trigger rates of up to 1 kHz. Does anyone have any ideas, or is there another way to go about it that will avoid this problem?
    Thanks,
    Jeremy
    Attachments:
    Restarting trigger.vi ‏25 KB

    I think that to get the response you want, you will have to use your trigger signal to generate the correct number of clock pulses from a counter.  See the example 'Gen Dig Pulse Train-Finite-Retriggerable.vi'.
    Use the example 'Cont Acq&Graph Voltage-Ext Clk.vi' to actually do the acquisition.
    For a test, set up as follows:
    Set the Rate/Frequency to the same value.
    Set the Number of Pulses/Samples to Read to the same value.
    Set the Counter(s)/Clock Source to the same source.
    For Gen Dig Pulse Train-Finite-Retriggerable.vi, set Counter(s) to ctr0.
    For Cont Acq&Graph Voltage-Ext Clk.vi, set the Clock Source to Ctr0Out.
    Run Cont Acq&Graph Voltage-Ext Clk.vi first, and then quickly start Gen Dig Pulse Train-Finite-Retriggerable.vi.
    You should see the acquired signal from whatever input you have connected.  I was able to do this without changing either example program.  Once you see it work, you can combine the two into one VI; a text-API sketch of the same configuration follows this post.
    Randall Pursley
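    For anyone working in the text APIs, the equivalent of those two examples in NI-DAQmx C is a retriggerable finite pulse train feeding the AI sample clock; a minimal sketch, assuming device "Dev1", a trigger on PFI0, and no error checking:

    #include "NIDAQmx.h"

    #define RATE            100000.0  /* sample clock rate */
    #define SAMPS_PER_TRIG  100       /* samples generated per trigger */

    TaskHandle ctrTask = 0, aiTask = 0;

    /* Counter: finite pulse train, re-armed automatically after each burst */
    DAQmxCreateTask("", &ctrTask);
    DAQmxCreateCOPulseChanFreq(ctrTask, "Dev1/ctr0", "", DAQmx_Val_Hz,
                               DAQmx_Val_Low, 0.0, RATE, 0.5);
    DAQmxCfgImplicitTiming(ctrTask, DAQmx_Val_FiniteSamps, SAMPS_PER_TRIG);
    DAQmxCfgDigEdgeStartTrig(ctrTask, "/Dev1/PFI0", DAQmx_Val_Rising);
    DAQmxSetStartTrigRetriggerable(ctrTask, 1);

    /* AI: continuous acquisition clocked by the counter's output */
    DAQmxCreateTask("", &aiTask);
    DAQmxCreateAIVoltageChan(aiTask, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                             -10.0, 10.0, DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(aiTask, "/Dev1/Ctr0InternalOutput", RATE,
                          DAQmx_Val_Rising, DAQmx_Val_ContSamps,
                          SAMPS_PER_TRIG);

    DAQmxStartTask(aiTask);   /* start the acquisition first... */
    DAQmxStartTask(ctrTask);  /* ...then arm the retriggerable clock */

    Because the counter re-arms in hardware between bursts, triggers arriving at kHz rates are not lost to software restart overhead.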

  • How can I measure 2 floating signals with a DAQ?

    Hello, I have a DAQCard AI-16XE-50 with a BNC-2110 adapter to connect signals.
    I need to read two floating analog signals (less than 1 V, DC).
    The DAQCard is configured in DIFF input mode.
    I tried reading the first signal and everything was OK; then I tried reading the second, and everything was OK as well. But when I tried to read them together, the values were changed from the original values, even when measuring them with an external voltmeter.
    How do I have to connect the signals to my BNC-2110? Do I need some bias resistors, or what?
    Thanks
    Lorenzo

    Hi lcaggio,
    Your BNC-2110 is designed to work in differential mode with your DAQ board, and it sounds like this is how you already have things configured. Here are my main suggestions for resolving this.
    1) Make sure that the channel switch is set to "FS".
    2) Make sure that your DAQ board has the BNC-2110 configured as an accessory. (accessed through board Properties in Measurement & Automation Explorer)
    3) Make sure that your DAQ board is configured for the correct voltage range and that your inputs are configured for Differential.
    Now, by default, the NI-DAQ driver will sample the channels as quickly as it can. Since your channels are floating, it is possible that there is cross-talk between your two channels, and that is why the reading looks good when you sample each channel individually but looks bad when you sample both channels together. This is corrected by increasing the period of time between sampling each channel, which is called the "Interchannel Delay" (see the sketch after this post).
    4) The following Knowledge Base describes and links to documents that mention the importance of the interchannel delay.
    What Are the Minimum and Maximum Values for the Interchannel Delay Setting on my DAQ Board?
    http://digital.ni.com/public.nsf/websearch/9AE87416C8792FC286256D190058C7D3?OpenDocument
    5) Connect an analog channel, both CH+ and CH-, to AIGND. This grounds the channel. Then sample your first analog channel, followed by the grounded channel, and then your second analog channel. The grounded channel helps with channel cross-talk (channel ghosting).
    Anyway, I hope these suggestions help your project. Have a good day.
    Ron
    Applications Engineering
    National Instruments
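    The thread above is Traditional NI-DAQ; for the record, in the DAQmx C API the interchannel delay is set through the AI convert clock, roughly as in this sketch (the device name, channels, and rates are assumptions):

    #include "NIDAQmx.h"

    TaskHandle task = 0;

    DAQmxCreateTask("", &task);
    DAQmxCreateAIVoltageChan(task, "Dev1/ai0:1", "", DAQmx_Val_Diff,
                             -1.0, 1.0, DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(task, NULL, 100.0, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, 1000);
    /* Slow the convert (interchannel) clock to 1 kHz so the instrumentation
       amplifier has 1 ms to settle between the two high-impedance channels */
    DAQmxSetAIConvRate(task, 1000.0);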

  • How do I read out the actual voltage range of the DAQ board as calculated by LabVIEW based on my input voltage range

    I am somewhat new to DAQ, so I hope I describe this well enough. I have encountered a requirement for my VI that I am not sure how to satisfy from LabVIEW. It is my understanding that when using a DAQ board (I'm using a PCI-6034E) you can specify the voltage range you expect from your measurements, and the software then calculates the actual voltage range it will be measuring, to ensure that you get the best resolution in your range given the voltage step size, etc., of the equipment. I know how to extract the voltage input that I enter into the virtual channel, but I was wondering how I can extract the actual voltage range that the hardware/software is using, based on its own calculations from my input. Can anyone help me with this?

    If I understand correctly, you want to retrieve the actual gain value (and, therefore, usable input range) used by your DAQ card for a particular measurement.
    I think there's a good KnowledgeBase entry with your solution entitled "Programmatically Determining the Gain Used by a DAQ Device". Here's the URL:
    http://digital.ni.com/public.nsf/3efedde4322fef19862567740067f3cc/c93d3954ec02393c86256afe00057bb0?OpenDocument
    Regards,
    John Lum
    National Instruments
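    That KB covers the Traditional NI-DAQ approach; under DAQmx the coerced hardware range can be queried directly, along these lines (device and channel names are assumptions, error checking omitted):

    #include "NIDAQmx.h"
    #include <stdio.h>

    TaskHandle task = 0;
    float64 rngLow, rngHigh;

    DAQmxCreateTask("", &task);
    /* request +/-0.2 V; the board coerces this to its nearest real range */
    DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                             -0.2, 0.2, DAQmx_Val_Volts, NULL);
    DAQmxGetAIRngLow(task, "Dev1/ai0", &rngLow);
    DAQmxGetAIRngHigh(task, "Dev1/ai0", &rngHigh);
    printf("Actual hardware range: %g V to %g V\n", rngLow, rngHigh);
    DAQmxClearTask(task);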

  • DAQ Assistant in subvi not updating output to DAQ board with each call...

    Hi All,
    I am calling a simple subVI that creates a user-defined number of pulses with "Square Waveform.vi".  This square wave (with the given total number of pulses) is then used as an input to a DAQ Assistant controlling an analog output signal on an NI USB-6259 DAQ board.  I am using LabVIEW 8.5 right now.
    However, each time I call this subVI from my main program, the output I measure from the DAQ board is identical to whatever I set in the first call (i.e., if I created two pulses in the first call, I get two pulses on every call, regardless of the input I feed to the subVI).  The multiple calls to this subVI are made in sequential frames of a stacked sequence.  I believe stacked sequences are frowned upon by good LabVIEW people, right?  But putting that aside for the moment...
    The "#-of-pulses" input I give to the subVI is updated in a front-panel numeric indicator and a graph of the waveform, just not in the real output I measure from the board.  Why is the hardware output being asserted (with the original input value) before this new number can reach the DAQ Assistant?
    The sloppy fix is just to put the square-wave creation code in my main program each time I need it.  This does work and fixes my problem.  However, I would like to use subVIs to keep things clean.
    I am not a good LabVIEW programmer, but I have used this software for a number of projects and am stumped by this.  Any ideas?
    Thanks,
    John

    Hi John,
    I am running your code over here and seeing the same results.  I believe the problem is that the DAQ Assistant is being called inside a loop (really a sequence structure, but nonetheless more than once).  Sometimes it is difficult to troubleshoot the DAQ Assistant in cases like this: it is trying to be "smart" and seems to be avoiding re-configuring its parameters inside the loop.  This is intended to improve loop speed when customers are performing continuous operations.  In this case, it is performing a finite generation, and the number of samples to generate appears to carry over from one loop iteration to the next.
    It sounds like you have discovered one workaround for this already: putting a DAQ Assistant in each frame of the main VI.  Two other options come to mind:
    1. Use the lower-level DAQmx functions inside the subVI.  Here you will have explicit control over when the task is created and cleared, and when parameters are set (see the sketch after this post).  You can find examples of how to use the DAQmx API in the Example Finder at Help >> Find Examples... >> Hardware Input and Output >> DAQmx.
    2. Write a consistent number of samples to the DAQ Assistant by "zero-padding" your signal.  For example, instead of writing [10, 1010], try writing [1000, 1010].  In this case, it wouldn't need to reconfigure the number of samples to generate.
    One lesson to take away here is that the DAQ Assistant is good for basic functionality, but for more advanced control over the execution and configuration of your task you should learn to use the lower-level DAQmx functions.  In this case the problem actually sounds like a bug: I'll file a bug report, since the DAQ Assistant is not checking for waveform timing changes even though your timing is set to Use Waveform Timing.
    Thank you for pointing out this odd behavior.  Out of curiosity, which version of DAQmx are you using?
    -John
    John Passiak
    John Passiak
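    To make option 1 concrete, here is roughly what "explicit create/write/clear per call" looks like in the DAQmx C API (the same task lifecycle applies to the LabVIEW DAQmx VIs); the device "Dev1", the 5 V level, and the helper name are assumptions:

    #include "NIDAQmx.h"
    #include <stdlib.h>

    /* Generate nPulses square-wave periods on ao0. Creating and clearing the
       task inside the call guarantees the sample count is reconfigured every
       time, which is exactly the step the DAQ Assistant was skipping. */
    int generate_pulses(int nPulses, float64 rate)
    {
        TaskHandle task = 0;
        int32 written;
        int i, nSamps = nPulses * 2;   /* two samples per period: high, low */
        float64 *data = (float64 *)malloc(nSamps * sizeof(float64));

        for (i = 0; i < nSamps; i++)
            data[i] = (i % 2 == 0) ? 5.0 : 0.0;

        DAQmxCreateTask("", &task);
        DAQmxCreateAOVoltageChan(task, "Dev1/ao0", "", -10.0, 10.0,
                                 DAQmx_Val_Volts, NULL);
        DAQmxCfgSampClkTiming(task, NULL, rate, DAQmx_Val_Rising,
                              DAQmx_Val_FiniteSamps, nSamps);
        DAQmxWriteAnalogF64(task, nSamps, 0, 10.0, DAQmx_Val_GroupByChannel,
                            data, &written, NULL);
        DAQmxStartTask(task);
        DAQmxWaitUntilTaskDone(task, 10.0);
        DAQmxClearTask(task);
        free(data);
        return 0;
    }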

  • Interfacing a LabVIEW-compatible DAQ board with third-party DAQ boards

    Dear NI team and board members,
    In my current project I'm using a third-party Direct Digital Controller (DDC) which supports customizable Programmable Logic Control (PLC) via a graphical programming package. I acquire temperatures (etc.) from my miniature-HVAC application via physically mounted sensors, connect them to a dedicated analog input (typically 0-10 VDC), assign that analog input to selected PLC modules with the help of the graphical programming software, and finally, after doing the computation (calculator block) or control (Proportional-Integral-Derivative block), assign the resulting control signal to a dedicated analog output (typically 0-10 VDC as well).
    So much for the history. NOW I actually want to be able to manipulate these analog values: I'm not interested in the physically acquired value from my real operating equipment (e.g., temperature sensors); rather, I'm interested in assigning a certain level (e.g., +5 VDC) to that dedicated analog input, which in turn would represent a temperature, in this simple case (+5 VDC) 50 degF.
    Now to the questions: can I interface the physical ANALOG INPUT (0-10 VDC) of my third-party DDC unit with the ANALOG OUTPUTS (0-10 VDC) of an NI DAQ board? That way, with some additional help from LabVIEW, I'd be able to program time intervals where I raise the voltage level by 1 VDC every 5 minutes, and in turn proceed with my intended purpose.
    To sum that up: are there physical limitations (e.g., ground loops, equipment damage, shorting) to connecting an analog output of one board to the analog input of another board?
    Also, which NI DAQ board might be best suited for my intended purpose of "simulating" temperature values (etc.) on the analog inputs of another third-party board?
    Thank you,
    Sandro

    Hi Sandro,
    If both boards are in the same computer, I wouldn't be too worried about ground loops or damaging equipment. The ground reference of the AO board will be the same as that of the computer and the AI board. You would have to check the specs of your third-party AI board to see what overvoltage protection it has, and whether or not our AO board could output a voltage high enough to damage it.
    I would suggest one of our AO boards for the job. Depending on what kind of control and precision you want, we have static-update boards available in PCI form. A static board will be cheaper, but you are limited to software-timed voltage updates (instead of clocked, buffered waveform output); a sketch of such a software-timed ramp follows this post.
    http://sine.ni.com/apps/we/nioc.vp?cid=12550&lang=US
    I would suggest an AO board (PCI-67xx) instead of our M series multifunction boards, because the AO boards have 8 or more output channels; the multifunction boards only have two.
    -Sal
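    A software-timed ramp of the kind Sandro describes (raise the level by 1 VDC every 5 minutes) needs only single-point writes; a minimal NI-DAQmx C sketch, assuming a "Dev1" AO board on Windows:

    #include "NIDAQmx.h"
    #include <windows.h>   /* Sleep() */

    TaskHandle task = 0;
    int v;

    DAQmxCreateTask("", &task);
    DAQmxCreateAOVoltageChan(task, "Dev1/ao0", "", 0.0, 10.0,
                             DAQmx_Val_Volts, NULL);
    DAQmxStartTask(task);
    for (v = 0; v <= 10; v++) {
        /* each 1 V step represents e.g. 10 degF at the DDC's analog input */
        DAQmxWriteAnalogScalarF64(task, 1, 10.0, (float64)v, NULL);
        Sleep(5 * 60 * 1000);   /* hold the level for 5 minutes */
    }
    DAQmxClearTask(task);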

  • NI-DAQ 6.9.3f4 doesn't work with 6025E PXI board.

    The PXI-6025E board will do single-point acquisitions but won't do waveform acquisitions with NI-DAQ 6.9.3f4. The software would also crash the system when trying to run the test panel in Measurement & Automation Explorer. After reinstalling 6.9.1f28, it worked fine.

    Hi,
    I did all that. I completely uninstalled NI-DAQ (twice; I don't think it installed correctly the first time). I removed the 6025E card, reinstalled NI-DAQ 6.9.3f4, and rebooted. I shut down, installed the 6025E board, and booted up. Measurement & Automation Explorer sees the board, but now the test panel is disabled, and when I try to recompile the LabVIEW code that was working with the older NI-DAQ I get an error that says "A dynamic link library (DLL) initialization routine failed." The same thing happens in the DAQ Solution.
    During all this, the 4060 DMM board got lost, and Measurement & Automation Explorer doesn't recognize it anymore. I have reinstalled NI-DMM.

  • Using LabVIEW with a non-DAQ-board device

    How would one go about using a device, connected through USB, in LabVIEW when it doesn't have a DAQ board? I have a signal monitor that I've managed to get working with C++, and I was wondering if there is a way to integrate it with LabVIEW at all. I hope that makes sense.

    If they've provided a DLL, then you can use the Call Library Function Node to call it. You should read the chapter in the LabVIEW Help on calling code from external languages. If it's a C++ DLL you may have problems, because LabVIEW only supports C DLLs, not C++. A common issue is name mangling; if you search the NI site for "DLL name mangling" you will come across several Knowledge Base articles that discuss this issue. Another common problem is complex datatypes: if the DLL uses structures with strings inside, or pointers to complex structures, then you will need to write a wrapper DLL that converts simple datatypes that LabVIEW can handle into the complex datatypes that the DLL uses (a sketch of such a wrapper follows). Again, there is a wealth of information on the NI site about calling DLLs, so a search will yield lots of information.
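    As an illustration of such a wrapper (every name here is hypothetical: the vendor header, the SignalMonitor class, and its readLevel method are stand-ins for whatever your monitor's C++ API actually provides), a C++ source file exporting an unmangled C function that the Call Library Function Node can bind to:

    // wrapper.cpp -- compile into a DLL alongside the vendor's C++ library
    #include "monitor_api.hpp"   // hypothetical vendor C++ header

    extern "C" __declspec(dllexport)
    int MON_ReadLevel(double *level)
    {
        try {
            static SignalMonitor mon;   // hypothetical vendor class
            *level = mon.readLevel();   // hypothetical member function
            return 0;                   // success
        } catch (...) {
            return -1;  // never let C++ exceptions cross into LabVIEW
        }
    }

    The extern "C" suppresses name mangling, and the flat int/double signature is something the Call Library Function Node can handle directly.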

  • I want to run my program with continuous acquisition, write data to file when a button is pressed, and get the most recent samples from the DAQ board.

    This is an update to a question I asked earlier; I am still trying to solve the problem. I have included a zip file with a library file. The file to open is pj_pushbutton.vi. I have placed comments explaining the issues and problems in this VI and its subVIs.
    When I push my button to write the data to file, I think it is writing the data that has already been stored in the buffer, all at once. I want the data taken from the DAQ board starting from the time I press the Write File button. How do I do this?
    Attachments:
    pj_pushbutton.zip ‏692 KB

    Hi,
    1. In AI Read, you need to set the Read/Search option to "Relative to End of Data".
    2. Also, you should not start the AI Read VI until you press Write to File. I am attaching an example VI below; please see if it helps.
    Regards,
    Sastry V
    Applications Engineer
    National Instruments
    Attachments:
    MostRecentAcquiredData.vi ‏97 KB
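    The thread uses Traditional NI-DAQ's AI Read; for reference, the DAQmx equivalent of "Relative to End of Data" is the read Relative-To property, roughly as follows (the device name and rates are assumptions):

    #include "NIDAQmx.h"

    TaskHandle task = 0;
    float64 buffer[1000];
    int32 nRead;

    DAQmxCreateTask("", &task);
    DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                             -10.0, 10.0, DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(task, NULL, 1000.0, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, 1000);
    /* read relative to the newest sample, not the oldest unread one,
       so the pre-button backlog in the buffer is skipped */
    DAQmxSetReadRelativeTo(task, DAQmx_Val_MostRecentSamp);
    DAQmxSetReadOffset(task, 0);
    DAQmxStartTask(task);
    /* ...when the Write File button is pressed: */
    DAQmxReadAnalogF64(task, 1000, 10.0, DAQmx_Val_GroupByChannel,
                       buffer, 1000, &nRead, NULL);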

  • How to generate a 5 V negative-polarity pulse in synchronisation with a 10 V input using a PCI-6251 DAQ board

    Hi,
    I want to generate a 5 V negative-polarity pulse to trigger my IEEE 1394 camera using a PCI-6251 (SCB-68, pin E) device; furthermore, I am reading a standard Vsync signal from an SVGA port, which is 10 V.
    Now I want to synchronise the image capture of the IEEE 1394 camera with the Vsync signal, so that the camera is triggered to capture on every 4th Vsync (which is 50 Hz). At present I am able to read the Vsync signal and also generate the -5 V signal, but the DAQ board generates a continuous -5 V signal, while I want a 1 ms pulse (for every 4th Vsync).
    Also, this -5 V signal is generated even when the VI has stopped running.
    Can anybody guide me so this problem can be solved?
    Thanks a lot in advance
    Shri

    You can use the Write function; that is, you can use the 10 V signal as a trigger for the task, and when the trigger arrives, the task starts and generates (writes) a waveform you defined in an array ("data" in the example).
    Something like this (the channel names are placeholders, and error checking is omitted):
    #include "NIDAQmx.h"

    #define SAMPLING_RATE  1000.0  /* 1 kHz update rate: each sample lasts 1 ms */
    #define SAMPS_TO_WRITE 1000

    TaskHandle taskHandle = 0;
    float64 data[SAMPS_TO_WRITE];
    int32 samp_per_channel;
    int i;

    /* data contains the pulse: one -5 V sample (1 ms) followed by zeros */
    data[0] = -5.0;
    for (i = 1; i < SAMPS_TO_WRITE; i++)
        data[i] = 0.0;

    DAQmxCreateTask("", &taskHandle);
    /* "Dev1/ao0" is a placeholder for your AO channel */
    DAQmxCreateAOVoltageChan(taskHandle, "Dev1/ao0", "", -10.0, 10.0,
                             DAQmx_Val_Volts, "");
    DAQmxCfgSampClkTiming(taskHandle, "", SAMPLING_RATE, DAQmx_Val_Rising,
                          DAQmx_Val_FiniteSamps, SAMPS_TO_WRITE);
    /* start the generation when the 10 V Vsync edge crosses 9 V */
    DAQmxCfgAnlgEdgeStartTrig(taskHandle, "APFI0", DAQmx_Val_RisingSlope, 9.0);
    /* autoStart = 0 so the analog trigger, not the write, starts the task */
    DAQmxWriteAnalogF64(taskHandle, SAMPS_TO_WRITE, 0, 10.0,
                        DAQmx_Val_GroupByChannel, data,
                        &samp_per_channel, NULL);
    DAQmxStartTask(taskHandle);
    /* ...and then you create a loop.... */
    Tell me if you need more help.....
    bye
