Digital signal level on d4 screen

This one is probably for the Verizon employees who monitor these boards.....
What's an optimal signal power level for a digital channel on the diagnostic screen (d4)? How's 36 dB? Pretty good?
Thanks,
Jeff

What is shown on that screen is SNR (signal-to-noise ratio), and yes, 36 dB is good.
However, you and most people confuse this with signal level or power. Signal level is measured in dBm, volts, or watts. SNR is a measure of how clean the signal is, not how strong it is. You can have 36 dB SNR with a huge or a small signal; it is simply the ratio of the signal to the noise floor.
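
For reference, here is a minimal sketch of the arithmetic behind that ratio; it shows why the SNR figure says nothing about absolute level. The power values are arbitrary example numbers, not anything read from the diagnostic screen.

    /* SNR in dB is 10*log10(signal power / noise power). It is a ratio, so the
     * same 36 dB can come from a strong or a weak signal. */
    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        double signal_mw = 4000.0;   /* example signal power, milliwatts */
        double noise_mw  = 1.0;      /* example noise-floor power, milliwatts */
        double snr_db    = 10.0 * log10(signal_mw / noise_mw);

        printf("SNR = %.1f dB\n", snr_db);   /* ~36 dB for this 4000:1 ratio */
        return 0;
    }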

Similar Messages

  • Is there a better way to generate custom timed digital signals

    I am trying to generate digital outputs of highs and lows with particular delays on different lines. Each DAQ Assistant activates a single line on a port of the USB-6501. There are more complex high/low patterns that I need to generate, with variable time differences between the highs and lows. The code below accomplishes what I am trying to achieve, but executing a long pattern of highs and lows this way is very time consuming. I am sure there is a better way to do this; I am not an expert in LabVIEW, so I haven't uncovered its full potential. Can anybody suggest a more effective and quicker way to do this? I would highly appreciate it. Thanks!
    It is not shown in the code below, but through the DAQ Assistant I have initialized the lines to low-level logic.
    Solved!
    Go to Solution.

    See the attached file
    Attachments:
    Digital Signal.vi 335 KB
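
    Not from the thread: below is a hedged sketch of the usual alternative to stringing DAQ Assistants together, written with the NI-DAQmx C API since a block diagram can't be pasted here as text. The USB-6501 only supports software-timed digital I/O, so the pattern is written one update at a time with a delay between updates. The device name ("Dev1"), the lines, the pattern values, and the delays are all assumptions; in LabVIEW the equivalent is a single DAQmx task written to inside a loop instead of many DAQ Assistants.

        /* Software-timed digital pattern on a USB-6501 (assumes Windows for Sleep()). */
        #include <NIDAQmx.h>
        #include <windows.h>

        int main(void)
        {
            TaskHandle task = 0;
            int32      written = 0;
            int        i;
            /* Each row is one update of four lines (1 = high); delays_ms[i] is the wait after row i. */
            uInt8 states[3][4] = { {1,0,0,0}, {1,1,0,0}, {0,0,1,1} };
            int   delays_ms[3] = { 10, 25, 5 };
            uInt8 all_low[4]   = { 0, 0, 0, 0 };

            DAQmxCreateTask("", &task);
            DAQmxCreateDOChan(task, "Dev1/port0/line0:3", "", DAQmx_Val_ChanPerLine);

            for (i = 0; i < 3; i++) {
                DAQmxWriteDigitalLines(task, 1, 1, 10.0, DAQmx_Val_GroupByChannel,
                                       states[i], &written, NULL);
                Sleep(delays_ms[i]);          /* software-timed delay between updates */
            }
            DAQmxWriteDigitalLines(task, 1, 1, 10.0, DAQmx_Val_GroupByChannel,
                                   all_low, &written, NULL);   /* leave all lines low */
            DAQmxClearTask(task);
            return 0;
        }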

  • What are the iTunes digital audio levels in and out

    I am having a lot of trouble with finding out and setting the audio levels in and out of iTunes.
    Digital audio with all 16 bits at '1' corresponds to an analog signal level of +12 dB; 24-bit audio with all '1's corresponds to +18 dB. That makes the overhead capability 12 dB for 16-bit CD and 18 dB for 24-bit HD audio.
    I would like to record (using Sound Studio) a set of standard audio tones from 20 Hz to 20 kHz with digital levels of "all 1's" at 16-bit (+12 dB analog level), "all 1's" at 24-bit ditto, and the same for levels of analog 0 dB (-12 dB digital) for 16-bit and 0 dB (-18 dB digital) for 24-bit.
    What I am finding is that if I get Sound Studio to insert a tone at what it calls 0 dB, save it as AIFF, and then output it from iTunes, I get a much higher level, around +8 dB or so!
    Any ideas?

    As an addition to my first question, I asked the makers of Sound Studio about levels. They tell me that the Mac standard is to have all '1's in the digital stream defined as 0 dB.
    Which helps, as I can now record a tone at the CD level of -12 dB and the HD audio level of -18 dB. And if I make recordings I know to keep the input levels well into the green, not yellow, and definitely not up to the red.
    As you say, the output of iTunes is boosted, and from my measurements this boost is in the analog circuitry and is about +8 dB.
    I would not recommend turning down the iTunes volume control, as this brings in digital volume reduction, which depends on the algorithm they have implemented. This is not so bad for iTunes, as the Core Audio software works at 32-bit and can handle 16-bit volume changes fairly well. But on the iPad it is different, as iOS is only 16-bit audio, so the volume algorithm is poor. So always leave the volume at maximum and turn down your HiFi volume control!
    Or use a digital output to an external DAC, as the iTunes volume control is then disabled. Set Audio MIDI Setup for 24-bit output and you are ready to go with HD Audio.
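
    A minimal sketch of the dB arithmetic being discussed, assuming (as Sound Studio's developers describe) that full scale, all '1's, is defined as 0 dBFS. The tone levels below are just the -12 dB and -18 dB references mentioned above.

        /* Convert dBFS values to linear amplitude ratios relative to full scale. */
        #include <stdio.h>
        #include <math.h>

        static double dbfs_to_ratio(double dbfs)
        {
            return pow(10.0, dbfs / 20.0);
        }

        int main(void)
        {
            printf("CD reference (-12 dBFS): %.3f of full scale\n", dbfs_to_ratio(-12.0));
            printf("HD reference (-18 dBFS): %.3f of full scale\n", dbfs_to_ratio(-18.0));
            return 0;
        }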

  • M-Audio FastTrack USB - very low signal level (line/guitar)

    I've used FastTrack USB for a couple of years with a microphone through the XLR input with no problems. Now I am trying to connect a line output of a synthesizer to the line input of the FastTrack and the signal level is very low. There is no noise, it's just quiet, even with the track volume all the way up it's still no match for the rest of the tracks.
    Any suggestions, other than trying all the knobs and switches on the face and back of the unit (tried that already)? Does anybody have a similar setup?
    I use the FastTrack for audio input and the built-in audio for output; it's connected through a hub (no problems for two years). In GarageBand the microphone goes to Channel 1 (mono) and still works fine. The synthesizer goes to Channel 2 (mono) and is really quiet. I tried the synth with two different 1/4" cables with no difference. The headphone jack on the front of the FastTrack works fine with the normal level-adjustment range, and if I plug the headphones directly into the line out of my synth, I have sound, quiet (like you would get by connecting headphones to any line out, even a turntable) but steady, so the synth must be fine.
    Any thoughts?

    No takers. Hm. OK. After some more research on the net, it appears that all the FastTracks suffer from the same problem, which makes me believe that it's not a bug but a feature of an unexplainable kind. Oh well, back to my old iMic noisemaker (that one works, it's just screened very poorly, so it picks up all the possible interference and also makes its own nasty little hiss - this is why I switched to the FastTrack a while ago - so I will need to cut it out with an EQ, again). Hopefully the EQ surgery will not affect the synthesizer's sound consistency as severely as it used to with the microphone.

  • Way to "downgrade" from digital signal on graphics cards?

    Hello all,
    I recently purchased an 8-core Mac Pro with the intention of using it for a lot of video installations for theatrical productions. I bought it with 4 of the NVIDIA graphics cards, which have DVI and Mini DisplayPort outputs. Both of these send a digital signal, which makes it impossible for me to use my DVI-to-composite adapter to plug into older TVs. I also have TONS of BNC cable that I can use to run cable paths, and this BNC has to carry composite. This severely limits me, since my work involves grabbing whatever output monitor I can find. I've seen some posts on altering the signal with an EZ PC-TV converter box, but this is one more adapter/converter/whatever that can break during a performance. I'd like to just send it out from the source.
    So my question is: is it possible to find a graphics card that still sends an analog signal, thereby allowing me to use my composite adapter? I purchased the Mac Pro about a month ago (just before they updated), and I wasn't sure if I could find anything like this. I've used the EZ PC-TV box, but these don't appear to be too reliable, so if there isn't a way to install such a graphics card, is it possible to find a beefier converter from the digital signal to analog so I can run long lengths of BNC cable?

    The standard DVI connector also carries the five signals required for a VGA signal on the same connector.
    The DVI connector should have a solid bar toward one end of the connector. This is the ground for the four VGA pins (Red, Green, Blue, Horizontal Sync) that surround it. The fifth signal, Vertical Sync, is shared with the digital signal. A simple VGA-extractor has been available through the Apple store and other outlets.
    A DVI to composite Video and S-Video adapter, which may have a few active components inside, has also been available. The one on the Apple Store lists only the X1900 for the Mac Pro, but it is not clear to me whether they have simply been lazy about updating the description.
    The Mini DisplayPort adapters they list are
    1) a DVI digital-only adapter (you can tell because the bar at the end lacks the four pins surrounding it), OR
    2) a VGA adapter (which you could convert to composite with a component available in Europe called a SCART adapter).
    Conversion of the digital signal is out of the question. You would need a screen buffer and as much logic as is already on the display card. Adapting the VGA signals is the way to go.

  • SCXI-1162HV Digital Signal Acquisition

    This question has to do with SCXI-1162HV, DAQmx, and Digital Signal Acquisition, but I do this programming in CVI 2010 using the SCXI-1600 USB Digitizer and DAQmx, so I'll start here.
    I have searched many discussion forums, the examples, and the tutorials, and cannot find much on setting up and using the SCXI-1162HV 32-channel digital input module.  I did find some example code in C:\Documents and Settings\All Users\Documents\National Instruments\CVI\samples\DAQmx\Digital.
    What I want to do: 
    I have several Analog Input cards (voltage, thermocouple, etc) in my SCXI chassis, that will be scanned at 1000 samples/second.  Using DAQmxRegisterEveryNSamplesEvent(...) a callback will be alerted every 100 milliseconds to read all the sampled data from the buffer into common memory.  I have a number of digital timing events that must be captured at the same rate.  It would be best if the 1162HV would supply 32-bit words into the buffer at the same scan rate.  Is that possible? 
    So far I have not found anything in DAQmx that would support such a thing.  I can create a task that scans multiple analog boards, but it does not appear that I can add the 1162HV board to that task.  No matter what I select in the DAQmx task generator, the generated code is always:
    int32 CreateDINTask(TaskHandle *taskOut1)
    {
        int32 DAQmxError = DAQmxSuccess;
        TaskHandle taskOut;

        DAQmxErrChk(DAQmxCreateTask("DINTask", &taskOut));
        DAQmxErrChk(DAQmxCreateDIChan(taskOut, "SC1_DIN/port0",
            "DigitalIn", DAQmx_Val_ChanForAllLines));
        DAQmxErrChk(DAQmxSetChanAttribute(taskOut, "DigitalIn", DAQmx_DI_InvertLines, 0));
        *taskOut1 = taskOut;

    Error:
        return DAQmxError;
    }
    Scanning does not seem to be a possibility.
    Now, going to the example code (...\Digital\Read Values\Read Dig Port-Ext Clk), particularly attempting to use an external clock to perform scanning with the AI clock selected, when executing these function calls:
      DAQmxErrChk (DAQmxCreateTask("",&taskHandle));
      DAQmxErrChk (DAQmxCreateDIChan(taskHandle,chan,"",DAQmx_Val_ChanForAllLines));
      DAQmxErrChk (DAQmxCfgSampClkTiming(taskHandle,clockSource,rate,DAQmx_Val_Rising,DAQmx_Val_FiniteSamps,sampsToRead));
    the DAQmxCfgSampClkTiming() call generates this error

    Hendra@ngms,
    Thank you for using the forums.  I am sorry that you are receiving these errors.  I was able to replicate your system, and I have found that if I set up a digital input task in Measurement & Automation Explorer and choose any timing besides On Demand, I receive an error.  See the screenshot.  Basically, what this error says is that the only timing the 1162HV can do is On Demand.  So it doesn't appear that you can sample at your desired rate.  I hope that helps.
    Regards,
    Brian P.
    Applications Engineer
    National Instruments
    Attachments:
    Capture.JPG 33 KB
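
    Not from the thread: given that answer, one workaround is to keep the 1162HV task in On Demand timing and read it once inside the analog task's Every N Samples callback, which gives a software-timed digital snapshot every 100 ms alongside the 1000 S/s analog data. The sketch below uses the DAQmx C API; the channel names ("SC1Mod1/ai0:7", "SC1Mod2/port0") and ranges are assumptions.

        #include <stdio.h>
        #include <NIDAQmx.h>

        static TaskHandle gDigTask = 0;     /* on-demand DI task for the 1162HV port */

        int32 CVICALLBACK EveryNCallback(TaskHandle aiTask, int32 eventType,
                                         uInt32 nSamples, void *callbackData)
        {
            float64 aiData[100 * 8];        /* room for 100 samples x up to 8 AI channels */
            uInt32  diWord = 0;
            int32   read = 0;

            /* Drain the buffered analog samples that triggered this event. */
            DAQmxReadAnalogF64(aiTask, nSamples, 10.0, DAQmx_Val_GroupByChannel,
                               aiData, 100 * 8, &read, NULL);

            /* Take one software-timed (On Demand) snapshot of the 32 digital inputs. */
            DAQmxReadDigitalU32(gDigTask, 1, 10.0, DAQmx_Val_GroupByChannel,
                                &diWord, 1, &read, NULL);

            /* ...copy aiData and diWord into common memory here... */
            return 0;
        }

        int main(void)
        {
            TaskHandle ai = 0;

            DAQmxCreateTask("", &ai);
            DAQmxCreateAIVoltageChan(ai, "SC1Mod1/ai0:7", "", DAQmx_Val_Cfg_Default,
                                     -10.0, 10.0, DAQmx_Val_Volts, NULL);
            DAQmxCfgSampClkTiming(ai, "", 1000.0, DAQmx_Val_Rising,
                                  DAQmx_Val_ContSamps, 1000);

            DAQmxCreateTask("", &gDigTask);
            DAQmxCreateDIChan(gDigTask, "SC1Mod2/port0", "", DAQmx_Val_ChanForAllLines);

            /* Callback fires every 100 analog samples (every 100 ms at 1 kS/s). */
            DAQmxRegisterEveryNSamplesEvent(ai, DAQmx_Val_Acquired_Into_Buffer, 100, 0,
                                            EveryNCallback, NULL);
            DAQmxStartTask(gDigTask);
            DAQmxStartTask(ai);

            getchar();                      /* run until Enter is pressed */

            DAQmxClearTask(ai);
            DAQmxClearTask(gDigTask);
            return 0;
        }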

  • Intermittent, white, digital fragments at bottom of screen

    I've noticed after about a month's worth of use that my Apple TV is producing some kind of interference or digital fragmentation, which appears at the bottom of the screen. Sometimes it flickers, sometimes it's solid, and sometimes it's dashed. It looks like a thin white line that only occurs on the bottom 1/8" of the screen.
    Is anyone else experiencing this, or does anyone have suggestions for a remedy?
    Thanks
    Alex

    capaho wrote:
    Perhaps looking into it will convince you. The signal sent to the TV contains more than just the video, it contains positioning and other data so that the TV knows where the nose goes. Otherwise, every video would look like the aurora borealis.
    For analogue signals, yes, but I'm not so sure about digital signals via HDMI - the source and display device should have negotiated the video format, and I would only expect the output device to send video frames compatible with the chosen resolution. If there are any additional timing or data signals, I would expect them to be in their own encapsulated data packets in the stream and to have nothing to do with the video frame data.
    The TV should be the thing deciding how to display the signal once the resolution has been negotiated, rather than the Apple TV controlling factors across HDMI.
    This data normally rides on a portion of the signal that is positioned outside of the viewing area, but if the vertical position on the TV is slightly out of adjustment, you will see what appears to be a band of broken video at either the top or the bottom of the screen.
    This should be irrelevant for digital, but will apply to analogue signals.
    Old analogue feeds that have been digitised will often show these artefacts on full-frame views, because the artefact is now encoded as part of the image, but I would not expect to see any artefact on material that had been recorded/edited/mastered purely in digital format.
    Of course, many things are still not recorded digitally initially, though I would be increasingly surprised if the other stages were not handled digitally.
    I may be quite wrong about HDMI, capaho, but I would expect a digital feed to keep the content and the supporting data/info completely separate in the data stream.
    AC

  • NVIDIA outputs digital signal to LCD projector

    We are having a different problem with connecting to LCD projectors at our school. The problem seems to result from a Crestron media manager between the MBP and the projector. The MBP connects via the DVI-to-VGA adaptor. If the MBP has the NVIDIA GeForce 8600M GT card, it sends a digital signal to the projector and nothing shows up on the screen. The Displays preference pane shows the 2nd monitor as "Digital 170T." There are 3 MBPs showing this behavior. One older MBP with an ATI Radeon X1600 card connects just fine. My G4 with an ATI Mobility Radeon 9700 also worked (for over 3 years). Both show a "VGA" connection in the Displays preference pane.
    The NVIDIA MBPs can connect directly to an LCD projector and output a VGA signal that works. It seems that the intermediate Crestron system is not sending the signal the NVIDIA card requires to know it is connected to a VGA display. And of course, Crestron is telling our support people it is not their problem, and NVIDIA's web support doesn't even touch Mac topics.
    So how do we convince the NVIDIA card to output a VGA signal from the DVI port?

    Nvidia Control Panel. Change the video output. Should work.

  • How to input/output a digital signal and acquire an analog signal at the same time?

    DASYLab version: 8.0.04
    Acquisition card: PCI1002L
    When I use DASYLab to acquire analog signals, there is no problem without digital inputs and outputs,
    and when I use DASYLab to input or output a digital signal, there is also no problem. But when I do both
    at the same time, DASYLab tells me the rate is too high and stops.
    So I searched the manual (user guide) for this, and it showed me:
    To internally equalize measurement time and system time in the analog input, digital input and counter
    hardware modules, use the following settings:
       Synchronization: PC Clock
       Sampling rate: <= 5 Hz
       Block size: = 1
    The problem is that if I set the sampling rate to 5 Hz, the data acquisition speed is not enough for my
    application.
    So, how can I improve this? Can anyone give me an example program? Thanks!
    by the way, I come from China, my English isn't good, I'm sorry.
    Allen, China.

    Hi,
    Have things changed over the years?
    I need to synchronise a digital output module (NI 9474) and an analogue input module (NI 9203). I need to measure time intervals from an edge in signal A to an edge in signal B. I would like accuracies of the order of 1 ms. Currently, the signals are not synchronised, with errors of the order of 2 times the block duration (block size / sample rate), sometimes much higher. The best I have managed so far is a block size of around 20 with a sample rate of 1 kHz.
    If I use the master and slave settings in the RTSL settings, my program doesn't run properly.
    If I use digital signals for both input and output, I can synchronise them with RTSL settings and everything is good, but I can't always do that.
    Also, if I do anything in the GUI (such as scrolling something or going to another window), my output gets properly screwed up.
    1. What can be done to synchronise AI with DO?
    2. Is there something that can be done to avoid messing up the output when something happens in the user interface? (I know when I am messing up the outputs, as they make some valves switch, and that is loud.)
    Thanks in advance!
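
    Not from the thread: in DAQmx terms (outside DASYLab), the usual way to synchronise an AI task with a hardware-timed DO task is to drive the DO task from the AI sample clock, so both advance on the same clock edges and the skew stays within one tick (1 ms at 1 kS/s). A hedged C sketch follows; the chassis and module names ("cDAQ1Mod1" for the NI 9203, "cDAQ1Mod2" for the NI 9474) are assumptions, and DASYLab's own driver settings may or may not expose this.

        #include <NIDAQmx.h>

        int main(void)
        {
            TaskHandle ai = 0, dout = 0;
            uInt8      doBuf[1000] = { 0 };   /* one block of output samples; fill with the wanted pattern */
            int32      written = 0;

            DAQmxCreateTask("", &ai);
            DAQmxCreateTask("", &dout);

            /* NI 9203 is a current-input module; NI 9474 is a digital-output module. */
            DAQmxCreateAICurrentChan(ai, "cDAQ1Mod1/ai0", "", DAQmx_Val_Cfg_Default,
                                     0.0, 0.02, DAQmx_Val_Amps, DAQmx_Val_Internal, 249.0, NULL);
            DAQmxCreateDOChan(dout, "cDAQ1Mod2/port0/line0", "", DAQmx_Val_ChanForAllLines);

            /* AI provides the sample clock; DO borrows it. */
            DAQmxCfgSampClkTiming(ai, "", 1000.0, DAQmx_Val_Rising, DAQmx_Val_ContSamps, 1000);
            DAQmxCfgSampClkTiming(dout, "/cDAQ1/ai/SampleClock", 1000.0,
                                  DAQmx_Val_Rising, DAQmx_Val_ContSamps, 1000);

            /* Preload the DO buffer, arm DO first (it waits for the AI clock), then start AI. */
            DAQmxWriteDigitalLines(dout, 1000, 0, 10.0, DAQmx_Val_GroupByChannel,
                                   doBuf, &written, NULL);
            DAQmxStartTask(dout);
            DAQmxStartTask(ai);

            /* ...read AI samples and stream further DO samples here... */

            DAQmxStopTask(ai);
            DAQmxStopTask(dout);
            DAQmxClearTask(ai);
            DAQmxClearTask(dout);
            return 0;
        }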

  • How do I convert analogue 5.1 surround sound into a digital signal? (Pioneer SE-DIR800C headphones)

    I understand that my SB Audigy 2 card will only output 5.1 sound from games through its 3 analogue connectors. How do I convert this analogue signal into a digital one? What hardware do I need to purchase?
    I have read the thread on "Digital Connections, SPDIF and Dolby Digital Info", and also I've thoroughly read the "Creative Speaker Connectivity Guide".
    The reason I ask is that I have just bought a pair of Pioneer SE-DIR800C surround sound headphones. These are supposedly fantastic for recreating surround sound on headphones because of a clever little decoder box. They accept a DTS / Dolby Digital signal via digital coaxial and optical inputs, and there is a single analogue input which only accepts stereo sound.
    My Xbox and PS2 will output true digital 5.1 sound via an optical cable, and I am confident this will work perfectly with these headphones. I'm really looking forward to playing Halo in surround sound! But for gaming on my PC, I'll be limited to just an analogue stereo signal. Well, unless I can find some device that will convert my Sound Blaster's analogue 5.1 outputs into a digital signal, which I can then plug into the little decoder box for these headphones.
    So, back to my question:
    What hardware do I need to buy to convert an analogue 5.1 signal (via 6 x RCA or 3 x 3.5 mm stereo) into a true digital signal (via optical or coaxial S/PDIF)? Is it some type of headphone amplifier I need? If possible, please recommend makes & models of equipment.

    Thanks stefan
    OK, basically I have found 3 options for getting digital 5.1 out of a PC:
    a) Creative DTS-610 (approx $90). This converts the SB's analogue output into digital, but it's not available in the UK and it also introduces a noticeable 50 ms sound delay. Also, who wants an extra box hanging out of their PC?
    b) Buy a new sound card which has Dolby Digital Live output. For example: 1) BlueGears HDA X Mystique 7.1, or 2) Turtle Beach Montego DDL, or 3) Terratec Aureon 7.1 PCI (NOT the Space, FireWire or Universe cards). I couldn't find the first two available in the UK, so I have just ordered the Terratec card from Komplett.
    c) Buy a new motherboard that has built-in Dolby Digital Live output, for example the ABIT Fatal1ty AA8XE. Unfortunately my PC is only just over a year old and I'm not quite ready to replace it.
    I hope this info is useful for people. Maybe Creative will start making a card with DDL output too.
    Message Edited by NinjaHeretic on 2-22-2005 08:46 AM

  • How to check whether 6 digital signals change value at the same time with a PCI-6229?

    I am using DAQ card PCI-6229.
    Channel 1 generates a digital signal.
    Channels 2, 3, 4, 5, 6, and 7 acquire digital signals.
    I want to check:
    1. Whether the rising edges of Channels 2-7 occur at the same time;
    2. Whether the time delay from the rising edge of Channel 1 to the rising edge of each of Channels 2-7 is within a certain range.
    I know I can use a counter to measure the two-edge separation time, but I only have two counters, and it is too time-consuming to check the channels one by one.
    I don't know how to check whether the rising edges of 6 different channels occur at the same time.
    Does anyone have any suggestions?
    Thanks

    Hello,
    You can use the DAQ card's digital input change detection circuitry to detect changes in the input; you can then use a counter to measure the relative time between samples. Please read page 6-9, "DI Change Detection Applications", for more information. Let me know if this helps.
    Christian A
    National Instruments
    Applications Engineer
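
    A hedged sketch of the change-detection approach described above, using the DAQmx C API; the device name ("Dev1") and the line numbering are assumptions. Every rising edge on any monitored line produces one sample of the whole port, so lines whose edges belong to the same change event show up in the same sample, and a counter (or the change-detection event itself) can timestamp each event relative to Channel 1.

        #include <NIDAQmx.h>

        int main(void)
        {
            TaskHandle di = 0;
            uInt32     data[1000];
            int32      read = 0;

            DAQmxCreateTask("", &di);
            DAQmxCreateDIChan(di, "Dev1/port0/line2:7", "", DAQmx_Val_ChanForAllLines);

            /* Acquire one sample whenever a rising edge occurs on any monitored line. */
            DAQmxCfgChangeDetectionTiming(di, "Dev1/port0/line2:7", "",
                                          DAQmx_Val_ContSamps, 1000);
            DAQmxStartTask(di);

            /* Each returned sample is the port state at the instant of a change; comparing
             * successive samples shows which of lines 2-7 rose on the same change event. */
            DAQmxReadDigitalU32(di, -1, 10.0, DAQmx_Val_GroupByChannel,
                                data, 1000, &read, NULL);

            DAQmxStopTask(di);
            DAQmxClearTask(di);
            return 0;
        }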

  • iPad VGA signal levels

    3 out of 5 attempts to present Keynote (and other video-enabled apps) presentations have failed. After some investigation I discovered that the failed talks (the iPad could not drive the video projector - the iPad couldn't find the projector, and the projector couldn't find the iPad) were the result of long cable lengths between the projector and the VGA connector that I plugged into the Dock-to-VGA adapter. These were 50' cables in conference centers. I rustled up some more standard-length cables after these very embarrassing failed attempts to give public presentations, and with shorter cables everything worked properly. I should also mention that numerous MacBook, MacBook Pro and PC-type laptops all had no problem whatsoever driving VGA through a 50' cable.
    Anyone else experience this problem? It makes it impossible to trust the iPad as a conference presentation device if iPad video signal levels are too low. Perhaps I have a defective iPad, or is this a systematic problem?
    If I have to buy a VGA signal booster to carry along with an iPad, it's absurd, since these devices are nearly the size of the iPad itself!

    I am not sure what the official limit is, but a Google search brought up 50' as the "practical" limit. Another site said 9 meters. I have seen some long VGA cables in various conference rooms.
    You may have an iPad with a reduced output. You could either try another iPad, if you know of one available, or take your iPad to the Apple Store and see if an expert can test the output strength.

  • My DAQ 6008 will not drop the 5 V after the VI is stopped; I have a digital signal going from the error out on the DAQ in the while loop to the error in on the DAQ outside the while loop, and a Boolean going to the data input of the DAQ outside

    My DAQ 6008 will not drop the 5 V on a digital output after the VI is stopped. I have a digital signal going from the error out on the DAQ Assistant inside the while loop to the error in on the DAQ Assistant outside the while loop, and a Boolean going to the data input of the one outside, but I can't seem to get it to work.

    I attached the block diagram so you can have a look.
    Attachments:
    PID Temp control.docx 120 KB
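
    Not from the thread: the wiring described above (error out of the in-loop DAQ Assistant into the error in of one placed after the loop, with a False Boolean wired to its data input) is a reasonable way to do this, since the 6008 holds the last value written. A hedged sketch of the same idea in the DAQmx C API follows; the device and line names are assumptions.

        #include <NIDAQmx.h>

        int main(void)
        {
            TaskHandle dout = 0;
            uInt8      high = 1, low = 0;
            int32      written = 0;

            DAQmxCreateTask("", &dout);
            DAQmxCreateDOChan(dout, "Dev1/port0/line0", "", DAQmx_Val_ChanForAllLines);
            DAQmxStartTask(dout);

            /* ...control loop: write 'high' or 'low' as the temperature logic decides... */
            DAQmxWriteDigitalLines(dout, 1, 1, 10.0, DAQmx_Val_GroupByChannel,
                                   &high, &written, NULL);

            /* After the loop finishes, write one final low sample so the line does not
             * stay latched at 5 V, then release the task. */
            DAQmxWriteDigitalLines(dout, 1, 1, 10.0, DAQmx_Val_GroupByChannel,
                                   &low, &written, NULL);
            DAQmxStopTask(dout);
            DAQmxClearTask(dout);
            return 0;
        }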

  • How to convert digital signal to analog

    Hi..
    I am using an NI 9375 (DIO module) to read the output from a flow sensor.
    The output of the flow sensor is a digital signal (Boolean = True/False).
    How can I convert the digital signal to an analog value to get the reading of the flow sensor?
    I tried before to convert the digital signal to a frequency so that I could convert from frequency to the flow-sensor reading,
    but it doesn't work (I am referring to http://www.ni.com/white-paper/14549/en/).

    nhan91213 wrote:
    Hi..
    I am using an NI 9375 (DIO module) to read the output from a flow sensor.
    The output of the flow sensor is a digital signal (Boolean = True/False).
    How can I convert the digital signal to an analog value to get the reading of the flow sensor?
    I tried before to convert the digital signal to a frequency so that I could convert from frequency to the flow-sensor reading,
    but it doesn't work (I am referring to http://www.ni.com/white-paper/14549/en/).
    FYI - if your flowmeter's pulse frequency is higher than 500 Hz, you won't get a reliable reading using your algorithm, and it will be very unreliable above 1 kHz, because the fastest that loop runs is 1 ms (1 kHz).  In that case you could tie the timed loop to a higher-rate (hardware) clock source to go faster than a 1 ms loop time.  If there is no hardware clock, then I think you can use the high-resolution timer (in LabVIEW 2014; not sure if it was also available in previous versions) and a regular while loop, with algorithm modifications for faster timing.
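
    For illustration, here is a minimal sketch of the software approach the white paper describes: sample the Boolean flow-sensor line at a fixed rate, count rising edges in a window, and convert the resulting frequency to a flow reading. As noted above, this only works when the pulse rate is well below the sampling rate. The K-factor value is an assumption for the example.

        #include <stdio.h>

        /* samples: digital readings (0/1) taken at sample_rate_hz; n: number of samples */
        static double pulse_frequency_hz(const int *samples, int n, double sample_rate_hz)
        {
            int i, edges = 0;
            for (i = 1; i < n; i++)
                if (samples[i] && !samples[i - 1])       /* rising edge */
                    edges++;
            return edges * sample_rate_hz / (double)n;   /* edges per second */
        }

        int main(void)
        {
            int    samples[] = { 0,1,1,0,0,1,1,0,0,1,1,0 };   /* 12 samples taken at 1 kHz */
            double freq      = pulse_frequency_hz(samples, 12, 1000.0);
            double k_factor  = 450.0;                          /* pulses per litre (assumed) */

            printf("%.1f Hz -> %.4f L/s\n", freq, freq / k_factor);
            return 0;
        }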

  • How to convert an analog signal to a digital signal

    Hello,
    How do I convert an analog signal into a digital signal such that every sample of the analog signal corresponding to 1.2 V is represented as '1' in the digital signal, and the other samples of the analog signal (those that are not 1.2 V) are represented (converted) as '0' in the digital signal?
    And how do I display both waveforms or signals in graph indicators?
    Thanks.
    Solved!
    Go to Solution.

    If you have 1000 samples, and you want to convert to digital, you are going to get 1000 digital values.  Attached is what I mean.
    Attachments:
    Analog_to_Digital Hooovahh Edit.vi 52 KB
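
    A minimal sketch of the thresholding the question describes: each analog sample equal to 1.2 V becomes 1 and everything else becomes 0, using a small tolerance because floating-point samples rarely compare exactly. The tolerance and the sample values are assumptions for the example.

        #include <stdio.h>
        #include <math.h>

        int main(void)
        {
            double analog[] = { 0.0, 1.2, 0.6, 1.2000001, 3.3, 1.2 };
            int    digital[6];
            int    i;

            for (i = 0; i < 6; i++)
                digital[i] = (fabs(analog[i] - 1.2) < 1e-3) ? 1 : 0;   /* 1 only near 1.2 V */

            for (i = 0; i < 6; i++)
                printf("%.7f -> %d\n", analog[i], digital[i]);
            return 0;
        }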
