PCI6602 to detect TTL output from single photon counter

Dear all,
I am learning how to use LabVIEW.
Here is the system I am trying to set up:
1. Single photon counting module, 2.5 V TTL output, 20 MHz light source
2. BNC-2121 connector block
3. PCI-6602
4. LabVIEW 7.0, NI-DAQmx
The aim of this system is to count the number of photons detected from the light source.
I am having trouble getting started with this task. Can you help me create a sample I can build from?
Your help will be greatly appreciated.
Frankie

Hi Frankie,
I would suggest using the NI-DAQmx counter shipping examples to get started. The Count Digital Events VI would be a good starting point.
You can open it through:
Help >> Find Examples >> Hardware Input and Output >> DAQmx >> Counter Measurements >> Count Digital Events >> Count Digital Events.
Hope this helps!
Micaela N
National Instruments
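
If it helps to see the same measurement outside LabVIEW, here is a minimal sketch using the NI-DAQmx C API to count rising TTL edges on one of the 6602's counters. The device name "Dev1" and counter "ctr0" are assumptions; substitute whatever Measurement & Automation Explorer shows for your board.

#include <stdio.h>
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle task = 0;
    uInt32 count = 0;

    /* Create an edge-counting task on counter 0 of the 6602.
       "Dev1/ctr0" is an assumed name; use your device's actual name. */
    DAQmxCreateTask("", &task);
    DAQmxCreateCICountEdgesChan(task, "Dev1/ctr0", "",
                                DAQmx_Val_Rising, 0, DAQmx_Val_CountUp);
    DAQmxStartTask(task);

    /* Poll the running total; each read returns the number of rising
       edges (detected photons) seen since the task started. */
    for (int i = 0; i < 10; i++) {
        DAQmxReadCounterScalarU32(task, 10.0, &count, NULL);
        printf("photons counted so far: %lu\n", (unsigned long)count);
    }

    DAQmxStopTask(task);
    DAQmxClearTask(task);
    return 0;
}

The Count Digital Events example wires up exactly these steps as DAQmx VIs: create channel, start task, read counter, clear task.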

Similar Messages

  • 2-D raster scan photon counting optimization

    I have written a VI for a 2-D raster scan that counts TTL pulses from a photon counter at every pixel.
    It seems to be working fine and is serving its purpose for the time being. However, there are a couple of things I want to improve.
    Currently, I have two loops, one within the other, for the two scan axes (x and y). Within the inner loop, I am counting the pulses. The problem I am facing is that I have not figured out how to restart the counter from 0 at every iteration. Instead, what I am doing is running the counter twice in a for loop and outputting the difference between the two readings using shift registers.
    What I would like to do is avoid having to count twice for each iteration. I tried the Start Task and Stop Task VIs, but they seem to do nothing.
    The other thing that would be nice is to avoid the loops. I have heard that loops are "expensive" and add some overhead to the program. It is not really bothering me at this point, but I think it might if I move to faster scanning rates. Currently I am scanning at >100 ms per pixel.
    (Also, just FYI, my raster waveform is slightly different in that it is a "triangle" rather than a sawtooth; i.e., it goes to the end of the fast axis, then starts the next line from that end rather than from the beginning, to avoid sudden jerks to my hardware.)
    I have attached my VI. Thanks,
    Aditi
    Attachments:
    Galvo_Scan_Image 5.vi 92 KB

    I cannot help with your counter issues.
    The loops do have some overhead. It is on the order of a few nanoseconds per iteration. While loops are probably slightly slower than for loops because of the extra condition testing. Compared to your DAQ timing, the loop overhead is completely negligible. Look at the test program below.
    Since you know (or can calculate) the number of steps, for loops are probably the better choice. I think the conditional for loop became available in LV 2011.
    You should probably move the AO Create Channel VIs and associated Clear Task VIs outside the loops. Connect the task ID wires via shift registers so that the value is passed through in the event someone enters start and end values which result in zero iterations of the loops.
    Generally, to speed things up you want to move any calculations and any displays (indicators) outside the loops when it can be done without adverse impact on functionality. For example, the divide-by-two can be moved to the outer loop; its inputs do not change within the inner loop. If you do not need to see every update immediately, reducing the number of writes to the intensity graph may speed things up a bit.
    Avoid right-to-left wires and wires behind other objects. These have no effect on program performance, but they make it much harder to understand what is going on and to find problems. I have attached a cleaned-up version of your program.
    Lynn
    Attachments:
    Loop times.vi 12 KB
    Galvo_Scan_Image 5.2.vi 74 KB
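
    On the counter question: with DAQmx, restarting an edge-counting task resets the count to its initial value, so one pattern is to create and commit the task once outside the loops and then start/stop it around each pixel's dwell. Here is a rough sketch of that per-pixel logic against the DAQmx C API (the counter name, image size, and dwell handling are assumptions; Start Task and Stop Task should behave the same way in LabVIEW once the task is created outside the loops):

    #include <NIDAQmx.h>

    #define NX 100                      /* pixels per line (assumed) */
    #define NY 100                      /* lines per frame (assumed) */

    static uInt32 image[NY][NX];

    void scan_frame(void)
    {
        TaskHandle ctr = 0;
        uInt32 count = 0;

        DAQmxCreateTask("", &ctr);
        DAQmxCreateCICountEdgesChan(ctr, "Dev1/ctr0", "",
                                    DAQmx_Val_Rising, 0, DAQmx_Val_CountUp);
        /* Committing once up front keeps the repeated start/stop cheap. */
        DAQmxTaskControl(ctr, DAQmx_Val_Task_Commit);

        for (int y = 0; y < NY; y++) {
            for (int x = 0; x < NX; x++) {
                DAQmxStartTask(ctr);        /* count resets to 0 here   */
                /* ... move the galvos and wait out the pixel dwell ... */
                DAQmxReadCounterScalarU32(ctr, 10.0, &count, NULL);
                DAQmxStopTask(ctr);
                image[y][x] = count;        /* a single read per pixel  */
            }
        }
        DAQmxClearTask(ctr);
    }

    If Start Task and Stop Task appeared to do nothing in the VI, one likely cause is that the task was being created and cleared inside the loop, so each iteration was counting on a fresh task anyway.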

  • TTL signals from motor outputs

    I'm currently using a stepper motor with a MID-7604 drive and a PCI-7344 controller. I would like to output TTL signals from the drive at certain motor positions, but do not have an encoder (which is required for breakpoint signals). Is it possible to 'construct' a TTL-type signal using the low-level motion functions in LabVIEW, and then output it through a motor axis that is not currently being used?

    Hello,
    Depending on the type of output that you want to generate (single pulse or pulse train) you could use the Digital IO port of the motion controller with the Set IO Port MOMO.flx function to toggle DIO lines or you could use the Configure PWM Output.flx function to generate a pulse train.  The downside is that this will be software timed based on the current position as determined by the controller.
    There is not any way to manually modify the motion control signals that are generated by the controller.  That is all handled by the FPGA of the controller.
    Regards,
    Scott R.
    Applications Engineer
    National Instruments

  • How can I save a page and all its component parts in a single file, like IE does as an MHT? It's much easier for mailing to people where a page address is not available (as in output from an airline booking site, for example)

    How can I save a page and all its component parts in a single file, like IE does as an MHT?
    It's much easier for mailing to people where a page address is not available (as in output from an airline booking site, for example).
    It is simply too painful to have to zip everything up into a single file to send. The MHT format has been available for years now in IE, and with every new FF release it's the first thing I look for. I have been using FF for years, and hate having to come out of it, over into IE (which I even took out of startup), and key everything in again, just to send somebody something in a convenient format that they can open with a single click.
    I can't believe this hasn't been asked before, so have you looked at it and rejected it? Have MS kept the file format secret?
    Thanks
    MG

    This is not really an answer, just my comments on your question.
    I am sure I recall efforts being made to get MHTML to work with FF.
    Probably the important thing to remember about .mhtml is that if other browsers do support it, they may need add-ons, and may not necessarily render the content correctly or consistently.
    There are FF add-ons designed for archiving webpages; you could try them, but that then assumes the recipient has the same software.
    You could simply save the page from FF to your XP PC, then open it offline with IE and save it from there, before emailing with FF and attaching the .mht or .mhtml file that you have now created on your PC.
    As an alternative method, in some cases it could be worth taking a screen grab of the required page, then sending that to the recipient as a single email attachment, using either a bitmap or JPEG file format for instance.
    Something such as an airline booking may be designed with a print option; it could be worthwhile looking at sending the print file itself as an email attachment.

  • Trouble specifying TTL for digital outputs from PCI-6221

    Hi!
    I'm trying to generate TTL output signals using a PCI-6221 DAQ board. Within LabVIEW, I'm using DAQmx to create two digital waveforms and then write them. When I plug in an oscilloscope, the shapes/frequencies of the signals are correct, but the lower value is 0 V and the higher value is 1 V, rather than the TTL levels. Is there a way to specify these voltage levels as TTL within the software? I see from NI help that "do.logicfamily" exists to specify TTL, but when I add a property node and connect it to the channels, I do not see Logic Family as an option under "digital output." Let me know if you have any ideas!
    Thank you,
    Emily

    Set your oscilloscope to use 1 MOhm input impedance. Your problem is that the maximum current for those DIO lines is 24 mA, and 1 V / 50 Ohms = 20 mA, so you are current-limiting the digital outputs into the scope's 50 Ohm termination. If you need more current, then you need to add some digital buffer chips that can output more current.
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines

  • Firstly hello to all. I'm looking to create a VI that will take a single logged output from a thermocouple and monitor the temperature and produce a Boolean when the temperature has stabilised for a predetermined time, say 1 minute.

    Firstly hello to all. I'm looking to create a VI that will take a single logged output from a thermocouple, monitor the temperature, and produce a Boolean when the temperature has stabilised for a predetermined time, say 1 minute. I have managed to find a couple of examples on the forum, but one will only run on V8.2 and I have V8; the other is for more than one channel, which is fine, as I can always reduce this, but it was the timing feature I was having difficulty with. I am looking to monitor the temperature of a motor until it has stabilised prior to testing, and then to use this temperature as a reference. Please forgive my ignorance if this is a very simple thing, but I am learning and really enjoying it. Thank you in advance for your answers.

    Hi Graham, thank you for your reply.
    What I am trying to achieve is a VI I can use in a motor testing setup. Part of this would be to warm the motor up until the exhaust air temperature has stabilised, which takes approximately 10 minutes. I was thinking of just letting the motor run for this time and leaving it at that, but some motors warm up quicker than others. I am basically looking for a VI with an adjustable temperature window of say ±5 °C in 1 °C increments, and, timing-wise, 1 minute to 10 minutes; the adjustability is so I can use this for another application. I tried to adjust the code I found at the link below but had a little difficulty with the timing. Thank you so much for your help, it's much appreciated.
    sine.ni.com/niforum/niforum?forumDU=http://forums.ni.com/ni/board/message?board.id=170&message.id=251017&requireLogin=False
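
    For what it's worth, the timing part reduces to a small piece of logic: keep the readings from the last N sample periods and assert the Boolean when max minus min over that window is within the tolerance. Here is a sketch of the check in C (the window length and tolerance are placeholders for your front-panel controls; in LabVIEW this maps to a shift register holding the last N readings, Array Max & Min, and a comparison):

    #include <stdbool.h>

    #define WINDOW 60   /* samples in the window, e.g. 60 s at 1 sample/s */

    /* True once the last WINDOW readings span no more than tol degrees. */
    bool temperature_stable(const double *history, int count, double tol)
    {
        if (count < WINDOW)
            return false;                 /* not enough history yet */

        double lo = history[count - WINDOW];
        double hi = lo;
        for (int i = count - WINDOW + 1; i < count; i++) {
            if (history[i] < lo) lo = history[i];
            if (history[i] > hi) hi = history[i];
        }
        return (hi - lo) <= tol;          /* e.g. tol = 10.0 for a ±5° window */
    }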

  • Multiple Output formats from Single Source File? (Like Squeeze)

    I'd really love to be able to batch process encodes in the following manner:
    Drop my source video file into AME CS5, select an MP4 preset, and then have it encode multiple bitrate versions while adding filename suffixes [e.g. _High (700 kb/s), _Mid (550 kb/s), _Low (400 kb/s)].
    The simple answer is to drop, or duplicate, my input clip 3 times and just select 3 presets I could set up for the High, Mid, and Low parameters. But that's exactly what I'm trying to work around. I encode video ads for major web-video sites, and the volume is only manageable because I can batch process (drag and drop large quantities, select multiple presets at once, hit Start). Handling each ad we service one at a time would be far too time consuming, even with how simple AME makes it to duplicate a clip and choose a new preset. (In the end, our ads go up to our FTP via AME's FTP upload option, one of its smartest features!)
    We have quite a few encoding resources here at work, but AME has been giving us the most favorable results. Other options, such as Sorenson Squeeze, let you import your source clips, then apply 2 or more presets to them, before encoding the whole batch. Is there any similar functionality in Media Encoder? (I really don't want to move our workflow into Squeeze, with its inferior MP4 encoding.)
    Does anybody have any experience with this sort of high-volume multiple-outputs from individual source files? Any tips with scripts or Apple's "Automator" that could streamline this type of batch processing?

    Any update on this ability? We create many in-house videos that need to be encoded to 14 different bitrates for use with Flash Media Server as dynamic HTTP streams.
    Currently, when I am ready to export a finished sequence, I pick my first preset and queue it in AME. Then I duplicate that thirteen times, setting each of the new thirteen queued items to its appropriate bitrate. Then I have to change each of the output names to "filename_bitrate.flv".
    This process is much slower when queued in AME than if I exported each individually from PP. I just don't have the time to manually export each version.
    I have also started noticing that some of the last few projects won't render beyond the quality of the first queued item. Do I need to render the largest bitrate file first?
    Any indication from Adobe on the correct workflow for creating multiple bitrate files to be consumed by FMS as dynamic HTTP streams would be appreciated.
    The link above is dead. Does anyone have an updated link to the document above?

  • Is it Possible to have multiple sheet Excel output from a single BI report?

    Hi,
    I am using Oracle BI Publisher Desktop version 11.1.1.7
    JDE tool set 811.
    Is it possible to generate multiple Excel sheet tabs from a single report?

    Creating Excel Templates - 11g Release 1 (11.1.1)
    BI Publisher: Excel Template: 'Fixed' Multi-Sheet Template

  • How to send TTL output AND acquire AI voltage data using USB-6211

    Hello,
    I am relatively new to LabVIEW, so please bear with me. I have a research application involving 2 pressure transducers and a high-speed camera. I wish to acquire analog voltage data from the 2 pressure transducers. However, at the start of the acquisition, I will need to send a single TTL output to trigger the camera. This TTL pulse must be sent out at exactly the same time that the AI acquisition begins, in order to ensure that my 2 pressure measurements and camera images are 'synchronized' in time.
    Is this possible on the USB-6211 running with LabVIEW 8.20? I currently have a fairly simple LabVIEW VI that uses a software trigger to start an AI acquisition - I have attached it with hopes that it may help anyone willing to assist me. I would prefer to be able to simply add something to it so that it will output a TTL pulse at the start of the acquisition.
    Thank you in advance.
    Regards, Larry
    Message Edited by Larry_6211 on 12-19-2008 11:24 AM
    Attachments:
    USB6211_v1.vi ‏212 KB

    Hi All,
    I'd like to clear a few things up. First, you'll find that if you try to set the delay from AI start trigger and the delay from AI sample clock to 0, you'll get an error. Due to hardware synchronization and delays, the minimum you can set is two. Note that when I say two, I am referring to two ticks of the AI sample clock timebase, which for most acquisitions is the 20 MHz timebase. I modified a shipping example so you can play around with those delays if you want to - I find that exporting the signals and looking at them with a scope helps me visualize what is going on. The manual has some good timing diagrams as well, but it looks like you've already hit those. The defaults would give you a delay of 250 ns from the start trigger - is this too high for your situation? What is an acceptable delay? I tend to think that "exactly the same time" is a measure of how precise rather than an absolute (think of delays from cable length making a difference).
    With all that in mind, I see a few options:
    1. Start your camera off the AI start trigger (an internal signal) and just know it is 250 ns before your first convert.
    2. Export the convert clock to use as a trigger. This assumes your camera can ignore the subsequent convert clocks.
    3. A more complicated option: internally you have an AI start trigger, a sample clock, and a convert clock. From your start trigger to the first convert is 250 ns, but if you export your convert clock you're going to get the future convert clocks as well. One option is to generate a single triggered pulse using a counter (start with the Gen Dig Pulse-Dig Start.vi example) with the AI start trigger as the trigger for the counter, an initial delay of 250 ns, and a high time of whatever you want it to be. This should give you a single pulse at very close to the same time (on the order of path delays) as your first convert clock; a C sketch of this option follows below.
    Hope this helps, 
    Andrew S
    MIO DAQ Product Support Engineer
    Getting Started with NI-DAQmx
    Measurement Fundamentals
    Attachments:
    Acq&Graph Voltage-Int Clk.vi 37 KB
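
    To spell out option 3 in code form, here is a rough sketch against the NI-DAQmx C API (the device name, pulse width, and idle state are assumptions; the Gen Dig Pulse-Dig Start.vi example configures the same thing with DAQmx VIs):

    #include <NIDAQmx.h>

    void arm_camera_pulse(void)
    {
        TaskHandle co = 0;

        /* One pulse on ctr0: ~250 ns after its trigger, 10 us wide. */
        DAQmxCreateTask("", &co);
        DAQmxCreateCOPulseChanTime(co, "Dev1/ctr0", "",
                                   DAQmx_Val_Seconds, DAQmx_Val_Low,
                                   250e-9,    /* initial delay          */
                                   1e-6,      /* low time               */
                                   10e-6);    /* high time: pulse width */

        /* Trigger the counter off the AI task's internal start trigger so
           the pulse lands (nearly) on top of the first convert clock. */
        DAQmxCfgDigEdgeStartTrig(co, "/Dev1/ai/StartTrigger", DAQmx_Val_Rising);

        DAQmxStartTask(co);   /* arms the counter; the pulse is emitted
                                 only when the AI acquisition starts   */
    }

    Start this counter task before starting the AI task; the pulse then appears on the counter's default output terminal when the acquisition begins.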

  • No digital video output from XVR-300 in Ultra-45

    I am switching from solaris 10/x86 to sparc. I have a new Ultra-45 with XVR-300 graphics adapter. It's running Solaris 10u4. The system was ordered as a standard configuration from the catalog.
    The XVR-300 graphics adapter currently produces no digital output. My Sun A1240P0 24" monitor reports no signal on the DVI input. That is, there is no digital output at power on (boot ROM), during Solaris boot (terminal), nor after Solaris finishes booting (X).
    The graphics adaptor uses a connector with two video outputs, and comes with a splitter cable with two DVI output connectors. I get no video output on either. The DVI cable I am using is the one that works when connected to the graphics adaptor on the old system.
    I added a VGA adapter plug to connector 1 of the splitter cable, and was able to get an analog signal to drive the display via the VGA input connector.
    The manuals for the Ultra-45 and for the XVR-300 make no mention of analog vs digital output. They both just say to connect the adapter to the monitor.
    Since I can get analog output from the XVR-300, the card is basically working. The DVI cable and the monitor's DVI input work with another system. The only variables are the digital output from the XVR-300 and the splitter cable. (The splitter cable's analog signals work.)
    How can I activate digital output and make it the default?

    Try this page:
    http://docs.sun.com/source/819-6651-13/chap2.install.html#50589705_pgfId-1001195
    To Set the Sun XVR-300 Graphics Accelerator as the Default Monitor Console Display
    1. At the ok prompt, type:
    ok show-displays
    The following shows how to set the console device:
    a) /pci@1f,700000/SUNW,XVR-300@0
    b) /pci@1e,600000/pci@0/pci@8/SUNW,XVR-300@0
    q) NO SELECTION
    Enter Selection, q to quit:
    2. Select the graphics accelerator you want to be the default console display.
    In this example, you would select b for the Sun XVR-300 graphics accelerator.
    Enter Selection, q to quit: b
    /pci@1e,600000/pci@0/pci@8/SUNW,XVR-300@0 has been selected.
    Type ^Y (Control-Y) to insert it in the command line, e.g.:
    ok nvalias mydev ^Y
    (creating the devalias mydev for /pci@1e,600000/SUNW,XVR-300@5)
    3. Create an alias name for the Sun XVR-300 graphics accelerator device.
    This example shows mydev as the alias device name.
    ok nvalias mydev
    Press Control-Y, then Return.
    4. Set the device you selected to be the console device.
    ok setenv output-device mydev
    5. Store the alias name that you have created.
    ok setenv use-nvramrc? true
    6. Reset the output-device environment:
    ok reset-all
    7. Connect your monitor cable to the Sun XVR-300 graphics accelerator on your system back panel.
    Man Pages
    The Sun XVR-300 graphics accelerator man pages describe how you can query and set frame buffer attributes such as screen resolutions and visual configurations.
    Use the fbconfig(1M) man page for configuring all Sun graphics accelerators.
    SUNWnfb_config(1M) contains Sun XVR-300 device-specific configuration information. To get a list of all graphics devices on your system, type:
    host% fbconfig -list
    This example shows a list of graphics devices displayed:
    Device-Filename Specific Config Program
    /dev/fbs/nfb0 SUNWnfb_config
    To display the man pages:
    Use the fbconfig -help option to display the attributes and parameters information of the man page.
    host% fbconfig -dev nfb0 -help
    To access the fbconfig man page, type:
    host% man fbconfig
    To access the Sun XVR-300 graphics accelerator man page, type:
    host% man SUNWnfb_config
    haroldkarl

  • Edge Detection vision output, how to use it?

    Hello Developer Zone,
    I'm working on a school project that uses LabVIEW, LabVIEW Vision, and LabVIEW Mindstorms to make a robot drive along a line of black tape on the ground.
    The problem is, the line detection output of Vision is confusing me. My teachers can't help me, and I am not that experienced yet.
    I want to know whether (within the area of interest) there is either a straight line or a diagonal line (any / or \ kind of line).
    I don't know the possibilities, but either it detects what kind of line there is and just outputs instructions to turn left, turn right, or go straight ahead,
    or it could detect something like: the line is there, but 5 cm (5 pixels, or whatever it measures in) to the right, and instruct the robot to relocate to the position the line is pointing at, or is at.
    I hope this isn't too confusing, and I hope someone can help me turn the detected lines into something I can instruct the drive system with.
    With this I will post an image of my project in a test.
    Note: the drive system isn't designed yet; it will use 2 servo motors, controlled by Mindstorms. I intend to instruct how long 1 or 2 of the motors should turn and at what percentage of power.
    If I can get the vision line system to output something like 5 different conditions (straight, left, right, hard left, hard right) based on how the lines are recorded, this should be no problem at all.
    Thanks in advance,
    Vince Houbraken
    Student at ROC Eindhoven NL
    Message Edited by Smileynator on 11-06-2009 08:08 AM

    To get the output from within LabVIEW you have to use the IMAQ (Vision) functions within LabVIEW, not Vision Assistant.  Vision Assistant will create the VI from your script for you if you select Tools>Create LabVIEW VI. 
    I have also attached a very simple VI that will find edges for you.  The output called Straight edges is an array of edges that includes the coordinates and angle of the edges. 
    Be sure to select the ROI before you run the VI.
    Tim Elsey
    LabVIEW 2010, 2012
    Certified LabVIEW Architect
    Attachments:
    Untitled 1.vi 94 KB
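
    Once you have the straight-edge array, turning an edge's angle into the five drive conditions mentioned above is just thresholding. Here is a sketch of that mapping in C (the threshold values are invented; tune them on the robot):

    typedef enum { HARD_LEFT, LEFT, STRAIGHT, RIGHT, HARD_RIGHT } Steer;

    /* angle_deg: detected line angle relative to straight ahead,
       negative meaning the line leans left. Thresholds are illustrative. */
    Steer steer_from_edge(double angle_deg)
    {
        if (angle_deg < -30.0) return HARD_LEFT;
        if (angle_deg < -10.0) return LEFT;
        if (angle_deg >  30.0) return HARD_RIGHT;
        if (angle_deg >  10.0) return RIGHT;
        return STRAIGHT;
    }

    Each enum value then maps to a pair of Mindstorms motor power settings, for example running the outer motor at a higher percentage than the inner one for the turn cases.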

  • Unable to read output from server on a socket

    Hello everyone,
    I am trying to re-develop an application that has traditionally been developed in VB.
    The traditional program does the following: communicates with a host using telnet; logs in, sends some text, reads the output from the server, validates the output, and terminates the telnet connection.
    I am trying to build the same kind of application in Java using sockets, but I do not get any response from the server.
    Any Suggestions ?
    Thanks,
    Code below:
    String hostName = "ip address here";
    int portNum = 23; // Verified that the server's telnet runs on port 23
    try {
        socket = new Socket(hostName, portNum);
        out = new PrintWriter(socket.getOutputStream(), true);
        in = new BufferedReader(new InputStreamReader(socket.getInputStream()));
    } catch (UnknownHostException e) {
        System.err.println("Host not found: " + hostName);
        System.exit(1);
    } catch (IOException e) {
        System.err.println("Could not get I/O for host: " + hostName + " on port: " + portNum);
        System.exit(1);
    }
    BufferedReader stdin = new BufferedReader(new InputStreamReader(System.in));
    String userInput;
    try {
        while ((userInput = stdin.readLine()) != null)
            out.println(userInput);
        while (in.readLine() != null) {
            System.out.println("echo: " + in.readLine()); // I expect the server's output here, but nothing happens.
        }
    } catch (IOException e) {
        System.err.println("User input not read.");
    }

    while (in.readLine() != null) {
        System.out.println("echo: " + in.readLine()); // I expect the server's output here, but nothing happens.
    }
    I see two problems here:
    1) in.readLine() only returns null if the socket is closed. You will be stuck in that loop forever.
    2) You are losing every second line of your input: the readLine() call in the while condition consumes a line, so if your telnet server sends only one line at the beginning, that call will swallow it.
    If you want to do this in a single thread, you will have to use a method which won't block until input is available, such as the read(char[], int, int) method of BufferedReader.
    You could also use two threads, of which one prints the output and the other takes user input. This has the advantage that you can handle any connection errors immediately, instead of only after the user has entered a command.
    Does telnet send information about the end of an input sequence (like the end of the welcome message and the end of the answer to any issued command)? If not, I'd say you should use two threads, as you would otherwise have to poll all the time.
    Hope it helps.
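
    To make the two-thread idea concrete, here is the shape of it sketched in C with POSIX sockets and pthreads (purely illustrative - the same structure carries over to a Java Thread whose run() loops on the reader; the host and port are placeholders, and raw telnet option bytes will show up as garbage in a sketch this simple):

    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <pthread.h>
    #include <netinet/in.h>
    #include <arpa/inet.h>
    #include <sys/socket.h>

    /* Reader thread: print whatever the server sends, as it arrives. */
    static void *reader(void *arg)
    {
        int fd = *(int *)arg;
        char buf[512];
        ssize_t n;
        while ((n = read(fd, buf, sizeof buf - 1)) > 0) {
            buf[n] = '\0';
            fputs(buf, stdout);
            fflush(stdout);
        }
        return NULL;
    }

    int main(void)
    {
        struct sockaddr_in addr = { 0 };
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        char line[512];
        pthread_t tid;

        addr.sin_family = AF_INET;
        addr.sin_port = htons(23);                       /* telnet port      */
        inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr); /* placeholder host */
        if (connect(fd, (struct sockaddr *)&addr, sizeof addr) != 0) {
            perror("connect");
            return 1;
        }

        pthread_create(&tid, NULL, reader, &fd);  /* output handled here */

        while (fgets(line, sizeof line, stdin))   /* input handled here  */
            write(fd, line, strlen(line));

        close(fd);
        return 0;
    }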

  • Erratic Volume Output From Speakers (App notifications, keypad tones, Ringing Volume)

    Just purchased a Z1 over the weekend and was happily fumbling with the new phone. Got it updated to the latest 4.4.2 as well.
    Noticed a glaring problem with the volume output from the speakers.
    When using instant messaging apps like WhatsApp and LINE, notification tones sometimes come in at the level I set (max), sometimes come in really soft, to the extent that I can't even hear them, and sometimes not at all. All of this happens even though I've set the volume to MAX on each of the available options under Settings > Sound > Volume. Might I also add that the volume for notifications is really too soft even at MAX.
    I'm not sure whether this started with the software update or was already an issue before it. I've seen a couple of posts online where other users report the same erratic behavior of the internal speakers.
    I hope someone can shed some light on this so we don't judge too quickly on an otherwise excellent phone.

    I have a problem with my Xperia Z1: the speaker and earphones are not detected after the 4.4.2 KitKat update.
    Phone purchase date: 28.02.2014
    Before the update my Z1 was running 4.3 Jelly Bean and working properly. I had only one issue on 4.3: when accepting a call, the audio automatically went to the speaker.
    Problem 1 after the 4.4.2 KitKat update: sometimes the Movies app plays a video but the sound does not come out of the phone speaker, only the ear speaker; when I press the play/pause button in the Movies app, the sound then plays through the phone speaker.
    Problem 2: when I connect my earphones to my Z1, the phone does not detect them; if I play a song in the Walkman app, the sound comes out of the phone speaker and not the earphones. After I restart the phone, the earphones work properly.
    I have done the Z1 software repair twice, but it did not fix the problem. Please tell me what I can do to fix my Z1.
    Thanks & Regards,
    Shekhar Jaiswal

  • Change output from 4 lines to 1.

    Anyone know how I can change the output from this query so that it outputs a single row instead of 4? I know I need to collapse the four DECODE select columns into a single row, but I am having trouble with the part where I take the column, find the row, and put in that value. Here is the query, and below it the output.
    select distinct(bt.Account_Obj_Id0),
    decode( TO_CHAR( End_Dt, 'MM/DD/YYYY'), '09/01/2002', NVL( dt.Current_total ,0) ,0) as "over 120",
    decode( TO_CHAR( End_Dt, 'MM/DD/YYYY'), '10/01/2002', NVL( dt.Current_total ,0) ,0) as "90-120",
    decode( TO_CHAR( End_Dt, 'MM/DD/YYYY'), '11/01/2002', NVL( dt.Current_total ,0) ,0) as "60-90",
    decode( TO_CHAR( End_Dt, 'MM/DD/YYYY'), '12/01/2002', NVL( dt.Current_total ,0) ,0) as "current"
    from (select convert_t_time (end_t) End_DT,
    total_due,
    account_obj_id0,
    current_total
    -- convert_t_time (start_t) Start_DT
    from bill_t
    where account_obj_id0 = '9760259'
    and convert_t_time (end_t) > trunc (sysdate - 120, 'MM') ) dt,
    bill_t bt
    where bt.account_obj_id0=dt.account_obj_id0
    --GROUP BY bt.Account_Obj_Id0
    ACCOUNT_OBJ_ID0   over 120   90-120    60-90   current
    9760259               0         0        0      79.53
    9760259               0         0   206.29          0
    9760259               0     93.85        0          0
    9760259          127.17         0        0          0

    -- Aggregate each bucket with MAX() and GROUP BY the account to collapse the four rows into one:
    select bt.Account_Obj_Id0,
              MAX(decode( TO_CHAR( End_Dt, 'MM/DD/YYYY'), '09/01/2002', NVL( dt.Current_total ,0) ,0))  as "over 120",
              MAX( decode( TO_CHAR( End_Dt, 'MM/DD/YYYY'), '10/01/2002', NVL( dt.Current_total ,0) ,0)) as "90-120",
              MAX(decode( TO_CHAR( End_Dt, 'MM/DD/YYYY'), '11/01/2002', NVL( dt.Current_total ,0) ,0))  as "60-90",
              MAX(decode( TO_CHAR( End_Dt, 'MM/DD/YYYY'), '12/01/2002', NVL( dt.Current_total ,0) ,0))  as "current"
      from (select convert_t_time (end_t) End_DT,
                        total_due,
                        account_obj_id0,
                        current_total
                        -- convert_t_time (start_t) Start_DT
                 from bill_t
                  where account_obj_id0 = '9760259'
                      and convert_t_time (end_t) > trunc (sysdate - 120, 'MM') ) dt,
             bill_t bt
       where bt.account_obj_id0=dt.account_obj_id0
    GROUP BY bt.Account_Obj_Id0

  • My Mac is not detecting the external microphone from my headphones

    My Mac is not detecting the external microphone from my headphones.

    If your headphones don't work on your Mac then I suggest you try this:
    1. Are they plugged in properly? Don't try pushing the plug in hard; that is likely to break something.
    2. Open System Preferences => Sound and select the Output tab. Do you see your headphones as an output option? You should see "Headphones / Headphone Output". If you don't see it, then you have either a software or a hardware problem. Don't panic yet.
    3. Leave the headphones plugged in. Close all your apps and shut down the Mac. Press the power button to restart and hold down the following 4 keys: Option (Alt) + Command + P + R.
    The Mac screen should go blank and it should restart again. When you've logged in, open System Preferences => Sound again and check whether you can see the headphone source this time. If you can't, now is the time to panic: you've reset your PRAM and it hasn't made a difference, so you probably have a hardware problem.
