Edge detection not so great?

I have been using PS since v1.0. I have CS4, but have been trying out CS5. With CS4 I only occasionally used Refine Edge; I just found it to be too slow. I saw some videos on the new Refine Edge tools and it looked promising, but when I tried it on a high-res image I found it didn't really work all that well. Here is a sample. I masked the car with the pen tool, then applied a layer mask. Then I ran Refine Edge, started to adjust the Edge Detection settings, and found that it started to make my edge choppy. Now I know that you also need to use the Adjust Edge tools to get the full effect, but it seems like Edge Detection does more harm than good? Any thoughts?
Thanks,
Steve

That's why you have an edge detection brush and an edge detection eraser, and several other parameters and controls that interact with each other and give you a lot of control (or a lot of headaches if you don't know how to use them). To get the best out of your tools you need to practice with them, and, as I already said, the web is full of precious information if you search for it...
Try some video tutorials; you wouldn't imagine how many small great things are under the hood in software like PS..
Tommaso

Similar Messages

  • Edge detection or gauging?

    Hi,
     I am working on measuring the difference in diameter of small, millimeter-thick tubes when they are dry and when they are wet. I wish to use LabVIEW's image analysis and edge detection to measure the change in diameter from the original state to the moist state. Can anyone please help me out by naming useful tools I can use to do this? I know of a tool called gauging, but I do not know how it works. I was thinking of using pixels to measure the difference, as these tubes are 1-5 mm thick in their dry (original) state, and when they are wet their diameter only increases on the scale of 10-100 micrometers. I have enough optical resolution in the images, but I am lost on how to use LabVIEW. Any advice would be greatly appreciated.
    Thank You

    Hi Ognee,
    You could definitely use some functions from the Vision Development Module to find the edges of the tubes, and then use a Caliper function to determine the distance between those edges. I'd recommend taking a look at this tutorial about edge detection using the Vision Development Module. I hope this helps!
    David S.
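The Vision Development Module VIs do this work in LabVIEW, but the underlying gauging idea is simple: scan a line of pixels across the tube, find where the intensity crosses a threshold (the two tube walls), and take the pixel distance between the crossings. A minimal sketch of that idea, shown here in Java since this digest has no LabVIEW code; the gray values and threshold are invented for illustration:

```java
// Sketch of 1-D edge gauging: find the first and last threshold crossings
// along a scan line and report the pixel distance between them.
// The sample row and the threshold of 128 are made-up illustration values.
public class TubeGauge {
    public static int edgeDistance(int[] row, int threshold) {
        int first = -1, last = -1;
        for (int i = 1; i < row.length; i++) {
            // An edge is where the intensity crosses the threshold.
            boolean crossing = (row[i - 1] < threshold) != (row[i] < threshold);
            if (crossing) {
                if (first < 0) first = i;
                last = i;
            }
        }
        // -1 means fewer than two edges were found.
        return (first < 0 || last == first) ? -1 : last - first;
    }

    public static void main(String[] args) {
        // Bright background with a dark tube between pixels 3 and 8.
        int[] row = {200, 200, 200, 40, 35, 30, 32, 38, 45, 200, 200};
        System.out.println(edgeDistance(row, 128)); // tube width in pixels
    }
}
```

With a known optical scale (micrometers per pixel), multiplying this pixel distance by the scale factor gives the diameter; subpixel edge methods, like the ones in the Vision tutorial, fit the intensity ramp at each edge to resolve changes finer than one pixel.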

  • Refine edge and edge detection serious issues, getting blurred and lots of contamination

    Hi guys :)
    A few months back I bought a grey background after it was recommended as easier for compositing. I was really pleased, but now I have run into some big issues and was wondering if you could spare a few moments to help me out. I've taken an image of my model on the grey background; the model has white-blonde hair, similar to this image here: http://ophelias-overdose.deviantart.com/gallery/?offset=192#/d393ia7.
    What I have been doing is using the Quick Select tool, then Refine Edge. When I get to Refine Edge, that's when the issues start for me. When I try to pick up the stray hairs around the model by painting over the areas with the edge detection tool, the hair that does come back into the picture is quite washed out and faded, almost as if it has been repainted by Photoshop CS5 instead of being brought back into the picture. Also, even if I check the Decontaminate Colors box, it doesn't make a blind bit of difference!
    I'm on a bit of a deadline and am alarmed that I'm getting these issues with the grey background. How are these problems occurring? Can you please give me some idea? I would be really grateful. I have been watching YouTube videos, going over tutorials, and reading books to NO avail :(
    I just keep getting this issue, even when I tried editing a test picture from a Google search of a model with brown hair. Still getting this issue.
    This tool is supposed to be amazing. I'm not expecting amazing, but I'm at least expecting it to not look like a bad, blurred paint job with contamination.
    Really grateful, thanks :)
    M

    Hi Noel,
    Thank you for the reply. I have attached some screen shots for you.
    I'm working with high-resolution photos in NEF. I am trying to put the model onto a darker background, but I haven't even got that far, as I can't even cut her out.
    Decontaminate doesn't seem to be working at all, to be honest; it makes little difference when I check that box.
    I'm getting nothing close to the image of the lion, that's brilliant!
    This is the original image, taken on a Calumet grey backdrop.
    And this is what I keep getting results-wise; see, around the hair there is still rather a lot of grey contamination, and the hair literally looks painted and smudged.

  • Detect Note Changes in MIDI Sequencer

    Hi all,
    I’m trying to detect when the MIDI Sequencer changes notes, using some kind of listener that fires when the note changes occur. My example program is able to detect the end of the sequence, and I’d like to be able to detect note changes and print a message in a similar way. Here is my example code.
    import javax.swing.*;
    import javax.sound.midi.*;

    public class SequencerTestApplet extends JApplet
    {
        public void init()
        {
            SequencerTest seqTest = new SequencerTest();
            seqTest.play();
            System.out.println("Print from init()");
        }
    }

    class SequencerTest
    {
        Sequencer sequencer = null;
        Sequence seq = null;
        Track track = null;

        public SequencerTest()
        {
            try
            {
                sequencer = MidiSystem.getSequencer();
                sequencer.open();
                // detect END OF SEQUENCE
                sequencer.addMetaEventListener(
                    new MetaEventListener()
                    {
                        public void meta(MetaMessage m)
                        {
                            if (m.getType() == 47) System.out.println("SEQUENCE FINISHED");
                        }
                    });
                sequencer.setTempoInBPM(40);
                seq = new Sequence(Sequence.PPQ, 16);
                track = seq.createTrack();
            }
            catch (Exception e) { }
        }

        public void play()
        {
            try
            {
                // NOTE 1
                ShortMessage noteOnMsg = new ShortMessage();
                noteOnMsg.setMessage(ShortMessage.NOTE_ON, 0, 60, 93);
                track.add(new MidiEvent(noteOnMsg, 0));
                ShortMessage noteOffMsg = new ShortMessage();
                noteOffMsg.setMessage(ShortMessage.NOTE_OFF, 0, 60, 93);
                track.add(new MidiEvent(noteOffMsg, 16));
                // NOTE 2
                ShortMessage noteOnMsg2 = new ShortMessage();
                noteOnMsg2.setMessage(ShortMessage.NOTE_ON, 0, 68, 93);
                track.add(new MidiEvent(noteOnMsg2, 16));
                ShortMessage noteOffMsg2 = new ShortMessage();
                noteOffMsg2.setMessage(ShortMessage.NOTE_OFF, 0, 68, 93);
                track.add(new MidiEvent(noteOffMsg2, 32));
                sequencer.setSequence(seq);
                sequencer.start();
            }
            catch (Exception e) { }
        }
    }
    In this program the init() method starts the sequencer through the play() method and then continues on, so the "Print from init()" statement is printed while the sequencer is still playing. After the sequencer finishes, the MetaEventListener detects the end of the sequence and prints the "SEQUENCE FINISHED" message. I’d like to also detect each time the sequence changes notes in a similar way: start the sequence and move on, but get notified each time a note change occurs and print a message.
    Since I am putting the notes at specific MIDI ticks (multiples of 16, or “quarter notes”), I could poll the Sequencer using getTickPosition() to see if the Sequencer’s tick position matches a particular multiple of 16. However, the problem with this is that it would lock up the program, since it would be constantly polling the sequencer, and the program wouldn’t be able to do anything else while the Sequencer is playing (I also have a loop option for the Sequencer, so that would lock up the program indefinitely).
    Here’s what I’ve found out and tried so far...
    I read in [this tutorial|http://java.sun.com/docs/books/tutorial/sound/MIDI-seq-adv.html] on the Java Sun site (under “Specifying Special Event Listeners”) that the Java Sound API specifies listener interfaces for control change events (pitch-bend wheel, data slider, etc.) and meta events (tempo change commands, end-of-track, etc.), but it says nothing about detecting note changes (note on/off). Also, the [EventListener API|http://java.sun.com/j2se/1.3/docs/api/java/util/class-use/EventListener.html] (under javax.sound.midi) only lists ControllerEventListener and MetaEventListener.
    I also read here that MIDI event listeners listen for the end of the MIDI stream, so again no info about detecting note changes.
    It seems like the sequencer should have some way of sending out messages (in some fashion) when these note changes happen, but I’m not sure how or even if it actually does. I’ve looked and looked and everything seems to be coming back to just these two types of listeners for MIDI so maybe it doesn’t.
    To be sure the MetaEventListener doesn’t detect note changes, I changed the meta() method from:
    public void meta(MetaMessage m)
    {    if (m.getType() == 47) System.out.println("SEQUENCER FINISHED");
    }
    to:
    public void meta(MetaMessage m)
    {    System.out.println("" + m.getType());
    }
    so that it would print out all of the MetaMessages it receives. The only message that printed was “47”, which indicates the end of the sequence. So the MetaEventListener doesn’t appear to do what I need it to do.
    I realize this is a rather odd problem and probably not many people have had the need to do something like this, but it never hurts to ask. If anyone has any suggestions on how to solve this problem it would be greatly appreciated.
    Thanks,
    -tkr

    Tekker wrote:
    As another idea, since I can't do it with a listener like I originally wanted to, would adding a separate thread to poll the sequencer and send an interrupt when it matches a particular MIDI tick be a good route to try? My thinking is this would essentially act kind of like a listener by running in the background and reporting back when it changes notes.
    Yep, that worked! :)
    import javax.swing.*;
    import javax.sound.midi.*;

    public class ThreadTestApplet extends JApplet
    {
        public void init()
        {
            ThreadTest threadTest = new ThreadTest();
            threadTest.play();
            MIDIThread thread = new MIDIThread(threadTest.sequencer);
            thread.start();
            System.out.println("  Print from init() 1");
            try { Thread.sleep(1000); } catch (InterruptedException ie) {}
            System.out.println("  Print from init() 2");
            try { Thread.sleep(1000); } catch (InterruptedException ie) {}
            System.out.println("  Print from init() 3");
            try { Thread.sleep(1000); } catch (InterruptedException ie) {}
            System.out.println("  Print from init() 4");
        }
    }

    class ThreadTest
    {
        Sequencer sequencer = null;
        Sequence seq = null;
        Track track = null;

        public ThreadTest()
        {
            System.out.println("Sequencer Started");
            try
            {
                sequencer = MidiSystem.getSequencer();
                sequencer.open();
                // detect END OF SEQUENCE
                sequencer.addMetaEventListener(
                    new MetaEventListener()
                    {
                        public void meta(MetaMessage m)
                        {
                            if (m.getType() == 47) System.out.println("SEQUENCER FINISHED");
                        }
                    });
                sequencer.setTempoInBPM(40);
                seq = new Sequence(Sequence.PPQ, 16);
                track = seq.createTrack();
            }
            catch (Exception e) { }
        }

        public void play()
        {
            try
            {
                // NOTE 1
                ShortMessage noteOnMsg = new ShortMessage();
                noteOnMsg.setMessage(ShortMessage.NOTE_ON, 0, 60, 93);
                track.add(new MidiEvent(noteOnMsg, 0));
                ShortMessage noteOffMsg = new ShortMessage();
                noteOffMsg.setMessage(ShortMessage.NOTE_OFF, 0, 60, 93);
                track.add(new MidiEvent(noteOffMsg, 16));
                // NOTE 2
                ShortMessage noteOnMsg2 = new ShortMessage();
                noteOnMsg2.setMessage(ShortMessage.NOTE_ON, 0, 68, 93);
                track.add(new MidiEvent(noteOnMsg2, 16));
                ShortMessage noteOffMsg2 = new ShortMessage();
                noteOffMsg2.setMessage(ShortMessage.NOTE_OFF, 0, 68, 93);
                track.add(new MidiEvent(noteOffMsg2, 32));
                sequencer.setSequence(seq);
                sequencer.start();
            }
            catch (Exception e) { }
        }
    }

    import javax.sound.midi.*;

    public class MIDIThread extends Thread
    {
        Sequencer sequencer = null;
        long midiTick = 0;
        long midi_progressionLastChord = 32;
        boolean print = true;

        public MIDIThread(Sequencer sequencer)
        {
            this.sequencer = sequencer;
        }

        public void run()
        {
            System.out.println("Thread Started");
            while (midiTick < midi_progressionLastChord)
            {
                midiTick = sequencer.getTickPosition();
                if (midiTick == 0 || midiTick == 16)
                {
                    if (print)
                    {
                        System.out.println("NOTE CHANGE");
                        print = false;
                    }
                }
                else
                    print = true;
            }
        }
    }
    I put in several print statements (with pauses in the init() method), and the init() print statements continue to be printed while the sequencer is playing, so it's not locking up the system, and the "NOTE CHANGE" statements happen when the sequencer changes notes. So this part is working perfectly! :)
    Here's what I got for my output:
    Sequencer Started
    Print from init() 1
    Thread Started
    NOTE CHANGE
    Print from init() 2
    NOTE CHANGE
    Print from init() 3
    SEQUENCER FINISHED
    Print from init() 4
    The only problem I'm having is how to "throw" this action back up to the main init() method and have it do the print statement instead of the thread class. Throwing an interrupt apparently won't work, since you have to poll to see whether the thread has been interrupted (so it'd be no different from just polling the sequencer to see if it equals the specific MIDI tick). Maybe throw an ActionEvent? But how do I attach it to my applet? Would I need to pass an instance of my applet into the thread class so it can catch the ActionEvent? And if I do that, will it stop the thread, or will the thread keep running so it can detect the other notes? Or is there a better/simpler way to do this?
    Thanks again,
    -tkr
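One way to route the notification back without passing the whole applet into the thread is a small callback interface: the applet (or any class) implements it, and the polling thread holds a reference and invokes it at note boundaries. The interface name and the simulated tick values below are invented so the sketch runs standalone; in the real program the ticks would come from sequencer.getTickPosition(), and any Swing UI updates made in the callback should be wrapped in SwingUtilities.invokeLater().

```java
// Hypothetical listener interface -- javax.sound.midi does not define one
// for note changes, so we roll our own.
interface NoteChangeListener {
    void noteChanged(long midiTick);
}

// A polling thread that reports note boundaries back through the callback
// instead of printing itself. The tick values are simulated here; in the
// applet they would come from sequencer.getTickPosition().
class NotifyingMidiThread extends Thread {
    private final NoteChangeListener listener;
    private final long[] simulatedTicks;

    NotifyingMidiThread(NoteChangeListener listener, long[] simulatedTicks) {
        this.listener = listener;
        this.simulatedTicks = simulatedTicks;
    }

    public void run() {
        long lastReported = -1;
        for (long tick : simulatedTicks) {
            // Report each multiple of 16 exactly once, like the print flag above.
            if (tick % 16 == 0 && tick != lastReported) {
                listener.noteChanged(tick);
                lastReported = tick;
            }
        }
    }
}

public class CallbackDemo {
    public static void main(String[] args) throws InterruptedException {
        NoteChangeListener printer = new NoteChangeListener() {
            public void noteChanged(long midiTick) {
                System.out.println("NOTE CHANGE at tick " + midiTick);
            }
        };
        NotifyingMidiThread t = new NotifyingMidiThread(
            printer, new long[] {0, 0, 5, 16, 16, 20});
        t.start();
        t.join();
    }
}
```

The thread keeps running after each callback, so it can report every note in the sequence; the listener reference replaces the ActionEvent plumbing entirely.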

  • Play video (http) works, stream video (rtmp) does not; edge server not starting

    However, I think the problem is the edge server not starting.
    I looked in the logs directory and found the core log, the master log.
    I also found /var/log/messages.
    Shouldn't there be an edge log?
    I did this from the readme:
    sudo ./fmsmgr server fms start
    sudo ./fmsmgr adminserver start

    rats, hit the wrong button
    here is the rest of the report
    I did this
    sudo ./fmsmgr server fms start
    sudo ./fmsmgr adminserver start
    sudo ./server start
    The master.log: 
    #Date: 2011-07-21
    #Fields: date   time    x-pid   x-status        x-ctx   x-comment
    2011-07-21      16:34:35        26286   (i)2581173      FMS detected IPv6 protocol stack!       -
    2011-07-21      16:34:35        26286   (i)2581173      FMS config <NetworkingIPv6 enable=false>        -
    2011-07-21      16:34:35        26286   (i)2581173      FMS running in IPv4 protocol stack mode!        -
    2011-07-21      16:34:35        26286   (i)2581173      Host: bastet.cam.corp.google.com IPv4: 172.31.194.67    -
    2011-07-21      16:34:35        26286   (i)2571011      Server starting...      -
    2011-07-21      16:34:35        26286   (i)2581224      Edge (26306) started, arguments : -edgeports ":1935,80" -coreports "localhost:19350" -conf "/opt/adobe/fms/conf/Server.xml" -adaptor "_defaultRoot_" -name "_defaultRoot__edge1".      -
    2011-07-21      16:34:35        26286   (i)2571111      Server started (/opt/adobe/fms/conf/Server.xml).        -
    2011-07-21      16:34:40        26286   (i)2581226      Edge (26306) is no longer active.       -
    #Date: 2011-07-21
    #Fields: date   time    x-pid   x-status        x-ctx   x-comment
    2011-07-21      16:36:16        26713   (i)2581173      FMS detected IPv6 protocol stack!       -
    2011-07-21      16:36:16        26713   (i)2581173      FMS config <NetworkingIPv6 enable=false>        -
    2011-07-21      16:36:16        26713   (i)2581173      FMS running in IPv4 protocol stack mode!        -
    2011-07-21      16:36:16        26713   (i)2581173      Host: bastet.cam.corp.google.com IPv4: 172.31.194.67    -
    2011-07-21      16:36:16        26713   (i)2571011      Server starting...      -
    2011-07-21      16:36:16        26713   (e)2571122      Server aborted. -
    The /var/log/messages output shows "Server starting ..." messages.
    When I do the ps:
    ps auxww | grep adobe
    nobody   19178  0.0  0.0 203444 11124 pts/2    Sl   16:03   0:00 /opt/adobe/fms/fmscore -adaptor _defaultRoot_ -vhost _defaultVHost_ -app registry -inst registry -tag -console -conf /opt/adobe/fms/conf/Server.xml -name _defaultRoot_:_defaultVHost_:registry:registry:
    I expected to see the fmsedge binary.
    lsof doesn't show anybody listening on 1935 as I expected for an rtmp server.

  • Use of edge detection in pattern matching algorithm?

    Hello all,
                    I work for a group at Texas A&M University researching two-phase flow in reactors.  We have been using IMAQ Vision and had a question regarding the use of edge detection in the pattern matching algorithm.  I had seen the webcast entitled “Algorithms that Learn: The Sum and Substance of Pattern Matching and OCR” (http://zone.ni.com/wv/app/doc/p/id/wv-705), and in the webcast it was mentioned that the pattern matching algorithm uses edge detection to (as best I can tell) reduce the candidate list further and to perform subpixel location calculations.  However, I was wondering if this edge detection process is still performed if we do not use the subpixel location calculation (i.e. if we uncheck the “Subpixel Accuracy” check box)?  Also, if edge detection is performed in the pattern matching algorithm, is it consistent with the method described in Chapter 13 of the Vision Concepts Manual (“Geometric Matching”)?  Finally, if edge detection is performed in a manner consistent with Chapter 13 of the manual, how does the geometric matching correlation number affect the correlation calculation that was performed in the previous steps?  Are they simply multiplied together?
    Many thanks!
      -Aaron

    Jeff,
    We are using Imaq Vision Builder 4, with the included pattern matching that can be accessed via the menus (i.e. we haven't created a custom VI or anything.)  We are using the software to locate bubbles during boiling experiments and want a deeper understanding of what is going on "behind the scenes" of the algorithm, as we may have to explain how it works later.  We have been able to determine most of what we need from the webcast I had previously mentioned, except for the use of edge detection in the pattern matching algorithm.
    At the scales involved in our experiments, subpixel accuracy is really not needed and therefore we do not use it.  If edge detection is used in the pattern matching algorithm only to determine location with subpixel accuracy, then we do not really need to know how it works because we do not use that calculation.  Inversely, of course, if edge detection is used during pattern matching even without enabling subpixel accuracy, then we would like to have a fairly good understanding of the process.
    I've read most of the section on geometric matching in the Vision Concepts Manual and wondered if the process described there for edge detection (or feature matching) was also used in the basic pattern matching algorithm?
    To summarize, if edge detection is not used in the basic pattern matching algorithm without subpixel accuracy, then that is all I need to know.  If edge detection is used for pattern matching even without using the subpixel accuracy calculation, then we would like to learn more about how exactly it is used in the pattern matching algorithm.
    We would really appreciate any help you could give us... we've been digging around on the NI website for a couple of weeks now trying to fit together all the pieces of the pattern matching puzzle.
    Many thanks!
        Aaron

  • Edge detection on a moving object

    Hi
    I have a problem with edge detection. I have pulleys of different types:
    pulleys where the diameter is the same but the height differs, and pulleys of different diameters where the number of teeth (ridges) varies.
    I need to identify one type of pulley from the other. I am trying to use the following logic:
    1. Locate the base of the pulley (which is distinct) using pattern matching.
    2. Define a coordinate system based on this pattern match.
    3. Define an edge detection tool using the coordinate system (this is where I am running into a wall).
    I have used extracts of the examples: battery inspection, gauge and fuse inspection.
    I am not able to define the edge tool (Edge Detector under Vision Assistant 7.1).
    I am trying to use the coordinate system since, if the pulley moves a bit, the edge detector ends up away from it (in Vision Assistant).
    THE CATCH IS:
    I have to do this in VB, since the machine vision has to be integrated into an existing VB application.
    NOTE: attached image of pulley
    Now can someone please help me?
    Thanks in advance
    Suresh
    Attachments:
    pulley.png ‏13 KB

    Hi Suresh -
    I took your image and expanded the background region to make three versions with the pulley in different positions.  Then I loaded the three images into Vision Assistant and built a script that finds the teeth of the pulley.  Although Vision Assistant can't generate coordinate systems, I used edge detection algorithms to define a placeholder where the coordinate system code should be added.
    I chose to use a clamp and midpoint instead of the Pattern Matching tool because of the nature of the image.  Silhouetted images are difficult to pattern match, and the vertical line symmetry of the pulley makes it even more difficult to find a unique area of the object that can be recognized repeatedly.  Instead, I generally find more success using algorithms that are designed around edge detection.  I assumed that the "notch" at the bottom of the pulley will always be present in your images, so I performed a Clamp in the Horizontal Min Caliper to find this notch.  The midpoint of this clamp section can be used as the origin of a coordinate system around which the following Edge Detection steps are performed.  This allows you to find the teeth of the pulley no matter where it is in the image.  (Note that the VA script was built using pulley with BG 3.png.)
    The Builder File that was generated from the script gives you all the code you need except for the Caliper function and the Coordinate System functions.  I added comments to these sections to show what type of code you will have to add there.
    It may not be exactly the application you need to perform, but it should give you a solid starting point.  I hope it helps.
    David Staab, CLA
    Staff Systems Engineer
    National Instruments
    Attachments:
    Pulley Teeth.zip ‏18 KB

  • Scene Detect not functioning in Premiere Pro CS6

    Hello all,
    Having an issue with Scene Detect not recognizing individual scenes when I capture HDV footage from my Canon XH-series cameras. Tapes just come in as one long file rather than individual scenes. I am using CS6 on my Mac computers (OS 10.7.5 and 10.6.8).
    This is baffling as I've captured many tapes from many weddings and have never experienced this problem to this point.
    I did reset my cameras to default settings recently but can't seem to find anything that would indicate that causing the issue. However, it is the only change I have made unless a recent Adobe or OS update could have caused the issue?
    Your help and knowledge are greatly appreciated!

    Problem solved.
    For anyone seeking a possible solution to this simple but incredibly aggravating issue: make sure your camera's date and time fields are set. Premiere Pro's "Scene Detect" function uses your camera's time signature to differentiate scenes in your captured footage. No timecode = no individual scene detection.
    When I reset my camera to default settings, I erased the time and date fields. I wasn't too worried about it, as I typically use these cameras for their HD-SDI capability with timecode captured via TriCaster; however, this became a frustrating problem when I used the cameras for a recent wedding capture, and I now have four long files rather than the intended multiple scenes I need for editing...
    Hope this simple discovery makes someone's day a little easier...

  • Edge detection using IMAQ Find Edge/IMAQ Edge Tool 3

    Hi,
    I have several images with useless background around a rectangular ROI (coordinates unknown!). So I tried using the two VIs mentioned above to detect these edges so that I can remove the background. Unfortunately, this does not work as planned.
    IMAQ Find Edge usually finds an edge, but not where it should be: the edge is detected earlier than I want it to be.
    IMAQ Edge Tool 3 sometimes does not find an edge at all, and sometimes it finds the edge perfectly. Here I use the 'get best edge' option, which delivers the best results with all the images I tested it with.
    None of the other options are changed while running the VI with the images I have.
    Does anyone have intimate knowledge of these VIs' algorithms: how they work, how they can be manipulated, ...?

    Hi,
    Can you upload an example image?
    That would clarify what you're trying to do.
    Most of the time a change of mindset solves the problem.
    Kind regards,
    - Bjorn -
    Have fun using LabVIEW... and if you like my answer, please pay me back in Kudo's
    LabVIEW 5.1 - LabVIEW 2012

  • Edge Detection

    I'm about to start working on an application that receives images from a wireless webcam (attached to a roving robot) and needs to process them to decide where the robot should travel to next. The processing needs to identify obstacles, walls, and doorways and guide the robot through a door into another room.
    I know I'm going to need to use edge detection on the images, but don't know the best place to start. Any ideas? Also, if anyone has any experience with this, any idea what kind of performance I should expect? Given the limitations of the robot's movement, I will not need to process more than 10 frames per second. Assuming decent processing power, is 10 fps doable? 5?
    Thanks in advance...

    Edge detection is basically a convolution operation. An image is simply a matrix of pixels. This matrix may be convolved with another small matrix of values called a kernel.
    // (assumes import java.awt.*; and import java.awt.image.*;)
    // Loading Image
    String imageName = "image.jpg";
    Canvas c = new Canvas();
    Image image = c.getToolkit().getImage(imageName);
    MediaTracker waitForIt = new MediaTracker(c);
    waitForIt.addImage(image, 1);
    try { waitForIt.waitForID(1); }
    catch (InterruptedException ie) { }
    // Buffering Image
    BufferedImage src = new BufferedImage(
        image.getWidth(c), image.getHeight(c),
        BufferedImage.TYPE_INT_RGB);
    Graphics2D big = src.createGraphics();
    big.drawImage(image, 0, 0, c);
    // Edge Detection
    float[] values = {
        0f, -1f, 0f,
       -1f, 4f, -1f,
        0f, -1f, 0f
    };
    Kernel k = new Kernel(3, 3, values);
    ConvolveOp cop = new ConvolveOp(k);
    BufferedImage dest = cop.filter(src, null);
    Play around with the values in the kernel to get a better idea of how this all works.
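To see the kernel doing its job without needing image.jpg on disk, here is a self-contained variant (the class name and the 8x8 test image are made up): build a tiny image that is black on the left and white on the right, convolve with the same kernel, and check that only pixels at the boundary respond.

```java
import java.awt.image.BufferedImage;
import java.awt.image.ConvolveOp;
import java.awt.image.Kernel;

public class EdgeDemo {
    // Apply the same Laplacian-style kernel as in the snippet above.
    public static BufferedImage laplacian(BufferedImage src) {
        float[] values = {
             0f, -1f,  0f,
            -1f,  4f, -1f,
             0f, -1f,  0f
        };
        ConvolveOp op = new ConvolveOp(new Kernel(3, 3, values));
        return op.filter(src, null);
    }

    public static void main(String[] args) {
        // Left half black, right half white: one vertical edge at x = 4.
        BufferedImage img = new BufferedImage(8, 8, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < 8; y++)
            for (int x = 0; x < 8; x++)
                img.setRGB(x, y, x < 4 ? 0x000000 : 0xFFFFFF);

        BufferedImage out = laplacian(img);
        // Flat regions convolve to zero; the edge column lights up.
        System.out.println((out.getRGB(1, 4) & 0xFFFFFF) == 0); // true
        System.out.println((out.getRGB(4, 4) & 0xFFFFFF) != 0); // true
    }
}
```

Note that ConvolveOp's default edge condition (EDGE_ZERO_FILL) zeroes the outermost border pixels, which is why the check points sit away from the image border.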

  • Screen Edge/Rim not level at right side

    My iPod touch 4th gen, which I got on Black Friday in 2010, has the right-hand side of the screen edge/rim not level; when you run your finger along it you get a "sharp" edge/rim to the screen. And it is just the edge/rim of the screen. It has always had this, and I noticed it more today when my brother got a new iPod touch. Is there any point in getting Apple to look at it? I should have got it replaced back when I got it, but last year I was more concerned with the light bleed via gaps in the rim (as this was the second one); I didn't really notice it until it was too late, i.e. out of the 14-day return window.

    This is the way it came, so nothing to do with me. I will give AppleCare a ring, explain, and see what they suggest. And if they say to bring it back, I will do it.
    I'm sure if the batch is all the same, or some of the batch, then it was done by Foxconn during manufacturing.
    Dents and scratches, yes, you have to accept that, but not the screen being defective. I should have changed it within the 14-day period.
    I am really puzzled why Foxconn has a hard time creating a perfect device. I went through 5 iPad 2s because they weren't perfect, and 4 MacBook Pro 17"s because they weren't perfect.
    Plus, here in the UK we have greater consumer protection when it comes to faults. The edge/rim of the screen being lower than the rest of the device with no signs of damage = automatic proof that it's an inherent fault.

  • Negative Edge Detection DI

    Hello,
    I want to read in several counter inputs on my USB-6008, but it has only one counter input.
    The frequency of the pulses to read isn't high, so I thought maybe I could use a digital input: when the input is high, a certain value has to be incremented. The only problem is that the high level of the pulse lasts too long, and the value is incremented more than once. To solve this problem, I think I need edge detection that generates a single pulse when a negative edge has occurred. Can somebody help me solve this problem?
    Greetings,
    Kevin Duerloo
    KaHo Sint-Lieven, Belgium

    Hi,
    There is no change detection available on the digital lines of these USB devices, so you will not be able to trigger on a rising or falling edge using the digital lines. The only thing you could do is use the digital trigger line: create a program that starts an acquisition on a digital trigger (PFI0) and set it to Rising or Falling. Put this task inside a loop. Every time a rising edge occurs, the acquisition will start, and each time the dataflow arrives at DaqmxBaseRead.vi/DaqmxRead.vi you can add some code that increases a value. Put a sequence around the Read.vi and add the code. This is just a workaround; you can test whether it is good enough for you.
    Regards.
    JV
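The workaround above fires an acquisition per edge in hardware; the same "count a transition once, not once per high sample" idea can also be done in software by comparing each sample of the digital line with the previous one. A minimal sketch (in Java; the sample array stands in for values read from the digital input):

```java
// Software falling-edge detection: compare each digital sample with the
// previous one and count only 1 -> 0 transitions, so a long high pulse
// increments the count exactly once.
public class FallingEdgeCounter {
    public static int countFallingEdges(boolean[] samples) {
        int count = 0;
        boolean previous = samples.length > 0 && samples[0];
        for (int i = 1; i < samples.length; i++) {
            if (previous && !samples[i]) count++; // high -> low transition
            previous = samples[i];
        }
        return count;
    }

    public static void main(String[] args) {
        // A three-sample-long high pulse still counts as one falling edge.
        boolean[] samples = {false, true, true, true, false, false, true, false};
        System.out.println(countFallingEdges(samples)); // prints 2
    }
}
```

The sample rate just has to be fast enough that no pulse fits entirely between two successive reads of the line.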

  • Automating Edge Detection?

    I have a question about the IMAQ edge detection VI. I am designing a VI that will count edges in an image many times in succession. Since the area in question will not change as the image changes, is it possible to just measure along the same line every time? Can I select start and end pixel locations for the edge detection? At this point I have to click and drag a line, and that is very tedious for the number of images that will be measured.
    Thanks,
    Mack

    Hi Mack24, 
    Which tool are you using for edge detection? The shipping example "Edge Detection.vi" allows you to specify a line ROI to detect edges along.
    The example can be found under Toolkits and Modules->Vision->Caliper->Edge Detection.vi
    -N
    National Instruments
    Applications Engineer

  • Edge Detection & Edge Coordinates

    I need to do edge detection and get the edge coordinates from that operation. It's been a while since I've done stuff like this, so I'm wondering if maybe JAI has this? If not, what should I look for in Java 2D (although at that point I'm writing it from scratch, right?)
    I'm not asking for homework here, but I am very busy and would like to be pointed in the right direction quickly. Thus, the 10 dukes as compensation. :-)

    anybody?

  • Imperfections in the canny edge detection

    There are imperfections in the area just inside of the detected edge. I have seen this in multiple programs and am only now looking for solutions. Thank you greatly in advance.
    Attachments:
    dafsdf.png ‏3 KB
    FGDFG.png ‏215 KB

    Hi agator2,
    To reiterate Taki1999's post, you can always threshold the image before you do edge detection, to get a more precise edge.  Your image looks like it fades from the darker color to the lighter color.  The more you can threshold this image toward precise edges, the fewer imperfections you will have.  I hope this helps!
    Kim W.
    Applications Engineer
    National Instruments
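The advice above is IMAQ-specific, but the thresholding step itself is generic: map every pixel to pure black or white before edge detection, so a soft gradient collapses into one crisp boundary. A sketch in Java (the class name and the cutoff of 128 are arbitrary illustration choices; NI Vision's threshold functions are the LabVIEW equivalent):

```java
import java.awt.image.BufferedImage;

public class ThresholdDemo {
    // Global threshold: pixels at or above the cutoff become white,
    // everything else black. Brightness is the mean of the RGB channels.
    public static BufferedImage threshold(BufferedImage src, int cutoff) {
        BufferedImage dst = new BufferedImage(
            src.getWidth(), src.getHeight(), BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < src.getHeight(); y++) {
            for (int x = 0; x < src.getWidth(); x++) {
                int rgb = src.getRGB(x, y);
                int gray = (((rgb >> 16) & 0xFF) + ((rgb >> 8) & 0xFF) + (rgb & 0xFF)) / 3;
                dst.setRGB(x, y, gray >= cutoff ? 0xFFFFFF : 0x000000);
            }
        }
        return dst;
    }

    public static void main(String[] args) {
        // A two-pixel "gradient": dark gray and light gray.
        BufferedImage img = new BufferedImage(2, 1, BufferedImage.TYPE_INT_RGB);
        img.setRGB(0, 0, 0x202020);
        img.setRGB(1, 0, 0xC0C0C0);
        BufferedImage bw = threshold(img, 128);
        System.out.println(Integer.toHexString(bw.getRGB(0, 0) & 0xFFFFFF)); // 0
        System.out.println(Integer.toHexString(bw.getRGB(1, 0) & 0xFFFFFF)); // ffffff
    }
}
```

A fixed cutoff only works when lighting is even; otherwise an adaptive (per-region) threshold is the usual refinement.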
