Face tracking

Is it possible to do face tracking in FLV webcam conferencing?
So I can add objects to faces and stuff.


Similar Messages

  • How to turn off Face Tracking

    I tried to turn off the face tracking under Face Tracking Options (there are three options: off, continuous and smart). I clicked "off" each time, but it did not work. The setting automatically returns to either "continuous" or "smart". Can anyone help?

    There appears to be a bug in the software that won't let you turn it off; it will only let you switch to Continuous or Smart. So I did some poking around, and to turn it off you'll need to edit your registry.*
    Open Regedit and navigate to the following key.
    I've only done this on one PC so the VF0070 name may be different.
    HKEY_LOCAL_MACHINE\SOFTWARE\Creative Tech\Web Cameras\VF0070
    In this folder you will find a key named: FaceTrackingMode
    0 = Off
    1 = Continuous
    2 = Smart
    You will need to stop and restart WebCam Center in order for the change to take effect.
    *Neither I nor Creative Labs is responsible if you damage or in any way screw up your computer. If you are not comfortable with editing the registry then don't do it.
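    If you would rather script the change than edit the key by hand, here is a minimal C# sketch of the same registry write. This is only an illustration: it assumes the value is stored as a DWORD and that your camera uses the same VF0070 subkey (check in Regedit first), it must be run as administrator because the key is under HKEY_LOCAL_MACHINE, and you still need to restart WebCam Center afterwards.
    using Microsoft.Win32;

    class DisableFaceTracking
    {
        static void Main()
        {
            // Sets FaceTrackingMode = 0 (off) under the Creative webcam key described above.
            Registry.SetValue(
                @"HKEY_LOCAL_MACHINE\SOFTWARE\Creative Tech\Web Cameras\VF0070",
                "FaceTrackingMode",
                0,
                RegistryValueKind.DWord);
        }
    }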

  • How to control Get3DShape() in Microsoft Kinect face tracking SDK?

    I'm working on a lip-reading project using the Microsoft Kinect face tracking SDK in order to extract lip features into a text file generated from C#, to be processed in MATLAB. The features are 29 angles obtained after calculating the distances between the 18 feature points related to the lips, using the Get3DShape() function to get the coordinates in each frame. My code, added in FaceTrackingViewer.cs, looks like this:
    public void getangles(FaceTrackFrame fr)
    {
    var shape = fr.Get3DShape();
    var X_7 = shape[FeaturePoint.MiddleTopDipUpperLip].X;
    var Y_7 = shape[FeaturePoint.MiddleTopDipUpperLip].Y;
    var Z_7 = shape[FeaturePoint.MiddleTopDipUpperLip].Z;
    var X_31 = shape[FeaturePoint.OutsideRightCornerMouth].X;
    var Y_31 = shape[FeaturePoint.OutsideRightCornerMouth].Y;
    var Z_31 = shape[FeaturePoint.OutsideRightCornerMouth].Z;
    var X_33 = shape[FeaturePoint.RightTopDipUpperLip].X;
    var Y_33 = shape[FeaturePoint.RightTopDipUpperLip].Y;
    var Z_33 = shape[FeaturePoint.RightTopDipUpperLip].Z;
    var X_40 = shape[FeaturePoint.MiddleTopLowerLip].X;
    var Y_40 = shape[FeaturePoint.MiddleTopLowerLip].Y;
    var Z_40 = shape[FeaturePoint.MiddleTopLowerLip].Z;
    var X_41 = shape[FeaturePoint.MiddleBottomLowerLip].X;
    var Y_41 = shape[FeaturePoint.MiddleBottomLowerLip].Y;
    var Z_41 = shape[FeaturePoint.MiddleBottomLowerLip].Z;
    var X_64 = shape[FeaturePoint.OutsideLeftCornerMouth].X;
    var Y_64 = shape[FeaturePoint.OutsideLeftCornerMouth].Y;
    var Z_64 = shape[FeaturePoint.OutsideLeftCornerMouth].Z;
    var X_66 = shape[FeaturePoint.LeftTopDipUpperLip].X;
    var Y_66 = shape[FeaturePoint.LeftTopDipUpperLip].Y;
    var Z_66 = shape[FeaturePoint.LeftTopDipUpperLip].Z;
    //some stuff: the remaining lip feature-point coordinates (X_80, X_82, X_84 through X_89, etc.) are read here in the full code
    float[] point1 = new float[3] { X_7, Y_7, Z_7 };
    float[] point2 = new float[3] { X_64, Y_64, Z_64 };
    float[] point3 = new float[3] { X_41, Y_41, Z_41 };
    float[] point4 = new float[3] { X_80, Y_80, Z_80 };
    float[] point5 = new float[3] { X_88, Y_88, Z_88 };
    float[] point6 = new float[3] { X_31, Y_31, Z_31 };
    float[] point7 = new float[3] { X_87, Y_87, Z_87 };
    float[] point8 = new float[3] { X_40, Y_40, Z_40 };
    float[] point9 = new float[3] { X_89, Y_89, Z_89 };
    float[] point10 = new float[3] { X_86, Y_86, Z_86 };
    float[] point11 = new float[3] { X_85, Y_85, Z_85 };
    float[] point12 = new float[3] { X_82, Y_82, Z_82 };
    float[] point13 = new float[3] { X_84, Y_84, Z_84 };
    float[] point14 = new float[3] { X_66, Y_66, Z_66 };
    float deltax = point2[0] - point1[0];
    float deltay = point2[1] - point1[1];
    float deltaz = point2[2] - point1[2];
    float cetax = point3[0] - point2[0];
    float cetay = point3[1] - point2[1];
    float cetaz = point3[2] - point2[2];
    float notex = point3[0] - point1[0];
    float notey = point3[1] - point1[1];
    float notez = point3[2] - point1[2];
    float deltax1 = point5[0] - point4[0];
    float deltay1 = point5[1] - point4[1];
    float deltaz1 = point5[2] - point4[2];
    float deltax2 = point5[0] - point3[0];
    float deltay2 = point5[1] - point3[1];
    float deltaz2 = point5[2] - point3[2];
    //some stuff: the remaining difference components (deltax3/deltay3/deltaz3 through deltax7/deltay7/deltaz7) are computed here in the full code
    float distance = (float)
    Math.Sqrt((deltax * deltax) + (deltay * deltay) + (deltaz * deltaz));
    float distance1 = (float)
    Math.Sqrt((cetax * cetax) + (cetay * cetay) + (cetaz * cetaz));
    float distance2 = (float)
    Math.Sqrt((notex * notex) + (notey * notey) + (notez * notez));
    float distance3 = (float)
    Math.Sqrt((deltax1 * deltax1) + (deltay1 * deltay1) + (deltaz1 * deltaz1));
    float distance4 = (float)
    Math.Sqrt((deltax2 * deltax2) + (deltay2 * deltay2) + (deltaz2 * deltaz2));
    float distance5 = (float)
    Math.Sqrt((deltax3 * deltax3) + (deltay3 * deltay3) + (deltaz3 * deltaz3));
    float distance6 = (float)
    Math.Sqrt((deltax4 * deltax4) + (deltay4 * deltay4) + (deltaz4 * deltaz4));
    float distance7 = (float)
    Math.Sqrt((deltax5 * deltax5) + (deltay5 * deltay5) + (deltaz5 * deltaz5));
    float distance9 = (float)
    Math.Sqrt((deltax6 * deltax6) + (deltay6 * deltay6) + (deltaz6 * deltaz6));
    float distance8 = (float)
    Math.Sqrt((deltax7 * deltax7) + (deltay7 * deltay7) + (deltaz7 * deltaz7));
    //some stuff: distance10 and the remaining distances are computed here in the full code
    double feature = CalcAngleB(distance, distance2, distance1);
    double feature2 = CalcAngleB(distance2, distance, distance);
    double feature3 = CalcAngleB(distance, distance1, distance2);
    double feature4 = CalcAngleB(distance3, distance5, distance4);
    double feature5 = CalcAngleB(distance5, distance4, distance3);
    double feature6 = CalcAngleB(distance4, distance3, distance5);
    double feature7 = CalcAngleB(distance6, distance, distance7);
    double feature8 = CalcAngleB(distance10, distance8, distance9);
    double feature9 = CalcAngleB(distance9, distance10, distance8);
    double feature10 = CalcAngleB(distance8, distance9, distance10);
    //some stuff: feature11 through feature29 are computed here in the full code
    using (System.IO.StreamWriter file = new System.IO.StreamWriter(@"D:\angles.txt", true))
    {
    file.WriteLine("{0},{1},{2},{3},{4},{5},{6},{7},{8},{9},{10},{11},{12},{13},{14},{15},{16},{17},{18},{19},{20},{21},{22},{23},{24},{25},{26},{27},{28}", feature, feature2, feature3, feature4, feature5, feature6, feature7, feature8, feature9, feature10, feature11, feature12, feature13, feature14, feature15, feature16, feature17, feature18, feature19, feature20, feature21, feature22, feature23, feature24, feature25, feature26, feature27, feature28, feature29);
    }
    }
    where CalcAngleB is a function declared above:
    public double CalcAngleB(double a, double b, double c)
    {
    // law of cosines: returns the angle opposite side b of a triangle with sides a, b, c
    return Math.Acos((a * a + c * c - b * b) / (2 * a * c));
    }
    and I call getangles(fr) in:
    internal void OnFrameReady(KinectSensor kinectSensor, ColorImageFormat colorImageFormat, byte[] colorImage, DepthImageFormat depthImageFormat, short[] depthImage, Skeleton skeletonOfInterest)
    {
        this.skeletonTrackingState = skeletonOfInterest.TrackingState;
        if (this.skeletonTrackingState != SkeletonTrackingState.Tracked)
        {
            // nothing to do with an untracked skeleton.
            return;
        }
        if (this.faceTracker == null)
        {
            try
            {
                this.faceTracker = new FaceTracker(kinectSensor);
            }
            catch (InvalidOperationException)
            {
                // During some shutdown scenarios the FaceTracker
                // is unable to be instantiated. Catch that exception
                // and don't track a face.
                Debug.WriteLine("AllFramesReady - creating a new FaceTracker threw an InvalidOperationException");
                this.faceTracker = null;
            }
        }
        if (this.faceTracker != null)
        {
            FaceTrackFrame frame = this.faceTracker.Track(
                colorImageFormat, colorImage, depthImageFormat, depthImage, skeletonOfInterest);
            getangles(frame);
            this.lastFaceTrackSucceeded = frame.TrackSuccessful;
            if (this.lastFaceTrackSucceeded)
            {
                if (faceTriangles == null)
                {
                    // only need to get this once. It doesn't change.
                    faceTriangles = frame.GetTriangles();
                }
                //MessageBox.Show("face has been detected");
                this.facePoints = frame.GetProjected3DShape();
                //this.FaceRect = new System.Windows.Rect(frame.FaceRect.Left, frame.FaceRect.Top, frame.FaceRect.Width, frame.FaceRect.Height);
            }
        }
    }
    Now, whenever the mask appears (i.e. when this.faceTracker != null), the code starts calculating the angles and generating the text file. My question is: how can I add a control (e.g. a Timer or a button) to decide exactly when to start calculating the features or generating the file? For data collection I need the features to be recorded precisely when the speaker starts to pronounce the phrases, not when the mask appears. I have tried to add a DispatcherTimer, so that once the timer ticks after the mask appears it starts to calculate the angles, like this:
    private static void dispatcherTimer_Tick(object sender, EventArgs e)
    {
        Microsoft.Kinect.Toolkit.FaceTracking.FaceTrackFrame fr;
        fr = facetracker.Track(sensor.ColorStream.Format, colorPixelData,
            sensor.DepthStream.Format, depthPixelData, skeleton);
        //getangles(fr) and stream writer
    }
    and added this to getangles(fr):
    System.Windows.Threading.DispatcherTimer dispatcherTimer = new System.Windows.Threading.DispatcherTimer();
    dispatcherTimer.Tick += dispatcherTimer_Tick;
    dispatcherTimer.Interval = new TimeSpan(0, 0, 10);
    dispatcherTimer.Start();
    but this gives an error in var shape = frame.Get3DShape(); about using the unassigned variable frame, and when I tried to declare facetracker inside the timer's method that gave me an error too. So, if you have an idea of how I can add a timer or a button to control when the angles are calculated, I would be grateful.
    Thanks in advance.

    Hi Mesbah15,
    I am not sure I have understood the question correctly. You can try using an event to communicate between classes: define an event, and when the mask appears (or when you want recording to start) you just need to call that event handler. See
    https://msdn.microsoft.com/en-us/library/17sde2xt(v=VS.100).aspx.
    If an event handler does not help, please post more information about your scenario for better understanding.
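    For example, here is a minimal sketch of one way to gate the calculation. The names recordingEnabled and StartStopButton_Click are only illustrative; the flag could just as well be set by the event suggested above or by a DispatcherTimer tick.
    // Illustrative field in the class that owns OnFrameReady:
    private volatile bool recordingEnabled = false;

    // Hook this up to a Button (or a key press) in the window, so recording starts
    // exactly when the speaker begins the phrase rather than when the mask first appears.
    private void StartStopButton_Click(object sender, System.Windows.RoutedEventArgs e)
    {
        this.recordingEnabled = !this.recordingEnabled;
    }

    // Then, inside OnFrameReady, guard the feature extraction with the flag,
    // for example only after a successful track:
    if (this.lastFaceTrackSucceeded && this.recordingEnabled)
    {
        getangles(frame);
    }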
    Regards,

  • Can't turn off face tracking on Live! Ultra for Notebooks

    Please help! I've installed Live! Ultra for Notebooks and it works reasonably well, except for the fact that I can't turn off the Face Tracking feature.
    When I installed everything, I selected "Smart", but I hate the way that the camera keeps panning off to different parts of the room. So, I just want to turn this feature OFF.
    But every time I open the Face Tracking Utility, or go to the screen via "Tools," I can click "off". But, it doesn't actually turn off; the camera still keeps panning off to the far corners of the room. And the next time I access the Face Tracking Utility again, it has defaulted back to "Smart". Any suggestions on how to disable this feature?
    My system is as follows:
    Dell Inspiron 700M
    Intel Pentium M 2.00 GHz processor
    Windows XP Home Edition, service pack 2
    Happy new year & thanks in advance..

    Finally figured it out. In case anyone else has the same problem of your computer ignoring changes in the Face Tracking Utility screen, try this:
    Go to Webcam Center. Click on "Capture" then the "Source" tab, one of two discrete grey tabs just above the video screen.
    Under "Extended Settings" you have the options for face tracking: off, continuous or smart. Click it and your computer should actually hold onto this setting until you change it again!
    Good luck.
    -noknok

  • Disabling face tracking on webcam - S405

    Does anyone know how to disable or turn off the face tracking feature (on the webcam) on the S405? I see it uses quite a bit of CPU resources. The only way I can turn it off is by killing it in the task manager. Then it comes back after a reboot. I want to permanently disable it. Called Lenovo, they were no help at all.

    Maybe you can try uninstalling and reinstalling the software.

  • Smart face tracking utility

    On initial installation, there was the face tracking utility. After I downloaded the updates from the Creative Labs web site, smart face tracking was gone. Is it possible to install just the face tracking utility from the installation CD?


  • W530 Face Tracking questions

    The face tracking on my W530 seemed to be a bit slower than in the video showing off the T530's face tracking.
    They should have the same camera, correct?
    Also, when I have the "Virtual Camera Driver" installed, the face tracking is fine, but the camera is stuck at 640x480. When that driver is removed, the camera can be set to 1280x720, but the face tracking and desktop sharing features are lost. Is this a bug or intentional?
    I don't really care about face tracking, but it's a neat feature.
    W530(2436-CTO): i7-3720QM, nVidia Quadro K2000M,16GB RAM, 500 GB hard drive, 128GB mSATA SSD, Ubuntu 14.04 Gnome, Centrino Ultimate-N 6300.
    Yoga 3 Pro: Intel Core-M 5Y70, Intel HD 5300, 8GB RAM, 128GB eMMC, Windows 8.1, Broadcom Wireless 802.11ac.

    Optimus ALWAYS uses the integrated card. When the nVidia card is off, the integrated card drives the display and any running applications that don't use the discrete card. When the nVidia card is triggered by an application, and you aren't using an external monitor, the panel is still driven by the Intel integrated card, but the video output (i.e. what is being displayed) is pumped from the nVidia card through the Intel card. So either way, the Intel card drives the display output for the internal LCD. The video output ports, however (aside from VGA, although on mine that goes through nVidia too), are run by the nVidia discrete graphics.
    W530(2436-CTO): i7-3720QM, nVidia Quadro K2000M,16GB RAM, 500 GB hard drive, 128GB mSATA SSD, Ubuntu 14.04 Gnome, Centrino Ultimate-N 6300.
    Yoga 3 Pro: Intel Core-M 5Y70, Intel HD 5300, 8GB RAM, 128GB eMMC, Windows 8.1, Broadcom Wireless 802.11ac.

  • Face tracking resource files

    The face tracking library uses a number of resource files that account for 43 MB ... is there a way to tell the library to load these files from a location other than the default, or to hook the file loader so the files can be stored in a single, larger WAD file and retrieved upon request?
    Thanks in advance
    V
    Vicente Penades

    If you use a tool like Process Monitor, you can determine which folders it searches for the database file (typically the folder of the .exe, but it can be other system folders, following the search behavior of the Windows LoadLibrary() API).
    Carmine Sirignano - MSFT

  • Face Tracker

    Hi,
    Does the tracking software only track a person's face? Can it track a person walking to and fro (as in the case of a lecturer giving a lecture)? Thanks.
    CS

    cheese2006,
    The Face Tracking feature looks for skin tones and the shape of a head on a torso. It MAY work for a walking lecturer, but it would probably end up watching somebody sitting in front of it. It does not track motion in general, which would be better suited to what you are describing.
    Daniel

  • Face tracking system

    Hey NI support,
    I am unable to track a face with the Vision programming; please suggest a way to do that.
    Thanks,
    Gagandeep
    Gagandeep sharma

    Dear sir,
    Is there any algorithm study for working on images like the one I had seen in this link? Sruti had used a different algorithm to perform the study, but some people do it without colour extraction. What is the difference, and what minimum megapixel and frame-rate camera would you suggest for tracking a live face image?
    Thanks,
    Gagandeep.
    Gagandeep sharma

  • Face tracking in director?

    Hi all,
    I recently came across this video on YouTube:
    http://www.youtube.com/watch?v=wznrHpL8AJ8
    and I started thinking that if we could implement something similar for our Shockwave games, the degree of realism and immersion would increase rapidly. The blog of the person who did this is:
    http://www.ar-lab.info/mt/weblog/archives/2008/02/face_tracking_ui_test.html.
    The sample application that he provides comes with an OCX which on some machines will fail to register correctly (use a tool called Dependency Walker to look for any missing DLL; most of the DLLs are located in the OpenCV bin directory). After registering the OCX successfully I tried to use it in Director, but without success. I am getting an error message that the component fails to load. My best guess is that the OCX misses a GUI. I think if someone can open the OCX, see what's wrong with it and fix it, it will be a valuable addition for the Director community. Any ideas?

    It appears that it would be simpler to do something similar in Director. All that needs to be done is to create an OCX wrapper for the OpenCV library that will output the x, y, z coordinates of the tracked head. You can download the OpenCV library from
    http://sourceforge.net/projects/opencvlibrary/.
    Is anyone interested in such a project? A rough illustration of the idea follows below.
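    As an illustration of the idea only (not the OCX itself), here is a sketch in C# using the OpenCvSharp wrapper for OpenCV; the cascade file path is an assumption, and the z value is just a crude proxy derived from the detected face width:
    using System;
    using OpenCvSharp;

    class HeadTrackerSketch
    {
        static void Main()
        {
            // Haar cascade shipped with OpenCV; adjust the path to your installation.
            var faceCascade = new CascadeClassifier("haarcascade_frontalface_default.xml");
            using var capture = new VideoCapture(0);   // default webcam
            using var frame = new Mat();
            using var gray = new Mat();

            while (capture.Read(frame) && !frame.Empty())
            {
                Cv2.CvtColor(frame, gray, ColorConversionCodes.BGR2GRAY);
                Rect[] faces = faceCascade.DetectMultiScale(gray);
                if (faces.Length > 0)
                {
                    Rect face = faces[0];
                    // x, y: centre of the detected face; z: crude depth proxy (a bigger face means closer).
                    double x = face.X + face.Width / 2.0;
                    double y = face.Y + face.Height / 2.0;
                    double z = 1.0 / face.Width;
                    // A real OCX/wrapper would expose these values to Director instead of printing them.
                    Console.WriteLine($"{x},{y},{z}");
                }
            }
        }
    }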

  • What about face tracking for shockwave games?

    Hi all,
    I recently came across this video on YouTube:
    http://www.youtube.com/watch?v=wznrHpL8AJ8
    and I started thinking that if we could implement something similar for our Shockwave games, the degree of realism and immersion would increase rapidly. The blog of the person who did this is:
    http://www.ar-lab.info/mt/weblog/archives/2008/02/face_tracking_ui_test.html.
    The sample application that he provides comes with an OCX which on some machines will fail to register correctly (use a tool called Dependency Walker to look for any missing DLL; most of the DLLs are located in the OpenCV bin directory). After registering the OCX successfully I tried to use it in Director, but without success. I am getting an error message that the component fails to load. My best guess is that the OCX misses a GUI. I think if someone can open the OCX, see what's wrong with it and fix it, it will be a valuable addition for the Shockwave community. Any ideas?

    Do you have iMessage? I think as long as that person has iMessage you can use Face Time to call each other.

  • Is there a way to get Indices of the HD face mesh tracked

    Hello there,
    I'm programming in C++ and OpenGL.
    I want to get the vertices and indices for the different faces tracked, build my mesh with my own framework and draw it with OpenGL.
    I can get the different vertices in color space (or not).
    I can't figure out how to get the other information of the mesh: indices, triangles, triangle count, ... The IFaceModel type seems to have very few methods.
    Did I miss something?
    Best

    The CalculateVertices method will provide the vertex information aligned with the person. To create the initial mesh, use these two helper methods to build your mesh structure:
    HRESULT WINAPI GetFaceModelTriangleCount(_Out_ UINT32* pTriangleCount);
    HRESULT WINAPI GetFaceModelTriangles(UINT32 capacity, _Out_writes_all_(capacity) UINT32* triangleVertices);
    If you need code, there is a Cinder block (plug-in) that has implemented the COM APIs to build a TriMesh: https://github.com/wieden-kennedy/Cinder-KCB2
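    If you end up calling the managed C# API instead, a rough sketch of the same flow looks like this. It assumes the managed Microsoft.Kinect.Face surface mirrors the COM one, and that faceModel and faceAlignment are the FaceModel and FaceAlignment you keep alongside your HighDefinitionFaceFrameReader:
    // Topology is fixed for every tracked face, so fetch the triangle indices once.
    IReadOnlyList<uint> triangleIndices = FaceModel.TriangleIndices;

    // Per frame, after GetAndRefreshFaceAlignmentResult(faceAlignment) has run,
    // compute the deformed vertex positions for the current face.
    IReadOnlyList<CameraSpacePoint> vertices = faceModel.CalculateVerticesForAlignment(faceAlignment);

    // Every three consecutive entries in triangleIndices index into 'vertices' and
    // describe one triangle, which is all an indexed mesh (e.g. for OpenGL) needs.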
    Carmine Sirignano - MSFT

  • How to track face and allocate lips in face using labview

    My final year project is "Text Input System developed by Lips Image Recognition based on Labview for Serious Disabled".
    In this, an image of the person's face will be acquired by a CCD camera. It then has stages such as face tracking, lip area allocation and extraction, and further processing.
    The status of mouth-open or mouth-closed will then be acquired in binary format as 1 and 0 respectively. This information will be given to a MORSE CODE TEXT INPUT SYSTEM which will convert Morse code into English text. I am having problems developing the program to track the face and allocate the lip area. Kindly help.
    Attachments:
    Lips Image Recognition.pdf ‏808 KB

    OK sir... I have developed the first program and it contains an error... and I am not able to identify it...
    Attachments:
    Extraction of color planes.vi ‏163 KB
    Changes made in program.vi ‏111 KB

  • How to track face and allocate lips in the face using labview

    My final year project is "Text Input System developed by Lips Image Recognition based on Labview for Serious Disabled".
    In this, an image of the person's face will be acquired by a CCD camera. It then has stages such as face tracking, lip area allocation and extraction, and further processing.
    The status of mouth-open or mouth-closed will then be acquired in binary format as 1 and 0 respectively. This information will be given to a MORSE CODE TEXT INPUT SYSTEM which will convert Morse code into English text. I am having problems developing the program to track the face and allocate the lip area. Kindly help.
    Attachments:
    Lips Image Recognition.pdf ‏808 KB

    You would probably get more useful responses by posting in the Machine Vision board.
    Lynn
