[unity] ProduceFaceModel Microsoft_Kinect_Face_FaceModelData_Delegate_Success

  • Question

  • I am trying to use ProduceFaceModel with the Unity 1409 SDK. Unfortunately, Microsoft_Kinect_Face_FaceModelData_Delegate_Success is not called in KinectFaceSpecialCases.cs.

    I tried to pose this question in another thread to keep everything in one place, but I fear it is being missed by our dear admins, so I will ask again in this new thread.

    I start the face capture like so

    private void StartCapture()
    {
        this.faceModelBuilder = null;
        this.faceModelBuilder = this.highDefinitionFaceFrameSource.OpenModelBuilder(FaceModelBuilderAttributes.None);
        faceModelBuilder.CollectFaceDataAsync(CollectFaceDataComplete, CollectFaceDataFailed);
    }

    After capturing all the necessary views, the capture status becomes 7. This may indicate some sort of system error, but it's not clear what. The failure delegate is not called at this point either.

    Stopping the capture manually does call the failure delegate.

    I have also tried to implement my own GetFaceData method, but Unity crashes immediately when I try to call ProduceFaceModel on the return value.

    public void GetFaceData(Action<Microsoft.Kinect.Face.FaceModelData> callback)
            {
                var faceModelData = Helper.NativeObjectCache.CreateOrGetObject<Microsoft.Kinect.Face.FaceModelData>(_pNative, n => new Microsoft.Kinect.Face.FaceModelData(n));
                Helper.EventPump.Instance.Enqueue(() => callback(faceModelData));
            }

    Can we get a working Unity sample that uses ProduceFaceModel?

    I can provide my sample project if it will help.

    Thanks!




    • Edited by PandaChuZero Tuesday, January 13, 2015 6:32 PM clarity
    Tuesday, January 13, 2015 6:13 PM

Answers

  • It looks like there may be a bug, and I am not sure this can be corrected with a workaround. There is no ETA on a fix.

    If you are pressed for an immediate fix, the only workaround would be to create your own C++ wrapper around the COM Face library and expose that to Unity. Remember that the visualization of the mesh should only be used as a debugging aid and is not supported in production. You should be able to get the information you need from the "mean"/average face model to manipulate any facial animation you may need for the character mesh you are working with.


    Carmine Sirignano - MSFT

    • Marked as answer by PandaChuZero Friday, January 23, 2015 9:41 PM
    Friday, January 23, 2015 6:40 PM

All replies

  • You have everything correct; I am just not sure how you are checking the status of the builder. There are two properties to check: CaptureStatus and CollectionStatus.

    if(faceModelBuilder != null)
    {
        string newStatus = string.Empty;
    
        var captureStatus = faceModelBuilder.CaptureStatus;
        newStatus += captureStatus.ToString();
        
        var collectionStatus = faceModelBuilder.CollectionStatus;
        newStatus += ", " + GetCollectionStatusText(collectionStatus);
    
        Debug.Log(newStatus);
    }
    
    private static string GetCollectionStatusText(FaceModelBuilderCollectionStatus status)
    {
        string res = string.Empty;

        if ((status & FaceModelBuilderCollectionStatus.FrontViewFramesNeeded) != 0)
        {
            res = "FrontViewFramesNeeded";
            return res;
        }

        if ((status & FaceModelBuilderCollectionStatus.LeftViewsNeeded) != 0)
        {
            res = "LeftViewsNeeded";
            return res;
        }

        if ((status & FaceModelBuilderCollectionStatus.RightViewsNeeded) != 0)
        {
            res = "RightViewsNeeded";
            return res;
        }

        if ((status & FaceModelBuilderCollectionStatus.TiltedUpViewsNeeded) != 0)
        {
            res = "TiltedUpViewsNeeded";
            return res;
        }

        if ((status & FaceModelBuilderCollectionStatus.MoreFramesNeeded) != 0)
        {
            res = "MoreFramesNeeded";
            return res;
        }

        if (status == FaceModelBuilderCollectionStatus.Complete)
        {
            res = "Complete";
            return res;
        }

        return res;
    }

    Take note that CollectionStatus is a flags value, so you need to check each bit. You will want to have this function available in your manager class; that way you can continuously query it from another game object that handles the UI and displays the status on screen, as sketched below.
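
    For example, a minimal sketch of such a UI object, assuming your manager exposes the combined status text through a small interface (the interface, property, and class names here are placeholders of my own, not SDK types):

    using UnityEngine;
    using UnityEngine.UI;

    // Sketch only: the manager that owns the FaceModelBuilder implements this
    // hypothetical interface and rebuilds the status string from CaptureStatus
    // and CollectionStatus (as in the snippet above).
    public interface IBuilderStatusProvider
    {
        string BuilderStatus { get; }
    }

    public class BuilderStatusDisplay : MonoBehaviour
    {
        public MonoBehaviour statusProvider;   // assign the manager object that implements IBuilderStatusProvider
        public Text statusLabel;               // on-screen UI text element

        void Update()
        {
            var provider = statusProvider as IBuilderStatusProvider;
            if (provider != null && statusLabel != null)
            {
                // Re-query every frame and push the latest status to the screen.
                statusLabel.text = provider.BuilderStatus;
            }
        }
    }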


    Carmine Sirignano - MSFT

    Wednesday, January 14, 2015 8:06 PM
  • Thanks so much for the reply.

    I actually used the HDFaceBasics-WPF sample and converted it to Unity. I'm already using those exact methods for checking the collection status.

    Can you confirm that the delegates are called properly in your own test case with Unity?

    Here is my entire test class.

    using UnityEngine;
    using System.Collections;
    using System;
    using System.ComponentModel;
    using Windows.Kinect;
    using Microsoft.Kinect.Face;
    
    public class FaceMeshController : MonoBehaviour
    {
        public ModelManager modelManager;
    
        /// <summary>
        /// Currently used KinectSensor
        /// </summary>
        private KinectSensor sensor = null;
    
        /// <summary>
        /// Body frame source to get a BodyFrameReader
        /// </summary>
        private BodyFrameSource bodySource = null;
    
        /// <summary>
        /// Body frame reader to get body frames
        /// </summary>
        private BodyFrameReader bodyReader = null;
    
        /// <summary>
        /// HighDefinitionFaceFrameSource to get a reader and a builder from.
        /// Also to set the currently tracked user id to get High Definition Face Frames of
        /// </summary>
        private HighDefinitionFaceFrameSource highDefinitionFaceFrameSource = null;
    
        /// <summary>
        /// HighDefinitionFaceFrameReader to read HighDefinitionFaceFrame to get FaceAlignment
        /// </summary>
        private HighDefinitionFaceFrameReader highDefinitionFaceFrameReader = null;
    
        /// <summary>
        /// FaceAlignment is the result of tracking a face, it has face animations location and orientation
        /// </summary>
        private FaceAlignment currentFaceAlignment = null;
    
        /// <summary>
        /// FaceModel is a result of capturing a face
        /// </summary>
        private FaceModel currentFaceModel = null;
    
        /// <summary>
        /// FaceModelBuilder is used to produce a FaceModel
        /// </summary>
        private FaceModelBuilder faceModelBuilder = null;
    
        /// <summary>
        /// The currently tracked body
        /// </summary>
        private Body currentTrackedBody = null;
    
        /// <summary>
        /// The currently tracked body
        /// </summary>
        private ulong currentTrackingId = 0;
    
        /// <summary>
        /// Gets or sets the current tracked user id
        /// </summary>
        private string currentBuilderStatus = string.Empty;
    
        /// <summary>
        /// Gets or sets the current status text to display
        /// </summary>
        private string statusText = "Ready To Start Capture";
    
        private Mesh theGeometry;
    
    
        /// <summary>
        /// Unity Start: initializes the mesh and the HD Face pipeline
        /// </summary>
        void Start()
        {
            theGeometry = new Mesh();
            this.InitializeHDFace();
            SetViewCollectionStatus();
        }
    
        /// <summary>
        /// INotifyPropertyChangedPropertyChanged event to allow window controls to bind to changeable data
        /// </summary>
        public event PropertyChangedEventHandler PropertyChanged;
    
        /// <summary>
        /// Gets or sets the current status text to display
        /// </summary>
        public string StatusText
        {
            get
            {
                return this.statusText;
            }
    
            set
            {
                if (this.statusText != value)
                {
                    this.statusText = value;
    
                    // notify any bound elements that the text has changed
                    if (this.PropertyChanged != null)
                    {
                        this.PropertyChanged(this, new PropertyChangedEventArgs("StatusText"));
                    }
                }
            }
        }
    
        /// <summary>
        /// Gets or sets the current tracked user id
        /// </summary>
        private ulong CurrentTrackingId
        {
            get
            {
                return this.currentTrackingId;
            }
    
            set
            {
                this.currentTrackingId = value;
    
                this.StatusText = this.MakeStatusText();
            }
        }
    
        /// <summary>
        /// Gets or sets the current Face Builder instructions to user
        /// </summary>
        private string CurrentBuilderStatus
        {
            get
            {
                return this.currentBuilderStatus;
            }
    
            set
            {
                this.currentBuilderStatus = value;
    
                this.StatusText = this.MakeStatusText();
            }
        }
    
        /// <summary>
        /// Called when disposed of
        /// </summary>
        public void Dispose()
        {
            this.Dispose(true);
            GC.SuppressFinalize(this);
        }
    
        /// <summary>
        /// Dispose based on whether or not managed or native resources should be freed
        /// </summary>
        /// <param name="disposing">Set to true to free both native and managed resources, false otherwise</param>
        protected virtual void Dispose(bool disposing)
        {
            if (disposing)
            {
                if (this.currentFaceModel != null)
                {
                    this.currentFaceModel.Dispose();
                    this.currentFaceModel = null;
                }
            }
        }
    
        /// <summary>
        /// Returns the length of a vector from origin
        /// </summary>
        /// <param name="point">Point in space to find it's distance from origin</param>
        /// <returns>Distance from origin</returns>
        private static double VectorLength(CameraSpacePoint point)
        {
            var result = Math.Pow(point.X, 2) + Math.Pow(point.Y, 2) + Math.Pow(point.Z, 2);
    
            result = Math.Sqrt(result);
    
            return result;
        }
    
        /// <summary>
        /// Finds the closest body from the sensor if any
        /// </summary>
        /// <param name="bodyFrame">A body frame</param>
        /// <returns>Closest body, null if none</returns>
        private static Body FindClosestBody(BodyFrame bodyFrame)
        {
            Body result = null;
            double closestBodyDistance = double.MaxValue;
    
            Body[] bodies = new Body[bodyFrame.BodyCount];
            bodyFrame.GetAndRefreshBodyData(bodies);
    
            foreach (var body in bodies)
            {
                if (body.IsTracked)
                {
                    var currentLocation = body.Joints[JointType.SpineBase].Position;
    
                    var currentDistance = VectorLength(currentLocation);
    
                    if (result == null || currentDistance < closestBodyDistance)
                    {
                        result = body;
                        closestBodyDistance = currentDistance;
                    }
                }
            }
    
            return result;
        }
    
        /// <summary>
        /// Find if there is a body tracked with the given trackingId
        /// </summary>
        /// <param name="bodyFrame">A body frame</param>
        /// <param name="trackingId">The tracking Id</param>
        /// <returns>The body object, null if none</returns>
        private static Body FindBodyWithTrackingId(BodyFrame bodyFrame, ulong trackingId)
        {
            Body result = null;
    
            Body[] bodies = new Body[bodyFrame.BodyCount];
            bodyFrame.GetAndRefreshBodyData(bodies);
    
            foreach (var body in bodies)
            {
                if (body.IsTracked)
                {
                    if (body.TrackingId == trackingId)
                    {
                        result = body;
                        break;
                    }
                }
            }
    
            return result;
        }
    
        /// <summary>
        /// Gets the current collection status
        /// </summary>
        /// <param name="status">Status value</param>
        /// <returns>Status value as text</returns>
        private static string GetCollectionStatusText(FaceModelBuilderCollectionStatus status)
        {
            string res = string.Empty;
    
            if ((status & FaceModelBuilderCollectionStatus.FrontViewFramesNeeded) != 0)
            {
                res = "FrontViewFramesNeeded";
                return res;
            }
    
            if ((status & FaceModelBuilderCollectionStatus.LeftViewsNeeded) != 0)
            {
                res = "LeftViewsNeeded";
                return res;
            }
    
            if ((status & FaceModelBuilderCollectionStatus.RightViewsNeeded) != 0)
            {
                res = "RightViewsNeeded";
                return res;
            }
    
            if ((status & FaceModelBuilderCollectionStatus.TiltedUpViewsNeeded) != 0)
            {
                res = "TiltedUpViewsNeeded";
                return res;
            }
    
            if (status == FaceModelBuilderCollectionStatus.Complete)
            {
                res = "Complete";
                return res;
            }
    
            if ((status & FaceModelBuilderCollectionStatus.MoreFramesNeeded) != 0)
            {
                res = "TiltedUpViewsNeeded";
                return res;
            }
    
            return res;
        }
    
        /// <summary>
        /// Helper function to format a status message
        /// </summary>
        /// <returns>Status text</returns>
        private string MakeStatusText()
        {
            string status = string.Format(System.Globalization.CultureInfo.CurrentCulture, "Builder Status: {0}, Current Tracking ID: {1}", this.CurrentBuilderStatus, this.CurrentTrackingId);
            return status;
        }
    
        /// <summary>
        /// Initialize Kinect object
        /// </summary>
        private void InitializeHDFace()
        {
            this.CurrentBuilderStatus = "Ready To Start Capture";
    
            this.sensor = KinectSensor.GetDefault();
            this.bodySource = this.sensor.BodyFrameSource;
            this.bodyReader = this.bodySource.OpenReader();
            this.bodyReader.FrameArrived += this.BodyReader_FrameArrived;
    
            this.highDefinitionFaceFrameSource = HighDefinitionFaceFrameSource.Create(sensor);
            this.highDefinitionFaceFrameSource.TrackingIdLost += this.HdFaceSource_TrackingIdLost;
    
            this.highDefinitionFaceFrameReader = this.highDefinitionFaceFrameSource.OpenReader();
            this.highDefinitionFaceFrameReader.FrameArrived += this.HdFaceReader_FrameArrived;
    
            this.currentFaceModel = FaceModel.Create();
            this.currentFaceAlignment = FaceAlignment.Create();
    
            this.InitializeMesh(modelManager.FaceMesh, theGeometry);
            this.UpdateMesh();
    
            this.sensor.Open();
        }
    
    
        /// <summary>
        /// Initializes a 3D mesh to deform every frame
        /// </summary>
        private void InitializeMesh(MeshFilter faceMeshFilter, Mesh faceMesh)
        {
            faceMeshFilter.mesh.Clear();
    
            var vertices = this.currentFaceModel.CalculateVerticesForAlignment(this.currentFaceAlignment);
    
            Debug.Log("FaceModel.TriangleCount " + FaceModel.TriangleCount);
    
            var triangleIndices = FaceModel.TriangleIndices;
            var indices = new int[triangleIndices.Count];
    
            Vector3[] newVerts = new Vector3[vertices.Count];
            Vector2[] newUV = new Vector2[vertices.Count];
            Vector3[] newNormals = new Vector3[vertices.Count];
    
            for (int i = 0; i < vertices.Count; ++i)
            {
                newVerts[i] = new Vector3(vertices[i].X, vertices[i].Y, vertices[i].Z);
                newNormals[i] = -Vector3.forward;
            }
    
            faceMesh.vertices = newVerts;
            faceMesh.uv = newUV;
    
            for (int i = 0; i < triangleIndices.Count; i += 3)
            {
                uint index01 = triangleIndices[i];
                uint index02 = triangleIndices[i + 1];
                uint index03 = triangleIndices[i + 2];
    
                indices[i] = (int)index01;
                indices[i + 1] = (int)index02;
                indices[i + 2] = (int)index03;
    
                // Vertex indices must fall inside the vertex array, so compare against vertices.Count
                if (indices[i] >= vertices.Count || indices[i] < 0)
                {
                    Debug.Log("Out of Range index i " + i + " = " + indices[i]);
                }
                if (indices[i + 1] >= vertices.Count || indices[i + 1] < 0)
                {
                    Debug.Log("Out of Range index i+1 " + (i + 1) + " = " + indices[i + 1]);
                }
                if (indices[i + 2] >= vertices.Count || indices[i + 2] < 0)
                {
                    Debug.Log("Out of Range index i+2 " + (i + 2) + " = " + indices[i + 2]);
                }
            }
    
            faceMesh.triangles = indices;
            faceMesh.RecalculateNormals();
            faceMeshFilter.mesh = faceMesh;
        }
    
        /// <summary>
        /// Start a face capture on clicking the button
        /// </summary>
        /// <param name="sender">object sending the event</param>
        /// <param name="e">event arguments</param>
        public void StartCapture_Button_Click()
        {
            this.StartCapture();
        }
    
        public void StopCapture_Button_Click()
        {
            //CollectFaceDataComplete(faceModelBuilder.GetFaceData());
            //faceModelBuilder.GetFaceData(CollectFaceDataComplete);
            StopFaceCapture();
        }
    
        /// <summary>
        /// This event fires when a BodyFrame is ready for consumption
        /// </summary>
        /// <param name="sender">object sending the event</param>
        /// <param name="e">event arguments</param>
        private void BodyReader_FrameArrived(object sender, BodyFrameArrivedEventArgs e)
        {
            this.CheckOnBuilderStatus();
    
            var frameReference = e.FrameReference;
            using (var frame = frameReference.AcquireFrame())
            {
                if (frame == null)
                {
                    // We might miss the chance to acquire the frame, it will be null if it's missed
                    return;
                }
    
                if (this.currentTrackedBody != null)
                {
                    this.currentTrackedBody = FindBodyWithTrackingId(frame, this.CurrentTrackingId);
    
                    if (this.currentTrackedBody != null)
                    {
                        return;
                    }
                }
    
                Body selectedBody = FindClosestBody(frame);
    
                if (selectedBody == null)
                {
                    return;
                }
    
                this.currentTrackedBody = selectedBody;
                this.CurrentTrackingId = selectedBody.TrackingId;
    
                this.highDefinitionFaceFrameSource.TrackingId = this.CurrentTrackingId;
            }
        }
    
        /// <summary>
        /// This event is fired when a tracking is lost for a body tracked by HDFace Tracker
        /// </summary>
        /// <param name="sender">object sending the event</param>
        /// <param name="e">event arguments</param>
        private void HdFaceSource_TrackingIdLost(object sender, TrackingIdLostEventArgs e)
        {
            var lostTrackingID = e.TrackingId;
    
            if (this.CurrentTrackingId == lostTrackingID)
            {
                this.CurrentTrackingId = 0;
                this.currentTrackedBody = null;
                if (this.faceModelBuilder != null)
                {
                    this.faceModelBuilder.Dispose();
                    this.faceModelBuilder = null;
                }
    
                this.highDefinitionFaceFrameSource.TrackingId = 0;
            }
        }
    
        /// <summary>
        /// This event is fired when a new HDFace frame is ready for consumption
        /// </summary>
        /// <param name="sender">object sending the event</param>
        /// <param name="e">event arguments</param>
        private void HdFaceReader_FrameArrived(object sender, HighDefinitionFaceFrameArrivedEventArgs e)
        {
            using (var frame = e.FrameReference.AcquireFrame())
            {
                // We might miss the chance to acquire the frame; it will be null if it's missed.
                // Also ignore this frame if face tracking failed.
                if (frame == null || !frame.IsFaceTracked)
                {
                    return;
                }
    
                frame.GetAndRefreshFaceAlignmentResult(this.currentFaceAlignment);
                this.UpdateMesh();
            }
        }
    
        /// <summary>
        /// Sends the new deformed mesh to be drawn
        /// </summary>
        private void UpdateMesh()
        {
            var vertices = this.currentFaceModel.CalculateVerticesForAlignment(this.currentFaceAlignment);
    
            //currentFaceModel.FaceShapeDeformations[FaceShapeDeformations.Eyes00]
    
            Vector3[] newVerts = new Vector3[vertices.Count];
    
            for (int i = 0; i < vertices.Count; ++i)
            {
                newVerts[i] = new Vector3(vertices[i].X, vertices[i].Y, vertices[i].Z);
            }
    
            theGeometry.vertices = newVerts;
        }
    
        /// <summary>
        /// Start a face capture operation
        /// </summary>
        private void StartCapture()
        {
            Debug.Log("Capture Started");
            this.StopFaceCapture();
            this.faceModelBuilder = null;
            this.faceModelBuilder = this.highDefinitionFaceFrameSource.OpenModelBuilder(FaceModelBuilderAttributes.None);
            faceModelBuilder.CollectFaceDataAsync(CollectFaceDataComplete, CollectFaceDataFailed);
        }
    
        private void CollectFaceDataFailed(int obj)
        {
            Debug.Log("Collect Face Data Failed");
        }
    
        private void CollectFaceDataComplete(FaceModelData obj)
        {
            Debug.Log("Collect Face Data Completed");
            Debug.Log("FaceModel data is null : " + (obj == null));
            Debug.Log("Obj : " + obj.ToString());
            //var modelData = obj;
    
            //this.currentFaceModel = obj.ProduceFaceModel();
    
            //this.faceModelBuilder.Dispose();
            //this.faceModelBuilder = null;
    
            //this.CurrentBuilderStatus = "Capture Complete";
        }
    
        /// <summary>
        /// Cancel the current face capture operation
        /// </summary>
        private void StopFaceCapture()
        {
            if (this.faceModelBuilder != null)
            {
                this.faceModelBuilder.Dispose();
                this.faceModelBuilder = null;
            }
        }
    
        /// <summary>
        /// Check the face model builder status
        /// </summary>
        private void CheckOnBuilderStatus()
        {
            if (this.faceModelBuilder == null)
            {
                return;
            }
    
            string newStatus = string.Empty;
    
            var captureStatus = this.faceModelBuilder.CaptureStatus;
            newStatus += captureStatus.ToString();
    
            var collectionStatus = this.faceModelBuilder.CollectionStatus;
    
            newStatus += ", " + GetCollectionStatusText(collectionStatus);
    
            this.CurrentBuilderStatus = newStatus;
    
            modelManager.CaptureStatus.text = this.CurrentBuilderStatus;
    
            SetViewCollectionStatus();
        }
    
        private void SetViewCollectionStatus()
        {
            if (faceModelBuilder == null) return;
            if (faceModelBuilder.CollectionStatus == FaceModelBuilderCollectionStatus.MoreFramesNeeded) return;
    
            if (faceModelBuilder.CollectionStatus == FaceModelBuilderCollectionStatus.Complete)
            {
                modelManager.FrontView.isOn = true;
                modelManager.LeftView.isOn = true;
                modelManager.RightView.isOn = true;
                modelManager.TiltedUpView.isOn = true;
                return;
            }
    
            if (IsFlagSet(FaceModelBuilderCollectionStatus.FrontViewFramesNeeded))
                modelManager.FrontView.isOn = false;
            else
                modelManager.FrontView.isOn = true;
    
            if (IsFlagSet(FaceModelBuilderCollectionStatus.LeftViewsNeeded))
                modelManager.LeftView.isOn = false;
            else
                modelManager.LeftView.isOn = true;
    
            if (IsFlagSet(FaceModelBuilderCollectionStatus.RightViewsNeeded))
                modelManager.RightView.isOn = false;
            else
                modelManager.RightView.isOn = true;
    
            if (IsFlagSet(FaceModelBuilderCollectionStatus.TiltedUpViewsNeeded))
                modelManager.TiltedUpView.isOn = false;
            else
                modelManager.TiltedUpView.isOn = true;
        }
    
        /* check if a flag is enabled */
        public bool IsFlagSet(FaceModelBuilderCollectionStatus flag)
        {
            return (faceModelBuilder.CollectionStatus & flag) != 0;
        }
    }



    • Edited by PandaChuZero Wednesday, January 14, 2015 8:41 PM
    Wednesday, January 14, 2015 8:38 PM
  • That is exactly the code we have. Are you having issues with the regular samples as well or just with Unity?

    Carmine Sirignano - MSFT

    Thursday, January 15, 2015 11:35 PM
  • I am only having trouble with the Unity version. The WPF sample works perfectly (and it's awesome).
    Friday, January 16, 2015 12:03 AM
  • I am still unable to resolve this issue. Am I the only one having this difficulty?
    Tuesday, January 20, 2015 4:08 PM
  • I just want to confirm with you that you are getting a builder status of 7, as you stated in your original post. A builder status of 7 is Complete, so that is not an error; the builder has all the views it needs. The only remaining step is to recreate the mesh from the currentFaceModel.


    Carmine Sirignano - MSFT

    Tuesday, January 20, 2015 7:13 PM
  • I can confirm that I get

    Capture Status : 7

    Collection Status : Complete

    I'm starting to get confused. I thought I needed to call CollectFaceDataAsync and wait for the completion delegate to be called, which should happen automatically when the collection status is complete. Will currentFaceModel have all the data in it without needing to call CollectFaceDataAsync, or should I call CollectFaceDataAsync once the collection status is complete?

    I start the capture process with these two lines at the same time; is this correct?

    this.faceModelBuilder = this.highDefinitionFaceFrameSource.OpenModelBuilder(FaceModelBuilderAttributes.None);
    faceModelBuilder.CollectFaceDataAsync(CollectFaceDataComplete, CollectFaceDataFailed);


    Wednesday, January 21, 2015 4:22 AM
  • currentFaceModel will not get populated until you have called ProduceFaceModel on the modelData object from the event arguments.

    private void HdFaceBuilder_CollectionCompleted(object sender, FaceModelBuilderCollectionCompletedEventArgs e)
    {
        var modelData = e.ModelData;
                
        this.currentFaceModel = modelData.ProduceFaceModel();
    
        this.faceModelBuilder.Dispose();
    
        this.faceModelBuilder = null;
    }
    From there, you need to update your mesh with the new vertex data in that model.
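
    In the Unity plugin, the equivalent step would go in the success callback passed to CollectFaceDataAsync. A sketch, based on the commented-out lines in the class posted above (it only helps once the callback actually fires, which is what is failing in this thread):

    private void CollectFaceDataComplete(Microsoft.Kinect.Face.FaceModelData modelData)
    {
        if (modelData == null)
        {
            return;
        }

        // Replace the default (average) model with the fitted one, then release the builder.
        this.currentFaceModel = modelData.ProduceFaceModel();

        this.faceModelBuilder.Dispose();
        this.faceModelBuilder = null;

        this.CurrentBuilderStatus = "Capture Complete";

        // The next HdFaceReader_FrameArrived / UpdateMesh call now uses the captured geometry.
    }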


    Carmine Sirignano - MSFT


    Wednesday, January 21, 2015 8:04 PM
  • Thank you again for the reply,

    I understand the need to create the FaceModel when the CollectionCompleted delegate is called back.

    The problem is that the CollectionCompleted delegate (the CollectFaceDataComplete callback in my Unity code) is never invoked after the collection status becomes complete.


    • Edited by PandaChuZero Wednesday, January 21, 2015 8:49 PM
    Wednesday, January 21, 2015 8:47 PM
  • It looks like there may be a bug, and I am not sure this can be corrected with a workaround. There is no ETA on a fix.

    If you are pressed for an immediate fix, the only workaround would be to create your own C++ wrapper around the COM Face library and expose that to Unity. Remember that the visualization of the mesh should only be used as a debugging aid and is not supported in production. You should be able to get the information you need from the "mean"/average face model to manipulate any facial animation you may need for the character mesh you are working with.
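
    For reference, a minimal sketch of what the managed side of such a wrapper could look like. Everything below is an assumption: the DLL name and the exported functions are placeholders for whatever interface you design yourself around the native Face API; none of it ships with the SDK.

    using System.Runtime.InteropServices;

    // Sketch only: hypothetical exports from a native wrapper DLL that you would
    // write around the COM Face builder (collect frames, report status, fit the model).
    public static class NativeHdFace
    {
        [DllImport("KinectFaceWrapper")]                 // hypothetical wrapper DLL name
        public static extern int StartModelBuilder(ulong trackingId);

        [DllImport("KinectFaceWrapper")]
        public static extern int GetCollectionStatus();  // mirrors FaceModelBuilderCollectionStatus on the native side

        [DllImport("KinectFaceWrapper")]
        public static extern int ProduceFaceModel([Out] float[] shapeUnits, int shapeUnitCount);
    }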


    Carmine Sirignano - MSFT

    • Marked as answer by PandaChuZero Friday, January 23, 2015 9:41 PM
    Friday, January 23, 2015 6:40 PM
  • Excellent - as long as it is a known bug and I'm not going crazy, I can live with that.

    I will look forward to the next release.

    Friday, January 23, 2015 9:43 PM
  • After half a year, this bug still exists!

    Bummer! 

    What are you guys doing?

    Tuesday, August 4, 2015 5:58 PM
  • After 4 years this bug still exists!

    Haha.
    Wednesday, June 5, 2019 8:37 PM
  • The last update of the SDK was in 2014. It wasn't fixed while the sensor was still on the market, so don't expect a fix now that the Kinect v2 has been discontinued.

    All efforts now are going to the Azure Kinect and HoloLens 2. So do yourself a favor and either find a workaround (either using the Kinect v2 SDK or using a different sensor altogether for the face, depending on your needs) or wait for the Azure Kinect (though I don't know whether it has a face module).

    Thursday, June 6, 2019 3:53 PM