EEC-492/592 Kinect Application Development
Lecture 15
Wenbing Zhao
[email protected]



Outline
- Approaches to gesture recognition
  - Rule based
    - Single pose based (this lecture)
    - Multiple poses based
  - Machine learning based

Gesture Recognition Engine
The recognition engine typically performs the following tasks:
- It accepts user actions in the form of skeleton data
- It matches the data points with predefined logic for a specific gesture
- It executes actions if the gesture is recognized
- It responds to the users

Gesture Recognition Engine

Recognizing Two Gestures

Gesture Recognition Engine
Key types:
- GestureType
- RecognitionResult
- GestureEventArgs

public enum GestureType
{
    HandsClapping,
    TwoHandsRaised
}

public enum RecognitionResult
{
    Unknown,
    Failed,
    Success
}

public class GestureEventArgs : EventArgs
{
    public GestureType gsType { get; internal set; }
    public RecognitionResult Result { get; internal set; }

    public GestureEventArgs(GestureType t, RecognitionResult result)
    {
        this.Result = result;
        this.gsType = t;
    }
}

Gesture Recognition Engine

public class GestureRecognitionEngine
{
    public GestureRecognitionEngine() { }

    public event EventHandler<GestureEventArgs> GestureRecognized;
    public Skeleton Skeleton { get; set; }
    public GestureType GestureType { get; set; }

    public void StartRecognize(GestureType t)
    {
        this.GestureType = t;
        switch (t)
        {
            case GestureType.HandsClapping:
                this.MatchHandClappingGesture(this.Skeleton);
                break;
            case GestureType.TwoHandsRaised:
                this.MatchTwoHandsRaisedGesture(this.Skeleton);
                break;
            default:
                break;
        }
    }

Gesture Recognition Engine

float previousDistance = 0.0f;

private void MatchHandClappingGesture(Skeleton skeleton)
{
    if (skeleton == null)
    {
        return;
    }
    if (skeleton.Joints[JointType.HandRight].TrackingState == JointTrackingState.Tracked &&
        skeleton.Joints[JointType.HandLeft].TrackingState == JointTrackingState.Tracked)
    {
        float currentDistance = GetJointDistance(skeleton.Joints[JointType.HandRight],
            skeleton.Joints[JointType.HandLeft]);
        // Fire only on the frame where the hand distance crosses below the threshold
        if (currentDistance < 0.1f && previousDistance > 0.1f)
        {
            if (this.GestureRecognized != null)
            {
                this.GestureRecognized(this,
                    new GestureEventArgs(GestureType.HandsClapping, RecognitionResult.Success));
            }
        }
        previousDistance = currentDistance;
    }
}
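The threshold-crossing test in MatchHandClappingGesture can be exercised without a sensor. A minimal sketch, assuming a hypothetical ClapDetector class (not part of the lecture code) that feeds a sequence of hand distances through the same falling-edge logic:

```csharp
using System;
using System.Collections.Generic;

// Sketch of the clap-detection logic from the slide, decoupled from the
// Kinect SDK so it can be driven with plain numbers. ClapDetector and
// ClapDemo are hypothetical names introduced for illustration only.
public class ClapDetector
{
    float previousDistance = 0.0f;
    const float Threshold = 0.1f; // meters, same value as the slide

    // Returns true only on the frame where the distance first drops
    // below the threshold, mirroring the slide's edge detection.
    public bool Update(float currentDistance)
    {
        bool clapped = currentDistance < Threshold && previousDistance > Threshold;
        previousDistance = currentDistance;
        return clapped;
    }
}

public static class ClapDemo
{
    // Counts how many claps occur in a stream of per-frame hand distances.
    public static int CountClaps(IEnumerable<float> distances)
    {
        var detector = new ClapDetector();
        int claps = 0;
        foreach (float d in distances)
            if (detector.Update(d)) claps++;
        return claps;
    }
}
```

Feeding the distances 0.5, 0.3, 0.05, 0.05, 0.4, 0.08 yields two claps: only downward crossings count, so the repeated 0.05 frame does not fire twice. This is why the slide stores previousDistance rather than testing currentDistance alone.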

Gesture Recognition Engine

private void MatchTwoHandsRaisedGesture(Skeleton skeleton)
{
    if (skeleton == null)
    {
        return;
    }
    float threshold = 0.3f;
    if (skeleton.Joints[JointType.HandRight].Position.Y >
            skeleton.Joints[JointType.Head].Position.Y + threshold &&
        skeleton.Joints[JointType.HandLeft].Position.Y >
            skeleton.Joints[JointType.Head].Position.Y + threshold)
    {
        if (this.GestureRecognized != null)
        {
            this.GestureRecognized(this,
                new GestureEventArgs(GestureType.TwoHandsRaised, RecognitionResult.Success));
        }
    }
}
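The height test reduces to comparing joint Y coordinates. A sketch with a hypothetical TwoHandsRaised helper (not in the lecture code) that isolates that comparison:

```csharp
using System;

public static class PoseChecks
{
    // Hypothetical helper mirroring the slide's test: both hands must be
    // at least `threshold` meters above the head joint's Y coordinate.
    public static bool TwoHandsRaised(float rightHandY, float leftHandY,
                                      float headY, float threshold = 0.3f)
    {
        return rightHandY > headY + threshold && leftHandY > headY + threshold;
    }
}
```

Note that, unlike the clap gesture, this check fires on every frame while the pose is held; an engine that should report the pose only once would need to latch the result the way previousDistance does for clapping.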

Gesture Recognition Engine

private float GetJointDistance(Joint firstJoint, Joint secondJoint)
{
    float distanceX = firstJoint.Position.X - secondJoint.Position.X;
    float distanceY = firstJoint.Position.Y - secondJoint.Position.Y;
    float distanceZ = firstJoint.Position.Z - secondJoint.Position.Z;
    return (float)Math.Sqrt(Math.Pow(distanceX, 2) + Math.Pow(distanceY, 2) +
        Math.Pow(distanceZ, 2));
}
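The same Euclidean distance can be checked with plain numbers. A sketch using a hypothetical Distance3D helper standing in for GetJointDistance (since the Joint type requires the Kinect SDK):

```csharp
using System;

public static class Geometry
{
    // Straight-line distance between two 3D points, the same computation
    // GetJointDistance performs on joint positions.
    public static float Distance3D(float x1, float y1, float z1,
                                   float x2, float y2, float z2)
    {
        float dx = x1 - x2, dy = y1 - y2, dz = z1 - z2;
        return (float)Math.Sqrt(dx * dx + dy * dy + dz * dz);
    }
}
```

For example, the points (0, 0, 0) and (1, 2, 2) are 3 apart, since sqrt(1 + 4 + 4) = 3. Joint positions are in meters, so the 0.1f clap threshold corresponds to hands within 10 cm of each other.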

Vectors, Dot Product, Angles

Vector3 seg;
seg.X = Joint1.Position.X - Joint2.Position.X;
seg.Y = Joint1.Position.Y - Joint2.Position.Y;
seg.Z = Joint1.Position.Z - Joint2.Position.Z;

Body segments can be represented as vectors. You can define your own Vector3 class/struct or use Unity's Vector3 type.

Dot product of two vectors

Angle formed by two vectors in degrees

Vector3 seg1, seg2;
float dotproduct = seg1.X * seg2.X + seg1.Y * seg2.Y + seg1.Z * seg2.Z;

Vector3 seg1, seg2;
float dotproduct = seg1.X * seg2.X + seg1.Y * seg2.Y + seg1.Z * seg2.Z;
float seg1magnitude = (float)Math.Sqrt(seg1.X * seg1.X + seg1.Y * seg1.Y + seg1.Z * seg1.Z);
float seg2magnitude = (float)Math.Sqrt(seg2.X * seg2.X + seg2.Y * seg2.Y + seg2.Z * seg2.Z);
float angle = (float)(Math.Acos(dotproduct / seg1magnitude / seg2magnitude) * 180 / Math.PI);
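The angle formula above can be packaged and sanity-checked with known vectors. A sketch using a hypothetical AngleDegrees helper (introduced here for illustration, not part of the lecture code):

```csharp
using System;

public static class VectorMath
{
    // Angle in degrees between two 3D vectors, following the slide's
    // formula: acos(dot / (|a| * |b|)), converted from radians to degrees.
    public static float AngleDegrees(float ax, float ay, float az,
                                     float bx, float by, float bz)
    {
        float dot = ax * bx + ay * by + az * bz;
        float magA = (float)Math.Sqrt(ax * ax + ay * ay + az * az);
        float magB = (float)Math.Sqrt(bx * bx + by * by + bz * bz);
        return (float)(Math.Acos(dot / (magA * magB)) * 180.0 / Math.PI);
    }
}
```

The X axis (1, 0, 0) and Y axis (0, 1, 0) form 90 degrees; (1, 0, 0) and (1, 1, 0) form 45 degrees. Joint angles computed this way (e.g., the elbow angle from the upper-arm and forearm segments) are a common building block for rule-based gestures.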

Build a Gesture Recognition App
- The app can recognize two gestures:
  - Hand clapping
  - Two hands raised

Setting Up the Project
- Create a new C# WPF project named GestureRecognitionBasic
- Add the Microsoft.Kinect reference, import the namespace, etc.
- Add the GUI components
- Create a new C# file named GestureRecognitionEngine.cs and copy the engine code into this class
  - In Solution Explorer, right-click the project, then Add => New Item

Build a Gesture Recognition App
- User interface components:
  - Canvas
  - TextBox
  - Image

Build a Gesture Recognition App
- Add member variables

KinectSensor sensor;
private WriteableBitmap colorBitmap;
private byte[] colorPixels;
Skeleton[] totalSkeleton = new Skeleton[6];
Skeleton skeleton;
GestureRecognitionEngine recognitionEngine;

- Modify the constructor

public MainWindow()
{
    InitializeComponent();
    Loaded += new RoutedEventHandler(WindowLoaded);
}

Build a Gesture Recognition App

private void WindowLoaded(object sender, RoutedEventArgs e)
{
    if (KinectSensor.KinectSensors.Count > 0)
    {
        this.sensor = KinectSensor.KinectSensors[0];
        if (this.sensor != null && !this.sensor.IsRunning)
        {
            this.sensor.Start();
            this.sensor.ColorStream.Enable();
            this.colorPixels = new byte[this.sensor.ColorStream.FramePixelDataLength];
            this.colorBitmap = new WriteableBitmap(this.sensor.ColorStream.FrameWidth,
                this.sensor.ColorStream.FrameHeight, 96.0, 96.0, PixelFormats.Bgr32, null);
            this.image1.Source = this.colorBitmap;
            this.sensor.ColorFrameReady += this.colorFrameReady;
            this.sensor.SkeletonStream.Enable();
            this.sensor.SkeletonFrameReady += skeletonFrameReady;
            recognitionEngine = new GestureRecognitionEngine();
            recognitionEngine.GestureRecognized += gestureRecognized;
        }
    }
}

Build a Gesture Recognition App
- Gesture recognized event handler
- colorFrameReady(), DrawSkeleton(), drawBone(), ScalePosition() same as before

void gestureRecognized(object sender, GestureEventArgs e)
{
    textBox1.Text = e.gsType.ToString();
}

Build a Gesture Recognition App
- Handle the skeleton frame ready event

void skeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
{
    canvas1.Children.Clear();
    using (SkeletonFrame skeletonFrame = e.OpenSkeletonFrame())
    {
        if (skeletonFrame == null)
        {
            return;
        }
        skeletonFrame.CopySkeletonDataTo(totalSkeleton);
        skeleton = (from trackskeleton in totalSkeleton
                    where trackskeleton.TrackingState == SkeletonTrackingState.Tracked
                    select trackskeleton).FirstOrDefault();
        if (skeleton == null)
            return;

        DrawSkeleton(skeleton);
        recognitionEngine.Skeleton = skeleton;
        recognitionEngine.StartRecognize(GestureType.HandsClapping);
        recognitionEngine.StartRecognize(GestureType.TwoHandsRaised);
    }
}

Challenge Tasks
- Add recognition of two more gestures:
  - Right hand is raised
  - Left hand is raised
- Add the GestureRecognitionEngine.cs to a Unity+Kinect app, and add visual feedback on the gestures recognized
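One possible shape for the first challenge, reduced to a pure comparison. RightHandRaised is a hypothetical helper, not provided in the lecture; the head-relative threshold mirrors the two-hands-raised gesture:

```csharp
using System;

public static class ChallengeSketch
{
    // Hypothetical check for the "right hand is raised" challenge gesture:
    // the right hand is above the head while the left hand is not, so the
    // gesture stays distinct from TwoHandsRaised.
    public static bool RightHandRaised(float rightHandY, float leftHandY,
                                       float headY, float threshold = 0.3f)
    {
        return rightHandY > headY + threshold && leftHandY <= headY + threshold;
    }
}
```

The left-hand variant swaps the roles of the two hands. In the engine itself, each new gesture would also need a GestureType enum value and a case in StartRecognize.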