
Abstract

Eye-based interface systems for computers have existed for several years, but have not come into widespread use due to the high cost of the associated technology and devices. Recent developments in high-grade, small-form-factor consumer cameras have created devices which allow low-cost gaze tracking research to be conducted. Previous design work in this field has often neglected to provide feedback to the user, limiting the potential uses of those systems. The designed Gaze Tracking System attains tracking results similar to previously designed systems, while also providing visual feedback to the user in the form of a computer cursor that follows the user's gaze. The results double the previous best tracking accuracy to 0.5 degrees and achieve a 12% speed increase over conventional mouse-based systems. Additionally, a Head Mounted Display has been integrated into the system, allowing fully hands-free use and mobility. Continuing advances in this field, such as those shown in this research, will rapidly lead to the technology being viable for real-world use.

EE 452 – Senior Capstone Project

Project Report

Gaze Tracking System

Breanna Heidenburg

Michael Lenisa

Daniel Wentzel

Tuesday, May 13, 2008


Table of Contents

Abstract
Introduction
Applications
Previous Work
Patents
Previous Research
openEyes
Oculog
Software Packages
Commercially available solutions
Tobii
Seeing Machines
SR Research
Goals
Core Goals
Additional Goals
Functional Specifications
Hardware System Requirements
Software System Requirements
Development Timeline
System Hardware Design
Eye Tracking Camera
Polarizing Filter
Lighting System
Head Mounted Display (HMD)
Computing platform
System Software Design
Application Layout
Image Processing Application
Image Filtering
Locate Eye Shape
Adaptive Algorithm
Cursor Data
Cursor Control
Clicking
Communication
User Interface
Main Page
Calibration
System Testing Interface
Timing Collector
Centering Interface
Other Pages
Additional Applications
Timing Collector
UDP Client
Control Panel
Perl Scripts
MATLAB
Conference Submissions
Results
Goal Completion
Core Goals
Additional Goals
Functionality
Hardware
Software
Conclusion
Works Cited
Appendix A: Source Code
Appendix B: HSI Conference Submission
Appendix C: RESNA Conference Submission
Appendix D: Software Usage Manual
Appendix E: Software Design Manual
Appendix F: Table of Figures


Introduction

Gaze tracking has been used for many decades as a tool to study human cognitive processes, including reading, driving, watching commercials, and other activities [1]. With the advent of personal computers, integration of such systems into the user interface began to be considered, starting in 1991 [2]. Successful attempts to employ gaze tracking as a user interface have allowed users with movement disabilities to type by looking at a virtual keyboard [3] or to control a mouse pointer [4]. Systems of this kind have also been used by individuals without any disabilities to enhance performance [5], [6].

The continuing absence of consumer-grade eye-tracking human-computer interface technology is a result of the high price and intrusiveness of such systems, despite the fact that the enabling technologies have existed for many years [7]. Only recently have relatively low-cost systems been investigated, for example by Li et al. in 2006 [8], [9], [10].

The design work described herein shows the accuracy that can be obtained using modern, low-cost systems. The two most significant contributions of this research are the use of blob detection in the visible spectrum instead of circular shape detection in the infrared spectrum, and the use of higher-degree polynomials to calibrate and map the detected gaze position to a position on the computer screen.

In a gaze tracking system, images of the eye are taken by a camera and sent to an image

processing system. This image data is a picture of the user’s eye from a specific vantage point. The

vantage point can be close to the user’s eye, as in a head mounted device, or further away, as in a

device on a table near the user. For this project, the design team has used a head mounted camera, as

will be discussed later. The image is processed to determine the location of the user’s pupil within the

image. The coordinates of the detected center of the user’s pupil are passed through a specialized set of

algorithms to accurately position a cursor on a screen displayed in front of the user. The cursor position represents the user's point of gaze, creating an open-loop tracking system in which the cursor follows the user's eyes across the display.
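As an illustration of this open-loop pipeline, the per-frame flow can be sketched as follows. This is a minimal sketch with hypothetical types and function names standing in for the real subsystems, not the project's actual code.

struct Frame { /* raw camera image */ };
struct Point { double x, y; };

bool captureFrame(Frame *f);                      /* grab an image from the eye camera */
bool locatePupilCenter(const Frame *f, Point *p); /* image processing: find pupil center */
Point mapToScreen(Point pupil);                   /* calibration mapping to screen pixels */
void moveCursor(Point screenPos);                 /* OS-level cursor positioning command */

/* Open-loop tracking: each captured frame independently drives the cursor. */
void trackingLoop(void)
{
    Frame frame;
    Point pupil;
    while (captureFrame(&frame)) {
        /* Frames with no detectable pupil (e.g. a blink) are rejected */
        if (locatePupilCenter(&frame, &pupil))
            moveCursor(mapToScreen(pupil));
    }
}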

Problems can arise during tracking, such as spurious blinking or the inability to initiate an action

(as the cursor only indicates a position). Solving these problems has been a major part of the

development. For instance, sampled images will be rejected if the user is blinking in the image. A

method for initiating an action (similar to a mouse click) has also been devised for use within the gaze

tracking system.

Applications

Applications of this technology may vary greatly. Biomedical and Assistive Technology

applications are nearly limitless. For instance, a paraplegic patient could take control of a robot arm via

such technology, thereby giving some mobility to a person who would otherwise have none. Other fields


such as market research use gaze tracking to analyze how people read documents. Within market research, gaze tracking is routinely used to gauge the usability of applications and the effectiveness of graphics in advertising.

Previous Work

Previous work in this field has guided the design research by allowing the design team to see what types of systems have worked, how well they have performed, and which design decisions to avoid. Most research found on this topic has been at the graduate level or above, including commercial applications. This prior research has been beneficial for development and has prevented the team from making costly design mistakes. Discussed here are relevant patents, previous research in the field, software packages facilitating development, and commercially available products which validate the design.

Patents

Numerous patents exist in this field of research, many of which were issued in the past few years. Such patents embody various methods of eye tracking, including infrared tracking [11] and visible-spectrum tracking [12]. While these two patents allow for eye tracking, they do not allow for any use of the eye tracking data.

Patents also exist which embody the use of a head mounted display. Various methods of projecting images in front of a user are described in these patents [13], [14]. Constructing such devices is very time consuming, requiring high-end facilities for machining lenses, positioning optics, and controlling micro displays. These procedures and their associated costs would prohibit the design of such a device within the time limits of a senior capstone project. Therefore, the design team purchased an existing product that fits the needs of the project.

Other patents exist for relating the point of gaze of a user to a position on a screen. Such methods include positioning a cursor on a screen displaced from the user (a computer monitor on a desktop) [15], as well as on a screen attached to the user's head [16]. The latter patent also allows for a heads-up display (HUD) and the use of images therein to support a medical procedure.

Previous Research

Visually based eye tracking has been a rapidly expanding field of research in the last few years. There have been a number of projects highly similar to the proposed project, conducted by universities around the world. The two most relevant projects are presented here, along with the most important differences from the proposed project.


openEyes

The openEyes project is an ongoing research program at Iowa State University with the goal of developing a low-cost, open-source, head mounted eye tracker [10]. In addition, the project uses a second, forward-facing "scene" camera that replicates the user's field of vision. The project has been in progress since 2004 and has accomplished all of its original goals. The current hardware and software combination, a fourth generation device, can be built entirely from off-the-shelf, low-cost components and achieves eye-tracking accuracy of 1 degree.

openEyes took an approach remarkably similar to the proposed project, to the

point that the physical eye tracking hardware is nearly indistinguishable between the

two projects. It has encountered many of the same problems and arrived at remarkably

similar solutions. As an example, both projects have experienced considerable problems with ambient reflections on the pupil, making it difficult to locate reliably.

The described project and openEyes differ in two major ways. Firstly, openEyes

uses a dual-camera system. One camera is used to track the eye and the other is

mounted near the eye looking out into the user’s field of vision. The system then

correlates pupil location to the second camera to determine where the user’s gaze lies.

The described project instead uses only a single camera to correlate pupil location to

gaze on a physically fixed display. This represents the different functional goals of the

projects. While openEyes focuses on providing a general gaze tracking and visual

interface, the described project focuses specifically on interaction with a fixed visual

display. While openEyes is ultimately a more flexible system, the proposed project is

much easier to integrate into a computer system.

Secondly, openEyes uses a different method of calculating gaze location. The

system extracts two separate sets of reflection data from the eye and compares them to

determine gaze location. The proposed project extracts only a single pupil location and

uses it to directly determine gaze location. The openEyes approach is necessary to allow

the correlation of the user’s gaze to the “scene” camera that observes field of view. In

the described system the fixed position of the display being used means that a second

set of data is not needed. This will allow quicker and simpler algorithms to be used,

speeding development and reducing processor load.

Oculog

Oculog is a project of the University of Wollongong, Australia [17]. The project is working toward a novel goal: controlling musical synthesizers via eye movements. Although it also tracks the eye visually, it takes a slightly different approach from the described project and openEyes.


Oculog uses a camera sensing the infrared spectrum. This approach has both

benefits and drawbacks. It provides a cleaner, more precise image of the eye that is

much easier to track visually and allows for greater accuracy in tracking. Unfortunately,

this means the system cannot be used outdoors or near any source of intense IR light

and is also much more expensive relative to openEyes and the described project.

Currently the Oculog project is ongoing and is achieving promising results.

Software Packages

For the purpose of this project, the design team implemented the image processing application via OpenCV. OpenCV is an open source computer vision software package developed by Intel in the late 1990s [18]. It contains libraries for sampling images from an external video device, detecting objects within the image, and making decisions based on findings. OpenCV was chosen for the image processing application due to its robustness and ease of development. In a matter of a few weeks, the design team was able to construct a proof-of-concept application which acquires data from a camera, prepares the image by means of a set of filters and image transformations, detects a circular shape in the image, and sends out a cursor position relating to the position of the user's eye.

Commercially available solutions

There are several products on the market for solving problems similar to the ones presented in this report. Though most of these systems do not use a head mounted display, they all involve various methods and applications of gaze tracking technology.

Tobii

Tobii offers an eye tracking system which sits beneath a desktop computer monitor and locates the eyes and their gaze from a video stream of the face. It is a completely nonintrusive system, currently marketed toward all forms of visual marketing applications. Tobii also offers variants of the system for users standing in front of a projection screen or seated in front of a TV [19].

Seeing Machines

A company called Seeing Machines has a few products similar to Tobii's. FaceLAB, their most successful product, tracks both eyes independently of one another from a video stream of the subject's face. It comes equipped with analysis software, including programs to detect user fatigue [20]. This leads into their next product, the Driver State System, which is mounted in trucks to measure how tired or distracted the driver is [21].


SR Research

EyeLink, the SR Research product, offers both a head mounted display and a head supported display. The head mounted display uses three video cameras: one in front of each eye to determine gaze, and one affixed to the headband to compensate for headband movement and vibration. The head mounted system primarily uses corneal reflections together with pupil location data to determine the user's gaze. When this is not possible, it falls back to the software used in the head supported system, which tracks only the pupil [22].

Goals

Upon consideration of the prior art, the design team looked to build upon previous design methods to replicate and improve upon their results. Image capture has been achieved via the visible spectrum method, where the image capture device is a small form factor webcam capable of high resolution image capture at close range (less than 2 inches). The image data from the camera is transmitted to a computer for processing. Processing is aided by the OpenCV software package, so that object detection can easily be performed, keeping design time to a minimum. The resulting data from this calculation is cursor position data, which is relayed to a screen in front of the user. Initially, as a proof of concept, the display was displaced from the user (a monitor placed on a desk). While this allowed for quick development and demonstrated feasibility, lateral changes in the position of the user's head relative to the display caused the calculation algorithm to produce errors. Finally, a head mounted display was purchased for use with the system. This fixes the position of the display relative to the user's eye, reducing lateral distortion and decreasing the positioning error in the system. Additionally, the fixed-position display has provided an excellent means for displaying a custom User Interface to the user. This custom UI has proved to be a crucial element in demonstrating the system and measuring its accuracy.

In any project which can be adapted for different markets and purposes, there must be a basic set of functionality. The design team first set out to implement the core goals of the project, with the intent that upon their completion, additional goals would be slated depending on the remaining design time and the feasibility and usefulness of each goal.

Core Goals

In order to consider the project a success, the design team – at the project's inception – decided that the following core goals were necessary components of the final system:

Image-processing based eye tracking system
  o Allows correlation of eye position to location on the computer display
  o Adapts to various eye types and positions
  o Ability to function as a hands-free input device
Cursor control, driven by the eye tracking system
  o Facilitates "Alternative and Augmentative Control" (AAC)
Display visual output to the user
Visually controlled demo computer programs

Additional Goals

At the project's inception, the design team decided that the additional goals listed below would be acceptable continuations of the core goals already set forth.

Augmented Reality System
  o Micro display transparent overlay Heads-Up Display (HUD) in the user's vision, with variable opacity
  o Portable system suitable for everyday use
  o Forward looking camera allowing for correlation of the user's real-world vision to eye position
Automatic calibration system
Real world applications
Network integration
Additional sensory output to the user


Functional Specifications

This project can be broken down into two distinct sections: the hardware system, and the

software system. The hardware system includes all of the physical components and their physical

properties, such as power consumption, physical size, connectors, etc. The software portion of the

system includes all software related issues, such as image capture time, frame rate, etc.

Hardware System Requirements

The following list contains overall system requirements. A block diagram of the hardware system can be found in Figure 1.

The total system shall be able to be carried by one person, and shall be self powered.
  o The mobile computing system shall be the ASUS EeePC. The ASUS EeePC is a small form factor laptop which runs Windows XP, thereby setting a software requirement for the operating system.
  o The system shall be mobile. It is intended that the user will not be required to have the system plugged into a wall outlet in order to use it.
The system shall contain at least one image capture device, wherein that image capture device will be a small form factor camera directed towards the user's eye.
  o The camera resolution used shall be no lower than 320x240 pixels.
  o The camera(s) shall be placed in a position so as to facilitate easy gaze recognition, as well as not interfere with the user's vision. While the camera must be placed in front of the user, it is intended that the camera will not be directly in front of the user's eye.
  o The system may contain additional image capture devices to assist in the creation of the augmented reality system.
The image devices shall connect to the computer system via USB cables.
The display shall connect to the computer system via a VGA cable.

Software System Requirements

The following list contains software system requirements. A block diagram of the software system can be found in Figure 2.

If an image is found to be invalid (the user blinks or removes the image capture device), the system shall omit the captured image from cursor position calculations. At this point, the software system may recapture an image and check whether it is valid, or it may wait a specified amount of time before recapturing an image.
The free running frame rate of the system shall not drop below two frames per second. The maximum frame rate of the system shall be determined by either the maximum frame rate of the image capture device or the processing capabilities of the system.
The cursor position error shall be no greater than 12.8 pixels horizontally and 9.6 pixels vertically.
  o On a display with VGA resolution (640x480), this translates to less than 2% in each dimension.
  o The system shall also have a method of measuring error.
The system shall allow the user to initiate an action, in addition to positioning a cursor.
  o By capturing an image, the system can only determine a location for a cursor. It was originally conjectured that a 'mouse click' could be initiated via a blink. This, however, is not acceptable due to spurious blinking by the user. Instead, the system shall allow the user to initiate an action by an external button, by holding the gaze at a single point for a specific amount of time, or by adding another camera to the system to compare the opposite eye to the tracked eye.

Development Timeline

For an acceptable end result, this project required a rigorous timeline. The development team finalized hardware selections and designs by the middle of March. Application development quickly followed, allowing plenty of time to collect usage results and gather user feedback. Following this testing and acceptance phase, the design team summarized the results in this report and produced a number of research papers presented at conferences around the world.


System Hardware Design

In order to allow for the creation of a software based system, hardware design had to first be taken

into account. A generalized breakdown of hardware is shown in Figure 1. The input to the system is the

image capture device, wherein an image of the user’s eye is captured and sent to the computing

platform running the software. The computing platform is intended to be mobile, such as a

laptop. The output of the computing platform manifests in the form of a video signal sent to a head

mounted transparent monocular display.

Figure 1 - System Hardware Diagram

While Figure 1 generally describes the system, quite a few other hardware pieces were added through experimentation. The significant hardware pieces are listed and described herein:

Eye tracking camera

Polarizing Filter

Computing platform

Head Mounted Display (HMD)

Lighting System (mounted LED’s)

Eye Tracking Camera

Camera choice was one of the first and most basic problems faced. To avoid writing driver code

from scratch, an undertaking outside the scope of this project, camera searches were limited to

consumer webcams which came pre-packaged with drivers. The other major requirement for any

camera was a very short focal length. The camera is mounted very close to the eye and thus must be

able to focus on objects very near the lens. Other considerations included small camera size and weight,

manual vs. autofocus, resolution, and cost. It was also thought that a second camera might be needed to

implement the optional Augmented Reality portion of the project. To compound matters, at this early

point in the project it was unclear if other requirements might be found later. Research on similar

projects had yielded little information on camera choice and there was no camera expert on the team.


A search of the camera market revealed a wide variety of cameras that met the requirements. A lack of published specifications proved to be a problem with many of the cheaper models. The cameras found were, however, cheaper than expected. At this point it was decided to purchase two different

cameras in an attempt to cover multiple scenarios and ensure that at least one of the cameras would

meet the requirements. Additionally, the second camera could be used as the front facing AR camera if

needed. The two cameras selected were the Logitech QuickCam for Notebooks and the Creative

73VF022000000.

The Logitech QuickCam for Notebooks is a high-end consumer webcam. This camera has a default resolution of 1600x1200 pixels (2 megapixels), with the ability to up-sample to 8 megapixels. The camera operates at 30 frames per second and has autofocus and in-driver image processing capabilities.

The Creative 73VF022000000 is a low-end camera. It operates at 800x600 resolution at 30 fps and is a manual focus camera with limited driver options.

These cameras have been used in three different hardware configurations as the project progressed.

The hardware changed over the project as more was learned about image requirements through

experimentation and also as more hardware (such as the head-mounted display) was received.

The first generation of the hardware system (HGEN1) was nothing more than the stripped down

Creative webcam bolted to a pair of safety glasses so that it was looking back toward the eye. This can

be seen in Figure 2 below.

Figure 2 - First Generation Hardware

This crude configuration was assembled as quickly and cheaply as possible to help determine

the requirements of future hardware designs and to allow coding of the image processing application to

begin. Despite the crude appearance, HGEN1 functioned well and was the basis for the first half of the work

on the image processing application.


The major problem with HGEN1 that forced the move to the second generation system was

stability. The eyeglass frame could not support the weight of the camera and so the entire assembly

would shift position when the user moved his/her head. In addition, by this point it had been

determined that the Logitech camera was superior and a switch to that camera was desired.

Hardware generation 2 (HGEN2)

Testing quickly showed the Logitech to be much better suited to our needs. The combination of

improved optics in the more expensive camera and extensive driver support and options made it the

clear choice. This camera was mounted on a pair of ski goggles as a basic test mount. The goggles were

selected due to their low cost, ready availability, and secure mounting. The fact that the goggles did not

move relative to the wearer's face was the most important factor. This allowed camera testing and image processing application (IPA) development in an environment highly similar to the one provided by the HMD. The second half of the IPA development was conducted on HGEN2. HGEN2 also saw the addition of the polarizing filter,

discussed in the next section.

Polarizing Filter

During development of the image processing system, a recurring problem in both the initial and

final development was the reflection of environmental objects in the user’s eye. The surface of the

human eye is extremely reflective and as such, surrounding objects such as overhead lights, computer

screens, and even the camera itself are reflected in the eye. These reflections are bright and extremely

clear and show up plainly in the images of the eye taken by unmodified cameras. In many cases, these

reflections proved severe enough to completely obscure the pupil and prevent the IPA from tracking the

eye altogether. In other cases, the tracking would merely be degraded.

As this is a problem with the surface of the eye that the camera sees, no amount of image

processing can remove these reflections. This problem is frequently encountered by photographers as

well when shooting through glass or taking pictures of highly reflective surfaces. The solution used by

photographers is a polarizing filter. A polarizing filter passes only light polarized in a single plane. Because light reflected off a surface is largely polarized in a different plane than the light coming from the object itself, a properly oriented polarizing filter removes most reflections.
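As general optics background (not stated in the report): an ideal polarizer transmits light according to Malus's law,

I = I₀·cos²θ

where θ is the angle between the incoming light's plane of polarization and the filter's transmission axis. Reflected light, being largely polarized in one plane, is strongly attenuated when the filter is oriented so that θ approaches 90°; this same attenuation is also why the filter reduces the total light reaching the camera, as discussed below.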

It was decided to mount a polarizing filter on the camera to solve this problem. Unfortunately,

polarizing filters are manufactured only for high-end professional cameras, not webcams. In general,

there are no add-on lenses or filters manufactured for webcams at all. As such, the best option that could be found was a small 25mm polarizing filter for a traditional camera. For lack of a better mounting solution, this was taped over the webcam's lens. Surprisingly, this functions extremely well.

The filter successfully removed almost all reflections. There are a few cases where the reflected

light remains too bright for the filter to remove, but they are rare and outside normal expected

operating conditions.


Unfortunately, in filtering out much of the light the camera receives, the filter also severely

reduces the amount of light available to the camera. This requires a corresponding increase in the

exposure time of the camera. This slowed the system excessively. The solution to this problem was an

additional LED mounted near the camera.

Lighting System

A single white LED was mounted near the camera to provide additional lighting. A white LED was selected because white light was found to interfere less with the user's vision. The LED is somewhat distracting to users, but there was no better option in this case.

Head Mounted Display (HMD)

The HMD used with this project is the Lite-Eye LE-500. The LE-500 was selected after it was determined that a transparent, high-resolution, monocular HMD was required for the project. The LE-500 was the only HMD available which met these requirements at an acceptable cost:

800x600 full color resolution
Transparent or opaque operation
Monocular
High (2.5 cm) eye relief

Computing platform

The entire system is packaged into a single installer compatible with any Windows XP or Windows Vista computer. The software may be installed onto a small form factor laptop and used in a mobile application.


System Software Design

Application Layout

Once the hardware was in place, system software development could begin. As shown in Figure 3, below, there are two main subsystems to the software. The first is the image processing application, which processes the image and locates the center of the pupil. Also included in the image processing application are the calibration and cursor control subsystems, which map from the center of the eye to the screen. The remaining subsystem is the user interface application, including all the communication between the two subsystems, which is where the user can test the functionality of the overall system. The intricacies of these subsystems are described in more detail in the following sections.

Figure 3 - Full System Architecture

Image Processing Application

The image processing application involves a number of steps to determine cursor coordinates from an image of a user's eye. At its core, the image processing application (IPA) uses a custom-coded adaptive pre-processing algorithm in conjunction with an open source blob recognition system to identify and track the pupil. It also contains a large number of performance logging and


analysis systems that provide a sophisticated run-time view of the application for development

purposes.

The steps of the IPA are shown in Figure 4, and are discussed in more detail below. The process

begins by capturing a frame from the camera in the “Capture Image” step, via OpenCV’s camera

subsystem.

Figure 4 - Image processing steps: Capture Image → Extract Red Channel → Smooth Image → Contrast Stretch → Locate Blobs → Reject False Positives → Determine Center of Pupil Blob → Adapt Algorithm

Image Filtering

Image preprocessing is the first portion of the image processing proper. The red channel

of the image is extracted and smoothed. The system uses only the red channel of the color image due to

extensive testing that showed that it was the only data channel to contain useful data for all eye colors.

Smoothing is applied to the image to remove any lingering reflections missed by the polarizing filter. The

smoothing also helps to reduce sharp edges, aiding the blob recognition system.

The adaptive contrast stretch is then applied. Contrast stretching is a traditional image

processing technique that manipulates the window and level of the image to produce higher contrast in

a specific range of pixel values at the cost of contrast loss in other areas. In this case, the contrast stretch applied is determined by the adaptive algorithm and is designed such that the pupil is brought into

relief as a black blob on a white background.
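A minimal sketch of this preprocessing chain is shown below, written against the modern OpenCV C++ API rather than the C-era API the project used. The stretch window [lo, hi] stands in for the values chosen by the adaptive algorithm; the kernel size is likewise an illustrative assumption.

#include <opencv2/opencv.hpp>

cv::Mat preprocess(const cv::Mat &bgrFrame, double lo, double hi)
{
    // Extract only the red channel (index 2 in OpenCV's BGR ordering)
    std::vector<cv::Mat> channels;
    cv::split(bgrFrame, channels);
    cv::Mat red = channels[2];

    // Smooth to suppress residual reflections and soften sharp edges
    cv::GaussianBlur(red, red, cv::Size(9, 9), 0);

    // Linear contrast stretch: map [lo, hi] onto [0, 255], saturating
    // outside the window, so the pupil stands out as a black blob
    cv::Mat stretched;
    double scale = 255.0 / (hi - lo);
    red.convertTo(stretched, CV_8U, scale, -lo * scale);
    return stretched;
}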



Locate Eye Shape

With the contrast stretch applied, the pupil will be clearly visible as a black blob. However, there

may be other black blobs as well. These represent areas of the image of almost exactly the same shade

and color as the pupil. The blob detection step exists to locate all blobs in the image and determine

which is representative of the pupil. For the image processing application, a blob is defined as a group

of pixels with values within a certain range. The blob detection system used is an open source add-on for

OpenCV.

Once all blobs have been located, the system calculates several parameters of each blob such as

perimeter, area, aspect ratio, roundness, and more. These parameters are compared to experimentally

determined values for a pupil and blobs are discarded based on them. Due to the nature of the human

eye and surrounding features, there will never be more than one blob that fits all parameters for a

human eye. Thus, the system selects the correct blob. The algorithm then calculates the center of this

blob and passes it to the cursor control system.
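A sketch of how this parameter-based rejection might look is shown below. The Blob structure and every threshold constant here are illustrative stand-ins; the report's actual values were determined experimentally and are not reproduced here.

#include <vector>

struct Blob {
    double area;        // pixel count
    double perimeter;   // boundary length
    double aspect;      // bounding-box width / height
    double cx, cy;      // centroid
};

const Blob *selectPupilBlob(const std::vector<Blob> &blobs)
{
    const double kPi = 3.14159265358979;
    for (const Blob &b : blobs) {
        // Roundness: 4*pi*area / perimeter^2 equals 1.0 for a perfect circle
        double roundness = 4.0 * kPi * b.area / (b.perimeter * b.perimeter);
        bool sizeOk  = b.area > 200.0 && b.area < 5000.0;            // illustrative
        bool shapeOk = roundness > 0.7 && b.aspect > 0.5 && b.aspect < 2.0;
        if (sizeOk && shapeOk)
            return &b;   // at most one blob passes all pupil criteria
    }
    return nullptr;      // no pupil found in this frame (e.g. a blink)
}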

Adaptive Algorithm

Finally, the system adapts the smoothing, contrast stretching, and blob detection parameters

based on the current image to allow for better recognition of the next image. If no pupil was found in

the image, the system adjusts parameters up or down to gradually move towards a state that will allow

pupil recognition. Adaptation is based on the relative lightness/darkness of the image, the number of

blobs detected, and the previous values of parameters being adjusted.
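One plausible shape for such an adaptation rule, consistent with the description above, is sketched here; the step sizes and decision logic are illustrative assumptions rather than the project's tuned behavior.

void adaptParameters(double &stretchLo, double &stretchHi,
                     int blobCount, double meanBrightness)
{
    if (blobCount == 0) {
        // Nothing found: widen the stretch window so darker pixels survive
        stretchHi += 5.0;
    } else if (blobCount > 5) {
        // Too many candidates: tighten the window to isolate the pupil
        stretchHi -= 5.0;
    }
    // Track overall image brightness so the window follows ambient lighting
    stretchLo = 0.25 * meanBrightness;
}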

Cursor Data

Once the center of the pupil can be located, calibration can begin. Calibration is necessary because the screen is a flat 800 by 600 pixel rectangle while the human eye is not. A mapping is required to transform the coordinates of the center of the pupil to coordinates on the display. Calibration must be done every time the system is restarted due to variations in use: the eye will not be in the same location relative to the screen every time the same user wears the device, and different users with different eye and face shapes will require a new calibration. Currently, the calibration is done using 81 pixel locations, by asking the user to 'look at the dot'. The visual component of calibration is part of the user interface and will be discussed on page 29. Using the center location of the eye when looking at those known

pixel locations the coefficients of the calibration equation are determined using multiple variable linear

regression, a statistical technique used to approximate the correlation of variables. Shown below in Figure 5 is the multiple variable linear regression formula:

B = (XᵀX)⁻¹XᵀY

Figure 5 - Multiple Variable Linear Regression

Here B is the column vector of calibration coefficients, X is the matrix of independent variables (the coordinates of the center of the eye in this case), and Y is the desired output column vector (the horizontal or vertical pixel location the mouse should move to).


The system uses two independent variables EyeX and EyeY, being the horizontal and vertical

locations of the center of the eye, measured as a pixel location on the picture of the eye. It is important

to note that, because the edges of the human eye are curved, it is not capable of moving as far up or

down when looking to the far right or left as it can when it is looking forward. Thus, the pixel locations at

which to move the mouse are dependent both on the horizontal and vertical coordinates of the center

of the eye. For example:

PixelX = A0 + EyeX × B0 + EyeY × C0

PixelY = A1 + EyeX × B1 + EyeY × C1

Figure 6 - Determining pixel coordinates from eye coordinates

Figure 6 shows the method employed in this gaze tracking system: a best-fit plane (first degree) through the three-dimensional dataset.

This multiple variable linear regression method allows for a polynomial curve fit of any degree. By adding columns to the X matrix, dependence on higher-order terms can be introduced, such as EyeX² or EyeY⁴.
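For illustration (consistent with the text above, not reproduced from the report), a full second-degree fit over the 81 calibration samples would use a design matrix with one row per calibration dot:

X row i:  [ 1   EyeXᵢ   EyeYᵢ   EyeXᵢ²   EyeXᵢ·EyeYᵢ   EyeYᵢ² ]   for i = 1 … 81

with B = (XᵀX)⁻¹XᵀY solved twice: once with Y holding the known horizontal pixel locations, and once with the vertical ones.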

Over the course of this project, different curve fits were investigated in an attempt to find the

best fit possible. Shown below are the error plots generated from the different order curve fits.



Figure 7 - Error plots for various degrees of best fit methods

Figure 7 shows the three-dimensional error surfaces as well as the error contour plots for the varying polynomial degrees. One thing is consistent between all the plots: the error increases significantly as the user approaches the corners of the screen. This is mainly due to the shape of the eye itself; there is less available resolution when looking to the far left or far right. Figure 8 shows a more telling plot of the average error for each degree.

Figure 8 - Average pixel error for each best-fit degree: 20.5645 (1st), 16.773 (2nd), 15.5047 (3rd), 15.489 (4th), 103.0193 (5th)

It is clear from the chart that the third and fourth order fits provide the least error. In the first design iteration, a first degree system was used for cursor control. These results were obtained using the MATLAB code described on page 37. As shown, a fourth degree system would be preferable; however, due to design time limitations a fourth degree system was not achieved. Designing a system of higher order than the first degree involves much more complicated linear algebra, such as finding the inverse of a 9 x 9 matrix.

Cursor Control

The gaze tracking system was designed and implemented on the Windows platform. As such, cursor movement is implemented via a direct command to the Windows operating system. The commands used in the NT environment differ from those used in the pre-NT environment, and the system was designed to work with NT-based systems and beyond. Accordingly, it requires Windows NT/2000 Service Pack 4 or higher (including XP and Vista). Figure 9 shows the functions used to position the cursor, as well as to perform a click.

#define _WIN32_WINNT 0x0501



#include <windows.h>

#include <Winuser.h>

void gtsMouseMove (int x, int y)
{
    // Convert screen pixels to the 0..65535 absolute coordinate space
    // expected by SendInput with MOUSEEVENTF_ABSOLUTE
    double fScreenWidth  = ::GetSystemMetrics( SM_CXSCREEN )-1;
    double fScreenHeight = ::GetSystemMetrics( SM_CYSCREEN )-1;
    double fx = x*(65535.0/fScreenWidth);
    double fy = y*(65535.0/fScreenHeight);

    INPUT Input={0};
    Input.type = INPUT_MOUSE;   // added: the input type should be set explicitly
    Input.mi.dwFlags = MOUSEEVENTF_MOVE|MOUSEEVENTF_ABSOLUTE;
    Input.mi.dx = (long)fx;
    Input.mi.dy = (long)fy;
    ::SendInput(1,&Input,sizeof(INPUT));
}

void gtsLeftClick ( )

{

INPUT Input={0};

// left down

Input.type = INPUT_MOUSE;

Input.mi.dwFlags = MOUSEEVENTF_LEFTDOWN;

::SendInput(1,&Input,sizeof(INPUT));

// left up

::ZeroMemory(&Input,sizeof(INPUT));

Input.type = INPUT_MOUSE;

Input.mi.dwFlags = MOUSEEVENTF_LEFTUP;

::SendInput(1,&Input,sizeof(INPUT));

}

Figure 9 - Cursor control code

As shown in Figure 9, cursor control requires the inclusion of a few standard Windows headers, 'windows.h' and 'Winuser.h'. This method of cursor positioning provides absolute cursor control, in contrast to relative cursor control, wherein the cursor moves a differential number of pixels from its current location. Absolute control provides instantaneous positioning at the location determined by the cursor positioning system.
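As a hypothetical tie-in between the first-degree calibration polynomials of Figure 6 and the absolute positioning function of Figure 9 (the coefficient variables follow the report's notation, but this glue code is an assumption, not the project's source):

void gtsMouseMove(int x, int y);   // as defined in Figure 9

// Calibration coefficients, assumed to be filled in by the calibration routine
extern double A0, B0, C0;          // PixelX fit
extern double A1, B1, C1;          // PixelY fit

void positionCursor(double eyeX, double eyeY)
{
    // First-degree mapping from pupil-center coordinates to screen pixels
    double pixelX = A0 + eyeX * B0 + eyeY * C0;
    double pixelY = A1 + eyeX * B1 + eyeY * C1;
    gtsMouseMove((int)(pixelX + 0.5), (int)(pixelY + 0.5));  // round to nearest pixel
}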

Cursor positioning via the absolute control of the gaze tracking system provided a great increase in interaction speed. The designed system was compared to alternative methods of computer input, including a track pad and a mouse. Testing was done using the Timing Collector described on page 33.


Figure 10 - Cursor movement times for various input methods

Figure 10 shows average cursor movement times, which describe the average amount of time

that it takes a user to position a cursor over a randomly placed object on the screen. Users experienced

a 12.7% speed increase over the use of a traditional mouse and a 53.6% increase over the use of a laptop track pad.

Clicking

An important feature of any mouse is its ability to click on a location. Throughout the course of this project a few different ideas for implementing clicks were explored. The first design idea was to use audio based commands. The Logitech webcam used has a built-in microphone, allowing a sensor to be placed extremely close to the user's face. This idea was rejected for the following reason: a key design element was mobility, and the system would be used in many noisy environments. Noisy environments would introduce problems such as false clicks. Additionally, the system would again need to be calibrated for every user so as to correctly identify their voice and initiate a click. This would have added a significant amount of time to the system calibration process, which was also undesirable.

The second design idea was to use ‘winking’ for clicks. This would have been implemented by a

second camera facing the user’s other eye to determine the difference between a normal blink and a

wink. Normally, when a user blinks, both eyes momentarily close. However, during a wink, only one eye

closes for a duration of time. Identifying when one eye was closed while the other was open could easily have been used to initiate a click. This design method was decided against due to cost, as it would involve purchasing another camera to be mounted to the system.

The final clicking method is based on hover, or dwell, time. If the user looks at a constant location for a certain amount of time, the system initiates a click at that location. The system


records previous eye locations, computes their average, and checks against a threshold whether they are close enough to each other to warrant a click. The number of previous locations used and the threshold are both variable, and the program was developed so that these parameters are easily changed. Testing showed that holding the last eight eye locations with a threshold of about ten pixels works well for most users.
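A sketch of the dwell-click test follows. The eight-location history and ten-pixel threshold match the values reported above; the surrounding structure is illustrative, not the project's source.

#include <cmath>
#include <deque>

struct GazePoint { double x, y; };

bool dwellClick(std::deque<GazePoint> &history, GazePoint p,
                size_t n = 8, double thresholdPx = 10.0)
{
    // Keep only the n most recent gaze locations
    history.push_back(p);
    if (history.size() > n) history.pop_front();
    if (history.size() < n) return false;

    // Average of the retained locations
    double ax = 0.0, ay = 0.0;
    for (const GazePoint &g : history) { ax += g.x; ay += g.y; }
    ax /= n; ay /= n;

    // Click only if every point lies within the threshold of the average
    for (const GazePoint &g : history)
        if (std::hypot(g.x - ax, g.y - ay) > thresholdPx)
            return false;

    history.clear();   // prevent repeated clicks at the same fixation
    return true;
}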

Communication

After creating an interface and providing the ability to click, the system must also be able to communicate with other applications, such as the User Interface discussed below. A UDP server thread was implemented within the Image Processing Application so that data could be received from an external source and decisions could be made upon that data. By creating a customized UDP interface in C, the design team was able to design the system around project needs and add features and functionality as the needs arose. Using a UDP server, while ultimately necessary, had drawbacks found in any modern multi-threaded application. Whereas the Image Processing Application was originally designed as a single-threaded, single-process application, with the addition of the UDP server it became a multi-threaded, single-process application. This presents issues such as data interleaving.
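The skeleton of such a UDP command server under Winsock might look like the following; the port number, buffer size, and dispatch placeholder are assumptions for illustration, not the project's values.

#include <winsock2.h>
#pragma comment(lib, "ws2_32.lib")

void udpServerThread(void)
{
    WSADATA wsa;
    WSAStartup(MAKEWORD(2, 2), &wsa);

    SOCKET s = socket(AF_INET, SOCK_DGRAM, IPPROTO_UDP);
    sockaddr_in addr = {0};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(5000);                 // placeholder port
    bind(s, (sockaddr *)&addr, sizeof(addr));

    char buf[256];
    sockaddr_in from;
    int fromLen = sizeof(from);
    for (;;) {
        int n = recvfrom(s, buf, sizeof(buf) - 1, 0,
                         (sockaddr *)&from, &fromLen);
        if (n <= 0) break;
        buf[n] = '\0';
        // Dispatch on the command text; shared state is updated inside a
        // critical section (see Figure 11), then the sender is acknowledged
        sendto(s, "ACK", 3, 0, (sockaddr *)&from, fromLen);
    }
    closesocket(s);
    WSACleanup();
}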

When the UDP server receives a command, most likely it will operate on some sort of data

internal to the Image Processing Application. In most cases, this data is actively being used by the image

processing thread. Multiple threads accessing the same data can produce incorrect readings of the data.

For instance, if the image processing thread has finished processing an image and is saving blob

parameters, and at the same time an external application requests the latest blob parameters, there is a

great possibility that the data will be interleaved. Interleaving would present a great problem if that data

was then to be analyzed for consistency. To alleviate this multi-threading problem, the design team has

implemented a technique known as a 'critical section'. In the Win32 API, a critical section provides mutual exclusion: only one thread at a time may execute the code between entering and leaving the section, and any other thread attempting to enter it blocks until the section is released. This has drawbacks, such as temporarily stalling the image processing thread while the section is held. While this could escalate into a major issue, careful precautions have been taken to protect only the operations which affect shared global variables. An example is shown in Figure 11.

if(RecvStringPart == "ClickFrames+"){
    // Ready an acknowledgement for the sender
    strncpy_s(buf,"ClickFrames+_ACK!",100);
    // Guard the shared variable against concurrent access
    // by the image processing thread
    EnterCriticalSection(&cs);
    ClickFrames++;
    LeaveCriticalSection(&cs);
}

Figure 11 - Use of a Critical Section in C

This example is taken from the UDP command processing thread. When the server receives a command, the command is placed in 'RecvStringPart'. If the received command matches "ClickFrames+", a reply is readied to be sent back to the sender confirming that the command was received. The critical section is then entered, giving this thread exclusive access to the shared state. The ClickFrames variable (which controls the number of frames the system requires to initiate a click) is incremented, and the critical section is released. Following this code, the reply is sent back to the sender and the UDP server waits for the next command.

Many different commands have been implemented in the UDP server. While some of these

commands were originally intended to be in the system, some were added at later stages to add

functionality. Some of the accepted commands include:

Cal_Start

data_calibration

data_mouse

data_parameters

ClickFrames+

ClickFrames-

ClickThreshold+

ClickThreshold-

MouseOn

MouseOff

ClickOn

ClickOff

SAVE_IMAGE:

FPS

EXIT

Figure 12 - Shortened list of UDP commands

These commands, along with a few others, add up to create a fully functional UDP server thread

within the Image Processing Application. The server has developed into an integral part of the system, and its importance is most clearly shown in the calibration routine, discussed in the User Interface section.

User Interface

One of the problems the design team ran into when developing the system was that, once mouse control was attained, there was no common interface for testing and gathering results. After trying several approaches – including SVG (Scalable Vector Graphics) – it was decided that the User Interface would need to be implemented as a standalone application controlling the display of an entire screen. Using a standalone application affords the ability to programmatically alter the state of the application, communicate with external applications, and collect information to be displayed to the user.

The standalone interface application was built using the Windows Presentation Foundation (WPF) – a component of the Microsoft .NET Framework version 3.5 – along with C#. WPF uses modern graphical user interface programming techniques to allow programmers to create interfaces quickly and efficiently. The interface uses a navigation based system, where each section within the interface acts as a page, similar to a webpage. Each page is defined using Extensible Application Markup Language (XAML), a derivative of XML whose structure is very similar to that of modern web pages. However, because the application is compiled – unlike a web page in HTML or XML – the background code is written in managed code rather than in a language such as JavaScript or ECMAScript. This background code is known as 'code-behind' and in this case is written in C#.


Using C# as the code-behind gives the ability to write rich user interfaces containing elements such as animations, timers, and navigation buttons. There are multiple methods for animating items in C#; two major methods are used within the application: animation within XAML and animation by procedural code. Animation via XAML allows very complex animations to be written in markup and triggered off of specific events within the interface. An example of this is shown here:

shown here:

<Page>

<Page.Resources>

<Storyboard x:Key="OnLoaded1">

<DoubleAnimationUsingKeyFrames

Completed="Animation_Completed"

BeginTime="00:00:00"

Storyboard.TargetName="image"

Storyboard.TargetProperty="(UIElement.Opacity)">

<SplineDoubleKeyFrame

KeyTime="00:00:02.6000000"

Value="0"/>

</DoubleAnimationUsingKeyFrames>

</Storyboard>

</Page.Resources>

<Page.Triggers>

<EventTrigger RoutedEvent="FrameworkElement.Loaded">

<BeginStoryboard

Storyboard="{StaticResource OnLoaded1}"

x:Name="OnLoaded1_BeginStoryboard"/>

</EventTrigger>

</Page.Triggers>

<Canvas>

<Image Canvas.Left="0"

Canvas.Top="0"

Name="image1"

Stretch="Fill"

Width="800"

Source="/IRALAR_UI;component/Resources/Background.png"

Height="600" />

<Image

Canvas.Left="60"

Canvas.Top="60"

Height="435"

Name="image"

Stretch="Fill"

Width="655"

Source="/IRALAR_UI;component/Resources/splash.png"

/>

</Canvas>

</Page>

Figure 13 - Markup based animation

The XAML code shown in Figure 13 defines the splash screen of the User Interface. The Storyboard describes an animation which transitions an object's opacity from 100% to 0% over a period of 2.6 seconds. The animation triggers on the loading of the page and applies to the object named 'image', which, as shown later in the figure, is the splash image displayed on the loading screen.

The other technique is animation by procedural code. When it is not possible to trigger an animation off of an event in the interface, objects on the screen must be animated by means of procedural code. For instance, when an animation must recur on a cycle – such as whenever a timer reaches its limit – it is easier to manage multiple animations in code. An example is found within our game interface: once the game starts, a timer begins. This timer is set to 100 ms, and its tick runs a specific piece of code in which various counts and variables determine the structure and properties of the animation.

public UI_Mole()
{
    InitializeComponent();
    // 'dt' is the page's dispatch timer; tick every 100 ms
    dt.Interval = TimeSpan.FromMilliseconds(100);
    dt.Tick += new EventHandler(dt_Tick);
    dt.Start();
}

private void dt_Tick(object sender, EventArgs e)
{
    Handle_timer_tick();                        // general per-tick game handling
    DoubleAnimation Time_edit = new DoubleAnimation();
    Time_edit.Duration = TimeSpan.Parse("0:0:0.1");
    Time_edit.To = time * 20;                   // 'time' tracks game progress
    Time.BeginAnimation(Canvas.WidthProperty, Time_edit); // 'Time' is the on-screen timer bar
}

Figure 14 - Procedural animation

The procedural code (written in C#) shown in Figure 14 illustrates a few important concepts of procedural animation. On the page's initialization, a dispatch timer 'dt' is created which, on each tick, executes the function 'dt_Tick'; the timer then automatically resets to trip in another 100 milliseconds. Within the tick handler, the 100 millisecond time tick is handled, after which an animation is triggered on the interface object named 'Time'. The animation is applied to the object's width, causing it in this case to grow toward a final value as the timer progresses.

In addition to animating and data-binding, the interface is required to navigate between pages.

The application has methods to navigate procedurally to alternative pages within the interface. This

allows the application to trigger page navigations after certain amounts of time, or trigger them off of

other events such as a button click.

NavigationService.Navigate(new Splash());

Figure 15 - WPF navigation

Figure 15 shows how easy it is to initiate a transition from one page to another – in this case, to the 'Splash' page. After achieving navigation and the other elements desirable within an interface, we set out to create a framework for our application. A diagram of the application is shown here:


[Flow diagram: from Application Start the user centers the display, views the Splash Screen, and completes Calibration before reaching the Main Interface. From the Main Interface the user can reach the Click Threshold interface, Click Delay interface, Diagnostics, Game (look-a-mole), Project Details, Reaction Time Testing, and Performance Testing pages, or Minimize and Exit (Application End).]

Figure 16 - Interface structure

The User Interface lends itself to being broken up into separate pages, as shown in Figure 16. When the user starts the application, a screen asking the user to center the display is shown. Once the user acknowledges that the display is centered, a splash screen appears and the system progresses to the calibration interface. After calibration is complete, the user is taken to the central main interface, from which the user can choose which page to display. These pages have different functionalities, which are described in detail in the following sections.

Main Page

After initializing the application and calibrating the Image Processing Application, the

user is presented with the main section of the interface. An image of this interface is shown

here:


Figure 17 - Main Page

As shown in the previous figure, the page is relatively simple: selectable areas navigate the user to various pages within the application. The placement of elements on this page is of key importance. After calibration the user is brought here, and most people begin reading from the top of the list downward; if the calibration button were first in the list, a user who hovered on it too long while reading would be navigated back to the calibration interface and forced to re-calibrate the system. This was undesirable, as calibration is a relatively time-consuming procedure. Similarly, the 'exit application' button was initially placed in the top right corner of the screen. While that would have been an obvious interface design choice, errors in the calibration system made it likely that users would have a difficult time selecting buttons in the corners of the screen. To combat this, buttons were enlarged and moved to sections of the screen less prone to cursor positioning errors.

Calibration

One of the key reasons to use an interface such as this is the ability to calibrate the system to the user's eye. As previously described, every user has a slightly different eye geometry, which requires the system to be calibrated. To achieve this, the Image Processing Application needs to know when the user is looking at a specific point on the screen. The calibration interface asks the user to look at a point on the screen – a light circle on a dark background. When the user's gaze direction stabilizes, we assume that the user is looking at the point, and a click is initiated. The click from the Image Processing Application triggers an action within the interface which sends a command back over the UDP communication channel, containing the location of the user's gaze target on the screen. This process is repeated for 81 points – a 9 x 9 grid covering the whole screen.


After the user completes all 81 points, a command is sent to the Image Processing Application stating that calibration is finished and that all points have been transmitted. At that point the Image Processing Application calculates all of the constants required by the point transformation algorithm, while the User Interface concurrently navigates to the main page so that the user can begin using the system.
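The exact message format is internal to the two applications, but the interface side of this exchange reduces to a small UDP sender. The following is a minimal C# sketch; the command strings 'cal_point' and 'cal_done' are hypothetical placeholders, while port 1200 matches the Image Processing Application's UDP server.

using System.Net.Sockets;
using System.Text;

// Sketch of the calibration exchange. The command strings are
// hypothetical; port 1200 is the Image Processing Application's
// UDP server port.
class CalibrationReporter
{
    private readonly UdpClient udp = new UdpClient();

    public CalibrationReporter()
    {
        udp.Connect("localhost", 1200);
    }

    private void Send(string msg)
    {
        byte[] data = Encoding.ASCII.GetBytes(msg);
        udp.Send(data, data.Length);
    }

    // Called once per calibration point, after the user's gaze stabilizes
    // on the displayed circle and the resulting click is received.
    public void ReportPoint(int screenX, int screenY)
    {
        Send("cal_point " + screenX + " " + screenY);
    }

    // Called after all 81 points have been transmitted.
    public void Finish()
    {
        Send("cal_done");
    }
}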

System Testing Interface

When demonstrating the system, a commonly used page was the system testing interface. This interface shows a detailed grid of rectangles representing small regions of the screen. Each rectangle is configured to change color whenever the cursor is positioned over it, allowing the user to quickly test how well the system works. Shown here is a screen capture of this interface while in use:

Figure 18 - System test interface
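In WPF, this behavior amounts to attaching a MouseEnter handler to each rectangle in the grid. A minimal sketch follows; the grid dimensions and colors are assumed rather than taken from the actual implementation.

using System.Windows;
using System.Windows.Controls.Primitives;
using System.Windows.Media;
using System.Windows.Shapes;

// Sketch of the system test grid: each cell changes color when the
// gaze-driven cursor enters it (dimensions and colors assumed).
static class TestGridBuilder
{
    public static void Build(UniformGrid grid, int rows, int cols)
    {
        grid.Rows = rows;
        grid.Columns = cols;
        for (int i = 0; i < rows * cols; i++)
        {
            Rectangle cell = new Rectangle
            {
                Fill = Brushes.DarkGray,
                Margin = new Thickness(1)
            };
            cell.MouseEnter += (s, e) => ((Rectangle)s).Fill = Brushes.LimeGreen;
            grid.Children.Add(cell);
        }
    }
}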

Timing Collector

One of the goals of this project was to enhance human-computer interaction in terms of the speed with which a person can position a cursor over an object on the screen. A routine was created within the gaze tracking interface to test users' reaction times. The user is asked to position the cursor over a target on the screen, and the time from the target's appearance to the completion of cursor positioning is recorded in a log file. A similar application was created for use with a mouse and track pad, which is described later.


Figure 19 - Timing collector interface
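The measurement itself is simple; a minimal sketch, assuming a one-value-per-line log format, is shown below.

using System.Diagnostics;
using System.IO;

// Sketch of the reaction-time measurement: the elapsed time from a
// target appearing to the cursor reaching it, appended to a log file
// (the log format here is assumed).
class ReactionTimer
{
    private readonly Stopwatch watch = new Stopwatch();

    // Called when a new target is drawn on the screen.
    public void TargetShown()
    {
        watch.Reset();
        watch.Start();
    }

    // Called when the cursor arrives over the target.
    public void TargetReached(string logFile)
    {
        watch.Stop();
        File.AppendAllText(logFile, watch.ElapsedMilliseconds + "\r\n");
    }
}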

The log files were processed to find the median time required to position the cursor over an object in the interface. Our results showed that cursor positioning times for the gaze tracking based system were superior to those of mouse or track pad based positioning. The collected timings are shown in Figure 10 on page 23.

Centering Interface

A key element of usage was the ability to see all portions of the display. When wearing the HMD, seeing the entire screen is essential to calibrating the system properly, so the interface requests that the user center the display until each corner is clearly visible. A sample of this interface is shown here:


Figure 20 - Centering interface

At this point the Image Processing Application would not produce reliable results: upon initialization the system has not yet been calibrated to the user, so any mouse clicking via the Image Processing Application would be unreliable. To circumvent this, the design team implemented global key hooks. This is not supported natively within the .NET 3.5 framework; however, with additional classes the same end result is possible. After initialization, the application processes key hooks, allowing various keys to produce various results. For this screen, the only necessary keystroke is the spacebar. The processing for this keystroke is shown here:

public void MyKeyPress(object sender, System.Windows.Forms.KeyPressEventArgs e)
{
    if (e.KeyChar.ToString() == " ")  // spacebar
    {
        actHook.Stop();               // release the global key hook
        NavigationService.Navigate(new Splash());
    }
}

Figure 21 - Global key hook in C#

As is shown in Figure 21, when a key is pressed the ‘MyKeyPress’ function is activated.

Within this function, the key press is analyzed on a character basis. When a match is found, the

system responds accordingly. In this case, when the spacebar is pressed, the global hook is de-

initialized and the interface navigates to the ‘Splash’ page.

Other Pages

When creating the interface, the design team required other functions along with the ones previously mentioned. A quick game called look-a-mole – a derivation of whack-a-mole – was implemented, wherein the user is asked to look at a cartoon mole to make it disappear. This provides visual training for users and allows them to use the interface more efficiently. While developing this functionality it was easy for users to play with a mouse or track pad, but playing becomes much more difficult with the gaze based system.

In addition to the simple game interface, the design team created a few diagnostics-oriented pages within the User Interface, with functions to control the performance of the Image Processing Application over the UDP communication channel. For instance, the design team experimented with various values of click delay frames. An interface was created to show the current click delay, allow the user to test clicking with that value, and allow the user to increase or decrease the current value of the click delay variable.

Similar to the click delay interface, a click threshold page was created. The click threshold controls the size of the area on the screen within which the user's gaze must remain in order to initiate a click. The click threshold page has features similar to those of the click delay page – displaying the current value of the click threshold variable, allowing the user to increase or decrease it, and providing a section to test the settings.

Along with these two pages for editing operational parameters of the Image Processing Application, the design team created a diagnostics page that displays various diagnostic variables from within the Image Processing Application, such as calibration coefficients, image recognition parameters, and the current eye and pixel location.

Additional Applications

Various additional applications were created to allow the design team to collect data, analyze results, and issue commands over the UDP communication channel to the Image Processing Application.

Timing Collector

As described in the timing collector section of the gaze tracking interface, an additional application was created which functions in a similar fashion. This application was coded in C#, like the User Interface; in actuality, the two applications practically share the same code base. Slight variations, such as the name of the log file, allow multiple log files to be created and analyzed.


Figure 22 - Reaction time testing screenshot

When the user starts the Timing Collector application, they are brought to the screen shown in Figure 22. After the user clicks the button at the bottom of the screen, they are taken to the testing screen, which has four modes of interaction. The first mode is 'mouse over – known previous position', in which the user is asked to position the mouse over a square that jumps around the screen; this mode repeats for 25 locations. In the second mode the user is asked to click on the squares. In the third mode, the system randomly places the cursor at one of the corners of the window after the user places the mouse over each square. The fourth mode does the same, with the user again requested to click on the squares. This application was the original framework for the timing collector interface used in the eye based interface.

One of the original assumptions was that users interacting by means of a mouse would have different reaction times than those using a track pad or the designed eye based system. This application recorded those times, which were later analyzed by a Perl script.

UDP Client

In the initial stages of development of the UDP communication channel, the design team created a UDP client application. Upon initialization, this application connects to the Image Processing Application's UDP communication thread. It was written in C# and contains UDP communication code similar to that found in the User Interface application. The UDP client functions entirely as a command line interface, allowing the user to enter commands which are sent to the Image Processing Application. In the early development stages this was an important part of correctly configuring the UDP communication thread within the Image Processing Application, as it allowed the design team to debug the program's inputs and outputs.


Figure 23 - UDP Client screenshot
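At its core, such a client is a loop that reads a line from the console and forwards it as a UDP datagram. A minimal sketch follows; the 'mouse_on' and 'mouse_off' command names appear in the Control Panel discussion below, and port 1200 is the Image Processing Application's UDP port.

using System;
using System.Net.Sockets;
using System.Text;

// Minimal command-line UDP client sketch: forwards typed commands
// (e.g. "mouse_on", "mouse_off") to the Image Processing Application.
class UdpCommandClient
{
    static void Main(string[] args)
    {
        // Default to the local machine; the Control Panel variant
        // described below instead accepts a remote IP address.
        string host = (args.Length > 0) ? args[0] : "localhost";
        UdpClient udp = new UdpClient();
        udp.Connect(host, 1200);

        string line;
        while ((line = Console.ReadLine()) != null)
        {
            byte[] data = Encoding.ASCII.GetBytes(line);
            udp.Send(data, data.Length);
        }
        udp.Close();
    }
}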

Control Panel

In a very similar fashion to the UDP client discussed above, a Control Panel was created for sending commands to the Image Processing Application. There are two major differences between the Control Panel and the UDP client: the ability to connect to a remote computer, and the user interface.

In the UDP client application, all commands were sent to 'localhost' over a specified port (1200). While this was useful for testing application inputs and outputs, it meant that commands could only be sent – and results received – on the same computer. The Control Panel application allows connecting to an Image Processing Application running on a remote computer, provided that the user knows the remote computer's IP address.

Additionally, whereas the UDP client was limited to a command line style interface, the Control Panel provides a graphical user interface, allowing quick and easy use. For instance, when using the system there is a possibility of errors in the calibration routine, which typically manifest as the cursor moving to locations the user does not intend. Since the gaze tracking system is an absolute cursor positioning system – compared to a mouse, which is a relative system – it quickly overrides any attempt by the user to correct it with the mouse, which typically becomes very frustrating. To combat this, a separate computer is used to issue commands such as 'mouse_on' and 'mouse_off', which control whether the Image Processing Application controls the cursor. Presenting this in a graphical user interface greatly reduces the time it takes to de-activate mouse control on the system running the gaze tracker. A sample of this interface is shown below.


Figure 24 - Control Panel screenshot

In addition to providing a command line within the interface and the general system control buttons, a 'Gather Calibration Details' function has been implemented. This function retrieves details from the Image Processing Application about the points used in the latest calibration, as well as the calibration coefficients. Upon gathering this information, the rectangular section on the right side of the interface displays a color gradient based on the error levels in the latest calibration, allowing users to see error levels within the system on the fly.

Perl Scripts

In addition to compiled applications, two Perl scripts were created to aid in the analysis of the log files produced by the applications already discussed. Perl scripts generally require the Perl interpreter in order to be executed; however, the design team compiled the scripts into standalone executables so that the interpreter is not necessary to run them.

Timing Collector Analyzer

The timing collector application – in both its standalone form and as a page within the User Interface – creates a log file listing the times it has taken users to position the cursor over an object. The Timing Collector Analyzer script reads these log files and finds the median for each mode of operation, entirely via the command line. The results of this script were used to produce the values in Figure 10.
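The median step itself is small; for consistency with the other examples in this report it is sketched here in C#, though the project's actual analyzer is the Perl script TCA.pl in Appendix A.

using System.Collections.Generic;

// Median of the recorded positioning times for one mode of operation
// (sketch only; the project's analyzer is TCA.pl in Appendix A).
static class TimingStats
{
    public static double Median(List<double> times)
    {
        times.Sort();
        int n = times.Count;
        return (n % 2 == 1)
            ? times[n / 2]
            : (times[n / 2 - 1] + times[n / 2]) / 2.0;
    }
}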

Calibration Analyzer

During the calibration routine, the Image Processing Application saves off a point log, along with the calibration constants determined by the calibration routine. The script processes the point log and outputs a re-formatted set of tables, including error values for both the X and Y directions as well as an overall error table. As with the Timing Collector Analyzer, this is all done via the command line.


MATLAB

Along with the aforementioned code, MATLAB scripts were created for analyzing error

in a similar fashion to the Perl scripts.

Find_coefs

This MATLAB script takes a stored array of points corresponding to those gathered by the calibration routine and performs matrix math on them to produce calibration coefficients. Coefficients of various degrees are found, from 1st order to 5th order, and are saved in the MATLAB workspace so that they can be processed by the next script.
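For the first-order case, this matrix math is the same least-squares regression implemented by IRALAR_Cal.cpp in Appendix A: each calibration sample contributes a row (1, e_x, e_y) to a matrix X, where (e_x, e_y) is the detected eye center, and the known screen coordinates are regressed against it. In LaTeX notation:

\mathbf{a} = (X^{\mathsf{T}}X)^{-1}X^{\mathsf{T}}\mathbf{p}_x,
\qquad
\mathbf{b} = (X^{\mathsf{T}}X)^{-1}X^{\mathsf{T}}\mathbf{p}_y

where \mathbf{p}_x and \mathbf{p}_y hold the screen coordinates of the calibration points. The run-time point transformation is then

x_{screen} = a_0 + a_1 e_x + a_2 e_y,
\qquad
y_{screen} = b_0 + b_1 e_x + b_2 e_y

The higher-order fits extend the rows of X with higher-degree terms in e_x and e_y in the same manner.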

Error_Graphs

This MATLAB script processes the points and calibration constants determined by the Find_coefs script. Various error values are determined for a given calibration order, and error plots – both 3-dimensional and contour – are generated and saved to the current working directory. The figures generated by this script are shown in Figure 7.

Conference Submissions

Along with developing the system described in this report, the design team compiled several research papers. Papers were submitted to the IEEE HSI conference (found in Appendix B) and the RESNA student design competition (found in Appendix C). The paper submitted to the HSI conference has been accepted for publication in the conference proceedings in 2008 [23]. The paper submitted to the RESNA competition had not yet been accepted at the time of writing this document. Please refer to the appendices for full copies of these conference submissions.

Results

Goal Completion

The goals that we set out to achieve at the inception of the project were split into two main categories: core goals and additional goals.

Core Goals

Core goals represent features that were essential to the operation of the project. All

core goals were achieved.


Image Processing Application

Having never created an application that interfaces with external image capturing devices, the design team knew that creating an Image Processing Application able to interface with such a device would be essential to the completion of the project. Beyond connecting to a camera and obtaining image data, the application needed to perform basic processing on the received image and determine the center of the user's eye within it. The image processing method was not fixed at the outset, as several candidate methods had to be experimented with. The end result is an image processing application which not only does a good job of determining the gaze of the user, but also dynamically adjusts to various users.

Cursor Control

When the design team's goals were being determined, it was not obvious how mouse control would be achieved. It later turned out to be a trivial matter of interfacing with Win32, and this objective was achieved early in the development cycle.
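The project's actual cursor control code is the C++ IRALAR_mouse.cpp listed in Appendix A; purely for illustration, a minimal C# equivalent of the positioning step via P/Invoke would look like this.

using System.Runtime.InteropServices;

// Sketch of Win32 cursor positioning (the project's real code is the
// C++ IRALAR_mouse.cpp; this is an illustrative C# equivalent).
static class Win32Cursor
{
    [DllImport("user32.dll")]
    private static extern bool SetCursorPos(int x, int y);

    // Move the cursor to an absolute screen coordinate.
    public static void MoveTo(int x, int y)
    {
        SetCursorPos(x, y);
    }
}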

Display Output

When the design team started working on the project, it was not clear how display output would be achieved. It was known that a display would be necessary for the user interface, which allows for calibration and creates the user experience. In the end this goal was achieved using the LitEye LE-500 monocular head mounted display.

Visually Controlled Demo Application

The design team originally set out to create an interface that would display feedback to the user. As previously discussed, several design approaches were taken to produce the end result: a fully functional visually controlled demo application. This goal, along with the other core goals of the project, was a success.

Additional Goals

The additional goals determined at the project's start were directions the design team thought would be good for the project, with the full realization that they were secondary. Throughout the design process some of these goals were abandoned, while others proved more critical than previously expected.

Transparent Display

In order to create a full heads-up display (HUD), the design team decided that the best case scenario would include a transparent display, allowing data to be overlaid on the user's normal vision. For instance, with the addition of a forward facing camera the system could recognize not only where the user was looking on the screen, but also what they were looking at in their field of vision. This could allow for information overlay, such as a person's name when the user looks at someone. While design time did not permit the development of a HUD, the transparent display was purchased, allowing future teams to continue work in this area.

Portability

A great deal of importance was placed on the portability of the system. Initially, the design team had expected to use an ASUS EeePC as the mobile platform. The functionality on the EeePC was not fully tested, as this computer was generally used as the system controller during demonstrations. However, the design team feels that the laptop used for the project was quite portable, and that this objective was achieved.

Forward Looking Camera

Initially, it was decided that a forward looking camera would be a novel addition to the system, providing the true ability to create an augmented reality system: it would identify what the user is looking at in their field of vision, as well as where they are looking on the screen. Due to time constraints, this feature was abandoned.

Automated Calibration Process

An automated calibration process was not initially expected to be an integral part of this project; however, it was quickly determined to be crucial to the project's success. By designing the calibration screen within the user interface, this goal was easily achieved.

Network Integration

This goal was based on the creation of a Heads-Up Display (HUD), and specifically its interface: information would be displayed to the user in various forms, such as an RSS feed. This was later determined to be outside the scope of the project and was abandoned.

Other types of network integration, such as the creation of a control panel to allow live control of the recognition system, became important. Network connectivity between two computers was achieved by means of a UDP communication channel. This sub-portion of the Network Integration goal was achieved late in the design process, as the design team did not initially realize it would be an integral part of the system.


Additional Sensory Output to User

The design team thought that additional sensory output – such as audio – would be a good addition to the system. However, this proved unnecessary for the functionality of an augmented reality system, was outside the scope of the project, and was abandoned.

Functionality

In a similar fashion to the goals previously described, various levels of functionality were set out at the beginning of the design process. In a research based project such as this, there was no clear target for how well the system would need to perform in order to be usable. Some of the functionality requirements were easily attainable, while others were set too high. Functionality is herein broken into two sections: hardware and software.

Hardware

Hardware requirements generally covered the system components needed for the software end of the system to run accurately and reliably. The original functionality requirements considered three major topics: Mobility, Camera, and Connectivity.

Mobility

The specifications of this project asked for the system to be portable. The final manifestation of the project runs on a laptop computer, which can be placed in a backpack while the system is worn on the user's head. This mobility functionality was achieved.

Camera

The specifications for the camera originally called for a resolution of 320 x 240. Ultimately, the camera used had a default resolution of 2 MP (1200 x 1600), with the ability to up-sample to 8 MP. The camera was set to VGA (640 x 480) to decrease the time required for a frame capture, and it is positioned to cause minimal visual interference when used with the system.

Connectivity

The original specifications for the project included the connector types that would be used: VGA and USB. These choices were made due to their widespread use on laptop computers; other options, such as FireWire or S-Video, would have severely limited the number of computer systems readily able to use this project.

Software

With the hardware requirements fulfilled, software could be developed that utilized the data provided by the camera over those connections. The major software requirements can be broken down into four sections: Image Rejection, Frame Rate, Accuracy, and Actions.

Image Rejection

Images in which the pupil is not found are rejected. Initially this was thought to be a difficult task; in practice, once the blob detection method was chosen, it became quite easy to accomplish. If the blob detection algorithm does not return any results, it is assumed that no pupil was recognized. In this event, the threshold parameters are adjusted and another image is captured from the camera.
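The control flow amounts to a reject-and-retry loop around the detector. A sketch of that flow is shown below, with the capture and detection steps stubbed out; the real implementation is the OpenCV based C++ code in Appendix A.

// Control-flow sketch of image rejection; capture and detection are
// stubs here (the real implementation is the C++/OpenCV code in Appendix A).
class RejectionLoop
{
    private int contrastLevel = 40;   // initial pupil threshold, per IRALAR_IPA.cpp

    private byte[] CaptureFrame() { return new byte[640 * 480]; }    // stub
    private PupilCenter DetectPupil(byte[] frame) { return null; }   // stub

    public void ProcessOneFrame()
    {
        PupilCenter pupil = DetectPupil(CaptureFrame());
        if (pupil == null)
        {
            contrastLevel += 5;   // no pupil found: adjust threshold (step size assumed)
            return;               // reject this frame; the caller grabs another
        }
        // Otherwise, transform the center to screen coordinates
        // and update the cursor position.
    }
}

// Simple holder for a detected pupil center.
class PupilCenter { public int X, Y; }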

Free-running Frame Rate

When working with image based systems, where there is a great deal of information to be processed, it is important for a live system that the frame rate be acceptable for use. The initial design called for a minimum of 2 frames per second – a frame every 500 milliseconds. The design team recognized this as an acceptable minimum but set a more ambitious goal of 10 frames per second; the maximum frame rate, as determined by the camera, would be 30 frames per second.

In actual use and development of the system, the computational frame rate was clocked at approximately 45 frames per second. This test was performed with images that had previously been recorded and stored on the hard drive. In free-running mode, the system runs between 5 and 8 frames per second, depending on the image being analyzed.

Cursor Positioning Accuracy

Research has shown that the human eye can be placed accurately within one degree of the desired location, which adds an inherent error to the cursor positioning accuracy of the system. Along with this positioning error there is another inherent error source: saccadic eye movements. Saccades are constantly present in the human eye, caused by the brain continually moving the eye to fill in the rest of the image being observed by the user. Saccadic eye movements generally occur within the same one degree field as the one degree placement accuracy.

These two factors led the design team to select an error goal of 2% of the screen, hoping to remain close to the one degree error field. This goal was met using the first order curve fit described previously in the cursor data section on page 16: the achieved error was 1.18% of the screen in the horizontal direction and 1.46% in the vertical direction.


Initiating an Action

At the inception of the project, the design team faced the key issue of developing a fully functional computer interface. Any conventional interaction system includes the ability to place a cursor over an object as well as to perform an action on that object (such as clicking). While multiple design approaches were tried, the final product uses a method known as 'dwell-time', which achieves the intended functionality of allowing a user to 'click' on an object.
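In essence, a dwell-time click fires once the gaze has stayed within a small region for a set number of consecutive frames. A minimal sketch follows; the parameter names and default values (ClickThreshold = 20, ClickFrames = 5) come from the Image Processing Application in Appendix A, while the Euclidean distance measure shown is simply one plausible choice.

using System;

// Dwell-time click sketch: fire a click once the gaze has stayed within
// 'clickThreshold' pixels of itself for 'clickFrames' consecutive frames.
// Default values match the Image Processing Application (Appendix A).
class DwellClicker
{
    private readonly double clickThreshold = 20;  // max frame-to-frame movement, pixels
    private readonly int clickFrames = 5;         // consecutive stable frames required
    private int stableCount;
    private int lastX, lastY;

    // Call once per processed frame with the current cursor position;
    // returns true when the caller should issue a click.
    public bool Update(int x, int y)
    {
        double moved = Math.Sqrt((double)(x - lastX) * (x - lastX) +
                                 (double)(y - lastY) * (y - lastY));
        lastX = x;
        lastY = y;

        stableCount = (moved < clickThreshold) ? stableCount + 1 : 0;
        if (stableCount >= clickFrames)
        {
            stableCount = 0;   // require a fresh dwell before the next click
            return true;
        }
        return false;
    }
}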

Conclusion

The gaze tracking system described herein has proven to be a most interesting project and has shown merit in terms of field applications. Research into accurate gaze tracking has only become feasible in the past few years, owing to the pricing of acceptable cameras. Moreover, most of the research done in this field has been devoted solely to tracking the user's eye, neglecting to give anything back to the user via a display. The design team feels that the advancements made, specifically in accurate pupil recognition and transformation, have been beneficial both for the field of research and for the supporting institution.


Works Cited

[1] Duchowski, A., "A breadth-first survey of eye-tracking applications." Behavior Research Methods, Instruments and Computers, 2002, Vol. 34, Issue 4, pp. 455-470.

[2] Jacob, R., "The use of eye movements in human-computer interaction techniques: what you look at is what you get." ACM Transactions on Information Systems, 1991, Vol. 9, Issue 2, pp. 152-169.

[3] Majaranta, P., Raiha, K., "Twenty years of eye typing: systems and design issues." Proceedings of the Symposium on Eye Tracking Research and Applications, 2002, pp. 15-22.

[4] Hornof, A. J., Cavender, A., Hoselton, R., "EyeDraw: A system for drawing pictures with eye movements." ACM SIGACCESS Conference on Computers and Accessibility, 2004, pp. 86-93.

[5] Silbert, L., Jacob, R., "Evaluation of eye gaze interaction." Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2000, pp. 281-288.

[6] Tanriverdi, V., Jacob, R., "Interacting with eye movements in virtual environments." Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2000, pp. 265-272.

[7] Young, L., Sheena, D., "Survey of eye movement recording methods." Behavior Research Methods and Instrumentation, 1975, Vol. 7, pp. 397-429.

[8] Babcock, J., Pelz, J., "Building a lightweight eyetracking headgear." Eye Tracking Research and Applications Symposium, 2004, pp. 109-114.

[9] Pelz, J., Canosa, R., Babcock, J., Kucharczyk, D., Silver, A., Konno, D., "Portable eyetracking: A study of natural eye movements." Proceedings of the SPIE, Human Vision and Electronic Imaging, 2000, pp. 566-582.

[10] Li, Dongheng, Babcock, Jason, and Parkhurst, Derrick J., "openEyes: a low-cost head-mounted eye-tracking solution." ACM Eye Tracking Research and Applications Symposium, 2006.

[11] Taboada, John. Eye tracking system and method. US Patent 5345281, 1992.

[12] Ryan, Mathew David. Eye tracking using image data. US Patent 7197165, 2007.

[13] Richardson, Jim. Eye controllable screen pointer. US Patent 6373961, 2002.

[14] Mostrom, Richard N. Head mounted displays. US Patent 3923370, 1974.

[15] Lemelson, Jerome H. Selectively controllable heads-up display system. US Patent 6847336, 2005.

[16] Amafuji, Hisashi. Head-mounted display device. US Patent 6359602, 2002.

[17] Kim, Juno, Schiemer, Greg, and Narushima, Terumi, "Oculog: Playing with Eye Movements." Conference on New Interfaces for Musical Expression, New York, USA, 2007.

[18] Bornet, Oliver. OpenCV. [Online] 2008. [Cited: May 7, 2008.] http://opencvlibrary.sourceforge.net/.

[19] Tobii Technology AB. Tobii. [Online] 2007. http://www.tobii.com/.

[20] Seeing Machines. faceLAB Version 4. [Online] 2007. http://www.seeingmachines.com/facelab.htm.

[21] Seeing Machines. Driver State Sensor. [Online] 2007. http://www.seeingmachines.com/DSS.html.

[22] SR Research Ltd. SR Research EyeLink | Fast, Accurate, Reliable Eye Tracking. [Online] 2007. http://www.sr-research.com.

[23] Heidenburg, B., et al., "Data Mining for Gaze Tracking System." Proceedings of Conference on Human System Interaction, 2008.

[24] Laramee, Robert S. and Ware, Colin. Visual Interference with a Transparent Head Mounted Display.

[25] Heidenburg, B., Lenisa, M., Wentzel, D., Malinowski, A. IRALAR :: Gaze Tracking System. [Online] [Cited: May 7, 2008.] http://cegt201.bradley.edu/projects/proj2008/iralar/.


Appendix A: Source Code

EE 452 – Senior Capstone Project
Gaze Tracking System

Breanna Heidenburg
Michael Lenisa
Daniel Wentzel

5/13/2008


Table of Contents

Document Style 48

Image Processing Application 48

IRALAR_CAL.h 48

IRALAR_IPA.h 48

IRALAR_mouse.h 48

IRALAR_UDP.h 48

stdafx.h 48

targetver.h 49

IRALAR_Cal.cpp 49

IRALAR_IPA.cpp 51

IRALAR_mouse.cpp 62

IRALAR_UDP.cpp 64

stdafx.cpp 68

User Interface 68

App 68

App.xaml.cs 68

App.xaml 69

Calibration 69

Screenshot 69

Calibration.xaml.cs 69

Calibration.xaml 73

Center 74

Screenshot 74

Center.xaml.cs 74

Center.xaml 75

ClickDelay 75

ClickDelay.xaml.cs 75

ClickDelay.xaml 77

ClickThresh 79

ClickThresh.xaml.cs 79

clickDelay.xaml 82

Diagnostics 84

Diagnostics.xaml.cs 84

Diagnostics.xaml 87

MainPage 89

Screenshot 89

MainPage.xaml.cs 89

MainPage.xaml 94

Splash 96

Splash.xaml.cs 96

Splash.xaml 97

UI_About 98

Screenshot 98

UI_About.xaml.cs 98

UI_About.xaml 99

UI_Mole 100

Screenshot 100

UI_Mole.xaml.cs 100

UI_Mole.xaml 105

UI_TC1 107

Screenshot 107

UI_TC1.xaml.cs 107


UI_TC1.xaml 108

UI_TC2 109

UI_TC2.xaml.cs 109

UI_TC2.xaml 112

UI_TC3 112

UI_TC3.xaml.cs 112

UI_TC3.xaml 113

UI_Test 113

Screenshot 113

UI_Test.xaml.cs 114

UI_Test.xaml 114

UserActivityHook.cs 115

Timing Collector 121

App 121

App.xaml.cs 121

App.xaml 121

Page1 122

Page1.xaml.cs 122

Page1.xaml 125

Page2 125

Screenshot 125

Page2.xaml.cs 125

Page2.xaml 126

Page3 127

Page3.xaml.cs 127

Page3.xaml 128

Control Panel 128

App 128

App.xaml.cs 128

App.xaml 129

Dialog 129

Dialog.xaml.cs 129

Dialog.xaml 130

Window1 131

Screenshot 131

Window1.xaml.cs 131

Window1.xaml 137

UDP Client 140

Program.cs 140

MATLAB 141

Find_coefs.m 141

Error_Graphs.m 143

Perl 147

TCA.pl (Timing Collector Analyzer) 147

CAL.pl (Calibration Analyzer) 148


Document Style

In this Appendix, various styles will be used to denote what type of code is being shown. The

following highlighting syntax is used:

C++

C#

XAML

MATLAB

Perl

Image Processing Application

IRALAR_CAL.h

void calibrate(const int size, int eyex[],int eyey[],int pixelx[],int pixely[]);

IRALAR_IPA.h

extern int centerY, centerX;

extern double coefy[3];

extern double coefx[3];

IRALAR_mouse.h

// IRALAR_mouse.h - Function prototypes.

void gtsCheckClick (int x, int y, bool click );

void gtsMouseMove (int x, int y );

void gtsLeftClick ( );

void gtsRightClick ( );

void gtsInitializeCal();

IRALAR_UDP.h

// IRALAR_UDP.h - Function prototype.

void __cdecl UDP_SERVICE_CLIENT(SOCKET sd,struct sockaddr_in cad);

stdafx.h

// stdafx.h : include file for standard system include files,

// or project specific include files that are used frequently, but

// are changed infrequently

//

#pragma once

#include "targetver.h"


#include <stdio.h>

#include <tchar.h>

// TODO: reference additional headers your program requires here

targetver.h

#pragma once

// The following macros define the minimum required platform. The minimum required platform

// is the earliest version of Windows, Internet Explorer etc. that has the necessary features to run

// your application. The macros work by enabling all features available on platform versions up to and

// including the version specified.

// Modify the following defines if you have to target a platform prior to the ones specified below.

// Refer to MSDN for the latest info on corresponding values for different platforms.

#ifndef _WIN32_WINNT // Specifies that the minimum required platform is Windows Vista.

#define _WIN32_WINNT 0x0600 // Change this to the appropriate value to target other versions of Windows.

#endif

IRALAR_Cal.cpp

/*

Bree Heidenburg

Senior Project

Calibration Procedure

1/31/08

I'd like to thank Math 326 and Dr. Quigg. This procedure uses multiple variable

statistical linear regression to essentially map the coordinates of center of

the eye to a horizontal location and a vertical location on the screen.

*/

#include "stdafx.h"

#include <iostream>

#include <windows.h>

#include <cmath>

#include "IRALAR_Cal.h"

#include "IRALAR_IPA.h"

#include <fstream>

using namespace std;

void calibrate(const int size, int eyex[],int eyey[],int pixelx[],int pixely[])

{

CRITICAL_SECTION cs2;

InitializeCriticalSection(&cs2);

int x[500][3];

int xtx[3][3];

double xtxinv[3][3];

double xtxinvxt[500][3];

double det;

bool logging=true;

ofstream calibrationLog;

calibrationLog.open("CalibrationLog.csv");

if(calibrationLog.fail())

{

logging=false;

cerr<<"Error: Could not open calibration log file. Logging is disabled.";

}

//Create array x from eyex and eyey

cout << "X Matrix: (" << size << ")" <<endl;

if(logging){calibrationLog<<"eyex,eyey,pupilx,pupily"<<endl;}

for(int j=0; j<=size; j++)

{

x[j][0]=1;

x[j][1]=eyex[j];

x[j][2]=eyey[j];


cout << j << "\t" << x[j][0] << "\t" << x[j][1] << "\t" << x[j][2] << endl;

if(logging){calibrationLog<<eyex[j]<<","<<eyey[j]<<","<<pixelx[j]<<","<<pixely[j]<<endl;}

}

if(logging){calibrationLog<<endl;}

cout << "\n";

//Initialize xtx array to 0

for(int w=0; w<3; w++)

{

xtx[w][0]=0;

xtx[w][1]=0;

xtx[w][2]=0;

}

//Calculate x transpose * x

for(int q=0; q<=size; q++)

{

xtx[0][0]=x[q][0]*x[q][0] + xtx[0][0];

xtx[0][1]=x[q][0]*x[q][1] + xtx[0][1];

xtx[0][2]=x[q][0]*x[q][2] + xtx[0][2];

xtx[1][0]=x[q][1]*x[q][0] + xtx[1][0];

xtx[1][1]=x[q][1]*x[q][1] + xtx[1][1];

xtx[1][2]=x[q][1]*x[q][2] + xtx[1][2];

xtx[2][0]=x[q][2]*x[q][0] + xtx[2][0];

xtx[2][1]=x[q][2]*x[q][1] + xtx[2][1];

xtx[2][2]=x[q][2]*x[q][2] + xtx[2][2];

}

//Integers aren't big enough

//Also, this makes formulas easier to type

double a = xtx[0][0];

double b = xtx[0][1];

double c = xtx[0][2];

double d = xtx[1][0];

double e = xtx[1][1];

double f = xtx[1][2];

double g = xtx[2][0];

double h = xtx[2][1];

double i = xtx[2][2];

cout << "X'X" << endl;

cout << a << "\t" << b << "\t" << c << endl;

cout << d << "\t" << e << "\t" << f << endl;

cout << g << "\t" << h << "\t" << i << endl;

//Calculate determinant of xtx

det = a*e*i + b*f*g + c*d*h - g*e*c - h*f*a - i*d*b;

cout << "Determinant = " << det << endl;

if(logging){ calibrationLog<<"Determinant,"<<det<<endl;}

//Compute xtx inverse

xtxinv[0][0] = (e*i-f*h)/det;

xtxinv[1][0] = (f*g-d*i)/det;

xtxinv[2][0] = (d*h-g*e)/det;

xtxinv[0][1] = (c*h-i*b)/det;

xtxinv[1][1] = (a*i-g*c)/det;

xtxinv[2][1] = (b*g-h*a)/det;

xtxinv[0][2] = (b*f-c*e)/det;

xtxinv[1][2] = (c*d-a*f)/det;

xtxinv[2][2] = (a*e-b*d)/det;

//Initialize xtxinvxt to 0

//Compute xtx inverse times x transpose

for(int q=0; q<=size; q++)

{

xtxinvxt[q][0] = xtxinv[0][0]*x[q][0]+ xtxinv[1][0]*x[q][1] + xtxinv[2][0]*x[q][2];

xtxinvxt[q][1] = xtxinv[0][1]*x[q][0]+ xtxinv[1][1]*x[q][1] + xtxinv[2][1]*x[q][2];

xtxinvxt[q][2] = xtxinv[0][2]*x[q][0]+ xtxinv[1][2]*x[q][1] + xtxinv[2][2]*x[q][2];

}

//Initialize coefficient matrices to 0

for(int w=0; w<3; w++)

{

coefx[w]=0;

coefy[w]=0;

}

EnterCriticalSection(&cs2);


//Calculate xtxinvxtpixy (the coefficients for the pixely equation)

for(int q=0; q<=size; q++)

{

coefy[0] = xtxinvxt[q][0]*pixely[q]+ coefy[0];

coefy[1] = xtxinvxt[q][1]*pixely[q]+ coefy[1];

coefy[2] = xtxinvxt[q][2]*pixely[q]+ coefy[2];

}

//Calculate xtxinvxtpixx (the coefficients for the pixelx equation)

for(int q=0; q<=size; q++)

{

coefx[0] = xtxinvxt[q][0]*pixelx[q]+ coefx[0];

coefx[1] = xtxinvxt[q][1]*pixelx[q]+ coefx[1];

coefx[2] = xtxinvxt[q][2]*pixelx[q]+ coefx[2];

}

LeaveCriticalSection(&cs2);

DeleteCriticalSection(&cs2);

cout << "Calibration Complete!" << endl;

cout << "X coefs { " << coefx[0] << " , " << coefx[1] << " , " << coefx[2] << " }" << endl;

cout << "Y coefs { " << coefy[0] << " , " << coefy[1] << " , " << coefy[2] << " }" << endl;

if(logging==true)

{

calibrationLog<<"X coefs { " << coefx[0] << " , " << coefx[1] << " , " << coefx[2] << " }" <<

endl;

calibrationLog<<"Y coefs { " << coefy[0] << " , " << coefy[1] << " , " << coefy[2] << " }" <<

endl;

}

calibrationLog.close();

return;

}

IRALAR_IPA.cpp

// IRALAR_IPA.cpp : Defines the entry point for the console application.

//

#include "stdafx.h"

#include <winsock2.h>

#include <resource.h>

#include <process.h>

#include <windows.h>

#include <fstream>

#include <string>

#include <cstring>

#include <Winuser.h>

#include <cxcore.h>

#include <cv.h>

#include "cvcam.h"

#include <highgui.h>

#include <iostream> // for cout and cin

#include "BlobResult.h"

#include "blob.h"

#include <time.h>

using namespace std;

//Include our external library of functions.

#include "IRALAR_mouse.h"

#include "IRALAR_UDP.h"

struct protoent *ptrp; /* pointer to a protocol table entry */

struct sockaddr_in sad; /* structure to hold server's address */

struct sockaddr_in cad; /* structure to hold client's address */

SOCKET sd; /* socket descriptor - integer */

int port = 1200; /* protocol port number */

int centerY;

int centerX;

int mouseX;

int mouseY;

int radius;

double coefy[3];


double coefx[3];

void gtsBuildLUT(int current_Pupil_level, int pupilRange, uchar LUT[]);

void gtsExtractRGBChannel(IplImage* src, IplImage* dest, int channel);

bool gtsAnalyzeBlob(CBlob Blob);

int gtsInitializeCamera(int width, int height);

int gtsGetBlobCenterX(CBlob Blob);

int gtsGetBlobCenterY(CBlob Blob);

int gtsGetBlobRadius(CBlob Blob);

int gtsAdjustContrastWindow(int current_pupil_Level, int direction);

//Look-up table for contrast stretching

//and contrast stretching parameters

uchar LUT [256];

CvMat* LUT_matrix;

int cPL = 40; //current Pupil Level lower value for image contrast stretching; initial value determined experimentally

int oldCPL = 0; //storage for previous thresholds

int olderCPL = 0;

int oldestCPL = 0;

int pupilRange = 25; //Range for contrast stretching

int pupilMin = 0; //Upper and lower limits of contrast window

int pupilMax = 70;

//Program flow control variables

int num_frames_pupil_adjust = 1; //Number of sequential frames with no detected pupil required before adjusting pupil threshold

int frames_without_pupil = 0; //current number of sequential frames with no detected pupil

int iterations = 0; //number of iterations on current image. Applies only in static mode.

bool pupil_found = true; //Tracks if a pupil was found in the last loop execution

double total_images=0; //Total number of images processed

double identified_images=0; //Total number of images with an identifiable pupil

double frameRate;

//Master Image processing control variable.

//Controlled by UDP communication thread

bool run=true;

bool save=false;

bool click=true;

bool mouseMove=true;

int savedImageFileNumber;

//Camera and image declarations

//Multiple images are needed because many functions cannot process data in place

CvCapture* capture;

IplImage* frame;

IplImage* LUTframe;

IplImage* redframe;

//File I/O declarations

ofstream pupilLog;

ifstream iniFile;

//Blob Discrimination Parameters

double pupilMinSize = 3000; //5000

double pupilMaxSize = 33000; //20000

double minAr = .3;//.7

double maxAr= 1.8;//1.3

double maxPerimeterError = .7;//.35

double maxCompactness = 5;//2.7

//Blob analysis parameters

int blobRadius, MaxX, MinX, MaxY, MinY;

double blobCompactness, blobMean, length, height, ar, perimeter, perimeterError;

//Time storage

clock_t frameStartTime;

clock_t frameEndTime;

clock_t frameTotalTime;

clock_t executionStartTime;

clock_t executionEndTime;

clock_t executionTotalTime;


time_t now;

//Set Default Program Parameters

//These values will be used only if they are not present in

//either the IRALAR.ini settings file or the command line

int imageStartIndex = 0;

int imageEndIndex = 4;

bool useGUI = false;

bool liveCapture = false;// Controls the use of camera vs. static images

bool logging = false; // Controls the logging of image recognition data

bool ExternalCom = true; // Controls the use of the UDP server portion

bool Debug_UDP = false; // controls the output of UDP related information

bool Debug_Image = false;// controls the output of image related debug information

// (threshold up / down, etc...)

extern double ClickThreshold = 20; //Difference in cursor position between frames under which a click will be acknowledged

//used by IRALAR_mouse.cpp

extern int ClickFrames = 5;

int imageNumber = imageStartIndex;

int main(int argc, char *argv[])

{

executionStartTime=clock();

/* Note: None of the debugging statements above the console
allocation will not be output, as there is not yet a console
to write them to. To view these debugging statements,
move the console allocation to the beginning of main()
and re-compile */

//Open Error File and direct stderr to it

freopen("ErrorLog.txt","w",stderr);

//Load user-set configuration defaults from IRALAR.ini

iniFile.open("IRALAR.ini");

if(iniFile.fail())

{

cerr<<"Could not open settings file. Loading defaults.";

}

else

{

cout<<"Loaded From ini:\n";

while(iniFile.peek()!=EOF)

{

char arg[30];

char * value;

char * parm;

iniFile.getline(arg,30);

parm = strtok(arg, "=");

value=strtok(NULL, "=");

if(value!=NULL)

{

if(strcmp(parm, "imageStartIndex")==0)

{

imageStartIndex=atoi(value);

cout<<"Image Start Index: "<<imageStartIndex<<"\n";

}

if(strcmp(parm, "ClickThreshold")==0)

{

ClickThreshold=atof(value);

cout<<"ClickThreshold: "<<ClickThreshold<<"\n";

}

else if(strcmp(parm, "imageEndIndex")==0)

{

imageEndIndex=atoi(value);

cout<<"Image End Index: "<<imageEndIndex<<"\n";

}

else if(strcmp(parm, "liveCapture")==0)

{

if(strcmp(value, "true")==0)

{ liveCapture=true; }

else { liveCapture=false; }


cout<<"Live Capture: "<<liveCapture<<"\n";

}

else if(strcmp(parm, "logging")==0)

{

if(strcmp(value, "true")==0)

{ logging=true;}

else { logging=false; }

cout<<"Logging: "<<logging<<"\n";

}

if(strcmp(parm, "ExternalCom")==0)

{

if(strcmp(value, "true")==0)

{ ExternalCom=true;}

else { ExternalCom=false; }

cout<<"ExternalCom: "<<ExternalCom<<"\n";

}

if(strcmp(parm, "Debug_UDP")==0)

{

if(strcmp(value, "true")==0)

{ Debug_UDP=true;}

else { Debug_UDP=false; }

cout<<"Debug_UDP: "<<Debug_UDP<<"\n";

}

if(strcmp(parm, "Debug_Image")==0)

{

if(strcmp(value, "true")==0)

{ Debug_Image=true;}

else { Debug_Image=false; }

cout<<"Debug_Image: "<<Debug_Image<<"\n";

}

if(strcmp(parm, "useGUI")==0)

{

if(strcmp(value, "true")==0)

{ useGUI=true;}

else { useGUI=false; }

cout<<"UseGUI: "<<useGUI<<"\n";

}

if(strcmp(parm, "ClickFrames")==0)

{

ClickFrames=atoi(value);

cout<<"ClickFrames: "<<ClickFrames<<"\n";

}

}

}

cout<<"\n";

}

if(argc>1)

{

cout<<argc-1<<" Command Line Arguments\n";

for( int i=1; i<argc; i++)

{

cout<<"Argument #"<<i<<": "<<argv[i]<<"\n";

if(strcmp(argv[i], "/logging")==0)

{ logging=true; }

if(strcmp(argv[i], "/nologging")==0)

{ logging=false; }

else if(strcmp(argv[i],"/useGUI")==0)

{ useGUI=true; }

else if(strcmp(argv[i],"/noGUI")==0)

{ useGUI=false; }

else if(strcmp(argv[i],"/noliveCapture")==0)

{ liveCapture=false; }

else if(strcmp(argv[i],"/liveCapture")==0)

{ liveCapture=true; }

else if(strcmp(argv[i], "/Debug_UDP")==0)

{ Debug_UDP=true; }

else if(strcmp(argv[i], "/noDebug_UDP")==0)

{ Debug_UDP=false; }

else if(strcmp(argv[i], "/Debug_Image")==0)

{ Debug_Image=true; }

else if(strcmp(argv[i], "/noDebug_Image")==0)

{ Debug_Image=false; }

else if(strcmp(argv[i], "/index")==0)

{

if(argc<i+3)


{ cout<<"Warn: Image index parameters not provided\n"; }

else

{

imageStartIndex=atoi(argv[i+1]);

imageEndIndex=atoi(argv[i+2]);

}

}

else if(strcmp(argv[i], "/ClickThreshold")==0)

{

if(argc<i+2)

{ cerr<<"Warn: Click Threshold parameters not provided\n"; }

else

{ ClickThreshold=atof(argv[i+1]); }

}

else if(strcmp(argv[i], "/ClickFrames")==0)

{

if(argc<i+2)

{ cerr<<"Warn: Click Frames parameter not provided\n"; }

else

{ ClickFrames=atoi(argv[i+1]); }

}

}

cout<<"\n";

}

//Parameter Validation

if(imageStartIndex>=imageEndIndex)

{

imageEndIndex=imageStartIndex+1;

}

imageNumber=imageStartIndex;

if(ClickThreshold < 0)

{ ClickThreshold = 1; }

//Open a console window if one is needed for debugging

if(Debug_UDP==true||Debug_Image==true)

{

AllocConsole();

freopen("CONIN$","rb",stdin);

freopen("CONOUT$","wb",stdout);

}

//Initialize UDP Communications

if(ExternalCom)

{

WSADATA wsaData;

if(WSAStartup(0x0101, &wsaData)!=0)

{

fprintf(stderr, "Error: Windows Socket Init failed: %d\n", GetLastError());

exit(1);

}

memset((char *)&sad,0,sizeof(sad)); /* clear sockaddr structure */

sad.sin_port = htons((u_short)port); /* set server port number */

sad.sin_family = AF_INET; /* set family to Internet */

sad.sin_addr.s_addr = INADDR_ANY; /* set the local IP address */

/* Map UDP transport protocol name to protocol number */

ptrp = getprotobyname("udp");

if ( ptrp == 0) {

cerr<<"Error: Cannot map \"udp\" to protocol number";

exit(1);

}

/* Create a socket */

sd = socket(PF_INET, SOCK_DGRAM, ptrp->p_proto);

if (sd < 0) {

cerr << "Error: Socket creation failed" << endl;

exit(1);

}

/* Bind a local address to the socket */

if (bind(sd, (struct sockaddr *)&sad, sizeof(sad)) < 0) {

cerr << "Error: socket bind failed" << endl;

exit(1);


}

}//end of UDP socket setup

//Window creation

if(useGUI)

{

cvNamedWindow( "IRALAR", CV_WINDOW_AUTOSIZE );

cvNamedWindow( "Pupil Acquisition", CV_WINDOW_AUTOSIZE );

}

/*If logging is in place, initialize the log file and write headers to it*/

if(logging==true)

{

pupilLog.open("TrackingLog.csv");

if(pupilLog.fail())

{

logging = false;

cerr<<"Error: Could not open log file. Logging is disabled.";

}

else

{

pupilLog<<"IRALAR GTS log file\n";

now = time(NULL);

pupilLog<<ctime(&now)<<"\n";

pupilLog<<"Execution Parameters\n"<<"Logging: ,"<<logging<<"\nUse GUI:

,"<<useGUI<<"\nDebug_Image: ,"<<Debug_Image<<"\nDebug_UDP: ,"<<Debug_UDP<<"\nLive Capture:

,"<<liveCapture<<"\nClick Threshold: ,"<<ClickThreshold<<"\nClickFrames: ,"<<ClickFrames<<"\nPupil Range:

,"<<pupilRange<<"\n\n";

pupilLog<<"Current Blob Discrimination Parameters\n"<<"Max Aspect Ratio:

,"<<maxAr<<"\nMin Aspect Ratio: ,"<<minAr<<"\nMax Perimeter Error: ,"<<maxPerimeterError<<"\nMax Compactness:

,"<<maxCompactness<<"\n\n";

pupilLog<<"Image,Iterations,Pupil X Center,Pupil Y Center,Aspect

Ratio,Compactness,Perimeter Error,Mean Value,Processing Time(s)\n";

}

}

//Live capture initialization code

if(liveCapture)

{

//Initialize camera system

if(!gtsInitializeCamera(640, 480))

{

cerr<<"ERROR: Unable to initialize camera... Press Enter to Exit\n";

getchar();

return 0;

}

}

if(ExternalCom)

{

_beginthread((void (*)(void *))UDP_SERVICE_CLIENT, 0, (void *)sd);

}

/*Master Image Processing Loop;

Control of loop at run time is handled by UDP communication thread

Loop is terminated at end of main()*/

while( run ) {

// Get one frame, either from camera or from disk

if(liveCapture)

{

frameStartTime = clock();

frame = cvQueryFrame( capture );

total_images++;

if(logging) pupilLog<<total_images<<",";

}

else

{

//If the pupil was found in the previous image, load the next image from the drive,

if(pupil_found == true)

{

frameStartTime=0;

frameStartTime = clock();


//RELEASE FRAME IN STATIC MODE ONLY!

//if frame is released in live mode, camera subsystem will crash

cvReleaseImage(&frame);

if(imageNumber < imageEndIndex)

{ imageNumber++; total_images++; }

else {break;}

//imageNumber = 11;

char filename [30] = "TEST_IMAGES\\Picture ";

char filenumber [3];

sprintf_s(filenumber, "%d", imageNumber);

strcat_s(filename, filenumber);

strcat_s(filename, ".jpg");

//NOTE: debugging override left in place; forces the x-scale calibration coefficient to 1 in static-image mode
coefx[1]= 1;

if(Debug_Image){cout<<"----------------\n"<<filename<<"\n\n";}

//Write the name of the image about to be processed to the log file

if(logging) pupilLog<<filename<<",";

//clear all per-image data

iterations = 0;

cPL = 0;

oldCPL = 0;

olderCPL = 0;

frame=cvLoadImage(filename, 1);

}

}

//ensure image loading was successful

if( !frame )

{

cerr<<"ERROR: Image could not be read... Press Enter to exit...\n";

getchar();

break;

}

//Extract Red channel from image

redframe = cvCreateImage(cvGetSize(frame), 8, 1); //new gray scale image declaration

gtsExtractRGBChannel(frame, redframe, 3);

//Image pre-processing

//Smoothing and contrast stretch with LUT

LUT_matrix = cvCreateMatHeader( 1, 256, CV_8UC1 );

LUTframe = cvCreateImage(cvGetSize(frame), 8, 1); //new gray scale image declaration

gtsBuildLUT(cPL, pupilRange, LUT);

cvSetData(LUT_matrix, LUT, 0);

/*CvPoint2D32f center = cvPoint2D32f(frame->width*.5, frame->height*.5);

CvMat* rot = cvCreateMat(2,3,CV_32F);

cv2DRotationMatrix(center, 180, 1, rot);

cvWarpAffine(redframe, framerotated, rot, CV_INTER_LINEAR, cvScalarAll(0));

*/

cvSmooth(redframe, redframe, CV_BLUR, 7, 7, 0,0);

cvLUT(redframe, LUTframe, LUT_matrix);

//Detect Blobs in the image

CBlobResult blobs, large_blobs, pupil_blobs;

//Extract the blobs

//(dest img, mask img, threshold, find moments)

blobs = CBlobResult( LUTframe, NULL, cPL+45, true );

//Filter out blobs too large, small, or long to be a pupil

blobs.Filter(large_blobs, B_INCLUDE, CBlobGetArea(), B_GREATER, pupilMinSize, 0);

large_blobs.Filter(pupil_blobs, B_INCLUDE, CBlobGetArea(), B_LESS, pupilMaxSize, 0);

pupil_blobs.Filter(pupil_blobs, B_INCLUDE, CBlobGetLength(), B_LESS, 260, 0);

CBlob Blob;

pupil_found = false;


char savefilenumber [3];

if(save==true)

{

char eyename [30] = "image_";

char processedname [30] = "image_";

sprintf_s(savefilenumber, "%i", savedImageFileNumber);

strcat_s(eyename,savefilenumber);

cout<<eyename;

strcat_s(eyename,"_eye.jpg");

cvSaveImage(eyename,frame);

strcat_s(processedname,savefilenumber);

cout<<processedname;

strcat_s(processedname,"_processed.jpg");

cvSaveImage(processedname,LUTframe);

}

if(pupil_blobs.GetNumBlobs() != 0)

{

//for(int i = 0; i < pupil_blobs.GetNumBlobs(); ++i)

//{

//get the current blob

Blob = pupil_blobs.GetBlob(0);//;i);

//Draw Blobs on the screen before they have been rejected by the analysis system

if(liveCapture && useGUI)

{ Blob.FillBlob(frame, CV_RGB(255, 0, 0), 0,0); }

pupil_found = gtsAnalyzeBlob(Blob);

//blob_found = true;

if( pupil_found )

{

frames_without_pupil = 0;

identified_images++;

centerX = gtsGetBlobCenterX(Blob);

centerY = gtsGetBlobCenterY(Blob);

radius = gtsGetBlobRadius(Blob);

//Draw the blob on the image

if(useGUI)

{

Blob.FillBlob(frame, CV_RGB(0, 0, 255), 0,0);

cvCircle(frame, cvPoint(centerX, centerY), 3, CV_RGB(0, 255, 0), 4, 8, 0);
cvCircle(frame, cvPoint(centerX, centerY), radius, CV_RGB(255, 0, 0), 2, 8, 0);

}

frameEndTime = clock();

frameTotalTime= frameEndTime-frameStartTime;

double frameTime = frameTotalTime/(double)CLOCKS_PER_SEC;

if(logging) { pupilLog<<","<<frameTime<<"\n"; }

if(liveCapture && mouseMove)

{

gtsCheckClick(centerX, centerY, click);

}

}

//}

}

if(save==true)

{

char overlaidname [30] = "image_";

strcat_s(overlaidname,savefilenumber);

cout<<overlaidname;

strcat_s(overlaidname,"_overlaid.jpg");

cvSaveImage(overlaidname,frame);

save=false;

}


//if a pupil was not found in this frame, increment the number of images in a row that have not had a pupil

if(pupil_found == false)

{

frames_without_pupil++;

if(liveCapture && logging) pupilLog<<"\n";

}

//Check to see if the limit for consecutive images without a detected pupil has been reached

//If the limit has been reached, adjust the contrast window and reset the count

if(frames_without_pupil >= num_frames_pupil_adjust)

{

frames_without_pupil = 0;

//if not live capture, increment the number of attempts to find the pupil in this image

if(!liveCapture) {iterations++;}

//rotate old thresholds backwards through storage

oldestCPL = olderCPL;

olderCPL = oldCPL;

oldCPL = cPL;

if(cvCountNonZero(LUTframe) > 300000 && cPL < pupilMax)

{

cPL = gtsAdjustContrastWindow(cPL, 1);

if(Debug_Image){cout<<"Threshold up: "<<cPL<<"\n";}

}

else if(cPL > pupilMin)

{

cPL = gtsAdjustContrastWindow(cPL, 0);

if(Debug_Image){cout<<"Threshold down: "<<cPL<<"\n";}

}

}

//If GUI is in use, draw the images on the screen

if(useGUI)

{

cvShowImage("IRALAR", frame);

cvShowImage("Pupil Acquisition", LUTframe);

}

//If in live capture, calculate processing time for this frame

//if not in live capture, this function should be called only when a pupil

//is detected and so is handled as a part of the blob detection code

if(liveCapture)

{

frameEndTime = clock();

frameTotalTime= frameEndTime-frameStartTime;

frameRate = 1/(frameTotalTime/(double)CLOCKS_PER_SEC);

if(logging) { pupilLog<<"Frame Rate: "<<frameRate<<" fps.\n";}

}

//When in static mode, if the system cannot find a pupil after many iterations,

//force the system to move onto the next image by "faking" a positive identification

if(!liveCapture && (cPL == olderCPL||cPL==oldestCPL))

{

pupil_found = true;

if(logging) pupilLog<<"\n";

}

cvReleaseImage(&redframe);

cvReleaseImage(&LUTframe);

//process any key events

//remove higher bits using AND operator

int key = (cvWaitKey(1) & 255);

//If ESC key pressed, abort execution

if( key == 27) break;

} //End of master while loop

//Calculate full program execution time

executionEndTime = clock();

executionTotalTime = executionEndTime - executionStartTime;


double executionTime = executionTotalTime/(double)CLOCKS_PER_SEC;

//Write file footer with full execution stats

if(Debug_Image)

{

cout<<"\n\n"<<identified_images<<" of "<<total_images<<" samples identified.\n";

cout<<identified_images/total_images*100<<"% tracking accuracy.\n\n";

cout<<"Total Execution Time: "<<executionTime<<" seconds.\n";

cout<<"Average Frame rate: "<<1/(executionTime/(double)total_images)<<" fps.";

cout<<"\n\nPress Enter to Exit.";

//getchar();

}

if(logging)

{

pupilLog<<"\n"<<identified_images<<", of ,"<<total_images<<", identified.\n";

pupilLog<<identified_images/total_images*100<<",% tracking accuracy.\n\n";

pupilLog<<"Total Execution Time: ,"<<executionTime<<", seconds.\n";

pupilLog<<"Average Frame rate: ,"<<1/(executionTime/(double)total_images)<<", fps.";

pupilLog.close();

}

// Resource destruction

if(liveCapture)

{cvReleaseCapture( &capture );}

if(useGUI)

{

cvDestroyWindow( "IRALAR" );

cvDestroyWindow( "Pupil Acquisition" );

}

if(!liveCapture)

{ cvReleaseImage(&frame); }

cvReleaseImage(&redframe);

cvReleaseImage(&LUTframe);

//release the sockets

if(ExternalCom)

{ WSACleanup();}

fclose(stderr);

if(Debug_Image==true||Debug_UDP==true) {FreeConsole();}

return 0;

}

void gtsBuildLUT(int current_Pupil_Level, int pupilRange, uchar LUT[])

{

//Construct Look-up table based on current contrast window

for(int i=0; i < current_Pupil_Level; i++)

{ LUT[i] = 0; }

int b=0;

for(int i=current_Pupil_Level; i < current_Pupil_Level + pupilRange; i++)

{

LUT[i] =(int)b* 256/pupilRange;

b++;

}

for(int i=current_Pupil_Level + pupilRange; i < 256; i++)

{ LUT[i]=255; }

return;

}
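The function above zeroes every gray level below the current pupil threshold, saturates everything above the contrast window, and ramps linearly across the window itself. A minimal standalone sketch of the same mapping, using illustrative values rather than ones taken from the system:

#include <cstdio>

// Sketch of the contrast-stretch LUT built by gtsBuildLUT: levels below the
// window map to 0, levels above it to 255, and the window ramps linearly.
int main()
{
    const int cPL = 20, pupilRange = 50; // hypothetical threshold and window width
    unsigned char LUT[256];
    for (int i = 0; i < cPL; i++) { LUT[i] = 0; }
    for (int i = cPL; i < cPL + pupilRange; i++) { LUT[i] = (unsigned char)((i - cPL) * 256 / pupilRange); }
    for (int i = cPL + pupilRange; i < 256; i++) { LUT[i] = 255; }
    printf("LUT[%d]=%d LUT[%d]=%d LUT[%d]=%d\n",
           cPL, LUT[cPL], cPL + pupilRange / 2, LUT[cPL + pupilRange / 2],
           cPL + pupilRange, LUT[cPL + pupilRange]); // prints 0, 128, 255
    return 0;
}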

/* Initializes a webcam to stream video

If there is more than one webcam attached to the computer,

this function will intialize the first one */

int gtsInitializeCamera(int width, int height)

{

capture = cvCaptureFromCAM( CV_CAP_ANY );

if( !capture ) {

fprintf( stderr, "ERROR: Live Capture Mode set but no camera detected. \n" );

getchar();

return -1;

};

//Set camera capture size


cvSetCaptureProperty(capture,CV_CAP_PROP_FRAME_WIDTH,width);

cvSetCaptureProperty(capture,CV_CAP_PROP_FRAME_HEIGHT,height);

return 1;

}

/* Extracts one color channel from an RGB image as a grayscale image

Note: Color channel order in openCV is BGR, so channel 3 = red*/

void gtsExtractRGBChannel(IplImage* src, IplImage* dest, int channel)

{

cvSetImageCOI(src, channel);
cvCopy(src, dest, NULL);
cvSetImageCOI(src, 0);

}

bool gtsAnalyzeBlob(CBlob Blob)

{

CBlobGetCompactness getCompactness;

blobCompactness = getCompactness(Blob);

CBlobGetMean getMean;

blobMean = getMean(Blob)*255.00;

blobRadius = gtsGetBlobRadius(Blob);

MaxX = (int)Blob.MaxX();

MinX = (int)Blob.MinX();

MaxY = (int)Blob.MaxY();

MinY = (int)Blob.MinY();

//calculate Aspect ratio of blob and ignore it if it is not circular

length = (double)MaxX-MinX;

height = (double)MaxY-MinY;

ar = length/height;

//Calculate perimeter of blob and perimeter of a circle

perimeter = blobRadius*2*3.14159;

perimeterError = (Blob.perimeter - perimeter)/perimeter;

if(ar > minAr && ar < maxAr && perimeterError < maxPerimeterError) //&& blobCompactness < maxCompactness)

{

if(Debug_Image)

{

cout<<"Pupil Identified!\n"<<iterations<<" Iterations.\n\n";

cout<<"Pupil Center: "<<gtsGetBlobCenterX(Blob)<<" "<<gtsGetBlobCenterY(Blob)<<"\n";

cout<<"Perimeter error: "<<perimeterError<<"\nAspect Ratio: "<<ar<<"\nBlob

Compactness: "<<blobCompactness<<"\nBlob Mean: "<<blobMean<<"\n\n";

}

if(logging)

{

pupilLog<<iterations<<","<<gtsGetBlobCenterX(Blob)<<","<<gtsGetBlobCenterY(Blob)<<","<<ar<<","<<blobCompactness<<","<<perimeterError<<","<<blobMean;

}

return true;

}

else { return false; }

}

int gtsGetBlobCenterX(CBlob Blob)

{

int MaxX, MinX, CenterX;

MaxX = (int)Blob.MaxX();

MinX = (int)Blob.MinX();

CenterX = (MinX + MaxX) / 2;

return CenterX;

}

int gtsGetBlobCenterY(CBlob Blob)

{

int MaxY, MinY, CenterY;

MaxY = (int)Blob.MaxY();


MinY = (int)Blob.MinY();

CenterY = (MinY + MaxY) / 2;

return CenterY;

}

int gtsGetBlobRadius(CBlob Blob)

{

int MaxY, MinY, MinX, MaxX, radius;

MaxX = (int)Blob.MaxX();

MinX = (int)Blob.MinX();

MaxY = (int)Blob.MaxY();

MinY = (int)Blob.MinY();

if((MaxX - MinX) > (MaxY - MinY))

{

radius = (MaxX - MinX) / 2;

}

else

{

radius = (MaxY - MinY) / 2;

}

return radius;

}
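For example, a blob whose bounding box spans MaxX - MinX = 40 pixels horizontally and MaxY - MinY = 30 pixels vertically is assigned a radius of 40 / 2 = 20; the radius is always half the larger side of the bounding box.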

int gtsAdjustContrastWindow(int current_Pupil_Level, int direction)

{

int cPL = current_Pupil_Level;

if(direction == 1)

{

if(cPL <= 30 && cPL > 20)

{ cPL += 5; }

else if( cPL <= 20 && cPL > 10)

{ cPL += 2; }

else if( cPL <= 10)

{ cPL += 1; }

else

{ cPL += 10; }

}

else if(direction == 0)

{

if(cPL <= 30 && cPL > 20)

{ cPL -= 5; }

else if (cPL <= 20 && cPL > 10)

{ cPL -= 2; }

else if (cPL <= 10)

{ cPL -= 1; }

else { cPL -=10;}

}

return cPL;

}
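The step schedule above moves the threshold in coarse 10-level steps while it is above 30, then in steps of 5, 2, and finally 1 as it approaches zero, so repeated adjustments converge quickly without oscillating at the dark end of the range. A hedged standalone trace of that behavior (the starting value of 60 is illustrative):

#include <cstdio>

// Reproduces the step schedule of gtsAdjustContrastWindow while walking the
// threshold downward: steps of 10 above 30, then 5, 2, and 1 near zero.
static int step(int cPL)
{
    if (cPL <= 10) return 1;
    if (cPL <= 20) return 2;
    if (cPL <= 30) return 5;
    return 10;
}

int main()
{
    for (int cPL = 60; cPL > 0; cPL -= step(cPL))
        printf("%d ", cPL); // prints 60 50 40 30 25 20 18 16 14 12 10 9 8 ... 1
    return 0;
}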

IRALAR_mouse.cpp

#include "stdafx.h"

#define _WIN32_WINNT 0x0501 //The wonders of C++ forums and google

#include <windows.h>

#include <fstream>

#include <Winuser.h>

#include "IRALAR_mouse.h"

#include "IRALAR_IPA.h"

#include "IRALAR_UDP.h"

#include <iostream> // for cout and cin

using namespace std;

int oldlocx[100];

int oldlocy[100];

// Click threshold is how close each of the last ClickFrames gaze positions must be to their average
// The units of ClickThreshold are pixels.

extern double ClickThreshold;

extern int ClickFrames;

extern bool calmode;


void gtsInitializeCal()

{

for(int i=0;i<100;i++)

{

oldlocx[i]=100000;

oldlocy[i]=100000;

}

return;

}

void gtsCheckClick(int x, int y, bool click)

{

//std::cout<<"ClickThrsh "<<ClickThreshold<<"\n";

double avgx=0;

double avgy=0;

int mousex=400;

int mousey=300;

//Shift the arrays to include the most recent points

for(int i=0; i<ClickFrames-1;i++)

{

oldlocx[i]=oldlocx[i+1];

oldlocy[i]=oldlocy[i+1];

}

oldlocx[ClickFrames-1]=x;

oldlocy[ClickFrames-1]=y;

//Calculate averages

for(int j=0;j<=ClickFrames-1;j++)

{

avgx = avgx + oldlocx[j];

avgy = avgy + oldlocy[j];

}

avgx = avgx/ClickFrames;

avgy = avgy/ClickFrames;

bool SendMouseClick=true;//always assume that there will be a click, and let the next loop prove otherwise

for(int k=0; k<=ClickFrames-1;k++)

{

if(oldlocx[k]<avgx-ClickThreshold || oldlocx[k]>avgx+ClickThreshold ||

oldlocy[k]<avgy-ClickThreshold*6/8 || oldlocy[k]>avgy+ClickThreshold*6/8)

{

SendMouseClick=false;

}

}

if(SendMouseClick == true)

{

if(calmode==false)

{
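//Map the averaged pupil position to screen coordinates with the affine
//calibration model: screen = c0 + c1*eyeX + c2*eyeY (coefficients from calibrate())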

mousex = coefx[0] + coefx[1]*avgx + coefx[2]*avgy;

mousey = coefy[0] + coefy[1]*avgx + coefy[2]*avgy;

gtsMouseMove(mousex,mousey);

}

else

{

gtsMouseMove(400,300);

}

if(click){gtsLeftClick();}

oldlocx[ClickFrames-1]=1000000;

oldlocy[ClickFrames-1]=1000000;

}

else

{

if(calmode==false)

{

mousex = coefx[0] + coefx[1]*x + coefx[2]*y;

mousey = coefy[0] + coefy[1]*x + coefy[2]*y;

gtsMouseMove(mousex,mousey);

}

else

{


gtsMouseMove(400,300);

}

SendMouseClick = true;

}

return;

}
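The click test above fires only when every one of the last ClickFrames gaze samples lies inside a threshold band around their running mean, with the vertical band scaled by 6/8 to match the 800x600 display aspect ratio. A minimal hedged sketch of the same dwell test, isolated from the mouse and calibration code (the names and sample values here are hypothetical):

#include <cmath>
#include <cstdio>

// Returns true when all n samples sit within +/-thresh of their mean in X
// and +/-thresh*6/8 in Y, i.e. the gaze has dwelled long enough to click.
bool dwellDetected(const int *xs, const int *ys, int n, double thresh)
{
    double ax = 0, ay = 0;
    for (int i = 0; i < n; i++) { ax += xs[i]; ay += ys[i]; }
    ax /= n; ay /= n;
    for (int i = 0; i < n; i++)
    {
        if (std::fabs(xs[i] - ax) > thresh || std::fabs(ys[i] - ay) > thresh * 6 / 8)
            return false;
    }
    return true;
}

int main()
{
    int xs[5] = {400, 402, 399, 401, 400}; // a steady gaze near (400,300)
    int ys[5] = {300, 301, 299, 300, 302};
    printf("steady gaze -> %s\n", dwellDetected(xs, ys, 5, 20) ? "click" : "no click");
    return 0;
}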

void gtsMouseMove (int x, int y )

{

double fScreenWidth = ::GetSystemMetrics( SM_CXSCREEN )-1;

double fScreenHeight = ::GetSystemMetrics( SM_CYSCREEN )-1;

double fx = x*(65535.0/fScreenWidth);

double fy = y*(65535.0/fScreenHeight);

INPUT Input={0};

Input.mi.dwFlags = MOUSEEVENTF_MOVE|MOUSEEVENTF_ABSOLUTE;

Input.mi.dx = (long)fx;

Input.mi.dy = (long)fy;

::SendInput(1,&Input,sizeof(INPUT));

}
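SendInput expects absolute coordinates normalized to the range 0-65535 regardless of the actual screen resolution, which is why the pixel target is scaled by 65535 divided by the screen dimension. On an assumed 1280 x 1024 desktop, for instance, SM_CXSCREEN - 1 = 1279, so x = 640 becomes 640 * (65535.0 / 1279), approximately 32793, which Windows maps back to the same pixel when it dispatches the MOUSEEVENTF_ABSOLUTE move.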

void gtsLeftClick ( )

{

INPUT Input={0};

// left down

Input.type = INPUT_MOUSE;

Input.mi.dwFlags = MOUSEEVENTF_LEFTDOWN;

::SendInput(1,&Input,sizeof(INPUT));

// left up

::ZeroMemory(&Input,sizeof(INPUT));

Input.type = INPUT_MOUSE;

Input.mi.dwFlags = MOUSEEVENTF_LEFTUP;

::SendInput(1,&Input,sizeof(INPUT));

}

void gtsRightClick ( )

{

INPUT Input={0};

// right down

Input.type = INPUT_MOUSE;

Input.mi.dwFlags = MOUSEEVENTF_RIGHTDOWN;

::SendInput(1,&Input,sizeof(INPUT));

// right up

::ZeroMemory(&Input,sizeof(INPUT));

Input.type = INPUT_MOUSE;

Input.mi.dwFlags = MOUSEEVENTF_RIGHTUP;

::SendInput(1,&Input,sizeof(INPUT));

}

IRALAR_UDP.cpp

#include "stdafx.h"

#include <winsock2.h>

#include <windows.h>

#include <iostream>

#include <cstdio> /* necessary to get sprintf */

#include <cstring>

#include <string>

using namespace std;

#include "IRALAR_UDP.h"

#include "IRALAR_mouse.h"

#include "IRALAR_IPA.h"

#include "IRALAR_Cal.h"

extern double total_images;

extern bool run;

extern int blobRadius, MaxX, MinX, MaxY, MinY;

extern double blobCompactness, blobMean, length, height, ar, perimeter, perimeterError;

extern int imageStartIndex, imageEndIndex;

extern bool useGUI, liveCapture, logging, ExternalCom, Debug_UDP, Debug_Image;

extern double ClickThreshold;

extern int ClickFrames;

extern int mouseX, mouseY;


extern double frameRate;

extern bool save;

extern int savedImageFileNumber;

extern bool mouseMove;

extern bool click;

bool calmode = false;

char buf[100]; /* buffer for string the server sends */

int n; /* number of characters received */

int m; /* number of characters sent back */

int alen;

int pos1, pos2, loc;

string RecvString;

string RecvStringPart;

int pos;

char buf2[30];

/*// add these to IRALAR_IPA as well

extern int centerX;

extern int centerY;*/

int eyex[500];

int eyey[500];

int pixelx[500];

int pixely[500];

int calpoints;

void __cdecl UDP_SERVICE_CLIENT(SOCKET sd,struct sockaddr_in cad)

{

CRITICAL_SECTION cs;

InitializeCriticalSection(&cs);

//shared data is read and written by both the image-processing thread and this
//server thread, so each access below is guarded with a critical section

/* Main server loop - accept and handle requests */

while (1) {

memset(buf, ' ', sizeof(buf));

//cout << "Receiving data" << endl;

alen = sizeof(cad);

n = recvfrom(sd,buf,sizeof(buf),0,(struct sockaddr*)&cad,&alen);

//cout << "buffer:" << buf << endl;

if (n<0)

{

cerr<<"Error in receiving\n";

continue;

}

else if(n>=0) /* We could receive a useful empty packet */

{

RecvString = buf;

char buf2[30] = " ";

pos = RecvString.find_first_of("|");

RecvStringPart = RecvString.substr(0,pos);

//cout << RecvStringPart << endl;

if(RecvStringPart.substr(0,10) == "Cal_Start:"){

strncpy_s(buf,"Cal_Start_ACK!",100);

calmode=true;

calpoints = atoi(RecvStringPart.substr(10,pos).c_str());

cout << RecvStringPart << endl;

}else if(RecvStringPart == "data_calibration"){

EnterCriticalSection(&cs);

sprintf_s(buf,"%f %f %f %f %f %f !",coefy[0], coefy[1], coefy[2],

coefx[0], coefx[1], coefx[2]);

LeaveCriticalSection(&cs);

}else if(RecvStringPart == "data_mouse"){

EnterCriticalSection(&cs);

sprintf_s(buf,"%i %i %i %i !",centerX, centerY, mouseX, mouseY);

LeaveCriticalSection(&cs);

}else if(RecvStringPart == "data_parameters"){

EnterCriticalSection(&cs);

sprintf_s(buf,"%i %i %i %i %i %i %i %i %i %f !", imageStartIndex,

imageEndIndex, useGUI, liveCapture, logging, ExternalCom, Debug_UDP, Debug_Image, ClickFrames, ClickThreshold);


LeaveCriticalSection(&cs);

}else if(RecvStringPart == "data_image"){

EnterCriticalSection(&cs);

sprintf_s(buf,"%f %f %f %f %f %f %f %i !",blobCompactness, blobMean,

length, height, ar, perimeter, perimeterError, blobRadius);

LeaveCriticalSection(&cs);

}else if(RecvStringPart == "Cal_Complete"){

calmode=false;

strncpy_s(buf,"Cal_Complete_ACK!",100);

cout << "Starting Calibration Calculations" << endl;

calibrate(calpoints, eyex, eyey, pixelx, pixely);

cout << RecvStringPart << endl;

}else if(RecvStringPart == "ClickFrames+"){

calmode=false;

strncpy_s(buf,"ClickFrames+_ACK!",100);

EnterCriticalSection(&cs);

ClickFrames++;

LeaveCriticalSection(&cs);

}else if(RecvStringPart == "ClickFrames-"){

calmode=false;

strncpy_s(buf,"ClickFrames-_ACK!",100);

EnterCriticalSection(&cs);

ClickFrames--;

LeaveCriticalSection(&cs);

}else if(RecvStringPart == "ClickThreshold+"){

calmode=false;

strncpy_s(buf,"ClickThreshold+_ACK!",100);

EnterCriticalSection(&cs);

ClickThreshold = ClickThreshold + 1;

LeaveCriticalSection(&cs);

}else if(RecvStringPart == "ClickThreshold-"){

calmode=false;

strncpy_s(buf,"ClickThreshold-_ACK!",100);

EnterCriticalSection(&cs);

ClickThreshold = ClickThreshold - 1;

LeaveCriticalSection(&cs);

}else if(RecvStringPart == "ClickOn"){

strncpy_s(buf,"ClickOn_ACK!",100);

EnterCriticalSection(&cs);

click=true;

LeaveCriticalSection(&cs);

}else if(RecvStringPart == "ClickOff"){

strncpy_s(buf,"ClickOff_ACK!",100);

EnterCriticalSection(&cs);

click=false;

LeaveCriticalSection(&cs);

}else if(RecvStringPart == "MouseOn"){

strncpy_s(buf,"MouseOn_ACK!",100);

EnterCriticalSection(&cs);

mouseMove=true;

LeaveCriticalSection(&cs);

}else if(RecvStringPart == "MouseOff"){

strncpy_s(buf,"MouseOff_ACK!",100);

EnterCriticalSection(&cs);

mouseMove=false;

LeaveCriticalSection(&cs);

}else if(RecvStringPart == "FPS"){

strncpy_s(buf,"FPS_ACK!",100);

EnterCriticalSection(&cs);

sprintf_s(buf,"%d ",frameRate,100);

LeaveCriticalSection(&cs);

}else if(RecvStringPart.substr(0,11) == "SAVE_IMAGE:"){

strncpy_s(buf,"SAVE_IMAGE_ACK!",100);

EnterCriticalSection(&cs);


savedImageFileNumber=atoi((RecvStringPart.substr(11,13)).c_str());

save=true;

LeaveCriticalSection(&cs);

}else if(RecvStringPart.substr(0,16) == "Click_Threshold:"){

strncpy_s(buf,"Click_Threshold_ACK!",100);

EnterCriticalSection(&cs);

ClickThreshold=atoi((RecvStringPart.substr(16,18)).c_str());

LeaveCriticalSection(&cs);

}else if(RecvStringPart.substr(0,13) == "Click_Frames:"){

strncpy_s(buf,"Click_Frames_ACK!",100);

EnterCriticalSection(&cs);

ClickFrames=atoi((RecvStringPart.substr(13,15)).c_str());

LeaveCriticalSection(&cs);

}else if(RecvStringPart.substr(0,7) == "Cal_Pt:"){

strncpy_s(buf,"CAL_PT_ACK!",100);

// Split the string to receive the numerical values

// m/Cal_Pt:(\d{0,3})X(\d{0,4})Y(\d{0,4})/

//

// point index = $1
// X = $2, Y = $3

pos1 = RecvStringPart.find_first_of("X");

pos2 = RecvStringPart.find_first_of("Y");

loc = atoi((RecvStringPart.substr(7,pos1-7).c_str()));

int r = pos2-pos1-1;

int q = pos-pos2;

pixelx[loc] = atoi((RecvStringPart.substr(pos1+1,r).c_str()));

pixely[loc] = atoi((RecvStringPart.substr(pos2+1,q).c_str()));

EnterCriticalSection(&cs);

eyex[loc] = centerX;

eyey[loc] = centerY;

LeaveCriticalSection(&cs);

//cout << RecvStringPart << endl;

//cout << "pos1=" << pos1 <<endl;

//cout << "pos2=" << pos2 <<endl;

//cout << "pos =" << pos <<endl;

//cout << "r =" << r <<endl;

//cout << "q =" << q <<endl;

//cout << loc << "EyeX=" << eyex[loc] << " EyeY=" << eyey[loc] << " PixX=" << pixelx[loc] << " PixY=" << pixely[loc] << endl;

}else if(RecvStringPart.substr(0,7) == "Get_Pt:"){

// Split the string to receive the numerical values

// m/Get_Pt:(\d{1,2})/
//
// point index = $1

loc = atoi((RecvStringPart.substr(7,2).c_str()));

EnterCriticalSection(&cs);

sprintf_s(buf,"%i %i %i %i

!",pixelx[loc],pixely[loc],eyex[loc],eyey[loc]);

LeaveCriticalSection(&cs);

}else if(RecvStringPart == "EXIT"){

strncpy_s(buf,"EXIT_ACK!",100);

run=false;

DeleteCriticalSection(&cs);

return;

}

else{strncpy_s(buf,"ACK!",100);

}

//cout << "Buffer being sent:" << buf << ": size = "<<sizeof(buf) <<endl;

m = sendto(sd,buf,sizeof(buf),0,(struct sockaddr*)&cad,alen);

if(m<0)

{

cerr<<"Error in sending";

continue;

}


}

}

DeleteCriticalSection(&cs);

}
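Every page of the C# user interface below drives this server with the same request/reply pattern: a command string terminated by '|' goes out over UDP, and a fixed-size reply buffer comes back. A minimal hedged Winsock client showing one such exchange (a standalone sketch for illustration, not part of the project's code; port 1200 matches the one used by the UI):

#include <winsock2.h>
#include <cstdio>
// build note: link with ws2_32.lib

// Sends one "data_mouse" request to the IRALAR UDP server on localhost and
// prints the space-delimited reply (pupil center and mouse position).
int main()
{
    WSADATA wsa;
    if (WSAStartup(0x0101, &wsa) != 0) return 1;
    SOCKET s = socket(PF_INET, SOCK_DGRAM, IPPROTO_UDP);
    sockaddr_in srv = {0};
    srv.sin_family = AF_INET;
    srv.sin_port = htons(1200);
    srv.sin_addr.s_addr = inet_addr("127.0.0.1");
    const char req[] = "data_mouse|";
    sendto(s, req, sizeof(req), 0, (sockaddr *)&srv, sizeof(srv));
    char reply[100] = {0};
    int len = sizeof(srv);
    if (recvfrom(s, reply, sizeof(reply), 0, (sockaddr *)&srv, &len) > 0)
        printf("server replied: %s\n", reply);
    closesocket(s);
    WSACleanup();
    return 0;
}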

stdafx.cpp

// stdafx.cpp : source file that includes just the standard includes

// IRALAR_IPA.pch will be the pre-compiled header

// stdafx.obj will contain the pre-compiled type information

#include "stdafx.h"

// TODO: reference any additional headers you need in STDAFX.H

// and not in this file

User Interface

App

App.xaml.cs

using System;

using System.Net.Sockets;

using System.IO;

using System.ComponentModel;

using System.Net;

using System.Security;

using System.Security.Permissions;

using System.Windows;

using System.Windows.Forms;

using System.Windows.Controls;

using System.Windows.Data;

using System.Windows.Input;

using System.Windows.Media;

using System.Windows.Media.Animation;

using System.Windows.Navigation;

using System.Diagnostics;

namespace IRALAR_UI

{

public partial class App : System.Windows.Application

{

public void AppStartup(object sender, StartupEventArgs args)

{

//start the UI window

NavigationWindow mainWindow = new NavigationWindow();

mainWindow.Background = Brushes.Transparent;

mainWindow.Show();

Console.WriteLine("starting App");

}

public void NavigationWindow_KeyDown(object sender, System.Windows.Input.KeyEventArgs e)

{

}

}

}


App.xaml

<Application

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

x:Class="IRALAR.App"

StartupUri="Center.xaml"

>

<Application.Resources>

<Style x:Key="{x:Type NavigationWindow}" TargetType="{x:Type NavigationWindow}">

<Setter Property="WindowStyle" Value="None" />

<Setter Property="ShowsNavigationUI" Value="False" />

<Setter Property="WindowState" Value="Maximized" />

<Setter Property="Topmost" Value="False" />

<Setter Property="Background" Value="Black" />

<Setter Property="Cursor" Value="Cross" />

<Setter Property="Height" Value="600" />

<Setter Property="Width" Value="800" />

<Setter Property="Padding" Value="0" />

<Setter Property="Margin" Value="0" />

<Setter Property="BorderThickness" Value="0" />

</Style>

</Application.Resources>

</Application>

Calibration

Screenshot

Calibration.xaml.cs

using System;

using System.Collections.Generic;

using System.Text;

using System.Net;

using System.Net.Sockets;

using System.IO;

using System.ComponentModel;

using System.Windows;

using System.Windows.Controls;

using System.Windows.Data;


using System.Windows.Documents;

using System.Windows.Input;

using System.Windows.Media;

using System.Windows.Media.Imaging;

using System.Windows.Navigation;

using System.Windows.Shapes;

using System.Windows.Media.Animation;

namespace IRALAR_UI

{

public partial class Calibration : Page

{

Socket client;

int recv;

private static byte[] data = new byte[100000];

private static IPEndPoint ipep = new IPEndPoint(IPAddress.Any, 0);

private static EndPoint Remote = (EndPoint)ipep;

private static int size = 100;

//int size = 9;// this creates a 9 by 9 grid (81 points)

int[] X_locations = new int[9] { 80, 160, 240, 320, 400, 480, 560, 640, 720 };

int[] Y_locations = new int[9] { 60, 120, 180, 240, 300, 360, 420, 480, 540 };

int X_pos = 0;

int Y_pos = 0;

int width = 20;

int point_number = 0;

public Calibration()

{

InitializeComponent();

Console.WriteLine("Calibration Page Loaded");

//connect to Server

ipep = new IPEndPoint(IPAddress.Parse("127.0.0.1"), 1200);

client = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp);

client.Connect(ipep);

client.SetSocketOption(SocketOptionLevel.Socket, SocketOptionName.ReceiveTimeout, 3000);

int sockopt = (int)client.GetSocketOption(SocketOptionLevel.Socket,

SocketOptionName.ReceiveTimeout);

//Console.WriteLine("New timeout: {0}", sockopt);

}

public void Calibration_Button_Pushed(object sender, RoutedEventArgs e)

{

//Console.WriteLine("Original X_pos" + X_pos);

//Console.WriteLine("Original Y_pos" + Y_pos);

//first, tell the server what point the user is looking at

string a = point_number.ToString();

string b = X_locations[X_pos].ToString();

string c = Y_locations[Y_pos].ToString();

a = a.PadLeft(3, '0');
b = b.PadLeft(4, '0');
c = c.PadLeft(4, '0');

string StringToSend = "Cal_Pt:" + a + "X" + b + "Y" + c;

point_number = point_number + 1;

data = new byte[1024];

recv = AdvSndRcvData(client, Encoding.ASCII.GetBytes(StringToSend + "|"), ipep);

string text = Encoding.ASCII.GetString(data, 0, recv);

Console.WriteLine(Encoding.ASCII.GetString(data, 0, recv));

//if Acknowledgement is received -> Move the location of the Rectangle

Console.WriteLine("X: " + X_pos + "-" + X_locations[X_pos] + " Y: " + Y_pos + "-" +

Y_locations[Y_pos] + " & " + StringToSend);

X_pos = X_pos + 1;

if (X_pos > 8)

{


Y_pos = Y_pos + 1;

X_pos = 0;

//Console.WriteLine("Rollover Performed");

//Console.WriteLine("New X_pos" + X_pos);

//Console.WriteLine("New Y_pos" + Y_pos);

}

if (Y_pos >= 9)

{

GO_TO_MAIN_Fx();

}

else

{

Canvas.SetLeft(_button, X_locations[X_pos] - (width / 2));

Canvas.SetTop(_button, Y_locations[Y_pos] - (width / 2));

}

//when the button initializes, it is in the 0,0 position, and gets moved to the 0,0 position...

// check if it is in the 0,0 pos, and if so move it to 1,0

}

private void Canvas_Loaded(object sender, RoutedEventArgs e)

{

int number_of_points = 80;

string numOfPoints = number_of_points.ToString();

numOfPoints = numOfPoints.PadLeft(3, '0');

string StringToSend = "Cal_Start:" + numOfPoints;

data = new byte[1024];

recv = AdvSndRcvData(client, Encoding.ASCII.GetBytes(StringToSend + "|"), ipep);

string text = Encoding.ASCII.GetString(data, 0, recv);

Console.WriteLine(Encoding.ASCII.GetString(data, 0, recv));

DoubleAnimation fadein = new DoubleAnimation();

fadein.From = 0;

fadein.To = 1;

fadein.Duration = TimeSpan.FromSeconds(1);

_canvas.BeginAnimation(Canvas.OpacityProperty, fadein);

DoubleAnimation MOVE_WELCOME = new DoubleAnimation();

MOVE_WELCOME.BeginTime = TimeSpan.Parse("0:0:5");

MOVE_WELCOME.From = 1;

MOVE_WELCOME.To = 0;

MOVE_WELCOME.Duration = TimeSpan.FromSeconds(1);

_WELCOME.BeginAnimation(Canvas.OpacityProperty, MOVE_WELCOME);

DoubleAnimation MOVE_DIRsize = new DoubleAnimation();

MOVE_DIRsize.BeginTime = TimeSpan.Parse("0:0:5");

MOVE_DIRsize.From = 30;

MOVE_DIRsize.To = 12;

MOVE_DIRsize.Duration = TimeSpan.FromSeconds(1);

_DIR.BeginAnimation(Label.FontSizeProperty, MOVE_DIRsize);

DoubleAnimation MOVE_DIRtop = new DoubleAnimation();

MOVE_DIRtop.BeginTime = TimeSpan.Parse("0:0:5");

MOVE_DIRtop.From = 306;

MOVE_DIRtop.To = 570;

MOVE_DIRtop.Duration = TimeSpan.FromSeconds(1.5);

_DIR.BeginAnimation(Canvas.TopProperty, MOVE_DIRtop);

DoubleAnimation MOVE_DIRleft = new DoubleAnimation();

MOVE_DIRleft.BeginTime = TimeSpan.Parse("0:0:5");

MOVE_DIRleft.From = 156;

MOVE_DIRleft.To = 540;

MOVE_DIRleft.Duration = TimeSpan.FromSeconds(1.5);

_DIR.BeginAnimation(Canvas.LeftProperty, MOVE_DIRleft);

DoubleAnimation UNHIDE_BUTTON = new DoubleAnimation();

UNHIDE_BUTTON.BeginTime = TimeSpan.Parse("0:0:5.5");


UNHIDE_BUTTON.From = 0;

UNHIDE_BUTTON.To = 1;

UNHIDE_BUTTON.Duration = TimeSpan.FromSeconds(2);

_button.BeginAnimation(Canvas.OpacityProperty, UNHIDE_BUTTON);

DoubleAnimation MAKE_CLICKABLE = new DoubleAnimation();

MAKE_CLICKABLE.BeginTime = TimeSpan.Parse("0:0:6");

MAKE_CLICKABLE.From = -620;

MAKE_CLICKABLE.To = -10;

_screen_button.BeginAnimation(Canvas.TopProperty, MAKE_CLICKABLE);

}

private void GO_TO_MAIN(object sender, RoutedEventArgs e)

{

GO_TO_MAIN_Fx();

}

private void GO_TO_MAIN_Fx()

{

string StringToSend = "Cal_Complete";

data = new byte[1024];

data = Encoding.ASCII.GetBytes(StringToSend + "|");

client.Send(data);

client.Close();

NavigationService.Navigate(new MainPage());

}

private static int AdvSndRcvData(Socket s, byte[] message, EndPoint rmtdevice)

{

int recv = 0;

int retry = 0;

while (true)

{

Console.WriteLine("Attempt #{0}", retry);

try

{

s.SendTo(message, message.Length, SocketFlags.None, rmtdevice);

data = new byte[size];

recv = s.ReceiveFrom(data, ref Remote);

string text = Encoding.ASCII.GetString(data, 0, recv);

//Console.WriteLine(text);

}

catch (SocketException e)

{

if (e.ErrorCode == 10054)

recv = 0;

else if (e.ErrorCode == 10040)

{

Console.WriteLine("Error receiving packet");

size += 10;

recv = 0;

}

}

if (recv > 0)

{

return recv;

}

else

{

retry++;

if (retry > 4)

{

return 0;

}

}

}

}

private void Abort_Click(object sender, RoutedEventArgs e)

{

}


}

}

Calibration.xaml

<Page

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

x:Class="IRALAR_UI.Calibration"

Title="Calibration"

OpacityMask="{x:Null}"

Background="{x:Null}"

ShowsNavigationUI="False"

Width="800"

Height="600"

>

<Page.BitmapEffectInput>

<BitmapEffectInput AreaToApplyEffectUnits="Absolute"/>

</Page.BitmapEffectInput>

<Canvas Loaded="Canvas_Loaded" Name="_canvas">

<Canvas.Background>

<LinearGradientBrush>

<GradientStop Color="Black" Offset="1"/>

</LinearGradientBrush>

</Canvas.Background>

<Image Canvas.Left="0" Canvas.Top="0" Name="image1" Stretch="Fill" Width="800"

Source="/IRALAR_UI;component/Resources/Background.png" Height="600" Opacity=".5"/>

<Rectangle x:Name="_button" Height="20" Width="20" RadiusX="10" RadiusY="10" Canvas.Left="70"

Opacity="0" Canvas.Top="50" Stroke="Black">

<Rectangle.Fill>

<LinearGradientBrush >

<GradientStop Color="White" Offset="0.0" />

<GradientStop Color="White" Offset="0.20" />

<GradientStop Color="LightBlue" Offset="0.9" />

<GradientStop Color="LightGray" Offset="0.20" />

</LinearGradientBrush>

</Rectangle.Fill>

</Rectangle>

<Label x:Name="_WELCOME" FontWeight="Bold" FontSize="60" Foreground="White" Height="100"

Canvas.Left="53" Canvas.Top="200" Width="697">

<Label.BitmapEffect>

<OuterGlowBitmapEffect GlowColor="Black" Noise="0" Opacity=".25" />

</Label.BitmapEffect> Welcome to Calibration

</Label>

<Label x:Name="_DIR" FontWeight="Bold" FontSize="30" Foreground="White" Height="50" Canvas.Left="156"

Canvas.Top="306" Width="475">

<Label.BitmapEffect>

<OuterGlowBitmapEffect GlowColor="Black" Noise="0" Opacity=".1" />

</Label.BitmapEffect> Look at the blue dot to calibrate.

</Label>

<Button Canvas.Left="-25" Canvas.Top="-620" Height="620" Name="_screen_button" Width="850"

Click="Calibration_Button_Pushed" Opacity="0"></Button>

<Button x:Name="Move_on" Canvas.Left="735" Canvas.Top="560" Click="GO_TO_MAIN" Height="40" Width="65">

<Button.BitmapEffect>

<OuterGlowBitmapEffect GlowColor="Snow" Noise="0" Opacity="0.5" />

</Button.BitmapEffect> Move on

</Button>

<Button x:Name="Abort" Canvas.Left="95" Canvas.Top="562" Click="Abort_Click" Height="40" Width="85">

<Button.BitmapEffect>

<OuterGlowBitmapEffect GlowColor="Snow" Noise="0" Opacity="0.5" />

</Button.BitmapEffect>Abort

</Button>

</Canvas>

</Page>


Center

Screenshot

Center.xaml.cs

using System;

using System.Collections.Generic;

using System.Linq;

using System.Text;

using System.Windows;

using System.Windows.Controls;

using System.Windows.Data;

using System.Windows.Documents;

using System.Windows.Forms;

using System.Windows.Media;

using System.Windows.Media.Imaging;

using System.Windows.Navigation;

using System.Windows.Shapes;

using gma.System.Windows;

namespace IRALAR_UI

{

/// <summary>

/// Interaction logic for Center.xaml

/// </summary>

public partial class Center : Page

{

UserActivityHook actHook;

public Center()

{

InitializeComponent();

System.Windows.Input.Keyboard.Focus(_canvas);

actHook = new UserActivityHook(); // create an instance with global hooks

actHook.KeyPress += new System.Windows.Forms.KeyPressEventHandler(MyKeyPress);

}

private void button1_Click(object sender, RoutedEventArgs e)

{

NavigationService.Navigate(new Splash());

}

public void MyKeyPress(object sender, System.Windows.Forms.KeyPressEventArgs e)


{

try

{

if (e.KeyChar.ToString() == " ") { actHook.Stop(); NavigationService.Navigate(new Splash()); }

}

catch

{

}

}

}

}

Center.xaml

<Page x:Class="IRALAR_UI.Center"

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

Title="Center"

OpacityMask="{x:Null}"

ShowsNavigationUI="False"

Width="800"

Height="600"

xmlns:d="http://schemas.microsoft.com/expression/blend/2006"

xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"

mc:Ignorable="d"

Focusable="True"

>

<Canvas x:Name="_canvas" Background="Red" Focusable="True">

<Rectangle Canvas.Left="20" Canvas.Top="20" Height="560" Name="rectangle1" Width="760" Fill="Black" />

<Rectangle Canvas.Left="0" Canvas.Top="200" Height="200" Name="rectangle2" Width="800" Fill="Black" />

<Rectangle Canvas.Left="300" Canvas.Top="0" Height="600" Name="rectangle3" Width="200" Fill="Black" />

<Label Canvas.Left="92" Foreground="white" FontSize="72" Canvas.Top="134" Name="label1" Height="106"

Width="606">Center The Display</Label>

<Label Canvas.Left="4" Foreground="white" FontSize="50" Canvas.Top="250" Name="label1_1" Height="106"

Width="800">Make sure you can see each corner.</Label>

<Label Canvas.Left="84" Foreground="white" FontSize="50" Canvas.Top="380" Name="label2" Height="106"

Width="624">Press Spacebar To Continue</Label>

<Button Canvas.Left="92" Canvas.Top="462" Height="64" Name="button1" Width="606"

Click="button1_Click">Continue</Button>

</Canvas>

</Page>

ClickDelay

ClickDelay.xaml.cs

using System;

using System.Collections.Generic;

using System.Linq;

using System.Text;

using System.Windows;

using System.Windows.Controls;

using System.Windows.Data;

using System.Windows.Documents;

using System.Windows.Input;

using System.Windows.Media;

using System.Windows.Media.Imaging;

using System.Windows.Navigation;

using System.Windows.Shapes;

using System.Windows.Media.Animation;

using System.Net;

using System.Net.Sockets;

using System.IO;

using System.ComponentModel;


namespace IRALAR_UI

{

/// <summary>

/// Interaction logic for ClickDelay.xaml

/// </summary>

public partial class ClickDelay : Page

{

Socket client;

int recv;

private static byte[] data = new byte[100000];

private static IPEndPoint ipep = new IPEndPoint(IPAddress.Any, 0);

private static EndPoint Remote = (EndPoint)ipep;

private static int size = 100;

public ClickDelay()

{

InitializeComponent();

Console.WriteLine("ClickDelay Page Loaded");

//connect to Server

ipep = new IPEndPoint(IPAddress.Parse("127.0.0.1"), 1200);

client = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp);

client.Connect(ipep);

}

private void Canvas_Loaded(object sender, RoutedEventArgs e)

{

UPDATE_DISPLAY();

DoubleAnimation fadein = new DoubleAnimation();

fadein.From = 0;

fadein.To = 1;

fadein.Duration = TimeSpan.FromSeconds(1);

_canvas.BeginAnimation(Canvas.OpacityProperty, fadein);

}

private void GO_TO_MAIN(object sender, RoutedEventArgs e)

{

NavigationService.Navigate(new MainPage());

}

private void Page_Loaded(object sender, RoutedEventArgs e)

{

}

private void UPDATE_DISPLAY()

{

// data_parameters = imageStartIndex, imageEndIndex, useGUI, liveCapture, logging, ExternalCom, Debug_UDP, Debug_Image, ClickFrames, ClickThreshold

string StringToSend = "data_parameters";

data = new byte[100000];

recv = AdvSndRcvData(client, Encoding.ASCII.GetBytes(StringToSend + "|"), ipep);

string text = Encoding.ASCII.GetString(data, 0, recv);

Console.WriteLine(Encoding.ASCII.GetString(data, 0, recv));

string[] ReturnParts = text.Split(new Char[] { ' ' });

double Frames = Convert.ToDouble(ReturnParts[8]);

FramesNumber.Content = ReturnParts[8];

/*

//Gather current FPS

string StringToSend = "FPS";

data = new byte[100000];

recv = AdvSndRcvData(client, Encoding.ASCII.GetBytes(StringToSend + "|"), ipep);

string text = Encoding.ASCII.GetString(data, 0, recv);

Console.WriteLine(Encoding.ASCII.GetString(data, 0, recv));

string[] ReturnParts = text.Split(new Char[] { ' ' });

double PerSecond = Convert.ToDouble(ReturnParts[8]);

FPSNumber.Content = ReturnParts[0];

double TimeTaken = 1 / PerSecond * Frames;

string s = Convert.ToString(TimeTaken);

DelayTimeNumber.Content = s;

*/

}

private void Decrease_MouseDown(object sender, MouseButtonEventArgs e)

{

string StringToSend = "ClickFrames-";


data = new byte[100000];

recv = AdvSndRcvData(client, Encoding.ASCII.GetBytes(StringToSend + "|"), ipep);

UPDATE_DISPLAY();

}

private void Increase_MouseDown(object sender, MouseButtonEventArgs e)

{

string StringToSend = "ClickFrames+";

data = new byte[100000];

recv = AdvSndRcvData(client, Encoding.ASCII.GetBytes(StringToSend + "|"), ipep);

UPDATE_DISPLAY();

}

private static int AdvSndRcvData(Socket s, byte[] message, EndPoint rmtdevice)

{

int recv = 0;

int retry = 0;

while (true)

{

Console.WriteLine("Attempt #{0}", retry);

try

{

s.SendTo(message, message.Length, SocketFlags.None, rmtdevice);

data = new byte[size];

recv = s.ReceiveFrom(data, ref Remote);

string text = Encoding.ASCII.GetString(data, 0, recv);

//Console.WriteLine(text);

}

catch (SocketException e)

{

if (e.ErrorCode == 10054)

recv = 0;

else if (e.ErrorCode == 10040)

{

Console.WriteLine("Error receiving packet");

size += 10;

recv = 0;

}

}

if (recv > 0)

{

return recv;

}

else

{

retry++;

if (retry > 4)

{

return 0;

}

}

}

}

}

}

ClickDelay.xaml

<Page x:Class="IRALAR_UI.ClickDelay"

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

Title="ClickDelay"

Background="Black"

Width="800"

Height="600" Loaded="Page_Loaded">

<Canvas Loaded="Canvas_Loaded" Name="_canvas" IsEnabled="True">

<Image Canvas.Left="0" Canvas.Top="0" Name="image1" Stretch="Fill" Width="800"

Source="/IRALAR_UI;component/Resources/Background.png" Height="600" />

<Button Canvas.Left="650" Canvas.Top="0" Height="75" Width="125" Click="GO_TO_MAIN">

<Button.Background>

<LinearGradientBrush EndPoint="0,1" StartPoint="0,0">


<GradientStop Color="WhiteSmoke" Offset="0" />

<GradientStop Color="LightGreen" Offset="0.5" />

<GradientStop Color="Green" Offset="1" />

</LinearGradientBrush>

</Button.Background> MAIN

</Button>

<Rectangle Canvas.Top="75" Canvas.Left="150" Height="100" Width="100" Fill="White" RadiusX="10"

RadiusY="10" Stroke="AliceBlue">

<Rectangle.BitmapEffect>

<BevelBitmapEffect />

</Rectangle.BitmapEffect>

<Rectangle.Triggers>

<EventTrigger RoutedEvent="Rectangle.MouseLeftButtonDown">

<EventTrigger.Actions>

<BeginStoryboard>

<Storyboard>

<ColorAnimation

Storyboard.TargetProperty="Fill.Color"

From="White"

To="Red"

Duration="0:0:.5"

AutoReverse="True"

RepeatBehavior="1x"

/>

</Storyboard>

</BeginStoryboard>

</EventTrigger.Actions>

</EventTrigger>

</Rectangle.Triggers>

</Rectangle>

<Rectangle Canvas.Top="75" Canvas.Left="350" Height="100" Width="100" Fill="White" RadiusX="10"

RadiusY="10" Stroke="AliceBlue">

<Rectangle.BitmapEffect>

<BevelBitmapEffect />

</Rectangle.BitmapEffect>

<Rectangle.Triggers>

<EventTrigger RoutedEvent="Rectangle.MouseLeftButtonDown">

<EventTrigger.Actions>

<BeginStoryboard>

<Storyboard>

<ColorAnimation

Storyboard.TargetProperty="Fill.Color"

From="White"

To="Red"

Duration="0:0:.5"

AutoReverse="True"

RepeatBehavior="1x"

/>

</Storyboard>

</BeginStoryboard>

</EventTrigger.Actions>

</EventTrigger>

</Rectangle.Triggers>

</Rectangle>

<Rectangle Canvas.Top="75" Canvas.Left="550" Height="100" Width="100" Fill="White" RadiusX="10"

RadiusY="10" Stroke="AliceBlue">

<Rectangle.BitmapEffect>

<BevelBitmapEffect />

</Rectangle.BitmapEffect>

<Rectangle.Triggers>

<EventTrigger RoutedEvent="Rectangle.MouseLeftButtonDown">

<EventTrigger.Actions>

<BeginStoryboard>

<Storyboard>

<ColorAnimation

Storyboard.TargetProperty="Fill.Color"

From="White"

To="Red"

Duration="0:0:.5"

AutoReverse="True"

RepeatBehavior="1x"

/>

</Storyboard>

</BeginStoryboard>

</EventTrigger.Actions>

</EventTrigger>

</Rectangle.Triggers>


</Rectangle>

<!--<Button Canvas.Left="150" Canvas.Top="75" Height="100" Name="button1" Width="100"></Button>

<Button Canvas.Left="350" Canvas.Top="75" Height="100" Name="button2" Width="100"></Button>

<Button Canvas.Left="550" Canvas.Top="75" Height="100" Name="button3" Width="100"></Button>-->

<Label Canvas.Left="232" Canvas.Top="368" Height="57" Name="label1" Width="218" Foreground="White"

FontSize="32">Frames = </Label>

<Label Canvas.Left="388" Canvas.Top="368" Height="57" Name="FramesNumber" Width="218"

Foreground="White" FontSize="32"></Label>

<Rectangle Canvas.Left="100" Canvas.Top="350" RadiusX="30" RadiusY="30" Fill="LightBlue" Height="100"

Name="Decrease" Stroke="Black" Width="100" Opacity="1" MouseDown="Decrease_MouseDown">

<Rectangle.BitmapEffect>

<BlurBitmapEffect Radius="2" />

</Rectangle.BitmapEffect>

</Rectangle>

<Rectangle Canvas.Left="600" Canvas.Top="350" RadiusX="30" RadiusY="30" Fill="LightBlue" Height="100"

Name="Increase" Stroke="Black" Width="100" Opacity="1" MouseDown="Increase_MouseDown">

<Rectangle.BitmapEffect>

<BlurBitmapEffect Radius="2" />

</Rectangle.BitmapEffect>

</Rectangle>

<Label Canvas.Left="615" Canvas.Top="330" Foreground="White" FontSize="80" Name="Plus_Sign" Width="120"

Height="117.5">

<Label.BitmapEffect>

<DropShadowBitmapEffect />

</Label.BitmapEffect> +

</Label>

<Label Canvas.Left="125" Canvas.Top="330" Foreground="White" FontSize="80" Name="Minus_Sign"

Width="120" Height="117.5">

<Label.BitmapEffect>

<DropShadowBitmapEffect />

</Label.BitmapEffect> -

</Label>

<Rectangle Canvas.Left="100" Canvas.Top="350" Fill="Transparent" Height="100" Name="Decreasebox"

Width="100" Opacity="1" MouseDown="Decrease_MouseDown" />

<Rectangle Canvas.Left="600" Canvas.Top="350" Fill="Transparent" Height="100" Name="Increasebox"

Width="100" Opacity="1" MouseDown="Increase_MouseDown" />

</Canvas>

</Page>

ClickThresh

ClickThresh.xaml.cs

using System;

using System.Collections.Generic;

using System.Linq;

using System.Text;

using System.Windows;

using System.Windows.Controls;

using System.Windows.Data;

using System.Windows.Documents;

using System.Windows.Input;

using System.Windows.Media;

using System.Windows.Media.Imaging;

using System.Windows.Navigation;

using System.Windows.Shapes;

using System.Windows.Media.Animation;

using System.Net;

using System.Net.Sockets;

using System.IO;

using System.ComponentModel;

namespace IRALAR_UI

{

/// <summary>

/// Interaction logic for ClickThresh.xaml

/// </summary>

public partial class ClickThresh : Page


{

Socket client;

int recv;

private static byte[] data = new byte[100000];

private static IPEndPoint ipep = new IPEndPoint(IPAddress.Any, 0);

private static EndPoint Remote = (EndPoint)ipep;

private static int size = 100;

public ClickThresh()

{

InitializeComponent();

Console.WriteLine("ClickThresh Page Loaded");

//connect to Server

ipep = new IPEndPoint(IPAddress.Parse("127.0.0.1"), 1200);

client = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp);

client.Connect(ipep);

client.SetSocketOption(SocketOptionLevel.Socket, SocketOptionName.ReceiveTimeout, 3000);

int sockopt = (int)client.GetSocketOption(SocketOptionLevel.Socket,

SocketOptionName.ReceiveTimeout);

//Console.WriteLine("New timeout: {0}", sockopt);

}

private void Canvas_Loaded(object sender, RoutedEventArgs e)

{

DoubleAnimation fadein = new DoubleAnimation();

fadein.From = 0;

fadein.To = 1;

fadein.Duration = TimeSpan.FromSeconds(1);

_canvas.BeginAnimation(Canvas.OpacityProperty, fadein);

UPDATE_DISPLAY();

}

private void UPDATE_DISPLAY()

{

// get threshold data

// data_parameters will return:

//imageStartIndex, imageEndIndex, useGUI, liveCapture, logging,

//ExternalCom, Debug_UDP, Debug_Image, ClickFrames, ClickThreshold

// space delimited; Click threshold is the last item returned

string StringToSend = "data_parameters";

data = new byte[100000];

recv = AdvSndRcvData(client, Encoding.ASCII.GetBytes(StringToSend + "|"), ipep);

string text = Encoding.ASCII.GetString(data, 0, recv);

Console.WriteLine(Encoding.ASCII.GetString(data, 0, recv));

string[] ReturnParts = text.Split(new Char[] { ' ' });

double XThresh = Convert.ToDouble(ReturnParts[9]);

double YThresh = XThresh * 6 / 8;

label_X.Content = "X Pixels = " + XThresh;

label_Y.Content = "Y Pixels = " + YThresh;

Show_Size.Width = (XThresh * 2) + 1;

Show_Size.Height = (YThresh * 2) + 1;

}

private void GO_TO_MAIN(object sender, RoutedEventArgs e)

{

NavigationService.Navigate(new MainPage());

}

private void Page_Loaded(object sender, RoutedEventArgs e)

{

}

private void Increase_MouseEnter(object sender, MouseEventArgs e)

{

// send Thresh+

string StringToSend = "ClickThreshold+";

data = new byte[1024];

recv = AdvSndRcvData(client, Encoding.ASCII.GetBytes(StringToSend + "|"), ipep);

string text = Encoding.ASCII.GetString(data, 0, recv);

Console.WriteLine(Encoding.ASCII.GetString(data, 0, recv));


UPDATE_DISPLAY();

}

private void Decrease_MouseEnter(object sender, MouseEventArgs e)

{

// send Thresh-

string StringToSend = "ClickThreshold-";

data = new byte[1024];

recv = AdvSndRcvData(client, Encoding.ASCII.GetBytes(StringToSend + "|"), ipep);

string text = Encoding.ASCII.GetString(data, 0, recv);

Console.WriteLine(Encoding.ASCII.GetString(data, 0, recv));

UPDATE_DISPLAY();

}

private void Decrease_MouseLeave(object sender, MouseEventArgs e)

{

}

private void Increase_MouseLeave(object sender, MouseEventArgs e)

{

}

private static int AdvSndRcvData(Socket s, byte[] message, EndPoint rmtdevice)

{

int recv = 0;

int retry = 0;

while (true)

{

Console.WriteLine("Attempt #{0}", retry);

try

{

s.SendTo(message, message.Length, SocketFlags.None, rmtdevice);

data = new byte[size];

recv = s.ReceiveFrom(data, ref Remote);

string text = Encoding.ASCII.GetString(data, 0, recv);

//Console.WriteLine(text);

}

catch (SocketException e)

{

if (e.ErrorCode == 10054)

recv = 0;

else if (e.ErrorCode == 10040)

{

Console.WriteLine("Error receiving packet");

size += 10;

recv = 0;

}

}

if (recv > 0)

{

return recv;

}

else

{

retry++;

if (retry > 4)

{

return 0;

}

}

}

}

private void button_Click(object sender, RoutedEventArgs e)

{

}

}

}
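The ClickThresh page, like the other configuration pages, talks to the image-processing application (IRALAR_IPA) over a loopback UDP socket, sending ASCII commands terminated by a pipe character (for example "data_parameters|" or "ClickThreshold+|") and reading back a space-delimited reply. A minimal, self-contained sketch of that request/response exchange is shown below; the 127.0.0.1:1200 endpoint, the command string, and the 3-second timeout are taken from the listing above, while the class name UdpCommandDemo and the buffer size are illustrative.

using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

class UdpCommandDemo
{
    static void Main()
    {
        // Endpoint used by the IRALAR UI pages (see the constructor above).
        IPEndPoint ipep = new IPEndPoint(IPAddress.Parse("127.0.0.1"), 1200);
        Socket client = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp);
        client.Connect(ipep);

        // Give up on a reply after 3 seconds, as the pages do.
        client.SetSocketOption(SocketOptionLevel.Socket, SocketOptionName.ReceiveTimeout, 3000);

        // Commands are plain ASCII terminated by '|'.
        client.Send(Encoding.ASCII.GetBytes("data_parameters|"));

        byte[] buffer = new byte[1024];
        try
        {
            // The reply is a space-delimited parameter list.
            int received = client.Receive(buffer);
            Console.WriteLine(Encoding.ASCII.GetString(buffer, 0, received));
        }
        catch (SocketException)
        {
            Console.WriteLine("No reply from IRALAR_IPA within the timeout.");
        }
        finally
        {
            client.Close();
        }
    }
}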


ClickThresh.xaml

<Page x:Class="IRALAR_UI.ClickThresh"

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

Title="ClickThresh"

Background="Black"

Width="800"

Height="600" Loaded="Page_Loaded">

<Canvas Loaded="Canvas_Loaded" Name="_canvas" IsEnabled="True">

<Image Canvas.Left="0" Canvas.Top="0" Name="image1" Stretch="Fill" Width="800"

Source="/IRALAR_UI;component/Resources/Background.png" Height="600" />

<Rectangle Canvas.Left="360" Canvas.Top="250" Height="60" RadiusX="2" RadiusY="2" Name="Show_Size"

Stroke="White" Width="80" >

<Rectangle.Fill>

<RadialGradientBrush GradientOrigin="0.5,0" Center="0.5,0" RadiusX=".8" RadiusY="1">

<GradientStop Color="Black" Offset="0" />

<GradientStop Color="DarkGray" Offset="1.4" />

</RadialGradientBrush>

</Rectangle.Fill>

</Rectangle>

<Button Canvas.Left="600" Canvas.Top="0" Height="75" Width="125" Click="GO_TO_MAIN">

<Button.Background>

<LinearGradientBrush EndPoint="0,1" StartPoint="0,0">

<GradientStop Color="WhiteSmoke" Offset="0" />

<GradientStop Color="LightGreen" Offset="0.5" />

<GradientStop Color="Green" Offset="1" />

</LinearGradientBrush>

</Button.Background> MAIN

</Button>

<Rectangle Canvas.Top="48" Canvas.Left="80" Width="200" Height="200" Fill="White" x:Name="r1"

RadiusX="10" RadiusY="10" >

<Rectangle.BitmapEffect>

<BevelBitmapEffect />

</Rectangle.BitmapEffect>

<Rectangle.Triggers>

<EventTrigger RoutedEvent="Rectangle.MouseLeftButtonDown">

<EventTrigger.Actions>

<BeginStoryboard>

<Storyboard>

<ColorAnimation

Storyboard.TargetProperty="Fill.Color"

From="White"

To="Red"

Duration="0:0:.5"

AutoReverse="True"

RepeatBehavior="1x"

/>

</Storyboard>

</BeginStoryboard>

</EventTrigger.Actions>

</EventTrigger>

</Rectangle.Triggers>

</Rectangle>

<Rectangle Canvas.Top="98" Canvas.Left="337" Width="100" Height="100" Fill="White" x:Name="r2"

RadiusX="10" RadiusY="10" >

<Rectangle.BitmapEffect>

<BevelBitmapEffect BevelWidth="4" />

</Rectangle.BitmapEffect>

<Rectangle.Triggers>

<EventTrigger RoutedEvent="Rectangle.MouseLeftButtonDown">

<EventTrigger.Actions>

<BeginStoryboard>

<Storyboard>

<ColorAnimation

Storyboard.TargetProperty="Fill.Color"

From="White"

To="Red"

Duration="0:0:.5"

AutoReverse="True"

RepeatBehavior="1x"

/>

</Storyboard>

</BeginStoryboard>

</EventTrigger.Actions>

</EventTrigger>

</Rectangle.Triggers>


</Rectangle>

<Rectangle Canvas.Top="123" Canvas.Left="492" Width="50" Height="50" Fill="White" x:Name="r3"

RadiusX="5" RadiusY="5">

<Rectangle.BitmapEffect>

<BevelBitmapEffect BevelWidth="3" />

</Rectangle.BitmapEffect>

<Rectangle.Triggers>

<EventTrigger RoutedEvent="Rectangle.MouseLeftButtonDown">

<EventTrigger.Actions>

<BeginStoryboard>

<Storyboard>

<ColorAnimation

Storyboard.TargetProperty="Fill.Color"

From="White"

To="Red"

Duration="0:0:.5"

AutoReverse="True"

RepeatBehavior="1x"

/>

</Storyboard>

</BeginStoryboard>

</EventTrigger.Actions>

</EventTrigger>

</Rectangle.Triggers>

</Rectangle>

<Rectangle Canvas.Top="135" Canvas.Left="579" Width="25" Height="25" Fill="White" x:Name="r4"

RadiusX="4" RadiusY="4">

<Rectangle.BitmapEffect>

<BevelBitmapEffect BevelWidth="2" />

</Rectangle.BitmapEffect>

<Rectangle.Triggers>

<EventTrigger RoutedEvent="Rectangle.MouseLeftButtonDown">

<EventTrigger.Actions>

<BeginStoryboard>

<Storyboard>

<ColorAnimation

Storyboard.TargetProperty="Fill.Color"

From="White"

To="Red"

Duration="0:0:.5"

AutoReverse="True"

RepeatBehavior="1x"

/>

</Storyboard>

</BeginStoryboard>

</EventTrigger.Actions>

</EventTrigger>

</Rectangle.Triggers>

</Rectangle>

<Rectangle Canvas.Top="143" Canvas.Left="645" Width="10" Height="10" Fill="White" x:Name="r5"

RadiusX="2" RadiusY="2">

<Rectangle.BitmapEffect>

<BevelBitmapEffect BevelWidth="1" />

</Rectangle.BitmapEffect>

<Rectangle.Triggers>

<EventTrigger RoutedEvent="Rectangle.MouseLeftButtonDown">

<EventTrigger.Actions>

<BeginStoryboard>

<Storyboard>

<ColorAnimation

Storyboard.TargetProperty="Fill.Color"

From="White"

To="Red"

Duration="0:0:.5"

AutoReverse="True"

RepeatBehavior="1x"

/>

</Storyboard>

</BeginStoryboard>

</EventTrigger.Actions>

</EventTrigger>

</Rectangle.Triggers>

</Rectangle>

<Rectangle Canvas.Top="145" Canvas.Left="690" Width="5" Height="5" Fill="White" x:Name="r6" RadiusX="2"

RadiusY="2">

<Rectangle.BitmapEffect>

<BevelBitmapEffect BevelWidth="1" />


</Rectangle.BitmapEffect>

<Rectangle.Triggers>

<EventTrigger RoutedEvent="Rectangle.MouseLeftButtonDown">

<EventTrigger.Actions>

<BeginStoryboard>

<Storyboard>

<ColorAnimation

Storyboard.TargetProperty="Fill.Color"

From="White"

To="Red"

Duration="0:0:.5"

AutoReverse="True"

RepeatBehavior="1x"

/>

</Storyboard>

</BeginStoryboard>

</EventTrigger.Actions>

</EventTrigger>

</Rectangle.Triggers>

</Rectangle>

<Rectangle Canvas.Left="100" Canvas.Top="350" RadiusX="30" RadiusY="30" Fill="LightBlue" Height="100"

Name="Decrease" Stroke="Black" Width="100" Opacity="1" MouseEnter="Decrease_MouseEnter"

MouseLeave="Decrease_MouseLeave">

<Rectangle.BitmapEffect>

<BlurBitmapEffect Radius="2" />

</Rectangle.BitmapEffect>

</Rectangle>

<Rectangle Canvas.Left="600" Canvas.Top="350" RadiusX="30" RadiusY="30" Fill="LightBlue" Height="100"

Name="Increase" Stroke="Black" Width="100" Opacity="1" MouseEnter="Increase_MouseEnter"

MouseLeave="Increase_MouseLeave">

<Rectangle.BitmapEffect>

<BlurBitmapEffect Radius="2" />

</Rectangle.BitmapEffect>

</Rectangle>

<Label Canvas.Left="615" Canvas.Top="330" Foreground="White" FontSize="80" Name="Plus_Sign" Width="120"

Height="117.5">

<Label.BitmapEffect>

<DropShadowBitmapEffect />

</Label.BitmapEffect> +</Label>

<Label Canvas.Left="125" Canvas.Top="330" Foreground="White" FontSize="80" Name="Minus_Sign"

Width="120" Height="117.5">

<Label.BitmapEffect>

<DropShadowBitmapEffect />

</Label.BitmapEffect> -

</Label>

<Label Canvas.Left="275" Canvas.Top="354.873" Foreground="White" FontSize="42" Name="label_X"

Height="67" Width="308">

<Label.BitmapEffect>

<DropShadowBitmapEffect />

</Label.BitmapEffect> X pixels =

</Label>

<Label Canvas.Left="275" Canvas.Top="404.873" Foreground="White" FontSize="42" Name="label_Y"

Height="95.127" Width="308" >

<Label.BitmapEffect>

<DropShadowBitmapEffect />

</Label.BitmapEffect> Y pixels =

</Label>

<Rectangle Canvas.Left="100" Canvas.Top="350" Fill="Transparent" Height="100" Name="Decreasebox"

Width="100" Opacity="1" MouseDown="Decrease_MouseEnter" />

<Rectangle Canvas.Left="600" Canvas.Top="350" Fill="Transparent" Height="100" Name="Increasebox"

Width="100" Opacity="1" MouseDown="Increase_MouseEnter" />

</Canvas>

</Page>

Diagnostics

Diagnostics.xaml.cs

using System;

using System.Collections.Generic;

using System.Linq;

using System.Text;


using System.Windows;

using System.Windows.Controls;

using System.Windows.Data;

using System.Windows.Documents;

using System.Windows.Input;

using System.Windows.Media;

using System.Windows.Media.Imaging;

using System.Windows.Navigation;

using System.Windows.Shapes;

using System.Windows.Media.Animation;

using System.Net;

using System.Net.Sockets;

using System.IO;

using System.ComponentModel;

using System.Windows.Threading;

namespace IRALAR_UI

{

/// <summary>

/// Interaction logic for Diagnostics.xaml

/// </summary>

///

public partial class Diagnostics : Page

{

DispatcherTimer dt1 = new DispatcherTimer(); // will be used to gather data_mouse every second

DispatcherTimer dt2 = new DispatcherTimer(); // will be used to gather data_image every 2 seconds

DispatcherTimer dt3 = new DispatcherTimer(); // will be used to send image refresh command over to IPA

Socket client;

int recv;

private static byte[] data = new byte[100000];

private static IPEndPoint ipep = new IPEndPoint(IPAddress.Any, 0);

private static EndPoint Remote = (EndPoint)ipep;

private static int size = 100;

public Diagnostics()

{

InitializeComponent();

Console.WriteLine("Diagnostics Page Loaded");

//connect to Server

ipep = new IPEndPoint(IPAddress.Parse("127.0.0.1"), 1200);

client = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp);

client.Connect(ipep);

dt1.Interval = TimeSpan.FromSeconds(1);

dt2.Interval = TimeSpan.FromSeconds(2);

dt3.Interval = TimeSpan.FromSeconds(3);

dt1.Tick += new EventHandler(GatherMouseData);

dt2.Tick += new EventHandler(GatherImageData);

dt3.Tick += new EventHandler(RefreshDisplayImage);

dt1.Start();

dt2.Start();

dt3.Start();

}

private void GatherMouseData(object sender, EventArgs e)

{

string StringToSend = "data_mouse";

//data_mouse = centerX, centerY, mouseX, mouseY

data = new byte[100000];

recv = AdvSndRcvData(client, Encoding.ASCII.GetBytes(StringToSend + "|"), ipep);

string text = Encoding.ASCII.GetString(data, 0, recv);

Console.WriteLine(Encoding.ASCII.GetString(data, 0, recv));

string[] ReturnParts = text.Split(new Char[] { ' ' });

EyeX.Content = ReturnParts[0];

EyeY.Content = ReturnParts[1];

PixelX.Content = ReturnParts[2];

PixelY.Content = ReturnParts[3];

}

private void GatherImageData(object sender, EventArgs e)

{

string StringToSend = "data_image";

//data_image = blobCompactness, blobMean, length, height, ar, perimeter, perimeterError,blobRadius

data = new byte[100000];

recv = AdvSndRcvData(client, Encoding.ASCII.GetBytes(StringToSend + "|"), ipep);

string text = Encoding.ASCII.GetString(data, 0, recv);


Console.WriteLine(Encoding.ASCII.GetString(data, 0, recv));

string[] ReturnParts = text.Split(new Char[] { ' ' });

IP1.Content = ReturnParts[0];

IP2.Content = ReturnParts[1];

IP3.Content = ReturnParts[2];

IP4.Content = ReturnParts[3];

IP5.Content = ReturnParts[4];

IP6.Content = ReturnParts[5];

IP7.Content = ReturnParts[6];

IP8.Content = ReturnParts[7];

}

private void RefreshDisplayImage(object sender, EventArgs e)

{

}

private void Canvas_Loaded(object sender, RoutedEventArgs e)

{

// gather calibration data

string StringToSend = "data_calibration";

data = new byte[100000];

recv = AdvSndRcvData(client, Encoding.ASCII.GetBytes(StringToSend + "|"), ipep);

string text = Encoding.ASCII.GetString(data, 0, recv);

Console.WriteLine(Encoding.ASCII.GetString(data, 0, recv));

string[] ReturnParts = text.Split(new Char[] { ' ' });

A0.Content = ReturnParts[0];

B0.Content = ReturnParts[1];

C0.Content = ReturnParts[2];

A1.Content = ReturnParts[3];

B1.Content = ReturnParts[4];

C1.Content = ReturnParts[5];

DoubleAnimation fadein = new DoubleAnimation();

fadein.From = 0;

fadein.To = 1;

fadein.Duration = TimeSpan.FromSeconds(1);

_canvas.BeginAnimation(Canvas.OpacityProperty, fadein);

}

private void GO_TO_MAIN(object sender, RoutedEventArgs e)

{

dt1.Stop();

dt2.Stop();

dt3.Stop();

NavigationService.Navigate(new MainPage());

}

private static int AdvSndRcvData(Socket s, byte[] message, EndPoint rmtdevice)

{

int recv = 0;

int retry = 0;

while (true)

{

Console.WriteLine("Attempt #{0}", retry);

try

{

s.SendTo(message, message.Length, SocketFlags.None, rmtdevice);

data = new byte[size];

recv = s.ReceiveFrom(data, ref Remote);

string text = Encoding.ASCII.GetString(data, 0, recv);

//Console.WriteLine(text);

}

catch (SocketException e)

{

if (e.ErrorCode == 10054)

recv = 0;

else if (e.ErrorCode == 10040)

{

Console.WriteLine("Error receiving packet");

size += 10;

recv = 0;

}

}

if (recv > 0)


{

return recv;

}

else

{

retry++;

if (retry > 4)

{

return 0;

}

}

}

}

}

}
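The Diagnostics page drives its display with three DispatcherTimer instances so that polling the IPA happens on the UI thread and the labels can be updated directly. A minimal sketch of that polling pattern is shown below; the one-second interval matches dt1 in the listing, while the StartPolling helper name is hypothetical.

using System;
using System.Windows.Threading;

static class PollingSketch
{
    // Hypothetical helper illustrating the DispatcherTimer pattern above.
    // Tick handlers run on the UI thread, so controls such as EyeX.Content
    // can be assigned without explicit Dispatcher.Invoke calls.
    public static DispatcherTimer StartPolling(Action poll)
    {
        DispatcherTimer timer = new DispatcherTimer();
        timer.Interval = TimeSpan.FromSeconds(1); // matches dt1 above
        timer.Tick += delegate { poll(); };
        timer.Start();
        return timer; // callers must Stop() it when leaving the page, as GO_TO_MAIN does
    }
}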

Diagnostics.xaml

<Page x:Class="IRALAR_UI.Diagnostics"

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

Title="Diagnostics"

Background="Black"

Width="800"

Height="600"

>

<Canvas Loaded="Canvas_Loaded" Name="_canvas" IsEnabled="True">

<Image Canvas.Left="0" Canvas.Top="0" Name="image1" Stretch="Fill" Width="800"

Source="/IRALAR_UI;component/Resources/Background.png" Height="600" />

<Button Canvas.Left="600" Canvas.Top="0" Height="75" Width="125" Click="GO_TO_MAIN">

<Button.Background>

<LinearGradientBrush EndPoint="0,1" StartPoint="0,0">

<GradientStop Color="WhiteSmoke" Offset="0" />

<GradientStop Color="LightGreen" Offset="0.5" />

<GradientStop Color="Green" Offset="1" />

</LinearGradientBrush>

</Button.Background> MAIN

</Button>

<Label Canvas.Left="26" Canvas.Top="91" Height="82" FontSize="32" Name="label1"

Foreground="white"></Label>

<Label Canvas.Left="250" Canvas.Top="39" Name="label2" Foreground="White" FontSize="49">

<Label.BitmapEffect>

<DropShadowBitmapEffect Color="AliceBlue" />

</Label.BitmapEffect> Diagnostics</Label>

<Rectangle Canvas.Left="26" Canvas.Top="125" Height="201" Name="rectangle3" Stroke="Black" Width="751"

Fill="LightGray" RadiusX="10" RadiusY="10" Opacity="0.5">

<Rectangle.BitmapEffect>

<BevelBitmapEffect />

</Rectangle.BitmapEffect>

</Rectangle>

<Label Canvas.Left="139.793" Canvas.Top="142.4" Name="label3" Foreground="white" FontSize="20"

Height="36.6" Width="191">

Calibration Variables</Label>

<Label Name="X_pixel" Canvas.Left="32" Canvas.Top="181.604" Height="61" Foreground="white"

FontSize="25" Width="100">Xpixel =</Label>

<Label Name="Y_pixel" Canvas.Left="32.8" Canvas.Top="236.209" Height="61" Foreground="white"

FontSize="25" Width="99.2">Ypixel =</Label>

<Label Name="EyeX0" Canvas.Left="182.204" Canvas.Top="186.998" Height="61" Foreground="white"

FontSize="20" Width="85.547">+ Eye_X *</Label>

<Label Name="EyeY0" Canvas.Left="325.523" Canvas.Top="186.998" Height="61" Foreground="white"

FontSize="20" Width="84.436">+ Eye_Y *</Label>

<Label Name="EyeX1" Canvas.Left="182.721" Canvas.Top="241.602" Height="61" Foreground="white"

FontSize="20" Width="85.03">+ Eye_X * </Label>

<Label Name="EyeY1" Canvas.Left="325.523" Canvas.Top="241.603" Height="61" Foreground="white"

FontSize="20" Width="84.436">+ Eye_Y *</Label>

<Label Name="A0" Canvas.Left="123.321" Canvas.Top="192.394" Height="61" Foreground="white"

FontSize="15" Width="65.6"></Label>

<Label Name="B0" Canvas.Left="261.6" Canvas.Top="192.394" Height="61" Foreground="white" FontSize="15"

Width="69.193"></Label>

<Label Name="C0" Canvas.Left="421.341" Canvas.Top="192.394" Height="61" Foreground="white"

FontSize="15" Width="76.659"></Label>

<Label Name="A1" Canvas.Left="123.321" Canvas.Top="246.999" Height="61" Foreground="white"

FontSize="15" Width="65.6"></Label>


<Label Name="B1" Canvas.Left="261.6" Canvas.Top="246.999" Height="61" Foreground="white" FontSize="15"

Width="69.193"></Label>

<Label Name="C1" Canvas.Left="421.341" Canvas.Top="246.999" Height="61" Foreground="white"

FontSize="15" Width="76.659"></Label>

<Rectangle Canvas.Left="500" Canvas.Top="142.4" Height="165.6" Name="rectangle4" Stroke="Black"

Width="259" RadiusX="10" RadiusY="10" Fill="DimGray">

<Rectangle.BitmapEffect>

<BevelBitmapEffect />

</Rectangle.BitmapEffect>

</Rectangle>

<Label Canvas.Left="593.274" Canvas.Top="142.4" Name="label4" Foreground="white" FontSize="20"

Height="36.6" Width="39.996">Eye</Label>

<Label Canvas.Left="674.488" Canvas.Top="142.4" Name="label5" Foreground="white" FontSize="20"

Height="36.6" Width="51.444">Pixel</Label>

<Label Canvas.Left="519.948" Canvas.Top="192.394" Name="label6" Foreground="white" FontSize="20"

Height="36.6" Width="32.219">X</Label>

<Label Canvas.Left="519.948" Canvas.Top="252.392" Name="label7" Foreground="white" FontSize="20"

Height="36.6" Width="32.219">Y</Label>

<Label Name="EyeX" Canvas.Left="593.274" Canvas.Top="192.395" Height="61" Foreground="white"

FontSize="15" Width="80"></Label>

<Label Name="EyeY" Canvas.Left="593.274" Canvas.Top="257.789" Height="61" Foreground="white"

FontSize="15" Width="80"></Label>

<Label Name="PixelX" Canvas.Left="674.488" Canvas.Top="192.395" Height="61" Foreground="white"

FontSize="15" Width="80.024"></Label>

<Label Name="PixelY" Canvas.Left="674.488" Canvas.Top="257.789" Height="61.447" Foreground="white"

FontSize="15" Width="80.024"></Label>

<Rectangle Canvas.Left="102" Canvas.Top="334.148" Height="219" Name="rectangle9" Stroke="Black"

Width="675" Fill="LightGray" RadiusX="10" RadiusY="10" Opacity="0.5">

<Rectangle.BitmapEffect>

<BevelBitmapEffect />

</Rectangle.BitmapEffect>

</Rectangle>

<Label Name="label10" Canvas.Left="282" Canvas.Top="343.402" Height="43.342" Foreground="white"

FontSize="25" Width="269">

Image Properties</Label>

<Label Name="IP1l" Canvas.Left="134" Canvas.Top="386.744" Height="61.679" Foreground="white"

FontSize="15" Width="136">Blob Compactness:</Label>

<Label Name="IP2l" Canvas.Left="134" Canvas.Top="416.75" Height="61.679" Foreground="white"

FontSize="15" Width="85">Blob Mean:</Label>

<Label Name="IP3l" Canvas.Left="134" Canvas.Top="446.756" Height="61.679" Foreground="white"

FontSize="15" Width="93">Blob Length:</Label>

<Label Name="IP4l" Canvas.Left="134" Canvas.Top="476.762" Height="61.679" Foreground="white"

FontSize="15" Width="92">Blob Height:</Label>

<Label Name="IP5l" Canvas.Left="449" Canvas.Top="386.744" Height="61.679" Foreground="white"

FontSize="15" Width="66">Blob AR:</Label>

<Label Name="IP6l" Canvas.Left="449" Canvas.Top="416.75" Height="61.679" Foreground="white"

FontSize="15" Width="111">Blob Perimeter:</Label>

<Label Name="IP7l" Canvas.Left="449" Canvas.Top="446.756" Height="61.679" Foreground="white"

FontSize="15" Width="147">Blob Perimeter Error:</Label>

<Label Name="IP8l" Canvas.Left="449" Canvas.Top="476.762" Height="61.679" Foreground="white"

FontSize="15" Width="91">Blob Radius:</Label>

<Label Name="IP1" Canvas.Left="304" Canvas.Top="386.744" Height="61.679" Foreground="white"

FontSize="15" Width="90"></Label>

<Label Name="IP2" Canvas.Left="304" Canvas.Top="416.75" Height="61.679" Foreground="white"

FontSize="15" Width="90"></Label>

<Label Name="IP3" Canvas.Left="304" Canvas.Top="446.756" Height="61.679" Foreground="white"

FontSize="15" Width="90"></Label>

<Label Name="IP4" Canvas.Left="304" Canvas.Top="476.762" Height="61.679" Foreground="white"

FontSize="15" Width="90"></Label>

<Label Name="IP5" Canvas.Left="672.253" Canvas.Top="386.744" Height="61.679" Foreground="white"

FontSize="15" Width="90"></Label>

<Label Name="IP6" Canvas.Left="672.253" Canvas.Top="416.75" Height="61.679" Foreground="white"

FontSize="15" Width="90"></Label>

<Label Name="IP7" Canvas.Left="672.253" Canvas.Top="446.756" Height="61.679" Foreground="white"

FontSize="15" Width="90"></Label>

<Label Name="IP8" Canvas.Left="672.253" Canvas.Top="476.762" Height="61.679" Foreground="white"

FontSize="15" Width="90"></Label>

</Canvas>

</Page>


MainPage

[Screenshot of the MainPage]

MainPage.xaml.cs

using System;

using System.Collections.Generic;

using System.Text;

using System.Net;

using System.Net.Sockets;

using System.IO;

using System.Windows;

using System.Windows.Controls;

using System.Windows.Data;

using System.Windows.Documents;

using System.Windows.Input;

using System.Windows.Media;

using System.Windows.Media.Imaging;

using System.Windows.Navigation;

using System.Windows.Shapes;

using System.Windows.Media.Animation;

using System.Windows.Media.Effects;

namespace IRALAR_UI

{

/// <summary>

/// Interaction logic for MainPage.xaml

/// </summary>

public partial class MainPage : Page

{

byte[] data = new byte[1024];

Socket client;

IPEndPoint ipep;

public MainPage()

{

//connect to Server

ipep = new IPEndPoint(IPAddress.Parse("127.0.0.1"), 1200);

client = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp);

client.Connect(ipep);

InitializeComponent();


}

private void Page_Loaded(object sender, RoutedEventArgs e)

{

Console.WriteLine("Main Page Loaded");

}

private void Close_Page(object sender, RoutedEventArgs e)

{

string s = "EXIT";

data = new byte[1024];

data = Encoding.ASCII.GetBytes(s + "|");

client.Send(data);

Window win = (Window)this.Parent;

win.Close();

}

private void GO_TO_SPLASH(object sender, RoutedEventArgs e)

{

PAGE_OUT();

NavigationService.Navigate(new Splash());

}

private void Canvas_Loaded(object sender, RoutedEventArgs e)

{

DoubleAnimation fadein = new DoubleAnimation();

fadein.From = 0;

fadein.To = 1;

fadein.Duration = TimeSpan.FromSeconds(1);

_canvas.BeginAnimation(Canvas.OpacityProperty, fadein);

}

private void Calibrate_MouseEnter(object sender, MouseEventArgs e)

{

DoubleAnimation glow = new DoubleAnimation();

Storyboard myStoryboard = new Storyboard();

glow.From = 5;

glow.To = 15;

glow.Duration = TimeSpan.FromSeconds(1);

Storyboard.SetTargetName(glow, "Calibrate_Glow");

Storyboard.SetTargetProperty(glow, new PropertyPath(OuterGlowBitmapEffect.GlowSizeProperty));

myStoryboard.Children.Add(glow);

myStoryboard.Begin(this);

}

private void Minimize_MouseDown(object sender, MouseButtonEventArgs e)

{

Window win = (Window)this.Parent;

win.WindowState = WindowState.Minimized;

}

private void Minimize_MouseLeave(object sender, MouseEventArgs e)

{

DoubleAnimation glow = new DoubleAnimation();

Storyboard myStoryboard = new Storyboard();

glow.From = 15;

glow.To = 5;

glow.Duration = TimeSpan.FromSeconds(1);

Storyboard.SetTargetName(glow, "Minimize_Glow");

Storyboard.SetTargetProperty(glow, new PropertyPath(OuterGlowBitmapEffect.GlowSizeProperty));

myStoryboard.Children.Add(glow);

myStoryboard.Begin(this);

}

private void Minimize_MouseEnter(object sender, MouseEventArgs e)

{

DoubleAnimation glow = new DoubleAnimation();

Storyboard myStoryboard = new Storyboard();

glow.From = 5;

glow.To = 15;

glow.Duration = TimeSpan.FromSeconds(1);

Storyboard.SetTargetName(glow, "Minimize_Glow");


Storyboard.SetTargetProperty(glow, new PropertyPath(OuterGlowBitmapEffect.GlowSizeProperty));

myStoryboard.Children.Add(glow);

myStoryboard.Begin(this);

}

private void Calibrate_MouseDown(object sender, MouseButtonEventArgs e)

{

PAGE_OUT();

NavigationService.Navigate(new Calibration());

}

private void Calibrate_MouseLeave(object sender, MouseEventArgs e)

{

DoubleAnimation glow = new DoubleAnimation();

Storyboard myStoryboard = new Storyboard();

glow.From = 15;

glow.To = 5;

glow.Duration = TimeSpan.FromSeconds(1);

Storyboard.SetTargetName(glow, "Calibrate_Glow");

Storyboard.SetTargetProperty(glow, new PropertyPath(OuterGlowBitmapEffect.GlowSizeProperty));

myStoryboard.Children.Add(glow);

myStoryboard.Begin(this);

}

private void Basic_MouseEnter(object sender, MouseEventArgs e)

{

DoubleAnimation glow = new DoubleAnimation();

Storyboard myStoryboard = new Storyboard();

glow.From = 5;

glow.To = 15;

glow.Duration = TimeSpan.FromSeconds(1);

Storyboard.SetTargetName(glow, "Basic_Glow");

Storyboard.SetTargetProperty(glow, new PropertyPath(OuterGlowBitmapEffect.GlowSizeProperty));

myStoryboard.Children.Add(glow);

myStoryboard.Begin(this);

}

private void Basic_MouseDown(object sender, MouseButtonEventArgs e)

{

PAGE_OUT();

NavigationService.Navigate(new UI_TC1());

}

private void Basic_MouseLeave(object sender, MouseEventArgs e)

{

DoubleAnimation glow = new DoubleAnimation();

Storyboard myStoryboard = new Storyboard();

glow.From = 15;

glow.To = 5;

glow.Duration = TimeSpan.FromSeconds(1);

Storyboard.SetTargetName(glow, "Basic_Glow");

Storyboard.SetTargetProperty(glow, new PropertyPath(OuterGlowBitmapEffect.GlowSizeProperty));

myStoryboard.Children.Add(glow);

myStoryboard.Begin(this);

}

private void Diagnostics_MouseEnter(object sender, MouseEventArgs e)

{

DoubleAnimation glow = new DoubleAnimation();

Storyboard myStoryboard = new Storyboard();

glow.From = 5;

glow.To = 15;

glow.Duration = TimeSpan.FromSeconds(1);

Storyboard.SetTargetName(glow, "Diagnostics_Glow");

Storyboard.SetTargetProperty(glow, new PropertyPath(OuterGlowBitmapEffect.GlowSizeProperty));

myStoryboard.Children.Add(glow);

myStoryboard.Begin(this);

}

private void Diagnostics_MouseDown(object sender, MouseButtonEventArgs e)

{

PAGE_OUT();

NavigationService.Navigate(new Diagnostics());

}

private void Diagnostics_MouseLeave(object sender, MouseEventArgs e)


{

DoubleAnimation glow = new DoubleAnimation();

Storyboard myStoryboard = new Storyboard();

glow.From = 15;

glow.To = 5;

glow.Duration = TimeSpan.FromSeconds(1);

Storyboard.SetTargetName(glow, "Diagnostics_Glow");

Storyboard.SetTargetProperty(glow, new PropertyPath(OuterGlowBitmapEffect.GlowSizeProperty));

myStoryboard.Children.Add(glow);

myStoryboard.Begin(this);

}

private void About_MouseEnter(object sender, MouseEventArgs e)

{

DoubleAnimation glow = new DoubleAnimation();

Storyboard myStoryboard = new Storyboard();

glow.From = 5;

glow.To = 15;

glow.Duration = TimeSpan.FromSeconds(1);

Storyboard.SetTargetName(glow, "About_Glow");

Storyboard.SetTargetProperty(glow, new PropertyPath(OuterGlowBitmapEffect.GlowSizeProperty));

myStoryboard.Children.Add(glow);

myStoryboard.Begin(this);

}

private void About_MouseLeave(object sender, MouseEventArgs e)

{

DoubleAnimation glow = new DoubleAnimation();

Storyboard myStoryboard = new Storyboard();

glow.From = 15;

glow.To = 5;

glow.Duration = TimeSpan.FromSeconds(1);

Storyboard.SetTargetName(glow, "About_Glow");

Storyboard.SetTargetProperty(glow, new PropertyPath(OuterGlowBitmapEffect.GlowSizeProperty));

myStoryboard.Children.Add(glow);

myStoryboard.Begin(this);

}

private void About_MouseDown(object sender, MouseButtonEventArgs e)

{

PAGE_OUT();

NavigationService.Navigate(new UI_About());

}

private void Game_MouseDown(object sender, MouseButtonEventArgs e)

{

PAGE_OUT();

NavigationService.Navigate(new UI_Mole());

}

private void Game_MouseEnter(object sender, MouseEventArgs e)

{

DoubleAnimation glow = new DoubleAnimation();

Storyboard myStoryboard = new Storyboard();

glow.From = 5;

glow.To = 15;

glow.Duration = TimeSpan.FromSeconds(1);

Storyboard.SetTargetName(glow, "Game_Glow");

Storyboard.SetTargetProperty(glow, new PropertyPath(OuterGlowBitmapEffect.GlowSizeProperty));

myStoryboard.Children.Add(glow);

myStoryboard.Begin(this);

}

private void Game_MouseLeave(object sender, MouseEventArgs e)

{

DoubleAnimation glow = new DoubleAnimation();

Storyboard myStoryboard = new Storyboard();

glow.From = 15;

glow.To = 5;

glow.Duration = TimeSpan.FromSeconds(1);

Storyboard.SetTargetName(glow, "Game_Glow");

Storyboard.SetTargetProperty(glow, new PropertyPath(OuterGlowBitmapEffect.GlowSizeProperty));

myStoryboard.Children.Add(glow);

myStoryboard.Begin(this);


}

private void Testing_MouseEnter(object sender, MouseEventArgs e)

{

//Console.WriteLine("Testing_Mouse");

DoubleAnimation glow = new DoubleAnimation();

Storyboard myStoryboard = new Storyboard();

glow.From = 5;

glow.To = 15;

glow.Duration = TimeSpan.FromSeconds(1);

Storyboard.SetTargetName(glow, "Testing_Glow");

Storyboard.SetTargetProperty(glow, new PropertyPath(OuterGlowBitmapEffect.GlowSizeProperty));

myStoryboard.Children.Add(glow);

myStoryboard.Begin(this);

}

private void Testing_MouseLeave(object sender, MouseEventArgs e)

{

//Console.WriteLine("Testing_MouseLeave");

DoubleAnimation glow = new DoubleAnimation();

Storyboard myStoryboard = new Storyboard();

glow.From = 15;

glow.To = 5;

glow.Duration = TimeSpan.FromSeconds(1);

Storyboard.SetTargetName(glow, "Testing_Glow");

Storyboard.SetTargetProperty(glow, new PropertyPath(OuterGlowBitmapEffect.GlowSizeProperty));

myStoryboard.Children.Add(glow);

myStoryboard.Begin(this);

}

private void Testing_MouseDown(object sender, MouseButtonEventArgs e)

{

PAGE_OUT();

NavigationService.Navigate(new UI_Test());

}

private void Thresh_MouseEnter(object sender, MouseEventArgs e)

{

//Console.WriteLine("Testing_Mouse");

DoubleAnimation glow = new DoubleAnimation();

Storyboard myStoryboard = new Storyboard();

glow.From = 5;

glow.To = 15;

glow.Duration = TimeSpan.FromSeconds(1);

Storyboard.SetTargetName(glow, "Thresh_Glow");

Storyboard.SetTargetProperty(glow, new PropertyPath(OuterGlowBitmapEffect.GlowSizeProperty));

myStoryboard.Children.Add(glow);

myStoryboard.Begin(this);

}

private void Thresh_MouseLeave(object sender, MouseEventArgs e)

{

//Console.WriteLine("Testing_MouseLeave");

DoubleAnimation glow = new DoubleAnimation();

Storyboard myStoryboard = new Storyboard();

glow.From = 15;

glow.To = 5;

glow.Duration = TimeSpan.FromSeconds(1);

Storyboard.SetTargetName(glow, "Thresh_Glow");

Storyboard.SetTargetProperty(glow, new PropertyPath(OuterGlowBitmapEffect.GlowSizeProperty));

myStoryboard.Children.Add(glow);

myStoryboard.Begin(this);

}

private void Thresh_MouseDown(object sender, MouseButtonEventArgs e)


{

PAGE_OUT();

NavigationService.Navigate(new ClickThresh());

}

private void Delay_MouseEnter(object sender, MouseEventArgs e)

{

//Console.WriteLine("Testing_Mouse");

DoubleAnimation glow = new DoubleAnimation();

Storyboard myStoryboard = new Storyboard();

glow.From = 5;

glow.To = 15;

glow.Duration = TimeSpan.FromSeconds(1);

Storyboard.SetTargetName(glow, "Delay_Glow");

Storyboard.SetTargetProperty(glow, new PropertyPath(OuterGlowBitmapEffect.GlowSizeProperty));

myStoryboard.Children.Add(glow);

myStoryboard.Begin(this);

}

private void Delay_MouseLeave(object sender, MouseEventArgs e)

{

//Console.WriteLine("Testing_MouseLeave");

DoubleAnimation glow = new DoubleAnimation();

Storyboard myStoryboard = new Storyboard();

glow.From = 15;

glow.To = 5;

glow.Duration = TimeSpan.FromSeconds(1);

Storyboard.SetTargetName(glow, "Delay_Glow");

Storyboard.SetTargetProperty(glow, new PropertyPath(OuterGlowBitmapEffect.GlowSizeProperty));

myStoryboard.Children.Add(glow);

myStoryboard.Begin(this);

}

private void Delay_MouseDown(object sender, MouseButtonEventArgs e)

{

PAGE_OUT();

NavigationService.Navigate(new ClickDelay());

}

private void PAGE_OUT()

{

DoubleAnimation fadein = new DoubleAnimation();

fadein.From = 1;

fadein.To = 0;

fadein.Duration = TimeSpan.FromSeconds(1);

_canvas.BeginAnimation(Canvas.OpacityProperty, fadein);

}

}

}
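Every menu item on MainPage repeats the same MouseEnter/MouseLeave glow animation, differing only in the storyboard target name and the direction of the size change. A refactoring sketch that consolidates the duplicated handler bodies into one helper is shown below; GlowHelper and AnimateGlow are hypothetical names, not part of the original source.

using System;
using System.Windows;
using System.Windows.Media.Animation;
using System.Windows.Media.Effects;

static class GlowHelper
{
    // Hypothetical helper: animates the GlowSize of a named
    // OuterGlowBitmapEffect, replacing the repeated handler bodies above.
    public static void AnimateGlow(FrameworkElement scope, string targetName, double from, double to)
    {
        DoubleAnimation glow = new DoubleAnimation();
        glow.From = from;
        glow.To = to;
        glow.Duration = TimeSpan.FromSeconds(1);
        Storyboard.SetTargetName(glow, targetName);
        Storyboard.SetTargetProperty(glow, new PropertyPath(OuterGlowBitmapEffect.GlowSizeProperty));
        Storyboard sb = new Storyboard();
        sb.Children.Add(glow);
        sb.Begin(scope);
    }
}

// Usage inside MainPage, for example:
// private void Calibrate_MouseEnter(object sender, MouseEventArgs e)
// {
//     GlowHelper.AnimateGlow(this, "Calibrate_Glow", 5, 15);
// }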

MainPage.xaml

<Page

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

xmlns:d="http://schemas.microsoft.com/expression/blend/2006"

xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006" mc:Ignorable="d"

x:Class="IRALAR_UI.MainPage"

Title="MainPage" Loaded="Page_Loaded"

ShowsNavigationUI="False"

Background="#FF000000"

Width="800"

Height="600"

>

<Canvas Loaded="Canvas_Loaded" x:Name="_canvas">

<Canvas.Background>

<LinearGradientBrush>

<GradientStop Color="Black" Offset="0"/>

<GradientStop Color="Black" Offset="1"/>


</LinearGradientBrush>

</Canvas.Background>

<Image Canvas.Left="0" Canvas.Top="0" Name="image1" Stretch="Fill" Width="800"

Source="/IRALAR_UI;component/Resources/Background.png" Height="600" />

<Rectangle Canvas.Left="200" Canvas.Top="28.75" Height="521.75" Name="rectangle1" Width="400"

Fill="black" RadiusX="10" RadiusY="10" Opacity=".75">

<Rectangle.BitmapEffect>

<BevelBitmapEffect />

</Rectangle.BitmapEffect>

</Rectangle>

<Rectangle Canvas.Left="620" Canvas.Top="240" Height="75" Name="rectangle2" Width="150" Fill="black"

RadiusX="10" RadiusY="10" Opacity=".75">

<Rectangle.BitmapEffect>

<BevelBitmapEffect />

</Rectangle.BitmapEffect>

</Rectangle>

<Button HorizontalAlignment="right" VerticalAlignment="Top" Height="36.25" Width="85"

Click="Close_Page" Canvas.Left="505" Canvas.Top="60" Content="EXIT">

<Button.Background>

<LinearGradientBrush EndPoint="0,1" StartPoint="0,0">

<GradientStop Color="#FFF3F3F3" Offset="0"/>

<GradientStop Color="#FFEBEBEB" Offset="0.5"/>

<GradientStop Color="#FFDDDDDD" Offset="0.5"/>

<GradientStop Color="#FFFF0000" Offset="1"/>

</LinearGradientBrush>

</Button.Background>

</Button>

<Button HorizontalAlignment="right" VerticalAlignment="bottom" Height="49.995" Width="50"

Click="GO_TO_SPLASH" Canvas.Left="802" Canvas.Top="550.005" Content="splash" Opacity="0">

<Button.Background>

<LinearGradientBrush EndPoint="0,1" StartPoint="0,0">

<GradientStop Color="#FFF3F3F3" Offset="0"/>

<GradientStop Color="#FFEBEBEB" Offset="0.5"/>

<GradientStop Color="#FFDDDDDD" Offset="0.5"/>

<GradientStop Color="#FF000000" Offset="1"/>

</LinearGradientBrush>

</Button.Background>

</Button>

<Label Canvas.Left="645.129" Canvas.Top="255.051" Height="56.678" Name="Calibrate" FontSize="24"

Foreground="DarkBlue" Width="103.354">

<Label.BitmapEffect>

<OuterGlowBitmapEffect x:Name="Calibrate_Glow" GlowColor="White" />

</Label.BitmapEffect> Calibrate</Label>

<Label Canvas.Left="316.73" Canvas.Top="205.041" Height="56.678" Name="Basic" FontSize="24"

Foreground="DarkBlue" Width="183.37">

<Label.BitmapEffect>

<OuterGlowBitmapEffect GlowColor="White" x:Name="Basic_Glow" />

</Label.BitmapEffect>Timing Collector

</Label>

<Label Canvas.Left="334" Canvas.Top="420" Height="44" Name="Diagnostics" Foreground="DarkBlue"

FontSize="24" Width="132">

<Label.BitmapEffect>

<OuterGlowBitmapEffect x:Name="Diagnostics_Glow" GlowColor="White" GlowSize="5" />

</Label.BitmapEffect> Diagnostics</Label>

<Label Canvas.Left="360" Canvas.Top="351.737" Height="45.009" Name="About" Foreground="DarkBlue"

FontSize="24" Width="80">

<Label.BitmapEffect>

<OuterGlowBitmapEffect x:Name="About_Glow" GlowColor="White" />

</Label.BitmapEffect> About

</Label>

<Label Canvas.Left="364" Canvas.Top="285.057" Height="43.342" Name="Game" Foreground="DarkBlue"

FontSize="24" Width="72">

<Label.BitmapEffect>

<OuterGlowBitmapEffect GlowColor="White" x:Name="Game_Glow" />

</Label.BitmapEffect>Game

</Label>

<Label Canvas.Left="357" Canvas.Top="131.693" Height="45.009" Name="Testing" Foreground="DarkBlue"

FontSize="24" Width="86">

<Label.BitmapEffect>

<OuterGlowBitmapEffect GlowColor="White" x:Name="Testing_Glow" />

</Label.BitmapEffect>Testing

</Label>

<Label Canvas.Left="476.25" Canvas.Top="480" Height="44" Name="Thresh" Foreground="DarkBlue"

FontSize="24">

<Label.BitmapEffect>

<OuterGlowBitmapEffect GlowColor="White" x:Name="Thresh_Glow" />


</Label.BitmapEffect> Threshold

</Label>

<Label Canvas.Left="231.25" Canvas.Top="480" Height="44" Name="Delay" Foreground="DarkBlue"

FontSize="24">

<Label.BitmapEffect>

<OuterGlowBitmapEffect GlowColor="White" x:Name="Delay_Glow" />

</Label.BitmapEffect> Delay

</Label>

<Label Canvas.Left="222" Canvas.Top="60" Height="45" Name="Minimize" Foreground="DarkBlue"

FontSize="24">

<Label.BitmapEffect>

<OuterGlowBitmapEffect GlowColor="White" x:Name="Minimize_Glow" />

</Label.BitmapEffect>Minimize

</Label>

<Rectangle Canvas.Left="222" Canvas.Top="60" Height="50" Stroke="Transparent" Fill="Transparent"

Name="Minimize_blk" Width="111.25" MouseEnter="Minimize_MouseEnter" MouseDown="Minimize_MouseDown"

MouseLeave="Minimize_MouseLeave"/>

<Rectangle Canvas.Left="635.127" Canvas.Top="255.051" Height="50.01" Stroke="Transparent"

Fill="Transparent" Name="Calibrate_blk" Width="126.692" MouseEnter="Calibrate_MouseEnter"

MouseDown="Calibrate_MouseDown" MouseLeave="Calibrate_MouseLeave"/>

<Rectangle Canvas.Left="348" Canvas.Top="130.026" Height="45.009" Stroke="Transparent"

Fill="Transparent" Name="Testing_blk" Width="112" MouseEnter="Testing_MouseEnter"

MouseDown="Testing_MouseDown" MouseLeave="Testing_MouseLeave"/>

<Rectangle Canvas.Left="347.5" Canvas.Top="345.069" Height="53.344" Stroke="Transparent"

Fill="Transparent" Name="About_blk" Width="112.5" MouseEnter="About_MouseEnter"

MouseDown="About_MouseDown" MouseLeave="About_MouseLeave"/>

<Rectangle Canvas.Left="476.25" Canvas.Top="480" Height="45" Stroke="Transparent" Fill="Transparent"

Name="Thresh_blk" Width="114.827" MouseEnter="Thresh_MouseEnter" MouseDown="Thresh_MouseDown"

MouseLeave="Thresh_MouseLeave"/>

<Rectangle Canvas.Left="222" Canvas.Top="480" Height="53.75" Stroke="Transparent" Fill="Transparent"

Name="Delay_blk" Width="92" MouseEnter="Delay_MouseEnter" MouseDown="Delay_MouseDown"

MouseLeave="Delay_MouseLeave"/>

<Rectangle Canvas.Left="357" Canvas.Top="278.389" Height="53.344" Stroke="Transparent"

Fill="Transparent" Name="Game_blk" Width="95" MouseEnter="Game_MouseEnter"

MouseDown="Game_MouseDown" MouseLeave="Game_MouseLeave"/>

<Rectangle Canvas.Left="334" Canvas.Top="420" Height="53.75" Stroke="Transparent" Fill="Transparent"

Name="Diagnostics_blk" Width="144" MouseEnter="Diagnostics_MouseEnter" MouseDown="Diagnostics_MouseDown"

MouseLeave="Diagnostics_MouseLeave"/>

<Rectangle Canvas.Left="308" Canvas.Top="200.04" Height="56.678" Stroke="Transparent"

Fill="Transparent" Name="Basic_blk" Width="200.04" MouseEnter="Basic_MouseEnter"

MouseDown="Basic_MouseDown" MouseLeave="Basic_MouseLeave"/>

</Canvas>

</Page>

Splash

Splash.xaml.cs

using System;

using System.Collections.Generic;

using System.Text;

using System.Diagnostics;

using System.Windows;

using System.Windows.Controls;

using System.Windows.Data;

using System.Windows.Forms;

using System.Windows.Documents;

using System.Windows.Input;

using System.Windows.Media;

using System.Windows.Media.Imaging;

using System.Windows.Navigation;

using System.Windows.Shapes;

namespace IRALAR_UI

{

/// <summary>

/// Interaction logic for Splash.xaml

/// </summary>


public partial class Splash : Page

{

public Splash()

{

InitializeComponent();

Process x = Process.Start("IRALAR_IPA.EXE");

}

private void Splash_Canvas_Loaded(object sender, RoutedEventArgs e)

{

}

private void Animation_Completed(object sender, EventArgs e)

{

NavigationService.Navigate(new Calibration());

}

}

}
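The splash page both plays the fade-in animation and launches the external image-processing executable; Process.Start will throw if IRALAR_IPA.EXE cannot be found, which would prevent the UI from opening at all. A defensive sketch of the launch is shown below; TryStartIpa and the message text are illustrative, not part of the original source.

using System.ComponentModel;
using System.Diagnostics;
using System.Windows;

static class IpaLauncher
{
    // Sketch: start the image-processing application and surface a
    // readable error instead of an unhandled exception if it is missing.
    public static Process TryStartIpa()
    {
        try
        {
            return Process.Start("IRALAR_IPA.EXE");
        }
        catch (Win32Exception ex)
        {
            // Thrown when the executable cannot be found or started.
            MessageBox.Show("Could not start IRALAR_IPA.EXE: " + ex.Message);
            return null;
        }
    }
}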

Splash.xaml

<Page

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

x:Class="IRALAR_UI.Splash"

Title="Splash"

OpacityMask="{x:Null}"

ShowsNavigationUI="False"

Width="800"

Height="600"

x:Name="page"

xmlns:d="http://schemas.microsoft.com/expression/blend/2006"

xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"

mc:Ignorable="d"

>

<Page.Resources>

<Storyboard x:Key="OnLoaded1">

<DoubleAnimationUsingKeyFrames Completed="Animation_Completed"

BeginTime="00:00:00" Storyboard.TargetName="image"

Storyboard.TargetProperty="(UIElement.Opacity)">

<SplineDoubleKeyFrame KeyTime="00:00:02.6000000" Value="0"/>

</DoubleAnimationUsingKeyFrames>

</Storyboard>

</Page.Resources>

<Page.Background>

<LinearGradientBrush EndPoint="0.5,1" StartPoint="0.5,0">

<GradientStop Color="#FF000000" Offset="0"/>

<GradientStop Color="#FF000000" Offset="0"/>

<GradientStop Color="#FF610000" Offset="1"/>

</LinearGradientBrush>

</Page.Background>

<Page.Triggers>

<EventTrigger RoutedEvent="FrameworkElement.Loaded">

<BeginStoryboard Storyboard="{StaticResource OnLoaded1}" x:Name="OnLoaded1_BeginStoryboard"/>

</EventTrigger>

</Page.Triggers>

<Canvas>

<Image Canvas.Left="0" Canvas.Top="0" Name="image1" Stretch="Fill" Width="800"

Source="/IRALAR_UI;component/Resources/Background.png" Height="600" />

<Image Canvas.Left="60" Canvas.Top="60" Height="435" Name="image" Stretch="Fill" Width="655"

Source="/IRALAR_UI;component/Resources/splash.png" />

</Canvas>

</Page>


UI_About

[Screenshot of the UI_About page]

UI_About.xaml.cs

using System;

using System.Collections.Generic;

using System.Text;

using System.Windows;

using System.Windows.Controls;

using System.Windows.Data;

using System.Windows.Documents;

using System.Windows.Input;

using System.Windows.Media;

using System.Windows.Media.Imaging;

using System.Windows.Navigation;

using System.Windows.Shapes;

using System.Windows.Media.Animation;

namespace IRALAR_UI

{

/// <summary>

/// Interaction logic for UI_About.xaml

/// </summary>

public partial class UI_About : Page

{

public UI_About()

{

InitializeComponent();

Console.WriteLine("About Page Loaded");

}

private void GO_TO_MAIN(object sender, RoutedEventArgs e)

{

NavigationService.Navigate(new MainPage());

}

private void Canvas_Loaded(object sender, RoutedEventArgs e)

{

DoubleAnimation fadein = new DoubleAnimation();

fadein.From = 0;

fadein.To = 1;

fadein.Duration = TimeSpan.FromSeconds(1);

_canvas.BeginAnimation(Canvas.OpacityProperty, fadein);

DoubleAnimation MOVE_WELCOME = new DoubleAnimation();

MOVE_WELCOME.BeginTime = TimeSpan.Parse("0:0:5");

MOVE_WELCOME.From = 1;

MOVE_WELCOME.To = 0;


MOVE_WELCOME.Duration = TimeSpan.FromSeconds(1);

splash.BeginAnimation(Canvas.OpacityProperty, MOVE_WELCOME);

/*DoubleAnimation anim1 = new DoubleAnimation();

anim1.BeginTime = TimeSpan.Parse("0:0:5");

anim1.From = 0;

anim1.To = 1;

anim1.Duration = TimeSpan.FromSeconds(1);

BU_logo.BeginAnimation(Canvas.OpacityProperty, anim1);

DoubleAnimation anim2 = new DoubleAnimation();

anim2.BeginTime = TimeSpan.Parse("0:0:10");

anim2.From = 1;

anim2.To = 0;

anim2.Duration = TimeSpan.FromSeconds(1);

BU_logo.BeginAnimation(Canvas.OpacityProperty, anim2);

*/

}

}

}

UI_About.xaml

<Page

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

x:Class="IRALAR_UI.UI_About"

Title="UI_About"

Background="#FF636363"

Width="800"

Height="600">

<Canvas Loaded="Canvas_Loaded" Name="_canvas">

<Image Canvas.Left="0" Canvas.Top="0" Height="600" Name="image1" Stretch="Fill" Width="800"

Source="/IRALAR_UI;component/Resources/Background.png" />

<Button HorizontalAlignment="right" VerticalAlignment="Top" Canvas.Left="600" Canvas.Top="0"

Height="75" Width="125" Click="GO_TO_MAIN" Content="MAIN">

<Button.Background>

<LinearGradientBrush EndPoint="0,1" StartPoint="0,0">

<GradientStop Color="WhiteSmoke" Offset="0"/>

<GradientStop Color="LightGreen" Offset="0.5"/>

<GradientStop Color="Green" Offset="1"/>

</LinearGradientBrush>

</Button.Background>

</Button>

<Image Canvas.Left="50" Canvas.Top="50" Name="splash" Stretch="Fill" Width="700"

Source="/IRALAR_UI;component/Resources/splash.png" Opacity="1"/>

<!--<Image Canvas.Left="50" Canvas.Top="50" Name="BU_logo" Stretch="Fill" Width="700"

Source="/IRALAR_UI;component/Resources/BU.png" Opacity="0"/>-->

</Canvas>

</Page>


UI_Mole

[Screenshot of the UI_Mole game page]

UI_Mole.xaml.cs

using System;

using System.Collections.Generic;

using System.Text;

using System.Windows;

using System.Windows.Controls;

using System.Windows.Data;

using System.Windows.Documents;

using System.Windows.Input;

using System.Windows.Media;

using System.Windows.Media.Imaging;

using System.Windows.Navigation;

using System.Windows.Shapes;

using System.Windows.Media.Animation;

using System.Windows.Threading;

namespace IRALAR_UI

{

/// <summary>

/// Interaction logic for UI_Mole.xaml

/// </summary>

public partial class UI_Mole : Page

{

Random RandomClass = new Random();

DispatcherTimer dt = new DispatcherTimer();

double time;

int score = 0;

int frame = 0;

bool play = false;

public UI_Mole()

{

InitializeComponent();

Console.WriteLine("Advanced Page Loaded");

dt.Interval = TimeSpan.FromMilliseconds(100);

dt.Tick += new EventHandler(dt_Tick);

dt.Start();


}

private void GO_TO_MAIN(object sender, RoutedEventArgs e)

{

dt.Stop();

NavigationService.Navigate(new MainPage());

}

private void Canvas_Loaded(object sender, RoutedEventArgs e)

{

DoubleAnimation fadein = new DoubleAnimation();

fadein.From = 0;

fadein.To = 1;

fadein.Duration = TimeSpan.FromSeconds(1);

_canvas.BeginAnimation(Canvas.OpacityProperty, fadein);

DoubleAnimation MOVE_label1 = new DoubleAnimation();

MOVE_label1.BeginTime = TimeSpan.Parse("0:0:5");

MOVE_label1.To = -500;

MOVE_label1.Duration = TimeSpan.FromSeconds(1.5);

label1.BeginAnimation(Canvas.TopProperty, MOVE_label1);

DoubleAnimation FADE_label = new DoubleAnimation();

FADE_label.BeginTime = TimeSpan.Parse("0:0:5");

FADE_label.To = 0;

FADE_label.Duration = TimeSpan.FromSeconds(1.5);

label1.BeginAnimation(Canvas.OpacityProperty, FADE_label);

label2.BeginAnimation(Canvas.OpacityProperty, FADE_label);

DoubleAnimation MOVE_label2 = new DoubleAnimation();

MOVE_label2.BeginTime = TimeSpan.Parse("0:0:5");

MOVE_label2.To = 1000;

MOVE_label2.Duration = TimeSpan.FromSeconds(1.5);

label2.BeginAnimation(Canvas.TopProperty, MOVE_label2);

DoubleAnimation MOVE_Start = new DoubleAnimation();

MOVE_Start.Duration = TimeSpan.Parse("0:0:1");

MOVE_Start.BeginTime = TimeSpan.Parse("0:0:5");

MOVE_Start.To = 1;

_start.BeginAnimation(Canvas.OpacityProperty, MOVE_Start);

_start.IsEnabled = true;

}

private void dt_Tick(object sender, EventArgs e)

{

//Console.WriteLine(".");

if (play == true) {

//Console.WriteLine(".");

if (RandomClass.Next(1000) < 30) { IN_1(); }

if (RandomClass.Next(1000) < 30) { IN_2(); }

if (RandomClass.Next(1000) < 30) { IN_3(); }

if (RandomClass.Next(1000) < 30) { IN_4(); }

if (RandomClass.Next(1000) < 30) { IN_5(); }

if (RandomClass.Next(1000) < 30) { IN_6(); }

if (RandomClass.Next(1000) < 30) { IN_7(); }

if (RandomClass.Next(1000) < 30) { IN_8(); }

if (RandomClass.Next(1000) < 30) { IN_9(); }

if (RandomClass.Next(1000) < 10) { OUT_1(); }

if (RandomClass.Next(1000) < 10) { OUT_2(); }

if (RandomClass.Next(1000) < 10) { OUT_3(); }

if (RandomClass.Next(1000) < 10) { OUT_4(); }

if (RandomClass.Next(1000) < 10) { OUT_5(); }

if (RandomClass.Next(1000) < 10) { OUT_6(); }

if (RandomClass.Next(1000) < 10) { OUT_7(); }

if (RandomClass.Next(1000) < 10) { OUT_8(); }

if (RandomClass.Next(1000) < 10) { OUT_9(); }

time = time + .1;

UPDATE_TIME();

}

else{

// this is for the non-playing mode

++frame;

//Console.WriteLine(frame);

if(frame >= 50)

{

IN_1();

IN_2();

IN_3();


IN_4();

IN_5();

IN_6();

IN_7();

IN_8();

IN_9();

frame = 0;

}

}

if (time > 40)

{

time = 0;

play = false;

OUT_1();

OUT_2();

OUT_3();

OUT_4();

OUT_5();

OUT_6();

OUT_7();

OUT_8();

OUT_9();

DoubleAnimation MOVE_Start = new DoubleAnimation();

MOVE_Start.Duration = TimeSpan.Parse("0:0:1");

MOVE_Start.To = 1;

_start.BeginAnimation(Canvas.OpacityProperty, MOVE_Start);

_start.IsEnabled = true;

}

}

private void Start_Click(object sender, RoutedEventArgs e)

{

OUT_1();

OUT_2();

OUT_3();

OUT_4();

OUT_5();

OUT_6();

OUT_7();

OUT_8();

OUT_9();

score = -1;

UPDATE_SCORE();

time = 0;

play = true;

Console.WriteLine("game started");

//move the start button off of the screen

DoubleAnimation MOVE_Start = new DoubleAnimation();

MOVE_Start.Duration = TimeSpan.Parse("0:0:1");

MOVE_Start.To = 0;

_start.BeginAnimation(Canvas.OpacityProperty, MOVE_Start);

_start.IsEnabled = false;

}

private void UPDATE_TIME()

{

DoubleAnimation Time_edit = new DoubleAnimation();

Time_edit.Duration = TimeSpan.Parse("0:0:0.1");

Time_edit.To = time * 20;

Time.BeginAnimation(Canvas.WidthProperty, Time_edit);

}

private void UPDATE_SCORE()

{

if (play == true) { Score.Content = ++score + " Moles!"; }

}

private void button1_Click(object sender, RoutedEventArgs e)

{

UPDATE_SCORE();


OUT_1();

}

private void button2_Click(object sender, RoutedEventArgs e)

{

UPDATE_SCORE();

OUT_2();

}

private void button3_Click(object sender, RoutedEventArgs e)

{

UPDATE_SCORE();

OUT_3();

}

private void button4_Click(object sender, RoutedEventArgs e)

{

UPDATE_SCORE();

OUT_4();

}

private void button5_Click(object sender, RoutedEventArgs e)

{

UPDATE_SCORE();

OUT_5();

}

private void button6_Click(object sender, RoutedEventArgs e)

{

UPDATE_SCORE();

OUT_6();

}

private void button7_Click(object sender, RoutedEventArgs e)

{

UPDATE_SCORE();

OUT_7();

}

private void button8_Click(object sender, RoutedEventArgs e)

{

UPDATE_SCORE();

OUT_8();

}

private void button9_Click(object sender, RoutedEventArgs e)

{

UPDATE_SCORE();

OUT_9();

}

private void OUT_1()

{

DoubleAnimation go_away = new DoubleAnimation();

go_away.To = 0;

go_away.Duration = TimeSpan.FromMilliseconds(150);

button1.BeginAnimation(Canvas.OpacityProperty, go_away);

button1.IsEnabled = false;

}

private void IN_1()

{

DoubleAnimation go_away = new DoubleAnimation();

go_away.To = 1;

go_away.Duration = TimeSpan.FromMilliseconds(150);

button1.BeginAnimation(Canvas.OpacityProperty, go_away);

button1.IsEnabled = true;

}

private void OUT_2()

{

DoubleAnimation go_away = new DoubleAnimation();

go_away.To = 0;

go_away.Duration = TimeSpan.FromMilliseconds(150);

button2.BeginAnimation(Canvas.OpacityProperty, go_away);

button2.IsEnabled = false;

}

private void IN_2()

{

DoubleAnimation go_away = new DoubleAnimation();

go_away.To = 1;

go_away.Duration = TimeSpan.FromMilliseconds(150);

button2.BeginAnimation(Canvas.OpacityProperty, go_away);

button2.IsEnabled = true;

}

private void OUT_3()

{

DoubleAnimation go_away = new DoubleAnimation();


go_away.To = 0;

go_away.Duration = TimeSpan.FromMilliseconds(150);

button3.BeginAnimation(Canvas.OpacityProperty, go_away);

button3.IsEnabled = false;

}

private void IN_3()

{

DoubleAnimation go_away = new DoubleAnimation();

go_away.To = 1;

go_away.Duration = TimeSpan.FromMilliseconds(150);

button3.BeginAnimation(Canvas.OpacityProperty, go_away);

button3.IsEnabled = true;

}

private void OUT_4()

{

DoubleAnimation go_away = new DoubleAnimation();

go_away.To = 0;

go_away.Duration = TimeSpan.FromMilliseconds(150);

button4.BeginAnimation(Canvas.OpacityProperty, go_away);

button4.IsEnabled = false;

}

private void IN_4()

{

DoubleAnimation go_away = new DoubleAnimation();

go_away.To = 1;

go_away.Duration = TimeSpan.FromMilliseconds(150);

button4.BeginAnimation(Canvas.OpacityProperty, go_away);

button4.IsEnabled = true;

}

private void OUT_5()

{

DoubleAnimation go_away = new DoubleAnimation();

go_away.To = 0;

go_away.Duration = TimeSpan.FromMilliseconds(150);

button5.BeginAnimation(Canvas.OpacityProperty, go_away);

button5.IsEnabled = false;

}

private void IN_5()

{

DoubleAnimation go_away = new DoubleAnimation();

go_away.To = 1;

go_away.Duration = TimeSpan.FromMilliseconds(150);

button5.BeginAnimation(Canvas.OpacityProperty, go_away);

button5.IsEnabled = true;

}

private void OUT_6()

{

DoubleAnimation go_away = new DoubleAnimation();

go_away.To = 0;

go_away.Duration = TimeSpan.FromMilliseconds(150);

button6.BeginAnimation(Canvas.OpacityProperty, go_away);

button6.IsEnabled = false;

}

private void IN_6()

{

DoubleAnimation go_away = new DoubleAnimation();

go_away.To = 1;

go_away.Duration = TimeSpan.FromMilliseconds(150);

button6.BeginAnimation(Canvas.OpacityProperty, go_away);

button6.IsEnabled = true;

}

private void OUT_7()

{

DoubleAnimation go_away = new DoubleAnimation();

go_away.To = 0;

go_away.Duration = TimeSpan.FromMilliseconds(150);

button7.BeginAnimation(Canvas.OpacityProperty, go_away);

button7.IsEnabled = false;

}

private void IN_7()

{

DoubleAnimation go_away = new DoubleAnimation();

go_away.To = 1;

go_away.Duration = TimeSpan.FromMilliseconds(150);

button7.BeginAnimation(Canvas.OpacityProperty, go_away);

button7.IsEnabled = true;

}


private void OUT_8()

{

DoubleAnimation go_away = new DoubleAnimation();

go_away.To = 0;

go_away.Duration = TimeSpan.FromMilliseconds(150);

button8.BeginAnimation(Canvas.OpacityProperty, go_away);

button8.IsEnabled = false;

}

private void IN_8()

{

DoubleAnimation go_away = new DoubleAnimation();

go_away.To = 1;

go_away.Duration = TimeSpan.FromMilliseconds(150);

button8.BeginAnimation(Canvas.OpacityProperty, go_away);

button8.IsEnabled = true;

}

private void OUT_9()

{

DoubleAnimation go_away = new DoubleAnimation();

go_away.To = 0;

go_away.Duration = TimeSpan.FromMilliseconds(150);

button9.BeginAnimation(Canvas.OpacityProperty, go_away);

button9.IsEnabled = false;

}

private void IN_9()

{

DoubleAnimation go_away = new DoubleAnimation();

go_away.To = 1;

go_away.Duration = TimeSpan.FromMilliseconds(150);

button9.BeginAnimation(Canvas.OpacityProperty, go_away);

button9.IsEnabled = true;

}

}

}
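The nine OUT_n/IN_n pairs above differ only in the button they animate. A minimal consolidation sketch (a hypothetical refactoring, not part of the delivered source; FadeButton is an assumed name) would be:

// Hypothetical helper; one parameterized method could replace the eighteen OUT_n/IN_n methods.
private void FadeButton(Button target, bool show)
{
    DoubleAnimation fade = new DoubleAnimation();
    fade.To = show ? 1 : 0;                        // 1 = fade in, 0 = fade out
    fade.Duration = TimeSpan.FromMilliseconds(150);
    target.BeginAnimation(Canvas.OpacityProperty, fade);
    target.IsEnabled = show;                       // moles are only clickable while visible
}

With this helper, OUT_1() becomes FadeButton(button1, false) and IN_1() becomes FadeButton(button1, true).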

UI_Mole.xaml <Page x:Class="IRALAR_UI.UI_Mole"

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

Title="UI_Mole"

Background="Black"

Width="800"

Height="600"

>

<Canvas Loaded="Canvas_Loaded" Name="_canvas" IsEnabled="True">

<Image Canvas.Left="0" Canvas.Top="2.5" Height="600" Name="BG" Stretch="Fill" Width="799.92"

Source="/IRALAR_UI;component/Resources/01454_greenforever_1600x1200.jpg" IsEnabled="True" />

<Button x:Name="_start" Height="100" Width="100" Content="START!" Click="Start_Click" Canvas.Left="0"

Opacity="0" Canvas.Top="0">

</Button>

<Button Canvas.Left="600" Canvas.Top="0" Height="75" Width="125" Name="EXIT" Click="GO_TO_MAIN"

Foreground="Black" IsEnabled="True">Exit Game</Button>

<Button Canvas.Left="150" Canvas.Top="100" Height="100" Name="button1" Width="100"

Background="Transparent" Click="button1_Click" IsEnabled="True" IsDefault="False" IsCancel="False"

ClickMode="Release" Opacity="0">

<Image Canvas.Left="578.831" Canvas.Top="137.764" Height="68.882" Name="image1"

Source="/IRALAR_UI;component/Resources/mole.png" Stretch="Fill" Width="74.437" />

</Button>

<Button Canvas.Left="350" Canvas.Top="100" Height="100" Name="button2" Width="100"

Background="Transparent" Click="button2_Click" IsEnabled="True" Opacity="0">

<Image Canvas.Left="578.831" Canvas.Top="137.764" Height="68.882" Name="image2"

Source="/IRALAR_UI;component/Resources/mole.png" Stretch="Fill" Width="74.437" />

</Button>

<Button Canvas.Left="550" Canvas.Top="100" Height="100" Name="button3" Width="100"

Background="Transparent" Click="button3_Click" IsEnabled="True" Opacity="0">

<Image Canvas.Left="578.831" Canvas.Top="137.764" Height="68.882" Name="image3" Stretch="Fill"

Width="74.437" Source="/IRALAR_UI;component/Resources/mole.png" />

</Button>

<Button Canvas.Left="150" Canvas.Top="250" Height="100" Name="button4" Width="100"

Background="Transparent" Click="button4_Click" IsEnabled="True" Opacity="0">


<Image Canvas.Left="578.831" Canvas.Top="137.764" Height="68.882" Name="image4"

Source="/IRALAR_UI;component/Resources/mole.png" Stretch="Fill" Width="74.437" />

</Button>

<Button Canvas.Left="350" Canvas.Top="250" Height="100" Name="button5" Width="100"

Background="Transparent" Click="button5_Click" IsEnabled="True" Opacity="0">

<Image Canvas.Left="578.831" Canvas.Top="137.764" Height="68.882" Name="image5"

Source="/IRALAR_UI;component/Resources/mole.png" Stretch="Fill" Width="74.437" />

</Button>

<Button Canvas.Left="550" Canvas.Top="250" Height="100" Name="button6" Width="100"

Background="Transparent" Click="button6_Click" IsEnabled="True" Opacity="0">

<Image Canvas.Left="578.831" Canvas.Top="137.764" Height="68.882" Name="image6"

Source="/IRALAR_UI;component/Resources/mole.png" Stretch="Fill" Width="74.437" />

</Button>

<Button Canvas.Left="156.5" Canvas.Top="400" Height="100" Name="button7" Width="100"

Background="Transparent" Click="button7_Click" IsEnabled="True" Opacity="0">

<Image Canvas.Left="578.831" Canvas.Top="137.764" Height="68.882" Name="image7"

Source="/IRALAR_UI;component/Resources/mole.png" Stretch="Fill" Width="74.437" />

</Button>

<Button Canvas.Left="350" Canvas.Top="400" Height="100" Name="button8" Width="100"

Background="Transparent" Click="button8_Click" IsEnabled="True" Opacity="0">

<Image Canvas.Left="578.831" Canvas.Top="137.764" Height="68.882" Name="image8"

Source="/IRALAR_UI;component/Resources/mole.png" Stretch="Fill" Width="74.437" IsEnabled="True" />

</Button>

<Button Canvas.Left="550" Canvas.Top="400" Height="100" Name="button9" Width="100"

Background="Transparent" Click="button9_Click" IsEnabled="True" Opacity="0">

<Image Canvas.Left="578.831" Canvas.Top="137.764" Height="68.882" Name="image9"

Source="/IRALAR_UI;component/Resources/mole.png" Stretch="Fill" Width="74.437" />

</Button>

<Label x:Name="Score" Foreground="AliceBlue" Height="40" Canvas.Left="350" Canvas.Top="510" Width="110"

Background="Transparent" FontSize="24"></Label>

<Rectangle Canvas.Left="350" Canvas.Top="535" Height="{Binding ElementName=Score, Path=ActualHeight}"

Width="{Binding ElementName=Score, Path=ActualWidth}">

<Rectangle.Fill>

<VisualBrush Visual="{Binding ElementName=Score}"/>

</Rectangle.Fill>

<Rectangle.LayoutTransform>

<ScaleTransform ScaleY="-0.75"/>

</Rectangle.LayoutTransform>

<Rectangle.OpacityMask>

<LinearGradientBrush EndPoint="0,1">

<GradientStop Offset="0" Color="Transparent" />

<GradientStop Offset="1" Color="#77000000" />

</LinearGradientBrush>

</Rectangle.OpacityMask>

</Rectangle>

<Rectangle x:Name="Time" Width="800" Height="10" Fill="PaleGreen" Canvas.Left="0" Canvas.Top="590"

RadiusX="5" RadiusY="5">

<Rectangle.BitmapEffect>

<BevelBitmapEffect EdgeProfile="CurvedOut" BevelWidth="10" Smoothness="0.7" Relief="0.1" />

</Rectangle.BitmapEffect>

</Rectangle>

<Label Canvas.Left="109.989" Canvas.Top="167.761" Height="166.68" Name="label1" Width="613.272"

FontSize="100" Foreground="AliceBlue" ClipToBounds="False">

<Label.BitmapEffect>

<DropShadowBitmapEffect Color="AntiqueWhite" />

</Label.BitmapEffect> Look-a-Mole</Label>

<Label Canvas.Left="202.202" Canvas.Top="301.116" Height="48.884" Name="label2" Width="385.517"

Foreground="AliceBlue" FontSize="24">

<Label.BitmapEffect>

<DropShadowBitmapEffect Color="AntiqueWhite" />

</Label.BitmapEffect> Look at the moles to score points!</Label>

</Canvas>

</Page>


UI_TC1

Screenshot

UI_TC1.xaml.cs using System;

using System.Collections.Generic;

using System.Text;

using System.Windows;

using System.Windows.Controls;

using System.Windows.Data;

using System.Windows.Documents;

using System.Windows.Input;

using System.Windows.Media;

using System.Windows.Media.Imaging;

using System.Windows.Navigation;

using System.Windows.Shapes;

using System.Windows.Media.Animation;

using System.Windows.Threading;

namespace IRALAR_UI

{

/// <summary>

/// Interaction logic for Page2.xaml

/// </summary>

public partial class UI_TC1 : Page

{

public UI_TC1()

{

InitializeComponent();

}

private void Canvas_Loaded(object sender, RoutedEventArgs e)

{

DoubleAnimation fadein = new DoubleAnimation();

fadein.From = 0;

fadein.To = 1;

fadein.Duration = TimeSpan.FromSeconds(1);

_canvas.BeginAnimation(Canvas.OpacityProperty, fadein);

}

private void GO_Click(object sender, RoutedEventArgs e)

{

NavigationService.Navigate(new UI_TC2());

}


private void GO_TO_MAIN(object sender, RoutedEventArgs e)

{

NavigationService.Navigate(new MainPage());

}

}

}

UI_TC1.xaml <Page x:Class="IRALAR_UI.UI_TC1"

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

Title="Page1" Height="600" Width="800">

<Canvas Loaded="Canvas_Loaded" Name="_canvas" IsEnabled="True" Background="DarkBlue">

<Image Canvas.Left="0" Canvas.Top="0" Height="600" Name="image1" Stretch="Fill" Width="800"

Source="/IRALAR_UI;component/Resources/Background.png" />

<Image Canvas.Left="33" Canvas.Top="36" Height="100" Name="logo" Stretch="Fill" Width="100"

Source="/IRALAR_UI;component/Resources/Icon.png" />

<Rectangle Canvas.Left="102" Canvas.Top="143" Height="243" Name="rectangle1" Stroke="Black" Width="541"

Fill="Black" RadiusX="40" RadiusY="40" Opacity=".75">

<Rectangle.BitmapEffect>

<BevelBitmapEffect />

</Rectangle.BitmapEffect>

</Rectangle>

<Rectangle Canvas.Left="102" Canvas.Top="397.299" Height="75" Name="rectangle2" Stroke="Black"

Width="541" Fill="Black" RadiusX="40" RadiusY="40" Opacity=".75">

<Rectangle.BitmapEffect>

<BevelBitmapEffect />

</Rectangle.BitmapEffect>

</Rectangle>

<Rectangle Canvas.Left="121" Canvas.Top="223" Height="50" Width="50" Name="RECT_over" RadiusX="5"

RadiusY="5" Stroke="Black" Fill="White" Opacity="1">

<Rectangle.BitmapEffect>

<BevelBitmapEffect />

</Rectangle.BitmapEffect>

</Rectangle>

<Rectangle Canvas.Left="121" Canvas.Top="303" Height="50" Width="50" Name="RECT_down" RadiusX="5"

RadiusY="5" Stroke="Black" Fill="Gray" Opacity="1">

<Rectangle.BitmapEffect>

<BevelBitmapEffect />

</Rectangle.BitmapEffect>

</Rectangle>

<Label Canvas.Left="132.714" Canvas.Top="71" Height="78" Name="welcome" Width="572.67"

Foreground="White" FontSize="42">

<Label.BitmapEffect>

<OuterGlowBitmapEffect GlowColor="Gray" />

</Label.BitmapEffect> IRALAR reaction time testing

</Label>

<Label Canvas.Left="228" Canvas.Top="143" Height="78" Name="instructions" Width="277"

Foreground="White" FontSize="42">

<Label.BitmapEffect>

<DropShadowBitmapEffect Color="AliceBlue" />

</Label.BitmapEffect> Instructions

</Label>

<Label Canvas.Left="177" Canvas.Top="223" Height="78" Name="instructions1" Width="466"

Foreground="White" FontSize="28">

<Label.BitmapEffect>

<DropShadowBitmapEffect Color="AliceBlue" />

</Label.BitmapEffect> place the mouse over white squares

</Label>

<Label Canvas.Left="177" Canvas.Top="304" Height="78" Name="instructions2" Width="466"

Foreground="Gray" FontSize="28">

<Label.BitmapEffect>

<DropShadowBitmapEffect Color="AliceBlue" />

</Label.BitmapEffect> click on gray squares

</Label>

<Label Canvas.Left="157.08" Canvas.Top="405.552" Height="77.826" Name="instructions3" Width="466.242"

Foreground="White" FontSize="28">

<Label.BitmapEffect>

<DropShadowBitmapEffect Color="AliceBlue" />

</Label.BitmapEffect> There will be 50 squares to test!

</Label>

<Button Canvas.Left="215.938" Canvas.Top="495.405" Height="58.176" Name="GO" Click="GO_Click"

Width="289.062">Begin</Button>


<Button Canvas.Left="600" Canvas.Top="0" Height="75" Width="125" Click="GO_TO_MAIN">

<Button.Background>

<LinearGradientBrush EndPoint="0,1" StartPoint="0,0">

<GradientStop Color="WhiteSmoke" Offset="0" />

<GradientStop Color="LightGreen" Offset="0.5" />

<GradientStop Color="Green" Offset="1" />

</LinearGradientBrush>

</Button.Background> MAIN

</Button>

</Canvas>

</Page>

UI_TC2

UI_TC2.xaml.cs using System;

using System.IO;

using System.Collections.Generic;

using System.Text;

using System.Windows;

using System.Windows.Controls;

using System.Windows.Data;

using System.Windows.Documents;

using System.Windows.Input;

using System.Windows.Media;

using System.Windows.Media.Imaging;

using System.Windows.Navigation;

using System.Windows.Shapes;

using System.Windows.Media.Animation;

using System.Windows.Threading;

namespace IRALAR_UI

{

/// <summary>

/// Interaction logic for UI_Basic.xaml

/// </summary>

public partial class UI_TC2 : Page

{

int mode = 0;

int count = 0;

private DateTime startTime;

StreamWriter SW = File.AppendText("point_log_eye.csv");

public UI_TC2()

{

InitializeComponent();

Console.WriteLine("Basic Page Loaded");

// the log writer SW is already opened at field initialization above

string file = DateTime.Now.ToShortTimeString(); // note: unused

startTime = DateTime.Now;

SW.WriteLine(DateTime.Now.ToShortDateString() + " " + DateTime.Now.ToShortTimeString());

PLACE_RECTANGLE();

}

private void Canvas_Loaded(object sender, RoutedEventArgs e)

{

DoubleAnimation fadein = new DoubleAnimation();

fadein.From = 0;

fadein.To = 1;

fadein.Duration = TimeSpan.FromSeconds(1);

_canvas.BeginAnimation(Canvas.OpacityProperty, fadein);

}

private void RECT_over_EVT(object sender, MouseEventArgs e)

{

PLACE_RECTANGLE();

}


private void RECT_down_EVT(object sender, MouseEventArgs e)

{

PLACE_RECTANGLE();

}

private void PLACE_RECTANGLE()

{

TimeSpan interval = DateTime.Now - startTime;

// there are a few modes that rectangles can get placed in...

// mouse over mode

// mouse down mode

// both can be used in two cursor modes...

// known previous cursor mode

// random previous cursor mode

// progress through 25 under each scenario

//determine the mode of operation

if (count >= 0 && count <= 24) { mode = 0; } //mouse over & known cursor

if (count >= 25 && count <= 49) { mode = 1; } //mouse down & known cursor

//the unknown-cursor scenarios (modes 2 and 3 below) are disabled here: counts

//50-99 map straight to mode 4, so the test ends after the 50 squares promised

//on the instruction screen

if (count >= 50 && count <= 74) { mode = 4; } //mouse over & unknown cursor (disabled)

if (count >= 75 && count <= 99) { mode = 4; } //mouse down & unknown cursor (disabled)

if (count >= 100) { mode = 4; } // move on to the next screen

//move all rectangles away, and make them unclickable/invisible

DoubleAnimation GO_AWAY = new DoubleAnimation();

GO_AWAY.To = 0;

GO_AWAY.Duration = TimeSpan.FromSeconds(.01);

DoubleAnimation MOVE_AWAY = new DoubleAnimation();

MOVE_AWAY.To = -50;

MOVE_AWAY.Duration = TimeSpan.FromSeconds(.01);

RECT_down.BeginAnimation(Canvas.OpacityProperty, GO_AWAY);

RECT_down.BeginAnimation(Canvas.TopProperty, MOVE_AWAY);

RECT_down.BeginAnimation(Canvas.LeftProperty, MOVE_AWAY);

RECT_down.IsEnabled = false;

RECT_over.BeginAnimation(Canvas.OpacityProperty, GO_AWAY);

RECT_over.BeginAnimation(Canvas.TopProperty, MOVE_AWAY);

RECT_over.BeginAnimation(Canvas.LeftProperty, MOVE_AWAY);

RECT_over.IsEnabled = false;

//write to the log file & console

if (mode != 4)

{

Console.WriteLine(mode + "," + interval.TotalMilliseconds);

SW.WriteLine(count + "," + mode + "," + interval.TotalMilliseconds);

}

if (mode == 0)

{

//mouse over and known cursor

MOVE_OVER_RECTANGLE();

}

if (mode == 1)

{

//mouse down and known cursor

MOVE_DOWN_RECTANGLE();

}

if (mode == 2)

{

//Mouse over and unknown cursor

MOVE_OVER_RECTANGLE();

MOVE_CURSOR();

}

if (mode == 3)

{

// mouse down and unknown cursor

MOVE_DOWN_RECTANGLE();

MOVE_CURSOR();


}

if (mode == 4)

{

NavigationService.Navigate(new UI_TC3());

SW.Close();

}

//increment the counter

count++;

//save the time

startTime = DateTime.Now;

}

private void MOVE_OVER_RECTANGLE()

{

Random rand = new Random();

int left = rand.Next(50, 650);

int down = rand.Next(50, 450);

DoubleAnimation NEW_LEFT = new DoubleAnimation();

NEW_LEFT.To = left;

NEW_LEFT.Duration = TimeSpan.FromSeconds(.01);

DoubleAnimation NEW_DOWN = new DoubleAnimation();

NEW_DOWN.To = down;

NEW_DOWN.Duration = TimeSpan.FromSeconds(.01);

DoubleAnimation COME_IN = new DoubleAnimation();

COME_IN.To = 1;

COME_IN.Duration = TimeSpan.FromSeconds(.01);

RECT_over.BeginAnimation(Canvas.OpacityProperty, COME_IN);

RECT_over.BeginAnimation(Canvas.TopProperty, NEW_DOWN);

RECT_over.BeginAnimation(Canvas.LeftProperty, NEW_LEFT);

RECT_over.IsEnabled = true;

}

private void MOVE_DOWN_RECTANGLE()

{

Random rand = new Random();

int left = rand.Next(50, 700);

int down = rand.Next(50, 500);

DoubleAnimation NEW_LEFT = new DoubleAnimation();

NEW_LEFT.To = left;

NEW_LEFT.Duration = TimeSpan.FromSeconds(.01);

DoubleAnimation NEW_DOWN = new DoubleAnimation();

NEW_DOWN.To = down;

NEW_DOWN.Duration = TimeSpan.FromSeconds(.01);

DoubleAnimation COME_IN = new DoubleAnimation();

COME_IN.To = 1;

COME_IN.Duration = TimeSpan.FromSeconds(.01);

RECT_down.BeginAnimation(Canvas.OpacityProperty, COME_IN);

RECT_down.BeginAnimation(Canvas.TopProperty, NEW_DOWN);

RECT_down.BeginAnimation(Canvas.LeftProperty, NEW_LEFT);

RECT_down.IsEnabled = true;

}

private void MOVE_CURSOR()

{

// randomly move the cursor to one of the four corners

//0 = top left

//1 = top right

//2 = bottom left

//3 = bottom right

Random rand = new Random();

int num = rand.Next(0, 4);

// Console.WriteLine(num);


if (num == 0) { System.Windows.Forms.Cursor.Position = new System.Drawing.Point(0, 0); }

if (num == 1) { System.Windows.Forms.Cursor.Position = new System.Drawing.Point(800, 0); }

if (num == 2) { System.Windows.Forms.Cursor.Position = new System.Drawing.Point(0, 600); }

if (num == 3) { System.Windows.Forms.Cursor.Position = new System.Drawing.Point(800, 600); }

}

}

}
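One caveat in MOVE_OVER_RECTANGLE and MOVE_DOWN_RECTANGLE: each call constructs a new Random, and on the .NET Framework of this era Random() is seeded from the millisecond tick count, so two generators created in quick succession yield identical sequences. A self-contained demonstration (hypothetical, not part of the project code); sharing one instance avoids repeated placements:

using System;

class RandomSeedDemo
{
    static void Main()
    {
        // Seeded from the same clock tick, so they usually produce equal values.
        Random a = new Random();
        Random b = new Random();
        Console.WriteLine(a.Next(50, 650) + " " + b.Next(50, 650));

        // A single shared instance advances its state between calls.
        Random shared = new Random();
        Console.WriteLine(shared.Next(50, 650) + " " + shared.Next(50, 650));
    }
}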

UI_TC2.xaml <Page x:Class="IRALAR_UI.UI_TC2"

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

Title="UI_TC2"

Width="800" Height="600">

<Canvas Loaded="Canvas_Loaded" Name="_canvas" IsEnabled="True" Background="Black">

<Image Canvas.Left="0" Canvas.Top="0" Height="600" Name="image1" Stretch="Fill" Width="800"

Opacity=".5" Source="/IRALAR_UI;component/Resources/Background.png">

<Image.BitmapEffect>

<BlurBitmapEffect Radius="3" />

</Image.BitmapEffect>

</Image>

<Rectangle Canvas.Left="50" Canvas.Top="50" Height="100" Width="100" Name="RECT_over" RadiusX="5"

RadiusY="5" Stroke="Black" Fill="White" MouseEnter="RECT_over_EVT" Opacity="1"/>

<Rectangle Canvas.Left="167.642" Canvas.Top="50" Height="100" Width="100" Name="RECT_down" RadiusX="5"

RadiusY="5" Stroke="Black" Fill="Gray" MouseDown="RECT_down_EVT" Opacity="1"/>

</Canvas>

</Page>
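Each completed trial appends a line of the form count,mode,milliseconds to point_log_eye.csv (see PLACE_RECTANGLE above), with a date/time stamp written at the start of every session. A minimal post-processing sketch (hypothetical; not part of the project code) that averages reaction time per mode:

using System;
using System.IO;

class LogSummary
{
    static void Main()
    {
        double[] sum = new double[2];
        int[] n = new int[2];
        foreach (string line in File.ReadAllLines("point_log_eye.csv"))
        {
            string[] parts = line.Split(',');
            int mode;
            double ms;
            // skip the date/time stamps written at session start
            if (parts.Length != 3 || !int.TryParse(parts[1], out mode)
                || !double.TryParse(parts[2], out ms) || mode > 1) continue;
            sum[mode] += ms;
            n[mode]++;
        }
        Console.WriteLine("mouse-over mean: " + (n[0] > 0 ? sum[0] / n[0] : 0) + " ms");
        Console.WriteLine("mouse-down mean: " + (n[1] > 0 ? sum[1] / n[1] : 0) + " ms");
    }
}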

UI_TC3

UI_TC3.xaml.cs using System;

using System.Collections.Generic;

using System.Text;

using System.Windows;

using System.Windows.Controls;

using System.Windows.Data;

using System.Windows.Documents;

using System.Windows.Input;

using System.Windows.Media;

using System.Windows.Media.Imaging;

using System.Windows.Navigation;

using System.Windows.Shapes;

using System.Windows.Media.Animation;

using System.Windows.Threading;

namespace IRALAR_UI

{

/// <summary>

/// Interaction logic for Page3.xaml

/// </summary>

public partial class UI_TC3 : Page

{

public UI_TC3()

{

InitializeComponent();

}

private void Canvas_Loaded(object sender, RoutedEventArgs e)

{

DoubleAnimation fadein = new DoubleAnimation();

fadein.From = 0;

fadein.To = 1;

fadein.Duration = TimeSpan.FromSeconds(1);

_canvas.BeginAnimation(Canvas.OpacityProperty, fadein);

}

private void DoubleAnimationUsingKeyFrames_Completed(object sender, EventArgs e)

{

NavigationService.Navigate(new UI_TC1());


}

}

}

UI_TC3.xaml <Page x:Class="IRALAR_UI.UI_TC3"

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

Title="UI_TC3" Height="600" Width="800">

<Page.Resources>

<Storyboard x:Key="OnLoaded1">

<DoubleAnimationUsingKeyFrames Completed="DoubleAnimationUsingKeyFrames_Completed"

BeginTime="00:00:00" Storyboard.TargetName="label1"

Storyboard.TargetProperty="(UIElement.Opacity)">

<SplineDoubleKeyFrame KeyTime="00:00:00.500000" Value=".5"/>

<SplineDoubleKeyFrame KeyTime="00:00:01.000000" Value="1"/>

<SplineDoubleKeyFrame KeyTime="00:00:01.500000" Value=".5"/>

<SplineDoubleKeyFrame KeyTime="00:00:02.000000" Value="1"/>

<SplineDoubleKeyFrame KeyTime="00:00:02.500000" Value=".5"/>

<SplineDoubleKeyFrame KeyTime="00:00:03.000000" Value="1"/>

<SplineDoubleKeyFrame KeyTime="00:00:03.500000" Value=".5"/>

<SplineDoubleKeyFrame KeyTime="00:00:04.000000" Value="1"/>

</DoubleAnimationUsingKeyFrames>

</Storyboard>

</Page.Resources>

<Page.Triggers>

<EventTrigger RoutedEvent="FrameworkElement.Loaded">

<BeginStoryboard Storyboard="{StaticResource OnLoaded1}" x:Name="OnLoaded1_BeginStoryboard"/>

</EventTrigger>

</Page.Triggers>

<Canvas Loaded="Canvas_Loaded" Name="_canvas" IsEnabled="True" Background="DarkBlue">

<Image Canvas.Left="0" Canvas.Top="0" Height="600" Name="image1" Stretch="Fill" Width="800"

Source="/IRALAR_UI;component/Resources/Background.png" />

<Label Canvas.Left="238.39" Canvas.Top="190.712" Height="129.192" Name="label1" Width="342.974"

FontSize="72" Foreground="White">Complete</Label>

</Canvas>

</Page>

UI_Test

Screenshot


UI_Test.xaml.cs using System;

using System.Collections.Generic;

using System.Text;

using System.Diagnostics;

using System.Windows;

using System.Windows.Controls;

using System.Windows.Data;

using System.Windows.Documents;

using System.Windows.Input;

using System.Windows.Media;

using System.Windows.Media.Imaging;

using System.Windows.Navigation;

using System.Windows.Shapes;

using System.Windows.Media.Animation;

namespace IRALAR_UI

{

/// <summary>

/// Interaction logic for UI_Test.xaml

/// </summary>

public partial class UI_Test : Page

{

public UI_Test()

{

InitializeComponent();

}

private void GO_TO_MAIN(object sender, RoutedEventArgs e)

{

NavigationService.Navigate(new MainPage());

}

private void Canvas_Loaded(object sender, RoutedEventArgs e)

{

DoubleAnimation fadein = new DoubleAnimation();

fadein.From = 0;

fadein.To = 1;

fadein.Duration = TimeSpan.FromSeconds(1);

_canvas.BeginAnimation(Canvas.OpacityProperty, fadein);

}

}

}

UI_Test.xaml <Page
xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
x:Class="IRALAR_UI.UI_Test"
Title="UI_Test"
Background="#FF636363"
Width="800"
Height="600">

<Canvas Loaded="Canvas_Loaded" Name="_canvas"

Background="Red">

<Rectangle Canvas.Top="0" Canvas.Left="0"

Width="80" Height="60" Name="_0_0" Stroke="Black">

<Rectangle.Fill>

<RadialGradientBrush>

<GradientStop Color="AliceBlue"

Offset="0"/>

<GradientStop Color="LightBlue"

Offset="1"/>

</RadialGradientBrush>

</Rectangle.Fill>

<Rectangle.Triggers>

<EventTrigger

RoutedEvent="Rectangle.MouseEnter">

<BeginStoryboard>

<Storyboard>

<DoubleAnimation

Storyboard.TargetProperty="Opacity"

From="1.0" To="0.0"

Duration="0:0:.25"

/>

</Storyboard>

</BeginStoryboard>

</EventTrigger>

<EventTrigger

RoutedEvent="Rectangle.MouseLeave">

<BeginStoryboard>

<Storyboard>

<DoubleAnimation

Storyboard.TargetProperty="Opacity"

From="0.0" To="1.0"

Duration="0:0:.25"

/>

</Storyboard>

</BeginStoryboard>

</EventTrigger>


</Rectangle.Triggers>

</Rectangle>

[REPEATED DECLARATIONS OF RECTANGLE OBJECTS, EFFECTS, AND ANIMATIONS]

<Button HorizontalAlignment="right"

VerticalAlignment="Top" Canvas.Left="625"

Canvas.Top="25" Height="75" Width="125"

Click="GO_TO_MAIN" Content="MAIN">

<Button.Background>

<LinearGradientBrush EndPoint="0,1"

StartPoint="0,0">

<GradientStop

Color="WhiteSmoke" Offset="0"/>

<GradientStop

Color="LightGreen" Offset="0.5"/>

<GradientStop Color="Green"

Offset="1"/>

</LinearGradientBrush>

</Button.Background>

</Button>

</Canvas>

</Page>

UserActivityHook.cs using System;

using System.Collections.Generic;

using System.Linq;

using System.Text;

using System.Runtime.InteropServices;

using System.Reflection;

using System.Threading;

using System.Windows.Forms;

using System.ComponentModel;

namespace gma.System.Windows

{

/// <summary>
/// This class allows you to tap keyboard and mouse and/or to detect their
/// activity even when an application runs in the background or does not have
/// any user interface at all. This class raises common .NET events with
/// KeyEventArgs and MouseEventArgs so you can easily retrieve any information
/// you need.
/// </summary>

public class UserActivityHook

{

#region Windows structure definitions

/// <summary>

/// The POINT structure defines the x- and y-coordinates of a point.
/// </summary>
/// <remarks>
/// http://msdn.microsoft.com/library/default.asp?url=/library/en-us/gdi/rectangl_0tiq.asp
/// </remarks>
[StructLayout(LayoutKind.Sequential)]
private class POINT
{
    /// <summary>
    /// Specifies the x-coordinate of the point.
    /// </summary>
    public int x;
    /// <summary>
    /// Specifies the y-coordinate of the point.
    /// </summary>
    public int y;
}
/// <summary>
/// The MOUSEHOOKSTRUCT structure contains information about a mouse event
/// passed to a WH_MOUSE hook procedure, MouseProc.
/// </summary>
/// <remarks>
/// http://msdn.microsoft.com/library/default.asp?url=/library/en-us/winui/winui/windowsuserinterface/windowing/hooks/hookreference/hookstructures/cwpstruct.asp
/// </remarks>
[StructLayout(LayoutKind.Sequential)]
private class MouseHookStruct
{
    /// <summary>
    /// Specifies a POINT structure that contains the x- and y-coordinates of the cursor, in screen coordinates.
    /// </summary>
    public POINT pt;
    /// <summary>
    /// Handle to the window that will receive the mouse message corresponding to the mouse event.
    /// </summary>
    public int hwnd;
    /// <summary>
    /// Specifies the hit-test value. For a list of hit-test values, see the description of the WM_NCHITTEST message.
    /// </summary>
    public int wHitTestCode;
    /// <summary>
    /// Specifies extra information associated with the message.
    /// </summary>
    public int dwExtraInfo;
}
/// <summary>
/// The MSLLHOOKSTRUCT structure contains information about a low-level mouse input event.
/// </summary>
[StructLayout(LayoutKind.Sequential)]
private class MouseLLHookStruct
{
    /// <summary>
    /// Specifies a POINT structure that contains the x- and y-coordinates of the cursor, in screen coordinates.
    /// </summary>
    public POINT pt;
    /// <summary>
    /// If the message is WM_MOUSEWHEEL, the high-order word of this member is the
    /// wheel delta. The low-order word is reserved. A positive value indicates
    /// that the wheel was rotated forward, away from the user; a negative value
    /// indicates that the wheel was rotated backward, toward the user. One wheel
    /// click is defined as WHEEL_DELTA, which is 120.
    /// If the message is WM_XBUTTONDOWN, WM_XBUTTONUP, WM_XBUTTONDBLCLK,
    /// WM_NCXBUTTONDOWN, WM_NCXBUTTONUP, or WM_NCXBUTTONDBLCLK, the high-order
    /// word specifies which X button was pressed or released (XBUTTON1, the
    /// first X button, or XBUTTON2, the second), and the low-order word is
    /// reserved. Otherwise, mouseData is not used.
    /// </summary>
    public int mouseData;
    /// <summary>
    /// Specifies the event-injected flag. An application can use bit 0
    /// (LLMHF_INJECTED) to test whether the event was injected: the value is 1
    /// if the event was injected; otherwise, it is 0. Bits 1-15 are reserved.
    /// </summary>
    public int flags;
    /// <summary>
    /// Specifies the time stamp for this message.
    /// </summary>
    public int time;
    /// <summary>
    /// Specifies extra information associated with the message.
    /// </summary>
    public int dwExtraInfo;
}
/// <summary>
/// The KBDLLHOOKSTRUCT structure contains information about a low-level keyboard input event.
/// </summary>
/// <remarks>
/// http://msdn.microsoft.com/library/default.asp?url=/library/en-us/winui/winui/windowsuserinterface/windowing/hooks/hookreference/hookstructures/cwpstruct.asp
/// </remarks>
[StructLayout(LayoutKind.Sequential)]
private class KeyboardHookStruct
{
    /// <summary>
    /// Specifies a virtual-key code. The code must be a value in the range 1 to 254.
    /// </summary>
    public int vkCode;
    /// <summary>
    /// Specifies a hardware scan code for the key.
    /// </summary>
    public int scanCode;
    /// <summary>
    /// Specifies the extended-key flag, event-injected flag, context code, and transition-state flag.
    /// </summary>
    public int flags;
    /// <summary>
    /// Specifies the time stamp for this message.
    /// </summary>
    public int time;
    /// <summary>
    /// Specifies extra information associated with the message.
    /// </summary>
    public int dwExtraInfo;

}

#endregion

#region Windows function imports

/// <summary>

/// The SetWindowsHookEx function installs an application-defined hook procedure
/// into a hook chain. You would install a hook procedure to monitor the system
/// for certain types of events. These events are associated either with a
/// specific thread or with all threads in the same desktop as the calling thread.
/// </summary>
/// <param name="idHook">
/// [in] Specifies the type of hook procedure to be installed. This parameter
/// can be one of the following values.
/// </param>
/// <param name="lpfn">
/// [in] Pointer to the hook procedure. If the dwThreadId parameter is zero or
/// specifies the identifier of a thread created by a different process, the
/// lpfn parameter must point to a hook procedure in a dynamic-link library
/// (DLL). Otherwise, lpfn can point to a hook procedure in the code associated
/// with the current process.
/// </param>
/// <param name="hMod">
/// [in] Handle to the DLL containing the hook procedure pointed to by the lpfn
/// parameter. The hMod parameter must be set to NULL if the dwThreadId
/// parameter specifies a thread created by the current process and if the hook
/// procedure is within the code associated with the current process.
/// </param>
/// <param name="dwThreadId">
/// [in] Specifies the identifier of the thread with which the hook procedure
/// is to be associated. If this parameter is zero, the hook procedure is
/// associated with all existing threads running in the same desktop as the
/// calling thread.
/// </param>
/// <returns>
/// If the function succeeds, the return value is the handle to the hook
/// procedure. If the function fails, the return value is NULL. To get extended
/// error information, call GetLastError.
/// </returns>
/// <remarks>
/// http://msdn.microsoft.com/library/default.asp?url=/library/en-us/winui/winui/windowsuserinterface/windowing/hooks/hookreference/hookfunctions/setwindowshookex.asp

/// </remarks>

[DllImport("user32.dll", CharSet = CharSet.Auto,

CallingConvention = CallingConvention.StdCall,

SetLastError = true)]

private static extern int SetWindowsHookEx(

int idHook,

HookProc lpfn,

IntPtr hMod,

int dwThreadId);

/// <summary>

/// The UnhookWindowsHookEx function removes a hook procedure installed in a
/// hook chain by the SetWindowsHookEx function.
/// </summary>
/// <param name="idHook">
/// [in] Handle to the hook to be removed. This parameter is a hook handle
/// obtained by a previous call to SetWindowsHookEx.
/// </param>
/// <returns>
/// If the function succeeds, the return value is nonzero. If the function
/// fails, the return value is zero. To get extended error information, call
/// GetLastError.
/// </returns>
/// <remarks>
/// http://msdn.microsoft.com/library/default.asp?url=/library/en-us/winui/winui/windowsuserinterface/windowing/hooks/hookreference/hookfunctions/setwindowshookex.asp

/// </remarks>

[DllImport("user32.dll", CharSet = CharSet.Auto,

CallingConvention = CallingConvention.StdCall,

SetLastError = true)]

private static extern int UnhookWindowsHookEx(int

idHook);

/// <summary>

/// The CallNextHookEx function passes the hook information to the next hook
/// procedure in the current hook chain. A hook procedure can call this
/// function either before or after processing the hook information.
/// </summary>
/// <param name="idHook">Ignored.</param>
/// <param name="nCode">
/// [in] Specifies the hook code passed to the current hook procedure. The next
/// hook procedure uses this code to determine how to process the hook information.
/// </param>
/// <param name="wParam">
/// [in] Specifies the wParam value passed to the current hook procedure. The
/// meaning of this parameter depends on the type of hook associated with the
/// current hook chain.
/// </param>
/// <param name="lParam">
/// [in] Specifies the lParam value passed to the current hook procedure. The
/// meaning of this parameter depends on the type of hook associated with the
/// current hook chain.
/// </param>
/// <returns>
/// This value is returned by the next hook procedure in the chain. The current
/// hook procedure must also return this value. The meaning of the return value
/// depends on the hook type. For more information, see the descriptions of the
/// individual hook procedures.
/// </returns>
/// <remarks>
/// http://msdn.microsoft.com/library/default.asp?url=/library/en-us/winui/winui/windowsuserinterface/windowing/hooks/hookreference/hookfunctions/setwindowshookex.asp

/// </remarks>

[DllImport("user32.dll", CharSet = CharSet.Auto,

CallingConvention =

CallingConvention.StdCall)]

private static extern int CallNextHookEx(

int idHook,

int nCode,

int wParam,

IntPtr lParam);

/// <summary>

/// The CallWndProc hook procedure is an application-defined or library-defined
/// callback function used with the SetWindowsHookEx function. The HOOKPROC
/// type defines a pointer to this callback function. CallWndProc is a
/// placeholder for the application-defined or library-defined function name.
/// </summary>
/// <param name="nCode">
/// [in] Specifies whether the hook procedure must process the message. If
/// nCode is HC_ACTION, the hook procedure must process the message. If nCode
/// is less than zero, the hook procedure must pass the message to the
/// CallNextHookEx function without further processing and must return the
/// value returned by CallNextHookEx.
/// </param>
/// <param name="wParam">
/// [in] Specifies whether the message was sent by the current thread. If the
/// message was sent by the current thread, it is nonzero; otherwise, it is zero.
/// </param>
/// <param name="lParam">
/// [in] Pointer to a CWPSTRUCT structure that contains details about the message.
/// </param>
/// <returns>
/// If nCode is less than zero, the hook procedure must return the value
/// returned by CallNextHookEx. If nCode is greater than or equal to zero, it
/// is highly recommended that you call CallNextHookEx and return the value it
/// returns; otherwise, other applications that have installed WH_CALLWNDPROC
/// hooks will not receive hook notifications and may behave incorrectly as a
/// result. If the hook procedure does not call CallNextHookEx, the return
/// value should be zero.
/// </returns>
/// <remarks>
/// http://msdn.microsoft.com/library/default.asp?url=/library/en-us/winui/winui/windowsuserinterface/windowing/hooks/hookreference/hookfunctions/callwndproc.asp

/// </remarks>

private delegate int HookProc(int nCode, int

wParam, IntPtr lParam);

/// <summary>

/// The ToAscii function translates the specified virtual-key code and keyboard
/// state to the corresponding character or characters. The function translates
/// the code using the input language and physical keyboard layout identified
/// by the keyboard layout handle.
/// </summary>
/// <param name="uVirtKey">
/// [in] Specifies the virtual-key code to be translated.
/// </param>
/// <param name="uScanCode">
/// [in] Specifies the hardware scan code of the key to be translated. The
/// high-order bit of this value is set if the key is up (not pressed).
/// </param>
/// <param name="lpbKeyState">
/// [in] Pointer to a 256-byte array that contains the current keyboard state.
/// Each element (byte) in the array contains the state of one key. If the
/// high-order bit of a byte is set, the key is down (pressed). The low bit, if
/// set, indicates that the key is toggled on. In this function, only the
/// toggle bit of the CAPS LOCK key is relevant. The toggle state of the NUM
/// LOCK and SCROLL LOCK keys is ignored.
/// </param>
/// <param name="lpwTransKey">
/// [out] Pointer to the buffer that receives the translated character or characters.
/// </param>
/// <param name="fuState">
/// [in] Specifies whether a menu is active. This parameter must be 1 if a menu
/// is active, or 0 otherwise.
/// </param>
/// <returns>
/// If the specified key is a dead key, the return value is negative.
/// Otherwise, it is one of the following values.
/// Value Meaning
/// 0 The specified virtual key has no translation for the current state of the keyboard.
/// 1 One character was copied to the buffer.
/// 2 Two characters were copied to the buffer. This usually happens when a
///   dead-key character (accent or diacritic) stored in the keyboard layout
///   cannot be composed with the specified virtual key to form a single character.
/// </returns>
/// <remarks>
/// http://msdn.microsoft.com/library/default.asp?url=/library/en-us/winui/winui/windowsuserinterface/userinput/keyboardinput/keyboardinputreference/keyboardinputfunctions/toascii.asp

/// </remarks>

[DllImport("user32")]

private static extern int ToAscii(

int uVirtKey,

int uScanCode,

byte[] lpbKeyState,

byte[] lpwTransKey,

int fuState);

/// <summary>

/// The GetKeyboardState function copies the status of the 256 virtual keys to
/// the specified buffer.
/// </summary>
/// <param name="pbKeyState">
/// [in] Pointer to a 256-byte array that contains keyboard key states.
/// </param>
/// <returns>
/// If the function succeeds, the return value is nonzero. If the function
/// fails, the return value is zero. To get extended error information, call
/// GetLastError.
/// </returns>
/// <remarks>
/// http://msdn.microsoft.com/library/default.asp?url=/library/en-us/winui/winui/windowsuserinterface/userinput/keyboardinput/keyboardinputreference/keyboardinputfunctions/toascii.asp

/// </remarks>

[DllImport("user32")]

private static extern int GetKeyboardState(byte[]

pbKeyState);

[DllImport("user32.dll", CharSet = CharSet.Auto,

CallingConvention = CallingConvention.StdCall)]

private static extern short GetKeyState(int vKey);

#endregion

#region Windows constants

//values from Winuser.h in Microsoft SDK.

/// <summary>
/// Windows NT/2000/XP: Installs a hook procedure that monitors low-level mouse input events.
/// </summary>
private const int WH_MOUSE_LL = 14;
/// <summary>
/// Windows NT/2000/XP: Installs a hook procedure that monitors low-level keyboard input events.
/// </summary>
private const int WH_KEYBOARD_LL = 13;
/// <summary>
/// Installs a hook procedure that monitors mouse messages. For more information, see the MouseProc hook procedure.
/// </summary>
private const int WH_MOUSE = 7;
/// <summary>
/// Installs a hook procedure that monitors keystroke messages. For more information, see the KeyboardProc hook procedure.
/// </summary>
private const int WH_KEYBOARD = 2;
/// <summary>
/// The WM_MOUSEMOVE message is posted to a window when the cursor moves.
/// </summary>
private const int WM_MOUSEMOVE = 0x200;
/// <summary>
/// The WM_LBUTTONDOWN message is posted when the user presses the left mouse button
/// </summary>
private const int WM_LBUTTONDOWN = 0x201;
/// <summary>
/// The WM_RBUTTONDOWN message is posted when the user presses the right mouse button
/// </summary>
private const int WM_RBUTTONDOWN = 0x204;
/// <summary>
/// The WM_MBUTTONDOWN message is posted when the user presses the middle mouse button
/// </summary>
private const int WM_MBUTTONDOWN = 0x207;
/// <summary>
/// The WM_LBUTTONUP message is posted when the user releases the left mouse button
/// </summary>
private const int WM_LBUTTONUP = 0x202;
/// <summary>
/// The WM_RBUTTONUP message is posted when the user releases the right mouse button
/// </summary>
private const int WM_RBUTTONUP = 0x205;
/// <summary>
/// The WM_MBUTTONUP message is posted when the user releases the middle mouse button
/// </summary>
private const int WM_MBUTTONUP = 0x208;
/// <summary>
/// The WM_LBUTTONDBLCLK message is posted when the user double-clicks the left mouse button
/// </summary>
private const int WM_LBUTTONDBLCLK = 0x203;
/// <summary>
/// The WM_RBUTTONDBLCLK message is posted when the user double-clicks the right mouse button
/// </summary>
private const int WM_RBUTTONDBLCLK = 0x206;
/// <summary>
/// The WM_MBUTTONDBLCLK message is posted when the user double-clicks the middle mouse button
/// </summary>
private const int WM_MBUTTONDBLCLK = 0x209;
/// <summary>
/// The WM_MOUSEWHEEL message is posted when the user rotates the mouse wheel.
/// </summary>
private const int WM_MOUSEWHEEL = 0x020A;
/// <summary>
/// The WM_KEYDOWN message is posted to the window with the keyboard focus when
/// a nonsystem key is pressed. A nonsystem key is a key that is pressed when
/// the ALT key is not pressed.
/// </summary>
private const int WM_KEYDOWN = 0x100;
/// <summary>
/// The WM_KEYUP message is posted to the window with the keyboard focus when a
/// nonsystem key is released. A nonsystem key is a key that is pressed when
/// the ALT key is not pressed, or a keyboard key that is pressed when a window
/// has the keyboard focus.
/// </summary>
private const int WM_KEYUP = 0x101;
/// <summary>
/// The WM_SYSKEYDOWN message is posted to the window with the keyboard focus
/// when the user presses the F10 key (which activates the menu bar) or holds
/// down the ALT key and then presses another key. It also occurs when no
/// window currently has the keyboard focus; in this case, the WM_SYSKEYDOWN
/// message is sent to the active window. The window that receives the message
/// can distinguish between these two contexts by checking the context code in
/// the lParam parameter.
/// </summary>
private const int WM_SYSKEYDOWN = 0x104;
/// <summary>
/// The WM_SYSKEYUP message is posted to the window with the keyboard focus
/// when the user releases a key that was pressed while the ALT key was held
/// down. It also occurs when no window currently has the keyboard focus; in
/// this case, the WM_SYSKEYUP message is sent to the active window. The window
/// that receives the message can distinguish between these two contexts by
/// checking the context code in the lParam parameter.
/// </summary>
private const int WM_SYSKEYUP = 0x105;

private const byte VK_SHIFT = 0x10;

private const byte VK_CAPITAL = 0x14;

private const byte VK_NUMLOCK = 0x90;

#endregion

/// <summary>

/// Creates an instance of UserActivityHook object and sets mouse and keyboard hooks.
/// </summary>
/// <exception cref="Win32Exception">Any windows problem.</exception>

public UserActivityHook()

{

Start();

}

/// <summary>

/// Creates an instance of UserActivityHook object and installs both or one of
/// mouse and/or keyboard hooks and starts raising events
/// </summary>
/// <param name="InstallMouseHook"><b>true</b> if mouse events must be monitored</param>
/// <param name="InstallKeyboardHook"><b>true</b> if keyboard events must be monitored</param>
/// <exception cref="Win32Exception">Any windows problem.</exception>
/// <remarks>
/// To create an instance without installing hooks call new UserActivityHook(false, false)
/// </remarks>
public UserActivityHook(bool InstallMouseHook, bool InstallKeyboardHook)

{

Start(InstallMouseHook, InstallKeyboardHook);

}

/// <summary>

/// Destructor; uninstalls the hooks without throwing exceptions.

/// </summary>

~UserActivityHook()

{

//uninstall hooks and do not throw exceptions

Stop(true, true, false);

}

/// <summary>

/// Occurs when the user moves the mouse, presses any mouse button or scrolls the wheel

/// </summary>

public event MouseEventHandler OnMouseActivity;

/// <summary>

/// Occurs when the user presses a key

/// </summary>

public event KeyEventHandler KeyDown;

/// <summary>

/// Occurs when the user presses and releases a key

/// </summary>

public event KeyPressEventHandler KeyPress;

/// <summary>

/// Occurs when the user releases a key

/// </summary>

public event KeyEventHandler KeyUp;

/// <summary>

Page 119: EE 452 Senior Capstone Project Project Reportcegt201.bradley.edu/.../deliverables/IRALAR_full_report.pdfIn a gaze tracking system, images of the eye are taken by a camera and sent

Appendix A: Source Code

Gaze Tracking System P a g e | 119

/// Stores the handle to the mouse hook procedure.

/// </summary>

private int hMouseHook = 0;

/// <summary>

/// Stores the handle to the keyboard hook procedure.

/// </summary>

private int hKeyboardHook = 0;

/// <summary>

/// Declare MouseHookProcedure as HookProc type.

/// </summary>

private static HookProc MouseHookProcedure;

/// <summary>

/// Declare KeyboardHookProcedure as HookProc type.

/// </summary>

private static HookProc KeyboardHookProcedure;

/// <summary>

/// Installs both mouse and keyboard hooks and starts raising events

/// </summary>

/// <exception cref="Win32Exception">Any windows

problem.</exception>

public void Start()

{

this.Start(true, true);

}

/// <summary>

/// Installs both or one of mouse and/or keyboard hooks and starts raising events
/// </summary>
/// <param name="InstallMouseHook"><b>true</b> if mouse events must be monitored</param>
/// <param name="InstallKeyboardHook"><b>true</b> if keyboard events must be monitored</param>
/// <exception cref="Win32Exception">Any windows problem.</exception>
public void Start(bool InstallMouseHook, bool InstallKeyboardHook)

{

// install Mouse hook only if it is not installed and must be installed
if (hMouseHook == 0 && InstallMouseHook)
{
    // Create an instance of HookProc.
    MouseHookProcedure = new HookProc(MouseHookProc);
    //install hook
    hMouseHook = SetWindowsHookEx(
        WH_MOUSE_LL,
        MouseHookProcedure,
        Marshal.GetHINSTANCE(Assembly.GetExecutingAssembly().GetModules()[0]),
        0);
    //If SetWindowsHookEx fails.
    if (hMouseHook == 0)
    {
        //Returns the error code returned by the last unmanaged function called
        //using platform invoke that has the DllImportAttribute.SetLastError flag set.
        int errorCode = Marshal.GetLastWin32Error();
        //do cleanup
        Stop(true, false, false);
        //Initializes and throws a new instance of the Win32Exception class with the specified error.
        throw new Win32Exception(errorCode);
    }
}
// install Keyboard hook only if it is not installed and must be installed
if (hKeyboardHook == 0 && InstallKeyboardHook)
{
    // Create an instance of HookProc.
    KeyboardHookProcedure = new HookProc(KeyboardHookProc);
    //install hook
    hKeyboardHook = SetWindowsHookEx(
        WH_KEYBOARD_LL,
        KeyboardHookProcedure,
        Marshal.GetHINSTANCE(Assembly.GetExecutingAssembly().GetModules()[0]),
        0);
    //If SetWindowsHookEx fails.
    if (hKeyboardHook == 0)
    {
        //Returns the error code returned by the last unmanaged function called
        //using platform invoke that has the DllImportAttribute.SetLastError flag set.
        int errorCode = Marshal.GetLastWin32Error();
        //do cleanup
        Stop(false, true, false);
        //Initializes and throws a new instance of the Win32Exception class with the specified error.
        throw new Win32Exception(errorCode);
    }
}

}

/// <summary>

/// Stops monitoring both mouse and keyboard events and raising events.
/// </summary>
/// <exception cref="Win32Exception">Any windows problem.</exception>

public void Stop()

{

this.Stop(true, true, true);

}

/// <summary>

/// Stops monitoring both or one of mouse and/or keyboard events and raising events.
/// </summary>
/// <param name="UninstallMouseHook"><b>true</b> if mouse hook must be uninstalled</param>
/// <param name="UninstallKeyboardHook"><b>true</b> if keyboard hook must be uninstalled</param>
/// <param name="ThrowExceptions"><b>true</b> if exceptions which occurred during uninstalling must be thrown</param>
/// <exception cref="Win32Exception">Any windows problem.</exception>
public void Stop(bool UninstallMouseHook, bool UninstallKeyboardHook, bool ThrowExceptions)

{

//if mouse hook set and must be uninstalled
if (hMouseHook != 0 && UninstallMouseHook)
{
    //uninstall hook
    int retMouse = UnhookWindowsHookEx(hMouseHook);
    //reset invalid handle
    hMouseHook = 0;
    //if failed and exception must be thrown
    if (retMouse == 0 && ThrowExceptions)
    {
        //Returns the error code returned by the last unmanaged function called
        //using platform invoke that has the DllImportAttribute.SetLastError flag set.
        int errorCode = Marshal.GetLastWin32Error();
        //Initializes and throws a new instance of the Win32Exception class with the specified error.
        throw new Win32Exception(errorCode);
    }
}
//if keyboard hook set and must be uninstalled
if (hKeyboardHook != 0 && UninstallKeyboardHook)
{
    //uninstall hook
    int retKeyboard = UnhookWindowsHookEx(hKeyboardHook);
    //reset invalid handle
    hKeyboardHook = 0;
    //if failed and exception must be thrown
    if (retKeyboard == 0 && ThrowExceptions)
    {
        //Returns the error code returned by the last unmanaged function called
        //using platform invoke that has the DllImportAttribute.SetLastError flag set.
        int errorCode = Marshal.GetLastWin32Error();
        //Initializes and throws a new instance of the Win32Exception class with the specified error.
        throw new Win32Exception(errorCode);
    }
}

}

Page 120: EE 452 Senior Capstone Project Project Reportcegt201.bradley.edu/.../deliverables/IRALAR_full_report.pdfIn a gaze tracking system, images of the eye are taken by a camera and sent

Appendix A: Source Code

Gaze Tracking System P a g e | 120

/// <summary>

/// A callback function which will be called every time a mouse activity is detected.
/// </summary>
/// <param name="nCode">
/// [in] Specifies whether the hook procedure must process the message. If
/// nCode is HC_ACTION, the hook procedure must process the message. If nCode
/// is less than zero, the hook procedure must pass the message to the
/// CallNextHookEx function without further processing and must return the
/// value returned by CallNextHookEx.
/// </param>
/// <param name="wParam">
/// [in] Specifies whether the message was sent by the current thread. If the
/// message was sent by the current thread, it is nonzero; otherwise, it is zero.
/// </param>
/// <param name="lParam">
/// [in] Pointer to a CWPSTRUCT structure that contains details about the message.
/// </param>
/// <returns>
/// If nCode is less than zero, the hook procedure must return the value
/// returned by CallNextHookEx. If nCode is greater than or equal to zero, it
/// is highly recommended that you call CallNextHookEx and return the value it
/// returns; otherwise, other applications that have installed WH_CALLWNDPROC
/// hooks will not receive hook notifications and may behave incorrectly as a
/// result. If the hook procedure does not call CallNextHookEx, the return
/// value should be zero.
/// </returns>

private int MouseHookProc(int nCode, int wParam, IntPtr lParam)
{
    // if ok and someone listens to our events
    if ((nCode >= 0) && (OnMouseActivity != null))
    {
        //Marshall the data from callback.
        MouseLLHookStruct mouseHookStruct =
            (MouseLLHookStruct)Marshal.PtrToStructure(lParam, typeof(MouseLLHookStruct));
        //detect button clicked
        MouseButtons button = MouseButtons.None;
        short mouseDelta = 0;
        switch (wParam)
        {
            case WM_LBUTTONDOWN:
                //case WM_LBUTTONUP:
                //case WM_LBUTTONDBLCLK:
                button = MouseButtons.Left;
                break;
            case WM_RBUTTONDOWN:
                //case WM_RBUTTONUP:
                //case WM_RBUTTONDBLCLK:
                button = MouseButtons.Right;
                break;
            case WM_MOUSEWHEEL:
                //If the message is WM_MOUSEWHEEL, the high-order word of the
                //mouseData member is the wheel delta. One wheel click is
                //defined as WHEEL_DELTA, which is 120.
                //(value >> 16) & 0xffff retrieves the high-order word from the
                //given 32-bit value.
                mouseDelta = (short)((mouseHookStruct.mouseData >> 16) & 0xffff);
                //TODO: X BUTTONS (I don't have them, so I was unable to test.)
                //If the message is WM_XBUTTONDOWN, WM_XBUTTONUP,
                //WM_XBUTTONDBLCLK, WM_NCXBUTTONDOWN, WM_NCXBUTTONUP, or
                //WM_NCXBUTTONDBLCLK, the high-order word specifies which X
                //button was pressed or released, and the low-order word is
                //reserved. Otherwise, mouseData is not used.
                break;
        }
        //double clicks
        int clickCount = 0;
        if (button != MouseButtons.None)
            if (wParam == WM_LBUTTONDBLCLK || wParam == WM_RBUTTONDBLCLK) clickCount = 2;
            else clickCount = 1;
        //generate event
        MouseEventArgs e = new MouseEventArgs(
            button,
            clickCount,
            mouseHookStruct.pt.x,
            mouseHookStruct.pt.y,
            mouseDelta);
        //raise it
        OnMouseActivity(this, e);
    }
    //call next hook
    return CallNextHookEx(hMouseHook, nCode, wParam, lParam);
}

/// <summary>
/// A callback function which will be called every time keyboard activity is detected.
/// </summary>
/// <param name="nCode">
/// [in] Specifies whether the hook procedure must process the message.
/// If nCode is HC_ACTION, the hook procedure must process the message.
/// If nCode is less than zero, the hook procedure must pass the message to the
/// CallNextHookEx function without further processing and must return the
/// value returned by CallNextHookEx.
/// </param>
/// <param name="wParam">
/// [in] Specifies whether the message was sent by the current thread.
/// If the message was sent by the current thread, it is nonzero; otherwise, it is zero.
/// </param>
/// <param name="lParam">
/// [in] Pointer to a CWPSTRUCT structure that contains details about the message.
/// </param>
/// <returns>
/// If nCode is less than zero, the hook procedure must return the value returned by CallNextHookEx.
/// If nCode is greater than or equal to zero, it is highly recommended that you call CallNextHookEx
/// and return the value it returns; otherwise, other applications that have installed WH_CALLWNDPROC
/// hooks will not receive hook notifications and may behave incorrectly as a result. If the hook
/// procedure does not call CallNextHookEx, the return value should be zero.
/// </returns>

private int KeyboardHookProc(int nCode, Int32 wParam, IntPtr lParam)

{

//indicates whether any of the underlying events set the e.Handled flag

bool handled = false;

// if ok and someone listens to our events
if ((nCode >= 0) && (KeyDown != null || KeyUp != null || KeyPress != null))

{

//read the KeyboardHookStruct structure at lParam
KeyboardHookStruct MyKeyboardHookStruct = (KeyboardHookStruct)Marshal.PtrToStructure(lParam, typeof(KeyboardHookStruct));

//raise KeyDown

if (KeyDown != null && (wParam == WM_KEYDOWN || wParam == WM_SYSKEYDOWN))

{

Keys keyData = (Keys)MyKeyboardHookStruct.vkCode;
KeyEventArgs e = new KeyEventArgs(keyData);

KeyDown(this, e);

handled = handled || e.Handled;

}

// raise KeyPress


if (KeyPress != null && wParam == WM_KEYDOWN)

{

bool isDownShift = ((GetKeyState(VK_SHIFT) & 0x80) == 0x80);
bool isDownCapslock = (GetKeyState(VK_CAPITAL) != 0);

byte[] keyState = new byte[256];

GetKeyboardState(keyState);

byte[] inBuffer = new byte[2];

if (ToAscii(MyKeyboardHookStruct.vkCode,
            MyKeyboardHookStruct.scanCode,
            keyState,
            inBuffer,
            MyKeyboardHookStruct.flags) == 1)

{

char key = (char)inBuffer[0];

if ((isDownCapslock ^ isDownShift) && Char.IsLetter(key)) key = Char.ToUpper(key);
KeyPressEventArgs e = new KeyPressEventArgs(key);

KeyPress(this, e);

handled = handled || e.Handled;

}

}

// raise KeyUp

if (KeyUp != null && (wParam == WM_KEYUP || wParam == WM_SYSKEYUP))

{

Keys keyData = (Keys)MyKeyboardHookStruct.vkCode;
KeyEventArgs e = new KeyEventArgs(keyData);

KeyUp(this, e);

handled = handled || e.Handled;

}

}

//if the event was handled in the application, do not hand it off to other listeners

if (handled)

return 1;

else

return CallNextHookEx(hKeyboardHook, nCode, wParam, lParam);

}

}

}
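The two hook procedures above only run once the hooks referenced by hMouseHook and hKeyboardHook are installed and a Windows message loop is pumping on the installing thread. A minimal usage sketch follows; it is not part of the project source, and the wrapper class name and constructor are assumptions standing in for the hook class defined above:

using System;
using System.Windows.Forms;

class HookDemo
{
    static void Main()
    {
        // Assumed wrapper class exposing the OnMouseActivity/KeyDown events
        // raised by MouseHookProc and KeyboardHookProc above; the name is illustrative.
        UserActivityHook hook = new UserActivityHook();
        hook.OnMouseActivity += (object sender, MouseEventArgs e) =>
            Console.WriteLine("Mouse: {0} at ({1},{2})", e.Button, e.X, e.Y);
        hook.KeyDown += (object sender, KeyEventArgs e) =>
            Console.WriteLine("Key down: " + e.KeyCode);
        // Low-level hooks require a message loop on this thread.
        Application.Run();
    }
}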

Timing Collector

App

App.xaml.cs using System;

using System.Net.Sockets;

using System.IO;

using System.ComponentModel;

using System.Net;

using System.Security;

using System.Security.Permissions;

using System.Windows;

using System.Windows.Forms;

using System.Windows.Controls;

using System.Windows.Data;

using System.Windows.Media;

using System.Windows.Media.Animation;

using System.Windows.Navigation;

using System.Diagnostics;

namespace Timing_Collector

{

public partial class App : System.Windows.Application

{

public void AppStartup(object sender, StartupEventArgs args)

{

//start the UI window

NavigationWindow mainWindow = new NavigationWindow();

mainWindow.Background = Brushes.Transparent;

mainWindow.Show();

Console.WriteLine("starting App");

}

}

}

App.xaml <Application


xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

x:Class="Timing_Collector.App"

StartupUri="Page2.xaml"

>

<Application.Resources>

<Style x:Key="{x:Type NavigationWindow}" TargetType="{x:Type NavigationWindow}">

<Setter Property="ShowsNavigationUI" Value="False" />

<Setter Property="Background" Value="Black" />

<Setter Property="Width" Value="810" />

<Setter Property="Height" Value="600" />

<Setter Property="Margin" Value="5" />

</Style>

</Application.Resources>

</Application>

Page1

Page1.xaml.cs using System;

using System.IO;

using System.Collections.Generic;

using System.Text;

using System.Windows;

using System.Windows.Controls;

using System.Windows.Data;

using System.Windows.Documents;

using System.Windows.Input;

using System.Windows.Media;

using System.Windows.Media.Imaging;

using System.Windows.Navigation;

using System.Windows.Shapes;

using System.Windows.Media.Animation;

using System.Windows.Threading;

namespace Timing_Collector

{

/// <summary>

/// Interaction logic for Page1.xaml

/// </summary>

public partial class Page1 : Page

{

int mode = 0;

int count = 0;

private DateTime startTime;

StreamWriter SW = File.AppendText("point_log.csv");

public Page1()

{

InitializeComponent();

Console.WriteLine("Page Loaded");

// create a writer and open the file

string file = DateTime.Now.ToShortTimeString();

startTime = DateTime.Now;

SW.WriteLine(DateTime.Now.ToShortDateString()+" "+DateTime.Now.ToShortTimeString());

PLACE_RECTANGLE();

}

private void Canvas_Loaded(object sender, RoutedEventArgs e)

{

}


private void RECT_over_EVT(object sender, MouseEventArgs e)

{

PLACE_RECTANGLE();

}

private void RECT_down_EVT(object sender, MouseEventArgs e)

{

PLACE_RECTANGLE();

}

private void PLACE_RECTANGLE()

{

TimeSpan interval = DateTime.Now - startTime;

// there are a few modes that rectangles can get placed in...

// mouse over mode

// mouse down mode

// both can be used in two cursor modes...

// known previous cursor mode

// random previous cursor mode

// progress through 25 under each scenario

//determine the mode of operation

if (count >= 0 && count <= 24) { mode = 0;}//mouse over & known cursor

if (count >=25 && count <= 49) { mode = 1;}//mouse down & known cursor

if (count >=50 && count <= 74) { mode = 2;}//mouse over & unknown cursor

if (count >=75 && count <= 99) { mode = 3;}//mouse down & unknown cursor

if (count >= 100) { mode = 4; } // move on to the next screen

//move all rectangles away, and make them unclickable/invisible

DoubleAnimation GO_AWAY = new DoubleAnimation();

GO_AWAY.To = 0;

GO_AWAY.Duration = TimeSpan.FromSeconds(.01);

DoubleAnimation MOVE_AWAY = new DoubleAnimation();

MOVE_AWAY.To = -50;

MOVE_AWAY.Duration = TimeSpan.FromSeconds(.01);

RECT_down.BeginAnimation(Canvas.OpacityProperty, GO_AWAY);

RECT_down.BeginAnimation(Canvas.TopProperty, MOVE_AWAY);

RECT_down.BeginAnimation(Canvas.LeftProperty, MOVE_AWAY);

RECT_down.IsEnabled = false;

RECT_over.BeginAnimation(Canvas.OpacityProperty, GO_AWAY);

RECT_over.BeginAnimation(Canvas.TopProperty, MOVE_AWAY);

RECT_over.BeginAnimation(Canvas.LeftProperty, MOVE_AWAY);

RECT_over.IsEnabled = false;

//write to the log file & console

if (mode != 4)

{

Console.WriteLine(mode + "," + interval.TotalMilliseconds);

SW.WriteLine(count + "," + mode + "," + interval.TotalMilliseconds);

}

if (mode == 0)

{

//mouse over and known cursor

MOVE_OVER_RECTANGLE();

}

if (mode == 1)

{

//mouse down and known cursor

MOVE_DOWN_RECTANGLE();

}

if (mode == 2)

{

//Mouse over and unknown cursor

MOVE_OVER_RECTANGLE();

MOVE_CURSOR();


}

if (mode == 3)

{

// mouse down and unknown cursor

MOVE_DOWN_RECTANGLE();

MOVE_CURSOR();

}

if (mode == 4)

{

NavigationService.Navigate(new Page3());

SW.Close();

}

//increment the counter

count++;

//save the time

startTime = DateTime.Now;

}
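Incidentally, the four count-range checks above partition the 100 trials into blocks of 25, which integer division expresses directly; a one-line equivalent sketch, not taken from the project source:

// counts 0-24 -> mode 0, 25-49 -> 1, 50-74 -> 2, 75-99 -> 3, 100+ -> 4
int mode = Math.Min(count / 25, 4);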

private void MOVE_OVER_RECTANGLE()

{

Random rand = new Random();

int left = rand.Next(50, 650);

int down = rand.Next(50, 450);

DoubleAnimation NEW_LEFT = new DoubleAnimation();

NEW_LEFT.To = left;

NEW_LEFT.Duration = TimeSpan.FromSeconds(.01);

DoubleAnimation NEW_DOWN = new DoubleAnimation();

NEW_DOWN.To = down;

NEW_DOWN.Duration = TimeSpan.FromSeconds(.01);

DoubleAnimation COME_IN = new DoubleAnimation();

COME_IN.To = 1;

COME_IN.Duration = TimeSpan.FromSeconds(.01);

RECT_over.BeginAnimation(Canvas.OpacityProperty, COME_IN);

RECT_over.BeginAnimation(Canvas.TopProperty, NEW_DOWN);

RECT_over.BeginAnimation(Canvas.LeftProperty, NEW_LEFT);

RECT_over.IsEnabled = true;

}

private void MOVE_DOWN_RECTANGLE()

{

Random rand = new Random();

int left = rand.Next(50, 700);

int down = rand.Next(50, 500);

DoubleAnimation NEW_LEFT = new DoubleAnimation();

NEW_LEFT.To = left;

NEW_LEFT.Duration = TimeSpan.FromSeconds(.01);

DoubleAnimation NEW_DOWN = new DoubleAnimation();

NEW_DOWN.To = down;

NEW_DOWN.Duration = TimeSpan.FromSeconds(.01);

DoubleAnimation COME_IN = new DoubleAnimation();

COME_IN.To = 1;

COME_IN.Duration = TimeSpan.FromSeconds(.01);

RECT_down.BeginAnimation(Canvas.OpacityProperty, COME_IN);

RECT_down.BeginAnimation(Canvas.TopProperty, NEW_DOWN);

RECT_down.BeginAnimation(Canvas.LeftProperty, NEW_LEFT);

RECT_down.IsEnabled = true;

}

private void MOVE_CURSOR()

{

// randomly move the cursor to one of the four corners

//0 = top left

//1 = top right

//2 = bottom left


//3 = bottom right

Random rand = new Random();

int num = rand.Next(0,4);

// Console.WriteLine(num);

if (num == 0) { System.Windows.Forms.Cursor.Position = new System.Drawing.Point(0, 0); }

if (num == 1) { System.Windows.Forms.Cursor.Position = new System.Drawing.Point(800, 0); }

if (num == 2) { System.Windows.Forms.Cursor.Position = new System.Drawing.Point(0, 600); }

if (num == 3) { System.Windows.Forms.Cursor.Position = new System.Drawing.Point(800, 600); }

}

}

}
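One caveat about the placement helpers above: each call constructs its own Random, and Random seeds from the system clock, so instances created in quick succession (MOVE_OVER_RECTANGLE or MOVE_DOWN_RECTANGLE followed immediately by MOVE_CURSOR) can produce correlated values. A minimal sketch of the usual remedy, offered as a suggestion rather than project code:

using System;

static class Rng
{
    // One shared generator avoids duplicate clock-based seeds.
    public static readonly Random Shared = new Random();
}

// usage inside the helpers: int left = Rng.Shared.Next(50, 650);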

Page1.xaml <Page x:Class="Timing_Collector.Page1"

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

Title="Page1" Height="600" Width="800">

<Canvas Loaded="Canvas_Loaded" Name="_canvas" IsEnabled="True" Background="Black">

<Image Canvas.Left="0" Canvas.Top="0" Height="600" Name="image1" Stretch="Fill" Width="800"

Opacity="0.75" Source="/IRALAR_TC;component/Resources/Background.png">

<Image.BitmapEffect>

<BlurBitmapEffect Radius="10" />

</Image.BitmapEffect>

</Image>

<Rectangle Canvas.Left="50" Canvas.Top="50" Height="100" Width="100" Name="RECT_over" RadiusX="5"

RadiusY="5" Stroke="Black" Fill="White" MouseEnter="RECT_over_EVT" Opacity="1"/>

<Rectangle Canvas.Left="167.642" Canvas.Top="50" Height="100" Width="100" Name="RECT_down" RadiusX="5"

RadiusY="5" Stroke="Black" Fill="Gray" MouseDown="RECT_down_EVT" Opacity="1"/>

</Canvas>

</Page>

Page2

Screenshot (figure omitted)

Page2.xaml.cs using System;


using System.Collections.Generic;

using System.Text;

using System.Windows;

using System.Windows.Controls;

using System.Windows.Data;

using System.Windows.Documents;

using System.Windows.Input;

using System.Windows.Media;

using System.Windows.Media.Imaging;

using System.Windows.Navigation;

using System.Windows.Shapes;

using System.Windows.Media.Animation;

using System.Windows.Threading;

namespace Timing_Collector

{

/// <summary>

/// Interaction logic for Page2.xaml

/// </summary>

public partial class Page2 : Page

{

public Page2()

{

InitializeComponent();

}

private void Canvas_Loaded(object sender, RoutedEventArgs e)

{

DoubleAnimation fadein = new DoubleAnimation();

fadein.From = 0;

fadein.To = 1;

fadein.Duration = TimeSpan.FromSeconds(1);

_canvas.BeginAnimation(Canvas.OpacityProperty, fadein);

}

private void GO_Click(object sender, RoutedEventArgs e)

{

NavigationService.Navigate(new Page1());

}

private void EXIT_Click(object sender, RoutedEventArgs e)

{

Window win = (Window)this.Parent;

win.Close();

}

}

}

Page2.xaml <Page x:Class="Timing_Collector.Page2"

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

Title="Page1" Height="600" Width="800">

<Canvas Loaded="Canvas_Loaded" Name="_canvas" IsEnabled="True" Background="DarkBlue">

<Image Canvas.Left="0" Canvas.Top="0" Height="600" Name="image1" Stretch="Fill" Width="800"

Source="/IRALAR_TC;component/Resources/Background.png" />

<Image Canvas.Left="33" Canvas.Top="36" Height="100" Name="logo" Stretch="Fill" Width="100"

Source="/IRALAR_TC;component/Resources/Icon.png" />

<Rectangle Canvas.Left="102" Canvas.Top="143" Height="243" Name="rectangle1" Stroke="Black" Width="541"

Fill="Black" RadiusX="40" RadiusY="40" Opacity=".8">

<Rectangle.BitmapEffect>

<BevelBitmapEffect />

</Rectangle.BitmapEffect>

</Rectangle>

<Rectangle Canvas.Left="102" Canvas.Top="397.299" Height="75" Name="rectangle2" Stroke="Black"

Width="541" Fill="Black" RadiusX="40" RadiusY="40" Opacity=".8">

<Rectangle.BitmapEffect>

<BevelBitmapEffect />

</Rectangle.BitmapEffect>

</Rectangle>

<Rectangle Canvas.Left="121" Canvas.Top="223" Height="50" Width="50" Name="RECT_over" RadiusX="5"

RadiusY="5" Stroke="Black" Fill="White" Opacity="1">

<Rectangle.BitmapEffect>

<BevelBitmapEffect />


</Rectangle.BitmapEffect>

</Rectangle>

<Rectangle Canvas.Left="121" Canvas.Top="303" Height="50" Width="50" Name="RECT_down" RadiusX="5"

RadiusY="5" Stroke="Black" Fill="Gray" Opacity="1">

<Rectangle.BitmapEffect>

<BevelBitmapEffect />

</Rectangle.BitmapEffect>

</Rectangle>

<Label Canvas.Left="132.714" Canvas.Top="45.45" Height="78.174" Name="welcome" Width="572.67"

Foreground="White" FontSize="42">

<Label.BitmapEffect>

<OuterGlowBitmapEffect GlowColor="Gray" />

</Label.BitmapEffect> IRALAR reaction time testing

</Label>

<Label Canvas.Left="228" Canvas.Top="143" Height="78" Name="instructions" Width="277"

Foreground="White" FontSize="42">

<Label.BitmapEffect>

<DropShadowBitmapEffect Color="AliceBlue" />

</Label.BitmapEffect> Instructions</Label>

<Label Canvas.Left="177" Canvas.Top="223" Height="78" Name="instructions1" Width="466"

Foreground="White" FontSize="28">

<Label.BitmapEffect>

<DropShadowBitmapEffect Color="AliceBlue" />

</Label.BitmapEffect> place the mouse over white squares</Label>

<Label Canvas.Left="177" Canvas.Top="304" Height="78" Name="instructions2" Width="466"

Foreground="Gray" FontSize="28">

<Label.BitmapEffect>

<DropShadowBitmapEffect Color="AliceBlue" />

</Label.BitmapEffect> click on gray squares</Label>

<Label Canvas.Left="157.08" Canvas.Top="405.552" Height="77.826" Name="instructions3" Width="466.242"

Foreground="White" FontSize="28">

<Label.BitmapEffect>

<DropShadowBitmapEffect Color="AliceBlue" />

</Label.BitmapEffect> There will be 100 squares to test!</Label>

<Button Canvas.Left="215.938" Canvas.Top="495.405" Height="58.176" Name="GO" Click="GO_Click"

Width="289.062">Begin</Button>

<Button Canvas.Left="751.743" Canvas.Top="0" Height="22.725" Name="EXIT" Width="48.177"

Click="EXIT_Click" Content="EXIT">

<Button.Background>

<LinearGradientBrush EndPoint="0,1" StartPoint="0,0">

<GradientStop Color="#FFF3F3F3" Offset="0"/>

<GradientStop Color="#FFEBEBEB" Offset="0.5"/>

<GradientStop Color="#FFDDDDDD" Offset="0.5"/>

<GradientStop Color="#FFFF0000" Offset="1"/>

</LinearGradientBrush>

</Button.Background>

</Button>

</Canvas>

</Page>

Page3

Page3.xaml.cs using System;

using System.Collections.Generic;

using System.Text;

using System.Windows;

using System.Windows.Controls;

using System.Windows.Data;

using System.Windows.Documents;

using System.Windows.Input;

using System.Windows.Media;

using System.Windows.Media.Imaging;

using System.Windows.Navigation;

using System.Windows.Shapes;

using System.Windows.Media.Animation;

using System.Windows.Threading;

namespace Timing_Collector

{

/// <summary>

/// Interaction logic for Page3.xaml

/// </summary>


public partial class Page3 : Page

{

public Page3()

{

InitializeComponent();

}

private void Canvas_Loaded(object sender, RoutedEventArgs e)

{

DoubleAnimation fadein = new DoubleAnimation();

fadein.From = 0;

fadein.To = 1;

fadein.Duration = TimeSpan.FromSeconds(1);

_canvas.BeginAnimation(Canvas.OpacityProperty, fadein);

}

private void DoubleAnimationUsingKeyFrames_Completed(object sender, EventArgs e)

{

NavigationService.Navigate(new Page2());

}

}

}

Page3.xaml <Page x:Class="Timing_Collector.Page3"

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

Title="Page1" Height="600" Width="800">

<Page.Resources>

<Storyboard x:Key="OnLoaded1">

<DoubleAnimationUsingKeyFrames Completed="DoubleAnimationUsingKeyFrames_Completed"

BeginTime="00:00:00" Storyboard.TargetName="label1"

Storyboard.TargetProperty="(UIElement.Opacity)">

<SplineDoubleKeyFrame KeyTime="00:00:00.500000" Value=".5"/>

<SplineDoubleKeyFrame KeyTime="00:00:01.000000" Value="1"/>

<SplineDoubleKeyFrame KeyTime="00:00:01.500000" Value=".5"/>

<SplineDoubleKeyFrame KeyTime="00:00:02.000000" Value="1"/>

<SplineDoubleKeyFrame KeyTime="00:00:02.500000" Value=".5"/>

<SplineDoubleKeyFrame KeyTime="00:00:03.000000" Value="1"/>

<SplineDoubleKeyFrame KeyTime="00:00:03.500000" Value=".5"/>

<SplineDoubleKeyFrame KeyTime="00:00:04.000000" Value="1"/>

</DoubleAnimationUsingKeyFrames>

</Storyboard>

</Page.Resources>

<Page.Triggers>

<EventTrigger RoutedEvent="FrameworkElement.Loaded">

<BeginStoryboard Storyboard="{StaticResource OnLoaded1}" x:Name="OnLoaded1_BeginStoryboard"/>

</EventTrigger>

</Page.Triggers>

<Canvas Loaded="Canvas_Loaded" Name="_canvas" IsEnabled="True" Background="DarkBlue">

<Image Canvas.Left="0" Canvas.Top="0" Height="600" Name="image1" Stretch="Fill" Width="800"

Source="/IRALAR_TC;component/Resources/Background.png" />

<Label Canvas.Left="238.39" Canvas.Top="190.712" Height="129.192" Name="label1" Width="342.974"

FontSize="72" Foreground="White">Complete</Label>

</Canvas>

</Page>

Control Panel

App

App.xaml.cs using System;

using System.Collections.Generic;

using System.Configuration;


using System.Data;

using System.Linq;

using System.Windows;

namespace IRALAR_Control_Panel

{

/// <summary>

/// Interaction logic for App.xaml

/// </summary>

public partial class App : Application

{

}

}

App.xaml <Application x:Class="IRALAR_Control_Panel.App"

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

StartupUri="Window1.xaml">

<Application.Resources>

</Application.Resources>

</Application>

Dialog

Dialog.xaml.cs using System;

using System.Collections.Generic;

using System.Linq;

using System.Text;

using System.Windows;

using System.Windows.Controls;

using System.Windows.Data;

using System.Windows.Documents;

using System.Windows.Input;

using System.Windows.Media;

using System.Windows.Media.Imaging;

using System.Windows.Shapes;

namespace IRALAR_Control_Panel

{

/// <summary>

/// Interaction logic for Dialog.xaml

/// </summary>

public partial class Dialog : Window

{

public Dialog()

{

InitializeComponent();

}

void cancelButton_Click(object sender, RoutedEventArgs e)

{

this.DialogResult = false;

}

void okButton_Click(object sender, RoutedEventArgs e)

{

if (!IsValid(this)) return;

this.DialogResult = true;

}

bool IsValid(DependencyObject node)

{

// Check if dependency object was passed

if (node != null)

{

// Check if dependency object is valid.

// NOTE: Validation.GetHasError works for controls that have validation rules attached

bool isValid = !Validation.GetHasError(node);

if (!isValid)


{

// If the dependency object is invalid, and it can receive the focus,

// set the focus

if (node is IInputElement) Keyboard.Focus((IInputElement)node);

return false;

}

}

// If this dependency object is valid, check all child dependency objects

foreach (object subnode in LogicalTreeHelper.GetChildren(node))

{

if (subnode is DependencyObject)

{

// If a child dependency object is invalid, return false immediately,

// otherwise keep checking

if (IsValid((DependencyObject)subnode) == false) return false;

}

}

// All dependency objects are valid

return true;

}

}

}

Dialog.xaml <Window x:Class="IRALAR_Control_Panel.Dialog"

Title="Enter IP Address:"

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

Height="135"

Width="265"

MinHeight="135"

MinWidth="265"

ResizeMode="CanResizeWithGrip"

ShowInTaskbar="False"

WindowStartupLocation="CenterOwner"

FocusManager.FocusedElement="{Binding ElementName=IP1}">

<Grid>

<Button Name="okButton" Click="okButton_Click" IsDefault="True" Height="23" HorizontalAlignment="Right"

VerticalAlignment="Bottom" Width="42" Margin="0,0,64,0">OK</Button>

<Button Name="cancelButton" IsCancel="True" Width="62" HorizontalAlignment="Right" Height="23"

VerticalAlignment="Bottom">Cancel</Button>

<Label Height="28" Margin="0,0,0,0" Name="label1" VerticalAlignment="Top">Please enter the IP Address

of the computer</Label>

<Label Height="28" Margin="20,13,0,0" Name="label2" VerticalAlignment="Top">running the IRALAR

interface.</Label>

<TextBox Margin="20,40,0,30" Name="IP1" HorizontalAlignment="Left" Width="44" Height="25"/>

<TextBox Margin="70,40,0,30" Name="IP2" HorizontalAlignment="Left" Width="44" Height="25"/>

<TextBox Margin="120,40,0,30" Name="IP3" HorizontalAlignment="Left" Width="44" Height="25"/>

<TextBox Margin="170,40,0,30" Name="IP4" HorizontalAlignment="Left" Width="44" Height="25"/>

</Grid >

</Window>


Window1

Screenshot (figure omitted)

Window1.xaml.cs using System;

using System.Collections.Generic;

using System.Linq;

using System.Text;

using System.Windows;

using System.Windows.Controls;

using System.Windows.Data;

using System.Windows.Documents;

using System.Windows.Input;

using System.Windows.Media;

using System.Windows.Media.Imaging;

using System.Windows.Navigation;

using System.Windows.Shapes;

using System.Net;

using System.Net.Sockets;

using System.IO;

using System.ComponentModel; // CancelEventArgs

using Microsoft.Win32; // OpenFileDialog

namespace IRALAR_Control_Panel

{

/// <summary>

/// Interaction logic for Window1.xaml

/// </summary>

public partial class Window1 : Window

{

Socket client;

int recv;

bool connected = false;

public string IPADDRESS;

private static byte[] data = new byte[100000];

private static IPEndPoint ipep = new IPEndPoint(IPAddress.Any, 0);

private static EndPoint Remote = (EndPoint)ipep;

private static int size = 100;

public double A0, A1, A2, B0, B1, B2;

public Window1()

{

InitializeComponent();

}

private void CONNECT_Click(object sender, RoutedEventArgs e)

{

CONNECT_TO_SERVER_fx();

}

private void CONNECT_TO_SERVER_fx()


{

string IPA = "";

string IPB = "";

string IPC = "";

string IPD = "";

Dialog dlg = new Dialog();

dlg.Owner = this;

dlg.IP1.Text = IPA;

dlg.IP2.Text = IPB;

dlg.IP3.Text = IPC;

dlg.IP4.Text = IPD;

dlg.ShowDialog();

if (dlg.DialogResult == true)

{

IPADDRESS = dlg.IP1.Text + "." + dlg.IP2.Text + "." + dlg.IP3.Text + "." + dlg.IP4.Text;

try

{

ipep = new IPEndPoint(IPAddress.Parse(IPADDRESS), 1200);

client = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp);

client.Connect(ipep);

client.SetSocketOption(SocketOptionLevel.Socket, SocketOptionName.ReceiveTimeout, 3000);

int sockopt = (int)client.GetSocketOption(SocketOptionLevel.Socket, SocketOptionName.ReceiveTimeout);

connected = true;

ADD_TEXT("Connection Established!");

}

catch

{

ADD_TEXT("Connection Failed!");

connected = false;

}

}

else { }

}

private void Calibration_Details_Click(object sender, RoutedEventArgs e)

{

double prog = 0.0; // fraction of the 82 calibration queries completed

// get calibration parameters

if (connected)

{

try

{

CALIBRATION_PROGRESS.Value = 0;

//first, try connecting and getting the calibration parameters

try

{

string msg_to_send = "data_calibration";

data = new byte[100000];

recv = AdvSndRcvData(client, Encoding.ASCII.GetBytes(msg_to_send + "|"), ipep);

string text = Encoding.ASCII.GetString(data, 0, recv);

string[] ReturnParts = text.Split(new Char[] { ' ' });

A0 = Convert.ToDouble(ReturnParts[0]);

A1 = Convert.ToDouble(ReturnParts[1]);

A2 = Convert.ToDouble(ReturnParts[2]);

B0 = Convert.ToDouble(ReturnParts[3]);

B1 = Convert.ToDouble(ReturnParts[4]);

B2 = Convert.ToDouble(ReturnParts[5]);

prog = 1.0 / 82; // floating-point division; the integer expression 1 / 82 evaluates to 0
CALIBRATION_PROGRESS.Value = prog * 100; // scale to the ProgressBar's default 0-100 range

}

catch

{

ADD_TEXT("Server Communication Error!");

}

}

catch

{

ADD_TEXT("Error Gathering Coefficients!");

}


int i = 0;

while (i <= 80)

{

try

{

//gather information for each point

string msg_to_send = "Get_Pt:" + i.ToString() + " ";

data = new byte[100000];

recv = AdvSndRcvData(client, Encoding.ASCII.GetBytes(msg_to_send + "|"), ipep);

string text = Encoding.ASCII.GetString(data, 0, recv);

string[] ReturnParts = text.Split(new Char[] { ' ' });

double X_err = Convert.ToDouble(ReturnParts[2]) - (A0 + A1 * Convert.ToDouble(ReturnParts[0]) + A2 * Convert.ToDouble(ReturnParts[1]));
double Y_err = Convert.ToDouble(ReturnParts[3]) - (B0 + B1 * Convert.ToDouble(ReturnParts[0]) + B2 * Convert.ToDouble(ReturnParts[1]));

double ERR = Math.Sqrt(X_err * X_err + Y_err * Y_err);

FILL_IN_ERROR(ERR, i);

}

catch

{

FILL_IN_ERROR(255, i);

string d = "Server Communication Error, data point: " + i.ToString();

ADD_TEXT(d);

}

prog = (1 + i) / 82.0; // floating-point division; (1 + i) / 82 truncates to 0 for i < 81
CALIBRATION_PROGRESS.Value = prog * 100; // scale to the ProgressBar's default 0-100 range

i++;

}

}

else {

ADD_TEXT("No Connection to server!"); CONNECT_TO_SERVER_fx();

}

}
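The residual computed above applies the first-order calibration model whose coefficients the data_calibration command returns. In equation form, the predicted screen position for measured eye coordinates is

\hat{x}_{\mathrm{screen}} = A_0 + A_1\, x_{\mathrm{eye}} + A_2\, y_{\mathrm{eye}}, \qquad \hat{y}_{\mathrm{screen}} = B_0 + B_1\, x_{\mathrm{eye}} + B_2\, y_{\mathrm{eye}}

and the per-point error passed to FILL_IN_ERROR is the Euclidean distance

\mathrm{ERR} = \sqrt{(x_{\mathrm{screen}} - \hat{x}_{\mathrm{screen}})^2 + (y_{\mathrm{screen}} - \hat{y}_{\mathrm{screen}})^2}.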

private void FILL_IN_ERROR(double ERR, int i)

{

SolidColorBrush myBrush = new SolidColorBrush(Color.FromRgb((byte) ERR,(byte) ERR,(byte) ERR ));

if (i == 0) { rect0.Fill = myBrush; }

if (i == 1) { rect1.Fill = myBrush; }

if (i == 2) { rect2.Fill = myBrush; }

if (i == 3) { rect3.Fill = myBrush; }

if (i == 4) { rect4.Fill = myBrush; }

if (i == 5) { rect5.Fill = myBrush; }

if (i == 6) { rect6.Fill = myBrush; }

if (i == 7) { rect7.Fill = myBrush; }

if (i == 8) { rect8.Fill = myBrush; }

if (i == 9) { rect9.Fill = myBrush; }

if (i == 10) { rect10.Fill = myBrush; }

if (i == 11) { rect11.Fill = myBrush; }

if (i == 12) { rect12.Fill = myBrush; }

if (i == 13) { rect13.Fill = myBrush; }

if (i == 14) { rect14.Fill = myBrush; }

if (i == 15) { rect15.Fill = myBrush; }

if (i == 16) { rect16.Fill = myBrush; }

if (i == 17) { rect17.Fill = myBrush; }

if (i == 18) { rect18.Fill = myBrush; }

if (i == 19) { rect19.Fill = myBrush; }

if (i == 20) { rect20.Fill = myBrush; }

if (i == 21) { rect21.Fill = myBrush; }

if (i == 22) { rect22.Fill = myBrush; }

if (i == 23) { rect23.Fill = myBrush; }

if (i == 24) { rect24.Fill = myBrush; }

if (i == 25) { rect25.Fill = myBrush; }

if (i == 26) { rect26.Fill = myBrush; }

if (i == 27) { rect27.Fill = myBrush; }

if (i == 28) { rect28.Fill = myBrush; }

if (i == 29) { rect29.Fill = myBrush; }

if (i == 30) { rect30.Fill = myBrush; }

if (i == 31) { rect31.Fill = myBrush; }

if (i == 32) { rect32.Fill = myBrush; }

if (i == 33) { rect33.Fill = myBrush; }

if (i == 34) { rect34.Fill = myBrush; }

if (i == 35) { rect35.Fill = myBrush; }

if (i == 36) { rect36.Fill = myBrush; }


if (i == 37) { rect37.Fill = myBrush; }

if (i == 38) { rect38.Fill = myBrush; }

if (i == 39) { rect39.Fill = myBrush; }

if (i == 40) { rect40.Fill = myBrush; }

if (i == 41) { rect41.Fill = myBrush; }

if (i == 42) { rect42.Fill = myBrush; }

if (i == 43) { rect43.Fill = myBrush; }

if (i == 44) { rect44.Fill = myBrush; }

if (i == 45) { rect45.Fill = myBrush; }

if (i == 46) { rect46.Fill = myBrush; }

if (i == 47) { rect47.Fill = myBrush; }

if (i == 48) { rect48.Fill = myBrush; }

if (i == 49) { rect49.Fill = myBrush; }

if (i == 50) { rect50.Fill = myBrush; }

if (i == 51) { rect51.Fill = myBrush; }

if (i == 52) { rect52.Fill = myBrush; }

if (i == 53) { rect53.Fill = myBrush; }

if (i == 54) { rect54.Fill = myBrush; }

if (i == 55) { rect55.Fill = myBrush; }

if (i == 56) { rect56.Fill = myBrush; }

if (i == 57) { rect57.Fill = myBrush; }

if (i == 58) { rect58.Fill = myBrush; }

if (i == 59) { rect59.Fill = myBrush; }

if (i == 60) { rect60.Fill = myBrush; }

if (i == 61) { rect61.Fill = myBrush; }

if (i == 62) { rect62.Fill = myBrush; }

if (i == 63) { rect63.Fill = myBrush; }

if (i == 64) { rect64.Fill = myBrush; }

if (i == 65) { rect65.Fill = myBrush; }

if (i == 66) { rect66.Fill = myBrush; }

if (i == 67) { rect67.Fill = myBrush; }

if (i == 68) { rect68.Fill = myBrush; }

if (i == 69) { rect69.Fill = myBrush; }

if (i == 70) { rect70.Fill = myBrush; }

if (i == 71) { rect71.Fill = myBrush; }

if (i == 72) { rect72.Fill = myBrush; }

if (i == 73) { rect73.Fill = myBrush; }

if (i == 74) { rect74.Fill = myBrush; }

if (i == 75) { rect75.Fill = myBrush; }

if (i == 76) { rect76.Fill = myBrush; }

if (i == 77) { rect77.Fill = myBrush; }

if (i == 78) { rect78.Fill = myBrush; }

if (i == 79) { rect79.Fill = myBrush; }

if (i == 80) { rect80.Fill = myBrush; }

}
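The 81-branch chain above can be collapsed using the rect0 through rect80 naming pattern declared in Window1.xaml; a minimal equivalent sketch relying only on the standard WPF FrameworkElement.FindName lookup (the clamping line is an added safeguard, since casting an ERR above 255 to byte would otherwise wrap around):

private void FILL_IN_ERROR(double ERR, int i)
{
    // Clamp so large errors saturate at 255 instead of wrapping around.
    byte level = (byte)Math.Min(ERR, 255);
    SolidColorBrush myBrush = new SolidColorBrush(Color.FromRgb(level, level, level));
    // Resolve "rect0".."rect80" by name instead of branching 81 times.
    System.Windows.Shapes.Rectangle rect =
        (System.Windows.Shapes.Rectangle)this.FindName("rect" + i);
    if (rect != null) rect.Fill = myBrush;
}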

private void UPDATE_Click(object sender, RoutedEventArgs e)

{

if (connected)

{

try

{

string msg_to_send = "data_parameters";

data = new byte[100000];

recv = AdvSndRcvData(client, Encoding.ASCII.GetBytes(msg_to_send + "|"), ipep);

string text = Encoding.ASCII.GetString(data, 0, recv);

string[] ReturnParts = text.Split(new Char[] { ' ' });

DELAY_CONTENT.Content = ReturnParts[8];

double XThresh = Convert.ToDouble(ReturnParts[9]);

THRESH_CONTENT.Content = XThresh.ToString();

}

catch

{

ADD_TEXT("Communication Error!");

}

}

else { ADD_TEXT("No Connection to server!"); CONNECT_TO_SERVER_fx(); }

}

private void MS_CRTL_OFF_Click(object sender, RoutedEventArgs e)

{

string StringToSend = "MouseOff";


string msg1 = "SENDING: " + StringToSend;

ADD_TEXT(msg1);

SENDmsg(StringToSend);

}

private void MS_CTRL_ON_Click(object sender, RoutedEventArgs e)

{

string StringToSend = "MouseOn";

string msg1 = "SENDING: " + StringToSend;

ADD_TEXT(msg1);

SENDmsg(StringToSend);

}

private void CLK_CTRL_OFF_Click(object sender, RoutedEventArgs e)

{

string StringToSend = "ClickOff";

string msg1 = "SENDING: " + StringToSend;

ADD_TEXT(msg1);

SENDmsg(StringToSend);

}

private void CLK_CTRL_ON_Click(object sender, RoutedEventArgs e)

{

string StringToSend = "ClickOn";

string msg1 = "SENDING: " + StringToSend;

ADD_TEXT(msg1);

SENDmsg(StringToSend);

}

private void SEND_fx()

{

string StringToSend = COMMAND_TO_SEND.Text;

COMMAND_TO_SEND.Text = "";

string msg1 = "SENDING: " + StringToSend;

ADD_TEXT(msg1);

SENDmsg(StringToSend);

}

private void COMMAND_TO_SEND_KeyDown(object sender, KeyEventArgs e)

{

if (e.Key == Key.Enter) { SEND_fx(); }

}

private void SEND_Click(object sender, RoutedEventArgs e)

{

SEND_fx();

}

private void CLEAR_TXT_BOX_Click(object sender, RoutedEventArgs e)

{

COMMAND_TO_SEND.Text = "";

}

private void CLICK_DELAY_DOWN_Click(object sender, RoutedEventArgs e)

{

string StringToSend = "ClickFrames-";

string msg1 = "SENDING: " + StringToSend;

ADD_TEXT(msg1);

SENDmsg(StringToSend);

UPDATE_Click(sender, e);

}

private void CLICK_DELAY_UP_Click(object sender, RoutedEventArgs e)

{

string StringToSend = "ClickFrames+";

string msg1 = "SENDING: " + StringToSend;

ADD_TEXT(msg1);

SENDmsg(StringToSend);

UPDATE_Click(sender, e);

}

private void CLICK_THRESH_DOWN_Click(object sender, RoutedEventArgs e)


{

string StringToSend = "ClickThreshold-";

string msg1 = "SENDING: " + StringToSend;

ADD_TEXT(msg1);

SENDmsg(StringToSend);

UPDATE_Click(sender, e);

}

private void CLICK_THRESH_UP_Click(object sender, RoutedEventArgs e)

{

string StringToSend = "ClickThreshold+";

string msg1 = "SENDING: " + StringToSend;

ADD_TEXT(msg1);

SENDmsg(StringToSend);

UPDATE_Click(sender, e);

}

private void SENDmsg(string msg_to_send)

{

if (connected)

{

try

{

data = new byte[100000];

recv = AdvSndRcvData(client, Encoding.ASCII.GetBytes(msg_to_send + "|"), ipep);

string text = Encoding.ASCII.GetString(data, 0, recv);

string msg = "RESPONSE: " + text.Remove(text.IndexOf("!"));

ADD_TEXT(msg);

}

catch {

ADD_TEXT("Server Communication Error, possibly no response for command.");

}

}

else { ADD_TEXT("No Connection to server!"); CONNECT_TO_SERVER_fx(); }

}

private void ADD_TEXT(string MESSAGE)

{

SERVER_RESPONSE.Focus();

SERVER_RESPONSE.AppendText(MESSAGE + Environment.NewLine);

EditingCommands.MoveDownByLine.Execute(null, SERVER_RESPONSE);

}

private static int AdvSndRcvData(Socket s, byte[] message, EndPoint rmtdevice)

{

int recv = 0;

int retry = 0;

while (true)

{

Console.WriteLine("Attempt #{0}", retry);

try

{

s.SendTo(message, message.Length, SocketFlags.None, rmtdevice);

data = new byte[size];

recv = s.ReceiveFrom(data, ref Remote);

string text = Encoding.ASCII.GetString(data, 0, recv);

//Console.WriteLine(text);

}

catch (SocketException e)

{
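// 10054 = WSAECONNRESET: the remote end is unreachable or not listening; treat as no data.
// 10040 = WSAEMSGSIZE: the datagram exceeded the receive buffer; grow it and retry.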

if (e.ErrorCode == 10054)

recv = 0;

else if (e.ErrorCode == 10040)

{

Console.WriteLine("Error receiving packet");

size += 10;

recv = 0;

}

}

if (recv > 0)

{

return recv;

}

else


{

retry++;

if (retry > 4)

{

return 0;

}

}

}

}

}

}

Window1.xaml <Window x:Class="IRALAR_Control_Panel.Window1"

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"

xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

Title="IRALAR Control Panel" Height="460" Width="800">

<Grid>

<Image Name="image1" Stretch="Fill" Opacity=".25"

Source="/IRALAR_CP;component/Resources/Background.png" />

<Button Height="28.75" Click="CONNECT_Click" Margin="6.25,3.75,0,0" Name="CONNECT"

VerticalAlignment="Top" HorizontalAlignment="Left" Width="88.75">CONNECT</Button>

<Button Height="28.75" Click="UPDATE_Click" Margin="0,150,40,0" Name="UPDATE" VerticalAlignment="Top"

HorizontalAlignment="Right" Width="305">Get Current Delay and Thresh Pameters</Button>

<Button Height="40" HorizontalAlignment="Left" Margin="22.747,37.5,0,0" Name="MS_CRTL_OFF"

VerticalAlignment="Top" Click="MS_CRTL_OFF_Click" Width="183.75">Mouse Control OFF</Button>

<Button Height="40" Margin="212.5,37.5,381.25,0" Name="MS_CTRL_ON" VerticalAlignment="Top"

Click="MS_CTRL_ON_Click">Mouse Control ON</Button>

<Button Height="40" Margin="0,37.5,192.5,0" Name="CLK_CTRL_OFF" VerticalAlignment="Top"

Click="CLK_CTRL_OFF_Click" HorizontalAlignment="Right" Width="183.75">Click Control OFF</Button>

<Button Height="40" HorizontalAlignment="Right" Margin="0,37.5,2.5,0" Name="CLK_CTRL_ON"

VerticalAlignment="Top" Click="CLK_CTRL_ON_Click" Width="183.75">Click Control ON</Button>

<TextBox x:Name="COMMAND_TO_SEND" Margin="68.598,88.75,0,0" Keyboard.KeyDown="COMMAND_TO_SEND_KeyDown"

Height="23.25" VerticalAlignment="Top" HorizontalAlignment="Left" Width="282.717" />

<Button Margin="356.25,88.75,386.25,0" Name="SEND" Click="SEND_Click" Height="23.25"

VerticalAlignment="Top">SEND</Button>

<Button HorizontalAlignment="Left" Margin="22.747,90.105,0,0" Name="CLEAR_TXT_BOX" Width="40.067"

Click="CLEAR_TXT_BOX_Click" Height="23.25" VerticalAlignment="Top">CLEAR</Button>

<RichTextBox Margin="22.747,139.86,0,49.086" Name="SERVER_RESPONSE"

ScrollViewer.VerticalScrollBarVisibility="Visible" HorizontalAlignment="Left" Width="356.253">

<RichTextBox.Resources>

<Style TargetType="{x:Type Paragraph}">

<Setter Property="Margin" Value="0"/>

</Style>

</RichTextBox.Resources>

</RichTextBox>

<Label Margin="386.75,0,192.5,170" Name="COMMAND_LISTING" Height="31"

VerticalAlignment="Bottom">Additional Available Commands:</Label>

<Label Height="28" HorizontalAlignment="Right" Margin="0,0,240.737,163" Name="cmd02"

VerticalAlignment="Bottom" Width="120">data__calibration</Label>

<Label Height="28" HorizontalAlignment="Right" Margin="0,0,240.737,154" Name="cmd03"

VerticalAlignment="Bottom" Width="120">data__mouse</Label>

<Label Height="28" HorizontalAlignment="Right" Margin="0,0,240.737,146" Name="cmd04"

VerticalAlignment="Bottom" Width="120">data__parameters</Label>

<Label Height="28" HorizontalAlignment="Right" Margin="0,0,240.737,136" Name="cmd05"

VerticalAlignment="Bottom" Width="120">data__image</Label>

<Label Height="28" HorizontalAlignment="Right" Margin="0,0,240.737,125.6" Name="cmd14"

VerticalAlignment="Bottom" Width="120">FPS</Label>

<Label Height="28" HorizontalAlignment="Right" Margin="0,0,240.737,115.6" Name="cmd15"

VerticalAlignment="Bottom" Width="120">SAVE__IMAGE:##</Label>

<Label Height="28" HorizontalAlignment="Right" Margin="0,0,240.737,104.4" Name="cmd18"

VerticalAlignment="Bottom" Width="120">EXIT</Label>

<Label Height="28" HorizontalAlignment="Left" Margin="26.307,117.02,0,0" Name="label1"

VerticalAlignment="Top" Width="120">Server Response:</Label>

<Rectangle Height="58.77" Margin="0,86.25,197.5,0" Name="rectangle1" Stroke="Black"

VerticalAlignment="Top" HorizontalAlignment="Right" Width="147.75" />

<Label Height="28" Margin="0,87.395,210,0" Name="label2" VerticalAlignment="Top"

HorizontalAlignment="Right" Width="121">Click Delay Controls</Label>

<Button Height="25" HorizontalAlignment="Right" Margin="0,112.5,306.25,0" Name="CLICK_DELAY_DOWN"

VerticalAlignment="Top" Width="25" Click="CLICK_DELAY_DOWN_Click">-</Button>

<Button Height="25" Margin="0,113.75,210,0" Name="CLICK_DELAY_UP" VerticalAlignment="Top"

Click="CLICK_DELAY_UP_Click" HorizontalAlignment="Right" Width="26">+</Button>


<Rectangle Height="58.107" Margin="0,86.913,40,0" Name="rectangle2" Stroke="Black"

VerticalAlignment="Top" HorizontalAlignment="Right" Width="138.75" />

<Label Height="28" Margin="0,86.913,38.75,0" Name="label3" VerticalAlignment="Top"

HorizontalAlignment="Right" Width="140">Click Threshold Controls</Label>

<Button Height="25" HorizontalAlignment="Right" Margin="0,112.5,143.276,0" Name="CLICK_THRESH_DOWN"

VerticalAlignment="Top" Width="25" Click="CLICK_THRESH_DOWN_Click">-</Button>

<Button Height="25" Margin="0,112.5,49.147,0" Name="CLICK_THRESH_UP" VerticalAlignment="Top"

HorizontalAlignment="Right" Width="25.5" Click="CLICK_THRESH_UP_Click">+</Button>

<Label Height="28" HorizontalAlignment="Right" Margin="0,113.75,240.737,0" Name="DELAY_CONTENT"

VerticalAlignment="Top" Width="63.308"></Label>

<Label Height="28" HorizontalAlignment="Right" Margin="0,112.5,78.75,0" Name="THRESH_CONTENT"

VerticalAlignment="Top" Width="63.308"></Label>

<Button Height="23" HorizontalAlignment="Right" Margin="0,0,6.66,61.812" Name="Calibration_Details"

Click="Calibration_Details_Click" VerticalAlignment="Bottom" Width="152.5">Gather Calibration Details</Button>

<Rectangle Width="144" HorizontalAlignment="Right" Margin="0,0,12,98" Fill="White" Height="108"

VerticalAlignment="Bottom" />

<Rectangle HorizontalAlignment="Right" Name="rect0" Width="16" Margin="0,216,140,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect1" Width="16" Margin="0,216,124,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect2" Width="16" Margin="0,216,108,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect3" Width="16" Margin="0,216,92,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect4" Width="16" Margin="0,216,76,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect5" Width="16" Margin="0,216,60,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect6" Width="16" Margin="0,216,44,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect7" Width="16" Margin="0,216,28,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect8" Width="16" Margin="0,216,12,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect9" Width="16" Margin="0,228,140,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect10" Width="16" Margin="0,228,124,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect11" Width="16" Margin="0,228,108,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect12" Width="16" Margin="0,228,92,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect13" Width="16" Margin="0,228,76,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect14" Width="16" Margin="0,228,60,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect15" Width="16" Margin="0,228,44,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect16" Width="16" Margin="0,228,28,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect17" Width="16" Margin="0,228,12,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect18" Width="16" Margin="0,240,140,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect19" Width="16" Margin="0,240,124,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect20" Width="16" Margin="0,240,108,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect21" Width="16" Margin="0,240,92,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect22" Width="16" Margin="0,240,76,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect23" Width="16" Margin="0,240,60,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect24" Width="16" Margin="0,240,44,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect25" Width="16" Margin="0,240,28,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect26" Width="16" Margin="0,240,12,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect27" Width="16" Margin="0,252,140,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect28" Width="16" Margin="0,252,124,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect29" Width="16" Margin="0,252,108,0" Height="12"

VerticalAlignment="Top"></Rectangle>


<Rectangle HorizontalAlignment="Right" Name="rect30" Width="16" Margin="0,252,92,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect31" Width="16" Margin="0,252,76,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect32" Width="16" Margin="0,252,60,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect33" Width="16" Margin="0,252,44,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect34" Width="16" Margin="0,252,28,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect35" Width="16" Margin="0,252,12,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect36" Width="16" Margin="0,264,140,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect37" Width="16" Margin="0,264,124,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect38" Width="16" Margin="0,264,108,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect39" Width="16" Margin="0,264,92,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect40" Width="16" Margin="0,264,76,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect41" Width="16" Margin="0,264,60,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect42" Width="16" Margin="0,264,44,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect43" Width="16" Margin="0,264,28,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect44" Width="16" Margin="0,264,12,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect45" Width="16" Margin="0,276,140,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect46" Width="16" Margin="0,276,124,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect47" Width="16" Margin="0,276,108,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect48" Width="16" Margin="0,276,92,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect49" Width="16" Margin="0,276,76,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect50" Width="16" Margin="0,276,60,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect51" Width="16" Margin="0,276,44,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect52" Width="16" Margin="0,276,28,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect53" Width="16" Margin="0,276,12,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect54" Width="16" Margin="0,288,140,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect55" Width="16" Margin="0,288,124,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect56" Width="16" Margin="0,288,108,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect57" Width="16" Margin="0,288,92,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect58" Width="16" Margin="0,288,76,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect59" Width="16" Margin="0,288,60,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect60" Width="16" Margin="0,288,44,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect61" Width="16" Margin="0,288,28,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect62" Width="16" Margin="0,288,12,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect63" Width="16" Margin="0,300,140,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect64" Width="16" Margin="0,300,124,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect65" Width="16" Margin="0,300,108,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect66" Width="16" Margin="0,300,92,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect67" Width="16" Margin="0,300,76,0" Height="12"

VerticalAlignment="Top"></Rectangle>


<Rectangle HorizontalAlignment="Right" Name="rect68" Width="16" Margin="0,300,60,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect69" Width="16" Margin="0,300,44,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect70" Width="16" Margin="0,300,28,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect71" Width="16" Margin="0,300,12,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect72" Width="16" Margin="0,312,140,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect73" Width="16" Margin="0,312,124,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect74" Width="16" Margin="0,312,108,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect75" Width="16" Margin="0,312,92,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect76" Width="16" Margin="0,312,76,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect77" Width="16" Margin="0,312,60,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect78" Width="16" Margin="0,312,44,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect79" Width="16" Margin="0,312,28,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle HorizontalAlignment="Right" Name="rect80" Width="16" Margin="0,312,12,0" Height="12"

VerticalAlignment="Top"></Rectangle>

<Rectangle Width="154.179" HorizontalAlignment="Right" Margin="0,0,6.8,93.6" Fill="Transparent"

Stroke="Black" StrokeThickness="8" Height="117.2" VerticalAlignment="Bottom" />

<ProgressBar Height="10" HorizontalAlignment="Right" Margin="0,0,6.672,49.086"

Name="CALIBRATION_PROGRESS" VerticalAlignment="Bottom" Width="152.488" Value="0" />

</Grid>

</Window>

UDP Client

Program.cs /*

C# Network Programming

by Richard Blum

Publisher: Sybex

ISBN: 0782141765

*/

using System;

using System.Net;

using System.Net.Sockets;

using System.Text;

public class BestUdpClient

{

private static byte[] data = new byte[2048];

private static IPEndPoint sender = new IPEndPoint(IPAddress.Any, 0);

private static EndPoint Remote = (EndPoint)sender;

private static int size = 100;

private static int AdvSndRcvData(Socket s, byte[] message, EndPoint rmtdevice)

{

int recv = 0;

int retry = 0;

while (true)

{

Console.WriteLine("Attempt #{0}", retry);

try

{

s.SendTo(message, message.Length, SocketFlags.None, rmtdevice);

data = new byte[size];

recv = s.ReceiveFrom(data, ref Remote);

string text = Encoding.ASCII.GetString(data, 0, recv);

Console.WriteLine(text);


}

catch (SocketException e)

{
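// 10054 = WSAECONNRESET: the remote end is unreachable or not listening; treat as no data.
// 10040 = WSAEMSGSIZE: the datagram exceeded the receive buffer; grow it and retry.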

if (e.ErrorCode == 10054)

recv = 0;

else if (e.ErrorCode == 10040)

{

Console.WriteLine("Error receiving packet");

size += 10;

recv = 0;

}

}

if (recv > 0)

{

return recv;

}

else

{

retry++;

if (retry > 4)

{

return 0;

}

}

}

}

public static void Main()

{

string input, stringData;

int recv;

IPEndPoint ipep = new IPEndPoint(

IPAddress.Parse("127.0.0.1"), 1200);

Socket server = new Socket(AddressFamily.InterNetwork,

SocketType.Dgram, ProtocolType.Udp);

int sockopt = (int)server.GetSocketOption(SocketOptionLevel.Socket, SocketOptionName.ReceiveTimeout);

Console.WriteLine("Default timeout: {0}", sockopt);

server.SetSocketOption(SocketOptionLevel.Socket, SocketOptionName.ReceiveTimeout, 3000);

sockopt = (int)server.GetSocketOption(SocketOptionLevel.Socket, SocketOptionName.ReceiveTimeout);

Console.WriteLine("New timeout: {0}", sockopt);

Console.WriteLine("This Application has connected to LocalHost on port 1200");

while (true)

{

input = Console.ReadLine();

if (input == "exit")

break;

input += "|"; // append the '|' command delimiter expected by the server

recv = AdvSndRcvData(server, Encoding.ASCII.GetBytes(input), ipep);

Console.WriteLine();

if (recv > 0)

{

stringData = Encoding.ASCII.GetString(data, 0, recv);

//Console.WriteLine(stringData);

}

else

Console.WriteLine("Did not receive an answer");

}

Console.WriteLine("Stopping client");

server.Close();

}

}
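For bench testing the UDP client above without the full image processing application running, a minimal echo server can stand in for the IRALAR listener. The sketch below is not part of the IRALAR distribution; it simply binds to the same localhost port (1200, per the client code) and echoes every datagram back to its sender.

// Minimal UDP echo server for exercising the client above (test sketch only)
using System;
using System.Net;
using System.Net.Sockets;

public class TestUdpEchoServer
{
    public static void Main()
    {
        // Bind to the port the client sends to
        Socket server = new Socket(AddressFamily.InterNetwork,
            SocketType.Dgram, ProtocolType.Udp);
        server.Bind(new IPEndPoint(IPAddress.Any, 1200));
        Console.WriteLine("Echo server listening on port 1200");

        byte[] data = new byte[2048];
        EndPoint remote = (EndPoint)new IPEndPoint(IPAddress.Any, 0);
        while (true)
        {
            // Receive one datagram and echo it back to the sender
            int recv = server.ReceiveFrom(data, ref remote);
            server.SendTo(data, recv, SocketFlags.None, remote);
        }
    }
}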

MATLAB

Find_coefs.m %In case I fail at saving workspaces


eyex=[366;365;363;362;362;362;361;359;357;363;364;362;362;360;359;358;356;354; ...
      360;361;360;359;357;356;354;352;349;356;355;355;354;353;351;350;348;347; ...
      353;352;350;350;348;347;346;344;342;347;348;346;346;344;342;341;339;336; ...
      341;341;341;341;339;338;336;335;334;340;339;336;336;336;334;333;331;331; ...
      333;333;331;331;331;330;327;326;324];
eyey=[306;313;323;333;344;351;359;364;372;308;316;323;331;340;348;357;365;371; ...
      305;313;321;331;340;347;358;364;374;305;312;322;331;340;351;359;367;372; ...
      308;313;321;330;340;348;357;365;372;308;313;323;330;341;349;356;364;372; ...
      304;310;321;329;338;347;357;364;371;302;310;319;330;338;347;355;362;370; ...
      301;309;318;329;338;344;355;363;368];
pixelx=[80;160;240;320;400;480;560;640;720;80;160;240;320;400;480;560;640;720; ...
        80;160;240;320;400;480;560;640;720;80;160;240;320;400;480;560;640;720; ...
        80;160;240;320;400;480;560;640;720;80;160;240;320;400;480;560;640;720; ...
        80;160;240;320;400;480;560;640;720;80;160;240;320;400;480;560;640;720; ...
        80;160;240;320;400;480;560;640;720];
pixely=[60;60;60;60;60;60;60;60;60;120;120;120;120;120;120;120;120;120; ...
        180;180;180;180;180;180;180;180;180;240;240;240;240;240;240;240;240;240; ...
        300;300;300;300;300;300;300;300;300;360;360;360;360;360;360;360;360;360; ...
        420;420;420;420;420;420;420;420;420;480;480;480;480;480;480;480;480;480; ...
        540;540;540;540;540;540;540;540;540];
eyex2=zeros(81,1);
eyex3=zeros(81,1);
eyex4=zeros(81,1);
eyex5=zeros(81,1);
eyey2=zeros(81,1);
eyey3=zeros(81,1);
eyey4=zeros(81,1);
eyey5=zeros(81,1);
eyexeyey=zeros(81,1);
eyex2eyey=zeros(81,1);
eyex3eyey=zeros(81,1);
eyex4eyey=zeros(81,1);
eyexeyey2=zeros(81,1);
eyexeyey3=zeros(81,1);
eyexeyey4=zeros(81,1);
eyex2eyey3=zeros(81,1);
eyex3eyey2=zeros(81,1);
%Going to try up to 5th order, just for kicks
for(i=1:81)
    eyex2(i)=(eyex(i))^2;
    eyex3(i)=(eyex(i))^3;
    eyex4(i)=(eyex(i))^4;
    eyex5(i)=(eyex(i))^5;
    eyey2(i)=(eyey(i))^2;
    eyey3(i)=(eyey(i))^3;
    eyey4(i)=(eyey(i))^4;
    eyey5(i)=(eyey(i))^5;
    eyexeyey(i)=eyex(i)*eyey(i);
    eyex2eyey(i)=eyex(i)^2*eyey(i);
    eyex3eyey(i)=eyex(i)^3*eyey(i);
    eyex4eyey(i)=eyex(i)^4*eyey(i);
    eyexeyey2(i)=eyex(i)*eyey(i)^2;
    eyexeyey3(i)=eyex(i)*eyey(i)^3;
    eyexeyey4(i)=eyex(i)*eyey(i)^4;
    eyex2eyey3(i)=eyex(i)^2*eyey(i)^3;
    eyex3eyey2(i)=eyex(i)^3*eyey(i)^2;
    %Yes the variable names are long and suck... but they're logical at least
end
%Testing various calibrations
%Standard first powers only
X1=[ones(81,1) eyex eyey]; %Defining X-matrix
BX1=(X1'*X1)^(-1)*X1'*pixelx %This is the magical formula
BY1=(X1'*X1)^(-1)*X1'*pixely %Magical formula, part 2
X_coefs1=BX1';
Y_coefs1=BY1';
%Up to second powers of eyex and eyey
X2=[ones(81,1) eyex eyey eyex2 eyey2]; %Defining X-matrix
BX2=(X2'*X2)^(-1)*X2'*pixelx %This is the magical formula
BY2=(X2'*X2)^(-1)*X2'*pixely %Magical formula, part 2
X_coefs2=BX2';
Y_coefs2=BY2';
%Up to third powers of eyex and eyey
X3=[ones(81,1) eyex eyey eyex2 eyey2 eyex3 eyey3]; %Defining X-matrix
BX3=(X3'*X3)^(-1)*X3'*pixelx %This is the magical formula
BY3=(X3'*X3)^(-1)*X3'*pixely %Magical formula, part 2
X_coefs3=BX3';
Y_coefs3=BY3';
%Kickin it up to 4th power...
X4=[ones(81,1) eyex eyey eyex2 eyey2 eyex3 eyey3 eyex4 eyey4]; %Defining X-matrix
BX4=(X4'*X4)^(-1)*X4'*pixelx %This is the magical formula


BY4=(X4'*X4)^(-1)*X4'*pixely %Magical formula, part 2
X_coefs4=BX4';
Y_coefs4=BY4';
%Fifth power!!!
X5=[ones(81,1) eyex eyey eyex2 eyey2 eyex3 eyey3 eyex4 eyey4 eyex5 eyey5];
BX5=(X5'*X5)^(-1)*X5'*pixelx %This is the magical formula
BY5=(X5'*X5)^(-1)*X5'*pixely %Magical formula, part 2
X_coefs5=BX5';
Y_coefs5=BY5';
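The "magical formula" above is the ordinary least-squares normal equation, B = (X'*X)^(-1)*X'*y, which yields the coefficient vector minimizing the sum of squared residuals ||X*B - y||^2. As a side note (not part of the original script), MATLAB's backslash operator solves the same least-squares problem without forming the matrix inverse explicitly, which is numerically better conditioned:

%Equivalent least-squares solve (sketch) -- same result as the
%normal-equation form above, without an explicit matrix inverse
BX1 = X1 \ pixelx;
BY1 = X1 \ pixely;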

Error_Graphs.m %--------------------------------------------%

% CALCULATE TRANSFORMATION %

%--------------------------------------------%

ERRORsum = 0;

for x=1:9

for y=1:9

%first degree fit

% X_gen(x,y) = X_coefs1(1) + X_points(x,y) * X_coefs1(2) + Y_points(x,y) * X_coefs1(3);

% Y_gen(x,y) = Y_coefs1(1) + X_points(x,y) * Y_coefs1(2) + Y_points(x,y) * Y_coefs1(3);

%second degree fit

% X_gen(x,y) = X_coefs2(1) + X_points(x,y) * X_coefs2(2) + Y_points(x,y) * X_coefs2(3)
%            + X_points(x,y)^2 * X_coefs2(4) + Y_points(x,y)^2 * X_coefs2(5);
% Y_gen(x,y) = Y_coefs2(1) + X_points(x,y) * Y_coefs2(2) + Y_points(x,y) * Y_coefs2(3)
%            + X_points(x,y)^2 * Y_coefs2(4) + Y_points(x,y)^2 * Y_coefs2(5);
%third degree fit
% X_gen(x,y) = X_coefs3(1) + X_points(x,y) * X_coefs3(2) + Y_points(x,y) * X_coefs3(3)
%            + X_points(x,y)^2 * X_coefs3(4) + Y_points(x,y)^2 * X_coefs3(5)
%            + X_points(x,y)^3 * X_coefs3(6) + Y_points(x,y)^3 * X_coefs3(7);
% Y_gen(x,y) = Y_coefs3(1) + X_points(x,y) * Y_coefs3(2) + Y_points(x,y) * Y_coefs3(3)
%            + X_points(x,y)^2 * Y_coefs3(4) + Y_points(x,y)^2 * Y_coefs3(5)
%            + X_points(x,y)^3 * Y_coefs3(6) + Y_points(x,y)^3 * Y_coefs3(7);
%fourth degree fit
% X_gen(x,y) = X_coefs4(1) + X_points(x,y) * X_coefs4(2) + Y_points(x,y) * X_coefs4(3)
%            + X_points(x,y)^2 * X_coefs4(4) + Y_points(x,y)^2 * X_coefs4(5)
%            + X_points(x,y)^3 * X_coefs4(6) + Y_points(x,y)^3 * X_coefs4(7)
%            + X_points(x,y)^4 * X_coefs4(8) + Y_points(x,y)^4 * X_coefs4(9);
% Y_gen(x,y) = Y_coefs4(1) + X_points(x,y) * Y_coefs4(2) + Y_points(x,y) * Y_coefs4(3)
%            + X_points(x,y)^2 * Y_coefs4(4) + Y_points(x,y)^2 * Y_coefs4(5)
%            + X_points(x,y)^3 * Y_coefs4(6) + Y_points(x,y)^3 * Y_coefs4(7)
%            + X_points(x,y)^4 * Y_coefs4(8) + Y_points(x,y)^4 * Y_coefs4(9);
%fifth degree fit
X_gen(x,y) = X_coefs5(1) + X_points(x,y) * X_coefs5(2) + Y_points(x,y) * X_coefs5(3) ...
           + X_points(x,y)^2 * X_coefs5(4) + Y_points(x,y)^2 * X_coefs5(5) ...
           + X_points(x,y)^3 * X_coefs5(6) + Y_points(x,y)^3 * X_coefs5(7) ...
           + X_points(x,y)^4 * X_coefs5(8) + Y_points(x,y)^4 * X_coefs5(9) ...
           + X_points(x,y)^5 * X_coefs5(10) + Y_points(x,y)^5 * X_coefs5(11);
Y_gen(x,y) = Y_coefs5(1) + X_points(x,y) * Y_coefs5(2) + Y_points(x,y) * Y_coefs5(3) ...
           + X_points(x,y)^2 * Y_coefs5(4) + Y_points(x,y)^2 * Y_coefs5(5) ...
           + X_points(x,y)^3 * Y_coefs5(6) + Y_points(x,y)^3 * Y_coefs5(7) ...
           + X_points(x,y)^4 * Y_coefs5(8) + Y_points(x,y)^4 * Y_coefs5(9) ...
           + X_points(x,y)^5 * Y_coefs5(10) + Y_points(x,y)^5 * Y_coefs5(11);

X_exp(x,y) = x*80;

Y_exp(x,y) = y*60; %calibration rows are spaced 60 pixels apart

X_err(x,y) = x*80 - X_gen(x,y);

Y_err(x,y) = y*60 - Y_gen(x,y);

X_errP(x,y) = (X_err(x,y)^2)^.5;

Y_errP(x,y) = (Y_err(x,y)^2)^.5;

ERRORa(x,y) = ( X_err(x,y)^2 + Y_err(x,y)^2 )^.5;

ERRORsum = ERRORsum + ERRORa(x,y);

end

end

ERRORavg = ERRORsum / 81


close all

%% 3-D Surface of X Data

figure;

surf(Y_labels,-X_labels,X_points,'FaceColor','interp','EdgeColor','none','FaceLighting','phong')

axis tight

view(10,28)

camlight left

colorbar

title('X Axis Values (DATA)');

set(gcf,'Color','white');

I = getframe(gcf);

imwrite(I.cdata, 'X00_Data_3-D.png');

%% Contour Map of X Data

figure;

contourf(Y_labels,-X_labels,X_points,10)

title('X Axis contour map (DATA)');

set(gcf,'Color','white');

I = getframe(gcf);

imwrite(I.cdata, 'X01_Data_Contour.png');

%% 3-D Surface of Y Data

figure;

surf(Y_labels,-X_labels,Y_points,'FaceColor','interp','EdgeColor','none','FaceLighting','phong')

axis tight

view(10,28)

camlight left

colorbar

title('Y Axis Values (DATA)');

set(gcf,'Color','white');

I = getframe(gcf);

imwrite(I.cdata, 'Y00_Data_3-D.png');

%% Contour Map of Y Data

figure;

contourf(Y_labels,-X_labels,Y_points,10)

title('Y Axis contour map (DATA)');

set(gcf,'Color','white');

I = getframe(gcf);

imwrite(I.cdata, 'Y01_Data_Contour.png');

%% 3-D Surface of X Expected

figure;

surf(Y_labels,-X_labels,X_exp,'FaceColor','interp','EdgeColor','none','FaceLighting','phong')

axis tight

view(10,28)

camlight left

colorbar

title('X Axis Values (Expected)');

set(gcf,'Color','white');

I = getframe(gcf);

imwrite(I.cdata, 'X02_Exp_3-D.png');

%% Contour Map of X Expected


figure;

contourf(Y_labels,-X_labels,X_exp,10)

title('X Axis contour map (Expected)');

set(gcf,'Color','white');

I = getframe(gcf);

imwrite(I.cdata, 'X03_Exp_Contour.png');

%% 3-D Surface of Y Expected

figure;

surf(Y_labels,-X_labels,Y_exp,'FaceColor','interp','EdgeColor','none','FaceLighting','phong')

axis tight

view(10,28)

camlight left

colorbar

title('Y Axis Values (Expected)');

set(gcf,'Color','white');

I = getframe(gcf);

imwrite(I.cdata, 'Y02_Exp_3-D.png');

%% Contour Map of Y Expected

figure;

contourf(Y_labels,-X_labels,Y_exp,10)

title('Y Axis contour map (Expected)');

set(gcf,'Color','white');

I = getframe(gcf);

imwrite(I.cdata, 'Y03_Exp_Contour.png');

%% 3-D Surface of X Coefficients

figure;

surf(Y_labels,-X_labels,X_gen,'FaceColor','interp','EdgeColor','none','FaceLighting','phong')

axis tight

view(10,28)

camlight left

colorbar

title('X Axis Values (Generated)');

set(gcf,'Color','white');

I = getframe(gcf);

imwrite(I.cdata, 'X02_Coefs_3-D.png');

%% Contour Map of X Coefficients

figure;

contourf(Y_labels,-X_labels,X_gen,10)

title('X Axis contour map (Generated)');

set(gcf,'Color','white');

I = getframe(gcf);

imwrite(I.cdata, 'X03_Coefs_Contour.png');

%% 3-D Surface of Y Coefficients

figure;

surf(Y_labels,-X_labels,Y_gen,'FaceColor','interp','EdgeColor','none','FaceLighting','phong')

axis tight

view(10,28)

camlight left

colorbar

title('Y Axis Values (Generated)');

set(gcf,'Color','white');

I = getframe(gcf);

imwrite(I.cdata, 'Y02_Coefs_3-D.png');

%% Contour Map of Y Coefficients

figure;

contourf(Y_labels,-X_labels,Y_gen,10)

title('Y Axis contour map (Generated)');

set(gcf,'Color','white');


I = getframe(gcf);

imwrite(I.cdata, 'Y03_Coefs_Contour.png');

%% 3-D Surface of X Error

figure;

surf(Y_labels,-X_labels,X_errP,'FaceColor','interp','EdgeColor','none','FaceLighting','phong')

axis tight

view(10,28)

camlight left

colorbar

title('X Axis Error');

set(gcf,'Color','white');

I = getframe(gcf);

imwrite(I.cdata, 'X04_Error_3-D.png');

%% Contour map of X Error
figure;
contourf(Y_labels,-X_labels,X_errP,10)

title('X Axis Error Contour Map');

set(gcf,'Color','white');

I = getframe(gcf);

imwrite(I.cdata, 'X05_Error_Contour.png');

%% 3-D Surface of Y Error

figure;

surf(Y_labels,-X_labels,Y_errP,'FaceColor','interp','EdgeColor','none','FaceLighting','phong')

axis tight

view(10,28)

camlight left

colorbar

title('Y Axis Error');

set(gcf,'Color','white');

I = getframe(gcf);

imwrite(I.cdata, 'Y04_Error_3-D.png');

%% Contour map of Y Error

figure;

contourf(Y_labels,-X_labels,Y_errP,10)

title('Y Axis Error Contour Map');

colorbar;

set(gcf,'Color','white');

I = getframe(gcf);

imwrite(I.cdata, 'Y05_Error_Contour.png');

%% 3-D Surface of Overall Average Error

figure;

surf(Y_labels,-X_labels,ERRORa,'FaceColor','interp','EdgeColor','none','FaceLighting','phong')

axis tight

view(10,28)

camlight left

colorbar

title('Overall Average Error');

set(gcf,'Color','white');

I = getframe(gcf);

imwrite(I.cdata, 'Overall_Error_3-D.png');


%% Contour map of Overall Average Error

figure;

contourf(Y_labels,-X_labels,ERRORa,10)

title('Overall Average Error Contour Map');

set(gcf,'Color','white');

colorbar

I = getframe(gcf);

imwrite(I.cdata, 'Overall_Error_Contour.png');

%%

close all
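The plotting section above repeats the same surface/contour export boilerplate for each quantity. As a possible refactor (a sketch, not part of the original script), the pattern can be factored into a single helper function saved as export_maps.m:

%export_maps.m -- sketch: render one matrix Z as a 3-D surface and a
%contour map, saving each view as a PNG with the given name prefix
function export_maps(Ylab, Xlab, Z, name, titletext)
figure;
surf(Ylab,-Xlab,Z,'FaceColor','interp','EdgeColor','none','FaceLighting','phong')
axis tight
view(10,28)
camlight left
colorbar
title([titletext ' (3-D surface)']);
set(gcf,'Color','white');
I = getframe(gcf);
imwrite(I.cdata, [name '_3-D.png']);
figure;
contourf(Ylab,-Xlab,Z,10)
title([titletext ' (contour map)']);
set(gcf,'Color','white');
I = getframe(gcf);
imwrite(I.cdata, [name '_Contour.png']);
end

Each pair of cells above then reduces to a single call, e.g. export_maps(Y_labels, X_labels, X_errP, 'X_Error', 'X Axis Error').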

Perl

TCA.pl (Timing Collector Analyzer) #!/usr/bin/perl

use Getopt::Std;

my @count_cnt;

my @mode_cnt;

my @count_total;

my @mode_total;

my @count_average;

my @mode_average;

getopts('i:l:h:');

if($opt_i){

open(DAT, $opt_i) || die("Could not open file!");

while(<DAT>){

if ($_ =~ m/(\d+),(\d+),(.*)/){

$count = $1;

$mode = $2;

$time = $3;

#print $count."\t".$mode."\t".$time."\n";

if($opt_l){if ($time <= $opt_l){next;}}

if($opt_h){if ($time >= $opt_h){next;}}

$total_lines++;

$count_cnt[$count]++;

$mode_cnt[$mode]++;

$count_total[$count] += $time;

$mode_total[$mode] += $time;

}else{print $_; next;}

}

close(DAT);

#print "\nPROCESSING COUNT AVERAGES\n";

$count = 0;

while ($count <=99){

# print $count."\t".$count_total[$count]."\t".$count_cnt[$count]."\n";


if ($count_cnt[$count] != 0){$count_average[$count] = $count_total[$count] / $count_cnt[$count];}

$count++;

}

$mode = 0;

#print "\nPROCESSING MODE AVERAGES\n";

while ($mode <=3){

# print $mode."\t".$mode_total[$mode]."\t".$mode_cnt[$mode]."\n";

if ($mode_cnt[$mode] != 0){$mode_average[$mode] = $mode_total[$mode] / $mode_cnt[$mode];}

$mode++;

}

$count = 0;

#print "\nCOUNT AVERAGES\n";

while ($count <=99){

# print $count."\t".$count_average[$count]."\n";

$count++;

}

$mode = 0;

print "\nMODE AVERAGES\n";

while ($mode <=3){

print $mode."\t".$mode_average[$mode]."\n";

$mode++;

}

print "\n----------------------------------------------\n";

$move = ($mode_average[0]+$mode_average[2])/2;

$click = ($mode_average[1]+$mode_average[3])/2;

$known = $mode_average[1]-$mode_average[0];

$unknown = $mode_average[3]-$mode_average[2];

print "\nAverage move time \t".$move." mS";

print "\nAverage click time \t".$click." mS";

print "\nAverage click delay ( known cusror)\t".$known." mS";

print "\nAverage click delay (unknown cusror)\t".$unknown." mS";

}

else{

print "\n\nUSAGE:\n TCA.exe -i <input file> [-l <lower bound>][-h <upper bound>]\n";

}
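A sample invocation is shown below; the log file name is hypothetical. The input is the count,mode,time CSV written by the timing collector, and samples at or below the -l bound or at or above the -h bound are discarded before the per-mode averages and click delays are computed.

perl TCA.pl -i timing_log.csv -l 100 -h 5000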

CAL.pl (Calibration Analyzer) #!/usr/bin/perl

use Getopt::Std;

my @count_cnt;

my @mode_cnt;

my @count_total;

my @mode_total;

my @count_average;

my @mode_average;

getopts('i:o:D');

if ($opt_i && $opt_o){

$file=$opt_i;

$file_out = $opt_o;

open(DAT, $file) || die("Could not open file!");

while(<DAT>){

print $_;


if ($_ =~ m/(\d+),(\d+),(\d+),(\d+)/){

$X_points[($3/80)-1][($4/60)-1] = $1;

$Y_points[($3/80)-1][($4/60)-1] = $2;

}elsif($_ =~ m/X coefs { (.*) , (.*) , (.*) }/g){

#print $_."\n";

$X_coefs[0]= $1;

$X_coefs[1]= $2;

$X_coefs[2]= $3;

}elsif($_ =~ m/Y coefs { (.*) , (.*) , (.*) }/g){

#print $_."\n";

$Y_coefs[0]= $1;

$Y_coefs[1]= $2;

$Y_coefs[2]= $3;

}else{next;}

}

close(DAT);

my $x=0;

my $y=0;

while ( $x <= 8 ){

$y=0;

while ( $y <= 8 ){

{

$X_gen[$x][$y] = $X_coefs[0] + $X_points[$x][$y] * $X_coefs[1] + $Y_points[$x][$y] * $X_coefs[2];

$Y_gen[$x][$y] = $Y_coefs[0] + $X_points[$x][$y] * $Y_coefs[1] + $Y_points[$x][$y] * $Y_coefs[2];

$X_err[$x][$y] = abs(($x+1) * 80 - $X_gen[$x][$y]);

$Y_err[$x][$y] = abs(($y+1) * 60 - $Y_gen[$x][$y]);

$ERR[$x][$y] = ($X_err[$x][$y]**2 + $Y_err[$x][$y]**2)**0.5; # ** is exponentiation; ^ is bitwise XOR in Perl

}

if ($opt_D){

print "\nX=$x\tY=$y\n";

print "X_pnt [ $x ][ $y ] = $X_points[$x][$y]\n";

print "X_gen [ $x ][ $y ] = $X_gen[$x][$y]\n";

print "X_err [ $x ][ $y ] = $X_err[$x][$y]\n";

print "\n";

print "Y_gen [ $x ][ $y ] = $Y_points[$x][$y]\n";

print "Y_gen [ $x ][ $y ] = $Y_gen[$x][$y]\n";

print "Y_gen [ $x ][ $y ] = $Y_err[$x][$y]\n";

print "\n";

print "ERROR [ $x ][ $y ] = $ERR[$x][$y]\n";

}

$y++;

}

$x++;

}

open(OUT, ">$file_out");

print OUT "X_labels\n";

print OUT "80,160,240,320,400,480,560,640,720\n";

print OUT "Y_labels\n";

print OUT "60,120,180,240,300,360,420,480,540\n";

print "\nCALIBRATION RESULTS\n";

print OUT "\nCALIBRATION RESULTS\n";


print "\n\nX_gen";

print OUT "\n\nX_gen";

my $x=0;

my $y=0;

print "\t60\t120\t180\t240\t300\t360\t420\t480\t540\n";

print OUT ",60,120,180,240,300,360,420,480,540\n";

while ($x <=8 ){

$y=0;

$val = $x*80;

print $val."\t";

print OUT $val.",";

while ($y <= 8){

$number = sprintf("%5.2f", $X_gen[$x][$y]);

print $number."\t";

print OUT $number.",";

$y++;

}

print "\n";

print OUT "\n";

$x++;

}

print "\n\nX_err";

print OUT "\n\nX_err";

my $x=0;

my $y=0;

print "\t60\t120\t180\t240\t300\t360\t420\t480\t540\n";

print OUT ",60,120,180,240,300,360,420,480,540\n";

while ($x <=8 ){

$y=0;

$val = $x*80;

print $val."\t";

print OUT $val.",";

while ($y <= 8){

$number = sprintf("%5.2f", $X_err[$x][$y]);

print $number."\t";

print OUT $number.",";

$error_total_x += $X_err[$x][$y];

$y++;

}

print "\n";

print OUT "\n";

$x++;

}

print "\n\nY_gen";

print OUT "\n\nY_gen";

my $x=0;

my $y=0;

print "\t60\t120\t180\t240\t300\t360\t420\t480\t540\n";

print OUT ",60,120,180,240,300,360,420,480,540\n";

while ($x <=8 ){

$y=0;

$val = $x*80;

print $val."\t";

print OUT $val.",";

while ($y <= 8){

$number = sprintf("%5.2f", $Y_gen[$x][$y]);

print $number."\t";

print OUT $number.",";

$y++;

}

print "\n";

print OUT "\n";

$x++;

}

print "\n\nY_err";

print OUT "\n\nY_err";


my $x=0;

my $y=0;

print "\t60\t120\t180\t240\t300\t360\t420\t480\t540\n";

print OUT ",60,120,180,240,300,360,420,480,540\n";

while ($x <=8 ){

$y=0;

$val = $x*80;

print $val."\t";

print OUT $val.",";

while ($y <= 8){

$number = sprintf("%5.2f", $Y_err[$x][$y]);

print $number."\t";

print OUT $number.",";

$error_total_y += $Y_err[$x][$y];

$y++;

}

print "\n";

print OUT "\n";

$x++;

}

print "\n\nERR(MX)";

print "\t60\t120\t180\t240\t300\t360\t420\t480\t540\n";

print OUT "\n\nERR(MX)";

print OUT ",60,120,180,240,300,360,420,480,540\n";

my $x=0;

my $y=0;

while ($x <=8 ){

$y=0;

$val = $x*80;

print $val."\t";

print OUT $val.",";

while ($y <= 8){

$number = sprintf("%5.2f", $ERR[$x][$y]);

print $number."\t";

print OUT $number.",";

$error_total += $ERR[$x][$y];

$y++;

}

print "\n";

print OUT "\n";

$x++;

}

$err_avg_tot = $error_total/81;

$err_avg_x = $error_total_x/81;

$err_avg_y = $error_total_y/81;

print "\n\nAVERAGE X ERROR\n$err_avg_x Pixels\n";

print "\n\nAVERAGE Y ERROR\n$err_avg_y Pixels\n";

print "\n\nAVERAGE OVERALL ERROR\n$err_avg_tot Pixels\n";

}

else{

print "\n\nUSAGE:\n CAL.exe -i <input file> -o <output_file> [-D]\n";

}
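CAL.pl expects one eyeX,eyeY,pixelX,pixelY record per calibration point (81 in total, at 80-pixel column and 60-pixel row spacing), followed by the first-order coefficient lines written by the calibration routine. A minimal sketch of the expected input format, with hypothetical values:

366,306,80,60
365,313,80,120
X coefs { 5550.0 , -15.2 , 0.3 }
Y coefs { 850.1 , -0.2 , 3.9 }

A sample invocation (file names again hypothetical):

perl CAL.pl -i calibration_log.txt -o calibration_results.csv -D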


Appendix B: HSI Conference Submission

EE 452 – Senior Capstone Project

Appendix B: HSI Conference Submission Gaze Tracking System

Breanna Heidenburg

Michael Lenisa

Daniel Wentzel

5/13/2008






Appendix C: RESNA Conference Submission

EE 452 – Senior Capstone Project

Appendix C: RESNA Conference Submission Gaze Tracking System

Breanna Heidenburg

Michael Lenisa

Daniel Wentzel

5/13/2008


Appendix D: Software Usage Manual

EE 452 – Senior Capstone Project

Appendix D: Software Usage Manual Gaze Tracking System

Breanna Heidenburg

Michael Lenisa

Daniel Wentzel

5/13/2008


Installing IRALAR

The IRALAR application is packaged as a Windows Installer package (.MSI); running this package installs the application directly. The latest Logitech camera drivers should also be installed. At the time of writing, the most up-to-date version is ‘qc1150’. Once installed, IRALAR can be found in the Start menu under ‘IRALAR’.

Setting up IRALAR

IRALAR requires a small amount of physical setup before the application will run. This includes:

Plug in the 3 USB cables

Plug in the VGA cable

After these devices are plugged in, there are two items to set up.

1. VGA device

Set the extra monitor (the HMD) to clone the primary display at a resolution of 800x600. This will most likely also reduce the resolution of the primary display.

2. Camera setup

a. After putting on the HMD, physically position the camera so that the lens is facing the user’s eye; the rest of the setup is performed in software.

b. Next, adjust the camera settings so that the image matches what the image processing application expects. The auto-light feature in the Logitech camera drivers automates most of this setup; manually increasing the exposure setting in the camera setup interface also tends to produce a usable image.

Running IRALAR

After the application is installed and set up, run it through the Start menu or the desktop icon. The first page displayed is the ‘Center the Display’ page, on which the user should be able to see all four red corners of the screen. After the display is centered, press the spacebar; this initializes the image processing application and displays a splash screen. The user then enters the calibration screen and looks at each of the 81 calibration dots in turn. Once calibration is complete, the cursor runs freely under control of the user’s eye. Generally, the best screen to take the user to after calibration is the testing interface.


Appendix E: Software Design Manual

EE 452 – Senior Capstone Project

Appendix E: Software Design Manual Gaze Tracking System

Breanna Heidenburg

Michael Lenisa

Daniel Wentzel

5/13/2008


Visual Studio Environment

Programmatically speaking, this program requires two main components:

OpenCV

Windows Presentation Foundation (WPF, included in .NET 3.5)

This presents some problems when setting up a working environment for maintaining or extending the code. The biggest problem is that WPF editing support is very limited in VS2005, while OpenCV does not integrate properly for application distribution in VS2008. Resolving this incompatibility would be worthwhile future work. Because of it, some less-than-obvious methods of application setup and deployment have been used, described below.

OpenCV Setup

In order to compile the image processing application with any compiler, there are a

number of header files and libraries that need to be set up in the compiler environment. The

directions for this are taken directly from the Setup section of the OpenCV website.

The first step of getting OpenCV running within your environment is installing the OpenCV package. Please see: http://opencvlibrary.sourceforge.net

The second step is setting up your compiler environment. These directions are from: http://opencvlibrary.sourceforge.net/VisualC%2B%2B . They are written specifically for VS2003, but the same instructions also worked for us in both VS2005 and VS2008.
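To verify that the environment is set up correctly before building the full application, a short smoke-test program can be compiled first. The sketch below uses the OpenCV 1.x C API current at the time of writing; the #pragma comment lines assume the default OpenCV 1.x library names and would need adjusting for other versions.

// OpenCV environment smoke test (sketch). Grabs one frame from the first
// attached camera and displays it; if this builds and runs, the include
// and library paths are configured correctly.
#include <cv.h>
#include <highgui.h>
#pragma comment(lib, "cv.lib")      // assumed OpenCV 1.x library names
#pragma comment(lib, "cxcore.lib")
#pragma comment(lib, "highgui.lib")

int main()
{
    CvCapture* capture = cvCaptureFromCAM(0); // first attached camera
    if (!capture)
        return 1;
    IplImage* frame = cvQueryFrame(capture);  // grab a single frame
    cvNamedWindow("smoke test", CV_WINDOW_AUTOSIZE);
    cvShowImage("smoke test", frame);
    cvWaitKey(0);                             // wait for a key press
    cvReleaseCapture(&capture);               // also releases the frame
    return 0;
}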

VS2008 Project

The VS2008 project contains all of the code used in the interface. VS2008 provides an easy way to visualize XAML code (VS2005 does not), making it the better choice for editing interface code. All of the interface applications can be compiled in VS2008; after compilation they take the form of standalone executables, which are then used within the VS2005 project.

VS2005 Project

The VS2005 project contains all of the code for the image processing application.

Additionally, it contains the installer which allows for packaging of the application set. The

installer packages the compiled image processing application, as well as the pre-compiled

interface applications (compiled in VS2008).


Appendix F: Table of Figures

EE 452 – Senior Capstone Project

Appendix F: Table of Figures Gaze Tracking System

Breanna Heidenburg

Michael Lenisa

Daniel Wentzel

5/13/2008


Table of Figures

FIGURE 1 - SYSTEM HARDWARE DIAGRAM 12

FIGURE 2 - FIRST GENERATION HARDWARE 13

FIGURE 3 - FULL SYSTEM ARCHITECTURE 16

FIGURE 4 - IMAGE PROCESSING STEPS 17

FIGURE 5 - MULTIPLE VARIABLE LINEAR REGRESSION 18

FIGURE 6 - DETERMINING PIXEL COORDINATES FROM EYE COORDINATES 19

FIGURE 7 - ERROR PLOTS FOR VARIOUS DEGREES OF BEST FIT METHODS 20

FIGURE 8 - AVERAGE PIXEL ERROR FOR VARIOUS BEST FIT DEGREES 21

FIGURE 9 - CURSOR CONTROL CODE 22

FIGURE 10 - CURSOR MOVEMENT TIMES FOR VARIOUS INPUT METHODS 23

FIGURE 11 - USE OF A CRITICAL SECTION IN C 24

FIGURE 12 - SHORTENED LIST OF UDP COMMANDS 25

FIGURE 13 - MARKUP BASED ANIMATION 26

FIGURE 14 - PROCEDURAL ANIMATION 27

FIGURE 15 - WPF NAVIGATION 27

FIGURE 16 - INTERFACE STRUCTURE 28

FIGURE 17 - MAIN PAGE 29

FIGURE 18 - SYSTEM TEST INTERFACE 30

FIGURE 19 - TIMING COLLECTOR INTERFACE 31

FIGURE 20 - CENTERING INTERFACE 32

FIGURE 21 - GLOBAL KEY HOOK IN C# 32

FIGURE 22 - REACTION TIME TESTING SCREENSHOT 34

FIGURE 23 - UDP CLIENT SCREENSHOT 35

FIGURE 24 - CONTROL PANEL SCREENSHOT 36