

39th Southeastern Symposium on System Theory, MA1.2
Mercer University, Macon, GA 31207, March 4-6, 2007

On the Development of an Enhanced Optical Mouse Sensor for Odometry and Mobile Robotics Education

Joe Bradshaw, Christopher Lollini, Bradley E. Bishop*

Weapons and Systems Engineering
United States Naval Academy
105 Maryland Ave (Stop 14A)
Annapolis, MD 21402
bishop@usna.edu

Abstract: This paper focuses on the development of a sensing module, testing methodology and set of experiments for the use of an optical mouse chip as an odometer in mobile robotics, including in an educational setting at the undergraduate level. The test bed includes an integrated microprocessor, mounting board and lens system designed to overcome the basic height-to-surface and speed limitations on the optical mouse chip with the manufacturer-supplied lens. The finalized system can be used in a mobile robotics course segment on sensing and state estimation.

Index Terms - Mobile robots, robotics education, undergraduate research.

I. INTRODUCTION

Odometry based on shaft encoders is a long-standing tradition in mobile robotics, enabling low-cost dead reckoning for systems that lack inertial navigation or GPS sensors [1,2]. Unfortunately, this form of sensing can experience significant difficulties on uneven or irregular terrain, where wheels may slip or the surface may crumble or deform under the vehicle's pressure. In these circumstances, dead reckoning with shaft encoders is virtually impossible.

Recent research has investigated the use of optical mouse chips for odometry in mobile robots [3]. The optical mouse chip has the benefits of being relatively inexpensive, requiring low power, and providing actual over-ground distance measurements directly to a microprocessor.

Unfortunately, one of the major drawbacks for odometry based on optical mouse technology is that the device is designed to be placed very near the surface, making its use on uneven ground quite challenging. Additionally, the devices tend to have a rather low maximum speed. The specifications for the ADNS2051 optical mouse chip used in this work indicate a maximum speed of 14 in/sec using the suggested lens and illumination hardware [4].

In this work, we outline the results of an undergraduate research project focusing on the development of a sensing platform for mobile robot odometry using optical mouse technology. The system includes a microprocessor interface, a user display and (most importantly) a lens system that can be used to overcome both of the primary difficulties with optical mice in mobile robotics. The final product will be used in an undergraduate mobile robotics course.

The remainder of the paper is organized as follows: Section II contains details of the complete hardware system; Section III discusses the testing and calibration required to provide a robust platform for pedagogical use; Section IV discusses the pedagogical implications and uses of the system; and Section V includes conclusions and further work. The system developed in this paper is the result of an undergraduate research project.

II. SYSTEM DESCRIPTION

Our research utilizes the Agilent Technologies ADNS-2051 High-Performance Optical Mouse Sensor to improve odometry and dead-reckoning capability in mobile robotics, and to enhance laboratory experiments involving position and displacement measurement objectives.

The ADNS2051 uses successive correlations on a 16x16, 6-bit grayscale image to determine motion of the device. The nominal system frame rate is 1500 frames/sec. The recommended lens system (the Agilent HDNS-2100) requires a distance to the surface of 2.3-2.5mm, and has a magnification of 1.00, a depth of field of 0.5mm and a field coverage radius of 1.0mm [8]. Given the stated maximum speed for the chip of 14 in/sec, this corresponds to an over-ground speed of 0.2371 mm/frame. By adding a new lens system, we improved both the ground clearance and over-ground speed simultaneously.

The ADNS2051 chip was mounted on a custom printed circuit board (PCB) with a hole for the sensor and various breakout pins for ease of connection. A small CCD camera lens was added to allow the board to be placed at a reasonable distance from the test surface. This lens allows the system to be used on much more uneven terrain than the nominal lens (again, limited to ~2.4mm from the surface). Additionally, the new lens allows the system to overcome the

1-4244-1051-7/07/$25.00 ©2007 IEEE.


basic speed limitation of 14 in/sec, caused by the frame rate of the imaging system. By expanding the physical area projected onto the imaging array, we increase the distance that the chip can travel in one sample and still have sufficient correlation between two successive images for motion estimation. Further, since the system with the new lens uses larger areas of the surface for correlation-based motion estimation, it will perform better on surfaces with widely spaced variation than will the original configuration, as each image will contain more gradient data. On the other hand, this may cause difficulties with surfaces that have only very fine intensity variation and texture, but will not present a significant difficulty on the rough terrain for which the system was designed. The sensor, PCB and lens system are shown in Fig. 1.
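The speed arithmetic quoted above is easy to verify. The following sketch is ours, using only the numbers stated in the text; the scale factor in the last step is an illustration of the scaling argument, not a measured property of the new lens.

```python
# Back-of-the-envelope check of the ADNS2051 speed limit (values from the text).
FRAME_RATE = 1500.0      # frames/sec (nominal)
MAX_SPEED_IN_S = 14.0    # in/sec with the stock HDNS-2100 lens
MM_PER_IN = 25.4

mm_per_frame = MAX_SPEED_IN_S * MM_PER_IN / FRAME_RATE
print(f"{mm_per_frame:.4f} mm/frame")   # -> 0.2371 mm/frame, matching the text

# With a lens that projects a larger surface patch onto the same 16x16 array,
# the tolerable per-frame displacement (and hence the maximum speed) scales
# with the linear size of the imaged patch. E.g., doubling the field of view:
scale = 2.0   # hypothetical enlargement factor
print(f"{MAX_SPEED_IN_S * scale:.0f} in/sec")   # -> 28 in/sec
```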

Fig. 1: ADNS2051 optical mouse chip on custom PCB with lens

Note that no on-board illumination is present in this configuration, as opposed to the LED-illuminated standard mode of operation. This is due to the fact that the sensor is farther from the surface in our configuration, and can be illuminated by a variety of methods (as will be discussed below), which should take into account the nature of the surface.

The optical mouse sensor was interfaced with a Rabbit 2000 microprocessor to create a neutral platform from which the transmitted and received data could be displayed. The specialized serial communications protocol implemented by the optical mouse sensor requires a clock line, a bi-directional data line, and a common ground. General-purpose I/O bits were used by the Rabbit to establish communication with the optical mouse chip. A Power Down line is also accessible on the ADNS2051 circuit board; however, it is not required for communication.

The Rabbit microprocessor used an 8-bit output port to connect an Optrex 2x16 character LCD to display position data received from the optical mouse sensor. Operating the display in 4-bit mode freed up I/O lines on the Rabbit 2000 microprocessor. This allowed one 8-bit port to fulfill both data and control line functions required by the display, thus simplifying connectivity. The binary data read from the ADNS2051 motion registers was converted to ASCII characters via standard C string conversion functions and displayed on the LCD screen. Readings were taken directly from the LCD during testing. An image of the prototype test station used for initial evaluation is shown in Fig. 2.

In addition to the interface to the Rabbit 2000, the potential for the optical mouse sensor's implementation on the Innovation FIRST EDU-RC kit was explored. This is a common processor for robotics education at both high school and collegiate levels, associated with the FIRST Robotics competition and related pedagogy [9]. This processor has several desirable capabilities. The objective of ongoing research with this platform stems from its dual PIC master-slave configuration, which is designed to allow easy development of supervisory control methods involving standard R/C transmitters while still admitting development of sophisticated code. The prevalence of this platform in education circles prompted investigation into its use for this project.

Fig. 2: Prototype evaluation station
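The write-then-read register exchange described above can be illustrated with a small simulation. This is purely our sketch: `FakeBus`, the register address, and the stored value are hypothetical stand-ins, not the actual ADNS2051 register map or the authors' Rabbit code.

```python
# Illustrative model of an ADNS2051-style synchronous serial register read:
# the host clocks out an 8-bit address, then the line direction reverses and
# the host clocks in 8 data bits -- the write-then-read sequence the EDU-RC's
# slow pin-direction toggling made unreliable.

class FakeBus:
    """Simulates the clock/data lines plus a chip with 8-bit registers."""
    def __init__(self, registers):
        self.registers = registers
        self._in_bits = []    # bits the host has clocked out so far
        self._out_bits = []   # bits the chip will clock back

    def clock_out_bit(self, bit):
        # Host drives the data line and pulses the clock.
        self._in_bits.append(bit)
        if len(self._in_bits) == 8:       # full address byte received
            addr = int("".join(map(str, self._in_bits)), 2)
            self._in_bits.clear()
            val = self.registers.get(addr & 0x7F, 0)
            self._out_bits = [(val >> i) & 1 for i in range(7, -1, -1)]

    def clock_in_bit(self):
        # Host releases the data line, pulses the clock, and samples.
        return self._out_bits.pop(0)

def read_register(bus, addr):
    """Write the 8-bit address (MSB first), then read back 8 data bits."""
    for i in range(7, -1, -1):
        bus.clock_out_bit((addr >> i) & 1)
    value = 0
    for _ in range(8):
        value = (value << 1) | bus.clock_in_bit()
    return value

bus = FakeBus({0x02: 0x17})            # hypothetical register address/value
print(hex(read_register(bus, 0x02)))   # -> 0x17
```

On real hardware the two phases happen on one physical pin, which is why the direction switch must be fast and clean; the simulation only captures the bit-level sequencing.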

While the Rabbit-based system is quite capable and can be interfaced to the EDU-RC processor, the Rabbit hardware is neither as common nor as easy to work with for many educational programs (especially at the high school level). As such, the secondary objective of the work conducted with the EDU-RC was to provide a standalone solution that required only the sensor and lens, removing the need for an additional processor. For educational purposes, integration of the sensor and the driving controller will strengthen the overall robustness of the operating vehicle and eliminate complex systems which may result in data loss or unnecessary delays in data processing. Additionally, use of the Innovation FIRST EDU-RC kit will provide a single program the student can code and debug during lab exercises. The overall simplicity of the integration will afford students the chance to spend more time fine-tuning their control programs and less time troubleshooting hardware failures.

Initial tests with the EDU-RC were promising, but difficulties arose with the single-pin interface to the optical mouse chip. While the chip could be configured and given commands easily through the EDU-RC I/O, the processor's I/O pins proved to be too slow in toggling between input and output state, and seemed to suffer from capacitive issues after the switch. To read data from the optical mouse chip, it is necessary to perform a write operation to the chip followed immediately by a read from the chip, across a single line. Due to the issues mentioned, this proved difficult, and readings from the sensor were unreliable. A data-flow control interface is under development to split the


input to the processor to one pin while the output is provided from another. Research in this area will continue in hopes of providing a self-contained platform from which to both communicate with the optical mouse sensor and directly control the vehicle's drive motors with this very common processor. It is worth noting that the Rabbit 2000 is an extremely fast and powerful fully-functional microprocessor, but motor control and user interface require much greater sophistication for this platform than for the EDU-RC.

III. CALIBRATION AND TESTING

As with any sensor platform, calibration and validation of the sensor output is essential in the development process. In this section, we discuss a set of calibration exercises that utilize the coupled optical mouse and lens system described in Section II to determine its efficacy and to establish basic procedures that can be used by students as the system is carried forward for pedagogical uses (see Section IV).

A. Testing Platform

The testing platform underwent several revisions in an attempt to mitigate unnecessary variables in the experiment. The optimal configuration was achieved by keeping the optical mouse sensor and light source stationary during the tests. Therefore, the optical mouse sensor was fixed to an adjustable metal arm, allowing its distance to the surface to be altered when necessary. The light source, a standard desk lamp, was positioned to provide the greatest amount of illumination of the sensor's field of view. With the sensor and the lighting fixed, the only mobile part of the experiment was the surface under the optical mouse sensor. To provide a flat surface with a single axis of linear motion, LEGOs were used to create a sled that moved along a smooth track. This ensured that the sled would not move left or right, up or down. The sled was then wrapped in a sheet of white computer paper with printed intensity variations. The sled could move along the length of the track, achieving the desired travel distance of five inches without coming out of contact with the track itself. The sled can be seen in the system image shown in Fig. 2. The optical mouse sensor's height was then adjusted for testing, as will be discussed below.

B. Testing

The optical mouse was adjusted to four varying heights (5/8", 1/2", 3/8", and 1/4") above the target surface. At each height, the sled covered with varying intensities (see Fig. 2) was moved under the sensor ten times. The sled moved a measured five inches for every data point collected. The digital starting and ending positions were read and transcribed from the LCD before and after every run. In total, the experiment was run forty times. Results are shown in Fig. 3. The graph illustrates, for each of the four different heights, the absolute value of the difference between the mouse's digitally recorded location before and after moving five inches along a single axis. The graph serves as a visual representation of the data, allowing for a more effective demonstration of the ideal focal range of the optical mouse sensor with the current lens configuration.

Table 1 shows the average digital measurement when the mouse traveled five inches for each given height. Additionally, the standard deviation and variance are provided for each group of measurements to give a sense of the reliability of the data and the signal-to-noise ratio outside of the optimal focal range.

Fig. 3: Calibration data. The plot ("Focal Length Optimization") shows the digital measurement for each of the ten test runs at mouse heights of 5/8", 1/2", 3/8", and 1/4".

Table 1: Calibration Data

Metric \ Mouse height    5/8"     1/2"      3/8"      1/4"
Average                  33.60    192.90    778.10    801.40
Std. Dev.                8.06     10.06     14.14     50.05
Variance                 64.93    101.21    199.88    2504.71
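As a quick sanity check on Table 1 (our check, not part of the original experiment), the reported variances should equal the squared standard deviations up to rounding in the published figures:

```python
# Consistency check on Table 1: variance vs. squared standard deviation.
table = {                       # height: (average, std_dev, variance)
    '5/8"': (33.60, 8.06, 64.93),
    '1/2"': (192.90, 10.06, 101.21),
    '3/8"': (778.10, 14.14, 199.88),
    '1/4"': (801.40, 50.05, 2504.71),
}
for height, (avg, sd, var) in table.items():
    assert abs(sd**2 - var) < 1.0      # agrees to within rounding
    print(f'{height}: std^2 = {sd**2:.2f}, reported variance = {var}')
```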

C. Conclusions/Observations

The results in Table 1 and Fig. 3 indicate that a sensor height above 1/4" avoids excessive measurement fluctuation; more than ten runs were required at less than 1/4" to retrieve a digital measurement that did not continue to rise after the sled was at rest. Maintaining a height around 3/8" significantly amplifies the detail of measurements taken by the optical mouse sensor, while a height of 5/8" provides more consistent measurements on the whole, but with a lower signal-to-noise ratio. Since our objective entails the sensor's implementation on a moving vehicle that will encounter varying terrain, a large focal length remains the most desirable parameter to augment. The height of 5/8" should serve as the upper boundary for effective implementation, while a distance between 1/2" and 3/8" remains optimal for such activities with this lens configuration.

A single, fixed lens was used for the optical mouse sensor during this experiment. Therefore, these results specifically dictate the behavior of the sensor with regard to this lens. However, the experiments clearly indicate that this style of lens is acceptable for the optical mouse, and can enhance ground clearance. Additional experiments are required to fully characterize the optimal field of view and focal length for a lens system.
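The signal-to-noise observation above can be made concrete by computing average-to-standard-deviation ratios from Table 1 (our computation, using mean over deviation as a loose SNR proxy):

```python
# SNR proxy (average / std-dev) computed from the published Table 1 numbers.
data = {'5/8"': (33.60, 8.06), '1/2"': (192.90, 10.06),
        '3/8"': (778.10, 14.14), '1/4"': (801.40, 50.05)}
snr = {h: avg / sd for h, (avg, sd) in data.items()}
for h, s in sorted(snr.items(), key=lambda kv: -kv[1]):
    print(f"{h}: SNR ~ {s:.1f}")
# 3/8" comes out highest (~55); 5/8" is lowest (~4.2), consistent with the
# text's note that 5/8" is consistent in absolute terms but has a lower SNR.
```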


While raising the optical mouse chip off of the ground is important for traversing uneven terrain, it also enables the chip to register higher over-ground speed. As mentioned, the optical mouse chip, when in close proximity to the surface (~2.4mm), is limited to speeds of 14 in/sec. While the exact speed that can be registered by the system with the attached lens depends on the characteristics of the lens (specifically the field of view), placing the lensed system above the ground provides a dramatic increase in the speeds that can be sensed. Based on the calibration data from Fig. 3, and estimates of the magnification and focal length of the new lens, we estimate that the new system will be able to achieve over-ground speeds of at least 28 in/sec, with a possibility for even higher speeds. The next phase of this work will involve instrumentation of a system for measurement of high-speed motion.
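One hedged way to see where a figure like 28 in/sec could come from: if the new lens images each surface feature m times smaller than the stock lens (magnification m versus the HDNS-2100's 1.00), the per-frame displacement budget, and hence the maximum trackable speed, grows by 1/m. The magnification value below is a hypothetical illustration, not a measured property of the lens used.

```python
# Speed scaling with magnification (our construction; only the 14 in/sec
# stock limit comes from the datasheet figures quoted in the text).
STOCK_MAX_IN_S = 14.0   # with the HDNS-2100 lens at magnification 1.00

def max_speed(magnification):
    """Estimated max trackable speed, assuming speed scales as 1/m."""
    return STOCK_MAX_IN_S / magnification

print(max_speed(0.5))   # a hypothetical magnification of 0.5 -> 28.0 in/sec
```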

IV. EDUCATIONAL POTENTIAL

This work was inspired to some extent by the difficulties experienced in dead-reckoning experiments using shaft encoders, such as those outlined in [5]. It is our contention that students should be taught not only the basics of dead reckoning, but also be exposed to the difficulties associated with inaccuracies of the system model, problems with wheel slippage and the effects of uneven terrain on dead-reckoning systems. Pedagogically, expanding on traditional dead-reckoning approaches with techniques that rely on environmental feedback provides the students with a much better grasp of the full toolset available for mobile robot design, and offers a sound basis for development of reliable control and actual, fieldable systems. It is not suggested here that dead reckoning based on shaft encoders should not be taught, but rather that the discussion should include the limitations of this form of feedback and a discussion of additional methods. Optical mouse technology is an excellent choice for expanding the standard curriculum in a low-cost, low-complexity manner that still provides ample latitude for embedding additional concepts, such as those discussed in [3]. As such, the work described in Section II and Section III was carried forward to a pedagogical setting for mobile robotics education.

A great deal of recent effort has been focused on improving and updating robotics education (see [6,7] and associated references). We believe that the use of new, even emerging, technologies to improve existing techniques is vital for the education of the next generation of roboticists.

The work discussed in Sections II and III lays the foundation for a solid investigation of mobile robot navigation using optical mouse sensors. The basic setup requires little space and provides good repeatability and easy calibration. Follow-on experiments could involve the use of the sensor over uneven terrain, where the lens will not consistently focus on the surface and lighting may not be optimal. Potential undergraduate design projects based on this work include an auto-focus system using ranging data from a device such as a Sharp GP2D12, as well as development of techniques for appropriate signal analysis that can lead to better performance of the localization over rough terrain.

The laboratory exercise designed for the first trial of optical mouse technology involves the use of the calibration setup discussed in Section III. Students are asked to:

i) Determine the optimal height of sensor placement for a given lens on even ground
ii) Characterize the effect of varying terrain height on overall performance
iii) Estimate the maximum achievable speed for the lensed system at optimal height
iv) Test the device by mounting it on a compliant vehicle (modeled after the rocker-bogie system) moving linearly over uneven terrain
v) Characterize the effects of varying environmental lighting on the system
vi) Consider extensions and improvements for the system

Students in the course should have had previous experience with both traditional dead reckoning and design for uneven terrain (where shaft encoders were found to be of little use under most circumstances due to compliance, uneven terrain, wheel slip, etc.).
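For context, the encoder-based dead reckoning the students will have seen can be sketched in a few lines (a standard textbook-style differential-drive update, not code from the course):

```python
import math

# Minimal differential-drive dead-reckoning update. Wheel slip corrupts
# d_left/d_right, and the resulting error accumulates in (x, y, theta);
# an over-ground sensor such as the mouse chip measures displacement directly.
def odom_step(x, y, theta, d_left, d_right, track_width):
    """Advance the pose by one encoder sample (distances in consistent units)."""
    d = (d_left + d_right) / 2.0                  # midpoint travel
    dtheta = (d_right - d_left) / track_width     # heading change
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    return x, y, theta + dtheta

pose = (0.0, 0.0, 0.0)
for _ in range(10):                               # ten equal straight-line steps
    pose = odom_step(*pose, d_left=1.0, d_right=1.0, track_width=5.0)
print(pose)   # -> (10.0, 0.0, 0.0)
```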

Experience with the optical mouse sensor need not be limited to mechanical data collection. This platform offers a wide range of open-ended design problems and follow-on activities, which can be implemented in hardware or as paper designs and conceptual studies. The primary components for open-ended design are illumination and focus control.

As discussed above, lighting is a key concept for the optical mouse. The standard system includes an LED aimed at the surface. Students could consider lighting options, and should note that accuracy for the optical mouse sensor can be improved by illuminating the surface with a polarized light in the form of a laser or LED. Since the optical mouse sensor will, in most cases, be mounted below a vehicle, where light remains scarce, its field of view must be well lit. Three lasers triangulated at the sensor's point of focus would ensure that, in the case of an obstacle or a change in the level of the vehicle, it could maintain proper illumination of the surface. The lasers should be positioned as close to perpendicular to the surface as possible to maximize the angle of incidence with the surface. An alternate solution to lasers would involve several LEDs. Placing several LEDs around the lens, with their light focused on the surface below, would flood the sensor's field of view, allowing for improved surface quality. LEDs would be more cost- and energy-effective than the lasers; however, illumination intensity would be sacrificed in turn. Finally, it is possible that a simple incandescent light system could supply sufficient illumination if properly configured. Students will be asked to come up with their own solutions and consider all of the issues discussed above.


With the sensor's field of view more intensely illuminated, the addition of an adjustable lens system would further improve the sensor's performance over uneven terrain. A self-contained auto-focusing lens system would be the most effective. A lens system that could continually adjust to changes in terrain would afford a superb platform from which to measure movement, although care would need to be taken to maintain appropriate correspondence between optical mouse output, sensor height and focal length. If the measurements are not compensated in sync with the system focal length, the data used by the drive routine will be skewed and may cause inaccuracies or even instability in the system.

A different approach would consist of an infrared distance sensor and a linear actuator on the sensor system, which would move the entire sensor platform closer to or farther from the surface as the vehicle traveled. This would eliminate varying digital measurements as the focal length grew and shrank due to the vehicle traversing irregular terrain. More than one infrared sensor could be used to eliminate occlusions when navigating rocky terrain. This control system would be self-contained and run independent of the overall sensor and drive device(s). The auto-adjustment system would be a self-contained feedback system that would not hinder the performance of the system or the driving, thereby maintaining the integrity and robustness of the overall system. On the other hand, the sensor could snag on the surface as it changed height to adjust to the ground directly underneath. Finally, it would be necessary to consider the effects of the spurious optical flow (caused by a zoom-in or zoom-out) on the sensor output.

Effective illumination of the sensor's field of view, in conjunction with a self-adjusting lens platform, has the capacity to ensure the sensor's measurements are precise and accurate over variable terrain. Experience with these open-ended issues will enhance students' understanding of odometry, sensing, feedback control and even computer vision (to some extent). As such, this sensor platform provides a concrete laboratory experience, enhances the mobile robotics curriculum, and provides for a wide range of follow-on activities, design projects and lessons.
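The measurement-compensation pitfall described above can be shown with a toy example (all numbers hypothetical): if an auto-focus or auto-height system changes the effective counts-per-inch of the sensor but the conversion factor used by the drive routine is not updated in sync, the reported distance is skewed.

```python
# Toy illustration of the compensation issue: sensor counts only mean
# something relative to the calibration (counts-per-inch) in force when
# they were collected. All numbers here are hypothetical.
def counts_to_inches(counts, counts_per_inch):
    return counts / counts_per_inch

# The same physical 5-inch move, interpreted with two calibrations:
print(counts_to_inches(2000, 400.0))   # correct scale factor -> 5.0 inches
print(counts_to_inches(2000, 160.0))   # stale scale factor  -> 12.5 inches
```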

V. CONCLUSIONS

The implementation of the ADNS-2051 High-Performance Optical Mouse Sensor discussed in this work provides enhanced performance over standard configurations, and can be used as part of an excellent pedagogical experience for students in a mobile robotics course. The system discussed in this work provides a low-cost, high-impact experience for students, and represents a significant step toward using optical mouse technology on real systems in real environments. The experiments have both concrete measurement objectives and open-ended design considerations, and can be used as standalone labs or as part of a larger curriculum component on localization and dead reckoning. The sensor setup can greatly improve the accuracy of measurements for labs emphasizing dead reckoning in environments with uneven surfaces and the potential for wheel slippage, flexibility in the frame, etc.

Future work includes an investigation into the use of focusing systems for the lens, and additional processing that might be used to enhance motion estimation by using changes in the surface quality register and data from multiple chips. Additionally, we are undertaking development of a test bed to fully characterize the actual maximum measured over-ground speed of the system.

REFERENCES

[1] J. L. Jones, B. A. Seiger and A. M. Flynn, Mobile Robots: Inspiration to Implementation (2nd edition), Natick, MA: A K Peters, 1999.
[2] G. Dudek and M. Jenkin, Computational Principles of Mobile Robotics, Cambridge University Press, 2000.
[3] A. Bonarini, M. Matteucci and M. Restelli, "Automatic Error Detection and Reduction for an Odometric Sensor based on Two Optical Mice," Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, April 2005, pp. 1687-1692.
[4] Agilent Technologies, Inc., Agilent ADNS-2051 Optical Mouse Sensor Data Sheet, online, accessed 16 September 2005.
[5] B. E. Bishop and C. E. Wick, "Educational Control Studies of a Differentially Driven Mobile Robot," Proceedings of the 35th Southeastern Symposium on System Theory, March 2003, pp. 308-312.
[6] IEEE Robotics and Automation Magazine, Special Issue on Robotics Education, Vol. 10, Issue 2, June 2003.
[7] IEEE Robotics and Automation Magazine, Special Issue on Robotics Education, Vol. 10, Issue 3, September 2003.
[8] Agilent Technologies, Inc., Agilent HDNS-2100 Solid State Optical Mouse Lens Data Sheet, online, accessed 16 September 2005.
[9] FIRST (For Inspiration and Recognition of Science and Technology), FIRST Robotics Competition Website, online: http://www.usfirst.org/robotics/index.html, accessed 16 September 2005.
