A Vision for Supporting Autonomous Navigation in Urban Environments


COVER FEATURE

Computer, 0018-9162/06/$20.00 © 2006 IEEE. Published by the IEEE Computer Society.

The mobile sensor network for autonomous navigation systems can support autonomous navigation in modern vehicles equipped with actuators for steering, throttle, and brake. The system can support older vehicles as well if they are equipped with basic sensors and an extended smart phone.

Vason P. Srini
Berkeley Wireless Research Center

Future cars and vehicles operating in urban areas, industrial plants, ports, military-engagement areas, and warehouses are expected to have computer controls for throttle, steering, and brakes to support collision avoidance, adaptive cruising, automatic parking, and safe driving. Accomplishing this vision will require automotive industry advances that cut costs and improve reliability by reducing mechanical parts and linkages. Strategic changes in the insurance industry aimed at reducing accidents and cutting expenses by using advanced sensors, passenger-safety measures, and collision-avoidance devices provide additional motivation for similar advances.

These autonomous navigation systems (ANS), such as autonomous ground vehicles (AGVs), unmanned aerial vehicles (UAVs), and unmanned submersible vehicles (USVs), and modern vehicles with actuators, sensors, and computer control perform three basic functions: context gathering using sensors, processing, and action. Most researchers have put all three functions into the ANS or the robot itself to overcome occlusions and handle the environment's dynamics. However, this causes the ANS and robotic systems to be bulky and expensive. It also impedes the introduction of vehicles with ANS in urban environments, where they must coexist with existing cars and highways.

The approach presented here distributes the context-gathering and processing functions using sensor networks and wireless communications technologies to reduce costs and make ANS widespread. The system uses sensors mounted on moving vehicles and stationary objects such as lampposts, traffic lights, toll plazas, and buildings to gather information at different levels.

The system uses local context to make tactical decisions in real time, such as slowing down when the vehicle ahead brakes or when the distance to a stop sign gets shorter. Strategic decisions such as taking an alternative road or passing a vehicle require road and vehicle information pertaining to the neighborhood, a specific part of the road (segment), or a region that stretches ahead 400 meters. Moving vehicles normally do not have access to this local and global information. The sensors and other elements of the proposed mobile sensor network system for ANS (MSNA) will thus play a major role in future applications of the system. The MSNA helps provide greater awareness of the dynamic environment and makes decisions using services and databases that the moving vehicles can execute.

AUTONOMOUS NAVIGATION SYSTEMS

The MSNA project operates on the assumption that the AGVs, UAVs, and USVs have the necessary actuators to implement the desired actions generated using local or global context. This is realistic because modern automobiles such as the Toyota Prius and RAV4 have electric power steering, throttle by wire, and electric brake force distribution. Luxury vehicles such as the 2007 Lexus LS460L have actuators for steering, throttle, and


December 2006 69

braking and can use them to perform automatic parking.

Local, segment, and regional context gathering on roads plays an important role in an ANS. The vehicles can use many sensors, such as ladars made by SICK, video cameras, IR rangers, and GPS for simple tasks such as finding a stationary object, detecting an edge, following a wall or curb, and lane and waypoint following. However, using the vehicles in urban environments requires augmenting them with other advanced sensors such as radars and 3D ladars. These technologies help vehicles deal with factors such as electrical noise, electromagnetic interference (EMI), dust, fog, trees, urban canyons formed by tall buildings, and multiple moving targets.

Researchers have proposed using 3D laser radars to detect moving objects with mechanical scanning systems.1,2 The DARPA 3D-Imaging program supported the development of several approaches for generating 3D images.

The 3D flash ladar imaging camera3,4 uses a single pulse to illuminate the scene and obtain the distance to various objects based on time of flight. The pulse illuminates a large surface area, the returns from which are focused on an array of smart pixels that collect and store digital and analog data representing the flight time and laser pulse characteristics. The camera sends out pulses once every 33 milliseconds. The system outputs the data gathered from the returning pulse. It processes the data in an FPGA chip in real time to produce point-cloud coordinates. The system could process this point cloud to detect and track multiple moving vehicles.

Figures 1a and 1b show two versions of the 3D flash ladar cameras. The small camera, which has a range of a few hundred meters, connects to a laptop to show raw 3D images. The larger 30-cm camera supports an 85-mm lens and has a range of a few thousand meters. The system uses both camera types to obtain local and segment-level contexts as 3D image frames. The small camera could be used in mobile vehicles such as vans, buses, trucks, and tankers. The large camera could be used in heavy-duty vehicles; at stationary places such as bridges, buildings that form urban canyons, and toll plazas; and on trains, helicopters, and airplanes. The 3D image frames from these sensors will make actions more realistic in dynamic environments such as highways in urban areas.

Processing in ANS involves guidance, navigation, and control-processing functions. Depending on the context data type collected, the overall processing requirement could be quite high. For example, the vehicles that participated in the 2005 DARPA Grand Challenge used most of their interior space for computers and related accessories.

The winning team from Stanford University used six Pentium M computers in a shock-mounted rack for processing.5 Other teams used six to eight server blades for processing. The navigation software, which ran on most of the computers, included real-time stereo video processing and extended and unscented Kalman filters. Most of the control processing could be done using microcontrollers and other dedicated hardware tightly integrated with actuators such as linear motors, servos, and stepping motors, so control processing could best be done in the vehicle.

Distributing the guidance and navigation processing functions between the vehicle (AGV or UAV) and the stationary nodes' computers is one option. Some tactical functions would still be performed using computers in the mobile vehicle. Strategic functions such as dynamic route replanning could be performed using computers on stationary nodes. The system could use emerging Web services technology, wireless Web services, and high-speed Wi-Fi, WiMax, and HSDPA/HSUPA6 wireless communication in cell phones to distribute noncritical guidance and navigation processing.

Figure 1. 3D flash ladar cameras. (a) This small camera has a range of a few hundred meters. (b) This larger camera has a range of a few thousand meters.

Several problems arise when gathering context with mobile and distributed sensing: path planning and exploration, search-and-rescue and pursuit-evasion, and sensor placement and assignment. The distributed sensing functionality requires placing sensors so that every point in the environment lies within range of at least one sensor. It also requires assigning targets to sensors so that the system can estimate the vehicle's position and minimize the overall error rate. Mobile sensing requires searching, pursuing a path, localization, and path planning. Localization discovers or estimates the 2D or 3D positions of the mobile vehicles. Solving these problems also requires understanding the environmental, sensing, and combinatorial complexity of the computation.
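The placement requirement, that every point in the environment lie within range of at least one sensor, is an instance of set cover, which a greedy heuristic handles well in practice. The following Python sketch illustrates the idea; the road grid, candidate lamppost sites, and 200-meter sensor range are invented for illustration and are not values from this article.

```python
# Greedy sensor-placement sketch: choose candidate sites until every
# sampled road point lies within range of at least one sensor.
from math import hypot

def covered(point, site, sensor_range):
    """True if the road point is within the sensor's range of a site."""
    return hypot(point[0] - site[0], point[1] - site[1]) <= sensor_range

def place_sensors(road_points, candidate_sites, sensor_range):
    """Repeatedly pick the site covering the most still-uncovered
    road points (classic greedy set cover)."""
    uncovered = set(road_points)
    chosen = []
    while uncovered:
        best = max(candidate_sites,
                   key=lambda s: sum(covered(p, s, sensor_range)
                                     for p in uncovered))
        newly = {p for p in uncovered if covered(p, best, sensor_range)}
        if not newly:          # remaining points unreachable by any site
            break
        chosen.append(best)
        uncovered -= newly
    return chosen, uncovered

# A 1-km road sampled every 100 m; lamppost sites every 250 m,
# offset 5 m from the roadway, each with a 200-m sensor range.
road = [(x, 0.0) for x in range(0, 1001, 100)]
sites = [(x, 5.0) for x in range(0, 1001, 250)]
placed, missed = place_sensors(road, sites, 200.0)
```

In a real deployment the candidate sites would come from the urban topology (lampposts, traffic lights, toll plazas) and the range from the chosen sensor class.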

MOBILE SENSOR NETWORK SYSTEM

The mobile sensor network architecture for supporting autonomous navigation in urban environments comprises two types of stationary nodes; two types of mobile nodes; and a set of commonly used services, communication links, secure and trusted communication protocols, databases, and cellular service providers. Sensors are distributed among the stationary and mobile nodes.

Master nodes and stationary nodes could be attached to reflector posts, traffic lights, lampposts, buildings, and towers along highways and freeways. Mobile nodes correspond to vehicles such as cars, trucks, and tankers. The stationary nodes use broadband wireless technology to communicate with mobile nodes.

The vehicle identification number given to each vehicle at the time of manufacture provides the unique ID for each mobile node. The stationary nodes receive unique ID numbers from the system software. Communication between stationary nodes could be achieved with wired or fixed wireless networks by, for example, using WiMax and WiBro broadband technologies for the fixed wireless connection. Communication between mobile nodes and stationary nodes could use Wi-Fi, WiMax, and HSDPA/HSUPA technologies.6 We assume smart phones will have the ability to add network cards in the same way we now add mini and micro secure digital memory cards to support communication between mobile nodes and stationary nodes. Researchers expect the stationary nodes to have sufficient processing power to support some of the navigation processing functions that mobile nodes request.

Neighboring stationary nodes interconnect to form a subnetwork. For example, consider the freeway interchange in an urban area that Figure 2 shows. The various interchange lanes that cross each other appear as an inset, while the placement of sensor nodes on one lane can be seen at the figure's bottom.

In urban areas, vehicles often clog the freeways and highways. Although alternate routes and entry and exit ramps can reduce vehicle congestion, drivers rarely use them efficiently. Promptly delivered information about congestion, and quick ways of finding alternate routes, are not available to ordinary commuters.

The proposed MSNA could detect congestion quickly and communicate alternate routes to individual vehicles and drivers. The stationary nodes could detect congestion on the roads and communicate their status and alternate routes to the mobile nodes in the vehicles long before they approach the congested point. The mobile nodes could periodically communicate their destination and environment information over the wireless link to the stationary nodes to obtain information about alternate routes.
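One way a stationary node could implement such congestion detection is to compare the mean of the speeds that passing mobile nodes report against a free-flow baseline. The thresholds, segment names, and route table in this Python sketch are assumptions for illustration, not values from the article.

```python
# Sketch of a stationary node flagging congestion from speed reports
# and broadcasting an alternate route from a precomputed table.

def detect_congestion(speed_reports_mph, free_flow_mph=60.0, threshold=0.5):
    """Flag a segment as congested when the mean reported speed drops
    below a fraction of the free-flow speed."""
    if not speed_reports_mph:
        return False
    mean_speed = sum(speed_reports_mph) / len(speed_reports_mph)
    return mean_speed < threshold * free_flow_mph

def advise(segment_id, speed_reports_mph, alternates):
    """Build the advisory message a stationary node would broadcast
    to approaching mobile nodes."""
    if detect_congestion(speed_reports_mph):
        return {"segment": segment_id, "status": "congested",
                "alternate": alternates.get(segment_id)}
    return {"segment": segment_id, "status": "clear", "alternate": None}

# Hypothetical segment and alternate-route table.
routes = {"I-80-W-seg12": "US-50-W"}
msg = advise("I-80-W-seg12", [22, 18, 25, 20], routes)
```

A deployed system would of course derive the alternate route dynamically from regional traffic data rather than a static table.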

Figure 2. Placement of stationary nodes at a freeway interchange in an urban area.

Stop-and-go traffic presents another cause of congestion on urban freeways. The proposed MSNA could help drivers avoid such nuisances by using adaptive cruise control (ACC) to smooth out the stop-and-go cycle and also reduce accidents.
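The stop-and-go smoothing that ACC performs is commonly realized as a constant-time-gap spacing policy: accelerate toward a desired gap while matching the lead vehicle's speed. The control law below is a generic textbook form with illustrative gains; it is not the article's design.

```python
def acc_accel(gap_m, ego_speed, lead_speed, time_gap_s=1.5,
              standstill_m=2.0, k_gap=0.23, k_rel=0.74):
    """Constant-time-gap ACC law (gains are illustrative).

    Returns a commanded acceleration in m/s^2: positive to close an
    overly large gap, negative to fall back or brake."""
    desired_gap = standstill_m + time_gap_s * ego_speed
    gap_error = gap_m - desired_gap          # m
    speed_error = lead_speed - ego_speed     # m/s
    return k_gap * gap_error + k_rel * speed_error
```

At equilibrium (gap equal to the desired gap, speeds matched) the commanded acceleration is zero, which is what damps the stop-and-go wave instead of amplifying it.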

Figure 3 shows the MSNA's communication links, including a collection of stationary and mobile nodes along with master nodes and service providers. The stationary nodes perform routine communication with the mobile nodes in the vehicle's 500-meter environment to harvest vital data that the system can use in strategic path replanning and in directing vehicle actions.

The stationary nodes also use sensors to monitor the mobile nodes' progress. In locations such as bridges and toll plazas, stationary nodes could use advanced sensors to track mobile nodes. The master node communicates with neighboring stationary nodes to form a subnetwork and also with smart mobile nodes to devise global strategies and actions such as selecting alternate routes. The system communicates these actions to stationary nodes, which eventually route the actions to the mobile nodes. The master node also determines the level of congestion and formulates global views of traffic situations by receiving traffic flow and other details from stationary nodes.

The MSNA uses two types of mobile nodes, ordinary and smart, so that it can support many vehicle types. The smart mobile nodes correspond to trucks, tankers, and luxury cars equipped with advanced sensors such as the 3D flash ladar shown in Figure 1. These nodes gather information about the road's dynamics, perform signal processing on raw data, and communicate information to the master nodes. These nodes can monitor distances to other vehicles in the selected vehicle's immediate environment and perform actions such as stop-and-go ACC. They help the vehicle maintain a safe distance from other traffic, follow a lane in a multilane highway, and pass other vehicles.

The ordinary mobile nodes correspond to vehicles with IR sensors, a few attached video cameras, and smart phones such as MC4, which contain GPS, 3D accelerometers, cameras, Wi-Fi, WiMax, and trusted-communication cards. Although the ordinary mobile nodes might not have the richness of smart mobile nodes in their sensing, processing, and actuation capabilities, they can harvest and communicate the data that master nodes need for strategic planning. The mobile nodes also benefit from the master nodes' strategic decisions such as alternate routing and emergency planning.

Another use for MSNA is to operate a group of autonomous multimodal vehicles (AMVs) dispersed in an urban environment, an industrial complex, or a military program such as a future combat system. Conducting search-and-rescue or search-and-destroy operations, intelligence gathering, surveillance, preplanned actions, security, and monitoring tasks is challenging in modern urban environments. Accomplishing these tasks is even more difficult in an unknown adversarial urban environment. DARPA programs such as the organic air vehicle (OAV) attempt to help soldiers detect enemy forces hidden in forests, mountains, and urban areas (www.defense-update.com/products/g/goldeneye50.htm).

Some efforts use peer-to-peer and ad hoc network strategies to actively pursue wireless communication between miniature aerial vehicles (MAVs)7 and field personnel. Security issues, malicious network attacks, buildings, narrow streets, trees and vegetation, barricades, the steel and other metals used in buildings and factories, and other obstructions all complicate wireless communication amid urban clutter. Military personnel need trusted wireless networks, trusted AMVs, smart vehicles, and shooter or sniper localization approaches to conduct surveillance in urban areas and decide which actions to take.

Figure 3. Mobile sensor network for supporting autonomous navigation systems (ANS). The system consists of master nodes, smart mobile nodes (SMN), stationary nodes (SN), and mobile nodes (MN).

An AMV comprises an autonomous ground vehicle (AGV) and three classes of minirobots: rotorcraft, crawlers, and submersibles. Rotorcraft robots such as coaxial helicopters can go around or over buildings. Crawler robots can traverse alleys and crawl through basements. Submersible robots can delve into gutters on streets, crawl through storm drains, and go under bridges over rivers and canals. An AMV's units work cooperatively to achieve their goals.

A group of AMVs could engage in intelligence gathering about a town or city by cooperating with distributed and emplaced sensors and exchanging data and results with other systems. The nodes in Figure 3's MSNA could be mapped as follows: The AGV in each AMV could act as a smart mobile node. The AMV's other components could form the mobile nodes. The sensors already installed in the urban environment could act as stationary nodes. Mission-oriented vehicles parked in strategic locations throughout the urban environment could act as master nodes.

It might be necessary to conduct surveillance, reconnaissance, and situation assessment stealthily. This means the AMVs must have a low radar signature, generate little noise, and avoid predictable patterns. Other factors include cost, weight, and reliability. The proposed MSNA could accommodate these factors by using appropriate sensors in the nodes. The current programs for developing and deploying MAVs, OAVs, AGVs, and unmanned combat air vehicles could benefit from the proposed MSNA and the availability of advanced sensors, smart phones with ultra-low-power and high-performance parallel stream/vector processor chips, and wireless network cards that can support assured communication.

Developers face numerous challenges when designing an urban-area MSNA. They must determine the number of stationary and master nodes, their placement, and the distance between stationary nodes. This computation depends on the wired and wireless communication technology employed, the urban area's topology, highway and freeway routes and structures, and highway traffic patterns. Other design challenges include assignment of mobile nodes to stationary nodes, mobile node handoffs, and monitoring mobile nodes with inaccessible identification. Efficient solutions to these problems will emerge from simulation studies, experiments conducted in model urban areas, and graph-theoretic formulations. Further, developers must devise fault-tolerance strategies that maintain smooth traffic flow when one or more stationary nodes fail.

ADVANCED MSNA SENSORS

Sensors play an important role in collecting context information. Figure 4 shows the environmental sensors a vehicle needs to support gathering local context via guidance and navigation processing. Urban streets have multiple vehicles traveling in a dynamic environment in which speeds can exceed 60 mph.

MSNA provides several important functions, such as detecting moving vehicles, estimating their velocities, and tracking them. Four classes of advanced sensors, currently in various stages of development, can be used to gather context and determine environmental dynamics.

The first class uses medium-range (24 GHz) and long-range (77 GHz) radars with microwave and millimeter-wave devices. The second class, scanning ladars, uses eye-safe lasers, as does the third class, flash ladars. The fourth class uses stereo cameras integrated with SICK ladars.

We anticipate that the MSNA will include sensors from all four classes and that stationary nodes will use them, while mobile nodes could use the fourth class of sensors. Smart mobile nodes could use any of the first three sensor classes or a combination of them. For example, tankers and trucks carrying hazardous materials could use radars and a 3D flash ladar to improve safety and avoid collisions.

Many applications use radar to determine range accurately and discriminate targets. Because microwave and millimeter-wave radar can sense through fog, dust, smoke, and smog, they are good contenders for use in highway vehicles that support ANS as well as in trains, airplanes, ships, highway toll booths, and in intruder-detection systems in shipyards, storage depots, and other sensitive facilities.8

Figure 4. Environment sensors to support ANS in vehicles. The suite ranges from 1-m-range IR sensors and 3- to 5-m-range radar, video, and IR sensors, through 20-m-range radar, video, or ladar, to a 50- to 80-m-range radar or 3D flash ladar at the front.

ACFR in Sydney has designed a 77-GHz millimeter-wave radar, together with signal processing and control systems, for use in autonomous vehicle navigation (www.acfr.usyd.edu.au/technology/mmw-radar/Xabstract.html).9 CMU's autonomous vehicle, Sandstorm, used this radar to complete the 132-mile race course in DARPA's 2005 Grand Challenge.10 Microwave radars developed by Bosch, MACOM, and Eaton Vorad operate in the 24-GHz band to detect obstacles. The BMW 7 series, Mercedes-Benz, and Jaguar automobiles use these radars to support ACC and collision-avoidance functions. Both the mobile and stationary nodes in the MSNA can use radar to detect and measure the speed of approaching vehicles.

Scanning ladars might find more use in MSNA's mobile nodes. Early scanning ladars11 used 4 × 4 or 32 × 32 avalanche photodiodes as focal plane arrays, CMOS timing circuits, and a scanning mirror to illuminate the scene with laser light. The reflected laser energy helps determine the distance of objects in the scene accurately and construct a 3D image.

The Velodyne HDL-64E 600-rpm rotating laser scanner (www.velodyne.com/lidar/index.html) is one of the latest scanning ladars; a vehicle could use it to construct a 360-degree model of its surrounding environment from the point-cloud data. It uses eight groups of eight fixed lasers to cover a 26.5-degree vertical field of view and two 32-pixel avalanche photodiode arrays for receiving reflected laser light. With a range of 50 to 100 meters, this sensor, if integrated with video data, could let the vehicle detect and track multiple moving targets.

The BAE scanning ladar has a longer range and better range resolution than the HDL-64E, but the horizontal and vertical fields of view span only 45 degrees. Developers could use the BAE scanning ladar in AGVs, helicopters, and airplanes. However, pixel registration processing poses one limitation for scanning ladars because of variations in scanning speed.

The 3D flash ladar (3DFL) uses a single pulse of 5-ns duration and energy of 15 to 20 mJ to illuminate the scene. The pulse returns from the surface target are focused on an array of smart pixels, specifically an avalanche photodiode array with a CMOS readout integrated circuit, that collects and stores digital and analog data representing the flight time and laser pulse characteristics. The system sends the 5-ns pulses once every 33 milliseconds. All data gathered from the returning pulse is output between laser pulses. The system obtains the data cube by taking 20 samples, each 1 ns apart.
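The time-of-flight arithmetic behind the data cube is simple: with samples 1 ns apart, each sample bin corresponds to roughly 15 cm of range (c·t/2). The Python sketch below picks the strongest return per pixel and converts its sample index to a range; the zero window-start time and peak-picking rule are simplifying assumptions, not the camera's actual processing.

```python
# Range from one pixel's 20-sample return waveform (1-ns sampling).
C = 299_792_458.0  # speed of light, m/s

def pixel_range_m(samples, sample_period_s=1e-9, t0_s=0.0):
    """Estimate range from the sample index of the strongest return.

    t0_s is the (assumed) delay from pulse emission to the start of
    the sampling window; range resolution is C * sample_period / 2,
    about 15 cm per 1-ns sample."""
    peak_index = max(range(len(samples)), key=lambda i: samples[i])
    time_of_flight = t0_s + peak_index * sample_period_s
    return C * time_of_flight / 2.0

# A return peaking at sample 10 lies ~1.5 m past the window start.
samples = [0] * 20
samples[10] = 255
```

Real flash ladars interpolate between samples to beat the 15-cm bin size, but the bin arithmetic is the core of the measurement.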

The high-speed sampling lets the system clearly record images of helicopter blades, projectiles, and missiles. It postprocesses the data to display a 3D model of the scene. Depending on the laser energy and optics used for the laser and receiver, the system could support different ranges and horizontal fields of view. It could use signal- and image-processing algorithms to calculate image rectification, 3D object identification, and the velocities of moving objects. It could use a sequence of 3D frames to identify and track moving targets, then communicate the information to other trusted MSNA mobile or stationary nodes.

Using just ladar or radar to identify objects is difficult, but combining either with video makes doing so straightforward if global time is available and the sensors are axis aligned. Vehicles that have used this kind of sensor combination include Stanford's Stanley in the DARPA Grand Challenge,5 GDRS vehicles (www.gdrs.com/programs/program.asp?UniqueID=22), and MDARS-E.8

The system must fuse data obtained from the first two advanced sensor classes with video and infrared camera data to identify, classify, and track objects. If global time is available at the mobile nodes where the sensors are located, the system could use time stamps to perform data fusion.
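At its core, timestamp-based fusion means matching each ladar frame with the video frame nearest to it in global time and rejecting pairs whose skew is too large. A minimal Python sketch follows; the 30-Hz timestamps and 20-ms skew tolerance are assumptions for illustration.

```python
# Match a ladar frame to the nearest-in-time video frame.
import bisect

def nearest_frame(video_timestamps, ladar_timestamp, max_skew_s=0.02):
    """Return the index of the video frame closest in global time to
    the ladar frame, or None if the skew exceeds the tolerance.
    video_timestamps must be sorted ascending."""
    i = bisect.bisect_left(video_timestamps, ladar_timestamp)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(video_timestamps)]
    if not candidates:
        return None
    best = min(candidates,
               key=lambda j: abs(video_timestamps[j] - ladar_timestamp))
    if abs(video_timestamps[best] - ladar_timestamp) > max_skew_s:
        return None
    return best

video = [0.000, 0.033, 0.066, 0.100]   # assumed 30-Hz video timestamps
```

Returning None on excessive skew is what makes the GPS-outage problem discussed below visible to the fusion stage instead of silently producing misaligned object detections.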

The MSNA might encounter some challenges when using these sensors. Obviously, getting a global time in mobile nodes that synchronizes with the stationary nodes presents a challenge. If the system derives global time using GPS, but obstructions, EMI, or other factors cause a temporary GPS outage, the system requires alternate approaches for maintaining global time.

Another challenge involves maintaining a reliable wireless connection between mobile nodes and stationary nodes during global time-information exchange. The presence of many mobile nodes with inaccessible identification on highways that implement MSNA poses yet another challenge. Conflicts could arise with information supplied by mobile nodes adjacent to stationary nodes, requiring approaches for resolving them. The system must maintain stationary nodes so that global time is available for communication to mobile nodes and for data fusion.

An ideal sensor for smart mobile nodes in MSNA, the 3D flash ladar lets the system locally analyze 3D image sequences while it readily detects and tracks multiple moving vehicles. The 3D flash ladar produces sharp 3D images that require minimal image processing. Although the camera is expensive now in handmade prototype quantities, given the cost of the avalanche photodiode detector and the associated readout integrated circuit, the price will come down with mass production.

We expect to see a few types of 3D flash ladars reaching the market with different ranges, resolutions, and prices. A smart mobile node could use one 3D flash ladar with a 50-meter range in the front, two 20-meter-range radars for passing and checking blind spots, and other short-range sensors such as IR and ultrasound to support ANS. Stationary nodes located on bridges, toll plazas, and buildings also could use the 3D flash ladar.

COMMUNICATION AND WIRELESS WEB SERVICES

Context gathering using sensors distributed across mobile nodes and stationary nodes, along with distributed processing, requires good wireless communication capabilities. For example, Velodyne's HDL-64E scanning ladar generates more than 2.5 million points per second and outputs them as UDP packets over 100-Mbps Ethernet. The 3D flash ladar uses a camera link interface to output data at rates above 50 Mbytes per second.
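A quick link-budget check shows why these rates matter for the wired link and why streaming raw sensor data wirelessly is impractical. The bytes-per-point packing below is an assumption for illustration, not a Velodyne specification.

```python
# Back-of-the-envelope bandwidth check for streaming a point cloud.

def required_mbps(points_per_second, bytes_per_point, overhead=1.1):
    """Bit rate needed to stream a point cloud, with an assumed ~10%
    packet-header overhead."""
    return points_per_second * bytes_per_point * 8 * overhead / 1e6

# 2.5 M points/s at an assumed packed 3 bytes per point
# (e.g., 2-byte distance + 1-byte intensity, with angles shared
# per packet) fits comfortably on 100-Mbps Ethernet:
rate = required_mbps(2.5e6, 3)
```

The same arithmetic shows that pushing the raw stream over a shared wireless link to stationary nodes would consume most of its capacity, which motivates the "send only the moving targets" approach described next.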

The system must process data from the sensors to correct lens aberrations, remove noise, extract 3D objects, classify objects, and make decisions. Some signal processing functions are computation intensive, while decision-making functions are communication intensive. Further, cost, update, and maintenance concerns make reducing communication from the mobile nodes a priority.
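The compute-versus-communicate trade-off can be made concrete with a simple timing comparison: offload a function only when transferring its inputs and running it remotely beats running it locally. All rates and parameters in this Python sketch are illustrative assumptions.

```python
def offload_to_stationary(compute_ops, bytes_to_send, link_mbps,
                          mobile_mips, stationary_mips, deadline_s=None):
    """Decide whether to run a function on stationary-node computers.

    Offload when transfer time plus remote compute time beats local
    compute time, subject to an optional response-time deadline.
    MIPS figures and link rate are illustrative, not measured."""
    local_s = compute_ops / (mobile_mips * 1e6)
    remote_s = (bytes_to_send * 8 / (link_mbps * 1e6)       # transfer
                + compute_ops / (stationary_mips * 1e6))    # remote run
    if deadline_s is not None and remote_s > deadline_s:
        return False
    return remote_s < local_s
```

A computation-heavy, data-light function (high computation-to-communication ratio) offloads; a data-heavy, computation-light one stays on the vehicle, which matches the placement argument made later for Web services.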

One approach uses wireless technology to communicate vital data to nearby stationary nodes for processing or to have other stationary nodes perform that function. For example, instead of sending the raw data from the 3D flash ladar to stationary nodes, the system could send only the moving targets in the region of interest. The system would use stationary nodes to communicate the master node's decision to take specific actions. This functionality would require real-time and secure communication between the mobile nodes and stationary nodes. This could be achieved using third-generation Web services combined with wireless communication (www.w3.org/TR/ws-arch/wsa.pdf; http://msdn.microsoft.com/webservices/webservices/understanding/advancedwebservices/).

Web services could reside somewhere in the subnetwork of stationary nodes, registered through the master node's UDDI repository and discovered by the mobile nodes or other stationary nodes. They could also be independent of the subnetwork and registered through some distributed UDDI repository accessible by any subnetwork's mobile nodes and stationary nodes. This could leverage the services' flexibility and scalability and ease deployment when developing or updating new services. Both methods could also be combined for efficiency or other reasons because UDDI's version 3 provides distributed repositories that support both models. Figure 5 shows the interaction between mobile nodes requesting a service and the response from master nodes and service providers. The master nodes communicate with suitable service providers to meet the mobile node's needs.

Deciding whether a guidance-and-navigation function should be a Web service depends significantly on the computation-to-communication ratio. If this ratio is high and the response time is noncritical, it is preferable to use computing resources on the stationary node clusters instead of the mobile node processors. The system will use master nodes to make strategic decisions and will communicate them promptly to mobile nodes. Some navigation functions will be candidates for Web services; mobile nodes could request them, or the services could provide functions such as monitoring traffic or sending alarms to vehicles. A partial list of navigation functions that are candidates for Web service follows:

• Global-path-planning service. Based on current vehicle position, desired destination, and traffic conditions, this service consults the map database and road conditions to find efficient routes. It weights each route and sends the best alternative to the requesting vehicle. One or more master nodes and stationary nodes communicate to find routes.

Figure 5. Web services: request and response. Mobile nodes request a service, and master nodes and service providers respond. As the figure shows, mobile nodes, master/static nodes, and service providers exchange WSDL descriptions and SOAP messages through a UDDI service broker, with WS-Security and WS-Reliable Messaging protecting the exchange.

• Local-path-planning service. This service determines the optimal lane, lane offset, and speed, then sends this information to the requesting vehicle.

• Lane-following service. After determining the optimal lane offset and speed to use, this service sends the information to the requesting vehicle.

• Perceptive-passing service. This service weighs factors such as lane, speed information, traffic on adjacent lanes, and other vehicles' positions to determine if the vehicle can attempt passing. If the service allows passing, it determines the lane to use, the speed, and when to change lanes by communicating with stationary nodes ahead of the node that received the passing request.

• Lane-drifting-alarm service. Based on the vehicle's current position, lane, and speed, this service determines whether lane drifting is occurring. The system sends a signal to the vehicle to start the alarm.

• Collision-detection service. Based on the vehicle's current position, lane, and speed, the position of other vehicles and their speeds, and general traffic conditions, this service determines the chances of collision. If a collision appears imminent, the vehicles that would be involved become subject to evasive action. The system will send information about lane and speed, braking, and lane offset to these vehicles.

• Emergency-video-streaming service. The master node uses stationary nodes to communicate emergency information such as flooding, hurricanes, icing, chemical spills, and accidents to the moving vehicles.

• News service. Using stationary nodes, the master node provides news updates to the moving vehicle.
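The route weighting that the global-path-planning service performs can be framed as a shortest-path search over travel times scaled by congestion factors reported by stationary nodes. The following Python sketch shows the idea; the road graph, segment names, and congestion factors are invented for illustration, not from the article.

```python
# Dijkstra over road segments with congestion-scaled travel times.
import heapq

def best_route(graph, congestion, start, goal):
    """Find the fastest route. graph maps node -> [(neighbor,
    free_flow_minutes)]; congestion maps (u, v) -> slowdown factor
    (1.0 = free flow, 3.0 = triple the travel time)."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry
        for v, free_flow_min in graph.get(u, []):
            cost = d + free_flow_min * congestion.get((u, v), 1.0)
            if cost < dist.get(v, float("inf")):
                dist[v] = cost
                prev[v] = u
                heapq.heappush(heap, (cost, v))
    path, node = [], goal
    while node in prev:                   # walk back from goal
        path.append(node)
        node = prev[node]
    return [start] + list(reversed(path)) if path or start == goal else []

# Hypothetical network: A->B->D is shortest when clear, but a reported
# 3x slowdown on A->B makes A->C->D the better alternative.
graph = {"A": [("B", 10), ("C", 15)], "B": [("D", 10)], "C": [("D", 10)]}
jam = {("A", "B"): 3.0}
route = best_route(graph, jam, "A", "D")
```

The master node would refresh the congestion factors from stationary-node reports and rerun the search before answering each request.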

The system could communicate between the MSNA's mobile and stationary nodes in several ways, including Wi-Fi access points and routers. For higher data rates and longer distances, the system could use an approach that combines Wi-Fi and WiMax. Albuquerque, Silicon Valley, and parts of the United Kingdom already use these two technologies in their transportation systems.

Mobile nodes initially use smart phones to requestservices from service providers for their journey. Masternodes near the starting point will communicate withstationary nodes to support the mobile node. Whilemany smart phones support Wi-Fi access, using themin mobile nodes to communicate with both the MSNA’s

stationary nodes and some of the vehicle’s control functions requires significant software development and some hardware development. Supporting secure and trusted communication requires additional software development. Although smart phones have the necessary compute power, readily programming them remains problematic because of vendor and cellular-carrier IP issues. The lack of flexibility in adding computing power and trusted networking support also poses an obstacle.

We propose an open platform—MC4—that supports mobile wireless computing, communicating, and consumer electronics convergence. This approach would start with the IP provided by convergence chips such as Qualcomm’s MSM 7600 and RF chips such as the RTR 6275 and PMIC 7500, then add an interconnection network that can support network and computer cards similar to an SD memory card. Sensors such as GPS, infrared range finders, cameras, and 3D accelerometers could be added.

This approach would let the basic cell phone transform into a sensing device for mobile nodes and a compute engine for multimedia, navigation, and control processing in a self-organizing wireless Web. Because some Qualcomm RF chips support multiple radio bands in the basic cell phone, network processing and wireless Web communication over local, rural, and metropolitan areas could be done without heavy infrastructure investments.

Figure 6 shows a block diagram of the proposed MC4 device, which can also be attached to external processors. For example, the MC4 could be connected through the external-processor port to the processor clusters that handle the many functions of a car or truck. This allows reliable communication between the control functions in a vehicle and the distributed sensing and navigation. The actions that software determines could


Figure 6. Mobile computing, communicating, consumer electronics convergence (MC4) device. Adding the computer card to the basic cell phone could change how developers design consumer electronics devices. (The block diagram connects the Qualcomm MSM7600, a computer card, a network card, an external processor, and add-on memory over a 64-bit AXI bus or crossbar-like interconnection network, along with sensors such as GPS, a stereo camera, a 3D accelerometer, an IR ranger, and Bluetooth, and an RF section comprising the RF baseband, the RF receiver and transmitter, the PMIC, and the RTR 6275 RF chip.)


be reliably communicated to the actuators’ controllers in a vehicle using the MC4.

Figure 7 shows the communication paths MC4 needs. Although integrating RF circuits for multiple bands, Bluetooth, a GPS receiver, Wi-Fi, and WiMax on a single CMOS chip poses a challenge today, system-in-a-package solutions are emerging. Key components that must be developed to realize the MC4 platform include the interconnection, computer, and network cards.

Developers could use available silicon solutions for Wi-Fi and WiMax (www.intel.com/network/connectivity/products/wireless/prowireless_5116.htm) to build the network card. The computer card requires a low-power embedded parallel architecture and silicon implementation because many algorithms in navigation processing support data-, task-, and instruction-level parallelism.

Advanced sensors such as 3D flash ladar also require postprocessing to extract 3D objects from the fused data and track vehicles. The algorithms needed to extract 3D objects have data and task parallelism. The system could use pipelined architectures to process the 3D image sequences that the sensors generate. Combining the two requires parallel architectures with support for pipelining at the task or kernel level.
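The task-level pipelining described above can be sketched in software: each processing stage runs concurrently, so successive 3D frames overlap in flight. The thread-and-queue structure below only illustrates the idea; a real system would map the stages onto parallel hardware rather than Python threads, and the stage functions stand in for steps such as object extraction and tracking.

```python
import queue
import threading

def pipeline(frames, stages):
    """Run each stage in its own thread, linked by bounded queues,
    so frame N+1 enters stage 1 while frame N is still in stage 2."""
    qs = [queue.Queue(maxsize=2) for _ in range(len(stages) + 1)]
    done = object()  # sentinel marking the end of the frame stream

    def run(stage, qin, qout):
        while True:
            item = qin.get()
            if item is done:
                qout.put(done)  # propagate shutdown downstream
                return
            qout.put(stage(item))

    threads = [threading.Thread(target=run, args=(s, qs[i], qs[i + 1]))
               for i, s in enumerate(stages)]
    for t in threads:
        t.start()
    for frame in frames:
        qs[0].put(frame)
    qs[0].put(done)
    results = []
    while (item := qs[-1].get()) is not done:
        results.append(item)
    for t in threads:
        t.join()
    return results
```

Because each queue is FIFO and each stage runs in a single thread, frame order is preserved end to end.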

Adding the computer card to the basic cell phone, as Figure 6 shows, could change how developers design consumer electronics devices. Medium- to high-quality digital cameras, video cameras, and music players place a high demand on computing resources. Designers could also use the proposed computer card with these devices to provide high-quality results at a low cost.

The computer card’s system architecture could be based on a stream-programming12 model to leverage the computation density possible with the stream-based architectures normally used in GPUs (http://download.nvidia.com/developer/GPU_Gems_2/GPU_Gems2_ch29.pdf). The card could also provide a low-power version of IBM’s Cell processor,13 platform FPGA chips such as the Xilinx Virtex II Pro, or the eLite processor.14 Development of open platforms such as MC4—which has cameras, 3D accelerometers, GPS, temperature, pulse-measurement, and odor-detecting sensors—could open new avenues for personal information access and healthcare. These platforms could help mobile robots navigate factories and avoid collisions when undertaking search-and-rescue missions.

IMPLEMENTATION THOUGHTS

Designing and implementing MSNA for an urban area

will involve local governments, state and federal highway departments, hardware and software technical experts, and users. The design and implementation of MC4, stationary nodes, and master nodes by themselves constitute major projects. We propose a service-oriented architecture (SOA) for implementing MSNA software, rather than an object-oriented (OO) approach. OO uses objects as building blocks to model solutions. These objects can handle the desired functionality and communicate with other objects directly. Because mobile nodes communicate with one stationary node after another as they travel, a distributed and flexible approach must be applied. SOA defines a service as a loosely coupled autonomous entity that can easily communicate with other services.

The implementation’s goal should be to build software systems that can communicate seamlessly with each other. These connected systems reside in their service domain and send messages to loosely coupled objects on target service domains. The MSNA’s mobile nodes could be the clients, while the functions that the mobile nodes need could be implemented as services on the stationary nodes.
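A minimal sketch of this client-service pattern follows, assuming self-describing JSON messages (our choice; the article does not fix a message format) and a hypothetical lane-drift service with an invented 0.5-meter threshold.

```python
import json

class StationaryNode:
    """Hosts loosely coupled services keyed by name, SOA-style."""

    def __init__(self):
        self.services = {}

    def register(self, name, handler):
        self.services[name] = handler

    def handle(self, message: str) -> str:
        # Self-describing messages let a mobile node talk to whichever
        # stationary node is nearest as it travels, with no shared objects.
        request = json.loads(message)
        handler = self.services[request["service"]]
        return json.dumps(handler(request["params"]))

# Hypothetical service: the 0.5 m drift threshold is ours, not the article's.
node = StationaryNode()
node.register("lane-drift", lambda p: {"drifting": abs(p["offset_m"]) > 0.5})
reply = node.handle(json.dumps(
    {"service": "lane-drift", "params": {"offset_m": 0.8}}))
```

Because the mobile node depends only on the message format, it can issue the same request to each stationary node it passes along the road.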

MSNA implementation involves a detailed requirements specification, designing models for the nodes and communication links, developing components, simulation and analysis, and coding. Modeling and simulating the mobile nodes could be done using USC’s Player/Stage software (http://playerstage.sourceforge.net/gazebo/gazebo.html) and the Orca2 open source component framework for developing component-based robotics (http://orca-robotics.sourceforge.net/orca_doc_about.html). Simulating the wireless and wired communication between the nodes could be done using the NS2 Network Simulator (http://nsnam.isi.edu/nsnam/index.php/User_Information) and the NAM network animator for animating traffic flows.

We are beginning work on developing components for mobile nodes using Orca2, which provides a flexible development and communications environment, easing the process of integrating multiple sensors and actuators. Orca2 uses the Internet Communications Engine (ICE) for component communication. ICE can be used with a variety of communication techniques, making it suitable for our framework, in which components can communicate inside the vehicles or among vehicles and stationary nodes. The stationary nodes bridge communications between the sensor and actuator parts of the system and the Web services. Developers could use Axis2 bindings in the Tomcat environment to design and

Figure 7. MC4 communication paths. Emerging system-in-a-package solutions will make the MC4 device possible. (The diagram shows the MC4 linked to the cellular network, to stationary nodes over Wi-Fi and WiMax, to a personal area network over Bluetooth, and to the vehicle computer cluster.)


implement these services (www.informit.com/guides/content.asp?g=java&seqNum=165&rl=1). Development of a framework for implementing the MSNA using Orca2, NS2, Player/Stage, Axis2, Tomcat, and the Joint Architecture for Unmanned Systems (www.jauswg.org) is under way.

Implementing some of the approaches we’ve proposed and constructing a scaled-down version of MSNA could be carried out at UC Berkeley’s

Richmond Field Station. Some of the California PATH project’s transportation and highway research results and testing facilities could also prove useful for implementing MSNA.

During the past two decades, the California PATH project has focused on automated highway systems, intelligent vehicles, cooperative active safety, vehicle-infrastructure integration, communication protocols, and tools for traffic measurement and congestion control. For vehicular traffic, the California PATH project uses dedicated short-range communication (DSRC)—a modified 802.11a wireless technology—instead of Wi-Fi or WiMax. DSRC is designed to support many applications. For example, ambulances can cause traffic lights to change in their favor, and road sensors can transmit information about traffic congestion to automobile navigation systems. As DSRC matures, it can be added to MSNA. Many projects related to autonomous navigation, intelligent transportation, and secure communication are under way around the country, and their research results can benefit MSNA’s implementation. ■

Acknowledgments

I thank the referees for their comments and suggestions. Thanks to Bob Brodersen and Jan Rabaey for support at the Berkeley Wireless Research Center. Tomas Sanchez Lopez helped clarify some of the Web service approaches. Daeyoung Kim and his students at the Real-Time Embedded Systems Lab, ICU, Daejeon, South Korea, used scale models to help with sensor network and autonomous navigation system development. Yong-Woon Park of ADD listened to my ideas and provided opportunities to interact with the autonomous vehicles development group. Thanks to Roger Stettner of ASC, Shankar Sastry, and members of the Berkeley-Sydney Driving Team, who will participate in the 2007 DARPA Urban Challenge, for discussions on sensors and data fusion approaches.

References

1. M.A. Alborta et al., “Three-Dimensional Imaging Laser Radars with Geiger-Mode Avalanche Photodiode Arrays,” Lincoln Laboratory J., vol. 13, no. 2, 2002, pp. 351-370.

2. T.C. Ng, “Development of a 3D Ladar System for Autonomous Vehicle Guidance,” SIM Tech Technical Reports, vol. 6, no. 1, 2005, pp. 13-18.

3. R. Stettner and H. Bailey, “Eye-Safe Laser Radar 3D Imaging,” Proc. SPIE, vol. 5412, SPIE, Apr. 2004, pp. 111-116.

4. R. Stettner, H. Bailey, and S. Silverman, “Three-Dimensional Flash Ladar Focal Planes and Time-Dependent Imaging,” Advanced Scientific Concepts, 2006; www.advancedscientificconcepts.com.

5. S. Thrun et al., “Stanley: The Robot that Won the DARPA Grand Challenge”; http://ai.stanford.edu/~diebel/stanley/thrun.stanley05.pdf.

6. I. Elsen et al., “Streaming Technology in 3G Mobile Communication Systems,” Computer, Sept. 2001, pp. 46-52.

7. D. Shim et al., “Autonomous Exploration in Unknown Urban Environments for Unmanned Aerial Vehicles”; www.eecs.berkeley.edu/~hcshim/publications/gnc2005final.pdf.

8. P. Cory, H.R. Everett, and T.H. Pastore, “Radar-Based Intruder Detection for a Robotic Security System,” Proc. SPIE, vol. 3525-52, SPIE, 1999, pp. 62-72.

9. G. Brooker and T. Carter, “A Millimetre Wave Radar Sensor for Autonomous Navigation and Landing,” Proc. Australian Conf. Robotics and Automation (ACRA 2000), ACFR, 2000.

10. C. Urmson et al., “High Speed Navigation of Unrehearsed Terrain: Red Team Technology for Grand Challenge 2004,” tech. report CMU-RI-TR-04-37, Carnegie Mellon Univ., 2004.

11. R.M. Heinrichs et al., “Three-Dimensional Laser Radar with APD Arrays,” Proc. SPIE, vol. 4377, SPIE, 2001, pp. 106-117.

12. V.P. Srini and J.M. Rabaey, “Reconfigurable Clusters of Memory and Processors Architecture for Stream Processing,” Proc. Int’l Conf. High-Performance Computing, Tata McGraw Hill, 2002, pp. 17-40.

13. M. Gschwind et al., “Synergistic Processing in Cell’s Multicore Architecture,” IEEE Micro, Mar. 2006, pp. 10-24.

14. J.H. Moreno et al., “An Innovative Low-Power High-Performance Programmable Signal Processor for Digital Communications”; www.research.ibm.com/journal/rd/472/moreno.html.

Vason P. Srini is a research collaborator at the Berkeley Wireless Research Center, University of California at Berkeley. He is also an invited professor at ICU, Daejeon, South Korea. His research interests include navigation processing, system-on-chip applications, low-power architecture research at the module and circuit levels, and IP-generation research at the module, cell-library, and core-processor levels. Srini received a PhD in computer science from the University of Southwestern Louisiana. Contact him at srini@eecs.berkeley.edu.


