
General Software Documentation for P15230: Quadcopter Project

Alyssa Colyette


Table of Contents

Overall Navigation System
Global Pathing
    The Idea
    The Algorithm
    A* PoC and Conclusions (thus far)
Object Detection
    Implementation Pseudocode
Localization
    Wifi-router Pinging Alternative
        Concept
        Results from PoC (thus far)
    Belief Location From Executed Commands
Facial Recognition Subroutine
    Implementation Pseudocode
Flight Controller API


Overall Navigation System

The goal of this project is to create a quadrotor that can navigate autonomously to known positions, search for a face at each position, and capture an image of that face. To accomplish this, a master Raspberry Pi initiates path planning and executes movement commands for the craft. In the class diagram below, the objects in yellow run on the master Pi. The objects in magenta run on a separate slave Pi that communicates over a serial UART link. The remaining two objects run on two separate Arduino Nanos, both acting as slave devices to the master Pi over I2C.
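As a point of reference, below is a minimal C++ sketch of how the master Pi might open these two links on Linux. The device paths and the 7-bit I2C addresses for the two Nanos are assumptions for illustration, not final assignments.

    #include <fcntl.h>            // open()
    #include <unistd.h>           // close()
    #include <termios.h>          // UART configuration
    #include <sys/ioctl.h>
    #include <linux/i2c-dev.h>

    // Hypothetical 7-bit addresses for the two Arduino Nano slaves (not final).
    constexpr int SONAR_NANO_ADDR = 0x10;
    constexpr int MOTOR_NANO_ADDR = 0x11;

    // Open the master Pi's I2C bus and bind the descriptor to one slave device.
    int openI2cSlave(int addr) {
        int fd = open("/dev/i2c-1", O_RDWR);
        if (fd < 0) return -1;
        if (ioctl(fd, I2C_SLAVE, addr) < 0) {   // select this slave for reads/writes
            close(fd);
            return -1;
        }
        return fd;
    }

    // Open the UART link to the slave (image-processing) Pi at 115200 8N1.
    int openUart(const char* dev = "/dev/ttyAMA0") {
        int fd = open(dev, O_RDWR | O_NOCTTY);
        if (fd < 0) return -1;
        termios tio{};
        cfmakeraw(&tio);
        cfsetispeed(&tio, B115200);
        cfsetospeed(&tio, B115200);
        tcsetattr(fd, TCSANOW, &tio);
        return fd;
    }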

Below is the flow diagram of the overall navigation system. The flow of the facial recognition capture system is detailed in a later section (Facial Recognition Subroutine). Each component is explained in further detail in the following sections.


Global Pathing

The Idea (focus on 2D coordinates)

Takes advantage of prior information:
    navigation space / dimensions
    target destination location

Risks:
    map resolution too high for real-time (RT) computations

Mitigations:
    reduce resolution
    create grid for pocketed/local area
    project destination point, create sub-destination

I. Generate grid of course space

Initialize a fully connected graph with nodes uniformly distributed across the area of the navigation space. The edge costs are all the same value r (the selected resolution for the grid, e.g. r = 1 m).
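A minimal C++ sketch of this grid initialization is shown below. The Node/Edge structures and the uniform-cost, 4-connected layout follow the description above; the types actually used in the project's code base may differ.

    #include <vector>

    struct Node { int x, y; };                  // grid coordinates in units of r
    struct Edge { int from, to; double cost; };

    // Build an n x n grid of nodes spaced r meters apart, with 4-connected
    // neighbors and a uniform edge cost of r.
    void buildGrid(int n, double r,
                   std::vector<Node>& nodes, std::vector<Edge>& edges) {
        nodes.clear();
        edges.clear();
        for (int y = 0; y < n; ++y)
            for (int x = 0; x < n; ++x)
                nodes.push_back({x, y});

        auto id = [n](int x, int y) { return y * n + x; };
        for (int y = 0; y < n; ++y) {
            for (int x = 0; x < n; ++x) {
                if (x + 1 < n) edges.push_back({id(x, y), id(x + 1, y), r});
                if (y + 1 < n) edges.push_back({id(x, y), id(x, y + 1), r});
            }
        }
    }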


Diagram II: Area Mapping

*The grid depicted in Diagram II is a basic grid. Edges could also be added between all nodes in the relative vicinity (e.g. diagonal neighbors), but those edge costs would be larger (√2 · r m).

II. Recursively apply the node pathing algorithm (A* search) until the destination is reached.

A. Apply A* search from the current state (position) reported by the localization module.

With a believed position, the quadcopter should navigate through the nodes' available edges towards its destination with some allowable error.

Diagram III: Path Planned

After calculating the path, the quadcopter would start executing the set of controls reflecting that path.

B. Poll for the refreshed-grid-map flag from the object detection module; if it is set, update the map and go back to part A.

New observations can be made by the quadcopter from the object detection module, in which case the map would be updated. Diagram IV shows a detected obstacle that marks the area containing 3 nodes as forbidden, because a wide incoming obstacle was detected there. Since the depth of the obstacle is not known yet, the map is not updated for any other positions.


Diagram IV: Path Planned AFTER Map Update

Note: backtracking may be a problem when the shortest path is recalculated after object detection…

The planner should always use the same start node, and use the current localization estimate to determine where along the path the craft is.

The Algorithm

A* Search pseudocode:
1. Mark the current block.
2. Assign it as the parent for all adjacent blocks.
3. For each adjacent block, calculate G, H, and F:
    G = distance between the current position and the start
    H = (Manhattan) distance between the current and end positions
    F = G + H
4. Choose the min(F) block as the new current block; repeat from 1 if H > 0.

Methods for optimizing:
    H values can be precalculated.
    Not all G values have to be recalculated...
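As a reference for the cost terms above, the sketch below shows how G, H, and F might be computed for one grid cell. The Cell structure and the grid resolution r are assumptions consistent with the grid description earlier, not the project's final data layout.

    #include <cstdlib>   // std::abs

    struct Cell {
        int x, y;        // grid coordinates
        double g = 0.0;  // G: cost from the start node along the current path
        double h = 0.0;  // H: Manhattan-distance heuristic to the goal
        double f = 0.0;  // F: total estimated cost, f = g + h
        int parent = -1; // index of the parent cell
    };

    // Manhattan distance between two cells, scaled by the grid resolution r.
    double manhattan(const Cell& a, const Cell& b, double r) {
        return r * (std::abs(a.x - b.x) + std::abs(a.y - b.y));
    }

    // Score one neighbor reached from 'current' over an edge of cost 'edgeCost'.
    void scoreNeighbor(const Cell& current, Cell& neighbor,
                       const Cell& goal, double edgeCost, double r) {
        neighbor.g = current.g + edgeCost;          // step 3: G
        neighbor.h = manhattan(neighbor, goal, r);  // step 3: H
        neighbor.f = neighbor.g + neighbor.h;       // step 3: F = G + H
    }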

This is Koenig's proposed Real-Time Adaptive A* pseudocode [1]:

    S          set of states of the search task, a set of states
    GOAL       set of goal states, a set of states
    A(·)       sets of actions, a set of actions for every state
    succ(·)    successor function, a state for every state-action pair
    variables
    lookahead  number of states to expand at most, an integer larger than zero
    movements  number of actions to execute at most, an integer larger than zero
    s_curr     current state of the agent, a state
    [USER] c   current action costs, a float for every state-action pair
    [USER] h   current (consistent) heuristics, a float for every state
    [USER] g   g-values, a float for every state
    [A*] CLOSED  closed list of A* (= all expanded states), a set of states
    [A*] s̄     state that A* was about to expand when it terminated, a state

    procedure realtime adaptive astar():
        while (s_curr ∉ GOAL) do
            lookahead := any desired integer greater than zero;
            astar();
            if s̄ = FAILURE then
                return FAILURE;
            for all s ∈ CLOSED do
                h[s] := g[s̄] + h[s̄] − g[s];
            movements := any desired integer greater than zero;
            while (s_curr ≠ s̄ AND movements > 0) do
                a := the action in A(s_curr) on the cost-minimal trajectory from s_curr to s̄;
                s_curr := succ(s_curr, a);
                movements := movements − 1;
            for any desired number of times (including zero) do
                increase any desired c[s, a] where s ∈ S and a ∈ A(s);
                if any increased c[s, a] is on the cost-minimal trajectory from s_curr to s̄ then
                    break;
        return SUCCESS;

[1] Sven Koenig. “Real-Time Adaptive A*”. http://scholar.google.com/citations?view_op=view_citation&hl=en&user=tpoh43QAAAAJ&citation_for_view=tpoh43QAAAAJ:8k81kl-MbHgC

A* PoC and Conclusions (thus far)


The environment and the algorithm for A* search were coded in C++ and run on the Raspberry Pi. The basic testing performs two least-resistance pathings, both starting from one corner node (0,0) and ending at the opposite corner node (85,85).

The first path is computed on an 85x85 unit area. There are no obstacles, so every node is linked to its neighbors within the discretized space. The delivered path is a node traversal along the edges of the grid: from (0,0) to (0,85) to (85,85).

The second path is computed on the same grid with the edge between nodes (0,0) and (0,1) (the first edge traversed in the first path) removed. This forces the pathing to traverse the nodes along the x-direction first instead: from (0,0) to (85,0) to (85,85).

Both of these tests were run on the Pi with timing captured by the system clock as: real 0m0.075s, user 0m0.060s, sys 0m0.020s. Though both paths are simple traversals, a new path does not take long to calculate: the two paths were computed within 100 ms.

TODO: Add diagonally accessible neighbors to get the optimal path. Could update the cost heuristic to use Euclidean distances instead of Manhattan distances. Build into the Adaptive A* Search environment for Flight_Controller_API execution.
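A minimal sketch of the proposed heuristic change is below; it simply swaps the Manhattan metric for the Euclidean one under the same grid assumptions as the earlier cost sketch, and is not yet part of the tested code.

    #include <cmath>   // std::hypot

    // Euclidean-distance heuristic between two grid cells, scaled by the grid
    // resolution r. Appropriate once diagonal edges (cost √2·r) are added.
    double euclidean(int ax, int ay, int bx, int by, double r) {
        return r * std::hypot(static_cast<double>(ax - bx),
                              static_cast<double>(ay - by));
    }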


Object Detection


The Object_Detect class relies on messaging from two different systems over different communication protocols. Object_Detect will have to handle some form of object tracking and/or thresholding to determine whether a detected obstacle warrants a severed edge connection, thus provoking a graph update to the Global Pathing grid.

Error Checking:
    SonarDetect should relay errors in a sensor reading
    LidarDetect should relay errors in a captured image (resolution, etc.)

Implementation Pseudocode

    LidarCheck = 1/5 sec    // Pi processes ~5 fps (tested)
    SonarCheck = 19-20 ms   // takes 10 us to initiate the command pulse; the return
                            // pulse can take up to 18 ms for the ~3 m detection range

    // main function
    main()
        for (;;)
            sonarPkt = getSonarPkt()
            for every sensor in sonarPkt
                edges += convertSonarGlobal(pos, dist)
            if (time_elapsed == TimeLiDAR)
                TimeLiDAR = time_elapsed + LidarCheck
                edges += convertLidarGlobal(length, dist)
            else
                sleep(SonarCheck)   // lessen load on CPU
            for edge in edges
                removeEdge(edge)

    // request and preliminary pkt parsing for SONAR
    getSonarPkt()
        request sonarArduinoPkt via I2C
        for every sensor in sonarArduinoPkt
            extract pos & distance
            if distance within resolution
                add to sonarPkt
        return sonarPkt

    // request and preliminary pkt parsing for LiDAR
    getLidarPkt()
        request lidarPiPkt via UART
        extract distance & width
        if within resolution
            add to lidarPkt
        return lidarPkt
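The pseudocode references convertSonarGlobal without defining it. The sketch below covers the coordinate-conversion half of that helper under assumed interfaces: it maps a sonar range reading taken at the craft's believed pose to the global grid cell the obstacle falls in, from which the edges to sever can then be looked up.

    #include <cmath>
    #include <utility>

    struct Pose { double x, y, headingRad; };   // believed position (m) and heading

    // Map a sonar reading (range in meters, sensor mounted at 'sensorAngleRad'
    // relative to the craft's heading) onto the grid cell it falls in, given the
    // grid resolution r. The returned (col, row) identifies the node whose edges
    // should be removed from the Global Pathing graph.
    std::pair<int, int> sonarReadingToCell(const Pose& pose,
                                           double rangeM,
                                           double sensorAngleRad,
                                           double r) {
        double angle = pose.headingRad + sensorAngleRad;
        double gx = pose.x + rangeM * std::cos(angle);   // obstacle x in world frame
        double gy = pose.y + rangeM * std::sin(angle);   // obstacle y in world frame
        return { static_cast<int>(std::round(gx / r)),
                 static_cast<int>(std::round(gy / r)) };
    }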


Localization


Wifi-router Pinging Alternative: obtain distance from the frequency and signal strength of a router

Concept

CLI Call: # iwlist wlan0 scan | egrep 'Cell|Frequency|Quality|ESSID'

    Frequency: extracted as a parameter for the distance calculation
    ESSID: needed to identify the router via its MAC address
    Quality: extracted as a parameter for the distance calculation; given as a percentage x/100, so the conversion is:
        RSSI = (x/100) * 60
        dBm = RSSI - 92

To calculate the distance measurement (meters):

    calcdist(leveldBm, freqMHz):
        dist = 10 ^ ((27.55 - 20*log10(freqMHz) + |leveldBm|) / 20)

The distance would be the calculated relative radial distance from the given known router location.

TODO: This will need to be done for a total of 3+ routers to solve for coordinates using trilateration. Get accurate ground-truth measurements to compare against the distances calculated via trilateration.
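A minimal C++ sketch of the quality-to-dBm conversion and the distance formula above is shown below. It assumes the quality value has already been parsed out of the iwlist output, and the constants are the ones stated in this section rather than calibrated values.

    #include <cmath>   // std::log10, std::pow, std::fabs

    // Convert iwlist "Quality" (0-100 %) to an approximate signal level in dBm,
    // using the conversion stated above: RSSI = (x/100)*60, dBm = RSSI - 92.
    double qualityToDbm(double qualityPercent) {
        double rssi = (qualityPercent / 100.0) * 60.0;
        return rssi - 92.0;
    }

    // Radial distance (meters) from the router, from signal level and frequency:
    //   dist = 10 ^ ((27.55 - 20*log10(freqMHz) + |leveldBm|) / 20)
    double calcdist(double leveldBm, double freqMHz) {
        double exponent = (27.55 - 20.0 * std::log10(freqMHz)
                           + std::fabs(leveldBm)) / 20.0;
        return std::pow(10.0, exponent);
    }

    // Example: a 2437 MHz (channel 6) router reported at 50 % quality.
    // double d = calcdist(qualityToDbm(50.0), 2437.0);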


Results from PoC (thus far)

The distance/signal level does NOT update for multiple calls within the same run of the C program (refer to closeagain100.txt and faragain100.txt). When the program is manually run multiple times, the scanned values do update with variance and movement of the RPi (refer to testDist.txt). In both cases (manual and programmatic) the modem's displacement was ~3 ft (not measured for directionality). The table below lists the max and min measurements retrieved from separate manual runs of the program. Only 10 calculations were done for each position.

    Position   Min Measurement (m)   Max Measurement (m)   Range (m)
    1          1.164695              1.337248              0.172553
    2          0.670212              0.769509              0.099297

Further analysis requires accurate real-world measurements. As a crude analysis, though, direct proportionality between the calculated measurements and the displacement was present.

TODO: Measure the execution time for the program when abstracting data and calculating the distance for one router.

Belief Location From Executed Commands

If trilateration fails, or in addition to trilateration, the location the quadrotor believes it is in will be a prediction based on the commands given to the craft from a known start point. Using a Kalman filter, Bayesian theory can be applied to predict the location from the given commands, as well as to provide an (increasing) covariance that represents the uncertainty of the craft's predicted location. This uncertainty would keep increasing until more observations are provided to the engine to improve its certainty.
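A minimal sketch of the prediction step described here is below, for a 1-D position/velocity state. The state layout, process-noise handling, and command-to-acceleration mapping are illustrative assumptions, not the project's filter design.

    // 1-D belief state: position and velocity, with a 2x2 covariance P.
    struct Belief {
        double pos = 0.0, vel = 0.0;
        double P[2][2] = {{0.0, 0.0}, {0.0, 0.0}};
    };

    // Kalman prediction step for one executed command, modeled as a commanded
    // acceleration 'u' applied for 'dt' seconds. With no measurement update,
    // the covariance (uncertainty) grows on every call.
    void predict(Belief& b, double u, double dt, double processNoise) {
        // State prediction: x = F x + B u, with F = [[1, dt], [0, 1]].
        b.pos += b.vel * dt + 0.5 * u * dt * dt;
        b.vel += u * dt;

        // Covariance prediction: P = F P F^T + Q.
        double p00 = b.P[0][0] + dt * (b.P[1][0] + b.P[0][1]) + dt * dt * b.P[1][1];
        double p01 = b.P[0][1] + dt * b.P[1][1];
        double p10 = b.P[1][0] + dt * b.P[1][1];
        double p11 = b.P[1][1];
        b.P[0][0] = p00 + processNoise;
        b.P[0][1] = p01;
        b.P[1][0] = p10;
        b.P[1][1] = p11 + processNoise;
    }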


Facial Recognition Subroutine


The facial detection module would be activated by the main task manager. The TaskManager will have to become a listener for when the facial detection module finds a face. The algorithm for facial detection would be an implementation of the Viola-Jones method. This approach should detect faces within the range of profile to ¾ view.

Link to OpenCV library reference: http://docs.opencv.org/doc/user_guide/ug_traincascade.html

TODO: find a training set for the classifier
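For reference, below is a minimal OpenCV sketch of running a trained Viola-Jones cascade on a captured frame. The cascade XML file name is an assumption (a stock OpenCV frontal-face cascade), since the project's own training set is still a TODO.

    #include <opencv2/objdetect.hpp>
    #include <opencv2/imgproc.hpp>
    #include <vector>

    // Returns true if at least one face is found in the captured frame.
    bool detectFace(const cv::Mat& frame) {
        // Assumed cascade file; the project's own trained cascade would go here.
        static cv::CascadeClassifier cascade("haarcascade_frontalface_default.xml");
        if (cascade.empty()) return false;

        cv::Mat gray;
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        cv::equalizeHist(gray, gray);                 // normalize lighting

        std::vector<cv::Rect> faces;
        cascade.detectMultiScale(gray, faces, 1.1, 3, 0, cv::Size(30, 30));
        return !faces.empty();
    }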

Backup Facial Recognition

Since the recognition software will run on the same RPi used for LIDAR detection, the image-processing-intensive Viola-Jones method may prove too costly. As a backup plan, an IR LED would be used as a beacon to indicate face placement. Bayesian theory would be used to determine the probability of a face being within the frame given that the beacon is detected.

Implementation Pseudocode

    // precon: copter is believed to be at the coordinates of the target
    // @brief: checks 4 sides within a vertical height of 1 ft
    // Facial Recognition detects up to ¾ view
    // Needs to communicate with the Flight Controller command API (to Nano)
    // Face_Found flag is set in another thread (to the image-processing Pi)
    searchForFace()
        bool image_cap = false;
        while (!Face_Found)
            // searches vertical region of expected face location
            transverseFaceRegVertical();
            for vertical_scan_time
                if Face_Found
                    send stop;
                    image_cap = takePic();

            //~~~~ check other side
            orbit90();
            for orbit_quart_scan
                if Face_Found
                    send stop;
                    image_cap = takePic();
            transverseFaceRegVertical();
            for vertical_scan_time
                if Face_Found
                    send stop;
                    image_cap = takePic();

            //~~~~ check other side (180)
            orbit90();
            for orbit_quart_scan
                if Face_Found
                    send stop;
                    image_cap = takePic();
            transverseFaceRegVertical();
            for vertical_scan_time
                if Face_Found
                    send stop;
                    image_cap = takePic();

            //~~~~ check last side
            orbit90();
            for orbit_quart_scan
                if Face_Found
                    send stop;
                    image_cap = takePic();
            transverseFaceRegVertical();
            for vertical_scan_time
                if Face_Found
                    send stop;
                    image_cap = takePic();

        Face_Found = false;
        return image_cap;

Flight Controller API

The internal PID controller within the flight controller should be tuned for maximum stability, to the point of being fairly rigid. It should account for user error (in this case, computer error) if bad controls are given to the actuators. Without any given commands the quadrotor would hold a static hover state. It should not be prone to tipping (flipping upside down) during the initial phases while the command library is built.

In the starting phase the Flight_Controller_API will have a set of 'safe to execute at hover' commands. This would be limited to at least the commands given in the class diagram, each for a static parameterized distance or degree. Each of these commands will derive the required motor adjustments as well as package the command to be sent via I2C to the Arduino controller that drives the motors.
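A minimal sketch of how one such 'safe at hover' command might be packaged and sent to the Arduino over I2C is shown below. The command codes, packet layout, and checksum are illustrative assumptions; the actual protocol is defined by the Arduino-side program and Philip's RC Protocol Document.

    #include <cstdint>
    #include <unistd.h>   // write()

    // Hypothetical command codes understood by the motor-control Nano.
    enum class Cmd : uint8_t { Hover = 0, Forward = 1, YawLeft = 2, YawRight = 3 };

    // Hypothetical 4-byte packet: [command, parameter, checksum_hi, checksum_lo].
    // 'param' is a static parameterized distance (cm) or degree, as in the class diagram.
    bool sendCommand(int i2cFd, Cmd cmd, uint8_t param) {
        uint8_t pkt[4];
        pkt[0] = static_cast<uint8_t>(cmd);
        pkt[1] = param;
        uint16_t sum = pkt[0] + pkt[1];      // trivial checksum, for illustration only
        pkt[2] = static_cast<uint8_t>(sum >> 8);
        pkt[3] = static_cast<uint8_t>(sum & 0xFF);
        return write(i2cFd, pkt, sizeof(pkt)) == sizeof(pkt);
    }

    // Usage (fd obtained as in the earlier I2C setup sketch):
    //   sendCommand(fd, Cmd::Forward, 100);   // move forward 100 cm, then re-hover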

As for actually controlling the motors, the quadrotor system expects potentiometer inputs read via an RC channel for each motor (refer to Philip's RC Protocol Document). If the Arduino program functions as an interface that ranges these channel inputs and sends them directly to the flight controller, then the Flight_Controller_API must act as an automatic RC controller. It will account for controller toggle adjustments when executing the set number of moves.

RISKS: May require an outer-loop PID controller for the executed directions. Many control systems for a quadrotor require two PD controllers to manage inertial and attitude overshoot.
