7/25/2019 CP 1516 Week6.Prelimj
L2 Computational Physics
Observations on weekly problems
Observations
- I was very impressed by the standard of work on the Pendulum assessment
- Keep it up!
Optimisation Techniques
Function minimisation
Optimisation Techniques
- Background
- Aside: visualising 2D data as images
- Brute force methods
- Deterministic methods
  - Gradient Descent
  - Nelder-Mead Simplex
- Stochastic methods
  - Genetic Algorithms
Background
Function minimisation
- Given f(x, y, ...), find the coordinates and the value at the minima of the function
- Analytical techniques are only useful for some functions
- Numerical techniques
  - This is what we will look at today
Applications of minimisation
- Science
  - Fitting a model function to experimental data
  - Orbital mechanics
- Optical design
  - Maximise image quality
  - Minimise cost
  - Adaptive Optics
- Protein folding
- Optimising control system parameters
- An example in your pocket: auto-focus cameras (smartphones)
Aside
Visualising 2D functions
It's going to be really useful to be able to visualise these functions.
Let's look at how we do that with Python, numpy and matplotlib.
2D plotting
- We want to visualise f(x, y)
- Evaluate f(x, y) over a range of evenly spaced x and y values
- Store the results in a 2D numpy array
- Display this with matplotlib
2D plotting - the code is built up in steps:
- The function to be displayed
- The range we wish to plot over
- Generate all the x and y values
- Somewhere to put our data
- Note that we import matplotlib.cm as well (colour maps); imports should be at the top, it's just moved in this example
2D plotting
- imshow treats the array as an image and shows it on the plot
- extent tells matplotlib what range the image should cover (we know this from our bounds, but neither the data nor matplotlib knows this)
- origin='lower' makes it appear the right way up
- cmap tells matplotlib how to colour it in (low values are dark, high values are light in this case)
- pyplot.colorbar gives us a nice Z-scale bar
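The code on the original slides is shown as images, which are not in this transcript; the following is a sketch that follows the steps above. The function f and the plot range are illustrative stand-ins, not the slide's own example.

```python
import numpy
import matplotlib.pyplot as pyplot
import matplotlib.cm as cm

# Function to be displayed (illustrative; minimum at (1, -2))
def f(x, y):
    return (x - 1)**2 + (y + 2)**2

# The range we wish to plot over
x_min, x_max = -5.0, 5.0
y_min, y_max = -5.0, 5.0
n = 200

# Generate all the x and y values
xs = numpy.linspace(x_min, x_max, n)
ys = numpy.linspace(y_min, y_max, n)

# Somewhere to put our data
data = numpy.zeros((n, n))
for iy, y in enumerate(ys):
    for ix, x in enumerate(xs):
        data[iy, ix] = f(x, y)

# extent maps array indices to our coordinates; origin='lower' puts it
# the right way up; cm.gray makes low values dark and high values light
pyplot.imshow(data, extent=(x_min, x_max, y_min, y_max),
              origin='lower', cmap=cm.gray)
pyplot.colorbar()
pyplot.show()
```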
Ta Da
Methods
- Brute force and ignorance: Exhaustive Search
- Deterministic searches: Gradient Descent, Hill Climbing, Nelder-Mead Simplex
- Stochastic searches: Genetic Algorithms, Stochastic Gradient Descent, Simulated Annealing
Brute Force and Ignorance
Fast to code, slow to run
Exhaustive Search
- For every x, for every y: is this the smallest f(x, y)?
- Benefits
  - Trivial to code
- Drawbacks
  - Slow
  - Not very accurate: it must operate on some finite, quantised grid
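The nested loop above can be written down directly; the function f and the search bounds here are illustrative, not from the slides:

```python
import numpy

def f(x, y):
    # Illustrative function with its minimum at (1, -2)
    return (x - 1)**2 + (y + 2)**2

def exhaustive_search(f, x_range, y_range, steps=200):
    # For every x, for every y: is this the smallest f(x, y)?
    best_x, best_y, best_value = None, None, float('inf')
    for x in numpy.linspace(x_range[0], x_range[1], steps):
        for y in numpy.linspace(y_range[0], y_range[1], steps):
            value = f(x, y)
            if value < best_value:
                best_x, best_y, best_value = x, y, value
    return best_x, best_y, best_value

x, y, value = exhaustive_search(f, (-5, 5), (-5, 5))
# Only accurate to the grid spacing (about 0.05 here),
# and it cost 200 * 200 = 40000 function evaluations
```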
Deterministic Methods
Deterministic Methods
- These methods all start from some initial position
- If you run the same method from the same position multiple times, you get the same result
Gradient Descent
Gradient Descent: Algorithm
- Walk downhill
Gradient Descent: Algorithm

Maths notation | Quantity                    | Python
r              | Position vector             | r = numpy.array((x, y))
f(r)           | Function evaluated at r     | def f(r): x, y = r; return ...
grad f(r)      | Vector differential of f(r) | def grad_f(r): return numpy.array((df_dx, df_dy))
r0             | Initial position            | r0 = numpy.array((x0, y0))
gamma          | Step size                   | gamma = 0.something

Update rule: r_{n+1} = r_n - gamma * grad f(r_n)
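Put together, the update rule r_{n+1} = r_n - gamma * grad f(r_n) is only a few lines of code. The function, starting point, and gamma below are illustrative choices:

```python
import numpy

def grad_f(r):
    # Analytic gradient of the illustrative f(x, y) = (x - 1)**2 + (y + 2)**2
    x, y = r
    df_dx = 2 * (x - 1)
    df_dy = 2 * (y + 2)
    return numpy.array((df_dx, df_dy))

r = numpy.array((4.0, 3.0))    # r0: initial position
gamma = 0.1                    # step size
for n in range(1000):
    r = r - gamma * grad_f(r)  # r_{n+1} = r_n - gamma * grad f(r_n)
# r is now very close to the minimum at (1, -2)
```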
Coloured lines are contours - see http://matplotlib.org/examples/pylab_examples/contour_demo.html
Step Size
- Gradient Descent is highly sensitive to the step size, gamma
- Too small a step and convergence is very slow
- Too large a step and it may overshoot, and the method becomes unstable
- Audience question: what causes this to happen?
Step size examples (one plot per case):
- too small: slow convergence
- larger: faster convergence
- about right
- too big: oscillatory convergence
- perfectly wrong
- far too big: divergence
GD Example 2: Rosenbrock's Banana Function
- A tough test case for minima finding
- Steep cliffs
- Very shallow valley

f(x, y) = (1 - x)^2 + 100 (y - x^2)^2
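A sketch of gradient descent on the banana function using its analytic gradient; the starting point, step size, and iteration count are illustrative, with gamma chosen small enough to survive the steep cliffs:

```python
import numpy

def banana_grad(r):
    # Analytic gradient of f(x, y) = (1 - x)**2 + 100 * (y - x**2)**2
    x, y = r
    df_dx = -2 * (1 - x) - 400 * x * (y - x**2)
    df_dy = 200 * (y - x**2)
    return numpy.array((df_dx, df_dy))

r = numpy.array((-1.0, 1.0))   # start on the far side of the valley
gamma = 0.001                  # larger steps diverge off the steep cliffs
for n in range(100000):
    r = r - gamma * banana_grad(r)
# The trajectory quickly drops into the valley, then crawls along
# its shallow floor towards the minimum at (1, 1)
```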
GD Example 2
- Trajectories rapidly enter the valley, then crawl along the shallow gradient to the minimum at (1, 1)
- Most trajectories run out before then, as we have not run for enough iterations
The method can zig-zag in an unstable manner if the step size is too large, constantly overshooting the lowest local value.
GD in the valley - slow
GD: when to stop?
- It's common to have a maximum number of iterations
- Another common pattern is to terminate early upon reaching some convergence criterion
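Both patterns can be combined in one loop. This sketch (the function, gamma, and tolerance are illustrative) caps the number of iterations but breaks out early once the steps become negligibly small:

```python
import numpy

def grad_f(r):
    # Gradient of the illustrative f(x, y) = (x - 1)**2 + (y + 2)**2
    x, y = r
    return numpy.array((2 * (x - 1), 2 * (y + 2)))

r = numpy.array((4.0, 3.0))
gamma = 0.1
max_iterations = 10000   # hard cap on the work done
tolerance = 1e-8         # convergence criterion: the step is tiny
for n in range(max_iterations):
    step = gamma * grad_f(r)
    r = r - step
    if numpy.linalg.norm(step) < tolerance:
        break            # converged: terminate early
```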
Hill Climbing
Hill Climbing
- Hill climbing is similar to gradient descent, but simpler
- Hill climbing tries moving a small distance in one dimension only at a time
- When this no longer works, another dimension is explored
- Very simple to code
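A sketch of the idea (the function, step length, and starting point are illustrative): try a small move along one axis at a time and keep it only if it lowers the function.

```python
import numpy

def f(r):
    # Illustrative function with its minimum at (1, -2)
    x, y = r
    return (x - 1)**2 + (y + 2)**2

def hill_climb(f, r0, step=0.01, max_iterations=100000):
    r = numpy.array(r0, dtype=float)
    best = f(r)
    for _ in range(max_iterations):
        improved = False
        for dim in range(len(r)):          # one dimension at a time
            for delta in (step, -step):
                trial = r.copy()
                trial[dim] += delta
                value = f(trial)
                if value < best:           # keep the move only if it helps
                    r, best = trial, value
                    improved = True
                    break
        if not improved:                   # no move in any dimension helps
            break
    return r, best

r, best = hill_climb(f, (4.0, 3.0))
# Accurate only to about the step length in each dimension
```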
Nelder-Mead Simplex
- Whiteboard
- Live examples
Local Minima
GD & Local Minima
Pathological example
No gradient into the minima
Stochastic Methods
Often, a little bit of randomness goes a long way
Genetic Algorithms
Genetic Algorithms
- Random solutions
- Survival of the fittest
- Sexual reproduction
- Mutation
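A minimal sketch of those four ingredients applied to function minimisation; the function, population size, mutation scale, and breeding scheme are all illustrative assumptions, not from the slides:

```python
import random

def f(x, y):
    # Illustrative function to minimise; lower value = fitter
    return (x - 1)**2 + (y + 2)**2

POP, GENERATIONS = 50, 200

# Random solutions: an initial scattergun across the search space
population = [(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(POP)]

for generation in range(GENERATIONS):
    # Survival of the fittest: only the better half breeds
    population.sort(key=lambda p: f(*p))
    survivors = population[:POP // 2]
    children = []
    while len(survivors) + len(children) < POP:
        # Sexual reproduction: a child mixes the coordinates of two parents
        ma, pa = random.sample(survivors, 2)
        child_x = random.choice((ma[0], pa[0]))
        child_y = random.choice((ma[1], pa[1]))
        # Mutation: an occasional random nudge maintains diversity
        if random.random() < 0.5:
            child_x += random.gauss(0, 0.1)
            child_y += random.gauss(0, 0.1)
        children.append((child_x, child_y))
    population = survivors + children

best = min(population, key=lambda p: f(*p))
```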
GA & Local Minima
- The wide sampling of the initial scattergun approach of the GA means that some points should fall near the global minimum
- Due to survival of the fittest, these rapidly form the basis of the population
Weekly Assessment
- You are going to implement a 2D Gradient Descent and plot the results
- A gradient descent solver starts from a vector point
  - Just like your Euler solver for the pendulum: f([x, y]) vs [theta, omega]
- The solver makes a series of steps that update the vector position with some function of the vector
  - Just like your pendulum Euler solver