- FFT Recap (or, what am I expected to know?)
- Learning Finite State Environments

15-451 Avrim Blum 11/25/03


Page 1: 15-451        Avrim Blum          11/25/03

- FFT Recap (or, what am I expected to know?)
- Learning Finite State Environments

15-451 Avrim Blum 11/25/03

Page 2:

FFT Recap

The basic result: Given vectors
– A = (a0, a1, a2, ..., an-1), and
– B = (b0, b1, ..., bn-1),

the FFT allows us in O(n log n) time to compute the convolution
– C = (c0, c1, ..., c2n-2),

where cj = a0 bj + a1 bj-1 + ... + aj b0.

I.e., this is polynomial multiplication, where A,B,C are vectors of coefficients.
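The definition of cj above is exactly the schoolbook polynomial product, which takes O(n^2) time directly. A minimal sketch of that direct computation, useful as a correctness check against the FFT version (the function name `convolve_naive` is mine, not from the slides):

```python
def convolve_naive(a, b):
    # c_j = a_0 b_j + a_1 b_{j-1} + ... + a_j b_0  (polynomial multiplication)
    c = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            c[i + j] += ai * bj
    return c
```

For example, convolve_naive([1, 2], [3, 4]) multiplies (1 + 2x)(3 + 4x) and returns the coefficients [3, 10, 8].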

Page 3:

How does it work?

Compute F^-1(F(A)·F(B)), where "F" is the FFT.

• F(A) is evaluation of A at 1, ω, ω^2, ..., ω^(m-1)

– ω is a principal mth root of unity, m = 2n-1. E.g., ω = e^(2πi/m). Or use modular arithmetic.

– Able to do this quickly with divide-and-conquer.

• F(A)·F(B) gives C(x) at these points.

• We then saw that F^-1 = (1/m)F', where F' is the same as F but using ω^-1.
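Putting the three steps together (evaluate both polynomials at the roots of unity, multiply pointwise, then interpolate using ω^-1 with a 1/m factor), here is a sketch of FFT-based convolution over the complex numbers. The function names and the padding to a power of two are my choices for the sketch, not from the slides:

```python
import cmath

def fft(a, invert=False):
    # recursive radix-2 FFT; len(a) must be a power of two
    n = len(a)
    if n == 1:
        return a[:]
    sign = 1 if invert else -1          # invert=True uses w^-1, as on the slide
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    out = [0] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n)
        out[k] = even[k] + w * odd[k]
        out[k + n // 2] = even[k] - w * odd[k]
    return out

def convolve(a, b):
    # pad to a power of two of length at least len(a) + len(b) - 1
    m = 1
    while m < len(a) + len(b) - 1:
        m *= 2
    fa = fft(list(a) + [0] * (m - len(a)))
    fb = fft(list(b) + [0] * (m - len(b)))
    prod = [x * y for x, y in zip(fa, fb)]  # C(x) at the evaluation points
    c = fft(prod, invert=True)
    # the inverse transform carries the 1/m factor; round off float error
    return [round((v / m).real) for v in c[: len(a) + len(b) - 1]]
```

Each recursive level does O(n) work over O(log n) levels, giving the O(n log n) bound; e.g., convolve([1, 1], [1, 1]) returns [1, 2, 1], the coefficients of (1 + x)^2.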

Page 4:

Applications (not on test)

• signal analysis, lots more
• Pattern matching with don't cares:

– Given text string X = x1,x2,...,xn, with xi ∈ {0..25}.
– Given pattern Y = y1,y2,...,yk, with yi ∈ {0..25} ∪ {*}.
– Want to find instances of Y inside X.

• Idea [Adam Kalai, based on Karp-Rabin]:
– Pick random R: r1,r2,...,rk, with ri ∈ {1..N}. E.g., N = n^2.

– Set ri = 0 if yk-i+1 = *.
– Let T = r1 yk + ... + rk y1 (can do mod p > N).
– Now do the convolution of R and X and see if any entries match T. Each entry has at most a 1/N chance of being a false positive.
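A sketch of the fingerprinting idea above (names and details are mine, not from the slides). For simplicity this version computes each convolution entry directly in O(nk) total; replacing that inner loop with the FFT convolution is what brings it to O(n log n). Wildcard positions get ri = 0, so they contribute nothing to the sum at any alignment:

```python
import random

def find_with_wildcards(text, pattern, N=None):
    # randomized match with don't-cares ('*'); each non-match position
    # survives as a false positive with probability at most 1/N
    n, k = len(text), len(pattern)
    if N is None:
        N = n * n  # the slides' suggestion, N = n^2
    # R[i] plays the role of r_{i+1}: zero where the mirrored pattern char is '*'
    R = [0 if pattern[k - 1 - i] == '*' else random.randint(1, N)
         for i in range(k)]
    # target fingerprint T = r1*yk + ... + rk*y1 (characters via ord here)
    T = sum(R[i] * ord(pattern[k - 1 - i]) for i in range(k) if R[i])
    X = [ord(c) for c in text]
    matches = []
    for p in range(n - k + 1):
        # convolution entry at index p+k-1: sum_i R[i] * X[p+k-1-i]
        s = sum(R[i] * X[p + k - 1 - i] for i in range(k))
        if s == T:
            matches.append(p)
    return matches
```

For example, find_with_wildcards("abcabd", "a*c") reports a match at position 0, since "abc" fits "a*c"; with a large N the chance of any spurious match is negligible.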

Page 5:

OK, on to machine learning...