Computation of Persistent Homology
Part 2: Efficient Computation of Vietoris–Rips Persistence
Ulrich Bauer
TUM
August 14, 2018
Tutorial on Multiparameter Persistence, Computation, and Applications
Institute for Mathematics and its Applications
http://ulrich-bauer.org/ripser-talk-ima.pdf 1 / 29
Vietoris–Rips persistent homology
Vietoris–Rips filtrations
Consider a finite metric space (X, d).
The Vietoris–Rips complex is the simplicial complex
Ripst(X) = {S ⊆ X ∣ diam S ≤ t}
• 1-skeleton: all edges with pairwise distance ≤ t
• all possible higher simplices (flag complex)
Goal:
• compute persistence barcodes for Hd(Ripst(X)) (in dimensions 0 ≤ d ≤ k)
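As a concrete illustration of the definition above, here is a minimal brute-force sketch (not Ripser's algorithm; the function name and point representation are illustrative) that enumerates the simplices of Ripst(X) by checking diameters:

```python
from itertools import combinations

def rips_complex(points, dist, t, max_dim):
    """Enumerate all simplices of Rips_t up to dimension max_dim.

    A subset S is a simplex iff diam(S), the maximum pairwise
    distance, is <= t; the complex is a flag complex, determined
    entirely by its 1-skeleton."""
    n = len(points)
    simplices = []
    for k in range(1, max_dim + 2):          # k = number of vertices
        for s in combinations(range(n), k):
            diam = max((dist(points[i], points[j])
                        for i, j in combinations(s, 2)), default=0.0)
            if diam <= t:
                simplices.append((diam, s))
    return simplices
```

This exponential enumeration is only feasible for tiny examples; the rest of the talk is about avoiding exactly this kind of explicit construction.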
Demo: Ripser
Example data set:
• 192 points on S2
• persistent homology barcodes up to dimension 2
• over 56 million simplices in the 3-skeleton
Comparison with other software:
• javaplex: 3200 seconds, 12 GB
• Dionysus: 533 seconds, 3.4 GB
• GUDHI: 75 seconds, 2.9 GB
• DIPHA: 50 seconds, 6 GB
• Eirene: 12 seconds, 1.5 GB
Ripser: 1.2 seconds, 152 MB
Ripser
A software for computing Vietoris–Rips persistence barcodes
• about 1000 lines of C++ code, no external dependencies
• support for
• coefficients in a prime field Fp
• sparse distance matrices for distance threshold
• open source (http://ripser.org)
• released in July 2016
• online version (http://live.ripser.org)
• launched in August 2016
• most efficient software for Vietoris–Rips persistence
• computes the H2 barcode for 50 000 random points on a torus
in 136 seconds / 9 GB (using distance threshold)
• 2016 ATMCS Best New Software Award (jointly with RIVET)
Design goals
Goals for previous persistence software projects:
• PHAT [B, Kerber, Reininghaus, Wagner 2013]:
fast persistence computation (matrix reduction only),
comparing different algorithms and data structures
• DIPHA [B, Kerber, Reininghaus 2014]:
distributed persistence computation, based on spectral
sequence algorithm
Goals for Ripser:
• Use as little memory as possible
• Be reasonable about computation time
The four special ingredients
Main improvements to the standard reduction algorithm:
• Clearing inessential columns [Chen, Kerber 2011]
• Computing cohomology [de Silva et al. 2011]
• Implicit matrix reduction
• Apparent and emergent pairs
Lessons from PHAT:
• Clearing and cohomology yield considerable speedup,
• but only when both are used in conjunction!
Filtrations and refinements
Simplexwise filtrations
Call a filtration (Ki)i∈I of simplicial complexes (I totally ordered)
• essential if i ≠ j implies Ki ≠ Kj,
• simplexwise if for all i ∈ I with Ki ≠ ∅ there is some index
j < i in I and some simplex σ ∈ K such that Ki ∖ Kj = {σ}.
Note:
• These properties are natural assumptions for computation.
• The Vietoris–Rips filtration (indexed by R) is not
simplexwise (and not essential).
• To compute Vietoris–Rips persistence, we will reindex and
refine.
Reindexing and refinement
A filtration K ∶ I → Simp is a reindexing of another filtration
F ∶ R → Simp if F = K ○ r for some monotonic map r ∶ R → I.
• If r is injective, we call K a refinement of F;
• if r is surjective, we call K a reduction of F.
The Vietoris–Rips filtration can be reindexed:
• reduced to an essential filtration (over the set of pairwise
distances), and further
• refined to a simplexwise filtration
Ripser uses the lexicographic refinement: simplices ordered by
• diameter, then
• dimension, then
• (decreasing) lexicographic order of vertices {vik, . . . , vi0}
(induced by a fixed total order v0 < ⋅ ⋅ ⋅ < vn−1 on vertices)
Enumerating (co)faces in lexicographic order
There is an order-preserving bijection

[v2, v1, v0] ↦ 0,  [v3, v1, v0] ↦ 1,  [v3, v2, v0] ↦ 2,  [v3, v2, v1] ↦ 3,  ⋯,  [vn−1, vn−2, vn−3] ↦ C(n, k + 1) − 1

from k-simplices (as ordered tuples, ordered lexicographically)
to the natural numbers (called the combinatorial number system).
• One direction is given by
[vik, . . . , vi0] ↦ ∑_{j=0}^{k} C(ij, j + 1)
where C(a, b) denotes the binomial coefficient
• The other direction can also be computed quickly
• Using this bijection, faces and cofaces can be efficiently
enumerated in (decreasing) lexicographic order
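Both directions of this bijection can be sketched in a few lines (function names illustrative; `math.comb` is the binomial coefficient, which is 0 when the upper argument is smaller than the lower one):

```python
from math import comb

def simplex_to_index(vertices):
    """Map a k-simplex, given as a tuple of distinct vertices,
    to its index sum_j C(i_j, j+1), where i_0 < ... < i_k are
    the vertices in increasing order."""
    return sum(comb(v, j + 1) for j, v in enumerate(sorted(vertices)))

def index_to_simplex(idx, k, n):
    """Invert the bijection: recover the k-simplex with the given
    index among vertices 0..n-1, greedily taking the largest
    vertex v with C(v, j+1) <= remaining index."""
    vertices = []
    for j in range(k, -1, -1):
        v = n - 1
        while comb(v, j + 1) > idx:
            v -= 1
        vertices.append(v)
        idx -= comb(v, j + 1)
        n = v                      # remaining vertices must be smaller
    return tuple(vertices)         # decreasing vertex order
```

The greedy decoding is what makes enumeration of (co)faces in decreasing lexicographic order cheap: faces and cofaces of a simplex can be generated directly from its index without materializing the complex.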
Persistence barcodes through reindexing
Consider a filtration F ∶ R → Simp with reindexing F = K ○ r,
K ∶ I → Simp, r ∶ R → [n]. The persistence barcode of K●
determines the persistence barcode of F●:
B(H∗(F●)) = {r−1(I) ≠ ∅ ∣ I ∈ B(H∗(K●))}.
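In practice this translation step is simple: for a Rips filtration, an index interval [i, j) has empty preimage under r exactly when simplices i and j have the same diameter. A sketch (names illustrative):

```python
def reindex_barcode(pairs, diam, essential):
    """Translate index persistence pairs (i, j) of the simplexwise
    refinement into diameter intervals [diam[i], diam[j]).
    Intervals whose preimage under the reindexing map is empty
    (diam[i] == diam[j]) are dropped."""
    bars = [(diam[i], diam[j]) for i, j in pairs if diam[i] != diam[j]]
    bars += [(diam[i], float('inf')) for i in essential]
    return bars
```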
Matrix reduction
Computing homology
Computing homology H∗ = Z∗/B∗:
• compute basis for boundaries B∗ = im ∂∗
• extend to basis for cycles Z∗ = ker ∂∗
• new (non-boundary) basis cycles generate quotient Z∗/B∗
Computing persistent homology H∗ = Z∗/B∗ (for a simplexwise
filtration Ki ⊆ K):
• compute filtered basis for boundaries B∗ = im ∂∗
• extend to basis for cycles Z∗ = ker ∂∗
• all basis cycles generate persistent homology
Persistence by matrix reduction
Given:
• D: matrix of the boundary ∂d for a simplexwise filtration (Ki)i
(for the canonical basis in filtration order, indexed by I × J)
Wanted:
• persistence barcode of homology Hd(Ki; F) for some
(prime) field F, in dimensions d = 0, . . . , k
Notation:
• mj: jth column of M; mij: entry in ith row and jth column
• Pivot mj = min{i ∈ I ∶ mkj = 0 for all k > i}
Computation: barcode is obtained by matrix reduction of D
• R = D ⋅ V reduced (non-zero columns have distinct pivots)
• V is regular upper triangular
Compatible basis cycles
For a reduced boundary matrix R = D ⋅V , call
Ib = {i ∶ ri = 0} birth indices,
Id = {j ∶ rj ≠ 0} death indices,
Ie = Ib ∖ PivotsR essential indices
(note: i = Pivot rj implies ri = 0, thus PivotsR ⊆ Ib). Then
ΣB = {rj ∣ j ∈ Id} is a basis of B∗,
ΣZ = ΣB ∪ {vi ∣ i ∈ Ie} is a compatible basis of Z∗.
• Persistent homology is generated by the basis cycles ΣZ.
• Persistence intervals: {[i, j) ∣ i = Pivot rj} ∪ {[i,∞) ∣ i ∈ Ie}
• Matrix V not used for barcode
• Columns with indices PivotsR not used at all
Matrix reduction algorithm
Require: D: I × J matrix
Ensure: R = D ⋅ V: reduced, V: regular upper triangular,
P: persistence pairs
R ∶= D
V ∶= Id(J × J)
for j ∈ J in increasing order do
    while ∃k < j with i ∶= Pivot rk = Pivot rj do
        λ ∶= rij / rik
        rj ∶= rj − λ ⋅ rk    ▷ eliminate pivot entry rij
        vj ∶= vj − λ ⋅ vk
    if i ∶= Pivot rj ≠ 0 then
        append (i, j) to P
return R, V
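A compact sketch of this reduction over a prime field Fp, with columns stored as sparse dictionaries (this mirrors the pseudocode, not Ripser's optimized implementation):

```python
def reduce_matrix(D, p=2):
    """Standard persistence reduction R = D * V over the field F_p.

    D: list of columns, each a sparse dict {row index: coefficient}.
    Returns the reduced columns R and the persistence pairs P."""
    def pivot(col):
        return max(col) if col else None   # largest nonzero row index

    R = [dict(c) for c in D]
    pivot_col = {}                         # pivot row -> column holding it
    P = []
    for j in range(len(R)):
        while R[j] and pivot(R[j]) in pivot_col:
            i = pivot(R[j])
            k = pivot_col[i]
            lam = (R[j][i] * pow(R[k][i], -1, p)) % p
            # r_j := r_j - lam * r_k, eliminating the pivot entry r_ij
            for row, c in R[k].items():
                val = (R[j].get(row, 0) - lam * c) % p
                if val:
                    R[j][row] = val
                else:
                    R[j].pop(row, None)
        i = pivot(R[j])
        if i is not None:
            pivot_col[i] = j
            P.append((i, j))
    return R, P
```

On the boundary matrix of a filtered triangle (vertices, then edges, then the 2-cell) this recovers the expected pairs: each edge that merges components dies immediately as a birth of a 1-cycle or kills a vertex class, and the 2-cell kills the 1-cycle.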
Clearing
Clearing non-essential positive columns
Idea [Chen, Kerber 2011]:
• Don’t reduce at non-essential birth indices
• Use the fact that i = Pivot rj implies ri = 0
• Reduce boundary matrices of ∂d ∶ Cd → Cd−1
in decreasing dimension d = k + 1, . . . , 1
• Whenever i = Pivot rj (in matrix for ∂d):
    • Set ri to 0 (in matrix for ∂d−1)
    • Set vi to rj (if V is needed)
• Still yields R = D ⋅ V reduced, V regular upper triangular
Note:
• reducing birth columns is typically harder than reducing death
columns: O(j²) (birth) vs. O((j − i)²) (death)
• with clearing: need only reduce essential columns
Counting homology column reductions
Consider the (k + 1)-skeleton of the (n − 1)-simplex (Rips filtration)
Number of columns in the boundary matrix
(C(a, b) denotes the binomial coefficient):

∑_{d=1}^{k+1} C(n, d + 1)   [total]
= ∑_{d=1}^{k+1} C(n − 1, d)   [death]  +  ∑_{d=1}^{k+1} C(n − 1, d + 1)   [birth]
= ∑_{d=1}^{k+1} C(n − 1, d)   [death]  +  C(n − 1, k + 2)   [essential]  +  ∑_{d=1}^{k} C(n − 1, d + 1)   [cleared]

k = 2, n = 192: 56 050 096 = 1 161 471 + 53 727 345 + 1 161 280
• Clearing didn’t help much!
Cohomology
Persistent (co)homology
How is persistent cohomology related to persistent homology?
• Cohomology (over F) is the vector space dual to homology:
Hᵖ(K) ≅ Hp(K)∗ = Hom(Hp(K), F).
• Duality preserves ranks of linear maps (in finite
dimensions): for f ∶ V → W with dual f∗ ∶ W∗ → V∗,
rank f∗ = rank f.
• The persistence barcode of a persistence module is
uniquely determined by the ranks of the internal maps.
Thus, persistent homology and persistent cohomology have
the same barcode [de Silva et al. 2011].
Persistent (relative) homology
How is persistent relative homology related to persistent
homology?
• The short exact sequence of filtered chain complexes
(with coefficients in F)
0 → Cp(K●) → Cp(K) → Cp(K, K●) → 0
induces a long exact sequence in persistent homology
⋯ → Hp+1(K, K●) → Hp(K●) → Hp(K) → Hp(K, K●) → ⋯
and analogously for persistent cohomology.
Persistent (relative) homology barcodes
Consider the long exact sequence

⋯ → Hp+1(K) −r→ Hp+1(K, K●) −∂→ Hp(K●) −i→ Hp(K) → ⋯

We have:
• B(ker(i)) = {[b, d) ∈ B(Hp(K●))} = B(im(∂))
• B(im(i)) = {[b,∞) ∈ B(Hp(K●))}
• Hp(K●) ≅ im(∂) ⊕ im(i)
• B(coker(i)) = {(−∞, b) ∣ [b,∞) ∈ B(Hp(K●))} = B(im(r))
• Hp+1(K, K●) ≅ im(∂) ⊕ im(r)
Thus, the absolute and relative persistence barcodes determine
each other [de Silva et al. 2011; Schmahl 2018].
Cohomology and clearing
• Cohomology allows for clearing starting from dimension 0
(avoiding the initial overhead)
• Persistence in dimension 0 has special algorithms
(Kruskal’s algorithm for MST, union-find data structure)
• Persistent cohomology arises from a reverse filtration
C∗(K0) ↞ C∗(K1) ↞ ⋯ ↞ C∗(Kn)
• Persistent relative cohomology arises from a filtration
C∗(K, K0) ↩ C∗(K, K1) ↩ ⋯ ↩ C∗(K, Kn)
• Can be computed by reduction of coboundary matrix:
transpose of boundary matrix, with reversed index orders
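The dimension-0 special case mentioned above can be sketched with Kruskal's algorithm over a union-find structure (an illustrative sketch, not Ripser's code; in a Rips filtration every vertex enters at time 0, so every finite H0 bar starts at 0):

```python
def h0_barcode(n, edges):
    """Dimension-0 persistence via Kruskal's algorithm: process the
    edges (t, u, v) in order of increasing filtration value t; each
    edge that merges two components ends a bar [0, t). One component
    survives forever, giving a single essential bar [0, inf)."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    bars = []
    for t, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            bars.append((0.0, t))           # finite bar [0, t)
    bars.append((0.0, float('inf')))        # essential class
    return bars
```

Edges that do not merge components (the non-apparent "birth" edges for H1) are simply skipped, which is why this runs in near-linear time after sorting.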
Counting cohomology column reductions
Consider K: the (k + 1)-skeleton of the (n − 1)-simplex, Rips filtration
Number of columns in the coboundary matrix
(C(a, b) denotes the binomial coefficient):

∑_{d=0}^{k} C(n, d + 1)   [total]
= ∑_{d=0}^{k} C(n − 1, d + 1)   [death]  +  ∑_{d=0}^{k} C(n − 1, d)   [birth]
= ∑_{d=0}^{k} C(n − 1, d + 1)   [death]  +  C(n − 1, 0)   [essential]  +  ∑_{d=1}^{k} C(n − 1, d)   [cleared]

k = 2, n = 192: 1 179 808 = 1 161 471 + 1 + 18 336
Observations
For a typical input:
• V has very few off-diagonal entries
• most death columns of D are already reduced
Previous example (k = 2, n = 192):
• Only 79 + 42 + 1 = 122 out of 192 + 18 145 + 1 143 135 = 1 161 471
columns are actually modified in matrix reduction
Implicit matrix reduction
Implicit matrix reduction
Standard approach:
• Boundary matrix D for filtration-ordered basis
• Explicitly generated and stored in memory
• Matrix reduction: store only the reduced matrix R
    • transform D into R by column operations
Approach for Ripser:
• Boundary matrix D for lexicographically ordered basis
• Implicitly defined and recomputed when needed
• Matrix reduction in Ripser: store only the coefficient matrix V
    • recompute previous columns of R = D ⋅ V when needed
• Typically, V is much sparser and smaller than R
• Also used in Dionysus, GUDHI
Implicit boundary matrix reduction algorithm
Require: D: I × J matrix
Ensure: R = D ⋅ V: reduced, V: regular upper triangular,
P: persistence pairs
for j ∈ J in increasing order do
    vj ∶= ej, rj ∶= dj
    while ∃k < j with i ∶= Pivot rj and (i, k) ∈ P do
        λ ∶= rij
        rj ∶= rj − λ ⋅ D ⋅ vk    ▷ eliminate pivot entry rij
        vj ∶= vj − λ ⋅ vk
    if i ∶= Pivot rj ≠ 0 then
        vj ∶= rij⁻¹ ⋅ vj    ▷ make pivot entry rij = 1
        append (i, j) to P
return V
Implementation details

Current (working) columns vj, rj:
• priority queue (heap), comparison-based
• representing a linear combination of row basis elements
• elements: tuples consisting of
  • coefficient
  • row index
  • diameter of corresponding simplex
• pivot entry lazily evaluated (when extracting top element)

Previous (finalized) columns vk (k < j):
• sparse matrix data structure

http://ulrich-bauer.org/ripser-talk-ima.pdf 25 / 29
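The lazy pivot extraction can be sketched with Python's `heapq`. The helper names below are invented for illustration, not Ripser's API; a column is a heap of (row, coefficient) entries, and entries with the same row index are only summed up mod p when the top of the heap is extracted:

```python
import heapq

def push_entry(column, index, coeff):
    # heapq is a min-heap, so negate the row index to keep the
    # youngest (largest-index) entry, the pivot candidate, on top
    heapq.heappush(column, (-index, coeff))

def pop_pivot(column, prime):
    """Extract the pivot (row_index, coefficient) of a heap column.

    Duplicate entries for the same row are accumulated mod `prime`
    only now, lazily; returns None if the column is zero.
    """
    while column:
        neg_index, coeff = heapq.heappop(column)
        while column and column[0][0] == neg_index:
            coeff += heapq.heappop(column)[1]
        if coeff % prime != 0:
            return (-neg_index, coeff % prime)
    return None
```

With this representation, a column operation is just a sequence of pushes; cancellation of entries is deferred until a pivot is actually needed.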
Apparent and emergent pairs

Apparent pairs

Definition
In a simplexwise filtration, (σ, τ) is an apparent pair if
• σ is the youngest face of τ
• τ is the oldest coface of σ

Lemma
The apparent pairs are persistence pairs.

Lemma
The apparent pairs form a discrete gradient.

• Generalizes a construction proposed by [Kahle 2011]
  for the study of random Rips filtrations
• Apparent pairs also appear (independently) in Eirene
  [Henselman 2016] and in [Mendoza-Smith, Tanner 2017]

http://ulrich-bauer.org/ripser-talk-ima.pdf 26 / 29
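For intuition, the definition can be checked directly on a small simplexwise filtration. The sketch below is a hypothetical helper, not part of Ripser; it represents the filtration as a list of frozensets in order of appearance, and uses the fact that the youngest proper face of a simplex is always one of its facets, and the oldest proper coface always one of its cofacets.

```python
def apparent_pairs(filtration):
    """Return the apparent pairs of a simplexwise filtration.

    `filtration` lists the simplices (frozensets of vertices) in
    order of appearance, so "older" means a smaller list index.
    """
    index = {s: i for i, s in enumerate(filtration)}
    pairs = []
    for tau in filtration:
        if len(tau) < 2:          # vertices have no proper faces
            continue
        # youngest proper face of tau (always one of its facets)
        sigma = max((tau - {v} for v in tau), key=lambda f: index[f])
        # oldest proper coface of sigma (always one of its cofacets)
        cofacets = [t for t in filtration
                    if len(t) == len(sigma) + 1 and sigma < t]
        if min(cofacets, key=lambda t: index[t]) == tau:
            pairs.append((sigma, tau))
    return pairs
```

On the filtration vertex 0, vertex 1, edge {0, 1}, the only apparent pair is the younger vertex {1} with the edge: {1} is the youngest face of {0, 1}, and {0, 1} is the oldest (indeed only) coface of {1}.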
Emergent persistent pairs

A persistence pair (σ, τ) for a simplexwise filtration is
• an emergent face pair if σ is the youngest proper face of τ,
• an emergent coface pair if τ is the oldest proper coface of σ.

Lemma
Consider the lexicographically refined Rips filtration. Assume that
• τ is the lexicographically minimal proper coface of σ with
  diam(τ) = diam(σ), and
• τ is not already in a persistence pair (ρ, τ) with ρ ≻ σ.
Then (σ, τ) is a 0-persistence emergent coface pair.

• Includes all apparent pairs with persistence 0
• Can be identified without enumerating all cofaces of σ
  (shortcut for computation)

http://ulrich-bauer.org/ripser-talk-ima.pdf 27 / 29
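The shortcut can be sketched as follows (names and calling convention invented for illustration): while the coboundary column of σ is being assembled by enumerating cofacets in lexicographic order, the first cofacet of equal diameter encountered is automatically the lexicographically minimal one, so the pair can be emitted as soon as it appears, before the remaining cofacets are enumerated.

```python
def emergent_pair_shortcut(sigma_diam, cofacets_lex, paired):
    """Return tau if (sigma, tau) is a 0-persistence emergent coface pair.

    `cofacets_lex` yields (tau, diam(tau)) in lexicographic order, and
    `paired` contains the cofaces already used in a persistence pair.
    The first cofacet with diam(tau) == diam(sigma) is the
    lexicographically minimal one; the lemma applies only to it.
    """
    for tau, diam in cofacets_lex:
        if diam == sigma_diam:
            return tau if tau not in paired else None
    return None
```

In a full reduction this check is interleaved with building the column, so a successful shortcut avoids enumerating the remaining cofacets entirely.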
Speedup from emergent pairs shortcut

Previous example (k = 2, n = 192):

• Only 36 672 + 164 214 + 3 392 039 = 3 592 925 out of
  36 672 + 3 447 550 + 216 052 515 = 219 536 737 nonzero
  entries of the coboundary matrix are actually visited

Using implicit reduction (boundary matrix columns may be
revisited multiple times):

• Only 155 474 + 253 134 + 7 500 332 = 7 908 940 out of
  155 474 + 3 536 470 + 220 160 808 = 223 852 752 nonzero
  entries are actually visited

Speedup: 1.2 s vs. 5.6 s (factor 4.7)

http://ulrich-bauer.org/ripser-talk-ima.pdf 28 / 29
Ripser Live: users from 567 different cities
http://ulrich-bauer.org/ripser-talk-ima.pdf 29 / 29