
Page 1: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Luke Hunsberger

Computer Science Department, Vassar College, Poughkeepsie, NY USA

November 2018

An earlier version of this presentation, "Temporal Networks for Dynamic Scheduling", was given by Luke Hunsberger at the 1st Summer School on Cognitive Robotics, MIT, 2017. Additional material was added by

Roberto Posenato and Luke Hunsberger

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 1 / 182

Page 2: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Outline

1 Introduction

2 Simple Temporal Networks (STNs)

3 Simple Temporal Networks with Uncertainty (STNUs)

4 Conditional Simple Temporal Networks (CSTNs)

5 Conditional STNs with Uncertainty (CSTNUs)

6 CSTNU with Disjunction (CDTNUs)

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 2 / 182

Page 3: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Networks (STN)

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 3 / 182

Page 4: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Motivating Example

Given a plan of flying from New York to Rome and back. Determine if the plan is consistent. If it is consistent, determine a temporal schedule.

Fly from New York to Rome and back:
Leave New York after 4 p.m., June 8
Return to New York before 10 p.m., June 18
Away from New York no more than 7 days
In Rome at least 5 days
Return flight lasts no more than 7 hours

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 4 / 182

Page 5: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network (STN)∗: Description

[Figure: expressiveness hierarchy of temporal network classes: STN (Y − X ≤ δ); STNU (actions with uncertain durations); DTN (disjunctive constraints); CSTN (test actions, test results); CSTNU; DTNU; CDTN; CDTNU; each extension is more expressive.]

STN = Time-points and simple temporal constraints.

Flexible: Time-points may "float"; not "nailed down" until they are executed.
Efficient algorithms for determining consistency, managing real-time execution, and managing new constraints.

∗ [Dechter et al., 1991]
Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 5 / 182

Page 6: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network∗: Definition

Definition 1 (Simple Temporal Network)
A Simple Temporal Network (STN) is a pair, S = (T , C), where:

T is a set of real-valued variables called time-points; and
C is a set of binary constraints, each of the form:

Y − X ≤ δ

where X ,Y ∈ T and δ ∈ R.

∗ [Dechter et al., 1991]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 6 / 182
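The definition translates directly into code. Below is a minimal sketch (not from the slides) of an STN as a dictionary of weighted edges, where each constraint Y − X ≤ δ is stored as an edge from X to Y; the class and method names are illustrative.

from typing import Dict, Hashable

class STN:
    """Minimal STN: time-points plus binary constraints Y - X <= delta."""

    def __init__(self):
        # edges[X][Y] = delta  encodes the constraint  Y - X <= delta
        self.edges: Dict[Hashable, Dict[Hashable, float]] = {}

    def add_time_point(self, x) -> None:
        self.edges.setdefault(x, {})

    def add_constraint(self, x, y, delta: float) -> None:
        """Record Y - X <= delta, keeping only the tightest bound."""
        self.add_time_point(x)
        self.add_time_point(y)
        self.edges[x][y] = min(self.edges[x].get(y, float("inf")), delta)

    def satisfied_by(self, schedule: Dict[Hashable, float]) -> bool:
        """Check whether a complete assignment satisfies every constraint."""
        return all(schedule[y] - schedule[x] <= d
                   for x, nbrs in self.edges.items()
                   for y, d in nbrs.items())

The later slides (distance matrix, Floyd-Warshall, Dijkstra) can all be run on top of this edge dictionary.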

Page 7: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: The Zero Time-Point, Z

A special time-point, Z, whose value is fixed at 0.
Binary constraints involving Z are equivalent to unary constraints.

Example 1

X − Z ≤ 7 ⇐⇒ X ≤ 7

Z − X ≤ −3 ⇐⇒ X ≥ 3

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 7 / 182

Page 8: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Basic Notions for STNs

A solution to an STN S = (T , C) is a complete set of assignments to the time-points in T :

X1 = w1, X2 = w2, . . . , Xn = wn

that together satisfy all of the constraints in C.

An STN with at least one solution is consistent.
STNs with identical solution sets are equivalent.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 8 / 182

Page 9: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: STN for Travel Example

[Timeline figure: X1 (NYC → Rome), X2–X3 (in Rome), X4 (Rome → NYC)]

T = {Z, X1, X2, X3, X4}, where Z = Noon, June 8

C =

Z− X1 ≤ −4 (Leave NYC after 4 p.m., June 8)

X4 − Z ≤ 250 (Return NYC by 10 p.m., June 18)

X4 − X1 ≤ 168 (Gone no more than 7 days)

X2 − X3 ≤−120 (In Rome at least 5 days)

X4 − X3 ≤ 7 (Return flight less than 7 hrs)

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 9 / 182

Page 10: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Graph for an STN∗

The graph for an STN, S = (T , C), is a graph, G = (T , E), where:

Time-points in S ⇐⇒ nodes in G

Constraints in C ⇐⇒ edges in E:

Y − X ≤ δ ⇐⇒ edge X →(δ) Y

∗ [Dechter et al., 1991]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 10 / 182

Page 11: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Graphical Representation

Constraint(s), edge(s), and interval notation:

Y − X ≤ 7: edge X →(7) Y; interval notation X →((−∞, 7]) Y

X − Y ≤ −3 (equiv: Y − X ≥ 3): edge Y →(−3) X; interval notation X →([3, +∞)) Y

3 ≤ Y − X ≤ 7: edges X →(7) Y and Y →(−3) X; interval notation X →([3, 7]) Y

4 ≤ X ≤ 9: edges Z →(9) X and X →(−4) Z; interval notation Z →([4, 9]) X

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 11 / 182

Page 12: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Graph for Airline Scenario

Z − X1 ≤ −4,  X4 − Z ≤ 250,
X4 − X1 ≤ 168,  X2 − X3 ≤ −120,
X4 − X3 ≤ 7,  X1 − X2 ≤ 0,
X3 − X4 ≤ 0

[Graph figure: nodes Z, X1, X2, X3, X4 with edge weights 250, −4, 168, 0, −120, 7, 0 corresponding to the constraints above.]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 12 / 182

Page 13: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Implicit Constraints

Explicit constraints combine (propagate) to form implicit constraints:

  X − W ≤ 30
+ Y − X ≤ 40
------------
  Y − W ≤ 70

[Graph figure: W →(30) X →(40) Y, with the implied edge W →(70) Y.]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 13 / 182

Page 14: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Chains of Constraints as Paths

Chains of constraints correspond to paths in the graph.
Stronger constraints correspond to shorter paths.

[Graph figure: two paths from Xi to Xj; the shorter path represents the stronger implicit constraint.]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 14 / 182

Page 15: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Distance Matrix∗

Definition 2 (Distance Matrix)
The Distance Matrix for an STN, S = (T , C), is a matrix D defined by:

D(X ,Y ) = length of the shortest path from X to Y in the graph for S

The strongest implicit constraint on X and Y in S is:

Y − X ≤ D(X ,Y )

∗ [Dechter et al., 1991]
Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 15 / 182

Page 16: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Travel Scenario’s Distance Matrix

[Graph figure: the airline-scenario graph from the previous slide.]

D     Z     X1    X2    X3    X4
Z     0     130   130   250   250
X1    -4    0     48    168   168
X2    -4    0     0     168   168
X3    -124  -120  -120  0     7
X4    -124  -120  -120  0     0

Gray cells correspond to explicit edges.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 16 / 182

Page 17: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Fundamental Theorem of STNs∗

For an STN S, with graph G, and distance matrix D, the following are equivalent:

S is consistent
D has non-negative values down its main diagonal
G has no negative-length loops

∗ [Dechter et al., 1991; Hunsberger, 2014]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 17 / 182

Page 18: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Finding a solution for a consistent STN

D has all necessary information.

Time window for any X : [−D(X ,Z), D(Z,X )]

Two easy-to-find solutions:
Earliest-times solution: X1 = −D(X1,Z), X2 = −D(X2,Z), . . . , Xn = −D(Xn,Z);
Latest-times solution: X1 = D(Z,X1), X2 = D(Z,X2), . . . , Xn = D(Z,Xn).

Simple algorithm to find a different solution:
Pick any time-point that doesn't yet have a value;
Give it a value from its time-window;
Update D; ⇐ expensive . . .
Repeat until all time-points have values.

∗ [Dechter et al., 1991]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 18 / 182
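As a small illustration (not from the slides), the earliest-times and latest-times solutions can be read directly off the distance matrix; here D is assumed to be a nested dict and "Z" the zero time-point.

def earliest_and_latest(D, time_points, z="Z"):
    """Return (earliest, latest) solutions from an APSP matrix D.

    D[x][y] is the shortest-path length from x to y; the time window for X
    is [-D[X][Z], D[Z][X]], so the two extreme solutions are:
    """
    earliest = {x: -D[x][z] for x in time_points}
    latest = {x: D[z][x] for x in time_points}
    return earliest, latest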

Page 19: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Computing D from Scratch∗

D is the All-Pairs, Shortest-Paths (APSP) Matrix for G.

If S has n time-points and m constraints:
Floyd-Warshall Algorithm: O(n3)
Johnson's Algorithm: O(n2 log n + mn) (uses Bellman-Ford and Dijkstra)

∗ [Cormen et al., 2001]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 19 / 182

Page 20: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Floyd-Warshall Algorithm∗

[Figure: triangle of nodes Xi, Xr, Xj comparing D(Xi,Xr) + D(Xr,Xj) against D(Xi,Xj).]

Initialize D(_,_) using edge-weights
for r = 1 to n
  for i = 1 to n
    for j = 1 to n
      D(Xi,Xj) := min{ D(Xi,Xj), D(Xi,Xr) + D(Xr,Xj) }

If a shortest path from U to V contains Xr as an interior point, then after the r-th round, that shortest path can ignore Xr.

∗ [Cormen et al., 2001]
Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 20 / 182
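The pseudocode above translates almost line-for-line into Python. This is a sketch under the assumption that the graph is stored as edges[x][y] = weight; is_consistent applies the Fundamental Theorem from the earlier slide.

def floyd_warshall(edges, nodes):
    """APSP matrix for an STN graph given as edges[x][y] = weight of x -> y."""
    INF = float("inf")
    D = {u: {v: (0 if u == v else edges.get(u, {}).get(v, INF))
             for v in nodes} for u in nodes}
    for r in nodes:                      # allow r as an interior node
        for i in nodes:
            for j in nodes:
                if D[i][r] + D[r][j] < D[i][j]:
                    D[i][j] = D[i][r] + D[r][j]
    return D

def is_consistent(D):
    """Fundamental Theorem: consistent iff no negative entry on the diagonal."""
    return all(D[x][x] >= 0 for x in D)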

Page 21: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Bellman-Ford Algorithm∗

[Figure: a virtual source S with zero-weight edges to X1, . . . , X5.]

for each X, d(X) := 0
for i = 1 to (n−1),
  for each edge (U, δ, V) in graph,
    d(V) := min{ d(V), d(U) + δ }
for each edge (U, δ, V) in graph,
  if (d(V) > d(U) + δ) return false
return true

∗ [Cormen et al., 2001]
Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 21 / 182
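The same procedure in Python (a sketch; the edge-list representation is an assumption). Initializing every distance to 0 plays the role of the virtual source S in the figure.

def bellman_ford_consistent(edges, nodes):
    """Return (consistent?, d) for an STN whose edges are (u, delta, v) triples.

    When the STN is consistent, d is a solution / potential function.
    """
    d = {x: 0 for x in nodes}            # virtual source with 0-edges to all nodes
    for _ in range(len(nodes) - 1):
        for u, delta, v in edges:
            if d[u] + delta < d[v]:
                d[v] = d[u] + delta
    for u, delta, v in edges:            # any further improvement => negative loop
        if d[v] > d[u] + delta:
            return False, None
    return True, d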

Page 22: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Dijkstra's SSSP Algorithm∗ (Single-Source Shortest-Paths)

Works on STN graphs with non-negative edges.
For a given source node S, computes D(S,X ) for all X.
O(m + n log n) if using a Fibonacci heap for the priority queue.

Let D(S,X ) := ∞ for all X, except D(S,S) := 0
Let Q be an empty priority queue
Insert S into Q with priority 0
while Q non-empty,
  U := ExtractMinFrom(Q)
  for each successor edge (U, δ, V ),
    D(S,V ) := min{ D(S,V ), D(S,U) + δ }
return D

∗ [Cormen et al., 2001]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 22 / 182
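A runnable version using Python's heapq as the priority queue (lazy deletion instead of decrease-key); the dict-of-dicts graph representation is an assumption carried over from the earlier sketches.

import heapq

def dijkstra(edges, nodes, s):
    """Shortest distances from s in a graph with non-negative edge weights."""
    INF = float("inf")
    dist = {x: INF for x in nodes}
    dist[s] = 0
    pq = [(0, s)]
    while pq:
        d_u, u = heapq.heappop(pq)
        if d_u > dist[u]:
            continue                     # stale queue entry
        for v, delta in edges.get(u, {}).items():
            if d_u + delta < dist[v]:
                dist[v] = d_u + delta
                heapq.heappush(pq, (dist[v], v))
    return dist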

Page 23: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Dijkstra Example

[Figure sequence (pages 23–33): a step-by-step run of Dijkstra's algorithm from source S on a small graph with nodes S, X, Y, W, V, R. All labels start at ∞ except S[0]; relaxing the edges out of each extracted node progressively tightens the labels (e.g., X[3], V[1], R[5]; then Y[7], later improved to Y[4]; then W[12], later improved to W[7]). Final distances: S[0], X[3], Y[4], W[7], V[1], R[5].]

Page 34: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Potential Functions & Re-weighted Graphs

Let S = (T , C) be any consistent STN.

Let f : T → R be any solution for S.

Then f (Y )− f (X ) ≤ δ for each constraint (Y − X ≤ δ) ∈ C

In other words, 0 ≤ f (X ) + δ − f (Y )

Let C′ = {(X , δ′, Y ) | (X , δ, Y ) ∈ C}, where δ′ = f (X ) + δ − f (Y )

Then S ′ = (T , C′) has only non-negative edges.

Therefore, can use Dijkstra’s SSSP algorithm on S ′

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 34 / 182

Page 35: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Johnson’s Algorithm∗

Given: an STN, S = (T , C)

Run Bellman-Ford SSSP with a new source node S.
Then f : T → R given by f (X ) = D(S,X ) is a solution for S.

Let S ′ = (T , C′) be the re-weighted graph based on f :

δ′ = f (X ) + δ − f (Y ) ≥ 0 for each (X , δ, Y ) ∈ C.

For each X ∈ T , run Dijkstra on S ′ with X as the source node (computes D′(X ,Y ) for all Y ∈ T ).

Reverse the re-weighting to obtain D for S:
D(X ,Y ) = −f (X ) + D′(X ,Y ) + f (Y ).

Complexity: O(mn) + n ∗O(m + n log n) = O(mn + n2 log n)

∗ [Cormen et al., 2001]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 35 / 182
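Composing the previous sketches gives Johnson's algorithm for an STN. This sketch reuses the bellman_ford_consistent and dijkstra functions defined above (both illustrative, not the slides' code).

def johnson(edges_dict, nodes):
    """APSP matrix D for an STN with edges_dict[x][y] = delta; None if inconsistent."""
    triples = [(u, d, v) for u, nbrs in edges_dict.items() for v, d in nbrs.items()]
    ok, f = bellman_ford_consistent(triples, nodes)
    if not ok:
        return None                       # negative loop: the STN is inconsistent
    # Re-weight: delta' = f(X) + delta - f(Y) >= 0 for every edge.
    rw = {u: {v: f[u] + d - f[v] for v, d in nbrs.items()}
          for u, nbrs in edges_dict.items()}
    D = {}
    for x in nodes:
        dx = dijkstra(rw, nodes, x)
        # Reverse the re-weighting: D(X,Y) = -f(X) + D'(X,Y) + f(Y).
        D[x] = {y: -f[x] + dx[y] + f[y] for y in nodes}
    return D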

Page 36: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Inserting a new constraint/edge

Result of inserting a new constraint, Y − X ≤ δ:

[Figure: a number line for δ with boundaries −D(Y,X ) and D(X ,Y ): values below the lower boundary are Inconsistent; the lower boundary itself is Consistent, Rigid; values strictly between the boundaries are Consistent, Non-rigid; values at or above the upper boundary are Consistent, Redundant.]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 36 / 182

Page 37: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Incrementally updating D

Given a consistent STN S = (T , C) with distance matrix D.

Insert a new constraint (Y − X ≤ δ).

How to update D?

Re-compute from scratch: O(mn + n2 log n).

"Naïve" algorithm, O(n2): For each U,V ∈ T ,

D(U,V ) := min{ D(U,V ), D(U,X ) + δ + D(Y ,V ) }

[Figure: the path U ⇝ X →(δ) Y ⇝ V compared against the direct distance D(U,V ).]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 37 / 182
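The naïve O(n²) update as a Python sketch, again over a nested-dict matrix D.

def naive_update(D, x, y, delta):
    """Update the APSP matrix D in place after inserting constraint Y - X <= delta."""
    if delta < -D[y][x]:
        raise ValueError("new constraint makes the STN inconsistent")
    for u in D:
        for v in D:
            through_new_edge = D[u][x] + delta + D[y][v]
            if through_new_edge < D[u][v]:
                D[u][v] = through_new_edge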

Page 38: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Networks: Incremental Update Algorithm∗

Propagate updates to D along edges in the graph.

Only propagate along tight edges.
(Y − X ≤ δ) is tight iff D(X ,Y ) = δ.

Phase I: propagate forward

Phase II: propagate backward

Checks no more than b ∗ ∆ cells of D, where:
∆ = number of cells needing updating;
b = max. number of edges incident to any node.

∗ [Even and Gazit, 1985; Ramalingam and Reps, 1996; Rohnert, 1985]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 38 / 182

Page 39: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Incremental Update Algorithm: Propagate forward from XY as long as new values are obtained

[Figure sequence (pages 39–44): after tightening the edge from X to Y, D(X,Y) drops from 5 to 2; following the tight outgoing edges, D(X,A) then drops from 9 to 6 (stronger), and D(X,B) from 17 to 14 (stronger); the next candidate value, 20 for C, is weaker than the current D(X,C) = 18, so forward propagation stops along this path. Numbers in brackets are current values of D(X, _).]

Page 45: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Incremental Update Algorithm: For each entry D(X, _) that was updated, propagate backward from X

[Figure sequence (pages 45–49): since D(X,B) was updated to 14, backward propagation follows X's incoming edges: D(W,B) drops from 30 to 21 (stronger), then D(V,B) from 34 to 27 (stronger); the next candidate value, 30 for U, is weaker than the current D(U,B) = 29, so propagation stops along this path. Numbers in brackets are D(_, B) values.]

Page 50: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Incremental Consistency

Verifying consistency of an STN after inserting, weakening or deleting a constraint is less expensive than fully updating the distance matrix.∗

Algorithm maintains/updates a solution to the STN.

After inserting a new constraint (or strengthening an existingone), can verify consistency in O(m + n log n) time.

After deleting or weakening a constraint, only need constanttime, because the same solution will work for the modified STN.

∗ [Ramalingam et al., 1999]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 50 / 182

Page 51: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Executing an STN in real time

[Figure sequence (pages 51–55): an STN over Z, B, C, D is executed by repeatedly assigning a value to some time-point and updating the APSP graph.]

First, form the APSP graph (equivalently, compute D).
Time Windows: B ∈ [5,26], C ∈ [2,28], D ∈ [9,30]

Next, select D = 20; and update the APSP graph.
Remaining Time Windows: B ∈ [5,16], C ∈ [2,18]

Next, select B = 10; and update the APSP graph.
Remaining Time Windows: C ∈ [7,16]

Finally, select C = 9; and update the APSP graph.
Easy to verify that this is a solution.

Page 56: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Problems that can occur when executing an STN in real time

May need to go back in time:
Pick D = 20, then after updating, go back in time to pick B = 10
(i.e., no relationship to real-time execution)

Expensive to update D

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 56 / 182

Page 57: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Executing an STN in real time

Only execute enabled time-points (i.e., those having no outgoing negative edges to unexecuted time-points).

Focus updating on entries involving Z: reduces cost to linear time per update, O(n2) overall.∗

Alternatively, prior to execution, transform the STN into dispatchable form in O(n2 log n + nm) time; then during execution, only need to propagate bounds to neighboring time-points.†

† [Muscettola et al., 1998], † [Tsamardinos et al., 1998]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 57 / 182

Page 58: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Dispatchable STN

An STN S is dispatchable if the following algorithm necessarily successfully executes S:
1  t := 0 (curr. time); X := {} (executed); E := {Z} (enabled);
2  Remove any X ∈ E such that t is in X 's time window;
3  Execute: set X := t , and add X to X ;
4  Propagate: propagate t ≤ X ≤ t to X 's immediate neighbors;
5  Update: update E to include all Y ∈ T \ X for which all negative edges emanating from Y have a destination in X ;
6  Wait: wait until t has advanced to some time between min{lb(W ) | W ∈ E} and min{ub(W ) | W ∈ E};
7  If X ≠ T , go back to (2); else done.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 58 / 182
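A simulation-style sketch of this dispatcher (an illustration, not the slides' algorithm): real time is modelled by jumping t to the earliest lower bound among enabled time-points, and propagation touches only the executed point's immediate neighbors.

def dispatch(edges, nodes, horizon=1e9):
    """Greedy dispatcher for a *dispatchable* STN (edges[u][v] = weight of edge u -> v)."""
    lb = {x: 0.0 for x in nodes}
    ub = {x: horizon for x in nodes}
    incoming = {v: [(u, w) for u in nodes
                    for vv, w in edges.get(u, {}).items() if vv == v] for v in nodes}
    executed, schedule, t = set(), {}, 0.0

    def enabled(x):   # no outgoing negative edge to an unexecuted time-point
        return all(v in executed for v, w in edges.get(x, {}).items() if w < 0)

    while len(executed) < len(nodes):
        frontier = [x for x in nodes if x not in executed and enabled(x)]
        if not frontier:
            raise RuntimeError("no enabled time-point: the STN was not dispatchable")
        x = min(frontier, key=lambda y: lb[y])   # wait until the earliest lower bound
        t = max(t, lb[x])
        if t > ub[x]:
            raise RuntimeError("missed a time window")
        schedule[x] = t
        executed.add(x)
        for v, w in edges.get(x, {}).items():    # v - x <= w  =>  ub(v) <= t + w
            ub[v] = min(ub[v], t + w)
        for u, w in incoming[x]:                 # x - u <= w  =>  lb(u) >= t - w
            lb[u] = max(lb[u], t - w)
    return schedule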

Page 59: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Making STN Dispatchable

Start with the APSP graph (it is necessarily dispatchable):

[Figure: the fully connected APSP graph over Z, B, C, D.]

Then remove dominated edges . . .

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 59 / 182

Page 60: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Dominated Edges

A negative edge AC is dominated by a negative edge AB if D(A,B) + D(B,C) = D(A,C):

[Figure: A →(−8) B →(2) C, with dominated edge A →(−6) C.]

AB and AC have the same source node: A.
During execution, it is not necessary to propagate along dominated edges (e.g., AC).

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 60 / 182

Page 61: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Remove Dominated Edges (ctd.)

A non-negative edge AC is dominated by a non-negative edge BC if D(A,B) + D(B,C) = D(A,C):

[Figure: A →(2) B →(9) C, with dominated edge A →(11) C.]

BC and AC have the same destination node: C.
During execution, it is not necessary to propagate along dominated edges (e.g., AC).

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 61 / 182
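The two domination rules suggest a filtering pass over the APSP matrix. The sketch below is an assumption about how such a pass might look; it ignores the tie-breaking needed when two edges dominate each other, which a full implementation must handle.

def undominated_edges(D):
    """Keep only the APSP edges not dominated per the two rules above (simplified)."""
    INF = float("inf")
    result = {a: {} for a in D}
    for a in D:
        for c in D:
            w = D[a][c]
            if a == c or w == INF:
                continue
            dominated = False
            for b in D:
                if b in (a, c) or D[a][b] == INF or D[b][c] == INF:
                    continue
                if D[a][b] + D[b][c] != w:
                    continue
                if w < 0 and D[a][b] < 0:        # negative AC dominated by negative AB
                    dominated = True
                if w >= 0 and D[b][c] >= 0:      # non-negative AC dominated by non-negative BC
                    dominated = True
            if not dominated:
                result[a][c] = w
    return result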

Page 62: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Making STN Dispatchable (ctd.)

Remove "dominated" edges:∗

[Figure: the APSP graph over Z, B, C, D with its dominated edges removed.]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 62 / 182

Page 63: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: Dispatching the STN

[Figure sequence (pages 63–71): the dispatcher executes the sparse, dispatchable graph over Z, B, C, D step by step.]

Initially: t = 0, X = {}, E = {Z}.
Remove Z from E. Set Z = 0. Add Z to X.
Propagate Z = 0 to its neighbors: X = {Z}, E = {B, C}; B ∈ [5, 26], C ∈ [2, 28], D ∈ [0, 30].

Let t advance to 12; pick B from E; set B = 12.
Propagate B = 12 to its neighbors: X = {Z, B}, t = 12, E = {C}; C ∈ [12, 18], D ∈ [16, 30].

Let t advance to 16; pick C from E; set C = 16.
Propagate C = 16 to C's only remaining neighbor, D: X = {Z, B, C}, t = 16, E = {D}; D ∈ [18, 30].

Let t advance to 25; pick D from E; set D = 25.
X = {Z, B, C, D}, t = 25, E = {}.

Solution: Z = 0, B = 12, C = 16, D = 25.

Easy to check that Z = 0, C = 20, B = 23, D = 28 can also be generated by the dispatcher.

Page 72: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: New View of Dispatchability∗

(1) A path P has the prefix/postfix (PP) property if:

Every proper prefix of P has non-negative length, and

Every proper postfix of P has negative length.

[Figure: a five-node path with edge lengths 5, −2, 6, −12; its proper prefixes have lengths 5, 3, 9 (all non-negative) and its proper postfixes have lengths −12, −6, −8 (all negative).]

∗ [Morris, 2014]

Page 74: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: New View of Dispatchability (ctd)

(2) An STN is PP-complete if for each shortest path from any X to any Y that has the prefix/postfix property, there is an edge from X to Y with the same length.

[Figure: the same path, with its PP-property shortest paths short-circuited by explicit edges (e.g., edges of lengths 3, −6, −3).]

(3) A consistent and PP-complete STN is dispatchable.

∗ [Morris, 2014]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 74 / 182

Page 75: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Network: More on Dispatchability

Further graphical analyses of the dispatchability of STNs have been presented recently [Morris, 2016].

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 75 / 182

Page 76: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Additional Research on STNs

Temporal Decoupling Problem (TDP) [Hunsberger, 2002; Boerkoel Jr. and Durfee, 2013; Mountakis et al., 2017]

APSP algorithms on chordal graphs [Xu and Choueiry, 2003]

Enforcing partial path consistency [Planken, 2008]

Incorporating STNs in multi-agent auctions [Hunsberger and Grosz, 2000]

For further info: http://www.cs.vassar.edu/~hunsberg/__papers__/
(See pages 22-24, 27-28, 32, 53-70, 76-79 of my 2005 AAMAS tutorial.)

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 76 / 182

Page 77: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

STN Summary

STNs have been used to provide flexible planning and scheduling systems for more than a decade.

Efficient algorithms for checking consistency, incrementally updating the APSP matrix, and managing execution in real time for maximum flexibility.

However, STNs cannot represent uncertainty (e.g., actions with uncertain durations) or conditional constraints (e.g., only do X if a test result is negative).

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 77 / 182

Page 78: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Simple Temporal Networks with Uncertainty: Motivation

You may control when an action starts, but not how long it lasts:
taxi ride, bus ride, baseball game, medical procedure.

Although their durations may be uncertain, they are often within known bounds.
Such actions can be represented by contingent links in a temporal network . . .

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 78 / 182

Page 79: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

STN with Uncertainty∗

[Figure: the expressiveness hierarchy again, with STNU (actions with uncertain durations) extending STN.]

An STNU is a triple, S = (T , C, L), where:

(T , C) is an STN

L is a set of contingent links, each of the form (A, x, y, C):

A is the activation time-point.
C is the contingent time-point.
Duration bounded: C − A ∈ [x, y], but uncontrollable.

∗ [Morris et al., 2001]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 79 / 182
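Extending the earlier STN sketch, an STNU adds a list of contingent links. The dataclasses below are illustrative (names are assumptions, not the slides' notation); adding a contingent link also records the implied ordinary bounds x ≤ C − A ≤ y.

from dataclasses import dataclass, field
from typing import Dict, Hashable, List

@dataclass
class ContingentLink:
    activation: Hashable      # A: controlled by the agent
    lower: float              # x
    upper: float              # y
    contingent: Hashable      # C: duration C - A in [x, y], chosen by "nature"

@dataclass
class STNU:
    edges: Dict[Hashable, Dict[Hashable, float]] = field(default_factory=dict)   # ordinary constraints
    links: List[ContingentLink] = field(default_factory=list)                    # contingent links

    def add_contingent_link(self, a, x, y, c):
        self.links.append(ContingentLink(a, x, y, c))
        # The link also implies the ordinary bounds x <= C - A <= y.
        self.edges.setdefault(a, {})[c] = y       # C - A <= y
        self.edges.setdefault(c, {})[a] = -x      # A - C <= -x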

Page 80: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

STN with Uncertainty: STNU Graph

Nodes and edges as in an STN graph:

Y − X ∈ [3,7] ⇐⇒ edges X →(7) Y and Y →(−3) X ⇐⇒ X →([3,7]) Y

Contingent Links ⇐⇒ Labeled Edges∗

(A, 3, 7, C) ⇐⇒ lower-case edge A →(c:3) C and upper-case edge C →(C:−7) A ⇐⇒ A →([3,7]) C

Labeled edges represent uncontrollable possibilities.

The lower-case edge, A →(c:3) C, represents the uncontrollable possibility that C − A might equal 3.

The upper-case edge, C →(C:−7) A, represents the uncontrollable possibility that C − A might equal 7.

∗ [Morris and Muscettola, 2005]
Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 80 / 182

Page 81: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

STN with Uncertainty: STNU Example

[Figure (pages 81–84): contingent link (A, 2, 9, C), drawn as A →(c:2) C and C →(C:−9) A; Z and A linked by [0, 0]; ordinary edge B →(5) C encoding C − B ≤ 5 (i.e., B ≥ C − 5).]

If A = 0, when is it safe to execute B?

If A = 0 and B = 2, then a problem arises if C > 7.

If A = 0 and B ≥ 4, then no problems!

If A = 0 and C = 3, then B ≥ 3 is no problem!

Page 85: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

STN with Uncertainty: Dynamic Controllability (DC)

An STNU is dynamically controllable (DC) if:

there exists a dynamic strategy . . .

for executing the non-contingent time-points . . .

such that all of the constraints will be satisfied . . .
no matter how the contingent durations turn out.

A dynamic strategy can react to contingent executions.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 82 / 182

Page 86: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

STN with Uncertainty: STNU Example (ctd.)

[Figure (pages 86–87): the same example, now with a wait edge B →(C:−4) A added.]

Strategy: While C is unexecuted, B must wait at least 4 after A.

Wait: B →(C:−x) A is a wait constraint when the source node (B) is different from the upper-case label (C).

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 83 / 182

Page 88: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Dynamic Controllability, Original MMV Semantics∗: Schedules, situations, and strategies

Schedule: Assigns times to all time-points

Situation: Specifies durations of all contingent links

Strategy: Mapping from (complete) situation to (complete) schedule

But execution happens incrementally, based on the current, partial view of the "real" situation.

Therefore, need to carefully restrict strategies to ensure that their decisions do not depend on advance knowledge of future events.

∗ [Morris et al., 2001]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 84 / 182

Page 89: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Dynamic Controllability: Schedules∗

ψ : T → R is a schedule; and Ψ is the set of all schedules for S.

[ψ]X = time of X in schedule ψ

[Figure: a timeline showing execution times 0, 4, 6, 11, 16, 19 for the time-points Z, W, A, X, C, Y.]

[ψ]Z = 0, [ψ]W = 4, [ψ]A = 6, [ψ]X = 11, . . .

∗ [Morris et al., 2001]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 85 / 182

Page 90: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Dynamic Controllability: Pre-histories∗

For any t ∈ R, [ψ]<t = the pre-history of ψ relative to time t .†
It specifies the durations of all contingent links that finished before time t in the schedule ψ.

[Figure: the same timeline, with the contingent link from A to C having duration C − A = 10.]

[ψ]<5 = ∅, [ψ]<12 = ∅, [ψ]<19 = {(C − A = 10)}

∗ [Morris et al., 2001], † [Hunsberger, 2009]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 86 / 182

Page 91: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Dynamic Controllability: Situations and projections∗

For an STNU with k contingent links, (Ai , xi , yi ,Ci):

ω = (d1, . . . , dk) is a situation, where xi ≤ di ≤ yi , for each i.

Ω = [x1, y1] × . . . × [xk, yk] is the space of situations.

Projection: Sω is the STN that results from forcing the contingent links to take on the values in ω.

∗ [Morris et al., 2001]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 87 / 182

Page 92: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Dynamic Controllability: Execution strategies∗

A strategy is a mapping, σ : Ω → Ψ, from (complete) situations to (complete) schedules.

[Figure: each situation ω in Ω is mapped to a schedule ψ(ω) in Ψ.]

∗ [Morris et al., 2001]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 88 / 182

Page 93: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Dynamic Controllability: Valid execution strategies∗

Given a (complete) situation ω, σ(ω) is a (complete) schedule.

A strategy σ is valid if for each situation ω, the schedule σ(ω) is consistent with the projection Sω.

In other words, a valid strategy does not control the durations of contingent links.

∗ [Morris et al., 2001]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 89 / 182

Page 94: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Dynamic Controllability: Dynamic execution strategies∗

An execution strategy σ is dynamic if:

for any situations ω′ and ω′′,

and any non-contingent time-point X ∈ T ,

let t = [σ(ω′)]X ;

if [σ(ω′)]<t = [σ(ω′′)]<t , then [σ(ω′′)]X = t .

In other words, if σ executes X at time t in situation ω′, and the pre-histories of σ(ω′) and σ(ω′′) are the same (i.e., at time t the agent cannot distinguish between the situations ω′ and ω′′), then σ must also execute X at time t in ω′′.

∗ [Morris et al., 2001]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 90 / 182

Page 95: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Dynamic Controllability: Dynamic execution strategies (ctd.)

[Figure: past observations and execution events up to t = 21.]

[σ(ω′)]X = 21

[σ(ω′)]<21 = [σ(ω′′)]<21 = {(C3 − A3 = 4), (C2 − A2 = 7), (C5 − A5 = 9)}

σ executes X at time 21 in situation ω′; and [σ(ω′)]<21 = [σ(ω′′)]<21. Therefore, σ must also execute X at time 21 in situation ω′′.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 91 / 182

Page 96: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Dynamic Controllability—at last!∗

An STNU, S = (T , C, L), is dynamically controllable (DC) if there exists a strategy for S that is both valid and dynamic.

∗ [Morris et al., 2001]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 92 / 182

Page 97: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Pause!

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 93 / 182

Page 98: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Dynamic Controllability, Take 2: Alternative Semantics: Real-Time Execution Decisions∗

The semantics for dynamic controllability can be stated equivalently in terms of Real-Time Execution Decisions (RTEDs):

WAIT: Wait for some activated contingent link to complete.
(t , χ): If nothing happens before time t ∈ R, then execute the (non-contingent) time-points in χ at time t.

An RTED-based strategy maps each respectful partial schedule to an allowable RTED.
By construction, an RTED-based strategy is necessarily dynamic.

∗ [Hunsberger, 2009]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 94 / 182

Page 99: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

STN with Uncertainty: RTED Examples

[Figure (pages 99–104): the earlier STNU example, with contingent link (A, 2, 9, C), the constraint C − B ≤ 5, and the wait edge B →(C:−4) A.]

First example:
Initial Partial Schedule: (Z = 0). Initial RTED: (4, {B}) (if nothing happens before time 4, execute B at 4).
Possible Outcome: C executes at time 2. Next Partial Schedule: (Z = 0), (C = 2). Next RTED: (3, {B}).
Possible Outcome: B executes at 3. Final Schedule: (Z = 0), (C = 2), (B = 3).

Second example:
Initial Partial Schedule: (Z = 0). Initial RTED: (4, {B}).
Possible Outcome: Time 4 arrives, but C has not yet executed, so B is executed at 4. Next Partial Schedule: (Z = 0), (B = 4). Next RTED: WAIT (for C to execute).
Possible Outcome: C executes at 7. Final Schedule: (Z = 0), (B = 4), (C = 7).

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 95 / 182

Page 105: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Semantics is nice, but . . .

How can we check whether an STNU is DC?
How can we construct a dynamic and valid strategy?

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 96 / 182

Page 106: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

DC Checking for STNUs

Propagate labeled edges to generate new edges:
Q →(u) S →(R:v) T generates Q →(R:u + v) T

Not all paths correspond to constraints that must be satisfied:

[Example: the loop C →(C:−8) A →(c:2) C.]

Semi-reducible paths: reduce away lower-case edges

DC if and only if no semi-reducible negative (SRN) loops

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 97 / 182

Page 107: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

STN with Uncertainty: Semi-Reducible Paths

In an STN: shortest paths represent the strongest constraints that every solution must satisfy.

In an STNU: shortest semi-reducible paths represent the strongest constraints that a valid execution strategy must satisfy.

The All-Pairs, Shortest Semi-Reducible Paths (APSSRP) matrix D∗ for an STNU is analogous to the APSP matrix for an STN.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 98 / 182

Page 108: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

STN with Uncertainty: Fundamental Theorem of STNUs

For an STNU S, with graph G, and APSSRP matrix D∗, the following are equivalent:

S is dynamically controllable

G has no semi-reducible negative loops

D∗ has non-negative values on its main diagonal

[Hunsberger, 2010, 2013; Morris, 2006; Morris and Muscettola, 2005]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 99 / 182

Page 109: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

STN with Uncertainty: Edge-Generation Rules

The All-Pairs, Shortest Semi-Reducible Paths matrix for an STNU can be determined by considering the following edge-generation rules:

No-Case Rule
Upper-Case Rule
Lower-Case Rule
Cross-Case Rule
Label-Removal Rule

Key feature of the rules: they are length-preserving!

[Morris and Muscettola, 2005]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 100 / 182

Page 110: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

STN with Uncertainty: The Edge-Generation Rules Illustrated

No-Case Rule: Q →(3) S →(4) T generates Q →(7) T.

Upper-Case Rule: Q →(3) C →(C:−10) A generates Q →(C:−7) A.

Lower-Case Rule: A →(c:3) C →(−5) X generates A →(−2) X. (Applies since −5 ≤ 0.)

Cross-Case Rule: A →(c:3) C →(D:−8) AD generates A →(D:−5) AD. (Applies since −8 ≤ 0 and C ≢ D.)

Label-Removal Rule: X →(C:−1) A, where (A, 3, y, C) is a contingent link, generates X →(−1) A. (Applies since −1 ≥ −3.)

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 101–105 / 182
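The five rules can be written as small functions over labeled edges. The tuple encoding below is an assumption made for illustration: an edge is (kind, label, length), with kind one of "ord", "uc" (upper-case), or "lc" (lower-case); each function returns the generated edge, or None when the rule does not apply.

def no_case(e1, e2):
    if e1[0] == "ord" and e2[0] == "ord":
        return ("ord", None, e1[2] + e2[2])

def upper_case(e1, e2):
    if e1[0] == "ord" and e2[0] == "uc":
        return ("uc", e2[1], e1[2] + e2[2])

def lower_case(e1, e2):
    if e1[0] == "lc" and e2[0] == "ord" and e2[2] <= 0:
        return ("ord", None, e1[2] + e2[2])

def cross_case(e1, e2):
    # e1 is lower-case labeled "c", e2 upper-case labeled "D"; requires C and D distinct.
    if e1[0] == "lc" and e2[0] == "uc" and e2[2] <= 0 and e1[1].upper() != e2[1]:
        return ("uc", e2[1], e1[2] + e2[2])

def label_removal(e, contingent_lower_bounds):
    # e is an upper-case edge X --(C:w)--> A for contingent link (A, x, y, C);
    # the label can be dropped when w >= -x.
    if e[0] == "uc" and e[2] >= -contingent_lower_bounds[e[1]]:
        return ("ord", None, e[2])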

Page 121: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

STN with Uncertainty: Semi-Reducible Path [Morris and Muscettola, 2005]

A path is semi-reducible if it can be transformed into a path with no lower-case edges.

[Figure sequence (pages 121–128): a path from X to Y containing the lower-case edges AD →(d:3) D and AE →(e:2) E. Repeated applications of the edge-generation rules replace each lower-case edge, together with its extension sub-path, by an ordinary or upper-case edge, leaving a path with no lower-case edges.]

Example 2: Same as before, but with an edge from Y to X. In that case the transformed path closes a semi-reducible negative (SRN) loop at A!

Page 129: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

STN with Uncertainty: Fundamental Theorem of STNUs

For an STNU S, with graph G, and APSSRP matrix D∗, the following are equivalent:

S is dynamically controllable

G has no semi-reducible negative loops

D∗ has non-negative values on its main diagonal

[Hunsberger, 2010, 2013; Morris, 2006; Morris and Muscettola, 2005]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 107 / 182

Page 130: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

STN with Uncertainty: DC Checking for STNUs

The worst-case time for DC-checking algorithms for STNUs has improved dramatically in recent years:

Pseudo-polynomial: [Morris et al., 2001]

O(n5): [Morris and Muscettola, 2005]

O(n4): [Morris, 2006]

O(n3): [Morris, 2014]

O(mn + k2n + kn log n): [Cairo et al., 2018]

(k = num. cont. links; n = num. time-points; m = num. edges)

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 108 / 182

Page 131: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Reducing Away a Lower-Case Edge

Example 1: The moat edge is an ordinary edge; so is the generated edge.

[Figure: a lower-case edge e = A →(c:2) C followed by its extension sub-path Pe, ending in the moat edge; here the moat edge is ordinary, so the generated edge e′ bypassing the lower-case edge is ordinary.]

Example 2: The moat edge is an upper-case edge; so is the generated edge.

[Figure: as above, but the moat edge carries an upper-case label (G: . . .), so the generated edge is upper-case.]

Pe: extension sub-path for the lower-case edge, e.

Page 133: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Reducing Away Lower-Case Edges: Extension sub-paths can be nested

[Figure: a long path through contingent time-points C1, . . . , C5 in which the extension sub-path for one lower-case edge contains, nested inside it, the extension sub-path for another.]

If extension sub-paths overlap . . . one must be nested inside the other [Morris, 2006].

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 111 / 182

Page 134: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Morris' 2006 O(n4) DC-Checking Algorithm: Based on "canonical reduction" of semi-reducible paths

[Figure sequence (pages 134–141): starting from the lower-case edge A →(c:3) C, forward propagation follows the path C →(8) C2 →(C2:−6) A2 →(−1) A3 →(c3:1) C3 →(−9) Y, generating new edges along the way, until the accumulated path can be used to reduce away the lower-case edge AC.]

Key idea: Propagate forward along "allowable" paths emanating from C attempting to "reduce away" the lower-case edge AC.

Page 142: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Morris' 2006 O(n4) DC-Checking Algorithm

Assumes the strategy can react instantaneously!
This simplifies the analysis and changes the applicability conditions slightly.
See Hunsberger [2015] for the non-instantaneous case.

The extension sub-path used in the canonical reduction of a lower-case edge necessarily has the PP property!

Every proper prefix has non-negative length.
Every proper postfix has negative length.

Restrict attention to "allowable" paths emanating from C:

No upper-case edges labeled by the same C.
No lower-case edges.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 113 / 182

Page 143: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

The AllMax STN∗

The AllMax STN is the projection in which all contingent links take on their maximum durations.
The AllMax graph can be obtained by:
– removing all lower-case edges, and
– deleting the upper-case labels from upper-case edges.

The distance matrix for the AllMax STN is called Dmax.

If an STNU is fully propagated, then Dmax = D∗.†
(D∗: the All-Pairs, Shortest-Semi-Reducible-Paths matrix)

An STNU is DC iff D∗ has zeros down its main diagonal.

∗ [Morris and Muscettola, 2005], † [Hunsberger, 2015]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 114 / 182

Page 144: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Morris' 2006 O(n4) DC-Checking Algorithm

Canonical paths used to reduce away lower-case edges can be nested to a depth of at most k.

Therefore, the algorithm does k rounds of propagating forward from each lower-case edge.

Allowable paths belong to the OU-graph.

Before each round, it runs Bellman-Ford on the AllMax graph to generate a potential function.

Each forward propagation uses a slightly modified Dijkstra's algorithm on re-weighted edges in the OU-graph.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 115 / 182

Page 145: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Morris' 2006 O(n4) DC-Checking Algorithm

for i = 1 to k,
  Run Bellman-Ford on the AllMax graph to get a potential function
  for each contingent link (A, x, y, C),
    Do Dijkstra-like propagation using C as the source, following "allowable" paths in the re-weighted OU-graph.
  Insert all new edges from this round.
Run Bellman-Ford one last time on the AllMax graph to verify consistency.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 116 / 182

Page 146: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Finding a Semi-Reducible Negative Loop

[Figure sequence (pages 146–157): an STNU with two contingent links, (A1, 1, 3, C1) and (A2, 1, 10, C2), whose lower-case edges are reduced away step by step until a semi-reducible negative loop is found.]

The order of processing of lower-case edges in Morris' 2006 algorithm matters!

If A2C2 is processed first, then two rounds are needed;
if A1C1 is processed first, then only one round is needed.
If a heuristic guesses a good nesting order, O(n4) → O(n3).

[Hunsberger, 2013]

Page 158: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Morris’ 2014 O(n3) DC-checking algorithm

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 118 / 182

Page 159: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Morris' 2014 O(n3) DC-checking algorithm
Key insights

When propagating forward along "allowable" paths in the OU-graph, nested extension sub-paths cause problems — unless you know the order of nesting.

A semi-reducible loop can always be reduced to a loop consisting solely of negative edges.

Moat edges are always negative edges.

Propagating backward starting from negative edges automatically resolves the nesting order!

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 119 / 182

Page 160: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Finding a Semi-Reducible Negative Loop

[Figure: the same example STNU; successive slides propagate backward from the negative edge entering X, generating new edges (e.g., −6, 5, 0, 9, C2:−9, C2:−10, C2:−1) until the semi-reducible negative loop is exposed.]

For each node that has a negative incoming edge (e.g., X):

Propagate backward along non-negative edges,
but only as long as the path length stays negative.

If another negative edge is ever encountered,
recursively process that edge.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 120 / 182

Page 172: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Morris’ 2014 O(n3) DC-checking Algorithm

Process each "negative" node only once.
– Negative node = node with negative incoming edge(s).

Process each negative node using Dijkstra.
– No re-weighting required!
– Only propagate along non-negative edges.

At most n recursive calls of Dijkstra: O(nm + n2 log n).

Worst case: adds O(n2) edges.
Thus, m → O(n2) and the overall complexity is O(n3).
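The following Python fragment illustrates only the "stay negative" backward-propagation step on an assumed edge-list representation; it deliberately omits the handling of upper-case and lower-case edges and the recursion on other negative nodes that the full algorithm requires:

import heapq

def backward_reach_while_negative(edges, start_edge):
    """Propagate backward from one negative edge (u0, target, w0), extending
    only along non-negative edges and only while the accumulated path length
    stays negative.  Returns {node: best accumulated length found}.

    edges: list of (u, v, w).  Edge labels and the recursive handling of
    further negative edges are ignored in this sketch."""
    u0, target, w0 = start_edge
    incoming = {}
    for (u, v, w) in edges:
        if w >= 0:                      # only non-negative edges are followed
            incoming.setdefault(v, []).append((u, w))
    best = {u0: w0}
    frontier = [(w0, u0)]
    while frontier:
        d, v = heapq.heappop(frontier)
        if d > best.get(v, float("inf")) or d >= 0:
            continue                    # stop extending once the length is >= 0
        for (u, w) in incoming.get(v, []):
            nd = d + w
            if nd < best.get(u, float("inf")):
                best[u] = nd
                heapq.heappush(frontier, (nd, u))
    return best

# Tiny example: negative edge (B, X, -6); ordinary edges A --5--> B, C --2--> A.
print(backward_reach_while_negative(
    [("A", "B", 5), ("C", "A", 2), ("B", "X", -6)], ("B", "X", -6)))
# {'B': -6, 'A': -1, 'C': 1}

Where the accumulated length for some node reaches a non-negative value, the full algorithm records a new ordinary edge from that node to the target and stops extending through it.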

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 121 / 182

Page 173: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Cairo et al.’s 2018 O(mn + k2n + kn log n) DC-checking Algorithm

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 122 / 182

Page 174: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Recall MM-2005 Propagation Rules

Each rule is shown as: premise edges  ⇒  derived edge    (applicability conditions)

No Case: X --v--> Y --w--> W  ⇒  X --(v + w)--> W    (none)

Upper Case: X --v--> Y --C:−w--> A  ⇒  X --C:(v − w)--> A    (none)

Lower Case: A --c:x--> C --w--> U  ⇒  A --(x + w)--> U    (w < 0)

Cross Case: A --c:x--> C --C2:w--> A2  ⇒  A --C2:(x + w)--> A2    (C2 ≢ C, w < 0)

Label Removal: X --C:w--> A --c:x--> C  ⇒  X --w--> A    (w ≥ −x)

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 123 / 182

Page 175: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

MMV-2001’s General Unordered Reduction

V --C:w--> A --c:x--> C  ⇒  V --(−x)--> A    (w < −x; equiv., −w > x)

Example: V --C:−8--> A --c:3--> C  ⇒  V --(−3)--> A    (−8 < −3; equiv., 8 > 3)

While C remains unexecuted, V must wait at least −w after A.

But C cannot execute sooner than x after A.
Therefore, in every situation, V must wait at least x after A.

The GUR rule is not length-preserving.

[Morris et al., 2001]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 124 / 182

Page 176: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Cairo et al.'s 2018 DC-checking Algorithm
The RUL− propagation rules

Each rule is shown as: premise edges  ⇒  derived edge    (applicability conditions)

Relax−: P --v--> Q --w--> R  ⇒  P --(v + w)--> R    (Q ∈ TX, w < yR − xR, R ∈ TC)

Upper−: P --v--> C --C:−y--> A  ⇒  P --max{v − y, −x}--> A    (none)

Lower−: A --c:x--> C --w--> R  ⇒  A --(x + w)--> R    (C ≢ R, w < yR − xR, R ∈ TC)

Only generates ordinary edges!

Propagations always involve 1-2 contingent time-points (k < n).

Upper− rule is not length-preserving.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 125 / 182

Page 177: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Cairo et al.'s 2018 DC-checking Algorithm
Overview

INIT: Bellman-Ford to compute a potential function for the LO-graph (i.e., the graph containing only lower-case and ordinary edges).

Reduces away upper-case edges, not lower-case edges.

For each contingent time-point R:

Propagates backward from R using the Lower− and Relax− rules.
For each new edge XR, applies the Upper− rule to XRAR.

Backward propagation is done using Dijkstra on the re-weighted LO-graph.

After each round, inserts new edges and updates the potential function.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 126 / 182

Page 178: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Cairo et al.'s 2018 DC-checking Algorithm
Overview (continued)

[Figure: three panels illustrating one round — (a) propagating back from R, (b) new edges terminating at R, (c) applying the Upper− rule to each new edge entering R.]

Uses push-down stacks to ensure that each contingent time-point is processed at most twice, and to detect negative cycles.

If backward propagation from R encounters an upper-case edge CA, where C has not yet been processed, then it suspends processing of R, and instead processes C.

Only resumes processing of R when all previously encountered upper-case edges have been processed.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 127 / 182

Page 179: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Cairo et al.'s 2018 DC-checking Algorithm
Overview (continued)

Faster than Morris' 2014 O(n3) algorithm on sparse graphs, for example, in cases where Morris' algorithm generates O(n2) edges.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 128 / 182

Page 180: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Executing STNUs Efficiently

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 129 / 182

Page 181: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

STN with Uncertainty
Flexible Execution of STNUs

Lower-case edges are only needed during constraint propagation, to generate additional ordinary and upper-case edges.
After propagation completes, the execution phase can begin, and lower-case edges can be ignored.
Therefore, the fully-propagated OU-graph contains all the information needed to successfully execute a DC STNU.
Executing an STNU is similar to executing STNs, except that:

The lower bound for X also depends on up to k wait constraints.
When a contingent C executes, all upper-case edges labeled by C must be deleted, requiring lower bounds to be updated.

A DC STNU can be flexibly executed, incrementally computing updates in the AllMax graph using O(n2) time per execution event, O(n3) time overall.∗

∗ [Hunsberger, 2013, 2015]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 130 / 182

Page 182: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

STN with Uncertainty
STNU Dispatchability

Recall: A projection of an STNU is the STN that results from fixing the durations of all contingent links to legal values.

Definition: An STNU is dispatchable if each of its STN projections is dispatchable (as an STN).

Dispatchability is the same as for STNs, except that:

Contingent time-points are not controllable; and
there are wait constraints:
"As long as C is unexecuted, X must wait at least 5 after A."

Theorem: A dispatchable STNU is DC.∗

∗ [Morris, 2014]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 131 / 182

Page 183: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

STN with Uncertainty
STNU Dispatchability

For a DC STNU, Morris' O(n3)-time DC-checking algorithm generates a dispatchable STNU.∗

Corollary: For a DC STNU, the STNU graph generated by exhaustively applying the constraint propagation rules from Morris et al. (2005) is dispatchable.

∗[Morris, 2014]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 132 / 182

Page 184: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

STN with Uncertainty
STNU Summary

The theory of STNUs (dynamic controllability, dispatchability, flexible execution) has been advanced dramatically over the past few years.

Many important contributions from Paul Morris and colleagues.

STNUs are ready for prime time!

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 133 / 182

Page 185: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
Motivation

Many actions generate information (e.g., medical tests, opening a box, monitoring traffic).

The generated information is generally not known in advance, but discovered in real time.
Some actions only make sense in certain scenarios (e.g., don't give a drug if the test result is negative).

An execution strategy could be more flexible if it could react dynamically to generated information.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 134 / 182

Page 186: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
Motivation (ctd.)

Many businesses use workflow management systems to automate manufacturing processes.

Hospitals can use workflows to represent possible treatment pathways for a patient.

CSTNs can serve as the temporal foundation for workflow management systems.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 135 / 182

Page 187: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs (CSTNs)∗

[Figure: the expressiveness hierarchy of temporal networks: STNs at the base; STNUs add actions with uncertain durations; DTNs add disjunctive constraints; CSTNs add test actions and test results; CSTNUs, DTNUs, CDTNs and CDTNUs combine these features.]

Time-points and constraints as in STNs
Observation time-points generate truth values for propositional letters
Time-points and constraints labeled by conjunctions of propositional letters

∗ [Tsamardinos et al., 2003]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 136 / 182

Page 188: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
Propositional Labels

Propositional letters: p, q, r, s, t, ...

Each p has a corresponding observation time-point, P?.
Executing P? generates the truth value for p.

Label: conjunction of literals (e.g., p(¬q)r)

A scenario specifies values for all letters
(e.g., p = true, q = false, r = true).

The real scenario is only revealed incrementally.

Time-points and constraints∗ can be labeled;
they only apply in scenarios where their labels are true.

∗ [Hunsberger et al., 2015]
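As a small illustration (using a representation chosen here for convenience, not the one in the cited papers), labels can be stored as sets of literals and scenarios as truth assignments:

# A label is a set of literals such as {"p", "¬q", "r"}; a scenario assigns
# a truth value to every propositional letter.

def is_consistent(label):
    """A conjunction of literals is consistent iff it never contains both p and ¬p."""
    return not any(("¬" + lit) in label for lit in label if not lit.startswith("¬"))

def holds_in(label, scenario):
    """True iff every literal in the label is satisfied by the scenario."""
    for lit in label:
        letter, positive = (lit[1:], False) if lit.startswith("¬") else (lit, True)
        if scenario[letter] != positive:
            return False
    return True

scenario = {"p": True, "q": False, "r": True}
print(holds_in({"p", "¬q"}, scenario))   # True: the constraint applies here
print(holds_in({"p", "q"}, scenario))    # False: the constraint is irrelevant here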

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 137 / 182

Page 189: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
Well-definedness (WD) properties∗

[Figure: time-points X (label p¬q) and Y (label r), and observation time-points P?, Q? (label ¬p) and R?; successive slides annotate the figure with a constraint [1, 5] from X to Y, first labeled ¬p (violating the WD properties) and then p¬qr, together with −ε edges from the labeled nodes back to the relevant observation time-points.]

WD1 (Label Coherence): the label on a constraint must be satisfiable and must entail the labels on its endpoints.

WD2 (Node Label Honesty): the label ℓ on a node must entail the labels on all observation time-points whose corresponding propositional letters appear in ℓ; and the node must occur ε ≥ 0 after such observation time-points.

WD3 (Edge Label Honesty): the label ℓ on an edge must entail the labels on all observation time-points whose corresponding propositional letters appear in ℓ.

∗ Implicit in Tsamardinos et al. (2003), formalized in Hunsberger et al. (2015).

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 138 / 182

Page 195: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
Sample CSTN

[Figure: a sample CSTN with time-points Z, P?, Q? (label p), E (label pq) and Y, connected by labeled constraints such as [1, 2], [15, 20] with label p, [1, 5] with label pq, [5, 30] with labels p¬q and pq, [1, 5] with label ¬p, and [2, 30].]

P? and Q? represent tests for a patient.
Q? is a child of P?: it is only executed in scenarios where p = true.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 139 / 182

Page 196: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
Dynamic Consistency

Dynamic Execution Strategy: execution decisions can react to observations.

A CSTN is dynamically consistent (DC) if there exists a dynamic execution strategy that guarantees that all relevant constraints will be satisfied no matter which scenario is incrementally revealed over time.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 140 / 182

Page 197: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
Approaches to DC Checking

Convert to a Disjunctive Temporal Problem [Tsamardinos et al., 2003]

Convert to a controller-synthesis problem for a Timed Game Automaton [Cimatti et al., 2014]

Convert to a Hyper Temporal Network consistency problem [Comin and Rizzi, 2015]

Propagate labeled constraints [Hunsberger et al., 2015]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 141 / 182

Page 198: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
DC Checking via Propagation

Propagate labeled constraints
— Motivated by related work on STNs with choice [Conrad and Williams, 2011]

Introduce new kinds of literals and labels:
Q-literals (e.g., ?p) and Q-labels (e.g., p¬q(?r)s)

Analysis of negative q-loops and negative q-stars

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 142 / 182

Page 199: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
Labeled Constraints

X --⟨δ, α⟩--> Y

Y − X ≤ δ must hold in every scenario where α is true.

(If α is the empty label, then Y − X ≤ δ must hold in all scenarios.)

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 143 / 182

Page 200: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
Propagation Rules for CSTNs

Labeled Propagation: LP and qLP

Label Modification: R0 and qR0
Label "Spreading": R∗3 and qR∗3
(The "q" rules propagate q-labeled constraints.)

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 144 / 182

Page 201: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
The LP Rule

W --⟨3, pq⟩--> X --⟨5, q¬r⟩--> Y  ⇒  W --⟨8, pq¬r⟩--> Y

Labels of the two pre-existing edges are conjoined;
the resulting label must be consistent.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 145 / 182

Page 203: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
The R0 Rule

P? --⟨−5, pq¬r⟩--> X  ⇒  P? --⟨−5, q¬r⟩--> X

Edge weight must be negative;
any occurrence of p (or ¬p) is removed from the label.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 146 / 182

Page 205: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
The R∗3 Rule

P? --⟨−3, qr⟩--> X and Y --⟨−8, pqs⟩--> X  ⇒  Y --⟨−3, qrs⟩--> X

Pre-existing labels must be consistent;
the generated label is the conjunction of the pre-existing labels, minus any occurrence of p (or ¬p);
the left-hand weight must be negative;
the generated weight is the max of the pre-existing weights.
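A hedged Python sketch of the LP, R0 and R∗3 rules, using sets of literals as labels (an illustrative representation, not the papers' own), is shown below; the printed results match the examples on these slides (set-element order may vary):

def conjoin(alpha, beta):
    """Conjunction of two labels; None if the result is inconsistent."""
    gamma = set(alpha) | set(beta)
    if any(("¬" + l) in gamma for l in gamma if not l.startswith("¬")):
        return None
    return gamma

def lp(v, alpha, w, beta):
    """LP: from W --(v, alpha)--> X --(w, beta)--> Y derive (v + w, alpha ∧ beta),
    provided the conjoined label is consistent."""
    gamma = conjoin(alpha, beta)
    return None if gamma is None else (v + w, gamma)

def r0(p, w, alpha):
    """R0: on a negative edge leaving P?, drop p / ¬p from the label."""
    if w >= 0:
        return None
    return (w, {l for l in alpha if l not in (p, "¬" + p)})

def r3(p, w, gamma, v, beta):
    """R∗3: from P? --(w, gamma)--> X and Y --(v, beta)--> X, with w < 0 and
    consistent labels, derive Y --(max(v, w), (gamma ∧ beta) minus p)--> X."""
    if w >= 0:
        return None
    label = conjoin(gamma, beta)
    if label is None:
        return None
    return (max(v, w), {l for l in label if l not in (p, "¬" + p)})

print(lp(3, {"p", "q"}, 5, {"q", "¬r"}))             # (8, {'p', 'q', '¬r'})
print(r0("p", -5, {"p", "q", "¬r"}))                 # (-5, {'q', '¬r'})
print(r3("p", -3, {"q", "r"}, -8, {"p", "q", "s"}))  # (-3, {'q', 'r', 's'})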

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 147 / 182

Page 207: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
Example: Non-DC Instance

Example 2 (Non-DC Instance)

[Figure: a CSTN over Z, X1, X2, P?, Q? with labeled edges ⟨1, ¬p¬q⟩, ⟨−2, pq⟩, ⟨−2, p⟩, ⟨−3, q⟩ and 0; successive slides apply the propagation rules, generating edges such as ⟨−2, pq⟩, ⟨−2⟩ with the empty label, and finally ⟨−1, ¬p¬q⟩.]

There is a scenario, ¬p¬q, in which there exists a negative loop!
Therefore, the CSTN is not DC!

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 148 / 182

Page 216: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
Propagating Q-Labels

Propagating along consistent labels is insufficient.

Example 3 (Non-DC instance, but LP, R0 and R∗3 are not enough)

[Figure: a CSTN over Z, X1, X2, P?, Q?, R? with labeled edges ⟨1, ¬p¬q⟩, ⟨−2, pq⟩, ⟨−2, pr⟩, ⟨−3, q¬r⟩, ⟨−4, ¬p¬q⟩ and 0; propagating only along consistent labels yields ⟨−2, pq⟩, ⟨−2, pr⟩ and ⟨−2, q¬r⟩, but no negative loop is exposed.]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 149 / 182

Page 218: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
Propagating Q-Labels

Propagating along consistent labels is insufficient.

Q-labels: contain literals such as ?p.

A constraint labeled by ?p must hold as long as p's value is unknown (i.e., as long as P? remains unexecuted).

The conjunction operation is generalized to cover q-labels:
p ∧ ¬p ≡ ?p;  p ∧ ?p ≡ ?p;  ¬p ∧ ?p ≡ ?p;  etc.

Q-labels are only needed on lower-bound constraints∗ (i.e., edges pointing at Z).

∗ [Hunsberger and Posenato, 2018b]
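A minimal sketch of the generalized conjunction, assuming q-labels are stored as maps from letters to one of the values "+", "-", "?" (a representation chosen only for illustration):

def star(l1, l2):
    """Conjunction of two single-letter q-literal values."""
    if l1 == l2:
        return l1
    return "?"          # p ∧ ¬p = ?p,  p ∧ ?p = ?p,  ¬p ∧ ?p = ?p

def q_conjoin(label1, label2):
    """Generalized conjunction of two q-labels given as {letter: value} dicts."""
    out = dict(label1)
    for letter, val in label2.items():
        out[letter] = star(out[letter], val) if letter in out else val
    return out

# Example: q?r conjoined with p¬qr?s.
print(q_conjoin({"q": "+", "r": "?"}, {"p": "+", "q": "-", "r": "+", "s": "?"}))
# {'q': '?', 'r': '?', 'p': '+', 's': '?'}
# (The qR∗3 rule would additionally drop p from the resulting label.)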

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 150 / 182

Page 219: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
The qR0 Rule

P? --⟨−5, ?pq¬r⟩--> Z  ⇒  P? --⟨−5, q¬r⟩--> Z

Edge must terminate at Z;
edge weight must be negative;
any occurrence of p (or ¬p or ?p) is removed from the label.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 151 / 182

Page 221: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
The qR∗3 Rule

P? --⟨−3, q?r⟩--> Z and Y --⟨−8, p¬qr?s⟩--> Z  ⇒  Y --⟨−3, ?q?r?s⟩--> Z

Labels need not be consistent;
the left-hand weight must be negative;
the generated weight is the max of the pre-existing weights.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 152 / 182

Page 223: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
Propagating Q-Labels

Propagating along q-labels is sufficient!

Example 4 (Non-DC instance confirmed by rules LP, qR0 and qR∗3)

[Figure: the same CSTN as in Example 3; propagating with q-labels generates lower-bound edges pointing at Z such as ⟨−2⟩ with the empty label, ⟨−3, ?q⟩, ⟨−3, q¬r⟩ and ⟨−4, ¬p¬q⟩, and finally ⟨−1, ¬p¬q⟩, exposing a negative loop. Incidentally, the blue edges form a "negative q-star".]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 153 / 182

Page 226: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
Negative Q-Loop Example

[Figure: a CSTN over Z, P?, Q?, R?, X with labeled edges including ⟨−1, pr⟩, ⟨−1, ¬p¬q⟩, ⟨−1, ¬pq⟩, ⟨8, pr⟩, ⟨−∞, (?p)(?q)r⟩, ⟨−2, ¬pq⟩ and ⟨−7, ¬q¬r⟩; propagation generates lower-bound edges pointing at Z such as ⟨−7⟩ with the empty label, ⟨−8, ¬pq⟩, ⟨−8, pr⟩, ⟨−8, ¬p¬q⟩, ⟨−9, ¬p(?q)⟩ and ⟨−∞, ?p⟩.]

The Spreading Lemma
The minimum lower-bound constraint ⟨−7⟩ (with the empty label) has spread to all unexecuted time-points. [Hunsberger et al., 2015]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 154 / 182

Page 229: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
DC-Checking Algorithm for CSTNs

The DC-Checking Algorithm exhaustively propagates constraints using LP, qLP, and qR∗3.
Returns NO if any negative self-loop with a consistent label is ever found; otherwise returns YES.
In positive cases, constructs the earliest-first strategy, which is viable due to the Spreading Lemma.
Although exponential-time in the worst case, it has been shown to be practical across a variety of sample networks.

[Hunsberger and Posenato, 2018b; Hunsberger et al., 2015]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 155 / 182

Page 230: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
The Earliest-First Strategy

Keep track of the current partial scenario (CPS), π.
Initially, π is the empty scenario.

After each execution event, compute the effective lower bound (ELB) for each as-yet-unexecuted time-point.

ELB(X, π) restricts attention to lower bounds for X whose labels are applicable to π.

Next time-point to execute is the one with the min. ELB value.
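A sketch of the earliest-first selection, under the assumption that each lower bound is stored as a (value, label) pair and that a label is applicable unless some already-observed letter contradicts it (this reading of "applicable" is an assumption; see the cited papers for the precise definition):

def applicable(label, cps):
    """A label (dict letter -> True/False) is applicable to the current
    partial scenario unless some already-observed letter contradicts it."""
    return all(cps[l] == v for l, v in label.items() if l in cps)

def elb(lower_bounds, cps):
    """Effective lower bound: max over the applicable labeled lower bounds."""
    vals = [v for (v, label) in lower_bounds if applicable(label, cps)]
    return max(vals) if vals else 0

def next_to_execute(unexecuted, bounds, cps):
    """Earliest-first: pick the unexecuted time-point with the minimum ELB."""
    return min(unexecuted, key=lambda x: elb(bounds.get(x, []), cps))

# Example: X has lower bound 7 in all scenarios and 9 when p is true.
bounds = {"X": [(7, {}), (9, {"p": True})], "Y": [(8, {})]}
print(elb(bounds["X"], {"p": False}))                     # 7
print(elb(bounds["X"], {"p": True}))                      # 9
print(next_to_execute(["X", "Y"], bounds, {"p": False}))  # 'X' (ELB 7 < 8)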

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 156 / 182

Page 231: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
Sample Execution

[Figure: the fully propagated CSTN from the Negative Q-Loop Example; executed time-points are marked as execution proceeds.]

Initially π is empty, Z = 0, ELB(P?, π) = −7; execute P? = 7.

Suppose p = true. π = p; ELB(X, p) = 7 = ELB(R?, p).
So execute X = 7 and R? = 7.

Suppose r = true. π = pr; ELB(Q?, p) = 8.
So execute Q? = 8.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 157 / 182

Page 234: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
Alternative Execution

[Figure: the same CSTN; executed time-points are marked as execution proceeds.]

Suppose p = false. π = ¬p; ELB(Q?, ¬p) = 7.
So execute Q? = 7.

Suppose q = true. π = ¬pq; ELB(X, ¬pq) = 7.
So execute X = 7. Afterward, execute R? = 8.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 158 / 182

Page 236: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
Bounded Reaction Time

ε-dynamic consistency requires a bounded reaction time ε > 0 [Comin and Rizzi, 2015].

Propagation-based ε-DC checking algorithm [Hunsberger and Posenato, 2016].

Semantics of instantaneous reactivity for CSTNs [Cairo et al., 2016].

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 159 / 182

Page 237: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs
CSTN Summary

The theory of dynamic consistency for CSTNs is very solid:
instantaneous vs. non-instantaneous reactivity;
bounded reaction time.

Several proposed DC-checking algorithms, all exponential, but the propagation-based algorithm shows promise.

More work to do on flexible execution.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 160 / 182

Page 238: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty
CSTNUs

[Figure: the expressiveness hierarchy of temporal networks, with CSTNUs combining the uncertain-duration actions of STNUs and the test actions and test results of CSTNs.]

A Conditional Simple Temporal Network with Uncertainty (CSTNU) combines contingent links from STNUs and observation time-points from CSTNs.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 161 / 182

Page 239: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty
Sample CSTNU

[Figure: a sample CSTNU with time-points Z, P′, P?, Q′ (label p), Q? (label p), E′ (label pq), E (label pq) and Y, connected by contingent links and by labeled ordinary constraints such as [1, 2], [15, 20] with label p, [0, 5] with label pq, [1, 5] with label ¬p, and [2, 30].]

Contingent links (P′, [5, 10], P?) and (Q′, [5, 10], Q?) represent tests for a patient.
Contingent link (E′, [5, 25], E) represents the emergency therapy.
Contingent links have no propositional labels, but the labels on their endpoints must be the same.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 162 / 182

Page 240: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty (CSTNUs)
Dynamic Controllability

Dynamic Execution Strategy: execution decisions may react to observations and contingent durations.

A CSTNU is dynamically controllable if there exists a dynamic execution strategy that guarantees that all relevant constraints will be satisfied no matter which scenario is incrementally revealed over time, and no matter how the contingent durations turn out.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 163 / 182

Page 241: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty (CSTNUs)
DC-Checking for CSTNUs

Convert to a Timed Game Automaton [Cimatti et al., 2014]

Propagate labeled constraints [Hunsberger and Posenato, 2018a]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 164 / 182

Page 242: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty (CSTNUs)
DC Checking via Propagation

Propagate labeled constraints as done for CSTNs.

Propagate also upper-case and lower-case (contingent) edges as done for STNUs, considering labeled constraints.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 165 / 182

Page 243: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty (CSTNUs)
DC Checking via Propagation

Propagate labeled constraints as done for CSTNs.

Propagate also upper-case and lower-case (contingent) edges as done for STNUs, considering labeled constraints.

Mixing these two kinds of propagation requires extending the STNU concept of upper-case values!

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 165 / 182

Page 244: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty (CSTNUs)
DC Checking via Propagation

Generalizing Upper-Case Labels
Given contingent time-points C1, C2, ..., Ck, their names are called Upper-Case (UC) alphabetic letters (a-letters).

An UC alphabetic label (a-label) is a set of a-letters:
it is empty; or
it contains one or more UC a-letters, notated as Ci1 ... Cim.

For any UC a-labels ℵ, ℵ′, their conjunction is given by their union (i.e., ℵℵ′ = ℵ ∪ ℵ′).

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 166 / 182

Page 245: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty (CSTNUs)
DC Checking via Propagation

Generalizing labeled values
Each edge is annotated by a triple, called a labeled value.

A labeled value is a triple, ⟨δ, ℵ, α⟩, where:
δ ∈ R;
ℵ is an a-label;
α is a propositional label (from CSTN).
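A minimal illustration of labeled values and a-label conjunction in Python (the data-structure choices are assumptions made for this sketch, not the cited papers' implementation):

from dataclasses import dataclass
from typing import FrozenSet, Tuple

# An a-label is modeled as a frozenset of upper-case letters; conjunction is union.
ALabel = FrozenSet[str]

@dataclass(frozen=True)
class LabeledValue:
    delta: float                  # numeric value
    aleph: ALabel                 # upper-case alphabetic label
    alpha: Tuple[str, ...]        # propositional label (tuple of literals)

def conjoin_alabels(a1: ALabel, a2: ALabel) -> ALabel:
    """ℵℵ' = ℵ ∪ ℵ' (conjunction of a-labels is their union)."""
    return a1 | a2

v = LabeledValue(-7, frozenset({"C2"}), ())
w = LabeledValue(-10, frozenset({"C1"}), ("p",))
print(conjoin_alabels(v.aleph, w.aleph))   # frozenset({'C1', 'C2'}) (order may vary)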

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 167 / 182

Page 246: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty (CSTNUs)
DC Checking via Propagation

Example 5 (A CSTNU represented using labeled values)

[Figure: a CSTNU over time-points Z, A0, C0, X1, C2, A2, A1, C1 and P?, with three contingent links (lower-case values ⟨3, c⟩, ⟨1, c1⟩, ⟨1, c2⟩ and upper-case values ⟨−10, C⟩, ⟨−10, C1⟩, ⟨−7, C2⟩) and ordinary labeled values such as ⟨17, p⟩, ⟨7, p⟩, ⟨2, ¬p⟩, ⟨8, p⟩, −4 and −8.]

Edge value v is a shorthand for the labeled value ⟨v⟩ with the empty a-label and the empty propositional label.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 168 / 182

Page 247: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty (CSTNUs)
Propagation Rules for CSTNUs

Forward Upper Case Propagation: z!
Labeled Extended Propagation: zLP/Nc/Uc

Cross Case and Lower Case Propagation: zLc/Cc
Label Removal: zLR

Label Modification: zqR0
Label "Spreading": zqR∗3

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 169 / 182

Page 248: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty (CSTNUs)The z! rule

The z! rule can generate edges with multiple UC letters.

C A Z〈−y , C, 〉 〈v , ℵ, β〉

Conditions:−y + v < 0

β does not contain unknown literals.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 170 / 182

Page 249: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty (CSTNUs)The z! rule

The z! rule can generate edges with multiple UC letters.

C A Z〈−y , C, 〉 〈v , ℵ, β〉

〈−y + v , Cℵ, β〉

Conditions:−y + v < 0

β does not contain unknown literals.
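A small sketch of the z! rule using the same illustrative representation as the earlier sketches (frozensets for a-labels, tuples of literal strings for propositional labels); the function name and representation are hypothetical:

def z_bang(C, y, v, aleph, beta):
    """z! sketch: from C --(-y, {C}, empty)--> A and A --(v, aleph, beta)--> Z,
    derive C --(-y + v, {C} ∪ aleph, beta)--> Z, provided -y + v < 0 and beta
    has no unknown (?) literals."""
    if -y + v >= 0 or any(lit.startswith("?") for lit in beta):
        return None
    return (-y + v, frozenset({C}) | aleph, beta)

print(z_bang("C", 10, 3, frozenset({"C1"}), ("p",)))
# (-7, frozenset({'C', 'C1'}), ('p',))   (set order may vary)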

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 170 / 182

Page 250: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty (CSTNUs)
The zLP/Nc/Uc Rule

X --⟨u, ◇, α⟩--> Y --⟨v, ℵ, β⟩--> Z  ⇒  X --⟨u + v, ℵ, αβ⟩--> Z

Conditions:
u + v < 0;
α and β must be consistent.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 171 / 182

Page 252: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty (CSTNUs)
The zLc/Cc rule

A --⟨x, c, ⊡⟩--> C --⟨v, ℵ, β⟩--> Z  ⇒  A --⟨x + v, ℵ, β⟩--> Z

Conditions:
x + v < 0;
C ∉ ℵ;
β does not contain unknown literals.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 172 / 182

Page 254: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty (CSTNUs)
The zLR rule

Given A --⟨x, c, ⊡⟩--> C, A --⟨w, ℵ1, γ⟩--> Z, and Y --⟨v, Cℵ, β⟩--> Z,
derive Y --⟨m, ℵℵ1, β ⋆ γ⟩--> Z, where m = max{v, w − x}.

Conditions:
C ∉ ℵℵ1;
β, γ can contain unknown literals.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 173 / 182

Page 256: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty (CSTNUs)
The zqR0 rule

P? --⟨w, ℵ, βp̃⟩--> Z  ⇒  P? --⟨w, ℵ, β⟩--> Z

Conditions:
w < 0;
β can contain unknown literals;
p̃ ∈ {p, ¬p, ?p}.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 174 / 182

Page 258: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty (CSTNUs)
The zqR∗3 rule

P? --⟨w, ℵ1, γ⟩--> Z and Y --⟨v, ℵ, βp̃⟩--> Z  ⇒  Y --⟨max{v, w}, ℵℵ1, β ⋆ γ⟩--> Z

Conditions:
β, γ can contain unknown literals;
p̃ ∈ {p, ¬p, ?p}.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 175 / 182

Page 260: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty (CSTNUs)
DC Checking via Propagation

Example 6 (A snapshot of the CSTNU DC-checking algorithm)

[Figure: the CSTNU from Example 5 after partial propagation; derived labeled values pointing at Z include ⟨−17, C0⟩, ⟨−21, C1⟩, ⟨−13, C1⟩, ⟨−20, C1C2⟩, ⟨−18, C1C2⟩ and ⟨−1, C1C2, p⟩, together with ordinary values such as −7, −10, −11 and −12.]

This CSTNU is not DC: if C0 takes its minimum duration while C1 and C2 take their maximum durations, then X cannot be assigned a value that satisfies all constraints.

After 123 propagations, the CSTNU contains an explicit negative loop at Z.

The blue constraints are some of those determined by the algorithm before the negative loop is discovered.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 176 / 182

Page 261: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty (CSTNUs)
DC-Checking Algorithm for CSTNUs

The DC-Checking Algorithm does exhaustive propagation.

Returns NO if any negative loop with a consistent label is ever found; otherwise, returns YES.

In positive cases, constructs the earliest-first strategy, which is viable due to the spreading lemma for CSTNUs.

The algorithm has exponential-time complexity in the worst case.

Currently we are working on some rule optimizations for making it as practical for a variety of sample networks as the CSTN DC-checking algorithm.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 177 / 182

Page 262: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conditional STNs with Uncertainty (CSTNUs)
CSTNU Summary

The theory of dynamic controllability for CSTNUs has a solid foundation.
Two competing DC-checking algorithms∗, both exponential.

Propagation-based algorithms show promise, but require further investigation.

Alternatives to earliest-first strategy?

∗ [Hunsberger and Posenato, 2018a]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 178 / 182

Page 263: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

CDTNUs
Adding Disjunction to CSTNUs

A Conditional Disjunctive Temporal Network with Uncertainty (CDTNU) augments a CSTNU to include disjunctive constraints.

Possible to convert the DC-checking problem for CDTNUs into a controller-synthesis problem for Timed Game Automata (TGAs)∗.

Highlights connections between temporal networks and TGAs, but the algorithm is not yet practical.

∗ [Cimatti et al., 2016]

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 179 / 182

Page 264: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

CDTNUs
Sample Workflow

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 180 / 182

Page 265: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

CDTNUs
TGA Encoding of Workflow

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 181 / 182

Page 266: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

Conclusions

Theoretical foundations for a variety of temporal networks are quite solid.

STNs have been incorporated into planning and scheduling applications for over a decade.

O(n3)-time DC-checking and dispatchability algorithms for STNUs make them ready for prime time.

Propagation-based algorithms for CSTNs and CSTNUs show promise.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 182 / 182

Page 267: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

References I

Massimo Cairo, Carlo Comin, and Romeo Rizzi. Instantaneous reaction-time in dynamic-consistency checking of conditional simple temporal networks. In 23rd Int. Symp. on Temporal Representation and Reasoning (TIME 2016), pages 80–89, 2016. doi: 10.1109/TIME.2016.16.

Massimo Cairo, Luke Hunsberger, and Romeo Rizzi. Faster dynamic controllability checking for simple temporal networks with uncertainty. In 25th Int. Symp. on Temporal Representation and Reasoning (TIME 2018), volume 120 of LIPIcs, pages 8:1–8:16, 2018. doi: 10.4230/LIPIcs.TIME.2018.8.

Alessandro Cimatti, Luke Hunsberger, Andrea Micheli, Roberto Posenato, and Marco Roveri. Sound and complete algorithms for checking the dynamic controllability of temporal networks with uncertainty, disjunction and observation. In Andrea Cesta, Carlo Combi, and Francois Laroussinie, editors, 21st Int. Symp. on Temporal Representation and Reasoning (TIME 2014), pages 27–36, 2014. doi: 10.1109/TIME.2014.21.

Alessandro Cimatti, Luke Hunsberger, Andrea Micheli, Roberto Posenato, and Marco Roveri. Dynamic controllability via timed game automata. Acta Informatica, 53(6–8):681–722, 2016. ISSN 1432-0525. doi: 10.1007/s00236-016-0257-2.

Carlo Comin and Romeo Rizzi. Dynamic consistency of conditional simple temporal networks via mean payoff games: a singly-exponential time dc-checking. In 22nd Int. Symp. on Temporal Representation and Reasoning (TIME 2015), pages 19–28, 2015. doi: 10.1109/TIME.2015.18.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 1 / 6

Page 268: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

References II

Patrick R. Conrad and Brian C. Williams. Drake: An efficient executive for temporal plans with choice. Journal of Artificial Intelligence Research (JAIR), 42:607–659, 2011. URL http://dx.doi.org/10.1613/jair.3478.

Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein. Introduction to Algorithms. The MIT Press, 2001.

Rina Dechter, Itay Meiri, and J. Pearl. Temporal Constraint Networks. Artificial Intelligence, 49(1-3):61–95, 1991. doi: 10.1016/0004-3702(91)90006-6.

Shimon Even and Hillel Gazit. Updating Distances in Dynamic Graphs. Methods of Operations Research, 49:371–387, 1985.

Luke Hunsberger. Group Decision Making and Temporal Reasoning. PhD thesis, Harvard University, 2002. Available as Harvard Technical Report TR-05-02.

Luke Hunsberger. Fixing the semantics for dynamic controllability and providing a more practical characterization of dynamic execution strategies. In TIME 2009, pages 155–162, 2009. doi: 10.1109/TIME.2009.25.

Luke Hunsberger. A fast incremental algorithm for managing the execution of dynamically controllable temporal networks. In Nicolas Markey and Jef Wijsen, editors, Seventeenth Int. Symp. on Temporal Representation and Reasoning (TIME-2010), pages 121–128, 2010. doi: 10.1109/TIME.2010.16.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 2 / 6

Page 269: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

References III

Luke Hunsberger. Magic Loops in Simple Temporal Networks with Uncertainty – Exploiting Structure to Speed Up Dynamic Controllability Checking. In 5th Int. Conf. on Agents and Artificial Intelligence, ICAART 2013, volume 2, pages 157–170, 2013.

Luke Hunsberger. Magic Loops and the Dynamic Controllability of Simple Temporal Networks with Uncertainty. In Joaquim Filipe and Ana Fred, editors, Agents and Artificial Intelligence, volume 449 of Communications in Computer and Information Science (CCIS), pages 332–350. Springer, 2014.

Luke Hunsberger. Efficient execution of dynamically controllable simple temporal networks with uncertainty. Acta Informatica, 53(2):89–147, 2015. doi: 10.1007/s00236-015-0227-0.

Luke Hunsberger and Barbara J. Grosz. A combinatorial auction for collaborative planning. In 4th Int. Conf. on Multi-Agent Systems (ICMAS-2000), 2000.

Luke Hunsberger and Roberto Posenato. Checking the dynamic consistency of conditional temporal networks with bounded reaction times. In Amanda Jane Coles, Andrew Coles, Stefan Edelkamp, Daniele Magazzeni, and Scott Sanner, editors, 26th Int. Conf. on Automated Planning and Scheduling, ICAPS 2016, pages 175–183, 2016. URL http://www.aaai.org/ocs/index.php/ICAPS/ICAPS16/paper/view/13108.

Luke Hunsberger and Roberto Posenato. Sound-and-Complete Algorithms for Checking the Dynamic Controllability of Conditional Simple Temporal Networks with Uncertainty. In 25th Int. Symp. on Temporal Representation and Reasoning (TIME 2018), volume 120 of LIPIcs, pages 14:1–14:17, 2018a. doi: 10.4230/LIPIcs.TIME.2018.14.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 3 / 6

Page 270: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

References IV

Luke Hunsberger and Roberto Posenato. Simpler and faster algorithm for checking the dynamic consistency of conditional simple temporal networks. In 26th Int. Joint Conf. on Artificial Intelligence (IJCAI-2018), pages 1324–1330, 2018b. doi: 10.24963/ijcai.2018/184.

Luke Hunsberger, Roberto Posenato, and Carlo Combi. A sound-and-complete propagation-based algorithm for checking the dynamic consistency of conditional simple temporal networks. In Fabio Grandi, Martin Lange, and Alessio Lomuscio, editors, 22nd Int. Symp. on Temporal Representation and Reasoning (TIME 2015), pages 4–18, 2015. doi: 10.1109/TIME.2015.26.

James C. Boerkoel Jr. and Edmund H. Durfee. Decoupling the multiagent disjunctive temporal problem. In 27th AAAI Conf. on Artificial Intelligence, 2013.

Paul Morris. A Structural Characterization of Temporal Dynamic Controllability. In Principles and Practice of Constraint Programming (CP 2006), volume 4204, pages 375–389, 2006. doi: 10.1007/11889205_28.

Paul Morris. Dynamic controllability and dispatchability relationships. In Integration of AI and OR Techniques in Constraint Programming, volume 8451 of LNCS, pages 464–479. 2014. doi: 10.1007/978-3-319-07046-9_33.

Paul Morris. The mathematics of dispatchability revisited. In 26th Int. Conf. on Automated Planning and Scheduling (ICAPS-2016), pages 244–252, 2016.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 4 / 6

Page 271: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

References V

Paul H. Morris and Nicola Muscettola. Temporal dynamic controllability revisited. In 20th National Conf. on Artificial Intelligence (AAAI-05), pages 1193–1198, 2005.

Paul H. Morris, Nicola Muscettola, and Thierry Vidal. Dynamic control of plans with temporal uncertainty. In 17th Int. Joint Conf. on Artificial Intelligence (IJCAI-01), pages 494–502, 2001.

Kiriakos Simon Mountakis, Tomas Klos, and Cees Witteveen. Dynamic temporal decoupling. In 14th Int. Conf. Integration of AI and OR Techniques in Constraint Programming, volume 10335 of Lecture Notes in Computer Science (LNCS), pages 328–343, 2017. doi: 10.1007/978-3-319-59776-8_27.

Nicola Muscettola, Paul H. Morris, and Ioannis Tsamardinos. Reformulating Temporal Plans for Efficient Execution. In 6th Int. Conf. on Principles of Knowledge Representation and Reasoning (KR-98), pages 444–452, 1998.

Léon Planken. Incrementally Solving the STP by Enforcing Partial Path Consistency. In 27th PlanSIG Work., pages 87–94, 2008. URL https://pdfs.semanticscholar.org/fb41/21c05cc8a55301d385598f5c24d64cf703a4.pdf.

G. Ramalingam and Thomas Reps. On the Computational Complexity of Dynamic Graph Problems. Theoretical Computer Science, 158:233–277, 1996.

G. Ramalingam, J. Song, L. Joskowicz, and R. E. Miller. Solving Systems of Difference Constraints Incrementally. Algorithmica, 23(3):261–275, 1999. ISSN 1432-0541. doi: 10.1007/PL00009261.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 5 / 6

Page 272: Temporal Networks: STNs, STNUs, CSTNs and CSTNUs

References VI

Hans Rohnert. A Dynamization of the All Pairs Least Cost Path Problem. In Kurt Mehlhorn, editor, 2nd Symp. of Theoretical Aspects of Computer Science (STACS 85), volume 182 of Lecture Notes in Computer Science (LNCS), pages 279–286. Springer, 1985. doi: 10.1007/BFb0024016.

Ioannis Tsamardinos, Nicola Muscettola, and Paul Morris. Fast Transformation of Temporal Plans for Efficient Execution. In 15th National Conf. on Artificial Intelligence (AAAI-98), pages 254–261, 1998.

Ioannis Tsamardinos, Thierry Vidal, and Martha E. Pollack. CTP: A new constraint-based formalism for conditional, temporal planning. Constraints, 8:365–388, 2003. doi: 10.1023/A:1025894003623.

Lin Xu and Berthe Y. Choueiry. A new efficient algorithm for solving the simple temporal problem. In 10th Int. Symp. on Temporal Representation and Reasoning and 4th Int. Conf. on Temporal Logic (TIME-ICTL-2003), pages 210–220, 2003. doi: 10.1109/TIME.2003.1214898.

Luke Hunsberger Temporal Networks: STNs, STNUs, CSTNs and CSTNUs 6 / 6