Generative Models for Complex Network Structure
Aaron Clauset (@aaronclauset)
Computer Science Dept. & BioFrontiers Institute, University of Colorado, Boulder
External Faculty, Santa Fe Institute
4 June 2013, NetSci 2013: Complex Networks meets Machine Learning



Page 2

• what is structure?

• generative models for complex networks
  - general form
  - types of models
  - opportunities and challenges

• weighted stochastic block models
  - a parable about thresholding
  - checking our models
  - learning from data (approximately)

Page 3

what is structure?

• makes data different from noise; makes a network different from a random graph

Page 4

what is structure?

• makes data different from noise; makes a network different from a random graph

• helps us compress the data: describe the network succinctly, capture the most relevant patterns

Page 5

what is structure?

• makes data different from noise; makes a network different from a random graph

• helps us compress the data: describe the network succinctly, capture the most relevant patterns

• helps us generalize, from data we’ve seen to data we haven’t seen:
  i. from one part of the network to another
  ii. from one network to others of the same type
  iii. from small scale to large scale (coarse-grained structure)
  iv. from past to future (dynamics)

Page 6

statistical inference

• imagine graph G is drawn from an ensemble or generative model: a probability distribution P(G | θ) with parameters θ

• θ can be continuous or discrete; θ represents the structure of the graph

Page 7

statistical inference

• imagine graph G is drawn from an ensemble or generative model: a probability distribution P(G | θ) with parameters θ

• θ can be continuous or discrete; θ represents the structure of the graph

• inference (MLE): given G, find the θ that maximizes P(G | θ)

• inference (Bayes): compute or sample from the posterior distribution P(θ | G)

Page 8

statistical inference

• imagine graph G is drawn from an ensemble or generative model: a probability distribution P(G | θ) with parameters θ

• θ can be continuous or discrete; θ represents the structure of the graph

• inference (MLE): given G, find the θ that maximizes P(G | θ)

• inference (Bayes): compute or sample from the posterior distribution P(θ | G)

• if θ is partly known, constrain the inference and determine the rest

• if G is partly known, infer θ and use it to generate the rest

• if the model is a good fit (application dependent), we can generate synthetic graphs structurally similar to G

• if part of G has low probability under the model P(G | θ), flag it as a possible anomaly

Page 9

• what is structure?

• generative models for complex networks
  - general form
  - types of models
  - opportunities and challenges

• weighted stochastic block models
  - a parable about thresholding
  - checking our models
  - learning from data (approximately)

Page 10

generative models for complex networks

general form

$$P(G \mid \theta) = \prod_{i<j} P(A_{ij} \mid \theta)$$

assumptions about “structure” go into P(A_ij | θ)

consistency, $\lim_{n\to\infty} \Pr\big(\hat{\theta} \neq \theta\big) = 0$, requires that edges be conditionally independent [Shalizi, Rinaldo 2011]
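The general form above factors over vertex pairs. A minimal sketch of evaluating and maximizing such a factored likelihood, using the simplest edge-independent model (a single Erdős–Rényi parameter θ) and a hypothetical toy graph; the function names are illustrative, not from the talk:

```python
import math
from itertools import combinations

def log_likelihood(A, theta):
    """log P(G | theta) = sum_{i<j} log P(A_ij | theta) for the simplest
    edge-independent model: every edge is present with probability theta."""
    n = len(A)
    ll = 0.0
    for i, j in combinations(range(n), 2):
        ll += math.log(theta) if A[i][j] else math.log(1 - theta)
    return ll

# toy undirected graph: 4 nodes, 3 edges out of 6 possible pairs
A = [[0, 1, 1, 0],
     [1, 0, 1, 0],
     [1, 1, 0, 0],
     [0, 0, 0, 0]]

# MLE for this model: theta_hat = (# edges) / (# pairs) = 3/6
m = sum(A[i][j] for i, j in combinations(range(4), 2))
theta_hat = m / 6
assert abs(theta_hat - 0.5) < 1e-12

# theta_hat also maximizes the log-likelihood over a fine grid
best = max(log_likelihood(A, t / 100) for t in range(1, 100))
assert abs(best - log_likelihood(A, theta_hat)) < 1e-9
```

The same loop structure carries over to richer models: only the per-pair term P(A_ij | θ) changes.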

Page 11

[figure: a network with assortative modules, generated from a dendrogram D with connection probabilities {p_r}; each edge appears with probability p_r]

Page 12

hierarchical random graph

model → instance

Pr(i, j connected) = p_r, where r is the lowest common ancestor of i, j

Page 13

$$L(D, \{p_r\}) = \prod_r p_r^{E_r} (1 - p_r)^{L_r R_r - E_r}$$

L_r = number of nodes in the left subtree of r
R_r = number of nodes in the right subtree of r
E_r = number of edges with r as lowest common ancestor
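A sketch of this likelihood evaluated at the per-node MLE p̂_r = E_r / (L_r R_r), on a hypothetical four-leaf dendrogram; the dict representation below is illustrative, not the data structure from the referenced papers:

```python
import math

# A dendrogram D over leaves {0,1,2,3}: each internal node r stores
# the leaf sets of its left and right subtrees (toy example).
D = [
    {"left": {0, 1}, "right": {2, 3}},   # root
    {"left": {0},    "right": {1}},
    {"left": {2},    "right": {3}},
]

edges = {(0, 1), (2, 3), (0, 2)}  # observed graph

def hrg_log_likelihood(D, edges):
    """log L(D, {p_r}) = sum_r [E_r log p_r + (L_r R_r - E_r) log(1 - p_r)],
    evaluated at the MLE p_r = E_r / (L_r R_r)."""
    ll = 0.0
    for r in D:
        L, R = len(r["left"]), len(r["right"])
        # E_r: edges whose endpoints straddle r, i.e. r is their lowest
        # common ancestor (both endpoints inside r's subtree, one per side)
        E = sum(1 for (i, j) in edges
                if {i, j} <= r["left"] | r["right"]
                and (i in r["left"]) != (j in r["left"]))
        p = E / (L * R)
        if 0 < p < 1:
            ll += E * math.log(p) + (L * R - E) * math.log(1 - p)
        # p in {0, 1} contributes 0 to the log-likelihood
    return ll
```

Here the root has L_r R_r = 4 possible cross edges and E_r = 1, so its MLE is p_r = 1/4; the two lower nodes have p_r = 1 and contribute nothing.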

Page 14

classes of generative models

• stochastic block models
  k types of vertices; P(A_ij | z_i, z_j) depends only on the types of i, j
  originally invented by sociologists [Holland, Laskey, Leinhardt 1983]

  many, many flavors, including:
  mixed-membership SBM [Airoldi, Blei, Fienberg, Xing 2008]
  hierarchical SBM [Clauset, Moore, Newman 2006, 2008]
  restricted hierarchical SBM [Leskovec, Chakrabarti, Kleinberg, Faloutsos 2005]
  infinite relational model [Kemp, Tenenbaum, Griffiths, Yamada, Ueda 2006]
  restricted SBM [Hofman, Wiggins 2008]
  degree-corrected SBM [Karrer, Newman 2011]
  SBM + topic models [Ball, Karrer, Newman 2011]
  SBM + vertex covariates [Mariadassou, Robin, Vacher 2010]
  SBM + edge weights [Aicher, Jacobs, Clauset 2013]
  + many others

Page 15

classes of generative models

• latent space models
  nodes live in a latent space; P(A_ij | f(x_i, x_j)) depends only on vertex-vertex proximity

  many, many flavors, including:
  logistic function on vertex features [Hoff, Raftery, Handcock 2002]
  social status / ranking [Ball, Newman 2013]
  nonparametric metadata relations [Kim, Hughes, Sudderth 2012]
  multiple attribute graphs [Kim, Leskovec 2010]
  nonparametric latent feature model [Miller, Griffiths, Jordan 2009]
  infinite multiple memberships [Morup, Schmidt, Hansen 2011]
  ecological niche model [Williams, Anandanadesan, Purves 2010]
  hyperbolic latent spaces [Boguna, Papadopoulos, Krioukov 2010]

[image: page 1094 of Hoff, Raftery & Handcock, “Latent Space Approaches to Social Network Analysis,” Journal of the American Statistical Association, December 2002, showing maximum likelihood and Bayesian latent-position estimates for the monk network and the Florentine-families marriage network]

Page 16

opportunities and challenges

• richly annotated data
  edge weights, node attributes, time, etc. = new classes of generative models

• generalize from n = 1 graphs to an ensemble
  useful for model checking, simulating other processes, etc.

• many familiar techniques
  frequentist and Bayesian frameworks make probabilistic statements about observations and models
  predicting missing links, leave-k-out cross validation
  approximate inference techniques (EM, VB, BP, etc.)
  sampling techniques (MCMC, Gibbs, etc.)

• learn from partial or noisy data
  extrapolation, interpolation, hidden data, missing data

Page 17

opportunities and challenges

• only two classes of models
  stochastic block models
  latent space models

• bootstrap / resampling for network data
  critical missing piece; depends on what is independent in the data

• model comparison
  naive AIC, BIC, marginalization, LRT can be wrong for networks
  what is the goal of modeling: realistic representation or accurate prediction?

• model assessment / checking?
  how do we know a model has done well? what do we check?

• what is v-fold cross-validation for networks? Omit n²/v edges? Omit n/v nodes? What?

Page 18

• what is structure?

• generative models for complex networks
  - general form
  - types of models
  - opportunities and challenges

• weighted stochastic block models
  - a parable about thresholding
  - learning from data (approximately)
  - checking our models

Page 19

stochastic block models

functional groups, not just clumps

• social “communities” (large, small, dense or empty)

• social: leaders and followers

• word adjacencies: adjectives and nouns

• economics: suppliers and customers

Page 20

classic stochastic block model

nodes have discrete attributes
• each vertex i has a type t_i ∈ {1, …, k}
• a k × k matrix p of connection probabilities
• if t_i = r and t_j = s, edge (i → j) exists with probability p_rs
• not necessarily symmetric, and we do not assume p_rr > p_rs
• given some G, we want to simultaneously
  label the nodes (infer the type assignment t : V → {1, …, k})
  learn the latent matrix p

Page 21

classic stochastic block model

[figure: model (6 × 6 block matrix) and a sampled instance with assortative modules]

likelihood:

$$P(G \mid t, p) = \prod_{(i,j) \in E} p_{t_i, t_j} \prod_{(i,j) \notin E} \big(1 - p_{t_i, t_j}\big)$$
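A sketch of evaluating this likelihood for an undirected graph with known type assignments; the toy matrix and labels are hypothetical, not from the talk:

```python
import math
from itertools import combinations

def sbm_log_likelihood(A, t, p):
    """log P(G | t, p): for each pair i<j, add log p[t_i][t_j] if the
    edge is present and log(1 - p[t_i][t_j]) otherwise (undirected
    version of the classic SBM likelihood)."""
    n = len(A)
    ll = 0.0
    for i, j in combinations(range(n), 2):
        q = p[t[i]][t[j]]
        ll += math.log(q) if A[i][j] else math.log(1 - q)
    return ll

# two assortative modules: dense within groups, sparse between (toy numbers)
p = [[0.9, 0.1],
     [0.1, 0.9]]
t = [0, 0, 1, 1]
A = [[0, 1, 0, 0],
     [1, 0, 0, 0],
     [0, 0, 0, 1],
     [0, 0, 1, 0]]

# both within-group edges present, all cross-group pairs absent:
# every pair contributes log(0.9), so the total is 6 log(0.9)
ll = sbm_log_likelihood(A, t, p)
assert abs(ll - 6 * math.log(0.9)) < 1e-9
```

Fitting the model means searching over assignments t and matrices p to maximize this quantity.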

Page 22

thresholding edge weights

• 4 groups
• edge weights ~ N(μ_i, σ²) with μ_1 < μ_2 < μ_3 < μ_4
• what threshold t should we choose? t = 1, 2, 3, 4

Page 23

• 4 groups
• edge weights ~ N(μ_i, σ²) with μ_1 < μ_2 < μ_3 < μ_4
• set threshold t = 1, fit SBM

Page 24

• 4 groups
• edge weights ~ N(μ_i, σ²) with μ_1 < μ_2 < μ_3 < μ_4
• set threshold t = 2, fit SBM

Page 25

• 4 groups
• edge weights ~ N(μ_i, σ²) with μ_1 < μ_2 < μ_3 < μ_4
• set threshold t = 3, fit SBM

Page 26

• 4 groups
• edge weights ~ N(μ_i, σ²) with μ_1 < μ_2 < μ_3 < μ_4
• set threshold t ≥ 4, fit SBM

Page 27

weighted stochastic block model

adding auxiliary information: each edge has weight w(i, j); let

$$w(i, j) \sim f(x \mid \theta) = h(x) \exp\big(T(x) \cdot \eta(\theta)\big)$$

covers all exponential-family type distributions:
bernoulli, binomial (classic SBM), multinomial
poisson, beta
exponential, power law, gamma
normal, log-normal, multivariate normal
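As a concrete instance of this exponential-family form: a Normal weight distribution has sufficient statistics T(x) = (x, x²) and natural parameters η(θ) = (μ/σ², −1/(2σ²)). The sketch below writes the log-partition term out explicitly (the slide's compact notation absorbs it into h and η) and checks that the two forms of the density agree:

```python
import math

def normal_pdf_expfam(x, mu, s2):
    """N(mu, s2) in exponential-family form:
       f(x) = h(x) * exp(T(x) . eta(theta) - A(theta))
    with T(x) = (x, x^2), eta = (mu/s2, -1/(2 s2)),
    h(x) = 1/sqrt(2 pi), and A = mu^2/(2 s2) + (1/2) ln s2."""
    T = (x, x * x)
    eta = (mu / s2, -1.0 / (2.0 * s2))
    A = mu * mu / (2.0 * s2) + 0.5 * math.log(s2)
    h = 1.0 / math.sqrt(2.0 * math.pi)
    return h * math.exp(T[0] * eta[0] + T[1] * eta[1] - A)

def normal_pdf_direct(x, mu, s2):
    """The usual textbook form of the Normal density."""
    return math.exp(-(x - mu) ** 2 / (2 * s2)) / math.sqrt(2 * math.pi * s2)

# the two parameterizations give identical densities
for x in (-1.0, 0.0, 0.5, 3.0):
    assert abs(normal_pdf_expfam(x, 1.0, 2.0) - normal_pdf_direct(x, 1.0, 2.0)) < 1e-12
```

This separation into sufficient statistics T(x) and natural parameters η(θ) is what lets one inference algorithm cover all the listed weight distributions.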

Page 28

weighted stochastic block model

adding auxiliary information: each edge has weight w(i, j); let

$$w(i, j) \sim f(x \mid \theta) = h(x) \exp\big(T(x) \cdot \eta(\theta)\big)$$

examples of weighted graphs:
frequency of social interactions (calls, texts, proximity, etc.)
cell-tower traffic volume
other similarity measures
time-varying attributes
missing edges, active learning, etc.

Page 29

weighted stochastic block model

given G (weighted graph) and a choice of f (weight distribution), learn z (block assignment) and θ

block structure: R : k × k → {1, …, R}

likelihood function:

$$P(G \mid z, \theta, f) = \prod_{i<j} f\big(G_{ij} \mid \theta_{R(z_i, z_j)}\big)$$

technical difficulties: degeneracies in the likelihood function (variance can go to zero. oops)
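A sketch of evaluating this likelihood with Normal f and the simplest block structure, R(z_i, z_j) = (z_i, z_j), i.e. one parameter bundle per pair of blocks; the toy weights and parameters are hypothetical:

```python
import math
from itertools import combinations

def normal_logpdf(x, mu, s2):
    return -0.5 * math.log(2 * math.pi * s2) - (x - mu) ** 2 / (2 * s2)

def wsbm_log_likelihood(W, z, theta):
    """log P(G | z, theta, f) = sum_{i<j} log f(W_ij | theta_{z_i, z_j}),
    with f Normal and theta[r][s] = (mu, sigma^2). Complete-graph
    convention: every pair carries a weight."""
    n = len(W)
    ll = 0.0
    for i, j in combinations(range(n), 2):
        mu, s2 = theta[z[i]][z[j]]
        ll += normal_logpdf(W[i][j], mu, s2)
    return ll

# two blocks: heavy weights within, light weights between (toy values)
theta = [[(5.0, 1.0), (0.0, 1.0)],
         [(0.0, 1.0), (5.0, 1.0)]]
z = [0, 0, 1, 1]
W = [[0, 5, 0, 0],
     [5, 0, 0, 0],
     [0, 0, 0, 5],
     [0, 0, 5, 0]]

# every pair sits exactly at its block mean, so each of the 6 terms
# is -0.5 * log(2 pi)
ll = wsbm_log_likelihood(W, z, theta)
assert abs(ll + 3 * math.log(2 * math.pi)) < 1e-9
```

The degeneracy mentioned above is visible here: if a block pair's weights are all identical, the MLE drives σ² → 0 and the log-likelihood diverges, which is what the conjugate priors on the next slide repair.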

Page 30

approximate learning

• edge generative model P(G | z, θ, f)

• estimate the model via variational Bayes

• conjugate priors solve the degeneracy problem

• algorithms for dense and sparse graphs

Page 31

dense weighted SBM

approximate the posterior distribution:

$$\pi^*(z, \theta \mid G) \approx q(z, \theta) = \prod_i q_i(z_i) \prod_r q(\theta_r)$$

estimate q by minimizing

$$D_{KL}(q \,\|\, \pi^*) = \ln P(G \mid f) - \mathcal{G}(q)$$

where

$$\mathcal{G}(q) = E_q(\mathcal{L}) + E_q\!\left[\log \frac{\pi(z, \theta)}{q(z, \theta)}\right]$$

for a (conjugate) prior π for the exponential-family distribution f

taking derivatives yields update equations for z, θ

iterating the equations yields local optima

Page 32

checking the model

synthetic network with known structure

• given a synthetic graph with known structure

• run the VB algorithm to convergence

• compare against choose-a-threshold + SBM (and others)

• compute Variation of Information (a partition distance), VI(P1, P2) ∈ [0, ln N]

• in this case VI(P1, P2) ∈ [0, ln k* + 1.5] = [0, 3.1]
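Variation of Information can be computed directly from label counts. A small sketch, with partitions given as lists of group labels per node:

```python
import math
from collections import Counter

def variation_of_information(P1, P2):
    """VI(P1, P2) = H(P1) + H(P2) - 2 I(P1; P2), a metric on partitions
    of the same node set, computed cell-by-cell from the joint counts."""
    n = len(P1)
    c1, c2 = Counter(P1), Counter(P2)
    joint = Counter(zip(P1, P2))
    vi = 0.0
    for (a, b), nab in joint.items():
        p_ab = nab / n
        # each joint cell contributes p_ab * (log p_a/p_ab + log p_b/p_ab)
        vi += p_ab * (math.log(c1[a] / nab) + math.log(c2[b] / nab))
    return vi

# identical partitions (up to relabeling) are at distance 0
assert variation_of_information([0, 0, 1, 1], [1, 1, 0, 0]) == 0.0

# the one-group and all-singletons partitions are ln N apart,
# the upper end of the [0, ln N] range quoted above
assert abs(variation_of_information([0, 0, 0, 0], [0, 1, 2, 3]) - math.log(4)) < 1e-12
```

Because VI is invariant to relabeling, it compares an inferred block assignment to the planted one without matching group labels first.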

Page 33

checking the model

synthetic network with known structure

• variation of Newman’s four-groups test

• k* = 5 latent groups, with sizes n_r = [48, 16, 32, 48, 16]

• Normal edge weights: f = N(μ_r, σ_r²)

• in this case VI(P1, P2) ∈ [0, ln k* + 1.5] = [0, 3.1]

Page 34

learn better with more data

increase network size N

• fix k = k*, f = Normal

• bigger network, more data

• we keep the proportions n_r/N constant

Page 35

learn better with more data

increase network size N

[plot: Variation of Information vs number of nodes, for WSBM, SBM+Thresh, K-means, Cluster(Max), Cluster(Avg)]

• fix k = k*, f = Normal; we keep the proportions n_r/N constant

• bigger network, more data

• WSBM converges on the correct solution more quickly

• thresholding + SBM is particularly bad

Page 36

learning the number of groups

vary the number of groups found, k

• fix f = Normal

• too few / too many blocks?

Page 37

learning the number of groups

vary the number of groups found, k

[plot: Variation of Information vs number of groups K, for WSBM, SBM+Thresh, K-means, Cluster(Max), Cluster(Avg)]

• fix f = Normal

• too few / too many blocks?

• WSBM converges on the correct solution

• WSBM fails gracefully when k > k*

• others do poorly

In fact, Bayesian marginalization will correctly choose k = k* in this case.

Page 38

learning despite noise

increase the variance σ_r² in edge weights

• fix k = k*, f = Normal

• bigger variance, less signal

Page 39

learning despite noise

increase the variance σ_r² in edge weights

[plot: Variation of Information vs variance, for WSBM, SBM+Thresh, K-means, Cluster(Max), Cluster(Avg)]

• fix k = k*, f = Normal

• bigger variance, less signal

• WSBM fails more gracefully than the alternatives, even for very high variance

• thresholding + SBM is particularly bad

Page 40

comments

• single-scale structural inference
  mixtures of assortative, disassortative groups

• inference is cheap (VB)
  approximate inference works well

• thresholding edge weights is bad, bad, bad
  one threshold (SBM) vs. many (WSBM)

• generalizations also for sparse graphs, degree-corrections, etc.

Page 41

generative models

• auxiliary information
  node & edge attributes, temporal dynamics (beyond static binary graphs)

• scalability
  fast algorithms for fitting models to big data (methods from physics, machine learning)

• model selection
  which model is better? is this model bad? how many communities?

• model checking
  have we learned correctly? check via generating synthetic networks

• partial or noisy data
  extrapolation, interpolation, hidden data, missing data

• anomaly detection
  low-probability events under the generative model


Page 43

acknowledgments

Thanks: Cris Moore (Santa Fe), Mark Newman (Michigan), Cosma Shalizi (Carnegie Mellon), Christopher Aicher (Colorado), Abigail Jacobs (Colorado)

Funding from

some references

• Aicher, Jacobs, Clauset, “Adapting the Stochastic Block Model to Edge-Weighted Networks.” ICML (2013)

• Moore, Yan, Zhu, Rouquier, Lane, “Active learning for node classification in assortative and disassortative networks.” KDD (2011)

• Park, Moore, Bader, “Dynamic Networks from Hierarchical Bayesian Graph Clustering.” PLoS ONE 5(1): e8118 (2010)

• Clauset, Newman, Moore, “Hierarchical structure and the prediction of missing links in networks.” Nature 453, 98-101 (2008)

• Clauset, Moore, Newman, “Structural Inference of Hierarchies in Networks.” ICML (2006)

43

Page 44: Generative Models for Complex Network Structuretuhe/cnmml/pdf/clauset.pdf · statistical inference • imagine graph G is drawn from an ensemble or generative model: a probability

fin
