
Page 1: From Memory to Problem Solving: Mechanism Reuse in a Graphical Cognitive Architecture

From Memory to Problem Solving: Mechanism Reuse in a Graphical Cognitive Architecture
Paul S. Rosenbloom | 8/5/2011

The projects or efforts depicted were or are sponsored by the U.S. Army Research, Development, and Engineering Command (RDECOM) Simulation Training and Technology Center (STTC) and the Air Force Office of Scientific Research, Asian Office of Aerospace Research and Development (AFOSR/AOARD). The content or information presented does not necessarily reflect the position or the policy of the Government, and no official endorsement should be inferred.

Page 2: From Memory to Problem Solving: Mechanism Reuse in a Graphical Cognitive Architecture

Cognitive Architecture

Soar 3-8:
– Symbolic working memory
– Long-term memory of rules
– Decide what to do next based on preferences generated by rules
– Reflect when it can't decide
– Learn results of reflection
– Interact with the world

Cognitive architecture: a hypothesis about the fixed structure underlying intelligent behavior
– Defines core memories, reasoning processes, learning mechanisms, external interfaces, etc.
– Yields intelligent behavior when knowledge and skills are added
– May serve as:
  – a Unified Theory of Cognition
  – the core of virtual humans and intelligent agents or robots
  – the basis for artificial general intelligence

[Image: ICT 2010]

Page 3: From Memory to Problem Solving: Mechanism Reuse in a Graphical Cognitive Architecture

Diversity Dilemma

How to build architectures that combine:
– Theoretical elegance, simplicity, maintainability, extendibility
– Broad scope of capability and applicability
  – Embodying a superset of existing architectural capabilities
    – Cognitive, perceptuomotor, emotive, social, adaptive, …

[Diagram labels: Soar 3-8, Soar 9; Graphical Architecture with Hybrid Short-Term Memory, Hybrid Mixed Long-Term Memory, Decision, Prediction-Based Learning]

Page 4: From Memory to Problem Solving: Mechanism Reuse in a Graphical Cognitive Architecture

Goals of This Work

Extend graphical memory architecture to (Soar-like) problem solving
– Operator generation, evaluation, selection and application
– Reuse existing memory mechanisms, based on graphical models, as much as possible

Evaluate ability to extend architectural functionality while retaining simplicity and elegance
– Evidence for ability of approach to resolve diversity dilemma

Page 5: From Memory to Problem Solving: Mechanism Reuse in a Graphical Cognitive Architecture

Problem Solving in Soar

Base level
– Generate, evaluate, select and apply operators (a minimal code sketch follows the diagram below)
  – Generation: retractable rule firing – LTM(WM) → WM
  – Evaluation: retractable rule firing – LTM(WM) → PM (preferences)
  – Selection: decision procedure – PM(WM) → WM
  – Application: latched rule firing – LTM(WM) → WM

Meta level (not focus here)

[Diagram: LTM, PM and WM, with Generation, Evaluation, Application and Selection (decision D). Decision cycle = elaboration cycles + a decision; elaboration cycle = parallel rule match + firing; match cycle = pass a token within the Rete rule-match network.]
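The code sketch below is a minimal, hypothetical Python rendering of this base-level decision cycle; the rule and decide callables and the tuple-based WM are illustrative stand-ins, not Soar's actual data structures or API.

# Minimal sketch of a Soar-like base-level decision cycle (illustrative only).
def decision_cycle(rules, wm, decide):
    """rules: callables mapping WM to (wm_additions, preferences);
    wm: set of fact tuples; decide: picks an operator from preferences."""
    pm = set()                                  # preference memory
    while True:                                 # elaboration cycles, run to quiescence
        add_wm, add_pm = set(), set()
        for rule in rules:                      # parallel match + firing (sequential here)
            new_wm, new_pm = rule(wm)
            add_wm |= set(new_wm)
            add_pm |= set(new_pm)
        if add_wm <= wm and add_pm <= pm:       # quiescence reached
            break
        wm |= add_wm                            # generation (retraction not modeled)
        pm |= add_pm                            # evaluation: preferences
    op = decide(pm)                             # selection: decision procedure
    wm.add(("selected", op))                    # application: latch result into WM
    return op, wm

# Toy usage: propose two operators and prefer B.
propose = lambda wm: ([("acceptable", "A"), ("acceptable", "B")], [])
prefer = lambda wm: ([], [("best", "B")])
best = lambda pm: next(op for kind, op in pm if kind == "best")
print(decision_cycle([propose, prefer], set(), best))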

Page 6: From Memory to Problem Solving: Mechanism Reuse in a Graphical Cognitive Architecture

Graphical Models

Enable efficient computation over multivariate functions by decomposing them into products of subfunctions
– Bayesian/Markov networks, Markov/conditional random fields, factor graphs

Yield broad capability from a uniform base
– State-of-the-art performance across symbols, probabilities and signals via uniform representation and reasoning algorithm
  – (Loopy) belief propagation, forward-backward algorithm, Kalman filters, Viterbi algorithm, FFT, turbo decoding, arc-consistency and production match, …
– Support mixed and hybrid processing
– Several neural network models map onto them

[Figures: a Bayesian network over u, w, x, y, z and the corresponding factor graph with factor nodes f1, f2, f3]

p(u,w,x,y,z) = p(u)p(w)p(x|u,w)p(y|x)p(z|x)
f(u,w,x,y,z) = f1(u,w,x)f2(x,y,z)f3(z)
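A small numpy sketch (with made-up toy sizes and random factor tables) of why the factorization matters: the marginal over x can be computed from the factored form without ever materializing the full joint over u, w, x, y, z.

# Exploit f(u,w,x,y,z) = f1(u,w,x) f2(x,y,z) f3(z) to marginalize x efficiently.
import numpy as np

U, W, X, Y, Z = 2, 2, 3, 2, 2                    # toy domain sizes
f1 = np.random.rand(U, W, X)
f2 = np.random.rand(X, Y, Z)
f3 = np.random.rand(Z)

m1 = f1.sum(axis=(0, 1))                         # sum out u, w locally in f1
m2 = (f2 * f3[None, None, :]).sum(axis=(1, 2))   # sum out y, z locally in f2*f3
marginal_x = m1 * m2                             # combine the two local summaries
print(marginal_x / marginal_x.sum())             # normalized marginal over x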

Page 7: From Memory to Problem Solving: Mechanism Reuse in a Graphical Cognitive Architecture

The Graphical Architecture:
Factor Graphs and the Summary Product Algorithm

The summary product algorithm processes messages on links
– Messages are distributions over the domains of the variables on the link
– At variable nodes, incoming messages are combined via pointwise product
– At factor nodes, the product of the incoming messages is multiplied by the factor function, and all variables not in the output are then summarized out

[Figure: the factor graph for f(u,w,x,y,z) = f1(u,w,x)f2(x,y,z)f3(z), annotated with messages; e.g., the pointwise product of incoming messages [.2, .4, .1] and [.3, .2, .1] at a variable node yields [.06, .08, .01]]

A single settling of the graph can efficiently compute:
– Variable marginals
– Maximum a posteriori (MAP) probabilities
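As a concrete illustration of the factor-node step (a sketch with arbitrary toy numbers, not the architecture's actual code), the outgoing message to z from factor f2 is the incoming messages times the factor function, with x and y summarized out; replacing sum with max gives the MAP variant.

# Summary product at a factor node: multiply incoming messages into the factor,
# then summarize out every variable not on the output link.
import numpy as np

f2 = np.random.rand(2, 3, 4)                 # toy factor f2(x, y, z)
msg_x = np.array([0.2, 0.8])                 # incoming message over x
msg_y = np.array([0.1, 0.6, 0.3])            # incoming message over y

product = f2 * msg_x[:, None, None] * msg_y[None, :, None]
msg_to_z = product.sum(axis=(0, 1))          # sum out x and y (use .max for MAP)
print(msg_to_z)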

Page 8: From Memory to Problem Solving: Mechanism Reuse in a Graphical Cognitive Architecture

A Hybrid Mixed Function/Message Representation

Represent both messages and factor functions as multidimensional continuous functions
– Approximated as piecewise linear over rectilinear regions

Discretize the domain for discrete distributions and symbols
– [1,2>=.2, [2,3>=.5, [3,4>=.3, …

Booleanize the range (and add a symbol table) for symbols
– [0,1>=1 (Color(x, Red)=True), [1,2>=0 (Color(x, Green)=False)

y\x        [0,10>    [10,25>    [25,50>
[0,5>      0         .2y        0
[5,15>     .5x       1          .1+.2x+.4y
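A minimal sketch of the table above, assuming a simple dict-of-regions encoding (which is not necessarily how the architecture stores functions): each rectilinear region holds the coefficients of a linear function c + cx*x + cy*y.

# Piecewise-linear function over rectilinear regions, matching the table above.
regions = {
    ((0, 10), (0, 5)): (0.0, 0.0, 0.0),      # 0
    ((10, 25), (0, 5)): (0.0, 0.0, 0.2),     # .2y
    ((25, 50), (0, 5)): (0.0, 0.0, 0.0),     # 0
    ((0, 10), (5, 15)): (0.0, 0.5, 0.0),     # .5x
    ((10, 25), (5, 15)): (1.0, 0.0, 0.0),    # 1
    ((25, 50), (5, 15)): (0.1, 0.2, 0.4),    # .1 + .2x + .4y
}

def evaluate(x, y):
    for ((x0, x1), (y0, y1)), (c, cx, cy) in regions.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return c + cx * x + cy * y
    return 0.0                                # outside all regions

print(evaluate(12, 3))                        # region [10,25> x [0,5>: .2y = 0.6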

Page 9: From Memory to Problem Solving: Mechanism Reuse in a Graphical Cognitive Architecture

Graphical Memory Architecture

Developed a general knowledge representation layer on top of factor graphs and the summary product algorithm

Differentiates long-term and working memories
– Long-term memory defines a graph
– Working memory specifies peripheral factor nodes

Working memory consists of instances of predicates
– (Next ob1:O1 ob2:O2), (weight object:O1 value:10)
– Provides fixed evidence for a single settling of the graph

Long-term memory consists of conditionals
– Generalized rules defined via predicate patterns and functions
  – Patterns define conditions, actions and condacts (a neologism)
  – Functions are mixed hybrid over pattern variables in conditionals
– Each predicate induces its own working memory node

Page 10: From Memory to Problem Solving: Mechanism Reuse in a Graphical Cognitive Architecture

Conditionals

CONDITIONAL Transitive
   conditions: (Next ob1:a ob2:b)
               (Next ob1:b ob2:c)
   actions:    (Next ob1:a ob2:c)

[Graph fragment: WM, Pattern and Join nodes for the Transitive conditional]

CONDITIONAL Concept-Weight
   condacts: (concept object:O1 class:c)
             (weight object:O1 value:w)
   function:

   w\c         Walker       Table          …
   [1,10>      .01w         .001w          …
   [10,20>     .2-.01w      "              …
   [20,50>     0            .025-.00025w   …
   [50,100>    "            "              …

[Graph fragment: WM, Pattern, Join and Function nodes for the Concept-Weight conditional]

– Conditions test WM
– Actions propose changes to WM
– Condacts test and change WM
– Functions modulate variables
– All four can be freely mixed
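To make the anatomy of a conditional concrete, here is a hypothetical Python rendering of the two conditionals above as plain data; the field names mirror the slide's terminology rather than the architecture's actual internal representation.

from dataclasses import dataclass, field

@dataclass
class Conditional:
    name: str
    conditions: list = field(default_factory=list)  # patterns that test WM
    actions: list = field(default_factory=list)     # patterns that propose WM changes
    condacts: list = field(default_factory=list)    # patterns that test and change WM
    function: dict = field(default_factory=dict)    # function over pattern variables

transitive = Conditional(
    name="Transitive",
    conditions=[("Next", {"ob1": "a", "ob2": "b"}), ("Next", {"ob1": "b", "ob2": "c"})],
    actions=[("Next", {"ob1": "a", "ob2": "c"})],
)

concept_weight = Conditional(
    name="Concept-Weight",
    condacts=[("concept", {"object": "O1", "class": "c"}),
              ("weight", {"object": "O1", "value": "w"})],
    function={(("w", (1, 10)), ("c", "Walker")): ".01w"},  # one cell of the table above
)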

Page 11: From Memory to Problem Solving: Mechanism Reuse in a Graphical Cognitive Architecture

Memory Capabilities Implemented

– A rule-based procedural memory
– Semantic and episodic declarative memories
  – Semantic: based on cued object features, statistically predict the object's concept plus all uncued features
– A constraint memory
– Beginnings of an imagery memory

[Figures: the Transitive and Concept-Weight conditionals (with function table) from the previous slide, and a semantic memory graph over Concept (S), Legs (D), Mobile (B), Weight (C), Color (S) and Alive (B)]

Page 12: From Memory to Problem Solving: Mechanism Reuse in a Graphical Cognitive Architecture

Additional Aspects Relevant to Problem Solving:
Open World versus Closed World Predicates

Predicates may be open world or closed world
– Do unspecified WM regions default to false (0) or unknown (1)?
– A key distinction between declarative and procedural memory

Open world allows changes within a graph cycle
– Predicts unknown values within a graph cycle
– Chains within a graph cycle
– Retracts when the WM basis changes

Closed world only changes across cycles
– Chains only across graph cycles
– Latches results in WM
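A small sketch of the 0-versus-1 default, assuming a discretized predicate domain represented as a numpy vector (the helper name is illustrative, not part of the architecture):

# Unspecified WM regions default to false (0) for closed-world predicates
# and to unknown (1) for open-world predicates.
import numpy as np

def wm_evidence(domain_size, known, open_world):
    default = 1.0 if open_world else 0.0
    evidence = np.full(domain_size, default)
    for index, value in known.items():            # explicitly specified regions
        evidence[index] = value
    return evidence

known = {0: 0.0, 2: 1.0}                          # value 0 known false, value 2 known true
print(wm_evidence(4, known, open_world=False))    # [0. 0. 1. 0.]
print(wm_evidence(4, known, open_world=True))     # [0. 1. 1. 1.]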

Page 13: From Memory to Problem Solving: Mechanism Reuse in a Graphical Cognitive Architecture

Additional Aspects Relevant to Problem Solving:
Universal versus Unique Variables

Predicate variables may be universal or unique (see the sketch below)

Universal variables act like rule variables
– Determine all matching values
– Actions insert all (non-negated) results into WM, and delete all negated results from WM

Unique variables act like random variables
– Determine a distribution over the best value
– Actions insert only a single best value into WM
– Negations clamp values to 0

[Diagram: action combination subgraph: Join, Negate, combine (+), WM changes]
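A toy numpy sketch of the difference, for an action's outgoing message over a variable's discretized domain (the numbers are arbitrary):

# Universal variables insert all (non-negated) matches; unique variables insert
# only a single best value from the distribution.
import numpy as np

message = np.array([0.0, 0.3, 0.0, 0.7])         # distribution over domain values 0..3

universal_results = np.flatnonzero(message > 0)  # all matching values -> [1, 3]
unique_result = int(np.argmax(message))          # single best value   -> 3

print(universal_results, unique_result)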

Page 14: From Memory to Problem Solving: Mechanism Reuse in a Graphical Cognitive Architecture

Additional Aspects Relevant to Problem Solving:
Link Memory

The last message sent along each link in the graph is cached on the link
– Forms a set of link memories that last until messages change
– Subsumes the alpha and beta memories in a Rete-like rule match cycle

Page 15: From Memory to Problem Solving: Mechanism Reuse in a Graphical Cognitive Architecture

Problem Solving in the Graphical Architecture

Base level
– Generate, evaluate, select and apply operators
  – Generation: (retractable) open world actions – LTM(WM) → WM
  – Evaluation: (retractable) actions + functions – LTM(WM) → LM
  – Selection: unique variables – LM(WM) → WM
  – Application: (latched) closed world actions – LTM(WM) → WM

Meta level (not focus here)

[Diagram: LTM, LM (link memory) and WM, with Generation, Evaluation, Selection and Application. Graph cycle = message cycles + a WM change; message cycle = process a message within the factor graph.]

Page 16: From Memory to Problem Solving: Mechanism Reuse in a Graphical Cognitive Architecture


Eight Puzzle Results

Preferences encoded via functions and negations

Total of 19 conditionals* to solve simple problems in a Soar-like fashion (without reflection)
– 747 nodes (404 variable, 343 factor) and 829 links
– Sample problem takes 6220 messages over 9 decisions (13 sec)

CONDITIONAL goal-best   ; Prefer operator that moves a tile into its desired location
   :conditions (blank state:s cell:cb)
               (acceptable state:s operator:ct)
               (location cell:ct tile:t)
               (goal cell:cb tile:t)
   :actions    (selected state:s operator:ct)
   :function   10

CONDITIONAL previous-reject   ; Reject previously moved operator
   :conditions (acceptable state:s operator:ct)
               (previous state:s operator:ct)
   :actions    (selected - state:s operator:ct)

Page 17: From Memory to Problem Solving: Mechanism Reuse in a Graphical Cognitive Architecture


Conclusion

Soar-like base-level problem solving grounds directly in the mechanisms of the graphical memory architecture
– Factor graphs and conditionals → knowledge in problem solving
– Summary product algorithm → processing
– Mixed functions → symbolic and numeric preferences
– Link memories → preference memory
– Open world vs. closed world → generation vs. application
– Universal vs. unique → generation vs. selection

Almost total reuse augurs well for the diversity dilemma
– Only addition was an architectural selected predicate for operators

Also progressing on other forms of problem solving
– Soar-like reflective processing (e.g., search in problem spaces)
– POMDP-based operator evaluation (decision-theoretic lookahead)