
Chapter Seven The Network Approach: Mind as a Web



Connectionism

• The major field of the network approach.
• Connectionists construct Artificial Neural Networks (ANNs), which are computer simulations of how groups of neurons might perform some task.

Information processing

• ANNs utilize a processing strategy in which large numbers of computing units perform their calculations simultaneously. This is known as parallel distributed processing (see the sketch below).
• In contrast, traditional computers are serial processors, performing one computation at a time.
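To make the contrast concrete, here is a minimal Python sketch (not from the slides; the layer size, weights, and inputs are invented) in which one layer of units computes its net input in a single parallel-style step and again one unit at a time in serial style:

```python
# Hypothetical illustration: "parallel" layer update vs. an explicitly serial loop.
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(size=(4, 3))   # 4 units, each receiving 3 inputs (assumed sizes)
inputs = np.array([0.2, 0.7, 0.1])

# Parallel distributed style: every unit's net input is computed in one step.
net_parallel = weights @ inputs

# Serial style: one unit at a time, as a traditional serial processor would.
net_serial = np.array([sum(w * x for w, x in zip(row, inputs)) for row in weights])

assert np.allclose(net_parallel, net_serial)   # same answer, different strategy
```

On a single CPU the "parallel" line above is itself simulated serially, which is the point made on the next slide.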

Serial and parallel processing architectures

Serial vs. Parallel Computing

• No difference in computing power: serial and parallel machines can compute the same functions.
• Parallel computing is simulated by general purpose computers.
• Modern general purpose computers are not strictly serial.

Approaches

• The traditional approach to solving problems in cognition and AI is to use an algorithm in which every processing step is planned (in practice, not strictly so). It relies on symbols and operators applied to symbols. This is the knowledge-based approach.

• Connectionists instead let the ANN perform the computation on its own with little explicit planning. They are concerned with the behavior of the network. This is the behavior-based approach (again, only loosely so).

Knowledge representation

• Information in an ANN exists as a collection of nodes and the connections between them. This is a distributed representation.
• Information in semantic networks, however, can be stored in a single node. This is a form of local representation.

Characteristics of ANNs

• A node is a basic computing unit.
• A link is the connection between one node and the next.
• Weights specify the strength of connections.
• A node fires if it receives activation above threshold.

Characteristics of ANNs

• A basis function determines the amount of stimulation a node receives.
• An activation function maps the strength of the inputs onto the node’s output (see the sketch below).

A sigmoidal activation function
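As a rough illustration of these two functions, here is a Python sketch of a single node, assuming a weighted-sum basis function and a sigmoidal activation function like the one in the figure; the inputs, weights, and threshold value are invented for the example:

```python
import math

def basis(inputs, weights):
    """Basis function: total stimulation reaching the node (here, a weighted sum)."""
    return sum(x * w for x, w in zip(inputs, weights))

def activation(net, threshold=0.0):
    """Sigmoidal activation: maps net input onto an output between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-(net - threshold)))

inputs = [0.9, 0.3, 0.5]            # activation arriving over incoming links
weights = [0.4, -0.2, 0.7]          # connection strengths (weights)
net = basis(inputs, weights)
out = activation(net, threshold=0.5)
print(f"net input = {net:.2f}, output = {out:.2f}")
```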

Early neural networks

• Hebb (1949) describes two types of cell groupings.
• A cell assembly is a small group of neurons that repeatedly stimulate themselves.
• A phase sequence is a set of cell assemblies that activate each other.
• Hebb Rule: When one cell repeatedly activates another, the strength of the connection increases.
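A minimal sketch of the Hebb Rule in code, assuming a simple learning-rate parameter; the rate and activity values are illustrative, not from Hebb:

```python
def hebb_update(weight, pre_activity, post_activity, learning_rate=0.1):
    """Strengthen the connection when the sending and receiving cells fire together."""
    return weight + learning_rate * pre_activity * post_activity

w = 0.2
for _ in range(5):                          # repeated co-activation of the two cells
    w = hebb_update(w, pre_activity=1.0, post_activity=1.0)
print(f"connection strength: {w:.1f}")      # grows from 0.2 to 0.7
```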

Early neural networks

• Perceptrons were simple networks that could detect and recognize visual patterns.
• Early perceptrons had only two layers, an input and an output layer.

Modern ANNs

• More recent ANNs contain three layers, an input, hidden, and output layer.
• Input units activate hidden units, which then activate the output units.
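A sketch of the forward pass through such a three-layer network; the layer sizes, random weights, and input pattern are chosen only for illustration:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(1)
w_input_hidden = rng.normal(size=(4, 3))    # 3 input units -> 4 hidden units
w_hidden_output = rng.normal(size=(2, 4))   # 4 hidden units -> 2 output units

x = np.array([1.0, 0.0, 0.5])               # input pattern
hidden = sigmoid(w_input_hidden @ x)        # input units activate hidden units
output = sigmoid(w_hidden_output @ hidden)  # hidden units activate output units
print(np.round(output, 2))
```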

Backpropagation learning in ANNs

• An ANN can learn to make a correct response to a particular stimulus input.
• The initial response is compared to a desired response represented by a teacher.
• The difference between the two, an error signal, is sent back to the network.
• This changes the weights so that the actual response is now closer to the desired response.
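Below is a self-contained sketch of this training loop for a small three-layer network, using standard backpropagation updates for sigmoid units; the teacher pattern, learning rate, and layer sizes are assumptions made for the example:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(1)
w1 = rng.normal(size=(4, 3))                # input -> hidden weights
w2 = rng.normal(size=(2, 4))                # hidden -> output weights
x = np.array([1.0, 0.0, 0.5])               # stimulus input
teacher = np.array([1.0, 0.0])              # desired response ("teacher")
lr = 0.5

for _ in range(500):
    hidden = sigmoid(w1 @ x)                # forward pass
    output = sigmoid(w2 @ hidden)           # actual response

    error = teacher - output                # error signal
    delta_out = error * output * (1 - output)
    delta_hid = (w2.T @ delta_out) * hidden * (1 - hidden)

    w2 += lr * np.outer(delta_out, hidden)  # weight changes push the actual
    w1 += lr * np.outer(delta_hid, x)       # response toward the desired one

print(np.round(sigmoid(w2 @ sigmoid(w1 @ x)), 2))   # close to [1. 0.]
```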

Criteria of different ANNs

• Supervised networks have a teacher. Unsupervised networks do not.
• Networks can be either single-layer or multilayer.
• Information in a network can flow forward only, a feed-forward network, or it can flow back and forth between layers, a recurrent network.

Network typologies

• Hopfield-Tank networks. Supervised, single-layer, and laterally connected. Good at recovering “clean” versions of noisy patterns (a pattern-cleanup sketch follows this list).
• Kohonen networks. An example of a two-layer, unsupervised network. Able to create topological maps of features present in the input.
• Adaptive Resonance Networks (ART). An unsupervised multilayer recurrent network that classifies input patterns.
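As a simplified illustration of Hopfield-style pattern cleanup, the sketch below stores one binary pattern with an outer-product rule and recovers it from a corrupted copy; the pattern and the synchronous update schedule are assumptions of the example, not a full Hopfield-Tank model:

```python
import numpy as np

pattern = np.array([1, -1, 1, -1, 1, -1])    # stored "clean" pattern
W = np.outer(pattern, pattern).astype(float) # lateral connection weights
np.fill_diagonal(W, 0)                       # no self-connections

noisy = pattern.copy()
noisy[0] = -noisy[0]                         # corrupt one unit

state = noisy.copy()
for _ in range(5):                           # repeated (synchronous) updates
    state = np.where(W @ state >= 0, 1, -1)

print(np.array_equal(state, pattern))        # True: the clean pattern is recovered
```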

Evaluating connectionism

Advantages:
1. Biological plausibility
2. Graceful degradation
3. Interference
4. Generalization

Disadvantages:
1. No massive parallelism
2. Convergent dynamics
3. Stability-plasticity dilemma
4. Catastrophic interference

Semantic networks

• Share some features in common with ANNs.
• Individual nodes represent meaningful concepts.
• Used to explain the organization and retrieval of information from LTM.

Characteristics of semantic networks

• Spreading activation. Activity spreads outward from nodes along links and activates other nodes (see the sketch after this list).
• Retrieval cues. Nodes associated with others can activate them indirectly.
• Priming. Residual activation can facilitate responding.
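A toy Python sketch of spreading activation over a hand-built network; the nodes, links, number of steps, and decay factor are all invented for the example. The residual activation left on neighboring nodes is what indirect retrieval and priming rely on:

```python
network = {
    "canary": ["bird", "yellow", "sings"],
    "bird": ["animal", "wings", "canary"],
    "animal": ["bird", "living thing"],
}

def spread(source, steps=2, decay=0.5):
    """Activation spreads outward from the source node, weakening with each link."""
    activation = {source: 1.0}
    frontier = {source}
    for _ in range(steps):
        next_frontier = set()
        for node in frontier:
            for neighbor in network.get(node, []):
                boost = activation[node] * decay
                if boost > activation.get(neighbor, 0.0):
                    activation[neighbor] = boost
                    next_frontier.add(neighbor)
        frontier = next_frontier
    return activation

print(spread("canary"))   # "bird" ends up more active than "animal"
```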

A hierarchical semantic network

• Sentence verification tasks suggest a hierarchical organization of concepts in semantic memory (Collins and Quillian, 1969).
• Meaning for concepts such as animals may be arranged into superordinate, ordinate, and subordinate categories.
• Vertical distance in the network corresponds to category membership.
• Horizontal distance corresponds to property information.
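The sketch below mimics the Collins and Quillian idea: verifying a sentence means climbing category (ISA) links until the property is found, so properties stored higher in the hierarchy take more steps to verify. The particular concepts and properties are illustrative, not data from the 1969 study:

```python
isa = {"canary": "bird", "bird": "animal"}     # vertical links: category membership
properties = {                                  # horizontal links: property information
    "canary": {"is yellow"},
    "bird": {"has wings"},
    "animal": {"breathes"},
}

def verify(concept, prop):
    """Return how many levels were climbed to confirm the property, or None."""
    level = 0
    while concept is not None:
        if prop in properties.get(concept, set()):
            return level
        concept = isa.get(concept)
        level += 1
    return None

print(verify("canary", "is yellow"))   # 0 levels: verified quickly
print(verify("canary", "breathes"))    # 2 levels: slower, as the data suggest
```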

Example of a Hierarchical Semantic Network

From S. C. Shapiro, Knowledge Representation. In L. Nadel, Ed., Encyclopedia of Cognitive Science, Macmillan, 2003.

Propositional networks

• Can represent propositional or sentence-like information. Example: “The man threw the ball.”
• Allow for more complex relationships between concepts such as agents, objects, and relations.
• Can also code for episodic knowledge.
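A minimal way to capture the agent-relation-object structure of “The man threw the ball” in code; the field names and the extra time slot (a simple stand-in for episodic information) are illustrative and are not the SNePS notation used in the figure that follows:

```python
from dataclasses import dataclass

@dataclass
class Proposition:
    agent: str          # who performed the action
    relation: str       # the action itself
    obj: str            # what the action was performed on
    time: str = "past"  # stand-in for episodic (when) information

p = Proposition(agent="the man", relation="threw", obj="the ball")
print(p)   # Proposition(agent='the man', relation='threw', obj='the ball', time='past')
```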

Example of a Propositional Semantic Network

From S. C. Shapiro, Knowledge Representation. In L. Nadel, Ed., Encyclopedia of Cognitive Science, Macmillan, 2003.

Episodic Memory in Cassie, a SNePS-Based Agent

*NOW* contains a SNePS term representing the current time.
*NOW* moves when Cassie acts or perceives a change of state.

Representation of Time

[Figure: SNePS network diagram relating an agent, an act (e.g., find) with its action and object, and an event to a time node linked to *NOW* by before/after arcs.]

Movement of Time

[Figure: *NOW* moves along successive times t1, t2, t3 connected by before/after arcs.]

Performing a Punctual Act

[Figure: the event time t2 of a punctual act falls between the successive *NOW* times t1 and t3, linked by before/after arcs.]

Performing a Durative Act

[Figure: the event time of a durative act is related to the successive *NOW* times t1, t2, t3 by subint/supint (subinterval/superinterval) arcs.]