
Plans for Today


Page 1: Plans for Today

Plans for Today
- Chapter 2: Intelligent Agents (until break)
- Lisp: Some questions that came up in lab
- Resume intelligent agents after Lisp issues

Page 2: Plans for Today

Current Class Numbers
By my roster:
- 3a: 30 students
- 4a: 15 students

Page 3: Plans for Today

Intelligent Agents
Agent: anything that can be viewed as…
- perceiving its environment through sensors
- acting upon its environment through effectors
(Diagram)
Examples:
- Human
- Web search agent
- Chess player
What are sensors and effectors for each of these?

Page 4: Plans for Today

Rational Agents
Rational agent: one that does the right thing
Criteria: performance measure
Performance measures for:
- Web search engine?
- Tic-tac-toe player?
- Chess player?
How does when performance is measured play a role? (short vs. long term)

Page 5: Plans for Today

Rational Agents
Omniscient agent:
- Knows the actual outcome of its actions
- What information would a chess player need to have to be omniscient?
Omniscience is (generally) impossible
A rational agent should do the right thing based on the knowledge it has

Page 6: Plans for Today

Rational Agents
What is rational depends on four things:
- Performance measure
- Percept sequence: everything the agent has seen so far
- Knowledge the agent has about the environment
- Actions the agent is capable of performing
Ideal Rational Agent: does whatever action is expected to maximize its performance measure, based on the percept sequence and built-in knowledge

Page 7: Plans for Today

Ideal Mapping
The percept sequence is the only varying element of rationality
Table: Percept Sequence → Action
In theory, can build a table mapping each percept sequence to an action
Sample table for a chess board
The ideal mapping describes the behavior of ideal agents

Page 8: Plans for Today

The Mapping Table
In most cases, the mapping is explosively large: far too big to write down explicitly
In some cases, the mapping can be defined via a specification
Example: an agent to sort a list of numbers
- Sample table for such an agent
- Lisp code
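As a hedged illustration of what that specification might look like, here is a tiny Lisp sketch (the name sorting-agent is made up for this example): one function stands in for the whole percept-to-action table.

  (defun sorting-agent (percept)
    ;; percept is a list of numbers; the "action" is the sorted list
    (sort (copy-list percept) #'<))   ; copy-list because sort is destructive

  ;; (sorting-agent '(3 1 2))  =>  (1 2 3)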

Page 9: Plans for Today

Autonomy
“Independence”
A system is autonomous if its behavior is determined by its own experience
- An alarm that goes off at a prespecified time is not autonomous
- An alarm that goes off when smoke is sensed is autonomous
A system without autonomy lacks flexibility

Page 10: Plans for Today

Structure of Intelligent Agents
AI designs the agent program
The program runs on some kind of architecture
To design an agent program, need to understand:
- Percepts
- Actions
- Goals
- Environment

Page 11: Plans for Today

Some Sample Agents
What are percepts, actions, goals, and environment for:
- Chess player?
- Web search tool?
- Matchmaker?
- Musical performer?
Environments can be real (web search tool) or artificial (Turing test)
- The distinction is about whether it is a simulation for the agent or the “real thing”

Page 12: Plans for Today

Agent Programs
Some extra Lisp: persistence of state (static variables)
Allows a function to keep track of a variable over repeated calls.
Put the functions inside a let block:

  (let ((sum 0))
    (defun myfun (x)
      (setf sum (+ sum x)))
    (defun report ()
      sum))
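For example, assuming the let block above has been evaluated, repeated calls share the same sum:

  (myfun 5)    ; sum becomes 5
  (myfun 3)    ; sum becomes 8
  (report)     ; => 8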

Page 13: Plans for Today

Skeleton Agent
Inputs: percept
Outputs: action
Static Variable: memory
Pseudocode:
  memory ← Update-Memory(memory, percept)
  action ← Choose-Best-Action(memory)
  memory ← Update-Memory(memory, action)
  return action

Page 14: Plans for Today

Where we’re going today
- Frameworks of agent programs
- Types of agent programs
- Types of environments
- Assignment 2 and associated code

Page 15: Plans for Today

Skeleton Agent
General purpose agent code:

  (let ((memory nil)
        (action nil))
    (defun skeleton-agent (percept)
      (setf memory (update-memory memory percept))
      (setf action (choose-best-action memory))
      (setf memory (update-memory memory action))
      action))   ; return action

Page 16: Plans for Today

Table Lookup Agent
Inputs: percept
Outputs: action
Static Variables: percepts, table
Pseudocode:
  Append percept to the end of percepts
  action ← Lookup(percepts, table)
  return action

Page 17: Plans for Today

Lookup Table Agent
General purpose agent code:

  (let ((percepts nil)
        (table ????))
    (defun table-lookup-agent (percept)
      (setf percepts (append percepts (list percept)))   ; append to the end, per the pseudocode
      (lookup percepts table)))
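The slide leaves lookup and the table contents unspecified; one hedged way to fill them in is an association list keyed on whole percept sequences (the names below are illustrative, not part of the assignment code):

  (defun lookup (percepts table)
    (cdr (assoc percepts table :test #'equal)))

  ;; toy table: each key is a percept sequence seen so far
  (defparameter *toy-table*
    '(((bump)      . turn-left)
      ((bump bump) . turn-right)))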

Page 18: Plans for Today

Specific Agent
Example: Pathfinder (Mars explorer)
- Percepts:
- Actions:
- Goals:
- Environment:
Would table-driven work?
- Table would take massive memory to store
- Long time for the programmer to build the table
- No autonomy

Page 19: Plans for Today

Four kinds of better agent programs
- Simple reflex agents
- Agents that keep track of the world
- Goal-based agents
- Utility-based agents

Page 20: Plans for Today

Simple reflex agents
Specific response to percepts, i.e. a condition-action rule:
  if new-boulder-in-sight then move-towards-new-boulder
Pretty easy to implement
Requires no knowledge of the past
Limited in what you can do with it
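As a hedged sketch, that condition-action rule might look like this in Lisp (the percept representation and the default action are assumptions, not part of any assignment code):

  (defun simple-reflex-agent (percept)
    (if (eq percept 'new-boulder-in-sight)
        'move-towards-new-boulder
        'keep-looking))   ; some default action is needed; name assumed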

Page 21: Plans for Today
Page 22: Plans for Today

Agents that keep track of the world

Maintain an internal state which is adjusted by each percept
Example:
- Internal state: looking for a new boulder, or rolling towards one
- Affects how Pathfinder will react when seeing a new boulder
Rule for action depends on both state and percept
Different from reflex, which depends only on the percept
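A hedged sketch of the same idea in Lisp, using the closure trick from earlier to hold the internal state (state and action names are assumed):

  (let ((state 'looking-for-boulder))
    (defun pathfinder-agent (percept)
      (cond ((and (eq state 'looking-for-boulder)
                  (eq percept 'new-boulder-in-sight))
             (setf state 'rolling-towards-boulder)   ; rule uses state AND percept
             'move-towards-new-boulder)
            ((eq state 'rolling-towards-boulder)
             'keep-rolling)
            (t 'keep-looking))))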

Page 23: Plans for Today
Page 24: Plans for Today

Goal-Based Agents
Agent continues to receive percepts and maintain state
Agent also has a goal
- Makes decisions based on achieving the goal
Example:
- Pathfinder goal: reach a boulder
- If Pathfinder trips or gets stuck, it can make decisions to reach the goal

Page 25: Plans for Today
Page 26: Plans for Today

Utility-Based Agents
Goals are not enough – need to know the value of a goal
- Is this a minor accomplishment, or a major one?
- Affects decision making – will take greater risks for more major goals
Utility: numerical measurement of the importance of a goal
A utility-based agent will attempt to make the appropriate tradeoff
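One hedged way to picture that tradeoff in Lisp: score each candidate action and pick the best one (expected-utility is problem-specific and assumed here):

  (defun choose-by-utility (actions state)
    (let ((best nil)
          (best-utility nil))
      (dolist (a actions best)                      ; return best when done
        (let ((u (expected-utility a state)))       ; assumed helper
          (when (or (null best) (> u best-utility))
            (setf best a
                  best-utility u))))))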

Page 27: Plans for Today
Page 28: Plans for Today

Environments: Accessible vs. Inaccessible

Accessible: the agent's sensors detect all aspects of the environment relevant to deciding the action

Inaccessible: some relevant information is not available

Examples? Which is more desirable?

Page 29: Plans for Today

Environments: Deterministic vs. Nondeterministic

Deterministic: the next state of the environment is completely determined by the current state and the agent's actions

Nondeterministic: uncertainty as to the next state
If the environment is inaccessible but deterministic, it may appear nondeterministic
The agent’s point of view is the important one
Examples? Which is more desirable?

Page 30: Plans for Today

Environments: Episodic vs. Nonepisodic

Episodic: experience is divided into “episodes” of the agent perceiving then acting

Quality of action depends just on the episode itself, not on previous episodes

Examples? Which is more desirable?

Page 31: Plans for Today

Environments: Static vs. Dynamic

Dynamic: Environment can change while agent is thinking

Static: Environment does not change while agent thinks

Semidynamic: Environment does not change with time, but performance score does

Examples? Which is more desirable?

Page 32: Plans for Today

Environments: Discrete vs. Continuous

Discrete: Percepts and actions are distinct, clearly defined, and often limited in number

Examples? Which is more desirable?

Page 33: Plans for Today

Lisp Questions

Page 34: Plans for Today

Why the dot in cons? Two explanations:

High level: cons expects a list in the second position

Lower level:
- cons takes a cons cell from the free storage list
- Puts the first argument in the “first” position
- Puts the second argument in the “rest” position
- Printed with a dot, unless the “rest” position is a pointer to a list (indicates a continuing list)
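A couple of concrete cases, with the printed forms Common Lisp produces:

  (cons 1 2)        ; => (1 . 2)   the rest position is not a list, so the dot shows
  (cons 1 '(2 3))   ; => (1 2 3)   the rest position points to a list, so no dot
  (cons 1 nil)      ; => (1)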

Page 35: Plans for Today

How does append work?
- Makes a copy of the first list
- Takes the last pointer of the copy and points it at the second list
(Picture)
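A quick check of that picture in standard Common Lisp: the first list is copied, the second is shared.

  (defparameter *a* (list 1 2))
  (defparameter *b* (list 3 4))
  (defparameter *c* (append *a* *b*))   ; *c* is (1 2 3 4)
  (eq *c* *a*)          ; => NIL  (the cells for 1 and 2 were copied)
  (eq (cddr *c*) *b*)   ; => T    (the tail of *c* is literally *b*)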

Page 36: Plans for Today

How to debug?
- Can trace function calls with (trace function) and (untrace function)
  Demonstration with the mystery function from lab
- At a Break> prompt, can see the call stack with backtrace
- Can go through code step by step: (step (mystery 2 3))
  Use step and next to go through each function as you go along
- Use (print var)
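For example, with a stand-in function (the lab's mystery function isn't reproduced here; exact trace output and the Break> prompt vary by implementation):

  (defun fact (n)
    (if (zerop n) 1 (* n (fact (- n 1)))))

  (trace fact)       ; report calls to fact and the values they return
  (fact 3)
  (untrace fact)

  (step (fact 3))    ; walk through the evaluation one form at a time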

Page 37: Plans for Today

Random bits
Are Lisp functions pass by value or pass by reference?
- Lisp does pass by value, but all lists are referenced by a pointer
- So, as in C++, you can affect the original list this way
Why the p in (zerop x)?
- p = predicate
- NOT true that p = positive
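A small illustration of the pass-by-value-of-a-pointer point:

  (defun clobber-first (lst)
    (setf (first lst) 'changed))   ; mutates a cons cell the caller also holds

  (defun rebind (lst)
    (setf lst '(totally new)))     ; only rebinds the local parameter

  (defparameter *xs* (list 'a 'b 'c))
  (clobber-first *xs*)   ; *xs* is now (CHANGED B C)
  (rebind *xs*)          ; *xs* is unchanged: still (CHANGED B C)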

Page 38: Plans for Today

Scoping and binding
- let declares a scope where variable bindings are insulated from outside
- The usual notions of local and global variables apply if you want to change a global variable from within a function
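For example (defvar makes *total* a global variable; the let-bound total is a separate local):

  (defvar *total* 0)

  (defun add-to-total (x)
    (setf *total* (+ *total* x)))   ; a function can change the global directly

  (let ((total 5))                  ; local binding, insulated from *total*
    (setf total (+ total 1))
    total)                          ; => 6, and *total* is untouched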

Page 39: Plans for Today

Assignment 2: Vacuum-World

Demonstration
Percepts: the vacuum cleaner can only detect:
- Bumped into something?
- Dirt underneath me?
- At home location?
Actions: forward, turn right, turn left, suck up dirt, turn off

Page 40: Plans for Today

Vacuum-World cont.
Goals: clean up as much dirt as possible with as few moves as possible. Specifically:
- 100 points for each piece of dirt vacuumed
- -1 point for each action
- -1000 if it turns off when not at home
You will modify the choose-action function (possibly others?)

New Lisp: try out each function, explain as you go
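Purely as a starting point, one hedged shape a reflex choose-action could take; the percept layout (a list of bump, dirt, and home flags) and the action symbols are assumptions about the assignment code, not its actual interface:

  (defun choose-action (percept)
    (destructuring-bind (bump dirt home) percept   ; assumed percept layout
      (declare (ignore home))
      (cond (dirt 'suck-up-dirt)    ; dirt underneath: clean it
            (bump 'turn-right)      ; bumped into something: turn away
            (t    'forward))))      ; otherwise keep exploring

  ;; A real agent also needs a rule for getting home and turning off,
  ;; since turning off away from home costs 1000 points.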