
Chapter 18:

Knowledge Engineering

presented by: Pantea Jabbary, Athena Ahmadi

2

Introduction

• Phase 1: Problem Assessment
• Phase 2: Knowledge Acquisition
• Phase 3: System Design
• Phase 4: Testing and Evaluation
• Phase 5: Documentation
• Phase 6: Maintenance

3

4

Phase 1: Problem Assessment

• Two practical questions:
– Will it work?
– Why should we try it?

• Answering these questions improperly can lead the organization to undertake a project that has little chance of success or offers little benefit

5

Phase 1

• Task 1: Determine the motivation of the organization
• Task 2: Identify candidate problems
• Task 3: Perform feasibility study
• Task 4: Perform cost/benefit analysis
• Task 5: Select the best project
• Task 6: Write the project proposal

6

7

Task 1: Motivation for the Effort

Why is the organization motivated to pursue expert systems?

– Problem Driven

• Explore the use of a new technology out of desire to solve some specific problem

• A step toward improving profitability or productivity

• Motivated by an existing problem

– Solution Driven

• Explore a new technology because of a general interest or curiosity

8

Task 2: Identify Candidate Problems

• Only appropriate in the solution-driven case

• Performed before the feasibility and cost/benefit studies

• Also called pre-assessment

9

Forming the List

• Look to middle-level management:
– They have a global view of operations and knowledge about everyday problems
– They can expose areas where an expert system has the potential to provide real value to the organization

• Tasks that require human decision making

10

Technology Demonstration

• A small and relatively simple problem is preferable: it enhances the likelihood of success and the eventual acceptance of the technology within the organization.

• Small: the scope of the problem does not cover a large number of complex issues.

• Simple: the problem appears to be solvable at first glance.

• Consider what others have done in the past and the problem-solving paradigm they used.

11

Suggestions for Choosing a Good Problem

• Human decision making
• Heuristic knowledge
• Judgmental knowledge
• Small
• Simple
• Success likely
• Some value

* These factors only filter out the projects that should not receive further consideration.

12

Task 3: Feasibility Study

• Determines whether the project is likely to succeed

• Step 1: issues that are absolutely required
– A list of items that must be met before the project has any chance to succeed
– An all-or-nothing situation

• Step 2: issues that are important for the success of the project
– Subjective in nature and require some judgment to assess
– Impact the overall likelihood of successfully completing the project

13

14

Project Requirements

• Problem-solving knowledge available
• Knowledge engineer available
• Problem solution can be validated
• Funding available
• System development software available
• Computer facilities available

15

Project Risks

• Risk: exposure to possible failure to obtain expected benefits

• Three categories:
– Problem
– People
– Deployment

16

Problem Feasibility Issues

• Expert knowledge needed
• Problem-solving steps are definable
• Symbolic knowledge used
• Heuristics used
• Problem is solvable
• Successful systems exist
• Problem is well-focused
• Problem is reasonably complex

17

Problem Feasibility Issues (cont’d)

• Problem is stable

• Uncertain or incomplete knowledge used

• Non-deterministic

• Solution more of a recommendation

18

People Feasibility Issues

• Capability and motivation of the people involved in the project

• Major players:
– Domain expert
– Knowledge engineer
– End-user
– Management

19

Expert

• Expert can communicate the knowledge
• Expert can devote time
• Expert is cooperative

20

Knowledge Engineer

• Have good communication skills

• Can match problem to software

• Have expert system programming skills

• Can devote the time

21

End-User

• Can devote time
• Receptive to change
• Is cooperative

22

Management

• Supports the project

• Receptive to change

• Is not skeptical

• Has reasonable expectations

• Understands objectives

23

Deployment Feasibility Issues

• System can be introduced easily
• System can be maintained
• System is not a critical-path item
• System can be integrated with existing resources
• Training available

24

Feasibility Assessment

• Use some assessment method that can intelligently judge the project’s feasibility

• Beckman (1991): assign each issue a number of points that reflects its relative importance. The checklist is then compared against a candidate project; if the project meets an issue, it receives that issue's prescribed points.
– Feasibility = Total score / Total points
– Shortcoming: many issues are subjective and do not have a yes/no answer

25

26

Feasibility Assessment (cont’d)

• A different approach (worked through in the sketch below):
– Assign each issue a weight (0-10) that reflects its importance
– Ascribe each issue a value (0-10) that reflects the degree of belief in that issue
– Multiply each value by the issue's weight
– Feasibility = Total score / Total weight
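A minimal sketch of this weighted scoring in Python (the issue names, weights, and belief values below are invented for illustration; the simple checklist method corresponds to setting every weight to 1):

# Weighted feasibility assessment (illustrative issues and scores).
issues = {
    # issue: (weight 0-10, degree of belief 0-10)
    "Expert available and cooperative": (10, 8),
    "Problem is reasonably complex": (7, 6),
    "Funding available": (9, 9),
    "End-users receptive to change": (6, 5),
}

total_score = sum(weight * belief for weight, belief in issues.values())
total_weight = sum(weight for weight, _ in issues.values())

# Feasibility = Total score / Total weight, a value between 0 and 10
feasibility = total_score / total_weight
print(f"Project feasibility: {feasibility:.1f} / 10")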

27

28

29

30

Cost/Benefit Analysis

• Determine the expected payoff
• Benefit issues:

1. Improved productivity
– Better decisions
– Faster decisions
– Disseminates expertise

2. Lower costs
– Reduces labor costs
– Improves material use

31

Benefit Issues (cont’d)

3. Improved quality
• Superior product
• Superior services
• Provides training

4. Improved image
• Innovator

32

• Task 5: Select the Best Project

• Task 6: Write the Project Proposal

33

Phase 2: Knowledge Acquisition

• Collection

• Interpretation

• Analysis

• Design

34

Problems with Knowledge Acquisition

In extracting knowledge from an expert:

• Unaware of knowledge

• Unable to verbalize knowledge

• Provides irrelevant knowledge

• Provides incomplete knowledge

• Provides incorrect knowledge

• Provides inconsistent knowledge

35

• Cooperative team effort
• Interviewing technique
• Knowledge analysis

36

Phase 3: Design

• Task 1: Select knowledge representation technique

• Task 2: Select control technique
• Task 3: Select expert system development software
• Task 4: Develop prototype
• Task 5: Develop the interface
• Task 6: Develop the product

37

Task 1: Selecting Knowledge Representation Technique

• Choose the knowledge representation technique that best matches the way the expert mentally models the problem's knowledge

• Consider the organization’s resources and capabilities

38

Task 1 (cont’d)

• Frame-based (see the sketch below)
– The expert describes the problem by referencing important objects and their relationships
• Suited to simulation-type problems
– The expert considers several similar objects when solving the problem
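A minimal frame-style sketch in Python (the class, slot names, and instances are invented for illustration), showing objects with slots and inheritance of slot values from a parent frame:

# Frames as objects with named slots; child frames inherit slot values.
class Frame:
    def __init__(self, name, parent=None, **slots):
        self.name = name
        self.parent = parent
        self.slots = dict(slots)

    def get(self, slot):
        # Look up a slot locally, then fall back to the parent frame.
        if slot in self.slots:
            return self.slots[slot]
        return self.parent.get(slot) if self.parent else None

pump = Frame("pump", max_pressure_psi=120)
feed_pump_1 = Frame("feed_pump_1", parent=pump, status="running")
print(feed_pump_1.get("max_pressure_psi"))  # inherited from "pump": 120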

39

Task 1 (cont’d)

• Rule-based
– The expert discusses the problem primarily using IF/THEN-type statements
• Suited to classification problems

• Induction
– If past examples of the problem exist
– When a rule-based approach has run into problems on some issue
– If no real expert exists for the problem, but a history of problem information is available

40

Task 2: Selecting Control Technique

Ask the expert to work through a typical problem.

• How the expert collects information and reasons with it to solve the problem

• Whether the expert uses some global strategy

41

Task 2 (cont’d)

• Forward chaining (a minimal sketch follows)
– The expert first collects information about the problem and then sees what can be concluded from it
– The amount of available data is far smaller than the number of possible solutions
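A minimal forward-chaining sketch in Python, assuming a toy rule format of (premises, conclusion) pairs; the facts and rules are invented for illustration:

# Forward chaining: start from known facts and fire rules until no new
# conclusions can be added to working memory.
rules = [
    ({"engine cranks", "engine does not start"}, "fuel or ignition fault"),
    ({"fuel or ignition fault", "no spark at plugs"}, "ignition fault"),
]
facts = {"engine cranks", "engine does not start", "no spark at plugs"}

changed = True
while changed:
    changed = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)  # fire the rule
            changed = True

print(facts)  # now includes the derived conclusions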

42

Task 2 (cont’d)

• Backward chaining (a minimal sketch follows)
– The expert first considers some conclusion and then attempts to prove it
– The number of goals is much smaller than the amount of possible data

• Goal agenda
– The global problem-solving approach used by the expert
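A minimal backward-chaining sketch in Python over the same toy (premises, conclusion) rule format; the goal, facts, and rules are invented for illustration:

# Backward chaining: start from a goal and recursively try to prove the
# premises of any rule that concludes it, bottoming out in known facts.
rules = [
    ({"engine cranks", "engine does not start"}, "fuel or ignition fault"),
    ({"fuel or ignition fault", "no spark at plugs"}, "ignition fault"),
]
facts = {"engine cranks", "engine does not start", "no spark at plugs"}

def prove(goal):
    if goal in facts:
        return True
    return any(conclusion == goal and all(prove(p) for p in premises)
               for premises, conclusion in rules)

print(prove("ignition fault"))  # True for these toy facts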

43

44

45

Task 3: Selecting Expert System Development Software

46

Categories of Software

• Languages
– Rule-based: LISP, Prolog, OPS, C
– Frame-based: C++, Flavors, Smalltalk

• Shells
– Provide an established environment for creating the system: representation structure, inference engine, explanation facility, and interface

47

Important Software Features

• Cost
• Computer hardware
• License
• Training / support
• Coding knowledge:
– Some tools require the developer to write source code that captures both the knowledge and its control
– Others provide a "smart" editor that offers a template for creating rules, frames, or other control-type knowledge

48

Software Features (cont’d)

• Inexact reasoning

• Rule sets

• External program access

• Debugging utilities

49

User Interface Features

• Questions
• Explanations
• Graphics
• Hypertext (a small sketch follows)
– A collection of text nodes linked together on an associative basis
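A minimal sketch in Python of hypertext as associatively linked text nodes (the node names, text, and links are invented for illustration):

# Hypertext as a dictionary of text nodes with associative links.
nodes = {
    "charging system": {"text": "Keeps the battery charged while driving.",
                        "links": ["battery", "alternator"]},
    "battery": {"text": "Stores electrical energy.", "links": []},
    "alternator": {"text": "Generates current from the engine.",
                   "links": ["battery"]},
}

def follow(start, path):
    # Follow a sequence of link names from a starting node.
    node = start
    for link in path:
        assert link in nodes[node]["links"], f"no link {link!r} from {node!r}"
        node = link
    return nodes[node]["text"]

print(follow("charging system", ["alternator", "battery"]))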

50

Task 4: Prototype Development

If properly designed, the prototype serves the following purposes:

• Validates the expert system approach
• Confirms the choice of knowledge representation technique and control strategies
• Provides a vehicle for knowledge acquisition

51

Task 5: Interface Development

Development of the interface should begin with the prototype development

Keys to effective interface design:

• Consistency

• Clarity

• Control

52

Task 5 (cont’d)

• Consistent screen format
– Similar materials placed in the same locations

• Clarity of presented material
– Confusing or poorly designed questions can lead to errors in the user's response

• Screen control
– The user should not fear that a mistake could have disastrous consequences
– Easy to start, exit, and access system explanations and utilities

53

Task 6: Product Development

• Knowledge refinement
– Broadening or deepening the system's knowledge base

• Control refinement
– Better ways of introducing more complex control strategies

54

Task 6 (cont’d)

• Interface refinement
– Ease of use
– Screen directions
– Questions
– Clarifications
– Results
– Interactive techniques (mouse, light pen, etc.)

• Inexact reasoning
– Facts, rules, or frames coded in the system in an exact manner may need to be extended to handle uncertainty (a certainty-factor sketch follows)
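One common way to add inexact reasoning is with MYCIN-style certainty factors; the sketch below combines two positive certainty factors for the same conclusion (the CF values are invented for illustration):

# MYCIN-style combination of two positive certainty factors.
def combine_cf(cf1, cf2):
    # For cf1, cf2 both in (0, 1]: combined = cf1 + cf2 * (1 - cf1)
    return cf1 + cf2 * (1 - cf1)

cf_rule_a = 0.6  # evidence from one rule (illustrative)
cf_rule_b = 0.5  # independent evidence from another rule (illustrative)
print(round(combine_cf(cf_rule_a, cf_rule_b), 2))  # 0.8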

55

Phase 4- Testing

• Evaluation of expert systems differs from that of conventional programs
– Conventional programs have well-defined specifications
– Expert systems often do not have a clear right or wrong answer

• Evaluation covers
– Validation
– User acceptance

56

Phase 4- Testing

• Evaluation
– Validation
• Whether the system satisfactorily performs the intended task
– User acceptance
• Issues that impact how well the system addresses the needs of the user

– Both are subjective, which makes evaluation complicated

57

System Validation

• The system models the decision making of a human expert
– Producing the same results validates the system's results
– Reasoning in the same manner validates the system's reasoning process

58

System Validation (cont.)

• Validating results involves:
– Selection of the test criterion
– Selection of the test cases
– Selection of the evaluators

59

Validate results

• Selection of the test criterion
– Compare the expert system to a human expert in the field (the approach used for MYCIN)
– Establish an acceptable and reasonable performance level (consider the performance of human experts on the problem)
– Decide how to judge the evaluation results:
• A degree of error
• A set of possible evaluation responses (as in the MYCIN evaluation)
• A score for each result on some numeric scale (a small scoring sketch follows)
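A minimal sketch in Python of numeric scoring against an expert's verdicts on a set of test cases (case IDs and answers are invented for illustration):

# Score validation results as the fraction of test cases where the
# expert system's answer matches the human expert's answer.
test_cases = {
    # case id: (system answer, expert answer)
    "case-01": ("ignition fault", "ignition fault"),
    "case-02": ("fuel fault", "ignition fault"),
    "case-03": ("battery flat", "battery flat"),
}

matches = sum(sys_ans == exp_ans for sys_ans, exp_ans in test_cases.values())
agreement = matches / len(test_cases)
print(f"Agreement with the expert: {agreement:.0%}")  # 67%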

60

Validate results (cont.)

• Selection of the test cases
– Depends on the project specifications
• If the system must handle routine problems: typical test cases (about 90% of the cases)
• If the system must solve every possible problem: both typical and unusual test cases

61

Validate results (cont.)

• Selection of the evaluators
– Again depends on the project specifications
• If the system is designed to aid an expert on a project: that expert
• If the system is to be used by other experts:
– The project's domain expert (essentially testing his/her own knowledge)
– Other experts (as in the MYCIN double-blind test)
• If the system is to be used by nonexperts

• Avoid potential bias
– Inspired by the Turing test

62

Validate reasoning

• Limited number of test cases

• Macro level
– Results of subissues

• Micro level
– Trace back through all the rules used

63

Learning from mistakes

• When the system makes mistakes:
– Ask the expert to review the results

• Why the answer is incorrect

• Why the right answer was not given

– Correct the knowledge that led to the wrong answer

– Add the needed knowledge
» Another technique for knowledge acquisition

64

Learning from mistakes (cont.)

• IF Market status appears bullish
AND Stock area recommendation is domestic business
AND Desired annual return 8% to 15%
THEN Recommendation is buy GM

65

Learning from mistakes (cont.)

• Discovers new concepts

• Discovers new rules

• Identifies incorrect or incomplete rules

• Identifies incorrect premises

66

User Acceptance

• Ease of use

• Clarity of questions

• Clarity of explanations

• Presentation of results

• System utilities

67

User Acceptance (cont.)

• Ease of use
– Interface
– Speed of operation

68

User Acceptance (cont.)

• Clarity of questions
– Avoid terms that are foreign to the user, or provide further explanation of the question
– "Does the car's charging system operate normally?"
• Better: develop the system further so that it can infer the answer by asking other questions
– "Is the engine hot?" (the word "hot" is vague)
– "Is the water pressure rising or falling?" (misses the "steady" case)
– "Is it false that the water pressure is not high?" (a confusing double negative)

69

User Acceptance (cont.)

• Clarity of explanations
– "Why the question is asked" and "how some conclusion was reached"
– Increases the user's trust in the final result

• Presentation of results
– A single recommendation
– Multiple, ranked recommendations
– Clear and meets the user's needs

70

User Acceptance

User Questionnaire

71

Evolution of Testing/Evaluation

• Traditionally, system evaluation was treated as one of the final project steps
– By then, problems can be nearly impossible to correct due to the complexity of the knowledge base

• More recently, the design and evaluation tasks have been integrated

72

Evolution of Testing/Evaluation

1. Preliminary Testing

2. Demonstration Testing

3. Informal Validation Testing

4. Refinement Testing

5. Formal Testing

6. Field Testing

73

Phase 5- Documentation

• What needs to be documented?
– References for developing the expert system
– References for writing the final report
– References for maintaining the expert system

• Should document:
– Knowledge
– Knowledge graphs
– Source code
– Tests
– Transcripts
– Glossary of domain-specific terms
– Reports

74

How to organize documents?

– Easy entry of new knowledge
– Easy access and modification of old knowledge
– Easy access to related information
– Easy replication of material for report writing

• Knowledge dictionary

• Indexing the document

• Using hypertext

75

Using Hypertext

• When reviewing information, it is difficult to identify relationships between pieces of information

• Creation and representation of links between related pieces of information

76

Guidelines for Designing the Documentation

• Should contain:
– Table of Contents
– Project Proposal
– Knowledge Dictionary
– Source Code
– Tests
– Transcripts
– Project Personnel
– Glossary
– References
– Index

• Final report

77

Knowledge Dictionary

78

Knowledge Dictionary

79

Test

80

Phase 6- Maintenance

• Expert systems mostly contain knowledge that evolves over time

• New products and equipment
• Changes in procedures
• Deficiencies may be discovered
• The system may prove difficult to use
• Omissions
• Bugs

81

Phase 6- Maintenance (cont.)

• Documentation
• Think maintenance during design

– Modular structure

– Separate knowledge from information

• Software issues
• Who will maintain the system?

– Developer of the system

– Others need training

– Security

82

Document Changes

83

Any Questions?

Thank You!

Merry Christmas!