L643: Evaluation of Information Systems
Week 6: February 11, 2008
Mid-term Evaluation
List 3 things you like about the course. List 3 things you don’t like about the course & make suggestions (be realistic… I can only change so much).
Put your index card in the envelope… I will be back in 7 minutes.
Don’t put your name on the cards! I am looking for honest feedback.
Statistics over IT Projects Failure Rates
51% viewed their ERP implementation as unsuccessful (The Robbins-Gioia Survey, 2001)
40% of the projects failed to achieve their business case within 1 year of going live (The Conference Board Survey, 2001)
The typical failure rate for IT projects is 70%: some “30% fail outright; another 40% drag on for years, propped up by huge cash infusions until they are finally shut down” (Hugos, 2003)
System Quality (Shepperd, 1995)
Quality (~= beauty) is in the eye of the beholder
Software quality: the totality of features and characteristics of a product, process or service that bear on its ability to satisfy stated or implied needs (ISO definition, cited in Shepperd, p. 67)
System Quality: Steps to follow (Shepperd, 1995)
Defining quality—establish:
The quality attribute, e.g., consistency
The object of interest, e.g., information system
The perspective, e.g., the customer’s perspective
What to measure? Consistency: the absence of contradictory information in the DB at any one point in time
How to measure? A testing tool makes n=1000 random DB accesses and tests these for consistency
System Quality: Steps to follow (Shepperd, 1995)
Quality attribute = consistency
Object = information system
Perspective = customer
Scale = probability of a data element being consistent with all other elements in the system
Test = 1000 random record sample checked by the database consistency testing program
Now = 80-90% (estimated)
Minimum = 90%
Target = 99.9%
1st Category in DeLone & McLean’s Framework
System quality:
Data currency; response time; turnaround time; data accuracy; reliability; completeness; system flexibility; and ease of use (Hamilton & Chervany, 1981)
“Bugs” in the system (system reliability); user interface consistency; ease of use; documentation quality; and maintainability of the program code (Seddon, 1997)
Resistance to a QMS (Frewin, 1990)
Quality management is seen as an “add-on” to product development and maintenance
Staff do not like having their work monitored
QMSs are thought to be too restrictive, stifling the flair and creativity of engineers
Note: these resistance issues are NOT limited to software engineering
Frewin, G. D. (1990). Software quality management. In P. Rook (ed.), Software reliability handbook. Elsevier Applied Science.
Accessibility
Section 504 states that "no qualified individual with a disability in the United States shall be excluded from, denied the benefits of, or be subjected to discrimination under" any program or activity that either receives Federal financial assistance or is conducted by any Executive agency or the United States Postal Service.
http://www.usdoj.gov/crt/ada/cguide.htm
Accessibility
The 1998 Amendment to Section 508 of the U.S. Rehabilitation Act of 1973 extended the law to cover electronic and information technology (http://www.usdoj.gov/crt/508/)
ADA lawsuits filed:
National Federation of the Blind (NFB) v. Target (Feb 2006)
NFB v. Connecticut Attorney General’s Office
H & R Block, HDVest, Intuit, and Gilman & Ciocia, for IRS e-filing: make these sites accessible by the 2000 tax season
Accessibility (Mankoff et al., 2005)
Compared the following four evaluation methods to test accessibility of websites:
Expert review
Screen reader
Remote
Automated (e.g., W3C)
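An automated check of the kind Mankoff et al. compare can be as simple as scanning markup for rule violations. The sketch below, using only Python's standard library, flags images with no alt text (one of many WCAG checkpoints); the HTML snippet is made up for illustration.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flag <img> elements that lack a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images lacking alt text

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            if not a.get("alt"):
                self.missing.append(a.get("src", "?"))

checker = AltTextChecker()
checker.feed('<img src="logo.png"><img src="chart.png" alt="Sales chart">')
print(checker.missing)  # ['logo.png']
```

Real tools check dozens of such rules; the point is that automated evaluation trades depth (no judgment of whether the alt text is meaningful) for coverage and speed.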
Web Rating Tools
Gomez (http://www.gomez.com)
E.g., airline flight search benchmark; free website performance test
Alexa traffic ranking
User-Perceived Web Quality (Aladwani & Palvia, 2002)
4 dimensions of web quality:
Technical adequacy
Specific content
Content accuracy
Web appearance
User-perceived web quality instrument (see Table 5, p. 474)
Data Accuracy
An example from the book “Failing Forward”
Human errors
The Databa$e Debacle
Leonardo da Vinci May 2, 1519
da Vinci, Leonardo 05/02/1519
Vinci, Leonardo da 1519
Vinci, Leoda 2 May 1519
Davinci, Leonardo May 1519
Leo Davinci May 2—1519
L. D. Vinci Monday 2 May 1519
Leonardo D. Vinci Dies Lunae ii Maius MDXIX
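Variants like these are what a data-cleaning step must absorb. A minimal sketch, assuming a fixed list of candidate date formats; entries that match none of them (Roman numerals, weekday prefixes) are left as unparsed rather than guessed at.

```python
from datetime import datetime

# Candidate formats, tried in order; the list is invented for this example.
FORMATS = ["%B %d, %Y", "%m/%d/%Y", "%d %B %Y", "%B %Y", "%Y"]

def normalize_date(raw):
    """Return a canonical ISO date, or None when the entry cannot be parsed."""
    for fmt in FORMATS:
        try:
            return datetime.strptime(raw.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    return None  # e.g., "Dies Lunae ii Maius MDXIX" needs a dedicated parser

print(normalize_date("May 2, 1519"))   # 1519-05-02
print(normalize_date("05/02/1519"))    # 1519-05-02
print(normalize_date("2 May 1519"))    # 1519-05-02
print(normalize_date("Dies Lunae ii Maius MDXIX"))  # None
```

Note that partial dates ("May 1519") silently pick up a default day of 1; a real pipeline would record the loss of precision instead of hiding it.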
Trends in Data Quality (Agosta, 2005)
Data quality now includes metadata quality (cf. tags in Flickr.com)
Data profiling is the first step: the use of analytical techniques about data for the purpose of developing a thorough knowledge of its content, structure and quality
IQ tools (e.g., DataFlux) are available
Policy-based information quality is necessary
Design a better process of controlling IQ
Agosta, L. (2005). Trends in data quality. DM Review, 15(2), 34-35.
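Profiling of the kind Agosta describes can start very small: per-column null rates and distinct counts already expose many quality problems. A minimal sketch over invented records (the column names and data are made up):

```python
from collections import Counter

def profile(rows):
    """Minimal data profile of a list of dict records: per column,
    report the null rate and the number of distinct non-null values."""
    stats = {}
    for row in rows:
        for col, val in row.items():
            s = stats.setdefault(col, {"n": 0, "nulls": 0, "values": Counter()})
            s["n"] += 1
            if val in (None, ""):
                s["nulls"] += 1
            else:
                s["values"][val] += 1
    return {col: {"null_rate": s["nulls"] / s["n"],
                  "distinct": len(s["values"])}
            for col, s in stats.items()}

report = profile([
    {"name": "Leonardo da Vinci", "died": "May 2, 1519"},
    {"name": "da Vinci, Leonardo", "died": "05/02/1519"},
    {"name": "Leonardo da Vinci", "died": ""},
])
print(report["name"]["distinct"])   # 2
print(report["died"]["null_rate"])  # 0.333...
```

Two distinct spellings of one person and a one-third null rate are exactly the signals a profiling pass is meant to surface before any policy decisions are made.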
10 Potholes in the Road of Information Quality (Strong et al.)
Three roles within the information manufacturing system:
Information producers
Information custodians
Information consumers
(See Figure 1, p. 39)
10 Potholes in the Road of Information Quality (Strong et al.)
Category: Dimensions
Intrinsic IQ: Accuracy, objectivity, believability, reputation
Accessibility IQ: Accessibility, security
Contextual IQ: Relevancy, value-added, timeliness, completeness, amount of information
Representational IQ: Interpretability, ease of understanding, concise & consistent representation
Problem with Numbers in Systems
This example is typical and illustrates how potholes can quickly multiply
http://www.fbi.gov/ucr/ucr.htm (see 1999, p. 14, Wyoming; cf. the Laramie Project)
10 Potholes in the Road of Information Quality (Strong et al.)
Information production potholes:
#1 Multiple sources of the same information produce different values
#2 Information is produced using subjective judgments, leading to bias
#3 Systemic errors in information production lead to lost information
Any examples related to the above problems?
10 Potholes in the Road of Information Quality (Strong et al.)
Information storage potholes:
#4 Large volumes of stored information make it difficult to access it in a reasonable time
#5 Distributed heterogeneous systems lead to inconsistent definitions, formats, and values
#6 Nonnumeric information is difficult to index
Any examples related to the above problems?
10 Potholes in the Road of Information Quality (Strong et al.)
Information utilization potholes:
#7 Automated content analysis across information collections is not yet available
#8 Consumers’ needs for information change
#9 Easy access to information may conflict with requirements for security, privacy, and confidentiality
#10 Lack of sufficient computing resources limits access
Any examples related to the above problems?
A Framework of IQ Assessment (Stvilia et al. 2005)
In Wikipedia, 4 types of “agents” exist:
Editor agents – add new content
Information Quality Assurance agents – control & enhance the quality of content
Malicious agents – degrade article quality
Environmental agents – represent temporal changes in the real world
A Framework of IQ Assessment (Stvilia et al. 2005)
Featured article quality assessment criteria:
1. Comprehensive
2. Accurate and verifiable by including references
3. Stable – not changing often
4. Well-written
5. Uncontroversial
6. Compliance with Wikipedia standards and project guides
7. Having appropriate images w/ acceptable copyright status
8. Having appropriate length, using summary style and focusing on the main topic
A Framework of IQ Assessment (Stvilia et al. 2005, p. 6, Figure 1)
Intrinsic: Accuracy/validity; Cohesiveness; Complexity; Semantic consistency; Structural consistency; Currency; Informativeness/redundancy; Naturalness; Precision/completeness
Relational/contextual: Accuracy; Accessibility
http://www3.interscience.wiley.com/cgi-bin/abstract/115805126/ABSTRACT
A Framework of IQ Assessment (Stvilia et al. 2005, p. 6, Figure 1)
Relational/contextual: Complexity; Naturalness; Informativeness/redundancy; Relevance (aboutness); Precision/completeness; Security; Semantic consistency; Structural consistency; Verifiability; Volatility
Reputational: Authority
A Framework of IQ Assessment (Stvilia et al, 2007, Table 5)
# | Metric | Featured article | Random article | Dimension | Source
2 | Total # of edits | 257 | 8 | Authority | Edit history
5 | Article length | 24,708 | 1,344 | Intrinsic completeness | Article
9 | # of external links | 9 | 0 | Verifiability | Article
13 | # of images | 5 | 0 | Intrinsic redundancy | Article
14 | Article age (days) | 1,153 | 388 | Intrinsic consistency | Edit history
19 | Administrator edit share (# of admin edits / total # of edits) | 0 | 0 | Intrinsic consistency | Edit history
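Metrics like those in Table 5 are computable directly from an article and its edit history. A sketch with hypothetical input shapes (the article text, links, and the `(editor, is_admin, days_ago)` tuples below are all invented for illustration, not taken from Stvilia et al.'s data):

```python
def article_metrics(text, external_links, edit_history):
    """Compute a few Table 5-style surrogates from hypothetical inputs:
    `text` is the article markup, `external_links` a list of URLs, and
    `edit_history` a list of (editor, is_admin, days_ago) tuples."""
    total = len(edit_history)
    admin = sum(1 for _, is_admin, _ in edit_history if is_admin)
    return {
        "total_edits": total,
        "article_length": len(text),
        "num_external_links": len(external_links),
        # age = days since the oldest recorded edit
        "article_age_days": max((d for _, _, d in edit_history), default=0),
        "admin_edit_share": admin / total if total else 0.0,
    }

m = article_metrics("Leonardo da Vinci was an Italian polymath ...",
                    ["http://example.org/leonardo"],
                    [("alice", True, 1153), ("bob", False, 40)])
print(m["total_edits"], m["admin_edit_share"], m["article_age_days"])
# 2 0.5 1153
```

Each value is only a proxy for a quality dimension (edit count for authority, link count for verifiability), which is exactly the framework's point: cheap observable surrogates stand in for expensive human judgments.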
From DQ to EQ (Kim et al., 2005)
The framework groups quality dimensions under three aspects of information: content, time, and form. EQ problems include irrelevant information, disorientation, and cognitive overhead; quality dimensions include info accuracy, info relevance, info completeness, info currency, info accessibility, info delivery quality, info packaging quality, interface structural quality, and history maintenance quality.
IQ for Mobile Internet Services (Chae et al., 2002)
Connection quality (stability, responsiveness), content quality (objectivity, believability, amount), interaction quality (structure, navigation, presentation), and contextual quality (timeliness, promptness) together drive user satisfaction, which in turn drives customer loyalty (Figure 2, p. 43).
Group Activity
We need 5 teams (5 people / team). Each team will create a metric and evaluate the information quality of www.amazon.com and www.bn.com:
1. Objectives of the IS
2. The purpose of the measurement
3. What to measure (link to #1)
4. How to measure
5. Limitations
Discuss the results with the whole class
To Wrap-Up
“if they know what to look for, organizations can anticipate and handle IQ problems before they trigger a crisis” (Strong et al.)
We can all help build a stronger and smoother “road to information quality” through understanding, mindfulness, and diligence
And if we do not do this, who will?