Assessing Effective IT Service Outcomes in Higher Education

The Higher Education TechQual+ Project Protocol Guide
(June 1, 2013 Revision)

Timothy M. Chester, Ph.D.
TechQual+ Principal Investigator


2013 Higher Education TechQual+ Core Survey Items

Connectivity and Access
Tell us about the quality of the Internet service on campus. When it comes to...

1. Having a campus Internet service that is reliable and that operates consistently across campus.

2. Having a campus Internet service that is fast and that provides speedy access to Web sites and rapid downloads.

3. Having wireless Internet coverage in all of the places that are important to me on campus.

4. Support for accessing the campus Internet service using my tablet or other mobile device.

Technology and Collaboration Services
Tell us about the quality of Web sites, online services, and technologies for collaboration. When it comes to...

5. Having campus Web sites and online services that are easy to use.

6. Accessing important campus Web sites and online services from my tablet or other mobile device.

7. Having campus technology services available that improve and enhance my collaboration with others.

8. Having technology within classrooms or other meeting areas that enhances the presentation and sharing of information.

Support and Training
Tell us about your experiences when obtaining assistance with technology on campus. When it comes to...

9. Technology support staff who are consistently courteous and thoughtful.

10. Technology support staff who are knowledgeable and can help me resolve problems with campus technology services.

11. Getting timely resolution to problems that I am experiencing with campus technology services.

12. Receiving timely communications regarding campus technology services, explained in a relevant and easy-to-understand form.

13. Getting access to training or other self-help information that can enable me to become more effective in my use of campus technology services.

Table of Contents

PART I: UNDERSTANDING THE BACKGROUND, ASSUMPTIONS, AND APPROACH OF THE HIGHER EDUCATION TECHQUAL+ PROJECT
  INTRODUCTION
  FROM INPUTS AND SERVICES TO OUTCOMES AND EFFECTIVENESS
  ORGANIZATIONAL ROLES AND CREDIBILITY
  THE HIGHER EDUCATION TECHQUAL+ PROJECT
  THE SERVQUAL ASSESSMENT MODEL
  UNDERSTANDING TECHQUAL+ SURVEY RESULTS
  THE TECHQUAL+ CORE SURVEY INSTRUMENT
  TECHQUAL+ SURVEY TOOLS
  DEMONSTRATING EFFECTIVENESS WITH TECHQUAL+

PART II: USING THE TECHQUAL+ WEB SITE TOOLS
  COST
  END USER ACCOUNTS
  INSTITUTIONAL PROFILE AND NAVIGATION
  SURVEYS
  CREATING OR EDITING TECHQUAL+ SURVEYS
  COLLECTING DATA THROUGH YOUR TECHQUAL+ SURVEY
  RANDOM SAMPLING OF RESPONDENTS
  COMMUNICATING WITH RESPONDENTS
  VIEWING, ANALYZING, AND DOWNLOADING SURVEY RESULTS
  THE TECHQUAL+ PEER DATABASE

PART III: CONDUCTING TECHQUAL+ SURVEYS AND GETTING THE MOST OUT OF THE TECHQUAL+ PROJECT
  BECOME FAMILIAR WITH THE TECHQUAL+ TOOLS
  PLAN AN ANNUAL SURVEY
  OBTAIN INSTITUTIONAL REVIEW BOARD (IRB) APPROVAL
  CAREFULLY CONSIDER YOUR RESPONDENT ATTRIBUTE LIST
  INCLUDE INSTITUTION-SPECIFIC IT SERVICE OUTCOMES
  PREPARE YOUR ADDITIONAL QUESTIONS
  AVOID CREATING AN UNNECESSARILY COMPLEX SURVEY
  PREPARING YOUR RESPONDENT LISTS
  USE RANDOM SAMPLING TO SELECT YOUR RESPONDENTS
  COMMUNICATING WITH RESPONDENTS ABOUT YOUR SURVEY
  ANALYZE THE RESULTS AND CREATE AN AGENDA FOR ACTION
  DON'T HESITATE TO REQUEST ASSISTANCE

APPENDIX
  Image 1: Join the TechQual+ Project Screen
  Image 2: Account Setup Screen
  Image 3: Email Notification Settings
  Image 4: Upper Right Navigation Links
  Image 4a: Survey Design Screen
  Image 5: Main Drop Down Navigation Menu
  Image 5a: Custom Items Screen
  Image 6: Institutional Surveys Screen
  Image 6a: Additional Questions Tab
  Image 7: Additional Question Library
  Image 7a: Survey Instructions Screen
  Image 8: Survey Collection Options
  Image 9: Respondents Tab Listing (Upload Respondents Selected)
  Image 9a: Respondents Tab Listing (Direct Link Selected)
  Image 9b: Delete Respondents Section (Collection Settings Tab)
  Image 9c: Direct Link Tab
  Image 9d: Respondent List Tab
  Image 10: Add Respondents Tab
  Image 11: Send Emails Tab
  Image 12: Choose Criteria for Selecting Email Recipients Page
  Image 12a: Email History Tab
  Image 12b: Post-survey Cleanup Tab
  Image 13: Tabs on the Survey Results / Analyze Page
  Image 14: Population Tab on the Results Page
  Image 15: Change Criteria Popup
  Image 16: Respondent Analysis Data on Population Tab (Results Page)
  Image 17: Zones of Tolerance View of Survey Results
  Image 18: Survey Results Data Table
  Image 19: Radar Chart of Survey Results
  Image 20: Additional Questions Drop Down List
  Image 21: Wordle Visualization of Suggestions
  Image 22: Peer Database Comparison Filter
  Image 23: TechQual+ Peer Database Page

Revision History

August 4, 2010: Draft of protocol guide released to participating institutions.

August 14, 2010: The section on open-ended questions and suggestions in Part II is updated to reflect that a respondent identifier is now listed inside [ ] at the end of each suggestion or answer to an open-ended question. This allows you to track qualitative feedback by respondent in a qualitative analysis tool such as ATLAS.ti.

March 14, 2011: Several minor updates to reflect subtle changes to the TechQual+ Web site, including:

• After the post-survey cleanup, access to the survey design and settings functions is removed in order to prevent individuals from accidentally deleting data from a previous survey after it has closed.

• Access to the peer database has changed. Access is now granted to all peer data for all available years, provided that the institution has completed, and submitted to the peer database, a TechQual+ survey with a minimum of 50 completed surveys within the past 24 months.

• Changes to the survey, incorporating suggestions from respondents and participating institutions.

April 16, 2011: Several minor updates to reflect new communications functionality in the TechQual+ Web site, including a new HTML editor, new templates that can be adopted for communicating with respondents, and the ability to set a future date for delivery of messages to respondents.

March 5, 2012: Updated to reflect new functionality and changes to the basic organization of the Web site tools and services, made available with the release of the early 2012 update to the core survey instrument. Additionally, access to the peer database is now on a survey-by-survey basis, with a minimum of 50 completed surveys required before the Compare functions are available.

March 19, 2012: Covers updates in Web site functionality put into production the week of March 12-19, 2012. In addition to bug fixes, this includes:

• When hovering the mouse pointer over the names of uploaded respondents in the collector view (Respondent List tab), a tool tip now appears that lists the attributes uploaded for each respondent. This makes it easier to verify that the attributes for each respondent have been uploaded properly.

• When uploading respondents in the collector view (Add Respondents tab), in cases where a respondent has previously been uploaded for the survey (based on email address), the Web site updates the respondent's record (lastname, firstname, attribute1, attribute2, attribute3, ...) with the information submitted. This allows institutions to correct respondent information from erroneous previous uploads without disturbing the actual data submitted by respondents when completing the survey.

• New functions in the peer database allow you to generate radar charts of survey results on a peer-group-by-peer-group basis, in addition to the existing functions that generate Zones of Tolerance views on an item-by-item basis.

• On the analysis screen (Options tab), institutions may now download a dataset containing the raw data collected through their survey. The file is in CSV format and is designed to allow institutions to perform their own analysis of survey data. Note that in order to download the dataset, the survey must be closed and the post-survey processing step must be completed.

April 15, 2012: Covers the new Email History tab on the Collect Data screen, which displays pending messages queued for future delivery to respondents as well as messages previously delivered to respondents for the survey. This tab replaces the Message Queue link previously available on the Send Emails tab.

May 13, 2012: Updated to reflect one additional question added to the core TechQual+ survey items.

July 9, 2012: Updated to reflect additional functionality, including the ability to tailor custom items and additional questions to just faculty, students, or staff; the ability to create filtered views of survey results using the self-reported University Role, Gender, and Age Group fields; and enhanced population analysis tools.

June 1, 2013: Updated to include screenshots reflecting new Web site templates based on responsive design principles.
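A note on working with the downloadable raw data: the March 19, 2012 entry above mentions that, once a survey is closed and post-survey processing is complete, a CSV of raw survey data can be downloaded from the Options tab of the analysis screen. This guide does not document the file's column layout, so the column names below (role, min_3, desired_3, perceived_3 for core item #3) are hypothetical placeholders; check the header row of your own download. A minimal Python sketch of the kind of offline analysis an institution might perform:

    # Hypothetical offline analysis of the downloadable raw-data CSV.
    # Column names are assumptions, not the documented TechQual+ layout;
    # substitute the names found in your file's header row.
    import pandas as pd

    df = pd.read_csv("techqual_raw.csv")

    # Gap scores for core item #3, computed per respondent.
    df["adequacy_gap_3"] = df["perceived_3"] - df["min_3"]
    df["superiority_gap_3"] = df["perceived_3"] - df["desired_3"]

    # Mean gap scores broken out by respondent role (faculty/student/staff).
    print(df.groupby("role")[["adequacy_gap_3", "superiority_gap_3"]].mean().round(2))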


PART I: UNDERSTANDING THE BACKGROUND, ASSUMPTIONS, AND APPROACH OF THE HIGHER EDUCATION TECHQUAL+ PROJECT

INTRODUCTION

Why are metrics, benchmarks, or performance indicators (whatever term the reader prefers) critical for Information Technology (IT) leaders in higher education? Although institutions of higher learning are diverse, and an almost equal diversity exists in the organization of IT functions within them, broad agreement exists on the following:

• IT organizations must increase the value of their contribution to the institution, in a manner consistent with the institution's mission.

• There is a need to demonstrate the value of technology services to a variety of audiences, both internal and external to the IT organization.

• Demonstrating the successful delivery and use of technology is vital to demonstrating the value and effectiveness of IT organizations.

• The nature of IT delivery is unique and complex, and it is often difficult to comprehend for those outside the IT organization. Performance indicators can help IT organizations demonstrate the value of IT to these external audiences.

• Performance indicators are important components of an overall strategy aimed at improving individual and organizational performance, supporting organizational change, and focusing on important priorities.

A critical assumption of the Higher Education TechQual+ Project is that the end user perspective should be central to the conceptualization and definition of performance indicators for IT organizations. A second important TechQual+ assumption is that the technology-centric bias of IT organizations often leads to a gap between those who deliver technology (the IT organization) and those who use technology (end users). Separation between these two groups often stems from differences in language, culture, rituals, and practices. Over time, failure to bridge this gap erodes the appreciation, trustworthiness, and respect afforded to IT organizations; in other words, failure to bridge the gap destroys credibility.

Symptoms of eroding credibility for IT leaders and IT organizations often resemble the following:


• IT leaders spend far too much time justifying their budgets and advocating the value and effectiveness of the IT organization to senior executives such as the CFO.

• Senior executives rely on those outside the IT organization, such as faculty, administrators, or external consultants, to analyze and consider ways to better organize and effectively deliver IT services.

• There is increased vulnerability to budget cuts. In significant economic downturns, a disproportionate share of cost reduction efforts may fall upon IT.

• Increased resistance occurs across the institution as the IT organization (or its leadership) leads technology efforts. IT finds out about new technology initiatives only after others outside IT have made decisions regarding the use of IT.

Some have publicly questioned IT's ability to perform above and beyond the order-taking role, arguing that IT thought leadership is best provided by someone from outside IT. Others question whether the role of a CIO is even meaningful or necessary. Some insightful commentary on this subject includes the following:

• In “Rethinking the IT Core,” Albert DeSimone argues that the IT advisory, consultative, and thought leadership roles should be split apart from the traditional IT organization.[1]

• In “Redefining IT Leadership: A Provost’s Perspective,” David Farrar describes his efforts at reorganizing and rebuilding an IT organization that had lost all credibility. The role of CIO was split along lines similar to DeSimone’s suggestions.[2]

• In “The Incredible Shrinking CIO,” Jeffrey Young makes observations about the downgrading of the role of CIO at several institutions, some of which now have the CIO reporting to the CFO.[3]

Another insightful view was articulated by Walt Mossberg at the 2007 Chronicle of Higher Education President’s Forum. To an audience filled with college and university presidents, Mossberg observed that information technology organizations in higher education are “the most regressive and poisonous force in technology today.”[4] Some of Mossberg’s specific complaints revolved around centralized control over technical environments; in essence, however, the core of his complaint was that IT organizations often fail to adequately understand and appreciate the conditions that allow those outside the IT organization to use technology more effectively.

[1] DeSimone, Albert Jr. 2009. “Rethinking the IT Core.” EDUCAUSE Quarterly, Number 2.
[2] Farrar, David H. 2010. “Redefining IT Leadership: A Provost’s Perspective.” EDUCAUSE Review, Number 2, March/April.
[3] Young, Jeffrey. 2010. “The Incredible Shrinking CIO.” The Chronicle of Higher Education, May 9.

When IT organizations demonstrate value and effectiveness, they accrue appreciation, trustworthiness, and respect, otherwise known as credibility. Increased credibility brings autonomy, control, authority, and goodwill, things that lead to more effective delivery and use of technology across an institution. Credible IT organizations can accomplish many difficult things. The goal of IT leaders should be to bridge the gap between the IT organization and the community of end users that it serves, thereby increasing the credibility of IT services and the IT organization. This leads to more effective delivery and use of technology across the institution. The Higher Education TechQual+ Project can help IT leaders do just that.

FROM INPUTS AND SERVICES TO OUTCOMES AND EFFECTIVENESS

With the recognition of the fundamental importance of technology in higher education, IT leaders now often find themselves in new and challenging situations. Some examples:

• During the institution’s reaffirmation of accreditation, the visiting accreditation team notes that IT compares well to its peers in terms of services, but that those services by themselves do not demonstrate the effectiveness of technology across the institution. As an IT leader, how can one respond?

• During the annual budget process, IT makes a request for significant new resources. As a precursor to supporting new IT funding, the president and provost ask for evidence that the funds granted previously were used effectively. As an IT leader, how can one respond?

• A faculty member sends an email, copying the president and provost, protesting in broad and sweeping terms the poor IT service delivery in their college. The faculty member concludes by noting, “Every faculty member I know feels the same way.” As an IT leader, how can one respond?

Each of these challenges goes beyond the basics of delivering information technology and speaks directly to the challenge of delivering IT effectively. To respond successfully to the challenge of accountability, IT organizations need evidence that demonstrates the value of IT services to those outside the IT organization. Successful IT organizations are ones that are highly regarded for their use of an outcomes-based approach to assessment, planning, and prioritization. However, the complexities of assessment are not a natural competency for IT organizations. With end-user-focused data in hand, one can easily understand failures in service delivery as one-time mistakes, as opposed to urban myths of recurring problems in IT. Good data also allow IT leaders to respond to the requests of both administrators and accreditation bodies, who increasingly request evidence of successful outcomes in this era of accountability. This is the intended purpose of TechQual+. With it, IT organizations can compile the evidence that helps them respond to these critical challenges. Assessment, planning, prioritization, and accountability are the processes that increase the effective delivery and use of technology.

[4] Carnevale, Dan. 2007. “The Most Poisonous Force in Technology.” The Chronicle of Higher Education, June 22.

ORGANIZATIONAL ROLES AND CREDIBILITY

IT organizations are expected to perform multiple roles across the institution. The transactional service and order-taking roles are centered on the delivery of basic computing and collaboration services, as well as other services on request. In the advisory and consultative roles, IT staff members reflect with end users on opportunities, challenges, and threats. This is commonly thought of as the role of embedded or functional IT support, where IT staff proactively consider options and implement solutions. Finally, the role of the thought leader evolved as IT leaders became chief information officers. This last role reflects recognition by presidents, provosts, and CFOs that the voice of technology advocacy should be represented at the leadership table.[5]

While most IT organizations understand these different roles, the relationship between differing role expectations and credible role performance is often misunderstood. In the transactional role, successful performance is based on the belief that transactional services are reliable, consistent, efficient, and responsive to end user needs. In the consultative and advisory roles, successful performance is based on demonstrating business smarts, analytical capabilities, and an understanding of business and information architectures. In the thought leader role, successful performance is defined in terms of building effective partnerships and demonstrating change advocacy. Progressive role performance is foundational in the sense that credible performance in the thought leader, advisory, and consultative roles depends entirely on successful performance in the transactional role. For example, no IT organization can credibly perform the role of thought leader long-term if there are basic questions about its ability to provide consistent, reliable, and responsive transactional services.[6]

For IT organizations, demonstrating the effective delivery of technology services is vital to the establishment of appreciation, respect, and trustworthiness, the building blocks of credibility. Because most of the work performed by an IT organization remains in a “black box” to those outside IT, the credibility of the IT organization is vital to securing acceptance, support, autonomy, and adequate budgetary resources. Credibility allows IT organizations to perform duties beyond the order-taking role. The research undertaken through the Higher Education TechQual+ Project seeks to answer the question of how IT organizations can demonstrate the effective delivery of technology services in a way that builds and sustains credibility.

Far too many IT organizations rely on credibility either derived from authority or accrued through goodwill. This results in a weak foundation for successful performance in the consultative, advisory, and thought leader roles. Credibility derived from an organizational chart is not sustainable when detached from the successful provision of transactional services. When this detachment persists, positional credibility erodes and the IT organization experiences increasing levels of resistance, limiting its effectiveness. To counter this resistance, IT leaders often turn to goodwill as a basis for credibility. Because accruing goodwill often requires saying “yes” when saying “no” is more prudent, this can result in a cycle of over-commitment and under-performance, which also limits effectiveness. At best, the delivery of IT services becomes inconsistent, less responsive, more reactive, and more costly. At worst, the cycle of over-commitment and under-performance becomes a death spiral, eventually leading to radical overhauls of both IT leadership and the IT organization.

A third basis for credibility correlates highly with sustainable forms of appreciation, respect, and trustworthiness: the demonstration of successful outcomes through a regular, recurring cycle of assessment, planning, and prioritization. This allows IT to establish a credible foundation that supports successful performance beyond the transactional, order-taking role. The most crucial inputs into this planning cycle are valid and reliable measures that indicate the effectiveness of technology services. Most IT organizations rely on institution-specific surveys to generate this type of evidence, as existing sources of peer data, such as the EDUCAUSE Core Data Service, do not speak to outcomes. The diversity of IT services, and the ways in which IT services are delivered across different types of institutions, makes the challenge of creating a single approach quite daunting. Despite this difficulty, several endeavors are underway to create standardized performance measures that can be used by multiple institutions.[7]

[5] Penrod, James I., Michael G. Dolence, and Judith V. Douglas. 1990. The Chief Information Officer in Higher Education. CAUSE Professional Paper Series, Number 4.
[6] Beeby, Daniel, Sunny Donenfeld, Klara Jelinkova, Jim Knox, Eileen Palenchar, and Joseph Rini. 2006. “Increasing IT Value for Customers: A Challenge for Higher Education.” EDUCAUSE Center for Applied Research, February 28.

THE HIGHER EDUCATION TECHQUAL+ PROJECT

One of these endeavors is the Higher Education TechQual+ Project. Established in 2007, the goal of this project is to produce the following:

• Measures that conceptualize the effective delivery and use of technology in such a way that it can be practically measured, or operationalized, from the standpoint of individuals outside the IT organization who depend on IT services.

• A set of easy-to-use web-based tools that allows institutions to create surveys based on the TechQual+ instrument, to communicate with respondents, and to analyze and report on survey results.

• A peer database, aggregated by Carnegie basic classification, that allows institutions to compare their IT service outcomes against those of similar institutions.

What distinguishes TechQual+ from other efforts at standardization is its focus on defining effective IT service outcomes from an end-user point of view. This end-user-centered approach should not be confused with an attempt to gauge customer satisfaction. What IT organizations refer to as “customer satisfaction” is typically thought of as “effectiveness” by users outside the IT organization.

The Higher Education TechQual+ Project is inspired by the groundbreaking research that resulted in LibQual+, an outcomes-based approach for assessing the quality of library services. Supported through the Association of Research Libraries (ARL), LibQual+ is administered annually at over 1,000 institutions and has been translated into multiple languages for use by international institutions. Data collected through LibQual+ are designed to help libraries improve services by aligning them with the expectations of the communities they serve. In many regards, LibQual+ served as an agent of change as libraries evolved from static, physical repositories to dynamic places for collaboration. LibQual+ provides a core instrument that measures end user evaluations of their library experiences, along with a set of easy-to-use web-based tools for creating and conducting LibQual+ assessments. It should be noted that the significant momentum behind LibQual+ is due in part to the fact that most professional librarians also hold faculty appointments; rigorous assessment and planning are firmly entrenched in their culture of practice.

[7] Both the EDUCAUSE IT Metrics Constituent Group and the Consortium for the Establishment of Information Technology Performance Standards (CEITPS) are developing standards for outcomes-based performance indicators for IT organizations.

THE SERVQUAL ASSESSMENT MODEL

Both LibQual+ and TechQual+ are based on an approach to assessing service quality that was first articulated as SERVQUAL.[8] This approach to understanding service quality is based on the assessment of three different measures for every dimension of service, or, in TechQual+ parlance, for every IT service outcome:

• Minimum Expectations represents the minimum level of service that a respondent finds acceptable.

• Desired Expectations represents the level of service that a respondent really wants.

• Perceived Performance represents the level of service that is typically provided, relative to both minimum and desired expectations.

For example, item #3 on the TechQual+ core survey reads, “When it comes to having wireless Internet coverage in all of the places that are important to me on campus.” Survey respondents are asked to rate their minimum expectations, their desired expectations, and their perceived performance, using a 1 to 9 scale for each rating. When analyzing the results, evaluations of perceived performance are best understood within the context of both minimum and desired expectations. The range between minimum and desired expectations constitutes a “Zone of Tolerance” that should be understood as the range of possible service outcomes that respondents find acceptable. Should the perceived performance rating fall below the Zone of Tolerance, this indicates performance that is below minimum expectations. Should the perceived performance lie above the Zone of Tolerance, this indicates performance that exceeds desired expectations. The literature on the Zone of Tolerance concept suggests that end users find performance adequate when it lies within the general range between their minimum and desired expectations.[9]

In addition to the Zone of Tolerance, two other concepts are crucial to TechQual+. The Adequacy Gap Score is computed by subtracting the minimum expectation rating from the perceived performance rating. A positive number indicates the degree to which service performance exceeds a respondent’s minimum expectations; a negative number indicates the degree to which service performance falls below minimum expectations. The Superiority Gap Score is computed by subtracting the desired expectation rating from the perceived performance rating. A positive number indicates the degree to which service performance exceeds desired expectations; a negative number indicates the degree to which service performance falls below desired expectations.

[8] Parasuraman, A., Zeithaml, V.A., & Berry, L.L. 1985. “A conceptual model of service quality and its implications for future research.” Journal of Marketing, 49, 41-50.
[9] Cook, C., Heath, F., & Thompson, B. 2003. “Zones of tolerance in the perceptions of library service quality: A LibQual+ study.” Libraries and the Academy, 3:1, 113-121.

UNDERSTANDING TECHQUAL+ SURVEY RESULTS

Table 1 shows a partial results table from a TechQual+ survey.[10] When analyzing these results, the following can be observed:

1. The Zone of Tolerance for wireless network coverage is between 7.04 and 8.62, on a scale of 1 to 9.

2. The Adequacy Gap Score for wireless network coverage is positive (0.42), indicating performance above minimum expectations. The Superiority Gap Score for wireless network coverage is negative (-1.17), indicating performance below desired expectations. Thus, performance for wireless network coverage is within the Zone of Tolerance, indicating satisfactory performance in the eyes of respondents.

3. The Zone of Tolerance for mobile device access is between 5.77 and 7.63, on a scale of 1 to 9.

4. The Adequacy Gap Score for mobile device access is positive (0.70), indicating performance above minimum expectations. The Superiority Gap Score for mobile device access is negative (-1.16), indicating performance below desired expectations. Thus, performance for mobile device access is within the Zone of Tolerance, indicating satisfactory performance in the eyes of respondents.

Item #3. When it comes to wireless network coverage in all the areas that are important to me as a faculty, student, or staff member:
  Min. Expect. 7.04 | Desired Expect. 8.62 | Perceived Perform. 7.45 | Adequacy Gap Score 0.42 | Superiority Gap Score -1.17 | n = 406

Item #5. When it comes to having access to important university-provided technology services from my mobile device:
  Min. Expect. 5.77 | Desired Expect. 7.63 | Perceived Perform. 6.47 | Adequacy Gap Score 0.70 | Superiority Gap Score -1.16 | n = 285

Table 1. 2010 TechQual+ Student Survey Results, Pepperdine University (items #3 and #5 only).
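To make the arithmetic concrete, here is a minimal Python sketch (not part of the TechQual+ tools; the function names are illustrative) that reproduces the Table 1 gap scores for item #5 from the three mean ratings:

    # Minimal sketch of the SERVQUAL-style gap computations described above.
    # Inputs are the mean 1-9 ratings for a single survey item.

    def adequacy_gap(perceived, minimum):
        # Positive: performance exceeds minimum expectations.
        return perceived - minimum

    def superiority_gap(perceived, desired):
        # Negative: performance falls short of desired expectations.
        return perceived - desired

    def within_zone_of_tolerance(perceived, minimum, desired):
        # The Zone of Tolerance is the range between minimum and desired.
        return minimum <= perceived <= desired

    minimum, desired, perceived = 5.77, 7.63, 6.47  # item #5, Table 1
    print(f"Adequacy Gap Score:    {adequacy_gap(perceived, minimum):+.2f}")     # +0.70
    print(f"Superiority Gap Score: {superiority_gap(perceived, desired):+.2f}")  # -1.16
    print(within_zone_of_tolerance(perceived, minimum, desired))                 # True

A perceived performance rating inside the zone, as here, is read as satisfactory; a negative Adequacy Gap Score would instead signal performance below minimum expectations.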

Another distinguishing characteristic of the TechQual+ approach is that it provides indirect evidence of respondents’ priorities. By comparing the Zone of Tolerance across items, one can observe different levels of expectations. For example, the results above show that item #5 has a lower Zone of Tolerance than item #3, suggesting that mobile device support is a lower priority for respondents than wireless network coverage. End users typically have higher expectations for areas that are more important to them.

[10] The data in this chart come from the Pepperdine University Spring 2010 TechQual+ Assessment. Items 3 and 5, from the TechQual+ core instrument, are used for illustration purposes.

Two other distinguishing features of the TechQual+ approach make these quantitative results even more meaningful. First, survey administrators may include descriptive attributes for each respondent, such as role (faculty, student, or staff), college or school affiliation, campus, department, gender, or age. The TechQual+ Web site allows up to ten descriptive attributes for each respondent, and survey results can then be filtered based on these attributes. Second, when a respondent indicates that perceived service performance is equal to or lower than their minimum expectations, they are prompted to provide suggestions for improvement. These free-form comments can be analyzed further to contextualize the raw scores and turn them into actionable insights.

For example, at Pepperdine University, the results of the 2008 annual TechQual+ assessment showed satisfactory Adequacy Gap Scores for wireless network coverage across all students. However, when filtering the results by school, the results suggested dissatisfaction with wireless network coverage among undergraduate students. Analyzing the free-form suggestions for this IT service outcome revealed that lack of wireless coverage in the dormitories was the cause of the poor Adequacy Gap Scores. Based on these data, expanding wireless network coverage to the dormitories became a higher priority for the IT organization, and these data were used to support a budget request for this initiative.

THE TECHQUAL+ CORE SURVEY INSTRUMENT

The core TechQual+ instrument includes thirteen items that are designed to capture users’ evaluations of IT service outcomes at their institution. In specifying these core items, TechQual+ articulates a general approach for conceptualizing the expectations of faculty, students, and staff. This may appear counterintuitive to some, given the diversity of institutions and the myriad ways in which IT services are organized within them. However, although TechQual+ items are couched in general terms, by filtering based on respondent attributes and analyzing comments and suggestions, one can easily turn TechQual+ results into an institution-specific plan of action.

In formulating the TechQual+ core instrument, a classical social scientific approach has been followed. Project investigators have relied on focus groups at participating institutions to ascertain the core IT commitments expected by faculty, students, and staff. This approach utilizes a naturalistic inquiry method for qualitative research, whereby investigators rely on unstructured observations and conversations to formulate general themes from unique and complex subject matter.[11] To date, project investigators have conducted over 40 hours of focus groups at five institutions: a large, state-supported research-extensive institution (University of New Mexico); two smaller, regional liberal arts colleges (Abilene Christian University and Furman University); a highly selective, research-extensive private institution (Boston University); and a regional teaching college (University of Tennessee at Chattanooga). While there is incredible diversity across these institutions in terms of the types of technology services and service delivery models, the TechQual+ investigators have found remarkable consistency in the core commitments expected of IT organizations. These expectations hold up whether one is discussing IT expectations with an engineering professor at Boston University, a student at Abilene Christian University, or a staff member at the University of New Mexico. These three core expectations are:

• Connectivity and Access – asks respondents to assess the quality of the Internet service on campus;

• Technology and Collaboration Services – asks respondents to assess the quality of Web sites, online services, and technologies for collaboration;

• Support and Training – asks respondents to assess their experiences when obtaining assistance with technology on campus.

Each of these core commitments is assessed through four or five separate items, or IT service outcomes, on the TechQual+ core instrument. These IT service outcomes are designed to reflect the more specific expectations that end users have for the core commitment. When determining the perceived performance of any IT service outcome, the results from focus groups suggest that faculty, students, and staff subjectively rely on one or more of the following criteria when evaluating technology services:

• Consistency: Is the service provided consistently to end users, independent of place, time, or the individual providing the service?

• Communication: Is communication about the service adequate and proactive, and is that communication intelligible to individuals outside the IT organization?

• Collaboration: Does proficient use of the technology service effectively increase collaboration opportunities with others across the institution?

[11] Lincoln, Yvonna S. & Guba, Egon G. 1985. Naturalistic Inquiry. Sage Publications.

In general, when evaluating a specific technology service, end users tend to make a positive evaluation of that service when it is delivered consistently, when communication regarding the service is proactive and intelligible, and when the service increases collaboration opportunities with others. To date, the purpose of TechQual+ focus groups has been the identification of the three core commitments and the evaluative criteria used by end users when assessing service quality. Much more work remains to be done. Future efforts will be directed at statistically validating the IT service outcomes (or survey items) that align with each core commitment. Once this work is accomplished, a revised TechQual+ core instrument will be released.

TECHQUAL+ SURVEY TOOLS

To assist institutions with administering TechQual+ surveys, the project also provides web-based tools that make it easier to create TechQual+ surveys, communicate with respondents, analyze results, and compare those results with those of peer institutions. The site provides graphs and reports that are suitable for a variety of audiences, from faculty and students to campus leaders. TechQual+ surveys can also include custom, institution-specific IT service outcomes as well as multiple choice, multiple answer, and open-ended questions. The TechQual+ surveys are hosted on enterprise-grade infrastructure that scales to the largest of institutions.

The TechQual+ approach to assessing service quality is applicable to institutions of all shapes and sizes. Smaller institutions, where IT is often mostly centralized, can use TechQual+ to ascertain the strengths and weaknesses of technology services and to align their organizational priorities with those of their end user community. Larger institutions with decentralized services can disaggregate TechQual+ results to assess the strengths and weaknesses of services across decentralized units. Such data are often helpful in determining best practices or in planning for service consolidation.

DEMONSTRATING EFFECTIVENESS WITH TECHQUAL+

At Pepperdine University, TechQual+ assessment data have been used to raise the credibility of the IT organization in a way that improves morale and increases institutional support for key technology initiatives. Upon arriving in 2007, the first thing the new CIO did was ask all IT staff members to complete a TechQual+ survey in order to assess the strength of services provided by their organization. The results were dismal, reflecting significant issues with morale.

Six months later, staff perceptions of service quality were compared with student perceptions of service quality. Not surprisingly, students had a much more positive perception of service quality, and this comparison helped to shore up morale within the IT department. The next year, the results of the student TechQual+ survey were used to support a million-dollar budget request to install wireless network capabilities in the dormitories. Once that project was completed, the student TechQual+ survey for the following year showed dramatic improvement in the perceived performance of this IT service outcome. By illustrating the positive results stemming from previous investments in IT, the IT organization was able to establish new credibility that has been helpful in its endeavors to increase the effective delivery and use of technology across the institution.

Furman University has administered the TechQual+ survey annually since 2008. Furman uses the TechQual+ data to raise campus awareness of efforts to improve technology services on campus. The data have provided a framework for discussing strategic priorities for technology services and for supporting budget requests. Results from annual TechQual+ assessments are posted on the IT department’s Web site, discussed at faculty meetings, and presented to the president’s cabinet. The results demonstrated the need for improved wireless access for students and showed that most faculty members were very unhappy with the quality of technology services. These data were incorporated into the CIO’s annual planning efforts, to great effect. Subsequent use of the TechQual+ instrument has shown the positive effects of those planning efforts, as both faculty and student perceptions have improved over time. By administering the TechQual+ instrument annually, the IT leadership team at Furman University is able to identify trends and to take advantage of free-form comments and suggestions that allow it to turn end user perceptions into an institution-specific agenda for action.


PART II: USING THE TECHQUAL+ WEB SITE TOOLS

The Higher Education TechQual+ Web site provides a set of easy-to-use tools for creating TechQual+ surveys, communicating with respondents, analyzing the results of a survey, and comparing those results with those of peer institutions. This section of the protocol guide covers the organization and use of the TechQual+ Web site.

COST

Use of the TechQual+ survey and the tools available through the Web site at http://www.techqual.org is free to non-profit institutions of higher education. Institutions and individuals are encouraged to reference the TechQual+ project in any content derived from the use of the TechQual+ survey, such as graphs, charts, and tables.

END USER ACCOUNTS

End user accounts are organized around institutions. The Web site is preloaded with all institutions of higher learning in the United States as defined by the Carnegie Foundation. To create an account, visit the Web site at http://www.techqual.org and click the Signup link in the top right-hand corner of the page. After reading the material on the following page, click the Signup link at the bottom of the screen (see Image 1). Begin the signup process by selecting your country, state, and institution. If your institution is not listed, or if you represent an international institution, use the new institution request form link below the drop-down boxes (see Image 1). Accounts for international institutions may be supported provided that the institution provides instruction in the English language.

Each end user account may have five permissions associated with it. Each permission allows access to different functions within the TechQual+ Web site (see Image 2).

The Manage Users from primary institution permission is assigned to the first individual who requests an account for an institution. Normally, this individual will be someone within the central IT or institutional research function at the institution. The TechQual+ principal investigator may perform due diligence as necessary to ensure that individuals requesting this level of access are associated with the institution and have responsibilities consistent with the use of the TechQual+ survey and tools. Individuals with this permission have the ability to manage accounts and permissions for all user accounts associated with their institution.

An end user account assigned the Design surveys permission has the ability to design and edit TechQual+ surveys for their institution.

User accounts assigned the Collect data from respondents permission have the ability to create links to the survey, or to upload respondents and send emails asking individuals to complete TechQual+ surveys. The Analyze survey results permission affords the end user the ability to review, analyze, and download the results from TechQual+ surveys conducted at their institution. An account assigned the Compare the results of surveys with peer institutions permission has the ability to compare the results of its surveys against the results of similar surveys conducted at peer institutions. The results in the peer database are aggregated by the Carnegie Foundation basic classification.

Communications among project participants are managed through the Discussion Forums. Individuals have the ability to subscribe to email notifications associated with the different topics in the forums (see Image 3). End users may also choose to include their profiles on the publicly available participant listing on the Homepage of the Web site. All participants are listed on the participant page listing available upon login.

INSTITUTIONAL PROFILE AND NAVIGATION

Upon login, a set of links appears in the top right-hand corner of every subsequent Web page (see Image 4). Clicking the Hello link next to the user’s name leads to a screen that allows editing of the user profile and email subscriptions for the discussion forums. The Navigation Menu link leads to a drop-down menu of Web site functions specific to the end user. The Sign out link logs the user out of the TechQual+ Web site.

The main navigation drop-down menu provides access to different functions on the Web site (see Image 5). Links under the My Links heading are provided for every individual accessing the Web site. The links under Coordinator Links are provided for those individuals who have the Manage users from primary institution permission.

The Survey Home link takes the user to the main Web site landing page, which provides access to surveys associated with the individual’s primary institution. The Discussion Forums link takes the user to the discussion forums, where they may collaborate with individuals from all institutions participating in the project. The Profile and Email Notifications link takes the user to the profile page, where they may edit their profile and email subscriptions for the discussion forums. The <InstitutionName> User List link takes the user to a listing of all individuals with TechQual+ Web site accounts associated with their primary institution. The TechQual+ Participant Directory link takes the user to a listing of all institutions and individuals participating in the project; this list also includes information on institutions that have contributed results from their campus surveys to the peer database.

For those individuals with the Manage users from primary institution permission, the Edit Users for <InstitutionName> link leads to a listing of accounts associated with their primary institution. This list also contains links that allow the individual to edit the profile, permissions, and discussion forum notifications for each user associated with their institution.

SURVEYS

Select the Survey Home link at the top of the horizontal navigation menu (see Image 5) to create a new TechQual+ survey, edit an existing survey, view the results of a survey, or compare survey results with those from peer institutions (see Image 6). This screen lists each of the TechQual+ surveys created for the end user’s institution, organized by calendar year. The most recent calendar year with surveys appears by default; to select a different calendar year, use the select box on the right, just above the list of surveys. Use the icons to the left and right of each survey name to access the various survey functions. Just above the listing of surveys is the Create New Survey button. Click this button to create a new TechQual+ survey.

On the left-hand side of the survey name, the Traffic Light icon and the words Open or Closed denote whether the survey is open or closed for completion by respondents. Should a red asterisk (*) appear to the right of a survey name, this indicates that all respondent identity information has been removed from the TechQual+ database as part of the post-survey cleanup process. You may initiate this cleanup process once the survey is complete. Once this process occurs, respondents may no longer complete the survey and you may no longer access the design or collect functions for the survey.

The Design icon allows you to edit the contents of a survey. Clicking this icon, which is only available when the survey is closed, directs you to the multi-tab page that allows you to edit the custom items, additional questions, and instructions for the survey.

The Preview icon allows you to preview the survey, viewing it just as a respondent completing the survey would view it.

Ratings and feedback submitted on the preview screens are not saved to the TechQual+ database.

The Collect icon allows you to edit settings such as the open/closed status of the survey, set a cut-off date and time for the survey, and manage your respondents. Using this multi-tab screen, you can add, edit, and delete respondents, invite respondents to complete the survey by email, or send an email reminder asking a respondent to complete the survey. You may also create a direct link to your survey that you can forward directly to your respondents if you do not want to upload your respondents to the TechQual+ Web site. Once the survey is complete, you may perform the post-survey cleanup process on this screen, thereby removing all identifying information about your survey respondents from the TechQual+ database.

The Analyze icon allows you to view, analyze, and download the results of a TechQual+ survey. The number to the right of the icon represents the number of respondents who have completed the survey in its entirety.

CREATING OR EDITING TECHQUAL+ SURVEYS

Click the Create New Survey button on the Survey Home page to begin the process of creating a new survey. After creating the survey, you may click the Design Survey icon to open the multi-tab page that allows you to edit the design properties for the survey (see Image 4a).

The Options tab provides the opportunity to edit the basic settings for the survey, including the survey name, a URL to forward respondents to upon completing the survey, and whether respondents should bypass the Survey Complete screen and be redirected automatically to that URL. This setting allows you to forward respondents seamlessly to a survey on your own Web site, which may be useful if you want to ask respondents types of questions that cannot be included on a TechQual+ survey. The Respondent View link shows you how this appears to individuals completing the survey.

The Core Items tab provides a list of the core TechQual+ survey items that are included with every TechQual+ survey. These items represent the individual IT service outcomes associated with the three core commitments expected by faculty, students, and staff (see the previous section, THE TECHQUAL+ CORE SURVEY INSTRUMENT).

The Custom Items tab (see Image 5a) allows you to add custom service items (or questions) that are specific to your institution. At the bottom of the page (not shown in Image 5a), this screen lists each of the custom service items that your institution has used previously. To create new service items, type them into the text box at the middle of the page and select the Add New Item button.


You may use the Check Spelling button to spell-check the text you have entered in the New Item textbox. Use the sort order drop down boxes to select the ordering of these custom service items on this TechQual+ survey; once you have set the proper sort order, click the Save Changes button. Should you wish to delete any of these custom items from your institutional catalog, select the checkboxes for the respective items and click the Delete Checked Items from Institutional Catalog button.

The top half of the Additional Questions tab (see Image 6a) shows the additional questions that are included in this survey. You may include open-ended questions, multiple answer questions, and multiple choice questions. For the multiple answer and multiple choice questions, you may include up to eight possible responses with each question. Respondents will be asked to complete these questions at the end of the TechQual+ survey. To delete an item from this survey, select the checkbox to the left of the item and click the Delete Checked button. You may select the sort order for these questions using the Sort Order drop down boxes, and you may edit a question by clicking the Edit link to its right. Once you have set the proper sort order for these questions, click the Save Sort Order button. To add a new additional question to this TechQual+ survey, select the question type in the drop-down box and then enter the question in the New Question textbox. If necessary, enter the number of responses for a multiple choice or multiple answer question, and then click the Add New Question button. As with the custom survey items, you may designate additional questions just for students, faculty, or staff. You may use the Check Spelling button prior to clicking the Add New Question button. You may also include additional questions from previous TechQual+ surveys conducted at your institution (see Image 7). You may select previous additional questions from the list on the bottom half of this screen. This listing includes every additional question that has been used on any TechQual+ survey in the past by your institution, as well as deprecated items from previous TechQual+ core surveys that you may also include. To include one of these questions, select the checkbox to the left of the question and use the Add Existing Questions to Survey button at the bottom of the page. Any new additional questions that you create just for this survey will be included on this list for future TechQual+ surveys created by your institution. The Instructions tab (see Image 7a) allows you to include institution-specific instructions for this survey; the instructions you enter in this box will be included on the front page of the survey.

COLLECTING DATA THROUGH YOUR TECHQUAL+ SURVEY

The TechQual+ system includes a variety of tools that allow you to manage and communicate with respondents whom you would like to complete your survey (see Image 8). You may access this functionality by selecting the Collect icon on the main survey screen (accessed by selecting Home on the horizontal navigation menu or Survey Home on the drop down navigation menu).


First, understand that there are two different methods for collecting responses through your survey. These settings appear on the first tab of the page that appears after clicking the Collect icon. The first method allows you to create a single link that you can email to all of your respondents or post on a Web page; individuals clicking this link will be directed to the survey for completion. The second method allows you to upload your respondents, along with attributes about them, into the TechQual+ Web site. Using this method, you may send emails to your respondents inviting them to complete the survey, and you may also send them reminders about the survey. If you choose the second option and upload your respondents to the TechQual+ Web site, understand that each respondent to the survey receives a unique and individualized URL. The use of a unique URL for each respondent allows respondents to work on their survey over multiple sessions, saving partial results and coming back to the survey later. The Collect icon leads to a multi-tab page that provides different functions on each tab. The tabs that appear will differ depending on whether you have chosen the direct link or the upload respondents method of collecting data through your survey (compare Image 9 to Image 9a). The first tab, the Collection Settings tab, allows you to choose the method of collecting data. It also allows you to edit settings that control when and how you collect data. You can set whether your survey is open or closed to respondents; when the survey is closed, respondents will receive a message to that effect when attempting to start the survey. You can also set a cutoff date and time for your survey. After this cutoff, respondents will not be allowed to complete the survey and will receive a message to that effect when attempting it. Note: you may change the collection type until your first respondent begins the survey. After that happens, you will not be able to change the collection type unless you delete all respondents from the database. At the bottom of this page (see Image 9b) there is a section that allows you to delete all respondents and all of their data from the survey. This is rarely necessary: doing so removes all of your respondents and their data from the TechQual+ database, and the action is not recoverable under any circumstance. If you have selected the direct link collection method, the Direct Link tab will be available to you (see Image 9c). You may use the link, or embed the HTML code, in an email or Web page to point respondents to your survey.


If you have selected the upload respondents method for collecting data, the Respondent List tab (see Image 9d) lists the respondents who have been uploaded for this survey, along with the status of each respondent's survey (blank, incomplete, or complete). The listing is sorted by last name, then first name, and displays up to 500 respondents per page. To view the next page of respondents, use the drop down box on the bottom right hand side of the listing. To delete respondents, select the checkbox to the left of the respondent name (or select the checkbox at the top left of the page to include all respondents on this page) and then press the Delete Respondents button. To export a list of all respondents, select the Export Respondent List to XML link at the bottom of the page. The alphabetical / random sort drop down box is designed to allow you to select respondents for drawings or other types of incentive awards for completing the survey. For example, if you wanted to randomly select 20 individuals who completed the survey, set the survey status drop down box to complete and then set the sort to random. A randomly sorted list of individuals who completed the survey will appear, and you can simply take the first 20 individuals from the top of the list. By hovering your mouse pointer over the name of each respondent in this list, a tool tip will appear that lists each of the attributes that have been uploaded for that respondent. The Add Respondents tab (see Image 10) allows you to upload a list of your respondents en masse. Typically, this step is performed by taking a batch export of information from a campus administrative information system, formatting it as expected by the TechQual+ site, and then entering it on this page. When uploading respondents in batches, use the following format, with one respondent on each line:

Last name, first name, email address, attribute 1, attribute 2, attribute 3, …

Simply cut and paste your respondent list into the textbox on this page and press Perform Batch Upload at the bottom of the page. You will be redirected to a page that reflects the status of your upload and updates itself automatically every 10 seconds. In cases where a respondent has already been uploaded for this survey, based on the email address, the Web site will simply update the respondent's information (first name, last name, attribute 1, attribute 2, attribute 3, ...) in the database.
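For illustration, here is a minimal Python sketch that converts a roster exported from an administrative system into the comma-delimited batch format shown above. The file and column names are hypothetical; adapt them to your own export.

    # Hypothetical example: build the TechQual+ batch upload format from an
    # exported roster. File and column names are assumptions, not requirements.
    import csv

    with open("roster.csv", newline="") as src, open("batch.txt", "w") as out:
        for row in csv.DictReader(src):
            out.write(",".join([
                row["last_name"],
                row["first_name"],
                row["email"],
                row["role"],     # attribute 1 (e.g., faculty, student, or staff)
                row["campus"],   # attribute 2
                row["college"],  # attribute 3
            ]) + "\n")

The contents of batch.txt can then be pasted into the upload textbox as described above.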

You may include up to 10 identifying attributes, such as role (faculty, student, or staff), campus, college or degree, gender, etc., for each respondent.


You may then use these attributes to filter the results of the survey or to select a partial list of respondents for email communication.

RANDOM SAMPLING OF RESPONDENTS

It is highly recommended that participating institutions select respondents for a TechQual+ survey randomly from their entire population. At Pepperdine University, we typically take random samples of 25% of all faculty, students, and staff each year for the TechQual+ survey. Your institutional research department can help you choose the appropriate percentage of respondents for random selection. Selecting respondents randomly is a simple process that typically requires you to: a) load your entire population of respondents into some sort of table-based database; b) assign each respondent a random number; c) sort the respondents from lowest to highest random number; and d) select the appropriate number of respondents from the top of the sorted table. To simplify this process, the TechQual+ project provides a set of tools that you can use for random selection from your entire population of respondents. You may download the utilities from this URL:

https://s3.amazonaws.com/media-techqual/developer.zip

This download includes the following: a) a Microsoft Access database; b) a program that may be executed for random sampling from the respondents loaded into the database; and c) the source code (Microsoft C#) for the program, which you may freely modify. To use this utility, extract the entire population list and attributes from your enterprise information systems and import these data into the Microsoft Access database. Next, run the included program and follow the instructions. The program will produce a file (export.txt) of randomly selected respondents drawn from the entire population imported into the Microsoft Access database. This file will be created alongside the Microsoft Access database on your computer's file system. Simply open this text file, select all of its contents, and cut and paste them into the textbox on the Batch Load tab. The program will also generate a file (population-analysis.txt) that contains population statistics on your total population (N) and the population size for each attribute designated for each respondent. Use the information in this file when preparing a population analysis once your survey has been completed (see the later section on viewing and analyzing survey data).
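If you prefer to script the selection yourself rather than use the Access toolkit, steps (a) through (d) above reduce to a few lines. Here is a minimal Python sketch; the file names and the 25% fraction are assumptions you should adapt:

    # Sketch of random selection: shuffling the full population is equivalent
    # to assigning each respondent a random number and sorting on it.
    import random

    with open("population.txt") as f:   # one respondent per line, batch format
        population = [line.strip() for line in f if line.strip()]

    random.shuffle(population)
    sample = population[:round(len(population) * 0.25)]   # e.g., a 25% sample

    with open("export.txt", "w") as out:
        out.write("\n".join(sample))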


COMMUNICATING WITH RESPONDENTS

Once respondents have been loaded into the TechQual+ database for the survey, you are ready to invite them to complete the survey. To do so, you must use the email communication functions built into the TechQual+ Web site. When doing so, you should whitelist the following Sendgrid.com email servers to ensure that TechQual+ emails are not caught in your spam filter:

208.117.48.85
74.63.236.67
74.63.236.234
74.63.236.217
74.63.235.40

Select the Send Emails tab on the respondents screen to bring up the email message page. Use the From, Email Address, Reply-To, and Subject fields as you would with any other email message (see Image 11). Use the Code Snippet drop down box to select from suggested email templates, including messages that have been used for other surveys conducted at your institution. Special notice should be given to two special text blocks that can be included in your email message. The block [AssessmentUrl] designates the place in the message where the respondent-specific survey URL will be inserted. Optionally, you may use the blocks [FirstName] and [LastName] to insert the respondent's first or last name. Using these tags allows you to personalize the email message for each respondent. The Delivers After option allows you to schedule the delivery of the message at a predetermined point in the future. The Send Test Message link will send a copy of the message to your own email address. The Message Queue link lists the email messages that are scheduled for future delivery; you may also delete future messages from the delivery queue on this screen. By default, all respondents will be selected for the email message. You may choose to filter the respondent list for the email message based on the status of the respondent's survey (blank, incomplete, or complete) or based upon the custom attributes uploaded with each respondent. To select a smaller subset of respondents for the message, press the To button at the top of the message. Selecting the To button will bring up the Choose Criteria for Selecting Email Recipients page (see Image 12). This screen contains a query builder that can be used to build the list of recipients for this email message. As you add criteria to the filter, the wording on the bottom of the page will automatically update to indicate the number of respondents who meet the selected criteria. Use the drop down boxes at the top of the page to select filtering criteria and press the Add Criteria button to add the selection to the current filter. Press the Clear button at the bottom of the page to clear the current filter; this will return the page to the default criteria (all respondents selected).
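Conceptually, the [FirstName], [LastName], and [AssessmentUrl] blocks described above behave like a simple per-recipient text substitution. The sketch below illustrates the idea only; it is not the Web site's actual implementation, and the URL shown is hypothetical:

    # Conceptual illustration of per-recipient placeholder substitution
    # (not the TechQual+ site's actual mailer code).
    template = ("Dear [FirstName],\n\n"
                "Please share your feedback on campus technology services:\n"
                "[AssessmentUrl]\n")

    def personalize(template: str, first: str, last: str, url: str) -> str:
        return (template.replace("[FirstName]", first)
                        .replace("[LastName]", last)
                        .replace("[AssessmentUrl]", url))

    print(personalize(template, "Jane", "Doe", "https://example.edu/survey/abc123"))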


Once you have selected the appropriate criteria for filtering respondents for this email message, press the Save button to return to the Send Emails tab. Note: messages intended for immediate delivery are delayed by five minutes, which allows for the removal of the message from the queue if the Send button has been pressed accidentally. Press the Send button at the top of the message to send the email message to the selected respondents. Upon sending the message you will be redirected to the Email History tab (see Image 12a). This tab shows you the messages pending future delivery for this survey, which may be deleted by clicking the X link to the right of the message. The bottom of this screen shows all of the messages previously delivered to respondents for this survey. The final tab on the Respondents page is the Post Survey Cleanup tab (see Image 12b). Use this page to remove the first name, last name, and email address of each of your respondents from the TechQual+ database. This does not affect the custom attributes loaded with each respondent, nor does it affect the results submitted by the respondents. By performing this step after your survey is finished, you can guarantee the anonymity of your respondents' feedback by removing their identifying information from the database. Once this process is run, respondents may no longer complete the survey.

VIEWING, ANALYZING, AND DOWNLOADING SURVEY RESULTS

The Analyze icon on the survey home screen will give you access to the results of your TechQual+ survey. This page has tabs horizontally across the top that provide different views of the survey results (see Image 13). The first tab, the Notes tab, provides information on how to interpret the survey results, including information on the results contained on each of the tabs on the page. The Population tab contains two sub-pages (see Image 14). The second sub-page allows you to enter the total size of the population selected for this survey, along with the corresponding population information for each of the attribute values loaded with your respondents. The file population-analysis.txt, created if you used the TechQual+ utility for randomly sampling from your population, contains these data. By entering the total size of your population, friendly names for each of your attributes, and the corresponding population sizes for each attribute, the TechQual+ system can generate a detailed report that shows, on an attribute-by-attribute basis, the size of your entire population, the number of respondents matching that attribute, their percentage as an overall part of your population, the number of surveys attempted and completed matching that attribute, and the response rate (as a percentage) (see Image 16).
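The figures in this report are simple ratios. For example, with hypothetical counts for one attribute value:

    # Hypothetical population-analysis arithmetic for one attribute value.
    population_total = 4800    # N, the entire campus population
    group_population = 1200    # size of this group within the population
    group_respondents = 300    # uploaded respondents carrying this attribute
    group_completed = 180      # completed surveys within this group

    print(f"Share of population: {group_population / population_total:.1%}")  # 25.0%
    print(f"Response rate: {group_completed / group_respondents:.1%}")        # 60.0%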


For the self-reported attributes of University Role, Gender, and Age Group, only the number of surveys attempted and completed and the completion rate are shown. Use the Change Criteria link on the top right hand side of the page to select attributes for filtering the results of the survey. You may select no attributes, one attribute, or any combination of attributes for your filter (see Image 15). You may choose from among the self-reported attributes of University Role, Gender, or Age Group, or the attributes you uploaded with your respondents. When you click this link, a pop-up menu appears that allows you to set the filter for your analysis. Once you have created your filter, press the Apply Filter button at the bottom of the page to apply it to the results set. Just above the Apply Filter and Clear Filter buttons are two checkboxes. The first checkbox, labeled Include Incomplete Surveys, allows you to include individual items completed by respondents even when the respondent failed to complete the survey in its entirety. For example, perhaps a respondent completed the first five items before deciding to quit the survey. By default, those five answers would not be included in your results because that particular survey was not completed in its entirety; when the Include Incomplete Surveys checkbox is checked, such cases will be included in your results. The second checkbox, labeled Exclude outliers from analysis, statistically adjusts your results by throwing out cases, on an item-by-item basis, where the Adequacy Gap Score is either so high or so low that it has the potential to bias the results of the survey. When checked, the results will exclude cases, on an item-by-item basis, where the Adequacy Gap Score is either greater than or less than two standard deviations from the mean Adequacy Gap Score for all cases in the results set. This has the practical effect of removing the top 2.24% and bottom 2.24% of Adequacy Gap Scores from your results.
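As a sketch of this exclusion rule (assuming the TechQual+ definition of the Adequacy Gap Score as the perceived performance rating minus the minimum expectations rating):

    # Two-standard-deviation outlier exclusion, applied item by item.
    from statistics import mean, stdev

    def exclude_outliers(gaps):
        """Drop Adequacy Gap Scores more than 2 standard deviations from the mean."""
        m, s = mean(gaps), stdev(gaps)
        return [g for g in gaps if m - 2 * s <= g <= m + 2 * s]

    gaps = [1, 0, -2, 3, -1, 0, 2, -7, 1]   # hypothetical gap scores for one item
    print(exclude_outliers(gaps))           # the -7 case falls outside and is removed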

The Zones of Tolerance tab provides a graphical view of your survey data based on the zones of tolerance concept (see Image 17). The Zones of Tolerance chart displays the range between the minimum and desired expectations for each IT service outcome as a grey bar, and the Adequacy Gap (perceived performance relative to minimum expectations) as an orange bar. This view shows the relative priority of each IT service outcome and allows rapid understanding of the performance of each service relative to its respective Zone of Tolerance. Each of the TechQual+ core commitments has a results table just under the Zones of Tolerance graph (see Image 18). IT service outcomes with a negative Adequacy Gap Score are shaded in red in this data table; IT service outcomes with a positive Superiority Gap Score are shaded in green. For each IT service outcome, this table shows the minimum rating (Min), the desired rating (Des), the perceived performance rating (Per), the Adequacy Gap Score (Adeq), the Superiority Gap Score (Supr), and the number of respondents who completed this item on the survey (n*). The table also shows the Mean (Mean) and standard deviation (Dev) for each of these variables.
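Because each respondent rates minimum expectations, desired expectations, and perceived performance on the same 1-to-9 scale, the two gap scores reduce to simple differences. A minimal sketch with hypothetical ratings:

    # Gap scores for one IT service outcome from per-respondent rating triples.
    from statistics import mean

    ratings = [(5, 8, 6), (4, 7, 3), (6, 9, 7)]   # (minimum, desired, perceived)

    adequacy = mean(p - mn for mn, d, p in ratings)     # Perceived minus Minimum
    superiority = mean(p - d for mn, d, p in ratings)   # Perceived minus Desired
    print(adequacy, superiority)   # negative values fall below the expectation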


The Radar Chart tab shows the same data graphed on a radar chart (see Image 19). The radar chart graphs the X axis (the 1 to 9 range for each IT service outcome) from the center of the graph to the outside edge; the center of the radar graph is 0 and the outermost edge is 10. Moving clockwise around the graph, each numbered spoke represents an IT service outcome on the survey. The first four IT service outcomes (Nos. 1-4) represent the Connectivity and Access core commitment, the next four (Nos. 5-8) represent the Technology and Collaboration Services core commitment, and the next five (Nos. 9-13) represent the Support and Training core commitment. IT service outcomes beginning at number 14 represent institution-specific items added to the survey. The radar graph uses colors to represent the relative service performance for each IT service outcome. Red represents performance that is below minimum expectations (a negative Adequacy Gap score), while blue represents performance that exceeds minimum expectations (a positive Adequacy Gap score). Yellow represents performance that is below desired expectations (a negative Superiority Gap score), and green represents performance that exceeds desired expectations (a positive Superiority Gap score). The goal of the radar chart, similar to that of the Zones of Tolerance chart, is to provide powerful inferences, at a glance, regarding the strength of technology services at your institution. Below the radar chart, just as with the Zones of Tolerance tab, you will find a data table that shows the quantitative results for each IT service outcome; these tables are grouped by TechQual+ core commitment.

The Additional Questions tab contains the open-ended, multiple choice, and multiple answer questions and responses that you added to your survey. To review the answers to each question, select the question from the drop down box. At the end of each response, inside [ ] marks, you will find a unique identifier for each respondent, beginning with a #, that allows you to track qualitative feedback by respondent if using a qualitative analysis tool such as Atlas TI.

The Suggestions tab works in similar fashion. Respondents are asked to provide open-ended suggestions when they enter a perceived performance rating that is below their minimum expectations rating. When this occurs, it reflects an evaluation by the respondent that actual performance on this IT service outcome does not meet their minimum expectations. Identical to the Additional Questions list, a unique respondent identifier, beginning with a #, appears between [ ] marks at the end of each suggestion, allowing you to track qualitative feedback by respondent.

Responses to open-ended questions and suggestions may be visualized by generating a Wordle chart. To generate a Wordle, click the Visualize Responses using Wordle link in the upper left hand side of the page (see Image 20). The Wordle visualization will appear in a new browser window (see Image 21). Just under the Options tab are additional links. The first link, Export Results to XML, allows you to download the current results set (based on whatever filter is set on the Criteria tab) in XML format. The second link, Export Entire Dataset, allows you to export the raw data collected through your survey, which is useful should you desire to perform your own independent analysis of the survey data. Note: to export the entire survey dataset, your survey must be closed and you must have performed the post-survey cleanup processing at the conclusion of your survey. At the bottom of the Options tab, the Generate Report section allows you to generate a survey-specific PDF document that contains the results of your analysis. By default, the first result set in the PDF report is for all respondents. You may also choose to include additional results sets in your PDF report by selecting the filters that have been used previously for viewing the results of the survey. Click the Add New Chapter link to create a new filter and include it as a chapter in the PDF report. You may also specify the sort order of these additional results sets in the PDF report by using the drop down lists in the right hand column next to the query. To generate your PDF report, select the Create Report button at the bottom of the page. Your PDF report will be generated over the next several minutes and emailed to you.

THE TECHQUAL+ PEER DATABASE

Clicking the Compare icon on the survey home page brings the end user to the peer database page. In order to compare the results of your survey to similar results from peer institutions, your survey must be completed by a minimum of 50 respondents. Results contained in the peer database are aggregated by Carnegie Foundation Basic Classification. To view data in the peer database, select a Peer Group for the comparison (when on the peer comparison tab; otherwise, select an IT service outcome when on the item-by-item comparison tab) from the drop down box (see Image 23). When comparing results by peer group, side-by-side radar charts will be generated that graphically illustrate the results of your survey versus the results of the selected peer group. When comparing results item-by-item, by selecting a question in the Item drop down box, the peer database scores for that IT service outcome will be loaded onto the page.


You may also create filters by University Role (faculty, student, staff), gender, and age group (see Image 22). These filters are based on self-reported information provided by respondents when beginning the survey. The same filtering is available on the peer group comparison tab. On the item-by-item comparison tab, a Zone of Tolerance graph will be generated for the comparison. The results set table resembles the results table contained on the Results page. In the left hand column are the results for this item for the end user's institution, along with results from different types of institutions according to the Carnegie Basic Classification. Results for the end user's institution are also included in the basic classification data corresponding to their institution's classification if you selected the Include Home Institution in Peer Results check box. Comparison groups on the item-by-item page (or IT service outcomes on the by comparison group tab) with a negative Adequacy Gap Score are shaded in red in this data table; comparison groups with a positive Superiority Gap Score are shaded in green. For each comparison group, this table shows the minimum rating (Min), the desired rating (Des), the perceived performance rating (Per), the Adequacy Gap Score (Adeq), the Superiority Gap Score (Supr), and the number of respondents who completed this item on the survey (n*). The table also shows the Mean (Mean) and standard deviation (Dev) for each of these variables. In order to provide for meaningful comparisons, the 33 different Carnegie Basic Classifications have been collapsed into 13 TechQual+ peer database groups. For example, the 14 different classifications for Associate's Level Colleges have been collapsed into one, which simplifies the peer database comparisons. Below is a mapping of each Carnegie Basic Classification to its TechQual+ Peer Database Grouping.

Carnegie Basic Classification -> TechQual+ Peer Database Grouping

Assoc/Privfp4: Associate's--Private For-Profit 4-Year Primarily Associate's -> Associate's Level Colleges
Assoc/Privfp: Associate's--Private For-Profit -> Associate's Level Colleges
Assoc/Privnfp4: Associate's--Private Not-For-Profit 4-Year Primarily Associate's -> Associate's Level Colleges
Assoc/Privnfp: Associate's--Private Not-For-Profit -> Associate's Level Colleges
Assoc/Pub-R-L: Associate's--Public Rural-Serving Large -> Associate's Level Colleges
Assoc/Pub-R-M: Associate's--Public Rural-Serving Medium -> Associate's Level Colleges
Assoc/Pub-R-S: Associate's--Public Rural-Serving Small -> Associate's Level Colleges
Assoc/Pub-S-Mc: Associate's--Public Suburban-Serving Multicampus -> Associate's Level Colleges
Assoc/Pub-S-Sc: Associate's--Public Suburban-Serving Single Campus -> Associate's Level Colleges
Assoc/Pub-Spec: Associate's--Public Special Use -> Associate's Level Colleges
Assoc/Pub-U-Mc: Associate's--Public Urban-Serving Multicampus -> Associate's Level Colleges
Assoc/Pub-U-Sc: Associate's--Public Urban-Serving Single Campus -> Associate's Level Colleges
Assoc/Pub2in4: Associate's--Public 2-Year Colleges Under 4-Year Universities -> Associate's Level Colleges
Assoc/Pub4: Associate's--Public 4-Year Primarily Associate's -> Associate's Level Colleges
Bac/A&S: Baccalaureate Colleges--Arts & Sciences -> Baccalaureate Colleges
Bac/Assoc: Baccalaureate/Associate's Colleges -> Baccalaureate Colleges
Bac/Diverse: Baccalaureate Colleges--Diverse Fields -> Baccalaureate Colleges
Dru: Doctoral/Research Universities -> Research Universities (Low)
Ru/H: Research Universities (High Research Activity) -> Research Universities (Medium)
Ru/Vh: Research Universities (Very High Research Activity) -> Research Universities (Very High)
Master's L: Master's Colleges And Universities (Larger Programs) -> Master's Colleges and Universities
Master's M: Master's Colleges And Universities (Medium Programs) -> Master's Colleges and Universities
Master's S: Master's Colleges And Universities (Smaller Programs) -> Master's Colleges and Universities
Spec/Arts: Special Focus Institutions--Schools Of Art, Music, And Design -> Special Focus Institutions (Arts)
Spec/Bus: Special Focus Institutions--Schools Of Business And Management -> Special Focus Institutions (Business)
Spec/Eng: Special Focus Institutions--Schools Of Engineering -> Special Focus Institutions (Engineering)
Spec/Faith: Special Focus Institutions--Theological Seminaries, Bible Colleges, And Other Faith-Related Institutions -> Special Focus Institutions (Other)
Spec/Health: Special Focus Institutions--Other Health Professions Schools -> Special Focus Institutions (Medical)
Spec/Law: Special Focus Institutions--Schools Of Law -> Special Focus Institutions (Law)
Spec/Med: Special Focus Institutions--Medical Schools And Medical Centers -> Special Focus Institutions (Medical)
Spec/Other: Special Focus Institutions--Other Special-Focus Institutions -> Special Focus Institutions (Other)
Spec/Tech: Special Focus Institutions--Other Technology-Related Schools -> Special Focus Institutions (Other)
Tribal: Tribal Colleges -> Baccalaureate Colleges


PART III: CONDUCTING TECHQUAL+ SURVEYS AND GETTING THE MOST OUT OF THE TECHQUAL+ PROJECT

After becoming familiar with the TechQual+ approach and philosophy, and after becoming acquainted with the Web site tools, the question becomes: "How do we get started?" In this section of the protocol guide, you will find tips on how to begin using these tools and how to get the most out of your use of the TechQual+ Project.

BECOME FAMILIAR WITH THE TECHQUAL+ TOOLS

Participating institutions should become familiar with the TechQual+ Web site tools. This will allow you to become more comfortable with creating surveys, communicating with respondents, and analyzing the survey results. A great way to start is to conduct a test survey with IT staff within your organization. Create a TechQual+ survey and invite all of your IT staff to complete it, with the instruction that they should remember that they are also consumers of technology services, in addition to being the individuals supporting them. It may also be insightful to compare IT staff perceptions of technology with the perceptions of the end user community collected in a follow-up survey.

PLAN AN ANNUAL SURVEY

The key to effective use of the TechQual+ survey and Web site tools, which also happens to correlate highly with the effective delivery and use of technology services, is an annual recurring cycle of assessment, planning, prioritization, and accountability. For example, one approach would be to carry out a random sampling of 25% of your overall population of faculty, students, and staff on an annual basis. Conducting a TechQual+ survey in this manner would allow you to assess year-over-year performance and to hold your organization more accountable.

OBTAIN INSTITUTIONAL REVIEW BOARD (IRB) APPROVAL

Institutions conducting TechQual+ surveys are strongly encouraged to obtain prior approval from the Institutional Review Board (IRB) at their institution before administering a TechQual+ survey. The nature of the TechQual+ survey poses minimal risk to respondents, and you should be able to obtain an exemption from full IRB review for your use of the TechQual+ instrument.

CAREFULLY CONSIDER YOUR RESPONDENT ATTRIBUTE LIST

You have the ability to filter your respondent list for email communications and to disaggregate your survey results based upon the attributes that you upload with your respondents. When planning a TechQual+ survey, carefully consider the attributes that you upload with your respondents; the importance of this step is often overlooked.


Common attributes proven to be helpful in distinguishing groups of end users with different expectations and needs include the following:

• Role (faculty, student, or staff)
• Gender (male or female)
• Age (best grouped <20, 20-29, 30-39, 40-49, 50-59, >60; see the sketch below)
• Campus location
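If your administrative system exports raw ages, a small helper can bucket them into the groupings suggested above before they are included as a respondent attribute. This is a hypothetical sketch, not part of the TechQual+ toolkit:

    # Hypothetical helper: bucket a raw age into the guide's suggested groupings.
    def age_group(age: int) -> str:
        if age < 20:
            return "<20"
        if age >= 60:
            return ">60"   # the guide's final bracket; covers ages 60 and above
        lo = (age // 10) * 10
        return f"{lo}-{lo + 9}"

    print(age_group(34))   # "30-39"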

You should also include attributes that signify the organizational affiliations of your respondents within the institution. However, be careful not to divide your respondents unnecessarily. For example, past TechQual+ surveys have indicated differing expectations between students in a college of liberal arts and those in a college of business; however, less difference is often evident when comparing expectations among different majors within the same college. Attributes that may be helpful in associating your respondents with their organizational unit include the following:

• College or school
• Department (in larger institutions)

When combined with the examples discussed earlier, inclusion of these additional attributes allows you to drill down on the results and gain powerful inferences. For example, at Pepperdine University, researchers are able to gain inferences about the perceptions of students in the school of business who take courses predominantly at the West Los Angeles campus. These views can be contrasted with the views of students in the undergraduate college at the main campus in Malibu, or with the views of business students in the full-time residential program at the Malibu campus. By contrasting views among the different constituencies at an institution, you are able to get the most out of your TechQual+ survey results. Larger, more research-focused institutions often have large numbers of decentralized IT organizations. While these organizations may not report directly to the central IT organization, the IT leadership within the institution is often expected to include them in its assessment, planning, prioritization, and policy processes. Institutions with decentralized technology units are encouraged to include additional attributes for their respondents that identify the decentralized IT units that provide services to them. For example, one might assume that a school of engineering would have a large IT unit that predominantly serves its own faculty, students, and staff. Including an attribute that identifies this affiliation provides the ability to filter the results and gain insights on what these constituents think about technology services within the college. The same could be accomplished for a decentralized administrative IT organization that serves, perhaps, the division of finance at the institution. In this case, combining the quantitative results with an analysis of the suggestions provided by respondents makes it possible to gain some very powerful inferences about constituents' perceptions of technology services within these decentralized units.


INCLUDE INSTITUTION-SPECIFIC IT SERVICE OUTCOMES

The TechQual+ survey design process allows you to create custom institution-specific IT service outcomes (or survey items) that will be presented to respondents for scoring, using the TechQual+ 1 to 9 scale, for minimum expectations, desired expectations, and perceived performance. Institutions creating custom IT service outcomes and including them on their TechQual+ survey have the opportunity to gain detailed insights and suggestions regarding their institution-specific services. Doing so provides you with the opportunity to gain evidence of the impact of your initiatives across your institution in a way that allows you to disaggregate the results and to understand the differing perceptions and views of different campus constituents.

PREPARE YOUR ADDITIONAL QUESTIONS

As with the custom institution-specific IT service outcomes, you have the ability to include unique additional questions on your TechQual+ survey. These additional questions may be of the open-ended, multiple choice, or multiple answer variety. Questions such as the following allow you to gather feedback that can be analyzed and that will assist you with your assessment and planning efforts:

• Could you identify three technology services at the institution that you find especially helpful?

• Could you identify three technology services at the institution that often perform poorly? What steps should be taken to strengthen these services?

The most powerful feedback arising from a TechQual+ survey often comes from the comments elicited through open-ended questions. Participating institutions are encouraged to consider carefully which questions are appropriate for their institution and to include these in their surveys. Using the Wordle visualizations, you can also gain powerful inferences from these data at a glance.

AVOID CREATING AN UNNECESSARILY COMPLEX SURVEY

The TechQual+ core survey includes 13 IT service outcomes. Participating institutions are encouraged to limit the number of custom institution-specific IT service outcomes and additional questions they include, to avoid negatively impacting the response rate. Best practices derived from past TechQual+ surveys indicate that custom institution-specific IT service outcomes should be limited to no more than six items and additional questions to no more than six questions.


PREPARING YOUR RESPONDENT LISTS

To date, best practices suggest that it is better to create one annual TechQual+ survey for the entire institution than to create separate surveys for faculty, staff, or students. This allows you to compare differing views across these very different constituencies within the same results set when analyzing the survey. In preparing your entire population list, the best source of respondent information remains the central administrative information system at your institution. With rare exception, these systems will contain all of the required fields (first name, last name, email address) and the identifying attributes that you desire for inclusion in your respondent list. This information should be extracted from these systems into a comma-delimited text file in the format expected by the TechQual+ respondent import process (discussed in Part II of this guide).

USE RANDOM SAMPLING TO SELECT YOUR RESPONDENTS

It is critical that participating institutions use random sampling when selecting respondents from their overall population for a TechQual+ survey. The actual number of respondents to select depends on the overall size of your population; best practices dictate that you randomly select 25% of your entire target population. Your office of institutional research can provide additional assistance in determining an appropriate size for a random sample of your population. The best approach for random sampling is to use the developer toolkit provided at https://s3.amazonaws.com/media-techqual/developer.zip. This download includes a Microsoft Access database and an executable program that will select respondents using a random process and export the list to a text file that can be used to import the selected respondents into the TechQual+ Web site. See Part II of this protocol guide for more information on the use of this toolkit.

COMMUNICATING WITH RESPONDENTS ABOUT YOUR SURVEY

Best practices suggest that highly personalized email communications create the best opportunity for obtaining a high response rate to your Web survey. Sample communications shown to have a positive impact on the response rate include the following:

Invitation Message:

Dear [FirstName],

As the individual responsible for technology at Pepperdine University, I am very interested in hearing your thoughts and opinions regarding the quality and effectiveness of technology services at our great university.


Your feedback is very important – it guides our planning, staffing, and spending activities for the coming year. Your participation in this survey provides us with critical and vital information regarding the technology services that you depend on each day.

You have been randomly selected, along with approximately 2000 other faculty, students, and staff, to participate in a web-based survey where you can express your opinions regarding the quality of technology services at Pepperdine University. The survey will take approximately twenty minutes to complete. The information you submit will remain anonymous and confidential and will be used to guide our planning efforts for the next year.

[AssessmentUrl]

Those of you who complete the survey in its entirety will be entered into a drawing to win one of twenty (20) $50 gift certificates from Amazon.com.

I appreciate your taking the time to complete this assessment. Please email me at [email protected] or call me at 310-506-4501 if you have any questions or concerns.

Best wishes,
Timothy Chester
Vice Provost for Academic Administration and Chief Information Officer
Pepperdine University

Reminder Message:

Dear [FirstName],

I am writing to remind you of the need to complete the survey regarding Information Technology services at Pepperdine University. We need your feedback by April 16th.

[AssessmentUrl]

Your feedback is very important and we need your help. Last year, you told us about the need to expand wireless access, to provide help desk services on a 24/7 basis, and to reorganize Pepperdine's Wavenet portal – projects that have been completed or are now underway. Your completion of this survey provides us with critical and vital information regarding the Information Technology services that should be provided or improved – services that you depend on each day.

We know how busy you are at this time of year. The survey is designed to take approximately 20 minutes of your time. We would be especially grateful if you would make the time to complete this important survey. Our success each year depends on hearing from you.

Those of you who complete the survey in its entirety will be entered into a drawing to win one of twenty (20) $50 gift certificates from Amazon.com.

I appreciate your taking the time to complete this survey. Please reply to me or call me at 310-506-4501 if you have any questions or concerns. Thanks so much for your participation.

Sincerely,
Timothy M. Chester
Vice Provost and Chief Information Officer
Pepperdine University

Several draft messages that you can adopt and modify for your communications can be obtained through the Code Snippet drop down box on the Send Emails tab of the respondents screen (see the previous section of this guide). Other important best practices for communicating with respondents – practices that have been shown to result in increased response rates – include the following:

• A recognized leader at the institution should sign the message, and the message should come from their email account. A different Reply-To email address can be used to ensure that the sender is not bombarded by reply messages. Messages sent without a signature, or from a bulk email account, are often ignored and have been shown to result in lower response rates.

• Send an announcement message to all respondents in advance of the survey that informs them they have been selected and can expect an invitation to the survey shortly. This email should not include the [AssessmentUrl] placeholder, because you do not want to include the survey link in this communication. A recognized leader at your institution should send this message.

• The survey should be open for a minimum of four weeks, with a weekly reminder message sent to respondents who have not completed the survey.

• Near the end of the four-week survey period, one final reminder should be sent to individuals who have an incomplete survey. These are respondents who began the survey but never completed it. One final reminder to just these respondents can often help to increase the response rate by a few final percentage points.

Participating institutions are highly encouraged to use the filtering capabilities to send targeted email communications to their constituents. For example, it is best to target reminder communications just to those respondents who have not completed the survey. These types of reminders should never be sent to individuals who have completed the survey; such excess communications often frustrate individuals who have already completed the survey and negatively impact their perceptions of the survey process. Even with a great communications strategy, experience has shown that it is unrealistic to expect response rates much greater than 20% for Web surveys like TechQual+.

ANALYZE THE RESULTS AND CREATE AN AGENDA FOR ACTION

Once the survey period has ended, it is time to begin analyzing the results of your survey. This is best accomplished by using both the quantitative and qualitative results. The quantitative scores, primarily the Adequacy Gap Score and the Superiority Gap Score, will tell you how your service performance maps to constituent expectations. Once this understanding is reached, it is time to turn to the respondent suggestions. This freeform feedback is vital for converting Adequacy Gap Scores into steps to improve services. For example, using filtering, you can learn what the students in the school of business at your main campus think about wireless network services. The Adequacy Gap Scores will tell you how your services match up against the expectations of these students, and their suggestions will tell you specifically how to improve the service in their eyes.


Institutions that get the most out of the TechQual+ survey process are those whose IT organizations are collectively involved in the dissemination and analysis of the results and in the creation of an agenda in response to them. Results should be shared broadly and deeply throughout the IT organization, and individual IT staff should be challenged to comprehend the data and turn it into an agenda for change. This process should be inclusive and recurring, as a regular part of the IT organization's assessment, planning, prioritization, and accountability processes.

DON'T HESITATE TO REQUEST ASSISTANCE

The TechQual+ principal investigator and others are prepared to assist institutions as they think about the opportunities discussed in this protocol guide. Should you have questions, email the TechQual+ principal investigator at [email protected].


APPENDIX

Image 1 Join the TechQual+ Project Screen

Image 2 Account Setup Screen


Image 3 Email Notification Settings  


Image 4 Upper Right Navigation Links

Image 4a Survey Design Screen


Image 5 Main Drop Down Navigation Menu  


Image 5a Custom Items Screen


Image 6 Institutional Surveys Screen


Image 6a Additional Questions Tab


Image 7 Additional Question Library  


Image 7a Survey Instructions Screen  


Image 8 Survey Collection Options  


Image 9 Respondents Tab Listing (Upload Respondents Selected)  


Image 9a Respondents Tab Listing (Direct Link Selected)


Image 9b Delete Respondents Section (Collection Settings tab)

Image 9c Direct Link Tab


Image 9d Respondent List Tab


Image 10 Add Respondents Tab


Image 11 Send Emails Tab


Image 12 Choose Criteria for Selecting Email Recipients Page

Image 12a Email History Tab


Image 12b Post-survey Cleanup Tab


Image 13 Tabs on the Survey Results / Analyze Page


Image 14 Population Tab on the Results Page


Image 15 Change Criteria Popup


Image 16 Respondent Analysis Data on Population Tab (Results Page)  


Image 17 Zones of Tolerance View of Survey Results


Image 18 Survey Results Data Table


Image 19 Radar Chart of Survey Results


Image 20 Additional Questions Drop Down List


Image 21 Wordle Visualization of Suggestions


Image 22 Peer Database Comparison Filter


Image 23 TechQual+ Peer Database Page