
Instructional effects of a performance support system designed to guide preservice teachers in developing technology integration strategies

Faisal Kalota and Wei-Chen Hung

Faisal Kalota is a member of the Information Technology Strategy and Planning team in University Technology Services at Northeastern Illinois University. His research interests include performance support systems, instructional design, project management and computer programming. Wei-Chen Hung is an associate professor in the Department of Educational Technology, Research and Assessment at Northern Illinois University. His research interests include problem-based learning, human computer interaction and performance support systems. Address for correspondence: Dr Faisal Kalota, Member, Information Technology Strategy and Planning team, University Technology Services, Northeastern Illinois University, Chicago, IL 60625, USA. Email: [email protected]

Abstract
The purpose of this formative evaluation was to investigate the experiences of preservice teachers utilizing performance support system (PSS) technology to develop knowledge related to classroom technology integration. A PSS provides end users just-in-time support to perform various tasks. Because teachers have time constraints, a PSS can be used to support them with classroom technology integration. Both quantitative and qualitative data were used to answer the research questions. Qualitative data included focus group interviews and a perception questionnaire; these data were used to identify factors related to the participants' experiences. In addition, a pre- and post-knowledge test was administered to the preservice teachers (n = 28) to study the potential learning impact of a PSS for technology integration. Results showed that, although their learning improvement was not statistically significant, the participants reacted to PSS use positively, with caution. Based on the results, it is recommended that the PSS environment be updated based on feedback from the participants, and that additional long-term studies be conducted to validate the current findings.

Introduction
As the use of technology in the educational environment increases, many teacher-education programs in universities have developed courses to teach preservice teachers how to integrate technology to enhance teaching and learning. A national educational-technology-usage study conducted by Gray, Thomas and Lewis (2010) reported that technology integration has emerged as an essential topic in teacher education. There is a growing need to provide preservice teachers with various learning opportunities to integrate technology in their classrooms. As reported by Koehler and Mishra (2005), it is important for teachers not only to have technology skills but also to understand the pedagogical role of technology in education. If teachers are well prepared to integrate technology properly in their classrooms, then they are in a better position to promote learning and empower their students to use technology.

Preparing preservice teachers to integrate technology can be complex (Brush et al, 2003; Oberlander & Talbert-Johnson, 2007; Zhao, Frank & Ellefson, 2006). Some preservice teachers do not have a technology background (Bannister & Ross, 2005), so they may not be proficient in technology; others resist using it (Lin, 2008). Teachers, in-service and preservice, are generally the driving factor for implementing technology successfully into the curriculum (Al-Bataineh & Brooks, 2003; Grove, Strudler & Odell, 2004; ISTE, 2002). Often, the teachers who are responsible for implementing technology into their curriculum are not technology savvy (Favero & Hinson, 2007), and they are left on their own to develop workable strategies (Groves & Zemel, 2000). Such teachers benefit from support related to technology integration (Espey, 2000), because they need to address challenges to integrating technology effectively into their instruction, including lack of technology training (Koc & Bakir, 2010), inexperience with the new technology (Al-Bataineh & Brooks, 2003; Khoury, 1997) and limited time to learn the new technology (Khoury, 1997; Quick & Davies, 1999).

In addition to the above social and technical contexts that are unsupportive of preservice teachers' understanding of technology integration, another challenge is lack of effective instructional and pedagogical knowledge to help preservice teachers understand how technology can be integrated to support meaningful learning (Koehler & Mishra, 2005). Many foundation-level technology and education courses designed for preservice teachers are skill based. Preservice teachers in these courses learn how to use software applications for image editing, word processing, spreadsheet creation, slide-based presentations and so on. However, simply teaching them how to use technology does not prepare preservice teachers to use technology in a meaningful way. It is more important to provide them with proper instructional and pedagogical knowledge to prepare them to use technology to create meaningful learning environments for their future students.

A study conducted by Burns (2002) indicated that technology can be learned best if it is meaningfully integrated with the content. Teachers in the study initially refused to utilize technology for various reasons; one reason was not being proficient with technology.

Practitioner Notes
What is known about this topic
• Preservice teachers often lack technology integration skills.
• Preservice teachers often do not have the necessary support that can help them develop technology integration skills.
• Performance support systems (PSS) are known to provide just-in-time training and learning opportunities.

What this paper adds
• Provides a framework for developing a PSS based on the principles of advance organizers.
• Provides support to further investigate the use of PSS for preservice and in-service teachers' professional development.

Implications for practice and/or policy
• Institutions should investigate the effectiveness of using a PSS to support preservice teachers with technology integration issues.
• K-12 schools should investigate the effectiveness of using a PSS to support teachers with technology integration issues.
• Designers of PSS should incorporate appropriate learning theories in the design framework of a PSS.


However, as a result of various professional development seminars that focused on using technology in the context of teaching, their willingness increased not only to learn technology but also to incorporate it into their teaching. Similarly, Zhao et al (2006) reported that teachers' use of computers was positively correlated with (1) professional development being provided locally in the classroom or lab and (2) a professional development focus on improving student learning.

Burns' (2002) and Zhao et al's (2006) studies suggest that technology should not be taught in isolation, but should be treated as a tool to improve instruction. So the focus of professional development should be not only technology skills but the meaningful integration of technology in the classroom. Likewise, teachers should be provided appropriate technology support in the learning environment where the technology is going to be used.

Performance support systems (PSS) can be incorporated to support such instructional activities to help promote preservice teachers' comprehension of technology integration and provide them with ways to critically evaluate and identify appropriate technology to support learning events. In some literature, a PSS is also known as an electronic performance support system (EPSS). According to Gery (1991), a PSS is defined as a system that "provides whatever is necessary to generate performance and learning at the moment of need" (p. 34). It is considered to be an "optimized body of integrated on-line and off-line methods and resources providing what performers need, when they need it, in the form they need it in, so that they can perform in ways that meet organizational objectives" (Villachica, Stone & Endicott, 2006, p. 540). PSS have been known to provide just-in-time training support to educators in the fields of behavioral psychology (Hung & Chao, 2007), library information access (Barker, van Schaik & Famakinwa, 2007) and engineering (Alparslan, Cagiltay, Ozen & Aydin, 2008). PSS also support workers troubleshooting home appliances and automotive problems (Maughan, 2005) and assist educators in developing state-aligned curricula (Northrup & Pilcher, 1998). In light of Zhao et al's (2006) recommendation, a PSS can provide on-demand support to teachers in their local environment, which could not only increase their use of technology but also support them in the meaningful integration of technology in teaching and learning.

A PSS can integrate "learning and task performance into a single action by providing information and guidance about the task in response to specific needs and situation, thus allowing learning while working" (Gal & Nachmias, 2011, p. 214). PSS accomplish this by incorporating various technical functions, such as content repositories, cognitive tutoring modules and online groupware, into a single support system with a consistent interface. Selection of these elements is based on how they can be integrated to support users' knowledge acquisition to accomplish tasks at work. For preservice teachers, designers of PSS must find tools that can be combined seamlessly to provide users with a uniform interaction experience. While there are a variety of PSS-compatible components that may be combined, Hung and Chao's (2007) matrix-aided performance support system (MAPS) design concept seeks to create features and functionality around the concept of the "advance organizer" that facilitate users' learning and performance. The present study explores how MAPS facilitates the process of preservice teachers' technology integration activities. Data collected from this study help researchers and practitioners to determine how PSS may be designed to support technology integration activities in the classroom. The research questions were:

1. Does MAPS, which is based on the concept of advance organizers, improve preservice teachers' knowledge of technology integration? This question was answered by a pre- and post-knowledge test administered to preservice teacher participants.

2. What are the participants' views and perceptions regarding the design and use of MAPS? This question was answered using the MAPS perception questionnaire and data from the focus group interviews.


MAPS theoretical framework
The goal of the PSS is to combine various technology functions within one unified interface to support users to perform different tasks. When designing a PSS, technological features usually come to the designers' minds first, and they continue to influence the overall design. The adoption of technology in the learning environment, without consideration for design concepts and perspectives, may cause PSS designers to be preoccupied with technical prowess rather than with supporting learning and instruction (Hung, Smith, Harris & Lockard, 2010). From the educational technology perspective, PSS design should be guided by learning and instructional theories to realize the full potential of the technology and to meet the users' needs. Confidence in instructional guidelines increases when the design process is integrated systematically, making the system as usable as possible throughout the learning process. The system can also be evaluated more effectively (Koschmann, Myers, Feltovich & Barrows, 1994; Quintana et al, 2004).

The MAPS design framework is based on Ausubel's (1960) "advance organizer" instructional approach. An advance organizer presents an overall structure of a knowledge domain, or the big picture, before delving into details of individual components. Further, an advance organizer builds on what users already know to help them bridge the gap from already-mastered knowledge to new material. Unlike a summary or abstract, an advance organizer takes learners' characteristics (eg, preexisting knowledge, experience, age) and the nature of the learning materials into consideration in order to provide a meaningful context to the learning environment. This design concept objectifies and represents content information to connect users' existing cognitive structure to the information to be acquired. Ausubel proposed two types of advance organizer: expository and comparative.

Expository advance organizers provide learners with in-depth knowledge on topics for which learners already have a basic understanding. For example, when teaching how to import data to create a mail-merge project in word-processing software, a visual representation of merged documents can be used as an expository advance organizer to help learners make connections to their existing knowledge about data tables and mail-merge wizards. Expository advance organizers are ideal for learning new concepts or skills that require learners to integrate their existing knowledge.

Comparative advance organizers help learners to clarify or make comparisons with their newly acquired knowledge. For example, a teacher can provide a verbal description of keyboard and monitor functionality to help students clarify the concept of input and output devices. Use of comparative advance organizers also results in acquisition of new knowledge as learners attempt to apply their knowledge. In the case of input and output devices, the teacher may extend students' understanding to alternative input devices such as touch-sensing, pen input and scanning.

MAPS uses organizational metaphors in a comparative scheme to objectify and anchor the use of advance organizers systematically and visually. Organizational metaphors are ways to structure information that allow for conveying understanding of a new subject matter's hierarchy (Morville & Rosenfeld, 2006). This hierarchy is directly linked to learners' existing cognitive structures. In the MAPS environment, technology tools are listed by degree of complexity from easiest to most difficult. Learners can figure out that word processing may be an easier technology, compared with databases. Similarly, the learning activities and events are grouped by pedagogies and instructional strategies, which are closely related to preservice teacher-learners' understanding of the curriculum planning approach. The use of organizational metaphors in comparative advance organizer design helps learners to become familiar with different technologies and the relationship of technologies with various learning activities and events. As shown in Figure 1, learning activities and events are organized as a hierarchy in the first two columns, and technology tools are listed in the upper rows. The order of technology tools represents the degree of technical specialization, such as the amount of time and learning required to develop an integration plan for the intended learning activity.

If preservice teachers are interested in exploring how a particular technology can be integrated with a particular learning activity, they click the corresponding button to get additional guidance on that topic. For example, in Figure 1, the pop-up window shows how the technology tool "Concept Maps" can be used to enhance the pedagogical category "Conceptual Knowledge." The top portion of the pop-up window provides a general overview of Conceptual Knowledge and Concept Maps. Links provide additional information on integration strategies, examples and how to assess the learning outcome.

In essence, MAPS is a PSS design that offers a set of tools and problem-solving support. It provides learners a conceptual scaffold through the process of:

1. comparing users' existing knowledge with new knowledge;
2. presenting inclusive information about the new knowledge;
3. exploring applications (ie, technology integration strategies) of inclusive information;
4. applying strategies to the inclusive information (Hung & Chao, 2007).
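To make the matrix structure described above concrete, the sketch below models MAPS content as pedagogical categories (rows) crossed with technology tools (columns), where each cell carries the overview, integration strategies, examples and assessment guidance shown in the pop-up window. This is an illustrative data model only, not the authors' implementation; all class, field and sample names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

@dataclass
class CellGuidance:
    """Content a user sees after clicking one (category, tool) cell."""
    overview: str                      # general overview of the pairing
    integration_strategies: List[str]  # notes or links on integration strategies
    examples: List[str]                # sample activities or lesson ideas
    assessment: List[str]              # ways to assess the learning outcome

@dataclass
class MapsMatrix:
    """Comparative advance organizer: pedagogy rows by technology columns."""
    categories: List[str]   # learning activities/events grouped by pedagogy
    tools: List[str]        # ordered from least to most technically demanding
    cells: Dict[Tuple[str, str], CellGuidance] = field(default_factory=dict)

    def guidance(self, category: str, tool: str) -> Optional[CellGuidance]:
        """Look up the pop-up content for a cell, if it has been authored."""
        return self.cells.get((category, tool))

# Hypothetical usage mirroring the Figure 1 example
matrix = MapsMatrix(
    categories=["Conceptual Knowledge"],
    tools=["Word processing", "Concept maps", "Databases"],
)
matrix.cells[("Conceptual Knowledge", "Concept maps")] = CellGuidance(
    overview="How concept maps can support conceptual knowledge...",
    integration_strategies=["Have students map relationships among key terms"],
    examples=["Concept map of the water cycle"],
    assessment=["Score maps with a rubric for valid links and hierarchy"],
)
print(matrix.guidance("Conceptual Knowledge", "Concept maps").overview)
```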

Formative evaluations of the instructional effects of MAPS
After the MAPS prototype was designed for the present study, formative evaluations, utilizing both qualitative and quantitative approaches, were conducted. Formative evaluation is the "iterative process of tryout and revisions of instructions during development before the actual implementation" (Weston, McAlpine & Bordonaro, 1995, p. 30). The purpose of formative evaluation is to gather and analyze data during the development phase, so that the product or instructions can be improved in the production stage. In the field of educational technology, formative evaluations are used to improve instructional systems (Morrison, Ross & Kemp, 2007). Formative evaluations must serve the purpose of the study; therefore, they should be designed to provide information so that the evaluators can understand whether the goals are being met (Weston et al, 1995) and what future improvements are needed. The current study uses the evaluation method of data collection and revision (Weston et al, 1995). In the current study, revision could imply how the design framework of MAPS can be improved and, if applicable, the possible future implications of the study.

Figure 1: MAPS provides more information about the selected integration strategy, examples and its assessment methods

The formative evaluation in the present study explores how the use of MAPS supports preservice teachers enrolled in technology integration courses in their learning to integrate technology into their curricula. The evaluation also assesses how MAPS can be used in the classroom environment to help teachers to plan technology-integrated learning events.

Method
This section describes the design and methods used to carry out this study. Table 1 shows how the study was conducted in two phases to collect quantitative and qualitative data. During the first phase, quantitative data related to the participants' performance on the knowledge pretest and knowledge posttest were collected. These data were used to assess whether the use of MAPS had a statistically measurable impact on participants' knowledge of technology integration in the classroom. These data were also used to analyze the first research question: does MAPS improve preservice teachers' knowledge of technology integration?

During the second phase, qualitative data were collected using a questionnaire and focus group interviews. Participants from the experimental group completed a 39-item MAPS user-perception questionnaire (Hung & Chao, 2007) on a 5-point Likert scale. The questionnaire assesses four constructs of user perceptions about MAPS:

• navigation and orientation (six items) focuses on user opinions of the matrix layout, as compared with traditional content page layout (eg, portable document format [PDF]);
• interface and content management (nine items) focuses on user concerns about using MAPS to support their information retrieval;
• overall usefulness (17 items) aims at understanding user reactions towards the overall system and its impact on their learning outcomes;
• adoption (seven items) focuses on user willingness about using MAPS in the future.

Table 1: Study design

Phase | Time line and total number of participants | Experimental group (n = 14) | Control group (n = 14)
Phase 1 (quantitative data collection) | 1 (n = 28) | Administered pretest | Administered pretest
Phase 1 (quantitative data collection) | 2 (n = 28) | Asked to find solutions to a few scenarios using MAPS | Asked to find solutions to a few scenarios using PDF
Phase 1 (quantitative data collection) | 3 (n = 28) | Administered posttest | Administered posttest
Phase 2 (qualitative data collection) | 4 (n = 14) | Administered MAPS perception questionnaire | Not administered
Phase 2 (qualitative data collection) | 5 (n = 10) | Participants invited for focus group interviews | Participants invited for focus group interviews


Reliability of scores for the subscales ranged from 0.77 to 0.92, and scores on the MAPS user-perception survey correlate positively to Hung and Chao's (2007) results on adoption of the MAPS interface in classroom behavior management and to teachers' beliefs about the effectiveness of using the matrix environment to help them identify appropriate behavioral intervention strategies (Hung & Chao, 2007). Table 2 provides the reliability coefficient (Cronbach's alpha) for the four major constructs from the instrument.
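The paper does not report which software produced these reliability estimates; as a point of reference, the sketch below computes Cronbach's alpha for one subscale directly from its definition, alpha = k / (k - 1) * (1 - sum of item variances / variance of the total score). The data frame, item names and simulated responses are hypothetical.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of items (one column per item, one row per respondent)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)       # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical data: 14 respondents answering the six navigation/orientation items on a 1-5 scale
rng = np.random.default_rng(0)
navigation_items = pd.DataFrame(
    rng.integers(1, 6, size=(14, 6)),
    columns=[f"nav_q{i}" for i in range(1, 7)],
)
print(round(cronbach_alpha(navigation_items), 2))
```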

Focus group interviews were conducted to gain a better understanding of the participants' views related to MAPS. Focus groups are often conducted for "market research to test and develop products" (Bogdan & Biklen, 2006, p. 109) and to stimulate conversation from multiple perspectives among the participants; this enables the researchers to find and understand different views that otherwise may not be observed in individual interviews (Bogdan & Biklen, 2006). The data from the MAPS user-perception questionnaire and the focus group interviews were used to answer the second research question: what are the participants' views and perceptions regarding the design and use of a MAPS environment? Focus group data also helped to triangulate and explain the findings from the first research question.

A convenience sample of 28 preservice teachers studying in the teacher-education program at a major public university in the Midwest USA was selected to assess the MAPS system. They were recruited from a 4-credit course on technology integration in the elementary school classroom, which included an 8-week student-teaching experience. The study was conducted midway through the course, before the student-teaching experience. This allowed the researchers to ensure that participants had acquired factual knowledge about technology integration before they were exposed to the MAPS environment. The participants were divided into a control group and an experimental group. The experimental group used the MAPS tools; the control group used a static document in a PDF file containing the same information as the MAPS environment.

The participants were first given a knowledge pretest, which consisted of five multiple-choice and true/false technology integration questions. These questions asked the participants either to identify the purpose of a technology or to identify a technology that can be used under a given scenario. After answering the pretest questions, the participants used MAPS or the PDF file, depending on their group. Participants were then free to explore and familiarize themselves with MAPS (Figure 1). To guide the participants through the exploration phase of MAPS and the PDF, the researchers presented scenarios and asked participants to find solutions using MAPS or the PDF. After finding solutions to the scenarios, participants answered the questions on the knowledge posttest.

Then the experimental group was asked to complete the MAPS perception questionnaire. Additionally, all participants were asked to participate voluntarily in focus group interviews. Ten people (36% of the original sample) participated in the focus group interviews.

Findings and discussion
Data from the knowledge pre- and posttest were used to answer the first research question. Data from the MAPS perception questionnaire and focus group interviews were used to answer the second research question.

Table 2: Descriptive statistics and Cronbach's alpha values for MAPS perception questionnaire (n = 14)

Category | M | SD | Minimum | Maximum | Alpha
Navigation and orientation of MAPS | 3.73 | 0.56 | 2.86 | 4.14 | 0.82
Interface and content arrangement | 3.88 | 0.41 | 3.21 | 4.14 | 0.77
System's overall usefulness | 3.50 | 0.76 | 3.00 | 3.86 | 0.96
System adoption of MAPS | 3.55 | 0.64 | 3.21 | 3.79 | 0.90


Research question one
Does MAPS, which is based on the concept of advance organizers, improve preservice teachers' knowledge of technology integration? This research question was answered by examining the following three sub-questions:

1. Is the change in knowledge score (from pretest to posttest) different for the experimental and control groups?
2. Is there a change in knowledge score from pretest to posttest, irrespective of the group (control vs. experimental group)? (ie, is there a within-subjects effect for time of test?)
3. Do the overall group mean values for knowledge differ for the experimental and control groups? (ie, is there a between-subjects effect for group?)

For sub-question one, mixed-design ANOVA results indicated that the change in knowledge from pretest to posttest did not differ significantly between the control group and experimental group, F(1, 27) = 0.05, p = .82. A small effect size was reported, with η² < 0.01. That is, there was no significant interaction effect between group status and time.

For sub-question two, the results indicated that there was no significant change in knowledge from the knowledge pretest to the knowledge posttest, irrespective of the group. That is, the main effect for time was non-significant, with F(1, 27) = 2.36, p = .13. A small effect size was observed, with η² = 0.04.

For sub-question three, the test for the between-subjects effect of groups did not reveal significant differences in the overall group means for the control group and experimental group, with F(1, 27) = 1.23, p = .27 and with a small effect size (η² = 0.04).
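The paper does not state which statistical package was used; as an illustration of how the three effects above (group-by-time interaction, within-subjects effect of time, between-subjects effect of group) could be tested in one pass, the sketch below runs a mixed-design ANOVA with the pingouin library on simulated long-format data. The subject identifiers, simulated scores and the choice of pingouin are all assumptions, and pingouin reports partial eta-squared (np2) rather than the eta-squared values quoted above.

```python
import numpy as np
import pandas as pd
import pingouin as pg  # assumed analysis library, not named in the paper

# Simulate hypothetical knowledge scores (out of 5) for 14 MAPS and 14 PDF participants
rng = np.random.default_rng(1)
records = []
for group in ("MAPS", "PDF"):
    for subject in range(14):
        for time in ("pretest", "posttest"):
            records.append({
                "subject": f"{group}_{subject}",
                "group": group,
                "time": time,
                "score": int(rng.integers(2, 6)),
            })
df = pd.DataFrame(records)

# Mixed-design ANOVA: within-subjects factor = time, between-subjects factor = group.
# The result table contains rows for the group effect, the time effect and their interaction.
aov = pg.mixed_anova(data=df, dv="score", within="time", between="group", subject="subject")
print(aov[["Source", "F", "p-unc", "np2"]])
```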

Although this preliminary finding cannot be used as the basis for predicting an increase in knowledge comprehension as a result of using MAPS, the positive outcome warrants further development of the system and retesting with a control group over a longer duration. Such affirmation is also the primary goal of a pre-experimental study, in that the result should be used only as a design reference to explore the potential effect of MAPS and determine the need for further investigation (Dooley, 2001).

Research question two
The second research question was answered using the data from the MAPS perception survey and focus group interviews. Table 2 provides the descriptive statistics for the four major constructs from the MAPS perception survey and their reliability coefficients (Cronbach's alpha).

Based on the perception survey, most preservice teachers in this study agreed that the screen design and layout of MAPS were helpful. Additionally, a majority of the participants agreed that the arrangement of the information on the screen was logical. Most of the participants agreed that the MAPS environment would improve the outcome of technology integration in their classroom. Overall, most of the participants had positive feedback related to their experiences of using MAPS, as evident in Table 2.

The matrix format of MAPS provides a meaningful representation of information that the learners can comprehend and then utilize to meet their needs. The information on the main screen is condensed, which allows a maximum amount of information to be placed on one screen. One participant stated that it contained "a lot of information, from what it's about to a lesson idea to an assessment. It was all there for you." The participant implied that it was helpful to have all the information available in a condensed format.

Beyond condensing the information, MAPS condenses it in a meaningful format, by labeling items clearly. One participant indicated that "everything was labeled, so it really was impossible to get confused." Another participant stated that it was "very organized. Matrix setup made it easy to use." In contrast to MAPS, one of the participants who used the PDF stated that it had "a lot of words. It was kind of boring to look at." As the PDF was perceived as wordy, it was hard to take a quick look at the entire document in one glance and extract meaningful information.

When asked whether they would prefer to use MAPS or the PDF, the participants indicated that they would prefer to use MAPS. For example, "I would use the software system better just because of the layout information." The layout of the information in MAPS was perceived to allow better comparison of different learning domains and technologies, supporting an informed decision based on that comparison.

While a majority of the participants had positive feedback about MAPS, a couple of the participants found it confusing. One participant stated that it was "hard to understand," while another participant indicated that it was "a little confusing." The researchers speculate that this reaction may be due to a lack of training on the system; the participants might not have had enough time to practice with the system.

Conclusion
The purpose of this formative evaluation was to investigate the experiences of preservice teachers utilizing a PSS known as MAPS to develop knowledge related to classroom technology integration. The MAPS format utilizes a matrix layout and is based on Ausubel's (1960) concept of advance organizers.

The current findings appear to be consistent with Gal and Nachmias (2011) in suggesting that there may not be a significant effect on comprehension as a result of using a PSS. While there was no significant difference in test scores between the pre- and posttest, this does not mean that the system cannot improve participants' knowledge about technology integration. One possible reason that significant improvement in recall was not observed in the current study is the brief time of participants' exposure to the MAPS environment, which is consistent with the findings of Lagerwerf, Cornelis, de Geus and Jansen (2008).

The current findings are also consistent with Nguyen, Klein and Sullivan (2005), who found that the group using the external support system did not score well compared with the group using an extrinsic PSS. This may also point to the design and implementation of the system, because the participants may have focused more initially on trying to understand and comprehend the system, which may have sidetracked them from the main objective. This is also supported by the focus group comments that the system was confusing and hard to understand.

Overall, based on the qualitative data, it appears that most participants liked the layout and design of MAPS, which suggests that MAPS may have the potential to provide real-time support to teachers in the field. However, a few participants thought the system was confusing, which may be related to the limited exposure time and the design and implementation of MAPS.

In light of the current findings and discussion, the following recommendations are offered.

1. Because formative evaluation is an iterative process intended to collect and analyze data to improve the product during the development phase (Weston et al, 1995), the feedback from the current evaluation should be incorporated in the next iteration of MAPS to enhance the usability of the system.

2. While learning, as opposed to performance, may not be the focal point of a PSS (Chang, 2004), the system should be retested with a larger sample size over a longer period of time to validate the current findings.

3. As the focal point of a PSS is performance increase (Chang, 2004; Gery, 1991), it is recommended that future evaluations also test for participants' performance. This can be done by testing how long it takes for the participants to find information using MAPS versus an alternate system, such as the PDF (a sketch of one such comparison follows this list).

4. Participant indications that the system was confusing or hard to understand may point to implementation issues, such as training and support. Even though the goal of an EPSS is "expert-like performance from day 1" (Villachica et al, 2006, p. 540) with little or no training (Villachica et al, 2006), the participants were exposed to the system for only a short period and may not have become fully accustomed to it. It is recommended that additional time be provided to the end user to better acclimate to the user interface and the system.
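As one way to operationalize recommendation 3, the sketch below compares hypothetical search times for the MAPS and PDF groups with an independent-samples (Welch's) t-test; the timing values are simulated and purely illustrative, not data from this study.

```python
import numpy as np
from scipy import stats

# Hypothetical seconds needed to locate an integration strategy for one scenario
rng = np.random.default_rng(2)
maps_times = rng.normal(loc=75, scale=20, size=14)  # experimental group (MAPS)
pdf_times = rng.normal(loc=95, scale=25, size=14)   # control group (PDF)

# Welch's t-test does not assume equal variances between the two groups
t_stat, p_value = stats.ttest_ind(maps_times, pdf_times, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```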

Formative evaluations are generally done during the development phase to collect data for analysis and to perform necessary revisions before product implementation (Weston et al, 1995). The fact that formative evaluations can be done multiple times (Beyer, 1995) implies that the results of the evaluation can change from time to time based on the revisions and improvements made to the program. Therefore, it is important to conduct additional studies, based on the above recommendations and with a larger audience, to confirm the findings of this study and how the use of MAPS can help preservice teachers with the concept of technology integration.

References
Al-Bataineh, A. & Brooks, L. (2003). Challenges, advantages, and disadvantages of instructional technology in the community college classroom. Community College Journal of Research and Practice, 27, 473–484.
Alparslan, C. N., Cagiltay, N. E., Ozen, M. & Aydin, E. U. (2008). Teaching usage of equipment in a remote laboratory. The Turkish Online Journal of Educational Technology, 7, 1, 38–45.
Ausubel, D. P. (1960). The use of advance organizers in the learning and retention of meaningful verbal material. Journal of Educational Psychology, 51, 267–272.
Bannister, S. & Ross, C. (2005). From high school to college: how prepared are teacher candidates for technology integration. Journal of Computing in Teacher Education, 22, 75–79.
Barker, P., van Schaik, P. & Famakinwa, O. (2007). Building electronic performance support systems for first-year university students. Innovations in Education and Teaching International, 44, 3, 243–255.
Beyer, B. K. (1995). How to conduct a formative evaluation. Alexandria, VA: Association for Supervision & Curriculum Development.
Bogdan, R. C. & Biklen, S. K. (2006). Qualitative research for education: an introduction to theories and methods (5th ed.). Boston, MA: Allyn & Bacon, Inc.
Brush, T., Glazewski, K., Rutowski, K., Berg, K., Stromfors, C., Van-Nest, M. H. et al (2003). Integrating technology in a field-based teacher training program: the PT3@ASU project. Educational Technology Research and Development, 51, 1, 57–72.
Burns, M. (2002). From compliance to commitment: technology as a catalyst for communities of learning. Phi Delta Kappan, 84, 4, 295–302.
Chang, C.-C. (2004). The relationship between the performance and the perceived benefits of using an electronic performance support system (EPSS). Innovations in Education and Teaching International, 41, 3, 343–364.
Dooley, D. (2001). Social research methods (4th ed.). Upper Saddle River, NJ: Prentice Hall.
Espey, L. (2000). Technology planning and technology integration: a case study. Paper presented at the Society for Information Technology and Teacher Education International Conference, San Diego, CA.
Favero, M. D. & Hinson, J. M. (2007). Evaluating instructional technology integration in community and technical colleges: a performance evaluation matrix. Community College Journal of Research and Practice, 31, 389–408.
Gal, E. & Nachmias, R. (2011). Implementing on-line learning and performance support using an EPSS. Interdisciplinary Journal of E-Learning and Learning Objects, 7, 213–223.
Gery, G. (1991). Electronic performance support systems: how and why to remake the workplace through the strategic application of technology. Tolland, MA: Gery Performance Press.
Gray, L., Thomas, N. & Lewis, L. (2010). Teachers' use of educational technology in U.S. public schools: 2009 (NCES 2010-040). Washington, DC: National Center for Education Statistics, Institute of Educational Sciences, US Department of Education.
Grove, K., Strudler, N. & Odell, S. (2004). Mentoring toward technology use: cooperating teacher practice in supporting student teachers. Journal of Research on Technology in Education, 37, 1, 85–109.
Groves, M. M. & Zemel, P. C. (2000). Instructional technology adoption in higher education: an action research case study. International Journal of Instructional Media, 27, 1, 57–65.
Hung, W.-C. & Chao, C.-A. (2007). Integrating advance organizers and multidimensional information display in electronic performance support systems. Innovations in Education and Teaching International, 44, 181–198.
Hung, W.-C., Smith, T., Harris, M. & Lockard, J. (2010). Development research of a teachers' educational performance support system: the practices of design, development, and evaluation. Educational Technology Research & Development, 58, 1, 61–80.
International Society for Technology in Education [ISTE] (2002). National educational technology standards for teachers: preparing teachers to use technology. Eugene, OR: International Society for Technology in Education.
Khoury, R. (1997). The unkept promise [Electronic version]. Community College Week, 10, 1, 4–5.
Koc, M. & Bakir, N. (2010). A needs assessment survey to investigate pre-service teachers' knowledge, experiences and perceptions about preparation to using educational technologies. The Turkish Online Journal of Educational Technology, 9, 1, 13–22.
Koehler, M. J. & Mishra, P. (2005). Teachers learning technology by design. Journal of Computing in Teacher Education, 21, 3, 94–102.
Koschmann, T. D., Myers, A. C., Feltovich, P. J. & Barrows, H. S. (1994). Using technology to assist in realizing effective learning and instruction: a principled approach to the use of computers in collaborative learning. The Journal of the Learning Sciences, 3, 3, 227–264.
Lagerwerf, L., Cornelis, L., de Geus, J. & Jansen, P. (2008). Advance organizers in advisory reports: selective reading, recall, and perception. Written Communication, 25, 1, 53–75.
Lin, C.-Y. (2008). Beliefs about using technology in the mathematics classroom: interviews with pre-service elementary teachers. Eurasia Journal of Mathematics, Science and Technology Education, 4, 2, 135–142.
Maughan, G. R. (2005). Electronic performance support systems and technological literacy. Journal of Technology Studies, 31, 1, 49–56.
Morrison, G. R., Ross, S. M. & Kemp, J. E. (2007). Designing effective instruction (5th ed.). Hoboken, NJ: John Wiley & Sons.
Morville, P. & Rosenfeld, L. (2006). Information architecture for the world wide web (3rd ed.). Sebastopol, CA: O'Reilly.
Nguyen, F., Klein, J. D. & Sullivan, H. (2005). A comparative study of electronic performance support systems. Performance Improvement Quarterly, 18, 4, 71–86.
Northrup, P. T. & Pilcher, J. K. (1998). STEPS: an EPSS tool for instructional planning. Paper presented at the Association for Educational Communications and Technology, St. Louis.
Oberlander, J. & Talbert-Johnson, C. (2007). Envisioning the foundations of technology integration in pre-service education. Paper presented at the Association for Teacher Educators.
Quick, D. & Davies, T. G. (1999). Community college faculty development: bringing technology into instruction. Community College Journal of Research and Practice, 23, 641–653.
Quintana, C., Reiser, J. B., Davis, A. E., Krajcik, J., Fretz, E., Duncan, G. R. et al (2004). A scaffolding design framework for software to support science inquiry. The Journal of the Learning Sciences, 13, 3, 337–386.
Villachica, S. W., Stone, D. L. & Endicott, J. (2006). Performance support systems. In J. A. Pershing (Ed.), Handbook of human performance technology (3rd ed.) (pp. 539–566). San Francisco, CA: Pfeiffer.
Weston, C., McAlpine, L. & Bordonaro, T. (1995). Formative evaluation in instructional design. Educational Technology Research and Development, 43, 3, 29–48.
Zhao, Y., Frank, K. A. & Ellefson, N. C. (2006). Fostering meaningful teaching and learning with technology: characteristics of effective professional development. In E. A. Ashburn & R. E. Floden (Eds), Meaningful learning using technology (pp. 161–215). New York: Teacher College Press.
