
Learning Environments Research 4: 115–138, 2001. © 2001 Kluwer Academic Publishers. Printed in the Netherlands.

C. PAUL NEWHOUSE

DEVELOPMENT AND USE OF AN INSTRUMENT FOR COMPUTER-SUPPORTED LEARNING ENVIRONMENTS

Received 23 June 2000; accepted (in revised form) 12 January 2001

ABSTRACT. This article begins by constructing an argument from the literature to support the view that research into the use of computers in classrooms must consider the overall learning environment. This provides the rationale for the development and use of the New Classroom Environment Instrument (NCEI) in research conducted into the use of portable computers in a school. This study particularly focused on the impact of the computers on classroom learning environments. An example is given of the use of the instrument to investigate the use of the portable computers with classes of 13-year-old students. The intention of the article is not to present the results of the study but to demonstrate the use of the instrument and encourage others to use such instruments in educational computing research.

KEY WORDS: educational computing, evaluation, learning environment, portable computers, questionnaires

1. INTRODUCTION

The use of classroom learning environment instruments in educational research dates back to the 1970s with Walberg and Moos (Fraser & Walberg, 1991). Because such instruments have been used in research conducted over a wide range of areas of interest in education, it was natural that in the 1980s and 1990s they would be applied to research in educational computing. This has particularly been the case for research into the implementation of computer support for learning. However, the importance of considering the classroom learning environment in research concerned with the use of computers to support learning should be seen as more than just another application of learning environment instruments. At its core, such research must involve a consideration of the learning environment within which the computers are used because of the interactive nature of the technology.

1.1. Computers and the Classroom Environment

The use of computers in teaching and learning for the majority of children is most likely to occur in the classroom. Most experts in the field of educational computing (e.g. Lynch, 1990; Olson, 1988; Rieber, 1994) would characterise computers as interactive and thus admit them a place within the relationship structures of the classroom. Carter (1990) goes so far as to claim that "new technologies construct a totally new environment" (p. 34). However, the majority of school classroom learning environments which incorporate computers could be depicted using the model in Figure 1. Strictly speaking, the computer systems and non-interactive technology are part of the context of the curriculum but, because computers are two-way interactive, it is more helpful to highlight them by separation. The elements of the traditional classroom learning environment (shown in the box) provide a complex pattern of relationships. When computers are used within this environment, the complexity of this pattern of relationships increases, with all elements of the traditional classroom learning environment needing to interact with both the hardware and software.

Figure 1. A model of the relationship of computer systems to the elements of the classroom learning environment.

1.2. Computers in Schools

In the 1960s, educators and interested community members began to consider realistically the potential for the use of computers in education at both the tertiary and school levels. In the mid-1970s, access to computer processing (i.e. the use of a computer system to perform tasks requiring the user to submit one or more instructions) became available in Australian schools, initially limited to one terminal and later to a number of microcomputers. Today most Australian schools, as is the case in other developed countries, have substantial numbers of computers, some to the extent of one computer per student (Fallshaw, 1995; Lyall, 1997). Gardner, Morrison, Jarman, Reilly and McNally (1994) give the rationale for computers in schools as the perception of computer literacy as necessary for employability, improving work output, improving motivation, improving 'learning environments', and empowering students by releasing them from tedious repetitive tasks.

1.3. Investigating Classroom Learning Environments

Learning environments are the social-psychological contexts within which learning takes place (Fraser & Walberg, 1991). Learning environments in schools typically involve one or more adult teachers in a classroom with various numbers of students, whether juvenile or adult. These people interact and form a variety of relationships, creating what Salomon (1994) calls "a system of interrelated factors that jointly affect learning in interaction with (but separately from) relevant individual and cultural differences" (p. 80). The learning environment has a physical as well as a relationship dimension. Physically it might be in a room, full of particular furniture and equipment, including computers.

1.4. Impact of Computer Use on Classroom Environment

It was argued earlier that using computers in classrooms changes the relationships within the environment and thus the roles of teachers and students are likely to change. Changes in the role of teachers and students will be reflected in changes in the organisation of the classroom and the teaching and learning strategies employed. Many educators (e.g. Collis, 1989; Van Den Akker, Keursten & Plomp, 1992) contend that, for students to make effective use of computers in classrooms, considerable change to what might be the normal classroom routine is required.

Mercer and Fisher (1992) argue that, "it is necessary to examine the total activity, including the way the teacher has set up the task and how she then supports its progress" (p. 344). So, an important factor is likely to be the types of tasks to which computers are put (Bergen & Kingston, 1994). Just as the tasks are important to computer use, so is the organisation of the classroom and the teaching and learning strategies employed (Hayward, 1993). Rowe (1993) contends that "organising lessons in which students are working on computers will involve making considerable changes to what might be the normal classroom routine" (p. 114). Collis (1989) suggests that:

Instead of organizing the school around the notion of classrooms with desks aligned to face the teacher and the chalkboard, both centrally positioned at the front of the room, the school building may radically change in its interior. (p. 15)

McMahon and Duffy (1993) saw what they termed changes in the culture of the classroom in many schools involved in their study of the use of networked home and classroom computers. In particular, this involved a change in the relationship between students and teachers, with students having greater power than previously. It is indeed likely that, to realise the potential of computers in the support of teaching and learning, many revolutionary changes will need to be made in schools and classrooms (Salomon, 1994; Schofield, 1995), as foreshadowed by Rowe (1993):

Obviously, the more effective uses of computers in education will require new patterns of interaction between students and teachers, changes in the social organisation of the classroom, the adaptation of curricula and alternative purposes and modes of student evaluation. (p. 5)

1.5. Computer-Supported Learning Environments

Computer-supported learning environments (CSLEs) are those in which computers are used either to maintain a learning environment or to support student learning in the Vygotskian sense (DeCorte, 1990; Mercer & Fisher, 1992; Mevarech & Light, 1992). Learning environments which involve computers can be configured in many different ways to provide access to computer processing. There are a number of fundamental parameters which can be used to describe the place of computer systems in a learning environment. These parameters include the ratio of computers to students, whether the computers are in the 'normal' classroom or a laboratory, whether they are networked or stand-alone, whether they are portable or fixed, whether students freely access them or gain access through a roster system, and whether students work at the computers individually or in groups.
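To make these descriptive parameters concrete, the short sketch below (in Python, purely illustrative and not part of the original study) shows one way a researcher might record the configuration of a computer-supported learning environment; all field names and example values are hypothetical.

from dataclasses import dataclass

@dataclass
class CSLEConfiguration:
    # Hypothetical record of the parameters discussed above.
    students_per_computer: float   # e.g. 1.0 where every student has a computer
    location: str                  # "classroom" or "laboratory"
    networked: bool                # networked rather than stand-alone
    portable: bool                 # portable rather than fixed workstations
    free_access: bool              # free access rather than a roster system
    grouping: str                  # "individual" or "group" work at the computers

# Example: a class in which each student has a portable, networked computer
# and generally works on it individually.
portable_class = CSLEConfiguration(
    students_per_computer=1.0,
    location="classroom",
    networked=True,
    portable=True,
    free_access=True,
    grouping="individual",
)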

Many educators, particularly school reformers, perceive the ideal computer configuration as a learning environment in which it is possible at any time for every student to access an adequate level of computer processing and software relevant to their learning needs. In essence, it implies the potential for students to have unrestricted access to computer processing. In practice, it means that either the students are in a computer laboratory with the potential for one workstation each, or that each student has a portable computer. In many places, the term 'computer-rich' is used to describe portable computer programmes, but the term 'computer-saturated' could be more appropriate as it is a quantitative term whereas the former could be interpreted in a qualitative way as well.

1.6. Using Classroom Environment Instruments

Fraser and Walberg (1991) considered three approaches to measuring classroom environments. The research discussed in this article made use of two of these approaches, namely, apprehension of the environment and ethnographic case studies. The apprehension of the environment approach to studying classroom environments began in the 1970s with the independent work of Walberg and Moos which resulted in the development of, respectively, the Learning Environment Inventory (LEI) and the Classroom Environment Scale (CES). Since then, these questionnaires have been applied in many studies (Fraser, 1994; Fraser & Walberg, 1991) and a number of environment instruments (e.g. Individualised Classroom Environment Questionnaire (ICEQ), My Class Inventory (MCI), Science Laboratory Environment Inventory (SLEI)) have been developed from them and tailored to particular learning environments or research questions.

Fraser and Walberg (1991) outline three broad areas of research using classroom environment instruments, two of which are relevant to the present discussion. Firstly, research has involved associations between student outcomes and classroom environment, with a meta-analysis showing that classroom environment perceptions account for appreciable amounts of variance in cognitive and affective outcomes (Teh & Fraser, 1995). Research into CSLEs almost always considers a number of affective outcomes, such as attitude towards the use of computers, towards learning and towards particular subject areas. Therefore a classroom environment instrument is ideal for describing classroom environments and drawing conclusions about how these can relate to the use of computers and student attitudes.

Secondly, research has involved investigations of whether students achieve better when in their preferred environments. This is often referred to as person-environment fit research, and was pioneered by Fraser and Fisher (1983b) in the early 1980s. They found that student achievement for a range of outcomes is likely to improve if changes to the classroom are made to increase the congruence between the preferred and actual classroom environments. Fraser and Walberg (1991) conclude that, "the practical implication of these findings for teachers is that class achievement of certain outcomes might be enhanced by attempting to change the actual classroom environment in ways which make it more congruent with that preferred by the class" (p. 16). It is this area of research that could be of most value to research in CSLEs.

While there has been much discussion about how the presence of computers, logically, should change the nature of the classroom environment, little research has occurred to support this discussion. As Teh and Fraser (1995) put it, "innovations in computer-assisted learning rarely have been evaluated in terms of their impact on the nature of the classroom learning environment as perceived by students" (p. 178). This is argued by Levine and Donitsa-Schmidt (1995), especially in the context of the concept of computer-supported learning. It was my belief that computers could be used to support learning environments that would match student preferences more closely and thus lead to improved student achievement on a range of outcomes.

There has been little application of this approach to studying the impact of computers on classroom learning environments (Fraser & Teh, 1994). A few attempts have been made to create learning environment instruments for use in classrooms (usually computer laboratories) in which computers are used (Levine & Donitsa-Schmidt, 1995). Teh and Fraser (1995) constructed the Geography Classroom Environment Inventory (GCEI) which has four scales: Gender Equity, Investigation, Innovation, and Resource Adequacy. They used the actual version to compare two types of classes, computer-assisted learning classes and control classes which did not use computers. Maor and Fraser (1996) studied the use of computers to support inquiry-based learning in secondary science classes and concluded that the use of the particular software package encouraged teachers and students to develop a more investigative and open-ended learning environment. Recently Newby and Fisher (2000) have reported on the use of the Computer Laboratory Environment Inventory (CLEI) which they used in research on the use of computer laboratories in a university business course. They developed a model to explain the relationship between student perceptions of the laboratory environment, student attitudes and achievement in the course.

There are few researchers who have used generic learning environment instruments in studies where the use of computers is important. Where computers are used to support learning in a variety of environments (not just a specialist computer laboratory), the impact of computer use on the 'normal' classroom environment requires the use of a generic instrument. This provides data on the classroom environment no matter to what degree, and in what ways, the computers are used. If it is considered that the environment is of primary importance in teaching and learning, then there is little purpose in attempting to extract a 'computer' component. Salomon (1994) supports this view by arguing that it is not possible to study "the impact of computer use in the absence of the other factors" nor to assume that "one factor impacts outcomes independently of the others" (p. 80). The educational aim is to embed the computer support in the learning environment (DeCorte, 1990), rather than to try to isolate its effect on learning. Using computers in learning is concerned with methods of using the technology to create environments and learning situations. The aim is to offer new learning opportunities or to improve the way in which current learning activities are implemented, and therefore the overall effectiveness of learning environments and episodes is of paramount concern. It is important that the newness and ever-changing nature of computer-based technology does not overshadow the enduring nature of learning and the solid and ever-increasing base of knowledge about learning.

Research conducted into the provision of learning environments incorporating high access to computers has supported a number of important assumptions which should be considered in such research. Firstly, research must consider the whole learning environment in a classroom, with a particular focus on the relationships between teachers, students, curriculum, and educational technology (Miller & Olson, 1994; Rowe, 1993). Secondly, studies must be long-term, particularly where the application of computers is relatively new to the teachers (Becker, 1994; Welle-Strand, 1991). And thirdly, teacher perceptions, attitudes and beliefs concerning learning, schooling and computers are the most important factors in the successful implementation of computer support in these learning environments (Dwyer, 1994).

1.7. New Classroom Environment Instrument (NCEI)

The NCEI instrument evolved from a study into the development of computer-supported learning environments in secondary mathematics classes (Newhouse, 1993) and was based on a well accepted instrument, the Classroom Environment Scale (CES), developed by Rudolf Moos and Edison Trickett (1974). The instrument was then used extensively in a study of the implementation of portable computers in a secondary school. The study aimed to use the instrument to highlight any differences in the classroom environments, as perceived by the students, when various implementations of computer use were facilitated in the classroom. While the instrument could be administered to teachers to measure their perceptions, the earlier study had found that giving the same instrument to the classroom teacher was of little value (Newhouse, 1993).

1.7.1. Nature of the NCEI

In selecting a classroom environment instrument to use in the study, it was important to consider the reason for its use. There have been a number of studies for which an instrument was developed specifically for computer-based or computer-supported classroom environments (e.g. Maor & Fraser, 1996; Teh & Fraser, 1995). These instruments include scales which are related to using computers in a classroom. However, my study was concerned with the impact of portable computers on the 'normal' classroom environment. It was considered that computer use could be implemented in a variety of ways by different teachers with different classes of students. Therefore it was important to consider differences in the environment between classes using computers and those not, and between classes using the computers for different purposes. As a result, the instrument needed to be independent of computer use and to measure students' perceptions of general aspects of classroom environments which could apply to any classroom.

Over a period of three years, commencing in 1991, I made an investigation of a range of classroom environment instruments available (e.g. those mentioned earlier, including the CES, ICEQ, MCI, SLEI, CLES, CCEI, and GCEI) and their use in educational research while developing the New Classroom Environment Instrument (NCEI). There were many such instruments available and so I discussed the appropriateness of these instruments firstly with Fraser and later with Fisher, and I read some literature which they recommended (Fraser, 1981, 1989, 1994; Fraser & Fisher, 1983a, 1983b; Fraser, Malone & Neale, 1989; Fraser & McRobbie, 1995; Fraser & Walberg, 1991; Rentoul & Fraser, 1979; Taylor & Fraser, 1991). These investigations made it clear that it was not necessary to develop an entirely new instrument, but rather to modify existing instruments. Further, it was clear that the instrument would mainly be used to describe the classroom environment, and compare preferred and actual class means from the perspective of person-environment fit. For these reasons, the original Classroom Environment Scale (CES) appeared to be the most appropriate to build upon.

The NCEI was initially developed (Newhouse, 1993) by modifying the CES (Moos & Trickett, 1974) and later including the Group Work scale from the Classroom Interaction Patterns Questionnaire (CIPQ) developed by Woods (1995). Therefore all items for the NCEI were either directly from one of these two parent instruments or were reworded items from those instruments. Both a preferred and actual form of the instrument were developed.

1.7.2. Development of the NCEI

The original CES (Moos & Trickett, 1974) has nine scales, each with 10 items, using a True-False response format. From the original CES, a number of items were removed because the earlier study I had conducted demonstrated that it had too many items for the age of the students involved. Firstly, one of the nine scales, Rule Clarity, was removed because it was considered to be too similar to the Teacher Control scale. The number of items for each of the remaining eight scales was reduced to seven. Items were removed which were thought to be more difficult for students to comprehend, particularly considering the level of language used. In the earlier study (Newhouse, 1993) and with the Grade 7 students (12-year-olds) in the first year of the portable computer study, students were asked to circle words which they had difficulty in understanding. In addition, some items were removed to improve the reliability of the scales using these data.

From the earlier study, it was clear that many students had difficulty responding using the True-False response format. Often students indicated that an item was neither True nor False. Firstly, a five-point Likert response format was considered, as used for the School-Level Environment Questionnaire (SLEQ). However, many students found it difficult to select from five alternative responses and were not sure what it meant to agree or disagree with a statement. A number of subsequent trials made use of other response formats used by other classroom environment instruments, for example, the five-response frequency format of the SLEI (Fraser & McRobbie, 1995). The result was the adoption of the three-point frequency response format, Often, Sometimes, Almost Never. Students appeared to have least difficulty responding with this format and scale reliability coefficients were higher.

In the final form of the NCEI (used from the second half of the first year of the portable computer study), the Task Orientation scale was removed and replaced with the Group Work scale. Analysis of the data collected from Grade 7 students had given low scale reliability for the Task Orientation scale (0.36 and 0.42 for preferred and actual versions, respectively) and it appeared from item analysis that it would be more difficult to improve than any other scale. Woods (1995) had just completed the development of the CIPQ instrument, which had a Group Work scale. From discussions with teachers at the school and reading a number of studies (e.g. DeCorte, 1990; Dwyer, Ringstaff & Sandholtz, 1991; Johnson & Johnson, 1991; Mevarech & Light, 1992; Olson, 1988; Riel, 1989; Rowe, 1993; Rysavy & Sales, 1991; Wubbels, Brekelmans & Hooymayers, 1991), it was decided that it would be appropriate to include a scale related to working in groups in the classroom. Woods' instrument used the five alternative responses of Very Often, Often, Sometimes, Seldom and Almost Never; the Very Often and Seldom responses were therefore removed. The CIPQ contained seven items for each scale, which matched the number of items for the other scales of the NCEI, and thus all items were initially included. However, when the items were shown to students in a class for feedback, it was decided to remove one item and replace it with a slightly modified version of another item of the scale, and the wording of two other items was also changed.

The wording of the actual version was improved and the preferred version was subsequently created. It was also ensured that there was at least one reverse-scored item for each scale. In modifying the wording of the items, I attempted to maintain the content of each item while improving its readability for middle-school students and also emphasising student perceptions rather than just opinions. This final concern is best illustrated in the preferred version, where the word 'would' or 'would not' was used to replace words such as 'should' in the CES. For example, in the preferred version of the CES, Item 1 reads, "Students should put a lot of energy into what they do here". In the preferred version of the NCEI, Item 1 reads, "Students would put a lot of energy into what they do here". Students were asked to indicate what would constitute an ideal class for them rather than what should. Some students indicated that the latter term evoked a sense of what they thought others, particularly adults, thought an ideal class should be like rather than what they personally believed.

As a result of all these changes, the final NCEI had 8 scales (Table I) with each scale represented by 7 items (56 items in total) using a three-point frequency response format, Often, Sometimes, Almost Never.

1.7.3. Analysis of the NCEI

Responses to the instrument were entered into a separate spreadsheet for each class. The three possible responses for each item, namely, Often, Sometimes, Almost Never, were coded with a 3, 2 and 1, respectively. Scale scores for each student, and class means and standard deviations, were calculated for each sub-scale. The spreadsheet data were imported into SPSS version 4.01 Macintosh (SPSS Inc., 1990) to calculate Cronbach alpha reliability coefficients as measures of internal consistency, and to investigate differences between class means using effect sizes and t-tests (either dependent or independent). The data were also imported into CA-Cricket Graph III® (Computer Associates International, 1992) to create graphs to show comparisons of class means for the scales. In an international comparison of studies using various classroom environment instruments, Wubbels (1993) stated that, while a scale reliability coefficient of 0.7 or greater is regarded as acceptable, it is not uncommon to find reliability coefficients down to 0.2.


TABLE I

Scale Description and Sample Item for Each Scale of the New Classroom Environment Instrument

INV (Involvement). The extent to which students have attentive interest in class activities and participate in discussions. The extent to which students do additional work on their own and enjoy the class. Sample item: "Students put a lot of energy into what they do here." (+)

AFF (Affiliation). The level of friendship that students feel for each other, that is, the extent to which they help each other with homework, get to know each other easily, and enjoy working together. Sample item: "Students in this class get to know each other well." (+)

TS (Teacher support). The amount of help, concern, and friendship which the teacher directs towards the students. The extent to which the teacher talks openly with students, trusts them, and is interested in their ideas. Sample item: "This teacher remains at the front of the class rather than moving about and talking with students." (–)

GW (Group work). The extent to which students are able to work collectively in class on tasks and activities assigned by the teacher. Sample item: "Students work by themselves rather than working together on projects in this class." (–)

COM (Competition). The emphasis placed on students' competing with each other for grades and recognition. An assessment of the difficulty of achieving good grades is included. Sample item: "Students feel pressured to compete here." (+)

OO (Order and organisation). The emphasis on students behaving in an orderly and polite manner and on the overall organization of assignments and classroom activities. The degree to which students tend to remain calm and quiet. Sample item: "This is a well-organised class." (+)

TC (Teacher control). How strict the teacher is in enforcing the rules, and the severity of the punishment for rule infractions. The number of rules and the ease of students getting into trouble. Sample item: "There are very few rules to follow." (–)

INN (Innovation). How much students contribute to planning classroom activities, and the amount of unusual and varying activities and assignments planned by the teacher. The extent to which the teacher attempts to use new techniques and encourages creative thinking in the students. Sample item: "New ideas are tried out in this class." (+)

Note. Items designated (+) are scored by allocating 3, 2, 1, respectively, for the responses Often, Sometimes, Almost Never. Items designated (–) are scored in the reverse manner. Omitted responses are given a score of 2. Descriptions are taken from Moos and Trickett (1974, p. 3) and Woods (1995, p. 110). The GW scale was modified from the original CIPQ; all other scales were modified from the original CES.
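The scoring and reliability analysis described in Section 1.7.3 can be illustrated with a short sketch. The following Python fragment is purely illustrative (the original analysis used spreadsheets and SPSS); the response data are invented, and only the scoring rules (3, 2, 1 for Often, Sometimes, Almost Never; reverse scoring for (–) items; omitted responses scored 2) and the Cronbach alpha calculation follow the procedures described in the text.

# Minimal sketch, not the original SPSS/spreadsheet workflow.
import numpy as np

RESPONSE_CODES = {"Often": 3, "Sometimes": 2, "Almost Never": 1}

def score_item(response, reverse=False):
    """Code one response; omitted responses receive the neutral score of 2."""
    score = RESPONSE_CODES.get(response, 2)
    return 4 - score if reverse else score   # reverse scoring: 3 <-> 1, 2 stays 2

def cronbach_alpha(item_scores):
    """Cronbach alpha for a (students x items) array of scored responses."""
    items = np.asarray(item_scores, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical example: four students answering a three-item scale in which
# the third item is reverse-scored.
responses = [
    ["Often", "Often", "Almost Never"],
    ["Sometimes", "Often", "Almost Never"],
    ["Often", "Sometimes", "Sometimes"],
    ["Almost Never", "Sometimes", "Often"],
]
reverse_flags = [False, False, True]
scored = [[score_item(r, rev) for r, rev in zip(student, reverse_flags)]
          for student in responses]
scale_scores = [sum(row) for row in scored]   # per-student scale score
print("Scale scores:", scale_scores)
print("Cronbach alpha:", round(cronbach_alpha(scored), 2))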


In particular, comparisons were made between class means on the preferred and actual versions of the instrument. The t-test results were reported at 0.05 and sometimes the 0.001 level of significance. Effect sizes, or standardised mean differences, were calculated using the formula d = (mean difference)/(pooled standard deviation), which is discussed by Dunlap, Cortina, Vaslow and Burke (1996). Effect sizes above about 0.50 were regarded as moderate and those above about 0.70 regarded as large (Fraser, 1989). The analysis depended primarily on the use of the effect sizes, which Thompson (1996) recommends should be general AERA editorial policy in line with American Psychological Association policy. Rennie (1997) also discusses the merits of using effect sizes to consider differences between group means.
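As a concrete illustration of this comparison, the following sketch (Python, with scipy used here only as a stand-in for the SPSS procedures actually employed) computes the effect size from the pooled standard deviation and a dependent t-test for one scale; the preferred and actual scores shown are hypothetical.

# Sketch of the comparison described above: d = (mean difference)/(pooled SD)
# plus a dependent t-test. The scores are invented for illustration only.
import numpy as np
from scipy import stats

def pooled_sd(a, b):
    """Pooled standard deviation of two groups (sample SDs, ddof=1)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = len(a), len(b)
    return np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2))

def effect_size(actual, preferred):
    """d = (actual mean - preferred mean) / pooled SD, as in Tables III and IV."""
    return (np.mean(actual) - np.mean(preferred)) / pooled_sd(actual, preferred)

# Hypothetical Involvement scale scores (7 items, possible range 7-21) for one class.
preferred = [19, 20, 18, 21, 17, 19, 20, 18]
actual    = [15, 16, 14, 18, 13, 15, 17, 14]

d = effect_size(actual, preferred)
t, p = stats.ttest_rel(actual, preferred)   # dependent t-test (same students)
print(f"Effect size d = {d:.2f}, t = {t:.2f}, p = {p:.3f}")
# Interpretation used in the study: |d| above about 0.50 moderate, above 0.70 large.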

The interpretation of NCEI data is highly dependent on the beliefs, perceptions and previous experiences of the interpreter because they concern students' beliefs and perceptions. The NCEI's value is in the comparison of students' preferences for, and perceptions of, the classroom environment and changes in perceptions over time.

2. AN APPLICATION OF THE NCEI

The main study in which the NCEI was used was a three-year impact study into the implementation of a portable computer programme. The results of the study have been reported fully (Newhouse, 1998) and therefore this discussion is designed to provide an example of the application of the NCEI and further describe its development.

The study was based on an ethnographic model of research using an interpretive methodology. Each year, the study focussed on those features of the psychosocial environment of the school which seemed to be important from an interpretation of data collected in the previous year. To illustrate the use of the NCEI in the study, data from the second year are considered. This was the year in which the final version of the NCEI was used. Data were collected using the NCEI, interviews, questionnaires, and lesson observations. However, the testing of the instrument in 1993 is also presented.

In the second half of the first year of the study (1993), the NCEI was used in its final form for the first time. There were four Grade 8 classes (13-year-olds) involved in the study that year. The preferred and actual forms of the instrument were administered to each science class. The Cronbach alpha coefficients for each administration of the instrument are given in Table II. It was decided that the instrument was sufficiently reliable to use in a more comprehensive manner in the following year of the study. However, there was some concern about the Competition and Teacher Control scales.

In the second year of the study, it was intended to obtain data associated with as many Grade 8 teacher-class combinations as possible. In the first year of the study, there had been a strong emphasis, which had gained empirical support, on investigating teacher-class combinations and it was decided to continue this emphasis. It appeared that different teachers were able to make use of the portable computer resource to support different types of learning environments. It was not yet clear whether students' perceptions of these learning environments were consistent and whether their preferences were associated with the manner in which the computers were used.

The preferred version of the NCEI was administered to five focus classes in February, with the actual version administered once at the end of March and again in November. The preferred version was administered a second time to one class in November. Class means and standard deviations on all scales for the administrations were calculated. Independent t-tests were used to examine the small differences between class means on the preferred scales, but none were statistically significant. The preferred and actual data from March were compared by calculating effect sizes and dependent t-tests. These results are given in Table III. Effect sizes for class means for the two administrations of the actual version for each class are reported in Table IV. These results are also presented in Figures 2 and 3. These data were analysed to compare characteristics of the five teacher-class combinations and to create individual classroom environment profiles for the classes. Because the Cronbach alpha coefficients calculated for the Competition and Teacher Control scales (Table II) suggest poor internal consistency, these scales were not used.

TABLE II

Cronbach Alpha Reliability Coefficients for Administrations of the NCEI to Four Classes in 1993 and Five Classes in 1994

Scale                        Pref. alpha   Actual alpha   Pref. alpha   Actual alpha   Actual alpha
                             Aug. 1993     Aug. 1993      Feb. 1994     March 1994     Nov. 1994
                             (n = 95)      (n = 95)       (n = 111)     (n = 111)      (n = 102)

Involvement                  0.79          0.69           0.60          0.39           0.71
Affiliation                  0.83          0.66           0.82          0.65           0.72
Teacher support              0.84          0.75           0.75          0.45           0.69
Group work                   0.83          0.73           0.79          0.80           0.75
Competition                  0.65          0.23           0.33          0.10           0.25
Order and organisation       0.76          0.54           0.71          0.65           0.43
Teacher control              0.69          0.40           0.38          0.39           0.17
Innovation                   0.82          0.70           0.63          0.59           0.70


TABLE III

Effect Sizes and t-Test Results for a Comparison of Actual and Preferred Scores on the NCEI for Five Classes in Grade 8

                             Effect size for actual vs. preferred
Scale                        1 = Lucy   2 = Tim    3 = Sue    4 = Brian   5 = Sandra

Involvement                  –0.87*     –0.41      –1.12*     –0.77*      –1.13*
Affiliation                  –0.96*     –0.35      –0.67*     –0.63*      –0.74*
Teacher support              –0.86*     –0.76*     –0.88*     –0.76*      –0.92*
Group work                   –1.07*     –0.29      –1.08*     –1.36*      –1.18*
Competition                  –0.87*     –0.67*     0.04       0.31        –0.14
Order and organisation       –0.90*     –0.61*     –0.59      –0.09       –1.24*
Teacher control              0.39       –0.14      0.04       –0.12       0.16
Innovation                   –0.94*     –0.80      –1.13*     –1.12*      –1.13*

Note. Effect sizes were calculated by subtracting the preferred mean from the actual mean and dividing by the pooled standard deviation. *p < 0.05.

TABLE IV

Effect Sizes for a Comparison of NCEI Actual Environment Data in March and November for Five Classes in Grade 8

                             Effect size for March vs. November
Scale                        1 = Lucy   2 = Tim    3 = Sue    4 = Brian(a)  5 = Sandra

Involvement                  0.33       0.15       0.28       –0.44         –0.20
Affiliation                  –0.43      –0.27      –0.15      –0.39         –0.35
Teacher support              0.28       0.12       0.45       –0.55         –0.27
Group work                   –0.13      –0.36      0.28       0.50          –0.51
Competition                  0.16       0.94       –0.64      –0.75         0.28
Order and organisation       0.16       –0.05      –0.13      –1.55         0.44
Teacher control              –0.13      0.03       0.49       –0.22         –0.29
Innovation                   0.54       0.16       0.09       0.11          0.05

Note. Effect sizes were calculated by subtracting the March actual mean from the November actual mean and dividing by the pooled standard deviation. No t-tests were conducted because the analysis was not designed to determine a person-environment fit but rather to compare two sets of actual class means. (a) Only seven students in Brian's class completed the NCEI actual version in November.


2.1. Tim’s Science Class

In Tim’s class, the computers were used regularly as he worked towardsintegrating their use in the learning activities. Most of Tim’s lessons in-volved student-centred learning activities with students working individu-ally or in groups on experiments or activities. They were encouraged torecord notes, reports and data using their computers. Tim commented thatthey were trying to do more student-centred activities in Grade 8 scienceand that “we should do more of this because the students respond well”but that “not all teachers like this approach”.

Figure 2. Comparison of Actual NCEI class means for the five focus classes, early in the second year of the study.

Figure 3. Comparison of Actual NCEI class means for the five focus classes, late in the second year of the study.


The results from the NCEI are displayed graphically in Figure 4. The pattern of effect sizes for a comparison between actual and preferred scale scores suggests that the students wanted more Teacher Support and more Order and Organisation. From discussions with Tim, it seemed that he clearly did not want to encourage a teacher-controlled, competitive environment and thus, in a largely competitive school, it was not surprising that students should notice the lack of direction by the teacher. The students did perceive Tim's class to be more innovative than students in three of the other focus classes perceived theirs (the lowest effect size for Innovation for the difference between preferred and actual scores in Table III). However, students still perceived less innovation than they would have preferred, as indicated by an effect size of 0.80.

2.2. Lucy’s English Class

An analysis of teaching strategies taken from observation notes of the five lessons (see Table V) clearly shows that, in Lucy's class, a lot of time was spent on teacher presentation, whole-class interaction and individual work. Minimal group work was observed with this class even though Lucy was confident that she had conducted many lessons using group work. Lucy indicated that Lesson 3 was more typical of many of her lessons. In this lesson, there was a lot of individual work, much of it on the computers, and the lesson was more student-controlled. The other lessons were heavily teacher-controlled and made little, if any, use of the computers. Many students in Lucy's class used the computers regularly, although Lucy did not appear to attempt to accommodate the use of the computers as part of learning activities to the extent to which she had in the previous year.

Figure 4. NCEI class means for Tim's science class (Preferred, March Actual and November Actual).


The NCEI data were analysed in more detail for Lucy's class because both versions had been administered twice, once early in the year (March) and once late in the year (November). It appeared that, in terms of the preferred-actual comparison, students were dissatisfied with her class (refer to Table III and Figure 5), particularly when compared with the other four focus classes. Lucy's students perceived little change in the learning environment between March and November. However, it should be noted that there had been large differences between the two administrations of the preferred version for a number of scales, especially with the students appearing to want less Involvement.

Lucy wanted to make use of group-work strategies and had been observed to do so in the preceding year. Although she felt that she still used group work a lot, this was only observed for about three minutes in six lessons, and those students interviewed from her class did not have this perception. Lucy did admit to having less energy and enthusiasm for using the computers than in the previous year, and therefore it is likely that she implemented less of the higher-energy group-work strategies and, for similar reasons, probably put less effort into involving students in creating the learning environment.

TABLE V

Percentage of Observed Teaching Strategies used by Lucy over Five Lessons with a Grade 8 English Class

Strategy                       Lesson 1   Lesson 2   Lesson 3   Lesson 4   Lesson 5

Transition                     35%        19%        21%        2%
Teacher presentation           15%        21%        20%        18%        100%
Whole-class interaction        25%        60%        2%         35%
Individual student work        25%                   56%        33%
Group work                                                      8%

% Teacher controlled           75%        100%       31%        54%        100%

Computer use/student (mins)    0          0          20         2          1

Figure 5. NCEI class means for Lucy's English class (Preferred, March Actual and November Actual).


2.3. Sue’s Social Studies Class

An analysis of the strategies and amount of computer use in the observed lessons for Sue's class was conducted. Students spent most of the time working individually or being involved in whole-class interactive sessions. In one lesson, group work was conducted. In none of the lessons did all students use their computers. Sue had a few students who chose to use the computers regularly in class, but the majority did not. Profiles of NCEI class means, as represented in Figure 6, indicate a poor person-environment fit for this class, as was also the case for Lucy's, Brian's and Sandra's classes. The scales for which there were large, statistically significant differences between the actual and preferred versions were Involvement, Affiliation, Teacher Support, Group Work, and Innovation.

2.4. Mathematics Classes

Students in Brian’s and Sandra’s mathematics classes did not use the com-puters at any stage in class. It appears that the environments created in theseclasses (refer to Figures 7 and 8) were very similar and gave students theperception that they were not involved in decisions about learning, thatthey should not collaborate with other students, and that learning activi-

Figure 6. NCEI class means for Sue's social studies class (Preferred, March Actual and November Actual).


2.5. Comparing the Five Focus Classes

For all classes, except Tim’s science class, the preferred-actual compari-son of class means resulted in large effect sizes and statistically significantdifferences on most NCEI scales (Table III). Actual-preferred differencesbetween class means for the scales of Involvement, Affiliation, Group Workand Innovation occurred for all classes except Tim’s. Statistically signifi-cant actual-preferred differences occurred for all classes on the TeacherSupport scale.

Figure 7. NCEI class means for Brian's mathematics class (Preferred, March Actual and November Actual).

Figure 8. NCEI class means for Sandra's mathematics class (Preferred, March Actual and November Actual).


When considering the change in students' perceptions of the classroom environments between March and November, there were few even moderate effect sizes. The March class means (Figure 2) show relatively similar profiles for all five classes, except that there is a large spread of class means for the Group Work scale. Students in Tim's class perceived much more group work than the other classes and students in the two mathematics classes (Brian and Sandra) perceived the least. The November actual class means (Figure 3) show less uniformity, with a spread of class means on the Group Work scale and the mathematics classes having low class means for the Involvement, Teacher Support and Innovation scales. Students in Lucy's and Tim's classes continued to perceive their classes as being more innovative and varied than did the students in the other classes.

The computers were not consistently an integral part of the learning environments associated with any of these classes, although they were integral to some lessons (one observed) in Tim's science class. In Lucy's and Sue's classes, the computers were used by some students at some times, but did not appear to be necessary for the lesson, or even to be helpful for some students. Only in Tim's class were the computers used consistently by most students, almost always to support student-centred learning strategies. This was also the only class with a good person-environment fit, particularly evident in the use of group work. This finding was consistent with the findings of other researchers (e.g. Becker, Ravitz & Wong, 1999) that computer support for learning is most effective in student-centred learning environments. Rowe (1993) goes as far as to say that "learning with personal computers is not compatible with traditional didactic methods of class teaching" (p. 114).

3. CONCLUSIONS

The New Classroom Environment Instrument (NCEI) proved to be a useful tool in the overall analysis of the classroom environments researched in the study. When combined with lesson observation and interview data, the NCEI provided a means of describing differences between teacher-class combinations and explaining the dynamics of the classroom. For example, when computers were consistently used to support student-centred approaches to learning, particularly involving group work, analysis indicated a good person-environment fit. Further, environments in which computers were used frequently were seen to be more innovative and involving. Another example was that, throughout a school year, the student-preferred environment could change considerably, which could relate to teacher energy and unfulfilled expectations. This could also be reflected in the use of computers (e.g. in Lucy's class), where established expectations required a high level of teacher energy to fulfil.

Given the importance of the impact of computer support on the learning environment, educational computing research must continue to focus on the analysis of the classroom learning environment. The use of learning environment instruments such as the NCEI, in conjunction with other data sources, is recommended for such research.

REFERENCES

Becker, H. J. (1994). How exemplary computer-using teachers differ from other teachers: Implications for realizing the potential of computers in schools. Journal of Research on Computing in Education, 26, 291–321.

Becker, H. J., Ravitz, J. L., & Wong, Y. T. (1999). Teacher and teacher-directed student use of computers and software (Teaching, Learning, and Computing: 1998 National Survey, Report 3). Irvine, CA: Center for Research on Information Technology and Organizations, University of California, Irvine.

Bergen, C., & Kingston, P. (1994). A framework for analysing the contribution of educational technology to learning. British Journal of Educational Technology, 25(1), 58–60.

Carter, D. S. G. (1990). Knowledge transmitter, social scientist or reflective thinker: Three images of the practitioner in Western Australian high schools. Theory and Research in Social Education, XVIII, 274–317.

Collis, B. (1989, April). Using information technology to create new educational situations. Paper presented at the UNESCO International Congress on Education and Informatics, Paris.

Computer Associates International. (1992). CA-Cricket Graph III (Version 1.53). Islandia, NY: Author.

DeCorte, E. (1990). Learning with new information technologies in schools: Perspectives from the psychology of learning and instruction. Journal of Computer Assisted Learning, 6, 69–87.

Dunlap, W. P., Cortina, J. M., Vaslow, J. B., & Burke, M. J. (1996). Meta-analysis of experiments with matched groups or repeated measures designs. Psychological Methods, 1, 170–177.

Dwyer, D. (1994). Apple classrooms of tomorrow: What we've learned. Educational Leadership, 51(7), 4–10.

Dwyer, D. C., Ringstaff, C., & Sandholtz, J. H. (1991). Changes in teachers' beliefs and practices in technology-rich classrooms. Educational Leadership, 48(8), 45–52.

Fallshaw, M. (1995). The use of laptops in senior mathematics. In L. W. Shears (Ed.), Computers and schools (pp. 91–97). Melbourne, Australia: Australian Council for Educational Research.

Fraser, B. (1981). Using environmental assessments to make better classrooms. Journal of Curriculum Studies, 13, 131–144.


Fraser, B. J. (1989). Twenty years of classroom climate work: Progress and prospect. Journal of Curriculum Studies, 21, 307–327.

Fraser, B. J. (1994). Research on classroom and school climate. In D. Gabel (Ed.), Handbook of research on science teaching and learning (pp. 493–541). New York: Macmillan.

Fraser, B. J., & Fisher, D. L. (1983a). Assessment of classroom psychosocial environment: Workshop manual. Perth, Australia: Western Australian Institute of Technology.

Fraser, B. J., & Fisher, D. L. (1983b). Use of actual and preferred classroom environment scales in person-environment fit research. Journal of Educational Psychology, 75, 303–313.

Fraser, B. J., & McRobbie, C. J. (1995). Science laboratory classroom environments at schools and universities: A cross-national study. Educational Research and Evaluation, 1, 289–317.

Fraser, B. J., & Teh, G. P. L. (1994). Effect sizes associated with micro-prolog-based computer-assisted learning. Computers in Education, 23, 187–196.

Fraser, B. J., & Walberg, H. J. (Eds.). (1991). Educational environments. Oxford, UK: Pergamon Press.

Fraser, B. J., Malone, J. A., & Neale, J. M. (1989). Assessing and improving the psychosocial environment of mathematics classrooms. Journal of Research in Mathematics Education, 20, 191–201.

Gardner, J., Morrison, H., Jarman, R., Reilly, C., & McNally, H. (1994). Personal portable computers and the curriculum (Practitioner Minipaper 13). Glasgow, UK: Scottish Council for Research in Education.

Hayward, P. A. (1993, April). When novelty isn't enough: A case study of students' reactions to technology in the classroom environment. Paper presented at the joint meeting of the Southern States Communication Association and the Central States Communication Association, Lexington, KY.

Johnson, D. W., & Johnson, R. T. (1991). Cooperative learning and classroom and school climate. In B. J. Fraser & H. J. Walberg (Eds.), Educational environments (pp. 55–74). Oxford, UK: Pergamon Press.

Levine, T., & Donitsa-Schmidt, S. (1995). Computer experience, gender, and classroom environment in computer-supported writing classes. Journal of Educational Computing Research, 13, 337–357.

Lyall, K. (1997, August 2–3). Laptop High. The Weekend Australian, Syte, pp. 1, 4.

Lynch, W. (1990). Social aspects of human-computer interaction. Educational Technology, 30(4), 26–31.

Maor, D., & Fraser, B. J. (1996). Use of classroom environment perceptions in evaluating inquiry-based computer-assisted learning. International Journal of Science Education, 18, 401–421.

McMahon, T. A., & Duffy, T. M. (1993). Computers extending the learning environment: Connecting home and school. Paper presented at the Convention of the Association for Educational Communication and Technology, New Orleans, LA.

Mercer, N., & Fisher, E. (1992). How do teachers help children to learn? An analysis of teachers' interventions in computer-based activities. Learning and Instruction, 2, 339–355.

Mevarech, A. R., & Light, P. H. (1992). Peer-based interaction at the computer: Looking backward, looking forward. Learning and Instruction, 2, 275–280.


Miller, L., & Olson, J. (1994). Putting the computer in its place: A study of teaching with technology. Journal of Curriculum Studies, 26(2), 121–141.

Moos, R. H., & Trickett, E. J. (1974). Classroom Environment Scale: Manual. Palo Alto, CA: Consulting Psychologists Press.

Newby, M., & Fisher, D. (2000). A model of the relationship between university computer laboratory environment and student outcomes. Learning Environments Research, 3, 51–66.

Newhouse, C. P. (1993, July). Are Australian classrooms ready for computers? Paper presented at the Australian Computers in Education Conference, Sydney, Australia.

Newhouse, C. P. (1998). Teachers' responses and classroom learning environments associated with student access to portable computers. Unpublished doctoral thesis, Curtin University of Technology, Perth, Australia.

Olson, J. (1988). Schoolworlds – microworlds. Oxford, UK: Pergamon Press.

Rennie, L. (1997, July). A significant issue in interpreting and reporting quantitative research. Paper presented at the annual conference of the Australasian Science Education Research Association, Adelaide, Australia.

Rentoul, A. J., & Fraser, B. J. (1979). Conceptualization of enquiry-based or open classroom learning environments. Journal of Curriculum Studies, 11, 233–245.

Rieber, L. P. (1994). Computers, graphics, and learning. Dubuque, IA: Wm. C. Brown Communications.

Riel, M. (1989). The impact of computers in classrooms. Journal of Research on Computing in Education, 22, 180–190.

Rowe, H. A. H. (1993). Learning with personal computers. Melbourne, Australia: Australian Council for Educational Research.

Rysavy, S. D. M., & Sales, G. C. (1991). Cooperative learning in computer-based instruction. Educational Technology Research and Development, 39(2), 70–79.

Salomon, G. (1994). Differences in patterns: Studying computer enhanced learning environments. In S. Vosniadou, E. DeCorte, & H. Mandl (Eds.), Technology-based learning environments (pp. 79–88). Heidelberg, Germany: Springer-Verlag.

Schofield, J. W. (1995). Computers and classroom culture. New York: Cambridge University Press.

SPSS Inc. (1990). SPSS for the Macintosh® 4.0 (Version Macintosh 4.04) [Statistical analysis]. Chicago: Author.

Taylor, P. C., & Fraser, B. J. (1991, April). CLES: An instrument for assessing constructivist learning environments. Paper presented at the annual meeting of the National Association for Research in Science Teaching, Fontana, WI.

Teh, G. P., & Fraser, B. J. (1995). Development and validation of an instrument for assessing the psychosocial environment of computer-assisted learning classrooms. Journal of Educational Computing Research, 12, 177–193.

Thompson, B. (1996). AERA editorial policies regarding statistical significance testing: Three suggested reforms. Educational Researcher, 25(2), 26–30.

Van Den Akker, J., Keursten, P., & Plomp, T. (1992). The integration of computer use in education. International Journal of Educational Research, 17, 65–76.

Welle-Strand, A. (1991). Evaluation of the Norwegian Program of Action: The impact of computers in the classroom and how schools learn. Computers and Education, 16, 29–35.

Woods, J. D. (1995). Teaching effectiveness: Using students' perceptions of teaching style and preferred learning style to enhance teaching performance. Unpublished doctoral thesis, Curtin University of Technology, Perth, Australia.


Wubbels, T. (1993). Cross-national study of learning environments. In D. L. Fisher (Ed.), The study of learning environments, Volume 7 (pp. 112–120). Perth, Australia: Curtin University of Technology.

Wubbels, T., Brekelmans, M., & Hooymayers, H. (1991). Interpersonal teacher behavior in the classroom. In B. J. Fraser & H. J. Walberg (Eds.), Educational environments (pp. 141–160). Oxford, UK: Pergamon Press.

C. PAUL NEWHOUSE

School of Education
Edith Cowan University
2 Bradford Street
Mt Lawley, Western Australia 6050
Australia
E-mail: [email protected]