
School Effectiveness and School Improvement, 2004, Vol. 15, Nos. 3–4, pp. 337–376

Effective School Improvement in Mathematics

A.A.M. Houtveen¹, W.J.C.M. van de Grift², and B.P.M. Creemers³

¹ISOR/Institute of Educational Research, University of Utrecht, The Netherlands; ²Dutch Inspectorate of Education, Utrecht, The Netherlands; ³University of Groningen, The Netherlands

ABSTRACT

This article addresses the evaluation of the Mathematics Improvement Programme. The results show that the programme improved the learning results of pupils in grade 3 by half a standard deviation and reduced the percentage of struggling pupils to less than 1%. Fifteen percent of the variance in pupil results is to be explained at the school level. About a quarter of this 15% can be explained by differences between the experimental and the comparison group. All of this condition variance is explained by the experimental variables. Five out of 10 implementation features contribute significantly to differences in pupil results.

INTRODUCTION

From the 1980s on, mathematics textbooks in Dutch elementary schools have been based on so-called "realistic didactics". Vital to this approach is eliciting mathematical solutions from the pupils themselves. The textbooks provide realistic contexts with challenging problems and visual models that support the strategies used by the pupils. Teachers are supposed to adjust their instruction to the contributions of the pupils (Treffers & De Moor, 1990). Since the introduction of this realistic mathematics education, pupil results in The Netherlands have gradually declined without a clear cause (Janssen, Van der Schoot, Hemker, & Verhelst, 1998; Wijnstra, 1988).

Address correspondence to: A.A.M. Houtveen, ISOR/Institute of Educational Research, University Utrecht, P.O. Box 80140, 3508 TC Utrecht, The Netherlands. E-mail: [email protected]

Manuscript submitted: June 19, 2002
Accepted for publication: December 4, 2003


Inspired by attempts to link the knowledge bases of school effectiveness and school improvement research and theory, the Mathematics Improvement Programme (MIP) was designed. The programme is characterised by a systematic approach to whole-school improvement in which high-quality instruction adapted to pupils' needs, monitoring of pupil results, teacher beliefs as well as organisational aspects, educational leadership, and intensive guidance form the key elements of the school improvement design.

In a pilot situation, 14 schools spread all over The Netherlands were intensively guided by external change agents to implement the programme and accordingly improve their pupil results with regard to mathematics. The project took 2 or 3 years, depending on the time schools needed to implement the programme to a degree that effects on pupil results could be expected.

To evaluate the effectiveness of the programme, a quasi-experiment was carried out (Van Zoelen & Houtveen, 2000). The central research question in this experiment was: Does implementation of adaptive instruction in grade 3 of elementary schools lead to improvement of pupil achievements in mathematics?

In this article we go into the background and the design of the programme. Next, the research design is described and the answer to the central research question is presented. The article ends with a discussion of the lessons learned for school improvement aimed at improving pupil results.

THEORETICAL FRAMEWORK

Foundations of the MIP-Programme

The predecessor of the MIP-programme was the Dutch National School Improvement Project, carried out between 1991 and 1994. A major goal of the National School Improvement Project was to prevent and reduce disadvantage, especially in reading. The treatment of this project can be divided into targets at the classroom or teacher level and targets at the school level. At the classroom level, the main objectives were: improving teachers' skills in the field of direct instruction and class management, generating an efficient use of time by the students, and improving teachers' skills in monitoring and assessment of pupil results. At the school level, targets were aimed at realising a "results-oriented" school management, in which explicit targets concerning basic skills at school and classroom level are determined in advance, and at increasing the evaluation capacity of the school. Teachers and principals were intensively supported to reach these goals. The support strategy consisted of a combination of multiple elements, like informing the school board, consultation with principals, guidance at the level of the team of teachers as a whole, and coaching of teachers in their classrooms.

In the evaluation study, 27 schools were included, 14 belonging to the experimental group and 13 to the comparison group. From this study, it was concluded that the improvement strategy used in the National School Improvement Project, which was founded on a knowledge base with respect to school effectiveness and successful school improvement projects, and the programme content itself, based on a knowledge base with respect to educational effectiveness and instructional effectiveness (especially in the area of initial reading), turned out to be effective. The planned change strategy led to changes in teaching behaviour, and the students in the experimental group outperformed students in the control group. Unfortunately, the results did not last when a follow-up study was conducted a year later (Houtveen, Booij, De Jong, & Van de Grift, 1999).

The framework of the Dutch National School Improvement Project was used to design a new school improvement programme: the MIP-programme (Van de Vijver & Osinga, 1995). The ultimate goal of the MIP-programme is to improve pupil achievements in mathematics in grade 3¹ of elementary schools. To reach this goal, several improvements at school and classroom level have to take place.

¹ In The Netherlands children enter school at age 4 in grade 1. Formal instruction starts in grade 3.

The MIP-programme is mainly based upon two insights. The first is that educational practice as well as theory development might gain a great deal from improvement designs in which the school improvement and school effectiveness knowledge bases are linked. The second insight regards recent research knowledge with regard to effective teaching and with regard to adapting teaching and instruction to children with diverse learning needs. The following section describes the MIP-programme as an effective school improvement project. The effective teaching components are attended to in the implementation section.

The MIP-Programme as an Effective School Improvement Project

A major aim in the field of school effectiveness has always been to link theory development and research on the one hand, and practice and policy-making, especially school improvement, on the other. The idea was to make knowledge useful for educational practice and policy-making. The next step should be to use practical knowledge for further advances in theory and research. In this way, research and improvement can have a relationship with surplus value for both. In reality, this relationship is often troublesome (Creemers & Reezigt, 1997; Teddlie & Reynolds, 2000).

Some arguments can be given against these links. School effectiveness and school improvement have partly different missions and different responsibilities and priorities. School effectiveness is essentially a research programme that tries to develop a knowledge base of what is effective in education, and to support this knowledge base by empirical findings. School improvement is responsible for innovation, for changes towards better schools, and often cannot wait for a knowledge base. School effectiveness is a research- and theory-oriented programme; school improvement is a practice- and problem-solving-oriented programme. But more important than the different missions is the common mission that school effectiveness and school improvement still share: a mutual involvement in educational quality and the importance of education. As such, the questions in both fields are essentially the same, and there clearly is a need for integrating school effectiveness and improvement more strongly (Gray, Reynolds, Fitz-Gibbon, & Jesson, 1996; Reynolds & Stoll, 1996; Stoll & Fink, 1996; Teddlie, Stringfield, & Burdett, 2003).

School improvement is often defined as a systematic, sustained effort aimed at change in learning conditions and other related internal conditions in one or more schools, with the ultimate aim of accomplishing educational goals more effectively (Fullan, 1991, 1994; Hopkins, 1987; Van Velzen, Miles, Ekholm, Hameyer, & Robin, 1985). More recently, Hopkins, Ainscow, and West (1994) defined school improvement as an approach to educational change that enhances student outcomes as well as strengthens the school's capacity for managing change. In this definition, school improvement can be regarded:

• as a vehicle for planned educational change;
• as particularly appropriate during times of centralised initiatives and innovation overload, when there are competing reforms to implement;
• as usually necessitating some form of external support;
• as having an emphasis on strategies for strengthening the school's capacity for managing change; while
• raising student achievement (broadly defined), through specifically focusing on the teaching–learning process (Hopkins et al., 1994, p. 43).


School improvement efforts do not contribute automatically to school effectiveness. There is a lot of improvement going on that has little relevance for effectiveness, because it does not aim at student outcomes at all (Louis & Smith, 1991). Furthermore, only a small part of school improvement is research based (Stringfield, 1995). And even if improvement projects refer to school effectiveness, the results are often not interpretable in terms of school effectiveness because they are not systematically planned, carried out, and evaluated (Creemers & Reezigt, 1997; Teddlie & Reynolds, 2000).

To be of any importance for school effectiveness, school improvement should use the school effectiveness knowledge base and be directed (at least to some extent) to the application of this knowledge as a focused intervention, emphasise (high-fidelity) implementation, emphasise outcomes, and use evaluation techniques and preferably (quasi-)experimental designs. Various authors have pointed out that school improvement does not live up to these expectations most of the time. There are currently no empirically validated improvement theories, nor are there clear notions about the range of educational levels that improvement should deal with simultaneously (Creemers & Reezigt, 1997).

Creemers and Reezigt (1997) state that, to link school improvement with school effectiveness, school improvement should fulfil several requirements that concern the various stages of improvement projects:

1. Phrasing the improvement problem in terms of school effectiveness. This means that, next to a diagnosis of what is wrong and should be improved, it should also be made clear what is to be expected of successful improvement in terms of pupil outcomes (Hopkins, 1995).

2. Making use of the knowledge base of school effectiveness to outline the actual contents of the improvement project. This implies references to theories, concepts, and factors of school effectiveness, but also arguments for the choice of levels. The classroom level is supposed to be the starting point for improvement (Joyce & Showers, 1995; Reynolds, Hopkins, & Stoll, 1993). However, innovations in classrooms need support at the school level for further incorporation (Fullan, 1991; Fresko, Robinson, Friedlander, Albert, & Argaman, 1990). The school effectiveness theories and models can be helpful in examining the way the levels are interacting, and in finding out which factors are important at which level and which persons should be involved at which level (Evans & Teddlie, 1995).


3. Developing a clearly conceptualised improvement plan, which preferably allows enough time for implementation, for example at least 2 or 3 years (Fullan, 1991; Pink, 1990; Stringfield, 1995).

4. Evaluation of the implementation should be carried out. School improvement projects should be set up as experiments or quasi-experiments (Hopkins, 1995).

5. Discussion of the results and conclusions: What turned out to be effective and why, and what are important new insights for school improvement and for school effectiveness as well?

The stages outlined above rest on the assumption that there is a school effectiveness knowledge base and that it can be important for school improvement. A serious problem for school improvers is that the school effectiveness knowledge base is in fact quite small and under-tested (Stringfield & Herman, 1996). The school effectiveness knowledge base is still under construction, and research into the validity of some essential relationships is lacking. Because of this, it is impossible to give simple advice about what to do to improve schools and what to expect as a result (Teddlie & Reynolds, 2000).

Recently, several projects have started to integrate school effectiveness and school improvement (see, for an overview: Borman, Hewes, Overman, & Brown, 2003; Herman, 1999; Stoll, Reynolds, Creemers, & Hopkins, 1996; Stringfield, 1995). Characteristic of these projects is that there is interaction between researchers and improvers throughout the project. Further, these projects all share a clear definition of the problem that should be overcome, in terms of student outcomes and classroom strategies to enhance these outcomes within the context of the school. Often, the outcomes are clearly specified for one school subject or elements of a school subject.

The content of the projects is a balanced mix of the effectiveness knowledge base and the concepts of school improvement. The projects have detailed designs, both for the implementation of school improvement and for the evaluation in terms of empirical research. By means of a research component integrated into the projects from the start, it is possible to test effectiveness hypotheses and to evaluate the improvement outcomes at the same time. The use of control groups is essential in this respect, and various projects now incorporate control groups or choose to compare their results to norm groups on the basis of nation-wide tests. Many projects are also longitudinal in their designs.


The MIP-programme can be considered an example of such an integrated project. The programme can be characterised as a whole-school design, involving the school level and the classroom level, external guidance, and external evaluation. It is systematically planned, carried out, and evaluated in a longitudinal quasi-experimental design. Besides that, an implementation study is carried out to gain more insight into the successes and failures of the process of change itself. There is frequent interaction between researchers and improvers throughout the project. Improvers gain strongly from the implementation measures carried out regularly.

The outcomes of the project are clearly specified. They consist of outcomes in terms of staff development as well as student outcomes.

THE MIP-PROGRAMME DESIGN

We have sought to identify the key elements of a school improvement programme that facilitate effective teaching, and to work out how each of these elements should be designed so that they operate effectively and in alignment with each of the other elements (Van Zoelen & Houtveen, 2000). This resulted in what we refer to as the MIP-programme design for effective school improvement. School design models are hardly used in The Netherlands, although they have become highly significant in the USA (Berends, Bodilly, & Kirby, 2002; Herman, 1999; Stringfield, Ross, & Smith, 1996), as well as in the Australian context (Hill & Crévola, 1999).

The first pillar of the design is the assumption that all students can master a subject given sufficient time and appropriate instruction. To accomplish this, the educational programme has to be adapted to pupil needs. Pupil learning is seen as a consequence of the responsiveness of the learning environment, rather than the result of differences in pupil learning characteristics and basic abilities. Furthermore, the task of the school is to provide learning environments that enable all pupils to experience success, regardless of initial ability. Central in this approach is, firstly, that effective instruction is not just good teaching. Teachers must attend to ways of adapting instruction to pupils' levels of knowledge, motivating pupils to learn, managing pupil behaviour, grouping pupils for instruction and testing, and evaluating pupils.

The second pillar of the design is that it involves elements of both school and classroom organisation. At the school level, the principal may establish policies concerning grading, evaluation, and promotion practices. At the classroom level, teachers control the grouping of students within the class, teaching techniques, classroom management methods, informal incentives, frequency and form of tests, and so on. These elements of school and classroom organisation are at least as important for student achievement as the quality of teachers' lessons.

The third pillar of the design is the careful planning of the programme and the use of an integrated implementation strategy. The fourth pillar of the design is intensive external guidance of the innovations. The fifth pillar is external evaluation of the effects of the programme.

The following is a brief description of the key elements of the design.

Beliefs and Understandings

Meeting the diverse needs of pupils requires professionals who have a deep understanding of teaching and learning and a belief in the capacity of all pupils to attain high standards, given the right support and sufficient time.

In the MIP-programme this was translated into the requirement that teachers and school administrators participate in intensive professional development involving the whole team, and that they appoint a coordinator with significant time release to facilitate implementation of the programme. Furthermore, it was required that the teachers of grade 3 participated in the external evaluation of the programme.

Standards and Targets

The schools in the programme are required to make use of one of the textbooks based on realistic mathematics education that are available in The Netherlands. Herewith, the content standards are given. The performance standards were formulated in terms of minimising the number of pupils that scored at the lowest level on a standardised test (Cito, 1992).

At the teacher level, performance standards were formulated as well. Teachers were supposed to implement adaptive instruction (see below) to a score of at least 50 on the instruments used by the researchers to measure implementation.

Pupils differ in the extent to which they need instruction and support while learning, and in the amount of time they need to process the subject material successfully. The classroom is the centre for dealing with differences among pupils. To fulfil the assumption of the programme that all pupils can master a subject given sufficient time and appropriate instruction, six elements of adaptive instruction must be simultaneously addressed in the programme.


Monitoring and Assessment

The first element is the identification of pupil learning needs. Criterion-referenced and curriculum-based diagnostic techniques are needed to determine the pupils' level when beginning a unit of instruction in the curriculum. Frequent diagnostic checks are needed to monitor pupil progress toward curriculum objectives, to be able to give corrective instruction, and to determine mastery of a curriculum unit (Barton, 2002; Cohen, 1980; Fuchs & Fuchs, 1986; Guskey, 2003; L'Hommedieu, Menges, & Brinko, 1990).

The next three elements of adaptive instruction concern optimising instruction: optimising the quality of instruction, optimising the amount of instruction time, and providing high success rates.

Quality of Instruction

Heterogeneous groups appear to give the best opportunity to learn for both low-achieving pupils and average pupils (Gamoran, 1992; Hallam & Toutounji, 1996; Houtveen & Van de Grift, 2001; Oakes, Gamoran, & Page, 1992; Reezigt, 1993; Slavin, 1987, 1996). High-quality instruction given to the whole class is essential.

The most important aspect of instructional quality is the degree to which the lesson makes sense to the pupils. This includes presenting information in an orderly way (Kallison, 1986), noting transitions to new topics (Smith & Cotton, 1980), using clear and simple language (Land, 1987), using many vivid images and examples (Hiebert, Wearne, & Taber, 1991; Mayer & Gallini, 1990), and frequently restating essential principles (Maddox & Hoole, 1975). Lessons should be related to pupils' background knowledge, using such devices as advance organisers (Nunes & Bryant, 1996; Pressley et al., 1992), or simply reminding pupils of previously learned material at relevant points in the lesson. Use of media and other visual representations can also contribute to quality of instruction (Hiebert et al., 1991; Kozma, 1991).

Clear specification of lesson objectives to pupils (Melton, 1978) and substantial cohesion between what is taught and what is assessed (Cooley & Leinhardt, 1980; Creemers, 1994) contribute to instructional quality, as do frequent formal or informal assessment to see that pupils are mastering what is being taught (Crooks, 1988; Kulik & Kulik, 1988) and immediate feedback to pupils on the correctness of their performance (Barringer & Gholson, 1979).

Instructional pace is also partly an issue of quality of instruction. Frequent assessment of pupil learning is critical for teachers to establish the most rapid instructional pace consistent with the preparedness and learning rate of all pupils. Furthermore, a quick pace will stop pupils from becoming disengaged and bored, and thus will help in keeping pupils actively engaged in learning (Muijs & Reynolds, 2000; Pressley, Goodchild, Fleet, Zachowski, & Evans, 1989).

In short: Teachers who explicitly model, scaffold, explain strategies, give corrective feedback, and practise to mastery contribute highly to the academic success of their pupils (see for meta-analyses of the research: Carnine, Dixon, & Silbert, 1998; Dixon, Carnine, & Kameenui, 1992; Dixon, Carnine, Lee, & Wallin, 1998; Ellis & Worthington, 1994; Good & Brophy, 1986; Rosenshine & Stevens, 1986; Slavin, 1996; Veenman, 1992).

Although most Dutch schools use methods based on realistic mathematics education, teaching practices did not change accordingly (Gravemeijer, 1990; Harskamp, 1988; Willemsen, 1994). Therefore, in the MIP-programme the following domain-specific instruction principles are formulated: sound preparation of formal calculation; context-bound instruction; act; verbalise; use of models; focus on essential understandings and skills; and, finally, attending to automation (especially for the struggling learners) (Van de Vijver & Dijkstra, 1999).

Instruction Time

In theoretical models on learning at school (Bloom, 1976; Carroll, 1963; Harnishfeger & Wiley, 1978), instruction time and its efficient use are considered important determinants of learning at school. The connection between time spent and pupil results was established in a large number of empirical research projects (Carnine et al., 1998; Dixon et al., 1998; Scheerens & Bosker, 1997).

In the MIP-programme, optimal use of time is stressed, in terms of classroom management as well as in terms of time spent on explicit instruction of skills and integration of skills.

High Success Rates

The third aspect of optimising instruction stresses the relationship between learning and emotion. A certain amount of self-confidence turns out to be a prerequisite for learning. Self-confidence is built on the basis of experienced successes. This implies that teachers have to provide experiences of success for all learners (Ellis & Worthington, 1994). For initially less successful students it is vital to give second chances to demonstrate success after corrective feedback (Guskey, 2003).


The last two aspects of adaptive instruction concern supporting active learning: supporting self-regulated learning and creating a class organisation in which pupils can manage their own learning activities.

Self-Regulated Learning

Since learning is an active process of knowledge acquisition and construction, teachers should take measures that make it possible for pupils to adopt an active learning attitude, and gradually pass on responsibility for the learning process to the pupils (Boekaerts, 2002; Ellis & Worthington, 1994).

Explorative Learning Environment

Heterogeneous grouping is not enough to help pupils at risk of school failure. Extending learning and instruction time for these pupils is necessary. In all cases, extension of instruction time for struggling learners demands a classroom organisation in which the remainder of the pupils are able to manage their own learning process. In the MIP-programme this classroom organisation is referred to as an explorative learning environment. Apart from organisational reasons, an explorative learning environment has value in itself, because it contributes to school success and the intrinsic motivation of pupils (Carver & Scheier, 2000; Ryan & Deci, 2000).

Adaptive instruction at the classroom level can only last if the school level supports it. There has to be a good "infrastructure", as Fullan calls it (Fullan, 2003). This means that the changes have to be systematically planned, carried out, and evaluated. Instructional leadership as well as intensive guidance are indispensable. As stated above, independent evaluation is necessary for school improvement to contribute to school effectiveness.

Planned Change

In order to implement a complex school improvement programme that covers 3 to 4 school years, it is necessary to plan changes over time. The process of change is considered to consist of three (overlapping) phases: initiation, implementation, and institutionalisation (Fullan, 2003; Miles, 1986). The initiation phase is about deciding to start the innovation and about developing commitment toward the process. The key activities in this phase are the decision to start the innovation, to accept the above-stated requirements for participation, and to review the current state of teaching practices and pupil results with regard to mathematics. Implementation is the phase of attempted use of the innovation. The key activities occurring during implementation are the carrying out of action plans, monitoring of and feedback on progress, and sustaining of commitment. When the innovation has become part of the school's usual way of doing things, the institutionalisation phase is reached. In this phase, the emphasis is on embedding the programme-specific activities within the school organisation and within the actions of teachers and principals (Fullan, 1991; Teddlie & Reynolds, 2000).

In the MIP-programme, the schools are provided with a detailed scenario in which the activities that ought to be carried out during the change process are worked out in detail for each of the innovation phases (Van de Vijver & Osinga, 1995). Based on this, the schools are expected to make their own scenario, which accounts for the specific situation of, and choices made by, the school (teams). To guarantee high-fidelity implementation of the programme, none of the objects that are summarised in the programme scenario can be left out.

Since high-fidelity implementation is crucial in a quasi-experimental research design, the innovation process is monitored not only at the school and classroom levels but also at the programme level. The external change agents followed a 2-day introduction course in which the programme was outlined. Each innovation year, 4 follow-up study days were organised, in which the change agents accounted for the planning and progress of activities in their schools. Further, they received feedback on the degree to which their guidance activities were in line with the plans and on the degree of teacher progress on the implementation variables as monitored by the researchers. Further skill training was provided as well.

The schools involved in the programme are spread all over the country. In The Netherlands, professional guidance and support for schools is made available through Local Educational Agencies (LEAs). These agencies operate independently of the educational authorities and at the request of the schools themselves. Since change agents support schools from their own LEAs, the LEAs involved are spread over the country as well. To make sure that the individual change agent's work is embedded well within his or her institute, a steering group consisting of the directors of the LEAs has been formed. This steering group meets four times a year to discuss progress.

Leadership and Coordination

To implement a complex innovation such as the MIP-programme, knowledge is needed of realistic mathematics as a subject, of adaptive instruction, and of the process of educational change. The knowledge base of educational change involves, in our opinion, knowledge regarding managing change in schools, as well as knowledge regarding the professional capacities and organisational conditions needed to sustain the innovations of the MIP-programme. As a consequence, the improvement approach depends to a large degree on the leadership provided by the principal of the school, since only he or she is in the position to make sure that each of the design elements of the MIP-programme is attended to and brought into alignment. Powerful instructional leadership is needed, clearly focused on improving pupil achievement. Further, the improvement approach depends on effective coordination and communication strategies and on maintaining consistency across the school regarding goals.

External Guidance

In the MIP-programme, intensive external guidance is provided during the initiation, implementation, and institutionalisation of the programme in the schools. The guidance was directed at the school and team levels and at the individual teachers in an in-service setting. The principal was supported in managing the changes in the school that were necessary to implement the MIP-programme. This involved developing a school-specific implementation plan, creating suitable structures for coordination of the programme, and training in being a coach to the classroom teachers in improving teaching. At the team and teacher levels, support was given by coaching on the application in the classroom of each of the elements of adaptive instruction, using Joyce and Showers' didactic coaching model (see Pajak, 2000). The elements of coaching are the following:

1. study of the theoretical basis of the subject of coaching;
2. demonstration of the new skill by the external change agent within the classroom;
3. practice and feedback: teachers collaboratively plan mini-lessons and prepare materials to apply the new strategy with other teachers who play the role of students;
4. classroom observation and feedback: teachers introduce the element of adaptive instruction at stake into their regular classes. The external change agent observes and gives feedback. At the same time, the principal is trained in taking over the role of coach.

This process of coaching takes several months for each of the aspects of adaptive instruction. In this approach, it is acknowledged that change is difficult and that teachers therefore must overlearn new skills to successfully incorporate them into their repertoires (Joyce & Showers, 1995, 2002).


Independent Evaluation

Independent evaluation is also provided for in the MIP-programme. In the next section, the research design is presented.

RESEARCH DESIGN

The major goal of the MIP-programme is to improve pupil results with regard to mathematics in grade 3. Pupils differ in the extent to which they need instruction and support while learning and in the amount of time they need to process the subject material. The classroom is the centre for dealing with these differences: Teachers have to adapt their instruction and classroom organisation to the different needs of pupils. Within this theoretical framework the main research question is: Does implementation of adaptive instruction lead to improvement of pupils' results in mathematics in grade 3 of elementary education?

In order to answer this question, a quasi-experiment was set up: an untreated control-group design with pretest and posttest (Cook & Campbell, 1979). Fourteen schools were selected to take part in the experimental group. Fifteen schools constituted the comparison group. When the experimental schools had implemented the programme sufficiently, the pre- and posttests were conducted at, respectively, the beginning and end of school year 1998/1999.
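To make the reported variance figures concrete: the abstract's statement that about 15% of the variance in pupil results lies at the school level corresponds to the intraclass correlation from a random-intercept model with pupils nested in schools. The sketch below is a minimal illustration of that decomposition, not the authors' actual analysis; the file and column names (mip_pupils.csv, posttest, pretest, condition, school) are hypothetical placeholders.

```python
# Minimal sketch (not the authors' code): a random-intercept model for
# pupils nested in schools, splitting pupil-level variance into a school
# component and a residual component. All names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("mip_pupils.csv")  # one row per pupil (hypothetical file)

# Posttest regressed on pretest and condition (0 = comparison, 1 = MIP),
# with a random intercept for each school.
model = smf.mixedlm("posttest ~ pretest + condition",
                    data=df, groups=df["school"])
result = model.fit()

school_var = result.cov_re.iloc[0, 0]  # between-school variance
pupil_var = result.scale               # residual (pupil-level) variance
icc = school_var / (school_var + pupil_var)
print(f"share of variance at the school level: {icc:.1%}")
```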

Due to the field situation in which the research was conducted, the demand that schools be placed randomly in either the experimental or the comparison condition could not be met. A group of comparison schools was chosen randomly from the population. Both the schools selected as experimental schools and the schools selected for the comparison group were examined for differences in pupil background characteristics that could be expected to contribute to differences in teacher behaviour and/or to differences in pupil results. Table 1 shows that pretest results, intelligence, age, percentage of girls, percentage of children with low socio-economic backgrounds, and the percentage of pupils from ethnic minority groups turned out to be almost similar in both groups.

A study on implementation preceded the posttest. This study was meant to determine whether the implementation of the experimental variables was sufficiently higher in the experimental group than in the comparison group to expect differences in pupil results due to the teacher behaviour expressed in the experimental variables. A follow-up study on implementation was carried out after the project had ended, to gain insight into the sustainability of the implementation. The experimental variables are the following aspects of adaptive instruction behaviour, as described earlier in the MIP-programme design:

• Monitoring pupils' results, consisting of:
  – setting goals for pupils;
  – diagnosing pupils' academic problems through testing;
  – relating learning results to given instruction;
  – implementing prescribed learning plans for pupils identified as at risk;
  – team discussion of pupil progress.
• Optimising instruction, consisting of:
  – giving extended direct instruction;
  – optimising instruction time;
  – supporting self-confidence of pupils.
• Supporting active learning, consisting of:
  – supporting self-regulated learning;
  – creating an explorative learning environment.

The schools involved in the MIP-programme, the experimental group, received intensive external guidance to implement these variables during a 2- or 3-year period. Obviously, the comparison group received no guidance at all.

Table 1. Pupil Characteristics in the Experimental and Comparison Group Schools at the Beginning of the Experiment.

                                                  Experimental  Comparison  St.    Effect
                                                  group         group       dev.   size
Number of pupils                                     237           311
Score on Cito arithmetic test (pretest)            33.70         34.39       4.56  −0.15
Percentage of low-achieving pupils on pretest        2.6           2.8
IQ                                                 99.53        100.34      14.98  −0.05
Age in months                                      95.45         96.11       6.92  −0.10
Percentage girls                                    45.1          46.4
Percentage low-SES children                         24.5          24.4
Percentage children from ethnic minority groups      4.6           6.8


The researchers carried out systematic monitoring of the implementation of the experimental variables in the behaviour of change agents and teachers, twice each school year. Some of the implementation variables were measured by observation in the classroom, some by written surveys. The observers were trained beforehand until they reached sufficient interobserver reliability (Hubert's kappa > .70). The formative evaluation results provided systematic feedback to the change agents and to the schools.
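For readers who want to reproduce this kind of reliability check: the study reports Hubert's kappa, and the sketch below computes the closely related two-rater Cohen's kappa, which illustrates the same chance-corrected agreement idea. The ratings are invented for illustration.

```python
# Illustrative only: chance-corrected agreement between two observers.
# The study used Hubert's kappa; this shows the related two-rater Cohen's
# kappa, with made-up item ratings on a 5-point observation scale.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # Agreement expected by chance, from the raters' marginal distributions.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

obs_1 = [5, 4, 4, 3, 5, 2, 4, 4, 3, 5]  # hypothetical observer 1
obs_2 = [5, 4, 3, 3, 5, 2, 4, 5, 3, 5]  # hypothetical observer 2
print(f"kappa = {cohens_kappa(obs_1, obs_2):.2f}")  # 0.73, above the .70 bar
```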

To measure pupil effects of the experiment, the mathematics skills of the pupils were assessed with the Cito Mathematics test (Cito, 1992). Versions of this test are available for different grades.

Extensive case studies were made of all the experimental schools, in which the implementation process as well as the guidance process is described. The case studies are based on interviews with principals, teachers, and change agents. The case studies are used in this article to reflect on some of the insights for school improvement and school effectiveness yielded by the MIP-programme.

IMPLEMENTATION

The research on implementation aimed at answering the question of whether the guidance strategy used in the MIP-programme was successful in leading to the implementation of the elements of adaptive instruction that teachers in the project were supposed to implement. This is essential, because it is useless to study the effect of an experiment if it is not certain that the project was actually realised in a field situation to a degree that implementation in the experimental group of teachers outreaches implementation in the comparison group. We measured the extent to which the experimental variables were implemented by the teachers of both the experimental and the comparison group by means of observational instruments (direct instruction, instruction time, explorative learning environment, and self-confidence) and a written questionnaire (setting goals, diagnosing pupils' academic problems through testing, analysing academic problems, implementing learning plans, discussing pupil progress as a team of teachers, and supporting self-regulated learning). The remainder of this section focuses on a description of the instruments used and the degree of implementation.

Monitoring Pupil Results

A key element of adaptive instruction is the identification of pupil learning needs. Criterion-referenced and curriculum-based diagnostic techniques are needed to determine a pupil's level when beginning a new unit in the curriculum. Frequent diagnostic checks are needed to monitor pupil progress toward curriculum objectives and to determine mastery of a unit. Usually the following steps are distinguished in the monitoring process:

• diagnosing pupils' academic problems through testing;
• analysing these problems by relating the learning results to given instruction;
• using data from diagnostic tests and other curricular assessments to develop instructional plans for the whole class and to prescribe individualised learning plans for individual pupils who perform poorly on the assessment measures;
• implementing the prescribed learning plan for a small group of pupils or an individual pupil identified as having academic problems;
• assessing the results (Kool & Van der Leij, 1985).

The monitoring process is supposed to be cyclic: The instructional and learning plans cover 4 to 6 weeks.

The described monitoring process formed the basis for the construction of instruments to measure monitoring of pupil progress. Three Likert scales were constructed to cover the steps distinguished by Kool and Van der Leij (1985): "Diagnosing pupils' academic problems through testing" (3 items); "Relating learning results to given instruction" (8 items); and "Implementing prescribed learning plans for pupils identified as at risk" (8 items). The process in itself is aimless when the goals to be reached in the learning plans are not clear. For this reason we added the scale "Setting goals for pupils", which consists of 6 items. Finally, we added a scale called "Team discussion of pupil progress" (16 items). In such discussions, diagnosed academic problems are not analysed as shortcomings of the individual pupils, but as challenges for the teacher to overcome. Solutions suggested by the teachers in their instruction plans are discussed and, if necessary, consequences are drawn for the school curriculum, the remedial curriculum, or the school organisation.

The reliability of the five scales is sufficient (Cronbach's alpha > .70) (Houtveen, 1997; Van Zoelen & Houtveen, 2000). The scores on the scales were standardised to simplify comparison between the scores on the five scales. For each scale the score can vary between 0 and 100.
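Both scale statistics mentioned here are standard psychometrics. The sketch below shows Cronbach's alpha for internal consistency and the 0–100 standardisation (raw score divided by the maximum score, times 100, the convention the article describes later for its observation instrument); the response matrix is invented, and the 6-item, 5-point format is an assumption based on the scale description.

```python
# Illustrative sketch, not the authors' code: Cronbach's alpha and a 0-100
# standardised scale score. Rows = teachers, columns = the 6 items of the
# "Setting goals for pupils" scale (item format assumed to be 1-5).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

def standardise(raw: float, max_score: float) -> float:
    """Map a raw scale score onto 0-100 by dividing by the maximum."""
    return 100 * raw / max_score

rng = np.random.default_rng(0)
trait = rng.integers(1, 6, size=(30, 1))  # a common level per teacher
items = np.clip(trait + rng.integers(-1, 2, size=(30, 6)), 1, 5)

print(f"alpha = {cronbach_alpha(items):.2f}")  # should exceed .70
print(standardise(raw=22, max_score=30))       # 6 items scored 1-5
```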

A difference in effect on pupil results between the experimental and the control group teachers can be expected if the experimental group outreaches the control group in the degree of monitoring of pupil results. The results are presented in Table 2. Standard deviations are given in the fourth column and effect sizes in the last column.

On each scale except "Diagnosing pupils' academic problems through testing", the differences between the scores of the experimental and the comparison group are statistically significant. Significance, however, does not tell us very much about the size of the difference. We therefore computed the effect sizes. Cohen (1988) evaluates effect sizes of .20 as small, .50 as medium, and .80 as large. This is helpful for the interpretation of the differences in implementation between the teachers in the experimental group and the comparison group.
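Concretely, the effect sizes in Tables 1 and 2 are standardised mean differences of the Cohen's d type: the difference between the group means divided by the standard deviation shown in the table. A quick check against the published figures (a hypothetical helper, not the authors' code):

```python
# Effect size as a standardised mean difference (Cohen's d):
# d = (mean_experimental - mean_comparison) / standard deviation.
def effect_size(mean_exp: float, mean_comp: float, sd: float) -> float:
    return (mean_exp - mean_comp) / sd

print(round(effect_size(33.70, 34.39, 4.56), 2))   # Table 1 pretest: -0.15
print(round(effect_size(66.38, 45.30, 14.37), 2))  # Table 2 extended direct instruction: 1.47
```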

The absence of a difference on "Diagnosing pupils' academic problems through testing" might be due to the fact that most Dutch teachers have already incorporated testing into their repertoire. The average teacher in the comparison group scores almost 82% of the items of this scale positive, so it was difficult for the teachers in the experimental group to do a better job.

Table 2. Implementation Features in the Experimental and Comparison Group Schools in the School Year the Effect Measures on the Pupils Took Place.

                                                        Experimental  Comparison  St.    Effect
                                                        group         group       dev.   size
Number of teachers                                         14            15

Monitoring pupils' results
  Setting goals for pupils                               75.03         70.11      16.40   0.30
  Diagnosing pupils' academic problems through testing   80.84         81.87      14.02  −0.07
  Relating learning results to given instruction         75.67         67.41      30.92   0.27
  Implementing prescribed learning plans for
    pupils identified as at risk                         89.98         38.00      37.80   1.38
  Team discussion of pupil progress                      70.03         39.02      27.79   1.12

Optimising instruction
  Giving extended direct instruction                     66.38         45.30      14.37   1.47
  Optimising instruction time                            71.05         54.75      12.92   1.26
  Supporting self-confidence of pupils                   77.55         74.49      10.69   0.27

Supporting active learning
  Supporting self-regulated learning                     50.01         44.62      10.76   0.50
  Creating an explorative learning environment           67.68         62.06      11.00   0.51


The differences between the experimental group of teachers and the comparison group are, according to Cohen's criteria, slightly better than small for "Setting goals for pupils" and "Relating learning results to given instruction". Really large differences in implementation features are found for "Implementing prescribed learning plans for pupils identified as at risk" and "Team discussion of pupil progress". On these two variables the gain for the experimental group of teachers is more than a standard deviation, which is very large according to Cohen's criteria.

Optimising Instruction

According to the MIP-programme, instruction can be optimised by giving extended direct instruction, by optimising instruction time, and by supporting the self-confidence of pupils.

Giving Extended Direct Instruction

It is possible to place academic tasks on a continuum from well-structured to less structured tasks (Doyle, 1983). Well-structured tasks are tasks that can be broken down into a fixed sequence of steps that consistently lead to the same goal. There is a specific, predictable algorithm that can be followed, which enables students to obtain the same results each time they perform the algorithmic operations. These well-structured tasks are taught by teaching each step of the algorithm directly to students. The Direct Instruction Model has been proven the most effective model to do so, especially for young children and children with lesser academic abilities (Baumann, 1988; Becker & Carnine, 1981; Dixon et al., 1992, 1998; Kameenui & Carnine, 1998; Muijs & Reynolds, 2003; Rosenshine, 1986; Veenman, 1992). The core of the Direct Instruction Model consists of the following activities:

– review and activation of the preceding subject matter;
– presentation and explanation of new subject matter, including demonstration;
– guided practice and coaching: pupils practise what they have just learned and obtain direct feedback from the teacher correcting their mistakes;
– independent or individual seatwork: the pupils proceed from the integration of new knowledge with knowledge already present to the phase of automation;
– periodic repetition of the subject matter.

In contrast, less-structured tasks (often called higher level tasks) cannot be broken down into a fixed sequence of subtasks; they do not have a fixed sequence, and one cannot develop algorithms that pupils can use to solve these tasks (Rosenshine & Meister, 1997). Until recently, pupils were seldom provided with any help in completing less-structured tasks (Merrill, 1994; Tennyson & Cocchiarella, 1986). As a result of emerging research on cognition and information processing, so-called cognitive strategies have been developed in a number of subject areas that students can use to help perform higher level operations. This holds for the field of mathematics problem-solving as well (Carnine et al., 1998; Dixon et al., 1992; Van Parreren, 1988). A cognitive strategy is a heuristic that serves to support or facilitate learners in developing internal procedures that enable them to perform the higher level procedures.

In teaching less-structured tasks, the teacher uses scaffolds to support the pupils as they learn the cognitive strategy, and the cognitive strategy then supports the pupil in attempting to complete the less-structured task. The teaching of cognitive strategies is an example of working in a child's zone of proximal development (Vygotsky, 1978). The concept of a zone of proximal development means that one does not have to wait until a child is "developmentally ready" before beginning instruction. On the contrary, Vygotsky emphasised the role of instruction in fostering development. Scaffolds are forms of support provided by the teacher (or another pupil) to help pupils bridge the gap between their current abilities and the intended goal. They can be seen as adjustable and temporary support that can be removed when no longer necessary (Palinscar & Brown, 1984). Scaffolding procedures reduce the complexities of problems, breaking them down into manageable chunks that the child has a real chance of solving (Bickhard, 1992). Examples of teachers' scaffolds include (a) providing simplified problems; (b) modelling of procedures; and (c) thinking aloud while solving the problem. Scaffolds may also be tools such as cue cards or checklists. Scaffolds are gradually withdrawn or faded as learners become more independent, although students may continue to rely on scaffolds, or periodically request them, when they encounter particularly difficult problems (Carnine et al., 1998; Rosenshine & Meister, 1997).

In The Netherlands, the following principles are formulated in teaching

cognitive mathematics strategies:

a. Context-bound instruction. It is important to place mathematics activities

within the child's daily life (Van Oers, 1990).

b. Emphasise procedures. By following a procedure, the child can find the solution; the procedure is therefore the starting point for calculation.


c. Verbalise. It is important that the pupils put their actions into words, as a

scaffold to internalise the procedure. The communication between teacher and pupils should consist of giving feedback, asking questions about the chosen solution, and jointly thinking of solutions (Van Eerde & Vuurmans,

1987; Van Oers, 1990).

d. Develop and present procedural prompts or models. Models are scaffolds

that are specific to a cognitive strategy. These models are concrete

references on which pupils can rely for support as they learn to apply

the cognitive strategy (Treffers & De Moor, 1990; Van Oers, 1990).

Many tasks, especially in the field of mathematics, are a mixture of well-

structured and less structured parts. As a consequence, teachers have to

provide direct instruction as well as expert scaffolding. In the MIP-programme

teachers are trained to do so. The instruction model used in the programme is

referred to as the Extended Direct Instruction Model, in which the direct

instruction approach and the cognitive strategy instruction approach are

combined.

The extent to which the teachers apply Extended Direct Instruction is

determined with the help of observations. We constructed an event-sampling

instrument consisting of 23 items, which can be scored on a 5-point scale.

The main stages of the Direct Instruction Model and the main cognitive

strategies and scaffolds to be used are operationalised in these 23 statements

(see Houtveen & Overmars, 1996, for a description of the instrument).

Each one of the statements is scored according to the quality in which

the behaviour described in the statement appears. The score is standardised

by dividing the actual score by the maximum score and multiplying by

100. Consequently, the scores can vary between 0 and 100. In order to

get a score for each teacher, two observations were made at different

moments in time. In the analyses, the average score of the two observations

was used.
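The scoring arithmetic can be made concrete with a small sketch. The following Python fragment is illustrative only: the 23 items and the 5-point scale come from the text above, but the function names and the assumption that items are scored 1 to 5 are ours.

```python
# Illustrative sketch of the standardisation described above: the actual
# score is divided by the maximum score and multiplied by 100, and the
# standardised scores of two observations are averaged per teacher.
# The 1-5 item scale is an assumption based on the "5-point scale" above.

def standardise(item_scores, max_per_item=5):
    """Rescale a summed item score to the 0-100 range."""
    actual = sum(item_scores)
    maximum = max_per_item * len(item_scores)
    return 100 * actual / maximum

def teacher_score(observation1, observation2):
    """Average the standardised scores of two observations."""
    return (standardise(observation1) + standardise(observation2)) / 2

# Example: two observations of the 23-item instrument.
obs1 = [4] * 23   # first observation, quality 4 on every statement
obs2 = [3] * 23   # second observation, quality 3 on every statement
print(teacher_score(obs1, obs2))  # 70.0
```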

Table 2 shows that very large differences in implementation features are found for ''Giving extended direct instruction''. The difference in favour of the teachers in the experimental group is far more than a standard deviation, which counts as a very large effect according to the criteria of Cohen (1988).

Optimising Instruction Time

When planning instructional activities, time should be considered as an

important instructional principle. One of the aspects of time that has a direct


impact on pupil learning is allocated time (Fisher et al., 1980). Allocated time

is the maximum amount of time designated for a student to learn specific

content or a specific skill. Teachers use allocated time differently. Research

has suggested that, compared with less effective teachers, effective teachers spend 15% less time on management and 50% more time on instruction and interactive activities, such as questioning, answering, providing corrective feedback, or explaining. Additionally,

effective teachers organise their time so they can spend at least some time with

the total group, in small groups and with individuals (Borg, 1980; Creemers,

1994; Kindsvatter, Willen, & Ishler, 1988). Therefore, instruction time is

included as an experimental variable.

Twenty minutes of a mathematics lesson were used to monitor the amount of time the teacher was actually involved in instruction and interactive activities. The monitoring during observation was done with the help of a time-sampling instrument (Houtveen & Overmars, 1996). Observations were made within units of 20 s: 7 s were used to observe the teacher's behaviour, and during the subsequent 13 s the most dominant behaviour was scored. By summing the scores over the total of 60 units, we were able to determine the percentage of management and instruction time. In order to get a score for each teacher, two observations were made at different moments. In the analyses, the average score of the two observations was used.
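Sixty units of 20 s cover exactly the 20-minute observation window, so each unit represents 1/60 of the observed time. A minimal sketch of the aggregation, with illustrative behaviour labels (the article names only management and instruction as the categories of interest):

```python
from collections import Counter

def time_use_percentages(unit_codes):
    """Turn 60 dominant-behaviour codes (one per 20-second unit)
    into percentages of the observed lesson time."""
    assert len(unit_codes) == 60, "expected one code per 20-second unit"
    counts = Counter(unit_codes)
    return {behaviour: 100 * n / len(unit_codes)
            for behaviour, n in counts.items()}

# Example: 45 instruction-dominated units and 15 management-dominated units.
codes = ["instruction"] * 45 + ["management"] * 15
print(time_use_percentages(codes))  # {'instruction': 75.0, 'management': 25.0}
```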

The results show a large difference in the use of allocated time between the teachers in the experimental and the control group schools (see Table 2). The effect size is 1.26 standard deviations, which, according to Cohen (1988), is a very large effect.

Supporting Self-Confidence

A certain feeling of self-confidence is necessary to make learning possible.

Self-confidence is based on experienced success. Effective students expect to

be successful when confronted with a task. When successful on tasks,

effective students attribute their successes to their own efforts and abilities.

They believe self-improvement is possible and are continually motivated

toward this end (Ellis & Worthington, 1994). There is considerable evidence

that high success rates are correlated positively with student learning out-

comes and low success rates are correlated negatively (Anderson, Evertson, &

Brophy, 1979; Fisher et al., 1980). In addition to increasing academic achievement, successful experiences on tasks relate positively to internalised

student attributions of success (e.g., personal ability and effort) (Anderson,

Stevens, Prawatt, & Nickerson, 1988). Students who experience frequent


failure tend to attribute their successes to external factors (e.g., luck, task ease). Children who experience frequent failure may, over a period of time, exhibit behavioural characteristics associated with ''learned helplessness''

and may engage in task avoidance behaviour (Adelman & Taylor, 1983;

Thomas & Pashley, 1982).

As a consequence, the rate of success at which a student completes a task

should be considered as a critical instructional principle (Ellis & Worthington,

1994). This principle assumes that all students can master a subject given sufficient time and appropriate instruction (Block, 1980). This instructional principle is the first pillar of the MIP-programme and is taken into account in each of the aforementioned experimental variables. The experimental variable at hand is explicitly aimed at measuring the degree to which teachers support the self-confidence of all pupils by giving tasks that pupils can complete successfully, by

giving children sufficient time to answer questions, by praising children when

answers are correct, and by avoiding negative feedback.

The extent to which teachers support the self-confidence of their pupils is

also determined by means of observations. The event-sampling instrument

consists of 9 statements, which can be scored on a 5-point scale, according to

the quality in which the behaviour described in the statement appears

(Houtveen & Booij, 1994). The score is standardised and can vary between 0

and 100.

For each class in the experimental and comparison group schools two

observations were made. The average score of these measurements constitutes

the score of each class.

Table 2 shows a difference in implementation between the experimental

group and the comparison group of 3 points. Although the difference is not very large (the effect size is .27), it is still statistically significant at the 1% level.

Supporting Active Learning

As stated in the description of the MIP-programme above, the extension of

instruction time for struggling learners demands a classroom organisation in

which the remainder of the students are able to manage their own learning

process during the time the teacher is involved in small group instruction with

the struggling learners. But apart from this organisational reason there is a

more profound reason to organise classrooms in a way that invites pupils to

regulate and monitor their own behaviour and to assist pupils in becoming

independent and self-regulatory. Research shows clear relationships between


self-management of pupils and learning results (Brown, 1978; Ellis & Larkin,

1998; Ellis & Worthington, 1994). In general, effective learners differ from

ineffective learners in their ability to regulate and monitor their own behaviour

in terms of motivation, socialisation, academic, and cognitive demands.

Effective learners, for example, have an internal locus of control, actively use

prior knowledge and skills to gain new knowledge and skills, and they actively

work to self-regulate their thoughts and actions (Boekaerts, Pintrich, &

Zeidner, 2000; Ellis & Worthington, 1994).

Two elements of adaptive instruction that must be addressed in the MIP-programme follow from the above: supporting pupils in self-regulated learning and

creating an explorative learning environment within the classroom. Both

elements are included as variables in the evaluation research.

Supporting Self-Regulated Learning

The degree to which teachers promote self-regulated learning in their class-

room is measured with a written questionnaire consisting of 15 statements.

Teachers score the statements on a 6-point scale, ranging from ''never'' to ''very often''. The

scores again are standardised and can vary between 0 and 100 (Houtveen &

Booij, 1994).

As can be learned from Table 2, medium-size differences are found for both

variables between the experimental and comparison group schools. Support-

ing self-regulated learning is a rather new teacher behaviour in Dutch ele-

mentary schools, as can be concluded from the rather low scores on this scale

found in the experimental group even at the end of the programme. Yet the experimental group scores about 5 points higher than the comparison group. This is an effect size of .50, which, according to Cohen (1988), is a medium effect.

Creating an Explorative Learning Environment

The degree to which the classroom organisation can be considered explorative

is measured by means of observation of two lessons. This event-sampling

instrument consists of 14 statements, which can be scored on a 5-point scale.

The statements are scored according to the quality in which the organisational

characteristic is put into practice (Houtveen & Booij, 1994).

When it comes to the degree of realisation of an explorative learning

environment within the classroom, teachers in the experimental group significantly outperform the teachers in the comparison group by 5 points.

This is an effect size of .51, which can be seen as a medium effect.


EFFECTS ON PUPILS’ MATHEMATICS PERFORMANCE

In choosing an effect variable, the problem arose that there are no

standardised tests available in The Netherlands that are designed for use at

the very start of a school year. In consultation with the Dutch Institute for

Test Construction (Cito), we decided to use the test meant for the end of

the foregoing school year as the pretest. As a pretest, a test measuring

readiness for formal mathematics education is used. This test (‘‘Arranging’’)

consists of the following parts: classifying, serialising, comparing, and counting, with a total of 42 items. The posttest contains 53 items and consists of the

following parts: counting and arranging; dividing and compounding

numbers; adding up, subtracting and multiplying; measuring, time, and

money. For both tests the scores are determined by adding up the correct

answers.

In order to be able to correct the results for pupils’ individual charac-

teristics, we used intelligence, socioeconomic background, and age as control

variables. An analogy test was used to determine intelligence. This test is a

subtest of the nonverbal intelligence test SON-R (Laros & Telligen, 1991). It

aims at measuring the abstract reasoning skills of children between 5 1/2 and

17 years of age. The subtest contains 21 items. The homogeneity of the test

was sufficient (Cronbach’s alpha .79). Socioeconomic background was deter-

mined by means of the education of both parents and the ethnic background of

the pupils.

The experiment is considered successful when the scores of the experi-

mental group on the mathematics posttest are significantly higher than the

comparison scores, while the other circumstances, apart from the experimental condition, remain the same.

Table 3 shows a significant difference between the raw posttest scores of the

pupils in the experimental and the comparison group. The effect size is .40.

After correction for the pretest, the effect size becomes .51. When we take the

differences on sex, intelligence, SES, ethnicity, and age between the ex-

perimental group and the comparison group into account, the effect size increases only slightly, to .52. This, of course, is due to the fact that the differences between the experimental and comparison group on these measures are negligible (see Table 1). The experiment turns out to be a success: Adaptive

mathematics instruction improves learning results.
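As a check on the arithmetic, the raw effect size follows directly from the figures in Table 3, with s read as the standard deviation reported there (Cohen's d):

\[
d = \frac{\bar{X}_{\text{exp}} - \bar{X}_{\text{comp}}}{s}
  = \frac{42.80 - 39.42}{8.49}
  \approx .40
\]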

A lot of energy in the MIP-programme was devoted to adapting teaching to pupils with diverse learning needs. Therefore, an important question that


remains to be answered, apart from the overall success of the MIP-programme, is whether the proportion of struggling learners in the experimental group has decreased. Struggling learners can be defined as pupils who are at risk of

referral to special education. Both the pre- and the posttest have facilities to

identify struggling learners. These facilities are used in this research. The

results are shown in Table 4.

At the beginning of the experiment, 2.6% of the pupils in the experimental

group were identified as struggling learners. In the comparison group, this

percentage was about the same (2.8%). At the end of the experiment, this percentage had decreased to 0.8% in the experimental group, whereas the

comparison group ended up with 7.1% struggling learners. This difference

between experimental and comparison group is significant. The experiment

turns out to be a success in this regard as well: Adaptive mathematics

instruction improves the learning results of struggling learners.
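The article does not state which significance test underlies this claim. Purely as an illustrative re-analysis, the posttest difference can be checked with a chi-square test on the approximate counts implied by Table 4 (0.8% of 237 is roughly 2 pupils; 7.1% of 311 is roughly 22 pupils):

```python
from scipy.stats import chi2_contingency

# Approximate posttest counts of struggling learners, reconstructed
# from the percentages and group sizes in Table 4. This is our own
# illustrative check, not the authors' reported analysis.
struggling_exp, n_exp = 2, 237      # 0.8% of 237
struggling_comp, n_comp = 22, 311   # 7.1% of 311

table = [
    [struggling_exp, n_exp - struggling_exp],
    [struggling_comp, n_comp - struggling_comp],
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")  # p falls well below .05
```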

Table 3. Pupil Characteristics in the Experimental and Comparison Group Schools at the End of the Experiment.

                                                  Experimental   Comparison
                                                  group          group        St. dev.   Effect size
Number of pupils                                  237            311
Score on Cito-mathematics test (posttest)         42.80          39.42        8.49       .40
Score on Cito-mathematics test (posttest),
  corrected for pretest                           43.30          38.98                   .51
Score on Cito-mathematics test (posttest),
  corrected for pretest, gender, IQ, SES,
  ethnicity, and age                              43.34          38.95                   .52

Table 4. Percentages of Struggling Learners in Pretest and Posttest.

                                                  Experimental group   Comparison group
Number of pupils                                  237                  311
Percentage of struggling learners
  according to the pretest                        2.6                  2.8
Percentage of struggling learners
  according to the posttest                       0.8                  7.1


Analysing the Results

A lot of research shows that schools have a clear but modest influence on pupil

results. The added value of schools lies somewhere between 10 and 30% of the

variance in pupils’ results (e.g., Bosker & Witziers, 1996; Brandsma &

Knuver, 1989; Reezigt, Houtveen, & Van de Grift, 2002; Roeleveld, 2003; Scheerens & Bosker, 1997; Wijnstra, Ouwens, & Béguin, 2003). Knowing this, it becomes interesting to see how much of this so-called school variance can be influenced by an experiment like the one we report on in this article. Furthermore, it would be interesting to know which part of the treatment is mostly responsible for the effects. Multilevel regression analysis using the MLwiN programme is used to answer these questions. Table 5 shows the results.

Table 5. Results of the Multilevel Analysis.

                                                        Pupils   School   Condition
Explained variance 0-model a                            .866     .149
Explained variance 0-model b                            .866     .110     .040
Variance to be explained after introduction
  of pupil variables                                    .583     .076     .056
Variance to be explained after introduction of
  pupil variables and implementation variables          .581     .049     .000

                                                        Standardised b   se
Pupil variables:
  Score on Cito-arithmetic test (pretest)               .498*            .037
  Gender (f/m)                                          .081             .071
  IQ                                                    .080*            .035
  Low SES (y/n)                                         -.370*           .099
  Ethnical minority group (y/n)                         .226             .165
  Age (in months)                                       -.044            .046
Implementation variables:
  Monitoring pupils' results:
    Setting goals for pupils                            -.043            .060
    Diagnosing pupils' academic problems
      through testing                                   .109*            .007
    Relating learning results to given instruction     .005             .069
    Implementing prescribed learning plans
      for pupils identified as at risk                  .160*            .039
    Team discussion of pupil progress                   .067             .079
  Optimising instruction:
    Giving extended direct instruction                  .15*             .016
    Optimising instruction time                         .016             .100
    Supporting self-confidence of pupils                .208*            .068
  Supporting active learning:
    Supporting self-regulated learning                  .005             .062
    Creating an explorative learning environment        .126*            .007

*Significant at 5% level.

First, we computed the amount of pupil and school variance in the pupil

results on the posttest. It turned out that about 85% of the variance in pupil

results on the posttest can be explained by differences in individual pupils.

Almost 15% of the variance is school variance, in this case classroom

variance. This finding is not dissimilar to those of other studies in Western

Anglophone countries (Scheerens & Bosker, 1997; Teddlie & Reynolds,

2000). Next, we computed the amount of school variance that is due to

differences in the experimental group and the comparison group. It turns out

that almost 27% of the school variance is due to differences between the

experimental group and the comparison group.
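These percentages follow directly from the variance components of the 0-models reported in Table 5; a minimal sketch of the arithmetic:

```python
# Variance partitioning based on the 0-models in Table 5.
pupil, school = 0.866, 0.149                 # 0-model a
total = pupil + school
print(f"pupil variance:  {100 * pupil / total:.0f}%")   # about 85%
print(f"school variance: {100 * school / total:.0f}%")  # about 15%

# 0-model b splits the school variance into a condition part
# (experimental vs. comparison group) and a remainder.
school_rest, condition = 0.110, 0.040
condition_share = 100 * condition / (school_rest + condition)
print(f"condition share of school variance: {condition_share:.0f}%")  # about 27%
```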

The next step is aimed at explaining the differences between pupils not due

to schools or the experiment. The pupil background variables (scores on the pretest, gender, intelligence, socioeconomic and ethnic background, and age)

were added to the model. Not all pupil variables were found to be significant.

Only the pretest score, intelligence, and socioeconomic background were

found to be significantly related to posttest results. We found no significant

difference for boys and girls and no extra effects for ethnic minority pupils or

for differences in age. Taken together, these variables were able to explain

about 28% of the pupil variance and about 3% of the school variance.

We already learned from Table 1 that the differences in scores on these

pupil background variables between the experimental and comparison group

are very small. So, as could be expected, no condition variance was explained

by these characteristics. This points once more to the homogeneity of both

groups with respect to pupil background characteristics.

In the final model, we added the implementation variables that constitute adaptive instruction to the equation. An important conclusion we can draw from the analyses is that all of the variance in pupil results that could be explained by differences in condition is indeed explained by the experimental variables: The condition variance is reduced to zero. It is striking that the

variance between schools not due to condition is found to be partly explained

by one or more experimental variables as well. Turning back to Table 2, we


notice hardly any differences between the experimental and control group in

the degree of diagnosing pupils’ academic problems through testing. Precisely

this variable explains a significant part of the differences in pupil results. This

indicates that the large variance we noticed in this variable (see Table 2) in

both the experimental and the comparison group is responsible for this effect.

Not every implementation feature showed significant results. Of these

variables, diagnosing pupils’ academic problems, implementing prescribed

learning plans for pupils identified as at risk, giving extended direct

instruction, supporting self-confidence of pupils, and creating an explorative

learning environment were significant. Four of them can be ascribed to

differences in condition and one can be ascribed to ‘‘natural’’ variance within

both groups.

CONCLUSION AND DISCUSSION

On the basis of our research, we can conclude that the MIP-programme design

shows some clear results. The ‘‘infrastructure’’ at the school level as provided

for in the design supported adaptive instruction at the classroom level clearly

and intensively. The result was that the experimental group of teachers scored significantly higher than the comparison group of teachers on all but one element of adaptive instruction with regard to mathematics. The effect sizes found

varied from rather small to very large.

At the pupil level, it is shown that adaptive instruction in mathematics

improves the learning results of pupils in grade 3. A significant difference was

found on the posttest after correcting for the control variables (pretest,

intelligence, gender, SES, and age). Beyond this overall success, the pro-

gramme turned out to reduce the percentage of struggling learners in the

experimental group to less than 1%, whereas the percentage of struggling learners in the comparison group increased by more than 4 percentage points.

Multilevel analysis showed that 15% of the variance in pupil results is to be

explained at the school level. About a quarter of this 15% can be explained by

differences between the experimental and the comparison group. All of this

variance between the conditions is explained by the experimental variables we

used in the study. Only 5 of the 10 implementation features contribute

significantly to differences in pupil results.

These results are only partly an underpinning of the construct of adap-

tive instruction. We regard it as consistent with the construct of adaptive


instruction that the implementation features found to be significant come from

each of the three cornerstones of adaptive instruction: monitoring pupil

results, optimising instruction, and supporting active learning. The nonsignificance of the variable optimising instruction time is disappointing: firstly, because the school and teacher effectiveness research points in another direction, and secondly, because of the large difference in time use between the

experimental and comparison group. Recent research on struggling learners

suggests that just optimising instruction time is not sufficient. These pupils

need considerable extension of instruction time to enable them to catch up

with their peers (Commission on Excellence in Special Education, 2002; Finn,

Rotherham, & Hokanson, 2001; Snow, Burns, & Griffin, 1998).

Following the study we reported on in this article, we conducted a quasi-experiment in which improvement of comprehensive reading through adaptive instruction was the issue (Houtveen, 2002). Furthermore, the effectiveness of a beginning reading programme was evaluated (Houtveen, Mijs, Vernooy, Van de Grift, & Koekebacker, 2003). In these studies, we tried to incorporate more recent insights into educating struggling learners and to overcome some of the

weaknesses of the study of the MIP-programme. The first weakness is the

absence of quantitative data at the school level, especially on educational

leadership and the management of the changes concerning adaptive in-

struction within the schools.

Some colleagues will probably claim that the design in itself is a weakness

of the study, since the design is not a ‘‘true’’ experiment (e.g., Goldstein,

1987). As long as laboratory schools are not a reality, in our opinion quasi-

experimentation together with qualitative data gathering is as close as we can

get to gaining insight into what works and what does not. Of course, there are risks

in this design, experimental mortality being one of them. The experimental

mortality in the follow-up test of the MIP-programme study was so large that

we had to decide not to report on the results.

In the remainder of this section, we will reflect on some of the insights for school improvement and school effectiveness yielded by the MIP-programme. These insights are not directly based on the results of the quasi-experiment we reported on in this article. They are mostly based on the extended case studies, made of all the experimental schools, with regard to the implementation and guidance process.

The process of change is generally considered to consist of three –

overlapping – phases: initiation, implementation, and institutionalisation. The

first insight we would like to report on regards the initiation phase of the change


process. A key activity in the initiation phase is a review of the current state in

the schools as regards the particular innovation. In the MIP-programme, a

school diagnosis was made at the beginning of the project regarding teaching practices, testing habits, and pupil results with regard to mathematics. The

external change agent, principal, and project coordinator used the results of

this review to develop a school-context-specific innovation plan based on the

scenario as provided for in the MIP-programme. The results of the review as

well as the innovation plan were discussed with the team. This is an often-

described procedure within the school improvement literature. New in the

MIP-programme is that the researchers carried out the review of the current state of the teaching practices, using the instruments described in the implementation section of this article. So, the first measurement in the implementation study was agreed upon as the starting point for the change

process. Further, the degree to which teachers were supposed to implement the

behaviour described in the instruments at the end of the project was agreed

upon. This use of research instruments and research data as part of the change

process has some clear advantages. In the research instruments, the desired

teacher behaviour is operationalised to a very concrete level. In discussing the

statements with the teachers it is very clear what the innovation is about, and

what is expected of them. This makes it easier to grasp the complexity of the

changes that are to be implemented. A second advantage is that this practise is

very motivating and very goal-oriented. Since the measurements took place

twice each school year, progress was made visible for the teachers and

principals during the innovation process, as well as the challenges that still

need some work. This made it easier for the schools to persevere with the

programme. Last, but not least, it is clear from this procedure that high fidelity

implementation of the programme is reached to a great extent. Needless to say

that a great deal of time and energy was put into the construction of the

research instruments. In fact, for the most important instruments it took a

separate study, preceding the implementation and effect study. This insight is

in line with recent research on the role of feedback and goal-setting to teachers

and schools (Guskey, 2003; Visscher & Coe, 2002).

In the literature, school improvement is regarded as having an emphasis on

strategies for strengthening the schools’ capacity for managing change

(Hopkins, 1995). This appears to be correct. From our research we learned,

though, that a certain capacity for managing change is a precondition for school improvement that focuses on the teaching–learning process and aims at raising student achievement. This holds especially for leadership.


In this article, it is made clear that some of the experimental schools needed

only 2 years to implement the programme, while others took 3 years. This

longer implementation period was not caused by a delay in implementation at

the teacher level. In fact, it was caused by a delay in the start of the

implementation at the teacher level, due to a lack of educational leadership of

the principal, a lack of trust within the team, and a lack of agreement on the

targets set. This meant that, before the innovation at hand could start, some

schools had to do some work on their capacity for managing change. We

conclude from this that to implement a complex programme like the MIP-

programme we need to review the current state of a school, not only with

regard to pupil results and teaching strategies, but also with regard to the

infrastructure of the school. This implies that a current-state review not only serves as a starting point for a particular innovation, but also serves to determine

whether a school is ready to start an innovation in terms of capacity for

managing change.

Our next insight regards the institutionalisation phase. During the last decade, the importance of the institutionalisation phase has been stressed. Hopkins

and Lagerweij (1996) state, for instance, that although implementation has

received the most attention historically, this has most probably been

disadvantageous to the understanding of the process as a whole. Emphasising

initiation and implementation at the expense of institutionalisation leads to a

very short-term view of innovation. Based on our research, we could not agree

more. It is probably the most important reason for the loss of gain in pupil

results after the external support stopped in our first comprehensive school

improvement project, the Dutch National School Improvement Project.

Furthermore, they suggest that it is probably more helpful to think of the three

phases of the innovation process as a series of overlapping phases, rather than

as a straight line. We would like to go beyond this line of thinking.

Institutionalisation is mostly seen as the phase when innovation and change

stop being regarded as something new and become part of the school’s usual

way of doing things. Miles (1986) sums up the following key activities to

ensure success at this stage: an emphasis on ‘‘embedding’’ the change within

the school structures, its organisation, and resources; the elimination of

competing or contradictory practices; strong and purposeful links to other

change efforts, to the curriculum, and to classroom teaching; availability of

local facilitators for skills training.

The phase of institutionalisation is seen as overlapping the implementation

phase. In our research it turned out that ‘‘implementing’’ the above-mentioned


key activities of institutionalisation right from the start was a prerequisite for

implementing the required changes at teacher level. We conclude from this

that institutionalisation is not a distinct phase in the change process, but partly

a set of activities that has to be built into the change process from the beginning (this holds for the ''embedding'' of change within the school structures and the availability of local facilitators for skills training) and partly

a set of ‘‘contra-indicators’’ for starting the innovation. We believe, on the

basis of our research, that implementation, let alone institutionalisation, of an

innovation has little chance when there is not a strong and purposeful link with

other change efforts in the school and when there are competing or con-

tradictory practices in the school.

Our next remark regards external support in school improvement. Hopkins

et al. (1994) defined school improvement as an approach to educational

change that enhances pupil outcomes as well as strengthening the school’s

capacity for managing change. We agree with this definition, although we

would like to emphasise that strengthening the schools’ capacity for managing

change is hardly a goal in itself, but forms a condition for staff development

aiming at enhancing pupil outcomes. In the Hopkins et al. definition, school

improvement is regarded – among other things – as usually necessitating some

form of external support. We regard this as an understatement when it comes

to implementing a complex project like the MIP-programme. Based on our

research, we claim that intensive and sustained external support is needed to

fulfil this task. Furthermore, the support should be given by three kinds of experts:

experts with regard to what is needed to strengthen the ‘‘infrastructure’’ of the

schools; experts with regard to improving teaching strategies; and last, but

maybe most important, experts with regard to content matter.

REFERENCES

Adelman, H.S., & Taylor, L. (1983). Enhancing motivation for overcoming learning and behavior problems. Journal of Learning Disabilities, 7, 384–392.

Anderson, L.M., Evertson, C.M., & Brophy, J.E. (1979). An experimental study of effective teaching in first grade groups. Elementary School Journal, 79(1), 193–223.

Anderson, L.M., Stevens, L.M., Prawatt, D.D., & Nickerson, J. (1988). Classroom task environments and students' task-related beliefs. The Elementary School Journal, 88(3), 281–295.

Barringer, C., & Gholson, B. (1979). Effects of type and combination of feedback upon conceptual learning by children: Implications for research in academic learning. Review of Educational Research, 49(3), 459–478.


Barton, P.E. (2002). Staying on course in education reform. Princeton, NJ: Statistics & Research Division, Policy Information Center, Educational Testing Service.

Baumann, J.F. (1988). Teaching third-grade students to comprehend anaphoric relationships: The application of a direct instruction model. Reading Research Quarterly, 21(1), 70–90.

Becker, W.C., & Carnine, D.W. (1981). Direct Instruction: A behavior theory model for comprehensive educational intervention with the disadvantaged. In S. Bijou (Ed.), Contributions of behaviour modification in education (pp. 1–106). Hillsdale, NJ: Lawrence Erlbaum Associates.

Berends, M., Bodilly, S., & Kirby, S. (2002). Looking back over a decade of whole-school reform: The experience of New American schools. Phi Delta Kappan, 84(2), 168–175.

Bickhard, M.H. (1992). Scaffolding and self-scaffolding: Central aspects of development. In L.T. Winegar & J. Valsiner (Eds.), Children's development within social context (Vol. 2, pp. 33–52). Hillsdale, NJ: Lawrence Erlbaum Associates.

Block, J.H. (1980). Success rate. In C. Denham & A. Lieberman (Eds.), Time to learn (pp. 95–106). Washington, DC: National Institute of Education.

Bloom, B.S. (1976). Human characteristics and school learning. New York: McGraw-Hill.

Boekaerts, M. (2002). Bringing about change in the classroom: Strength and weaknesses of the self-regulated learning approach. Learning and Instruction, 12(6), 589–604.

Boekaerts, M., Pintrich, P.R., & Zeidner, M. (Eds.). (2000). Handbook of self-regulation. San Diego, CA: Academic Press.

Borg, W.R. (1980). Time in school learning. In C. Denham & A. Lieberman (Eds.), Time to learn (pp. 33–72). Washington, DC: National Institute of Education.

Borman, G.D., Hewes, G.M., Overman, L.T., & Brown, S. (2003). Comprehensive school reform and student achievement: A meta-analysis. Review of Educational Research, 73(2), 125–230.

Bosker, R.J., & Witziers, B. (1996). The magnitude of school effects, or: Does it really matter which school a student attends? Paper presented at the Annual Meeting of the American Educational Research Association, New York.

Brandsma, H.P., & Knuver, J.W.M. (1989). Effects of school and classroom characteristics on pupil progress in language and arithmetic. International Journal of Educational Research, 13, 777–788.

Brown, A. (1978). Knowing when, where and how to remember: A problem of metacognition. In R. Glaser (Ed.), Advances in instructional psychology (Vol. 1, pp. 77–165). Hillsdale, NJ: Lawrence Erlbaum Associates.

Carnine, D.W., Dixon, R.C., & Silbert, J. (1998). Effective strategies for teaching mathematics. In E.J. Kameenui & D.W. Carnine (Eds.), Effective teaching strategies that accommodate diverse learners (pp. 93–113). Columbus, OH: Merrill/Prentice-Hall.

Carroll, J.B. (1963). A model for school learning. In L.W. Anderson (Ed.), Perspectives on school learning, selected writings of John B. Carroll (pp. 19–31). Hillsdale, NJ: Lawrence Erlbaum Associates.

Carver, Ch.S., & Scheier, M.F. (2000). On the structure of behavioural self-regulation. In M. Boekaerts, P.R. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (pp. 41–84). San Diego, CA: Academic Press.

Cito. (1992). Leerlingvolgsysteem voor groep 3 en 4. Rekenen-Wiskunde 1 [Pupil monitoring system for grades 3 and 4]. Arnhem, The Netherlands: Cito.

Cohen, P.A. (1980). Effectiveness of student-rating feedback for improving college instruction: A meta-analysis of findings. Research in Higher Education, 13(4), 321–341.


Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.

Commission on Excellence in Special Education. (2002). A new ERA: Revitalizing special education for children and their families. Washington, DC: Department of Education.

Cook, Th.D., & Campbell, D.T. (1979). Quasi-experimentation: Design and analysis issues for field settings. Boston/London: Houghton Mifflin.

Cooley, W.W., & Leinhardt, G. (1980). The instruction dimensions study. Educational Evaluation and Policy Analysis, 2, 7–25.

Creemers, B.P.M. (1994). The effective classroom. London: Cassell.

Creemers, B.P.M., & Reezigt, G.J. (1997). School level conditions affecting the effectiveness of instruction. School Effectiveness and School Improvement, 7, 197–229.

Crooks, T.J. (1988). The impact of classroom evaluation practices on students. Review of Educational Research, 58(3), 438–481.

Dixon, R., Carnine, D.W., & Kameenui, E.J. (1992). Research synthesis in mathematics: Curriculum guidelines for diverse learners. Monograph for the National Center to Improve the Tools of Educators. Eugene: University of Oregon.

Dixon, R., Carnine, D.W., Lee, D.W., & Wallin, J. (1998). Review of high quality experimental mathematics research. Austin, TX: University of Texas.

Doyle, W. (1983). Academic work. Review of Educational Research, 53, 159–199.

Ellis, E.S., & Larkin, M.J. (1998). Adolescents with learning disabilities. In B.Y.L. Wong (Ed.), Learning about learning disabilities (pp. 669–705). New York: Academic Press.

Ellis, E.S., & Worthington, L.A. (1994). Research synthesis on effective teaching principles and the design of quality tools for educators. Technical Report No. 5. Eugene: University of Oregon.

Evans, L., & Teddlie, C. (1995). Facilitating change in schools: Is there one best style? School Effectiveness and School Improvement, 6, 1–23.

Finn, C.E., Rotherham, A.J., & Hokanson, C.R. (Eds.). (2001). Rethinking special education for a new century. Washington, DC: Thomas B. Fordham Foundation and the Progressive Policy Institute.

Fisher, W., Berliner, D.C., Filby, N.N., Marlieve, R., Cohen, L.S., & Denshaw, M. (1980). Teaching behaviors, academic learning times and student achievement: An overview. In C. Denham & A. Lieberman (Eds.), Time to learn (pp. 7–32). Washington, DC: National Institute of Education.

Fresko, B., Robinson, N., Friedlander, A., Albert, J., & Argaman, N. (1990). Improving mathematics instruction and learning in the junior high school: An Israeli example. School Effectiveness and School Improvement, 1, 170–188.

Fuchs, L.S., & Fuchs, D. (1986). Effects of systematic formative evaluation: A meta-analysis. Exceptional Children, 53, 199–208.

Fullan, M. (1991). The new meaning of educational change. New York: Teachers College Press.

Fullan, M. (1994). Coordinating top-down and bottom-up strategies for educational reform. Ontario, CA: Educational Reform Studies.

Fullan, M. (2003). The new meaning of educational change (2nd ed.). London: Cassell.

Gamoran, A. (1992). Is ability grouping equitable: Synthesis of research. Educational Leadership, 50(1), 11–17.

Goldstein, H. (1987). Multilevel models in educational and social research. London: Charles Griffin.


Good, T., & Brophy, J. (1986). School effects. In M.C. Wittrock (Ed.), Handbook of research on teaching (pp. 570–605). New York: Macmillan.

Gravemeijer, K. (1990). De vernieuwing van het reken- en wiskundeonderwijs in de praktijk [Improvement of mathematics education in practice]. School en Begeleiding, 7(28), 17–21.

Gray, J., Reynolds, D., Fitz-Gibbon, C., & Jesson, D. (1996). Merging traditions: The future of research on school effectiveness and school improvement. London: Cassell.

Guskey, T.R. (2003). How classroom assessments improve learning. Educational Leadership, 60(5), 7–11.

Hallam, S., & Toutounji, I. (1996). What do we know about the ability grouping of pupils by ability? A research review. London: Institute of Education, University of London.

Harnishfeger, A., & Wiley, D.E. (1978). Conceptual issues in models of school learning. Curriculum Studies, 10, 215–231.

Harskamp, E.G. (1988). Rekenmethoden op de proef gesteld [About the implementation of mathematics methods]. Groningen, The Netherlands: RION.

Herman, R. (1999). An educator's guide to schoolwide reform. Arlington, VA: Educational Research Service.

Hiebert, J., Wearne, D., & Taber, S. (1991). Fourth graders' gradual construction of decimal fractions during instruction using different physical representations. Elementary School Journal, 91, 321–341.

Hill, P., & Crévola, C.A. (1999). Key features of a whole-school design approach to literacy teaching in schools. Australian Journal of Learning Disabilities, 4(3), 5–11.

Hopkins, D. (1987). Improving the quality of schooling. Lewes: Falmer Press.

Hopkins, D. (1995). Towards effective school improvement. School Effectiveness and School Improvement, 6, 265–274.

Hopkins, D., Ainscow, M., & West, M. (1994). School improvement in an era of change. London: Cassell.

Hopkins, D., & Lagerweij, N.A.J. (1996). The school improvement knowledge base. In D. Reynolds, R. Bollen, B.P.M. Creemers, D. Hopkins, L. Stoll, & N.A.J. Lagerweij (Eds.), Making good schools. Linking school effectiveness and school improvement (pp. 59–94). London: Routledge.

Houtveen, A.A.M. (1997). De werkvloer [The work place]. In J.L. Peschar & C.J.W. Meyer (Eds.), WSNS op weg. De evaluatie van het 'Weer Samen Naar School' beleid (pp. 69–113). Groningen, The Netherlands: Wolters-Noordhoff.

Houtveen, A.A.M. (2002). Begrijpend leesonderwijs dat werkt [Comprehensive reading instruction that works]. Utrecht, The Netherlands: ISOR.

Houtveen, A.A.M., & Booij, N. (1994). Het meten van integrale leerlingzorg: Adaptief onderwijs en schoolontwikkeling [Measuring inclusion: Adaptive instruction and school improvement]. Utrecht, The Netherlands: ISOR/Onderwijsonderzoek.

Houtveen, A.A.M., Booij, N., De Jong, R., & Van de Grift, W.C.J.M. (1999). Adaptive instruction and pupil achievement. School Effectiveness and School Improvement, 10, 172–192.

Houtveen, A.A.M., Mijs, D., Vernooy, K., Van de Grift, W., & Koekebacker, E. (2003). Risicoleerlingen bij technisch lezen [Pupils at risk. Evaluation of the technical reading and handling of diverse needs programme]. Utrecht, The Netherlands: ISOR.

Houtveen, A.A.M., & Overmars, A.M. (1996). Instructie bij rekenen en wiskunde [Instruction in mathematics education]. Utrecht, The Netherlands: ISOR.

Houtveen, A.A.M., & Van de Grift, W.J.C.M. (2001). Inclusion and adaptive instruction in elementary education. Journal of Education for Students Placed at Risk, 6(4), 389–411.


Janssen, J., Van der Schoot, F., Hemker, B., & Verhelst, N. (1998). Balans van het Reken-Wiskunde onderwijs aan het eind van de Basisschool 3. Uitkomsten van de derde peiling in 1997 [Report on mathematics education at the end of primary education]. Arnhem, The Netherlands: Cito, Instituut voor Toetsontwikkeling.

Joyce, B., & Showers, B. (1995). Student achievement through staff development. Fundamentals of school renewal (2nd ed.). White Plains: Longman.

Joyce, B., & Showers, B. (2002). Student achievement through staff development. Fundamentals of school renewal (3rd ed.). White Plains: Longman.

Kallison, J.M. (1986). Effects of lesson organization on achievement. American Educational Research Journal, 23(2), 337–347.

Kameenui, E.J., & Carnine, D.W. (Eds.). (1998). Effective teaching strategies that accommodate diverse learners. Columbus, OH: Merrill/Prentice-Hall.

Kindsvatter, R., Willen, W., & Ishler, M. (1988). Dynamics of effective teaching. New York: Longman.

Kool, E., & Van der Leij, A. (1985). Planmatig handelen [Monitoring pupil results]. In A. van der Leij (Ed.), Zorgverbreding. Bijdragen uit speciaal onderwijs aan basisonderwijs. Nijkerk, The Netherlands: Intro.

Kozma, R. (1991). Learning with media. Review of Educational Research, 61(2), 179–211.

Kulik, J.A., & Kulik, C.L. (1988). Timing of feedback and verbal learning. Review of Educational Research, 58, 79–97.

Land, M.L. (1987). Vagueness and clarity. In M.J. Dunkin (Ed.), International encyclopedia of teaching and teacher education (pp. 79–95). New York: Pergamon.

Laros, J.A., & Telligen, P.J. (1991). Construction and validation of the SON-R 5–17, the Snijders-Oomen non-verbal intelligence test. Groningen, The Netherlands: Wolters-Noordhoff.

L'Hommedieu, R., Menges, R.J., & Brinko, K.T. (1990). Methodological explanations for the modest effects of feedback from students' ratings. Journal of Educational Psychology, 82(2), 232–241.

Louis, K., & Smith, B. (1991). Restructuring, teacher engagement and school culture: Perspectives on school reform and the improvement of teacher's work. School Effectiveness and School Improvement, 2(1), 34–52.

Maddox, H., & Hoole, E. (1975). Performance decrement in the lecture. Educational Review, 28, 17–30.

Mayer, R.E., & Gallini, J.K. (1990). When is an illustration worth ten thousand words? Journal of Educational Psychology, 82, 715–726.

Melton, R.F. (1978). Resolution of conflicting claims concerning the effects of behavioural objectives on student learning. Review of Educational Research, 48, 291–302.

Merrill, M.D. (1994). Instructional design theory. Englewood Cliffs, NJ: Educational Technology Publications.

Miles, M. (1986). Research findings in the stages of school improvement. New York: Center for Policy Research.

Muijs, D., & Reynolds, D. (2000). School effectiveness and teacher effectiveness: Some preliminary findings from the evaluation of the mathematics enhancement programme. School Effectiveness and School Improvement, 11, 247–263.

Muijs, D., & Reynolds, D. (2003). Student background and teacher effects on achievement and attainment in mathematics: A longitudinal study. Educational Research and Evaluation, 9(3), 289–314.


Nunes, T., & Bryant, P. (1996). Children doing mathematics. Oxford: Blackwell.

Oakes, J., Gamoran, A., & Page, R.N. (1992). Curriculum differentiation: Opportunities, outcomes and meanings. In P.W. Jackson (Ed.), Handbook of research on curriculum (pp. 570–609). Washington, DC: AERA.

Pajak, E. (2000). Approaches to clinical supervision. Norwood, MA: Christopher-Gordon.

Palinscar, A.S., & Brown, A.L. (1984). Reciprocal teaching of comprehension-fostering and comprehension-monitoring activities. Cognition and Instruction, 2, 117–175.

Pink, W.T. (1990). Staff development for urban school improvement: Lessons learned from two case studies. School Effectiveness and School Improvement, 1, 41–61.

Pressley, M., Goodchild, J., Fleet, R., Zachowski, R., & Evans, E. (1989). The challenges of classroom strategy instruction. Elementary School Journal, 58, 266–278.

Pressley, M., Wood, E., Woloshyn, V.E., Martin, V., King, A., & Menke, D. (1992). Encouraging mindful use of prior knowledge: Attempting to construct explanatory answers facilitates learning. Educational Psychologist, 27, 91–109.

Reezigt, G.J. (1993). Effecten van differentiatie op de basisschool [Effects of differentiation in primary education]. Groningen, The Netherlands: RION.

Reezigt, G.J., Houtveen, A.A.M., & Van de Grift, W.J.C.M. (2002). Ontwikkelingen in en effecten van adaptief onderwijs [Developments and effects of adaptive education]. Groningen/Utrecht, The Netherlands: GION/ISOR.

Reynolds, D., Hopkins, D., & Stoll, L. (1993). Linking school effectiveness knowledge and school improvement practice: Towards a synergy. School Effectiveness and School Improvement, 4, 37–58.

Reynolds, D., & Stoll, L. (1996). Merging school effectiveness and school improvement: The knowledge base. In D. Reynolds, R. Bollen, B. Creemers, D. Hopkins, L. Stoll, & N. Lagerweij (Eds.), Making good schools. Linking school effectiveness and school improvement (pp. 94–113). London: Routledge.

Roeleveld, J. (2003). Herkomstkenmerken en begintoets. Secundaire analyses op het PRIMA-cohortonderzoek [Background features and entry test]. Amsterdam: SCO-Kohnstamm Instituut.

Rosenshine, B.V. (1986). Synthesis of research on explicit teaching. Educational Leadership, 44(3), 60–69.

Rosenshine, B.V., & Meister, C. (1997). Cognitive strategy instruction in reading. In S. Stahl & D.A. Hayes (Eds.), Instructional models in reading (pp. 85–109). Mahwah, NJ: The Guilford Press.

Rosenshine, B.V., & Stevens, R. (1986). Teaching functions. In M.C. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 376–392). New York: Macmillan.

Ryan, R.M., & Deci, E.L. (2000). Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. American Psychologist, 55, 68–78.

Scheerens, J., & Bosker, R. (1997). The foundations of educational effectiveness. Oxford: Pergamon Press.

Slavin, R.E. (1987). Ability grouping and achievement in elementary schools. Review of Educational Research, 57, 293–336.

Slavin, R.E. (1996). Education for all. Contexts of learning. Lisse, The Netherlands: Swets & Zeitlinger.

Smith, L.R., & Cotton, M.L. (1980). Effect of lesson vagueness and discontinuity on student achievement and attitudes. Journal of Educational Psychology, 72, 670–675.


Snow, C.E., Burns, M.S., & Griffin, P. (1998). Preventing reading difficulties in young children. Washington, DC: National Academy Press.

Stoll, L., & Fink, D. (1996). Changing our schools. Buckingham, UK: Open University Press.

Stoll, L., Reynolds, D., Creemers, B., & Hopkins, D. (1996). Merging school effectiveness and school improvement: Practical examples. In D. Reynolds, R. Bollen, B. Creemers, D. Hopkins, L. Stoll, & N. Lagerweij (Eds.), Making good schools. Linking school effectiveness and school improvement (pp. 113–148). London: Routledge.

Stringfield, S. (1995). Attempting to enhance students' learning through innovative programs: The case for schools evolving into High Reliability Organisations. School Effectiveness and School Improvement, 6, 67–96.

Stringfield, S., & Herman, R. (1996). Assessment of the state of school effectiveness research in the United States of America. School Effectiveness and School Improvement, 7, 159–180.

Stringfield, S., Ross, S., & Smith, L. (Eds.). (1996). Bold plans for school restructuring: The New American schools designs. Mahwah, NJ: Lawrence Erlbaum Associates.

Teddlie, C., & Reynolds, D. (2000). The international handbook of school effectiveness research. London/New York: Falmer Press.

Teddlie, C., Stringfield, S., & Burdett, J. (2003). International comparisons of the relationships among educational effectiveness, evaluation and improvement variables: An overview. Journal of Personnel Evaluation in Education, 17(1), 5–20.

Tennyson, R.D., & Cocchiarella, M.J. (1986). An empirically based instructional design theory for teaching concepts. Review of Educational Research, 56(1), 40–71.

Thomas, A., & Pashley, B. (1982). Effects of classroom training on LD students' task persistence and attributions. Learning Disability Quarterly, 5, 133–144.

Treffers, A., & De Moor, E. (1990). Proeve van een nationaal programma voor het reken-wiskundeonderwijs op de basisschool. Deel 2 basisvaardigheden en cijferen [Design of a national mathematics education programme for elementary schools]. Tilburg, The Netherlands: Zwijsen.

Van de Vijver, W., & Dijkstra, R. (1999). Het programma Kwaliteitsverbetering Rekenen en Wiskunde [The Mathematics Improvement Programme]. Amersfoort/Leeuwarden, The Netherlands: CPS/GCO.

Van de Vijver, W., & Osinga, N. (1995). Kwaliteitsversterking rekenen- en wiskundeonderwijs [Improving mathematics education]. Bodegraven/Leeuwarden, The Netherlands: SBD Midden Holland en Rijnstreek/GCO.

Van Eerde, D., & Vuurmans, A.C. (1987). Psychologie in het reken- en wiskundeonderwijs [Psychology in mathematics education]. Utrecht, The Netherlands: Freudenthal Instituut.

Van Oers, B. (1990). The development of mathematical thinking in school: A comparison of the action-psychological and the information processing approaches. Journal of Educational Research, 14, 51–66.

Van Parreren, C.F. (1988). Ontwikkelend onderwijs [Developmental education]. Amersfoort, The Netherlands: ACCO.

Van Velzen, W., Miles, M., Ekholm, M., Hameyer, U., & Robin, D. (1985). Making school improvement work: A conceptual guide to practice. Leuven, Belgium: ACCO.

Van Zoelen, E.M., & Houtveen, A.A.M. (2000). Naar effectieve schoolverbetering [Towards effective school improvement]. Utrecht, The Netherlands: ISOR/Onderwijsonderzoek.

Veenman, S. (1992). Effectieve instructie volgens het directe instructie-model [Effective instruction with the direct instruction model]. Pedagogische Studiën, 69, 242–269.


Visscher, A.J., & Coe, R. (Eds.). (2002). School improvement through performance feedback. Lisse, The Netherlands: Swets & Zeitlinger.

Vygotsky, L.S. (1978). Mind in society. Cambridge: MIT Press.

Wijnstra, J. (Ed.). (1988). Periodieke peiling van het onderwijsniveau. Balans van het rekenonderwijs in de basisschool [Periodic assessment of the educational level. Report on mathematics education at the end of primary education]. Arnhem, The Netherlands: Cito.

Wijnstra, J., Ouwens, M., & Béguin, A. (2003). De toegevoegde waarde van de basisschool [Added value of primary schools]. Arnhem, The Netherlands: Citogroep.

Willemsen, T.F.W.P. (1994). Remediële rekenprogramma's voor de basisschool. Een effectstudie [Remedial mathematics programme for elementary education: An effect study]. Groningen, The Netherlands: GION.
