
Formative Questioning in Mathematics: An Open or Closed Case Study?

Introduction

Despite the overwhelming success of Bloom’s Taxonomy of Learning Objectives (Anderson

& Sosniak, 1994) and Black and Wiliam’s (1998) extensive research on formative

assessment, questioning still requires development in the mathematics classroom “to check

and probe understanding” (Ofsted, 2012, p.34-35). In the author’s school, questioning in

mathematics has been identified as requiring improvement; this essay is a proposal for a

micro-research study to empirically investigate both the question types and the questioning

techniques which encourage mathematical thinking and participation, with the aim of identifying effective questioning in this school and providing recommendations for

improvement. In this study, question type refers to the mathematical thinking intended and

questioning technique refers to the strategies that teachers put in place for learners to think

about and respond to questions. A literature review of learning objective taxonomies and

their limitations considers those which are suitable to identify question types and levels of

complexity to probe mathematical understanding; a combination of the most relevant

taxonomies discussed will be used to classify questions for the purpose of this study. A

review of Black and Wiliam’s theoretical framework on formative assessment analyses

techniques to support formative questioning in mathematics. Based on the conclusions of

the literature review, research hypotheses are presented on the question types and questioning

techniques employed in this school. The research design and methodology are described in terms of a mixed-method approach: quantitative research methods, by means of lesson

observations and learner questionnaires, and qualitative methods, using semi-structured

interviews with teachers to give them a voice on their intentions. Ethical issues are

considered and sampling implications and analysis rationale discussed, including strategies

to increase the reliability and validity of the study.

Literature Review

Black et al. (2006) believe effective questioning is essential to develop metacognition and

self-awareness, so learners “can ask questions of each other and the focus can move from

the teacher to the pupils” (p.128). However, research shows that teachers’ questioning is

“not always well judged or productive for learning” (DfES, 2004, p.4) and highlights the need

to use “open, higher-level questions to develop pupils’ higher-order thinking skills” (ibid,

p.18). In the 1950s and 60s there were many attempts to produce a hierarchy for the

complexity of thinking skills (Gall, 1970), however it was Bloom’s (1956) Taxonomy that

experienced “phenomenal growth” (Bloom in Anderson & Sosniak, 1994, p.1) and became

widely accepted as the optimal classification (Gall, 1970).

Bloom et al. (1956) were aware of the limitation that the Taxonomy classifies observable

behaviours, so it is not explicit how learning is constructed; instead they hoped the

classification would contribute to the development of a more complete theory of learning.

However, educational changes occurred mostly at policymaker level rather than having direct

influence on teachers (Anderson & Sosniak, 1994). Other criticisms include the omission of

the term ‘understanding’ (Furst, 1994) and the hierarchy implying that ‘knowledge’ leads to

intellectual abilities (Bereiter & Scardamalia, 1999), which are addressed (Figure 1) in a

revised taxonomy (Anderson et al., 2001).

Figure 1 – Bloom’s Original and Revised Taxonomies (Image from ODU, 2013)

Interchanging the top two tiers of the hierarchy perhaps reflects the importance of student-centred learning in 21st-century education, and replacing the nouns with verbs could indicate active learner participation; however, neither version is intended as a “constructive way of planning and answering questions” (Morgan & Saxton, 2006, p.19); rather, it is a framework

about knowledge so “helps us to see the kind of thinking we can set into action through

questions” (ibid).

Anderson et al. (2001) consider remembering and understanding to be lower-order thinking skills, while applying, analysing, evaluating and creating are considered higher-order; however, mathematical understanding is not necessarily a linear progression (Sfard, 1991;

Gray & Tall, 1991). Watson (2007) believes that Bloom’s Taxonomy “does not provide for

post-synthetic mathematical actions, such as abstraction and objectification” (p.114) and that

it “underplays knowledge and comprehension in mathematics” (ibid) as these can be

interpreted at different levels of mathematical thought. Watson (2003) also criticises the simplicity of the open/closed questioning distinction, arguing that opportunities to extend conceptual understanding in mathematics are of greater importance.

An alternative taxonomy is Biggs and Collis’s SOLO (Structure of Observed Learning Outcomes), which proposes a sequence of unistructural, multistructural and relational understanding (Pegg & Tall, 2010, p.174) and which Watson (2007) believes “can be used to

devise questions which make finer distinctions than the vague notions of ‘lower-order’ and

‘higher-order’” (p.115). However, while the SOLO model allows for mathematical

abstraction, Watson (2007) argues that what a teacher intends and what a learner perceives

do not necessarily agree.

Smith et al. (1996) agree that Bloom’s Taxonomy has limitations in mathematics and propose

the MATH Taxonomy (Mathematical Assessment Task Hierarchy) for constructing

examination questions (Figure 2).

Figure 2 – MATH Taxonomy (Smith et al., 1996, p.67)

This could be a possible way of analysing verbal questioning in mathematics as the groups

distinguish the hierarchy of different types of activity which require either a “surface

approach” (Smith et al., 1996, p.67) or “deeper approach” (ibid.), rather than a hierarchy of

difficulty.

Andrews et al. (2005) use seven mathematical foci to analyse teachers’ behaviour (Figure 3).

Figure 3 – Mathematical Foci (Andrews et al., 2005, p.11)

According to Watson (2007), these foci describe “the intentions of teaching through

classifying features of mathematical meaning and structure without assuming that learners

necessarily do what is intended” (p.116). If combined with the MATH hierarchy, these foci

could provide a useful framework for analysing mathematical questioning in classroom

discourse (Appendix 1).

Other structures exist which are designed specifically for classifying questioning. For example, Morgan and Saxton (2006) classify questioning in three ways: probing what is already known; building a context for shared understanding; and challenging students to think critically and creatively. However, the second category could contain a large array of question types and levels of complexity. Another distinction is product-process questioning (Muijs &

Reynolds, 2011), where the former is designed to find the result while the latter focuses more on the procedure; however, in mathematics, process is not necessarily considered higher-order thinking (Dubinsky & McDonald, 2002; Sfard, 1991).

Black and Wiliam believe that classroom discourse should be “thoughtful, reflective, focused

to evoke and explore understanding, and conducted so that all pupils have an opportunity to

think and express their ideas” (1998, p.12) and identify that thinking is inhibited by:

- learners being directed towards an expected response, discouraging their own ideas;
- not enough “quiet time” (ibid, p.11) before a response is expected, more commonly known as wait time, so teachers answer their own questions to maintain pace.

Rowe (1986) states that teachers typically pause for less than one second both after posing a

question and after a response is given, and concludes wait time is crucial to allow students to

both think and expand upon responses; similar conclusions are drawn by Black et al. (2003).

Black and Wiliam (1998) find that these inhibitors result in only a minority of the class participating, and make several recommendations:

- increasing response time;
- discussing in pairs or groups first;
- providing a choice of answers;
- having everyone write down an answer, then selecting a few to share.

Following this research, Assessment for Learning (AfL) materials became a focus for schools, supported by the National Strategies (Ofsted, 2008); however, Ofsted’s review of AfL concludes that, despite the resources and training, teachers still need to “develop their skills

in targeting questions to challenge pupils’ understanding, prompting them to explain and

justify their answers individually, in small groups and in whole class dialogue” (ibid, p.7).

Wiliam (2006) believes that “[t]hrough exploring and ‘unpacking’ mathematics, students can

begin to see for themselves what they know and how well they know it” (p.5) and exemplifies

the original recommendations with strategies specific to mathematics, including the use of

mini-whiteboards, generating discussion from incorrect answers and posing questions which

have multiple solutions. Watson and Mason’s (1998) ‘Show me…’ questions have potential as a mini-whiteboard strategy to achieve rich discussion and formative feedback.

There is a wealth of additional research on questioning (e.g. Mason, 2000; Brown &

Edmondson, 2001; Wong, 2012) which is beyond the scope of this study.

Formulation of Research Hypotheses

Based on the findings from the literature review, two hypotheses have been formulated:

1. A larger proportion of questions requiring a ‘surface approach’ are used in

mathematics lessons than those requiring deeper thinking.

2. Using formative questioning techniques supports a wider range of intended mathematical thinking.

These are statements about the connection between variables and there are both

quantitative and qualitative methods for testing these relationships (Kerlinger in Cohen et al.,

2007); to address construct validity, the hypotheses indicate what the author believes will be

found out from the study. The hypotheses will be investigated using a combination of the

MATH taxonomy framework (Smith et al., 1996) and Andrews et al.’s (2005) mathematical

foci, supported by prompts proposed by Watson (2007) and Wiliam (2006), to analyse the

intended purpose of types of questions employed in mathematics lessons in the author’s

school and the AfL questioning techniques which support deeper thinking.

Research Design and Methodology

A mixed-method approach (Denscombe, 2007) will be used to address the hypotheses from two standpoints: a normative, positivist research paradigm, using quantitative methods for statistical analysis of the proportions of question types and techniques employed by teachers in observations; and a more interpretive standpoint, using qualitative semi-structured interviews to probe deeper into teachers’ intent, employing radical listening (Clough & Nutbrown, 2007) to ensure each teacher’s voice is heard. Learner-voice will be

heard through questionnaires, the closed or open nature of which will dictate the method of

analysis. This methodological triangulation between methods should ensure this research

views things “from as widely different perspectives as possible” (Denscombe, 2007, p.135).

In addition, if the outcomes correspond then greater confidence can be had in the findings

(Cohen et al., 2007).

An intrinsic case study approach (Stake, 2005) is proposed involving a federation of two 11-

16 single-sex schools where the proportion of students who are from minority ethnic

backgrounds or speak English as an additional language is above average. The federation is

non-selective and located on the south coast of England in a town where two of the ten

secondary schools are grammars. A case study approach will allow depth of study

(Hitchcock & Hughes, 1995) into different questioning types and techniques; however, this approach has the limitation that any findings could be unique to the case-study group (Denscombe, 2007), although the findings are not intended to “represent the world, but to

represent the case” (Stake, 2005, p.460); it is hoped that any findings can be used to inform

planning across the federation’s mathematics departments. The context of the schools

simply provides the reader the potential for comparison with similar schools.

To ensure the results are representative of the federation, a sample of classes from both

schools will need to be selected (Cohen et al., 2007). A cluster sample of four classes with

which to observe four teachers will be dictated by the author’s availability to minimise cover

implications, and hence could be considered a form of convenience sampling; however, it will be purposive to the extent that it will aim to include a variety of key stages, genders and ability levels (Denscombe, 2007). Questionnaires will be given to all students in each class and

from the responses a stratified sample of 30 questionnaires will be drawn to allow for proportional groupings from the population (Robson, 2002). This multi-stage sampling will

continue with selecting students from each category by systematic sampling, an efficient form

of probabilistic sampling provided the names of the students in each category are

randomised first (Cohen et al., 2007). By randomising the data, bias should be kept to a

minimum as each member of each group will have an equal chance of being chosen.
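The multi-stage procedure described above (proportional stratified allocation followed by systematic selection from randomised lists) can be sketched in outline as follows; this is a minimal illustration only, and the strata names and sizes below are hypothetical.

```python
import random

def multistage_sample(groups, total=30, seed=1):
    """Proportional stratified allocation, then systematic sampling
    from a randomised list within each stratum."""
    random.seed(seed)
    population = sum(len(names) for names in groups.values())
    sample = {}
    for group, names in groups.items():
        # Proportional allocation: this stratum's share of the sample.
        # NB: rounding can make the overall total differ slightly for other data.
        n = round(total * len(names) / population)
        if n == 0:
            sample[group] = []
            continue
        shuffled = list(names)
        random.shuffle(shuffled)          # randomise names first to reduce bias
        step = max(len(shuffled) // n, 1)
        sample[group] = shuffled[::step][:n]  # systematic: every k-th name
    return sample

# Hypothetical strata, e.g. by key stage and school
groups = {
    "KS3 girls": [f"G3-{i}" for i in range(40)],
    "KS3 boys":  [f"B3-{i}" for i in range(35)],
    "KS4 girls": [f"G4-{i}" for i in range(25)],
    "KS4 boys":  [f"B4-{i}" for i in range(20)],
}
picked = multistage_sample(groups)
```

With these illustrative sizes the proportional allocation yields 10, 9, 6 and 5 students respectively, giving the 30 questionnaires described above.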

Lesson observation data will be collected using a coding system both for question types (Appendix 1) and for questioning techniques (Appendix 2), which combines the taxonomies and recommendations discussed in the literature review and classifies questions into the following categories:

Factual

Procedural

Structural

Reasoning

Reflective

Derivational

The author will take a “non-participant observer role” (Cohen et al., 2007, p.259), as the classroom discourse will be coded, leaving no time to interact with the proceedings. The fact

that the author will observe and interview the teachers involved and produce the report could

be considered a limitation to the study (BERA, 2011), and will have an effect on the validity

and objectivity of the data collected, so to test inter-observer reliability, another teacher will

code a lesson to check percentage agreement (Robson, 2002). The micro-research study

also needs to be replicable to ensure reliability (Bashir, 2008), so one of the teachers will be observed again after a couple of weeks to test the stability of the coding over time (Cohen et al., 2007).
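The percentage-agreement check between the author and the second coder can be computed directly; the two code sequences below are hypothetical examples using the Appendix 1 codes, for illustration only.

```python
def percentage_agreement(codes_a, codes_b):
    """Inter-observer reliability as the percentage of questions
    assigned the same code by both coders."""
    if len(codes_a) != len(codes_b):
        raise ValueError("both coders must code the same questions")
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return 100 * matches / len(codes_a)

# Hypothetical codings of ten observed questions (Appendix 1 codes)
author = ["FS", "PS", "PS", "SD", "RS", "FS", "PD", "SS", "VD", "FS"]
second = ["FS", "PS", "PD", "SD", "RS", "FS", "PD", "SS", "VS", "FS"]
agreement = percentage_agreement(author, second)  # 80.0 (8 of 10 agree)
```

A threshold for acceptable agreement (e.g. 80 per cent or more) would need to be set in advance of the check.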

A questionnaire was chosen to collect the students’ thoughts as one-to-one contact with the researcher is not necessary, so a greater number can be sampled. Also, the high proportion

of closed questions can be analysed using nominal scales for questions which can be

converted to numbers, or Likert scales for questions relating to attitude or opinion, both of

which are quicker to analyse than the qualitative data from an interview. To ensure flexibility in participants’ responses, however, the last question is open. Questionnaires are also anonymous, so students may feel more able to be honest about their experiences; to increase the response rate, the questionnaires can be returned anonymously in the envelope provided. An initial questionnaire (Appendix 3) was piloted on four students from Key Stage

3 and two students from Key Stage 4 to increase reliability and validity (Cohen et al., 2007).

This took students between five and ten minutes to complete and problems were

encountered with questions 5, 7 and 9. Changes were made (Appendix 4) to question 5, which needed further clarification; question 7 was removed on the basis of collinearity with question 6; and question 9 was removed on both redundancy and reliability grounds

(Cohen et al., 2007). To check test-retest reliability, the students who piloted the questionnaire will complete it again after a period of time to ensure the results are consistent; however, as the pilot participants are known to the author, their results will only be used to measure reliability and

will not be included in the statistical analysis (ibid).

Interviews will be held with the teachers after the lessons to explore their points of view on

what the intended mathematical thinking of the various questions had been (Miller &

Glassner, 2004). The interviews will be semi-structured (Appendix 5) to allow for greater flexibility in adapting the questions if deemed appropriate (Robson, 2002). A disadvantage is that

interviews are time-consuming; however, only four interviews are planned. A

further limitation is that teachers might say what they think the author wants to hear, which could be a threat to validity (Cohen et al., 2007). It is hoped that, by informing teachers

of the focus of the observations and interviews in advance, participants will recognise the

study aims to highlight good practice and investigate how we can improve rather than to

report on individual teachers.

To follow the Ethical Guidelines for Educational Research (BERA, 2011), the teachers and

students involved will be informed of the purpose of the project, how the observations,

questionnaires and interviews will be used and the fact that any information given will remain

strictly confidential. Participant teachers will complete consent forms to confirm their

understanding of the research aims and their right to withdraw from participating at any time.

To maintain confidentiality and anonymity, the participants will be referred to as Teacher X etc., and names will not be requested on the student questionnaires (ibid).

Analysis Rationale

The results will be analysed using both SPSS and Excel with particular focus on proportions

of question types and techniques employed and correlation between the variables. To

reduce bias, comparisons will only be made between the hypotheses’ variables and not

between ability, gender and age groups of classes as any observed differences could be

attributable to differences in characteristics of teachers as opposed to the differences in

learners. This randomised selection should maximise the internal validity of the study, while

test-retest and analysis of response rate for the questionnaire and the inter-coding testing of

the observations should check the reliability. The aim of the analysis is to verify the theory

through deductive reasoning and make recommendations from the findings.
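As a minimal sketch of the analysis for hypothesis 1, the proportions of surface-approach and deeper-approach codes can be tallied directly from the coded observation data; the coded questions below are hypothetical, for illustration only.

```python
from collections import Counter

# Hypothetical coded questions from one observed lesson (Appendix 1 codes:
# second letter S = surface approach, D = deeper approach)
codes = ["FS", "FS", "PS", "PS", "PS", "PD", "SS", "SD", "RS", "FS", "PS", "VD"]

counts = Counter(code[1] for code in codes)  # tally by approach letter
surface = counts["S"] / len(codes)           # proportion of surface codes (0.75)
deeper = counts["D"] / len(codes)            # proportion of deeper codes (0.25)
# Hypothesis 1 predicts surface > deeper for the observed lessons
```

A comparison of these proportions across the four observed teachers, alongside the questioning-technique codes, would then address the association claimed in hypothesis 2.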

References

Anderson, L. W., & Krathwohl, D. R. (Eds.), with Airasian, P. W., Cruikshank, K. A., Mayer, R. E., Pintrich, P. R., Raths, J., & Wittrock, M. C. (2001). A taxonomy for learning, teaching and assessing: A revision of Bloom’s Taxonomy of Educational Objectives. New York: Allyn Bacon Longman.

Anderson, L. W., & Sosniak, L. A. (Eds.). (1994). Bloom's taxonomy: A forty-year perspective.

Ninety-third Yearbook of the National Society for the Study of Education. Chicago: University

of Chicago Press.

Andrews, P., Hatch, G. and Sayers, J. (2005), ‘What do teachers of mathematics teach? An

initial episodic analysis of four European traditions’, in D. Hewitt and A. Noyes (Eds),

Proceedings of the sixth British Congress of Mathematics Education held at the University of

Warwick, pp. 9-16

Bashir, M., Afzal, M. T., & Azeem, M. (2008) ‘Reliability and validity of qualitative and

operational research paradigm’, Pakistan Journal of Statistics and Operation Research, 4(1).

Available at http://www.pjsor.com/index.php/pjsor/article/viewArticle/59 [Accessed 11 June

2013]

BERA (2011) Ethical Guidelines for Educational Research, British Educational Research

Association, London, p.1-11. Available at: http://www.bera.ac.uk/publications/guides.php

[Accessed 16 April 2013]

Bereiter, C. & Scardamalia, M. (1999). Beyond Bloom's Taxonomy: Rethinking Knowledge for

the Knowledge Age. In Hargreaves, A., Libermann, A., Fullan, M., & Hopkins, D. (Eds.), The

International Handbook of Educational Change (pp. 675–692). Dordrecht, The Netherlands:

Kluwer Academic Publishers.

Black, P., & Wiliam, D. (1998). Inside the black box: Raising standards through classroom

assessment. nferNelson, London.

Black, P. (2002). Working inside the black box: Assessment for learning in the classroom.

nferNelson, London.

Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2003). Assessment for learning: Putting it into practice. Open University Press.

Black, P., McCormick, R., James, M. & Pedder, D. (2006) ‘Learning How to Learn and

Assessment for Learning: A Theoretical Inquiry’, Research Papers in Education, 21:02,

pp.119-132.

Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy

of educational objectives: Handbook I: Cognitive domain. New York: David McKay.

Bloom, B. S. (1994) ‘Reflections on the development and use of the taxonomy.’ Yearbook:

National Society for the Study of Education, 92(2), pp.1-8.

Brown, G. A., & Edmondson, R. (2001) ‘Asking Questions’ in Wragg, E.C. (Ed.) Classroom

teaching skills, pp.97-120, Routledge, London.

Cohen, L., Manion, L., & Morrison, K. (2007). Research methods in education. Routledge.

Denscombe, M. (2007) The Good Research Guide for small-scale social research projects,

Third Edition, First published 1998, Maidenhead, Oxford University Press.

DfES (2004) ‘Unit 7: Questioning’, Pedagogy and Practice: Teaching and Learning in

Secondary Schools, Key Stage 3 National Strategy, Crown copyright

Available at:

http://webarchive.nationalarchives.gov.uk/20110809101133/nsonline.org.uk/node/97131

[Accessed 26 May 2013]

Dubinsky, E. and McDonald, M.A. (2002) ‘APOS: A Constructivist Theory of Learning in

Undergraduate Mathematics Education Research’ in Hoton, D. The Teaching and Learning of

Mathematics at University Level, New ICMI Study Series, 2002, Vol. 7, Section 3, pp.275-

282.

Gall, M. D. (1970) ‘The use of questions in teaching’, Review of Educational Research, 40(5),

pp. 707-721.

Gray, E. and Tall, D. (1991) ‘Duality, Ambiguity and Flexibility in Successful Mathematical

Thinking’, Proceedings of PME 15, Assisi, Vol. 2 pp. 72–79.

Hitchcock, G., & Hughes, D. (1995) Research and the teacher: A qualitative introduction to

school-based research. London, Routledge.

James, M., Black, P., Carmichael, P., Drummond, M-J., Fox, A., MacBeath, J., Marshall, B.,

McCormick, R., Pedder, D., Procter, R., Swaffield, S., Swann, J., and Wiliam, D. (2007)

Improving Learning How to Learn in classrooms, schools and networks. London,

Routledge.

Mason, J. (2000) ‘Asking mathematical questions mathematically’, International Journal of

Mathematical Education in Science and Technology, 31(1), pp.97-111.

Miller, J. & Glassner, B. (2004) ‘The “inside” and the “outside”. Finding realities in interviews’

in Silverman, D. (Ed.) Qualitative Research, London, SAGE Publications Limited.

Morgan, N., & Saxton, J. (2006) Asking better questions. Markham, Ontario, Pembroke

Publishers.

Muijs, D., & Reynolds, D. (2011) Effective teaching: Evidence and practice, London, SAGE

Publications Limited.

ODU (2013) Overbaugh, R. C. & Schultz, L., Old Dominion University webpage. Available at: http://www.odu.edu/educ/roverbau/Bloom/blooms_taxonomy.htm [Accessed 26 May 2013]

Ofsted (2008) Assessment for learning: the impact of National Strategy support, Crown

Copyright, Manchester. Available at: http://www.ofsted.gov.uk/resources/assessment-for-

learning-impact-of-national-strategy-support [Accessed 27 May 2013]

Ofsted (2012) Mathematics: made to measure, Crown Copyright, Manchester. Available at:

www.ofsted.gov.uk/resources/110159 [Accessed 25 May 2013]

Pegg, J. & Tall, D. (2010) ‘The Fundamental Cycle of Concept Construction Underlying

Various Theoretical Frameworks’, in Sriraman, B. and English, L. (Eds.) Theories of

Mathematics Education, Berlin Heidelberg, Springer-Verlag, pp. 173-192.

Robson, C. (2002) Real world research: A resource for social scientists and practitioner-researchers. Second edition. Oxford: Blackwell.

Rowe, M. B. (1986) ‘Wait time: slowing down may be a way of speeding up!’, Journal of

Teacher Education, 37(1), pp.43-50.

Sfard, A. (1991) ‘On the Dual Nature of Mathematical Conceptions: Reflections on Processes

and Objects as Different Sides of the Same Coin’, Educational Studies in Mathematics, Vol.

22, No. 1, pp. 1-36

Silverman, D. (2000) Doing qualitative research: A practical handbook, London, Sage.

Smith, G., Wood L., Coupland, M., Stephenson, B., Crawford, K. & Ball, G. (1996)

‘Constructing mathematical examinations to assess a range of knowledge and skills’,

International Journal of Mathematical Education in Science and Technology, 27:1, pp.65-77

Stake, R. (2005) ‘Qualitative case studies’, in: N. Denzin and Y. Lincoln (eds) Handbook of

Qualitative Research (third edition). London: Sage.

Watson, A. & Mason, J. 1998, Questions and Prompts for Mathematical Thinking, ATM,

Derby.

Watson, A. (2003) ‘Opportunities to learn mathematics’, Mathematics education research:

Innovation, networking, opportunity, pp. 29-38.

Available at: http://www.merga.net.au/documents/Keynote_Watson.pdf

[Accessed 27 May 2013]

Watson, A. (2007) ‘The nature of participation afforded by tasks, questions and prompts in

mathematics classrooms’, Research in Mathematics Education, 9:1, pp.111-126.

Wiliam, D. (2006). Mathematics inside the black box: Assessment for learning in the

mathematics classroom. Granada Learning.

Wong, K. Y. (2012) ‘Use of student mathematics questioning to promote active learning and

metacognition’, 12th International Congress on Mathematical Education 8 – 15 July 2012,

COEX, Seoul, Korea

Available at: http://repository.nie.edu.sg/jspui/bitstream/10497/6194/1/ICME-2012-1086_a.pdf

[Accessed 9 June 2013]

Appendix 1

Question Type (Intended Mathematical Thinking) Coding Table

The table has five columns: question type (adapted from Smith et al., 1996, and Andrews et al., 2005); prompts (adapted from Watson’s analytical instrument, 2007, p.119); formative question stems (from Wiliam, 2006); surface-approach question code; and deeper-approach question code.

Factual (codes FS / FD)
Prompts: name; recall facts; give definitions; define terms.

Procedural (codes PS / PD)
Prompts: imitate method; copy object; follow routine procedure; find answer using procedure; give answer.

Structural (codes SS / SD)
Prompts: Show me…; analyse; compare; classify; conjecture; generalise; identify variables; explore variation; look for patterns; identify relationships.
Question stems: Tell me about the problem. What do you know about the problem? Can you describe the problem to someone else? What is similar…? What is different…? Do you have a hunch?… a conjecture? What would happen if…? Is it always true that…? Have you found all the solutions?

Reasoning (codes RS / RD)
Prompts: justify; interpret; visualise; explain; exemplify; informal induction; informal deduction.
Question stems: Can you explain/improve/add to that explanation? How do you know that…? Can you justify…?

Reflective (codes VS / VD)
Prompts: summarise; express in own words; evaluate; consider advantages/disadvantages.
Question stems: What was easy/difficult about this problem… this mathematics? What have you found out? What advice would you give to someone else about…?

Derivational (codes DS / DD)
Prompts: prove; create; design; associate ideas; apply prior knowledge (in new situations); adapt procedures; find answer without known procedure.
Question stems: Have you seen a problem like this before? What mathematics do you think you will use? Can you find a different method? Can you prove that…?

Appendix 2

Questioning Technique Coding Table

Code  Questioning technique
R     Use random methods to choose a student to answer (e.g. names from a hat)
H     Hands up
N     No hands up, with ‘wait time’
G     Discuss answer for a set time in pairs/groups first
W     Use mini-whiteboards to write answers
W+D   Generate discussion from mini-whiteboard answers
V     Choose from a few answers (e.g. using voting fans)
A     Ask if a student agrees with another
M     Identify the error
S     Write up a selection of responses on the board, then discuss
O     Odd one out
T     Always/Sometimes/Never true (or equivalent)
P     Problems with more (or fewer) than one correct solution